hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
541353ca491ac8dd96b86aeaa420c3a170d5cb09 | 442 | py | Python | functions.py | quant-om/python | 0b62f6b535b76ff1aeb8267da49b48be5c6595a2 | [
"MIT"
] | null | null | null | functions.py | quant-om/python | 0b62f6b535b76ff1aeb8267da49b48be5c6595a2 | [
"MIT"
] | null | null | null | functions.py | quant-om/python | 0b62f6b535b76ff1aeb8267da49b48be5c6595a2 | [
"MIT"
] | null | null | null | # functions
print("Demonstrating functions....")
def fun():
    print("Printing my function: fun")
def fun1():
    print("Printing my function: fun1")
def multiply(x, y):
    z = x * y
    return z
mulnum = multiply(150, 160)
print(f"The return value is {mulnum}")
# Demonstrating lambda functions
# Lambda functions are anonymous functions
x = lambda a: a + 10
print(f'The value of x is {x}')
y = x(5)
print(f'The value of y is {y}')
| 17.68 | 42 | 0.658371 | 70 | 442 | 4.157143 | 0.4 | 0.020619 | 0.092784 | 0.158076 | 0.109966 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0.20362 | 442 | 24 | 43 | 18.416667 | 0.795455 | 0.183258 | 0 | 0 | 0 | 0 | 0.414566 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0 | 0 | 0.285714 | 0.428571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
541c854e118dac5950d17a1c2787ba831811309f | 766 | py | Python | db/add-user.py | straterra/DreamerFlask | da8c16c105cb3008bcc2e9e995b518d892f21623 | [
"MIT"
] | null | null | null | db/add-user.py | straterra/DreamerFlask | da8c16c105cb3008bcc2e9e995b518d892f21623 | [
"MIT"
] | null | null | null | db/add-user.py | straterra/DreamerFlask | da8c16c105cb3008bcc2e9e995b518d892f21623 | [
"MIT"
] | null | null | null | #!/usr/bin/python
# Written by Thomas York
# Imports
import datetime
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from flask_hashing import Hashing
from app.tabledef import *
from app import app
import os
import getpass
# Setup
basedir = os.path.abspath(os.path.dirname(__file__))
engine = create_engine('sqlite:///' + os.path.join(basedir, 'auth.sqlite3'), echo=True)
# Ask user for information
user = raw_input("Username:")
passwd = getpass.getpass("Password for " + user + ":")
hashing = Hashing(app)
h = hashing.hash_value(passwd, salt=app.config['SECRET_KEY'])
# create a Session
Session = sessionmaker(bind=engine)
session = Session()
user = User(user, h)
session.add(user)
# commit the record to the database
session.commit()
| 20.702703 | 87 | 0.750653 | 107 | 766 | 5.280374 | 0.542056 | 0.031858 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001502 | 0.130548 | 766 | 36 | 88 | 21.277778 | 0.846847 | 0.164491 | 0 | 0.105263 | 0 | 0 | 0.086888 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.157895 | 0.421053 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
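The add-user.py row above stores a salted password digest via flask_hashing's `hash_value`. A stdlib-only sketch of the same salted-digest idea (hashlib.sha256 is my substitution here, not what flask_hashing necessarily uses internally):

```python
# Salted digest with only the standard library; a conceptual stand-in for
# flask_hashing's hash_value(value, salt=...).
import hashlib

def hash_value(value, salt=""):
    # Concatenate salt and value, digest, return a hex string.
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

digest = hash_value("hunter2", salt="SECRET_KEY")
```

Because the salt is mixed into the digest, the same password under a different salt yields a different hash, which is the property the script relies on via `app.config['SECRET_KEY']`.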
54232b74df71f4e0791d0cd39dc45c03aab966b6 | 133 | py | Python | setup.py | pandrei7/cf-tests | 4211d2f8c94730a63ceb960cfe487d9d13f9596b | [
"MIT"
] | null | null | null | setup.py | pandrei7/cf-tests | 4211d2f8c94730a63ceb960cfe487d9d13f9596b | [
"MIT"
] | 2 | 2020-04-28T16:50:43.000Z | 2020-09-13T10:36:08.000Z | setup.py | pandrei7/cf-tests | 4211d2f8c94730a63ceb960cfe487d9d13f9596b | [
"MIT"
] | null | null | null | from setuptools import setup
setup(
    name='cptest',
    entry_points={
        'console_scripts': [
            'cptest=main:main'
        ]
    }
)
| 12.090909 | 28 | 0.609023 | 14 | 133 | 5.642857 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.255639 | 133 | 10 | 29 | 13.3 | 0.79798 | 0 | 0 | 0 | 0 | 0 | 0.278195 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.111111 | 0 | 0.111111 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5809dabd98d23e6091a20dfd2db931198d7c4c8a | 392 | py | Python | pipetools/__init__.py | 0101/pipetools | 3f77b203bc538bcac519331d36edebd9c65c642a | [
"MIT"
] | 164 | 2015-03-04T08:09:27.000Z | 2022-03-23T05:00:40.000Z | pipetools/__init__.py | 0101/pipetools | 3f77b203bc538bcac519331d36edebd9c65c642a | [
"MIT"
] | 18 | 2016-07-25T16:54:42.000Z | 2022-03-31T12:59:40.000Z | pipetools/__init__.py | 0101/pipetools | 3f77b203bc538bcac519331d36edebd9c65c642a | [
"MIT"
] | 18 | 2015-03-30T16:53:32.000Z | 2022-01-31T23:50:52.000Z | from pipetools.utils import foreach
__version__ = VERSION = 1, 0, 1
__versionstr__ = VERSION > foreach(str) | '.'.join
from pipetools.main import pipe, X, maybe, xpartial
from pipetools.utils import *
# prevent namespace pollution
import pipetools.compat
for symbol in dir(pipetools.compat):
    if globals().get(symbol) is getattr(pipetools.compat, symbol):
        globals().pop(symbol)
| 28 | 66 | 0.742347 | 51 | 392 | 5.54902 | 0.588235 | 0.137809 | 0.127208 | 0.169611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009036 | 0.153061 | 392 | 13 | 67 | 30.153846 | 0.843373 | 0.068878 | 0 | 0 | 0 | 0 | 0.002755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.444444 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
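In the pipetools row above, `__versionstr__ = VERSION > foreach(str) | '.'.join` pipes the version tuple through `str` and joins with dots using the library's pipe DSL. The plain-stdlib equivalent is:

```python
# Stdlib equivalent of pipetools' `VERSION > foreach(str) | '.'.join`:
# map each version component to str, then join with dots.
VERSION = 1, 0, 1
versionstr = ".".join(map(str, VERSION))
# → "1.0.1"
```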
584f47eb8fbdf910bc60a3a881de6a2dd7e21d34 | 193 | py | Python | win_basic_tools/sources/touch.py | HenriquedoVal/win-basic-commands | 4b209bdbfbeba31d87d197396098bc777784d4fa | [
"MIT"
] | null | null | null | win_basic_tools/sources/touch.py | HenriquedoVal/win-basic-commands | 4b209bdbfbeba31d87d197396098bc777784d4fa | [
"MIT"
] | null | null | null | win_basic_tools/sources/touch.py | HenriquedoVal/win-basic-commands | 4b209bdbfbeba31d87d197396098bc777784d4fa | [
"MIT"
] | null | null | null | import os
import sys
if len(sys.argv) >= 2:
    for i in sys.argv[1:]:
        os.system(f'type nul > {i}')
else:
    print('Pass parameters for creating files. E.g. touch file.py file2.pyw')
| 21.444444 | 77 | 0.626943 | 35 | 193 | 3.457143 | 0.8 | 0.115702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02 | 0.222798 | 193 | 8 | 78 | 24.125 | 0.786667 | 0 | 0 | 0 | 0 | 0 | 0.404145 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0.285714 | 0 | 0.285714 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
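The touch.py row above shells out to `type nul > file`, which is Windows-only and breaks on filenames containing spaces. A portable sketch (my suggestion, not part of the dataset row) uses pathlib instead:

```python
# Portable alternative to the `type nul > {i}` shell-out: Path.touch()
# creates the file if missing and updates its mtime, on both Windows and
# POSIX, with no shell quoting issues.
from pathlib import Path

def touch_files(names):
    for name in names:
        Path(name).touch(exist_ok=True)
```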
5865e1444a4b70865e5a58a5ba9893858d340003 | 124 | py | Python | web/settings/dev.py | koualsky/start | 544fde3a353f78a7a6d5782d240d98644731d6b0 | [
"MIT"
] | null | null | null | web/settings/dev.py | koualsky/start | 544fde3a353f78a7a6d5782d240d98644731d6b0 | [
"MIT"
] | null | null | null | web/settings/dev.py | koualsky/start | 544fde3a353f78a7a6d5782d240d98644731d6b0 | [
"MIT"
] | null | null | null | from .base import *
STATIC_URL = '/static/'
STATIC_ROOT = '/code/static'
MEDIA_URL = '/media/'
MEDIA_ROOT = '/code/media'
| 15.5 | 28 | 0.677419 | 17 | 124 | 4.705882 | 0.470588 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145161 | 124 | 7 | 29 | 17.714286 | 0.754717 | 0 | 0 | 0 | 0 | 0 | 0.306452 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
586ef32d378bf40bb56aec82b6e9608badc82a57 | 192 | py | Python | systemfixtures/processes/__init__.py | alejdg/systemfixtures | d1c42d83c3dca2a36b52e8fc214639ebcb1cd8a1 | [
"MIT"
] | 13 | 2017-01-24T15:25:47.000Z | 2022-01-06T23:56:06.000Z | systemfixtures/processes/__init__.py | cjwatson/systemfixtures | 6ff52e224585d8fab2908dc08a22fe36dcaf93d4 | [
"MIT"
] | 10 | 2017-03-08T09:36:01.000Z | 2022-02-09T11:08:00.000Z | systemfixtures/processes/__init__.py | cjwatson/systemfixtures | 6ff52e224585d8fab2908dc08a22fe36dcaf93d4 | [
"MIT"
] | 5 | 2017-03-08T09:30:51.000Z | 2022-02-05T23:22:25.000Z | from .fixture import FakeProcesses
from .wget import Wget
from .systemctl import Systemctl
from .dpkg import Dpkg
__all__ = [
    "FakeProcesses",
    "Wget",
    "Systemctl",
    "Dpkg",
]
| 14.769231 | 34 | 0.682292 | 21 | 192 | 6.047619 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21875 | 192 | 12 | 35 | 16 | 0.846667 | 0 | 0 | 0 | 0 | 0 | 0.15625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
58913c43837c870fc436afac835a14be34fe83d4 | 922 | py | Python | HowYouDoing/HowYouDoing/messagingsystem/migrations/0001_initial.py | chenmic/how-you-doing | 8872d38cd3bee478e6157a3855466c7fb6c1b1f8 | [
"MIT"
] | null | null | null | HowYouDoing/HowYouDoing/messagingsystem/migrations/0001_initial.py | chenmic/how-you-doing | 8872d38cd3bee478e6157a3855466c7fb6c1b1f8 | [
"MIT"
] | null | null | null | HowYouDoing/HowYouDoing/messagingsystem/migrations/0001_initial.py | chenmic/how-you-doing | 8872d38cd3bee478e6157a3855466c7fb6c1b1f8 | [
"MIT"
] | null | null | null | # Generated by Django 3.1.1 on 2020-09-26 20:10
from django.db import migrations, models
class Migration(migrations.Migration):
    initial = True
    dependencies = [
    ]
    operations = [
        migrations.CreateModel(
            name='Message',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('sender', models.CharField(max_length=128, verbose_name='Sender')),
                ('receiver', models.CharField(max_length=128, verbose_name='Receiver')),
                ('message', models.CharField(max_length=4096, verbose_name='Message')),
                ('subject', models.CharField(max_length=512, verbose_name='Subject')),
                ('creation_date', models.DateTimeField(verbose_name='Sent on')),
                ('is_read', models.BooleanField(default=False)),
            ],
        ),
    ]
| 34.148148 | 114 | 0.598698 | 95 | 922 | 5.663158 | 0.547368 | 0.122677 | 0.133829 | 0.178439 | 0.141264 | 0.141264 | 0.141264 | 0 | 0 | 0 | 0 | 0.041237 | 0.263557 | 922 | 26 | 115 | 35.461538 | 0.751105 | 0.048807 | 0 | 0 | 1 | 0 | 0.107429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.263158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
589525951479a0f22852b04d62b88d46d6d05ab3 | 230 | py | Python | vsketch/__init__.py | ademuri/vsketch | 1e84a25841186d629c1ace5edcd7807f0c1ba7d7 | [
"MIT"
] | null | null | null | vsketch/__init__.py | ademuri/vsketch | 1e84a25841186d629c1ace5edcd7807f0c1ba7d7 | [
"MIT"
] | null | null | null | vsketch/__init__.py | ademuri/vsketch | 1e84a25841186d629c1ace5edcd7807f0c1ba7d7 | [
"MIT"
] | null | null | null | """This module implements the vsketch API."""
from .param import Param, ParamType
from .vsketch import Vsketch
__all__ = ["Vsketch", "Param", "ParamType"]
def _init():
    from .environment import setup
    setup()
_init()
| 14.375 | 45 | 0.686957 | 27 | 230 | 5.62963 | 0.555556 | 0.184211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186957 | 230 | 15 | 46 | 15.333333 | 0.812834 | 0.169565 | 0 | 0 | 0 | 0 | 0.113514 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
589d9bfb282f6e63ef21201f22a38f09e47704d7 | 27,414 | py | Python | MOD_32_VariacionPotencia_H15_3o_T.py | Marita21/ThermalDesign | 49e0f795264502de5be723bdb1c89441ca0bb109 | [
"MIT"
] | 1 | 2021-09-22T12:43:16.000Z | 2021-09-22T12:43:16.000Z | MOD_32_VariacionPotencia_H15_3o_T.py | Marita21/ThermalDesign | 49e0f795264502de5be723bdb1c89441ca0bb109 | [
"MIT"
] | null | null | null | MOD_32_VariacionPotencia_H15_3o_T.py | Marita21/ThermalDesign | 49e0f795264502de5be723bdb1c89441ca0bb109 | [
"MIT"
] | null | null | null | #!/usr/bin/env python2
# -*- coding: utf-8 -*-
#---------------------------------------------------------------------------
#Global mathematical model of SAC-A with 31 nodes, according to the design
# status of March 3, 1997. Orbit at 51.6 degrees inclination at the winter
# solstice (beta=15), at 205 nmi (380 km). Attitude: panels always deployed
# and pointed towards the sun, satellite main axis perpendicular to the
# ecliptic plane.
# NOMINAL HOT CASE WITH 23W (INTERNAL)
#----------------------------------------------------------------------------
import tkinter
from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg, NavigationToolbar2Tk
from solver import Node, ThermalModel, FDMExplicit
from matplotlib.figure import Figure
import datetime
inicio = datetime.datetime.now() # Run start timestamp
#HEADER CONTROL DATA
EMLI = 0.01 # MLI effective emissivity
PINF = 9.0 # Internal power dissipated at lower platform
PSUP = 14.00 # Internal power dissipated at upper platform
F_AREA = 0.31 # Area correction for radiator
# Simulation time span
t_start = 0.0
t_end = 16579.8 #12000 #55266.0
# Times at which the powers change
power_times1 = [ 0.0, 16580.0, 16580.0, 17300.0, 17300.0, 55266.0,]
power_Low= [1.0, 1.0, 1.0, 1.0, 1.0, 1.0,]
power_Upp= [1.0, 1.0, 4.429, 4.429, 1.0, 1.0,]
for index in range(len(power_Low)):
    power_Low[index] = power_Low[index]*PINF
for index in range(len(power_Upp)):
    power_Upp[index] = power_Upp[index]*PSUP
def generate_power_function(time, power):
    from bisect import bisect
    power_array = power
    power_times1 = time
    def get_power_X(time):
        return power_array[bisect(power_times1, time)-1]
    return get_power_X
power_L= generate_power_function(power_times1, power_Low)
power_U = generate_power_function(power_times1, power_Upp)
power_times = [ 0.0, 230.4, 460.8, 690.6, 921.0,
1151.4, 1381.8, 1611.6, 1842.0, 2072.4,
2302.8, 2532.6, 2763.0, 2993.4, 3223.8,
3454.2, 3684.0, 3914.4, 4144.8, 4375.2,
4605.0, 4835.0, 5065.8, 5296.2, 5526.6,
5526.6, 5757.0, 5987.4, 6217.2, 6447.6, # 2nd orbit
6678.0, 6908.4, 7138.2, 7368.6, 7599.0,
7829.4, 8059.2, 8289.6, 8520.0, 8750.4,
8980.8, 9210.6, 9441.0, 9671.4, 9901.8,
10131.6,10361.6,10592.4,10822.8,11053.2,
11053.2,11283.6,11514.0,11743.8,11974.2, # 3rd orbit
12204.6,12435.0,12664.8,12895.2,13125.6,
13356.0,13585.8,13816.2,14046.6,14277.0,
14507.4,14737.2,14967.6,15198.0,15428.4,
15658.2,15888.2,16119.0,16349.4,16579.8,]
power_node_103 = [ 7.64, 7.30, 6.92, 6.52, 6.17,
5.96, 5.89, 5.96, 6.17, 6.52,
6.92, 7.30, 7.64, 8.03, 8.46,
8.88, 9.25, 9.51, 9.60, 9.51,
9.25, 8.88, 8.46, 8.03, 7.64,]
power_node_103 = power_node_103*3
power_node_104 = [14.16, 13.95, 13.68, 13.41, 13.17,
13.02, 12.97, 13.02, 13.17, 13.41,
13.68, 13.95, 14.16, 14.34, 14.49,
14.60, 14.69, 14.75, 14.77, 14.75,
14.69, 14.60, 14.48, 14.34, 14.16,]
power_node_104 = power_node_104*3
power_node_105 = [73.27, 73.22, 73.11, 72.96, 72.81,
72.70, 72.67, 72.70, 72.81, 72.96,
73.11, 73.22, 73.27, 73.36, 73.64,
73.96, 74.25, 74.43, 74.49, 74.43,
74.25, 73.96, 73.64, 73.36, 73.27,]
power_node_105 = power_node_105*3
power_node_106 = [ 4.31, 4.87, 5.69, 6.55, 7.29,
7.78, 7.95, 7.78, 7.28, 6.54,
5.69, 4.87, 4.31, 3.88, 3.49,
3.17, 2.93, 2.78, 2.73, 2.78,
2.93, 3.17, 3.50, 3.88, 4.31,]
power_node_106 = power_node_106*3
power_node_107 = [74.76, 75.19, 75.83, 76.65, 77.60,
78.59, 79.50, 80.20, 80.57, 80.55,
80.19, 79.62, 79.05, 78.59, 78.35,
78.19, 78.01, 77.73, 77.34, 76.82,
76.21, 75.54, 74.96, 74.64, 74.76,]
power_node_107 = power_node_107*3
power_node_108 = [79.05, 79.63, 80.20, 80.55, 80.57,
80.20, 79.50, 78.58, 77.60, 76.65,
75.83, 75.18, 74.76, 74.64, 74.96,
75.54, 76.21, 76.82, 77.34, 77.73,
78.01, 78.19, 78.35, 78.59, 79.05,]
power_node_108 = power_node_108*3
power_node_109 = [ 6.82, 6.81, 6.73, 6.60, 6.42,
6.23, 6.04, 5.87, 5.74, 5.64,
5.58, 5.56, 5.57, 5.60, 5.68,
5.79, 5.94, 6.10, 6.27, 6.43,
6.58, 6.69, 6.76, 6.80, 6.82,]
power_node_109 = power_node_109*3
power_node_110 = [ 5.57, 5.56, 5.58, 5.64, 5.74,
5.87, 6.04, 6.23, 6.42, 6.60,
6.73, 6.81, 6.82, 6.80, 6.76,
6.69, 6.58, 6.43, 6.27, 6.10,
5.94, 5.79, 5.68, 5.60, 5.57,]
power_node_110 = power_node_110*3
#
power_node_111= [ 1.58, 1.71, 1.83, 1.91, 1.95,
1.96, 1.96, 1.96, 1.95, 1.91,
1.83, 1.71, 1.58, 1.46, 1.34,
1.20, 1.06, 0.95, 0.90, 0.95,
1.06, 1.20, 1.34, 1.46, 1.58,]
power_node_111 = power_node_111*3
#
power_node_112= [2.47, 2.61, 2.75, 2.85, 2.92,
2.96, 2.98, 2.96, 2.92, 2.85,
2.75, 2.61, 2.47, 2.34, 2.21,
2.10, 1.99, 1.91, 1.88, 1.91,
1.99, 2.10, 2.21, 2.34, 2.47,]
power_node_112 = power_node_112*3
power_node_113= [0.21, 0.15, 0.09, 0.04, 0.01,
0.00, 0.00, 0.00, 0.01, 0.04,
0.09, 0.15, 0.21, 0.29, 0.37,
0.45, 0.52, 0.58, 0.60, 0.58,
0.52, 0.45, 0.37, 0.29, 0.21, ]
power_node_113 = power_node_113*3
#
power_node_114= [ 7.43, 10.01, 13.15, 16.40, 19.29,
21.34, 22.10, 21.33, 19.29, 16.40,
13.15, 10.00, 7.43, 5.56, 4.16,
3.18, 2.63, 2.39, 2.32, 2.39,
2.63, 3.18, 4.16, 5.56, 7.43,]
power_node_114 = power_node_114*3
#
power_node_115= [ 7.43, 10.01, 13.15, 16.40, 19.29,
21.34, 22.10, 21.33, 19.29, 16.40,
13.15, 10.00, 7.43, 5.56, 4.16,
3.18, 2.63, 2.39, 2.32, 2.39,
2.63, 3.18, 4.16, 5.56, 7.43,]
power_node_115 = power_node_115*3
#
power_node_116= [ 8.64, 6.56, 4.74, 3.36, 2.55,
2.27, 2.24, 2.27, 2.55, 3.36,
4.74, 6.56, 8.64, 11.08, 13.72,
16.27, 18.47, 20.02, 20.60, 20.02,
18.47, 16.27, 13.72, 11.08, 8.64,]
power_node_116 = power_node_116*3
#
power_node_117= [ 5.76, 5.95, 5.85, 5.47, 4.82,
3.94, 2.94, 1.95, 1.09, 0.47,
0.14, 0.07, 0.05, 0.05, 0.08,
0.27, 0.63, 1.12, 1.74, 2.48,
3.27, 4.07, 4.82, 5.43, 5.76,]
power_node_117 = power_node_117*3
#
power_node_118= [ 0.05, 0.07, 0.14, 0.47, 1.10,
1.95, 2.94, 3.94, 4.82, 5.47,
5.85, 5.95, 5.76, 5.43, 4.82,
4.07, 3.27, 2.48, 1.74, 1.12,
0.63, 0.27, 0.08, 0.05, 0.05,]
power_node_118 = power_node_118*3
#
power_node_119= [ 0.04, 0.05, 0.13, 0.51, 1.23,
2.27, 3.51, 4.81, 5.92, 6.73,
7.17, 7.22, 6.93, 6.55, 5.86,
4.98, 4.01, 3.06, 2.17, 1.40,
0.77, 0.33, 0.08, 0.03, 0.04,]
power_node_119 = power_node_119*3
#
power_node_120= [ 1.13, 1.55, 2.03, 2.54, 3.02,
3.40, 3.60, 3.60, 3.45, 3.15,
2.74, 2.29, 1.87, 1.50, 1.16,
0.87, 0.62, 0.43, 0.29, 0.21,
0.23, 0.36, 0.55, 0.80, 1.13,]
power_node_120 = power_node_120*3
#
power_node_121= [1.27, 1.68, 2.15, 2.60, 2.94,
3.13, 3.17, 3.02, 2.66, 2.22,
1.74, 1.29, 0.89, 0.59, 0.37,
0.20, 0.07, 0.04, 0.07, 0.12,
0.22, 0.39, 0.63, 0.93, 1.27,]
power_node_121 = power_node_121*3
#
power_node_122= [ 1.87, 2.30, 2.75, 3.15, 3.45,
3.60, 3.60, 3.40, 3.02, 2.54,
2.03, 1.55, 1.13, 0.80, 0.55,
0.36, 0.23, 0.21, 0.29, 0.43,
0.62, 0.87, 1.16, 1.50, 1.87,]
power_node_122 = power_node_122*3
#
power_node_123= [ 0.89, 1.29, 1.74, 2.22, 2.66,
3.02, 3.17, 3.13, 2.94, 2.60,
2.15, 1.68, 1.27, 0.93, 0.63,
0.39, 0.22, 0.12, 0.07, 0.04,
0.07, 0.20, 0.37, 0.59, 0.89,]
power_node_123 = power_node_123*3
#
power_node_126= [ 3.33, 3.67, 3.97, 4.04, 3.83,
3.37, 2.74, 2.05, 1.40, 0.86,
0.49, 0.28, 0.18, 0.16, 0.19,
0.29, 0.47, 0.74, 1.08, 1.48,
1.91, 2.35, 2.75, 3.08, 3.33,]
power_node_126 = power_node_126*3
for index in range(len(power_node_126)):
    power_node_126[index] = power_node_126[index]*F_AREA
#
power_node_127= [ 0.18, 0.28, 0.49, 0.86, 1.40,
2.05, 2.74, 3.37, 3.83, 4.04,
3.97, 3.67, 3.33, 3.08, 2.75,
2.35, 1.91, 1.48, 1.08, 0.74,
0.47, 0.29, 0.19, 0.16, 0.18,]
power_node_127 = power_node_127*3
for index in range(len(power_node_127)):
    power_node_127[index] = power_node_127[index]*F_AREA
#
power_node_129= [ 1.86, 2.14, 2.42, 2.67, 2.87,
2.99, 3.04, 2.99, 2.87, 2.67,
2.42, 2.14, 1.86, 1.61, 1.38,
1.15, 0.93, 0.78, 0.73, 0.78,
0.93, 1.15, 1.38, 1.61, 1.86,]
power_node_129 = power_node_129*3
def generate_power_function(time, power):
    from bisect import bisect
    power_array = power
    power_times = time
    def get_power_X(time):
        return power_array[bisect(power_times, time)-1]
    return get_power_X
power_103 = generate_power_function(power_times, power_node_103)
power_104 = generate_power_function(power_times, power_node_104)
power_105 = generate_power_function(power_times, power_node_105)
power_106 = generate_power_function(power_times, power_node_106)
power_107 = generate_power_function(power_times, power_node_107)
power_108 = generate_power_function(power_times, power_node_108)
power_109 = generate_power_function(power_times, power_node_109)
power_110 = generate_power_function(power_times, power_node_110)
power_111 = generate_power_function(power_times, power_node_111)
power_112 = generate_power_function(power_times, power_node_112)
power_113 = generate_power_function(power_times, power_node_113)
power_114 = generate_power_function(power_times, power_node_114)
power_115 = generate_power_function(power_times, power_node_115)
power_116 = generate_power_function(power_times, power_node_116)
power_117 = generate_power_function(power_times, power_node_117)
power_118 = generate_power_function(power_times, power_node_118)
power_119 = generate_power_function(power_times, power_node_119)
power_120 = generate_power_function(power_times, power_node_120)
power_121 = generate_power_function(power_times, power_node_121)
power_122 = generate_power_function(power_times, power_node_122)
power_123 = generate_power_function(power_times, power_node_123)
power_126 = generate_power_function(power_times, power_node_126)
power_127 = generate_power_function(power_times, power_node_127)
power_129 = generate_power_function(power_times, power_node_129)
def power_mtr(time): # MTR power variation
    if time == 0.0:
        return 0.0
    elif time >= 100.0 and time <= 600.0:
        return 200.0
    # elif time >= 700.0 and time <= 899.0:
    #     return 0.0
    elif time >= 900.0 and time <= 1400.0:
        return 200.0
    else:
        return 0.0
model = ThermalModel()
#HEADER NODE DATA
model.addNode(Node(1, 14400.0, 297.98277018, power_L, 'Lower platform'))
model.addNode(Node(2, 22400.0, 298.24525503, power_U, 'Upper platform'))
model.addNode(Node(3, 1600.0, 296.06883956, power_103, 'Interface ring'))
model.addNode(Node(4, 400.0,329.76981036, power_104, 'Front panel radiator'))
model.addNode(Node(5, 450.0,352.92355844, power_105, 'Front solar panel'))
model.addNode(Node(6, 450.0,287.98149354, power_106, 'Rear solar panel'))
model.addNode(Node(7, 450.0,326.16023767, power_107, 'Lateral solar panel _1'))
model.addNode(Node(8, 450.0,326.16113637, power_108, 'Lateral solar panel _2'))
model.addNode(Node(9, 150.0,334.52013919, power_109, 'Silicon cell SiCELL_2'))
model.addNode(Node(10, 150.0,334.28346766, power_110, 'Silicon cell SiCELL_1'))
model.addNode(Node(11, 200.0,272.24511844, power_111, 'RF antena'))
model.addNode(Node(12, 300.0,278.89073804, power_112, 'Upper microSwitch'))
model.addNode(Node(13, 300.0, 289.79374615, power_113, 'Lower microSwitch'))
model.addNode(Node(14, 0.1,230.57181926, power_114, 'MLI-Upper platform'))
model.addNode(Node(15, 0.1,231.29593541, power_115, 'MLI-lateral_2'))
model.addNode(Node(16, 0.1, 230.27827859, power_116, 'MLI-Lower platform'))
model.addNode(Node(17, 0.1,246.37801687,power_117, 'Shunt_2'))
model.addNode(Node(18, 0.1,246.37762173, power_118, 'Shunt_1'))
model.addNode(Node(19, 0.1,231.29564763, power_119, 'MLI-lateral_1'))
model.addNode(Node(20, 100.0, 238.06403638, power_120, 'GPS_1 Antenna'))
model.addNode(Node(21, 100.0,240.55841625, power_121, 'GPS_2 Antenna'))
model.addNode(Node(22, 100.0, 238.06404797, power_122, 'GPS_3 Antenna'))
model.addNode(Node(23, 100.0,240.5584162, power_123, 'GPS_4 Antenna'))
model.addNode(Node(24, 250.0, 299.55312394, lambda x: 0.0, 'Structure - lateral_1'))
model.addNode(Node(25, 250.0, 296.43180947, lambda x: 0.0, 'Structure - rear'))
model.addNode(Node(26, 450.0,295.62728453, power_126, 'Radiator_2'))
model.addNode(Node(27, 450.0,295.62727054, power_127, 'Radiator_1'))
model.addNode(Node(28, 250.0, 299.55318916, lambda x: 0.0, 'Structure - lateral_2'))
model.addNode(Node(29, 3.0, 231.75567726, power_129, 'MLI - magnetometer'))
model.addNode(Node(30, 300.0, 287.14924544, lambda x: 0.0,'Magnetometer'))
model.addNode(Node(31, 1100.0,301.97360573, lambda x: 0.0, 'Structure - front'))
model.addNode(Node(32, 3.0,339.90875354, lambda x: 0.0, 'Mathematical node'))
model.addNode(Node(-99, 0.10, 0.0, lambda x: 0.0, 'Space'))
#HEADER CONDUCTOR DATA
# CONDUCTANCES
model.addConductance(1, 31, 1.10) # Lower_plat - Front_estr
model.addConductance(1, 24, 0.22) # Lower_plat - Latestr_1
model.addConductance(1, 28, 0.22) # Lower_plat - Latestr_2
model.addConductance(1, 25, 0.22) # Lower_plat - Rear_estr.
model.addConductance(1, 27, 0.44) # Lower_plat - Radiator_1
model.addConductance(1, 26, 0.44) # Lower_plat - Radiator_2
model.addConductance(2, 31, 1.10) # Upper_plat - Front_estr
model.addConductance(2, 24, 0.22) # Upper_plat - Latestr_1
model.addConductance(2, 28, 0.22) # Upper_plat - Latestr_2
model.addConductance(2, 25, 0.22) # Upper_plat - Rear_estr
model.addConductance(2, 27, 0.44) # Upper_plat - Radiator_1
model.addConductance(2, 26, 0.44) # Upper_plat - Radiator_2
model.addConductance(25, 26, 0.46) # Rear_estr - Radiator_2
model.addConductance(26, 28, 0.46) # Radiator_2 - Latestr_2
model.addConductance(28, 31, 0.23) # Latestr_2 - Front_estr
model.addConductance(31, 24, 0.23) # Front_estr - Latestr_1
model.addConductance(24, 27, 0.46) # Latestr_1 - Radiator_1
model.addConductance(27, 25, 0.46) # Radiator_1 - Rear_estr
model.addConductance(31, 4, 0.25) # Front_estr - Rad_panel
model.addConductance(31, 32, 0.07) # Front_estr - Arith_node
model.addConductance(32, 5, 1.84) # Arith_node - Panel_front
model.addConductance(4, 32, 2.10) # Rad_panel - Arith_node
model.addConductance(4, 10, 0.109) # Rad_panel - SiCELL_1
model.addConductance(4, 9, 0.10) # Rad_panel - SiCELL_2
model.addConductance(24, 7, 0.08) # Latestr_1 - Panel_lat1
model.addConductance(28, 8, 0.08) # Latestr_2 - Panel_lat2
model.addConductance(27, 6, 0.31) # Radiator_1 - Panel_rear
model.addConductance(26, 6, 0.31) # Radiator_2 - Panel_rear
model.addConductance(2, 6, 0.61) # Upper_plat - Panel_rear
model.addConductance(1, 6, 0.61) # Lower_plat - Panel_rear
model.addConductance(2, 20, 0.025) # Upper_plat - DGPS_1
model.addConductance(2, 21, 0.025) # Upper_plat - DGPS_2
model.addConductance(2, 22, 0.025) # Upper_plat - DGPS_3
model.addConductance(2, 23, 0.025) # Upper_plat - DGPS_4
model.addConductance(2, 11, 0.06) # Upper_plat - RF_antenna
model.addConductance(2, 12, 0.06) # Upper_plat - Up_switch
model.addConductance(2, 30, 0.06) # Upper_plat - Magnetomet
model.addConductance(1, 3, 2.30) # Lower_plat - Interf_ring
model.addConductance(1, 13, 0.10) # Lower_plat - Low_switch
# ADMITTANCES
model.addAdmittance(1, 16, 1200.*EMLI) # Lower_plat - Lower_MLI
model.addAdmittance(2, 14, 1500.*EMLI) # Upper_plat - Upper_MLI
model.addAdmittance(24, 19, 500.*EMLI) # Latestr_1 - MLI_Later1
model.addAdmittance(24, 18, 400.*EMLI) # Latestr_1 - Shunt_1
model.addAdmittance(28, 15, 500.*EMLI) # Latestr_2 - MLI_Later2
model.addAdmittance(28, 17, 400.*EMLI) # Latestr_2 - Shunt_2
model.addAdmittance(30, 29, 3000.*EMLI) # Magnetomet - Magn_MLI
model.addAdmittance(3, 16, 106.424) # from SSPTA
model.addAdmittance(3, -99, 210.306) # from SSPTA
model.addAdmittance(4, 7, 8.474) # from SSPTA
model.addAdmittance(4, 8, 8.474) # from SSPTA
model.addAdmittance(4, -99, 436.524) # from SSPTA
model.addAdmittance(5, -99, 562.647) # from SSPTA
model.addAdmittance(6, -99, 562.647) # from SSPTA
model.addAdmittance(7, 10, 4.582) # from SSPTA
model.addAdmittance(7, 18, 59.946) # from SSPTA
model.addAdmittance(7, 19, 42.879) # from SSPTA
model.addAdmittance(7, 20, 1.538) # from SSPTA
model.addAdmittance(7, -99,1105.034) # from SSPTA
model.addAdmittance(8, 9, 4.582) # from SSPTA
model.addAdmittance(8, 15, 42.879) # from SSPTA
model.addAdmittance(8, 17, 59.946) # from SSPTA
model.addAdmittance(8, 22, 1.538) # from SSPTA
model.addAdmittance(8, -99,1105.034) # from SSPTA
model.addAdmittance(9, -99, 79.908) # from SSPTA
model.addAdmittance(10,-99, 79.908) # from SSPTA
model.addAdmittance(11, 12, 1.701) # from SSPTA
model.addAdmittance(11, 14, 15.962) # from SSPTA
model.addAdmittance(11, 20, 2.145) # from SSPTA
model.addAdmittance(11, 21, 2.142) # from SSPTA
model.addAdmittance(11, 22, 2.145) # from SSPTA
model.addAdmittance(11, 23, 2.142) # from SSPTA
model.addAdmittance(11, 29, 3.711) # from SSPTA
model.addAdmittance(11, -99, 86.412) # from SSPTA
model.addAdmittance(12, 14, 9.881) # from SSPTA
model.addAdmittance(12, 21, 2.745) # from SSPTA
model.addAdmittance(12, 23, 2.745) # from SSPTA
model.addAdmittance(12, -99, 97.698) # from SSPTA
model.addAdmittance(13, -99, 26.730) # from SSPTA
model.addAdmittance(14, 20, 32.392) # from SSPTA
model.addAdmittance(14, 21, 36.769) # from SSPTA
model.addAdmittance(14, 22, 32.392) # from SSPTA
model.addAdmittance(14, 23, 36.769) # from SSPTA
model.addAdmittance(14, 29, 30.516) # from SSPTA
model.addAdmittance(14, -99,688.541) # from SSPTA
model.addAdmittance(15, 17, 1.128) # from SSPTA
model.addAdmittance(15, -99,330.793) # from SSPTA
model.addAdmittance(16, -99,828.833) # from SSPTA
model.addAdmittance(17, -99,253.101) # from SSPTA
model.addAdmittance(18, 19, 1.128) # from SSPTA
model.addAdmittance(18, -99,253.101) # from SSPTA
model.addAdmittance(19, -99,330.793) # from SSPTA
model.addAdmittance(20, 22, 1.014) # from SSPTA
model.addAdmittance(20, 23, 1.289) # from SSPTA
model.addAdmittance(20, 29, 6.992) # from SSPTA
model.addAdmittance(20, -99,178.462) # from SSPTA
model.addAdmittance(21, 22, 1.289) # from SSPTA
model.addAdmittance(21, 23, 1.014) # from SSPTA
model.addAdmittance(21, 29, 1.113) # from SSPTA
model.addAdmittance(21, -99,144.697) # from SSPTA
model.addAdmittance(22, 29, 6.992) # from SSPTA
model.addAdmittance(22, -99,178.462) # from SSPTA
model.addAdmittance(23, 29, 1.113) # from SSPTA
model.addAdmittance(23, -99,144.697) # from SSPTA
model.addAdmittance(26, -99,191.484*F_AREA) # from SSPTA
model.addAdmittance(27, -99,191.484*F_AREA) # from SSPTA
model.addAdmittance(29, -99,160.929) # from SSPTA
solver = FDMExplicit(model,0.1)
solver.solve(t_start, t_end)
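# The network above is integrated with an explicit finite-difference scheme.
# As a minimal sketch of the forward-Euler update such a solver performs on a
# conductively coupled pair of nodes (`FDMExplicit` itself is defined elsewhere
# in this project; every name and value below is illustrative only):

```python
# Two thermal nodes coupled by a conductance G [W/K], each with heat
# capacity C [J/K].  Forward Euler: T_i += dt * G * (T_j - T_i) / C_i.
def euler_two_node(T1, T2, G=1.0, C=1.0, dt=0.01, steps=2000):
    for _ in range(steps):
        q = G * (T2 - T1)  # heat flow from node 2 into node 1 [W]
        T1, T2 = T1 + dt * q / C, T2 - dt * q / C
    return T1, T2

# Both nodes relax toward the common equilibrium temperature.
T1, T2 = euler_two_node(0.0, 100.0)
```

# The radiative admittances make the real problem nonlinear in T, which is one
# reason an explicit scheme needs a small step such as the 0.1 passed to
# `FDMExplicit`; for the linear sketch above the stability bound is dt < C/G.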
#------------------------------CREATE WINDOW---------------------------------
root = tkinter.Tk()
root.wm_title("Outgassing - PROTOTIPO")
#------------------------------CREATE PLOT-----------------------------------
fig = Figure(figsize=(5, 5), dpi=100)
ax = fig.add_subplot(111)  # single axes, reused for every curve
# PLOT 1
#ax.plot((solver.t)/3600, solver.T[0]-273, linestyle='--', color='yellow', label="Lower platform")
#ax.plot((solver.t)/3600, solver.T[1]-273, linestyle='--', color='blue', label="Upper platform")
#ax.plot((solver.t)/3600, solver.T[2]-273, linestyle='-', color='green', label="Interface ring")
# PLOT 2
#ax.plot((solver.t)/3600, solver.T[4]-273, linestyle='--', color='blue', label="Front solar panel")
#ax.plot((solver.t)/3600, solver.T[5]-273, linestyle='-.', color='magenta', label="Rear solar panel")
#ax.plot((solver.t)/3600, solver.T[6]-273, linestyle='-', color='red', label="Lateral solar panel _1")
#ax.plot((solver.t)/3600, solver.T[8]-273, linestyle='-', color='green', label="Silicon cell SiCELL_2")
# PLOT 3
ax.plot((solver.t)/3600, solver.T[10]-273, linestyle='--', color='blue', label="RF antenna")
ax.plot((solver.t)/3600, solver.T[11]-273, linestyle='-', color='green', label="Upper microSwitch")
ax.plot((solver.t)/3600, solver.T[12]-273, linestyle='-', color='red', label="Lower microSwitch")
ax.plot((solver.t)/3600, solver.T[13]-273, linestyle='-.', color='cyan', label="GPS_1 Antenna")
ax.plot((solver.t)/3600, solver.T[14]-273, linestyle='-', color='purple', label="Magnetometer")
#*******************************************************************************
# Elapsed-time calculation
final = datetime.datetime.now()
#*******************************************************************************
Diferencia = final - inicio
print(" ")
print("Processing time:")
print(Diferencia.total_seconds())
# PLOT IMAGES
#
fig.suptitle("MOD_32_VariacionPotencia_H15_3o", fontsize=14)
ax = fig.gca()  # current axes of the figure
ax.set_xlabel('Time [h]')
ax.set_ylabel('Temperature [C]')
ax.grid(linestyle='--', linewidth=0.9, alpha=0.5)
ax.legend(loc=6, bbox_to_anchor=(1.0, 0.9))
canvas = FigureCanvasTkAgg(fig, master=root)  # create the Tk drawing area
canvas.draw()
canvas.get_tk_widget().pack(side=tkinter.TOP, fill=tkinter.BOTH, expand=1)
#-----------------------ADD TOOLBAR-------------------------------------------
toolbar = NavigationToolbar2Tk(canvas, root)  # icon bar
toolbar.update()
canvas.get_tk_widget().pack(side=tkinter.TOP, fill=tkinter.BOTH, expand=1)
#-----------------------------"Close" BUTTON----------------------------------
def cerrar():
    root.quit()
    root.destroy()

button = tkinter.Button(master=root, text="Close", command=cerrar)
button.pack(side=tkinter.BOTTOM)
tkinter.mainloop()
# ----- venv/Lib/site-packages/rcon/client.py -----
"""Synchronous client."""
from socket import socket
from typing import Optional

from rcon.exceptions import RequestIdMismatch, WrongPassword
from rcon.proto import Packet, Type

__all__ = ['Client']


class Client:
    """An RCON client."""

    __slots__ = ('_socket', 'host', 'port', 'passwd')

    def __init__(self, host: str, port: int, *,
                 timeout: Optional[float] = None,
                 passwd: Optional[str] = None):
        """Initializes the base client with the SOCK_STREAM socket type."""
        self._socket = socket()
        self.host = host
        self.port = port
        self.timeout = timeout
        self.passwd = passwd

    def __enter__(self):
        """Attempts an auto-login if a password is set."""
        self._socket.__enter__()
        self.connect(login=True)
        return self

    def __exit__(self, typ, value, traceback):
        """Delegates to the underlying socket's exit method."""
        return self._socket.__exit__(typ, value, traceback)

    @property
    def timeout(self) -> float:
        """Returns the socket timeout."""
        return self._socket.gettimeout()

    @timeout.setter
    def timeout(self, timeout: float):
        """Sets the socket timeout."""
        self._socket.settimeout(timeout)

    def connect(self, login: bool = False) -> None:
        """Connects the socket and attempts a
        login if wanted and a password is set.
        """
        self._socket.connect((self.host, self.port))
        if login and self.passwd is not None:
            self.login(self.passwd)

    def close(self) -> None:
        """Closes the socket connection."""
        self._socket.close()

    def communicate(self, packet: Packet) -> Packet:
        """Sends and receives a packet."""
        with self._socket.makefile('wb') as file:
            file.write(bytes(packet))
        return self.read()

    def read(self) -> Packet:
        """Reads a packet."""
        with self._socket.makefile('rb') as file:
            return Packet.read(file)

    def login(self, passwd: str) -> bool:
        """Performs a login."""
        response = self.communicate(Packet.make_login(passwd))
        # Wait for SERVERDATA_AUTH_RESPONSE according to:
        # https://developer.valvesoftware.com/wiki/Source_RCON_Protocol
        while response.type != Type.SERVERDATA_AUTH_RESPONSE:
            response = self.read()
        if response.id == -1:
            raise WrongPassword()
        return True

    def run(self, command: str, *arguments: str, raw: bool = False) -> str:
        """Runs a command."""
        request = Packet.make_command(command, *arguments)
        response = self.communicate(request)
        if response.id != request.id:
            raise RequestIdMismatch(request.id, response.id)
        return response if raw else response.payload
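# A stdlib-only sketch of the wire framing `Packet` implements, per the Source
# RCON protocol page linked in `login`: a little-endian int32 size prefix that
# excludes itself, then int32 id, int32 type, an ASCII body, and two NUL
# terminators. `frame_packet` and the constant are illustrative names, not
# part of this module:

```python
import struct

SERVERDATA_AUTH = 3  # request type used for the login handshake

def frame_packet(req_id: int, ptype: int, body: str) -> bytes:
    """Serialize one RCON packet: size prefix + id + type + body + 2 NULs."""
    payload = struct.pack('<ii', req_id, ptype) + body.encode('ascii') + b'\x00\x00'
    return struct.pack('<i', len(payload)) + payload

pkt = frame_packet(1, SERVERDATA_AUTH, 'secret')  # 20 bytes total
```

# The size prefix is what `Packet.read` above relies on to know how many
# octets belong to the response it is waiting for.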
# ----- pysnmp-with-texts/HUAWEI-MA5200-DEVICE-MIB.py -----
#
# PySNMP MIB module HUAWEI-MA5200-DEVICE-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/HUAWEI-MA5200-DEVICE-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:46:33 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, SingleValueConstraint, ValueSizeConstraint, ConstraintsUnion, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "SingleValueConstraint", "ValueSizeConstraint", "ConstraintsUnion", "ValueRangeConstraint")
hwFrameIndex, hwSlotIndex = mibBuilder.importSymbols("HUAWEI-DEVICE-MIB", "hwFrameIndex", "hwSlotIndex")
hwMA5200Mib, = mibBuilder.importSymbols("HUAWEI-MIB", "hwMA5200Mib")
HWFrameType, HWPCBType, HWPortType, HWSubPCBType = mibBuilder.importSymbols("HUAWEI-TC-MIB", "HWFrameType", "HWPCBType", "HWPortType", "HWSubPCBType")
VlanIdOrNone, VlanId = mibBuilder.importSymbols("Q-BRIDGE-MIB", "VlanIdOrNone", "VlanId")
ModuleCompliance, NotificationGroup, ObjectGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup", "ObjectGroup")
Integer32, Counter64, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, ModuleIdentity, IpAddress, MibIdentifier, TimeTicks, Unsigned32, Counter32, NotificationType, ObjectIdentity, Gauge32, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "Integer32", "Counter64", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ModuleIdentity", "IpAddress", "MibIdentifier", "TimeTicks", "Unsigned32", "Counter32", "NotificationType", "ObjectIdentity", "Gauge32", "Bits")
MacAddress, DisplayString, DateAndTime, TextualConvention, RowStatus, TruthValue = mibBuilder.importSymbols("SNMPv2-TC", "MacAddress", "DisplayString", "DateAndTime", "TextualConvention", "RowStatus", "TruthValue")
hwMA5200Device = ModuleIdentity((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201))
if mibBuilder.loadTexts: hwMA5200Device.setLastUpdated('200408300900Z')
if mibBuilder.loadTexts: hwMA5200Device.setOrganization(' NanJing Institute,Huawei Technologies Co.,Ltd. HuiHong Mansion,No.91 BaiXia Rd. NanJing, P.R. of China Zipcode:210001 Http://www.huawei.com E-mail:support@huawei.com ')
if mibBuilder.loadTexts: hwMA5200Device.setContactInfo('The MIB contains objects of module MA5200 device.')
if mibBuilder.loadTexts: hwMA5200Device.setDescription('Huawei ma5200 device mib.')
hw52DevSlot = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 1))
hw52DevSlotNum = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevSlotNum.setStatus('current')
if mibBuilder.loadTexts: hw52DevSlotNum.setDescription(' The slot number. ')
hw52DevSubSlotNum = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevSubSlotNum.setStatus('current')
if mibBuilder.loadTexts: hw52DevSubSlotNum.setDescription(' The sub-slot number. ')
hw52DevPortNum = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevPortNum.setStatus('current')
if mibBuilder.loadTexts: hw52DevPortNum.setDescription(' The port number. ')
hw52DevPortOperateStatus = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 1, 4), Integer32()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevPortOperateStatus.setStatus('current')
if mibBuilder.loadTexts: hw52DevPortOperateStatus.setDescription(' The port Operate Status. ')
hw52DevSlotTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 2))
hw52DevSlotReset = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 2, 1006)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevSlotReset.setStatus('current')
if mibBuilder.loadTexts: hw52DevSlotReset.setDescription(' The trap report of slot reset. ')
hw52DevSlotRegOK = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 2, 1007)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevSlotRegOK.setStatus('current')
if mibBuilder.loadTexts: hw52DevSlotRegOK.setDescription(' The trap report of slot register OK. ')
hw52DevSlotPlugOut = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 2, 1008)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevSlotPlugOut.setStatus('current')
if mibBuilder.loadTexts: hw52DevSlotPlugOut.setDescription(' The trap report of slot plug out. ')
hwHdDev = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5))
hwHdDevTable = MibTable((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5, 1), )
if mibBuilder.loadTexts: hwHdDevTable.setStatus('current')
if mibBuilder.loadTexts: hwHdDevTable.setDescription(' This table contains harddisk information. ')
hwHdDevEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5, 1, 1), ).setIndexNames((0, "HUAWEI-DEVICE-MIB", "hwFrameIndex"), (0, "HUAWEI-DEVICE-MIB", "hwSlotIndex"), (0, "HUAWEI-MA5200-DEVICE-MIB", "hwHdDevIndex"))
if mibBuilder.loadTexts: hwHdDevEntry.setStatus('current')
if mibBuilder.loadTexts: hwHdDevEntry.setDescription(' The table entry of harddisk information. ')
hwHdDevIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535)))
if mibBuilder.loadTexts: hwHdDevIndex.setStatus('current')
if mibBuilder.loadTexts: hwHdDevIndex.setDescription(' The index of harddisk information table. ')
hwHdDevSize = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwHdDevSize.setStatus('current')
if mibBuilder.loadTexts: hwHdDevSize.setDescription(' Total Size in Octets of harddisk memory. ')
hwHdDevFree = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 5, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwHdDevFree.setStatus('current')
if mibBuilder.loadTexts: hwHdDevFree.setDescription(' Unused Size in Octets of harddisk memory. ')
hw52DevPortTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 6))
hw52DevPortUp = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 6, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortOperateStatus"))
if mibBuilder.loadTexts: hw52DevPortUp.setStatus('current')
if mibBuilder.loadTexts: hw52DevPortUp.setDescription(' Port up. ')
hw52DevPortDown = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 6, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortOperateStatus"))
if mibBuilder.loadTexts: hw52DevPortDown.setStatus('current')
if mibBuilder.loadTexts: hw52DevPortDown.setDescription(' Port down. ')
hw52DevUserAttackInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 7))
hw52DevUserIPAddr = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 7, 1), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevUserIPAddr.setStatus('current')
if mibBuilder.loadTexts: hw52DevUserIPAddr.setDescription(" The user's IP address. ")
hw52DevUserMac = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 7, 2), MacAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevUserMac.setStatus('current')
if mibBuilder.loadTexts: hw52DevUserMac.setDescription(" The user's MAC address. ")
hw52DevUserIndex = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 7, 3), Integer32()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevUserIndex.setStatus('current')
if mibBuilder.loadTexts: hw52DevUserIndex.setDescription(' The index of user, could be vlan id, Session id or VCD according with the type of user. ')
hw52DevUserOuterVlan = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 7, 4), VlanIdOrNone()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevUserOuterVlan.setStatus('current')
if mibBuilder.loadTexts: hw52DevUserOuterVlan.setDescription(' The outer vlan. ')
hw52DevUserAttack = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 8))
hw52DevUserAttackTrap = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 8, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserIPAddr"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserMac"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserIndex"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserOuterVlan"))
if mibBuilder.loadTexts: hw52DevUserAttackTrap.setStatus('current')
if mibBuilder.loadTexts: hw52DevUserAttackTrap.setDescription(' The trap report of user attack. ')
hw52TrapSwitch = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 9))
hw52HwdeviceOrBasetrap = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 9, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("disable", 1), ("hwdevice", 2), ("basetrap", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hw52HwdeviceOrBasetrap.setStatus('current')
if mibBuilder.loadTexts: hw52HwdeviceOrBasetrap.setDescription(' Trap switches between basetrap and hwdevice. ')
hw52DevMemUsage = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 10))
hw52DevMemUsageThreshold = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 10, 1), Integer32()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevMemUsageThreshold.setStatus('current')
if mibBuilder.loadTexts: hw52DevMemUsageThreshold.setDescription(' Memory usage threshold. ')
hw52DevMemUsageTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 11))
hw52DevMemUsageAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 11, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevMemUsageThreshold"))
if mibBuilder.loadTexts: hw52DevMemUsageAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevMemUsageAlarm.setDescription(' Memory usage alarm. ')
hw52DevMemUsageResume = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 11, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevMemUsageThreshold"))
if mibBuilder.loadTexts: hw52DevMemUsageResume.setStatus('current')
if mibBuilder.loadTexts: hw52DevMemUsageResume.setDescription(' Memory usage alarm resume. ')
hw52DevStartupFileFail = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 12))
hw52DevDefaultStartupFileName = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 12, 1), OctetString()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevDefaultStartupFileName.setStatus('current')
if mibBuilder.loadTexts: hw52DevDefaultStartupFileName.setDescription(' Default startup file name. ')
hw52DevCurrentStartupFileName = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 12, 2), OctetString()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevCurrentStartupFileName.setStatus('current')
if mibBuilder.loadTexts: hw52DevCurrentStartupFileName.setDescription(' Current startup file name. ')
hw52DevStartupFileFailTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 13))
hw52DevStartupFileReloadAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 13, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDefaultStartupFileName"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevCurrentStartupFileName"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevStartupFileReloadAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevStartupFileReloadAlarm.setDescription(' Startup file load fail alarm. ')
hw52DevDiskSelfTestFail = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 14))
hw52DevDiskSelfTestDiskType = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 14, 1), OctetString()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevDiskSelfTestDiskType.setStatus('current')
if mibBuilder.loadTexts: hw52DevDiskSelfTestDiskType.setDescription(' Disk type: cfcard or harddisk. ')
hw52DevDiskSelfTestFailStep = MibScalar((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 14, 2), OctetString()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: hw52DevDiskSelfTestFailStep.setStatus('current')
if mibBuilder.loadTexts: hw52DevDiskSelfTestFailStep.setDescription(' Disk self-test fail step. ')
hw52DevDiskSelfTestFailTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 15))
hw52DevDiskSelfTestFailAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 15, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDiskSelfTestDiskType"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDiskSelfTestFailStep"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevDiskSelfTestFailAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevDiskSelfTestFailAlarm.setDescription(' Disk selftest error alarm. ')
hw52DevCfUnregisterTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 16))
hw52DevCfUnregisteredAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 16, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevCfUnregisteredAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevCfUnregisteredAlarm.setDescription(' Cf card unregistered. ')
hw52DevHpt372ErrorTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 17))
hw52DevHpt372ErrorAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 17, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevHpt372ErrorAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevHpt372ErrorAlarm.setDescription(' Hpt372 occur error. ')
hw52DevHarddiskUsageTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 18))
hw52DevHarddiskUsageAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 18, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevHarddiskUsageAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevHarddiskUsageAlarm.setDescription(' Harddisk usage alarm. ')
hw52DevHarddiskUsageResume = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 18, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevHarddiskUsageResume.setStatus('current')
if mibBuilder.loadTexts: hw52DevHarddiskUsageResume.setDescription(' Harddisk usage alarm resume. ')
hw52PacketError = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 19))
hw52InPacketErrorTrap = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 19, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"))
if mibBuilder.loadTexts: hw52InPacketErrorTrap.setStatus('current')
if mibBuilder.loadTexts: hw52InPacketErrorTrap.setDescription(' In packet error. ')
hw52OutPacketErrorTrap = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 19, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"))
if mibBuilder.loadTexts: hw52OutPacketErrorTrap.setStatus('current')
if mibBuilder.loadTexts: hw52OutPacketErrorTrap.setDescription(' Out packet error. ')
hw52DevCfcardUsageTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 20))
hw52DevCfcardUsageAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 20, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevCfcardUsageAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevCfcardUsageAlarm.setDescription(' Cfcard usage alarm. ')
hw52DevCfcardUsageResume = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 20, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevCfcardUsageResume.setStatus('current')
if mibBuilder.loadTexts: hw52DevCfcardUsageResume.setDescription(' Cfcard usage alarm resume. ')
hw52DevFlashUsageTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 21))
hw52DevFlashUsageAlarm = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 21, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevFlashUsageAlarm.setStatus('current')
if mibBuilder.loadTexts: hw52DevFlashUsageAlarm.setDescription(' Flash usage alarm. ')
hw52DevFlashUsageResume = NotificationType((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 21, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"))
if mibBuilder.loadTexts: hw52DevFlashUsageResume.setStatus('current')
if mibBuilder.loadTexts: hw52DevFlashUsageResume.setDescription(' Flash usage alarm resume. ')
hw52DevConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200))
hw52DevCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 1))
hw52DevCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 1, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotGroup"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevHdTableGroup"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevTrapsGroup"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevTrapObjectsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hw52DevCompliance = hw52DevCompliance.setStatus('current')
if mibBuilder.loadTexts: hw52DevCompliance.setDescription('The compliance statement for systems supporting the this module.')
hw52DevObjectGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 2))
hw52DevSlotGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 2, 1)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSubSlotNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortNum"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortOperateStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hw52DevSlotGroup = hw52DevSlotGroup.setStatus('current')
if mibBuilder.loadTexts: hw52DevSlotGroup.setDescription('The MA5200 device slot group objects.')
hw52DevHdTableGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 2, 2)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hwHdDevSize"), ("HUAWEI-MA5200-DEVICE-MIB", "hwHdDevFree"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hw52DevHdTableGroup = hw52DevHdTableGroup.setStatus('current')
if mibBuilder.loadTexts: hw52DevHdTableGroup.setDescription('The MA5200 device harddisk information table group.')
hw52DevTrapsGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 2, 3)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotReset"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotRegOK"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevSlotPlugOut"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortUp"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevPortDown"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserAttackTrap"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevMemUsageAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevMemUsageResume"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevStartupFileReloadAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDiskSelfTestFailAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevCfUnregisteredAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevHpt372ErrorAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevHarddiskUsageAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevHarddiskUsageResume"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52InPacketErrorTrap"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52OutPacketErrorTrap"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevCfcardUsageAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevCfcardUsageResume"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevFlashUsageAlarm"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevFlashUsageResume"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hw52DevTrapsGroup = hw52DevTrapsGroup.setStatus('current')
if mibBuilder.loadTexts: hw52DevTrapsGroup.setDescription('The MA5200 device traps group.')
hw52DevTrapObjectsGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 2011, 2, 6, 2, 201, 200, 2, 4)).setObjects(("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserIPAddr"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserMac"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserIndex"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevUserOuterVlan"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52HwdeviceOrBasetrap"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevMemUsageThreshold"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDefaultStartupFileName"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevCurrentStartupFileName"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDiskSelfTestDiskType"), ("HUAWEI-MA5200-DEVICE-MIB", "hw52DevDiskSelfTestFailStep"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hw52DevTrapObjectsGroup = hw52DevTrapObjectsGroup.setStatus('current')
if mibBuilder.loadTexts: hw52DevTrapObjectsGroup.setDescription('The objects of MA5200 device traps group.')
mibBuilder.exportSymbols("HUAWEI-MA5200-DEVICE-MIB", hwHdDevFree=hwHdDevFree, hw52DevHarddiskUsageResume=hw52DevHarddiskUsageResume, hw52DevConformance=hw52DevConformance, hw52DevObjectGroups=hw52DevObjectGroups, hw52DevMemUsageTrap=hw52DevMemUsageTrap, hw52DevHdTableGroup=hw52DevHdTableGroup, hw52DevTrapObjectsGroup=hw52DevTrapObjectsGroup, hw52DevSubSlotNum=hw52DevSubSlotNum, hw52DevCfcardUsageResume=hw52DevCfcardUsageResume, hw52DevHpt372ErrorAlarm=hw52DevHpt372ErrorAlarm, hw52HwdeviceOrBasetrap=hw52HwdeviceOrBasetrap, hw52DevMemUsageResume=hw52DevMemUsageResume, hwHdDevIndex=hwHdDevIndex, hw52DevUserAttackInfo=hw52DevUserAttackInfo, hw52DevSlotReset=hw52DevSlotReset, hw52DevSlot=hw52DevSlot, hw52DevMemUsageAlarm=hw52DevMemUsageAlarm, hw52DevUserIndex=hw52DevUserIndex, hw52DevFlashUsageAlarm=hw52DevFlashUsageAlarm, hw52DevMemUsageThreshold=hw52DevMemUsageThreshold, hw52DevDefaultStartupFileName=hw52DevDefaultStartupFileName, hw52DevHarddiskUsageAlarm=hw52DevHarddiskUsageAlarm, hw52DevPortTrap=hw52DevPortTrap, hw52DevUserIPAddr=hw52DevUserIPAddr, hw52TrapSwitch=hw52TrapSwitch, hwHdDevEntry=hwHdDevEntry, hw52DevDiskSelfTestFail=hw52DevDiskSelfTestFail, hw52DevSlotPlugOut=hw52DevSlotPlugOut, hwHdDevSize=hwHdDevSize, hw52DevUserAttack=hw52DevUserAttack, hw52DevPortUp=hw52DevPortUp, hw52DevStartupFileFail=hw52DevStartupFileFail, hw52DevDiskSelfTestDiskType=hw52DevDiskSelfTestDiskType, hw52DevDiskSelfTestFailAlarm=hw52DevDiskSelfTestFailAlarm, hw52DevCfUnregisteredAlarm=hw52DevCfUnregisteredAlarm, hw52DevTrapsGroup=hw52DevTrapsGroup, hw52DevCurrentStartupFileName=hw52DevCurrentStartupFileName, hw52DevFlashUsageTrap=hw52DevFlashUsageTrap, hw52DevCompliances=hw52DevCompliances, hw52DevSlotGroup=hw52DevSlotGroup, hwHdDev=hwHdDev, hw52DevDiskSelfTestFailTrap=hw52DevDiskSelfTestFailTrap, hw52PacketError=hw52PacketError, hw52InPacketErrorTrap=hw52InPacketErrorTrap, hw52DevStartupFileFailTrap=hw52DevStartupFileFailTrap, hw52DevCfUnregisterTrap=hw52DevCfUnregisterTrap, 
hw52DevCfcardUsageTrap=hw52DevCfcardUsageTrap, hw52DevHarddiskUsageTrap=hw52DevHarddiskUsageTrap, hw52DevCfcardUsageAlarm=hw52DevCfcardUsageAlarm, hw52DevFlashUsageResume=hw52DevFlashUsageResume, hw52DevDiskSelfTestFailStep=hw52DevDiskSelfTestFailStep, hw52DevUserOuterVlan=hw52DevUserOuterVlan, hw52DevUserAttackTrap=hw52DevUserAttackTrap, hw52DevUserMac=hw52DevUserMac, hw52DevPortDown=hw52DevPortDown, hwHdDevTable=hwHdDevTable, hw52DevCompliance=hw52DevCompliance, hw52DevPortNum=hw52DevPortNum, hw52DevHpt372ErrorTrap=hw52DevHpt372ErrorTrap, hw52DevStartupFileReloadAlarm=hw52DevStartupFileReloadAlarm, hw52DevMemUsage=hw52DevMemUsage, hw52DevSlotNum=hw52DevSlotNum, hw52DevPortOperateStatus=hw52DevPortOperateStatus, hwMA5200Device=hwMA5200Device, PYSNMP_MODULE_ID=hwMA5200Device, hw52DevSlotRegOK=hw52DevSlotRegOK, hw52OutPacketErrorTrap=hw52OutPacketErrorTrap, hw52DevSlotTrap=hw52DevSlotTrap)
# ---- Pillow-4.3.0/Tests/test_image_fromqpixmap.py (repo: leorzz/simplemooc, MIT) ----
from helper import unittest, PillowTestCase, hopper
from test_imageqt import PillowQtTestCase, PillowQPixmapTestCase
from PIL import ImageQt
class TestFromQPixmap(PillowQPixmapTestCase, PillowTestCase):
def roundtrip(self, expected):
PillowQtTestCase.setUp(self)
result = ImageQt.fromqpixmap(ImageQt.toqpixmap(expected))
# Qt saves all pixmaps as rgb
self.assert_image_equal(result, expected.convert('RGB'))
def test_sanity_1(self):
self.roundtrip(hopper('1'))
def test_sanity_rgb(self):
self.roundtrip(hopper('RGB'))
def test_sanity_rgba(self):
self.roundtrip(hopper('RGBA'))
def test_sanity_l(self):
self.roundtrip(hopper('L'))
def test_sanity_p(self):
self.roundtrip(hopper('P'))
if __name__ == '__main__':
unittest.main()
# ---- docs/tutorials/deployment/int8_inference.py (repo: ptrendx/gluon-cv, Apache-2.0) ----
"""3. Inference with Quantized Models
=====================================
This is a tutorial which illustrates how to use quantized GluonCV
models for inference on Intel Xeon Processors to gain higher performance.
The following example requires ``GluonCV>=0.4`` and ``MXNet-mkl>=1.6.0b20190829``. Please follow `our installation guide <../../index.html#installation>`__ to install or upgrade GluonCV and nightly build of MXNet if necessary.
Introduction
------------
GluonCV provides quantized models that improve performance and reduce deployment costs for computer vision inference tasks. In production, lower precision (INT8) brings two main benefits. First, computation can be accelerated by low-precision instructions such as the Intel Vector Neural Network Instructions (VNNI). Second, the lower-precision data type saves memory bandwidth, allows for better cache locality, and reduces power consumption. This feature can deliver up to 4X speedup on the latest `AWS EC2 C5 instances <https://aws.amazon.com/blogs/aws/now-available-new-c5-instance-sizes-and-bare-metal-instances/>`_ with `Intel Deep Learning Boost (VNNI) <https://www.intel.ai/intel-deep-learning-boost/>`_ enabled hardware, with less than a 0.5% accuracy drop.
Please checkout `verify_pretrained.py <https://raw.githubusercontent.com/dmlc/gluon-cv/master/scripts/classification/imagenet/verify_pretrained.py>`_ for imagenet inference,
`eval_ssd.py <https://raw.githubusercontent.com/dmlc/gluon-cv/master/scripts/detection/ssd/eval_ssd.py>`_ for SSD inference, and `test.py <https://raw.githubusercontent.com/dmlc/gluon-cv/master/scripts/segmentation/test.py>`_
for segmentation inference.
Performance
-----------
GluonCV supports some quantized classification models, detection models and segmentation models.
For throughput, the goal is maximum machine efficiency: inference requests are batched together and the results are returned in a single iteration. The bar chart shows that the fusion and quantization approach improves throughput by 2.68X to 7.24X for the selected models.
Below CPU performance is collected with dummy input from AWS EC2 C5.12xlarge instance with 24 physical cores.
.. figure:: https://user-images.githubusercontent.com/34727741/64021961-a9105280-cb67-11e9-989e-76a29e58530d.png
:alt: Gluon Quantization Performance
.. table::
:widths: 45 5 5 10 10 5 10 10
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| Model | Dataset | Batch Size | C5.12xlarge FP32 | C5.12xlarge INT8 | Speedup | FP32 Accuracy | INT8 Accuracy |
+=======================+==========+============+==================+==================+=========+=================+=================+
| ResNet50 V1 | ImageNet | 128 | 191.17 | 1384.4 | 7.24 | 77.21%/93.55% | 76.08%/93.04% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| MobileNet 1.0 | ImageNet | 128 | 565.21 | 3956.45 | 7.00 | 73.28%/91.22% | 71.94%/90.47% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| SSD-VGG 300* | VOC | 224 | 19.05 | 113.62 | 5.96 | 77.4 | 77.38 |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| SSD-VGG 512* | VOC | 224 | 6.78 | 37.62 | 5.55 | 78.41 | 78.38 |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| SSD-resnet50_v1 512* | VOC | 224 | 28.59 | 143.7 | 5.03 | 80.21 | 80.25 |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| SSD-mobilenet1.0 512* | VOC | 224 | 65.97 | 212.59 | 3.22 | 75.42 | 74.70 |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| FCN_resnet101 | VOC | 1 | 5.46 | 26.33 | 4.82 | 97.97% | 98.00% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| PSP_resnet101 | VOC | 1 | 3.96 | 10.63 | 2.68 | 98.46% | 98.45% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| Deeplab_resnet101 | VOC | 1 | 4.17 | 13.35 | 3.20 | 98.36% | 98.34% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| FCN_resnet101 | COCO | 1 | 5.19 | 26.22 | 5.05 | 91.28% | 90.96% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| PSP_resnet101 | COCO | 1 | 3.94 | 10.60 | 2.69 | 91.82% | 91.88% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
| Deeplab_resnet101 | COCO | 1 | 4.15 | 13.56 | 3.27 | 91.86% | 91.98% |
+-----------------------+----------+------------+------------------+------------------+---------+-----------------+-----------------+
Quantized SSD models are evaluated with ``nms_thresh=0.45``, ``nms_topk=200``. For segmentation models, the accuracy metric is pixAcc.
Demo usage for SSD
------------------
.. code:: bash
# set omp to use all physical cores of one socket
export KMP_AFFINITY=granularity=fine,noduplicates,compact,1,0
export CPUs=`lscpu | grep 'Core(s) per socket' | awk '{print $4}'`
export OMP_NUM_THREADS=${CPUs}
# with Pascal VOC validation dataset saved on disk
python eval_ssd.py --network=mobilenet1.0 --quantized --data-shape=512 --batch-size=224 --dataset=voc --benchmark
Usage:
::
SYNOPSIS
python eval_ssd.py [-h] [--network NETWORK] [--deploy]
[--model-prefix] [--quantized]
[--data-shape DATA_SHAPE] [--batch-size BATCH_SIZE]
[--benchmark BENCHMARK] [--num-iterations NUM_ITERATIONS]
[--dataset DATASET] [--num-workers NUM_WORKERS]
[--num-gpus NUM_GPUS] [--pretrained PRETRAINED]
[--save-prefix SAVE_PREFIX] [--calibration CALIBRATION]
[--num-calib-batches NUM_CALIB_BATCHES]
[--quantized-dtype {auto,int8,uint8}]
[--calib-mode CALIB_MODE]
OPTIONS
-h, --help show this help message and exit
--network NETWORK base network name
--deploy whether load static model for deployment
--model-prefix MODEL_PREFIX
load static model as hybridblock.
--quantized use int8 pretrained model
--data-shape DATA_SHAPE
input data shape
--batch-size BATCH_SIZE
eval mini-batch size
--benchmark BENCHMARK run dummy-data based benchmarking
--num-iterations NUM_ITERATIONS number of benchmarking iterations.
--dataset DATASET eval dataset.
--num-workers NUM_WORKERS, -j NUM_WORKERS
number of data workers
--num-gpus NUM_GPUS number of gpus to use.
--pretrained PRETRAINED
load weights from previously saved parameters.
--save-prefix SAVE_PREFIX
saving parameter prefix
--calibration quantize model
--num-calib-batches NUM_CALIB_BATCHES
number of batches for calibration
--quantized-dtype {auto,int8,uint8}
quantization destination data type for input data
--calib-mode CALIB_MODE
calibration mode used for generating calibration table
for the quantized symbol; supports 1. none: no
calibration will be used. The thresholds for
quantization will be calculated on the fly. This will
result in inference speed slowdown and loss of
accuracy in general. 2. naive: simply take min and max
values of layer outputs as thresholds for
quantization. In general, the inference accuracy
worsens with more examples used in calibration. It is
recommended to use `entropy` mode as it produces more
accurate inference results. 3. entropy: calculate KL
divergence of the fp32 output and quantized output for
optimal thresholds. This mode is expected to produce
the best inference accuracy of all three kinds of
quantized models if the calibration dataset is
representative enough of the inference dataset.
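For intuition, the ``naive`` mode amounts to taking the maximum absolute value observed during calibration as the clipping threshold and mapping it onto the int8 range. A simplified pure-Python sketch of the idea (not MXNet's actual implementation):

```python
def naive_int8_calibrate(calib_outputs):
    # "naive" calibration: the max-abs layer output becomes the threshold.
    threshold = max(abs(v) for v in calib_outputs)
    scale = threshold / 127.0
    return threshold, scale

def quantize_int8(values, scale):
    # Symmetric quantization onto the int8 range [-127, 127].
    return [max(-127, min(127, round(v / scale))) for v in values]

threshold, scale = naive_int8_calibrate([-1.0, 0.25, 1.0])
q = quantize_int8([-1.0, 0.25, 1.0], scale)
# the endpoints map to -127 and 127; real outliers would be clipped the same way
```

The ``entropy`` mode replaces the max-abs choice with a search over candidate thresholds minimizing the KL divergence between the fp32 and quantized output distributions.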
Calibration Tool
----------------
GluonCV also provides a calibration tool that lets users quantize their models to INT8 with their own datasets. Currently, the calibration tool only supports hybridized Gluon models. Below is an example of quantizing an SSD model.
.. code:: bash
# Calibration
python eval_ssd.py --network=mobilenet1.0 --data-shape=512 --batch-size=224 --dataset=voc --calibration --num-calib-batches=5 --calib-mode=naive
# INT8 Inference
python eval_ssd.py --network=mobilenet1.0 --data-shape=512 --batch-size=224 --deploy --model-prefix=./model/ssd_512_mobilenet1.0_voc-quantized-naive
The first command launches naive calibration, quantizing your ssd_mobilenet1.0 model to INT8 using a subset (5 batches) of the given dataset. Users can tune the INT8 accuracy by trying different calibration configurations. After calibration, the quantized model and parameters are saved to disk. The second command then loads the quantized model as a SymbolBlock for inference.
Users can also quantize their own hybridized Gluon models with the `quantize_net` API. Its parameters are described below.
API:
::
CODE
from mxnet.contrib.quantization import *
quantized_net = quantize_net(network, quantized_dtype='auto',
exclude_layers=None, exclude_layers_match=None,
calib_data=None, data_shapes=None,
calib_mode='naive', num_calib_examples=None,
ctx=mx.cpu(), logger=logging)
Parameters
network : Gluon HybridBlock
Defines the structure of a neural network for FP32 data types.
quantized_dtype : str
The quantized destination type for input data. Currently support 'int8'
, 'uint8' and 'auto'.
'auto' means automatically select output type according to calibration result.
Default value is 'int8'.
exclude_layers : list of strings
    A list of strings representing the names of the symbols that users want to exclude
    from being quantized.
exclude_layers_match : list of strings
    A list of strings wildcard matching the names of the symbols that users want to exclude
    from being quantized.
calib_data : mx.io.DataIter or gluon.DataLoader
    An iterable data loading object.
data_shapes : list
List of DataDesc, required if calib_data is not provided
calib_mode : str
If calib_mode='none', no calibration will be used and the thresholds for
requantization after the corresponding layers will be calculated at runtime by
calling min and max operators. The quantized models generated in this
mode are normally 10-20% slower than those with calibrations during inference.
If calib_mode='naive', the min and max values of the layer outputs from a calibration
dataset will be directly taken as the thresholds for quantization.
If calib_mode='entropy', the thresholds for quantization will be
derived such that the KL divergence between the distributions of FP32 layer outputs and
quantized layer outputs is minimized based upon the calibration dataset.
calib_layer : function
Given a layer's output name in string, return True or False for deciding whether to
calibrate this layer. If yes, the statistics of the layer's output will be collected;
otherwise, no information of the layer's output will be collected. If not provided,
all the layers' outputs that need requantization will be collected.
num_calib_examples : int or None
The maximum number of examples that user would like to use for calibration.
If not provided, the whole calibration dataset will be used.
ctx : Context
Defines the device that users want to run forward propagation on the calibration
dataset for collecting layer output statistics. Currently, only supports single context.
Currently only support CPU with MKL-DNN backend.
logger : Object
A logging object for printing information during the process of quantization.
Returns
network : Gluon SymbolBlock
Defines the structure of a neural network for INT8 data types.
"""
# ---- examples/section3.1.py (repo: rickyHong/dwave-qubo-example-repl, Apache-2.0) ----
"""
Section 3.1 The Number Partitioning Problem
Partition a set of numbers into two subsets such that the subset sums are as close to each other as possible.
"""
import copy
import dimod
from dwave.system.samplers import DWaveSampler
from dwave.system.composites import EmbeddingComposite
numbers = [25,7,13,31,42,17,21,10]
print(numbers)
print("#" * 80)
# First try to solve using classical computing / programming naively.
# Sort the list in ascending order and then split it into 2 list based on odd and even index position
numbers_copy = copy.deepcopy(numbers)
numbers_copy.sort()
list1 = numbers_copy[0::2]
list2 = numbers_copy[1::2]
print("Using classical computing / programming")
print("list1: {}, sum: {}".format(list1, sum(list1)))
print("list2: {}, sum: {}".format(list2, sum(list2)))
print("diff: abs(sum(list1) - sum(list2)) = {}".format(abs(sum(list1) - sum(list2))))
# Using classical computing / programming
# list1: [7, 13, 21, 31], sum: 72
# list2: [10, 17, 25, 42], sum: 94
# diff: abs(sum(list1) - sum(list2)) = 22
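# For a set this small, the exact optimum can be brute-forced over all 2**8
# subsets. This confirms a perfect split (difference 0) exists, so the greedy
# split above (difference 22) is suboptimal. Pure-Python check, no solver needed:

```python
from itertools import combinations

def best_partition(nums):
    # Exhaustive search for the subset minimizing |sum(subset) - sum(rest)|.
    total = sum(nums)
    best_diff, best_subset = None, None
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            diff = abs(total - 2 * sum(subset))
            if best_diff is None or diff < best_diff:
                best_diff, best_subset = diff, subset
    return best_diff, best_subset

diff, subset = best_partition([25, 7, 13, 31, 42, 17, 21, 10])
# diff == 0: e.g. [31, 42, 10] sums to 83, as does its complement.
```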
print("#" * 80)
def split_numbers_list(numbers, result):
list1 = []
list2 = []
for key, include_in_list in result.items():
index = key-1
if include_in_list:
list1.append(numbers[index])
else:
list2.append(numbers[index])
return list1, list2
c = sum(numbers)
c_square = c**2  # constant offset from expanding (2s - c)**2; not needed for the minimization
linear = {}
quadratic = {}
offset = 0.0
vartype = dimod.BINARY
for index, value in enumerate(numbers):
linear[index+1] = value * (value - c)
for index1, value1 in enumerate(numbers[:-1]):
for index2 in range(index1+1, len(numbers)):
value = value1 * numbers[index2]
idx = (index1+1, index2+1)
quadratic[idx] = quadratic[tuple(reversed(idx))] = value
# print(linear)
# print(quadratic)
# Expected Solution
# x=(0,0,0,1,1,0,0,1), ie list1=[31,42,10]; list2=[25,7,13,17,21]
# y=-6889
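# The expected energy can be checked by hand: with both ordered pairs (i, j)
# and (j, i) present in `quadratic`, the objective evaluates to s*(s - c) for a
# chosen subset with sum s. A standalone sanity check (no dimod required):

```python
numbers = [25, 7, 13, 31, 42, 17, 21, 10]
c = sum(numbers)  # 166

def qubo_energy(x):
    # Diagonal terms a_i * (a_i - c); off-diagonal terms a_i * a_j counted
    # once per ordered pair, mirroring the linear/quadratic dicts above.
    energy = sum(a * (a - c) for a, xi in zip(numbers, x) if xi)
    for i, (ai, xi) in enumerate(zip(numbers, x)):
        for j, (aj, xj) in enumerate(zip(numbers, x)):
            if i != j and xi and xj:
                energy += ai * aj
    return energy

x = (0, 0, 0, 1, 1, 0, 0, 1)  # selects [31, 42, 10]
s = sum(a for a, xi in zip(numbers, x) if xi)  # 83
assert qubo_energy(x) == s * (s - c) == -6889
```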
bqm = dimod.BinaryQuadraticModel(
linear,
quadratic,
offset,
vartype)
sampler = dimod.ExactSolver()
sample_set = sampler.sample(bqm)
sample_set = sample_set.truncate(5)
print("Using ExactSolver()")
print(sample_set)
for sample in sample_set.samples():
list1, list2 = split_numbers_list(numbers, sample)
    print("list1: {}, sum: {}, list2: {}, sum: {}".format(list1, sum(list1), list2, sum(list2)))
# Print the first 5 results, notice there are multiple solutions that achieve
# the right answer.
#
# Using ExactSolver()
# 1 2 3 4 5 6 7 8 energy num_oc.
# 0 0 0 0 1 1 0 0 1 -6889.0 1
# 1 0 1 1 0 1 0 1 0 -6889.0 1
# 2 1 1 1 0 0 1 1 0 -6889.0 1
# 3 1 0 0 1 0 1 0 1 -6889.0 1
# 4 1 1 0 1 0 0 1 0 -6888.0 1
# ['BINARY', 5 rows, 5 samples, 8 variables]
# list1: [31, 42, 10], sum: 83, list2: [25, 7, 13, 17, 21], sum: 83
# list1: [7, 13, 42, 21], sum: 83, list2: [25, 31, 17, 10], sum: 83
# list1: [25, 7, 13, 17, 21], sum: 83, list2: [31, 42, 10], sum: 83
# list1: [25, 31, 17, 10], sum: 83, list2: [7, 13, 42, 21], sum: 83
# list1: [25, 7, 31, 21], sum: 84, list2: [13, 42, 17, 10], sum: 82
print("#" * 80)
sampler = dimod.SimulatedAnnealingSampler()
sample_set = sampler.sample(bqm)
sample_set = sample_set.truncate(5)
print("Using SimulatedAnnealingSampler()")
print(sample_set)
for sample in sample_set.samples():
list1, list2 = split_numbers_list(numbers, sample)
    print("list1: {}, sum: {}, list2: {}, sum: {}".format(list1, sum(list1), list2, sum(list2)))
# Using SimulatedAnnealingSampler()
# 1 2 3 4 5 6 7 8 energy num_oc.
# 0 0 0 0 1 1 0 0 1 -6889.0 1
# 1 1 1 1 0 0 1 1 0 -6889.0 1
# 2 1 1 0 0 1 0 0 1 -6888.0 1
# 3 0 0 0 0 1 1 1 0 -6880.0 1
# 4 0 0 0 0 1 1 1 0 -6880.0 1
# ['BINARY', 5 rows, 5 samples, 8 variables]
# list1: [31, 42, 10], sum: 83, list2: [25, 7, 13, 17, 21], sum: 83
# list1: [25, 7, 13, 17, 21], sum: 83, list2: [31, 42, 10], sum: 83
# list1: [25, 7, 42, 10], sum: 84, list2: [13, 31, 17, 21], sum: 82
# list1: [42, 17, 21], sum: 80, list2: [25, 7, 13, 31, 10], sum: 86
# list1: [42, 17, 21], sum: 80, list2: [25, 7, 13, 31, 10], sum: 86
print("#" * 80)
sampler = EmbeddingComposite(DWaveSampler())
sample_set = sampler.sample(bqm, num_reads=10)
print("Using DWaveSampler()")
print(sample_set)
for sample in sample_set.samples():
list1, list2 = split_numbers_list(numbers, sample)
    print("list1: {}, sum: {}, list2: {}, sum: {}".format(list1, sum(list1), list2, sum(list2)))
# Using DWaveSampler()
# 1 2 3 4 5 6 7 8 energy num_oc. chain_.
# 0 1 1 1 0 0 1 1 0 -6889.0 2 0.75
# 6 0 1 0 0 1 1 1 0 -6873.0 1 0.875
# 1 0 0 1 1 1 0 0 1 -6720.0 1 0.875
# 5 0 1 1 0 1 1 1 0 -6600.0 3 1.0
# 3 0 1 1 0 0 1 1 0 -6264.0 1 0.875
# 4 1 1 0 1 1 0 0 1 -5865.0 1 0.875
# 2 1 1 1 0 1 1 1 0 -5125.0 1 0.875
# ['BINARY', 7 rows, 10 samples, 8 variables]
# list1: [25, 7, 13, 17, 21], sum: 83, list2: [31, 42, 10], sum: 83
# list1: [7, 42, 17, 21], sum: 87, list2: [25, 13, 31, 10], sum: 79
# list1: [13, 31, 42, 10], sum: 96, list2: [25, 7, 17, 21], sum: 70
# list1: [7, 13, 42, 17, 21], sum: 100, list2: [25, 31, 10], sum: 66
# list1: [7, 13, 17, 21], sum: 58, list2: [25, 31, 42, 10], sum: 108
# list1: [25, 7, 31, 42, 10], sum: 115, list2: [13, 17, 21], sum: 51
# list1: [25, 7, 13, 42, 17, 21], sum: 125, list2: [31, 10], sum: 41
# ---- server/src/tests/samples/protocol2.py (repo: jhutchings1/pyright, MIT) ----
# This sample tests the type checker's handling of
# generic protocols with invariant, constrained, and contravariant
# type arguments.
from typing import TypeVar, Protocol
T = TypeVar("T")
StrLike = TypeVar("StrLike", str, bytes)
T_contra = TypeVar("T_contra", contravariant=True)
class Writer(Protocol[T_contra]):
def write(self, data: T_contra) -> None:
...
class WriteFile:
def write(self, s: bytes) -> None:
pass
def f(writer: Writer[bytes]):
pass
def g(writer: Writer[T]):
pass
def h(writer: Writer[StrLike]):
pass
w = WriteFile()
f(w)
g(w)
h(w)
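# The contravariance exercised above can also be observed structurally at
# runtime with `runtime_checkable` (a separate illustration with hypothetical
# names; note that isinstance only checks method presence, not signatures or
# variance, which pyright verifies statically):

```python
from typing import Protocol, TypeVar, runtime_checkable

T_rt = TypeVar("T_rt", contravariant=True)

@runtime_checkable
class RTWriter(Protocol[T_rt]):
    def write(self, data: T_rt) -> None:
        ...

class BytesFile:
    def write(self, s: bytes) -> None:
        pass

# BytesFile never inherits from RTWriter, yet matches it structurally.
assert isinstance(BytesFile(), RTWriter)
assert not isinstance(object(), RTWriter)
```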
# ---- tests/test_nll.py (repo: ndem0/ATHENA, MIT) ----
from unittest import TestCase
import numpy as np
from athena import NonlinearLevelSet, ForwardNet, BackwardNet, Normalizer
import torch
import os
from contextlib import contextmanager
import matplotlib.pyplot as plt
@contextmanager
def assert_plot_figures_added():
"""
Assert that the number of figures is higher than
when you started the test
"""
num_figures_before = plt.gcf().number
yield
num_figures_after = plt.gcf().number
assert num_figures_before < num_figures_after
def read_data():
data = np.loadtxt('tests/data/naca0012.txt', skiprows=1, delimiter=',')
real_inputs = data[:, 1:19]
n_params = real_inputs.shape[1]
lb = -0.01 * np.ones(n_params)
ub = 0.01 * np.ones(n_params)
normalizer = Normalizer(lb=lb, ub=ub)
# inputs in [-1, 1]
inputs = normalizer.fit_transform(real_inputs)
lift = data[:, 19]
# gradients with respect to normalized inputs
grad_lift = data[:, 21:39]
return inputs, lift, grad_lift
inputs, lift, grad_lift = read_data()
inputs_torch = torch.as_tensor(inputs, dtype=torch.double)
grad_torch = torch.as_tensor(grad_lift, dtype=torch.double)
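# The Normalizer above maps each parameter from [lb, ub] onto [-1, 1], which
# amounts to the affine map x -> (2x - lb - ub) / (ub - lb) applied elementwise.
# A pure-Python sketch of that assumption (not athena's implementation):

```python
def fit_transform(x, lb, ub):
    # Elementwise affine map sending lb -> -1 and ub -> +1.
    return [(2.0 * xi - l - u) / (u - l) for xi, l, u in zip(x, lb, ub)]

row = fit_transform([-0.01, 0.0, 0.005], [-0.01] * 3, [0.01] * 3)
# row is approximately [-1.0, 0.0, 0.5]
```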
class TestNonlinearLevelSet(TestCase):
def test_init_n_layers(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.n_layers, 2)
def test_init_active_dim(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.active_dim, 1)
def test_init_lr(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.lr, 0.1)
def test_init_epochs(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.epochs, 100)
def test_init_dh(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.dh, 0.25)
def test_init_forward(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertIsNone(nll.forward)
def test_init_backward(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertIsNone(nll.backward)
def test_init_loss_vec(self):
nll = NonlinearLevelSet(n_layers=2,
active_dim=1,
lr=0.1,
epochs=100,
dh=0.25)
self.assertEqual(nll.loss_vec, [])
def test_train_01(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
self.assertIsInstance(nll.forward, ForwardNet)
def test_train_02(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
self.assertIsInstance(nll.backward, BackwardNet)
def test_train_03(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
self.assertIs(len(nll.loss_vec), 1)
def test_train_04(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
with self.assertRaises(ValueError):
nll.train(inputs=inputs_torch,
gradients=grad_torch,
interactive=True)
def test_train_05(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
with assert_plot_figures_added():
nll.train(inputs=inputs_torch,
gradients=grad_torch,
outputs=lift,
interactive=True)
def test_forward_n_params(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
self.assertEqual(nll.forward.n_params, 9)
def test_backward_n_params(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
self.assertEqual(nll.backward.n_params, 9)
def test_plot_sufficient_summary_01(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
with assert_plot_figures_added():
nll.plot_sufficient_summary(inputs=inputs_torch, outputs=lift)
def test_plot_sufficient_summary_02(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=2, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
with self.assertRaises(ValueError):
nll.plot_sufficient_summary(inputs=inputs_torch, outputs=lift)
def test_plot_loss(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=2)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
with assert_plot_figures_added():
nll.plot_loss()
def test_save_forward(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
outfilename = 'tests/data/saved_forward.pth'
nll.save_forward(outfilename)
self.assertTrue(os.path.exists(outfilename))
self.addCleanup(os.remove, outfilename)
def test_load_forward(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.load_forward(infile='tests/data/forward_test.pth', n_params=18)
self.assertIsInstance(nll.forward, ForwardNet)
def test_save_backward(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.train(inputs=inputs_torch, gradients=grad_torch, interactive=False)
outfilename = 'tests/data/saved_backward.pth'
nll.save_backward(outfilename)
self.assertTrue(os.path.exists(outfilename))
self.addCleanup(os.remove, outfilename)
def test_load_backward(self):
nll = NonlinearLevelSet(n_layers=2, active_dim=1, lr=0.02, epochs=1)
nll.load_backward(infile='tests/data/backward_test.pth', n_params=18)
self.assertIsInstance(nll.backward, BackwardNet)
class TestForwardNet(TestCase):
def test_init_n_params(self):
nll = ForwardNet(n_params=6, n_layers=2, dh=0.25, active_dim=1)
self.assertEqual(nll.n_params, 3)
def test_init_n_layers(self):
nll = ForwardNet(n_params=6, n_layers=2, dh=0.25, active_dim=1)
self.assertEqual(nll.n_layers, 2)
def test_init_dh(self):
nll = ForwardNet(n_params=6, n_layers=2, dh=0.20, active_dim=1)
self.assertEqual(nll.dh, 0.20)
def test_init_omega(self):
nll = ForwardNet(n_params=6, n_layers=2, dh=0.25, active_dim=1)
self.assertEqual(nll.omega, slice(1))
class TestBackwardNet(TestCase):
def test_init_n_params(self):
nll = BackwardNet(n_params=6, n_layers=2, dh=0.25)
self.assertEqual(nll.n_params, 3)
def test_init_n_layers(self):
nll = BackwardNet(n_params=6, n_layers=2, dh=0.25)
self.assertEqual(nll.n_layers, 2)
def test_init_dh(self):
nll = BackwardNet(n_params=6, n_layers=2, dh=0.20)
self.assertEqual(nll.dh, 0.20)
# hide-message.py (AkashSDas/Mini-Projects, MIT License)
import random
import string
def generate_random_characters():
alphabets = string.ascii_letters
numbers = string.digits
special_characters = string.punctuation
space_character = " "
characters = f"{space_character}{alphabets}{numbers}{special_characters}"
    return random.choice(characters)
def get_encrypt_message(message, separate):
encrypt_message = ''
count = 0
for i in range(len(message)*separate+separate):
if i % separate == 0 and i >= separate:
encrypt_message += message[count]
count += 1
else:
encrypt_message += generate_random_characters()
return encrypt_message
def get_decrypt_message(encrypt_message, separate):
decrypt_message = ''
for i in range(len(encrypt_message)):
if i % separate == 0 and i >= separate:
decrypt_message += encrypt_message[i]
return decrypt_message
message = 'I Love You'
separate = 1000
encrypt_message = get_encrypt_message(message, separate)
print(encrypt_message)
print()
decrypt_message = get_decrypt_message(encrypt_message, separate)
print(decrypt_message)
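A self-contained sketch of the same hide/reveal scheme with a round-trip check (the names `hide` and `reveal` are illustrative, not part of the script above):

```python
import random
import string


def hide(message, step):
    # Place each real character at positions that are positive multiples of
    # `step`; every other position is padded with a random printable character.
    pool = " " + string.ascii_letters + string.digits + string.punctuation
    out = []
    for i in range(len(message) * step + step):
        if i % step == 0 and i >= step:
            out.append(message[i // step - 1])
        else:
            out.append(random.choice(pool))
    return "".join(out)


def reveal(hidden, step):
    # Recover the message by reading only the positions the sender used.
    return "".join(
        hidden[i] for i in range(len(hidden)) if i % step == 0 and i >= step
    )


assert reveal(hide("I Love You", 7), 7) == "I Love You"
```

The padded string is `len(message) * step + step` characters long, so larger `step` values bury the message deeper at the cost of a longer output.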
# test/pytest/test_answers_and_docs.py (dmulyalin/ttp, MIT License)
import sys
sys.path.insert(0, "../..")
import pprint
import pytest
import logging
logging.basicConfig(level=logging.DEBUG)
from ttp import ttp
def test_answer_1():
"""https://stackoverflow.com/questions/63522291/parsing-blocks-of-text-within-a-file-into-objects"""
data = """
#*Approximate Distance Oracles with Improved Query Time.
#@Christian Wulff-Nilsen
#t2015
#cEncyclopedia of Algorithms
#index555036b37cea80f954149ffc
#*Subset Sum Algorithm for Bin Packing.
#@Julián Mestre
#t2015
#cEncyclopedia of Algorithms
#index555036b37cea80f954149ffd
"""
template = """
#*{{ info | ORPHRASE }}
#@{{ author | ORPHRASE }}
#t{{ year }}
#c{{ title | ORPHRASE }}
#index{{ index }}
"""
parser = ttp(data, template)
parser.parse()
res = parser.result(structure="flat_list")
pprint.pprint(res)
assert res == [
{
"author": "Christian Wulff-Nilsen",
"index": "555036b37cea80f954149ffc",
"info": "Approximate Distance Oracles with Improved Query Time.",
"title": "Encyclopedia of Algorithms",
"year": "2015",
},
{
"author": "Julián Mestre",
"index": "555036b37cea80f954149ffd",
"info": "Subset Sum Algorithm for Bin Packing.",
"title": "Encyclopedia of Algorithms",
"year": "2015",
},
]
# test_answer_1()
def test_answer_2():
"""https://stackoverflow.com/questions/63499479/extract-value-from-text-string-using-format-string-in-python"""
data = """
name=username1, age=1001
name=username2, age=1002
name=username3, age=1003
"""
template = "name={{ name }}, age={{ age }}"
parser = ttp(data, template)
parser.parse()
res = parser.result(structure="flat_list")
# pprint.pprint(res)
assert res == [
{"age": "1001", "name": "username1"},
{"age": "1002", "name": "username2"},
{"age": "1003", "name": "username3"},
]
# test_answer_2()
def test_issue_20_answer():
data_to_parse = """
(*, 239.100.100.100)
LISP0.4200, (192.2.101.65, 232.0.3.1), Forward/Sparse, 1d18h/stopped
LISP0.4201, (192.2.101.70, 232.0.3.1), Forward/Sparse, 2d05h/stopped
(192.2.31.3, 239.100.100.100), 6d20h/00:02:23, flags: FT
Incoming interface: Vlan1029, RPF nbr 0.0.0.0
Outgoing interface list:
LISP0.4100, (192.2.101.70, 232.0.3.1), Forward/Sparse, 1d18h/stopped
"""
show_mcast1 = """
<template name="mcast" results="per_template">
<group name="mcast_entries.{{ overlay_src }}">
({{ overlay_src | _start_ | replace("*", "'*'")}}, {{ overlay_grp | IP }})
({{ overlay_src | _start_ | IP }}, {{ overlay_grp | IP }}), {{ entry_uptime }}/{{ entry_state_or_timer }}, flags: {{ entry_flags }}
Incoming interface: {{ incoming_intf }}, RPF nbr {{ rpf_neighbor }}
<group name="oil_entries*">
{{ outgoing_intf }}, ({{ underlay_src | IP }}, {{ underlay_grp | IP }}), Forward/Sparse, {{ oil_uptime }}/{{ oil_state_or_timer}}
</group>
</group>
</template>
"""
parser = ttp(template=show_mcast1)
parser.add_input(data_to_parse, template_name="mcast")
parser.parse()
res = parser.result(structure="dictionary")
# pprint.pprint(res, width=100)
assert res == {
"mcast": {
"mcast_entries": {
"'*'": {
"oil_entries": [
{
"oil_state_or_timer": "stopped",
"oil_uptime": "1d18h",
"outgoing_intf": "LISP0.4200",
"underlay_grp": "232.0.3.1",
"underlay_src": "192.2.101.65",
},
{
"oil_state_or_timer": "stopped",
"oil_uptime": "2d05h",
"outgoing_intf": "LISP0.4201",
"underlay_grp": "232.0.3.1",
"underlay_src": "192.2.101.70",
},
],
"overlay_grp": "239.100.100.100",
},
"192.2.31.3": {
"entry_flags": "FT",
"entry_state_or_timer": "00:02:23",
"entry_uptime": "6d20h",
"incoming_intf": "Vlan1029",
"oil_entries": [
{
"oil_state_or_timer": "stopped",
"oil_uptime": "1d18h",
"outgoing_intf": "LISP0.4100",
"underlay_grp": "232.0.3.1",
"underlay_src": "192.2.101.70",
}
],
"overlay_grp": "239.100.100.100",
"rpf_neighbor": "0.0.0.0",
},
}
}
}
# test_issue_20_answer()
def test_answer_3():
"""
Fixed bug with results forming - when have two _start_ matches, but
one of them is False, TTP was selecting first match without checking
if its False, updated decision logic to do that check.
"""
data = """
/c/slb/virt 12
dis
ipver v4
vip 1.1.1.1
rtsrcmac ena
vname "my name"
/c/slb/virt 12/service 443 https
group 15
rport 443
pbind clientip
dbind forceproxy
/c/slb/virt 12/service 443 https/http
xforward ena
httpmod hsts_insert
/c/slb/virt 12/service 443 https/ssl
srvrcert cert certname
sslpol ssl-Policy
/c/slb/virt 12/service 80 http
group 15
rport 80
pbind clientip
dbind forceproxy
/c/slb/virt 12/service 80 http/http
xforward ena
/c/slb/virt 14
dis
ipver v4
vip 1.1.4.4
rtsrcmac ena
vname "my name2"
"""
template = """
<template name="VIP_cfg" results="per_template">
<group name="{{ vip }}">
/c/slb/virt {{ virt_seq | DIGIT }}
dis {{ config_state | set("dis") }}
ipver {{ ipver}}
vip {{ vip }}
rtsrcmac {{ rtsrcmac }}
vname "{{ vip_name | ORPHRASE }}"
<group name="services.{{ port }}.{{ proto }}">
/c/slb/virt 12/service {{ port | DIGIT }} {{ proto | exclude(ssl) }}
group {{group_seq }}
rport {{ real_port }}
pbind {{ pbind }}
dbind {{ dbind }}
xforward {{ xforward }}
httpmod {{ httpmod }}
</group>
<group name="ssl_profile">
/c/slb/virt {{ virt_seq }}/service 443 https/ssl
srvrcert cert {{ ssl_server_cert }}
sslpol {{ ssl_profile }}
{{ ssl | set("https/ssl") }}
</group>
</group>
</template>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result(structure="dictionary")
# pprint.pprint(res, width=50)
assert res == {
"VIP_cfg": {
"1.1.1.1": {
"config_state": "dis",
"ipver": "v4",
"rtsrcmac": "ena",
"services": {
"443": {
"https": {
"dbind": "forceproxy",
"group_seq": "15",
"pbind": "clientip",
"real_port": "443",
},
"https/http": {"httpmod": "hsts_insert", "xforward": "ena"},
},
"80": {
"http": {
"dbind": "forceproxy",
"group_seq": "15",
"pbind": "clientip",
"real_port": "80",
},
"http/http": {"xforward": "ena"},
},
},
"ssl_profile": {
"ssl": "https/ssl",
"ssl_profile": "ssl-Policy",
"ssl_server_cert": "certname",
"virt_seq": "12",
},
"vip_name": "my name",
"virt_seq": "12",
},
"1.1.4.4": {
"config_state": "dis",
"ipver": "v4",
"rtsrcmac": "ena",
"vip_name": "my name2",
"virt_seq": "14",
},
}
}
# test_answer_3()
def test_answer_4():
data = """
/c/slb/virt 12
dis
ipver v4
vip 1.1.1.1
rtsrcmac ena
vname "my name"
/c/slb/virt 12/service 443 https
group 15
rport 443
pbind clientip
dbind forceproxy
/c/slb/virt 12/service 443 https/http
xforward ena
httpmod hsts_insert
/c/slb/virt 12/service 443 https/ssl
srvrcert cert certname
sslpol ssl-Policy
/c/slb/virt 12/service 80 http
group 15
rport 80
pbind clientip
dbind forceproxy
/c/slb/virt 12/service 80 http/http
xforward ena
/c/slb/virt 14
dis
ipver v4
vip 1.1.4.4
rtsrcmac ena
vname "my name2"
"""
template = """
<template name="VIP_cfg" results="per_template">
<group name="{{ vip }}">
/c/slb/virt {{ virt_seq | DIGIT }}
dis {{ config_state | set("dis") }}
ipver {{ ipver}}
vip {{ vip }}
rtsrcmac {{ rtsrcmac }}
vname "{{ vip_name | ORPHRASE }}"
<group name="services.{{ port }}" contains="dbind, pbind">
/c/slb/virt 12/service {{ port | DIGIT }} {{ proto | exclude(ssl) }}
group {{group_seq }}
rport {{ real_port }}
pbind {{ pbind }}
dbind {{ dbind }}
xforward {{ xforward }}
httpmod {{ httpmod }}
</group>
<group name="ssl_profile">
/c/slb/virt {{ virt_seq }}/service 443 https/ssl
srvrcert cert {{ ssl_server_cert }}
sslpol {{ ssl_profile }}
{{ ssl | set("https/ssl") }}
</group>
</group>
</template>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result(structure="dictionary")
# pprint.pprint(res, width=50)
assert res == {
"VIP_cfg": {
"1.1.1.1": {
"config_state": "dis",
"ipver": "v4",
"rtsrcmac": "ena",
"services": {
"443": {
"dbind": "forceproxy",
"group_seq": "15",
"pbind": "clientip",
"proto": "https",
"real_port": "443",
},
"80": {
"dbind": "forceproxy",
"group_seq": "15",
"pbind": "clientip",
"proto": "http",
"real_port": "80",
},
},
"ssl_profile": {
"ssl": "https/ssl",
"ssl_profile": "ssl-Policy",
"ssl_server_cert": "certname",
"virt_seq": "12",
},
"vip_name": "my name",
"virt_seq": "12",
},
"1.1.4.4": {
"config_state": "dis",
"ipver": "v4",
"rtsrcmac": "ena",
"vip_name": "my name2",
"virt_seq": "14",
},
}
}
# test_answer_4()
def test_issue_20_answer_2():
data_to_parse = """
(*, 239.100.100.101)
LISP0.4200, (192.2.101.65, 232.0.3.1), Forward/Sparse, 1d18h/stopped
LISP0.4201, (192.2.101.70, 232.0.3.1), Forward/Sparse, 2d05h/stopped
(192.2.31.3, 239.100.100.100), 2d05h/00:01:19, flags: FT
Incoming interface: Vlan1029, RPF nbr 0.0.0.0
Outgoing interface list:
LISP0.4100, (192.2.101.70, 232.0.3.1), Forward/Sparse, 2d05h/stopped
LISP0.4101, (192.2.101.70, 232.0.3.1), Forward/Sparse, 2d05h/stopped
(*, 239.100.100.100), 6d20h/00:03:28, RP 192.2.199.1, flags: S
Incoming interface: Null, RPF nbr 0.0.0.0
Outgoing interface list:
Vlan3014, Forward/Sparse, 1d18h/00:03:28
LISP0.4100, (192.2.101.65, 232.0.3.1), Forward/Sparse, 1d18h/stopped
"""
show_mcast1 = """
<template name="mcast" results="per_template">
<group name="mcast_entries.{{ overlay_src }}">
({{ overlay_src | _start_ | replace("*", "'*'") }}, {{ overlay_grp | IP }})
({{ overlay_src | _start_ | IP }}, {{ overlay_grp | IP }}), {{ entry_uptime }}/{{ entry_state_or_timer }}, flags: {{ entry_flags }}
({{ overlay_src | _start_ | replace("*", "'*'") }}, {{ overlay_grp | IP }}), {{ entry_uptime }}/{{ entry_state_or_timer }}, RP {{ rp }}, flags: {{ entry_flags }}
Incoming interface: {{ incoming_intf }}, RPF nbr {{ rpf_neighbor }}
<group name="oil_entries*">
{{ outgoing_intf }}, Forward/Sparse, {{ oil_uptime }}/{{ oil_state_or_timer}}
{{ outgoing_intf }}, ({{ underlay_src | IP }}, {{ underlay_grp | IP }}), Forward/Sparse, {{ oil_uptime }}/{{ oil_state_or_timer}}
</group>
</group>
</template>
"""
parser = ttp(template=show_mcast1)
parser.add_input(data_to_parse, template_name="mcast")
parser.parse()
res = parser.result(structure="dictionary")
# pprint.pprint(res, width=100)
assert res == {
"mcast": {
"mcast_entries": {
"'*'": [
{"overlay_grp": "239.100.100.101"},
{
"entry_flags": "S",
"entry_state_or_timer": "00:03:28",
"entry_uptime": "6d20h",
"incoming_intf": "Null",
"oil_entries": [
{
"oil_state_or_timer": "00:03:28",
"oil_uptime": "1d18h",
"outgoing_intf": "Vlan3014",
"underlay_grp": "232.0.3.1",
"underlay_src": "192.2.101.65",
}
],
"overlay_grp": "239.100.100.100",
"rp": "192.2.199.1",
"rpf_neighbor": "0.0.0.0",
},
],
"192.2.31.3": {
"entry_flags": "FT",
"entry_state_or_timer": "00:01:19",
"entry_uptime": "2d05h",
"incoming_intf": "Vlan1029",
"overlay_grp": "239.100.100.100",
"rpf_neighbor": "0.0.0.0",
},
}
}
}
# test_issue_20_answer_2()
def test_docs_ttp_dictionary_usage_example():
template = """
<input load="text">
interface Lo0
ip address 124.171.238.50/29
!
interface Lo1
ip address 1.1.1.1/30
</input>
<group macro="add_last_host">
interface {{ interface }}
ip address {{ ip }}
</group>
<macro>
def add_last_host(data):
ip_obj, _ = _ttp_["match"]["to_ip"](data["ip"])
all_ips = list(ip_obj.network.hosts())
data["last_host"] = str(all_ips[-1])
return data
</macro>
"""
parser = ttp(template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
[
{
"interface": "Lo0",
"ip": "124.171.238.50/29",
"last_host": "124.171.238.54",
},
{"interface": "Lo1", "ip": "1.1.1.1/30", "last_host": "1.1.1.2"},
]
]
]
# test_docs_ttp_dictionary_usage_example()
def test_github_issue_21_answer():
data_to_parse = """
R1#sh ip nbar protocol-discovery protocol
GigabitEthernet1
Last clearing of "show ip nbar protocol-discovery" counters 00:13:45
Input Output
----- ------
Protocol Packet Count Packet Count
Byte Count Byte Count
5min Bit Rate (bps) 5min Bit Rate (bps)
5min Max Bit Rate (bps) 5min Max Bit Rate (bps)
---------------------------- ------------------------ ------------------------
ssh 191 134
24805 22072
2000 1000
1999 1001
unknown 172 503
39713 31378
0 0
3000 0
ping 144 144
14592 14592
0 0
1000 1000
dns 107 0
21149 0
0 0
2000 0
vrrp 0 738
0 39852
0 0
0 0
ldp 174 175
13224 13300
0 0
0 0
ospf 86 87
9460 9570
0 0
0 0
Total 874 1781
122943 130764
2000 1000
8000 2000
"""
show_nbar = """
<template name="nbar" results="per_template">
<vars>C1 = "DIGIT | to_int | to_list | joinmatches"</vars>
<group name="{{ interface }}">
{{ interface | re('Gig.+') | re('Ten.+') }}
<group name="{{ protocol }}" macro="map_to_keys">
{{ protocol }} {{ in | chain(C1) }} {{ out | chain(C1) }}
{{ ignore(r"\\s+") }} {{ in | chain(C1) }} {{ out | chain(C1) }}
</group>
</group>
<macro>
def map_to_keys(data):
# uncomment to see data
# print(data)
inp_values = data.pop("in")
out_values = data.pop("out")
inp_keys = ["IN Packet Count", "IN Byte Count", "IN 5min Bit Rate (bps)", "IN 5min Max Bit Rate (bps)"]
out_keys = ["OUT Packet Count", "OUT Byte Count", "OUT 5min Bit Rate (bps)", "OUT 5min Max Bit Rate (bps)"]
data.update(dict(zip(inp_keys, inp_values)))
data.update(dict(zip(out_keys, out_values)))
return data
</macro>
</template>
"""
parser = ttp(template=show_nbar)
parser.add_input(data_to_parse, template_name="nbar")
parser.parse()
res = parser.result(structure="dictionary")
pprint.pprint(res, width=100)
assert res == {
"nbar": {
"GigabitEthernet1 ": {
"Total": {
"IN 5min Bit Rate (bps)": 2000,
"IN 5min Max Bit Rate (bps)": 8000,
"IN Byte Count": 122943,
"IN Packet Count": 874,
"OUT 5min Bit Rate (bps)": 1000,
"OUT 5min Max Bit Rate (bps)": 2000,
"OUT Byte Count": 130764,
"OUT Packet Count": 1781,
},
"dns": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 2000,
"IN Byte Count": 21149,
"IN Packet Count": 107,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 0,
"OUT Byte Count": 0,
"OUT Packet Count": 0,
},
"ldp": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 0,
"IN Byte Count": 13224,
"IN Packet Count": 174,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 0,
"OUT Byte Count": 13300,
"OUT Packet Count": 175,
},
"ospf": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 0,
"IN Byte Count": 9460,
"IN Packet Count": 86,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 0,
"OUT Byte Count": 9570,
"OUT Packet Count": 87,
},
"ping": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 1000,
"IN Byte Count": 14592,
"IN Packet Count": 144,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 1000,
"OUT Byte Count": 14592,
"OUT Packet Count": 144,
},
"ssh": {
"IN 5min Bit Rate (bps)": 2000,
"IN 5min Max Bit Rate (bps)": 1999,
"IN Byte Count": 24805,
"IN Packet Count": 191,
"OUT 5min Bit Rate (bps)": 1000,
"OUT 5min Max Bit Rate (bps)": 1001,
"OUT Byte Count": 22072,
"OUT Packet Count": 134,
},
"unknown": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 3000,
"IN Byte Count": 39713,
"IN Packet Count": 172,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 0,
"OUT Byte Count": 31378,
"OUT Packet Count": 503,
},
"vrrp": {
"IN 5min Bit Rate (bps)": 0,
"IN 5min Max Bit Rate (bps)": 0,
"IN Byte Count": 0,
"IN Packet Count": 0,
"OUT 5min Bit Rate (bps)": 0,
"OUT 5min Max Bit Rate (bps)": 0,
"OUT Byte Count": 39852,
"OUT Packet Count": 738,
},
}
}
}
# test_github_issue_21_answer()
def test_github_issue_22():
data = """
interface Loopback0
description Fabric Node Router ID
ip address 192.2.101.70 255.255.255.255
ip pim sparse-mode
ip router isis
clns mtu 1400
end
interface Loopback0
description Fabric Node Router ID
ip address 192.2.101.71 255.255.255.255
ip pim sparse-mode
ip router isis
clns mtu 1400
end
"""
template = """{{ ignore(r"\\s+") }}ip address {{ ip_address }} 255.255.255.255"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [[[{"ip_address": "192.2.101.70"}, {"ip_address": "192.2.101.71"}]]]
# test_github_issue_22()
def test_github_issue_24():
data = """
19: IP4 1.1.1.1, 00:03:b2:78:04:13, vname portal, NO SERVICES UP
Virtual Services:
http: rport http, group 11, health http (HTTP), pbind clientip
Real Servers:
22: 10.10.10.10, web1, group ena, health (runtime HTTP), 0 ms, FAILED
Reason: N/A
23: 10.11.11.11, web2, group ena, health (runtime HTTP), 0 ms, FAILED
Reason: N/A
https: rport https, group 12, health tcp (TCP), pbind clientip
Real Servers:
22: 10.10.10.10, web1, group ena, health (runtime TCP), 0 ms, FAILED
Reason: N/A
23: 10.11.11.11, web2, group ena, health (runtime TCP), 0 ms, FAILED
Reason: N/A
"""
template = """
<template name="VIP_cfg" results="per_template">
<group name="{{ vs_instance }}" default="">
{{ vs_instance }}: IP4 {{ vs_ip }},{{ ignore(".+") }}
<group name="services*" default="">
{{ vs_service }}: rport {{ rport }},{{ ignore(".+") }}
<group name="pool*" default="">
{{ node_id }}: {{ node_ip }},{{ ignore(".+") }}
Reason: {{ reason }}
</group>
</group>
</group>
</template>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result(structure="dictionary")
# pprint.pprint(res, width=100)
assert res == {
"VIP_cfg": {
"19": {
"services": [
{
"pool": [
{
"node_id": "22",
"node_ip": "10.10.10.10",
"reason": "N/A",
},
{
"node_id": "23",
"node_ip": "10.11.11.11",
"reason": "N/A",
},
],
"rport": "http",
"vs_service": "http",
},
{
"pool": [
{
"node_id": "22",
"node_ip": "10.10.10.10",
"reason": "N/A",
},
{
"node_id": "23",
"node_ip": "10.11.11.11",
"reason": "N/A",
},
],
"rport": "https",
"vs_service": "https",
},
],
"vs_ip": "1.1.1.1",
}
}
}
# test_github_issue_24()
def test_reddit_answer_1():
"""
https://www.reddit.com/r/networking/comments/j106ot/export_custom_lists_from_the_config_aruba_switch/
Hit a bug while was doing this template - join action overridden by ignore indicator add action
"""
data = """
SWITCH# show vlan port 2/11 detail
Status and Counters - VLAN Information - for ports 2/11
Port name:
VLAN ID Name | Status Voice Jumbo Mode
------- -------------------- + ---------- ----- ----- --------
60 ABC | Port-based No No Tagged
70 DEF | Port-based No No Tagged
101 GHIJ | Port-based No No Untagged
105 KLMNO | Port-based No No Tagged
116 PQRS | Port-based No No Tagged
117 TVU | Port-based No No Tagged
SWITCH# show vlan port 2/12 detail
Status and Counters - VLAN Information - for ports 2/12
Port name:
VLAN ID Name | Status Voice Jumbo Mode
------- -------------------- + ---------- ----- ----- --------
61 ABC | Port-based No No Tagged
71 DEF | Port-based No No Tagged
103 GHI | Port-based No No Untagged
"""
template = """
<vars>
hostname="gethostname"
</vars>
<group name="vlans*">
Status and Counters - VLAN Information - for ports {{ Port_Number }}
{{ Tagged_VLAN | joinmatches(" ") }} {{ ignore }} | {{ ignore }} {{ ignore }} {{ ignore }} Tagged
{{ Untagged_VLAN }} {{ ignore }} | {{ ignore }} {{ ignore }} {{ ignore }} Untagged
{{ Hostname | set(hostname) }}
</group>
<output>
format = "csv"
path = "vlans"
headers = "Hostname, Port_Number, Untagged_VLAN, Tagged_VLAN"
</output>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# print(res)
assert res == [
'"Hostname","Port_Number","Untagged_VLAN","Tagged_VLAN"\n"SWITCH","2/11","101","60 70 105 116 117"\n"SWITCH","2/12","103","61 71"'
]
# test_reddit_answer_1()
def test_reddit_answer_2():
data = """
config router ospf
set abr-type standard
set auto-cost-ref-bandwidth 1000
set distance-external 110
set distance-inter-area 110
set distance-intra-area 110
set database-overflow disable
set database-overflow-max-lsas 10000
set database-overflow-time-to-recover 300
set default-information-originate disable
set default-information-metric 10
set default-information-metric-type 2
set default-information-route-map ''
set default-metric 10
set distance 110
set rfc1583-compatible disable
set router-id 10.1.1.1
set spf-timers 5 10
set bfd disable
set log-neighbour-changes enable
set distribute-list-in "OSPF_IMPORT_PREFIX"
set distribute-route-map-in ''
set restart-mode none
set restart-period 120
config area
edit 0.0.0.1
set shortcut disable
set authentication none
set default-cost 10
set nssa-translator-role candidate
set stub-type summary
set type nssa
set nssa-default-information-originate disable
set nssa-default-information-originate-metric 10
set nssa-default-information-originate-metric-type 2
set nssa-redistribution enable
next
end
config ospf-interface
edit "vlan1-int"
set interface "Vlan1"
set ip 0.0.0.0
set authentication text
set authentication-key netconanRemoved13
set prefix-length 0
set retransmit-interval 5
set transmit-delay 1
set cost 0
set priority 1
set dead-interval 40
set hello-interval 10
set hello-multiplier 0
set database-filter-out disable
set mtu 0
set mtu-ignore disable
set network-type point-to-point
set bfd global
set status enable
set resync-timeout 40
next
edit "vlan2-int"
set interface "vlan2"
set ip 0.0.0.0
set authentication text
set authentication-key netconanRemoved14
set prefix-length 0
set retransmit-interval 5
set transmit-delay 1
set cost 0
set priority 1
set dead-interval 40
set hello-interval 10
set hello-multiplier 0
set database-filter-out disable
set mtu 0
set mtu-ignore disable
set network-type point-to-point
set bfd global
set status enable
set resync-timeout 40
next
end
config network
edit 1
set prefix 10.1.1.1 255.255.255.252
set area 0.0.0.1
next
edit 2
set prefix 10.1.1.3 255.255.255.252
set area 0.0.0.1
next
end
config redistribute "connected"
set status enable
set metric 0
set routemap ''
set metric-type 2
set tag 0
end
config redistribute "static"
set status enable
set metric 0
set routemap ''
set metric-type 2
set tag 0
end
config redistribute "rip"
set status disable
set metric 0
set routemap ''
set metric-type 2
set tag 0
end
config redistribute "bgp"
set status enable
set metric 0
set routemap ''
set metric-type 2
set tag 0
end
config redistribute "isis"
set status disable
set metric 0
set routemap ''
set metric-type 2
set tag 0
end
end
"""
template = """
<vars>
clean_phrase = [
'ORPHRASE',
'macro(\"clean_str\")'
]
clean_list = [
'ORPHRASE',
'macro(\"build_list\")'
]
</vars>
<macro>
def build_list(data):
if "\\" \\"" in data:
t = data.split("\\" \\"")
for i in range(0, len(t)):
t[i] = t[i].strip("\\"").replace(" ", "_")
return t
else:
return [data.strip("\\"").replace(" ", "_")]
def clean_str(data):
return data.replace("\\"","").replace(" ", "_")
def match_ip_or_any(data):
import ipaddress
if data == \"any\":
return data
elif "/" in data:
return str(data)
else:
t = data.replace(" ", "/")
return str(ipaddress.IPv4Network(t, strict=False))
def ignore_empty(data):
if data == "\'\'":
return bool(False)
else:
return data
</macro>
<macro>
def skip_empty(data):
if data == {}:
return False
return data
</macro>
<group name="ospf">
config router ospf {{ _start_ }}
set auto-cost-ref-bandwidth {{ ref_bw }}
set default-information-originate {{ default_originate | contains("enable") }}
set default-information-metric {{ default_originate_metric }}
set default-information-metric-type {{ default_originate_metric_type }}
set default-information-route-map {{ default_originate_routemap | chain("clean_phrase") | macro("ignore_empty") }}
set default-metric {{ default_rt_metric }}
set rfc1583-compatible {{ rfc1583_compat | contains("enable") }}
set router-id {{ router_id }}
set distribute-list-in {{ dist_list_in | chain("clean_phrase") | macro("ignore_empty") }}
set distribute-route-map-in {{ dist_routemap_in | chain("clean_phrase") | macro("ignore_empty") }}
<group name="areas*" macro="skip_empty">
config area {{ _start_ }}
<group>
edit {{ area | _start_ }}
set stub-type {{ stub_type }}
set type {{ area_type }}
set nssa-default-information-originate {{ nssa_default_originate | contains("enable") }}
set nssa-default-information-originate-metric {{ nssa_default_metric }}
set nssa-default-information-originate-metric-type {{ nssa_default_metric_type }}
set nssa-redistribution {{ nssa_redis }}
next {{ _end_ }}
</group>
end {{ _end_ }}
</group>
<group name="interfaces*" macro="skip_empty">
config ospf-interface {{ _start_ }}
<group contains="status">
edit {{ name | chain("clean_phrase") | _start_ }}
set interface {{ interface | chain("clean_phrase")}}
set ip {{ ip | exclude("0.0.0.0") }}
set cost {{ cost | exclude("0") }}
set priority {{ priority }}
set mtu {{ mtu | exclude("0") }}
set network-type {{ network }}
set status {{ status | contains("enable") }}
next {{ _end_ }}
</group>
end {{ _end_ }}
</group>
<group name="networks*" macro="skip_empty">
config network {{ _start_ }}
<group>
edit {{ id | _start_ }}
set prefix {{ prefix | ORPHRASE | to_ip | with_prefixlen }}
set area {{ area }}
next {{ _end_ }}
</group>
end {{ _end_ }}
</group>
<group name="redistribute*" contains="status">
config redistribute {{ protocol | chain("clean_phrase") | _start_ }}
set status {{ status | contains('enable') }}
set route-map {{ route_map | chain("clean_phrase") | macro("ignore_empty") }}
set metric-type {{ metric-type }}
set metric {{ metric | exclude("0") }}
set tag {{ tag | exclude("0")}}
end {{ _end_ }}
</group>
end {{ _end_ }}
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"ospf": {
"areas": [
{
"area": "0.0.0.1",
"area_type": "nssa",
"nssa_default_metric": "10",
"nssa_default_metric_type": "2",
"nssa_redis": "enable",
"stub_type": "summary",
}
],
"default_originate_metric": "10",
"default_originate_metric_type": "2",
"default_rt_metric": "10",
"dist_list_in": "OSPF_IMPORT_PREFIX",
"interfaces": [
{
"interface": "Vlan1",
"name": "vlan1-int",
"network": "point-to-point",
"priority": "1",
"status": "enable",
},
{
"interface": "vlan2",
"name": "vlan2-int",
"network": "point-to-point",
"priority": "1",
"status": "enable",
},
],
"networks": [
{"area": "0.0.0.1", "id": "1", "prefix": "10.1.1.1/30"},
{"area": "0.0.0.1", "id": "2", "prefix": "10.1.1.3/30"},
],
"redistribute": [
{
"metric-type": "2",
"protocol": "connected",
"status": "enable",
},
{"metric-type": "2", "protocol": "static", "status": "enable"},
{"metric-type": "2", "protocol": "bgp", "status": "enable"},
],
"ref_bw": "1000",
"router_id": "10.1.1.1",
}
}
]
]
# test_reddit_answer_2()
def test_github_issue_32():
data = """
.id=*c;export-route-targets=65001:48;65001:0;import-route-targets=65001:48;interfaces=lo-ext;vlan56;route-distinguisher=65001:48;routing-mark=VRF_EXT
.id=*10;comment=;export-route-targets=65001:80;import-route-targets=65001:80;65001:0;interfaces=lo-private;route-distinguisher=65001:80;routing-mark=VRF_PRIVATE
"""
template = """
<group method="table">
.id={{ id | exclude(";") }};export-route-targets={{ export-route-targets }};import-route-targets={{ import-route-targets }};interfaces={{ interfaces }};route-distinguisher={{ route-distinguisher }};routing-mark={{ routing-mark }}
.id={{ id }};comment{{ comment }};export-route-targets={{ export-route-targets }};import-route-targets={{ import-route-targets }};interfaces={{ interfaces }};route-distinguisher={{ route-distinguisher }};routing-mark={{ routing-mark }}
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result(structure="flat_list")
# pprint.pprint(res)
assert res == [
{
"export-route-targets": "65001:48;65001:0",
"id": "*c",
"import-route-targets": "65001:48",
"interfaces": "lo-ext;vlan56",
"route-distinguisher": "65001:48",
"routing-mark": "VRF_EXT",
},
{
"comment": "=",
"export-route-targets": "65001:80",
"id": "*10",
"import-route-targets": "65001:80;65001:0",
"interfaces": "lo-private",
"route-distinguisher": "65001:80",
"routing-mark": "VRF_PRIVATE",
},
]
# test_github_issue_32()
def test_slack_answer_1():
data = """
Firmware
Version
----------------
02.1.1 Build 002
Hardware
Version
----------------
V2R4
"""
template = """
<group name="versions">
Hardware {{ _start_ }}
Firmware {{ _start_ }}
{{ version | PHRASE | let("type", "firmware") }}
{{ version | exclude("---") | exclude("Vers") | let("type", "hardware") }}
{{ _end_ }}
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result(structure="flat_list")
# pprint.pprint(res)
assert res == [
{
"versions": [
{"type": "firmware", "version": "02.1.1 Build 002"},
{"type": "hardware", "version": "V2R4"},
]
}
]
# test_slack_answer_1()
def test_group_default_docs():
template = """
<input load="text">
device-hostname uptime is 27 weeks, 3 days, 10 hours, 46 minutes, 10 seconds
</input>
<group name="uptime**">
device-hostname uptime is {{ uptime | PHRASE }}
<group name="software">
software version {{ version | default("unknown") }}
</group>
</group>
<group name="domain" default="Unknown">
Default domain is {{ fqdn }}
</group>
"""
parser = ttp(template=template)
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"domain": {"fqdn": "Unknown"},
"uptime": {
"software": {"version": "unknown"},
"uptime": "27 weeks, 3 days, 10 hours, 46 minutes, 10 seconds",
},
}
]
]
# test_group_default_docs()
def test_github_issue_34_answer():
template = """
<input load="text">
Hi World
</input>
<group name='demo'>
<group name='audiences*'>
Hello {{ audience | default([]) }}
</group>
</group>
"""
parser = ttp(template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [[{"demo": {"audiences": [{"audience": []}]}}]]
# test_github_issue_34_answer()
def test_github_issue_33_answer_1():
template = """
<input load="text">
server 1.1.1.1
server 2.2.2.2 3.3.3.3
server 4.4.4.4 5.5.5.5 6.6.6.6
</input>
<group name="servers" method="table">
server {{ server | re(r"\\S+") | let("servers_number", 1 ) }}
server {{ server | re(r"\\S+ \\S+") | let("servers_number", 2) }}
server {{ server | re(r"\\S+ \\S+ \\S+") | let("servers_number", 3) }}
</group>
"""
parser = ttp(template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"servers": [
{"server": "1.1.1.1", "servers_number": 1},
{"server": "2.2.2.2 3.3.3.3", "servers_number": 2},
{"server": "4.4.4.4 5.5.5.5 6.6.6.6", "servers_number": 3},
]
}
]
]
# test_github_issue_33_answer_1()
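The table-method template above distinguishes rows by how many whitespace-separated tokens follow the "server" keyword. A hypothetical stdlib-only sketch of the same classification logic (for illustration only, not TTP internals):

```python
def classify_server_line(line):
    # Hypothetical helper: count the address tokens after "server",
    # mirroring the re("\S+") / re("\S+ \S+") / re("\S+ \S+ \S+") lines.
    tokens = line.split()
    if not tokens or tokens[0] != "server":
        return None
    addresses = tokens[1:]
    return {"server": " ".join(addresses), "servers_number": len(addresses)}

assert classify_server_line("server 2.2.2.2 3.3.3.3") == {
    "server": "2.2.2.2 3.3.3.3",
    "servers_number": 2,
}
```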
def test_issue_36():
template = """
<input load="text">
ip access-list standard 42
10 remark machine_A
10 permit 192.168.200.162
20 remark machine_B
20 permit 192.168.200.149
30 deny any log
ip access-list standard 98
10 permit 10.10.10.1
20 remark toto
20 permit 30.30.30.1
30 permit 30.30.30.0 0.0.0.255
ip access-list standard 99
10 permit 10.20.30.40 log
20 permit 20.30.40.1 log
30 remark DEVICE - SNMP RW
30 permit 50.50.50.128 0.0.0.127
40 permit 60.60.60.64 0.0.0.63
ip access-list extended 199
10 remark COLLECTOR - SNMP
10 permit ip 70.70.70.0 0.0.0.255 any
20 remark RETURN - Back
20 permit ip 80.80.80.0 0.0.0.127 any
30 remark VISUALIZE
30 permit ip host 90.90.90.138 any
</input>
<group name="ip.{{ acl_type }}.{{ acl_name }}">
ip access-list {{ acl_type }} {{ acl_name }}
<group name="{{ entry_id }}*" method="table">
{{ entry_id }} remark {{ remark_name | re(".+") | let("action", "remark") }}
{{ entry_id }} {{ action }} {{ src_host }}
{{ entry_id }} {{ action }} {{ src_host | let("log", "log") }} log
{{ entry_id }} {{ action }} {{ protocol }} host {{ src_host | let("dest_any", "any") }} any
{{ entry_id }} {{ action }} {{ protocol }} {{ src_ntw | let("dest_any", "any") }} {{ src_wildcard | IP }} any
{{ entry_id }} {{ action }} {{ src_ntw }} {{ src_wildcard | IP }}
</group>
</group>
"""
parser = ttp(template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"ip": {
"extended": {
"199": {
"10": [
{"action": "remark", "remark_name": "COLLECTOR - SNMP"},
{
"action": "permit",
"dest_any": "any",
"protocol": "ip",
"src_ntw": "70.70.70.0",
"src_wildcard": "0.0.0.255",
},
],
"20": [
{"action": "remark", "remark_name": "RETURN - Back"},
{
"action": "permit",
"dest_any": "any",
"protocol": "ip",
"src_ntw": "80.80.80.0",
"src_wildcard": "0.0.0.127",
},
],
"30": [
{"action": "remark", "remark_name": "VISUALIZE"},
{
"action": "permit",
"dest_any": "any",
"protocol": "ip",
"src_host": "90.90.90.138",
},
],
}
},
"standard": {
"42": {
"10": [
{"action": "remark", "remark_name": "machine_A"},
{"action": "permit", "src_host": "192.168.200.162"},
],
"20": [
{"action": "remark", "remark_name": "machine_B"},
{"action": "permit", "src_host": "192.168.200.149"},
],
"30": [{"action": "deny", "log": "log", "src_host": "any"}],
},
"98": {
"10": [{"action": "permit", "src_host": "10.10.10.1"}],
"20": [
{"action": "remark", "remark_name": "toto"},
{"action": "permit", "src_host": "30.30.30.1"},
],
"30": [
{
"action": "permit",
"src_ntw": "30.30.30.0",
"src_wildcard": "0.0.0.255",
}
],
},
"99": {
"10": [
{
"action": "permit",
"log": "log",
"src_host": "10.20.30.40",
}
],
"20": [
{
"action": "permit",
"log": "log",
"src_host": "20.30.40.1",
}
],
"30": [
{"action": "remark", "remark_name": "DEVICE - SNMP RW"},
{
"action": "permit",
"src_ntw": "50.50.50.128",
"src_wildcard": "0.0.0.127",
},
],
"40": [
{
"action": "permit",
"src_ntw": "60.60.60.64",
"src_wildcard": "0.0.0.63",
}
],
},
},
}
}
]
]
# test_issue_36()
def test_github_issue_37_original_data_template():
template = """
<macro>
import re
def qinq(data):
data = re.sub(r"\\*", r"qinq", data)
return data
</macro>
<group name="service">
service {{ ignore }}
<group name="epipe.{{ service_id }}" default="none">
epipe {{ service_id | _start_ }} customer {{ customer_id }} create
description "{{ description | ORPHRASE | default("none") }}"
service-mtu {{ service_mtu | default("none") }}
service-name "{{ service_name | ORPHRASE | default("none") }}"
<group name="endpoint" default="none">
endpoint {{ endpoint | _start_ }} create
revert-time {{ revert_time | default("none") }}
exit {{ _end_ }}
</group>
<group name="sap.{{ sap_id }}" default="none">
sap {{ sap_id | macro("qinq") | _start_ | ORPHRASE }} create
description "{{ description | ORPHRASE | default("none")}}"
multi-service-site "{{ mss_name | default("none") }}"
<group name="ingress" default="default_ingress" >
ingress {{ _start_ }}
qos {{ sap_ingress | default("1") }}
scheduler-policy {{ scheduler_policy | default("none")}}
exit {{ _end_ }}
</group>
<group name="egress" default="default_egress">
egress {{ _start_ }}
scheduler-policy {{ scheduler_policy | default("none") }}
qos {{ sap_egress | default("1)") }}
exit {{ _end_ }}
</group>
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
<group name="pwr_sdp.{{pwr_spoke_sdp_id}}**" default="none">
spoke-sdp {{ pwr_spoke_sdp_id | default("none")}}:{{vc_id | _start_ | default("none") }} endpoint {{ endpoint | default("none") }} create
precedence {{ precedence | default("default_precedence") }}
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
<group name="regular_sdp.{{r_spoke_sdp_id}}**" default="none">
spoke-sdp {{ r_spoke_sdp_id }}:{{vc_id | _start_ }} create
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
exit {{ _end_ }}
</group>
"""
data = """
service foo
epipe 103076 customer 160 create
description "vf=EWL:cn=TATA_COM:tl=2C02495918:st=act:"
service-mtu 1588
service-name "EPIPE service-103076 DKTN08a-D0105 (63.130.108.41)"
sap 1/2/12:20.* create
description "vf=EWL:cn=TATA_COM:tl=2C02495890:st=act:"
multi-service-site "TATA_VSNL_STRAT_A206_LAN10"
ingress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
egress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8051:103076 create
no shutdown
exit
no shutdown
exit
epipe 103206 customer 1904 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
service-mtu 1988
service-name "EPIPE service-103206 DKTN08a-D0105 (63.130.108.41)"
sap 2/2/3:401.100 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
multi-service-site "SKANSKA_E13DG_A825_LAN1"
ingress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
egress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
collect-stats
accounting-policy 4
no shutdown
exit
spoke-sdp 8035:103206 create
no shutdown
exit
no shutdown
exit
epipe 103256 customer 160 create
description "vf=EWL:cn=TATA_COMM:tl=2C02490189:st=act:"
service-mtu 1988
service-name "EPIPE service-103256 DKTN08a-D0105 (63.130.108.41)"
sap 1/2/12:15.* create
description "vf=EWL:cn=TATA_COMM:tl=2C02490171:st=act:"
multi-service-site "TATA_VSNL_STRAT_A206_LAN5"
ingress
qos 11000
queue-override
queue 1 create
cbs default
mbs 391 kilobytes
rate 100000 cir 100000
exit
exit
exit
egress
qos 11000
queue-override
queue 1 create
cbs default
mbs 391 kilobytes
rate 100000 cir 100000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8139:103256 create
no shutdown
exit
no shutdown
exit
epipe 103742 customer 160 create
description "vf=EWL:cn=TATA_COM:tl=2C02410363:st=act:"
service-mtu 1588
service-name "EPIPE service-103742 DKTN08a-D0105 (63.130.108.41)"
sap 5/2/50:20.* create
description "vf=EWL:cn=TATA_COM:tl=2C02410338:st=act:"
multi-service-site "TATA_STRAT_LON_A206_LANA"
ingress
qos 11000
queue-override
queue 1 create
cbs default
mbs 32 kilobytes
rate 8000 cir 8000
exit
exit
exit
egress
qos 11000
queue-override
queue 1 create
cbs default
mbs 32 kilobytes
rate 8000 cir 8000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8061:103742 create
no shutdown
exit
no shutdown
exit
epipe 55513386 customer 4 vc-switching create
description "vf=EAGG:cn=Bulldog:tl=VF"
service-mtu 1526
spoke-sdp 78:55513386 create
control-word
no shutdown
exit
spoke-sdp 8245:55513386 create
control-word
no shutdown
exit
no shutdown
exit
epipe 55517673 customer 4 create
description "vf=EAGG:cn=Bulldog:tl=2C01291821:st=act:no=NGA EPIPE#BAACTQ#VLAN 901"
service-mtu 1526
service-name "epipe service-64585 DKTN08a-D0105 (63.130.108.41)"
endpoint "SDP" create
revert-time infinite
exit
sap 2/2/3:901.* create
description "2_2_3,H0505824A,Bulldog,VLAN 901"
ingress
scheduler-policy "NGA-LLU-300M"
qos 20010
exit
egress
scheduler-policy "NGA-LLU-300M"
qos 20010
exit
no shutdown
exit
spoke-sdp 8243:55517673 endpoint "SDP" create
collect-stats
precedence 1
no shutdown
exit
spoke-sdp 8245:55517673 endpoint "SDP" create
collect-stats
precedence primary
no shutdown
exit
no shutdown
exit
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"service": {
"epipe": {
"103076": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COM:tl=2C02495918:st=act:",
"regular_sdp": {
"8051": {"state": "enabled", "vc_id": "103076"}
},
"sap": {
"1/2/12:20.qinq": {
"description": "vf=EWL:cn=TATA_COM:tl=2C02495890:st=act:",
"egress": {
"sap_egress": "1)",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "1",
"scheduler_policy": "none",
},
"mss_name": "TATA_VSNL_STRAT_A206_LAN10",
"state": "enabled",
}
},
"service_mtu": "1588",
"service_name": "EPIPE service-103076 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103206": {
"customer_id": "1904",
"description": "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA "
"UK PLC Stepney Green E1 "
"3DG'",
"regular_sdp": {
"8035": {"state": "enabled", "vc_id": "103206"}
},
"sap": {
"2/2/3:401.100": {
"description": "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA "
"UK "
"PLC "
"Stepney "
"Green "
"E1 "
"3DG'",
"egress": {
"sap_egress": "11010",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11010",
"scheduler_policy": "none",
},
"mss_name": "SKANSKA_E13DG_A825_LAN1",
"state": "enabled",
}
},
"service_mtu": "1988",
"service_name": "EPIPE service-103206 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103256": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COMM:tl=2C02490189:st=act:",
"regular_sdp": {
"8139": {"state": "enabled", "vc_id": "103256"}
},
"sap": {
"1/2/12:15.qinq": {
"description": "vf=EWL:cn=TATA_COMM:tl=2C02490171:st=act:",
"egress": {
"sap_egress": "11000",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11000",
"scheduler_policy": "none",
},
"mss_name": "TATA_VSNL_STRAT_A206_LAN5",
"state": "enabled",
}
},
"service_mtu": "1988",
"service_name": "EPIPE service-103256 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103742": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COM:tl=2C02410363:st=act:",
"regular_sdp": {
"8061": {"state": "enabled", "vc_id": "103742"}
},
"sap": {
"5/2/50:20.qinq": {
"description": "vf=EWL:cn=TATA_COM:tl=2C02410338:st=act:",
"egress": {
"sap_egress": "11000",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11000",
"scheduler_policy": "none",
},
"mss_name": "TATA_STRAT_LON_A206_LANA",
"state": "enabled",
}
},
"service_mtu": "1588",
"service_name": "EPIPE service-103742 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"55517673": {
"customer_id": "4",
"description": "vf=EAGG:cn=Bulldog:tl=2C01291821:st=act:no=NGA "
"EPIPE#BAACTQ#VLAN 901",
"endpoint": {
"endpoint": '"SDP"',
"revert_time": "infinite",
},
"pwr_sdp": {
"8243": {
"endpoint": '"SDP"',
"precedence": "1",
"state": "enabled",
"vc_id": "55517673",
},
"8245": {
"endpoint": '"SDP"',
"precedence": "primary",
"state": "enabled",
"vc_id": "55517673",
},
},
"sap": {
"2/2/3:901.qinq": {
"description": "2_2_3,H0505824A,Bulldog,VLAN "
"901",
"egress": {
"sap_egress": "20010",
"scheduler_policy": '"NGA-LLU-300M"',
},
"ingress": {
"sap_ingress": "20010",
"scheduler_policy": '"NGA-LLU-300M"',
},
"mss_name": "none",
"state": "enabled",
}
},
"service_mtu": "1526",
"service_name": "epipe service-64585 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
}
}
}
]
]
# test_github_issue_37_original_data_template()
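The qinq macro in the template above exists only to turn the "*" wildcard in QinQ SAP ids into a literal token that is safe to use in a group path. Its substitution can be checked in isolation:

```python
import re

def qinq(data):
    # Same substitution as the template macro: replace "*" with "qinq".
    return re.sub(r"\*", "qinq", data)

# A wildcard outer-tag SAP id becomes a plain string usable as a dict key.
assert qinq("1/2/12:20.*") == "1/2/12:20.qinq"
assert qinq("2/2/3:401.100") == "2/2/3:401.100"  # ids without "*" are unchanged
```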
def test_github_issue_37_cleaned_up_data():
"""
Problem with below template without bug fix, was that
'no shutdown' statement for sap group was matched by
spoke-sdp group as well and added to results causing
false match. To fix it, added tracking of previously
started groups in results object, so that before add
match results to overall results if PATH differ need
to check that this particular item groups has been
started before, previous logic was not checking for that.
Have not noticed any issues with other 200+ tests or
any performance degradation for single/multi-process
parsing.
"""
template = """
<group name="service">
service {{ ignore }}
<group name="epipe.{{ service_id }}">
epipe {{ service_id }} customer {{ customer_id }} create
<group name="regular_sdp.{{r_spoke_sdp_id}}**">
spoke-sdp {{ r_spoke_sdp_id }}:{{vc_id }} create
no shutdown {{ state | set("enabled") }}
</group>
</group>
</group>
"""
data = """
service foo
epipe 103076 customer 160 create
description "vf=EWL:cn=TATA_COM:tl=2C02495918:st=act:"
service-mtu 1588
service-name "EPIPE service-103076 DKTN08a-D0105 (63.130.108.41)"
sap 1/2/12:20.* create
description "vf=EWL:cn=TATA_COM:tl=2C02495890:st=act:"
multi-service-site "TATA_VSNL_STRAT_A206_LAN10"
ingress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
egress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8051:103076 create
no shutdown
exit
no shutdown
exit
epipe 103206 customer 1904 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
service-mtu 1988
service-name "EPIPE service-103206 DKTN08a-D0105 (63.130.108.41)"
sap 2/2/3:401.100 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
multi-service-site "SKANSKA_E13DG_A825_LAN1"
ingress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
egress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
collect-stats
accounting-policy 4
no shutdown
exit
spoke-sdp 8035:103206 create
no shutdown
exit
no shutdown
exit
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"service": {
"epipe": {
"103076": {
"customer_id": "160",
"regular_sdp": {
"8051": {"state": "enabled", "vc_id": "103076"}
},
},
"103206": {
"customer_id": "1904",
"regular_sdp": {
"8035": {"state": "enabled", "vc_id": "103206"}
},
},
}
}
}
]
]
# test_github_issue_37_cleaned_up_data()
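A minimal model of the started-group tracking described in the docstring above; this is an assumption-level sketch, not TTP's actual implementation. A child group's match is merged only if its parent path was started first, so a trailing "no shutdown" line cannot be claimed by a sibling group that never started:

```python
def merge_match(results, started_groups, path, match):
    """Merge match into results only if path's parent group was started."""
    parent = tuple(path[:-1])
    if parent and parent not in started_groups:
        return False  # parent group never started: discard as a false match
    started_groups.add(tuple(path))
    node = results
    for key in path[:-1]:
        node = node.setdefault(key, {})
    node.setdefault(path[-1], {}).update(match)
    return True

results, started = {}, set()
assert merge_match(results, started, ["epipe"], {"service_id": "103076"})
assert merge_match(results, started, ["epipe", "regular_sdp"], {"state": "enabled"})
# A group whose parent never started is rejected instead of polluting results.
assert not merge_match(results, started, ["sap", "regular_sdp"], {"state": "enabled"})
```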
def test_github_issue_37_cleaned_data_template():
template = """
<group name="service">
service {{ ignore }}
<group name="epipe.{{ service_id }}" default="none">
epipe {{ service_id }} customer {{ customer_id }} create
description "{{ description | ORPHRASE }}"
service-mtu {{ service_mtu }}
service-name "{{ service_name | ORPHRASE }}"
<group name="endpoint" default="none">
endpoint {{ endpoint }} create
revert-time {{ revert_time }}
exit {{ _end_ }}
</group>
<group name="sap.{{ sap_id }}" default="none">
sap {{ sap_id | resub(r"\\*", "qinq") | ORPHRASE }} create
description "{{ description | ORPHRASE }}"
multi-service-site "{{ mss_name }}"
<group name="ingress">
ingress {{ _start_ }}
qos {{ sap_ingress | default("1") }}
scheduler-policy {{ scheduler_policy | default("none")}}
exit {{ _end_ }}
</group>
<group name="egress">
egress {{ _start_ }}
scheduler-policy {{ scheduler_policy | default("none") }}
qos {{ sap_egress | default("1)") }}
exit {{ _end_ }}
</group>
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
<group name="pwr_sdp.{{pwr_spoke_sdp_id}}**" default="none">
spoke-sdp {{ pwr_spoke_sdp_id }}:{{vc_id }} endpoint {{ endpoint }} create
precedence {{ precedence | default("default_precedence") }}
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
<group name="regular_sdp.{{r_spoke_sdp_id}}**" default="none">
spoke-sdp {{ r_spoke_sdp_id }}:{{vc_id }} create
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
no shutdown {{ state | set("enabled") | default("disabled") }}
exit {{ _end_ }}
</group>
exit {{ _end_ }}
</group>
"""
data = """
service foo
epipe 103076 customer 160 create
description "vf=EWL:cn=TATA_COM:tl=2C02495918:st=act:"
service-mtu 1588
service-name "EPIPE service-103076 DKTN08a-D0105 (63.130.108.41)"
sap 1/2/12:20.* create
description "vf=EWL:cn=TATA_COM:tl=2C02495890:st=act:"
multi-service-site "TATA_VSNL_STRAT_A206_LAN10"
ingress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
egress
queue-override
queue 1 create
cbs default
mbs 40 kilobytes
rate 10000 cir 10000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8051:103076 create
no shutdown
exit
no shutdown
exit
epipe 103206 customer 1904 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
service-mtu 1988
service-name "EPIPE service-103206 DKTN08a-D0105 (63.130.108.41)"
sap 2/2/3:401.100 create
description "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA UK PLC Stepney Green E1 3DG'"
multi-service-site "SKANSKA_E13DG_A825_LAN1"
ingress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
egress
qos 11010
queue-override
queue 1 create
cbs default
mbs 1188 kilobytes
rate max cir 47500
exit
queue 3 create
cbs default
mbs 63 kilobytes
rate max cir 2500
exit
exit
exit
collect-stats
accounting-policy 4
no shutdown
exit
spoke-sdp 8035:103206 create
no shutdown
exit
no shutdown
exit
epipe 103256 customer 160 create
description "vf=EWL:cn=TATA_COMM:tl=2C02490189:st=act:"
service-mtu 1988
service-name "EPIPE service-103256 DKTN08a-D0105 (63.130.108.41)"
sap 1/2/12:15.* create
description "vf=EWL:cn=TATA_COMM:tl=2C02490171:st=act:"
multi-service-site "TATA_VSNL_STRAT_A206_LAN5"
ingress
qos 11000
queue-override
queue 1 create
cbs default
mbs 391 kilobytes
rate 100000 cir 100000
exit
exit
exit
egress
qos 11000
queue-override
queue 1 create
cbs default
mbs 391 kilobytes
rate 100000 cir 100000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8139:103256 create
no shutdown
exit
no shutdown
exit
epipe 103742 customer 160 create
description "vf=EWL:cn=TATA_COM:tl=2C02410363:st=act:"
service-mtu 1588
service-name "EPIPE service-103742 DKTN08a-D0105 (63.130.108.41)"
sap 5/2/50:20.* create
description "vf=EWL:cn=TATA_COM:tl=2C02410338:st=act:"
multi-service-site "TATA_STRAT_LON_A206_LANA"
ingress
qos 11000
queue-override
queue 1 create
cbs default
mbs 32 kilobytes
rate 8000 cir 8000
exit
exit
exit
egress
qos 11000
queue-override
queue 1 create
cbs default
mbs 32 kilobytes
rate 8000 cir 8000
exit
exit
exit
accounting-policy 4
no shutdown
exit
spoke-sdp 8061:103742 create
no shutdown
exit
no shutdown
exit
epipe 55513386 customer 4 vc-switching create
description "vf=EAGG:cn=Bulldog:tl=VF"
service-mtu 1526
spoke-sdp 78:55513386 create
control-word
no shutdown
exit
spoke-sdp 8245:55513386 create
control-word
no shutdown
exit
no shutdown
exit
epipe 55517673 customer 4 create
description "vf=EAGG:cn=Bulldog:tl=2C01291821:st=act:no=NGA EPIPE#BAACTQ#VLAN 901"
service-mtu 1526
service-name "epipe service-64585 DKTN08a-D0105 (63.130.108.41)"
endpoint "SDP" create
revert-time infinite
exit
sap 2/2/3:901.* create
description "2_2_3,H0505824A,Bulldog,VLAN 901"
ingress
scheduler-policy "NGA-LLU-300M"
qos 20010
exit
egress
scheduler-policy "NGA-LLU-300M"
qos 20010
exit
no shutdown
exit
spoke-sdp 8243:55517673 endpoint "SDP" create
collect-stats
precedence 1
no shutdown
exit
spoke-sdp 8245:55517673 endpoint "SDP" create
collect-stats
precedence primary
no shutdown
exit
no shutdown
exit
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"service": {
"epipe": {
"103076": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COM:tl=2C02495918:st=act:",
"regular_sdp": {
"8051": {"state": "enabled", "vc_id": "103076"}
},
"sap": {
"1/2/12:20.qinq": {
"description": "vf=EWL:cn=TATA_COM:tl=2C02495890:st=act:",
"egress": {
"sap_egress": "1)",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "1",
"scheduler_policy": "none",
},
"mss_name": "TATA_VSNL_STRAT_A206_LAN10",
"state": "enabled",
}
},
"service_mtu": "1588",
"service_name": "EPIPE service-103076 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103206": {
"customer_id": "1904",
"description": "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA "
"UK PLC Stepney Green E1 "
"3DG'",
"regular_sdp": {
"8035": {"state": "enabled", "vc_id": "103206"}
},
"sap": {
"2/2/3:401.100": {
"description": "vf=1273:cn=skanska:tl=3C02407455:st=act:no='SKANSKA "
"UK "
"PLC "
"Stepney "
"Green "
"E1 "
"3DG'",
"egress": {
"sap_egress": "11010",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11010",
"scheduler_policy": "none",
},
"mss_name": "SKANSKA_E13DG_A825_LAN1",
"state": "enabled",
}
},
"service_mtu": "1988",
"service_name": "EPIPE service-103206 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103256": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COMM:tl=2C02490189:st=act:",
"regular_sdp": {
"8139": {"state": "enabled", "vc_id": "103256"}
},
"sap": {
"1/2/12:15.qinq": {
"description": "vf=EWL:cn=TATA_COMM:tl=2C02490171:st=act:",
"egress": {
"sap_egress": "11000",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11000",
"scheduler_policy": "none",
},
"mss_name": "TATA_VSNL_STRAT_A206_LAN5",
"state": "enabled",
}
},
"service_mtu": "1988",
"service_name": "EPIPE service-103256 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"103742": {
"customer_id": "160",
"description": "vf=EWL:cn=TATA_COM:tl=2C02410363:st=act:",
"regular_sdp": {
"8061": {"state": "enabled", "vc_id": "103742"}
},
"sap": {
"5/2/50:20.qinq": {
"description": "vf=EWL:cn=TATA_COM:tl=2C02410338:st=act:",
"egress": {
"sap_egress": "11000",
"scheduler_policy": "none",
},
"ingress": {
"sap_ingress": "11000",
"scheduler_policy": "none",
},
"mss_name": "TATA_STRAT_LON_A206_LANA",
"state": "enabled",
}
},
"service_mtu": "1588",
"service_name": "EPIPE service-103742 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
"55517673": {
"customer_id": "4",
"description": "vf=EAGG:cn=Bulldog:tl=2C01291821:st=act:no=NGA "
"EPIPE#BAACTQ#VLAN 901",
"endpoint": {
"endpoint": '"SDP"',
"revert_time": "infinite",
},
"pwr_sdp": {
"8243": {
"endpoint": '"SDP"',
"precedence": "1",
"state": "enabled",
"vc_id": "55517673",
},
"8245": {
"endpoint": '"SDP"',
"precedence": "primary",
"state": "enabled",
"vc_id": "55517673",
},
},
"sap": {
"2/2/3:901.qinq": {
"description": "2_2_3,H0505824A,Bulldog,VLAN "
"901",
"egress": {
"sap_egress": "20010",
"scheduler_policy": '"NGA-LLU-300M"',
},
"ingress": {
"sap_ingress": "20010",
"scheduler_policy": '"NGA-LLU-300M"',
},
"mss_name": "none",
"state": "enabled",
}
},
"service_mtu": "1526",
"service_name": "epipe service-64585 "
"DKTN08a-D0105 "
"(63.130.108.41)",
"state": "enabled",
},
}
}
}
]
]
# test_github_issue_37_cleaned_data_template()
def test_github_issue_42():
data = """
vrf xyz
address-family ipv4 unicast
import route-target
65000:3507
65000:3511
65000:5453
65000:5535
!
export route-target
65000:5453
65000:5535
!
!
!
"""
template = """
<group name="vrfs">
vrf {{name}}
<group name="route-targets">
import route-target {{ _start_ }}
{{ import | to_list | joinmatches }}
</group>
!
<group name="route-targets">
export route-target {{ _start_ }}
{{ export | to_list | joinmatches }}
</group>
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"vrfs": {
"name": "xyz",
"route-targets": [
{
"import": [
"65000:3507",
"65000:3511",
"65000:5453",
"65000:5535",
]
},
{"export": ["65000:5453", "65000:5535"]},
],
}
}
]
]
# test_github_issue_42()
def test_github_issue_42_answer():
data = """
vrf xyz
address-family ipv4 unicast
import route-target
65000:3507
65000:3511
65000:5453
65000:5535
!
export route-target
65000:5453
65000:5535
!
!
!
"""
template = """
<group name="vrfs">
vrf {{name}}
<group name="import_rts">
import route-target {{ _start_ }}
{{ import_rt | _start_ }}
</group>
!
<group name="export_rts">
export route-target {{ _start_ }}
{{ export_rt | _start_ }}
</group>
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"vrfs": {
"export_rts": [
{"export_rt": "65000:5453"},
{"export_rt": "65000:5535"},
],
"import_rts": [
{"import_rt": "65000:3507"},
{"import_rt": "65000:3511"},
{"import_rt": "65000:5453"},
{"import_rt": "65000:5535"},
],
"name": "xyz",
}
}
]
]
# test_github_issue_42_answer()
def test_issue_45():
data = """
vrf2 {
forwarding-options {
dhcp-relay {
server-group {
IN_MEDIA_SIGNALING {
10.154.6.147;
}
DHCP-NGN-SIG {
10.154.6.147;
}
}
group group2 {
active-server-group IN_MEDIA_SIGNALING;
overrides {
trust-option-82;
}
}
group NGN-SIG {
active-server-group DHCP-NGN-SIG;
overrides {
trust-option-82;
}
}
}
}
}
"""
template = """
<group name="vrfs*">
{{ name | _start_ }} {
<group name="forwarding_options">
forwarding-options { {{ _start_ }}
<group name="dhcp_relay">
dhcp-relay { {{ _start_ }}
<group name="server_group">
server-group { {{ _start_ }}
<group name="dhcp*">
{{ server_group_name1 | _start_ }} {
<group name="helper_addresses*">
{{ helper_address | IP }};
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
<group name="groups*">
group {{ group_name | _start_ }} {
active-server-group {{server_group_name2}};
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
# assert res == [
# [
# {
# "vrfs": [
# {
# "forwarding_options": {
# "dhcp_relay": {
# "groups": [
# {
# "group_name": "group2",
# "server_group_name2": "IN_MEDIA_SIGNALING",
# },
# {
# "group_name": "NGN-SIG",
# "server_group_name2": "DHCP-NGN-SIG",
# },
# ],
# "server_group": {
# "dhcp": [
# {
# "helper_addresses": [
# {"helper_address": "10.154.6.147"}
# ],
# "server_group_name1": "IN_MEDIA_SIGNALING",
# },
# {
# "helper_addresses": [
# {"helper_address": "10.154.6.147"}
# ],
# "server_group_name1": "DHCP-NGN-SIG",
# },
# {"server_group_name1": "overrides"},
# {"server_group_name1": "overrides"},
# ]
# },
# }
# },
# "name": "vrf2",
# }
# ]
# }
# ]
# ]
# the issue was fixed by introducing ended_groups tracking in results
# processing while working on a fix for issue 57
assert res == [
[
{
"vrfs": [
{
"forwarding_options": {
"dhcp_relay": {
"groups": [
{
"group_name": "group2",
"server_group_name2": "IN_MEDIA_SIGNALING",
},
{
"group_name": "NGN-SIG",
"server_group_name2": "DHCP-NGN-SIG",
},
],
"server_group": {
"dhcp": [
{
"helper_addresses": [
{"helper_address": "10.154.6.147"}
],
"server_group_name1": "IN_MEDIA_SIGNALING",
},
{
"helper_addresses": [
{"helper_address": "10.154.6.147"}
],
"server_group_name1": "DHCP-NGN-SIG",
},
]
},
}
},
"name": "vrf2",
}
]
}
]
]
# test_issue_45()
def test_issue_45_1():
data = """
vrf2 {
forwarding-options {
dhcp-relay {
server-group {
IN_MEDIA_SIGNALING {
10.154.6.147;
}
group NGN-SIG {
active-server-group DHCP-NGN-SIG;
overrides {
trust-option-82;
}
}
}
}
}
"""
template = """
<group name="vrfs*">
{{ name | _start_ }} {
<group name="forwarding_options">
forwarding-options { {{ _start_ }}
<group name="dhcp_relay">
dhcp-relay { {{ _start_ }}
<group name="server_group">
server-group { {{ _start_ }}
<group name="dhcp*">
{{ server_group_name | _start_ }} {
</group>
</group>
<group name="groups*">
group {{ group_name | _start_ }} {
</group>
</group>
</group>
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"vrfs": [
{
"forwarding_options": {
"dhcp_relay": {
"groups": [{"group_name": "NGN-SIG"}],
"server_group": {
"dhcp": [
{"server_group_name": "IN_MEDIA_SIGNALING"},
{"server_group_name": "overrides"},
]
},
}
},
"name": "vrf2",
}
]
}
]
]
# test_issue_45_1()
def test_issue_45_filtering_fix():
data = """
vrf2 {
forwarding-options {
dhcp-relay {
server-group {
IN_MEDIA_SIGNALING {
10.154.6.147;
}
DHCP-NGN-SIG {
10.154.6.147;
}
}
group group2 {
active-server-group IN_MEDIA_SIGNALING;
overrides {
trust-option-82;
}
}
group NGN-SIG {
active-server-group DHCP-NGN-SIG;
overrides {
trust-option-82;
}
}
}
}
}
"""
template = """
<group name="vrfs*">
{{ name | _start_ }} {
<group name="forwarding_options">
forwarding-options { {{ _start_ }}
<group name="dhcp_relay">
dhcp-relay { {{ _start_ }}
<group name="server_group">
server-group { {{ _start_ }}
<group name="dhcp*">
{{ server_group_name1 | _start_ | exclude("overrides") }} {
<group name="helper_addresses*">
{{ helper_address | IP }};
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
<group name="groups*">
group {{ group_name | _start_ }} {
active-server-group {{server_group_name2}};
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"vrfs": [
{
"forwarding_options": {
"dhcp_relay": {
"groups": [
{
"group_name": "group2",
"server_group_name2": "IN_MEDIA_SIGNALING",
},
{
"group_name": "NGN-SIG",
"server_group_name2": "DHCP-NGN-SIG",
},
],
"server_group": {
"dhcp": [
{
"helper_addresses": [
{"helper_address": "10.154.6.147"}
],
"server_group_name1": "IN_MEDIA_SIGNALING",
},
{
"helper_addresses": [
{"helper_address": "10.154.6.147"}
],
"server_group_name1": "DHCP-NGN-SIG",
},
]
},
}
},
"name": "vrf2",
}
]
}
]
]
# test_issue_45_filtering_fix()
def test_issue_47_answer():
data = """
Some text which indicates that below block should be included in results ABC
interface Loopback0
description Router-id-loopback
ip address 192.168.0.113/24
!
Some text which indicates that below block should be included in results DEF
interface Loopback2
description Router-id-loopback 2
ip address 192.168.0.114/24
!
Some text which indicates that below block should NOT be included in results
interface Vlan778
description CPE_Acces_Vlan
ip address 2002::fd37/124
ip vrf CPE1
!
Some text which indicates that below block should be included in results GKL
interface Loopback3
description Router-id-loopback 3
ip address 192.168.0.115/24
!
"""
template = """
Some text which indicates that below block should be included in results ABC {{ _start_ }}
Some text which indicates that below block should be included in results DEF {{ _start_ }}
Some text which indicates that below block should be included in results GKL {{ _start_ }}
interface {{ interface }}
ip address {{ ip }}/{{ mask }}
description {{ description | re(".+") }}
ip vrf {{ vrf }}
! {{ _end_ }}
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=150)
assert res == [
[
[
{
"description": "Router-id-loopback",
"interface": "Loopback0",
"ip": "192.168.0.113",
"mask": "24",
},
{
"description": "Router-id-loopback 2",
"interface": "Loopback2",
"ip": "192.168.0.114",
"mask": "24",
},
{
"description": "Router-id-loopback 3",
"interface": "Loopback3",
"ip": "192.168.0.115",
"mask": "24",
},
]
]
]
# test_issue_47_answer()
def test_issue_48_answer():
data = """
ECON*3400 The Economics of Personnel Management U (3-0) [0.50]
In this course, we examine the economics of personnel management in organizations.
Using mainstream microeconomic and behavioural economic theory, we will consider
such issues as recruitment, promotion, financial and non-financial incentives,
compensation, job performance, performance evaluation, and investment in personnel.
The interplay between theoretical models and empirical evidence will be emphasized in
considering different approaches to the management of personnel.
Prerequisite(s): ECON*2310 or ECON*2200
Department(s): Department of Economics and Finance
ECON*4400 The Economics of Personnel Management U (7-1) [0.90]
In this course, we examine the economics of personnel management in organizations.
Using mainstream microeconomic and behavioural economic theory, we will consider
such issues as recruitment, promotion, financial and non-financial incentives,
compensation, job performance, performance evaluation, and investment in personnel.
Prerequisite(s): ECON*2310
Department(s): Department of Economics
"""
template = """
<vars>
descr_chain = [
"PHRASE",
"exclude('Prerequisite(s)')",
"exclude('Department(s)')",
"joinmatches"
]
</vars>
<group>
{{ course }}*{{ code }} {{ name | PHRASE }} {{ semester }} ({{ lecture_lab_time }}) [{{ weight }}]
{{ description | chain(descr_chain) }}
Prerequisite(s): {{ prereqs | ORPHRASE }}
Department(s): {{ department | ORPHRASE }}
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=150)
assert res == [
[
[
{
"code": "3400",
"course": "ECON",
"department": "Department of Economics and Finance",
"description": "In this course, we examine the economics of personnel management in organizations.\n"
"Using mainstream microeconomic and behavioural economic theory, we will consider\n"
"such issues as recruitment, promotion, financial and non-financial incentives,\n"
"compensation, job performance, performance evaluation, and investment in personnel.\n"
"The interplay between theoretical models and empirical evidence will be emphasized in\n"
"considering different approaches to the management of personnel.",
"lecture_lab_time": "3-0",
"name": "The Economics of Personnel Management",
"prereqs": "ECON*2310 or ECON*2200",
"semester": "U",
"weight": "0.50",
},
{
"code": "4400",
"course": "ECON",
"department": "Department of Economics",
"description": "In this course, we examine the economics of personnel management in organizations.\n"
"Using mainstream microeconomic and behavioural economic theory, we will consider\n"
"such issues as recruitment, promotion, financial and non-financial incentives,\n"
"compensation, job performance, performance evaluation, and investment in personnel.",
"lecture_lab_time": "7-1",
"name": "The Economics of Personnel Management",
"prereqs": "ECON*2310",
"semester": "U",
"weight": "0.90",
},
]
]
]
# test_issue_48_answer()
def test_issue_48_answer_more():
data = """
IBIO*4521 Thesis in Integrative Biology F (0-12) [1.00]
This course is the first part of the two-semester course IBIO*4521/2. This course is
a two-semester (F,W) undergraduate project in which students conduct a comprehensive,
independent research project in organismal biology under the supervision of a faculty
member in the Department of Integrative Biology. Projects involve a thorough literature
review, a research proposal, original research communicated in oral and poster
presentations, and in a written, publication quality document. This two-semester course
offers students the opportunity to pursue research questions and experimental designs
that cannot be completed in the single semester research courses. Students must make
arrangements with both a faculty supervisor and the course coordinator at least one
semester in advance. A departmental registration form must be obtained from the course
coordinator and submitted no later than the second class day of the fall semester. This is
a twosemester course offered over consecutive semesters F-W. When you select this
course, you must select IBIO*4521 in the Fall semester and IBIO*4522 in the Winter
semester.A grade will not be assigned to IBIO*4521 until IBIO*4522 has been completed.
Prerequisite(s): 12.00 credits
Restriction(s): Normally a minimum cumulative average of 70%. Permission of course
coordinator.
Department(s): Department of Integrative Biology
IBIO*4533 Thesis in Integrative Biology F (0-14) [2.00]
This course is the first part of the two-semester course IBIO*4521/2. This course is
a two-semester (F,W) undergraduate project in which students conduct a comprehensive,
independent research project in organismal biology under the supervision of a faculty
member in the Department of Integrative Biology.
Restriction(s): Normally a minimum cumulative average of 80%. Permission of course
coordinator. Normally a minimum cumulative average of 90%. Permission of course
coordinator.
Department(s): Department of Integrative Biology
"""
template = """
<vars>
chain_1 = [
"ORPHRASE",
"exclude('Prerequisite(s)')",
"exclude('Department(s)')",
"exclude('Restriction(s)')",
"joinmatches"
]
</vars>
<group>
{{ course }}*{{ code }} {{ name | PHRASE }} {{ semester }} ({{ lecture_lab_time }}) [{{ weight }}]
{{ description | chain(chain_1) }}
Prerequisite(s): {{ prereqs | ORPHRASE }}
Department(s): {{ department | ORPHRASE }}
<group name="_">
Restriction(s): {{ restrictions | PHRASE | joinmatches }}
{{ restrictions | chain(chain_1) }}
</group>
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=150)
assert res == [
[
[
{
"code": "4521",
"course": "IBIO",
"department": "Department of Integrative Biology",
"description": "This course is the first part of the two-semester course IBIO*4521/2. This course is\n"
"a two-semester (F,W) undergraduate project in which students conduct a comprehensive,\n"
"independent research project in organismal biology under the supervision of a faculty\n"
"member in the Department of Integrative Biology. Projects involve a thorough literature\n"
"review, a research proposal, original research communicated in oral and poster\n"
"presentations, and in a written, publication quality document. This two-semester course\n"
"offers students the opportunity to pursue research questions and experimental designs\n"
"that cannot be completed in the single semester research courses. Students must make\n"
"arrangements with both a faculty supervisor and the course coordinator at least one\n"
"semester in advance. A departmental registration form must be obtained from the course\n"
"coordinator and submitted no later than the second class day of the fall semester. This is\n"
"a twosemester course offered over consecutive semesters F-W. When you select this\n"
"course, you must select IBIO*4521 in the Fall semester and IBIO*4522 in the Winter\n"
"semester.A grade will not be assigned to IBIO*4521 until IBIO*4522 has been completed.",
"lecture_lab_time": "0-12",
"name": "Thesis in Integrative Biology",
"prereqs": "12.00 credits",
"restrictions": "Normally a minimum cumulative average of 70%. Permission of course\ncoordinator.",
"semester": "F",
"weight": "1.00",
},
{
"code": "4533",
"course": "IBIO",
"department": "Department of Integrative Biology",
"description": "This course is the first part of the two-semester course IBIO*4521/2. This course is\n"
"a two-semester (F,W) undergraduate project in which students conduct a comprehensive,\n"
"independent research project in organismal biology under the supervision of a faculty\n"
"member in the Department of Integrative Biology.",
"lecture_lab_time": "0-14",
"name": "Thesis in Integrative Biology",
"restrictions": "Normally a minimum cumulative average of 80%. Permission of course\n"
"coordinator. Normally a minimum cumulative average of 90%. Permission of course\n"
"coordinator.",
"semester": "F",
"weight": "2.00",
},
]
]
]
# test_issue_48_answer_more()
def test_slack_channel_answer_for_Noif():
data = """
# not disabled and no comment
/ip address add address=10.4.1.245 interface=lo0 network=10.4.1.245
/ip address add address=10.4.1.246 interface=lo1 network=10.4.1.246
# not disabled and comment with no quotes
/ip address add address=10.9.48.241/29 comment=SITEMON interface=ether2 network=10.9.48.240
/ip address add address=10.9.48.233/29 comment=Camera interface=vlan205@bond1 network=10.9.48.232
/ip address add address=10.9.49.1/24 comment=SM-Management interface=vlan200@bond1 network=10.9.49.0
# not disabled and comment with quotes
/ip address add address=10.4.1.130/30 comment="to core01" interface=vlan996@bond4 network=10.4.1.128
/ip address add address=10.4.250.28/29 comment="BH 01" interface=vlan210@bond1 network=10.4.250.24
/ip address add address=10.9.50.13/30 comment="Cust: site01-PE" interface=vlan11@bond1 network=10.9.50.12
# disabled no comment
/ip address add address=10.0.0.2/30 disabled=yes interface=bridge:customer99 network=10.0.0.0
# disabled with comment
/ip address add address=169.254.1.100/24 comment=Cambium disabled=yes interface=vlan200@bond1 network=169.254.1.0
# disabled with comment with quotes
/ip address add address=10.4.248.20/29 comment="Backhaul to AGR (Test Segment)" disabled=yes interface=vlan209@bond1 network=10.4.248.16
"""
template = """
<vars>
default_values = {
"comment": "",
"disabled": False
}
</vars>
<group default="default_values">
## not disabled and no comment
/ip address add address={{ ip | _start_ }} interface={{ interface }} network={{ network }}
## not disabled and comment with/without quotes
/ip address add address={{ ip | _start_ }}/{{ mask }} comment={{ comment | ORPHRASE | exclude("disabled=") | strip('"')}} interface={{ interface }} network={{ network }}
## disabled no comment
/ip address add address={{ ip | _start_ }}/{{ mask }} disabled={{ disabled }} interface={{ interface }} network={{ network }}
## disabled with comment with/without quotes
/ip address add address={{ ip | _start_ }}/{{ mask }} comment={{ comment | ORPHRASE | exclude("disabled=") | strip('"') }} disabled={{ disabled }} interface={{ interface }} network={{ network }}
</group>
"""
parser = ttp(data=data, template=template, log_level="ERROR")
parser.parse()
res = parser.result(structure="flat_list")
# pprint.pprint(res, width=200)
assert res == [
{
"comment": "",
"disabled": False,
"interface": "lo0",
"ip": "10.4.1.245",
"network": "10.4.1.245",
},
{
"comment": "",
"disabled": False,
"interface": "lo1",
"ip": "10.4.1.246",
"network": "10.4.1.246",
},
{
"comment": "SITEMON",
"disabled": False,
"interface": "ether2",
"ip": "10.9.48.241",
"mask": "29",
"network": "10.9.48.240",
},
{
"comment": "Camera",
"disabled": False,
"interface": "vlan205@bond1",
"ip": "10.9.48.233",
"mask": "29",
"network": "10.9.48.232",
},
{
"comment": "SM-Management",
"disabled": False,
"interface": "vlan200@bond1",
"ip": "10.9.49.1",
"mask": "24",
"network": "10.9.49.0",
},
{
"comment": "to core01",
"disabled": False,
"interface": "vlan996@bond4",
"ip": "10.4.1.130",
"mask": "30",
"network": "10.4.1.128",
},
{
"comment": "BH 01",
"disabled": False,
"interface": "vlan210@bond1",
"ip": "10.4.250.28",
"mask": "29",
"network": "10.4.250.24",
},
{
"comment": "Cust: site01-PE",
"disabled": False,
"interface": "vlan11@bond1",
"ip": "10.9.50.13",
"mask": "30",
"network": "10.9.50.12",
},
{
"comment": "",
"disabled": "yes",
"interface": "bridge:customer99",
"ip": "10.0.0.2",
"mask": "30",
"network": "10.0.0.0",
},
{
"comment": "Cambium",
"disabled": "yes",
"interface": "vlan200@bond1",
"ip": "169.254.1.100",
"mask": "24",
"network": "169.254.1.0",
},
{
"comment": "Backhaul to AGR (Test Segment)",
"disabled": "yes",
"interface": "vlan209@bond1",
"ip": "10.4.248.20",
"mask": "29",
"network": "10.4.248.16",
},
]
# test_slack_channel_answer_for_Noif()
def test_slack_answer_2():
data_to_parse = """
port 1/1/1
description "port 1 description"
ethernet
mode hybrid
encap-type dot1q
crc-monitor
sd-threshold 5 multiplier 5
sf-threshold 3 multiplier 5
window-size 60
exit
network
queue-policy "ncq-only"
accounting-policy 12
collect-stats
egress
queue-group "qos-policy-for-router1" instance 1 create
accounting-policy 1
collect-stats
agg-rate
rate 50000
exit
exit
exit
exit
access
egress
queue-group "policer-output-queues" instance 1 create
accounting-policy 1
collect-stats
exit
exit
exit
lldp
dest-mac nearest-bridge
admin-status tx-rx
notification
tx-tlvs port-desc sys-name sys-desc sys-cap
tx-mgmt-address system
exit
exit
down-on-internal-error
exit
no shutdown
exit
port 1/1/2
description "another port to a another router"
ethernet
mode hybrid
encap-type dot1q
egress-scheduler-policy "qos-port-scheduler"
crc-monitor
sd-threshold 5 multiplier 5
sf-threshold 3 multiplier 5
window-size 60
exit
access
egress
queue-group "policer-output-queues" instance 1 create
accounting-policy 1
collect-stats
exit
exit
exit
down-on-internal-error
exit
no shutdown
exit
port 1/1/3
description "port 3 to some third router"
ethernet
mode access
encap-type dot1q
mtu 2000
egress-scheduler-policy "strict-scheduler"
network
queue-policy "ncq-only"
accounting-policy 12
collect-stats
egress
queue-group "some-shaping-policy" instance 1 create
accounting-policy 1
collect-stats
agg-rate
rate 50000
exit
exit
queue-group "another-shaping-policy" instance 1 create
accounting-policy 1
collect-stats
agg-rate
rate 50000
exit
exit
queue-group "this-shaper-is-cool" instance 1 create
agg-rate
rate 1000000
exit
exit
exit
exit
exit
no shutdown
exit
"""
template = """
<group name="system.ports">
port {{ id }}
shutdown {{ admin_enabled | set(false) }}
description "{{ description | ORPHRASE | strip('"') }}"
<group name="ethernet">
ethernet {{ _start_ }}
mode {{ mode }}
encap-type {{ encap_type }}
mtu {{ mtu | DIGIT }}
egress-scheduler-policy {{ egress_sched_policy | strip('"') }}
loopback internal persistent {{ loop_internal | set(true) }}
<group name="network">
network {{ _start_ }}
queue-policy {{ queue_policy | ORPHRASE | strip('"') }}
accounting-policy {{ accounting_policy | DIGIT }}
collect-stats {{ collect_stats | set(true) }}
<group name="egress">
egress {{ _start_ }}
<group name="queuegroups*">
queue-group {{ name | strip('"') }} instance 1 create
rate {{ agg_rate | DIGIT }}
exit {{_end_}}
</group>
## this "exit {{ _end_ }}" had the wrong indentation level, causing the
## group name="egress" to finish too early
exit {{_end_}}
</group>
exit {{_end_}}
</group>
lldp {{ lldp_enabled | set(true) }}
exit {{_end_}}
</group>
no shutdown {{admin_enabled | set(true)}}
exit {{_end_}}
</group>
"""
parser = ttp(data=data_to_parse, template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=150)
assert res == [
[
{
"system": {
"ports": [
{
"admin_enabled": True,
"description": "port 1 description",
"ethernet": {
"encap_type": "dot1q",
"lldp_enabled": True,
"mode": "hybrid",
"network": {
"accounting_policy": "12",
"collect_stats": True,
"egress": {
"queuegroups": [
{
"agg_rate": "50000",
"name": "qos-policy-for-router1",
}
]
},
"queue_policy": "ncq-only",
},
},
"id": "1/1/1",
},
{
"admin_enabled": True,
"description": "another port to a another router",
"ethernet": {
"egress_sched_policy": "qos-port-scheduler",
"encap_type": "dot1q",
"mode": "hybrid",
},
"id": "1/1/2",
},
{
"admin_enabled": True,
"description": "port 3 to some third router",
"ethernet": {
"egress_sched_policy": "strict-scheduler",
"encap_type": "dot1q",
"mode": "access",
"mtu": "2000",
"network": {
"accounting_policy": "12",
"collect_stats": True,
"egress": {
"queuegroups": [
{
"agg_rate": "50000",
"name": "some-shaping-policy",
},
{
"agg_rate": "50000",
"name": "another-shaping-policy",
},
{
"agg_rate": "1000000",
"name": "this-shaper-is-cool",
},
]
},
"queue_policy": "ncq-only",
},
},
"id": "1/1/3",
},
]
}
}
]
]
# test_slack_answer_2()
def test_slack_answer_3():
"""
Problem was that interfaces were matched by regexes from both ospf and ospfv3
groups, decision logic was not able to properly work out to which group result
should belong, changed behavior to check if match is a child of current record
group and use it if so. Also had to change how group id encoded from string to
tuple of two elements ("group path", "group index",)
Here is some debug output until problem was fixed:
self.record["GRP_ID"]: service.vprns*.{{id}}**.ospf3**::1
re_["GROUP"].group_id: service.vprns*.{{id}}**.ospf**.interfaces*::0
re_idex: 0
self.record["GRP_ID"]: service.vprns*.{{id}}**.ospf3**::1
re_["GROUP"].group_id: service.vprns*.{{id}}**.ospf3**.interfaces*::0
re_idex: 1
# the problem occurred because the logic could not decide that it needed to use this match
self.record["GRP_ID"]: service.vprns*.{{id}}**.ospf**::0
re_["GROUP"].group_id: service.vprns*.{{id}}**.ospf**.interfaces*::0
re_idex: 0
# the problem occurred because the logic was picking up this match
self.record["GRP_ID"]: service.vprns*.{{id}}**.ospf**::0
re_["GROUP"].group_id: service.vprns*.{{id}}**.ospf3**.interfaces*::0
re_idex: 1
Wrong results:
[[{'service': {'vprns': [{'4': {'name': 'ospf_version3_vprn',
'ospf': {'area': '0.0.0.0', 'interfaces': [{'name': 'interface-one'}]},
'ospf3': {'area': '0.0.0.0', 'interfaces': [{'name': 'interface-two'}]}},
'5': {'name': 'vprn5', 'ospf': {'area': '0.0.0.0'},
'ospf3': {'interfaces': [{'name': 'interface-three'}]}}}]}}]]
"""
data = """
service
vprn 4 name "ospf_version3_vprn" customer 40 create
ospf
area 0.0.0.0
interface "interface-one"
ospf3 0
area 0.0.0.0
interface "interface-two"
vprn 5 name "vprn5" customer 50 create
ospf
area 0.0.0.0
interface "interface-three"
"""
template = """
<group name="service.vprns*.{{id}}**">
vprn {{ id }} name {{ name | ORPHRASE | strip('"') }} customer {{ ignore }} create
<group name="ospf**">
ospf {{ _start_ }}
area {{ area }}
<group name="interfaces*">
interface {{ name | ORPHRASE | strip('"') }}
</group>
</group>
<group name="ospf3**">
ospf3 0 {{ _start_ }}
area {{ area }}
<group name="interfaces*">
interface {{ name | ORPHRASE | strip('"') }}
</group>
</group>
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[
{
"service": {
"vprns": [
{
"4": {
"name": "ospf_version3_vprn",
"ospf": {
"area": "0.0.0.0",
"interfaces": [{"name": "interface-one"}],
},
"ospf3": {
"area": "0.0.0.0",
"interfaces": [{"name": "interface-two"}],
},
},
"5": {
"name": "vprn5",
"ospf": {
"area": "0.0.0.0",
"interfaces": [{"name": "interface-three"}],
},
},
}
]
}
}
]
]
# test_slack_answer_3()
def test_slack_answer_3_full():
data = """
service
vprn 1 name "vprn1" customer 10 create
interface "loopback" create
exit
interface "interface-one" create
exit
interface "interface-two" create
exit
interface "bgp-interface" create
exit
exit
vprn 2 name "vprn2" customer 20 create
interface "loopback" create
exit
interface "interface-two" create
exit
interface "bgp-interface" create
exit
exit
vprn 3 name "vprn3" customer 30 create
interface "loopback" create
exit
interface "interface-two" create
exit
exit
vprn 4 name "ospf_version3_vprn" customer 40 create
interface "loopback" create
exit
interface "interface-two" create
exit
exit
vprn 5 name "vprn5" customer 50 create
interface "loopback" create
exit
interface "interface-two" create
exit
interface "bgp-interface" create
exit
exit
vprn 1 name "vprn1" customer 10 create
interface "loopback" create
address 10.10.10.1/32
loopback
exit
interface "interface-one" create
address 10.10.10.10/30
sap 1/1/1:10 create
exit
exit
interface "interface-two" create
address 10.10.10.100/31
sap lag-5:80 create
exit
exit
interface "bgp-interface" create
address 10.10.10.200/31
sap lag-4:100 create
exit
exit
ospf
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
no shutdown
exit
vprn 2 name "vprn2" customer 20 create
interface "interface-two" create
address 10.11.11.10/31
sap lag-1:50 create
exit
exit
ospf
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
no shutdown
exit
vprn 3 name "vprn3" customer 30 create
interface "loopback" create
address 10.12.12.12/32
loopback
exit
interface "interface-two" create
address 10.12.12.100/31
sap lag-5:33 create
exit
exit
ospf
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
no shutdown
exit
vprn 4 name "ospf_version3_vprn" customer 40 create
interface "loopback" create
address 10.40.40.10/32
ipv6
address 1500:1000:460e::a03:ae46/128
exit
loopback
exit
interface "interface-two" create
address 10.40.40.100/31
ipv6
address 1500:1000:460e::2222:1111/64
exit
sap lag-5:800 create
exit
exit
ospf
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
ospf3 0
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
no shutdown
exit
vprn 5 name "vprn5" customer 50 create
interface "loopback" create
address 10.50.50.50/32
loopback
exit
interface "interface-two" create
address 10.50.50.100/31
sap lag-5:5 create
exit
exit
interface "bgp-interface" create
address 10.50.50.200/31
sap lag-1:602 create
exit
exit
bgp
group "eBGP"
peer-as 4444
neighbor 10.50.50.201
exit
exit
no shutdown
exit
ospf
area 0.0.0.0
interface "interface-two"
passive
no shutdown
exit
exit
no shutdown
exit
no shutdown
exit
exit
"""
template = """
#-------------------------------------------------- {{ ignore }}
echo "Service Configuration" {{ ignore }}
#-------------------------------------------------- {{ ignore }}
service {{ ignore }}
<group name="service.vprns*.{{id}}**">
vprn {{ id }} name {{ name | ORPHRASE | strip('"') }} customer {{ ignore }} create
shutdown {{ admin_enabled | set("False") }}
description {{ description | ORPHRASE | strip('"') }}
vrf-import {{ import_policy | ORPHRASE | strip('"') }}
router-id {{ router_id }}
autonomous-system {{ local_as }}
route-distinguisher {{ loopback_ip }}:{{ vrf_routedist }}
vrf-target target:{{ ignore }}:{{ vrf_routetarget }}
vrf-target {{ vrf_export }} target:{{ ignore }}:{{ vrf_routetarget }}
<group name="interfaces*.{{name}}**">
interface {{ name | ORPHRASE | strip('"') }} create
shutdown {{ admin_enabled | set("False") }}
description {{ description | ORPHRASE | strip('"') }}
address {{ address | IP }}/{{ mask | DIGIT }}
ip-mtu {{ mtu }}
bfd {{ bfd_timers }} receive {{ ignore }} multiplier {{ bfd_interval }}
<group name="vrrp">
vrrp {{ instance }}
backup {{ backup }}
priority {{ priority }}
policy {{ policy }}
ping-reply {{ pingreply | set("True") }}
traceroute-reply {{ traceroute_reply | set("True") }}
init-delay {{ initdelay }}
message-interval {{ message_int_seconds }}
message-interval milliseconds {{ message_int_milliseconds }}
bfd-enable 1 interface {{ bfd_interface | ORPHRASE | strip('"')}} dst-ip {{ bfd_dst_ip }}
exit {{ _end_ }}
</group>
<group name="ipv6">
ipv6 {{ _start_ }}
address {{ address | IPV6 }}/{{ mask | DIGIT }}
address {{ address | _start_ | IPV6 }}/{{ mask | DIGIT }} dad-disable
link-local-address {{ linklocal_address | IPV6 }} dad-disable
<group name="vrrp">
vrrp {{ instance | _start_ }}
<group name="backup*">
backup {{ ip }}
</group>
priority {{ priority }}
policy {{ policy }}
ping-reply {{ pingreplay | set("True") }}
traceroute-reply {{ traceroute_reply | set("True") }}
init-delay {{ initdelay }}
message-interval milliseconds {{ message_int_milliseconds }}
exit {{ _end_ }}
</group>
exit {{ _end_ }}
</group>
<group name="vpls">
vpls {{ vpls_name | ORPHRASE | strip('"') | _start_ }}
exit {{ _end_ }}
</group>
<group name="sap**">
sap {{ port | _start_ }}:{{ vlan | DIGIT }} create
ingress {{ _exact_ }}
qos {{ qos_sap_ingress }}
<group name="_">
egress {{ _start_ }}
qos {{ qos_sap_egress }}
</group>
collect-stats {{ collect_stats | set("True") }}
accounting-policy {{ accounting_policy }}
exit {{ _end_}}
</group>
exit {{ _end_}}
</group>
<group name="staticroutes*">
static-route-entry {{ prefix | PREFIX | _start_ }}
black-hole {{ blackhole | set("True") }}
next-hop {{ nexthop | IP }}
shutdown {{ admin_enabled | set("False") }}
no shutdown {{ admin_enabled | set("True") }}
exit {{ _end_ }}
</group>
<group name="aggregates">
aggregate {{ agg_block | PREFIX | _start_ }} summary-only
</group>
<group name="router_advertisement">
router-advertisement {{ _start_ }}
interface {{ interface | ORPHRASE | strip('"') }}
use-virtual-mac {{ use_virtualmac | set("True") }}
no shutdown {{ admin_enabled | set("True") }}
exit {{ _end_ }}
</group>
<group name="bgp**">
bgp {{ _start_ }}
min-route-advertisement {{ min_route_advertisement | DIGIT }}
<group name="peergroups*">
group {{ name | ORPHRASE | strip('"') }}
family {{ family | ORPHRASE | split(" ") }}
type {{ peer_type | ORPHRASE }}
import {{ importpolicy | ORPHRASE | strip('"') }}
export {{ exportpolicy | ORPHRASE | strip('"') }}
peer-as {{ remote_as }}
bfd-enable {{ bfd_enabled | set("True") }}
<group name="neighbors*">
neighbor {{ address | IP | _start_ }}
neighbor {{ address | IPV6 | _start_ }}
shutdown {{ admin_enabled | set("False") }}
keepalive {{ keepalive }}
hold-time {{ holdtime }}
bfd-enable {{ bfd_enabled | set("True") }}
as-override {{ as_override | set("True") }}
exit {{ _end_ }}
</group>
exit {{ _end_ }}
</group>
no shutdown {{ admin_enabled | set("True") | _start_ }}
exit {{ _end_ }}
</group>
<group name="ospf**">
ospf {{ _start_ }}{{ _exact_ }}
area {{ area }}
<group name="interfaces*">
interface {{ name | ORPHRASE | strip('"') | _start_ }}
passive {{ passive | set("True") }}
exit {{ _end_ }}
</group>
no shutdown {{ admin_enabled | set("True") }}
exit {{ _end_ }}
</group>
<group name="ospf3**">
ospf3 0 {{ _start_ }}{{ _exact_ }}
area {{ area }}
<group name="interfaces*">
interface {{ name | ORPHRASE | strip('"') | _start_ }}
passive {{ passive | set("True") }}
exit {{ _end_ }}
</group>
no shutdown {{ admin_enabled | set("True") }}
exit {{ _end_ }}
</group>
no shutdown {{ admin_enabled | set("True") }}
exit {{ _end_ }}
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[
{
"service": {
"vprns": [
{
"1": {
"admin_enabled": "True",
"interfaces": [
{
"bgp-interface": {
"address": "10.10.10.200",
"mask": "31",
"sap": {"port": "lag-4", "vlan": "100"},
},
"interface-one": {
"address": "10.10.10.10",
"mask": "30",
"sap": {"port": "1/1/1", "vlan": "10"},
},
"interface-two": {
"address": "10.10.10.100",
"mask": "31",
"sap": {"port": "lag-5", "vlan": "80"},
},
"loopback": {
"address": "10.10.10.1",
"mask": "32",
},
}
],
"name": "vprn1",
"ospf": {
"admin_enabled": "True",
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
},
"2": {
"admin_enabled": "True",
"interfaces": [
{
"bgp-interface": {},
"interface-two": {
"address": "10.11.11.10",
"mask": "31",
"sap": {"port": "lag-1", "vlan": "50"},
},
"loopback": {},
}
],
"name": "vprn2",
"ospf": {
"admin_enabled": "True",
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
},
"3": {
"admin_enabled": "True",
"interfaces": [
{
"interface-two": {
"address": "10.12.12.100",
"mask": "31",
"sap": {"port": "lag-5", "vlan": "33"},
},
"loopback": {
"address": "10.12.12.12",
"mask": "32",
},
}
],
"name": "vprn3",
"ospf": {
"admin_enabled": "True",
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
},
"4": {
"admin_enabled": "True",
"interfaces": [
{
"interface-two": {
"address": "10.40.40.100",
"ipv6": {
"address": "1500:1000:460e::2222:1111",
"mask": "64",
},
"mask": "31",
"sap": {"port": "lag-5", "vlan": "800"},
},
"loopback": {
"address": "10.40.40.10",
"ipv6": {
"address": "1500:1000:460e::a03:ae46",
"mask": "128",
},
"mask": "32",
},
}
],
"name": "ospf_version3_vprn",
"ospf": {
"admin_enabled": "True",
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
"ospf3": {
"admin_enabled": "True",
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
},
"5": {
"admin_enabled": "True",
"bgp": {
"admin_enabled": "True",
"peergroups": [
{
"name": "eBGP",
"neighbors": [{"address": "10.50.50.201"}],
"remote_as": "4444",
}
],
},
"interfaces": [
{
"bgp-interface": {
"address": "10.50.50.200",
"mask": "31",
"sap": {"port": "lag-1", "vlan": "602"},
},
"interface-two": {
"address": "10.50.50.100",
"mask": "31",
"sap": {"port": "lag-5", "vlan": "5"},
},
"loopback": {
"address": "10.50.50.50",
"mask": "32",
},
}
],
"name": "vprn5",
"ospf": {
"area": "0.0.0.0",
"interfaces": [
{"name": "interface-two", "passive": "True"}
],
},
},
}
]
}
}
]
]
# test_slack_answer_3_full()
def test_issue_45_for_junos_cfg():
data = """
system {
host-name LAB-MX-1;
time-zone some/time;
default-address-selection;
no-redirects;
no-ping-record-route;
no-ping-time-stamp;
tacplus-server {
1.1.1.1 {
port 49;
secret "<SECRET_HASH>"; ## SECRET-DATA
source-address 5.5.5.5;
}
2.2.2.2 {
port 49;
secret "<SECRET_HASH>"; ## SECRET-DATA
source-address 5.5.5.5;
}
4.4.4.4 {
port 49;
secret "<SECRET_HASH>"; ## SECRET-DATA
source-address 5.5.5.5;
}
}
services {
ssh {
root-login deny;
no-tcp-forwarding;
protocol-version v2;
max-sessions-per-connection 32;
client-alive-count-max 3;
client-alive-interval 10;
connection-limit 10;
rate-limit 5;
}
netconf {
ssh {
connection-limit 10;
rate-limit 4;
}
}
}
}
"""
template = """
<group name="system_level">
system { {{ _start_ }}
host-name {{ HOSTNAME }};
time-zone {{ TZ }};
default-address-selection; {{ default_address_selection | set(True) }}
no-redirects; {{ no_redirects | set(True) }}
no-ping-record-route; {{ no_ping_record_route | set(True) }}
no-ping-time-stamp; {{ no_ping_time_stamp | set(True) }}
<group name="services">
services { {{ _start_ }}
<group name="{{ service }}">
{{ service }} {
http; {{ http | set(true) }}
https; {{ https | set(true) }}
no-tcp-forwarding; {{ no-tcp-fwding | set(true) }}
protocol-version {{ ssh-proto }};
connection-limit {{ connection-limit | DIGIT }};
rate-limit {{rate-limit | DIGIT }};
root-login deny; {{ root-login | set(false) }}
max-sessions-per-connection {{ max-sessions | DIGIT }};
client-alive-count-max {{ client-alive-count-max | DIGIT }};
client-alive-interval {{ client-alive-interval | DIGIT }};
<group name="ssh">
ssh; {{ ssh | set(true) }}
</group>
<group name="ssh">
ssh { {{ _start_ }}
connection-limit {{ connection-limit | DIGIT }};
rate-limit {{ rate-limit | DIGIT }};
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
} {{ _end_ }}
</group>
<group name="internet-options">
internet-options { {{ _start_ }}
icmpv4-rate-limit packet-rate {{ packet-rate| DIGIT }};
icmpv6-rate-limit packet-rate {{ packet-rate| DIGIT }};
no-source-quench; {{ no-source-quench | set(true) }}
tcp-drop-synfin-set; {{ tcp-drop-synfin-set | set(true) }}
no-tcp-reset {{ no-tcp-reset }};
} {{ _end_ }}
</group>
authentication-order [{{ authentication-order }}];
<group name="ports">
ports { {{ _start_ }}
auxiliary disable; {{ auxiliary | set(false) }}
} {{ _end_ }}
</group>
<group name="root-authentication">
root-authentication { {{ _start_ }}
encrypted-password "{{ encrypted-password }}"; ## SECRET-DATA
} {{ _end_ }}
</group>
<group name="dns" itemize="name_server">
name-server { {{ _start_ }}
{{ name_server | IP | _line_ | to_list }};
} {{ _end_ }}
</group>
<group name="commit">
commit { {{ _start_ }}
synchronize; {{ commit_sync | set(true) }}
persist-groups-inheritance; {{ commit_persist-groups-inherit | set(true) }}
} {{ _end_ }}
</group>
<group name="tacacs">
tacplus-server { {{ _start_ }}
<group name="tacacs-servers.{{ tac_server }}">
{{ tac_server | IP }} {
port {{ tac_port }};
secret "{{ tac_secret }}"; ## SECRET-DATA
source-address {{ tac_source | IP }};
} {{ end }}
</group>
} {{ end }}
</group>
} {{ end }}
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[
{
"system_level": {
"HOSTNAME": "LAB-MX-1",
"TZ": "some/time",
"default_address_selection": True,
"no_ping_record_route": True,
"no_ping_time_stamp": True,
"no_redirects": True,
"services": {
"netconf": {
"ssh": {"connection-limit": "10", "rate-limit": "4"}
},
"ssh": {
"client-alive-count-max": "3",
"client-alive-interval": "10",
"connection-limit": "10",
"max-sessions": "32",
"no-tcp-fwding": True,
"rate-limit": "5",
"root-login": False,
"ssh-proto": "v2",
},
},
"tacacs": {
"tacacs-servers": {
"1.1.1.1": {
"tac_port": "49",
"tac_secret": "<SECRET_HASH>",
"tac_source": "5.5.5.5",
},
"2.2.2.2": {
"tac_port": "49",
"tac_secret": "<SECRET_HASH>",
"tac_source": "5.5.5.5",
},
"4.4.4.4": {
"tac_port": "49",
"tac_secret": "<SECRET_HASH>",
"tac_source": "5.5.5.5",
},
}
},
}
}
]
]
# test_issue_45_for_junos_cfg()
def test_faq_multiline_output_matching():
data = """
Local Intf: Te2/1/23
System Name: r1.lab.local
System Description:
Cisco IOS Software, Catalyst 1234 L3 Switch Software (cat1234e-ENTSERVICESK9-M), Version 1534.1(1)SG, RELEASE SOFTWARE (fc3)
Technical Support: http://www.cisco.com/techsupport
Copyright (c) 1986-2012 by Cisco Systems, Inc.
Compiled Sun 15-Apr-12 02:35 by p
Time remaining: 92 seconds
"""
template = """
<group>
Local Intf: {{ local_intf }}
System Name: {{ peer_name }}
<group name="peer_system_description">
System Description: {{ _start_ }}
{{ sys_description | _line_ | joinmatches(" ") }}
Time remaining: {{ ignore }} seconds {{ _end_ }}
</group>
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[
[
{
"local_intf": "Te2/1/23",
"peer_name": "r1.lab.local",
"peer_system_description": {
"sys_description": "Cisco IOS Software, Catalyst 1234 L3 Switch "
"Software (cat1234e-ENTSERVICESK9-M), Version "
"1534.1(1)SG, RELEASE SOFTWARE (fc3) Technical "
"Support: http://www.cisco.com/techsupport "
"Copyright (c) 1986-2012 by Cisco Systems, Inc. "
"Compiled Sun 15-Apr-12 02:35 by p"
},
}
]
]
]
# test_faq_multiline_output_matching()
def test_issue_52_answer():
data = """
Origin:
Some random name
Example Address, example number, example city
Origin:
Some random name 2
Example Address, example number, example city 2
Origin:
Some random name 3
Example Address, example number, example city 3
One more string
"""
template = """
<macro>
def process(data):
lines = data["match"].splitlines()
name = lines[0]
address = lines[1]
return {"name": name, "address": address}
</macro>
<group name="origin*" macro="process">
Origin: {{ _start_ }}
{{ match | _line_ | joinmatches }}
</group>
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[
{
"origin": [
{
"address": "Example Address, example number, example city",
"name": "Some random name",
},
{
"address": "Example Address, example number, example city 2",
"name": "Some random name 2",
},
{
"address": "Example Address, example number, example city 3",
"name": "Some random name 3",
},
]
}
]
]
# test_issue_52_answer()
def test_issue_51_answer():
""" test workaround for removing <> chars from input data """
data = """
Name:Jane<br>
Name:Michael<br>
Name:July<br>
"""
template = """
<group name="people">
Name:{{ name }}<br>
</group>
"""
# this works as well
# template = "Name:{{ name }}br"
# data = data.replace("<", "").replace(">", "")
# this did not work. fails with xml parsing error
# template = "Name:{{ name }}<br>"
# data = data.replace("<", "&lt;").replace(">", "&gt;")
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=100)
assert res == [
[{"people": [{"name": "Jane"}, {"name": "Michael"}, {"name": "July"}]}]
]
# test_issue_51_answer()
def test_issue_50():
template = """
<input load="text">
interface "BNG-RH201-CORE"
address 11.11.11.11/31
description "BNG-RH201-CORE"
ldp-sync-timer 10
port lag-107:709
ipv6
address 1111:0:1111:1111::1/64
exit
bfd 150 receive 150 multiplier 3
no shutdown
exit
interface "BNG-RH202-CORE"
address 22.22.22.22/31
description "BNG-RH201-CORE"
ldp-sync-timer 10
port lag-108:809
ipv6
address 2222:0:2222:2222::2/64
exit
bfd 150 receive 150 multiplier 3
no shutdown
exit
interface "system"
address 33.33.33.33/32
ipv6
address 3333:0:3333:3333::3/128
exit
no shutdown
exit
ies 97 name "OTDR-MGT" customer 1 create
description "OTDR-MGT"
interface "OTDR-MGT" create
address 44.44.44.44/25
vrrp 97
backup 10.20.30.1
priority 200
exit
vpls "OTDR-MGT-VPLS"
exit
exit
no shutdown
exit
ies 99 name "OLT-MGT" customer 1 create
description "OLT-INBAND-MGT"
interface "OLT-MGT" create
address 55.55.55.55/25
vrrp 1
backup 10.20.40.1
priority 200
exit
vpls "OLT-MGT-VPLS"
exit
exit
no shutdown
exit
ies 100 name "100" customer 1 create
description "IES 100 for subscribers"
redundant-interface "shunt" create
address 66.66.66.66/31
spoke-sdp 1:100 create
no shutdown
exit
exit
subscriber-interface "s100" create
description " Subscriber interface for subscribers"
allow-unmatching-subnets
address 77.77.77.77/22 gw-ip-address 77.77.77.1
address 88.88.88.88/20 gw-ip-address 88.88.88.1
group-interface "s100-lag210-vlan101" create
tos-marking-state trusted
ipv6
router-advertisements
managed-configuration
no shutdown
exit
dhcp6
proxy-server
no shutdown
exit
exit
exit
exit
exit
</input>
<group name="ifaces.{{ name }}" contains="ipv4,ipv6">
## group to match top level interfaces
interface "{{ name }}"
description {{ description | re(".+") | strip('"') }}
address {{ ipv4 | joinmatches('; ') }}
address {{ ipv6 | contains(":") | joinmatches('; ') }}
exit {{ _end_ }}
</group>
<group name="ifaces.{{ name }}" contains="ipv4,ipv6">
## group to match lower level interfaces
interface "{{ name | _start_ }}" create
{{ iftype }}-interface "{{ name | _start_ }}" create
description {{ description | re(".+") | strip('"') | strip }}
address {{ ipv4 | contains(".") | joinmatches('; ') }}
address {{ ipv4 | contains(".") | joinmatches('; ') }} gw-ip-address {{ ignore }}
exit {{ _end_ }}
</group>
"""
parser = ttp(template=template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"ifaces": {
"BNG-RH201-CORE": {
"description": "BNG-RH201-CORE",
"ipv4": "11.11.11.11/31",
"ipv6": "1111:0:1111:1111::1/64",
},
"BNG-RH202-CORE": {
"description": "BNG-RH201-CORE",
"ipv4": "22.22.22.22/31",
"ipv6": "2222:0:2222:2222::2/64",
},
"OLT-MGT": {"ipv4": "55.55.55.55/25"},
"OTDR-MGT": {"ipv4": "44.44.44.44/25"},
"s100": {
"description": "Subscriber interface for subscribers",
"iftype": "subscriber",
"ipv4": "77.77.77.77/22; 88.88.88.88/20",
},
"shunt": {"iftype": "redundant", "ipv4": "66.66.66.66/31"},
"system": {
"ipv4": "33.33.33.33/32",
"ipv6": "3333:0:3333:3333::3/128",
},
}
}
]
]
# test_issue_50()
def test_start_with_set():
data = """
authentication {
inactive: authentication {
"""
template = """
authentication { {{ inactive | set(False) | _start_ }}
inactive: authentication { {{ inactive | set(True) | _start_ }}
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [[[{"inactive": False}, {"inactive": True}]]]
# test_start_with_set()
def test_ios_bgp_pers_pars():
template = """
<vars>
defaults_bgp_peers = {
"description": "",
"remote-as": "",
"shutdown": "no",
"inherit_peer-session": "",
"update-source": "",
"password": ""
}
</vars>
<group name="bgp_peers">
<group name="{{ ASN }}">
router bgp {{ ASN }}
<group name="{{ PeerIP }}" default="defaults_bgp_peers">
neighbor {{ PeerIP }} remote-as {{ remote-as }}
neighbor {{ PeerIP }} description {{ description | ORPHRASE }}
neighbor {{ PeerIP | let("shutdown", "yes") }} shutdown
neighbor {{ PeerIP }} inherit peer-session {{ inherit_peer-session }}
neighbor {{ PeerIP }} password {{ password | ORPHRASE }}
neighbor {{ PeerIP }} update-source {{ update-source }}
</group>
</group>
</group>
"""
data = """
router bgp 65100
neighbor 1.1.1.1 remote-as 1234
neighbor 1.1.1.1 description Some Description here
neighbor 1.1.1.1 shutdown
neighbor 1.1.1.1 inherit peer-session session_1
neighbor 1.1.1.1 password 12345678
neighbor 1.1.1.1 update-source Loopback 1
neighbor 1.1.1.2 remote-as 1234
neighbor 1.1.1.2 inherit peer-session session_1
neighbor 1.1.1.2 update-source Loopback 1
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"bgp_peers": {
"65100": {
"1.1.1.1": {
"description": "Some Description here",
"inherit_peer-session": "session_1",
"password": "12345678",
"remote-as": "1234",
"shutdown": "yes",
"update-source": "",
},
"1.1.1.2": {
"description": "",
"inherit_peer-session": "session_1",
"password": "",
"remote-as": "1234",
"shutdown": "no",
"update-source": "",
},
}
}
}
]
]
# test_ios_bgp_pers_pars()
def test_ip_address_parsing():
data = """
interface Vlan99
description vlan99_interface
ip address 20.99.10.1 255.255.255.0 secondary
ip address 30.99.10.1 255.255.255.0 secondary
ip address 10.99.10.1 255.255.255.0
load-interval 60
bandwidth 10000000
!
interface Vlan100
description vlan100_interface
ip address 10.100.10.1 255.255.255.0
load-interval 60
bandwidth 10000000
!
"""
template = """
<group name="interface">
interface {{ interface }}
description {{ description }}
ip address {{ ipv4_addr | PHRASE | exclude("secondary") | to_ip | with_prefixlen }}
load-interval {{ load-interval }}
bandwidth {{ bandwidth }}
<group name="ipv4_secondary*">
ip address {{ ipv4_addr | PHRASE | let("is_secondary", True) | to_ip | with_prefixlen }} secondary
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [
[
{
"interface": [
{
"bandwidth": "10000000",
"description": "vlan99_interface",
"interface": "Vlan99",
"ipv4_addr": "10.99.10.1/24",
"ipv4_secondary": [
{"ipv4_addr": "20.99.10.1/24", "is_secondary": True},
{"ipv4_addr": "30.99.10.1/24", "is_secondary": True},
],
"load-interval": "60",
},
{
"bandwidth": "10000000",
"description": "vlan100_interface",
"interface": "Vlan100",
"ipv4_addr": "10.100.10.1/24",
"load-interval": "60",
},
]
}
]
]
# test_ip_address_parsing()
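# Aside: a minimal sketch (not TTP internals) of the conversion that the
# "to_ip | with_prefixlen" chain above performs, using only the stdlib
# ipaddress module; the helper name is illustrative, not part of TTP.

```python
import ipaddress


def addr_mask_to_prefixlen(text):
    """Convert '10.99.10.1 255.255.255.0' style text to '10.99.10.1/24'."""
    addr, mask = text.split()[:2]
    # ip_interface accepts the "address/netmask" form directly
    return ipaddress.ip_interface("{}/{}".format(addr, mask)).with_prefixlen
```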
def test_vlans_parsing():
template = """
<group name="ports_summary*">
{{ port }} {{ mode }} {{ encap }} {{ satus }} {{ native_vlan | DIGIT }}
</group>
<group name="vlans_allowed">
Port Vlans allowed on trunk {{ _start_ }}
<group name="interfaces*">
{{ port }} {{ vlans | unrange('-', ',') | split(",") }}
</group>
{{ _end_ }}
</group>
<group name="vlans_active">
Port Vlans allowed and active in management domain {{ _start_ }}
<group name="interfaces*">
{{ port }} {{ vlans | unrange('-', ',') | split(",") }}
</group>
{{ _end_ }}
</group>
<group name="vlans_forwarding">
Port Vlans in spanning tree forwarding state and not pruned {{ _start_ }}
<group name="interfaces*">
{{ port }} {{ vlans | unrange('-', ',') | split(",") }}
</group>
{{ _end_ }}
</group>
"""
data = """
Port Mode Encapsulation Status Native vlan
Gi0 on 802.1q trunking 1
Gi7 on 802.1q trunking 1
Port Vlans allowed on trunk
Gi0 1,8,999,1002-1005
Gi7 1,100,120,1000,1002-1005
Port Vlans allowed and active in management domain
Gi0 1,8,999
Gi7 1,100,120,1000
Port Vlans in spanning tree forwarding state and not pruned
Gi0 1,8,999
Gi7 1,100,120,1000
"""
parser = ttp(data, template, log_level="ERROR")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=120)
assert res == [
[
{
"ports_summary": [
{
"encap": "802.1q",
"mode": "on",
"native_vlan": "1",
"port": "Gi0",
"satus": "trunking",
},
{
"encap": "802.1q",
"mode": "on",
"native_vlan": "1",
"port": "Gi7",
"satus": "trunking",
},
],
"vlans_active": {
"interfaces": [
{"port": "Gi0", "vlans": ["1", "8", "999"]},
{"port": "Gi7", "vlans": ["1", "100", "120", "1000"]},
]
},
"vlans_allowed": {
"interfaces": [
{
"port": "Gi0",
"vlans": ["1", "8", "999", "1002", "1003", "1004", "1005"],
},
{
"port": "Gi7",
"vlans": [
"1",
"100",
"120",
"1000",
"1002",
"1003",
"1004",
"1005",
],
},
]
},
"vlans_forwarding": {
"interfaces": [
{"port": "Gi0", "vlans": ["1", "8", "999"]},
{"port": "Gi7", "vlans": ["1", "100", "120", "1000"]},
]
},
}
]
]
# test_vlans_parsing()
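# Aside: a minimal sketch (not TTP's implementation) of what the
# "unrange('-', ',') | split(',')" chain above does to a vlan range string;
# the helper name is illustrative only.

```python
def unrange_vlans(text):
    """Expand '1,8,1002-1005' into ['1', '8', '1002', '1003', '1004', '1005']."""
    result = []
    for item in text.split(","):
        if "-" in item:
            start, end = item.split("-")
            result.extend(str(v) for v in range(int(start), int(end) + 1))
        else:
            result.append(item)
    return result
```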
def test_asa_acls_issue_55_uses_itemize_with_dynamic_path():
data = """
object-group service gokuhead
service-object tcp-udp destination eq gokurpc
service-object tcp destination eq 902
service-object tcp destination eq https
service-object tcp destination eq nfs
service-object tcp destination eq 10025
object-group network gohan
network-object object gohan-01
network-object object gohan-02
network-object object vlan_944
network-object object gohan-03
network-object object gohan-05
network-object object gohan-06
object-group service sql tcp
port-object eq 1433
object-group network vegeta
group-object trunks
network-object object vegeta-01
object-group network Space-Users
network-object object ab
network-object object ac
network-object object ad
network-object object ae
network-object object af
network-object object ag
network-object object ah
network-object object ai
network-object object aj
object-group network dalmatians
network-object object dog-01
group-object trunks
network-object object vlan_950
group-object Space-Users
network-object object Darts-Summary
"""
template = """
<vars>
SVC_PORTS = "tcp-udp|tcp|udp"
</vars>
<group name="object-{{ object_type }}-groups**.{{ object_name }}**">
object-group {{ object_type }} {{ object_name | _start_ }}
object-group {{ object_type }} {{ object_name | _start_ }} {{ protocol | re("SVC_PORTS")}}
description {{ description | re(".*") }}
<group name="{{ type }}-objects" itemize="obj_name" method="table">
network-object object {{ obj_name | let("type", "network") }}
network-object host {{ obj_name | IP | let("type", "network") }}
group-object {{ obj_name | let("type", "group") }}
service-object object {{ obj_name | let("type", "service") }}
service-object {{ obj_name | let("type", "service") }}
</group>
<group name="service-object-ports*">
service-object {{ protocol | re("SVC_PORTS") }} destination eq {{port}}
</group>
<group name="service-object-port-ranges*">
service-object {{ protocol | re("SVC_PORTS") }} destination range {{port_begin}} {{port_end}}
</group>
<group name="service-port-objects" itemize="port_obj">
port-object eq {{ port_obj }}
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"object-network-groups": {
"Space-Users": {
"network-objects": [
"ab",
"ac",
"ad",
"ae",
"af",
"ag",
"ah",
"ai",
"aj",
]
},
"dalmatians": {
"group-objects": ["trunks", "Space-Users"],
"network-objects": ["dog-01", "vlan_950", "Darts-Summary"],
},
"gohan": {
"network-objects": [
"gohan-01",
"gohan-02",
"vlan_944",
"gohan-03",
"gohan-05",
"gohan-06",
]
},
"vegeta": {
"group-objects": ["trunks"],
"network-objects": ["vegeta-01"],
},
},
"object-service-groups": {
"gokuhead": {
"service-object-ports": [
{"port": "gokurpc", "protocol": "tcp-udp"},
{"port": "902", "protocol": "tcp"},
{"port": "https", "protocol": "tcp"},
{"port": "nfs", "protocol": "tcp"},
{"port": "10025", "protocol": "tcp"},
]
},
"sql": {"protocol": "tcp", "service-port-objects": ["1433"]},
},
}
]
]
# test_asa_acls_issue_55_uses_itemize_with_dynamic_path()
def test_asa_acls_issue_55():
data = """
object-group service gokuhead
service-object tcp-udp destination eq gokurpc
service-object tcp destination eq 902
service-object tcp destination eq https
service-object tcp destination eq nfs
service-object tcp destination eq 10025
object-group network gohan
network-object object gohan-01
network-object object gohan-02
network-object object vlan_944
network-object object gohan-03
network-object object gohan-05
network-object object gohan-06
object-group service sql tcp
port-object eq 1433
object-group network vegeta
group-object trunks
network-object object vegeta-01
object-group network Space-Users
network-object object ab
network-object object ac
network-object object ad
network-object object ae
network-object object af
network-object object ag
network-object object ah
network-object object ai
network-object object aj
object-group network dalmatians
network-object object dog-01
group-object trunks
network-object object vlan_950
group-object Space-Users
network-object object Darts-Summary
"""
template = """
<vars>
SVC_PORTS = "tcp-udp|tcp|udp"
</vars>
<group name="object-{{ object_type }}-groups**.{{ object_name }}**">
object-group {{ object_type }} {{ object_name | _start_ }}
object-group {{ object_type }} {{ object_name | _start_ }} {{ protocol | re("SVC_PORTS")}}
description {{ description | re(".*") }}
<group name="network-objects" itemize="obj_name" method="table">
network-object object {{ obj_name }}
network-object host {{ obj_name | IP }}
</group>
<group name="group-objects" itemize="obj_name" method="table">
group-object {{ obj_name }}
</group>
<group name="group-objects" itemize="obj_name" method="table">
service-object object {{ obj_name }}
service-object {{ obj_name }}
</group>
<group name="service-object-ports*">
service-object {{ protocol | re("SVC_PORTS") }} destination eq {{port}}
</group>
<group name="service-object-port-ranges*">
service-object {{ protocol | re("SVC_PORTS") }} destination range {{port_begin}} {{port_end}}
</group>
<group name="service-port-objects" itemize="port_obj">
port-object eq {{ port_obj }}
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"object-network-groups": {
"Space-Users": {
"network-objects": [
"ab",
"ac",
"ad",
"ae",
"af",
"ag",
"ah",
"ai",
"aj",
]
},
"dalmatians": {
"group-objects": ["trunks", "Space-Users"],
"network-objects": ["dog-01", "vlan_950", "Darts-Summary"],
},
"gohan": {
"network-objects": [
"gohan-01",
"gohan-02",
"vlan_944",
"gohan-03",
"gohan-05",
"gohan-06",
]
},
"vegeta": {
"group-objects": ["trunks"],
"network-objects": ["vegeta-01"],
},
},
"object-service-groups": {
"gokuhead": {
"service-object-ports": [
{"port": "gokurpc", "protocol": "tcp-udp"},
{"port": "902", "protocol": "tcp"},
{"port": "https", "protocol": "tcp"},
{"port": "nfs", "protocol": "tcp"},
{"port": "10025", "protocol": "tcp"},
]
},
"sql": {"protocol": "tcp", "service-port-objects": ["1433"]},
},
}
]
]
# test_asa_acls_issue_55()
def test_issue_57_headers_parsing():
"""
The first issue was that the startempty match was not being selected in
favour of the start match produced by the headers line:
Interface Link Protocol Primary_IP Description {{ _headers_ }}
That was fixed by adding this code to the TTP selection logic for multiple
matches:
# startempty RE always more preferred
if startempty_re:
for index in startempty_re:
re_ = result[index][0]
result_data = result[index][1]
# skip results that did not pass validation check
if result_data == False:
continue
# prefer result with same path as current record
elif re_["GROUP"].group_id == self.record["GRP_ID"]:
break
# prefer children of current record group
elif self.record["GRP_ID"] and re_["GROUP"].group_id[
0
].startswith(self.record["GRP_ID"][0]):
break
# start RE preferred next
elif start_re:
Another problem was that
Interface Link Protocol Primary_IP Description {{ _headers_ }}
was matching the "Duplex: (a)/A - auto; H - half; F - full" line; that was
fixed by changing the _end_ logic: a self.ended_groups set was introduced
to _results_class, and self.GRPLOCL was replaced with logic that uses
self.ended_groups instead.
All in all this resulted in better _end_ handling behaviour and also helped
fix issue 45, where previously filtering had to be used as a workaround;
now _end_ helps there as well.
data = """
Brief information on interfaces in route mode:
Link: ADM - administratively down; Stby - standby
Protocol: (s) - spoofing
Interface Link Protocol Primary IP Description
InLoop0 UP UP(s) --
REG0 UP -- --
Vlan401 UP UP 10.251.147.36 HSSBC_to_inband_mgmt_r4
Brief information on interfaces in bridge mode:
Link: ADM - administratively down; Stby - standby
Speed: (a) - auto
Duplex: (a)/A - auto; H - half; F - full
Type: A - access; T - trunk; H - hybrid
Interface Link Speed Duplex Type PVID Description
BAGG1 UP 20G(a) F(a) T 1 to-KDC-R4.10-Core-1
BAGG14 UP 10G(a) F(a) T 1 KDC-R429-E1 BackUp Chassis
BAGG22 UP 20G(a) F(a) T 1 HSSBC-NS-01
FGE1/0/49 DOWN auto A A 1
XGE1/0/1 UP 10G(a) F(a) T 1 KDC-R402-E1 Backup Chassis
"""
template = """
<group name = "interfaces">
<group name="routed">
Brief information on interfaces in route mode: {{ _start_ }}
<group name = "{{Interface}}">
Interface Link Protocol Primary_IP Description {{ _headers_ }}
</group>
{{ _end_ }}
</group>
<group name="bridged">
Brief information on interfaces in bridge mode: {{ _start_ }}
<group name = "{{Interface}}">
Interface Link Speed Duplex Type PVID Description {{ _headers_ }}
</group>
{{ _end_ }}
</group>
</group>
"""
parser = ttp(data, template, log_level="error")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"interfaces": {
"bridged": {
"BAGG1": {
"Description": "to-KDC-R4.10-Core-1",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "20G(a)",
"Type": "T",
},
"BAGG14": {
"Description": "KDC-R429-E1 BackUp " "Chassis",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "10G(a)",
"Type": "T",
},
"BAGG22": {
"Description": "HSSBC-NS-01",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "20G(a)",
"Type": "T",
},
"FGE1/0/49": {
"Description": "",
"Duplex": "A",
"Link": "DOWN",
"PVID": "1",
"Speed": "auto",
"Type": "A",
},
"Link: ADM - administr": {
"Description": "",
"Duplex": "Stby -",
"Link": "ative",
"PVID": "dby",
"Speed": "ly down;",
"Type": "stan",
},
"XGE1/0/1": {
"Description": "KDC-R402-E1 Backup " "Chassis",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "10G(a)",
"Type": "T",
},
},
"routed": {
"InLoop0": {
"Description": "",
"Link": "UP",
"Primary_IP": "--",
"Protocol": "UP(s)",
},
"Link: ADM - administr": {
"Description": "",
"Link": "ative",
"Primary_IP": "Stby - " "standby",
"Protocol": "ly down;",
},
"REG0": {
"Description": "",
"Link": "UP",
"Primary_IP": "--",
"Protocol": "--",
},
"Vlan401": {
"Description": "HSSBC_to_inband_mgmt_r4",
"Link": "UP",
"Primary_IP": "10.251.147.36",
"Protocol": "UP",
},
},
}
}
]
]
# test_issue_57_headers_parsing()
def test_issue_57_headers_parsing_using_columns():
"""
Added the columns attribute for headers; the expected number of header
columns can now be adjusted as required to filter out unwanted results
"""
data = """
Brief information on interfaces in route mode:
Link: ADM - administratively down; Stby - standby
Protocol: (s) - spoofing
Interface Link Protocol Primary IP Description
InLoop0 UP UP(s) --
REG0 UP -- --
Vlan401 UP UP 10.251.147.36 HSSBC_to_inband_mgmt_r4
Brief information on interfaces in bridge mode:
Link: ADM - administratively down; Stby - standby
Speed: (a) - auto
Duplex: (a)/A - auto; H - half; F - full
Type: A - access; T - trunk; H - hybrid
Interface Link Speed Duplex Type PVID Description
BAGG1 UP 20G(a) F(a) T 1 to-KDC-R4.10-Core-1
BAGG14 UP 10G(a) F(a) T 1 KDC-R429-E1 BackUp Chassis
BAGG22 UP 20G(a) F(a) T 1 HSSBC-NS-01
FGE1/0/49 DOWN auto A A 1
XGE1/0/1 UP 10G(a) F(a) T 1 KDC-R402-E1 Backup Chassis
"""
template = """
<group name = "interfaces">
<group name="routed">
Brief information on interfaces in route mode: {{ _start_ }}
<group name = "{{Interface}}">
Interface Link Protocol Primary_IP Description {{ _headers_ | columns(5)}}
</group>
{{ _end_ }}
</group>
<group name="bridged">
Brief information on interfaces in bridge mode: {{ _start_ }}
<group name = "{{Interface}}">
Interface Link Speed Duplex Type PVID Description {{ _headers_ | columns(7) }}
</group>
{{ _end_ }}
</group>
</group>
"""
parser = ttp(data, template, log_level="error")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"interfaces": {
"bridged": {
"BAGG1": {
"Description": "to-KDC-R4.10-Core-1",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "20G(a)",
"Type": "T",
},
"BAGG14": {
"Description": "KDC-R429-E1 BackUp " "Chassis",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "10G(a)",
"Type": "T",
},
"BAGG22": {
"Description": "HSSBC-NS-01",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "20G(a)",
"Type": "T",
},
"FGE1/0/49": {
"Description": "",
"Duplex": "A",
"Link": "DOWN",
"PVID": "1",
"Speed": "auto",
"Type": "A",
},
"XGE1/0/1": {
"Description": "KDC-R402-E1 Backup " "Chassis",
"Duplex": "F(a)",
"Link": "UP",
"PVID": "1",
"Speed": "10G(a)",
"Type": "T",
},
},
"routed": {
"InLoop0": {
"Description": "",
"Link": "UP",
"Primary_IP": "--",
"Protocol": "UP(s)",
},
"REG0": {
"Description": "",
"Link": "UP",
"Primary_IP": "--",
"Protocol": "--",
},
"Vlan401": {
"Description": "HSSBC_to_inband_mgmt_r4",
"Link": "UP",
"Primary_IP": "10.251.147.36",
"Protocol": "UP",
},
},
}
}
]
]
# test_issue_57_headers_parsing_using_columns()
def test_interface_template_not_collecting_all_data_solution():
data = """
interface Bundle-Ether10
description Bundle-Ether10
bfd mode ietf
bfd address-family ipv4 multiplier 3
bfd address-family ipv4 destination 192.168.1.7
bfd address-family ipv4 fast-detect
bfd address-family ipv4 minimum-interval 100
mtu 9114
ipv4 address 192.168.1.6 255.255.255.254
ipv6 address fc00::1:5/127
load-interval 30
!
interface Bundle-Ether51
description Bundle-Ether51
bfd mode ietf
bfd address-family ipv4 multiplier 3
bfd address-family ipv4 destination 192.168.1.2
bfd address-family ipv4 fast-detect
bfd address-family ipv4 minimum-interval 100
mtu 9114
ipv4 address 192.168.1.3 255.255.255.254
ipv6 address fc00::1:3/127
load-interval 30
!
interface Loopback0
description Loopback0
ipv4 address 10.1.1.1 255.255.255.255
ipv4 address 10.2.2.2 255.255.255.255 secondary
ipv6 address fc00::1/128
ipv6 address fc00::101/128
!
interface Loopback1
description Loopback1
ipv4 address 10.100.0.1 255.255.255.0
ipv4 address 10.100.1.1 255.255.255.0 secondary
ipv4 address 10.100.2.1 255.255.255.0 secondary
ipv6 address fc00:100::1/64
ipv6 address fc00:100::101/64
ipv6 address fc00:100::201/64
!
interface MgmtEth0/RP0/CPU0/0
description MgmtEth0/RP0/CPU0/0
cdp
vrf VRF-MGMT
ipv4 address 172.23.136.21 255.255.252.0
!
interface GigabitEthernet0/0/0/12
description GigabitEthernet0/0/0/12
mtu 9018
lldp
receive disable
transmit disable
!
negotiation auto
load-interval 30
l2transport
!
!
interface TenGigE0/0/0/4
description TenGigE0/0/0/4
bundle id 51 mode active
cdp
load-interval 30
!
interface TenGigE0/0/0/5
shutdown
!
interface TenGigE0/0/0/5.100 l2transport
description TenGigE0/0/0/5.100
!
interface TenGigE0/0/0/47
description TenGigE0/0/0/47
shutdown
mac-address 201.b19.1234
!
interface BVI101
cdp
description BVI101
ipv4 address 192.168.101.1 255.255.255.0
load-interval 30
mac-address 200.b19.4321
!
interface HundredGigE0/0/1/0
description HundredGigE0/0/1/0
bundle id 10 mode active
cdp
load-interval 30
mac-address 200.b19.5678
!
interface preconfigure GigabitEthernet0/0/0/11
description GigabitEthernet0/0/0/11
shutdown
!
interface preconfigure GigabitEthernet0/0/0/16
description GigabitEthernet0/0/0/16
shutdown
!
interface preconfigure GigabitEthernet0/0/0/17
description GigabitEthernet0/0/0/17
shutdown
!
"""
template_original = """
<doc>
Template for capturing interface configuration data from IOS-XR devices
Note: In order to handle different interface appearances, the interface block has been replicated.
Be sure to update all blocks accordingly when adding any new values to capture.
</doc>
<vars>
intf_defaults = {
"description": None,
"speed": None,
"negotiation": None,
"disabled": False,
"mode": None,
}
</vars>
<macro>
## parses ipv4 addresses to determine which is primary and which are secondary
## and converts dotted-quad subnet mask into cidr format
def ipv4_macro(data):
data_list = list(data.split(" "))
addr = str(data_list[0])
mask = str(data_list[1])
mask = str(sum(bin(int(x)).count('1') for x in mask.split('.')))
ipv4 = addr+"/"+mask
if 'secondary' in data:
is_secondary = True
else:
is_secondary = False
result = { "ipv4" : ipv4, "is_secondary" : is_secondary }
return result
</macro>
<group name="interfaces" default="intf_defaults">
interface {{ interface | _start_}}
interface {{ interface | let("mode", "l2transport") | _start_ }} l2transport
interface preconfigure {{ interface | let("mode", "preconfigure") | _start_ }}
description {{ description | re(".+") }}
speed {{ speed }}
negotiation {{ negotiation }}
shutdown {{ disabled | set(True) }}
mac-address {{ mac_address }}
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | PHRASE | _exact_ | macro("ipv4_macro") }}
</group>
<group name="ipv6*" method="table" containsall="ipv6">
ipv6 address {{ ipv6 | ORPHRASE | _exact_ }}
</group>
! {{ _end_ }}
</group>
"""
parser = ttp(data, template_original, log_level="error")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"interfaces": [
{
"description": "Bundle-Ether10",
"disabled": False,
"interface": "Bundle-Ether10",
"ipv4": [
{"ipv4": {"ipv4": "192.168.1.6/31", "is_secondary": False}}
],
"ipv6": [{"ipv6": "fc00::1:5/127"}],
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "Bundle-Ether51",
"disabled": False,
"interface": "Bundle-Ether51",
"ipv4": [
{"ipv4": {"ipv4": "192.168.1.3/31", "is_secondary": False}}
],
"ipv6": [{"ipv6": "fc00::1:3/127"}],
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "Loopback0",
"disabled": False,
"interface": "Loopback0",
"ipv4": [
{"ipv4": {"ipv4": "10.1.1.1/32", "is_secondary": False}},
{"ipv4": {"ipv4": "10.2.2.2/32", "is_secondary": True}},
],
"ipv6": [{"ipv6": "fc00::1/128"}, {"ipv6": "fc00::101/128"}],
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "Loopback1",
"disabled": False,
"interface": "Loopback1",
"ipv4": [
{"ipv4": {"ipv4": "10.100.0.1/24", "is_secondary": False}},
{"ipv4": {"ipv4": "10.100.1.1/24", "is_secondary": True}},
{"ipv4": {"ipv4": "10.100.2.1/24", "is_secondary": True}},
],
"ipv6": [
{"ipv6": "fc00:100::1/64"},
{"ipv6": "fc00:100::101/64"},
{"ipv6": "fc00:100::201/64"},
],
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "MgmtEth0/RP0/CPU0/0",
"disabled": False,
"interface": "MgmtEth0/RP0/CPU0/0",
"ipv4": [
{
"ipv4": {
"ipv4": "172.23.136.21/22",
"is_secondary": False,
}
}
],
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "GigabitEthernet0/0/0/12",
"disabled": False,
"interface": "GigabitEthernet0/0/0/12",
"mode": None,
"negotiation": "auto",
"speed": None,
},
{
"description": "TenGigE0/0/0/4",
"disabled": False,
"interface": "TenGigE0/0/0/4",
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": None,
"disabled": True,
"interface": "TenGigE0/0/0/5",
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "TenGigE0/0/0/5.100",
"disabled": False,
"interface": "TenGigE0/0/0/5.100",
"mode": "l2transport",
"negotiation": None,
"speed": None,
},
{
"description": "TenGigE0/0/0/47",
"disabled": True,
"interface": "TenGigE0/0/0/47",
"mac_address": "201.b19.1234",
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "BVI101",
"disabled": False,
"interface": "BVI101",
"ipv4": [
{
"ipv4": {
"ipv4": "192.168.101.1/24",
"is_secondary": False,
}
}
],
"mac_address": "200.b19.4321",
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "HundredGigE0/0/1/0",
"disabled": False,
"interface": "HundredGigE0/0/1/0",
"mac_address": "200.b19.5678",
"mode": None,
"negotiation": None,
"speed": None,
},
{
"description": "GigabitEthernet0/0/0/11",
"disabled": True,
"interface": "GigabitEthernet0/0/0/11",
"mode": "preconfigure",
"negotiation": None,
"speed": None,
},
{
"description": "GigabitEthernet0/0/0/16",
"disabled": True,
"interface": "GigabitEthernet0/0/0/16",
"mode": "preconfigure",
"negotiation": None,
"speed": None,
},
{
"description": "GigabitEthernet0/0/0/17",
"disabled": True,
"interface": "GigabitEthernet0/0/0/17",
"mode": "preconfigure",
"negotiation": None,
"speed": None,
},
]
}
]
]
# test_interface_template_not_collecting_all_data_solution()
@pytest.mark.skipif(True, reason="Need to fix this one")
def test_interface_template_not_collecting_all_data():
"""
Template is not collecting the mac-address for interface BVI101
"""
data = """
interface Bundle-Ether10
description Bundle-Ether10
bfd mode ietf
bfd address-family ipv4 multiplier 3
bfd address-family ipv4 destination 192.168.1.7
bfd address-family ipv4 fast-detect
bfd address-family ipv4 minimum-interval 100
mtu 9114
ipv4 address 192.168.1.6 255.255.255.254
ipv6 address fc00::1:5/127
load-interval 30
!
interface Bundle-Ether51
description Bundle-Ether51
bfd mode ietf
bfd address-family ipv4 multiplier 3
bfd address-family ipv4 destination 192.168.1.2
bfd address-family ipv4 fast-detect
bfd address-family ipv4 minimum-interval 100
mtu 9114
ipv4 address 192.168.1.3 255.255.255.254
ipv6 address fc00::1:3/127
load-interval 30
!
interface Loopback0
description Loopback0
ipv4 address 10.1.1.1 255.255.255.255
ipv4 address 10.2.2.2 255.255.255.255 secondary
ipv6 address fc00::1/128
ipv6 address fc00::101/128
!
interface Loopback1
description Loopback1
ipv4 address 10.100.0.1 255.255.255.0
ipv4 address 10.100.1.1 255.255.255.0 secondary
ipv4 address 10.100.2.1 255.255.255.0 secondary
ipv6 address fc00:100::1/64
ipv6 address fc00:100::101/64
ipv6 address fc00:100::201/64
!
interface MgmtEth0/RP0/CPU0/0
description MgmtEth0/RP0/CPU0/0
cdp
vrf VRF-MGMT
ipv4 address 172.23.136.21 255.255.252.0
!
interface GigabitEthernet0/0/0/12
description GigabitEthernet0/0/0/12
mtu 9018
lldp
receive disable
transmit disable
!
negotiation auto
load-interval 30
l2transport
!
!
interface TenGigE0/0/0/4
description TenGigE0/0/0/4
bundle id 51 mode active
cdp
load-interval 30
!
interface TenGigE0/0/0/5
shutdown
!
interface TenGigE0/0/0/5.100 l2transport
description TenGigE0/0/0/5.100
!
interface TenGigE0/0/0/47
description TenGigE0/0/0/47
shutdown
mac-address 201.b19.1234
!
interface BVI101
cdp
description BVI101
ipv4 address 192.168.101.1 255.255.255.0
load-interval 30
mac-address 200.b19.4321
!
interface HundredGigE0/0/1/0
description HundredGigE0/0/1/0
bundle id 10 mode active
cdp
load-interval 30
mac-address 200.b19.5678
!
interface preconfigure GigabitEthernet0/0/0/11
description GigabitEthernet0/0/0/11
shutdown
!
interface preconfigure GigabitEthernet0/0/0/16
description GigabitEthernet0/0/0/16
shutdown
!
interface preconfigure GigabitEthernet0/0/0/17
description GigabitEthernet0/0/0/17
shutdown
!
"""
template_original = """
<doc>
Template for capturing interface configuration data from IOS-XR devices
Note: In order to accommodate different interface appearances, the interface block has been replicated.
Be sure to update all blocks accordingly when adding any new values to capture.
</doc>
<macro>
## parses ipv4 addresses to determine which is primary and which are secondary
## and converts dotted-quad subnet mask into cidr format
def ipv4_macro(data):
data_list = list(data.split(" "))
addr = str(data_list[0])
mask = str(data_list[1])
mask = str(sum(bin(int(x)).count('1') for x in mask.split('.')))
ipv4 = addr+"/"+mask
if 'secondary' in data:
is_secondary = True
else:
is_secondary = False
result = { "ipv4" : ipv4, "is_secondary" : is_secondary }
return result
</macro>
## parent group for all interface groups
<group name="interfaces">
## matches primary interfaces
<group>
{{ mode | set(None) }}
{{ description | set(None) }}
{{ speed | set(None) }}
{{ negotiation | set(None) }}
{{ disabled | set(False) }}
interface {{ interface }}
description {{ description | re(".+") }}
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | PHRASE | _exact_ | macro("ipv4_macro") }}
</group>
<group name="ipv6*" method="table" containsall="ipv6">
ipv6 address {{ ipv6 | PHRASE | _exact_ }}
</group>
speed {{ speed }}
negotiation {{ negotiation }}
shutdown {{ disabled | set(True) }}
mac-address {{ mac_address }}
</group>
## matches pre-configured interfaces
<group>
{{ mode | set('preconfigure') }}
{{ description | set(None) }}
{{ speed | set(None) }}
{{ negotiation | set(None) }}
{{ disabled | set(False) }}
interface preconfigure {{ interface }}
description {{ description | re(".+") }}
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | PHRASE | _exact_ | macro("ipv4_macro") }}
</group>
<group name="ipv6*" method="table" containsall="ipv6">
ipv6 address {{ ipv6 | PHRASE | _exact_ }}
</group>
speed {{ speed }}
negotiation {{ negotiation }}
shutdown {{ disabled | set(True) }}
mac-address {{ mac_address }}
</group>
## matches sub-interfaces
<group>
{{ mode | set('l2transport') }}
{{ description | set(None) }}
{{ speed | set(None) }}
{{ negotiation | set(None) }}
{{ disabled | set(False) }}
interface {{ interface }} l2transport
description {{ description | re(".+") }}
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | PHRASE | _exact_ | macro("ipv4_macro") }}
</group>
<group name="ipv6*" method="table" containsall="ipv6">
ipv6 address {{ ipv6 | PHRASE | _exact_ }}
</group>
speed {{ speed }}
negotiation {{ negotiation }}
shutdown {{ disabled | set(True) }}
mac-address {{ mac_address }}
</group>
</group>
"""
parser = ttp(data, template_original, log_level="error")
parser.parse()
res = parser.result()
pprint.pprint(res, width=80)
# test_interface_template_not_collecting_all_data()
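The `ipv4_macro` defined in the template above counts set bits in each octet to turn a dotted-quad mask into a CIDR prefix and flags secondary addresses. A standalone sketch of that conversion logic (plain Python, independent of TTP):

```python
def mask_to_prefix(mask):
    """Convert a dotted-quad netmask (e.g. '255.255.255.254') to a prefix length."""
    # bin(255) -> '0b11111111'; counting '1' digits per octet and summing
    # gives the number of set bits in the full 32-bit mask.
    return sum(bin(int(octet)).count('1') for octet in mask.split('.'))


def to_cidr(addr_and_mask):
    """Mirror of the template's ipv4_macro: 'A.B.C.D M.M.M.M [secondary]' -> dict."""
    parts = addr_and_mask.split()
    addr, mask = parts[0], parts[1]
    return {
        "ipv4": addr + "/" + str(mask_to_prefix(mask)),
        "is_secondary": "secondary" in parts,
    }

# to_cidr("10.2.2.2 255.255.255.255 secondary")
#   -> {'ipv4': '10.2.2.2/32', 'is_secondary': True}
```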
def test_interface_template_not_collecting_all_data_reduced():
"""
Below template and data were producing this result:
[[{'interfaces': [{'interface': 'TenGigE0/0/0/5.100'},
{'interface': 'BVI101',
'ipv4': [{'ipv4': '192.168.101.1 255.255.255.0'}]}]}]]
TTP was not collecting mac-address for BVI 101
"""
data = """
interface TenGigE0/0/0/5.100 l2transport
!
interface BVI101
ipv4 address 192.168.101.1 255.255.255.0
mac-address 200.b19.4321
!
"""
template = """
<group name="interfaces">
## matches primary interfaces
<group>
interface {{ interface }}
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | _line_ | _exact_ }}
</group>
mac-address {{ mac_address }}
</group>
## matches sub-interfaces
<group>
interface {{ interface }} l2transport
mac-address {{ mac_address }}
</group>
</group>
"""
parser = ttp(data, template, log_level="error")
parser.parse()
res = parser.result()
# pprint.pprint(res, width=80)
assert res == [
[
{
"interfaces": [
{"interface": "TenGigE0/0/0/5.100"},
{
"interface": "BVI101",
"ipv4": [{"ipv4": "192.168.101.1 255.255.255.0"}],
"mac_address": "200.b19.4321",
},
]
}
]
]
# test_interface_template_not_collecting_all_data_reduced()
@pytest.mark.skipif(True, reason="Need to fix this one")
def test_interface_template_not_collecting_all_data_reduced_2():
"""
Below template and data were producing this result:
[[{'interfaces': [{'interface': 'TenGigE0/0/0/5'},
{'interface': 'TenGigE0/0/0/5.100',
'mac_address': '200.b19.1234'},
{'interface': 'BVI101',
'ipv4': [{'ipv4': '192.168.101.1 255.255.255.0'}]},
{'interface': 'HundredGigE0/0/1/0',
'mac_address': '200.b19.5678'}]}]]
Interface BVI101 should not have an IPv4 address matched, but
should have its mac-address matched. The problem is that the
l2transport group starts and it has a child group for IPv4 addresses;
the next match after the IPv4 match is mac-address, but its parent
is a different group. As a result the IPv4 address is saved under the
wrong group and the mac-address is not saved at all.
IDEA: try to implement automatic end-of-group tracking, adding previous
groups to self.ended_groups when a new, different group starts.
Current solution to this problem would be to use _end_ to explicitly
indicate end of group
"""
data = """
interface TenGigE0/0/0/5
!
interface TenGigE0/0/0/5.100 l2transport
mac-address 200.b19.1234
!
interface BVI101
ipv4 address 192.168.101.1 255.255.255.0
mac-address 200.b19.4321
!
interface HundredGigE0/0/1/0
mac-address 200.b19.5678
!
"""
template_original = """
<group name="interfaces">
## matches primary interfaces
<group>
interface {{ interface }}
mac-address {{ mac_address }}
</group>
## matches sub-interfaces
<group>
interface {{ interface }} l2transport
<group name="ipv4*" method="table" containsall="ipv4">
ipv4 address {{ ipv4 | _line_ | _exact_ }}
</group>
</group>
</group>
"""
parser = ttp(data, template_original, log_level="error")
parser.parse()
res = parser.result()
pprint.pprint(res, width=80)
# test_interface_template_not_collecting_all_data_reduced_2()
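The IDEA noted in the docstring above — auto-ending the previous group when a new one starts — can be illustrated with a toy line parser. This is a minimal sketch of the concept, not TTP's actual implementation:

```python
def parse_interfaces(config):
    """Toy parser: close the previous interface group when a new one starts,
    so trailing lines like mac-address attach to the correct group."""
    results, current = [], None
    for line in config.splitlines():
        line = line.strip()
        if line.startswith("interface "):
            if current is not None:
                results.append(current)  # auto-end the previous group
            # split()[1] also handles 'interface X l2transport' lines.
            current = {"interface": line.split()[1]}
        elif line.startswith("mac-address ") and current is not None:
            current["mac_address"] = line.split()[1]
    if current is not None:
        results.append(current)
    return results
```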
def test_issue_61():
data = """
banner motd &
BANNER MESSAGE line 1
BANNER MESSAGE line 2
BANNER MESSAGE line 3
&
some
other staff
"""
template_to_match_marker = "banner motd {{ marker }}"
template_to_parse_banner = """
<group name="motd">
banner motd {{ ignore(banner_marker) }} {{ _start_ }}
{{ banner_mesage | _line_ | joinmatches("\\n") }}
{{ ignore(banner_marker) }} {{ _end_ }}
</group>
"""
# extract marker value
parser = ttp(data, template_to_match_marker)
parser.parse()
marker = parser.result()[0][0]["marker"]
# parse banner
parser = ttp(data, template_to_parse_banner, vars={"banner_marker": marker})
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'motd': {'banner_mesage': 'BANNER MESSAGE line 1\n'
'BANNER MESSAGE line 2\n'
'BANNER MESSAGE line 3'}}]]
# test_issue_61()
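test_issue_61 above uses a two-pass approach: first extract the banner delimiter, then feed it back into a second template as a variable. The same two-pass idea expressed with the standard `re` module (a sketch of the technique; TTP handles it via `vars=`):

```python
import re

data = """
banner motd &
BANNER MESSAGE line 1
BANNER MESSAGE line 2
&
"""

# Pass 1: discover the delimiter character that follows "banner motd".
marker = re.search(r"banner motd (\S)", data).group(1)

# Pass 2: use the discovered delimiter to extract the banner body.
body = re.search(
    r"banner motd %s\n(.*?)\n%s" % (re.escape(marker), re.escape(marker)),
    data, re.S,
).group(1)
# marker == "&"; body holds the two BANNER MESSAGE lines.
```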
def test_fortigate_intf_parsing():
template = """
<group name="interfaces">
config system interface {{ _start_ }}
<group name="/interfaces*">
edit "{{ interface }}"
set allowaccess {{ allowaccess }}
set description "{{ description }}"
set interface "{{ phy_interface }}"
set snmp-index {{ snmp_index }}
set type {{ fgt_int_type }}
set vdom "{{ vdom }}"
set vlanid {{ vlan }}
next {{ _end_ }}
</group>
end {{ _end_ }}
</group>
"""
data = """
config system np6
edit "np6_0"
next
end
config system interface
edit "mgmt1"
set vdom "root"
set ip 10.10.10.1 255.255.255.248
set allowaccess ping
set type physical
set description "mgmt1"
set snmp-index 1
next
edit "port1"
set vdom "internal"
set ip 20.20.20.1 255.255.255.248
set allowaccess ping
set type physical
set snmp-index 2
next
end
config system custom-language
edit "en"
set filename "en"
next
edit "fr"
set filename "fr"
next
end
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'interfaces': [{'allowaccess': 'ping',
'description': 'mgmt1',
'fgt_int_type': 'physical',
'interface': 'mgmt1',
'snmp_index': '1',
'vdom': 'root'},
{'allowaccess': 'ping',
'fgt_int_type': 'physical',
'interface': 'port1',
'snmp_index': '2',
'vdom': 'internal'}]}]]
# test_fortigate_intf_parsing()
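The `_start_`/`_end_` anchors in the Fortigate template above scope matching to the `config system interface ... end` section, so the `np6` and `custom-language` sections are ignored. A minimal stdlib sketch of the same scoping idea:

```python
import re

def interface_section(config):
    """Extract only the 'config system interface ... end' section."""
    m = re.search(r"config system interface\n(.*?)\nend", config, re.S)
    return m.group(1) if m else ""

def edit_names(section):
    """List the interface names declared with 'edit "<name>"' in a section."""
    return re.findall(r'edit "([^"]+)"', section)
```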
def test_issue_57_one_more():
"""
Without _anonymous_ group groups id formation bug fix
below template/data were producing this result:
[[{'portchannel': {'1': {'local_members': [{}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/1',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/2',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]},
'2': {'local_members': [{}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/3',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/4',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]}}}]]
Further debugging revealed a flaw in the results selection logic:
due to the exclude("Port") statement the group was invalidated, and the anonymous
group_id was the same as the parent group_id, so new anonymous group matches were
unable to restart the group. Fixed by changing the way the anonymous group id is formed.
Before fix:
self.ended_groups: set()
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
self.ended_groups: {('portchannel.{{channel_number}}.local_members*', 0)}
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
self.ended_groups: {('portchannel.{{channel_number}}.local_members*', 0)}
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
After fix:
self.ended_groups: set()
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*._anonymous_', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
self.ended_groups: {('portchannel.{{channel_number}}.local_members*._anonymous_', 0)}
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*._anonymous_', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
self.ended_groups: set()
re_["GROUP"].group_id: ('portchannel.{{channel_number}}.local_members*._anonymous_', 0)
re_["GROUP"].parent_group_id: ('portchannel.{{channel_number}}.local_members*', 0)
"""
data = """
Loadsharing Type: Shar -- Loadsharing, NonS -- Non-Loadsharing
Port Status: S -- Selected, U -- Unselected,
I -- Individual, * -- Management port
Flags: A -- LACP_Activity, B -- LACP_Timeout, C -- Aggregation,
D -- Synchronization, E -- Collecting, F -- Distributing,
G -- Defaulted, H -- Expired
Aggregate Interface: Bridge-Aggregation1
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/1 U 32768 1 {ACG}
GE6/0/2 U 32768 1 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/1 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/2 0 32768 0 0x8000, 0000-0000-0000 {EF}
Aggregate Interface: Bridge-Aggregation2
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/3 U 32768 2 {ACG}
GE6/0/4 U 32768 2 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/3 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/4 0 32768 0 0x8000, 0000-0000-0000 {EF}
"""
template = """
<group name = "portchannel.{{channel_number}}">
Aggregate Interface: Bridge-Aggregation{{ channel_number}}
<group name = "local_members*" void="">
Local: {{_start_}}
<group>
{{interface | exclude("Port") }} {{status}} {{priority}} {{oper_key }} {{flag}}
</group>
</group>
<group name = "remote_members*">
{{interface }} {{status}} {{priority}} {{oper_key}} {{sys_id}}, {{ mac | MAC }} {{flag}}
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'portchannel': {'1': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/1',
'oper_key': '1',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/2',
'oper_key': '1',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/1',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/2',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]},
'2': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/3',
'oper_key': '2',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/4',
'oper_key': '2',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/3',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/4',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]}}}]]
# test_issue_57_one_more()
def test_issue_57_one_more_answer():
data = """
Loadsharing Type: Shar -- Loadsharing, NonS -- Non-Loadsharing
Port Status: S -- Selected, U -- Unselected,
I -- Individual, * -- Management port
Flags: A -- LACP_Activity, B -- LACP_Timeout, C -- Aggregation,
D -- Synchronization, E -- Collecting, F -- Distributing,
G -- Defaulted, H -- Expired
Aggregate Interface: Bridge-Aggregation1
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/1 U 32768 1 {ACG}
GE6/0/2 U 32768 1 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/1 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/2 0 32768 0 0x8000, 0000-0000-0000 {EF}
Aggregate Interface: Bridge-Aggregation2
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/3 U 32768 2 {ACG}
GE6/0/4 U 32768 2 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/3 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/4 0 32768 0 0x8000, 0000-0000-0000 {EF}
"""
template = """
<group name = "portchannel.{{channel_number}}">
Aggregate Interface: Bridge-Aggregation{{ channel_number}}
<group name = "local_members*">
{{interface}} {{status}} {{priority | DIGIT}} {{oper_key | DIGIT}} {{flag}}
</group>
<group name = "remote_members*">
{{interface}} {{status}} {{priority | DIGIT}} {{oper_key | DIGIT}} {{sys_id}}, {{ mac | MAC }} {{flag}}
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [[{'portchannel': {'1': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/1',
'oper_key': '1',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/2',
'oper_key': '1',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/1',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/2',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]},
'2': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/3',
'oper_key': '2',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/4',
'oper_key': '2',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/3',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/4',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]}}}]]
# test_issue_57_one_more_answer()
def test_issue_57_one_more_empty_dict_in_res():
"""
Without fix this results produced:
[[{'portchannel': {'1': {'local_members': [{},
{'flag': '{ACG}',
'interface': 'GE6/0/1',
'oper_key': '1',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/2',
'oper_key': '1',
'priority': '32768',
'status': 'U'}],
'remote_members': [{},
{'flag': '{EF}',
'interface': 'GE6/0/1',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/2',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]},
'2': {'local_members': [{},
{'flag': '{ACG}',
'interface': 'GE6/0/3',
'oper_key': '2',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/4',
'oper_key': '2',
'priority': '32768',
'status': 'U'}],
'remote_members': [{},
{'flag': '{EF}',
'interface': 'GE6/0/3',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/4',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]}}}]]
The above results contain an empty dictionary list item. This is because
local_members* and remote_members* use * to indicate a list item;
as a result self.dict_by_path was returning E as a list element,
and results were appended to that element even though the results were an
empty dictionary. The saving logic was updated to check whether results
are empty and, if so, to skip appending them.
"""
data = """
Loadsharing Type: Shar -- Loadsharing, NonS -- Non-Loadsharing
Port Status: S -- Selected, U -- Unselected,
I -- Individual, * -- Management port
Flags: A -- LACP_Activity, B -- LACP_Timeout, C -- Aggregation,
D -- Synchronization, E -- Collecting, F -- Distributing,
G -- Defaulted, H -- Expired
Aggregate Interface: Bridge-Aggregation1
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/1 U 32768 1 {ACG}
GE6/0/2 U 32768 1 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/1 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/2 0 32768 0 0x8000, 0000-0000-0000 {EF}
Aggregate Interface: Bridge-Aggregation2
Aggregation Mode: Dynamic
Loadsharing Type: Shar
Management VLAN : None
System ID: 0x8000, d07e-28b5-a200
Local:
Port Status Priority Oper-Key Flag
--------------------------------------------------------------------------------
GE6/0/3 U 32768 2 {ACG}
GE6/0/4 U 32768 2 {ACG}
Remote:
Actor Partner Priority Oper-Key SystemID Flag
--------------------------------------------------------------------------------
GE6/0/3 0 32768 0 0x8000, 0000-0000-0000 {EF}
GE6/0/4 0 32768 0 0x8000, 0000-0000-0000 {EF}
"""
template = """
<group name = "portchannel.{{channel_number}}">
Aggregate Interface: Bridge-Aggregation{{ channel_number}}
<group name = "local_members*">
Local: {{_start_}}
<group>
{{interface }} {{status}} {{priority}} {{oper_key | DIGIT }} {{flag}}
</group>
</group>
<group name = "remote_members*">
Remote: {{_start_}}
<group>
{{interface }} {{status}} {{priority}} {{oper_key}} {{sys_id}}, {{ mac | MAC }} {{flag}}
</group>
</group>
</group>
"""
parser = ttp(data, template)
parser.parse()
res = parser.result()
# pprint.pprint(res)
assert res == [[{'portchannel': {'1': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/1',
'oper_key': '1',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/2',
'oper_key': '1',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/1',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/2',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]},
'2': {'local_members': [{'flag': '{ACG}',
'interface': 'GE6/0/3',
'oper_key': '2',
'priority': '32768',
'status': 'U'},
{'flag': '{ACG}',
'interface': 'GE6/0/4',
'oper_key': '2',
'priority': '32768',
'status': 'U'}],
'remote_members': [{'flag': '{EF}',
'interface': 'GE6/0/3',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'},
{'flag': '{EF}',
'interface': 'GE6/0/4',
'mac': '0000-0000-0000',
'oper_key': '0',
'priority': '32768',
'status': '0',
'sys_id': '0x8000'}]}}}]]
# test_issue_57_one_more_empty_dict_in_res()
# --- File: mayan/apps/common/exceptions.py (repo: eshbeata/open-paperless, license: Apache-2.0) ---
from __future__ import unicode_literals


class BaseCommonException(Exception):
    """
    Base exception for the common app
    """
    pass


class NotLatestVersion(BaseCommonException):
    """
    The installed version is not the latest available version
    """
    def __init__(self, upstream_version):
        self.upstream_version = upstream_version
# --- File: aospy_user/calcs/gms.py (repo: spencerahill/aospy-obj-lib, license: Apache-2.0) ---
"""Gross moist stability-related quantities."""
from aospy.constants import c_p, grav, L_v
from aospy.utils.vertcoord import to_pascal
from indiff.deriv import EtaCenDeriv, CenDeriv
import numpy as np

from .. import PLEVEL_STR
from . import horiz_divg, vert_divg
from .thermo import dse, mse, fmse


def field_vert_int_max(arr, dp):
    """Maximum magnitude of integral of a field from surface up."""
    dp = to_pascal(dp)
    # 2015-05-15: Problem: Sigma data indexing starts at TOA, while pressure
    #             data indexing starts at 1000 hPa.  So for now only do for
    #             sigma data and flip array direction to start from sfc.
    arr_dp_g = (arr*dp)[::-1] / grav
    # Input array dimensions are assumed ([time dims,] level, lat, lon).
    pos_max = np.amax(np.cumsum(arr_dp_g, axis=0), axis=-3)
    neg_max = np.amin(np.cumsum(arr_dp_g, axis=0), axis=-3)
    # Flip sign because integrating from p_sfc up, i.e. with dp negative.
    return -1*np.where(pos_max > -neg_max, pos_max, neg_max)


def horiz_divg_vert_int_max(u, v, radius, dp):
    """Maximum magnitude of integral upwards of horizontal divergence."""
    return field_vert_int_max(horiz_divg(u, v, radius, dp), dp)


def vert_divg_vert_int_max(omega, p, dp):
    """Maximum magnitude of integral from surface up of vertical divergence."""
    return field_vert_int_max(vert_divg(omega, p, dp), dp)


def gms_like_ratio(weights, tracer, dp):
    """Compute ratio of integrals in the style of gross moist stability."""
    # Integrate weights over lower tropospheric layer
    dp = to_pascal(dp)
    denominator = field_vert_int_max(weights, dp)
    # Integrate tracer*weights over whole column and divide.
    numerator = np.sum(weights*tracer*dp, axis=-3) / grav
    return numerator / denominator


def gross_moist_strat(sphum, u, v, radius, dp):
    """Gross moisture stratification, in horizontal divergence form."""
    divg = horiz_divg(u, v, radius)
    return L_v*gms_like_ratio(divg, sphum, dp)


def gross_dry_stab(temp, hght, u, v, radius, dp):
    """Gross dry stability, in horizontal divergence form."""
    divg = horiz_divg(u, v, radius)
    return -gms_like_ratio(divg, dse(temp, hght), dp)


def gross_moist_stab(temp, hght, sphum, u, v, radius, dp):
    """Gross moist stability, in horizontal divergence form."""
    divg = horiz_divg(u, v, radius)
    return -gms_like_ratio(divg, mse(temp, hght, sphum), dp)


def gms_up_low(temp, hght, sphum, level, lev_up=400., lev_dn=925.):
    """Gross moist stability. Upper minus lower level MSE."""
    m = mse(temp, hght, sphum)
    return (np.squeeze(m[np.where(level == lev_up)] -
                       m[np.where(level == lev_dn)])/c_p)


def gms_each_level(temp, hght, sphum, level, lev_dn=925.):
    m = mse(temp, hght, sphum)
    return (m - m[np.where(level == lev_dn)])/c_p


def dry_static_stab(temp, hght, level, lev_dn=925.):
    """Dry static stability, in terms of dry static energy."""
    d = dse(temp, hght)
    return (d - d[np.where(level == lev_dn)])/c_p


def frozen_moist_static_stab(temp, hght, sphum, q_ice, ps, bk, pk):
    """Frozen moist static stability using model-native coordinate data."""
    return EtaCenDeriv(fmse(temp, hght, sphum, q_ice), pk, bk, ps, order=2,
                       fill_edge=True).deriv()


def moist_static_stab(temp, hght, sphum, ps, bk, pk):
    """Moist static stability using model-native coordinate data. No ice."""
    return EtaCenDeriv(mse(temp, hght, sphum), pk, bk, ps, order=2,
                       fill_edge=True).deriv()


def frozen_moist_static_stab_p(temp, hght, sphum, q_ice):
    """Frozen moist static stability using pressure-interpolated data.

    Note that the values in the stratosphere become unphysical using pressure
    interpolated data, but otherwise in the troposphere they agree well with
    data on model-native coordinates.
    """
    p = to_pascal(temp[PLEVEL_STR])
    return CenDeriv(fmse(temp, hght, sphum, q_ice), PLEVEL_STR, coord=p,
                    order=2, fill_edge=True).deriv()


def moist_static_stab_p(temp, hght, sphum):
    """Moist static stability using pressure-interpolated data. No ice.

    Note that the values in the stratosphere become unphysical using pressure
    interpolated data, but otherwise in the troposphere they agree well with
    data on model-native coordinates.
    """
    p = to_pascal(temp[PLEVEL_STR])
    return CenDeriv(mse(temp, hght, sphum), PLEVEL_STR, coord=p,
                    order=2, fill_edge=True).deriv()
# --- File: pyecs/components/pose.py (repo: xaedes/pyecs, license: MIT) ---
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
from __future__ import division  # float division by default; integer division is still available explicitly via '//'
import math
from pyecs import *
class Pose(Component):
  """A 2-D pose: position (x, y) plus an orientation angle in degrees."""
def __init__(self, x, y, angle=0, *args,**kwargs):
super(Pose, self).__init__(*args,**kwargs)
self.x = x
self.y = y
self.angle = angle # in degree
def distance_to(self, pose):
dx,dy = self.vector_to(pose)
return math.sqrt(dx*dx+dy*dy)
def distance_to_xy(self, x, y):
dx,dy = self.vector_to_xy(x, y)
return math.sqrt(dx*dx+dy*dy)
def vector_to(self, pose):
dx = pose.x - self.x
dy = pose.y - self.y
return (dx,dy)
def vector_to_xy(self, x, y):
dx = x - self.x
dy = y - self.y
return (dx,dy) | 28.548387 | 123 | 0.581921 | 133 | 885 | 3.714286 | 0.345865 | 0.060729 | 0.036437 | 0.048583 | 0.275304 | 0.214575 | 0.101215 | 0.101215 | 0 | 0 | 0 | 0.004724 | 0.282486 | 885 | 31 | 124 | 28.548387 | 0.773228 | 0.179661 | 0 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0 | 0.130435 | 0 | 0.565217 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
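A quick usage sketch of the geometry helpers in `Pose` (standalone: the `Component` base is stubbed out here, since pyecs is not needed for the math):

```python
import math

class Component(object):
    """Stand-in for pyecs.Component, just enough for this sketch."""
    def __init__(self, *args, **kwargs):
        pass

class Pose(Component):
    def __init__(self, x, y, angle=0, *args, **kwargs):
        super(Pose, self).__init__(*args, **kwargs)
        self.x = x
        self.y = y
        self.angle = angle  # in degrees

    def vector_to_xy(self, x, y):
        # displacement from this pose to the point (x, y)
        return (x - self.x, y - self.y)

    def distance_to_xy(self, x, y):
        # Euclidean distance to the point (x, y)
        dx, dy = self.vector_to_xy(x, y)
        return math.sqrt(dx * dx + dy * dy)

origin = Pose(0, 0)
print(origin.distance_to_xy(3, 4))  # 5.0
```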
49a36987764a223772be4dedced6ee33d1395a8b | 1,073 | py | Python | workspace/Pipeline/gameApp/etl/stream/streamETL.py | yennanliu/Python_basics | 6a597442d39468295946cefbfb11d08f61424dc3 | [
"Unlicense"
] | 18 | 2019-08-01T07:45:02.000Z | 2022-03-31T18:05:44.000Z | workspace/Pipeline/gameApp/etl/stream/streamETL.py | yennanliu/Python_basics | 6a597442d39468295946cefbfb11d08f61424dc3 | [
"Unlicense"
] | null | null | null | workspace/Pipeline/gameApp/etl/stream/streamETL.py | yennanliu/Python_basics | 6a597442d39468295946cefbfb11d08f61424dc3 | [
"Unlicense"
] | 15 | 2019-12-29T08:46:20.000Z | 2022-03-08T14:14:05.000Z | """
Class to extract stream event data from an API endpoint / stream event files
https://2.python-requests.org/en/master/user/advanced/#body-content-workflow
"""
import requests
class StreamETL:
def __init__(self, cfg):
self.cfg = cfg
self.stream_file = "stream.txt"
self.end_point = "https://myAwesomeApi/v1/events"
# https://stackoverflow.com/questions/57497833/python-requests-stream-data-from-api
    def get_stream(self):
        s = requests.Session()
        with s.get(self.end_point, headers=None, stream=True) as response:
            for line in response.iter_lines():
                if line:
                    self.save_stream(line.decode("utf-8"))

    def save_stream(self, line):
        # append each event line to the local stream file
        with open(self.stream_file, "a") as f:
            try:
                f.write(line + "\n")
            except Exception as e:
                print("save_stream failed: %s" % e)

    def normalize_stream(self, stream):
        pass

    def verify_stream(self, stream):
pass | 27.512821 | 87 | 0.578751 | 128 | 1,073 | 4.742188 | 0.539063 | 0.049423 | 0.046129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01368 | 0.318733 | 1,073 | 39 | 88 | 27.512821 | 0.816689 | 0.236719 | 0 | 0.086957 | 0 | 0 | 0.074166 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0.086957 | 0.043478 | 0 | 0.304348 | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
49a3774b3e6a5ad97936b0ac1ba43d9972ce59df | 10,637 | py | Python | python/test/mapreduce/util_test.py | Batterii/appengine-mapreduce | f4393e2347d0b841d62f34383fcdf849ef922309 | [
"Apache-2.0"
] | 228 | 2015-01-09T20:32:25.000Z | 2021-11-18T13:20:29.000Z | python/test/mapreduce/util_test.py | Batterii/appengine-mapreduce | f4393e2347d0b841d62f34383fcdf849ef922309 | [
"Apache-2.0"
] | 82 | 2015-01-13T15:12:15.000Z | 2020-10-30T15:26:41.000Z | python/test/mapreduce/util_test.py | Batterii/appengine-mapreduce | f4393e2347d0b841d62f34383fcdf849ef922309 | [
"Apache-2.0"
] | 127 | 2015-01-08T19:50:03.000Z | 2021-10-09T16:47:12.000Z | #!/usr/bin/env python
# Copyright 2010 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=g-bad-name
import datetime
import os
import sys
import unittest
from google.appengine.api import taskqueue
from mapreduce import model
from mapreduce import parameters
from mapreduce import util
class TestHandler(object):
"""Test handler class."""
def __call__(self, entity):
pass
def process(self, entity):
pass
@staticmethod
def process2(entity):
pass
@classmethod
def process3(cls):
pass
# pylint: disable=unused-argument
def test_handler_function(entity):
"""Empty test handler function."""
pass
class TestHandlerWithArgs(object):
"""Test handler with argument in constructor."""
def __init__(self, arg_unused):
"""Constructor."""
pass
def process(self, entity):
"""Empty process function."""
pass
# pylint: disable=g-old-style-class
class TestHandlerOldStyle():
"""Old style class."""
def __call__(self, entity):
pass
# pylint: disable=unused-argument
def test_handler_yield(entity):
"""Yielding handler function."""
yield 1
yield 2
class MockMapreduceSpec:
"""Mock MapreduceSpec class."""
def __init__(self):
self.params = {}
class ForNameTest(unittest.TestCase):
"""Test util.for_name function."""
def testClassName(self):
"""Test passing fq class name."""
self.assertEquals(TestHandler, util.for_name("__main__.TestHandler"))
def testFunctionName(self):
"""Test passing function name."""
self.assertEquals(test_handler_function,
util.for_name("__main__.test_handler_function"))
def testMethodName(self):
"""Test passing method name."""
self.assertEquals(TestHandler.process,
util.for_name("__main__.TestHandler.process"))
def testClassWithArgs(self):
"""Test passing method name of class with constructor args."""
self.assertEquals(TestHandlerWithArgs.process,
util.for_name("__main__.TestHandlerWithArgs.process"))
def testBadModule(self):
"""Tests when the module name is bogus."""
try:
util.for_name("this_is_a_bad_module_name.stuff")
except ImportError, e:
self.assertEquals(
"Could not find 'stuff' on path 'this_is_a_bad_module_name'",
str(e))
else:
self.fail("Did not raise exception")
def testBadFunction(self):
"""Tests when the module name is good but the function is missing."""
try:
util.for_name("__main__.does_not_exist")
except ImportError, e:
self.assertEquals(
"Could not find 'does_not_exist' on path '__main__'",
str(e))
else:
self.fail("Did not raise exception")
def testBadClass(self):
"""Tests when the class is found but the function name is missing."""
try:
util.for_name("__main__.TestHandlerWithArgs.missing")
except ImportError, e:
self.assertEquals(
"Could not find 'missing' on path '__main__.TestHandlerWithArgs'",
str(e))
else:
self.fail("Did not raise exception")
def testGlobalName(self):
"""Tests when the name has no dots in it."""
try:
util.for_name("this_is_a_bad_module_name")
except ImportError, e:
self.assertTrue(str(e).startswith(
"Could not find 'this_is_a_bad_module_name' on path "))
else:
self.fail("Did not raise exception")
class TestGetQueueName(unittest.TestCase):
def testGetQueueName(self):
self.assertEqual("foo", util.get_queue_name("foo"))
os.environ["HTTP_X_APPENGINE_QUEUENAME"] = "foo"
self.assertEqual("foo", util.get_queue_name(None))
os.environ["HTTP_X_APPENGINE_QUEUENAME"] = "__cron"
self.assertEqual(parameters.config.QUEUE_NAME, util.get_queue_name(None))
class SerializeHandlerTest(unittest.TestCase):
"""Test util.try_*serialize_handler works on various types."""
def testNonSerializableTypes(self):
# function.
self.assertEquals(None, util.try_serialize_handler(test_handler_function))
# Unbound method.
self.assertEquals(None, util.try_serialize_handler(TestHandler.process))
# bounded method.
self.assertEquals(None, util.try_serialize_handler(TestHandler().process))
# class method.
self.assertEquals(None, util.try_serialize_handler(TestHandler.process3))
# staticmethod, which is really a function.
self.assertEquals(None, util.try_serialize_handler(TestHandler.process2))
def testSerializableTypes(self):
# new style callable instance.
i = TestHandler()
self.assertNotEquals(
None, util.try_deserialize_handler(util.try_serialize_handler(i)))
i = TestHandlerOldStyle()
self.assertNotEquals(
None, util.try_deserialize_handler(util.try_serialize_handler(i)))
class IsGeneratorFunctionTest(unittest.TestCase):
"""Test util.is_generator function."""
def testGenerator(self):
self.assertTrue(util.is_generator(test_handler_yield))
def testNotGenerator(self):
self.assertFalse(util.is_generator(test_handler_function))
class GetTaskHeadersTest(unittest.TestCase):
def setUp(self):
super(GetTaskHeadersTest, self).setUp()
os.environ["CURRENT_VERSION_ID"] = "v7.1"
os.environ["CURRENT_MODULE_ID"] = "foo-module"
os.environ["DEFAULT_VERSION_HOSTNAME"] = "foo.appspot.com"
def testGetTaskHost(self):
self.assertEqual("v7.foo-module.foo.appspot.com", util._get_task_host())
task = taskqueue.Task(url="/relative_url",
headers={"Host": util._get_task_host()})
self.assertEqual("v7.foo-module.foo.appspot.com",
task.headers["Host"])
self.assertEqual("v7.foo-module", task.target)
def testGetTaskHostDefaultModule(self):
os.environ["CURRENT_MODULE_ID"] = "default"
self.assertEqual("v7.foo.appspot.com", util._get_task_host())
task = taskqueue.Task(url="/relative_url",
headers={"Host": util._get_task_host()})
self.assertEqual("v7.foo.appspot.com",
task.headers["Host"])
self.assertEqual("v7", task.target)
def testGetTaskHeaders(self):
mr_spec = model.MapreduceSpec(
name="foo", mapreduce_id="foo_id",
mapper_spec=model.MapperSpec("foo", "foo", {}, 8).to_json())
task = taskqueue.Task(url="/relative_url",
headers=util._get_task_headers(mr_spec.mapreduce_id))
self.assertEqual("foo_id", task.headers[util._MR_ID_TASK_HEADER])
self.assertEqual("v7.foo-module.foo.appspot.com",
task.headers["Host"])
self.assertEqual("v7.foo-module", task.target)
class GetShortNameTest(unittest.TestCase):
"""Test util.get_short_name function."""
def testGetShortName(self):
self.assertEquals("blah", util.get_short_name("blah"))
self.assertEquals("blah", util.get_short_name(".blah"))
self.assertEquals("blah", util.get_short_name("__mmm__.blah"))
self.assertEquals("blah", util.get_short_name("__mmm__.Krb.blah"))
class TotalSecondsTest(unittest.TestCase):
"""Test util.total_seconds."""
def testTotalSeconds(self):
td = datetime.timedelta(days=1, seconds=1)
self.assertEqual(24 * 60 * 60 + 1, util.total_seconds(td))
td = datetime.timedelta(days=1, seconds=1, microseconds=1)
self.assertEqual(24 * 60 * 60 + 2, util.total_seconds(td))
class ParseBoolTest(unittest.TestCase):
"""Test util.parse_bool function."""
def testParseBool(self):
self.assertEquals(True, util.parse_bool(True))
self.assertEquals(False, util.parse_bool(False))
self.assertEquals(True, util.parse_bool("True"))
self.assertEquals(False, util.parse_bool("False"))
self.assertEquals(True, util.parse_bool(1))
self.assertEquals(False, util.parse_bool(0))
self.assertEquals(True, util.parse_bool("on"))
self.assertEquals(False, util.parse_bool("off"))
class CreateConfigTest(unittest.TestCase):
"""Test create_datastore_write_config function."""
def setUp(self):
super(CreateConfigTest, self).setUp()
self.spec = MockMapreduceSpec()
def testDefaultConfig(self):
config = util.create_datastore_write_config(self.spec)
self.assertTrue(config)
self.assertFalse(config.force_writes)
def testForceWrites(self):
self.spec.params["force_writes"] = "True"
config = util.create_datastore_write_config(self.spec)
self.assertTrue(config)
self.assertTrue(config.force_writes)
class FooClass(object):
pass
class ObjToPathTest(unittest.TestCase):
def setUp(self):
super(ObjToPathTest, self).setUp()
self.sys_modules = sys.modules
def tearDown(self):
super(ObjToPathTest, self).tearDown()
sys.modules = self.sys_modules
def testBasic(self):
self.assertEqual(None, util._obj_to_path(None))
self.assertEqual("__main__.FooClass", util._obj_to_path(FooClass))
self.assertEqual("__main__.test_handler_function",
util._obj_to_path(test_handler_function))
@staticmethod
def foo():
pass
class FooClass2(object):
pass
def testNotTopLevel(self):
self.assertRaises(ValueError, util._obj_to_path, self.FooClass2)
def testNotTopLevel2(self):
self.assertRaises(ValueError, util._obj_to_path, self.foo)
def testUnexpectedType(self):
self.assertRaises(TypeError, util._obj_to_path, self.testUnexpectedType)
class GetDescendingKeyTest(unittest.TestCase):
"""Tests the _get_descending_key function."""
def testBasic(self):
"""Basic test of the function."""
now = 1234567890
os.environ["REQUEST_ID_HASH"] = "12345678"
self.assertEquals(
"159453012940012345678",
util._get_descending_key(
gettime=lambda: now))
class StripPrefixFromItemsTest(unittest.TestCase):
"""Tests the strip_prefix_from_items function."""
def testBasic(self):
"""Basic test of the function."""
items = ["/foo/bar", "/foos/bar2", "/bar3"]
prefix = "/foo/"
self.assertEquals(["bar", "/foos/bar2", "/bar3"],
util.strip_prefix_from_items(prefix, items))
if __name__ == "__main__":
unittest.main()
| 29.547222 | 79 | 0.700197 | 1,297 | 10,637 | 5.535852 | 0.224364 | 0.057939 | 0.013788 | 0.025627 | 0.420056 | 0.370752 | 0.317967 | 0.274513 | 0.223398 | 0.17688 | 0 | 0.010508 | 0.17693 | 10,637 | 359 | 80 | 29.629526 | 0.809595 | 0.079158 | 0 | 0.306977 | 0 | 0 | 0.136108 | 0.058283 | 0 | 0 | 0 | 0 | 0.255814 | 0 | null | null | 0.051163 | 0.055814 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
49a66ce930ebe50a057a917ae5b27b04e3cbd0c6 | 704 | py | Python | judge/templatetags/gravatar_tags.py | TheAvidDev/pnoj-site | 63299e873b1fb654667545222ce2b3157e78acd9 | [
"MIT"
] | 2 | 2020-04-02T19:50:03.000Z | 2020-08-06T18:30:25.000Z | judge/templatetags/gravatar_tags.py | TheAvidDev/pnoj-site | 63299e873b1fb654667545222ce2b3157e78acd9 | [
"MIT"
] | 28 | 2020-03-19T16:29:58.000Z | 2021-09-22T18:47:30.000Z | judge/templatetags/gravatar_tags.py | TheAvidDev/pnoj-site | 63299e873b1fb654667545222ce2b3157e78acd9 | [
"MIT"
] | 2 | 2020-08-09T06:23:12.000Z | 2020-10-13T00:13:25.000Z | import hashlib
import urllib.parse
from django import template
from django.utils.safestring import mark_safe
register = template.Library()
# return only the URL of the gravatar
# TEMPLATE USE: {{ email|gravatar_url:150 }}
@register.filter
def gravatar_url(email, size=40):
email = email.encode('utf-8')
return "https://www.gravatar.com/avatar/%s?%s" % (hashlib.md5(email.lower()).hexdigest(), urllib.parse.urlencode({'d': 'retro', 's':str(size)}))
# return an image tag with the gravatar
# TEMPLATE USE: {{ email|gravatar:150 }}
@register.filter
def gravatar(email, size=40):
url = gravatar_url(email, size)
return mark_safe('<img src="%s" width="%d" height="%d">' % (url, size, size))
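Outside of a Django template, the same URL construction can be exercised directly with the standard library (a sketch mirroring the `gravatar_url` filter above; the example address is made up):

```python
import hashlib
import urllib.parse

def gravatar_url(email, size=40):
    # md5 of the UTF-8 encoded, lower-cased address, plus the query string
    digest = hashlib.md5(email.encode("utf-8").lower()).hexdigest()
    query = urllib.parse.urlencode({"d": "retro", "s": str(size)})
    return "https://www.gravatar.com/avatar/%s?%s" % (digest, query)

print(gravatar_url("someone@example.com", size=150))
```

Lower-casing before hashing makes the URL case-insensitive in the address, as Gravatar expects.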
| 33.52381 | 148 | 0.698864 | 102 | 704 | 4.77451 | 0.480392 | 0.067762 | 0.078029 | 0.090349 | 0.258727 | 0.143737 | 0 | 0 | 0 | 0 | 0 | 0.019802 | 0.139205 | 704 | 20 | 149 | 35.2 | 0.783828 | 0.223011 | 0 | 0.153846 | 0 | 0 | 0.158672 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.307692 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
49afdc01c13f58495efaad7985a4e782fe376fab | 1,347 | py | Python | tests/legacy_pytests/multiple_symlink_levels/testme.py | depaul-dice/provenance-to-use | e16e2824fbbe0b4e09cc50f0d2bcec3400bf4b87 | [
"BSD-3-Clause"
] | null | null | null | tests/legacy_pytests/multiple_symlink_levels/testme.py | depaul-dice/provenance-to-use | e16e2824fbbe0b4e09cc50f0d2bcec3400bf4b87 | [
"BSD-3-Clause"
] | null | null | null | tests/legacy_pytests/multiple_symlink_levels/testme.py | depaul-dice/provenance-to-use | e16e2824fbbe0b4e09cc50f0d2bcec3400bf4b87 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python2
'''
some programs like java are really picky about the EXACT directory
structure being replicated within cde-package. e.g., java will refuse
to start unless the directory structure is perfectly mimicked (since it
uses its true path to load start-up libraries). this means that CDE
Needs to be able to potentially traverse through multiple levels of
symlinks and faithfully recreate them within cde-package.
For example, on chongzi (Fedora Core 9):
/usr/bin/java is a symlink to /etc/alternatives/java
but /etc/alternatives/java is itself a symlink to /usr/lib/jvm/jre-1.6.0-openjdk/bin/java
this example involves 2 levels of symlinks, and java requires that the
TRUE binary to be found here in the package in order to run properly:
/usr/lib/jvm/jre-1.6.0-openjdk/bin/java
'''
import sys
sys.path.insert(0, '..')
from cde_test_common import *
def checker_func():
assert os.path.islink(CDE_ROOT_DIR + '/home/pgbovine/CDE/tests/multiple_symlink_levels/fake-root/usr/bin/java')
assert os.path.islink(CDE_ROOT_DIR + '/home/pgbovine/CDE/tests/multiple_symlink_levels/fake-root/etc/alternatives/java')
assert os.path.isfile(CDE_ROOT_DIR + '/home/pgbovine/CDE/tests/multiple_symlink_levels/fake-root/usr/lib/jvm/jre-1.6.0-openjdk/bin/java')
generic_test_runner(["cat", "fake-root/usr/bin/java"], checker_func)
| 37.416667 | 139 | 0.775798 | 230 | 1,347 | 4.465217 | 0.473913 | 0.040896 | 0.029211 | 0.035054 | 0.319377 | 0.295034 | 0.295034 | 0.295034 | 0.295034 | 0.295034 | 0 | 0.010952 | 0.118782 | 1,347 | 35 | 140 | 38.485714 | 0.854254 | 0.592428 | 0 | 0 | 0 | 0.125 | 0.511152 | 0.501859 | 0 | 0 | 0 | 0 | 0.375 | 1 | 0.125 | true | 0 | 0.25 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
49b0bc3f78687122c2de6e7a9663ee951ba7a00c | 1,068 | py | Python | slt/models.py | Jamesoc23/slt-website | 67ba28caabb0fe03991931fa95f7adf52e8336b6 | [
"MIT"
] | null | null | null | slt/models.py | Jamesoc23/slt-website | 67ba28caabb0fe03991931fa95f7adf52e8336b6 | [
"MIT"
] | null | null | null | slt/models.py | Jamesoc23/slt-website | 67ba28caabb0fe03991931fa95f7adf52e8336b6 | [
"MIT"
] | null | null | null | from django.db import models
from django.core.files.storage import FileSystemStorage
from django.conf import settings
image_storage = FileSystemStorage(
# Physical file location ROOT
location=u'{0}'.format(settings.MEDIA_ROOT),
# Url for file
base_url=u'{0}'.format(settings.MEDIA_URL),
)
def image_directory_path(instance, filename):
# file will be uploaded to MEDIA_ROOT/my_sell/picture/<filename>
return u'picture/{0}'.format(filename)
class Service(models.Model):
title = models.CharField(max_length=100, default='')
body = models.TextField()
image = models.ImageField(upload_to=image_directory_path, storage=image_storage, blank=True, null=True)
def __str__(self):
return self.title
class ContactInfo(models.Model):
email = models.CharField(max_length=100)
phone = models.CharField(max_length=10)
social = models.CharField(max_length=200)
def __str__(self):
return self.email
class About(models.Model):
about = models.TextField()
def __str__(self):
return self.about | 28.864865 | 107 | 0.725655 | 141 | 1,068 | 5.297872 | 0.432624 | 0.080321 | 0.096386 | 0.128514 | 0.208835 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01573 | 0.166667 | 1,068 | 37 | 108 | 28.864865 | 0.823596 | 0.096442 | 0 | 0.12 | 0 | 0 | 0.017672 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.16 | false | 0 | 0.12 | 0.16 | 0.84 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
49b58379748472f0525ef68e440e9dda037b02e4 | 486 | py | Python | App/api/stats.py | MaiXiaochai/SnailAPI | 2de24f2d1f0b93dc743210b1d66ef2665a09ba40 | [
"Apache-2.0"
] | null | null | null | App/api/stats.py | MaiXiaochai/SnailAPI | 2de24f2d1f0b93dc743210b1d66ef2665a09ba40 | [
"Apache-2.0"
] | 9 | 2020-09-14T02:02:31.000Z | 2020-09-14T02:08:20.000Z | App/api/stats.py | MaiXiaochai/SnailAPI | 2de24f2d1f0b93dc743210b1d66ef2665a09ba40 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
--------------------------------------
@File : stats.py
@Author : maixiaochai
@Email : maixiaochai@outlook.com
@CreatedOn : 2020/6/8 23:29
--------------------------------------
Some statistics:
1) Job counts: total, succeeded, failed, running, finished
2) Job count distribution
3) Historical success-rate statistics
4) Hardware statistics: disk, CPU, memory, file count
5) Log count
6) Run-duration rankings: daily, weekly, monthly, yearly
7) Business run-time rankings and business distribution
8) Job modification-count rankings and business distribution
10) API access statistics: total, succeeded, failed
11) User access statistics: total, succeeded, failed, top
...
Some runtime states:
1) A given job is [running|finished|succeeded|failed]
"""
| 18.692308 | 38 | 0.485597 | 68 | 486 | 3.470588 | 0.794118 | 0.067797 | 0.076271 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062827 | 0.213992 | 486 | 25 | 39 | 19.44 | 0.554974 | 0.975309 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
49b84cd454d2a34df7fb61b551cda4c3900ca093 | 2,639 | py | Python | cctexconvert/texturemap.py | Fam0r/CCTexConvert | 502948e6e87c736d8430097bc08be3113b4ff117 | [
"WTFPL"
] | null | null | null | cctexconvert/texturemap.py | Fam0r/CCTexConvert | 502948e6e87c736d8430097bc08be3113b4ff117 | [
"WTFPL"
] | null | null | null | cctexconvert/texturemap.py | Fam0r/CCTexConvert | 502948e6e87c736d8430097bc08be3113b4ff117 | [
"WTFPL"
] | null | null | null | # A file containing mappings of CC -> MC file names
# Pack version 3
VER3 = {
'char.png': 'minecraft/textures/entity/steve.png',
'chicken.png': 'minecraft/textures/entity/chicken.png',
'creeper.png': 'minecraft/textures/entity/creeper/creeper.png',
'pig.png': 'minecraft/textures/entity/pig/pig.png',
'sheep.png': 'minecraft/textures/entity/sheep/sheep.png',
'sheep_fur.png': 'minecraft/textures/entity/sheep/sheep_fur.png',
'skeleton.png': 'minecraft/textures/entity/skeleton/skeleton.png',
'spider.png': 'minecraft/textures/entity/spider/spider.png',
'zombie.png': 'minecraft/textures/entity/zombie/zombie.png',
'default.png': 'minecraft/textures/font/ascii.png',
'icons.png': 'minecraft/textures/gui/icons.png',
'gui.png': 'minecraft/textures/gui/widgets.png',
'gui_classic.png': 'minecraft/textures/gui/widgets.png'
}
# Blocks
BLOCKS3 = [
    # glass_pane_top_brown is a substitute for rope
'grass_top', 'stone', 'dirt', 'grass_side',
'planks_oak', 'stone_slab_side', 'stone_slab_top', 'brick',
'tnt_side', 'tnt_top', 'tnt_bottom', ('glass_pane_top_brown', 'glass_pane_top'),
'flower_rose', 'flower_dandelion', 'water_still', 'sapling_oak',
'cobblestone', 'bedrock', 'sand', 'gravel',
'log_oak', 'log_oak_top', 'leaves_oak', 'iron_block',
'gold_block', 'sandstone_top', 'quartz_block_lines_top', None,
'mushroom_red', 'mushroom_brown', 'lava_still', 'grass_top',
# TODO: fire_layer_0 is an animation sheet, only get first frame for terrain.png
'gold_ore', 'iron_ore', 'coal_ore', 'bookshelf',
'cobblestone_mossy', 'obsidian', 'fire_layer_0', 'iron_block',
'gold_block', 'sandstone_normal', 'quartz_block_lines', None,
None, None, None, None,
# shulker_top_brown is crate
'sponge', 'glass', 'snow', 'ice',
'stonebrick', 'shulker_top_brown', 'quartz_block_side', 'iron_block',
'gold_block', 'sandstone_bottom', 'quartz_block_lines_top', None,
None, None, None, None,
# TODO: Some colours don't exist in modern minecraft, get them from hex colours instead
'wool_colored_red', 'wool_colored_orange', 'wool_colored_yellow', 'wool_colored_lime',
'wool_colored_green', 'TEAL', 'wool_colored_light_blue', 'wool_colored_cyan',
'BLUE', 'wool_colored_purple', 'VIOLET', 'wool_colored_magenta',
'PINK', 'wool_colored_black', 'wool_colored_gray', 'wool_colored_white',
'wool_colored_pink', 'FOREST GREEN', 'wool_colored_brown', 'wool_colored_blue',
'TURQUOISE', None, 'magma'
# TODO: Are the breaking state textures at the bottom ever used by anyone? Don't transfer them over for now.
] | 47.981818 | 112 | 0.704055 | 353 | 2,639 | 4.994334 | 0.402266 | 0.09359 | 0.147476 | 0.132728 | 0.170732 | 0.078276 | 0 | 0 | 0 | 0 | 0 | 0.002199 | 0.13831 | 2,639 | 55 | 113 | 47.981818 | 0.773087 | 0.156878 | 0 | 0.051282 | 0 | 0 | 0.692828 | 0.258457 | 0 | 0 | 0 | 0.018182 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
49cb4219e0656bfdd6ec1f8ec48b7a2986447b61 | 244 | py | Python | core/src/main/resources/tf_algos/easytransfer/run_easytransfer_train_main.py | starburst-project/Alink | 2e3b6d20cd683211b0209141d1fcb3e0bce01b8d | [
"Apache-2.0"
] | 3,301 | 2018-10-01T16:30:44.000Z | 2022-03-30T08:07:16.000Z | core/src/main/resources/tf_algos/easytransfer/run_easytransfer_train_main.py | starburst-project/Alink | 2e3b6d20cd683211b0209141d1fcb3e0bce01b8d | [
"Apache-2.0"
] | 206 | 2019-11-27T14:04:42.000Z | 2022-03-28T08:02:05.000Z | core/src/main/resources/tf_algos/easytransfer/run_easytransfer_train_main.py | starburst-project/Alink | 2e3b6d20cd683211b0209141d1fcb3e0bce01b8d | [
"Apache-2.0"
] | 765 | 2018-10-09T02:02:19.000Z | 2022-03-31T12:06:21.000Z | import os
os.environ["HOME"] = os.path.expanduser('~')
from akdl.models.tf.easytransfer import easytransfer_main
from akdl.runner.config import TrainTaskConfig
def main(task_config: TrainTaskConfig):
easytransfer_main.main(task_config)
| 22.181818 | 57 | 0.795082 | 32 | 244 | 5.9375 | 0.53125 | 0.084211 | 0.147368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102459 | 244 | 10 | 58 | 24.4 | 0.86758 | 0 | 0 | 0 | 0 | 0 | 0.020492 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.5 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
49d04dd9c352124debe427155af9d363f26dc209 | 302 | py | Python | python/30 days of code/day26.py | angelopassaro/Hacktoberfest-1 | 21f90f5d49efba9b1a27f4d9b923f5017ab43f0e | [
"Apache-2.0"
] | 1 | 2020-10-06T01:20:07.000Z | 2020-10-06T01:20:07.000Z | python/30 days of code/day26.py | angelopassaro/Hacktoberfest-1 | 21f90f5d49efba9b1a27f4d9b923f5017ab43f0e | [
"Apache-2.0"
] | null | null | null | python/30 days of code/day26.py | angelopassaro/Hacktoberfest-1 | 21f90f5d49efba9b1a27f4d9b923f5017ab43f0e | [
"Apache-2.0"
] | null | null | null | x,y,z = map(int,input().split(' '))
a,b,c = map(int,input().split(' '))
if (x, y, z) == (a, b, c):
print("0")
elif (y, z) == (b, c):
print(15 * (x - a))
elif z==c:
if y<=b and x<=a:
print("0")
else:
print(500 * (y - b))
elif z>c:
print("10000")
else:
print("0")
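The same nested date logic reads more clearly as a reusable function (a sketch; `library_fine` and its tuple arguments are illustrative names, not part of the original script):

```python
def library_fine(returned, due):
    """Compute the library fine in Hackos.

    Both arguments are (day, month, year) tuples.
    """
    x, y, z = returned
    a, b, c = due
    if (z, y, x) <= (c, b, a):
        return 0              # on time (or early): no fine
    if (y, z) == (b, c):
        return 15 * (x - a)   # late within the same month
    if z == c:
        return 500 * (y - b)  # late within the same year
    return 10000              # returned in a later year
```

Comparing `(year, month, day)` tuples handles the "returned early" cases in one test instead of several nested branches.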
| 18.875 | 35 | 0.430464 | 56 | 302 | 2.321429 | 0.339286 | 0.046154 | 0.046154 | 0.246154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060185 | 0.284768 | 302 | 15 | 36 | 20.133333 | 0.541667 | 0 | 0 | 0.333333 | 0 | 0 | 0.033113 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
49d9021c257e942b6894496ef90406b7dceaa6c4 | 1,993 | py | Python | mongo_adapter/exceptions.py | moonso/mongo_adapter | b61841c7a3a743546670a133c8fc03ca7e1644dc | [
"MIT"
] | null | null | null | mongo_adapter/exceptions.py | moonso/mongo_adapter | b61841c7a3a743546670a133c8fc03ca7e1644dc | [
"MIT"
] | 1 | 2021-11-12T14:17:01.000Z | 2021-11-22T08:11:37.000Z | mongo_adapter/exceptions.py | Clinical-Genomics/mongo_adapter | b61841c7a3a743546670a133c8fc03ca7e1644dc | [
"MIT"
] | null | null | null | """
The exceptions here tries to follow PEP249 (https://www.python.org/dev/peps/pep-0249/#exceptions)
"""
class Warning(Exception):
"""Exception raised for important warnings like data truncations while inserting, etc."""
pass
class Error(Exception):
"""Exception that is the base class of all other error exceptions.
You can use this to catch all errors with one single except statement.
Warnings are not considered errors and thus should not use this class as base.
"""
pass
class InterfaceError(Error):
"""Exception raised for errors that are related to the database interface rather than
the database itself.
"""
pass
class DatabaseError(Error):
"""Exception raised for errors that are related to the database."""
pass
class DataError(DatabaseError):
"""Exception raised for errors that are due to problems with the processed
data like division by zero, numeric value out of range, etc
"""
pass
class OperationalError(DatabaseError):
"""Exception raised for errors that are related to the database's operation
and not necessarily under the control of the programmer, e.g. an unexpected
disconnect occurs, the data source name is not found, a transaction could
not be processed, a memory allocation error occurred during processing, etc
"""
pass
class IntegrityError(DatabaseError):
"""Exception raised when the relational integrity of the database is affected,
e.g. a foreign key check fails.
"""
pass
class InternalError(DatabaseError):
"""Exception raised when the database encounters an internal error, e.g. the
cursor is not valid anymore, the transaction is out of sync, etc.
"""
pass
class ProgrammingError(DatabaseError):
"""Exception raised for programming errors, e.g. table not found or already exists,
syntax error in the SQL statement, wrong number of parameters specified, etc
"""
pass
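Because every concrete exception derives from `Error`, callers can catch the entire hierarchy with a single except clause, as PEP 249 intends. A minimal sketch (the redefined classes and the `fetch_row` helper are illustrative, not part of this module):

```python
class Error(Exception):
    """Base class of all error exceptions (mirrors the hierarchy above)."""

class DatabaseError(Error):
    """Errors related to the database."""

class OperationalError(DatabaseError):
    """Errors in the database's operation."""

def fetch_row(fail=False):
    # Illustrative helper: raise the most specific error that applies.
    if fail:
        raise OperationalError("unexpected disconnect")
    return {"_id": 1}

try:
    fetch_row(fail=True)
except Error as exc:  # one except clause covers the whole hierarchy
    caught = type(exc).__name__

print(caught)  # OperationalError
```

Catching `Error` rather than bare `Exception` keeps `Warning` (which is deliberately not an `Error` subclass here) out of the error-handling path.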
| 34.964912 | 97 | 0.71149 | 265 | 1,993 | 5.350943 | 0.479245 | 0.084626 | 0.076164 | 0.067701 | 0.204513 | 0.155148 | 0.155148 | 0.114951 | 0.114951 | 0.114951 | 0 | 0.004513 | 0.221776 | 1,993 | 56 | 98 | 35.589286 | 0.909736 | 0.706974 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
49e2cf71718e9a75f3dd431c5af8214bea2de4ec | 665 | py | Python | adlmagics/adlmagics/test/mocks/mock_presenter_factory.py | Azure/Azure-Data-Service-Notebook | 6bd28587c9fa0a7c1f9113f638b790b1773c5585 | [
"MIT"
] | 6 | 2018-06-06T08:37:53.000Z | 2020-06-01T13:13:13.000Z | adlmagics/adlmagics/test/mocks/mock_presenter_factory.py | Azure/Azure-Data-Service-Notebook | 6bd28587c9fa0a7c1f9113f638b790b1773c5585 | [
"MIT"
] | 30 | 2018-06-08T02:47:18.000Z | 2018-07-25T07:07:07.000Z | adlmagics/adlmagics/test/mocks/mock_presenter_factory.py | Azure/Azure-Data-Service-Notebook | 6bd28587c9fa0a7c1f9113f638b790b1773c5585 | [
"MIT"
] | 5 | 2018-06-06T08:37:55.000Z | 2021-01-07T09:15:15.000Z | class MockPresenterFactory:
def __init__(self):
self.__presented_logs = []
def register_presenter(self, presenter):
pass
def present(self, obj):
text = ""
if isinstance(obj, str):
text = obj
elif isinstance(obj, list):
if len(obj) > 0:
text = "A list of %s" % (type(obj[0]).__name__)
else:
text = "A list"
else:
text = type(obj).__name__
self.__presented_logs.append(text)
def clear(self):
self.__presented_logs.clear()
@property
def presented_logs(self):
return self.__presented_logs | 24.62963 | 63 | 0.541353 | 72 | 665 | 4.638889 | 0.416667 | 0.194611 | 0.203593 | 0.125749 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004662 | 0.354887 | 665 | 27 | 64 | 24.62963 | 0.773893 | 0 | 0 | 0.090909 | 0 | 0 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.227273 | false | 0.045455 | 0 | 0.045455 | 0.318182 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
49f2853da596c799a644f87a616be3de2d24c604 | 13,046 | py | Python | main.py | RoberWare/Ex-menes-Selectividad-PAU-Andaluc-a | 3544b2f9fa56dc05a3911a9121c290687d6ad09c | [
"MIT"
] | null | null | null | main.py | RoberWare/Ex-menes-Selectividad-PAU-Andaluc-a | 3544b2f9fa56dc05a3911a9121c290687d6ad09c | [
"MIT"
] | null | null | null | main.py | RoberWare/Ex-menes-Selectividad-PAU-Andaluc-a | 3544b2f9fa56dc05a3911a9121c290687d6ad09c | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#KIVY IMPORTS
import kivy
#kivy.config.Config.set('graphics','resizable', False)
from kivy.core.window import Window
from kivy.metrics import dp
Window.size=(dp(590),dp((480)))
Window.minimum_width=(dp(580))
Window.minimum_height=dp(480)
from kivy.app import App
from kivy.uix.button import Button
from kivy.uix.spinner import Spinner
from kivy.uix.popup import Popup
from kivy.uix.video import Video
from kivy.lang import Builder
from kivy.uix.screenmanager import ScreenManager, Screen, FadeTransition
from kivy.uix.widget import Widget
from kivy.properties import ListProperty, OptionProperty, BooleanProperty, DictProperty, ObjectProperty, StringProperty, NumericProperty
from kivy.uix.boxlayout import BoxLayout
from kivy.uix.floatlayout import FloatLayout
from kivy.uix.anchorlayout import AnchorLayout
from kivy.utils import platform
from kivy.clock import Clock
from functools import partial
from kivy.config import Config
Config.set("input", "mouse", "mouse, disable_multitouch")
#SYSTEM IMPORTS
import os
import json,re,shutil
from scripts import scrap_html,download,pdf_tools,zip_tools,zips_to_pdf
if platform == 'android':
from plyer import email
#GLOBAL VARS
url='https://www.juntadeandalucia.es/economiayconocimiento/sguit/g_b_examenes_anteriores.php'
workpath=os.path.abspath("./tmp")+"/"
progressbar="blue-barber-bar"
class NoneUI(Screen):
pass
class MainPanel(BoxLayout):
pass
class PicChooser(Popup):
pass
class MyButton(Button):
myimg=StringProperty()
mylbl=StringProperty()
pass
class PopFAQ(Popup):
pass
class PopSETTINGS(Popup):
pass
class PopINFO(AnchorLayout):
pass
class PopMsg(AnchorLayout):
pass
class PopMsgERRORFILL(AnchorLayout):
pass
class PopMsgWARN(Popup):
pass
class PopMsgEND(Popup):
pass
class PopMsgUPD(AnchorLayout):
pass
class PopMsgUPDCOMPLETED(Popup):
pass
class MainApp(App):
def __init__(self, **kwargs):
super(MainApp, self).__init__(**kwargs)
def build(self):
#Window.borderless = True
#Window.left=0
#Window.top=0
# WINDOW SETTINGS
self.title = u"PAU PDF MERGER"
self.window_icon = "./imgs/logo.png"
Window.bind(on_resize=self.wd_resize)
Config.set("graphics","resizable",0)
#Window.size=(dp(885)),dp(370)
#self.icon = "./imgs/logo.png"
self.url=url
self.criterios=1
self.orientation_name=""
self.years=[]
# SCREENS
self.myMainPanel = MainPanel()
root = self.myMainPanel
self.myNoneUI = self.myMainPanel.ids.myNoneUI
self.myPicChooser = PicChooser()
self.myPicChooser.ids.pc.path=os.path.expanduser("~")
self.myPopSETTINGS = PopSETTINGS()
self.myPopFAQ = PopFAQ()
self.myPopINFO = Popup(title='Información',content=PopINFO(),size_hint=(.8,.5))
self.myPopMsg = Popup(title='Lo sentimos...',content=PopMsg(),size_hint=(.8,.5))
self.myPopMsgERRORFILL = Popup(title='ERROR',content=PopMsgERRORFILL(),size_hint=(.8, .5))
self.myPopMsgWARN = PopMsgWARN()
self.myPopMsgUPD = Popup(title='Actualizando',content=PopMsgUPD(),size_hint=(.8,.5))
self.myPopMsgUPDC =PopMsgUPDCOMPLETED()
self.myPopMsgEND = PopMsgEND()
self.myNoneUI.ids.pbar.source = "./imgs/animations/progress-bar/%s/%s-stop.png"%(progressbar,progressbar)
return root
def on_start(self):
tmp= self.is_extension(os.listdir("./tmp"),".zip")
print tmp
if tmp != []:
self.myPopMsg.open()
#scrap_html.table(url=url)
try:
with open('./data/subjects.json') as json_data:
self.all_subjects = json.load(json_data)
with open('./data/orientations.json') as json_data:
self.all_subjects_orientations = json.load(json_data)
with open('./data/settings.json') as json_data:
self.settings = json.load(json_data)
except:pass
self.myNoneUI.ids.subjects.values=self.all_subjects
self.myPopSETTINGS.ids.path.text = self.settings["default_folder"]
self.myPopSETTINGS.ids.img.text = self.settings["img_pau"]
self.myPopSETTINGS.ids.font.text = self.settings["font"]
self.myPopSETTINGS.ids.url.text = self.settings["url"]
self.myNoneUI.ids.path.text = self.settings["default_folder"]
self.main()
def show_side_panel(self):
self.myMetaPanel.ids.NavDraw.toggle_state()
def wd_resize(self,window,x,y):
print Window.size
#Window.size=(dp(885), dp(315))
##SCREENS
def vpui(self):
self.myMainPanel.ids.ScrMan.current = self.myVideoPadUI.name
###UPDATE FUNCTIONS
def spinner_subjects_update(self):
if self.myNoneUI.ids.subjects.text in self.all_subjects:
years=[]
for x in self.all_subjects[self.myNoneUI.ids.subjects.text]:
n= re.sub("\D", "", x)
if re.search('[0-9]+',n)!=None:
years.append(n)
print n
self.myNoneUI.ids.since.values=years
self.myNoneUI.ids.since.text=min(years)
self.myNoneUI.ids.to.values=years
self.myNoneUI.ids.to.text=max(years)
###MAIN FUNCTIONS
def is_extension(self,file_names,extension):
files=[]
for x in file_names:
filename, file_extension = os.path.splitext(x)
if file_extension == extension:
files.append(x)
return files
def delete_all(self):
shutil.rmtree('./tmp')
os.mkdir("./tmp")
self.myPopMsg.dismiss()
def continue_zipping(self):
pass
"""
nl=list_plus_str(string1=workpath,list_names=tmp)
print nl
zips_to_pdf.merge_all(nl,workpath)
"""
def ignore(self):
self.myPopMsg.dismiss()
def ignoreERRORFILL(self):
self.myPopMsgERRORFILL.dismiss()
def ignoreEND(self):
self.myPopMsgEND.dismiss()
def ignoreINFO(self):
self.myPopINFO.dismiss()
def ignoreFAQ(self,arg):
self.myPopFAQ.dismiss()
if arg:
print pdf_tools.word_counter("/home/roberto/Programación/Python/PROJECTS/BUFFERPDF/PC/beta_0.2/output/sel_2011_matematicas_sel_2012_matematicas_sel_2013_matematicas_sel_2014_matematicas_sel_2015_matematicas_sel_2016_matematicas_sel_2017_matematicas_.pdf")
def ignoreSETTINGS(self,save):
self.myPopSETTINGS.dismiss()
if save:
self.settings={}
self.settings["img_pau"]=self.myPopSETTINGS.ids.img.text
self.settings["font"]=self.myPopSETTINGS.ids.font.text
self.settings["default_folder"]=self.myPopSETTINGS.ids.path.text
self.settings["url"]=self.myPopSETTINGS.ids.url.text
with open('./data/settings.json', 'w') as outfile:
json.dump(self.settings, outfile)
def ignorePicChooser(self,upd):
self.myPicChooser.dismiss()
if upd:
self.myNoneUI.ids.path.text=self.myPicChooser.ids.pc.path
self.myPopSETTINGS.ids.path.text=self.myPicChooser.ids.pc.path
print self.myPopSETTINGS.ids.path.text
def name_from_url(self,link):
n=0
for x in link[::-1]:
if x == "/":break
else:n+=1
print link[-n:]
file_name = "./tmp/"+link[-n:]
return file_name
def wait_until_download(self,myurl,*largs):
#self.myNoneUI.ids.prompt.text = download.progress
if download.download_finished:
print "FINISHED"
download.download_finished=False
self.myNoneUI.ids.pbar.value+=1
#name=self.name_from_url(myurl)
name=download.file_name
print "SE acaba de descargar",download.file_name
if os.path.splitext(name)[1] == ".zip":
self.zip_names.append(name)
else:
self.orientation_name=name
print "Orientaciones>>>>>>>",name
if len(self.zip_names)>=len(self.pdf_urls):
self.THREADdownload.cancel()
self.THREADwaitfordownload.cancel()
self.merge_zips(self.zip_names)
self.disabled_all_widgets(self.myNoneUI.ids.core.children,False)
self.delete_all()
self.myPopMsgEND.ids.name.text = self.myNoneUI.ids.subjects.text
self.myPopMsgEND.ids.date.text = str(str(self.years[0])+"-"+str(self.years[1]))
self.myPopMsgEND.ids.docname.text = self.myNoneUI.ids.path.text
self.myPopMsgEND.open()
self.myNoneUI.ids.prompt.text="¡Trabajo completado!"
self.myNoneUI.ids.pbar.source = "./imgs/animations/progress-bar/%s/%s-stop.png"%(progressbar,progressbar)
return False
def merge_zips(self,zip_names):
print zip_names
zips_to_pdf.merge_all(zip_names,workpath,self.myNoneUI.ids.create_cover.active,self.myNoneUI.ids.subjects.text,self.criterios,self.orientation_name,self.myNoneUI.ids.path.text,str((self.myNoneUI.ids.subjects.text).encode("utf-8")+"_"+str(self.years[0])+"-"+str(self.years[1])))
print "-cr.",self.criterios
def disabled_all_widgets(self,name,value):
for x in name:
x.disabled=value
def proceed(self):
if self.myNoneUI.ids.subjects.text != "Materia" and self.myNoneUI.ids.since.text != "Desde" and self.myNoneUI.ids.to.text != "Hasta" and self.myNoneUI.ids.path.text != "...":
self.all_years=pdf_tools.n_to_m(int(self.myNoneUI.ids.since.text),int(self.myNoneUI.ids.to.text))
for y in self.years:
self.years.remove(y)
self.years.append(min(self.all_years))
self.years.append(max(self.all_years))
print self.years
if self.settings["show_warn"]=="normal": self.myPopMsgWARN.open()
else: self.process()
else:
self.myPopMsgERRORFILL.open()
def update_table_links(self):
self.myPopMsgUPD.open()
scrap_html.table(url=url)
self.myPopMsgUPD.dismiss()
self.myPopMsgUPDC.open()
def process(self):
print self.myNoneUI.ids.subjects.text,self.myNoneUI.ids.since.text,self.myNoneUI.ids.to.text,self.myNoneUI.ids.path.text
self.myPopMsgWARN.dismiss()
print self.myPopMsgWARN.ids.show_again.state
self.settings["show_warn"]=self.myPopMsgWARN.ids.show_again.state
with open('./data/settings.json', 'w') as outfile:
json.dump(self.settings, outfile)
self.myNoneUI.ids.pbar.source = "./imgs/animations/progress-bar/%s/%s.zip"%(progressbar,progressbar)
self.myNoneUI.ids.pbar.anim_delay = 0.01
self.disabled_all_widgets(self.myNoneUI.ids.core.children,True)
self.myNoneUI.ids.prompt.disabled=False
if self.myNoneUI.ids.orientations.state=="down":
self.orientations=1
self.orientation_name=self.name_from_url(self.all_subjects_orientations[self.myNoneUI.ids.subjects.text])
if self.myNoneUI.ids.allyears.state=="down":self.criterios=1
elif self.myNoneUI.ids.lastyear.state=="down":self.criterios=self.years[1]
else: self.criterios=-1
self.pdf_urls=[]
if self.orientations: self.THREADdownload=Clock.schedule_interval(partial(download.progress_bar,self.all_subjects_orientations[self.myNoneUI.ids.subjects.text]), 0.5)
for x in self.all_subjects[self.myNoneUI.ids.subjects.text]:
for y in self.all_years:
if x.find(str(y)) != -1:
self.pdf_urls.append(x)
#print pdf_urls
self.myNoneUI.ids.pbar.max=len(self.pdf_urls)
self.myNoneUI.ids.pbar.value=0
self.myNoneUI.ids.prompt.text="Su tarea se está procesando. Porfavor, sea paciente...."
#chunks=len(pdf_urls)
self.zip_names=[]
for x in self.pdf_urls:
#download.download_finished=False
#download.progress_bar(x)
self.THREADdownload=Clock.schedule_interval(partial(download.progress_bar,x), 0.5)
#Clock.schedule_once(partial(download.progress_bar,x), 0.5)
self.THREADwaitfordownload=Clock.schedule_interval(partial(self.wait_until_download,x), 0.5)
def prompt(self,text):
self.myNoneUI.ids.prompt.text=text
def main(self):
print ""
if __name__ == '__main__':
MainApp().run()
| 34.331579 | 287 | 0.627472 | 1,580 | 13,046 | 5.071519 | 0.218987 | 0.068888 | 0.084238 | 0.031574 | 0.333957 | 0.232248 | 0.191439 | 0.130912 | 0.097342 | 0.05541 | 0 | 0.009933 | 0.251495 | 13,046 | 379 | 288 | 34.422164 | 0.810548 | 0.044611 | 0 | 0.100386 | 0 | 0.003861 | 0.081895 | 0.03069 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.057915 | 0.088803 | null | null | 0.057915 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
49fc017d8495f8dcc161bce9d606ab32102255fd | 631 | py | Python | jsonmirror/register_models.py | synw/django-jsonmirror | b0aa0e84d2abf6999fb1ea7e792d8060482ada62 | [
"MIT"
] | null | null | null | jsonmirror/register_models.py | synw/django-jsonmirror | b0aa0e84d2abf6999fb1ea7e792d8060482ada62 | [
"MIT"
] | null | null | null | jsonmirror/register_models.py | synw/django-jsonmirror | b0aa0e84d2abf6999fb1ea7e792d8060482ada62 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import importlib
from django.db.models.signals import post_save, post_delete
from jsonmirror.conf import BACKEND
if BACKEND == "rethinkdb":
from jsonmirror.backends.rethinkdb.signals import model_save, model_delete
def register_model(model):
post_save.connect(model_save, sender=model)
post_delete.connect(model_delete, sender=model)
return
def get_model_from_path(modpath):
modsplit = modpath.split('.')
path = '.'.join(modsplit[:-1])
modname = '.'.join(modsplit[-1:])
module = importlib.import_module(path)
model = getattr(module, modname)
return model
| 28.681818 | 78 | 0.714739 | 80 | 631 | 5.475 | 0.4375 | 0.059361 | 0.059361 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005693 | 0.164818 | 631 | 21 | 79 | 30.047619 | 0.825427 | 0.033281 | 0 | 0 | 0 | 0 | 0.019737 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.3125 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
49fe8430c551a66598f671467854758526715d3e | 1,169 | py | Python | app/tests/test.py | waweru12/The-news-highlighter | e7d2d98d0be67bd7471eaee44f7dabc54d60a308 | [
"Unlicense"
] | null | null | null | app/tests/test.py | waweru12/The-news-highlighter | e7d2d98d0be67bd7471eaee44f7dabc54d60a308 | [
"Unlicense"
] | null | null | null | app/tests/test.py | waweru12/The-news-highlighter | e7d2d98d0be67bd7471eaee44f7dabc54d60a308 | [
"Unlicense"
] | null | null | null | import unittest
from app.models import Source
from app.models import Article
class SourceTest(unittest.TestCase):
'''
Test Class to test the behaviour of the Source class
'''
def setUp(self):
'''
Set up method that will run before every Test
'''
self.new_source = Source('KTN', 'KTN-NEWS', 'Home of News', 'https://ktn.co.ke', 'general', 'ke')
def test_instance(self):
'''
Test to check if new_source instance exists
'''
self.assertTrue(isinstance(self.new_source,Source))
class ArticleTest(unittest.TestCase):
'''
Test Class to test the behaviour of the Article class
'''
def setUp(self):
'''
Set up method that will run before every Test
'''
self.new_article = Article('Wekesa', 'Kenyan Cars', 'The variety and rich culture that exists in Kenyan motorsport', 'https://ktn.co.ke', 'https://ktn.co.ke/image1', '24/06/2012', 'kenyan motorshow is among the best')
def test_instance(self):
'''
    Test to check if new_article instance exists
'''
self.assertTrue(isinstance(self.new_article,Article))
| 29.974359 | 225 | 0.629598 | 153 | 1,169 | 4.75817 | 0.379085 | 0.038462 | 0.041209 | 0.049451 | 0.519231 | 0.519231 | 0.519231 | 0.395604 | 0.395604 | 0.299451 | 0 | 0.010321 | 0.254063 | 1,169 | 38 | 226 | 30.763158 | 0.824541 | 0.245509 | 0 | 0.307692 | 0 | 0 | 0.279683 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.307692 | false | 0 | 0.230769 | 0 | 0.692308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b706c0a1f4d9891787468c8f173db9f8741f8cc2 | 415 | py | Python | docs/guide/code/awesome-bot-6/awesome/plugins/scheduler.py | yucongo/none-bot | 3f66c14241115ec2136d7891210368910081375f | [
"MIT"
] | null | null | null | docs/guide/code/awesome-bot-6/awesome/plugins/scheduler.py | yucongo/none-bot | 3f66c14241115ec2136d7891210368910081375f | [
"MIT"
] | null | null | null | docs/guide/code/awesome-bot-6/awesome/plugins/scheduler.py | yucongo/none-bot | 3f66c14241115ec2136d7891210368910081375f | [
"MIT"
] | null | null | null | from datetime import datetime
import none
import pytz
from aiocqhttp.exceptions import Error as CQHttpError
@none.scheduler.scheduled_job('cron', hour='*')
async def _():
bot = none.get_bot()
now = datetime.now(pytz.timezone('Asia/Shanghai'))
try:
await bot.send_group_msg(group_id=672076603,
message=f'现在{now.hour}点整啦!')
except CQHttpError:
pass
| 24.411765 | 61 | 0.653012 | 52 | 415 | 5.096154 | 0.692308 | 0.10566 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.240964 | 415 | 16 | 62 | 25.9375 | 0.812698 | 0 | 0 | 0 | 0 | 0 | 0.081928 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.076923 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
b70721703cf4a4f5bb16c6a078ee1dac748c3719 | 250 | py | Python | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-mehedi-iitdu | 1750141851d0295900d48a5a0e92495a2afbaa77 | [
"MIT"
] | null | null | null | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-mehedi-iitdu | 1750141851d0295900d48a5a0e92495a2afbaa77 | [
"MIT"
] | null | null | null | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-mehedi-iitdu | 1750141851d0295900d48a5a0e92495a2afbaa77 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import sys
requests = {}
for line in sys.stdin:
if line in requests:
requests[line] = requests[line]+1
else:
requests[line] = 1
for item in requests:
result = "{0} {1}".format(item,requests[item])
print(result) | 16.666667 | 50 | 0.648 | 37 | 250 | 4.378378 | 0.486486 | 0.222222 | 0.160494 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020101 | 0.204 | 250 | 15 | 51 | 16.666667 | 0.79397 | 0.064 | 0 | 0 | 0 | 0 | 0.029915 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.1 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b71829376ce087486b2008cd4036fa9b3b229666 | 684 | py | Python | enrollment_ms/enrollment/enrollment/admin.py | omiguelperez/agile-academusoft-v2-backend | d69da6fa0a416d7cb36a58f28e2a8fe512e93452 | [
"MIT"
] | null | null | null | enrollment_ms/enrollment/enrollment/admin.py | omiguelperez/agile-academusoft-v2-backend | d69da6fa0a416d7cb36a58f28e2a8fe512e93452 | [
"MIT"
] | 1 | 2021-10-02T20:57:46.000Z | 2021-10-02T20:57:46.000Z | enrollment_ms/enrollment/enrollment/admin.py | omiguelperez/agile-academusoft-v2-backend | d69da6fa0a416d7cb36a58f28e2a8fe512e93452 | [
"MIT"
] | null | null | null | from django.contrib import admin
from enrollment.enrollment.models import (
StudentEnrollment,
CourseGroup,
Schedule,
)
@admin.register(CourseGroup)
class CourseGroupAdmin(admin.ModelAdmin):
list_display = ['course', 'name', 'teacher', 'year', 'semester']
search_fields = ['course', 'name', 'teacher']
@admin.register(StudentEnrollment)
class StudentEnrollmentAdmin(admin.ModelAdmin):
list_display = ['student', 'course_group']
autocomplete_fields = ['course_group']
@admin.register(Schedule)
class ScheduleAdmin(admin.ModelAdmin):
list_display = ['course_group', 'week_day', 'start_time', 'end_time']
autocomplete_fields = ['course_group']
| 26.307692 | 73 | 0.730994 | 69 | 684 | 7.057971 | 0.478261 | 0.090349 | 0.117043 | 0.160164 | 0.131417 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134503 | 684 | 25 | 74 | 27.36 | 0.822635 | 0 | 0 | 0.111111 | 0 | 0 | 0.185673 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b718e6ebb16c5a007eaea28e24cd21b3d8dc6dab | 391 | py | Python | setup.py | Stormholt/DWL5500XY-Python | a1c505d2a0e4f56f9f4af5aab19a91e4afd49488 | [
"MS-PL"
] | null | null | null | setup.py | Stormholt/DWL5500XY-Python | a1c505d2a0e4f56f9f4af5aab19a91e4afd49488 | [
"MS-PL"
] | null | null | null | setup.py | Stormholt/DWL5500XY-Python | a1c505d2a0e4f56f9f4af5aab19a91e4afd49488 | [
"MS-PL"
] | null | null | null | from distutils.core import setup
setup(name = "DWL5500XY",
version = "0",
description = "Python module to use the Digipass DWL5500XY",
author = "Ajs R. Stormholt",
author_email = "nicetry@gmail.com",
url = "https://github.com/Stormholt/DWL5500XY-Python",
packages = ['DWL5500XY'],
scripts = ["test.py"],
long_description = """Really long text here."""
) | 26.066667 | 64 | 0.647059 | 45 | 391 | 5.577778 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054662 | 0.204604 | 391 | 15 | 65 | 26.066667 | 0.752412 | 0 | 0 | 0 | 0 | 0 | 0.431122 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.090909 | 0.090909 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b71c5cb99ac6dafca688018b1dacbc27940075d9 | 2,498 | py | Python | pywavez/tests/test_serialization_zwave.py | spacedentist/pywavez | 88739989c19380e722767ea08eaba927cfa7f169 | [
"MIT"
] | 1 | 2020-03-01T00:02:43.000Z | 2020-03-01T00:02:43.000Z | pywavez/tests/test_serialization_zwave.py | spacedentist/pywavez | 88739989c19380e722767ea08eaba927cfa7f169 | [
"MIT"
] | null | null | null | pywavez/tests/test_serialization_zwave.py | spacedentist/pywavez | 88739989c19380e722767ea08eaba927cfa7f169 | [
"MIT"
] | null | null | null | import unittest
from pywavez.zwave import Message, inboundMessageFromBytes
from pywavez.zwave.Constants import LibraryType
class TestGetVersion(unittest.TestCase):
def test_GetVersionResponse(self):
data = bytes.fromhex("01155a2d5761766520342e30350001")
obj = inboundMessageFromBytes(data)
self.assertEqual(type(obj), Message.GetVersionResponse)
self.assertEqual(obj.libraryVersion, "Z-Wave 4.05")
self.assertEqual(obj.libraryType, LibraryType.STATIC_CONTROLLER)
self.assertEqual(obj.toBytes(), data)
def test_SerialApiGetCapabilitiesResponse(self):
data = bytes.fromhex(
"0107aabb12345678abcd00020820800002000000"
"0000000000000000000000000000000000000000"
"0000"
)
obj = inboundMessageFromBytes(data)
self.assertEqual(type(obj), Message.SerialApiGetCapabilitiesResponse)
self.assertEqual(obj.serialApiVersion, 0xAA)
self.assertEqual(obj.serialApiRevision, 0xBB)
self.assertEqual(obj.manufacturerId, 0x1234)
self.assertEqual(obj.manufacturerProduct, 0x5678)
self.assertEqual(obj.manufacturerProductId, 0xABCD)
self.assertEqual(obj.supportedFunctions, {10, 20, 30, 40, 50})
self.assertEqual(obj.toBytes(), data)
def test_SerialApiGetInitDataResponse(self):
data = bytes.fromhex(
"010205001dadff3f000000000000000000000000"
"00000000000000000000000000000500"
)
obj = inboundMessageFromBytes(data)
self.assertEqual(type(obj), Message.SerialApiGetInitDataResponse)
self.assertEqual(obj.serialApiApplicationVersion, 5)
self.assertEqual(obj.isSlave, False)
self.assertEqual(obj.timerSupport, False)
self.assertEqual(obj.isSecondary, False)
self.assertEqual(obj.isSIS, False)
self.assertEqual(
obj.nodes,
{
1,
3,
4,
6,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
},
)
self.assertEqual(obj.chipType, 5)
self.assertEqual(obj.chipVersion, 0)
self.assertEqual(obj.toBytes(), data)
if __name__ == "__main__":
unittest.main()
| 33.306667 | 77 | 0.598078 | 199 | 2,498 | 7.447236 | 0.422111 | 0.222672 | 0.230769 | 0.062078 | 0.187584 | 0.168016 | 0.168016 | 0.119433 | 0 | 0 | 0 | 0.134739 | 0.316653 | 2,498 | 74 | 78 | 33.756757 | 0.733451 | 0 | 0 | 0.119403 | 0 | 0 | 0.082066 | 0.072858 | 0 | 0 | 0.010408 | 0 | 0.328358 | 1 | 0.044776 | false | 0 | 0.044776 | 0 | 0.104478 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b71c5d08b1677875d0a78d3916dda89cb6e91e05 | 603 | py | Python | glc/cli/subs/c_mr.py | evinoca/MyCli | bd6fcb98024c403f9562424f8e8acbacf1380a76 | [
"MIT"
] | 1 | 2020-05-24T03:59:37.000Z | 2020-05-24T03:59:37.000Z | glc/cli/subs/c_mr.py | evinoca/MyCli | bd6fcb98024c403f9562424f8e8acbacf1380a76 | [
"MIT"
] | 12 | 2020-05-29T07:08:27.000Z | 2022-01-12T22:39:49.000Z | glc/cli/subs/c_mr.py | evinoca/MyCli | bd6fcb98024c403f9562424f8e8acbacf1380a76 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
from __future__ import unicode_literals
import click
@click.group()
@click.pass_obj
def cli(config):
"""
Merge Requests.
"""
click.echo("calling sub cmd repo")
@cli.command("ls", short_help="List MR")
def ls():
click.echo("repo list")
@cli.command("add", short_help="Create merge request from current repo")
def add():
click.echo("repo add")
@cli.command("update", short_help="update MR")
def update():
click.echo("repo update")
@cli.command("status", short_help="Show status repo")
def status():
click.echo("repo desc")
| 17.735294 | 72 | 0.68325 | 85 | 603 | 4.670588 | 0.4 | 0.11335 | 0.130982 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162521 | 603 | 33 | 73 | 18.272727 | 0.786139 | 0.024876 | 0 | 0 | 0 | 0 | 0.251748 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.263158 | false | 0.052632 | 0.157895 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b72fe19d01acee6f62a8e04b5b867719df5a113e | 2,562 | py | Python | tests/onegov/core/test_elements.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | tests/onegov/core/test_elements.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | tests/onegov/core/test_elements.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | from onegov.core.utils import Bunch
from onegov.core.elements import Link, Confirm, Intercooler
def test_link(render_element):
# text is translated
result = render_element(Link(text="Settings", url='/settings'))
assert result.pyquery('a').text() == "Settings"
assert result.pyquery('a').attr('href') == '/settings'
# other attributes are rendered
result = render_element(Link(text='foo', url='#', attrs={
'data-foo': 'bar'
}))
assert result.pyquery('a').attr('data-foo') == 'bar'
# we show a hint if the link is hidden from public
result = render_element(Link(text='hidden', url='#', model=Bunch(
access='private'
)))
def test_confirm_link(render_element):
result = render_element(Link(text="Delete", url='#', traits=(
Confirm(
"Confirm?",
"Extra...",
"Yes",
"No"
),
), attrs={'class': 'foo'}))
assert result.pyquery('a').attr('data-confirm') == "Confirm?"
assert result.pyquery('a').attr('data-confirm-extra') == "Extra..."
assert result.pyquery('a').attr('data-confirm-yes') == "Yes"
assert result.pyquery('a').attr('data-confirm-no') == "No"
assert result.pyquery('a').attr('class') in ('foo confirm', 'confirm foo')
def test_link_slots():
# make sure that the Link class as well as all its parents have
# __slots__ defined (for some lookup speed and memory improvements)
assert not hasattr(Link("Slots", '#'), '__dict__')
def test_intercooler_link(render_element):
result = render_element(Link(text="Delete", traits=Intercooler(
request_method="POST", redirect_after='#redirect', target='#target'
)))
assert result.pyquery('a').attr('ic-post-to') == '#'
assert result.pyquery('a').attr('ic-target') == '#target'
assert result.pyquery('a').attr('redirect-after') == '#redirect'
assert result.pyquery('a').attr('href') is None
def test_class_attributes(render_element):
result = render_element(Link(text="Delete", attrs={
'class': 'foo'
}))
assert result.pyquery('a').attr('class') == 'foo'
result = render_element(Link(text="Delete", attrs={
'class': ('foo', 'bar')
}))
assert result.pyquery('a').attr('class') in ('foo bar', 'bar foo')
result = render_element(Link(text="Delete", attrs={
'class': ('foo', 'bar')
}))
assert result.pyquery('a').attr('class') in ('foo bar', 'bar foo')
result = render_element(Link(text="Delete"))
assert result.pyquery('a').attr('class') is None
| 34.16 | 78 | 0.62178 | 320 | 2,562 | 4.878125 | 0.25 | 0.122998 | 0.194747 | 0.204997 | 0.596413 | 0.521461 | 0.443306 | 0.303652 | 0.234465 | 0.140935 | 0 | 0 | 0.191647 | 2,562 | 74 | 79 | 34.621622 | 0.753742 | 0.087822 | 0 | 0.25 | 0 | 0 | 0.187902 | 0 | 0 | 0 | 0 | 0 | 0.326923 | 1 | 0.096154 | false | 0 | 0.038462 | 0 | 0.134615 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b730cb46cc6bdd05d72419af6a0f9344720414f1 | 2,898 | py | Python | backend/users/views.py | otraczyk/cryptstarter | 6dad77408b15285644d9f69de0cb853be0acf151 | [
"MIT"
] | null | null | null | backend/users/views.py | otraczyk/cryptstarter | 6dad77408b15285644d9f69de0cb853be0acf151 | [
"MIT"
] | null | null | null | backend/users/views.py | otraczyk/cryptstarter | 6dad77408b15285644d9f69de0cb853be0acf151 | [
"MIT"
] | null | null | null | import json
from django.http import HttpResponse
from django.contrib.auth import get_user_model
from users.models import UserFacebookStuff
User = get_user_model()
import open_facebook
from django.views.generic.base import View
from django.contrib.auth.decorators import login_required
from django.core.exceptions import ObjectDoesNotExist
from pubkeys.models import Pubkey
# Create your views here.
JsonResponse = lambda data, status=200: HttpResponse(
json.dumps(data),
status=status,
content_type="application/json"
)
ErrorResponse = lambda msg: JsonResponse(msg, status=500)
def auth(request):
# data = json.loads(request.body)
data = json.loads( request.body.decode('utf-8') )
try:
uid = data["clientId"]
token = data["code"]
except KeyError:
return ErrorResponse("clientId and code are required")
try:
# test token validity
api = open_facebook.OpenFacebook(token)
user = api.get('/me')
assert user["id"] == uid
except open_facebook.exceptions.OAuthException as e:
return ErrorResponse(str(e))
except AssertionError:
        return ErrorResponse("Token doesn't match user")
fbp = UserFacebookStuff.objects.filter(facebook_id=uid).first()
if not fbp:
        fbp = UserFacebookStuff()
        user = User()
fbp.facebook_id = uid
user.save()
fbp.access_token = token
fbp.user = user
fbp.save()
return JsonResponse("Logged in")
class profile(View):
@login_required
def get(self, request, *args, **kwargs):
try:
# TODO change when we want more keys per user
pubkey = Pubkey.objects.get(user = request.user)
            return JsonResponse({'pubkey': pubkey.key})
except ObjectDoesNotExist:
return JsonResponse({}, status=404)
@login_required
def post(self, request, *args, **kwargs):
data = json.loads(request.body.decode('utf-8'))
try:
# TODO change when we want more keys per user
pubkey = Pubkey.objects.get(user = request.user)
status = 204
except ObjectDoesNotExist:
pubkey = Pubkey(user=request.user)
status = 201
pubkey.key = data['pubkey']
pubkey.save()
return JsonResponse({}, status)
class friends(View):
@login_required
def get(self, request, *args, **kwargs):
friends = request.user.get_friends()
keys = []
for friend in friends:
try:
pubkey = Pubkey.get_by_facebook_id(friend.id)
keys.append({
'id': friend.id,
'key': pubkey.key,
'name': friend.name,
})
except ObjectDoesNotExist:
pass
        return JsonResponse(keys)
| 29.272727 | 67 | 0.613182 | 322 | 2,898 | 5.459627 | 0.344721 | 0.040956 | 0.022184 | 0.03413 | 0.18942 | 0.175768 | 0.175768 | 0.175768 | 0.175768 | 0.085324 | 0 | 0.008256 | 0.28951 | 2,898 | 98 | 68 | 29.571429 | 0.845556 | 0 | 0 | 0.24359 | 0 | 0 | 0.046537 | 0 | 0 | 0 | 0 | 0.010204 | 0.025641 | 0 | null | null | 0.012821 | 0.115385 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b73c6aaabf3ab2bd8b5e6d312900cb91438d7567 | 2,676 | py | Python | data/aggregate_genres.py | com-480-data-visualization/data-visualization-project-2021-pia | 81678676d9cd81eefe3bcfbbb2375b99b6491fed | [
"MIT"
] | 49 | 2018-08-25T08:12:56.000Z | 2022-03-29T21:53:11.000Z | data/aggregate_genres.py | com-480-data-visualization/data-visualization-project-2021-pia | 81678676d9cd81eefe3bcfbbb2375b99b6491fed | [
"MIT"
] | null | null | null | data/aggregate_genres.py | com-480-data-visualization/data-visualization-project-2021-pia | 81678676d9cd81eefe3bcfbbb2375b99b6491fed | [
"MIT"
] | 43 | 2018-10-02T20:35:05.000Z | 2022-03-01T20:40:05.000Z | aggregate_genres = [{"rock": ["symphonic rock", "jazz-rock", "heartland rock", "rap rock", "garage rock", "folk-rock", "roots rock", "adult alternative pop rock", "rock roll", "punk rock", "arena rock", "pop-rock", "glam rock", "southern rock", "indie rock", "funk rock", "country rock", "piano rock", "art rock", "rockabilly", "acoustic rock", "progressive rock", "folk rock", "psychedelic rock", "rock & roll", "blues rock", "alternative rock", "rock and roll", "soft rock", "rock and indie", "hard rock", "pop/rock", "pop rock", "rock", "classic pop and rock", "psychedelic", "british psychedelia", "punk", "metal", "heavy metal"]},
{"alternative/indie": ["adult alternative pop rock", "alternative rock", "alternative metal", "alternative", "lo-fi indie", "indie", "indie folk", "indietronica", "indie pop", "indie rock", "rock and indie"]},
{"electronic/dance": ["dance and electronica", "electro house", "electronic", "electropop", "progressive house", "hip house", "house", "eurodance", "dancehall", "dance", "trap"]},
{"soul": ["psychedelic soul", "deep soul", "neo-soul", "neo soul", "southern soul", "smooth soul", "blue-eyed soul", "soul and reggae", "soul"]},
{"classical/soundtrack": ["classical", "orchestral", "film soundtrack", "composer"]},
{"pop": ["country-pop", "latin pop", "classical pop", "pop-metal", "orchestral pop", "instrumental pop", "indie pop", "sophisti-pop", "pop punk", "pop reggae", "britpop", "traditional pop", "power pop", "sunshine pop", "baroque pop", "synthpop", "art pop", "teen pop", "psychedelic pop", "folk pop", "country pop", "pop rap", "pop soul", "pop and chart", "dance-pop", "pop", "top 40"]},
{"hip-hop/rnb": ["conscious hip hop", "east coast hip hop", "hardcore hip hop", "west coast hip hop", "hiphop", "southern hip hop", "hip-hop", "hip hop", "hip hop rnb and dance hall", "contemporary r b", "gangsta rap", "rapper", "rap", "rhythm and blues", "contemporary rnb", "contemporary r&b", "rnb", "rhythm & blues","r&b", "blues"]},
{"disco": ["disco"]},
{"swing": ["swing"]},
{"folk": ["contemporary folk", "folk"]},
{"country": ["country rock", "country-pop", "country pop", "contemporary country", "country"]},
{"jazz": ["vocal jazz", "jazz", "jazz-rock"]},
{"religious": ["christian", "christmas music", "gospel"]},
{"blues": ["delta blues", "rock blues", "urban blues", "electric blues", "acoustic blues", "soul blues", "country blues", "jump blues", "classic rock. blues rock", "jazz and blues", "piano blues", "british blues", "british rhythm & blues", "rhythm and blues", "blues", "blues rock", "rhythm & blues"]},
{"reggae": ["reggae fusion", "roots reggae", "reggaeton", "pop reggae", "reggae", "soul and reggae"]}] | 178.4 | 635 | 0.650972 | 339 | 2,676 | 5.135693 | 0.303835 | 0.031017 | 0.018955 | 0.020678 | 0.013785 | 0.013785 | 0 | 0 | 0 | 0 | 0 | 0.000849 | 0.119955 | 2,676 | 15 | 636 | 178.4 | 0.738429 | 0 | 0 | 0 | 0 | 0 | 0.708629 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3f8656a953aef0d61d45209e88de7f1ead442ea9 | 309 | py | Python | tests/test_qr_node.py | chrisspen/ros_qr_tracker | 9c2b4e61051d5f1e18a3a71ccec2c50d695fa788 | [
"MIT"
] | 6 | 2018-10-21T02:13:11.000Z | 2021-01-19T06:33:23.000Z | tests/test_qr_node.py | chrisspen/ros_qr_tracker | 9c2b4e61051d5f1e18a3a71ccec2c50d695fa788 | [
"MIT"
] | null | null | null | tests/test_qr_node.py | chrisspen/ros_qr_tracker | 9c2b4e61051d5f1e18a3a71ccec2c50d695fa788 | [
"MIT"
] | 2 | 2018-04-13T20:34:58.000Z | 2020-04-15T16:31:06.000Z | #!/usr/bin/env python

import sys
import unittest

import roslib

PKG = 'ros_qr_tracker'
roslib.load_manifest(PKG)


class Tests(unittest.TestCase):

    def test_one_equals_one(self):
        self.assertEqual(1, 1)


if __name__ == '__main__':
    import rostest
    rostest.rosrun(PKG, 'test_qr_node', Tests)
| 17.166667 | 46 | 0.721683 | 44 | 309 | 4.704545 | 0.681818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007782 | 0.168285 | 309 | 17 | 47 | 18.176471 | 0.797665 | 0.064725 | 0 | 0 | 0 | 0 | 0.118056 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3f8cf31469b6d727f72a8ab6fe934d59455e107a | 183 | py | Python | Old_Chapter03/Exercise 3.1/urls.py | lmoshood/The-Django-Workshop | 52e86a8f93cb38bf70d50e9b8d2c6d7dac416f62 | [
"MIT"
] | null | null | null | Old_Chapter03/Exercise 3.1/urls.py | lmoshood/The-Django-Workshop | 52e86a8f93cb38bf70d50e9b8d2c6d7dac416f62 | [
"MIT"
] | null | null | null | Old_Chapter03/Exercise 3.1/urls.py | lmoshood/The-Django-Workshop | 52e86a8f93cb38bf70d50e9b8d2c6d7dac416f62 | [
"MIT"
] | 1 | 2020-05-27T13:41:58.000Z | 2020-05-27T13:41:58.000Z |
from django.conf.urls import url
from .views import index, detail

urlpatterns = [
    url(r'^$', view=index),
    # /bookr/22/
    url(r'^(?P<book_id>[0-9]+)/$', view=detail),
]
| 15.25 | 48 | 0.590164 | 27 | 183 | 3.962963 | 0.703704 | 0.074766 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027397 | 0.202186 | 183 | 11 | 49 | 16.636364 | 0.705479 | 0.054645 | 0 | 0 | 0 | 0 | 0.140351 | 0.128655 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
3f9a98e77ef2f923d4b9ae7ee06e73cb28a162ac | 3,141 | py | Python | tests/test_103_build.py | s-yoshiki/kenall | d6dcd8f43b21c3d07f436a31d5c0f2f10dd56df1 | [
"MIT"
] | 1 | 2021-05-09T09:26:00.000Z | 2021-05-09T09:26:00.000Z | tests/test_103_build.py | s-yoshiki/kenall | d6dcd8f43b21c3d07f436a31d5c0f2f10dd56df1 | [
"MIT"
] | null | null | null | tests/test_103_build.py | s-yoshiki/kenall | d6dcd8f43b21c3d07f436a31d5c0f2f10dd56df1 | [
"MIT"
] | null | null | null | from .util import *
def test_build():
    write_item("\r\n".join([
        '13113,"150  ","1500013","トウキョウト","シブヤク","エビス(ツギノビルヲノゾク)","東京都","渋谷区","恵比寿(次のビルを除く)",0,0,1,0,0,0',
        '13113,"150  ","1506090","トウキョウト","シブヤク","エビスエビスガーデンプレイス(チカイ・カイソウフメイ)","東京都","渋谷区","恵比寿恵比寿ガーデンプレイス(地階・階層不明)",0,0,0,0,0,0',
        '13113,"150  ","1506001","トウキョウト","シブヤク","エビスエビスガーデンプレイス(1カイ)","東京都","渋谷区","恵比寿恵比寿ガーデンプレイス(1階)",0,0,0,0,0,0',
        '13113,"150  ","1500021","トウキョウト","シブヤク","エビスニシ","東京都","渋谷区","恵比寿西",0,0,1,0,0,0',
        '23105,"453  ","4530002","アイチケン","ナゴヤシナカムラク","メイエキ(1-1-8、1-1-12、1-1-13、1-1-14、1-3-4、","愛知県","名古屋市中村区","名駅(1−1−8、1−1−12、1−1−13、1−1−14、1−3−4、",1,0,1,0,0,0',
        '23105,"453  ","4530002","アイチケン","ナゴヤシナカムラク","1-3-7)","愛知県","名古屋市中村区","1−3−7)",1,0,1,0,0,0',
        '23105,"450  ","4506051","アイチケン","ナゴヤシナカムラク","メイエキジェイアールセントラルタワーズ(51カイ)","愛知県","名古屋市中村区","名駅JRセントラルタワーズ(51階)",0,0,0,0,0,0',
        '23105,"450  ","4506290","アイチケン","ナゴヤシナカムラク","メイエキミッドランドスクエア(コウソウトウ)(チカイ・カイソウフメイ)","愛知県","名古屋市中村区","名駅ミッドランドスクエア(高層棟)(地階・階層不明)",0,0,0,0,0,0',
        '23105,"450  ","4506201","アイチケン","ナゴヤシナカムラク","メイエキミッドランドスクエア(コウソウトウ)(1カイ)","愛知県","名古屋市中村区","名駅ミッドランドスクエア(高層棟)(1階)",0,0,0,0,0,0',
    ]))
    postal = generate_parser()

    # 1
    item = next(postal)
    assert item.zip == '1500013'
    assert item.town == '恵比寿'
    assert item.town_kana == 'エビス'
    assert item.build == ''
    assert item.build_kana == ''
    assert item.floor == ''
    # 2
    item = next(postal)
    assert item.zip == '1506090'
    assert item.town == '恵比寿'
    assert item.town_kana == 'エビス'
    assert item.build == '恵比寿ガーデンプレイス'
    assert item.build_kana == 'エビスガーデンプレイス'
    assert item.floor == ''
    # 3
    item = next(postal)
    assert item.zip == '1506001'
    assert item.town == '恵比寿'
    assert item.town_kana == 'エビス'
    assert item.build == '恵比寿ガーデンプレイス'
    assert item.build_kana == 'エビスガーデンプレイス'
    assert item.floor == '1'
    # 4
    item = next(postal)
    assert item.zip == '1500021'
    assert item.town == '恵比寿西'
    assert item.town_kana == 'エビスニシ'
    assert item.build == ''
    assert item.build_kana == ''
    assert item.floor == ''
    # 5
    item = next(postal)
    assert item.zip == '4530002'
    assert item.town == '名駅'
    assert item.town_kana == 'メイエキ'
    assert item.build == ''
    assert item.build_kana == ''
    assert item.floor == ''
    # 6
    item = next(postal)
    assert item.zip == '4506051'
    assert item.town == '名駅'
    assert item.town_kana == 'メイエキ'
    assert item.build == 'JRセントラルタワーズ'
    assert item.build_kana == 'ジェイアールセントラルタワーズ'
    assert item.floor == '51'
    # 7
    item = next(postal)
    assert item.zip == '4506290'
    assert item.town == '名駅'
    assert item.town_kana == 'メイエキ'
    assert item.build == 'ミッドランドスクエア'
    assert item.build_kana == 'ミッドランドスクエア'
    assert item.floor == ''
    # 8
    item = next(postal)
    assert item.zip == '4506201'
    assert item.town == '名駅'
    assert item.town_kana == 'メイエキ'
    assert item.build == 'ミッドランドスクエア'
    assert item.build_kana == 'ミッドランドスクエア'
    assert item.floor == '1'
| 36.523256 | 163 | 0.592805 | 453 | 3,141 | 4.048565 | 0.189845 | 0.261723 | 0.039258 | 0.032715 | 0.600872 | 0.594875 | 0.473282 | 0.461287 | 0.437296 | 0.423119 | 0 | 0.121632 | 0.196434 | 3,141 | 85 | 164 | 36.952941 | 0.608558 | 0.004776 | 0 | 0.614286 | 0 | 0.128571 | 0.410026 | 0.303021 | 0 | 0 | 0 | 0 | 0.685714 | 1 | 0.014286 | false | 0 | 0.014286 | 0 | 0.028571 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3fa6e839a7e806f05291a97579dbb9d6e1043ccc | 1,217 | py | Python | gefapi/services/__init__.py | erick-otenyo/trends.earth-API | 143a4798965e9daa4a53626ace984c7e5b9e359d | [
"X11"
] | null | null | null | gefapi/services/__init__.py | erick-otenyo/trends.earth-API | 143a4798965e9daa4a53626ace984c7e5b9e359d | [
"X11"
] | null | null | null | gefapi/services/__init__.py | erick-otenyo/trends.earth-API | 143a4798965e9daa4a53626ace984c7e5b9e359d | [
"X11"
] | null | null | null | """GEFAPI SERVICES MODULE"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import sys
import logging
import rollbar
from rollbar.logger import RollbarHandler
from gefapi.services.docker_service import DockerService, docker_build, docker_run
from gefapi.services.email_service import EmailService
from gefapi.services.script_service import ScriptService
from gefapi.services.user_service import UserService
from gefapi.services.execution_service import ExecutionService
# Ensure all unhandled exceptions are logged, and reported to rollbar
logger = logging.getLogger(__name__)
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)
rollbar.init(os.getenv('ROLLBAR_SERVER_TOKEN'), os.getenv('ENV'))
rollbar_handler = RollbarHandler()
rollbar_handler.setLevel(logging.ERROR)
logger.addHandler(rollbar_handler)
def handle_exception(exc_type, exc_value, exc_traceback):
    if issubclass(exc_type, KeyboardInterrupt):
        sys.__excepthook__(exc_type, exc_value, exc_traceback)
        return
    logger.critical("Uncaught exception", exc_info=(exc_type, exc_value, exc_traceback))
sys.excepthook = handle_exception
| 33.805556 | 88 | 0.828266 | 153 | 1,217 | 6.281046 | 0.437909 | 0.087409 | 0.093652 | 0.046826 | 0.084287 | 0.084287 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10189 | 1,217 | 35 | 89 | 34.771429 | 0.879231 | 0.074774 | 0 | 0 | 0 | 0 | 0.036607 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.5 | 0 | 0.576923 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3fb2cf3c8c5a92866783564a2731ce98239e0325 | 135 | py | Python | lesson15/HomeWork/task8.py | zainllw0w/skillbox | 896287b6f7f5612cf589094131fd1a12b0b192ba | [
"MIT"
] | null | null | null | lesson15/HomeWork/task8.py | zainllw0w/skillbox | 896287b6f7f5612cf589094131fd1a12b0b192ba | [
"MIT"
] | null | null | null | lesson15/HomeWork/task8.py | zainllw0w/skillbox | 896287b6f7f5612cf589094131fd1a12b0b192ba | [
"MIT"
] | null | null | null | n = [1, 2, 3, 4, 5]
new = []
k = int(input('Shift: '))
for _ in range(5):
    new.append(n[-k])
    k -= 1
print(n)
print(new) | 15 | 25 | 0.459259 | 25 | 135 | 2.44 | 0.64 | 0.131148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073684 | 0.296296 | 135 | 9 | 26 | 15 | 0.568421 | 0 | 0 | 0 | 0 | 0 | 0.051471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3fb5815dea127e7ae6818309480f87b18fed8de6 | 3,364 | py | Python | {{cookiecutter.project_slug}}/config/urls.py | julianwagle/cookiecutter-django-react | bce2e264cc90bfe89bb7f6f1300441f725731b12 | [
"BSD-3-Clause"
] | 2 | 2021-08-10T18:33:49.000Z | 2021-11-09T15:37:51.000Z | {{cookiecutter.project_slug}}/config/urls.py | julianwagle/cookiecutter-django-react | bce2e264cc90bfe89bb7f6f1300441f725731b12 | [
"BSD-3-Clause"
] | 1 | 2022-03-10T23:38:06.000Z | 2022-03-11T15:22:48.000Z | {{cookiecutter.project_slug}}/config/urls.py | julianwagle/cookiecutter-django-react | bce2e264cc90bfe89bb7f6f1300441f725731b12 | [
"BSD-3-Clause"
] | 1 | 2021-08-13T21:34:33.000Z | 2021-08-13T21:34:33.000Z |
from django.conf import settings
from django.conf.urls.static import static
from django.contrib import admin
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.urls import path
from django.views import defaults as default_views
from django.conf.urls import include
from django.views.generic import RedirectView
from {{cookiecutter.project_slug}}.users.api.views import VerifyEmailView
# from rest_framework.authtoken.views import obtain_auth_token
urlpatterns = [
    path(settings.ADMIN_URL, admin.site.urls),
    path("api/", include("config.api_router")),
    path('api/', include('dj_rest_auth.urls')),
    # URLs that do not require a session or valid token
    # path('password/reset/', PasswordResetView.as_view(), name='rest_password_reset'),
    # path('password/reset/confirm/', PasswordResetConfirmView.as_view(), name='rest_password_reset_confirm'),
    # path('login/', LoginView.as_view(), name='rest_login'),
    # URLs that require a user to be logged in with a valid session / token.
    # path('logout/', LogoutView.as_view(), name='rest_logout'),
    # path('user/', UserDetailsView.as_view(), name='rest_user_details'),
    # path('password/change/', PasswordChangeView.as_view(), name='rest_password_change'),
    path('api/registration/', include('dj_rest_auth.registration.urls')),
    # path('', RegisterView.as_view(), name='rest_register'),
    # path('verify-email/', VerifyEmailView.as_view(), name='rest_verify_email'),
    # path('resend-email/', ResendEmailVerificationView.as_view(), name="rest_resend_email"),
    # https://dj-rest-auth.readthedocs.io/en/latest/api_endpoints.html
    # https://github.com/iMerica/dj-rest-auth/blob/master/dj_rest_auth/registration/urls.py
    path('api/confirm-email/<key>/', VerifyEmailView.as_view(), name='email_verification_sent'),
    path('account/', include('allauth.urls')),
    path('accounts/profile/', RedirectView.as_view(url='/', permanent=True), name='profile-redirect'),
    # path("api/auth-token/", obtain_auth_token),
    path('api/', include('{{cookiecutter.project_slug}}.articles.urls', namespace='articles')),
    path('api/', include('{{cookiecutter.project_slug}}.profiles.urls', namespace='profiles')),
    # urls that don't perfectly adhere to realworld specs
    # path('users/', RegistrationAPIView.as_view()), => now api/registration
    # path('users/login/', LoginAPIView.as_view()), => now api/login
] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)

if settings.DEBUG:
    # This allows the error pages to be debugged during development, just visit
    # these urls in a browser to see how these error pages look like.
    urlpatterns += [
        path("400/", default_views.bad_request, kwargs={"exception": Exception("Bad Request!")}),
        path("403/", default_views.permission_denied, kwargs={"exception": Exception("Permission Denied")}),
        path("404/", default_views.page_not_found, kwargs={"exception": Exception("Page not Found")}),
        path("500/", default_views.server_error),
    ]
    # Static file serving when using Gunicorn + Uvicorn for local web socket development
    urlpatterns += staticfiles_urlpatterns()

if "debug_toolbar" in settings.INSTALLED_APPS:
    import debug_toolbar

    urlpatterns = [path("__debug__/", include(debug_toolbar.urls))] + urlpatterns
| 51.753846 | 110 | 0.724138 | 425 | 3,364 | 5.564706 | 0.362353 | 0.032981 | 0.042283 | 0.053277 | 0.085412 | 0.054123 | 0 | 0 | 0 | 0 | 0 | 0.004098 | 0.129608 | 3,364 | 64 | 111 | 52.5625 | 0.80362 | 0.432818 | 0 | 0 | 0 | 0 | 0.217091 | 0.086518 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.322581 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
3fbb6d8f93e1a821ca13b85dfa711388bbbf33b8 | 1,052 | py | Python | src/djanban/apps/boards/migrations/0071_auto_20170530_1711.py | diegojromerolopez/djanban | 6451688d49cf235d03c604b19a6a8480b33eed87 | [
"MIT"
] | 33 | 2017-06-14T18:04:25.000Z | 2021-06-15T07:07:56.000Z | src/djanban/apps/boards/migrations/0071_auto_20170530_1711.py | diegojromerolopez/djanban | 6451688d49cf235d03c604b19a6a8480b33eed87 | [
"MIT"
] | 1 | 2017-05-10T08:45:55.000Z | 2017-05-10T08:45:55.000Z | src/djanban/apps/boards/migrations/0071_auto_20170530_1711.py | diegojromerolopez/djanban | 6451688d49cf235d03c604b19a6a8480b33eed87 | [
"MIT"
] | 8 | 2017-08-27T11:14:25.000Z | 2021-03-03T12:11:16.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-05-30 15:11
from __future__ import unicode_literals
from django.db import migrations, models
from django.db.models import Sum
# Set adjusted spent time initial value for each card
def set_card_adjusted_spent_time(apps, schema):
    Card = apps.get_model("boards", "Card")
    for card in Card.objects.all():
        card_adjusted_spent_time = card.daily_spent_times.all().\
            aggregate(adj_spent_time_sum=Sum("adjusted_spent_time"))["adj_spent_time_sum"]
        card.adjusted_spent_time = card_adjusted_spent_time
        card.save()


class Migration(migrations.Migration):

    dependencies = [
        ('boards', '0070_auto_20170530_1456'),
    ]

    operations = [
        migrations.AddField(
            model_name='card',
            name='adjusted_spent_time',
            field=models.DecimalField(decimal_places=4, default=None, max_digits=12, null=True, verbose_name='Adjusted spent time'),
        ),
        migrations.RunPython(set_card_adjusted_spent_time),
    ]
| 31.878788 | 132 | 0.692966 | 137 | 1,052 | 5.021898 | 0.50365 | 0.143895 | 0.222384 | 0.152616 | 0.172965 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041766 | 0.203422 | 1,052 | 32 | 133 | 32.875 | 0.779236 | 0.112167 | 0 | 0 | 1 | 0 | 0.126882 | 0.024731 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.136364 | 0 | 0.318182 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3fca602e52ec55b0def35c5194f95a5d6f0b0c8b | 537 | py | Python | tests/unit/load_balancer_test.py | esra-sengul/hazelcast-python-client | bdfd945aba9335cc583fd0c70e4812093f247d66 | [
"Apache-2.0"
] | null | null | null | tests/unit/load_balancer_test.py | esra-sengul/hazelcast-python-client | bdfd945aba9335cc583fd0c70e4812093f247d66 | [
"Apache-2.0"
] | null | null | null | tests/unit/load_balancer_test.py | esra-sengul/hazelcast-python-client | bdfd945aba9335cc583fd0c70e4812093f247d66 | [
"Apache-2.0"
] | null | null | null | import unittest
from hazelcast.util import RandomLB, RoundRobinLB


class _MockClusterService:
    def __init__(self, members):
        self._members = members

    def add_listener(self, listener, *_):
        for m in self._members:
            listener(m)

    def get_members(self):
        return self._members


class LoadBalancersTest(unittest.TestCase):
    def test_random_lb_with_no_members(self):
        cluster = _MockClusterService([])
        lb = RandomLB()
        lb.init(cluster)
        self.assertIsNone(lb.next())
| 22.375 | 49 | 0.668529 | 59 | 537 | 5.79661 | 0.508475 | 0.128655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.243948 | 537 | 23 | 50 | 23.347826 | 0.842365 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.25 | false | 0 | 0.125 | 0.0625 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3fcb6dc43074ea8057cd6962b1d2c7cdc34c51f7 | 2,618 | py | Python | dataModels/wall.py | JoelEager/pyTanks-Server | 4455c0475980ea1372f11d7ec41d95990b3c8f44 | [
"MIT"
] | 5 | 2017-09-04T09:36:40.000Z | 2019-10-31T20:52:13.000Z | dataModels/wall.py | JoelEager/pyTanks-Server | 4455c0475980ea1372f11d7ec41d95990b3c8f44 | [
"MIT"
] | 3 | 2018-12-28T08:09:51.000Z | 2019-01-04T20:36:47.000Z | dataModels/wall.py | JoelEager/pyTanks-Server | 4455c0475980ea1372f11d7ec41d95990b3c8f44 | [
"MIT"
] | 1 | 2018-12-28T08:12:36.000Z | 2018-12-28T08:12:36.000Z | from random import randint
import config
class wall:
    """
    Stores the state data for a wall on the map
    """

    def __init__(self):
        """
        Randomly generates a wall using the bounding values in config.py
        """
        # Set lengths for the long and short sides of the wall
        longSide = randint(config.game.wall.longSideBounds[0], config.game.wall.longSideBounds[1])
        shortSide = randint(config.game.wall.shortSideBounds[0], config.game.wall.shortSideBounds[1])

        # Decide if this is going to be a tall or long wall
        if randint(0, 2) == 0:
            self.width, self.height = longSide, shortSide
            self.x = randint(config.game.wall.placementPadding, config.game.map.width -
                             config.game.wall.placementPadding - config.game.wall.longSideBounds[0])
            self.y = randint(config.game.wall.placementPadding, config.game.map.height -
                             config.game.wall.placementPadding - config.game.wall.shortSideBounds[0])
        else:
            self.height, self.width = longSide, shortSide
            self.y = randint(config.game.wall.placementPadding, config.game.map.height -
                             config.game.wall.placementPadding - config.game.wall.longSideBounds[0])
            self.x = randint(config.game.wall.placementPadding, config.game.map.width -
                             config.game.wall.placementPadding - config.game.wall.shortSideBounds[0])

        # Check to make sure the wall doesn't go too far
        if self.x + self.width > config.game.map.width - config.game.wall.placementPadding:
            self.width = config.game.map.width - config.game.wall.placementPadding - self.x
        elif self.y + self.height > config.game.map.height - config.game.wall.placementPadding:
            self.height = config.game.map.height - config.game.wall.placementPadding - self.y

        # Correct x and y to be the center of the wall instead of top-left corner
        self.x += self.width / 2
        self.y += self.height / 2

    def toPoly(self, margin=0):
        """
        :param margin: If set the polygon will have a padding of margin pixels in every direction
        :return: The wall's polygon as a list of points as tuples
        """
        halfWidth = (self.width / 2) + margin
        halfHeight = (self.height / 2) + margin
        return [(self.x - halfWidth, self.y - halfHeight),
                (self.x + halfWidth, self.y - halfHeight),
                (self.x + halfWidth, self.y + halfHeight),
(self.x - halfWidth, self.y + halfHeight)] | 51.333333 | 101 | 0.624523 | 332 | 2,618 | 4.912651 | 0.26506 | 0.171674 | 0.171674 | 0.220723 | 0.548743 | 0.512569 | 0.512569 | 0.512569 | 0.512569 | 0.492949 | 0 | 0.008403 | 0.272727 | 2,618 | 51 | 102 | 51.333333 | 0.848214 | 0.182964 | 0 | 0.258065 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0 | 0.064516 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3fd1ac9ae5a4406a52aec0104089a12d701aff36 | 282 | py | Python | rfhub2/db/model/doc_mixin.py | Wolfe1/rfhub2 | 7bc5bd95a5b80f0dec62211bc1771d11d604e01b | [
"Apache-2.0"
] | null | null | null | rfhub2/db/model/doc_mixin.py | Wolfe1/rfhub2 | 7bc5bd95a5b80f0dec62211bc1771d11d604e01b | [
"Apache-2.0"
] | null | null | null | rfhub2/db/model/doc_mixin.py | Wolfe1/rfhub2 | 7bc5bd95a5b80f0dec62211bc1771d11d604e01b | [
"Apache-2.0"
] | null | null | null | from robot.libdocpkg.htmlwriter import DocToHtml
class DocMixin:
    @property
    def synopsis(self) -> str:
        return self.doc.splitlines()[0] if self.doc else ""

    @property
    def html_doc(self) -> str:
        return DocToHtml("ROBOT")(self.doc) if self.doc else ""
| 23.5 | 63 | 0.656028 | 37 | 282 | 4.972973 | 0.540541 | 0.152174 | 0.141304 | 0.141304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004566 | 0.223404 | 282 | 11 | 64 | 25.636364 | 0.835616 | 0 | 0 | 0.25 | 0 | 0 | 0.017731 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.25 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
3fe0c731b4b1c6ac01da7200976dca2b288d2077 | 235 | py | Python | chapter-5/basic_hook/get_hooked.py | carltyndall/Modular-Programming-with-Python | efe1c725602b2148fdeb530e89381895c3e7f696 | [
"MIT"
] | null | null | null | chapter-5/basic_hook/get_hooked.py | carltyndall/Modular-Programming-with-Python | efe1c725602b2148fdeb530e89381895c3e7f696 | [
"MIT"
] | null | null | null | chapter-5/basic_hook/get_hooked.py | carltyndall/Modular-Programming-with-Python | efe1c725602b2148fdeb530e89381895c3e7f696 | [
"MIT"
] | null | null | null | hooked_function = None
def set_hook(hook):
    global hooked_function
    hooked_function = hook


def do_it():
    if hooked_function is not None:
        hooked_function()
    else:
        print("Did not get hooked")
3fe4e8bbc504925f650df701d0f5f891bb86af26 | 1,365 | py | Python | utils/freshrotation.py | joergsimon/gesture-analysis | 1077b19c20bb7bf437df7de4938528accc93acd1 | [
"Apache-2.0"
] | null | null | null | utils/freshrotation.py | joergsimon/gesture-analysis | 1077b19c20bb7bf437df7de4938528accc93acd1 | [
"Apache-2.0"
] | null | null | null | utils/freshrotation.py | joergsimon/gesture-analysis | 1077b19c20bb7bf437df7de4938528accc93acd1 | [
"Apache-2.0"
] | null | null | null | import numpy as np
def rotX(theta):
    return np.array([[1, 0, 0],
                     [0, np.cos(theta), -np.sin(theta)],
                     [0, np.sin(theta), np.cos(theta)]])


def rotY(theta):
    return np.array([[np.cos(theta), 0, np.sin(theta)],
                     [0, 1, 0],
                     [-np.sin(theta), 0, np.cos(theta)]])


def rotZ(theta):
    return np.array([[np.cos(theta), -np.sin(theta), 0],
                     [np.sin(theta), np.cos(theta), 0],
                     [0, 0, 1]])


def euler_matrix(x, y, z):
    return rotX(x).dot(rotY(y)).dot(rotZ(z))


def vector_slerp(v1, v2, fraction):
    perp_v = np.cross(v1, v2)
    # perp_v /= np.linalg.norm(perp_v)
    angle = np.arccos(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))) * fraction
    return rotation_matrix(angle, perp_v).dot(v1)


def unit_vector(v):
    return v / np.linalg.norm(v)


def rotation_matrix(angle, direction):
    sina = np.sin(angle)
    cosa = np.cos(angle)
    direction = unit_vector(direction)
    # rotation matrix around unit vector
    R = np.diag([cosa, cosa, cosa])
    R += np.outer(direction, direction) * (1.0 - cosa)
    direction *= sina
    R += np.array([[0.0, -direction[2], direction[1]],
                   [direction[2], 0.0, -direction[0]],
                   [-direction[1], direction[0], 0.0]])
    return R
return R | 31.022727 | 87 | 0.536264 | 202 | 1,365 | 3.574257 | 0.212871 | 0.022161 | 0.083102 | 0.060942 | 0.213296 | 0.177285 | 0.177285 | 0.113573 | 0.113573 | 0.113573 | 0 | 0.038697 | 0.280586 | 1,365 | 44 | 88 | 31.022727 | 0.696538 | 0.049084 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.21875 | false | 0 | 0.03125 | 0.15625 | 0.46875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
3ffe7c564ce6575a61047ee944d7eb5f1424abe0 | 351 | py | Python | config/core.py | Thedonutz/config-client | 40ad3a8ef19409775f66c61c12dcb580497e7ea4 | [
"Apache-2.0"
] | 20 | 2019-05-07T11:55:20.000Z | 2022-02-15T08:44:36.000Z | config/core.py | Thedonutz/config-client | 40ad3a8ef19409775f66c61c12dcb580497e7ea4 | [
"Apache-2.0"
] | 42 | 2019-06-21T00:50:38.000Z | 2022-03-02T16:56:10.000Z | config/core.py | Thedonutz/config-client | 40ad3a8ef19409775f66c61c12dcb580497e7ea4 | [
"Apache-2.0"
] | 12 | 2019-06-12T18:32:59.000Z | 2021-08-07T11:45:23.000Z | """Core functions."""
from functools import wraps


def singleton(cls):
    """Ensure singleton instance."""
    instances = {}

    @wraps(cls)
    def instance(*args, **kwargs):
        """Create class instance."""
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]

    return instance
| 20.647059 | 49 | 0.592593 | 37 | 351 | 5.621622 | 0.540541 | 0.096154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.267806 | 351 | 16 | 50 | 21.9375 | 0.809339 | 0.185185 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b7466e6f3649f809acb1c310d7fd25741199b1f7 | 2,437 | py | Python | Exp_Main/migrations/0005_auto_20210516_1527.py | SimonSchubotz/Electronic-Laboratory-Notebook | a5dc3daa76b07370c1ee5b7e74fb6c780c3d3c97 | [
"Apache-2.0"
] | null | null | null | Exp_Main/migrations/0005_auto_20210516_1527.py | SimonSchubotz/Electronic-Laboratory-Notebook | a5dc3daa76b07370c1ee5b7e74fb6c780c3d3c97 | [
"Apache-2.0"
] | null | null | null | Exp_Main/migrations/0005_auto_20210516_1527.py | SimonSchubotz/Electronic-Laboratory-Notebook | a5dc3daa76b07370c1ee5b7e74fb6c780c3d3c97 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 3.1.3 on 2021-05-16 13:27
import datetime
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('Exp_Sub', '0005_auto_20210516_1527'),
        ('Lab_Dash', '0005_auto_20210516_1527'),
        ('Exp_Main', '0004_auto_20210506_2243'),
    ]

    operations = [
        migrations.AlterField(
            model_name='liquid',
            name='Born',
            field=models.DateTimeField(blank=True, default=datetime.datetime(2021, 5, 16, 15, 27, 22, 850169), null=True),
        ),
        migrations.AlterField(
            model_name='liquid',
            name='Death',
            field=models.DateTimeField(blank=True, default=datetime.datetime(2021, 5, 16, 15, 27, 22, 850169), null=True),
        ),
        migrations.CreateModel(
            name='RSD',
            fields=[
                ('expbase_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='Exp_Main.expbase')),
                ('Link', models.TextField(blank=True, null=True)),
                ('Link_Data', models.TextField(blank=True, null=True)),
                ('Link_PDF', models.TextField(blank=True, null=True)),
                ('Link_Osz_join_LSP', models.TextField(blank=True, null=True)),
                ('Temp_Observation', models.TextField(blank=True, null=True)),
                ('Temp_Hypothesis', models.TextField(blank=True, null=True)),
                ('Temp_Mixing_ratio', models.TextField(blank=True, null=True)),
                ('Temp_Atmosphere_relax', models.TextField(blank=True, null=True)),
                ('Temp_Flowrate', models.TextField(blank=True, null=True)),
                ('Temp_Volume', models.TextField(blank=True, null=True)),
                ('Temp_Buzz_word', models.TextField(blank=True, null=True)),
                ('Temp_Bath_time', models.TextField(blank=True, null=True)),
                ('Dash', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='Lab_Dash.oca')),
                ('Liquid', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='Exp_Main.liquid')),
                ('Sub_Exp', models.ManyToManyField(blank=True, to='Exp_Sub.ExpBase')),
            ],
            bases=('Exp_Main.expbase',),
        ),
    ]
| 48.74 | 194 | 0.610587 | 279 | 2,437 | 5.175627 | 0.318996 | 0.105956 | 0.126039 | 0.16482 | 0.600416 | 0.600416 | 0.531163 | 0.256925 | 0.228532 | 0.228532 | 0 | 0.054683 | 0.242101 | 2,437 | 49 | 195 | 49.734694 | 0.727125 | 0.018465 | 0 | 0.209302 | 1 | 0 | 0.157741 | 0.037657 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.069767 | 0 | 0.139535 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b74eb80ea94d0f21461c6e09d6f6576893245540 | 206 | py | Python | handlingException.py | nkukadiya89/learn-python | d0a8c438dd77b8feeb1e0126ec379873ef4b2978 | [
"MIT"
] | 1 | 2021-06-16T16:42:41.000Z | 2021-06-16T16:42:41.000Z | handlingException.py | nkukadiya89/learn-python | d0a8c438dd77b8feeb1e0126ec379873ef4b2978 | [
"MIT"
] | null | null | null | handlingException.py | nkukadiya89/learn-python | d0a8c438dd77b8feeb1e0126ec379873ef4b2978 | [
"MIT"
] | null | null | null | #Handling Exceptions
try:
    age = int(input("Enter your age:"))
except ValueError as ex:
    print(ex)
    print(type(ex))
    print("Please enter valid age!")
else:
    print("else part executed")
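The try/except/else flow above can be exercised without interactive input by wrapping the conversion in a function; a minimal sketch (the `parse_age` name is my own, not from the original file):

```python
def parse_age(raw):
    """Return the parsed age, or None when the input is not a valid integer."""
    try:
        age = int(raw)
    except ValueError:
        # Conversion failed; signal the caller instead of printing.
        return None
    else:
        # The else branch runs only when the try block raised nothing.
        return age

print(parse_age("42"))   # 42
print(parse_age("abc"))  # None
```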
| 17.166667 | 39 | 0.635922 | 28 | 206 | 4.678571 | 0.678571 | 0.160305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.23301 | 206 | 11 | 40 | 18.727273 | 0.829114 | 0.092233 | 0 | 0 | 0 | 0 | 0.302703 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b7500fd018d38736f2a7f72a0ede36d815c36cf3 | 718 | py | Python | app/models/school.py | jnsdrtlf/et | 7ebf03ef0dc3fe9f9ff9f361063572fd4fe6ac26 | [
"Apache-2.0"
] | null | null | null | app/models/school.py | jnsdrtlf/et | 7ebf03ef0dc3fe9f9ff9f361063572fd4fe6ac26 | [
"Apache-2.0"
] | null | null | null | app/models/school.py | jnsdrtlf/et | 7ebf03ef0dc3fe9f9ff9f361063572fd4fe6ac26 | [
"Apache-2.0"
] | null | null | null | from app.models import db, ma
class School(db.Model):
    """School

    This table stores the configuration for each individual school.
    Schools are addressed through their short name.
    """
    __tablename__ = 'school'

    id = db.Column(db.Integer, primary_key=True, unique=True, nullable=False)
    school_name = db.Column(db.String(48), nullable=False)
    # `short_name` can be used as a subdomain (e.g. abc.tuutor.de)
    short_name = db.Column(db.String(8), nullable=False, unique=True)
    # Duration of a typical lesson (see `time` or `lesson`) in minutes
    lesson_duration = db.Column(db.Integer, nullable=False, default=45)

    def __repr__(self):
        return f'<School {self.short_name}>'
| 31.217391 | 77 | 0.692201 | 104 | 718 | 4.644231 | 0.615385 | 0.074534 | 0.082816 | 0.070393 | 0.082816 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008621 | 0.192201 | 718 | 22 | 78 | 32.636364 | 0.824138 | 0.342618 | 0 | 0 | 0 | 0 | 0.070796 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.111111 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
b761c3b87b1f697ea2ac6d2f45661a67ddf76f9f | 1,254 | py | Python | ansible/venv/lib/python2.7/site-packages/ansible/plugins/doc_fragments/tower.py | gvashchenkolineate/gvashchenkolineate_infra_trytravis | 0fb18850afe0d8609693ba4b23f29c7cda17d97f | [
"MIT"
] | 17 | 2017-06-07T23:15:01.000Z | 2021-08-30T14:32:36.000Z | ansible/ansible/plugins/doc_fragments/tower.py | SergeyCherepanov/ansible | 875711cd2fd6b783c812241c2ed7a954bf6f670f | [
"MIT"
] | 9 | 2017-06-25T03:31:52.000Z | 2021-05-17T23:43:12.000Z | ansible/ansible/plugins/doc_fragments/tower.py | SergeyCherepanov/ansible | 875711cd2fd6b783c812241c2ed7a954bf6f670f | [
"MIT"
] | 3 | 2018-05-26T21:31:22.000Z | 2019-09-28T17:00:45.000Z | # -*- coding: utf-8 -*-
# Copyright: (c) 2017, Wayne Witzel III <wayne@riotousliving.com>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
class ModuleDocFragment(object):
    # Ansible Tower documentation fragment
    DOCUMENTATION = r'''
options:
  tower_host:
    description:
    - URL to your Tower instance.
    type: str
  tower_username:
    description:
    - Username for your Tower instance.
    type: str
  tower_password:
    description:
    - Password for your Tower instance.
    type: str
  validate_certs:
    description:
    - Whether to allow insecure connections to Tower.
    - If C(no), SSL certificates will not be validated.
    - This should only be used on personally controlled sites using self-signed certificates.
    type: bool
    aliases: [ tower_verify_ssl ]
  tower_config_file:
    description:
    - Path to the Tower config file.
    type: path

requirements:
- ansible-tower-cli >= 3.0.2

notes:
- If no I(config_file) is provided we will attempt to use the tower-cli library
  defaults to find your Tower host information.
- I(config_file) should contain Tower configuration in the following format
  host=hostname
  username=username
  password=password
'''
| 26.680851 | 93 | 0.708931 | 168 | 1,254 | 5.232143 | 0.589286 | 0.040956 | 0.05802 | 0.071672 | 0.100114 | 0.100114 | 0 | 0 | 0 | 0 | 0 | 0.012121 | 0.210526 | 1,254 | 46 | 94 | 27.26087 | 0.875758 | 0.169856 | 0 | 0.222222 | 0 | 0 | 0.929537 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b76abc107a062e832633bcb2ca38915051f23c3f | 1,025 | py | Python | src/accounts/api/views.py | kkweon/django-blog-rest | 3e721db008a1e6c17941fb61a05879329766fddb | [
"MIT"
] | null | null | null | src/accounts/api/views.py | kkweon/django-blog-rest | 3e721db008a1e6c17941fb61a05879329766fddb | [
"MIT"
] | null | null | null | src/accounts/api/views.py | kkweon/django-blog-rest | 3e721db008a1e6c17941fb61a05879329766fddb | [
"MIT"
] | null | null | null | from django.contrib.auth import get_user_model
from rest_framework.generics import CreateAPIView
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from rest_framework.status import HTTP_200_OK, HTTP_400_BAD_REQUEST
from rest_framework.views import APIView
from accounts.api.serializers import UserCreateSerializer, UserLoginSerializer
User = get_user_model()
class UserCreateAPIView(CreateAPIView):
    serializer_class = UserCreateSerializer
    queryset = User.objects.all()
    permission_classes = [AllowAny]


class UserLoginAPIView(APIView):
    permission_classes = [AllowAny]
    serializer_class = UserLoginSerializer

    def post(self, request, *args, **kwargs):
        data = request.data
        serializer = UserLoginSerializer(data=data)
        if serializer.is_valid(raise_exception=True):
            new_data = serializer.data
            return Response(new_data, status=HTTP_200_OK)
        return Response(serializer.errors, status=HTTP_400_BAD_REQUEST)
| 33.064516 | 78 | 0.779512 | 118 | 1,025 | 6.542373 | 0.449153 | 0.051813 | 0.110104 | 0.044041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013905 | 0.158049 | 1,025 | 30 | 79 | 34.166667 | 0.880649 | 0 | 0 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.318182 | 0 | 0.772727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b77200cd10d165b552812b97c20c7f3c2edce121 | 422 | py | Python | django_workflow_system/models/abstract_models.py | eikonomega/django-workflow-system | dc0e8807263266713d3d7fa46e240e8d72db28d1 | [
"MIT"
] | 2 | 2022-01-28T12:35:42.000Z | 2022-03-23T16:06:05.000Z | django_workflow_system/models/abstract_models.py | eikonomega/django-workflow-system | dc0e8807263266713d3d7fa46e240e8d72db28d1 | [
"MIT"
] | 10 | 2021-04-27T20:26:32.000Z | 2021-07-21T15:34:31.000Z | django_workflow_system/models/abstract_models.py | eikonomega/django-workflow-system | dc0e8807263266713d3d7fa46e240e8d72db28d1 | [
"MIT"
] | 1 | 2021-11-13T14:30:34.000Z | 2021-11-13T14:30:34.000Z | """Application Wide Abstract Model Definitions."""
from django.db import models
class CreatedModifiedAbstractModel(models.Model):
    """
    Abstract base model that is used to add `created_date`
    and `modified_date` fields to all descendant models.
    """
    created_date = models.DateTimeField(auto_now_add=True)
    modified_date = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True
| 24.823529 | 58 | 0.725118 | 51 | 422 | 5.862745 | 0.588235 | 0.073579 | 0.153846 | 0.180602 | 0.200669 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191943 | 422 | 16 | 59 | 26.375 | 0.876833 | 0.36019 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b77ae041039444eabec2a0f4765645961d3e1a52 | 432 | py | Python | blackmamba/util/path.py | oz90210/blackmamba | 65c82c8e99028d6fbb57098ce82d0a394df215a0 | [
"MIT"
] | 61 | 2017-08-21T21:37:36.000Z | 2020-12-21T07:41:14.000Z | blackmamba/util/path.py | oz90210/blackmamba | 65c82c8e99028d6fbb57098ce82d0a394df215a0 | [
"MIT"
] | 39 | 2017-08-20T15:10:36.000Z | 2020-03-31T18:45:57.000Z | blackmamba/util/path.py | oz90210/blackmamba | 65c82c8e99028d6fbb57098ce82d0a394df215a0 | [
"MIT"
] | 12 | 2017-08-24T08:38:49.000Z | 2020-12-02T02:04:50.000Z | #!python3
import os
_DOCUMENTS = os.path.expanduser('~/Documents')
def strip_documents_folder(path):
    """
    Strip ~/Documents part of the path.

    ~/Documents/hallo.py -> hallo.py
    ~/Documents/folder/a.py -> folder/a.py
    """
    if path.startswith(_DOCUMENTS):
        return path[len(_DOCUMENTS) + 1:]
    return path


def is_python_file(path):
    _, ext = os.path.splitext(path)
    return ext.lower() == '.py'
| 18.782609 | 46 | 0.636574 | 56 | 432 | 4.767857 | 0.464286 | 0.044944 | 0.067416 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005865 | 0.210648 | 432 | 22 | 47 | 19.636364 | 0.777126 | 0.270833 | 0 | 0 | 0 | 0 | 0.04811 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b7927136694ac6734f303c1679226e491ed2ab39 | 287 | py | Python | release/scripts/mgear/version.py | yamahigashi/mgear4 | d439a69bbdc0ec727cded924a616b14194dfbe00 | [
"MIT"
] | 72 | 2020-09-28T20:00:59.000Z | 2022-03-25T14:35:14.000Z | release/scripts/mgear/version.py | Mikfr83/mgear4 | 2fa28080027f1004e8e0139ccf93f7ec2448b1fd | [
"MIT"
] | 101 | 2020-09-28T19:53:53.000Z | 2022-03-31T01:44:41.000Z | release/scripts/mgear/version.py | Mikfr83/mgear4 | 2fa28080027f1004e8e0139ccf93f7ec2448b1fd | [
"MIT"
] | 32 | 2020-10-09T10:49:45.000Z | 2022-03-31T08:27:37.000Z | import mgear
VERSION_MAJOR = mgear.VERSION[0]
VERSION_MINOR = mgear.VERSION[1]
VERSION_PATCH = mgear.VERSION[2]
version_info = (VERSION_MAJOR, VERSION_MINOR, VERSION_PATCH)
version = '%i.%i.%i' % version_info
__version__ = version
__all__ = ['version', 'version_info', '__version__']
| 23.916667 | 60 | 0.756098 | 38 | 287 | 5.157895 | 0.315789 | 0.244898 | 0.27551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011765 | 0.111498 | 287 | 11 | 61 | 26.090909 | 0.756863 | 0 | 0 | 0 | 0 | 0 | 0.132404 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b796f663b1a4a5f1ed0520d6f0d8e6a33e590d8a | 401 | py | Python | theano/sparse/__init__.py | michaelosthege/aesara | 55c88832ba71f87c9612d573ede74a4c042ef570 | [
"BSD-3-Clause"
] | 1 | 2020-12-30T19:12:52.000Z | 2020-12-30T19:12:52.000Z | theano/sparse/__init__.py | michaelosthege/aesara | 55c88832ba71f87c9612d573ede74a4c042ef570 | [
"BSD-3-Clause"
] | null | null | null | theano/sparse/__init__.py | michaelosthege/aesara | 55c88832ba71f87c9612d573ede74a4c042ef570 | [
"BSD-3-Clause"
] | null | null | null | from warnings import warn
try:
    import scipy

    enable_sparse = True
except ImportError:
    enable_sparse = False
    warn("SciPy can't be imported. Sparse matrix support is disabled.")

from theano.sparse.type import *

if enable_sparse:
    from theano.sparse import opt, sharedvar
    from theano.sparse.basic import *
    from theano.sparse.sharedvar import sparse_constructor as shared
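The conditional-import pattern above (a feature flag set by a try/except around an optional dependency) can be sketched in isolation; `json` stands in for the optional package purely for illustration:

```python
from warnings import warn

try:
    import json  # stand-in for an optional third-party dependency
    enable_feature = True
except ImportError:
    enable_feature = False
    warn("Optional dependency can't be imported. Feature is disabled.")

print(enable_feature)  # True here, since json ships with the standard library
```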
| 21.105263 | 72 | 0.745636 | 54 | 401 | 5.462963 | 0.537037 | 0.135593 | 0.216949 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.199501 | 401 | 18 | 73 | 22.277778 | 0.919003 | 0 | 0 | 0 | 0 | 0 | 0.149626 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b7a8885bf19ed5536c860f41027d45c7b2de322d | 247 | py | Python | src/cupyopt/__init__.py | staylorx/cupyopt | ea264dedc6797cd1c52ba3c61fb17f307ab278da | [
"Apache-2.0"
] | 4 | 2020-07-27T16:53:56.000Z | 2021-03-12T20:46:34.000Z | src/cupyopt/__init__.py | staylorx/cupyopt | ea264dedc6797cd1c52ba3c61fb17f307ab278da | [
"Apache-2.0"
] | 10 | 2020-07-28T16:29:07.000Z | 2020-12-07T22:59:25.000Z | src/cupyopt/__init__.py | staylorx/cupyopt | ea264dedc6797cd1c52ba3c61fb17f307ab278da | [
"Apache-2.0"
] | 2 | 2020-10-19T20:48:25.000Z | 2021-01-15T16:04:04.000Z | """ Init for cupyopt base """
from .oradb_tasks import ORADBGetEngine, ORADBSelectToDataFrame
from .sftp_tasks import (
    DFGetOldestFile,
    SFTPExists,
    SFTPGet,
    SFTPPoll,
    SFTPPut,
    SFTPRemove,
)
from .task_factory import ptask
| 20.583333 | 63 | 0.724696 | 25 | 247 | 7.04 | 0.8 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202429 | 247 | 11 | 64 | 22.454545 | 0.893401 | 0.08502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.3 | 0 | 0.3 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b7a9fe3f98e400c0cda8edec1e995b546d6e32b5 | 8,618 | py | Python | venv/Lib/site-packages/mediapipe/calculators/util/non_max_suppression_calculator_pb2.py | Farhan-Malik/advance-hand-gesture | 0ebe21ddd7c8c2eb14746678be57b33d38c47205 | [
"MIT"
] | 41 | 2021-06-19T13:57:18.000Z | 2021-12-02T17:08:53.000Z | venv/Lib/site-packages/mediapipe/calculators/util/non_max_suppression_calculator_pb2.py | HxnDev/Pose-Detection | 2be27e88cf79a0fb643c5047158cba478c770be9 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/mediapipe/calculators/util/non_max_suppression_calculator_pb2.py | HxnDev/Pose-Detection | 2be27e88cf79a0fb643c5047158cba478c770be9 | [
"MIT"
] | 4 | 2021-07-02T03:09:51.000Z | 2021-11-25T13:00:10.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: mediapipe/calculators/util/non_max_suppression_calculator.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from mediapipe.framework import calculator_pb2 as mediapipe_dot_framework_dot_calculator__pb2
mediapipe_dot_framework_dot_calculator__options__pb2 = mediapipe_dot_framework_dot_calculator__pb2.mediapipe_dot_framework_dot_calculator__options__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
  name='mediapipe/calculators/util/non_max_suppression_calculator.proto',
  package='mediapipe',
  syntax='proto2',
serialized_pb=_b('\n?mediapipe/calculators/util/non_max_suppression_calculator.proto\x12\tmediapipe\x1a$mediapipe/framework/calculator.proto\"\xf5\x04\n\"NonMaxSuppressionCalculatorOptions\x12 \n\x15num_detection_streams\x18\x01 \x01(\x05:\x01\x31\x12\x1e\n\x12max_num_detections\x18\x02 \x01(\x05:\x02-1\x12\x1f\n\x13min_score_threshold\x18\x06 \x01(\x02:\x02-1\x12$\n\x19min_suppression_threshold\x18\x03 \x01(\x02:\x01\x31\x12X\n\x0coverlap_type\x18\x04 \x01(\x0e\x32\x39.mediapipe.NonMaxSuppressionCalculatorOptions.OverlapType:\x07JACCARD\x12\x1f\n\x17return_empty_detections\x18\x05 \x01(\x08\x12V\n\talgorithm\x18\x07 \x01(\x0e\x32:.mediapipe.NonMaxSuppressionCalculatorOptions.NmsAlgorithm:\x07\x44\x45\x46\x41ULT\"k\n\x0bOverlapType\x12\x1c\n\x18UNSPECIFIED_OVERLAP_TYPE\x10\x00\x12\x0b\n\x07JACCARD\x10\x01\x12\x14\n\x10MODIFIED_JACCARD\x10\x02\x12\x1b\n\x17INTERSECTION_OVER_UNION\x10\x03\")\n\x0cNmsAlgorithm\x12\x0b\n\x07\x44\x45\x46\x41ULT\x10\x00\x12\x0c\n\x08WEIGHTED\x10\x01\x32[\n\x03\x65xt\x12\x1c.mediapipe.CalculatorOptions\x18\xbc\xa8\xb4\x1a \x01(\x0b\x32-.mediapipe.NonMaxSuppressionCalculatorOptions')
  ,
  dependencies=[mediapipe_dot_framework_dot_calculator__pb2.DESCRIPTOR,])
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_NONMAXSUPPRESSIONCALCULATOROPTIONS_OVERLAPTYPE = _descriptor.EnumDescriptor(
  name='OverlapType',
  full_name='mediapipe.NonMaxSuppressionCalculatorOptions.OverlapType',
  filename=None,
  file=DESCRIPTOR,
  values=[
    _descriptor.EnumValueDescriptor(
      name='UNSPECIFIED_OVERLAP_TYPE', index=0, number=0,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='JACCARD', index=1, number=1,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='MODIFIED_JACCARD', index=2, number=2,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='INTERSECTION_OVER_UNION', index=3, number=3,
      options=None,
      type=None),
  ],
  containing_type=None,
  options=None,
  serialized_start=503,
  serialized_end=610,
)
_sym_db.RegisterEnumDescriptor(_NONMAXSUPPRESSIONCALCULATOROPTIONS_OVERLAPTYPE)
_NONMAXSUPPRESSIONCALCULATOROPTIONS_NMSALGORITHM = _descriptor.EnumDescriptor(
  name='NmsAlgorithm',
  full_name='mediapipe.NonMaxSuppressionCalculatorOptions.NmsAlgorithm',
  filename=None,
  file=DESCRIPTOR,
  values=[
    _descriptor.EnumValueDescriptor(
      name='DEFAULT', index=0, number=0,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='WEIGHTED', index=1, number=1,
      options=None,
      type=None),
  ],
  containing_type=None,
  options=None,
  serialized_start=612,
  serialized_end=653,
)
_sym_db.RegisterEnumDescriptor(_NONMAXSUPPRESSIONCALCULATOROPTIONS_NMSALGORITHM)
_NONMAXSUPPRESSIONCALCULATOROPTIONS = _descriptor.Descriptor(
  name='NonMaxSuppressionCalculatorOptions',
  full_name='mediapipe.NonMaxSuppressionCalculatorOptions',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  fields=[
    _descriptor.FieldDescriptor(
      name='num_detection_streams', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.num_detection_streams', index=0,
      number=1, type=5, cpp_type=1, label=1,
      has_default_value=True, default_value=1,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='max_num_detections', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.max_num_detections', index=1,
      number=2, type=5, cpp_type=1, label=1,
      has_default_value=True, default_value=-1,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='min_score_threshold', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.min_score_threshold', index=2,
      number=6, type=2, cpp_type=6, label=1,
      has_default_value=True, default_value=float(-1),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='min_suppression_threshold', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.min_suppression_threshold', index=3,
      number=3, type=2, cpp_type=6, label=1,
      has_default_value=True, default_value=float(1),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='overlap_type', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.overlap_type', index=4,
      number=4, type=14, cpp_type=8, label=1,
      has_default_value=True, default_value=1,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='return_empty_detections', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.return_empty_detections', index=5,
      number=5, type=8, cpp_type=7, label=1,
      has_default_value=False, default_value=False,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
    _descriptor.FieldDescriptor(
      name='algorithm', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.algorithm', index=6,
      number=7, type=14, cpp_type=8, label=1,
      has_default_value=True, default_value=0,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
  ],
  extensions=[
    _descriptor.FieldDescriptor(
      name='ext', full_name='mediapipe.NonMaxSuppressionCalculatorOptions.ext', index=0,
      number=55383100, type=11, cpp_type=10, label=1,
      has_default_value=False, default_value=None,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=True, extension_scope=None,
      options=None),
  ],
  nested_types=[],
  enum_types=[
    _NONMAXSUPPRESSIONCALCULATOROPTIONS_OVERLAPTYPE,
    _NONMAXSUPPRESSIONCALCULATOROPTIONS_NMSALGORITHM,
  ],
  options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=117,
  serialized_end=746,
)
_NONMAXSUPPRESSIONCALCULATOROPTIONS.fields_by_name['overlap_type'].enum_type = _NONMAXSUPPRESSIONCALCULATOROPTIONS_OVERLAPTYPE
_NONMAXSUPPRESSIONCALCULATOROPTIONS.fields_by_name['algorithm'].enum_type = _NONMAXSUPPRESSIONCALCULATOROPTIONS_NMSALGORITHM
_NONMAXSUPPRESSIONCALCULATOROPTIONS_OVERLAPTYPE.containing_type = _NONMAXSUPPRESSIONCALCULATOROPTIONS
_NONMAXSUPPRESSIONCALCULATOROPTIONS_NMSALGORITHM.containing_type = _NONMAXSUPPRESSIONCALCULATOROPTIONS
DESCRIPTOR.message_types_by_name['NonMaxSuppressionCalculatorOptions'] = _NONMAXSUPPRESSIONCALCULATOROPTIONS
NonMaxSuppressionCalculatorOptions = _reflection.GeneratedProtocolMessageType('NonMaxSuppressionCalculatorOptions', (_message.Message,), dict(
  DESCRIPTOR = _NONMAXSUPPRESSIONCALCULATOROPTIONS,
  __module__ = 'mediapipe.calculators.util.non_max_suppression_calculator_pb2'
  # @@protoc_insertion_point(class_scope:mediapipe.NonMaxSuppressionCalculatorOptions)
  ))
_sym_db.RegisterMessage(NonMaxSuppressionCalculatorOptions)
_NONMAXSUPPRESSIONCALCULATOROPTIONS.extensions_by_name['ext'].message_type = _NONMAXSUPPRESSIONCALCULATOROPTIONS
mediapipe_dot_framework_dot_calculator__options__pb2.CalculatorOptions.RegisterExtension(_NONMAXSUPPRESSIONCALCULATOROPTIONS.extensions_by_name['ext'])
# @@protoc_insertion_point(module_scope)
| 47.351648 | 1,128 | 0.793456 | 970 | 8,618 | 6.743299 | 0.196907 | 0.040361 | 0.028589 | 0.085767 | 0.41645 | 0.36126 | 0.33634 | 0.308821 | 0.244764 | 0.244764 | 0 | 0.0396 | 0.106289 | 8,618 | 181 | 1,129 | 47.61326 | 0.80966 | 0.032954 | 0 | 0.45625 | 1 | 0.00625 | 0.272695 | 0.25048 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04375 | 0 | 0.04375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b7c3465361d539a79294776f7df6ddb3ed36d292 | 96 | py | Python | pyprac/bignumprd.py | passingbreeze/LearnAlgorithm | fa1f0706a6b66f2ee4f0e7c9aef5fd3b2507f01b | [
"MIT"
] | null | null | null | pyprac/bignumprd.py | passingbreeze/LearnAlgorithm | fa1f0706a6b66f2ee4f0e7c9aef5fd3b2507f01b | [
"MIT"
] | null | null | null | pyprac/bignumprd.py | passingbreeze/LearnAlgorithm | fa1f0706a6b66f2ee4f0e7c9aef5fd3b2507f01b | [
"MIT"
] | null | null | null | num = [int(i) for i in input().split()]
product = lambda x, y: x * y
print(product(num[0],num[1])) | 32 | 41 | 0.635417 | 20 | 96 | 3.05 | 0.7 | 0.065574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0.125 | 96 | 3 | 42 | 32 | 0.702381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b7cbb5a26a24b77470295a88c0959c494f0e9a09 | 3,301 | py | Python | leetcode/lessons/binary_search/__init__.py | wangkuntian/leetcode | e8dc9c8032c805a7d071ad19b94841ee8e52e834 | [
"MIT"
] | null | null | null | leetcode/lessons/binary_search/__init__.py | wangkuntian/leetcode | e8dc9c8032c805a7d071ad19b94841ee8e52e834 | [
"MIT"
] | 2 | 2020-03-24T18:00:21.000Z | 2020-03-26T11:33:51.000Z | leetcode/lessons/binary_search/__init__.py | wangkuntian/leetcode | e8dc9c8032c805a7d071ad19b94841ee8e52e834 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
__project__ = 'leetcode'
__file__ = '__init__.py'
__author__ = 'king'
__time__ = '2020/1/9 10:08'
_ooOoo_
o8888888o
88" . "88
(| -_- |)
O\ = /O
____/`---'\____
.' \\| |// `.
/ \\||| : |||// \
/ _||||| -:- |||||- \
| | \\\ - /// | |
| \_| ''\---/'' | |
\ .-\__ `-` ___/-. /
___`. .' /--.--\ `. . __
."" '< `.___\_<|>_/___.' >'"".
| | : `- \`.;`\ _ /`;.`/ - ` : | |
\ \ `-. \_ __\ /__ _/ .-` / /
======`-.____`-.___\_____/___.-`____.-'======
`=---='
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Buddha bless: no bugs forever
"""
"""
Binary search (also called half-interval search) is an efficient search method.
It requires the underlying data to be sorted in advance, and it completes a
lookup in time logarithmic in the size of the data.
However, binary search requires a sequence with random access (e.g. an array),
and it must be possible to infer the properties of the elements on either side
of the middle element from the middle element itself, so that the problem size
can be reduced.
"""
# Basic binary search
def binary_search(nums, target):
    left = 0
    right = len(nums) - 1
    while left <= right:
        mid = left + (right - left) // 2
        if nums[mid] == target:
            return mid
        elif nums[mid] < target:
            left = mid + 1
        elif nums[mid] > target:
            right = mid - 1
    return -1
# 1. Binary search for the left boundary
def left_bound(nums, target):
    if len(nums) == 0:
        return -1
    left = 0
    right = len(nums)
    while left < right:
        mid = left + (right - left) // 2
        if nums[mid] == target:
            right = mid
        elif nums[mid] < target:
            left = mid + 1
        elif nums[mid] > target:
            right = mid
    # Bounds check first: `left` equals len(nums) when target exceeds every element.
    return left if left < len(nums) and nums[left] == target else -1
# 2. Binary search for the left boundary
def left_bound2(nums, target):
    left = 0
    right = len(nums) - 1
    while left <= right:
        mid = left + (right - left) // 2
        if nums[mid] == target:
            right = mid - 1
        elif nums[mid] < target:
            left = mid + 1
        elif nums[mid] > target:
            right = mid - 1
    # The bounds check must come before indexing to avoid an IndexError.
    if left >= len(nums) or nums[left] != target:
        return -1
    return left
# 1. Binary search for the right boundary
def right_bound(nums, target):
    if len(nums) == 0:
        return -1
    left = 0
    right = len(nums)
    while left < right:
        mid = left + (right - left) // 2
        if nums[mid] == target:
            left = mid + 1
        elif nums[mid] < target:
            left = mid + 1
        elif nums[mid] > target:
            right = mid
    if right == 0 or nums[right - 1] != target:
        return -1
    return right - 1
# 2. Binary search for the right boundary
# (renamed to `right_bound2` to avoid shadowing the definition above)
def right_bound2(nums, target):
    if len(nums) == 0:
        return -1
    left = 0
    right = len(nums) - 1
    while left <= right:
        mid = left + (right - left) // 2
        if nums[mid] == target:
            left = mid + 1
        elif nums[mid] < target:
            left = mid + 1
        elif nums[mid] > target:
            right = mid - 1
    if right < 0 or nums[right] != target:
        return -1
    return right
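For reference, the boundary searches above agree with Python's standard `bisect` module; a small self-contained check (the sample array and target are my own illustration):

```python
import bisect

nums = [1, 2, 2, 2, 3]
target = 2

# First index holding target (left boundary).
left = bisect.bisect_left(nums, target)
# bisect_right returns the index just past the last occurrence.
right = bisect.bisect_right(nums, target) - 1

print(left, right)  # 1 3
```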
| 25.589147 | 56 | 0.400788 | 305 | 3,301 | 4.068852 | 0.186885 | 0.084609 | 0.157131 | 0.136986 | 0.681708 | 0.64303 | 0.64303 | 0.64303 | 0.64303 | 0.64303 | 0 | 0.034965 | 0.436837 | 3,301 | 128 | 57 | 25.789063 | 0.632598 | 0.336868 | 0 | 0.847222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.069444 | false | 0 | 0 | 0 | 0.236111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b7f577ff590a350a00238d5b26c5f551f6a3fef2 | 5,604 | py | Python | commandment/mdm/__init__.py | sreyemnayr/commandment | addfe4bad7abd501fd0fa6facde783dc6fe730d4 | [
"MIT"
] | 2 | 2018-12-05T12:47:03.000Z | 2019-06-27T12:08:01.000Z | commandment/mdm/__init__.py | sighttviewliu/commandment | a1418bf1eaa3adfdfb31a53155243c8112c1ef82 | [
"MIT"
] | null | null | null | commandment/mdm/__init__.py | sighttviewliu/commandment | a1418bf1eaa3adfdfb31a53155243c8112c1ef82 | [
"MIT"
] | null | null | null | from typing import Set
from enum import IntFlag, auto, Enum, IntEnum
class CommandType(Enum):
    ProfileList = 'ProfileList'
    InstallProfile = 'InstallProfile'
    RemoveProfile = 'RemoveProfile'
    ProvisioningProfileList = 'ProvisioningProfileList'
    InstallProvisioningProfile = 'InstallProvisioningProfile'
    RemoveProvisioningProfile = 'RemoveProvisioningProfile'
    CertificateList = 'CertificateList'
    InstalledApplicationList = 'InstalledApplicationList'
    DeviceInformation = 'DeviceInformation'
    SecurityInfo = 'SecurityInfo'
    DeviceLock = 'DeviceLock'
    RestartDevice = 'RestartDevice'
    ShutDownDevice = 'ShutDownDevice'
    ClearPasscode = 'ClearPasscode'
    EraseDevice = 'EraseDevice'
    RequestMirroring = 'RequestMirroring'
    StopMirroring = 'StopMirroring'
    Restrictions = 'Restrictions'
    ClearRestrictionsPasscode = 'ClearRestrictionsPasscode'

    # Shared iPad
    UserList = 'UserList'
    UnlockUserAccount = 'UnlockUserAccount'
    LogOutUser = 'LogOutUser'
    DeleteUser = 'DeleteUser'

    EnableLostMode = 'EnableLostMode'
    PlayLostModeSound = 'PlayLostModeSound'
    DisableLostMode = 'DisableLostMode'
    DeviceLocation = 'DeviceLocation'

    # Managed Applications
    InstallApplication = 'InstallApplication'
    ApplyRedemptionCode = 'ApplyRedemptionCode'
    ManagedApplicationList = 'ManagedApplicationList'
    RemoveApplication = 'RemoveApplication'
    InviteToProgram = 'InviteToProgram'
    ValidateApplications = 'ValidateApplications'

    # Books
    InstallMedia = 'InstallMedia'
    ManagedMediaList = 'ManagedMediaList'
    RemoveMedia = 'RemoveMedia'

    Settings = 'Settings'
    ManagedApplicationConfiguration = 'ManagedApplicationConfiguration'
    ApplicationConfiguration = 'ApplicationConfiguration'
    ManagedApplicationAttributes = 'ManagedApplicationAttributes'
    ManagedApplicationFeedback = 'ManagedApplicationFeedback'
    AccountConfiguration = 'AccountConfiguration'
    SetFirmwarePassword = 'SetFirmwarePassword'
    VerifyFirmwarePassword = 'VerifyFirmwarePassword'
    SetAutoAdminPassword = 'SetAutoAdminPassword'
    DeviceConfigured = 'DeviceConfigured'
    ScheduleOSUpdate = 'ScheduleOSUpdate'
    ScheduleOSUpdateScan = 'ScheduleOSUpdateScan'
    AvailableOSUpdates = 'AvailableOSUpdates'
    OSUpdateStatus = 'OSUpdateStatus'
    ActiveNSExtensions = 'ActiveNSExtensions'
    NSExtensionMappings = 'NSExtensionMappings'
    RotateFileVaultKey = 'RotateFileVaultKey'
class Platform(Enum):
    """The platform of the managed device."""
    Unknown = 'Unknown'  # Not enough information
    macOS = 'macOS'
    iOS = 'iOS'
    tvOS = 'tvOS'
class AccessRights(IntFlag):
    """The MDM protocol defines a bitmask for granting permissions to an MDM to perform certain operations.

    This enumeration contains all of those access rights flags.
    """

    def _generate_next_value_(name, start, count, last_values):
        return 2 ** count

    ProfileInspection = auto()
    ProfileInstallRemove = auto()
    DeviceLockPasscodeRemoval = auto()
    DeviceErase = auto()
    QueryDeviceInformation = auto()
    QueryNetworkInformation = auto()
    ProvProfileInspection = auto()
    ProvProfileInstallRemove = auto()
    InstalledApplications = auto()
    RestrictionQueries = auto()
    SecurityQueries = auto()
    ChangeSettings = auto()
    ManageApps = auto()

    All = ProfileInspection | ProfileInstallRemove | DeviceLockPasscodeRemoval | DeviceErase | QueryDeviceInformation \
        | QueryNetworkInformation | ProvProfileInspection | ProvProfileInstallRemove | InstalledApplications \
        | RestrictionQueries | SecurityQueries | ChangeSettings | ManageApps
AccessRightsSet = Set[AccessRights]
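The `IntFlag` bitmask idiom used by `AccessRights` can be checked with membership tests and bitwise AND; a standalone sketch with hypothetical flag names (not the real MDM rights above):

```python
from enum import IntFlag

class Rights(IntFlag):
    READ = 1
    WRITE = 2
    ERASE = 4
    ALL = READ | WRITE | ERASE  # composite alias, analogous to AccessRights.All

granted = Rights.READ | Rights.WRITE
print(Rights.READ in granted)                  # True
print(bool(granted & Rights.ERASE))            # False
print((granted | Rights.ERASE) == Rights.ALL)  # True
```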
class CommandStatus(Enum):
    """CommandStatus describes all the possible states of a command in the device command queue.

    The following statuses are based upon the return status of the MDM client:

    - Acknowledged
    - Error
    - CommandFormatError
    - NotNow

    Additionally, there are statuses to explain the lifecycle of the command before and after the MDM client processes
    them:

    - Queued: The command was newly created and not yet sent to the device.
    - Sent: The command has been sent to the device, but no response has come back yet.
    - Expired: The command was never acknowledged, or the device was removed.
    """
    # MDM Client Statuses
    Idle = 'Idle'
    Acknowledged = 'Acknowledged'
    Error = 'Error'
    CommandFormatError = 'CommandFormatError'
    NotNow = 'NotNow'

    # Commandment Lifecycle Statuses
    Queued = 'Queued'
    Sent = 'Sent'
    Expired = 'Expired'
class SettingsItem(Enum):
"""A list of possible values for Managed Settings items.
See Also:
- `Managed Settings <https://developer.apple.com/library/content/documentation/Miscellaneous/Reference/MobileDeviceManagementProtocolRef/3-MDM_Protocol/MDM_Protocol.html#//apple_ref/doc/uid/TP40017387-CH3-SW59>`_.
"""
VoiceRoaming = 'VoiceRoaming'
PersonalHotspot = 'PersonalHotspot'
Wallpaper = 'Wallpaper'
DataRoaming = 'DataRoaming'
ApplicationAttributes = 'ApplicationAttributes'
DeviceName = 'DeviceName'
HostName = 'HostName'
MDMOptions = 'MDMOptions'
PasscodeLockGracePeriod = 'PasscodeLockGracePeriod'
MaximumResidentUsers = 'MaximumResidentUsers'
class WallpaperLocation(IntEnum):
"""A list of possible values for the Wallpaper `where` setting.
Determines where the given wallpaper will be used.
"""
LockScreen = 1
HomeScreen = 2
Both = 3
| 33.357143 | 224 | 0.734832 | 429 | 5,604 | 9.575758 | 0.508159 | 0.008763 | 0.005842 | 0.007303 | 0.011685 | 0.011685 | 0 | 0 | 0 | 0 | 0 | 0.003536 | 0.192541 | 5,604 | 167 | 225 | 33.556886 | 0.904309 | 0.228944 | 0 | 0 | 0 | 0 | 0.265558 | 0.075772 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009524 | false | 0.07619 | 0.019048 | 0.009524 | 0.971429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
b7f73bf1ae0aab7e532dd7b71be773f5ec3caacd | 221 | py | Python | GodwillOnyewuchi/Phase 1/Python Basic 2/day 7 task/task10.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 6 | 2020-05-23T19:53:25.000Z | 2021-05-08T20:21:30.000Z | GodwillOnyewuchi/Phase 1/Python Basic 2/day 7 task/task10.py | GREENFONTS/python-challenge-solutions | a9aad85a250892fe41961a7d5e77f67b8d14fc1b | [
"MIT"
] | 8 | 2020-05-14T18:53:12.000Z | 2020-07-03T00:06:20.000Z | GodwillOnyewuchi/Phase 1/Python Basic 2/day 7 task/task10.py | GREENFONTS/python-challenge-solutions | a9aad85a250892fe41961a7d5e77f67b8d14fc1b | [
"MIT"
] | 39 | 2020-05-10T20:55:02.000Z | 2020-09-12T17:40:59.000Z | import math
opposite = int(input("Enter the opposite side: "))
adjacent = int(input("Enter the adjacent side: "))
hypotenuse = math.sqrt(math.pow(opposite, 2) + math.pow(adjacent, 2))
print(f'hypotenuse = {hypotenuse}') | 31.571429 | 69 | 0.714932 | 31 | 221 | 5.096774 | 0.483871 | 0.101266 | 0.164557 | 0.202532 | 0.35443 | 0.35443 | 0 | 0 | 0 | 0 | 0 | 0.010309 | 0.122172 | 221 | 7 | 70 | 31.571429 | 0.804124 | 0 | 0 | 0 | 0 | 0 | 0.337838 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
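For reference, the Pythagorean computation in this task can also be written with `math.hypot`, which performs the squaring and square root in one call (a stdlib sketch with hard-coded sides in place of `input()`):

```python
import math

opposite, adjacent = 3, 4
# math.hypot(x, y) returns sqrt(x*x + y*y)
print(math.hypot(opposite, adjacent))  # 5.0
```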
b7f781af7ae37f2a11448949674464f82666fcbd | 1,899 | py | Python | tests/date/test_sub.py | shammellee/pendulum | bb179c8fb6ef92b7bfc471a46338abbfac9fafca | [
"MIT"
] | 1 | 2018-11-25T03:10:22.000Z | 2018-11-25T03:10:22.000Z | tests/date/test_sub.py | shammellee/pendulum | bb179c8fb6ef92b7bfc471a46338abbfac9fafca | [
"MIT"
] | null | null | null | tests/date/test_sub.py | shammellee/pendulum | bb179c8fb6ef92b7bfc471a46338abbfac9fafca | [
"MIT"
] | 1 | 2020-07-24T17:37:18.000Z | 2020-07-24T17:37:18.000Z | import pytest
import pendulum
from datetime import timedelta
from ..conftest import assert_date
def test_subtract_years_positive():
assert pendulum.date(1975, 1, 1).subtract(years=1).year == 1974
def test_subtract_years_zero():
assert pendulum.date(1975, 1, 1).subtract(years=0).year == 1975
def test_subtract_years_negative():
assert pendulum.date(1975, 1, 1).subtract(years=-1).year == 1976
def test_subtract_months_positive():
assert pendulum.date(1975, 1, 1).subtract(months=1).month == 12
def test_subtract_months_zero():
assert pendulum.date(1975, 12, 1).subtract(months=0).month == 12
def test_subtract_months_negative():
assert pendulum.date(1975, 11, 1).subtract(months=-1).month == 12
def test_subtract_days_positive():
assert pendulum.Date(1975, 6, 1).subtract(days=1).day == 31
def test_subtract_days_zero():
assert pendulum.Date(1975, 5, 31).subtract(days=0).day == 31
def test_subtract_days_negative():
assert pendulum.Date(1975, 5, 30).subtract(days=-1).day == 31
def test_subtract_weeks_positive():
assert pendulum.Date(1975, 5, 28).subtract(weeks=1).day == 21
def test_subtract_weeks_zero():
assert pendulum.Date(1975, 5, 21).subtract(weeks=0).day == 21
def test_subtract_weeks_negative():
assert pendulum.Date(1975, 5, 14).subtract(weeks=-1).day == 21
def test_subtract_timedelta():
delta = timedelta(days=18)
d = pendulum.date(2015, 3, 14)
new = d - delta
assert isinstance(new, pendulum.Date)
assert_date(new, 2015, 2, 24)
def test_subtract_duration():
delta = pendulum.duration(years=2, months=3, days=18)
d = pendulum.date(2015, 3, 14)
new = d - delta
assert_date(new, 2012, 11, 26)
def test_subtract_invalid_type():
d = pendulum.date(2015, 3, 14)
with pytest.raises(TypeError):
d - "ab"
with pytest.raises(TypeError):
"ab" - d
| 22.879518 | 69 | 0.697209 | 284 | 1,899 | 4.5 | 0.193662 | 0.150235 | 0.164319 | 0.206573 | 0.63302 | 0.532081 | 0.353678 | 0.353678 | 0.189358 | 0.12989 | 0 | 0.103034 | 0.16693 | 1,899 | 82 | 70 | 23.158537 | 0.704804 | 0 | 0 | 0.155556 | 0 | 0 | 0.002106 | 0 | 0 | 0 | 0 | 0 | 0.355556 | 1 | 0.333333 | false | 0 | 0.088889 | 0 | 0.422222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
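The `timedelta` subtraction exercised by `test_subtract_timedelta` above works identically on a plain stdlib `datetime.date`, which pendulum's `Date` builds on:

```python
from datetime import date, timedelta

d = date(2015, 3, 14)
new = d - timedelta(days=18)
print(new)  # 2015-02-24
```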
b7fe579d59500a97dc95e9e09d55d6b959da781d | 7,264 | py | Python | track.py | Johj/track-analysis | 214ecfcacfbf1ca5e2d0eac0d74642801141b100 | [
"MIT"
] | null | null | null | track.py | Johj/track-analysis | 214ecfcacfbf1ca5e2d0eac0d74642801141b100 | [
"MIT"
] | null | null | null | track.py | Johj/track-analysis | 214ecfcacfbf1ca5e2d0eac0d74642801141b100 | [
"MIT"
] | null | null | null | import functions as f
import itertools
import numpy as np
import plotly.plotly as py
import plotly.graph_objs as go
import random
name = "Peter"
filepath = "C:/Users/Peter/Dropbox/Programming/Python/track-analysis/" + name + "/" + name + ".txt"
count, data = f.getData(filepath)
start, end = f.getDateRange(data)
'''
0. Click Type (pie)
1. Monitor Use (pie)
2. Clicks Per Date (line)
3. Hours Per Session (line)
4. Frequency Of Turn On Times (histogram)
5. Frequency Of Shutdown Times (histogram)
6. Click Mapping (scatter)
7. Frequency Of Seconds Between Clicks (histogram)
8. Clicks By The Hour (bar)
9. Clicks By The Day (bar)
10. Frequency Of Clicks Per Session (histogram)
11. Frequency Of Hours Per Session (histogram)
'''
'''
fig0 = {
'data': [{
'labels': ['Left', 'Right', 'Middle', '1', '2'],
'values': [round(x / count * 100, 2) for x in f.sumSessions(f.clickTypePerSession(data))],
'type': 'pie',
'marker': {'colors': [
'rgb(152, 202, 237)',
'rgb(108, 180, 230)',
'rgb(65, 158, 222)',
'rgb(35, 133, 201)',
'rgb(27, 104, 158)'
]}
}],
'layout': {'title': "Click Type (" + name + ")<br>" + start + " - " + end}
}
py.image.save_as(fig0, name + "/fig0.png")
fig1 = {
'data': [{
'labels': ['Primary', 'Secondary'],
'values': [round(x / count * 100, 2) for x in f.sumSessions(f.monitorUsedPerSession(data))],
'type': 'pie',
'marker': {'colors': [
'rgb(152, 202, 237)',
'rgb(65, 158, 222)'
]}
}],
'layout': {'title': "Monitor Use (" + name + ")<br>" + start + " - " + end}
}
py.image.save_as(fig1, name + "/fig1.png")
fig2 = dict(
data = [go.Scatter(
x = f.turnOffTimePerSession(data),
y = f.clicksPerSession(data)
)],
layout = go.Layout(
title = "Clicks Per Date (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Date"
),
yaxis = dict(
title = "Clicks"
)
)
)
py.image.save_as(fig2, name + "/fig2.png")
data3 = [round(x.total_seconds() / 3600, 2) for x in f.timePerSession(data)]
fig3 = dict(
data = [go.Scatter(
x = list(range(len(data3))),
y = data3
)],
layout = go.Layout(
title = "Hours Per Session (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Session"
),
yaxis = dict(
title = "Hours"
)
)
)
py.image.save_as(fig3, name + "/fig3.png")
data4Temp = []
for x in f.turnOnTimePerSession(data):
if x is None:
continue
elif x.time().minute >= 30:
data4Temp.append((x.time().hour + 1) % 24)
else:
data4Temp.append((x.time().hour) % 24)
data4 = [0] * 24
for x in data4Temp:
data4[x] += 1
fig4 = go.Figure(
data = [go.Bar(
x = [
'0:00', '1:00', '2:00', '3:00', '4:00', '5:00',
'6:00', '7:00', '8:00', '9:00', '10:00', '11:00',
'12:00', '13:00', '14:00', '15:00', '16:00', '17:00',
'18:00', '19:00', '20:00', '21:00', '22:00', '23:00'
],
y = data4
)],
layout = go.Layout(
title = "Frequency Of Turn On Times (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Turn On Times",
tickangle = -45,
),
yaxis = dict(
title = "Frequency"
),
bargap = 0
)
)
py.image.save_as(fig4, name + "/fig4.png")
data5Temp = []
for x in f.turnOffTimePerSession(data):
if x is None:
continue
elif x.time().minute >= 30:
data5Temp.append((x.time().hour + 1) % 24)
else:
data5Temp.append((x.time().hour) % 24)
data5 = [0] * 24
for x in data5Temp:
data5[x] += 1
fig5 = go.Figure(
data = [go.Bar(
x = [
'0:00', '1:00', '2:00', '3:00', '4:00', '5:00',
'6:00', '7:00', '8:00', '9:00', '10:00', '11:00',
'12:00', '13:00', '14:00', '15:00', '16:00', '17:00',
'18:00', '19:00', '20:00', '21:00', '22:00', '23:00'
],
y = data5
)],
layout = go.Layout(
title = "Frequency Of Shutdown Times (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Shutdown Times",
tickangle = -45,
),
yaxis = dict(
title = "Frequency"
),
bargap = 0
)
)
py.image.save_as(fig5, name + "/fig5.png")
data6XTemp, data6YTemp = f.coordinatesPerSession(data)
# flatten
data6XTemp = list(itertools.chain(*data6XTemp))
data6YTemp = list(itertools.chain(*data6YTemp))
k = 0.80
random.seed(0)
indicies = random.sample(range(len(data6XTemp)), round(len(data6XTemp) * k))
data6X = [data6XTemp[i] for i in indicies]
data6Y = [data6YTemp[i] for i in indicies]
fig6 = dict(
data = [go.Scatter(
x = data6X,
y = data6Y,
mode = "markers",
marker = dict(
size = 1
)
)],
layout = go.Layout(
title = "Click Mapping (Primary), Sampling " + str(k * 100) + "% out of " + str(count) + " (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "x",
range = [0, 1920] #[-1920, -400]
),
yaxis = dict(
title = "y",
range = [1080, 0], #[864, 0]
zeroline = False
)
)
)
py.image.save_as(fig6, name + "/fig6.png")
data7Temp = [x.total_seconds() for x in list(itertools.chain(*f.timeBetweenClicksPerSession(data)))]
arr = np.array(data7Temp)
# mean = np.mean(arr)
std = np.std(arr)
data7 = []
for x in data7Temp:
if x <= std / 5 and x >= 0:
data7.append(x)
fig7 = go.Figure(
data = [go.Histogram(
x = data7
)],
layout = go.Layout(
title = "Frequency Of Seconds Between Clicks (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Seconds Between Clicks"
),
yaxis = dict(
title = "Frequency"
)
)
)
py.image.save_as(fig7, name + "/fig7.png")
fig8 = go.Figure(
data = [go.Bar(
x = [
'0:00', '1:00', '2:00', '3:00', '4:00', '5:00',
'6:00', '7:00', '8:00', '9:00', '10:00', '11:00',
'12:00', '13:00', '14:00', '15:00', '16:00', '17:00',
'18:00', '19:00', '20:00', '21:00', '22:00', '23:00'
],
y = f.sumSessions(f.clicksPerHourPerSession(data))
)],
layout = go.Layout(
title = "Clicks By The Hour (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Hour",
tickangle = -45,
),
yaxis = dict(
title = "Clicks"
)
)
)
py.image.save_as(fig8, name + "/fig8.png")
fig9 = go.Figure(
data = [go.Bar(
x = [
'Monday', 'Tuesday', 'Wednesday', 'Thursday',
'Friday', 'Saturday', 'Sunday'
],
y = f.sumSessions(f.clicksPerDayPerSession(data))
)],
layout = go.Layout(
title = "Clicks By The Day (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Day"
),
yaxis = dict(
title = "Clicks"
)
)
)
py.image.save_as(fig9, name + "/fig9.png")
data10 = [round(x, -2) for x in f.clicksPerSession(data)]
fig10 = go.Figure(
data = [go.Histogram(
x = data10,
autobinx = False,
xbins = dict(
start = 0,
end = max(data10),
size = 500
)
)],
layout = go.Layout(
title = "Frequency Of Clicks Per Session (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Clicks Per Session"
),
yaxis = dict(
title = "Frequency"
)
)
)
py.image.save_as(fig10, name + "/fig10.png")
fig11 = go.Figure(
data = [go.Histogram(
x = [round(x.total_seconds() / 3600, 0) for x in f.timePerSession(data)],
autobinx = False,
xbins = dict(
start = 0,
end = 30,
size = 1
)
)],
layout = go.Layout(
title = "Frequency Of Hours Per Session (" + name + ")<br>" + start + " - " + end,
xaxis = dict(
title = "Hours Per Session"
),
yaxis = dict(
title = "Frequency"
)
)
)
py.image.save_as(fig11, name + "/fig11.png")
'''
| 23.282051 | 135 | 0.565253 | 1,026 | 7,264 | 3.986355 | 0.206628 | 0.04401 | 0.032274 | 0.041076 | 0.531051 | 0.439609 | 0.354523 | 0.294377 | 0.215403 | 0.206112 | 0 | 0.089589 | 0.222467 | 7,264 | 311 | 136 | 23.356913 | 0.634561 | 0 | 0 | 0 | 0 | 0 | 0.210692 | 0.179245 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
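The (currently commented-out) turn-on and shutdown histograms above bucket each timestamp to its nearest hour, rounding minute 30 and later up and wrapping past midnight. That rule in isolation (the function name `nearest_hour` is mine):

```python
from datetime import datetime

def nearest_hour(dt):
    # minute >= 30 rounds up to the next hour; % 24 wraps 23:30+ back to hour 0
    return (dt.hour + 1) % 24 if dt.minute >= 30 else dt.hour

print(nearest_hour(datetime(2020, 1, 1, 23, 45)))  # 0
print(nearest_hour(datetime(2020, 1, 1, 9, 29)))   # 9
```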
4d008ee15318143d13fe51f5b535f412797070d9 | 335 | py | Python | Linked_list/Python/Singly_linked_list_test.py | bakuryuthem0/al-go-rithms | 8ad4b65a988740525585ecae2b6f0b815cdbcd66 | [
"CC0-1.0"
] | 2 | 2019-10-06T02:55:28.000Z | 2019-12-11T05:48:10.000Z | Linked_list/Python/Singly_linked_list_test.py | bakuryuthem0/al-go-rithms | 8ad4b65a988740525585ecae2b6f0b815cdbcd66 | [
"CC0-1.0"
] | null | null | null | Linked_list/Python/Singly_linked_list_test.py | bakuryuthem0/al-go-rithms | 8ad4b65a988740525585ecae2b6f0b815cdbcd66 | [
"CC0-1.0"
] | 3 | 2019-11-02T16:16:25.000Z | 2021-02-20T11:21:01.000Z | list=linkedlist()
list.head=node("Monday")
list1=node("Tuesday")
list2=node("Thursday")
list.head.next=list1
list1.next=list2
print("Before insertion:")
list.printing()
print('\n')
list.push_after(list1,"Wednesday")
print("After insertion:")
list.printing()
print('\n')
list.deletion(3)
print("After deleting 4th node")
list.printing() | 20.9375 | 34 | 0.743284 | 48 | 335 | 5.166667 | 0.458333 | 0.145161 | 0.169355 | 0.209677 | 0.25 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0.025397 | 0.059701 | 335 | 16 | 35 | 20.9375 | 0.761905 | 0 | 0 | 0.3125 | 0 | 0 | 0.267857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
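The script above uses `node` and `linkedlist` without defining or importing them, so they presumably live alongside this test file. A minimal sketch that satisfies every call the script makes (the zero-based `deletion` index is an assumption, inferred from `deletion(3)` being described as deleting the 4th node):

```python
class node:
    def __init__(self, data):
        self.data = data
        self.next = None

class linkedlist:
    def __init__(self):
        self.head = None

    def printing(self):
        # Walk the chain, printing each node's data on one line
        cur = self.head
        while cur:
            print(cur.data, end=' ')
            cur = cur.next
        print()

    def push_after(self, prev_node, data):
        # Insert a new node directly after prev_node
        new_node = node(data)
        new_node.next = prev_node.next
        prev_node.next = new_node

    def deletion(self, pos):
        # Remove the node at zero-based position pos (assumed indexing)
        if pos == 0:
            self.head = self.head.next
            return
        cur = self.head
        for _ in range(pos - 1):
            cur = cur.next
        cur.next = cur.next.next
```

Note that the script binds its list to the name `list`, shadowing the built-in; any non-colliding name would be preferable.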
4d0488d2ce2f6542ca86285a263d297ed5b7428b | 13,071 | py | Python | lib/ophcrack.py | GrimHacker/hashcatter | f17c1c6ef072301cf72a14e1bd264953a6f30d94 | [
"Unlicense"
] | 2 | 2016-01-19T14:31:01.000Z | 2019-04-26T07:47:54.000Z | lib/ophcrack.py | GrimHacker/hashcatter | f17c1c6ef072301cf72a14e1bd264953a6f30d94 | [
"Unlicense"
] | null | null | null | lib/ophcrack.py | GrimHacker/hashcatter | f17c1c6ef072301cf72a14e1bd264953a6f30d94 | [
"Unlicense"
] | null | null | null | '''
.1111... | Title: ophcrack
.10000000000011. .. | Author: Oliver Morton
.00 000... | Email: grimhacker@grimhacker.com
1 01.. | Description:
.. | runs ophcrack as a subprocess, parsing stdout.
.. | does not return cracked hashes - ophcrack must
GrimHacker .. | output to a file which can then be parsed (otherwise
.. | we can't be sure the plain is correct)
grimhacker.com .. |
@_grimhacker .. |
--------------------------------------------------------------------------------
Created on 22 Sep 2013
@author: GrimHacker
Copyright (c) 2013 GrimHacker
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
'''
import logging
import threading
import re
from lib.command import Command
class Ophcrack(threading.Thread, Command):
def __init__(self, ophcrack):
threading.Thread.__init__(self)
self.log = logging.getLogger(__name__)
self.options = ophcrack
self.nt_hash_pattern = re.compile("NT hash")
def _stdout(self, line): # overriding inherited function. (woo! inheritance!)
"""
filter stdout of ophcrack - only log cracked ntlm hashes
note this is only for running output - too many possibilities for error to use reliably for final output.
"""
#self.log.debug(line)
line_split = line.split(";")
# if we don't split(;) above then it doesn't output correctly - think this is because ophcrack is rewriting the screen and using \r or it might be because the ";" is being interpreted by something.
# passwords with ";" in them are going to break this. - doesn't matter too much as only used for running status messages
# TODO: use a regex instead. Something like "Found password .* for" and parse it out based on the match location.
# #this would also allow getting the status of ophcrack out.
if len(line_split) == 2:
# this should be output like this:
# 0h 0m 8s; Found password t3rm1nat10n for user server-admin (NT hash #3)
#TODO: add error handling around this
found = line.split(";")[1].strip() # going to have a problem if there is a ; in the username or password... #TODO: find out what is interpreting this data as commands and kill it with fire.
#self.log.debug(found)
if self.nt_hash_pattern.search(found): # if the line from stdout contains the pattern it is something we want to log, otherwise ignore it.
cracked = {'username': '', 'ntlm': '', 'nthashnum': '', 'passwd': ''} # initialised empty to prevent key errors. ntlm included in case we decide to move to a separate function shared with hashcat.
# lots of logic here to deal with spaces in usernames and passwords (hopefully ophcrack and hashcat deal with them as well or all this is pointless!)
if found.startswith("Found empty password for user"):
# "Found empty password for user <username> (NT hash #<hashnumber>)"
username = ""
for i in range(5, len(found.split("(")[0].split(" "))):
username += "{0}".format(found.split("(")[0].split(" ")[i])
if i == len(found.split("(")[0].split(" ")): # if this is the last part of the username do nothing, otherwise add a space to the username
pass
else:
username += " "
cracked['username'] = username
cracked['nthashnum'] = found.split("#")[-1].split(")")[0]
elif found.startswith("Found empty password for NT hash"):
# "Found empty password for NT hash #<hashnumber>")
cracked['nthashnum'] = found.split("#")[-1]
elif found.startswith("Found password"):
# Found password ThdLrFv4uu for NT hash #19712
# Found password s3cretpassword for user adminuser (NT hash #0)
if "for user" in found:
# parse out the password
# know the start of the password is always the 3rd element.
# find the end of the password by looking for the string "for user"
# use these indices to pull out the password - by doing it like this we can handle passwords with spaces in.
for i in range(0, len(found.split(" ")) - 1):
test = ' '.join(found.split(" ")[i:i + 2]) # select two elements at a time and join together with a space in between
if test == "for user":
endpass = i # first element of the list AFTER the password
break
cracked['passwd'] = ' '.join(found.split(" ")[2:endpass])
# parse out the username
username = found.split("(")[0].strip(" ").split(" ")[endpass + 2:][0] # this pulls out the username based on the end of the password discovered above and the "(" character; doing it this way handles usernames with spaces in.
cracked['username'] = username
# parse out the hash number
cracked['nthashnum'] = found.split("#")[-1].split(")")[0]
elif "for NT hash" in found:
# parse out the hash number
cracked['nthashnum'] = found.split("#")[-1]
# parse out the password
# know the start is always the 3rd element
# find the end of the password by looking for the string "for NT"
# use these indices to pull out the password - doing it like this supports passwords with spaces.
for i in range(0, len(found.split(" ")) - 1):
test = ' '.join(found.split(" ")[i:i + 2]) # select two elements at a time and join together with a space in between
if test == "for NT":
endpass = i # first element of the list AFTER the password
break
cracked['passwd'] = ' '.join(found.split(" ")[2:endpass])
else:
self.log.warning("can't parse ophcrack stdout. this should not affect the output file.")
self.log.info("OPHCRACK Found- {username}:::{passwd} from NT hash #{nthashnum}".format(**cracked))
else:
# not a message about a cracked ntlm hash
pass
"""
elif len(line_split) == 5:
# messages like:
# 0h 0m 9s; brute force (36%); search (5%); tables: total 20, done 0, using 9; pwd found 2/5.
#
success = True
oph_status = {}
self.log.debug("line_split = {0}".format(line_split))
for section in line_split[1:]: # skipping the first element - which should be the running time.
info = section.strip()
self.log.debug("info = {0}".format(info))
if info.startswith("brute force"):
# this should be:
# brute force (36%)
try:
oph_status['brute'] = info.split("(")[1].split(")")[0]
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
elif info.startswith("preload"):
try:
oph_status['preload'] = info.split("(")[1].split(")")[0]
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
elif info.startswith("search"):
# this should be:
# search (5%)
try:
oph_status['search'] = info.split("(")[1].split(")")[0]
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
elif info.startswith("tables"):
# this should be:
# tables: total 20, done 0, using 9
tbl = info.split(" ")
try:
oph_status['tbl_total'] = tbl[2].strip(",")
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
else:
try:
oph_status['tbl_done'] = tbl[4].strip(",")
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
else:
try:
oph_status['tbl_using'] = tbl[6]
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
elif info.startswith("pwd found"):
# this should be:
# pwd found 2/5.
try:
oph_status['pwd_found'] = info.split(" ")[2].strip(".")
except Exception as e:
self.log.warning("error parsing ophcrack message")
success = False
break
elif info.startswith("Found password"):
#
else:
# running time will drop into this.
pass
if success:
if "search" in oph_status.keys():
self.log.debug("OPHCRACK Status- found {pwd_found}, bruteforce({brute}), search({search}), tables: total {tbl_total} done {tbl_done} using {tbl_using}".format(**oph_status))
elif "preload" in oph_status.keys():
self.log.debug("OPHCRACK Status- found {pwd_found}, bruteforce({brute}), preload({preload}), tables: total {tbl_total} done {tbl_done} using {tbl_using}".format(**oph_status))
else:
pass
"""
else:
pass
# self.log.debug("ophcrack stdout has an unexpected number of ';' on this line. - this should not affect the output file.".format(str(line_split)))
# might be part 1 or 2 of lm, bruteforce/search percentage etc
# don't want to clutter the output so just ignore them for now.
# TODO: consider making this output part of a very verbose logging setting
def _build_cmd(self, **kwargs):
"""
build string of command to execute
"""
# TODO: deal with spaces in paths (e.g. ophcrack maybe full path to exe which could have a space in)
part1 = "{0}".format(kwargs.pop("exe", None)) # ophcrack exe
part2 = ""
for key in kwargs.keys():
part2 += " {0} {1}".format(key, kwargs[key]) # ophcrack optional arguments
cmd = part1 + part2
return cmd
def run(self):
"""
run ophcrack subprocess
"""
self.log.debug("starting thread: '{0}'".format(self.name))
cmd = self._build_cmd(**self.options)
self._execute(cmd)
self.log.debug("finished thread: '{0}'".format(self.name))
| 55.858974 | 276 | 0.525974 | 1,480 | 13,071 | 4.608784 | 0.256081 | 0.019499 | 0.015833 | 0.018472 | 0.326638 | 0.301862 | 0.286468 | 0.253335 | 0.235889 | 0.235889 | 0 | 0.015364 | 0.377553 | 13,071 | 233 | 277 | 56.098712 | 0.823009 | 0.381532 | 0 | 0.348485 | 0 | 0 | 0.108232 | 0.005335 | 0 | 0 | 0 | 0.012876 | 0 | 1 | 0.060606 | false | 0.19697 | 0.060606 | 0 | 0.151515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
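The TODO in `_stdout` suggests replacing the split-and-index parsing with a regex. A hedged sketch of what that could look like for the two "Found password" formats; the pattern is my assumption, inferred from the sample lines in the comments rather than from documented ophcrack output:

```python
import re

# Named groups cover both "for user <name> (NT hash #N)" and "for NT hash #N"
FOUND = re.compile(
    r"Found password (?P<passwd>.+) for (?:user (?P<username>.+) )?\(?NT hash #(?P<num>\d+)\)?"
)

for line in (
    "0h 0m 8s; Found password t3rm1nat10n for user server-admin (NT hash #3)",
    "0h 0m 9s; Found password ThdLrFv4uu for NT hash #19712",
):
    m = FOUND.search(line)
    print(m.group("passwd"), m.group("username"), m.group("num"))
```

A single `search` replaces the split chains, and passwords containing spaces fall out of the greedy `.+` naturally; a password containing the literal text " for user " would still be ambiguous, which is the same limitation the split approach has.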
4d0adc9581e4bfdaf2d48b4290fbe41d858ad4e9 | 441 | py | Python | Functions/Editing.py | SkepticalPotato2k/ZeonEditor | 048e24549b62fc063b5950a00bf66d16c5ecca9e | [
"MIT"
] | null | null | null | Functions/Editing.py | SkepticalPotato2k/ZeonEditor | 048e24549b62fc063b5950a00bf66d16c5ecca9e | [
"MIT"
] | null | null | null | Functions/Editing.py | SkepticalPotato2k/ZeonEditor | 048e24549b62fc063b5950a00bf66d16c5ecca9e | [
"MIT"
] | null | null | null | # Editing Class #
# Things Relating To Editing The File #
import pyperclip
class Editing:
@staticmethod
def Replace(loadedContent, lineID, newStr):
loadedContent[lineID] = newStr
@staticmethod
def newLine(loadedContent, Str):
loadedContent.append(Str)
@staticmethod
def delLine(loadedContent, lineID):
del loadedContent[lineID]
@staticmethod
def copyLine(loadedContent, lineID):
pyperclip.copy(loadedContent[lineID])
| 23.210526 | 48 | 0.657596 | 42 | 441 | 6.904762 | 0.5 | 0.393103 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.263039 | 441 | 18 | 49 | 24.5 | 0.892308 | 0.113379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.1 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
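A quick usage sketch of the buffer-editing methods above. Because they take the buffer explicitly, they can be exercised on a plain list; the class is re-declared here without the `pyperclip`-backed `copyLine` so the sketch has no third-party dependency:

```python
class Editing:
    # Same list-editing methods as above, minus the clipboard helper
    @staticmethod
    def Replace(loadedContent, lineID, newStr):
        loadedContent[lineID] = newStr

    @staticmethod
    def newLine(loadedContent, Str):
        loadedContent.append(Str)

    @staticmethod
    def delLine(loadedContent, lineID):
        del loadedContent[lineID]

content = ["first line", "second line"]
Editing.newLine(content, "third line")
Editing.Replace(content, 0, "FIRST LINE")
Editing.delLine(content, 1)
print(content)  # ['FIRST LINE', 'third line']
```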
4d0d24f880f9eed856b5e2c5cc2f15de83015a38 | 67 | py | Python | timeclock/settings.py | mikejarrett/company-time-clock | 17a652719ef25a9b058551b1f4d3f05cddadc04a | [
"MIT"
] | null | null | null | timeclock/settings.py | mikejarrett/company-time-clock | 17a652719ef25a9b058551b1f4d3f05cddadc04a | [
"MIT"
] | null | null | null | timeclock/settings.py | mikejarrett/company-time-clock | 17a652719ef25a9b058551b1f4d3f05cddadc04a | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
DB_LOCATION = 'timeclock.db'
DEBUG = True
| 13.4 | 28 | 0.61194 | 9 | 67 | 4.444444 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018182 | 0.179104 | 67 | 4 | 29 | 16.75 | 0.709091 | 0.313433 | 0 | 0 | 0 | 0 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4d179409fbd395914bfc870050a69cb0cc5adc77 | 1,492 | py | Python | ddb/feature/cookiecutter/schema.py | gfi-centre-ouest/docker-devbox-ddb | 1597d85ef6e9e8322cce195a454de54186ce9ec7 | [
"MIT"
] | 4 | 2020-06-11T20:54:47.000Z | 2020-09-22T13:07:17.000Z | ddb/feature/cookiecutter/schema.py | gfi-centre-ouest/docker-devbox-ddb | 1597d85ef6e9e8322cce195a454de54186ce9ec7 | [
"MIT"
] | 113 | 2019-11-07T00:40:36.000Z | 2021-01-18T12:50:16.000Z | ddb/feature/cookiecutter/schema.py | inetum-orleans/docker-devbox-ddb | 20c713cf7bfcaf289226a17a9648c17d16003b4d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import os
import tempfile
from marshmallow import Schema, fields
from ddb.feature.schema import FeatureSchema
class CookiecutterOptions(Schema):
"""
Cookiecutter options
"""
no_input = fields.Boolean(allow_none=True, default=None)
replay = fields.Boolean(allow_none=True, default=False)
overwrite_if_exists = fields.Boolean(allow_none=True, default=True)
config_file = fields.String(allow_none=True, default=None)
default_config = fields.Raw(allow_none=True, default=None)
class TemplateSchema(CookiecutterOptions):
"""
Template schema
"""
template = fields.String(required=True)
output_dir = fields.String(allow_none=True, default=None)
checkout = fields.String(allow_none=True, default=None)
extra_context = fields.Dict(allow_none=True, default=None)
password = fields.String(allow_none=True, default=None)
version = fields.String(allow_none=True, default=None)
cookiecutter_tmp_dir = os.path.join(tempfile.gettempdir(), "ddb", "cookiecutter")
class CookiecutterFeatureSchema(FeatureSchema, CookiecutterOptions):
"""
Cookiecutter feature schema.
"""
templates = fields.List(fields.Nested(TemplateSchema()))
cookiecutters_dir = fields.String(allow_none=True, default=os.path.join(cookiecutter_tmp_dir, "templates"))
replay_dir = fields.String(allow_none=True, default=os.path.join(cookiecutter_tmp_dir, "replay"))
default_context = fields.Dict(allow_none=True)
| 33.909091 | 111 | 0.745308 | 180 | 1,492 | 6.016667 | 0.294444 | 0.108033 | 0.156048 | 0.221607 | 0.468144 | 0.432133 | 0.285319 | 0.116343 | 0.116343 | 0.116343 | 0 | 0.000779 | 0.13941 | 1,492 | 43 | 112 | 34.697674 | 0.842679 | 0.058981 | 0 | 0 | 0 | 0 | 0.022091 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.043478 | 0.173913 | 0 | 0.956522 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4d35e3189565fa47eed41d7be204f095b5738902 | 1,628 | py | Python | corehq/apps/reminders/management/commands/run_reminder_queue.py | dslowikowski/commcare-hq | ad8885cf8dab69dc85cb64f37aeaf06106124797 | [
"BSD-3-Clause"
] | 1 | 2015-02-10T23:26:39.000Z | 2015-02-10T23:26:39.000Z | corehq/apps/reminders/management/commands/run_reminder_queue.py | SEL-Columbia/commcare-hq | 992ee34a679c37f063f86200e6df5a197d5e3ff6 | [
"BSD-3-Clause"
] | 1 | 2022-03-12T01:03:25.000Z | 2022-03-12T01:03:25.000Z | corehq/apps/reminders/management/commands/run_reminder_queue.py | johan--/commcare-hq | 86ee99c54f55ee94e4c8f2f6f30fc44e10e69ebd | [
"BSD-3-Clause"
] | null | null | null | from django.core.management.base import CommandError
from django.conf import settings
from dimagi.utils.parsing import json_format_datetime
from corehq.apps.reminders.models import CaseReminderHandler, CaseReminder
from corehq.apps.reminders.tasks import fire_reminder
from hqscripts.generic_queue import GenericEnqueuingOperation
class ReminderEnqueuingOperation(GenericEnqueuingOperation):
    args = ""
    help = "Runs the Reminders Queue"

    def get_queue_name(self):
        return "reminders-queue"

    def get_enqueuing_timeout(self):
        return settings.REMINDERS_QUEUE_ENQUEUING_TIMEOUT

    def get_items_to_be_processed(self, utcnow):
        utcnow_json = json_format_datetime(utcnow)
        result = CaseReminder.view('reminders/by_next_fire',
            startkey=[None],
            endkey=[None, utcnow_json],
            include_docs=False,
        ).all()
        return [{"id": e["id"], "key": e["key"][1]} for e in result]

    def use_queue(self):
        return settings.REMINDERS_QUEUE_ENABLED

    def enqueue_item(self, _id):
        fire_reminder.delay(_id)

    def enqueue_directly(self, reminder):
        """
        Try to send a reminder directly to the celery queue, without
        waiting for it to be picked up by the handle() thread.
        """
        try:
            self.enqueue(reminder._id, json_format_datetime(reminder.next_fire))
        except Exception:
            # If anything goes wrong here it is not a problem: the handle()
            # thread will pick the reminder up later and enqueue it then.
            pass

class Command(ReminderEnqueuingOperation):
    pass
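The CouchDB range query in `get_items_to_be_processed` can be illustrated with a self-contained sketch: the `startkey=[None]` / `endkey=[None, utcnow_json]` pair selects every row whose key sorts between `[None]` and `[None, <utcnow as ISO-8601>]`, and because ISO-8601 strings sort lexicographically in chronological order, plain string comparison reproduces the filter. The sample rows below are hypothetical stand-ins, not real `CaseReminder` documents.

```python
def items_due(rows, utcnow_json):
    """Return {"id", "key"} dicts for rows whose fire time is at or
    before utcnow_json, mimicking the view's startkey/endkey range."""
    return [
        {"id": r["id"], "key": r["key"][1]}
        for r in rows
        if r["key"][1] <= utcnow_json  # lexicographic == chronological for ISO-8601
    ]

# Hypothetical view rows shaped like [None, <json-formatted datetime>] keys.
rows = [
    {"id": "r1", "key": [None, "2015-02-10T00:00:00Z"]},
    {"id": "r2", "key": [None, "2015-02-11T00:00:00Z"]},
]
due = items_due(rows, "2015-02-10T12:00:00Z")  # only r1 has fired by then
```

This is only a sketch of the key-collation idea; the real implementation delegates the range scan to CouchDB itself.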
| 33.22449 | 80 | 0.686732 | 196 | 1,628 | 5.530612 | 0.5 | 0.051661 | 0.049816 | 0.042435 | 0.059041 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000803 | 0.234644 | 1,628 | 48 | 81 | 33.916667 | 0.869181 | 0.143735 | 0 | 0.0625 | 0 | 0 | 0.052515 | 0.016272 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0.0625 | 0.1875 | 0.09375 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
4d4641d216181b85a297a7cb896c29bc9a013911 | 281 | py | Python | ia870/iacloseth.py | andreperesnl/ia870 | e6a089e17ea9def39cb9fd6901bbdf72a6ba7dfc | [
"BSD-3-Clause"
] | 5 | 2015-11-16T11:37:27.000Z | 2020-07-20T22:10:31.000Z | ia870/iacloseth.py | Abigale-Xin/e2dhipseg | 520366326cd20c75b5db855c9dd05cf0a8d49089 | [
"MIT"
] | 2 | 2020-07-28T22:29:54.000Z | 2021-07-07T20:37:25.000Z | ia870/iacloseth.py | Abigale-Xin/e2dhipseg | 520366326cd20c75b5db855c9dd05cf0a8d49089 | [
"MIT"
] | 30 | 2015-02-20T23:33:32.000Z | 2020-10-29T05:14:07.000Z | # -*- encoding: utf-8 -*-
# Module iacloseth
from numpy import *
def iacloseth(f, b=None):
    from iasubm import iasubm
    from iaclose import iaclose
    from iasecross import iasecross

    if b is None:
        b = iasecross()
    y = iasubm(iaclose(f, b), f)
    return y
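The close top-hat that `iacloseth` computes is `close(f, b) - f`. A numpy-only sketch of the same operator, using a flat 1-D window (max filter followed by min filter) instead of ia870's `iasecross` structuring element, shows how the residue highlights dark details that the closing fills in. This is an illustration of the operator, not a drop-in replacement for the ia870 functions.

```python
import numpy as np

def _dilate(f, k=3):
    # Gray-scale dilation with a flat window: running maximum.
    pad = k // 2
    fp = np.pad(f, pad, mode="edge")
    return np.array([fp[i:i + k].max() for i in range(len(f))])

def _erode(f, k=3):
    # Gray-scale erosion with a flat window: running minimum.
    pad = k // 2
    fp = np.pad(f, pad, mode="edge")
    return np.array([fp[i:i + k].min() for i in range(len(f))])

def closeth(f, k=3):
    # Closing is dilation followed by erosion; the top-hat keeps the
    # dark (low-valued) details that the closing filled in, so the
    # result is always non-negative.
    return _erode(_dilate(f, k), k) - f

signal = np.array([5, 5, 1, 5, 5])
print(closeth(signal))  # prints [0 0 4 0 0]: the lone dark pixel is highlighted
```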
| 17.5625 | 35 | 0.629893 | 39 | 281 | 4.538462 | 0.487179 | 0.022599 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004902 | 0.274021 | 281 | 15 | 36 | 18.733333 | 0.862745 | 0.142349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4d47bac1d00300e75ace26437352a28d0ee21ee2 | 517 | py | Python | tests/test_fixture.py | acpaquette/deepstate | 6acf07ad5dd95c9e8b970ca516ab93aa2620ea6e | [
"Apache-2.0"
] | 684 | 2018-02-18T18:04:23.000Z | 2022-03-26T06:18:39.000Z | tests/test_fixture.py | acpaquette/deepstate | 6acf07ad5dd95c9e8b970ca516ab93aa2620ea6e | [
"Apache-2.0"
] | 273 | 2018-02-18T04:01:36.000Z | 2022-02-09T16:07:38.000Z | tests/test_fixture.py | acpaquette/deepstate | 6acf07ad5dd95c9e8b970ca516ab93aa2620ea6e | [
"Apache-2.0"
] | 77 | 2018-02-19T00:18:33.000Z | 2022-03-16T04:12:09.000Z | from __future__ import print_function
import logrun
import deepstate_base
class FixtureTest(deepstate_base.DeepStateTestCase):
    def run_deepstate(self, deepstate):
        (r, output) = logrun.logrun([deepstate, "build/examples/Fixture"],
                                    "deepstate.out", 1800)
        self.assertEqual(r, 0)
        self.assertTrue("Passed: MyTest_Something" in output)
        self.assertFalse("Failed: MyTest_Something" in output)
        self.assertTrue("Setting up!" in output)
        self.assertTrue("Tearing down!" in output)
| 30.411765 | 70 | 0.727273 | 61 | 517 | 6 | 0.557377 | 0.087432 | 0.098361 | 0.125683 | 0.147541 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011601 | 0.166344 | 517 | 16 | 71 | 32.3125 | 0.837587 | 0 | 0 | 0 | 0 | 0 | 0.206963 | 0.042553 | 0 | 0 | 0 | 0 | 0.416667 | 1 | 0.083333 | false | 0.083333 | 0.25 | 0 | 0.416667 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4d4b26eba315ab1fd10b643152e071eb29586398 | 981 | py | Python | new-webservice/create-pathway.py | wikipathways/scripts | 95f3ac9823f066ef977b0e2fef81a80bceeee8ec | [
"Apache-2.0"
] | null | null | null | new-webservice/create-pathway.py | wikipathways/scripts | 95f3ac9823f066ef977b0e2fef81a80bceeee8ec | [
"Apache-2.0"
] | null | null | null | new-webservice/create-pathway.py | wikipathways/scripts | 95f3ac9823f066ef977b0e2fef81a80bceeee8ec | [
"Apache-2.0"
] | 1 | 2021-02-26T18:49:54.000Z | 2021-02-26T18:49:54.000Z | ###
# Test script for new WikiPathways webservice API
# author: msk (mkutmon@gmail.com)
###
import requests
import getpass
from lxml import etree as ET
##################################
# variables
username = 'Mkutmon'
gpml_file = 'test.gpml'
basis_url = 'http://pvjs.wikipathways.org/wpi/webservicetest/'
##################################
# define namespaces
namespaces = {'ns1':'http://www.wso2.org/php/xsd','ns2':'http://www.wikipathways.org/webservice'}
# login
pswd = getpass.getpass('Password:')
auth = {'name' : username , 'pass' : pswd}
r_login = requests.get(basis_url + 'login', params=auth)
dom = ET.fromstring(r_login.text)
authentication = ''
for node in dom.findall('ns1:auth', namespaces):
    authentication = node.text
# read gpml file (context manager so the handle is closed on exit)
with open(gpml_file, 'r') as f:
    gpml = f.read()
# create pathway
update_params = {'gpml': gpml, 'auth' : authentication, 'username': username}
r_create = requests.post(basis_url + 'createPathway', params=update_params)
print(r_create.text)
| 24.525 | 97 | 0.666667 | 123 | 981 | 5.243902 | 0.528455 | 0.037209 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004635 | 0.120285 | 981 | 39 | 98 | 25.153846 | 0.742758 | 0.14577 | 0 | 0 | 0 | 0 | 0.257937 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.157895 | 0.157895 | null | null | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4d4b929996fec2b8eeaef10087d86618e91ed505 | 1,636 | py | Python | src/climsoft_api/api/data_form/schema.py | faysal-ishtiaq/climsoft-api | 46dacdeba5d935ee3b944df00731640170b87ccd | [
"MIT"
] | null | null | null | src/climsoft_api/api/data_form/schema.py | faysal-ishtiaq/climsoft-api | 46dacdeba5d935ee3b944df00731640170b87ccd | [
"MIT"
] | 2 | 2022-01-16T15:41:27.000Z | 2022-01-30T18:37:13.000Z | src/climsoft_api/api/data_form/schema.py | openclimateinitiative/climsoft-api | 3591d7499dd7777617b8086332dc83fab1af9588 | [
"MIT"
] | 2 | 2021-12-22T21:50:19.000Z | 2022-01-28T12:53:32.000Z | from typing import List
from climsoft_api.api.schema import Response, BaseSchema
from pydantic import constr, Field
class CreateDataForm(BaseSchema):
    form_name: constr(max_length=250) = Field(title="Form Name")
    order_num: int = Field(title="Order Number")
    table_name: constr(max_length=255) = Field(title="Table Name")
    description: str = Field(title="Description")
    selected: bool = Field(title="Selected")
    val_start_position: int = Field(title="Start Position")
    val_end_position: int = Field(title="End Position")
    elem_code_location: constr(max_length=255) = Field(title="Location Code of Element")
    sequencer: constr(max_length=50) = Field(title="Sequencer")
    entry_mode: bool = Field(title="Entry Mode")


class UpdateDataForm(BaseSchema):
    order_num: int = Field(title="Order Number")
    table_name: constr(max_length=255) = Field(title="Table Name")
    description: str = Field(title="Description")
    selected: bool = Field(title="Selected")
    val_start_position: int = Field(title="Start Position")
    val_end_position: int = Field(title="End Position")
    elem_code_location: constr(max_length=255) = Field(title="Location Code of Element")
    sequencer: constr(max_length=50) = Field(title="Sequencer")
    entry_mode: bool = Field(title="Entry Mode")


class DataForm(CreateDataForm):
    class Config:
        orm_mode = True


class DataFormResponse(Response):
    result: List[DataForm] = Field(title="Result")


class DataFormQueryResponse(DataFormResponse):
    limit: int = Field(title="Limit")
    page: int = Field(title="Page")
    pages: int = Field(title="Pages")
| 36.355556 | 88 | 0.720049 | 210 | 1,636 | 5.47619 | 0.257143 | 0.2 | 0.101739 | 0.062609 | 0.643478 | 0.643478 | 0.643478 | 0.643478 | 0.643478 | 0.643478 | 0 | 0.013738 | 0.154645 | 1,636 | 44 | 89 | 37.181818 | 0.817787 | 0 | 0 | 0.545455 | 0 | 0 | 0.1522 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.969697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4d51b970362782978b092b56b25da8c1c5705aa5 | 36,791 | py | Python | pysnmp/ENTERASYS-SERVICE-LEVEL-REPORTING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/ENTERASYS-SERVICE-LEVEL-REPORTING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/ENTERASYS-SERVICE-LEVEL-REPORTING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module ENTERASYS-SERVICE-LEVEL-REPORTING-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/ENTERASYS-SERVICE-LEVEL-REPORTING-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:50:17 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueRangeConstraint, ValueSizeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueRangeConstraint", "ValueSizeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
etsysModules, = mibBuilder.importSymbols("ENTERASYS-MIB-NAMES", "etsysModules")
InetAddress, InetAddressType = mibBuilder.importSymbols("INET-ADDRESS-MIB", "InetAddress", "InetAddressType")
SnmpAdminString, = mibBuilder.importSymbols("SNMP-FRAMEWORK-MIB", "SnmpAdminString")
ObjectGroup, NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "ObjectGroup", "NotificationGroup", "ModuleCompliance")
Integer32, MibScalar, MibTable, MibTableRow, MibTableColumn, IpAddress, MibIdentifier, Unsigned32, Counter32, Gauge32, iso, ModuleIdentity, ObjectIdentity, Counter64, TimeTicks, NotificationType, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "Integer32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "IpAddress", "MibIdentifier", "Unsigned32", "Counter32", "Gauge32", "iso", "ModuleIdentity", "ObjectIdentity", "Counter64", "TimeTicks", "NotificationType", "Bits")
TextualConvention, RowStatus, StorageType, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "RowStatus", "StorageType", "DisplayString")
etsysServiceLevelReportingMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39))
etsysServiceLevelReportingMIB.setRevisions(('2003-11-06 15:15', '2003-10-24 19:02', '2003-10-22 23:32',))
if mibBuilder.loadTexts: etsysServiceLevelReportingMIB.setLastUpdated('200311061515Z')
if mibBuilder.loadTexts: etsysServiceLevelReportingMIB.setOrganization('Enterasys Networks Inc.')
class EtsysSrvcLvlOwnerString(TextualConvention, OctetString):
    status = 'current'
    subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 32)

class TimeUnit(TextualConvention, Integer32):
    status = 'current'
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9))
    namedValues = NamedValues(("year", 1), ("month", 2), ("week", 3), ("day", 4), ("hour", 5), ("second", 6), ("millisecond", 7), ("microsecond", 8), ("nanosecond", 9))

class EtsysSrvcLvlStandardMetrics(TextualConvention, Bits):
    status = 'current'
    namedValues = NamedValues(("reserved", 0), ("instantUnidirectionConnectivity", 1), ("instantBidirectionConnectivity", 2), ("intervalUnidirectionConnectivity", 3), ("intervalBidirectionConnectivity", 4), ("intervalTemporalConnectivity", 5), ("oneWayDelay", 6), ("oneWayDelayPoissonStream", 7), ("oneWayDelayPercentile", 8), ("oneWayDelayMedian", 9), ("oneWayDelayMinimum", 10), ("oneWayDelayInversePercentile", 11), ("oneWayPacketLoss", 12), ("oneWayPacketLossPoissonStream", 13), ("oneWayPacketLossAverage", 14), ("roundtripDelay", 15), ("roundtripDelayPoissonStream", 16), ("roundtripDelayPercentile", 17), ("roundtripDelayMedian", 18), ("roundtripDelayMinimum", 19), ("roundtripDelayInversePercentile", 20), ("oneWayLossDistanceStream", 21), ("oneWayLossPeriodStream", 22), ("oneWayLossNoticeableRate", 23), ("oneWayLossPeriodTotal", 24), ("oneWayLossPeriodLengths", 25), ("oneWayInterLossPeriodLengths", 26), ("oneWayIpdv", 27), ("oneWayIpdvPoissonStream", 28), ("oneWayIpdvPercentile", 29), ("oneWayIpdvInversePercentile", 30), ("oneWayIpdvJitter", 31), ("oneWayPeakToPeakIpdv", 32), ("oneWayDelayPeriodicStream", 33), ("roundtripDelayAverage", 34), ("roundtripPacketLoss", 35), ("roundtripPacketLossAverage", 36), ("roundtripIpdv", 37))

class GMTTimeStamp(TextualConvention, OctetString):
    status = 'current'
    subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(8, 8)
    fixedLength = 8

class TypeP(TextualConvention, OctetString):
    status = 'current'
    subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 512)

class TypePaddress(TextualConvention, OctetString):
    status = 'current'
    displayHint = '255a'
    subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 512)
etsysSrvcLvlConfigObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1))
etsysSrvcLvlSystem = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1))
etsysSrvcLvlOwners = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2))
etsysSrvcLvlHistory = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3))
etsysSrvcLvlMeasure = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4))
etsysSrvcLvlSystemTime = MibScalar((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 1), GMTTimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlSystemTime.setStatus('current')
etsysSrvcLvlSystemClockResolution = MibScalar((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 2), Integer32()).setUnits('picoseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlSystemClockResolution.setStatus('current')
etsysSrvcLvlMetricTable = MibTable((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3), )
if mibBuilder.loadTexts: etsysSrvcLvlMetricTable.setStatus('current')
etsysSrvcLvlMetricEntry = MibTableRow((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1), ).setIndexNames((0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMetricIndex"))
if mibBuilder.loadTexts: etsysSrvcLvlMetricEntry.setStatus('current')
etsysSrvcLvlMetricIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37))).clone(namedValues=NamedValues(("instantUnidirectionConnectivity", 1), ("instantBidirectionConnectivity", 2), ("intervalUnidirectionConnectivity", 3), ("intervalBidirectionConnectivity", 4), ("intervalTemporalConnectivity", 5), ("oneWayDelay", 6), ("oneWayDelayPoissonStream", 7), ("oneWayDelayPercentile", 8), ("oneWayDelayMedian", 9), ("oneWayDelayMinimum", 10), ("oneWayDelayInversePercentile", 11), ("oneWayPacketLoss", 12), ("oneWayPacketLossPoissonStream", 13), ("oneWayPacketLossAverage", 14), ("roundtripDelay", 15), ("roundtripDelayPoissonStream", 16), ("roundtripDelayPercentile", 17), ("roundtripDelayMedian", 18), ("roundtripDelayMinimum", 19), ("roundtripDelayInversePercentile", 20), ("oneWayLossDistanceStream", 21), ("oneWayLossPeriodStream", 22), ("oneWayLossNoticeableRate", 23), ("oneWayLossPeriodTotal", 24), ("oneWayLossPeriodLengths", 25), ("oneWayInterLossPeriodLengths", 26), ("oneWayIpdv", 27), ("oneWayIpdvPoissonStream", 28), ("oneWayIpdvPercentile", 29), ("oneWayIpdvInversePercentile", 30), ("oneWayIpdvJitter", 31), ("oneWayPeakToPeakIpdv", 32), ("oneWayDelayPeriodicStream", 33), ("roundtripDelayAverage", 34), ("roundtripPacketLoss", 35), ("roundtripPacketLossAverage", 36), ("roundtripIpdv", 37))))
if mibBuilder.loadTexts: etsysSrvcLvlMetricIndex.setStatus('current')
etsysSrvcLvlMetricCapabilities = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("notImplemented", 0), ("implemented", 1))).clone('implemented')).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlMetricCapabilities.setStatus('current')
etsysSrvcLvlMetricType = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("network", 0), ("aggregated", 1))).clone('aggregated')).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlMetricType.setStatus('current')
etsysSrvcLvlMetricUnit = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("noUnit", 0), ("second", 1), ("millisecond", 2), ("microsecond", 3), ("nanosecond", 4), ("percentage", 5), ("packet", 6), ("byte", 7), ("kilobyte", 8), ("megabyte", 9)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlMetricUnit.setStatus('current')
etsysSrvcLvlMetricDescription = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 1, 3, 1, 5), SnmpAdminString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlMetricDescription.setStatus('current')
etsysSrvcLvlOwnersTable = MibTable((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1), )
if mibBuilder.loadTexts: etsysSrvcLvlOwnersTable.setStatus('current')
etsysSrvcLvlOwnersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1), ).setIndexNames((0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersIndex"))
if mibBuilder.loadTexts: etsysSrvcLvlOwnersEntry.setStatus('current')
etsysSrvcLvlOwnersIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlOwnersIndex.setStatus('current')
etsysSrvcLvlOwnersOwner = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 2), EtsysSrvcLvlOwnerString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersOwner.setStatus('current')
etsysSrvcLvlOwnersGrantedMetrics = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 3), EtsysSrvcLvlStandardMetrics()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersGrantedMetrics.setStatus('current')
etsysSrvcLvlOwnersQuota = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 4), Integer32()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersQuota.setStatus('current')
etsysSrvcLvlOwnersIpAddressType = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 5), InetAddressType()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersIpAddressType.setStatus('current')
etsysSrvcLvlOwnersIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 6), InetAddress().subtype(subtypeSpec=ValueSizeConstraint(1, 128))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersIpAddress.setStatus('current')
etsysSrvcLvlOwnersEmail = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 7), SnmpAdminString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersEmail.setStatus('current')
etsysSrvcLvlOwnersSMS = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 8), SnmpAdminString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersSMS.setStatus('current')
etsysSrvcLvlOwnersStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 2, 1, 1, 9), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlOwnersStatus.setStatus('current')
etsysSrvcLvlHistoryTable = MibTable((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1), )
if mibBuilder.loadTexts: etsysSrvcLvlHistoryTable.setStatus('current')
etsysSrvcLvlHistoryEntry = MibTableRow((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1), ).setIndexNames((0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryMeasureOwner"), (0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryMeasureIndex"), (0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryMetricIndex"), (0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryIndex"))
if mibBuilder.loadTexts: etsysSrvcLvlHistoryEntry.setStatus('current')
etsysSrvcLvlHistoryMeasureOwner = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 1), EtsysSrvcLvlOwnerString())
if mibBuilder.loadTexts: etsysSrvcLvlHistoryMeasureOwner.setStatus('current')
etsysSrvcLvlHistoryMeasureIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlHistoryMeasureIndex.setStatus('current')
etsysSrvcLvlHistoryMetricIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlHistoryMetricIndex.setStatus('current')
etsysSrvcLvlHistoryIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlHistoryIndex.setStatus('current')
etsysSrvcLvlHistorySequence = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlHistorySequence.setStatus('current')
etsysSrvcLvlHistoryTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 6), GMTTimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlHistoryTimestamp.setStatus('current')
etsysSrvcLvlHistoryValue = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 3, 1, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlHistoryValue.setStatus('current')
etsysSrvcLvlNetMeasureTable = MibTable((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1), )
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureTable.setStatus('current')
etsysSrvcLvlNetMeasureEntry = MibTableRow((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1), ).setIndexNames((0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureOwner"), (0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureIndex"))
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureEntry.setStatus('current')
etsysSrvcLvlNetMeasureOwner = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 1), EtsysSrvcLvlOwnerString())
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureOwner.setStatus('current')
etsysSrvcLvlNetMeasureIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureIndex.setStatus('current')
etsysSrvcLvlNetMeasureName = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 3), SnmpAdminString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureName.setStatus('current')
etsysSrvcLvlNetMeasureMetrics = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 4), EtsysSrvcLvlStandardMetrics()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureMetrics.setStatus('current')
etsysSrvcLvlNetMeasureBeginTime = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 5), GMTTimeStamp()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureBeginTime.setStatus('current')
etsysSrvcLvlNetMeasureDurationUnit = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 6), TimeUnit().clone('second')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDurationUnit.setStatus('current')
etsysSrvcLvlNetMeasureDuration = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 7), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDuration.setStatus('current')
etsysSrvcLvlNetMeasureHistorySize = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureHistorySize.setStatus('current')
etsysSrvcLvlNetMeasureFailureMgmtMode = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("auto", 1), ("manual", 2), ("discarded", 3))).clone('auto')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureFailureMgmtMode.setStatus('current')
etsysSrvcLvlNetMeasureResultsMgmt = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("wrap", 1), ("suspend", 2), ("delete", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureResultsMgmt.setStatus('current')
etsysSrvcLvlNetMeasureSrcTypeP = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 11), TypeP().clone('ip')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureSrcTypeP.setStatus('current')
etsysSrvcLvlNetMeasureSrc = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 12), TypePaddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureSrc.setStatus('current')
etsysSrvcLvlNetMeasureDstTypeP = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 13), TypeP().clone('ip')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDstTypeP.setStatus('current')
etsysSrvcLvlNetMeasureDst = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 14), TypePaddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDst.setStatus('current')
etsysSrvcLvlNetMeasureTxMode = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 15), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("other", 0), ("periodic", 1), ("poisson", 2), ("multiburst", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureTxMode.setStatus('current')
etsysSrvcLvlNetMeasureTxPacketRateUnit = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 16), TimeUnit()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureTxPacketRateUnit.setStatus('current')
etsysSrvcLvlNetMeasureTxPacketRate = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 17), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureTxPacketRate.setStatus('current')
etsysSrvcLvlNetMeasureDevtnOrBurstSize = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 18), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDevtnOrBurstSize.setStatus('current')
etsysSrvcLvlNetMeasureMedOrIntBurstSize = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 19), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureMedOrIntBurstSize.setStatus('current')
etsysSrvcLvlNetMeasureLossTimeout = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 20), Integer32()).setUnits('Milliseconds').setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureLossTimeout.setStatus('current')
etsysSrvcLvlNetMeasureL3PacketSize = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 21), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureL3PacketSize.setStatus('current')
etsysSrvcLvlNetMeasureDataPattern = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 22), OctetString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureDataPattern.setStatus('current')
etsysSrvcLvlNetMeasureMap = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 23), SnmpAdminString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureMap.setStatus('current')
etsysSrvcLvlNetMeasureSingletons = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 24), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureSingletons.setStatus('current')
etsysSrvcLvlNetMeasureOperState = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 1, 1, 25), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("unknown", 0), ("running", 1), ("stopped", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlNetMeasureOperState.setStatus('current')
etsysSrvcLvlAggrMeasureTable = MibTable((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2), )
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureTable.setStatus('current')
etsysSrvcLvlAggrMeasureEntry = MibTableRow((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1), ).setIndexNames((0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureOwner"), (0, "ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureIndex"))
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureEntry.setStatus('current')
etsysSrvcLvlAggrMeasureOwner = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 1), EtsysSrvcLvlOwnerString())
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureOwner.setStatus('current')
etsysSrvcLvlAggrMeasureIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)))
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureIndex.setStatus('current')
etsysSrvcLvlAggrMeasureName = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 3), SnmpAdminString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureName.setStatus('current')
etsysSrvcLvlAggrMeasureMetrics = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 4), EtsysSrvcLvlStandardMetrics()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureMetrics.setStatus('current')
etsysSrvcLvlAggrMeasureBeginTime = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 5), GMTTimeStamp()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureBeginTime.setStatus('current')
etsysSrvcLvlAggrMeasureAggrPeriodUnit = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 6), TimeUnit().clone('second')).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureAggrPeriodUnit.setStatus('current')
etsysSrvcLvlAggrMeasureAggrPeriod = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 7), Integer32()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureAggrPeriod.setStatus('current')
etsysSrvcLvlAggrMeasureDurationUnit = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 8), TimeUnit().clone('second')).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureDurationUnit.setStatus('current')
etsysSrvcLvlAggrMeasureDuration = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 9), Integer32()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureDuration.setStatus('current')
etsysSrvcLvlAggrMeasureHistorySize = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 10), Integer32()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureHistorySize.setStatus('current')
etsysSrvcLvlAggrMeasureStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 11), StorageType().clone('volatile')).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureStorageType.setStatus('current')
etsysSrvcLvlAggrMeasureResultsMgmt = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("wrap", 1), ("suspend", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureResultsMgmt.setStatus('current')
etsysSrvcLvlAggrMeasureHistoryOwner = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 13), EtsysSrvcLvlOwnerString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureHistoryOwner.setStatus('current')
etsysSrvcLvlAggrMeasureHistoryOwnerIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 14), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureHistoryOwnerIndex.setStatus('current')
etsysSrvcLvlAggrMeasureHistoryMetric = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 15), Integer32()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureHistoryMetric.setStatus('current')
etsysSrvcLvlAggrMeasureAdminState = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("start", 0), ("stop", 1)))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureAdminState.setStatus('current')
etsysSrvcLvlAggrMeasureMap = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 17), SnmpAdminString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureMap.setStatus('current')
etsysSrvcLvlAggrMeasureStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 18), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: etsysSrvcLvlAggrMeasureStatus.setStatus('current')
etsysSrvcLvlReportingConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2))
etsysSrvcLvlReportingGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 1))
etsysSrvcLvlReportingCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 2))
etsysSrvcLvlSystemGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 1, 1)).setObjects(("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlSystemTime"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlSystemClockResolution"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMetricCapabilities"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMetricType"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMetricUnit"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMetricDescription"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    etsysSrvcLvlSystemGroup = etsysSrvcLvlSystemGroup.setStatus('current')
etsysSrvcLvlOwnersGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 1, 2)).setObjects(("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersOwner"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersGrantedMetrics"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersQuota"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersIpAddressType"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersIpAddress"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersEmail"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersSMS"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    etsysSrvcLvlOwnersGroup = etsysSrvcLvlOwnersGroup.setStatus('current')
etsysSrvcLvlHistoryGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 1, 3)).setObjects(("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistorySequence"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryTimestamp"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryValue"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    etsysSrvcLvlHistoryGroup = etsysSrvcLvlHistoryGroup.setStatus('current')
etsysSrvcLvlMeasureGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 1, 4)).setObjects(("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureName"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureMetrics"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureBeginTime"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDurationUnit"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDuration"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureHistorySize"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureFailureMgmtMode"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureResultsMgmt"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureSrcTypeP"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureSrc"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDstTypeP"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDst"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureTxMode"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureTxPacketRateUnit"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureTxPacketRate"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDevtnOrBurstSize"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureMedOrIntBurstSize"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureLossTimeout"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureL3PacketSize"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureDataPattern"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureMap"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureSingletons"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlNetMeasureOperState"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureName"), 
("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureMetrics"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureBeginTime"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureAggrPeriodUnit"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureAggrPeriod"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureDurationUnit"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureDuration"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureHistorySize"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureStorageType"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureResultsMgmt"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureHistoryOwner"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureHistoryOwnerIndex"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureHistoryMetric"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureAdminState"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureMap"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlAggrMeasureStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    etsysSrvcLvlMeasureGroup = etsysSrvcLvlMeasureGroup.setStatus('current')
etsysSrvcLvlReportingCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 2, 2, 1)).setObjects(("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlSystemGroup"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlOwnersGroup"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlHistoryGroup"), ("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", "etsysSrvcLvlMeasureGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    etsysSrvcLvlReportingCompliance = etsysSrvcLvlReportingCompliance.setStatus('current')
mibBuilder.exportSymbols("ENTERASYS-SERVICE-LEVEL-REPORTING-MIB", etsysSrvcLvlAggrMeasureHistoryMetric=etsysSrvcLvlAggrMeasureHistoryMetric, etsysSrvcLvlHistoryMeasureIndex=etsysSrvcLvlHistoryMeasureIndex, etsysSrvcLvlNetMeasureName=etsysSrvcLvlNetMeasureName, TimeUnit=TimeUnit, etsysSrvcLvlAggrMeasureStatus=etsysSrvcLvlAggrMeasureStatus, etsysSrvcLvlAggrMeasureMetrics=etsysSrvcLvlAggrMeasureMetrics, etsysSrvcLvlAggrMeasureDuration=etsysSrvcLvlAggrMeasureDuration, etsysServiceLevelReportingMIB=etsysServiceLevelReportingMIB, etsysSrvcLvlNetMeasureIndex=etsysSrvcLvlNetMeasureIndex, etsysSrvcLvlReportingGroups=etsysSrvcLvlReportingGroups, etsysSrvcLvlNetMeasureDuration=etsysSrvcLvlNetMeasureDuration, etsysSrvcLvlHistoryEntry=etsysSrvcLvlHistoryEntry, etsysSrvcLvlAggrMeasureIndex=etsysSrvcLvlAggrMeasureIndex, etsysSrvcLvlOwnersTable=etsysSrvcLvlOwnersTable, etsysSrvcLvlNetMeasureDurationUnit=etsysSrvcLvlNetMeasureDurationUnit, EtsysSrvcLvlOwnerString=EtsysSrvcLvlOwnerString, etsysSrvcLvlNetMeasureSrcTypeP=etsysSrvcLvlNetMeasureSrcTypeP, etsysSrvcLvlAggrMeasureBeginTime=etsysSrvcLvlAggrMeasureBeginTime, etsysSrvcLvlSystemClockResolution=etsysSrvcLvlSystemClockResolution, etsysSrvcLvlHistory=etsysSrvcLvlHistory, etsysSrvcLvlConfigObjects=etsysSrvcLvlConfigObjects, etsysSrvcLvlHistoryMetricIndex=etsysSrvcLvlHistoryMetricIndex, PYSNMP_MODULE_ID=etsysServiceLevelReportingMIB, TypePaddress=TypePaddress, etsysSrvcLvlNetMeasureHistorySize=etsysSrvcLvlNetMeasureHistorySize, etsysSrvcLvlReportingConformance=etsysSrvcLvlReportingConformance, etsysSrvcLvlNetMeasureFailureMgmtMode=etsysSrvcLvlNetMeasureFailureMgmtMode, etsysSrvcLvlAggrMeasureMap=etsysSrvcLvlAggrMeasureMap, etsysSrvcLvlNetMeasureMetrics=etsysSrvcLvlNetMeasureMetrics, etsysSrvcLvlNetMeasureOwner=etsysSrvcLvlNetMeasureOwner, etsysSrvcLvlAggrMeasureHistorySize=etsysSrvcLvlAggrMeasureHistorySize, etsysSrvcLvlNetMeasureDevtnOrBurstSize=etsysSrvcLvlNetMeasureDevtnOrBurstSize, 
etsysSrvcLvlNetMeasureEntry=etsysSrvcLvlNetMeasureEntry, etsysSrvcLvlNetMeasureTxPacketRate=etsysSrvcLvlNetMeasureTxPacketRate, etsysSrvcLvlAggrMeasureOwner=etsysSrvcLvlAggrMeasureOwner, etsysSrvcLvlHistoryTimestamp=etsysSrvcLvlHistoryTimestamp, etsysSrvcLvlOwnersEmail=etsysSrvcLvlOwnersEmail, etsysSrvcLvlAggrMeasureTable=etsysSrvcLvlAggrMeasureTable, etsysSrvcLvlOwnersGroup=etsysSrvcLvlOwnersGroup, etsysSrvcLvlOwnersSMS=etsysSrvcLvlOwnersSMS, etsysSrvcLvlNetMeasureTable=etsysSrvcLvlNetMeasureTable, EtsysSrvcLvlStandardMetrics=EtsysSrvcLvlStandardMetrics, etsysSrvcLvlMetricIndex=etsysSrvcLvlMetricIndex, etsysSrvcLvlOwnersStatus=etsysSrvcLvlOwnersStatus, etsysSrvcLvlHistorySequence=etsysSrvcLvlHistorySequence, etsysSrvcLvlHistoryGroup=etsysSrvcLvlHistoryGroup, etsysSrvcLvlAggrMeasureAggrPeriod=etsysSrvcLvlAggrMeasureAggrPeriod, etsysSrvcLvlNetMeasureTxPacketRateUnit=etsysSrvcLvlNetMeasureTxPacketRateUnit, etsysSrvcLvlOwnersOwner=etsysSrvcLvlOwnersOwner, etsysSrvcLvlAggrMeasureEntry=etsysSrvcLvlAggrMeasureEntry, etsysSrvcLvlNetMeasureL3PacketSize=etsysSrvcLvlNetMeasureL3PacketSize, etsysSrvcLvlNetMeasureSrc=etsysSrvcLvlNetMeasureSrc, etsysSrvcLvlHistoryIndex=etsysSrvcLvlHistoryIndex, etsysSrvcLvlReportingCompliance=etsysSrvcLvlReportingCompliance, etsysSrvcLvlMetricTable=etsysSrvcLvlMetricTable, etsysSrvcLvlOwnersIpAddressType=etsysSrvcLvlOwnersIpAddressType, etsysSrvcLvlOwnersGrantedMetrics=etsysSrvcLvlOwnersGrantedMetrics, etsysSrvcLvlMeasure=etsysSrvcLvlMeasure, etsysSrvcLvlNetMeasureMap=etsysSrvcLvlNetMeasureMap, etsysSrvcLvlNetMeasureMedOrIntBurstSize=etsysSrvcLvlNetMeasureMedOrIntBurstSize, etsysSrvcLvlAggrMeasureResultsMgmt=etsysSrvcLvlAggrMeasureResultsMgmt, etsysSrvcLvlAggrMeasureAggrPeriodUnit=etsysSrvcLvlAggrMeasureAggrPeriodUnit, etsysSrvcLvlOwnersEntry=etsysSrvcLvlOwnersEntry, etsysSrvcLvlHistoryValue=etsysSrvcLvlHistoryValue, etsysSrvcLvlAggrMeasureHistoryOwnerIndex=etsysSrvcLvlAggrMeasureHistoryOwnerIndex, 
etsysSrvcLvlNetMeasureDataPattern=etsysSrvcLvlNetMeasureDataPattern, etsysSrvcLvlNetMeasureTxMode=etsysSrvcLvlNetMeasureTxMode, etsysSrvcLvlMetricType=etsysSrvcLvlMetricType, etsysSrvcLvlReportingCompliances=etsysSrvcLvlReportingCompliances, etsysSrvcLvlOwnersQuota=etsysSrvcLvlOwnersQuota, etsysSrvcLvlAggrMeasureName=etsysSrvcLvlAggrMeasureName, etsysSrvcLvlMetricCapabilities=etsysSrvcLvlMetricCapabilities, etsysSrvcLvlNetMeasureLossTimeout=etsysSrvcLvlNetMeasureLossTimeout, GMTTimeStamp=GMTTimeStamp, etsysSrvcLvlMetricEntry=etsysSrvcLvlMetricEntry, etsysSrvcLvlOwnersIpAddress=etsysSrvcLvlOwnersIpAddress, etsysSrvcLvlOwners=etsysSrvcLvlOwners, etsysSrvcLvlMeasureGroup=etsysSrvcLvlMeasureGroup, etsysSrvcLvlAggrMeasureDurationUnit=etsysSrvcLvlAggrMeasureDurationUnit, etsysSrvcLvlMetricUnit=etsysSrvcLvlMetricUnit, etsysSrvcLvlNetMeasureSingletons=etsysSrvcLvlNetMeasureSingletons, etsysSrvcLvlNetMeasureDstTypeP=etsysSrvcLvlNetMeasureDstTypeP, etsysSrvcLvlHistoryMeasureOwner=etsysSrvcLvlHistoryMeasureOwner, etsysSrvcLvlSystemGroup=etsysSrvcLvlSystemGroup, etsysSrvcLvlSystem=etsysSrvcLvlSystem, etsysSrvcLvlHistoryTable=etsysSrvcLvlHistoryTable, etsysSrvcLvlNetMeasureBeginTime=etsysSrvcLvlNetMeasureBeginTime, etsysSrvcLvlSystemTime=etsysSrvcLvlSystemTime, etsysSrvcLvlOwnersIndex=etsysSrvcLvlOwnersIndex, etsysSrvcLvlMetricDescription=etsysSrvcLvlMetricDescription, etsysSrvcLvlNetMeasureOperState=etsysSrvcLvlNetMeasureOperState, etsysSrvcLvlAggrMeasureHistoryOwner=etsysSrvcLvlAggrMeasureHistoryOwner, etsysSrvcLvlNetMeasureResultsMgmt=etsysSrvcLvlNetMeasureResultsMgmt, etsysSrvcLvlAggrMeasureAdminState=etsysSrvcLvlAggrMeasureAdminState, TypeP=TypeP, etsysSrvcLvlAggrMeasureStorageType=etsysSrvcLvlAggrMeasureStorageType, etsysSrvcLvlNetMeasureDst=etsysSrvcLvlNetMeasureDst)
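Every object in the generated module above is identified by an OID tuple under the Enterasys enterprise arc (1.3.6.1.4.1.5624). As a quick, stdlib-only sketch — the `dotted` helper is ours, not part of the generated MIB code — such a tuple can be rendered in the dotted notation used by SNMP tooling:

```python
# Render a pysnmp-style OID tuple as a dotted string.
# `dotted` is a hypothetical helper, not part of the generated module.
def dotted(oid):
    return ".".join(str(arc) for arc in oid)

# OID of etsysSrvcLvlAggrMeasureStatus from the table above.
status_oid = (1, 3, 6, 1, 4, 1, 5624, 1, 2, 39, 1, 4, 2, 1, 18)
print(dotted(status_oid))  # -> 1.3.6.1.4.1.5624.1.2.39.1.4.2.1.18
```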
# Generated by Django 2.2 on 2019-04-25 16:50
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0006_auto_20190423_1348'),
        ('core', '0006_auto_20190424_1412'),
    ]

    operations = [
    ]
# encoding: utf-8
from __future__ import unicode_literals
from textwrap import dedent
# The quick brown fox jumps over the lazy dog
data1 = dedent("""
1\tThe\tthe\tDET\tDT\tDefinite=Def|PronType=Art\t4\tdet\t_\t_
2\tquick\tquick\tADJ\tJJ\tDegree=Pos\t4\tamod\t_\t_
3\tbrown\tbrown\tADJ\tJJ\tDegree=Pos\t4\tamod\t_\t_
4\tfox\tfox\tNOUN\tNN\tNumber=Sing\t5\tnsubj\t_\t_
5\tjumps\tjump\tVERB\tVBZ\tMood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin\t0\troot\t_\t_
6\tover\tover\tADP\tIN\t_\t9\tcase\t_\t_
7\tthe\tthe\tDET\tDT\tDefinite=Def|PronType=Art\t9\tdet\t_\t_
8\tlazy\tlazy\tADJ\tJJ\tDegree=Pos\t9\tamod\t_\t_
9\tdog\tdog\tNOUN\tNN\tNumber=Sing\t5\tnmod\t_\tSpaceAfter=No
10\t.\t.\tPUNCT\t.\t_\t5\tpunct\t_\t_
""")
# Då var han elva år.
data2 = dedent("""
1 Då då ADV AB _
2 var vara VERB VB.PRET.ACT Tense=Past|Voice=Act
3 han han PRON PN.UTR.SIN.DEF.NOM Case=Nom|Definite=Def|Gender=Com|Number=Sing
4 elva elva NUM RG.NOM Case=Nom|NumType=Card
5 år år NOUN NN.NEU.PLU.IND.NOM Case=Nom|Definite=Ind|Gender=Neut|Number=Plur
6 . . PUNCT DL.MAD _
""")
# They buy and sell books
data3 = dedent("""
1 They they PRON PRP Case=Nom|Number=Plur 2 nsubj 2:nsubj|4:nsubj
2 buy buy VERB VBP Number=Plur|Person=3|Tense=Pres 0 root 0:root
3 and and CONJ CC _ 4 cc 4:cc
4 sell sell VERB VBP Number=Plur|Person=3|Tense=Pres 2 conj 0:root|2:conj
5 books book NOUN NNS Number=Plur 2 obj 2:obj|4:obj
6 . . PUNCT . _ 2 punct 2:punct
""")
data4 = dedent("""
# sent_id = 1
# text = They buy and sell books.
1 They they PRON PRP Case=Nom|Number=Plur 2 nsubj 2:nsubj|4:nsubj _
2 buy buy VERB VBP Number=Plur|Person=3|Tense=Pres 0 root 0:root _
3 and and CONJ CC _ 4 cc 4:cc _
4 sell sell VERB VBP Number=Plur|Person=3|Tense=Pres 2 conj 0:root|2:conj _
5 books book NOUN NNS Number=Plur 2 obj 2:obj|4:obj SpaceAfter=No
6 . . PUNCT . _ 2 punct 2:punct _
# sent_id = 2
# text = I have no clue.
1 I I PRON PRP Case=Nom|Number=Sing|Person=1 2 nsubj _ _
2 have have VERB VBP Number=Sing|Person=1|Tense=Pres 0 root _ _
3 no no DET DT PronType=Neg 4 det _ _
4 clue clue NOUN NN Number=Sing 2 obj _ SpaceAfter=No
5 . . PUNCT . _ 2 punct _ _
""")
data5 = dedent("""
1 Jag jag PRON PRP Case=Nom|Definite=Def|Gender=Com|Number=Sing|PronType=Prs 2 nsubj _ _
2 längtar längta VERB VBP Mood=Ind|Tense=Pres|VerbForm=Fin|Voice=Act 0 root _ _
""")
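The fixtures above are plain CoNLL-U text, so a single token line can be split into its ten columns with stdlib string handling alone. A minimal sketch (not part of the fixture module), using a tab-separated line in the style of `data1`:

```python
# Split one CoNLL-U token line into its columns.
# data1 uses tab separators; data2/data3 separate columns with runs of
# spaces instead, so those would need split() on arbitrary whitespace.
line = "5\tjumps\tjump\tVERB\tVBZ\tMood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin\t0\troot\t_\t_"
fields = line.split("\t")
token_id, form, lemma, upos = fields[0], fields[1], fields[2], fields[3]
print(token_id, form, lemma, upos)  # -> 5 jumps jump VERB
```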
from dataclasses import dataclass
from functools import reduce, singledispatch
from operator import or_
from typing import Any, Set

from matcho.bindings import Repeating

__all__ = ["build_template", "insert"]


def insert(name: str):
    """Mark a place in the template where to insert the value bound to name."""
    return Insert(name)


@dataclass
class Insert:
    name: str

    def __hash__(self):
        return hash(self.name)


class Template:
    def instantiate(self, bindings, nesting_level):
        """Instantiate this template in the context of some bindings and nesting level."""
        raise NotImplementedError(f"{self.__class__.__name__}.instantiate()")

    def insertions(self) -> Set[str]:
        raise NotImplementedError(f"{self.__class__.__name__}.insertions()")

    def __call__(self, bindings, nesting_level=()):
        return self.instantiate(bindings, nesting_level)


@dataclass
class LiteralTemplate(Template):
    value: Any

    def instantiate(self, bindings, nesting_level):
        return self.value

    def insertions(self) -> Set[str]:
        return set()

    def __hash__(self):
        return hash(self.value)


@singledispatch
def build_template(spec) -> Template:
    """Build a template from a specification.

    The resulting template is an object that, when called with a set of
    bindings (as produced by a matcher from `build_matcher`), returns
    an instance of the template with names substituted by their bound values.

    This is a generic function. Support for additional template specifications
    can be added with the `build_template.register(<type>, <handler>)` function.
    See the documentation of `functools.singledispatch` for further information.
    """
    return LiteralTemplate(spec)


@build_template.register(Insert)
def _(insert_spec):
    return InsertionTemplate(insert_spec.name)


@dataclass
class InsertionTemplate(Template):
    """Template that is substituted with values bound to name."""

    name: str

    def instantiate(self, bindings, nesting_level):
        value = get_nested(bindings[self.name], nesting_level)
        if isinstance(value, Repeating):
            raise ValueError(f"{self.name} is still repeating at this level")
        return value

    def insertions(self) -> Set[str]:
        return {self.name}

    def __hash__(self):
        return hash(self.name)


@build_template.register(list)
def build_list_template(template):
    """Build a template that constructs lists.

    Typically, `build_template` should be used instead, which delegates to
    this function where appropriate.
    """
    if len(template) > 0 and template[0] is ...:
        raise ValueError("Ellipsis must be preceded by another list element")

    for a, b in zip(template, template[1:]):
        if a is ... and b is not ...:
            raise ValueError("Ellipsis can't be followed by non-ellipsis list elements")

    if len(template) > 2 and template[-2:] == [..., ...]:
        items = template[:-2]
        return FlattenListTemplate(items)

    if len(template) > 1 and template[-1] is ...:
        items1 = template[:-2]
        rep = template[-2]
        return VariableListTemplate(items1, rep)

    return FixedListTemplate(template)


class FlattenListTemplate(Template):
    """Template that flattens one level of nesting."""

    def __init__(self, items):
        self.deep_template = build_list_template([[*items, ...], ...])

    def instantiate(self, bindings, nesting_level):
        return flatten(self.deep_template(bindings, nesting_level))

    def insertions(self) -> Set[str]:
        return self.deep_template.insertions()


def flatten(sequence):
    """Remove one level of nesting from a sequence of sequences
    by concatenating all inner sequences to one list."""
    result = []
    for s in sequence:
        result.extend(s)
    return result


class FixedListTemplate(Template):
    """Template for lists of fixed length."""

    def __init__(self, list_template):
        self.templates = [build_template(t) for t in list_template]

    def instantiate(self, bindings, nesting_level):
        return [x(bindings, nesting_level) for x in self.templates]

    def insertions(self) -> Set[str]:
        return reduce(or_, (t.insertions() for t in self.templates), set())


class VariableListTemplate(Template):
    """Template for lists of variable length."""

    def __init__(self, items, rep):
        self.fixed_template = FixedListTemplate(items)
        self.repeated_template = build_template(rep)
        self.names_in_rep = self.repeated_template.insertions()

    def instantiate(self, bindings, nesting_level):
        fixed_part = self.fixed_template.instantiate(bindings, nesting_level)
        rep_len = common_repetition_length(bindings, nesting_level, self.names_in_rep)
        variable_part = [
            self.repeated_template.instantiate(bindings, nesting_level + (i,))
            for i in range(rep_len)
        ]
        return fixed_part + variable_part

    def insertions(self) -> Set[str]:
        return self.fixed_template.insertions() | self.repeated_template.insertions()


def common_repetition_length(bindings, nesting_level, used_names):
    """Try to find a common length suitable for all used bindings at the given nesting level."""
    length = None
    for name in used_names:
        value = get_nested(bindings[name], nesting_level)
        if isinstance(value, Repeating):
            multiplicity = len(value.values)
            if length is None:
                length = multiplicity
            else:
                if multiplicity != length:
                    raise ValueError(
                        f"{name}'s number of values {multiplicity} "
                        f"does not match other bindings of length {length}"
                    )
                assert length == multiplicity
    if length is None:
        raise ValueError("no repeated bindings")
    return length


@build_template.register(dict)
class DictTemplate(Template):
    """Template for dictionaries"""

    def __init__(self, dict_spec):
        self.item_templates = {
            build_template(k): build_template(v) for k, v in dict_spec.items()
        }

    def instantiate(self, bindings, nesting_level):
        return {
            k(bindings, nesting_level): v(bindings, nesting_level)
            for k, v in self.item_templates.items()
        }

    def insertions(self) -> Set[str]:
        names = set()
        for k, v in self.item_templates.items():
            names |= k.insertions()
            names |= v.insertions()
        return names


def get_nested(value, nesting_level):
    """Get the value of nested repeated bindings."""
    while nesting_level != ():
        if not isinstance(value, Repeating):
            break
        value = value.values[nesting_level[0]]
        nesting_level = nesting_level[1:]
    return value
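To see how `get_nested` walks nested `Repeating` bindings, here is a self-contained sketch with a stand-in `Repeating` dataclass (the real one lives in `matcho.bindings`) and a copy of the helper:

```python
from dataclasses import dataclass


@dataclass
class Repeating:
    """Stand-in for matcho.bindings.Repeating: a list of values per repetition."""
    values: list


def get_nested(value, nesting_level):
    """Copy of the helper above: index into nested Repeating values."""
    while nesting_level != ():
        if not isinstance(value, Repeating):
            break
        value = value.values[nesting_level[0]]
        nesting_level = nesting_level[1:]
    return value


# Two levels of repetition: outer index 1, inner index 0 selects "c".
binding = Repeating([Repeating(["a", "b"]), Repeating(["c", "d"])])
print(get_nested(binding, (1, 0)))  # -> c
```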
import pendulum
from deploy_tools.deploy import load_contracts_json
from eth_utils import to_checksum_address


def get_merkle_drop_status(web3, contract_address):
    compiled_contracts = load_contracts_json(__name__)

    merkle_drop_contract = web3.eth.contract(
        address=contract_address, abi=compiled_contracts["MerkleDrop"]["abi"]
    )
    token_contract = web3.eth.contract(
        address=merkle_drop_contract.functions.droppedToken().call(),
        abi=compiled_contracts["ERC20Interface"]["abi"],
    )

    return {
        "address": to_checksum_address(merkle_drop_contract.address),
        "root": merkle_drop_contract.functions.root().call(),
        "decay_start_time": merkle_drop_contract.functions.decayStartTime().call(),
        "decay_duration_in_seconds": merkle_drop_contract.functions.decayDurationInSeconds().call(),
        "initial_balance": merkle_drop_contract.functions.initialBalance().call(),
        "remaining_value": merkle_drop_contract.functions.remainingValue().call(),
        "spent_tokens": merkle_drop_contract.functions.spentTokens().call(),
        "token_address": to_checksum_address(token_contract.address),
        "token_name": token_contract.functions.name().call(),
        "token_symbol": token_contract.functions.symbol().call(),
        "token_decimals": token_contract.functions.decimals().call(),
        "token_balance": token_contract.functions.balanceOf(
            merkle_drop_contract.address
        ).call(),
        "decayed_remaining_value": merkle_drop_contract.functions.decayedEntitlementAtTime(
            merkle_drop_contract.functions.remainingValue().call(),
            pendulum.now().int_timestamp,
            True,
        ).call(),
    }
from utils import CanadianJurisdiction

class BritishColumbiaMunicipalitiesCandidates(CanadianJurisdiction):
    classification = 'executive'  # just to avoid clash
    division_id = 'ocd-division/country:ca/province:bc'
    division_name = 'British Columbia'
    name = 'British Columbia municipal councils'
    url = 'http://civicinfo.bc.ca'

    def get_organizations(self):
        return []
import sys
import os
import re
import ast
from ast import literal_eval

label = {'negative': 0, 'positive': 1, 'neutral': 2}


def preprocess(string):
    string = re.sub(r"[^A-Za-z0-9(),!?\'\`]", " ", string)
    string = re.sub(r"\'s", " \'s", string)
    string = re.sub(r"\'ve", " \'ve", string)
    string = re.sub(r"n\'t", " n\'t", string)
    string = re.sub(r"\'re", " \'re", string)
    string = re.sub(r"\'d", " \'d", string)
    string = re.sub(r"\'ll", " \'ll", string)
    string = re.sub(r",", " , ", string)
    string = re.sub(r"!", " ! ", string)
    string = re.sub(r"\(", " \( ", string)
    string = re.sub(r"\)", " \) ", string)
    string = re.sub(r"\?", " \? ", string)
    string = re.sub(r"\s{2,}", " ", string)
    return string.strip()


def load_data(dataset):
    temp = open(dataset + "atsa_train.json", "r", encoding="ISO-8859-1").read()
    train = literal_eval(temp)

    train_sentence = []
    train_aspect = []
    train_sentiment = []
    for i in train:
        if i['sentiment'] != 'conflict':
            train_sentence.append(preprocess(i["sentence"]))
            train_aspect.append(preprocess(i["aspect"]))
            train_sentiment.append(label[i["sentiment"]])

    temp = open(dataset + "atsa_test.json", "r", encoding="ISO-8859-1").read()
    test = literal_eval(temp)

    test_sentence = []
    test_aspect = []
    test_sentiment = []
    for i in test:
        if i['sentiment'] != 'conflict':
            test_sentence.append(preprocess(i["sentence"]))
            test_aspect.append(preprocess(i["aspect"]))
            test_sentiment.append(label[i["sentiment"]])

    return train_sentence, test_sentence, train_aspect, test_aspect, train_sentiment, test_sentiment
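The cleaning pipeline in `preprocess` can be exercised on its own. The sketch below is a condensed copy of the same substitution rules (the backslash artifacts in the paren/question-mark replacement strings are dropped here for clarity), applied to a made-up sample review:

```python
import re

# Condensed copy of the loader's cleaning rules, applied in the same order.
rules = [
    (r"[^A-Za-z0-9(),!?'`]", " "),  # strip everything but a small whitelist
    (r"'s", " 's"),
    (r"'ve", " 've"),
    (r"n't", " n't"),
    (r"'re", " 're"),
    (r"'d", " 'd"),
    (r"'ll", " 'll"),
    (r",", " , "),
    (r"!", " ! "),
    (r"\(", " ( "),
    (r"\)", " ) "),
    (r"\?", " ? "),
    (r"\s{2,}", " "),  # collapse runs of whitespace
]


def preprocess(string):
    for pattern, repl in rules:
        string = re.sub(pattern, repl, string)
    return string.strip()


print(preprocess("The food wasn't great, honestly!"))
# -> The food was n't great , honestly !
```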
class User:
"""
Class that generates new users login system
"""
def __init__(self,fullname, email, username, password):
self.fullname = fullname
self.email = email
self.username = username
self.password = password
user_list = []
def save_user(self):
"""
method that saves user object to user_list
"""
User.user_list.append(self)
@classmethod
def user_exists(cls, username):
"""
Method that checks user existense in the user list.
Args:
username: user to search if the username exists
Returns Boolean: True or false accordingly
"""
for user in cls.user_list:
if user.username == username:
return True
else:
return False
@classmethod
def find_by_username(cls,username):
for user in cls.user_list:
if user.username == username:
return user
else:
return 0 | 23.577778 | 59 | 0.544769 | 114 | 1,061 | 4.95614 | 0.385965 | 0.084956 | 0.031858 | 0.042478 | 0.169912 | 0.169912 | 0.169912 | 0.169912 | 0.169912 | 0.169912 | 0 | 0.001538 | 0.38737 | 1,061 | 45 | 60 | 23.577778 | 0.867692 | 0.221489 | 0 | 0.347826 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173913 | false | 0.086957 | 0 | 0 | 0.434783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
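# A minimal, hypothetical usage sketch of such a registry (the class name and
# field values are illustrative, not part of the original). Note that the
# membership check must decide only after scanning the whole list.

class UserRegistry:
    """Compact stand-in mirroring the User class API above (assumed fields)."""
    user_list = []

    def __init__(self, fullname, email, username, password):
        self.fullname, self.email = fullname, email
        self.username, self.password = username, password

    def save_user(self):
        UserRegistry.user_list.append(self)

    @classmethod
    def user_exists(cls, username):
        # Scan the whole list before deciding; an early `return False`
        # inside the loop would miss users stored later in the list.
        return any(u.username == username for u in cls.user_list)

UserRegistry("Ada Lovelace", "ada@example.com", "ada", "secret").save_user()
print(UserRegistry.user_exists("ada"))  # → True
print(UserRegistry.user_exists("bob"))  # → False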
# WEB_DRIVER_PATH = 'C:\chromedriver.exe'
# XLSX_PATH = 'C:/dateGirls/m3/suyo_sample.xls'
# START_DATE = ['20170220', '20170220', '20170220', '20170220']
# END_DATE = ['20170421', '20170421', '20170421', '20170421']
# DINING_NAME = ['나노하나', '바라티에', '서대문양꼬치', '백곰막걸리']
# BROAD_NAME = '수요미식회'
# BROAD_DATE = ['20170322', '20170322', '20170322', '20170322']
# WEB_DRIVER_PATH = 'C:\chromedriver.exe'
# XLSX_PATH = 'C:/dateGirls/m3/suyo_sample.xls'
# START_DATE = ['20170220', '20170220']
# END_DATE = ['20170421', '20170421']
# DINING_NAME = ['서대문양꼬치', '백곰막걸리']
# BROAD_NAME = '수요미식회'
# BROAD_DATE = ['20170322', '20170322']
WEB_DRIVER_PATH = r'C:\chromedriver.exe'  # raw string avoids the invalid \c escape warning
XLSX_PATH = 'C:/dateGirls/m3/sangsang_sample.xls'
START_DATE = ['20170507', '20170507', '20170514', '20170514']
END_DATE = ['20170706', '20170706', '20170714', '20170714']
DINING_NAME = ['아티장베이커스', '악소', '부첼리하우스', '볼트스테이크하우스']
BROAD_NAME = '생생정보통'
BROAD_DATE = ['20170607', '20170607', '20170614', '20170614']
# WEB_DRIVER_PATH = 'C:\chromedriver.exe'
# XLSX_PATH = 'C:/dateGirls/m3/mashit_sample.xls'
# START_DATE = ['20170202', '20170202', '20170209', '20170216']
# END_DATE = ['20170401', '20170401', '20170408', '20170415']
# DINING_NAME = ['등촌최월선칼국수', '도셰프', '현대북어집', '포브라더스']
# BROAD_NAME = '맛있는녀석들'
# BROAD_DATE = ['20170303', '20170303', '20170310', '20170317']
# WEB_DRIVER_PATH = 'C:\chromedriver.exe'
# XLSX_PATH = 'C:/dateGirls/m3/mashit_sample.xls'
# START_DATE = ['20170302']
# END_DATE = ['20170305']
# DINING_NAME = ['등촌최월선칼국수']
# BROAD_NAME = '맛있는녀석들'
# BROAD_DATE = ['20170303']
budget = float(input())
season = input()
region = ""
final_budget = 0
accommodation = ""
if budget <= 100:
    region = "Bulgaria"
    if season == "summer":
        accommodation = "Camp"
        final_budget = budget * 0.7
    else:
        accommodation = "Hotel"
        final_budget = budget * 0.30
elif 100 < budget <= 1000:
    region = "Balkans"
    if season == "summer":
        accommodation = "Camp"
        final_budget = budget * 0.6
    else:
        accommodation = "Hotel"
        final_budget = budget * 0.20
else:
    region = "Europe"
    accommodation = "Hotel"
    # Europe spends 10% of the budget regardless of season.
    final_budget = budget * 0.1
print(f"Somewhere in {region}")
print(f"{accommodation} - {budget - final_budget:.2f}")
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 2021/2/4 22:50
# @Author : cendeavor
# @Site :
# @File : task_tool.py
# @Software: PyCharm
def update_tweet_task(dbname, skip=1):
    from weibo.dbscripts.m.task import TweetTaskDBM
    from weibo.settings import MONGO_URI, MONGO_USER, MONGO_PASSWORD

    # Settle finished crawler tasks.
    task_dbm = TweetTaskDBM(
        uri=MONGO_URI,
        dbname=dbname,
        username=MONGO_USER,
        password=MONGO_PASSWORD
    )
    task_dbm.update_tasks(skip=skip)
    task_dbm.close()


def add_longtext_task(dbname):
    from weibo.dbscripts.m.task import LongTweetTaskDBM
    from weibo.settings import MONGO_URI, MONGO_USER, MONGO_PASSWORD

    task_dbm = LongTweetTaskDBM(
        uri=MONGO_URI,
        dbname=dbname,
        username=MONGO_USER,
        password=MONGO_PASSWORD
    )
    task_dbm.add_tasks(
        quantity=0,  # task_num; 0 means add all tasks
    )
    task_dbm.close()


def update_profile_task(dbname):
    from weibo.dbscripts.m.task import ProfileTaskDBM
    from weibo.settings import MONGO_URI, MONGO_USER, MONGO_PASSWORD

    # Settle finished crawler tasks.
    task_dbm = ProfileTaskDBM(
        uri=MONGO_URI,
        dbname=dbname,
        username=MONGO_USER,
        password=MONGO_PASSWORD
    )
    task_dbm.update_tasks()
    task_dbm.close()


def recover_profile_task(dbname):
    # TODO: if registration_time is '', reset profile in user_state_item to False
    pass


if __name__ == '__main__':
    # add_longtext_task(dbname="control_weibo")
    # update_profile_task(dbname="mdd_weibo")
    update_tweet_task(dbname="mdd_weibo", skip=3)
from ..utils.constants import *
from ..utils.vector3 import vec3, rgb, extract
from functools import reduce
from ..ray import Ray, get_raycolor
from .. import lights
import numpy as np
from . import Material
from ..textures import *


class Emissive(Material):
    def __init__(self, color, **kwargs):
        if isinstance(color, vec3):
            self.texture_color = solid_color(color)
        elif isinstance(color, texture):
            self.texture_color = color
        super().__init__(**kwargs)

    def get_color(self, scene, ray, hit):
        diff_color = self.texture_color.get_color(hit)
        return diff_color
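# The defining property of Emissive is that get_color ignores scene and ray
# entirely and just samples the texture at the hit point. A self-contained
# sketch of that behavior; `SolidColor` and `EmissiveSketch` are invented
# stand-ins, not the sightpy classes themselves.

class SolidColor:
    """Invented minimal texture: the same color everywhere (assumed interface)."""
    def __init__(self, color):
        self.color = color
    def get_color(self, hit):
        return self.color

class EmissiveSketch:
    def __init__(self, color):
        self.texture_color = SolidColor(color)
    def get_color(self, scene, ray, hit):
        # Emitted light does not depend on incoming illumination.
        return self.texture_color.get_color(hit)

light = EmissiveSketch((1.0, 0.9, 0.8))
print(light.get_color(scene=None, ray=None, hit=None))  # → (1.0, 0.9, 0.8)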
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Mar 12 09:53:06 2019
@author: abdel62
"""
import numpy as np
# Calibrated using the RGBDemo Calibration tool:
# http://labs.manctl.com/rgbdemo/
#
# The maximum depth used, in meters.
maxDepth = 10
# RGB Intrinsic Parameters
fx_rgb = 5.1885790117450188e+02
fy_rgb = 5.1946961112127485e+02
cx_rgb = 3.2558244941119034e+02
cy_rgb = 2.5373616633400465e+02
# RGB Distortion Parameters
k1_rgb = 2.0796615318809061e-01
k2_rgb = -5.8613825163911781e-01
p1_rgb = 7.2231363135888329e-04
p2_rgb = 1.0479627195765181e-03
k3_rgb = 4.9856986684705107e-01
# Depth Intrinsic Parameters
fx_d = 5.8262448167737955e+02
fy_d = 5.8269103270988637e+02
cx_d = 3.1304475870804731e+02
cy_d = 2.3844389626620386e+02
# RGB Distortion Parameters
k1_d = -9.9897236553084481e-02
k2_d = 3.9065324602765344e-01
p1_d = 1.9290592870229277e-03
p2_d = -1.9422022475975055e-03
k3_d = -5.1031725053400578e-01
# Rotation
R = -np.array([9.9997798940829263e-01, 5.0518419386157446e-03,
               4.3011152014118693e-03, -5.0359919480810989e-03,
               9.9998051861143999e-01, -3.6879781309514218e-03,
               -4.3196624923060242e-03, 3.6662365748484798e-03,
               9.9998394948385538e-01], dtype=np.float32)
R = np.reshape(R, (3, 3))
R = np.linalg.inv(R)
# 3D Translation
t_x = 2.5031875059141302e-02
t_z = -2.9342312935846411e-04
t_y = 6.6238747008330102e-04
# Parameters for making depth absolute.
depthParam1 = 351.3
depthParam2 = 1092.5
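# As a sanity check on the depth intrinsics above, a pinhole projection sketch
# (lens distortion ignored; the `project` helper is illustrative, not part of
# the original file).

import numpy as np

# Depth-camera intrinsics copied from above.
fx_d, fy_d = 5.8262448167737955e+02, 5.8269103270988637e+02
cx_d, cy_d = 3.1304475870804731e+02, 2.3844389626620386e+02

def project(point):
    """Project a 3D point (camera frame, meters) to pixel coordinates."""
    X, Y, Z = point
    return fx_d * X / Z + cx_d, fy_d * Y / Z + cy_d

# A point on the optical axis lands exactly on the principal point (cx_d, cy_d).
u, v = project(np.array([0.0, 0.0, 1.0]))
print(u, v)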
from typing import Callable
import pandas as pd
import pytest
import requests
from pybaseball.team_batting import team_batting
@pytest.fixture(name="sample_html")
def _sample_html(get_data_file_contents: Callable) -> str:
return get_data_file_contents('team_batting.html')
@pytest.fixture(name="sample_processed_result")
def _sample_processed_result(get_data_file_dataframe: Callable) -> pd.DataFrame:
return get_data_file_dataframe('team_batting.csv')
def test_team_batting(response_get_monkeypatch: Callable, sample_html: str, sample_processed_result: pd.DataFrame):
season = 2019
response_get_monkeypatch(sample_html)
team_batting_result = team_batting(season).reset_index(drop=True)
pd.testing.assert_frame_equal(team_batting_result, sample_processed_result, check_dtype=False)
from .e70_thing import E70Thing
from dataclasses import dataclass


@dataclass
class E72LegalObject(E70Thing):
    """
    Scope note:
    This class comprises those material or immaterial items to which instances of E30 Right, such as the right of ownership or use, can be applied.
    This is true for all instances of E18 Physical Thing. In the case of instances of E28 Conceptual Object, however, the identity of an instance of E28 Conceptual Object or the method of its use may be too ambiguous to reliably establish instances of E30 Right, as in the case of taxa and inspirations. Ownership of corporations is currently regarded as out of scope of the CIDOC CRM.
    Examples:
    - the Cullinan diamond (E19) (Scarratt and Shor, 2006)
    - definition of the CIDOC Conceptual Reference Model Version 5.0.4 (E73) (ISO 21127: 2004)
    In First Order Logic:
    E72(x) ⊃ E70(x)
    """

    TYPE_URI = "http://erlangen-crm.org/current/E72_Legal_Object"