hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5009f7da66f6e63e2db54c0c3b7128421b2a8194 | 596 | py | Python | src/globus_cli/constants.py | sirosen/temp-cli-test | 416fd3fea17b4c7c2cf35d6ccde63cb5719a1af6 | [
"Apache-2.0"
] | 47 | 2016-04-21T19:51:17.000Z | 2022-02-25T14:13:30.000Z | src/globus_cli/constants.py | sirosen/temp-cli-test | 416fd3fea17b4c7c2cf35d6ccde63cb5719a1af6 | [
"Apache-2.0"
] | 421 | 2016-04-20T18:45:24.000Z | 2022-03-14T14:50:41.000Z | src/globus_cli/constants.py | sirosen/temp-cli-test | 416fd3fea17b4c7c2cf35d6ccde63cb5719a1af6 | [
"Apache-2.0"
] | 20 | 2016-09-10T20:25:27.000Z | 2021-10-06T16:02:47.000Z | """
This module is used to define constants used throughout the code.
It should not depend on any other part of the globus-cli codebase.
(If you need to import something else, maybe it's not simple enough to be a constant...)
"""
__all__ = ["EXPLICIT_NULL"]
class _ExplicitNullClass:
    """
    Magic sentinel value used to disambiguate values which are being
    intentionally nulled from values which are `None` because no argument was
    provided
    """

    def __bool__(self):
        return False

    def __repr__(self):
        return "null"

EXPLICIT_NULL = _ExplicitNullClass()
| 22.923077 | 88 | 0.708054 | 81 | 596 | 5.012346 | 0.777778 | 0.029557 | 0.068966 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223154 | 596 | 25 | 89 | 23.84 | 0.87689 | 0.620805 | 0 | 0 | 0 | 0 | 0.087179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0.285714 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
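The `_ExplicitNullClass` sentinel in the row above distinguishes "explicitly set to null" from "argument not provided" (plain `None`). A minimal sketch of how such a sentinel can be consumed — the class is reproduced for self-containment, and the `update_field` helper is hypothetical, not part of globus-cli:

```python
class _ExplicitNullClass:
    """Sentinel: value intentionally nulled, as opposed to None (not provided)."""

    def __bool__(self):
        return False

    def __repr__(self):
        return "null"

EXPLICIT_NULL = _ExplicitNullClass()

def update_field(current, new):
    # Hypothetical helper: None means "leave unchanged",
    # EXPLICIT_NULL means "clear the stored value".
    if new is None:
        return current
    if new is EXPLICIT_NULL:
        return None
    return new
```

Because identity (`is`) rather than equality is used, no user-supplied value can accidentally collide with the sentinel.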
501eb39c07b319bc55ef0596e494297651c3e113 | 237 | py | Python | legal-report/ssf_legal_report.py | mindthegab/contrib-toolbox | 5ed82e6cb427ecaf0a77fd1be0093304bdf3784d | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2016-06-14T21:25:00.000Z | 2016-06-14T21:25:00.000Z | legal-report/ssf_legal_report.py | mindthegab/contrib-toolbox | 5ed82e6cb427ecaf0a77fd1be0093304bdf3784d | [
"ECL-2.0",
"Apache-2.0"
] | 7 | 2016-05-26T14:15:43.000Z | 2018-04-11T00:09:24.000Z | legal-report/ssf_legal_report.py | mindthegab/contrib-toolbox | 5ed82e6cb427ecaf0a77fd1be0093304bdf3784d | [
"ECL-2.0",
"Apache-2.0"
] | 5 | 2016-05-26T22:35:59.000Z | 2022-02-09T03:21:48.000Z | from legal_report_utils import loadConfig, checkGithubOrg
# Prerequisites:
# - Python 2.7+
# - git available on command-line
# - Apache Maven
# - Leiningen
# - npm install -g license-report
config = loadConfig()
checkGithubOrg(config)
| 19.75 | 57 | 0.751055 | 28 | 237 | 6.285714 | 0.857143 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00995 | 0.151899 | 237 | 11 | 58 | 21.545455 | 0.865672 | 0.50211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
50201fdf0615fcaa22e22c8d4f1ba42f579bba87 | 117 | py | Python | sm/settings.py | NinaCalvi/OKBC | e25ad0296137ed354593c74509b077a22f60425e | [
"MIT"
] | 6 | 2020-07-06T14:31:18.000Z | 2021-09-13T10:15:14.000Z | sm/settings.py | NinaCalvi/OKBC | e25ad0296137ed354593c74509b077a22f60425e | [
"MIT"
] | 2 | 2021-09-12T17:49:09.000Z | 2021-09-14T15:28:54.000Z | sm/settings.py | NinaCalvi/OKBC | e25ad0296137ed354593c74509b077a22f60425e | [
"MIT"
] | 1 | 2021-06-07T01:46:44.000Z | 2021-06-07T01:46:44.000Z | import torch
cuda = False

def set_settings(args):
    global cuda
    cuda = args.cuda and torch.cuda.is_available() | 19.5 | 52 | 0.726496 | 18 | 117 | 4.611111 | 0.666667 | 0.216867 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188034 | 117 | 6 | 52 | 19.5 | 0.873684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
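The `sm/settings.py` row above uses a module-level flag that is rebound from parsed arguments via `global`. A minimal sketch of the same pattern without the torch dependency (the `--cuda` flag name mirrors the original; the hardware-availability check is omitted here):

```python
import argparse

cuda = False  # module-level default, mirrors the original settings module

def set_settings(args):
    # Rebind the module-level flag from the parsed arguments;
    # callers elsewhere read settings.cuda after this runs.
    global cuda
    cuda = bool(args.cuda)

parser = argparse.ArgumentParser()
parser.add_argument("--cuda", action="store_true")
set_settings(parser.parse_args(["--cuda"]))
```

Note that importers must read the attribute late (`settings.cuda`), not via `from settings import cuda`, or they will capture the pre-`set_settings` value.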
503330fabb56549f18452db02a63ac505843a166 | 250 | py | Python | spinta/backends/mongo/helpers.py | atviriduomenys/spinta | 77a10e201f8cdc63143fce7996fd0898acb1ff58 | [
"MIT"
] | 2 | 2019-03-14T06:41:14.000Z | 2019-03-26T11:48:14.000Z | spinta/backends/mongo/helpers.py | sirex/spinta | 77a10e201f8cdc63143fce7996fd0898acb1ff58 | [
"MIT"
] | 44 | 2019-04-05T15:52:45.000Z | 2022-03-30T07:41:33.000Z | spinta/backends/mongo/helpers.py | sirex/spinta | 77a10e201f8cdc63143fce7996fd0898acb1ff58 | [
"MIT"
] | 1 | 2019-04-01T09:54:27.000Z | 2019-04-01T09:54:27.000Z | from spinta.utils.schema import NA
from spinta.components import DataSubItem, Action
def inserting(data: DataSubItem):
    return (
        data.root.action == Action.INSERT or
        (data.root.action == Action.UPSERT and data.saved is NA)
    )
| 25 | 64 | 0.7 | 33 | 250 | 5.30303 | 0.606061 | 0.114286 | 0.16 | 0.228571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208 | 250 | 9 | 65 | 27.777778 | 0.883838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0.142857 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
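The `inserting` predicate above is True for plain inserts, and for upserts where nothing was previously saved (`data.saved is NA`). A standalone sketch with stand-in types — the real `DataSubItem`, `Action`, and `NA` live in spinta; these minimal substitutes are assumptions:

```python
import enum
from types import SimpleNamespace

NA = object()  # stand-in for spinta's "not available" sentinel

class Action(enum.Enum):
    INSERT = "insert"
    UPSERT = "upsert"
    UPDATE = "update"

def inserting(data) -> bool:
    # True for plain inserts, or for upserts that found no saved record.
    return (
        data.root.action == Action.INSERT
        or (data.root.action == Action.UPSERT and data.saved is NA)
    )

# An upsert with no saved record behaves like an insert:
item = SimpleNamespace(root=SimpleNamespace(action=Action.UPSERT), saved=NA)
```

As with `EXPLICIT_NULL` earlier in this listing, identity comparison against a unique sentinel avoids ambiguity with legitimate falsy values.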
504a8ce1cbeccd16814262aef9462b54d5115c97 | 124 | py | Python | func_nesting.py | oushu1zhangxiangxuan1/pylayground | 22590b10a5de7e07149e4a6029a094d51d2e48a4 | [
"Apache-2.0"
] | null | null | null | func_nesting.py | oushu1zhangxiangxuan1/pylayground | 22590b10a5de7e07149e4a6029a094d51d2e48a4 | [
"Apache-2.0"
] | null | null | null | func_nesting.py | oushu1zhangxiangxuan1/pylayground | 22590b10a5de7e07149e4a6029a094d51d2e48a4 | [
"Apache-2.0"
] | null | null | null |
def outer():
    def inner():
        print(out_var)
    out_var = 10
    inner()

if __name__ == "__main__":
    outer() | 11.272727 | 26 | 0.524194 | 15 | 124 | 3.666667 | 0.666667 | 0.218182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024096 | 0.330645 | 124 | 11 | 27 | 11.272727 | 0.638554 | 0 | 0 | 0 | 0 | 0 | 0.065041 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.285714 | 0.142857 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3
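The `func_nesting.py` row above works because a nested function resolves names from the enclosing scope at call time, not at definition time — so `inner` can read `out_var` even though the assignment appears after `inner` is defined. The same demonstration, returning the value instead of printing it so the behavior is observable:

```python
def outer():
    def inner():
        # out_var is looked up in the enclosing scope when inner()
        # runs, not when inner is defined.
        return out_var
    out_var = 10  # bound after inner's definition, before its call
    return inner()

result = outer()  # → 10
```

Calling `inner()` before `out_var` is assigned would instead raise `NameError`, which is the usual pitfall with this pattern.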
5053caa50c0717fd8beb048c4ff6591cb65746e4 | 24,165 | py | Python | code/plot_fig3_vs_mag_single_NL_beta_gamma_delta.py | plazas/wfirst-detectors-vnl | db77f55cc3fe540092061319355731e45cd0071e | [
"MIT"
] | 3 | 2016-01-07T19:30:43.000Z | 2020-07-16T09:57:03.000Z | code/plot_fig3_vs_mag_single_NL_beta_gamma_delta.py | plazas/wfirst-detectors-vnl | db77f55cc3fe540092061319355731e45cd0071e | [
"MIT"
] | null | null | null | code/plot_fig3_vs_mag_single_NL_beta_gamma_delta.py | plazas/wfirst-detectors-vnl | db77f55cc3fe540092061319355731e45cd0071e | [
"MIT"
] | null | null | null | #!/usr/bin/python
import numpy as np
import os
import sys
import math
import matplotlib
matplotlib.use('Pdf')
import matplotlib.pyplot as plt
from mpl_toolkits.axes_grid1 import make_axes_locatable
from matplotlib.backends.backend_pdf import PdfPages
import matplotlib.font_manager as fm
## 6-1-15
## Simple code to explore NL as a function of beta, by using interleaving method
import logging
logging.basicConfig(format="%(message)s", level=logging.INFO, stream=sys.stdout)
logger = logging.getLogger("tests_hsm_interleaving")
from galsim.cdmodel import *
from sim2 import * ## where all the BF stuff is
from scipy import optimize
from measurement_function import *
### DATA
pp=PdfPages("out.pdf")
print "Output PDF: out.pdf"
#### PLOTS
#### Do the plotting here
plt.minorticks_on()
#plt.tight_layout()
### We do not have matplotlib 1.1, with the 'style' package. Modify the matplotlibrc file parameters instead
import matplotlib as mpl
mpl.rc('lines', linewidth=1, color='black', linestyle='-')
mpl.rc('font', family='serif',weight='normal', size=6.0 )
mpl.rc('text', color='black', usetex=False)
mpl.rc('axes', edgecolor='black', linewidth=1, grid=False, titlesize=10, labelsize=9, labelweight='normal',labelcolor='black')
mpl.rc('axes.formatter', limits=[-4,4])
mpl.rcParams['xtick.major.size']=6
mpl.rcParams['xtick.minor.size']=3
mpl.rcParams['xtick.major.pad']=7
mpl.rcParams['xtick.minor.pad']=7
mpl.rcParams['xtick.labelsize']= '8'
mpl.rcParams['xtick.minor.width']= 1.0
mpl.rcParams['xtick.major.width']= 1.0
mpl.rcParams['ytick.major.size']=6
mpl.rcParams['ytick.minor.size']=3
mpl.rcParams['ytick.major.pad']=7
mpl.rcParams['ytick.minor.pad']=7
mpl.rcParams['ytick.labelsize']= '8'
mpl.rcParams['ytick.minor.width']= 1.0
mpl.rcParams['ytick.major.width']= 1.0
mpl.rc ('legend', numpoints=1, fontsize='8', shadow=False, frameon=False)
## Plot parameters
plt.subplots_adjust(hspace=0.01, wspace=0.01)
prop = fm.FontProperties(size=7)
marker_size=5.0
alpha=0.6
loc_label = "upper right"
visible_x, visible_y = True, True
#1. Data vectors
x_vec=[18.3, 19, 20, 21, 22]
x_vec_b=x_vec
x_vec_g=x_vec
x_vec_d=x_vec
# 1.1 Figure 3
# 1.1.1 beta
# 1.1.1.1 J filter
e1_inter_vec_j_b, e2_inter_vec_j_b, size_inter_vec_j_b = [0.00019690167158842084, 0.00010330328717827806, 4.1090967133641158e-05, 1.6333376988768579e-05, 6.49617984890943e-06] ,np.abs(np.array([-0.0033104890584945679, -0.001771166920661928, -0.0007142011821269983, -0.00028577730059623857, -0.00011399552226066728])) , np.abs(np.array([-0.015468129925790559, -0.0082765317459168736, -0.0033382015864973046, -0.0013359327502772077, -0.00053295610537621547] ))
e1_inter_vec_err_j_b, e2_inter_vec_err_j_b, size_inter_vec_err_j_b = [4.5027901288919729e-07, 2.7099822075049764e-07, 1.6483804962879991e-07, 9.0399434375126951e-08, 5.4338115979028633e-08] ,[5.1665882330321957e-07, 3.8714352419471205e-07, 2.6773082645177064e-07, 1.7421866438137307e-07, 8.2560905260495884e-08], [3.8633672946091329e-07, 2.7587950308532963e-07, 1.6565816579355902e-07, 1.0819590261387035e-07, 7.2163445068632627e-08]
# 1.1.1.2 Y filter
e1_inter_vec_y_b, e2_inter_vec_y_b, size_inter_vec_y_b = np.abs(np.array([-6.0582254081964632e-05, -3.8728304207325155e-05, -1.73868052661421e-05, -7.250234484673136e-06, -2.9443949460985703e-06] )) , np.abs(np.array([-0.0038858753442764238, -0.0020605184137821187, -0.00082566767930984111, -0.00032951742410659347, -0.00013132020831108039])) , np.abs(np.array([-0.018537439609319344, -0.0099161888178844696, -0.0039984786445593475, -0.0015999666218593223, -0.00063826367402239131] ))
e1_inter_vec_err_y_b, e2_inter_vec_err_y_b, size_inter_vec_err_y_b= [1.9639530744538168e-07, 1.6050587366354696e-07, 1.223111167188192e-07, 7.4961838729719984e-08, 4.9591240195405975e-08] ,[4.0384005313401515e-07, 3.518402088691094e-07, 1.919970528752604e-07, 1.0285946798243457e-07, 7.0505486261789784e-08] ,[3.0281777684246864e-07, 2.158877468034303e-07, 1.2883778032467686e-07, 7.8181685470450067e-08, 6.1250973727782876e-08]
# 1.1.1.3 H filter
e1_inter_vec_h_b, e2_inter_vec_h_b, size_inter_vec_h_b = np.abs(np.array([0.00013764604926109309, 7.3166899383068153e-05, 2.9389420524239403e-05, 1.1742282658815523e-05, 4.6773254871369101e-06] )) , np.abs(np.array([-0.0017084905505180365, -0.00091335929930210207, -0.00036813631653785607, -0.00014727905392646846, -5.8750808238983152e-05] )) , np.abs(np.array([-0.010549136360830175, -0.0056202991322085302, -0.0022602570657424435, -0.00090348901776855176, -0.00036026690774752069] ))
e1_inter_vec_err_h_b, e2_inter_vec_err_h_b, size_inter_vec_err_h_b= [2.6058448509165922e-07, 2.2348437369231719e-07, 1.4832646787638415e-07, 8.8314072387274768e-08, 5.2075527814900713e-08] ,[4.3492770456502384e-07, 3.4859242684343871e-07, 2.2045707737736892e-07, 1.3609671933344581e-07, 9.4278552888972642e-08], [2.6308648042910477e-07, 2.1856459437983719e-07, 1.3137244764161994e-07, 8.557409932406714e-08, 5.5982585986227436e-08]
# 1.1.1.4 F filter
e1_inter_vec_f_b, e2_inter_vec_f_b, size_inter_vec_f_b= np.abs(np.array( [5.1516401581466078e-05, 2.7229278348386288e-05, 1.0894164443015853e-05, 4.3349573388694243e-06, 1.7254613339898716e-06] )) ,np.abs(np.array([-0.00062052235007286, -0.00032897140830755178, -0.00013185013085603576, -5.2637644112108889e-05, -2.0980872213838334e-05] )) , np.abs(np.array([-0.0051374091967324089, -0.0027170575868423787, -0.0010872939630307233, -0.00043376261322903729, -0.00017283056424823861] ))
e1_inter_vec_err_f_b, e2_inter_vec_err_f_b, size_inter_vec_err_f_b=[1.9524122880585939e-07, 1.5400325221101859e-07, 9.2211139359874765e-08, 6.9999482693809907e-08, 5.3099313917246978e-08] ,[2.9871207214388973e-07, 2.0320637017208355e-07, 1.4135832354677853e-07, 9.5641624835236111e-08, 6.4309549498402991e-08] ,[1.6791092499484376e-07, 1.4291496692226087e-07, 9.9742419348795944e-08, 7.4679276925686175e-08, 5.9380826250142824e-08]
# 1.1.2 gamma
# 1.1.2.1 J FILTER
e1_inter_vec_j_g, e2_inter_vec_j_g, size_inter_vec_j_g=np.abs(np.array([-0.00064297635108232453, -0.00022431382909417129, -3.6966064944863247e-05, -5.8864243328569583e-06, -9.3488022685066576e-07])) ,np.abs(np.array([0.017410580813884732, 0.0044732397794723464, 0.00069105848670005941, 0.00010906800627708018, 1.7274171113970115e-05] )), np.abs(np.array([0.087407041894478787, 0.021918423617126929, 0.0033732758145093712, 0.00053217012229681785, 8.427813662977845e-05] ))
e1_inter_vec_err_j_g, e2_inter_vec_err_j_g, size_inter_vec_err_j_g=[4.1755530512554943e-06, 1.5422316142910765e-06, 2.6405691893799887e-07, 7.9207821493668261e-08, 1.9535755850815507e-08] ,[8.6617029068346218e-07, 6.4009089177767134e-07, 2.5797843770932167e-07, 9.7605563256907843e-08, 1.7038876751351607e-08], [4.0078135204479973e-06, 1.2778627778803934e-06, 2.7961249409556268e-07, 8.0077361931685072e-08, 3.540078210877129e-08]
# 1.1.2.2 Y FILTER
e1_inter_vec_y_g, e2_inter_vec_y_g, size_inter_vec_y_g= np.abs(np.array([0.0016023941896855828, 0.00022506741806864732, 2.6559308171271478e-05, 3.9936788380145335e-06, 6.2670558690979683e-07] )),np.abs(np.array([0.019573820084333424, 0.0055881303548812886, 0.00088649779558182544, 0.00014045894145966187, 2.2265315055851885e-05] )), np.abs(np.array([0.1138750248505032, 0.028824834903240303, 0.0044399551902175195, 0.00070048696860347537, 0.00011094083424912027]))
e1_inter_vec_err_y_g, e2_inter_vec_err_y_g, size_inter_vec_err_y_g=[3.9371389829086751e-06, 1.910269065052068e-06, 3.8858551852386633e-07, 8.0849569054451458e-08, 1.8599274299666921e-08] ,[6.8556372322708642e-07, 5.4190538908321889e-07, 2.1826025372367053e-07, 1.1273632029855858e-07, 3.1398694973214966e-08] ,[3.4746726155439191e-06, 1.5773749949110623e-06, 3.3487770474988962e-07, 8.524033444978079e-08, 3.202366357730605e-08]
# 1.1.2.3 H FILTER
e1_inter_vec_h_g, e2_inter_vec_h_g, size_inter_vec_h_g = np.abs(np.array([-0.00052423992194235321, -0.00014080004766583445, -2.2071087732911023e-05, -3.4910067915914186e-06, -5.5159442126759616e-07] )), np.abs(np.array([0.0072325474768877018, 0.0018477221578359604, 0.00028596349060535221, 4.5143887400625053e-05, 7.1474164724358306e-06] )), np.abs(np.array([0.046285595624040866, 0.011999762523198365, 0.0018665867085863085, 0.0002949594203055206, 4.6727971812314806e-05] ))
e1_inter_vec_err_h_g, e2_inter_vec_err_h_g, size_inter_vec_err_h_g= [6.6101948654697875e-06, 2.0387236002652375e-06, 3.5254601612602619e-07, 6.366614306159372e-08, 1.733053684334119e-08] ,[7.6229459988004784e-07, 5.289516121421775e-07, 2.132914106409429e-07, 6.5555753023779919e-08, 4.0627121724738819e-08], [4.3162103029638919e-06, 1.3018150549729136e-06, 2.5814389326722608e-07, 5.7780888781154944e-08, 3.1774424110692337e-08]
# 1.1.2.4 F FILTER
e1_inter_vec_f_g, e2_inter_vec_f_g, size_inter_vec_f_g= np.abs(np.array([-0.00010281892493367206, -2.7936277911067224e-05, -4.4081034138799755e-06, -6.9647096097489281e-07, -1.0929536074413825e-07] )), np.abs(np.array([0.0012921334803104412, 0.00034799527376890363, 5.4777972400189484e-05, 8.6734816432022804e-06, 1.377351582051503e-06] )) , np.abs(np.array([0.011421910254137722, 0.003096164881458161, 0.00048824342912332061, 7.7314925928566593e-05, 1.2254329147878273e-05] ))
e1_inter_vec_err_f_g, e2_inter_vec_err_f_g, size_inter_vec_err_f_g= [3.0561947288736299e-07, 1.5051667544628253e-07, 6.7184678913064233e-08, 1.5193011440276204e-08, 1.3654169624445699e-08] ,[3.9935713879575187e-07, 2.504735465903926e-07, 1.0778826689522119e-07, 4.2951983255195341e-08, 3.9077762534797767e-08], [2.5322440499199287e-07, 1.5096605450915258e-07, 7.1382182762351393e-08, 3.5358605028954566e-08, 2.4459053146101336e-08]
# 1.1.3 delta
# 1.1.3.1 J FILTER
e1_inter_vec_j_d, e2_inter_vec_j_d, size_inter_vec_j_d= np.abs(np.array([0.0038559308042749761, 0.00062159046530723555, 3.6076754331588571e-05, 2.2584572434424659e-06, 1.4249235391608147e-07] )) , np.abs(np.array([-0.046415479108691218, -0.010180546343326569, -0.00067918896675109894, -4.2986571788786733e-05, -2.7129054069516269e-06] )), np.abs(np.array([-0.25139030299210652, -0.051859464193442735, -0.0034825298218656908, -0.00022060139045079641, -1.39255422258866e-05] ))
e1_inter_vec_err_j_d, e2_inter_vec_err_j_d, size_inter_vec_err_j_d=[0.00062216353723449627, 1.8884949951594682e-05, 7.9960667546142057e-07, 5.7218233710699224e-08, 3.1445069268441539e-09] ,[5.2090580850352196e-05, 1.8485612681719569e-06, 2.632069845864016e-07, 4.85412044459238e-08, 3.5388297114356716e-09] , [0.00031664967326526636, 1.2034640127765004e-05, 5.3798333991205246e-07, 5.1998895600055832e-08, 3.4465983764153874e-08]
# 1.1.3.2 Y FILTER
e1_inter_vec_y_d, e2_inter_vec_y_d, size_inter_vec_y_d= np.abs(np.array( [0.0041923589492216702, -5.6567788124084714e-05, -2.9280073940754284e-05, -1.9771419465547392e-06, -1.2468546628952026e-07] )) , np.abs(np.array([-0.073787294179201116, -0.015496996343135831, -0.00097189337015151451, -6.1226487159721512e-05, -3.862231969828933e-06] )) , np.abs(np.array([-0.32982131735350551, -0.074786729375086361, -0.0050467899233639334, -0.0003197236716009766, -2.0177746353168267e-05] ))
e1_inter_vec_err_y_d, e2_inter_vec_err_y_d, size_inter_vec_err_y_d=np.abs(np.array( [0.0036940571319651755, 0.00011879904456895412, 5.2375666241501541e-06, 3.1946492486990712e-07, 2.0509101843815907e-08] )) ,np.abs(np.array([0.00043676331755044233, 1.4749439486062841e-05, 6.6295003228236791e-07, 5.4576507288763654e-08, 5.8457390185910433e-09] )) , np.abs(np.array( [0.0019032169935499004, 9.7711332809103044e-05, 4.7679577743480444e-06, 3.0363803847647574e-07, 3.5842463256476226e-08] ))
# 1.1.3.3 H FILTER
e1_inter_vec_h_d, e2_inter_vec_h_d, size_inter_vec_h_d= np.abs(np.array([0.0016179979825392365, 0.0002721070777624846, 1.742012798786179e-05, 1.0961852967738931e-06, 7.0668756961805168e-08] )) , np.abs(np.array([-0.018451258167624475, -0.0034605363756418238, -0.00022788316011428957, -1.4420375227928023e-05, -9.1008841991410684e-07] )), np.abs(np.array([-0.13884910366706085, -0.024208182991226606, -0.0015776716576059247, -9.9749551047645377e-05, -6.2942585680225705e-06] ))
e1_inter_vec_err_h_d, e2_inter_vec_err_h_d, size_inter_vec_err_h_d= np.abs(np.array([0.00014684684435472165, 1.5735974201861641e-05, 9.5835743929463754e-07, 7.1890741074496526e-08, 1.2409522963511591e-08])) ,np.abs(np.array([1.1946212922394299e-05, 1.5994212074466431e-06, 2.2366866319197627e-07, 6.0118891345322573e-08, 2.8614497739762397e-09] )), np.abs(np.array( [8.8623611611446826e-05, 1.1410671466122349e-05, 6.9557459542341527e-07, 5.5444103812212157e-08, 3.0634736403512809e-08] ))
# 1.1.3.4 F FILTER
e1_inter_vec_f_d, e2_inter_vec_f_d, size_inter_vec_f_d= np.abs(np.array( [0.00020148088224232201, 2.9806769452988988e-05, 1.8785335123536802e-06, 1.1697411537152196e-07, 7.5343996286045274e-09] )) , np.abs(np.array( [-0.0024735460057854659, -0.00037379942834377139, -2.3754760622976546e-05, -1.5014037489877869e-06, -9.4771385192246597e-08] )) , np.abs(np.array([-0.023896247224743813, -0.0035584211283920687, -0.00022561053622218385, -1.4241436844301124e-05, -8.994277169549481e-07] ))
e1_inter_vec_err_f_d, e2_inter_vec_err_f_d, size_inter_vec_err_f_d= [2.2799255051447669e-06, 3.0640815860881143e-07, 6.1157170222198703e-08, 2.2597962227897496e-08, 1.906369765484394e-10] ,[5.5287308680572783e-07, 2.4992504794968476e-07, 7.5093536574620079e-08, 2.1409596942902262e-08, 1.8491854736139376e-09] , [1.2576420956984396e-06, 2.1421848073885994e-07, 6.5940281123907586e-08, 3.1052498694016888e-08, 1.3024972388178838e-08]
fig=plt.figure()
ax = fig.add_subplot (331)
ax.errorbar( x_vec_b, size_inter_vec_f_b, yerr= size_inter_vec_err_f_b, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, size_inter_vec_j_b, yerr= size_inter_vec_err_j_b, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, size_inter_vec_y_b, yerr= size_inter_vec_err_y_b, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, size_inter_vec_h_b, yerr= size_inter_vec_err_h_b, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
y1label=r"$|\Delta R/R|$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
plt.title (r"$\beta$", size=14)
ax = fig.add_subplot (332)
ax.errorbar( x_vec_g, size_inter_vec_f_g, yerr= size_inter_vec_err_f_g, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, size_inter_vec_j_g, yerr= size_inter_vec_err_j_g, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, size_inter_vec_y_g, yerr= size_inter_vec_err_y_g, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, size_inter_vec_h_g, yerr= size_inter_vec_err_h_g, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
plt.title (r"$\gamma$", size=14)
ax = fig.add_subplot (333)
ax.errorbar( x_vec_d, size_inter_vec_f_d, yerr= size_inter_vec_err_f_d, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, size_inter_vec_j_d, yerr= size_inter_vec_err_j_d, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, size_inter_vec_y_d, yerr= size_inter_vec_err_y_d, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, size_inter_vec_h_d, yerr= size_inter_vec_err_h_d, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
plt.title (r"$\delta$", size=14)
ax = fig.add_subplot (334)
ax.errorbar( x_vec_b, e1_inter_vec_f_b, yerr= e1_inter_vec_err_f_b, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e1_inter_vec_j_b, yerr= e1_inter_vec_err_j_b, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e1_inter_vec_y_b, yerr= e1_inter_vec_err_y_b, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e1_inter_vec_h_b, yerr= e1_inter_vec_err_h_b, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
y1label=r"$|\Delta$$e_1|$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
ax = fig.add_subplot (335)
ax.errorbar( x_vec_g, e1_inter_vec_f_g, yerr= e1_inter_vec_err_f_g, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e1_inter_vec_j_g, yerr= e1_inter_vec_err_j_g, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e1_inter_vec_y_g, yerr= e1_inter_vec_err_y_g, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e1_inter_vec_h_g, yerr= e1_inter_vec_err_h_g, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
ax = fig.add_subplot (336)
ax.errorbar( x_vec_d, e1_inter_vec_f_d, yerr= e1_inter_vec_err_f_d, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e1_inter_vec_j_d, yerr= e1_inter_vec_err_j_d, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e1_inter_vec_y_d, yerr= e1_inter_vec_err_y_d, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e1_inter_vec_h_d, yerr= e1_inter_vec_err_h_d, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=False)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=False)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
ax = fig.add_subplot (337)
ax.errorbar( x_vec_b, e2_inter_vec_f_b, yerr= e2_inter_vec_err_f_b, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e2_inter_vec_j_b, yerr= e2_inter_vec_err_j_b, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e2_inter_vec_y_b, yerr= e2_inter_vec_err_y_b, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_b, e2_inter_vec_h_b, yerr= e2_inter_vec_err_h_b, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=visible_x)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=visible_x)
ax.set_xscale('linear')
plt.ylim ([1e-5, 0.01])
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
y1label=r"$|\Delta$$e_2|$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
ax = fig.add_subplot (338)
ax.errorbar( x_vec_g, e2_inter_vec_f_g, yerr= e2_inter_vec_err_f_g, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e2_inter_vec_j_g, yerr= e2_inter_vec_err_j_g, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e2_inter_vec_y_g, yerr= e2_inter_vec_err_y_g, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_g, e2_inter_vec_h_g, yerr= e2_inter_vec_err_h_g, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=visible_x)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=visible_x)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
ax = fig.add_subplot (339)
# yerr below uses the *_d error vectors to match the *_d data (the *_g vectors here appeared to be a copy-paste slip)
ax.errorbar( x_vec_d, e2_inter_vec_f_d, yerr= e2_inter_vec_err_f_d, ecolor = 'r', label='F184', fmt='r:+', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e2_inter_vec_j_d, yerr= e2_inter_vec_err_j_d, ecolor = 'b', label='J129', fmt='b--o', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e2_inter_vec_y_d, yerr= e2_inter_vec_err_y_d, ecolor = 'g', label='Y106', fmt='g-s', markersize=marker_size, alpha=alpha)
ax.errorbar( x_vec_d, e2_inter_vec_h_d, yerr= e2_inter_vec_err_h_d, ecolor = 'y', label='H158', fmt='y-.x', markersize=marker_size, alpha=alpha)
plt.axhline(y=0.,color='k',ls='solid')
ax.set_xticklabels([int(x) for x in ax.get_xticks()], visible=visible_x)
x1label=r"mag"
lx=ax.set_xlabel(x1label, visible=visible_x)
ax.set_xscale('linear')
ax.set_yscale('log')
ax.set_yticklabels(ax.get_yticks(), visible= visible_y)
#y1label=r"$\Delta$e$_1/\beta$"
#y1label=r"$d_{\Delta R/R}$"
#ax.legend(loc=loc_label , fancybox=True, ncol=1, numpoints=1, prop = prop)
#ly=ax.set_ylabel(y1label, visible=visible_y, size=12)
xmin, xmax=plt.xlim()
delta=(xmax-xmin)
plt.xlim ([xmin - 0.02*delta, xmax + 0.02*delta])
fig.tight_layout()
fig.suptitle (" ", size=11)
plt.subplots_adjust(top=0.925)
pp.savefig()
pp.close()
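The 2% x-limit padding above is recomputed verbatim in every panel; as a minimal sketch, the arithmetic could be factored into one helper (the name `pad_limits` is hypothetical, not from this script):

```python
def pad_limits(xmin, xmax, frac=0.02):
    """Widen (xmin, xmax) by `frac` of the range on each side."""
    delta = xmax - xmin
    return xmin - frac * delta, xmax + frac * delta

# Same arithmetic as plt.xlim([xmin - 0.02*delta, xmax + 0.02*delta]) above.
print(pad_limits(0.0, 10.0))  # (-0.2, 10.2)
```

Each panel would then call `plt.xlim(pad_limits(*plt.xlim()))` instead of repeating the three-line computation.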
# ---- File: categorical_embedder/embedders/core/aux/loss_factory.py | repo: erelcan/categorical-embedder | license: Apache-2.0 ----
from keras import losses
from keras import backend as K
def get_loss_function(loss_info):
if "discriminative" in loss_info:
loss_dict = {}
for name, info in loss_info.items():
loss_dict[name] = _create_loss_function(info)
return loss_dict
else:
return _create_loss_function(loss_info)
def _create_loss_function(loss_info):
if loss_info is None:
return None
else:
if "class_weights" in loss_info:
return _class_weight_wrapper(loss_info["class_weights"], _loss_functions[loss_info["type"]](loss_info["parameters"]))
else:
return _loss_functions[loss_info["type"]](loss_info["parameters"])
def _get_cross_entropy_loss(from_logits=False, label_smoothing=0):
return lambda y_true, y_pred: losses.categorical_crossentropy(y_true, y_pred, from_logits, label_smoothing)
def _get_binary_cross_entropy_loss(from_logits=False, label_smoothing=0):
return lambda y_true, y_pred: losses.binary_crossentropy(y_true, y_pred, from_logits, label_smoothing)
def _get_sparse_categorical_crossentropy_loss(from_logits=False, axis=-1):
return lambda y_true, y_pred: losses.sparse_categorical_crossentropy(y_true, y_pred, from_logits, axis)
def _get_cosine_similarity_loss(axis=-1, squeeze=False):
def inner_func(y_true, y_pred):
if squeeze:
return -losses.cosine_similarity(K.cast_to_floatx(K.squeeze(y_true, -1)), K.squeeze(y_pred, -1), axis=axis)
else:
return -losses.cosine_similarity(K.cast_to_floatx(y_true), y_pred, axis=axis)
return inner_func
def _get_kl_divergence_loss():
return lambda y_true, y_pred: losses.kl_divergence(y_true, y_pred)
def _get_log_cosh_loss():
return lambda y_true, y_pred: losses.log_cosh(y_true, y_pred)
def _get_mean_absolute_percentage_error_loss():
return lambda y_true, y_pred: losses.mean_absolute_percentage_error(y_true, y_pred)
def _get_mean_squared_logarithmic_error_loss():
return lambda y_true, y_pred: losses.mean_squared_logarithmic_error(y_true, y_pred)
def _get_hinge_loss():
return lambda y_true, y_pred: losses.hinge(y_true, y_pred)
def _get_jaccard_distance_loss(smooth=100):
def jaccard_distance_loss(y_true, y_pred):
y_true = K.cast_to_floatx(y_true)
intersection = K.sum(K.abs(y_true * y_pred), axis=-1)
sum_ = K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1)
jac = (intersection + smooth) / (sum_ - intersection + smooth)
return (1 - jac) * smooth
return jaccard_distance_loss
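A pure-Python rendering of the smoothed Jaccard distance computed above, with plain lists standing in for Keras tensors (a sketch for illustration, not the module's API):

```python
def jaccard_distance(y_true, y_pred, smooth=100):
    """(1 - smoothed Jaccard index) * smooth, mirroring jaccard_distance_loss."""
    intersection = sum(abs(t * p) for t, p in zip(y_true, y_pred))
    total = sum(abs(t) + abs(p) for t, p in zip(y_true, y_pred))
    jac = (intersection + smooth) / (total - intersection + smooth)
    return (1 - jac) * smooth

print(jaccard_distance([1, 1, 0], [1, 1, 0]))  # 0.0 -- identical vectors
```

The `smooth` term keeps the ratio defined when both vectors are all zeros and scales the gradient away from the extremes.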
def _class_weight_wrapper(class_weights, loss_fn):
# Designed for 1D output!!
def inner_loss(y_true, y_pred):
#weight_list = class_weights
#fn = lambda x: weight_list[K.eval(x)]
#weights_per_sample = K.map_fn(fn, K.cast(y_true, dtype="int64"))
weights_per_sample = K.map_fn(lambda x: x * class_weights[1] + (1 - x) * class_weights[0], K.cast_to_floatx(y_true))
loss_per_sample = loss_fn(K.expand_dims(y_true), K.expand_dims(y_pred))
loss = K.mean(loss_per_sample * K.expand_dims(weights_per_sample))
return loss
return inner_loss
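The per-sample weight lambda inside `_class_weight_wrapper` maps a binary label to `class_weights[1]` (label 1) or `class_weights[0]` (label 0). A pure-Python sketch of that mapping (the function name is illustrative):

```python
def weights_per_sample(y_true, class_weights):
    """Binary labels: class_weights[1] for 1s, class_weights[0] for 0s."""
    return [y * class_weights[1] + (1 - y) * class_weights[0] for y in y_true]

print(weights_per_sample([0, 1, 1, 0], {0: 0.5, 1: 2.0}))  # [0.5, 2.0, 2.0, 0.5]
```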
_loss_functions = {
"cross_entropy": lambda parameters: _get_cross_entropy_loss(**parameters),
"binary_cross_entropy": lambda parameters: _get_binary_cross_entropy_loss(**parameters),
"sparse_cross_entropy": lambda parameters: _get_sparse_categorical_crossentropy_loss(**parameters),
"negative_cosine_similarity": lambda parameters: _get_cosine_similarity_loss(**parameters),
"kl_divergence": lambda parameters: _get_kl_divergence_loss(),
"log_cosh": lambda parameters: _get_log_cosh_loss(),
"MAPE": lambda parameters: _get_mean_absolute_percentage_error_loss(),
"MSLE": lambda parameters: _get_mean_squared_logarithmic_error_loss(),
"hinge": lambda parameters: _get_hinge_loss(),
"jaccard_distance": lambda parameters: _get_jaccard_distance_loss(**parameters)
}
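The module dispatches through the `_loss_functions` table: `_create_loss_function` looks up `loss_info["type"]` and calls the matching factory with `loss_info["parameters"]`. A self-contained sketch of that table-dispatch pattern, with a plain mean-squared-error standing in for the Keras losses (`_get_mse`, `_registry`, and `create_loss` are illustrative names, not part of the module):

```python
def _get_mse(scale=1.0):
    # Factory returning a closure, like _get_cross_entropy_loss etc. above.
    return lambda y_true, y_pred: scale * sum(
        (t - p) ** 2 for t, p in zip(y_true, y_pred)
    ) / len(y_true)

_registry = {
    "mse": lambda parameters: _get_mse(**parameters),
}

def create_loss(loss_info):
    return _registry[loss_info["type"]](loss_info["parameters"])

loss = create_loss({"type": "mse", "parameters": {"scale": 2.0}})
print(loss([1.0, 2.0], [1.0, 4.0]))  # 4.0
```

New loss types are added by registering one more factory entry, without touching the dispatch code.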
# ---- File: bioimageit_viewer/__init__.py | repo: bioimageit/bioimageit_viewer | license: BSD-2-Clause ----
from bioimageit_viewer.viewer import BiMultiViewer
__all__ = ['BiMultiViewer']
# ---- File: apps/controllerx/controllerx.py | repo: htvekov/controllerx | license: MIT ----
"""
Bring full functionality to light and media player controllers.
From turning devices on/off to changing the color lights.
https://github.com/xaviml/controllerx
"""
from cx_core import (
CallServiceController,
Controller,
CoverController,
CustomCoverController,
CustomLightController,
CustomMediaPlayerController,
CustomSwitchController,
LightController,
MediaPlayerController,
SwitchController,
)
from cx_devices.aqara import *
from cx_devices.ikea import *
from cx_devices.legrand import *
from cx_devices.lutron import *
from cx_devices.philips import *
from cx_devices.smartthings import *
from cx_devices.trust import *
# ---- File: src/database/migration/config.py | repo: dalmarcogd/challenge_acai | license: Apache-2.0 ----
from alembic.config import Config
from settings import BASE_DIR, DATABASE_URI
alembic_cfg = Config()
alembic_cfg.set_main_option("script_location", f"{BASE_DIR}/src/database/migration")
alembic_cfg.set_main_option("sqlalchemy.url", DATABASE_URI)
# ---- File: binding.gyp | repo: p8952/node-native-input | license: MIT ----
{
"targets": [
{
"target_name": "nodeNativeInput",
"sources": [
"src/nodeNativeInput.cpp",
"src/getOne/getOne.cpp",
"src/getTwo/getTwo.cpp",
"src/getThree/getThree.cpp"
],
"include_dirs": ["<!(node -e \"require('nan')\")"],
"conditions": [
["OS == \"win\"", {
"defines": ["Windows"],
"link_settings": {
"libraries": []
}
}],
["OS == \"mac\"", {
"defines": ["MacOS"],
"link_settings": {
"libraries": []
}
}],
["OS == \"linux\"", {
"defines": ["Linux"],
"link_settings": {
"libraries": []
}
}]
]
}
]
}
# ---- File: pyrgg/test.py | repo: sepandhaghighi/pyggen | license: MIT ----
# -*- coding: utf-8 -*-
"""Test file."""
"""
>>> from pyrgg import *
>>> import pyrgg.params
>>> import random
>>> import os
>>> import json
>>> import yaml
>>> import pickle
>>> pyrgg.params.PYRGG_TEST_MODE = True
>>> get_precision(2)
0
>>> get_precision(2.2)
1
>>> get_precision(2.22)
2
>>> get_precision(2.223)
3
>>> convert_str_to_number("20")
20
>>> convert_str_to_number("20.2")
20.2
>>> convert_str_to_bool("1")
True
>>> convert_str_to_bool("3")
True
>>> convert_str_to_bool("0")
False
>>> is_float(10)
False
>>> is_float(10.2)
True
>>> is_float(None)
False
>>> result = input_filter({"file_name": "test","vertices": 5,"max_weight": 1000,"min_weight":455,"min_edge": -45,"max_edge": -11,"sign": False,"output_format": 19, "direct": False,"self_loop": True,"multigraph":False,"number_of_files":2})
>>> result == {'output_format': 1, 'min_weight': 455, 'min_edge': 5, 'max_edge': 5, 'file_name': 'test', 'vertices': 5, 'max_weight': 1000, 'sign': False, "direct": False,"self_loop": True,"multigraph":False,"number_of_files":2}
True
>>> result = input_filter({"file_name": "test","vertices": 5,"max_weight": 1000,"min_weight":455,"min_edge": -45,"max_edge": -11,"sign": False,"output_format": 19, "direct": False,"self_loop": False,"multigraph":False,"number_of_files":2})
>>> result == {'output_format': 1, 'min_weight': 455, 'min_edge': 4, 'max_edge': 4, 'file_name': 'test', 'vertices': 5, 'max_weight': 1000, 'sign': False, "direct": False,"self_loop": False,"multigraph":False,"number_of_files":2}
True
>>> result = input_filter({"file_name": "test","vertices": -5,"max_weight": 1000,"min_weight":455,"min_edge": -45,"max_edge": -11,"sign": False,"output_format": 19, "direct": False,"self_loop": False,"multigraph":True,"number_of_files":-1})
>>> result == {'output_format': 1, 'min_weight': 455, 'min_edge': 11, 'max_edge': 45, 'file_name': 'test', 'vertices': 5, 'max_weight': 1000, 'sign': False, "direct": False,"self_loop": False,"multigraph":True,"number_of_files":1}
True
>>> result = input_filter({"file_name": "test2","vertices": 23,"max_weight": 2,"min_weight": 80,"min_edge": 23,"max_edge": 1,"sign": True,"output_format": 1, "direct": False,"self_loop": True,"multigraph":False,"number_of_files":100})
>>> result == {'min_weight': 2, 'vertices': 23, 'file_name': 'test2', 'max_edge': 23, 'min_edge': 1, 'max_weight': 80, 'output_format': 1, 'sign': True, "direct": False,"self_loop": True,"multigraph":False,"number_of_files":100}
True
>>> logger('test',100,50,1000,10,1,0,0,1,20,1,'2min')
>>> file=open('logfile.log','r')
>>> print("\n".join(file.read().splitlines()[1:-1]))
Filename : test
Vertices : 100
Total Edges : 50
Max Edge : 1000
Min Edge : 10
Directed : True
Signed : False
Multigraph : False
Self Loop : True
Weighted : True
Max Weight : 20
Min Weight : 1
Elapsed Time : 2min
>>> convert_bytes(200)
'200.0 bytes'
>>> convert_bytes(6000)
'5.9 KB'
>>> convert_bytes(80000)
'78.1 KB'
>>> time_convert(33)
'00 days, 00 hours, 00 minutes, 33 seconds'
>>> time_convert(15000)
'00 days, 04 hours, 10 minutes, 00 seconds'
>>> time_convert('sadasdasd')
Traceback (most recent call last):
...
ValueError: could not convert string to float: 'sadasdasd'
>>> line(12,"*")
************
>>> random.seed(2)
>>> sign_gen()
1
>>> random.seed(11)
>>> sign_gen()
-1
>>> used_vertices = {k:[] for k in range(1,41)}
>>> degree_dict = {k:0 for k in range(1,41)}
>>> degree_dict_sort = {k:{} for k in range(41)}
>>> degree_dict_sort[0] = {i:i for i in range(1,41)}
>>> all_vertices = list(range(1, 41))
>>> random.seed(2)
>>> branch_gen(1,10,10,1,20,True,True,True,False,used_vertices,degree_dict,degree_dict_sort)
[[4, 25, 18, 3, 30, 34, 2, 26, 14, 11], [3, 10, 20, 14, -18, -2, -15, -14, 8, 6]]
>>> random.seed(20)
>>> branch_gen(1,10,4,1,20,False,True,True,False,used_vertices,degree_dict,degree_dict_sort)
[[], []]
>>> used_vertices = {k:[] for k in range(1,41)}
>>> degree_dict = {k:0 for k in range(1,41)}
>>> degree_dict_sort = {k:{} for k in range(41)}
>>> degree_dict_sort[0] = {i:i for i in range(1,41)}
>>> branch_gen(1,10,4,1,20,False,True,True,False,used_vertices,degree_dict,degree_dict_sort)
[[10, 7, 39, 2], [9, 11, 6, 14]]
>>> branch_gen(40,1,20,1)
Traceback (most recent call last):
...
TypeError: branch_gen() missing 8 required positional arguments: 'max_weight', 'sign', 'direct', 'self_loop', 'multigraph', 'used_vertices', 'degree_dict', and 'degree_sort_dict'
>>> random.seed(2)
>>> edge_gen(20,0,400,2,10,True,True,True,False)
[{1: [3, 7], 2: [4, 17, 20, 9, 11], 3: [14, 8, 5, 12, 16, 19, 15], 4: [15, 17, 12, 8, 14, 13], 5: [16, 9, 7, 20, 19, 18, 13, 5], 6: [6, 10], 7: [18, 10, 11], 8: [], 9: [], 10: [12, 18, 8, 1, 14], 11: [9, 11], 12: [], 13: [], 14: [19, 16, 17, 20, 15], 15: [6, 1, 19], 16: [12, 13, 8, 9, 17], 17: [], 18: [9, 12, 17, 6, 20, 19, 1], 19: [13], 20: []}, {1: [184, -128], 2: [220, -278, -257, 14, -163], 3: [286, 118, 166, 261, -263, 228, -303], 4: [-82, -335, 250, -256, -338, -179], 5: [-337, -358, -395, -155, -159, 250, -350, -371], 6: [30, -302], 7: [386, -125, 216], 8: [], 9: [], 10: [127, 42, 12, 191, 80], 11: [-301, 77], 12: [], 13: [], 14: [146, -15, -282, 135, 242], 15: [-52, -65, -249], 16: [-132, -334, 343, -17, 87], 17: [], 18: [126, -37, 302, -131, -142, 77, -209], 19: [123], 20: []}, 61]
>>> random.seed(11)
>>> edge_gen(20,0,100,2,10,False,True,True,False)
[{1: [18, 15, 19, 7, 20, 11, 2, 6, 3], 2: [17], 3: [8, 4, 5, 9, 12, 10, 14, 16], 4: [20, 13, 4, 6], 5: [12, 7, 11, 10, 14], 6: [9], 7: [19], 8: [8, 18, 11, 2, 16, 17, 10], 9: [15, 12, 18], 10: [20, 14, 13, 15, 17, 16], 11: [19, 7, 20], 12: [13], 13: [2, 16, 13], 14: [18, 19, 6, 14, 17, 15], 15: [6, 7, 16], 16: [17, 20, 12, 18], 17: [19], 18: [7, 6, 9, 12, 20], 19: [19, 11, 4], 20: []}, {1: [99, 57, 75, 23, 80, 23, 57, 18, 68], 2: [50], 3: [79, 67, 7, 24, 76, 99, 41, 75], 4: [29, 63, 84, 58], 5: [70, 90, 40, 65, 3], 6: [51], 7: [37], 8: [2, 0, 26, 60, 90, 53, 72], 9: [43, 39, 1], 10: [15, 31, 1, 59, 22, 57], 11: [98, 53, 49], 12: [53], 13: [34, 2, 23], 14: [82, 12, 18, 56, 1, 37], 15: [9, 26, 1], 16: [47, 58, 75, 73], 17: [23], 18: [39, 78, 92, 20, 49], 19: [10, 6, 13], 20: []}, 74]
>>> edge_gen(0,400,2,10,1)
Traceback (most recent call last):
...
TypeError: edge_gen() missing 4 required positional arguments: 'sign', 'direct', 'self_loop', and 'multigraph'
>>> random.seed(2)
>>> dimacs_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.gr','r')
>>> print(file.read())
c FILE :testfile.gr
c No. of vertices :10
c No. of edges :7
c Max. weight :200
c Min. weight :0
c Min. edge :0
c Max. edge :2
p sp 10 7
a 4 3 -64
a 5 6 148
a 5 9 110
a 6 10 -139
a 7 7 7
a 8 2 -97
a 9 1 60
<BLANKLINE>
>>> random.seed(4)
>>> dimacs_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.gr','r')
>>> print(file.read())
c FILE :testfile2.gr
c No. of vertices :30
c No. of edges :35
c Max. weight :50
c Min. weight :0
c Min. edge :0
c Max. edge :4
p sp 30 35
a 1 10 46
a 2 18 5
a 2 4 25
a 2 22 -48
a 4 23 -17
a 5 7 -13
a 7 15 10
a 7 17 -40
a 8 8 -42
a 8 25 11
a 9 29 -5
a 10 3 -36
a 10 27 -48
a 11 13 -27
a 11 26 -27
a 11 21 14
a 11 16 -2
a 14 20 -44
a 14 14 43
a 14 12 26
a 15 28 -11
a 16 30 -40
a 16 24 20
a 19 19 7
a 20 12 -29
a 20 1 22
a 22 24 20
a 22 23 -9
a 23 18 18
a 23 27 28
a 24 6 -24
a 25 17 23
a 27 6 -50
a 28 21 28
a 28 13 -13
<BLANKLINE>
>>> random.seed(20)
>>> dimacs_maker('testfile3',10,30,100,0,4,False,True,True,False)
137
>>> file=open('testfile3.gr','r')
>>> print(file.read())
c FILE :testfile3.gr
c No. of vertices :100
c No. of edges :137
c Max. weight :30
c Min. weight :10
c Min. edge :0
c Max. edge :4
p sp 100 137
a 1 34 30
a 3 76 15
a 3 5 23
a 4 13 13
a 4 21 20
a 4 67 28
a 5 60 16
a 5 32 20
a 5 92 20
a 6 64 12
a 6 94 26
a 7 62 12
a 7 36 28
a 7 42 11
a 8 20 12
a 9 47 19
a 10 49 15
a 10 27 10
a 11 48 17
a 11 51 11
a 13 58 14
a 13 70 29
a 14 37 30
a 14 61 27
a 14 87 15
a 15 84 13
a 16 83 28
a 17 45 17
a 17 24 29
a 17 18 26
a 18 59 15
a 19 98 12
a 21 2 30
a 21 99 20
a 22 69 26
a 22 96 11
a 22 88 15
a 24 79 20
a 24 12 12
a 24 82 13
a 26 50 30
a 26 30 19
a 29 52 26
a 31 25 26
a 32 68 14
a 33 65 13
a 33 78 13
a 33 55 17
a 34 63 13
a 35 44 27
a 35 57 14
a 37 74 10
a 37 41 16
a 37 100 30
a 38 72 13
a 38 56 16
a 39 91 19
a 39 43 13
a 41 28 22
a 41 81 19
a 42 90 13
a 42 46 28
a 42 97 16
a 45 86 10
a 45 53 18
a 46 85 13
a 46 23 11
a 47 71 29
a 48 95 12
a 48 77 19
a 48 93 11
a 49 75 22
a 50 73 18
a 50 40 24
a 50 54 28
a 51 80 17
a 51 66 19
a 51 89 20
a 52 58 29
a 52 16 21
a 52 43 12
a 53 8 13
a 53 98 17
a 54 55 10
a 56 62 26
a 56 27 10
a 57 70 26
a 58 44 22
a 59 90 27
a 59 91 19
a 59 78 29
a 60 87 12
a 60 92 25
a 61 69 14
a 61 79 17
a 62 25 21
a 63 97 27
a 63 29 30
a 65 9 26
a 65 64 21
a 66 67 27
a 66 95 19
a 66 93 30
a 68 30 18
a 70 83 12
a 70 99 15
a 71 31 17
a 71 89 20
a 73 36 18
a 75 72 12
a 76 2 26
a 76 12 25
a 76 86 22
a 78 23 19
a 78 100 27
a 79 40 24
a 80 84 26
a 80 80 14
a 81 20 16
a 82 15 16
a 82 88 22
a 83 19 19
a 84 85 13
a 84 28 16
a 85 77 16
a 85 94 23
a 86 1 21
a 87 74 15
a 87 96 19
a 90 93 22
a 92 49 14
a 95 98 26
a 95 55 11
a 97 38 28
a 99 19 29
a 99 89 24
a 100 40 11
<BLANKLINE>
>>> dimacs_maker('testfile', 0, 200, 10, 0,0,True)
Traceback (most recent call last):
...
TypeError: dimacs_maker() missing 3 required positional arguments: 'direct', 'self_loop', and 'multigraph'
>>> random.seed(2)
>>> json_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.json','r')
>>> testfile_1=json.load(file)
>>> testfile_1['graph']['nodes'][1]
{'id': 2}
>>> testfile_1['graph']['edges'][1]['source']
5
>>> testfile_1['graph']['edges'][1]['target']
6
>>> testfile_1['graph']['edges'][1]['weight']
148
>>> json_to_yaml('testfile')
>>> file=open('testfile.yaml','r')
>>> testfile_1_yaml=yaml.load(file)
>>> testfile_1_yaml['graph']['edges'][1]['source']
5
>>> testfile_1_yaml['graph']['edges'][1]['target']
6
>>> testfile_1_yaml['graph']['edges'][1]['weight']
148
>>> json_to_pickle('testfile')
>>> testfile_1_p=pickle.load( open( 'testfile.p', 'rb' ) )
>>> testfile_1_p['graph']['edges'][1]['source']
5
>>> testfile_1_p['graph']['edges'][1]['target']
6
>>> testfile_1_p['graph']['edges'][1]['weight']
148
>>> random.seed(4)
>>> json_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.json','r')
>>> testfile_2=json.load(file)
>>> testfile_2['graph']['nodes'][1]
{'id': 2}
>>> testfile_2['graph']['edges'][1]['source']
2
>>> testfile_2['graph']['edges'][1]['target']
18
>>> testfile_2['graph']['edges'][1]['weight']
5
>>> json_to_yaml('testfile2')
>>> file=open('testfile2.yaml','r')
>>> testfile_2_yaml=yaml.load(file)
>>> testfile_2_yaml['graph']['nodes'][1]
{'id': 2}
>>> testfile_2_yaml['graph']['edges'][1]['source']
2
>>> testfile_2_yaml['graph']['edges'][1]['target']
18
>>> testfile_2_yaml['graph']['edges'][1]['weight']
5
>>> json_to_pickle('testfile2')
>>> testfile_2_p=pickle.load( open( 'testfile2.p', 'rb' ) )
>>> testfile_2_p['graph']['edges'][1]['source']
2
>>> testfile_2_p['graph']['edges'][1]['target']
18
>>> testfile_2_p['graph']['edges'][1]['weight']
5
>>> random.seed(20)
>>> json_maker('testfile3',10,30,100,0,4,False,True,True,False)
137
>>> file=open('testfile3.json','r')
>>> testfile_3=json.load(file)
>>> testfile_3['graph']['nodes'][1]
{'id': 2}
>>> testfile_3['graph']['edges'][1]['source']
3
>>> testfile_3['graph']['edges'][1]['target']
76
>>> testfile_3['graph']['edges'][1]['weight']
15
>>> json_to_yaml('testfile3')
>>> file=open('testfile3.yaml','r')
>>> testfile_3_yaml=yaml.load(file)
>>> testfile_3_yaml['graph']['nodes'][1]
{'id': 2}
>>> testfile_3_yaml['graph']['edges'][1]['source']
3
>>> testfile_3_yaml['graph']['edges'][1]['target']
76
>>> testfile_3_yaml['graph']['edges'][1]['weight']
15
>>> json_to_yaml('testfile24')
[Error] Bad Input File!
>>> json_to_pickle('testfile24')
[Error] Bad Input File!
>>> json_maker('testfile', 0, 200, 10, 0, 0,True)
Traceback (most recent call last):
...
TypeError: json_maker() missing 3 required positional arguments: 'direct', 'self_loop', and 'multigraph'
>>> json_to_pickle('testfile3')
>>> testfile_3_p=pickle.load( open( 'testfile3.p', 'rb' ) )
>>> testfile_3_p['graph']['edges'][1]['source']
3
>>> testfile_3_p['graph']['edges'][1]['target']
76
>>> testfile_3_p['graph']['edges'][1]['weight']
15
>>> random.seed(2)
>>> csv_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> random.seed(2)
>>> gml_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.gml','r')
>>> print(file.read())
graph
[
multigraph 0
directed 1
node
[
id 1
label "Node 1"
]
node
[
id 2
label "Node 2"
]
node
[
id 3
label "Node 3"
]
node
[
id 4
label "Node 4"
]
node
[
id 5
label "Node 5"
]
node
[
id 6
label "Node 6"
]
node
[
id 7
label "Node 7"
]
node
[
id 8
label "Node 8"
]
node
[
id 9
label "Node 9"
]
node
[
id 10
label "Node 10"
]
edge
[
source 4
target 3
value -64
]
edge
[
source 5
target 6
value 148
]
edge
[
source 5
target 9
value 110
]
edge
[
source 6
target 10
value -139
]
edge
[
source 7
target 7
value 7
]
edge
[
source 8
target 2
value -97
]
edge
[
source 9
target 1
value 60
]
]
>>> random.seed(2)
>>> gexf_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.gexf', 'r')
>>> random.seed(2)
>>> mtx_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> random.seed(2)
>>> tsv_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.mtx','r')
>>> print(file.read())
%%MatrixMarket matrix coordinate real general
10 10 7
4 3 -64
5 6 148
5 9 110
6 10 -139
7 7 7
8 2 -97
9 1 60
<BLANKLINE>
>>> random.seed(2)
>>> gdf_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.gdf','r')
>>> print(file.read())
nodedef>name VARCHAR,label VARCHAR
1,Node1
2,Node2
3,Node3
4,Node4
5,Node5
6,Node6
7,Node7
8,Node8
9,Node9
10,Node10
edgedef>node1 VARCHAR,node2 VARCHAR,weight DOUBLE
4,3,-64
5,6,148
5,9,110
6,10,-139
7,7,7
8,2,-97
9,1,60
<BLANKLINE>
>>> random.seed(2)
>>> gl_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.gl','r')
>>> print(file.read())
4 3:-64
5 6:148 9:110
6 10:-139
7 7:7
8 2:-97
9 1:60
<BLANKLINE>
>>> file=open('testfile.csv','r')
>>> print(file.read())
4,3,-64
5,6,148
5,9,110
6,10,-139
7,7,7
8,2,-97
9,1,60
<BLANKLINE>
>>> random.seed(4)
>>> csv_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.csv','r')
>>> print(file.read())
1,10,46
2,18,5
2,4,25
2,22,-48
4,23,-17
5,7,-13
7,15,10
7,17,-40
8,8,-42
8,25,11
9,29,-5
10,3,-36
10,27,-48
11,13,-27
11,26,-27
11,21,14
11,16,-2
14,20,-44
14,14,43
14,12,26
15,28,-11
16,30,-40
16,24,20
19,19,7
20,12,-29
20,1,22
22,24,20
22,23,-9
23,18,18
23,27,28
24,6,-24
25,17,23
27,6,-50
28,21,28
28,13,-13
<BLANKLINE>
>>> random.seed(4)
>>> csv_maker('testfile4',0,50.2,30,0,4,True,True,True,False)
41
>>> file=open('testfile4.csv','r')
>>> print(file.read())
1,10,36.2
2,6,3.3
2,16,-40.2
2,29,11.1
3,17,-39.1
3,7,-10.8
3,3,-40.2
4,12,-14.5
5,9,-33.7
5,28,8.9
6,21,47.4
6,27,-0.4
6,15,-42.6
7,20,-30.1
8,23,11.7
8,18,4.1
8,25,-26.0
9,24,50.1
9,13,20.7
9,14,-13.9
10,26,-31.8
10,19,-5.1
12,22,6.1
13,30,-1.3
14,11,-36.9
14,22,16.2
15,16,-43.2
15,11,-31.0
16,19,12.6
17,21,18.2
18,18,-39.3
18,25,-28.7
19,23,-46.0
24,20,27.4
25,4,-50.1
25,1,-38.8
26,27,-10.1
26,30,-24.7
26,29,-12.5
27,28,-9.4
29,20,26.4
<BLANKLINE>
>>> random.seed(20)
>>> csv_maker('testfile3',10,30,100,0,4,False,True,True,False)
137
>>> file=open('testfile3.csv','r')
>>> print(file.read())
1,34,30
3,76,15
3,5,23
4,13,13
4,21,20
4,67,28
5,60,16
5,32,20
5,92,20
6,64,12
6,94,26
7,62,12
7,36,28
7,42,11
8,20,12
9,47,19
10,49,15
10,27,10
11,48,17
11,51,11
13,58,14
13,70,29
14,37,30
14,61,27
14,87,15
15,84,13
16,83,28
17,45,17
17,24,29
17,18,26
18,59,15
19,98,12
21,2,30
21,99,20
22,69,26
22,96,11
22,88,15
24,79,20
24,12,12
24,82,13
26,50,30
26,30,19
29,52,26
31,25,26
32,68,14
33,65,13
33,78,13
33,55,17
34,63,13
35,44,27
35,57,14
37,74,10
37,41,16
37,100,30
38,72,13
38,56,16
39,91,19
39,43,13
41,28,22
41,81,19
42,90,13
42,46,28
42,97,16
45,86,10
45,53,18
46,85,13
46,23,11
47,71,29
48,95,12
48,77,19
48,93,11
49,75,22
50,73,18
50,40,24
50,54,28
51,80,17
51,66,19
51,89,20
52,58,29
52,16,21
52,43,12
53,8,13
53,98,17
54,55,10
56,62,26
56,27,10
57,70,26
58,44,22
59,90,27
59,91,19
59,78,29
60,87,12
60,92,25
61,69,14
61,79,17
62,25,21
63,97,27
63,29,30
65,9,26
65,64,21
66,67,27
66,95,19
66,93,30
68,30,18
70,83,12
70,99,15
71,31,17
71,89,20
73,36,18
75,72,12
76,2,26
76,12,25
76,86,22
78,23,19
78,100,27
79,40,24
80,84,26
80,80,14
81,20,16
82,15,16
82,88,22
83,19,19
84,85,13
84,28,16
85,77,16
85,94,23
86,1,21
87,74,15
87,96,19
90,93,22
92,49,14
95,98,26
95,55,11
97,38,28
99,19,29
99,89,24
100,40,11
<BLANKLINE>
>>> csv_maker('testfile', 0, 200, 10, 0,0,True)
Traceback (most recent call last):
...
TypeError: csv_maker() missing 3 required positional arguments: 'direct', 'self_loop', and 'multigraph'
>>> random.seed(2)
>>> wel_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.wel','r')
>>> print(file.read())
4 3 -64
5 6 148
5 9 110
6 10 -139
7 7 7
8 2 -97
9 1 60
<BLANKLINE>
>>> random.seed(4)
>>> wel_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.wel','r')
>>> print(file.read())
1 10 46
2 18 5
2 4 25
2 22 -48
4 23 -17
5 7 -13
7 15 10
7 17 -40
8 8 -42
8 25 11
9 29 -5
10 3 -36
10 27 -48
11 13 -27
11 26 -27
11 21 14
11 16 -2
14 20 -44
14 14 43
14 12 26
15 28 -11
16 30 -40
16 24 20
19 19 7
20 12 -29
20 1 22
22 24 20
22 23 -9
23 18 18
23 27 28
24 6 -24
25 17 23
27 6 -50
28 21 28
28 13 -13
<BLANKLINE>
>>> random.seed(20)
>>> wel_maker('testfile3',10,30,100,0,4,False,True,True,False)
137
>>> file=open('testfile3.wel','r')
>>> print(file.read())
1 34 30
3 76 15
3 5 23
4 13 13
4 21 20
4 67 28
5 60 16
5 32 20
5 92 20
6 64 12
6 94 26
7 62 12
7 36 28
7 42 11
8 20 12
9 47 19
10 49 15
10 27 10
11 48 17
11 51 11
13 58 14
13 70 29
14 37 30
14 61 27
14 87 15
15 84 13
16 83 28
17 45 17
17 24 29
17 18 26
18 59 15
19 98 12
21 2 30
21 99 20
22 69 26
22 96 11
22 88 15
24 79 20
24 12 12
24 82 13
26 50 30
26 30 19
29 52 26
31 25 26
32 68 14
33 65 13
33 78 13
33 55 17
34 63 13
35 44 27
35 57 14
37 74 10
37 41 16
37 100 30
38 72 13
38 56 16
39 91 19
39 43 13
41 28 22
41 81 19
42 90 13
42 46 28
42 97 16
45 86 10
45 53 18
46 85 13
46 23 11
47 71 29
48 95 12
48 77 19
48 93 11
49 75 22
50 73 18
50 40 24
50 54 28
51 80 17
51 66 19
51 89 20
52 58 29
52 16 21
52 43 12
53 8 13
53 98 17
54 55 10
56 62 26
56 27 10
57 70 26
58 44 22
59 90 27
59 91 19
59 78 29
60 87 12
60 92 25
61 69 14
61 79 17
62 25 21
63 97 27
63 29 30
65 9 26
65 64 21
66 67 27
66 95 19
66 93 30
68 30 18
70 83 12
70 99 15
71 31 17
71 89 20
73 36 18
75 72 12
76 2 26
76 12 25
76 86 22
78 23 19
78 100 27
79 40 24
80 84 26
80 80 14
81 20 16
82 15 16
82 88 22
83 19 19
84 85 13
84 28 16
85 77 16
85 94 23
86 1 21
87 74 15
87 96 19
90 93 22
92 49 14
95 98 26
95 55 11
97 38 28
99 19 29
99 89 24
100 40 11
<BLANKLINE>
>>> wel_maker('testfile', 0, 200, 10, 0,0,True)
Traceback (most recent call last):
...
TypeError: wel_maker() missing 3 required positional arguments: 'direct', 'self_loop', and 'multigraph'
>>> random.seed(2)
>>> lp_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.lp','r')
>>> print(file.read())
node(1).
node(2).
node(3).
node(4).
node(5).
node(6).
node(7).
node(8).
node(9).
node(10).
edge(4,3,-64).
edge(5,6,148).
edge(5,9,110).
edge(6,10,-139).
edge(7,7,7).
edge(8,2,-97).
edge(9,1,60).
<BLANKLINE>
>>> random.seed(4)
>>> lp_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.lp','r')
>>> print(file.read())
node(1).
node(2).
node(3).
node(4).
node(5).
node(6).
node(7).
node(8).
node(9).
node(10).
node(11).
node(12).
node(13).
node(14).
node(15).
node(16).
node(17).
node(18).
node(19).
node(20).
node(21).
node(22).
node(23).
node(24).
node(25).
node(26).
node(27).
node(28).
node(29).
node(30).
edge(1,10,46).
edge(2,18,5).
edge(2,4,25).
edge(2,22,-48).
edge(4,23,-17).
edge(5,7,-13).
edge(7,15,10).
edge(7,17,-40).
edge(8,8,-42).
edge(8,25,11).
edge(9,29,-5).
edge(10,3,-36).
edge(10,27,-48).
edge(11,13,-27).
edge(11,26,-27).
edge(11,21,14).
edge(11,16,-2).
edge(14,20,-44).
edge(14,14,43).
edge(14,12,26).
edge(15,28,-11).
edge(16,30,-40).
edge(16,24,20).
edge(19,19,7).
edge(20,12,-29).
edge(20,1,22).
edge(22,24,20).
edge(22,23,-9).
edge(23,18,18).
edge(23,27,28).
edge(24,6,-24).
edge(25,17,23).
edge(27,6,-50).
edge(28,21,28).
edge(28,13,-13).
<BLANKLINE>
>>> input_dic=get_input(input_func=lambda x: str(len(x)))
>>> input_dic['sign']
True
>>> input_dic['vertices']
20
>>> input_dic['min_edge']
20
>>> input_dic['min_weight']
15
>>> input_dic['output_format']
1
>>> input_dic['max_weight']
15
>>> input_dic['file_name']
'14'
>>> input_dic['max_edge']
20
>>> random.seed(2)
>>> tgf_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.tgf','r')
>>> print(file.read())
1
2
3
4
5
6
7
8
9
10
#
4 3 -64
5 6 148
5 9 110
6 10 -139
7 7 7
8 2 -97
9 1 60
<BLANKLINE>
>>> random.seed(4)
>>> tgf_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.tgf','r')
>>> print(file.read())
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
#
1 10 46
2 18 5
2 4 25
2 22 -48
4 23 -17
5 7 -13
7 15 10
7 17 -40
8 8 -42
8 25 11
9 29 -5
10 3 -36
10 27 -48
11 13 -27
11 26 -27
11 21 14
11 16 -2
14 20 -44
14 14 43
14 12 26
15 28 -11
16 30 -40
16 24 20
19 19 7
20 12 -29
20 1 22
22 24 20
22 23 -9
23 18 18
23 27 28
24 6 -24
25 17 23
27 6 -50
28 21 28
28 13 -13
<BLANKLINE>
>>> random.seed(2)
>>> dl_maker('testfile', 0, 200, 10, 0, 2, True,True,True,False)
7
>>> file=open('testfile.dl','r')
>>> print(file.read())
dl
format=edgelist1
n=10
data:
4 3 -64
5 6 148
5 9 110
6 10 -139
7 7 7
8 2 -97
9 1 60
<BLANKLINE>
>>> random.seed(4)
>>> dl_maker('testfile2',0,50,30,0,4,True,True,True,False)
35
>>> file=open('testfile2.dl','r')
>>> print(file.read())
dl
format=edgelist1
n=30
data:
1 10 46
2 18 5
2 4 25
2 22 -48
4 23 -17
5 7 -13
7 15 10
7 17 -40
8 8 -42
8 25 11
9 29 -5
10 3 -36
10 27 -48
11 13 -27
11 26 -27
11 21 14
11 16 -2
14 20 -44
14 14 43
14 12 26
15 28 -11
16 30 -40
16 24 20
19 19 7
20 12 -29
20 1 22
22 24 20
22 23 -9
23 18 18
23 27 28
24 6 -24
25 17 23
27 6 -50
28 21 28
28 13 -13
<BLANKLINE>
>>> file.close()
>>> os.remove('testfile.csv')
>>> os.remove('testfile.gml')
>>> os.remove('testfile.gexf')
>>> os.remove('testfile.tsv')
>>> os.remove('testfile.dl')
>>> os.remove('testfile.gr')
>>> os.remove('testfile.json')
>>> os.remove('testfile.lp')
>>> os.remove('testfile.p')
>>> os.remove('testfile.tgf')
>>> os.remove('testfile.wel')
>>> os.remove('testfile.yaml')
>>> os.remove('testfile.mtx')
>>> os.remove('testfile.gdf')
>>> os.remove('testfile.gl')
>>> os.remove('testfile2.csv')
>>> os.remove('testfile2.dl')
>>> os.remove('testfile2.gr')
>>> os.remove('testfile2.json')
>>> os.remove('testfile2.lp')
>>> os.remove('testfile2.p')
>>> os.remove('testfile2.tgf')
>>> os.remove('testfile2.wel')
>>> os.remove('testfile2.yaml')
>>> os.remove('testfile3.csv')
>>> os.remove('testfile4.csv')
>>> os.remove('testfile3.gr')
>>> os.remove('testfile3.json')
>>> os.remove('testfile3.p')
>>> os.remove('testfile3.wel')
>>> os.remove('testfile3.yaml')
>>> os.remove('logfile.log')
"""
class Kilobyte:
    def __init__(self, value_kilobytes: int):
        self._value_kilobytes = value_kilobytes
        self.one_kilobyte_in_bits = 8000
        self._value_bits = self._convert_into_bits(value_kilobytes)
        self.id = "KB"

    def _convert_into_bits(self, value_kilobytes: int) -> int:
        return value_kilobytes * self.one_kilobyte_in_bits

    def convert_from_bits_to_kilobytes(self, bits: int) -> float:
        return bits / self.one_kilobyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_kilobytes(self) -> int:
        return self._value_kilobytes


class Megabyte:
    def __init__(self, value_megabytes: int):
        self._value_megabytes = value_megabytes
        self.one_megabyte_in_bits = 8e+6
        self._value_bits = self._convert_into_bits(value_megabytes)
        self.id = "MB"

    def _convert_into_bits(self, value_megabytes: int) -> int:
        return value_megabytes * self.one_megabyte_in_bits

    def convert_from_bits_to_megabytes(self, bits: int) -> float:
        return bits / self.one_megabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_megabytes(self) -> int:
        return self._value_megabytes


class Gigabyte:
    def __init__(self, value_gigabytes: int):
        self._value_gigabytes = value_gigabytes
        self.one_gigabyte_in_bits = 8e+9
        self._value_bits = self._convert_into_bits(value_gigabytes)
        self.id = "GB"

    def _convert_into_bits(self, value_gigabytes: int) -> int:
        return value_gigabytes * self.one_gigabyte_in_bits

    def convert_from_bits_to_gigabytes(self, bits: int) -> float:
        return bits / self.one_gigabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_gigabytes(self) -> int:
        return self._value_gigabytes


class Terabyte:
    def __init__(self, value_terabytes: int):
        self._value_terabytes = value_terabytes
        self.one_terabyte_in_bits = 8e+12
        self._value_bits = self._convert_into_bits(value_terabytes)
        self.id = "TB"

    def _convert_into_bits(self, value_terabytes: int) -> int:
        return value_terabytes * self.one_terabyte_in_bits

    def convert_from_bits_to_terabytes(self, bits: int) -> float:
        return bits / self.one_terabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_terabytes(self) -> int:
        return self._value_terabytes


class Petabyte:
    def __init__(self, value_petabytes: int):
        self._value_petabytes = value_petabytes
        self.one_petabyte_in_bits = 8e+15
        self._value_bits = self._convert_into_bits(value_petabytes)
        self.id = "PB"

    def _convert_into_bits(self, value_petabytes: int) -> int:
        return value_petabytes * self.one_petabyte_in_bits

    def convert_from_bits_to_petabytes(self, bits: int) -> float:
        return bits / self.one_petabyte_in_bits

    def get_val_in_bits(self) -> int:
        # fixed: previously returned the nonexistent attribute _value_bytes
        return self._value_bits

    def get_val_in_petabytes(self) -> int:
        return self._value_petabytes


class Exabyte:
    def __init__(self, value_exabytes: int):
        self._value_exabytes = value_exabytes
        self.one_exabyte_in_bits = 8e+18
        self._value_bits = self._convert_into_bits(value_exabytes)
        self.id = "EB"

    def _convert_into_bits(self, value_exabytes: int) -> int:
        return value_exabytes * self.one_exabyte_in_bits

    def convert_from_bits_to_exabytes(self, bits: int) -> float:
        return bits / self.one_exabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_exabytes(self) -> int:
        return self._value_exabytes


class Zettabyte:
    def __init__(self, value_zettabytes: int):
        self._value_zettabytes = value_zettabytes
        self.one_zettabyte_in_bits = 8e+21
        self._value_bits = self._convert_into_bits(value_zettabytes)
        self.id = "ZB"

    def _convert_into_bits(self, value_zettabytes: int) -> int:
        return value_zettabytes * self.one_zettabyte_in_bits

    def convert_from_bits_to_zettabytes(self, bits: int) -> float:
        return bits / self.one_zettabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_zettabyte(self) -> int:
        return self._value_zettabytes


class Yottabyte:
    def __init__(self, value_yottabytes: int):
        self._value_yottabytes = value_yottabytes
        self.one_yottabyte_in_bits = 8e+24
        self._value_bits = self._convert_into_bits(value_yottabytes)
        self.id = "YB"

    def _convert_into_bits(self, value_yottabytes: int) -> int:
        return value_yottabytes * self.one_yottabyte_in_bits

    def convert_from_bits_to_yottabytes(self, bits: int) -> float:
        return bits / self.one_yottabyte_in_bits

    def get_val_in_bits(self) -> int:
        return self._value_bits

    def get_val_in_yottabyte(self) -> int:
        return self._value_yottabytes
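The eight classes above all repeat one convert-to-bits pattern, differing only in the factor. As a minimal standalone sketch (the table and function names here are illustrative, not part of the module), the same arithmetic can be driven by a single factor table:

```python
# Bits per decimal unit, mirroring the per-class constants above.
BITS_PER_UNIT = {
    "KB": 8_000,
    "MB": 8e+6,
    "GB": 8e+9,
    "TB": 8e+12,
}

def to_bits(value, unit):
    # value expressed in `unit` -> number of bits
    return value * BITS_PER_UNIT[unit]

def from_bits(bits, unit):
    # number of bits -> value expressed in `unit`
    return bits / BITS_PER_UNIT[unit]

print(to_bits(2, "KB"))      # 16000
print(from_bits(8e+9, "GB"))  # 1.0
```

A table-driven helper like this would also make it harder for one class to drift out of sync (as the Petabyte getter above had).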
class Credential:
    """
    Class that generates new instances of Credential.
    """

    Credential_list = []  # Empty credential list

    def __init__(self, Account, user_name, password):
        # docstring removed for simplicity
        self.Account = Account
        self.user_name = user_name
        self.password = password

    def save_Credential(self):
        '''
        save_Credential method saves Credential objects into Credential_list.
        '''
        Credential.Credential_list.append(self)

    def delete_Credential(self):
        '''
        delete_Credential method deletes a saved Credential from Credential_list.
        '''
        Credential.Credential_list.remove(self)

    @classmethod
    def find_by_Account(cls, Account):
        '''
        Method that takes in an account name and returns the credential
        that matches that account.

        Args:
            Account: account to search for
        Returns:
            The credential whose Account matches, or None if no match exists.
        '''
        for credential in cls.Credential_list:
            if credential.Account == Account:
                return credential

    @classmethod
    def Credential_exist(cls, Account):
        '''
        Method that checks if an account exists in the credential list.

        Args:
            Account: account to search for
        Returns:
            Boolean: True or False depending on whether the account exists.
        '''
        for credential in cls.Credential_list:
            if credential.Account == Account:
                return True
        return False

    @classmethod
    def display_Credentials(cls):
        '''
        Method that returns the credential list.
        '''
        return cls.Credential_list
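As a usage illustration of the class-level registry pattern used above, here is a standalone miniature (a re-implementation for demonstration only, not the real class):

```python
# Minimal sketch: instances register themselves in a class-level list,
# and classmethods query that shared list.
class Cred:
    registry = []

    def __init__(self, account, user, password):
        self.account, self.user, self.password = account, user, password

    def save(self):
        Cred.registry.append(self)

    @classmethod
    def exists(cls, account):
        return any(c.account == account for c in cls.registry)

Cred("twitter", "alice", "pw1").save()
print(Cred.exists("twitter"))  # True
print(Cred.exists("github"))   # False
```

Note that, as in the original, the list lives on the class, so it is shared by every instance for the lifetime of the process.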
# This file is MACHINE GENERATED! Do not edit.
# Generated by: tensorflow/python/tools/api/generator/create_python_api.py script.
"""Conversion of eager-style Python into TensorFlow graph code.
NOTE: In TensorFlow 2.0, AutoGraph is automatically applied when using
`tf.function`. This module contains lower-level APIs for advanced use.
AutoGraph transforms a subset of Python which operates on TensorFlow objects
into equivalent TensorFlow graph code. When executing the graph, it has the same
effect as if you ran the original code in eager mode.
Python code which doesn't operate on TensorFlow objects remains functionally
unchanged, but keep in mind that `tf.function` only executes such code at trace
time, and generally will not be consistent with eager execution.
For more information, see the
[AutoGraph reference documentation](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/g3doc/reference/index.md),
and the [tf.function guide](https://www.tensorflow.org/guide/function#autograph_transformations).
"""
from __future__ import print_function as _print_function
import sys as _sys
from . import experimental
from tensorflow.python.autograph.impl.api import to_code_v1 as to_code
from tensorflow.python.autograph.impl.api import to_graph_v1 as to_graph
from tensorflow.python.autograph.utils.ag_logging import set_verbosity
from tensorflow.python.autograph.utils.ag_logging import trace
del _print_function
] | 3 | 2020-08-26T10:08:20.000Z | 2021-11-13T11:42:23.000Z | from pyridge.generic.scaler import Scaler
import numpy as np
class LogScaler(Scaler):
"""
Scaler for that transform the values in a logaritmic
scaler.
"""
def __init__(self):
self.min_: np.float
def get_params(self):
return {'min_': self.min_}
def fit(self, values):
self.min_ = np.min(values, axis=0)
def transform(self, values):
return np.log(values + (1.0 - self.min_))
def fit_transform(self, values):
self.fit(values)
return self.transform(values)
def inverse_transform(self, values):
return np.exp(values) - (1.0 - self.min_)
| 22.678571 | 56 | 0.623622 | 86 | 635 | 4.453488 | 0.383721 | 0.091384 | 0.148825 | 0.067885 | 0.219321 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010638 | 0.259843 | 635 | 27 | 57 | 23.518519 | 0.804255 | 0.094488 | 0 | 0 | 0 | 0 | 0.007207 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0.125 | 0.1875 | 0.8125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
from ..conversion_context import *
from torch2trt.module_test import add_module_test
from .binary import convert_binary_elementwise


@tensorrt_converter('torch.div')
@tensorrt_converter('torch.Tensor.__div__')  # py2
@tensorrt_converter('torch.Tensor.__idiv__')  # py2
@tensorrt_converter('torch.Tensor.__truediv__')  # py3
@tensorrt_converter('torch.Tensor.__itruediv__')  # py3
def convert_div(ctx):
    convert_binary_elementwise(ctx, trt.ElementWiseOperation.DIV)


@tensorrt_converter('torch.Tensor.__rdiv__')  # py2
@tensorrt_converter('torch.Tensor.__rtruediv__')  # py3
def convert_rdiv(ctx):
    convert_binary_elementwise(ctx, trt.ElementWiseOperation.DIV, flip=True)


@tensorrt_converter('torch.Tensor.__floordiv__')
def convert_floordiv(ctx):
    convert_binary_elementwise(ctx, trt.ElementWiseOperation.FLOOR_DIV)


@tensorrt_converter('torch.Tensor.__rfloordiv__')
def convert_rfloordiv(ctx):
    convert_binary_elementwise(ctx, trt.ElementWiseOperation.FLOOR_DIV, flip=True)


class Div(torch.nn.Module):
    def __init__(self):
        super(Div, self).__init__()

    def forward(self, x, y):
        return x / y


@add_module_test(torch.float32, torch.device('cuda'), [(1, 3, 224, 224), (1, 3, 224, 224)])
def test_div_basic():
    return Div()


class IDiv(torch.nn.Module):
    def __init__(self):
        super(IDiv, self).__init__()

    def forward(self, x, y):
        x /= y
        return x


@add_module_test(torch.float32, torch.device('cuda'), [(1, 3, 224, 224), (1, 3, 224, 224)])
def test_div_idiv():
    return IDiv()


class TorchDiv(torch.nn.Module):
    def __init__(self):
        super(TorchDiv, self).__init__()

    def forward(self, x, y):
        return torch.div(x, y)


@add_module_test(torch.float32, torch.device('cuda'), [(1, 3, 224, 224), (1, 3, 224, 224)])
def test_div_torchdiv():
    return TorchDiv()


class RDivInt(torch.nn.Module):
    def __init__(self):
        super(RDivInt, self).__init__()

    def forward(self, x):
        return 100 / x


@add_module_test(torch.float32, torch.device('cuda'), [(1, 3, 3, 3)])
def test_rdiv_int():
    return RDivInt()


class RDivFloat(torch.nn.Module):
    def __init__(self):
        super(RDivFloat, self).__init__()

    def forward(self, x):
        return 100.0 / x


@add_module_test(torch.float32, torch.device('cuda'), [(1, 3, 3, 3)])
def test_rdiv_float():
    return RDivFloat()
] | 17 | 2015-12-20T05:19:03.000Z | 2021-09-29T04:12:38.000Z | #!/usr/bin/python
from Tkinter import *
import tkMessageBox,Tkinter,comm
import util
# The main window which shows all the hotels with that particular facility
class book(util.window):
# Name of the thing is required like room, banquet etc
def __init__(self, server, name):
self.server = server
self.name = name
if not server.testConnection():
tkMessageBox.showerror("Connection Failed", "Connection to the server failed please try again later.")
self.root.destroy()
udata = server.req('get_user_details')
hotels = server.req('get_all_hotels')
# create window and its widgets
util.window.__init__(self,"SVS HOTELS | Computerized Reservation System | Make a reservation")
canvas = Canvas(width = 600, height = 150, bg = 'white')
canvas.pack(expand = YES, fill = BOTH)
gif1 = PhotoImage(file = './data/3.gif')
canvas.create_image(0, 0, image = gif1, anchor = NW)
user_frame = Frame(self.root, height = 40, width = 580, relief = RAISED, bd = 1)
user_frame.place(x=10,y=165)
logout_button = Button(user_frame, text ="back", command=(lambda: self.root.destroy()))
logout_button.place(x=330+180, y = 2)
x,y = 10,6
label_info = Label(user_frame, text="Pick a hotel to reserve " + name)
label_info.place(x=x, y = y)
main_frame = Frame(self.root, height = 260, width = 580, relief = RAISED, bd = 0)
main_frame.place(x=10,y=215)
def data():
j = 0
for i in hotels:
if len(i[name]) > 0:
subFrame = Frame(frame, height = 40, width = 550, relief = RAISED, bd = 1)
subFrame.grid(row=j,column=0)
Label(subFrame,text=str(j+1)+".").place(x=5,y=7)
Label(subFrame,text=i['name']).place(x=30,y=7)
Button(subFrame, text ="Choose", command=book2(server,i,name,self).show).place(x=470,y=4)
j+=1
def myfunction(event):
canvas.configure(scrollregion=canvas.bbox("all"),width=557,height=260)
# Creates a scrollable frame so that many results can be displayed there
canvas=Canvas(main_frame)
frame=Frame(canvas)
myscrollbar=Scrollbar(main_frame,orient="vertical",command=canvas.yview)
canvas.configure(yscrollcommand=myscrollbar.set)
myscrollbar.pack(side="right",fill="y")
canvas.pack(side="left")
canvas.create_window((0,0),window=frame,anchor='nw')
frame.bind("<Configure>",myfunction)
data()
self.root.mainloop()
# This window shows variations of the same type in a particular hotel
class book2(util.window):
    def __init__(self, server, hotel, name, main):
        self.server = server
        self.hotel = hotel
        self.name = name
        self.main = main

    def show(self):
        dat = self.hotel
        name = self.name
        server = self.server
        # create window and its widgets
        util.window.__init__(self, "SVS HOTELS | " + dat['name'] + " | " + name)
        root = self.root
        user_frame = Frame(root, height=40, width=580, relief=RAISED, bd=1)
        user_frame.place(x=10, y=10)
        logout_button = Button(user_frame, text="back", command=(lambda: root.destroy()))
        logout_button.place(x=330 + 180, y=2)
        x, y = 10, 6
        label_info = Label(user_frame, text="Book " + name + " in the hotel: " + dat['name'])
        label_info.place(x=x, y=y)
        main_frameH = Frame(root, height=400, width=580, relief=RAISED, bd=0)
        main_frameH.place(x=10, y=60)

        def components():
            j = 0
            subFrame = Frame(Hframe, height=40, width=550, relief=RAISED, bd=0)
            subFrame.grid(row=j, column=0)
            Label(subFrame, text='Following are available for booking:', font=("helvetica", 19)).place(x=5, y=7)
            j += 1
            for i in dat[name]:
                subFrame = Frame(Hframe, height=40, width=550, relief=RAISED, bd=0)
                subFrame.grid(row=j, column=0)
                Label(subFrame, text=i['name'], font=("helvetica", 16)).place(x=30, y=7)
                Button(subFrame, text="Choose", command=book3(server, dat, name, i, self.main, self).show).place(x=470, y=4)
                Label(Hframe, text="\t" + i['description'], wraplength=540, justify=LEFT).grid(row=j + 1, column=0)
                j += 2

        def myfunctionH(event):
            Hcanvas.configure(scrollregion=Hcanvas.bbox("all"), width=557, height=400)

        Hcanvas = Canvas(main_frameH)
        Hframe = Frame(Hcanvas)
        myscrollbarH = Scrollbar(main_frameH, orient="vertical", command=Hcanvas.yview)
        Hcanvas.configure(yscrollcommand=myscrollbarH.set)
        myscrollbarH.pack(side="right", fill="y")
        Hcanvas.pack(side="left")
        Hcanvas.create_window((0, 0), window=Hframe, anchor='nw')
        Hframe.bind("<Configure>", myfunctionH)
        components()
        root.mainloop()
# Creates an object for a button to work with; dispatches to the right window
class book3(util.window):
    def __init__(self, server, hotel, name, dat, main, main2):
        self.server = server
        self.hotel = hotel
        self.dat = dat
        self.name = name
        self.main = main
        self.main2 = main2

    def show(self):
        if self.name == 'restaurants':
            bookrest(self.server, self.hotel, self.dat, self.main, self.main2).show()
        else:
            bookRoomsBanqMeeting(self.server, self.hotel, self.dat, self.main, self.main2, self.name).show()
# This window is used to book Rooms - Banquet - Meeting rooms
class bookRoomsBanqMeeting(util.window):
    def __init__(self, server, hotel, dat, main, main2, name):
        self.server = server
        self.hotel = hotel
        self.dat = dat
        self.name = name
        self.main = main
        self.main2 = main2

    def show(self):
        dat = self.hotel
        server = self.server
        name = self.name
        thing = self.dat
        util.window.__init__(self, "SVS HOTELS | " + dat['name'] + " | " + name)
        root = self.root

        def check():
            if not server.testConnection():
                tkMessageBox.showerror("Connection Failed", "Connection to the server failed please try again later.", parent=root)
                self.main.root.destroy()
            fine = True

            # Various checks to verify the user inputs
            def greaterDate(one, two):
                # Strictly-greater comparison of [day, month, year(, h, m, s)]
                # lists; missing time fields default to zero.
                def key(d):
                    return (int(d[2]), int(d[1]), int(d[0]),
                            int(d[3]) if len(d) > 3 else 0,
                            int(d[4]) if len(d) > 4 else 0,
                            int(d[5]) if len(d) > 5 else 0)
                return key(one) > key(two)

            try:
                fro = [int(i) for i in entry_from.get().split(".")]
                to = [int(i) for i in entry_to.get().split(".")]
                if self.name == 'rooms':
                    qty = int(entry_qty.get())
                    if not greaterDate(to, fro):
                        raise Exception
                else:
                    qty = 1
                if len(fro) < 3 or len(to) < 3 or qty < 1:
                    raise Exception
                d, m, y = fro
                if not (type(d) is int and type(m) is int and type(y) is int and
                        y > 0 and (0 < m <= 12) and
                        (((m in [1, 3, 5, 7, 8, 10, 12]) and (0 < d <= 31)) or
                         ((m in [4, 6, 9, 11]) and (0 < d <= 30)) or
                         ((m == 2) and ((0 < d <= 28 and y % 4 != 0) or (0 < d <= 29 and y % 4 == 0))))):
                    raise Exception
                d, m, y = to
                if not (type(d) is int and type(m) is int and type(y) is int and
                        y > 0 and (0 < m <= 12) and
                        (((m in [1, 3, 5, 7, 8, 10, 12]) and (0 < d <= 31)) or
                         ((m in [4, 6, 9, 11]) and (0 < d <= 30)) or
                         ((m == 2) and ((0 < d <= 28 and y % 4 != 0) or (0 < d <= 29 and y % 4 == 0))))):
                    raise Exception
                d = str(entry_comment.get())
            except:
                tkMessageBox.showerror("Invalid entry", "Please enter numbers according to the format provided into fields", parent=root)
                fine = False

            # do this only if everything is fine
            if fine:
                fro = [int(i) for i in entry_from.get().split(".")]
                to = [int(i) for i in entry_to.get().split(".")]
                totime = [0, 0, 0]
                if self.name == 'rooms':
                    qty = int(entry_qty.get())
                else:
                    qty = 1
                totime[0] = 23
                totime[1] = 59
                totime[2] = 59
                # check availability and prompt to book
                data = server.req('availability', dat['id'], name, thing['id'], fro[0], fro[1], fro[2], 0, 0, 0, to[0], to[1], to[2], totime[0], totime[1], totime[2])[0]
                if qty > data:
                    tkMessageBox.showerror("Unavailable", "Sorry, these many quantities are not available. Only " + str(data) + " are available.", parent=root)
                else:
                    if tkMessageBox.askyesno("Availability", "The required quantity is available. Do you want to confirm booking?", parent=root):
                        data = server.req('make_reservation', dat['id'], name, thing['id'], fro[0], fro[1], fro[2], 0, 0, 0, to[0], to[1], to[2], 0, 0, 0, str(entry_comment.get()), "", qty, "", "")
                        if data:
                            tkMessageBox.showinfo("Successfully booked", "Successfully reserved the required quantity. Our representative will speak with you within 24 hrs. to confirm the booking and payment method.", parent=root)
                            self.root.destroy()
                            self.main2.root.destroy()
                            self.main.root.destroy()

        user_frame = Frame(root, height=40, width=580, relief=RAISED, bd=1)
        user_frame.place(x=10, y=10)
        logout_button = Button(user_frame, text="back", command=(lambda: root.destroy()))
        logout_button.place(x=330 + 180, y=2)
        x, y = 10, 6
        label_info = Label(user_frame, text="Book " + name + " in the hotel: " + dat['name'])
        label_info.place(x=x, y=y)
        main_frameH = Frame(root, height=400, width=580, relief=RAISED, bd=0)
        main_frameH.place(x=10, y=60)
        main_frame = Frame(root, height=400, width=580, relief=RAISED, bd=1)
        main_frame.place(x=10, y=60)
        x, y = 10, 6
        label_info = Label(main_frame, text=thing['name'])
        label_info.place(x=x, y=y)
        try:
            label_info = Label(main_frame, text="Price per day: " + str(thing['price']))
            label_info.place(x=x + 400 - 80, y=y)
        except:
            pass
        label_info = Label(main_frame, text="Booking information: ")
        label_info.place(x=x, y=y + 30)
        x, y = 80, 6 - 80
        label_info = Label(main_frame, text="From Date (DD.MM.YYYY) : ")
        label_info.place(x=x, y=y + 140)
        entry_from = Entry(main_frame)
        entry_from.place(x=x + 200, y=y + 140)
        label_info = Label(main_frame, text="To Date (DD.MM.YYYY) : ")
        label_info.place(x=x, y=y + 170)
        entry_to = Entry(main_frame)
        entry_to.place(x=x + 200, y=y + 170)
        if self.name == 'rooms':
            label_qty = Label(main_frame, text="Quantity : ")
            label_qty.place(x=x, y=y + 200)
            entry_qty = Entry(main_frame)
            entry_qty.place(x=x + 200, y=y + 200)
        else:
            label_qty = Label(main_frame, text="Put the same date if you want to book only one day")
            label_qty.place(x=x, y=y + 200)
        label_qty = Label(main_frame, text="Flight Details, Comments and Special Requests: ")
        label_qty.place(x=x, y=y + 230)
        entry_comment = Entry(main_frame, width=37)
        entry_comment.place(x=x + 50, y=y + 260)
        check_button = Button(main_frame, text="Check availability", command=check)
        check_button.place(x=200, y=240)
        root.mainloop()
# This window is used to book restaurants
class bookrest(util.window):
    def __init__(self, server, hotel, dat, main, main2):
        self.server = server
        self.hotel = hotel
        self.dat = dat
        self.name = 'restaurants'
        self.main = main
        self.main2 = main2

    def show(self):
        dat = self.hotel
        server = self.server
        name = self.name
        thing = self.dat
        util.window.__init__(self, "SVS HOTELS | " + dat['name'] + " | " + name)
        root = self.root
        # By: Vinay C K
        def check():
            if not server.testConnection():
                tkMessageBox.showerror("Connection Failed", "Connection to the server failed. Please try again later.", parent=root)
                self.main.root.destroy()
                return  # no point validating input without a server connection
            fine = True
            # Various checks to verify the user inputs
            try:
                date = [int(i) for i in entry_from.get().split(".")]
                time = [int(i) for i in entry_to.get().split(":")]
                qty = int(entry_qty.get())
                if len(date) < 3 or len(time) < 2 or qty < 1:
                    raise Exception
                d, m, y = date
                if not (type(d) is int and type(m) is int and type(y) is int and
                        y > 0 and (0 < m <= 12) and
                        (((m in [1, 3, 5, 7, 8, 10, 12]) and (0 < d <= 31)) or
                         ((m in [4, 6, 9, 11]) and (0 < d <= 30)) or
                         ((m == 2) and ((0 < d <= 28 and y % 4 != 0) or (0 < d <= 29 and y % 4 == 0)))
                        )):
                    raise Exception
                h, m = time
                if not (0 <= h < 24 and 0 <= m < 60):
                    raise Exception
                d = str(entry_comment.get())
            except:
                tkMessageBox.showerror("Invalid entry", "Please enter numbers in the formats shown next to the fields.", parent=root)
                fine = False
            if fine:
                date = [int(i) for i in entry_from.get().split(".")]
                time = [int(i) for i in entry_to.get().split(":")]
                qty = int(entry_qty.get())
                time2, date2 = time[:], date[:]
                # The reservation window is two hours; roll the end time over
                # midnight (and month/year boundaries) where necessary.
                if (time[0] + 2) >= 24:
                    time2[0] = time[0] - 22
                    date2[0] = date[0] + 1
                    if ((date2[1] in [1, 3, 5, 7, 8, 10, 12] and date2[0] > 31) or
                        (date2[1] in [4, 6, 9, 11] and date2[0] > 30) or
                        (date2[1] == 2 and ((date2[0] > 28 and date[2] % 4 != 0) or
                                            (date2[0] > 29 and date[2] % 4 == 0)))):
                        date2[0] = 1
                        date2[1] = date[1] + 1
                        if date2[1] > 12:
                            date2[1] = 1
                            date2[2] = date[2] + 1
                else:
                    time2[0] = time[0] + 2
                data = server.req('availability', dat['id'], name, thing['id'], date[0], date[1], date[2], time[0], time[1], 0, date2[0], date2[1], date2[2], time2[0], time2[1], 0)[0]
                if qty > data:
                    tkMessageBox.showerror("Unavailable", "Sorry, that many are not available. Only " + str(data) + " are available.", parent=root)
                else:
                    if tkMessageBox.askyesno("Availability", "The required quantity is available. Do you want to confirm the booking?", parent=root):
                        data = server.req('make_reservation', dat['id'], name, thing['id'], date[0], date[1], date[2], time[0], time[1], 0, date2[0], date2[1], date2[2], time2[0], time2[1], 0, str(entry_comment.get()), "", qty, "", "")
                        if data:
                            tkMessageBox.showinfo("Successfully booked", "Successfully reserved the required quantity. Our representative will speak with you within 24 hrs. to confirm the booking and payment method.", parent=root)
                            self.root.destroy()
                            self.main2.root.destroy()
                            self.main.root.destroy()
        user_frame = Frame(root, height=40, width=580, relief=RAISED, bd=1)
        user_frame.place(x=10, y=10)
        logout_button = Button(user_frame, text="back", command=(lambda: root.destroy()))
        logout_button.place(x=330+180, y=2)
        x, y = 10, 6
        label_info = Label(user_frame, text="Book " + name + " in the hotel: " + dat['name'])
        label_info.place(x=x, y=y)
        main_frameH = Frame(root, height=400, width=580, relief=RAISED, bd=0)
        main_frameH.place(x=10, y=60)
        main_frame = Frame(root, height=400, width=580, relief=RAISED, bd=1)
        main_frame.place(x=10, y=60)
        x, y = 10, 6
        label_info = Label(main_frame, text=thing['name'])
        label_info.place(x=x, y=y)
        try:
            label_info = Label(main_frame, text="Price per day: " + str(thing['price']))
            label_info.place(x=x+400-80, y=y)
        except:
            pass
        label_info = Label(main_frame, text="Booking information: ")
        label_info.place(x=x, y=y+30)
        x, y = 80, 6-80
        label_info = Label(main_frame, text="Date (DD.MM.YYYY): ")
        label_info.place(x=x, y=y+140)
        entry_from = Entry(main_frame)
        entry_from.place(x=x+200, y=y+140)
        label_info = Label(main_frame, text="Time (HH:MM 24 Hr format): ")
        label_info.place(x=x, y=y+170)
        entry_to = Entry(main_frame)
        entry_to.place(x=x+200, y=y+170)
        label_qty = Label(main_frame, text="Number of people : ")
        label_qty.place(x=x, y=y+200)
        entry_qty = Entry(main_frame)
        entry_qty.place(x=x+200, y=y+200)
        label_qty = Label(main_frame, text="Comments and Special Requests: ")
        label_qty.place(x=x, y=y+230)
        entry_comment = Entry(main_frame, width=37)
        entry_comment.place(x=x+50, y=y+260)
        check_button = Button(main_frame, text="Check availability", command=check)
        check_button.place(x=200, y=240)
        root.mainloop()

if __name__ == "__main__":
    server = comm.link('vinay', 'letmein')
    server.login()
    book(server, 'restaurants')
| 35.248908 | 209 | 0.626301 | 2,572 | 16,144 | 3.856532 | 0.119751 | 0.02964 | 0.019054 | 0.013711 | 0.761972 | 0.728299 | 0.708842 | 0.67164 | 0.658232 | 0.628088 | 0 | 0.052511 | 0.213206 | 16,144 | 457 | 210 | 35.326039 | 0.728389 | 0.040387 | 0 | 0.647887 | 0 | 0.005634 | 0.127544 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04507 | false | 0.005634 | 0.008451 | 0 | 0.067606 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c5ce2b6234a6c764b1388631e4cae4cdeb3cde41 | 500 | py | Python | com/blueberr/python/study/day2/list_test2.py | CoderHongKong/python-study | 4e13359e3546b67d555a79adee63422cac7968c2 | [
"Apache-2.0"
] | null | null | null | com/blueberr/python/study/day2/list_test2.py | CoderHongKong/python-study | 4e13359e3546b67d555a79adee63422cac7968c2 | [
"Apache-2.0"
] | null | null | null | com/blueberr/python/study/day2/list_test2.py | CoderHongKong/python-study | 4e13359e3546b67d555a79adee63422cac7968c2 | [
"Apache-2.0"
] | null | null | null | # -*- coding: UTF-8 -*-
#!/usr/bin/env python
#-------------------------------------------------------------------------------
# Name:
# Purpose:
#
# Author: hekai
#-------------------------------------------------------------------------------
arrs = ['a', 'b', 'c', ['x', 'y'],'d', 'e']
print(arrs)
# Shallow copy
arrs_copy = arrs.copy()
print(arrs_copy)
arrs_ = arrs[:]
print(arrs_)
l = list(arrs)
print(l)
print(arrs[0:-1:2])
print(arrs[:2])
print('------------')
for i in arrs:
    print(i)
| 20.833333 | 80 | 0.356 | 53 | 500 | 3.283019 | 0.566038 | 0.258621 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011494 | 0.13 | 500 | 23 | 81 | 21.73913 | 0.388506 | 0.488 | 0 | 0 | 0 | 0 | 0.077236 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.615385 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
c5d5f844dc7b6d549c2e35fed6864c15335bb942 | 185 | py | Python | Chapter_2/SCU_2.7.py | charliealpha094/Introduction-to-Python-Programming-for-Business-and-Social-Sciences-Applications | 52c4612b0d80da3311411bc7aee1d98f75de3d95 | [
"MIT"
] | null | null | null | Chapter_2/SCU_2.7.py | charliealpha094/Introduction-to-Python-Programming-for-Business-and-Social-Sciences-Applications | 52c4612b0d80da3311411bc7aee1d98f75de3d95 | [
"MIT"
] | null | null | null | Chapter_2/SCU_2.7.py | charliealpha094/Introduction-to-Python-Programming-for-Business-and-Social-Sciences-Applications | 52c4612b0d80da3311411bc7aee1d98f75de3d95 | [
"MIT"
] | null | null | null | # Done by Carlos Amaral (16/09/2020)
# SCU_2.7 - Modify a User-Defined Function
def my_function(x, y):
    return (x + y) / 2
x = 4
y = 5
print("The average is: ", my_function(x,y))
| 12.333333 | 43 | 0.632432 | 36 | 185 | 3.166667 | 0.722222 | 0.052632 | 0.192982 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089655 | 0.216216 | 185 | 14 | 44 | 13.214286 | 0.696552 | 0.405405 | 0 | 0 | 0 | 0 | 0.152381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0.2 | 0.4 | 0.2 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
c5de3484c9384a40139f9092736b15d7e18061b5 | 50,010 | py | Python | sandbox-singlepage/data_prep_blockgrp.py | epcw/housing_equity | 074da171880fb445765662a98a816d13779f160f | [
"MIT"
] | null | null | null | sandbox-singlepage/data_prep_blockgrp.py | epcw/housing_equity | 074da171880fb445765662a98a816d13779f160f | [
"MIT"
] | null | null | null | sandbox-singlepage/data_prep_blockgrp.py | epcw/housing_equity | 074da171880fb445765662a98a816d13779f160f | [
"MIT"
] | 1 | 2021-07-19T20:25:02.000Z | 2021-07-19T20:25:02.000Z | import pandas as pd
import geopandas as gp
from geopy import distance
import json
import csv
#set root directory for data files
#ROOTDIR = '/home/ubuntu/housing_equity/sandbox-singlepage/' #production
ROOTDIR = '' #local
#read in shapefile (needs to be in GeoJSON format)
#with open('/home/ubuntu/dash/data/washingtongeo.json','r') as GeoJSON:
with open(ROOTDIR + 'data/wa_king_census_block_groups.geojson', 'r') as GeoJSON:
    block_grp_geoids = json.load(GeoJSON)
df_rent = pd.read_csv(ROOTDIR + 'data/king_blockgrp_rent.csv', dtype={"GEOID": str, "TRACT_NUM": str, "YEAR":str, "BLOCK_GRP":str}) #NOTE: pre-filtered in SQL for King County
# keep only the 2013 and 2018 survey years
df_rent = df_rent[(df_rent['YEAR'] == '2013') | (df_rent['YEAR'] == '2018')]
# split df_rent by census query so each variable becomes its own column rather than its own rows
df13 = df_rent[(df_rent['YEAR'] == '2013')]
dfrent_13 = df13[(df13['CENSUS_QUERY'] == 'B25071_001E')]
dfrent_13 = dfrent_13.rename(columns = {'DATA' : 'RENT_AS_PCT_INCOME_2013'})
dfrent_13 = dfrent_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_AS_PCT_INCOME_2013']]
costs_df25_13 = df13[(df13['CENSUS_QUERY'] == 'B25057_001E')]
costs_df25_13 = costs_df25_13.rename(columns = {'DATA' : 'RENT_25PCTILE_2013'})
costs_df25_13 = costs_df25_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_25PCTILE_2013']]
costs_df50_13 = df13[(df13['CENSUS_QUERY'] == 'B25058_001E')]
costs_df50_13 = costs_df50_13.rename(columns = {'DATA' : 'RENT_50PCTILE_2013'})
costs_df50_13 = costs_df50_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_50PCTILE_2013']]
costs_df75_13 = df13[(df13['CENSUS_QUERY'] == 'B25059_001E')]
costs_df75_13 = costs_df75_13.rename(columns = {'DATA' : 'RENT_75PCTILE_2013'})
costs_df75_13 = costs_df75_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_75PCTILE_2013']]
df13 = dfrent_13.merge(costs_df25_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df13 = df13.merge(costs_df50_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df13 = df13.merge(costs_df75_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df18 = df_rent[(df_rent['YEAR'] == '2018')]
dfrent_18 = df18[(df18['CENSUS_QUERY'] == 'B25071_001E')]
dfrent_18 = dfrent_18.rename(columns = {'DATA' : 'RENT_AS_PCT_INCOME_2018'})
dfrent_18 = dfrent_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_AS_PCT_INCOME_2018']]
costs_df25_18 = df18[(df18['CENSUS_QUERY'] == 'B25057_001E')]
costs_df25_18 = costs_df25_18.rename(columns = {'DATA' : 'RENT_25PCTILE_2018'})
costs_df25_18 = costs_df25_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_25PCTILE_2018']]
costs_df50_18 = df18[(df18['CENSUS_QUERY'] == 'B25058_001E')]
costs_df50_18 = costs_df50_18.rename(columns = {'DATA' : 'RENT_50PCTILE_2018'})
costs_df50_18 = costs_df50_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_50PCTILE_2018']]
costs_df75_18 = df18[(df18['CENSUS_QUERY'] == 'B25059_001E')]
costs_df75_18 = costs_df75_18.rename(columns = {'DATA' : 'RENT_75PCTILE_2018'})
costs_df75_18 = costs_df75_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','RENT_75PCTILE_2018']]
df18 = dfrent_18.merge(costs_df25_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df18 = df18.merge(costs_df50_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df18 = df18.merge(costs_df75_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df = df13.merge(df18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
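#  Each filter/rename/merge above turns one CENSUS_QUERY value into a column;
#  pandas can do the whole reshape in a single pivot. A minimal sketch on toy
#  data (the column names mirror the real ones, the values are made up):

```python
import pandas as pd

toy = pd.DataFrame({
    'GEOID': ['530330001001', '530330001001', '530330001002', '530330001002'],
    'CENSUS_QUERY': ['B25057_001E', 'B25058_001E', 'B25057_001E', 'B25058_001E'],
    'DATA': [900.0, 1100.0, 850.0, 1000.0],
})

# One row per block group, one column per census variable
wide = toy.pivot_table(index='GEOID', columns='CENSUS_QUERY', values='DATA').reset_index()
wide = wide.rename(columns={'B25057_001E': 'RENT_25PCTILE_2013',
                            'B25058_001E': 'RENT_50PCTILE_2013'})
```

#  The same pattern would collapse each of the year-specific merge chains in
#  this script into one call per year.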
#load in race data
rdf = pd.read_csv(ROOTDIR + 'data/king_blockgrp_race.csv', dtype={"GEOID": str, "TRACT_NUM": str, "YEAR":str, "BLOCK_GRP":str}) #NOTE: pre-filtered in SQL for King County
# keep only the 2013 and 2018 survey years
rdf = rdf[(rdf['YEAR'] == '2013') | (rdf['YEAR'] == '2018')]
gdf = pd.read_csv(ROOTDIR + 'data/wa_king_census_block_groups_distances.csv',
dtype={"block_group_geoid_a": str,"block_group_geoid_b": str})
rdf13 = rdf[(rdf['YEAR'] == '2013')]
totpop_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B01001_001E')]
totpop_13 = totpop_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA','YEAR']]
totpop_13 = totpop_13.rename(columns = {'DATA' : 'TOT_POP_2013'})
white_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B03002_003E')]
white_13 = white_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
white_13 = white_13.rename(columns = {'DATA' : 'pop_white_nonhisp_only_2013'})
black_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_003E')]
black_13 = black_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
black_13 = black_13.rename(columns = {'DATA' : 'pop_black_only_2013'})
native_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_004E')]
native_13 = native_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
native_13 = native_13.rename(columns = {'DATA' : 'pop_native_only_2013'})
asian_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_005E')]
asian_13 = asian_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
asian_13 = asian_13.rename(columns = {'DATA' : 'pop_asian_only_2013'})
polynesian_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_006E')]
polynesian_13 = polynesian_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
polynesian_13 = polynesian_13.rename(columns = {'DATA' : 'pop_polynesian_only_2013'})
latino_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B03002_012E')]
latino_13 = latino_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
latino_13 = latino_13.rename(columns = {'DATA' : 'pop_hispanic_2013'})
other_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_007E')]
other_13 = other_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
other_13 = other_13.rename(columns = {'DATA' : 'pop_other_only_2013'})
multiracial_13 = rdf13[(rdf13['CENSUS_QUERY'] == 'B02001_008E')]
multiracial_13 = multiracial_13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
multiracial_13 = multiracial_13.rename(columns = {'DATA' : 'pop_multiracial_2013'})
racial13 = totpop_13.merge(white_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(black_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(native_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(asian_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(polynesian_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(latino_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(other_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial13 = racial13.merge(multiracial_13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
race_data13 = racial13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','TOT_POP_2013','pop_white_nonhisp_only_2013','pop_black_only_2013','pop_native_only_2013','pop_asian_only_2013','pop_polynesian_only_2013','pop_hispanic_2013','pop_other_only_2013','pop_multiracial_2013']]
rdf18 = rdf[(rdf['YEAR'] == '2018')]
totpop_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B01001_001E')]
totpop_18 = totpop_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA','YEAR']]
totpop_18 = totpop_18.rename(columns = {'DATA' : 'TOT_POP_2018'})
white_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B03002_003E')]
white_18 = white_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
white_18 = white_18.rename(columns = {'DATA' : 'pop_white_nonhisp_only_2018'})
black_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_003E')]
black_18 = black_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
black_18 = black_18.rename(columns = {'DATA' : 'pop_black_only_2018'})
native_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_004E')]
native_18 = native_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
native_18 = native_18.rename(columns = {'DATA' : 'pop_native_only_2018'})
asian_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_005E')]
asian_18 = asian_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
asian_18 = asian_18.rename(columns = {'DATA' : 'pop_asian_only_2018'})
polynesian_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_006E')]
polynesian_18 = polynesian_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
polynesian_18 = polynesian_18.rename(columns = {'DATA' : 'pop_polynesian_only_2018'})
latino_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B03002_012E')]
latino_18 = latino_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
latino_18 = latino_18.rename(columns = {'DATA' : 'pop_hispanic_2018'})
other_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_007E')]
other_18 = other_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
other_18 = other_18.rename(columns = {'DATA' : 'pop_other_only_2018'})
multiracial_18 = rdf18[(rdf18['CENSUS_QUERY'] == 'B02001_008E')]
multiracial_18 = multiracial_18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','DATA']]
multiracial_18 = multiracial_18.rename(columns = {'DATA' : 'pop_multiracial_2018'})
racial18 = totpop_18.merge(white_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(black_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(native_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(asian_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(polynesian_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(latino_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(other_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
racial18 = racial18.merge(multiracial_18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
race_data18 = racial18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','TOT_POP_2018','pop_white_nonhisp_only_2018','pop_black_only_2018','pop_native_only_2018','pop_asian_only_2018','pop_polynesian_only_2018','pop_hispanic_2018','pop_other_only_2018','pop_multiracial_2018']]
race_data = race_data13.merge(race_data18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df = df.merge(race_data, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP']).drop_duplicates()
df['minority_pop_2013'] = df['TOT_POP_2013'] - df['pop_white_nonhisp_only_2013']
df['minority_pop_pct_2013'] = df['minority_pop_2013'] / df['TOT_POP_2013']
df['minority_pop_2018'] = df['TOT_POP_2018'] - df['pop_white_nonhisp_only_2018']
df['minority_pop_pct_2018'] = df['minority_pop_2018'] / df['TOT_POP_2018']
#bring in affordable housing data
housing_df_raw = pd.read_csv(ROOTDIR + 'data/king_blockgrp_housing-details.csv', dtype={"DATA":float,"YEAR":str,"TRACT_NUM": str, "BLOCK_GRP": str})
#create geoid
housing_df_raw['GEOID'] = '53033' + housing_df_raw['TRACT_NUM'] + housing_df_raw['BLOCK_GRP']
#Aggregate unit data (rows in the original housing_df_raw are counts of units under $100/mo, $200/mo, $300/mo, etc.)
affordable_queries = ['B25063_0%02dE' % i for i in range(3, 14)]  # B25063_003E through B25063_013E
affordable_df_raw = housing_df_raw[housing_df_raw['CENSUS_QUERY'].isin(affordable_queries)]
affordable_df = affordable_df_raw[['COUNTY','TRACT_NUM','BLOCK_GRP','GEOID','YEAR']]
affordable_data13 = affordable_df_raw[(affordable_df_raw['YEAR'] == '2013')]
affordable_data13 = affordable_data13.groupby(['GEOID']).sum().reset_index()
affordable_df = affordable_df.merge(affordable_data13, how='left', left_on=['GEOID'], right_on=['GEOID'])
affordable_df = affordable_df[['COUNTY','TRACT_NUM','BLOCK_GRP','GEOID','DATA']]
affordable_df = affordable_df.rename(columns = {'DATA' : 'sub_600_units_2013'})
affordable_data18 = affordable_df_raw[(affordable_df_raw['YEAR'] == '2018')]
affordable_data18 = affordable_data18.groupby(['GEOID']).sum().reset_index()
affordable_df = affordable_df.merge(affordable_data18, how='left', left_on=['GEOID'], right_on=['GEOID']).drop_duplicates()
affordable_df = affordable_df[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','sub_600_units_2013','DATA']]
affordable_df = affordable_df.rename(columns = {'DATA' : 'sub_600_units_2018'})
df = df.merge(affordable_df, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
df['sub_600_units_per_capita_2013'] = df['sub_600_units_2013']/df['TOT_POP_2013']
df['sub_600_units_per_capita_2018'] = df['sub_600_units_2018']/df['TOT_POP_2018']
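#  The affordable-unit counts arrive as one row per rent band, so the
#  groupby().sum() above collapses them into a single sub-$600 total per block
#  group before computing per-capita rates. A toy sketch of that step
#  (GEOIDs and unit counts are made up):

```python
import pandas as pd

bands = pd.DataFrame({
    'GEOID': ['530330001001'] * 3 + ['530330001002'] * 2,
    'DATA': [10.0, 25.0, 5.0, 8.0, 12.0],  # units in each rent band
})

# Total affordable units per block group
totals = bands.groupby('GEOID', as_index=False)['DATA'].sum()
```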
#Tenancy data
tenancy_df_raw = housing_df_raw[(housing_df_raw['CENSUS_QUERY'] == 'B25039_001E')]
tenancy13 = tenancy_df_raw[(tenancy_df_raw['YEAR'] == '2013')].copy()  # .copy() avoids SettingWithCopyWarning on the assignment below
tenancy13['median_tenancy_2013'] = 2013 - tenancy13.DATA
tenancy13 = tenancy13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','median_tenancy_2013']]
tenancy18 = tenancy_df_raw[(tenancy_df_raw['YEAR'] == '2018')].copy()
tenancy18['median_tenancy_2018'] = 2018 - tenancy18.DATA
tenancy18 = tenancy18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','median_tenancy_2018']]
df = df.merge(tenancy13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
df = df.merge(tenancy18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
#housing age data
housing_age_df_raw = housing_df_raw[(housing_df_raw['CENSUS_QUERY'] == 'B25035_001E')]
housing_age13 = housing_age_df_raw[(housing_age_df_raw['YEAR'] == '2013')].copy()  # .copy() avoids SettingWithCopyWarning on the assignment below
housing_age13['median_housing_age_2013'] = 2013 - housing_age13.DATA
housing_age13 = housing_age13[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','median_housing_age_2013']]
housing_age13.loc[housing_age13.median_housing_age_2013 > 100, 'median_housing_age_2013'] = None  # null out impossible ages produced by census sentinel values
housing_age18 = housing_age_df_raw[(housing_age_df_raw['YEAR'] == '2018')].copy()
housing_age18['median_housing_age_2018'] = 2018 - housing_age18.DATA
housing_age18 = housing_age18[['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP','median_housing_age_2018']]
housing_age18.loc[housing_age18.median_housing_age_2018 > 100, 'median_housing_age_2018'] = None  # null out impossible ages produced by census sentinel values
df = df.merge(housing_age13, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
df = df.merge(housing_age18, how = 'inner', left_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'], right_on = ['GEOID','COUNTY','TRACT_NUM','BLOCK_GRP'])
'''
#DEBUG - CHECK FOR NaNs
nandf = df[df.isnull().any(axis=1)]
csv_filename = 'nan-check.csv'
nandf.to_csv(csv_filename, index = False,quotechar='"',quoting=csv.QUOTE_ALL)
print("Exporting csv...")
'''
gdf = gdf.merge(df[['GEOID']], how='left', left_on='block_group_geoid_a', right_on='GEOID')
gdf = gdf.rename(columns={'block_group_geoid_a':'GEOID_a'})
gdf = gdf.merge(df[['GEOID']], how='left', left_on='block_group_geoid_b', right_on='GEOID')
gdf = gdf.rename(columns={'block_group_geoid_b':'GEOID_b'})
df['minority_pop_pct_change'] = (df.minority_pop_pct_2018 - df.minority_pop_pct_2013)
df['rent_25th_pctile_change'] = (df.RENT_25PCTILE_2018 - df.RENT_25PCTILE_2013)
df['totpop_change'] = (df.TOT_POP_2018 - df.TOT_POP_2013)
df['rent_pct_income_change'] = (df.RENT_AS_PCT_INCOME_2018 - df.RENT_AS_PCT_INCOME_2013)
df['affordable_units_per_cap_change'] = (df.sub_600_units_per_capita_2018 - df.sub_600_units_per_capita_2013)
df['median_tenancy_change'] = (df.median_tenancy_2018 - df.median_tenancy_2013)
df['median_housing_age_change'] = (df.median_housing_age_2018 - df.median_housing_age_2013)
#CONVERT ALL CHANGES TO Z-SCORE SO YOU CAN COMPARE THEM
df['minority_pop_pct_change'] = (df.minority_pop_pct_change - df.minority_pop_pct_change.mean())/df.minority_pop_pct_change.std()
df['rent_25th_pctile_change'] = (df.rent_25th_pctile_change - df.rent_25th_pctile_change.mean())/df.rent_25th_pctile_change.std()
df['totpop_change'] = (df.totpop_change - df.totpop_change.mean())/df.totpop_change.std()
df['rent_pct_income_change'] = (df.rent_pct_income_change - df.rent_pct_income_change.mean())/df.rent_pct_income_change.std()
df['affordable_units_per_cap_change'] = (df.affordable_units_per_cap_change - df.affordable_units_per_cap_change.mean())/df.affordable_units_per_cap_change.std()
df['median_tenancy_change'] = (df.median_tenancy_change - df.median_tenancy_change.mean())/df.median_tenancy_change.std()
df['median_housing_age_change'] = (df.median_housing_age_change - df.median_housing_age_change.mean())/df.median_housing_age_change.std()
#CONVERT 2013 numbers to z-score for comparison
df['minority_pop_pct_2013z'] = (df.minority_pop_pct_2013 - df.minority_pop_pct_2013.mean())/df.minority_pop_pct_2013.std()
df['rent_25th_pctile_2013z'] = (df.RENT_25PCTILE_2013 - df.RENT_25PCTILE_2013.mean())/df.RENT_25PCTILE_2013.std()
df['totpop_2013z'] = (df.TOT_POP_2013 - df.TOT_POP_2013.mean())/df.TOT_POP_2013.std()
df['rent_pct_income_2013z'] = (df.RENT_AS_PCT_INCOME_2013 - df.RENT_AS_PCT_INCOME_2013.mean())/df.RENT_AS_PCT_INCOME_2013.std()
df['affordable_units_per_cap_2013z'] = (df.sub_600_units_per_capita_2013 - df.sub_600_units_per_capita_2013.mean())/df.sub_600_units_per_capita_2013.std()
df['median_tenancy_2013z'] = (df.median_tenancy_2013 - df.median_tenancy_2013.mean())/df.median_tenancy_2013.std()
df['median_housing_age_2013z'] = (df.median_housing_age_2013 - df.median_housing_age_2013.mean())/df.median_housing_age_2013.std()
#CONVERT 2018 numbers to z-score for comparison
df['minority_pop_pct_2018z'] = (df.minority_pop_pct_2018 - df.minority_pop_pct_2018.mean())/df.minority_pop_pct_2018.std()
df['rent_25th_pctile_2018z'] = (df.RENT_25PCTILE_2018 - df.RENT_25PCTILE_2018.mean())/df.RENT_25PCTILE_2018.std()
df['totpop_2018z'] = (df.TOT_POP_2018 - df.TOT_POP_2018.mean())/df.TOT_POP_2018.std()
df['rent_pct_income_2018z'] = (df.RENT_AS_PCT_INCOME_2018 - df.RENT_AS_PCT_INCOME_2018.mean())/df.RENT_AS_PCT_INCOME_2018.std()
df['affordable_units_per_cap_2018z'] = (df.sub_600_units_per_capita_2018 - df.sub_600_units_per_capita_2018.mean())/df.sub_600_units_per_capita_2018.std()
df['median_tenancy_2018z'] = (df.median_tenancy_2018 - df.median_tenancy_2018.mean())/df.median_tenancy_2018.std()
df['median_housing_age_2018z'] = (df.median_housing_age_2018 - df.median_housing_age_2018.mean())/df.median_housing_age_2018.std()
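#  Standardizing each variable to a z-score (subtract the mean, divide by the
#  standard deviation) is what makes metrics on different scales comparable.
#  The repeated (x - x.mean())/x.std() pattern above could be factored into a
#  helper — a sketch, with a hypothetical helper name:

```python
import pandas as pd

def zscore(s: pd.Series) -> pd.Series:
    """Standardize a series to mean 0 and (sample) standard deviation 1."""
    return (s - s.mean()) / s.std()

vals = pd.Series([2.0, 4.0, 6.0])
z = zscore(vals)
```

#  Applied per column, e.g. df['totpop_2018z'] = zscore(df.TOT_POP_2018),
#  this would replace each of the twenty-one standardization lines above.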
#merge the per-metric columns onto the block-group-pair table (gdf) so pairwise comparisons can be calculated
minority = df[['GEOID','minority_pop_pct_change','minority_pop_pct_2013z','minority_pop_pct_2018z']]
gdf = gdf.merge(minority, how = 'inner', left_on = ['GEOID_a'], right_on = ['GEOID'])
gdf = gdf.rename(columns = {'minority_pop_pct_change':'minority_pop_pct_change_a'})
gdf = gdf.rename(columns = {'minority_pop_pct_2013z':'minority_pop_pct_2013z_a'})
gdf = gdf.rename(columns = {'minority_pop_pct_2018z':'minority_pop_pct_2018z_a'})
gdf = gdf[['GEOID_a','GEOID_b','distance','minority_pop_pct_change_a','minority_pop_pct_2013z_a','minority_pop_pct_2018z_a']]
gdf = gdf.merge(minority, how = 'inner', left_on = ['GEOID_b'], right_on = ['GEOID'])
gdf = gdf.rename(columns = {'minority_pop_pct_change':'minority_pop_pct_change_b'})
gdf = gdf.rename(columns = {'minority_pop_pct_2013z':'minority_pop_pct_2013z_b'})
gdf = gdf.rename(columns = {'minority_pop_pct_2018z':'minority_pop_pct_2018z_b'})
gdf = gdf[['GEOID_a','GEOID_b','distance','minority_pop_pct_change_a','minority_pop_pct_change_b','minority_pop_pct_2013z_a','minority_pop_pct_2013z_b','minority_pop_pct_2018z_a','minority_pop_pct_2018z_b']]
#merge each remaining indicator group onto both endpoints of every edge,
#suffixing the new columns with _a / _b and dropping the helper GEOID key after each merge
indicator_groups = [
    ['rent_25th_pctile_change', 'rent_25th_pctile_2013z', 'rent_25th_pctile_2018z'],
    ['totpop_change', 'totpop_2013z', 'totpop_2018z'],
    ['rent_pct_income_change', 'rent_pct_income_2013z', 'rent_pct_income_2018z'],
    ['affordable_units_per_cap_change', 'affordable_units_per_cap_2013z', 'affordable_units_per_cap_2018z'],
    ['median_tenancy_change', 'median_tenancy_2013z', 'median_tenancy_2018z'],
    ['median_housing_age_change', 'median_housing_age_2013z', 'median_housing_age_2018z'],
]
for cols in indicator_groups:
    sub = df[['GEOID'] + cols]
    for side in ('a', 'b'):
        gdf = gdf.merge(sub, how = 'inner', left_on = 'GEOID_' + side, right_on = 'GEOID')
        gdf = gdf.rename(columns = {c: c + '_' + side for c in cols})
        gdf = gdf.drop(columns = 'GEOID')
omicron = 1/3 #weighting factor: omicron weights the starting point (2013 z-score), (1-omicron) weights the change
#calculate diff between the two tracts (and take absolute value since sign is meaningless here) - Delta taking into account starting point (2013) and change.
indicators = ['minority_pop_pct', 'rent_25th_pctile', 'totpop', 'rent_pct_income', 'affordable_units_per_cap', 'median_tenancy', 'median_housing_age']
for ind in indicators:
    blend_a = ((1-omicron) * gdf[ind + '_change_a']) + (omicron * gdf[ind + '_2013z_a'])
    blend_b = ((1-omicron) * gdf[ind + '_change_b']) + (omicron * gdf[ind + '_2013z_b'])
    gdf[ind + '_change_delta'] = (blend_a - blend_b).abs().fillna(0) #fillna deals with nan in the dataframe, which was breaking the network
#Delta in 2013 (without taking into account change)
for ind in indicators:
    gdf[ind + '_change_delta_2013'] = (gdf[ind + '_2013z_a'] - gdf[ind + '_2013z_b']).abs().fillna(0)
#Delta in 2018 (without taking into account change)
for ind in indicators:
    gdf[ind + '_change_delta_2018'] = (gdf[ind + '_2018z_a'] - gdf[ind + '_2018z_b']).abs().fillna(0)
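#The omicron blend used for the deltas above can be checked on toy numbers.
#Illustrative only: the values and the `blend` helper below are made up and are not part of the pipeline.

```python
# Toy check of the omicron blend: weight the 2013 z-score by omicron and the
# 2013-2018 change by (1 - omicron), then take the absolute tract-to-tract gap.
omicron = 1/3

def blend(change, z2013):
    # hypothetical helper mirroring the per-tract blend used in the pipeline
    return (1 - omicron) * change + omicron * z2013

delta = abs(blend(1.2, 0.3) - blend(-0.6, 0.9))
print(round(delta, 6))
```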
#weight the edges
alpha = 1/6.0
bravo = 1/6.0
charlie = 1/6.0
delta = 1/6.0
echo = 1/6.0
foxtrot = 1/6.0
golf = 0
threshold = -0.5
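#Quick sanity check (illustrative, not part of the pipeline): the alpha..foxtrot
#weights plus golf should form a convex combination (sum to 1) so omega stays on
#a comparable scale across runs.

```python
# The six active edge weights plus the zeroed golf weight should sum to 1.
weights = [1/6.0, 1/6.0, 1/6.0, 1/6.0, 1/6.0, 1/6.0, 0]  # alpha..foxtrot, golf
total = sum(weights)
assert abs(total - 1.0) < 1e-12
```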
#2013 + change version
gdf['omega'] = (
    - (alpha * gdf.minority_pop_pct_change_delta)
    + (bravo * gdf.rent_25th_pctile_change_delta)
    + (charlie * gdf.totpop_change_delta)
    + (delta * gdf.rent_pct_income_change_delta)
    - (echo * gdf.affordable_units_per_cap_change_delta)
    - (foxtrot * gdf.median_tenancy_change_delta)
    + (golf * gdf.median_housing_age_change_delta)
)
#gdf.loc[gdf.omega < 0, 'omega'] = None #corrects for the census having "2018" as an answer to some of these
gdf = gdf[(gdf['omega'] >= threshold)]
#2013 only version
gdf['omega13'] = (
    - (alpha * gdf.minority_pop_pct_change_delta_2013)
    + (bravo * gdf.rent_25th_pctile_change_delta_2013)
    + (charlie * gdf.totpop_change_delta_2013)
    + (delta * gdf.rent_pct_income_change_delta_2013)
    - (echo * gdf.affordable_units_per_cap_change_delta_2013)
    - (foxtrot * gdf.median_tenancy_change_delta_2013)
    + (golf * gdf.median_housing_age_change_delta_2013)
)
gdf = gdf[(gdf['omega13'] >= threshold)]
#2018 only version
gdf['omega18'] = (
    - (alpha * gdf.minority_pop_pct_change_delta_2018)
    + (bravo * gdf.rent_25th_pctile_change_delta_2018)
    + (charlie * gdf.totpop_change_delta_2018)
    + (delta * gdf.rent_pct_income_change_delta_2018)
    - (echo * gdf.affordable_units_per_cap_change_delta_2018)
    - (foxtrot * gdf.median_tenancy_change_delta_2018)
    + (golf * gdf.median_housing_age_change_delta_2018)
)
gdf = gdf[(gdf['omega18'] >= threshold)]
#tester for bar graph of just geoid_a
gdf['omega_bar'] = (
    - (alpha * (((1-omicron) * gdf.minority_pop_pct_change_a) + (omicron * gdf.minority_pop_pct_2013z_a)))
    + (bravo * (((1-omicron) * gdf.rent_25th_pctile_change_a) + (omicron * gdf.rent_25th_pctile_2013z_a)))
    + (charlie * (((1-omicron) * gdf.totpop_change_a) + (omicron * gdf.totpop_2013z_a)))
    + (delta * (((1-omicron) * gdf.rent_pct_income_change_a) + (omicron * gdf.rent_pct_income_2013z_a)))
    - (echo * (((1-omicron) * gdf.affordable_units_per_cap_change_a) + (omicron * gdf.affordable_units_per_cap_2013z_a)))
    - (foxtrot * (((1-omicron) * gdf.median_tenancy_change_a) + (omicron * gdf.median_tenancy_2013z_a)))
    + (golf * (((1-omicron) * gdf.median_housing_age_change_a) + (omicron * gdf.median_housing_age_2013z_a)))
)
gdf['omega_bar'] = gdf['omega_bar'].fillna(0) #deals with nan in dataframe, which was breaking the network
gdf = gdf[(gdf['omega_bar'] >= threshold)]
#gdf['omega'] = gdf['omega'] / gdf['omega'].max() #normalize so edges don't go nuts
#gdf = gdf[gdf['distance'] < 3500] #filter, is only necessary if you need to threshold this and also don't use one of the subset dfs below.
#create cityname df
muni_gdf = gp.read_file(ROOTDIR + 'data/shapefiles/Municipal_Boundaries/Municipal_Boundaries.shp')
tract_gdf = gp.read_file(ROOTDIR + 'data/shapefiles/KingCountyTracts/kc_tract_10.shp')
t = tract_gdf[['OBJECTID', 'STATEFP10', 'COUNTYFP10', 'TRACTCE10', 'GEOID10', 'geometry']]
m = muni_gdf[['OBJECTID', 'CITYNAME', 'geometry']]
m = m.to_crs(epsg=2926)
#pd.options.display.max_rows = 500
mt = gp.sjoin(t, m, how='left', op='intersects', lsuffix='', rsuffix='_muni')
mt[['GEOID10', 'CITYNAME']]
#merge with cityname df
#df = df.merge(mt, how = 'inner', left_on = ['GEOID'], right_on = ['GEOID10']) #DON'T DO THIS UNTIL CITIES ASSIGNED TO BLOCK GROUPS
#df = df.rename(columns = {'CITYNAME_y':'CITYNAME'})
import itertools
#helper to build the edge (gdf) and node (df) subsets for tracts near a seed tract
def make_neighborhood_subsets(seed_geoid, max_distance, name):
    seed = gdf[((gdf['GEOID_a'] == seed_geoid) & (gdf['distance'] < max_distance)) | ((gdf['GEOID_b'] == seed_geoid) & (gdf['distance'] < max_distance))]
    geoids = list(seed['GEOID_a'].drop_duplicates()) + list(seed['GEOID_b'].drop_duplicates())
    pair_df = pd.DataFrame(list(itertools.product(geoids, geoids)), columns=['GEOID_a', 'GEOID_b'])
    pair_df = pair_df.merge(gdf, how='left', on=['GEOID_a', 'GEOID_b'])
    pair_df = pair_df[~pair_df['distance'].isnull()] #drop candidate pairs with no edge in gdf
    node_geoids = list(pair_df['GEOID_a'].drop_duplicates()) + list(pair_df['GEOID_b'].drop_duplicates())
    node_df = df[df['GEOID'].isin(node_geoids)].copy() #copy avoids SettingWithCopyWarning on the next assignment
    node_df['neighborhood'] = name
    return pair_df, node_df
#create the four neighborhood subsets
mtbaker_station_gdf, mtbaker_station_df = make_neighborhood_subsets('530330100011', 1.500, 'mtbaker_station')
othello_station_gdf, othello_station_df = make_neighborhood_subsets('530330110012', 1.500, 'othello_station')
rainier_beach_gdf, rainier_beach_df = make_neighborhood_subsets('530330117001', 2, 'rainier_beach')
wallingford_gdf, wallingford_df = make_neighborhood_subsets('530330046001', 3.000, 'wallingford')
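#Toy illustration (not part of the pipeline; the GEOIDs are made up) of the
#pair-building step used for each neighborhood subset above: itertools.product
#over a GEOID list yields every ordered (GEOID_a, GEOID_b) pair, including self-pairs.

```python
# All ordered pairs from a two-element GEOID list, including self-pairs.
import itertools

geoids = ['A', 'B']  # made-up GEOIDs for illustration
pairs = list(itertools.product(geoids, geoids))
```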
def get_df(subset='all'):
    subsets = {
        'all': df,
        'mtbaker_station': mtbaker_station_df,
        'othello_station': othello_station_df,
        'rainier_beach': rainier_beach_df,
        'wallingford': wallingford_df
    }
    if subset in subsets:
        return subsets[subset]
    else:
        raise ValueError('Unrecognized subset. Must be one of {}, but received: {}'.format(list(subsets.keys()), subset))
def get_gdf(subset='all'):
    subsets = {
        'all': gdf,
        'mtbaker_station': mtbaker_station_gdf,
        'othello_station': othello_station_gdf,
        'rainier_beach': rainier_beach_gdf,
        'wallingford': wallingford_gdf
    }
    if subset in subsets:
        return subsets[subset]
    else:
        raise ValueError('Unrecognized subset. Must be one of {}, but received: {}'.format(list(subsets.keys()), subset))
| 84.05042 | 1,173 | 0.767287 | 7,771 | 50,010 | 4.442157 | 0.048385 | 0.044293 | 0.054751 | 0.052289 | 0.831025 | 0.769554 | 0.688789 | 0.619902 | 0.548725 | 0.520365 | 0 | 0.082325 | 0.071666 | 50,010 | 594 | 1,174 | 84.191919 | 0.661224 | 0.05057 | 0 | 0.042553 | 0 | 0 | 0.430014 | 0.251906 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004255 | false | 0 | 0.012766 | 0 | 0.021277 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c5df8b7ad170b27ba879a52b84d09f449761877e | 29 | py | Python | cachebuster/tests.py | ossobv/django-cachebuster | 2e48e5c4c94fb60c0ccb3b9bd52b5803162e2180 | [
"BSD-3-Clause"
] | null | null | null | cachebuster/tests.py | ossobv/django-cachebuster | 2e48e5c4c94fb60c0ccb3b9bd52b5803162e2180 | [
"BSD-3-Clause"
] | null | null | null | cachebuster/tests.py | ossobv/django-cachebuster | 2e48e5c4c94fb60c0ccb3b9bd52b5803162e2180 | [
"BSD-3-Clause"
] | null | null | null | __author__ = 'James Addison'
| 14.5 | 28 | 0.758621 | 3 | 29 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0.448276 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c5f4396c9c2a4bc1df9d0de095d61c8098c8d796 | 805 | py | Python | assignment1/assignment1/sigmoid.py | lars4/Machine-Learning | 04156b51608c3d19cde9643dad9367e7ceaace21 | [
"MIT"
] | null | null | null | assignment1/assignment1/sigmoid.py | lars4/Machine-Learning | 04156b51608c3d19cde9643dad9367e7ceaace21 | [
"MIT"
] | null | null | null | assignment1/assignment1/sigmoid.py | lars4/Machine-Learning | 04156b51608c3d19cde9643dad9367e7ceaace21 | [
"MIT"
] | null | null | null | from numpy import exp
def sigmoid(z):
    """
    Computes the sigmoid of z element-wise.

    Args:
        z: An np.array of arbitrary shape

    Returns:
        g: An np.array of the same shape as z
    """
    #######################################################################
    # TODO:                                                               #
    # Compute and return the sigmoid of z in g                            #
    #######################################################################
    g = 1 / (1 + exp(-z))
    #######################################################################
    #                          END OF YOUR CODE                           #
    #######################################################################
    return g
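# A quick standalone check of the sigmoid (illustrative only; the scalar helper
# below redefines the function with math.exp so the snippet runs on its own):

```python
# Self-contained sigmoid check: sigmoid(0) is exactly 0.5 and large-magnitude
# inputs saturate toward 0 and 1.
from math import exp

def sigmoid_scalar(z):
    # scalar version of the sigmoid above, for a quick check
    return 1 / (1 + exp(-z))

mid = sigmoid_scalar(0)
big = sigmoid_scalar(50)
small = sigmoid_scalar(-50)
```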
| 28.75 | 75 | 0.239752 | 56 | 805 | 3.446429 | 0.571429 | 0.103627 | 0.124352 | 0.134715 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003891 | 0.361491 | 805 | 27 | 76 | 29.814815 | 0.371595 | 0.392547 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
6804c4cf60f22bd8e1138677cca8ce98dba893f1 | 141 | py | Python | Python/chapter1.py | thebiebs/python-practise-exercises | 28db498f35ec6f9bdd437f54c91990027f4b9436 | [
"MIT"
] | null | null | null | Python/chapter1.py | thebiebs/python-practise-exercises | 28db498f35ec6f9bdd437f54c91990027f4b9436 | [
"MIT"
] | null | null | null | Python/chapter1.py | thebiebs/python-practise-exercises | 28db498f35ec6f9bdd437f54c91990027f4b9436 | [
"MIT"
] | null | null | null | # Autonr : Biswadeep Roy
# import os
''' thiple single quote is used to
do multiple line comments'''
print("Hello world")
| 10.846154 | 35 | 0.624113 | 18 | 141 | 4.888889 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.283688 | 141 | 12 | 36 | 11.75 | 0.871287 | 0.638298 | 0 | 0 | 0 | 0 | 0.37931 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
680b387b1517d20d6c712efab67f23a5cb553a13 | 1,148 | py | Python | tests/gates/rx/test_constructor.py | edyounis/QFactor | 8ce255b432157e2a0d4e91e0a8c4be0ef4f60a74 | [
"BSD-3-Clause-LBNL"
] | 3 | 2020-12-15T17:14:13.000Z | 2020-12-17T11:35:37.000Z | tests/gates/rx/test_constructor.py | edyounis/qFactor | 8ce255b432157e2a0d4e91e0a8c4be0ef4f60a74 | [
"BSD-3-Clause-LBNL"
] | null | null | null | tests/gates/rx/test_constructor.py | edyounis/qFactor | 8ce255b432157e2a0d4e91e0a8c4be0ef4f60a74 | [
"BSD-3-Clause-LBNL"
] | null | null | null | import scipy
import numpy as np
import unittest as ut
from qfactor import get_distance
from qfactor.gates import RxGate
class TestRxGateConstructor ( ut.TestCase ):

    def test_rxgate_constructor_invalid ( self ):
        self.assertRaises( TypeError, RxGate, 1, 0 )
        self.assertRaises( TypeError, RxGate, "a", 0 )
        self.assertRaises( TypeError, RxGate, [0, 1], 0 )
        self.assertRaises( ValueError, RxGate, np.pi/2, -1 )
        self.assertRaises( TypeError, RxGate, np.pi/2, [0, 1] )
        self.assertRaises( TypeError, RxGate, np.pi/2, (0, 1) )
        self.assertRaises( TypeError, RxGate, np.pi/2, ("a") )
        self.assertRaises( TypeError, RxGate, np.pi/2, 0, 0 )

    def test_rxgate_constructor_valid ( self ):
        gate = RxGate( np.pi, 0, True )
        X = np.array( [ [ 0, 1 ], [ 1, 0 ] ] )
        Rx = scipy.linalg.expm( -1j * np.pi/2 * X )
        self.assertTrue( get_distance( [ gate ], Rx ) < 1e-15 )
        self.assertTrue( np.array_equal( gate.location, (0,) ) )
        self.assertEqual( gate.gate_size, 1 )
        self.assertTrue( gate.fixed )

if __name__ == '__main__':
    ut.main()
| 32.8 | 64 | 0.616725 | 151 | 1,148 | 4.569536 | 0.324503 | 0.185507 | 0.253623 | 0.314493 | 0.310145 | 0.217391 | 0.217391 | 0.217391 | 0.163768 | 0.163768 | 0 | 0.034843 | 0.25 | 1,148 | 34 | 65 | 33.764706 | 0.766551 | 0 | 0 | 0 | 0 | 0 | 0.008718 | 0 | 0 | 0 | 0 | 0 | 0.48 | 1 | 0.08 | false | 0 | 0.2 | 0 | 0.32 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
6818bcf80778fa40f78accee9577156d7b677201 | 1,016 | py | Python | tests/api/test_api.py | ludeeus/HARM | 9c4f69d7cd7c240c2aa66c150723536b533b9f2f | [
"MIT"
] | 456 | 2019-03-02T22:46:27.000Z | 2019-10-23T18:34:23.000Z | tests/api/test_api.py | ludeeus/HARM | 9c4f69d7cd7c240c2aa66c150723536b533b9f2f | [
"MIT"
] | 573 | 2019-03-02T18:42:28.000Z | 2019-10-24T07:52:38.000Z | tests/api/test_api.py | ludeeus/harm | 55794176fccf55321780decf9ef354e91a41d1b9 | [
"MIT"
] | 275 | 2019-05-22T06:27:25.000Z | 2019-10-23T11:42:51.000Z | import os
from homeassistant.core import HomeAssistant
import pytest
from custom_components.hacs.websocket import (
    acknowledge_critical_repository,
    get_critical_repositories,
    hacs_config,
    hacs_removed,
    hacs_repositories,
    hacs_repository,
    hacs_repository_data,
    hacs_settings,
    hacs_status,
)


@pytest.mark.asyncio
async def test_check_local_path(hacs, connection, tmpdir):
    hacs.hass = HomeAssistant()
    os.makedirs(tmpdir, exist_ok=True)
    get_critical_repositories(hacs.hass, connection, {"id": 1})
    hacs_config(hacs.hass, connection, {"id": 1})
    hacs_removed(hacs.hass, connection, {"id": 1})
    hacs_repositories(hacs.hass, connection, {"id": 1})
    hacs_repository(hacs.hass, connection, {"id": 1})
    hacs_repository_data(hacs.hass, connection, {"id": 1})
    hacs_settings(hacs.hass, connection, {"id": 1})
    hacs_status(hacs.hass, connection, {"id": 1})
    acknowledge_critical_repository(hacs.hass, connection, {"repository": "test/test", "id": 1})
| 30.787879 | 96 | 0.720472 | 126 | 1,016 | 5.579365 | 0.293651 | 0.113798 | 0.230441 | 0.227596 | 0.341394 | 0.311522 | 0.169275 | 0 | 0 | 0 | 0 | 0.010465 | 0.153543 | 1,016 | 32 | 97 | 31.75 | 0.806977 | 0 | 0 | 0 | 0 | 0 | 0.036417 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.148148 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a84c93910e0067ef22094363b9104b41671748b4 | 2,872 | py | Python | cexapi.py | binary-signal/Ethereum-Tracker | c7f57114e9f4d0772d2aae37a4d058c4a595525e | [
"BSD-2-Clause"
] | null | null | null | cexapi.py | binary-signal/Ethereum-Tracker | c7f57114e9f4d0772d2aae37a4d058c4a595525e | [
"BSD-2-Clause"
] | null | null | null | cexapi.py | binary-signal/Ethereum-Tracker | c7f57114e9f4d0772d2aae37a4d058c4a595525e | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Author: t0pep0
import hmac
import hashlib
import time
import json
import requests
class API(object):
    __username = ''
    __api_key = ''
    __api_secret = ''
    __nonce_v = ''

    # Init class
    def __init__(self, username, api_key, api_secret):
        self.__username = username
        self.__api_key = api_key
        self.__api_secret = api_secret

    # get timestamp as nonce
    def __nonce(self):
        self.__nonce_v = '{:.10f}'.format(time.time() * 1000).split('.')[0]

    # generate signature
    def __signature(self):
        string = self.__nonce_v + self.__username + self.__api_key  # create string
        signature = hmac.new(bytes(self.__api_secret, 'latin-1'), bytes(string, 'latin-1'),
                             digestmod=hashlib.sha256).hexdigest().upper()  # create signature
        return signature

    def __post(self, url, param):  # Post Request (Low Level API call)
        page = requests.post(url, param)
        return page.text

    def api_call(self, method, param={}, private=0,
                 couple=''):  # api call (Middle level)
        url = 'https://cex.io/api/' + method + '/'  # generate url
        if couple != '':
            url = url + couple + '/'  # set couple if needed
        if private == 1:  # add auth-data if needed
            self.__nonce()
            param.update({
                'key': self.__api_key,
                'signature': self.__signature(),
                'nonce': self.__nonce_v})
        answer = self.__post(url, param)  # Post Request
        return json.loads(answer)  # generate dict and return

    def ticker(self, couple='ETH/EUR'):
        return self.api_call('ticker', {}, 0, couple)

    def order_book(self, couple='ETH/EUR'):
        return self.api_call('order_book', {}, 0, couple)

    def trade_history(self, since=1, couple='ETH/EUR'):
        return self.api_call('trade_history', {"since": str(since)}, 0, couple)

    def balance(self):
        return self.api_call('balance', {}, 1)

    def current_orders(self, couple='ETH/EUR'):
        return self.api_call('open_orders', {}, 1, couple)

    def cancel_order(self, order_id):
        return self.api_call('cancel_order', {"id": order_id}, 1)

    def place_order(self, ptype='buy', amount=1, price=1, couple='ETH/EUR'):
        return self.api_call('place_order',
                             {"type": ptype, "amount": str(amount),
                              "price": str(price)}, 1, couple)

    def price_stats(self, last_hours, max_resp_arr_size, couple='ETH/EUR'):
        return self.api_call(
            'price_stats',
            {"lastHours": last_hours, "maxRespArrSize": max_resp_arr_size},
            0, couple)

    def converter(self, amount, couple='ETH/EUR'):
        return self.api_call('convert', {"amnt": amount}, 0, couple)
| 34.60241 | 94 | 0.581476 | 352 | 2,872 | 4.485795 | 0.292614 | 0.062065 | 0.074098 | 0.096897 | 0.166561 | 0.137429 | 0.137429 | 0.100697 | 0 | 0 | 0 | 0.014016 | 0.279596 | 2,872 | 82 | 95 | 35.02439 | 0.749154 | 0.095404 | 0 | 0 | 0 | 0 | 0.0964 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.233333 | false | 0 | 0.083333 | 0.15 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
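The private-call signing in the `cexapi.py` row above is plain HMAC-SHA256 over the concatenation nonce + username + api_key, upper-cased. A minimal stdlib sketch of that scheme in isolation (the function name `sign_request` and the credentials are made-up placeholders, not real values or part of the original API):

```python
import hashlib
import hmac
import time

def sign_request(username, api_key, api_secret, nonce=None):
    """HMAC-SHA256 over nonce + username + api_key, returned as upper-case hex."""
    if nonce is None:
        # millisecond timestamp as a monotonically increasing nonce
        nonce = '{:.10f}'.format(time.time() * 1000).split('.')[0]
    message = nonce + username + api_key
    digest = hmac.new(api_secret.encode('latin-1'),
                      message.encode('latin-1'),
                      digestmod=hashlib.sha256).hexdigest().upper()
    return nonce, digest

nonce, sig = sign_request('demo_user', 'demo_key', 'demo_secret')
assert len(sig) == 64          # SHA-256 hex digest length
assert sig == sig.upper()      # the exchange expects upper-case hex
```

For a fixed nonce the signature is deterministic, which is what lets the server recompute and verify it.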
a855b0f2caaa009080c63d609278957a43507fa9 | 4,484 | py | Python | Fujitsu/benchmarks/resnet/implementations/mxnet/3rdparty/tvm/tests/python/relay/test_op_level4.py | mengkai94/training_results_v0.6 | 43dc3e250f8da47b5f8833197d74cb8cf1004fc9 | [
"Apache-2.0"
] | 64 | 2021-05-02T14:42:34.000Z | 2021-05-06T01:35:03.000Z | Fujitsu/benchmarks/resnet/implementations/mxnet/3rdparty/tvm/tests/python/relay/test_op_level4.py | mengkai94/training_results_v0.6 | 43dc3e250f8da47b5f8833197d74cb8cf1004fc9 | [
"Apache-2.0"
] | 23 | 2019-07-29T05:21:52.000Z | 2020-08-31T18:51:42.000Z | Fujitsu/benchmarks/resnet/implementations/mxnet/3rdparty/tvm/tests/python/relay/test_op_level4.py | mengkai94/training_results_v0.6 | 43dc3e250f8da47b5f8833197d74cb8cf1004fc9 | [
"Apache-2.0"
] | 51 | 2019-07-12T05:10:25.000Z | 2021-07-28T16:19:06.000Z | import tvm
import numpy as np
from tvm import relay
from tvm.relay.ir_pass import infer_type
from tvm.relay.ir_builder import IRBuilder, func_type
from tvm.relay.ir_builder import scalar_type, convert, tensor_type
from tvm.relay.env import Environment
def assert_has_type(expr, typ, env=Environment({})):
    checked_expr = infer_type(env, expr)
    checked_type = checked_expr.checked_type
    if checked_type != typ:
        raise RuntimeError("Type mismatch %s vs %s" % (
            checked_type, typ))


def test_cmp_type():
    for op in (relay.greater,
               relay.greater_equal,
               relay.less,
               relay.less_equal,
               relay.equal,
               relay.not_equal):
        ib = relay.ir_builder.IRBuilder()
        x = ib.param("x", relay.TensorType((10, 4), "float32"))
        y = ib.param("y", relay.TensorType((5, 10, 1), "float32"))
        with ib.function(x, y) as func:
            ib.ret(op(x.var, y.var))
        ib.ret(func)
        func = relay.ir_pass.infer_type(ib.env, func.to_func())
        ftype = func.checked_type
        assert ftype.ret_type == relay.TensorType((5, 10, 4), "uint1")


def test_binary_broadcast():
    for op in [relay.right_shift,
               relay.left_shift,
               relay.maximum]:
        ib = relay.ir_builder.IRBuilder()
        x = ib.param("x", relay.TensorType((10, 4), "int32"))
        y = ib.param("y", relay.TensorType((5, 10, 1), "int32"))
        with ib.function(x, y) as func:
            ib.ret(op(x.var, y.var))
        ib.ret(func)
        func = relay.ir_pass.infer_type(ib.env, func.to_func())
        ftype = func.checked_type
        assert ftype.ret_type == relay.TensorType((5, 10, 4), "int32")


def test_binary_op():
    def check_binary_op(opfunc):
        """
        Program:
            fn (x, y) {
                return x <op> y;
            }
        """
        b = IRBuilder()
        x = b.param('x', tensor_type(5, 5, 5))
        y = b.param('y', tensor_type(5, 5, 5))
        with b.function(x, y) as func:
            b.ret(opfunc(x.var, y.var))
        b.ret(func)
        prog, env = b.get()
        ttype = tensor_type(5, 5, 5)
        expected_ty = func_type([ttype, ttype], ttype)
        assert_has_type(func.to_func(), expected_ty)

    for opfunc in [relay.pow]:
        check_binary_op(opfunc)


def test_binary_broadcast_op():
    def check_binary_broadcast_op(opfunc):
        """
        Program:
            fn (x: Tensor[(10, 4), f32], y: Tensor[(5, 10, 1), f32]) -> Tensor[(5, 10, 4), f32] {
                return x <op> y;
            }
        """
        b = IRBuilder()
        x = b.param('x', tensor_type(10, 4))
        y = b.param('y', tensor_type(5, 10, 1))
        with b.function(x, y) as func:
            b.ret(opfunc(x.var, y.var))
        b.ret(func)
        prog, env = b.get()
        expected_ty = func_type([tensor_type(10, 4), tensor_type(5, 10, 1)],
                                tensor_type(5, 10, 4))
        assert_has_type(func.to_func(), expected_ty)

    for opfunc in [relay.pow]:
        check_binary_broadcast_op(opfunc)


def test_cmp_type():
    for op in (relay.greater,
               relay.greater_equal,
               relay.less,
               relay.less_equal,
               relay.equal,
               relay.not_equal):
        ib = relay.ir_builder.IRBuilder()
        x = ib.param("x", relay.TensorType((10, 4), "float32"))
        y = ib.param("y", relay.TensorType((5, 10, 1), "float32"))
        with ib.function(x, y) as func:
            ib.ret(op(x.var, y.var))
        ib.ret(func)
        func = relay.ir_pass.infer_type(ib.env, func.to_func())
        ftype = func.checked_type
        assert ftype.ret_type == relay.TensorType((5, 10, 4), "uint1")


def test_binary_broadcast():
    for op in [relay.right_shift,
               relay.left_shift,
               relay.maximum,
               relay.minimum]:
        ib = relay.ir_builder.IRBuilder()
        x = ib.param("x", relay.TensorType((10, 4), "int32"))
        y = ib.param("y", relay.TensorType((5, 10, 1), "int32"))
        with ib.function(x, y) as func:
            ib.ret(op(x.var, y.var))
        ib.ret(func)
        func = relay.ir_pass.infer_type(ib.env, func.to_func())
        ftype = func.checked_type
        assert ftype.ret_type == relay.TensorType((5, 10, 4), "int32")


if __name__ == "__main__":
    test_cmp_type()
    test_binary_broadcast()
    test_binary_op()
    test_binary_broadcast_op()
| 33.714286 | 97 | 0.557984 | 631 | 4,484 | 3.790808 | 0.126783 | 0.016304 | 0.053512 | 0.060201 | 0.758361 | 0.713211 | 0.713211 | 0.671405 | 0.671405 | 0.671405 | 0 | 0.035032 | 0.299732 | 4,484 | 132 | 98 | 33.969697 | 0.726752 | 0.041258 | 0 | 0.666667 | 0 | 0 | 0.026166 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.085714 | false | 0.047619 | 0.066667 | 0 | 0.152381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
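The shape assertions in the tvm test row above (operands of shape (10, 4) and (5, 10, 1) producing (5, 10, 4)) follow the usual numpy-style broadcasting rule: right-align the shapes, pad the shorter one with 1s, and each aligned pair must either match or contain a 1. A stdlib sketch of that rule (the helper name `broadcast_shape` is ours):

```python
def broadcast_shape(a, b):
    """Numpy-style broadcast of two shapes: right-align, pad with 1s,
    each dimension pair must match or contain a 1; result takes the max."""
    a, b = tuple(a), tuple(b)
    a = (1,) * (len(b) - len(a)) + a  # pad the shorter shape with leading 1s
    b = (1,) * (len(a) - len(b)) + b
    result = []
    for x, y in zip(a, b):
        if x != y and 1 not in (x, y):
            raise ValueError("incompatible shapes")
        result.append(max(x, y))
    return tuple(result)

# The case asserted by the tests above:
assert broadcast_shape((10, 4), (5, 10, 1)) == (5, 10, 4)
```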
a86962e651316a79f083b17d24ecf32b3747966f | 3,210 | py | Python | cishouseholds/pipeline/survey_responses_version_1_ETL.py | ONS-SST/cis_households | e475df5929e6763a46cd05aff1f7e960ccbe8e21 | [
"MIT"
] | null | null | null | cishouseholds/pipeline/survey_responses_version_1_ETL.py | ONS-SST/cis_households | e475df5929e6763a46cd05aff1f7e960ccbe8e21 | [
"MIT"
] | 252 | 2021-05-19T11:12:43.000Z | 2022-03-02T10:39:10.000Z | cishouseholds/pipeline/survey_responses_version_1_ETL.py | ONS-SST/cis_households | e475df5929e6763a46cd05aff1f7e960ccbe8e21 | [
"MIT"
] | null | null | null | from pyspark.sql import DataFrame
from cishouseholds.edit import update_column_values_from_map
def transform_survey_responses_version_1_delta(df: DataFrame) -> DataFrame:
    """
    Call functions to process input for iqvia version 1 survey deltas.
    """
    df = update_column_values_from_map(
        df=df,
        column="work_status_v1",
        map={
            "Employed and currently working": "Employed",  # noqa: E501
            "Employed and currently not working": "Furloughed (temporarily not working)",  # noqa: E501
            "Self-employed and currently not working": "Furloughed (temporarily not working)",  # noqa: E501
            "Retired": "Not working (unemployed, retired, long-term sick etc.)",  # noqa: E501
            "Not working and not looking for work": "Not working (unemployed, retired, long-term sick etc.)",  # noqa: E501,W503
            "5y and older in full-time education": "Student",  # noqa: E501
            "Child under 5y attending child care": "Child under 4-5y attending child care",  # noqa: E501
            "Child under 5y attending nursery or pre-school or childminder": "Child under 4-5y attending child care",  # noqa: E501
            "Child under 4-5y attending nursery or pre-school or childminder": "Child under 4-5y attending child care",  # noqa: E501
            "Child under 5y not attending nursery or pre-school or childminder": "Child under 4-5y not attending child care",  # noqa: E501
            "Child under 5y not attending child care": "Child under 4-5y not attending child care",  # noqa: E501
            "Child under 4-5y not attending nursery or pre-school or childminder": "Child under 4-5y not attending child care",  # noqa: E501
            "4-5y and older at school/home-school (including if temporarily absent)": "4-5y and older at school/home-school",  # noqa: E501
            "Employed and currently not working (e.g. on leave due to the COVID-19 pandemic (furloughed) or sick leave for 4 weeks or longer or maternity/paternity leave)": "Employed and currently not working",  # noqa: E501
            "Employed and currently working (including if on leave or sick leave for less than 4 weeks)": "Employed and currently working",  # noqa: E501
            "Not in paid work and not looking for paid work (include doing voluntary work here)": "Not working and not looking for work",  # noqa: E501
            "Self-employed and currently not working (e.g. on leave due to the COVID-19 pandemic or sick leave for 4 weeks or longer or maternity/paternity leave)": "Self-employed and currently not working",  # noqa: E501
            "Self-employed and currently not working (e.g. on leave due to the COVID-19 pandemic (furloughed) or sick leave for 4 weeks or longer or maternity/paternity leave)": "Self-employed and currently not working",  # noqa: E501
            "Self-employed and currently working (include if on leave or sick leave for less than 4 weeks)": "Self-employed and currently working",  # noqa: E501
            "Looking for paid work and able to start": "Looking for paid work and able to start",  # noqa: E501
            "Participant Would Not/Could Not Answer": None,
        },
    )
    return df
| 82.307692 | 234 | 0.681308 | 458 | 3,210 | 4.742358 | 0.213974 | 0.073665 | 0.119705 | 0.084715 | 0.792818 | 0.755525 | 0.714088 | 0.649171 | 0.593002 | 0.592081 | 0 | 0.041582 | 0.235826 | 3,210 | 38 | 235 | 84.473684 | 0.843865 | 0.090966 | 0 | 0 | 0 | 0.096774 | 0.736039 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032258 | false | 0 | 0.064516 | 0 | 0.129032 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
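Outside Spark, the column recoding performed in the survey-transform row above boils down to a per-row dict lookup. A plain-Python sketch of that behavior (the helper name `recode` and the pass-through-when-unmapped default are our assumptions for illustration, not taken from the cishouseholds helper):

```python
def recode(values, mapping):
    """Map each value through `mapping`; values absent from the map pass through unchanged."""
    return [mapping.get(v, v) for v in values]

# A small excerpt of the work_status_v1 map from the transform above.
work_status_map = {
    "Employed and currently working": "Employed",
    "Retired": "Not working (unemployed, retired, long-term sick etc.)",
    "Participant Would Not/Could Not Answer": None,  # explicit null, not a missing key
}

column = ["Retired", "Student", "Participant Would Not/Could Not Answer"]
recoded = recode(column, work_status_map)
# "Student" is not a key here, so it survives unchanged.
assert recoded[1] == "Student"
```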
a8839f47f0607984bb6bf33ad2644bd86b6beb66 | 142 | py | Python | app/Http/python/report/report.py | asavadorndeja/sttools | 97154c20abfc9181420de82585af9a44b2e7f50a | [
"MIT"
] | null | null | null | app/Http/python/report/report.py | asavadorndeja/sttools | 97154c20abfc9181420de82585af9a44b2e7f50a | [
"MIT"
] | null | null | null | app/Http/python/report/report.py | asavadorndeja/sttools | 97154c20abfc9181420de82585af9a44b2e7f50a | [
"MIT"
] | null | null | null | from docx import Document
document = Document()
paragraph = document.add_paragraph('Lorem ipsum dolor sit amet.')
document.save('test.docx')
| 23.666667 | 65 | 0.774648 | 19 | 142 | 5.736842 | 0.684211 | 0.293578 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112676 | 142 | 5 | 66 | 28.4 | 0.865079 | 0 | 0 | 0 | 0 | 0 | 0.253521 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a88c4ef527adf6b8ae1d513fbe9ac0efdd5e0f50 | 132 | py | Python | tests/port_tests/local_minimum_list_tests/test_initialization.py | synapticarbors/wagyu | b98354611dceda8888f2951e9704f843a4e88c27 | [
"MIT"
] | 1 | 2021-01-20T05:49:13.000Z | 2021-01-20T05:49:13.000Z | tests/port_tests/local_minimum_list_tests/test_initialization.py | synapticarbors/wagyu | b98354611dceda8888f2951e9704f843a4e88c27 | [
"MIT"
] | 1 | 2020-11-20T18:21:24.000Z | 2020-11-20T18:21:37.000Z | tests/port_tests/local_minimum_list_tests/test_initialization.py | synapticarbors/wagyu | b98354611dceda8888f2951e9704f843a4e88c27 | [
"MIT"
] | 2 | 2020-11-20T18:17:31.000Z | 2021-01-20T14:58:22.000Z | from wagyu.local_minimum import LocalMinimumList
def test_basic() -> None:
    result = LocalMinimumList()
    assert not result
| 16.5 | 48 | 0.742424 | 15 | 132 | 6.4 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189394 | 132 | 7 | 49 | 18.857143 | 0.897196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a8acf5cdaf0becf37a0b507e99ea7dbc51b7249b | 1,514 | py | Python | carball/analysis/stats/stats_list.py | Druthyn/carball | 924ba75d2a60d069a5722a3d2a98814765d79197 | [
"Apache-2.0"
] | null | null | null | carball/analysis/stats/stats_list.py | Druthyn/carball | 924ba75d2a60d069a5722a3d2a98814765d79197 | [
"Apache-2.0"
] | null | null | null | carball/analysis/stats/stats_list.py | Druthyn/carball | 924ba75d2a60d069a5722a3d2a98814765d79197 | [
"Apache-2.0"
] | null | null | null | from typing import List
from ...analysis.stats.possession.ball_distances import BallDistanceStat
from ...analysis.stats.tendencies.hit_counts import HitCountStat
from ...analysis.stats.ball_forward.distance_hit_ball_forward import DistanceStats
from ...analysis.stats.boost.boost import BoostStat
from ...analysis.stats.possession.possession import PossessionStat
from ...analysis.stats.possession.turnovers import TurnoverStat
from ...analysis.stats.stats import BaseStat, HitStat
from ...analysis.stats.tendencies.tendencies import TendenciesStat
class StatsList:
    """
    Where you add any extra stats you want calculated.
    """

    @staticmethod
    def get_player_stats() -> List[BaseStat]:
        """These are stats that end up being assigned to a specific player"""
        return [BoostStat(),
                TendenciesStat(),
                BallDistanceStat(),
                ]

    @staticmethod
    def get_team_stats() -> List[BaseStat]:
        """These are stats that end up being assigned to a specific team"""
        return [PossessionStat()]

    @staticmethod
    def get_general_stats() -> List[BaseStat]:
        """These are stats that end up being assigned to the game as a whole"""
        return []

    @staticmethod
    def get_hit_stats() -> List[HitStat]:
        """These are stats that depend on current hit and next hit"""
        return [DistanceStats(),
                PossessionStat(),
                HitCountStat(),
                TurnoverStat()
                ]
| 34.409091 | 82 | 0.669089 | 166 | 1,514 | 6.018072 | 0.361446 | 0.096096 | 0.136136 | 0.068068 | 0.18018 | 0.18018 | 0.18018 | 0.18018 | 0.18018 | 0.18018 | 0 | 0 | 0.23712 | 1,514 | 43 | 83 | 35.209302 | 0.864935 | 0.19683 | 0 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.137931 | true | 0 | 0.310345 | 0 | 0.62069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
a8bb61012c18672e5e35e12db9b92f5f395a5092 | 63 | py | Python | pipeline/__init__.py | BioroboticsLab/deeppipeline | 7e60e410b3a1f4d8a65924033f13dbf0ecdd3e94 | [
"Apache-2.0"
] | 4 | 2017-02-27T16:21:09.000Z | 2020-12-10T13:42:23.000Z | pipeline/__init__.py | BioroboticsLab/deeppipeline | 7e60e410b3a1f4d8a65924033f13dbf0ecdd3e94 | [
"Apache-2.0"
] | 5 | 2016-04-30T16:39:04.000Z | 2016-07-19T13:24:10.000Z | pipeline/__init__.py | BioroboticsLab/deeppipeline | 7e60e410b3a1f4d8a65924033f13dbf0ecdd3e94 | [
"Apache-2.0"
] | 4 | 2016-12-29T04:33:22.000Z | 2020-02-04T15:26:10.000Z | from pipeline.pipeline import Pipeline
__all__ = ["Pipeline"]
| 15.75 | 38 | 0.777778 | 7 | 63 | 6.428571 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126984 | 63 | 3 | 39 | 21 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.126984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
a8d17cff9d03c9b5f0bbfc31454729949c4905b3 | 486 | py | Python | main/admin.py | SkyFlame00/assf | 807b409716f7cff0e322eb47ffc79532a437d566 | [
"MIT"
] | null | null | null | main/admin.py | SkyFlame00/assf | 807b409716f7cff0e322eb47ffc79532a437d566 | [
"MIT"
] | null | null | null | main/admin.py | SkyFlame00/assf | 807b409716f7cff0e322eb47ffc79532a437d566 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import *
#admin.site.unregister(User)
admin.site.register(User)
admin.site.register(University)
admin.site.register(Student)
admin.site.register(SubjectArea)
admin.site.register(Competency)
admin.site.register(UniversityProgram)
admin.site.register(Company)
admin.site.register(StudentEducation)
admin.site.register(StudentJob)
admin.site.register(StudentCompetency)
admin.site.register(Vacancy)
admin.site.register(SubjectAreaVacancies)
| 27 | 41 | 0.837449 | 60 | 486 | 6.783333 | 0.35 | 0.287469 | 0.501229 | 0.103194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047325 | 486 | 17 | 42 | 28.588235 | 0.87905 | 0.055556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a8d1a0d2a446a497847c615a74555505a5c4576c | 67 | py | Python | download/gnosis/util/__init__.py | DavidMertz/gnosis-web | df5ab0d470a02ed6c51c971f2b40f1dcdc7d7350 | [
"CC0-1.0"
] | 3 | 2018-09-29T14:14:28.000Z | 2022-01-05T03:45:50.000Z | library/CanFestival/objdictgen/gnosis/util/__init__.py | Lembed/STM32duino-gcc-Projects | 67829e9cd1388601daf9815b0561da557e0b9f82 | [
"MIT"
] | 1 | 2017-06-17T08:15:28.000Z | 2017-06-17T08:15:28.000Z | library/CanFestival/objdictgen/gnosis/util/__init__.py | Lembed/STM32duino-gcc-Projects | 67829e9cd1388601daf9815b0561da557e0b9f82 | [
"MIT"
] | 1 | 2019-12-08T15:11:55.000Z | 2019-12-08T15:11:55.000Z | __doc__ = """Utilities for manipulating text and XML documents"""
| 22.333333 | 65 | 0.746269 | 8 | 67 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149254 | 67 | 2 | 66 | 33.5 | 0.807018 | 0 | 0 | 0 | 0 | 0 | 0.742424 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a8d4cb7ba4b4a2da6c7cceb6a066b8d3f629af23 | 2,610 | py | Python | guillotina/json/serialize_value.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 173 | 2017-03-10T18:26:12.000Z | 2022-03-03T06:48:56.000Z | guillotina/json/serialize_value.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 921 | 2017-03-08T14:04:43.000Z | 2022-03-30T10:28:56.000Z | guillotina/json/serialize_value.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 60 | 2017-03-16T19:59:44.000Z | 2022-03-03T06:48:59.000Z | # -*- coding: utf-8 -*-
from datetime import date
from datetime import datetime
from datetime import time
from datetime import timedelta
from decimal import Decimal
from guillotina import configure
from guillotina.component import query_adapter
from guillotina.i18n import Message
from guillotina.interfaces import IValueToJson
from guillotina.profile import profilable
from guillotina.schema.vocabulary import SimpleVocabulary
_MISSING = object()
@profilable
def json_compatible(value):
    if value is None:
        return value
    type_ = type(value)
    if type_ in (str, bool, int, float):
        return value
    result_value = query_adapter(value, IValueToJson, default=_MISSING)
    if result_value is _MISSING:
        raise TypeError("No converter for making" " {0!r} ({1}) JSON compatible.".format(value, type(value)))
    else:
        return result_value


@configure.value_serializer(SimpleVocabulary)
def vocabulary_converter(value):
    return [x.token for x in value]


@configure.value_serializer(str)
def string_converter(value):
    return str(value)


@configure.value_serializer(bytes)
def bytes_converter(value):
    return str(value, encoding="utf-8")


@configure.value_serializer(list)
def list_converter(value):
    return list(map(json_compatible, value))


@configure.value_serializer(tuple)
def tuple_converter(value):
    return list(map(json_compatible, value))


@configure.value_serializer(frozenset)
def frozenset_converter(value):
    return list(map(json_compatible, value))


@configure.value_serializer(set)
def set_converter(value):
    return list(map(json_compatible, value))


@configure.value_serializer(dict)
def dict_converter(value):
    if value == {}:
        return {}
    keys, values = zip(*value.items())
    keys = map(json_compatible, keys)
    values = map(json_compatible, values)
    return dict(zip(keys, values))


@configure.value_serializer(datetime)
def python_datetime_converter(value):
    try:
        return value.isoformat()
    except AttributeError:  # handle date problems
        return None


@configure.value_serializer(date)
def date_converter(value):
    return value.isoformat()


@configure.value_serializer(time)
def time_converter(value):
    return value.isoformat()


@configure.value_serializer(timedelta)
def timedelta_converter(value):
    return value.total_seconds()


@configure.value_serializer(Message)
def i18n_message_converter(value):
    # TODO:
    # value = translate(value, context=getRequest())
    return value


@configure.value_serializer(Decimal)
def decimal_converter(value):
    return str(value)
| 23.097345 | 109 | 0.749042 | 320 | 2,610 | 5.959375 | 0.25625 | 0.102779 | 0.176193 | 0.121657 | 0.251704 | 0.207656 | 0.207656 | 0.207656 | 0.146827 | 0.146827 | 0 | 0.003636 | 0.157088 | 2,610 | 112 | 110 | 23.303571 | 0.863182 | 0.036398 | 0 | 0.162162 | 0 | 0 | 0.022709 | 0 | 0 | 0 | 0 | 0.008929 | 0 | 1 | 0.202703 | false | 0 | 0.148649 | 0.162162 | 0.608108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
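The adapter-based dispatch in the `serialize_value.py` row above (a passthrough for primitives, per-type converters, and a `TypeError` fallback) can be approximated in the standard library with `functools.singledispatch`. A minimal sketch, where singledispatch stands in for guillotina's component registry (that substitution is our simplification, not how guillotina works internally):

```python
import functools
from datetime import datetime
from decimal import Decimal

@functools.singledispatch
def to_json(value):
    # Base case: no registered converter, mirroring the _MISSING branch above.
    raise TypeError("No converter for making {0!r} ({1}) JSON compatible.".format(value, type(value)))

@to_json.register(type(None))
@to_json.register(str)
@to_json.register(bool)
@to_json.register(int)
@to_json.register(float)
def _passthrough(value):
    return value  # already JSON compatible

@to_json.register(datetime)
def _datetime(value):
    return value.isoformat()

@to_json.register(Decimal)
def _decimal(value):
    return str(value)

@to_json.register(set)
@to_json.register(frozenset)
@to_json.register(tuple)
@to_json.register(list)
def _sequence(value):
    return [to_json(v) for v in value]  # recurse into elements
```

Usage: `to_json(Decimal("1.5"))` yields the string `"1.5"`, while an unregistered type such as a bare `object()` raises `TypeError`, matching the behavior of `json_compatible` above.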
a8d736adfea0f5cf1f91ed53d809275e9235e323 | 616 | py | Python | maths/binary_exponentiation.py | KirilBangachev/Python | 7ad45a46e02edda86a45969de8768f26ef44b306 | [
"MIT"
] | 3 | 2019-10-11T20:50:59.000Z | 2020-01-21T02:10:10.000Z | maths/binary_exponentiation.py | KirilBangachev/Python | 7ad45a46e02edda86a45969de8768f26ef44b306 | [
"MIT"
] | 1 | 2019-09-01T06:43:06.000Z | 2019-09-01T06:44:55.000Z | maths/binary_exponentiation.py | KirilBangachev/Python | 7ad45a46e02edda86a45969de8768f26ef44b306 | [
"MIT"
] | 3 | 2020-08-05T02:33:42.000Z | 2020-10-13T02:47:01.000Z | """Binary Exponentiation."""
# Author : Junth Basnet
# Time Complexity : O(logn)
def binary_exponentiation(a, n):
    if n == 0:
        return 1
    elif n % 2 == 1:
        return binary_exponentiation(a, n - 1) * a
    else:
        b = binary_exponentiation(a, n // 2)  # integer division; n / 2 yields a float in Python 3
        return b * b


if __name__ == "__main__":
    try:
        BASE = int(input("Enter Base : ").strip())
        POWER = int(input("Enter Power : ").strip())
    except ValueError:
        print("Invalid literal for integer")

    RESULT = binary_exponentiation(BASE, POWER)
    print("{}^({}) : {}".format(BASE, POWER, RESULT))
| 21.241379 | 53 | 0.568182 | 73 | 616 | 4.630137 | 0.520548 | 0.295858 | 0.186391 | 0.195266 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013453 | 0.275974 | 616 | 28 | 54 | 22 | 0.744395 | 0.11526 | 0 | 0 | 0 | 0 | 0.137546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0 | 0 | 0.25 | 0.125 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
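The recursive O(log n) scheme in the row above can also be written iteratively (square-and-multiply), which sidesteps Python's recursion limit for very large exponents; a sketch:

```python
def binary_exponentiation_iterative(a, n):
    """Compute a**n with O(log n) multiplications by scanning the bits of n."""
    result = 1
    while n > 0:
        if n % 2 == 1:   # low bit set: fold the current power of a into the result
            result *= a
        a *= a           # square the base for the next bit
        n //= 2          # move to the next bit of the exponent
    return result

print(binary_exponentiation_iterative(3, 10))  # → 59049
```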
764500292880b940ffedf37330fd4fdeaba8b412 | 267 | py | Python | rename_to_project_and_finish/software/light_software.py | vasetousa/OOP | e4fedc497dd149c9800613ea11846e0e770d122c | [
"MIT"
] | null | null | null | rename_to_project_and_finish/software/light_software.py | vasetousa/OOP | e4fedc497dd149c9800613ea11846e0e770d122c | [
"MIT"
] | null | null | null | rename_to_project_and_finish/software/light_software.py | vasetousa/OOP | e4fedc497dd149c9800613ea11846e0e770d122c | [
"MIT"
] | null | null | null | from rename_to_project_and_finish.software.software import Software
class LightSoftware(Software):
    def __init__(self, name, capacity_consumption, memory_consumption):
        super().__init__(name, "Light", capacity_consumption * 1.5, memory_consumption * 0.5)
| 38.142857 | 93 | 0.782772 | 33 | 267 | 5.848485 | 0.666667 | 0.196891 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017094 | 0.123596 | 267 | 6 | 94 | 44.5 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0.018727 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
7650577452e0cf4ff18f847e249769d2b41e28d9 | 229,370 | py | Python | pysnmp-with-texts/Nortel-MsCarrier-MscPassport-AtmNetworkingMIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/Nortel-MsCarrier-MscPassport-AtmNetworkingMIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/Nortel-MsCarrier-MscPassport-AtmNetworkingMIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module Nortel-MsCarrier-MscPassport-AtmNetworkingMIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/Nortel-MsCarrier-MscPassport-AtmNetworkingMIB
# Produced by pysmi-0.3.4 at Wed May 1 14:29:08 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, Integer, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "OctetString", "Integer", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsUnion, ValueSizeConstraint, ValueRangeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsIntersection")
mscAtmIfVptVcc, mscAtmIfVpc, mscAtmIfVpcIndex, mscAtmIfIndex, mscAtmIfVptVccIndex, mscAtmIfVcc, mscAtmIfVptIndex, mscAtmIfVccIndex = mibBuilder.importSymbols("Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVcc", "mscAtmIfVpc", "mscAtmIfVpcIndex", "mscAtmIfIndex", "mscAtmIfVptVccIndex", "mscAtmIfVcc", "mscAtmIfVptIndex", "mscAtmIfVccIndex")
Unsigned32, StorageType, DisplayString, RowStatus, Counter32, RowPointer, Integer32, Gauge32 = mibBuilder.importSymbols("Nortel-MsCarrier-MscPassport-StandardTextualConventionsMIB", "Unsigned32", "StorageType", "DisplayString", "RowStatus", "Counter32", "RowPointer", "Integer32", "Gauge32")
HexString, AsciiString, NonReplicated, AsciiStringIndex, IntegerSequence, FixedPoint1 = mibBuilder.importSymbols("Nortel-MsCarrier-MscPassport-TextualConventionsMIB", "HexString", "AsciiString", "NonReplicated", "AsciiStringIndex", "IntegerSequence", "FixedPoint1")
mscPassportMIBs, mscComponents = mibBuilder.importSymbols("Nortel-MsCarrier-MscPassport-UsefulDefinitionsMIB", "mscPassportMIBs", "mscComponents")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
ObjectIdentity, Unsigned32, NotificationType, IpAddress, Gauge32, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, MibIdentifier, Counter32, Counter64, TimeTicks, ModuleIdentity, Integer32, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "Unsigned32", "NotificationType", "IpAddress", "Gauge32", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "MibIdentifier", "Counter32", "Counter64", "TimeTicks", "ModuleIdentity", "Integer32", "Bits")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
atmNetworkingMIB = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42))
mscARtg = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95))
mscARtgRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1), )
if mibBuilder.loadTexts: mscARtgRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgRowStatusTable.setDescription('This entry controls the addition and deletion of mscARtg components.')
mscARtgRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"))
if mibBuilder.loadTexts: mscARtgRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgRowStatusEntry.setDescription('A single entry in the table represents a single mscARtg component.')
mscARtgRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtg components. These components can be added and deleted.')
mscARtgComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgStorageType.setDescription('This variable represents the storage type value for the mscARtg tables.')
mscARtgIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscARtgIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgIndex.setDescription('This variable represents the index for the mscARtg tables.')
mscARtgStatsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 10), )
if mibBuilder.loadTexts: mscARtgStatsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgStatsTable.setDescription('This group contains the statistical operational attributes of an ARtg component.')
mscARtgStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"))
if mibBuilder.loadTexts: mscARtgStatsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgStatsEntry.setDescription('An entry in the mscARtgStatsTable.')
mscARtgRoutingAttempts = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 10, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgRoutingAttempts.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgRoutingAttempts.setDescription('This attribute counts the total number of calls routed. The counter wraps when it exceeds the maximum value.')
mscARtgFailedRoutingAttempts = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 10, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgFailedRoutingAttempts.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgFailedRoutingAttempts.setDescription('This attribute counts the total number of calls which were not successfully routed. The counter wraps when it exceeds the maximum value.')
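The stats objects above are columnar: their OIDs are the stats entry's OID with the column sub-identifier appended, and a NonReplicated index names the single instance. A minimal sketch (not part of the generated module; the helper name, the assumption that the NonReplicated index instance is 0, and the variable names are illustrative only):

```python
# Hypothetical helper, not generated by mibdump: shows how the OID tuples
# declared above compose into instance OIDs for the mscARtgStats columns.

# Entry OID copied from mscARtgStatsEntry above.
MSC_ARTG_STATS_ENTRY = (1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 10, 1)

def instance_oid(entry_oid, column, index=(0,)):
    """Build the OID of one instance of a columnar object.

    entry_oid -- the MibTableRow OID tuple
    column    -- the column sub-identifier under the entry
    index     -- instance sub-identifiers (assumed (0,) for NonReplicated)
    """
    return entry_oid + (column,) + tuple(index)

# mscARtgRoutingAttempts is column 1 of the stats entry.
routing_attempts_inst = instance_oid(MSC_ARTG_STATS_ENTRY, 1)
```

Such an instance OID is what a manager would place in a GET request for the mscARtgRoutingAttempts counter.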
mscARtgDna = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2))
mscARtgDnaRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1), )
if mibBuilder.loadTexts: mscARtgDnaRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgDna components.')
mscARtgDnaRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgDnaIndex"))
if mibBuilder.loadTexts: mscARtgDnaRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgDna component.')
mscARtgDnaRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgDna components. These components cannot be added nor deleted.')
mscARtgDnaComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgDnaStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaStorageType.setDescription('This variable represents the storage type value for the mscARtgDna tables.')
mscARtgDnaIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 1, 1, 10), AsciiStringIndex().subtype(subtypeSpec=ValueSizeConstraint(1, 40)))
if mibBuilder.loadTexts: mscARtgDnaIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaIndex.setDescription('This variable represents the index for the mscARtgDna tables.')
mscARtgDnaDestInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2))
mscARtgDnaDestInfoRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1), )
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgDnaDestInfo components.')
mscARtgDnaDestInfoRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgDnaIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgDnaDestInfoIndex"))
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgDnaDestInfo component.')
mscARtgDnaDestInfoRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgDnaDestInfo components. These components cannot be added nor deleted.')
mscARtgDnaDestInfoComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgDnaDestInfoStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoStorageType.setDescription('This variable represents the storage type value for the mscARtgDnaDestInfo tables.')
mscARtgDnaDestInfoIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 50)))
if mibBuilder.loadTexts: mscARtgDnaDestInfoIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoIndex.setDescription('This variable represents the index for the mscARtgDnaDestInfo tables.')
mscARtgDnaDestInfoOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10), )
if mibBuilder.loadTexts: mscARtgDnaDestInfoOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the operational attributes for the DestInfo component.')
mscARtgDnaDestInfoOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgDnaIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgDnaDestInfoIndex"))
if mibBuilder.loadTexts: mscARtgDnaDestInfoOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoOperEntry.setDescription('An entry in the mscARtgDnaDestInfoOperTable.')
mscARtgDnaDestInfoType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))).clone(namedValues=NamedValues(("primary", 0), ("alternate", 1), ("registered", 2), ("default", 3), ("ebr", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoType.setDescription('This attribute indicates the type of the address at the destination interface. Provisioned addresses are assigned a type of primary or alternate; ATM routing will try primary routes and then the alternate routes if none of the primary routes succeed. The type registered is used for dynamic addresses registered through ILMI. The type default is used for Soft PVC addresses. The type ebr indicates addresses used by Edge Based Rerouting.')
mscARtgDnaDestInfoScope = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 104))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoScope.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoScope.setDescription('This attribute indicates the highest level (meaning the lowest level number) in the hierarchy that the address will be advertised to. A value of -1 indicates that the scope is not applicable since this node has not been configured as a PNNI node.')
mscARtgDnaDestInfoStdComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10, 1, 3), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoStdComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoStdComponentName.setDescription('This attribute represents a component name of the interface through which the address can be reached.')
mscARtgDnaDestInfoReachability = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 2, 2, 10, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("internal", 0), ("exterior", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgDnaDestInfoReachability.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgDnaDestInfoReachability.setDescription('This attribute indicates whether the address is internal or exterior.')
mscARtgPnni = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3))
mscARtgPnniRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1), )
if mibBuilder.loadTexts: mscARtgPnniRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRowStatusTable.setDescription('This entry controls the addition and deletion of mscARtgPnni components.')
mscARtgPnniRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnni component.')
mscARtgPnniRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnni components. These components can be added and deleted.')
mscARtgPnniComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniStorageType.setDescription('This variable represents the storage type value for the mscARtgPnni tables.')
mscARtgPnniIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscARtgPnniIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniIndex.setDescription('This variable represents the index for the mscARtgPnni tables.')
mscARtgPnniProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10), )
if mibBuilder.loadTexts: mscARtgPnniProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniProvTable.setDescription('This group contains the generic provisionable attributes of a Pnni component.')
mscARtgPnniProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniProvEntry.setDescription('An entry in the mscARtgPnniProvTable.')
mscARtgPnniNodeAddressPrefix = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 19))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniNodeAddressPrefix.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniNodeAddressPrefix.setDescription("This attribute specifies the ATM address of this node. It allows the default node address to be overridden. If this attribute is set to the null string, then the default node address prefix is assumed, and computed as follows: the value provisioned for the ModuleData component's nodePrefix attribute, followed by a unique MAC address (6 octets).")
mscARtgPnniDefaultScope = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 104))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniDefaultScope.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniDefaultScope.setDescription('This attribute specifies the default PNNI scope for ATM addresses associated with this node. The PNNI scope determines the level to which the address will be advertised within the PNNI routing domain. A provisioned Addr component may override the default scope in a PnniInfo subcomponent. A value of 0 means that all addresses which do not have provisioned scopes will be advertised globally within the PNNI routing domain. The value specified must be numerically smaller than or equal to that of the lowest level at which this node is configured in the PNNI hierarchy.')
mscARtgPnniDomain = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1, 3), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(1, 32)).clone(hexValue="31")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniDomain.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniDomain.setDescription('This attribute specifies the routing domain name. This attribute should be set identically for all nodes in the same routing domain.')
mscARtgPnniRestrictTransit = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("false", 0), ("true", 1))).clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRestrictTransit.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRestrictTransit.setDescription('This attribute specifies if the node should restrict tandeming of SVCs. If this attribute is set to true, then other lowest level nodes in the PNNI hierarchy will avoid traversing this node during route computation.')
mscARtgPnniMaxAlternateRoutesOnCrankback = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 10, 1, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 20)).clone(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniMaxAlternateRoutesOnCrankback.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniMaxAlternateRoutesOnCrankback.setDescription('This attribute specifies the number of alternate routing attempts before a call requiring crank back is rejected.')
mscARtgPnniPglParmsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 11), )
if mibBuilder.loadTexts: mscARtgPnniPglParmsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPglParmsTable.setDescription('This group contains the provisionable attributes for the peer group leader election timer parameters of a Pnni component.')
mscARtgPnniPglParmsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniPglParmsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPglParmsEntry.setDescription('An entry in the mscARtgPnniPglParmsTable.')
mscARtgPnniPglInitTime = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 11, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535)).clone(15)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniPglInitTime.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPglInitTime.setDescription('This attribute specifies how long this node will delay advertising its choice of preferred peer group leader after having initialized operation and reached the full peer state with at least one neighbor in the peer group.')
mscARtgPnniOverrideDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 11, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(30)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniOverrideDelay.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOverrideDelay.setDescription('This attribute specifies how long a node will wait for itself to be declared the preferred peer group leader by unanimous agreement among its peers.')
mscARtgPnniReElectionInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 11, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(15)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniReElectionInterval.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniReElectionInterval.setDescription('This attribute specifies how long this node will wait after losing connectivity to the current peer group leader before re-starting the process of electing a new peer group leader.')
mscARtgPnniHlParmsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 12), )
if mibBuilder.loadTexts: mscARtgPnniHlParmsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniHlParmsTable.setDescription('This group contains the default provisionable Hello protocol parameters.')
mscARtgPnniHlParmsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 12, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniHlParmsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniHlParmsEntry.setDescription('An entry in the mscARtgPnniHlParmsTable.')
mscARtgPnniHelloHoldDown = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 12, 1, 1), FixedPoint1().subtype(subtypeSpec=ValueRangeConstraint(1, 655350)).clone(10)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniHelloHoldDown.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniHelloHoldDown.setDescription('This attribute is used to limit the rate at which this node sends out Hello packets. Specifically, it specifies the default minimum amount of time between successive Hellos used by routing control channels on this node.')
mscARtgPnniHelloInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 12, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(15)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniHelloInterval.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniHelloInterval.setDescription('This attribute specifies the default duration of the Hello Timer in seconds for routing control channels on this node. Every helloInterval seconds, this node will send out a Hello packet to the neighbor node, subject to the helloHoldDown timer having expired at least once since the last Hello packet was sent.')
mscARtgPnniHelloInactivityFactor = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 12, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(5)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniHelloInactivityFactor.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniHelloInactivityFactor.setDescription('This attribute specifies the default number of Hello intervals allowed to pass without receiving a Hello from the neighbor node, before an attempt is made to re-stage, for routing control channels on this node. The hello inactivity timer is enabled in the oneWayInside, twoWayInside, oneWayOutside, twoWayOutside and commonOutside states (see the helloState attribute on the Rcc component for a description of these states). Note that the value for the Hello interval used in the calculation is the one specified in the Hello packet from the neighbor node.')
mscARtgPnniPtseParmsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13), )
if mibBuilder.loadTexts: mscARtgPnniPtseParmsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPtseParmsTable.setDescription('This group contains the provisionable attributes for the PTSE timer values of a Pnni component.')
mscARtgPnniPtseParmsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniPtseParmsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPtseParmsEntry.setDescription('An entry in the mscARtgPnniPtseParmsTable.')
mscARtgPnniPtseHoldDown = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1, 1), FixedPoint1().subtype(subtypeSpec=ValueRangeConstraint(1, 655350)).clone(10)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniPtseHoldDown.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPtseHoldDown.setDescription('This attribute is used to limit the rate at which this node sends out PTSE packets. Specifically, it specifies the minimum amount of time in seconds that this node must wait between sending successive PTSE packets.')
mscARtgPnniPtseRefreshInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(300, 65535)).clone(1800)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniPtseRefreshInterval.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPtseRefreshInterval.setDescription('This attribute specifies the duration of the PTSE Timer. Every ptseRefreshInterval seconds, this node will send out a self-originated PTSE packet to the neighbor node, subject to the ptseHoldDown timer having expired at least once since the last PTSE packet was sent.')
mscARtgPnniPtseLifetimeFactor = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(101, 1000)).clone(200)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniPtseLifetimeFactor.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPtseLifetimeFactor.setDescription('This attribute specifies the lifetime multiplier. The result of multiplying the ptseRefreshInterval by this value is used as the initial lifetime that this node places into PTSEs.')
mscARtgPnniRequestRxmtInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(5)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRequestRxmtInterval.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRequestRxmtInterval.setDescription('This attribute specifies the period between retransmissions of unacknowledged Database Summary packets, PTSE Request packets and PTSPs.')
mscARtgPnniPeerDelayedAckInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 13, 1, 5), FixedPoint1().subtype(subtypeSpec=ValueRangeConstraint(1, 655350)).clone(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniPeerDelayedAckInterval.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPeerDelayedAckInterval.setDescription('This attribute specifies the minimum amount of time between transmissions of delayed PTSE acknowledgment packets.')
mscARtgPnniThreshParmsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 14), )
if mibBuilder.loadTexts: mscARtgPnniThreshParmsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniThreshParmsTable.setDescription('This group contains the provisionable attributes for the change thresholds of an ARtg Pnni component.')
mscARtgPnniThreshParmsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 14, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniThreshParmsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniThreshParmsEntry.setDescription('An entry in the mscARtgPnniThreshParmsTable.')
mscARtgPnniAvcrMt = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 14, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 99)).clone(3)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniAvcrMt.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniAvcrMt.setDescription('This attribute when multiplied by the Maximum Cell Rate specifies the minimum threshold used in the algorithms that determine significant change for average cell rate parameters.')
mscARtgPnniAvcrPm = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 14, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 99)).clone(50)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniAvcrPm.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniAvcrPm.setDescription('This attribute when multiplied by the current Available Cell Rate specifies the threshold used in the algorithms that determine significant change for AvCR parameters. If the resulting threshold is lower than the minimum threshold, the minimum threshold will be used. Increasing the value of the attribute increases the range of insignificance and reduces the amount of PTSP flooding due to changes in resource availability.')
mscARtgPnniOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 15), )
if mibBuilder.loadTexts: mscARtgPnniOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOperTable.setDescription('This group contains the generic operational attributes of an ARtg Pnni component.')
mscARtgPnniOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 15, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOperEntry.setDescription('An entry in the mscARtgPnniOperTable.')
mscARtgPnniTopologyMemoryExhaustion = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 15, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("false", 0), ("true", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopologyMemoryExhaustion.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopologyMemoryExhaustion.setDescription('This attribute indicates if the topology database is overloaded. A node goes into a database overload state when it fails to store the complete topology database due to insufficient memory in the node. A node in this state performs resynchronization periodically by restarting all its Neighbor Peer Finite State Machines. The node will stay in this state until it synchronizes with all of its neighbors without any overload problems. When this attribute is set an alarm will be issued.')
mscARtgPnniStatsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 16), )
if mibBuilder.loadTexts: mscARtgPnniStatsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniStatsTable.setDescription('This group contains the statistical operational attributes of an ARtg Pnni component.')
mscARtgPnniStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 16, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"))
if mibBuilder.loadTexts: mscARtgPnniStatsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniStatsEntry.setDescription('An entry in the mscARtgPnniStatsTable.')
mscARtgPnniSuccessfulRoutingAttempts = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 16, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniSuccessfulRoutingAttempts.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniSuccessfulRoutingAttempts.setDescription('This attribute counts successful PNNI routing attempts. The counter wraps when it exceeds the maximum value.')
mscARtgPnniFailedRoutingAttempts = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 16, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniFailedRoutingAttempts.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniFailedRoutingAttempts.setDescription('This attribute counts failed PNNI routing attempts. The counter wraps when it exceeds the maximum value.')
mscARtgPnniAlternateRoutingAttempts = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 16, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniAlternateRoutingAttempts.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniAlternateRoutingAttempts.setDescription('This attribute counts successful PNNI alternate routing attempts. The counter wraps when it exceeds the maximum value.')
mscARtgPnniOptMetricTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 386), )
if mibBuilder.loadTexts: mscARtgPnniOptMetricTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOptMetricTable.setDescription('This attribute is a vector that specifies the optimization metric for each ATM service category. The optimization metric is used during Generic Connection Admission Control (GCAC) route computation. Setting the value to cdv for a particular service category will cause GCAC to optimize for cell delay variation on call setups requiring that service category. Setting the value to maxCtd for a particular service category will cause GCAC to optimize for maximum cell transfer delay on call setups requiring that service category. Setting the value to aw for a particular service category will cause GCAC to optimize for administrative weight on call setups requiring that service category.')
mscARtgPnniOptMetricEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 386, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniOptMetricIndex"))
if mibBuilder.loadTexts: mscARtgPnniOptMetricEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOptMetricEntry.setDescription('An entry in the mscARtgPnniOptMetricTable.')
mscARtgPnniOptMetricIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 386, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("cbr", 1), ("rtVbr", 2), ("nrtVbr", 3), ("ubr", 4))))
if mibBuilder.loadTexts: mscARtgPnniOptMetricIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOptMetricIndex.setDescription('This variable represents the mscARtgPnniOptMetricTable specific index for the mscARtgPnniOptMetricTable.')
mscARtgPnniOptMetricValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 386, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("cdv", 0), ("maxCtd", 1), ("aw", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniOptMetricValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniOptMetricValue.setDescription('This variable represents an individual value for the mscARtgPnniOptMetricTable.')
mscARtgPnniRf = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2))
mscARtgPnniRfRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1), )
if mibBuilder.loadTexts: mscARtgPnniRfRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRowStatusTable.setDescription('This table controls the addition and deletion of mscARtgPnniRf components.')
mscARtgPnniRfRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniRf component.')
mscARtgPnniRfRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniRfRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniRf components. These components cannot be added or deleted.')
mscARtgPnniRfComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniRfComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniRfStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniRfStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniRf tables.')
mscARtgPnniRfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscARtgPnniRfIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfIndex.setDescription('This variable represents the index for the mscARtgPnniRf tables.')
mscARtgPnniRfCriteriaTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10), )
if mibBuilder.loadTexts: mscARtgPnniRfCriteriaTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfCriteriaTable.setDescription('This group contains the attributes specifying the routing criteria for the route computation.')
mscARtgPnniRfCriteriaEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfCriteriaEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfCriteriaEntry.setDescription('An entry in the mscARtgPnniRfCriteriaTable.')
mscARtgPnniRfDestinationAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(1, 20))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfDestinationAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfDestinationAddress.setDescription('This attribute specifies the destination NSAP address to be used for the computation. If this attribute specifies an invalid address then no routes will be found.')
mscARtgPnniRfMaxRoutes = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 15)).clone(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfMaxRoutes.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfMaxRoutes.setDescription('This attribute specifies a ceiling on the number of routes to be computed.')
mscARtgPnniRfTxTrafficDescType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("n1", 1), ("n2", 2), ("n3", 3), ("n4", 4), ("n5", 5), ("n6", 6), ("n7", 7), ("n8", 8))).clone('n1')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfTxTrafficDescType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTxTrafficDescType.setDescription('This attribute specifies the type of traffic management which is applied to the transmit direction as defined in the ATM Forum. The txTrafficDescType determines the number and meaning of the parameters in the txTrafficDescParm attribute.')
mscARtgPnniRfRxTrafficDescType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 15))).clone(namedValues=NamedValues(("n1", 1), ("n2", 2), ("n3", 3), ("n4", 4), ("n5", 5), ("n6", 6), ("n7", 7), ("n8", 8), ("sameAsTx", 15))).clone('sameAsTx')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfRxTrafficDescType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRxTrafficDescType.setDescription('This attribute specifies the type of traffic management which is applied to the receive direction of this connection as defined in the ATM Forum. The rxTrafficDescType determines the number and meaning of the parameters in the rxTrafficDescParm attribute. When sameAsTx is selected, the rxTrafficDescType as well as the rxTrafficDescParm are taken from the transmit values.')
mscARtgPnniRfAtmServiceCategory = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 15))).clone(namedValues=NamedValues(("unspecifiedBitRate", 0), ("constantBitRate", 1), ("rtVariableBitRate", 2), ("nrtVariableBitRate", 3), ("derivedFromBBC", 15))).clone('unspecifiedBitRate')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfAtmServiceCategory.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfAtmServiceCategory.setDescription("This attribute specifies the ATM service category for both directions of the connection. If this attribute is set to derivedFromBBC, the Broadband Bearer Capability (BBC) and bestEffort attributes are used to determine the atmServiceCategory of this connection. If this attribute is set to other than derivedFromBBC, the value of this attribute is used to override the provisioned BBC IE parameters. In those cases, the BBC attributes are not used. The constantBitRate service category is intended for real time applications, that is, those requiring tightly constrained delay and delay variation, as would be appropriate for voice and video applications. The consistent availability of a fixed quantity of bandwidth is considered appropriate for CBR service. Cells which are delayed beyond the value specified by Cell Transfer Delay are assumed to be of significantly reduced value to the application. The rtVariableBitRate service category is intended for real time applications, that is, those requiring tightly constrained delay and delay variation, as would be appropriate for voice and video applications. Sources are expected to transmit at a rate which varies with time. Equivalently, the source can be described as 'bursty'. Cells which are delayed beyond the value specified by CTD are assumed to be of significantly reduced value to the application. VBR real time service may support statistical multiplexing of real time sources. The nrtVariableBitRate service category is intended for non-real time applications which have bursty traffic characteristics and which can be characterized in terms of a PCR, SCR, and MBS. For those cells which are transferred within the traffic contract, the application expects a low cell loss ratio. For all connections, it expects a bound on the mean cell transfer delay. VBR non-real time service may support statistical multiplexing of connections. 
The unspecifiedBitRate service is intended for non-real time applications; that is, those not requiring tightly constrained delay and delay variation. UBR sources are expected to be bursty. UBR service supports a high degree of statistical multiplexing among sources. UBR service does not specify traffic related service guarantees. No numerical commitments are made with respect to the cell loss ratio experienced by a UBR connection, or as to the cell transfer delay experienced by cells on the connection.")
mscARtgPnniRfFwdQosClass = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))).clone(namedValues=NamedValues(("n0", 0), ("n1", 1), ("n2", 2), ("n3", 3), ("n4", 4))).clone('n0')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfFwdQosClass.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfFwdQosClass.setDescription('This attribute specifies the quality of service for the forward direction for this connection. Class 1 supports a QOS that will meet Service Class A performance requirements (Circuit emulation, constant bit rate video). Class 2 supports a QOS that will meet Service Class B performance requirements (Variable bit rate audio and video). Class 3 supports a QOS that will meet Service Class C performance requirements (Connection-Oriented Data Transfer). Class 4 supports a QOS that will meet Service Class D performance requirements (Connectionless Data Transfer). Class 0 is the unspecified bit rate QOS class; no objective is specified for the performance parameters.')
mscARtgPnniRfBwdQosClass = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 15))).clone(namedValues=NamedValues(("n0", 0), ("n1", 1), ("n2", 2), ("n3", 3), ("n4", 4), ("sameAsFwd", 15))).clone('sameAsFwd')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfBwdQosClass.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBwdQosClass.setDescription('This attribute specifies the quality of service for the backward direction for this connection. Class 1 supports a QOS that will meet Service Class A performance requirements (Circuit emulation, constant bit rate video). Class 2 supports a QOS that will meet Service Class B performance requirements (Variable bit rate audio and video). Class 3 supports a QOS that will meet Service Class C performance requirements (Connection-Oriented Data Transfer). Class 4 supports a QOS that will meet Service Class D performance requirements (Connectionless Data Transfer). Class 0 is the unspecified bit rate QOS class; no objective is specified for the performance parameters. The sameAsFwd selection sets the backward quality of service to be the same as the forward quality of service.')
mscARtgPnniRfBearerClassBbc = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 3, 16, 31))).clone(namedValues=NamedValues(("a", 1), ("c", 3), ("x", 16), ("derivedFromServiceCategory", 31))).clone('derivedFromServiceCategory')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfBearerClassBbc.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBearerClassBbc.setDescription('This attribute specifies the bearer capability. It is one of the Broadband Bearer Capability (BBC) attributes. The purpose of the BBC information element is to indicate a requested broadband connection-oriented bearer service to be provided by the network. The value derivedFromServiceCategory specifies that the actual value which is used for this connection is derived from the value of the atmServiceCategory. Either, this attribute must be set to derivedFromServiceCategory, or the atmServiceCategory attribute must be set to derivedFromBBC, but not both. Class a service is a connection-oriented, constant bit rate ATM transport service. Class a service has end to end timing requirements and may require stringent cell loss, cell delay and cell delay variation performance. When a is set, the user is requesting more than an ATM only service. The network may look at the AAL to provide interworking based upon its contents. Class c service is a connection-oriented, variable bit rate ATM transport service. Class c service has no end-to-end timing requirements. When c is set, the user is requesting more than an ATM only service. The network interworking function may look at the AAL and provide service based on it. Class x service is a connection-oriented ATM transport service where the AAL, trafficType (vbr or cbr) and timing requirements are user defined (that is, transparent to the network). When x is set, the user is requesting an ATM only service from the network. In this case, the network shall not process any higher layer protocol.')
mscARtgPnniRfTransferCapabilityBbc = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 5, 8, 9, 10, 30, 31))).clone(namedValues=NamedValues(("n0", 0), ("n1", 1), ("n2", 2), ("n5", 5), ("n8", 8), ("n9", 9), ("n10", 10), ("notApplicable", 30), ("derivedFromServiceCategory", 31))).clone('derivedFromServiceCategory')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfTransferCapabilityBbc.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTransferCapabilityBbc.setDescription('This attribute specifies the transfer capability for this connection. Uni 3.0/3.1 traffic type and end-to-end timing parameters are mapped into this parameter as follows: <transferCapability : TrafficType, Timing> 0 : NoIndication, NoIndication 1 : NoIndication, yes 2 : NoIndication, no 5 : CBR, yes 8 : VBR, NoIndication 9 : VBR, yes 10: VBR, no NotApplicable specifies that the user does not want to specify the transfer capability. The CBR traffic type refers to traffic offered on services such as a constant bit rate video service or a circuit emulation. The VBR traffic type refers to traffic offered on services such as packetized audio and video, or data. The value NoIndication for traffic type is used if the user has not set the traffic type; this is also the case for end-to-end timing. The value yes for end-to-end timing indicates that end-to-end timing is required. The value no for end-to-end timing indicates that end-to-end timing is not required. The value derivedFromServiceCategory specifies that the actual value which is used for this connection is derived from the value of the atmServiceCategory. Either, this attribute must be set to derivedFromServiceCategory, or the atmServiceCategory attribute must be set to derivedFromBBC, but not both.')
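# The UNI 3.0/3.1 mapping embedded in the description above is easier to read
# as a lookup table. A minimal illustrative sketch, not part of the generated
# MIB; the name TRANSFER_CAPABILITY_MAP is hypothetical.

```python
# Mapping of transferCapability codes to (trafficType, endToEndTiming),
# transcribed from the mscARtgPnniRfTransferCapabilityBbc description.
TRANSFER_CAPABILITY_MAP = {
    0: ("noIndication", "noIndication"),
    1: ("noIndication", "yes"),
    2: ("noIndication", "no"),
    5: ("cbr", "yes"),
    8: ("vbr", "noIndication"),
    9: ("vbr", "yes"),
    10: ("vbr", "no"),
    # Codes 30 (notApplicable) and 31 (derivedFromServiceCategory) carry no
    # trafficType/timing mapping and are deliberately omitted.
}
```

# For example, code 5 requests CBR traffic with end-to-end timing required.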
mscARtgPnniRfClippingBbc = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1))).clone('no')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfClippingBbc.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfClippingBbc.setDescription('This attribute specifies the value for the clipping susceptibility parameter in the BBC IE. This attribute is only used for SPVC connections. It is one of the Broadband Bearer Capability attributes. Clipping is an impairment in which the first fraction of a second of information to be transferred is lost. It occurs after a call is answered and before an associated connection is switched through.')
mscARtgPnniRfBestEffort = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 15), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 15))).clone(namedValues=NamedValues(("indicated", 0), ("notIndicated", 1), ("derivedFromServiceCategory", 15))).clone('derivedFromServiceCategory')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfBestEffort.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBestEffort.setDescription('This attribute specifies the value of the best effort parameter in the ATM Traffic Descriptor IE. It is one of the Broadband Bearer Capability attributes. The value indicated implies that the quality of service for this connection is not guaranteed. The value notIndicated implies that the quality of service for this connection is guaranteed. The value derivedFromServiceCategory specifies that the actual value which is used for this connection is derived from the value of the atmServiceCategory. Either, this attribute must be set to derivedFromServiceCategory, or the atmServiceCategory attribute must be set to derivedFromBBC, but not both.')
mscARtgPnniRfOptimizationMetric = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 10, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("cdv", 0), ("maxCtd", 1), ("aw", 2))).clone('aw')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfOptimizationMetric.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfOptimizationMetric.setDescription('This attribute specifies the optimization metric to be used in the route computation; one of cell delay variation (cdv), maximum cell transfer delay (maxCtd), or administrative weight (aw).')
mscARtgPnniRfRxTdpTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 388), )
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpTable.setDescription('This attribute is a vector of four traffic parameters whose meanings are defined by the rxTrafficDescType attribute. The values of peak cell rate (PCR) and sustained cell rate (SCR) are expressed in cell/s. Maximum burst size (MBS) is expressed in cells. The value of CDVT is expressed in microseconds. The values of PCR, SCR, MBS and CDVT are used for usage parameter control (UPC). When rxTrafficDescType is 1 or 2, all of the parameters must be set to zero (unused). When rxTrafficDescType is 3, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic. Parameter 1 must be non-zero. Parameters 2 and 3 must be set to zero (unused). When rxTrafficDescType is 4, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the PCR for CLP equal to 0 traffic with cell discard. Parameters 1 and 2 must be non-zero. Parameter 3 must be set to zero (unused). Parameter 1 must be greater than or equal to parameter 2. When rxTrafficDescType is 5, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the PCR for CLP equal to 0 traffic with cell tagging. Parameters 1 and 2 must be non-zero. Parameter 3 must be set to zero (unused). Parameter 1 must be greater than or equal to parameter 2. When rxTrafficDescType is 6, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 and 1 traffic; parameter 3 represents the MBS for CLP equal to 0 and 1 traffic. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. When rxTrafficDescType is 7, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 traffic with cell discard; parameter 3 represents the MBS for CLP equal to 0 traffic. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. 
When rxTrafficDescType is 8, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 traffic with cell tagging; parameter 3 represents the MBS for CLP equal to 0 traffic. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. When rxTrafficDescType is any value from 3 through 8, parameter 4 represents the CDVT. If this value is zero, the CDVT is taken from the ConnectionAdministrator defaults for the particular atmServiceCategory of this connection. When rxTrafficDescType is 3 through 8, there are certain extreme combinations of rxTrafficDescParm which are outside the capabilities of the UPC hardware. To calculate the limits, use the following formulae: I1 = 1 000 000 000 / PCR L1 = CDVT * 1000 I2 = 1 000 000 000 / SCR L2 = CDVT + (MBS - 1) * (I2 - I1) I1 and I2 must be less than or equal to 335 523 840. I1 + L1 must be less than or equal to 1 342 156 800. I2 + L2 must be less than or equal to 1 342 156 800. Note that I2 and L2 only apply when the rxTrafficDescType is 6 through 8. If the values of I1, L1, I2 or L2 are close to the limits described above, a further restriction applies. Specifically, if either: I1 > 41 940 480 or I2 > 41 940 480 or I1 + L1 > 167 769 600 or I2 + L2 > 167 769 600 then both I1 and I2 must be greater than 20 480. Parameter 5 of the rxTrafficDescParm is always unused. If the rxTrafficDescType is sameAsTx, the values in this attribute will be taken from the txTrafficDescParm.')
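# The I1/L1/I2/L2 limit formulae in the description above can be checked
# mechanically. An illustrative sketch, not part of the generated MIB: the
# function name is hypothetical, and the formulae (including their mixed
# units for L2) are transcribed literally from the description text.

```python
def upc_parms_within_limits(pcr, cdvt, scr=None, mbs=None):
    """Return True if the traffic parameters satisfy the stated UPC limits.

    pcr/scr in cell/s, cdvt in microseconds, mbs in cells. Pass scr and mbs
    only for rxTrafficDescType 6 through 8, where I2 and L2 apply.
    """
    i1 = 1_000_000_000 // pcr   # I1 = 1 000 000 000 / PCR
    l1 = cdvt * 1000            # L1 = CDVT * 1000
    if i1 > 335_523_840 or i1 + l1 > 1_342_156_800:
        return False
    near_limit = i1 > 41_940_480 or i1 + l1 > 167_769_600
    if scr is not None:
        i2 = 1_000_000_000 // scr              # I2 = 1 000 000 000 / SCR
        l2 = cdvt + (mbs - 1) * (i2 - i1)      # L2, as written in the text
        if i2 > 335_523_840 or i2 + l2 > 1_342_156_800:
            return False
        near_limit = (near_limit or i2 > 41_940_480
                      or i2 + l2 > 167_769_600)
        # Near the limits, both I1 and I2 must exceed 20 480.
        if near_limit and (i1 <= 20_480 or i2 <= 20_480):
            return False
    elif near_limit and i1 <= 20_480:
        return False
    return True
```

# For example, PCR = 100 000 cell/s with SCR = 10 cell/s fails: I2 exceeds
# 41 940 480 while I1 is below 20 480, violating the further restriction.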
mscARtgPnniRfRxTdpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 388, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfRxTdpIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpEntry.setDescription('An entry in the mscARtgPnniRfRxTdpTable.')
mscARtgPnniRfRxTdpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 388, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 5)))
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpIndex.setDescription('This variable represents the mscARtgPnniRfRxTdpTable specific index for the mscARtgPnniRfRxTdpTable.')
mscARtgPnniRfRxTdpValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 388, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfRxTdpValue.setDescription('This variable represents an individual value for the mscARtgPnniRfRxTdpTable.')
mscARtgPnniRfTxTdpTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 389), )
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpTable.setDescription('This attribute is a vector of five traffic parameters whose meanings are defined by the txTrafficDescType attribute. The values of peak cell rate (PCR), sustained cell rate (SCR) and requested shaping rate are expressed in cell/s. Maximum burst size (MBS) is expressed in cells. CDVT is expressed in microseconds. The values of PCR, SCR, MBS and CDVT are used for connection admission control (CAC). The value of CDVT is only used for connections where the atmServiceCategory is constantBitRate. For all other values of atmServiceCategory, CDVT is ignored. The values of PCR, SCR and requested shaping rate are used to determine the actual shaping rate where traffic shaping is enabled. When txTrafficDescType is 1 or 2, all of the parameters must be set to zero. When txTrafficDescType is 3, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. This result is used as the PCR. Parameter 1 must be non-zero. Parameters 2 and 3 must be zero. When txTrafficDescType is 4, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic with cell discard; parameter 2 represents the PCR for CLP equal to 0 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. This result is used as the PCR. Parameter 1 must be greater than or equal to parameter 2. Parameters 1 and 2 must be non-zero. Parameter 3 must be zero. When txTrafficDescType is 5, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic with cell tagging; parameter 2 represents the PCR for CLP equal to 0 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. 
This result is used as the PCR. Parameter 1 must be greater than or equal to parameter 2. Parameters 1 and 2 must be non-zero. Parameter 3 must be zero. When txTrafficDescType is 6, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 and 1 traffic; parameter 3 represents the MBS for CLP equal to 0 and 1 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. This result is used as the PCR. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. Parameter 5 must either be zero (unused) or greater than or equal to parameter 2. When txTrafficDescType is 7, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 traffic with cell discard; parameter 3 represents the MBS for CLP equal to 0 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. This result is used as the PCR. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. Parameter 5 must either be zero (unused) or greater than or equal to parameter 2. When txTrafficDescType is 8, parameter 1 represents the PCR for CLP equal to 0 and 1 traffic; parameter 2 represents the SCR for CLP equal to 0 traffic with cell tagging; parameter 3 represents the MBS for CLP equal to 0 traffic; parameter 4 represents the CDVT; and parameter 5 represents the requested shaping rate. A non-zero value in parameter 5 overrides any value in parameter 1. This result is used as the PCR. Parameters 1, 2 and 3 must be non-zero. Parameter 1 must be greater than or equal to parameter 2. Parameter 5 must either be zero (unused) or greater than or equal to parameter 2. 
Whenever it is valid for PCR to be specified, parameter 5 may also be used to specify a requested shaping rate. A non-zero value in parameter 5 overrides the value in parameter 1 and is used as the peak cell rate in calculations of CAC and shaping rate. For txTrafficDescType 3, 4 and 5, the transmit traffic will be shaped at the next rate less than the PCR. For txTrafficDescType 6, 7 and 8, the transmit traffic will be shaped at the highest available rate which is between PCR and SCR. However, if there is no available shaping rate between PCR and SCR, traffic will be shaped at the next rate above the PCR.')
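# The shaping-rate selection rules in the description above can be sketched
# as a small helper. Illustrative only, not part of the generated MIB: the
# function name and the available_rates list (the hardware's discrete set of
# shaping rates, in cell/s) are hypothetical, and whether "between PCR and
# SCR" is inclusive is an assumption.

```python
def select_shaping_rate(desc_type, pcr, scr, available_rates):
    """Pick a shaping rate per the txTrafficDescType rules described above."""
    rates = sorted(available_rates)
    if desc_type in (3, 4, 5):
        # Shape at the next available rate less than the PCR.
        below = [r for r in rates if r < pcr]
        return max(below) if below else None
    if desc_type in (6, 7, 8):
        # Shape at the highest available rate between PCR and SCR
        # (treated here as inclusive of both endpoints).
        between = [r for r in rates if scr <= r <= pcr]
        if between:
            return max(between)
        # Otherwise, shape at the next available rate above the PCR.
        above = [r for r in rates if r > pcr]
        return min(above) if above else None
    return None  # types 1 and 2 carry no shaping parameters
```

# With rates [100, 500, 1000, 5000]: type 3 at PCR 1000 picks 500; type 6
# with PCR 1200 / SCR 400 picks 1000; type 6 with PCR 120 / SCR 110 has no
# rate in range and falls back to 500.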
mscARtgPnniRfTxTdpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 389, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfTxTdpIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpEntry.setDescription('An entry in the mscARtgPnniRfTxTdpTable.')
mscARtgPnniRfTxTdpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 389, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 5)))
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpIndex.setDescription('This variable represents the mscARtgPnniRfTxTdpTable specific index for the mscARtgPnniRfTxTdpTable.')
mscARtgPnniRfTxTdpValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 389, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfTxTdpValue.setDescription('This variable represents an individual value for the mscARtgPnniRfTxTdpTable.')
mscARtgPnniRfFqpTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 390), )
if mibBuilder.loadTexts: mscARtgPnniRfFqpTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfFqpTable.setDescription('This attribute is a vector of three elements that specify the quality of service parameters for the forward direction for this connection. This attribute is used for SPVC connections. The cdv element specifies the acceptable peak-to-peak Cell Delay Variation (CDV) of real-time connections (CBR, and rt-VBR). It is signalled through the extended QoS information element. The ctd specifies the acceptable maximum Cell Transfer Delay (maxCtd) of real-time connections (CBR, and rt-VBR). It is signalled through the end-to-end transit delay information element. The clr specifies the acceptable Cell Loss Ratio (CLR) of CBR, rt-VBR, and nrt-VBR connections. It is signalled through the extended QoS information element.')
mscARtgPnniRfFqpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 390, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfFqpIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfFqpEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfFqpEntry.setDescription('An entry in the mscARtgPnniRfFqpTable.')
mscARtgPnniRfFqpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 390, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("cdv", 0), ("ctd", 1), ("clr", 2))))
if mibBuilder.loadTexts: mscARtgPnniRfFqpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfFqpIndex.setDescription('This variable represents the mscARtgPnniRfFqpTable specific index for the mscARtgPnniRfFqpTable.')
mscARtgPnniRfFqpValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 390, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfFqpValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfFqpValue.setDescription('This variable represents an individual value for the mscARtgPnniRfFqpTable.')
mscARtgPnniRfBqpTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 393), )
if mibBuilder.loadTexts: mscARtgPnniRfBqpTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBqpTable.setDescription('This attribute is a vector of three elements that specify the quality of service parameters for the backward direction for this connection. This attribute is used for SPVC connections. The cdv element specifies the acceptable peak-to-peak Cell Delay Variation (CDV) of real-time connections (CBR and rt-VBR). It is signalled through the extended QoS information element. The ctd specifies the acceptable maximum Cell Transfer Delay (maxCtd) of real-time connections (CBR and rt-VBR). It is signalled through the end-to-end transit delay information element. The clr specifies the acceptable Cell Loss Ratio (CLR) of CBR, rt-VBR, and nrt-VBR connections. It is signalled through the extended QoS information element.')
mscARtgPnniRfBqpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 393, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniRfBqpIndex"))
if mibBuilder.loadTexts: mscARtgPnniRfBqpEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBqpEntry.setDescription('An entry in the mscARtgPnniRfBqpTable.')
mscARtgPnniRfBqpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 393, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("cdv", 0), ("ctd", 1), ("clr", 2))))
if mibBuilder.loadTexts: mscARtgPnniRfBqpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBqpIndex.setDescription('This variable represents the mscARtgPnniRfBqpTable specific index for the mscARtgPnniRfBqpTable.')
mscARtgPnniRfBqpValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 2, 393, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniRfBqpValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniRfBqpValue.setDescription('This variable represents an individual value for the mscARtgPnniRfBqpTable.')
mscARtgPnniCfgNode = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3))
mscARtgPnniCfgNodeRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatusTable.setDescription('This entry controls the addition and deletion of mscARtgPnniCfgNode components.')
mscARtgPnniCfgNodeRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniCfgNode component.')
mscARtgPnniCfgNodeRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniCfgNode components. These components can be added and deleted.')
mscARtgPnniCfgNodeComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniCfgNodeStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniCfgNode tables.')
mscARtgPnniCfgNodeIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 104)))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeIndex.setDescription('This variable represents the index for the mscARtgPnniCfgNode tables.')
mscARtgPnniCfgNodeProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 10), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeProvTable.setDescription('This group contains the provisionable attributes of a ConfiguredNode component.')
mscARtgPnniCfgNodeProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeProvEntry.setDescription('An entry in the mscARtgPnniCfgNodeProvTable.')
mscARtgPnniCfgNodeNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 10, 1, 2), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 22))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNodeId.setDescription('This attribute specifies the node id of the configured node. If this attribute is set to null, then the node id is computed as follows: If this is the lowest configured node, then the node id is computed as the level (one octet), followed by the integer value 160 (one octet), followed by the node address (20 octets). If this is not the lowest configured node, then the node id is computed as the level (one octet), followed by the 14 octet peer group id of the child peer group which the LGN represents, followed by the ESI specified in the node address (6 octets), followed by the integer value 0 (one octet).')
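The default node id construction described above can be sketched in plain Python. `default_node_id` is a hypothetical helper for illustration, not part of the generated MIB bindings:

```python
def default_node_id(level, node_address=None, child_peer_group_id=None, esi=None):
    """Build the 22-octet default PNNI node id described above.

    level -- the node's level (one octet, 0..104)
    node_address -- 20-octet node address, used for the lowest configured node
    child_peer_group_id -- 14-octet peer group id of the child peer group (LGN case)
    esi -- 6-octet End System Identifier from the node address (LGN case)
    """
    if child_peer_group_id is None:
        # Lowest configured node: level, the literal octet 160, then the address.
        assert len(node_address) == 20
        return bytes([level, 160]) + node_address
    # Logical Group Node: level, child peer group id, ESI, trailing zero octet.
    assert len(child_peer_group_id) == 14 and len(esi) == 6
    return bytes([level]) + child_peer_group_id + esi + bytes([0])
```

Either branch yields the 22 octets that mscARtgPnniCfgNodeOpNodeId reports.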
mscARtgPnniCfgNodePeerGroupId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 14))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniCfgNodePeerGroupId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodePeerGroupId.setDescription("This attribute allows the peer group id of the Logical Group Node (LGN) to be set. The peer group id is specified by 28 hex digits where the first octet represents the level of the node and the remaining 13 octets form the End System Address. If this attribute is set to the null string then the peer group id is computed as follows: The peer group id for a lowest level node is computed to be the node's level (one octet), followed by the first <level> bits of the node's address, followed by zero or more padding 0 bits. The peer group id for an LGN is computed to be the LGN's level (one octet), followed by the first <level> bits of the id of the peer group which this LGN represents.")
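The default peer group id derivation above (level octet, then the first &lt;level&gt; bits of the address, zero-padded) can likewise be sketched; `default_peer_group_id` is a hypothetical illustration of that rule:

```python
def default_peer_group_id(level, address):
    """Build the 14-octet default peer group id: the level octet followed by
    the first `level` bits of `address`, zero-padded out to 13 octets."""
    full_octets, spare_bits = divmod(level, 8)
    prefix = bytearray(address[:full_octets])
    if spare_bits:
        # Keep only the top `spare_bits` bits of the next octet.
        prefix.append(address[full_octets] & (0xFF << (8 - spare_bits)) & 0xFF)
    prefix.extend(bytes(13 - len(prefix)))  # zero padding bits
    return bytes([level]) + bytes(prefix)
```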
mscARtgPnniCfgNodeOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOperTable.setDescription('This group contains the generic operational attributes of a ConfiguredNode component.')
mscARtgPnniCfgNodeOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOperEntry.setDescription('An entry in the mscARtgPnniCfgNodeOperTable.')
mscARtgPnniCfgNodeNodeAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNodeAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNodeAddress.setDescription('This attribute indicates the address of the node at this level. At the lowest level, the nodeAddress is determined by the value of the nodeAddressPrefix attribute for the ARtg Pnni component followed by the level of this CfgNode. For LGNs, the nodeAddress is the same as the nodeAddress of the node at the lowest level, with the selector field set to the level of the peer group containing the LGN.')
mscARtgPnniCfgNodeOpNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 2), HexString().subtype(subtypeSpec=ValueSizeConstraint(22, 22)).setFixedLength(22)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOpNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOpNodeId.setDescription('This attribute indicates the node id of the node at this level. The default node id is computed as follows: If this is the lowest level node, then the default node id is computed as the level (one octet), followed by the integer value 160 (one octet), followed by the node address (20 octets). If this is not the lowest level node, then the default node id is computed as the level (one octet), followed by the 14 octet peer group id of the child peer group which the LGN represents, followed by the ESI specified in the node address (6 octets), followed by the integer value 0 (one octet).')
mscARtgPnniCfgNodeOpPeerGroupId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(14, 14)).setFixedLength(14)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOpPeerGroupId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeOpPeerGroupId.setDescription('This attribute indicates the peer group id of the node at this level. The value is determined by the provisioned peerGroupId attribute.')
mscARtgPnniCfgNodeNumNeighbors = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNumNeighbors.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNumNeighbors.setDescription('This attribute indicates the number of PNNI nodes which are neighbors of this node at this level.')
mscARtgPnniCfgNodeNumRccs = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNumRccs.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNumRccs.setDescription("This attribute indicates the number of Routing Control Channels to this node's neighbors at this level.")
mscARtgPnniCfgNodeCurrentLeadershipPriority = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 6), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 205))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeCurrentLeadershipPriority.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeCurrentLeadershipPriority.setDescription('This attribute indicates the leadership priority of the node that this node believes should be the peer group leader at this point in time.')
mscARtgPnniCfgNodePglElectionState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 11, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("starting", 0), ("awaiting", 1), ("awaitingFull", 2), ("initialDelay", 3), ("calculating", 4), ("operNotPgl", 5), ("operPgl", 6), ("awaitUnanimity", 7), ("hungElection", 8), ("awaitReElection", 9)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodePglElectionState.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodePglElectionState.setDescription('This attribute indicates the current state of the peer group leader election process. The following are the possible values for this attribute: starting: the initial state of the state machine. awaiting: the node has started the Hello Finite State Machine on at least one link, and no peer has been found yet. awaitingFull: no database synchronization process has been completed yet but at least one neighboring peer has been found. initialDelay: Database synchronization has been completed with at least one neighboring peer. The node must wait pglInitTime seconds before it can select and advertise its preferred Peer Group Leader (PGL). calculating: the node is in the process of calculating what its new choice for preferred PGL will be. operNotPgl: a non-PGL node is in the process of determining which node has the highest priority to be PGL by examining PTSEs sent by other nodes. operPgl: a PGL node is in the process of determining if another node has a higher priority than itself by examining PTSEs sent by other nodes. awaitUnanimity: the node has chosen itself as PGL. If the node has been elected unanimously, it generates a Unanimity event. It waits for unanimity or expiration of the overrideDelay timer before declaring itself peer group leader. hungElection: the node has chosen itself as PGL with less than 2/3 of the other nodes advertising it as their preferred PGL. In this case either this node should change its choice of preferred PGL, or the other nodes are going to accept it as PGL. awaitReElection: the node has lost connectivity to the current PGL. The connectivity must be reestablished before the reElectionInterval timer fires, otherwise the election is redone.')
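The two-thirds criterion that separates awaitUnanimity from hungElection can be illustrated with a toy check. The real election FSM has ten states plus timers; this hypothetical function captures only the vote-count rule from the description above:

```python
from fractions import Fraction

def self_chosen_pgl_state(votes_for_self, other_nodes):
    """Return the state for a node that has chosen itself as preferred PGL,
    based only on how many of the other nodes advertise it as their choice."""
    if other_nodes and Fraction(votes_for_self, other_nodes) < Fraction(2, 3):
        return 'hungElection'    # fewer than 2/3 of the other nodes agree
    return 'awaitUnanimity'      # wait for unanimity or overrideDelay expiry
```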
mscARtgPnniCfgNodeSAddr = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2))
mscARtgPnniCfgNodeSAddrRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatusTable.setDescription('This entry controls the addition and deletion of mscARtgPnniCfgNodeSAddr components.')
mscARtgPnniCfgNodeSAddrRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrAddressIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrPrefixLengthIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrReachabilityIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniCfgNodeSAddr component.')
mscARtgPnniCfgNodeSAddrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniCfgNodeSAddr components. These components can be added and deleted.')
mscARtgPnniCfgNodeSAddrComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniCfgNodeSAddrStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniCfgNodeSAddr tables.')
mscARtgPnniCfgNodeSAddrAddressIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 10), HexString().subtype(subtypeSpec=ValueSizeConstraint(19, 19)).setFixedLength(19))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrAddressIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrAddressIndex.setDescription('This variable represents an index for the mscARtgPnniCfgNodeSAddr tables.')
mscARtgPnniCfgNodeSAddrPrefixLengthIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 11), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 152)))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrPrefixLengthIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrPrefixLengthIndex.setDescription('This variable represents an index for the mscARtgPnniCfgNodeSAddr tables.')
mscARtgPnniCfgNodeSAddrReachabilityIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("internal", 0), ("exterior", 1))))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrReachabilityIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrReachabilityIndex.setDescription('This variable represents an index for the mscARtgPnniCfgNodeSAddr tables.')
mscARtgPnniCfgNodeSAddrProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 10), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrProvTable.setDescription('This group contains the provisionable attributes of a SummaryAddress component. A summary address is an abbreviation of a set of addresses, represented by an address prefix that all of the summarized addresses have in common. A suppressed summary address is used to suppress the advertisement of addresses which match this prefix, regardless of scope.')
mscARtgPnniCfgNodeSAddrProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrAddressIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrPrefixLengthIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrReachabilityIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrProvEntry.setDescription('An entry in the mscARtgPnniCfgNodeSAddrProvTable.')
mscARtgPnniCfgNodeSAddrSuppress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 10, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("false", 0), ("true", 1))).clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrSuppress.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrSuppress.setDescription('This attribute specifies whether or not the address should be suppressed. If this attribute is set to true, then all addresses matching that prefix will not be advertised above this level.')
mscARtgPnniCfgNodeSAddrOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 11), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrOperTable.setDescription('This group contains the operational attributes of a SummaryAddress component.')
mscARtgPnniCfgNodeSAddrOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrAddressIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrPrefixLengthIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeSAddrReachabilityIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrOperEntry.setDescription('An entry in the mscARtgPnniCfgNodeSAddrOperTable.')
mscARtgPnniCfgNodeSAddrState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 11, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("advertising", 0), ("suppressing", 1), ("inactive", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrState.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrState.setDescription('This attribute indicates the state of the address: one of advertising, suppressing or inactive. inactive: the summary address has been configured but is not suppressing or summarizing any ATM addresses. suppressing: the summary address has suppressed at least one ATM address on the node. advertising: the summary address is summarizing at least one ATM address on the node.')
mscARtgPnniCfgNodeSAddrScope = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 2, 11, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 104))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrScope.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeSAddrScope.setDescription('This attribute indicates the scope of the summary address. The scope corresponds to the scope of the underlying summarized address with the highest advertised scope. A value of -1 means the scope is unknown.')
mscARtgPnniCfgNodeNbr = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3))
mscARtgPnniCfgNodeNbrRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniCfgNodeNbr components.')
mscARtgPnniCfgNodeNbrRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeNbrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniCfgNodeNbr component.')
mscARtgPnniCfgNodeNbrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniCfgNodeNbr components. These components cannot be added or deleted.')
mscARtgPnniCfgNodeNbrComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniCfgNodeNbrStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniCfgNodeNbr tables.')
mscARtgPnniCfgNodeNbrIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 1, 1, 10), HexString().subtype(subtypeSpec=ValueSizeConstraint(22, 22)).setFixedLength(22))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrIndex.setDescription('This variable represents the index for the mscARtgPnniCfgNodeNbr tables.')
mscARtgPnniCfgNodeNbrOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 10), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the operational attributes of a Neighbor component.')
mscARtgPnniCfgNodeNbrOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeNbrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrOperEntry.setDescription('An entry in the mscARtgPnniCfgNodeNbrOperTable.')
mscARtgPnniCfgNodeNbrPeerState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 10, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))).clone(namedValues=NamedValues(("npDown", 0), ("negotiating", 1), ("exchanging", 2), ("loading", 3), ("full", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPeerState.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPeerState.setDescription('This attribute indicates the state of the routing database exchange with the peer node. npDown: there are no active links (i.e. in the twoWayInside Hello state) to the neighboring peer. negotiating: the first step in creating an adjacency between the two neighboring peers; this step determines which node is the master, and what the initial DS sequence number will be. exchanging: the node describes its topology database by sending Database Summary packets to the neighboring peer. loading: a full sequence of Database Summary packets has been exchanged with the neighboring peer, and the required PTSEs are requested and at least one has not yet been received. full: all PTSEs known to be available have been received from the neighboring peer. At this point all ports leading to the neighbor node will be flooded in PTSEs within the peer group.')
mscARtgPnniCfgNodeNbrStatsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStatsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStatsTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the statistical operational attributes of a Neighbor component.')
mscARtgPnniCfgNodeNbrStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeNbrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStatsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrStatsEntry.setDescription('An entry in the mscARtgPnniCfgNodeNbrStatsTable.')
mscARtgPnniCfgNodeNbrPtspRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtspRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtspRx.setDescription('This attribute counts the PNNI Topology State Packets received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtspTx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtspTx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtspTx.setDescription('This attribute counts the total number of PTSPs sent to the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseRx.setDescription('This attribute counts the total number of PTSEs received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseTx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseTx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseTx.setDescription('This attribute counts the total number of PTSEs sent to the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseReqRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseReqRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseReqRx.setDescription('This attribute counts the total number of PTSE requests received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseReqTx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseReqTx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseReqTx.setDescription('This attribute counts the total number of PTSE requests sent to the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseAcksRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseAcksRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseAcksRx.setDescription('This attribute counts the total number of PTSE acknowledgments received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrPtseAcksTx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseAcksTx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrPtseAcksTx.setDescription('This attribute counts the total number of PTSE acknowledgments sent to the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrDbSummariesRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrDbSummariesRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrDbSummariesRx.setDescription('This attribute counts the number of database summary packets received from the neighbor. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrDbSummariesTx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrDbSummariesTx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrDbSummariesTx.setDescription('This attribute counts the number of database summary packets transmitted to the neighbor. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrBadPtspRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 11), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtspRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtspRx.setDescription('This attribute counts the total number of invalid PTSP packets received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrBadPtseRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 12), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseRx.setDescription('This attribute counts the total number of invalid PTSE packets received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrBadPtseReqRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseReqRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseReqRx.setDescription('This attribute counts the total number of invalid PTSE requests received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrBadPtseAckRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 14), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseAckRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadPtseAckRx.setDescription('This attribute counts the total number of invalid PTSE acknowledgments received from the neighbor node. The counter wraps when it exceeds the maximum value.')
mscARtgPnniCfgNodeNbrBadDbSummariesRx = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1, 15), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadDbSummariesRx.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrBadDbSummariesRx.setDescription('This attribute counts the total number of invalid database summary packets received from the neighbor. The counter wraps when it exceeds the maximum value.')
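# Hedged sketch (not part of the generated MIB): the fifteen neighbor-peer
# packet counters above all live in one conceptual row under the common
# entry prefix 1.3.6.1.4.1.562.36.2.1.95.3.3.3.11.1, with column subids
# 1..15. Building those column OIDs locally, without contacting an agent:

```python
# Common entry prefix for the neighbor packet counters defined above.
NBR_PKT_ENTRY = (1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 11, 1)

# Column subid -> short counter name (names here are informal labels,
# not identifiers from the MIB).
NBR_PKT_COLUMNS = {
    1: "ptspRx", 2: "ptspTx", 3: "ptseRx", 4: "ptseTx",
    5: "ptseReqRx", 6: "ptseReqTx", 7: "ptseAcksRx", 8: "ptseAcksTx",
    9: "dbSummariesRx", 10: "dbSummariesTx", 11: "badPtspRx",
    12: "badPtseRx", 13: "badPtseReqRx", 14: "badPtseAckRx",
    15: "badDbSummariesRx",
}

def nbr_counter_oid(column: int) -> str:
    """Return the dotted column OID for one neighbor packet counter."""
    if column not in NBR_PKT_COLUMNS:
        raise ValueError("unknown neighbor counter column: %d" % column)
    return ".".join(str(sub) for sub in NBR_PKT_ENTRY + (column,))
```

# An SNMP walk of each column OID (instance subids appended per row index)
# would retrieve the per-neighbor Counter32 values; all of these counters
# wrap at the Counter32 maximum, as the descriptions note.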
mscARtgPnniCfgNodeNbrRccListTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 385), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This attribute indicates the component names of all Routing Control Channels to the neighbor PNNI node.')
mscARtgPnniCfgNodeNbrRccListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 385, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeNbrIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeNbrRccListValue"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListEntry.setDescription('An entry in the mscARtgPnniCfgNodeNbrRccListTable.')
mscARtgPnniCfgNodeNbrRccListValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 3, 385, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeNbrRccListValue.setDescription('This variable represents both the value and the index for the mscARtgPnniCfgNodeNbrRccListTable.')
mscARtgPnniCfgNodeDefSAddr = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4))
mscARtgPnniCfgNodeDefSAddrRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatusTable.setDescription('This entry controls the addition and deletion of mscARtgPnniCfgNodeDefSAddr components.')
mscARtgPnniCfgNodeDefSAddrRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeDefSAddrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniCfgNodeDefSAddr component.')
mscARtgPnniCfgNodeDefSAddrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniCfgNodeDefSAddr components. These components cannot be added nor deleted.')
mscARtgPnniCfgNodeDefSAddrComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniCfgNodeDefSAddrStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniCfgNodeDefSAddr tables.')
mscARtgPnniCfgNodeDefSAddrIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrIndex.setDescription('This variable represents the index for the mscARtgPnniCfgNodeDefSAddr tables.')
mscARtgPnniCfgNodeDefSAddrDefAddrTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 10), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrDefAddrTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrDefAddrTable.setDescription('This group contains the operational attributes of a DefSummaryAddress component.')
mscARtgPnniCfgNodeDefSAddrDefAddrEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeDefSAddrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrDefAddrEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrDefAddrEntry.setDescription('An entry in the mscARtgPnniCfgNodeDefSAddrDefAddrTable.')
mscARtgPnniCfgNodeDefSAddrAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(13, 13)).setFixedLength(13)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrAddress.setDescription('This attribute indicates the default summary address of the node at this level.')
mscARtgPnniCfgNodeDefSAddrOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 11), )
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrOperTable.setDescription('This group contains the operational attributes of a SummaryAddress component.')
mscARtgPnniCfgNodeDefSAddrOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniCfgNodeDefSAddrIndex"))
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrOperEntry.setDescription('An entry in the mscARtgPnniCfgNodeDefSAddrOperTable.')
mscARtgPnniCfgNodeDefSAddrState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 11, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("advertising", 0), ("suppressing", 1), ("inactive", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrState.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrState.setDescription('This attribute indicates the state of the address: one of advertising, suppressing or inactive. inactive: the summary address has been configured but is not suppressing or summarizing any ATM addresses. suppressing: the summary address has suppressed at least one ATM address on the node. advertising: the summary address is summarizing at least one ATM address on the node.')
mscARtgPnniCfgNodeDefSAddrScope = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 3, 4, 11, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 104))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrScope.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniCfgNodeDefSAddrScope.setDescription('This attribute indicates the scope of the summary address. The scope corresponds to the scope of the underlying summarized address with the highest advertised scope. A value of -1 means the scope is unknown.')
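# Hedged sketch (not part of the generated MIB): the two operational
# attributes above encode the DefSummaryAddress state as an enum
# (0=advertising, 1=suppressing, 2=inactive) and the scope as an integer
# where -1 means unknown. A small decoder for values read from an agent:

```python
# Enum labels taken from the NamedValues of mscARtgPnniCfgNodeDefSAddrState.
DEF_SADDR_STATES = {0: "advertising", 1: "suppressing", 2: "inactive"}

def describe_def_saddr(state: int, scope: int) -> str:
    """Render a DefSummaryAddress state/scope pair as a readable string."""
    state_name = DEF_SADDR_STATES.get(state, "unknown(%d)" % state)
    scope_text = "unknown" if scope == -1 else str(scope)
    return "state=%s scope=%s" % (state_name, scope_text)
```

# Example: raw values (0, -1) read via SNMP would render as
# "state=advertising scope=unknown".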
mscARtgPnniTop = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4))
mscARtgPnniTopRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1), )
if mibBuilder.loadTexts: mscARtgPnniTopRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniTop components.')
mscARtgPnniTopRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniTop component.')
mscARtgPnniTopRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniTop components. These components cannot be added nor deleted.')
mscARtgPnniTopComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniTopStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniTop tables.')
mscARtgPnniTopIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 104)))
if mibBuilder.loadTexts: mscARtgPnniTopIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopIndex.setDescription('This variable represents the index for the mscARtgPnniTop tables.')
mscARtgPnniTopOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10), )
if mibBuilder.loadTexts: mscARtgPnniTopOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the operational attributes of a Topology component.')
mscARtgPnniTopOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopOperEntry.setDescription('An entry in the mscARtgPnniTopOperTable.')
mscARtgPnniTopPtsesInDatabase = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10, 1, 1), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopPtsesInDatabase.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopPtsesInDatabase.setDescription("This attribute indicates the number of PTSEs in storage in this node's topology database for this level.")
mscARtgPnniTopPglNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10, 1, 2), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 22))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopPglNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopPglNodeId.setDescription('This attribute indicates the node id of the peer group leader. If this attribute is empty, it indicates the Peer Group Leader node id is unknown.')
mscARtgPnniTopActiveParentNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 22))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopActiveParentNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopActiveParentNodeId.setDescription('This attribute indicates the node identifier being used by the LGN representing this peer group at the next higher level peer group.')
mscARtgPnniTopPreferredPglNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 10, 1, 4), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 22))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopPreferredPglNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopPreferredPglNodeId.setDescription('This attribute represents the node in the database with the highest Peer Group Leader (PGL) priority. If this attribute is empty, it indicates the preferred PGL node id is unknown.')
mscARtgPnniTopNode = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2))
mscARtgPnniTopNodeRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1), )
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniTopNode components.')
mscARtgPnniTopNodeRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniTopNode component.')
mscARtgPnniTopNodeRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniTopNode components. These components cannot be added nor deleted.')
mscARtgPnniTopNodeComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniTopNodeStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniTopNode tables.')
mscARtgPnniTopNodeIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 1, 1, 10), HexString().subtype(subtypeSpec=ValueSizeConstraint(22, 22)).setFixedLength(22))
if mibBuilder.loadTexts: mscARtgPnniTopNodeIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeIndex.setDescription('This variable represents the index for the mscARtgPnniTopNode tables.')
mscARtgPnniTopNodeAddr = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2))
mscARtgPnniTopNodeAddrRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1), )
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniTopNodeAddr components.')
mscARtgPnniTopNodeAddrRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrAddressIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrPrefixLengthIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrReachabilityIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniTopNodeAddr component.')
mscARtgPnniTopNodeAddrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniTopNodeAddr components. These components cannot be added nor deleted.')
mscARtgPnniTopNodeAddrComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniTopNodeAddrStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniTopNodeAddr tables.')
mscARtgPnniTopNodeAddrAddressIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 10), HexString().subtype(subtypeSpec=ValueSizeConstraint(19, 19)).setFixedLength(19))
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrAddressIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrAddressIndex.setDescription('This variable represents an index for the mscARtgPnniTopNodeAddr tables.')
mscARtgPnniTopNodeAddrPrefixLengthIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 11), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 152)))
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrPrefixLengthIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrPrefixLengthIndex.setDescription('This variable represents an index for the mscARtgPnniTopNodeAddr tables.')
mscARtgPnniTopNodeAddrReachabilityIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("internal", 0), ("exterior", 1))))
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrReachabilityIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrReachabilityIndex.setDescription('This variable represents an index for the mscARtgPnniTopNodeAddr tables.')
mscARtgPnniTopNodeAddrOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 10), )
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This attribute group contains the operational attributes for the Address component.')
mscARtgPnniTopNodeAddrOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrAddressIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrPrefixLengthIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeAddrReachabilityIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrOperEntry.setDescription('An entry in the mscARtgPnniTopNodeAddrOperTable.')
mscARtgPnniTopNodeAddrScope = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 2, 10, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 104))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrScope.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeAddrScope.setDescription('This attribute specifies the scope of the ATM address, which is the highest level to which this address will be advertised in the PNNI hierarchy.')
mscARtgPnniTopNodeLink = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3))
mscARtgPnniTopNodeLinkRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1), )
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniTopNodeLink components.')
mscARtgPnniTopNodeLinkRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeLinkIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniTopNodeLink component.')
mscARtgPnniTopNodeLinkRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniTopNodeLink components. These components cannot be added nor deleted.')
mscARtgPnniTopNodeLinkComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniTopNodeLinkStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniTopNodeLink tables.')
mscARtgPnniTopNodeLinkIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 268435455)))
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkIndex.setDescription('This variable represents the index for the mscARtgPnniTopNodeLink tables.')
mscARtgPnniTopNodeLinkOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 10), )
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the operational attributes of a Link component.')
mscARtgPnniTopNodeLinkOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniTopNodeLinkIndex"))
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkOperEntry.setDescription('An entry in the mscARtgPnniTopNodeLinkOperTable.')
mscARtgPnniTopNodeLinkRemoteNodeId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(22, 22)).setFixedLength(22)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRemoteNodeId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRemoteNodeId.setDescription('This attribute indicates the id of the node at the far end of this link.')
mscARtgPnniTopNodeLinkRemotePortId = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 4, 2, 3, 10, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRemotePortId.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniTopNodeLinkRemotePortId.setDescription("This attribute indicates the node's port id at the far end of this link.")
mscARtgPnniPort = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5))
mscARtgPnniPortRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1), )
if mibBuilder.loadTexts: mscARtgPnniPortRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscARtgPnniPort components.')
mscARtgPnniPortRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniPortIndex"))
if mibBuilder.loadTexts: mscARtgPnniPortRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortRowStatusEntry.setDescription('A single entry in the table represents a single mscARtgPnniPort component.')
mscARtgPnniPortRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniPortRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscARtgPnniPort components. These components cannot be added or deleted.')
mscARtgPnniPortComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniPortComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscARtgPnniPortStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniPortStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortStorageType.setDescription('This variable represents the storage type value for the mscARtgPnniPort tables.')
mscARtgPnniPortIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 268435455)))
if mibBuilder.loadTexts: mscARtgPnniPortIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortIndex.setDescription('This variable represents the index for the mscARtgPnniPort tables.')
mscARtgPnniPortOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 10), )
if mibBuilder.loadTexts: mscARtgPnniPortOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortOperTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This group contains the operational attributes of a Port component.')
mscARtgPnniPortOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscARtgPnniPortIndex"))
if mibBuilder.loadTexts: mscARtgPnniPortOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortOperEntry.setDescription('An entry in the mscARtgPnniPortOperTable.')
mscARtgPnniPortStdComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 95, 3, 5, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscARtgPnniPortStdComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscARtgPnniPortStdComponentName.setDescription('This attribute indicates the component name of the port.')
mscAtmCR = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113))
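The OID tuples passed to MibIdentifier and the table classes throughout this module map directly to dotted-decimal notation; a minimal sketch (the helper oid_to_dotted is hypothetical and not part of the generated module):

```python
# Hypothetical helper, not part of the generated MIB module: render a
# pysnmp-style OID tuple, such as the mscAtmCR identifier above, in the
# familiar dotted-decimal notation.
def oid_to_dotted(oid):
    """Join the arcs of an OID tuple into a dotted-decimal string."""
    return ".".join(str(arc) for arc in oid)

# The AtmCallRouter subtree registered above:
print(oid_to_dotted((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113)))  # 1.3.6.1.4.1.562.36.2.1.113
```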
mscAtmCRRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1), )
if mibBuilder.loadTexts: mscAtmCRRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmCR components.')
mscAtmCRRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRIndex"))
if mibBuilder.loadTexts: mscAtmCRRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmCR component.')
mscAtmCRRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmCRRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmCR components. These components can be added and deleted.')
mscAtmCRComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmCRStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRStorageType.setDescription('This variable represents the storage type value for the mscAtmCR tables.')
mscAtmCRIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmCRIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRIndex.setDescription('This variable represents the index for the mscAtmCR tables.')
mscAtmCRProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 10), )
if mibBuilder.loadTexts: mscAtmCRProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRProvTable.setDescription('This group represents the provisioned attributes for the AtmCallRouter component.')
mscAtmCRProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRIndex"))
if mibBuilder.loadTexts: mscAtmCRProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRProvEntry.setDescription('An entry in the mscAtmCRProvTable.')
mscAtmCRNodeAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 10, 1, 1), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(26, 26)).setFixedLength(26)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmCRNodeAddress.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRNodeAddress.setDescription('This attribute specifies the NSAP address prefix used for ILMI purposes.')
mscAtmCRStatsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 11), )
if mibBuilder.loadTexts: mscAtmCRStatsTable.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRStatsTable.setDescription('This group represents the operational attributes for the AtmCallRouter component.')
mscAtmCRStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRIndex"))
if mibBuilder.loadTexts: mscAtmCRStatsEntry.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRStatsEntry.setDescription('An entry in the mscAtmCRStatsTable.')
mscAtmCRCallsRouted = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 11, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRCallsRouted.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRCallsRouted.setDescription('This attribute counts the total number of calls routed.')
mscAtmCRCallsFailed = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 11, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRCallsFailed.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRCallsFailed.setDescription('This attribute counts the number of calls that failed to route.')
mscAtmCRDna = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2))
mscAtmCRDnaRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1), )
if mibBuilder.loadTexts: mscAtmCRDnaRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaRowStatusTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This entry controls the addition and deletion of mscAtmCRDna components.')
mscAtmCRDnaRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRDnaIndex"))
if mibBuilder.loadTexts: mscAtmCRDnaRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmCRDna component.')
mscAtmCRDnaRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRDnaRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmCRDna components. These components cannot be added or deleted.')
mscAtmCRDnaComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRDnaComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmCRDnaStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRDnaStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaStorageType.setDescription('This variable represents the storage type value for the mscAtmCRDna tables.')
mscAtmCRDnaIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 1, 1, 10), AsciiStringIndex().subtype(subtypeSpec=ValueSizeConstraint(1, 40)))
if mibBuilder.loadTexts: mscAtmCRDnaIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmCRDnaIndex.setDescription('This variable represents the index for the mscAtmCRDna tables.')
mscAtmCRDnaDestinationNameTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 289), )
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameTable.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameTable.setDescription('*** THIS TABLE CURRENTLY NOT IMPLEMENTED *** This attribute indicates which components have this address provisioned or dynamically registered via ILMI.')
mscAtmCRDnaDestinationNameEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 289, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRDnaIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmCRDnaDestinationNameValue"))
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameEntry.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameEntry.setDescription('An entry in the mscAtmCRDnaDestinationNameTable.')
mscAtmCRDnaDestinationNameValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 113, 2, 289, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameValue.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmCRDnaDestinationNameValue.setDescription('This variable represents both the value and the index for the mscAtmCRDnaDestinationNameTable.')
mscAtmIfVpcSrc = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6))
mscAtmIfVpcSrcRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1), )
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmIfVpcSrc components.')
mscAtmIfVpcSrcRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVpcSrc component.')
mscAtmIfVpcSrcRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVpcSrc components. These components can be added and deleted.')
mscAtmIfVpcSrcComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVpcSrcStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVpcSrc tables.')
mscAtmIfVpcSrcIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVpcSrcIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcIndex.setDescription('This variable represents the index for the mscAtmIfVpcSrc tables.')
mscAtmIfVpcSrcProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 10), )
if mibBuilder.loadTexts: mscAtmIfVpcSrcProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcProvTable.setDescription('This attribute group contains the provisionable attributes of the AtmIf/n Vpc/vpi SrcPvp component.')
mscAtmIfVpcSrcProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcSrcProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcProvEntry.setDescription('An entry in the mscAtmIfVpcSrcProvTable.')
mscAtmIfVpcSrcCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 20)).clone(hexValue="")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVpcSrcCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcCallingAddress.setDescription('This attribute specifies the calling address of the soft PVP. If it is a null string, then the calling address is the address of the current interface (that is, where the soft PVC originates).')
mscAtmIfVpcSrcCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 10, 1, 2), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVpcSrcCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcCalledAddress.setDescription('This attribute specifies the called (remote) address of the soft PVP.')
mscAtmIfVpcSrcCalledVpi = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 10, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4095))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVpcSrcCalledVpi.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcCalledVpi.setDescription('This attribute specifies the called VPI of the soft PVP.')
mscAtmIfVpcSrcOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11), )
if mibBuilder.loadTexts: mscAtmIfVpcSrcOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcOperTable.setDescription('This attribute group contains the operational attributes associated with the SrcPvp or SrcPvc component.')
mscAtmIfVpcSrcOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcSrcOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcOperEntry.setDescription('An entry in the mscAtmIfVpcSrcOperTable.')
mscAtmIfVpcSrcState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("active", 0), ("inactive", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcState.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcState.setDescription('This attribute indicates the state of the soft PVP or soft PVC.')
mscAtmIfVpcSrcRetryCount = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcRetryCount.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcRetryCount.setDescription('This attribute indicates the number of failed attempts to set up the soft PVP or soft PVC since the last time the connection failed.')
mscAtmIfVpcSrcLastFailureCauseCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcLastFailureCauseCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcLastFailureCauseCode.setDescription('This attribute contains the cause code in the last transmitted signalling message that contains the CAUSE information element. The cause code is used to describe the reason for generating certain signalling messages. The default value for this attribute is set to 0.')
mscAtmIfVpcSrcLastFailureDiagCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 6, 11, 1, 4), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(0, 8))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcSrcLastFailureDiagCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcSrcLastFailureDiagCode.setDescription('This attribute contains the diagnostic code in the last transmitted signalling message. The diagnostic code is contained in the CAUSE information element and identifies an information element type or timer type. The diagnostic code is present only if a procedural error is detected by the signalling protocol. A diagnostic code is always accompanied by the cause code. If there is no failure, this attribute is set to NULL.')
mscAtmIfVpcRp = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7))
mscAtmIfVpcRpRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1), )
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmIfVpcRp components.')
mscAtmIfVpcRpRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVpcRp component.')
mscAtmIfVpcRpRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVpcRp components. These components cannot be added or deleted.')
mscAtmIfVpcRpComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcRpComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVpcRpStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcRpStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVpcRp tables.')
mscAtmIfVpcRpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVpcRpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpIndex.setDescription('This variable represents the index for the mscAtmIfVpcRp tables.')
mscAtmIfVpcRpOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 10), )
if mibBuilder.loadTexts: mscAtmIfVpcRpOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpOperTable.setDescription('This attribute group contains the operational attributes for the AtmRelayPoint component.')
mscAtmIfVpcRpOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcRpOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpOperEntry.setDescription('An entry in the mscAtmIfVpcRpOperTable.')
mscAtmIfVpcRpNextHop = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHop.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHop.setDescription('This attribute indicates the component name of the Rp component with which this Rp component is associated.')
mscAtmIfVpcRpNextHopsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 430), )
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsTable.setDescription('This attribute indicates the component name(s) of the Rp component(s) with which this Rp component is associated. This attribute can have more than one component name only when the Vcc distributionType is pointToMultipoint and the callDirection is fromLink.')
mscAtmIfVpcRpNextHopsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 430, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcRpIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcRpNextHopsValue"))
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsEntry.setDescription('An entry in the mscAtmIfVpcRpNextHopsTable.')
mscAtmIfVpcRpNextHopsValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 7, 430, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcRpNextHopsValue.setDescription('This variable represents both the value and the index for the mscAtmIfVpcRpNextHopsTable.')
mscAtmIfVpcDst = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8))
mscAtmIfVpcDstRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1), )
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmIfVpcDst components.')
mscAtmIfVpcDstRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVpcDst component.')
mscAtmIfVpcDstRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVpcDst components. These components cannot be added or deleted.')
mscAtmIfVpcDstComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVpcDstStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVpcDst tables.')
mscAtmIfVpcDstIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVpcDstIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstIndex.setDescription('This variable represents the index for the mscAtmIfVpcDst tables.')
mscAtmIfVpcDstOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10), )
if mibBuilder.loadTexts: mscAtmIfVpcDstOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstOperTable.setDescription('This attribute group contains the operational attributes for the AtmIf/n Vpc/vpi DstPvp component.')
mscAtmIfVpcDstOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVpcIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVpcDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVpcDstOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstOperEntry.setDescription('An entry in the mscAtmIfVpcDstOperTable.')
mscAtmIfVpcDstCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstCalledAddress.setDescription('This attribute indicates the called address of the soft PVP.')
mscAtmIfVpcDstCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10, 1, 2), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(7, 40))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstCallingAddress.setDescription('This attribute indicates the calling (remote) address of the soft PVP. If the address is not known, the value of this attribute is Unknown.')
mscAtmIfVpcDstCallingVpi = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10, 1, 3), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(1, 7))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVpcDstCallingVpi.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVpcDstCallingVpi.setDescription('This attribute represents the calling (remote) VPI of the soft PVP. If the VPI value is not known, the attribute value is set to Unknown.')
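An instance of a column such as mscAtmIfVpcDstCalledAddress is named by appending the index arcs (mscAtmIfIndex, mscAtmIfVpcIndex, mscAtmIfVpcDstIndex) declared in setIndexNames to the column OID; a minimal sketch with assumed example index values:

```python
# Sketch only: build a column-instance OID by appending index arcs.
# The example index values (atm_if=1, vpi=20, dst=0) are assumed for
# illustration; mscAtmIfVpcDstIndex is NonReplicated, shown here as 0.
CALLED_ADDRESS_COLUMN = (1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 4, 8, 10, 1, 1)

def instance_oid(column, *index_arcs):
    """Append index arcs to a column OID to name one table instance."""
    return column + tuple(index_arcs)

print(".".join(map(str, instance_oid(CALLED_ADDRESS_COLUMN, 1, 20, 0))))
```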
mscAtmIfVccSrc = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8))
mscAtmIfVccSrcRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1), )
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmIfVccSrc components.')
mscAtmIfVccSrcRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVccSrc component.')
mscAtmIfVccSrcRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVccSrc components. These components can be added and deleted.')
mscAtmIfVccSrcComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVccSrcStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVccSrc tables.')
mscAtmIfVccSrcIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVccSrcIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcIndex.setDescription('This variable represents the index for the mscAtmIfVccSrc tables.')
mscAtmIfVccSrcProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10), )
if mibBuilder.loadTexts: mscAtmIfVccSrcProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcProvTable.setDescription('This attribute group contains the provisionable attributes of the SourcePvc component.')
mscAtmIfVccSrcProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVccSrcProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcProvEntry.setDescription('An entry in the mscAtmIfVccSrcProvTable.')
mscAtmIfVccSrcRemoteAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcRemoteAddress.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVccSrcRemoteAddress.setDescription('This attribute represents the remote address of the soft PVC.')
mscAtmIfVccSrcRemoteVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1, 2), IntegerSequence().subtype(subtypeSpec=ValueSizeConstraint(3, 9))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcRemoteVpiVci.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVccSrcRemoteVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC.')
mscAtmIfVccSrcCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 20)).clone(hexValue="")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcCallingAddress.setDescription('This attribute represents the calling address of the soft PVC. If it is a null string, then the calling address is the address of the current interface (that is, where the soft PVC originates).')
mscAtmIfVccSrcCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1, 4), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcCalledAddress.setDescription('This attribute represents the called (remote) address of the soft PVC.')
mscAtmIfVccSrcCalledVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 10, 1, 5), IntegerSequence().subtype(subtypeSpec=ValueSizeConstraint(3, 10))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVccSrcCalledVpiVci.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcCalledVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC.')
mscAtmIfVccSrcOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11), )
if mibBuilder.loadTexts: mscAtmIfVccSrcOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcOperTable.setDescription('This attribute group contains the operational attributes associated with the SrcPvp or SrcPvc component.')
mscAtmIfVccSrcOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVccSrcOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcOperEntry.setDescription('An entry in the mscAtmIfVccSrcOperTable.')
mscAtmIfVccSrcState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("active", 0), ("inactive", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcState.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcState.setDescription('This attribute indicates the state of the soft PVP or soft PVC.')
mscAtmIfVccSrcRetryCount = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcRetryCount.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcRetryCount.setDescription('This attribute indicates the number of failed attempts to set up the soft PVP or soft PVC since the last time the connection failed.')
mscAtmIfVccSrcLastFailureCauseCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcLastFailureCauseCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcLastFailureCauseCode.setDescription('This attribute contains the cause code in the last transmitted signalling message that contains the CAUSE information element. The cause code is used to describe the reason for generating certain signalling messages. The default value for this attribute is set to 0.')
mscAtmIfVccSrcLastFailureDiagCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 8, 11, 1, 4), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(0, 8))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccSrcLastFailureDiagCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccSrcLastFailureDiagCode.setDescription('This attribute contains the diagnostic code in the last transmitted signalling message. The diagnostic code is contained in the CAUSE information element and identifies an information element type or timer type. The diagnostic code is present only if a procedural error is detected by the signalling protocol. A diagnostic code is always accompanied by the cause code. If there is no failure, this attribute is set to NULL.')
mscAtmIfVccEp = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9))
mscAtmIfVccEpRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1), )
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVccEp components.')
mscAtmIfVccEpRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccEpIndex"))
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVccEp component.')
mscAtmIfVccEpRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVccEp components. These components cannot be added or deleted.')
mscAtmIfVccEpComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccEpComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVccEpStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccEpStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVccEp tables.')
mscAtmIfVccEpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVccEpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpIndex.setDescription('This variable represents the index for the mscAtmIfVccEp tables.')
mscAtmIfVccEpOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 10), )
if mibBuilder.loadTexts: mscAtmIfVccEpOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpOperTable.setDescription('This attribute group contains the operational attributes for the AtmEndPoint component.')
mscAtmIfVccEpOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccEpIndex"))
if mibBuilder.loadTexts: mscAtmIfVccEpOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpOperEntry.setDescription('An entry in the mscAtmIfVccEpOperTable.')
mscAtmIfVccEpApplicationName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 9, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccEpApplicationName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccEpApplicationName.setDescription('This attribute indicates the component name of the application associated with the switched VCC.')
mscAtmIfVccRp = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10))
mscAtmIfVccRpRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1), )
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVccRp components.')
mscAtmIfVccRpRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVccRp component.')
mscAtmIfVccRpRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVccRp components. These components cannot be added or deleted.')
mscAtmIfVccRpComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccRpComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVccRpStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccRpStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVccRp tables.')
mscAtmIfVccRpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVccRpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpIndex.setDescription('This variable represents the index for the mscAtmIfVccRp tables.')
mscAtmIfVccRpOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 10), )
if mibBuilder.loadTexts: mscAtmIfVccRpOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpOperTable.setDescription('This attribute group contains the operational attributes for the AtmRelayPoint component.')
mscAtmIfVccRpOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVccRpOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpOperEntry.setDescription('An entry in the mscAtmIfVccRpOperTable.')
mscAtmIfVccRpNextHop = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccRpNextHop.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVccRpNextHop.setDescription('This attribute indicates the component name of the Rp component with which this Rp component is associated.')
mscAtmIfVccRpNextHopsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 430), )
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsTable.setDescription('This attribute indicates the component name(s) of the Rp component(s) with which this Rp component is associated. This attribute can have more than one component name only when the Vcc distributionType is pointToMultipoint and the callDirection is fromLink.')
mscAtmIfVccRpNextHopsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 430, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccRpIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccRpNextHopsValue"))
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsEntry.setDescription('An entry in the mscAtmIfVccRpNextHopsTable.')
mscAtmIfVccRpNextHopsValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 10, 430, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccRpNextHopsValue.setDescription('This variable represents both the value and the index for the mscAtmIfVccRpNextHopsTable.')
mscAtmIfVccDst = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11))
mscAtmIfVccDstRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1), )
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVccDst components.')
mscAtmIfVccDstRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVccDst component.')
mscAtmIfVccDstRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVccDst components. These components cannot be added or deleted.')
mscAtmIfVccDstComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVccDstStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVccDst tables.')
mscAtmIfVccDstIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVccDstIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstIndex.setDescription('This variable represents the index for the mscAtmIfVccDst tables.')
mscAtmIfVccDstOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 10), )
if mibBuilder.loadTexts: mscAtmIfVccDstOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstOperTable.setDescription('This attribute group contains the operational attributes for the DestinationPvc component.')
mscAtmIfVccDstOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVccDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVccDstOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstOperEntry.setDescription('An entry in the mscAtmIfVccDstOperTable.')
mscAtmIfVccDstCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstCalledAddress.setDescription('This attribute represents the called address of the soft PVC.')
mscAtmIfVccDstCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 10, 1, 4), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(7, 40))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstCallingAddress.setDescription('This attribute represents the remote address of the soft PVC. If the address is not known, this attribute is set to Unknown.')
mscAtmIfVccDstCallingVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 5, 11, 10, 1, 5), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(7, 9))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVccDstCallingVpiVci.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVccDstCallingVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC. If the VPI and VCI values are not known, this attribute is set to Unknown.')
mscAtmIfVptVccSrc = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8))
mscAtmIfVptVccSrcRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1), )
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVptVccSrc components.')
mscAtmIfVptVccSrcRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVptVccSrc component.')
mscAtmIfVptVccSrcRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVptVccSrc components. These components can be added and deleted.')
mscAtmIfVptVccSrcComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVptVccSrcStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVptVccSrc tables.')
mscAtmIfVptVccSrcIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVptVccSrcIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcIndex.setDescription('This variable represents the index for the mscAtmIfVptVccSrc tables.')
mscAtmIfVptVccSrcProvTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10), )
if mibBuilder.loadTexts: mscAtmIfVptVccSrcProvTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcProvTable.setDescription('This attribute group contains the provisionable attributes of the SourcePvc component.')
mscAtmIfVptVccSrcProvEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccSrcProvEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcProvEntry.setDescription('An entry in the mscAtmIfVptVccSrcProvTable.')
mscAtmIfVptVccSrcRemoteAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1, 1), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRemoteAddress.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRemoteAddress.setDescription('This attribute represents the remote address of the soft PVC.')
mscAtmIfVptVccSrcRemoteVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1, 2), IntegerSequence().subtype(subtypeSpec=ValueSizeConstraint(3, 9))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRemoteVpiVci.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRemoteVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC.')
mscAtmIfVptVccSrcCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(0, 20)).clone(hexValue="")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCallingAddress.setDescription('This attribute represents the calling address of the soft PVC. If it is a null string, then the calling address is the address of the current interface (that is, where the soft PVC originates).')
mscAtmIfVptVccSrcCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1, 4), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCalledAddress.setDescription('This attribute represents the called (remote) address of the soft PVC.')
mscAtmIfVptVccSrcCalledVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 10, 1, 5), IntegerSequence().subtype(subtypeSpec=ValueSizeConstraint(3, 10))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCalledVpiVci.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcCalledVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC.')
mscAtmIfVptVccSrcOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11), )
if mibBuilder.loadTexts: mscAtmIfVptVccSrcOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcOperTable.setDescription('This attribute group contains the operational attributes associated with the SrcPvp or SrcPvc component.')
mscAtmIfVptVccSrcOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccSrcIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccSrcOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcOperEntry.setDescription('An entry in the mscAtmIfVptVccSrcOperTable.')
mscAtmIfVptVccSrcState = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("active", 0), ("inactive", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcState.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcState.setDescription('This attribute indicates the state of the soft PVP or soft PVC.')
mscAtmIfVptVccSrcRetryCount = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRetryCount.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcRetryCount.setDescription('This attribute indicates the number of failed attempts to set up the soft PVP or soft PVC since the last time the connection failed.')
mscAtmIfVptVccSrcLastFailureCauseCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcLastFailureCauseCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcLastFailureCauseCode.setDescription('This attribute contains the cause code in the last transmitted signalling message that contains the CAUSE information element. The cause code is used to describe the reason for generating certain signalling messages. The default value for this attribute is set to 0.')
mscAtmIfVptVccSrcLastFailureDiagCode = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 8, 11, 1, 4), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(0, 8))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccSrcLastFailureDiagCode.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccSrcLastFailureDiagCode.setDescription('This attribute contains the diagnostic code in the last transmitted signalling message. The diagnostic code is contained in the CAUSE information element and identifies an information element type or timer type. The diagnostic code is present only if a procedural error is detected by the signalling protocol. A diagnostic code is always accompanied by the cause code. If there is no failure, this attribute is set to NULL.')
mscAtmIfVptVccEp = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9))
mscAtmIfVptVccEpRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1), )
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVptVccEp components.')
mscAtmIfVptVccEpRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccEpIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVptVccEp component.')
mscAtmIfVptVccEpRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVptVccEp components. These components cannot be added or deleted.')
mscAtmIfVptVccEpComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccEpComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVptVccEpStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccEpStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVptVccEp tables.')
mscAtmIfVptVccEpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVptVccEpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpIndex.setDescription('This variable represents the index for the mscAtmIfVptVccEp tables.')
mscAtmIfVptVccEpOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 10), )
if mibBuilder.loadTexts: mscAtmIfVptVccEpOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpOperTable.setDescription('This attribute group contains the operational attributes for the AtmEndPoint component.')
mscAtmIfVptVccEpOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccEpIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccEpOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpOperEntry.setDescription('An entry in the mscAtmIfVptVccEpOperTable.')
mscAtmIfVptVccEpApplicationName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 9, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccEpApplicationName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccEpApplicationName.setDescription('This attribute indicates the component name of the application associated with the switched VCC.')
mscAtmIfVptVccRp = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10))
mscAtmIfVptVccRpRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1), )
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatusTable.setDescription('This table controls the addition and deletion of mscAtmIfVptVccRp components.')
mscAtmIfVptVccRpRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVptVccRp component.')
mscAtmIfVptVccRpRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVptVccRp components. These components cannot be added or deleted.')
mscAtmIfVptVccRpComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccRpComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVptVccRpStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccRpStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVptVccRp tables.')
mscAtmIfVptVccRpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVptVccRpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpIndex.setDescription('This variable represents the index for the mscAtmIfVptVccRp tables.')
mscAtmIfVptVccRpOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 10), )
if mibBuilder.loadTexts: mscAtmIfVptVccRpOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpOperTable.setDescription('This attribute group contains the operational attributes for the AtmRelayPoint component.')
mscAtmIfVptVccRpOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccRpIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccRpOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpOperEntry.setDescription('An entry in the mscAtmIfVptVccRpOperTable.')
mscAtmIfVptVccRpNextHop = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 10, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHop.setStatus('obsolete')
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHop.setDescription('This attribute indicates the component name of the Rp component with which this Rp component is associated.')
mscAtmIfVptVccRpNextHopsTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 430), )
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsTable.setDescription('This attribute indicates the component name(s) of the Rp component(s) with which this Rp component is associated. This attribute can have more than one component name only when the Vcc distributionType is pointToMultipoint and the callDirection is fromLink.')
mscAtmIfVptVccRpNextHopsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 430, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccRpIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccRpNextHopsValue"))
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsEntry.setDescription('An entry in the mscAtmIfVptVccRpNextHopsTable.')
mscAtmIfVptVccRpNextHopsValue = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 10, 430, 1, 1), RowPointer()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsValue.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccRpNextHopsValue.setDescription('This variable represents both the value and the index for the mscAtmIfVptVccRpNextHopsTable.')
mscAtmIfVptVccDst = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11))
mscAtmIfVptVccDstRowStatusTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1), )
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatusTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatusTable.setDescription('This entry controls the addition and deletion of mscAtmIfVptVccDst components.')
mscAtmIfVptVccDstRowStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatusEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatusEntry.setDescription('A single entry in the table represents a single mscAtmIfVptVccDst component.')
mscAtmIfVptVccDstRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1, 1, 1), RowStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatus.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstRowStatus.setDescription('This variable is used as the basis for SNMP naming of mscAtmIfVptVccDst components. These components cannot be added or deleted.')
mscAtmIfVptVccDstComponentName = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstComponentName.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstComponentName.setDescription("This variable provides the component's string name for use with the ASCII Console Interface.")
mscAtmIfVptVccDstStorageType = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1, 1, 4), StorageType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstStorageType.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstStorageType.setDescription('This variable represents the storage type value for the mscAtmIfVptVccDst tables.')
mscAtmIfVptVccDstIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 1, 1, 10), NonReplicated())
if mibBuilder.loadTexts: mscAtmIfVptVccDstIndex.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstIndex.setDescription('This variable represents the index for the mscAtmIfVptVccDst tables.')
mscAtmIfVptVccDstOperTable = MibTable((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 10), )
if mibBuilder.loadTexts: mscAtmIfVptVccDstOperTable.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstOperTable.setDescription('This attribute group contains the operational attributes for the DestinationPvc component.')
mscAtmIfVptVccDstOperEntry = MibTableRow((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 10, 1), ).setIndexNames((0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmCoreMIB", "mscAtmIfVptVccIndex"), (0, "Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", "mscAtmIfVptVccDstIndex"))
if mibBuilder.loadTexts: mscAtmIfVptVccDstOperEntry.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstOperEntry.setDescription('An entry in the mscAtmIfVptVccDstOperTable.')
mscAtmIfVptVccDstCalledAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 10, 1, 3), HexString().subtype(subtypeSpec=ValueSizeConstraint(20, 20)).setFixedLength(20)).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstCalledAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstCalledAddress.setDescription('This attribute represents the called address of the soft PVC.')
mscAtmIfVptVccDstCallingAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 10, 1, 4), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(7, 40))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstCallingAddress.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstCallingAddress.setDescription('This attribute represents the remote address of the soft PVC. If the address is not known, then the value of this address is Unknown.')
mscAtmIfVptVccDstCallingVpiVci = MibTableColumn((1, 3, 6, 1, 4, 1, 562, 36, 2, 1, 114, 9, 20, 11, 10, 1, 5), AsciiString().subtype(subtypeSpec=ValueSizeConstraint(7, 9))).setMaxAccess("readonly")
if mibBuilder.loadTexts: mscAtmIfVptVccDstCallingVpiVci.setStatus('mandatory')
if mibBuilder.loadTexts: mscAtmIfVptVccDstCallingVpiVci.setDescription('This attribute represents the remote VPI and VCI of the soft PVC. If the VPI and VCI values are not known, this attribute is set to Unknown.')
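The called address above is a fixed-length 20-octet HexString (an ATM End System Address). As an illustrative sketch only, not part of the generated MIB, a management application might render such a value like this:

```python
def format_atm_address(octets):
    """Render a fixed-length 20-octet ATM End System Address, such as a
    mscAtmIfVptVccDstCalledAddress value, as colon-separated hex."""
    data = bytes(octets)
    if len(data) != 20:
        raise ValueError("ATM End System Addresses are exactly 20 octets")
    return ":".join("%02x" % b for b in data)

# An all-zero address renders as twenty "00" groups.
print(format_atm_address(bytes(20)))
```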
atmNetworkingGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 1))
atmNetworkingGroupCA = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 1, 1))
atmNetworkingGroupCA02 = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 1, 1, 3))
atmNetworkingGroupCA02A = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 1, 1, 3, 2))
atmNetworkingCapabilities = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 3))
atmNetworkingCapabilitiesCA = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 3, 1))
atmNetworkingCapabilitiesCA02 = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 3, 1, 3))
atmNetworkingCapabilitiesCA02A = MibIdentifier((1, 3, 6, 1, 4, 1, 562, 36, 2, 2, 42, 3, 1, 3, 2))
mibBuilder.exportSymbols("Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", mscARtgPnniThreshParmsTable=mscARtgPnniThreshParmsTable, mscARtgPnniCfgNodeNbrStatsEntry=mscARtgPnniCfgNodeNbrStatsEntry, mscARtgPnniCfgNodeDefSAddrIndex=mscARtgPnniCfgNodeDefSAddrIndex, mscARtgDnaDestInfoRowStatusEntry=mscARtgDnaDestInfoRowStatusEntry, mscARtgPnniPtseLifetimeFactor=mscARtgPnniPtseLifetimeFactor, mscARtgPnniCfgNodeDefSAddrAddress=mscARtgPnniCfgNodeDefSAddrAddress, atmNetworkingGroupCA02=atmNetworkingGroupCA02, atmNetworkingCapabilitiesCA02=atmNetworkingCapabilitiesCA02, mscARtgPnniRfFqpIndex=mscARtgPnniRfFqpIndex, mscARtgPnniCfgNodeDefSAddrOperTable=mscARtgPnniCfgNodeDefSAddrOperTable, mscARtgPnniPglInitTime=mscARtgPnniPglInitTime, mscARtgPnniCfgNodeSAddrPrefixLengthIndex=mscARtgPnniCfgNodeSAddrPrefixLengthIndex, mscAtmIfVptVccRpComponentName=mscAtmIfVptVccRpComponentName, mscAtmIfVccDstOperTable=mscAtmIfVccDstOperTable, mscARtgPnniRfBqpEntry=mscARtgPnniRfBqpEntry, mscARtgPnniCfgNodeNbrPtseTx=mscARtgPnniCfgNodeNbrPtseTx, mscARtgPnniTopNodeRowStatusTable=mscARtgPnniTopNodeRowStatusTable, mscARtgPnniDomain=mscARtgPnniDomain, mscAtmIfVptVccSrcComponentName=mscAtmIfVptVccSrcComponentName, mscARtgPnniRowStatus=mscARtgPnniRowStatus, mscARtgPnniTopIndex=mscARtgPnniTopIndex, mscARtgPnniRfTxTdpTable=mscARtgPnniRfTxTdpTable, mscARtgPnniCfgNodeSAddrRowStatusTable=mscARtgPnniCfgNodeSAddrRowStatusTable, mscAtmIfVccEpStorageType=mscAtmIfVccEpStorageType, mscAtmIfVpcSrcLastFailureDiagCode=mscAtmIfVpcSrcLastFailureDiagCode, mscARtgPnniCfgNodeDefSAddrStorageType=mscARtgPnniCfgNodeDefSAddrStorageType, mscAtmCRCallsRouted=mscAtmCRCallsRouted, mscARtgPnniRfFqpValue=mscARtgPnniRfFqpValue, mscARtgPnniSuccessfulRoutingAttempts=mscARtgPnniSuccessfulRoutingAttempts, mscAtmIfVptVccEpIndex=mscAtmIfVptVccEpIndex, mscARtgPnniReElectionInterval=mscARtgPnniReElectionInterval, mscARtgPnniCfgNodeDefSAddrRowStatus=mscARtgPnniCfgNodeDefSAddrRowStatus, mscARtgStorageType=mscARtgStorageType, 
mscARtgPnniCfgNodeNbrIndex=mscARtgPnniCfgNodeNbrIndex, mscARtgPnniCfgNodeNbrBadPtseAckRx=mscARtgPnniCfgNodeNbrBadPtseAckRx, mscAtmIfVccSrcProvEntry=mscAtmIfVccSrcProvEntry, mscARtgPnniTopologyMemoryExhaustion=mscARtgPnniTopologyMemoryExhaustion, mscAtmIfVptVccSrcCallingAddress=mscAtmIfVptVccSrcCallingAddress, mscAtmIfVptVccSrcStorageType=mscAtmIfVptVccSrcStorageType, mscARtgPnniHelloInterval=mscARtgPnniHelloInterval, mscAtmIfVccDstComponentName=mscAtmIfVccDstComponentName, mscARtgPnniPtseParmsTable=mscARtgPnniPtseParmsTable, mscAtmIfVpcSrcComponentName=mscAtmIfVpcSrcComponentName, mscAtmIfVptVccSrcRetryCount=mscAtmIfVptVccSrcRetryCount, mscAtmIfVptVccSrcRemoteVpiVci=mscAtmIfVptVccSrcRemoteVpiVci, mscAtmCRRowStatusEntry=mscAtmCRRowStatusEntry, mscARtgPnniIndex=mscARtgPnniIndex, mscAtmCRDnaStorageType=mscAtmCRDnaStorageType, mscARtgPnniTopNodeLinkRowStatusEntry=mscARtgPnniTopNodeLinkRowStatusEntry, mscAtmIfVptVccRpRowStatus=mscAtmIfVptVccRpRowStatus, mscAtmIfVptVccEp=mscAtmIfVptVccEp, mscAtmIfVptVccRpNextHop=mscAtmIfVptVccRpNextHop, mscAtmIfVccRpNextHop=mscAtmIfVccRpNextHop, mscAtmIfVpcDst=mscAtmIfVpcDst, mscARtgPnniTopNodeLinkComponentName=mscARtgPnniTopNodeLinkComponentName, mscAtmCRProvEntry=mscAtmCRProvEntry, mscARtgDnaDestInfoOperTable=mscARtgDnaDestInfoOperTable, mscARtgPnniThreshParmsEntry=mscARtgPnniThreshParmsEntry, mscARtgPnniCfgNodeNbr=mscARtgPnniCfgNodeNbr, mscARtgPnniHlParmsEntry=mscARtgPnniHlParmsEntry, mscAtmIfVccRpComponentName=mscAtmIfVccRpComponentName, mscARtgPnniCfgNodeDefSAddr=mscARtgPnniCfgNodeDefSAddr, mscAtmIfVptVccSrcProvTable=mscAtmIfVptVccSrcProvTable, mscARtgPnniDefaultScope=mscARtgPnniDefaultScope, mscARtgPnniTopComponentName=mscARtgPnniTopComponentName, mscAtmIfVccSrcStorageType=mscAtmIfVccSrcStorageType, mscARtgPnniCfgNodeNumRccs=mscARtgPnniCfgNodeNumRccs, mscAtmIfVptVccEpRowStatusTable=mscAtmIfVptVccEpRowStatusTable, mscARtgPnniRfRxTdpIndex=mscARtgPnniRfRxTdpIndex, mscAtmIfVpcRp=mscAtmIfVpcRp, 
mscARtgPnniNodeAddressPrefix=mscARtgPnniNodeAddressPrefix, mscAtmIfVpcRpRowStatus=mscAtmIfVpcRpRowStatus, mscAtmIfVptVccDstCalledAddress=mscAtmIfVptVccDstCalledAddress, mscAtmIfVccDstCallingAddress=mscAtmIfVccDstCallingAddress, mscARtgPnniCfgNodeSAddrRowStatusEntry=mscARtgPnniCfgNodeSAddrRowStatusEntry, mscARtgPnniRfTxTrafficDescType=mscARtgPnniRfTxTrafficDescType, mscAtmIfVptVccEpStorageType=mscAtmIfVptVccEpStorageType, mscARtgPnniAvcrPm=mscARtgPnniAvcrPm, mscAtmIfVpcSrcRowStatusTable=mscAtmIfVpcSrcRowStatusTable, mscAtmIfVpcSrcStorageType=mscAtmIfVpcSrcStorageType, mscARtgPnniCfgNodeSAddrScope=mscARtgPnniCfgNodeSAddrScope, mscAtmIfVccSrcCalledAddress=mscAtmIfVccSrcCalledAddress, mscAtmIfVpcDstRowStatusTable=mscAtmIfVpcDstRowStatusTable, mscAtmIfVpcDstIndex=mscAtmIfVpcDstIndex, mscARtgPnniCfgNodeNbrPtseRx=mscARtgPnniCfgNodeNbrPtseRx, atmNetworkingGroupCA=atmNetworkingGroupCA, mscARtgPnniTopNodeComponentName=mscARtgPnniTopNodeComponentName, atmNetworkingMIB=atmNetworkingMIB, mscARtgPnniCfgNodeNodeId=mscARtgPnniCfgNodeNodeId, mscAtmCRStatsEntry=mscAtmCRStatsEntry, mscAtmIfVccSrcState=mscAtmIfVccSrcState, mscARtgPnniCfgNodeSAddrSuppress=mscARtgPnniCfgNodeSAddrSuppress, mscAtmIfVccDstIndex=mscAtmIfVccDstIndex, mscAtmIfVccRp=mscAtmIfVccRp, mscAtmIfVccRpStorageType=mscAtmIfVccRpStorageType, mscARtgPnniOperEntry=mscARtgPnniOperEntry, mscARtgPnniRfRowStatusTable=mscARtgPnniRfRowStatusTable, mscARtgDnaRowStatus=mscARtgDnaRowStatus, mscARtgPnni=mscARtgPnni, mscARtgPnniTopNodeRowStatus=mscARtgPnniTopNodeRowStatus, mscAtmIfVccSrcProvTable=mscAtmIfVccSrcProvTable, mscARtgPnniStatsEntry=mscARtgPnniStatsEntry, mscAtmIfVptVccSrcOperTable=mscAtmIfVptVccSrcOperTable, mscAtmIfVptVccEpOperTable=mscAtmIfVptVccEpOperTable, mscAtmCRDnaDestinationNameValue=mscAtmCRDnaDestinationNameValue, mscAtmIfVpcRpNextHopsTable=mscAtmIfVpcRpNextHopsTable, mscAtmIfVptVccDstStorageType=mscAtmIfVptVccDstStorageType, mscAtmIfVpcRpStorageType=mscAtmIfVpcRpStorageType, 
mscARtgPnniTopStorageType=mscARtgPnniTopStorageType, mscAtmCRDnaDestinationNameTable=mscAtmCRDnaDestinationNameTable, mscARtgPnniRfFqpEntry=mscARtgPnniRfFqpEntry, mscAtmIfVptVccSrcRemoteAddress=mscAtmIfVptVccSrcRemoteAddress, mscARtgDnaDestInfoRowStatus=mscARtgDnaDestInfoRowStatus, mscAtmIfVccRpIndex=mscAtmIfVccRpIndex, mscAtmIfVptVccDstRowStatus=mscAtmIfVptVccDstRowStatus, mscARtgPnniCfgNodeDefSAddrOperEntry=mscARtgPnniCfgNodeDefSAddrOperEntry, mscAtmIfVccSrcOperEntry=mscAtmIfVccSrcOperEntry, mscARtgPnniCfgNodeComponentName=mscARtgPnniCfgNodeComponentName, mscAtmIfVptVccDst=mscAtmIfVptVccDst, mscAtmIfVptVccSrcRowStatus=mscAtmIfVptVccSrcRowStatus, mscARtgPnniPortComponentName=mscARtgPnniPortComponentName, mscARtgPnniHelloInactivityFactor=mscARtgPnniHelloInactivityFactor, mscAtmIfVpcSrcRetryCount=mscAtmIfVpcSrcRetryCount, mscAtmIfVccDstStorageType=mscAtmIfVccDstStorageType, mscARtgPnniCfgNodeOpNodeId=mscARtgPnniCfgNodeOpNodeId, mscARtgPnniRfBestEffort=mscARtgPnniRfBestEffort, mscARtgDnaDestInfoReachability=mscARtgDnaDestInfoReachability, mscARtgPnniRfCriteriaTable=mscARtgPnniRfCriteriaTable, mscARtgPnniRequestRxmtInterval=mscARtgPnniRequestRxmtInterval, mscARtg=mscARtg, mscARtgPnniCfgNodeNbrRowStatusEntry=mscARtgPnniCfgNodeNbrRowStatusEntry, mscARtgPnniHlParmsTable=mscARtgPnniHlParmsTable, mscAtmIfVccRpRowStatusEntry=mscAtmIfVccRpRowStatusEntry, mscARtgPnniRfClippingBbc=mscARtgPnniRfClippingBbc, mscARtgDnaDestInfoIndex=mscARtgDnaDestInfoIndex, mscAtmIfVptVccDstCallingAddress=mscAtmIfVptVccDstCallingAddress, mscARtgDnaDestInfoOperEntry=mscARtgDnaDestInfoOperEntry, mscARtgPnniAlternateRoutingAttempts=mscARtgPnniAlternateRoutingAttempts, mscARtgPnniCfgNodeNbrPtspTx=mscARtgPnniCfgNodeNbrPtspTx, mscARtgPnniFailedRoutingAttempts=mscARtgPnniFailedRoutingAttempts, mscAtmIfVptVccDstCallingVpiVci=mscAtmIfVptVccDstCallingVpiVci, mscARtgPnniCfgNodeStorageType=mscARtgPnniCfgNodeStorageType, mscARtgPnniRfAtmServiceCategory=mscARtgPnniRfAtmServiceCategory, 
mscAtmIfVccEpIndex=mscAtmIfVccEpIndex, mscAtmIfVccSrcRowStatus=mscAtmIfVccSrcRowStatus, mscARtgPnniCfgNodeNbrRccListValue=mscARtgPnniCfgNodeNbrRccListValue, mscARtgPnniCfgNodeProvTable=mscARtgPnniCfgNodeProvTable, mscAtmIfVptVccRpNextHopsValue=mscAtmIfVptVccRpNextHopsValue, mscAtmIfVpcSrcRowStatusEntry=mscAtmIfVpcSrcRowStatusEntry, mscARtgPnniTopNodeAddrAddressIndex=mscARtgPnniTopNodeAddrAddressIndex, mscAtmIfVccSrcRemoteAddress=mscAtmIfVccSrcRemoteAddress, mscARtgPnniTopOperTable=mscARtgPnniTopOperTable, mscAtmIfVccSrcLastFailureCauseCode=mscAtmIfVccSrcLastFailureCauseCode, mscARtgPnniRfTxTdpIndex=mscARtgPnniRfTxTdpIndex, mscARtgPnniCfgNodeIndex=mscARtgPnniCfgNodeIndex, mscAtmIfVptVccDstIndex=mscAtmIfVptVccDstIndex, mscAtmIfVpcSrcIndex=mscAtmIfVpcSrcIndex, mscARtgPnniCfgNodeNbrDbSummariesTx=mscARtgPnniCfgNodeNbrDbSummariesTx, mscARtgPnniCfgNodeSAddrComponentName=mscARtgPnniCfgNodeSAddrComponentName, mscARtgPnniRfFwdQosClass=mscARtgPnniRfFwdQosClass, mscAtmCRStatsTable=mscAtmCRStatsTable, mscAtmIfVpcRpRowStatusEntry=mscAtmIfVpcRpRowStatusEntry, mscARtgDnaDestInfo=mscARtgDnaDestInfo, mscARtgPnniCfgNodeNbrPtspRx=mscARtgPnniCfgNodeNbrPtspRx, mscAtmIfVccEp=mscAtmIfVccEp, mscAtmIfVptVccDstOperEntry=mscAtmIfVptVccDstOperEntry, mscARtgPnniTop=mscARtgPnniTop, mscAtmIfVpcSrcLastFailureCauseCode=mscAtmIfVpcSrcLastFailureCauseCode, mscARtgRowStatusTable=mscARtgRowStatusTable, mscAtmIfVptVccSrcCalledVpiVci=mscAtmIfVptVccSrcCalledVpiVci, mscAtmIfVptVccRpStorageType=mscAtmIfVptVccRpStorageType, mscARtgPnniCfgNodeNbrOperTable=mscARtgPnniCfgNodeNbrOperTable, mscARtgPnniTopNodeLinkOperEntry=mscARtgPnniTopNodeLinkOperEntry, mscAtmIfVpcRpOperTable=mscAtmIfVpcRpOperTable, mscAtmIfVccEpRowStatusEntry=mscAtmIfVccEpRowStatusEntry, mscARtgPnniTopNodeAddrScope=mscARtgPnniTopNodeAddrScope, mscAtmIfVpcRpOperEntry=mscAtmIfVpcRpOperEntry, mscARtgPnniRfDestinationAddress=mscARtgPnniRfDestinationAddress, mscARtgPnniCfgNodeNbrRowStatus=mscARtgPnniCfgNodeNbrRowStatus, 
mscARtgPnniTopNodeAddrRowStatus=mscARtgPnniTopNodeAddrRowStatus, mscARtgPnniCfgNodeRowStatus=mscARtgPnniCfgNodeRowStatus, mscARtgPnniCfgNodeDefSAddrRowStatusEntry=mscARtgPnniCfgNodeDefSAddrRowStatusEntry, mscARtgPnniCfgNodeDefSAddrDefAddrEntry=mscARtgPnniCfgNodeDefSAddrDefAddrEntry, mscAtmIfVptVccSrcOperEntry=mscAtmIfVptVccSrcOperEntry, mscAtmIfVpcSrcOperEntry=mscAtmIfVpcSrcOperEntry, mscARtgDnaDestInfoType=mscARtgDnaDestInfoType, mscARtgPnniPortRowStatus=mscARtgPnniPortRowStatus, mscARtgStatsTable=mscARtgStatsTable, mscAtmIfVptVccSrcLastFailureCauseCode=mscAtmIfVptVccSrcLastFailureCauseCode, mscAtmIfVptVccRpNextHopsEntry=mscAtmIfVptVccRpNextHopsEntry, mscAtmIfVptVccDstRowStatusTable=mscAtmIfVptVccDstRowStatusTable, mscARtgPnniRfBqpTable=mscARtgPnniRfBqpTable, mscARtgPnniRowStatusEntry=mscARtgPnniRowStatusEntry, mscARtgPnniOverrideDelay=mscARtgPnniOverrideDelay, mscARtgPnniRfIndex=mscARtgPnniRfIndex, mscAtmIfVccDstCallingVpiVci=mscAtmIfVccDstCallingVpiVci, mscARtgPnniCfgNodeNbrComponentName=mscARtgPnniCfgNodeNbrComponentName, mscAtmIfVccRpNextHopsEntry=mscAtmIfVccRpNextHopsEntry, mscARtgPnniPortRowStatusEntry=mscARtgPnniPortRowStatusEntry, mscARtgPnniStorageType=mscARtgPnniStorageType, mscAtmIfVptVccRpOperEntry=mscAtmIfVptVccRpOperEntry, mscARtgPnniTopNodeLinkRowStatus=mscARtgPnniTopNodeLinkRowStatus, mscARtgPnniRfTxTdpEntry=mscARtgPnniRfTxTdpEntry, mscARtgPnniCfgNodeOperEntry=mscARtgPnniCfgNodeOperEntry, mscARtgPnniCfgNodeSAddrState=mscARtgPnniCfgNodeSAddrState, mscARtgPnniCfgNodeSAddrProvEntry=mscARtgPnniCfgNodeSAddrProvEntry, mscARtgRowStatusEntry=mscARtgRowStatusEntry, mscARtgPnniRfRxTdpEntry=mscARtgPnniRfRxTdpEntry, mscAtmIfVccSrcIndex=mscAtmIfVccSrcIndex, mscARtgDnaDestInfoComponentName=mscARtgDnaDestInfoComponentName, mscAtmIfVptVccSrcState=mscAtmIfVptVccSrcState, mscAtmIfVccDstRowStatus=mscAtmIfVccDstRowStatus, mscARtgPnniProvTable=mscARtgPnniProvTable, mscARtgPnniRf=mscARtgPnniRf, mscAtmIfVccSrc=mscAtmIfVccSrc, 
mscARtgPnniCfgNodeCurrentLeadershipPriority=mscARtgPnniCfgNodeCurrentLeadershipPriority, mscARtgPnniPortStorageType=mscARtgPnniPortStorageType, mscAtmIfVccDstRowStatusEntry=mscAtmIfVccDstRowStatusEntry, mscARtgPnniCfgNodeNbrRowStatusTable=mscARtgPnniCfgNodeNbrRowStatusTable, mscARtgPnniRfBearerClassBbc=mscARtgPnniRfBearerClassBbc, mscARtgPnniCfgNodeNbrBadPtseReqRx=mscARtgPnniCfgNodeNbrBadPtseReqRx, mscAtmIfVpcDstOperTable=mscAtmIfVpcDstOperTable, mscAtmIfVptVccSrcProvEntry=mscAtmIfVptVccSrcProvEntry, mscARtgDnaDestInfoRowStatusTable=mscARtgDnaDestInfoRowStatusTable, mscARtgPnniRfRowStatusEntry=mscARtgPnniRfRowStatusEntry, mscAtmIfVpcSrcCalledVpi=mscAtmIfVpcSrcCalledVpi, mscAtmCRNodeAddress=mscAtmCRNodeAddress, mscAtmCRDna=mscAtmCRDna, mscAtmCRProvTable=mscAtmCRProvTable, mscARtgPnniCfgNodeSAddrRowStatus=mscARtgPnniCfgNodeSAddrRowStatus, mscAtmCRDnaRowStatusTable=mscAtmCRDnaRowStatusTable, mscARtgPnniRfRowStatus=mscARtgPnniRfRowStatus, mscAtmCRComponentName=mscAtmCRComponentName, mscAtmIfVccSrcRowStatusTable=mscAtmIfVccSrcRowStatusTable, mscAtmIfVptVccSrcRowStatusTable=mscAtmIfVptVccSrcRowStatusTable, mscARtgPnniCfgNodeDefSAddrRowStatusTable=mscARtgPnniCfgNodeDefSAddrRowStatusTable, mscARtgPnniTopNode=mscARtgPnniTopNode, mscARtgPnniPort=mscARtgPnniPort, mscAtmIfVpcSrcProvTable=mscAtmIfVpcSrcProvTable, mscARtgPnniTopNodeLinkIndex=mscARtgPnniTopNodeLinkIndex, mscARtgPnniOperTable=mscARtgPnniOperTable, mscARtgPnniPortIndex=mscARtgPnniPortIndex, mscAtmIfVptVccDstComponentName=mscAtmIfVptVccDstComponentName, mscAtmIfVccSrcRemoteVpiVci=mscAtmIfVccSrcRemoteVpiVci, mscARtgPnniTopPtsesInDatabase=mscARtgPnniTopPtsesInDatabase, mscAtmIfVccRpRowStatusTable=mscAtmIfVccRpRowStatusTable, mscARtgPnniRfOptimizationMetric=mscARtgPnniRfOptimizationMetric, mscAtmIfVpcSrcOperTable=mscAtmIfVpcSrcOperTable, mscARtgPnniTopNodeLinkOperTable=mscARtgPnniTopNodeLinkOperTable, mscAtmCRRowStatusTable=mscAtmCRRowStatusTable, 
mscARtgPnniCfgNodeDefSAddrDefAddrTable=mscARtgPnniCfgNodeDefSAddrDefAddrTable, mscAtmIfVpcDstStorageType=mscAtmIfVpcDstStorageType, mscARtgPnniTopNodeLinkRemoteNodeId=mscARtgPnniTopNodeLinkRemoteNodeId, mscARtgPnniCfgNodeOperTable=mscARtgPnniCfgNodeOperTable)
mibBuilder.exportSymbols("Nortel-MsCarrier-MscPassport-AtmNetworkingMIB", mscARtgPnniPglParmsEntry=mscARtgPnniPglParmsEntry, mscAtmIfVccSrcOperTable=mscAtmIfVccSrcOperTable, mscARtgPnniCfgNodeDefSAddrComponentName=mscARtgPnniCfgNodeDefSAddrComponentName, mscARtgPnniTopNodeAddrReachabilityIndex=mscARtgPnniTopNodeAddrReachabilityIndex, mscARtgPnniRfCriteriaEntry=mscARtgPnniRfCriteriaEntry, mscARtgPnniPeerDelayedAckInterval=mscARtgPnniPeerDelayedAckInterval, mscAtmIfVpcSrcRowStatus=mscAtmIfVpcSrcRowStatus, mscAtmIfVccDst=mscAtmIfVccDst, mscARtgPnniTopRowStatusEntry=mscARtgPnniTopRowStatusEntry, mscARtgPnniRfFqpTable=mscARtgPnniRfFqpTable, mscARtgPnniPortOperTable=mscARtgPnniPortOperTable, mscARtgPnniComponentName=mscARtgPnniComponentName, mscAtmIfVccSrcCalledVpiVci=mscAtmIfVccSrcCalledVpiVci, mscAtmIfVccSrcRowStatusEntry=mscAtmIfVccSrcRowStatusEntry, mscAtmIfVpcRpIndex=mscAtmIfVpcRpIndex, mscARtgPnniTopNodeLinkRemotePortId=mscARtgPnniTopNodeLinkRemotePortId, mscARtgPnniCfgNodeNbrBadPtspRx=mscARtgPnniCfgNodeNbrBadPtspRx, mscAtmIfVptVccRpRowStatusEntry=mscAtmIfVptVccRpRowStatusEntry, mscAtmIfVpcDstCallingAddress=mscAtmIfVpcDstCallingAddress, mscARtgPnniPglParmsTable=mscARtgPnniPglParmsTable, mscARtgPnniPortStdComponentName=mscARtgPnniPortStdComponentName, atmNetworkingGroup=atmNetworkingGroup, mscARtgDnaStorageType=mscARtgDnaStorageType, mscARtgDnaDestInfoStdComponentName=mscARtgDnaDestInfoStdComponentName, mscAtmCRCallsFailed=mscAtmCRCallsFailed, mscARtgRoutingAttempts=mscARtgRoutingAttempts, mscAtmIfVptVccDstOperTable=mscAtmIfVptVccDstOperTable, mscARtgPnniMaxAlternateRoutesOnCrankback=mscARtgPnniMaxAlternateRoutesOnCrankback, mscAtmIfVptVccSrcIndex=mscAtmIfVptVccSrcIndex, mscAtmIfVptVccRpRowStatusTable=mscAtmIfVptVccRpRowStatusTable, mscARtgPnniRfBqpIndex=mscARtgPnniRfBqpIndex, mscAtmIfVpcDstRowStatus=mscAtmIfVpcDstRowStatus, mscARtgPnniProvEntry=mscARtgPnniProvEntry, mscARtgPnniCfgNodeNbrPtseAcksRx=mscARtgPnniCfgNodeNbrPtseAcksRx, 
mscAtmIfVccRpNextHopsValue=mscAtmIfVccRpNextHopsValue, mscAtmIfVptVccRpNextHopsTable=mscAtmIfVptVccRpNextHopsTable, mscARtgPnniTopActiveParentNodeId=mscARtgPnniTopActiveParentNodeId, mscAtmIfVptVccEpRowStatusEntry=mscAtmIfVptVccEpRowStatusEntry, mscAtmIfVptVccEpApplicationName=mscAtmIfVptVccEpApplicationName, mscAtmIfVpcRpNextHopsValue=mscAtmIfVpcRpNextHopsValue, mscARtgPnniCfgNodeNbrRccListTable=mscARtgPnniCfgNodeNbrRccListTable, mscAtmIfVpcRpNextHop=mscAtmIfVpcRpNextHop, mscARtgPnniCfgNodeNbrPtseAcksTx=mscARtgPnniCfgNodeNbrPtseAcksTx, mscARtgPnniTopRowStatusTable=mscARtgPnniTopRowStatusTable, mscARtgPnniCfgNodeNbrStatsTable=mscARtgPnniCfgNodeNbrStatsTable, mscARtgPnniRestrictTransit=mscARtgPnniRestrictTransit, mscARtgPnniOptMetricTable=mscARtgPnniOptMetricTable, mscARtgPnniCfgNodeDefSAddrState=mscARtgPnniCfgNodeDefSAddrState, mscAtmIfVptVccSrcRowStatusEntry=mscAtmIfVptVccSrcRowStatusEntry, mscARtgIndex=mscARtgIndex, mscARtgPnniCfgNodeNumNeighbors=mscARtgPnniCfgNodeNumNeighbors, mscARtgPnniCfgNodeSAddrProvTable=mscARtgPnniCfgNodeSAddrProvTable, mscAtmIfVccRpRowStatus=mscAtmIfVccRpRowStatus, mscARtgPnniRfTransferCapabilityBbc=mscARtgPnniRfTransferCapabilityBbc, mscARtgPnniCfgNode=mscARtgPnniCfgNode, mscAtmIfVptVccRpIndex=mscAtmIfVptVccRpIndex, mscARtgPnniRfBwdQosClass=mscARtgPnniRfBwdQosClass, mscARtgPnniCfgNodeNbrPeerState=mscARtgPnniCfgNodeNbrPeerState, mscARtgPnniTopNodeAddrComponentName=mscARtgPnniTopNodeAddrComponentName, atmNetworkingGroupCA02A=atmNetworkingGroupCA02A, mscARtgPnniTopNodeRowStatusEntry=mscARtgPnniTopNodeRowStatusEntry, mscAtmCR=mscAtmCR, mscAtmIfVptVccEpComponentName=mscAtmIfVptVccEpComponentName, mscARtgPnniCfgNodeNbrDbSummariesRx=mscARtgPnniCfgNodeNbrDbSummariesRx, mscARtgPnniOptMetricValue=mscARtgPnniOptMetricValue, mscARtgPnniTopPglNodeId=mscARtgPnniTopPglNodeId, mscARtgDnaDestInfoStorageType=mscARtgDnaDestInfoStorageType, mscARtgDnaRowStatusEntry=mscARtgDnaRowStatusEntry, mscAtmIfVpcSrcState=mscAtmIfVpcSrcState, 
mscARtgPnniCfgNodeSAddrAddressIndex=mscARtgPnniCfgNodeSAddrAddressIndex, mscARtgPnniStatsTable=mscARtgPnniStatsTable, mscAtmIfVptVccRpOperTable=mscAtmIfVptVccRpOperTable, mscAtmIfVpcDstRowStatusEntry=mscAtmIfVpcDstRowStatusEntry, mscAtmIfVpcSrcProvEntry=mscAtmIfVpcSrcProvEntry, mscARtgStatsEntry=mscARtgStatsEntry, mscARtgPnniCfgNodeNodeAddress=mscARtgPnniCfgNodeNodeAddress, mscAtmIfVptVccEpRowStatus=mscAtmIfVptVccEpRowStatus, mscAtmIfVpcDstCalledAddress=mscAtmIfVpcDstCalledAddress, mscARtgPnniCfgNodeNbrPtseReqTx=mscARtgPnniCfgNodeNbrPtseReqTx, mscAtmIfVccDstCalledAddress=mscAtmIfVccDstCalledAddress, mscAtmIfVptVccRp=mscAtmIfVptVccRp, mscARtgPnniTopNodeAddrPrefixLengthIndex=mscARtgPnniTopNodeAddrPrefixLengthIndex, mscAtmIfVpcDstComponentName=mscAtmIfVpcDstComponentName, mscAtmIfVccSrcCallingAddress=mscAtmIfVccSrcCallingAddress, mscARtgPnniTopNodeIndex=mscARtgPnniTopNodeIndex, mscAtmIfVccDstRowStatusTable=mscAtmIfVccDstRowStatusTable, mscARtgPnniCfgNodeSAddr=mscARtgPnniCfgNodeSAddr, mscARtgPnniCfgNodeRowStatusEntry=mscARtgPnniCfgNodeRowStatusEntry, mscAtmCRDnaIndex=mscAtmCRDnaIndex, mscARtgPnniCfgNodeNbrRccListEntry=mscARtgPnniCfgNodeNbrRccListEntry, mscARtgPnniPtseHoldDown=mscARtgPnniPtseHoldDown, mscARtgPnniRfMaxRoutes=mscARtgPnniRfMaxRoutes, mscARtgPnniTopNodeAddrOperTable=mscARtgPnniTopNodeAddrOperTable, mscARtgDna=mscARtgDna, mscAtmIfVccEpApplicationName=mscAtmIfVccEpApplicationName, mscAtmIfVccEpComponentName=mscAtmIfVccEpComponentName, atmNetworkingCapabilitiesCA=atmNetworkingCapabilitiesCA, mscARtgPnniTopNodeLinkStorageType=mscARtgPnniTopNodeLinkStorageType, mscARtgPnniRfTxTdpValue=mscARtgPnniRfTxTdpValue, mscARtgRowStatus=mscARtgRowStatus, mscARtgPnniTopNodeLinkRowStatusTable=mscARtgPnniTopNodeLinkRowStatusTable, mscARtgPnniCfgNodeOpPeerGroupId=mscARtgPnniCfgNodeOpPeerGroupId, mscARtgPnniCfgNodeNbrOperEntry=mscARtgPnniCfgNodeNbrOperEntry, mscARtgPnniTopRowStatus=mscARtgPnniTopRowStatus, mscARtgPnniTopNodeAddrOperEntry=mscARtgPnniTopNodeAddrOperEntry, 
mscARtgPnniCfgNodeSAddrStorageType=mscARtgPnniCfgNodeSAddrStorageType, mscARtgPnniCfgNodeNbrPtseReqRx=mscARtgPnniCfgNodeNbrPtseReqRx, mscARtgPnniTopNodeLink=mscARtgPnniTopNodeLink, mscARtgDnaDestInfoScope=mscARtgDnaDestInfoScope, mscARtgPnniRfStorageType=mscARtgPnniRfStorageType, mscARtgPnniCfgNodeProvEntry=mscARtgPnniCfgNodeProvEntry, mscARtgPnniRfRxTrafficDescType=mscARtgPnniRfRxTrafficDescType, mscARtgDnaIndex=mscARtgDnaIndex, atmNetworkingCapabilitiesCA02A=atmNetworkingCapabilitiesCA02A, mscAtmCRIndex=mscAtmCRIndex, mscARtgPnniTopPreferredPglNodeId=mscARtgPnniTopPreferredPglNodeId, mscAtmIfVpcRpRowStatusTable=mscAtmIfVpcRpRowStatusTable, mscARtgPnniPtseParmsEntry=mscARtgPnniPtseParmsEntry, mscAtmCRRowStatus=mscAtmCRRowStatus, mscARtgPnniTopNodeStorageType=mscARtgPnniTopNodeStorageType, mscAtmIfVccDstOperEntry=mscAtmIfVccDstOperEntry, mscAtmIfVptVccSrc=mscAtmIfVptVccSrc, mscAtmIfVpcSrc=mscAtmIfVpcSrc, mscARtgDnaRowStatusTable=mscARtgDnaRowStatusTable, mscAtmIfVpcDstOperEntry=mscAtmIfVpcDstOperEntry, mscAtmIfVptVccSrcLastFailureDiagCode=mscAtmIfVptVccSrcLastFailureDiagCode, mscARtgPnniPortOperEntry=mscARtgPnniPortOperEntry, mscAtmIfVpcSrcCalledAddress=mscAtmIfVpcSrcCalledAddress, mscAtmCRDnaRowStatusEntry=mscAtmCRDnaRowStatusEntry, mscAtmIfVccSrcComponentName=mscAtmIfVccSrcComponentName, mscARtgPnniCfgNodeDefSAddrScope=mscARtgPnniCfgNodeDefSAddrScope, mscAtmIfVccSrcRetryCount=mscAtmIfVccSrcRetryCount, mscARtgPnniAvcrMt=mscARtgPnniAvcrMt, mscARtgPnniPtseRefreshInterval=mscARtgPnniPtseRefreshInterval, mscAtmIfVccEpOperTable=mscAtmIfVccEpOperTable, mscARtgPnniPortRowStatusTable=mscARtgPnniPortRowStatusTable, mscARtgPnniOptMetricEntry=mscARtgPnniOptMetricEntry, mscARtgPnniCfgNodeNbrStorageType=mscARtgPnniCfgNodeNbrStorageType, mscAtmCRDnaComponentName=mscAtmCRDnaComponentName, mscAtmIfVccRpOperEntry=mscAtmIfVccRpOperEntry, atmNetworkingCapabilities=atmNetworkingCapabilities, mscAtmIfVccRpOperTable=mscAtmIfVccRpOperTable, 
mscARtgPnniCfgNodePglElectionState=mscARtgPnniCfgNodePglElectionState, mscARtgPnniTopOperEntry=mscARtgPnniTopOperEntry, mscAtmIfVccEpRowStatus=mscAtmIfVccEpRowStatus, mscAtmIfVpcDstCallingVpi=mscAtmIfVpcDstCallingVpi, mscAtmIfVccRpNextHopsTable=mscAtmIfVccRpNextHopsTable, mscAtmIfVptVccDstRowStatusEntry=mscAtmIfVptVccDstRowStatusEntry, mscARtgPnniCfgNodeSAddrOperEntry=mscARtgPnniCfgNodeSAddrOperEntry, mscARtgPnniCfgNodeNbrBadPtseRx=mscARtgPnniCfgNodeNbrBadPtseRx, mscARtgPnniOptMetricIndex=mscARtgPnniOptMetricIndex, mscAtmCRStorageType=mscAtmCRStorageType, mscAtmCRDnaRowStatus=mscAtmCRDnaRowStatus, mscAtmIfVpcRpNextHopsEntry=mscAtmIfVpcRpNextHopsEntry, mscARtgPnniCfgNodeSAddrReachabilityIndex=mscARtgPnniCfgNodeSAddrReachabilityIndex, mscAtmIfVccSrcLastFailureDiagCode=mscAtmIfVccSrcLastFailureDiagCode, mscARtgPnniTopNodeAddr=mscARtgPnniTopNodeAddr, mscARtgPnniRfRxTdpValue=mscARtgPnniRfRxTdpValue, mscARtgPnniRfBqpValue=mscARtgPnniRfBqpValue, mscARtgPnniCfgNodeSAddrOperTable=mscARtgPnniCfgNodeSAddrOperTable, mscARtgPnniRfComponentName=mscARtgPnniRfComponentName, mscAtmIfVpcSrcCallingAddress=mscAtmIfVpcSrcCallingAddress, mscAtmIfVpcRpComponentName=mscAtmIfVpcRpComponentName, mscARtgPnniCfgNodeRowStatusTable=mscARtgPnniCfgNodeRowStatusTable, mscARtgPnniTopNodeAddrStorageType=mscARtgPnniTopNodeAddrStorageType, mscARtgPnniCfgNodeNbrBadDbSummariesRx=mscARtgPnniCfgNodeNbrBadDbSummariesRx, mscARtgPnniTopNodeAddrRowStatusTable=mscARtgPnniTopNodeAddrRowStatusTable, mscAtmIfVccEpOperEntry=mscAtmIfVccEpOperEntry, mscARtgPnniCfgNodePeerGroupId=mscARtgPnniCfgNodePeerGroupId, mscAtmIfVptVccEpOperEntry=mscAtmIfVptVccEpOperEntry, mscARtgFailedRoutingAttempts=mscARtgFailedRoutingAttempts, mscARtgComponentName=mscARtgComponentName, mscARtgDnaComponentName=mscARtgDnaComponentName, mscARtgPnniRowStatusTable=mscARtgPnniRowStatusTable, mscARtgPnniTopNodeAddrRowStatusEntry=mscARtgPnniTopNodeAddrRowStatusEntry, mscARtgPnniHelloHoldDown=mscARtgPnniHelloHoldDown, 
mscAtmCRDnaDestinationNameEntry=mscAtmCRDnaDestinationNameEntry, mscARtgPnniRfRxTdpTable=mscARtgPnniRfRxTdpTable, mscAtmIfVptVccSrcCalledAddress=mscAtmIfVptVccSrcCalledAddress, mscAtmIfVccEpRowStatusTable=mscAtmIfVccEpRowStatusTable)
| 183.496 | 13,983 | 0.803549 | 26,314 | 229,370 | 7.004256 | 0.053926 | 0.051826 | 0.090695 | 0.009419 | 0.591693 | 0.475151 | 0.458939 | 0.446351 | 0.419863 | 0.393614 | 0 | 0.051183 | 0.093779 | 229,370 | 1,249 | 13,984 | 183.642914 | 0.835519 | 0.001674 | 0 | 0 | 0 | 0.095008 | 0.376453 | 0.068911 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.066023 | 0.008052 | 0 | 0.008052 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
765f9f230cd6637c94f6d617e46c42003ae62808 | 7,940 | py | Python | client/untitled8forget_password.py | seaase/sqlserver- | e31ccbe43c13e3758b59388771577c484920e67d | [
"Apache-2.0"
] | null | null | null | client/untitled8forget_password.py | seaase/sqlserver- | e31ccbe43c13e3758b59388771577c484920e67d | [
"Apache-2.0"
] | null | null | null | client/untitled8forget_password.py | seaase/sqlserver- | e31ccbe43c13e3758b59388771577c484920e67d | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'untitled8forget_password_.ui'
#
# Created by: PyQt5 UI code generator 5.11.3
#
# WARNING! All changes made in this file will be lost!
import http.cookiejar as cookielib  # Python 3: cookielib was renamed to http.cookiejar
import json
import urllib.request as urllib2  # Python 3: urllib2 was merged into urllib.request
from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtWidgets import QWidget, QMessageBox, QDialog
import constants
import encdec
class forget_password(QDialog):
def __init__(self,parent=None):
super(forget_password, self).__init__(parent)
self.setupUi()
def setupUi(self):
self.setObjectName("Form")
self.setFixedSize(400, 480)
self.setStyleSheet("#Form{border-image:url(./background.jpg);}")
        self.setWindowModality(QtCore.Qt.WindowModal)  # setModal() takes a bool; use setWindowModality() for Qt.WindowModal
self.label_2 = QtWidgets.QLabel(self)
self.label_2.setGeometry(QtCore.QRect(30, 80, 81, 31))
self.label_2.setObjectName("label_2")
self.label_3 = QtWidgets.QLabel(self)
self.label_3.setGeometry(QtCore.QRect(30, 140, 81, 31))
self.label_3.setObjectName("label_3")
self.comboBox = QtWidgets.QComboBox(self)
self.comboBox.setGeometry(QtCore.QRect(140, 81, 221, 31))
self.comboBox.setObjectName("comboBox")
self.lineEdit = QtWidgets.QLineEdit(self)
self.lineEdit.setGeometry(QtCore.QRect(140, 140, 221, 31))
self.lineEdit.setObjectName("lineEdit")
self.pushButton = QtWidgets.QPushButton(self)
self.pushButton.setGeometry(QtCore.QRect(110, 430, 93, 28))
self.pushButton.setObjectName("pushButton")
self.pushButton_2 = QtWidgets.QPushButton(self)
self.pushButton_2.setGeometry(QtCore.QRect(210, 430, 93, 28))
self.pushButton_2.setObjectName("pushButton_2")
self.label_4 = QtWidgets.QLabel(self)
self.label_4.setGeometry(QtCore.QRect(30, 260, 81, 31))
self.label_4.setObjectName("label_4")
self.comboBox_2 = QtWidgets.QComboBox(self)
self.comboBox_2.setGeometry(QtCore.QRect(140, 201, 221, 31))
self.comboBox_2.setObjectName("comboBox_2")
self.lineEdit_2 = QtWidgets.QLineEdit(self)
self.lineEdit_2.setGeometry(QtCore.QRect(140, 260, 221, 31))
self.lineEdit_2.setObjectName("lineEdit_2")
self.label_5 = QtWidgets.QLabel(self)
self.label_5.setGeometry(QtCore.QRect(30, 200, 81, 31))
self.label_5.setObjectName("label_5")
self.label_6 = QtWidgets.QLabel(self)
self.label_6.setGeometry(QtCore.QRect(30, 320, 81, 31))
self.label_6.setObjectName("label_6")
self.lineEdit_3 = QtWidgets.QLineEdit(self)
self.lineEdit_3.setGeometry(QtCore.QRect(140, 320, 221, 31))
self.lineEdit_3.setInputMethodHints(QtCore.Qt.ImhHiddenText)
self.lineEdit_3.setObjectName("lineEdit_3")
self.lineEdit_3.setEchoMode(QtWidgets.QLineEdit.Password)
self.label_7 = QtWidgets.QLabel(self)
self.label_7.setGeometry(QtCore.QRect(30, 370, 81, 31))
self.label_7.setObjectName("label_7")
self.lineEdit_4 = QtWidgets.QLineEdit(self)
self.lineEdit_4.setGeometry(QtCore.QRect(140, 370, 221, 31))
self.lineEdit_4.setInputMethodHints(QtCore.Qt.ImhHiddenText)
self.lineEdit_4.setObjectName("lineEdit_4")
self.lineEdit_4.setEchoMode(QtWidgets.QLineEdit.Password)
self.label_8 = QtWidgets.QLabel(self)
self.label_8.setGeometry(QtCore.QRect(30, 20, 81, 31))
self.label_8.setObjectName("label_8")
self.lineEdit_5 = QtWidgets.QLineEdit(self)
self.lineEdit_5.setGeometry(QtCore.QRect(140, 20, 221, 31))
self.lineEdit_5.setObjectName("lineEdit_5")
self.retranslateUi(self)
QtCore.QMetaObject.connectSlotsByName(self)
def retranslateUi(self, Form):
_translate = QtCore.QCoreApplication.translate
Form.setWindowTitle(_translate("Form", "忘记密码"))
self.label_2.setText(_translate("Form", "密保问题1:"))
self.label_3.setText(_translate("Form", "答案1:"))
self.comboBox.addItem("你的母校")
self.comboBox.addItem("你父亲的名字")
self.comboBox.addItem("你爱人的名字")
self.comboBox.addItem("你喜欢的电影")
self.comboBox.addItem("最想去的地方")
self.pushButton.setText(_translate("Form", "提交"))
self.pushButton_2.setText(_translate("Form", "退出"))
self.label_4.setText(_translate("Form", "答案2:"))
self.comboBox_2.addItem("你的母校")
self.comboBox_2.addItem("你父亲的名字")
self.comboBox_2.addItem("你爱人的名字")
self.comboBox_2.addItem("你喜欢的电影")
self.comboBox_2.addItem("最想去的地方")
self.label_5.setText(_translate("Form", "密保问题2:"))
self.label_6.setText(_translate("Form", "密码:"))
self.label_7.setText(_translate("Form", "确认密码:"))
self.label_8.setText(_translate("Form", "用户名:"))
self.pushButton.clicked.connect(self.onTreeWidget2Clicked)
self.pushButton_2.clicked.connect(self.onTreeWidget3Clicked)
def updatePassword1(self,userInfo):
cookie = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookie))
data = json.dumps(userInfo, sort_keys=True).encode()
en_data = encdec.des_encrypt(data)
        # Data to POST; urllib.request requires the request body to be bytes
        postdata = ('{"userData":"' + en_data + '"}').encode('utf-8')
        print(postdata)
headers = {'Content-Type': 'application/json'}
req = urllib2.Request(url=constants.updatepass1_url, headers=headers, data=postdata)
        # Open the URL
result = opener.open(req)
        ans = result.read().decode('utf-8')  # read() returns bytes; decode before comparing to str
status = result.getcode()
if status == 200 and ans == '更新成功':
QMessageBox.information(self.pushButton, '亲', '修改成功!请登录!', QMessageBox.Yes)
self.close()
else:
QMessageBox.warning(self.pushButton, '亲',ans, QMessageBox.Yes)
    # Handle the submit button click event
def hand(self):
        if self.pushButton.clicked:  # note: .clicked is a signal object, so this is always truthy
username = self.lineEdit_5.text().strip()
if username == '':
QMessageBox.warning(self.pushButton, '错误', '用户名不能为空', QMessageBox.Yes)
return
if not constants.judge_pure_english(username):
QMessageBox.warning(self.pushButton, '错误', '用户名只能是英文字符', QMessageBox.Yes)
return
secretK1 = self.lineEdit.text().strip()
if secretK1 == '':
QMessageBox.warning(self.pushButton, '亲', '您还没有填写密保问题1', QMessageBox.Yes)
return
secretK2 = self.lineEdit_2.text().strip()
if secretK2 == '':
QMessageBox.warning(self.pushButton, '亲', '您还没有填写密保问题2', QMessageBox.Yes)
return
if self.comboBox.currentText() == self.comboBox_2.currentText():
QMessageBox.warning(self.pushButton, '亲', '两个密保问题不能一样', QMessageBox.Yes)
return
password1 = self.lineEdit_3.text().strip()
password2 = self.lineEdit_4.text().strip()
if password1 == '' or password2 == '':
QMessageBox.warning(self.pushButton, '错误', '密码不能为空', QMessageBox.Yes)
return
if password2 != password1:
QMessageBox.warning(self.pushButton, '错误', '两次输入的密码不一样', QMessageBox.Yes)
return
if not constants.judge_pure_english(password1):
QMessageBox.warning(self.pushButton, '错误', '密码只能是英文字符', QMessageBox.Yes)
return
userInfo = {
'username':username,
'mibao_q1':self.comboBox.currentText(),
'mibao_a1':secretK1,
'mibao_q2': self.comboBox_2.currentText(),
'mibao_a2': secretK2,
"new_password":password1
}
self.updatePassword1(userInfo)
def onTreeWidget2Clicked(self):
self.hand()
print("提交")
def onTreeWidget3Clicked(self):
self.close()
print("退出") | 43.867403 | 92 | 0.638665 | 875 | 7,940 | 5.665143 | 0.227429 | 0.050837 | 0.071011 | 0.0581 | 0.249344 | 0.094008 | 0.020173 | 0.020173 | 0.020173 | 0 | 0 | 0.04696 | 0.235642 | 7,940 | 181 | 93 | 43.867403 | 0.769814 | 0.028212 | 0 | 0.063694 | 1 | 0 | 0.06748 | 0.00545 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.082803 | 0.044586 | null | null | 0.019108 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
767dcd3d317250256f72fac9cfaf7e74e116c49c | 212 | py | Python | faceRegonization/views.py | jeremykid/FunAlgorithm | 0ee34f130574d2a42ce995a1a44545a7368f9add | [
"MIT"
] | null | null | null | faceRegonization/views.py | jeremykid/FunAlgorithm | 0ee34f130574d2a42ce995a1a44545a7368f9add | [
"MIT"
] | null | null | null | faceRegonization/views.py | jeremykid/FunAlgorithm | 0ee34f130574d2a42ce995a1a44545a7368f9add | [
"MIT"
] | null | null | null | def my_view(request):
if request.method == 'POST':
import set_gpio
        # Do your stuff here, calling whatever you need from set_gpio.py
    return "1"  # placeholder; a Django view should normally return an HttpResponse | 35.333333 | 66 | 0.683962 | 30 | 212 | 4.733333 | 0.9 | 0.098592 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006098 | 0.226415 | 212 | 6 | 67 | 35.333333 | 0.859756 | 0.504717 | 0 | 0 | 0 | 0 | 0.048077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
76832d61e4a10b1f36ba9b2a0925fb5737021c78 | 1,239 | py | Python | series_tiempo_ar_api/apps/analytics/migrations/0012_auto_20190124_1117.py | datosgobar/series-tiempo-ar-api | 6b553c573f6e8104f8f3919efe79089b7884280c | [
"MIT"
] | 28 | 2017-12-16T20:30:52.000Z | 2021-08-11T17:35:04.000Z | series_tiempo_ar_api/apps/analytics/migrations/0012_auto_20190124_1117.py | datosgobar/series-tiempo-ar-api | 6b553c573f6e8104f8f3919efe79089b7884280c | [
"MIT"
] | 446 | 2017-11-16T15:21:40.000Z | 2021-06-10T20:14:21.000Z | series_tiempo_ar_api/apps/analytics/migrations/0012_auto_20190124_1117.py | datosgobar/series-tiempo-ar-api | 6b553c573f6e8104f8f3919efe79089b7884280c | [
"MIT"
] | 12 | 2018-08-23T16:13:32.000Z | 2022-03-01T23:12:28.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2019-01-24 14:17
from __future__ import unicode_literals
import builtins
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('analytics', '0011_auto_20190123_1530'),
]
operations = [
migrations.RenameField(
model_name='analyticsimporttask',
old_name='timestamp',
new_name='created'
),
migrations.AlterField(
model_name='analyticsimporttask',
name='created',
field=models.DateTimeField(auto_now_add=True),
preserve_default=False,
),
migrations.AddField(
model_name='analyticsimporttask',
name='finished',
field=models.DateTimeField(null=True),
),
migrations.AlterField(
model_name='analyticsimporttask',
name='logs',
field=models.TextField(),
),
migrations.AlterField(
model_name='analyticsimporttask',
name='status',
field=models.CharField(choices=[('RUNNING', 'Procesando catálogos'), ('FINISHED', 'Finalizada')], max_length=20),
),
]
| 28.813953 | 125 | 0.588378 | 108 | 1,239 | 6.574074 | 0.601852 | 0.06338 | 0.197183 | 0.180282 | 0.219718 | 0.219718 | 0 | 0 | 0 | 0 | 0 | 0.040184 | 0.297014 | 1,239 | 42 | 126 | 29.5 | 0.774971 | 0.054883 | 0 | 0.371429 | 1 | 0 | 0.182363 | 0.019692 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.228571 | 0 | 0.314286 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
769c3e39c92e07a33491de093807aa7916f7b562 | 749 | py | Python | core/migrations/0012_facets_ordering.py | kingsdigitallab/egomedia-django | 7347fc96ca52ac195b388e43204d0a0faab0c88f | [
"MIT"
] | null | null | null | core/migrations/0012_facets_ordering.py | kingsdigitallab/egomedia-django | 7347fc96ca52ac195b388e43204d0a0faab0c88f | [
"MIT"
] | 10 | 2021-04-06T18:17:44.000Z | 2022-03-01T12:21:40.000Z | core/migrations/0012_facets_ordering.py | kingsdigitallab/egomedia-django | 7347fc96ca52ac195b388e43204d0a0faab0c88f | [
"MIT"
] | null | null | null | # Generated by Django 2.1.5 on 2019-02-07 12:53
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('core', '0011_add_researcherpage_facets'),
]
operations = [
migrations.AlterModelOptions(
name='discipline',
options={'ordering': ['title']},
),
migrations.AlterModelOptions(
name='focus',
options={'ordering': ['title'], 'verbose_name_plural': 'Focus'},
),
migrations.AlterModelOptions(
name='keyword',
options={'ordering': ['title']},
),
migrations.AlterModelOptions(
name='method',
options={'ordering': ['title']},
),
]
| 24.966667 | 76 | 0.543391 | 60 | 749 | 6.7 | 0.6 | 0.268657 | 0.308458 | 0.149254 | 0.253731 | 0.253731 | 0 | 0 | 0 | 0 | 0 | 0.036965 | 0.313752 | 749 | 29 | 77 | 25.827586 | 0.745136 | 0.06008 | 0 | 0.478261 | 1 | 0 | 0.196581 | 0.042735 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.043478 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
76ab30d445f38b5284e8d136416629a76379a3dd | 29 | py | Python | package/__init__.py | nebasam/Datafetch | 3842c99c390e6262e290bdf3896b31ba72021c67 | [
"MIT"
] | null | null | null | package/__init__.py | nebasam/Datafetch | 3842c99c390e6262e290bdf3896b31ba72021c67 | [
"MIT"
] | 5 | 2021-08-22T09:41:52.000Z | 2021-08-24T07:35:29.000Z | package/__init__.py | nebasam/Datafetch | 3842c99c390e6262e290bdf3896b31ba72021c67 | [
"MIT"
] | null | null | null | # version
__version__ = "0.01"  # a version should be a string, not a float | 9.666667 | 18 | 0.724138 | 4 | 29 | 4.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0.172414 | 29 | 3 | 18 | 9.666667 | 0.583333 | 0.241379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
76ab5652d1bee271e69a49cd81b5b2c4d3ea4be0 | 340 | py | Python | class_only_design/__init__.py | chaburkland/class-only-design | 3c49795458d80ffab080af51dde44b31a4633702 | [
"MIT"
] | 5 | 2020-07-08T09:29:17.000Z | 2022-03-07T11:52:53.000Z | class_only_design/__init__.py | chaburkland/class-only-design | 3c49795458d80ffab080af51dde44b31a4633702 | [
"MIT"
] | 2 | 2018-08-22T21:12:44.000Z | 2018-09-20T00:08:40.000Z | class_only_design/__init__.py | InvestmentSystems/class_only_design | 3724ebd9ffeb244a926e2f2033a7a3c9e8430021 | [
"MIT"
] | 1 | 2020-02-10T18:32:46.000Z | 2020-02-10T18:32:46.000Z | # -*- coding: utf-8 -*-
"""Top-level package for Class Only."""
__author__ = """Tom Rutherford"""
__email__ = "foreverwintr@gmail.com"
__version__ = "0.3.0"
from class_only_design.api import ClassOnly
from class_only_design.api import Namespace
from class_only_design.api import constant
from class_only_design.constants import autoname
| 26.153846 | 48 | 0.773529 | 48 | 340 | 5.0625 | 0.583333 | 0.185185 | 0.213992 | 0.312757 | 0.345679 | 0.345679 | 0 | 0 | 0 | 0 | 0 | 0.013289 | 0.114706 | 340 | 12 | 49 | 28.333333 | 0.79402 | 0.164706 | 0 | 0 | 0 | 0 | 0.147482 | 0.079137 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.571429 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
76b8f1979e1c7076dc386642b3bdfb94d3dfdd64 | 234 | py | Python | katas/beta/another_one_down_survival_of_fittest.py | the-zebulan/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 40 | 2016-03-09T12:26:20.000Z | 2022-03-23T08:44:51.000Z | katas/beta/another_one_down_survival_of_fittest.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | null | null | null | katas/beta/another_one_down_survival_of_fittest.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 36 | 2016-11-07T19:59:58.000Z | 2022-03-31T11:18:27.000Z | def remove_smallest(n, lst):
if n <= 0:
return lst
elif n > len(lst):
return []
dex = [b[1] for b in sorted((a, i) for i, a in enumerate(lst))[:n]]
return [c for i, c in enumerate(lst) if i not in dex]
| 29.25 | 71 | 0.547009 | 44 | 234 | 2.886364 | 0.477273 | 0.07874 | 0.220472 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012346 | 0.307692 | 234 | 7 | 72 | 33.428571 | 0.771605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
4f108cae5082e44b7199955fbd9665d371aaec3a | 1,779 | py | Python | world/load_busstop.py | homata/geodjango-hands-on | 3ecee17c0f14122a72aacc71f187011f0b112d08 | [
"Apache-2.0"
] | 13 | 2018-12-02T07:40:08.000Z | 2022-03-13T09:04:34.000Z | world/load_busstop.py | homata/geodjango-hands-on | 3ecee17c0f14122a72aacc71f187011f0b112d08 | [
"Apache-2.0"
] | 3 | 2020-06-05T19:24:35.000Z | 2021-06-10T20:58:57.000Z | download/source_code/world/load_busstop.py | homata/geodjango-book | 94842892bb5f4ea053b8366189e89a36bf7505c5 | [
"Apache-2.0"
] | 6 | 2018-06-24T10:07:32.000Z | 2019-08-12T09:46:18.000Z | # -*- coding: utf-8 -*-
import os
from django.contrib.gis.utils import LayerMapping
from world.models import Busstop
# Mapping between the model fields and the file columns
mapping = {
'p11_001' : 'P11_001' ,
'p11_002' : 'P11_002',
'p11_003_1' : 'P11_003_1',
'p11_003_2' : 'P11_003_2',
'p11_003_3' : 'P11_003_3',
'p11_003_4' : 'P11_003_4',
'p11_003_5' : 'P11_003_5',
'p11_003_6' : 'P11_003_6',
'p11_003_7' : 'P11_003_7',
'p11_003_8' : 'P11_003_8',
'p11_003_9' : 'P11_003_9',
'p11_003_10' : 'P11_003_10',
'p11_003_11' : 'P11_003_11',
'p11_003_12' : 'P11_003_12',
'p11_003_13' : 'P11_003_13',
'p11_003_14' : 'P11_003_14',
'p11_003_15' : 'P11_003_15',
'p11_003_16' : 'P11_003_16',
'p11_003_17' : 'P11_003_17',
'p11_003_18' : 'P11_003_18',
'p11_003_19' : 'P11_003_19',
'p11_004_1' : 'P11_004_1',
'p11_004_2' : 'P11_004_2',
'p11_004_3' : 'P11_004_3',
'p11_004_4' : 'P11_004_4',
'p11_004_5' : 'P11_004_5',
'p11_004_6' : 'P11_004_6',
'p11_004_7' : 'P11_004_7',
'p11_004_8' : 'P11_004_8',
'p11_004_9' : 'P11_004_9',
'p11_004_10' : 'P11_004_10',
'p11_004_11' : 'P11_004_11',
'p11_004_12' : 'P11_004_12',
'p11_004_13' : 'P11_004_13',
'p11_004_14' : 'P11_004_14',
'p11_004_15' : 'P11_004_15',
'p11_004_16' : 'P11_004_16',
'p11_004_17' : 'P11_004_17',
'p11_004_18' : 'P11_004_18',
'p11_004_19' : 'P11_004_19',
'geom' : 'POINT',
}
# File path
geojson_file = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data', 'busstop.geojson'))
# Run
def run(verbose=True):
lm = LayerMapping(Busstop, geojson_file, mapping, transform=False, encoding='UTF-8')
lm.save(strict=True, verbose=verbose)
| 30.672414 | 99 | 0.602586 | 295 | 1,779 | 3.084746 | 0.20339 | 0.250549 | 0.01978 | 0.021978 | 0.606593 | 0 | 0 | 0 | 0 | 0 | 0 | 0.375635 | 0.224845 | 1,779 | 57 | 100 | 31.210526 | 0.284264 | 0.02923 | 0 | 0 | 0 | 0 | 0.471471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02 | false | 0 | 0.06 | 0 | 0.08 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
4f2e1ee4de7cb22626da8030ea8a9272f98b1887 | 143 | py | Python | CodeWars/6 Kyu/ASCII Fun #2 - Funny Dots.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | CodeWars/6 Kyu/ASCII Fun #2 - Funny Dots.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | CodeWars/6 Kyu/ASCII Fun #2 - Funny Dots.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | def dot(n,m):
r=['+'+'---+'*n]
for _ in range(m):
r.append('|'+' o |'*n)
r.append('+'+'---+'*n)
return '\n'.join(r) | 23.833333 | 30 | 0.356643 | 21 | 143 | 2.380952 | 0.571429 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27972 | 143 | 6 | 31 | 23.833333 | 0.485437 | 0 | 0 | 0 | 0 | 0 | 0.118056 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
4f324d9591817ae7e76ed952da3ba2d089be20d7 | 752 | py | Python | Example/sk/IrisClassifier.py | napoler/tkit-bentoml-frameworks-expand | 64da36bbfb740ae28fe591086087ebac7d78e51f | [
"Apache-2.0"
] | 1 | 2021-12-17T03:13:34.000Z | 2021-12-17T03:13:34.000Z | Example/sk/IrisClassifier.py | napoler/tkit-bentoml-frameworks-expand | 64da36bbfb740ae28fe591086087ebac7d78e51f | [
"Apache-2.0"
] | null | null | null | Example/sk/IrisClassifier.py | napoler/tkit-bentoml-frameworks-expand | 64da36bbfb740ae28fe591086087ebac7d78e51f | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Author: terrychan
Blog: https://terrychan.org
# Notes:
"""
import bentoml
from bentoml import web_static_content
from bentoml.adapters import DataframeInput
from bentoml.frameworks.sklearn import SklearnModelArtifact
# @save_to_dir("test.py")  # test: save to file
@bentoml.env(infer_pip_packages=True)
@bentoml.artifacts([SklearnModelArtifact('model')])
@web_static_content('./static')
class IrisClassifier(bentoml.BentoService):
"""
    IrisClassifier classification service
    Test notes
"""
@bentoml.api(input=DataframeInput(), batch=True)
def predict(self, df, metadata={"aa": 2}):
"""
        Prediction endpoint
:param df:
:return:
"""
return self.artifacts.model.predict(df)
if __name__ == '__main__':
pass
| 19.789474 | 59 | 0.667553 | 82 | 752 | 5.926829 | 0.670732 | 0.067901 | 0.065844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003289 | 0.191489 | 752 | 37 | 60 | 20.324324 | 0.796053 | 0.202128 | 0 | 0 | 0 | 0 | 0.043478 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.076923 | 0.307692 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 3 |
4f60aaf734523156a890b237eaf31d3d6056e7d1 | 289 | py | Python | sp/shell/server.py | florianm93/studium | 21e40e053dfa3f30ab0b508c42c9d4827a312a50 | [
"MIT"
] | null | null | null | sp/shell/server.py | florianm93/studium | 21e40e053dfa3f30ab0b508c42c9d4827a312a50 | [
"MIT"
] | null | null | null | sp/shell/server.py | florianm93/studium | 21e40e053dfa3f30ab0b508c42c9d4827a312a50 | [
"MIT"
] | null | null | null | from SimpleXMLRPCServer import SimpleXMLRPCServer
class Funcs:
def add(self, x, y):
return x + y
def sub(self, x, y):
return x - y
server = SimpleXMLRPCServer(("localhost", 8000))
server.register_instance(Funcs())
server.serve_forever()
| 22.230769 | 49 | 0.622837 | 33 | 289 | 5.393939 | 0.575758 | 0.044944 | 0.067416 | 0.134831 | 0.157303 | 0.157303 | 0 | 0 | 0 | 0 | 0 | 0.019139 | 0.276817 | 289 | 12 | 50 | 24.083333 | 0.832536 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.222222 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
4f8d0f880fbc20e5c097be5277625bf19daea6dd | 1,208 | py | Python | FreeTAKServer/controllers/services/API/tests.py | logikal/FreeTakServer | c0916ce65781b5c60079d6440e52db8fc6ee0467 | [
"MIT"
] | null | null | null | FreeTAKServer/controllers/services/API/tests.py | logikal/FreeTakServer | c0916ce65781b5c60079d6440e52db8fc6ee0467 | [
"MIT"
] | null | null | null | FreeTAKServer/controllers/services/API/tests.py | logikal/FreeTakServer | c0916ce65781b5c60079d6440e52db8fc6ee0467 | [
"MIT"
] | null | null | null | import unittest
import threading
class Tests(unittest.TestCase):
def test_service_info(self):
from multiprocessing import Pipe
from FreeTAKServer.model.ServiceObjects.FTS import FTS
pipe1, pipe2 = Pipe(True)
from FreeTAKServer.controllers.services.API import RestAPI
RestAPI.CommandPipe = pipe1
pipe2.send(FTS())
RestAPI.show_service_info()
def test_change_status(self):
from FreeTAKServer.controllers.services import RestAPI
RestAPI.changeStatus({"services": {"TCPDataPackageService":{"status":"off"}}, "ip": ""})
def test_add_system_user(self):
import json
from FreeTAKServer.controllers.services import RestAPI
RestAPI.addSystemUser(json.dumps({
"systemUsers":
[
{"Name": "bbrdk", "Group": "Yellow", "Token": "token", "Password": "psw1", "Certs": "true"}
]
}))
def test_get_system_user(self):
from FreeTAKServer.controllers.services import RestAPI
RestAPI.systemUsers() | 40.266667 | 135 | 0.570364 | 106 | 1,208 | 6.386792 | 0.471698 | 0.125554 | 0.165436 | 0.212703 | 0.25997 | 0.25997 | 0.25997 | 0.177253 | 0 | 0 | 0 | 0.006219 | 0.334437 | 1,208 | 30 | 136 | 40.266667 | 0.835821 | 0 | 0 | 0.115385 | 0 | 0 | 0.084367 | 0.01737 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0.038462 | 0.346154 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
4f8d669c620b55368b3dde1ed4523e8739d5464c | 336 | py | Python | app/blueprints/api/v1/api.py | SimonSkade/labplaner | 5ca73237713faa0d423374fdcf0bf1185ebc11f8 | [
"Apache-2.0"
] | 1 | 2021-06-06T17:56:44.000Z | 2021-06-06T17:56:44.000Z | app/blueprints/api/v1/api.py | SimonSkade/labplaner | 5ca73237713faa0d423374fdcf0bf1185ebc11f8 | [
"Apache-2.0"
] | null | null | null | app/blueprints/api/v1/api.py | SimonSkade/labplaner | 5ca73237713faa0d423374fdcf0bf1185ebc11f8 | [
"Apache-2.0"
] | null | null | null | from flask import Blueprint, request, jsonify, render_template
from app.models.user import User, UserSchema
from app import db
bp = Blueprint('api', __name__)
user_schema = UserSchema()
users_schema = UserSchema(many=True)
@bp.route('/', methods=['GET'])
def index():
return render_template('api/v1/index.html', title='API v1')
| 24 | 63 | 0.738095 | 47 | 336 | 5.106383 | 0.617021 | 0.116667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006803 | 0.125 | 336 | 13 | 64 | 25.846154 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0.089286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0.111111 | 0.555556 | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 3 |
4fa480357fe18fc44f4b067d7dddb0da42dd9435 | 84 | py | Python | embedding_encoder/__init__.py | r-sosa-gh/embedding-encoder | f8200124dbf2e37eec0aa35b4eaf809ee1869785 | [
"MIT"
] | 16 | 2022-03-25T15:25:34.000Z | 2022-03-31T18:17:20.000Z | embedding_encoder/__init__.py | r-sosa-gh/embedding-encoder | f8200124dbf2e37eec0aa35b4eaf809ee1869785 | [
"MIT"
] | 7 | 2022-01-26T01:16:08.000Z | 2022-03-23T00:53:51.000Z | embedding_encoder/__init__.py | r-sosa-gh/embedding-encoder | f8200124dbf2e37eec0aa35b4eaf809ee1869785 | [
"MIT"
] | 3 | 2022-01-12T16:01:29.000Z | 2022-01-12T17:01:12.000Z | from embedding_encoder.core import EmbeddingEncoder
__all__ = ["EmbeddingEncoder"]
| 21 | 51 | 0.833333 | 8 | 84 | 8.125 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 84 | 3 | 52 | 28 | 0.855263 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
96c421f732ab71871a092202481b35d009b37b1a | 1,623 | py | Python | tests/preprocess/test_preprocess__KernelDensity.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | 1 | 2022-02-01T08:57:18.000Z | 2022-02-01T08:57:18.000Z | tests/preprocess/test_preprocess__KernelDensity.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | null | null | null | tests/preprocess/test_preprocess__KernelDensity.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | 1 | 2022-03-13T13:19:56.000Z | 2022-03-13T13:19:56.000Z | import unittest
import numpy as np
from PCAfold import preprocess
from PCAfold import reduction
from PCAfold import analysis
class Preprocess(unittest.TestCase):
def test_preprocess__KernelDensity__allowed_calls(self):
X = np.random.rand(100,20)
try:
kerneld = preprocess.KernelDensity(X, X[:,1])
except Exception:
self.assertTrue(False)
try:
kerneld = preprocess.KernelDensity(X, X[:,4:9])
except Exception:
self.assertTrue(False)
try:
kerneld = preprocess.KernelDensity(X, X[:,0])
except Exception:
self.assertTrue(False)
try:
kerneld = preprocess.KernelDensity(X, X)
except Exception:
self.assertTrue(False)
try:
kerneld.X_weighted
kerneld.weights
except Exception:
self.assertTrue(False)
# ------------------------------------------------------------------------------
def test_preprocess__KernelDensity__not_allowed_calls(self):
X = np.random.rand(100,20)
kerneld = preprocess.KernelDensity(X, X[:,1])
with self.assertRaises(AttributeError):
kerneld.X_weighted = 1
with self.assertRaises(AttributeError):
kerneld.weights = 1
with self.assertRaises(ValueError):
kerneld = preprocess.KernelDensity(X, X[20:30,1])
with self.assertRaises(ValueError):
kerneld = preprocess.KernelDensity(X, X[20:30,:])
# ------------------------------------------------------------------------------
| 27.508475 | 80 | 0.55268 | 152 | 1,623 | 5.802632 | 0.269737 | 0.234694 | 0.238095 | 0.246032 | 0.72449 | 0.685941 | 0.513605 | 0.463719 | 0.463719 | 0.386621 | 0 | 0.021941 | 0.269871 | 1,623 | 58 | 81 | 27.982759 | 0.722363 | 0.096734 | 0 | 0.575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 1 | 0.05 | false | 0 | 0.125 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
96d2bb044c390731b1e2f1e708489a0c310e59cd | 417 | py | Python | cdd/tests/test_pkg_utils.py | SamuelMarks/docstring2class | a70745134bcb069fffdf1d9210c17129925c8fd5 | [
"Apache-2.0",
"MIT"
] | 4 | 2021-05-28T14:50:58.000Z | 2022-03-01T23:22:14.000Z | cdd/tests/test_pkg_utils.py | SamuelMarks/docstring2class | a70745134bcb069fffdf1d9210c17129925c8fd5 | [
"Apache-2.0",
"MIT"
] | 1 | 2021-02-19T04:13:38.000Z | 2021-02-19T04:13:38.000Z | cdd/tests/test_pkg_utils.py | SamuelMarks/docstring2class | a70745134bcb069fffdf1d9210c17129925c8fd5 | [
"Apache-2.0",
"MIT"
] | 3 | 2020-09-09T21:23:09.000Z | 2021-02-13T07:53:07.000Z | """ Tests for pkg utils """
from unittest import TestCase
from cdd.pkg_utils import relative_filename
from cdd.tests.utils_for_tests import unittest_main
class TestPkgUtils(TestCase):
"""Test class for pkg utils"""
def test_relative_filename(self) -> None:
"""Tests relative_filename ident"""
expect = "gaffe"
self.assertEqual(relative_filename(expect), expect)
unittest_main()
| 21.947368 | 59 | 0.719424 | 52 | 417 | 5.576923 | 0.423077 | 0.22069 | 0.075862 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184652 | 417 | 18 | 60 | 23.166667 | 0.852941 | 0.177458 | 0 | 0 | 0 | 0 | 0.015385 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
96dcc54db89e44f1ba7c1d9d3c9010c7621e5110 | 171 | py | Python | localflavor_forms/urls.py | ankurchopra87/django-localflavor-forms | bd8e42171ad3e1805549e3c52be1c3427bf5b47d | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | localflavor_forms/urls.py | ankurchopra87/django-localflavor-forms | bd8e42171ad3e1805549e3c52be1c3427bf5b47d | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | localflavor_forms/urls.py | ankurchopra87/django-localflavor-forms | bd8e42171ad3e1805549e3c52be1c3427bf5b47d | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | from django.urls import path
from . import views
urlpatterns = [
path('<str:country_code>/', views.get_address_form, name='home'),
]
| 21.375 | 69 | 0.736842 | 24 | 171 | 5.125 | 0.708333 | 0.162602 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 171 | 7 | 70 | 24.428571 | 0.836735 | 0 | 0 | 0 | 0 | 0 | 0.134503 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
8c169b1b03c09729601eaf8006d8ce375c223baa | 29,962 | py | Python | python/dns.py | burakbudanur/dnsbox | 13ae8b76c54bd45a7bd7902aa88097be783415d4 | [
"MIT"
] | 1 | 2022-02-18T08:11:15.000Z | 2022-02-18T08:11:15.000Z | python/dns.py | burakbudanur/dnsbox | 13ae8b76c54bd45a7bd7902aa88097be783415d4 | [
"MIT"
] | 2 | 2021-07-20T12:19:11.000Z | 2021-07-21T20:29:59.000Z | python/dns.py | burakbudanur/dnsbox | 13ae8b76c54bd45a7bd7902aa88097be783415d4 | [
"MIT"
] | 4 | 2020-12-12T08:18:37.000Z | 2022-03-03T14:52:53.000Z | #!/usr/bin/env python3
"""
This is a module that keeps functions common to all utilities.
"""
from os import makedirs
from pathlib import Path
from struct import pack, unpack
from sys import exit
import numpy as np
from matplotlib import rcParams
figuresDirName = "figures"
Ly = 4.0
qF = 1
nStretch = 3
def isEven(N):
    # Return True if N is even, False otherwise
    return N % 2 == 0
def createFiguresDir(mainDir):
mainDir = Path(mainDir)
figuresDir = mainDir / figuresDirName
# Create it if it doesn't exist
if not Path.is_dir(figuresDir):
makedirs(figuresDir)
return figuresDir
def setPlotDefaults(tex=False):
# Used to apply some sane plotting defaults.
rcParams.update(
{
"xtick.labelsize": 13,
"ytick.labelsize": 13,
"axes.labelsize": 13,
"axes.titlesize": 14,
"legend.fontsize": 10,
"figure.autolayout": True,
"figure.dpi": 100,
"figure.figsize": [8, 8],
}
)
if tex:
rcParams.update(
{
"text.usetex": True,
"text.latex.preamble": r"\usepackage[p,osf]{scholax}"
r"\usepackage[T1]{fontenc}"
r"\usepackage[utf8]{inputenc}"
r"\usepackage[varqu,varl]{inconsolata}"
r"\usepackage{mathtools}"
r"\usepackage[scaled=1.075,ncf,vvarbb]{newtxmath}"
r"\usepackage{bm}",
}
)
def readParameters(parametersFile):
"""Reads parameters.in, returns parameters in a dictionary."""
import f90nml
# Convert to Path
parametersFile = Path(parametersFile)
params = f90nml.read(parametersFile)
return params
def writeParameters(params, parametersFile):
parametersFile = Path(parametersFile)
with open(parametersFile, "w") as f:
params.write(f)
def readState_xcompact(stateFilePath):
stateFilePath = Path(stateFilePath)
stateFile = open(stateFilePath, "rb")
(forcing,) = unpack("=1i", stateFile.read(1 * 4))
nx, ny, nz = unpack("=3i", stateFile.read(3 * 4))
Lx, Lz = unpack("=2d", stateFile.read(2 * 8))
(Re,) = unpack("=1d", stateFile.read(1 * 8))
(tilt_angle,) = unpack("=1d", stateFile.read(1 * 8))
(dt,) = unpack("=1d", stateFile.read(1 * 8))
(itime,) = unpack("=1i", stateFile.read(1 * 4))
(time,) = unpack("=1d", stateFile.read(1 * 8))
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
header = (forcing, nx, ny, nz, Lx, Lz, Re, tilt_angle, dt, itime, time)
# Read into y, z, x
state = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64)
# read only the zero mode for u
nCells = (2 * ny_half) * (nz - 1)
toReshape = np.asarray(unpack("={}d".format(nCells), stateFile.read(nCells * 8)))
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, 1), order="F")
state[:, : nzp + 1, 0, 0] = buff[:, : nzp + 1, 0]
state[:, nzp + 2 :, 0, 0] = buff[:, nzp + 1 :, 0]
# read v and w in full
for n in range(1, 3):
nCells = (nx - 1) * (2 * ny_half) * (nz - 1)
toReshape = np.asarray(
unpack("={}d".format(nCells), stateFile.read(nCells * 8))
)
# These were written in Fortran order, so read back in that order
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, nx - 1), order="F")
state[:, : nzp + 1, : nxp + 1, n] = buff[:, : nzp + 1, : nxp + 1]
state[:, : nzp + 1, nxp + 2 :, n] = buff[:, : nzp + 1, nxp + 1 :]
state[:, nzp + 2 :, : nxp + 1 :, n] = buff[:, nzp + 1 :, : nxp + 1]
state[:, nzp + 2 :, nxp + 2 :, n] = buff[:, nzp + 1 :, nxp + 1 :]
stateFile.close()
# compute the rest from
# kx u + ky v + kz w = 0
kx, ky, kz = wavenumbers(Lx, Lz, nx, ny_half, nz)
for j in range(ny_half):
for k in range(nz):
for i in range(1, nx):
state[2 * j, k, i, 0] = (
-(kz[k] * state[2 * j, k, i, 2] + ky[j] * state[2 * j, k, i, 1])
/ kx[i]
)
state[2 * j + 1, k, i, 0] = (
-(
kz[k] * state[2 * j + 1, k, i, 2]
+ ky[j] * state[2 * j + 1, k, i, 1]
)
/ kx[i]
)
# Pack into a complex array
state_ = np.zeros((ny_half, nz, nx, 3), dtype=np.complex128)
for j in range(ny_half):
state_[j, :, :, :] = state[2 * j, :, :, :] + 1j * state[2 * j + 1, :, :, :]
# Rotate to x, y, z
state__ = np.moveaxis(state_, [0, 1, 2, 3], [1, 2, 0, 3])
return state__, header
def readState_ycompact(stateFilePath):
stateFilePath = Path(stateFilePath)
stateFile = open(stateFilePath, "rb")
(forcing,) = unpack("=1i", stateFile.read(1 * 4))
nx, ny, nz = unpack("=3i", stateFile.read(3 * 4))
Lx, Lz = unpack("=2d", stateFile.read(2 * 8))
(Re,) = unpack("=1d", stateFile.read(1 * 8))
(tilt_angle,) = unpack("=1d", stateFile.read(1 * 8))
(dt,) = unpack("=1d", stateFile.read(1 * 8))
(itime,) = unpack("=1i", stateFile.read(1 * 4))
(time,) = unpack("=1d", stateFile.read(1 * 8))
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
header = (forcing, nx, ny, nz, Lx, Lz, Re, tilt_angle, dt, itime, time)
# Read into y, z, x
state = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64)
# read u in full
nCells = (nx - 1) * (2 * ny_half) * (nz - 1)
toReshape = np.asarray(unpack("={}d".format(nCells), stateFile.read(nCells * 8)))
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, nx - 1), order="F")
state[:, : nzp + 1, : nxp + 1, 0] = buff[:, : nzp + 1, : nxp + 1]
state[:, : nzp + 1, nxp + 2 :, 0] = buff[:, : nzp + 1, nxp + 1 :]
state[:, nzp + 2 :, : nxp + 1 :, 0] = buff[:, nzp + 1 :, : nxp + 1]
state[:, nzp + 2 :, nxp + 2 :, 0] = buff[:, nzp + 1 :, nxp + 1 :]
# read only the zero mode for v
nCells = 2 * (nx - 1) * (nz - 1)
toReshape = np.asarray(unpack("={}d".format(nCells), stateFile.read(nCells * 8)))
buff = np.reshape(toReshape, (2, nz - 1, nx - 1), order="F")
state[:2, : nzp + 1, : nxp + 1, 1] = buff[:2, : nzp + 1, : nxp + 1]
state[:2, : nzp + 1, nxp + 2 :, 1] = buff[:2, : nzp + 1, nxp + 1 :]
state[:2, nzp + 2 :, : nxp + 1, 1] = buff[:2, nzp + 1 :, : nxp + 1]
state[:2, nzp + 2 :, nxp + 2 :, 1] = buff[:2, nzp + 1 :, nxp + 1 :]
# read w in full
nCells = (nx - 1) * (2 * ny_half) * (nz - 1)
toReshape = np.asarray(unpack("={}d".format(nCells), stateFile.read(nCells * 8)))
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, nx - 1), order="F")
state[:, : nzp + 1, : nxp + 1, 2] = buff[:, : nzp + 1, : nxp + 1]
state[:, : nzp + 1, nxp + 2 :, 2] = buff[:, : nzp + 1, nxp + 1 :]
state[:, nzp + 2 :, : nxp + 1 :, 2] = buff[:, nzp + 1 :, : nxp + 1]
state[:, nzp + 2 :, nxp + 2 :, 2] = buff[:, nzp + 1 :, nxp + 1 :]
stateFile.close()
# compute the rest from
# kx u + ky v + kz w = 0
kx, ky, kz = wavenumbers(Lx, Lz, nx, ny_half, nz)
for j in range(1, ny_half):
for k in range(nz):
for i in range(nx):
state[2 * j, k, i, 1] = (
-(kx[i] * state[2 * j, k, i, 0] + kz[k] * state[2 * j, k, i, 2])
/ ky[j]
)
state[2 * j + 1, k, i, 1] = (
-(
kx[i] * state[2 * j + 1, k, i, 0]
+ kz[k] * state[2 * j + 1, k, i, 2]
)
/ ky[j]
)
# Pack into a complex array
state_ = np.zeros((ny_half, nz, nx, 3), dtype=np.complex128)
for j in range(ny_half):
state_[j, :, :, :] = state[2 * j, :, :, :] + 1j * state[2 * j + 1, :, :, :]
# Rotate to x, y, z
state__ = np.moveaxis(state_, [0, 1, 2, 3], [1, 2, 0, 3])
return state__, header
def readState_zcompact(stateFilePath):
stateFilePath = Path(stateFilePath)
stateFile = open(stateFilePath, "rb")
(forcing,) = unpack("=1i", stateFile.read(1 * 4))
nx, ny, nz = unpack("=3i", stateFile.read(3 * 4))
Lx, Lz = unpack("=2d", stateFile.read(2 * 8))
(Re,) = unpack("=1d", stateFile.read(1 * 8))
(tilt_angle,) = unpack("=1d", stateFile.read(1 * 8))
(dt,) = unpack("=1d", stateFile.read(1 * 8))
(itime,) = unpack("=1i", stateFile.read(1 * 4))
(time,) = unpack("=1d", stateFile.read(1 * 8))
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
header = (forcing, nx, ny, nz, Lx, Lz, Re, tilt_angle, dt, itime, time)
# Read into y, z, x
state = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64)
# read u and v in full
for n in range(0, 2):
nCells = (nx - 1) * (2 * ny_half) * (nz - 1)
toReshape = np.asarray(
unpack("={}d".format(nCells), stateFile.read(nCells * 8))
)
# These were written in Fortran order, so read back in that order
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, nx - 1), order="F")
state[:, : nzp + 1, : nxp + 1, n] = buff[:, : nzp + 1, : nxp + 1]
state[:, : nzp + 1, nxp + 2 :, n] = buff[:, : nzp + 1, nxp + 1 :]
state[:, nzp + 2 :, : nxp + 1 :, n] = buff[:, nzp + 1 :, : nxp + 1]
state[:, nzp + 2 :, nxp + 2 :, n] = buff[:, nzp + 1 :, nxp + 1 :]
# read only the zero mode for w
nCells = (nx - 1) * (2 * ny_half)
toReshape = np.asarray(unpack("={}d".format(nCells), stateFile.read(nCells * 8)))
buff = np.reshape(toReshape, (2 * ny_half, 1, nx - 1), order="F")
state[:, 0, : nxp + 1, 2] = buff[:, 0, : nxp + 1]
state[:, 0, nxp + 2 :, 2] = buff[:, 0, nxp + 1 :]
stateFile.close()
# compute the rest from
# kx u + ky v + kz w = 0
kx, ky, kz = wavenumbers(Lx, Lz, nx, ny_half, nz)
for j in range(ny_half):
for k in range(1, nz):
for i in range(nx):
state[2 * j, k, i, 2] = (
-(kx[i] * state[2 * j, k, i, 0] + ky[j] * state[2 * j, k, i, 1])
/ kz[k]
)
state[2 * j + 1, k, i, 2] = (
-(
kx[i] * state[2 * j + 1, k, i, 0]
+ ky[j] * state[2 * j + 1, k, i, 1]
)
/ kz[k]
)
# Pack into a complex array
state_ = np.zeros((ny_half, nz, nx, 3), dtype=np.complex128)
for j in range(ny_half):
state_[j, :, :, :] = state[2 * j, :, :, :] + 1j * state[2 * j + 1, :, :, :]
# Rotate to x, y, z
state__ = np.moveaxis(state_, [0, 1, 2, 3], [1, 2, 0, 3])
return state__, header
def readState_nocompact(stateFilePath):
stateFilePath = Path(stateFilePath)
stateFile = open(stateFilePath, "rb")
(forcing,) = unpack("=1i", stateFile.read(1 * 4))
nx, ny, nz = unpack("=3i", stateFile.read(3 * 4))
Lx, Lz = unpack("=2d", stateFile.read(2 * 8))
(Re,) = unpack("=1d", stateFile.read(1 * 8))
(tilt_angle,) = unpack("=1d", stateFile.read(1 * 8))
(dt,) = unpack("=1d", stateFile.read(1 * 8))
(itime,) = unpack("=1i", stateFile.read(1 * 4))
(time,) = unpack("=1d", stateFile.read(1 * 8))
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
header = (forcing, nx, ny, nz, Lx, Lz, Re, tilt_angle, dt, itime, time)
# Read into y, z, x
state = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64)
for n in range(3):
nCells = (nx - 1) * (2 * ny_half) * (nz - 1)
toReshape = np.asarray(
unpack("={}d".format(nCells), stateFile.read(nCells * 8))
)
# These were written in Fortran order, so read back in that order
buff = np.reshape(toReshape, (2 * ny_half, nz - 1, nx - 1), order="F")
state[:, : nzp + 1, : nxp + 1, n] = buff[:, : nzp + 1, : nxp + 1]
state[:, : nzp + 1, nxp + 2 :, n] = buff[:, : nzp + 1, nxp + 1 :]
state[:, nzp + 2 :, : nxp + 1 :, n] = buff[:, nzp + 1 :, : nxp + 1]
state[:, nzp + 2 :, nxp + 2 :, n] = buff[:, nzp + 1 :, nxp + 1 :]
stateFile.close()
# Pack into a complex array
state_ = np.zeros((ny_half, nz, nx, 3), dtype=np.complex128)
for j in range(ny_half):
state_[j, :, :, :] = state[2 * j, :, :, :] + 1j * state[2 * j + 1, :, :, :]
# Rotate to x, y, z
state__ = np.moveaxis(state_, [0, 1, 2, 3], [1, 2, 0, 3])
return state__, header
def readState(stateFilePath):
stateFilePath = Path(stateFilePath)
stateFile = open(stateFilePath, "rb")
(forcing,) = unpack("=1i", stateFile.read(1 * 4))
nx, ny, nz = unpack("=3i", stateFile.read(3 * 4))
stateFile.close()
ny_half = ny // 2
if nx - 1 >= ny_half and nx - 1 >= nz - 1:
return readState_xcompact(stateFilePath)
elif ny_half >= nx - 1 and ny_half >= nz - 1:
return readState_ycompact(stateFilePath)
else:
return readState_zcompact(stateFilePath)
def writeState_xcompact(
state,
forcing=1,
Lx=4,
Lz=2,
Re=628.3185307179584,
tilt_angle=0.0,
dt=0.03183098861837907,
itime=0,
time=0.0,
outFile="state.000000",
):
# Write array to a dnsbox state file.
outFile = Path(outFile)
stateFile = open(outFile, "wb")
nx, ny_half, nz, _ = state.shape
ny = ny_half * 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
stateFile.write(pack("=1i", forcing))
stateFile.write(pack("=3i", nx, ny, nz))
stateFile.write(pack("=2d", Lx, Lz))
stateFile.write(pack("=1d", Re))
stateFile.write(pack("=1d", tilt_angle))
stateFile.write(pack("=1d", dt))
stateFile.write(pack("=1i", itime))
stateFile.write(pack("=1d", time))
# convert to y, z, x
state_ = np.moveaxis(state, [0, 1, 2], [2, 0, 1])
# Convert to a real valued array
stateOut = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64, order="F")
for j in range(ny_half):
stateOut[2 * j, :, :, :] = state_[j, :, :, :].real
stateOut[2 * j + 1, :, :, :] = state_[j, :, :, :].imag
# write only the zero mode for u
nCells = 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, 1), dtype=np.float64)
buff[:, : nzp + 1, 0] = stateOut[:, : nzp + 1, 0, 0]
buff[:, nzp + 1 :, 0] = stateOut[:, nzp + 2 :, 0, 0]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
# write everything for v and w
for n in range(1, 3):
nCells = (nx - 1) * 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, nx - 1), dtype=np.float64)
buff[:, : nzp + 1, : nxp + 1] = stateOut[:, : nzp + 1, : nxp + 1, n]
buff[:, : nzp + 1, nxp + 1 :] = stateOut[:, : nzp + 1, nxp + 2 :, n]
buff[:, nzp + 1 :, : nxp + 1] = stateOut[:, nzp + 2 :, : nxp + 1, n]
buff[:, nzp + 1 :, nxp + 1 :] = stateOut[:, nzp + 2 :, nxp + 2 :, n]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
stateFile.close()
def writeState_ycompact(
state,
forcing=1,
Lx=4,
Lz=2,
Re=628.3185307179584,
tilt_angle=0.0,
dt=0.03183098861837907,
itime=0,
time=0.0,
outFile="state.000000",
):
# Write array to a dnsbox state file.
outFile = Path(outFile)
stateFile = open(outFile, "wb")
nx, ny_half, nz, _ = state.shape
ny = ny_half * 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
stateFile.write(pack("=1i", forcing))
stateFile.write(pack("=3i", nx, ny, nz))
stateFile.write(pack("=2d", Lx, Lz))
stateFile.write(pack("=1d", Re))
stateFile.write(pack("=1d", tilt_angle))
stateFile.write(pack("=1d", dt))
stateFile.write(pack("=1i", itime))
stateFile.write(pack("=1d", time))
# convert to y, z, x
state_ = np.moveaxis(state, [0, 1, 2], [2, 0, 1])
# Convert to a real valued array
stateOut = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64, order="F")
for j in range(ny_half):
stateOut[2 * j, :, :, :] = state_[j, :, :, :].real
stateOut[2 * j + 1, :, :, :] = state_[j, :, :, :].imag
# write everything for u
nCells = (nx - 1) * 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, nx - 1), dtype=np.float64)
buff[:, : nzp + 1, : nxp + 1] = stateOut[:, : nzp + 1, : nxp + 1, 0]
buff[:, : nzp + 1, nxp + 1 :] = stateOut[:, : nzp + 1, nxp + 2 :, 0]
buff[:, nzp + 1 :, : nxp + 1] = stateOut[:, nzp + 2 :, : nxp + 1, 0]
buff[:, nzp + 1 :, nxp + 1 :] = stateOut[:, nzp + 2 :, nxp + 2 :, 0]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
# write only the zero mode for v
nCells = 2 * (nx - 1) * (nz - 1)
buff = np.zeros((2, nz - 1, nx - 1), dtype=np.float64)
buff[:2, : nzp + 1, : nxp + 1] = stateOut[:2, : nzp + 1, : nxp + 1, 1]
buff[:2, : nzp + 1, nxp + 1 :] = stateOut[:2, : nzp + 1, nxp + 2 :, 1]
buff[:2, nzp + 1 :, : nxp + 1] = stateOut[:2, nzp + 2 :, : nxp + 1, 1]
buff[:2, nzp + 1 :, nxp + 1 :] = stateOut[:2, nzp + 2 :, nxp + 2 :, 1]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
# write everything for w
nCells = (nx - 1) * 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, nx - 1), dtype=np.float64)
buff[:, : nzp + 1, : nxp + 1] = stateOut[:, : nzp + 1, : nxp + 1, 2]
buff[:, : nzp + 1, nxp + 1 :] = stateOut[:, : nzp + 1, nxp + 2 :, 2]
buff[:, nzp + 1 :, : nxp + 1] = stateOut[:, nzp + 2 :, : nxp + 1, 2]
buff[:, nzp + 1 :, nxp + 1 :] = stateOut[:, nzp + 2 :, nxp + 2 :, 2]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
stateFile.close()
def writeState_zcompact(
state,
forcing=1,
Lx=4,
Lz=2,
Re=628.3185307179584,
tilt_angle=0.0,
dt=0.03183098861837907,
itime=0,
time=0.0,
outFile="state.000000",
):
# Write array to a dnsbox state file.
outFile = Path(outFile)
stateFile = open(outFile, "wb")
nx, ny_half, nz, _ = state.shape
ny = ny_half * 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
stateFile.write(pack("=1i", forcing))
stateFile.write(pack("=3i", nx, ny, nz))
stateFile.write(pack("=2d", Lx, Lz))
stateFile.write(pack("=1d", Re))
stateFile.write(pack("=1d", tilt_angle))
stateFile.write(pack("=1d", dt))
stateFile.write(pack("=1i", itime))
stateFile.write(pack("=1d", time))
# convert to y, z, x
state_ = np.moveaxis(state, [0, 1, 2], [2, 0, 1])
# Convert to a real valued array
stateOut = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64, order="F")
for j in range(ny_half):
stateOut[2 * j, :, :, :] = state_[j, :, :, :].real
stateOut[2 * j + 1, :, :, :] = state_[j, :, :, :].imag
# write everything for u and v
for n in range(2):
nCells = (nx - 1) * 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, nx - 1), dtype=np.float64)
buff[:, : nzp + 1, : nxp + 1] = stateOut[:, : nzp + 1, : nxp + 1, n]
buff[:, : nzp + 1, nxp + 1 :] = stateOut[:, : nzp + 1, nxp + 2 :, n]
buff[:, nzp + 1 :, : nxp + 1] = stateOut[:, nzp + 2 :, : nxp + 1, n]
buff[:, nzp + 1 :, nxp + 1 :] = stateOut[:, nzp + 2 :, nxp + 2 :, n]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
# write only the zero mode for w
nCells = (nx - 1) * 2 * ny_half
buff = np.zeros((2 * ny_half, 1, nx - 1), dtype=np.float64)
buff[:, 0, : nxp + 1] = stateOut[:, 0, : nxp + 1, 2]
buff[:, 0, nxp + 1 :] = stateOut[:, 0, nxp + 2 :, 2]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
stateFile.close()
def writeState_nocompact(
state,
forcing=1,
Lx=4,
Lz=2,
Re=628.3185307179584,
tilt_angle=0.0,
dt=0.03183098861837907,
itime=0,
time=0.0,
outFile="state.000000",
):
# Write array to a dnsbox state file.
outFile = Path(outFile)
stateFile = open(outFile, "wb")
nx, ny_half, nz, _ = state.shape
ny = ny_half * 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
stateFile.write(pack("=1i", forcing))
stateFile.write(pack("=3i", nx, ny, nz))
stateFile.write(pack("=2d", Lx, Lz))
stateFile.write(pack("=1d", Re))
stateFile.write(pack("=1d", tilt_angle))
stateFile.write(pack("=1d", dt))
stateFile.write(pack("=1i", itime))
stateFile.write(pack("=1d", time))
# convert to y, z, x
state_ = np.moveaxis(state, [0, 1, 2], [2, 0, 1])
# Convert to a real valued array
stateOut = np.zeros((2 * ny_half, nz, nx, 3), dtype=np.float64, order="F")
for j in range(ny_half):
stateOut[2 * j, :, :, :] = state_[j, :, :, :].real
stateOut[2 * j + 1, :, :, :] = state_[j, :, :, :].imag
for n in range(3):
nCells = (nx - 1) * 2 * ny_half * (nz - 1)
buff = np.zeros((2 * ny_half, nz - 1, nx - 1), dtype=np.float64)
buff[:, : nzp + 1, : nxp + 1] = stateOut[:, : nzp + 1, : nxp + 1, n]
buff[:, : nzp + 1, nxp + 1 :] = stateOut[:, : nzp + 1, nxp + 2 :, n]
buff[:, nzp + 1 :, : nxp + 1] = stateOut[:, nzp + 2 :, : nxp + 1, n]
buff[:, nzp + 1 :, nxp + 1 :] = stateOut[:, nzp + 2 :, nxp + 2 :, n]
dataPack = pack("={}d".format(nCells), *buff.flatten(order="F"))
stateFile.write(dataPack)
stateFile.close()
def writeState(
state,
forcing=1,
Lx=4,
Lz=2,
Re=628.3185307179584,
tilt_angle=0.0,
dt=0.03183098861837907,
itime=0,
time=0.0,
outFile="state.000000",
):
nx, ny_half, nz, _ = state.shape
if nx - 1 >= ny_half and nx - 1 >= nz - 1:
writeState_xcompact(
state,
forcing=forcing,
Lx=Lx,
Lz=Lz,
Re=Re,
tilt_angle=tilt_angle,
dt=dt,
itime=itime,
time=time,
outFile=outFile,
)
elif ny_half >= nx - 1 and ny_half >= nz - 1:
writeState_ycompact(
state,
forcing=forcing,
Lx=Lx,
Lz=Lz,
Re=Re,
tilt_angle=tilt_angle,
dt=dt,
itime=itime,
time=time,
outFile=outFile,
)
else:
writeState_zcompact(
state,
forcing=forcing,
Lx=Lx,
Lz=Lz,
Re=Re,
tilt_angle=tilt_angle,
dt=dt,
itime=itime,
time=time,
outFile=outFile,
)
def inprod(state1, state2):
    # outputs only the real part of the inner product
    if state1.shape != state2.shape:
        exit("state1 and state2 have different shapes.")
    # TODO: account for zeroed modes
    # account for double-counting due to hermiticity
    kin = np.sum(np.conj(state1[:, 0, :, :]) * state2[:, 0, :, :]) / 2
    kin += np.sum(np.conj(state1[:, 1:, :, :]) * state2[:, 1:, :, :])
    # the Nyquist mode is not there after shrinking the grid
    kin = kin.real
    return kin
def wavenumbers(Lx, Lz, nx, ny_half, nz):
kx = np.zeros((nx), dtype=np.float64)
ky = np.zeros((ny_half), dtype=np.float64)
kz = np.zeros((nz), dtype=np.float64)
for i in range(nx):
if i <= nx // 2:
kx[i] = i * (2 * np.pi / Lx)
else:
kx[i] = i * (2 * np.pi / Lx) - nx * (2 * np.pi / Lx)
for j in range(ny_half):
ky[j] = j * (2 * np.pi / Ly)
for k in range(nz):
if k <= nz // 2:
kz[k] = k * (2 * np.pi / Lz)
else:
kz[k] = k * (2 * np.pi / Lz) - nz * (2 * np.pi / Lz)
return kx, ky, kz
def derivative(subState, axis, Lx, Lz):
nx, ny_half, nz = subState.shape
nxp = nx // 2 - 1
nzp = nz // 2 - 1
kx, ky, kz = wavenumbers(Lx, Lz, nx, ny_half, nz)
out = np.zeros((nx, ny_half, nz), dtype=np.complex128)
if axis == 0:
for i in range(nx):
if i == nxp + 1:
continue
out[i, :, :] = 1j * kx[i] * subState[i, :, :]
elif axis == 1:
for j in range(ny_half):
out[:, j, :] = 1j * ky[j] * subState[:, j, :]
elif axis == 2:
for k in range(nz):
if k == nzp + 1:
continue
out[:, :, k] = 1j * kz[k] * subState[:, :, k]
return out
def vorticity(state, Lx, Lz):
wy = derivative(state[:, :, :, 2], 1, Lx, Lz)
vz = derivative(state[:, :, :, 1], 2, Lx, Lz)
uz = derivative(state[:, :, :, 0], 2, Lx, Lz)
wx = derivative(state[:, :, :, 2], 0, Lx, Lz)
vx = derivative(state[:, :, :, 1], 0, Lx, Lz)
uy = derivative(state[:, :, :, 0], 1, Lx, Lz)
out = np.zeros(state.shape, dtype=np.complex128)
out[:, :, :, 0] = wy - vz
out[:, :, :, 1] = uz - wx
out[:, :, :, 2] = vx - uy
return out
def dissipation(state, Lx, Lz, Re):
vor = vorticity(state, Lx, Lz)
diss = 2 * inprod(vor, vor) / Re
return diss
def fftSpecToPhys(subState, supersample=False):
nx, ny_half, nz = subState.shape
nxp, nzp = nx // 2 - 1, nz // 2 - 1
if supersample:
nxx, nyy, nzz = nStretch * (nx // 2), nStretch * ny_half, nStretch * (nz // 2)
nyy_half_pad1 = nyy // 2 + 1
expandZ = np.zeros((nx, ny_half, nzz), dtype=np.complex128)
for k in range(nzp + 1):
expandZ[:, :, k] = subState[:, :, k]
if 0 < k <= nzp:
expandZ[:, :, -k] = subState[:, :, -k]
expandZ = np.fft.ifft(expandZ, axis=2, norm="backward", n=nzz)
expandX = np.zeros((nxx, ny_half, nzz), dtype=np.complex128)
for i in range(nxp + 1):
expandX[i, :, :] = expandZ[i, :, :]
if 0 < i <= nxp:
expandX[-i, :, :] = expandZ[-i, :, :]
expandX = np.fft.ifft(expandX, axis=0, norm="backward", n=nxx)
expandY = np.zeros((nxx, nyy_half_pad1, nzz), dtype=np.complex128)
expandY[:, :ny_half, :] = expandX
stateOut = np.fft.irfft(expandY, axis=1, norm="backward", n=nyy)
else:
nxx, nyy, nzz = nx, 2 * ny_half, nz
stateOut = np.swapaxes(
np.fft.irfftn(
np.swapaxes(subState, 1, 2), s=(nxx, nzz, nyy), norm="backward"
),
1,
2,
)
return stateOut
def fftSpecToPhysAll(state, supersample=False):
nx, ny_half, nz, _ = state.shape
if supersample:
nxx, nyy, nzz = nStretch * (nx // 2), nStretch * (ny_half), nStretch * (nz // 2)
else:
nxx, nyy, nzz = nx, 2 * ny_half, nz
stateOutAll = np.zeros((nxx, nyy, nzz, 3), dtype=np.float64)
for comp in range(3):
stateOutAll[:, :, :, comp] = fftSpecToPhys(
state[:, :, :, comp], supersample=supersample
)
return stateOutAll
def fftPhysToSpec(stateOut, supersample=False):
nxx, nyy, nzz = stateOut.shape
if supersample:
nx, ny, nz = 2 * (nxx // nStretch), 2 * (nyy // nStretch), 2 * (nzz // nStretch)
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
# FFT in y
expandY = np.fft.rfft(stateOut, axis=1, norm="backward", n=nyy)
# Shrink in y
expandX = expandY[:, :ny_half, :]
# FFT in x
expandX = np.fft.fft(expandX, axis=0, norm="backward", n=nxx)
# Shrink in x
expandZ = np.zeros((nx, ny_half, nzz), dtype=np.complex128)
for i in range(nxp + 1):
expandZ[i, :, :] = expandX[i, :, :]
if 0 < i <= nxp:
expandZ[-i, :, :] = expandX[-i, :, :]
# FFT in z
expandZ = np.fft.fft(expandZ, axis=2, norm="backward", n=nzz)
# Shrink in z
subState = np.zeros((nx, ny_half, nz), dtype=np.complex128)
for k in range(nzp + 1):
subState[:, :, k] = expandZ[:, :, k]
if 0 < k <= nzp:
subState[:, :, -k] = expandZ[:, :, -k]
else:
nx, ny, nz = nxx, nyy, nzz
ny_half = ny // 2
nxp, nzp = nx // 2 - 1, nz // 2 - 1
subState = np.swapaxes(
np.fft.rfftn(
np.swapaxes(stateOut, 1, 2), s=(nxx, nzz, nyy), norm="backward"
),
1,
2,
)
# Zero the Nyquist modes
subState = subState[:, :ny_half, :]
subState[nxp + 1, :, :] = 0
subState[:, :, nzp + 1] = 0
return subState
def fftPhysToSpecAll(stateOutAll, supersample=False):
nxx, nyy, nzz, _ = stateOutAll.shape
if supersample:
nx, ny, nz = 2 * (nxx // nStretch), 2 * (nyy // nStretch), 2 * (nzz // nStretch)
else:
nx, ny, nz = nxx, nyy, nzz
ny_half = ny // 2
state = np.zeros((nx, ny_half, nz, 3), dtype=np.complex128)
for comp in range(3):
state[:, :, :, comp] = fftPhysToSpec(
stateOutAll[:, :, :, comp], supersample=supersample
)
return state
def tilt_state(state, tilt_angle):
nx, ny_half, nz, _ = state.shape
tilt_radians = tilt_angle * np.pi / 180.0
stateout = np.zeros((nx, ny_half, nz, 3), dtype=np.complex128)
stateout[:, :, :, 1] = state[:, :, :, 1]
stateout[:, :, :, 0] = (
np.cos(tilt_radians) * state[:, :, :, 0]
- np.sin(tilt_radians) * state[:, :, :, 2]
)
stateout[:, :, :, 2] = (
np.sin(tilt_radians) * state[:, :, :, 0]
+ np.cos(tilt_radians) * state[:, :, :, 2]
)
return stateout
def laminar(forcing, nx, ny_half, nz, tilt_angle=0.0):
state = np.zeros((nx, ny_half, nz, 3), dtype=np.complex128)
if forcing == 1:
state[0, qF, 0, 0] = -1j * 0.5
elif forcing == 2:
state[0, qF, 0, 0] = 0.5
if abs(tilt_angle) > 0:
state = tilt_state(state, tilt_angle)
return state
| 32.079229 | 88 | 0.500935 | 4,228 | 29,962 | 3.501656 | 0.070719 | 0.041337 | 0.034043 | 0.032421 | 0.765957 | 0.749139 | 0.726781 | 0.698548 | 0.690172 | 0.676528 | 0 | 0.059287 | 0.314331 | 29,962 | 933 | 89 | 32.113612 | 0.661361 | 0.060577 | 0 | 0.619614 | 0 | 0 | 0.028713 | 0.006519 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038633 | false | 0 | 0.011887 | 0 | 0.08321 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8c19a3c12ecabd65e74e10b6b6f1d5e7e624cace | 510 | py | Python | tests/test_open.py | dimitri-justeau/rasterio | dda4b823473ba3cb27038e00ee7aa82f867f2a55 | [
"BSD-3-Clause"
] | 1 | 2020-03-17T10:17:57.000Z | 2020-03-17T10:17:57.000Z | tests/test_open.py | dimitri-justeau/rasterio | dda4b823473ba3cb27038e00ee7aa82f867f2a55 | [
"BSD-3-Clause"
] | null | null | null | tests/test_open.py | dimitri-justeau/rasterio | dda4b823473ba3cb27038e00ee7aa82f867f2a55 | [
"BSD-3-Clause"
] | 2 | 2019-09-03T12:18:04.000Z | 2020-03-17T10:18:07.000Z | import pytest
import rasterio
def test_open_bad_path():
with pytest.raises(TypeError):
rasterio.open(3.14)
def test_open_bad_mode_1():
with pytest.raises(TypeError):
rasterio.open("tests/data/RGB.byte.tif", mode=3.14)
def test_open_bad_mode_2():
with pytest.raises(ValueError):
rasterio.open("tests/data/RGB.byte.tif", mode="foo")
def test_open_bad_driver():
with pytest.raises(TypeError):
rasterio.open("tests/data/RGB.byte.tif", mode="r", driver=3.14)
| 22.173913 | 71 | 0.692157 | 77 | 510 | 4.402597 | 0.324675 | 0.082596 | 0.129794 | 0.165192 | 0.690265 | 0.690265 | 0.581121 | 0.457227 | 0.353982 | 0.353982 | 0 | 0.025822 | 0.164706 | 510 | 22 | 72 | 23.181818 | 0.769953 | 0 | 0 | 0.214286 | 0 | 0 | 0.143137 | 0.135294 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | true | 0 | 0.142857 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8c4ff4e9a6672d0ccac9e9f56d8ec8c97011b488 | 4,674 | py | Python | mottak-arkiv-service/tests/connectors/mailgun/test_models.py | omBratteng/mottak | b7d2e1d063b31c2ad89c66e5414297612f91ebe9 | [
"Apache-2.0"
] | 4 | 2021-03-05T15:39:24.000Z | 2021-09-15T06:11:45.000Z | mottak-arkiv-service/tests/connectors/mailgun/test_models.py | omBratteng/mottak | b7d2e1d063b31c2ad89c66e5414297612f91ebe9 | [
"Apache-2.0"
] | 631 | 2020-04-27T10:39:18.000Z | 2022-03-31T14:51:38.000Z | mottak-arkiv-service/tests/connectors/mailgun/test_models.py | omBratteng/mottak | b7d2e1d063b31c2ad89c66e5414297612f91ebe9 | [
"Apache-2.0"
] | 3 | 2020-02-20T15:48:03.000Z | 2021-12-16T22:50:40.000Z | import json
import base64
import uuid
from app.connectors.mailgun.models import InvitasjonUploadUrl, InvitasjonEmail
MAILGUN_DOMAIN = 'sandbox7cbdab032f7b4321af274439b1f353a2.mailgun.org'
TO = ['kriwal@arkivverket.no']
BASE64_URL = 'dpldr://eyJyZWZlcmVuY2UiOiAiMThkZGVkYjYtYThjOC00ZTI4LWIzNTUtZmM4MDJlMzlhNTg4IiwgInVwbG9hZF91cmwiOiAiaHR0cDovL3Rlc3Qubm8vIiwgInVwbG9hZF90eXBlIjogInRhciIsICJtZXRhIjogeyJpbnZpdGFzam9uX3V1aWQiOiAiZjM0NGEyODAtODQ0OS00ZjM5LWE3ZTktYzllMWI4NTg1MGE0In19'
TITTEL = 'Arkivgeneratorkomisjonen -- Et sikkelig bra arkiv'
_UUID = uuid.UUID('38da4005-92de-4aea-81ac-2262f90d0f5b')
EXAMPLE_MELDING = {"reference": "18ddedb6-a8c8-4e28-b355-fc802e39a588",
"uploadUrl": "http://test.no/",
"uploadType": "tar",
"meta": {"invitasjonEksternId": "4480d16c-267d-4051-aaea-fd5384895b2b"}}
EXAMPLE_EMAIL = {'subject': 'Invitasjon til opplastning',
'to': ['kriwal@arkivverket.no'],
'from': 'Mottak Digitalarkivet <donotreply@sandbox7cbdab032f7b4321af274439b1f353a2.mailgun.org>',
'text': 'Du mottar denne eposten fordi du har bestilt lenke til opplasting i Digitalarkivet. Det gjelder opplasting av følgende arkiv:\n\n\n\n \n \n\n\n\n\nTittel:\nArkivgeneratorkomisjonen -- Et sikkelig bra arkiv\n\n\nFil:\n38da4005-92de-4aea-81ac-2262f90d0f5b.tar\n\n\n\nDersom du ikke har bestilt lenke for opplasting kan du se bort fra denne eposten.\nTrinnvis beskrivelse av hvordan du laster opp et arkiv:\n\nSiste versjon av program for å laste opp arkivuttrekk finnes her: https://uploader.digitalisering.arkivverket.no\nStart applikasjonen\nKopier invitasjons url’en i slutten av eposten inn i en nettleser (starter med dpldr://)\nVelg tar fila som skal lastes opp\n\nInvitasjonslenken:\ndpldr://eyJyZWZlcmVuY2UiOiAiMThkZGVkYjYtYThjOC00ZTI4LWIzNTUtZmM4MDJlMzlhNTg4IiwgInVwbG9hZF91cmwiOiAiaHR0cDovL3Rlc3Qubm8vIiwgInVwbG9hZF90eXBlIjogInRhciIsICJtZXRhIjogeyJpbnZpdGFzam9uX3V1aWQiOiAiZjM0NGEyODAtODQ0OS00ZjM5LWE3ZTktYzllMWI4NTg1MGE0In19',
'html': '<p>Du mottar denne eposten fordi du har bestilt lenke til opplasting i Digitalarkivet. Det gjelder opplasting av følgende arkiv:</p>\n<table>\n<thead>\n<tr>\n<th><!-- --></th>\n<th><!-- --></th>\n</tr>\n</thead>\n<tbody>\n<tr>\n<td><strong>Tittel:</strong></td>\n<td>Arkivgeneratorkomisjonen -- Et sikkelig bra arkiv</td>\n</tr>\n<tr>\n<td><strong>Fil:</strong></td>\n<td>38da4005-92de-4aea-81ac-2262f90d0f5b.tar</td>\n</tr>\n</tbody>\n</table>\n<p>Dersom du ikke har bestilt lenke for opplasting kan du se bort fra denne eposten.</p>\n<p>Trinnvis beskrivelse av hvordan du laster opp et arkiv:</p>\n<ul>\n<li>Siste versjon av program for å laste opp arkivuttrekk finnes her: <a href="https://uploader.digitalisering.arkivverket.no">https://uploader.digitalisering.arkivverket.no</a></li>\n<li>Start applikasjonen</li>\n<li>Kopier invitasjons url’en i slutten av eposten inn i en nettleser (starter med dpldr://)</li>\n<li>Velg tar fila som skal lastes opp</li>\n</ul>\n<p>Invitasjonslenken:\n<a href="dpldr://eyJyZWZlcmVuY2UiOiAiMThkZGVkYjYtYThjOC00ZTI4LWIzNTUtZmM4MDJlMzlhNTg4IiwgInVwbG9hZF91cmwiOiAiaHR0cDovL3Rlc3Qubm8vIiwgInVwbG9hZF90eXBlIjogInRhciIsICJtZXRhIjogeyJpbnZpdGFzam9uX3V1aWQiOiAiZjM0NGEyODAtODQ0OS00ZjM5LWE3ZTktYzllMWI4NTg1MGE0In19">dpldr://eyJyZWZlcmVuY2UiOiAiMThkZGVkYjYtYThjOC00ZTI4LWIzNTUtZmM4MDJlMzlhNTg4IiwgInVwbG9hZF91cmwiOiAiaHR0cDovL3Rlc3Qubm8vIiwgInVwbG9hZF90eXBlIjogInRhciIsICJtZXRhIjogeyJpbnZpdGFzam9uX3V1aWQiOiAiZjM0NGEyODAtODQ0OS00ZjM5LWE3ZTktYzllMWI4NTg1MGE0In19</a></p>'}
def test_invitasjon_melding_as_base64_url():
"""
GIVEN static input data to InvitasjonMelding
WHEN calling the method as_base64_url()
THEN verify that the correct base64 encoded url is returned
"""
expected = _as_base64_url(EXAMPLE_MELDING)
melding = InvitasjonUploadUrl(uuid.UUID(EXAMPLE_MELDING['reference']), EXAMPLE_MELDING['uploadUrl'],
uuid.UUID(EXAMPLE_MELDING['meta']['invitasjonEksternId']),
EXAMPLE_MELDING['uploadType'])
assert expected == melding.as_base64_url()
def test_mailgun_email():
"""
GIVEN static input data to MailgunEmail
WHEN calling the method as_data()
THEN verify that the correct dictionary is returned
"""
email = InvitasjonEmail(MAILGUN_DOMAIN, TO, _UUID, TITTEL, BASE64_URL)
print(email.as_data())
assert EXAMPLE_EMAIL == email.as_data()
def _as_base64_url(data: dict) -> str:
json_str = json.dumps(data)
json_bytes = json_str.encode('utf-8')
return 'dpldr://' + str(base64.b64encode(json_bytes), 'utf-8')
| 88.188679 | 1,528 | 0.764014 | 514 | 4,674 | 6.873541 | 0.336576 | 0.006227 | 0.006793 | 0.006793 | 0.323521 | 0.193886 | 0.180866 | 0.180866 | 0.156807 | 0.156807 | 0 | 0.068974 | 0.12837 | 4,674 | 52 | 1,529 | 89.884615 | 0.798233 | 0.06119 | 0 | 0 | 0 | 0.0625 | 0.736697 | 0.434001 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.09375 | false | 0 | 0.125 | 0 | 0.25 | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
4fb562b573aae6a5d79b7e4155c87b2e196fa890 | 395 | py | Python | tests/test_image_approval_polls.py | carlosmarca/polls-trump-covid | 8e484f1560d8379716daefb2b56b0374eb23c44b | [
"MIT"
] | 1 | 2021-06-14T14:35:38.000Z | 2021-06-14T14:35:38.000Z | tests/test_image_approval_polls.py | carlosmarca/polls-trump-covid | 8e484f1560d8379716daefb2b56b0374eb23c44b | [
"MIT"
] | null | null | null | tests/test_image_approval_polls.py | carlosmarca/polls-trump-covid | 8e484f1560d8379716daefb2b56b0374eb23c44b | [
"MIT"
] | null | null | null | import unittest
from filtered_csv import filtered_df
from image_approval_polls import create_approval_bar
class TestImageApprovalPolls(unittest.TestCase):
def test_create_approval_bar(self):
df_1 = filtered_df("data/covid_approval_polls.csv")
df_new = create_approval_bar(df_1)
n = sum(df_new.approve) + sum(df_new.disapprove)
self.assertEqual(n, 78587.0)
| 30.384615 | 59 | 0.756962 | 56 | 395 | 5 | 0.5 | 0.15 | 0.182143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024316 | 0.167089 | 395 | 12 | 60 | 32.916667 | 0.826748 | 0 | 0 | 0 | 0 | 0 | 0.073418 | 0.073418 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
4fb9b6a33610518509b8f1e7f2ed589c9afe0799 | 512 | py | Python | code_cesar.py | elig-45/cryptologie | 5cb9555ef7c7572b5988800b6718316ff09d0e11 | [
"MIT"
] | null | null | null | code_cesar.py | elig-45/cryptologie | 5cb9555ef7c7572b5988800b6718316ff09d0e11 | [
"MIT"
] | null | null | null | code_cesar.py | elig-45/cryptologie | 5cb9555ef7c7572b5988800b6718316ff09d0e11 | [
"MIT"
] | null | null | null | def dechiffrer(message, cle_chiffrement):
    cle_chiffrement = int(cle_chiffrement)
resultat = ""
for lettre in message:
resultat = resultat + chr(ord(lettre) - cle_chiffrement)
return resultat
def chiffrer(message, cle_chiffrement):
return dechiffrer(message, -cle_chiffrement)
if __name__ == "__main__":
    assert dechiffrer("Nsktwrfynvzj", 5) == "Informatique", "Error 1"
    assert dechiffrer(chiffrer("azertyuiop", 3), 3) == "azertyuiop", "Error 2"
    print("Tests passed.\n\n")
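
Since the shift is applied without any modulus, a brute-force attack simply tries every candidate key. A self-contained sketch (it redefines `dechiffrer` so it runs on its own; `force_brute` is a hypothetical helper, not part of the original module):

```python
def dechiffrer(message, cle_chiffrement):
    # Shift every character's code point back by the key.
    return "".join(chr(ord(lettre) - cle_chiffrement) for lettre in message)


def force_brute(message):
    # Try all 26 classical Caesar shifts; a human (or a dictionary check)
    # then picks out the readable candidate.
    return [dechiffrer(message, cle) for cle in range(26)]


assert "Informatique" in force_brute("Nsktwrfynvzj")  # recovered at shift 5
```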
4fcae1bc7f4c58f37c02b287e7d546f371d1836f | 1,119 | py | Python | setup.py | misialq/paycheck | efa6bbd199cb9c87369aae42a2b5961d548e04f9 | [
"BSD-3-Clause"
] | null | null | null | setup.py | misialq/paycheck | efa6bbd199cb9c87369aae42a2b5961d548e04f9 | [
"BSD-3-Clause"
] | null | null | null | setup.py | misialq/paycheck | efa6bbd199cb9c87369aae42a2b5961d548e04f9 | [
"BSD-3-Clause"
] | null | null | null | from setuptools import setup, find_packages
setup(
name='paycheck',
version='0.0.3',
author='Ben Kaehler',
author_email='kaehler@gmail.com',
description='Tools for testing q2-clawback',
scripts=['paycheck/assets/run_traceable_dada_paired.R'],
license='BSD-3-Clause',
packages=find_packages(),
package_data={'paycheck.tests': ['data/*']},
entry_points={
'console_scripts':
['paycheck_simulate=paycheck.simulate:simulate_all_samples',
'pick_up_the_check=paycheck.simulate:simulate_missed_samples',
'paycheck_cv=paycheck.cross_validate:cross_validate',
'paycheck_shannon=paycheck.shannon:shannon',
'paycheck_diversity=paycheck.shannon:diversity',
'paycheck_cv_for_weights='
'paycheck.cross_validate:cross_validate_for_weights',
'paycheck_cv_average='
'paycheck.cross_validate:cross_validate_average',
'paycheck_generate_folds='
'paycheck.cross_validate:generate_folds',
'paycheck_cv_classifier='
'paycheck.cross_validate:cross_validate_classifier']
}
)
| 37.3 | 71 | 0.699732 | 121 | 1,119 | 6.115702 | 0.471074 | 0.158108 | 0.141892 | 0.140541 | 0.183784 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005476 | 0.184093 | 1,119 | 29 | 72 | 38.586207 | 0.805038 | 0 | 0 | 0 | 0 | 0 | 0.612154 | 0.489723 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.035714 | 0 | 0.035714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
4fcdf8102536a0b74bdc4ad0e0c036b247347c25 | 238 | py | Python | apps/interfacemocks/serializers.py | op896898466/apitest | 6c60514fda01776a9c39b930732bf09b1e8cd0f9 | [
"Apache-2.0"
] | 3 | 2020-09-25T03:01:38.000Z | 2021-08-16T05:48:43.000Z | apps/interfacemocks/serializers.py | op896898466/apitest | 6c60514fda01776a9c39b930732bf09b1e8cd0f9 | [
"Apache-2.0"
] | 8 | 2020-09-25T02:56:06.000Z | 2022-02-27T10:12:32.000Z | apps/interfacemocks/serializers.py | op896898466/apitest | 6c60514fda01776a9c39b930732bf09b1e8cd0f9 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from rest_framework import serializers
from .models import Interfacemocks
class InterfacemocksSerializer(serializers.ModelSerializer):
class Meta:
model = Interfacemocks
fields = '__all__'
| 18.307692 | 60 | 0.718487 | 22 | 238 | 7.545455 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005263 | 0.201681 | 238 | 12 | 61 | 19.833333 | 0.868421 | 0.088235 | 0 | 0 | 0 | 0 | 0.03271 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
8b040a80e12a0e4feb039e0f4cbcfd9e1a068064 | 157 | py | Python | fantaland_app/apps.py | yaping0816/fantaland_api | 12de524f08bdc177634a4545eb11b64b9330501a | [
"MIT"
] | null | null | null | fantaland_app/apps.py | yaping0816/fantaland_api | 12de524f08bdc177634a4545eb11b64b9330501a | [
"MIT"
] | null | null | null | fantaland_app/apps.py | yaping0816/fantaland_api | 12de524f08bdc177634a4545eb11b64b9330501a | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class FantalandAppConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'fantaland_app'
| 22.428571 | 56 | 0.77707 | 18 | 157 | 6.611111 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140127 | 157 | 6 | 57 | 26.166667 | 0.881481 | 0 | 0 | 0 | 0 | 0 | 0.267516 | 0.184713 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
8b0d20a79fc1281c4d0682f6a342b46d94fdb445 | 6,274 | py | Python | element/migrations/0002_auto_20190521_1519.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | [
"MIT"
] | null | null | null | element/migrations/0002_auto_20190521_1519.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | [
"MIT"
] | 6 | 2021-03-19T16:01:33.000Z | 2022-03-12T00:54:23.000Z | element/migrations/0002_auto_20190521_1519.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | [
"MIT"
] | null | null | null | # Generated by Django 2.2 on 2019-05-21 15:19
import datetime
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '0002_remove_content_type_name'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('element', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='CodeType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='DataModel',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='DataPointCategory',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='MethodType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='OperatingSystem',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='TimelineType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.AddField(
model_name='datapoint',
name='active',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='datapoint',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name='datapoint',
name='name',
field=models.CharField(default=1, max_length=200),
preserve_default=False,
),
migrations.AddField(
model_name='vendor',
name='name',
field=models.CharField(default=1, max_length=200),
preserve_default=False,
),
migrations.CreateModel(
name='Timeline',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('date_start', models.DateTimeField(blank=True, null=True)),
('date_end', models.DateTimeField(blank=True, null=True)),
('duration', models.DurationField(blank=True, default=datetime.timedelta(days=10, seconds=36010), null=True)),
('date_due', models.DateTimeField(blank=True, null=True)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.DO_NOTHING, to=settings.AUTH_USER_MODEL)),
('type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='element.TimelineType')),
],
),
migrations.CreateModel(
name='DataPointMethod',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('script', models.TextField(blank=True, null=True)),
('active', models.BooleanField(default=False)),
('order', models.PositiveIntegerField(default=0)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='element.MethodType')),
],
),
migrations.CreateModel(
name='DataChild',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='CodeRev',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('number', models.CharField(blank=True, max_length=200, null=True)),
('suffix', models.CharField(blank=True, max_length=20, null=True)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('operating_system', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='element.OperatingSystem')),
('type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='element.CodeType')),
],
),
]
| 47.172932 | 151 | 0.585432 | 611 | 6,274 | 5.87725 | 0.175123 | 0.0401 | 0.047062 | 0.061543 | 0.751601 | 0.751601 | 0.63325 | 0.63325 | 0.63325 | 0.6313 | 0 | 0.016055 | 0.275263 | 6,274 | 132 | 152 | 47.530303 | 0.773697 | 0.006854 | 0 | 0.65873 | 1 | 0 | 0.106117 | 0.02376 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.031746 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8b0da55d6d365c65952ed687839427fb6275e5b5 | 360 | py | Python | python-server/setup.py | MobMonRob/Strand-Zaubertrick-mit-Waermebildkamera-Studien | d30e0f59e2bb0de0d24e74b76bd78fb1fc38a500 | [
"Apache-2.0"
] | null | null | null | python-server/setup.py | MobMonRob/Strand-Zaubertrick-mit-Waermebildkamera-Studien | d30e0f59e2bb0de0d24e74b76bd78fb1fc38a500 | [
"Apache-2.0"
] | null | null | null | python-server/setup.py | MobMonRob/Strand-Zaubertrick-mit-Waermebildkamera-Studien | d30e0f59e2bb0de0d24e74b76bd78fb1fc38a500 | [
"Apache-2.0"
] | null | null | null | from setuptools import setup
setup(
name='python_remote_processor',
version='0.1',
description='',
package_dir={'': 'src'},
packages=['python_remote_processor'],
install_requires=['llvmlite',
'numba',
'numpy',
'opencv-contrib-python',
'Pillow']
)
| 24 | 46 | 0.502778 | 29 | 360 | 6.034483 | 0.827586 | 0.137143 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008696 | 0.361111 | 360 | 14 | 47 | 25.714286 | 0.752174 | 0 | 0 | 0 | 0 | 0 | 0.269444 | 0.186111 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 0.076923 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8b311d84eec5d0eb18c4e0b282e17ebf77a722ef | 960 | py | Python | utils/state.py | DeWa/PiBooth | 2b2c1b85976bd36257d61e5d6461b52275f72241 | [
"MIT"
] | 1 | 2019-04-22T16:01:32.000Z | 2019-04-22T16:01:32.000Z | utils/state.py | DeWaster/PiBooth | 2b2c1b85976bd36257d61e5d6461b52275f72241 | [
"MIT"
] | 6 | 2021-03-18T21:38:58.000Z | 2022-03-11T23:35:31.000Z | utils/state.py | DeWaster/PiBooth | 2b2c1b85976bd36257d61e5d6461b52275f72241 | [
"MIT"
] | 1 | 2022-02-26T20:44:02.000Z | 2022-02-26T20:44:02.000Z | class State:
state = {}
@staticmethod
def init():
State.state = {
"lowres_frames": None,
"highres_frames": None,
"photopath": "",
"frame": 0,
"current_photo": None,
"share_code": "",
"preview_image_width": 0,
"photo_countdown": 0,
"reset_time": 0,
"upload_url": "",
"photo_url": "",
"api_key": ""
}
return State.state
@staticmethod
def set(value):
State.state[value[0]] = value[1]
return State.state
@staticmethod
def set_dict(value):
State.state = {**State.state, **value}
return State.state
@staticmethod
def get(key):
try:
            return State.state[key]
except KeyError:
print("No key found")
return None
@staticmethod
def print():
print(State.state)
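
The class above is a module-level singleton: every method reads and writes the shared `State.state` dict through static methods. A condensed, self-contained sketch of the same pattern (the `MiniState` name is made up for illustration):

```python
class MiniState:
    state = {}

    @staticmethod
    def init():
        MiniState.state = {"frame": 0, "share_code": ""}
        return MiniState.state

    @staticmethod
    def set(key, value):
        MiniState.state[key] = value
        return MiniState.state

    @staticmethod
    def get(key):
        # dict.get already returns None for missing keys,
        # so no try/except KeyError is needed here.
        return MiniState.state.get(key)


MiniState.init()
MiniState.set("frame", 3)
assert MiniState.get("frame") == 3
assert MiniState.get("missing") is None
```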
| 22.325581 | 46 | 0.479167 | 91 | 960 | 4.923077 | 0.417582 | 0.245536 | 0.196429 | 0.223214 | 0.220982 | 0.151786 | 0 | 0 | 0 | 0 | 0 | 0.010363 | 0.396875 | 960 | 42 | 47 | 22.857143 | 0.763385 | 0 | 0 | 0.216216 | 0 | 0 | 0.152083 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.135135 | false | 0 | 0 | 0 | 0.324324 | 0.081081 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8c973a97a9d83f1d1d9a2de482a3e6a3942fd083 | 51 | py | Python | programs/__init__.py | michael-xander/communique-webapp | 85b450d7f6d0313c5e5ef53a262a850b7e93c3d6 | [
"MIT"
] | null | null | null | programs/__init__.py | michael-xander/communique-webapp | 85b450d7f6d0313c5e5ef53a262a850b7e93c3d6 | [
"MIT"
] | null | null | null | programs/__init__.py | michael-xander/communique-webapp | 85b450d7f6d0313c5e5ef53a262a850b7e93c3d6 | [
"MIT"
] | null | null | null | default_app_config = 'programs.apps.ProgramsConfig' | 51 | 51 | 0.862745 | 6 | 51 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039216 | 51 | 1 | 51 | 51 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.538462 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8cb5d17f2c4ba03db1c97906ce683dee795fb860 | 840 | py | Python | affinity_pdbdataset/utility/PDB_container.py | Reimilia/Affinity_pdbdataset | 26538e9f6efff2251d36a5e1caa6689d1293241c | [
"MIT"
] | null | null | null | affinity_pdbdataset/utility/PDB_container.py | Reimilia/Affinity_pdbdataset | 26538e9f6efff2251d36a5e1caa6689d1293241c | [
"MIT"
] | null | null | null | affinity_pdbdataset/utility/PDB_container.py | Reimilia/Affinity_pdbdataset | 26538e9f6efff2251d36a5e1caa6689d1293241c | [
"MIT"
] | null | null | null |
class PDB_container:
'''
    Used to read a PDB file from a cocrystal structure and parse
    it for further use.
    Note: this class cannot write any files, and file writing must
    not be added to it.
    Keeping it read-only simplifies IO management, although the
    code might be hard to understand.
'''
def __init__(self):
'''
parse a pdb from file
'''
pass
def register_a_ligand(self):
pass
def Is_pure_protein(self):
pass
def Is_pure_nucleic(self):
pass
def Is_protein_nucleic_complex(self):
pass
def Bundle_ligand_result_dict(self):
pass
def Bundle_ligand_result_list(self):
pass
def list_ligand_ResId(self):
pass
def get_ligand_dict(self):
pass
class ligand_container:
pass
| 17.87234 | 70 | 0.621429 | 117 | 840 | 4.247863 | 0.470085 | 0.112676 | 0.15493 | 0.078471 | 0.185111 | 0.1167 | 0 | 0 | 0 | 0 | 0 | 0 | 0.329762 | 840 | 46 | 71 | 18.26087 | 0.882771 | 0.327381 | 0 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.47619 | 0 | 0 | 0.52381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
8cdd623deba84f746be842320145b9dd4b48be44 | 350 | py | Python | tests/test_app.py | miikapo/fastapi-crud | d8517d93068b0e71fb114a695a41f48570387b9a | [
"Apache-2.0"
] | 5 | 2021-11-02T20:13:41.000Z | 2022-03-19T00:01:53.000Z | tests/test_app.py | miikapo/fastapi-crud | d8517d93068b0e71fb114a695a41f48570387b9a | [
"Apache-2.0"
] | null | null | null | tests/test_app.py | miikapo/fastapi-crud | d8517d93068b0e71fb114a695a41f48570387b9a | [
"Apache-2.0"
] | null | null | null | from fastapi.testclient import TestClient
def test_index(test_client: TestClient):
r = test_client.get("/")
assert r.status_code == 200
assert r.json() == {"status": "OK"}
def test_read_companies_empty(test_client: TestClient):
r = test_client.get("/companies")
assert r.status_code == 200
assert r.json() == {"data": []}
| 25 | 55 | 0.671429 | 47 | 350 | 4.787234 | 0.425532 | 0.177778 | 0.177778 | 0.186667 | 0.577778 | 0.577778 | 0.577778 | 0.275556 | 0 | 0 | 0 | 0.020833 | 0.177143 | 350 | 13 | 56 | 26.923077 | 0.760417 | 0 | 0 | 0.222222 | 0 | 0 | 0.065714 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
8ce9abe4e606c47c8045a25651accc352892ea25 | 2,255 | py | Python | ecpy/fields/Field.py | andynuma/time-release-encryption-py3 | a5c48d07fae8121b59100d4cd79d3e38402d928c | [
"MIT"
] | 48 | 2016-03-30T07:20:49.000Z | 2022-01-27T10:48:43.000Z | ecpy/fields/Field.py | andynuma/time-release-encryption-py3 | a5c48d07fae8121b59100d4cd79d3e38402d928c | [
"MIT"
] | 11 | 2017-03-26T11:03:20.000Z | 2021-06-01T15:54:03.000Z | ecpy/fields/Field.py | andynuma/time-release-encryption-py3 | a5c48d07fae8121b59100d4cd79d3e38402d928c | [
"MIT"
] | 12 | 2016-06-05T19:09:26.000Z | 2021-04-18T04:23:20.000Z | from ..rings.CommutativeRing import CommutativeRing, CommutativeRingElement
class Field(CommutativeRing):
def __init__(s, element_class):
CommutativeRing.__init__(s, element_class)
def _inv(s, a):
raise NotImplementedError()
def _mod(s, a, b):
raise NotImplementedError()
def _div(s, a, b):
return s._mul(a, (s._inv(b)))
class FieldElement(CommutativeRingElement):
def __init__(s, field, x):
CommutativeRingElement.__init__(s, field, x)
s.field = field
if isinstance(x, FieldElement):
x = x.x
s.x = x
def change_field(s, _field):
return s.__class__(_field, *tuple(s))
def __mul__(s, rhs):
return s.field._mul(tuple(s), s._to_tuple(rhs))
def __div__(s, rhs):
return s.field._div(tuple(s), s._to_tuple(rhs))
def __rdiv__(s, lhs):
return s.field._div(s._to_tuple(lhs), tuple(s))
def __truediv__(s, rhs):
return s.field._div(tuple(s), s._to_tuple(rhs))
def __rtruediv__(s, lhs):
return s.field._div(s._to_tuple(lhs), tuple(s))
def __floordiv__(s, rhs):
if hasattr(s.field, '_fdiv'):
return s.field._fdiv(tuple(s), s._to_tuple(rhs))
else:
return s.field._div(tuple(s), s._to_tuple(rhs))
def __rfloordiv__(s, lhs):
if hasattr(s.field, '_fdiv'):
return s.field._fdiv(s._to_tuple(lhs), tuple(s))
else:
return s.field._div(s._to_tuple(lhs), tuple(s))
def __pow__(s, rhs, mod=None):
from six.moves import map
if rhs == 0:
return s.__class__(s.field, 1)
d = int(rhs)
if d < 0:
x = 1 / s
d = -d
else:
x = s
bits = list(map(int, bin(d)[2:]))[::-1]
if bits[0]:
res = x
else:
res = s.field(1)
for cur in bits[1:]:
x *= x
if cur:
res *= x
return res
def __mod__(s, rhs):
return s.field._mod(tuple(s), s._to_tuple(rhs))
def __rmod__(s, lhs):
return s.field._mod(s._to_tuple(lhs), tuple(s))
def __rmul__(s, lhs):
return s * lhs
def int(s):
return int(s.x)
def __ne__(s, rhs):
return not (s == rhs)
def __eq__(s, rhs):
return s.field._equ(tuple(s), s._to_tuple(rhs))
def __iter__(s):
return (s.x, ).__iter__()
def __int__(s):
return s.int()
def __hash__(s):
return s.x
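
The `__pow__` method above implements binary (square-and-multiply) exponentiation over the field. The same bit-scanning loop on plain integers, as a self-contained sketch (assumes a positive exponent, matching the branch after the `rhs == 0` check):

```python
def pow_binary(x, d):
    # Scan the exponent's bits from least to most significant,
    # squaring x at each step and multiplying it into the result
    # whenever the corresponding bit is set.
    bits = list(map(int, bin(d)[2:]))[::-1]
    res = x if bits[0] else 1
    for cur in bits[1:]:
        x *= x
        if cur:
            res *= x
    return res


assert pow_binary(3, 13) == 3 ** 13
assert pow_binary(2, 10) == 1024
```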
| 22.107843 | 76 | 0.609756 | 359 | 2,255 | 3.440111 | 0.172702 | 0.097166 | 0.116599 | 0.051012 | 0.390283 | 0.331984 | 0.304453 | 0.239676 | 0.239676 | 0.182996 | 0 | 0.005214 | 0.23459 | 2,255 | 101 | 77 | 22.326733 | 0.710313 | 0 | 0 | 0.181818 | 0 | 0 | 0.004437 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.298701 | false | 0 | 0.025974 | 0.207792 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
50694bb281664f99da72dbeb75de94ab7e0ba11f | 100 | py | Python | zoj/1001.py | zirui/challenges | e3d8f2c29685c9c7ecbe32127fe29476675b5131 | [
"MIT"
] | null | null | null | zoj/1001.py | zirui/challenges | e3d8f2c29685c9c7ecbe32127fe29476675b5131 | [
"MIT"
] | null | null | null | zoj/1001.py | zirui/challenges | e3d8f2c29685c9c7ecbe32127fe29476675b5131 | [
"MIT"
] | null | null | null | import sys
for line in sys.stdin:
a, b = line.split()
ans = int(a) + int(b)
    print(ans)
| 12.5 | 25 | 0.56 | 18 | 100 | 3.111111 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.31 | 100 | 7 | 26 | 14.285714 | 0.811594 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.2 | null | null | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5079a06f1827129279c0c31a01ad02811dd9cdb2 | 153 | py | Python | tezpie/p2p/messages/ack_message.py | dakk/tezpie | 04af3ff899ee2f58f5cda010583ca09f9c9d287b | [
"MIT"
] | 2 | 2020-09-26T09:51:12.000Z | 2020-09-26T10:09:28.000Z | tezpie/p2p/messages/ack_message.py | dakk/tezpie | 04af3ff899ee2f58f5cda010583ca09f9c9d287b | [
"MIT"
] | null | null | null | tezpie/p2p/messages/ack_message.py | dakk/tezpie | 04af3ff899ee2f58f5cda010583ca09f9c9d287b | [
"MIT"
] | null | null | null | from ...proto import Encoder
AckMessage = Encoder('AckMessage', [
{ 'type': 'u8be', 'name': 'status' }
])
AckMessage.ACK = 0x00
AckMessage.NACK = 0xFF | 19.125 | 37 | 0.666667 | 17 | 153 | 6 | 0.764706 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.150327 | 153 | 8 | 38 | 19.125 | 0.746154 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0.051948 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
508268549745d04934bd4cafd5a8a97dd1d27c5f | 197 | py | Python | Project/App/forms.py | cs-fullstack-2019-spring/django-mini-project5-rdunavant | 57426c6757fd4f08f61a259f8c87df8756147982 | [
"Apache-2.0"
] | null | null | null | Project/App/forms.py | cs-fullstack-2019-spring/django-mini-project5-rdunavant | 57426c6757fd4f08f61a259f8c87df8756147982 | [
"Apache-2.0"
] | null | null | null | Project/App/forms.py | cs-fullstack-2019-spring/django-mini-project5-rdunavant | 57426c6757fd4f08f61a259f8c87df8756147982 | [
"Apache-2.0"
] | null | null | null | from django import forms
class RecipeModel(forms.ModelForm):
class Meta:
fields = ["picture", "name", "description", "date", "creator", "edit"] | 39.4 | 78 | 0.654822 | 19 | 197 | 6.789474 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192893 | 197 | 5 | 78 | 39.4 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.282828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
5091cdf10cd88cdb3b76607a781d01ecf33a24f5 | 93 | py | Python | python/basic_data_types/tuples.py | scouvreur/hackerrank | 93a52ed6a744ce41fd215d8007435a682228fa01 | [
"MIT"
] | 1 | 2021-01-28T14:23:24.000Z | 2021-01-28T14:23:24.000Z | python/basic_data_types/tuples.py | scouvreur/hackerrank | 93a52ed6a744ce41fd215d8007435a682228fa01 | [
"MIT"
] | null | null | null | python/basic_data_types/tuples.py | scouvreur/hackerrank | 93a52ed6a744ce41fd215d8007435a682228fa01 | [
"MIT"
] | null | null | null | n = int(input())
integer_tuple = tuple(map(int, input().split()))
print(hash(integer_tuple))
| 23.25 | 48 | 0.698925 | 14 | 93 | 4.5 | 0.642857 | 0.253968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086022 | 93 | 3 | 49 | 31 | 0.741176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5092ee063107347172dea54671e779e32a9373c6 | 84 | py | Python | esercizi/listaFor.py | gdv/python-alfabetizzazione | d87561222de8a230db11d8529c49cf1702aec326 | [
"MIT"
] | null | null | null | esercizi/listaFor.py | gdv/python-alfabetizzazione | d87561222de8a230db11d8529c49cf1702aec326 | [
"MIT"
] | null | null | null | esercizi/listaFor.py | gdv/python-alfabetizzazione | d87561222de8a230db11d8529c49cf1702aec326 | [
"MIT"
] | 1 | 2019-03-26T11:14:33.000Z | 2019-03-26T11:14:33.000Z | lista = [7, 3.0 + 5, 'pippo', 2 +1j]
for elemento in lista :
print elemento
| 10.5 | 36 | 0.571429 | 14 | 84 | 3.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 0.285714 | 84 | 7 | 37 | 12 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
50a5d5b371e77b9a7b0ec716de0dcc084efde5c5 | 87 | py | Python | audesc/__init__.py | RainOfAshes/audesc | 037d461c9cd256abcc9981fa62fdb8c25ebf36a0 | [
"MIT"
] | null | null | null | audesc/__init__.py | RainOfAshes/audesc | 037d461c9cd256abcc9981fa62fdb8c25ebf36a0 | [
"MIT"
] | null | null | null | audesc/__init__.py | RainOfAshes/audesc | 037d461c9cd256abcc9981fa62fdb8c25ebf36a0 | [
"MIT"
] | null | null | null | from .describers import (
WaveDescriber,
FlacDescriber,
MP3Describer
)
| 14.5 | 25 | 0.666667 | 6 | 87 | 9.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.275862 | 87 | 5 | 26 | 17.4 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
50bb547cb0b25d36bf98f00cc60161cadf13763f | 447 | py | Python | {{cookiecutter.directory_name}}/apps/core/querysets/soft_delete.py | zurek11/django-project-template | 04dbdc87e4fd3ac14d235b3a6d5626f9f1ec2f61 | [
"MIT"
] | null | null | null | {{cookiecutter.directory_name}}/apps/core/querysets/soft_delete.py | zurek11/django-project-template | 04dbdc87e4fd3ac14d235b3a6d5626f9f1ec2f61 | [
"MIT"
] | null | null | null | {{cookiecutter.directory_name}}/apps/core/querysets/soft_delete.py | zurek11/django-project-template | 04dbdc87e4fd3ac14d235b3a6d5626f9f1ec2f61 | [
"MIT"
] | 2 | 2021-05-19T10:13:54.000Z | 2022-03-17T11:54:15.000Z | from django.db.models import QuerySet
from django.utils import timezone
class SoftDeleteQuerySet(QuerySet):
def delete(self):
return super(SoftDeleteQuerySet, self).update(deleted_at=timezone.now(), is_deleted=True)
def hard_delete(self):
return super(SoftDeleteQuerySet, self).delete()
def alive(self):
return self.filter(is_deleted=False)
def dead(self):
return self.exclude(is_deleted=True)
| 26.294118 | 97 | 0.718121 | 56 | 447 | 5.642857 | 0.482143 | 0.126582 | 0.101266 | 0.132911 | 0.272152 | 0.272152 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183445 | 447 | 16 | 98 | 27.9375 | 0.865753 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0 | 0.181818 | 0.363636 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
50c2aa2c33e2f3f693095203a0f654662b2d32a0 | 624 | py | Python | tests/test_counter_limit_source.py | Arfius/requests-counter | 053ce84de2ffb88a54487a21757dd992e1b5b901 | [
"MIT"
] | null | null | null | tests/test_counter_limit_source.py | Arfius/requests-counter | 053ce84de2ffb88a54487a21757dd992e1b5b901 | [
"MIT"
] | null | null | null | tests/test_counter_limit_source.py | Arfius/requests-counter | 053ce84de2ffb88a54487a21757dd992e1b5b901 | [
"MIT"
] | null | null | null | from requests_counter.reqcounter import ReqCounter
import pytest
origin_allowed = ["origin_1", "origin_2"]
@pytest.mark.asyncio
async def test_check_origin_in():
cl = ReqCounter('redis://localhost')
await cl.setup_source(origin_allowed)
res = await cl.check_source("origin_1")
await cl.destroy_all()
await cl.close()
assert res is True
@pytest.mark.asyncio
async def test_check_origin_not_in():
cl = ReqCounter('redis://localhost')
await cl.setup_source(origin_allowed)
res = await cl.check_source("origin_12")
await cl.destroy_all()
await cl.close()
assert res is False
| 24.96 | 50 | 0.724359 | 89 | 624 | 4.842697 | 0.370787 | 0.12993 | 0.078886 | 0.102088 | 0.770302 | 0.770302 | 0.770302 | 0.770302 | 0.584687 | 0.584687 | 0 | 0.009634 | 0.168269 | 624 | 24 | 51 | 26 | 0.820809 | 0 | 0 | 0.526316 | 0 | 0 | 0.107372 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
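The two tests above pin down the `ReqCounter` contract: after `setup_source(origin_allowed)`, `check_source` returns `True` for a registered origin and `False` otherwise. A Redis-free sketch of that contract (the `InMemoryReqCounter` class is a hypothetical stand-in, not the real `requests_counter` API):

```python
class InMemoryReqCounter:
    """Redis-free stand-in for ReqCounter's origin allow-list behaviour."""
    def __init__(self):
        self.sources = set()

    def setup_source(self, origins):
        # register the allowed origins, as the async setup_source does in Redis
        self.sources = set(origins)

    def check_source(self, origin):
        # True only for origins registered via setup_source
        return origin in self.sources

cl = InMemoryReqCounter()
cl.setup_source(["origin_1", "origin_2"])
print(cl.check_source("origin_1"))   # True
print(cl.check_source("origin_12"))  # False
```

Note the second test deliberately probes `"origin_12"`, a near-miss of `"origin_1"`, so a substring-based check would fail it; set membership handles it correctly.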
50cec153b126270fd9ef368280797071a3ee4a8d | 2,236 | py | Python | deuce/drivers/storage/blocks/disk/diskstoragedriver.py | TheSriram/deuce | 9e8a7a342275aa02d0a59953b5a8c96ffb760b51 | [
"Apache-2.0"
] | null | null | null | deuce/drivers/storage/blocks/disk/diskstoragedriver.py | TheSriram/deuce | 9e8a7a342275aa02d0a59953b5a8c96ffb760b51 | [
"Apache-2.0"
] | null | null | null | deuce/drivers/storage/blocks/disk/diskstoragedriver.py | TheSriram/deuce | 9e8a7a342275aa02d0a59953b5a8c96ffb760b51 | [
"Apache-2.0"
] | null | null | null | from pecan import conf
from deuce.drivers.storage.blocks import BlockStorageDriver
import os
import io
import shutil
class DiskStorageDriver(BlockStorageDriver):
"""A driver for storing blocks onto local disk
IMPORTANT: This driver should not be considered
secure and therefore should not be run in
any production environment.
"""
def __init__(self):
# Load the pecan config
self._path = conf.block_storage_driver.options.path
def _get_vault_path(self, project_id, vault_id):
return os.path.join(self._path, str(project_id), vault_id)
def _get_block_path(self, project_id, vault_id, block_id):
vault_path = self._get_vault_path(project_id, vault_id)
return os.path.join(vault_path, str(block_id))
def create_vault(self, project_id, vault_id):
path = self._get_vault_path(project_id, vault_id)
if not os.path.exists(path):
shutil.os.makedirs(path)
def vault_exists(self, project_id, vault_id):
path = self._get_vault_path(project_id, vault_id)
return os.path.exists(path)
def delete_vault(self, project_id, vault_id):
path = self._get_vault_path(project_id, vault_id)
os.rmdir(path)
def store_block(self, project_id, vault_id, block_id, blockdata):
path = self._get_block_path(project_id, vault_id, block_id)
with open(path, 'wb') as outfile:
outfile.write(blockdata)
return True
def block_exists(self, project_id, vault_id, block_id):
path = self._get_block_path(project_id, vault_id, block_id)
return os.path.exists(path)
def delete_block(self, project_id, vault_id, block_id):
path = self._get_block_path(project_id, vault_id, block_id)
if os.path.exists(path):
os.remove(path)
def get_block_obj(self, project_id, vault_id, block_id):
"""Returns a file-like object capable of streaming the
block data. If the block cannot be retrieved, None
is returned.
"""
path = self._get_block_path(project_id, vault_id, block_id)
if not os.path.exists(path):
return None
return open(path, 'rb')
| 31.055556 | 69 | 0.682916 | 326 | 2,236 | 4.395706 | 0.257669 | 0.092812 | 0.175855 | 0.200977 | 0.494766 | 0.494766 | 0.478018 | 0.412421 | 0.326588 | 0.326588 | 0 | 0 | 0.232111 | 2,236 | 71 | 70 | 31.492958 | 0.834595 | 0.144454 | 0 | 0.275 | 0 | 0 | 0.002151 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.025 | 0.575 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
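`DiskStorageDriver` lays blocks out as one file per block under `<root>/<project>/<vault>/<block>` and reads its root path from the pecan config. The same store/retrieve round trip can be sketched standalone with `tempfile`; the function names below mirror the driver's methods but are illustrative helpers, not the driver itself:

```python
import os
import tempfile

def store_block(root, vault_id, block_id, data):
    # mirror DiskStorageDriver.store_block: one file per block under the vault dir
    vault = os.path.join(root, vault_id)
    os.makedirs(vault, exist_ok=True)
    path = os.path.join(vault, block_id)
    with open(path, "wb") as outfile:
        outfile.write(data)
    return path

def get_block(root, vault_id, block_id):
    # mirror get_block_obj: open a stream for reading, or None when the block is missing
    path = os.path.join(root, vault_id, block_id)
    if not os.path.exists(path):
        return None
    return open(path, "rb")

with tempfile.TemporaryDirectory() as root:
    store_block(root, "vault1", "block1", b"payload")
    with get_block(root, "vault1", "block1") as f:
        print(f.read())  # b'payload'
    print(get_block(root, "vault1", "missing"))  # None
```

Returning an open file object rather than the bytes is the point of `get_block_obj`: callers can stream large blocks without loading them fully into memory.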
50cee4d951c2f762a5a95f2caf26269dacd77fd4 | 17,235 | py | Python | submodules/tensorflow-fcn/NNet.py | zhuanjiao2222/MultiAi | d53f1bd15ba591218c2209e3d14765c8d3f06707 | [
"MIT"
] | null | null | null | submodules/tensorflow-fcn/NNet.py | zhuanjiao2222/MultiAi | d53f1bd15ba591218c2209e3d14765c8d3f06707 | [
"MIT"
] | null | null | null | submodules/tensorflow-fcn/NNet.py | zhuanjiao2222/MultiAi | d53f1bd15ba591218c2209e3d14765c8d3f06707 | [
"MIT"
] | null | null | null | from network import Network
import tensorflow as tf
class NEWNet_BN(Network):
def setup(self, is_training, num_classes):
(self.feed('data')
# qianbo - +
# .interp(s_factor=0.5, name='data_sub2')
.conv(3, 3, 32*2, 2, 2, biased=False, padding='SAME', relu=False, name='conv1_1_3x3')
.batch_normalization(relu=True, name='conv1_1_3x3_bn')
# =================
.conv(3, 3, 32*2, 2, 2, biased=False, padding='SAME', relu=False, name='conv1_1_3x3_s2')
.batch_normalization(relu=True, name='conv1_1_3x3_s2_bn')
.conv(3, 3, 32*2, 1, 1, biased=False, padding='SAME', relu=False, name='conv1_2_3x3')
.batch_normalization(relu=True, name='conv1_2_3x3_bn')
.conv(3, 3, 64*2, 1, 1, biased=False, padding='SAME', relu=False, name='conv1_3_3x3')
.batch_normalization(relu=True, name='conv1_3_3x3_bn')
# qianbo - +
# .zero_padding(paddings=1, name='padding0')
# .max_pool(3, 3, 2, 2, name='pool1_3x3_s2')
.conv(3, 3, 64*2, 2, 2, biased=False, padding='SAME', relu=False, name='conv1_1_3x3_1')
.batch_normalization(relu=True, name='conv1_1_3x3_1_bn')
# =============
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv2_1_1x1_proj')
.batch_normalization(relu=False, name='conv2_1_1x1_proj_bn'))
(self.feed('conv1_1_3x3_1_bn')
.conv(1, 1, 32*2, 1, 1, biased=False, relu=False, name='conv2_1_1x1_reduce')
.batch_normalization(relu=True, name='conv2_1_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding1')
.conv(3, 3, 32*2, 1, 1, biased=False, relu=False, name='conv2_1_3x3')
.batch_normalization(relu=True, name='conv2_1_3x3_bn')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv2_1_1x1_increase')
.batch_normalization(relu=False, name='conv2_1_1x1_increase_bn'))
(self.feed('conv2_1_1x1_proj_bn',
'conv2_1_1x1_increase_bn')
.add(name='conv2_1')
.relu(name='conv2_1/relu')
.conv(1, 1, 32*2, 1, 1, biased=False, relu=False, name='conv2_2_1x1_reduce')
.batch_normalization(relu=True, name='conv2_2_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding2')
.conv(3, 3, 32*2, 1, 1, biased=False, relu=False, name='conv2_2_3x3')
.batch_normalization(relu=True, name='conv2_2_3x3_bn')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv2_2_1x1_increase')
.batch_normalization(relu=False, name='conv2_2_1x1_increase_bn'))
(self.feed('conv2_1/relu',
'conv2_2_1x1_increase_bn')
.add(name='conv2_2')
.relu(name='conv2_2/relu')
.conv(1, 1, 32*2, 1, 1, biased=False, relu=False, name='conv2_3_1x1_reduce')
.batch_normalization(relu=True, name='conv2_3_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding3')
.conv(3, 3, 32*2, 1, 1, biased=False, relu=False, name='conv2_3_3x3')
.batch_normalization(relu=True, name='conv2_3_3x3_bn')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv2_3_1x1_increase')
.batch_normalization(relu=False, name='conv2_3_1x1_increase_bn'))
(self.feed('conv2_2/relu',
'conv2_3_1x1_increase_bn')
.add(name='conv2_3')
.relu(name='conv2_3/relu')
.conv(1, 1, 256*2, 2, 2, biased=False, relu=False, name='conv3_1_1x1_proj')
.batch_normalization(relu=False, name='conv3_1_1x1_proj_bn'))
(self.feed('conv2_3/relu')
.conv(1, 1, 64*2, 2, 2, biased=False, relu=False, name='conv3_1_1x1_reduce')
.batch_normalization(relu=True, name='conv3_1_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding4')
.conv(3, 3, 64*2, 1, 1, biased=False, relu=False, name='conv3_1_3x3')
.batch_normalization(relu=True, name='conv3_1_3x3_bn')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv3_1_1x1_increase')
.batch_normalization(relu=False, name='conv3_1_1x1_increase_bn'))
(self.feed('conv3_1_1x1_proj_bn',
'conv3_1_1x1_increase_bn')
.add(name='conv3_1')
.relu(name='conv3_1/relu')
# .conv(3, 3, 256 * 2, 2, 2, biased=False, padding='SAME', relu=False, name='conv3_1_sub4')
# .batch_normalization(relu=True, name='conv3_1_sub4_bn')
.interp(s_factor=0.5, name='conv3_1_sub4')
.conv(1, 1, 64*2, 1, 1, biased=False, relu=False, name='conv3_2_1x1_reduce')
.batch_normalization(relu=True, name='conv3_2_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding5')
.conv(3, 3, 64*2, 1, 1, biased=False, relu=False, name='conv3_2_3x3')
.batch_normalization(relu=True, name='conv3_2_3x3_bn')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv3_2_1x1_increase')
.batch_normalization(relu=False, name='conv3_2_1x1_increase_bn'))
(self.feed('conv3_1_sub4',
'conv3_2_1x1_increase_bn')
.add(name='conv3_2')
.relu(name='conv3_2/relu')
.conv(1, 1, 64*2, 1, 1, biased=False, relu=False, name='conv3_3_1x1_reduce')
.batch_normalization(relu=True, name='conv3_3_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding6')
.conv(3, 3, 64*2, 1, 1, biased=False, relu=False, name='conv3_3_3x3')
.batch_normalization(relu=True, name='conv3_3_3x3_bn')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv3_3_1x1_increase')
.batch_normalization(relu=False, name='conv3_3_1x1_increase_bn'))
(self.feed('conv3_2/relu',
'conv3_3_1x1_increase_bn')
.add(name='conv3_3')
.relu(name='conv3_3/relu')
.conv(1, 1, 64*2, 1, 1, biased=False, relu=False, name='conv3_4_1x1_reduce')
.batch_normalization(relu=True, name='conv3_4_1x1_reduce_bn')
.zero_padding(paddings=1, name='padding7')
.conv(3, 3, 64*2, 1, 1, biased=False, relu=False, name='conv3_4_3x3')
.batch_normalization(relu=True, name='conv3_4_3x3_bn')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv3_4_1x1_increase')
.batch_normalization(relu=False, name='conv3_4_1x1_increase_bn'))
(self.feed('conv3_3/relu',
'conv3_4_1x1_increase_bn')
.add(name='conv3_4')
.relu(name='conv3_4/relu')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_1_1x1_proj')
.batch_normalization(relu=False, name='conv4_1_1x1_proj_bn'))
(self.feed('conv3_4/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_1_1x1_reduce')
.batch_normalization(relu=True, name='conv4_1_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding8')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_1_3x3')
.batch_normalization(relu=True, name='conv4_1_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_1_1x1_increase')
.batch_normalization(relu=False, name='conv4_1_1x1_increase_bn'))
(self.feed('conv4_1_1x1_proj_bn',
'conv4_1_1x1_increase_bn')
.add(name='conv4_1')
.relu(name='conv4_1/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_2_1x1_reduce')
.batch_normalization(relu=True, name='conv4_2_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding9')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_2_3x3')
.batch_normalization(relu=True, name='conv4_2_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_2_1x1_increase')
.batch_normalization(relu=False, name='conv4_2_1x1_increase_bn'))
(self.feed('conv4_1/relu',
'conv4_2_1x1_increase_bn')
.add(name='conv4_2')
.relu(name='conv4_2/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_3_1x1_reduce')
.batch_normalization(relu=True, name='conv4_3_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding10')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_3_3x3')
.batch_normalization(relu=True, name='conv4_3_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_3_1x1_increase')
.batch_normalization(relu=False, name='conv4_3_1x1_increase_bn'))
(self.feed('conv4_2/relu',
'conv4_3_1x1_increase_bn')
.add(name='conv4_3')
.relu(name='conv4_3/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_4_1x1_reduce')
.batch_normalization(relu=True, name='conv4_4_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding11')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_4_3x3')
.batch_normalization(relu=True, name='conv4_4_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_4_1x1_increase')
.batch_normalization(relu=False, name='conv4_4_1x1_increase_bn'))
(self.feed('conv4_3/relu',
'conv4_4_1x1_increase_bn')
.add(name='conv4_4')
.relu(name='conv4_4/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_5_1x1_reduce')
.batch_normalization(relu=True, name='conv4_5_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding12')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_5_3x3')
.batch_normalization(relu=True, name='conv4_5_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_5_1x1_increase')
.batch_normalization(relu=False, name='conv4_5_1x1_increase_bn'))
(self.feed('conv4_4/relu',
'conv4_5_1x1_increase_bn')
.add(name='conv4_5')
.relu(name='conv4_5/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv4_6_1x1_reduce')
.batch_normalization(relu=True, name='conv4_6_1x1_reduce_bn')
.zero_padding(paddings=2, name='padding13')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv4_6_3x3')
.batch_normalization(relu=True, name='conv4_6_3x3_bn')
.conv(1, 1, 512*2, 1, 1, biased=False, relu=False, name='conv4_6_1x1_increase')
.batch_normalization(relu=False, name='conv4_6_1x1_increase_bn'))
(self.feed('conv4_5/relu',
'conv4_6_1x1_increase_bn')
.add(name='conv4_6')
.relu(name='conv4_6/relu')
.conv(1, 1, 1024*2, 1, 1, biased=False, relu=False, name='conv5_1_1x1_proj')
.batch_normalization(relu=False, name='conv5_1_1x1_proj_bn'))
(self.feed('conv4_6/relu')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv5_1_1x1_reduce')
.batch_normalization(relu=True, name='conv5_1_1x1_reduce_bn')
.zero_padding(paddings=4, name='padding14')
.atrous_conv(3, 3, 256*2, 4, biased=False, relu=False, name='conv5_1_3x3')
.batch_normalization(relu=True, name='conv5_1_3x3_bn')
.conv(1, 1, 1024*2, 1, 1, biased=False, relu=False, name='conv5_1_1x1_increase')
.batch_normalization(relu=False, name='conv5_1_1x1_increase_bn'))
(self.feed('conv5_1_1x1_proj_bn',
'conv5_1_1x1_increase_bn')
.add(name='conv5_1')
.relu(name='conv5_1/relu')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv5_2_1x1_reduce')
.batch_normalization(relu=True, name='conv5_2_1x1_reduce_bn')
.zero_padding(paddings=4, name='padding15')
.atrous_conv(3, 3, 256*2, 4, biased=False, relu=False, name='conv5_2_3x3')
.batch_normalization(relu=True, name='conv5_2_3x3_bn')
.conv(1, 1, 1024*2, 1, 1, biased=False, relu=False, name='conv5_2_1x1_increase')
.batch_normalization(relu=False, name='conv5_2_1x1_increase_bn'))
(self.feed('conv5_1/relu',
'conv5_2_1x1_increase_bn')
.add(name='conv5_2')
.relu(name='conv5_2/relu')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv5_3_1x1_reduce')
.batch_normalization(relu=True, name='conv5_3_1x1_reduce_bn')
.zero_padding(paddings=4, name='padding16')
.atrous_conv(3, 3, 256*2, 4, biased=False, relu=False, name='conv5_3_3x3')
.batch_normalization(relu=True, name='conv5_3_3x3_bn')
.conv(1, 1, 1024*2, 1, 1, biased=False, relu=False, name='conv5_3_1x1_increase')
.batch_normalization(relu=False, name='conv5_3_1x1_increase_bn'))
(self.feed('conv5_2/relu',
'conv5_3_1x1_increase_bn')
.add(name='conv5_3')
.relu(name='conv5_3/relu'))
shape = self.layers['conv5_3/relu'].get_shape().as_list()[1:3]
h, w = shape
# shape = tf.shape(self.layers['conv5_3/relu'])
# h = shape[1]
# w = shape[2]
(self.feed('conv5_3/relu')
.avg_pool(h, w, h, w, name='conv5_3_pool1')
.resize_bilinear(shape, name='conv5_3_pool1_interp'))
(self.feed('conv5_3/relu')
.avg_pool(h/2, w/2, h/2, w/2, name='conv5_3_pool2')
.resize_bilinear(shape, name='conv5_3_pool2_interp'))
(self.feed('conv5_3/relu')
.avg_pool(h/3, w/3, h/3, w/3, name='conv5_3_pool3')
.resize_bilinear(shape, name='conv5_3_pool3_interp'))
(self.feed('conv5_3/relu')
.avg_pool(h/6, w/6, h/6, w/6, name='conv5_3_pool6')
.resize_bilinear(shape, name='conv5_3_pool6_interp'))
(self.feed('conv5_3/relu',
'conv5_3_pool6_interp',
'conv5_3_pool3_interp',
'conv5_3_pool2_interp',
'conv5_3_pool1_interp')
.add(name='conv5_3_sum')
.conv(1, 1, 256*2, 1, 1, biased=False, relu=False, name='conv5_4_k1')
.batch_normalization(relu=True, name='conv5_4_k1_bn')
.interp(z_factor=2.0, name='conv5_4_interp')
.zero_padding(paddings=2, name='padding17')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv_sub4')
.batch_normalization(relu=False, name='conv_sub4_bn'))
(self.feed('conv3_1/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv3_1_sub2_proj')
.batch_normalization(relu=False, name='conv3_1_sub2_proj_bn'))
(self.feed('conv_sub4_bn',
'conv3_1_sub2_proj_bn')
.add(name='sub24_sum')
.relu(name='sub24_sum/relu')
.interp(z_factor=2.0, name='sub24_sum_interp')
.zero_padding(paddings=2, name='padding18')
.atrous_conv(3, 3, 128*2, 2, biased=False, relu=False, name='conv_sub2')
.batch_normalization(relu=False, name='conv_sub2_bn'))
# qianbo - +
# (self.feed('data')
# .conv(3, 3, 32, 2, 2, biased=False, padding='SAME', relu=False, name='conv1_sub1')
# .batch_normalization(relu=True, name='conv1_sub1_bn')
# .conv(3, 3, 32, 2, 2, biased=False, padding='SAME', relu=False, name='conv2_sub1')
# .batch_normalization(relu=True, name='conv2_sub1_bn')
# .conv(3, 3, 64, 2, 2, biased=False, padding='SAME', relu=False, name='conv3_sub1')
# .batch_normalization(relu=True, name='conv3_sub1_bn')
# .conv(1, 1, 128, 1, 1, biased=False, relu=False, name='conv3_sub1_proj')
# .batch_normalization(relu=False, name='conv3_sub1_proj_bn'))
(self.feed('conv2_3/relu')
.conv(1, 1, 128*2, 1, 1, biased=False, relu=False, name='conv2_3_sub1_proj')
.batch_normalization(relu=False, name='conv2_3_sub1_proj_bn'))
# ================================================================================
(self.feed('conv_sub2_bn',
'conv2_3_sub1_proj_bn')
.add(name='sub12_sum')
.relu(name='sub12_sum/relu')
.interp(z_factor=2.0, name='sub12_sum_interp')
.conv(1, 1, num_classes, 1, 1, biased=True, relu=False, name='conv6_cls'))
# (self.feed('conv5_4_interp')
# .conv(1, 1, num_classes, 1, 1, biased=True, relu=False, name='sub4_out'))
#
# (self.feed('sub24_sum_interp')
# .conv(1, 1, num_classes, 1, 1, biased=True, relu=False, name='sub24_out'))
(self.feed('conv6_cls')
.interp(z_factor=4.0, name='conv6_cls_out'))
| 53.859375 | 103 | 0.593734 | 2,494 | 17,235 | 3.807538 | 0.042101 | 0.090038 | 0.130055 | 0.122157 | 0.887426 | 0.858677 | 0.735257 | 0.610046 | 0.379949 | 0.345409 | 0 | 0.106068 | 0.24839 | 17,235 | 319 | 104 | 54.028213 | 0.626988 | 0.075544 | 0 | 0.02381 | 0 | 0 | 0.236733 | 0.067404 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003968 | false | 0 | 0.007937 | 0 | 0.015873 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
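The long `setup` above leans on a fluent API inherited from `Network`: `feed(...)` selects named inputs, each op (`conv`, `batch_normalization`, `add`, ...) registers a new named layer and returns `self`, so whole branches are written as one chained expression. A minimal sketch of that chaining mechanism (a toy `MiniNetwork`, not the actual `Network` base class) under those assumptions:

```python
class MiniNetwork:
    """Toy version of the fluent layer-registration API used by the NNet definition."""
    def __init__(self):
        self.layers = {}
        self.current = None

    def feed(self, *names):
        # select one or more existing layers (or raw input names) as the next op's input
        self.current = [self.layers.get(n, n) for n in names]
        return self

    def op(self, kind, name):
        # register a named layer produced by this op and make it the new current output
        self.layers[name] = (kind, self.current)
        self.current = [self.layers[name]]
        return self

net = MiniNetwork()
(net.feed("data")
    .op("conv", "conv1")
    .op("bn", "conv1_bn"))
print(sorted(net.layers))  # ['conv1', 'conv1_bn']
```

Because every op stores itself under its `name=` argument, later calls like `self.feed('conv2_1_1x1_proj_bn', 'conv2_1_1x1_increase_bn').add(...)` can fuse branches by name, which is how the residual skip connections in the definition above are wired.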
50e687a34102d32cb07c18e8afdb8a819da9c4b8 | 1,033 | py | Python | setup.py | liumengjun/django-fast-test | 1996fc284872ae61f8d1634ecaa27a8030d8b8d7 | [
"MIT"
] | null | null | null | setup.py | liumengjun/django-fast-test | 1996fc284872ae61f8d1634ecaa27a8030d8b8d7 | [
"MIT"
] | null | null | null | setup.py | liumengjun/django-fast-test | 1996fc284872ae61f8d1634ecaa27a8030d8b8d7 | [
"MIT"
] | null | null | null | from setuptools import setup, find_packages
from django_fast_test import __version__ as version_str
setup(
name = "django-fast-test",
version = version_str,
license = "MIT",
description = "Django fast test command.",
author = "Mengjun Liu",
author_email = "mengjun18@gmail.com",
url = "https://github.com/liumengjun/django-fast-test",
packages = find_packages(),
install_requires = [
"django>=1.7",
],
extras_require = {
"test": [
],
},
classifiers = [
"Development Status :: 3 - Alpha",
"Environment :: Web Environment",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.3",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5",
"Framework :: Django",
],
)
| 27.918919 | 59 | 0.588577 | 101 | 1,033 | 5.891089 | 0.554455 | 0.159664 | 0.210084 | 0.17479 | 0.090756 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016064 | 0.276864 | 1,033 | 36 | 60 | 28.694444 | 0.780455 | 0 | 0 | 0.09375 | 0 | 0 | 0.478219 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
50f92a592946d1897b5fc6bb541c6bc8fe13164b | 581 | py | Python | toontown/distributed/DistributedTimerAI.py | CrankySupertoon01/Toontown-2 | 60893d104528a8e7eb4aced5d0015f22e203466d | [
"MIT"
] | 1 | 2021-02-13T22:40:50.000Z | 2021-02-13T22:40:50.000Z | toontown/distributed/DistributedTimerAI.py | CrankySupertoonArchive/Toontown-2 | 60893d104528a8e7eb4aced5d0015f22e203466d | [
"MIT"
] | 1 | 2018-07-28T20:07:04.000Z | 2018-07-30T18:28:34.000Z | toontown/distributed/DistributedTimerAI.py | CrankySupertoonArchive/Toontown-2 | 60893d104528a8e7eb4aced5d0015f22e203466d | [
"MIT"
] | 2 | 2019-12-02T01:39:10.000Z | 2021-02-13T22:41:00.000Z | from direct.directnotify import DirectNotifyGlobal
from direct.distributed.DistributedObjectAI import DistributedObjectAI
from direct.distributed.ClockDelta import *
import time
class DistributedTimerAI(DistributedObjectAI):
notify = DirectNotifyGlobal.directNotify.newCategory("DistributedTimerAI")
def __init__(self, air):
DistributedObjectAI.__init__(self, air)
self.setStartTime(globalClockDelta.getRealNetworkTime(bits = 32))
def setStartTime(self, time):
self.startTime = time
def getStartTime(self):
return self.startTime
| 32.277778 | 78 | 0.776248 | 53 | 581 | 8.358491 | 0.471698 | 0.06772 | 0.094808 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004065 | 0.153184 | 581 | 17 | 79 | 34.176471 | 0.896341 | 0 | 0 | 0 | 0 | 0 | 0.030981 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.307692 | 0.076923 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
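`DistributedTimerAI` records a network start time at construction so clients can derive elapsed time from a shared clock. A purely local sketch of that record-start-then-measure idea (no Panda3D, `SimpleTimer` is an illustrative name):

```python
import time

class SimpleTimer:
    """Local analogue of DistributedTimerAI: capture a start time, expose it."""
    def __init__(self):
        # monotonic clock avoids jumps from wall-clock adjustments
        self.start_time = time.monotonic()

    def get_start_time(self):
        return self.start_time

    def elapsed(self):
        return time.monotonic() - self.start_time

t = SimpleTimer()
print(t.elapsed() >= 0)  # True
```

In the distributed version the same role is played by `globalClockDelta.getRealNetworkTime(bits=32)`, which keeps the stored value comparable across machines.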
0f9a936b19e68c743f212d7c29b6877da2eaaa94 | 868 | py | Python | testsuite/tests/NA17-007__line_length/run_test.py | AdaCore/style_checker | 17108ebfc44375498063ecdad6c6e4430458e60a | [
"CNRI-Python"
] | 2 | 2017-10-22T18:04:26.000Z | 2020-03-06T11:07:41.000Z | testsuite/tests/NA17-007__line_length/run_test.py | AdaCore/style_checker | 17108ebfc44375498063ecdad6c6e4430458e60a | [
"CNRI-Python"
] | null | null | null | testsuite/tests/NA17-007__line_length/run_test.py | AdaCore/style_checker | 17108ebfc44375498063ecdad6c6e4430458e60a | [
"CNRI-Python"
] | 4 | 2018-05-22T12:08:54.000Z | 2020-12-14T15:25:27.000Z | def test_length_ko_1_adb(style_checker):
"""Check length-ko-1.adb
"""
style_checker.set_year(2006)
p = style_checker.run_style_checker('gnat', 'length-ko-1.adb')
style_checker.assertNotEqual(p.status, 0, p.image)
style_checker.assertRunOutputEqual(p, """\
length-ko-1.adb:50:80: (style) this line is too long
""")
def test_length_ko_2_c(style_checker):
"""Check length-ko-2.c
"""
style_checker.set_year(2006)
p = style_checker.run_style_checker('gnat', 'length-ko-2.c')
style_checker.assertEqual(p.status, 0, p.image)
style_checker.assertRunOutputEmpty(p)
def test_misc_ok_1_c(style_checker):
"""Check misc-ok-1.c
"""
style_checker.set_year(2006)
p = style_checker.run_style_checker('gnat', 'misc-ok-1.c')
style_checker.assertEqual(p.status, 0, p.image)
style_checker.assertRunOutputEmpty(p)
| 31 | 66 | 0.707373 | 135 | 868 | 4.281481 | 0.244444 | 0.373702 | 0.134948 | 0.083045 | 0.82872 | 0.811419 | 0.607266 | 0.562284 | 0.562284 | 0.562284 | 0 | 0.039136 | 0.146313 | 868 | 27 | 67 | 32.148148 | 0.740891 | 0.079493 | 0 | 0.411765 | 0 | 0 | 0.136247 | 0.028278 | 0 | 0 | 0 | 0 | 0.352941 | 1 | 0.176471 | false | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
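The tests above exercise a line-length rule: `length-ko-1.adb` must trigger a "(style) this line is too long" diagnostic while the other files pass clean. The core of such a check can be sketched in a few lines (the message format is modeled on the expected output above; the limit value is an assumption, not the real style_checker configuration):

```python
def check_line_lengths(text, limit=79):
    """Return style-checker style messages for lines longer than `limit` characters."""
    errors = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if len(line) > limit:
            errors.append(f"{lineno}: (style) this line is too long")
    return errors

src = "short line\n" + "x" * 100 + "\n"
print(check_line_lengths(src))  # ['2: (style) this line is too long']
```

An empty result corresponds to the `assertRunOutputEmpty` branch of the tests; a non-empty one to the non-zero exit status asserted for `length-ko-1.adb`.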
0f9f0370e2f0099a0b8d46f70c478d5ea2b7fa71 | 758 | py | Python | utils/metrics.py | aarshpatel/Modeling-Affect-Intensity | 163bdcc6dea6345c1bf3344f625ad1bcaa531081 | [
"MIT"
] | 1 | 2017-11-23T19:28:39.000Z | 2017-11-23T19:28:39.000Z | utils/metrics.py | aarshpatel/Modeling-Affect-Intensity | 163bdcc6dea6345c1bf3344f625ad1bcaa531081 | [
"MIT"
] | null | null | null | utils/metrics.py | aarshpatel/Modeling-Affect-Intensity | 163bdcc6dea6345c1bf3344f625ad1bcaa531081 | [
"MIT"
] | 1 | 2018-11-29T06:16:03.000Z | 2018-11-29T06:16:03.000Z | """ Implementation of various evaluation metrics """
from sklearn.metrics import mean_squared_error
from sklearn.metrics import make_scorer
import scipy
import math
def rmse(y_true, y_pred):
""" Root mean squared error """
return math.sqrt(mean_squared_error(y_true, y_pred))
def mse(y_true, y_pred):
""" Mean Squared Error """
return mean_squared_error(y_true, y_pred)
def pearson_correlation(y_true, y_pred):
""" Caculates the pearson correlation between two datasets """
return scipy.stats.pearsonr(y_true, y_pred)[0]
# scikit-learn scorer funcs
rmse_scorer = make_scorer(rmse, greater_is_better=False)
mse_scorer = make_scorer(mse, greater_is_better=False)
pearson_scorer = make_scorer(pearson_correlation, greater_is_better=True)
| 31.583333 | 73 | 0.779683 | 114 | 758 | 4.894737 | 0.359649 | 0.053763 | 0.064516 | 0.107527 | 0.103943 | 0.103943 | 0.103943 | 0.103943 | 0 | 0 | 0 | 0.001511 | 0.126649 | 758 | 23 | 74 | 32.956522 | 0.84139 | 0.228232 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.307692 | 0 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
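The `rmse` helper above wraps scikit-learn, but the metric itself is just the square root of the mean squared residual. A dependency-free cross-check written straight from that definition (`rmse_plain` is an illustrative name):

```python
import math

def rmse_plain(y_true, y_pred):
    # root mean squared error, written out from the definition
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

y_true = [1.0, 2.0, 3.0]
y_pred = [1.0, 2.0, 5.0]
# residuals are (0, 0, 2), so MSE = 4/3 and RMSE = sqrt(4/3)
print(rmse_plain(y_true, y_pred))  # 1.1547005383792515
```

Running both this and `rmse` on the same vectors is a quick sanity check that the scorer wired into `make_scorer` computes what the name promises.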
0fc916e41ac3c141599cbd5ee161fb2929bcdb19 | 316 | py | Python | 17. Decorators - Lab/class_as_decorator_demo.py | elenaborisova/Python-OOP | 584882c08f84045b12322917f0716c7c7bd9befc | [
"MIT"
] | 1 | 2021-03-27T16:56:30.000Z | 2021-03-27T16:56:30.000Z | 17. Decorators - Lab/class_as_decorator_demo.py | elenaborisova/Python-OOP | 584882c08f84045b12322917f0716c7c7bd9befc | [
"MIT"
] | null | null | null | 17. Decorators - Lab/class_as_decorator_demo.py | elenaborisova/Python-OOP | 584882c08f84045b12322917f0716c7c7bd9befc | [
"MIT"
] | 1 | 2021-03-15T14:50:39.000Z | 2021-03-15T14:50:39.000Z | from time import sleep
class delay_decorator:
def __init__(self, func):
self.func = func
def __call__(self, *args, **kwargs):
sleep(2)
result = self.func(*args, **kwargs)
return result
@delay_decorator
def say_hi(name):
return f"Hi, {name}!"
print(say_hi("Elena"))
| 15.8 | 43 | 0.613924 | 42 | 316 | 4.333333 | 0.547619 | 0.131868 | 0.186813 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004255 | 0.256329 | 316 | 19 | 44 | 16.631579 | 0.770213 | 0 | 0 | 0 | 0 | 0 | 0.050633 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.083333 | 0.083333 | 0.583333 | 0.083333 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
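`delay_decorator` above hard-codes a two-second sleep. When the decorator class needs a parameter, the roles shift: `__init__` receives the argument and `__call__` receives the function, returning a wrapper. A sketch of that variant (the `delay` class is illustrative; zero delay is used so the example runs instantly):

```python
from time import sleep

class delay:
    """Class-based decorator that takes the delay in seconds as a parameter."""
    def __init__(self, seconds):
        # with arguments, __init__ gets the parameter, not the function
        self.seconds = seconds

    def __call__(self, func):
        # __call__ receives the decorated function and returns the wrapper
        def wrapper(*args, **kwargs):
            sleep(self.seconds)
            return func(*args, **kwargs)
        return wrapper

@delay(0)  # zero delay keeps the example fast
def say_hi(name):
    return f"Hi, {name}!"

print(say_hi("Elena"))  # Hi, Elena!
```

Compare the two: `@delay_decorator` calls `delay_decorator(say_hi)` directly, whereas `@delay(0)` first builds a `delay` instance and then calls it with `say_hi`.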
0fca9a724179c83c04a94a27f104ee5ada465fc6 | 10,845 | py | Python | models/unet_encoder.py | milesgray/CALAE | a2ab2f7d9ee17cc6c24ff6ac370b0373537079ac | [
"Apache-2.0"
] | null | null | null | models/unet_encoder.py | milesgray/CALAE | a2ab2f7d9ee17cc6c24ff6ac370b0373537079ac | [
"Apache-2.0"
] | null | null | null | models/unet_encoder.py | milesgray/CALAE | a2ab2f7d9ee17cc6c24ff6ac370b0373537079ac | [
"Apache-2.0"
] | null | null | null | import math
from typing import Dict
import torch
from torch import nn
from torch.nn import functional as F
from torchvision.models.resnet import BasicBlock
from ..utils.latent_projecting import Latents, CodeLatents
class UNetLikeEncoder(nn.Module):
def __init__(self, image_size: int, latent_size: int, num_input_channels: int, size_channel_map: dict, *, target_size: int = 4, stylegan_variant: int = 2):
super().__init__()
self.image_size = image_size
self.latent_size = latent_size
self.stylegan_variant = stylegan_variant
self.size_channel_map = size_channel_map
self.log_input_size = int(math.log(image_size, 2))
self.log_target_size = int(math.log(target_size, 2))
assert image_size > target_size, "Input size must be larger than target size"
assert 2 ** self.log_input_size == image_size, "Input size must be a power of 2"
assert 2 ** self.log_target_size == target_size, "Target size must be a power of 2"
self.start_block = BasicBlock(
num_input_channels,
size_channel_map[image_size],
downsample=nn.Sequential(
nn.Conv2d(num_input_channels, size_channel_map[image_size], kernel_size=1, stride=1),
nn.BatchNorm2d(size_channel_map[image_size])
)
)
self.intermediate_block = BasicBlock(
size_channel_map[image_size],
size_channel_map[image_size],
)
self.resnet_blocks = [
BasicBlock(
in_planes := size_channel_map[2 ** current_size],
out_planes := size_channel_map[2 ** (current_size - 1)],
stride=2,
downsample=nn.Sequential(
nn.Conv2d(in_planes, out_planes, kernel_size=1, stride=2),
nn.BatchNorm2d(out_planes)
)
)
for current_size in range(self.log_input_size, self.log_target_size, -1)
]
self.intermediate_resnet_blocks = [
BasicBlock(
in_planes := size_channel_map[2 ** current_size],
in_planes,
)
for current_size in range(self.log_input_size, self.log_target_size - 1, -1)
]
self.resnet_blocks = nn.ModuleList([self.start_block] + self.resnet_blocks)
self.intermediate_resnet_blocks = nn.ModuleList(self.intermediate_resnet_blocks)
num_latents = (self.log_input_size - self.log_target_size) * 2 + 2
        assert sum(map(len, [self.resnet_blocks, self.intermediate_resnet_blocks])) == num_latents, "The sum of all resnet blocks must be equal to the number of required latents"
        self.build_projecting_layers(self.log_input_size, self.log_target_size, size_channel_map)

    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        raise NotImplementedError

    def get_to_x_convs(self, input_size: int, target_size: int, target_channels: int, size_channel_map: Dict[int, int]) -> nn.ModuleList:
        conv_layer = [
            nn.Conv2d(size_channel_map[2 ** current_size], target_channels, kernel_size=1, stride=1)
            for current_size in range(input_size, target_size - 1, -1)
        ]
        return nn.ModuleList(conv_layer)

    def forward(self, x: torch.Tensor) -> Latents:
        raise NotImplementedError


class WPlusEncoder(UNetLikeEncoder):
    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        self.to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)
        self.intermediate_to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)
        self.to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)
        if self.stylegan_variant == 2:
            self.intermediate_to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)

    def forward(self, x: torch.Tensor) -> Latents:
        latent_codes = []
        noise_codes = []
        h = x
        for i in range(len(self.resnet_blocks)):
            h = self.resnet_blocks[i](h)
            latent_codes.append(self.to_latent[i](F.adaptive_avg_pool2d(h, (1, 1))))
            noise_codes.append(self.to_noise[i](h))
            h = self.intermediate_resnet_blocks[i](h)
            latent_codes.append(self.intermediate_to_latent[i](F.adaptive_avg_pool2d(h, (1, 1))))
            if self.stylegan_variant == 2 and i < len(self.resnet_blocks) - 1:
                noise_codes.append(self.intermediate_to_noise[i](h))
        latent_codes.reverse()
        latent_codes = torch.stack(latent_codes, dim=1)
        latent_codes = latent_codes.squeeze(3).squeeze(3)
        noise_codes.reverse()
        return Latents(latent_codes, noise_codes)


class WPlusResnetNoiseEncoder(WPlusEncoder):
    def get_noise_resblocks(self, input_size: int, target_size: int, size_channel_map: Dict[int, int]) -> nn.ModuleList:
        resblocks = [
            BasicBlock(
                in_planes := size_channel_map[2 ** current_size],
                1,
                downsample=nn.Sequential(
                    nn.Conv2d(in_planes, 1, kernel_size=1, stride=1)
                )
            )
            for current_size in range(input_size, target_size - 1, -1)
        ]
        return nn.ModuleList(resblocks)

    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        self.to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)
        self.intermediate_to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)
        self.to_noise = self.get_noise_resblocks(log_input_size, log_target_size, size_channel_map)
        if self.stylegan_variant == 2:
            self.intermediate_to_noise = self.get_noise_resblocks(log_input_size, log_target_size, size_channel_map)


class WEncoder(UNetLikeEncoder):
    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        self.to_latent = nn.Conv2d(self.latent_size, self.latent_size, kernel_size=1, stride=1)
        self.to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)
        if self.stylegan_variant == 2:
            self.intermediate_to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)

    def forward(self, x: torch.Tensor) -> Latents:
        latent_codes = []
        noise_codes = []
        h = x
        for i in range(len(self.resnet_blocks)):
            h = self.resnet_blocks[i](h)
            noise_codes.append(self.to_noise[i](h))
            h = self.intermediate_resnet_blocks[i](h)
            if self.stylegan_variant == 2 and i < len(self.resnet_blocks) - 1:
                noise_codes.append(self.intermediate_to_noise[i](h))
        latent_codes.append(self.to_latent(F.adaptive_avg_pool2d(h, (1, 1))))
        latent_codes.reverse()
        latent_codes = latent_codes[0].squeeze(2).squeeze(2)
        noise_codes.reverse()
        return Latents(latent_codes, noise_codes)


class WWPlusEncoder(WPlusEncoder):
    def forward(self, x: torch.Tensor) -> Latents:
        resulting_latents = super().forward(x)
        latent_code = resulting_latents.latent.sum(dim=1)
        return Latents(latent_code, resulting_latents.noise)


class WCodeEncoder(WEncoder):
    def __init__(self, code_dim: int, /, *args, **kwargs):
        self.code_dim = code_dim
        super().__init__(*args, **kwargs)

    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        super().build_projecting_layers(log_input_size, log_target_size, size_channel_map)
        self.to_code = nn.Conv2d(self.latent_size, self.code_dim, kernel_size=1, stride=1)

    def forward(self, x: torch.Tensor) -> CodeLatents:
        noise_codes = []
        h = x
        for i in range(len(self.resnet_blocks)):
            h = self.resnet_blocks[i](h)
            noise_codes.append(self.to_noise[i](h))
            h = self.intermediate_resnet_blocks[i](h)
            if self.stylegan_variant == 2 and i < len(self.resnet_blocks) - 1:
                noise_codes.append(self.intermediate_to_noise[i](h))
        h = F.adaptive_avg_pool2d(h, (1, 1))
        latent_code = self.to_latent(h)
        latent_code = latent_code.squeeze(2).squeeze(2)
        info_code = self.to_code(h)
        info_code = info_code.squeeze(2).squeeze(2)
        noise_codes.reverse()
        return CodeLatents(latent_code, noise_codes, info_code)


class WPlusNoNoiseEncoder(UNetLikeEncoder):
    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        self.to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)
        self.intermediate_to_latent = self.get_to_x_convs(log_input_size, log_target_size, self.latent_size, size_channel_map)

    def forward(self, x: torch.Tensor) -> Latents:
        latent_codes = []
        h = x
        for i in range(len(self.resnet_blocks)):
            h = self.resnet_blocks[i](h)
            latent_codes.append(self.to_latent[i](F.adaptive_avg_pool2d(h, (1, 1))))
            h = self.intermediate_resnet_blocks[i](h)
            latent_codes.append(self.intermediate_to_latent[i](F.adaptive_avg_pool2d(h, (1, 1))))
        latent_codes.reverse()
        latent_codes = torch.stack(latent_codes, dim=1)
        latent_codes = latent_codes.squeeze(3).squeeze(3)
        return Latents(latent_codes, None)


class WNoNoiseEncoder(WPlusNoNoiseEncoder):
    def forward(self, x: torch.Tensor) -> Latents:
        resulting_latents = super().forward(x)
        latent_code = resulting_latents.latent.sum(dim=1)
        return Latents(latent_code, resulting_latents.noise)


class NoiseEncoder(UNetLikeEncoder):
    def build_projecting_layers(self, log_input_size, log_target_size, size_channel_map):
        self.to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)
        if self.stylegan_variant == 2:
            self.intermediate_to_noise = self.get_to_x_convs(log_input_size, log_target_size, 1, size_channel_map)

    def forward(self, x: torch.Tensor) -> Latents:
        noise_codes = []
        h = x
        for i in range(len(self.resnet_blocks)):
            h = self.resnet_blocks[i](h)
            noise_codes.append(self.to_noise[i](h))
            h = self.intermediate_resnet_blocks[i](h)
            if self.stylegan_variant == 2 and i < len(self.resnet_blocks) - 1:
                noise_codes.append(self.intermediate_to_noise[i](h))
        noise_codes.reverse()
        return Latents(None, noise_codes)
| 40.924528 | 178 | 0.670908 | 1,491 | 10,845 | 4.537223 | 0.078471 | 0.061789 | 0.07864 | 0.04878 | 0.793496 | 0.759497 | 0.715595 | 0.669623 | 0.628529 | 0.610495 | 0 | 0.011633 | 0.231166 | 10,845 | 264 | 179 | 41.079545 | 0.799712 | 0 | 0 | 0.556701 | 0 | 0 | 0.01669 | 0 | 0 | 0 | 0 | 0 | 0.020619 | 1 | 0.097938 | false | 0 | 0.036082 | 0 | 0.226804 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ba1cf8189ced67f1716c32cbf3a83be4b68add52 | 153 | py | Python | src/click_validators.py | AbdulrhmnGhanem/notion-tools | 727372e9be2d956fe1bbd138e5bfada3516f31d0 | [
"MIT"
] | 2 | 2022-01-01T13:02:58.000Z | 2022-01-01T14:37:54.000Z | src/click_validators.py | AbdulrhmnGhanem/notion-tools | 727372e9be2d956fe1bbd138e5bfada3516f31d0 | [
"MIT"
] | 8 | 2022-01-30T22:44:51.000Z | 2022-02-11T07:16:55.000Z | src/click_validators.py | AbdulrhmnGhanem/notion-tools | 727372e9be2d956fe1bbd138e5bfada3516f31d0 | [
"MIT"
] | null | null | null | import click
def positive_num(ctx, param, value):
    if value < 1:
        raise click.BadParameter("Should be a positive integer!")
    return value
| 19.125 | 65 | 0.679739 | 21 | 153 | 4.904762 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008547 | 0.235294 | 153 | 7 | 66 | 21.857143 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0.189542 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
ba2bc37e87181bef1284aee61adac79eb25ecc46 | 670 | py | Python | venv/Lib/site-packages/werkzeug/wrappers/__init__.py | jodieritchie/MLHPortfolio | 66d23165c1c3277a6b7320af1cfc353a07385e7a | [
"MIT"
] | 1 | 2021-06-18T16:32:10.000Z | 2021-06-18T16:32:10.000Z | venv/Lib/site-packages/werkzeug/wrappers/__init__.py | jodieritchie/MLHPortfolio | 66d23165c1c3277a6b7320af1cfc353a07385e7a | [
"MIT"
] | null | null | null | venv/Lib/site-packages/werkzeug/wrappers/__init__.py | jodieritchie/MLHPortfolio | 66d23165c1c3277a6b7320af1cfc353a07385e7a | [
"MIT"
] | 1 | 2021-06-20T19:28:37.000Z | 2021-06-20T19:28:37.000Z | from .accept import AcceptMixin
from .auth import AuthorizationMixin
from .auth import WWWAuthenticateMixin
from .base_request import BaseRequest
from .base_response import BaseResponse
from .common_descriptors import CommonRequestDescriptorsMixin
from .common_descriptors import CommonResponseDescriptorsMixin
from .etag import ETagRequestMixin
from .etag import ETagResponseMixin
from .request import PlainRequest
from .request import Request as Request
from .request import StreamOnlyMixin
from .response import Response as Response
from .response import ResponseStream
from .response import ResponseStreamMixin
from .user_agent import UserAgentMixin
| 39.411765 | 63 | 0.850746 | 73 | 670 | 7.739726 | 0.369863 | 0.092035 | 0.090265 | 0.095575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125373 | 670 | 16 | 64 | 41.875 | 0.964164 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
e84157bd41714720c0adb89b81058cbce6d14845 | 1,736 | py | Python | tests/flask/test_blueprint.py | isp1r0/vulnpy | 4709b77bdcd858931d4238490538c77eaa2cc74e | [
"MIT"
] | null | null | null | tests/flask/test_blueprint.py | isp1r0/vulnpy | 4709b77bdcd858931d4238490538c77eaa2cc74e | [
"MIT"
] | null | null | null | tests/flask/test_blueprint.py | isp1r0/vulnpy | 4709b77bdcd858931d4238490538c77eaa2cc74e | [
"MIT"
] | null | null | null | import pytest
from flask import Flask
from vulnpy.flask.blueprint import vulnerable_blueprint
@pytest.fixture
def client():
    app = Flask(__name__)
    app.register_blueprint(vulnerable_blueprint)
    with app.test_client() as client:
        yield client


def test_home(client):
    response = client.get("/vulnpy/")
    assert response.status_code == 200


def test_cmdi_os_system_get(client):
    response = client.get("/vulnpy/cmdi/os-system/?user_input=echo%20attack")
    assert int(response.get_data()) == 0


def test_cmdi_os_system_post(client):
    response = client.post(
        "/vulnpy/cmdi/os-system/", data={"user_input": "echo attack"}
    )
    assert int(response.get_data()) == 0


def test_cmdi_os_system_bad_command(client):
    response = client.get("/vulnpy/cmdi/os-system/?user_input=foo")
    assert int(response.get_data()) != 0


def test_cmdi_os_system_invalid_input(client):
    response = client.get("/vulnpy/cmdi/os-system/?ignored_param=bad")
    assert response.status_code == 400


def test_cmdi_subprocess_popen_get(client):
    response = client.get("/vulnpy/cmdi/subprocess-popen/?user_input=echo%20attack")
    assert response.get_data() == b"attack\n"


def test_cmdi_subprocess_popen_post(client):
    response = client.post(
        "/vulnpy/cmdi/subprocess-popen/", data={"user_input": "echo attack"}
    )
    assert response.get_data() == b"attack\n"


def test_cmdi_subprocess_popen_bad_command(client):
    response = client.get("/vulnpy/cmdi/subprocess-popen/?user_input=foo")
    assert response.status_code == 400


def test_cmdi_subprocess_popen_invalid_input(client):
    response = client.get("/vulnpy/cmdi/os-system/?ignored_param=bad")
    assert response.status_code == 400
| 28 | 84 | 0.728111 | 239 | 1,736 | 5.029289 | 0.196653 | 0.052413 | 0.14975 | 0.133943 | 0.78619 | 0.72629 | 0.68802 | 0.6198 | 0.603161 | 0.603161 | 0 | 0.012795 | 0.144585 | 1,736 | 61 | 85 | 28.459016 | 0.796633 | 0 | 0 | 0.275 | 0 | 0 | 0.222926 | 0.184908 | 0 | 0 | 0 | 0 | 0.225 | 1 | 0.25 | false | 0 | 0.075 | 0 | 0.325 | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e850b0f03e8c9ef53d3100be89ab6c1da84f5385 | 229 | py | Python | toshell/utils.py | mtg-to/pairings-manager | d84a57c2446e1790097e215243348e1b086d215f | [
"MIT"
] | 1 | 2021-03-15T13:44:10.000Z | 2021-03-15T13:44:10.000Z | toshell/utils.py | mtg-to/pairings-manager | d84a57c2446e1790097e215243348e1b086d215f | [
"MIT"
] | null | null | null | toshell/utils.py | mtg-to/pairings-manager | d84a57c2446e1790097e215243348e1b086d215f | [
"MIT"
] | 2 | 2021-03-16T14:51:10.000Z | 2021-03-18T20:28:16.000Z | class ExitMixin():
    def do_exit(self, _):
        '''
        Exit the shell.
        '''
        print(self._exit_msg)
        return True

    def do_EOF(self, params):
        print()
        return self.do_exit(params)
| 15.266667 | 35 | 0.50655 | 26 | 229 | 4.230769 | 0.538462 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.379913 | 229 | 14 | 36 | 16.357143 | 0.774648 | 0.065502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.714286 | 0.285714 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
e86b2c9e6be9cc2f55dc945d378162f30a9e5009 | 389 | py | Python | paste/paste/models/__init__.py | hbjydev/pastr | dd902e93261a529468b4429f792fcddfc8de1c1c | [
"BSD-3-Clause"
] | null | null | null | paste/paste/models/__init__.py | hbjydev/pastr | dd902e93261a529468b4429f792fcddfc8de1c1c | [
"BSD-3-Clause"
] | null | null | null | paste/paste/models/__init__.py | hbjydev/pastr | dd902e93261a529468b4429f792fcddfc8de1c1c | [
"BSD-3-Clause"
] | null | null | null | # Copyright 2021 Hayden Young. All rights reserved.
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file.
from typing import Any, Optional
from pydantic import BaseModel
from paste.models.paste import Pastes, Paste_Pydantic, PasteIn_Pydantic
class APIResponse(BaseModel):
    message: str
    data: Optional[Any]
    error: Optional[str]
| 27.785714 | 71 | 0.771208 | 57 | 389 | 5.22807 | 0.754386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012422 | 0.172237 | 389 | 13 | 72 | 29.923077 | 0.913043 | 0.377892 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
e8b1194597f7e23e0f612746f9affecf6680cda3 | 271 | py | Python | test_suite/testenv.py | austince/starlark | 44b6bb5efdaa148c185c6ca773518e7aff57c072 | [
"Apache-2.0"
] | 1 | 2021-12-12T07:16:40.000Z | 2021-12-12T07:16:40.000Z | test_suite/testenv.py | austince/starlark | 44b6bb5efdaa148c185c6ca773518e7aff57c072 | [
"Apache-2.0"
] | null | null | null | test_suite/testenv.py | austince/starlark | 44b6bb5efdaa148c185c6ca773518e7aff57c072 | [
"Apache-2.0"
] | null | null | null | STARLARK_BINARY_PATH = {
    "java": "external/io_bazel/src/main/java/net/starlark/java/cmd/Starlark",
    "go": "external/net_starlark_go/cmd/starlark/*/starlark",
    "rust": "external/starlark-rust/target/debug/starlark-repl",
}
STARLARK_TESTDATA_PATH = "test_suite/"
| 38.714286 | 77 | 0.738007 | 36 | 271 | 5.333333 | 0.527778 | 0.114583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092251 | 271 | 6 | 78 | 45.166667 | 0.780488 | 0 | 0 | 0 | 0 | 0 | 0.664207 | 0.586716 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e8ba9e9287d41e18eaa0e44c4176eae5bff85437 | 952 | py | Python | classy_vision/heads/identity_head.py | jdsgomes/ClassyVision-1 | 309d4f12431c6b4d8540010a781dc2aa25fe88e7 | [
"MIT"
] | null | null | null | classy_vision/heads/identity_head.py | jdsgomes/ClassyVision-1 | 309d4f12431c6b4d8540010a781dc2aa25fe88e7 | [
"MIT"
] | null | null | null | classy_vision/heads/identity_head.py | jdsgomes/ClassyVision-1 | 309d4f12431c6b4d8540010a781dc2aa25fe88e7 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
from typing import Any, Dict
from classy_vision.heads import ClassyHead, register_head
@register_head("identity")
class IdentityHead(ClassyHead):
    """This head returns the input without changing it. This can
    be attached to a model, if the output of the model is the
    desired result.
    """

    def forward(self, x):
        return x

    @classmethod
    def from_config(cls, config: Dict[str, Any]) -> "IdentityHead":
        """Instantiates an IdentityHead from a configuration.

        Args:
            config: A configuration for an IdentityHead.
                See :func:`__init__` for parameters expected in the config.

        Returns:
            An IdentityHead instance.
        """
        return cls(config["unique_id"])
| 28 | 75 | 0.670168 | 124 | 952 | 5.072581 | 0.620968 | 0.023847 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001403 | 0.25105 | 952 | 33 | 76 | 28.848485 | 0.880785 | 0.563025 | 0 | 0 | 0 | 0 | 0.085546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0.111111 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
2cdbb7018fa9479026b032b640cc01c086a0256e | 124 | py | Python | FirstStepsInPython/Fundamentals/Exercice/Data Types and Variables/05. Print Part of the ASCII Table.py | Pittor052/SoftUni-Studies | 1ee6341082f6ccfa45b3e82824c37722bcf2fb31 | [
"MIT"
] | null | null | null | FirstStepsInPython/Fundamentals/Exercice/Data Types and Variables/05. Print Part of the ASCII Table.py | Pittor052/SoftUni-Studies | 1ee6341082f6ccfa45b3e82824c37722bcf2fb31 | [
"MIT"
] | null | null | null | FirstStepsInPython/Fundamentals/Exercice/Data Types and Variables/05. Print Part of the ASCII Table.py | Pittor052/SoftUni-Studies | 1ee6341082f6ccfa45b3e82824c37722bcf2fb31 | [
"MIT"
] | 1 | 2021-10-07T18:30:42.000Z | 2021-10-07T18:30:42.000Z | start = int(input())
stop = int(input())
result = ""
for number in range(start, stop + 1):
    print(chr(number), end=" ")
| 17.714286 | 37 | 0.596774 | 18 | 124 | 4.111111 | 0.722222 | 0.216216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01 | 0.193548 | 124 | 6 | 38 | 20.666667 | 0.73 | 0 | 0 | 0 | 0 | 0 | 0.00813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
2ce5ba1536918857f85fbf182688ef7f79b58f44 | 322 | py | Python | scripts/template.py | Srujan-DiagonalMatrix/flaskTutorial | ee670c3d332b9a81683c1f3d7cba033361bf3b60 | [
"MIT"
] | null | null | null | scripts/template.py | Srujan-DiagonalMatrix/flaskTutorial | ee670c3d332b9a81683c1f3d7cba033361bf3b60 | [
"MIT"
] | null | null | null | scripts/template.py | Srujan-DiagonalMatrix/flaskTutorial | ee670c3d332b9a81683c1f3d7cba033361bf3b60 | [
"MIT"
] | null | null | null | from flask import Flask, render_template
app = Flask(__name__)
@app.route('/')
def index():
    return render_template('template.html')


@app.route('/user/<user>/<int:depo>')
def hello_name(user, depo):
    return render_template('template.html', name=user, deposit=depo)


if __name__ == '__main__':
    app.run(debug=True)
| 23 | 68 | 0.71118 | 45 | 322 | 4.733333 | 0.488889 | 0.197183 | 0.187793 | 0.262911 | 0.300469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118012 | 322 | 13 | 69 | 24.769231 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.180124 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0.2 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
2ce61cfea424ca333b3b2df2081de01237ed6edd | 174 | py | Python | ExerciciosPython/Desafios/desafio057.py | xaviergthiago/PycharmProjects | 191e645223036fa800b32102f37e182eabe0e35c | [
"MIT"
] | null | null | null | ExerciciosPython/Desafios/desafio057.py | xaviergthiago/PycharmProjects | 191e645223036fa800b32102f37e182eabe0e35c | [
"MIT"
] | null | null | null | ExerciciosPython/Desafios/desafio057.py | xaviergthiago/PycharmProjects | 191e645223036fa800b32102f37e182eabe0e35c | [
"MIT"
] | null | null | null | sexo=str
while (sexo != 'F') and (sexo != 'M'):
    sexo = (input("Digite o sexo: "))
    if (sexo != 'F') and (sexo != 'M'):
        print('Opção inválida, tente novamente')
2ce67ae751f609a1499c9e61021a36d8558e53ad | 1,153 | py | Python | aiotdlib/api/types/message_calendar.py | mostafa-arshadi/aiotdlib | 59f430a65dfb424fc69d471a0d7bcd77ad7acf08 | [
"MIT"
] | 37 | 2021-05-04T10:41:41.000Z | 2022-03-30T13:48:05.000Z | aiotdlib/api/types/message_calendar.py | mostafa-arshadi/aiotdlib | 59f430a65dfb424fc69d471a0d7bcd77ad7acf08 | [
"MIT"
] | 13 | 2021-07-17T19:54:51.000Z | 2022-02-26T06:50:00.000Z | aiotdlib/api/types/message_calendar.py | mostafa-arshadi/aiotdlib | 59f430a65dfb424fc69d471a0d7bcd77ad7acf08 | [
"MIT"
] | 7 | 2021-09-22T21:27:11.000Z | 2022-02-20T02:33:19.000Z | # =============================================================================== #
# #
# This file has been generated automatically!! Do not change this manually! #
# #
# =============================================================================== #
from __future__ import annotations
from pydantic import Field
from .message_calendar_day import MessageCalendarDay
from ..base_object import BaseObject
class MessageCalendar(BaseObject):
    """
    Contains information about found messages, split by days according to the option "utc_time_offset"

    :param total_count: Total number of found messages
    :type total_count: :class:`int`

    :param days: Information about messages sent
    :type days: :class:`list[MessageCalendarDay]`
    """

    ID: str = Field("messageCalendar", alias="@type")
    total_count: int
    days: list[MessageCalendarDay]

    @staticmethod
    def read(q: dict) -> MessageCalendar:
        return MessageCalendar.construct(**q)
| 34.939394 | 102 | 0.509107 | 94 | 1,153 | 6.117021 | 0.62766 | 0.052174 | 0.048696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.290546 | 1,153 | 32 | 103 | 36.03125 | 0.702934 | 0.588899 | 0 | 0 | 1 | 0 | 0.047733 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0.090909 | 0.909091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
2cebae38779d96dd4a89862f8e8be1d16cb643da | 1,422 | py | Python | flaskd3/domains/repositoryif/object_repository.py | hohihohi/flask_ddd_sample | 34bd1af4614577be0a6b410250f2d58d2c6f3b93 | [
"MIT"
] | null | null | null | flaskd3/domains/repositoryif/object_repository.py | hohihohi/flask_ddd_sample | 34bd1af4614577be0a6b410250f2d58d2c6f3b93 | [
"MIT"
] | null | null | null | flaskd3/domains/repositoryif/object_repository.py | hohihohi/flask_ddd_sample | 34bd1af4614577be0a6b410250f2d58d2c6f3b93 | [
"MIT"
] | null | null | null | from abc import ABCMeta, abstractmethod
# NOTE: domain services are concerned with domain and business logic; they call these repository methods and the repositories hold that responsibility.
class ObjectRepositoryIF(metaclass=ABCMeta):
    # find object source by id from database. This method should return object or None
    @abstractmethod
    def find_by_id(self, id):
        pass

    # find object source by name from database. This method should return object or None
    @abstractmethod
    def find_by_name(self, name):
        pass

    # find object sources by user id from database. This method should return object or None
    @abstractmethod
    def find_by_user_id(self, user_id):
        pass

    # save object source to database. This method should return saved object or error
    @abstractmethod
    def save(self, ob):
        pass

    # save bucket source to database. This method should return saved bucket or error
    @abstractmethod
    def save_bucket(self, bucket):
        pass

    # delete object source from database. This method should return deleted object or error
    @abstractmethod
    def delete(self, ob):
        pass

    # delete bucket source from database. This method should return deleted bucket or error
    @abstractmethod
    def delete_bucket(self, bucket):
        pass

    # update object source to database. This method should return updated object or error
    @abstractmethod
    def update(self, ob):
        pass
| 31.6 | 99 | 0.710267 | 189 | 1,422 | 5.291005 | 0.259259 | 0.096 | 0.144 | 0.192 | 0.603 | 0.433 | 0.393 | 0.393 | 0.211 | 0.211 | 0 | 0 | 0.244023 | 1,422 | 44 | 100 | 32.318182 | 0.930233 | 0.535162 | 0 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0.307692 | 0.038462 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
2cf2747ff5797a58fe94bbd513ce919c254edbb4 | 180 | py | Python | tests/test_filing.py | kfarr3/edgar3 | 0207a2946550b55ccf280ab14cb52e38a315d107 | [
"MIT"
] | 1 | 2019-04-17T19:31:08.000Z | 2019-04-17T19:31:08.000Z | tests/test_filing.py | kfarr3/edgar3 | 0207a2946550b55ccf280ab14cb52e38a315d107 | [
"MIT"
] | null | null | null | tests/test_filing.py | kfarr3/edgar3 | 0207a2946550b55ccf280ab14cb52e38a315d107 | [
"MIT"
] | null | null | null | from edgar3 import filing
def test_filing():
    with open("tests/raw_filing.txt", "r") as fin:
        fil = filing.Filing(fin.read())
        assert isinstance(fil, filing.Filing)
| 22.5 | 50 | 0.672222 | 26 | 180 | 4.576923 | 0.692308 | 0.151261 | 0.252101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006897 | 0.194444 | 180 | 7 | 51 | 25.714286 | 0.813793 | 0 | 0 | 0 | 0 | 0 | 0.116667 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |