hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
eac56c80b826119870d8bb0f4a988cbd07be6358 | 2,597 | py | Python | pysnmp/CYCLADES-ACS-ADM-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CYCLADES-ACS-ADM-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CYCLADES-ACS-ADM-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CYCLADES-ACS-ADM-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CYCLADES-ACS-ADM-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:18:44 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, SingleValueConstraint, ConstraintsUnion, ValueSizeConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "SingleValueConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ValueRangeConstraint")
cyACSMgmt, = mibBuilder.importSymbols("CYCLADES-ACS-MIB", "cyACSMgmt")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Gauge32, IpAddress, TimeTicks, MibScalar, MibTable, MibTableRow, MibTableColumn, iso, ModuleIdentity, Integer32, Counter32, ObjectIdentity, NotificationType, Counter64, Unsigned32, MibIdentifier, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "Gauge32", "IpAddress", "TimeTicks", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "iso", "ModuleIdentity", "Integer32", "Counter32", "ObjectIdentity", "NotificationType", "Counter64", "Unsigned32", "MibIdentifier", "Bits")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
cyACSAdm = ModuleIdentity((1, 3, 6, 1, 4, 1, 2925, 4, 4))
cyACSAdm.setRevisions(('2005-08-29 00:00', '2002-09-20 00:00',))
if mibBuilder.loadTexts: cyACSAdm.setLastUpdated('200508290000Z')
if mibBuilder.loadTexts: cyACSAdm.setOrganization('Cyclades Corporation')
cyACSSave = MibScalar((1, 3, 6, 1, 4, 1, 2925, 4, 4, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("nosave", 0), ("save", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cyACSSave.setStatus('current')
cyACSSerialHUP = MibScalar((1, 3, 6, 1, 4, 1, 2925, 4, 4, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("norestartportslave", 0), ("restartportslave", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cyACSSerialHUP.setStatus('current')
mibBuilder.exportSymbols("CYCLADES-ACS-ADM-MIB", cyACSSerialHUP=cyACSSerialHUP, PYSNMP_MODULE_ID=cyACSAdm, cyACSAdm=cyACSAdm, cyACSSave=cyACSSave)
| 108.208333 | 477 | 0.778976 | 277 | 2,597 | 7.296029 | 0.429603 | 0.079664 | 0.041564 | 0.025235 | 0.425532 | 0.327561 | 0.285007 | 0.285007 | 0.285007 | 0.27907 | 0 | 0.064315 | 0.072006 | 2,597 | 23 | 478 | 112.913043 | 0.774274 | 0.12861 | 0 | 0 | 0 | 0 | 0.277728 | 0.019521 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4375 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
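Several of the trailing numeric columns in each row (`avg_line_length`, `max_line_length`, `alphanum_fraction`, …) are derived from the `content` field. A minimal sketch of how three of them could be recomputed from the raw text — the dataset's exact definitions (e.g. how a trailing newline is counted) are assumptions here:

```python
# Sketch: recompute three per-file quality signals from a file's raw text.
# The precise definitions used when the dataset was built are assumptions.
def quality_signals(content: str) -> dict:
    lines = content.splitlines()
    line_lengths = [len(line) for line in lines]
    alnum = sum(ch.isalnum() for ch in content)  # alphanumeric character count
    return {
        "avg_line_length": sum(line_lengths) / len(lines) if lines else 0.0,
        "max_line_length": max(line_lengths, default=0),
        "alphanum_fraction": alnum / len(content) if content else 0.0,
    }
```

For the first row above this would be applied to the 2,597-byte MIB module, yielding values in the neighborhood of the stored `108.208333 | 477 | 0.778976` columns.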
ead84eeefd70332698bb5734f57cefaab436a5bb | 524 | py | Python | examples/geocalc/common.py | EliAndrewC/protlib | 2c9302d4ecc941478bae886ee2d36e3055757c35 | [
"BSD-3-Clause"
] | 14 | 2017-02-21T10:12:40.000Z | 2022-03-30T18:06:34.000Z | examples/geocalc/common.py | EliAndrewC/protlib | 2c9302d4ecc941478bae886ee2d36e3055757c35 | [
"BSD-3-Clause"
] | 1 | 2018-08-27T14:33:26.000Z | 2018-08-27T14:33:26.000Z | examples/geocalc/common.py | EliAndrewC/protlib | 2c9302d4ecc941478bae886ee2d36e3055757c35 | [
"BSD-3-Clause"
] | null | null | null | import logging
logging.basicConfig(level = logging.INFO)
from protlib import *
SERVER_ADDR = ("127.0.0.1", 32123)
class Point(CStruct):
code = CShort(always = 1)
x = CFloat()
y = CFloat()
class Vector(CStruct):
code = CShort(always = 2)
p1 = Point.get_type()
p2 = Point.get_type()
class Rectangle(CStruct):
code = CShort(always = 4)
points = CArray(4, Point)
class PointGroup(CStruct):
code = CShort(always = 3)
count = CInt()
points = CArray("count", Point)
| 20.153846 | 41 | 0.620229 | 67 | 524 | 4.80597 | 0.537313 | 0.136646 | 0.21118 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04557 | 0.246183 | 524 | 25 | 42 | 20.96 | 0.76962 | 0 | 0 | 0 | 0 | 0 | 0.026718 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.894737 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
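The `Point` struct above is a `CShort` type code followed by two `CFloat` fields. A stdlib analogue of that layout can be sketched with `struct` — note that network byte order (`"!"`) and protlib's exact wire format are assumptions, not something the file states:

```python
import struct

# Hypothetical stdlib equivalent of the Point CStruct: short code + two
# 32-bit floats. Byte order is assumed to be network ("!"); protlib's
# actual serialization may differ.
def pack_point(x: float, y: float) -> bytes:
    return struct.pack("!hff", 1, x, y)  # code=1 matches CShort(always=1)

def unpack_point(data: bytes):
    code, x, y = struct.unpack("!hff", data)
    assert code == 1, "not a Point record"
    return x, y
```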
eaf6dc6458809d4cc738ff6ca99da2346d4b1527 | 16,550 | py | Python | networkapi/equipamento/resource/EquipamentoAcessoResource.py | vinicius-marinho/GloboNetworkAPI | 94651d3b4dd180769bc40ec966814f3427ccfb5b | [
"Apache-2.0"
] | 73 | 2015-04-13T17:56:11.000Z | 2022-03-24T06:13:07.000Z | networkapi/equipamento/resource/EquipamentoAcessoResource.py | leopoldomauricio/GloboNetworkAPI | 3b5b2e336d9eb53b2c113977bfe466b23a50aa29 | [
"Apache-2.0"
] | 99 | 2015-04-03T01:04:46.000Z | 2021-10-03T23:24:48.000Z | networkapi/equipamento/resource/EquipamentoAcessoResource.py | leopoldomauricio/GloboNetworkAPI | 3b5b2e336d9eb53b2c113977bfe466b23a50aa29 | [
"Apache-2.0"
] | 64 | 2015-08-05T21:26:29.000Z | 2022-03-22T01:06:28.000Z | # -*- coding: utf-8 -*-
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import with_statement
import logging
from networkapi.admin_permission import AdminPermission
from networkapi.auth import has_perm
from networkapi.distributedlock import distributedlock
from networkapi.distributedlock import LOCK_EQUIPMENT_ACCESS
from networkapi.equipamento.models import Equipamento
from networkapi.equipamento.models import EquipamentoAccessDuplicatedError
from networkapi.equipamento.models import EquipamentoAcesso
from networkapi.equipamento.models import EquipamentoError
from networkapi.equipamento.models import EquipamentoNotFoundError
from networkapi.exception import InvalidValueError
from networkapi.grupo.models import GrupoError
from networkapi.infrastructure.xml_utils import dumps_networkapi
from networkapi.infrastructure.xml_utils import loads
from networkapi.infrastructure.xml_utils import XMLError
from networkapi.rest import RestResource
from networkapi.tipoacesso.models import AccessTypeNotFoundError
from networkapi.tipoacesso.models import TipoAcesso
from networkapi.util import is_valid_int_greater_zero_param
from networkapi.util import is_valid_string_maxsize
from networkapi.util import is_valid_string_minsize
from networkapi.api_vrf.models import Vrf
class EquipamentoAcessoResource(RestResource):
"""Classe que trata as requisições de PUT,POST,GET e DELETE para a tabela equiptos_acesso."""
log = logging.getLogger('EquipamentoAcessoResource')
def handle_get(self, request, user, *args, **kwargs):
"""Trata as requisições GET para consulta de Informações de Acesso a Equipamentos.
Permite a consulta de Informações de Acesso a Equipamentos existentes.
URL: /equipamentoacesso/
"""
try:
if not has_perm(user, AdminPermission.EQUIPMENT_MANAGEMENT, AdminPermission.READ_OPERATION):
return self.not_authorized()
# Efetua a consulta de todos os tipos de acesso
results = EquipamentoAcesso.search(user.grupos.all())
if results.count() > 0:
# Monta lista com dados retornados
map_list = []
for item in results:
item_map = self.get_equipamento_acesso_map(item)
if item_map not in map_list:
map_list.append(item_map)
# Gera response (XML) com resultados
return self.response(dumps_networkapi({'equipamento_acesso': map_list}))
else:
# Gera response (XML) para resultado vazio
return self.response(dumps_networkapi({}))
except (EquipamentoError, GrupoError):
return self.response_error(1)
def handle_post(self, request, user, *args, **kwargs):
"""Trata as requisições de POST para criar Informações de Acesso a Equipamentos.
URL: /equipamentoacesso
"""
# Obtém dados do request e verifica acesso
try:
# Obtém os dados do xml do request
xml_map, attrs_map = loads(request.raw_post_data)
# Obtém o mapa correspondente ao root node do mapa do XML
# (networkapi)
networkapi_map = xml_map.get('networkapi')
if networkapi_map is None:
return self.response_error(3, u'Não existe valor para a tag networkapi do XML de requisição.')
# Verifica a existência do node "equipamento_acesso"
equipamento_acesso_map = networkapi_map.get('equipamento_acesso')
if equipamento_acesso_map is None:
return self.response_error(3, u'Não existe valor para a tag equipamento_acesso do XML de requisição.')
# Verifica a existência do valor "id_equipamento"
id_equipamento = equipamento_acesso_map.get('id_equipamento')
# Valid ID Equipment
if not is_valid_int_greater_zero_param(id_equipamento):
self.log.error(
u'The id_equipamento parameter is not a valid value: %s.', id_equipamento)
raise InvalidValueError(None, 'id_equipamento', id_equipamento)
try:
id_equipamento = int(id_equipamento)
except (TypeError, ValueError):
self.log.error(
u'Valor do id_equipamento inválido: %s.', id_equipamento)
return self.response_error(117, id_equipamento)
# Após obtenção do id_equipamento podemos verificar a permissão
if not has_perm(user,
AdminPermission.EQUIPMENT_MANAGEMENT,
AdminPermission.WRITE_OPERATION,
None,
id_equipamento,
AdminPermission.EQUIP_WRITE_OPERATION):
return self.not_authorized()
# Verifica a existência do valor "fqdn"
fqdn = equipamento_acesso_map.get('fqdn')
# Valid fqdn
if not is_valid_string_maxsize(fqdn, 100) or not is_valid_string_minsize(fqdn, 4):
self.log.error(u'Parameter fqdn is invalid. Value: %s', fqdn)
raise InvalidValueError(None, 'fqdn', fqdn)
# Verifica a existência do valor "user"
username = equipamento_acesso_map.get('user')
# Valid username
if not is_valid_string_maxsize(username, 20) or not is_valid_string_minsize(username, 3):
self.log.error(
u'Parameter username is invalid. Value: %s', username)
raise InvalidValueError(None, 'username', username)
# Verifica a existência do valor "pass"
password = equipamento_acesso_map.get('pass')
# Valid password
if not is_valid_string_maxsize(password, 150) or not is_valid_string_minsize(password, 3):
self.log.error(u'Parameter password is invalid.')
raise InvalidValueError(None, 'password', '****')
# Verifica a existência do valor "id_tipo_acesso"
id_tipo_acesso = equipamento_acesso_map.get('id_tipo_acesso')
# Valid ID Equipment
if not is_valid_int_greater_zero_param(id_tipo_acesso):
self.log.error(
u'The id_tipo_acesso parameter is not a valid value: %s.', id_tipo_acesso)
raise InvalidValueError(None, 'id_tipo_acesso', id_tipo_acesso)
try:
id_tipo_acesso = int(id_tipo_acesso)
except (TypeError, ValueError):
self.log.error(
u'Valor do id_tipo_acesso inválido: %s.', id_tipo_acesso)
return self.response_error(171, id_tipo_acesso)
# Obtém o valor de "enable_pass"
enable_pass = equipamento_acesso_map.get('enable_pass')
# Valid enable_pass
if not is_valid_string_maxsize(enable_pass, 150) or not is_valid_string_minsize(enable_pass, 3):
self.log.error(u'Parameter enable_pass is invalid.')
raise InvalidValueError(None, 'enable_pass', '****')
# Obtém o valor de "vrf"
vrf = equipamento_acesso_map.get('vrf')
vrf_obj = None
if vrf:
# Valid enable_pass
if not is_valid_int_greater_zero_param(vrf):
self.log.error(
u'The vrf parameter is not a valid value: %s.', vrf)
raise InvalidValueError(None, 'vrf', vrf)
vrf_obj = Vrf(int(vrf))
# Cria acesso ao equipamento conforme dados recebidos no XML
equipamento_acesso = EquipamentoAcesso(
equipamento=Equipamento(id=id_equipamento),
fqdn=fqdn,
user=username,
password=password,
tipo_acesso=TipoAcesso(id=id_tipo_acesso),
enable_pass=enable_pass,
vrf=vrf_obj
)
equipamento_acesso.create(user)
# Monta dict para response
networkapi_map = dict()
equipamento_acesso_map = dict()
equipamento_acesso_map['id'] = equipamento_acesso.id
networkapi_map['equipamento_acesso'] = equipamento_acesso_map
return self.response(dumps_networkapi(networkapi_map))
except InvalidValueError as e:
return self.response_error(269, e.param, e.value)
        except XMLError as x:
self.log.error(u'Erro ao ler o XML da requisição.')
return self.response_error(3, x)
except EquipamentoNotFoundError:
return self.response_error(117, id_equipamento)
except TipoAcesso.DoesNotExist:
return self.response_error(171, id_tipo_acesso)
except EquipamentoAccessDuplicatedError:
return self.response_error(242, id_equipamento, id_tipo_acesso)
except (EquipamentoError, GrupoError):
return self.response_error(1)
def handle_put(self, request, user, *args, **kwargs):
"""Trata uma requisição PUT para alterar informações de acesso a equipamentos.
URL: /equipamentoacesso/id_equipamento/id_tipo_acesso/
"""
# Obtém dados do request e verifica acesso
try:
# Obtém argumentos passados na URL
id_equipamento = kwargs.get('id_equipamento')
if id_equipamento is None:
return self.response_error(147)
id_tipo_acesso = kwargs.get('id_tipo_acesso')
if id_tipo_acesso is None:
return self.response_error(208)
# Após obtenção do id_equipamento podemos verificar a permissão
if not has_perm(user,
AdminPermission.EQUIPMENT_MANAGEMENT,
AdminPermission.WRITE_OPERATION,
None,
id_equipamento,
AdminPermission.EQUIP_WRITE_OPERATION):
return self.not_authorized()
# Obtém dados do XML
xml_map, attrs_map = loads(request.raw_post_data)
# Obtém o mapa correspondente ao root node do mapa do XML
# (networkapi)
networkapi_map = xml_map.get('networkapi')
if networkapi_map is None:
return self.response_error(3, u'Não existe valor para a tag networkapi do XML de requisição.')
# Verifica a existência do node "equipamento_acesso"
equipamento_acesso_map = networkapi_map.get('equipamento_acesso')
if equipamento_acesso_map is None:
return self.response_error(3, u'Não existe valor para a tag equipamento_acesso do XML de requisição.')
# Verifica a existência do valor "fqdn"
fqdn = equipamento_acesso_map.get('fqdn')
if fqdn is None:
return self.response_error(205)
# Verifica a existência do valor "user"
username = equipamento_acesso_map.get('user')
if username is None:
return self.response_error(206)
# Verifica a existência do valor "pass"
password = equipamento_acesso_map.get('pass')
if password is None:
return self.response_error(207)
# Obtém o valor de "enable_pass"
enable_pass = equipamento_acesso_map.get('enable_pass')
with distributedlock(LOCK_EQUIPMENT_ACCESS % id_tipo_acesso):
# Altera a informação de acesso ao equipamentoconforme dados
# recebidos no XML
EquipamentoAcesso.update(user,
id_equipamento,
id_tipo_acesso,
fqdn=fqdn,
user=username,
password=password,
enable_pass=enable_pass
)
# Retorna response vazio em caso de sucesso
return self.response(dumps_networkapi({}))
        except XMLError as x:
self.log.error(u'Erro ao ler o XML da requisição.')
return self.response_error(3, x)
except EquipamentoNotFoundError:
return self.response_error(117, id_equipamento)
except EquipamentoAcesso.DoesNotExist:
return self.response_error(209, id_equipamento, id_tipo_acesso)
except (EquipamentoError, GrupoError):
return self.response_error(1)
def handle_delete(self, request, user, *args, **kwargs):
"""Trata uma requisição DELETE para excluir uma informação de acesso a equipamento
URL: /equipamentoacesso/id_equipamento/id_tipo_acesso/
"""
# Verifica acesso e obtém dados do request
try:
# Obtém argumentos passados na URL
id_equipamento = kwargs.get('id_equipamento')
# Valid ID Equipment
if not is_valid_int_greater_zero_param(id_equipamento):
self.log.error(
u'The id_equipamento parameter is not a valid value: %s.', id_equipamento)
raise InvalidValueError(None, 'id_equipamento', id_equipamento)
id_tipo_acesso = kwargs.get('id_tipo_acesso')
# Valid ID Equipment
if not is_valid_int_greater_zero_param(id_tipo_acesso):
self.log.error(
u'The id_tipo_acesso parameter is not a valid value: %s.', id_tipo_acesso)
raise InvalidValueError(None, 'id_tipo_acesso', id_tipo_acesso)
Equipamento.get_by_pk(id_equipamento)
TipoAcesso.get_by_pk(id_tipo_acesso)
# Após obtenção do id_equipamento podemos verificar a permissão
if not has_perm(user,
AdminPermission.EQUIPMENT_MANAGEMENT,
AdminPermission.WRITE_OPERATION,
None,
id_equipamento,
AdminPermission.EQUIP_WRITE_OPERATION):
return self.not_authorized()
with distributedlock(LOCK_EQUIPMENT_ACCESS % id_tipo_acesso):
# Remove a informação de acesso a equipamento
EquipamentoAcesso.remove(user, id_equipamento, id_tipo_acesso)
# Retorna response vazio em caso de sucesso
return self.response(dumps_networkapi({}))
except InvalidValueError as e:
return self.response_error(269, e.param, e.value)
except EquipamentoNotFoundError:
return self.response_error(117, id_equipamento)
except AccessTypeNotFoundError:
return self.response_error(171, id_tipo_acesso)
except EquipamentoAcesso.DoesNotExist:
return self.response_error(209, id_equipamento, id_tipo_acesso)
except (EquipamentoError, GrupoError):
return self.response_error(1)
def get_equipamento_acesso_map(self, equipamento_acesso):
map = dict()
map['id_equipamento'] = equipamento_acesso.id
map['fqdn'] = equipamento_acesso.fqdn
map['user'] = equipamento_acesso.user
map['pass'] = equipamento_acesso.password
map['id_tipo_acesso'] = equipamento_acesso.tipo_acesso.id
map['enable_pass'] = equipamento_acesso.enable_pass
map['protocolo_tipo_acesso'] = equipamento_acesso.tipo_acesso.protocolo
if equipamento_acesso.vrf:
map['vrf'] = equipamento_acesso.vrf.id
else:
map['vrf'] = ''
return map
| 43.667546 | 118 | 0.626828 | 1,837 | 16,550 | 5.444747 | 0.148612 | 0.055889 | 0.044391 | 0.062088 | 0.667766 | 0.60238 | 0.523095 | 0.469006 | 0.429814 | 0.407519 | 0 | 0.00717 | 0.309003 | 16,550 | 378 | 119 | 43.783069 | 0.867436 | 0.147795 | 0 | 0.528139 | 0 | 0 | 0.091729 | 0.003459 | 0 | 0 | 0 | 0.002646 | 0 | 0 | null | null | 0.073593 | 0.099567 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
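The `handle_post` method above pulls its fields out of a `networkapi` root node containing an `equipamento_acesso` child. A sketch of a client-side payload builder matching those `.get()` calls — the exact XML framing that networkapi's `loads()` accepts is an assumption:

```python
import xml.etree.ElementTree as ET

# Sketch of the request body handle_post() parses. Tag names mirror the
# .get() lookups in the resource; the surrounding transport details are
# assumptions.
def build_access_request(id_equipamento, fqdn, user, password, id_tipo_acesso):
    root = ET.Element("networkapi")
    acesso = ET.SubElement(root, "equipamento_acesso")
    for tag, value in [("id_equipamento", id_equipamento), ("fqdn", fqdn),
                       ("user", user), ("pass", password),
                       ("id_tipo_acesso", id_tipo_acesso)]:
        ET.SubElement(acesso, tag).text = str(value)
    return ET.tostring(root, encoding="unicode")
```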
d81800173dc4a465b8851bbca0c6e1d24df14890 | 112 | py | Python | master/src/utils/openstack/config.py | rezabojnordi/Kangaroo | 4640acdcceeb1c3f6fe9947662d2b2a936790f56 | [
"MIT"
] | 1 | 2022-02-03T16:34:39.000Z | 2022-02-03T16:34:39.000Z | master/src/utils/openstack/config.py | rezabojnordi/Kangaroo | 4640acdcceeb1c3f6fe9947662d2b2a936790f56 | [
"MIT"
] | null | null | null | master/src/utils/openstack/config.py | rezabojnordi/Kangaroo | 4640acdcceeb1c3f6fe9947662d2b2a936790f56 | [
"MIT"
] | null | null | null |
user_name = "user"
password = "pass"
url= "ip address"
project_scope_name = "username"
domain_id = "defa"
| 9.333333 | 31 | 0.678571 | 15 | 112 | 4.8 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 112 | 11 | 32 | 10.181818 | 0.791209 | 0 | 0 | 0 | 0 | 0 | 0.283019 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
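The placeholder settings above map naturally onto a Keystone v3 password-auth request. A sketch of assembling that payload — the field mapping (e.g. using `domain_id` for both user and project scope) is an assumption about how this config is consumed:

```python
# Sketch: build an OpenStack Identity (Keystone) v3 password-auth body from
# the config values above. How this project actually maps the settings onto
# the request is an assumption.
def keystone_auth_body(user_name, password, domain_id, project_scope_name):
    return {
        "auth": {
            "identity": {
                "methods": ["password"],
                "password": {"user": {
                    "name": user_name,
                    "domain": {"id": domain_id},
                    "password": password,
                }},
            },
            "scope": {"project": {
                "name": project_scope_name,
                "domain": {"id": domain_id},
            }},
        }
    }
```

The resulting dict would be POSTed as JSON to the Keystone `auth/tokens` endpoint at the configured `url`.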
d81ea41a76599bebe4fb7004a50b159933ded1e7 | 1,406 | py | Python | Cactus.py | lwunruh/dino_game_clone | aada4928447051fc8e28d8cd713b648a311e44cc | [
"CC0-1.0"
] | null | null | null | Cactus.py | lwunruh/dino_game_clone | aada4928447051fc8e28d8cd713b648a311e44cc | [
"CC0-1.0"
] | null | null | null | Cactus.py | lwunruh/dino_game_clone | aada4928447051fc8e28d8cd713b648a311e44cc | [
"CC0-1.0"
] | null | null | null | import pygame as pg
big_cactus = pg.image.load("assets/big_cactus.png")
#small = pg.image.load("assets/small_cactus.png")
#small_double = pg.image.load("assets/small_double_cactus.png")
#small_triple = pg.image.load("assets/small_triple_cactus.png")
class Cactus(object):
    def __init__(self, x, ground_y, velocity, size):
        self.x = x
        # TODO variable cactus sizes: sizes 0-3 would select the small
        # sprite variants commented out above; only the big cactus is used.
        self.width = big_cactus.get_width()
        self.height = big_cactus.get_height()
        self.y = ground_y - self.height  # assumed: cactus rests on the ground line
        self.velocity = velocity
        self.size = size
        self.hitbox = (self.x, self.y, self.width, self.height)
def draw(self, win):
win.blit(big_cactus, (self.x, self.y))
self.hitbox = (self.x, self.y, self.width, self.height)
def is_colliding(self, dino):
if self.x < dino.get_Hx() + dino.get_Hwidth() and self.x + self.width > dino.get_Hx() and self.y < dino.get_Hy() + dino.get_Hheight() and self.y + self.height > dino.get_Hy():
return True
else:
return False
def draw_hitbox(self, win):
pg.draw.rect(win, (255, 0, 0), self.hitbox, 1)
def get_x(self):
return self.x
def get_y(self):
return self.y
def get_width(self):
return self.width
def get_height(self):
return self.height | 29.291667 | 183 | 0.603129 | 211 | 1,406 | 3.872038 | 0.251185 | 0.04896 | 0.053856 | 0.083231 | 0.188494 | 0.095471 | 0.095471 | 0.095471 | 0.095471 | 0.095471 | 0 | 0.010638 | 0.26458 | 1,406 | 48 | 184 | 29.291667 | 0.779497 | 0.140825 | 0 | 0.060606 | 0 | 0 | 0.017427 | 0.017427 | 0 | 0 | 0 | 0.020833 | 0 | 0 | null | null | 0 | 0.030303 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
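The `is_colliding` method above is a standard axis-aligned bounding-box (AABB) overlap test spelled out against the dino's getters. The same check as a pure function over `(x, y, width, height)` tuples:

```python
# AABB overlap: two boxes intersect iff they overlap on both axes.
# Boxes that merely touch at an edge do NOT count as overlapping,
# matching the strict < / > comparisons in is_colliding above.
def aabb_overlap(a, b) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and ax + aw > bx and ay < by + bh and ay + ah > by
```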
d83009f2802a6ac2db20accfc0bb345ee47e6601 | 446 | py | Python | DQMOffline/Hcal/python/HcalNoiseRatesParam_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | DQMOffline/Hcal/python/HcalNoiseRatesParam_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | DQMOffline/Hcal/python/HcalNoiseRatesParam_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | import FWCore.ParameterSet.Config as cms
hcalNoiseRates = cms.EDAnalyzer('HcalNoiseRates',
# outputFile = cms.untracked.string('NoiseRatesRelVal.root'),
outputFile = cms.untracked.string(''),
rbxCollName = cms.untracked.InputTag('hcalnoise'),
minRBXEnergy = cms.untracked.double(20.0),
minHitEnergy = cms.untracked.double(1.5),
useAllHistos = cms.untracked.bool(False),
noiselabel = cms.InputTag('hcalnoise')
)
| 37.166667 | 66 | 0.717489 | 46 | 446 | 6.956522 | 0.586957 | 0.225 | 0.1375 | 0.175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013123 | 0.14574 | 446 | 11 | 67 | 40.545455 | 0.826772 | 0.136771 | 0 | 0 | 0 | 0 | 0.084211 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
d830dc95647c12fc5e7b19cd67c9d88fa1b5e05a | 932 | py | Python | backend/settings/dbrouters.py | vtr0n/PlayMusicStats | 2de1ef03d363fcfc33cc41506ded6e682e1c6b65 | [
"MIT"
] | 9 | 2019-09-09T01:07:52.000Z | 2021-11-08T12:55:20.000Z | backend/settings/dbrouters.py | vtr0n/PlayMusicStats | 2de1ef03d363fcfc33cc41506ded6e682e1c6b65 | [
"MIT"
] | 18 | 2019-09-08T16:30:11.000Z | 2022-02-12T20:20:45.000Z | backend/settings/dbrouters.py | vtr0n/PlayMusicStats | 2de1ef03d363fcfc33cc41506ded6e682e1c6b65 | [
"MIT"
] | null | null | null | from backend.stats.models import PlayMusicStats
class DBRouter(object):
def db_for_read(self, model, **hints):
""" reading PlayMusicStats from mongo """
if model == PlayMusicStats:
return 'mongo'
return None
def db_for_write(self, model, **hints):
""" writing PlayMusicStats to mongo """
if model == PlayMusicStats:
return 'mongo'
return None
def allow_relation(self, obj1, obj2, **hints):
"""
Relations between objects are allowed
"""
db_list = ('default', 'mongo')
if obj1._state.db in db_list and obj2._state.db in db_list:
return True
return None
def allow_migrate(self, db, app_label, model_name=None, **hints):
"""
Allow migrate for stats app only for mongo
"""
if app_label == 'stats':
return db == 'mongo'
return None
| 27.411765 | 69 | 0.57618 | 107 | 932 | 4.88785 | 0.411215 | 0.053537 | 0.086042 | 0.099426 | 0.248566 | 0.191205 | 0.191205 | 0.191205 | 0.191205 | 0 | 0 | 0.006349 | 0.324034 | 932 | 33 | 70 | 28.242424 | 0.82381 | 0.157725 | 0 | 0.421053 | 0 | 0 | 0.044138 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.052632 | 0 | 0.736842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
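The router above is inert until registered in Django settings via `DATABASE_ROUTERS`, and the `'mongo'` strings it returns must match a configured database alias. A settings sketch — the `ENGINE`/`NAME` values are placeholders, and the djongo backend is an assumption:

```python
# settings.py sketch: register the router and define the aliases it returns.
# ENGINE/NAME values are placeholders; the actual backends this project
# uses are assumptions.
DATABASES = {
    "default": {"ENGINE": "django.db.backends.postgresql", "NAME": "app"},
    "mongo": {"ENGINE": "djongo", "NAME": "stats"},  # alias referenced by DBRouter
}
DATABASE_ROUTERS = ["backend.settings.dbrouters.DBRouter"]
```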
d832fbd439796738f32ca5c752214d1896604972 | 207 | py | Python | example_app/model/my_dto.py | keotl/jivago | 892dfb0cae773e36245083c3e56f0f8523145523 | [
"MIT"
] | 12 | 2018-03-19T20:57:44.000Z | 2020-01-27T14:11:24.000Z | example_app/model/my_dto.py | keotl/jivago | 892dfb0cae773e36245083c3e56f0f8523145523 | [
"MIT"
] | 73 | 2018-04-20T22:26:00.000Z | 2021-12-01T14:17:37.000Z | example_app/model/my_dto.py | keotl/jivago | 892dfb0cae773e36245083c3e56f0f8523145523 | [
"MIT"
] | 1 | 2019-02-28T13:33:45.000Z | 2019-02-28T13:33:45.000Z | from jivago.lang.annotations import Serializable
@Serializable
class MyDto(object):
name: str
age: int
def __init__(self, name: str, age: int):
self.name = name
self.age = age
| 17.25 | 48 | 0.647343 | 27 | 207 | 4.814815 | 0.592593 | 0.107692 | 0.153846 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 207 | 11 | 49 | 18.818182 | 0.849673 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.625 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |

# --- costools/__init__.py (stsci-hack/costools, BSD-3-Clause) ---

from __future__ import absolute_import, division # confidence high
from pkg_resources import get_distribution, DistributionNotFound
try:
__version__ = get_distribution(__name__).version
except DistributionNotFound:
# package is not installed
__version__ = 'UNKNOWN'
from . import timefilter
from . import splittag
from . import x1dcorr
# These lines allow TEAL to print out the names of TEAL-enabled tasks
# upon importing this package.
import os
from stsci.tools import teal
teal.print_tasknames(__name__, os.path.dirname(__file__))

# --- poputils/utils/cidr.py (GaretJax/pop-utils, MIT) ---
class CIDR(object):
def __init__(self, base, size=None):
try:
base, _size = base.split('/')
except ValueError:
pass
else:
if size is None:
size = _size
self.size = 2 ** (32 - int(size))
self._mask = ~(self.size - 1)
self._base = self.ip2dec(base) & self._mask
self.base = self.dec2ip(self._base)
self.block = int(size)
self.mask = self.dec2ip(self._mask)
@property
def last(self):
return self.dec2ip(self._base + self.size - 1)
def __len__(self):
return self.size
def __str__(self):
return "{0}/{1}".format(
self.base,
self.block,
)
def __contains__(self, ip):
return self.ip2dec(ip) & self._mask == self._base
@staticmethod
def ip2dec(ip):
return sum([int(q) << i * 8 for i, q in enumerate(reversed(ip.split(".")))])
@staticmethod
def dec2ip(ip):
return '.'.join([str((ip >> 8 * i) & 255) for i in range(3, -1, -1)])
    class CIDRIterator(object):
        # iterator over every address in the block; nested inside CIDR so
        # that __iter__ below can reach it as self.CIDRIterator
        def __init__(self, base, size):
            self.current = base
            self.final = base + size

        def __iter__(self):
            return self

        def next(self):
            c = self.current
            self.current += 1
            if self.current > self.final:
                raise StopIteration
            return CIDR.dec2ip(c)

        __next__ = next  # Python 3 compatibility

    def __iter__(self):
        return self.CIDRIterator(self._base, self.size)
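The two static converters carry all of the arithmetic: an IPv4 address folds into one integer and back via byte shifts. A standalone illustration (the functions below mirror `CIDR.ip2dec` / `CIDR.dec2ip` so the snippet runs on its own):

```python
def ip2dec(ip):
    # fold the dotted quads into a single 32-bit integer, most significant first
    return sum(int(q) << i * 8 for i, q in enumerate(reversed(ip.split("."))))

def dec2ip(dec):
    # unfold the integer back into dotted-quad notation
    return '.'.join(str((dec >> 8 * i) & 255) for i in range(3, -1, -1))

print(ip2dec("10.0.0.1"))   # 167772161
print(dec2ip(167772161))    # 10.0.0.1

# membership in a /24 is one mask-and-compare, as in CIDR.__contains__
print((ip2dec("192.168.1.7") & ~255) == ip2dec("192.168.1.0"))  # True
```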

# --- src/api/constants.py (massenergize/portalBackEnd, MIT) ---

USERS = "users"
COMMUNITIES = 'communities'
TEAMS = 'teams'
METRICS = 'metrics'
ACTIONS = 'actions'

# --- build/lib/CameraControl/Record.py (uwdrone/Abzu, FSFAP) ---

from threading import Thread
import time
class VideoRecorder(Thread):
def __init__(self, inputMonitor):
Thread.__init__(self)
self.inputMap = inputMonitor["inputMap"]
self.readLock = inputMonitor["readLock"]
self.writeLock = inputMonitor["writeLock"]
self.inputMonitor = inputMonitor
self.triangle = None
self.recording = False
def run(self):
self.recordVideo()
    def recordVideo(self):
        while True:
            # readers-writers monitor: register as a pending reader, wait to
            # be notified, then read the shared input map
            self.readLock.acquire(blocking=True, timeout=-1)
            self.inputMonitor["pendingReaders"] += 1
            self.readLock.wait()
            self.inputMonitor["pendingReaders"] -= 1
            self.inputMonitor["readers"] += 1
            self.triangle = self.inputMap["triangle"]
            self.inputMonitor["readers"] -= 1
            # writers get priority; otherwise wake any remaining readers
            if self.inputMonitor["pendingWriters"] > 0:
                self.writeLock.notify_all()
            elif self.inputMonitor["pendingReaders"] > 0 or self.inputMonitor["readers"] > 0:
                self.readLock.notify_all()
            else:
                pass
            self.readLock.release()
            if self.triangle == 1:
                # do things
                pass
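Since `wait()` and `notify_all()` are called on them, `readLock` and `writeLock` must be `threading.Condition` objects, not plain locks. A minimal sketch of the monitor dict `VideoRecorder` expects (key names come from the class above; how the real project builds it, and the two conditions sharing one underlying lock, are assumptions):

```python
import threading

shared = threading.Lock()  # one lock so reader/writer bookkeeping is mutually exclusive
input_monitor = {
    "inputMap": {"triangle": 0},
    "readLock": threading.Condition(shared),   # readers wait/notify here
    "writeLock": threading.Condition(shared),  # writers wait/notify here
    "pendingReaders": 0,
    "readers": 0,
    "pendingWriters": 0,
}
```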

# --- datahub/mi_dashboard/test/test_tasks.py (alixedi/data-hub-api-cd-poc, MIT) ---

from unittest.mock import Mock
import pytest
from django.conf import settings
from datahub.mi_dashboard.tasks import mi_investment_project_etl_pipeline
# mark the whole module for db use
pytestmark = pytest.mark.django_db
def test_mi_dashboard_feed(monkeypatch):
"""Test that the fdi_dashboard_pipeline gets called."""
run_mi_investment_project_etl_pipeline_mock = Mock(side_effect=[(0, 0)])
monkeypatch.setattr(
'datahub.mi_dashboard.tasks.run_mi_investment_project_etl_pipeline',
run_mi_investment_project_etl_pipeline_mock,
)
mi_investment_project_etl_pipeline.apply()
assert run_mi_investment_project_etl_pipeline_mock.call_count == 1
def test_mi_dashboard_feed_retries_on_error(monkeypatch):
"""Test that the mi_dashboard_feed task retries on error."""
run_mi_investment_project_etl_pipeline_mock = Mock(side_effect=[AssertionError, (0, 0)])
monkeypatch.setattr(
'datahub.mi_dashboard.tasks.run_mi_investment_project_etl_pipeline',
run_mi_investment_project_etl_pipeline_mock,
)
mi_investment_project_etl_pipeline.apply()
assert run_mi_investment_project_etl_pipeline_mock.call_count == 2
@pytest.mark.parametrize(
'elapsed_time,num_warnings',
(
(settings.MI_FDI_DASHBOARD_TASK_DURATION_WARNING_THRESHOLD + 1, 1),
(settings.MI_FDI_DASHBOARD_TASK_DURATION_WARNING_THRESHOLD, 0),
),
)
def test_mi_dashboard_elapsed_time_warning(elapsed_time, num_warnings, monkeypatch, caplog):
"""Test that crossing the elapsed time threshold would result in warning."""
caplog.set_level('WARNING')
run_mi_investment_project_etl_pipeline_mock = Mock(side_effect=[(0, 0)])
monkeypatch.setattr(
'datahub.mi_dashboard.tasks.run_mi_investment_project_etl_pipeline',
run_mi_investment_project_etl_pipeline_mock,
)
perf_counter_mock = Mock(side_effect=[0, elapsed_time])
monkeypatch.setattr(
'datahub.mi_dashboard.tasks.perf_counter',
perf_counter_mock,
)
mi_investment_project_etl_pipeline.apply()
assert run_mi_investment_project_etl_pipeline_mock.call_count == 1
assert len(caplog.records) == num_warnings
if num_warnings > 0:
assert (
'The mi_investment_project_etl_pipeline task took a long time '
'({elapsed_time:.2f} seconds).'
) in caplog.text
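These tests lean on `Mock(side_effect=[...])`: the list is consumed one item per call, an exception class raises, and any other value is returned. A quick standalone demonstration of the mechanism the retry test uses:

```python
from unittest.mock import Mock

m = Mock(side_effect=[AssertionError, (0, 0)])

failed_first = False
try:
    m()  # first call raises AssertionError
except AssertionError:
    failed_first = True

second = m()  # second call returns the tuple
print(failed_first, second, m.call_count)  # True (0, 0) 2
```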

# --- src/modules/print.py (wuttinanhi/dumb-lang, Apache-2.0) ---

from typing import Dict
from src.token import Token
from src.modules.content import ContentParser
from src.variable_storage import VariableStorage
class PrintModule:
@staticmethod
def execute(token: Token, variable_storage: VariableStorage):
# seperate text by space
content = token.line.script.split(" ", 1)
# parse content and print
parsed = ContentParser.parse(content[1], variable_storage)
# output to screen
print(parsed)
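The `split(" ", 1)` call separates the command keyword from the rest of the line in one step. A quick illustration on a bare string (no `Token` object involved; the example line is made up):

```python
script = "print hello brave new world"
content = script.split(" ", 1)  # maxsplit=1 keeps the remainder intact
print(content)                  # ['print', 'hello brave new world']
# content[1] is what ContentParser.parse would then receive
```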

# --- src/cscli/views/__init__.py (ryohare/python-cscli, MIT) ---

from .hosts import HostsView
__all__ = [
"HostsView"
]

# --- donations/payment_gateways/stripe/constants.py (diffractive/newstream, MIT) ---

from site_settings.models import GATEWAY_CAN_EDIT_SUBSCRIPTION, GATEWAY_CAN_TOGGLE_SUBSCRIPTION, GATEWAY_CAN_CANCEL_SUBSCRIPTION
API_CAPABILITIES = [GATEWAY_CAN_EDIT_SUBSCRIPTION, GATEWAY_CAN_TOGGLE_SUBSCRIPTION, GATEWAY_CAN_CANCEL_SUBSCRIPTION]
EVENT_CHECKOUT_SESSION_COMPLETED = 'checkout.session.completed'
EVENT_PAYMENT_INTENT_SUCCEEDED = 'payment_intent.succeeded'
EVENT_INVOICE_CREATED = 'invoice.created'
EVENT_INVOICE_PAID = 'invoice.paid'
EVENT_CUSTOMER_SUBSCRIPTION_UPDATED = 'customer.subscription.updated'
EVENT_CUSTOMER_SUBSCRIPTION_DELETED = 'customer.subscription.deleted'

# --- setup.py (sozforex/pygments-plugins-serv, BSD-3-Clause) ---

#!/usr/bin/python
from setuptools import setup
setup(name='pygments-plugins-serv',
version='0.0.1',
description='Pygments plugins.',
keywords='pygments plugins',
license='BSD',
author='Kichatov Feodor',
author_email='sozforex@gmail.com',
url='https://github.com/sozforex/pygments-plugins-serv',
packages=['pygments_plugins_serv'],
install_requires=['pygments>=2.1'],
entry_points='''[pygments.lexers]
MakefileLexer1=pygments_plugins_serv.lexers.make:MakefileLexer
M4Lexer=pygments_plugins_serv.lexers.m4:M4Lexer''',
classifiers=[
'Environment :: Plugins',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Operating System :: OS Independent',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 3',
'Topic :: Software Development :: Libraries :: Python Modules',
],)

# --- src/setup.py (litosly/atec, MIT) ---

# python environment setup
from distutils.core import setup
setup(
name='ATEC_wenle',
version='0.1dev',
packages=['',],
license='',
# long_description=open('README.txt').read(),
)

# --- project2/tests/q3_1_1.py (DrRossTaylor/intro-DS-Assignments, CC0-1.0) ---

test = {
'name': 'q3_1_1',
'points': 1,
'suites': [
{
'cases': [
{
'code': r"""
>>> len(my_20_features)
20
""",
'hidden': False,
'locked': False
},
{
'code': r"""
>>> np.all([f in test_movies.labels for f in my_20_features])
True
""",
'hidden': False,
'locked': False
},
{
'code': r"""
>>> # It looks like there are many movies in the training set that;
>>> # don't have any of your chosen words. That will make your;
>>> # classifier perform very poorly in some cases. Try choosing;
>>> # at least 1 common word.;
>>> train_f = train_movies.select(my_20_features);
>>> np.count_nonzero(train_f.apply(lambda r: np.sum(np.abs(np.array(r))) == 0)) < 20
True
""",
'hidden': False,
'locked': False
},
{
'code': r"""
>>> # It looks like there are many movies in the test set that;
>>> # don't have any of your chosen words. That will make your;
>>> # classifier perform very poorly in some cases. Try choosing;
>>> # at least 1 common word.;
>>> test_f = test_movies.select(my_20_features);
>>> np.count_nonzero(test_f.apply(lambda r: np.sum(np.abs(np.array(r))) == 0)) < 5
True
""",
'hidden': False,
'locked': False
},
{
'code': r"""
>>> # It looks like you may have duplicate words! Make sure not to!;
>>> len(set(my_20_features)) >= 20
True
""",
'hidden': False,
'locked': False
}
],
'scored': True,
'setup': '',
'teardown': '',
'type': 'doctest'
}
]
}

# --- flask_wiki/backend/models.py (gcavalcante8808/flask-wiki, BSD-2-Clause) ---

import uuid
import markdown2
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import UniqueConstraint, event
from flask_wiki.backend.custom_fields import GUIDField
from slugify import slugify
db = SQLAlchemy()
# TODO: Add Owner and other security fields later.
class Page(db.Model):
"""
Implements the Page Model.
"""
guid = db.Column(GUIDField, primary_key=True, default=uuid.uuid4)
name = db.Column(db.String, nullable=False, unique=True)
raw_content = db.Column(db.Text, nullable=False, unique=True)
rendered_content = db.Column(db.Text)
slug = db.Column(db.String, nullable=False, unique=True)
    # the constraint must live in __table_args__ to actually attach to the table
    __table_args__ = (UniqueConstraint('name', 'raw_content'),)
def __repr__(self):
return self.__str__()
def __str__(self):
return self.name
def make_slug(mapper, connection, target):
target.slug = slugify(target.name)
def render_html(mapper, connection, target):
target.rendered_content = markdown2.markdown(target.raw_content)
event.listen(Page, 'before_insert', make_slug)
event.listen(Page, 'before_insert', render_html)
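Both listeners fire on `before_insert`, so a page gets its slug and rendered HTML just before it is written. A dependency-free sketch of that flow (signatures simplified to one argument, and simple stand-ins replace `slugify` and `markdown2`, which the real code uses):

```python
def make_slug(target):
    # stand-in for slugify(target.name)
    target.slug = target.name.lower().replace(" ", "-")

def render_html(target):
    # stand-in for markdown2.markdown(target.raw_content)
    target.rendered_content = "<p>{}</p>".format(target.raw_content)

class FakePage(object):
    pass

page = FakePage()
page.name = "Hello World"
page.raw_content = "Some *markdown* text"

for hook in (make_slug, render_html):  # same order as the event.listen calls
    hook(page)

print(page.slug)  # hello-world
```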

# --- lib/_version.py (ppizarror/FI3104-Template, MIT) ---

#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
VERSION
Version tracking.
Author: PABLO PIZARRO @ github.com/ppizarror
Date: SEPTEMBER 2016
License: MIT license
"""
__version__ = '1.0.0'

# --- sarpy/io/general/nitf_elements/tres/unclass/CSPROA.py (pressler-vsc/sarpy, MIT) ---

# -*- coding: utf-8 -*-
from ..tre_elements import TREExtension, TREElement
__classification__ = "UNCLASSIFIED"
__author__ = "Thomas McCullough"
class CSPROAType(TREElement):
def __init__(self, value):
super(CSPROAType, self).__init__()
        # nine reserved 12-character string fields, then the BWC field
        for i in range(9):
            self.add_field('RESERVED_{}'.format(i), 's', 12, value)
        self.add_field('BWC', 's', 12, value)
class CSPROA(TREExtension):
_tag_value = 'CSPROA'
_data_type = CSPROAType

# --- python/pyspark/mllib/feature.py (loleek/spark, Apache-2.0) ---

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
Python package for feature in MLlib.
"""
from pyspark.serializers import PickleSerializer, AutoBatchedSerializer
from pyspark.mllib.linalg import _convert_to_vector
__all__ = ['Word2Vec', 'Word2VecModel']
class Word2VecModel(object):
"""
class for Word2Vec model
"""
def __init__(self, sc, java_model):
"""
:param sc: Spark context
:param java_model: Handle to Java model object
"""
self._sc = sc
self._java_model = java_model
def __del__(self):
self._sc._gateway.detach(self._java_model)
def transform(self, word):
"""
:param word: a word
:return: vector representation of word
Transforms a word to its vector representation
Note: local use only
"""
# TODO: make transform usable in RDD operations from python side
result = self._java_model.transform(word)
return PickleSerializer().loads(str(self._sc._jvm.SerDe.dumps(result)))
def findSynonyms(self, x, num):
"""
:param x: a word or a vector representation of word
:param num: number of synonyms to find
:return: array of (word, cosineSimilarity)
Find synonyms of a word
Note: local use only
"""
# TODO: make findSynonyms usable in RDD operations from python side
ser = PickleSerializer()
if type(x) == str:
jlist = self._java_model.findSynonyms(x, num)
else:
bytes = bytearray(ser.dumps(_convert_to_vector(x)))
vec = self._sc._jvm.SerDe.loads(bytes)
jlist = self._java_model.findSynonyms(vec, num)
words, similarity = ser.loads(str(self._sc._jvm.SerDe.dumps(jlist)))
return zip(words, similarity)
class Word2Vec(object):
"""
Word2Vec creates vector representation of words in a text corpus.
The algorithm first constructs a vocabulary from the corpus
and then learns vector representation of words in the vocabulary.
The vector representation can be used as features in
natural language processing and machine learning algorithms.
We used skip-gram model in our implementation and hierarchical softmax
method to train the model. The variable names in the implementation
matches the original C implementation.
For original C implementation, see https://code.google.com/p/word2vec/
For research papers, see
Efficient Estimation of Word Representations in Vector Space
and
Distributed Representations of Words and Phrases and their Compositionality.
>>> sentence = "a b " * 100 + "a c " * 10
>>> localDoc = [sentence, sentence]
>>> doc = sc.parallelize(localDoc).map(lambda line: line.split(" "))
>>> model = Word2Vec().setVectorSize(10).setSeed(42L).fit(doc)
>>> syms = model.findSynonyms("a", 2)
>>> str(syms[0][0])
'b'
>>> str(syms[1][0])
'c'
>>> len(syms)
2
>>> vec = model.transform("a")
>>> len(vec)
10
>>> syms = model.findSynonyms(vec, 2)
>>> str(syms[0][0])
'b'
>>> str(syms[1][0])
'c'
>>> len(syms)
2
"""
def __init__(self):
"""
Construct Word2Vec instance
"""
self.vectorSize = 100
self.learningRate = 0.025
self.numPartitions = 1
self.numIterations = 1
self.seed = 42L
def setVectorSize(self, vectorSize):
"""
Sets vector size (default: 100).
"""
self.vectorSize = vectorSize
return self
def setLearningRate(self, learningRate):
"""
Sets initial learning rate (default: 0.025).
"""
self.learningRate = learningRate
return self
def setNumPartitions(self, numPartitions):
"""
Sets number of partitions (default: 1). Use a small number for accuracy.
"""
self.numPartitions = numPartitions
return self
def setNumIterations(self, numIterations):
"""
Sets number of iterations (default: 1), which should be smaller than or equal to number of
partitions.
"""
self.numIterations = numIterations
return self
def setSeed(self, seed):
"""
Sets random seed.
"""
self.seed = seed
return self
def fit(self, data):
"""
Computes the vector representation of each word in vocabulary.
:param data: training data. RDD of subtype of Iterable[String]
:return: python Word2VecModel instance
"""
sc = data.context
ser = PickleSerializer()
vectorSize = self.vectorSize
learningRate = self.learningRate
numPartitions = self.numPartitions
numIterations = self.numIterations
seed = self.seed
model = sc._jvm.PythonMLLibAPI().trainWord2Vec(
data._to_java_object_rdd(), vectorSize,
learningRate, numPartitions, numIterations, seed)
return Word2VecModel(sc, model)
def _test():
import doctest
from pyspark import SparkContext
globs = globals().copy()
globs['sc'] = SparkContext('local[4]', 'PythonTest', batchSize=2)
(failure_count, test_count) = doctest.testmod(globs=globs, optionflags=doctest.ELLIPSIS)
globs['sc'].stop()
if failure_count:
exit(-1)
if __name__ == "__main__":
_test()

# --- linebot/models/video_play_complete.py (coman1024/line-bot-sdk-python, Apache-2.0) ---

# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""linebot.models.video_play_complete module."""
from __future__ import unicode_literals
from abc import ABCMeta
from future.utils import with_metaclass
from .base import Base
class VideoPlayComplete(with_metaclass(ABCMeta, Base)):
"""Abstract Base Class of VideoPlayComplete."""
def __init__(self, tracking_id=None, **kwargs):
"""__init__ method.
:param str tracking_id: the video viewing complete event occurs
when the user finishes watching the video.
Max character limit: 100.
:param kwargs:
"""
super(VideoPlayComplete, self).__init__(**kwargs)
self.tracking_id = tracking_id
f4e5a55e3445ecc038625e6feaeb7d1f89ef7717 | 544 | py | Python | jianzhioffer/translateNum.py | summer-vacation/AlgoExec | 55c6c3e7890b596b709b50cafa415b9594c03edd | [
"MIT"
] | null | null | null | jianzhioffer/translateNum.py | summer-vacation/AlgoExec | 55c6c3e7890b596b709b50cafa415b9594c03edd | [
"MIT"
] | 2 | 2019-12-09T06:12:51.000Z | 2019-12-16T14:38:34.000Z | jianzhioffer/translateNum.py | summer-vacation/AlgoExec | 55c6c3e7890b596b709b50cafa415b9594c03edd | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
File Name: translateNum
Author : jing
Date: 2020/4/26
Translate a number into letter strings; count the number of distinct translations.
"""
class Solution:
    def translateNum(self, num: int) -> int:
        if num is None or num < 0:
            return 0
        if 0 <= num <= 9:
            return 1
        mod = num % 100
        if mod <= 9 or mod >= 26:
            return self.translateNum(num // 10)
        else:
            return self.translateNum(num // 100) + self.translateNum(num // 10)


print(Solution().translateNum(12258))
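The recursion above re-solves the same digit prefixes repeatedly. For illustration, the same count can be computed bottom-up over the digit string in a single pass; a minimal standalone sketch (`translate_num` is a hypothetical name, not part of the original file):

```python
def translate_num(num: int) -> int:
    """Count distinct letter translations of num, where two-digit
    chunks "10".."25" may also be translated as one letter."""
    s = str(num)
    a, b = 1, 1  # translation counts for the prefixes ending at i-2 and i-1
    for i in range(1, len(s)):
        # the digit pair s[i-1:i+1] adds a second branch only for "10".."25"
        a, b = b, (a + b) if "10" <= s[i - 1:i + 1] <= "25" else b
    return b


print(translate_num(12258))  # 5, matching the recursive version above
```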
| 20.923077 | 79 | 0.512868 | 65 | 544 | 4.292308 | 0.523077 | 0.172043 | 0.204301 | 0.179211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089595 | 0.363971 | 544 | 25 | 80 | 21.76 | 0.716763 | 0.211397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.5 | 0.083333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
761c18c04e36dc554e9567b8569af372d958e27f | 141 | py | Python | launcher.py | CircuitsBots/AutoSnipe | 59908d217f9f4c373ca383c91c27c735e9fc24f2 | [
"MIT"
] | 1 | 2021-04-22T17:51:51.000Z | 2021-04-22T17:51:51.000Z | launcher.py | CircuitsBots/AutoSnipe | 59908d217f9f4c373ca383c91c27c735e9fc24f2 | [
"MIT"
] | null | null | null | launcher.py | CircuitsBots/AutoSnipe | 59908d217f9f4c373ca383c91c27c735e9fc24f2 | [
"MIT"
] | null | null | null | import os
import dotenv
from app.bot import AutoSnipe
bot = AutoSnipe(".")
dotenv.load_dotenv()
token = os.getenv("TOKEN")
bot.run(token)
| 12.818182 | 29 | 0.730496 | 21 | 141 | 4.857143 | 0.52381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134752 | 141 | 10 | 30 | 14.1 | 0.836066 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
5213641b0265b67149013504431a63ae12d41600 | 301 | py | Python | courses/urls.py | amandasavluchinske/mooc | 3d83748b2ddda646597b5a3b57f838ed0fb99b3e | [
"MIT"
] | null | null | null | courses/urls.py | amandasavluchinske/mooc | 3d83748b2ddda646597b5a3b57f838ed0fb99b3e | [
"MIT"
] | 2 | 2021-05-07T01:37:50.000Z | 2022-02-10T10:08:41.000Z | courses/urls.py | amandasavluchinske/mooc | 3d83748b2ddda646597b5a3b57f838ed0fb99b3e | [
"MIT"
] | null | null | null | from django.conf import settings
from django.conf.urls import include, url # noqa
from django.contrib import admin
from django.views.generic import TemplateView, FormView
import django_js_reverse.views
urlpatterns = [
    url(r'^$', TemplateView.as_view(template_name='index.html'), name='home'),
] | 30.1 | 78 | 0.777409 | 42 | 301 | 5.47619 | 0.619048 | 0.173913 | 0.121739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 301 | 10 | 79 | 30.1 | 0.864662 | 0.013289 | 0 | 0 | 0 | 0 | 0.054054 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.625 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
5218eea41bed8e9fd45d629d2f0b3249c3bbac54 | 165 | py | Python | examples/python/simple/func_global.py | airgiser/ucb | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | 1 | 2022-01-08T14:59:44.000Z | 2022-01-08T14:59:44.000Z | examples/python/simple/func_global.py | airgiser/just-for-fun | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | null | null | null | examples/python/simple/func_global.py | airgiser/just-for-fun | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | null | null | null | #!/usr/bin/python
# Filename: func_global.py
def func():
    global x
    print 'x is', x
    x = 2
    print 'x is changed to', x


x = 50
func()
print 'x is now', x
| 12.692308 | 29 | 0.569697 | 31 | 165 | 3 | 0.516129 | 0.193548 | 0.258065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025424 | 0.284848 | 165 | 12 | 30 | 13.75 | 0.762712 | 0.248485 | 0 | 0 | 0 | 0 | 0.213115 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.375 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5233c68bce71ec8a0f235408ac38a757b34a3140 | 1,140 | py | Python | tackle/providers/tackle/tests/block/test_provider_tackle_block.py | geometry-labs/tackle-box | 83424a10416955ba983f0c14ec89bd79673a4282 | [
"BSD-3-Clause"
] | 1 | 2021-04-13T23:10:11.000Z | 2021-04-13T23:10:11.000Z | tackle/providers/tackle/tests/block/test_provider_tackle_block.py | geometry-labs/tackle-box | 83424a10416955ba983f0c14ec89bd79673a4282 | [
"BSD-3-Clause"
] | 4 | 2021-01-27T00:06:12.000Z | 2021-02-12T01:20:32.000Z | tackle/providers/tackle/tests/block/test_provider_tackle_block.py | geometry-labs/tackle-box | 83424a10416955ba983f0c14ec89bd79673a4282 | [
"BSD-3-Clause"
] | 1 | 2021-05-07T05:07:29.000Z | 2021-05-07T05:07:29.000Z | """Tests dict input objects for `tackle.providers.tackle.block` module."""
from tackle.main import tackle
def test_provider_system_hook_block_tackle(change_dir):
    """Simple block test."""
    output = tackle('basic.yaml', no_input=True)
    assert output['stuff'] == 'here'
    assert 'things' not in output


def test_provider_system_hook_block_embedded_blocks(change_dir):
    """Embedded with multiple blocks."""
    output = tackle('embedded_blocks.yaml', no_input=True)
    assert output['things'] == 'things'


def test_provider_system_hook_block_looped(change_dir):
    """With a for loop."""
    output = tackle('looped.yaml', no_input=True)
    assert len(output['blocker']) == 2


def test_provider_system_hook_block_block_merge(change_dir):
    """Block with a merge."""
    output = tackle('block_merge.yaml', no_input=True)
    assert output['things'] == 'here'
    assert output['foo'] == 'bar'


def test_provider_system_hook_block_block(change_dir):
    """Complex block."""
    output = tackle('block.yaml', no_input=True)
    assert output['block']['things'] == 'here'
    assert output['block']['foo'] == 'bar'
| 27.804878 | 74 | 0.695614 | 152 | 1,140 | 4.960526 | 0.296053 | 0.095491 | 0.09947 | 0.139257 | 0.399204 | 0.371353 | 0.180371 | 0 | 0 | 0 | 0 | 0.00104 | 0.15614 | 1,140 | 40 | 75 | 28.5 | 0.782744 | 0.149123 | 0 | 0 | 0 | 0 | 0.15229 | 0 | 0 | 0 | 0 | 0 | 0.421053 | 1 | 0.263158 | false | 0 | 0.052632 | 0 | 0.315789 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
526064e8f9352f549958bdfdba4da287f5d5cbc8 | 1,452 | py | Python | venv/Lib/site-packages/tests/smoke/device_info.py | melihteke/ebook_study | 4848ea42e37ee1d6ec777bfc33f49984653ace34 | [
"MIT"
] | 1 | 2020-04-17T18:53:02.000Z | 2020-04-17T18:53:02.000Z | venv/Lib/site-packages/tests/smoke/device_info.py | melihteke/ebook_study | 4848ea42e37ee1d6ec777bfc33f49984653ace34 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/tests/smoke/device_info.py | melihteke/ebook_study | 4848ea42e37ee1d6ec777bfc33f49984653ace34 | [
"MIT"
] | null | null | null | import os
host = os.getenv("SCRAPLI_SMOKE_HOST", None)
port = os.getenv("SCRAPLI_SMOKE_PORT", None)
user = os.getenv("SCRAPLI_SMOKE_USER", None)
password = os.getenv("SCRAPLI_SMOKE_PASS", None)
iosxe_device = {
    "host": host or "172.18.0.11",
    "port": port or 22,
    "auth_username": user or "vrnetlab",
    "auth_password": password or "VR-netlab9",
    "auth_strict_key": False,
    "transport": "system",
    "keepalive": True,
    "keepalive_interval": 1,
}

nxos_device = {
    "host": "172.18.0.12",
    "port": 22,
    "auth_username": "vrnetlab",
    "auth_password": "VR-netlab9",
    "auth_strict_key": False,
    "transport": "system",
    "keepalive": True,
    "keepalive_interval": 1,
}

iosxr_device = {
    "host": "172.18.0.13",
    "port": 22,
    "auth_username": "vrnetlab",
    "auth_password": "VR-netlab9",
    "auth_strict_key": False,
    "transport": "system",
    "keepalive": True,
    "keepalive_interval": 1,
}

eos_device = {
    "host": "172.18.0.14",
    "port": 22,
    "auth_username": "vrnetlab",
    "auth_password": "VR-netlab9",
    "auth_strict_key": False,
    "transport": "system",
    "keepalive": True,
    "keepalive_interval": 1,
    "comms_ansi": True,
}

junos_device = {
    "host": "172.18.0.15",
    "port": 22,
    "auth_username": "vrnetlab",
    "auth_password": "VR-netlab9",
    "auth_strict_key": False,
    "transport": "system",
    "keepalive": True,
    "keepalive_interval": 1,
}
| 23.047619 | 48 | 0.610882 | 176 | 1,452 | 4.818182 | 0.244318 | 0.058962 | 0.035377 | 0.112028 | 0.685142 | 0.60967 | 0.60967 | 0.60967 | 0.60967 | 0.60967 | 0 | 0.051993 | 0.205234 | 1,452 | 62 | 49 | 23.419355 | 0.682842 | 0 | 0 | 0.571429 | 0 | 0 | 0.469697 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.107143 | 0.017857 | 0 | 0.017857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
5261073fd1db37b28c63f34abc35676da5fa9abd | 298 | py | Python | tests/test_rqscheduler.py | dymaxionlabs/nb_workflows | 336e4d83dd5f8a7edfbaacfa426b23a42c0a68a9 | [
"Apache-2.0"
] | 4 | 2022-02-17T19:47:52.000Z | 2022-02-17T20:11:06.000Z | tests/test_rqscheduler.py | dymaxionlabs/nb_workflows | 336e4d83dd5f8a7edfbaacfa426b23a42c0a68a9 | [
"Apache-2.0"
] | 2 | 2022-03-26T00:07:05.000Z | 2022-03-30T21:20:00.000Z | tests/test_rqscheduler.py | dymaxionlabs/nb_workflows | 336e4d83dd5f8a7edfbaacfa426b23a42c0a68a9 | [
"Apache-2.0"
] | 1 | 2022-02-18T13:33:00.000Z | 2022-02-18T13:33:00.000Z | from labfunctions.conf.server_settings import settings
from labfunctions.control_plane import rqscheduler
def test_rqscheduler_run(mocker):
    mocker.patch(
        "labfunctions.control_plane.rqscheduler.Scheduler.run", return_value=None
    )
    rqscheduler.run(settings.RQ_REDIS, 5, "INFO")
| 29.8 | 81 | 0.785235 | 36 | 298 | 6.305556 | 0.611111 | 0.140969 | 0.211454 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003861 | 0.130872 | 298 | 9 | 82 | 33.111111 | 0.872587 | 0 | 0 | 0 | 0 | 0 | 0.187919 | 0.174497 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5272cfded6807b3a8d9d5917b0e18331709aafd1 | 3,540 | py | Python | src/sagemaker/workflow/entities.py | mufaddal-rohawala/sagemaker-python-sdk | 72c12bb1481c368f799b17c38d07c5fc34864093 | [
"Apache-2.0"
] | null | null | null | src/sagemaker/workflow/entities.py | mufaddal-rohawala/sagemaker-python-sdk | 72c12bb1481c368f799b17c38d07c5fc34864093 | [
"Apache-2.0"
] | 20 | 2021-09-17T20:50:11.000Z | 2021-12-09T00:29:02.000Z | src/sagemaker/workflow/entities.py | mufaddal-rohawala/sagemaker-python-sdk | 72c12bb1481c368f799b17c38d07c5fc34864093 | [
"Apache-2.0"
] | null | null | null | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""Defines the base entities used in workflow."""
from __future__ import absolute_import
import abc
from enum import EnumMeta
from typing import Any, Dict, List, Union
PrimitiveType = Union[str, int, bool, float, None]
RequestType = Union[Dict[str, Any], List[Dict[str, Any]]]
class Entity(abc.ABC):
    """Base object for workflow entities.

    Entities must implement the to_request method.
    """

    @abc.abstractmethod
    def to_request(self) -> RequestType:
        """Get the request structure for workflow service calls."""


class DefaultEnumMeta(EnumMeta):
    """An EnumMeta which defaults to the first value in the Enum list."""

    default = object()

    def __call__(cls, *args, value=default, **kwargs):
        """Defaults to the first value in the Enum list."""
        if value is DefaultEnumMeta.default:
            return next(iter(cls))
        return super().__call__(value, *args, **kwargs)

    factory = __call__
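The `DefaultEnumMeta` above can be exercised on any Enum. A minimal standalone sketch, assuming a hypothetical `Color` enum (not part of this module); the metaclass is re-declared so the snippet runs on its own:

```python
from enum import Enum, EnumMeta


class DefaultEnumMeta(EnumMeta):
    """Standalone copy of the metaclass above, for illustration only."""

    default = object()

    def __call__(cls, *args, value=default, **kwargs):
        # No explicit value: fall back to the first declared member.
        if value is DefaultEnumMeta.default:
            return next(iter(cls))
        return super().__call__(value, *args, **kwargs)


class Color(Enum, metaclass=DefaultEnumMeta):
    RED = 1
    GREEN = 2


print(Color())         # Color.RED -- no value falls back to the first member
print(Color(value=2))  # Color.GREEN -- an explicit keyword value resolves normally
```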
class Expression(abc.ABC):
    """Base object for expressions.

    Expressions must implement the expr property.
    """

    @property
    @abc.abstractmethod
    def expr(self) -> RequestType:
        """Get the expression structure for workflow service calls."""


class PipelineVariable(Expression):
    """Base object for pipeline variables.

    PipelineVariables must implement the expr property.
    """

    def __add__(self, other: Union[Expression, PrimitiveType]):
        """Add function for PipelineVariable.

        Args:
            other (Union[Expression, PrimitiveType]): The other object to be concatenated.

        Always raise an error since pipeline variables do not support concatenation.
        """
        raise TypeError("Pipeline variables do not support concatenation.")

    def __str__(self):
        """Override built-in String function for PipelineVariable."""
        raise TypeError(
            "Pipeline variables do not support __str__ operation. "
            "Please use `.to_string()` to convert it to string type in execution time "
            "or use `.expr` to translate it to Json for display purpose in Python SDK."
        )

    def __int__(self):
        """Override built-in Integer function for PipelineVariable."""
        raise TypeError("Pipeline variables do not support __int__ operation.")

    def __float__(self):
        """Override built-in Float function for PipelineVariable."""
        raise TypeError("Pipeline variables do not support __float__ operation.")

    def to_string(self):
        """Prompt the pipeline to convert the pipeline variable to String in runtime."""
        from sagemaker.workflow.functions import Join

        return Join(on="", values=[self])

    @property
    @abc.abstractmethod
    def expr(self) -> RequestType:
        """Get the expression structure for workflow service calls."""

    @property
    @abc.abstractmethod
    def _referenced_steps(self) -> List[str]:
        """List of step names that this function depends on."""
| 32.181818 | 90 | 0.686158 | 438 | 3,540 | 5.429224 | 0.374429 | 0.025231 | 0.03995 | 0.046257 | 0.296888 | 0.257359 | 0.216569 | 0.198486 | 0.198486 | 0.168209 | 0 | 0.001456 | 0.224011 | 3,540 | 109 | 91 | 32.477064 | 0.864216 | 0.450565 | 0 | 0.214286 | 0 | 0 | 0.195773 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.238095 | false | 0 | 0.119048 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
5278bc829279f7e15d79fd95354c40b02e9c04cc | 2,565 | py | Python | zntrack/zn/dependencies.py | zincware/ZnTrack | 7767e133720a75ccb289a5b19d7960584e9dc74f | [
"Apache-2.0"
] | 16 | 2021-12-08T15:35:22.000Z | 2022-03-29T09:43:31.000Z | zntrack/zn/dependencies.py | zincware/ZnTrack | 7767e133720a75ccb289a5b19d7960584e9dc74f | [
"Apache-2.0"
] | 108 | 2021-10-20T08:00:57.000Z | 2022-03-30T14:52:30.000Z | zntrack/zn/dependencies.py | zincware/ZnTrack | 7767e133720a75ccb289a5b19d7960584e9dc74f | [
"Apache-2.0"
] | 2 | 2021-11-18T07:41:52.000Z | 2022-03-17T15:39:56.000Z | from __future__ import annotations
import dataclasses
import pathlib
from typing import TYPE_CHECKING, List, Union
import znjson
from zntrack.utils import utils
if TYPE_CHECKING:
    from zntrack import Node


@dataclasses.dataclass
class NodeAttribute:
    module: str
    cls: str
    name: str
    attribute: str
    affected_files: List[pathlib.Path]


class RawNodeAttributeConverter(znjson.ConverterBase):
    """Serializer for Node attributes.

    Instead of returning the actual attribute this returns the NodeAttribute cls.
    """

    instance = NodeAttribute
    representation = "NodeAttribute"
    level = 999

    def _encode(self, obj: NodeAttribute) -> dict:
        """Convert NodeAttribute to a serializable dict."""
        return dataclasses.asdict(obj)

    def _decode(self, value: dict) -> NodeAttribute:
        """Return the serialized Node attribute."""
        return NodeAttribute(**value)


def getdeps(node: Union[Node, type(Node)], attribute: str) -> NodeAttribute:
    """Allow for Node attributes as dependencies.

    Parameters
    ----------
    node
    attribute

    Returns
    -------
    NodeAttribute
    """
    # TODO add check if the attribute exists in the given Node
    # _ = getattr(node, attribute)
    node = utils.load_node_dependency(node)  # run node = Node.load() if required
    return NodeAttribute(
        module=node.module,
        cls=node.__class__.__name__,
        name=node.node_name,
        attribute=attribute,
        affected_files=list(node.affected_files),
    )


def get_origin(
    node: Union[Node, type(Node)], attribute: str
) -> Union[NodeAttribute, List[NodeAttribute]]:
    """Get the NodeAttribute from a zn.deps.

    Typically, when using zn.deps there is no way to access the original Node where
    the data comes from. This function allows you to get the underlying
    NodeAttribute object to access e.g. the name of the original Node.
    """
    znjson.register(RawNodeAttributeConverter)
    new_node = node.load(name=node.node_name)
    try:
        value = getattr(new_node, attribute)
    except AttributeError as err:
        raise AttributeError("Can only use get_origin with zn.deps") from err
    znjson.deregister(RawNodeAttributeConverter)
    if isinstance(value, (list, tuple)):
        if any(not isinstance(x, NodeAttribute) for x in value):
            raise AttributeError("Can only use get_origin with zn.deps using getdeps.")
    elif not isinstance(value, NodeAttribute):
        raise AttributeError("Can only use get_origin with zn.deps using getdeps.")
    return value
| 28.5 | 87 | 0.695127 | 308 | 2,565 | 5.691558 | 0.37013 | 0.044495 | 0.03765 | 0.044495 | 0.133485 | 0.133485 | 0.133485 | 0.095836 | 0.095836 | 0.095836 | 0 | 0.001499 | 0.219493 | 2,565 | 89 | 88 | 28.820225 | 0.874126 | 0.25614 | 0 | 0.041667 | 0 | 0 | 0.082785 | 0 | 0 | 0 | 0 | 0.011236 | 0 | 1 | 0.083333 | false | 0 | 0.145833 | 0 | 0.520833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
527d6947d76baee892980b6d09552e88ce1b4422 | 2,676 | py | Python | tests/conftest.py | FFY00/python-uhid | 99f459aef3934a146d1954ae372b1a107b1f34eb | [
"MIT"
] | 7 | 2020-08-28T01:57:58.000Z | 2021-10-16T10:49:58.000Z | tests/conftest.py | FFY00/python-uhid | 99f459aef3934a146d1954ae372b1a107b1f34eb | [
"MIT"
] | 4 | 2020-08-28T00:25:35.000Z | 2021-04-08T19:13:48.000Z | tests/conftest.py | FFY00/python-uhid | 99f459aef3934a146d1954ae372b1a107b1f34eb | [
"MIT"
] | null | null | null | import pytest
@pytest.fixture
def basic_mouse_rdesc():
    return [
        # Generic mouse report descriptor
        0x05, 0x01,  # Usage Page (Generic Desktop)        0
        0x09, 0x02,  # Usage (Mouse)                       2
        0xa1, 0x01,  # Collection (Application)            4
        0x09, 0x02,  # .Usage (Mouse)                      6
        0xa1, 0x02,  # .Collection (Logical)               8
        0x09, 0x01,  # ..Usage (Pointer)                   10
        0xa1, 0x00,  # ..Collection (Physical)             12
        0x05, 0x09,  # ...Usage Page (Button)              14
        0x19, 0x01,  # ...Usage Minimum (1)                16
        0x29, 0x03,  # ...Usage Maximum (3)                18
        0x15, 0x00,  # ...Logical Minimum (0)              20
        0x25, 0x01,  # ...Logical Maximum (1)              22
        0x75, 0x01,  # ...Report Size (1)                  24
        0x95, 0x03,  # ...Report Count (3)                 26
        0x81, 0x02,  # ...Input (Data,Var,Abs)             28
        0x75, 0x05,  # ...Report Size (5)                  30
        0x95, 0x01,  # ...Report Count (1)                 32
        0x81, 0x03,  # ...Input (Cnst,Var,Abs)             34
        0x05, 0x01,  # ...Usage Page (Generic Desktop)     36
        0x09, 0x30,  # ...Usage (X)                        38
        0x09, 0x31,  # ...Usage (Y)                        40
        0x15, 0x81,  # ...Logical Minimum (-127)           42
        0x25, 0x7f,  # ...Logical Maximum (127)            44
        0x75, 0x08,  # ...Report Size (8)                  46
        0x95, 0x02,  # ...Report Count (2)                 48
        0x81, 0x06,  # ...Input (Data,Var,Rel)             50
        0xc0,        # ..End Collection                    52
        0xc0,        # .End Collection                     53
        0xc0,        # End Collection                      54
    ]


@pytest.fixture
def vendor_rdesc():
    return [
        # Vendor page report descriptor
        0x06, 0x00, 0xff,  # Usage Page (Vendor Page)
        0x09, 0x00,        # Usage (Vendor Usage 0)
        0xa1, 0x01,        # Collection (Application)
        0x85, 0x20,        # .Report ID (0x20)
        0x75, 0x08,        # .Report Size (8)
        0x95, 0x08,        # .Report Count (8)
        0x15, 0x00,        # .Logical Minimum (0)
        0x26, 0xff, 0x00,  # .Logical Maximum (255)
        0x09, 0x00,        # .Usage (Vendor Usage 0)
        0x81, 0x00,        # .Input (Data,Arr,Abs)
        0x09, 0x00,        # .Usage (Vendor Usage 0)
        0x91, 0x00,        # .Output (Data,Arr,Abs)
        0xc0,              # End Collection
    ]
| 46.137931 | 61 | 0.420777 | 258 | 2,676 | 4.352713 | 0.372093 | 0.032057 | 0.060552 | 0.050757 | 0.196794 | 0.121995 | 0 | 0 | 0 | 0 | 0 | 0.219888 | 0.466368 | 2,676 | 57 | 62 | 46.947368 | 0.566527 | 0.547459 | 0 | 0.411765 | 0 | 0 | 0 | 0 | 0 | 0 | 0.281304 | 0 | 0 | 1 | 0.039216 | true | 0 | 0.019608 | 0.039216 | 0.098039 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
52803545f3b8b643a3ef5b1ddbc0043fdf710aed | 269 | py | Python | vk_types/base.py | kz159/vk_types | 84496ca22e34f0991a2d8dc353601272fb9f2108 | [
"MIT"
] | null | null | null | vk_types/base.py | kz159/vk_types | 84496ca22e34f0991a2d8dc353601272fb9f2108 | [
"MIT"
] | null | null | null | vk_types/base.py | kz159/vk_types | 84496ca22e34f0991a2d8dc353601272fb9f2108 | [
"MIT"
] | null | null | null | import pydantic
class BaseModel(pydantic.BaseModel):
    class Config:
        allow_mutation = False

    def __str__(self):
        return str(self.dict())

    def __repr__(self):
        return f"{self.__class__} {getattr(self, 'id', getattr(self, 'type', ''))}"
| 20.692308 | 83 | 0.628253 | 31 | 269 | 5.032258 | 0.580645 | 0.089744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230483 | 269 | 12 | 84 | 22.416667 | 0.753623 | 0 | 0 | 0 | 0 | 0.125 | 0.241636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.25 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
528dac7ca562f7233e511a674159e4616f0c822c | 268 | py | Python | src/core/migrations/0107_merge_20191024_0737.py | metabolism-of-cities/ARCHIVED-metabolism-of-cities-platform-v3 | c754d3b1b401906a21640b8eacb6b724a448b31c | [
"MIT"
] | null | null | null | src/core/migrations/0107_merge_20191024_0737.py | metabolism-of-cities/ARCHIVED-metabolism-of-cities-platform-v3 | c754d3b1b401906a21640b8eacb6b724a448b31c | [
"MIT"
] | null | null | null | src/core/migrations/0107_merge_20191024_0737.py | metabolism-of-cities/ARCHIVED-metabolism-of-cities-platform-v3 | c754d3b1b401906a21640b8eacb6b724a448b31c | [
"MIT"
] | null | null | null | # Generated by Django 2.1.2 on 2019-10-24 07:37
from django.db import migrations
class Migration(migrations.Migration):

    dependencies = [
        ('core', '0106_auto_20191024_0738'),
        ('core', '0106_merge_20191024_0726'),
    ]

    operations = [
    ]
| 17.866667 | 47 | 0.641791 | 33 | 268 | 5.030303 | 0.787879 | 0.096386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229268 | 0.235075 | 268 | 14 | 48 | 19.142857 | 0.580488 | 0.16791 | 0 | 0 | 1 | 0 | 0.248869 | 0.21267 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5290f8333d19eb6ffaf139cb721b7ffed661d212 | 787 | py | Python | TopQuarkAnalysis/TopJetCombination/python/TtFullLepHypKinSolution_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 852 | 2015-01-11T21:03:51.000Z | 2022-03-25T21:14:00.000Z | TopQuarkAnalysis/TopJetCombination/python/TtFullLepHypKinSolution_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 30,371 | 2015-01-02T00:14:40.000Z | 2022-03-31T23:26:05.000Z | TopQuarkAnalysis/TopJetCombination/python/TtFullLepHypKinSolution_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 3,240 | 2015-01-02T05:53:18.000Z | 2022-03-31T17:24:21.000Z | import FWCore.ParameterSet.Config as cms
#
# module to make the kinematic solution hypothesis
#
ttFullLepHypKinSolution = cms.EDProducer("TtFullLepHypKinSolution",
    electrons          = cms.InputTag("selectedPatElectrons"),
    muons              = cms.InputTag("selectedPatMuons"),
    jets               = cms.InputTag("selectedPatJets"),
    mets               = cms.InputTag("patMETs"),
    match              = cms.InputTag("kinSolutionTtFullLepEventHypothesis"),
    Neutrinos          = cms.InputTag("kinSolutionTtFullLepEventHypothesis","fullLepNeutrinos"),
    NeutrinoBars       = cms.InputTag("kinSolutionTtFullLepEventHypothesis","fullLepNeutrinoBars"),
    solutionWeight     = cms.InputTag("kinSolutionTtFullLepEventHypothesis","solWeight"),
    jetCorrectionLevel = cms.string("L3Absolute")
)
| 39.35 | 99 | 0.717916 | 56 | 787 | 10.089286 | 0.625 | 0.155752 | 0.325664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001543 | 0.17662 | 787 | 19 | 100 | 41.421053 | 0.87037 | 0.060991 | 0 | 0 | 0 | 0 | 0.375171 | 0.222374 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5292cfd908f895829e2b11e6368eeccc4e342901 | 285 | py | Python | tests/guinea-pigs/unittest/test_changes_name.py | Tirzono/teamcity-messages | e7f7334e2956a9e707222e4c83de9ffeb15b8ac0 | [
"Apache-2.0"
] | 105 | 2015-06-24T15:40:41.000Z | 2022-02-04T10:30:34.000Z | tests/guinea-pigs/unittest/test_changes_name.py | Tirzono/teamcity-messages | e7f7334e2956a9e707222e4c83de9ffeb15b8ac0 | [
"Apache-2.0"
] | 145 | 2015-06-24T15:26:28.000Z | 2022-03-22T20:04:19.000Z | tests/guinea-pigs/unittest/test_changes_name.py | Tirzono/teamcity-messages | e7f7334e2956a9e707222e4c83de9ffeb15b8ac0 | [
"Apache-2.0"
] | 76 | 2015-07-20T08:18:21.000Z | 2022-03-18T20:03:53.000Z | import unittest
from teamcity.unittestpy import TeamcityTestRunner
class Foo(unittest.TestCase):
    a = 1

    def test_aa(self):
        pass

    def shortDescription(self):
        s = str(self.a)
        self.a += 10
        return s


unittest.main(testRunner=TeamcityTestRunner())
| 14.25 | 50 | 0.666667 | 34 | 285 | 5.558824 | 0.676471 | 0.05291 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013953 | 0.245614 | 285 | 19 | 51 | 15 | 0.865116 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.1 | 0.2 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
bfeb3cf6ef4f13f92d0eb22554aad60508da4fa4 | 52 | py | Python | eogrow/__init__.py | sentinel-hub/eo-grow | 458202fc55c01cc95a17b442144ef2c935103c80 | [
"MIT"
] | 17 | 2022-01-27T11:50:23.000Z | 2022-02-12T10:01:03.000Z | eogrow/__init__.py | sentinel-hub/eo-grow | 458202fc55c01cc95a17b442144ef2c935103c80 | [
"MIT"
] | 5 | 2022-02-10T11:15:09.000Z | 2022-02-28T10:46:47.000Z | eogrow/__init__.py | sentinel-hub/eo-grow | 458202fc55c01cc95a17b442144ef2c935103c80 | [
"MIT"
] | null | null | null | """
The main init module
"""
__version__ = "1.1.0"
| 8.666667 | 21 | 0.596154 | 8 | 52 | 3.375 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0.192308 | 52 | 5 | 22 | 10.4 | 0.571429 | 0.384615 | 0 | 0 | 0 | 0 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bff343ffa167026068c12f5adf92282e0b8a9b2a | 2,243 | py | Python | src/modules/routerbox/routerboxReportsOS.py | tsuriu/integra | 94aea81852087aef37f994d4db9dece012d10fd1 | [
"MIT"
] | 1 | 2021-08-13T00:53:32.000Z | 2021-08-13T00:53:32.000Z | src/modules/routerbox/routerboxReportsOS.py | tsuriu/integra | 94aea81852087aef37f994d4db9dece012d10fd1 | [
"MIT"
] | null | null | null | src/modules/routerbox/routerboxReportsOS.py | tsuriu/integra | 94aea81852087aef37f994d4db9dece012d10fd1 | [
"MIT"
] | null | null | null | from .odbc_mysql import mysqldb
from decimal import Decimal
class RouterboxBDReports(object):
def __init__(self):
self.host = "172.31.254.41"
self.user = "squid"
self.passwd = "***root*fistel@kaua2020dbrbx"
self.schema = "isupergaus"
self.rbxdb = mysqldb(self.host, self.user, self.passwd, self.schema)
def repOS(self, groupid):
result = {}
qry_nocap = "SELECT Numero, Cliente, Usu_Abertura, Usu_Designado, Grupo_Designado, Fluxo, Topico, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_AB, ' ',Hora_AB)),0) as Abertura, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_BX, ' ',Hora_BX)),0) as BX, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_Prox, ' ',Hora_Prox)),0) as Prox, UNIX_TIMESTAMP(Data_ATU) as ATU, UNIX_TIMESTAMP(SLAInicio) as SLAInicio, SLA, SLATipo, Duracao, Situacao, (select Foto from usuarios where usuario=Usu_Abertura) as FotoAbert, (select Foto from usuarios where usuario=Usu_Designado) as FotoDesig from Atendimentos where Situacao = 'A' and Grupo_Designado = {} ORDER BY Numero ASC;".format(groupid)
qry_cap = "SELECT Numero, Cliente, Usu_Abertura, Usu_Designado, Grupo_Designado, Fluxo, Topico, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_AB, ' ',Hora_AB)),0) as Abertura, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_BX, ' ',Hora_BX)),0) as BX, TRUNCATE(UNIX_TIMESTAMP(CONCAT(Data_Prox, ' ',Hora_Prox)),0) as Prox, UNIX_TIMESTAMP(Data_ATU) as ATU, UNIX_TIMESTAMP(SLAInicio) as SLAInicio, SLA, SLATipo, Duracao, Situacao, (select Foto from usuarios where usuario=Usu_Abertura) as FotoAbert, (select Foto from usuarios where usuario=Usu_Designado) as FotoDesig from Atendimentos where Situacao != 'F' and Usu_Designado in (select usuario from usuarios where idgrupo={}) ORDER BY Numero ASC;".format(groupid)
osNocap = self.rbxdb.fetchAll(qry_nocap)
osCap = self.rbxdb.fetchAll(qry_cap)
for data in [osNocap, osCap]:
for dta in data['data']:
for dt in list(dta.keys()):
if isinstance(dta[dt], Decimal):
dta[dt] = int(dta[dt])
result['nocap'] = osNocap
result['cap'] = osCap
return result
if __name__ == '__main__':
rbx = RouterboxBDReports()
os = rbx.repOS(11)
| 59.026316 | 698 | 0.695497 | 300 | 2,243 | 5.023333 | 0.303333 | 0.086264 | 0.08361 | 0.107498 | 0.61712 | 0.61712 | 0.578633 | 0.578633 | 0.578633 | 0.578633 | 0 | 0.012101 | 0.189478 | 2,243 | 37 | 699 | 60.621622 | 0.816832 | 0 | 0 | 0 | 0 | 0.076923 | 0.604102 | 0.200624 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.076923 | 0.076923 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
bff9c45e794d4ee590261bc60152b42109a1677b | 649 | py | Python | class-example/todo.py | nadernashaatn/python-course | da5681d87cc9e9358848d0211b8c22a1512b9f42 | [
"MIT"
] | null | null | null | class-example/todo.py | nadernashaatn/python-course | da5681d87cc9e9358848d0211b8c22a1512b9f42 | [
"MIT"
] | null | null | null | class-example/todo.py | nadernashaatn/python-course | da5681d87cc9e9358848d0211b8c22a1512b9f42 | [
"MIT"
] | null | null | null | class Todo:
"""
Implement the Todo Class
"""
def __init__(self, name, description, points, completed=False):
self.name = name
self.description = description
self.points = points
self.completed = completed
def __repr__(self):
return (f" Task Name: {self.name} \n Task status: {self.completed} \n Task points: {self.points}")
class TodoList:
def __init__(self, todos):
self.todos = todos
def averge_points(self):
total = 0
for todo in self.todos:
total += todo.points
return total / len(self.todos)
| 22.37931 | 109 | 0.567026 | 73 | 649 | 4.863014 | 0.356164 | 0.101408 | 0.061972 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002315 | 0.334361 | 649 | 28 | 110 | 23.178571 | 0.819444 | 0.03698 | 0 | 0 | 0 | 0.0625 | 0.148532 | 0 | 0 | 0 | 0 | 0.035714 | 0 | 1 | 0.25 | false | 0 | 0 | 0.0625 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8705594c7f19d3db86599b60803ccba51ca08a67 | 450 | py | Python | crawlers/scholarly.py | narayana1043/g-scholar-rank | d1ceae9e2e0a5c517af8f9b6199c405b3e2310fd | [
"Apache-2.0"
] | null | null | null | crawlers/scholarly.py | narayana1043/g-scholar-rank | d1ceae9e2e0a5c517af8f9b6199c405b3e2310fd | [
"Apache-2.0"
] | null | null | null | crawlers/scholarly.py | narayana1043/g-scholar-rank | d1ceae9e2e0a5c517af8f9b6199c405b3e2310fd | [
"Apache-2.0"
] | null | null | null | from fp.fp import FreeProxy
from scholarly import scholarly
class Scholarly:
def __init__(self):
proxy = FreeProxy(rand=True, timeout=1, country_id=['US', 'CA']).get()
scholarly.use_proxy(http=proxy, https=proxy)
def get_author_details(self, name):
"""
:return:
"""
author = next(scholarly.search_author(name))
return author
def get_author_publications(self, id):
pass | 22.5 | 78 | 0.628889 | 54 | 450 | 5.037037 | 0.555556 | 0.044118 | 0.088235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002994 | 0.257778 | 450 | 20 | 79 | 22.5 | 0.811377 | 0.017778 | 0 | 0 | 0 | 0 | 0.009569 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0.090909 | 0.181818 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
870f08effff527b1225f25140e0421a9b4dc341c | 303 | py | Python | utils.py | yonlif/CitySimulator | ca0d0de41cc37ef17f22af2c1a329319d2dbbeb2 | [
"Apache-2.0"
] | null | null | null | utils.py | yonlif/CitySimulator | ca0d0de41cc37ef17f22af2c1a329319d2dbbeb2 | [
"Apache-2.0"
] | null | null | null | utils.py | yonlif/CitySimulator | ca0d0de41cc37ef17f22af2c1a329319d2dbbeb2 | [
"Apache-2.0"
] | null | null | null |
class Location:
def __init__(self, lat: float, lon: float):
self.lat = lat
self.lon = lon
def __repr__(self) -> str:
return f"({self.lat}, {self.lon})"
def str_between(input_str: str, left: str, right: str) -> str:
return input_str.split(left)[1].split(right)[0]
| 23.307692 | 62 | 0.60396 | 45 | 303 | 3.822222 | 0.422222 | 0.122093 | 0.116279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008658 | 0.237624 | 303 | 12 | 63 | 25.25 | 0.735931 | 0 | 0 | 0 | 0 | 0 | 0.07947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0.25 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
871194bf2671cc303426cf2916a1b4e3c707ad19 | 72 | py | Python | inchestocm.py | toobythedragontamer/My-work | d1d013523268cdbdcef6b67bbf493dd7b51a202f | [
"CC0-1.0"
] | null | null | null | inchestocm.py | toobythedragontamer/My-work | d1d013523268cdbdcef6b67bbf493dd7b51a202f | [
"CC0-1.0"
] | null | null | null | inchestocm.py | toobythedragontamer/My-work | d1d013523268cdbdcef6b67bbf493dd7b51a202f | [
"CC0-1.0"
] | null | null | null | cm = input('cm')
inches = float(cm) / 2.54
print(inches) | 24 | 34 | 0.652778 | 12 | 72 | 3.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.166667 | 72 | 3 | 35 | 24 | 0.716667 | 0 | 0 | 0 | 0 | 0 | 0.027397 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8712f2c035bb10fafd24c86f27cbc463f2ce0e8f | 94 | py | Python | chapter3/condel/condel/__init__.py | chris-zen/phd-thesis | 1eefdff8e7ca1910304e27ae42551dc64496b101 | [
"Unlicense"
] | 1 | 2015-12-22T00:53:18.000Z | 2015-12-22T00:53:18.000Z | chapter3/condel/condel/__init__.py | chris-zen/phd-thesis | 1eefdff8e7ca1910304e27ae42551dc64496b101 | [
"Unlicense"
] | null | null | null | chapter3/condel/condel/__init__.py | chris-zen/phd-thesis | 1eefdff8e7ca1910304e27ae42551dc64496b101 | [
"Unlicense"
] | null | null | null | VERSION="2.0-dev"
AUTHORS="Barcelona Biomedical Genomics"
AUTHORS_EMAIL="nuria.lopez@upf.edu"
| 23.5 | 39 | 0.797872 | 14 | 94 | 5.285714 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022472 | 0.053191 | 94 | 3 | 40 | 31.333333 | 0.808989 | 0 | 0 | 0 | 0 | 0 | 0.585106 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
871719d3e6dc5235af8b87adea9772adb011a473 | 101 | py | Python | credentials.sample.py | fuadajip/sentiment-analysis | 3d9fd2e29540a0b774aba3e878759ac64721149c | [
"Apache-2.0"
] | 1 | 2021-04-21T17:58:52.000Z | 2021-04-21T17:58:52.000Z | credentials.sample.py | fuadajip/sentiment-analysis | 3d9fd2e29540a0b774aba3e878759ac64721149c | [
"Apache-2.0"
] | null | null | null | credentials.sample.py | fuadajip/sentiment-analysis | 3d9fd2e29540a0b774aba3e878759ac64721149c | [
"Apache-2.0"
] | null | null | null | # Consume:
CONSUMER_KEY = ''
CONSUMER_SECRET = ''
# Access:
ACCESS_TOKEN = ''
ACCESS_SECRET = '' | 14.428571 | 20 | 0.643564 | 10 | 101 | 6.1 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19802 | 101 | 7 | 21 | 14.428571 | 0.753086 | 0.158416 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8717b33347ca0cf2a3132ed3f31931fa028e9844 | 496 | py | Python | invitation/management/commands/cleanupinvitations.py | volrath/django-invitation-backend | aaab733be53c3626efcb762f5a85793a0963e876 | [
"BSD-3-Clause"
] | 1 | 2019-06-27T13:58:43.000Z | 2019-06-27T13:58:43.000Z | invitation/management/commands/cleanupinvitations.py | volrath/django-invitation-backend | aaab733be53c3626efcb762f5a85793a0963e876 | [
"BSD-3-Clause"
] | null | null | null | invitation/management/commands/cleanupinvitations.py | volrath/django-invitation-backend | aaab733be53c3626efcb762f5a85793a0963e876 | [
"BSD-3-Clause"
] | 3 | 2015-03-29T17:45:02.000Z | 2021-05-10T15:49:40.000Z | """
A management command which deletes expired invitation keys from the database.
Calls ``Invitation.objects.delete_expired_keys()``, which contains the actual
logic for determining which accounts are deleted.
"""
from django.core.management.base import NoArgsCommand
from invitation.models import Invitation
class Command(NoArgsCommand):
help = "Delete expired invitation keys from the database"
def handle_noargs(self, **options):
Invitation.objects.delete_expired_keys()
| 27.555556 | 77 | 0.782258 | 61 | 496 | 6.278689 | 0.57377 | 0.101828 | 0.109661 | 0.130548 | 0.365535 | 0.18799 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143145 | 496 | 17 | 78 | 29.176471 | 0.901176 | 0.415323 | 0 | 0 | 0 | 0 | 0.170213 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
871a7d8fb38c3257f2c99e1726e0b23388b2e00c | 334 | py | Python | user.py | paramsingh/lazycoin | 834f58fcbc1e8fd1ff911760992589bd2671c602 | [
"MIT"
] | 14 | 2017-04-01T12:16:01.000Z | 2021-11-25T04:41:38.000Z | user.py | paramsingh/lazycoin | 834f58fcbc1e8fd1ff911760992589bd2671c602 | [
"MIT"
] | 2 | 2017-04-01T22:24:56.000Z | 2017-04-03T08:13:23.000Z | user.py | paramsingh/lazycoin | 834f58fcbc1e8fd1ff911760992589bd2671c602 | [
"MIT"
] | 8 | 2017-04-01T16:30:03.000Z | 2021-01-02T22:34:54.000Z | import rsa
import json
class LazyUser(object):
def __init__(self):
self.pub, self.priv = rsa.newkeys(512)
def sign(self, transaction):
message = json.dumps(transaction.to_dict(), sort_keys=True).encode('utf-8')
signature = rsa.sign(message, self.priv, 'SHA-256')
return (message, signature)
| 25.692308 | 83 | 0.655689 | 44 | 334 | 4.840909 | 0.659091 | 0.075117 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026515 | 0.209581 | 334 | 12 | 84 | 27.833333 | 0.780303 | 0 | 0 | 0 | 0 | 0 | 0.035928 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
871e13829b34616baf459ddcb84ea6555ac560aa | 9,875 | py | Python | deps/cndict/bundle_friso.py | rrelledge/RediSearch | 096f7cc7557c27cd6a0352a954cb75041d7c38a9 | [
"Apache-2.0",
"Ruby",
"BSD-3-Clause",
"MIT"
] | 2,098 | 2019-05-13T09:11:54.000Z | 2022-03-31T06:24:50.000Z | deps/cndict/bundle_friso.py | rrelledge/RediSearch | 096f7cc7557c27cd6a0352a954cb75041d7c38a9 | [
"Apache-2.0",
"Ruby",
"BSD-3-Clause",
"MIT"
] | 1,659 | 2019-05-13T07:55:29.000Z | 2022-03-31T02:42:57.000Z | deps/cndict/bundle_friso.py | rrelledge/RediSearch | 096f7cc7557c27cd6a0352a954cb75041d7c38a9 | [
"Apache-2.0",
"Ruby",
"BSD-3-Clause",
"MIT"
] | 227 | 2019-05-17T07:54:49.000Z | 2022-03-28T03:50:19.000Z | #!/usr/bin/env python
"""
This script gathers settings and dictionaries from friso (a chinese
tokenization library) and generates a C source file that can later be
compiled into RediSearch, allowing the module to have a built-in chinese
dictionary. By default this script will generate a C source file of
compressed data but there are other options to control output (mainly for
debugging).
The `read_friso` script can be used to analyze the dumped data for debugging
purposes
"""
import zlib
import errno
import os
import re
import struct
import sys
import time
import string
from argparse import ArgumentParser
# Load the ini file
ap = ArgumentParser()
ap.add_argument('-i', '--ini', default='friso/friso.ini',
help='ini file to use for initialization')
ap.add_argument('-m', '--mode', default='c', help='output mode',
choices=['c', 'raw_z', 'raw_u'])
ap.add_argument('-d', '--dir', default='.',
help='Override directory of lex files')
ap.add_argument('-o', '--out', help='Name of destination directory',
default='cndict_generated')
opts = ap.parse_args()
lexdir = opts.dir
DICT_VARNAME = 'ChineseDict'
SIZE_COMP_VARNAME = 'ChineseDictCompressedLength'
SIZE_FULL_VARNME = 'ChineseDictFullLength'
class ConfigEntry(object):
def __init__(self, srcname, dstname, pytype):
self.srcname = srcname
self.dstname = dstname
self.pytype = pytype
self.value = None
configs = [
ConfigEntry('max_len', 'max_len', int),
ConfigEntry('r_name', 'r_name', int),
ConfigEntry('mix_len', 'mix_len', int),
ConfigEntry('lna_len', 'lna_len', int),
ConfigEntry('add_syn', 'add_syn', int),
ConfigEntry('clr_stw', 'clr_stw', int),
ConfigEntry('keep_urec', 'keep_urec', int),
ConfigEntry('spx_out', 'spx_out', int),
ConfigEntry('nthreshold', 'nthreshold', int),
ConfigEntry('mode', 'mode', int),
ConfigEntry('charset', 'charset', int),
ConfigEntry('en_sseg', 'en_sseg', int),
ConfigEntry('st_minl', 'st_minl', int),
ConfigEntry('kpuncs', 'kpuncs', str)
]
def write_config_init(varname, configs):
ret = []
for config in configs:
if config.value is None:
continue
if config.srcname == 'mode':
ret.append('friso_set_mode({},{});'.format(varname, config.value))
elif config.dstname == 'kpuncs':
ret.append('strcpy({}->kpuncs, "{}");'.format(varname, config.value))
elif config.dstname == 'charset':
pass
# Skip
elif config.pytype == int:
ret.append('{}->{} = {};'.format(varname, config.dstname, config.value))
else:
raise ValueError("Don't understand config!", config)
return ret
def set_key_value(name, value):
for config in configs:
name = name.lower().replace("friso.", "").strip()
# print name, config.srcname
if config.srcname == name:
config.value = config.pytype(value)
return
raise ValueError('Bad config key', name)
with open(opts.ini, 'r') as fp:
for line in fp:
line = line.strip()
if not line or line.startswith('#'):
continue
key, value = line.split('=')
key = key.strip()
value = value.strip()
if key == 'friso.lex_dir':
if not lexdir:
lexdir = value
else:
set_key_value(key, value)
# Parse the header snippet in order to emit the correct constant.
_LEXTYPE_MAP_STRS = \
r'''
__LEX_CJK_WORDS__ = 0,
__LEX_CJK_UNITS__ = 1,
__LEX_ECM_WORDS__ = 2, //english and chinese mixed words.
__LEX_CEM_WORDS__ = 3, //chinese and english mixed words.
__LEX_CN_LNAME__ = 4,
__LEX_CN_SNAME__ = 5,
__LEX_CN_DNAME1__ = 6,
__LEX_CN_DNAME2__ = 7,
__LEX_CN_LNA__ = 8,
__LEX_STOPWORDS__ = 9,
__LEX_ENPUN_WORDS__ = 10,
__LEX_EN_WORDS__ = 11,
__LEX_OTHER_WORDS__ = 15,
__LEX_NCSYN_WORDS__ = 16,
__LEX_PUNC_WORDS__ = 17, //punctuations
__LEX_UNKNOW_WORDS__ = 18 //unrecognized words.
'''
LEXTYPE_MAP = {}
for m in re.findall(r'\s*(__[^=]*__)\s*=\s*([\d]*)', _LEXTYPE_MAP_STRS):
LEXTYPE_MAP[m[0]] = int(m[1])
# Lex type currently occupies
TYPE_MASK = 0x1F
F_SYNS = 0x01 << 5
F_FREQS = 0x02 << 5
class LexBuffer(object):
# Size of input buffer before flushing to a zlib block
CHUNK_SIZE = 65536
VERSION = 0
def __init__(self, fp, use_compression=True):
self._buf = bytearray()
self._fp = fp
self._compressor = zlib.compressobj(-1)
self._use_compression = use_compression
# Write the file header
self._fp.write(struct.pack("!I", self.VERSION))
self._fp.flush()
self.compressed_size = 0
self.full_size = 4 # For the 'version' byte
def _write_data(self, data):
self._fp.write(data)
self.compressed_size += len(data)
def flush(self, is_final=False):
if not self._use_compression:
self._write_data(self._buf)
else:
# Flush any outstanding data in the buffer
self._write_data(self._compressor.compress(bytes(self._buf)))
if is_final:
self._write_data(self._compressor.flush(zlib.Z_FINISH))
self._fp.flush()
self.full_size += len(self._buf)
self._buf = bytearray()
def _maybe_flush(self):
if len(self._buf) > self.CHUNK_SIZE:
self.flush()
def add_entry(self, lextype, term, syns, freq):
# Perform the encoding...
header = LEXTYPE_MAP[lextype]
if syns:
header |= F_SYNS
if freq:
header |= F_FREQS
self._buf.append(header)
self._buf += term
self._buf.append(0) # NUL terminator
if syns:
self._buf += struct.pack("!h", len(syns))
for syn in syns:
self._buf += syn
self._buf.append(0)
if freq:
self._buf += struct.pack("!I", freq)
self._maybe_flush()
def encode_pair(c):
if c in string.hexdigits:
return '\\x{0:x}'.format(ord(c))
elif c in ('"', '\\', '?'):
return '\\' + c
else:
return repr('%c' % (c,))[1:-1]
# return '\\x{0:x}'.format(ord(c)) if _needs_escape(c) else c
class SourceEncoder(object):
LINE_LEN = 40
def __init__(self, fp):
self._fp = fp
self._curlen = 0
def write(self, blob):
blob = buffer(blob)
while len(blob):
chunk = buffer(blob, 0, self.LINE_LEN)
blob = buffer(blob, len(chunk), len(blob)-len(chunk))
encoded = ''.join([encode_pair(c) for c in chunk])
self._fp.write('"' + encoded + '"\n')
return len(blob)
def flush(self):
self._fp.flush()
def close(self):
pass
def process_lex_entry(type, file, buf):
print type, file
fp = open(file, 'r')
for line in fp:
line = line.strip()
comps = line.split('/')
# print comps
term = comps[0]
syns = comps[1].split(',') if len(comps) > 1 else []
if len(syns) == 1 and syns[0].lower() == 'null':
syns = []
freq = int(comps[2]) if len(comps) > 2 else 0
buf.add_entry(type, term, syns, freq)
# print "Term:", term, "Syns:", syns, "Freq", freq
# Now dump it, somehow
def strip_comment_lines(blob):
lines = [line.strip() for line in blob.split('\n')]
lines = [line for line in lines if line and not line.startswith('#')]
return lines
def sanitize_file_entry(typestr, filestr):
typestr = strip_comment_lines(typestr)[0]
filestr = strip_comment_lines(filestr)
filestr = [f.rstrip(';') for f in filestr]
return typestr, filestr
lexre = re.compile(r'([^:]+)\w*:\w*\[([^\]]*)\]', re.MULTILINE)
lexindex = os.path.join(lexdir, 'friso.lex.ini')
lexinfo = open(lexindex, 'r').read()
matches = lexre.findall(lexinfo)
# print matches
dstdir = opts.out
if opts.mode == 'c':
dstfile = 'cndict_data.c'
else:
dstfile = 'cndict_data.out'
try:
os.makedirs(dstdir)
except OSError as e:
if e.errno != errno.EEXIST:
raise
dstfile = os.path.join(dstdir, dstfile)
ofp = open(dstfile, 'w')
if opts.mode == 'c':
ofp.write(r'''
// Compressed chinese dictionary
// Generated by {}
// at {}
#include "friso/friso.h"
#include <stdlib.h>
#include <string.h>
const char {}[] =
'''.format(' '.join(sys.argv), time.ctime(), DICT_VARNAME))
ofp.flush()
lexout = SourceEncoder(ofp)
lexbuf = LexBuffer(lexout)
for m in matches:
typestr, filestr = sanitize_file_entry(m[0], m[1])
# print typestr
# print filestr
for filename in filestr:
filename = os.path.join(os.path.dirname(lexindex), filename)
process_lex_entry(typestr, filename, lexbuf)
lexbuf.flush(is_final=True)
ofp.write(';\n')
ofp.write('const size_t {} = {};\n'.format(SIZE_COMP_VARNAME, lexbuf.compressed_size))
ofp.write('const size_t {} = {};\n'.format(SIZE_FULL_VARNME, lexbuf.full_size))
config_lines = write_config_init('frisoConfig', configs)
config_fn = '\n'.join(config_lines)
friso_config_txt = '''
void ChineseDictConfigure(friso_t friso, friso_config_t frisoConfig) {
'''
friso_config_txt += config_fn
friso_config_txt += '\n}\n'
ofp.write(friso_config_txt)
ofp.flush()
ofp.close()
# hdrfile = os.path.join(dstdir, 'cndict_data.h')
# hdrfp = open(hdrfile, 'w')
# hdrfp.write(r'''
#ifndef CNDICT_DATA_H
#define CNDICT_DATA_H
# extern const char {data_var}[];
# extern const size_t {uncomp_len_var};
# extern const size_t {comp_len_var};
# {config_fn_txt}
# #endif
# '''.format(
# data_var=DICT_VARNAME,
# uncomp_len_var=SIZE_FULL_VARNME,
# comp_len_var=SIZE_COMP_VARNAME,
# config_fn_txt=friso_config_txt
# ))
# hdrfp.flush()
| 28.214286 | 86 | 0.618228 | 1,312 | 9,875 | 4.425305 | 0.256098 | 0.031347 | 0.011195 | 0.008784 | 0.054771 | 0.038925 | 0.038925 | 0.00999 | 0 | 0 | 0 | 0.008829 | 0.243038 | 9,875 | 349 | 87 | 28.295129 | 0.767893 | 0.096 | 0 | 0.140969 | 1 | 0 | 0.126018 | 0.019348 | 0 | 0 | 0.001527 | 0 | 0 | 0 | null | null | 0.008811 | 0.039648 | null | null | 0.004405 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
87281bfe92af024794b209dfb5797780add7ab8b | 1,508 | py | Python | transaction/migrations/0001_initial.py | chromity/pawnshop-app | 582fe8d14f46d954adb3cbd9c04c2455f6a95fbd | [
"MIT"
] | null | null | null | transaction/migrations/0001_initial.py | chromity/pawnshop-app | 582fe8d14f46d954adb3cbd9c04c2455f6a95fbd | [
"MIT"
] | null | null | null | transaction/migrations/0001_initial.py | chromity/pawnshop-app | 582fe8d14f46d954adb3cbd9c04c2455f6a95fbd | [
"MIT"
] | null | null | null | # Generated by Django 2.1.1 on 2018-09-30 15:16
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='PawnTransaction',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('first_name', models.CharField(max_length=256)),
('middle_name', models.CharField(max_length=64)),
('last_name', models.CharField(max_length=128)),
('sex', models.CharField(choices=[('M', 'MALE'), ('F', 'FEMALE')], max_length=1)),
('nationality', models.CharField(max_length=256)),
('contact_number', models.CharField(max_length=64)),
('address', models.CharField(max_length=512)),
('date_time', models.DateTimeField()),
('code', models.CharField(max_length=256, unique=True)),
('description', models.TextField(max_length=2048)),
('karat', models.IntegerField(null=True)),
('grams', models.IntegerField(null=True)),
('percentage', models.FloatField(choices=[(0.04, '4%'), (0.05, '5%'), (0.06, '6%')])),
('price_value', models.FloatField()),
('number_of_days', models.FloatField()),
('a_value', models.FloatField()),
],
),
]
| 40.756757 | 114 | 0.549735 | 150 | 1,508 | 5.386667 | 0.54 | 0.100248 | 0.155941 | 0.207921 | 0.209158 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047222 | 0.28382 | 1,508 | 36 | 115 | 41.888889 | 0.700926 | 0.029841 | 0 | 0 | 1 | 0 | 0.121834 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.034483 | 0 | 0.172414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
873b745b14e2b75a79763fdf5ea994b776d78270 | 751 | py | Python | hard/python/c0056_301_remove-invalid-parentheses/00_leetcode_0056.py | drunkwater/leetcode | 8cc4a07763e71efbaedb523015f0c1eff2927f60 | [
"Ruby"
] | null | null | null | hard/python/c0056_301_remove-invalid-parentheses/00_leetcode_0056.py | drunkwater/leetcode | 8cc4a07763e71efbaedb523015f0c1eff2927f60 | [
"Ruby"
] | null | null | null | hard/python/c0056_301_remove-invalid-parentheses/00_leetcode_0056.py | drunkwater/leetcode | 8cc4a07763e71efbaedb523015f0c1eff2927f60 | [
"Ruby"
] | 3 | 2018-02-09T02:46:48.000Z | 2021-02-20T08:32:03.000Z | # DRUNKWATER TEMPLATE(add description and prototypes)
# Question Title and Description on leetcode.com
# Function Declaration and Function Prototypes on leetcode.com
#301. Remove Invalid Parentheses
#Remove the minimum number of invalid parentheses in order to make the input string valid. Return all possible results.
#Note: The input string may contain letters other than the parentheses ( and ).
#Examples:
#"()())()" -> ["()()()", "(())()"]
#"(a)())()" -> ["(a)()()", "(a())()"]
#")(" -> [""]
#Credits:
#Special thanks to @hpplayer for adding this problem and creating all test cases.
#class Solution(object):
# def removeInvalidParentheses(self, s):
# """
# :type s: str
# :rtype: List[str]
# """
# Time Is Money | 34.136364 | 119 | 0.652463 | 90 | 751 | 5.444444 | 0.722222 | 0.040816 | 0.053061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004894 | 0.183755 | 751 | 22 | 120 | 34.136364 | 0.794454 | 0.945406 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8744c053c8a2f59cb096e1d1d7ebdcccca50306e | 442 | py | Python | user43_xNecki6wqa_0.py | KuanZhasulan/Python-Games | b26f12cc5f052844c056a3922be3371acd114bc5 | [
"Apache-2.0"
] | null | null | null | user43_xNecki6wqa_0.py | KuanZhasulan/Python-Games | b26f12cc5f052844c056a3922be3371acd114bc5 | [
"Apache-2.0"
] | null | null | null | user43_xNecki6wqa_0.py | KuanZhasulan/Python-Games | b26f12cc5f052844c056a3922be3371acd114bc5 | [
"Apache-2.0"
] | null | null | null | # Conditionals Examples
# Return True if year is a leap year, false otherwise
def is_leap_year(year):
if (year % 400) == 0:
return True
elif (year % 100) == 0:
return False
elif (year % 4) == 0:
return True
else:
return False
year = 2012
leap_year = is_leap_year(year)
if leap_year:
print year, "is a leap year"
else:
print year, "is not a leap year"
| 19.217391 | 54 | 0.563348 | 63 | 442 | 3.857143 | 0.333333 | 0.230453 | 0.111111 | 0.090535 | 0.255144 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048951 | 0.352941 | 442 | 22 | 55 | 20.090909 | 0.800699 | 0.165158 | 0 | 0.4 | 0 | 0 | 0.093023 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.133333 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8758aab1c3714c91f4dd1173d9d4c023be3f749b | 138 | py | Python | mangadex/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | mangadex/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | mangadex/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | """
MangaDex API wrapper for Python
:copyright: (c), 2021 Mansuf
:license: MIT, see LICENSE for more details.
"""
__version__ = 'v0.0.1' | 17.25 | 44 | 0.695652 | 20 | 138 | 4.6 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060345 | 0.15942 | 138 | 8 | 45 | 17.25 | 0.732759 | 0.768116 | 0 | 0 | 0 | 0 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8763eb38dc3f40c354c9aeb76e6cac51e465fdfc | 252 | py | Python | django/website/publisher/views.py | aptivate/opendatacomparison | 49784acee56108e37b98744686e1f648a97bd960 | [
"MIT"
] | null | null | null | django/website/publisher/views.py | aptivate/opendatacomparison | 49784acee56108e37b98744686e1f648a97bd960 | [
"MIT"
] | null | null | null | django/website/publisher/views.py | aptivate/opendatacomparison | 49784acee56108e37b98744686e1f648a97bd960 | [
"MIT"
] | null | null | null | from django.views.generic import DetailView, ListView
from .models import Publisher
class PublisherListView(ListView):
queryset = Publisher.objects.all().order_by('country', 'name')
class PublisherDetailView(DetailView):
model = Publisher
| 21 | 66 | 0.77381 | 27 | 252 | 7.185185 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130952 | 252 | 11 | 67 | 22.909091 | 0.885845 | 0 | 0 | 0 | 0 | 0 | 0.043651 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
8771604e56cd07e6e69d1b7bd930f4f7e74538cf | 994 | py | Python | src/nes.py | psykad/pytendo | f8e1eb66b2fcae3dfb69229650eafefd7b5d63d3 | [
"MIT"
] | null | null | null | src/nes.py | psykad/pytendo | f8e1eb66b2fcae3dfb69229650eafefd7b5d63d3 | [
"MIT"
] | null | null | null | src/nes.py | psykad/pytendo | f8e1eb66b2fcae3dfb69229650eafefd7b5d63d3 | [
"MIT"
] | null | null | null | from cpu import CPU
from mmu import MMU
from ppu import PPU
from cartridge import Cartridge
import time
class NES:
def __init__(self):
self.cpu = CPU(self)
self.mmu = MMU(self)
self.ppu = PPU(self)
self.cartridge = None
self.ram = [0xFF] * 2048
self._clock = 0
def reset(self):
self.cpu.reset()
def start(self):
if (self.cartridge is None):
raise RuntimeError("No ROM loaded!")
self.cpu.reset()
# TODO: Create proper execution loop.
while True:
self.frame()
#time.sleep(0.016)
def frame(self):
self.step()
def step(self):
self.cpu.step()
def consume_cycles(self, cycles):
# Update clock with instruction cycles.
self._clock = (self._clock+cycles)&0xFFFFFFFF
self.ppu.step(cycles)
def load_cartridge(self, filename):
self.cartridge = Cartridge(filename) | 23.116279 | 53 | 0.567404 | 120 | 994 | 4.625 | 0.383333 | 0.100901 | 0.059459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016616 | 0.334004 | 994 | 43 | 54 | 23.116279 | 0.821752 | 0.090543 | 0 | 0.066667 | 0 | 0 | 0.015538 | 0 | 0 | 0 | 0.015538 | 0.023256 | 0 | 1 | 0.233333 | false | 0 | 0.166667 | 0 | 0.433333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8778653a79b57395882c239baa2288b19a4c9d43 | 912 | py | Python | second/utils/reading pickle.py | mayanks888/second.pytorch_ros | 9915bc0ed827b9e3fe8ba14eb5702a1b75e32d63 | [
"MIT"
] | 1 | 2021-09-29T12:01:22.000Z | 2021-09-29T12:01:22.000Z | second/utils/reading pickle.py | mayanks888/second.pytorch_ros | 9915bc0ed827b9e3fe8ba14eb5702a1b75e32d63 | [
"MIT"
] | null | null | null | second/utils/reading pickle.py | mayanks888/second.pytorch_ros | 9915bc0ed827b9e3fe8ba14eb5702a1b75e32d63 | [
"MIT"
] | null | null | null | import pickle
# datapath_file ='/home/mayank_sati/pycharm_projects/pytorch/second_nuscene_mayank/second/save_pkl/nuscenes_infos_train.pkl'
# datapath_file ='/home/mayank_sati/pycharm_projects/tensorflow/traffic_light_detection_classification-master/traffic_light_classification/autokeras/model_file/test_autokeras_model.pkl'
datapath_file ='/home/mayank_sati/pycharm_projects/pytorch/second.pytorch_traveller59_date_9_05/second/point_pp_nuscene/eval_results/step_140670/result.pkl'
# datapath_file ='/home/mayank_sati/Documents/point_clouds/nuscene_v1.0-mini/infos_val.pkl'
with open(datapath_file, "rb") as fp:
    boxes = pickle.load(fp)
print(1)
# import mayavi.mlab as mlab
#
# fig = mlab.figure(figure=None, bgcolor=(0, 0, 0), fgcolor=None, engine=None, size=(1000, 500))
# pcd_data = gt_points
# print(pcd_data.shape)
# # pcd_data = points
# draw_lidar(pcd_data, fig=fig)
# mlab.show() | 60.8 | 185 | 0.781798 | 131 | 912 | 5.122137 | 0.526718 | 0.089419 | 0.09538 | 0.131148 | 0.274218 | 0.274218 | 0.230999 | 0.230999 | 0.160954 | 0 | 0 | 0.029197 | 0.098684 | 912 | 15 | 186 | 60.8 | 0.787105 | 0.725877 | 0 | 0 | 0 | 0.25 | 0.592437 | 0.584034 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8779c5bc5ca9213250c4f3135843cad9ca3a624e | 560 | py | Python | Django/ML/helper.py | webclub/CFI | 122c8ae6fbde6d48398ce938fac28d40ccb6cabc | [
"Apache-2.0"
] | null | null | null | Django/ML/helper.py | webclub/CFI | 122c8ae6fbde6d48398ce938fac28d40ccb6cabc | [
"Apache-2.0"
] | null | null | null | Django/ML/helper.py | webclub/CFI | 122c8ae6fbde6d48398ce938fac28d40ccb6cabc | [
"Apache-2.0"
] | null | null | null | import datetime
import random
import numpy as np
from main import predict
def learn_and_predict(dates, attendance, date_predict):
date = []
for i in dates:
date.append([int(datetime.datetime.strptime(str(i), '%Y-%m-%d').strftime('%u'))])
Y = []
for i in attendance:
Y.append([int(i)])
X = np.asarray(date)
y = np.asarray(Y)
dt = [int(datetime.datetime.strptime(str(date_predict), '%Y-%m-%d').strftime('%u'))]
pr = np.asarray([dt])
return predict(X, y, pr)
if __name__ == '__main__':
print "ASD"
| 20 | 89 | 0.610714 | 82 | 560 | 4.02439 | 0.439024 | 0.081818 | 0.036364 | 0.163636 | 0.254545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216071 | 560 | 27 | 90 | 20.740741 | 0.751708 | 0 | 0 | 0 | 0 | 0 | 0.055357 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.222222 | null | null | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e428e5aec28226af1b032e091b687431f220a39 | 496 | py | Python | mmdet/models/necks/__init__.py | Gitgigabyte/mmd | 02cf37884d3ac9a6018656d1871695669966dfb3 | [
"Apache-2.0"
] | 1 | 2020-03-13T08:37:35.000Z | 2020-03-13T08:37:35.000Z | mmdet/models/necks/__init__.py | Gitgigabyte/mmd | 02cf37884d3ac9a6018656d1871695669966dfb3 | [
"Apache-2.0"
] | null | null | null | mmdet/models/necks/__init__.py | Gitgigabyte/mmd | 02cf37884d3ac9a6018656d1871695669966dfb3 | [
"Apache-2.0"
] | null | null | null | from .bfp import BFP
from .fpn import FPN
from .hrfpn import HRFPN
from .PAfpn import PAFPN
from .FPXN import FPXN
from .fuseFPN import FuseFPN
from .segprocess_head_neck import SemanticProcessNeck
from .SPN import SemanticPyramidNeck
from .NSPN import NSemanticPyramidNeck
from .GAS import GAS
from .Fuse import FuseNeck
__all__ = ['FPN', 'BFP', 'HRFPN', 'PAFPN', 'FPXN', 'FuseFPN',
'SemanticProcessNeck', 'SemanticPyramidNeck', 'NSemanticPyramidNeck',
'GAS', 'FuseNeck']
| 31 | 80 | 0.745968 | 58 | 496 | 6.275862 | 0.344828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16129 | 496 | 15 | 81 | 33.066667 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0.193548 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.785714 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
5e4bdca235cf4fb3e690652fd03dfc87638e77b0 | 953 | py | Python | user.py | GituMbugua/password-locker | efe5b9ed56cb103cbfa53d36a4e4bc7af45dd5fc | [
"MIT"
] | null | null | null | user.py | GituMbugua/password-locker | efe5b9ed56cb103cbfa53d36a4e4bc7af45dd5fc | [
"MIT"
] | null | null | null | user.py | GituMbugua/password-locker | efe5b9ed56cb103cbfa53d36a4e4bc7af45dd5fc | [
"MIT"
] | null | null | null | import pyperclip
import string
import secrets
global user_list
class User:
'''
class that will create accounts for users
'''
user_list = []
def __init__(self, login_name, password):
self.login_name = login_name
self.password = password
def save_user(self):
'''
save user object to user list
'''
User.user_list.append(self)
@classmethod
def user_exist(cls, name, password):
'''
authenticate user if they have an account
'''
for user in User.user_list:
if user.login_name == name and user.password == password:
return user
return False
def generate_password():
'''
function to generate a new password
'''
alphabet = string.ascii_letters + string.digits
password = ''.join(secrets.choice(alphabet) for i in range(5))
return password | 23.243902 | 70 | 0.583421 | 109 | 953 | 4.954128 | 0.458716 | 0.074074 | 0.048148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00158 | 0.335782 | 953 | 41 | 71 | 23.243902 | 0.851501 | 0.156348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0.333333 | 0.142857 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
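The `generate_password` helper in `user.py` above correctly uses `secrets` rather than `random`, which is the appropriate module for credentials. A standalone sketch of the same idea (the `length` parameter is an addition; the original hard-codes 5 characters):

```python
import secrets
import string

def generate_password(length=12):
    # secrets.choice draws from a CSPRNG, unlike random.choice
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))
```

A longer default than 5 characters is advisable in practice; a 5-character alphanumeric password has well under 30 bits of entropy.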
5e4c28e8f3111626fb3dd065ce6e00ea6bd43be5 | 1,623 | py | Python | tests/test_coordinates.py | seignovert/pyvims | a70b5b9b8bc5c37fa43b7db4d15407f312a31849 | [
"BSD-3-Clause"
] | 4 | 2019-09-16T15:50:22.000Z | 2021-04-08T15:32:48.000Z | tests/test_coordinates.py | seignovert/pyvims | a70b5b9b8bc5c37fa43b7db4d15407f312a31849 | [
"BSD-3-Clause"
] | 3 | 2018-05-04T09:28:24.000Z | 2018-12-03T09:00:31.000Z | tests/test_coordinates.py | seignovert/pyvims | a70b5b9b8bc5c37fa43b7db4d15407f312a31849 | [
"BSD-3-Clause"
] | 1 | 2020-10-12T15:14:17.000Z | 2020-10-12T15:14:17.000Z | """Test coordinates module."""
from pyvims.coordinates import salt, slat, slon, slon_e, slon_w
def test_lon():
"""Test longitude string."""
assert slon(0) == slon(.001) == slon(-.001) == slon(360) == '0°'
assert slon(180) == slon(180.001) == slon(-180) == '180°'
assert slon(90) == slon(-270) == '90°W'
assert slon(-90) == slon(270) == '90°E'
assert slon(90.01, precision=2) == '90.01°W'
assert slon(-90.01, precision=2) == '90.01°E'
def test_lon_w():
"""Test West longitude string."""
assert slon_w(0) == slon_w(.001) == '0°'
assert slon_w(180) == '180°W'
assert slon_w(90) == '90°W'
assert slon_w(270) == '270°W'
assert slon_w(360) == slon_w(360.) == '360°W'
assert slon_w(90.01, precision=2) == '90.01°W'
def test_lon_e():
"""Test east longitude string."""
assert slon_e(0) == slon_e(.001) == '0°'
assert slon_e(180) == slon_e(-180) == slon_e(180.01) == '180°'
assert slon_e(90) == '90°E'
assert slon_e(-90) == '90°W'
assert slon_e(90.01, precision=2) == '90.01°E'
assert slon_e(-90.01, precision=2) == '90.01°W'
def test_lat():
"""Test latitude string."""
assert slat(0) == slat(.001) == 'Eq.'
assert slat(90) == slat(89.999) == 'N.P.'
assert slat(-90) == slat(-89.999) == 'S.P.'
assert slat(45) == slat(45.) == '45°N'
assert slat(-45) == slat(-45.) == '45°S'
assert slat(45.01, precision=2) == '45.01°N'
assert slat(-45.01, precision=2) == '45.01°S'
def test_alt():
"""Test altitude string."""
assert salt(0) == salt(.1) == '0 km'
assert salt(100) == salt(100.1) == '100 km'
| 29.509091 | 68 | 0.557609 | 287 | 1,623 | 3.139373 | 0.146341 | 0.199778 | 0.062153 | 0.09323 | 0.539401 | 0.490566 | 0.332963 | 0.237514 | 0.234184 | 0.105438 | 0 | 0.157731 | 0.207024 | 1,623 | 54 | 69 | 30.055556 | 0.525253 | 0.090573 | 0 | 0 | 0 | 0 | 0.087889 | 0 | 0 | 0 | 0 | 0 | 0.818182 | 1 | 0.151515 | true | 0 | 0.030303 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
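The assertions in `test_coordinates.py` above fully pin down `slon`'s behavior: positive longitudes format as West, negatives as East, and 0°/180° carry no suffix. A hedged reimplementation consistent with those assertions (this is a sketch, not pyvims' actual code):

```python
def slon(lon, precision=0):
    """Format a longitude per the test file's convention (sketch)."""
    lon = ((lon + 180) % 360) - 180        # normalize to (-180, 180]
    r = round(abs(lon), precision)
    suffix = '' if r in (0, 180) else ('W' if lon > 0 else 'E')
    return f'{r:g}°{suffix}'
```

The `:g` format drops the trailing `.0` so `slon(90)` yields `'90°W'` rather than `'90.0°W'`.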
5e4f22294e9cd518d04b0ddca09a221358daf2f0 | 1,584 | py | Python | hacking/checks.py | platform9/omni-devstack-fixes | bc94150974fe181840ab3c5d618fa5ce3db44805 | [
"Apache-2.0"
] | null | null | null | hacking/checks.py | platform9/omni-devstack-fixes | bc94150974fe181840ab3c5d618fa5ce3db44805 | [
"Apache-2.0"
] | 1 | 2020-03-03T13:53:23.000Z | 2020-03-03T13:53:23.000Z | hacking/checks.py | platform9/omni-devstack-fixes | bc94150974fe181840ab3c5d618fa5ce3db44805 | [
"Apache-2.0"
] | 1 | 2020-09-03T20:54:21.000Z | 2020-09-03T20:54:21.000Z | # Copyright (c) 2017 OpenStack Foundation.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import re
"""
Guidelines for writing new hacking checks
- Use only for Omni-specific tests. OpenStack general tests
should be submitted to the common 'hacking' module.
- Pick numbers in the range O3xx. Find the current test with
the highest allocated number and then pick the next value.
If nova has an N3xx code for that test, use the same number.
- Keep the test method code in the source file ordered based
on the O3xx value.
- List the new rule in the top level HACKING.rst file
- Add test cases for each new rule to omnitests/test_hacking.py
"""
asse_trueinst_re = re.compile(
r"(.)*assertTrue\(isinstance\((\w|\.|\'|\"|\[|\])+, "
"(\w|\.|\'|\"|\[|\])+\)\)")
def assert_true_instance(logical_line):
"""Check for assertTrue(isinstance(a, b)) sentences
O316
"""
if asse_trueinst_re.match(logical_line):
yield (0, "O316: assertTrue(isinstance(a, b)) sentences not allowed")
def factory(register):
register(assert_true_instance)
| 32.326531 | 77 | 0.71654 | 236 | 1,584 | 4.762712 | 0.576271 | 0.053381 | 0.023132 | 0.02847 | 0.05516 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013932 | 0.184343 | 1,584 | 48 | 78 | 33 | 0.856037 | 0.390783 | 0 | 0 | 0 | 0 | 0.279683 | 0.166227 | 0 | 0 | 0 | 0 | 0.444444 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
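The O316 check in `hacking/checks.py` above hinges entirely on the compiled regex. A quick self-contained demonstration (pattern reassembled into raw strings, which also avoids the invalid-escape warning the non-raw second half would trigger) that it flags the discouraged form and passes the preferred one:

```python
import re

asse_trueinst_re = re.compile(
    r"(.)*assertTrue\(isinstance\((\w|\.|\'|\"|\[|\])+, "
    r"(\w|\.|\'|\"|\[|\])+\)\)")

def assert_true_instance(logical_line):
    # generator style, as hacking-framework checks are written
    if asse_trueinst_re.match(logical_line):
        yield (0, "O316: assertTrue(isinstance(a, b)) sentences not allowed")
```

`list()` around the generator makes the result easy to inspect: a non-empty list means the line was flagged.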
5e51751cd124add61435bcb1aa88de3c44a84b13 | 163 | py | Python | examples/libraries/spectralist.py | zxkjack123/pypact | 8b37f42007e0accabc9fb31d4ab76935b559d817 | [
"Apache-2.0"
] | 18 | 2018-01-22T14:00:18.000Z | 2022-03-08T06:29:22.000Z | examples/libraries/spectralist.py | listato/pypact | a418ba218cdf4a25ae3e7d72e0919905d027d2ba | [
"Apache-2.0"
] | 28 | 2018-12-07T14:30:46.000Z | 2022-02-27T20:33:06.000Z | examples/libraries/spectralist.py | listato/pypact | a418ba218cdf4a25ae3e7d72e0919905d027d2ba | [
"Apache-2.0"
] | 8 | 2018-05-29T13:41:59.000Z | 2021-01-21T01:33:41.000Z | import pypact as pp
with pp.SpectrumLibJSONReader() as lib:
manager = pp.SpectrumLibManager(lib)
for spectrum in manager.list():
print(spectrum)
| 20.375 | 40 | 0.705521 | 20 | 163 | 5.75 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208589 | 163 | 7 | 41 | 23.285714 | 0.891473 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e5252473d85d0780b3334dd888d13cf0eccf693 | 878 | py | Python | utils/Clean_news_function.py | jasonjgarcia24/global-crypto-behavior-1 | 46ec6ab32715fefb541e11f6e274f692c2f303dd | [
"MIT"
] | null | null | null | utils/Clean_news_function.py | jasonjgarcia24/global-crypto-behavior-1 | 46ec6ab32715fefb541e11f6e274f692c2f303dd | [
"MIT"
] | 6 | 2021-08-05T18:15:56.000Z | 2021-08-17T01:34:01.000Z | utils/Clean_news_function.py | jasonjgarcia24/global-crypto-behavior-1 | 46ec6ab32715fefb541e11f6e274f692c2f303dd | [
"MIT"
] | 4 | 2021-08-22T21:21:58.000Z | 2021-10-16T12:22:18.000Z | import sys
sys.path.append('.')
import pandas as pd
def date_str_func(suffix):
return f"./data/ticker-news_data_{suffix}.csv"
def data_pull(date_var):
filename = date_str_func(date_var)
data_crypto_news = pd.read_csv(filename)
data_crypto_news = data_crypto_news[data_crypto_news["tickers"].notna()]
data_crypto_news["date"] = pd.to_datetime(data_crypto_news["date"],infer_datetime_format=True, utc=True)
data_crypto_news = data_crypto_news.drop_duplicates()
data_crypto_news = data_crypto_news.drop(columns= ['type','news_id','eventid'])
return data_crypto_news
def write_df(df, date_var):
df.to_csv(date_str_func(date_var), mode="w")
if __name__ == "__main__":
for d in range(14, 22):
date_var = f"202108{d}"
crypto_news_df = data_pull(date_var)
write_df(crypto_news_df, date_var)
| 25.823529 | 108 | 0.705011 | 136 | 878 | 4.110294 | 0.375 | 0.232558 | 0.275492 | 0.128801 | 0.254025 | 0.189624 | 0.11449 | 0 | 0 | 0 | 0 | 0.013774 | 0.173121 | 878 | 33 | 109 | 26.606061 | 0.756198 | 0 | 0 | 0 | 0 | 0 | 0.100228 | 0.041002 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.1 | 0.05 | 0.35 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e611abbba142d234b552db1e9df1e6e0c79c01b | 1,422 | py | Python | problems/chapter14/Ysi/arc005_3.py | tokuma09/algorithm_problems | 58534620df73b230afbeb12de126174362625a78 | [
"CC0-1.0"
] | 1 | 2021-07-07T15:46:58.000Z | 2021-07-07T15:46:58.000Z | problems/chapter14/Ysi/arc005_3.py | tokuma09/algorithm_problems | 58534620df73b230afbeb12de126174362625a78 | [
"CC0-1.0"
] | 5 | 2021-06-05T14:16:41.000Z | 2021-07-10T07:08:28.000Z | problems/chapter14/Ysi/arc005_3.py | tokuma09/algorithm_problems | 58534620df73b230afbeb12de126174362625a78 | [
"CC0-1.0"
] | null | null | null | def main():
H, W = map(int, input().split())
C = []
for i in range(H):
row = list(input())
C.append(row)
for i in range(H):
for j in range(W):
if C[i][j] == "s":
sx = i
sy = j
if C[i][j] == "g":
gx = i
gy = j
from collections import deque
Q = deque()
Q.append((sx, sy))
dist = [[float("inf")] * W for _ in range(H)]
dist[sx][sy] = 0
while len(Q) > 0:
i, j = Q.popleft()
for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
next_i = i + dx
next_j = j + dy
if not ((0 <= next_i < H) and (0 <= next_j < W)):
continue
if dist[next_i][next_j] == float("inf"):
if C[next_i][next_j] == ".":
dist[next_i][next_j] = min(dist[next_i][next_j], dist[i][j])
Q.appendleft((next_i, next_j))
elif C[next_i][next_j] == "#":
dist[next_i][next_j] = min(dist[next_i][next_j],dist[i][j]+1)
Q.append((next_i, next_j))
elif C[next_i][next_j] == "g":
dist[next_i][next_j] = min(dist[next_i][next_j],dist[i][j])
if dist[gx][gy] <= 2:
ans = "YES"
else:
ans = "NO"
print(ans)
if __name__=='__main__':
main() | 30.255319 | 81 | 0.400844 | 208 | 1,422 | 2.5625 | 0.254808 | 0.131332 | 0.202627 | 0.225141 | 0.393996 | 0.322702 | 0.322702 | 0.322702 | 0.322702 | 0.322702 | 0 | 0.017032 | 0.421941 | 1,422 | 47 | 82 | 30.255319 | 0.631387 | 0 | 0 | 0.095238 | 0 | 0 | 0.016866 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02381 | false | 0 | 0.02381 | 0 | 0.047619 | 0.02381 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
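The search in `arc005_3.py` above is a 0-1 BFS: stepping onto an open cell (`.`) costs nothing and goes to the front of the deque, while breaking a wall (`#`) costs one and goes to the back, so the deque stays sorted by distance and `dist[gx][gy]` ends up as the minimum number of walls broken. The same technique, reduced to a self-contained helper (the function name is my own):

```python
from collections import deque

def min_walls(grid):
    """Minimum '#' cells that must be crossed from 's' to 'g' (0-1 BFS sketch)."""
    H, W = len(grid), len(grid[0])
    for i in range(H):
        for j in range(W):
            if grid[i][j] == 's':
                start = (i, j)
            if grid[i][j] == 'g':
                goal = (i, j)
    dist = [[float('inf')] * W for _ in range(H)]
    dist[start[0]][start[1]] = 0
    dq = deque([start])
    while dq:
        i, j = dq.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if not (0 <= ni < H and 0 <= nj < W):
                continue
            w = 1 if grid[ni][nj] == '#' else 0   # edge weight 0 or 1
            if dist[i][j] + w < dist[ni][nj]:
                dist[ni][nj] = dist[i][j] + w
                if w:
                    dq.append((ni, nj))      # weight-1 edge: back of deque
                else:
                    dq.appendleft((ni, nj))  # weight-0 edge: front of deque
    return dist[goal[0]][goal[1]]
```

The original solution then answers YES when this distance is at most 2.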
5e7a74f82a7b6160e942893b7a29ce4b45674a85 | 502 | py | Python | api/tests/opentrons/hardware_control/integration/conftest.py | Corey-ONeal/opentrons-app_ws-remote | a255b76c8a07457787d575da12b2d5bdb6220a91 | [
"Apache-2.0"
] | null | null | null | api/tests/opentrons/hardware_control/integration/conftest.py | Corey-ONeal/opentrons-app_ws-remote | a255b76c8a07457787d575da12b2d5bdb6220a91 | [
"Apache-2.0"
] | null | null | null | api/tests/opentrons/hardware_control/integration/conftest.py | Corey-ONeal/opentrons-app_ws-remote | a255b76c8a07457787d575da12b2d5bdb6220a91 | [
"Apache-2.0"
] | null | null | null | import pytest
import threading
import asyncio
from opentrons.hardware_control.emulation.app import run
@pytest.fixture(scope="session")
def emulation_app():
"""Run the emulators"""
def runit():
asyncio.run(run())
# TODO 20210219
# The emulators must be run in a separate thread because our serial
# drivers block the main thread. Remove this thread when that is no
# longer true.
t = threading.Thread(target=runit)
t.daemon = True
t.start()
yield t
| 25.1 | 72 | 0.689243 | 69 | 502 | 4.985507 | 0.652174 | 0.069767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020566 | 0.2251 | 502 | 19 | 73 | 26.421053 | 0.863753 | 0.358566 | 0 | 0 | 0 | 0 | 0.022364 | 0 | 0 | 0 | 0 | 0.052632 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
5e804c1b152a4ded18aa30f3c4305b2d71ffaa50 | 1,774 | py | Python | Day 24/DrawingBook.py | sandeep-krishna/100DaysOfCode | af4594fb6933e4281d298fa921311ccc07295a7c | [
"MIT"
] | null | null | null | Day 24/DrawingBook.py | sandeep-krishna/100DaysOfCode | af4594fb6933e4281d298fa921311ccc07295a7c | [
"MIT"
] | null | null | null | Day 24/DrawingBook.py | sandeep-krishna/100DaysOfCode | af4594fb6933e4281d298fa921311ccc07295a7c | [
"MIT"
] | null | null | null | '''
A teacher asks the class to open their books to a given page number. A student can either start turning pages from the front of the book or from the back of the book. They always turn pages one at a time. When they open the book, page 1 is always on the right side:
When they flip page 1, they see pages 2 and 3. Each page except the last page will always be printed on both sides. The last page may only be printed on the front, given the length of the book. If the book is n pages long, and a student wants to turn to page p, what is the minimum number of pages to turn? They can start at the beginning or the end of the book.
Given n and p, find and print the minimum number of pages that must be turned in order to arrive at page p.
Using the diagram above, if the student wants to get to page 3, they open the book to page 1, flip page 1 and they are on the correct page. If they open the book to the last page, page 5, they turn page 5 and are at the correct page. Return 1.
Function Description
Complete the pageCount function in the editor below.
pageCount has the following parameter(s):
int n: the number of pages in the book
int p: the page number to turn to
Returns
int: the minimum number of pages to turn
Input Format
The first line contains an integer , the number of pages in the book.
The second line contains an integer, , the page to turn to.
Sample Input 0
6
2
Sample Output 0
1
'''
#!/bin/python3
import os
import sys
#
# Complete the pageCount function below.
#
def pageCount(n, p):
#
# Write your code here.
return min(p//2, n//2 - p//2)
if __name__ == '__main__':
fptr = open(os.environ['OUTPUT_PATH'], 'w')
n = int(input())
p = int(input())
result = pageCount(n, p)
fptr.write(str(result) + '\n')
fptr.close()
| 30.586207 | 357 | 0.718151 | 323 | 1,774 | 3.916409 | 0.371517 | 0.055336 | 0.051383 | 0.035573 | 0.166008 | 0.085375 | 0.085375 | 0 | 0 | 0 | 0 | 0.006508 | 0.220406 | 1,774 | 57 | 358 | 31.122807 | 0.908171 | 0.82469 | 0 | 0 | 0 | 0 | 0.073826 | 0 | 0 | 0 | 0 | 0.017544 | 0 | 1 | 0.090909 | false | 0 | 0.181818 | 0.090909 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e88c415b780584efc174063f84bfc10c22db2b3 | 7,286 | py | Python | code/figures/8.0-plot-fig-S5.py | edunnsigouin/ds21grl | b6544cbc97529943da86e48a437ce68dc00e0f82 | [
"MIT"
] | 1 | 2021-04-08T18:13:47.000Z | 2021-04-08T18:13:47.000Z | code/figures/8.0-plot-fig-S5.py | edunnsigouin/ds21grl | b6544cbc97529943da86e48a437ce68dc00e0f82 | [
"MIT"
] | null | null | null | code/figures/8.0-plot-fig-S5.py | edunnsigouin/ds21grl | b6544cbc97529943da86e48a437ce68dc00e0f82 | [
"MIT"
] | 1 | 2021-06-10T14:48:04.000Z | 2021-06-10T14:48:04.000Z | """
Plots figure S5:
yt correlation of zonal-mean downward long wave radiation at the surface (top)
and longwave cloud radiative forcing at the surface (bottom)
with vertically and zonally integrated eddy moisture transport at 70N for
(left) reanalysis data (left) and aquaplanet control simulation data (right).
"""
import numpy as np
import xarray as xr
from matplotlib import pyplot as plt
from ds21grl import dim_aqua,dim_erai
from ds21grl.config import data_name,dir_interim,dir_fig
# INPUT -----------------------------------------------------------
write2file = 1
# -----------------------------------------------------------------
# read correlation data for downwar longwave radiation
filename1 = dir_interim + data_name[0] + '/yt_corr_nbs5000_VQ_k1-40_70N_with_zm_FLDS_sfc_' + dim_erai.timestamp + '.nc'
filename2 = dir_interim + data_name[1] + '/yt_corr_nbs5000_VQ_k1-40_70N_with_zm_FLDS_sfc_' + dim_aqua.timestamp + '.nc'
ds1 = xr.open_dataset(filename1)
ds2 = xr.open_dataset(filename2)
corr_erai_1 = ds1['corr'].values
corr_aqua_1 = ds2['corr'].values
sig_erai_1 = ds1['sig'].values
sig_aqua_1 = ds2['sig'].values
lag = ds1['lag'].values
ds1.close()
ds2.close()
# read correlation data for longwave cloud forcing
filename1 = dir_interim + data_name[0] + '/yt_corr_nbs5000_VQ_k1-40_70N_with_zm_LWCFS_sfc_' + dim_erai.timestamp + '.nc'
filename2 = dir_interim + data_name[1] + '/yt_corr_nbs5000_VQ_k1-40_70N_with_zm_LWCFS_sfc_' + dim_aqua.timestamp + '.nc'
ds1 = xr.open_dataset(filename1)
ds2 = xr.open_dataset(filename2)
corr_erai_2 = ds1['corr'].values
corr_aqua_2 = ds2['corr'].values
sig_erai_2 = ds1['sig'].values
sig_aqua_2 = ds2['sig'].values
lag = ds1['lag'].values
ds1.close()
ds2.close()
# Plot
corr_aqua_1 = np.flip(np.swapaxes(corr_aqua_1,0,1),axis=1)
corr_aqua_2 = np.flip(np.swapaxes(corr_aqua_2,0,1),axis=1)
corr_erai_1 = np.flip(np.swapaxes(corr_erai_1,0,1),axis=1)
corr_erai_2 = np.flip(np.swapaxes(corr_erai_2,0,1),axis=1)
sig_aqua_1 = np.flip(np.swapaxes(sig_aqua_1,0,1),axis=1)
sig_aqua_2 = np.flip(np.swapaxes(sig_aqua_2,0,1),axis=1)
sig_erai_1 = np.flip(np.swapaxes(sig_erai_1,0,1),axis=1)
sig_erai_2 = np.flip(np.swapaxes(sig_erai_2,0,1),axis=1)
clevs = np.arange(-0.45,0.50,0.05)
cmap = 'RdBu_r'
figsize = np.array([11,8])
fontsize = 12
fig,axes = plt.subplots(nrows=2,ncols=2,figsize=(figsize[0],figsize[1]))
axes = axes.ravel()
plt.subplots_adjust(hspace=0.3,wspace=0.225,left=0.1,right=0.975,top=0.9,bottom=0.1)
# erai FLDS
p = axes[0].contourf(lag,dim_erai.lat,corr_erai_1,levels=clevs,cmap=cmap,extend='both')
axes[0].contourf(lag,dim_erai.lat,sig_erai_1,levels=1,colors='none',hatches=["", "///"])
plt.rcParams['hatch.linewidth'] = 0.25
axes[0].set_yticks(np.arange(0,100,10))
axes[0].set_xticks(np.arange(lag[0],lag[-1]+1,1))
axes[0].set_yticklabels([0,'','',30,'','',60,'','',90],fontsize=fontsize)
axes[0].set_xticklabels([-10,'','','','',-5,'','','','',0,'','','','',5,'','','','',10],fontsize=fontsize)
axes[0].set_ylabel('latitude',fontsize=fontsize)
axes[0].set_xlabel('lag (days)',fontsize=fontsize)
axes[0].set_title('(a)',fontsize=fontsize)
axes[0].set_ylim([0,90])
axes[0].set_xlim([lag[0],lag[-1]])
cb = fig.colorbar(p, ax=axes[0], orientation='vertical',ticks=clevs[1::4],pad=0.025,aspect=15)
cb.ax.set_title('[r]',fontsize=fontsize)
cb.ax.tick_params(labelsize=fontsize,size=0)
# erai LWCF_sfc
p = axes[2].contourf(lag,dim_erai.lat,corr_erai_2,levels=clevs,cmap=cmap,extend='both')
axes[2].contourf(lag,dim_erai.lat,sig_erai_2,levels=1,colors='none',hatches=["", "///"])
plt.rcParams['hatch.linewidth'] = 0.25
axes[2].set_yticks(np.arange(0,100,10))
axes[2].set_xticks(np.arange(lag[0],lag[-1]+1,1))
axes[2].set_yticklabels([0,'','',30,'','',60,'','',90],fontsize=fontsize)
axes[2].set_xticklabels([-10,'','','','',-5,'','','','',0,'','','','',5,'','','','',10],fontsize=fontsize)
axes[2].set_ylabel('latitude',fontsize=fontsize)
axes[2].set_xlabel('lag (days)',fontsize=fontsize)
axes[2].set_title('(c)',fontsize=fontsize)
axes[2].set_ylim([0,90])
axes[2].set_xlim([lag[0],lag[-1]])
cb = fig.colorbar(p, ax=axes[2], orientation='vertical',ticks=clevs[1::4],pad=0.025,aspect=15)
cb.ax.set_title('[r]',fontsize=fontsize)
cb.ax.tick_params(labelsize=fontsize,size=0)
# aqua FLDS
p = axes[1].contourf(lag,dim_aqua.lat,corr_aqua_1,levels=clevs,cmap=cmap,extend='both')
axes[1].contourf(lag,dim_aqua.lat,sig_aqua_1,levels=1,colors='none',hatches=["", "///"])
plt.rcParams['hatch.linewidth'] = 0.25
axes[1].set_yticks(np.arange(0,100,10))
axes[1].set_xticks(np.arange(lag[0],lag[-1]+1,1))
axes[1].set_yticklabels([0,'','',30,'','',60,'','',90],fontsize=fontsize)
axes[1].set_xticklabels([-10,'','','','',-5,'','','','',0,'','','','',5,'','','','',10],fontsize=fontsize)
axes[1].set_ylabel('latitude',fontsize=fontsize)
axes[1].set_xlabel('lag (days)',fontsize=fontsize)
axes[1].set_title('(b)',fontsize=fontsize)
axes[1].set_ylim([0,90])
axes[1].set_xlim([lag[0],lag[-1]])
cb = fig.colorbar(p, ax=axes[1], orientation='vertical',ticks=clevs[1::4],pad=0.025,aspect=15)
cb.ax.set_title('[r]',fontsize=fontsize)
cb.ax.tick_params(labelsize=fontsize,size=0)
# aqua LWCF_sfc
p = axes[3].contourf(lag,dim_aqua.lat,corr_aqua_2,levels=clevs,cmap=cmap,extend='both')
axes[3].contourf(lag,dim_aqua.lat,sig_aqua_2,levels=1,colors='none',hatches=["", "///"])
plt.rcParams['hatch.linewidth'] = 0.25
axes[3].set_yticks(np.arange(0,100,10))
axes[3].set_xticks(np.arange(lag[0],lag[-1]+1,1))
axes[3].set_yticklabels([0,'','',30,'','',60,'','',90],fontsize=fontsize)
axes[3].set_xticklabels([-10,'','','','',-5,'','','','',0,'','','','',5,'','','','',10],fontsize=fontsize)
axes[3].set_ylabel('latitude',fontsize=fontsize)
axes[3].set_xlabel('lag (days)',fontsize=fontsize)
axes[3].set_title('(d)',fontsize=fontsize)
axes[3].set_ylim([0,90])
axes[3].set_xlim([lag[0],lag[-1]])
cb = fig.colorbar(p, ax=axes[3], orientation='vertical',ticks=clevs[1::4],pad=0.025,aspect=15)
cb.ax.set_title('[r]',fontsize=fontsize)
cb.ax.tick_params(labelsize=fontsize,size=0)
# extra labels
axes[0].text(-0.19,0.5,r'corr $\langle [v^{*} q^{*}] \rangle_{70N}$ & $[DLWR]_{sfc}$',fontsize=fontsize*1.25,fontweight='normal',\
va='bottom', ha='center',rotation='vertical',rotation_mode='anchor',transform=axes[0].transAxes)
axes[0].text(0.5,1.1,'reanalysis',fontsize=fontsize*1.25,fontweight='normal', va='bottom', ha='center',rotation='horizontal',\
rotation_mode='anchor',transform=axes[0].transAxes)
axes[1].text(0.5,1.1,'control',fontsize=fontsize*1.25,fontweight='normal',va='bottom', ha='center',rotation='horizontal',\
rotation_mode='anchor',transform=axes[1].transAxes)
axes[2].text(-0.19,0.5,r'corr $\langle [v^{*} q^{*}] \rangle_{70N}$ & $[LWCRF]_{sfc}$',fontsize=fontsize*1.25,fontweight='normal',\
va='bottom', ha='center',rotation='vertical',rotation_mode='anchor',transform=axes[2].transAxes)
if write2file == 1:
plt.savefig(dir_fig + 'fig_S5.pdf')
plt.show()
| 46.113924 | 131 | 0.656053 | 1,175 | 7,286 | 3.905532 | 0.162553 | 0.097625 | 0.087165 | 0.027893 | 0.806276 | 0.740031 | 0.638919 | 0.552626 | 0.493354 | 0.457616 | 0 | 0.065448 | 0.106643 | 7,286 | 157 | 132 | 46.407643 | 0.639576 | 0.089212 | 0 | 0.192982 | 0 | 0.017544 | 0.113647 | 0.028714 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04386 | 0 | 0.04386 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e9e87a9fd58b451185c65c43b2b61efd3f695fd | 1,033 | py | Python | rpass/__main__.py | surajkarki66/rpass | 0d6d91007cebb4e00b42ef9cb08bfe7ee7031237 | [
"MIT"
] | 1 | 2021-07-24T04:22:51.000Z | 2021-07-24T04:22:51.000Z | rpass/__main__.py | surajkarki66/rpass | 0d6d91007cebb4e00b42ef9cb08bfe7ee7031237 | [
"MIT"
] | null | null | null | rpass/__main__.py | surajkarki66/rpass | 0d6d91007cebb4e00b42ef9cb08bfe7ee7031237 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import os
import click
import pyperclip
from rpass.utils.create_password import generate_password
CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
@click.command(context_settings=CONTEXT_SETTINGS)
@click.version_option(version="0.0.2")
@click.option(
"-l", "--length", type=int, default=12, help="Password length(default: 12)"
)
@click.option("-ns", "--no-symbols", is_flag=True, help="Do not use symbols")
@click.option("-nd", "--no-digits", is_flag=True, help="Do not use digits")
@click.option("--upper", is_flag=True, help="Only use upper case letters")
@click.option("--lower", is_flag=True, help="Only use lower case letters")
def main(**kwargs):
"""Welcome to rpass! A strong random password generator cli utility tool!"""
password = generate_password(**kwargs)
click.echo(click.style(f"Your password is {password}", bold=True))
pyperclip.copy(password)
click.echo(click.style("Saved! to clipboard", fg="green"))
pass
if __name__ == "__main__":
main()
| 30.382353 | 80 | 0.704743 | 148 | 1,033 | 4.777027 | 0.486486 | 0.077793 | 0.056577 | 0.079208 | 0.121641 | 0.121641 | 0.062235 | 0 | 0 | 0 | 0 | 0.007735 | 0.123911 | 1,033 | 33 | 81 | 31.30303 | 0.773481 | 0.088093 | 0 | 0 | 1 | 0 | 0.258547 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0.272727 | 0.181818 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
5ea3228abed648307f151eab63fdf5047c1e43e4 | 1,083 | py | Python | lib/log.py | sergei-dyshel/tmux-clost | 699723565cfecc2ef2d6ededb258d57d467792f4 | [
"MIT"
] | null | null | null | lib/log.py | sergei-dyshel/tmux-clost | 699723565cfecc2ef2d6ededb258d57d467792f4 | [
"MIT"
] | null | null | null | lib/log.py | sergei-dyshel/tmux-clost | 699723565cfecc2ef2d6ededb258d57d467792f4 | [
"MIT"
] | null | null | null | import logging
import sys
_logger = logging.getLogger()
def configure(log_file=None, level=logging.DEBUG):
if not log_file or log_file == '-' or log_file == 'stdout':
formatter = logging.Formatter('%(message)s')
handler = logging.StreamHandler(sys.stderr)
else:
formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
handler = logging.FileHandler(log_file)
handler.setFormatter(formatter)
_logger.addHandler(handler)
_logger.setLevel(level)
warning('========================================================')
def debug(msg, *args, **kwargs):
return _log(logging.DEBUG, msg, *args, **kwargs)
def info(msg, *args, **kwargs):
return _log(logging.INFO, msg, *args, **kwargs)
def warning(msg, *args, **kwargs):
return _log(logging.WARNING, msg, *args, **kwargs)
def error(msg, *args, **kwargs):
return _log(logging.ERROR, msg, *args, **kwargs)
def _log(level, msg, *args, **kwargs):
full_msg = msg.format(*args, **kwargs) if args or kwargs else msg
_logger.log(level, full_msg)
| 27.075 | 78 | 0.630656 | 132 | 1,083 | 5.05303 | 0.287879 | 0.149925 | 0.175412 | 0.113943 | 0.211394 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172669 | 1,083 | 39 | 79 | 27.769231 | 0.74442 | 0 | 0 | 0 | 0 | 0 | 0.102493 | 0.051708 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0.08 | 0.16 | 0.48 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
5ea53b19d47371b65a141712722feda58a513224 | 176 | py | Python | ex051.py | pepev123/PythonEx | 8f39751bf87a9099d7b733aa829988595dab2344 | [
"MIT"
] | null | null | null | ex051.py | pepev123/PythonEx | 8f39751bf87a9099d7b733aa829988595dab2344 | [
"MIT"
] | null | null | null | ex051.py | pepev123/PythonEx | 8f39751bf87a9099d7b733aa829988595dab2344 | [
"MIT"
] | null | null | null | pt = int(input('Digite o primeiro termo da PA: '))
r = int(input('Digite a razão : '))
s = 0
for c in range(0, 11):
s = pt + (c * r)
print('A{} = {}'.format(c + 1, s)) | 29.333333 | 50 | 0.517045 | 33 | 176 | 2.757576 | 0.666667 | 0.175824 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038168 | 0.255682 | 176 | 6 | 51 | 29.333333 | 0.656489 | 0 | 0 | 0 | 0 | 0 | 0.316384 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
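The loop in `ex051.py` prints the first 11 terms of an arithmetic progression using the closed form a_k = a_1 + k·r. The same terms as a list, without interactive input (helper name is mine):

```python
def pa_terms(first, diff, count=11):
    # k-th term (0-based) of an arithmetic progression: first + k * diff
    return [first + k * diff for k in range(count)]
```

With `first=2` and `diff=3` this reproduces the sequence the script would print: 2, 5, 8, 11, and so on.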
5ec5bff68ba343e8e74d637b819a3106942edcbc | 455 | py | Python | main.py | mjalkio/parallel-python-tutorial | 86e77cb6b1e0e21aa2b09270c249e232761f78e5 | [
"MIT"
] | null | null | null | main.py | mjalkio/parallel-python-tutorial | 86e77cb6b1e0e21aa2b09270c249e232761f78e5 | [
"MIT"
] | null | null | null | main.py | mjalkio/parallel-python-tutorial | 86e77cb6b1e0e21aa2b09270c249e232761f78e5 | [
"MIT"
] | null | null | null | """TODO."""
import os
import pandas as pd
from dotenv import load_dotenv
from facebook_client import FacebookClient
load_dotenv()
fb = FacebookClient(access_token=os.getenv('FACEBOOK_ACCESS_TOKEN'))
nonprofit_df = pd.read_csv('nonprofit_facebook.csv')
nonprofit_df['fan_count'] = nonprofit_df['facebook'].map(fb.get_page_fan_count)
nonprofit_df['about'] = nonprofit_df['facebook'].map(fb.get_page_about)
nonprofit_df.to_csv('output.csv', index=False)
| 28.4375 | 79 | 0.795604 | 68 | 455 | 5.014706 | 0.441176 | 0.193548 | 0.099707 | 0.111437 | 0.181818 | 0.181818 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0.074725 | 455 | 15 | 80 | 30.333333 | 0.809976 | 0.010989 | 0 | 0 | 0 | 0 | 0.186937 | 0.096847 | 0 | 0 | 0 | 0.066667 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0d62c0c732aab7f1a0ff9ecebf4569c07f75adc9 | 980 | py | Python | qutipy/general_functions/unitary_distance.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | 19 | 2020-11-11T13:00:22.000Z | 2022-03-14T11:18:04.000Z | qutipy/general_functions/unitary_distance.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | null | null | null | qutipy/general_functions/unitary_distance.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | 1 | 2022-03-03T15:20:15.000Z | 2022-03-03T15:20:15.000Z | '''
This code is part of QuTIpy.
(c) Copyright Sumeet Khatri, 2021
This code is licensed under the Apache License, Version 2.0. You may
obtain a copy of this license in the LICENSE.txt file in the root directory
of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
Any modifications or derivative works of this code must retain this
copyright notice, and modified files need to carry a notice indicating
that they have been altered from the originals.
'''
import numpy as np
from qutipy.general_functions import dag,Tr
def unitary_distance(U,V):
'''
Checks whether two unitaries U and V are the same (taking into account global phase) by using the distance measure:
1-(1/d)*|Tr[UV^†]|,
where d is the dimension of the space on which the unitaries act.
U and V are the same if and only if this is equal to zero; otherwise, it is greater than zero.
'''
d=U.shape[0]
return 1-(1/d)*np.abs(Tr(U@dag(V))) | 29.69697 | 119 | 0.711224 | 173 | 980 | 4.023121 | 0.612717 | 0.034483 | 0.028736 | 0.022989 | 0.043103 | 0.043103 | 0 | 0 | 0 | 0 | 0 | 0.016861 | 0.213265 | 980 | 33 | 120 | 29.69697 | 0.884566 | 0.782653 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
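As the docstring above states, the measure vanishes exactly when U and V agree up to a global phase. A self-contained sanity check, with local stand-ins for `dag` and `Tr` (the real helpers live in `qutipy.general_functions`):

```python
import numpy as np

def dag(X):
    return X.conj().T

def Tr(X):
    return np.trace(X)

def unitary_distance(U, V):
    # 1 - (1/d) * |Tr[U V^dagger]|, zero iff U = e^{i phi} V
    d = U.shape[0]
    return 1 - (1 / d) * np.abs(Tr(U @ dag(V)))

I = np.eye(2)
phase = np.exp(1j * 0.7) * I          # identity up to a global phase
X = np.array([[0, 1], [1, 0]])        # Pauli X: Tr[I X^dagger] = 0
```

The global phase drops out because only the modulus of the trace enters, which is precisely why this distance is preferred over `norm(U - V)` for comparing gates.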
0d65df46c2e0a9418cc23e22ff2135213963baa4 | 121 | py | Python | output/models/ms_data/regex/re_k70_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 1 | 2021-08-14T17:59:21.000Z | 2021-08-14T17:59:21.000Z | output/models/ms_data/regex/re_k70_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 4 | 2020-02-12T21:30:44.000Z | 2020-04-15T20:06:46.000Z | output/models/ms_data/regex/re_k70_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | null | null | null | from output.models.ms_data.regex.re_k70_xsd.re_k70 import (
Regex,
Doc,
)
__all__ = [
"Regex",
"Doc",
]
| 12.1 | 59 | 0.603306 | 17 | 121 | 3.823529 | 0.705882 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043956 | 0.247934 | 121 | 9 | 60 | 13.444444 | 0.67033 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d66ec2c7785e9dfea830ef490f1edb10189c3b1 | 6,959 | py | Python | Template Generator.py | Tripwire-VERT/Protocol-Independent-Fuzzer | 0389e41940c8e5615f4ab6097f14d2527fdb01b7 | [
"BSD-2-Clause"
] | 8 | 2015-10-13T08:47:39.000Z | 2022-02-09T04:50:55.000Z | Template Generator.py | Tripwire-VERT/Protocol-Independent-Fuzzer | 0389e41940c8e5615f4ab6097f14d2527fdb01b7 | [
"BSD-2-Clause"
] | null | null | null | Template Generator.py | Tripwire-VERT/Protocol-Independent-Fuzzer | 0389e41940c8e5615f4ab6097f14d2527fdb01b7 | [
"BSD-2-Clause"
] | 8 | 2015-02-23T21:09:13.000Z | 2021-04-22T12:23:55.000Z | #################################################################
# Generates Template files for fuzzer.py
# Does a couple of things so far:
# 1. flips bytes
# 2. minimizes and maximizes values
# TODO add more functionality and ways to fuzz.
#################################################################
import re
import os
import sys
import struct
from optparse import OptionParser
template = open('test.template.rdp', 'r').read()
t_spit = template.split('\ndef')[1:]
template_dict = dict()
def resolveArgs():
'''resolve the input args'''
usage = "usage: %prog [options]"
parser = OptionParser(usage=usage)
parser.add_option('--max', dest='max', type='string', help='The amount of fuzzed packets.')
(opts, args) = parser.parse_args()
return opts
def remove_white_space(data):
variable_white_space = re.findall('\' +\+ +\'', data)
for space in variable_white_space:
data = data.replace(space,'').replace('\' + \'', '')
return data
def parse_field(data, field):
data = data[data.find(field):]
data_variable = data[:data.find('=')].strip()
data_var_values = data[data.find('['):data.find(']')+1]
return data_var_values
def grab_default_values(data, variables):
variable_dict = dict()
for variable in variables:
test = re.search('%s += (.+)' % variable, data)
variable_dict[variable] = remove_white_space(test.group(1))
return variable_dict
def end_of_function(data, variables):
data = data[data.find('length_fields'):]
data = data[data.find('\n'):]
data = data[data.find('%s ' %variables[-1]):]
data = data[data.find('\n'):]
return data
def list_2_dict(variables_length):
temp = dict()
for x in variables_length:
temp[x[0]] = x[1]
return temp
for x in t_spit:
function_name = 'def%s' % x[:x.find(':')+1]
template_dict[function_name] = dict()
variables = parse_field(x.replace(' \\\n', ''), 'variables')
variables_length = parse_field(x.replace(' \\\n', ''), 'variables_length')
length_fields = parse_field(x.replace(' \\\n', ''), 'length_fields')
exec '%s = %s' % ('variables',variables)
exec '%s = %s' % ('variables_length',variables_length)
exec '%s = %s' % ('length_fields',length_fields)
variables_length = list_2_dict(variables_length)
default_values = grab_default_values(x.replace(' \\\n', ''), variables)
default_values['variables'] = variables
default_values['variables_length'] = variables_length
default_values['length_fields'] = length_fields
default_values['build'] = end_of_function(x, variables)
template_dict[function_name] = default_values
packets = ['def generateX224Request():', 'def generateErectDomain():', 'def generateMCSAttachUser():', 'def channelJoinRequest(channel):', 'def generateMCSRequest( encryption_type = 16):', 'def createClientInfo():', 'def confirmActive():', 'def macSignature(crypto, data):']
def flip_byte(f_bytes):
if len(f_bytes) == 1:
byte_val = 0xFF
byte_len = 'B'
elif len(f_bytes) == 2:
byte_val = 0xFFFF
byte_len = 'H'
elif len(f_bytes) == 4:
byte_val = 0xFFFFFFFF
byte_len = '>L'
byte_value = byte_val - struct.unpack(byte_len, f_bytes)[0]
flipped = struct.pack(byte_len, byte_value)
return flipped
def byte_flip(f_bytes, adding, var):
flip_b_dict = dict()
count = 0
temp = []
while (count <= len(f_bytes)-adding) and len(f_bytes) >= adding:
temp_byte = f_bytes[count:count+adding]
temp.append('%s%s%s' % (f_bytes[:count], flip_byte(temp_byte), f_bytes[count+adding:]))
count += adding
if len(temp) > 0:
flip_b_dict[var] = temp
else:
flip_b_dict[var] = None
return flip_b_dict
def fuzz_variable(packet, var, p_dict):
try:
exec '%s = %s' % ('p_var',p_dict[packet][var])
# need to parse the packet and flip bits, bytes and endianness
flip_b_1_dict = byte_flip(p_var, 1, var)
flip_b_2_dict = byte_flip(p_var, 2, var)
flip_b_4_dict = byte_flip(p_var, 4, var)
min_max_dict = min_max(len(p_var), var)
build_packet(packet, flip_b_1_dict, p_dict, '1_byte')
build_packet(packet, flip_b_2_dict, p_dict, '2_byte')
build_packet(packet, flip_b_4_dict, p_dict, '4_byte')
build_packet(packet, min_max_dict, p_dict, 'min_max')
except NameError:
exec '%s = %s' % ('l_var',p_dict[packet]['variables_length'])
min_max_dict = min_max(l_var[var], var)
build_packet(packet, min_max_dict, p_dict, 'min_max')
return 0
def get_original(packet_name, p_dict):
packets = []
for packet in p_dict:
con_packet = []
if packet == packet_name:
continue
con_packet.append(packet)
for var in p_dict[packet]['variables']:
con_packet.append(' %s = %s' % (var, p_dict[packet][var]))
f_packet = '\n'.join(con_packet) + p_dict[packet]['build']
packets.append(f_packet)
return '\n'.join(packets)
def min_max(len_var, var, both = True, value = ''):
min_max = dict()
temp = ['\x00' * len_var, '\xff' * len_var]
min_max[var] = temp
return min_max
def build_packet(packet_name, f_dict, p_dict, version):
vars = p_dict[packet_name]['variables']
count = 0
packets = get_original(packet_name, p_dict)
for var in f_dict:
try:
for x in f_dict[var]:
temp = ''
con_packet = []
con_packet.append(packet_name)
for vars in variables:
if vars == var:
con_packet.append(' %s = %s' % (vars, repr(x)))
else:
con_packet.append(' %s = %s' % (vars, p_dict[packet_name][vars]))
packet = '\n'.join(con_packet) + p_dict[packet_name]['build']
count += 1
temp = 'import struct\n' + packets + '\n\n' + packet
write_packet_2_disk(packet_name, temp, version, var, count)
except TypeError:
continue
def write_packet_2_disk(packet_name, packet, version, var, count):
f_name = packet_name.replace('def ', '').replace('():', '').strip()
dir = 'rdp_templates'
if not((os.path.isdir(dir))):
os.mkdir(dir)
print '%s_%s_%s.txt' % (var, version, count)
file = open('%s/%s_%s_%s.template.rdp' % (dir, var, version, count), 'wb')
file.write(packet)
opts = resolveArgs()
for packet in packets:
if opts.max != None:
max_count = int(opts.max)
else:
max_count = 'all'
variables = template_dict[packet]['variables']
total_count = 0
for var in variables:
# when max_count is the string 'all', this Python 2 int/str comparison is always False, so no limit applies
if max_count <= total_count:
print 'Reached the maximum packet count; exiting'
sys.exit()
fuzz_variable(packet, var, template_dict)
total_count += 1
| 33.946341 | 274 | 0.598506 | 921 | 6,959 | 4.277959 | 0.186754 | 0.022843 | 0.022335 | 0.020305 | 0.185279 | 0.099492 | 0.035025 | 0.035025 | 0.019289 | 0.019289 | 0 | 0.008729 | 0.242707 | 6,959 | 204 | 275 | 34.112745 | 0.738899 | 0.031757 | 0 | 0.10625 | 1 | 0 | 0.116689 | 0.018257 | 0 | 0 | 0.003043 | 0.004902 | 0 | 0 | null | null | 0 | 0.0375 | null | null | 0.0125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
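The `flip_byte`/`byte_flip` pair above complements each aligned 1-, 2- or 4-byte window against all ones to generate mutated packets; a self-contained Python 3 sketch of the same mutation (the names `flip_window` and `_FORMATS` are illustrative, not from the original script):

```python
import struct

# Width -> (all-ones value, struct format), mirroring flip_byte above.
_FORMATS = {1: (0xFF, "B"), 2: (0xFFFF, "H"), 4: (0xFFFFFFFF, ">L")}

def flip_window(data: bytes, width: int):
    """Yield copies of `data` with each aligned `width`-byte window complemented."""
    max_val, fmt = _FORMATS[width]
    for start in range(0, len(data) - width + 1, width):
        window = data[start:start + width]
        flipped = struct.pack(fmt, max_val - struct.unpack(fmt, window)[0])
        yield data[:start] + flipped + data[start + width:]

mutants = list(flip_window(b"\x00\x01\x02\x03", 2))
print(mutants)  # [b'\xff\xfe\x02\x03', b'\x00\x01\xfd\xfc']
```

Subtracting the window's value from the all-ones constant is just a bitwise NOT of the window, so the result is endianness-independent even though the 1- and 2-byte formats use native byte order, as in the original.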
0d6b2734fd2fb44f1f0ee68d4472c14046681e25 | 1,647 | py | Python | luminoth/utils/test/gt_boxes.py | czbiohub/luminoth | 3b4d57a9b4c3704c64816bbcbd6126a2ac23a069 | [
"BSD-3-Clause"
] | 3 | 2020-01-29T12:04:28.000Z | 2021-04-05T16:30:57.000Z | luminoth/utils/test/gt_boxes.py | czbiohub/luminoth-uv-imaging | 3b4d57a9b4c3704c64816bbcbd6126a2ac23a069 | [
"BSD-3-Clause"
] | 9 | 2019-11-20T16:57:45.000Z | 2021-03-18T06:57:10.000Z | luminoth/utils/test/gt_boxes.py | czbiohub/luminoth | 3b4d57a9b4c3704c64816bbcbd6126a2ac23a069 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
def generate_gt_boxes(total_boxes, image_size, min_size=10, total_classes=None):
"""
Generate `total_boxes` fake (but consistent) ground-truth boxes for an
image of size `image_size` (height, width).
Args:
total_boxes (int): The total number of boxes.
image_size (tuple): Size of the fake image as (height, width).
min_size (int): Minimum width/height of a generated box.
total_classes (int): If given, a random class label is appended to each box.
Returns:
gt_boxes (np.array): With shape [total_boxes, 4], or [total_boxes, 5] when total_classes is given.
"""
image_size = np.array(image_size)
assert (
image_size > min_size
).all(), "Can't generate gt_boxes that small for that image size"
# Generate random sizes for each boxes.
max_size = np.min(image_size) - min_size
random_sizes = np.random.randint(low=min_size, high=max_size, size=(total_boxes, 2))
# Generate random starting points for bounding boxes (left top point)
random_leftop = np.random.randint(low=0, high=max_size, size=(total_boxes, 2))
rightbottom = np.minimum(random_sizes + random_leftop, np.array(image_size) - 1)
gt_boxes = np.column_stack((random_leftop, rightbottom))
# TODO: Remove asserts after writing tests for this function.
assert (gt_boxes[:, 0] < gt_boxes[:, 2]).all(), "Gt boxes without consistent Xs"
assert (gt_boxes[:, 1] < gt_boxes[:, 3]).all(), "Gt boxes without consistent Ys"
if total_classes:
random_classes = np.random.randint(
low=0, high=total_classes - 1, size=(total_boxes, 1)
)
gt_boxes = np.column_stack((gt_boxes, random_classes))
assert (
gt_boxes[:, 1] < total_classes
).all(), "Gt boxes without consistent classes"
return gt_boxes
| 33.612245 | 88 | 0.665452 | 236 | 1,647 | 4.440678 | 0.322034 | 0.100191 | 0.034351 | 0.045802 | 0.207061 | 0.129771 | 0.049618 | 0 | 0 | 0 | 0 | 0.011737 | 0.224044 | 1,647 | 48 | 89 | 34.3125 | 0.808294 | 0.273224 | 0 | 0.090909 | 1 | 0 | 0.129004 | 0 | 0 | 0 | 0 | 0.020833 | 0.181818 | 1 | 0.045455 | false | 0 | 0.045455 | 0 | 0.136364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
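The clamping step above (`np.minimum` against `image_size - 1`) keeps every right-bottom corner inside the image while the minimum box size keeps corners strictly ordered; a standalone numpy check of the same construction (variable names mirror the function, the concrete values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
total_boxes, min_size = 5, 10
image_size = np.array([100, 120])            # (height, width)

max_size = np.min(image_size) - min_size
sizes = rng.integers(min_size, max_size, size=(total_boxes, 2))
lefttop = rng.integers(0, max_size, size=(total_boxes, 2))
rightbottom = np.minimum(sizes + lefttop, image_size - 1)  # clip to image bounds
gt_boxes = np.column_stack((lefttop, rightbottom))

# The same consistency invariants asserted in generate_gt_boxes hold here.
print((gt_boxes[:, 0] < gt_boxes[:, 2]).all() and (gt_boxes[:, 1] < gt_boxes[:, 3]).all())
```

Because left-top points are drawn below `max_size` and each side is at least `min_size`, the clipped right-bottom corner always stays strictly greater than the left-top corner.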
0d83b3928eb305e87244b1c78b0cfa1d89f6eb0a | 664 | py | Python | demos/parsing_table_report.py | wannaphong/pycfg | ffa67958ed1c3deb73cadb3969ac086336fb1269 | [
"MIT"
] | 8 | 2017-12-18T08:51:27.000Z | 2020-11-26T02:21:06.000Z | demos/parsing_table_report.py | wannaphong/pycfg | ffa67958ed1c3deb73cadb3969ac086336fb1269 | [
"MIT"
] | 1 | 2020-01-09T15:41:09.000Z | 2020-01-09T15:41:09.000Z | demos/parsing_table_report.py | wannaphong/pycfg | ffa67958ed1c3deb73cadb3969ac086336fb1269 | [
"MIT"
] | 6 | 2017-06-12T16:58:40.000Z | 2019-11-27T06:55:07.000Z | '''Compute the SLR parsing table for a grammar read from stdin in extended
sytax, printing the original grammar, augmented grammar, first sets, follow
sets, and table to stdout in HTML.'''
import sys
from cfg.cfg_reader import *
from cfg.slr import *
try:
G = parse_cfg(sys.stdin.read())
except ValueError, e:
print e
sys.exit(1)
T = ParsingTable(G)
print '<h1>Original Grammar</h1>'
print T._grammar.html()
print '<h1>Augmented Grammar</h1>'
print T._automaton.augmented_grammar().html()
print '<h1>First Sets</h1>'
print T._first_sets.html()
print '<h1>Follow Sets</h1>'
print T._follow_sets.html()
print '<h1>Parsing Table</h1>'
print T.html()
| 23.714286 | 75 | 0.722892 | 107 | 664 | 4.401869 | 0.383178 | 0.07431 | 0.084926 | 0.063694 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019332 | 0.143072 | 664 | 27 | 76 | 24.592593 | 0.808436 | 0 | 0 | 0 | 0 | 0 | 0.235789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.157895 | null | null | 0.578947 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
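The report above prints first and follow sets computed by the `cfg.slr` package; a minimal fixed-point computation of FIRST sets for an epsilon-free grammar, as a plain-Python sketch (this is not the `cfg` package's API, and the toy grammar is made up):

```python
# Grammar: dict nonterminal -> list of productions (tuples of symbols).
# Keys are nonterminals; any other symbol is a terminal. No epsilon rules.
grammar = {
    "S": [("A", "b")],
    "A": [("a",), ("c", "A")],
}

def first_sets(grammar):
    """Iterate to a fixed point: FIRST(nt) collects the first symbols
    derivable from each of nt's productions."""
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:
        changed = False
        for nt, productions in grammar.items():
            for prod in productions:
                sym = prod[0]
                new = first[sym] if sym in grammar else {sym}
                if not new <= first[nt]:
                    first[nt] |= new
                    changed = True
    return first

fs = first_sets(grammar)
print({nt: sorted(s) for nt, s in fs.items()})  # {'S': ['a', 'c'], 'A': ['a', 'c']}
```

Handling epsilon productions (as an SLR table builder must) would additionally propagate nullable suffixes, which is why the real implementation also needs follow sets.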
0dad717dfbc5eeed64dd36fae969c7a582eb5210 | 1,032 | py | Python | corehq/apps/userreports/migrations/0001_initial.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2020-05-05T13:10:01.000Z | 2020-05-05T13:10:01.000Z | corehq/apps/userreports/migrations/0001_initial.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2019-12-09T14:00:14.000Z | 2019-12-09T14:00:14.000Z | corehq/apps/userreports/migrations/0001_initial.py | MaciejChoromanski/commcare-hq | fd7f65362d56d73b75a2c20d2afeabbc70876867 | [
"BSD-3-Clause"
] | 5 | 2015-11-30T13:12:45.000Z | 2019-07-01T19:27:07.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from __future__ import absolute_import
from django.conf import settings
from django.db import migrations
from corehq.apps.userreports.models import DataSourceConfiguration
from corehq.preindex import get_preindex_plugin
from corehq.sql_db.connections import DEFAULT_ENGINE_ID
from corehq.util.couch import IterDB
from dimagi.utils.couch.database import iter_docs
def set_default_engine_ids(apps, schema_editor):
if not settings.UNIT_TESTING:
get_preindex_plugin('userreports').sync_design_docs()
ucr_db = DataSourceConfiguration.get_db()
with IterDB(ucr_db) as iter_db:
for doc in iter_docs(ucr_db, DataSourceConfiguration.all_ids()):
if not doc.get('engine_id'):
doc['engine_id'] = DEFAULT_ENGINE_ID
iter_db.save(doc)
class Migration(migrations.Migration):
dependencies = [
]
operations = [
migrations.RunPython(set_default_engine_ids),
]
| 30.352941 | 76 | 0.725775 | 130 | 1,032 | 5.453846 | 0.453846 | 0.056417 | 0.045134 | 0.053597 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001211 | 0.199612 | 1,032 | 33 | 77 | 31.272727 | 0.857143 | 0.020349 | 0 | 0 | 0 | 0 | 0.028741 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.375 | 0 | 0.541667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
0dc27f3ffdf014d5dfe5a97390e8398ed9cc26aa | 716 | py | Python | solution.py | ashok-arora/HashCode | 7f30ea0ce0eeda5f6cf8bcb0e1cda4ce27b8bf32 | [
"MIT"
] | null | null | null | solution.py | ashok-arora/HashCode | 7f30ea0ce0eeda5f6cf8bcb0e1cda4ce27b8bf32 | [
"MIT"
] | null | null | null | solution.py | ashok-arora/HashCode | 7f30ea0ce0eeda5f6cf8bcb0e1cda4ce27b8bf32 | [
"MIT"
] | null | null | null | import sys
from parse import *
from output import *
from basicGreedy import *
from greedyLookahead import *
# TODO:
# Convert bidirectional edges to 2 unidirectional edges by:
# if Edge.direction == 2, then Edge.is_visited == True and Edge_2nd.is_visited == True
# build a map a -> street, and if bidirectional also insert into map b -> street (remove this line for now)
def main():
start, cars, time, class_adj_list = parse("in.txt")
cars_data = car_start(start, cars, time, class_adj_list)
# print_solution(cars_data, cars, 'basic_greedy')
cars_data = car_start_look_ahead(start, cars, time, class_adj_list)
print_solution(cars_data, cars, 'greedy_look_ahead')
if __name__ == "__main__":
main()
| 31.130435 | 97 | 0.727654 | 109 | 716 | 4.504587 | 0.522936 | 0.065173 | 0.07943 | 0.10998 | 0.254582 | 0.254582 | 0.203666 | 0.203666 | 0.203666 | 0.203666 | 0 | 0.005059 | 0.171788 | 716 | 22 | 98 | 32.545455 | 0.822934 | 0.407821 | 0 | 0 | 0 | 0 | 0.074163 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 1 | 0.083333 | false | 0 | 0.416667 | 0 | 0.5 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0dc35c4faad340d3f69ae7965d8bc7c3d2065621 | 2,055 | py | Python | models/Conv2d.py | Apollo1840/DeepECG | 5132b5fc8f6b40c4b2f175cd5e56c4aec128ab3e | [
"MIT"
] | 2 | 2020-11-16T10:50:56.000Z | 2020-11-23T12:31:30.000Z | models/Conv2d.py | Apollo1840/DeepECG | 5132b5fc8f6b40c4b2f175cd5e56c4aec128ab3e | [
"MIT"
] | null | null | null | models/Conv2d.py | Apollo1840/DeepECG | 5132b5fc8f6b40c4b2f175cd5e56c4aec128ab3e | [
"MIT"
] | 1 | 2020-08-05T00:23:54.000Z | 2020-08-05T00:23:54.000Z | from keras.models import Sequential
from keras.layers import Dense, Dropout, Conv2D, GlobalAveragePooling2D, MaxPooling2D, Flatten
def conv2d(input_dim, output_dim=4):
model = Sequential()
# model.load_weights('my_model_weights.h5')
# 64 conv
model.add(Conv2D(64, (3, 3), activation='relu', input_shape=input_dim, padding='same'))
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
# 128 conv
model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
# 256 conv
model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
# #512 conv
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(Conv2D(512, (3, 3), activation='relu'))
# model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
# Dense part
model.add(Flatten())
model.add(Dense(4096, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(4096, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1000, activation='relu'))
model.add(Dense(output_dim, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
return model
| 41.938776 | 94 | 0.644282 | 288 | 2,055 | 4.548611 | 0.184028 | 0.170992 | 0.170992 | 0.19542 | 0.700763 | 0.692366 | 0.692366 | 0.692366 | 0.655725 | 0.655725 | 0 | 0.087156 | 0.151338 | 2,055 | 48 | 95 | 42.8125 | 0.663991 | 0.293431 | 0 | 0.541667 | 0 | 0 | 0.085136 | 0.016748 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.083333 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0df6d6ea0927c077739bfaea6e0cf23ee1ad49b4 | 725 | py | Python | train.py | samkugji/chatbot_from_scratch | 9aa448db2bbc7e9a2fd9d2b536e311ccda42c2e8 | [
"MIT"
] | 1 | 2017-08-16T08:46:59.000Z | 2017-08-16T08:46:59.000Z | train.py | samkugji/chatbot_from_scratch | 9aa448db2bbc7e9a2fd9d2b536e311ccda42c2e8 | [
"MIT"
] | null | null | null | train.py | samkugji/chatbot_from_scratch | 9aa448db2bbc7e9a2fd9d2b536e311ccda42c2e8 | [
"MIT"
] | null | null | null | import numpy as np
import tensorflow as tf
import time
import math
import os
import sys
from lib import model_utils
from configs import model_config
def main(_):
config = model_config.Config()
# TODO: find a way to print all attributes of the config class
print("batch size: ", config.batch_size)
with tf.Session() as sess:
# If forward_only is False, run in training mode: the batch size is modified and an optimizer is included.
forward_only = False
user_vocab_path = os.path.join(config.data_dir, 'vocab_user.in')
bot_vocab_path = os.path.join(config.data_dir, 'vocab_bot.in')
# create model
model = model_utils.create_model(sess, config, forward_only)
# main()
if __name__ == "__main__":
tf.app.run() | 25 | 93 | 0.689655 | 108 | 725 | 4.388889 | 0.5 | 0.056962 | 0.067511 | 0.063291 | 0.156118 | 0.156118 | 0.156118 | 0.156118 | 0.156118 | 0 | 0 | 0 | 0.217931 | 725 | 29 | 94 | 25 | 0.835979 | 0.182069 | 0 | 0 | 0 | 0 | 0.076271 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 1 | 0.055556 | false | 0 | 0.444444 | 0 | 0.5 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
218bdae0da9e333918ad4d976fa4fa191a9140ec | 368 | py | Python | mldictionary_api/models/requests.py | PabloEmidio/api-dictionary | 4eb1d4d02a133b96bcb905575d8bccb3cf0311c3 | [
"MIT"
] | 7 | 2021-06-13T23:21:14.000Z | 2022-01-04T01:35:47.000Z | mldictionary_api/models/requests.py | PabloEmidio/MLDictionaryAPI | 501f203802c1f6aafe213c0fb7a5808f9a9ad3ab | [
"MIT"
] | null | null | null | mldictionary_api/models/requests.py | PabloEmidio/MLDictionaryAPI | 501f203802c1f6aafe213c0fb7a5808f9a9ad3ab | [
"MIT"
] | 1 | 2021-06-28T07:26:54.000Z | 2021-06-28T07:26:54.000Z | __all__ = ['RedisRequests']
from mldictionary_api.models.base import RedisBaseModel
class RedisRequests(RedisBaseModel):
def __init__(self):
super().__init__()
def get(self, match: str) -> int:
requests = self.db.get(match) or 0
return int(requests)
def set(self, key, value, ttl: int):
self.db.set(key, value, ex=ttl)
| 23 | 55 | 0.649457 | 47 | 368 | 4.808511 | 0.595745 | 0.097345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003509 | 0.225543 | 368 | 15 | 56 | 24.533333 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0.035326 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.1 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
21b5b3d371035661a22cc5c973d6a552f5dc156d | 69 | py | Python | section_5_price_alert/src/config.py | sagarnildass/The-complete-python-web-course | d5c21eebeeea466a0fa46c9dfac203c0e5b01433 | [
"MIT"
] | null | null | null | section_5_price_alert/src/config.py | sagarnildass/The-complete-python-web-course | d5c21eebeeea466a0fa46c9dfac203c0e5b01433 | [
"MIT"
] | null | null | null | section_5_price_alert/src/config.py | sagarnildass/The-complete-python-web-course | d5c21eebeeea466a0fa46c9dfac203c0e5b01433 | [
"MIT"
] | null | null | null |
DEBUG = True
ADMINS = frozenset([
"yourname@yourdomain.com"
])
| 9.857143 | 29 | 0.652174 | 7 | 69 | 6.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202899 | 69 | 6 | 30 | 11.5 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.343284 | 0.343284 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
21b8dae6932c806248b39abe76b7b480656aadce | 4,944 | py | Python | sdk/python/pulumi_aws/ec2/eip_association.py | pulumi-bot/pulumi-aws | 756c60135851e015232043c8206567101b8ebd85 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/ec2/eip_association.py | pulumi-bot/pulumi-aws | 756c60135851e015232043c8206567101b8ebd85 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_aws/ec2/eip_association.py | pulumi-bot/pulumi-aws | 756c60135851e015232043c8206567101b8ebd85 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import pulumi
import pulumi.runtime
class EipAssociation(pulumi.CustomResource):
"""
Provides an AWS EIP Association as a top level resource, to associate and
disassociate Elastic IPs from AWS Instances and Network Interfaces.
~> **NOTE:** Do not use this resource to associate an EIP to `aws_lb` or `aws_nat_gateway` resources. Instead use the `allocation_id` available in those resources to allow AWS to manage the association, otherwise you will see `AuthFailure` errors.
~> **NOTE:** `aws_eip_association` is useful in scenarios where EIPs are either
pre-existing or distributed to customers or users and therefore cannot be changed.
"""
def __init__(__self__, __name__, __opts__=None, allocation_id=None, allow_reassociation=None, instance_id=None, network_interface_id=None, private_ip_address=None, public_ip=None):
"""Create a EipAssociation resource with the given unique name, props, and options."""
if not __name__:
raise TypeError('Missing resource name argument (for URN creation)')
if not isinstance(__name__, basestring):
raise TypeError('Expected resource name to be a string')
if __opts__ and not isinstance(__opts__, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
__props__ = dict()
if allocation_id and not isinstance(allocation_id, basestring):
raise TypeError('Expected property allocation_id to be a basestring')
__self__.allocation_id = allocation_id
"""
The allocation ID. This is required for EC2-VPC.
"""
__props__['allocationId'] = allocation_id
if allow_reassociation and not isinstance(allow_reassociation, bool):
raise TypeError('Expected property allow_reassociation to be a bool')
__self__.allow_reassociation = allow_reassociation
"""
Whether to allow an Elastic IP to
be re-associated. Defaults to `true` in VPC.
"""
__props__['allowReassociation'] = allow_reassociation
if instance_id and not isinstance(instance_id, basestring):
raise TypeError('Expected property instance_id to be a basestring')
__self__.instance_id = instance_id
"""
The ID of the instance. This is required for
EC2-Classic. For EC2-VPC, you can specify either the instance ID or the
network interface ID, but not both. The operation fails if you specify an
instance ID unless exactly one network interface is attached.
"""
__props__['instanceId'] = instance_id
if network_interface_id and not isinstance(network_interface_id, basestring):
raise TypeError('Expected property network_interface_id to be a basestring')
__self__.network_interface_id = network_interface_id
"""
The ID of the network interface. If the
instance has more than one network interface, you must specify a network
interface ID.
"""
__props__['networkInterfaceId'] = network_interface_id
if private_ip_address and not isinstance(private_ip_address, basestring):
raise TypeError('Expected property private_ip_address to be a basestring')
__self__.private_ip_address = private_ip_address
"""
The primary or secondary private IP address
to associate with the Elastic IP address. If no private IP address is
specified, the Elastic IP address is associated with the primary private IP
address.
"""
__props__['privateIpAddress'] = private_ip_address
if public_ip and not isinstance(public_ip, basestring):
raise TypeError('Expected property public_ip to be a basestring')
__self__.public_ip = public_ip
"""
The Elastic IP address. This is required for EC2-Classic.
"""
__props__['publicIp'] = public_ip
super(EipAssociation, __self__).__init__(
'aws:ec2/eipAssociation:EipAssociation',
__name__,
__props__,
__opts__)
def set_outputs(self, outs):
if 'allocationId' in outs:
self.allocation_id = outs['allocationId']
if 'allowReassociation' in outs:
self.allow_reassociation = outs['allowReassociation']
if 'instanceId' in outs:
self.instance_id = outs['instanceId']
if 'networkInterfaceId' in outs:
self.network_interface_id = outs['networkInterfaceId']
if 'privateIpAddress' in outs:
self.private_ip_address = outs['privateIpAddress']
if 'publicIp' in outs:
self.public_ip = outs['publicIp']
| 47.085714 | 251 | 0.680016 | 596 | 4,944 | 5.347315 | 0.265101 | 0.039536 | 0.055224 | 0.060245 | 0.127079 | 0.076247 | 0 | 0 | 0 | 0 | 0 | 0.00162 | 0.250809 | 4,944 | 104 | 252 | 47.538462 | 0.858801 | 0.164644 | 0 | 0 | 1 | 0 | 0.235843 | 0.011905 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037736 | false | 0.056604 | 0.037736 | 0 | 0.09434 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
21d9a8a635118bbf38f19b0d7f09b499f8f323e1 | 903 | py | Python | src/api/serializers.py | Dakhnovskiy/social-network | 51a78ce2bfbf7f8e38f1f5e0b72756a79880e494 | [
"Apache-2.0"
] | null | null | null | src/api/serializers.py | Dakhnovskiy/social-network | 51a78ce2bfbf7f8e38f1f5e0b72756a79880e494 | [
"Apache-2.0"
] | null | null | null | src/api/serializers.py | Dakhnovskiy/social-network | 51a78ce2bfbf7f8e38f1f5e0b72756a79880e494 | [
"Apache-2.0"
] | null | null | null | import datetime
from typing import List, Optional
from constants import RelationStatus, Sex
from pydantic import BaseModel, conint
class City(BaseModel):
id: int
name: str
class CitiesOut(BaseModel):
data: List[City]
class UserIn(BaseModel):
login: str
password: str
first_name: str
last_name: str
age: conint(ge=0)
sex: Sex
interests: Optional[List[str]]
city_id: int
class UserOut(BaseModel):
id: int
login: str
first_name: str
last_name: str
age: conint(ge=0)
sex: Sex
city_id: int
created_dt: datetime.datetime
class UsersOut(BaseModel):
data: List[UserOut]
class RelatedUser(BaseModel):
first_name: str
last_name: str
relation_status: RelationStatus
class RelatedUsers(BaseModel):
users: List[RelatedUser]
class AuthIn(BaseModel):
login: str
password: str
class AuthOut(BaseModel):
token: str
| 15.05 | 41 | 0.683278 | 116 | 903 | 5.232759 | 0.353448 | 0.080725 | 0.059308 | 0.079077 | 0.270181 | 0.182867 | 0.144975 | 0.144975 | 0.144975 | 0.144975 | 0 | 0.002911 | 0.239203 | 903 | 59 | 42 | 15.305085 | 0.88064 | 0 | 0 | 0.475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.05 | 0.1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
21ef18dfc3cb3cd300b565c0404c36df3f912cc7 | 265 | py | Python | code/abc075_c_01.py | KoyanagiHitoshi/AtCoder | 731892543769b5df15254e1f32b756190378d292 | [
"MIT"
] | 3 | 2019-08-16T16:55:48.000Z | 2021-04-11T10:21:40.000Z | code/abc075_c_01.py | KoyanagiHitoshi/AtCoder | 731892543769b5df15254e1f32b756190378d292 | [
"MIT"
] | null | null | null | code/abc075_c_01.py | KoyanagiHitoshi/AtCoder | 731892543769b5df15254e1f32b756190378d292 | [
"MIT"
] | null | null | null | N,M=map(int,input().split())
edges=[list(map(int,input().split())) for i in range(M)]
ans=0
for x in edges:
l=list(range(N))
for y in edges:
if y!=x:l=[l[y[0]-1] if l[i]==l[y[1]-1] else l[i] for i in range(N)]
if len(set(l))!=1:ans+=1
print(ans) | 29.444444 | 76 | 0.562264 | 63 | 265 | 2.365079 | 0.365079 | 0.080537 | 0.147651 | 0.214765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0.181132 | 265 | 9 | 77 | 29.444444 | 0.654378 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
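The golfed loop above removes each edge in turn and label-propagates component ids over the remaining edges to count bridges (edges whose removal disconnects the graph); an equivalent, more readable sketch using union-find (a hypothetical rewrite, not the submitted solution):

```python
def count_bridges(n, edges):
    """Count bridges in a connected graph with vertices 1..n (edges 1-indexed)."""
    def connected_without(skip):
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        for i, (a, b) in enumerate(edges):
            if i != skip:
                parent[find(a - 1)] = find(b - 1)
        return len({find(v) for v in range(n)}) == 1

    return sum(not connected_without(i) for i in range(len(edges)))

# A path 1-2-3 has two bridges; adding edge 1-3 (a triangle) removes both.
print(count_bridges(3, [(1, 2), (2, 3)]))          # 2
print(count_bridges(3, [(1, 2), (2, 3), (1, 3)]))  # 0
```

Both versions are O(M·(N+M)) brute force, which is fine for the AtCoder constraints of this problem.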
21faecb8e43196d43ddcc06306bc7efc72f33466 | 430 | py | Python | multiplepagesproject/prodmodels.py | mandeep-django/admin-panel | 30c65730e74004ec21cf891627fbbaa027f626db | [
"MIT"
] | null | null | null | multiplepagesproject/prodmodels.py | mandeep-django/admin-panel | 30c65730e74004ec21cf891627fbbaa027f626db | [
"MIT"
] | null | null | null | multiplepagesproject/prodmodels.py | mandeep-django/admin-panel | 30c65730e74004ec21cf891627fbbaa027f626db | [
"MIT"
] | null | null | null | from django.db import connections
from django.db import models
class proddisplay(models.Model):
categoryid = models.IntegerField()
pname = models.CharField(max_length=100)
pprice = models.IntegerField()
pcontent = models.TextField(max_length=100)
pimage = models.FileField()
pdisplayorder = models.IntegerField()
pstatus = models.IntegerField()
class Meta:
db_table = "product" | 33.076923 | 48 | 0.702326 | 46 | 430 | 6.5 | 0.565217 | 0.240803 | 0.080268 | 0.120401 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0.204651 | 430 | 13 | 49 | 33.076923 | 0.856725 | 0 | 0 | 0 | 0 | 0 | 0.016706 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.916667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
# ---- src/nbtucker_sparse.py (vibinabraham/FermiCluster, Apache-2.0) ----
import math
import sys
import numpy as np
import scipy
import itertools
import copy as cp
from helpers import *
import opt_einsum as oe
import tools
import time
from ClusteredOperator import *
from ClusteredState import *
from Cluster import *
from ham_build import *
def compute_rspt2_correction(ci_vector, clustered_ham, e0, nproc=1):
# {{{
print(" Compute Matrix Vector Product:", flush=True)
start = time.time()
if nproc==1:
#h0v(clustered_ham,ci_vector)
#exit()
pt_vector = matvec1(clustered_ham, ci_vector)
else:
pt_vector = matvec1_parallel1(clustered_ham, ci_vector, nproc=nproc)
stop = time.time()
print(" Time spent in matvec: ", stop-start)
#pt_vector.prune_empty_fock_spaces()
tmp = ci_vector.dot(pt_vector)
var = pt_vector.norm() - tmp*tmp
print(" Variance: %12.8f" % var,flush=True)
print("Dim of PT space %4d"%len(pt_vector))
print(" Remove CI space from pt_vector vector")
for fockspace,configs in pt_vector.items():
if fockspace in ci_vector.fblocks():
for config,coeff in list(configs.items()):
if config in ci_vector[fockspace]:
del pt_vector[fockspace][config]
print("Dim of PT space %4d"%len(pt_vector))
for fockspace,configs in ci_vector.items():
if fockspace in pt_vector:
for config,coeff in configs.items():
assert(config not in pt_vector[fockspace])
print(" Norm of CI vector = %12.8f" %ci_vector.norm())
print(" Dimension of CI space: ", len(ci_vector))
print(" Dimension of PT space: ", len(pt_vector))
print(" Compute Denominator",flush=True)
# compute diagonal for PT2
start = time.time()
#pt_vector.prune_empty_fock_spaces()
if nproc==1:
Hd = build_h0(clustered_ham, ci_vector, pt_vector)
else:
Hd = build_h0(clustered_ham, ci_vector, pt_vector)
#Hd = build_hamiltonian_diagonal_parallel1(clustered_ham, pt_vector, nproc=nproc)
#pr.disable()
#pr.print_stats(sort='time')
end = time.time()
    print(" Time spent in denominator: ", end - start)
denom = 1/(e0 - Hd)
pt_vector_v = pt_vector.get_vector()
pt_vector_v.shape = (pt_vector_v.shape[0])
e2 = np.multiply(denom,pt_vector_v)
pt_vector.set_vector(e2)
e2 = np.dot(pt_vector_v,e2)
print(" PT2 Energy Correction = %12.8f" %e2)
return e2,pt_vector
# }}}
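The denominator step in compute_rspt2_correction implements the standard Rayleigh-Schrodinger second-order formula E2 = sum_x <0|H|x>^2 / (E0 - <x|H0|x>). A minimal NumPy sketch of that arithmetic, with hypothetical couplings and energies:

```python
import numpy as np

# Hypothetical couplings <0|H|x> to three external configurations,
# their zeroth-order energies <x|H0|x>, and the reference energy e0.
h0x = np.array([0.10, -0.05, 0.02])
hd = np.array([1.0, 1.5, 2.0])
e0 = 0.0

denom = 1.0 / (e0 - hd)        # resolvent diagonal, as in the code above
c1 = denom * h0x               # first-order wavefunction amplitudes
e2 = float(np.dot(h0x, c1))    # E2 = sum_x h0x**2 / (e0 - hd_x)
```

Since the reference lies below the external configurations, every term is negative and E2 lowers the energy.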
def compute_lcc2_correction(ci_vector, clustered_ham, e0, nproc=1):
# {{{
print(" Compute Matrix Vector Product:", flush=True)
start = time.time()
if nproc==1:
#h0v(clustered_ham,ci_vector)
#exit()
pt_vector = matvec1(clustered_ham, ci_vector)
else:
pt_vector = matvec1_parallel1(clustered_ham, ci_vector, nproc=nproc)
stop = time.time()
print(" Time spent in matvec: ", stop-start)
#pt_vector.prune_empty_fock_spaces()
tmp = ci_vector.dot(pt_vector)
var = pt_vector.norm() - tmp*tmp
print(" Variance: %12.8f" % var,flush=True)
print("Dim of PT space %4d"%len(pt_vector))
print(" Remove CI space from pt_vector vector")
for fockspace,configs in pt_vector.items():
if fockspace in ci_vector.fblocks():
for config,coeff in list(configs.items()):
if config in ci_vector[fockspace]:
del pt_vector[fockspace][config]
print("Dim of PT space %4d"%len(pt_vector))
for fockspace,configs in ci_vector.items():
if fockspace in pt_vector:
for config,coeff in configs.items():
assert(config not in pt_vector[fockspace])
print(" Norm of CI vector = %12.8f" %ci_vector.norm())
print(" Dimension of CI space: ", len(ci_vector))
print(" Dimension of PT space: ", len(pt_vector))
print(" Compute Denominator",flush=True)
# compute diagonal for PT2
start = time.time()
#pt_vector.prune_empty_fock_spaces()
if nproc==1:
Hd = build_h0(clustered_ham, ci_vector, pt_vector)
Hd2 = build_hamiltonian_diagonal(clustered_ham, pt_vector)
else:
Hd = build_h0(clustered_ham, ci_vector, pt_vector)
Hd2 = build_hamiltonian_diagonal_parallel1(clustered_ham, pt_vector, nproc=nproc)
#pr.disable()
#pr.print_stats(sort='time')
end = time.time()
    print(" Time spent in denominator: ", end - start)
denom = 1/(e0 - Hd)
pt_vector_v = pt_vector.get_vector()
pt_vector_v.shape = (pt_vector_v.shape[0])
e2 = np.multiply(denom,pt_vector_v)
pt_vector.set_vector(e2)
e2 = np.dot(pt_vector_v,e2)
print(" PT2 Energy Correction = %12.8f" %e2)
print("E DPS %16.8f"%e0)
#ci_vector.add(pt_vector)
H = build_full_hamiltonian(clustered_ham, pt_vector)
np.fill_diagonal(H,0)
denom.shape = (denom.shape[0],1)
v1 = pt_vector.get_vector()
for j in range(0,10):
print("v1",v1.shape)
v2 = H @ v1
#print("v2",v2.shape)
#print("denom",denom.shape)
v2 = np.multiply(denom,v2)
        #print("after denom",v2.shape)
e3 = np.dot(pt_vector_v,v2)
print(e3.shape)
print("PT2",e2)
print("PT3",e3[0])
v1 = v2
pt_vector.set_vector(v2)
return e3[0],pt_vector
# }}}
def build_h0(clustered_ham,ci_vector,pt_vector):
"""
Build hamiltonian diagonal in basis in ci_vector as difference of cluster energies as in RSPT
"""
# {{{
clusters = clustered_ham.clusters
Hd = np.zeros((len(pt_vector)))
E0 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0, opt_einsum=True)[0,0]
assert(len(ci_vector)==1)
    print("E0 %16.8f"%E0)
#idx = 0
#Hd1 = np.zeros((len(pt_vector)))
#for f,c,v in pt_vector:
# e0_X = 0
# for ci in clustered_ham.clusters:
# e0_X += ci.ops['H_mf'][(f[ci.idx],f[ci.idx])][c[ci.idx],c[ci.idx]]
# Hd1[idx] = e0_X
# idx += 1
idx = 0
Hd = np.zeros((len(pt_vector)))
for ci_fspace, ci_configs in ci_vector.items():
for ci_config, ci_coeff in ci_configs.items():
for fockspace, configs in pt_vector.items():
#print("FS",fockspace)
active = []
inactive = []
for ind,fs in enumerate(fockspace):
if fs != ci_fspace[ind]:
active.append(ind)
else:
inactive.append(ind)
delta_fock= tuple([(fockspace[ci][0]-ci_fspace[ci][0], fockspace[ci][1]-ci_fspace[ci][1]) for ci in range(len(clusters))])
#print("active",active)
#print("active",delta_fock)
#print(tuple(np.array(list(fockspace))[inactive]))
for config, coeff in configs.items():
delta_fock= tuple([(0,0) for ci in range(len(clusters))])
diff = tuple(x-y for x,y in zip(ci_config,config))
#print("CI",ci_config)
for x in inactive:
if diff[x] != 0 and x not in active:
active.append(x)
#print("PT",config)
#print("d ",diff)
#print("ACTIVE",active)
Hd[idx] = E0
for cidx in active:
fspace = fockspace[cidx]
conf = config[cidx]
#print(" Cluster: %4d Fock Space:%s config:%4d Energies %16.8f"%(cidx,fspace,conf,clusters[cidx].energies[fspace][conf]))
#Hd[idx] += clusters[cidx].energies[fockspace[cidx]][config[cidx]]
#Hd[idx] -= clusters[cidx].energies[ci_fspace[cidx]][ci_config[cidx]]
Hd[idx] += clusters[cidx].ops['H_mf'][(fockspace[cidx],fockspace[cidx])][config[cidx],config[cidx]]
Hd[idx] -= clusters[cidx].ops['H_mf'][(ci_fspace[cidx],ci_fspace[cidx])][ci_config[cidx],ci_config[cidx]]
#print("-Cluster: %4d Fock Space:%s config:%4d Energies %16.8f"%(cidx,ci_fspace[cidx],ci_config[cidx],clusters[cidx].energies[ci_fspace[cidx]][ci_config[cidx]]))
# for EN:
#for term in terms:
# Hd[idx] += term.matrix_element(fockspace,config,fockspace,config)
# print(term.active)
# for RS
#Hd[idx] = E0 - term.active
idx += 1
return Hd
# }}}
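build_h0 assigns each external configuration the reference energy plus the change in per-cluster mean-field energies, summed only over the "active" clusters whose state differs from the reference. A toy sketch of that bookkeeping with hypothetical cluster energies:

```python
# Hypothetical per-cluster mean-field energies in the reference (eps_ref)
# and in one external configuration (eps_new); only clusters whose
# energy changed contribute to the H0 diagonal element.
E0 = -5.0
eps_ref = {0: -1.0, 1: -2.0, 2: -0.5}
eps_new = {0: -0.4, 1: -2.0, 2: -0.5}   # cluster 0 is excited
active = [c for c in eps_ref if eps_new[c] != eps_ref[c]]
hd_x = E0 + sum(eps_new[c] - eps_ref[c] for c in active)  # E0 + (−0.4 − (−1.0))
```

This is the RSPT choice of H0: the diagonal shifts by cluster-energy differences rather than by the full EN diagonal matrix element.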
def cepa(clustered_ham,ci_vector,pt_vector,cepa_shift):
# {{{
ts = 0
H00 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0)
#H00 = H[tb0.start:tb0.stop,tb0.start:tb0.stop]
E0,V0 = np.linalg.eigh(H00)
E0 = E0[ts]
Ec = 0.0
#cepa_shift = 'aqcc'
#cepa_shift = 'cisd'
#cepa_shift = 'acpf'
#cepa_shift = 'cepa0'
cepa_mit = 1
cepa_mit = 100
for cit in range(0,cepa_mit):
#Hdd = cp.deepcopy(H[tb0.stop::,tb0.stop::])
Hdd = build_full_hamiltonian(clustered_ham,pt_vector,iprint=0)
shift = 0.0
if cepa_shift == 'acpf':
            shift = Ec * 2.0 / n_blocks  # NOTE: n_blocks is not defined in this scope; it must be supplied by the caller
            #shift = Ec * 2.0 / n_sites
elif cepa_shift == 'aqcc':
shift = (1.0 - (n_blocks-3.0)*(n_blocks - 2.0)/(n_blocks * ( n_blocks-1.0) )) * Ec
elif cepa_shift == 'cisd':
shift = Ec
elif cepa_shift == 'cepa0':
shift = 0
Hdd += -np.eye(Hdd.shape[0])*(E0 + shift)
#Hdd += -np.eye(Hdd.shape[0])*(E0 + -0.220751700895 * 2.0 / 8.0)
#Hd0 = H[tb0.stop::,tb0.start:tb0.stop].dot(V0[:,ts])
H0d = build_block_hamiltonian(clustered_ham,ci_vector,pt_vector,iprint=0)
Hd0 = H0d.T
Hd0 = Hd0.dot(V0[:,ts])
#Cd = -np.linalg.inv(Hdd-np.eye(Hdd.shape[0])*E0).dot(Hd0)
#Cd = np.linalg.inv(Hdd).dot(-Hd0)
Cd = np.linalg.solve(Hdd, -Hd0)
print(" CEPA(0) Norm : %16.12f"%np.linalg.norm(Cd))
V0 = V0[:,ts]
V0.shape = (V0.shape[0],1)
Cd.shape = (Cd.shape[0],1)
C = np.vstack((V0,Cd))
H00d = np.insert(H0d,0,E0,axis=1)
#E = V0[:,ts].T.dot(H[tb0.start:tb0.stop,:]).dot(C)
E = V0[:,ts].T.dot(H00d).dot(C)
cepa_last_vectors = C
cepa_last_values = E
print(" CEPA(0) Energy: %16.12f"%E)
if abs(E-E0 - Ec) < 1e-10:
print("Converged")
break
Ec = E - E0
print("Ec %16.8f"%Ec)
return Ec[0]
# }}}
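The shift branch inside cepa encodes the usual size-consistency corrections (ACPF, AQCC, CISD, CEPA(0)). Pulled out as a standalone helper — a sketch restating those branches, where `n_blocks` (the number of clusters) must be supplied by the caller, as in the function above:

```python
def cepa_shift_value(variant, Ec, n_blocks):
    """Diagonal shift applied to Hdd for each CEPA-family variant."""
    if variant == 'acpf':
        return Ec * 2.0 / n_blocks
    if variant == 'aqcc':
        return (1.0 - (n_blocks - 3.0) * (n_blocks - 2.0)
                / (n_blocks * (n_blocks - 1.0))) * Ec
    if variant == 'cisd':
        return Ec
    if variant == 'cepa0':
        return 0.0
    raise ValueError("unknown CEPA variant: %s" % variant)
```

CEPA(0) applies no shift (equivalent to linearized CC), while CISD shifts by the full correlation energy, recovering the unlinked-diagram error of truncated CI.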
def build_block_hamiltonian(clustered_ham,ci_vector,pt_vector,iprint=0):
"""
Build hamiltonian in basis of two different clustered states
"""
# {{{
clusters = clustered_ham.clusters
H0d = np.zeros((len(ci_vector),len(pt_vector)))
shift_l = 0
for fock_li, fock_l in enumerate(ci_vector.data):
configs_l = ci_vector[fock_l]
if iprint > 0:
print(fock_l)
for config_li, config_l in enumerate(configs_l):
idx_l = shift_l + config_li
shift_r = 0
for fock_ri, fock_r in enumerate(pt_vector.data):
configs_r = pt_vector[fock_r]
delta_fock= tuple([(fock_l[ci][0]-fock_r[ci][0], fock_l[ci][1]-fock_r[ci][1]) for ci in range(len(clusters))])
try:
terms = clustered_ham.terms[delta_fock]
except KeyError:
shift_r += len(configs_r)
continue
for config_ri, config_r in enumerate(configs_r):
idx_r = shift_r + config_ri
#print("FOC",fock_l,fock_r)
#print("con",config_l,config_r)
#print(shift_r,config_ri)
#print("idx",idx_l,idx_r)
for term in terms:
me = term.matrix_element(fock_l,config_l,fock_r,config_r)
H0d[idx_l,idx_r] += me
shift_r += len(configs_r)
shift_l += len(configs_l)
return H0d
# }}}
def compute_cisd_correction(ci_vector, clustered_ham, nproc=1):
# {{{
print(" Compute Matrix Vector Product:", flush=True)
start = time.time()
H00 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0)
E0,V0 = np.linalg.eigh(H00)
E0 = E0[0]
if nproc==1:
pt_vector = matvec1(clustered_ham, ci_vector)
else:
pt_vector = matvec1_parallel1(clustered_ham, ci_vector, nproc=nproc)
stop = time.time()
print(" Time spent in matvec: ", stop-start)
print(" Remove CI space from pt_vector vector")
for fockspace,configs in pt_vector.items():
if fockspace in ci_vector.fblocks():
for config,coeff in list(configs.items()):
if config in ci_vector[fockspace]:
del pt_vector[fockspace][config]
for fockspace,configs in ci_vector.items():
if fockspace in pt_vector:
for config,coeff in configs.items():
assert(config not in pt_vector[fockspace])
ci_vector.add(pt_vector)
if nproc==1:
H = build_full_hamiltonian(clustered_ham, ci_vector)
else:
H = build_full_hamiltonian(clustered_ham, ci_vector)
e,v = scipy.sparse.linalg.eigsh(H,10,which='SA')
idx = e.argsort()
e = e[idx]
v = v[:,idx]
v = v[:,0]
e0 = e[0]
e = e[0]
Ec = e - E0
print(" CISD Energy Correction = %12.8f" %Ec)
return Ec
# }}}
def truncated_ci(clustered_ham, ci_vector, pt_vector=None, nproc=1):
# {{{
print(" Compute Matrix Vector Product:", flush=True)
    if pt_vector is None:
H = build_full_hamiltonian(clustered_ham, ci_vector)
e,v = scipy.sparse.linalg.eigsh(H,10,which='SA')
idx = e.argsort()
e = e[idx]
v = v[:,idx]
v = v[:,0]
e0 = e[0]
e = e[0]
else:
H00 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0)
E0,V0 = np.linalg.eigh(H00)
E0 = E0[0]
ci_vector.add(pt_vector)
H = build_full_hamiltonian(clustered_ham, ci_vector)
e,v = scipy.sparse.linalg.eigsh(H,10,which='SA')
idx = e.argsort()
e = e[idx]
v = v[:,idx]
v = v[:,0]
e0 = e[0]
e = e[0]
Ec = e - E0
print(" Truncated CI Correction = %12.8f" %Ec)
ci_vector.set_vector(v)
return e,ci_vector
# }}}
def expand_doubles(ci_vector,clusters):
# {{{
ci_vector.print_configs()
for fspace in ci_vector.keys():
for ci in clusters:
for cj in clusters:
if cj.idx != ci.idx:
#same fock space
nfs = fspace
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(1,ci.basis[fock_i].shape[1])
dims[cj.idx] = range(1,cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
# alpha excitation
new_fspace_a = [list(fs) for fs in fspace]
new_fspace_a[ci.idx][0] += 1
new_fspace_a[cj.idx][0] -= 1
new_fspace_a = tuple( tuple(fs) for fs in new_fspace_a)
good = True
for c in clusters:
if min(new_fspace_a[c.idx]) < 0 or max(new_fspace_a[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_a)
ci_vector.add_fockspace(new_fspace_a)
nfs = new_fspace_a
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
# beta excitation
new_fspace_b = [list(fs) for fs in fspace]
new_fspace_b[ci.idx][1] += 1
new_fspace_b[cj.idx][1] -= 1
new_fspace_b = tuple( tuple(fs) for fs in new_fspace_b)
good = True
for c in clusters:
if min(new_fspace_b[c.idx]) < 0 or max(new_fspace_b[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_b)
ci_vector.add_fockspace(new_fspace_b)
nfs = new_fspace_b
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
new_fspace_aa = [list(fs) for fs in fspace]
new_fspace_aa[ci.idx][0] += 2
new_fspace_aa[cj.idx][0] -= 2
new_fspace_aa = tuple( tuple(fs) for fs in new_fspace_aa)
good = True
for c in clusters:
if min(new_fspace_aa[c.idx]) < 0 or max(new_fspace_aa[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_aa)
ci_vector.add_fockspace(new_fspace_aa)
nfs = new_fspace_aa
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
new_fspace_bb = [list(fs) for fs in fspace]
new_fspace_bb[ci.idx][1] += 2
new_fspace_bb[cj.idx][1] -= 2
new_fspace_bb = tuple( tuple(fs) for fs in new_fspace_bb)
print(new_fspace_bb)
good = True
for c in clusters:
if min(new_fspace_bb[c.idx]) < 0 or max(new_fspace_bb[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_bb)
ci_vector.add_fockspace(new_fspace_bb)
nfs = new_fspace_bb
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
new_fspace_ab = [list(fs) for fs in fspace]
new_fspace_ab[ci.idx][0] += 1
new_fspace_ab[cj.idx][0] -= 1
new_fspace_ab[ci.idx][1] += 1
new_fspace_ab[cj.idx][1] -= 1
new_fspace_ab = tuple( tuple(fs) for fs in new_fspace_ab)
print("AB",new_fspace_ab)
good = True
for c in clusters:
if min(new_fspace_ab[c.idx]) < 0 or max(new_fspace_ab[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_ab)
ci_vector.add_fockspace(new_fspace_ab)
nfs = new_fspace_ab
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
new_fspace_ab = [list(fs) for fs in fspace]
new_fspace_ab[ci.idx][0] += 1
new_fspace_ab[cj.idx][0] -= 1
new_fspace_ab[ci.idx][1] -= 1
new_fspace_ab[cj.idx][1] += 1
new_fspace_ab = tuple( tuple(fs) for fs in new_fspace_ab)
print("AB",new_fspace_ab)
good = True
for c in clusters:
if min(new_fspace_ab[c.idx]) < 0 or max(new_fspace_ab[c.idx]) > c.n_orb:
good = False
break
if good == False:
p = 1
else:
print(new_fspace_ab)
ci_vector.add_fockspace(new_fspace_ab)
nfs = new_fspace_ab
fock_i = nfs[ci.idx]
fock_j = nfs[cj.idx]
dims = [[0] for ca in range(len(clusters))]
dims[ci.idx] = range(ci.basis[fock_i].shape[1])
dims[cj.idx] = range(cj.basis[fock_j].shape[1])
for newconfig_idx, newconfig in enumerate(itertools.product(*dims)):
ci_vector[nfs][newconfig] = 0
ci_vector.print_configs()
print(len(ci_vector))
#ci_vector.add_single_excitonic_states()
#ci_vector.expand_each_fock_space()
#ci_vector.print()
print(len(ci_vector))
ci_vector.print_configs()
"""
for fspace in ci_vector.keys():
config = [0]*len(self.clusters)
for ci in self.clusters:
fock_i = fspace[ci.idx]
new_config = cp.deepcopy(config)
for cii in range(ci.basis[fock_i].shape[1]):
new_config[ci.idx] = cii
self[fspace][tuple(new_config)] = 0
"""
return ci_vector
# }}}
def truncated_pt2(clustered_ham,ci_vector,pt_vector,method = 'mp2',inf=False):
# {{{
    """ method: mp2, mplcc, en2, enlcc; set inf=True to do infinite-order PT when you have the full H"""
clusters = clustered_ham.clusters
ts = 0
print("len CI",len(ci_vector))
print("len PT",len(pt_vector))
print(" Remove CI space from pt_vector vector")
for fockspace,configs in pt_vector.items():
if fockspace in ci_vector.fblocks():
for config,coeff in list(configs.items()):
if config in ci_vector[fockspace]:
del pt_vector[fockspace][config]
print("Dim of PT space %4d"%len(pt_vector))
pt_dim = len(pt_vector)
ci_dim = len(ci_vector)
pt_order = 500
for fockspace,configs in ci_vector.items():
if fockspace in pt_vector:
for config,coeff in configs.items():
assert(config not in pt_vector[fockspace])
H0d = build_block_hamiltonian(clustered_ham,ci_vector,pt_vector,iprint=0)
H00 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0)
Hdd = build_full_hamiltonian(clustered_ham,pt_vector,iprint=0)
print(H00)
E0,V0 = np.linalg.eigh(H00)
E0 = E0[ts]
if method == 'en2' or method == 'enlcc':
Hd = build_hamiltonian_diagonal(clustered_ham,pt_vector)
np.fill_diagonal(Hdd,0)
elif method == 'mp2' or method == 'mplcc':
Hd = build_h0(clustered_ham, ci_vector, pt_vector)
#Hd += 10
for i in range(0,Hdd.shape[0]):
Hdd[i,i] -= (Hd[i])
else:
print("Method not found")
print("E0 %16.8f"%E0)
print(Hdd)
R0 = 1/(E0 - Hd)
print(R0)
v1 = np.multiply(R0,H0d)
pt_vector.set_vector(v1.T)
print(v1.shape)
print(H0d.shape)
e2 = H0d @ v1.T
print(e2)
v_n = np.zeros((pt_dim,pt_order+1)) #list of PT vectors
E_mpn = np.zeros((pt_order+1)) #PT energy
v_n[: ,0] = v1
E_mpn[0] = e2
E_corr = 0
print(" %6s %16s %16s "%("Order","Correction","Energy"))
#print(" %6i %16.8f %16.8f "%(1,first_order_E[0,s],E_mpn[0]))
E_corr = E_mpn[0]
print(" %6i %16.8f %16.8f "%(2,E_mpn[0],E_corr))
if method == 'enlcc' or method == 'mplcc':
Eold = E_corr
for i in range(1,pt_order-1):
h1 = Hdd @ v_n[:,i-1]
v_n[:,i] = h1.reshape(pt_dim)
if inf ==True:
for k in range(0,i):
v_n[:,i] -= np.multiply(E_mpn[k-1],v_n[:,(i-k-1)].reshape(pt_dim))
v_n[:,i] = np.multiply(R0,v_n[:,i])
E_mpn[i] = H0d @ v_n[:,i].T
#print(E_mpn)
E_corr += E_mpn[i]
print(" %6i %16.8f %16.8f "%(i+2,E_mpn[i],E_corr))
pt_vector.set_vector(v_n[:,i])
if abs(E_corr - Eold) < 1e-10:
print("LCC:%16.8f "%E_corr)
break
else:
Eold = E_corr
#v_n[:,i] = 0.8 * v_n[:,i] + 0.2 * v_n[:,i-1]
elif method == 'en2' or method == 'mp2':
print("MP2:%16.8f "%E_corr)
return E_corr, pt_vector
# }}}
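The lcc branch of truncated_pt2 resums the perturbation series by repeated application of the resolvent, v_i = R0 (H' v_{i-1}), accumulating E_i = <0|H|v_i> until the correction stops changing; when the series converges, the summed amplitudes solve the linear (CEPA(0)-like) equations exactly. A toy NumPy sketch with hypothetical 2x2 data, checked against the closed-form solve:

```python
import numpy as np

e0 = 0.0
hd = np.array([2.0, 3.0])                  # external diagonal <x|H0|x>
hp = np.array([[0.0, 0.1], [0.1, 0.0]])    # external off-diagonal part H'
h0d = np.array([0.3, 0.2])                 # coupling <0|H|x>

r0 = 1.0 / (e0 - hd)                       # resolvent diagonal
v = r0 * h0d                               # first-order amplitudes
c = v.copy()                               # accumulated amplitudes
for _ in range(200):                       # geometric resummation
    v = r0 * (hp @ v)
    c += v
    if np.linalg.norm(v) < 1e-14:
        break
e_lcc = float(h0d @ c)

# Closed form: c satisfies (diag(e0 - hd) - hp) c = h0d
c_exact = np.linalg.solve(np.diag(e0 - hd) - hp, h0d)
```

The loop converges because the spectral radius of R0 H' is well below one for this toy gap; the code above relies on the same condition.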
def pt2infty(clustered_ham,ci_vector,pt_vector,form_H=True,nproc=None):
"""
    The DMBPT(infty) equivalent for TPS methods; equivalent to CEPA/LCCSD.
Input:
clustered_ham: clustered_ham for the system
ci_vector: single CMF state for now
pt_vector: the pt_vector generated using compute_pt2_correction function
Output:
The correlation energy using cepa
pt_vector: which is the updated LCC vector
"""
# {{{
clusters = clustered_ham.clusters
ts = 0
print("len CI",len(ci_vector))
print("len PT",len(pt_vector))
pt_dim = len(pt_vector)
ci_dim = len(ci_vector)
pt_order = 500
for fockspace,configs in ci_vector.items():
if fockspace in pt_vector:
for config,coeff in configs.items():
assert(config not in pt_vector[fockspace])
H0d = build_block_hamiltonian(clustered_ham,ci_vector,pt_vector,iprint=0)
H00 = build_full_hamiltonian(clustered_ham,ci_vector,iprint=0)
if form_H:
        print("Storage for H %8.4f GB"%((pt_dim*pt_dim*8)/1e9))
if pt_dim > 60000:
print("Memory for just storing H is approx 29 GB")
exit()
Hdd = build_full_hamiltonian_parallel2(clustered_ham,pt_vector,iprint=0,nproc=nproc)
np.fill_diagonal(Hdd,0)
print(H00)
E0,V0 = np.linalg.eigh(H00)
E0 = E0[ts]
Hd = build_hamiltonian_diagonal(clustered_ham,pt_vector)
print("E0 %16.8f"%E0)
R0 = 1/(E0 - Hd)
print(R0)
v1 = np.multiply(R0,H0d)
pt_vector.set_vector(v1.T)
print(v1.shape)
print(H0d.shape)
e2 = H0d @ v1.T
print(e2)
v_n = np.zeros((pt_dim,pt_order+1)) #list of PT vectors
E_mpn = np.zeros((pt_order+1)) #PT energy
v_n[: ,0] = v1
E_mpn[0] = e2
E_corr = 0
print(" %6s %16s %16s "%("Order","Correction","Energy"))
#print(" %6i %16.8f %16.8f "%(1,first_order_E[0,s],E_mpn[0]))
E_corr = E_mpn[0]
print(" %6i %16.8f %16.8f "%(2,E_mpn[0],E_corr))
Eold = E_corr
for i in range(1,pt_order-1):
#h1 = Hdd @ v_n[:,i-1]
if form_H:
h1 = Hdd @ v_n[:,i-1]
else:
sigma = build_sigma(clustered_ham,pt_vector,iprint=0, opt_einsum=True)
h1 = sigma.get_vector()
v_n[:,i] = h1.reshape(pt_dim)
#for k in range(0,i):
# v_n[:,i] -= np.multiply(E_mpn[k-1],v_n[:,(i-k-1)].reshape(pt_dim))
v_n[:,i] = np.multiply(R0,v_n[:,i])
E_mpn[i] = H0d @ v_n[:,i].T
#print(E_mpn)
E_corr += E_mpn[i]
print(" %6i %16.8f %16.8f "%(i+2,E_mpn[i],E_corr))
pt_vector.set_vector(v_n[:,i])
if abs(E_corr - Eold) < 1e-10:
print("LCC:%16.8f "%E_corr)
break
else:
Eold = E_corr
#v_n[:,i] = 0.8 * v_n[:,i] + 0.2 * v_n[:,i-1]
return E_corr, pt_vector
# }}}
def build_sigma(clustered_ham,ci_vector,iprint=0, opt_einsum=True):
"""
Form the sigma vector using the EN zero order hamiltonian
    Cannot be used for Davidson, since this is the H0 of the EN partitioning (the diagonal of H is zeroed)
"""
# {{{
clusters = clustered_ham.clusters
sigma = np.zeros(len(ci_vector))
ci_v = ci_vector.get_vector()
shift_l = 0
for fock_li, fock_l in enumerate(ci_vector.data):
configs_l = ci_vector[fock_l]
if iprint > 0:
print(fock_l)
for config_li, config_l in enumerate(configs_l):
idx_l = shift_l + config_li
shift_r = 0
for fock_ri, fock_r in enumerate(ci_vector.data):
configs_r = ci_vector[fock_r]
delta_fock= tuple([(fock_l[ci][0]-fock_r[ci][0], fock_l[ci][1]-fock_r[ci][1]) for ci in range(len(clusters))])
if fock_ri<fock_li:
shift_r += len(configs_r)
continue
try:
terms = clustered_ham.terms[delta_fock]
except KeyError:
shift_r += len(configs_r)
continue
for config_ri, config_r in enumerate(configs_r):
idx_r = shift_r + config_ri
if idx_r<idx_l:
continue
for term in terms:
me = term.matrix_element(fock_l,config_l,fock_r,config_r)
if idx_r == idx_l:
me = 0
sigma[idx_l] += me * ci_v[idx_r]
if idx_r>idx_l:
sigma[idx_r] += me * ci_v[idx_l]
#print(" %4i %4i = %12.8f"%(idx_l,idx_r,me)," : ",config_l,config_r, " :: ", term)
shift_r += len(configs_r)
shift_l += len(configs_l)
sigma_vec = ci_vector.copy()
sigma_vec.set_vector(sigma)
return sigma_vec
# }}}
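build_sigma exploits Hermiticity: it visits only the upper triangle (idx_r >= idx_l) and scatters each matrix element into both sigma[idx_l] and sigma[idx_r]. The same pattern on a plain dense matrix — a sketch, with the diagonal kept here for clarity (build_sigma zeroes it because the EN H0 carries it):

```python
import numpy as np

def symmetric_matvec_upper(h, v):
    """y = H v using only the upper triangle of symmetric H,
    mirroring the double-scatter loop in build_sigma."""
    n = len(v)
    y = np.zeros(n)
    for l in range(n):
        y[l] += h[l, l] * v[l]              # diagonal, touched once
        for r in range(l + 1, n):
            y[l] += h[l, r] * v[r]          # upper-triangle element ...
            y[r] += h[l, r] * v[l]          # ... reused for the lower one
    return y
```

This halves the number of matrix elements evaluated, which matters here because each element is an expensive term.matrix_element call rather than an array lookup.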
# ---- pysnmp-with-texts/EMC-MIB.py (agustinhenze/mibs.snmplabs.com, Apache-2.0) ----
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module EMC-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/EMC-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:02:38 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsIntersection, ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsIntersection", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Gauge32, Counter32, TimeTicks, NotificationType, Integer32, MibIdentifier, Unsigned32, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter64, Bits, ObjectIdentity, IpAddress, Opaque, ModuleIdentity, experimental, NotificationType, enterprises, iso = mibBuilder.importSymbols("SNMPv2-SMI", "Gauge32", "Counter32", "TimeTicks", "NotificationType", "Integer32", "MibIdentifier", "Unsigned32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter64", "Bits", "ObjectIdentity", "IpAddress", "Opaque", "ModuleIdentity", "experimental", "NotificationType", "enterprises", "iso")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
class UInt32(Gauge32):
pass
emc = MibIdentifier((1, 3, 6, 1, 4, 1, 1139))
emcSymmetrix = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1))
systemCalls = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2))
informational = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1))
systemInformation = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257))
systemCodes = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258))
diskAdapterDeviceConfiguration = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273))
deviceHostAddressConfiguration = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281))
control = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 2, 2))
discovery = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 3))
agentAdministration = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4))
analyzer = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000))
analyzerFiles = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3))
clients = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1001))
trapSetup = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1002))
activePorts = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1003))
agentConfiguration = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1004))
subagentConfiguration = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005))
mainframeVariables = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 5))
symAPI = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6))
symAPIList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1))
symList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 1))
symRemoteList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 2))
symDevList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3))
symPDevList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4))
symPDevNoDgList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5))
symDevNoDgList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6))
symDgList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 7))
symLDevList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8))
symGateList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9))
symBcvDevList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10))
symBcvPDevList = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11))
symAPIShow = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2))
symShow = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1))
symDevShow = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2))
symAPIStatistics = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3))
dirPortStatistics = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10))
symmEvent = MibIdentifier((1, 3, 6, 1, 4, 1, 1139, 1, 7))
emcControlCenter = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 1), )
if mibBuilder.loadTexts: emcControlCenter.setStatus('obsolete')
if mibBuilder.loadTexts: emcControlCenter.setDescription('A list of EMC Control Center specific variables entries. The number of entries is given by the value of discoveryTableSize.')
esmVariables = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: esmVariables.setStatus('obsolete')
if mibBuilder.loadTexts: esmVariables.setDescription('An entry containing objects for a particular Symmertrix.')
emcSymCnfg = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymCnfg.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymCnfg.setDescription('symmetrix.cnfg disk variable ')
emcSymDiskCfg = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 2), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymDiskCfg.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymDiskCfg.setDescription('Symmetrix DISKS CONFIGURATION data')
emcSymMirrorDiskCfg = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 3), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMirrorDiskCfg.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMirrorDiskCfg.setDescription('Symmetrix MIRRORED DISKS CONFIGURATION data')
emcSymMirror3DiskCfg = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 4), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMirror3DiskCfg.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMirror3DiskCfg.setDescription('Symmetrix MIRRORED3 DISKS CONFIGURATION data')
emcSymMirror4DiskCfg = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 5), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMirror4DiskCfg.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMirror4DiskCfg.setDescription('Symmetrix MIRRORED4 DISKS CONFIGURATION data')
emcSymStatistics = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 6), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymStatistics.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymStatistics.setDescription('Symmetrix STATISTICS data')
emcSymUtilA7 = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymUtilA7.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymUtilA7.setDescription("UTILITY A7 -- Show disks 'W PEND' tracks counts")
emcSymRdfMaint = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 8), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymRdfMaint.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymRdfMaint.setDescription('Symmetrix Remote Data Facility Maintenance')
emcSymWinConfig = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 9), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymWinConfig.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymWinConfig.setDescription('EMC ICDA Manager CONFIGURATION data')
emcSymUtil99 = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymUtil99.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymUtil99.setDescription('Symmetrix error stats')
emcSymDir = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 11), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymDir.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymDir.setDescription('Symmetrix director information.')
emcSymDevStats = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 12), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymDevStats.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymDevStats.setDescription('Symmetrix error stats')
emcSymSumStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 13), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymSumStatus.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymSumStatus.setDescription('Symmetrix Summary Status: 0 = no errors, 1 = warning, 2+ = fatal error. ')
emcRatiosOutofRange = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 14), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcRatiosOutofRange.setStatus('obsolete')
if mibBuilder.loadTexts: emcRatiosOutofRange.setDescription('Symmetrix Write/Hit Ratio Status. A bit-wise integer value indicating hit or write ratio out of range. If (value & 1) then hit ratio out of specified range. If (value & 2) then write ratio out of specified range. ')
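The description above defines a bit-wise encoding. As an illustrative sketch (not part of the generated MIB module; the flag names are chosen here for readability), the value can be decoded as:

```python
def decode_ratio_flags(value):
    """Decode the bit-wise emcRatiosOutofRange value into a list of
    out-of-range indicators, per the DESCRIPTION clause above."""
    flags = []
    if value & 1:  # bit 0: hit ratio out of the specified range
        flags.append("hit-ratio-out-of-range")
    if value & 2:  # bit 1: write ratio out of the specified range
        flags.append("write-ratio-out-of-range")
    return flags
```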
emcSymPortStats = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 15), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymPortStats.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymPortStats.setDescription('Symmetrix port statistics')
emcSymBCVDevice = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 16), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymBCVDevice.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymBCVDevice.setDescription('Symmetrix BCV Device information')
emcSymSaitInfo = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 17), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymSaitInfo.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymSaitInfo.setDescription('Symmetrix SCSI/Fiber channel information')
emcSymTimefinderInfo = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 18), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymTimefinderInfo.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymTimefinderInfo.setDescription('Symmetrix Timefinder information')
emcSymSRDFInfo = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 19), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymSRDFInfo.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymSRDFInfo.setDescription('Symmetrix RDF Device information')
emcSymPhysDevStats = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 20), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymPhysDevStats.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymPhysDevStats.setDescription('Symmetrix Physical Device Statistics')
emcSymSumStatusErrorCodes = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 1, 1, 98), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymSumStatusErrorCodes.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymSumStatusErrorCodes.setDescription('A Colon-delimited list of error codes that caused the error in emcSymSumStatus ')
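Combining the two objects above, a minimal helper (an illustrative assumption, not part of the generated module) maps emcSymSumStatus to a severity label and splits the colon-delimited emcSymSumStatusErrorCodes string:

```python
def summarize_status(sum_status, error_codes=""):
    """Map an emcSymSumStatus value to a severity label (0 = ok,
    1 = warning, 2+ = fatal) and split the colon-delimited
    emcSymSumStatusErrorCodes string into individual codes."""
    if sum_status == 0:
        severity = "ok"
    elif sum_status == 1:
        severity = "warning"
    else:  # values of 2 and above indicate a fatal error
        severity = "fatal"
    codes = [c for c in error_codes.split(":") if c]
    return severity, codes
```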
systemInfoHeaderTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1), )
if mibBuilder.loadTexts: systemInfoHeaderTable.setStatus('obsolete')
if mibBuilder.loadTexts: systemInfoHeaderTable.setDescription('A table of Symmetrix information containing the results of Syscall 0x0101 for the specified Symmetrix instance. The number of entries is given by the value of discIndex.')
systemInfoHeaderEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: systemInfoHeaderEntry.setStatus('obsolete')
if mibBuilder.loadTexts: systemInfoHeaderEntry.setDescription('An entry containing objects for the indicated systemInfoHeaderTable element.')
sysinfoBuffer = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoBuffer.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoBuffer.setDescription('The entire return buffer of system call 0x0101 for the indicated Symmetrix.')
sysinfoNumberofRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoNumberofRecords.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoNumberofRecords.setDescription('The count of the number of records in the buffer returned by Syscall 0x0101.')
sysinfoRecordSize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoRecordSize.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoRecordSize.setDescription('This object is the size of one record in the buffer returned by Syscall 0x0101.')
sysinfoFirstRecordNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoFirstRecordNumber.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoFirstRecordNumber.setDescription('This object is the first record number in the buffer returned by Syscall 0x0101.')
sysinfoMaxRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoMaxRecords.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoMaxRecords.setDescription('This object is the maximum number of records available in this Symmetrix for Syscall 0x0101.')
sysinfoRecordsTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2), )
if mibBuilder.loadTexts: sysinfoRecordsTable.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoRecordsTable.setDescription('This table provides a method to access one device record within the buffer returned by Syscall 0x0101.')
sysinfoRecordsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: sysinfoRecordsEntry.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoRecordsEntry.setDescription('One entire record of system information for the indicated Symmetrix.')
sysinfoSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(16, 16)).setFixedLength(16)).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoSerialNumber.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoSerialNumber.setDescription('This object describes the serial number of the indicated Symmetrix.')
sysinfoNumberofDirectors = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2, 1, 19), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoNumberofDirectors.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoNumberofDirectors.setDescription('The number of directors in this Symmetrix. ')
sysinfoNumberofVolumes = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2, 1, 23), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoNumberofVolumes.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoNumberofVolumes.setDescription(' This object describes the number of logical devices present on the system.')
sysinfoMemorySize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 257, 2, 1, 25), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sysinfoMemorySize.setStatus('obsolete')
if mibBuilder.loadTexts: sysinfoMemorySize.setDescription('The amount of memory, in Megabytes, in the system. This is also the last memory address in the system ')
systemCodesTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1), )
if mibBuilder.loadTexts: systemCodesTable.setStatus('obsolete')
if mibBuilder.loadTexts: systemCodesTable.setDescription('A table of Symmetrix information containing the results of Syscall 0x0102 for the specified Symmetrix instance. The number of entries is given by the value of discIndex.')
systemCodesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: systemCodesEntry.setStatus('obsolete')
if mibBuilder.loadTexts: systemCodesEntry.setDescription('An entry containing objects for the indicated systemCodesTable element.')
syscodesBuffer = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesBuffer.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesBuffer.setDescription('The entire return buffer of system call 0x0102. ')
syscodesNumberofRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesNumberofRecords.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesNumberofRecords.setDescription('The count of the number of records in the buffer returned by Syscall 0x0102.')
syscodesRecordSize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesRecordSize.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesRecordSize.setDescription('This object is the size of one record in the buffer returned by Syscall 0x0102.')
syscodesFirstRecordNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesFirstRecordNumber.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesFirstRecordNumber.setDescription('This object is the first record number in the buffer returned by Syscall 0x0102.')
syscodesMaxRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesMaxRecords.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesMaxRecords.setDescription('This object is the maximum number of records available in this Symmetrix for Syscall 0x0102.')
systemCodesRecordsTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2), )
if mibBuilder.loadTexts: systemCodesRecordsTable.setStatus('obsolete')
if mibBuilder.loadTexts: systemCodesRecordsTable.setDescription('This table provides a method to access one director record within the buffer returned by Syscall 0x0102.')
systemCodesRecordsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "syscodesDirectorNum"))
if mibBuilder.loadTexts: systemCodesRecordsEntry.setStatus('obsolete')
if mibBuilder.loadTexts: systemCodesRecordsEntry.setDescription('One entire record of system code information for the indicated Symmetrix and director.')
syscodesDirectorType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("parallel-adapter", 1), ("escon-adapter", 2), ("scsi-adapter", 3), ("disk-adapter", 4), ("remote-adapter", 5), ("fiber-adapter", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesDirectorType.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesDirectorType.setDescription('The 1 byte director type identifier. 1 - Parallel Channel Adapter card 2 - ESCON Adapter card 3 - SCSI Adapter card 4 - Disk Adapter card 5 - RDF Adapter card 6 - Fiber Channel Adapter card ')
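The enumeration above pairs each 1-byte identifier with an adapter card type. An illustrative lookup helper (not part of the generated module) mirroring the NamedValues defined for syscodesDirectorType:

```python
# Mapping taken from the syscodesDirectorType NamedValues above.
DIRECTOR_TYPES = {
    1: "parallel-adapter",
    2: "escon-adapter",
    3: "scsi-adapter",
    4: "disk-adapter",
    5: "remote-adapter",
    6: "fiber-adapter",
}

def director_type_label(value):
    """Return the adapter-type label for a syscodesDirectorType value,
    or 'unknown' for values outside the defined enumeration."""
    return DIRECTOR_TYPES.get(value, "unknown")
```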
syscodesDirectorNum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: syscodesDirectorNum.setStatus('obsolete')
if mibBuilder.loadTexts: syscodesDirectorNum.setDescription('The index to the table. The number of instances should be the number of Directors. The instance we are interested in at any point would be the Director number. NOTE: Director numbering may be zero based. If so, then an instance is Director number plus 1. ')
emulCodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 5), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 4))).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulCodeType.setStatus('obsolete')
if mibBuilder.loadTexts: emulCodeType.setDescription("The 4 byte code type of the director. Value is 'EMUL' ")
emulVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulVersion.setStatus('obsolete')
if mibBuilder.loadTexts: emulVersion.setDescription("The 4 byte version of the director's code. ")
emulDate = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 7), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulDate.setStatus('obsolete')
if mibBuilder.loadTexts: emulDate.setDescription("The 4 byte version date for the director's code. ")
emulChecksum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulChecksum.setStatus('obsolete')
if mibBuilder.loadTexts: emulChecksum.setDescription("The 4 byte checksum for the director's code. ")
emulMTPF = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulMTPF.setStatus('obsolete')
if mibBuilder.loadTexts: emulMTPF.setDescription("The 4 byte MTPF of the director's code. ")
emulFileCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emulFileCount.setStatus('obsolete')
if mibBuilder.loadTexts: emulFileCount.setDescription("The 4 byte file length of the director's code. ")
implCodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 11), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 4))).setMaxAccess("readonly")
if mibBuilder.loadTexts: implCodeType.setStatus('obsolete')
if mibBuilder.loadTexts: implCodeType.setDescription("The 4 byte code type of the director. Value is 'IMPL' ")
implVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: implVersion.setStatus('obsolete')
if mibBuilder.loadTexts: implVersion.setDescription("The 4 byte version of the director's code. ")
implDate = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 13), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: implDate.setStatus('obsolete')
if mibBuilder.loadTexts: implDate.setDescription("The 4 byte version date for the director's code. ")
implChecksum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: implChecksum.setStatus('obsolete')
if mibBuilder.loadTexts: implChecksum.setDescription("The 4 byte checksum for the director's code. ")
implMTPF = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: implMTPF.setStatus('obsolete')
if mibBuilder.loadTexts: implMTPF.setDescription("The 4 byte MTPF of the director's code. ")
implFileCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 16), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: implFileCount.setStatus('obsolete')
if mibBuilder.loadTexts: implFileCount.setDescription("The 4 byte file length of the director's code. ")
initCodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 17), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 4))).setMaxAccess("readonly")
if mibBuilder.loadTexts: initCodeType.setStatus('obsolete')
if mibBuilder.loadTexts: initCodeType.setDescription("The 4 byte code type of the director. Value is 'INIT' ")
initVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 18), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: initVersion.setStatus('obsolete')
if mibBuilder.loadTexts: initVersion.setDescription("The 4 byte version of the director's code. ")
initDate = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 19), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: initDate.setStatus('obsolete')
if mibBuilder.loadTexts: initDate.setDescription("The 4 byte version date for the director's code. ")
initChecksum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 20), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: initChecksum.setStatus('obsolete')
if mibBuilder.loadTexts: initChecksum.setDescription("The 4 byte checksum for the director's code. ")
initMTPF = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 21), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: initMTPF.setStatus('obsolete')
if mibBuilder.loadTexts: initMTPF.setDescription("The 4 byte MTPF of the director's code. ")
initFileCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 22), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: initFileCount.setStatus('obsolete')
if mibBuilder.loadTexts: initFileCount.setDescription("The 4 byte file length of the director's code. ")
escnCodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 23), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 4))).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnCodeType.setStatus('obsolete')
if mibBuilder.loadTexts: escnCodeType.setDescription("The 4 byte code type of the director. Value is 'ESCN' ")
escnVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 24), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnVersion.setStatus('obsolete')
if mibBuilder.loadTexts: escnVersion.setDescription("The 4 byte version of the director's code. ")
escnDate = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 25), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnDate.setStatus('obsolete')
if mibBuilder.loadTexts: escnDate.setDescription("The 4 byte version date for the director's code. ")
escnChecksum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 26), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnChecksum.setStatus('obsolete')
if mibBuilder.loadTexts: escnChecksum.setDescription("The 4 byte checksum for the director's code. ")
escnMTPF = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 27), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnMTPF.setStatus('obsolete')
if mibBuilder.loadTexts: escnMTPF.setDescription("The 4 byte MTPF of the director's code. ")
escnFileCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 258, 2, 1, 28), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: escnFileCount.setStatus('obsolete')
if mibBuilder.loadTexts: escnFileCount.setDescription("The 4 byte file length of the director's code. ")
diskAdapterDeviceConfigurationTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1), )
if mibBuilder.loadTexts: diskAdapterDeviceConfigurationTable.setStatus('obsolete')
if mibBuilder.loadTexts: diskAdapterDeviceConfigurationTable.setDescription('A table of Symmetrix information containing the results of Syscall 0x0111 for the specified Symmetrix instance. The number of entries is given by the value of discIndex.')
diskAdapterDeviceConfigurationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: diskAdapterDeviceConfigurationEntry.setStatus('obsolete')
if mibBuilder.loadTexts: diskAdapterDeviceConfigurationEntry.setDescription('An entry containing objects for the indicated diskAdapterDeviceConfigurationTable element.')
dadcnfigBuffer = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigBuffer.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigBuffer.setDescription('The entire return buffer of system call 0x0111. ')
dadcnfigNumberofRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigNumberofRecords.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigNumberofRecords.setDescription('The count of the number of records in the buffer returned by Syscall 0x0111.')
dadcnfigRecordSize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigRecordSize.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigRecordSize.setDescription('This object is the size of one record in the buffer returned by Syscall 0x0111.')
dadcnfigFirstRecordNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigFirstRecordNumber.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigFirstRecordNumber.setDescription('This object is the first record number in the buffer returned by Syscall 0x0111.')
dadcnfigMaxRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMaxRecords.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMaxRecords.setDescription('This object is the maximum number of records available in this Symmetrix for Syscall 0x0111.')
dadcnfigDeviceRecordsTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2), )
if mibBuilder.loadTexts: dadcnfigDeviceRecordsTable.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigDeviceRecordsTable.setDescription('This table provides a method to access one device record within the buffer returned by Syscall 0x0111.')
dadcnfigDeviceRecordsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "dadcnfigSymmNumber"))
if mibBuilder.loadTexts: dadcnfigDeviceRecordsEntry.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigDeviceRecordsEntry.setDescription('One entire record of disk adapter device configuration information for the indicated Symmetrix.')
dadcnfigSymmNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigSymmNumber.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigSymmNumber.setDescription('The 2 byte Symmetrix number of the device. ')
dadcnfigMirrors = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 8), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirrors.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirrors.setDescription("""An 8 byte buffer, 4 mirrors * 2 bytes each, indicating director and port assignments for this device's mirrors. Buffer format is:
   mir 1     mir 2     mir 3     mir 4
*----+----*----+----*----+----*----+----*
|DIR |i/f |DIR |i/f |DIR |i/f |DIR |i/f |
| #  |    | #  |    | #  |    | #  |    |
*----+----*----+----*----+----*----+----*
""")
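Given that layout (four mirrors, two bytes each: director number then interface number), the buffer can be split into pairs. A minimal sketch, assuming the raw 8 bytes have already been extracted from the Opaque value; this helper is not part of the generated module:

```python
import struct

def parse_mirrors(buf):
    """Split the 8-byte dadcnfigMirrors buffer into four
    (director, interface) tuples, one per mirror."""
    if len(buf) != 8:
        raise ValueError("expected exactly 8 bytes")
    vals = struct.unpack("8B", buf)  # 8 unsigned bytes
    return [(vals[i], vals[i + 1]) for i in range(0, 8, 2)]
```

The per-mirror columns that follow (dadcnfigMirror1Director, dadcnfigMirror1Interface, ...) expose these same bytes individually.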
dadcnfigMirror1Director = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror1Director.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror1Director.setDescription("The director number of this device's Mirror 1 device. ")
dadcnfigMirror1Interface = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror1Interface.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror1Interface.setDescription("The interface number of this device's Mirror 1 device. ")
dadcnfigMirror2Director = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror2Director.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror2Director.setDescription("The director number of this device's Mirror 2 device. ")
dadcnfigMirror2Interface = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror2Interface.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror2Interface.setDescription("The interface number of this device's Mirror 2 device. ")
dadcnfigMirror3Director = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 13), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror3Director.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror3Director.setDescription("The director number of this device's Mirror 3 device. ")
dadcnfigMirror3Interface = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror3Interface.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror3Interface.setDescription("The interface number of this device's Mirror 3 device. ")
dadcnfigMirror4Director = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror4Director.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror4Director.setDescription("The director number of this device's Mirror 4 device. ")
dadcnfigMirror4Interface = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 273, 2, 1, 16), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dadcnfigMirror4Interface.setStatus('obsolete')
if mibBuilder.loadTexts: dadcnfigMirror4Interface.setDescription("The interface number of this device's Mirror 4 device. ")
deviceHostAddressConfigurationTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1), )
if mibBuilder.loadTexts: deviceHostAddressConfigurationTable.setStatus('obsolete')
if mibBuilder.loadTexts: deviceHostAddressConfigurationTable.setDescription('A table of Symmetrix information containing the results of Syscall 0x0119 for the specified Symmetrix instance. The number of entries is given by the value of discIndex.')
deviceHostAddressConfigurationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: deviceHostAddressConfigurationEntry.setStatus('obsolete')
if mibBuilder.loadTexts: deviceHostAddressConfigurationEntry.setDescription('An entry containing objects for the indicated deviceHostAddressConfigurationTable element.')
dvhoaddrBuffer = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrBuffer.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrBuffer.setDescription('The entire return buffer of system call 0x0119. ')
dvhoaddrNumberofRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrNumberofRecords.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrNumberofRecords.setDescription('The count of the number of records in the buffer returned by Syscall 0x0119.')
dvhoaddrRecordSize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrRecordSize.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrRecordSize.setDescription('This object is the size of one record in the buffer returned by Syscall 0x0119.')
dvhoaddrFirstRecordNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrFirstRecordNumber.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrFirstRecordNumber.setDescription('This object is the first record number in the buffer returned by Syscall 0x0119.')
dvhoaddrMaxRecords = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrMaxRecords.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrMaxRecords.setDescription('This object is the maximum number of records available in this Symmetrix for Syscall 0x0119.')
dvhoaddrDeviceRecordsTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2), )
if mibBuilder.loadTexts: dvhoaddrDeviceRecordsTable.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrDeviceRecordsTable.setDescription('This table provides a method to access one device record within the buffer returned by Syscall 0x0119.')
dvhoaddrDeviceRecordsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "dvhoaddrSymmNumber"))
if mibBuilder.loadTexts: dvhoaddrDeviceRecordsEntry.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrDeviceRecordsEntry.setDescription('One entire record of device host address information for the indicated Symmetrix.')
dvhoaddrSymmNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrSymmNumber.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrSymmNumber.setDescription('The 2 byte Symmetrix number of the device. ')
dvhoaddrDirectorNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrDirectorNumber.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrDirectorNumber.setDescription('The 2 byte Symmetrix number of the director. ')
dvhoaddrPortAType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 6))).clone(namedValues=NamedValues(("parallel-ca", 1), ("escon-ca", 2), ("sa", 3), ("fibre", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortAType.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortAType.setDescription('The type of port for Port A. 01 - Parallel CA 02 - ESCON CA 03 - SA 06 - Fibre Channel')
dvhoaddrPortADeviceAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 4), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortADeviceAddress.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortADeviceAddress.setDescription('The 1 byte port address for this device on this port.')
dvhoaddrPortBType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 6))).clone(namedValues=NamedValues(("parallel-ca", 1), ("escon-ca", 2), ("sa", 3), ("fibre", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortBType.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortBType.setDescription('The type of port for Port B. 01 - Parallel CA 02 - ESCON CA 03 - SA 06 - Fibre Channel')
dvhoaddrPortBDeviceAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 6), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortBDeviceAddress.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortBDeviceAddress.setDescription('The 1 byte port address for this device on this port.')
dvhoaddrPortCType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 6))).clone(namedValues=NamedValues(("parallel-ca", 1), ("escon-ca", 2), ("sa", 3), ("fibre", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortCType.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortCType.setDescription('The type of port for Port C. 01 - Parallel CA 02 - ESCON CA 03 - SA 06 - Fibre Channel')
dvhoaddrPortCDeviceAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 8), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortCDeviceAddress.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortCDeviceAddress.setDescription('The 1 byte port address for this device on this port.')
dvhoaddrPortDType = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 6))).clone(namedValues=NamedValues(("parallel-ca", 1), ("escon-ca", 2), ("sa", 3), ("fibre", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortDType.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortDType.setDescription('The type of port for Port D. 01 - Parallel CA 02 - ESCON CA 03 - SA 06 - Fibre Channel')
dvhoaddrPortDDeviceAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 10), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrPortDDeviceAddress.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrPortDDeviceAddress.setDescription('The 1 byte port address for this device on this port.')
dvhoaddrMetaFlags = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 11), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrMetaFlags.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrMetaFlags.setDescription('The 1 byte Meta Flags if this record is an SA record.')
dvhoaddrFiberChannelAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 2, 1, 281, 2, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dvhoaddrFiberChannelAddress.setStatus('obsolete')
if mibBuilder.loadTexts: dvhoaddrFiberChannelAddress.setDescription('The 2 byte address if this record is a Fiber Channel record. There is only 1 port defined.')
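# The dvhoaddrPort*Type columns above all share one sparse enumeration:
# 1, 2, 3 and 6 are assigned, 4 and 5 are not. A minimal sketch of how a
# consumer might decode those raw integers; PORT_TYPE_LABELS and
# port_type_label are hypothetical helpers, not part of the generated MIB.

```python
# Hypothetical decoder for the dvhoaddrPort*Type NamedValues. Note the
# gap in the enumeration: fibre is 6, and 4/5 are unassigned.
PORT_TYPE_LABELS = {
    1: "parallel-ca",
    2: "escon-ca",
    3: "sa",
    6: "fibre",
}

def port_type_label(value):
    """Return the symbolic label for a port-type code, or 'unknown'."""
    return PORT_TYPE_LABELS.get(value, "unknown")
```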
discoveryTableSize = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 3, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discoveryTableSize.setStatus('mandatory')
if mibBuilder.loadTexts: discoveryTableSize.setDescription('The number of Symmetrixes that are, or have been, present on this system.')
discoveryTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2), )
if mibBuilder.loadTexts: discoveryTable.setStatus('mandatory')
if mibBuilder.loadTexts: discoveryTable.setDescription('A list of Symmetrixes. The number of entries is given by the value of discoveryTableSize.')
discoveryTbl = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: discoveryTbl.setStatus('mandatory')
if mibBuilder.loadTexts: discoveryTbl.setDescription('An interface entry containing objects for a particular Symmetrix.')
discIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discIndex.setStatus('mandatory')
if mibBuilder.loadTexts: discIndex.setDescription("A unique value for each Symmetrix. Its value ranges between 1 and the value of discoveryTableSize. The value for each Symmetrix must remain constant from one agent re-initialization to the next re-initialization, or until the agent's discovery list is reset.")
discSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discSerialNumber.setStatus('mandatory')
if mibBuilder.loadTexts: discSerialNumber.setDescription('The serial number of this attached Symmetrix')
discRawDevice = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discRawDevice.setStatus('obsolete')
if mibBuilder.loadTexts: discRawDevice.setDescription("The 'gatekeeper' device the agent uses to extract information from this Symmetrix, via the SCSI connection")
discModel = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discModel.setStatus('mandatory')
if mibBuilder.loadTexts: discModel.setDescription('This Symmetrix Model number')
discCapacity = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discCapacity.setStatus('obsolete')
if mibBuilder.loadTexts: discCapacity.setDescription("The size, in bytes, of the 'gatekeeper' device for this Symmetrix. This object is obsolete in MIB Version 2.0. Agent revisions of 4.0 or greater will return zero as the value.")
discChecksum = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discChecksum.setStatus('mandatory')
if mibBuilder.loadTexts: discChecksum.setDescription('The checksum value of the IMPL for this Symmetrix.')
discConfigDate = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discConfigDate.setStatus('mandatory')
if mibBuilder.loadTexts: discConfigDate.setDescription('The date of the last configuration change. Format = MMDDYYYY ')
discRDF = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("false", 0), ("true", 1), ("unknown", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: discRDF.setStatus('mandatory')
if mibBuilder.loadTexts: discRDF.setDescription('Indicates if RDF is available for this Symmetrix')
discBCV = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("false", 0), ("true", 1), ("unknown", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: discBCV.setStatus('mandatory')
if mibBuilder.loadTexts: discBCV.setDescription('Indicates if BCV Devices are configured in this Symmetrix')
discState = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("unknown", 1), ("online", 2), ("offline", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: discState.setStatus('mandatory')
if mibBuilder.loadTexts: discState.setDescription('Indicates the online/offline state of this Symmetrix')
discStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("unknown", 1), ("unused", 2), ("ok", 3), ("warning", 4), ("failed", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: discStatus.setStatus('mandatory')
if mibBuilder.loadTexts: discStatus.setDescription('Indicates the overall status of this Symmetrix')
discMicrocodeVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 12), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discMicrocodeVersion.setStatus('mandatory')
if mibBuilder.loadTexts: discMicrocodeVersion.setDescription('The microcode version running in this Symmetrix')
discSymapisrv_IP = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 13), IpAddress()).setLabel("discSymapisrv-IP").setMaxAccess("readonly")
if mibBuilder.loadTexts: discSymapisrv_IP.setStatus('mandatory')
if mibBuilder.loadTexts: discSymapisrv_IP.setDescription("The IP address of the symapi server from which this Symmetrix's data is sourced.")
discNumEvents = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discNumEvents.setStatus('mandatory')
if mibBuilder.loadTexts: discNumEvents.setDescription('Number of events currently in the symmEventTable.')
discEventCurrID = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discEventCurrID.setStatus('mandatory')
if mibBuilder.loadTexts: discEventCurrID.setDescription('The last used event id (symmEventId).')
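# The discoveryTable columns above are addressed per-Symmetrix by appending
# the discIndex value to a column's OID. A minimal sketch of that instance
# addressing; disc_column_instance is a hypothetical helper, not part of
# the generated MIB.

```python
# Row OID of discoveryTbl; a cell instance is row + column sub-id + index.
DISCOVERY_ROW_OID = (1, 3, 6, 1, 4, 1, 1139, 1, 3, 2, 1)

def disc_column_instance(column, disc_index):
    """Build the instance OID for one discoveryTable cell.

    E.g. discSerialNumber (column 2) for the Symmetrix at discIndex 1.
    """
    return DISCOVERY_ROW_OID + (column, disc_index)
```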
agentRevision = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: agentRevision.setStatus('mandatory')
if mibBuilder.loadTexts: agentRevision.setDescription('The current revision of the agent software ')
mibRevision = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mibRevision.setStatus('mandatory')
if mibBuilder.loadTexts: mibRevision.setDescription('Mib Revision 4.1')
agentType = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("unknown", 0), ("unix-host", 1), ("mainframe", 2), ("nt-host", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: agentType.setStatus('mandatory')
if mibBuilder.loadTexts: agentType.setDescription('Integer value indicating the agent host environment, so polling applications can adjust accordingly')
periodicDiscoveryFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: periodicDiscoveryFrequency.setStatus('mandatory')
if mibBuilder.loadTexts: periodicDiscoveryFrequency.setDescription("Indicates how often the Discovery thread should rebuild the table of attached Symmetrixes, adding new ones, and removing old ones. Any changes between the new table and the previous table will generate a trap (Trap 4) to all registered trap clients. Initialize at startup from a separate 'config' file. Recommended default is 3600 seconds (1 hour). A value less than 60 will disable the thread. Minimum frequency is 60 seconds.")
checksumTestFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: checksumTestFrequency.setStatus('mandatory')
if mibBuilder.loadTexts: checksumTestFrequency.setDescription("Indicates how often the Checksum thread should test for any changes between the current and previous checksum value for all discovered Symmetrixes. For each checksum change, a trap will be generated (Trap 5) to all registered trap clients. Initialize at startup from a separate 'config' file. Recommended default is 60 seconds (1 minute). A value less than 60 will disable the thread. Minimum frequency is 60 seconds.")
statusCheckFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: statusCheckFrequency.setStatus('mandatory')
if mibBuilder.loadTexts: statusCheckFrequency.setDescription("Indicates how often the Status thread should gather all status and error data from all discovered Symmetrixes. For each director or device error condition, a trap will be generated (Trap 1 or 2) to all registered trap clients. Initialize at startup from a separate 'config' file. Recommended default is 60 seconds (1 minute). A value less than 60 will disable the thread. Minimum frequency is 60 seconds.")
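# periodicDiscoveryFrequency, checksumTestFrequency and statusCheckFrequency
# all share one rule: the minimum interval is 60 seconds, and any value
# below 60 disables the corresponding thread. A minimal sketch of that
# rule; effective_frequency is a hypothetical helper, not part of the
# generated MIB.

```python
def effective_frequency(requested_seconds):
    """Interpret a *Frequency value as the agent's threads would.

    Returns the polling interval in seconds, or None when the value
    falls below the 60-second minimum and the thread is disabled.
    """
    if requested_seconds < 60:
        return None  # thread disabled
    return requested_seconds
```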
discoveryChangeTime = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 302), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discoveryChangeTime.setStatus('mandatory')
if mibBuilder.loadTexts: discoveryChangeTime.setDescription("Indicates the time the discovery table was last changed by the agent. The value is in seconds, as returned by the standard 'C' function time().")
clientListMaintenanceFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1001, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: clientListMaintenanceFrequency.setStatus('obsolete')
if mibBuilder.loadTexts: clientListMaintenanceFrequency.setDescription("Indicates how often the Client List maintenance thread should 'wake up' to remove old requests and clients from the list. Initialize at startup from a separate 'config' file. Recommended default is 15 minutes (900 seconds).")
clientListRequestExpiration = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1001, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: clientListRequestExpiration.setStatus('obsolete')
if mibBuilder.loadTexts: clientListRequestExpiration.setDescription("Indicates how old a client request should be to consider removing it from the Client List. It is assumed that a time-out condition occurred at the client and that the data, no longer of any value, can be deleted. Initialize at startup from a separate 'config' file. Recommended default is 15 minutes (900 seconds).")
clientListClientExpiration = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1001, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: clientListClientExpiration.setStatus('obsolete')
if mibBuilder.loadTexts: clientListClientExpiration.setDescription("Indicates how long a client can remain in the Client List without making a request to the agent, after which point it is deleted from the list. Clients are added to the list by making a request to the agent. Initialize at startup from a separate 'config' file. Recommended default is 30 minutes (1800 seconds).")
discoveryTrapPort = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1002, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: discoveryTrapPort.setStatus('obsolete')
if mibBuilder.loadTexts: discoveryTrapPort.setDescription("Each client can set its own port for receiving rediscovery traps in the event the client cannot listen on port 162. The agent will send any discovery table notifications to port 162 and, if set (i.e. > 0), to the client's designated port.")
trapTestFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1002, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: trapTestFrequency.setStatus('obsolete')
if mibBuilder.loadTexts: trapTestFrequency.setDescription("Indicates how often the Trap Test thread should 'wake up' to test for trap conditions in each attached Symmetrix. Initialize at startup from a separate 'config' file. Recommended default is 60 seconds.")
standardSNMPRequestPort = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1003, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: standardSNMPRequestPort.setStatus('mandatory')
if mibBuilder.loadTexts: standardSNMPRequestPort.setDescription('Indicates if the agent was able to bind to the standard SNMP request port, port 161. Value of -1 indicates another SNMP agent was already active on this host.')
esmSNMPRequestPort = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1003, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esmSNMPRequestPort.setStatus('mandatory')
if mibBuilder.loadTexts: esmSNMPRequestPort.setDescription("Indicates what port the agent was able to bind to receive SNMP requests for ESM data. This port can also be browsed for all available MIB objects in the event the standard SNMP port is unavailable, or the standard EMC.MIB is not loaded into the host's SNMP agent.")
celerraTCPPort = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1003, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: celerraTCPPort.setStatus('obsolete')
if mibBuilder.loadTexts: celerraTCPPort.setDescription('Indicates what port the agent was able to bind to receive TCP requests for ESM data from a Celerra Monitor. Requests made to this port must conform to the internal proprietary protocol established between a Celerra Monitor and the agent')
xdrTCPPort = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1003, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: xdrTCPPort.setStatus('obsolete')
if mibBuilder.loadTexts: xdrTCPPort.setDescription('Indicates what port the agent was able to bind to receive TCP requests for ESM data. Requests made to this port must be XDR encoded and conform to the command set established in the agent')
esmVariablePacketSize = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1004, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: esmVariablePacketSize.setStatus('obsolete')
if mibBuilder.loadTexts: esmVariablePacketSize.setDescription("Agent's maximum SNMP packet size for ESM Opaque variables. Each client can set its own preferred size within the agent's internal client list. Default size: 2000 bytes.")
discoveryFrequency = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1004, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: discoveryFrequency.setStatus('obsolete')
if mibBuilder.loadTexts: discoveryFrequency.setDescription("Indicates how often the Discovery thread should 'wake up' to rebuild the table of attached Symmetrixes, adding new ones, and removing old ones. Any changes between the new table and the previous table will generate a trap to all clients that had previously retrieved the discovery table. Those clients will be prevented from retrieving any data from the agent until they retrieve the new discovery table. Initialize at startup from a separate 'config' file. Recommended default is 600 seconds (10 minutes).")
masterTraceMessagesEnable = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1004, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("off", 0), ("on", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: masterTraceMessagesEnable.setStatus('obsolete')
if mibBuilder.loadTexts: masterTraceMessagesEnable.setDescription('Turns trace messages on or off to the console. 0 = OFF 1 = ON')
analyzerTopFileSavePolicy = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analyzerTopFileSavePolicy.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerTopFileSavePolicy.setDescription("Indicates how long a *.top file can remain on disk before being deleted. Value is in days. Initialize at startup from a separate 'config' file. Recommended default is 7 days.")
analyzerSpecialDurationLimit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analyzerSpecialDurationLimit.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerSpecialDurationLimit.setDescription("This is a cap applied to 'special' analyzer collection requests that have a duration that exceeds this amount. Value is in hours. Initialize at startup from a separate 'config' file. Recommended default is 24 hours.")
analyzerFilesCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 1), )
if mibBuilder.loadTexts: analyzerFilesCountTable.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFilesCountTable.setDescription('A list of the number of analyzer files present for the given Symmetrix instance.')
analyzerFileCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 1, 1), ).setIndexNames((0, "EMC-MIB", "symListCount"))
if mibBuilder.loadTexts: analyzerFileCountEntry.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileCountEntry.setDescription('A file count entry containing objects for the specified Symmetrix')
analyzerFileCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileCount.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileCount.setDescription('The number of entries in the analyzerFilesListTable for the indicated Symmetrix instance')
analyzerFilesListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2), )
if mibBuilder.loadTexts: analyzerFilesListTable.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFilesListTable.setDescription('A list of the analyzer files present for the given Symmetrix instance. The number of entries is given by the value of analyzerFileCount.')
analyzerFilesListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1), ).setIndexNames((0, "EMC-MIB", "symListCount"), (0, "EMC-MIB", "analyzerFileCount"))
if mibBuilder.loadTexts: analyzerFilesListEntry.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFilesListEntry.setDescription('A file list entry containing objects for the specified Symmetrix')
analyzerFileName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileName.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileName.setDescription('The analyzer file name for the indicated instance')
analyzerFileSize = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileSize.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileSize.setDescription('The analyzer file size for the indicated instance, in bytes')
analyzerFileCreation = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileCreation.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileCreation.setDescription('The analyzer file creation time for the indicated instance')
analyzerFileLastModified = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileLastModified.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileLastModified.setDescription('The analyzer file last modified time for the indicated instance')
analyzerFileIsActive = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("inactive", 0), ("active", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileIsActive.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileIsActive.setDescription('Indicates if the analyzer collector is collecting and storing Symmetrix information into this file.')
analyzerFileRuntime = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1000, 3, 2, 1, 6), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analyzerFileRuntime.setStatus('obsolete')
if mibBuilder.loadTexts: analyzerFileRuntime.setDescription('The length of time this file ran, or has been running.')
subagentInformation = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005, 1), )
if mibBuilder.loadTexts: subagentInformation.setStatus('obsolete')
if mibBuilder.loadTexts: subagentInformation.setDescription('A list of subagent entries operational on the host. The number of entries is given by the value of discoveryTableSize.')
subagentInfo = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: subagentInfo.setStatus('obsolete')
if mibBuilder.loadTexts: subagentInfo.setDescription('An entry containing objects for a particular Symmetrix subagent.')
subagentSymmetrixSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: subagentSymmetrixSerialNumber.setStatus('obsolete')
if mibBuilder.loadTexts: subagentSymmetrixSerialNumber.setDescription('The serial number of this attached Symmetrix')
subagentProcessActive = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("false", 0), ("true", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: subagentProcessActive.setStatus('obsolete')
if mibBuilder.loadTexts: subagentProcessActive.setDescription('The subagent process is running for this Symmetrix. 0 = False 1 = True')
subagentTraceMessagesEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 4, 1005, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("off", 0), ("on", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: subagentTraceMessagesEnable.setStatus('obsolete')
if mibBuilder.loadTexts: subagentTraceMessagesEnable.setDescription('Turns trace messages on or off to the console. 0 = OFF 1 = ON')
mainframeDiskInformation = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 5, 1), )
if mibBuilder.loadTexts: mainframeDiskInformation.setStatus('obsolete')
if mibBuilder.loadTexts: mainframeDiskInformation.setDescription('This table contains mainframe-specific disk variables for each attached Symmetrix. The number of entries is given by the value of discoveryTableSize.')
mfDiskInformation = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 5, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: mfDiskInformation.setStatus('obsolete')
if mibBuilder.loadTexts: mfDiskInformation.setDescription('A mainframe disk information entry containing objects for a particular Symmetrix.')
emcSymMvsVolume = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 5, 1, 1, 1), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMvsVolume.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMvsVolume.setDescription('Specific mainframe information for each disk of this attached Symmetrix. Supported ONLY in agents running in an MVS environment. ')
mainframeDataSetInformation = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 5, 2), )
if mibBuilder.loadTexts: mainframeDataSetInformation.setStatus('obsolete')
if mibBuilder.loadTexts: mainframeDataSetInformation.setDescription('This table contains mainframe-specific data set information for each attached Symmetrix. The number of entries is given by the value of discoveryTableSize.')
mfDataSetInformation = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 5, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "emcSymMvsLUNNumber"))
if mibBuilder.loadTexts: mfDataSetInformation.setStatus('obsolete')
if mibBuilder.loadTexts: mfDataSetInformation.setDescription('A mainframe data set entry containing objects for a particular Symmetrix.')
emcSymMvsLUNNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 5, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMvsLUNNumber.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMvsLUNNumber.setDescription('The LUN number for this data set ')
emcSymMvsDsname = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 5, 2, 1, 2), Opaque()).setMaxAccess("readonly")
if mibBuilder.loadTexts: emcSymMvsDsname.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMvsDsname.setDescription('Specific mainframe information for each LUN of this attached Symmetrix. Supported ONLY in agents running in an MVS environment. ')
emcSymMvsBuildStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 5, 2, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: emcSymMvsBuildStatus.setStatus('obsolete')
if mibBuilder.loadTexts: emcSymMvsBuildStatus.setDescription('Polled value to indicate the state of the data set list. 0 = data set unavailable 1 = build process initiated 2 = build in progress 3 = data set available ')
symListCount = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symListCount.setStatus('obsolete')
if mibBuilder.loadTexts: symListCount.setDescription('The number of entries in symListTable, representing the number of Symmetrixes present on this system.')
symListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 1, 2), )
if mibBuilder.loadTexts: symListTable.setStatus('obsolete')
if mibBuilder.loadTexts: symListTable.setDescription('A list of attached Symmetrix serial numbers. The number of entries is given by the value of symListCount.')
symListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 1, 2, 1), ).setIndexNames((0, "EMC-MIB", "symListCount"))
if mibBuilder.loadTexts: symListEntry.setStatus('obsolete')
if mibBuilder.loadTexts: symListEntry.setDescription('An entry containing objects for the indicated symListTable element.')
serialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 1, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: serialNumber.setStatus('obsolete')
if mibBuilder.loadTexts: serialNumber.setDescription('The Symmetrix serial number for the indicated instance')
symRemoteListCount = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 2, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symRemoteListCount.setStatus('obsolete')
if mibBuilder.loadTexts: symRemoteListCount.setDescription('The number of entries in symRemoteListTable, representing the number of remote Symmetrixes present on this system.')
symRemoteListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 2, 2), )
if mibBuilder.loadTexts: symRemoteListTable.setStatus('obsolete')
if mibBuilder.loadTexts: symRemoteListTable.setDescription('A list of remote Symmetrix serial numbers. The number of entries is given by the value of symRemoteListCount.')
symRemoteListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 2, 2, 2), ).setIndexNames((0, "EMC-MIB", "symRemoteListCount"))
if mibBuilder.loadTexts: symRemoteListEntry.setStatus('obsolete')
if mibBuilder.loadTexts: symRemoteListEntry.setDescription('An entry containing objects for the indicated symRemoteListTable element.')
remoteSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 2, 2, 2, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: remoteSerialNumber.setStatus('obsolete')
if mibBuilder.loadTexts: remoteSerialNumber.setDescription('The remote Symmetrix serial number for the indicated instance')
symDevListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 1), )
if mibBuilder.loadTexts: symDevListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symDevListCountTable.setDescription('A list of the number of Symmetrix devices for the given Symmetrix instance.')
symDevListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symDevListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symDevListCountEntry.setDescription('An entry containing objects for the number of Symmetrix device names found for the specified Symmetrix')
symDevListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDevListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symDevListCount.setDescription('The number of entries in the SymDevList table for the indicated Symmetrix instance')
symDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 2), )
if mibBuilder.loadTexts: symDevListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symDevListTable.setDescription('A list of Symmetrix device names found for the indicated Symmetrix instance. The number of entries is given by the value of symDevListCount.')
symDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevListCount"))
if mibBuilder.loadTexts: symDevListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symDevListEntry.setDescription('An entry containing objects for the indicated symDevListTable element.')
symDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symDeviceName.setDescription('The device name for the indicated instance')
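# Unlike discoveryTable, symDevListTable is doubly indexed (discIndex,
# then the position within the device list), so a cell instance OID
# carries both index values after the column sub-id. A minimal sketch;
# symdev_name_instance is a hypothetical helper, not part of the
# generated MIB.

```python
# Row OID of symDevListEntry; symDeviceName is column 1 beneath it.
SYMDEV_ROW_OID = (1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 3, 2, 1)

def symdev_name_instance(disc_index, dev_position):
    """Instance OID for symDeviceName of one device.

    disc_index selects the Symmetrix; dev_position is the device's
    position within that Symmetrix's device list.
    """
    return SYMDEV_ROW_OID + (1, disc_index, dev_position)
```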
symPDevListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 1), )
if mibBuilder.loadTexts: symPDevListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevListCountTable.setDescription('A list of the number of available devices for the given Symmetrix instance.')
symPDevListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symPDevListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevListCountEntry.setDescription('An entry containing objects for the number of available devices found for the specified Symmetrix')
symPDevListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symPDevListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevListCount.setDescription('The number of entries in the symPDeviceList table for the indicated Symmetrix instance')
symPDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 2), )
if mibBuilder.loadTexts: symPDevListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevListTable.setDescription('A list of host device filenames (pdevs) for all available devices for the indicated Symmetrix instance. The number of entries is given by the value of symPDevListCount.')
symPDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symPDevListCount"))
if mibBuilder.loadTexts: symPDevListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevListEntry.setDescription('An entry containing objects for the indicated symPDevListTable element.')
symPDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symPDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symPDeviceName.setDescription('The physical device name for the indicated instance')
symPDevNoDgListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 1), )
if mibBuilder.loadTexts: symPDevNoDgListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgListCountTable.setDescription('A list of the number of Symmetrix devices that are not members of a device group for the given Symmetrix instance.')
symPDevNoDgListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symPDevNoDgListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgListCountEntry.setDescription('An entry containing objects for the number of Symmetrix devices that are not members of a device group for the specified Symmetrix')
symPDevNoDgListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symPDevNoDgListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgListCount.setDescription('The number of entries in the symPDeviceNoDgList table for the indicated Symmetrix instance')
symPDevNoDgListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 2), )
if mibBuilder.loadTexts: symPDevNoDgListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgListTable.setDescription('A list of all Symmetrix devices that are not members of a device group found for the indicated Symmetrix instance. The number of entries is given by the value of symPDevNoDgListCount.')
symPDevNoDgListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symPDevNoDgListCount"))
if mibBuilder.loadTexts: symPDevNoDgListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgListEntry.setDescription('An entry containing objects for the indicated symPDevNoDgListTable element.')
symPDevNoDgDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 5, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symPDevNoDgDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symPDevNoDgDeviceName.setDescription('The device name for the indicated instance')
symDevNoDgListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 1), )
if mibBuilder.loadTexts: symDevNoDgListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgListCountTable.setDescription('A list of the number of Symmetrix devices that are not members of a device group for the given Symmetrix instance.')
symDevNoDgListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symDevNoDgListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgListCountEntry.setDescription('An entry containing objects for the number of Symmetrix devices that are not members of a device group for the specified Symmetrix')
symDevNoDgListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDevNoDgListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgListCount.setDescription('The number of entries in the symDeviceNoDgList table for the indicated Symmetrix instance')
symDevNoDgListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 2), )
if mibBuilder.loadTexts: symDevNoDgListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgListTable.setDescription('A list of Symmetrix devices that are not members of a device group for the specified Symmetrix. The number of entries is given by the value of symDevNoDgListCount.')
symDevNoDgListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevNoDgListCount"))
if mibBuilder.loadTexts: symDevNoDgListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgListEntry.setDescription('An entry containing objects for the indicated symDevNoDgListTable element.')
symDevNoDgDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 6, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDevNoDgDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symDevNoDgDeviceName.setDescription('The device name for the indicated instance')
symDgListCount = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 7, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDgListCount.setStatus('obsolete')
if mibBuilder.loadTexts: symDgListCount.setDescription('The number of entries in symDgListTable, representing the number of device groups present on this system.')
symDgListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 7, 2), )
if mibBuilder.loadTexts: symDgListTable.setStatus('obsolete')
if mibBuilder.loadTexts: symDgListTable.setDescription('A list of device groups present on the system. The number of entries is given by the value of symDgListCount.')
symDgListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 7, 2, 1), ).setIndexNames((0, "EMC-MIB", "symDgListCount"))
if mibBuilder.loadTexts: symDgListEntry.setStatus('obsolete')
if mibBuilder.loadTexts: symDgListEntry.setDescription('An entry containing objects for the indicated symDgListTable element.')
symDevGroupName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 7, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symDevGroupName.setStatus('obsolete')
if mibBuilder.loadTexts: symDevGroupName.setDescription('The device group name for the indicated instance')
symLDevListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 1), )
if mibBuilder.loadTexts: symLDevListCountTable.setStatus('obsolete')
if mibBuilder.loadTexts: symLDevListCountTable.setDescription('A list of the number of devices in a specific device group.')
symLDevListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 1, 1), ).setIndexNames((0, "EMC-MIB", "symDgListCount"))
if mibBuilder.loadTexts: symLDevListCountEntry.setStatus('obsolete')
if mibBuilder.loadTexts: symLDevListCountEntry.setDescription('An entry containing objects for the number of devices in a specific device group')
symLDevListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symLDevListCount.setStatus('obsolete')
if mibBuilder.loadTexts: symLDevListCount.setDescription('The number of entries in the SymLDevList table for the indicated Symmetrix instance')
symLDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 2), )
if mibBuilder.loadTexts: symLDevListTable.setStatus('obsolete')
if mibBuilder.loadTexts: symLDevListTable.setDescription('A list of devices in the specified device group. The number of entries is given by the value of symLDevListCount.')
symLDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 2, 1), ).setIndexNames((0, "EMC-MIB", "symDgListCount"), (0, "EMC-MIB", "symLDevListCount"))
if mibBuilder.loadTexts: symLDevListEntry.setStatus('obsolete')
if mibBuilder.loadTexts: symLDevListEntry.setDescription('An entry containing objects for the indicated symLDevListTable element.')
lDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 8, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: lDeviceName.setStatus('obsolete')
if mibBuilder.loadTexts: lDeviceName.setDescription('The device name for the indicated instance')
symGateListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 1), )
if mibBuilder.loadTexts: symGateListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symGateListCountTable.setDescription('A list of the number of host physical device filenames that are currently in the gatekeeper device list for the given Symmetrix instance.')
symGateListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symGateListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symGateListCountEntry.setDescription('An entry containing objects for the number of host physical device filenames that are currently in the gatekeeper device list for the specified Symmetrix')
symGateListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symGateListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symGateListCount.setDescription('The number of entries in the SymGateList table for the indicated Symmetrix instance')
symGateListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 2), )
if mibBuilder.loadTexts: symGateListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symGateListTable.setDescription('A list of host physical device filenames that are currently in the gatekeeper device list for the specified Symmetrix. The number of entries is given by the value of symGateListCount.')
symGateListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symGateListCount"))
if mibBuilder.loadTexts: symGateListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symGateListEntry.setDescription('An entry containing objects for the indicated symGateListTable element.')
gatekeeperDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 9, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: gatekeeperDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: gatekeeperDeviceName.setDescription('The gatekeeper device name for the indicated instance')
symBcvDevListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 1), )
if mibBuilder.loadTexts: symBcvDevListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvDevListCountTable.setDescription('A list of the number of BCV devices for the given Symmetrix instance.')
symBcvDevListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symBcvDevListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvDevListCountEntry.setDescription('An entry containing objects for the number of BCV devices for the specified Symmetrix')
symBcvDevListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symBcvDevListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvDevListCount.setDescription('The number of entries in the SymBcvDevList table for the indicated Symmetrix instance')
symBcvDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 2), )
if mibBuilder.loadTexts: symBcvDevListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvDevListTable.setDescription('A list of BCV devices for the specified Symmetrix. The number of entries is given by the value of symBcvDevListCount.')
symBcvDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symBcvDevListCount"))
if mibBuilder.loadTexts: symBcvDevListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvDevListEntry.setDescription('An entry containing objects for the indicated symBcvDevListTable element.')
bcvDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 10, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bcvDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: bcvDeviceName.setDescription('The BCV device name for the indicated instance')
symBcvPDevListCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 1), )
if mibBuilder.loadTexts: symBcvPDevListCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDevListCountTable.setDescription('A list of the number of all BCV devices that are accessible by the host systems for the given Symmetrix instance.')
symBcvPDevListCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symBcvPDevListCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDevListCountEntry.setDescription('An entry containing objects for the number of all BCV devices that are accessible by the host systems for the specified Symmetrix')
symBcvPDevListCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symBcvPDevListCount.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDevListCount.setDescription('The number of entries in the SymBcvPDevList table for the indicated Symmetrix instance')
symBcvPDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 2), )
if mibBuilder.loadTexts: symBcvPDevListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDevListTable.setDescription('A list of all BCV devices that are accessible by the host systems for the specified Symmetrix. The number of entries is given by the value of symBcvPDevListCount.')
symBcvPDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symBcvPDevListCount"))
if mibBuilder.loadTexts: symBcvPDevListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDevListEntry.setDescription('An entry containing objects for the indicated symBcvPDevListTable element.')
symBcvPDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 11, 2, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symBcvPDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symBcvPDeviceName.setDescription('The BCV physical device name for the indicated instance')
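The list tables above are indexed first by discIndex and then by a per-table row counter (e.g. symPDevListCount). As a minimal standalone sketch (the discIndex and row values are hypothetical, not taken from any real agent), instance OIDs for a column such as symPDeviceName compose as:

```python
# Hypothetical sketch: composing instance OIDs for the symPDeviceName
# column defined above; index values below are made-up examples.
base = (1, 3, 6, 1, 4, 1, 1139, 1, 6, 1, 4, 2, 1, 1)  # symPDeviceName column OID

def instance_oid(disc_index, row):
    # Rows are addressed as <column OID>.<discIndex>.<row index>
    return base + (disc_index, row)

oids = [".".join(map(str, instance_oid(1, row))) for row in (1, 2, 3)]
print(oids[0])  # 1.3.6.1.4.1.1139.1.6.1.4.2.1.1.1.1
```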
class StateValues(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))
    namedValues = NamedValues(("enabled", 0), ("disabled", 1), ("mixed", 2), ("state-na", 3))

class DirectorType(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8))
    namedValues = NamedValues(("fibre-channel", 0), ("scsi-adapter", 1), ("disk-adapter", 2), ("channel-adapter", 3), ("memory-board", 4), ("escon-adapter", 5), ("rdf-adapter-r1", 6), ("rdf-adapter-r2", 7), ("rdf-adapter-bi", 8))

class DirectorStatus(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))
    namedValues = NamedValues(("online", 0), ("offline", 1), ("dead", 2), ("unknown", 3))

class PortStatus(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))
    namedValues = NamedValues(("status-na", 0), ("on", 1), ("off", 2), ("wd", 3))

class SCSIWidth(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))
    namedValues = NamedValues(("not-applicable", 0), ("narrow", 1), ("wide", 2), ("ultra", 3))
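The textual-convention classes above pair a value-range constraint with NamedValues labels. A self-contained sketch of the same pattern built directly on pyasn1 (DemoState is a hypothetical name used only for illustration, not part of this MIB):

```python
# Standalone illustration (assumes pyasn1 is installed) of the
# enumerated-integer pattern used by StateValues and friends above.
from pyasn1.type import constraint, namedval, univ

class DemoState(univ.Integer):
    # Only the four enumerated values are accepted.
    subtypeSpec = univ.Integer.subtypeSpec + constraint.ConstraintsUnion(
        constraint.SingleValueConstraint(0, 1, 2, 3))
    namedValues = namedval.NamedValues(
        ("enabled", 0), ("disabled", 1), ("mixed", 2), ("state-na", 3))

v = DemoState("enabled")          # construct from the symbolic name
print(int(v), v.prettyPrint())    # numeric value and its label
```

Values outside the constraint (e.g. 9) are rejected when the object is verified, which is how a manager-side decoder catches out-of-range enum values.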
symShowConfiguration = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1), )
if mibBuilder.loadTexts: symShowConfiguration.setStatus('mandatory')
if mibBuilder.loadTexts: symShowConfiguration.setDescription('A table of Symmetrix configuration information for the indicated Symmetrix instance. The number of entries is given by the value of discIndex.')
symShowEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symShowEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symShowEntry.setDescription('An entry containing objects for the Symmetrix configuration information for the specified Symmetrix')
symShowSymid = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSymid.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSymid.setDescription('Symmetrix serial id')
symShowSymmetrix_ident = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 2), DisplayString()).setLabel("symShowSymmetrix-ident").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSymmetrix_ident.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSymmetrix_ident.setDescription('Symmetrix generation; Symm3 or Symm4. (reserved for EMC use only.)')
symShowSymmetrix_model = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 3), DisplayString()).setLabel("symShowSymmetrix-model").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSymmetrix_model.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSymmetrix_model.setDescription('Symmetrix model number: 3100, 3200, and so forth')
symShowMicrocode_version = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 4), DisplayString()).setLabel("symShowMicrocode-version").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMicrocode_version.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMicrocode_version.setDescription('Microcode revision string ')
symShowMicrocode_version_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 5), DisplayString()).setLabel("symShowMicrocode-version-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMicrocode_version_num.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMicrocode_version_num.setDescription('Microcode version')
symShowMicrocode_date = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 6), DisplayString()).setLabel("symShowMicrocode-date").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMicrocode_date.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMicrocode_date.setDescription('Date of microcode build (MMDDYYYY)')
symShowMicrocode_patch_level = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 7), DisplayString()).setLabel("symShowMicrocode-patch-level").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMicrocode_patch_level.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMicrocode_patch_level.setDescription('Microcode patch level')
symShowMicrocode_patch_date = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 8), DisplayString()).setLabel("symShowMicrocode-patch-date").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMicrocode_patch_date.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMicrocode_patch_date.setDescription('Date of microcode patch level (MMDDYY)')
symShowSymmetrix_pwron_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 9), TimeTicks()).setLabel("symShowSymmetrix-pwron-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSymmetrix_pwron_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSymmetrix_pwron_time.setDescription('Time since the last power-on ')
symShowSymmetrix_uptime = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 10), TimeTicks()).setLabel("symShowSymmetrix-uptime").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSymmetrix_uptime.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSymmetrix_uptime.setDescription('Uptime in seconds of the Symmetrix ')
symShowDb_sync_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 11), TimeTicks()).setLabel("symShowDb-sync-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDb_sync_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDb_sync_time.setDescription('Time since the configuration information was gathered ')
symShowDb_sync_bcv_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 12), TimeTicks()).setLabel("symShowDb-sync-bcv-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDb_sync_bcv_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDb_sync_bcv_time.setDescription('Time since the configuration information was gathered for BCVs ')
symShowDb_sync_rdf_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 13), TimeTicks()).setLabel("symShowDb-sync-rdf-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDb_sync_rdf_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDb_sync_rdf_time.setDescription('Time since the configuration information was gathered for the SRDF ')
symShowLast_ipl_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 14), TimeTicks()).setLabel("symShowLast-ipl-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowLast_ipl_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowLast_ipl_time.setDescription('Time since the last Symmetrix IPL ')
symShowLast_fast_ipl_time = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 15), TimeTicks()).setLabel("symShowLast-fast-ipl-time").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowLast_fast_ipl_time.setStatus('mandatory')
if mibBuilder.loadTexts: symShowLast_fast_ipl_time.setDescription("Time since the last Symmetrix 'fast IPL' ")
symShowReserved = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 16), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowReserved.setStatus('mandatory')
if mibBuilder.loadTexts: symShowReserved.setDescription('Reserved for future use ')
symShowCache_size = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 17), UInt32()).setLabel("symShowCache-size").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowCache_size.setStatus('mandatory')
if mibBuilder.loadTexts: symShowCache_size.setDescription('Cache size in megabytes ')
symShowCache_slot_count = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 18), UInt32()).setLabel("symShowCache-slot-count").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowCache_slot_count.setStatus('mandatory')
if mibBuilder.loadTexts: symShowCache_slot_count.setDescription('Number of 32K cache slots ')
symShowMax_wr_pend_slots = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 19), UInt32()).setLabel("symShowMax-wr-pend-slots").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMax_wr_pend_slots.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMax_wr_pend_slots.setDescription('Number of write pending slots allowed before starting delayed fast writes ')
symShowMax_da_wr_pend_slots = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 20), UInt32()).setLabel("symShowMax-da-wr-pend-slots").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMax_da_wr_pend_slots.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMax_da_wr_pend_slots.setDescription('Number of write pending slots allowed for one DA before delayed fast writes ')
symShowMax_dev_wr_pend_slots = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 21), UInt32()).setLabel("symShowMax-dev-wr-pend-slots").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowMax_dev_wr_pend_slots.setStatus('mandatory')
if mibBuilder.loadTexts: symShowMax_dev_wr_pend_slots.setDescription('Number of write pending slots allowed for one device before starting delayed fast writes ')
symShowPermacache_slot_count = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 22), UInt32()).setLabel("symShowPermacache-slot-count").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPermacache_slot_count.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPermacache_slot_count.setDescription('Number of slots allocated to PermaCache ')
symShowNum_disks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 23), UInt32()).setLabel("symShowNum-disks").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_disks.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_disks.setDescription('Number of physical disks in Symmetrix ')
symShowNum_symdevs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 24), UInt32()).setLabel("symShowNum-symdevs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_symdevs.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_symdevs.setDescription('Number of Symmetrix devices ')
symShowNum_pdevs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 25), UInt32()).setLabel("symShowNum-pdevs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_pdevs.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_pdevs.setDescription('Number of host physical devices configured for this Symmetrix')
symShowAPI_version = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 26), DisplayString()).setLabel("symShowAPI-version").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowAPI_version.setStatus('mandatory')
if mibBuilder.loadTexts: symShowAPI_version.setDescription('SYMAPI revision set to SYMAPI_T_VERSION ')
symShowSDDF_configuration = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 27), StateValues()).setLabel("symShowSDDF-configuration").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSDDF_configuration.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSDDF_configuration.setDescription('The configuration state of the Symmetrix Differential Data Facility (SDDF). Possible states are: enabled(0); disabled(1); mixed(2) - set when the RDF modes of the devices in the group differ from each other; state-na(3) - state not applicable')
symShowConfig_checksum = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 28), UInt32()).setLabel("symShowConfig-checksum").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowConfig_checksum.setStatus('mandatory')
if mibBuilder.loadTexts: symShowConfig_checksum.setDescription('Checksum of the Microcode file')
symShowNum_powerpath_devs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 1, 1, 29), UInt32()).setLabel("symShowNum-powerpath-devs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_powerpath_devs.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_powerpath_devs.setDescription('Number of Symmetrix devices accessible through the PowerPath driver.')
symShowPDevCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 2), )
if mibBuilder.loadTexts: symShowPDevCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDevCountTable.setDescription('A list of the number of available devices for the given Symmetrix instance.')
symShowPDevCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symShowPDevCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDevCountEntry.setDescription('An entry containing objects for the number of available devices found for the specified Symmetrix')
symShowPDevCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPDevCount.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDevCount.setDescription('The number of entries in the SymShowPDeviceList table for the indicated Symmetrix instance')
symShowPDevListTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 3), )
if mibBuilder.loadTexts: symShowPDevListTable.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDevListTable.setDescription('A list of host device filenames (pdevs) for all available devices for the indicated Symmetrix instance. The number of entries is given by the value of symShowPDevCount.')
symShowPDevListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 3, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowPDevCount"))
if mibBuilder.loadTexts: symShowPDevListEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDevListEntry.setDescription('An entry containing objects for the indicated symShowPDevListTable element.')
symShowPDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 3, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPDeviceName.setDescription('The physical device name for the indicated instance')
symShowDirectorCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 4), )
if mibBuilder.loadTexts: symShowDirectorCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirectorCountTable.setDescription('A list of the number of directors for the given Symmetrix instance.')
symShowDirectorCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 4, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symShowDirectorCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirectorCountEntry.setDescription('An entry containing objects for the number of directors for the specified Symmetrix')
symShowDirectorCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 4, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDirectorCount.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirectorCount.setDescription('The number of entries in the SymShowDirectorList table for the indicated Symmetrix instance')
symShowDirectorConfigurationTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5), )
if mibBuilder.loadTexts: symShowDirectorConfigurationTable.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirectorConfigurationTable.setDescription('A table of Symmetrix director configuration for the indicated Symmetrix instance. The number of entries is given by the value of symShowDirectorCount.')
symShowDirectorConfigurationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: symShowDirectorConfigurationEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirectorConfigurationEntry.setDescription('An entry containing objects for the indicated symShowDirectorConfigurationTable element.')
symShowDirector_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 1), DirectorType()).setLabel("symShowDirector-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDirector_type.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirector_type.setDescription('Defines the type of director. Possible types are: SYMAPI_DIRTYPE_FIBRECHANNEL, SYMAPI_DIRTYPE_SCSI, SYMAPI_DIRTYPE_DISK, SYMAPI_DIRTYPE_CHANNEL, SYMAPI_DIRTYPE_MEMORY, SYMAPI_DIRTYPE_R1, SYMAPI_DIRTYPE_R2, SYMAPI_DIRTYPE_RDF_B (bidirectional RDF)')
symShowDirector_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 2), UInt32()).setLabel("symShowDirector-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDirector_num.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirector_num.setDescription('Number of a director (1 - 32) ')
symShowSlot_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 3), UInt32()).setLabel("symShowSlot-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowSlot_num.setStatus('mandatory')
if mibBuilder.loadTexts: symShowSlot_num.setDescription('Slot number of a director (1 - 16) ')
symShowDirector_ident = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 4), DisplayString()).setLabel("symShowDirector-ident").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDirector_ident.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirector_ident.setDescription('Director identifier. For example, SA-16')
symShowDirector_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 5), DirectorStatus()).setLabel("symShowDirector-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowDirector_status.setStatus('mandatory')
if mibBuilder.loadTexts: symShowDirector_status.setDescription("online(0) - The director's status is ONLINE. offline(1) - The director's status is OFFLINE. dead(2) - The status is non-operational, and its LED display will show 'DD'. unknown(3) - The director's status is unknown. ")
symShowScsi_capability = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 6), SCSIWidth()).setLabel("symShowScsi-capability").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowScsi_capability.setStatus('mandatory')
if mibBuilder.loadTexts: symShowScsi_capability.setDescription('Defines SCSI features for SAs and DAs. Possible values are: SYMAPI_C_SCSI_NARROW, SYMAPI_C_SCSI_WIDE, SYMAPI_C_SCSI_ULTRA')
symShowNum_da_volumes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 7), UInt32()).setLabel("symShowNum-da-volumes").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_da_volumes.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_da_volumes.setDescription('Indicates how many volumes are serviced by the DA. ')
symShowRemote_symid = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 8), DisplayString()).setLabel("symShowRemote-symid").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowRemote_symid.setStatus('mandatory')
if mibBuilder.loadTexts: symShowRemote_symid.setDescription('If this is an RA in an SRDF system, this is the remote Symmetrix serial number. If this is not an RDF director, this field is NULL ')
symShowRa_group_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 9), UInt32()).setLabel("symShowRa-group-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowRa_group_num.setStatus('mandatory')
if mibBuilder.loadTexts: symShowRa_group_num.setDescription('If this is an RA in an SRDF system, this is the RA group number; otherwise it is zero. ')
symShowRemote_ra_group_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 10), UInt32()).setLabel("symShowRemote-ra-group-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowRemote_ra_group_num.setStatus('mandatory')
if mibBuilder.loadTexts: symShowRemote_ra_group_num.setDescription('If this is an RA in an SRDF system, this is the RA group number on the remote Symmetrix; otherwise it is zero. ')
symShowPrevent_auto_link_recovery = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 11), StateValues()).setLabel("symShowPrevent-auto-link-recovery").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPrevent_auto_link_recovery.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPrevent_auto_link_recovery.setDescription('Prevent the automatic resumption of data copy across the RDF links as soon as the links have recovered. Possible states are: enabled(0), disabled(1), mixed(2) - This state is set when the RDF modes of the devices in the group are different from each other state-na(3) - state not applicable ')
symShowPrevent_ra_online_upon_pwron = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 12), StateValues()).setLabel("symShowPrevent-ra-online-upon-pwron").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPrevent_ra_online_upon_pwron.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPrevent_ra_online_upon_pwron.setDescription("Prevent RA's from coming online after the Symmetrix is powered on. Possible states are: enabled(0), disabled(1), mixed(2) - This state is set when the RDF modes of the devices in the group are different from each other state-na(3) - state not applicable ")
symShowNum_ports = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 13), UInt32()).setLabel("symShowNum-ports").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowNum_ports.setStatus('mandatory')
if mibBuilder.loadTexts: symShowNum_ports.setDescription('Number of ports available on this director.')
symShowPort0_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 14), PortStatus()).setLabel("symShowPort0-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPort0_status.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPort0_status.setDescription("The status of port 0 for this director. Possible values are: status-na(0) - the port's status is not applicable. on(1) - the port's status is on. off(2) - the port's status is off. wd(3) - the port's status is write disabled. ")
symShowPort1_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 15), PortStatus()).setLabel("symShowPort1-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPort1_status.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPort1_status.setDescription("The status of port 1 for this director. Possible values are: status-na(0) - the port's status is not applicable. on(1) - the port's status is on. off(2) - the port's status is off. wd(3) - the port's status is write disabled. ")
symShowPort2_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 16), PortStatus()).setLabel("symShowPort2-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPort2_status.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPort2_status.setDescription("The status of port 2 for this director. Possible values are: status-na(0) - the port's status is not applicable. on(1) - the port's status is on. off(2) - the port's status is off. wd(3) - the port's status is write disabled. ")
symShowPort3_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 1, 5, 1, 17), PortStatus()).setLabel("symShowPort3-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: symShowPort3_status.setStatus('mandatory')
if mibBuilder.loadTexts: symShowPort3_status.setDescription("The status of port 3 for this director. Possible values are: status-na(0) - the port's status is not applicable. on(1) - the port's status is on. off(2) - the port's status is off. wd(3) - the port's status is write disabled. ")
class DeviceStatus(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))
namedValues = NamedValues(("ready", 0), ("not-ready", 1), ("write-disabled", 2), ("not-applicable", 3), ("mixed", 4))
class DeviceType(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 4, 8, 16, 32, 128))
namedValues = NamedValues(("not-applicable", 1), ("local-data", 2), ("raid-s", 4), ("raid-s-parity", 8), ("remote-r1-data", 16), ("remote-r2-data", 32), ("hot-spare", 128))
class DeviceEmulation(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6))
namedValues = NamedValues(("emulation-na", 0), ("fba", 1), ("as400", 2), ("icl", 3), ("unisys-fba", 4), ("ckd-3380", 5), ("ckd-3390", 6))
class SCSIMethod(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2))
namedValues = NamedValues(("method-na", 0), ("synchronous", 1), ("asynchronous", 2))
class BCVState(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10))
namedValues = NamedValues(("never-established", 0), ("in-progress", 1), ("synchronous", 2), ("split-in-progress", 3), ("split-before-sync", 4), ("split", 5), ("split-no-incremental", 6), ("restore-in-progress", 7), ("restored", 8), ("split-before-restore", 9), ("invalid", 10))
class RDFPairState(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110))
namedValues = NamedValues(("invalid", 100), ("syncinprog", 101), ("synchronized", 102), ("split", 103), ("suspended", 104), ("failed-over", 105), ("partitioned", 106), ("r1-updated", 107), ("r1-updinprog", 108), ("mixed", 109), ("state-na", 110))
class RDFType(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1))
namedValues = NamedValues(("r1", 0), ("r2", 1))
class RDFMode(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))
namedValues = NamedValues(("synchronous", 0), ("semi-synchronous", 1), ("adaptive-copy", 2), ("mixed", 3), ("rdf-mode-na", 4))
class RDFAdaptiveCopy(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))
namedValues = NamedValues(("disabled", 0), ("wp-mode", 1), ("disk-mode", 2), ("mixed", 3), ("ac-na", 4))
class RDFLinkConfig(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
namedValues = NamedValues(("escon", 1), ("t3", 2), ("na", 3))
class RDDFTransientState(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))
namedValues = NamedValues(("transient-state-na", 1), ("offline", 2), ("offline-pend", 3), ("online", 4))
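The enumeration classes above pair each raw SNMP integer with a human-readable label via NamedValues. As a minimal stdlib-only sketch (no pysnmp required), decoding a DeviceStatus value might look like this; the dictionary mirrors the NamedValues declaration of the DeviceStatus class, and the `decode_device_status` helper name is illustrative, not part of the MIB:

```python
# Mirrors the NamedValues of the DeviceStatus class defined above.
DEVICE_STATUS_LABELS = {
    0: "ready",
    1: "not-ready",
    2: "write-disabled",
    3: "not-applicable",
    4: "mixed",
}

def decode_device_status(raw):
    """Map a raw DeviceStatus integer to its label, flagging unknown codes."""
    return DEVICE_STATUS_LABELS.get(raw, "unknown(%d)" % raw)
```

The same pattern applies to the other enumerations (RDFPairState, RDFMode, and so on) by swapping in their respective value tables.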
devShowConfigurationTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1), )
if mibBuilder.loadTexts: devShowConfigurationTable.setStatus('mandatory')
if mibBuilder.loadTexts: devShowConfigurationTable.setDescription('A table of Symmetrix device configuration information for the indicated Symmetrix and device instance. The number of entries is given by the value of symDevListCount.')
devShowConfigurationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevListCount"))
if mibBuilder.loadTexts: devShowConfigurationEntry.setStatus('mandatory')
if mibBuilder.loadTexts: devShowConfigurationEntry.setDescription('An entry containing objects for the Symmetrix device configuration information for the specified Symmetrix and device.')
devShowVendor_id = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 1), DisplayString()).setLabel("devShowVendor-id").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowVendor_id.setStatus('mandatory')
if mibBuilder.loadTexts: devShowVendor_id.setDescription('Vendor ID of the Symmetrix device ')
devShowProduct_id = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 2), DisplayString()).setLabel("devShowProduct-id").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowProduct_id.setStatus('mandatory')
if mibBuilder.loadTexts: devShowProduct_id.setDescription('Product ID of the Symmetrix device ')
devShowProduct_rev = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 3), DisplayString()).setLabel("devShowProduct-rev").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowProduct_rev.setStatus('mandatory')
if mibBuilder.loadTexts: devShowProduct_rev.setDescription('Product revision level of the Symmetrix device ')
devShowSymid = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowSymid.setStatus('mandatory')
if mibBuilder.loadTexts: devShowSymid.setDescription('Symmetrix serial number ')
devShowDevice_serial_id = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 5), DisplayString()).setLabel("devShowDevice-serial-id").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDevice_serial_id.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDevice_serial_id.setDescription('Symmetrix device serial ID ')
devShowSym_devname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 6), DisplayString()).setLabel("devShowSym-devname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowSym_devname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowSym_devname.setDescription('Symmetrix device name/number ')
devShowPdevname = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowPdevname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowPdevname.setDescription('Physical device name. If not visible to the host this field is NULL ')
devShowDgname = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDgname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDgname.setDescription('Name of device group. If the device is not a member of a device group, this field is NULL ')
devShowLdevname = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 9), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowLdevname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowLdevname.setDescription('Name of this device in the device group. If the device is not a member of a device group, this field is NULL ')
devShowDev_config = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16))).clone(namedValues=NamedValues(("unprotected", 0), ("mirror-2", 1), ("mirror-3", 2), ("mirror-4", 3), ("raid-s", 4), ("raid-s-mirror", 5), ("rdf-r1", 6), ("rdf-r2", 7), ("rdf-r1-raid-s", 8), ("rdf-r2-raid-s", 9), ("rdf-r1-mirror", 10), ("rdf-r2-mirror", 11), ("bcv", 12), ("hot-spare", 13), ("bcv-mirror-2", 14), ("bcv-rdf-r1", 15), ("bcv-rdf-r1-mirror", 16)))).setLabel("devShowDev-config").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_config.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_config.setDescription(' unprotected(0) - no data protection method applied mirror-2(1) - device is a two-way mirror mirror-3(2) - device is a three-way mirror mirror-4(3) - device is a four-way mirror raid-s(4) - device is a standard raid-s device raid-s-mirror(5) - device is a raid-s device plus a local mirror rdf-r1(6) - device is an SRDF Master (R1) rdf-r2(7) - device is an SRDF Slave (R2) rdf-r1-raid-s(8) - device is an SRDF Master (R1) with RAID_S rdf-r2-raid-s(9) - device is an SRDF Slave (R2) with RAID_S rdf-r1-mirror(10) - device is an SRDF source (R1) with a local mirror rdf-r2-mirror(11) - device is an SRDF target (R2) with mirror bcv(12) - device is a BCV device hot-spare(13) - device is a Hot Spare device bcv-mirror-2(14) - device is a protected BCV device with mirror bcv-rdf-r1(15) - device is an SRDF Master (R1), BCV device bcv-rdf-r1-mirror(16) - device is an SRDF Master (R1), BCV device with mirror ')
devShowDev_parameters = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 4, 8, 16, 32))).clone(namedValues=NamedValues(("ckd-device", 1), ("gatekeeper-device", 2), ("associated-device", 4), ("multi-channel-device", 8), ("meta-head-device", 16), ("meta-member-device", 32)))).setLabel("devShowDev-parameters").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_parameters.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_parameters.setDescription(" ckd-device(1) - device is a 'Count Key Data' (CKD) device gatekeeper-device(2) - device is a gatekeeper device associated-device(4) - BCV or Gatekeeper device, associated with a group multi-channel-device(8) - device visible by the host over more than one SCSI-bus meta-head-device(16) - device is a META head device meta-member-device(32) - device is a META member device ")
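The devShowDev-parameters values are powers of two, so a returned integer may combine several flags by bitwise OR (for example, a gatekeeper device that is also associated with a group). A short sketch of unpacking such a value follows; the flag table mirrors the NamedValues above, and `decode_dev_parameters` is an illustrative helper, not part of the MIB:

```python
# Bit flags mirror the devShowDev-parameters NamedValues above.
DEV_PARAMETER_FLAGS = [
    (1, "ckd-device"),
    (2, "gatekeeper-device"),
    (4, "associated-device"),
    (8, "multi-channel-device"),
    (16, "meta-head-device"),
    (32, "meta-member-device"),
]

def decode_dev_parameters(raw):
    """Return the list of flag names set in a devShowDev-parameters value."""
    return [name for bit, name in DEV_PARAMETER_FLAGS if raw & bit]
```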
devShowDev_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 12), DeviceStatus()).setLabel("devShowDev-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowDev_capacity = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 13), UInt32()).setLabel("devShowDev-capacity").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_capacity.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_capacity.setDescription('Device capacity specified as the number of device blocks ')
devShowTid = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 14), UInt32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowTid.setStatus('mandatory')
if mibBuilder.loadTexts: devShowTid.setDescription('SCSI target ID of the Symmetrix device ')
devShowLun = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 15), UInt32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowLun.setStatus('mandatory')
if mibBuilder.loadTexts: devShowLun.setDescription('SCSI logical unit number of the Symmetrix device ')
devShowDirector_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 16), Integer32()).setLabel("devShowDirector-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDirector_num.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDirector_num.setDescription("Symmetrix director number of the device's FW SCSI Channel director, or SA, (1-32). If there is no primary port, it is zero. ")
devShowDirector_slot_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 17), Integer32()).setLabel("devShowDirector-slot-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDirector_slot_num.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDirector_slot_num.setDescription('The slot number of the director. If there is no primary port, it is zero. ')
devShowDirector_ident = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 18), DisplayString()).setLabel("devShowDirector-ident").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDirector_ident.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDirector_ident.setDescription('The identification number of the director, for example: SA-16. If there is no primary port, this field is NULL ')
devShowDirector_port_num = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 19), Integer32()).setLabel("devShowDirector-port-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDirector_port_num.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDirector_port_num.setDescription("The device's SA port number (0-3). If there is no primary port, it is zero. ")
devShowMset_M1_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 20), DeviceType()).setLabel("devShowMset-M1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M1_type.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M1_type.setDescription(' Device type of each hyper volume forming a Symmetrix mirror set. not-applicable(1) - No device exists for this mirror local-data(2) - This device is a local data device raid-s(4) - This device is a local raid-s data device raid-s-parity(8) - This device is a local raid_s parity device remote-r1-data(16) - This device is a remote R1 data device remote-r2-data(32) - This device is a remote R2 data device hot-spare(128) - This device is a hot spare device ')
devShowMset_M2_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 21), DeviceType()).setLabel("devShowMset-M2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M2_type.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M2_type.setDescription(' Device type of each hyper volume forming a Symmetrix mirror set. not-applicable(1) - No device exists for this mirror local-data(2) - This device is a local data device raid-s(4) - This device is a local raid-s data device raid-s-parity(8) - This device is a local raid_s parity device remote-r1-data(16) - This device is a remote R1 data device remote-r2-data(32) - This device is a remote R2 data device hot-spare(128) - This device is a hot spare device ')
devShowMset_M3_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 22), DeviceType()).setLabel("devShowMset-M3-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M3_type.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M3_type.setDescription(' Device type of each hyper volume forming a Symmetrix mirror set. not-applicable(1) - No device exists for this mirror local-data(2) - This device is a local data device raid-s(4) - This device is a local raid-s data device raid-s-parity(8) - This device is a local raid_s parity device remote-r1-data(16) - This device is a remote R1 data device remote-r2-data(32) - This device is a remote R2 data device hot-spare(128) - This device is a hot spare device ')
devShowMset_M4_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 23), DeviceType()).setLabel("devShowMset-M4-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M4_type.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M4_type.setDescription(' Device type of each hyper volume forming a Symmetrix mirror set. not-applicable(1) - No device exists for this mirror local-data(2) - This device is a local data device raid-s(4) - This device is a local raid-s data device raid-s-parity(8) - This device is a local raid_s parity device remote-r1-data(16) - This device is a remote R1 data device remote-r2-data(32) - This device is a remote R2 data device hot-spare(128) - This device is a hot spare device ')
devShowMset_M1_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 24), DeviceStatus()).setLabel("devShowMset-M1-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M1_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M1_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowMset_M2_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 25), DeviceStatus()).setLabel("devShowMset-M2-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M2_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M2_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowMset_M3_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 26), DeviceStatus()).setLabel("devShowMset-M3-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M3_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M3_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowMset_M4_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 27), DeviceStatus()).setLabel("devShowMset-M4-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M4_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M4_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowMset_M1_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 28), Integer32()).setLabel("devShowMset-M1-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M1_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M1_invalid_tracks.setDescription('The number of invalid tracks for Mirror 1')
devShowMset_M2_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 29), Integer32()).setLabel("devShowMset-M2-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M2_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M2_invalid_tracks.setDescription('The number of invalid tracks for Mirror 2')
devShowMset_M3_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 30), Integer32()).setLabel("devShowMset-M3-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M3_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M3_invalid_tracks.setDescription('The number of invalid tracks for Mirror 3')
devShowMset_M4_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 31), Integer32()).setLabel("devShowMset-M4-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowMset_M4_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowMset_M4_invalid_tracks.setDescription('The number of invalid tracks for Mirror 4')
devShowDirector_port_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 48), DeviceStatus()).setLabel("devShowDirector-port-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDirector_port_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDirector_port_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowDev_sa_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 49), DeviceStatus()).setLabel("devShowDev-sa-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_sa_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_sa_status.setDescription(' ready(0) - the device is ready not-ready(1) - the device is not ready write-disabled(2) - the device is write disabled not-applicable(3) - there is no such device ')
devShowVbus = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 50), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowVbus.setStatus('mandatory')
if mibBuilder.loadTexts: devShowVbus.setDescription('The virtual bus is used for fibre channel ports. For EA and CA ports, this is the device address. ')
devShowEmulation = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 51), DeviceEmulation()).setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowEmulation.setStatus('mandatory')
if mibBuilder.loadTexts: devShowEmulation.setDescription(' emulation-na(0) - the emulation type for this device is not available. fba(1) - the emulation type for this device is fba. as400(2) - the emulation type for this device is as/400. icl(3) - the emulation type for this device is icl. unisys-fba(4) - the emulation type for this device is unisys fba. ckd-3380(5) - the emulation type for this device is ckd 3380. ckd-3390(6) - the emulation type for this device is ckd 3390. ')
devShowDev_block_size = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 52), UInt32()).setLabel("devShowDev-block-size").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_block_size.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_block_size.setDescription('Indicates the number of bytes per block ')
devShowSCSI_negotiation = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 53), SCSIWidth()).setLabel("devShowSCSI-negotiation").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowSCSI_negotiation.setStatus('mandatory')
if mibBuilder.loadTexts: devShowSCSI_negotiation.setDescription('width-na(0) - width not available narrow(1), wide(2), ultra(3) ')
devShowSCSI_method = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 54), SCSIMethod()).setLabel("devShowSCSI-method").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowSCSI_method.setStatus('mandatory')
if mibBuilder.loadTexts: devShowSCSI_method.setDescription(' method-na(0) - method not available synchronous(1) asynchronous(2) ')
devShowDev_cylinders = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 55), UInt32()).setLabel("devShowDev-cylinders").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_cylinders.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_cylinders.setDescription('Number of device cylinders ')
devShowAttached_bcv_symdev = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 1, 1, 56), DisplayString()).setLabel("devShowAttached-bcv-symdev").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowAttached_bcv_symdev.setStatus('mandatory')
if mibBuilder.loadTexts: devShowAttached_bcv_symdev.setDescription('If this is a std device, this may be set to indicate the preferred BCV device to which this device would be paired. ')
devShowRDFInfoTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2), )
if mibBuilder.loadTexts: devShowRDFInfoTable.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRDFInfoTable.setDescription('A table of Symmetrix RDF device configuration information for the indicated Symmetrix and device instance. The number of entries is given by the value of symDevListCount.')
devShowRDFInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevListCount"))
if mibBuilder.loadTexts: devShowRDFInfoEntry.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRDFInfoEntry.setDescription('An entry containing objects for the Symmetrix RDF device configuration information for the specified Symmetrix and device.')
devShowRemote_symid = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 1), DisplayString()).setLabel("devShowRemote-symid").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRemote_symid.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRemote_symid.setDescription('Serial number of the Symmetrix containing the target (R2) volume ')
devShowRemote_sym_devname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 2), DisplayString()).setLabel("devShowRemote-sym-devname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRemote_sym_devname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRemote_sym_devname.setDescription('Symmetrix device name of the remote device in an RDF pair ')
devShowRa_group_number = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 3), Integer32()).setLabel("devShowRa-group-number").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRa_group_number.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRa_group_number.setDescription('The RA group number (1 - n)')
devShowDev_rdf_type = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 4), RDFType()).setLabel("devShowDev-rdf-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_rdf_type.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_rdf_type.setDescription('Type of RDF device. Values are: r1(0) - an R1 device r2(1) - an R2 device ')
devShowDev_ra_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 5), DeviceStatus()).setLabel("devShowDev-ra-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_ra_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_ra_status.setDescription('The status of the remote device. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
devShowDev_link_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 6), DeviceStatus()).setLabel("devShowDev-link-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_link_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_link_status.setDescription('The RDF link status. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
devShowRdf_mode = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 7), RDFMode()).setLabel("devShowRdf-mode").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRdf_mode.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRdf_mode.setDescription('The RDF Mode. Values are: synchronous(0) semi-synchronous(1) adaptive-copy(2) mixed(3) - This state is set when the RDF modes of the devices in the group are different from each other rdf-mode-na(4) - not applicable ')
devShowRdf_pair_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 8), RDFPairState()).setLabel("devShowRdf-pair-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRdf_pair_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRdf_pair_state.setDescription('The RDF pair state. Values are: invalid(100) - The device & link states are in an unrecognized combination syncinprog(101) - Synchronizing in progress synchronized(102) - The source and target have identical data split(103) - The source is split from the target, and the target is write enabled. suspended(104) - The link is suspended failed-over(105) - The target is write enabled, the source is write disabled, the link is suspended. partitioned(106) - The communication link to the remote symmetrix is down, and the device is write enabled. r1-updated(107) - The target is write enabled, the source is write disabled, and the link is up. r1-updinprog(108) - same as r1-updated but there are invalid tracks between target and source mixed(109) - This state is set when the RDF modes of the devices in the group are different from each other state-na(110) - Not applicable ')
devShowRdf_domino = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 9), StateValues()).setLabel("devShowRdf-domino").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRdf_domino.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRdf_domino.setDescription('The RDF Domino state. Possible states are: enabled(0), disabled(1), mixed(2) - This state is set when the RDF modes of the devices in the group are different from each other state-na(3) - state not applicable ')
devShowAdaptive_copy = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 10), RDFAdaptiveCopy()).setLabel("devShowAdaptive-copy").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowAdaptive_copy.setStatus('mandatory')
if mibBuilder.loadTexts: devShowAdaptive_copy.setDescription('Adaptive copy state. Values are: disabled(0) - Adaptive Copy is disabled, wp-mode(1) - Adaptive Copy Write Pending mode, disk-mode(2) - Adaptive Copy Disk mode, mixed(3) - set when the RDF modes of the devices in the group differ from each other, ac-na(4) - not applicable ')
devShowAdaptive_copy_skew = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 11), UInt32()).setLabel("devShowAdaptive-copy-skew").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowAdaptive_copy_skew.setStatus('mandatory')
if mibBuilder.loadTexts: devShowAdaptive_copy_skew.setDescription('Number of invalid tracks when in Adaptive copy mode. ')
devShowNum_r1_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 12), UInt32()).setLabel("devShowNum-r1-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowNum_r1_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowNum_r1_invalid_tracks.setDescription('Number of invalid tracks on R1')
devShowNum_r2_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 13), UInt32()).setLabel("devShowNum-r2-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowNum_r2_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowNum_r2_invalid_tracks.setDescription('Number of invalid tracks on R2')
devShowDev_rdf_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 14), DeviceStatus()).setLabel("devShowDev-rdf-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_rdf_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_rdf_state.setDescription('The RDF state. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
devShowRemote_dev_rdf_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 15), DeviceStatus()).setLabel("devShowRemote-dev-rdf-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRemote_dev_rdf_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRemote_dev_rdf_state.setDescription('The RDF device state. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
devShowRdf_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 16), DeviceStatus()).setLabel("devShowRdf-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowRdf_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowRdf_status.setDescription('The RDF status. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
devShowLink_domino = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 17), StateValues()).setLabel("devShowLink-domino").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowLink_domino.setStatus('mandatory')
if mibBuilder.loadTexts: devShowLink_domino.setDescription('The link domino state. Possible states are: enabled(0), disabled(1), mixed(2) - set when the RDF modes of the devices in the group differ from each other, state-na(3) - state not applicable ')
devShowPrevent_auto_link_recovery = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 18), StateValues()).setLabel("devShowPrevent-auto-link-recovery").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowPrevent_auto_link_recovery.setStatus('mandatory')
if mibBuilder.loadTexts: devShowPrevent_auto_link_recovery.setDescription('Prevents the automatic resumption of data copy across the RDF links as soon as the links have recovered. Possible states are: enabled(0), disabled(1), mixed(2) - set when the RDF modes of the devices in the group differ from each other, state-na(3) - state not applicable ')
devShowLink_config = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 19), RDFLinkConfig()).setLabel("devShowLink-config").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowLink_config.setStatus('mandatory')
if mibBuilder.loadTexts: devShowLink_config.setDescription('The RDF link configuration. Values are: escon(1), t3(2), na(3) ')
devShowSuspend_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 20), RDDFTransientState()).setLabel("devShowSuspend-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowSuspend_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowSuspend_state.setDescription('For R1 devices in a consistency group, this will be set to OFFLINE or OFFLINE_PENDING if a device in the group experiences a link failure. Values are: transient-state-na(1) - state not applicable, offline(2), offline_pend(3), online(4) ')
devShowConsistency_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 21), StateValues()).setLabel("devShowConsistency-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowConsistency_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowConsistency_state.setDescription('Indicates whether this R1 device is a member of any consistency group. Possible states are: enabled(0), disabled(1), mixed(2) - set when the RDF modes of the devices in the group differ from each other, state-na(3) - state not applicable ')
devShowAdaptive_copy_wp_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 22), RDDFTransientState()).setLabel("devShowAdaptive-copy-wp-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowAdaptive_copy_wp_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowAdaptive_copy_wp_state.setDescription('The Adaptive Copy Write Pending state. Values are: transient-state-na(1) - state not applicable, offline(2), offline_pend(3), online(4) ')
devShowPrevent_ra_online_upon_pwron = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 2, 1, 23), StateValues()).setLabel("devShowPrevent-ra-online-upon-pwron").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowPrevent_ra_online_upon_pwron.setStatus('mandatory')
if mibBuilder.loadTexts: devShowPrevent_ra_online_upon_pwron.setDescription("Prevents RA's from coming online after the Symmetrix is powered on. Possible states are: enabled(0), disabled(1), mixed(2) - set when the RDF modes of the devices in the group differ from each other, state-na(3) - state not applicable ")
devShowBCVInfoTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3), )
if mibBuilder.loadTexts: devShowBCVInfoTable.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBCVInfoTable.setDescription('A table of Symmetrix BCV device configuration information for the indicated Symmetrix and device instance. The number of entries is given by the value of symDevListCount.')
devShowBCVInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevListCount"))
if mibBuilder.loadTexts: devShowBCVInfoEntry.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBCVInfoEntry.setDescription('An entry containing objects for the Symmetrix BCV device configuration information for the specified Symmetrix and device.')
devShowDev_serial_id = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 1), DisplayString()).setLabel("devShowDev-serial-id").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_serial_id.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_serial_id.setDescription('Symmetrix device serial ID for the standard device in a BCV pair. If the device is a BCV device in a BCV pair, this field is the serial ID of the standard device with which the BCV is paired. If it is a standard device in a BCV pair, this field is the same as device_serial_id in SYMAPI_DEVICE_T. If the standard device was never paired with a BCV device, this field is NULL ')
devShowDev_sym_devname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 2), DisplayString()).setLabel("devShowDev-sym-devname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_sym_devname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_sym_devname.setDescription('Symmetrix device name/number for the standard device in a BCV pair. If the device is a BCV device in a BCV pair, this is the device name/number of the standard device with which the BCV is paired. If it is a standard device in a BCV pair, this is the same as sym_devname in SYMAPI_DEVICE_T. If the standard device was never paired with a BCV device, this field is NULL ')
devShowDev_dgname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 3), DisplayString()).setLabel("devShowDev-dgname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowDev_dgname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowDev_dgname.setDescription('Name of the device group of which the standard device is a member. If the standard device is not a member of a device group, this field is NULL')
devShowBcvdev_serial_id = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 4), DisplayString()).setLabel("devShowBcvdev-serial-id").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowBcvdev_serial_id.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBcvdev_serial_id.setDescription('Symmetrix device serial ID for the BCV device in a BCV pair. If the device is a BCV device in a BCV pair, this is the same as device_serial_id in SYMAPI_DEVICE_T. If it is a standard device in a BCV pair, this is the serial ID of the BCV device with which the standard device is paired. If the BCV device was never paired with a standard device, this field is NULL ')
devShowBcvdev_sym_devname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 5), DisplayString()).setLabel("devShowBcvdev-sym-devname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowBcvdev_sym_devname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBcvdev_sym_devname.setDescription('Symmetrix device name/number for the BCV device in a BCV pair. If the device is a BCV device in a BCV pair, this is the same as sym_devname in SYMAPI_DEVICE_T. If it is a standard device in a BCV pair, this is the device name/number of the BCV device with which the standard device is paired. If the BCV device was never paired with a standard device, this field is NULL.')
devShowBcvdev_dgname = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 6), DisplayString()).setLabel("devShowBcvdev-dgname").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowBcvdev_dgname.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBcvdev_dgname.setDescription('Name of the device group that the BCV device is associated with. If the BCV device is not associated with a device group, this field is NULL')
devShowBcv_pair_state = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 7), BCVState()).setLabel("devShowBcv-pair-state").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowBcv_pair_state.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBcv_pair_state.setDescription('The BCV pair state. Values are: never-established(0), in-progress(1), synchronous(2), split-in-progress(3), split-before-sync(4), split(5), split-no-incremental(6), restore-in-progress(7), restored(8), split-before-restore(9), invalid(10) ')
devShowNum_dev_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 8), UInt32()).setLabel("devShowNum-dev-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowNum_dev_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowNum_dev_invalid_tracks.setDescription('Number of invalid tracks on the standard device')
devShowNum_bcvdev_invalid_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 9), UInt32()).setLabel("devShowNum-bcvdev-invalid-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowNum_bcvdev_invalid_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devShowNum_bcvdev_invalid_tracks.setDescription('Number of invalid tracks on the BCV device')
devShowBcvdev_status = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 2, 2, 3, 1, 10), DeviceStatus()).setLabel("devShowBcvdev-status").setMaxAccess("readonly")
if mibBuilder.loadTexts: devShowBcvdev_status.setStatus('mandatory')
if mibBuilder.loadTexts: devShowBcvdev_status.setDescription('The BCV Device status. Values are: ready(0) not-ready(1) write-disabled(2) not-applicable(3) mixed(4) ')
symStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1), )
if mibBuilder.loadTexts: symStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: symStatTable.setDescription('A table of Symmetrix statistics for the indicated Symmetrix instance. The number of entries is given by the value of discIndex.')
symStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"))
if mibBuilder.loadTexts: symStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symStatEntry.setDescription('An entry containing objects for the Symmetrix statistics for the specified Symmetrix and device.')
symstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 1), TimeTicks()).setLabel("symstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatTime_stamp.setStatus('mandatory')
if mibBuilder.loadTexts: symstatTime_stamp.setDescription(' Time since these statistics were last collected')
symstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 2), UInt32()).setLabel("symstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_rw_reqs.setDescription(' Total number of all read and write requests on the specified Symmetrix unit ')
symstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 3), UInt32()).setLabel("symstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_read_reqs.setDescription(' Number of all read requests on the specified Symmetrix unit ')
symstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 4), UInt32()).setLabel("symstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_write_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_write_reqs.setDescription(' Number of all write requests on the specified Symmetrix unit ')
symstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 5), UInt32()).setLabel("symstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_rw_hits.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_rw_hits.setDescription(' Total number of all read and write cache hits for all devices on the specified Symmetrix unit ')
symstatNum_read_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 6), UInt32()).setLabel("symstatNum-read-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_read_hits.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_read_hits.setDescription('Total number of cache read hits for all devices on the specified Symmetrix unit ')
symstatNum_write_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 7), UInt32()).setLabel("symstatNum-write-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_write_hits.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_write_hits.setDescription('Total number of cache write hits for all devices on the specified Symmetrix unit ')
symstatNum_blocks_read = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 12), UInt32()).setLabel("symstatNum-blocks-read").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_blocks_read.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_blocks_read.setDescription('Total number of (512 byte) blocks read for all devices on the specified Symmetrix unit ')
symstatNum_blocks_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 17), UInt32()).setLabel("symstatNum-blocks-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_blocks_written.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_blocks_written.setDescription('Total number of (512 byte) blocks written for all devices on the specified Symmetrix unit ')
symstatNum_seq_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 18), UInt32()).setLabel("symstatNum-seq-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_seq_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_seq_read_reqs.setDescription('Number of sequential read requests ')
symstatNum_prefetched_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 19), UInt32()).setLabel("symstatNum-prefetched-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_prefetched_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_prefetched_tracks.setDescription('Number of prefetched tracks ')
symstatNum_destaged_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 20), UInt32()).setLabel("symstatNum-destaged-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_destaged_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_destaged_tracks.setDescription('Number of destaged tracks ')
symstatNum_deferred_writes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 21), UInt32()).setLabel("symstatNum-deferred-writes").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_deferred_writes.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_deferred_writes.setDescription('Number of deferred writes ')
symstatNum_delayed_dfw = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 22), UInt32()).setLabel("symstatNum-delayed-dfw").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_delayed_dfw.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_delayed_dfw.setDescription('Number of deferred writes delayed until tracks are destaged. (Reserved for EMC use only.) ')
symstatNum_wr_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 23), UInt32()).setLabel("symstatNum-wr-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_wr_pend_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_wr_pend_tracks.setDescription('Number of tracks waiting to be destaged from cache on to disk for the specified Symmetrix unit ')
symstatNum_format_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 24), UInt32()).setLabel("symstatNum-format-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_format_pend_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_format_pend_tracks.setDescription(' Number of format pending tracks. (Reserved for EMC use only.) ')
symstatDevice_max_wp_limit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 25), UInt32()).setLabel("symstatDevice-max-wp-limit").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatDevice_max_wp_limit.setStatus('mandatory')
if mibBuilder.loadTexts: symstatDevice_max_wp_limit.setDescription('Maximum write pending limit for a device ')
symstatNum_sa_cdb_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 26), UInt32()).setLabel("symstatNum-sa-cdb-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_sa_cdb_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_sa_cdb_reqs.setDescription('Number of Command Descriptor Blocks (CDBs) sent to the Symmetrix unit. (Reads, writes, and inquiries are the types of commands sent in the CDBs.) ')
symstatNum_sa_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 27), UInt32()).setLabel("symstatNum-sa-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_sa_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_sa_rw_reqs.setDescription('Total number of all read and write requests for all SCSI adapters (SAs) on the specified Symmetrix unit')
symstatNum_sa_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 28), UInt32()).setLabel("symstatNum-sa-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_sa_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_sa_read_reqs.setDescription('Total number of all read requests for all SCSI adapters (SAs) on the specified Symmetrix unit')
symstatNum_sa_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 29), UInt32()).setLabel("symstatNum-sa-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_sa_write_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_sa_write_reqs.setDescription('Total number of all write requests for all SCSI adapters (SAs) on the specified Symmetrix unit')
symstatNum_sa_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 30), UInt32()).setLabel("symstatNum-sa-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_sa_rw_hits.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_sa_rw_hits.setDescription('Total number of all read and write cache hits for all SCSI adapters (SAs) on the specified Symmetrix unit')
symstatNum_free_permacache_slots = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 31), UInt32()).setLabel("symstatNum-free-permacache-slots").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_free_permacache_slots.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_free_permacache_slots.setDescription('Total number of PermaCache slots that are available')
symstatNum_used_permacache_slots = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 1, 1, 32), UInt32()).setLabel("symstatNum-used-permacache-slots").setMaxAccess("readonly")
if mibBuilder.loadTexts: symstatNum_used_permacache_slots.setStatus('mandatory')
if mibBuilder.loadTexts: symstatNum_used_permacache_slots.setDescription('Total number of PermaCache slots that are used')
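The symstat request and hit counters above are typically combined into cache hit ratios on the manager side. The sketch below shows that arithmetic; it is not part of the generated MIB, and the sample counter values are hypothetical numbers standing in for a real SNMP poll.

```python
def hit_ratio(hits, reqs):
    """Fraction of requests served from cache; 0.0 when no requests yet."""
    return hits / reqs if reqs else 0.0

# Hypothetical polled values for the symstat objects named in the comments.
num_rw_reqs = 10000    # symstatNum-rw-reqs
num_rw_hits = 8200     # symstatNum-rw-hits
num_read_reqs = 6000   # symstatNum-read-reqs
num_read_hits = 4500   # symstatNum-read-hits

overall_ratio = hit_ratio(num_rw_hits, num_rw_reqs)     # 0.82
read_ratio = hit_ratio(num_read_hits, num_read_reqs)    # 0.75
```

Guarding against a zero request count matters right after the counters reset, when a naive division would raise ZeroDivisionError.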
devStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2), )
if mibBuilder.loadTexts: devStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: devStatTable.setDescription('A table of Symmetrix device statistics for the indicated Symmetrix and device instance. The number of entries is given by the value of symDevListCount.')
devStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symDevListCount"))
if mibBuilder.loadTexts: devStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: devStatEntry.setDescription('An entry containing objects for the Symmetrix device statistics for the specified Symmetrix and device.')
devstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 1), TimeTicks()).setLabel("devstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatTime_stamp.setStatus('mandatory')
if mibBuilder.loadTexts: devstatTime_stamp.setDescription(' Time since these statistics were last collected')
devstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 2), UInt32()).setLabel("devstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_sym_timeslices.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_sym_timeslices.setDescription(' Number of half-second intervals from reset until the Symmetrix internally snapshots the director counters. This field must be used when computing a rate.')
devstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 3), UInt32()).setLabel("devstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_rw_reqs.setDescription(' Total number of I/Os for the device')
devstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 4), UInt32()).setLabel("devstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_read_reqs.setDescription(' Total number of read requests for the device')
devstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 5), UInt32()).setLabel("devstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_write_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_write_reqs.setDescription(' Total number of write requests for the device ')
devstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 6), UInt32()).setLabel("devstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_rw_hits.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_rw_hits.setDescription(' Total number of cache hits (read & write) for the device ')
devstatNum_read_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 7), UInt32()).setLabel("devstatNum-read-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_read_hits.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_read_hits.setDescription(' Total number of cache read hits for the device ')
devstatNum_write_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 8), UInt32()).setLabel("devstatNum-write-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_write_hits.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_write_hits.setDescription(' Total number of cache write hits for the device ')
devstatNum_blocks_read = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 13), UInt32()).setLabel("devstatNum-blocks-read").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_blocks_read.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_blocks_read.setDescription(' Total number of (512 byte) blocks read for the device ')
devstatNum_blocks_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 18), UInt32()).setLabel("devstatNum-blocks-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_blocks_written.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_blocks_written.setDescription(' Total number of (512 byte) blocks written for the device ')
devstatNum_seq_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 19), UInt32()).setLabel("devstatNum-seq-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_seq_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_seq_read_reqs.setDescription(' Total number of sequential read requests for the device ')
devstatNum_prefetched_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 20), UInt32()).setLabel("devstatNum-prefetched-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_prefetched_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_prefetched_tracks.setDescription(' Total number of prefetched tracks for the device ')
devstatNum_destaged_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 21), UInt32()).setLabel("devstatNum-destaged-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_destaged_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_destaged_tracks.setDescription(' Total number of destaged tracks for the device ')
devstatNum_deferred_writes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 22), UInt32()).setLabel("devstatNum-deferred-writes").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_deferred_writes.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_deferred_writes.setDescription(' Total number of deferred writes for the device ')
devstatNum_delayed_dfw = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 23), UInt32()).setLabel("devstatNum-delayed-dfw").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_delayed_dfw.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_delayed_dfw.setDescription(' Total number of delayed deferred writes until track destaged')
devstatNum_wp_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 24), UInt32()).setLabel("devstatNum-wp-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_wp_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_wp_tracks.setDescription(' Total number of write pending tracks for the device')
devstatNum_format_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 25), UInt32()).setLabel("devstatNum-format-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatNum_format_pend_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: devstatNum_format_pend_tracks.setDescription(' Total number of format pending tracks for the device')
devstatDevice_max_wp_limit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 2, 1, 26), UInt32()).setLabel("devstatDevice-max-wp-limit").setMaxAccess("readonly")
if mibBuilder.loadTexts: devstatDevice_max_wp_limit.setStatus('mandatory')
if mibBuilder.loadTexts: devstatDevice_max_wp_limit.setDescription(' Device max write pending limit')
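The devstat objects are cumulative counters, and the devstatNum-sym-timeslices description states that rates must be derived from it, with each timeslice covering half a second. The helper below is a sketch under that reading, not part of this generated MIB; the function name `counter_rate` is hypothetical.

```python
def counter_rate(prev_count, curr_count, prev_slices, curr_slices):
    """Events per second between two polls of the same devstat counter,
    where each timeslice delta unit represents 0.5 seconds."""
    elapsed_seconds = (curr_slices - prev_slices) * 0.5
    if elapsed_seconds <= 0:
        return 0.0  # no time elapsed (or counters reset between polls)
    return (curr_count - prev_count) / elapsed_seconds

# e.g. 1200 additional read requests over 120 timeslices (60 seconds):
rate = counter_rate(0, 1200, 0, 120)  # 20.0 requests/second
```

Both polls must come from the same reset epoch; a smaller timeslice value on the second poll indicates a reset, which the guard treats as an unusable interval.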
pDevStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3), )
if mibBuilder.loadTexts: pDevStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: pDevStatTable.setDescription('A table of Symmetrix physical device statistics for the indicated Symmetrix and device instance. The number of entries is given by the value of symPDevListCount.')
pDevStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symPDevListCount"))
if mibBuilder.loadTexts: pDevStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: pDevStatEntry.setDescription('An entry containing objects for the Symmetrix physical device statistics for the specified Symmetrix and device.')
pdevstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 1), TimeTicks()).setLabel("pdevstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatTime_stamp.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatTime_stamp.setDescription(' Time since these statistics were last collected')
pdevstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 2), UInt32()).setLabel("pdevstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_sym_timeslices.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_sym_timeslices.setDescription(' Number of half-second intervals from reset until the Symmetrix internally snapshots the director counters. This field must be used when computing a rate.')
pdevstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 3), UInt32()).setLabel("pdevstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_rw_reqs.setDescription(' Total number of I/Os for the device')
pdevstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 4), UInt32()).setLabel("pdevstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_read_reqs.setDescription(' Total number of read requests for the device')
pdevstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 5), UInt32()).setLabel("pdevstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_write_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_write_reqs.setDescription(' Total number of write requests for the device ')
pdevstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 6), UInt32()).setLabel("pdevstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_rw_hits.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_rw_hits.setDescription(' Total number of cache hits (read & write) for the device ')
pdevstatNum_read_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 7), UInt32()).setLabel("pdevstatNum-read-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_read_hits.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_read_hits.setDescription(' Total number of cache read hits for the device ')
pdevstatNum_write_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 8), UInt32()).setLabel("pdevstatNum-write-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_write_hits.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_write_hits.setDescription(' Total number of cache write hits for the device ')
pdevstatNum_blocks_read = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 13), UInt32()).setLabel("pdevstatNum-blocks-read").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_blocks_read.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_blocks_read.setDescription(' Total number of (512 byte) blocks read for the device ')
pdevstatNum_blocks_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 18), UInt32()).setLabel("pdevstatNum-blocks-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_blocks_written.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_blocks_written.setDescription(' Total number of (512 byte) blocks written for the device ')
pdevstatNum_seq_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 19), UInt32()).setLabel("pdevstatNum-seq-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_seq_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_seq_read_reqs.setDescription(' Total number of sequential read requests for the device ')
pdevstatNum_prefetched_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 20), UInt32()).setLabel("pdevstatNum-prefetched-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_prefetched_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_prefetched_tracks.setDescription(' Total number of prefetched tracks for the device ')
pdevstatNum_destaged_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 21), UInt32()).setLabel("pdevstatNum-destaged-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_destaged_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_destaged_tracks.setDescription(' Total number of destaged tracks for the device ')
pdevstatNum_deferred_writes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 22), UInt32()).setLabel("pdevstatNum-deferred-writes").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_deferred_writes.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_deferred_writes.setDescription(' Total number of deferred writes for the device ')
pdevstatNum_delayed_dfw = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 23), UInt32()).setLabel("pdevstatNum-delayed-dfw").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_delayed_dfw.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_delayed_dfw.setDescription(' Total number of delayed deferred writes until track destaged')
pdevstatNum_wp_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 24), UInt32()).setLabel("pdevstatNum-wp-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_wp_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_wp_tracks.setDescription(' Total number of write pending tracks for the device')
pdevstatNum_format_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 25), UInt32()).setLabel("pdevstatNum-format-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatNum_format_pend_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatNum_format_pend_tracks.setDescription(' Total number of format pending tracks for the device')
pdevstatDevice_max_wp_limit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 3, 1, 26), UInt32()).setLabel("pdevstatDevice-max-wp-limit").setMaxAccess("readonly")
if mibBuilder.loadTexts: pdevstatDevice_max_wp_limit.setStatus('mandatory')
if mibBuilder.loadTexts: pdevstatDevice_max_wp_limit.setDescription(' Device max write pending limit')
lDevStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4), )
if mibBuilder.loadTexts: lDevStatTable.setStatus('obsolete')
if mibBuilder.loadTexts: lDevStatTable.setDescription('A table of Symmetrix logical device statistics for the indicated Symmetrix and device instance. The number of entries is given by the value of symLDevListCount.')
lDevStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1), ).setIndexNames((0, "EMC-MIB", "symDgListCount"), (0, "EMC-MIB", "symLDevListCount"))
if mibBuilder.loadTexts: lDevStatEntry.setStatus('obsolete')
if mibBuilder.loadTexts: lDevStatEntry.setDescription('An entry containing objects for the Symmetrix logical device statistics for the specified Symmetrix and device.')
ldevstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 1), TimeTicks()).setLabel("ldevstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatTime_stamp.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatTime_stamp.setDescription(' Time since these statistics were last collected')
ldevstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 2), UInt32()).setLabel("ldevstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_sym_timeslices.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_sym_timeslices.setDescription(' Number of 1/2 seconds from reset until the Symmetrix internally snapshots the director counters. You must use this field when computing a rate.')
ldevstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 3), UInt32()).setLabel("ldevstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_rw_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_rw_reqs.setDescription(' Total number of I/Os for the device')
ldevstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 4), UInt32()).setLabel("ldevstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_read_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_read_reqs.setDescription(' Total number of read requests for the device')
ldevstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 5), UInt32()).setLabel("ldevstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_write_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_write_reqs.setDescription(' Total number of write requests for the device ')
ldevstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 6), UInt32()).setLabel("ldevstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_rw_hits.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_rw_hits.setDescription(' Total number of cache hits (read & write) for the device ')
ldevstatNum_read_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 7), UInt32()).setLabel("ldevstatNum-read-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_read_hits.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_read_hits.setDescription(' Total number of cache read hits for the device ')
ldevstatNum_write_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 8), UInt32()).setLabel("ldevstatNum-write-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_write_hits.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_write_hits.setDescription(' Total number of cache write hits for the device ')
ldevstatNum_blocks_read = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 13), UInt32()).setLabel("ldevstatNum-blocks-read").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_blocks_read.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_blocks_read.setDescription(' Total number of (512 byte) blocks read for the device ')
ldevstatNum_blocks_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 18), UInt32()).setLabel("ldevstatNum-blocks-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_blocks_written.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_blocks_written.setDescription(' Total number of (512 byte) blocks written for the device ')
ldevstatNum_seq_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 19), UInt32()).setLabel("ldevstatNum-seq-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_seq_read_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_seq_read_reqs.setDescription(' Total number of sequential read requests for the device ')
ldevstatNum_prefetched_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 20), UInt32()).setLabel("ldevstatNum-prefetched-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_prefetched_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_prefetched_tracks.setDescription(' Total number of prefetched tracks for the device ')
ldevstatNum_destaged_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 21), UInt32()).setLabel("ldevstatNum-destaged-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_destaged_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_destaged_tracks.setDescription(' Total number of destaged tracks for the device ')
ldevstatNum_deferred_writes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 22), UInt32()).setLabel("ldevstatNum-deferred-writes").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_deferred_writes.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_deferred_writes.setDescription(' Total number of deferred writes for the device ')
ldevstatNum_delayed_dfw = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 23), UInt32()).setLabel("ldevstatNum-delayed-dfw").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_delayed_dfw.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_delayed_dfw.setDescription(' Total number of delayed deferred writes until track destaged')
ldevstatNum_wp_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 24), UInt32()).setLabel("ldevstatNum-wp-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_wp_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_wp_tracks.setDescription(' Total number of write pending tracks for the device')
ldevstatNum_format_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 25), UInt32()).setLabel("ldevstatNum-format-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatNum_format_pend_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatNum_format_pend_tracks.setDescription(' Total number of format pending tracks for the device')
ldevstatDevice_max_wp_limit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 4, 1, 26), UInt32()).setLabel("ldevstatDevice-max-wp-limit").setMaxAccess("readonly")
if mibBuilder.loadTexts: ldevstatDevice_max_wp_limit.setStatus('obsolete')
if mibBuilder.loadTexts: ldevstatDevice_max_wp_limit.setDescription(' Device max write pending limit')
dgStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5), )
if mibBuilder.loadTexts: dgStatTable.setStatus('obsolete')
if mibBuilder.loadTexts: dgStatTable.setDescription('A table of Symmetrix device group statistics for the indicated device group instance. The number of entries is given by the value of symDgListCount.')
dgStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1), ).setIndexNames((0, "EMC-MIB", "symDgListCount"))
if mibBuilder.loadTexts: dgStatEntry.setStatus('obsolete')
if mibBuilder.loadTexts: dgStatEntry.setDescription('An entry containing objects for the device group statistics for the specified device group.')
dgstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 1), TimeTicks()).setLabel("dgstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatTime_stamp.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatTime_stamp.setDescription(' Time since these statistics were last collected')
dgstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 2), UInt32()).setLabel("dgstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_sym_timeslices.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_sym_timeslices.setDescription(' Number of 1/2 seconds from reset until the Symmetrix internally snapshots the director counters. You must use this field when computing a rate.')
dgstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 3), UInt32()).setLabel("dgstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_rw_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_rw_reqs.setDescription(' Total number of I/Os for the device group')
dgstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 4), UInt32()).setLabel("dgstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_read_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_read_reqs.setDescription(' Total number of read requests for the device group')
dgstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 5), UInt32()).setLabel("dgstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_write_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_write_reqs.setDescription(' Total number of write requests for the device group ')
dgstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 6), UInt32()).setLabel("dgstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_rw_hits.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_rw_hits.setDescription(' Total number of cache hits (read & write) for the device group ')
dgstatNum_read_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 7), UInt32()).setLabel("dgstatNum-read-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_read_hits.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_read_hits.setDescription(' Total number of cache read hits for the device group ')
dgstatNum_write_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 8), UInt32()).setLabel("dgstatNum-write-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_write_hits.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_write_hits.setDescription(' Total number of cache write hits for the device group ')
dgstatNum_blocks_read = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 13), UInt32()).setLabel("dgstatNum-blocks-read").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_blocks_read.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_blocks_read.setDescription(' Total number of (512 byte) blocks read for the device group ')
dgstatNum_blocks_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 18), UInt32()).setLabel("dgstatNum-blocks-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_blocks_written.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_blocks_written.setDescription(' Total number of (512 byte) blocks written for the device group ')
dgstatNum_seq_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 19), UInt32()).setLabel("dgstatNum-seq-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_seq_read_reqs.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_seq_read_reqs.setDescription(' Total number of sequential read requests for the device group ')
dgstatNum_prefetched_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 20), UInt32()).setLabel("dgstatNum-prefetched-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_prefetched_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_prefetched_tracks.setDescription(' Total number of prefetched tracks for the device group ')
dgstatNum_destaged_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 21), UInt32()).setLabel("dgstatNum-destaged-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_destaged_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_destaged_tracks.setDescription(' Total number of destaged tracks for the device group ')
dgstatNum_deferred_writes = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 22), UInt32()).setLabel("dgstatNum-deferred-writes").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_deferred_writes.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_deferred_writes.setDescription(' Total number of deferred writes for the device group ')
dgstatNum_delayed_dfw = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 23), UInt32()).setLabel("dgstatNum-delayed-dfw").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_delayed_dfw.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_delayed_dfw.setDescription(' Total number of delayed deferred writes until track destaged')
dgstatNum_wp_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 24), UInt32()).setLabel("dgstatNum-wp-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_wp_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_wp_tracks.setDescription(' Total number of write pending tracks for the device group')
dgstatNum_format_pend_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 25), UInt32()).setLabel("dgstatNum-format-pend-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatNum_format_pend_tracks.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatNum_format_pend_tracks.setDescription(' Total number of format pending tracks for the device group')
dgstatdevice_max_wp_limit = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 5, 1, 26), UInt32()).setLabel("dgstatdevice-max-wp-limit").setMaxAccess("readonly")
if mibBuilder.loadTexts: dgstatdevice_max_wp_limit.setStatus('obsolete')
if mibBuilder.loadTexts: dgstatdevice_max_wp_limit.setDescription(' Device group max write pending limit')
directorStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6), )
if mibBuilder.loadTexts: directorStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: directorStatTable.setDescription('A table of Symmetrix director statistics for the indicated Symmetrix and director instance. The number of entries is given by the value of symShowDirectorCount.')
directorStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: directorStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: directorStatEntry.setDescription('An entry containing objects for the Symmetrix director statistics for the specified Symmetrix and director.')
dirstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 1), TimeTicks()).setLabel("dirstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatTime_stamp.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatTime_stamp.setDescription(' Time since these statistics were last collected')
dirstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 2), UInt32()).setLabel("dirstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_sym_timeslices.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_sym_timeslices.setDescription(' Number of 1/2 seconds from reset until the Symmetrix internally snapshots the director counters. You must use this field when computing a rate.')
dirstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 3), UInt32()).setLabel("dirstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_rw_reqs.setDescription(' Total number of I/Os for the director')
dirstatNum_read_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 4), UInt32()).setLabel("dirstatNum-read-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_read_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_read_reqs.setDescription(' Total number of read requests for the director')
dirstatNum_write_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 5), UInt32()).setLabel("dirstatNum-write-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_write_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_write_reqs.setDescription(' Total number of write requests for the director ')
dirstatNum_rw_hits = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 6), UInt32()).setLabel("dirstatNum-rw-hits").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_rw_hits.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_rw_hits.setDescription(' Total number of cache hits (read & write) for the director ')
dirstatNum_permacache_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 7), UInt32()).setLabel("dirstatNum-permacache-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_permacache_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_permacache_reqs.setDescription(' Total number of permacache requests for the director ')
dirstatNum_ios = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 6, 1, 8), UInt32()).setLabel("dirstatNum-ios").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatNum_ios.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatNum_ios.setDescription(' Total number of I/Os for the director ')
saDirStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7), )
if mibBuilder.loadTexts: saDirStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: saDirStatTable.setDescription('A table of Symmetrix SA director statistics for the indicated Symmetrix and director instance. The number of entries is given by the value of symShowDirectorCount.')
saDirStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: saDirStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: saDirStatEntry.setDescription('An entry containing objects for the Symmetrix SA director statistics for the specified Symmetrix and director.')
dirstatSANum_read_misses = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7, 1, 1), UInt32()).setLabel("dirstatSANum-read-misses").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatSANum_read_misses.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatSANum_read_misses.setDescription(' Total number of cache read misses')
dirstatSANum_slot_collisions = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7, 1, 2), UInt32()).setLabel("dirstatSANum-slot-collisions").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatSANum_slot_collisions.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatSANum_slot_collisions.setDescription(' Total number of cache slot collisions')
dirstatSANum_system_wp_disconnects = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7, 1, 3), UInt32()).setLabel("dirstatSANum-system-wp-disconnects").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatSANum_system_wp_disconnects.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatSANum_system_wp_disconnects.setDescription('Total number of system write pending disconnects. The limit for the system parameter for maximum number of write pendings was exceeded.')
dirstatSANum_device_wp_disconnects = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 7, 1, 4), UInt32()).setLabel("dirstatSANum-device-wp-disconnects").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatSANum_device_wp_disconnects.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatSANum_device_wp_disconnects.setDescription('Total number of device write pending disconnects. The limit for the device parameter for maximum number of write pendings was exceeded.')
daDirStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8), )
if mibBuilder.loadTexts: daDirStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: daDirStatTable.setDescription('A table of Symmetrix DA director statistics for the indicated Symmetrix and director instance. The number of entries is given by the value of symShowDirectorCount.')
daDirStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: daDirStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: daDirStatEntry.setDescription('An entry containing objects for the Symmetrix DA director statistics for the specified Symmetrix and director.')
dirstatDANum_pf_tracks = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 1), UInt32()).setLabel("dirstatDANum-pf-tracks").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_tracks.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_tracks.setDescription(' The number of prefetched tracks. Remember that cache may contain vestigial read or written tracks.')
dirstatDANum_pf_tracks_used = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 2), UInt32()).setLabel("dirstatDANum-pf-tracks-used").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_tracks_used.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_tracks_used.setDescription('The number of prefetched tracks used to satisfy read/write requests')
dirstatDANum_pf_tracks_unused = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 3), UInt32()).setLabel("dirstatDANum-pf-tracks-unused").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_tracks_unused.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_tracks_unused.setDescription('The number of prefetched tracks unused and replaced by another.')
dirstatDANum_pf_short_misses = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 4), UInt32()).setLabel("dirstatDANum-pf-short-misses").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_short_misses.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_short_misses.setDescription('The number of tracks already being prefetched when a read/write request for that track occurred.')
dirstatDANum_pf_long_misses = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 5), UInt32()).setLabel("dirstatDANum-pf-long-misses").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_long_misses.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_long_misses.setDescription('The number of tracks requiring a complete fetch when a read/write request for that track occurred.')
dirstatDANum_pf_restarts = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 6), UInt32()).setLabel("dirstatDANum-pf-restarts").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_restarts.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_restarts.setDescription('The number of times that a prefetch task needed to be restarted.')
dirstatDANum_pf_mismatches = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 8, 1, 7), UInt32()).setLabel("dirstatDANum-pf-mismatches").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatDANum_pf_mismatches.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatDANum_pf_mismatches.setDescription('The number of times a prefetch task needed to be canceled.')
raDirStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9), )
if mibBuilder.loadTexts: raDirStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: raDirStatTable.setDescription('A table of Symmetrix RA director statistics for the indicated Symmetrix and director instance. The number of entries is given by the value of symShowDirectorCount.')
raDirStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: raDirStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: raDirStatEntry.setDescription('An entry containing objects for the Symmetrix RA director statistics for the specified Symmetrix and director.')
dirstatRANum_read_misses = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9, 1, 1), UInt32()).setLabel("dirstatRANum-read-misses").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatRANum_read_misses.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatRANum_read_misses.setDescription(' Total number of cache read misses')
dirstatRANum_slot_collisions = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9, 1, 2), UInt32()).setLabel("dirstatRANum-slot-collisions").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatRANum_slot_collisions.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatRANum_slot_collisions.setDescription(' Total number of cache slot collisions')
dirstatRANum_system_wp_disconnects = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9, 1, 3), UInt32()).setLabel("dirstatRANum-system-wp-disconnects").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatRANum_system_wp_disconnects.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatRANum_system_wp_disconnects.setDescription('Total number of system write pending disconnects. The limit for the system parameter for maximum number of write pendings was exceeded.')
dirstatRANum_device_wp_disconnects = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 9, 1, 4), UInt32()).setLabel("dirstatRANum-device-wp-disconnects").setMaxAccess("readonly")
if mibBuilder.loadTexts: dirstatRANum_device_wp_disconnects.setStatus('mandatory')
if mibBuilder.loadTexts: dirstatRANum_device_wp_disconnects.setDescription('Total number of device write pending disconnects. The limit for the device parameter for maximum number of write pendings was exceeded.')
dirStatPortCountTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 1), )
if mibBuilder.loadTexts: dirStatPortCountTable.setStatus('mandatory')
if mibBuilder.loadTexts: dirStatPortCountTable.setDescription('A list of the number of available ports for the given Symmetrix and director instance. The number of entries is given by the value of symShowDirectorCount.')
dirStatPortCountEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 1, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"))
if mibBuilder.loadTexts: dirStatPortCountEntry.setStatus('mandatory')
if mibBuilder.loadTexts: dirStatPortCountEntry.setDescription('An entry containing objects for the number of available ports for the specified Symmetrix and director')
dirPortStatPortCount = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dirPortStatPortCount.setStatus('mandatory')
if mibBuilder.loadTexts: dirPortStatPortCount.setDescription('The number of entries in the dirPortStatTable table for the indicated Symmetrix and director instance')
dirPortStatTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2), )
if mibBuilder.loadTexts: dirPortStatTable.setStatus('mandatory')
if mibBuilder.loadTexts: dirPortStatTable.setDescription('A table of Symmetrix director statistics for the indicated Symmetrix, director and port instance. The number of entries is given by the value dirPortStatPortCount.')
dirPortStatEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symShowDirectorCount"), (0, "EMC-MIB", "dirPortStatPortCount"))
if mibBuilder.loadTexts: dirPortStatEntry.setStatus('mandatory')
if mibBuilder.loadTexts: dirPortStatEntry.setDescription('An entry containing objects for the Symmetrix director port statistics for the specified Symmetrix, director, and port.')
portstatTime_stamp = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2, 1, 1), TimeTicks()).setLabel("portstatTime-stamp").setMaxAccess("readonly")
if mibBuilder.loadTexts: portstatTime_stamp.setStatus('mandatory')
if mibBuilder.loadTexts: portstatTime_stamp.setDescription(' Time since these statistics were last collected')
portstatNum_sym_timeslices = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2, 1, 2), UInt32()).setLabel("portstatNum-sym-timeslices").setMaxAccess("readonly")
if mibBuilder.loadTexts: portstatNum_sym_timeslices.setStatus('mandatory')
if mibBuilder.loadTexts: portstatNum_sym_timeslices.setDescription(' Number of 1/2 seconds from reset until the Symmetrix internally snapshots the director counters. You must use this field when computing a rate.')
portstatNum_rw_reqs = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2, 1, 3), UInt32()).setLabel("portstatNum-rw-reqs").setMaxAccess("readonly")
if mibBuilder.loadTexts: portstatNum_rw_reqs.setStatus('mandatory')
if mibBuilder.loadTexts: portstatNum_rw_reqs.setDescription('The number of I/O requests (reads and writes) handled by the front-end port. ')
portstatNum_blocks_read_and_written = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 6, 3, 10, 2, 1, 4), UInt32()).setLabel("portstatNum-blocks-read-and-written").setMaxAccess("readonly")
if mibBuilder.loadTexts: portstatNum_blocks_read_and_written.setStatus('mandatory')
if mibBuilder.loadTexts: portstatNum_blocks_read_and_written.setDescription('The number of blocks read and written by the front-end port.')
symmEventMaxEvents = MibScalar((1, 3, 6, 1, 4, 1, 1139, 1, 7, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symmEventMaxEvents.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventMaxEvents.setDescription("Max number of events that can be defined in each Symmetrix's symmEventTable.")
symmEventTable = MibTable((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2), )
if mibBuilder.loadTexts: symmEventTable.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventTable.setDescription('The table of Symmetrix events. Errors, warnings, and information should be reported in this table.')
symmEventEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2, 1), ).setIndexNames((0, "EMC-MIB", "discIndex"), (0, "EMC-MIB", "symmEventIndex"))
if mibBuilder.loadTexts: symmEventEntry.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventEntry.setDescription('Each entry contains information on a specific event for the given Symmetrix.')
symmEventIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: symmEventIndex.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventIndex.setDescription('Each Symmetrix has its own event buffer. As it wraps, it may write over previous events. This object is an index into the buffer. The index value is an incrementing integer starting from one every time there is a table reset. On table reset, all contents are emptied and all indices are set to zero. When an event is added to the table, the event is assigned an integer value one higher than that of the last item entered into the table. If the index value reaches its maximum value, the next item entered will cause the index value to roll over and start at one again.')
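The wrap-around rule in the description above can be sketched as a small helper (a hypothetical illustration, not part of the MIB; the maximum matches the ValueRangeConstraint on this column):

```python
MAX_EVENT_INDEX = 2147483647  # upper bound of symmEventIndex's value range

def next_event_index(last_index):
    """Return the next symmEventIndex value, rolling over to 1
    once the maximum index value has been assigned."""
    if last_index >= MAX_EVENT_INDEX:
        return 1
    return last_index + 1
```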
symmEventTime = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symmEventTime.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventTime.setDescription('This is the time when the event occurred, in the format: DOW MON DD HH:MM:SS YYYY, where DOW=day of week, MON=month, DD=day number, HH=hour, MM=minute, SS=seconds, and YYYY=year. If not applicable, a NULL string is returned.')
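The fixed `DOW MON DD HH:MM:SS YYYY` layout described above maps directly onto standard-library parsing. A hedged sketch (hypothetical helper, assumes an English/C locale for the %a and %b abbreviations):

```python
from datetime import datetime

def parse_symm_event_time(value):
    """Parse the DOW MON DD HH:MM:SS YYYY format used by symmEventTime.
    Returns None for the NULL-string case the MIB allows."""
    if not value:
        return None
    return datetime.strptime(value, "%a %b %d %H:%M:%S %Y")

# Timestamp taken from the emcSymmetrixStatusTrap example message:
ts = parse_symm_event_time("Thu Apr  6 10:53:16 2000")
```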
symmEventSeverity = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))).clone(namedValues=NamedValues(("unknown", 1), ("emergency", 2), ("alert", 3), ("critical", 4), ("error", 5), ("warning", 6), ("notify", 7), ("info", 8), ("debug", 9), ("mark", 10)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: symmEventSeverity.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventSeverity.setDescription('The event severity level. These map directly to those from the Fibre MIB, version 2.2.')
symmEventDescr = MibTableColumn((1, 3, 6, 1, 4, 1, 1139, 1, 7, 2, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: symmEventDescr.setStatus('mandatory')
if mibBuilder.loadTexts: symmEventDescr.setDescription('The description of the event.')
emcDeviceStatusTrap = NotificationType((1, 3, 6, 1, 4, 1, 1139, 1) + (0,1)).setObjects(("EMC-MIB", "symmEventDescr"))
if mibBuilder.loadTexts: emcDeviceStatusTrap.setDescription("This trap is sent for each device found 'NOT READY' during the most recent test of each attached Symmetrix for Device Not Ready conditions. ")
emcSymmetrixStatusTrap = NotificationType((1, 3, 6, 1, 4, 1, 1139, 1) + (0,2)).setObjects(("EMC-MIB", "symmEventDescr"))
if mibBuilder.loadTexts: emcSymmetrixStatusTrap.setDescription("This trap is sent for each new WARNING and FATAL error condition found during the most recent 'health' test of each attached Symmetrix. Format of the message is: Symmetrix s/n: %s, Dir - %d, %04X, %s, %s an example of which is: Symmetrix s/n: 12345, Dir - 31, 470, Thu Apr 6 10:53:16 2000, Environmental alarm: Battery Fault ")
emcRatiosOutofRangeTrap = NotificationType((1, 3, 6, 1, 4, 1, 1139, 1) + (0,3)).setObjects(("EMC-MIB", "symmEventDescr"))
if mibBuilder.loadTexts: emcRatiosOutofRangeTrap.setDescription('This trap is sent for each attached Symmetrix when the Hit Ratio, Write Ratio, or IO/sec Ratio were out of the specified range during the most recent test for these conditions. The ratios are preconfigured at agent startup, and apply to all Symmetrixes attached. ')
discoveryTableChange = NotificationType((1, 3, 6, 1, 4, 1, 1139, 1) + (0,4)).setObjects(("EMC-MIB", "discoveryChangeTime"))
if mibBuilder.loadTexts: discoveryTableChange.setDescription('This trap is sent whenever the periodic check of attached Symmetrixes reveals newly attached Symmetrixes, or changes in the configuration of previously attached Symmetrixes. ')
emcSymmetrixEventTrap = NotificationType((1, 3, 6, 1, 4, 1, 1139, 1) + (0,5)).setObjects(("EMC-MIB", "symmEventDescr"))
if mibBuilder.loadTexts: emcSymmetrixEventTrap.setDescription('This trap is sent whenever a non-specific (i.e. not traps 1-4) event occurs in the agent, or for a specific Symmetrix. ')
mibBuilder.exportSymbols("EMC-MIB", analyzer=analyzer, sysinfoFirstRecordNumber=sysinfoFirstRecordNumber, symstatNum_write_hits=symstatNum_write_hits, devShowVendor_id=devShowVendor_id, sysinfoNumberofVolumes=sysinfoNumberofVolumes, symstatNum_blocks_read=symstatNum_blocks_read, symShowPDevCountTable=symShowPDevCountTable, analyzerFileLastModified=analyzerFileLastModified, ldevstatNum_write_hits=ldevstatNum_write_hits, portstatNum_sym_timeslices=portstatNum_sym_timeslices, dgstatNum_delayed_dfw=dgstatNum_delayed_dfw, dadcnfigMirror2Director=dadcnfigMirror2Director, subagentTraceMessagesEnable=subagentTraceMessagesEnable, pdevstatNum_format_pend_tracks=pdevstatNum_format_pend_tracks, standardSNMPRequestPort=standardSNMPRequestPort, symDevListCount=symDevListCount, symstatNum_sa_cdb_reqs=symstatNum_sa_cdb_reqs, devstatNum_deferred_writes=devstatNum_deferred_writes, devstatNum_blocks_written=devstatNum_blocks_written, symDevNoDgList=symDevNoDgList, emcSymUtil99=emcSymUtil99, devShowSymid=devShowSymid, emcSymMvsDsname=emcSymMvsDsname, devShowBCVInfoEntry=devShowBCVInfoEntry, symstatNum_format_pend_tracks=symstatNum_format_pend_tracks, emcControlCenter=emcControlCenter, symShowMicrocode_version=symShowMicrocode_version, ldevstatNum_write_reqs=ldevstatNum_write_reqs, symShowDirectorCountTable=symShowDirectorCountTable, ldevstatNum_blocks_read=ldevstatNum_blocks_read, dvhoaddrDeviceRecordsTable=dvhoaddrDeviceRecordsTable, pdevstatNum_read_reqs=pdevstatNum_read_reqs, symstatNum_used_permacache_slots=symstatNum_used_permacache_slots, dvhoaddrDeviceRecordsEntry=dvhoaddrDeviceRecordsEntry, symPDevList=symPDevList, symLDevListCountEntry=symLDevListCountEntry, symmEventEntry=symmEventEntry, RDDFTransientState=RDDFTransientState, symstatNum_deferred_writes=symstatNum_deferred_writes, analyzerSpecialDurationLimit=analyzerSpecialDurationLimit, symDgListEntry=symDgListEntry, emulMTPF=emulMTPF, xdrTCPPort=xdrTCPPort, escnChecksum=escnChecksum, symPDeviceName=symPDeviceName, 
symShowSymmetrix_pwron_time=symShowSymmetrix_pwron_time, PortStatus=PortStatus, dirstatDANum_pf_tracks_unused=dirstatDANum_pf_tracks_unused, dadcnfigMirrors=dadcnfigMirrors, dirstatSANum_read_misses=dirstatSANum_read_misses, devShowConfigurationTable=devShowConfigurationTable, emcSymPhysDevStats=emcSymPhysDevStats, devShowBcvdev_status=devShowBcvdev_status, pdevstatDevice_max_wp_limit=pdevstatDevice_max_wp_limit, initFileCount=initFileCount, analyzerFileCreation=analyzerFileCreation, symstatNum_sa_rw_reqs=symstatNum_sa_rw_reqs, devShowPrevent_auto_link_recovery=devShowPrevent_auto_link_recovery, symShowDirector_num=symShowDirector_num, symBcvDevListCount=symBcvDevListCount, dirstatNum_ios=dirstatNum_ios, implVersion=implVersion, portstatTime_stamp=portstatTime_stamp, symShowPDevListEntry=symShowPDevListEntry, devShowLink_config=devShowLink_config, symstatNum_read_hits=symstatNum_read_hits, escnFileCount=escnFileCount, symShowCache_slot_count=symShowCache_slot_count, discState=discState, pdevstatNum_prefetched_tracks=pdevstatNum_prefetched_tracks, symShow=symShow, pdevstatTime_stamp=pdevstatTime_stamp, gatekeeperDeviceName=gatekeeperDeviceName, symstatNum_delayed_dfw=symstatNum_delayed_dfw, symstatNum_wr_pend_tracks=symstatNum_wr_pend_tracks, emulCodeType=emulCodeType, devShowAdaptive_copy_skew=devShowAdaptive_copy_skew, pdevstatNum_rw_reqs=pdevstatNum_rw_reqs, emcSymBCVDevice=emcSymBCVDevice, dadcnfigSymmNumber=dadcnfigSymmNumber, trapSetup=trapSetup, DeviceStatus=DeviceStatus, devShowVbus=devShowVbus, emcSymMirrorDiskCfg=emcSymMirrorDiskCfg, dadcnfigMirror3Interface=dadcnfigMirror3Interface, symShowRa_group_num=symShowRa_group_num, initDate=initDate, discoveryChangeTime=discoveryChangeTime, devShowMset_M2_type=devShowMset_M2_type, dgstatNum_seq_read_reqs=dgstatNum_seq_read_reqs, BCVState=BCVState, systemCodesRecordsEntry=systemCodesRecordsEntry, diskAdapterDeviceConfigurationEntry=diskAdapterDeviceConfigurationEntry, 
dadcnfigMirror1Interface=dadcnfigMirror1Interface, saDirStatEntry=saDirStatEntry, analyzerFilesListTable=analyzerFilesListTable, devShowBCVInfoTable=devShowBCVInfoTable, emcSymMirror3DiskCfg=emcSymMirror3DiskCfg, discSerialNumber=discSerialNumber, symShowSymmetrix_uptime=symShowSymmetrix_uptime, symstatTime_stamp=symstatTime_stamp, analyzerFiles=analyzerFiles, devShowMset_M3_status=devShowMset_M3_status, dadcnfigRecordSize=dadcnfigRecordSize, clients=clients, diskAdapterDeviceConfigurationTable=diskAdapterDeviceConfigurationTable, dgstatNum_write_hits=dgstatNum_write_hits, symShowPDevCountEntry=symShowPDevCountEntry, ldevstatNum_delayed_dfw=ldevstatNum_delayed_dfw, symPDevNoDgListCountEntry=symPDevNoDgListCountEntry, emcSymMirror4DiskCfg=emcSymMirror4DiskCfg, systemCalls=systemCalls, agentRevision=agentRevision, periodicDiscoveryFrequency=periodicDiscoveryFrequency, emcSymMvsVolume=emcSymMvsVolume, devstatNum_seq_read_reqs=devstatNum_seq_read_reqs, saDirStatTable=saDirStatTable, analyzerFilesCountTable=analyzerFilesCountTable, symDevListCountTable=symDevListCountTable, symBcvPDevListCount=symBcvPDevListCount, symShowPort3_status=symShowPort3_status, devShowRDFInfoEntry=devShowRDFInfoEntry, symmEventMaxEvents=symmEventMaxEvents, symDevListEntry=symDevListEntry, devShowDev_block_size=devShowDev_block_size, dgstatNum_format_pend_tracks=dgstatNum_format_pend_tracks, dvhoaddrPortBDeviceAddress=dvhoaddrPortBDeviceAddress, symDevNoDgListCount=symDevNoDgListCount, devstatNum_wp_tracks=devstatNum_wp_tracks, devShowSCSI_negotiation=devShowSCSI_negotiation, devShowAttached_bcv_symdev=devShowAttached_bcv_symdev, symDevGroupName=symDevGroupName, symGateListTable=symGateListTable, dirstatDANum_pf_tracks=dirstatDANum_pf_tracks, devShowNum_r1_invalid_tracks=devShowNum_r1_invalid_tracks, devstatNum_rw_reqs=devstatNum_rw_reqs, symstatNum_rw_reqs=symstatNum_rw_reqs, dvhoaddrPortDDeviceAddress=dvhoaddrPortDDeviceAddress, symShowLast_ipl_time=symShowLast_ipl_time, 
agentConfiguration=agentConfiguration, symShowDb_sync_rdf_time=symShowDb_sync_rdf_time, symstatNum_destaged_tracks=symstatNum_destaged_tracks, symShowPDevListTable=symShowPDevListTable, symShowReserved=symShowReserved, symShowPermacache_slot_count=symShowPermacache_slot_count, analyzerFileCountEntry=analyzerFileCountEntry, symShowSymmetrix_ident=symShowSymmetrix_ident, devShowPdevname=devShowPdevname, systemCodesRecordsTable=systemCodesRecordsTable, DirectorStatus=DirectorStatus, symLDevListTable=symLDevListTable, discoveryFrequency=discoveryFrequency, devShowMset_M4_type=devShowMset_M4_type, devShowDev_rdf_state=devShowDev_rdf_state, symmEventDescr=symmEventDescr, symstatNum_seq_read_reqs=symstatNum_seq_read_reqs, emcSymSaitInfo=emcSymSaitInfo, devShowAdaptive_copy_wp_state=devShowAdaptive_copy_wp_state, UInt32=UInt32, symGateListCountTable=symGateListCountTable, symShowDb_sync_bcv_time=symShowDb_sync_bcv_time, dirstatNum_sym_timeslices=dirstatNum_sym_timeslices, dgstatNum_deferred_writes=dgstatNum_deferred_writes, symPDevListTable=symPDevListTable, discStatus=discStatus, sysinfoSerialNumber=sysinfoSerialNumber, symShowNum_pdevs=symShowNum_pdevs, symStatTable=symStatTable, emulDate=emulDate, symPDevNoDgList=symPDevNoDgList, emcSymUtilA7=emcSymUtilA7, symDevNoDgListEntry=symDevNoDgListEntry, ldevstatNum_seq_read_reqs=ldevstatNum_seq_read_reqs, symDgList=symDgList, symShowPort1_status=symShowPort1_status, analyzerFilesListEntry=analyzerFilesListEntry, devShowDirector_num=devShowDirector_num, symDevNoDgListTable=symDevNoDgListTable, devShowProduct_rev=devShowProduct_rev, devStatEntry=devStatEntry, implFileCount=implFileCount, emcSymTimefinderInfo=emcSymTimefinderInfo, devstatNum_destaged_tracks=devstatNum_destaged_tracks, mainframeDataSetInformation=mainframeDataSetInformation, symShowMicrocode_version_num=symShowMicrocode_version_num, symShowDirector_ident=symShowDirector_ident, devShowSCSI_method=devShowSCSI_method, 
pdevstatNum_deferred_writes=pdevstatNum_deferred_writes, analyzerFileName=analyzerFileName, symBcvDevListCountEntry=symBcvDevListCountEntry, discSymapisrv_IP=discSymapisrv_IP, ldevstatNum_deferred_writes=ldevstatNum_deferred_writes, emcSymSumStatus=emcSymSumStatus, devShowMset_M1_type=devShowMset_M1_type, emcSymDevStats=emcSymDevStats, subagentProcessActive=subagentProcessActive, symListCount=symListCount, pdevstatNum_blocks_read=pdevstatNum_blocks_read, symShowDirectorCountEntry=symShowDirectorCountEntry, devShowDev_sym_devname=devShowDev_sym_devname, symShowMicrocode_patch_date=symShowMicrocode_patch_date, devShowLdevname=devShowLdevname, devShowMset_M1_status=devShowMset_M1_status, RDFPairState=RDFPairState, discBCV=discBCV, symShowMax_wr_pend_slots=symShowMax_wr_pend_slots, dirstatNum_rw_hits=dirstatNum_rw_hits, devShowDevice_serial_id=devShowDevice_serial_id, emcDeviceStatusTrap=emcDeviceStatusTrap, devShowLink_domino=devShowLink_domino, dgstatNum_prefetched_tracks=dgstatNum_prefetched_tracks, devShowDev_rdf_type=devShowDev_rdf_type, DeviceType=DeviceType, clientListMaintenanceFrequency=clientListMaintenanceFrequency, devstatDevice_max_wp_limit=devstatDevice_max_wp_limit, analyzerFileCount=analyzerFileCount, symmEventTime=symmEventTime, mainframeVariables=mainframeVariables, symPDevListEntry=symPDevListEntry, symLDevListEntry=symLDevListEntry, initChecksum=initChecksum, dvhoaddrBuffer=dvhoaddrBuffer, symShowPort0_status=symShowPort0_status, analyzerTopFileSavePolicy=analyzerTopFileSavePolicy, emcSymStatistics=emcSymStatistics, devShowMset_M4_status=devShowMset_M4_status, emcSymMvsLUNNumber=emcSymMvsLUNNumber, raDirStatEntry=raDirStatEntry, symShowCache_size=symShowCache_size, dgstatNum_destaged_tracks=dgstatNum_destaged_tracks, devstatNum_delayed_dfw=devstatNum_delayed_dfw, symShowNum_powerpath_devs=symShowNum_powerpath_devs, symShowDirectorCount=symShowDirectorCount, dirstatSANum_slot_collisions=dirstatSANum_slot_collisions, sysinfoBuffer=sysinfoBuffer, 
syscodesNumberofRecords=syscodesNumberofRecords, symAPI=symAPI, subagentSymmetrixSerialNumber=subagentSymmetrixSerialNumber, symRemoteListCount=symRemoteListCount, syscodesFirstRecordNumber=syscodesFirstRecordNumber, devShowDirector_ident=devShowDirector_ident, systemCodesEntry=systemCodesEntry, dvhoaddrNumberofRecords=dvhoaddrNumberofRecords, symShowNum_disks=symShowNum_disks, devShowRdf_domino=devShowRdf_domino, symShowScsi_capability=symShowScsi_capability, dvhoaddrPortADeviceAddress=dvhoaddrPortADeviceAddress, emcSymPortStats=emcSymPortStats, dadcnfigMirror1Director=dadcnfigMirror1Director, emcSymmetrix=emcSymmetrix, symmEventSeverity=symmEventSeverity, devShowRemote_symid=devShowRemote_symid, implMTPF=implMTPF, devShowRemote_dev_rdf_state=devShowRemote_dev_rdf_state)
mibBuilder.exportSymbols("EMC-MIB", symLDevListCountTable=symLDevListCountTable, devShowMset_M2_invalid_tracks=devShowMset_M2_invalid_tracks, devShowSuspend_state=devShowSuspend_state, informational=informational, pDevStatTable=pDevStatTable, symGateListCountEntry=symGateListCountEntry, symGateListCount=symGateListCount, symListEntry=symListEntry, symstatDevice_max_wp_limit=symstatDevice_max_wp_limit, symBcvPDevListTable=symBcvPDevListTable, initMTPF=initMTPF, symShowAPI_version=symShowAPI_version, discoveryTable=discoveryTable, symShowConfig_checksum=symShowConfig_checksum, ldevstatNum_prefetched_tracks=ldevstatNum_prefetched_tracks, symShowPrevent_auto_link_recovery=symShowPrevent_auto_link_recovery, symDgListTable=symDgListTable, symPDevListCountTable=symPDevListCountTable, devShowBcvdev_sym_devname=devShowBcvdev_sym_devname, ldevstatNum_sym_timeslices=ldevstatNum_sym_timeslices, systemInfoHeaderEntry=systemInfoHeaderEntry, devShowLun=devShowLun, symShowMax_da_wr_pend_slots=symShowMax_da_wr_pend_slots, discoveryTbl=discoveryTbl, devShowRdf_mode=devShowRdf_mode, esmVariablePacketSize=esmVariablePacketSize, dgstatNum_rw_hits=dgstatNum_rw_hits, sysinfoMemorySize=sysinfoMemorySize, symList=symList, dirstatTime_stamp=dirstatTime_stamp, symmEventIndex=symmEventIndex, symStatEntry=symStatEntry, discCapacity=discCapacity, systemInformation=systemInformation, discRDF=discRDF, dvhoaddrFirstRecordNumber=dvhoaddrFirstRecordNumber, devShowDev_cylinders=devShowDev_cylinders, symPDevListCountEntry=symPDevListCountEntry, symShowNum_da_volumes=symShowNum_da_volumes, ldevstatNum_blocks_written=ldevstatNum_blocks_written, dvhoaddrRecordSize=dvhoaddrRecordSize, discModel=discModel, initVersion=initVersion, deviceHostAddressConfigurationEntry=deviceHostAddressConfigurationEntry, dirstatRANum_read_misses=dirstatRANum_read_misses, devstatTime_stamp=devstatTime_stamp, dirstatRANum_slot_collisions=dirstatRANum_slot_collisions, symGateListEntry=symGateListEntry, 
pdevstatNum_sym_timeslices=pdevstatNum_sym_timeslices, symShowDirectorConfigurationTable=symShowDirectorConfigurationTable, symShowDirector_status=symShowDirector_status, dadcnfigDeviceRecordsTable=dadcnfigDeviceRecordsTable, dvhoaddrPortAType=dvhoaddrPortAType, symBcvPDevList=symBcvPDevList, symBcvPDeviceName=symBcvPDeviceName, emcSymRdfMaint=emcSymRdfMaint, symLDevListCount=symLDevListCount, symRemoteListTable=symRemoteListTable, symAPIShow=symAPIShow, StateValues=StateValues, bcvDeviceName=bcvDeviceName, implDate=implDate, dgstatNum_sym_timeslices=dgstatNum_sym_timeslices, symLDevList=symLDevList, escnDate=escnDate, emcRatiosOutofRangeTrap=emcRatiosOutofRangeTrap, symShowConfiguration=symShowConfiguration, dgstatNum_blocks_read=dgstatNum_blocks_read, devShowRDFInfoTable=devShowRDFInfoTable, devShowDirector_port_num=devShowDirector_port_num, discoveryTrapPort=discoveryTrapPort, devShowMset_M4_invalid_tracks=devShowMset_M4_invalid_tracks, symDevList=symDevList, devstatNum_format_pend_tracks=devstatNum_format_pend_tracks, emc=emc, sysinfoNumberofRecords=sysinfoNumberofRecords, devShowSym_devname=devShowSym_devname, sysinfoRecordsEntry=sysinfoRecordsEntry, devShowPrevent_ra_online_upon_pwron=devShowPrevent_ra_online_upon_pwron, discoveryTableSize=discoveryTableSize, ldevstatNum_read_hits=ldevstatNum_read_hits, dirPortStatistics=dirPortStatistics, systemCodesTable=systemCodesTable, devShowDev_link_status=devShowDev_link_status, escnVersion=escnVersion, esmVariables=esmVariables, dirstatDANum_pf_long_misses=dirstatDANum_pf_long_misses, symGateList=symGateList, DeviceEmulation=DeviceEmulation, discovery=discovery, emcSymMvsBuildStatus=emcSymMvsBuildStatus, devShowDev_serial_id=devShowDev_serial_id, symBcvPDevListCountEntry=symBcvPDevListCountEntry, dadcnfigNumberofRecords=dadcnfigNumberofRecords, dirstatNum_write_reqs=dirstatNum_write_reqs, pdevstatNum_rw_hits=pdevstatNum_rw_hits, escnMTPF=escnMTPF, symShowMicrocode_patch_level=symShowMicrocode_patch_level, 
devShowDev_ra_status=devShowDev_ra_status, emulChecksum=emulChecksum, devShowConfigurationEntry=devShowConfigurationEntry, pdevstatNum_destaged_tracks=pdevstatNum_destaged_tracks, discChecksum=discChecksum, emcSymDir=emcSymDir, dvhoaddrPortCDeviceAddress=dvhoaddrPortCDeviceAddress, devShowMset_M2_status=devShowMset_M2_status, devstatNum_prefetched_tracks=devstatNum_prefetched_tracks, dgStatTable=dgStatTable, symmEventTable=symmEventTable, deviceHostAddressConfigurationTable=deviceHostAddressConfigurationTable, escnCodeType=escnCodeType, dvhoaddrPortCType=dvhoaddrPortCType, dirstatNum_permacache_reqs=dirstatNum_permacache_reqs, daDirStatTable=daDirStatTable, ldevstatNum_rw_reqs=ldevstatNum_rw_reqs, devShowMset_M3_invalid_tracks=devShowMset_M3_invalid_tracks, discEventCurrID=discEventCurrID, devShowBcvdev_dgname=devShowBcvdev_dgname, emcSymWinConfig=emcSymWinConfig, emcRatiosOutofRange=emcRatiosOutofRange, symstatNum_blocks_written=symstatNum_blocks_written, ldevstatNum_read_reqs=ldevstatNum_read_reqs, portstatNum_blocks_read_and_written=portstatNum_blocks_read_and_written, symShowLast_fast_ipl_time=symShowLast_fast_ipl_time, symstatNum_prefetched_tracks=symstatNum_prefetched_tracks, dirstatDANum_pf_tracks_used=dirstatDANum_pf_tracks_used, subagentInformation=subagentInformation, pdevstatNum_blocks_written=pdevstatNum_blocks_written, directorStatEntry=directorStatEntry, symDevNoDgListCountTable=symDevNoDgListCountTable, RDFAdaptiveCopy=RDFAdaptiveCopy, symPDevNoDgListEntry=symPDevNoDgListEntry, symShowPort2_status=symShowPort2_status, symShowSlot_num=symShowSlot_num, symPDevNoDgListCount=symPDevNoDgListCount, dirstatNum_rw_reqs=dirstatNum_rw_reqs, dvhoaddrMaxRecords=dvhoaddrMaxRecords, dvhoaddrDirectorNumber=dvhoaddrDirectorNumber, initCodeType=initCodeType, syscodesDirectorNum=syscodesDirectorNum, devShowConsistency_state=devShowConsistency_state, symShowMicrocode_date=symShowMicrocode_date, symShowNum_symdevs=symShowNum_symdevs, 
symstatNum_free_permacache_slots=symstatNum_free_permacache_slots, dgstatTime_stamp=dgstatTime_stamp, devShowTid=devShowTid, devShowDev_sa_status=devShowDev_sa_status, symRemoteList=symRemoteList, dvhoaddrPortBType=dvhoaddrPortBType, discConfigDate=discConfigDate, pdevstatNum_write_hits=pdevstatNum_write_hits, symDevNoDgDeviceName=symDevNoDgDeviceName, dirStatPortCountEntry=dirStatPortCountEntry, dgstatNum_read_reqs=dgstatNum_read_reqs, checksumTestFrequency=checksumTestFrequency, symRemoteListEntry=symRemoteListEntry, symPDevNoDgListCountTable=symPDevNoDgListCountTable, mfDataSetInformation=mfDataSetInformation, symDevNoDgListCountEntry=symDevNoDgListCountEntry, symShowSymid=symShowSymid, sysinfoRecordsTable=sysinfoRecordsTable, dgstatNum_write_reqs=dgstatNum_write_reqs, symDeviceName=symDeviceName, dirPortStatPortCount=dirPortStatPortCount, symstatNum_sa_read_reqs=symstatNum_sa_read_reqs, symShowNum_ports=symShowNum_ports, symShowPDevCount=symShowPDevCount, symShowDirectorConfigurationEntry=symShowDirectorConfigurationEntry, analyzerFileSize=analyzerFileSize, dgstatNum_rw_reqs=dgstatNum_rw_reqs, devShowNum_r2_invalid_tracks=devShowNum_r2_invalid_tracks, trapTestFrequency=trapTestFrequency, mainframeDiskInformation=mainframeDiskInformation, dirstatRANum_device_wp_disconnects=dirstatRANum_device_wp_disconnects, symShowDirector_type=symShowDirector_type, dirstatDANum_pf_short_misses=dirstatDANum_pf_short_misses, discoveryTableChange=discoveryTableChange, discRawDevice=discRawDevice, dadcnfigDeviceRecordsEntry=dadcnfigDeviceRecordsEntry, symAPIList=symAPIList, devShowBcvdev_serial_id=devShowBcvdev_serial_id, devstatNum_write_hits=devstatNum_write_hits, control=control, symShowRemote_ra_group_num=symShowRemote_ra_group_num, agentType=agentType, dgstatNum_blocks_written=dgstatNum_blocks_written, symmEvent=symmEvent, agentAdministration=agentAdministration, dirstatRANum_system_wp_disconnects=dirstatRANum_system_wp_disconnects, symDevListCountEntry=symDevListCountEntry, 
sysinfoRecordSize=sysinfoRecordSize, pdevstatNum_read_hits=pdevstatNum_read_hits, emcSymSRDFInfo=emcSymSRDFInfo, syscodesDirectorType=syscodesDirectorType, emcSymCnfg=emcSymCnfg, symShowPDeviceName=symShowPDeviceName, devShowRemote_sym_devname=devShowRemote_sym_devname, devShowDev_capacity=devShowDev_capacity, ldevstatNum_wp_tracks=ldevstatNum_wp_tracks, remoteSerialNumber=remoteSerialNumber, devstatNum_rw_hits=devstatNum_rw_hits, dvhoaddrSymmNumber=dvhoaddrSymmNumber, devShowEmulation=devShowEmulation, DirectorType=DirectorType, daDirStatEntry=daDirStatEntry, dadcnfigMirror4Interface=dadcnfigMirror4Interface, dvhoaddrMetaFlags=dvhoaddrMetaFlags, symPDevNoDgDeviceName=symPDevNoDgDeviceName, RDFMode=RDFMode, SCSIWidth=SCSIWidth, devShowDirector_slot_num=devShowDirector_slot_num, pdevstatNum_wp_tracks=pdevstatNum_wp_tracks, mibRevision=mibRevision, syscodesRecordSize=syscodesRecordSize, ldevstatNum_destaged_tracks=ldevstatNum_destaged_tracks, lDevStatTable=lDevStatTable, emcSymDiskCfg=emcSymDiskCfg, symShowRemote_symid=symShowRemote_symid, devShowDgname=devShowDgname, symstatNum_read_reqs=symstatNum_read_reqs, devShowDirector_port_status=devShowDirector_port_status, symDgListCount=symDgListCount, systemCodes=systemCodes, symShowDb_sync_time=symShowDb_sync_time, emcSymSumStatusErrorCodes=emcSymSumStatusErrorCodes, devStatTable=devStatTable, devstatNum_blocks_read=devstatNum_blocks_read, ldevstatDevice_max_wp_limit=ldevstatDevice_max_wp_limit, dirPortStatEntry=dirPortStatEntry, emcSymmetrixStatusTrap=emcSymmetrixStatusTrap, symBcvDevListCountTable=symBcvDevListCountTable, dadcnfigMirror4Director=dadcnfigMirror4Director, dadcnfigFirstRecordNumber=dadcnfigFirstRecordNumber, systemInfoHeaderTable=systemInfoHeaderTable, symListTable=symListTable, syscodesBuffer=syscodesBuffer, symBcvDevListEntry=symBcvDevListEntry, dirPortStatTable=dirPortStatTable, symAPIStatistics=symAPIStatistics, discMicrocodeVersion=discMicrocodeVersion, 
symShowPrevent_ra_online_upon_pwron=symShowPrevent_ra_online_upon_pwron, dvhoaddrFiberChannelAddress=dvhoaddrFiberChannelAddress, symstatNum_sa_write_reqs=symstatNum_sa_write_reqs, pDevStatEntry=pDevStatEntry, symstatNum_rw_hits=symstatNum_rw_hits, dadcnfigMirror2Interface=dadcnfigMirror2Interface, statusCheckFrequency=statusCheckFrequency, dirstatDANum_pf_mismatches=dirstatDANum_pf_mismatches, emulFileCount=emulFileCount, devstatNum_write_reqs=devstatNum_write_reqs, esmSNMPRequestPort=esmSNMPRequestPort, dvhoaddrPortDType=dvhoaddrPortDType, raDirStatTable=raDirStatTable, RDFType=RDFType)
mibBuilder.exportSymbols("EMC-MIB", symBcvDevList=symBcvDevList, analyzerFileRuntime=analyzerFileRuntime, directorStatTable=directorStatTable, dadcnfigMaxRecords=dadcnfigMaxRecords, subagentInfo=subagentInfo, ldevstatNum_format_pend_tracks=ldevstatNum_format_pend_tracks, dgStatEntry=dgStatEntry, dadcnfigBuffer=dadcnfigBuffer, sysinfoMaxRecords=sysinfoMaxRecords, symBcvPDevListCountTable=symBcvPDevListCountTable, devShowProduct_id=devShowProduct_id, pdevstatNum_delayed_dfw=pdevstatNum_delayed_dfw, devShowBcv_pair_state=devShowBcv_pair_state, diskAdapterDeviceConfiguration=diskAdapterDeviceConfiguration, RDFLinkConfig=RDFLinkConfig, pdevstatNum_seq_read_reqs=pdevstatNum_seq_read_reqs, devShowRdf_status=devShowRdf_status, dirstatSANum_system_wp_disconnects=dirstatSANum_system_wp_disconnects, clientListClientExpiration=clientListClientExpiration, devShowMset_M3_type=devShowMset_M3_type, dirstatSANum_device_wp_disconnects=dirstatSANum_device_wp_disconnects, pdevstatNum_write_reqs=pdevstatNum_write_reqs, devShowAdaptive_copy=devShowAdaptive_copy, symDevShow=symDevShow, emulVersion=emulVersion, discIndex=discIndex, ldevstatNum_rw_hits=ldevstatNum_rw_hits, lDeviceName=lDeviceName, serialNumber=serialNumber, implCodeType=implCodeType, symDevListTable=symDevListTable, symShowSDDF_configuration=symShowSDDF_configuration, symPDevListCount=symPDevListCount, dgstatNum_read_hits=dgstatNum_read_hits, devShowMset_M1_invalid_tracks=devShowMset_M1_invalid_tracks, syscodesMaxRecords=syscodesMaxRecords, celerraTCPPort=celerraTCPPort, devstatNum_read_hits=devstatNum_read_hits, subagentConfiguration=subagentConfiguration, devShowDev_parameters=devShowDev_parameters, symstatNum_write_reqs=symstatNum_write_reqs, deviceHostAddressConfiguration=deviceHostAddressConfiguration, activePorts=activePorts, symShowSymmetrix_model=symShowSymmetrix_model, portstatNum_rw_reqs=portstatNum_rw_reqs, dirstatNum_read_reqs=dirstatNum_read_reqs, emcSymmetrixEventTrap=emcSymmetrixEventTrap, 
clientListRequestExpiration=clientListRequestExpiration, implChecksum=implChecksum, dirStatPortCountTable=dirStatPortCountTable, devShowDev_config=devShowDev_config, symPDevNoDgListTable=symPDevNoDgListTable, analyzerFileIsActive=analyzerFileIsActive, lDevStatEntry=lDevStatEntry, dgstatdevice_max_wp_limit=dgstatdevice_max_wp_limit, discNumEvents=discNumEvents, masterTraceMessagesEnable=masterTraceMessagesEnable, symBcvPDevListEntry=symBcvPDevListEntry, devShowDev_status=devShowDev_status, devstatNum_sym_timeslices=devstatNum_sym_timeslices, ldevstatTime_stamp=ldevstatTime_stamp, symBcvDevListTable=symBcvDevListTable, SCSIMethod=SCSIMethod, dirstatDANum_pf_restarts=dirstatDANum_pf_restarts, mfDiskInformation=mfDiskInformation, devShowRa_group_number=devShowRa_group_number, dgstatNum_wp_tracks=dgstatNum_wp_tracks, devShowNum_bcvdev_invalid_tracks=devShowNum_bcvdev_invalid_tracks, symShowMax_dev_wr_pend_slots=symShowMax_dev_wr_pend_slots, devShowDev_dgname=devShowDev_dgname, sysinfoNumberofDirectors=sysinfoNumberofDirectors, symShowEntry=symShowEntry, devShowNum_dev_invalid_tracks=devShowNum_dev_invalid_tracks, devShowRdf_pair_state=devShowRdf_pair_state, symstatNum_sa_rw_hits=symstatNum_sa_rw_hits, dadcnfigMirror3Director=dadcnfigMirror3Director, devstatNum_read_reqs=devstatNum_read_reqs)
#
# Copyright (c) 2013-2020 Contributors to the Eclipse Foundation
#
# See the NOTICE file distributed with this work for additional information regarding copyright
# ownership. All rights reserved. This program and the accompanying materials are made available
# under the terms of the Apache License, Version 2.0 which accompanies this distribution and is
# available at http://www.apache.org/licenses/LICENSE-2.0.txt
# ===============================================================================================
from pygw.base import GeoWaveObject
from pygw.base.type_conversions import StringArrayType
from pygw.config import geowave_pkg
from pygw.query.statistics.statistic_query import StatisticQuery
from pygw.statistics.bin_constraints import BinConstraints
from pygw.statistics.statistic_type import StatisticType, IndexStatisticType, DataTypeStatisticType
class StatisticQueryBuilder(GeoWaveObject):
"""
    A builder for creating statistics queries. This class should not be constructed directly; instead, use one
    of the static methods to create an appropriate builder.
"""
def __init__(self, java_ref, result_transformer):
self._result_transformer = result_transformer
super().__init__(java_ref)
def tag(self, tag):
"""
Sets the tag to query for.
Args:
tag (str): The tag to query for.
Returns:
This statistic query builder.
"""
self._java_ref.tag(tag)
return self
def internal(self):
"""
When set, only internal statistics will be queried.
Returns:
This statistic query builder.
"""
self._java_ref.internal()
return self
def add_authorization(self, authorization):
"""
Adds an authorization to the query.
Args:
authorization (str): The authorization to add.
Returns:
This statistic query builder.
"""
self._java_ref.addAuthorization(authorization)
return self
def authorizations(self, authorizations):
"""
        Sets the authorizations to use for the query.
Args:
authorizations (array of str): The authorizations to use for the query.
Returns:
This statistic query builder.
"""
self._java_ref.authorizations(StringArrayType().to_java(authorizations))
return self
def bin_constraints(self, bin_constraints):
"""
Sets the constraints to use for the statistic query. Only bins that match the given constraints will be
returned.
Args:
bin_constraints (BinConstraints): The constraints to constrain the query by.
Returns:
This statistic query builder.
"""
if not isinstance(bin_constraints, BinConstraints):
raise AttributeError('Must be a BinConstraints instance.')
self._java_ref.binConstraints(bin_constraints.java_ref())
return self
def build(self):
"""
Build the statistic query.
Returns:
This constructed statistic query.
"""
return StatisticQuery(self._java_ref.build(), self._result_transformer)
@staticmethod
def new_builder(statistic_type):
"""
Create a statistic query builder for the given statistic type.
Args:
statistic_type (StatisticType): The statistic type for the query builder.
Returns:
A statistic query builder.
"""
if not isinstance(statistic_type, StatisticType):
raise AttributeError('Must be a StatisticType instance.')
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.newBuilder(statistic_type.java_ref())
if isinstance(statistic_type, IndexStatisticType):
return IndexStatisticQueryBuilder(statistic_type, j_builder)
if isinstance(statistic_type, DataTypeStatisticType):
return DataTypeStatisticQueryBuilder(statistic_type, j_builder)
return FieldStatisticQueryBuilder(statistic_type, j_builder)
@staticmethod
def differing_visibility_count():
"""
Create a statistic query builder for a differing visibility count statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.differingVisibilityCount()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def duplicate_entry_count():
"""
Create a statistic query builder for a duplicate entry count statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.duplicateEntryCount()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def field_visibility_count():
"""
Create a statistic query builder for a field visibility count statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.fieldVisibilityCount()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def index_meta_data_set():
"""
Create a statistic query builder for an index meta data set statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.indexMetaDataSet()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def max_duplicates():
"""
Create a statistic query builder for a max duplicates statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.maxDuplicates()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def partitions():
"""
Create a statistic query builder for a partitions statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.partitions()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def row_range_histogram():
"""
Create a statistic query builder for a row range histogram statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.rowRangeHistogram()
return IndexStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def count():
"""
Create a statistic query builder for a count statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.count()
return DataTypeStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def bbox():
"""
Create a statistic query builder for a bounding box statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.geotime.store.statistics.SpatialTemporalStatisticQueryBuilder.bbox()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def bloom_filter():
"""
Create a statistic query builder for a bloom filter statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.bloomFilter()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def time_range():
"""
Create a statistic query builder for a time range statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.geotime.store.statistics.SpatialTemporalStatisticQueryBuilder.timeRange()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def count_min_sketch():
"""
Create a statistic query builder for a count min sketch statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.countMinSketch()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def fixed_bin_numeric_histogram():
"""
Create a statistic query builder for a fixed bin numeric histogram statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.fixedBinNumericHistogram()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def hyper_log_log():
"""
Create a statistic query builder for a hyper log log statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.hyperLogLog()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def numeric_histogram():
"""
Create a statistic query builder for a numeric histogram statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.numericHistogram()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def numeric_mean():
"""
Create a statistic query builder for a numeric mean statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.numericMean()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def numeric_range():
"""
Create a statistic query builder for a numeric range statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.numericRange()
return FieldStatisticQueryBuilder(java_ref=j_builder)
@staticmethod
def numeric_stats():
"""
Create a statistic query builder for a numeric stats statistic.
Returns:
A statistic query builder.
"""
j_builder = geowave_pkg.core.store.api.StatisticQueryBuilder.numericStats()
return FieldStatisticQueryBuilder(java_ref=j_builder)
class IndexStatisticQueryBuilder(StatisticQueryBuilder):
"""
A builder for index statistic queries.
"""
def __init__(self, statistic_type=None, java_ref=None):
if java_ref is None:
j_qbuilder = geowave_pkg.core.statistics.query.IndexStatisticQueryBuilder(statistic_type.java_ref())
else:
j_qbuilder = java_ref
super().__init__(j_qbuilder, None)
def index_name(self, index_name):
"""
Set the index name to constrain the query by.
Args:
index_name (str): The index name to query.
Returns:
This statistic query builder.
"""
self._java_ref.indexName(index_name)
return self
class DataTypeStatisticQueryBuilder(StatisticQueryBuilder):
"""
A builder for data type statistic queries.
"""
def __init__(self, statistic_type=None, java_ref=None):
if java_ref is None:
j_qbuilder = geowave_pkg.core.statistics.query.DataTypeStatisticQueryBuilder(statistic_type.java_ref())
else:
j_qbuilder = java_ref
super().__init__(j_qbuilder, None)
def type_name(self, type_name):
"""
Set the type name to constrain the query by.
Args:
type_name (str): The type name to query.
Returns:
This statistic query builder.
"""
self._java_ref.typeName(type_name)
return self
class FieldStatisticQueryBuilder(StatisticQueryBuilder):
"""
A builder for field statistic queries.
"""
def __init__(self, statistic_type=None, java_ref=None):
if java_ref is None:
j_qbuilder = geowave_pkg.core.statistics.query.FieldStatisticQueryBuilder(statistic_type.java_ref())
else:
j_qbuilder = java_ref
super().__init__(j_qbuilder, None)
def type_name(self, type_name):
"""
Set the type name to constrain the query by.
Args:
type_name (str): The type name to query.
Returns:
This statistic query builder.
"""
self._java_ref.typeName(type_name)
return self
def field_name(self, field_name):
"""
Set the field name to constrain the query by.
Args:
field_name (str): The field name to query.
Returns:
This statistic query builder.
"""
self._java_ref.fieldName(field_name)
return self
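Every setter above mutates the wrapped Java builder and then returns `self`, which is what makes fluent chaining (`builder.tag(...).bin_constraints(...).build()`) work. A minimal self-contained sketch of that pattern follows; the `QueryBuilder` class and its parameter names are hypothetical and carry no GeoWave/JVM dependency:

```python
# Minimal fluent-builder sketch mirroring the chaining style of
# StatisticQueryBuilder (standalone illustration; no GeoWave required).
class QueryBuilder:
    def __init__(self):
        self._params = {}

    def tag(self, tag):
        self._params["tag"] = tag
        return self  # returning self is what enables method chaining

    def add_authorization(self, auth):
        self._params.setdefault("authorizations", []).append(auth)
        return self

    def build(self):
        # Snapshot the accumulated parameters into the "query" result.
        return dict(self._params)

query = QueryBuilder().tag("internal").add_authorization("root").build()
print(query)  # -> {'tag': 'internal', 'authorizations': ['root']}
```

Because each setter returns the builder itself, calls can be composed in any order before the final `build()`.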
| 32.248756 | 116 | 0.653193 | 1,355 | 12,964 | 6.078229 | 0.141697 | 0.086693 | 0.11984 | 0.101506 | 0.612797 | 0.597742 | 0.565444 | 0.538368 | 0.388417 | 0.326858 | 0 | 0.001269 | 0.270827 | 12,964 | 401 | 117 | 32.329177 | 0.869988 | 0.333539 | 0 | 0.444444 | 0 | 0 | 0.009197 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.229167 | false | 0 | 0.041667 | 0 | 0.513889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df4568fa3ce1d1a711aae12215e7303c88faa3d0 | 1,024 | py | Python | 213. House Robber II.py | minyoungan/coding-interview | 56cf90d8e65690c0cb49512b6d06a4de3f129a41 | [
"MIT"
] | 1 | 2021-11-24T01:30:38.000Z | 2021-11-24T01:30:38.000Z | 213. House Robber II.py | minyoungan/Coding-Interview | 56cf90d8e65690c0cb49512b6d06a4de3f129a41 | [
"MIT"
] | null | null | null | 213. House Robber II.py | minyoungan/Coding-Interview | 56cf90d8e65690c0cb49512b6d06a4de3f129a41 | [
"MIT"
] | null | null | null | from typing import List


class Solution:
def rob(self, nums: List[int]) -> int:
def recur(arr):
memo = {}
def dfs(i):
if i in memo:
return memo[i]
if i >= len(arr):
return 0
cur = max(arr[i] + dfs(i + 2), dfs(i + 1))
memo[i] = cur
return cur
return dfs(0)
return max(nums[0], recur(nums[0:-1]), recur(nums[1:]))
class Solution:
def rob(self, nums: List[int]) -> int:
skip_first = self._helper(nums[1:])
        skip_last = self._helper(nums[:-1])
        return max(nums[0], skip_first, skip_last)
def _helper(self, nums: List[int]) -> int:
one = 0
two = 0
for n in nums:
tmp = two
two = max(one + n, two)
one = tmp
return two
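Both solutions reduce the circular arrangement to two linear house-robber passes: one that excludes the last house and one that excludes the first. A standalone sketch of that reduction (the function names here are hypothetical, outside the LeetCode `Solution` scaffolding):

```python
from typing import List

def rob_circular(nums: List[int]) -> int:
    # Classic linear house-robber DP in O(1) space.
    def rob_line(houses: List[int]) -> int:
        one = two = 0
        for n in houses:
            # one/two track the best totals two and one houses back.
            one, two = two, max(one + n, two)
        return two

    # The houses form a circle, so the first and last cannot both be
    # robbed: take the better of "skip last" and "skip first".
    return max(nums[0], rob_line(nums[:-1]), rob_line(nums[1:]))

print(rob_circular([2, 3, 2]))    # -> 3
print(rob_circular([1, 2, 3, 1])) # -> 4
```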
| 23.272727 | 63 | 0.367188 | 113 | 1,024 | 3.265487 | 0.283186 | 0.065041 | 0.097561 | 0.121951 | 0.249322 | 0.200542 | 0.200542 | 0.200542 | 0.200542 | 0 | 0 | 0.026639 | 0.523438 | 1,024 | 43 | 64 | 23.813953 | 0.729508 | 0 | 0 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.185185 | false | 0 | 0 | 0 | 0.518519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df47f5365cf4dd879f12f60b05bf5870f8c06119 | 7,103 | py | Python | runesanalyzer/data.py | clemsciences/runes-analyzer | 0af8baaf604179c31dcbf28b1a023ca650a9ff34 | [
"MIT"
] | 5 | 2018-10-17T15:35:51.000Z | 2022-01-23T10:57:55.000Z | runesanalyzer/data.py | clemsciences/runes-analyzer | 0af8baaf604179c31dcbf28b1a023ca650a9ff34 | [
"MIT"
] | 5 | 2018-06-25T16:06:00.000Z | 2018-09-09T13:50:16.000Z | runesanalyzer/data.py | clemsciences/runes-analyzer | 0af8baaf604179c31dcbf28b1a023ca650a9ff34 | [
"MIT"
] | null | null | null | """
Sources:
- Viking Language 1 by Jessie L. Byock 2013
Unicode: 16A0–16FF
Swedish runic inscriptions
http://runes.verbix.com/index.html
http://runes.verbix.com/vikingage/Uppland.html
"""
import json
from runesanalyzer.runes import RunicAlphabetName, Rune, Transcriber
__author__ = ["Clément Besnier <clemsciences@aol.com>", ]
# ᚠ ᚢ ᚦ ᚨ ᚲ ᚱ ᚷ ᚹ ᚼ ᚾ ᛁ ᛃ ᛇ ᛈ ᛉ ᛊ ᛏ ᛒ ᛖ ᛗ ᛚ ᛜ ᛟ ᛞ
ELDER_FUTHARK = [
Rune(RunicAlphabetName.elder_futhark, "\u16A0", "f", "f", "fehu"),
Rune(RunicAlphabetName.elder_futhark, "\u16A2", "u", "u", "uruz"),
Rune(RunicAlphabetName.elder_futhark, "\u16A6", "θ", "þ", "þuriaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16A8", "a", "a", "ansuz"),
Rune(RunicAlphabetName.elder_futhark, "\u16B1", "r", "r", "raido"),
Rune(RunicAlphabetName.elder_futhark, "\u16B2", "k", "k", "kaunan"),
Rune(RunicAlphabetName.elder_futhark, "\u16B7", "g", "g", "gyfu"),
Rune(RunicAlphabetName.elder_futhark, "\u16B9", "w", "w", "wynn"),
Rune(RunicAlphabetName.elder_futhark, "\u16BA", "h", "h", "haglaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16BE", "n", "n", "naudiz"),
Rune(RunicAlphabetName.elder_futhark, "\u16C1", "i", "i", "isaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16C3", "j", "j", "jeran"),
Rune(RunicAlphabetName.elder_futhark, "\u16C7", "æ", "E", "eiwaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16C8", "p", "p", "peorð"),
Rune(RunicAlphabetName.elder_futhark, "\u16C9", "ʀ", "R", "algiz"),
Rune(RunicAlphabetName.elder_futhark, "\u16CA", "s", "s", "sowilo"),
Rune(RunicAlphabetName.elder_futhark, "\u16CF", "t", "t", "tiwaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16D2", "b", "b", "berkanan"),
Rune(RunicAlphabetName.elder_futhark, "\u16D6", "e", "e", "ehwaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16D7", "m", "m", "mannaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16DA", "l", "l", "laguz"),
Rune(RunicAlphabetName.elder_futhark, "\u16DC", "ŋ", "ng", "ingwaz"),
Rune(RunicAlphabetName.elder_futhark, "\u16DF", "ø", "œ", "odal"),
Rune(RunicAlphabetName.elder_futhark, "\u16DE", "d", "d", "dagaz"),
]
# ᚠ ᚢ ᚦ ᚭ ᚱ ᚴ ᚼ ᚾ ᛁ ᛅ ᛋ ᛏ ᛒ ᛖ ᛘ ᛚ ᛦ
YOUNGER_FUTHARK = [
Rune(RunicAlphabetName.younger_futhark, "\u16A0", "f", "f", "fehu"),
Rune(RunicAlphabetName.younger_futhark, "\u16A2", "u", "u", "uruz"),
Rune(RunicAlphabetName.younger_futhark, "\u16A6", "θ", "þ", "þuriaz"),
Rune(RunicAlphabetName.younger_futhark, "\u16AD", "a", "a", "ansuz"),
Rune(RunicAlphabetName.younger_futhark, "\u16B1", "r", "r", "raido"),
Rune(RunicAlphabetName.younger_futhark, "\u16B4", "k", "k", "kaunan"),
Rune(RunicAlphabetName.younger_futhark, "\u16BC", "h", "h", "haglaz"),
Rune(RunicAlphabetName.younger_futhark, "\u16BE", "n", "n", "naudiz"),
Rune(RunicAlphabetName.younger_futhark, "\u16C1", "i", "i", "isaz"),
Rune(RunicAlphabetName.younger_futhark, "\u16C5", "a", "a", "jeran"),
Rune(RunicAlphabetName.younger_futhark, "\u16CB", "s", "s", "sowilo"),
Rune(RunicAlphabetName.younger_futhark, "\u16CF", "t", "t", "tiwaz"),
Rune(RunicAlphabetName.younger_futhark, "\u16D2", "b", "b", "berkanan"),
Rune(RunicAlphabetName.younger_futhark, "\u16D6", "e", "e", "ehwaz"),
Rune(RunicAlphabetName.younger_futhark, "\u16D8", "m", "m", "mannaz"), # also \u16D9
Rune(RunicAlphabetName.younger_futhark, "\u16DA", "l", "l", "laguz"),
Rune(RunicAlphabetName.younger_futhark, "\u16E6", "r", "R", "algiz")
]
# ᚠ ᚢ ᚦ ᚭ ᚱ ᚴ ᚽ ᚿ ᛁ ᛅ ᛌ ᛐ ᛓ ᛖ ᛙ ᛚ ᛧ
SHORT_TWIG_YOUNGER_FUTHARK = [
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16A0", "f", "f", "fehu"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16A2", "u", "u", "uruz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16A6", "θ", "þ", "þuriaz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16AD", "a", "a", "ansuz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16B1", "r", "r", "raido"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16B4", "k", "k", "kaunan"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16BD", "h", "h", "haglaz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16BF", "n", "n", "naudiz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16C1", "i", "i", "isaz"),
    Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16C5", "a", "a", "jeran"),  # TODO: verify codepoint for short-twig jeran
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16CC", "s", "s", "sowilo"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16D0", "t", "t", "tiwaz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16D3", "b", "b", "berkanan"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16D6", "e", "e", "ehwaz"),
    Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16D9", "m", "m", "mannaz"),  # also \u16D8
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16DA", "l", "l", "laguz"),
Rune(RunicAlphabetName.short_twig_younger_futhark, "\u16E7", "r", "R", "algiz"),
]
# https://fr.wikipedia.org/wiki/Petite_pierre_de_Jelling
little_jelling_stone = "᛬ᚴᚢᚱᛘᛦ᛬ᚴᚢᚾᚢᚴᛦ᛬ᚴ(ᛅᚱ)ᚦᛁ᛬ᚴᚢᛒᛚ᛬ᚦᚢᛋᛁ᛬ᛅ(ᚠᛏ)᛬ᚦᚢᚱᚢᛁ᛬ᚴᚢᚾᚢ᛬ᛋᛁᚾᛅ᛬ᛏᛅᚾᛘᛅᚱᚴᛅᛦ᛬ᛒᚢᛏ᛬"
big_jelling_stone = "ᚼᛅᚱᛅᛚᛏᚱ᛬ᚴᚢᚾᚢᚴᛦ᛬ᛒᛅᚦ᛬ᚴᛅᚢᚱᚢᛅ ᚴᚢᛒᛚ᛬ᚦᛅᚢᛋᛁ᛬ᛅᚠᛏ᛬ᚴᚢᚱᛘ ᚠᛅᚦᚢᚱ ᛋᛁᚾ ᛅᚢᚴ ᛅᚠᛏ᛬ᚦᚭᚢᚱᚢᛁ᛬ᛘᚢᚦᚢᚱ᛬ᛋᛁᚾᛅ᛬ᛋᛅ " \
"ᚼᛅᚱᛅᛚᛏᚱ(᛬)ᛁᛅᛋ᛬ᛋᚭᛦ᛫ᚢᛅᚾ᛫ᛏᛅᚾᛘᛅᚢᚱᚴ\nᛅᛚᛅ᛫ᛅᚢᚴ᛫ᚾᚢᚱᚢᛁᚴ\n᛫ᛅᚢᚴ᛫ᛏ(ᛅ)ᚾᛁ(᛫ᚴᛅᚱᚦᛁ᛫)ᚴᚱᛁᛋᛏᚾᚭ"
ramsund_runestone = "ᛋᛁᚱᚦ᛬ᚴᛁᛅᚱᚦᛁ᛬ᛒᚢᚦ᛬ᚦᚭᛋᛁ᛬ᛘᚢᚦᛁᛦ᛬ᛅᛚᚱᛁᚴᛋ᛬ᛏᚢᛏᛁᛦ᛬ᚢᚱᛘᛋ᛬ᚠᚢᚱ᛬ᛋᛅᛚᚢ᛬ᚼᛅᛚᚢ᛬ᚼᚢᛚᛘᚴᛁᚱᛋ᛬ᚠᛅᚦᚢᚱ᛬ᛋᚢᚴᚱᚢᚦᛅᚱ᛬ᛒᚢᛅᛏᛅ᛬ᛋᛁᛋ"
# The Ed (Boulder) Inscription from Uppland, Sweden
# http://runes.verbix.com/vikingage/Uppland.html
ed_inscription = "ᚱᛅᚼᚾᚢᛅᛚᛏᚱ ᛫ ᛚᛁᛏ ᛫ ᚱᛁᛋᛏᛅ ᛫ ᚱᚢᚾᛅᚱ ᛫ ᚽᚡᛦ ᛫ ᚡᛅᛋᛏᚢᛁ ᛫ ᛘᚬᚦᚢᚱ ᛫ ᛋᛁᚾᛅ ᛫ ᚬᚾᚽᛘᛋ ᛫ ᛏᚬᛏᛦ ᛫ ᛏᚬ ᛁ ᛫ " \
"ᛅᛁᚦᛁ ᛫ ᚴᚢᚦ ᚼᛁᛅᛚᛒᛁ ᛫ ᛅᚾᛏ ᛫ ᚼᚽᚾᛅ ᛫ " \
"§B ᚱᚢᚾᛅ ᛫ ᚱᛁᛋᛏᛅ ᛫ ᛚᛁᛏ ᛫ ᚱᛅᚼᚾᚢᛅᛚᛏᚱ ᛫ ᚼᚢᛅᚱ ᛅ × ᚵᚱᛁᚴᛚᛅᚾᛏᛁ ᛫ ᚢᛅᛋ ᛫ ᛚᛁᛋ ᛫ ᚡᚬᚱᚢᚾᚴᛁ ᛫"
# https://en.wikipedia.org/wiki/Uppland_Runic_Inscription_448
u448 = "ᚴᚢᛚ᛫ᛅᚢᚴ᛫ᛒᛁᚢᚱᚾ᛫ᛚᛁᛏᚢ᛫ᚱᛅᛁᛋᛅ᛫ᛋᛏᛅᛁᚾ᛫ᛂᚠᛏᛁᛦ᛫ᚦᚢᚱᛋᛏᛅᛁᚾ᛫ᚠᛅᚦᚢᚱ"
# https://en.wikipedia.org/wiki/Uppland_Runic_Inscription_92
u92 = "ᚴᚾᚢᛏᚱ ' ᛁ ᚢᛁᚴ'ᚼᚢᛋᚢᛘ ' ᛚᛁᛏ ' ᛋᛏᛅᛁᚾ ' ᚱᛁᛏᛅ ' ᚢᚴ ' ᛒᚱᚬ ' ᚴᛁᚱᛅ ᛫ ᛁᚡᛏᛁᛦ ' ᚡᛅᚦᚢᚱ ᚢᚴ ᛫ ᛘᚬᚦᚬᚱ ᛫ ᚢᚴ ᛫" \
" ᛒᚱᚤᚦᚱ ᛫ ᛋᛁᚾᛅ ᛫ ᚢᚴ ᛫ ᛋᚢᛋᛏᚢᚱ"
sweden_runic_inscription_filename = "sweden_runes.json"
def read_sweden_runes():
with open(sweden_runic_inscription_filename, "r") as f:
return json.load(f)
if __name__ == "__main__":
print(" ".join(Rune.display_runes(ELDER_FUTHARK)))
print(" ".join(Rune.display_runes(YOUNGER_FUTHARK)))
print(" ".join(Rune.display_runes(SHORT_TWIG_YOUNGER_FUTHARK)))
# https://sv.wikipedia.org/wiki/Upplands_runinskrifter_937
inscription = "ᚦᛁᛅᚴᚾ᛫ᛅᚢᚴ᛫ᚴᚢᚾᛅᚱ᛫ᚱᛅᛁᛋᛏᚢ᛫ᛋᛏᛅᛁᚾᛅ ᛅᚠᛏᛁᛦ᛫ᚢᛅᚱ᛫ᛒᚱᚢᚦᚢᚱ᛫ᛋᛁᚾ"
print(inscription)
print(Transcriber.transcribe(inscription, YOUNGER_FUTHARK))
print(little_jelling_stone)
print(Transcriber.transcribe(little_jelling_stone, YOUNGER_FUTHARK))
print(big_jelling_stone)
print(Transcriber.transcribe(big_jelling_stone, YOUNGER_FUTHARK))
print(ramsund_runestone)
print(Transcriber.transcribe(ramsund_runestone, YOUNGER_FUTHARK))
| 50.021127 | 114 | 0.678727 | 988 | 7,103 | 4.816802 | 0.314777 | 0.255936 | 0.13112 | 0.166422 | 0.528893 | 0.449674 | 0.339147 | 0.19668 | 0 | 0 | 0 | 0.029977 | 0.135858 | 7,103 | 141 | 115 | 50.375887 | 0.729879 | 0.092918 | 0 | 0 | 0 | 0.074468 | 0.245213 | 0.060097 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010638 | false | 0 | 0.021277 | 0 | 0.042553 | 0.117021 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df4a81a9acda59ab397d3b7d2b0f2552fa3a9ecb | 351 | py | Python | errorHandler.py | LWollatz/mendeley2bibtex | eadda877fc08c51e4572bb7498021fa2e324c050 | [
"Apache-2.0"
] | null | null | null | errorHandler.py | LWollatz/mendeley2bibtex | eadda877fc08c51e4572bb7498021fa2e324c050 | [
"Apache-2.0"
] | null | null | null | errorHandler.py | LWollatz/mendeley2bibtex | eadda877fc08c51e4572bb7498021fa2e324c050 | [
"Apache-2.0"
] | null | null | null | AllErrors=[]
AllWarnings=[]
mainVerbose=True
def raiseError(msg,verbose=mainVerbose):
global AllErrors
AllErrors.append(msg)
if verbose:
        print("ERROR: " + msg)
return None
def raiseWarning(msg,verbose=mainVerbose):
global AllWarnings
AllWarnings.append(msg)
if verbose:
        print("WARNING: " + msg)
return None
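The module above accumulates messages in module-level lists while optionally echoing them, so a caller can run quietly and inspect everything afterwards. A self-contained sketch of that accumulate-and-optionally-print pattern (names and messages here are illustrative, not from the original module):

```python
# Standalone sketch of the error-accumulator pattern used by
# raiseError/raiseWarning above.
collected_errors = []

def report_error(msg, verbose=True):
    # Always record the message; only print it when verbose is set.
    collected_errors.append(msg)
    if verbose:
        print("ERROR: " + msg)
    return None

report_error("missing BibTeX field", verbose=False)
report_error("unknown entry type", verbose=False)
print(collected_errors)  # -> ['missing BibTeX field', 'unknown entry type']
```

Returning `None` explicitly lets call sites assign the result as a sentinel for "this operation failed".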
| 19.5 | 42 | 0.68661 | 38 | 351 | 6.342105 | 0.447368 | 0.082988 | 0.174274 | 0.224066 | 0.190871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.219373 | 351 | 17 | 43 | 20.647059 | 0.879562 | 0 | 0 | 0.266667 | 0 | 0 | 0.045584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df584c511c38c63a4d478cb89958f0881f65aa76 | 8,312 | py | Python | sdk/python/pulumi_scaleway/instance_security_group.py | Kamaradeivanov/pulumi-scaleway | a93249741b95b164f3ee651b1798c0b6f245b12d | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-10-16T09:09:09.000Z | 2020-10-16T09:09:09.000Z | sdk/python/pulumi_scaleway/instance_security_group.py | Kamaradeivanov/pulumi-scaleway | a93249741b95b164f3ee651b1798c0b6f245b12d | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_scaleway/instance_security_group.py | Kamaradeivanov/pulumi-scaleway | a93249741b95b164f3ee651b1798c0b6f245b12d | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from . import utilities, tables
class InstanceSecurityGroup(pulumi.CustomResource):
description: pulumi.Output[str]
"""
The description of the security group
"""
external_rules: pulumi.Output[bool]
inbound_default_policy: pulumi.Output[str]
"""
Default inbound traffic policy for this security group
"""
inbound_rules: pulumi.Output[list]
"""
Inbound rules for this security group
* `action` (`str`)
* `ip` (`str`)
* `ip_range` (`str`)
* `port` (`float`)
* `portRange` (`str`)
* `protocol` (`str`)
"""
name: pulumi.Output[str]
"""
The name of the security group
"""
organization_id: pulumi.Output[str]
"""
The organization_id you want to attach the resource to
"""
outbound_default_policy: pulumi.Output[str]
"""
Default outbound traffic policy for this security group
"""
outbound_rules: pulumi.Output[list]
"""
Outbound rules for this security group
* `action` (`str`)
* `ip` (`str`)
* `ip_range` (`str`)
* `port` (`float`)
* `portRange` (`str`)
* `protocol` (`str`)
"""
stateful: pulumi.Output[bool]
"""
The stateful value of the security group
"""
zone: pulumi.Output[str]
"""
The zone you want to attach the resource to
"""
def __init__(__self__, resource_name, opts=None, description=None, external_rules=None, inbound_default_policy=None, inbound_rules=None, name=None, organization_id=None, outbound_default_policy=None, outbound_rules=None, stateful=None, zone=None, __props__=None, __name__=None, __opts__=None):
"""
Create a InstanceSecurityGroup resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: The description of the security group
:param pulumi.Input[str] inbound_default_policy: Default inbound traffic policy for this security group
:param pulumi.Input[list] inbound_rules: Inbound rules for this security group
:param pulumi.Input[str] name: The name of the security group
:param pulumi.Input[str] organization_id: The organization_id you want to attach the resource to
:param pulumi.Input[str] outbound_default_policy: Default outbound traffic policy for this security group
:param pulumi.Input[list] outbound_rules: Outbound rules for this security group
:param pulumi.Input[bool] stateful: The stateful value of the security group
:param pulumi.Input[str] zone: The zone you want to attach the resource to
The **inbound_rules** object supports the following:
* `action` (`pulumi.Input[str]`)
* `ip` (`pulumi.Input[str]`)
* `ip_range` (`pulumi.Input[str]`)
* `port` (`pulumi.Input[float]`)
* `portRange` (`pulumi.Input[str]`)
* `protocol` (`pulumi.Input[str]`)
The **outbound_rules** object supports the following:
* `action` (`pulumi.Input[str]`)
* `ip` (`pulumi.Input[str]`)
* `ip_range` (`pulumi.Input[str]`)
* `port` (`pulumi.Input[float]`)
* `portRange` (`pulumi.Input[str]`)
* `protocol` (`pulumi.Input[str]`)
"""
if __name__ is not None:
warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
resource_name = __name__
if __opts__ is not None:
warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
opts = __opts__
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = dict()
__props__['description'] = description
__props__['external_rules'] = external_rules
__props__['inbound_default_policy'] = inbound_default_policy
__props__['inbound_rules'] = inbound_rules
__props__['name'] = name
__props__['organization_id'] = organization_id
__props__['outbound_default_policy'] = outbound_default_policy
__props__['outbound_rules'] = outbound_rules
__props__['stateful'] = stateful
__props__['zone'] = zone
super(InstanceSecurityGroup, __self__).__init__(
'scaleway:index/instanceSecurityGroup:InstanceSecurityGroup',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name, id, opts=None, description=None, external_rules=None, inbound_default_policy=None, inbound_rules=None, name=None, organization_id=None, outbound_default_policy=None, outbound_rules=None, stateful=None, zone=None):
"""
Get an existing InstanceSecurityGroup resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param str id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: The description of the security group
:param pulumi.Input[str] inbound_default_policy: Default inbound traffic policy for this security group
:param pulumi.Input[list] inbound_rules: Inbound rules for this security group
:param pulumi.Input[str] name: The name of the security group
:param pulumi.Input[str] organization_id: The organization_id you want to attach the resource to
:param pulumi.Input[str] outbound_default_policy: Default outbound traffic policy for this security group
:param pulumi.Input[list] outbound_rules: Outbound rules for this security group
:param pulumi.Input[bool] stateful: The stateful value of the security group
:param pulumi.Input[str] zone: The zone you want to attach the resource to
The **inbound_rules** object supports the following:
* `action` (`pulumi.Input[str]`)
* `ip` (`pulumi.Input[str]`)
* `ip_range` (`pulumi.Input[str]`)
* `port` (`pulumi.Input[float]`)
* `portRange` (`pulumi.Input[str]`)
* `protocol` (`pulumi.Input[str]`)
The **outbound_rules** object supports the following:
* `action` (`pulumi.Input[str]`)
* `ip` (`pulumi.Input[str]`)
* `ip_range` (`pulumi.Input[str]`)
* `port` (`pulumi.Input[float]`)
* `portRange` (`pulumi.Input[str]`)
* `protocol` (`pulumi.Input[str]`)
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = dict()
__props__["description"] = description
__props__["external_rules"] = external_rules
__props__["inbound_default_policy"] = inbound_default_policy
__props__["inbound_rules"] = inbound_rules
__props__["name"] = name
__props__["organization_id"] = organization_id
__props__["outbound_default_policy"] = outbound_default_policy
__props__["outbound_rules"] = outbound_rules
__props__["stateful"] = stateful
__props__["zone"] = zone
return InstanceSecurityGroup(resource_name, opts=opts, __props__=__props__)
def translate_output_property(self, prop):
return tables._CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
def translate_input_property(self, prop):
return tables._SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
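`translate_output_property` and `translate_input_property` map between the provider's camelCase keys and Pythonic snake_case attribute names via generated lookup tables. What those lookups accomplish can be sketched with a regex-based conversion; the helper functions below are hypothetical and independent of Pulumi's actual `tables` module:

```python
import re

# Standalone camelCase <-> snake_case conversion, sketching what the
# _CAMEL_TO_SNAKE_CASE_TABLE lookups accomplish for property names.
def camel_to_snake(name: str) -> str:
    # Insert an underscore before every interior uppercase letter.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def snake_to_camel(name: str) -> str:
    head, *rest = name.split("_")
    return head + "".join(part.title() for part in rest)

print(camel_to_snake("inboundDefaultPolicy"))  # -> inbound_default_policy
print(snake_to_camel("outbound_rules"))        # -> outboundRules
```

Pulumi uses pre-generated tables rather than regexes so that irregular names (acronyms, provider quirks) round-trip exactly.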
| 44.212766 | 297 | 0.6564 | 969 | 8,312 | 5.357069 | 0.143447 | 0.089 | 0.086303 | 0.064727 | 0.700443 | 0.67906 | 0.645926 | 0.639376 | 0.607975 | 0.601233 | 0 | 0.000159 | 0.240977 | 8,312 | 187 | 298 | 44.449198 | 0.822634 | 0.397377 | 0 | 0.029851 | 1 | 0 | 0.153078 | 0.039957 | 0 | 0 | 0 | 0 | 0 | 1 | 0.059701 | false | 0.014925 | 0.074627 | 0.029851 | 0.343284 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df6a5a6667316fd4402654e44db52f177022ad5f | 649 | py | Python | djangobench/benchmarks/template_compilation/benchmark.py | Bouke/djangobench | 94fc28d99f95c65d26d0fad8af44e46c49282220 | [
"BSD-3-Clause"
] | 3 | 2016-11-27T22:25:34.000Z | 2018-12-12T20:06:40.000Z | djangobench/benchmarks/template_compilation/benchmark.py | Bouke/djangobench | 94fc28d99f95c65d26d0fad8af44e46c49282220 | [
"BSD-3-Clause"
] | null | null | null | djangobench/benchmarks/template_compilation/benchmark.py | Bouke/djangobench | 94fc28d99f95c65d26d0fad8af44e46c49282220 | [
"BSD-3-Clause"
] | null | null | null | from django.template import Template
from djangobench.utils import run_benchmark
def benchmark():
# Just compile the template, no rendering
t = Template("""
{% for v in vals %}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{{ v }}
{% endfor %}
""")
run_benchmark(
benchmark,
syncdb = False,
meta = {
'description': 'Template compilation time.',
}
)
| 19.666667 | 52 | 0.349769 | 51 | 649 | 4.411765 | 0.509804 | 0.124444 | 0.173333 | 0.213333 | 0.066667 | 0.066667 | 0.066667 | 0.066667 | 0.066667 | 0.066667 | 0 | 0 | 0.497689 | 649 | 32 | 53 | 20.28125 | 0.690184 | 0.060092 | 0 | 0.517241 | 0 | 0 | 0.643092 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0.068966 | 0 | 0.103448 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df86e15db6784b6e132bdf3812d4af8173403acc | 1,710 | py | Python | pollumeter/pollumeterbeta/migrations/0001_initial.py | dhruvildave/pollumeter | 3eb6eceae4a8a9bb3fb95e2e3b0385f8472c4a65 | [
"BSD-3-Clause"
] | null | null | null | pollumeter/pollumeterbeta/migrations/0001_initial.py | dhruvildave/pollumeter | 3eb6eceae4a8a9bb3fb95e2e3b0385f8472c4a65 | [
"BSD-3-Clause"
] | 3 | 2021-03-11T01:34:50.000Z | 2022-02-27T08:15:58.000Z | pollumeter/pollumeterbeta/migrations/0001_initial.py | dhruvildave/pollumeter | 3eb6eceae4a8a9bb3fb95e2e3b0385f8472c4a65 | [
"BSD-3-Clause"
] | 1 | 2020-03-21T12:37:56.000Z | 2020-03-21T12:37:56.000Z | # Generated by Django 3.0.3 on 2020-03-14 09:14
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='pollimetermodel',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('datetime', models.DateTimeField(blank=True)),
('long', models.FloatField()),
('lat', models.FloatField()),
('city', models.CharField(max_length=255)),
('area', models.CharField(max_length=255)),
('pol_co2', models.FloatField()),
('pol_co', models.FloatField()),
('pol_so2', models.FloatField()),
('pol_no2', models.FloatField()),
('pol_no', models.FloatField()),
('pol_AQI', models.FloatField()),
('pol_pm25', models.FloatField()),
('pol_pm10', models.FloatField()),
('pol_bc', models.FloatField()),
('fac_JAM', models.FloatField()),
('fac_avgspeed', models.FloatField()),
('fac_indfert', models.FloatField()),
('fac_indman', models.FloatField()),
('fac_indtech', models.FloatField()),
('fac_indpharm', models.FloatField()),
('fac_indfood', models.FloatField()),
('active_population', models.FloatField()),
('weather_temp', models.FloatField()),
('weather_humidity', models.FloatField()),
],
),
]
| 38 | 114 | 0.51462 | 144 | 1,710 | 5.944444 | 0.479167 | 0.392523 | 0.17757 | 0.056075 | 0.063084 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.328655 | 1,710 | 44 | 115 | 38.863636 | 0.721254 | 0.026316 | 0 | 0 | 1 | 0 | 0.134095 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027027 | 0 | 0.135135 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df8b5e34f00e28c5ebe9256aaa1dbdfe27d2f8a7 | 88 | py | Python | Python Fundamentals/Lists Basics/Lab/Task01.py | DonikaChervenkova/SoftUni | bff579c037ec48f39ed193b34bc3502a32e90732 | [
"MIT"
] | null | null | null | Python Fundamentals/Lists Basics/Lab/Task01.py | DonikaChervenkova/SoftUni | bff579c037ec48f39ed193b34bc3502a32e90732 | [
"MIT"
] | null | null | null | Python Fundamentals/Lists Basics/Lab/Task01.py | DonikaChervenkova/SoftUni | bff579c037ec48f39ed193b34bc3502a32e90732 | [
"MIT"
] | 1 | 2021-12-04T12:30:57.000Z | 2021-12-04T12:30:57.000Z | tail = input()
body = input()
head = input()
animal = [head, body, tail]
print(animal)
| 12.571429 | 27 | 0.636364 | 12 | 88 | 4.666667 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 88 | 6 | 28 | 14.666667 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df8f3771556ffe0a193031a57f3a2d7afa936b5a | 468 | py | Python | apps/rent/validators.py | WouterVdS/BosvogelWebPlatform | 4388d6fd98645ffb7c1470718d42dab43b953b95 | [
"MIT"
] | 1 | 2019-02-05T20:14:41.000Z | 2019-02-05T20:14:41.000Z | apps/rent/validators.py | WouterVdS/BosvogelWebPlatform | 4388d6fd98645ffb7c1470718d42dab43b953b95 | [
"MIT"
] | 157 | 2019-02-12T18:07:28.000Z | 2022-02-10T07:14:24.000Z | apps/rent/validators.py | WouterVdS/BosvogelWebPlatform | 4388d6fd98645ffb7c1470718d42dab43b953b95 | [
"MIT"
] | null | null | null | import re
from django.core.exceptions import ValidationError
def validate_international_phone_number(value):
if not value.startswith('0032'):
raise ValidationError(f'{value} should start with 0032')
def validate_iban_format(value):
pattern = re.compile(r'[A-Z]{2}\d{2} ?\d{4} ?\d{4} ?\d{4} ?[\d]{0,2}')
if not pattern.match(value):
raise ValidationError(f'{value} should be a valid bank account number in format: BExx xxxx xxxx xxxx')
| 31.2 | 110 | 0.698718 | 72 | 468 | 4.472222 | 0.569444 | 0.018634 | 0.02795 | 0.161491 | 0.220497 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038363 | 0.16453 | 468 | 14 | 111 | 33.428571 | 0.785166 | 0 | 0 | 0 | 0 | 0.111111 | 0.331197 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df94fc8f6cd3b4c08d73e23809abf86297ab24f7 | 272 | py | Python | jadn/libs/convert/__init__.py | g2-inc/openc2-jadn | c15483d4f9c01636528b2b94753c68d2ca9d222a | [
"Apache-2.0"
] | null | null | null | jadn/libs/convert/__init__.py | g2-inc/openc2-jadn | c15483d4f9c01636528b2b94753c68d2ca9d222a | [
"Apache-2.0"
] | null | null | null | jadn/libs/convert/__init__.py | g2-inc/openc2-jadn | c15483d4f9c01636528b2b94753c68d2ca9d222a | [
"Apache-2.0"
] | null | null | null | from .w_jas import jas_dump
from .w_html import html_dump
from .w_markdown import markdown_dump
# from w_proto import proto_dump
# from w_thrift import thrift_dump
__all__ = [
'jas_dump',
'markdown_dump',
'html_dump'
# 'proto_dump',
# 'thrift_dump'
]
| 19.428571 | 37 | 0.716912 | 41 | 272 | 4.292683 | 0.243902 | 0.142045 | 0.204545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.198529 | 272 | 13 | 38 | 20.923077 | 0.807339 | 0.334559 | 0 | 0 | 0 | 0 | 0.170455 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
10c311cb80d00df709e9fa1e6fa0cc5d75aef1c3 | 5,326 | py | Python | calc.py | IamEinstein/python-math-template | 086c23525c044a521d4704f5e7cfabdec4c5d2ab | [
"MIT"
] | null | null | null | calc.py | IamEinstein/python-math-template | 086c23525c044a521d4704f5e7cfabdec4c5d2ab | [
"MIT"
] | null | null | null | calc.py | IamEinstein/python-math-template | 086c23525c044a521d4704f5e7cfabdec4c5d2ab | [
"MIT"
] | null | null | null | import datetime
import math
pi = 22 / 7
def add():
num1 = float(input("Enter the number"))
num2 = float(input("Enter the second number"))
return num1 + num2
def subtract():
num1 = float(input("Enter the number"))
num2 = float(input("Enter the second number"))
return num1 - num2
def multiply():
num1 = float(input("Enter the number"))
num2 = float(input("Enter the second number"))
return num1 * num2
def divide():
num1 = float(input("Enter the number"))
num2 = float(input("Enter the second number"))
return num1 / num2
def sin():
num = float(input("Enter the number"))
return math.sin(num)
def cos():
num = float(input("Enter the number"))
return math.cos(num)
def tan():
num = int(input("Enter the number "))
return math.tan(num)
def factorial():
num = int(input("Enter the number"))
return math.factorial(num)
def power():
num1 = int(input('Enter the number'))
num2 = int(input('Enter the power'))
return num1**num2
def root():
num1 = int(input('Enter the number'))
num2 = int(input('Enter the power'))
    if num2 > 0:
        return num1**(1 / num2)
# physics
def speed():
distance = int(input("Enter the distance"))
    time = int(input("Enter the time"))
return distance / time
def velocity():
displacement = int(input("Enter the displacement"))
    time = int(input("Enter the time"))
return displacement / time
def time():
return datetime.datetime.now()
def time_from():
distance = float(input("Enter the distance/displacement"))
speed = float(input("Enter the speed/velocity"))
    # speed = d / t  =>  t = d / speed
return distance / speed
def distance_from():
    """Gives you the distance from time and speed/velocity."""
time = float(input("Enter the time"))
speed = float(input("Enter the speed/velocity"))
    # speed = d / t  =>  d = speed * t
return time * speed
# shapes
# shapes only
class Square():
def __init__(self, side):
self.side = side
def area(self):
return self.side**2
def perimeter(self):
return self.side * 4
class Rectangle():
def __init__(self, length, width):
self.length = length
self.width = width
def area(self):
return self.width * self.length
def perimeter(self):
return 2 * (self.length + self.width)
class Circle():
def __init__(self, radius):
self.radius = radius
def area(self):
return 3.14 * (self.radius**2)
def perimeter(self):
return 2 * 3.14 * self.radius
class Trapezium():
def __init__(self, base1, base2, height):
self.height = height
self.base1 = base1
self.base2 = base2
def area(self):
area = self.height / 2 * (self.base1 + self.base2)
return area
def perimeter(self):
# TODO: Add an advanced formula
pass
class Cylinder():
def __init__(self, radius, height):
self.height = height
self.radius = radius
def volume(self):
circle_area = pi * (self.radius**2)
volume = circle_area * self.height
return volume
def total_surface_area(self):
total_circle_area = 2 * (pi * (self.radius**2))
rect_area = self.height * (pi * 2 * self.radius)
return rect_area + total_circle_area
def lateral_surface_area(self):
rect_area = self.height * (pi * 2 * self.radius)
return rect_area
class Triangle():
def __init__(self):
pass
def area(self):
method = int(input('''
    How do you want to calculate the area?
1) b * h * 1/2
2) heron's formula
'''))
if method == 2:
side1 = float(input('Enter the side'))
side2 = float(input('Enter the side'))
side3 = float(input('Enter the side'))
s = (side1 + side2 + side3) / 2
return (s * (s - side1) * (s - side2) * (s - side3))**0.5
else:
base = float(input('Enter the base'))
height = float(input('Enter the height'))
return 1 / 2 * base * height
def perimeter(self):
side1 = float(input('Enter the side'))
side2 = float(input('Enter the side'))
side3 = float(input('Enter the side'))
return side1 + side2 + side3
class Cuboid():
def __init__(self, height, length, width):
self.height = height
self.length = length
self.width = width
def volume(self):
return self.length*self.height*self.width
def total_surface_area(self):
lb = self.length * self.width
bh = self.height * self.width
lh = self.length * self.height
return 2*(lb+bh+lh)
# todo:
# def lateral_surface_area(self):
class Cube():
def __init__(self, side):
self.side = side
def volume(self):
return self.side**3
def total_surface_area(self):
return 6*(self.side**2)
def lateral_surface_area(self):
return 4*(self.side**2)
# testing
# rect = Rectangle(length=20, width=30)
# rect.area()
| 22.858369 | 70 | 0.566842 | 672 | 5,326 | 4.409226 | 0.159226 | 0.107999 | 0.131623 | 0.133648 | 0.477219 | 0.361795 | 0.361795 | 0.294971 | 0.226122 | 0.226122 | 0 | 0.02515 | 0.313181 | 5,326 | 232 | 71 | 22.956897 | 0.784855 | 0.043935 | 0 | 0.403974 | 0 | 0 | 0.13953 | 0.00429 | 0 | 0 | 0 | 0.00431 | 0 | 1 | 0.278146 | false | 0.013245 | 0.013245 | 0.072848 | 0.556291 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
10c4b850a0e98c3f338bd0059a048d7043055f77 | 283 | py | Python | rest_profile/migrations/0009_merge_20180624_1801.py | Visitey/Visitey-Server | e91c0ac52e943af9a739c42a6a3fe8603994c6b6 | [
"MIT"
] | 1 | 2018-06-06T09:46:00.000Z | 2018-06-06T09:46:00.000Z | rest_profile/migrations/0009_merge_20180624_1801.py | Visitey/Visitey-Server | e91c0ac52e943af9a739c42a6a3fe8603994c6b6 | [
"MIT"
] | null | null | null | rest_profile/migrations/0009_merge_20180624_1801.py | Visitey/Visitey-Server | e91c0ac52e943af9a739c42a6a3fe8603994c6b6 | [
"MIT"
] | null | null | null | # Generated by Django 2.0.6 on 2018-06-24 16:01
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('rest_profile', '0006_auto_20180606_1024'),
('rest_profile', '0008_auto_20180620_1723'),
]
operations = [
]
| 18.866667 | 52 | 0.660777 | 35 | 283 | 5.114286 | 0.828571 | 0.122905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.213636 | 0.222615 | 283 | 14 | 53 | 20.214286 | 0.6 | 0.159011 | 0 | 0 | 1 | 0 | 0.29661 | 0.194915 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |