# Code dataset export: one record per source file, carrying repo metadata (path, repo name, license, star/issue/fork counts) plus the file content and numeric code-quality signals.
# file: maps/models.py | repo: Mickey253/hyperbolic-space-graphs | license: MIT
from django.db import models
from json import dumps
class Task(models.Model):
creation_date = models.CharField(null=True, max_length=64)
creation_ip = models.CharField(null=True, max_length=64)
input_dot = models.TextField()
dot_rep = models.TextField()
svg_rep = models.TextField()
svg_rep0 = models.TextField(null=True)
svg_rep1 = models.TextField(null=True)
svg_rep2 = models.TextField(null=True)
svg_rep3 = models.TextField(null=True)
status = models.TextField()
width = models.FloatField(null=True, blank=True)
height = models.FloatField(null=True, blank=True)
vis_type = models.CharField(max_length=64)
layout_algorithm = models.CharField(max_length=64)
cluster_algorithm = models.CharField(max_length=64)
contiguous_algorithm = models.CharField(max_length=64)
hyperbolic_projection = models.CharField(max_length=64)
color_scheme = models.CharField(max_length=64)
semantic_zoom = models.CharField(max_length=64)
def metadata(self):
return {
'id': self.id,
'status': self.status,
'width': self.width,
'height': self.height,
}
def json_metadata(self):
return dumps(self.metadata())
def description(self):
desc = ''
desc += 'Visualization Type: ' + self.vis_type + '\n'
desc += 'Layout Algorithm: ' + self.layout_algorithm + '\n'
desc += 'Cluster Algorithm: ' + self.cluster_algorithm + '\n'
desc += 'Color Scheme: ' + self.color_scheme + '\n'
desc += 'Semantic Zoom: ' + self.semantic_zoom + '\n'
return desc
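The `metadata`/`json_metadata` pair can be exercised without a database; a minimal sketch using a plain stand-in class (`FakeTask` is hypothetical, not part of the app):

```python
from json import dumps

class FakeTask:
    """Stand-in for the Django Task model, for illustration only."""
    def __init__(self, id, status, width, height):
        self.id, self.status = id, status
        self.width, self.height = width, height

    def metadata(self):
        # Same shape as Task.metadata(): a plain dict of display fields
        return {'id': self.id, 'status': self.status,
                'width': self.width, 'height': self.height}

    def json_metadata(self):
        # dumps() serializes the dict to a JSON string for the frontend
        return dumps(self.metadata())

task = FakeTask(1, 'done', 800.0, 600.0)
print(task.json_metadata())
```

Keeping `metadata()` separate from `json_metadata()` lets callers reuse the dict directly (e.g. in template context) while views that need a string get it serialized.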
# file: leaker/api/leakage_pattern.py | repo: encryptogroup/LEAKER | license: MIT
"""
For License information see the LICENSE file.
Authors: Johannes Leupold
"""
from abc import ABC, abstractmethod
from typing import Iterable, List, Generic, TypeVar, Union, Tuple
from .range_database import RangeDatabase
from .dataset import Dataset
T = TypeVar("T", covariant=True)
class LeakagePattern(ABC, Generic[T]):
"""A leakage pattern, that is, a function from a sequence of queries or values to some specific leakage type."""
@abstractmethod
def leak(self, dataset: Union[Dataset, RangeDatabase], queries: Union[Iterable[int], Iterable[str],
Iterable[Tuple[int, int]]]) -> List[T]:
"""
Calculates the leakage on the given data set and queries.
Parameters
----------
dataset : Union[Dataset, RangeDatabase]
the data set or range DB to calculate the leakage on
queries : Union[Iterable[int], Iterable[str], Iterable[Tuple[int, int]]]
the values or queries to leak on
Returns
-------
leak : List[T]
the leakage
"""
raise NotImplementedError
def __call__(self, dataset: Union[Dataset, RangeDatabase], queries: Union[Iterable[int], Iterable[str],
Iterable[Tuple[int, int]]]) -> List[T]:
return self.leak(dataset, queries)
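A concrete subclass implements `leak` by mapping each query to its leakage. The sketch below mirrors the interface with a simplified response-length pattern over a plain dict standing in for `Dataset`; the class names here are illustrative, not LEAKER's real API:

```python
from abc import ABC, abstractmethod
from typing import List

class SimplePattern(ABC):
    """Trimmed-down analogue of LeakagePattern for illustration."""
    @abstractmethod
    def leak(self, dataset, queries) -> List:
        raise NotImplementedError

    def __call__(self, dataset, queries):
        # same convenience as LeakagePattern.__call__ above
        return self.leak(dataset, queries)

class ResponseLength(SimplePattern):
    """rlen: number of documents matching each query keyword."""
    def leak(self, dataset, queries):
        return [len(dataset.get(q, ())) for q in queries]

index = {'alice': [1, 5, 9], 'bob': [2]}
print(ResponseLength()(index, ['alice', 'bob', 'carol']))  # [3, 1, 0]
```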
# file: loss.py | repo: qlwang25/CFBD | license: Apache-2.0
import torch
import torch.nn as nn
import models
import data.dict as dict
from torch.autograd import Variable
def criterion(tgt_vocab_size, use_cuda):
weight = torch.ones(tgt_vocab_size)
weight[dict.PAD] = 0
    crit = nn.CrossEntropyLoss(weight, reduction='sum')  # 'sum' replaces the deprecated size_average=False
if use_cuda:
crit.cuda()
return crit
# def cross_entropy_loss(hidden_outputs, targets, decoder, criterion, config):
# loss = torch.FloatTensor([0])
# if config.use_cuda:
# loss = loss.cuda()
# for i, penalty in zip(range(3), [1, 1, 1.5]):
# outputs = hidden_outputs[i].view(-1, hidden_outputs[i].size(2))
# scores = decoder.score_fn(outputs)
# loss += criterion(scores, targets[i].view(-1)) * penalty
# pred = scores.max(1)[1]
# num_correct = pred.eq(targets[-1].view(-1)).masked_select(targets[-1].view(-1).ne(dict.PAD)).sum()
# num_total = targets[-1].ne(dict.PAD).sum()
# loss.div(num_total.float()).backward()
# return loss, num_total, num_correct
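The zeroed `weight[dict.PAD]` entry makes PAD positions contribute nothing to the summed loss. A dependency-free sketch of that masking (the PAD index and log-probabilities are made up for illustration):

```python
PAD = 0  # assumed padding index, standing in for dict.PAD above

def masked_nll(log_probs, targets, pad=PAD):
    """Sum of negative log-likelihoods over non-PAD targets, mimicking
    CrossEntropyLoss with weight[pad] = 0 and a summed reduction."""
    total = 0.0
    for lp, t in zip(log_probs, targets):
        if t == pad:
            continue  # zero weight: padding positions add nothing
        total -= lp[t]
    return total

# two real tokens followed by one PAD position
log_probs = [[-0.1, -2.4], [-1.2, -0.4], [-0.5, -1.0]]
targets = [1, 1, PAD]
print(masked_nll(log_probs, targets))  # about 2.8 (2.4 + 0.4; PAD skipped)
```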
# file: djangocms_page_meta/settings.py | repo: Bernardvdv/djangocms-page-meta | license: BSD-3-Clause
def get_setting(name):
from django.conf import settings
description_length = getattr(settings, "PAGE_META_DESCRIPTION_LENGTH", None) or 320
tw_description_length = getattr(settings, "PAGE_META_TWITTER_DESCRIPTION_LENGTH", None) or 320
default = {
"PAGE_META_DESCRIPTION_LENGTH": description_length,
"PAGE_META_TWITTER_DESCRIPTION_LENGTH": tw_description_length,
}
return default["PAGE_META_%s" % name]
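The `getattr(settings, NAME, None) or default` idiom falls back both when the attribute is missing and when it is set to a falsy value such as 0. A quick stand-in using `SimpleNamespace` instead of Django settings (`get_length` is a hypothetical helper):

```python
from types import SimpleNamespace

def get_length(settings, default=320):
    # None, 0, or a missing attribute all fall back to the default
    return getattr(settings, "PAGE_META_DESCRIPTION_LENGTH", None) or default

print(get_length(SimpleNamespace()))                                  # 320
print(get_length(SimpleNamespace(PAGE_META_DESCRIPTION_LENGTH=160)))  # 160
print(get_length(SimpleNamespace(PAGE_META_DESCRIPTION_LENGTH=0)))    # 320 (0 is falsy!)
```

The last call shows the gotcha of `or`-based defaults: an explicit setting of 0 is silently replaced, which is why `getattr`'s own third argument alone is not equivalent.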
# file: tests/test_host_serial.py | repo: Xilinx/roast-xilinx | license: MIT
#
# Copyright (c) 2020 Xilinx, Inc. All rights reserved.
# SPDX-License-Identifier: MIT
#
import pytest
from roast.serial import SerialBase, Serial
config = {
"board_interface": "host_target",
"remote_host": "remote_host",
"com": "com",
"baudrate": "baudrate",
}
def test_host_serial(mocker):
mock_xexpect = mocker.patch(
"roast.serial.Xexpect", return_value=mocker.Mock("xexpect")
)
mock_xexpect.return_value.expect = mocker.Mock("xexpect", return_value="xexpect")
mock_xexpect.return_value.sendline = mocker.Mock("sendline")
mock_xexpect.return_value.runcmd = mocker.Mock("runcmd")
mock_xexpect.return_value.runcmd_list = mocker.Mock("runcmd_list")
mock_xexpect.return_value.sendcontrol = mocker.Mock("sendcontrol")
mock_xexpect.return_value.send = mocker.Mock("send")
mock_xexpect.return_value.output = mocker.Mock("output")
mock_xexpect.return_value._setup_init = mocker.Mock("setup_init")
mock_xexpect.return_value.search = mocker.Mock("search")
mock_xexpect.return_value.sync = mocker.Mock("sync")
mock_picom_connect = mocker.patch("roast.component.host_serial.picom_connect")
mock_picom_disconnect = mocker.patch("roast.component.host_serial.picom_disconnect")
s = Serial(serial_type="host", config=config)
assert s.driver.config == config
assert s.driver.hostname == "remote_host"
mock_picom_connect.assert_called_with(mocker.ANY, "com", "baudrate")
s.exit()
mock_picom_disconnect.assert_called()
def test_serial_exception():
config["board_interface"] = None
with pytest.raises(Exception, match="invalid serial interface"):
Serial(serial_type="host", config=config)
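`mocker.patch` is pytest-mock's thin wrapper around the standard library's `unittest.mock.patch`; the patch-then-assert flow used above can be sketched with plain `unittest.mock` (the `Driver` class and its `connect` method are made up):

```python
from unittest import mock

class Driver:
    def connect(self, com, baudrate):
        raise RuntimeError("would touch real hardware")

with mock.patch.object(Driver, "connect") as fake_connect:
    d = Driver()
    d.connect("com1", 115200)  # intercepted: the mock records the call
    fake_connect.assert_called_with("com1", 115200)
    assert fake_connect.called
```

After the `with` block exits, the original `connect` is restored, but the mock object keeps its call record for later assertions.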
# file: app/store/models.py | repo: igndukwe/ecomm-app-api | license: MIT
import uuid
import os
from django.db import models
from django.contrib.auth.models import User
def product_image_file_path(instance, filename):
    """Generate file path for a new product image"""
ext = filename.split('.')[-1]
filename = f'{uuid.uuid4()}.{ext}'
# path to store the file
return os.path.join('uploads/product/', filename)
# Create your models here.
class Customer(models.Model):
# once Customer is deleted
# let User be deleted as well
user = models.OneToOneField(
User,
null=True,
blank=True,
on_delete=models.CASCADE
)
name = models.CharField(max_length=200, null=True)
email = models.CharField(max_length=200)
def __str__(self):
return self.name
class Product(models.Model):
name = models.CharField(max_length=200)
price = models.DecimalField(max_digits=7, decimal_places=2)
digital = models.BooleanField(default=False, null=True, blank=True)
image = models.ImageField(
null=True,
blank=True,
upload_to=product_image_file_path
)
def __str__(self):
return self.name
@property
def imageURL(self):
try:
url = self.image.url
except Exception:
url = ''
return url
class Order(models.Model):
customer = models.ForeignKey(
Customer,
on_delete=models.SET_NULL,
null=True,
blank=True
)
date_ordered = models.DateTimeField(auto_now_add=True)
complete = models.BooleanField(default=False)
transaction_id = models.CharField(max_length=100, null=True)
def __str__(self):
return str(self.id)
@property
def get_cart_total(self):
"""Total amount of ordered items in cart"""
# Total is a product of price and quantity
# >OrderItem is a Child of Order
# Syntax: Parent.child_set.all()
# remember that child must be lower cases
# e.g. call from inside inside class/method:
# self.orderitem_set.all()
# call from outside the class:
# Order.orderitem_set.all()
orderitems = self.orderitem_set.all()
total = sum([item.get_total for item in orderitems])
return total
@property
def get_cart_items(self):
"""Total quantity of ordered items in cart"""
orderitems = self.orderitem_set.all()
total = sum([item.quantity for item in orderitems])
return total
@property
def shipping(self):
shipping = False
orderitems = self.orderitem_set.all()
for i in orderitems:
if i.product.digital is False:
shipping = True
return shipping
class OrderItem(models.Model):
# on deleting OrderItem, do not delete products or orders
product = models.ForeignKey(Product, on_delete=models.SET_NULL, null=True)
order = models.ForeignKey(Order, on_delete=models.SET_NULL, null=True)
quantity = models.IntegerField(default=0, null=True, blank=True)
date_added = models.DateTimeField(auto_now_add=True)
@property
def get_total(self):
"""Total of a particular item"""
# Total is a product of price and quantity
total = self.product.price * self.quantity
return total
class ShippingAddress(models.Model):
# want customers info even if order gets deleted
customer = models.ForeignKey(
Customer,
on_delete=models.SET_NULL,
null=True
)
order = models.ForeignKey(
Order,
on_delete=models.SET_NULL,
null=True
)
address = models.CharField(max_length=200, null=False)
city = models.CharField(max_length=200, null=False)
state = models.CharField(max_length=200, null=False)
zipcode = models.CharField(max_length=200, null=False)
country = models.CharField(max_length=200, null=False)
date_added = models.DateTimeField(auto_now_add=True)
def __str__(self):
return self.address
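`get_cart_total`, `get_cart_items`, and `shipping` just aggregate over the reverse relation `orderitem_set`. The same arithmetic without the ORM, with plain `(price, quantity, digital)` tuples standing in for OrderItem rows (values are made up):

```python
# (price, quantity, digital) stand-ins for OrderItem -> Product rows
items = [(9.99, 2, True), (4.50, 1, False), (2.00, 3, True)]

# get_cart_total: sum of price * quantity over the order's items
cart_total = sum(price * qty for price, qty, _ in items)
# get_cart_items: total quantity across items
cart_count = sum(qty for _, qty, _ in items)
# shipping: required as soon as any item is a physical (non-digital) product
needs_shipping = any(not digital for _, _, digital in items)

print(round(cart_total, 2), cart_count, needs_shipping)  # 30.48 6 True
```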
# file: test/tests_without_spec/test_missing_spec.py | repo: kovalewvladimir/falcon-openapi | license: MIT
import json
import os
import sys
from shutil import copytree
import pytest
sys.path.insert(0, '../..')
from falcon_openapi import OpenApiRouter # isort:skip
class TestRouter():
def test_missing_default(self):
with pytest.raises(FileNotFoundError):
OpenApiRouter()
def test_bad_file(self):
with pytest.raises(FileNotFoundError):
OpenApiRouter(file_path='kdjh.yaml')
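The raise-on-missing-file behavior checked here can also be expressed with only the standard library; `pytest.raises` is the pytest analogue of `assertRaises` below (a fresh temp directory is used so the path is guaranteed to be absent):

```python
import os
import tempfile
import unittest

class TestMissingFile(unittest.TestCase):
    def test_bad_path(self):
        # a path inside a brand-new temp dir cannot exist yet
        missing = os.path.join(tempfile.mkdtemp(), "kdjh.yaml")
        with self.assertRaises(FileNotFoundError):
            open(missing)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestMissingFile))
```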
# file: app/models.py | repo: edumorris/newser | license: MIT
class newsArticles:
'''
Class defining articles
'''
def __init__(self, source, author, title, description, url, image_url, publish_time, content):
self.source = source # Name of the source of news
self.author = author # Author of the news article
self.title = title # Title of the news article
self.description = description # Snippet of the news article
self.url = url # URL to the news article
self.image_url = image_url # URL for the news article image
self.publish_time = publish_time # Publication date of the news article
self.content = content # Content of the article
class newsSource:
'''
Class defining news sources
'''
    pass
# file: Python/Courses/Python-Tutorials.Telusko/01.Object-Oriented-Programming/13.Types-of-Methods.py | repo: shihab4t/Books-Code | license: Unlicense
# There are three type methods
# Instance methods
# Class methods
# Static methods
class Student:
school = "Telusko"
@classmethod
def get_school(cls):
return cls.school
@staticmethod
def info():
print("This is Student Class")
def __init__(self, m1, m2, m3):
self.m1 = m1
self.m2 = m2
self.m3 = m3
def avg(self):
return (self.m1 + self.m2 + self.m3) / 3
def get_m1(self):
return self.m1
def set_m1(self, value):
self.m1 = value
s1 = Student(34, 47, 32)
s2 = Student(89, 32, 12)
print(s1.avg(), s2.avg())
print(Student.get_school())
Student.info()
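The three method kinds differ only in their implicit first argument. A compact side-by-side on an illustrative class (not the `Student` class above):

```python
class Demo:
    label = "base"

    def instance_m(self):      # receives the instance
        return ("instance", self.label)

    @classmethod
    def class_m(cls):          # receives the class, even when called via an instance
        return ("class", cls.label)

    @staticmethod
    def static_m():            # receives nothing implicitly
        return ("static",)

d = Demo()
print(d.instance_m())   # ('instance', 'base')
print(d.class_m())      # ('class', 'base'), same as Demo.class_m()
print(Demo.static_m())  # ('static',)

class Child(Demo):
    label = "child"

print(Child.class_m())  # ('class', 'child'): cls is the subclass
```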
# file: app/university/migrations/0004_auto_20220121_1538.py | repo: gbr-mendes/ead-courses-api | license: MIT
# Generated by Django 3.2.11 on 2022-01-21 15:38
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('university', '0003_lesson'),
]
operations = [
migrations.AlterField(
model_name='employee',
name='hired_date',
field=models.DateField(default='2022-01-21'),
),
migrations.AlterField(
model_name='teacher',
name='hired_date',
field=models.DateField(default='2022-01-21'),
),
]
# file: dimensionality_reduction_functions.py | repo: rebeccawalters95/PathReducer | license: MIT
#!/usr/bin/env python
# coding: utf-8
from typing import Tuple
import numpy as np
import PathReducer.calculate_rmsd as rmsd
import pandas as pd
import math
import glob
import os
import sys
import ntpath
import MDAnalysis as mda
import PathReducer.plotting_functions as plotting_functions
from periodictable import formula
from sklearn import metrics
from sympy import solve, Symbol
def path_leaf(path):
head, tail = ntpath.split(path)
return tail or ntpath.basename(head)
def read_traj_file(*args, **kwargs) -> Tuple[str, np.ndarray, np.ndarray]:
"""
    Reads in a trajectory using MDAnalysis' Universe class; documentation and parameter details are at
    https://www.mdanalysis.org/docs/documentation_pages/core/universe.html#MDAnalysis.core.universe.Universe.
    A topology file is always required, but there are multiple ways of setting up a universe for a trajectory:
    u = Universe(topology, trajectory)          # read system from file(s)
    u = Universe(pdbfile)                       # read atoms and coordinates from PDB or GRO
    u = Universe(topology, [traj1, traj2, ...]) # read from a list of trajectories
    u = Universe(topology, traj1, traj2, ...)   # read from multiple trajectories
    The trajectory being read in should already be pruned (of explicit solvent, backbone residues, and anything
    else you don't want PCA to capture). The function returns a numpy array of all of the atom types of the
    system and a numpy array of the Cartesian coordinates of each atom for every frame.
:param topology: str (.pdb, .top, .gro etc)
:param coordinates: str (.dcd, .nc, .xyz etc)
:return extensionless_system_name
atom_list
cartesians
"""
u = mda.Universe(*args, **kwargs)
system_name = path_leaf(u.filename)
extensionless_system_name = os.path.splitext(system_name)[0]
n_frames = len(u.trajectory)
n_atoms = len(u.atoms)
cartesians = np.ndarray((n_frames, n_atoms, 3))
try:
atom_list = u.atoms.elements
except AttributeError:
atom_list = u.atoms.types
for frame_index, ts in enumerate(u.trajectory):
cartesians[frame_index] = ts.positions
return extensionless_system_name, atom_list, cartesians
def read_xyz_file(path):
""" Reads in an xyz file from path as a DataFrame. This DataFrame is then turned into a 3D array such that the
dimensions are (number of points) X (number of atoms) X 3 (Cartesian coordinates). The system name (based on the
filename), list of atoms in the system, and Cartesian coordinates are output.
:param path: path to xyz file to be read
:return extensionless_system_name: str
atom_list: numpy array
cartesians: numpy array
"""
system_name = path_leaf(path)
print("File being read is: %s" % system_name)
extensionless_system_name = os.path.splitext(system_name)[0]
data = pd.read_csv(path, header=None, delim_whitespace=True, names=['atom', 'X', 'Y', 'Z'])
n_atoms = int(data.loc[0][0])
n_lines_per_frame = int(n_atoms + 2)
data_array = np.array(data)
data_reshape = np.reshape(data_array, (int(data_array.shape[0]/n_lines_per_frame), n_lines_per_frame,
data_array.shape[1]))
cartesians = data_reshape[:, 2::, 1::].astype(np.float)
atom_list = data_reshape[0, 2::, 0]
return extensionless_system_name, atom_list, cartesians
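The reshape trick above relies on every frame occupying `n_atoms + 2` lines: the atom-count line, a comment line, then one row per atom. A dependency-free parse of a small two-frame xyz string shows that layout (coordinates are made up):

```python
XYZ = """\
2
frame 0
H 0.0 0.0 0.0
H 0.0 0.0 0.74
2
frame 1
H 0.0 0.0 0.0
H 0.0 0.0 0.80
"""

def parse_xyz(text):
    lines = text.splitlines()
    n_atoms = int(lines[0])
    block = n_atoms + 2                      # count line + comment + atom rows
    frames, atoms = [], []
    for start in range(0, len(lines), block):
        frame = []
        for row in lines[start + 2:start + block]:
            sym, x, y, z = row.split()
            frame.append((float(x), float(y), float(z)))
            if not frames:                   # atom symbols only from frame 0
                atoms.append(sym)
        frames.append(frame)
    return atoms, frames

atoms, frames = parse_xyz(XYZ)
print(atoms)            # ['H', 'H']
print(frames[1][1][2])  # 0.8
```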
def remove_atoms_by_type(atom_types_to_remove, atom_list, cartesians):
"""
Removes specific atoms if they are not wanted for PCA
:param atom_list: list of atoms in the structure
:param cartesians: cartesian coordinates of each frame
:return: cartesian coordinates of each frame with specific atom types removed
"""
matches_indexes = [i for i, x in enumerate(atom_list) if x in atom_types_to_remove]
cartesians_sans_atoms = np.delete(cartesians, list(matches_indexes), axis=1)
atom_list_sans_atoms = np.delete(atom_list, list(matches_indexes), axis=0)
return atom_list_sans_atoms, cartesians_sans_atoms
def calculate_velocities(cartesians, timestep=1):
"""
Calculate velocities at each timestep given Cartesian coordinates. Velocities at the first and last point are
extrapolated.
:param cartesians: Cartesian coordinates along trajectory
:param timestep: time step between frames in units of fs, default=1
:return: velocities
"""
velocities = []
for i in range(0, len(cartesians)):
if i == 0:
velocity = (cartesians[i + 1] - cartesians[i]) / timestep
elif i == len(cartesians) - 1:
velocity = (cartesians[i] - cartesians[i - 1]) / timestep
else:
            velocity = (cartesians[i + 1] - cartesians[i - 1]) / (2 * timestep)
velocities.append(velocity)
return velocities
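On a scalar trajectory the forward/central/backward scheme is easy to verify by hand: for a position that grows linearly, every velocity should come out identical. A plain-float sketch of the array logic (illustrative data):

```python
def finite_diff(xs, timestep=1.0):
    v = []
    for i in range(len(xs)):
        if i == 0:                    # forward difference at the first frame
            v.append((xs[i + 1] - xs[i]) / timestep)
        elif i == len(xs) - 1:        # backward difference at the last frame
            v.append((xs[i] - xs[i - 1]) / timestep)
        else:                         # central difference in the interior
            v.append((xs[i + 1] - xs[i - 1]) / (2 * timestep))
    return v

positions = [0.0, 1.0, 2.0, 3.0]      # linear motion sampled every 0.5 fs
print(finite_diff(positions, timestep=0.5))  # [2.0, 2.0, 2.0, 2.0]
```

Note the parentheses in the central difference: `/ (2 * timestep)` divides by the full interval, whereas `/ 2 * timestep` would multiply by the timestep instead.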
def calculate_momenta(velocities, atoms):
"""
    :param velocities: velocities at each frame of the trajectory
    :param atoms: list of atom symbols, used to look up atomic masses
    :return: momenta
"""
velocities = np.array(velocities)
atoms = np.array(atoms)
atom_masses = np.array([formula(atom).mass for atom in atoms])
momenta = velocities * atom_masses[np.newaxis, :, np.newaxis]
return momenta
def set_atom_one_to_origin(coordinates):
coordinates_shifted = coordinates - coordinates[:, np.newaxis, 0]
return coordinates_shifted
def mass_weighting(atoms, cartesians):
cartesians = np.array(cartesians)
atoms = np.array(atoms)
atom_masses = [formula(atom).mass for atom in atoms]
weighting = np.sqrt(atom_masses)
mass_weighted_cartesians = cartesians * weighting[np.newaxis, :, np.newaxis]
return mass_weighted_cartesians
def remove_mass_weighting(atoms, coordinates):
coordinates = np.array(coordinates)
atoms = np.array(atoms)
atom_masses = [formula(atom).mass for atom in atoms]
weighting = np.sqrt(atom_masses)
unmass_weighted_coords = coordinates / weighting[np.newaxis, :, np.newaxis]
return unmass_weighted_coords
def generate_distance_matrices(coordinates):
""" Generates distance matrices for each structure.
"""
coordinates = np.array(coordinates)
d2 = np.sum((coordinates[:, :, None] - coordinates[:, None, :]) ** 2, axis=3)
return d2
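The broadcasting trick above expands an (n_frames, n_atoms, 3) array into a per-frame squared-distance matrix. A small self-contained worked example:

```python
import numpy as np

# One frame with two atoms forming a 3-4-5 right triangle with the origin.
coords = np.array([[[0.0, 0.0, 0.0],
                    [3.0, 4.0, 0.0]]])
# (frames, atoms, 1, 3) - (frames, 1, atoms, 3) -> (frames, atoms, atoms, 3)
d2 = np.sum((coords[:, :, None] - coords[:, None, :]) ** 2, axis=3)
print(d2[0])  # [[ 0. 25.]
              #  [25.  0.]]
```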
def generate_dihedral_matrices(coordinates):
return coordinates
def generate_and_reshape_ds_big_structures(coordinates):
""" Generates matrix of pairwise distances, which includes pairwise distances for each structure.
:param coordinates:
"""
coordinates = np.array(coordinates)
    num_atoms = int(coordinates.shape[1])
    d_re = np.zeros((coordinates.shape[0], int(num_atoms * (num_atoms - 1) / 2)))
for i in range(coordinates.shape[0]):
d2 = np.square(metrics.pairwise.euclidean_distances(coordinates[i]))
x = d2[0].shape[0]
dint_re = d2[np.triu_indices(x, k=1)]
d_re[i] = dint_re
return d_re
def reshape_ds(d):
""" Takes only the upper triangle of the distance matrices and reshapes them into 1D arrays.
"""
d_re = []
x = d[0][0].shape[0]
for dint in d:
dint_re = dint[np.triu_indices(x, k=1)]
d_re.append(dint_re)
d_re = np.asarray(d_re)
return d_re
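The `np.triu_indices` call used above flattens the strict upper triangle of each distance matrix in row-major pair order. A minimal illustration:

```python
import numpy as np

d = np.array([[0.0, 1.0, 4.0],
              [1.0, 0.0, 9.0],
              [4.0, 9.0, 0.0]])
# Strict upper triangle (k=1), flattened row-major: pairs (0,1), (0,2), (1,2)
upper = d[np.triu_indices(d.shape[0], k=1)]
print(upper)  # [1. 4. 9.]
```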
def vector_to_matrix(v):
""" Converts a representation from 1D vector to 2D square matrix. Slightly altered from rmsd package to disregard
zeroes along diagonal of matrix.
:param v: 1D input representation.
:type v: numpy array
:return: Square matrix representation.
:rtype: numpy array
"""
    if np.sqrt(8 * v.shape[0] + 1) != int(np.sqrt(8 * v.shape[0] + 1)):
        raise ValueError("Cannot make a square matrix: vector length is not a triangular number.")
n = v.shape[0]
w = ((-1 + int(np.sqrt(8 * n + 1))) // 2) + 1
m = np.zeros((w, w))
index = 0
for i in range(w):
for j in range(w):
if i > j - 1:
continue
m[i, j] = v[index]
m[j, i] = m[i, j]
index += 1
return m
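An equivalent, vectorized inverse of the upper-triangle flattening (a sketch doing the same thing as the loop above, via `np.triu_indices`):

```python
import numpy as np

def vec_to_square(v):
    # A length-m vector fills the strict upper triangle of a w x w symmetric
    # matrix with zero diagonal, where m = w*(w-1)/2, i.e. w = (1+sqrt(1+8m))/2.
    m = v.shape[0]
    w = int((1 + np.sqrt(1 + 8 * m)) // 2)
    out = np.zeros((w, w))
    out[np.triu_indices(w, k=1)] = v
    return out + out.T

print(vec_to_square(np.array([1.0, 4.0, 9.0])))
```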
def distance_matrix_to_coords(v):
""" Converts a (2D square) distance matrix representation of a structure to Cartesian coordinates (first 3 columns
correspond to 3D xyz coordinates) via a Gram matrix.
:param v: 1D vector, numpy array
:return: 3D Cartesian coordinates, numpy array
"""
d = vector_to_matrix(v)
d_one = np.reshape(d[:, 0], (d.shape[0], 1))
m = (-0.5) * (d - np.matmul(np.ones((d.shape[0], 1)), np.transpose(d_one)) - np.matmul(d_one,
np.ones((1, d.shape[0]))))
values, vectors = np.linalg.eig(m)
idx = values.argsort()[::-1]
values = values[idx]
vectors = vectors[:, idx]
assert np.allclose(np.dot(m, vectors), values * vectors)
coords = np.dot(vectors, np.diag(np.sqrt(values)))
# Only taking first three columns as Cartesian (xyz) coordinates
coords = np.asarray(coords[:, 0:3])
return coords
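The Gram-matrix reconstruction above is a form of classical multidimensional scaling. A self-contained sketch using the standard double-centering variant (the function above centers on the first point instead, which serves the same purpose):

```python
import numpy as np

# Classical MDS: recover coordinates (up to rotation/translation/reflection)
# from a squared-distance matrix D via the Gram matrix B = -0.5 * J D J.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
D = np.sum((pts[:, None] - pts[None, :]) ** 2, axis=2)
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
B = -0.5 * J @ D @ J
vals, vecs = np.linalg.eigh(B)
idx = np.argsort(vals)[::-1][:3]             # three largest eigenvalues -> xyz
rec = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
D_rec = np.sum((rec[:, None] - rec[None, :]) ** 2, axis=2)
print(np.allclose(D_rec, D))  # True: pairwise distances are reproduced
```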
def pca_dr(matrix):
"""
    Does PCA on the input matrix, keeping all principal components. Outputs information used to later generate xyz files
in the reduced dimensional space and also for the function that filters out distances between key atoms and their
neighbors.
:param matrix: array
:return: matrix_pca: input data (in matrix) projected onto covariance matrix eigenvectors (PCs)
matrix_pca_fit: fit used to transform new data into reduced dimensional space
pca.components_: eigenvectors of covariance matrix
pca.mean_: mean of the original dataset (in matrix)
pca.explained_variance_: amount of variance described by each PC
"""
matrix = pd.DataFrame(matrix)
pca = decomposition.PCA()
matrix_pca_fit = pca.fit(matrix)
matrix_pca = pca.transform(matrix)
return matrix_pca, matrix_pca_fit, pca.components_, pca.mean_, pca.explained_variance_
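A plain-NumPy sketch of what the scikit-learn attributes returned above contain (computed by hand via SVD, to clarify the later inverse transforms):

```python
import numpy as np

# components_ corresponds to the right singular vectors of the centered data,
# mean_ to the column means, explained_variance_ to s**2 / (n_samples - 1).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.1]])
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
scores = (X - mean) @ Vt.T                      # same role as pca.transform(X)
explained_variance = s ** 2 / (X.shape[0] - 1)  # descending, like sklearn's
# The inverse transform (used later to rebuild structures) recovers X exactly
# when all components are kept:
print(np.allclose(scores @ Vt + mean, X))  # True
```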
def lda_dr(matrix, data_labels):
"""
Does LDA (Linear Discriminant Analysis) on input matrix with specified number of dimensions. Outputs information
used to later generate xyz files in the reduced dimensional space and also for the function that filters out
distances between key atoms and their neighbors.
:param matrix: array
:param data_labels: array, separates data into classes to differentiate
    :return: matrix_lda: input data (in matrix) projected onto the LDA discriminant axes
             matrix_lda_fit: fit used to transform new data into reduced dimensional space
             lda.coef_: weight vectors
             lda.means_: means of each class of the original dataset
             lda.explained_variance_ratio_: amount of variance described by each discriminant
"""
matrix = pd.DataFrame(matrix)
lda = discriminant_analysis.LinearDiscriminantAnalysis()
matrix_lda_fit = lda.fit(matrix, data_labels)
matrix_lda = lda.transform(matrix)
return matrix_lda, matrix_lda_fit, lda.coef_, lda.means_, lda.explained_variance_ratio_
def calc_mean_distance_vector(d2_matrix):
return np.mean(d2_matrix, axis=0)
def filter_important_distances(upper_tri_d2_matrices, num_dists=75000):
num_points = upper_tri_d2_matrices.shape[0]
vec_length = upper_tri_d2_matrices.shape[1]
num_atoms = calc_num_atoms(vec_length)
variances = []
atom_indexes = {}
for k in range(vec_length):
variances.append(np.var(upper_tri_d2_matrices[:, k]))
atom1, atom2 = calc_ij(k, num_atoms)
atom_indexes[k] = atom1, atom2
important_distances_matrix = np.zeros((num_points, num_dists))
top_vars_indexes = top_values_indexes(variances, num_dists)
i = 0
selected_dist_atom_indexes = {}
for index in top_vars_indexes:
important_distances_matrix[:, i] = upper_tri_d2_matrices[:, index]
selected_dist_atom_indexes[i] = atom_indexes[index], index
i += 1
return important_distances_matrix, selected_dist_atom_indexes
def calc_num_atoms(vec_length):
"""
Calculates number of atoms in a system based on the length of the vector generated by flattening the upper triangle
of its interatomic distance matrix.
:param vec_length: length of interatomic distance matrix vector
:return: num_atoms: int, number of atoms in the system
"""
n = Symbol('n', positive=True)
answers = solve(n * (n - 1) / 2 - vec_length, n)
num_atoms = int(answers[0])
return num_atoms
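The relation L = n*(n-1)/2 inverts in closed form, n = (1 + sqrt(1 + 8L)) / 2, which avoids the symbolic `sympy` solve used above. A standalone sketch:

```python
import math

def num_atoms_from_vec_length(vec_length):
    # Positive root of n*(n-1)/2 == vec_length; math.isqrt keeps it exact
    # for integer inputs.
    return (1 + math.isqrt(1 + 8 * vec_length)) // 2

print(num_atoms_from_vec_length(6))    # 4 atoms  -> 6 pairwise distances
print(num_atoms_from_vec_length(45))   # 10 atoms -> 45 pairwise distances
```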
def set_unimportant_distance_weights_to_zero(components, selected_dist_atom_indexes, num_atoms):
num_dists = int((num_atoms * (num_atoms - 1)) / 2)
num_points = components.shape[0]
components_all_distances = np.zeros((num_points, num_dists))
distance_vector_indexes = list(pd.DataFrame(list(selected_dist_atom_indexes.values()))[1])
for i in range(len(distance_vector_indexes)):
components_all_distances[:, distance_vector_indexes[i]] = components[:, i]
return components_all_distances
def generate_PC_matrices_selected_distances(n_dim, matrix_reduced, components, mean, selected_dist_atom_indexes,
num_atoms):
num_points = matrix_reduced.shape[0]
num_dists = int((num_atoms * (num_atoms - 1)) / 2)
PCs_separate = []
for i in range(0, n_dim):
PCi = np.zeros((num_points, num_dists))
PCi_selected = np.dot(matrix_reduced[:, i, None], components[None, i, :]) + mean
for j in range(len(selected_dist_atom_indexes)):
distance_location = selected_dist_atom_indexes[j][1]
PCi[:, distance_location] = PCi_selected[:, j]
PCs_separate.append(PCi)
PCs_combined = np.zeros((num_points, num_dists))
PCs_combined_selected = np.dot(matrix_reduced, components) + mean
for j in range(len(selected_dist_atom_indexes)):
distance_location = selected_dist_atom_indexes[j][1]
PCs_combined[:, distance_location] = PCs_combined_selected[:, j]
PCs_separate = np.array(PCs_separate)
PCs_combined = np.array(PCs_combined)
return PCs_separate, PCs_combined
def inverse_transform_of_pcs(n_dim, matrix_reduced, components, mean):
"""
Calculates the inverse transform of the PCs to see what the PCs correspond to in terms of geometric changes.
Different than inverse_transform_of_pcs_as_normal_modes function because this function shows only structures that
are spanned by the input data itself (i.e., uses the PC scores of each structure).
:param n_dim: int, number of principal components specified to define reduced dimensional space
:param matrix_reduced: array, PCs in reduced dimensional space
:param components: array, eigenvectors of the covariance matrix of the input data
:param mean: mean structure of the input data
:return: PCs_separate, PCs_combined as arrays
"""
PCs_separate = []
for i in range(0, n_dim):
PCi = np.dot(matrix_reduced[:, i, None], components[None, i, :]) + mean
PCs_separate.append(PCi)
PCs_combined = np.dot(matrix_reduced[:, 0:n_dim], components[0:n_dim, :]) + mean
PCs_separate = np.array(PCs_separate)
PCs_combined = np.array(PCs_combined)
return PCs_separate, PCs_combined
def inverse_transform_of_pcs_as_normal_modes(n_dim, matrix_reduced, components, mean, alpha=0.40):
"""
Adds incremental amounts of each eigenvector to the mean structure to show the effect of individual eigenvectors
on molecular structure. Different than the inverse_transform_of_pcs function as this function does NOT take into
account the space spanned by the original input data, but rather distorts the geometry of the mean structure in a
linear fashion (i.e., how visualization of a normal mode appears in GaussView).
:param n_dim: int, number of principal components specified to define reduced dimensional space
:param components: array, eigenvectors of the covariance matrix of the input data
:param mean: mean structure of the input data
:param alpha: the multiple of each eigenvector to add to the mean structure
:return: PCs_separate, PCs_combined as arrays
"""
PCs_separate = []
for i in range(0, n_dim):
PCi = np.dot(alpha * (np.arange(-20, 21))[:, None], components[None, i, :]) + mean
PCs_separate.append(PCi)
multiplier = np.zeros((len(np.arange(-20, 21)), n_dim))
for i in range(n_dim):
multiplier[:, i] = np.arange(-20, 21)
PCs_combined = np.dot(alpha * multiplier, components[0:n_dim, :]) + mean
PCs_separate = np.array(PCs_separate)
PCs_combined = np.array(PCs_combined)
return PCs_separate, PCs_combined
def calc_ij(k, n):
"""
Calculate indexes i and j of a square symmetric matrix given upper triangle vector index k and matrix side length n.
:param k: vector index
:param n: side length of resultant matrix M
:return: i, j as ints
"""
i = n - 2 - math.floor((np.sqrt(-8 * k + 4 * n * (n - 1) - 7) / 2) - 0.5)
j = k + i + 1 - (n * (n - 1) / 2) + ((n - i) * ((n - i) - 1) / 2)
return int(i), int(j)
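The closed-form index arithmetic above maps a flattened strict-upper-triangle index k back to matrix indexes (i, j). A self-contained copy checked against `np.triu_indices`:

```python
import math
import numpy as np

def upper_tri_ij(k, n):
    # Same closed-form inversion as calc_ij above, reproduced here so the
    # check is standalone.
    i = n - 2 - math.floor((math.sqrt(-8 * k + 4 * n * (n - 1) - 7) / 2) - 0.5)
    j = k + i + 1 - (n * (n - 1) // 2) + ((n - i) * ((n - i) - 1) // 2)
    return int(i), int(j)

n = 5
rows, cols = np.triu_indices(n, k=1)
assert all(upper_tri_ij(k, n) == (rows[k], cols[k]) for k in range(len(rows)))
print("upper_tri_ij matches np.triu_indices for n =", n)
```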
def top_values_indexes(a, n):
"""
Determine indexes of n top values of matrix a
:param a: matrix
:param n: integer, number of top values desired
:return: sorted list of indexes of n top values of a
"""
return np.argsort(a)[::-1][:n]
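The argsort idiom above in isolation:

```python
import numpy as np

a = np.array([0.1, 5.0, 2.0, 7.0])
top2 = np.argsort(a)[::-1][:2]   # sort ascending, reverse, take the first n
print(top2)  # [3 1] -- indexes of the two largest values, largest first
```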
def kabsch(coordinates):
"""Kabsch algorithm to get orientation of axes that minimizes RMSD. All structures will be aligned to the first
structure in the trajectory.
:param coordinates: coordinates along trajectory to be aligned, list or array
"""
coordinates = np.array(coordinates)
coordinates[0] -= rmsd.centroid(coordinates[0])
coords_kabsch = []
for i in range(len(coordinates)):
coordinates[i] -= rmsd.centroid(coordinates[i])
coords_kabschi = rmsd.kabsch_rotate(coordinates[i], coordinates[0])
coords_kabsch.append(coords_kabschi)
return np.array(coords_kabsch)
def align_to_original_traj(coords, original_traj_coords):
"""Kabsch algorithm to get orientation of axes that minimizes RMSD (to avoid rotations in visualization). All
structures will be aligned to the first structure in the original trajectory.
:param coords: coordinates along trajectory to be aligned, list or array
:param original_traj_coords: coordinates along original trajectory
"""
coords = np.array(coords)
coords_aligned = []
original_traj_coords[0] -= rmsd.centroid(original_traj_coords[0])
for i in range(len(coords)):
coords[i] -= rmsd.centroid(coords[i])
coords_i = rmsd.kabsch_rotate(coords[i], original_traj_coords[0])
coords_aligned.append(coords_i)
return np.array(coords_aligned)
def chirality_test(coords, stereo_atoms):
""" Determines chirality of structure so it is consistent throughout the generated reduced dimensional
IRC/trajectory.
:param coords: xyz coordinates along IRC or trajectory
:param stereo_atoms: list of 4 atom numbers that represent groups around a chiral center
:type coords: numpy array
:type stereo_atoms: list
"""
a1 = stereo_atoms[0]
a2 = stereo_atoms[1]
a3 = stereo_atoms[2]
a4 = stereo_atoms[3]
signs = []
for i in range(len(coords)):
m = np.ones((4, 4))
m[0, 0:3] = coords[i][a1 - 1]
m[1, 0:3] = coords[i][a2 - 1]
m[2, 0:3] = coords[i][a3 - 1]
m[3, 0:3] = coords[i][a4 - 1]
if np.linalg.det(m) < 0:
signs.append(-1)
elif np.linalg.det(m) > 0:
signs.append(1)
elif np.linalg.det(m) == 0:
signs.append(0)
negs = np.where(np.array(signs) < 0)
poss = np.where(np.array(signs) > 0)
zeros = np.where(np.array(signs) == 0)
return negs, poss, zeros, signs
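The determinant test above in isolation: the 4x4 matrix whose rows are the four substituent coordinates padded with 1 has a determinant proportional to a signed tetrahedron volume, which flips sign between mirror images. A minimal sketch:

```python
import numpy as np

def chirality_sign(p1, p2, p3, p4):
    # Sign of the signed volume spanned by the four substituent positions.
    m = np.ones((4, 4))
    m[0, :3], m[1, :3], m[2, :3], m[3, :3] = p1, p2, p3, p4
    return int(np.sign(np.linalg.det(m)))

a = chirality_sign([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1])
b = chirality_sign([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, -1])
print(a, b)  # opposite signs for mirror-image arrangements
```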
def chirality_changes(reconstructed_coordinates, stereo_atoms, original_structure_signs):
""" Determines chirality of structure along original trajectory and reconstructed reduced dimensional trajectory
and switches inconsistencies along reduced dimensional IRC/trajectory.
:param reconstructed_coordinates: coordinates of trajectory in the reduced dimensional space
:param stereo_atoms: list of 4 indexes of atoms surrounding stereogenic center
:param original_structure_signs: signs (positive or negative) that represent chirality at given point along original
trajectory, numpy array
:return: correct_chirality_coordinates: coordinates with the chirality of each structure consistent with the
original coordinates (based on original_structure_signs), array
"""
    neg, pos, zero, signs_reconstructed = chirality_test(reconstructed_coordinates, stereo_atoms)
correct_chirality_coordinates = reconstructed_coordinates
for i in range(len(original_structure_signs)):
if original_structure_signs[i] == 0:
            # If the molecule begins planar but the PC reconstructions are not, keep chirality consistent along the
            # reconstructed trajectory
if i > 0 and signs_reconstructed[i] != signs_reconstructed[0]:
correct_chirality_coordinates[i] = -correct_chirality_coordinates[i]
elif signs_reconstructed[i] != original_structure_signs[i]:
correct_chirality_coordinates[i] = -correct_chirality_coordinates[i]
return correct_chirality_coordinates
def chirality_changes_normal_modes(reconstructed_coordinates, stereo_atoms, original_structure_signs):
""" Determines chirality of structure along original trajectory and reconstructed reduced dimensional trajectory
and switches inconsistencies along reduced dimensional IRC/trajectory.
:param reconstructed_coordinates: coordinates of trajectory in the reduced dimensional space
:param stereo_atoms: list of 4 indexes of atoms surrounding stereogenic center
:param original_structure_signs: signs (positive or negative) that represent chirality at given point along original
trajectory, numpy array
:return: correct_chirality_coordinates: coordinates with the chirality of each structure consistent with the
original coordinates (based on original_structure_signs), array
"""
    neg, pos, zero, signs_reconstructed = chirality_test(reconstructed_coordinates, stereo_atoms)
correct_chirality_coordinates = reconstructed_coordinates
for i in range(len(reconstructed_coordinates)):
if original_structure_signs[0] == 0:
if i > 0 and signs_reconstructed[i] != signs_reconstructed[0]:
correct_chirality_coordinates[i] = -correct_chirality_coordinates[i]
elif signs_reconstructed[i] != original_structure_signs[0]:
correct_chirality_coordinates[i] = -correct_chirality_coordinates[i]
return correct_chirality_coordinates
def make_pc_xyz_files(output_directory, title, atoms, coordinates):
""" Save principal coordinates as xyz files PC[n].xyz to output directory.
:param output_directory: output directory to store xyz files, str
:param atoms: atoms in input trajectory, list
:param title: name of the input system, str
:param coordinates: xyz coordinates of structures along PCi, list or numpy array
:return: None
"""
for k in range(np.array(coordinates).shape[0]):
if np.array(coordinates).shape[0] == 1:
f = open(os.path.join(output_directory, '%s_all_PCs.xyz' % title), 'w')
else:
f = open(os.path.join(output_directory, '%s_PC%s.xyz' % (title, k + 1)), 'w')
for i in range(len(coordinates[k])):
a = coordinates[k][i]
a = a.tolist()
b = []
for j in range(len(a)):
a[j] = ['%.5f' % x for x in a[j]]
a[j].insert(0, atoms[j])
b.append(a[j])
f.write('%d' % len(atoms) + '\n')
f.write('%s point %i' % (title, i + 1) + '\n')
f.write('%s' % str(np.asarray(b)).replace("[", "").replace("]", "").replace("'", "") + '\n')
f.close()
def print_prop_of_var_to_txt(values, system_name, directory):
"""
Print list of proportions of variance explained by each principal component to a text file.
    :param values: array or list, variance explained by each principal component, in descending order
:param system_name: name of the system, used for the text file name
:param directory: output directory to put the output text file
:return: None
"""
normalized_values = values / np.sum(values)
df = pd.DataFrame({'Principal Component': pd.Series([i + 1 for i in range(len(values))]),
'Singular Value': values,
'Prop. of Variance': normalized_values,
'Cumul. Prop. of Var.': np.cumsum(normalized_values)})
pd.set_option('display.expand_frame_repr', False)
print(df.head())
df.to_csv(os.path.join(directory, system_name + '_prop_of_var.txt'), sep='\t', index=None)
def print_distance_weights_to_files(directory, n_dim, system_name, pca_components, num_atoms,
selected_atom_indexes=None):
for n in range(n_dim):
if selected_atom_indexes:
distance_vector_indexes = list(pd.DataFrame(list(selected_atom_indexes.values()))[1])
else:
distance_vector_indexes = range(len(pca_components[n]))
d = []
for k, l in zip(distance_vector_indexes, range(len(pca_components[n]))):
i, j = calc_ij(k, num_atoms)
coeff = pca_components[n][l]
d.append({'atom 1': i, 'atom 2': j, 'Coefficient of Distance': coeff})
d_df = pd.DataFrame(d)
sorted_d = d_df.reindex(d_df['Coefficient of Distance'].abs().sort_values(ascending=False).index)
output_path = os.path.join(directory, system_name + '_PC%s_components.txt' % (n + 1))
sorted_d.to_csv(output_path, sep='\t', index=None)
def print_distance_weights_to_files_select_atom_indexes(atom_indexes, n_dim, pca_components, system_name, directory):
for n in range(n_dim):
d = []
for k in range(len(pca_components[n])):
coeff = pca_components[n][k]
d.append({'atom 1': atom_indexes[k][0], 'atom 2': atom_indexes[k][1], 'Coefficient of Distance': coeff})
d_df = pd.DataFrame(d)
sorted_d = d_df.reindex(d_df['Coefficient of Distance'].abs().sort_values(ascending=False).index)
sorted_d.to_csv(os.path.join(directory, system_name + '_PC%s_components.txt' % (n + 1)), sep='\t', index=None)
def print_distance_weights_to_files_weighted(directory, n_dim, system_name, pca_components, pca_values, num_atoms,
display=False):
for n in range(n_dim):
d = []
for k in range(len(pca_components[n])):
i, j = calc_ij(k, num_atoms)
coeff = (pca_values[n] / sum(pca_values)) * pca_components[n][k]
d.append({'atom 1': i, 'atom 2': j, 'Coefficient of Distance': coeff})
d_df = pd.DataFrame(d)
sorted_d = d_df.reindex(d_df['Coefficient of Distance'].abs().sort_values(ascending=False).index)
sorted_d.to_csv(os.path.join(directory, system_name + '_PC%s_components_weighted.txt' % (n + 1)), sep='\t',
index=None)
if display:
print("PC%s" % (n + 1))
print(sorted_d)
def transform_new_data(new_xyz_file_path, output_directory, n_dim, pca_fit, pca_components, pca_mean,
original_traj_coords, input_type, stereo_atoms=[1, 2, 3, 4], mw=False, remove_atom_types=None,
selected_atom_indexes=None):
if input_type == "Cartesians":
new_system_name, components_df = transform_new_data_cartesians(new_xyz_file_path, output_directory, n_dim,
pca_fit, pca_components, pca_mean,
original_traj_coords, mw=mw,
remove_atom_types=remove_atom_types)
elif input_type == "Distances":
if selected_atom_indexes:
new_system_name, components_df = transform_new_data_only_top_distances(new_xyz_file_path, output_directory,
n_dim,
pca_fit, pca_components, pca_mean,
selected_atom_indexes=selected_atom_indexes,
stereo_atoms=stereo_atoms, mw=mw,
remove_atom_types=remove_atom_types)
else:
new_system_name, components_df = transform_new_data_distances(new_xyz_file_path, output_directory, n_dim,
pca_fit, pca_components, pca_mean,
stereo_atoms=stereo_atoms, mw=mw,
remove_atom_types=remove_atom_types)
    else:
        raise ValueError("Please specify input_type=\"Cartesians\" or input_type=\"Distances\".")
return new_system_name, components_df
def transform_new_data_cartesians(new_trajectory_file_path, output_directory, n_dim, pca_fit, pca_components, pca_mean,
original_traj_coords, mw=False, remove_atom_types=None, topology=None):
"""
Takes as input a new trajectory (xyz file) for a given system for which dimensionality reduction has already been
conducted and transforms this new data into the reduced dimensional space. Generates a plot, with the new data atop
the "trained" data, and generates xyz files for the new trajectories represented by the principal components.
:param new_trajectory_file_path: new input to dimensionality reduction (xyz file location), str
:param output_directory: output directory, str
:param n_dim: number of dimensions of the reduced dimensional space, int
:param pca_fit: fit from PCA on training data
:param pca_components: components from PCA on training data, array
:param pca_mean: mean of input data to PCA (mean structure as coords or distances), array
:param original_traj_coords: coordinates of the trajectory that the reduced dimensional space was trained on
    :param mw: whether coordinates should be mass-weighted prior to PCA, bool
"""
print("\nTransforming %s into reduced dimensional representation..." % new_trajectory_file_path)
new_system_name, atoms, coordinates = read_traj_file(new_trajectory_file_path)
if remove_atom_types is not None:
atoms, coordinates = remove_atoms_by_type(remove_atom_types, atoms, coordinates)
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("\nResults for %s input will be stored in %s" % (new_trajectory_file_path, output_directory))
# Determining names of output directories/files
file_name_end = "_Cartesians"
# Align structures using Kabsch algorithm so rotations don't affect PCs
aligned_original_traj_coords = kabsch(original_traj_coords)
coords_for_analysis = align_to_original_traj(coordinates, aligned_original_traj_coords)
if mw is True:
file_name_end = file_name_end + "_MW"
mass_weighted_coords = mass_weighting(atoms, coords_for_analysis)
coords_for_analysis = mass_weighted_coords
    else:
        file_name_end = file_name_end + "_noMW"
coords_for_analysis = np.reshape(coords_for_analysis, (coords_for_analysis.shape[0],
coords_for_analysis.shape[1] *
coords_for_analysis.shape[2]))
components = pca_fit.transform(coords_for_analysis)
components_df = pd.DataFrame(components)
PCs_separate = []
for i in range(0, n_dim):
PCi = np.dot(components[:, i, None], pca_components[None, i, :]) + pca_mean
PCs_separate.append(PCi)
PCs_combined = np.dot(components, pca_components) + pca_mean
PCs_separate = np.array(PCs_separate)
PCs_combined = np.array(PCs_combined)
# Reshape n x 3N x 1 arrays into n x N x 3 arrays
PCs_separate = np.reshape(PCs_separate, (PCs_separate.shape[0], PCs_separate.shape[1],
int(PCs_separate.shape[2] / 3), 3))
PCs_combined = np.reshape(PCs_combined, (1, PCs_combined.shape[0], int(PCs_combined.shape[1] / 3), 3))
if mw is True:
# Remove mass-weighting of coordinates
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms, PCs_separate[i])
for i in range(n_dim)]
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms, PCs_combined)
else:
no_mass_weighting_PCs_separate = PCs_separate
no_mass_weighting_PCs_combined = PCs_combined
aligned_PCs_separate = no_mass_weighting_PCs_separate
aligned_PCs_combined = no_mass_weighting_PCs_combined
make_pc_xyz_files(output_directory, new_system_name + file_name_end, atoms, aligned_PCs_separate)
make_pc_xyz_files(output_directory, new_system_name + file_name_end, atoms, aligned_PCs_combined)
return new_system_name, components_df
def transform_new_data_distances(new_trajectory_file_path, output_directory, n_dim, pca_fit, pca_components, pca_mean,
stereo_atoms=[1, 2, 3, 4], mw=False, remove_atom_types=None):
"""
Takes as input a new trajectory (xyz file) for a given system for which dimensionality reduction has already been
conducted and transforms this new data into the reduced dimensional space. Generates a plot, with the new data atop
the "trained" data, and generates xyz files for the new trajectories represented by the principal components.
:param new_trajectory_file_path: new input to dimensionality reduction (xyz file location), str
:param output_directory: output directory, str
:param n_dim: number of dimensions of the reduced dimensional space, int
:param pca_fit: fit from PCA on training data
:param pca_components: components from PCA on training data, array
:param pca_mean: mean of input data to PCA (mean structure as coords or distances), array
    :param stereo_atoms: indexes of 4 atoms surrounding stereogenic center, list of ints
    :param mw: whether coordinates should be mass-weighted prior to PCA, bool
"""
print("\nTransforming %s into reduced dimensional representation..." % new_trajectory_file_path)
new_system_name, atoms, coordinates = read_traj_file(new_trajectory_file_path)
if remove_atom_types is not None:
atoms, coordinates = remove_atoms_by_type(remove_atom_types, atoms, coordinates)
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("\nResults for %s input will be stored in %s" % (new_trajectory_file_path, output_directory))
# Determining names of output directories/files
file_name_end = "_Distances"
if mw is True:
file_name_end = file_name_end + "_MW"
coordinates_shifted = set_atom_one_to_origin(coordinates)
mass_weighted_coords = mass_weighting(atoms, coordinates_shifted)
coords_for_analysis = mass_weighted_coords
else:
file_name_end = file_name_end + "_noMW"
coords_for_analysis = coordinates
negatives, positives, zeroes, all_signs = chirality_test(coordinates, stereo_atoms)
d2 = generate_distance_matrices(coords_for_analysis)
coords_for_analysis = reshape_ds(d2)
components = pca_fit.transform(coords_for_analysis)
components_df = pd.DataFrame(components)
PCs_separate = []
for i in range(0, n_dim):
PCi = np.dot(components[:, i, None], pca_components[None, i, :]) + pca_mean
PCs_separate.append(PCi)
PCs_combined = np.dot(components, pca_components) + pca_mean
PCs_separate = np.array(PCs_separate)
PCs_combined = np.array(PCs_combined)
# Turning distance matrix representations of structures back into Cartesian coordinates
PCs_separate = [[distance_matrix_to_coords(PCs_separate[i][k])
for k in range(PCs_separate.shape[1])] for i in range(PCs_separate.shape[0])]
PCs_combined = [distance_matrix_to_coords(PCs_combined[i])
for i in range(np.array(PCs_combined).shape[0])]
PCs_separate = np.real(PCs_separate)
PCs_combined = np.real(PCs_combined)
if mw is True:
# Remove mass-weighting of coordinates
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms, PCs_separate[i])
for i in range(n_dim)]
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms, PCs_combined)
else:
no_mass_weighting_PCs_separate = PCs_separate
no_mass_weighting_PCs_combined = PCs_combined
# Reorient coordinates so they are in a consistent orientation
aligned_PCs_separate = [kabsch(chirality_changes(no_mass_weighting_PCs_separate[i], stereo_atoms,
all_signs)) for i in range(n_dim)]
aligned_PCs_combined = kabsch(chirality_changes(no_mass_weighting_PCs_combined, stereo_atoms, all_signs))
aligned_PCs_combined = np.reshape(aligned_PCs_combined, (1, aligned_PCs_combined.shape[0],
aligned_PCs_combined.shape[1],
aligned_PCs_combined.shape[2]))
make_pc_xyz_files(output_directory, new_system_name + file_name_end, atoms, aligned_PCs_separate)
make_pc_xyz_files(output_directory, new_system_name + file_name_end, atoms, aligned_PCs_combined)
return new_system_name, components_df
def transform_new_data_only_top_distances(new_xyz_file_path, output_directory, n_dim, pca_fit, pca_components, pca_mean,
selected_atom_indexes, stereo_atoms=[1, 2, 3, 4], mw=False,
remove_atom_types=None):
"""
Takes as input a new trajectory (xyz file) for a given system for which dimensionality reduction has already been
    conducted using only the pre-selected top distances, and transforms this new data into that reduced
    dimensional space.
:param new_xyz_file_path: new input to dimensionality reduction (xyz file location), str
:param output_directory: output directory, str
:param n_dim: number of dimensions of the reduced dimensional space, int
:param pca_fit: fit from PCA on training data
:param pca_components: components from PCA on training data, array
:param pca_mean: mean of input data to PCA (mean structure as coords or distances), array
    :param selected_atom_indexes: mapping from selected-distance index to the corresponding atom pair, dict
    :param stereo_atoms: indexes of 4 atoms surrounding stereogenic center, list of ints
    :param mw: whether coordinates should be mass-weighted prior to PCA, bool
"""
print("\nTransforming %s into reduced dimensional representation..." % new_xyz_file_path)
new_system_name, atoms, coordinates = read_xyz_file(new_xyz_file_path)
if remove_atom_types is not None:
atoms, coordinates = remove_atoms_by_type(remove_atom_types, atoms, coordinates)
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("\nResults for %s input will be stored in %s" % (new_xyz_file_path, output_directory))
if mw is True:
coordinates_shifted = set_atom_one_to_origin(coordinates)
mass_weighted_coords = mass_weighting(atoms, coordinates_shifted)
coords_for_analysis = mass_weighted_coords
else:
coords_for_analysis = coordinates
d2_vector_matrix_all = generate_and_reshape_ds_big_structures(coords_for_analysis)
num_dists = len(list(selected_atom_indexes.keys()))
num_points = d2_vector_matrix_all.shape[0]
important_distances_matrix = np.zeros((num_points, num_dists))
distance_vector_indexes = list(pd.DataFrame(list(selected_atom_indexes.values()))[1])
for i in range(len(distance_vector_indexes)):
important_distances_matrix[:, i] = d2_vector_matrix_all[:, distance_vector_indexes[i]]
components = pca_fit.transform(important_distances_matrix)
components_df = pd.DataFrame(components)
return new_system_name, components_df
def pathreducer(trajectory_file_path, n_dim, stereo_atoms=[1, 2, 3, 4], input_type="Cartesians", mw=False, reconstruct=True,
normal_modes=False, remove_atom_types=None, num_dists=None, topology=None):
"""
Workhorse function for doing dimensionality reduction on xyz files. Dimensionality reduction can be done on the
structures represented as Cartesian coordinates (easy/faster) or the structures represented as distances matrices
(slower, but potentially more useful for certain systems that vary non-linearly with respect to Cartesian space,
e.g., torsions).
:param trajectory_file_path: xyz file or directory filled with xyz files that will be used to generate the reduced
dimensional space, str
:param n_dim: number of dimensions to reduce system to using PCA, int
:param stereo_atoms: list of 4 atom indexes surrounding stereogenic center, ints
    :param input_type: input type to PCA, either "Cartesians" or "Distances", str
    :return: system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords
        (selected_dist_atom_indexes is also returned when num_dists is specified; single-file Cartesian
        input omits lengths)
"""
# Make sure even large matrices are printed out in their entirety (for the generation of xyz files)
np.set_printoptions(threshold=sys.maxsize)
# Check if input is directory (containing input files) or a single input file itself
assert os.path.isfile(trajectory_file_path) or os.path.isdir(trajectory_file_path), "No such file or directory."
if os.path.isfile(trajectory_file_path) is True:
if input_type == "Cartesians":
system_name, output_directory, pca, pca_fit, components, mean, values, aligned_coords = \
pathreducer_cartesians_one_file(trajectory_file_path, n_dim, mw=mw, reconstruct=reconstruct,
normal_modes=normal_modes, remove_atom_types=remove_atom_types,
topology=topology)
return system_name, output_directory, pca, pca_fit, components, mean, \
values, aligned_coords
elif input_type == "Distances":
if num_dists:
system_name, output_directory, pca, pca_fit, components, mean, values, aligned_coords, selected_dist_atom_indexes = \
pathreducer_distances_one_file(trajectory_file_path, n_dim, stereo_atoms=stereo_atoms, mw=mw,
reconstruct=reconstruct, normal_modes=normal_modes,
remove_atom_types=remove_atom_types, num_dists=num_dists, topology=topology)
lengths = aligned_coords.shape[0]
return system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords, \
selected_dist_atom_indexes
else:
system_name, output_directory, pca, pca_fit, components, mean, values, aligned_coords = \
pathreducer_distances_one_file(trajectory_file_path, n_dim, stereo_atoms=stereo_atoms, mw=mw,
reconstruct=reconstruct, normal_modes=normal_modes,
remove_atom_types=remove_atom_types, num_dists=num_dists, topology=topology)
lengths = aligned_coords.shape[0]
return system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords
    elif os.path.isdir(trajectory_file_path):
if input_type == "Cartesians":
system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords = \
pathreducer_cartesians_directory_of_files(trajectory_file_path, n_dim, mw=mw, reconstruct=reconstruct,
normal_modes=normal_modes,
remove_atom_types=remove_atom_types, topology=topology)
return system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords
elif input_type == "Distances":
if num_dists:
system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords, \
selected_dist_atom_indexes = pathreducer_distances_directory_of_files(trajectory_file_path, n_dim,
stereo_atoms=stereo_atoms, mw=mw,
reconstruct=reconstruct,
normal_modes=normal_modes,
num_dists=num_dists,
remove_atom_types=remove_atom_types,
topology=topology)
return system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords, \
selected_dist_atom_indexes
            else:
                # Without num_dists, pathreducer_distances_directory_of_files returns no selected
                # distance atom indexes, so only nine values are unpacked here
                system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords = \
                    pathreducer_distances_directory_of_files(trajectory_file_path, n_dim,
                                                             stereo_atoms=stereo_atoms, mw=mw,
                                                             reconstruct=reconstruct,
                                                             normal_modes=normal_modes,
                                                             num_dists=num_dists,
                                                             remove_atom_types=remove_atom_types,
                                                             topology=topology)
                return system_name, output_directory, pca, pca_fit, components, mean, values, lengths, aligned_coords
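# The reconstruction that ``pathreducer`` dispatches to (via the
# ``inverse_transform_of_pcs``-style helpers) amounts to re-expanding PC scores
# through the component vectors. A minimal standalone sketch, using sklearn's PCA
# on made-up data rather than this module's helpers:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: 50 frames of a 6-atom system, flattened to 3N = 18 columns
rng = np.random.default_rng(1)
matrix_for_pca = rng.normal(size=(50, 18))

pca = PCA(n_components=3)
scores = pca.fit_transform(matrix_for_pca)

# Reconstruct every frame from a single PC: mean + score_i * component_i
pc1_only = pca.mean_ + np.outer(scores[:, 0], pca.components_[0])
# Reconstruct from all retained PCs at once
all_pcs = pca.mean_ + scores @ pca.components_
```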
def pathreducer_cartesians_one_file(trajectory_file_path, n_dim, mw=False, reconstruct=True, normal_modes=False,
remove_atom_types=None, topology=None):
"""
Workhorse function for doing dimensionality reduction on xyz files. Dimensionality reduction can be done on the
structures represented as Cartesian coordinates (easy/faster) or the structures represented as distances matrices
(slower, but potentially more useful for certain systems that vary in non-linear ways, e.g., torsions).
:param trajectory_file_path: xyz file or directory filled with xyz files that will be used to generate the reduced
dimensional space, str
:param n_dim: number of dimensions to reduce system to using PCA, int
    :return: name, directory, pca, pca_fit, components, mean, values, aligned coordinates
"""
# Make sure even large matrices are printed out in their entirety (for the generation of xyz files)
np.set_printoptions(threshold=sys.maxsize)
# Check if input is directory (containing input files) or a single input file itself
assert os.path.isfile(trajectory_file_path) or os.path.isdir(trajectory_file_path), "No such file or directory."
# Determining names of output directories/files
file_name_end = "_Cartesians"
if mw is True:
file_name_end = file_name_end + "_MW"
elif mw is False:
file_name_end = file_name_end + "_noMW"
print("\nInput is one file.")
system_name, atoms, coordinates = _read_single_traj_file(topology, trajectory_file_path)
if remove_atom_types is not None:
atoms, coordinates = remove_atoms_by_type(remove_atom_types, atoms, coordinates)
# Creating a directory for output (if directory doesn't already exist)
output_directory = system_name + file_name_end + "_output"
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("Results for %s input will be stored in %s" % (trajectory_file_path, output_directory))
aligned_coords = kabsch(coordinates)
print("\n(1C) Done aligning structures using Kabsch algorithm")
if mw is True:
mass_weighted_coordinates = mass_weighting(atoms, aligned_coords)
print("\n(MW) Done mass-weighting coordinates!")
matrix_for_pca = np.reshape(mass_weighted_coordinates, (mass_weighted_coordinates.shape[0],
mass_weighted_coordinates.shape[1] *
mass_weighted_coordinates.shape[2]))
else:
matrix_for_pca = np.reshape(aligned_coords, (aligned_coords.shape[0], aligned_coords.shape[1] *
aligned_coords.shape[2]))
# PCA
cartesians_pca, cartesians_pca_fit, cartesians_components, cartesians_mean, cartesians_values = \
pca_dr(matrix_for_pca)
print("\n(2) Done with PCA of Cartesian coordinates!")
if reconstruct:
if normal_modes:
function = inverse_transform_of_pcs_as_normal_modes
file_name_end += "_normal_modes"
else:
function = inverse_transform_of_pcs
PCs_separate, PCs_combined = function(n_dim, cartesians_pca, cartesians_components,
cartesians_mean)
print("\n(3) Done transforming reduced dimensional representation of input into full dimensional space!")
# Reshape n x 3N x 1 arrays into n x N x 3 arrays
PCs_separate = np.reshape(PCs_separate, (PCs_separate.shape[0], PCs_separate.shape[1],
int(PCs_separate.shape[2] / 3), 3))
PCs_combined = np.reshape(PCs_combined, (1, PCs_combined.shape[0], int(PCs_combined.shape[1] / 3), 3))
if mw is True:
# Remove mass-weighting of coordinates, individual Xs
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms, PCs_separate[i]) for i in range(n_dim)]
# Remove mass-weighting of coordinates, all Xs combined into one array/reduced dimensional trajectory
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms, PCs_combined)
print("\n(UMW) Done removing mass-weighting!")
else:
no_mass_weighting_PCs_separate = [PCs_separate[i] for i in range(n_dim)]
no_mass_weighting_PCs_combined = PCs_combined
# Make xyz files from final coordinate arrays
make_pc_xyz_files(output_directory, system_name + file_name_end, atoms, no_mass_weighting_PCs_separate)
make_pc_xyz_files(output_directory, system_name + file_name_end, atoms, no_mass_weighting_PCs_combined)
print("\n(4) Done with making output xyz files!")
return system_name, output_directory, cartesians_pca, cartesians_pca_fit, cartesians_components, cartesians_mean, \
cartesians_values, aligned_coords
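# The mass-weighting round trip used in the function above can be sketched
# standalone. This assumes ``mass_weighting``/``remove_mass_weighting`` scale each
# atom's coordinates by the square root of its mass (a common convention); the
# masses and coordinates below are made-up illustrative values.

```python
import numpy as np

# Hypothetical masses (amu) and coordinates for a 3-atom, water-like system
masses = np.array([15.999, 1.008, 1.008])
coords = np.array([[[0.00, 0.00, 0.0],
                    [0.96, 0.00, 0.0],
                    [-0.24, 0.93, 0.0]]])   # shape (n_frames, N, 3)

w = np.sqrt(masses)[None, :, None]   # broadcast sqrt(m) over frames and x/y/z
mass_weighted = coords * w           # analogue of mass_weighting(...)
recovered = mass_weighted / w        # analogue of remove_mass_weighting(...)
```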
def pathreducer_cartesians_directory_of_files(trajectory_directory_path, n_dim, mw=False, reconstruct=True,
normal_modes=False, remove_atom_types=None, topology=None):
"""
Workhorse function for doing dimensionality reduction on xyz files. Dimensionality reduction can be done on the
structures represented as Cartesian coordinates (easy/faster) or the structures represented as distances matrices
(slower, but potentially more useful for certain systems that vary in non-linear ways, e.g., torsions).
:param trajectory_directory_path: xyz file or directory filled with xyz files that will be used to generate the
reduced dimensional space, str
:param n_dim: number of dimensions to reduce system to using PCA, int
    :return: name, directory, pca, pca_fit, components, mean, values, lengths, aligned coordinates
"""
# Make sure even large matrices are printed out in their entirety (for the generation of xyz files)
np.set_printoptions(threshold=sys.maxsize)
# Check if input is directory (containing input files) or a single input file itself
assert os.path.isfile(trajectory_directory_path) or os.path.isdir(trajectory_directory_path), "No such file or " \
"directory."
# Determining names of output directories/files
file_name_end = "_Cartesians"
if mw is True:
file_name_end = file_name_end + "_MW"
elif mw is False:
file_name_end = file_name_end + "_noMW"
print("\nInput is a directory of files.")
path = os.path.dirname(trajectory_directory_path)
system_name = os.path.basename(path)
print("\nDoing dimensionality reduction on files in %s" % system_name)
trajectory_files = sorted(glob.glob(os.path.join(trajectory_directory_path, '*.xyz')))
names = []
atoms = []
file_lengths = []
i = 0
for trajectory_file in trajectory_files:
i = i + 1
if topology is not None:
name, atoms_one_file, coordinates = read_traj_file(topology, trajectory_file)
else: # Assume topology is in the trajectory file, e.g. XYZ or PDB
name, atoms_one_file, coordinates = read_traj_file(trajectory_file)
if remove_atom_types is not None:
atoms_one_file, coordinates = remove_atoms_by_type(remove_atom_types, atoms_one_file, coordinates)
names.append(name)
atoms.append(atoms_one_file)
file_lengths.append(coordinates.shape[0])
if i == 1:
coords_for_analysis = coordinates
else:
coords_for_analysis = np.concatenate((coords_for_analysis, coordinates), axis=0)
# Creating a directory for output (if directory doesn't already exist)
output_directory = system_name + file_name_end + "_output"
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("Results for %s input will be stored in %s" % (trajectory_directory_path, output_directory))
aligned_coords = kabsch(coords_for_analysis)
print("\n(1C) Done aligning structures using Kabsch algorithm")
if mw is True:
mass_weighted_coordinates = mass_weighting(atoms_one_file, aligned_coords)
print("\n(MW) Done mass-weighting coordinates!")
matrix_for_pca = np.reshape(mass_weighted_coordinates, (mass_weighted_coordinates.shape[0],
mass_weighted_coordinates.shape[1] *
mass_weighted_coordinates.shape[2]))
else:
matrix_for_pca = np.reshape(aligned_coords, (aligned_coords.shape[0],
aligned_coords.shape[1] * aligned_coords.shape[2]))
# PCA
cartesians_pca, cartesians_pca_fit, cartesians_components, cartesians_mean, cartesians_values = \
pca_dr(matrix_for_pca)
print("\n(2) Done with PCA of Cartesian coordinates!")
if reconstruct:
if normal_modes:
function = inverse_transform_of_pcs_as_normal_modes
file_name_end += "_normal_modes"
else:
function = inverse_transform_of_pcs
PCs_separate, PCs_combined = function(n_dim, cartesians_pca, cartesians_components,
cartesians_mean)
print("\n(3) Done transforming reduced dimensional representation of input into full dimensional space!")
# Reshape n x 3N x 1 arrays into n x N x 3 arrays
PCs_separate = np.reshape(PCs_separate, (PCs_separate.shape[0], PCs_separate.shape[1],
int(PCs_separate.shape[2] / 3), 3))
PCs_combined = np.reshape(PCs_combined, (1, PCs_combined.shape[0], int(PCs_combined.shape[1] / 3), 3))
if mw is True:
# Remove mass-weighting of coordinates, individual Xs
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms_one_file, PCs_separate[i]) for i in
range(n_dim)]
# Remove mass-weighting of coordinates, all Xs combined into one array/reduced dimensional trajectory
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms_one_file, PCs_combined)
print("\n(UMW) Done removing mass-weighting!")
else:
no_mass_weighting_PCs_separate = [PCs_separate[i] for i in range(n_dim)]
no_mass_weighting_PCs_combined = PCs_combined
# Make xyz files from final coordinate arrays
for x in range(len(file_lengths)):
filename = names[x]
if x == 0:
start_index = 0
end_index = file_lengths[x]
one_file_PCs_separate = np.array(no_mass_weighting_PCs_separate)[:, start_index:end_index, :, :]
one_file_PCs_combined = np.array(no_mass_weighting_PCs_combined)[:, start_index:end_index, :, :]
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_separate)
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_combined)
else:
start_index = sum(file_lengths[:x])
end_index = sum(file_lengths[:(x + 1)])
one_file_PCs_separate = np.array(no_mass_weighting_PCs_separate)[:, start_index:end_index, :, :]
one_file_PCs_combined = np.array(no_mass_weighting_PCs_combined)[:, start_index:end_index, :, :]
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_separate)
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_combined)
print("\nDone generating output!")
return system_name, output_directory, cartesians_pca, cartesians_pca_fit, cartesians_components, cartesians_mean, \
cartesians_values, file_lengths, coords_for_analysis
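# The per-file slicing at the end of the function above (running sums of
# ``file_lengths`` giving each file's window into the concatenated array) can be
# sketched compactly with cumulative sums. The arrays below are made-up
# illustrative data:

```python
import numpy as np

# Three hypothetical trajectories of different lengths, concatenated along frames
file_lengths = [4, 7, 5]
combined = np.arange(sum(file_lengths) * 2 * 3).reshape(sum(file_lengths), 2, 3)

# Cumulative sums give each file's [start, end) slice in the combined array
bounds = np.concatenate(([0], np.cumsum(file_lengths)))
per_file = [combined[bounds[x]:bounds[x + 1]] for x in range(len(file_lengths))]
```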
def pathreducer_distances_one_file(trajectory_file_path, n_dim, stereo_atoms=[1, 2, 3, 4], mw=False,
print_distance_coefficients=True, reconstruct=True, normal_modes=False,
num_dists=None, remove_atom_types=None, topology=None):
"""
Workhorse function for doing dimensionality reduction on xyz files. Dimensionality reduction can be done on the
structures represented as Cartesian coordinates (easy/faster) or the structures represented as distances matrices
(slower, but potentially more useful for certain systems that vary in non-linear ways, e.g., torsions).
:param trajectory_file_path: xyz file or directory filled with xyz files that will be used to generate the reduced
dimensional space, str
:param n_dim: number of dimensions to reduce system to using PCA, int
:param stereo_atoms: list of 4 atom indexes surrounding stereogenic center, ints
    :return: name, directory, pca, pca_fit, components, mean, values, aligned coordinates (plus selected distance
    atom indexes when num_dists is set)
"""
# Make sure even large matrices are printed out in their entirety (for the generation of xyz files)
np.set_printoptions(threshold=sys.maxsize)
# Check if input is directory (containing input files) or a single input file itself
assert os.path.isfile(trajectory_file_path) or os.path.isdir(trajectory_file_path), "No such file or directory."
# Determining names of output directories/files
file_name_end = "_Distances"
if mw is True:
file_name_end = file_name_end + "_MW"
elif mw is False:
file_name_end = file_name_end + "_noMW"
print("\nInput is one file.")
    name, atoms, coordinates = _read_single_traj_file(topology, trajectory_file_path)
if remove_atom_types is not None:
atoms, coordinates = remove_atoms_by_type(remove_atom_types, atoms, coordinates)
# Creating a directory for output (if directory doesn't already exist)
output_directory = name + file_name_end + "_output"
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("Results for %s input will be stored in %s" % (trajectory_file_path, output_directory))
aligned_coordinates = kabsch(coordinates)
negatives, positives, zeroes, all_signs = chirality_test(aligned_coordinates, stereo_atoms)
if mw is True:
coordinates_shifted = set_atom_one_to_origin(coordinates)
mass_weighted_coordinates = mass_weighting(atoms, coordinates_shifted)
coords_for_pca = mass_weighted_coordinates
print("\n(MW) Done mass-weighting coordinates!")
else:
coords_for_pca = aligned_coordinates
    if coords_for_pca.shape[1] > 1000:
        # Only apply the default cap when the caller did not request a specific count
        if num_dists is None:
            num_dists = 75000
        print("Big matrix. Using the top %s distances for PCA..." % num_dists)
d2_vector_matrix_all = generate_and_reshape_ds_big_structures(coords_for_pca)
d2_vector_matrix, selected_dist_atom_indexes = filter_important_distances(d2_vector_matrix_all,
num_dists=num_dists)
else:
d2_full_matrices = generate_distance_matrices(coords_for_pca)
d2_vector_matrix = reshape_ds(d2_full_matrices)
print("\n(1D) Generation of distance matrices and reshaping upper triangles into vectors done!")
# PCA on distance matrix
d_pca, d_pca_fit, d_components, d_mean, d_values = pca_dr(d2_vector_matrix)
print("\n(2) Done with PCA of structures as distance matrices!")
if print_distance_coefficients:
if coords_for_pca.shape[1] > 1000:
print_distance_weights_to_files(output_directory, n_dim, name + file_name_end, d_components, len(atoms),
selected_atom_indexes=selected_dist_atom_indexes)
else:
print_distance_weights_to_files(output_directory, n_dim, name + file_name_end, d_components, len(atoms))
if reconstruct:
if normal_modes:
function = inverse_transform_of_pcs_as_normal_modes
file_name_end += "_normal_modes"
else:
function = inverse_transform_of_pcs
if coords_for_pca.shape[1] > 1000:
d_components = set_unimportant_distance_weights_to_zero(d_components, selected_dist_atom_indexes,
len(atoms))
d_mean = calc_mean_distance_vector(d2_vector_matrix_all)
PCs_separate_d, PCs_combined_d = function(n_dim, d_pca, d_components, d_mean)
print("\n(3) Done transforming reduced dimensional representation of input into full dimensional space!")
# Turning distance matrix representations of structures back into Cartesian coordinates
PCs_separate = [[distance_matrix_to_coords(PCs_separate_d[i][k])
for k in range(PCs_separate_d.shape[1])] for i in range(PCs_separate_d.shape[0])]
# Turning distance matrix representations of structures back into Cartesian coordinates (all chosen Xs combined
# into one xyz file)
PCs_combined = [distance_matrix_to_coords(PCs_combined_d[i])
for i in range(np.array(PCs_combined_d).shape[0])]
PCs_separate = np.real(PCs_separate)
PCs_combined = np.real(PCs_combined)
print("\n(4D)-(6D) Done with converting distance matrices back to Cartesian coordinates!")
if mw is True:
# Remove mass-weighting of coordinates, individual PCs
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms, PCs_separate[i])
for i in range(n_dim)]
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms, PCs_combined)
print("\n(UMW) Done removing mass-weighting!")
else:
no_mass_weighting_PCs_separate = PCs_separate
no_mass_weighting_PCs_combined = PCs_combined
if normal_modes:
chirality_consistent_PCs_separate = [
kabsch(chirality_changes_normal_modes(no_mass_weighting_PCs_separate[i], stereo_atoms,
all_signs)) for i in range(n_dim)]
# Reorient coordinates so they are in a consistent coordinate system/chirality, all Xs combined into one array
chirality_consistent_PCs_combined = kabsch(
chirality_changes_normal_modes(no_mass_weighting_PCs_combined, stereo_atoms,
all_signs))
else:
chirality_consistent_PCs_separate = [
kabsch(chirality_changes(no_mass_weighting_PCs_separate[i], stereo_atoms,
all_signs)) for i in range(n_dim)]
# Reorient coordinates so they are in a consistent coordinate system/chirality, all Xs combined into one array
chirality_consistent_PCs_combined = kabsch(chirality_changes(no_mass_weighting_PCs_combined, stereo_atoms,
all_signs))
chirality_consistent_PCs_combined = np.reshape(chirality_consistent_PCs_combined,
(1,
chirality_consistent_PCs_combined.shape[0],
chirality_consistent_PCs_combined.shape[1],
chirality_consistent_PCs_combined.shape[2]))
# Align new Cartesian coordinates to ALIGNED original trajectory
aligned_PCs_separate = [align_to_original_traj(chirality_consistent_PCs_separate[i], aligned_coordinates)
for i in range(len(chirality_consistent_PCs_separate))]
aligned_PCs_combined = [align_to_original_traj(chirality_consistent_PCs_combined[i], aligned_coordinates)
for i in range(len(chirality_consistent_PCs_combined))]
print("\n(7D) Done checking chirality of resultant structures!")
print("\n(8D) Done aligning!")
# Make final structures into xyz files
make_pc_xyz_files(output_directory, name + file_name_end, atoms, aligned_PCs_separate)
make_pc_xyz_files(output_directory, name + file_name_end, atoms, aligned_PCs_combined)
print("\nDone generating output!")
if num_dists:
return name, output_directory, d_pca, d_pca_fit, d_components, d_mean, d_values, aligned_coordinates, \
selected_dist_atom_indexes
else:
        return name, output_directory, d_pca, d_pca_fit, d_components, d_mean, d_values, aligned_coordinates
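# The "(1D)" step above -- one upper-triangle distance vector per frame -- can be
# sketched with SciPy's condensed-distance helpers. This is an illustrative
# stand-in for ``generate_distance_matrices``/``reshape_ds``, not the module's
# actual implementation; the trajectory below is made-up data.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(2)
traj = rng.normal(size=(10, 5, 3))   # 10 frames, 5 atoms

# One condensed upper-triangle distance vector per frame: N*(N-1)/2 entries
d_vectors = np.array([pdist(frame) for frame in traj])

# A full symmetric distance matrix can be recovered for any frame when needed
full = squareform(d_vectors[0])
```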
def _read_single_traj_file(topology, trajectory_file_path):
if topology is not None:
name, atoms, coordinates = read_traj_file(topology, trajectory_file_path)
else:
name, atoms, coordinates = read_traj_file(trajectory_file_path)
return name, atoms, coordinates
def pathreducer_distances_directory_of_files(trajectory_file_path, n_dim, stereo_atoms=[1, 2, 3, 4], mw=False,
print_distance_coefficients=True, reconstruct=True, normal_modes=False,
num_dists=None, remove_atom_types=None, topology=None):
"""
Workhorse function for doing dimensionality reduction on xyz files. Dimensionality reduction can be done on the
structures represented as Cartesian coordinates (easy/faster) or the structures represented as distances matrices
(slower, but potentially more useful for certain systems that vary in non-linear ways, e.g., torsions).
:param trajectory_file_path: xyz file or directory filled with xyz files that will be used to generate the
reduced dimensional space, str
:param n_dim: number of dimensions to reduce system to using PCA, int
:param stereo_atoms: list of 4 atom indexes surrounding stereogenic center, ints
    :return: name, directory, pca, pca_fit, components, mean, values, lengths, aligned coordinates (plus selected
    distance atom indexes when num_dists is set)
"""
# Check if input is directory (containing input files) or a single input file itself
assert os.path.isfile(trajectory_file_path) or os.path.isdir(trajectory_file_path), "No such file or " \
"directory."
print("\nInput is a directory of files.")
# Make sure even large matrices are printed out in their entirety (for the generation of xyz files)
np.set_printoptions(threshold=sys.maxsize)
# Determining names of output directories/files
file_name_end = "_Distances"
if mw is True:
file_name_end = file_name_end + "_MW"
elif mw is False:
file_name_end = file_name_end + "_noMW"
path = os.path.dirname(trajectory_file_path)
system_name = os.path.basename(path)
print("\nDoing dimensionality reduction on files in %s" % system_name)
trajectory_files = sorted(glob.glob(os.path.join(trajectory_file_path, '*.xyz')))
names = []
atoms = []
file_lengths = []
i = 0
    for trajectory_file in trajectory_files:
        i = i + 1
        if topology is not None:
            name, atoms_one_file, coordinates = read_traj_file(topology, trajectory_file)
        else:  # Assume topology is in the trajectory file, e.g. XYZ or PDB
            name, atoms_one_file, coordinates = read_traj_file(trajectory_file)
if remove_atom_types is not None:
atoms_one_file, coordinates = remove_atoms_by_type(remove_atom_types, atoms_one_file, coordinates)
names.append(name)
atoms.append(atoms_one_file)
file_lengths.append(coordinates.shape[0])
if i == 1:
coords_for_analysis = coordinates
else:
coords_for_analysis = np.concatenate((coords_for_analysis, coordinates), axis=0)
# Creating a directory for output (if directory doesn't already exist)
output_directory = system_name + file_name_end + "_output"
if not os.path.exists(output_directory):
os.makedirs(output_directory)
print("Results for %s input will be stored in %s" % (system_name, output_directory))
aligned_coordinates = kabsch(coords_for_analysis)
negatives, positives, zeroes, all_signs = chirality_test(coords_for_analysis, stereo_atoms)
if mw is True:
coordinates_shifted = set_atom_one_to_origin(coords_for_analysis)
mass_weighted_coordinates = mass_weighting(atoms_one_file, coordinates_shifted)
print("\n(MW) Done mass-weighting coordinates!")
coords_for_pca = mass_weighted_coordinates
else:
coords_for_pca = coords_for_analysis
    if coords_for_pca.shape[1] > 1000:
        # Only apply the default cap when the caller did not request a specific count
        if num_dists is None:
            num_dists = 75000
        print("Big matrix. Using the top %s distances for PCA..." % num_dists)
d2_vector_matrix_all = generate_and_reshape_ds_big_structures(coords_for_pca)
d2_mean = calc_mean_distance_vector(d2_vector_matrix_all)
d2_vector_matrix, selected_dist_atom_indexes = filter_important_distances(d2_vector_matrix_all,
num_dists=num_dists)
# TODO: Make reconstruction possible by setting weights on all "non-important" distances to zero
reconstruct = False
else:
d2_full_matrices = generate_distance_matrices(coords_for_pca)
d2_vector_matrix = reshape_ds(d2_full_matrices)
print("\n(1D) Generation of distance matrices and reshaping upper triangles into vectors done!")
# PCA on distance matrix
d_pca, d_pca_fit, d_components, d_mean, d_values = pca_dr(d2_vector_matrix)
print("\n(2) Done with PCA of structures as interatomic distance matrices!")
if print_distance_coefficients:
print_distance_weights_to_files(output_directory, n_dim, system_name + file_name_end, d_components,
len(atoms_one_file))
if reconstruct:
if normal_modes:
function = inverse_transform_of_pcs_as_normal_modes
file_name_end += "_normal_modes"
else:
function = inverse_transform_of_pcs
PCs_separate_d, PCs_combined_d = function(n_dim, d_pca, d_components, d_mean)
print("\n(3) Done transforming reduced dimensional representation of input into full dimensional space!")
# Turning distance matrix representations of structures back into Cartesian coordinates
PCs_separate = [[distance_matrix_to_coords(PCs_separate_d[i][k])
for k in range(PCs_separate_d.shape[1])] for i in range(PCs_separate_d.shape[0])]
# Turning distance matrix representations of structures back into Cartesian coordinates (all chosen PCs combined
# into one xyz file)
PCs_combined = [distance_matrix_to_coords(PCs_combined_d[i]) for i in range(np.array(PCs_combined_d).shape[0])]
PCs_separate = np.real(PCs_separate)
PCs_combined = np.real(PCs_combined)
print("\n(4D)-(6D) Done with converting distance matrices back to Cartesian coordinates!")
if mw is True:
# Remove mass-weighting of coordinates, individual PCs
no_mass_weighting_PCs_separate = [remove_mass_weighting(atoms_one_file, PCs_separate[i])
for i in range(n_dim)]
no_mass_weighting_PCs_combined = remove_mass_weighting(atoms_one_file, PCs_combined)
print("\n(UMW) Done removing mass-weighting!")
else:
no_mass_weighting_PCs_separate = PCs_separate
no_mass_weighting_PCs_combined = PCs_combined
if normal_modes:
chirality_consistent_PCs_separate = [
kabsch(chirality_changes_normal_modes(no_mass_weighting_PCs_separate[i], stereo_atoms,
all_signs)) for i in range(n_dim)]
chirality_consistent_PCs_combined = kabsch(
chirality_changes_normal_modes(no_mass_weighting_PCs_combined, stereo_atoms,
all_signs))
else:
chirality_consistent_PCs_separate = [chirality_changes(no_mass_weighting_PCs_separate[i], stereo_atoms,
all_signs)
for i in range(n_dim)]
chirality_consistent_PCs_combined = kabsch(chirality_changes(no_mass_weighting_PCs_combined, stereo_atoms,
all_signs))
chirality_consistent_PCs_combined = np.reshape(chirality_consistent_PCs_combined,
(1,
chirality_consistent_PCs_combined.shape[0],
chirality_consistent_PCs_combined.shape[1],
chirality_consistent_PCs_combined.shape[2]))
# Align new Cartesian coordinates to ALIGNED original trajectory
aligned_PCs_separate = [align_to_original_traj(chirality_consistent_PCs_separate[i], aligned_coordinates)
for i in range(len(chirality_consistent_PCs_separate))]
aligned_PCs_combined = [align_to_original_traj(chirality_consistent_PCs_combined[i], aligned_coordinates)
for i in range(len(chirality_consistent_PCs_combined))]
print("\n(7D) Done checking chirality of resultant structures!")
print("\n(8D) Done aligning!")
for x in range(len(trajectory_files)):
filename = names[x]
if x == 0:
start_index = 0
end_index = file_lengths[x]
one_file_PCs_separate = np.array(aligned_PCs_separate)[:, start_index:end_index, :, :]
one_file_PCs_combined = np.array(aligned_PCs_combined)[:, start_index:end_index, :, :]
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_separate)
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_combined)
else:
start_index = sum(file_lengths[:x])
end_index = sum(file_lengths[:(x + 1)])
one_file_PCs_separate = np.array(aligned_PCs_separate)[:, start_index:end_index, :, :]
one_file_PCs_combined = np.array(aligned_PCs_combined)[:, start_index:end_index, :, :]
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_separate)
make_pc_xyz_files(output_directory, filename + file_name_end, atoms_one_file, one_file_PCs_combined)
print("\nDone generating output!")
if num_dists:
return system_name, output_directory, d_pca, d_pca_fit, d_components, d_mean, d_values, file_lengths, \
aligned_coordinates, selected_dist_atom_indexes
else:
return system_name, output_directory, d_pca, d_pca_fit, d_components, d_mean, d_values, file_lengths, \
aligned_coordinates
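# The big-matrix branch above keeps only the "top" distances before PCA. One
# plausible selection criterion is variance across the trajectory, sketched below
# on made-up data; the actual rule inside ``filter_important_distances`` may
# differ.

```python
import numpy as np

rng = np.random.default_rng(3)
d2_vector_matrix_all = rng.normal(size=(100, 45))   # 100 frames x 45 distances

num_dists = 10
# Rank distances by variance across the trajectory; keep the most variable ones
variances = d2_vector_matrix_all.var(axis=0)
top_indexes = np.argsort(variances)[::-1][:num_dists]
d2_vector_matrix = d2_vector_matrix_all[:, top_indexes]
```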
def pathreducer_interactive():
while True:
input_path = input("\nInput a path to an xyz file or directory of xyz files.\n")
if os.path.isfile(input_path):
print("Input is an individual file.")
system_name = os.path.basename(input_path)
break
elif os.path.isdir(input_path):
print("Input is a directory of files.")
path = os.path.dirname(input_path)
system_name = os.path.basename(path)
break
else:
print("No such file or directory.")
continue
print("\nDoing dimensionality reduction on files in %s" % system_name)
while True:
mass_weight = input("\nWould you like to mass-weight the Cartesian coordinates of your structures prior to "
"dimensionality reduction? (True or False)\n")
if mass_weight in ("True", "true", "T", "t"):
mw = True
break
elif mass_weight in ("False", "false", "F", "f"):
mw = False
break
else:
print("Please type True or False.")
continue
while True:
structure_type = input("\nHow would you like to represent your structures? (Cartesians or Distances)\n")
if structure_type in ("Cartesians", "cartesians", "C", "c"):
input_type = "Cartesians"
break
elif structure_type in ("Distances", "distances", "D", "d"):
input_type = "Distances"
while True:
stereo_atoms = input(
"\nOptional: Enter four atom numbers (separated by commas) to define the chirality of your "
"molecule. Hit Return to skip.\n")
if stereo_atoms == "":
stereo_atoms = '1, 2, 3, 4'
break
elif len(stereo_atoms.split(',')) == 4:
break
elif len(stereo_atoms.split(',')) != 4:
print("Please enter four atom numbers separated by commas, or hit Return to skip.")
continue
stereo_atoms = [int(s) for s in stereo_atoms.split(',')]
break
else:
print("Please type Cartesians or Distances.")
continue
while True:
try:
n_dim = int(input("\nHow many principal components would you like to print out? "
"(If you're not sure, use 3)\n"))
except ValueError:
print("Sorry, number of principal components must be an integer value.")
continue
if n_dim <= 0:
print("Sorry, number of principal components must be greater than zero.")
continue
else:
break
if os.path.isfile(input_path) and input_type == "Cartesians":
system_name, output_directory, pca, pca_fit, components, mean, singular_values, aligned_coords = \
pathreducer_cartesians_one_file(input_path, n_dim, mw=mw)
elif os.path.isdir(input_path) and input_type == "Cartesians":
system_name, output_directory, pca, pca_fit, components, mean, singular_values, traj_lengths, aligned_coords = \
pathreducer_cartesians_directory_of_files(input_path, n_dim, mw=mw)
elif os.path.isfile(input_path) and input_type == "Distances":
system_name, output_directory, pca, pca_fit, components, mean, singular_values, aligned_coords = \
pathreducer_distances_one_file(input_path, n_dim, stereo_atoms=stereo_atoms, mw=mw)
elif os.path.isdir(input_path) and input_type == "Distances":
system_name, output_directory, pca, pca_fit, components, mean, singular_values, traj_lengths, aligned_coords = \
pathreducer_distances_directory_of_files(input_path, n_dim, stereo_atoms=stereo_atoms, mw=mw)
else:
print("Something went wrong.")
pcs_df = pd.DataFrame(pca)
if os.path.isdir(input_path):
lengths = traj_lengths
else:
lengths = None
plot_variance = input("\nWould you like a plot of the variance captured by each PC? (True or False)\n")
if plot_variance == "True":
plotting_functions.plot_prop_of_var(singular_values, system_name, output_directory)
print_prop_of_var_to_txt(singular_values, system_name, output_directory)
    # Initialize here so later plotting calls do not hit a NameError when no PC plot is requested
    points_to_circle = None
    plot_pcs = input("\nWould you like a plot of the top two and top three PCs? (True or False)\n")
if plot_pcs == "True":
points_to_circle = input("\nIf you have points to circle, enter them now, separated by commas.\n")
if points_to_circle != "":
points_to_circle = [int(s) for s in points_to_circle.split(',')]
else:
points_to_circle = None
plotting_functions.colored_line_and_scatter_plot(pcs_df[0], pcs_df[1], pcs_df[2], same_axis=False,
output_directory=output_directory, lengths=lengths,
points_to_circle=points_to_circle,
imgname=(system_name + "_scatterline"))
new_data_to_project = input("\nDo you have new data you would like to project into this reduced dimensional space? "
"(True or False)\n")
while new_data_to_project == "True":
new_input = input("\nWhat is the path to the file of interest? (Can only take one file at a time)\n")
if input_type == "Cartesians":
new_system_name, new_data_df = transform_new_data_cartesians(new_input, output_directory, n_dim, pca_fit,
components, mean, aligned_coords, MW=mw)
elif input_type == "Distances":
new_system_name, new_data_df = transform_new_data_distances(new_input, output_directory, n_dim, pca_fit,
components, mean, stereo_atoms=stereo_atoms,
MW=mw)
plot_new_data = input("\nWould you like to plot the new data? (True or False)\n")
if plot_new_data == "True":
plotting_functions.colored_line_and_scatter_plot(pcs_df[0], pcs_df[1], pcs_df[2], same_axis=False,
output_directory=output_directory, lengths=lengths,
new_data=new_data_df, points_to_circle=points_to_circle,
imgname=(new_system_name + "_scatterline"))
continue


def generate_deformation_vector(start_structure, end_structure):
    """Print and return the deformation vector between two structures (NMD format)."""
    np.set_printoptions(formatter={'float': lambda x: "{0:0.3f}".format(x)})
    # Cast to arrays so subtraction and .shape work even for list inputs
    start_structure = np.array(start_structure)
    end_structure = np.array(end_structure)
    nmd_coords = np.reshape(start_structure,
                            (1, start_structure.shape[0] * start_structure.shape[1]))
    deformation_vector = end_structure - start_structure
    deformation_vector = np.reshape(deformation_vector,
                                    (1, deformation_vector.shape[0] * deformation_vector.shape[1]))
    print("NMD Coordinates:", nmd_coords)
    print("Deformation vector:", deformation_vector)
    return deformation_vector
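As a standalone illustration of what `generate_deformation_vector` computes (the toy 2-atom geometries below are assumed example data, not from the original script): the deformation vector is simply the element-wise difference between two N×3 geometries, flattened into a single (1, 3N) row.

```python
import numpy as np

# Toy 2-atom geometries: 2 atoms x 3 Cartesian coordinates each
start = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0]])
end = np.array([[0.0, 0.0, 0.0],
                [1.5, 0.5, 0.0]])

# Same operation as generate_deformation_vector: subtract, then flatten to (1, 3N)
deformation = np.reshape(end - start, (1, start.shape[0] * start.shape[1]))
print(deformation)  # [[0.  0.  0.  0.5 0.5 0. ]]
```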
# python/form_handler/business.py (jonathan-longe/RSBC-DataHub-API, Apache-2.0)
import python.common.middleware as middleware
import python.common.actions as actions
import python.common.rsi_email as rsi_email


def process_incoming_form() -> dict:
    """
    This function lists the business rules required when processing
    each form. The Orbeon form name is used as the key. For example,
    the "send_disclosure" attributes below are used when processing the
    "send_disclosure" Orbeon form.
    """
    return {
        "unknown_event": [
            {"try": actions.add_unknown_event_error_to_message, "fail": []},
            {"try": actions.add_to_failed_queue, "fail": []},
            {"try": rsi_email.admin_unknown_event_type, "fail": []}
        ],
        "send_disclosure": [
            {"try": actions.is_not_on_hold, "fail": [
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.get_data_from_disclosure_event, "fail": []},
            {"try": middleware.create_correlation_id, "fail": []},
            {"try": middleware.determine_current_datetime, "fail": []},
            {"try": middleware.get_vips_status, "fail": []},
            {"try": middleware.prohibition_exists_in_vips, "fail": []},
            {"try": middleware.is_review_in_the_future, "fail": [
                # No further disclosure will be sent. The review has concluded.
            ]},
            {"try": middleware.is_any_unsent_disclosure, "fail": [
                # No new disclosure to send at present, try again later
                {"try": actions.add_hold_before_sending_disclosure, "fail": []},
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.retrieve_unsent_disclosure, "fail": []},
            {"try": middleware.if_required_add_adp_disclosure, "fail": []},
            {"try": rsi_email.applicant_disclosure, "fail": [
                # if send is not successful, add back to hold queue
            ]},
            {"try": middleware.mark_disclosure_as_sent, "fail": []},
            {"try": actions.add_hold_before_sending_disclosure, "fail": []},
            {"try": actions.add_to_hold_queue, "fail": []}
        ],
        "verify_schedule": [
            {"try": actions.is_not_on_hold, "fail": [
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.get_data_from_verify_schedule_event, "fail": []},
            {"try": middleware.create_correlation_id, "fail": []},
            {"try": middleware.determine_current_datetime, "fail": []},
            {"try": middleware.get_vips_status, "fail": []},
            {"try": middleware.prohibition_exists_in_vips, "fail": []},
            {"try": middleware.review_has_been_scheduled, "fail": [
                # if review has not been scheduled, notify Appeals Registry
                {"try": rsi_email.applicant_did_not_schedule, "fail": []},
            ]}
            # If review has been scheduled, do nothing
        ],
        "review_schedule_picker": [
            # aka: review scheduler
            {"try": middleware.create_correlation_id, "fail": []},
            {"try": middleware.determine_current_datetime, "fail": []},
            {"try": middleware.get_data_from_schedule_form, "fail": []},
            {"try": middleware.clean_prohibition_number, "fail": []},
            {"try": middleware.get_vips_status, "fail": []},
            {"try": middleware.prohibition_exists_in_vips, "fail": []},
            {"try": middleware.user_submitted_last_name_matches_vips, "fail": []},
            {"try": middleware.application_has_been_saved_to_vips, "fail": []},
            {"try": middleware.get_payment_status, "fail": []},
            {"try": middleware.received_valid_payment_status, "fail": []},
            {"try": middleware.paid_not_more_than_24hrs_ago, "fail": []},
            {"try": middleware.application_has_been_paid, "fail": []},
            {"try": middleware.get_application_details, "fail": []},
            {"try": middleware.valid_application_received_from_vips, "fail": []},
            {"try": middleware.get_invoice_details, "fail": []},
            {"try": middleware.calculate_schedule_window, "fail": []},
            {"try": middleware.decode_selected_timeslot, "fail": []},
            {"try": middleware.get_human_friendly_time_slot_string, "fail": []},
            {"try": middleware.save_schedule_to_vips, "fail": [
                # Consider sending a message to the applicant in the unlikely
                # event that the schedule save operation is unsuccessful
            ]},
            {"try": rsi_email.applicant_schedule_confirmation, "fail": []},
            {"try": rsi_email.applicant_evidence_instructions, "fail": []},
            {"try": middleware.create_disclosure_event, "fail": []},
            {"try": actions.add_hold_before_sending_disclosure, "fail": []},
            {"try": actions.add_to_hold_queue, "fail": []}
        ],
        "prohibition_review": [
            # aka: submit prohibition application
            {"try": actions.is_not_on_hold, "fail": [
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.get_data_from_application_form, "fail": []},
            {"try": middleware.get_user_entered_notice_type_from_message, "fail": []},
            {"try": middleware.clean_prohibition_number, "fail": []},
            {"try": middleware.populate_driver_name_fields_if_null, "fail": []},
            {"try": middleware.create_correlation_id, "fail": []},
            {"try": middleware.determine_current_datetime, "fail": []},
            {"try": middleware.get_vips_status, "fail": [
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.prohibition_exists_in_vips, "fail": [
                {"try": middleware.prohibition_served_within_past_week, "fail": [
                    {"try": rsi_email.applicant_prohibition_not_found, "fail": []}
                ]},
                {"try": middleware.applicant_has_more_than_one_day_to_apply, "fail": [
                    {"try": rsi_email.applicant_prohibition_still_not_found, "fail": []},
                    {"try": actions.add_24_hour_hold_until, "fail": []},
                    {"try": actions.add_to_hold_queue, "fail": []}
                ]},
                {"try": rsi_email.applicant_prohibition_not_found_yet, "fail": []},
                {"try": actions.add_hold_before_trying_vips_again, "fail": []},
                {"try": actions.add_to_hold_queue, "fail": []}
            ]},
            {"try": middleware.application_not_previously_saved_to_vips, "fail": [
                {"try": rsi_email.already_applied, "fail": []},
            ]},
            {"try": middleware.review_has_not_been_scheduled, "fail": [
                {"try": rsi_email.applicant_applied_at_icbc, "fail": []},
            ]},
            {"try": middleware.user_submitted_last_name_matches_vips, "fail": [
                {"try": rsi_email.applicant_last_name_mismatch, "fail": []}
            ]},
            {"try": middleware.is_applicant_within_window_to_apply, "fail": [
                {"try": rsi_email.applicant_prohibition_served_more_than_7_days_ago, "fail": []}
            ]},
            {"try": middleware.has_drivers_licence_been_seized, "fail": [
                {"try": rsi_email.applicant_licence_not_seized, "fail": []}
            ]},
            {"try": middleware.transform_hearing_request_type, "fail": []},
            {"try": middleware.force_presentation_type_to_written_if_ineligible_for_oral, "fail": []},
            {"try": middleware.transform_applicant_role_type, "fail": []},
            {"try": middleware.compress_form_data_xml, "fail": []},
            {"try": middleware.save_application_to_vips, "fail": [
                {"try": actions.add_to_failed_queue, "fail": []},
                {"try": rsi_email.admin_unable_to_save_to_vips, "fail": []}
            ]},
            {"try": rsi_email.application_accepted, "fail": []},
            {"try": middleware.is_applicant_ineligible_for_oral_review_but_requested_oral, "fail": [
                # end of successful application process
            ]},
            {"try": rsi_email.applicant_review_type_change, "fail": []}
        ],
        "Document_submission": [
            # aka: evidence submission form
            {"try": middleware.create_correlation_id, "fail": []},
            {"try": middleware.determine_current_datetime, "fail": []},
            {"try": middleware.get_data_from_document_submission_form, "fail": []},
            {"try": middleware.clean_prohibition_number, "fail": []},
            {"try": middleware.get_vips_status, "fail": []},
            {"try": middleware.prohibition_exists_in_vips, "fail": []},
            {"try": middleware.user_submitted_last_name_matches_vips, "fail": []},
            {"try": middleware.application_has_been_saved_to_vips, "fail": []},
            {"try": middleware.get_application_details, "fail": []},
            {"try": middleware.valid_application_received_from_vips, "fail": []},
            {"try": rsi_email.applicant_evidence_received, "fail": []},
        ]
    }
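The table above is pure data: each form name maps to an ordered list of steps, where each step has a "try" callable and a nested list of "fail" handlers. As a hedged illustration only (this runner is not part of the module; the project's real dispatcher lives elsewhere and its calling convention is assumed here), one minimal way to walk such a structure:

```python
def run_rules(rules, message):
    """Run each step's "try" callable; on a falsy result, run its "fail" steps.

    Hypothetical runner for illustration: assumes each callable takes and
    returns (ok, message), which is not necessarily the project's real contract.
    """
    for step in rules:
        ok, message = step["try"](message)
        if not ok:
            run_rules(step["fail"], message)  # failure handlers, possibly nested
            return False, message             # stop processing on first failure
    return True, message


# Tiny demo with stand-in callables that just record their invocation order
log = []
ok_step = lambda m: (log.append("ok") or True, m)
bad_step = lambda m: (log.append("bad") or False, m)
rules = [
    {"try": ok_step, "fail": []},
    {"try": bad_step, "fail": [{"try": ok_step, "fail": []}]},
    {"try": ok_step, "fail": []},   # never reached: processing stops on failure
]
result, _ = run_rules(rules, {"id": 1})
print(result, log)  # False ['ok', 'bad', 'ok']
```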
# school/views/school_category_views.py (tnemisteam/cdf-steps, MIT)
from django.views.generic import ListView, DetailView, CreateView, \
    DeleteView, UpdateView
from django.core.urlresolvers import reverse

from school.models import School_category


class School_categoryView(object):
    model = School_category

    def get_template_names(self):
        """Nest templates within school_category directory."""
        tpl = super(School_categoryView, self).get_template_names()[0]
        app = self.model._meta.app_label
        mdl = 'school_category'
        # self.template_name = tpl.replace(app, '{0}/{1}'.format(app, mdl))
        self.template_name = tpl[:7] + 'school_category/' + tpl[7:]
        return [self.template_name]


class School_categoryBaseListView(School_categoryView):
    paginate_by = 10

    def get_success_url(self):
        return reverse('school_school_category_list')


class School_categoryCreateView(School_categoryView, CreateView):

    def get_success_url(self):
        return reverse('school_school_category_list')


class School_categoryDeleteView(School_categoryView, DeleteView):

    def get_success_url(self):
        return reverse('school_school_category_list')


class School_categoryDetailView(School_categoryView, DetailView):

    def get_success_url(self):
        return reverse('school_school_category_list')


class School_categoryListView(School_categoryBaseListView, ListView):

    def get_success_url(self):
        return reverse('school_school_category_list')


class School_categoryUpdateView(School_categoryView, UpdateView):

    def get_success_url(self):
        return reverse('school_school_category_list')
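The slicing in `get_template_names` above relies on the app prefix `'school/'` being exactly 7 characters, so the model subdirectory is spliced in right after it. A standalone sketch of that string surgery (the template name below is an assumed, typical Django default):

```python
# Assumed Django-generated template name for an app named "school"
tpl = 'school/school_category_list.html'

# Insert the model subdirectory right after the 7-character app prefix
nested = tpl[:7] + 'school_category/' + tpl[7:]
print(nested)  # school/school_category/school_category_list.html
```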
# 2752.py (kwondohun0308/Beakjoon, MIT)
a, b, c = map(int, input().split())
arr = sorted([a, b, c])
print(arr[0], arr[1], arr[2])
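Baekjoon 2752 asks for three numbers printed in ascending order. An equivalent sketch without building a list uses the identity mid = sum - min - max (sample values stand in for stdin):

```python
# Stand-in for the space-separated stdin input
a, b, c = 3, 1, 2

lo, hi = min(a, b, c), max(a, b, c)
mid = a + b + c - lo - hi   # the remaining value, by elimination
print(lo, mid, hi)  # 1 2 3
```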
# note_seq/chord_inference_test.py (tetromino/note-seq, Apache-2.0)
# Copyright 2021 The Magenta Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for chord_inference."""
from absl.testing import absltest
from note_seq import chord_inference
from note_seq import sequences_lib
from note_seq import testing_lib
from note_seq.protobuf import music_pb2
CHORD_SYMBOL = music_pb2.NoteSequence.TextAnnotation.CHORD_SYMBOL
class ChordInferenceTest(absltest.TestCase):

  def testSequenceNotePitchVectors(self):
    sequence = music_pb2.NoteSequence()
    testing_lib.add_track_to_sequence(
        sequence, 0,
        [(60, 100, 0.0, 0.0), (62, 100, 0.0, 0.5),
         (60, 100, 1.5, 2.5),
         (64, 100, 2.0, 2.5), (67, 100, 2.25, 2.75), (70, 100, 2.5, 4.5),
         (60, 100, 6.0, 6.0),
        ])
    note_pitch_vectors = chord_inference.sequence_note_pitch_vectors(
        sequence, seconds_per_frame=1.0)
    expected_note_pitch_vectors = [
        [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    ]
    self.assertEqual(expected_note_pitch_vectors, note_pitch_vectors.tolist())

  def testSequenceNotePitchVectorsVariableLengthFrames(self):
    sequence = music_pb2.NoteSequence()
    testing_lib.add_track_to_sequence(
        sequence, 0,
        [(60, 100, 0.0, 0.0), (62, 100, 0.0, 0.5),
         (60, 100, 1.5, 2.5),
         (64, 100, 2.0, 2.5), (67, 100, 2.25, 2.75), (70, 100, 2.5, 4.5),
         (60, 100, 6.0, 6.0),
        ])
    note_pitch_vectors = chord_inference.sequence_note_pitch_vectors(
        sequence, seconds_per_frame=[1.5, 2.0, 3.0, 5.0])
    expected_note_pitch_vectors = [
        [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    ]
    self.assertEqual(expected_note_pitch_vectors, note_pitch_vectors.tolist())

  def testInferChordsForSequence(self):
    sequence = music_pb2.NoteSequence()
    testing_lib.add_track_to_sequence(
        sequence, 0,
        [(60, 100, 0.0, 1.0), (64, 100, 0.0, 1.0), (67, 100, 0.0, 1.0),   # C
         (62, 100, 1.0, 2.0), (65, 100, 1.0, 2.0), (69, 100, 1.0, 2.0),   # Dm
         (60, 100, 2.0, 3.0), (65, 100, 2.0, 3.0), (69, 100, 2.0, 3.0),   # F
         (59, 100, 3.0, 4.0), (62, 100, 3.0, 4.0), (67, 100, 3.0, 4.0)])  # G
    quantized_sequence = sequences_lib.quantize_note_sequence(
        sequence, steps_per_quarter=4)
    chord_inference.infer_chords_for_sequence(
        quantized_sequence, chords_per_bar=2)
    expected_chords = [('C', 0.0), ('Dm', 1.0), ('F', 2.0), ('G', 3.0)]
    chords = [(ta.text, ta.time) for ta in quantized_sequence.text_annotations]
    self.assertEqual(expected_chords, chords)

  def testInferChordsForSequenceAddKeySignatures(self):
    sequence = music_pb2.NoteSequence()
    testing_lib.add_track_to_sequence(
        sequence, 0,
        [(60, 100, 0.0, 1.0), (64, 100, 0.0, 1.0), (67, 100, 0.0, 1.0),   # C
         (62, 100, 1.0, 2.0), (65, 100, 1.0, 2.0), (69, 100, 1.0, 2.0),   # Dm
         (60, 100, 2.0, 3.0), (65, 100, 2.0, 3.0), (69, 100, 2.0, 3.0),   # F
         (59, 100, 3.0, 4.0), (62, 100, 3.0, 4.0), (67, 100, 3.0, 4.0),   # G
         (66, 100, 4.0, 5.0), (70, 100, 4.0, 5.0), (73, 100, 4.0, 5.0),   # F#
         (68, 100, 5.0, 6.0), (71, 100, 5.0, 6.0), (75, 100, 5.0, 6.0),   # G#m
         (66, 100, 6.0, 7.0), (71, 100, 6.0, 7.0), (75, 100, 6.0, 7.0),   # B
         (65, 100, 7.0, 8.0), (68, 100, 7.0, 8.0), (73, 100, 7.0, 8.0)])  # C#
    quantized_sequence = sequences_lib.quantize_note_sequence(
        sequence, steps_per_quarter=4)
    chord_inference.infer_chords_for_sequence(
        quantized_sequence, chords_per_bar=2, add_key_signatures=True)
    expected_key_signatures = [(0, 0.0), (6, 4.0)]
    key_signatures = [(ks.key, ks.time)
                      for ks in quantized_sequence.key_signatures]
    self.assertEqual(expected_key_signatures, key_signatures)

  def testInferChordsForSequenceWithBeats(self):
    sequence = music_pb2.NoteSequence()
    testing_lib.add_track_to_sequence(
        sequence, 0,
        [(60, 100, 0.0, 1.1), (64, 100, 0.0, 1.1), (67, 100, 0.0, 1.1),   # C
         (62, 100, 1.1, 1.9), (65, 100, 1.1, 1.9), (69, 100, 1.1, 1.9),   # Dm
         (60, 100, 1.9, 3.0), (65, 100, 1.9, 3.0), (69, 100, 1.9, 3.0),   # F
         (59, 100, 3.0, 4.5), (62, 100, 3.0, 4.5), (67, 100, 3.0, 4.5)])  # G
    testing_lib.add_beats_to_sequence(sequence, [0.0, 1.1, 1.9, 1.9, 3.0])
    chord_inference.infer_chords_for_sequence(sequence)
    expected_chords = [('C', 0.0), ('Dm', 1.1), ('F', 1.9), ('G', 3.0)]
    chords = [(ta.text, ta.time) for ta in sequence.text_annotations
              if ta.annotation_type == CHORD_SYMBOL]
    self.assertEqual(expected_chords, chords)


if __name__ == '__main__':
  absltest.main()
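The pitch-vector tests above check per-frame, unit-normalized pitch-class weights. As a standalone, simplified sketch of the underlying idea (toy note list assumed; the real `sequence_note_pitch_vectors` also handles frame boundaries and note-onset weighting):

```python
import numpy as np

# (pitch, start_time, end_time) for a C major triad; assumed toy data
notes = [(60, 0.0, 1.0), (64, 0.0, 1.0), (67, 0.0, 1.0)]

chroma = np.zeros(12)
for pitch, start, end in notes:
    chroma[pitch % 12] += end - start   # weight each pitch class by duration

norm = np.linalg.norm(chroma)
if norm > 0:
    chroma /= norm   # unit-normalize, like the expected vectors in the tests

print(chroma.round(3))  # C, E, G entries each ~0.577, rest 0
```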
#!/usr/bin/env python
# bin/plot_diagnostics.py (mcnanna/obztak, MIT)
"""
Create diagnostic plots for a particular survey strategy.
"""
#import sys
import os
import numpy as np
import matplotlib.pyplot as plt
import ephem
import obztak.utils.ortho
import obztak.utils.fileio
plt.ion()
############################################################
def movie(infile_accomplished_fields, infile_target_fields=None, outdir=None, chunk=1):
    plt.ioff()

    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)
    print len(accomplished_fields)

    if infile_target_fields is not None:
        #target_fields = np.recfromtxt(infile_target_fields, names=True)
        target_fields = obztak.utils.fileio.csv2rec(infile_target_fields)

    for ii in range(0, 1000):
        print ii, accomplished_fields['DATE'][ii * chunk]
        fig, basemap = obztak.utils.ortho.makePlot(accomplished_fields['DATE'][ii * chunk], figsize=(10.5 / 2., 8.5 / 2.), s=25, dpi=160, moon=False)

        if infile_target_fields is not None:
            cut_accomplished = np.in1d(target_fields['ID'], accomplished_fields['ID'][0:ii * chunk])
            #cut_accomplished[ii] = True
            proj = obztak.utils.ortho.safeProj(basemap,
                                               target_fields['RA'][np.logical_not(cut_accomplished)],
                                               target_fields['DEC'][np.logical_not(cut_accomplished)])
            basemap.scatter(*proj, c=np.tile(0, np.sum(np.logical_not(cut_accomplished))), edgecolor='none', s=25, vmin=0, vmax=4, cmap='summer_r')

        proj = obztak.utils.ortho.safeProj(basemap, accomplished_fields['RA'][0:(ii * chunk) + 1], accomplished_fields['DEC'][0:(ii * chunk) + 1])
        basemap.scatter(*proj, c=accomplished_fields['TILING'][0:(ii * chunk) + 1], edgecolor='none', s=25, vmin=0, vmax=4, cmap='summer_r')
        colorbar = plt.colorbar(label='Tiling')

        # Show the selected field
        if chunk == 1:
            proj = obztak.utils.ortho.safeProj(basemap, [accomplished_fields['RA'][ii]], [accomplished_fields['DEC'][ii]])
        else:
            proj = obztak.utils.ortho.safeProj(basemap,
                                               accomplished_fields['RA'][ii * chunk:(ii + 1) * chunk],
                                               accomplished_fields['DEC'][ii * chunk:(ii + 1) * chunk])
        basemap.scatter(*proj, c='magenta', edgecolor='none', s=25)

        if outdir is not None:
            plt.savefig('%s/movie%05i.gif'%(outdir, ii))
        #raw_input('WAIT')
        fig.clf()

        if ephem.Date(accomplished_fields['DATE'][(ii + 1) * chunk]) > ephem.Date('2017/1/1 00:00'):
            break

    if outdir is not None:
        print 'Generating animated gif...'
        os.system('convert -set delay 10 -loop 0 %s/*.gif %s/output.gif'%(outdir, outdir))
############################################################
def slew(infile_accomplished_fields, tag=None):
    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)

    plt.figure()
    plt.hist(accomplished_fields['SLEW'], bins=np.linspace(0., 20., 21), color='green')
    plt.xlabel('Slew Angle (deg)')
    plt.ylabel('Number of Fields')
    plt.xlim(0., 20.)

    # Inset panel
    if np.any(accomplished_fields['SLEW'] > 20.):
        a = plt.axes([.4, .4, .45, .45], axisbg='w')
        max_slew = np.max(accomplished_fields['SLEW'])
        plt.hist(accomplished_fields['SLEW'], bins=np.arange(20., max_slew + 2., 1.), color='green')
        plt.xlabel('Slew Angle (deg)')
        plt.ylabel('Number of Fields')
        plt.xlim(20., max(40., max_slew + 2.))

    if tag is not None:
        plt.savefig('slew_hist_%s.pdf'%(tag))

    plt.figure()
    #plt.scatter(np.arange(len(accomplished_fields['SLEW'])), accomplished_fields['SLEW'], edgecolor='none', alpha=0.33)
    plt.scatter(np.arange(len(accomplished_fields['SLEW'])), accomplished_fields['SLEW'], marker='x')
    plt.xlabel('Sequential Field Observed')
    plt.ylabel('Slew Angle (deg)')
    plt.xlim(0., len(accomplished_fields['SLEW']) + 1)
    plt.ylim(0., 30.)
    if tag is not None:
        plt.savefig('slew_sequential_%s.pdf'%(tag))

    cut = accomplished_fields['SLEW'] > 10.
    for index in np.nonzero(cut)[0]:
        print accomplished_fields['SLEW'][index], accomplished_fields['RA'][index], accomplished_fields['DEC'][index], accomplished_fields['DATE'][index]
############################################################
def slewAnalysis(infile_accomplished_fields):
    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)

    cut = accomplished_fields['SLEW'] > 10.
    for index in np.nonzero(cut)[0]:
        date = accomplished_fields['DATE'][index]
        fig, basemap = obztak.utils.ortho.makePlot(date, figsize=(10.5, 8.5), s=50, dpi=80, airmass=False, moon=False, center=(0., -90.), name='ortho')
        index_min = max(0, index - 10)
        index_max = min(len(accomplished_fields['DATE']), index + 11)
        proj = obztak.utils.ortho.safeProj(basemap,
                                           accomplished_fields['RA'][index_min:index_max],
                                           accomplished_fields['DEC'][index_min:index_max])
        basemap.scatter(*proj, c=np.arange(index_min, index_max), edgecolor='none', s=50, vmin=index_min, vmax=index_max, cmap='Spectral')
        colorbar = plt.colorbar(label='Index')
        raw_input('%i %.1f'%(index, accomplished_fields['SLEW'][index]))
        plt.clf()
############################################################
def hourAngle(infile_accomplished_fields, tag=None):
    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)

    plt.figure()
    #plt.scatter(np.arange(len(accomplished_fields['SLEW'])), accomplished_fields['SLEW'], edgecolor='none', alpha=0.33)
    plt.scatter(np.arange(len(accomplished_fields['HOURANGLE'])), accomplished_fields['HOURANGLE'], marker='x')
    plt.xlabel('Sequential Field Observed')
    plt.ylabel('Hour Angle (deg)')
    plt.xlim(0., len(accomplished_fields['HOURANGLE']) + 1)
    #plt.ylim(300., 420.)
    if tag is not None:
        plt.savefig('hour_angle_sequential_%s.pdf'%(tag))

    plt.figure()
    plt.scatter(accomplished_fields['DEC'], accomplished_fields['HOURANGLE'], c=np.arange(len(accomplished_fields['HOURANGLE'])), marker='x')
    plt.xlabel('Dec (deg)')
    plt.ylabel('Hour Angle (deg)')
    #plt.xlim(0., len(accomplished_fields['HOURANGLE']) + 1)
    #plt.ylim(300., 420.)
    if tag is not None:
        plt.savefig('dec_hour_angle_%s.pdf'%(tag))
############################################################
def airmass(infile_accomplished_fields, tag=None):
    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)

    plt.figure()
    plt.hist(accomplished_fields['AIRMASS'], bins=np.linspace(1., 2., 21), color='red')
    plt.xlabel('Airmass')
    plt.ylabel('Number of Fields')
    plt.xlim(1., 2.)
    if tag is not None:
        plt.savefig('airmass_hist_%s.pdf'%(tag))

    plt.figure()
    plt.scatter(np.arange(len(accomplished_fields['AIRMASS'])), accomplished_fields['AIRMASS'], marker='x', color='red')
    plt.xlabel('Sequential Field Observed')
    plt.ylabel('Airmass')
    plt.xlim(0., len(accomplished_fields['AIRMASS']) + 1)
    plt.ylim(1., 2.)
    if tag is not None:
        plt.savefig('airmass_sequential_%s.pdf'%(tag))

    fig, basemap = obztak.utils.ortho.makePlot(accomplished_fields['DATE'][0], figsize=(10.5, 8.5), s=50, dpi=80, center=(0., -90.), airmass=False, moon=False)
    proj = obztak.utils.ortho.safeProj(basemap, accomplished_fields['RA'], accomplished_fields['DEC'])
    basemap.scatter(*proj, c=accomplished_fields['AIRMASS'], edgecolor='none', s=50, alpha=0.5, vmin=np.min(accomplished_fields['AIRMASS']), vmax=2, cmap='RdYlGn_r')
    colorbar = plt.colorbar(label='Airmass')
    if tag is not None:
        plt.savefig('airmass_map_%s.pdf'%(tag))
############################################################
def progress(infile_accomplished_fields, date, infile_target_fields=None, tag=None):
    if type(date) != ephem.Date:
        date = ephem.Date(date)

    #accomplished_fields = np.recfromtxt(infile_accomplished_fields, delimiter=',', names=True)
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)

    date_array = np.tile(0., len(accomplished_fields['DATE']))
    for ii in range(0, len(accomplished_fields['DATE'])):
        date_array[ii] = ephem.Date(accomplished_fields['DATE'][ii]).real
    cut_accomplished = date_array <= date.real

    fig, basemap = obztak.utils.ortho.makePlot(date, figsize=(10.5, 8.5), s=50, dpi=80, airmass=False, moon=False, center=(0., -90.))

    if infile_target_fields is not None:
        #target_fields = np.recfromtxt(infile_target_fields, delimiter=',', names=True)
        target_fields = obztak.utils.fileio.csv2rec(infile_target_fields)
        proj = obztak.utils.ortho.safeProj(basemap,
                                           target_fields['RA'][np.logical_not(cut_accomplished)],
                                           target_fields['DEC'][np.logical_not(cut_accomplished)])
        basemap.scatter(*proj, c=np.tile(0, np.sum(np.logical_not(cut_accomplished))), edgecolor='none', s=50, vmin=0, vmax=4, cmap='summer_r')

    proj = obztak.utils.ortho.safeProj(basemap, accomplished_fields['RA'][cut_accomplished], accomplished_fields['DEC'][cut_accomplished])
    basemap.scatter(*proj, c=accomplished_fields['TILING'][cut_accomplished], edgecolor='none', s=50, vmin=0, vmax=4, cmap='summer_r')
    colorbar = plt.colorbar(label='Tiling')

    if tag is not None:
        plt.savefig('progress_map_%s_%s.pdf'%(obztak.utils.ortho.datestring(date).replace('/', '_').replace(':', '_').replace(' ', '_'), tag))
############################################################
def tiling(infile_accomplished_fields, infile_target_fields=None, tag=None):
    """
    Survey progress in terms of tiling coverage
    """
    accomplished_fields = obztak.utils.fileio.csv2rec(infile_accomplished_fields)
    #print accomplished_fields
    #print accomplished_fields.dtype

    if infile_target_fields is not None:
        target_fields = obztak.utils.fileio.csv2rec(infile_target_fields)

    x = np.arange(len(accomplished_fields['TILING']))

    plt.figure()
    for ii, color in enumerate(['black', 'red', 'blue', 'green']):
        if infile_target_fields is not None:
            print ii + 1, np.sum(target_fields['TILING'] == (ii + 1))
            y = np.cumsum(accomplished_fields['TILING'] == (ii + 1)) / float(np.sum(target_fields['TILING'] == (ii + 1)))
        else:
            y = np.cumsum(accomplished_fields['TILING'] == (ii + 1))
        plt.plot(x, y, c=color, label='Tiling %i'%(ii))
    plt.xlabel('Sequential Field Observed')
    if infile_target_fields is not None:
        plt.ylabel('Fraction of Tiling Complete')
    else:
        plt.ylabel('Completed Exposures in Tiling')
    plt.legend(frameon=False, loc='upper left')
    if tag is not None:
        plt.savefig('tiling_%s.pdf'%(tag))
############################################################
if __name__ == '__main__':
#from obztak.utils.parser import Parser
#description = __doc__
#parser = Parser(description=description)
import argparse
description = __doc__
formatter = argparse.ArgumentDefaultsHelpFormatter
parser = argparse.ArgumentParser(description=description,
formatter_class=formatter)
parser.add_argument('-s','--scheduled',default='scheduled_fields.csv',
help='infile of scheduled fields')
parser.add_argument('-t','--tag',default=None,
help='save plots with the given tag.')
args = parser.parse_args()
print args
#if len(sys.argv) > 1:
# infile_scheduled_fields = sys.argv[1]
#else:
# infile_scheduled_fields = 'scheduled_fields.csv'
#movie('accomplished_fields.txt', infile_target_fields='target_fields.txt', outdir='movie')
#movie('accomplished_fields.txt', infile_target_fields='target_fields.txt', outdir='movie2', chunk=2)
#movie('accomplished_fields.txt', infile_target_fields='target_fields.txt', outdir='movie3', chunk=2)
#slew('accomplished_fields.txt')
#airmass('accomplished_fields.txt')
#progress('accomplished_fields.txt', '2016/6/30 10:32:50', infile_target_fields='target_fields.txt')
#progress('accomplished_fields.txt', '2017/6/30 10:32:51', infile_target_fields='target_fields.txt')
progress(args.scheduled, '2017/6/30 10:32:51', infile_target_fields='target_fields.csv', tag=args.tag)
slew(args.scheduled, tag=args.tag)
#slewAnalysis(args.scheduled)
airmass(args.scheduled, tag=args.tag)
#hourAngle(args.scheduled)
tiling(args.scheduled, infile_target_fields='target_fields.csv', tag=args.tag)
    input('...wait...')
############################################################
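The cumulative-completion computation used in `tiling` above can be sketched in isolation with NumPy; the helper name and arguments here are illustrative, not obztak API:

```python
import numpy as np

def tiling_fraction(accomplished_tilings, target_tilings, tiling):
    # Fraction of the given tiling completed after each sequential observation:
    # cumulative count of accomplished fields in this tiling, divided by the
    # total number of target fields assigned to it.
    total = float(np.sum(target_tilings == tiling))
    return np.cumsum(accomplished_tilings == tiling) / total
```

For example, with target tilings `[1, 1, 1, 2]` and accomplished tilings `[1, 2, 1, 1]`, tiling 1 progresses in steps of one third.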
# --- kernel/arch/arm/3ds11/config.py (MTGos/mtgos, BSD-2-Clause) ---
config["LOWEST_CPU"] = "arm11mpcore"
config["ENABLE_HARD"] = get_yes_no("Enable VFP ABI", True)
if not config["ENABLE_HARD"]:
config["ENABLE_THUMB"] = get_yes_no("Enable Thumb")
import sys
sys.argv = ["", "kernel/mmaps/3ds11.mc"]
from buildtools import mmapcomp
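The config fragment above depends on a `get_yes_no` prompt from the surrounding build system; a non-interactive sketch with a stub prompt (the stub returning the default is an assumption about `get_yes_no`'s signature):

```python
def get_yes_no(question, default=False):
    # Stub: the real build tool asks on the console; here we take the default.
    return default

config = {}
config["LOWEST_CPU"] = "arm11mpcore"
config["ENABLE_HARD"] = get_yes_no("Enable VFP ABI", True)
if not config["ENABLE_HARD"]:
    config["ENABLE_THUMB"] = get_yes_no("Enable Thumb")
```

With the defaults above, the VFP ABI is enabled and the Thumb question is never asked.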
# --- catalog/bindings/wfs/property_is_nil.py (NIVANorge/s-enda-playground, Apache-2.0) ---
from dataclasses import dataclass
from bindings.wfs.property_is_nil_type import PropertyIsNilType
__NAMESPACE__ = "http://www.opengis.net/fes/2.0"
@dataclass
class PropertyIsNil(PropertyIsNilType):
class Meta:
namespace = "http://www.opengis.net/fes/2.0"
# --- textattack/goal_functions/classification/targeted_classification.py (heytitle/TextAttack, MIT) ---
from .classification_goal_function import ClassificationGoalFunction
class TargetedClassification(ClassificationGoalFunction):
"""
A targeted attack on classification models which attempts to maximize the
    score of the target label. Complete when the target label is the predicted label.
"""
def __init__(self, *args, target_class=0, **kwargs):
super().__init__(*args, **kwargs)
self.target_class = target_class
def _is_goal_complete(self, model_output, _):
return (
self.target_class == model_output.argmax()
) or self.ground_truth_output == self.target_class
def _get_score(self, model_output, _):
if self.target_class < 0 or self.target_class >= len(model_output):
raise ValueError(
f"target class set to {self.target_class} with {len(model_output)} classes."
)
else:
return model_output[self.target_class]
def extra_repr_keys(self):
return ["target_class"]
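The completion rule in `_is_goal_complete` can be exercised without TextAttack; a sketch using a plain list of scores in place of the model-output tensor (a pure-Python argmax replaces `model_output.argmax()`):

```python
def is_goal_complete(scores, target_class, ground_truth_output):
    # Success when the target class is the predicted (argmax) label,
    # or the ground truth already equals the target class.
    predicted = max(range(len(scores)), key=scores.__getitem__)
    return predicted == target_class or ground_truth_output == target_class
```

For example, `is_goal_complete([0.1, 0.2, 0.7], target_class=2, ground_truth_output=0)` is `True` via the argmax branch.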
# --- src/python/intensity/logging.py (kripken/intensityengine, MIT) ---
# Copyright 2010 Alon Zakai ('kripken'). All rights reserved.
# This file is part of Syntensity/the Intensity Engine, an open source project. See COPYING.txt for licensing.
"""
A simple logging system with variable levels of logging detail. This module is a thin
wrapper around our C++ Logging module.
"""
import c_module
CModule = c_module.CModule.holder
## Storage for logging constants.
class logging: # Must be synchronized with logging.h/cpp
INFO = 0
DEBUG = 1
WARNING = 2
ERROR = 3
OFF = 4
strings = ['INFO', 'DEBUG', 'WARNING', 'ERROR', 'OFF']
@staticmethod
def should_show(level):
return CModule.should_show(level)
#curr_level = logging.DEBUG # TODO : load from config
## Write a message to the log.
## @param level The level of this message. If the current logging level allows this message level to be shown, it will, otherwise not.
## For example, if this is a WARNING, and we are running at a current level of INFO, then we would show INFO and all more serious
## messages, and in particular this WARNING one.
## @param message The content of the message, in string form.
def log(level, message):
CModule.log(level, message)
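Since the actual filtering lives in the C++ Logging module, here is only a hedged sketch of the intended `should_show` semantics, inferred from the level ordering above (an assumption, not the real C++ implementation):

```python
# Level constants mirroring the logging class above.
INFO, DEBUG, WARNING, ERROR, OFF = range(5)

def should_show(message_level, current_level):
    # A message is shown when it is at least as serious as the current level.
    return message_level >= current_level
```

At a current level of INFO, a WARNING is shown; at a current level of ERROR, it is not.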
# --- dev/scripts/clear_testing.py (rubik-ai/koku, Apache-2.0) ---
#
# Copyright 2021 Red Hat Inc.
# SPDX-License-Identifier: Apache-2.0
#
"""Clear out our local testing directories."""
import argparse
import os
import shutil
TESTING_DIRS = [
"local_providers/aws_local",
"local_providers/aws_local_0",
"local_providers/aws_local_1",
"local_providers/aws_local_2",
"local_providers/aws_local_3",
"local_providers/aws_local_4",
"local_providers/aws_local_5",
"local_providers/azure_local",
"local_providers/gcp_local",
"local_providers/gcp_local_0",
"local_providers/gcp_local_1",
"local_providers/gcp_local_2",
"local_providers/gcp_local_3",
"local_providers/insights_local",
"pvc_dir/insights_local",
"pvc_dir/processing",
"parquet_data",
]
def main(*args, **kwargs):
testing_path = kwargs["testing_path"]
paths_to_clear = [f"{testing_path}/{directory}" for directory in TESTING_DIRS]
for path in paths_to_clear:
try:
print(f"Checking {path}")
dirs_to_remove = [f.path for f in os.scandir(path) if f.is_dir()]
for directory in dirs_to_remove:
print(f"Removing {directory}")
shutil.rmtree(directory)
except FileNotFoundError as err:
print(err)
if __name__ == "__main__":
PARSER = argparse.ArgumentParser()
PARSER.add_argument("-p", "--path", dest="testing_path", help="The path to the testing directory", required=True)
ARGS = vars(PARSER.parse_args())
main(**ARGS)
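The subdirectory enumeration in `main` above can be checked against a scratch tree; a self-contained sketch using a temporary directory:

```python
import os
import shutil
import tempfile

# Build a scratch path with one subdirectory and one plain file, then list
# subdirectories the same way main() does (directory entries only).
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "sub"))
open(os.path.join(root, "keep.txt"), "w").close()
dirs_to_remove = [f.path for f in os.scandir(root) if f.is_dir()]
for directory in dirs_to_remove:
    shutil.rmtree(directory)
remaining = sorted(os.listdir(root))
shutil.rmtree(root)
```

Only the subdirectory is removed; plain files at the top level survive.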
# --- tests/search/chembl/test_indication_search.py (dmyersturnbull/chembler, Apache-2.0) ---
from collections.abc import Sequence
import pytest
from mandos.entry.api_singletons import Apis
from mandos.model.hits import AbstractHit
from mandos.search.chembl.indication_search import IndicationSearch
"""
class TestIndicationSearch:
def test_find(self):
search = IndicationSearch(key="indications", api=Apis.Chembl, min_phase=0)
inchikeys = get_test_resource("inchis.txt").read_text(encoding="utf8").splitlines()
hits: Sequence[AbstractHit] = search.find_all(inchikeys)
assert len(hits) == 6
assert {t.compound_name.lower() for t in hits} == {"alprazolam"}
assert {t.object_id for t in triples} == {
"D012559",
"D016584",
"D003704",
"D001008",
"D001007",
"D003866",
}
assert {t.obj.lower() for t in triples} == {
"schizophrenia",
"panic disorder",
"dementia",
"anxiety",
"depressive disorder",
"anxiety disorders",
}
assert {t.pred for t in triples} == {"phase-4 indication", "phase-3 indication"}
def test_cocaine_hcl(self):
api = ChemblApi.wrap(Chembl)
df, triples = Searcher(IndicationSearch(api, min_phase=3)).search_for(["CHEMBL529437"])
assert len(df) == 1
assert len(triples) == 1
assert triples[0].compound_name.lower() == "cocaine"
assert triples[0].compound_id == "CHEMBL370805"
assert triples[0].object_id == "D000758"
assert triples[0].obj == "Anesthesia"
assert triples[0].pred == "phase-4 indication"
if __name__ == "__main__":
pytest.main()
"""
# --- lucario_reasons/pokerank.py (Floozutter/lucario-reasons, Unlicense) ---
if __name__ == "__main__":
with open("poketags.txt", "r", encoding = "utf-8") as ifile:
poketags = tuple(ifile.read().strip().split())
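The same parse applied to an in-memory string instead of `poketags.txt` shows the whitespace splitting:

```python
text = "pikachu lucario\nmewtwo\n"
# split() with no argument splits on any run of whitespace, including newlines.
poketags = tuple(text.strip().split())
```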
# --- RPi_local_backup_sql/upload_test.py (LisaTsai/CF_2018, MIT) ---
#!/usr/bin/env python
import MySQLdb
db=MySQLdb.connect(host="192.168.50.98",user="root",passwd="ntubime405",db="PN_CF", port=3306)
cur=db.cursor()
try:
cur.execute("""DELETE FROM THI where 1""")
cur.execute("""INSERT INTO THI values(CURRENT_TIMESTAMP,'22','33','44','9')""")
db.commit()
db.close()
except:
    print("Error")
db.rollback()
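The INSERT above embeds its values directly in the SQL string; a safer sketch uses driver placeholders so MySQLdb escapes the values (the cursor call is shown commented since it needs a live connection):

```python
sql = "INSERT INTO THI VALUES (CURRENT_TIMESTAMP, %s, %s, %s, %s)"
params = ("22", "33", "44", "9")
# With a live MySQLdb cursor: cur.execute(sql, params)
```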
# --- output/models/ms_data/model_groups/mg_k009_xsd/__init__.py (tefra/xsdata-w3c-tests, MIT) ---
from output.models.ms_data.model_groups.mg_k009_xsd.mg_k009 import (
Doc,
Foo,
)
__all__ = [
"Doc",
"Foo",
]
# --- gosling/examples/__init__.py (gosling-lang/gos, MIT) ---
import pathlib
def iter_examples():
example_dir = pathlib.Path(__file__).parent.absolute()
for fp in example_dir.iterdir():
if fp.name.startswith("_") or fp.suffix != ".py":
continue
yield fp
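The skip rule in `iter_examples` can be checked against bare names; a sketch with `pathlib.PurePath` standing in for real directory entries:

```python
import pathlib

def is_example(name):
    # Same rule as iter_examples: skip underscore-prefixed and non-.py entries.
    p = pathlib.PurePath(name)
    return not p.name.startswith("_") and p.suffix == ".py"
```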
# --- tests/test_basics.py (roguextech/OpenRocket-pyjnius, MIT) ---
from __future__ import print_function
from __future__ import division
from __future__ import absolute_import
import sys
import unittest
from jnius.reflect import autoclass
try:
long
except NameError:
# Python 3
long = int
def py2_encode(uni):
if sys.version_info < (3, 0):
uni = uni.encode('utf-8')
return uni
class BasicsTest(unittest.TestCase):
def test_static_methods(self):
Test = autoclass('org.jnius.BasicsTest')
self.assertEqual(Test.methodStaticZ(), True)
self.assertEqual(Test.methodStaticB(), 127)
self.assertEqual(Test.methodStaticC(), 'k')
self.assertEqual(Test.methodStaticS(), 32767)
self.assertEqual(Test.methodStaticI(), 2147483467)
self.assertEqual(Test.methodStaticJ(), 9223372036854775807)
self.assertAlmostEqual(Test.methodStaticF(), 1.23456789)
self.assertEqual(Test.methodStaticD(), 1.23456789)
self.assertEqual(Test.methodStaticString(), py2_encode(u'hello \U0001F30E!'))
def test_static_fields(self):
Test = autoclass('org.jnius.BasicsTest')
self.assertEqual(Test.fieldStaticZ, True)
self.assertEqual(Test.fieldStaticB, 127)
self.assertEqual(Test.fieldStaticC, 'k')
self.assertEqual(Test.fieldStaticS, 32767)
self.assertEqual(Test.fieldStaticI, 2147483467)
self.assertEqual(Test.fieldStaticJ, 9223372036854775807)
self.assertAlmostEqual(Test.fieldStaticF, 1.23456789)
self.assertEqual(Test.fieldStaticD, 1.23456789)
self.assertEqual(Test.fieldStaticString, py2_encode(u'hello \U0001F30E!'))
def test_instance_methods(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodZ(), True)
self.assertEqual(test.methodB(), 127)
self.assertEqual(test.methodC(), 'k')
self.assertEqual(test.methodS(), 32767)
self.assertEqual(test.methodI(), 2147483467)
self.assertEqual(test.methodJ(), 9223372036854775807)
self.assertAlmostEqual(test.methodF(), 1.23456789)
self.assertEqual(test.methodD(), 1.23456789)
self.assertEqual(test.methodString(), py2_encode(u'hello \U0001F30E!'))
def test_instance_fields(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.fieldZ, True)
self.assertEqual(test.fieldB, 127)
self.assertEqual(test.fieldC, 'k')
self.assertEqual(test.fieldS, 32767)
self.assertEqual(test.fieldI, 2147483467)
self.assertEqual(test.fieldJ, 9223372036854775807)
self.assertAlmostEqual(test.fieldF, 1.23456789)
self.assertEqual(test.fieldD, 1.23456789)
self.assertEqual(test.fieldString, py2_encode(u'hello \U0001F30E!'))
test2 = autoclass('org.jnius.BasicsTest')(10)
self.assertEqual(test2.fieldB, 10)
self.assertEqual(test.fieldB, 127)
self.assertEqual(test2.fieldB, 10)
def test_instance_getter_naming(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.disabled, True)
self.assertEqual(test.enabled, False)
def test_instance_set_fields(self):
test = autoclass('org.jnius.BasicsTest')()
test.fieldSetZ = True
test.fieldSetB = 127
test.fieldSetC = ord('k')
test.fieldSetS = 32767
test.fieldSetI = 2147483467
test.fieldSetJ = 9223372036854775807
test.fieldSetF = 1.23456789
test.fieldSetD = 1.23456789
self.assertTrue(test.testFieldSetZ())
self.assertTrue(test.testFieldSetB())
self.assertTrue(test.testFieldSetC())
self.assertTrue(test.testFieldSetS())
self.assertTrue(test.testFieldSetI())
self.assertTrue(test.testFieldSetJ())
self.assertTrue(test.testFieldSetF())
self.assertTrue(test.testFieldSetD())
def test_instances_methods_array(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodArrayZ(), [True] * 3)
self.assertEqual(test.methodArrayB()[0], 127)
if sys.version_info >= (3, 0):
self.assertEqual(test.methodArrayB(), [127] * 3)
self.assertEqual(test.methodArrayC(), ['k'] * 3)
self.assertEqual(test.methodArrayS(), [32767] * 3)
self.assertEqual(test.methodArrayI(), [2147483467] * 3)
self.assertEqual(test.methodArrayJ(), [9223372036854775807] * 3)
ret = test.methodArrayF()
ref = [1.23456789] * 3
self.assertAlmostEqual(ret[0], ref[0])
self.assertAlmostEqual(ret[1], ref[1])
self.assertAlmostEqual(ret[2], ref[2])
self.assertEqual(test.methodArrayD(), [1.23456789] * 3)
self.assertEqual(test.methodArrayString(), [py2_encode(u'hello \U0001F30E!')] * 3)
def test_instances_methods_params(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsZBCSIJFD(
True, 127, 'k', 32767, 2147483467, 9223372036854775807, 1.23456789, 1.23456789), True)
self.assertEqual(test.methodParamsZBCSIJFD(
True, long(127), 'k', long(32767), long(2147483467), 9223372036854775807, 1.23456789, 1.23456789), True)
self.assertEqual(test.methodParamsString(py2_encode(u'hello \U0001F30E!')), True)
self.assertEqual(test.methodParamsArrayI([1, 2, 3]), True)
self.assertEqual(test.methodParamsArrayString([
py2_encode(u'hello'), py2_encode(u'\U0001F30E')]), True)
def test_instances_methods_params_object_list_str(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsObject([
'hello', 'world']), True)
def test_instances_methods_params_object_list_int(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsObject([1, 2]), True)
def test_instances_methods_params_object_list_float(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsObject([3.14, 1.61]), True)
def test_instances_methods_params_object_list_long(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsObject([1, 2]), True)
def test_instances_methods_params_array_byte(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodParamsArrayByte([127, 127, 127]), True)
ret = test.methodArrayB()
self.assertEqual(test.methodParamsArrayByte(ret), True)
def test_return_array_as_object_array_of_strings(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodReturnStrings(), [py2_encode(u'Hello'),
py2_encode(u'\U0001F30E')])
def test_return_array_as_object_of_integers(self):
test = autoclass('org.jnius.BasicsTest')()
self.assertEqual(test.methodReturnIntegers(), [1, 2])
# --- heat/core/tests/test_trigonometrics.py (sebimarkgraf/heat, MIT) ---
import math
import torch
import heat as ht
from .test_suites.basic_test import TestCase
class TestTrigonometrics(TestCase):
def test_arccos(self):
# base elements
elements = [-1.0, -0.83, -0.12, 0.0, 0.24, 0.67, 1.0]
comparison = torch.tensor(
elements, dtype=torch.float64, device=self.device.torch_device
).acos()
# arccos of float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_arccos = ht.acos(float32_tensor)
self.assertIsInstance(float32_arccos, ht.DNDarray)
self.assertEqual(float32_arccos.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_arccos.larray.double(), comparison))
# arccos of float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_arccos = ht.arccos(float64_tensor)
self.assertIsInstance(float64_arccos, ht.DNDarray)
self.assertEqual(float64_arccos.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_arccos.larray.double(), comparison))
# arccos of value out of domain
nan_tensor = ht.array([1.2])
nan_arccos = ht.arccos(nan_tensor)
self.assertIsInstance(float64_arccos, ht.DNDarray)
self.assertEqual(nan_arccos.dtype, ht.float32)
self.assertTrue(math.isnan(nan_arccos.larray.item()))
# check exceptions
with self.assertRaises(TypeError):
ht.arccos([1, 2, 3])
with self.assertRaises(TypeError):
ht.arccos("hello world")
def test_arcsin(self):
# base elements
elements = [-1.0, -0.83, -0.12, 0.0, 0.24, 0.67, 1.0]
comparison = torch.tensor(
elements, dtype=torch.float64, device=self.device.torch_device
).asin()
# arcsin of float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_arcsin = ht.asin(float32_tensor)
self.assertIsInstance(float32_arcsin, ht.DNDarray)
self.assertEqual(float32_arcsin.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_arcsin.larray.double(), comparison))
# arcsin of float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_arcsin = ht.arcsin(float64_tensor)
self.assertIsInstance(float64_arcsin, ht.DNDarray)
self.assertEqual(float64_arcsin.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_arcsin.larray.double(), comparison))
# arcsin of value out of domain
nan_tensor = ht.array([1.2])
nan_arcsin = ht.arcsin(nan_tensor)
self.assertIsInstance(float64_arcsin, ht.DNDarray)
self.assertEqual(nan_arcsin.dtype, ht.float32)
self.assertTrue(math.isnan(nan_arcsin.larray.item()))
# check exceptions
with self.assertRaises(TypeError):
ht.arcsin([1, 2, 3])
with self.assertRaises(TypeError):
ht.arcsin("hello world")
def test_arctan(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).atan()
# arctan of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_arctan = ht.arctan(float32_tensor)
self.assertIsInstance(float32_arctan, ht.DNDarray)
self.assertEqual(float32_arctan.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_arctan.larray.double(), comparison))
# arctan of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_arctan = ht.arctan(float64_tensor)
self.assertIsInstance(float64_arctan, ht.DNDarray)
self.assertEqual(float64_arctan.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_arctan.larray.double(), comparison))
# arctan of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_arctan = ht.arctan(int32_tensor)
self.assertIsInstance(int32_arctan, ht.DNDarray)
self.assertEqual(int32_arctan.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_arctan.larray.double(), comparison))
# arctan of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_arctan = ht.atan(int64_tensor)
self.assertIsInstance(int64_arctan, ht.DNDarray)
self.assertEqual(int64_arctan.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_arctan.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.arctan([1, 2, 3])
with self.assertRaises(TypeError):
ht.arctan("hello world")
def test_arctan2(self):
float32_y = torch.randn(30, device=self.device.torch_device)
float32_x = torch.randn(30, device=self.device.torch_device)
float32_comparison = torch.atan2(float32_y, float32_x)
float32_arctan2 = ht.arctan2(ht.array(float32_y), ht.array(float32_x))
self.assertIsInstance(float32_arctan2, ht.DNDarray)
self.assertEqual(float32_arctan2.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_arctan2.larray, float32_comparison))
float64_y = torch.randn(30, dtype=torch.float64, device=self.device.torch_device)
float64_x = torch.randn(30, dtype=torch.float64, device=self.device.torch_device)
float64_comparison = torch.atan2(float64_y, float64_x)
float64_arctan2 = ht.atan2(ht.array(float64_y), ht.array(float64_x))
self.assertIsInstance(float64_arctan2, ht.DNDarray)
self.assertEqual(float64_arctan2.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_arctan2.larray, float64_comparison))
# Rare Special Case with integers
int32_x = ht.array([-1, +1, +1, -1])
int32_y = ht.array([-1, -1, +1, +1])
int32_comparison = ht.array([-135.0, -45.0, 45.0, 135.0], dtype=ht.float64)
int32_arctan2 = ht.arctan2(int32_y, int32_x) * 180 / ht.pi
self.assertIsInstance(int32_arctan2, ht.DNDarray)
self.assertEqual(int32_arctan2.dtype, ht.float64)
self.assertTrue(ht.allclose(int32_arctan2, int32_comparison))
int16_x = ht.array([-1, +1, +1, -1], dtype=ht.int16)
int16_y = ht.array([-1, -1, +1, +1], dtype=ht.int16)
int16_comparison = ht.array([-135.0, -45.0, 45.0, 135.0], dtype=ht.float32)
int16_arctan2 = ht.arctan2(int16_y, int16_x) * 180.0 / ht.pi
self.assertIsInstance(int16_arctan2, ht.DNDarray)
self.assertEqual(int16_arctan2.dtype, ht.float32)
self.assertTrue(ht.allclose(int16_arctan2, int16_comparison))
def test_degrees(self):
# base elements
elements = [0.0, 0.2, 0.6, 0.9, 1.2, 2.7, 3.14]
comparison = (
180.0
* torch.tensor(elements, dtype=torch.float64, device=self.device.torch_device)
/ 3.141592653589793
)
# degrees with float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_degrees = ht.degrees(float32_tensor)
self.assertIsInstance(float32_degrees, ht.DNDarray)
self.assertEqual(float32_degrees.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_degrees.larray.double(), comparison))
# degrees with float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_degrees = ht.degrees(float64_tensor)
self.assertIsInstance(float64_degrees, ht.DNDarray)
self.assertEqual(float64_degrees.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_degrees.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.degrees([1, 2, 3])
with self.assertRaises(TypeError):
ht.degrees("hello world")
def test_deg2rad(self):
# base elements
elements = [0.0, 20.0, 45.0, 78.0, 94.0, 120.0, 180.0, 270.0, 311.0]
comparison = (
3.141592653589793
* torch.tensor(elements, dtype=torch.float64, device=self.device.torch_device)
/ 180.0
)
# deg2rad with float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_deg2rad = ht.deg2rad(float32_tensor)
self.assertIsInstance(float32_deg2rad, ht.DNDarray)
self.assertEqual(float32_deg2rad.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_deg2rad.larray.double(), comparison))
# deg2rad with float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_deg2rad = ht.deg2rad(float64_tensor)
self.assertIsInstance(float64_deg2rad, ht.DNDarray)
self.assertEqual(float64_deg2rad.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_deg2rad.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.deg2rad([1, 2, 3])
with self.assertRaises(TypeError):
ht.deg2rad("hello world")
def test_cos(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).cos()
# cosine of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_cos = ht.cos(float32_tensor)
self.assertIsInstance(float32_cos, ht.DNDarray)
self.assertEqual(float32_cos.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_cos.larray.double(), comparison))
# cosine of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_cos = ht.cos(float64_tensor)
self.assertIsInstance(float64_cos, ht.DNDarray)
self.assertEqual(float64_cos.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_cos.larray.double(), comparison))
# cosine of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_cos = ht.cos(int32_tensor)
self.assertIsInstance(int32_cos, ht.DNDarray)
self.assertEqual(int32_cos.dtype, ht.float32)
        self.assertTrue(torch.allclose(int32_cos.larray.double(), comparison))
# cosine of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_cos = int64_tensor.cos()
self.assertIsInstance(int64_cos, ht.DNDarray)
self.assertEqual(int64_cos.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_cos.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.cos([1, 2, 3])
with self.assertRaises(TypeError):
ht.cos("hello world")
def test_cosh(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).cosh()
# hyperbolic cosine of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_cosh = float32_tensor.cosh()
self.assertIsInstance(float32_cosh, ht.DNDarray)
self.assertEqual(float32_cosh.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_cosh.larray.double(), comparison))
# hyperbolic cosine of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_cosh = ht.cosh(float64_tensor)
self.assertIsInstance(float64_cosh, ht.DNDarray)
self.assertEqual(float64_cosh.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_cosh.larray.double(), comparison))
# hyperbolic cosine of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_cosh = ht.cosh(int32_tensor)
self.assertIsInstance(int32_cosh, ht.DNDarray)
self.assertEqual(int32_cosh.dtype, ht.float32)
        self.assertTrue(torch.allclose(int32_cosh.larray.double(), comparison))
# hyperbolic cosine of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_cosh = ht.cosh(int64_tensor)
self.assertIsInstance(int64_cosh, ht.DNDarray)
self.assertEqual(int64_cosh.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_cosh.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.cosh([1, 2, 3])
with self.assertRaises(TypeError):
ht.cosh("hello world")
def test_rad2deg(self):
# base elements
elements = [0.0, 0.2, 0.6, 0.9, 1.2, 2.7, 3.14]
comparison = (
180.0
* torch.tensor(elements, dtype=torch.float64, device=self.device.torch_device)
/ 3.141592653589793
)
# rad2deg with float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_rad2deg = ht.rad2deg(float32_tensor)
self.assertIsInstance(float32_rad2deg, ht.DNDarray)
self.assertEqual(float32_rad2deg.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_rad2deg.larray.double(), comparison))
# rad2deg with float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_rad2deg = ht.rad2deg(float64_tensor)
self.assertIsInstance(float64_rad2deg, ht.DNDarray)
self.assertEqual(float64_rad2deg.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_rad2deg.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.rad2deg([1, 2, 3])
with self.assertRaises(TypeError):
ht.rad2deg("hello world")
def test_radians(self):
# base elements
elements = [0.0, 20.0, 45.0, 78.0, 94.0, 120.0, 180.0, 270.0, 311.0]
comparison = (
3.141592653589793
* torch.tensor(elements, dtype=torch.float64, device=self.device.torch_device)
/ 180.0
)
# radians with float32
float32_tensor = ht.array(elements, dtype=ht.float32)
float32_radians = ht.radians(float32_tensor)
self.assertIsInstance(float32_radians, ht.DNDarray)
self.assertEqual(float32_radians.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_radians.larray.double(), comparison))
# radians with float64
float64_tensor = ht.array(elements, dtype=ht.float64)
float64_radians = ht.radians(float64_tensor)
self.assertIsInstance(float64_radians, ht.DNDarray)
self.assertEqual(float64_radians.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_radians.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.radians([1, 2, 3])
with self.assertRaises(TypeError):
ht.radians("hello world")
def test_sin(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).sin()
# sine of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_sin = float32_tensor.sin()
self.assertIsInstance(float32_sin, ht.DNDarray)
self.assertEqual(float32_sin.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_sin.larray.double(), comparison))
# sine of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_sin = ht.sin(float64_tensor)
self.assertIsInstance(float64_sin, ht.DNDarray)
self.assertEqual(float64_sin.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_sin.larray.double(), comparison))
# sine of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_sin = ht.sin(int32_tensor)
self.assertIsInstance(int32_sin, ht.DNDarray)
self.assertEqual(int32_sin.dtype, ht.float32)
self.assertTrue(torch.allclose(int32_sin.larray.double(), comparison))
# sine of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_sin = ht.sin(int64_tensor)
self.assertIsInstance(int64_sin, ht.DNDarray)
self.assertEqual(int64_sin.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_sin.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.sin([1, 2, 3])
with self.assertRaises(TypeError):
ht.sin("hello world")
def test_sinh(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).sinh()
# hyperbolic sine of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_sinh = float32_tensor.sinh()
self.assertIsInstance(float32_sinh, ht.DNDarray)
self.assertEqual(float32_sinh.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_sinh.larray.double(), comparison))
# hyperbolic sine of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_sinh = ht.sinh(float64_tensor)
self.assertIsInstance(float64_sinh, ht.DNDarray)
self.assertEqual(float64_sinh.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_sinh.larray.double(), comparison))
# hyperbolic sine of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_sinh = ht.sinh(int32_tensor)
self.assertIsInstance(int32_sinh, ht.DNDarray)
self.assertEqual(int32_sinh.dtype, ht.float32)
self.assertTrue(torch.allclose(int32_sinh.larray.double(), comparison))
# hyperbolic sine of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_sinh = ht.sinh(int64_tensor)
self.assertIsInstance(int64_sinh, ht.DNDarray)
self.assertEqual(int64_sinh.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_sinh.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.sinh([1, 2, 3])
with self.assertRaises(TypeError):
ht.sinh("hello world")
def test_tan(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).tan()
# tangent of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_tan = float32_tensor.tan()
self.assertIsInstance(float32_tan, ht.DNDarray)
self.assertEqual(float32_tan.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_tan.larray.double(), comparison))
# tangent of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_tan = ht.tan(float64_tensor)
self.assertIsInstance(float64_tan, ht.DNDarray)
self.assertEqual(float64_tan.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_tan.larray.double(), comparison))
# tangent of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_tan = ht.tan(int32_tensor)
self.assertIsInstance(int32_tan, ht.DNDarray)
self.assertEqual(int32_tan.dtype, ht.float32)
self.assertTrue(torch.allclose(int32_tan.larray.double(), comparison))
# tangent of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_tan = ht.tan(int64_tensor)
self.assertIsInstance(int64_tan, ht.DNDarray)
self.assertEqual(int64_tan.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_tan.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.tan([1, 2, 3])
with self.assertRaises(TypeError):
ht.tan("hello world")
def test_tanh(self):
# base elements
elements = 30
comparison = torch.arange(
elements, dtype=torch.float64, device=self.device.torch_device
).tanh()
# hyperbolic tangent of float32
float32_tensor = ht.arange(elements, dtype=ht.float32)
float32_tanh = float32_tensor.tanh()
self.assertIsInstance(float32_tanh, ht.DNDarray)
self.assertEqual(float32_tanh.dtype, ht.float32)
self.assertTrue(torch.allclose(float32_tanh.larray.double(), comparison))
# hyperbolic tangent of float64
float64_tensor = ht.arange(elements, dtype=ht.float64)
float64_tanh = ht.tanh(float64_tensor)
self.assertIsInstance(float64_tanh, ht.DNDarray)
self.assertEqual(float64_tanh.dtype, ht.float64)
self.assertTrue(torch.allclose(float64_tanh.larray.double(), comparison))
# hyperbolic tangent of ints, automatic conversion to intermediate floats
int32_tensor = ht.arange(elements, dtype=ht.int32)
int32_tanh = ht.tanh(int32_tensor)
self.assertIsInstance(int32_tanh, ht.DNDarray)
self.assertEqual(int32_tanh.dtype, ht.float32)
self.assertTrue(torch.allclose(int32_tanh.larray.double(), comparison))
# hyperbolic tangent of longs, automatic conversion to intermediate floats
int64_tensor = ht.arange(elements, dtype=ht.int64)
int64_tanh = ht.tanh(int64_tensor)
self.assertIsInstance(int64_tanh, ht.DNDarray)
self.assertEqual(int64_tanh.dtype, ht.float64)
self.assertTrue(torch.allclose(int64_tanh.larray.double(), comparison))
# check exceptions
with self.assertRaises(TypeError):
ht.tanh([1, 2, 3])
with self.assertRaises(TypeError):
ht.tanh("hello world")
# --- File: src/sima/hydro/linearcurrentcoefficientitem.py (SINTEF/simapy, MIT) ---

# This is an autogenerated file
#
# Generated with LinearCurrentCoefficientItem
from typing import Dict,Sequence,List
from dmt.entity import Entity
from dmt.blueprint import Blueprint
from .blueprints.linearcurrentcoefficientitem import LinearCurrentCoefficientItemBlueprint
from typing import Dict
from sima.sima.moao import MOAO
from sima.sima.scriptablevalue import ScriptableValue
class LinearCurrentCoefficientItem(MOAO):
"""
Keyword arguments
-----------------
name : str
(default "")
description : str
(default "")
_id : str
(default "")
scriptableValues : List[ScriptableValue]
direction : float
Direction(default 0.0)
c11 : float
Linear current force coefficient for 1. degree of freedom(default 0.0)
c12 : float
Linear current force coefficient for 2. degree of freedom(default 0.0)
c13 : float
Linear current force coefficient for 3. degree of freedom(default 0.0)
c14 : float
Linear current force coefficient for 4. degree of freedom(default 0.0)
c15 : float
Linear current force coefficient for 5. degree of freedom(default 0.0)
c16 : float
Linear current force coefficient for 6. degree of freedom(default 0.0)
"""
def __init__(self , name="", description="", _id="", direction=0.0, c11=0.0, c12=0.0, c13=0.0, c14=0.0, c15=0.0, c16=0.0, **kwargs):
super().__init__(**kwargs)
self.name = name
self.description = description
self._id = _id
self.scriptableValues = list()
self.direction = direction
self.c11 = c11
self.c12 = c12
self.c13 = c13
self.c14 = c14
self.c15 = c15
self.c16 = c16
for key, value in kwargs.items():
if not isinstance(value, Dict):
setattr(self, key, value)
@property
def blueprint(self) -> Blueprint:
"""Return blueprint that this entity represents"""
return LinearCurrentCoefficientItemBlueprint()
@property
def name(self) -> str:
""""""
return self.__name
@name.setter
def name(self, value: str):
"""Set name"""
self.__name = str(value)
@property
def description(self) -> str:
""""""
return self.__description
@description.setter
def description(self, value: str):
"""Set description"""
self.__description = str(value)
@property
def _id(self) -> str:
""""""
return self.___id
@_id.setter
def _id(self, value: str):
"""Set _id"""
self.___id = str(value)
@property
def scriptableValues(self) -> List[ScriptableValue]:
""""""
return self.__scriptableValues
@scriptableValues.setter
def scriptableValues(self, value: List[ScriptableValue]):
"""Set scriptableValues"""
if not isinstance(value, Sequence):
            raise Exception("Expected sequence, but was ", type(value))
self.__scriptableValues = value
@property
def direction(self) -> float:
"""Direction"""
return self.__direction
@direction.setter
def direction(self, value: float):
"""Set direction"""
self.__direction = float(value)
@property
def c11(self) -> float:
"""Linear current force coefficient for 1. degree of freedom"""
return self.__c11
@c11.setter
def c11(self, value: float):
"""Set c11"""
self.__c11 = float(value)
@property
def c12(self) -> float:
"""Linear current force coefficient for 2. degree of freedom"""
return self.__c12
@c12.setter
def c12(self, value: float):
"""Set c12"""
self.__c12 = float(value)
@property
def c13(self) -> float:
"""Linear current force coefficient for 3. degree of freedom"""
return self.__c13
@c13.setter
def c13(self, value: float):
"""Set c13"""
self.__c13 = float(value)
@property
def c14(self) -> float:
"""Linear current force coefficient for 4. degree of freedom"""
return self.__c14
@c14.setter
def c14(self, value: float):
"""Set c14"""
self.__c14 = float(value)
@property
def c15(self) -> float:
"""Linear current force coefficient for 5. degree of freedom"""
return self.__c15
@c15.setter
def c15(self, value: float):
"""Set c15"""
self.__c15 = float(value)
@property
def c16(self) -> float:
"""Linear current force coefficient for 6. degree of freedom"""
return self.__c16
@c16.setter
def c16(self, value: float):
"""Set c16"""
self.__c16 = float(value)
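Every generated setter above funnels assignments through `float()` or `str()` rather than rejecting mismatched types. A minimal standalone sketch of that coercing-property pattern (a hypothetical `Coeff` class, not part of simapy):

```python
class Coeff:
    """Toy version of one generated coefficient property."""

    def __init__(self, c11=0.0):
        self.c11 = c11  # routed through the setter, so coercion applies

    @property
    def c11(self) -> float:
        return self.__c11

    @c11.setter
    def c11(self, value: float):
        # Mirrors the generated setters: coerce instead of type-checking
        self.__c11 = float(value)


c = Coeff(c11="3")  # a string is accepted and coerced
print(c.c11)        # 3.0
```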
# --- File: src/sims4communitylib/enums/skills_enum.py (velocist/TS4CheatsInfo, Apache-2.0) ---

"""
The Sims 4 Community Library is licensed under the Creative Commons Attribution 4.0 International public license (CC BY 4.0).
https://creativecommons.org/licenses/by/4.0/
https://creativecommons.org/licenses/by/4.0/legalcode
Copyright (c) COLONOLNUTTY
"""
from sims4communitylib.enums.enumtypes.common_int import CommonInt
class CommonSkillId(CommonInt):
"""Identifiers for sim skills.
"""
ADULT_MAJOR_ACTING: 'CommonSkillId' = 194727
ADULT_MAJOR_ARCHAEOLOGY: 'CommonSkillId' = 174237
ADULT_MAJOR_BAKING: 'CommonSkillId' = 104198
ADULT_MAJOR_BAR_TENDING: 'CommonSkillId' = 16695
ADULT_MAJOR_CHARISMA: 'CommonSkillId' = 16699
ADULT_MAJOR_COMEDY: 'CommonSkillId' = 16698
ADULT_MAJOR_DJ_MIXING: 'CommonSkillId' = 121612
ADULT_MAJOR_FABRICATION: 'CommonSkillId' = 231908
ADULT_MAJOR_FISHING: 'CommonSkillId' = 39397
ADULT_MAJOR_FLOWER_ARRANGING: 'CommonSkillId' = 186703
ADULT_MAJOR_GARDENING: 'CommonSkillId' = 16700
ADULT_MAJOR_GOURMET_COOKING: 'CommonSkillId' = 16701
ADULT_MAJOR_GUITAR: 'CommonSkillId' = 16702
ADULT_MAJOR_HANDINESS: 'CommonSkillId' = 16704
ADULT_MAJOR_HERBALISM: 'CommonSkillId' = 101920
ADULT_MAJOR_HOME_STYLE_COOKING: 'CommonSkillId' = 16705
ADULT_MAJOR_LOGIC: 'CommonSkillId' = 16706
ADULT_MAJOR_MISCHIEF: 'CommonSkillId' = 16707
ADULT_MAJOR_PAINTING: 'CommonSkillId' = 16708
ADULT_MAJOR_PARENTING: 'CommonSkillId' = 160504
ADULT_MAJOR_PHOTOGRAPHY: 'CommonSkillId' = 105774
ADULT_MAJOR_PIANO: 'CommonSkillId' = 16709
ADULT_MAJOR_PIPE_ORGAN: 'CommonSkillId' = 149665
ADULT_MAJOR_PROGRAMMING: 'CommonSkillId' = 16703
ADULT_MAJOR_ROCKET_SCIENCE: 'CommonSkillId' = 16710
ADULT_MAJOR_SINGING: 'CommonSkillId' = 137811
ADULT_MAJOR_VETERINARIAN: 'CommonSkillId' = 161190
ADULT_MAJOR_VIDEO_GAMING: 'CommonSkillId' = 16712
ADULT_MAJOR_VIOLIN: 'CommonSkillId' = 16713
ADULT_MAJOR_WELLNESS: 'CommonSkillId' = 117858
ADULT_MAJOR_WRITING: 'CommonSkillId' = 16714
ADULT_MINOR_DANCING: 'CommonSkillId' = 128145
ADULT_MINOR_JUICE_FIZZING: 'CommonSkillId' = 234806
ADULT_MINOR_LOCAL_CULTURE: 'CommonSkillId' = 174687
ADULT_MINOR_MEDIA_PRODUCTION: 'CommonSkillId' = 192655
BOWLING: 'CommonSkillId' = 158659
CHILD_CREATIVITY: 'CommonSkillId' = 16718
CHILD_MENTAL: 'CommonSkillId' = 16719
CHILD_MOTOR: 'CommonSkillId' = 16720
CHILD_SOCIAL: 'CommonSkillId' = 16721
DOG_TRAINING: 'CommonSkillId' = 161220
FITNESS: 'CommonSkillId' = 16659
HIDDEN_SKATING: 'CommonSkillId' = 179925
HIDDEN_TREAD_MILL_ROCK_CLIMBING_WALL_CLIMB: 'CommonSkillId' = 165900
HIDDEN_VAMPIRE_LORE: 'CommonSkillId' = 149556
OBJECT_UPGRADE_COMPUTER_GAMING: 'CommonSkillId' = 29027
RETAIL_MAINTENANCE: 'CommonSkillId' = 111904
RETAIL_SALES: 'CommonSkillId' = 111902
RETAIL_WORK_ETHIC: 'CommonSkillId' = 111903
TODDLER_COMMUNICATION: 'CommonSkillId' = 140170
TODDLER_IMAGINATION: 'CommonSkillId' = 140706
TODDLER_MOVEMENT: 'CommonSkillId' = 136140
TODDLER_POTTY: 'CommonSkillId' = 144913
TODDLER_THINKING: 'CommonSkillId' = 140504
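`CommonSkillId` maps in-game tuning IDs to readable names; the same forward and reverse lookups work with a stdlib `IntEnum` (a standalone sketch that skips the `CommonInt` base class):

```python
from enum import IntEnum

class SkillId(IntEnum):
    # Two sample IDs borrowed from the table above
    FITNESS = 16659
    BOWLING = 158659

# Reverse lookup from a raw tuning ID, and comparison against plain ints
print(SkillId(16659).name)        # FITNESS
print(SkillId.BOWLING == 158659)  # True
```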
# --- File: final_project/machinetranslation/tests/tests.py (andylbarnes76/xzceb-flask_eng_fr, Apache-2.0) ---

import unittest
from translator import english_to_french, french_to_english
class TestFrenchtoEnglish(unittest.TestCase):
    def test1(self):
        self.assertEqual(french_to_english("NULL"), "NULL")  # "NULL" is passed through unchanged
        self.assertEqual(french_to_english("Bonjour"), "Hello")  # French "Bonjour" translates to English "Hello"
class TestEnglishtoFrench(unittest.TestCase):
    def test1(self):
        self.assertEqual(english_to_french("NULL"), "NULL")  # "NULL" is passed through unchanged
        self.assertEqual(english_to_french("Hello"), "Bonjour")  # English "Hello" translates to French "Bonjour"
if __name__ == "__main__":
    unittest.main()

# --- File: config_example.py (ha107642/RoachFactsBot, MIT) ---

username = u"TestBot"
oauth = u"oauth:324ouinjsdfkj9234jf"
channels = [u"testchannel"]

# --- File: gnosis/eth/contracts/__init__.py (gosuto-ai/gnosis-py, MIT) ---

import json
import os
import sys
from typing import Any, Dict, Optional
from eth_typing import ChecksumAddress
from hexbytes import HexBytes
from web3 import Web3
from web3.contract import Contract
def load_contract_interface(file_name):
return _load_json_file(_abi_file_path(file_name))
def _abi_file_path(file):
return os.path.abspath(os.path.join(os.path.dirname(__file__), file))
def _load_json_file(path):
with open(path) as f:
return json.load(f)
current_module = sys.modules[__name__]
"""
Safe Addresses. Should be the same for every chain. Check:
https://github.com/gnosis/safe-contracts/blob/development/zos.mainnet.json
https://github.com/gnosis/safe-contracts/blob/development/zos.rinkeby.json
GnosisSafeV1.1.1: 0x34CfAC646f301356fAa8B21e94227e3583Fe3F5F
GnosisSafeV1.1.0: 0xaE32496491b53841efb51829d6f886387708F99B
GnosisSafeV1.0.0: 0xb6029EA3B2c51D09a50B53CA8012FeEB05bDa35A
Factories
ProxyFactoryV1.1.0: 0x50e55Af101C777bA7A1d560a774A82eF002ced9F
ProxyFactoryV1.0.0: 0x12302fE9c02ff50939BaAaaf415fc226C078613C
Libraries
CreateAndAddModules: 0x1a56aE690ab0818aF5cA349b7D21f1d7e76a3d36
MultiSend: 0xD4B7B161E4779629C2717385114Bf78D612aEa72
"""
contracts = {
'safe': 'GnosisSafe_V1_1_1.json',
'safe_V1_3_0': 'GnosisSafe_V1_3_0.json',
'safe_V1_0_0': 'GnosisSafe_V1_0_0.json',
'safe_V0_0_1': 'GnosisSafe_V0_0_1.json',
'erc20': 'ERC20.json',
'erc721': 'ERC721.json',
'example_erc20': 'ERC20TestToken.json',
'delegate_constructor_proxy': 'DelegateConstructorProxy.json',
'multi_send': 'MultiSend.json',
'paying_proxy': 'PayingProxy.json',
'proxy_factory': 'ProxyFactory_V1_3_0.json',
'proxy_factory_V1_1_1': 'ProxyFactory_V1_1_1.json',
'proxy_factory_V1_0_0': 'ProxyFactory_V1_0_0.json',
'proxy': 'Proxy_V1_1_1.json',
'uniswap_exchange': 'uniswap_exchange.json',
'uniswap_factory': 'uniswap_factory.json',
'uniswap_v2_factory': 'uniswap_v2_factory.json',
'uniswap_v2_pair': 'uniswap_v2_pair.json',
'uniswap_v2_router': 'uniswap_v2_router.json', # Router02
'kyber_network_proxy': 'kyber_network_proxy.json',
'cpk_factory': 'CPKFactory.json',
}
def generate_contract_fn(contract: Dict[str, Any]):
"""
Dynamically generate functions to work with the contracts
:param contract:
:return:
"""
def fn(w3: Web3, address: Optional[ChecksumAddress] = None):
return w3.eth.contract(address=address,
abi=contract['abi'],
bytecode=contract.get('bytecode'))
return fn
# Annotate functions that will be generated later with `setattr` so typing does not complain
def get_safe_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_safe_V1_3_0_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_safe_V1_0_0_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_safe_V0_0_1_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_erc20_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_erc721_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_example_erc20_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_delegate_constructor_proxy_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_multi_send_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_paying_proxy_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_proxy_factory_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_proxy_factory_V1_1_1_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_proxy_factory_V1_0_0_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_proxy_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_uniswap_exchange_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_uniswap_factory_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_uniswap_v2_factory_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_uniswap_v2_pair_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_uniswap_v2_router_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_kyber_network_proxy_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_cpk_factory_contract(w3: Web3, address: Optional[str] = None) -> Contract:
pass
def get_paying_proxy_deployed_bytecode() -> bytes:
return HexBytes(load_contract_interface('PayingProxy.json')['deployedBytecode'])
def get_proxy_1_0_0_deployed_bytecode() -> bytes:
return HexBytes(load_contract_interface('Proxy_V1_0_0.json')['deployedBytecode'])
for contract_name, json_contract_filename in contracts.items():
fn_name = 'get_{}_contract'.format(contract_name)
contract_dict = load_contract_interface(json_contract_filename)
setattr(current_module, fn_name, generate_contract_fn(contract_dict))
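The loop above materializes one `get_<name>_contract` function per dictionary entry at import time. A self-contained sketch of the same generate-then-`setattr` pattern, with hypothetical `greet_*` functions standing in for the contract getters (the real module assigns onto `sys.modules[__name__]`; for a single-file sketch, `globals()` is the same idea):

```python
registry = {"alice": "Hello, Alice", "bob": "Hello, Bob"}

def make_fn(message):
    # The closure captures the per-entry value, like generate_contract_fn above
    def fn():
        return message
    return fn

# Register one generated function per registry entry
for name, message in registry.items():
    globals()["greet_" + name] = make_fn(message)

print(greet_alice())  # Hello, Alice
```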
# --- File: tests/pytest/utils.py (dzubke/speech-lite, Apache-2.0) ---

# standard libraries
import glob
import os
import platform
# project libraries
from speech.utils.wave import array_from_wave
from speech.utils.signal_augment import inject_noise_sample
from speech.utils.compat import get_main_dir_path
def check_length(audio_path: str, noise_path: str, noise_level: float = 0.5):
    audio_data, samp_rate = array_from_wave(audio_path)
    # Smoke check: noise injection should succeed without raising
    audio_noise = inject_noise_sample(audio_data, samp_rate, noise_path,
                                      noise_level=noise_level, logger=None)
    return audio_noise
def get_all_test_audio():
system_main_dir = get_main_dir_path()
common_path = "tests/pytest/test_audio"
test_audio_dir = os.path.join(system_main_dir, common_path)
pattern = "*"
return glob.glob(os.path.join(test_audio_dir, pattern))
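`get_all_test_audio` above globs every file under the test-audio directory. The same `os.path.join` + `glob.glob` combination in isolation, exercised against a temporary directory instead of the repo layout:

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    # Create two placeholder audio files to match against
    for fname in ("a.wav", "b.wav"):
        open(os.path.join(d, fname), "w").close()
    # "*" matches every entry in the directory, as in get_all_test_audio
    matches = glob.glob(os.path.join(d, "*"))

print(sorted(os.path.basename(m) for m in matches))  # ['a.wav', 'b.wav']
```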
| 34.454545 | 73 | 0.770449 | 116 | 758 | 4.672414 | 0.387931 | 0.051661 | 0.083026 | 0.051661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003106 | 0.150396 | 758 | 21 | 74 | 36.095238 | 0.838509 | 0.047493 | 0 | 0 | 0 | 0 | 0.03338 | 0.031989 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
fe9ab7da55844d4a1fdd39cf799a850e990bc7bc | 216 | py | Python | mainwin.py | 916958205/HanWei | 66cd9d4495cd95145fe72de4cbcce5a0a76b716c | [
"MIT"
] | null | null | null | mainwin.py | 916958205/HanWei | 66cd9d4495cd95145fe72de4cbcce5a0a76b716c | [
"MIT"
] | null | null | null | mainwin.py | 916958205/HanWei | 66cd9d4495cd95145fe72de4cbcce5a0a76b716c | [
"MIT"
] | null | null | null | import sys
from PyQt5.QtWidgets import *
app = QApplication(sys.argv)
mywidget = QWidget()
mywidget.setGeometry(200,200,600,300)
mywidget.setWindowTitle("Hello PyQt5")
mywidget.show()
sys.exit(app.exec_())
| 21.6 | 39 | 0.740741 | 28 | 216 | 5.678571 | 0.678571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.125 | 216 | 9 | 40 | 24 | 0.767196 | 0 | 0 | 0 | 0 | 0 | 0.053398 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fea331d72af939d00cabb8cbf36afb17d3ad1ab1 | 787 | py | Python | rpc/RPyC/tutorials/services/registry_discovery/service01.py | 2581676612/python | b309564a05838b23044bb8112fd4ef71307266b6 | [
"MIT"
] | 112 | 2017-09-19T17:38:38.000Z | 2020-05-27T18:00:27.000Z | rpc/RPyC/tutorials/services/registry_discovery/service01.py | tomoncle/Python-notes | ce675486290c3d1c7c2e4890b57e3d0c8a1228cc | [
"MIT"
] | null | null | null | rpc/RPyC/tutorials/services/registry_discovery/service01.py | tomoncle/Python-notes | ce675486290c3d1c7c2e4890b57e3d0c8a1228cc | [
"MIT"
] | 56 | 2017-09-20T01:24:12.000Z | 2020-04-16T06:19:31.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 17-8-15 下午1:35
# @Author : Tom.Lee
# @CopyRight : 2016-2017 OpenBridge by yihecloud
# @File : service.py
# @Product : PyCharm
# @Docs :
# @Source :
import rpyc
from rpyc.utils.server import ThreadedServer
class MyService(rpyc.Service):
def on_connect(self):
pass
def on_disconnect(self):
pass
@classmethod
def exposed_get_answer(cls):
return 66
@classmethod
def get_question(cls):
return "what is the airspeed velocity of an unladen swallow?"
if __name__ == "__main__":
t = ThreadedServer(MyService, port=18861)
print """
service start ok! port {port}
""".format(port=18861)
t.start()
| 20.179487 | 69 | 0.590851 | 93 | 787 | 4.860215 | 0.741935 | 0.022124 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052252 | 0.29479 | 787 | 38 | 70 | 20.710526 | 0.762162 | 0.311309 | 0 | 0.210526 | 0 | 0 | 0.18609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.105263 | 0.105263 | null | null | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
fea6f09ab23638e1620e4cd3a84fd26dec806269 | 886 | py | Python | Python2/TestDrivenDevelopment/src/testadder.py | ceeblet/OST_PythonCertificationTrack | 042e0ce964bc88b3f4132dcbd7e06c5f504eae34 | [
"MIT"
] | null | null | null | Python2/TestDrivenDevelopment/src/testadder.py | ceeblet/OST_PythonCertificationTrack | 042e0ce964bc88b3f4132dcbd7e06c5f504eae34 | [
"MIT"
] | null | null | null | Python2/TestDrivenDevelopment/src/testadder.py | ceeblet/OST_PythonCertificationTrack | 042e0ce964bc88b3f4132dcbd7e06c5f504eae34 | [
"MIT"
] | null | null | null | """
Demonstrates the fundamentals of unit test.
adder() is a function that lets you 'add' integers, strings, and lists.
"""
from adder import adder # keep the tested code separate from the tests.
import unittest
class TestAdder(unittest.TestCase):
def test_numbers(self):
self.assertEqual(adder(3,4), 7, "3 + 4 should be 7")
def test_strings(self):
self.assertEqual(adder('x','y'), 'xy', "x + y should be xy")
def test_lists(self):
self.assertEqual(adder([1,2],[3,4]), [1,2,3,4], "[1,2] + [3,4] should be [1,2,3,4]")
def test_number_and_string(self):
self.assertEqual(adder(1, 'two'), '1two', "1 + two should be 1two")
def test_numbers_and_list(self):
self.assertEqual(adder(4,[1,2,3]), [1,2,3,4], "4 + [1,2,3] should be [1,2,3,4]")
if __name__ == "__main__":
unittest.main() | 34.076923 | 92 | 0.603837 | 140 | 886 | 3.7 | 0.364286 | 0.030888 | 0.046332 | 0.046332 | 0.164093 | 0.069498 | 0.023166 | 0.023166 | 0 | 0 | 0 | 0.061493 | 0.22912 | 886 | 26 | 93 | 34.076923 | 0.696925 | 0.182844 | 0 | 0 | 0 | 0.066667 | 0.195258 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | false | 0 | 0.133333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
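The tests above import an `adder` function from a separate module that is not included in this record. A minimal implementation that would satisfy all five assertions might look like the sketch below; the dispatch rules are inferred from the test cases, so treat it as one plausible reconstruction rather than the original code:

```python
def adder(x, y):
    """'Add' two values: numbers, strings, or lists, including mixed pairs."""
    # Same-type operands: the built-in + operator already does the right
    # thing for numbers, strings, and lists.
    if type(x) == type(y):
        return x + y
    # Number + string: coerce the number to a string first.
    if isinstance(x, (int, float)) and isinstance(y, str):
        return str(x) + y
    # Number + list: append the number to the end of the list.
    if isinstance(x, (int, float)) and isinstance(y, list):
        return y + [x]
    raise TypeError("unsupported operand types for adder")
```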
fead6377c7ce82420fb675cacf96d942b02e212a | 850 | py | Python | src/genie/libs/parser/iosxr/tests/ShowPceIPV4PeerPrefix/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 204 | 2018-06-27T00:55:27.000Z | 2022-03-06T21:12:18.000Z | src/genie/libs/parser/iosxr/tests/ShowPceIPV4PeerPrefix/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 468 | 2018-06-19T00:33:18.000Z | 2022-03-31T23:23:35.000Z | src/genie/libs/parser/iosxr/tests/ShowPceIPV4PeerPrefix/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 309 | 2019-01-16T20:21:07.000Z | 2022-03-30T12:56:41.000Z |
expected_output = {
'nodes': {
1: {
'te_router_id': '192.168.0.4',
'host_name': 'rtrD',
'isis_system_id': [
'1921.68ff.1004 level-1',
'1921.68ff.1004 level-2',
'1921.68ff.1004 level-2'],
'asn': [
65001,
65001,
65001],
'domain_id': [
1111,
1111,
9999],
'advertised_prefixes': [
'192.168.0.4',
'192.168.0.4',
'192.168.0.4',
'192.168.0.6']},
2: {
'te_router_id': '192.168.0.1',
'host_name': 'rtrA',
'isis_system_id': ['1921.68ff.1001 level-2'],
'advertised_prefixes': ['192.168.0.1']}}}
| 28.333333 | 73 | 0.358824 | 84 | 850 | 3.464286 | 0.369048 | 0.14433 | 0.168385 | 0.109966 | 0.632302 | 0.223368 | 0.106529 | 0.106529 | 0.106529 | 0.106529 | 0 | 0.294521 | 0.484706 | 850 | 29 | 74 | 29.310345 | 0.369863 | 0 | 0 | 0.259259 | 0 | 0 | 0.351415 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
feb86c300efea826ff83f4a7ca162a2abad92dad | 1,193 | py | Python | environment/cogment_verse_environment/utils/serialization_helpers.py | air-sara/cogment-verse | 0adbab0c21da3d189e411e89621d1d11e7fffe12 | [
"Apache-2.0"
] | null | null | null | environment/cogment_verse_environment/utils/serialization_helpers.py | air-sara/cogment-verse | 0adbab0c21da3d189e411e89621d1d11e7fffe12 | [
"Apache-2.0"
] | null | null | null | environment/cogment_verse_environment/utils/serialization_helpers.py | air-sara/cogment-verse | 0adbab0c21da3d189e411e89621d1d11e7fffe12 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 AI Redefined Inc. <dev+cogment@ai-r.com>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from data_pb2 import NDArray
import numpy as np
import cv2
def deserialize_np_array(nd_array):
return np.frombuffer(nd_array.data, dtype=nd_array.dtype).reshape(*nd_array.shape)
def serialize_np_array(np_array):
return NDArray(shape=np_array.shape, dtype=str(np_array.dtype), data=np_array.tobytes())
def deserialize_img(img_bytes):
return cv2.imdecode(np.frombuffer(img_bytes, dtype=np.uint8), cv2.IMREAD_COLOR)
def serialize_img(img):
# note rgb -> bgr for cv2
result, data = cv2.imencode(".jpg", img[:, :, ::-1])
assert result
return data.tobytes()
| 31.394737 | 92 | 0.746857 | 187 | 1,193 | 4.668449 | 0.545455 | 0.068729 | 0.029782 | 0.036655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.155071 | 1,193 | 37 | 93 | 32.243243 | 0.850198 | 0.502934 | 0 | 0 | 0 | 0 | 0.00692 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 1 | 0.307692 | false | 0 | 0.230769 | 0.230769 | 0.846154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
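The `NDArray` message used above comes from a generated `data_pb2` module that is not part of this file. The serialize/deserialize pattern itself only depends on the shape/dtype/data triple, so it can be sketched with a plain namedtuple standing in for the protobuf message (an assumption made here purely so the example runs on its own):

```python
from collections import namedtuple

import numpy as np

# Stand-in for the protobuf NDArray message: it only needs to carry
# shape, dtype, and the raw bytes of the array.
NDArray = namedtuple("NDArray", ["shape", "dtype", "data"])

def serialize_np_array(np_array):
    return NDArray(shape=np_array.shape, dtype=str(np_array.dtype), data=np_array.tobytes())

def deserialize_np_array(nd_array):
    return np.frombuffer(nd_array.data, dtype=nd_array.dtype).reshape(*nd_array.shape)

original = np.arange(6, dtype=np.float32).reshape(2, 3)
restored = deserialize_np_array(serialize_np_array(original))
```

Note that `np.frombuffer` returns a read-only view over the bytes; copy the result if it needs to be writable.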
22810781b4be08067014350be832d4706e14f0bd | 9,959 | py | Python | nova/tests/functional/api_sample_tests/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/functional/api_sample_tests/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/functional/api_sample_tests/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2012 Nebula, Inc.'
nl|'\n'
comment|'# Copyright 2013 IBM Corp.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'mock'
newline|'\n'
name|'from'
name|'six'
op|'.'
name|'moves'
name|'import'
name|'range'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'cells'
name|'import'
name|'state'
newline|'\n'
name|'import'
name|'nova'
op|'.'
name|'conf'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'db'
op|'.'
name|'sqlalchemy'
name|'import'
name|'models'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'functional'
op|'.'
name|'api_sample_tests'
name|'import'
name|'api_sample_base'
newline|'\n'
nl|'\n'
DECL|variable|CONF
name|'CONF'
op|'='
name|'nova'
op|'.'
name|'conf'
op|'.'
name|'CONF'
newline|'\n'
name|'CONF'
op|'.'
name|'import_opt'
op|'('
string|"'osapi_compute_extension'"
op|','
nl|'\n'
string|"'nova.api.openstack.compute.legacy_v2.extensions'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|CellsSampleJsonTest
name|'class'
name|'CellsSampleJsonTest'
op|'('
name|'api_sample_base'
op|'.'
name|'ApiSampleTestBaseV21'
op|')'
op|':'
newline|'\n'
DECL|variable|extension_name
indent|' '
name|'extension_name'
op|'='
string|'"os-cells"'
newline|'\n'
nl|'\n'
DECL|member|_get_flags
name|'def'
name|'_get_flags'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'f'
op|'='
name|'super'
op|'('
name|'CellsSampleJsonTest'
op|','
name|'self'
op|')'
op|'.'
name|'_get_flags'
op|'('
op|')'
newline|'\n'
name|'f'
op|'['
string|"'osapi_compute_extension'"
op|']'
op|'='
name|'CONF'
op|'.'
name|'osapi_compute_extension'
op|'['
op|':'
op|']'
newline|'\n'
name|'f'
op|'['
string|"'osapi_compute_extension'"
op|']'
op|'.'
name|'append'
op|'('
nl|'\n'
string|"'nova.api.openstack.compute.contrib.cells.Cells'"
op|')'
newline|'\n'
name|'f'
op|'['
string|"'osapi_compute_extension'"
op|']'
op|'.'
name|'append'
op|'('
string|"'nova.api.openstack.compute.'"
nl|'\n'
string|"'contrib.cell_capacities.Cell_capacities'"
op|')'
newline|'\n'
name|'return'
name|'f'
newline|'\n'
nl|'\n'
DECL|member|setUp
dedent|''
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# db_check_interval < 0 makes cells manager always hit the DB'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'flags'
op|'('
name|'enable'
op|'='
name|'True'
op|','
name|'db_check_interval'
op|'='
op|'-'
number|'1'
op|','
name|'group'
op|'='
string|"'cells'"
op|')'
newline|'\n'
name|'super'
op|'('
name|'CellsSampleJsonTest'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'cells'
op|'='
name|'self'
op|'.'
name|'start_service'
op|'('
string|"'cells'"
op|','
name|'manager'
op|'='
name|'CONF'
op|'.'
name|'cells'
op|'.'
name|'manager'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_stub_cells'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_stub_cells
dedent|''
name|'def'
name|'_stub_cells'
op|'('
name|'self'
op|','
name|'num_cells'
op|'='
number|'5'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'cell_list'
op|'='
op|'['
op|']'
newline|'\n'
name|'self'
op|'.'
name|'cells_next_id'
op|'='
number|'1'
newline|'\n'
nl|'\n'
DECL|function|_fake_cell_get_all
name|'def'
name|'_fake_cell_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'self'
op|'.'
name|'cell_list'
newline|'\n'
nl|'\n'
DECL|function|_fake_cell_get
dedent|''
name|'def'
name|'_fake_cell_get'
op|'('
name|'inst'
op|','
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'cell'
name|'in'
name|'self'
op|'.'
name|'cell_list'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'cell'
op|'['
string|"'name'"
op|']'
op|'=='
name|'cell_name'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'cell'
newline|'\n'
dedent|''
dedent|''
name|'raise'
name|'exception'
op|'.'
name|'CellNotFound'
op|'('
name|'cell_name'
op|'='
name|'cell_name'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'x'
name|'in'
name|'range'
op|'('
name|'num_cells'
op|')'
op|':'
newline|'\n'
indent|' '
name|'cell'
op|'='
name|'models'
op|'.'
name|'Cell'
op|'('
op|')'
newline|'\n'
name|'our_id'
op|'='
name|'self'
op|'.'
name|'cells_next_id'
newline|'\n'
name|'self'
op|'.'
name|'cells_next_id'
op|'+='
number|'1'
newline|'\n'
name|'cell'
op|'.'
name|'update'
op|'('
op|'{'
string|"'id'"
op|':'
name|'our_id'
op|','
nl|'\n'
string|"'name'"
op|':'
string|"'cell%s'"
op|'%'
name|'our_id'
op|','
nl|'\n'
string|"'transport_url'"
op|':'
string|"'rabbit://username%s@/'"
op|'%'
name|'our_id'
op|','
nl|'\n'
string|"'is_parent'"
op|':'
name|'our_id'
op|'%'
number|'2'
op|'=='
number|'0'
op|'}'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'cell_list'
op|'.'
name|'append'
op|'('
name|'cell'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'stub_out'
op|'('
string|"'nova.db.cell_get_all'"
op|','
name|'_fake_cell_get_all'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'stub_out'
op|'('
string|"'nova.cells.rpcapi.CellsAPI.cell_get'"
op|','
name|'_fake_cell_get'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cells_empty_list
dedent|''
name|'def'
name|'test_cells_empty_list'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Override this'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'_stub_cells'
op|'('
name|'num_cells'
op|'='
number|'0'
op|')'
newline|'\n'
name|'response'
op|'='
name|'self'
op|'.'
name|'_do_get'
op|'('
string|"'os-cells'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_verify_response'
op|'('
string|"'cells-list-empty-resp'"
op|','
op|'{'
op|'}'
op|','
name|'response'
op|','
number|'200'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cells_list
dedent|''
name|'def'
name|'test_cells_list'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'response'
op|'='
name|'self'
op|'.'
name|'_do_get'
op|'('
string|"'os-cells'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_verify_response'
op|'('
string|"'cells-list-resp'"
op|','
op|'{'
op|'}'
op|','
name|'response'
op|','
number|'200'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cells_get
dedent|''
name|'def'
name|'test_cells_get'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'response'
op|'='
name|'self'
op|'.'
name|'_do_get'
op|'('
string|"'os-cells/cell3'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_verify_response'
op|'('
string|"'cells-get-resp'"
op|','
op|'{'
op|'}'
op|','
name|'response'
op|','
number|'200'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_cell_capacity
dedent|''
name|'def'
name|'test_get_cell_capacity'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_mock_cell_capacity'
op|'('
op|')'
newline|'\n'
name|'state_manager'
op|'='
name|'state'
op|'.'
name|'CellStateManager'
op|'('
op|')'
newline|'\n'
name|'my_state'
op|'='
name|'state_manager'
op|'.'
name|'get_my_state'
op|'('
op|')'
newline|'\n'
name|'response'
op|'='
name|'self'
op|'.'
name|'_do_get'
op|'('
string|"'os-cells/%s/capacities'"
op|'%'
nl|'\n'
name|'my_state'
op|'.'
name|'name'
op|')'
newline|'\n'
name|'return'
name|'self'
op|'.'
name|'_verify_response'
op|'('
string|"'cells-capacities-resp'"
op|','
nl|'\n'
op|'{'
op|'}'
op|','
name|'response'
op|','
number|'200'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_get_all_cells_capacity
dedent|''
name|'def'
name|'test_get_all_cells_capacity'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_mock_cell_capacity'
op|'('
op|')'
newline|'\n'
name|'response'
op|'='
name|'self'
op|'.'
name|'_do_get'
op|'('
string|"'os-cells/capacities'"
op|')'
newline|'\n'
name|'return'
name|'self'
op|'.'
name|'_verify_response'
op|'('
string|"'cells-capacities-resp'"
op|','
nl|'\n'
op|'{'
op|'}'
op|','
name|'response'
op|','
number|'200'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_mock_cell_capacity
dedent|''
name|'def'
name|'_mock_cell_capacity'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'response'
op|'='
op|'{'
string|'"ram_free"'
op|':'
nl|'\n'
op|'{'
string|'"units_by_mb"'
op|':'
op|'{'
string|'"8192"'
op|':'
number|'0'
op|','
string|'"512"'
op|':'
number|'13'
op|','
nl|'\n'
string|'"4096"'
op|':'
number|'1'
op|','
string|'"2048"'
op|':'
number|'3'
op|','
string|'"16384"'
op|':'
number|'0'
op|'}'
op|','
nl|'\n'
string|'"total_mb"'
op|':'
number|'7680'
op|'}'
op|','
nl|'\n'
string|'"disk_free"'
op|':'
nl|'\n'
op|'{'
string|'"units_by_mb"'
op|':'
op|'{'
string|'"81920"'
op|':'
number|'11'
op|','
string|'"20480"'
op|':'
number|'46'
op|','
nl|'\n'
string|'"40960"'
op|':'
number|'23'
op|','
string|'"163840"'
op|':'
number|'5'
op|','
string|'"0"'
op|':'
number|'0'
op|'}'
op|','
nl|'\n'
string|'"total_mb"'
op|':'
number|'1052672'
op|'}'
nl|'\n'
op|'}'
newline|'\n'
name|'goc_mock'
op|'='
name|'mock'
op|'.'
name|'Mock'
op|'('
op|')'
newline|'\n'
name|'goc_mock'
op|'.'
name|'return_value'
op|'='
name|'response'
newline|'\n'
name|'self'
op|'.'
name|'cells'
op|'.'
name|'manager'
op|'.'
name|'state_manager'
op|'.'
name|'get_our_capacities'
op|'='
name|'goc_mock'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 12.984355 | 88 | 0.606185 | 1,502 | 9,959 | 3.903462 | 0.133156 | 0.118711 | 0.081869 | 0.06686 | 0.651032 | 0.584854 | 0.499574 | 0.403718 | 0.347092 | 0.28876 | 0 | 0.01189 | 0.10483 | 9,959 | 766 | 89 | 13.001305 | 0.645766 | 0 | 0 | 0.861619 | 0 | 0 | 0.379255 | 0.055829 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.010444 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
22a0144eab6cc372bbe6baa3e1ee14cd188fb313 | 2,233 | py | Python | quantity/trader/api.py | wyjcpu/quantity | a53126a430f12b5bac81a52b2fe749cc497faf36 | [
"MIT"
] | null | null | null | quantity/trader/api.py | wyjcpu/quantity | a53126a430f12b5bac81a52b2fe749cc497faf36 | [
"MIT"
] | null | null | null | quantity/trader/api.py | wyjcpu/quantity | a53126a430f12b5bac81a52b2fe749cc497faf36 | [
"MIT"
] | 1 | 2021-05-11T09:33:59.000Z | 2021-05-11T09:33:59.000Z | # coding=utf-8
import logging
from .gftrader import GFTrader
from .httrader import HTTrader
from .joinquant_follower import JoinQuantFollower
from .ricequant_follower import RiceQuantFollower
from .log import log
from .xq_follower import XueQiuFollower
from .xqtrader import XueQiuTrader
from .yhtrader import YHTrader
from .yjbtrader import YJBTrader
def use(broker, debug=True, **kwargs):
"""用于生成特定的券商对象
:param broker:券商名支持 ['ht', 'HT', '华泰’] ['yjb', 'YJB', ’佣金宝'] ['yh', 'YH', '银河'] ['gf', 'GF', '广发']
:param debug: 控制 debug 日志的显示, 默认为 True
:param initial_assets: [雪球参数] 控制雪球初始资金,默认为一百万
:param remove_zero: [ht参数],是否移除 08 账户开头的 0, 默认 True
:return the class of trader
Usage::
>>> import easytrader
>>> user = easytrader.use('ht')
>>> user.prepare('ht.json')
"""
if not debug:
log.setLevel(logging.INFO)
if broker.lower() in ['ht', '华泰']:
return HTTrader(**kwargs)
if broker.lower() in ['yjb', '佣金宝']:
return YJBTrader()
if broker.lower() in ['yh', '银河']:
return YHTrader()
if broker.lower() in ['xq', '雪球']:
return XueQiuTrader(**kwargs)
if broker.lower() in ['gf', '广发']:
return GFTrader()
if broker.lower() in ['yh_client', '银河客户端']:
from .yh_clienttrader import YHClientTrader
return YHClientTrader()
def follower(platform, **kwargs):
"""用于生成特定的券商对象
:param platform:平台支持 ['jq', 'joinquant', '聚宽’]
:param initial_assets: [雪球参数] 控制雪球初始资金,默认为一万, 总资金由 initial_assets * 组合当前净值 得出
:param total_assets: [雪球参数] 控制雪球总资金,无默认值, 若设置则覆盖 initial_assets
:return the class of follower
Usage::
>>> import easytrader
>>> user = easytrader.use('xq')
>>> user.prepare('xq.json')
>>> jq = easytrader.follower('jq')
>>> jq.login(user='username', password='password')
>>> jq.follow(users=user, strategies=['strategies_link'])
"""
if platform.lower() in ['rq', 'ricequant', '米筐']:
return RiceQuantFollower()
if platform.lower() in ['jq', 'joinquant', '聚宽']:
return JoinQuantFollower()
if platform.lower() in ['xq', 'xueqiu', '雪球']:
return XueQiuFollower(**kwargs)
| 28.265823 | 102 | 0.621137 | 260 | 2,233 | 5.288462 | 0.369231 | 0.045818 | 0.056727 | 0.065455 | 0.154182 | 0.055273 | 0 | 0 | 0 | 0 | 0 | 0.002316 | 0.226601 | 2,233 | 78 | 103 | 28.628205 | 0.793862 | 0.39588 | 0 | 0 | 0 | 0 | 0.057692 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
22af578037baf99b23908b56812166ad8e98f173 | 483 | py | Python | portfolio/views.py | AthmanZiri/django-site | 04c6e0967b628b8ebef1ca1caae8cee83c1a2f07 | [
"MIT"
] | null | null | null | portfolio/views.py | AthmanZiri/django-site | 04c6e0967b628b8ebef1ca1caae8cee83c1a2f07 | [
"MIT"
] | null | null | null | portfolio/views.py | AthmanZiri/django-site | 04c6e0967b628b8ebef1ca1caae8cee83c1a2f07 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
from django.shortcuts import render
from django.http import Http404
from django.utils import timezone
from .models import Project
def project_list(request):
projects = Project.objects.all()
return render(request, 'portfolio/project_list.html', {'projects': projects})
def project_detail(request, pk):
    try:
        detail = Project.objects.get(pk=pk)
except Project.DoesNotExist:
raise Http404
return render(request, 'portfolio/portfolio_detail.html', {'detail': detail}) | 30.1875 | 78 | 0.79089 | 62 | 483 | 6 | 0.5 | 0.053763 | 0.102151 | 0.150538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006961 | 0.10766 | 483 | 16 | 79 | 30.1875 | 0.856148 | 0 | 0 | 0 | 0 | 0 | 0.14876 | 0.119835 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.307692 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
22afb6cae21fac551c61195ce7227d6545f3f74a | 751 | py | Python | gemynd/api/__init__.py | gemynd/gemynd | e2bca0dc48d19cb2aecf0921e2e5ae5be7a2e2f1 | [
"Apache-2.0"
] | null | null | null | gemynd/api/__init__.py | gemynd/gemynd | e2bca0dc48d19cb2aecf0921e2e5ae5be7a2e2f1 | [
"Apache-2.0"
] | null | null | null | gemynd/api/__init__.py | gemynd/gemynd | e2bca0dc48d19cb2aecf0921e2e5ae5be7a2e2f1 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
#
# A library that provides a Gemynd AI bot interface
# Copyright (C) 2016
# Gemynd AI Team <devs@gemynd.ai>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
from telegramapi.api import *
__author__ = 'devs@gemynd.ai'
__version__ = '0.1alpha'
| 41.722222 | 81 | 0.764314 | 119 | 751 | 4.756303 | 0.680672 | 0.106007 | 0.042403 | 0.056537 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015823 | 0.158455 | 751 | 17 | 82 | 44.176471 | 0.879747 | 0.850866 | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
22b1e39819ca2b61399739e865325c973feac3a4 | 797 | py | Python | src/main/util/strings_util.py | JetBrains-Research/codetracker-data | 6fb3900bd3cfe44d900d7fa8e89c7c35818424ed | [
"MIT"
] | 4 | 2020-05-12T10:29:15.000Z | 2020-08-21T04:26:12.000Z | src/main/util/strings_util.py | JetBrains-Research/task-tracker-post-processing | 6fb3900bd3cfe44d900d7fa8e89c7c35818424ed | [
"MIT"
] | 15 | 2020-02-05T12:18:00.000Z | 2020-05-08T09:22:50.000Z | src/main/util/strings_util.py | JetBrains-Research/task-tracker-post-processing | 6fb3900bd3cfe44d900d7fa8e89c7c35818424ed | [
"MIT"
] | 1 | 2020-06-21T06:50:11.000Z | 2020-06-21T06:50:11.000Z | # Copyright (c) 2020 Anastasiia Birillo, Elena Lyulina
import re
from typing import List
def add_symbol_to_begin(string: str, symbol: str) -> str:
if not string.startswith(symbol):
string = symbol + string
return string
def contains_any_of_substrings(string: str, substrings: List[str]) -> bool:
for substring in substrings:
if substring in string:
return True
return False
def convert_camel_case_to_snake_case(string: str) -> str:
words = re.findall(r'[A-Z]?[a-z]+|[A-Z]{1,}(?=[A-Z][a-z]|\d|\W|$|_)|\d+\.\d+|\d+', string)
return '_'.join(map(str.lower, words))
def crop_string(string: str, short_part_length: int, separator: str = '...') -> str:
return ''.join((string[:short_part_length], separator, string[-short_part_length:]))
| 29.518519 | 94 | 0.663739 | 116 | 797 | 4.387931 | 0.465517 | 0.019646 | 0.017682 | 0.023576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007634 | 0.178168 | 797 | 26 | 95 | 30.653846 | 0.769466 | 0.065245 | 0 | 0 | 0 | 0.0625 | 0.084791 | 0.079408 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.0625 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
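A quick check of how two of these helpers behave; the functions are restated verbatim from the module above so the snippet is self-contained:

```python
import re

def convert_camel_case_to_snake_case(string):
    # Split on camel-case word boundaries, keeping acronym runs and numbers.
    words = re.findall(r'[A-Z]?[a-z]+|[A-Z]{1,}(?=[A-Z][a-z]|\d|\W|$|_)|\d+\.\d+|\d+', string)
    return '_'.join(map(str.lower, words))

def crop_string(string, short_part_length, separator='...'):
    # Keep only the first and last short_part_length characters.
    return ''.join((string[:short_part_length], separator, string[-short_part_length:]))

snake = convert_camel_case_to_snake_case('HTTPServerLog2')   # 'http_server_log_2'
cropped = crop_string('abcdefghij', 3)                       # 'abc...hij'
```

The lookahead in the regex is what keeps an acronym like `HTTP` together instead of splitting it into single letters.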
22b613d117a69506c4577119481b571781def79e | 747 | py | Python | cli/scaleout/auth.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null | cli/scaleout/auth.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null | cli/scaleout/auth.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null |
import requests
from scaleout.errors import AuthenticationError
import json
def login(url, username, password):
""" Login to Studio services. """
def get_bearer_token(url, username, password):
""" Exchange username,password for an auth token.
TODO: extend to get creds from keyring. """
data = {
'username': username,
'password': password
}
r = requests.post(url, data=data)
if r.status_code == 200:
return json.loads(r.content)['token']
else:
print('Authentication failed!')
print("Requesting an authorization token failed.")
print('Returned status code: {}'.format(r.status_code))
print('Reason: {}'.format(r.reason))
raise AuthenticationError
| 26.678571 | 63 | 0.650602 | 86 | 747 | 5.604651 | 0.534884 | 0.13278 | 0.078838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005217 | 0.230254 | 747 | 27 | 64 | 27.666667 | 0.833043 | 0.149933 | 0 | 0 | 0 | 0 | 0.193126 | 0 | 0 | 0 | 0 | 0.037037 | 0 | 1 | 0.111111 | false | 0.166667 | 0.166667 | 0 | 0.333333 | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
22c4abe6dfa164916e4496ade8f018aeda813111 | 889 | py | Python | gtdb2td/Utils.py | dportik/gtdb_to_taxdump | 9f0988882a5017fa18147fd45fe5dddba7a59c6a | [
"MIT"
] | null | null | null | gtdb2td/Utils.py | dportik/gtdb_to_taxdump | 9f0988882a5017fa18147fd45fe5dddba7a59c6a | [
"MIT"
] | null | null | null | gtdb2td/Utils.py | dportik/gtdb_to_taxdump | 9f0988882a5017fa18147fd45fe5dddba7a59c6a | [
"MIT"
] | null | null | null | import os
import sys
import gzip
import bz2
import argparse
import logging
import csv
import urllib.request
import codecs
from collections import OrderedDict
def Decode(x):
"""
Decoding input, if needed
"""
try:
x = x.decode('utf-8')
except AttributeError:
pass
return x
def Open(infile, mode='rb'):
"""
    Opening of input, regardless of compression
"""
if infile.endswith('.bz2'):
return bz2.open(infile, mode)
elif infile.endswith('.gz'):
return gzip.open(infile, mode)
else:
return open(infile, mode)
def get_url_data(url):
"""
Downloading data from url; assuming gzip
"""
req = urllib.request.Request(url)
req.add_header('Accept-Encoding', 'gzip')
response = urllib.request.urlopen(req)
content = gzip.decompress(response.read())
return content.splitlines()
| 20.674419 | 48 | 0.646794 | 110 | 889 | 5.2 | 0.509091 | 0.06993 | 0.097902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005935 | 0.241845 | 889 | 42 | 49 | 21.166667 | 0.84273 | 0.124859 | 0 | 0 | 0 | 0 | 0.045082 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0.034483 | 0.344828 | 0 | 0.62069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
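The `Open` helper above dispatches on the file extension; a small round-trip through a gzip file shows the pattern. The function is restated so the snippet runs on its own (the temp-file path is created here just for the demonstration):

```python
import bz2
import gzip
import os
import tempfile

def Open(infile, mode='rb'):
    """Open a file, transparently handling .bz2 and .gz compression."""
    if infile.endswith('.bz2'):
        return bz2.open(infile, mode)
    elif infile.endswith('.gz'):
        return gzip.open(infile, mode)
    else:
        return open(infile, mode)

# Write a compressed file, then read it back through the dispatcher.
path = os.path.join(tempfile.mkdtemp(), 'example.txt.gz')
with gzip.open(path, 'wb') as fh:
    fh.write(b'hello\tworld\n')
with Open(path) as fh:
    line = fh.readline()
```

Because the default mode is `'rb'`, callers that expect text should pair this with a helper like `Decode` above.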
22d279edff31ac868b1a2504c30b87127cb52517 | 1,576 | py | Python | examples/plugin/custom_matcher.py | ymoch/preacher | ae68170d14c72791884e91b20054bd13a79b52d0 | [
"MIT"
] | 3 | 2019-08-01T03:14:49.000Z | 2020-01-31T08:55:22.000Z | examples/plugin/custom_matcher.py | ymoch/preacher | ae68170d14c72791884e91b20054bd13a79b52d0 | [
"MIT"
] | 353 | 2019-04-14T14:53:28.000Z | 2022-03-11T03:26:08.000Z | examples/plugin/custom_matcher.py | ymoch/preacher | ae68170d14c72791884e91b20054bd13a79b52d0 | [
"MIT"
] | 1 | 2020-08-01T06:23:08.000Z | 2020-08-01T06:23:08.000Z | """
A custom matcher plugin example.
"""
from hamcrest.core.base_matcher import BaseMatcher
from hamcrest.core.description import Description
from preacher.compilation.verification import MatcherFactoryCompiler
from preacher.plugin import hookimpl
class IsEven(BaseMatcher[int]):
"""
A custom matcher example
according to: https://pyhamcrest.readthedocs.io/en/stable/custom_matchers/
"""
def _matches(self, item: int) -> bool:
if not isinstance(item, int):
return False
return item % 2 == 0
def describe_to(self, description: Description) -> None:
description.append_text("even")
class IsMultipleOf(BaseMatcher[int]):
"""
Another custom matcher example, which takes a value.
"""
def __init__(self, base: int):
self.base = base
def _matches(self, item: int) -> bool:
if not isinstance(item, int):
return False
return item % self.base == 0
def describe_to(self, description: Description) -> None:
description.append_text("multiple of ").append_description_of(self.base)
@hookimpl
def preacher_add_matchers(compiler: MatcherFactoryCompiler) -> None:
"""
An example of hook to add matchers. This hook requires:
- decoration of `preacher.plugin.hookimpl`
- named `preacher_add_matchers`
"""
# Add a static matcher.
compiler.add_static(("be_even", "is_even", "even"), IsEven())
# Add a matcher taking a value.
compiler.add_taking_value(("be_multiple_of", "is_multiple_of", "multiple_of"), IsMultipleOf)
# === tests/unit/test_mask_image_read.py (repo: sqoshi/mask-imposer, license: MIT) ===
# from logging import getLogger
# from unittest import TestCase
#
# from mask_imposer.imposer.mask_imposer import Imposer
#
#
# class MaskImageReadTestCase(TestCase):
# @classmethod
# def setUp(cls) -> None:
# cls.imposer = Imposer({}, getLogger("test"))
#
# # def test_should_read_mask_image(self):
# # self.imposer._read_mask_image()
| 26 | 55 | 0.68956 | 41 | 364 | 5.902439 | 0.487805 | 0.090909 | 0.107438 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.18956 | 364 | 13 | 56 | 28 | 0.820339 | 0.92033 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
22f50c09db9261f4a22e896afac164a323afcea3 | 605 | py | Python | api/views/search.py | tavoxr/drf-ecommerce-api | 61c6b84dc4e811987ca72c73dfdce08f99d7481d | [
"MIT"
] | null | null | null | api/views/search.py | tavoxr/drf-ecommerce-api | 61c6b84dc4e811987ca72c73dfdce08f99d7481d | [
"MIT"
] | null | null | null | api/views/search.py | tavoxr/drf-ecommerce-api | 61c6b84dc4e811987ca72c73dfdce08f99d7481d | [
"MIT"
] | null | null | null | from django.db.models import Q
from rest_framework.decorators import api_view
from rest_framework.response import Response
from ..models import Product
from ..serializers import ProductSerializer
@api_view(['POST'])
def search(request):
query = request.data.get('query','')
print('request.data', request.data)
if query:
products = Product.objects.filter(Q(name__icontains = query) | Q(description__icontains = query))
serializer = ProductSerializer(products, many = True)
return Response(serializer.data)
else:
        return Response({"products": []})
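The `Q(...) | Q(...)` filter above ORs two case-insensitive substring matches. A plain-Python sketch of the same selection over in-memory dicts — illustrative only; the field names simply mirror the `Product` model the view assumes:

```python
# Case-insensitive "name OR description contains query" selection,
# mirroring Q(name__icontains=query) | Q(description__icontains=query).
def search_products(products, query):
    q = query.lower()
    return [
        p for p in products
        if q in p["name"].lower() or q in p["description"].lower()
    ]
```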
# === other/demo_logic.py (repo: vpv11110000/pyss, license: MIT) ===
# #!/usr/bin/python
# -*- coding: utf-8 -*-

"""
Run:

python ./demo_logic.py
"""

# pylint: disable=line-too-long,missing-docstring,bad-whitespace

import sys
import os
import random
import math

from pyss.logic_object import LogicObject

DIRNAME_MODULE = os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(os.path.realpath(sys.argv[0]))))) + os.sep
sys.path.append(DIRNAME_MODULE)
sys.path.append(DIRNAME_MODULE + "pyss" + os.sep)

from pyss import pyssobject
from pyss.pyss_model import PyssModel
from pyss.segment import Segment
from pyss import generate
from pyss.generate import Generate
from pyss.terminate import Terminate
from pyss import logger
from pyss.table import Table
from pyss.assemble import Assemble
from pyss.qtable import Qtable
from pyss.handle import Handle
from pyss.enter import Enter
from pyss.leave import Leave
from pyss.storage import Storage
from pyss.advance import Advance
from pyss.assign import Assign
from pyss.preempt import Preempt
from pyss.g_return import GReturn
from pyss.facility import Facility
from pyss.seize import Seize
from pyss.release import Release
from pyss.transfer import Transfer
from pyss.tabulate import Tabulate
from pyss.test import Test
from pyss.queue import Queue
from pyss.depart import Depart
from pyss.split import Split
from pyss.bprint import Bprint
from pyss.gate import Gate
from pyss.pyss_const import *
from pyss.func_discrete import FuncDiscrete
from pyss.func_exponential import Exponential
from pyss.func_normal import Normal
from pyss.plot_func import PlotFunc
from pyss import logic
from pyss.logic import Logic
from pyss.simpleobject import SimpleObject


def buildModel():
    logger.info("-------------------------------------")
    ### MODEL ----------------------------------
    m = PyssModel()
    sgm = Segment(m)
    #
    m[OPTIONS].setAllFalse()
    m[OPTIONS].printResult = True
    # ---------
    LK1 = "LK1"
    LK2 = "LK2"
    LogicObject(m, logicObjectName=LK1, initialState=False)
    LogicObject(m, logicObjectName=LK2, initialState=True)
    # -----------------------------
    Generate(sgm,
             med_value=1,
             max_amount=100,
             priority=1)
    Logic(sgm, actionFunc=logic.invert, logicObjectName=LK1)
    Logic(sgm, actionFunc=logic.invert, logicObjectName=LK2)
    Terminate(sgm, deltaTerminate=1)
    return m


def main():
    logger.info("-------------------------------------")
    # -------------------------------
    # Simulation time
    MAX_TIME = 10
    ### MODEL ----------------------------------
    m = buildModel()
    ### PLOTS ----------------------
    # tables
    # m.initPlotTable()
    # m.initPlotQueueLifeLine()
    m.initPlotLogicObjectLifeLine()
    # RUN --------------------------
    m.start(terminationCount=100, maxTime=MAX_TIME)
    # SHOW PLOTS ------------------------
    # m.getPlotSubsystem().plotByModules()
    # m.getPlotSubsystem().show()
    m.plotByModulesAndSave("demo_logic")
    m.plotByModulesAndShow()


if __name__ == '__main__':
    main()
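The model above inverts two logic switches once per generated transaction. A plain-Python sketch of that `LogicObject` + `logic.invert` behavior — `LogicSwitch` is a simplified stand-in, not the pyss API:

```python
# A GPSS-style logic switch: each transaction passing a Logic block with
# an invert action flips the state; the recorded history is what the
# life-line plot visualizes. Simplified stand-in for pyss's LogicObject.
class LogicSwitch:
    def __init__(self, name, initial_state=False):
        self.name = name
        self.state = initial_state
        self.history = [initial_state]

    def invert(self):
        self.state = not self.state
        self.history.append(self.state)
        return self.state


lk1 = LogicSwitch("LK1", initial_state=False)
lk2 = LogicSwitch("LK2", initial_state=True)
for _ in range(3):  # three transactions pass the Logic blocks
    lk1.invert()
    lk2.invert()
```

Starting the two switches in opposite states, as the demo does, keeps their life lines in antiphase.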
# === blueprints/virtual/bin-base/bin-base.py (repo: juergenhoetzel/craft, license: BSD-2-Clause) ===
import info


class subinfo(info.infoclass):
    def setTargets(self):
        self.targets['0.2'] = ""
        self.defaultTarget = '0.2'
        self.description = "deprecated: use virtual/base instead"

    def setDependencies(self):
        self.runtimeDependencies["virtual/base"] = None


from Package.VirtualPackageBase import *


class Package(VirtualPackageBase):
    def __init__(self):
        VirtualPackageBase.__init__(self)
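This blueprint is an empty alias: it installs nothing itself and only declares a runtime dependency on `virtual/base`. A small sketch of that deprecated-alias pattern — the registry and `resolve` helper are hypothetical, not Craft's real dependency resolver:

```python
# Deprecated-alias pattern: the alias carries no payload and only
# declares a runtime dependency on its replacement package.
REGISTRY = {
    "virtual/base": [],
    "virtual/bin-base": ["virtual/base"],  # deprecated alias
}


def resolve(name, registry):
    """Flatten a package name into the set of packages it depends on."""
    deps = set()
    for dep in registry[name]:
        deps.add(dep)
        deps |= resolve(dep, registry)
    return deps
```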
# === pysnmp-with-texts/DLINK-3100-DHCP-MIB.py (repo: agustinhenze/mibs.snmplabs.com, license: Apache-2.0) ===
#
# PySNMP MIB module DLINK-3100-DHCP-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/DLINK-3100-DHCP-MIB
# Produced by pysmi-0.3.4 at Wed May 1 12:48:12 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, Integer, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "OctetString", "Integer", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, ValueSizeConstraint, ValueRangeConstraint, ConstraintsUnion, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsUnion", "SingleValueConstraint")
rnd, = mibBuilder.importSymbols("DLINK-3100-MIB", "rnd")
PortList, = mibBuilder.importSymbols("Q-BRIDGE-MIB", "PortList")
SnmpAdminString, = mibBuilder.importSymbols("SNMP-FRAMEWORK-MIB", "SnmpAdminString")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
TimeTicks, Gauge32, Counter32, Unsigned32, IpAddress, NotificationType, Counter64, ObjectIdentity, iso, MibIdentifier, MibScalar, MibTable, MibTableRow, MibTableColumn, ModuleIdentity, Bits, Integer32 = mibBuilder.importSymbols("SNMPv2-SMI", "TimeTicks", "Gauge32", "Counter32", "Unsigned32", "IpAddress", "NotificationType", "Counter64", "ObjectIdentity", "iso", "MibIdentifier", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ModuleIdentity", "Bits", "Integer32")
TruthValue, DisplayString, RowStatus, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "TruthValue", "DisplayString", "RowStatus", "TextualConvention")
rsDHCP = ModuleIdentity((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38))
rsDHCP.setRevisions(('2003-10-18 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: rsDHCP.setRevisionsDescriptions(('Initial version of this MIB.',))
if mibBuilder.loadTexts: rsDHCP.setLastUpdated('200310180000Z')
if mibBuilder.loadTexts: rsDHCP.setOrganization('Dlink, Inc.')
if mibBuilder.loadTexts: rsDHCP.setContactInfo('www.dlink.com')
if mibBuilder.loadTexts: rsDHCP.setDescription('The private MIB module definition for DHCP server support in DLINK-3100 devices.')
rsDhcpMibVersion = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 14), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rsDhcpMibVersion.setStatus('current')
if mibBuilder.loadTexts: rsDhcpMibVersion.setDescription("DHCP MIB's version, the current version is 4.")
rlDhcpRelayEnable = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 25), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayEnable.setDescription('Enable or disable the use of the DHCP relay.')
rlDhcpRelayExists = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 26), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpRelayExists.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayExists.setDescription('This variable shows whether the device can function as a DHCP Relay Agent.')
rlDhcpRelayNextServerTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 27), )
if mibBuilder.loadTexts: rlDhcpRelayNextServerTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerTable.setDescription('The DHCP Relay Next Servers configuration Table')
rlDhcpRelayNextServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 27, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpRelayNextServerIpAddr"))
if mibBuilder.loadTexts: rlDhcpRelayNextServerEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerEntry.setDescription('The row definition for this table. DHCP requests are relayed to the specified next server according to their threshold values.')
rlDhcpRelayNextServerIpAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 27, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpRelayNextServerIpAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerIpAddr.setDescription('The IPAddress of the next configuration server. DHCP Server may act as a DHCP relay if this parameter is not equal to 0.0.0.0.')
rlDhcpRelayNextServerSecThreshold = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 27, 1, 2), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayNextServerSecThreshold.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerSecThreshold.setDescription('DHCP requests are relayed only if their SEC field is greater or equal to the threshold value in order to allow local DHCP Servers to answer first.')
rlDhcpRelayNextServerRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 27, 1, 3), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayNextServerRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerRowStatus.setDescription("This variable displays the validity or invalidity of the entry. Setting it to 'destroy' has the effect of rendering it inoperative. The internal effect (row removal) is implementation dependent.")
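The secs-threshold rule described for `rlDhcpRelayNextServerSecThreshold` can be sketched in plain Python — a hypothetical simplified model of the table's semantics, not the device's actual relay code:

```python
# Relay a DHCP request only to next servers whose configured secs
# threshold has been reached, so local DHCP servers get a chance to
# answer first. Hypothetical simplified model of the table above.
def servers_to_relay(request_secs, thresholds):
    """thresholds maps next-server IP -> minimum secs before relaying."""
    return [ip for ip, thr in sorted(thresholds.items()) if request_secs >= thr]
```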
rlDhcpRelayInterfaceTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 28), )
if mibBuilder.loadTexts: rlDhcpRelayInterfaceTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceTable.setDescription('The enabled DHCP Relay Interface Table')
rlDhcpRelayInterfaceEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 28, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpRelayInterfaceIfindex"))
if mibBuilder.loadTexts: rlDhcpRelayInterfaceEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceEntry.setDescription('The row definition for this table. The user can add entry when DHCP relay is enabled on Interface.')
rlDhcpRelayInterfaceIfindex = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 28, 1, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceIfindex.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceIfindex.setDescription('The Interface on which an user enables a DHCP relay ')
rlDhcpRelayInterfaceUseGiaddr = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 28, 1, 2), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceUseGiaddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceUseGiaddr.setDescription('The flag is used to set a DHCP relay interface to use GiAddr in the standard way. Default is TRUE. The field is not supported.')
rlDhcpRelayInterfaceRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 28, 1, 3), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceRowStatus.setDescription('Entry status. Can be destroy, active or createAndGo')
rlDhcpRelayInterfaceListTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29), )
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListTable.setDescription('This table contains one entry. The entry contains port list and vlan lists of interfaces that have configured DHCP relay')
rlDhcpRelayInterfaceListEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpRelayInterfaceListIndex"))
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListEntry.setDescription(' The entry contains port list and vlan lists of interfaces that have configured DHCP relay.')
rlDhcpRelayInterfaceListIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListIndex.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListIndex.setDescription('Index in the table. Already 1.')
rlDhcpRelayInterfaceListPortList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 2), PortList()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListPortList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListPortList.setDescription('DHCP relay Interface Port List.')
rlDhcpRelayInterfaceListVlanId1To1024 = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 3), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId1To1024.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId1To1024.setDescription(' DHCP relay Interface VlanId List 1.')
rlDhcpRelayInterfaceListVlanId1025To2048 = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 4), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId1025To2048.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId1025To2048.setDescription(' DHCP relay Interface VlanId List 2.')
rlDhcpRelayInterfaceListVlanId2049To3072 = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 5), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId2049To3072.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId2049To3072.setDescription(' DHCP relay Interface VlanId List 3.')
rlDhcpRelayInterfaceListVlanId3073To4094 = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 29, 1, 6), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId3073To4094.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayInterfaceListVlanId3073To4094.setDescription(' DHCP relay Interface VlanId List 4.')
rlDhcpSrvEnable = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 30), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvEnable.setDescription('Enable or Disable the use of the DHCP Server. For a router product the default value is TRUE. For a switch product the default is FALSE.')
rlDhcpSrvExists = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 31), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvExists.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvExists.setDescription('This variable shows whether the device can function as a DHCP Server.')
rlDhcpSrvDbLocation = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 32), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("nvram", 1), ("flash", 2))).clone('flash')).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDbLocation.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbLocation.setDescription('Describes where DHCP Server database is stored.')
rlDhcpSrvMaxNumOfClients = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 33), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvMaxNumOfClients.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvMaxNumOfClients.setDescription('This variable shows maximum number of clients that can be supported by DHCP Server dynamic allocation.')
rlDhcpSrvDbNumOfActiveEntries = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 34), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDbNumOfActiveEntries.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbNumOfActiveEntries.setDescription('This variable shows number of active entries stored in database.')
rlDhcpSrvDbErase = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 35), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDbErase.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbErase.setDescription('The value is always false. Setting this variable to true causes erasing all entries in DHCP database.')
rlDhcpSrvProbeEnable = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 36), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeEnable.setDescription('Enable or Disable the use of the DHCP probe before allocating an IP address.')
rlDhcpSrvProbeTimeout = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 37), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(300, 10000)).clone(500)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeTimeout.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeTimeout.setDescription('Indicates the period of time in milliseconds the DHCP probe will wait before issuing a new trial or deciding that no other device on the network has the IP address which DHCP considers allocating.')
rlDhcpSrvProbeRetries = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 38), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 10)).clone(2)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeRetries.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeRetries.setDescription('Indicates how many times DHCP will probe before deciding that no other device on the network has the IP address which DHCP considers allocating.')
rlDhcpSrvDefaultNetworkPoolName = MibScalar((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 39), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDefaultNetworkPoolName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDefaultNetworkPoolName.setDescription('Contains a default network pool name. Used in case of one network pool only.')
rlDhcpSrvIpAddrTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45), )
if mibBuilder.loadTexts: rlDhcpSrvIpAddrTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrTable.setDescription('Table of IP Addresses allocated by DHCP Server by static and dynamic allocations.')
rlDhcpSrvIpAddrEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpSrvIpAddrIpAddr"))
if mibBuilder.loadTexts: rlDhcpSrvIpAddrEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrEntry.setDescription('The row definition for this table. Parameters of DHCP allocated IP Addresses table.')
rlDhcpSrvIpAddrIpAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 1), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpAddr.setDescription('The IP address that was allocated by DHCP Server.')
rlDhcpSrvIpAddrIpNetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpNetMask.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpNetMask.setDescription('The subnet mask associated with the IP address of this entry. The value of the mask is an IP address with all the network bits set to 1 and all the hosts bits set to 0.')
rlDhcpSrvIpAddrIdentifier = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 3), OctetString().subtype(subtypeSpec=ValueSizeConstraint(2, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifier.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifier.setDescription('Unique Identifier for client. Either physical address or DHCP Client Identifier.')
rlDhcpSrvIpAddrIdentifierType = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("physAddr", 1), ("clientId", 2))).clone('clientId')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifierType.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifierType.setDescription('Identifier Type. Either physical address or DHCP Client Identifier.')
rlDhcpSrvIpAddrClnHostName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 5), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrClnHostName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrClnHostName.setDescription('Client Host Name. DHCP Server will use it to update DNS Server. Must be unique per client.')
rlDhcpSrvIpAddrMechanism = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("manual", 1), ("automatic", 2), ("dynamic", 3))).clone('manual')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrMechanism.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrMechanism.setDescription('Mechanism of allocation IP Address by DHCP Server. The only value that can be set by user is manual.')
rlDhcpSrvIpAddrAgeTime = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrAgeTime.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrAgeTime.setDescription('Age time of the IP Address.')
rlDhcpSrvIpAddrPoolName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 8), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrPoolName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrPoolName.setDescription('Ip address pool name. A unique name for host pool static allocation, or network pool name in case of dynamic allocation.')
rlDhcpSrvIpAddrConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrConfParamsName.setDescription('This variable points (serves as key) to appropriate set of parameters in the DHCP Server configuration parameters table.')
rlDhcpSrvIpAddrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 45, 1, 10), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
rlDhcpSrvDynamicTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46), )
if mibBuilder.loadTexts: rlDhcpSrvDynamicTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicTable.setDescription("The DHCP Dynamic Server's configuration Table")
rlDhcpSrvDynamicEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpSrvDynamicPoolName"))
if mibBuilder.loadTexts: rlDhcpSrvDynamicEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicEntry.setDescription('The row definition for this table. Parameters sent in as a DHCP Reply to DHCP Request with specified indices')
rlDhcpSrvDynamicPoolName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicPoolName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicPoolName.setDescription('The name of DHCP dynamic addresses pool.')
rlDhcpSrvDynamicIpAddrFrom = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrFrom.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrFrom.setDescription('The first IP address allocated in this row.')
rlDhcpSrvDynamicIpAddrTo = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 3), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrTo.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrTo.setDescription('The last IP address allocated in this row.')
rlDhcpSrvDynamicIpNetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 4), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpNetMask.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpNetMask.setDescription('The subnet mask associated with the IP addresses of this entry. The value of the mask is an IP address with all the network bits set to 1 and all the hosts bits set to 0.')
rlDhcpSrvDynamicLeaseTime = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 5), Unsigned32().clone(86400)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicLeaseTime.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicLeaseTime.setDescription('Maximum lease-time in seconds granted to a requesting DHCP client. For automatic allocation use 0xFFFFFFFF. To exclude addresses from allocation mechanism, set this value to 0.')
rlDhcpSrvDynamicProbeEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 6), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicProbeEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicProbeEnable.setDescription('Enable or Disable the use of the DHCP probe before allocating the address.')
rlDhcpSrvDynamicTotalNumOfAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDynamicTotalNumOfAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicTotalNumOfAddr.setDescription('Total number of addresses in space.')
rlDhcpSrvDynamicFreeNumOfAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 8), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDynamicFreeNumOfAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicFreeNumOfAddr.setDescription('Free number of addresses in space.')
rlDhcpSrvDynamicConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicConfParamsName.setDescription('This variable points (serves as key) to appropriate set of parameters in the DHCP Server configuration parameters table.')
rlDhcpSrvDynamicRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 46, 1, 10), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
rlDhcpSrvConfParamsTable = MibTable((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47), )
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTable.setDescription('The DHCP Configuration Parameters Table')
rlDhcpSrvConfParamsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1), ).setIndexNames((0, "DLINK-3100-DHCP-MIB", "rlDhcpSrvConfParamsName"))
if mibBuilder.loadTexts: rlDhcpSrvConfParamsEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsEntry.setDescription('The row definition for this table. Each entry corresponds to one specific parameters set.')
rlDhcpSrvConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsName.setDescription('This value is a unique index for the entry in the rlDhcpSrvConfParamsTable.')
rlDhcpSrvConfParamsNextServerIp = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerIp.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerIp.setDescription('The IP of next server for client to use in configuration process.')
rlDhcpSrvConfParamsNextServerName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerName.setDescription('The name of next server for client to use in configuration process.')
rlDhcpSrvConfParamsBootfileName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsBootfileName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsBootfileName.setDescription('Name of file for client to request from next server.')
rlDhcpSrvConfParamsRoutersList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRoutersList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRoutersList.setDescription("The value of option code 3, which defines default routers list. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsTimeSrvList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 6), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTimeSrvList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTimeSrvList.setDescription("The value of option code 4, which defines time servers list. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsDnsList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDnsList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDnsList.setDescription("The value of option code 6, which defines the list of DNSs. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsDomainName = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 8), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDomainName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDomainName.setDescription('The value of option code 15, which defines the domain name.')
rlDhcpSrvConfParamsNetbiosNameList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNameList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNameList.setDescription("The value of option code 44, which defines the list of NETBios Name Servers. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsNetbiosNodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 4, 8))).clone(namedValues=NamedValues(("b-node", 1), ("p-node", 2), ("m-node", 4), ("h-node", 8))).clone('h-node')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNodeType.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNodeType.setDescription('The value of option code 46, which defines the NETBios node type. The option will be added only if rlDhcpSrvConfParamsNetbiosNameList is not empty.')
rlDhcpSrvConfParamsCommunity = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 11), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32)).clone('public')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsCommunity.setStatus('obsolete')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsCommunity.setDescription('The value of site-specific option 128, which defines Community. The option will be added only if rlDhcpSrvConfParamsNmsIp is set.')
rlDhcpSrvConfParamsNmsIp = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 12), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNmsIp.setStatus('obsolete')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNmsIp.setDescription('The value of site-specific option 129, which defines IP of Network Manager.')
rlDhcpSrvConfParamsOptionsList = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 13), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsOptionsList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsOptionsList.setDescription("The sequence of option segments. Each option segment is represented by a triplet <code/length/value>. The code defines the code of each supported option. The length defines the length of each supported option. The value defines the value of the supported option. If there is a number of elements in the value field, they are divided by ','. Each element of type IP address in value field is represented in dotted decimal notation format.")
rlDhcpSrvConfParamsRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 171, 10, 94, 89, 89, 38, 47, 1, 14), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
mibBuilder.exportSymbols("DLINK-3100-DHCP-MIB", PYSNMP_MODULE_ID=rsDHCP, rlDhcpRelayExists=rlDhcpRelayExists, rlDhcpSrvIpAddrMechanism=rlDhcpSrvIpAddrMechanism, rlDhcpSrvConfParamsDnsList=rlDhcpSrvConfParamsDnsList, rlDhcpSrvConfParamsOptionsList=rlDhcpSrvConfParamsOptionsList, rlDhcpSrvConfParamsDomainName=rlDhcpSrvConfParamsDomainName, rlDhcpSrvDynamicFreeNumOfAddr=rlDhcpSrvDynamicFreeNumOfAddr, rlDhcpSrvConfParamsNextServerIp=rlDhcpSrvConfParamsNextServerIp, rlDhcpSrvConfParamsNextServerName=rlDhcpSrvConfParamsNextServerName, rlDhcpRelayInterfaceListTable=rlDhcpRelayInterfaceListTable, rlDhcpRelayInterfaceUseGiaddr=rlDhcpRelayInterfaceUseGiaddr, rlDhcpSrvDynamicIpAddrFrom=rlDhcpSrvDynamicIpAddrFrom, rlDhcpSrvConfParamsTable=rlDhcpSrvConfParamsTable, rlDhcpSrvConfParamsNetbiosNodeType=rlDhcpSrvConfParamsNetbiosNodeType, rlDhcpSrvDynamicConfParamsName=rlDhcpSrvDynamicConfParamsName, rlDhcpRelayInterfaceRowStatus=rlDhcpRelayInterfaceRowStatus, rlDhcpRelayInterfaceListIndex=rlDhcpRelayInterfaceListIndex, rlDhcpSrvMaxNumOfClients=rlDhcpSrvMaxNumOfClients, rlDhcpSrvExists=rlDhcpSrvExists, rlDhcpSrvConfParamsCommunity=rlDhcpSrvConfParamsCommunity, rlDhcpSrvConfParamsNetbiosNameList=rlDhcpSrvConfParamsNetbiosNameList, rlDhcpSrvIpAddrClnHostName=rlDhcpSrvIpAddrClnHostName, rlDhcpSrvConfParamsRowStatus=rlDhcpSrvConfParamsRowStatus, rlDhcpSrvIpAddrPoolName=rlDhcpSrvIpAddrPoolName, rlDhcpSrvConfParamsName=rlDhcpSrvConfParamsName, rlDhcpSrvProbeTimeout=rlDhcpSrvProbeTimeout, rlDhcpSrvDefaultNetworkPoolName=rlDhcpSrvDefaultNetworkPoolName, rlDhcpSrvConfParamsTimeSrvList=rlDhcpSrvConfParamsTimeSrvList, rlDhcpSrvDbLocation=rlDhcpSrvDbLocation, rlDhcpSrvDynamicIpNetMask=rlDhcpSrvDynamicIpNetMask, rlDhcpRelayInterfaceListEntry=rlDhcpRelayInterfaceListEntry, rlDhcpRelayInterfaceListVlanId1025To2048=rlDhcpRelayInterfaceListVlanId1025To2048, rlDhcpSrvIpAddrRowStatus=rlDhcpSrvIpAddrRowStatus, rsDHCP=rsDHCP, rlDhcpSrvProbeRetries=rlDhcpSrvProbeRetries, 
rlDhcpSrvProbeEnable=rlDhcpSrvProbeEnable, rlDhcpSrvDynamicLeaseTime=rlDhcpSrvDynamicLeaseTime, rlDhcpRelayNextServerIpAddr=rlDhcpRelayNextServerIpAddr, rlDhcpSrvConfParamsBootfileName=rlDhcpSrvConfParamsBootfileName, rsDhcpMibVersion=rsDhcpMibVersion, rlDhcpRelayInterfaceListVlanId3073To4094=rlDhcpRelayInterfaceListVlanId3073To4094, rlDhcpSrvIpAddrIpNetMask=rlDhcpSrvIpAddrIpNetMask, rlDhcpSrvIpAddrIdentifierType=rlDhcpSrvIpAddrIdentifierType, rlDhcpSrvEnable=rlDhcpSrvEnable, rlDhcpSrvDbErase=rlDhcpSrvDbErase, rlDhcpSrvIpAddrTable=rlDhcpSrvIpAddrTable, rlDhcpSrvIpAddrIdentifier=rlDhcpSrvIpAddrIdentifier, rlDhcpSrvDbNumOfActiveEntries=rlDhcpSrvDbNumOfActiveEntries, rlDhcpRelayInterfaceListPortList=rlDhcpRelayInterfaceListPortList, rlDhcpSrvDynamicTotalNumOfAddr=rlDhcpSrvDynamicTotalNumOfAddr, rlDhcpSrvDynamicRowStatus=rlDhcpSrvDynamicRowStatus, rlDhcpRelayNextServerSecThreshold=rlDhcpRelayNextServerSecThreshold, rlDhcpSrvIpAddrConfParamsName=rlDhcpSrvIpAddrConfParamsName, rlDhcpRelayInterfaceListVlanId1To1024=rlDhcpRelayInterfaceListVlanId1To1024, rlDhcpSrvIpAddrIpAddr=rlDhcpSrvIpAddrIpAddr, rlDhcpSrvConfParamsEntry=rlDhcpSrvConfParamsEntry, rlDhcpSrvConfParamsRoutersList=rlDhcpSrvConfParamsRoutersList, rlDhcpSrvDynamicProbeEnable=rlDhcpSrvDynamicProbeEnable, rlDhcpRelayInterfaceListVlanId2049To3072=rlDhcpRelayInterfaceListVlanId2049To3072, rlDhcpRelayInterfaceTable=rlDhcpRelayInterfaceTable, rlDhcpSrvDynamicIpAddrTo=rlDhcpSrvDynamicIpAddrTo, rlDhcpRelayInterfaceIfindex=rlDhcpRelayInterfaceIfindex, rlDhcpRelayNextServerTable=rlDhcpRelayNextServerTable, rlDhcpSrvConfParamsNmsIp=rlDhcpSrvConfParamsNmsIp, rlDhcpRelayEnable=rlDhcpRelayEnable, rlDhcpRelayInterfaceEntry=rlDhcpRelayInterfaceEntry, rlDhcpRelayNextServerRowStatus=rlDhcpRelayNextServerRowStatus, rlDhcpSrvDynamicTable=rlDhcpSrvDynamicTable, rlDhcpSrvIpAddrEntry=rlDhcpSrvIpAddrEntry, rlDhcpRelayNextServerEntry=rlDhcpRelayNextServerEntry, rlDhcpSrvDynamicPoolName=rlDhcpSrvDynamicPoolName, 
rlDhcpSrvDynamicEntry=rlDhcpSrvDynamicEntry, rlDhcpSrvIpAddrAgeTime=rlDhcpSrvIpAddrAgeTime)
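The `rlDhcpSrvConfParamsOptionsList` description above encodes options as `<code/length/value>` triplets, with multi-element values separated by ','. As an illustrative sketch only — the helper name is hypothetical, and the assumption that consecutive triplets are joined with '/' is mine, since the MIB text does not specify a segment separator — such a string could be decoded like this:

```python
# Hypothetical decoder for the rlDhcpSrvConfParamsOptionsList string format.
# Assumption (not stated in the MIB): consecutive triplets are joined by '/'.
def parse_options_list(options_list):
    """Split "<code/length/value>" triplets into (code, length, values) tuples."""
    fields = options_list.split('/')
    segments = []
    for i in range(0, len(fields) - 2, 3):
        code = int(fields[i])              # option code, e.g. 6 for DNS servers
        length = int(fields[i + 1])        # declared length of the option
        values = fields[i + 2].split(',')  # ','-separated value elements
        segments.append((code, length, values))
    return segments
```

For example, `parse_options_list("6/8/10.0.0.1,10.0.0.2")` yields one segment for option code 6 with two IP-address elements.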
#!/usr/bin/env python
"""Django's command-line utility for administrative tasks."""
import os
import sys

from google.auth.exceptions import DefaultCredentialsError

from one_barangay.local_settings import logger


def main():
    """Run administrative tasks."""
    # os.environ.setdefault("DJANGO_SETTINGS_MODULE", "one_barangay.settings")
    # os.environ["FIRESTORE_DATASET"] = "rbi"
    # os.environ["FIRESTORE_EMULATOR_HOST"] = "127.0.0.1:8080"
    # os.environ["FIRESTORE_EMULATOR_HOST_PATH"] = "127.0.0.1:8080/firestore"
    # os.environ["FIRESTORE_HOST"] = "http://127.0.0.1:8080"
    # os.environ["FIRESTORE_PROJECT_ID"] = "onebarangay-malanday"
    try:
        from django.core.management import (  # For Django pylint: disable=import-outside-toplevel
            execute_from_command_line,
        )
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)

    if os.getenv("GAE_ENV", "").startswith("standard"):
        try:
            import googleclouddebugger  # For GCP pylint: disable=import-outside-toplevel

            googleclouddebugger.enable(
                module="oneBarangay",
                version="v1.0",
                breakpoint_enable_canary=True,
            )
            logger.info("Cloud Debugger started.")
        except ImportError as exc:
            logger.exception(
                "Couldn't import Cloud Debugger. "
                "Are you sure it's installed and "
                "is available in your requirements.txt?\n%s",
                exc,
            )
        except DefaultCredentialsError as e:
            logger.exception("The credential json or path is invalid. %s", e)


if __name__ == "__main__":
    main()
from setuptools import setup, find_packages
from pathlib import Path

# get current directory
current_directory = Path(__file__).resolve().parent


def get_long_description():
    """
    get long description from README.rst file
    """
    with current_directory.joinpath("README.rst").open() as f:
        return f.read()


setup(
    name="powerstrip",
    version="0.0.1",
    description="Simple module to manage plugins.",
    long_description=get_long_description(),
    url="https://github.com/keans/powerstrip",
    author="Ansgar Kellner",
    author_email="akellner@gmx.de",
    license="MIT",
    classifiers=[
        "Development Status :: 3 - Alpha",
        "Intended Audience :: Developers",
        "Topic :: Software Development :: Build Tools",
        "License :: OSI Approved :: MIT License",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
    ],
    python_requires=">=3.6",
    keywords="powerstrip",
    packages=find_packages(
        exclude=["contrib", "docs", "tests"]
    ),
    install_requires=[
        "pyyaml", "cerberus"
    ],
    # fixed from "extra_require": setuptools expects the key "extras_require"
    extras_require={
        "docs": ["mkdocs"],
    },
)
__version__ = "0.0.3"  # only source of version ID
__title__ = "dfm"
__download_url__ = (
    "https://github.com/centre-for-humanities-computing/danish-foundation-models"
)
def load(h):
    return ({'abbr': 0, 'code': 0, 'title': 'Explicit co-ordinate values sent'},
            {'abbr': 1, 'code': 1, 'title': 'Linear co-ordinates'},
            {'abbr': 2, 'code': 2, 'title': 'Log co-ordinates'},
            {'abbr': 3, 'code': 3, 'title': 'Reserved'},
            {'abbr': 4, 'code': 4, 'title': 'Reserved'},
            {'abbr': 5, 'code': 5, 'title': 'Reserved'},
            {'abbr': 6, 'code': 6, 'title': 'Reserved'},
            {'abbr': 7, 'code': 7, 'title': 'Reserved'},
            {'abbr': 8, 'code': 8, 'title': 'Reserved'},
            {'abbr': 9, 'code': 9, 'title': 'Reserved'},
            {'abbr': 10, 'code': 10, 'title': 'Reserved'},
            {'abbr': 11, 'code': 11, 'title': 'Geometric Co-ordinates'})
# -*- coding: utf-8 -*-
from django.contrib.auth.decorators import login_required
from django.contrib.auth import login, authenticate
from django.shortcuts import render
from django.views.generic.edit import FormView

from igame_platform.accounts.forms import RegisterForm


@login_required
def home(request):
    return render(request, 'accounts/home.html')


class RegisterView(FormView):
    template_name = 'accounts/register.html'
    form_class = RegisterForm
    success_url = '/home/'

    def form_valid(self, form):
        # This method is called when valid form data has been POSTed.
        # It should return an HttpResponse.
        form.save()
        username = form.cleaned_data.get('username')
        raw_password = form.cleaned_data.get('password1')
        user = authenticate(username=username, password=raw_password)
        login(self.request, user)
        return super(RegisterView, self).form_valid(form)
# -*- coding: utf-8 -*-
from watson.framework import views
from watson.framework.views.decorators import view


class MyController(object):
    @view(format='xml')
    def xml_action(self, *args, **kwargs):
        return {}

    @view(format='xml/text')
    def xml_full_mime_action(self):
        return {}

    @view(format='html', template='test')
    def html_action(self):
        return {}

    @view(format='json')
    def bool_action(self):
        return True


class TestViewDecorator(object):
    def test_view_model_format(self):
        controller = MyController()
        controller_response = controller.xml_action()
        assert isinstance(controller_response, views.Model)
        assert controller_response.format == 'xml'
        assert controller.xml_full_mime_action().format == 'xml/text'

    def test_view_model_template(self):
        controller = MyController()
        controller_response = controller.html_action()
        assert isinstance(controller_response, views.Model)
        assert controller_response.format == 'html'
        assert controller_response.template == 'test'

    def test_view_model_response(self):
        controller = MyController()
        controller_response = controller.bool_action()
        assert controller_response.data['content']
a3ddbef96269a6702e6dbfd58c1118c43d0e570f | 3,049 | py | Python | bokeh/util/options.py | teresafds/bokeh | 95b2a74ff463cfabdf9e3390951fa380166e6691 | [
"BSD-3-Clause"
] | null | null | null | bokeh/util/options.py | teresafds/bokeh | 95b2a74ff463cfabdf9e3390951fa380166e6691 | [
"BSD-3-Clause"
] | null | null | null | bokeh/util/options.py | teresafds/bokeh | 95b2a74ff463cfabdf9e3390951fa380166e6691 | [
"BSD-3-Clause"
] | null | null | null | #-----------------------------------------------------------------------------
# Copyright (c) 2012 - 2022, Anaconda, Inc., and Bokeh Contributors.
# All rights reserved.
#
# The full license is in the file LICENSE.txt, distributed with this software.
#-----------------------------------------------------------------------------
""" Utilities for specifying, validating, and documenting configuration
options.
"""
#-----------------------------------------------------------------------------
# Boilerplate
#-----------------------------------------------------------------------------
from __future__ import annotations
import logging # isort:skip
log = logging.getLogger(__name__)
#-----------------------------------------------------------------------------
# Imports
#-----------------------------------------------------------------------------
# Standard library imports
from typing import Any, Dict
# Bokeh imports
from ..core.has_props import HasProps, Local
#-----------------------------------------------------------------------------
# Globals and constants
#-----------------------------------------------------------------------------
__all__ = (
'Options',
)
#-----------------------------------------------------------------------------
# General API
#-----------------------------------------------------------------------------
class Options(HasProps, Local):
''' Leverage the Bokeh properties type system for specifying and
validating configuration options.
Subclasses of ``Options`` specify a set of configuration options
using standard Bokeh properties:
.. code-block:: python
class ConnectOpts(Options):
host = String(default="127.0.0.1", help="a host value")
port = Int(default=5590, help="a port value")
Then a ``ConnectOpts`` can be created by passing a dictionary
containing keys and values corresponding to the configuration options,
as well as any additional keys and values. The items corresponding
to the properties on ``ConnectOpts`` will be ***removed*** from the
dictionary. This can be useful for functions that accept their own
set of config keyword arguments in addition to some set of Bokeh model
properties.
'''
def __init__(self, kw: Dict[str, Any]) -> None:
# remove any items that match our declared properties
props: Dict[str, Any] = {}
for k in self.properties():
if k in kw:
props[k] = kw.pop(k)
super().__init__(**props)
#-----------------------------------------------------------------------------
# Dev API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Private API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Code
#-----------------------------------------------------------------------------
| 34.647727 | 78 | 0.401443 | 232 | 3,049 | 5.185345 | 0.556034 | 0.0665 | 0.021613 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006865 | 0.140046 | 3,049 | 87 | 79 | 35.045977 | 0.451945 | 0.801902 | 0 | 0 | 0 | 0 | 0.014085 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.266667 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a3ff0f67daa7cbdca2ffe709e522082f6cc2f624 | 202 | py | Python | horseradish/constants.py | havron/horseradish | 0d2b2a87e3a4c4436a9f31e74313a67afc855461 | [
"Apache-2.0"
] | null | null | null | horseradish/constants.py | havron/horseradish | 0d2b2a87e3a4c4436a9f31e74313a67afc855461 | [
"Apache-2.0"
] | null | null | null | horseradish/constants.py | havron/horseradish | 0d2b2a87e3a4c4436a9f31e74313a67afc855461 | [
"Apache-2.0"
] | null | null | null | """
.. module: horseradish.constants
:copyright: (c) 2018 by Netflix Inc.
:license: Apache, see LICENSE for more details.
"""
SUCCESS_METRIC_STATUS = "success"
FAILURE_METRIC_STATUS = "failure"
| 25.25 | 51 | 0.717822 | 24 | 202 | 5.875 | 0.791667 | 0.170213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023529 | 0.158416 | 202 | 7 | 52 | 28.857143 | 0.805882 | 0.618812 | 0 | 0 | 0 | 0 | 0.202899 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
430a16984c314ff08cf84c870e581777455fca39 | 209 | py | Python | desafio-013/aumento-de-salario.py | roberto-cardoso/desafios-python | 524a5c723c3a85a441cd6bc226dff6009731d3f6 | [
"MIT"
] | null | null | null | desafio-013/aumento-de-salario.py | roberto-cardoso/desafios-python | 524a5c723c3a85a441cd6bc226dff6009731d3f6 | [
"MIT"
] | null | null | null | desafio-013/aumento-de-salario.py | roberto-cardoso/desafios-python | 524a5c723c3a85a441cd6bc226dff6009731d3f6 | [
"MIT"
] | null | null | null | print('"Aumento"')
print(':'*50)
salario = float(input('Quanto recebe o funcionário? '))
novo_salario = salario+(salario*(15/100))
print('O novo salário dele é de R${}.'.format(novo_salario))
print(':'*50)
| 20.9 | 60 | 0.674641 | 30 | 209 | 4.633333 | 0.633333 | 0.100719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048387 | 0.110048 | 209 | 9 | 61 | 23.222222 | 0.698925 | 0 | 0 | 0.333333 | 0 | 0 | 0.334928 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
4315ef164a82a9b3082f7fc48c80f724fa5d757d | 822 | py | Python | packages/api-server/api_server/routes/building_map.py | mayman99/rmf-web | 5670bd943567c6a866ec6345c972e6fb84d73476 | [
"Apache-2.0"
] | 23 | 2021-04-13T23:01:12.000Z | 2022-03-21T02:15:24.000Z | packages/api-server/api_server/routes/building_map.py | mayman99/rmf-web | 5670bd943567c6a866ec6345c972e6fb84d73476 | [
"Apache-2.0"
] | 326 | 2021-03-10T17:32:17.000Z | 2022-03-30T04:42:14.000Z | packages/api-server/api_server/routes/building_map.py | mayman99/rmf-web | 5670bd943567c6a866ec6345c972e6fb84d73476 | [
"Apache-2.0"
] | 13 | 2021-04-10T10:33:36.000Z | 2022-02-22T15:39:58.000Z | from fastapi import Depends, HTTPException
from rx import operators as rxops
from api_server.fast_io import FastIORouter, SubscriptionRequest
from api_server.models import BuildingMap
from api_server.repositories import RmfRepository, rmf_repo_dep
from api_server.rmf_io import rmf_events
router = FastIORouter(tags=["Building"])
@router.get("", response_model=BuildingMap)
async def get_building_map(rmf_repo: RmfRepository = Depends(rmf_repo_dep)):
"""
Available in socket.io
"""
building_map = await rmf_repo.get_bulding_map()
if building_map is None:
raise HTTPException(status_code=404)
return building_map
@router.sub("", response_model=BuildingMap)
def sub_building_map(_req: SubscriptionRequest):
return rmf_events.building_map.pipe(rxops.filter(lambda x: x is not None))
| 31.615385 | 78 | 0.785888 | 113 | 822 | 5.469027 | 0.460177 | 0.106796 | 0.084142 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004219 | 0.135037 | 822 | 25 | 79 | 32.88 | 0.864979 | 0 | 0 | 0 | 0 | 0 | 0.010204 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.375 | 0.0625 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
431b988bf183a914d452a2203ea253249e65b2a0 | 1,184 | py | Python | wagtail/contrib/redirects/wagtail_hooks.py | Frojd/wagtail | df164a7e6938f7faf9d75efae31ee39e9ccce9c3 | [
"BSD-3-Clause"
] | 2 | 2019-05-23T01:31:18.000Z | 2020-06-27T21:19:10.000Z | wagtail/contrib/redirects/wagtail_hooks.py | Frojd/wagtail | df164a7e6938f7faf9d75efae31ee39e9ccce9c3 | [
"BSD-3-Clause"
] | 1 | 2020-07-01T09:25:23.000Z | 2020-07-01T09:25:23.000Z | wagtail/contrib/redirects/wagtail_hooks.py | Frojd/wagtail | df164a7e6938f7faf9d75efae31ee39e9ccce9c3 | [
"BSD-3-Clause"
] | null | null | null | from django.conf.urls import include, url
from django.contrib.auth.models import Permission
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from wagtail.admin.menu import MenuItem
from wagtail.contrib.redirects import urls
from wagtail.contrib.redirects.permissions import permission_policy
from wagtail.core import hooks
@hooks.register('register_admin_urls')
def register_admin_urls():
    return [
        url(r'^redirects/', include(urls, namespace='wagtailredirects')),
    ]


class RedirectsMenuItem(MenuItem):
    def is_shown(self, request):
        return permission_policy.user_has_any_permission(
            request.user, ['add', 'change', 'delete']
        )


@hooks.register('register_settings_menu_item')
def register_redirects_menu_item():
    return RedirectsMenuItem(
        _('Redirects'), reverse('wagtailredirects:index'), icon_name='redirect', order=800
    )


@hooks.register('register_permissions')
def register_permissions():
    return Permission.objects.filter(content_type__app_label='wagtailredirects',
                                     codename__in=['add_redirect', 'change_redirect', 'delete_redirect'])
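# The @hooks.register(...) calls above rely on a decorator-based registry.
# A minimal self-contained sketch of that pattern (hypothetical `register` /
# `get_hooks` helpers, not Wagtail's actual implementation):

```python
from collections import defaultdict

_hooks = defaultdict(list)


def register(hook_name):
    """Return a decorator that files the function under hook_name."""
    def decorator(fn):
        _hooks[hook_name].append(fn)
        return fn
    return decorator


def get_hooks(hook_name):
    """All functions registered for hook_name, in registration order."""
    return list(_hooks[hook_name])


@register('register_admin_urls')
def my_urls():
    return ['^redirects/']


# Callers iterate the registered functions and collect their results:
urls = [u for fn in get_hooks('register_admin_urls') for u in fn()]
```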
# Source: Data Structure/Array Or Vector/Convert Array into Zig-Zag Fashion/solutionByVaishnavi.py
# (Mdanish777/Programmers-Community, MIT)
def zigZag_Fashion(array, length):
    flag = True
    for i in range(length - 1):
        if flag is True:
            if array[i] > array[i+1]:
                array[i], array[i+1] = array[i+1], array[i]
        else:
            if array[i] < array[i+1]:
                array[i], array[i+1] = array[i+1], array[i]
        flag = bool(1 - flag)
    print(array)


arraySize = int(input("Enter Array Size:- "))
array = []
print("Enter Array Elements")
for i in range(arraySize):
    array.append(int(input()))
length = len(array)
zigZag_Fashion(array, length)
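# After the pass, elements alternate a <= b >= c <= d ... This standalone
# sketch repeats the same single-pass swap logic without the interactive
# input() driver, so the behavior can be checked on a fixed list:

```python
def zigzag(array):
    """Rearrange in place so that array[0] <= array[1] >= array[2] <= ..."""
    expect_rise = True
    for i in range(len(array) - 1):
        if expect_rise:
            # Ascending step expected: swap if the pair descends.
            if array[i] > array[i + 1]:
                array[i], array[i + 1] = array[i + 1], array[i]
        else:
            # Descending step expected: swap if the pair ascends.
            if array[i] < array[i + 1]:
                array[i], array[i + 1] = array[i + 1], array[i]
        expect_rise = not expect_rise
    return array


result = zigzag([4, 3, 7, 8, 6, 2, 1])  # → [3, 7, 4, 8, 2, 6, 1]
```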
# Source: reverse_integer.py (tusharsadhwani/leetcode, MIT)
class Solution:
    def reverse(self, x: int) -> int:
        x_str = str(abs(x))
        reversed_str = x_str[::-1]
        int_max = (1 << 31) - 1
        reversed_num = 0
        for digit_str in reversed_str:
            digit = int(digit_str)
            if reversed_num > int_max // 10:
                return 0
            if reversed_num == int_max // 10:
                if x < 0 and digit > 8:
                    return 0
                elif x > 0 and digit > 7:
                    return 0
            reversed_num *= 10
            reversed_num += digit
        if x < 0:
            reversed_num *= -1
        return reversed_num


tests = [
    (
        (123,),
        321,
    ),
    (
        (-123,),
        -321,
    ),
    (
        (120,),
        21,
    ),
    (
        (0,),
        0,
    ),
]
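# The `tests` table pairs argument tuples with expected results. A standalone
# runner over the same table (the solution is re-stated as a plain function so
# this snippet is self-contained), with one extra overflow case added:

```python
def reverse(x: int) -> int:
    int_max = (1 << 31) - 1
    reversed_num = 0
    for digit_str in str(abs(x))[::-1]:
        digit = int(digit_str)
        # Abort before the multiply-accumulate would exceed 32-bit range.
        if reversed_num > int_max // 10:
            return 0
        if reversed_num == int_max // 10 and digit > (8 if x < 0 else 7):
            return 0
        reversed_num = reversed_num * 10 + digit
    return -reversed_num if x < 0 else reversed_num


cases = [((123,), 321), ((-123,), -321), ((120,), 21), ((0,), 0), ((1534236469,), 0)]
results = [reverse(*args) for args, _ in cases]
```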
# Source: django_client_framework/api/__init__.py
# (kaleido-public/django-client-framework, MIT)
from typing import Type, TypeVar
from django_client_framework.models.abstract.serializable import (
    ISerializable,
    Serializable,
)

from .base_model_api import BaseModelAPI
from .model_collection_api import ModelCollectionAPI
from .model_object_api import ModelObjectAPI
from .related_model_api import RelatedModelAPI

T = TypeVar("T", bound="Type[ISerializable]")


def register_api_model(model_class: T) -> T:
    BaseModelAPI.models.append(model_class)
    return model_class


def check_integrity() -> None:
    for model in BaseModelAPI.models:
        if not issubclass(model, Serializable):
            raise TypeError(f"model {model} must inherit Serializable")
# Source: cvxpy/constraints/leq_constraint.py (quantopian/cvxpy, Apache-2.0)
"""
Copyright 2017 Steven Diamond

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""

import cvxpy.utilities as u
import cvxpy.lin_ops.lin_utils as lu
# Only need Variable from expressions, but that would create a circular import.
from cvxpy.expressions import cvxtypes
from cvxpy.constraints.constraint import Constraint
import numpy as np


class LeqConstraint(u.Canonical, Constraint):
    OP_NAME = "<="
    TOLERANCE = 1e-4

    def __init__(self, lh_exp, rh_exp):
        self.args = [lh_exp, rh_exp]
        self._expr = lh_exp - rh_exp
        self.dual_variable = cvxtypes.variable()(*self._expr.size)
        super(LeqConstraint, self).__init__()

    @property
    def id(self):
        """Wrapper for compatibility with variables.
        """
        return self.constr_id

    def name(self):
        return ' '.join([str(self.args[0].name()),
                         self.OP_NAME,
                         str(self.args[1].name())])

    def __str__(self):
        """Returns a string showing the mathematical constraint.
        """
        return self.name()

    def __repr__(self):
        """Returns a string with information about the constraint.
        """
        return "%s(%s, %s)" % (self.__class__.__name__,
                               repr(self.args[0]),
                               repr(self.args[1]))

    def __nonzero__(self):
        """Raises an exception when called.

        Python 2 version.

        Called when evaluating the truth value of the constraint.
        Raising an error here prevents writing chained constraints.
        """
        return self._chain_constraints()

    def _chain_constraints(self):
        """Raises an error due to chained constraints.
        """
        raise Exception(
            ("Cannot evaluate the truth value of a constraint or "
             "chain constraints, e.g., 1 >= x >= 0.")
        )

    def __bool__(self):
        """Raises an exception when called.

        Python 3 version.

        Called when evaluating the truth value of the constraint.
        Raising an error here prevents writing chained constraints.
        """
        return self._chain_constraints()

    @property
    def size(self):
        return self._expr.size

    # Left hand expression must be convex and right hand must be concave.
    def is_dcp(self):
        return self._expr.is_convex()

    def canonicalize(self):
        """Returns the graph implementation of the object.

        Marks the top level constraint as the dual_holder,
        so the dual value will be saved to the LeqConstraint.

        Returns
        -------
        tuple
            A tuple of (affine expression, [constraints]).
        """
        obj, constraints = self._expr.canonical_form
        dual_holder = lu.create_leq(obj, constr_id=self.id)
        return (None, constraints + [dual_holder])

    def variables(self):
        """Returns the variables in the compared expressions.
        """
        return self._expr.variables()

    def parameters(self):
        """Returns the parameters in the compared expressions.
        """
        return self._expr.parameters()

    def constants(self):
        """Returns the constants in the compared expressions.
        """
        return self._expr.constants()

    @property
    def value(self):
        """Does the constraint hold?

        Returns
        -------
        bool
        """
        resid = self.residual.value
        if resid is None:
            return None
        else:
            return np.all(resid <= self.TOLERANCE)

    @property
    def residual(self):
        """The residual of the constraint.

        Returns
        -------
        Expression
        """
        return cvxtypes.pos()(self._expr)

    @property
    def violation(self):
        """How much is this constraint off by?

        Returns
        -------
        NumPy matrix
        """
        return self.residual.value

    # The value of the dual variable.
    @property
    def dual_value(self):
        return self.dual_variable.value

    def save_value(self, value):
        """Save the value of the dual variable for the constraint's parent.

        Args:
            value: The value of the dual variable.
        """
        self.dual_variable.save_value(value)
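# Numerically, `residual` is pos(lhs - rhs) = max(lhs - rhs, 0) elementwise,
# `violation` is its value, and `value` compares it against TOLERANCE. A plain
# scalar illustration of those semantics (no cvxpy objects involved):

```python
TOLERANCE = 1e-4


def residual(lhs, rhs):
    """How far lhs <= rhs is from holding; 0 when it holds."""
    return max(lhs - rhs, 0.0)


def holds(lhs, rhs, tol=TOLERANCE):
    """The constraint lhs <= rhs holds up to the tolerance."""
    return residual(lhs, rhs) <= tol


r = residual(7.0, 5.0)   # constraint 7 <= 5 is violated by 2
ok = holds(3.0, 5.0)     # constraint 3 <= 5 holds
```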
# Source: PV_ICE/__init__.py (NREL/PV-DEMICE, BSD-3-Clause)
from PV_ICE.main import Simulation, Scenario, Material, weibull_params, weibull_cdf, calculateLCA, weibull_cdf_vis
from PV_ICE.main import sens_StageImprovement, sens_StageEfficiency
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions
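# The weibull_params / weibull_cdf helpers imported above work with the
# two-parameter Weibull distribution commonly used for module reliability,
# F(t) = 1 - exp(-(t/alpha)**beta). A standalone sketch of that formula
# (illustrative only; the name and signature here are not PV_ICE's API):

```python
import math


def weibull_cdf_example(t, alpha, beta):
    """Fraction of a cohort failed by time t (alpha = scale, beta = shape)."""
    return 1.0 - math.exp(-((t / alpha) ** beta))


# At t == alpha the CDF equals 1 - 1/e regardless of the shape parameter.
at_scale = weibull_cdf_example(30.0, 30.0, 5.3)
```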
# Source: spytest/spytest/env.py (macikgozwa/sonic-mgmt, Apache-2.0)
import os
defaults = {
    "SPYTEST_LOGS_TIME_FMT_ELAPSED": "0",
    "SPYTEST_LOGS_MODULE_ONLY_SUPPORT": "0",
    "SPYTEST_NO_CONSOLE_LOG": "0",
    "SPYTEST_PROMPTS_FILENAME": None,
    "SPYTEST_TEXTFSM_INDEX_FILENAME": "index",
    "SPYTEST_UI_POSITIVE_CASES_ONLY": "0",
    "SPYTEST_REPEAT_MODULE_SUPPORT": "0",
    "SPYTEST_FILE_PREFIX": "results",
    "SPYTEST_RESULTS_PREFIX": None,
    "SPYTEST_RESULTS_PNG": "1",
    "SPYTEST_MODULE_CSV_FILENAME": "modules.csv",
    "SPYTEST_MODULE_INFO_CSV_FILENAME": "module_info.csv",
    "SPYTEST_FUNCTION_INFO_CSV_FILENAME": "function_info.csv",
    "SPYTEST_TCMAP_CSV_FILENAME": "tcmap.csv",
    "SPYTEST_TESTBED_IGNORE_CONSTRAINTS": "",
    "SPYTEST_FLEX_DUT": "1",
    "SPYTEST_FLEX_PORT": "0",
    "SPYTEST_MGMT_IFNAME": "eth0",
    "SPYTEST_TOPO_SEP": None,
    "SPYTEST_TESTBED_RANDOMIZE_DEVICES": "0",
    "SPYTEST_BUCKETS_DEADNODE_RECOVERY": "1",
    "SPYTEST_BATCH_DEFAULT_BUCKET": "1",
    "SPYTEST_TOPO_1": "D1T1:2",
    "SPYTEST_TOPO_2": "D1T1:4 D1D2:6 D2T1:2",
    "SPYTEST_TOPO_3": "D1 D2 D3",
    "SPYTEST_TOPO_4": "D1T1:2 D2T1:2 D3T1:2 D4T1:2 D1D2:4 D2D3:4 D3D4:4 D4D1:4",
    "SPYTEST_TOPO_5": "D1 D2 D3 D4 D5",
    "SPYTEST_TOPO_6": "D1D3:4 D1D4:4 D1D5:2 D1D6:4 D2D3:4 D2D4:4 D2D5:4 D2D6:4 D3T1:2 D4T1:2 D5T1:2 D6T1:2",
    "SPYTEST_TOPO_7": "D1 D2 D3 D4 D5 D6 D7",
    "SPYTEST_TOPO_8": "D1 D2 D3 D4 D5 D6 D7 D8",
    "SPYTEST_EMAIL_BODY_PREFIX": "",
    "SPYTEST_TECH_SUPPORT_ONERROR": "system,port_list,port_status",
    "SPYTEST_SAVE_CLI_TYPE": "1",
    "SPYTEST_SAVE_CLI_CMDS": "1",
    "SPYTEST_SHUTDOWN_FREE_PORTS": "0",
    "SPYTEST_ABORT_ON_VERSION_MISMATCH": "2",
    "SPYTEST_TOPOLOGY_STATUS_MAX_WAIT": "60",
    "SPYTEST_TOPOLOGY_STATUS_ONFAIL_ABORT": "module",
    "SPYTEST_LIVE_RESULTS": "1",
    "SPYTEST_DEBUG_FIND_PROMPT": "0",
    "SPYTEST_KDUMP_ENABLE": "1",
    "SPYTEST_LOG_DUTID_FMT": "LABEL",
    "SPYTEST_SWAP_PASSWORD": "0",
    "SPYTEST_SYSRQ_ENABLE": "0",
    "SPYTEST_SET_STATIC_IP": "1",
    "SPYTEST_ONREBOOT_RENEW_MGMT_IP": "0",
    "SPYTEST_DATE_SYNC": "1",
    "SPYTEST_BOOT_FROM_GRUB": "0",
    "SPYTEST_RECOVERY_MECHANISMS": "1",
    "SPYTEST_RESET_CONSOLES": "0",
    "SPYTEST_ONCONSOLE_HANG": "recover",
    "SPYTEST_CONNECT_DEVICES_RETRY": "10",
    "SPYTEST_OPENCONFIG_API": "GNMI",
    "SPYTEST_IFA_ENABLE": "0",
    "SPYTEST_ROUTING_CONFIG_MODE": None,
    "SPYTEST_CLEAR_MGMT_INTERFACE": "0",
    "SPYTEST_CLEAR_DEVICE_METADATA_HOSTNAME": "0",
    "SPYTEST_NTP_CONFIG_INIT": "0",
    "SPYTEST_GENERATE_CERTIFICATE": "0",
    "SPYTEST_HOOKS_PORT_ADMIN_STATE_UITYPE": "click",
    "SPYTEST_HOOKS_PORT_STATUS_UITYPE": "click",
    "SPYTEST_HOOKS_VERSION_UITYPE": "click",
    "SPYTEST_HOOKS_BREAKOUT_UITYPE": "klish",
    "SPYTEST_HOOKS_SPEED_UITYPE": "",
    "SPYTEST_IFNAME_MAP_UITYPE": "click",
    "SPYTEST_API_INSTRUMENT_SUPPORT": "0",
    "SPYTEST_REDIS_DB_CLI_TYPE": "1",
    "SPYTEST_TOPOLOGY_SHOW_ALIAS": "0",
    "SPYTEST_TOPOLOGY_STATUS_FAST": "1",
    "SPYTEST_BGP_API_UITYPE": "",
    "SPYTEST_BGP_CFG_API_UITYPE": "",
    "SPYTEST_BGP_SHOW_API_UITYPE": "",
}
dev_defaults = {
    "SPYTEST_TOPOLOGY_SIMULATE_FAIL": "0",
    "SPYTEST_REST_TEST_URL": None,
    "SPYTEST_BATCH_BACKUP_NODES": None,
    "SPYTEST_BATCH_RERUN_NODES": None,
    "SPYTEST_BATCH_MODULE_TOPO_PREF": None,
    "SPYTEST_BATCH_MATCHING_BUCKET_ORDER": "larger,largest",
    "SPYTEST_BATCH_RERUN": None,
    "SPYTEST_TESTBED_FILE": "testbed.yaml",
    "SPYTEST_FILE_MODE": "0",
    "SPYTEST_SCHEDULING": None,
    "SPYTEST_BATCH_RUN": None,
    "PYTEST_XDIST_WORKER": None,
    "SPYTEST_BUCKETS_DEADNODE_SIMULATE": "0",
    "SPYTEST_USER_ROOT": None,
    "SPYTEST_CMDLINE_ARGS": "",
    "SPYTEST_SUITE_ARGS": "",
    "SPYTEST_TEXTFSM_DUMP_INDENT_JSON": None,
    "SPYTEST_TESTBED_EXCLUDE_DEVICES": None,
    "SPYTEST_TESTBED_INCLUDE_DEVICES": None,
    "SPYTEST_LOGS_PATH": None,
    "SPYTEST_LOGS_LEVEL": "info",
    "SPYTEST_APPLY_BASE_CONFIG_AFTER_MODULE": "0",
    "SPYTEST_COMMUNITY_BUILD_FEATURES": "0",
    "SPYTEST_ONFAIL_TGEN_STATS": "0",
    "SPYTEST_SYSTEM_READY_AFTER_PORT_SETTINGS": "0",
    "SPYTEST_APPLY_CONFIG_BEFORE_BREAKOUT": None,
    "SPYTEST_TCLIST_FILE": None,
    "SPYTEST_MODULE_REPORT_SORTER": "CDT",
    "SPYTEST_ASAN_OPTIONS": "",
    "SPYTEST_RECOVER_INITIAL_SYSTEM_NOT_READY": "0",
    "SPYTEST_LIVE_TRACE_OUTPUT": "0",
    "SPYTEST_USE_SAMPLE_DATA": None,
    "SPYTEST_DRYRUN_CMD_DELAY": "0",
    "SPYTEST_FASTER_CLI_OVERRIDE": None,
    "SPYTEST_FASTER_CLI_LAST_PROMPT": "1",
    "SPYTEST_NEW_FIND_PROMPT": "0",
    "SPYTEST_DETECT_CONCURRENT_ACCESS": "0",
    "SPYTEST_SPLIT_COMMAND_LIST": "0",
    "SPYTEST_CHECK_SKIP_ERROR": "0",
    "SPYTEST_HELPER_CONFIG_DB_RELOAD": "yes",
    "SPYTEST_CHECK_HELPER_SIGNATURE": "0",
    "SPYTEST_CLICK_HELPER_ARGS": "",
}
def _get_logs_path():
    user_root = os.getenv("SPYTEST_USER_ROOT", os.getcwd())
    logs_path = os.getenv("SPYTEST_LOGS_PATH", user_root)
    if not os.path.isabs(logs_path):
        logs_path = os.path.join(user_root, logs_path)
    if not os.path.exists(logs_path):
        os.makedirs(logs_path)
    return logs_path


def get(name, default=None):
    cur_def = defaults.get(name, default)
    if cur_def is None and default is not None:
        cur_def = default
    retval = os.getenv(name, cur_def)
    return retval


def get_default_all():
    return sorted(defaults.items())
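# get() resolves a setting in order: process environment first, then the
# defaults table, then the caller-supplied fallback. A self-contained
# demonstration of that lookup order with a tiny stand-in table:

```python
import os

_defaults = {"DEMO_FLAG": "0"}


def get_env(name, default=None):
    """Environment wins; otherwise the defaults table; otherwise `default`."""
    cur_def = _defaults.get(name, default)
    if cur_def is None and default is not None:
        cur_def = default
    return os.getenv(name, cur_def)


a = get_env("DEMO_FLAG")                  # falls back to the defaults table
os.environ["DEMO_FLAG"] = "1"
b = get_env("DEMO_FLAG")                  # environment wins once set
c = get_env("DEMO_MISSING", default="x")  # unknown name uses the fallback
```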
# Source: deepr/layers/multi.py (criteo-dexter/deepr, Apache-2.0)
# pylint: disable=no-value-for-parameter,invalid-name,unexpected-keyword-arg
"""Negative Multinomial Log Likelihood."""
import tensorflow as tf
from deepr.layers import base
class MultiLogLikelihood(base.Layer):
"""Negative Multinomial Log Likelihood."""
def __init__(self, **kwargs):
super().__init__(n_in=2, n_out=1, **kwargs)
def forward(self, tensors, mode: str = None):
"""Multinomial Log Likelihood
Parameters
----------
tensors : Tuple[tf.Tensor]
- logits : shape = (batch, num_classes), tf.float32
- classes : shape = (batch, num_classes), tf.int64 as a
one-hot vector
Returns
-------
tf.Tensor
Negative Multinomial Log Likelihood, scalar
"""
logits, classes = tensors
log_softmax = tf.nn.log_softmax(logits)
return -tf.reduce_mean(tf.reduce_sum(log_softmax * tf.cast(classes, tf.float32), axis=-1))
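# The forward pass computes -mean(sum(one_hot * log_softmax(logits))). The
# same quantity in plain Python, to pin down the arithmetic without
# TensorFlow:

```python
import math


def multi_log_likelihood(logits, classes):
    """Negative log likelihood for rows of (logits, one-hot classes)."""
    total = 0.0
    for row, onehot in zip(logits, classes):
        # log_softmax(v) = v - logsumexp(row)
        lse = math.log(sum(math.exp(v) for v in row))
        log_softmax = [v - lse for v in row]
        total += -sum(o * ls for o, ls in zip(onehot, log_softmax))
    return total / len(logits)


nll = multi_log_likelihood([[1.0, 2.0, 3.0]], [[0, 0, 1]])
```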
# Source: tests/unit/method_test.py (cybergarage/round-py, BSD-3-Clause)
#################################################################
#
# Round for Python
#
# Copyright (C) Satoshi Konno 2016
#
# This is licensed under BSD-style license, see file COPYING.
#
##################################################################
import pytest
from round import Method
def test_bad_method():
    method = Method()
    assert not method.is_valid()
# Source: mongo2json/cleaner.py (drunckoder/mongo2json, MIT)
import re
from functools import reduce
from typing import Pattern, List, Iterable

patterns: List[Pattern[str]] = [
    re.compile(pattern=r'ObjectId\((.*)\)'),
    re.compile(pattern=r'ISODate\((.*)\)'),
    re.compile(pattern=r'NumberLong\((.*)\)'),
    re.compile(pattern=r'NumberInt\((.*)\)'),
    re.compile(pattern=r'NumberDecimal\("(.*)"\)'),
]


def apply_pattern(pattern: Pattern[str], string: str) -> str:
    return pattern.sub(repl=r'\1', string=string)


def clean_line(line: str) -> str:
    return reduce(
        lambda string, pattern: apply_pattern(pattern=pattern, string=string),
        patterns,
        line
    )


def clean_string(string: str) -> Iterable[str]:
    return map(clean_line, string.splitlines())


def clean(string: str) -> str:
    return ''.join(clean_string(string=string))
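# End to end, the cleaner strips the extended-JSON wrappers that mongoexport
# emits, leaving the inner literal. The same behavior in a self-contained
# snippet; note the patterns use a greedy `.*`, so this sketch assumes at
# most one wrapper per line:

```python
import re
from functools import reduce

pats = [
    re.compile(r'ObjectId\((.*)\)'),
    re.compile(r'ISODate\((.*)\)'),
    re.compile(r'NumberLong\((.*)\)'),
    re.compile(r'NumberInt\((.*)\)'),
    re.compile(r'NumberDecimal\("(.*)"\)'),
]


def clean_demo(line):
    # Apply every pattern in turn, replacing each match with its group.
    return reduce(lambda s, p: p.sub(r'\1', s), pats, line)


ex1 = clean_demo('ObjectId("5f1c")')       # → '"5f1c"'
ex2 = clean_demo('NumberLong(42)')         # → '42'
ex3 = clean_demo('NumberDecimal("9.99")')  # → '9.99'
```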
# Source: examples/place_on_bristlecone.py (Saibaba-Alapati/Cirq, Apache-2.0)
# pylint: disable=line-too-long
"""Create a circuit, optimize it, and map it onto a Bristlecone chip.
===EXAMPLE_OUTPUT===
Length 10 line on Bristlecone:
(5, 0)━━(5, 1)
        ┃
        (6, 1)━━(6, 2)
                ┃
                (7, 2)━━(7, 3)
                        ┃
                        (8, 3)━━(8, 4)
                                ┃
                                (9, 4)━━(9, 5)
Initial circuit:
0: ───×───────M('all')───
      │       │
1: ───×───×───M──────────
          │   │
2: ───×───×───M──────────
      │       │
3: ───×───×───M──────────
          │   │
4: ───×───×───M──────────
      │       │
5: ───×───×───M──────────
          │   │
6: ───×───×───M──────────
      │       │
7: ───×───×───M──────────
          │   │
8: ───×───×───M──────────
      │       │
9: ───×───────M──────────
Xmon circuit:
(5, 0): ───Y^-0.5───@───Y^0.5───@───────Y^-0.5───@──────────────────────────────────────────────────────────────────M('all')───
│ │ │ │
(5, 1): ───X^-0.5───@───X^0.5───@───────X^-0.5───@────────X^0.5───────────@───X^-0.5───@────────X^0.5───@───────────M──────────
│ │ │ │
(6, 1): ───Y^-0.5───────@───────Y^0.5───@────────Y^-0.5───@───────Y^0.5───@───Y^-0.5───@────────Y^0.5───@───────────M──────────
│ │ │ │
(6, 2): ───X^-0.5───────@───────X^0.5───@────────X^-0.5───@───────X^0.5───────@────────X^-0.5───@───────X^0.5───@───M──────────
│ │ │ │
(7, 2): ───Y^-0.5───@───Y^0.5───@───────Y^-0.5───@────────Y^0.5───────────────@────────Y^-0.5───@───────Y^0.5───@───M──────────
│ │ │ │
(7, 3): ───X^-0.5───@───X^0.5───@───────X^-0.5───@────────X^0.5───────────@───X^-0.5───@────────X^0.5───@───────────M──────────
│ │ │ │
(8, 3): ───Y^-0.5───────@───────Y^0.5───@────────Y^-0.5───@───────Y^0.5───@───Y^-0.5───@────────Y^0.5───@───────────M──────────
│ │ │ │
(8, 4): ───X^-0.5───────@───────X^0.5───@────────X^-0.5───@───────X^0.5───────@────────X^-0.5───@───────X^0.5───@───M──────────
│ │ │ │
(9, 4): ───Y^-0.5───@───Y^0.5───@───────Y^-0.5───@────────Y^0.5───────────────@────────Y^-0.5───@───────Y^0.5───@───M──────────
│ │ │ │
(9, 5): ───X^-0.5───@───X^0.5───@───────X^-0.5───@──────────────────────────────────────────────────────────────────M──────────
"""
import cirq
import cirq_google


def main():
    print("Length 10 line on Bristlecone:")
    line = cirq_google.line_on_device(cirq_google.Bristlecone, length=10)
    print(line)

    print("Initial circuit:")
    n = 10
    depth = 2
    circuit = cirq.Circuit(
        cirq.SWAP(cirq.LineQubit(j), cirq.LineQubit(j + 1))
        for i in range(depth)
        for j in range(i % 2, n - 1, 2)
    )
    circuit.append(cirq.measure(*cirq.LineQubit.range(n), key='all'))
    print(circuit)
    print()

    print("Xmon circuit:")
    translated = cirq_google.optimized_for_xmon(
        circuit=circuit, new_device=cirq_google.Bristlecone, qubit_map=lambda q: line[q.x]
    )
    print(translated)


if __name__ == '__main__':
    main()
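# The double generator in main() lays SWAPs in a brick-wall pattern:
# even-indexed neighbor pairs on even layers, odd-indexed pairs on odd
# layers. The index arithmetic alone, without cirq:

```python
def brickwork_pairs(n, depth):
    """Neighbor-qubit pairs touched by a brick-wall pattern of SWAP layers."""
    return [(j, j + 1) for i in range(depth) for j in range(i % 2, n - 1, 2)]


pairs = brickwork_pairs(n=10, depth=2)
```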
# Source: Python/Matplotlib/01-Introduction/finished_code.py
# (sagarsaliya/code_snippets, MIT)
from matplotlib import pyplot as plt

plt.xkcd()

ages_x = [18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35,
          36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55]

py_dev_y = [20046, 17100, 20000, 24744, 30500, 37732, 41247, 45372, 48876, 53850, 57287, 63016, 65998, 70003, 70000, 71496, 75370, 83640, 84666,
            84392, 78254, 85000, 87038, 91991, 100000, 94796, 97962, 93302, 99240, 102736, 112285, 100771, 104708, 108423, 101407, 112542, 122870, 120000]
plt.plot(ages_x, py_dev_y, label='Python')

js_dev_y = [16446, 16791, 18942, 21780, 25704, 29000, 34372, 37810, 43515, 46823, 49293, 53437, 56373, 62375, 66674, 68745, 68746, 74583, 79000,
            78508, 79996, 80403, 83820, 88833, 91660, 87892, 96243, 90000, 99313, 91660, 102264, 100000, 100000, 91660, 99240, 108000, 105000, 104000]
plt.plot(ages_x, js_dev_y, label='JavaScript')

dev_y = [17784, 16500, 18012, 20628, 25206, 30252, 34368, 38496, 42000, 46752, 49320, 53200, 56000, 62316, 64928, 67317, 68748, 73752, 77232,
         78000, 78508, 79536, 82488, 88935, 90000, 90056, 95000, 90000, 91633, 91660, 98150, 98964, 100000, 98988, 100000, 108923, 105000, 103117]
plt.plot(ages_x, dev_y, color='#444444', linestyle='--', label='All Devs')

plt.xlabel('Ages')
plt.ylabel('Median Salary (USD)')
plt.title('Median Salary (USD) by Age')

plt.legend()

plt.tight_layout()

plt.savefig('plot.png')

plt.show()
# Source: decorator/my_simple_decorator.py (mlobf/pythonOOP, BSD-3-Clause)
import time
def timer(func):
    def wrapper(*args, **kwargs):
        start = time.time()
        rv = func(*args, **kwargs)  # forward the arguments to the wrapped function
        total = time.time() - start
        print("Time:", total)
        return rv
    return wrapper


# how to implement a simple decorator
@timer
def test():
    for _ in range(100000):
        pass


test()
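# One refinement worth knowing: without functools.wraps, the decorated
# function's __name__ and docstring are replaced by the wrapper's. A
# metadata-preserving variant of the same timer:

```python
import time
from functools import wraps


def timer2(func):
    @wraps(func)  # copy __name__, __doc__, etc. onto the wrapper
    def wrapper(*args, **kwargs):
        start = time.time()
        rv = func(*args, **kwargs)
        print("Time:", time.time() - start)
        return rv
    return wrapper


@timer2
def add(a, b):
    """Add two numbers."""
    return a + b


result = add(2, 3)  # → 5, and add still introspects like the original
```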
# Source: infiniteremixer/segmentation/separator.py
# (nroldanf/infiniteremixer, MIT)
# import math
# import os
# import numpy as np
# # from spleeter.separator import Separator
# from infiniteremixer.utils.io import load, write_wav
# class SeparatorChunks:
# """Separate a track into 4 different sources and save the
# corresponding signals as audio files in different folders.
# The separator first divide in chunks that fit into RAM memory and Concatenate
# the results of prediction to an array.
# """
# def __init__(self, sample_rate, max_size):
# # Define the model
# # self.separator = Separator("spleeter:4stems")
# self.max_size = max_size # Maximum size of array (Equivalent 10_000_000 of 80M for a 2 channel audio).
# self.sample_rate = sample_rate
# self._audio_format = "wav"
# self.audio_array_length = None
# def create_and_save_sources(self, dir, save_dir):
# for root, _, files in os.walk(dir):
# for file in files:
# if file[-3:] in ["mp3", "wav"]:
# self._separate(file, dir, save_dir)
# def _separate(self, file, root, save_dir):
# file_path = os.path.join(root, file)
# # Load the audio
# y = load(file_path, self.sample_rate, False)
# self.audio_array_length = y.shape[1]
# # Separate the audio
# if not self._is_file_too_big():
# other, drums = self._separate_full_file(y)
# else:
# other, drums = self._separate_by_chunks(y)
# # Generate the save path and save the audio
# audio_path_other = self._generate_save_path(file, save_dir, "other")
# audio_path_drums = self._generate_save_path(file, save_dir, "drums")
# write_wav(audio_path_other, other.T, self.sample_rate)
# write_wav(audio_path_drums, drums.T, self.sample_rate)
# # sf.write(save_path_other, other.T, self.sample_rate, subtype="PCM_24")
# # sf.write(save_path_drums, drums.T, self.sample_rate, subtype="PCM_24")
# def _generate_save_path(self, file, save_dir, source):
# file_name = f"{file}_{source}.{self._audio_format}"
# folder_path = self._create_save_path(file, save_dir)
# audio_path = os.path.join(folder_path, file_name)
# return audio_path
# def _create_save_path(self, file, save_dir):
# folder_path = os.path.join(save_dir, file)
# try:
# os.mkdir(folder_path)
# except OSError:
# print(f"{folder_path} already exists.")
# return folder_path
# # def _write_wav(self, file, save_root, save_dir, signal):
# # # Save the separated files
# # write_wav(save_dir, signal, self.sample_rate)
# def _is_file_too_big(self):
# return self.audio_array_length > self.max_size
# def _separate_full_file(self, stereo):
# # Get the different sources
# prediction = self.separator.separate(stereo.T)
# other, drums = prediction["other"], prediction["drums"]
# return other, drums
# def _separate_by_chunks(self, stereo):
# # Establish the number of chunks
# n_chunks = math.ceil(self.audio_array_length / self.max_size)
# # Initialize variables
# start_idx = 0
# end_idx = self.max_size
# other, drums = [], []
# for i in range(n_chunks):
# # Get sources for the chunk
# prediction = self.separator.separate(stereo[:, start_idx:end_idx].T)
# # Concatenate chunk predictions (sources) of interest
# other.append(prediction["other"].T)
# drums.append(prediction["drums"].T)
#             start_idx = end_idx
#             # Frontier handling: slice ends are exclusive, so the next chunk
#             # starts exactly where the previous one ended
#             if (stereo.shape[1] - end_idx) <= self.max_size:
#                 end_idx = stereo.shape[1]
#             else:
#                 end_idx += self.max_size
# # Concatenate
# other = np.column_stack(other)
# drums = np.column_stack(drums)
# return other, drums
# if __name__ == "__main__":
# sample_rate = 22050
# max_size = 6_000_000
# # os.mkdir("/app/results/splitted/tes1")
# separator = SeparatorChunks(sample_rate, max_size)
# separator.create_and_save_sources("/app/results/audios/", "/app/results/splitted")
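The chunking logic in the commented-out class above (slice fixed-size windows, run the model per window, rejoin with `np.column_stack`) can be checked standalone. A minimal sketch with an identity "separator" standing in for the Spleeter model, so the round-trip must reproduce the input exactly:

```python
import numpy as np

def separate_by_chunks(stereo, max_size, separate):
    """Run `separate` on max_size-sample chunks and stitch the results."""
    n_samples = stereo.shape[1]
    chunks = []
    for start in range(0, n_samples, max_size):
        chunk = stereo[:, start:start + max_size]  # slice end is exclusive
        chunks.append(separate(chunk))
    return np.column_stack(chunks)                 # shape (channels, n_samples)

# Identity "separator": with it, chunking must round-trip exactly.
stereo = np.arange(2 * 10).reshape(2, 10).astype(float)
out = separate_by_chunks(stereo, max_size=3, separate=lambda c: c)
assert np.array_equal(out, stereo)
```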
| 38.097345 | 113 | 0.617189 | 553 | 4,305 | 4.531646 | 0.253165 | 0.043895 | 0.044693 | 0.031923 | 0.181165 | 0.123703 | 0.105347 | 0 | 0 | 0 | 0 | 0.012129 | 0.272242 | 4,305 | 112 | 114 | 38.4375 | 0.787743 | 0.947735 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4abd311698d91b195b7c8e08cf1aec887a6cb94d | 684 | py | Python | SBDM_Data.py | LiorArmon/Python-Hackathon | 075056d8476c6144b2eaa826ff623cf828385c73 | [
"Apache-2.0"
] | null | null | null | SBDM_Data.py | LiorArmon/Python-Hackathon | 075056d8476c6144b2eaa826ff623cf828385c73 | [
"Apache-2.0"
] | null | null | null | SBDM_Data.py | LiorArmon/Python-Hackathon | 075056d8476c6144b2eaa826ff623cf828385c73 | [
"Apache-2.0"
] | 2 | 2018-06-26T08:40:17.000Z | 2018-06-27T06:17:55.000Z | import psychopy.core
import psychopy.event
import psychopy.visual
import pandas as pd
import numpy as np
import psychopy.gui
import psychopy.sound
import os
import yaml
import json
from pathlib import Path
import random
from Stimulus import Stimulus
class SBDM_Data:
    def __init__(self, df):
        self.df = df

    def create_stim_list(self):
        snack_names = self.df.index.values
        print(snack_names)
        stimlist = []
        for idx, stim in enumerate(snack_names):
            show = self.df['show'][idx]
            cued = self.df['cued'][idx]
            A = Stimulus(name=stim, show=show, cued=cued)
            stimlist.append(A)
        return stimlist
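`create_stim_list` walks the DataFrame's index and pairs each snack name with its `show`/`cued` columns. A minimal sketch of the same loop with a stand-in dataclass for the real `Stimulus` class (the column names match the code above; the stand-in itself is hypothetical):

```python
from dataclasses import dataclass
import pandas as pd

@dataclass
class Stimulus:  # hypothetical stand-in for the real Stimulus class
    name: str
    show: bool
    cued: bool

df = pd.DataFrame({'show': [True, False], 'cued': [False, True]},
                  index=['apple', 'chips'])

stimlist = []
for idx, stim in enumerate(df.index.values):
    # .iloc keeps the positional lookup explicit (df['show'][idx] relies on a
    # deprecated positional fallback when the index is not integer-based)
    stimlist.append(Stimulus(name=stim,
                             show=bool(df['show'].iloc[idx]),
                             cued=bool(df['cued'].iloc[idx])))

assert stimlist[0] == Stimulus('apple', True, False)
```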
| 23.586207 | 57 | 0.662281 | 94 | 684 | 4.712766 | 0.489362 | 0.158014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25731 | 684 | 28 | 58 | 24.428571 | 0.872047 | 0 | 0 | 0 | 0 | 0 | 0.011696 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.5 | 0 | 0.653846 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
434f1248eb3156de43d3fb7802283874cffe5da9 | 1,274 | py | Python | setup.py | transformify-plugins/warpify | 82782b488d26ccc53978fbe6ba8bd65839dc6937 | [
"BSD-3-Clause"
] | null | null | null | setup.py | transformify-plugins/warpify | 82782b488d26ccc53978fbe6ba8bd65839dc6937 | [
"BSD-3-Clause"
] | null | null | null | setup.py | transformify-plugins/warpify | 82782b488d26ccc53978fbe6ba8bd65839dc6937 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
from distutils.core import setup
setup(
name='warpify',
packages=[],
version='0.0.0',
description='Python image warping plugin.',
maintainer='Nicholas Sofroniew',
maintainer_email='sofroniewn@gmail.com',
license='BSD 3-Clause',
url='https://github.com/transformify-plugins/warpify',
keywords=['transformify-plugins', 'transformify', 'warpify', ],
classifiers=[
'Development Status :: 1 - Planning',
'Environment :: Plugins',
'Intended Audience :: Education',
'Intended Audience :: Science/Research',
'License :: OSI Approved :: BSD License',
'Programming Language :: Python',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Topic :: Scientific/Engineering',
'Topic :: Scientific/Engineering :: Visualization',
'Topic :: Scientific/Engineering :: Information Analysis',
'Topic :: Scientific/Engineering :: Bio-Informatics',
'Topic :: Utilities',
'Operating System :: Microsoft :: Windows',
'Operating System :: POSIX',
'Operating System :: Unix',
'Operating System :: MacOS',
],
)
| 36.4 | 67 | 0.61303 | 115 | 1,274 | 6.782609 | 0.6 | 0.097436 | 0.128205 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01032 | 0.239403 | 1,274 | 34 | 68 | 37.470588 | 0.794634 | 0.015699 | 0 | 0 | 0 | 0 | 0.638978 | 0.070288 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.03125 | 0 | 0.03125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
435ac442fb161d63725aab8f9d700c5d48d7ad8d | 4,017 | py | Python | third_party/tests/BlackParrot/external/basejump_stl/bsg_noc/bsg_mesh_to_ring_stitch.py | bsp13/Surelog | 3a04f7f59791390c69f1e3b69609db4ede7668b9 | [
"Apache-2.0"
] | 1 | 2021-05-13T06:43:24.000Z | 2021-05-13T06:43:24.000Z | third_party/tests/BlackParrot/external/basejump_stl/bsg_noc/bsg_mesh_to_ring_stitch.py | bsp13/Surelog | 3a04f7f59791390c69f1e3b69609db4ede7668b9 | [
"Apache-2.0"
] | 2 | 2020-05-13T06:06:49.000Z | 2020-05-15T10:49:11.000Z | third_party/tests/BlackParrot/external/basejump_stl/bsg_noc/bsg_mesh_to_ring_stitch.py | bsp13/Surelog | 3a04f7f59791390c69f1e3b69609db4ede7668b9 | [
"Apache-2.0"
] | 1 | 2020-05-02T03:11:56.000Z | 2020-05-02T03:11:56.000Z | #!/usr/bin/python
#
# bsg_mesh_to_ring_stitch
#
# this module uses space filling curves to organize
# a mesh of items into a ring of nearest-neighbor connections
#
# because of geometry, both X and Y coordinates cannot be odd
#
#
# MBT 5-26-2016
#
#
topX = 8
topY = 8
print "// AUTOGENERATED FILE; DO NOT MODIFY."
print "// run with topX=",topX," and topY=",topY
print "// ";
print "module bsg_mesh_to_ring_stitch #(parameter y_max_p=-1"
print " ,parameter x_max_p=-1"
print " ,parameter width_back_p = -1"
print " ,parameter width_fwd_p = -1"
print " ,parameter b_lp = $clog2(x_max_p*y_max_p)"
print " ) (output [x_max_p-1:0][y_max_p-1:0][b_lp-1:0] id_o"
print " ,output [x_max_p-1:0][y_max_p-1:0][width_back_p-1:0] back_data_in_o"
print " ,input [x_max_p-1:0][y_max_p-1:0][width_back_p-1:0] back_data_out_i"
print " ,output [x_max_p-1:0][y_max_p-1:0][width_fwd_p-1:0] fwd_data_in_o"
print " ,input [x_max_p-1:0][y_max_p-1:0][width_fwd_p-1:0] fwd_data_out_i"
print " );\n\n"
def print_config(maxX, maxY, order):
    matrix = [[0 for y in range(maxY)] for x in range(maxX)]
    my_dict = dict()
    for position, (x, y) in enumerate(order):
        my_dict[position] = (x, y)
        matrix[x][y] = position
    print "if (x_max_p ==", maxX, " && y_max_p ==", maxY, ")\nbegin\n"
    for y in range(maxY-1, -1, -1):
        for x in range(maxX-1, -1, -1):
            position = matrix[x][y]
            below = maxX*maxY-1 if ((position - 1) < 0) else position - 1
            above = 0 if ((position + 1) == maxX*maxY) else position + 1
            (below_x, below_y) = my_dict[below]
            (above_x, above_y) = my_dict[above]
            print "assign back_data_in_o[", below_x, "][", below_y, "] = back_data_out_i[", x, "][", y, "]; // ", below, "<-", position
            print "assign fwd_data_in_o [", above_x, "][", above_y, "] = fwd_data_out_i [", x, "][", y, "]; // ", position, "->", above
    print "\n"
    print " assign id_o = \n {"
    print "// y = ",
    for y in range(0, maxY):
        print str(y)+", ",
    print ""
    for x in range(0, maxX):
        print " {",
        for y in range(0, maxY):
            if (y != 0):
                print ",",
            print "b_lp ' (" + str(matrix[x][y]) + ")",
        if (x != maxX-1):
            print " }, // x = ", x
        else:
            print " }  // x = ", x
    print " };\nend\n"

# even X, odd/even Y
for maxX in range(2, topX+1, 2):
    for maxY in range(2, topY+1, 1):
        order = []
        for x in range(0, maxX, 2):
            for y in range(1, maxY, 1):
                order.append((x, y))
            for y in range(maxY-1, 0, -1):
                order.append((x+1, y))
        for x in range(maxX-1, -1, -1):
            order.append((x, 0))
        print_config(maxX, maxY, order)

# odd X, even Y
for maxX in range(3, topX+1, 2):
    for maxY in range(2, topY+1, 2):
        order = []
        for y in range(0, maxY, 2):
            for x in range(1, maxX, 1):
                order.append((x, y))
            for x in range(maxX-1, 0, -1):
                order.append((x, y+1))
        for y in range(maxY-1, -1, -1):
            order.append((0, y))
        print_config(maxX, maxY, order)
# handle 1x2
print_config(1,2,[(0,0), (0,1)]);
print_config(2,1,[(0,0), (1,0)]);
print "initial assert ((x_max_p <= " + str(topX) + ") && (y_max_p <= " + str(topY) +")) else begin $error(\"%m x_max_p %d or y_max_p %d too large; rerun generator with larger size than %d/%d\",x_max_p,y_max_p,"+str(topX)+","+str(topY)+"); $finish(); end "
print "endmodule"
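The orders built above are serpentine space-filling curves, and the stitched ring is only legal if every consecutive pair (including the wrap-around) is a mesh nearest neighbor. A small Python 3 re-statement of the even-X construction with that property checked (the generator itself is Python 2):

```python
def serpentine_order(maxX, maxY):
    # Same even-X construction as in the generator above.
    order = []
    for x in range(0, maxX, 2):
        for y in range(1, maxY):
            order.append((x, y))
        for y in range(maxY - 1, 0, -1):
            order.append((x + 1, y))
    for x in range(maxX - 1, -1, -1):
        order.append((x, 0))
    return order

order = serpentine_order(4, 4)
# Visits every mesh node exactly once...
assert sorted(order) == [(x, y) for x in range(4) for y in range(4)]
# ...and every hop (including the wrap-around) is Manhattan-adjacent.
for (x0, y0), (x1, y1) in zip(order, order[1:] + order[:1]):
    assert abs(x0 - x1) + abs(y0 - y1) == 1
```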
| 36.853211 | 255 | 0.485437 | 611 | 4,017 | 3.011457 | 0.180033 | 0.047826 | 0.022826 | 0.032609 | 0.436957 | 0.302174 | 0.188587 | 0.178804 | 0.139674 | 0.139674 | 0 | 0.043262 | 0.349764 | 4,017 | 108 | 256 | 37.194444 | 0.661179 | 0.066716 | 0 | 0.16 | 0 | 0.066667 | 0.337262 | 0.068578 | 0 | 0 | 0 | 0 | 0.013333 | 0 | null | null | 0 | 0 | null | null | 0.466667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
4360f2944d85e6781331cf09202edce05990a45e | 26,182 | py | Python | boa3/neo/vm/opcode/OpcodeInfo.py | DanPopa46/neo3-boa | e4ef340744b5bd25ade26f847eac50789b97f3e9 | [
"Apache-2.0"
] | null | null | null | boa3/neo/vm/opcode/OpcodeInfo.py | DanPopa46/neo3-boa | e4ef340744b5bd25ade26f847eac50789b97f3e9 | [
"Apache-2.0"
] | null | null | null | boa3/neo/vm/opcode/OpcodeInfo.py | DanPopa46/neo3-boa | e4ef340744b5bd25ade26f847eac50789b97f3e9 | [
"Apache-2.0"
] | null | null | null | from typing import Optional
from boa3 import constants
from boa3.neo.vm.opcode.Opcode import Opcode
from boa3.neo.vm.opcode.OpcodeInformation import OpcodeInformation
class OpcodeInfo:
    @classmethod
    def get_info(cls, opcode: Opcode) -> Optional[OpcodeInformation]:
        """
        Gets the information of the given Neo VM opcode.

        :param opcode: Neo VM opcode
        :return: The opcode info if it exists. None otherwise
        :rtype: OpcodeInformation or None
        """
        for _, op in vars(cls).items():
            if isinstance(op, OpcodeInformation) and op.opcode is opcode:
                return op
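`get_info` avoids a hand-maintained lookup table by scanning the class namespace with `vars(cls)`. The same registry-by-class-attribute pattern in isolation, with hypothetical `Info`/`Registry` names that are not part of this codebase:

```python
from typing import Optional

class Info:
    def __init__(self, code: int):
        self.code = code

class Registry:
    ADD = Info(0x01)
    SUB = Info(0x02)

    @classmethod
    def lookup(cls, code: int) -> Optional[Info]:
        # vars(cls) yields every class attribute; filter to Info instances
        # so methods and dunders are skipped.
        for _, value in vars(cls).items():
            if isinstance(value, Info) and value.code == code:
                return value
        return None

assert Registry.lookup(0x01) is Registry.ADD
assert Registry.lookup(0xFF) is None
```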
# region Constants
PUSHINT8 = OpcodeInformation(Opcode.PUSHINT8, 1)
PUSHINT16 = OpcodeInformation(Opcode.PUSHINT16, 2)
PUSHINT32 = OpcodeInformation(Opcode.PUSHINT32, 4)
PUSHINT64 = OpcodeInformation(Opcode.PUSHINT64, 8)
PUSHINT128 = OpcodeInformation(Opcode.PUSHINT128, 16)
PUSHINT256 = OpcodeInformation(Opcode.PUSHINT256, 32)
# Convert the next four bytes to an address, and push the address onto the stack.
PUSHA = OpcodeInformation(Opcode.PUSHA, 4)
# The item null is pushed onto the stack.
PUSHNULL = OpcodeInformation(Opcode.PUSHNULL)
# The next byte contains the number of bytes to be pushed onto the stack.
PUSHDATA1 = OpcodeInformation(Opcode.PUSHDATA1, 1, constants.ONE_BYTE_MAX_VALUE)
# The next two bytes contain the number of bytes to be pushed onto the stack.
PUSHDATA2 = OpcodeInformation(Opcode.PUSHDATA2, 2, constants.TWO_BYTES_MAX_VALUE)
# The next four bytes contain the number of bytes to be pushed onto the stack.
PUSHDATA4 = OpcodeInformation(Opcode.PUSHDATA4, 4, constants.FOUR_BYTES_MAX_VALUE)
# The number -1 is pushed onto the stack.
PUSHM1 = OpcodeInformation(Opcode.PUSHM1)
# The number 0 is pushed onto the stack.
PUSH0 = OpcodeInformation(Opcode.PUSH0)
# The number 1 is pushed onto the stack.
PUSH1 = OpcodeInformation(Opcode.PUSH1)
# The number 2 is pushed onto the stack.
PUSH2 = OpcodeInformation(Opcode.PUSH2)
# The number 3 is pushed onto the stack.
PUSH3 = OpcodeInformation(Opcode.PUSH3)
# The number 4 is pushed onto the stack.
PUSH4 = OpcodeInformation(Opcode.PUSH4)
# The number 5 is pushed onto the stack.
PUSH5 = OpcodeInformation(Opcode.PUSH5)
# The number 6 is pushed onto the stack.
PUSH6 = OpcodeInformation(Opcode.PUSH6)
# The number 7 is pushed onto the stack.
PUSH7 = OpcodeInformation(Opcode.PUSH7)
# The number 8 is pushed onto the stack.
PUSH8 = OpcodeInformation(Opcode.PUSH8)
# The number 9 is pushed onto the stack.
PUSH9 = OpcodeInformation(Opcode.PUSH9)
# The number 10 is pushed onto the stack.
PUSH10 = OpcodeInformation(Opcode.PUSH10)
# The number 11 is pushed onto the stack.
PUSH11 = OpcodeInformation(Opcode.PUSH11)
# The number 12 is pushed onto the stack.
PUSH12 = OpcodeInformation(Opcode.PUSH12)
# The number 13 is pushed onto the stack.
PUSH13 = OpcodeInformation(Opcode.PUSH13)
# The number 14 is pushed onto the stack.
PUSH14 = OpcodeInformation(Opcode.PUSH14)
# The number 15 is pushed onto the stack.
PUSH15 = OpcodeInformation(Opcode.PUSH15)
# The number 16 is pushed onto the stack.
PUSH16 = OpcodeInformation(Opcode.PUSH16)
# endregion
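The three PUSHDATA opcodes above differ only in how wide the length prefix after the opcode byte is (1, 2, or 4 bytes, matching the max-value constants). A sketch of that encoding rule; the opcode byte values used here are illustrative placeholders, not taken from this codebase:

```python
def encode_pushdata(data: bytes) -> bytes:
    # Placeholder opcode bytes; only the prefix-width logic is the point.
    PUSHDATA1, PUSHDATA2, PUSHDATA4 = b'\x0c', b'\x0d', b'\x0e'
    if len(data) <= 0xFF:                          # 1-byte length prefix
        return PUSHDATA1 + len(data).to_bytes(1, 'little') + data
    if len(data) <= 0xFFFF:                        # 2-byte length prefix
        return PUSHDATA2 + len(data).to_bytes(2, 'little') + data
    return PUSHDATA4 + len(data).to_bytes(4, 'little') + data

assert encode_pushdata(b'abc') == b'\x0c\x03abc'
assert encode_pushdata(b'x' * 300)[:3] == b'\x0d\x2c\x01'  # 300 = 0x012c little-endian
```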
# region Flow control
# The NOP operation does nothing. It is intended to fill in space if opcodes are patched.
NOP = OpcodeInformation(Opcode.NOP)
# Unconditionally transfers control to a target instruction. The target instruction is represented as a 1-byte
# signed offset from the beginning of the current instruction.
JMP = OpcodeInformation(Opcode.JMP, 1)
# Unconditionally transfers control to a target instruction. The target instruction is represented as a 4-bytes
# signed offset from the beginning of the current instruction.
JMP_L = OpcodeInformation(Opcode.JMP_L, 4)
# Transfers control to a target instruction if the value is True, not null, or non-zero. The target instruction
# is represented as a 1-byte signed offset from the beginning of the current instruction.
JMPIF = OpcodeInformation(Opcode.JMPIF, 1, stack_items=1)
# Transfers control to a target instruction if the value is True, not null, or non-zero. The target instruction
# is represented as a 4-bytes signed offset from the beginning of the current instruction.
JMPIF_L = OpcodeInformation(Opcode.JMPIF_L, 4, stack_items=1)
# Transfers control to a target instruction if the value is False, a null reference,
# or zero. The target instruction is represented as a 1-byte signed offset from the beginning of the current
# instruction.
JMPIFNOT = OpcodeInformation(Opcode.JMPIFNOT, 1, stack_items=1)
# Transfers control to a target instruction if the value is False, a null reference,
# or zero. The target instruction is represented as a 4-bytes signed offset from the beginning of the current
# instruction.
JMPIFNOT_L = OpcodeInformation(Opcode.JMPIFNOT_L, 4, stack_items=1)
# Transfers control to a target instruction if two values are equal. The target instruction is represented as a
# 1-byte signed offset from the beginning of the current instruction.
JMPEQ = OpcodeInformation(Opcode.JMPEQ, 1, stack_items=2)
# Transfers control to a target instruction if two values are equal. The target instruction is represented as a
# 4-bytes signed offset from the beginning of the current instruction.
JMPEQ_L = OpcodeInformation(Opcode.JMPEQ_L, 4, stack_items=2)
# Transfers control to a target instruction when two values are not equal. The target instruction is represented
# as a 1-byte signed offset from the beginning of the current instruction.
JMPNE = OpcodeInformation(Opcode.JMPNE, 1, stack_items=2)
# Transfers control to a target instruction when two values are not equal. The target instruction is represented
# as a 4-bytes signed offset from the beginning of the current instruction.
JMPNE_L = OpcodeInformation(Opcode.JMPNE_L, 4, stack_items=2)
# Transfers control to a target instruction if the first value is greater than the second value. The target
# instruction is represented as a 1-byte signed offset from the beginning of the current instruction.
JMPGT = OpcodeInformation(Opcode.JMPGT, 1, stack_items=2)
# Transfers control to a target instruction if the first value is greater than the second value. The target
# instruction is represented as a 4-bytes signed offset from the beginning of the current instruction.
JMPGT_L = OpcodeInformation(Opcode.JMPGT_L, 4, stack_items=2)
# Transfers control to a target instruction if the first value is greater than or equal to the second value. The
# target instruction is represented as a 1-byte signed offset from the beginning of the current instruction.
JMPGE = OpcodeInformation(Opcode.JMPGE, 1, stack_items=2)
# Transfers control to a target instruction if the first value is greater than or equal to the second value. The
# target instruction is represented as a 4-bytes signed offset from the beginning of the current instruction.
JMPGE_L = OpcodeInformation(Opcode.JMPGE_L, 4, stack_items=2)
# Transfers control to a target instruction if the first value is less than the second value. The target
# instruction is represented as a 1-byte signed offset from the beginning of the current instruction.
JMPLT = OpcodeInformation(Opcode.JMPLT, 1, stack_items=2)
# Transfers control to a target instruction if the first value is less than the second value. The target
# instruction is represented as a 4-bytes signed offset from the beginning of the current instruction.
JMPLT_L = OpcodeInformation(Opcode.JMPLT_L, 4, stack_items=2)
# Transfers control to a target instruction if the first value is less than or equal to the second value. The
# target instruction is represented as a 1-byte signed offset from the beginning of the current instruction.
JMPLE = OpcodeInformation(Opcode.JMPLE, 1, stack_items=2)
# Transfers control to a target instruction if the first value is less than or equal to the second value. The
# target instruction is represented as a 4-bytes signed offset from the beginning of the current instruction.
JMPLE_L = OpcodeInformation(Opcode.JMPLE_L, 4, stack_items=2)
# Calls the function at the target address which is represented as a 1-byte signed offset from the beginning of
# the current instruction.
CALL = OpcodeInformation(Opcode.CALL, 1)
# Calls the function at the target address which is represented as a 4-bytes signed offset from the beginning of
# the current instruction.
CALL_L = OpcodeInformation(Opcode.CALL_L, 4)
# Pop the address of a function from the stack, and call the function.
CALLA = OpcodeInformation(Opcode.CALLA)
# Calls the function which is described by the token.
CALLT = OpcodeInformation(Opcode.CALLT, 2)
# It turns the vm state to FAULT immediately, and cannot be caught.
ABORT = OpcodeInformation(Opcode.ABORT)
# Pop the top value of the stack, if it false, then exit vm execution and set vm state to FAULT.
ASSERT = OpcodeInformation(Opcode.ASSERT)
# Pop the top value of the stack, and throw it.
THROW = OpcodeInformation(Opcode.THROW)
# TRY CatchOffset(sbyte) FinallyOffset(sbyte). If there's no catch body, set CatchOffset 0. If there's no finally
# body, set FinallyOffset 0.
TRY = OpcodeInformation(Opcode.TRY, 2)
# TRY_L CatchOffset(int) FinallyOffset(int). If there's no catch body, set CatchOffset 0. If there's no finally
# body, set FinallyOffset 0.
TRY_L = OpcodeInformation(Opcode.TRY_L, 8)
# Ensures that the appropriate surrounding finally blocks are executed. And then unconditionally transfers
# control to the specific target instruction, represented as a 1-byte signed offset from the beginning of the
# current instruction.
ENDTRY = OpcodeInformation(Opcode.ENDTRY, 1)
# Ensures that the appropriate surrounding finally blocks are executed. And then unconditionally transfers
# control to the specific target instruction, represented as a 4-byte signed offset from the beginning of the
# current instruction.
ENDTRY_L = OpcodeInformation(Opcode.ENDTRY_L, 4)
# End finally, If no exception happen or be catched, vm will jump to the target instruction of ENDTRY/ENDTRY_L.
# Otherwise vm will rethrow the exception to upper layer.
ENDFINALLY = OpcodeInformation(Opcode.ENDFINALLY)
# Returns from the current method.
RET = OpcodeInformation(Opcode.RET)
# Calls to an interop service.
SYSCALL = OpcodeInformation(Opcode.SYSCALL, min_data_len=4)
# endregion
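All of the JMP*/CALL variants above carry their target as a signed offset (1 byte for the short forms, 4 bytes for the `_L` forms) measured from the start of the instruction. A sketch of encoding and decoding such offsets, assuming two's-complement little-endian encoding:

```python
def encode_offset(offset: int, size: int) -> bytes:
    # size is 1 for e.g. JMP, 4 for JMP_L; offsets are signed two's complement.
    return offset.to_bytes(size, 'little', signed=True)

def decode_offset(raw: bytes) -> int:
    return int.from_bytes(raw, 'little', signed=True)

assert encode_offset(-5, 1) == b'\xfb'               # backward jump, short form
assert decode_offset(encode_offset(-5, 1)) == -5
assert decode_offset(encode_offset(70000, 4)) == 70000  # needs the _L form
```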
# region Stack
# Puts the number of stack items onto the stack.
DEPTH = OpcodeInformation(Opcode.DEPTH)
# Removes the top stack item.
DROP = OpcodeInformation(Opcode.DROP)
# Removes the second-to-top stack item.
NIP = OpcodeInformation(Opcode.NIP)
# The item n back in the main stack is removed.
XDROP = OpcodeInformation(Opcode.XDROP, stack_items=1)
# Clear the stack
CLEAR = OpcodeInformation(Opcode.CLEAR)
# Duplicates the top stack item.
DUP = OpcodeInformation(Opcode.DUP)
# Copies the second-to-top stack item to the top.
OVER = OpcodeInformation(Opcode.OVER)
# The item n back in the stack is copied to the top.
PICK = OpcodeInformation(Opcode.PICK, stack_items=1)
# The item at the top of the stack is copied and inserted before the second-to-top item.
TUCK = OpcodeInformation(Opcode.TUCK)
# The top two items on the stack are swapped.
SWAP = OpcodeInformation(Opcode.SWAP)
# The top three items on the stack are rotated to the left.
ROT = OpcodeInformation(Opcode.ROT)
# The item n back in the stack is moved to the top.
ROLL = OpcodeInformation(Opcode.ROLL)
# Reverse the order of the top 3 items on the stack.
REVERSE3 = OpcodeInformation(Opcode.REVERSE3)
# Reverse the order of the top 4 items on the stack.
REVERSE4 = OpcodeInformation(Opcode.REVERSE4)
# Pop the number N on the stack, and reverse the order of the top N items on the stack.
REVERSEN = OpcodeInformation(Opcode.REVERSEN, stack_items=1)
# endregion
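The stack-shuffling opcodes above are easiest to see on a plain Python list with the top of the stack at the end. A sketch of DUP, SWAP, ROT and REVERSEN under that convention:

```python
def dup(s):  s.append(s[-1])                 # duplicate top item
def swap(s): s[-1], s[-2] = s[-2], s[-1]     # swap top two items
def rot(s):  s.append(s.pop(-3))             # rotate top three to the left
def reversen(s):
    n = s.pop()                              # REVERSEN pops N first
    s[-n:] = reversed(s[-n:])                # then reverses the top N items

s = [1, 2, 3]
rot(s)
assert s == [2, 3, 1]
swap(s)
assert s == [2, 1, 3]
dup(s)
assert s == [2, 1, 3, 3]

s = [1, 2, 3, 4, 3]                          # N = 3 sits on top
reversen(s)
assert s == [1, 4, 3, 2]
```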
# region Slot
# Initialize the static field list for the current execution context.
INITSSLOT = OpcodeInformation(Opcode.INITSSLOT, 1)
# Initialize the argument slot and the local variable list for the current execution context.
INITSLOT = OpcodeInformation(Opcode.INITSLOT, 2)
# Loads the static field at index 0 onto the evaluation stack.
LDSFLD0 = OpcodeInformation(Opcode.LDSFLD0)
# Loads the static field at index 1 onto the evaluation stack.
LDSFLD1 = OpcodeInformation(Opcode.LDSFLD1)
# Loads the static field at index 2 onto the evaluation stack.
LDSFLD2 = OpcodeInformation(Opcode.LDSFLD2)
# Loads the static field at index 3 onto the evaluation stack.
LDSFLD3 = OpcodeInformation(Opcode.LDSFLD3)
# Loads the static field at index 4 onto the evaluation stack.
LDSFLD4 = OpcodeInformation(Opcode.LDSFLD4)
# Loads the static field at index 5 onto the evaluation stack.
LDSFLD5 = OpcodeInformation(Opcode.LDSFLD5)
# Loads the static field at index 6 onto the evaluation stack.
LDSFLD6 = OpcodeInformation(Opcode.LDSFLD6)
# Loads the static field at a specified index onto the evaluation stack. The index is represented as a 1-byte
# unsigned integer.
LDSFLD = OpcodeInformation(Opcode.LDSFLD, 1)
# Stores the value on top of the evaluation stack in the static field list at index 0.
STSFLD0 = OpcodeInformation(Opcode.STSFLD0)
# Stores the value on top of the evaluation stack in the static field list at index 1.
STSFLD1 = OpcodeInformation(Opcode.STSFLD1)
# Stores the value on top of the evaluation stack in the static field list at index 2.
STSFLD2 = OpcodeInformation(Opcode.STSFLD2)
# Stores the value on top of the evaluation stack in the static field list at index 3.
STSFLD3 = OpcodeInformation(Opcode.STSFLD3)
# Stores the value on top of the evaluation stack in the static field list at index 4.
STSFLD4 = OpcodeInformation(Opcode.STSFLD4)
# Stores the value on top of the evaluation stack in the static field list at index 5.
STSFLD5 = OpcodeInformation(Opcode.STSFLD5)
# Stores the value on top of the evaluation stack in the static field list at index 6.
STSFLD6 = OpcodeInformation(Opcode.STSFLD6)
# Stores the value on top of the evaluation stack in the static field list at a specified index. The index is
# represented as a 1-byte unsigned integer.
STSFLD = OpcodeInformation(Opcode.STSFLD, 1)
# Loads the local variable at index 0 onto the evaluation stack.
LDLOC0 = OpcodeInformation(Opcode.LDLOC0)
# Loads the local variable at index 1 onto the evaluation stack.
LDLOC1 = OpcodeInformation(Opcode.LDLOC1)
# Loads the local variable at index 2 onto the evaluation stack.
LDLOC2 = OpcodeInformation(Opcode.LDLOC2)
# Loads the local variable at index 3 onto the evaluation stack.
LDLOC3 = OpcodeInformation(Opcode.LDLOC3)
# Loads the local variable at index 4 onto the evaluation stack.
LDLOC4 = OpcodeInformation(Opcode.LDLOC4)
# Loads the local variable at index 5 onto the evaluation stack.
LDLOC5 = OpcodeInformation(Opcode.LDLOC5)
# Loads the local variable at index 6 onto the evaluation stack.
LDLOC6 = OpcodeInformation(Opcode.LDLOC6)
# Loads the local variable at a specified index onto the evaluation stack. The index is represented as a 1-byte
# unsigned integer.
LDLOC = OpcodeInformation(Opcode.LDLOC, 1)
# Stores the value on top of the evaluation stack in the local variable list at index 0.
STLOC0 = OpcodeInformation(Opcode.STLOC0)
# Stores the value on top of the evaluation stack in the local variable list at index 1.
STLOC1 = OpcodeInformation(Opcode.STLOC1)
# Stores the value on top of the evaluation stack in the local variable list at index 2.
STLOC2 = OpcodeInformation(Opcode.STLOC2)
# Stores the value on top of the evaluation stack in the local variable list at index 3.
STLOC3 = OpcodeInformation(Opcode.STLOC3)
# Stores the value on top of the evaluation stack in the local variable list at index 4.
STLOC4 = OpcodeInformation(Opcode.STLOC4)
# Stores the value on top of the evaluation stack in the local variable list at index 5.
STLOC5 = OpcodeInformation(Opcode.STLOC5)
# Stores the value on top of the evaluation stack in the local variable list at index 6.
STLOC6 = OpcodeInformation(Opcode.STLOC6)
# Stores the value on top of the evaluation stack in the local variable list at a specified index. The index is
# represented as a 1-byte unsigned integer.
STLOC = OpcodeInformation(Opcode.STLOC, 1)
# Loads the argument at index 0 onto the evaluation stack.
LDARG0 = OpcodeInformation(Opcode.LDARG0)
# Loads the argument at index 1 onto the evaluation stack.
LDARG1 = OpcodeInformation(Opcode.LDARG1)
# Loads the argument at index 2 onto the evaluation stack.
LDARG2 = OpcodeInformation(Opcode.LDARG2)
# Loads the argument at index 3 onto the evaluation stack.
LDARG3 = OpcodeInformation(Opcode.LDARG3)
# Loads the argument at index 4 onto the evaluation stack.
LDARG4 = OpcodeInformation(Opcode.LDARG4)
# Loads the argument at index 5 onto the evaluation stack.
LDARG5 = OpcodeInformation(Opcode.LDARG5)
# Loads the argument at index 6 onto the evaluation stack.
LDARG6 = OpcodeInformation(Opcode.LDARG6)
# Loads the argument at a specified index onto the evaluation stack. The index is represented as a 1-byte
# unsigned integer.
LDARG = OpcodeInformation(Opcode.LDARG, 1)
# Stores the value on top of the evaluation stack in the argument slot at index 0.
STARG0 = OpcodeInformation(Opcode.STARG0)
# Stores the value on top of the evaluation stack in the argument slot at index 1.
STARG1 = OpcodeInformation(Opcode.STARG1)
# Stores the value on top of the evaluation stack in the argument slot at index 2.
STARG2 = OpcodeInformation(Opcode.STARG2)
# Stores the value on top of the evaluation stack in the argument slot at index 3.
STARG3 = OpcodeInformation(Opcode.STARG3)
# Stores the value on top of the evaluation stack in the argument slot at index 4.
STARG4 = OpcodeInformation(Opcode.STARG4)
# Stores the value on top of the evaluation stack in the argument slot at index 5.
STARG5 = OpcodeInformation(Opcode.STARG5)
# Stores the value on top of the evaluation stack in the argument slot at index 6.
STARG6 = OpcodeInformation(Opcode.STARG6)
# Stores the value on top of the evaluation stack in the argument slot at a specified index. The index is
# represented as a 1-byte unsigned integer.
STARG = OpcodeInformation(Opcode.STARG, 1)
# endregion
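The slot opcodes above read and write per-context lists: static fields, local variables, and arguments. A tiny model of what INITSLOT followed by STLOC/LDLOC/LDARG does to those lists (plain Python lists, not the VM implementation):

```python
def initslot(n_locals, call_args):
    """Model of INITSLOT: allocate the local slots and bind the call arguments."""
    return [None] * n_locals, list(call_args)

local_vars, arguments = initslot(2, ['a', 'b'])
local_vars[0] = 42               # STLOC0: store top of stack in local slot 0
assert local_vars[0] == 42       # LDLOC0: load local slot 0 back
assert arguments[1] == 'b'       # LDARG1: load argument slot 1
```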
# region Splice
NEWBUFFER = OpcodeInformation(Opcode.NEWBUFFER)
MEMCPY = OpcodeInformation(Opcode.MEMCPY)
# Concatenates two strings.
CAT = OpcodeInformation(Opcode.CAT)
# Returns a section of a string.
SUBSTR = OpcodeInformation(Opcode.SUBSTR)
# Keeps only characters left of the specified point in a string.
LEFT = OpcodeInformation(Opcode.LEFT)
# Keeps only characters right of the specified point in a string.
RIGHT = OpcodeInformation(Opcode.RIGHT)
# endregion
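CAT/SUBSTR/LEFT/RIGHT are plain byte-string operations. Their semantics, as described in the comments above, map directly onto Python slicing (a sketch, not the VM code):

```python
def cat(a: bytes, b: bytes) -> bytes: return a + b
def substr(s: bytes, i: int, n: int) -> bytes: return s[i:i + n]
def left(s: bytes, n: int) -> bytes: return s[:n]
def right(s: bytes, n: int) -> bytes: return s[len(s) - n:]

assert cat(b'ab', b'cd') == b'abcd'
assert substr(b'abcdef', 1, 3) == b'bcd'
assert left(b'abcdef', 2) == b'ab'
assert right(b'abcdef', 2) == b'ef'
```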
# region Bitwise logic
# Flips all of the bits in the input.
INVERT = OpcodeInformation(Opcode.INVERT)
# Boolean and between each bit in the inputs.
AND = OpcodeInformation(Opcode.AND)
# Boolean or between each bit in the inputs.
OR = OpcodeInformation(Opcode.OR)
# Boolean exclusive or between each bit in the inputs.
XOR = OpcodeInformation(Opcode.XOR)
# Returns 1 if the inputs are exactly equal, 0 otherwise.
EQUAL = OpcodeInformation(Opcode.EQUAL)
# Returns 1 if the inputs are not equal, 0 otherwise.
NOTEQUAL = OpcodeInformation(Opcode.NOTEQUAL)
# endregion
# region Arithmetic
# Puts the sign of top stack item on top of the main stack. If value is negative, put -1; if positive,
# put 1; if value is zero, put 0.
SIGN = OpcodeInformation(Opcode.SIGN)
# The input is made positive.
ABS = OpcodeInformation(Opcode.ABS)
# The sign of the input is flipped.
NEGATE = OpcodeInformation(Opcode.NEGATE)
# 1 is added to the input.
INC = OpcodeInformation(Opcode.INC)
# 1 is subtracted from the input.
DEC = OpcodeInformation(Opcode.DEC)
# a is added to b.
ADD = OpcodeInformation(Opcode.ADD)
# b is subtracted from a.
SUB = OpcodeInformation(Opcode.SUB)
# a is multiplied by b.
MUL = OpcodeInformation(Opcode.MUL)
# a is divided by b.
DIV = OpcodeInformation(Opcode.DIV)
# Returns the remainder after dividing a by b.
MOD = OpcodeInformation(Opcode.MOD)
# The result of raising value to the exponent power.
POW = OpcodeInformation(Opcode.POW)
# Returns the square root of a specified number.
SQRT = OpcodeInformation(Opcode.SQRT)
# Shifts a left b bits, preserving sign.
SHL = OpcodeInformation(Opcode.SHL)
# Shifts a right b bits, preserving sign.
SHR = OpcodeInformation(Opcode.SHR)
# If the input is 0 or 1, it is flipped. Otherwise the output will be 0.
NOT = OpcodeInformation(Opcode.NOT)
# If both a and b are not 0, the output is 1. Otherwise 0.
BOOLAND = OpcodeInformation(Opcode.BOOLAND)
# If a or b is not 0, the output is 1. Otherwise 0.
BOOLOR = OpcodeInformation(Opcode.BOOLOR)
# Returns 0 if the input is 0. 1 otherwise.
NZ = OpcodeInformation(Opcode.NZ)
# Returns 1 if the numbers are equal, 0 otherwise.
NUMEQUAL = OpcodeInformation(Opcode.NUMEQUAL)
# Returns 1 if the numbers are not equal, 0 otherwise.
NUMNOTEQUAL = OpcodeInformation(Opcode.NUMNOTEQUAL)
# Returns 1 if a is less than b, 0 otherwise.
LT = OpcodeInformation(Opcode.LT)
# Returns 1 if a is less than or equal to b, 0 otherwise.
LE = OpcodeInformation(Opcode.LE)
# Returns 1 if a is greater than b, 0 otherwise.
GT = OpcodeInformation(Opcode.GT)
# Returns 1 if a is greater than or equal to b, 0 otherwise.
GE = OpcodeInformation(Opcode.GE)
# Returns the smaller of a and b.
MIN = OpcodeInformation(Opcode.MIN)
# Returns the larger of a and b.
MAX = OpcodeInformation(Opcode.MAX)
# Returns 1 if x is within the specified range (left-inclusive), 0 otherwise.
WITHIN = OpcodeInformation(Opcode.WITHIN)
# endregion
# region Compound-type
# A value n is taken from top of main stack. The next n items on main stack are removed, put inside n-sized array
# and this array is put on top of the main stack.
PACK = OpcodeInformation(Opcode.PACK)
# An array is removed from top of the main stack. Its elements are put on top of the main stack (in reverse
# order) and the array size is also put on main stack.
UNPACK = OpcodeInformation(Opcode.UNPACK)
# An empty array (with size 0) is put on top of the main stack.
NEWARRAY0 = OpcodeInformation(Opcode.NEWARRAY0)
# A value n is taken from top of main stack. A null-filled array with size n is put on top of the main stack.
NEWARRAY = OpcodeInformation(Opcode.NEWARRAY)
# A value n is taken from top of main stack. An array of type T with size n is put on top of the main stack.
NEWARRAY_T = OpcodeInformation(Opcode.NEWARRAY_T, 1)
# An empty struct (with size 0) is put on top of the main stack.
NEWSTRUCT0 = OpcodeInformation(Opcode.NEWSTRUCT0)
# A value n is taken from top of main stack. A zero-filled struct with size n is put on top of the main stack.
NEWSTRUCT = OpcodeInformation(Opcode.NEWSTRUCT)
# A Map is created and put on top of the main stack.
NEWMAP = OpcodeInformation(Opcode.NEWMAP)
# An array is removed from top of the main stack. Its size is put on top of the main stack.
SIZE = OpcodeInformation(Opcode.SIZE)
# An input index n (or key) and an array (or map) are removed from the top of the main stack. Puts True on top of
# main stack if array[n] (or map[n]) exists, and False otherwise.
HASKEY = OpcodeInformation(Opcode.HASKEY)
# A map is taken from top of the main stack. The keys of this map are put on top of the main stack.
KEYS = OpcodeInformation(Opcode.KEYS)
# A map is taken from top of the main stack. The values of this map are put on top of the main stack.
VALUES = OpcodeInformation(Opcode.VALUES)
# An input index n (or key) and an array (or map) are taken from main stack. Element array[n] (or map[n]) is put
# on top of the main stack.
PICKITEM = OpcodeInformation(Opcode.PICKITEM)
# The item on top of main stack is removed and appended to the second item on top of the main stack.
APPEND = OpcodeInformation(Opcode.APPEND)
# A value v, index n (or key) and an array (or map) are taken from main stack. The assignment array[n]=v
# (or map[n]=v) is performed.
SETITEM = OpcodeInformation(Opcode.SETITEM)
# An array is removed from the top of the main stack and its elements are reversed.
REVERSEITEMS = OpcodeInformation(Opcode.REVERSEITEMS)
# An input index n (or key) and an array (or map) are removed from the top of the main stack. Element array[n]
# (or map[n]) is removed.
REMOVE = OpcodeInformation(Opcode.REMOVE)
# Remove all the items from the compound-type.
CLEARITEMS = OpcodeInformation(Opcode.CLEARITEMS)
# endregion
# region Types
# Returns true if the input is null. Returns false otherwise.
ISNULL = OpcodeInformation(Opcode.ISNULL)
# Returns true if the top item is of the specified type.
ISTYPE = OpcodeInformation(Opcode.ISTYPE, 1)
# Converts the top item to the specified type.
CONVERT = OpcodeInformation(Opcode.CONVERT, 1)
# endregion
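The comparison opcodes above reduce to simple predicates; for instance, WITHIN's left-inclusive range check can be sketched as a standalone Python helper (an illustration, not part of the opcode table):

```python
def within(x: int, a: int, b: int) -> int:
    """Mirror of the WITHIN opcode: 1 if a <= x < b (left-inclusive), else 0."""
    return 1 if a <= x < b else 0

print(within(5, 5, 10))   # left bound is included -> 1
print(within(10, 5, 10))  # right bound is excluded -> 0
```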
# --- utils/merge_json.py (mirfan899/rasa-qa-bot, MIT) ---
import json
with open("english_questions_answers.json", encoding="utf-8") as src:
    questions = json.load(src)
with open("data/nlu/qa.json", encoding="utf-8") as dst:
    feedjson = json.load(dst)
feedjson.extend(questions)
with open("data/nlu/qa.json", "w", encoding="utf-8") as out:
    json.dump(feedjson, out, indent=4)
# --- src/core/migrations/0076_merge_20190802_1454.py (metabolism-of-cities/ARCHIVED-metabolism-of-cities-platform-v3, MIT) ---
# Generated by Django 2.1.2 on 2019-08-02 14:54
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('core', '0075_auto_20190802_1420'),
('core', '0072_merge_20190802_1405'),
]
operations = [
]
# --- gfg/arrays/smallest_missing_number.py (rrwt/daily-coding-challenge, MIT) ---
"""
Given a sorted array of n distinct integers where each integer is in the range
from 0 to m-1 and m > n. Find the smallest number that is missing from the array.
"""
def smallest_missing(arr: list) -> int:
"""
A modified version of binary search can be applied.
If the number mid index == value, then the missing value
is on the right, else on the left side.
Time Complexity: O(log(n))
"""
beg, end = 0, len(arr) - 1
while beg <= end:
if arr[beg] != beg:
return beg
m = (beg + end) >> 1
if arr[m] == m:
beg = m + 1
else:
end = m - 1
return end + 1
if __name__ == "__main__":
print(smallest_missing([]))
print(smallest_missing([0]))
print(smallest_missing([0, 1, 2, 3, 4, 5, 6, 7, 9, 10]))
print(smallest_missing([0, 2, 3, 4, 5, 6, 7, 8, 9, 10]))
print(smallest_missing([0, 1, 3, 4, 5, 6, 7, 8, 9, 10]))
print(smallest_missing([0, 1, 2, 3, 5, 6, 7, 8, 9, 10]))
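The binary-search version can be cross-checked against a straightforward O(n) scan; a self-contained sketch (the binary-search function is reproduced from above so the block runs on its own):

```python
def smallest_missing(arr: list) -> int:
    # binary search: if arr[m] == m the missing value is to the right
    beg, end = 0, len(arr) - 1
    while beg <= end:
        if arr[beg] != beg:
            return beg
        m = (beg + end) >> 1
        if arr[m] == m:
            beg = m + 1
        else:
            end = m - 1
    return end + 1

def smallest_missing_linear(arr: list) -> int:
    # O(n) reference: first index whose value differs from the index
    for i, v in enumerate(arr):
        if v != i:
            return i
    return len(arr)

cases = [[], [0], [0, 1, 2, 4], [1, 2, 3], [0, 1, 2, 3]]
assert all(smallest_missing(c) == smallest_missing_linear(c) for c in cases)
```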
# --- syllabot/src/bot.py (Hunter-Open-Source-Club/syllabase, MIT) ---
import asyncio
import logging
import socket
import warnings
import os
import discord
from discord.ext import commands
class Bot(commands.Bot):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
async def on_connect(self):
print('[LOGS] Connecting to discord!')
async def on_disconnect(self):
print('[LOGS] Bot disconnected.')
async def on_ready(self):
print('[LOGS] We have logged in as {0.user}'.format(self))
def add_cogs(self, directory):
print('[LOGS] Adding cogs from {0} directory'.format(directory))
for file in os.listdir(directory):
if file.endswith(".py"):
name = file[:-3]
# TODO: MAKE THE LOAD_EXTENSION TO BE MORE ROBUST
self.load_extension(f"src.cogs.{name}")
print('[LOGS] Finished loading cogs from directory')
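The file-scanning rule inside add_cogs can be exercised without discord installed; a minimal sketch (cog_module_names is a hypothetical helper, not part of the class):

```python
import os
import tempfile

def cog_module_names(directory):
    # same rule as Bot.add_cogs: every .py file maps to the extension "src.cogs.<name>"
    return sorted(f"src.cogs.{f[:-3]}" for f in os.listdir(directory) if f.endswith(".py"))

with tempfile.TemporaryDirectory() as d:
    for fname in ("greet.py", "admin.py", "notes.txt"):
        open(os.path.join(d, fname), "w").close()
    print(cog_module_names(d))  # ['src.cogs.admin', 'src.cogs.greet']
```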
# --- pybank/main.py (Damola-A/Python-challenge, ADSL) ---
import os
import csv
csvpath = os.path.join('Resources', 'budget_data.csv')
date_count = 0
total_profit_loss = 0
total_profit_loss_change = 0
previous_profit_loss = 867884  # hard-coded seed equal to the first row's profit/loss, so the first computed change is zero
average_profit_loss_change = 0
profit_loss_change_list = []
profit_loss_date_list = []
with open(csvpath) as csvfile:
csvreader = csv.reader(csvfile, delimiter=',')
header = next(csvreader)
for row in csvreader:
date_count = date_count + 1
total_profit_loss = float(row[1]) + total_profit_loss
profit_loss_change = float(row[1]) - previous_profit_loss
previous_profit_loss = float(row[1])
date_change = row[0]
profit_loss_change_list.append(profit_loss_change)
profit_loss_date_list.append(date_change)
total_profit_loss_change = sum(profit_loss_change_list)
min_profit_loss = min(profit_loss_change_list)
max_profit_loss = max(profit_loss_change_list)
min_date = profit_loss_date_list[profit_loss_change_list.index(min_profit_loss)]
max_date = profit_loss_date_list[profit_loss_change_list.index(max_profit_loss)]
average_profit_loss_change = round(total_profit_loss_change / (date_count - 1), 2)
print("Financial Analysis")
print("----------------------------------")
print(f"Total Months: {date_count}")
print(f"Total: ${int(total_profit_loss)}")
print(f"Average Change: ${average_profit_loss_change}")
print(f"Greatest Increase in Profits: {max_date} ${int(max_profit_loss)}")
print(f"Greatest Decrease in Profits: {min_date} ${int(min_profit_loss)}")
output_file = os.path.join('Analysis', 'budget_data.txt')
with open(output_file, "w") as writer:
    writer.write("Financial Analysis\n")
    writer.write("----------------------------------\n")
    writer.write(f"Total Months: {date_count}\n")
    writer.write(f"Total: ${int(total_profit_loss)}\n")
    writer.write(f"Average Change: ${average_profit_loss_change}\n")
    writer.write(f"Greatest Increase in Profits: {max_date} (${int(max_profit_loss)})\n")
    writer.write(f"Greatest Decrease in Profits: {min_date} (${int(min_profit_loss)})\n")
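The per-month change bookkeeping above can be condensed by zipping adjacent rows; a small self-contained sketch on made-up rows (not the repo's budget_data.csv):

```python
rows = [("Jan-2010", 867884.0), ("Feb-2010", 984655.0), ("Mar-2010", 322013.0)]
# change for each month = this month's profit/loss minus the previous month's
changes = [b - a for (_, a), (_, b) in zip(rows, rows[1:])]
# the average is taken over (number of rows - 1) month-to-month transitions
average_change = round(sum(changes) / (len(rows) - 1), 2)
print(changes)         # [116771.0, -662642.0]
print(average_change)  # -272935.5
```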
# --- PythonHelloWorld/HelloWorldFlask/hello_world.py (davidleonm/python-hello-world, MIT) ---
from flask import Flask
class HelloWorldFlask(object):
app = None
def __init__(self, name):
self.app = Flask(name)
self.app.add_url_rule('/', 'index', self.forbidden)
self.app.add_url_rule('/helloworld', 'helloworld', self.helloworld)
def run(self):
self.app.run(host='0.0.0.0', port=9999)
@staticmethod
def helloworld():
return 'Hello World!'
@staticmethod
def forbidden():
return 'Forbidden site!'
def testing_client(self):
self.app.testing = True
return self.app.test_client()
if __name__ == '__main__':
flask_app = HelloWorldFlask('helloworld')
flask_app.run()
# --- work6.py (panwuying/home-work2, Apache-2.0) ---
name = input("enter employee's name>>")
hours = float(input("enter hours worked in a week>>"))
rate = float(input("enter hourly rate>>"))
gross = hours * rate
federal_tax = gross * 0.2
state_tax = gross * 0.09
print(name)
print(hours)
print(gross)
print(federal_tax)
print(state_tax)
print(federal_tax + state_tax)
print(gross - federal_tax - state_tax)
# --- release/stubs.min/System/Security/AccessControl_parts/FileSystemAuditRule.py (htlcnn/ironpython-stubs, MIT) ---
class FileSystemAuditRule(AuditRule):
"""
Represents an abstraction of an access control entry (ACE) that defines an audit rule for a file or directory. This class cannot be inherited.
FileSystemAuditRule(identity: IdentityReference,fileSystemRights: FileSystemRights,flags: AuditFlags)
FileSystemAuditRule(identity: IdentityReference,fileSystemRights: FileSystemRights,inheritanceFlags: InheritanceFlags,propagationFlags: PropagationFlags,flags: AuditFlags)
FileSystemAuditRule(identity: str,fileSystemRights: FileSystemRights,flags: AuditFlags)
FileSystemAuditRule(identity: str,fileSystemRights: FileSystemRights,inheritanceFlags: InheritanceFlags,propagationFlags: PropagationFlags,flags: AuditFlags)
"""
@staticmethod
def __new__(self,identity,fileSystemRights,*__args):
"""
__new__(cls: type,identity: IdentityReference,fileSystemRights: FileSystemRights,flags: AuditFlags)
__new__(cls: type,identity: IdentityReference,fileSystemRights: FileSystemRights,inheritanceFlags: InheritanceFlags,propagationFlags: PropagationFlags,flags: AuditFlags)
__new__(cls: type,identity: str,fileSystemRights: FileSystemRights,flags: AuditFlags)
__new__(cls: type,identity: str,fileSystemRights: FileSystemRights,inheritanceFlags: InheritanceFlags,propagationFlags: PropagationFlags,flags: AuditFlags)
"""
pass
AccessMask=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the access mask for this rule.
"""
FileSystemRights=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the System.Security.AccessControl.FileSystemRights flags associated with the current System.Security.AccessControl.FileSystemAuditRule object.
Get: FileSystemRights(self: FileSystemAuditRule) -> FileSystemRights
"""
# --- datacatalog/settings/__init__.py (SD2E/python-datacatalog, CNRI-Python) ---
import os
# from funcy import distinct, remove
from .helpers import fix_assets_path, array_from_string, parse_boolean, int_or_none, set_from_string
from .organization import *
from .callbacks import *
from .crypt import *
from .debug import *
from .identifiers import *
from .jsonschema import *
from .mongo import *
def all_settings():
from types import ModuleType
settings = {}
    for name, item in globals().items():
if not callable(item) and not name.startswith("__") and not isinstance(item, ModuleType):
settings[name] = item
return settings
# TODO - Add code and a build target that walks settings to generate an example Dockerfile w envs
# TypedUUID
# UUID_NAMESPACE = uuid3(NAMESPACE_DNS, DNS_DOMAIN)
# Managed storage defaults
STORAGE_SYSTEM = os.environ.get(
'CATALOG_STORAGE_SYSTEM', TACC_PRIMARY_STORAGE_SYSTEM)
STORAGE_MANAGED_VERSION = os.environ.get(
'CATALOG_STORAGE_MANAGED_VERSION', 'v2')
TAPIS_ANY_AUTHENTICATED_USERNAME = 'world'
TAPIS_UNAUTHENTICATED_USERNAME = 'public'
# Managed stores
UPLOADS_ROOT = os.environ.get('CATALOG_UPLOADS_ROOT', 'uploads')
UPLOADS_SYSTEM = os.environ.get('CATALOG_UPLOADS_SYSTEM', STORAGE_SYSTEM)
UPLOADS_VERSION = os.environ.get('CATALOG_UPLOADS_VERSION', STORAGE_MANAGED_VERSION)
PRODUCTS_ROOT = os.environ.get('CATALOG_PRODUCTS_ROOT', 'products')
PRODUCTS_SYSTEM = os.environ.get('CATALOG_PRODUCTS_SYSTEM', STORAGE_SYSTEM)
PRODUCTS_VERSION = os.environ.get('CATALOG_PRODUCTS_VERSION', STORAGE_MANAGED_VERSION)
REFERENCES_ROOT = os.environ.get('CATALOG_REFERENCES_ROOT', 'references')
REFERENCES_SYSTEM = os.environ.get('CATALOG_PREFERENCES_SYSTEM', STORAGE_SYSTEM)
REFERENCES_VERSION = os.environ.get('CATALOG_REFERENCES_VERSION', STORAGE_MANAGED_VERSION)
# Path naming preferences
UNICODE_PATHS = parse_boolean(os.environ.get('CATALOG_UNICODE_PATHS', '0'))
# Contents of record._admin.source
RECORDS_SOURCE = os.environ.get('CATALOG_RECORDS_SOURCE', 'testing')
# Prefix for minting file.id strings
FILE_ID_PREFIX = os.environ.get('CATALOG_FILE_ID_PREFIX', 'file.tacc') + '.'
LOG_LEVEL = os.environ.get('CATALOG_LOG_LEVEL', 'NOTSET')
LOG_FIXITY_ERRORS = parse_boolean(os.environ.get('CATALOG_LOG_FIXITY_ERRORS', '0'))
LOG_UPDATES = parse_boolean(os.environ.get('CATALOG_LOG_UPDATES', '1'))
LOG_VERBOSE = parse_boolean(os.environ.get(
'CATALOG_LOG_VERBOSE', '0'))
# Maximum number of
MAX_INDEX_FILTERS = int(os.environ.get('CATALOG_MAX_INDEX_FILTERS', '100'))
MAX_INDEX_PATTERNS = int(os.environ.get('CATALOG_MAX_INDEX_PATTERNS', '512'))
# Python JSONschema Objects Cache
PJS_CACHE_MAX_AGE = int(os.environ.get('CATALOG_PJS_CACHE_MAX_AGE', '3600'))
PJS_CACHE_DIR = os.environ.get('CATALOG_PJS_CACHE_DIR', None)
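parse_boolean comes from .helpers and is not shown here; a plausible stand-in (an assumption, not the package's actual implementation) illustrates how the env-var flags above are interpreted:

```python
import os

def parse_boolean(value):
    # hypothetical equivalent of datacatalog's helpers.parse_boolean:
    # treat empty/zero/negative words as False, anything else as True
    return str(value).strip().lower() not in ("", "0", "false", "no", "none")

os.environ["CATALOG_LOG_UPDATES"] = "1"
LOG_UPDATES = parse_boolean(os.environ.get("CATALOG_LOG_UPDATES", "1"))
print(LOG_UPDATES)  # True
```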
# --- utime/errors/__init__.py (amiyapatanaik/U-Time, MIT) ---
"""
Small collection of custom objects.
Sometimes used for their custom names only.
"""
class CouldNotLoadError(ResourceWarning):
def __init__(self, *args, study_id=None):
super(CouldNotLoadError, self).__init__(*args)
self.study_id = study_id
class ChannelNotFoundError(CouldNotLoadError):
def __init__(self, *args, **kwargs):
super(ChannelNotFoundError, self).__init__(*args, **kwargs)
class H5ChannelRootError(KeyError): pass
class H5VariableAttributesError(ValueError): pass
class MissingHeaderFieldError(KeyError): pass
class HeaderFieldTypeError(TypeError): pass
class LengthZeroSignalError(ValueError): pass
class NotLoadedError(ResourceWarning): pass
class StripError(RuntimeError): pass
class MarginError(ValueError):
def __init__(self, *args, shift=None, **kwargs):
super(MarginError, self).__init__(*args, **kwargs)
self.shift = shift
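Because ChannelNotFoundError subclasses CouldNotLoadError, handlers written against the base class also catch the specialized error; a self-contained sketch (the two classes are reproduced so the block runs alone):

```python
class CouldNotLoadError(ResourceWarning):
    def __init__(self, *args, study_id=None):
        super().__init__(*args)
        self.study_id = study_id

class ChannelNotFoundError(CouldNotLoadError):
    pass

# a handler for the base class catches the subclass and still sees study_id
try:
    raise ChannelNotFoundError("channel missing", study_id="S001")
except CouldNotLoadError as err:
    print(type(err).__name__, err.study_id)  # ChannelNotFoundError S001
```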
# --- vscvs/utils/path.py (fcoclavero/vscvs, MIT) ---
__author__ = ["Francisco Clavero"]
__email__ = ["fcoclavero32@gmail.com"]
__status__ = "Prototype"
""" Path handler. """
import os
import shutil
from datetime import datetime
from vscvs.settings import CHECKPOINT_NAME_FORMAT
from vscvs.settings import ROOT_DIR
def get_path(*paths):
"""Get the path of a file or directory by joining the given path components. Project files will be stored under the
`data` subdirectory of the projects `ROOT_DIR`, which is set as an environment variable.
Args:
paths: path components.
Returns:
the actual path.
"""
return os.path.join(ROOT_DIR, "data", *paths or "")
def get_checkpoint_path(checkpoint_name, *tags, date=None):
    """Get the path where trainer model checkpoints should be stored.
    Args:
        checkpoint_name: the name of the model.
        tags: optional tags for organizing checkpoints.
        date: the date of the model checkpoint. Defaults to the current date.
    Returns:
        the model checkpoint path
    """
    if date is None:  # a datetime.now() default in the signature would be evaluated once, at import time
        date = datetime.now()
    return get_path("checkpoints", checkpoint_name, *tags, date.strftime(CHECKPOINT_NAME_FORMAT))
def get_log_directory(model_name, *tags, date=None):
    """Get the path where trainer tensorboard logs should be stored.
    Args:
        model_name: the name of the model.
        tags: optional tags for organizing tensorboard logs.
        date: the date of the log directory. Defaults to the current date.
    Returns:
        the tensorboard log directory path
    """
    if date is None:  # default evaluated per call, not once at import
        date = datetime.now()
    return get_path("tensorboard", model_name, *tags, date.strftime(CHECKPOINT_NAME_FORMAT))
def get_subdirectories(path):
"""Get a list of all the child directories of the given path.
Args:
        path: the path whose child directories are to be returned.
Returns:
the paths of the child directories, relative to the given path.
"""
return next(os.walk(path))[1]
def recreate_directory(directory_path):
"""Delete and recreate the directory at the given path to ensure an empty directory.
Args:
directory_path: the path to the directory to be recreated.
"""
shutil.rmtree(directory_path, ignore_errors=True)
os.makedirs(directory_path)
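A quick illustration of how get_path composes checkpoint locations. CHECKPOINT_NAME_FORMAT and ROOT_DIR really live in vscvs.settings; the concrete values below (and the model name) are assumptions for the sketch:

```python
import os
from datetime import datetime

ROOT_DIR = "/tmp/vscvs-demo"               # assumption: real value comes from the environment
CHECKPOINT_NAME_FORMAT = "%Y-%m-%d_%H-%M"  # assumption: real format is defined in vscvs.settings

def get_path(*paths):
    return os.path.join(ROOT_DIR, "data", *paths or "")

stamp = datetime(2020, 1, 2, 3, 4).strftime(CHECKPOINT_NAME_FORMAT)
print(get_path("checkpoints", "my-model", stamp))
# on POSIX: /tmp/vscvs-demo/data/checkpoints/my-model/2020-01-02_03-04
```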
| 25.516854 | 119 | 0.701453 | 308 | 2,271 | 5.038961 | 0.327922 | 0.022552 | 0.030928 | 0.029639 | 0.319588 | 0.319588 | 0.319588 | 0.319588 | 0.319588 | 0.260309 | 0 | 0.00169 | 0.218406 | 2,271 | 88 | 120 | 25.806818 | 0.872676 | 0.534126 | 0 | 0 | 0 | 0 | 0.084765 | 0.0252 | 0 | 0 | 0 | 0 | 0 | 1 | 0.263158 | false | 0 | 0.263158 | 0 | 0.736842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
43c833c872d65921ada8e3c10536a4fbc45a695a | 678 | py | Python | src/clims/migrations/0021_auto_20201110_0806.py | commonlims/commonlims | 36a02ed244c7b59ee1f2523e64e4749e404ab0f7 | [
"BSD-3-Clause"
] | 4 | 2019-05-27T13:55:07.000Z | 2021-03-30T07:05:09.000Z | src/clims/migrations/0021_auto_20201110_0806.py | commonlims/commonlims | 36a02ed244c7b59ee1f2523e64e4749e404ab0f7 | [
"BSD-3-Clause"
] | 99 | 2019-05-20T14:16:33.000Z | 2021-01-19T09:25:15.000Z | src/clims/migrations/0021_auto_20201110_0806.py | commonlims/commonlims | 36a02ed244c7b59ee1f2523e64e4749e404ab0f7 | [
"BSD-3-Clause"
] | 1 | 2020-08-10T07:55:40.000Z | 2020-08-10T07:55:40.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.9.13 on 2020-11-10 08:06
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations
import django.db.models.deletion
import sentry.db.models.fields.foreignkey
class Migration(migrations.Migration):
dependencies = [
('clims', '0020_transition'),
]
operations = [
migrations.AlterField(
model_name='transition',
name='user',
field=sentry.db.models.fields.foreignkey.FlexibleForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='transitions', to=settings.AUTH_USER_MODEL),
),
]
| 28.25 | 185 | 0.69469 | 81 | 678 | 5.679012 | 0.617284 | 0.069565 | 0.06087 | 0.095652 | 0.130435 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038321 | 0.19174 | 678 | 23 | 186 | 29.478261 | 0.801095 | 0.100295 | 0 | 0 | 1 | 0 | 0.074135 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
43d14d939b9c82d029e7b6175aec1be89456550d | 75,226 | py | Python | pysnmp/WYSE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/WYSE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/WYSE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module WYSE-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/WYSE-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 21:32:11 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, ValueRangeConstraint, SingleValueConstraint, ValueSizeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "ValueRangeConstraint", "SingleValueConstraint", "ValueSizeConstraint", "ConstraintsIntersection")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
ObjectIdentity, Counter64, Unsigned32, ModuleIdentity, Integer32, NotificationType, TimeTicks, Counter32, Gauge32, IpAddress, enterprises, MibIdentifier, NotificationType, MibScalar, MibTable, MibTableRow, MibTableColumn, Bits, iso = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "Counter64", "Unsigned32", "ModuleIdentity", "Integer32", "NotificationType", "TimeTicks", "Counter32", "Gauge32", "IpAddress", "enterprises", "MibIdentifier", "NotificationType", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Bits", "iso")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
wyse = MibIdentifier((1, 3, 6, 1, 4, 1, 714))
product = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1))
old = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 1))
thinClient = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2))
wysenet = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 1, 1))
wbt3 = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3))
wbt3Memory = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1))
wbt3PCCard = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2))
wbt3IODevice = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 3))
wbt3Display = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4))
wbt3DhcpInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5))
wbt3BuildInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6))
wbt3CustomFields = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 7))
wbt3Administration = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8))
wbt3TrapsInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9))
wbt3MibRev = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 10))
wbt3Network = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11))
wbt3Apps = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12))
wbt3Connections = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13))
wbt3Users = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14))
wbt3Ram = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1))
wbt3Rom = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2))
wbt3RamNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RamNum.setStatus('mandatory')
wbt3RamTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 2), )
if mibBuilder.loadTexts: wbt3RamTable.setStatus('mandatory')
wbt3RamEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3RamIndex"))
if mibBuilder.loadTexts: wbt3RamEntry.setStatus('mandatory')
wbt3RamIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RamIndex.setStatus('mandatory')
wbt3RamType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("base", 1), ("video", 2), ("extend", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RamType.setStatus('mandatory')
wbt3RamSize = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 1, 2, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RamSize.setStatus('mandatory')
wbt3RomNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RomNum.setStatus('mandatory')
wbt3RomTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 2), )
if mibBuilder.loadTexts: wbt3RomTable.setStatus('mandatory')
wbt3RomEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3RomIndex"))
if mibBuilder.loadTexts: wbt3RomEntry.setStatus('mandatory')
wbt3RomIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RomIndex.setStatus('mandatory')
wbt3RomType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("boot", 1), ("os", 2), ("option", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RomType.setStatus('mandatory')
wbt3RomSize = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 1, 2, 2, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RomSize.setStatus('mandatory')
wbt3PCCardNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3PCCardNum.setStatus('mandatory')
wbt3PCCardTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 2), )
if mibBuilder.loadTexts: wbt3PCCardTable.setStatus('mandatory')
wbt3PCCardEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3PCCardIndex"))
if mibBuilder.loadTexts: wbt3PCCardEntry.setStatus('mandatory')
wbt3PCCardIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3PCCardIndex.setStatus('mandatory')
wbt3PCCardType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(256, 0, 1, 2, 3, 4, 5, 6, 7))).clone(namedValues=NamedValues(("empty", 256), ("multifunction", 0), ("memory", 1), ("serial-port-modem", 2), ("parallel-port", 3), ("fixed-disk", 4), ("video-adaptor", 5), ("lan-adapter", 6), ("aims", 7)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3PCCardType.setStatus('mandatory')
wbt3PCCardVendor = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 2, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3PCCardVendor.setStatus('mandatory')
wbt3IODevAttached = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3IODevAttached.setStatus('mandatory')
wbt3kbLanguage = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 3, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24))).clone(namedValues=NamedValues(("english-us", 0), ("english-uk", 1), ("french", 2), ("german", 3), ("spanish", 4), ("italian", 5), ("swedish", 6), ("danish", 7), ("norwegian", 8), ("dutch", 9), ("belgian-french", 10), ("finnish", 11), ("swiss-french", 12), ("swiss-german", 13), ("japanese", 14), ("canadian-french", 15), ("belgian-dutch", 16), ("portuguese", 17), ("brazilian-abnt", 18), ("italian-142", 19), ("latin-american", 20), ("us-international", 21), ("canadian-fr-multi", 22), ("canadian-eng-multi", 23), ("spanish-variation", 24)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3kbLanguage.setStatus('mandatory')
wbt3CharacterRepeatDelay = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 3, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(250, 500, 750, 1000))).clone(namedValues=NamedValues(("delay-250", 250), ("delay-500", 500), ("delay-750", 750), ("delay-1000", 1000)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3CharacterRepeatDelay.setStatus('mandatory')
wbt3CharacterRepeatRate = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 3, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 31))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3CharacterRepeatRate.setStatus('mandatory')
wbt3DispCharacteristic = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1))
wbt3DispCapability = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 2))
wbt3EnergySaver = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("none", 0), ("screensaver", 1), ("monitoroff", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3EnergySaver.setStatus('mandatory')
wbt3ScreenTimeOut = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1440))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ScreenTimeOut.setStatus('mandatory')
wbt3TouchScreen = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("none", 0), ("com1", 1), ("com2", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TouchScreen.setStatus('mandatory')
wbt3DispFreq = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 256))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3DispFreq.setStatus('mandatory')
wbt3DispHorizPix = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3DispHorizPix.setStatus('mandatory')
wbt3DispVertPix = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3DispVertPix.setStatus('mandatory')
wbt3DispColor = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DispColor.setStatus('mandatory')
wbt3DispUseDDC = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3DispUseDDC.setStatus('mandatory')
wbt3DispFreqMax = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 256))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DispFreqMax.setStatus('mandatory')
wbt3DispHorizPixMax = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DispHorizPixMax.setStatus('mandatory')
wbt3DispVertPixMax = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DispVertPixMax.setStatus('mandatory')
wbt3DispColorMax = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 4, 2, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DispColorMax.setStatus('mandatory')
wbt3DhcpInfoNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DhcpInfoNum.setStatus('mandatory')
wbt3DhcpInfoTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2), )
if mibBuilder.loadTexts: wbt3DhcpInfoTable.setStatus('mandatory')
wbt3DHCPoptionIDs = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3))
wbt3DhcpInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3DhcpInfoIndex"))
if mibBuilder.loadTexts: wbt3DhcpInfoEntry.setStatus('mandatory')
wbt3DhcpInfoIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DhcpInfoIndex.setStatus('mandatory')
wbt3InterfaceNum = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3InterfaceNum.setStatus('mandatory')
wbt3ServerIP = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3ServerIP.setStatus('mandatory')
wbt3Username = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3Username.setStatus('mandatory')
wbt3Domain = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3Domain.setStatus('mandatory')
wbt3Password = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("nopassword", 0), ("password", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3Password.setStatus('mandatory')
wbt3CommandLine = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CommandLine.setStatus('mandatory')
wbt3WorkingDir = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3WorkingDir.setStatus('mandatory')
wbt3FileServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 9), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3FileServer.setStatus('mandatory')
wbt3FileRootPath = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3FileRootPath.setStatus('mandatory')
wbt3TrapServerList = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 11), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TrapServerList.setStatus('mandatory')
wbt3SetCommunity = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("ignored", 0), ("provided", 1), ("notprovided", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3SetCommunity.setStatus('mandatory')
wbt3RDPstartApp = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 13), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RDPstartApp.setStatus('mandatory')
wbt3EmulationMode = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 14), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3EmulationMode.setStatus('mandatory')
wbt3TerminalID = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 15), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TerminalID.setStatus('mandatory')
wbt3VirtualPortServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 2, 1, 16), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3VirtualPortServer.setStatus('mandatory')
remoteServer = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: remoteServer.setStatus('mandatory')
logonUserName = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: logonUserName.setStatus('mandatory')
domain = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: domain.setStatus('mandatory')
password = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: password.setStatus('mandatory')
commandLine = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commandLine.setStatus('mandatory')
workingDirectory = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: workingDirectory.setStatus('mandatory')
fTPFileServer = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: fTPFileServer.setStatus('mandatory')
fTPRootPath = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 9), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: fTPRootPath.setStatus('mandatory')
trapServerList = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: trapServerList.setStatus('mandatory')
setCommunity = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 11), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: setCommunity.setStatus('mandatory')
rDPStartupApp = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rDPStartupApp.setStatus('mandatory')
emulationMode = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: emulationMode.setStatus('mandatory')
terminalID = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 13), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: terminalID.setStatus('mandatory')
virtualPortServer = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 5, 3, 14), Integer32().subtype(subtypeSpec=ValueRangeConstraint(128, 254))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: virtualPortServer.setStatus('mandatory')
wbt3CurrentInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1))
wbt3DhcpUpdateInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2))
wbt3CurInfoNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurInfoNum.setStatus('mandatory')
wbt3CurInfoTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2), )
if mibBuilder.loadTexts: wbt3CurInfoTable.setStatus('mandatory')
wbt3CurInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3DhcpInfoIndex"))
if mibBuilder.loadTexts: wbt3CurInfoEntry.setStatus('mandatory')
wbt3CurInfoIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurInfoIndex.setStatus('mandatory')
wbt3CurBuildNum = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurBuildNum.setStatus('mandatory')
wbt3CurOEMBuildNum = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurOEMBuildNum.setStatus('mandatory')
wbt3CurModBuildDate = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurModBuildDate.setStatus('mandatory')
wbt3CurOEM = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurOEM.setStatus('mandatory')
wbt3CurHWPlatform = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurHWPlatform.setStatus('mandatory')
wbt3CurOS = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 1, 2, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3CurOS.setStatus('mandatory')
wbt3DUpInfoNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpInfoNum.setStatus('mandatory')
wbt3DUpInfoTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2), )
if mibBuilder.loadTexts: wbt3DUpInfoTable.setStatus('mandatory')
wbt3DUpInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3DUpInfoIndex"))
if mibBuilder.loadTexts: wbt3DUpInfoEntry.setStatus('mandatory')
wbt3DUpInfoIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpInfoIndex.setStatus('mandatory')
wbt3DUpBuildNum = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpBuildNum.setStatus('mandatory')
wbt3DUpOEMBuildNum = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpOEMBuildNum.setStatus('mandatory')
wbt3DUpModBuildDate = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpModBuildDate.setStatus('mandatory')
wbt3DUpOEMBuildDate = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 6, 2, 2, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3DUpOEMBuildDate.setStatus('mandatory')
wbt3CustomField1 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 7, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3CustomField1.setStatus('mandatory')
wbt3CustomField2 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 7, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3CustomField2.setStatus('mandatory')
wbt3CustomField3 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 7, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3CustomField3.setStatus('mandatory')
wbt3UpDnLoad = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1))
wbt3Action = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 2))
wbt3FTPsetting = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3))
wbt3SNMPupdate = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SNMPupdate.setStatus('mandatory')
wbt3DHCPupdate = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3DHCPupdate.setStatus('mandatory')
wbt3UpDnLoadNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadNum.setStatus('mandatory')
wbt3UpDnLoadTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2), )
if mibBuilder.loadTexts: wbt3UpDnLoadTable.setStatus('mandatory')
wbt3AcceptReq = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AcceptReq.setStatus('mandatory')
wbt3SubmitLoadJob = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("notready", 0), ("ready", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SubmitLoadJob.setStatus('mandatory')
wbt3UpDnLoadEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3UpDnLoadIndex"))
if mibBuilder.loadTexts: wbt3UpDnLoadEntry.setStatus('mandatory')
wbt3UpDnLoadIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadIndex.setStatus('mandatory')
wbt3UpDnLoadId = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadId.setStatus('mandatory')
wbt3UpDnLoadOp = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("upload", 0), ("download", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadOp.setStatus('mandatory')
wbt3UpDnLoadSrcFile = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadSrcFile.setStatus('mandatory')
wbt3UpDnLoadDstFile = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadDstFile.setStatus('mandatory')
wbt3UpDnLoadFileType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("binary", 0), ("ascii", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadFileType.setStatus('mandatory')
wbt3UpDnLoadProtocol = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ftp", 0), ("tftp", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadProtocol.setStatus('mandatory')
wbt3UpDnLoadFServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 8), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadFServer.setStatus('mandatory')
wbt3UpDnLoadTimeFlag = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 1, 2, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("immediate", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UpDnLoadTimeFlag.setStatus('mandatory')
wbt3RebootRequest = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 2, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("noreboot", 0), ("rebootnow", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RebootRequest.setStatus('mandatory')
wbt3ResetToFactoryDefault = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 2, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("noreset", 0), ("resetnow", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ResetToFactoryDefault.setStatus('mandatory')
wbt3ServerName = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ServerName.setStatus('mandatory')
wbt3Directory = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Directory.setStatus('mandatory')
wbt3UserID = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UserID.setStatus('mandatory')
wbt3Password2 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Password2.setStatus('mandatory')
wbt3SavePassword = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SavePassword.setStatus('mandatory')
wbt3InfoLocation = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 3, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("uselocalinfo", 0), ("usedhcpinfo", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3InfoLocation.setStatus('mandatory')
wbt3Security = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7))
wbt3SecurityEnable = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SecurityEnable.setStatus('mandatory')
wbt3HideConfigTab = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3HideConfigTab.setStatus('mandatory')
wbt3FailOverEnable = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3FailOverEnable.setStatus('mandatory')
wbt3MultipleConnect = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3MultipleConnect.setStatus('mandatory')
wbt3PingBeforeConnect = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3PingBeforeConnect.setStatus('mandatory')
wbt3Verbose = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Verbose.setStatus('mandatory')
wbt3AutoLoginEnable = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoLoginEnable.setStatus('mandatory')
wbt3AutoLoginUserName = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 8), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoLoginUserName.setStatus('mandatory')
wbt3SingleButtonConnect = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SingleButtonConnect.setStatus('mandatory')
wbt3AutoFailRecovery = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 8, 7, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoFailRecovery.setStatus('mandatory')
wbt3TrapStatus = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28))).clone(namedValues=NamedValues(("ls-done", 0), ("ls-done-sameversion", 1), ("ls-notready", 2), ("ls-fail-shutdown", 3), ("ls-fail-noupd", 4), ("ls-fail-dnld-blocked", 5), ("ls-fail-filenotfound", 6), ("ls-fail-dir", 7), ("ls-fail-upld-blocked", 8), ("ls-fail-noserv", 9), ("ls-fail-prot", 10), ("ls-fail-nomem", 11), ("ls-fail-noresource", 12), ("ls-fail-resolvename", 13), ("ls-fail-notbundle", 14), ("ls-fail-checksum", 15), ("ls-fail-flasherror", 16), ("ls-fail-dnld-flash", 17), ("ls-fail-usercancel", 18), ("ls-fail-norflash", 19), ("ls-fail-protnsupport", 20), ("ls-fail-parsereg", 21), ("ls-fail-parsereg-verincomp", 22), ("ls-fail-parsereg-platfincomp", 23), ("ls-fail-parsereg-osincomp", 24), ("ls-fail-reset-defaultfactory", 25), ("ls-fail-paraminifilenotfound", 26), ("ls-invalid-bootstrap", 27), ("ls-fail-badkey", 28)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TrapStatus.setStatus('mandatory')
wbt3TrapReqId = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TrapReqId.setStatus('mandatory')
wbt3TrapServers = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 3))
wbt3TrapServer1 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 3, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TrapServer1.setStatus('mandatory')
wbt3TrapServer2 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 3, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TrapServer2.setStatus('mandatory')
wbt3TrapServer3 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 3, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TrapServer3.setStatus('mandatory')
wbt3TrapServer4 = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 9, 3, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TrapServer4.setStatus('mandatory')
wbt3MibRevMajor = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 10, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3MibRevMajor.setStatus('mandatory')
wbt3MibRevMinor = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 10, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3MibRevMinor.setStatus('mandatory')
wbt3NetworkNum = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3NetworkNum.setStatus('mandatory')
wbt3NetworkTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2), )
if mibBuilder.loadTexts: wbt3NetworkTable.setStatus('mandatory')
wbt3NetworkEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3NetworkIndex"))
if mibBuilder.loadTexts: wbt3NetworkEntry.setStatus('mandatory')
wbt3NetworkIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3NetworkIndex.setStatus('mandatory')
wbt3dhcpEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3dhcpEnable.setStatus('mandatory')
wbt3NetworkAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3NetworkAddress.setStatus('mandatory')
wbt3SubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3SubnetMask.setStatus('mandatory')
wbt3Gateway = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Gateway.setStatus('mandatory')
wbt3dnsEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3dnsEnable.setStatus('mandatory')
wbt3defaultDomain = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3defaultDomain.setStatus('mandatory')
wbt3primaryDNSserverIPaddress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 8), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3primaryDNSserverIPaddress.setStatus('mandatory')
wbt3secondaryDNSserverIPaddress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3secondaryDNSserverIPaddress.setStatus('mandatory')
wbt3winsEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3winsEnable.setStatus('mandatory')
wbt3primaryWINSserverIPaddress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 11), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3primaryWINSserverIPaddress.setStatus('mandatory')
wbt3secondaryWINSserverIPaddress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 12), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3secondaryWINSserverIPaddress.setStatus('mandatory')
wbt3NetworkSpeed = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 17), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 9, 8, 7, 6))).clone(namedValues=NamedValues(("auto-detect", 0), ("mbs-10halfduplex", 9), ("mbs-10fullduplex", 8), ("mbs-100halfduplex", 7), ("mbs-100fullduplex", 6)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3NetworkSpeed.setStatus('mandatory')
wbt3NetworkProtocol = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 11, 2, 1, 20), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 256))).clone(namedValues=NamedValues(("tcp-ip", 0), ("unknown", 256)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3NetworkProtocol.setStatus('mandatory')
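# The network-table enumerations above carry their labels in NamedValues; a
# manager that only sees raw integers can decode them with a plain mapping.
# Standalone sketch (not part of the MIB module, no pysnmp dependency): the
# dict literal mirrors the NamedValues pairs declared for wbt3NetworkSpeed.

```python
# Mirror of the NamedValues declared for wbt3NetworkSpeed above.
WBT3_NETWORK_SPEED = {
    0: "auto-detect",
    9: "mbs-10halfduplex",
    8: "mbs-10fullduplex",
    7: "mbs-100halfduplex",
    6: "mbs-100fullduplex",
}

def decode_network_speed(value: int) -> str:
    """Return the symbolic label for a raw value, or 'unknown(<n>)'."""
    return WBT3_NETWORK_SPEED.get(value, "unknown(%d)" % value)
```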
wbt3RDPencryption = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("enable", 0), ("disable", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPencryption.setStatus('mandatory')
wbt3VirtualPortServerIPaddress = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3VirtualPortServerIPaddress.setStatus('mandatory')
wbt3com1Share = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("yes", 0), ("no", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3com1Share.setStatus('mandatory')
wbt3com2Share = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("yes", 0), ("no", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3com2Share.setStatus('mandatory')
wbt3parallelShare = MibScalar((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("yes", 0), ("no", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3parallelShare.setStatus('mandatory')
iCADefaultHotkeys = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6), )
if mibBuilder.loadTexts: iCADefaultHotkeys.setStatus('mandatory')
defaultHotkeysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1), )
if mibBuilder.loadTexts: defaultHotkeysEntry.setStatus('mandatory')
iCAStatusDialog = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAStatusDialog.setStatus('mandatory')
iCAStatusDialog2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAStatusDialog2.setStatus('mandatory')
iCACloseRemoteApplication = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCACloseRemoteApplication.setStatus('mandatory')
iCACloseRemoteApplication2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCACloseRemoteApplication2.setStatus('mandatory')
iCAtoggleTitleBar = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAtoggleTitleBar.setStatus('mandatory')
iCAtoggleTitleBar2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAtoggleTitleBar2.setStatus('mandatory')
iCActrlAltDel = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("ctrl", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCActrlAltDel.setStatus('mandatory')
iCActrlAltDel2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCActrlAltDel2.setStatus('mandatory')
iCActrlEsc = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("ctrl", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCActrlEsc.setStatus('mandatory')
iCActrlEsc2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCActrlEsc2.setStatus('mandatory')
iCAaltEsc = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltEsc.setStatus('mandatory')
iCAaltEsc2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltEsc2.setStatus('mandatory')
iCAaltTab = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltTab.setStatus('mandatory')
iCAaltTab2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltTab2.setStatus('mandatory')
iCAaltBackTab = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 15), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("ctrl", 0), ("shift", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltBackTab.setStatus('mandatory')
iCAaltBackTab2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 12, 6, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("nb-0", 0), ("nb-1", 1), ("nb-2", 2), ("nb-3", 3), ("nb-4", 4), ("nb-5", 5), ("nb-6", 6), ("nb-7", 7), ("nb-8", 8), ("nb-9", 9)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: iCAaltBackTab2.setStatus('mandatory')
wbt3ConnectionsTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 2), )
if mibBuilder.loadTexts: wbt3ConnectionsTable.setStatus('mandatory')
wbt3ConnectionEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3ConnectionName"))
if mibBuilder.loadTexts: wbt3ConnectionEntry.setStatus('mandatory')
wbt3ConnectionName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 2, 1, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ConnectionName.setStatus('mandatory')
wbt3ConnectionType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("rdp", 0), ("ica", 1), ("tec", 2), ("dialup", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ConnectionType.setStatus('mandatory')
wbt3ConnectionEntryStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 2, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("active", 1), ("notInService", 2), ("notReady", 3), ("createAndGo", 4), ("createAndWait", 5), ("destroy", 6)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ConnectionEntryStatus.setStatus('mandatory')
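# wbt3ConnectionEntryStatus (and wbt3UsersStatus below) use the standard
# SNMPv2 RowStatus convention. Minimal stdlib-only sketch of those semantics,
# assuming RFC 2579 behavior: notReady(3) is reported by the agent and is not
# a value a manager may write.

```python
# RowStatus codes as declared in the NamedValues above.
ROW_STATUS = {
    1: "active", 2: "notInService", 3: "notReady",
    4: "createAndGo", 5: "createAndWait", 6: "destroy",
}
# Manager-writable values; notReady(3) is read-only agent state.
SETTABLE = {1, 2, 4, 5, 6}

def is_settable(code: int) -> bool:
    """True if a manager may SET this RowStatus value on a row."""
    return code in SETTABLE
```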
wbt3RDPConnections = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3))
wbt3RDPConnTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1), )
if mibBuilder.loadTexts: wbt3RDPConnTable.setStatus('mandatory')
wbt3RDPConnEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1), )
if mibBuilder.loadTexts: wbt3RDPConnEntry.setStatus('mandatory')
wbt3RDPConnName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RDPConnName.setStatus('mandatory')
wbt3RDPConnServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnServer.setStatus('mandatory')
wbt3RDPConnLowSpeed = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnLowSpeed.setStatus('mandatory')
wbt3RDPConnAutoLogon = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnAutoLogon.setStatus('mandatory')
wbt3RDPConnUsername = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnUsername.setStatus('mandatory')
wbt3RDPConnDomain = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnDomain.setStatus('mandatory')
wbt3RDPConnStartApplication = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 2))).clone(namedValues=NamedValues(("desktop", 0), ("filename", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnStartApplication.setStatus('mandatory')
wbt3RDPConnFilename = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnFilename.setStatus('mandatory')
wbt3RDPConnWorkingDir = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 10), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3RDPConnWorkingDir.setStatus('mandatory')
wbt3RDPConnModifiable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 3, 1, 1, 30), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3RDPConnModifiable.setStatus('mandatory')
wbt3ICAConnections = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4))
wbt3ICAConnTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1), )
if mibBuilder.loadTexts: wbt3ICAConnTable.setStatus('mandatory')
wbt3ICAConnEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1), )
if mibBuilder.loadTexts: wbt3ICAConnEntry.setStatus('mandatory')
wbt3ICAConnName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3ICAConnName.setStatus('mandatory')
wbt3ICAConnCommType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("network", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnCommType.setStatus('mandatory')
wbt3ICAConnServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnServer.setStatus('mandatory')
wbt3ICAConnCommandLine = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnCommandLine.setStatus('mandatory')
wbt3ICAConnWorkingDir = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnWorkingDir.setStatus('mandatory')
wbt3ICAConnUsername = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 6), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnUsername.setStatus('mandatory')
wbt3ICAConnDomain = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 8), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnDomain.setStatus('mandatory')
wbt3ICAConnColors = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("nb-16", 0), ("nb-256", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnColors.setStatus('mandatory')
wbt3ICAConnDataCompress = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnDataCompress.setStatus('mandatory')
wbt3ICAConnSoundQuality = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("none", 0), ("low", 1), ("medium", 2), ("high", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3ICAConnSoundQuality.setStatus('mandatory')
wbt3ICAConnModifiable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 4, 1, 1, 30), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3ICAConnModifiable.setStatus('mandatory')
wbt3TermConnections = MibIdentifier((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5))
wbt3TermConnTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1), )
if mibBuilder.loadTexts: wbt3TermConnTable.setStatus('mandatory')
wbt3TermConnEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1), )
if mibBuilder.loadTexts: wbt3TermConnEntry.setStatus('mandatory')
wbt3TermConnName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TermConnName.setStatus('mandatory')
wbt3TermConnCommType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("network", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnCommType.setStatus('mandatory')
wbt3TermConnServer = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnServer.setStatus('mandatory')
wbt3TermConnEmuType = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17))).clone(namedValues=NamedValues(("vt52", 0), ("vt100", 1), ("vt400-7-bit", 3), ("vt400-8-bit", 4), ("ansi-bbs", 5), ("sco-console", 6), ("ibm3270", 7), ("ibm3151", 8), ("ibm5250", 9), ("wy50", 10), ("wy50-plus", 11), ("tvi910", 12), ("tvi920", 13), ("tvi925", 14), ("adds-a2", 15), ("hz1500", 16), ("wy60", 17)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnEmuType.setStatus('mandatory')
wbt3TermConnVTEmuModel = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 256))).clone(namedValues=NamedValues(("vt100", 0), ("vt101", 1), ("vt102", 2), ("vt125", 3), ("vt220", 4), ("vt240", 5), ("vt320", 6), ("vt340", 7), ("vt420", 8), ("vt131", 9), ("vt132", 10), ("not-applicable", 256)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnVTEmuModel.setStatus('mandatory')
wbt3TermConnIBM3270EmuModel = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 256))).clone(namedValues=NamedValues(("ibm3278-2", 0), ("ibm3278-3", 1), ("ibm3278-4", 2), ("ibm3278-5", 3), ("ibm3278-2-e", 4), ("ibm3278-3-e", 5), ("ibm3278-4-e", 6), ("ibm3278-5-e", 7), ("ibm3279-2", 8), ("ibm3279-3", 9), ("ibm3279-4", 10), ("ibm3279-5", 11), ("ibm3287-1", 12), ("not-applicable", 256)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnIBM3270EmuModel.setStatus('mandatory')
wbt3TermConnIBM5250EmuModel = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 256))).clone(namedValues=NamedValues(("ibm5291-1", 0), ("ibm5292-2", 1), ("ibm5251-11", 2), ("ibm3179-2", 3), ("ibm3196-a1", 4), ("ibm3180-2", 5), ("ibm3477-fc", 6), ("ibm3477-fg", 7), ("ibm3486-ba", 8), ("ibm3487-ha", 9), ("ibm3487-hc", 10), ("not-applicable", 256)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnIBM5250EmuModel.setStatus('mandatory')
wbt3TermConnPortNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnPortNumber.setStatus('mandatory')
wbt3TermConnTelnetName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 11), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnTelnetName.setStatus('mandatory')
wbt3TermConnPrinterPort = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0))).clone(namedValues=NamedValues(("lpt1", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnPrinterPort.setStatus('mandatory')
wbt3TermConnFormFeed = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnFormFeed.setStatus('mandatory')
wbt3TermConnAutoLineFeed = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnAutoLineFeed.setStatus('mandatory')
wbt3TermConnScript = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 15), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3TermConnScript.setStatus('mandatory')
wbt3TermConnModifiable = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 13, 5, 1, 1, 30), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: wbt3TermConnModifiable.setStatus('mandatory')
wbt3UsersTable = MibTable((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2), )
if mibBuilder.loadTexts: wbt3UsersTable.setStatus('mandatory')
wbt3UsersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1), ).setIndexNames((0, "WYSE-MIB", "wbt3userName"))
if mibBuilder.loadTexts: wbt3UsersEntry.setStatus('mandatory')
wbt3UsersStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("active", 1), ("notInService", 2), ("notReady", 3), ("createAndGo", 4), ("createAndWait", 5), ("destroy", 6)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UsersStatus.setStatus('mandatory')
wbt3userName = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3userName.setStatus('mandatory')
wbt3password = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3password.setStatus('mandatory')
wbt3privilege = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("admin", 0), ("user", 1), ("guest", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3privilege.setStatus('mandatory')
wbt3Connection1 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection1.setStatus('mandatory')
wbt3Connection2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 8), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection2.setStatus('mandatory')
wbt3Connection3 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection3.setStatus('mandatory')
wbt3Connection4 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 10), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection4.setStatus('mandatory')
wbt3Connection5 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 11), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection5.setStatus('mandatory')
wbt3Connection6 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 12), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection6.setStatus('mandatory')
wbt3Connection7 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 13), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection7.setStatus('mandatory')
wbt3Connection8 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 14), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3Connection8.setStatus('mandatory')
wbt3AutoStart1 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 15), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart1.setStatus('mandatory')
wbt3AutoStart2 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart2.setStatus('mandatory')
wbt3AutoStart3 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 17), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart3.setStatus('mandatory')
wbt3AutoStart4 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 18), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart4.setStatus('mandatory')
wbt3AutoStart5 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 19), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart5.setStatus('mandatory')
wbt3AutoStart6 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 20), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart6.setStatus('mandatory')
wbt3AutoStart7 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 21), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart7.setStatus('mandatory')
wbt3AutoStart8 = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 22), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3AutoStart8.setStatus('mandatory')
wbt3UserPasswordChange = MibTableColumn((1, 3, 6, 1, 4, 1, 714, 1, 2, 3, 14, 2, 1, 23), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("no", 0), ("yes", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: wbt3UserPasswordChange.setStatus('mandatory')
wbt3TrapDHCPBuildMismatch = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,1)).setObjects(("WYSE-MIB", "wbt3CurBuildNum"), ("WYSE-MIB", "wbt3CurOEMBuildNum"), ("WYSE-MIB", "wbt3CurModBuildDate"), ("WYSE-MIB", "wbt3DUpBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildDate"))
wbt3TrapDHCPUpdDone = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,2)).setObjects(("WYSE-MIB", "wbt3CurBuildNum"), ("WYSE-MIB", "wbt3CurOEMBuildNum"), ("WYSE-MIB", "wbt3CurModBuildDate"), ("WYSE-MIB", "wbt3DUpBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildDate"))
wbt3TrapDHCPUpdNotComplete = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,3)).setObjects(("WYSE-MIB", "wbt3CurBuildNum"), ("WYSE-MIB", "wbt3CurOEMBuildNum"), ("WYSE-MIB", "wbt3CurModBuildDate"), ("WYSE-MIB", "wbt3DUpBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildNum"), ("WYSE-MIB", "wbt3DUpOEMBuildDate"), ("WYSE-MIB", "wbt3TrapStatus"))
wbt3TrapSNMPAccptLd = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,4)).setObjects(("WYSE-MIB", "wbt3SubmitLoadJob"))
wbt3TrapSNMPLdDone = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,5)).setObjects(("WYSE-MIB", "wbt3TrapReqId"))
wbt3TrapSNMPLdNotComplete = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,6)).setObjects(("WYSE-MIB", "wbt3TrapReqId"), ("WYSE-MIB", "wbt3TrapStatus"))
wbt3TrapRebootNotComplete = NotificationType((1, 3, 6, 1, 4, 1, 714, 1, 2, 3) + (0,7)).setObjects(("WYSE-MIB", "wbt3TrapStatus"))
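# The ...NotComplete notifications above carry wbt3TrapStatus as a raw
# integer. Standalone sketch (no pysnmp dependency) for a trap receiver:
# the dict mirrors the NamedValues declared for wbt3TrapStatus earlier in
# this module.

```python
# Mirror of the NamedValues declared for wbt3TrapStatus.
WBT3_TRAP_STATUS = {
    0: "ls-done", 1: "ls-done-sameversion", 2: "ls-notready",
    3: "ls-fail-shutdown", 4: "ls-fail-noupd", 5: "ls-fail-dnld-blocked",
    6: "ls-fail-filenotfound", 7: "ls-fail-dir", 8: "ls-fail-upld-blocked",
    9: "ls-fail-noserv", 10: "ls-fail-prot", 11: "ls-fail-nomem",
    12: "ls-fail-noresource", 13: "ls-fail-resolvename",
    14: "ls-fail-notbundle", 15: "ls-fail-checksum",
    16: "ls-fail-flasherror", 17: "ls-fail-dnld-flash",
    18: "ls-fail-usercancel", 19: "ls-fail-norflash",
    20: "ls-fail-protnsupport", 21: "ls-fail-parsereg",
    22: "ls-fail-parsereg-verincomp", 23: "ls-fail-parsereg-platfincomp",
    24: "ls-fail-parsereg-osincomp", 25: "ls-fail-reset-defaultfactory",
    26: "ls-fail-paraminifilenotfound", 27: "ls-invalid-bootstrap",
    28: "ls-fail-badkey",
}

def is_load_failure(code: int) -> bool:
    """True for statuses whose label starts with 'ls-fail'."""
    return WBT3_TRAP_STATUS.get(code, "").startswith("ls-fail")
```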
mibBuilder.exportSymbols("WYSE-MIB", wbt3CustomField1=wbt3CustomField1, wbt3PCCardIndex=wbt3PCCardIndex, wbt3EnergySaver=wbt3EnergySaver, wbt3HideConfigTab=wbt3HideConfigTab, wbt3TrapServer3=wbt3TrapServer3, wbt3SetCommunity=wbt3SetCommunity, wbt3UpDnLoad=wbt3UpDnLoad, wbt3CurInfoNum=wbt3CurInfoNum, wbt3UpDnLoadIndex=wbt3UpDnLoadIndex, wbt3FailOverEnable=wbt3FailOverEnable, wbt3RDPConnections=wbt3RDPConnections, wbt3ICAConnColors=wbt3ICAConnColors, logonUserName=logonUserName, wbt3Connection6=wbt3Connection6, wbt3DispVertPixMax=wbt3DispVertPixMax, wbt3CharacterRepeatDelay=wbt3CharacterRepeatDelay, wbt3DhcpInfoIndex=wbt3DhcpInfoIndex, wbt3RDPConnUsername=wbt3RDPConnUsername, iCACloseRemoteApplication=iCACloseRemoteApplication, wbt3RDPencryption=wbt3RDPencryption, wbt3TrapDHCPUpdNotComplete=wbt3TrapDHCPUpdNotComplete, domain=domain, emulationMode=emulationMode, wbt3CurOEMBuildNum=wbt3CurOEMBuildNum, wbt3DHCPupdate=wbt3DHCPupdate, wbt3CustomField3=wbt3CustomField3, wbt3DUpOEMBuildDate=wbt3DUpOEMBuildDate, wbt3UpDnLoadTable=wbt3UpDnLoadTable, wbt3DUpOEMBuildNum=wbt3DUpOEMBuildNum, wbt3DispFreq=wbt3DispFreq, wbt3UpDnLoadProtocol=wbt3UpDnLoadProtocol, iCAaltBackTab2=iCAaltBackTab2, wbt3Username=wbt3Username, wbt3RDPConnStartApplication=wbt3RDPConnStartApplication, wbt3Password2=wbt3Password2, wbt3DispHorizPixMax=wbt3DispHorizPixMax, wbt3VirtualPortServerIPaddress=wbt3VirtualPortServerIPaddress, iCACloseRemoteApplication2=iCACloseRemoteApplication2, wbt3AutoStart1=wbt3AutoStart1, wbt3TrapSNMPAccptLd=wbt3TrapSNMPAccptLd, wbt3TrapSNMPLdNotComplete=wbt3TrapSNMPLdNotComplete, wbt3NetworkIndex=wbt3NetworkIndex, iCActrlEsc2=iCActrlEsc2, wbt3Password=wbt3Password, trapServerList=trapServerList, wbt3PingBeforeConnect=wbt3PingBeforeConnect, thinClient=thinClient, wbt3InfoLocation=wbt3InfoLocation, wbt3InterfaceNum=wbt3InterfaceNum, wysenet=wysenet, remoteServer=remoteServer, wbt3TrapStatus=wbt3TrapStatus, wbt3TrapServerList=wbt3TrapServerList, wbt3DhcpInfoEntry=wbt3DhcpInfoEntry, 
iCAaltEsc2=iCAaltEsc2, wbt3SubmitLoadJob=wbt3SubmitLoadJob, wbt3com1Share=wbt3com1Share, wbt3CurOS=wbt3CurOS, wbt3CurInfoIndex=wbt3CurInfoIndex, wbt3UsersEntry=wbt3UsersEntry, wbt3Domain=wbt3Domain, wbt3dhcpEnable=wbt3dhcpEnable, wbt3PCCard=wbt3PCCard, wbt3DUpInfoNum=wbt3DUpInfoNum, wbt3DUpInfoTable=wbt3DUpInfoTable, wbt3ICAConnCommandLine=wbt3ICAConnCommandLine, wbt3CurHWPlatform=wbt3CurHWPlatform, wbt3MibRev=wbt3MibRev, wbt3RDPConnFilename=wbt3RDPConnFilename, wbt3AutoLoginUserName=wbt3AutoLoginUserName, wbt3CommandLine=wbt3CommandLine, wbt3TermConnServer=wbt3TermConnServer, wbt3TrapServer1=wbt3TrapServer1, wbt3ConnectionEntry=wbt3ConnectionEntry, iCAtoggleTitleBar2=iCAtoggleTitleBar2, wbt3ICAConnWorkingDir=wbt3ICAConnWorkingDir, wbt3RomNum=wbt3RomNum, iCAtoggleTitleBar=iCAtoggleTitleBar, wbt3TermConnFormFeed=wbt3TermConnFormFeed, wbt3CurBuildNum=wbt3CurBuildNum, wbt3DispColorMax=wbt3DispColorMax, wbt3CurModBuildDate=wbt3CurModBuildDate, wbt3MultipleConnect=wbt3MultipleConnect, iCActrlAltDel2=iCActrlAltDel2, wbt3Connection2=wbt3Connection2, wbt3Directory=wbt3Directory, wbt3ICAConnTable=wbt3ICAConnTable, wbt3TermConnIBM5250EmuModel=wbt3TermConnIBM5250EmuModel, wbt3TrapSNMPLdDone=wbt3TrapSNMPLdDone, wbt3UpDnLoadOp=wbt3UpDnLoadOp, iCAStatusDialog2=iCAStatusDialog2, product=product, wbt3primaryWINSserverIPaddress=wbt3primaryWINSserverIPaddress, iCAStatusDialog=iCAStatusDialog, wbt3NetworkEntry=wbt3NetworkEntry, wbt3RomSize=wbt3RomSize, wbt3ConnectionsTable=wbt3ConnectionsTable, wbt3SecurityEnable=wbt3SecurityEnable, wbt3ICAConnServer=wbt3ICAConnServer, wbt3Connection1=wbt3Connection1, wbt3primaryDNSserverIPaddress=wbt3primaryDNSserverIPaddress, wbt3DhcpInfo=wbt3DhcpInfo, wbt3UserPasswordChange=wbt3UserPasswordChange, wbt3RebootRequest=wbt3RebootRequest, wbt3RDPstartApp=wbt3RDPstartApp, wbt3CharacterRepeatRate=wbt3CharacterRepeatRate, wbt3TermConnPrinterPort=wbt3TermConnPrinterPort, wbt3Connection7=wbt3Connection7, wbt3CurInfoEntry=wbt3CurInfoEntry, 
wbt3AutoStart3=wbt3AutoStart3, wbt3Rom=wbt3Rom, wbt3SubnetMask=wbt3SubnetMask, wbt3AutoStart2=wbt3AutoStart2, wbt3DispVertPix=wbt3DispVertPix, wbt3PCCardEntry=wbt3PCCardEntry, wbt3parallelShare=wbt3parallelShare, iCAaltBackTab=iCAaltBackTab, wbt3ICAConnDomain=wbt3ICAConnDomain, wbt3NetworkProtocol=wbt3NetworkProtocol, wbt3TermConnections=wbt3TermConnections, wbt3AutoStart8=wbt3AutoStart8, wbt3RDPConnServer=wbt3RDPConnServer, wbt3IODevice=wbt3IODevice, wbt3ICAConnUsername=wbt3ICAConnUsername, wbt3TermConnPortNumber=wbt3TermConnPortNumber, wbt3ICAConnCommType=wbt3ICAConnCommType, wbt3CurInfoTable=wbt3CurInfoTable, wbt3NetworkSpeed=wbt3NetworkSpeed, iCAaltTab=iCAaltTab, wbt3TermConnAutoLineFeed=wbt3TermConnAutoLineFeed, fTPFileServer=fTPFileServer, wbt3TermConnModifiable=wbt3TermConnModifiable, wbt3Connection8=wbt3Connection8, wbt3TrapServer4=wbt3TrapServer4, wbt3DispCharacteristic=wbt3DispCharacteristic, wbt3ICAConnModifiable=wbt3ICAConnModifiable, wbt3AutoStart7=wbt3AutoStart7, wbt3winsEnable=wbt3winsEnable, wbt3UpDnLoadDstFile=wbt3UpDnLoadDstFile, wbt3Memory=wbt3Memory, wbt3Connection5=wbt3Connection5, wbt3ServerName=wbt3ServerName, wbt3VirtualPortServer=wbt3VirtualPortServer, virtualPortServer=virtualPortServer, iCAaltEsc=iCAaltEsc, wbt3PCCardType=wbt3PCCardType, wbt3MibRevMajor=wbt3MibRevMajor, wbt3PCCardVendor=wbt3PCCardVendor, wbt3DhcpInfoTable=wbt3DhcpInfoTable, wbt3Verbose=wbt3Verbose, wbt3UpDnLoadId=wbt3UpDnLoadId, wbt3RamNum=wbt3RamNum, wbt3dnsEnable=wbt3dnsEnable, wbt3Network=wbt3Network, wbt3TouchScreen=wbt3TouchScreen, wbt3RomType=wbt3RomType, wbt3ConnectionEntryStatus=wbt3ConnectionEntryStatus, wbt3RDPConnLowSpeed=wbt3RDPConnLowSpeed, wbt3ICAConnections=wbt3ICAConnections, wbt3UserID=wbt3UserID, wbt3ICAConnDataCompress=wbt3ICAConnDataCompress, wbt3Gateway=wbt3Gateway, wbt3RamType=wbt3RamType, wbt3SavePassword=wbt3SavePassword, fTPRootPath=fTPRootPath, wbt3ICAConnEntry=wbt3ICAConnEntry, terminalID=terminalID, wbt3ConnectionType=wbt3ConnectionType, 
wbt3Connection4=wbt3Connection4, wbt3RomEntry=wbt3RomEntry, wbt3AutoStart4=wbt3AutoStart4, wbt3Connections=wbt3Connections, wbt3PCCardTable=wbt3PCCardTable, iCAaltTab2=iCAaltTab2, wbt3Display=wbt3Display, wbt3Apps=wbt3Apps, wbt3TermConnVTEmuModel=wbt3TermConnVTEmuModel, wbt3RomIndex=wbt3RomIndex, rDPStartupApp=rDPStartupApp, wbt3TrapRebootNotComplete=wbt3TrapRebootNotComplete, wbt3CustomFields=wbt3CustomFields, wbt3SingleButtonConnect=wbt3SingleButtonConnect, wbt3FTPsetting=wbt3FTPsetting, wbt3UpDnLoadEntry=wbt3UpDnLoadEntry, wbt3TermConnEmuType=wbt3TermConnEmuType, wbt3AutoFailRecovery=wbt3AutoFailRecovery, wbt3RamTable=wbt3RamTable, wbt3DispCapability=wbt3DispCapability, wbt3NetworkAddress=wbt3NetworkAddress, wbt3RDPConnName=wbt3RDPConnName, wbt3RDPConnEntry=wbt3RDPConnEntry, wbt3NetworkNum=wbt3NetworkNum, wbt3DispHorizPix=wbt3DispHorizPix, wbt3BuildInfo=wbt3BuildInfo, wbt3CurrentInfo=wbt3CurrentInfo, wbt3RDPConnWorkingDir=wbt3RDPConnWorkingDir, wbt3RDPConnTable=wbt3RDPConnTable, wbt3kbLanguage=wbt3kbLanguage, wbt3TerminalID=wbt3TerminalID, wbt3DUpModBuildDate=wbt3DUpModBuildDate, wbt3Security=wbt3Security, wbt3RamSize=wbt3RamSize, wbt3ResetToFactoryDefault=wbt3ResetToFactoryDefault, wbt3TrapServer2=wbt3TrapServer2, wbt3Administration=wbt3Administration, wbt3WorkingDir=wbt3WorkingDir, wbt3DUpBuildNum=wbt3DUpBuildNum, wbt3UpDnLoadNum=wbt3UpDnLoadNum, wbt3UsersStatus=wbt3UsersStatus, wbt3Connection3=wbt3Connection3, wbt3RDPConnDomain=wbt3RDPConnDomain, password=password, old=old, wbt3DUpInfoEntry=wbt3DUpInfoEntry, wbt3RomTable=wbt3RomTable, wbt3TrapReqId=wbt3TrapReqId, wbt3secondaryWINSserverIPaddress=wbt3secondaryWINSserverIPaddress, wbt3ScreenTimeOut=wbt3ScreenTimeOut, wbt3TrapDHCPUpdDone=wbt3TrapDHCPUpdDone, wbt3com2Share=wbt3com2Share, wbt3SNMPupdate=wbt3SNMPupdate, wbt3UsersTable=wbt3UsersTable, wbt3privilege=wbt3privilege, wbt3FileServer=wbt3FileServer, wbt3RamIndex=wbt3RamIndex, wbt3AutoLoginEnable=wbt3AutoLoginEnable, wbt3DispUseDDC=wbt3DispUseDDC, 
workingDirectory=workingDirectory, wbt3DUpInfoIndex=wbt3DUpInfoIndex, iCActrlEsc=iCActrlEsc, wbt3CurOEM=wbt3CurOEM, wbt3TermConnTable=wbt3TermConnTable, wbt3TermConnName=wbt3TermConnName, defaultHotkeysEntry=defaultHotkeysEntry, wbt3EmulationMode=wbt3EmulationMode, iCADefaultHotkeys=iCADefaultHotkeys, wbt3AutoStart6=wbt3AutoStart6, wbt3DhcpUpdateInfo=wbt3DhcpUpdateInfo, wbt3TermConnScript=wbt3TermConnScript, wbt3Action=wbt3Action, wbt3ICAConnName=wbt3ICAConnName, wbt3Ram=wbt3Ram, wbt3DispColor=wbt3DispColor, wbt3DispFreqMax=wbt3DispFreqMax, wbt3=wbt3, setCommunity=setCommunity, wbt3DhcpInfoNum=wbt3DhcpInfoNum, iCActrlAltDel=iCActrlAltDel, wbt3TrapsInfo=wbt3TrapsInfo, wbt3AcceptReq=wbt3AcceptReq, wbt3NetworkTable=wbt3NetworkTable, wbt3ConnectionName=wbt3ConnectionName, wbt3UpDnLoadTimeFlag=wbt3UpDnLoadTimeFlag, wbt3ICAConnSoundQuality=wbt3ICAConnSoundQuality)
mibBuilder.exportSymbols("WYSE-MIB", wbt3TermConnIBM3270EmuModel=wbt3TermConnIBM3270EmuModel, wbt3userName=wbt3userName, wbt3secondaryDNSserverIPaddress=wbt3secondaryDNSserverIPaddress, wbt3DHCPoptionIDs=wbt3DHCPoptionIDs, wbt3ServerIP=wbt3ServerIP, wbt3AutoStart5=wbt3AutoStart5, wbt3RDPConnModifiable=wbt3RDPConnModifiable, wbt3RamEntry=wbt3RamEntry, wbt3IODevAttached=wbt3IODevAttached, wbt3TrapServers=wbt3TrapServers, wbt3password=wbt3password, wbt3Users=wbt3Users, wbt3TermConnTelnetName=wbt3TermConnTelnetName, wbt3PCCardNum=wbt3PCCardNum, wbt3MibRevMinor=wbt3MibRevMinor, wbt3FileRootPath=wbt3FileRootPath, wbt3CustomField2=wbt3CustomField2, wbt3defaultDomain=wbt3defaultDomain, wbt3RDPConnAutoLogon=wbt3RDPConnAutoLogon, wbt3UpDnLoadSrcFile=wbt3UpDnLoadSrcFile, wbt3TermConnEntry=wbt3TermConnEntry, wbt3TermConnCommType=wbt3TermConnCommType, commandLine=commandLine, wbt3TrapDHCPBuildMismatch=wbt3TrapDHCPBuildMismatch, wbt3UpDnLoadFileType=wbt3UpDnLoadFileType, wbt3UpDnLoadFServer=wbt3UpDnLoadFServer, wyse=wyse)

# File: aes_ecb.py (repo: tanupoo/lorawan-parser, license: MIT)
#
# a wrapper module for pycryptodome.
#
from Crypto.Cipher import AES
class AES_ECB():
def __init__(self, key):
"""
        key: 16 bytes of bytes/bytearray (an AES-128 key; pycryptodome also accepts 24- or 32-byte keys)
"""
self.aes_ecb = AES.new(key, AES.MODE_ECB)
def encrypt(self, data):
"""
        data: bytes/bytearray of any size; the final block is zero-padded to 16 bytes.
"""
blk_list = []
for i in range(0,len(data),16):
blk = data[i:i+16]
blk += b"\x00"*(16-len(blk))
blk_list.append(self.aes_ecb.encrypt(bytes(blk)))
return b"".join(blk_list)
def decrypt(self, enc_data):
"""
enc_data: in bytearray, must be multiple of 16.
"""
blk_list = []
for i in range(0,len(enc_data),16):
            blk_list.append(self.aes_ecb.decrypt(bytes(enc_data[i:i+16])))
return b"".join(blk_list)
def aes128_encrypt(key, plain_data):
"""
    One-time encrypter; used mainly for key generation.
key: in bytes.
plain_data: in bytes.
"""
cipher = AES_ECB(key)
return cipher.encrypt(plain_data)
def aes128_decrypt(key, enc_data):
"""
    One-time decrypter; used mainly for key generation.
    key: in bytes.
    enc_data: in bytes, must be a multiple of 16.
"""
cipher = AES_ECB(key)
return cipher.decrypt(enc_data)
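The `encrypt` method above zero-pads the final block to 16 bytes before handing each block to pycryptodome. That padding step can be checked in isolation without pycryptodome installed; the `pad_blocks` helper below is a hypothetical stand-in that reproduces only the block-splitting logic, not part of the original module:

```python
# Sketch of the zero-padding block logic used by AES_ECB.encrypt.
# An identity "cipher" is implied: each 16-byte chunk is returned as-is,
# so the sketch runs without the pycryptodome dependency.

def pad_blocks(data: bytes, block_size: int = 16) -> list:
    """Split data into block_size chunks, zero-padding the last one."""
    blocks = []
    for i in range(0, len(data), block_size):
        blk = data[i:i + block_size]
        blk += b"\x00" * (block_size - len(blk))  # pad short tail with NUL bytes
        blocks.append(bytes(blk))
    return blocks

padded = b"".join(pad_blocks(b"lorawan"))
print(len(padded))  # -> 16 (a 7-byte input becomes one full block)
```

Note that zero padding is not self-describing: a plaintext that legitimately ends in NUL bytes cannot be distinguished from padding, which is acceptable here because LoRaWAN key derivation always works on fixed 16-byte inputs.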

# File: mastering-flask/webapp/controllers/admin/__init__.py (repo: zhchnchn/flask-repo, license: Apache-2.0)
# -*- coding: utf-8 -*-
from flask_admin import BaseView, expose
from flask_admin.contrib.sqla import ModelView
from flask_admin.contrib.fileadmin import FileAdmin
from flask_login import login_required, current_user
from ...extensions import admin_permission
from .forms import CKTextAreaField
class CustomView(BaseView):
@expose('/')
@login_required
@admin_permission.require(http_exception=403)
def index(self):
return self.render('admin/custom.html')
@expose('/second_page')
@login_required
@admin_permission.require(http_exception=403)
def second_page(self):
return self.render('admin/second_page.html')
class CustomModelView(ModelView):
def is_accessible(self):
return current_user.is_authenticated and admin_permission.can()
class PostView(CustomModelView):
form_overrides = dict(text=CKTextAreaField)
column_searchable_list = ('text', 'title')
column_filters = ('publish_date', )
create_template = 'admin/post_edit.html'
edit_template = 'admin/post_edit.html'
class CustomFileAdmin(FileAdmin):
def is_accessible(self):
return current_user.is_authenticated and admin_permission.can()

# File: iadmin/shortcuts.py (repo: saxix/django-iadmin, license: BSD-1-Clause)
# -*- coding: utf-8 -*-
from guardian.admin import GuardedModelAdmin
from iadmin.options import IModelAdmin, IModelAdminMixin
class IGuardedModelAdmin(IModelAdminMixin, GuardedModelAdmin):
change_form_template = 'iadmin/guardian/model/change_form.html'
    obj_perms_manage_template = 'iadmin/guardian/model/obj_perms_manage.html'
    obj_perms_manage_user_template = 'iadmin/guardian/model/obj_perms_manage_user.html'
    obj_perms_manage_group_template = 'iadmin/guardian/model/obj_perms_manage_group.html'
# def get_urls(self):
# return

# File: src/cmp/cool_lang/ast/binary_node.py (repo: codestrange/cool-compiler-2020, license: MIT)
from .expresion_node import ExpressionNode
class BinaryNode(ExpressionNode):
def __init__(self, left: ExpressionNode, right: ExpressionNode, line: int, column: int):
super(BinaryNode, self).__init__(line, column)
self.left: ExpressionNode = left
self.right: ExpressionNode = right

# File: 02-boggle-solver/python/trienode.py (repo: SolangeUG/summer-of-code, license: Apache-2.0)
#!/usr/bin/env python3
class TrieNode(object):
"""
This class represents a node in a trie
"""
def __init__(self, text: str):
"""
Constructor: initialize a trie node with a given text string
:param text: text string at this trie node
"""
self.__text = text
self.__children = {}
self.__ends_word = False
def get_ends_word(self):
"""
Does this node end a word?
:return: True if this node ends a word, False otherwise
"""
return self.__ends_word
def set_ends_word(self, ends_word: bool):
"""
Set whether or not this node ends a word in a trie
:param ends_word: value determining whether this node ends a word
:return: None
"""
self.__ends_word = ends_word
def get_child(self, char: str):
"""
Return the child trie node that is found when you follow the link from the given character
:param char: the character in the key
:return: the trie node the given character links to, or None if that link is not in trie
"""
if char in self.__children.keys():
return self.__children[char]
else:
return None
def insert(self, char: str):
"""
Insert a character at this trie node
:param char: the character to be inserted
:return: the newly created trie node, or None if the character is already in the trie
"""
if char not in self.__children:
next_node = TrieNode(self.__text + char)
self.__children[char] = next_node
return next_node
else:
return None
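A short usage sketch of the `TrieNode` API above. A trimmed copy of the class is inlined (without the name-mangled attributes and docstrings) so the sketch is self-contained and runnable; `add_word` is a hypothetical helper, not part of the original file:

```python
# Trimmed, self-contained copy of the TrieNode shape for demonstration.
class TrieNode:
    def __init__(self, text):
        self.text = text          # prefix spelled out by the path to this node
        self.children = {}        # char -> child TrieNode
        self.ends_word = False    # True if a whole word ends here

    def get_child(self, char):
        return self.children.get(char)

    def insert(self, char):
        # Mirrors the original: returns the new node, or None if char exists.
        if char not in self.children:
            self.children[char] = TrieNode(self.text + char)
            return self.children[char]
        return None

def add_word(root, word):
    """Walk/extend the trie one character at a time; mark the final node."""
    node = root
    for ch in word:
        node = node.get_child(ch) or node.insert(ch)
    node.ends_word = True

root = TrieNode("")
for w in ("cat", "car"):
    add_word(root, w)

node = root.get_child("c").get_child("a").get_child("t")
print(node.text, node.ends_word)  # -> cat True
```

The `get_child(ch) or insert(ch)` idiom works because `insert` returns `None` exactly when the child already exists, so one of the two calls always yields the node to descend into.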

# File: src/erpbrasil/edoc/pdf/__init__.py (repo: rcbull/erpbrasil.edoc.pdf, license: MIT)
from lxml import etree
from lxml import objectify
__version__ = '0.1.1'
lookup = etree.ElementNamespaceClassLookup(
objectify.ObjectifyElementClassLookup())
parser = etree.XMLParser()
parser.set_element_class_lookup(lookup)

# File: pip_services3_components/count/ICounters.py (repo: pip-services-python/pip-services-components-python, license: MIT)
# -*- coding: utf-8 -*-
"""
pip_services3_components.counters.ICounters
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Interface for performance counters components.
:copyright: Conceptual Vision Consulting LLC 2018-2019, see AUTHORS for more details.
:license: MIT, see LICENSE for more details.
"""
import datetime
from abc import ABC
from pip_services3_components.count import CounterTiming
class ICounters(ABC):
"""
Interface for performance counters that measure execution metrics.
The performance counters measure how code is performing:
how fast or slow, how many transactions performed, how many objects
are stored, what was the latest transaction time and so on.
They are critical to monitor and improve performance, scalability and reliability of code in production.
"""
def begin_timing(self, name: str) -> CounterTiming:
"""
Begins measurement of execution time interval.
        It returns a :class:`CounterTiming <pip_services3_components.count.CounterTiming.CounterTiming>` object;
        call its :func:`CounterTiming.end_timing` method to end the measurement and update the counter.
:param name: a counter name of Interval type.
:return: a :class:`CounterTiming <pip_services3_components.count.CounterTiming.CounterTiming>` callback object to end timing.
"""
raise NotImplementedError('Method from interface definition')
def stats(self, name: str, value: float):
"""
Calculates min/average/max statistics based on the current and previous values.
:param name: a counter name of Statistics type
:param value: a value to update statistics
"""
raise NotImplementedError('Method from interface definition')
def last(self, name: str, value: float):
"""
Records the last calculated measurement value.
Usually this method is used by metrics calculated externally.
:param name: a counter name of Last type.
:param value: a last value to record.
"""
raise NotImplementedError('Method from interface definition')
def timestamp_now(self, name: str):
"""
Records the current time as a timestamp.
:param name: a counter name of Timestamp type.
"""
raise NotImplementedError('Method from interface definition')
def timestamp(self, name: str, value: datetime.datetime):
"""
Records the given timestamp.
:param name: a counter name of Timestamp type.
:param value: a timestamp to record.
"""
raise NotImplementedError('Method from interface definition')
def increment_one(self, name: str):
"""
Increments counter by 1.
:param name: a counter name of Increment type.
"""
raise NotImplementedError('Method from interface definition')
def increment(self, name: str, value: float):
"""
Increments counter by given value.
:param name: a counter name of Increment type.
:param value: a value to add to the counter.
"""
raise NotImplementedError('Method from interface definition')
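To illustrate the contract, here is a minimal in-memory sketch of a class satisfying the same method shapes, using only the standard library. `InMemoryCounters` and `_Timing` are hypothetical stand-ins for demonstration (the real library provides its own concrete counters); `_Timing` plays the role of the `CounterTiming` callback:

```python
import time

class _Timing:
    """Callback object returned by begin_timing; end_timing records elapsed ms."""
    def __init__(self, counters, name):
        self._counters = counters
        self._name = name
        self._start = time.monotonic()

    def end_timing(self):
        elapsed_ms = (time.monotonic() - self._start) * 1000.0
        self._counters.stats(self._name, elapsed_ms)

class InMemoryCounters:
    def __init__(self):
        self.values = {}  # name -> running stats dict, or raw value

    def begin_timing(self, name):
        return _Timing(self, name)

    def stats(self, name, value):
        # Running min/avg/max, as the interface's Statistics type describes.
        s = self.values.setdefault(
            name, {"min": value, "max": value, "count": 0, "avg": 0.0})
        s["min"] = min(s["min"], value)
        s["max"] = max(s["max"], value)
        s["count"] += 1
        s["avg"] += (value - s["avg"]) / s["count"]  # incremental mean

    def increment(self, name, value):
        self.values[name] = self.values.get(name, 0) + value

    def increment_one(self, name):
        self.increment(name, 1)

counters = InMemoryCounters()
counters.increment_one("calls")
counters.increment_one("calls")
timing = counters.begin_timing("elapsed")
timing.end_timing()
print(counters.values["calls"])  # -> 2
```

The interval pattern (`begin_timing` returning an object whose `end_timing` reports back) keeps the caller's code to two lines while letting the counters component own all aggregation.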