hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c2fa21376f4ea9c90a43a879dad95d4e499dca61 | 3,270 | py | Python | topology/protocols.py | di-unipi-socc/arch-miner | 702144699cf422f3ec710080231f5f8203017366 | [
"MIT"
] | 4 | 2021-11-10T09:11:44.000Z | 2022-02-28T14:45:46.000Z | topology/protocols.py | di-unipi-socc/arch-miner | 702144699cf422f3ec710080231f5f8203017366 | [
"MIT"
] | null | null | null | topology/protocols.py | di-unipi-socc/arch-miner | 702144699cf422f3ec710080231f5f8203017366 | [
"MIT"
] | 1 | 2021-11-10T09:12:16.000Z | 2021-11-10T09:12:16.000Z | from .errors import NoInfoError
import ipaddress
class IP:
def __init__(self, packet: dict):
self.senderIP = ipaddress.ip_address(packet['ip.src'])
self.receiverIP = ipaddress.ip_address(packet['ip.dst'])
self.senderHost = packet['ip.src_host']
self.receiverHost = packet['ip.dst_host']
def getSenderIP(self) -> ipaddress:
return self.senderIP
def getReceiverIP(self) -> ipaddress:
return self.receiverIP
def getSenderHost(self) -> str:
return self.senderHost
def getReceiverHost(self) -> str:
return self.receiverHost
class AMQP:
def __init__(self, packet: dict):
#AMQP 1.0
if 'amqp.length' in packet and 'amqp.doff' in packet and 'amqp.type' in packet:
if packet['amqp.type'] == '0' and packet['amqp.performative'] == '20':
self.channel = packet['amqp.channel']
self.more = bool(packet['amqp.method.arguments']['amqp.performative.arguments.more']) if 'amqp.performative.arguments.more' in packet['amqp.method.arguments'] else False
self.handle = packet['amqp.method.arguments']['amqp.performative.arguments.handle']
self.properties = packet['amqp.properties'] if 'amqp.properties' in packet else {}
self.applicationProperties = packet['amqp.applicationProperties'] if 'amqp.applicationProperties' in packet else {}
self.data = packet['amqp.data'] if 'amqp.data' in packet else ''
self.version = '1.0.0'
else:
raise NoInfoError('')
#AMQP 0.9.1
elif 'amqp.type' in packet and 'amqp.channel' in packet and 'amqp.length' in packet:
if packet['amqp.type'] == '1':
self.version = '0.9.1'
self.classID = packet['amqp.method.class']
self.methodID = packet['amqp.method.method']
else:
raise NoInfoError('')
else:
raise NoInfoError('')
def getVersion(self) -> str:
return self.version
def getClassID(self) -> int:
return int(self.classID)
def getMethodID(self) -> int:
return int(self.methodID)
def getChannel(self) -> int:
return int(self.channel)
def getMore(self) -> bool:
return self.more
def getHandle(self) -> int:
return int(self.handle)
def getProperties(self) -> dict:
return self.properties
def getApplicationProperties(self) -> dict:
return self.applicationProperties
def getData(self) -> str:
return self.data
class MQTT:
def __init__(self, packet: dict):
self.controlPacketType = int(packet['mqtt.hdrflags_tree']['mqtt.msgtype'])
def getControlPacketType(self) -> int:
return self.controlPacketType
class STOMP:
def __init__(self, packet: dict):
self.command = packet['stomp.command']
def getCommand(self) -> str:
return self.command
class HTTP:
def __init__(self, packet: dict):
self.xForwardedFor = packet['http.x_forwarded_for'].replace(' ', '').split(',')
def getXForwardedForHeader(self) -> []:
return self.xForwardedFor
| 31.442308 | 185 | 0.604587 | 361 | 3,270 | 5.401662 | 0.224377 | 0.061538 | 0.028205 | 0.04359 | 0.205641 | 0.127179 | 0.051282 | 0 | 0 | 0 | 0 | 0.006321 | 0.274312 | 3,270 | 103 | 186 | 31.747573 | 0.815424 | 0.005505 | 0 | 0.15493 | 0 | 0 | 0.159077 | 0.065538 | 0 | 0 | 0 | 0 | 0 | 1 | 0.295775 | false | 0 | 0.028169 | 0.225352 | 0.619718 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
6c02a1dcb7bcf68f321fdff494bf41d5e3090339 | 2,250 | py | Python | fairsearchdeltr/models.py | fair-search/fairsearchdeltr-python | 5bc2e8f0bf24ecd492978474919f512d3c0b65e1 | [
"Apache-2.0"
] | 3 | 2019-10-08T16:17:45.000Z | 2020-03-12T03:29:10.000Z | fairsearchdeltr/models.py | fair-search/fairsearchdeltr-python | 5bc2e8f0bf24ecd492978474919f512d3c0b65e1 | [
"Apache-2.0"
] | null | null | null | fairsearchdeltr/models.py | fair-search/fairsearchdeltr-python | 5bc2e8f0bf24ecd492978474919f512d3c0b65e1 | [
"Apache-2.0"
] | 1 | 2021-05-19T08:07:21.000Z | 2021-05-19T08:07:21.000Z | # -*- coding: utf-8 -*-
"""
fairsearchdeltr.models
~~~~~~~~~~~~~~~
This module contains the primary objects that power fairsearchdeltr.
"""
class TrainStep(object):
"""The :class:`TrainStep` object, which is a representation of the parameters in each step of the training.
Contains a `timestamp`, `omega`, `omega_gradient`, `loss`, `loss_standard`, `loss_exposure`.
TODO: is the name of the class OK?
"""
def __init__(self, timestamp, omega, omega_gradient, loss_standard, loss_exposure, loss):
"""
Object constructor
:param timestamp: timestamp of object creation
:param omega: current values of model
:param omega_gradient: calculated gradient
:param loss_standard: represents the change in ranking of training set vs predictions for training set
:param loss_exposure: represents the difference in exposures
:param loss: this should decrease at each iteration of training
"""
self.timestamp = timestamp
self.omega = omega
self.omega_gradient = omega_gradient
self.loss_standard = loss_standard
self.loss_exposure = loss_exposure
self.loss = loss
def __repr__(self):
return "<TrainStep [{0},{1},{2},{3},{4}]>".format(self.timestamp, self.omega, self.omega_gradient,
self.loss_standard, self.loss_exposure)
class FairScoreDoc(object):
"""The :class:`FairScoreDoc` object, which is a representation of the items in the rankings.
Contains an `id`, `is_protected` attribute, `features` and a `score`
"""
def __init__(self, id, is_protected, features, score):
self.id = id
self.score = score
self.features = features
self.is_protected = is_protected
def __repr__(self):
return "<FairScoreDoc [%s]>" % ("Protected" if self.is_protected else "Nonprotected")
class Query(object):
"""The :class:`FairScoreDoc` object, which is a representation of the items in the rankings.
Contains an `id`, `is_protected` attribute, `features` and a `score`
"""
def __init__(self, id, docs):
self.id = id
self.docs = docs | 38.135593 | 116 | 0.638667 | 266 | 2,250 | 5.236842 | 0.308271 | 0.055994 | 0.030151 | 0.030151 | 0.340991 | 0.223259 | 0.223259 | 0.199569 | 0.199569 | 0.199569 | 0 | 0.003593 | 0.257778 | 2,250 | 59 | 117 | 38.135593 | 0.830539 | 0.494222 | 0 | 0.173913 | 0 | 0 | 0.072277 | 0.021782 | 0 | 0 | 0 | 0.016949 | 0 | 1 | 0.217391 | false | 0 | 0 | 0.086957 | 0.434783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c0e7459e10509e1b31e1610fff26016d0b0831c | 571 | py | Python | bulletin_board_bot/services/user_service.py | t3m8ch/bulletin-board-bot | c76dd041fdfc6de55f96cd88bc7cf16a2aae30a6 | [
"MIT"
] | null | null | null | bulletin_board_bot/services/user_service.py | t3m8ch/bulletin-board-bot | c76dd041fdfc6de55f96cd88bc7cf16a2aae30a6 | [
"MIT"
] | null | null | null | bulletin_board_bot/services/user_service.py | t3m8ch/bulletin-board-bot | c76dd041fdfc6de55f96cd88bc7cf16a2aae30a6 | [
"MIT"
] | null | null | null | from abc import abstractmethod, ABC
from bulletin_board_bot.models import AdModel
from bulletin_board_bot.models.user import UserModel
from bulletin_board_bot.services.base_service import BaseService
class BaseUserService(BaseService, ABC):
@abstractmethod
async def register_user(self, telegram_id: int) -> UserModel:
pass
@abstractmethod
async def add_ad_to_favorites(self, ad: AdModel, user_telegram_id: int):
pass
@abstractmethod
async def remove_ad_from_favorites(self, ad: AdModel, user_telegram_id: int):
pass
| 27.190476 | 81 | 0.763573 | 74 | 571 | 5.635135 | 0.405405 | 0.086331 | 0.122302 | 0.143885 | 0.330935 | 0.206235 | 0.206235 | 0.206235 | 0.206235 | 0 | 0 | 0 | 0.176883 | 571 | 20 | 82 | 28.55 | 0.887234 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.214286 | 0.285714 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6c103f567622f823c52e4cd49ce7211181ffdde5 | 6,538 | py | Python | nimismies/models.py | kimvais/nimismies | 47bdd7a9790dfc76b5a9402c0f9f888a83873be0 | [
"MIT"
] | 1 | 2018-06-25T19:31:16.000Z | 2018-06-25T19:31:16.000Z | nimismies/models.py | kimvais/nimismies | 47bdd7a9790dfc76b5a9402c0f9f888a83873be0 | [
"MIT"
] | null | null | null | nimismies/models.py | kimvais/nimismies | 47bdd7a9790dfc76b5a9402c0f9f888a83873be0 | [
"MIT"
] | null | null | null | #
# -*- coding: utf-8 -*-
#
# Copyright © 2013 Kimmo Parviainen-Jalanko <k@77.fi>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import datetime
import M2Crypto
from django.contrib.auth.models import AbstractBaseUser, BaseUserManager
from django.db import models
from . import fields
from utils import generate_key_fingerprint
class UserManager(BaseUserManager):
def create_user(self, email, dn, password=None):
"""
Creates and saves a User with the given email, date of
birth and password.
"""
if not email:
raise ValueError('Users must have an email address')
user = self.model(
email=email,
dn=dn,
)
user.set_password(password)
user.save(using=self._db)
return user
class User(AbstractBaseUser):
dn = models.CharField(max_length=1024, unique=True, db_index=True)
email = models.EmailField(
verbose_name='email address',
max_length=255,
unique=True,
db_index=True,
)
USERNAME_FIELD = 'email'
objects = UserManager()
is_staff = True
def get_full_name(self):
return self.dn
def get_short_name(self):
return self.email
class PrivateKey(models.Model):
owner = models.ForeignKey('nimismies.User')
data = fields.EncryptedCharField(null=False,
passphrase_setting='SECRET_KEY',
max_length=8192)
public_key = models.TextField()
key_type = models.CharField(max_length=16)
created = models.DateTimeField(default=datetime.datetime.utcnow)
def get_m2_key(self, md='sha1'):
if not self.data:
return None
if self.key_type != 'rsa':
raise RuntimeWarning('Not a RSA key')
_rsa_key = M2Crypto.RSA.load_key_string(self.data)
key = M2Crypto.EVP.PKey(md=md)
key.assign_rsa(_rsa_key)
return key
def __unicode__(self):
try:
return '{bits}-bit {key_type} key #{id} for <{owner}>'.format(
owner=self.owner, bits=self.bits, id=self.pk,
key_type=self.key_type)
except:
return "Private Key #" + str(self.pk)
def __init__(self, *args, **kwargs):
super(PrivateKey, self).__init__(*args, **kwargs)
self.key = self.get_m2_key()
@property
def bits(self):
return self.key.size() * 8
class Certificate(models.Model):
owner = models.ForeignKey('nimismies.User', null=True)
_issuer = models.ForeignKey('nimismies.Certificate',
null=True) # null means self-signed
data = models.TextField(null=False)
created = models.DateTimeField(default=datetime.datetime.utcnow)
private_key = models.ForeignKey('nimismies.PrivateKey', null=True)
def __unicode__(self):
return '{0} signed by {1}'.format(self.subject, self.issuer)
def get_m2_certificate(self):
return M2Crypto.X509.load_cert_string(str(self.data))
def __init__(self, *args, **kwargs):
super(Certificate, self).__init__(*args, **kwargs)
if self.data:
self.m2_certificate = self.get_m2_certificate()
else:
self.m2_certificate = None
@property
def issuer(self):
return self.m2_certificate.get_issuer().as_text()
@property
def subject(self):
return self.m2_certificate.get_subject().as_text()
@property
def serial(self):
return self.m2_certificate.get_serial_number()
@property
def sha1_fingerprint(self):
return self.m2_certificate.get_fingerprint(md='sha1')
@property
def sha2_fingerprint(self):
return self.m2_certificate.get_fingerprint(md='sha256')
@property
def validity(self):
return (self.m2_certificate.get_not_before().get_datetime(),
self.m2_certificate.get_not_after().get_datetime())
@property
def key_id(self):
return generate_key_fingerprint(self.m2_certificate.get_pubkey().as_der())
class CertificateSigningRequest(models.Model):
owner = models.ForeignKey('nimismies.User', null=True)
created = models.DateTimeField(default=datetime.datetime.utcnow)
data = models.TextField(null=False)
subject = models.CharField(max_length=1024)
status = models.CharField(max_length=32, default="new")
private_key = models.ForeignKey('nimismies.PrivateKey', null=True)
def __init__(self, *args, **kwargs):
super(CertificateSigningRequest, self).__init__(*args, **kwargs)
if self.data:
bio = M2Crypto.BIO.MemoryBuffer(data=str(self.data))
self.m2csr = M2Crypto.X509.load_request_bio(bio)
self.subject = self.m2csr.get_subject().as_text()
@classmethod
def from_pem(cls, data):
self = cls()
self.data = data
bio = M2Crypto.BIO.MemoryBuffer(data=str(self.data))
self.m2csr = M2Crypto.X509.load_request_bio(bio)
self.subject = self.m2csr.get_subject().as_text()
return self
@property
def public_key(self):
return self.m2csr.get_pubkey().as_pem()
@property
def key_id(self):
return generate_key_fingerprint(self.m2csr.get_pubkey().as_der())
@classmethod
def from_pem(cls, data):
csr = cls(data=data)
return csr
class CASerial(models.Model):
subject = models.CharField(max_length=1024, unique=True)
serial_number = models.IntegerField(default=0)
| 32.854271 | 82 | 0.665647 | 824 | 6,538 | 5.123786 | 0.293689 | 0.03316 | 0.03316 | 0.037897 | 0.343202 | 0.307674 | 0.238276 | 0.158219 | 0.158219 | 0.081004 | 0 | 0.015569 | 0.233711 | 6,538 | 198 | 83 | 33.020202 | 0.826946 | 0.182472 | 0 | 0.281481 | 0 | 0 | 0.051219 | 0.003969 | 0 | 0 | 0 | 0 | 0 | 1 | 0.162963 | false | 0.022222 | 0.044444 | 0.103704 | 0.577778 | 0.051852 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
6c2bf5a72cfc4cc3b0dda7408bf5305d5bbaaee1 | 71,585 | py | Python | src/emuvim/test/unittests/test_openstack.py | PedroPCardoso/fogbed | 11d9c8ce6ccd32ee71fbb77d719cc322dd9515da | [
"Apache-2.0"
] | null | null | null | src/emuvim/test/unittests/test_openstack.py | PedroPCardoso/fogbed | 11d9c8ce6ccd32ee71fbb77d719cc322dd9515da | [
"Apache-2.0"
] | null | null | null | src/emuvim/test/unittests/test_openstack.py | PedroPCardoso/fogbed | 11d9c8ce6ccd32ee71fbb77d719cc322dd9515da | [
"Apache-2.0"
] | null | null | null | """
Copyright (c) 2015 SONATA-NFV
ALL RIGHTS RESERVED.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Neither the name of the SONATA-NFV [, ANY ADDITIONAL AFFILIATION]
nor the names of its contributors may be used to endorse or promote
products derived from this software without specific prior written
permission.
This work has been performed in the framework of the SONATA project,
funded by the European Commission under Grant number 671517 through
the Horizon 2020 and 5G-PPP programmes. The authors would like to
acknowledge the contributions of their colleagues of the SONATA
partner consortium (www.sonata-nfv.eu).
"""
"""
Test suite to automatically test emulator REST API endpoints.
"""
import os
import unittest
import requests
import simplejson as json
import time
from emuvim.test.api_base_openstack import ApiBaseOpenStack
class testRestApi(ApiBaseOpenStack):
"""
Tests to check the REST API endpoints of the emulator.
"""
def setUp(self):
# create network
self.createNet(nswitches=3, ndatacenter=2, nhosts=2, ndockers=0, autolinkswitches=True)
# setup links
self.net.addLink(self.dc[0], self.h[0])
self.net.addLink(self.h[1], self.dc[1])
self.net.addLink(self.dc[0], self.dc[1])
# start api
self.startApi()
# start Mininet network
self.startNet()
@unittest.skip("temporarily disabled")
def testChainingDummy(self):
print('->>>>>>> test Chaining Class->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_template_create_stack = open(os.path.join(os.path.dirname(__file__), "test_heatapi_template_chaining.json")).read()
url = "http://0.0.0.0:8004/v1/tenantabc123/stacks"
requests.post(url, data=json.dumps(json.loads(test_heatapi_template_create_stack)), headers=headers)
print('->>>>>>> test Chaining Versions ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/"
listapiversionstackresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversionstackresponse.status_code, 200)
self.assertEqual(json.loads(listapiversionstackresponse.content)["versions"][0]["id"], "v1")
print(" ")
print('->>>>>>> test Chaining List ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
self.assertEqual(json.loads(chainlistresponse.content)["chains"], [])
print(" ")
print('->>>>>>> test Loadbalancing List ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/list"
lblistresponse = requests.get(url, headers=headers)
self.assertEqual(lblistresponse.status_code, 200)
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"], [])
print(" ")
testchain = "dc0_s1_firewall1/fire-out-0/dc0_s1_iperf1/iper-in-0"
print('->>>>>>> test Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" %(testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 200)
self.assertGreaterEqual(json.loads(chainvnfresponse.content)["cookie"], 0)
print(" ")
print('->>>>>>> test Chaining List ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["dst_vnf"], "dc0_s1_firewall1")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["dst_intf"], "fire-out-0")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["src_vnf"], "dc0_s1_iperf1")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["src_intf"], "iper-in-0")
print(" ")
print('->>>>>>> test Chain VNF Delete Chain ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
deletechainvnfresponse = requests.delete(url)
self.assertEqual(deletechainvnfresponse.status_code, 200)
self.assertEqual(deletechainvnfresponse.content, "true")
print(" ")
print('->>>>>>> test Chaining List If Empty Again ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
self.assertEqual(json.loads(chainlistresponse.content)["chains"], [])
print(" ")
testchain = "dc0_s1_firewall1/fire-out-0/dc0_s1_iperf1/iper-in-0"
print('->>>>>>> test Stack Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
stackchainvnfresponse = requests.post(url, data=json.dumps(json.loads('{"path":["dc1.s1", "s1","s2","s3","s1","dc1.s1"]}')), headers=headers)
self.assertEqual(stackchainvnfresponse.status_code, 200)
print (stackchainvnfresponse.content)
self.assertGreaterEqual(json.loads(stackchainvnfresponse.content)["cookie"], 0)
print(" ")
print('->>>>>>> test Stack Chaining List ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
print (chainlistresponse.content)
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["dst_vnf"], "dc0_s1_firewall1")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["dst_intf"], "fire-out-0")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["src_vnf"], "dc0_s1_iperf1")
self.assertEqual(json.loads(chainlistresponse.content)["chains"][0]["src_intf"], "iper-in-0")
self.assertItemsEqual(json.loads(chainlistresponse.content)["chains"][0]["path"],['dc1.s1', 's1', 's2', 's3', 's1', 'dc1.s1'])
print(" ")
print('->>>>>>> test Chain VNF Delete Chain ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
deletechainvnfresponse = requests.delete(url)
self.assertEqual(deletechainvnfresponse.status_code, 200)
self.assertEqual(deletechainvnfresponse.content, "true")
print(" ")
print('->>>>>>> test Chaining List If Empty Again ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
self.assertEqual(json.loads(chainlistresponse.content)["chains"], [])
print(" ")
testchain = "dc0_s1_firewall1/non-existing-interface/dc0_s1_iperf1/iper-in-0"
print('->>>>>>> test Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 501)
print(" ")
testchain = "dc0_s1_firewall1/fire-out-0/dc0_s1_iperf1/non-existing-interface"
print('->>>>>>> test Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 501)
print(" ")
testchain = "dc0_s1_firewall1/non-existing-interface/dc0_s1_iperf1/iper-in-0"
print('->>>>>>> test Chain VNF Delete Chain ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
deletechainvnfresponse = requests.delete(url)
self.assertEqual(deletechainvnfresponse.status_code, 501)
print(" ")
testchain = "dc0_s1_firewall1/fire-out-0/dc0_s1_iperf1/non-existing-interface"
print('->>>>>>> test Chain VNF Delete Chain ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
deletechainvnfresponse = requests.delete(url)
self.assertEqual(deletechainvnfresponse.status_code, 501)
print(" ")
testchain = "non-existent-dc/s1/firewall1/firewall1:cp03:output/dc0/s1/iperf1/iperf1:cp02:input"
print('->>>>>>> test Chain VNF Non-Existing DC ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 500)
print(" ")
testchain = "dc0/s1/firewall1/non-existent:cp03:output/dc0/s1/iperf1/iperf1:cp02:input"
print('->>>>>>> test Chain VNF Non-Existing Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 500)
print(" ")
testchain = "dc0/s1/firewall1/firewall1:cp03:output/dc0/s1/iperf1/non-existent:cp02:input"
print('->>>>>>> test Chain VNF Non-Existing Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
self.assertEqual(chainvnfresponse.status_code, 500)
print(" ")
testchain = "dc0/s1/firewall1/firewall1:cp03:output/dc0/s1/iperf1/iperf1:cp02:input"
print('->>>>>>> test Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
chainvnfresponse = requests.put(url)
print (chainvnfresponse.content)
self.assertEqual(chainvnfresponse.status_code, 200)
self.assertGreaterEqual(json.loads(chainvnfresponse.content)["cookie"], 0)
print(" ")
print('->>>>>>> test Chain VNF Delete Chain ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
deletechainvnfresponse = requests.delete(url)
self.assertEqual(deletechainvnfresponse.status_code, 200)
self.assertEqual(deletechainvnfresponse.content, "true")
print(" ")
print('->>>>>>> test Chaining List If Empty Again ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/list"
chainlistresponse = requests.get(url, headers=headers)
self.assertEqual(chainlistresponse.status_code, 200)
self.assertEqual(json.loads(chainlistresponse.content)["chains"], [])
print(" ")
testchain = "dc0/s1/firewall1/firewall1:cp03:output/dc0/s1/iperf1/iperf1:cp02:input"
print('->>>>>>> test Stack Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
stackchainvnfresponse = requests.post(url, data=json.dumps(
json.loads('{"path":["dc1.s1", "s1","s2","s3","s1","dc1.s1"]}')), headers=headers)
self.assertEqual(stackchainvnfresponse.status_code, 200)
print (stackchainvnfresponse.content)
self.assertGreaterEqual(json.loads(stackchainvnfresponse.content)["cookie"], 0)
print(" ")
testchain = "dc0/s1/firewall1/firewall1:cp03:output/dc0/s1/iperf1/iperf1:cp02:input"
print('->>>>>>> test Stack Chain VNF Interfaces ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/chain/%s" % (testchain)
stackchainvnfresponse = requests.delete(url, headers=headers)
self.assertEqual(stackchainvnfresponse.status_code, 200)
print(" ")
print('->>>>>>> test Loadbalancing ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0/s1/firewall1/firewall1:cp03:output"
lblistresponse = requests.post(url, data=json.dumps(
{"dst_vnf_interfaces":[{"pop":"dc0","stack":"s1","server":"iperf1","port":"iperf1:cp02:input"}]})
, headers=headers)
print (lblistresponse.content)
self.assertEqual(lblistresponse.status_code, 200)
self.assertIn("dc0_s1_firewall1:fire-out-0", lblistresponse.content)
print(" ")
print('->>>>>>> test Loadbalancing List ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/list"
lblistresponse = requests.get(url, headers=headers)
self.assertEqual(lblistresponse.status_code, 200)
print (lblistresponse.content )
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["paths"][0]["dst_vnf"], "dc0_s1_iperf1")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["paths"][0]["dst_intf"], "iper-in-0")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["src_vnf"], "dc0_s1_firewall1")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["src_intf"],"fire-out-0")
print(" ")
print('->>>>>>> test delete Loadbalancing ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0/s1/firewall1/firewall1:cp03:output"
lbdeleteresponse = requests.delete(url, headers=headers)
print (lbdeleteresponse.content)
self.assertEqual(lbdeleteresponse.status_code, 200)
print(" ")
print('->>>>>>> testFloatingLoadbalancer ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0/floating/bla/blubb"
lblistresponse = requests.post(url, data=json.dumps(
{"dst_vnf_interfaces":[{"pop":"dc0","stack":"s1","server":"iperf1","port":"iperf1:cp02:input"}]})
, headers=headers)
print (lblistresponse.content)
self.assertEqual(lblistresponse.status_code, 200)
resp = json.loads(lblistresponse.content)
self.assertIsNotNone(resp.get('cookie'))
self.assertIsNotNone(resp.get('floating_ip'))
cookie = resp.get('cookie')
print(" ")
print('->>>>>>> testDeleteFloatingLoadbalancer ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0/floating/%s/blubb" % cookie
lblistresponse = requests.delete(url, headers=headers)
print (lblistresponse.content)
self.assertEqual(lblistresponse.status_code, 200)
print(" ")
print('->>>>>>> testLoadbalancingCustomPath ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0_s1_firewall1/fire-out-0"
lblistresponse = requests.post(url, data=json.dumps(
{"dst_vnf_interfaces":{"dc0_s1_iperf1":"iper-in-0"},
"path": {"dc0_s1_iperf1": {"iper-in-0": ["dc1.s1", "s1","s2","s3","s1","dc1.s1"]}}}), headers=headers)
print (lblistresponse.content)
self.assertEqual(lblistresponse.status_code, 200)
self.assertIn("dc0_s1_firewall1:fire-out-0", lblistresponse.content)
print(" ")
print('->>>>>>> testLoadbalancingListCustomPath ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/list"
lblistresponse = requests.get(url, headers=headers)
self.assertEqual(lblistresponse.status_code, 200)
print (lblistresponse.content )
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["paths"][0]["dst_vnf"], "dc0_s1_iperf1")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["paths"][0]["dst_intf"], "iper-in-0")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["paths"][0]["path"],
["dc1.s1", "s1","s2","s3","s1","dc1.s1"] )
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["src_vnf"], "dc0_s1_firewall1")
self.assertEqual(json.loads(lblistresponse.content)["loadbalancers"][0]["src_intf"],"fire-out-0")
print(" ")
print('->>>>>>> test Delete Loadbalancing ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/lb/dc0_s1_firewall1/fire-out-0"
lblistresponse = requests.delete(url, headers=headers)
self.assertEqual(lblistresponse.status_code, 200)
print(" ")
print('->>>>>>> test Query Topology ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:4000/v1/topo"
topolistresponse = requests.get(url, headers=headers)
print(topolistresponse.content)
self.assertEqual(topolistresponse.status_code, 200)
print(" ")
def testNovaDummy(self):
print('->>>>>>> test Nova Dummy Class->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_template_create_stack = open(os.path.join(os.path.dirname(__file__), "test_heatapi_template_create_stack.json")).read()
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
requests.post(url, data=json.dumps(json.loads(test_heatapi_template_create_stack)),
headers=headers)
print('->>>>>>> test Nova List Versions ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/"
listapiversionnovaresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversionnovaresponse.status_code, 200)
self.assertEqual(json.loads(listapiversionnovaresponse.content)["versions"][0]["id"], "v2.1")
self.assertEqual(json.loads(listapiversionnovaresponse.content)["versions"][0]["status"], "CURRENT")
self.assertEqual(json.loads(listapiversionnovaresponse.content)["versions"][0]["version"], "2.38")
self.assertEqual(json.loads(listapiversionnovaresponse.content)["versions"][0]["min_version"], "2.1")
self.assertEqual(json.loads(listapiversionnovaresponse.content)["versions"][0]["updated"], "2013-07-23T11:33:21Z")
print(" ")
print('->>>>>>> test Nova Version Show ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla"
listapiversion21novaresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversion21novaresponse.status_code, 200)
self.assertEqual(json.loads(listapiversion21novaresponse.content)["version"]["id"], "v2.1")
self.assertEqual(json.loads(listapiversion21novaresponse.content)["version"]["status"], "CURRENT")
self.assertEqual(json.loads(listapiversion21novaresponse.content)["version"]["version"], "2.38")
self.assertEqual(json.loads(listapiversion21novaresponse.content)["version"]["min_version"], "2.1")
self.assertEqual(json.loads(listapiversion21novaresponse.content)["version"]["updated"], "2013-07-23T11:33:21Z")
print(" ")
print('->>>>>>> test Nova Version List Server APIs ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers"
listserverapisnovaresponse = requests.get(url, headers=headers)
self.assertEqual(listserverapisnovaresponse.status_code, 200)
self.assertNotEqual(json.loads(listserverapisnovaresponse.content)["servers"][0]["name"], "")
print(" ")
print('->>>>>>> test Nova Delete Server APIs ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/%s" % (json.loads(listserverapisnovaresponse.content)["servers"][0]["id"])
deleteserverapisnovaresponse = requests.delete(url, headers=headers)
self.assertEqual(deleteserverapisnovaresponse.status_code, 204)
print(" ")
print('->>>>>>> test Nova Delete Non-Existing Server APIs ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/non-existing-ix"
deleteserverapisnovaresponse = requests.delete(url, headers=headers)
self.assertEqual(deleteserverapisnovaresponse.status_code, 404)
print(" ")
print('->>>>>>> testNovaVersionListServerAPIs_withPortInformation ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/andPorts"
listserverapisnovaresponse = requests.get(url, headers=headers)
self.assertEqual(listserverapisnovaresponse.status_code, 200)
self.assertNotEqual(json.loads(listserverapisnovaresponse.content)["servers"][0]["name"], "")
print(" ")
print('->>>>>>> test Nova List Flavors ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/flavors"
listflavorsresponse = requests.get(url, headers=headers)
self.assertEqual(listflavorsresponse.status_code, 200)
self.assertIn(json.loads(listflavorsresponse.content)["flavors"][0]["name"], ["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
self.assertIn(json.loads(listflavorsresponse.content)["flavors"][1]["name"], ["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
self.assertIn(json.loads(listflavorsresponse.content)["flavors"][2]["name"], ["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
print(" ")
print('->>>>>>> testNovaAddFlavors ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/flavors"
addflavorsresponse = requests.post(url,
data='{"flavor":{"name": "testFlavor", "vcpus": "test_vcpus", "ram": 1024, "disk": 10}}',
headers=headers)
self.assertEqual(addflavorsresponse.status_code, 200)
self.assertIsNotNone(json.loads(addflavorsresponse.content)["flavor"]["id"])
self.assertIsNotNone(json.loads(addflavorsresponse.content)["flavor"]["links"][0]['href'])
print(" ")
print('->>>>>>> test Nova List Flavors Detail ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/flavors/detail"
listflavorsdetailresponse = requests.get(url, headers=headers)
self.assertEqual(listflavorsdetailresponse.status_code, 200)
self.assertIn(json.loads(listflavorsdetailresponse.content)["flavors"][0]["name"],["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
self.assertIn(json.loads(listflavorsdetailresponse.content)["flavors"][1]["name"],["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
self.assertIn(json.loads(listflavorsdetailresponse.content)["flavors"][2]["name"],["m1.nano", "m1.tiny", "m1.micro", "m1.small"])
print(" ")
print('->>>>>>> testNovaAddFlavors ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/flavors/detail"
addflavorsresponse = requests.post(url,
data='{"flavor":{"name": "testFlavor", "vcpus": "test_vcpus", "ram": 1024, "disk": 10}}',
headers=headers)
self.assertEqual(addflavorsresponse.status_code, 200)
self.assertIsNotNone(json.loads(addflavorsresponse.content)["flavor"]["id"])
self.assertIsNotNone(json.loads(addflavorsresponse.content)["flavor"]["links"][0]['href'])
print(" ")
print('->>>>>>> test Nova List Flavor By Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/flavors/%s" % (json.loads(listflavorsdetailresponse.content)["flavors"][0]["name"])
listflavorsbyidresponse = requests.get(url, headers=headers)
self.assertEqual(listflavorsbyidresponse.status_code, 200)
self.assertEqual(json.loads(listflavorsbyidresponse.content)["flavor"]["id"], json.loads(listflavorsdetailresponse.content)["flavors"][0]["id"])
print(" ")
print('->>>>>>> test Nova List Images ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/images"
listimagesresponse = requests.get(url, headers=headers)
self.assertEqual(listimagesresponse.status_code, 200)
print(listimagesresponse.content)
# deactivated: highly depends on the environment in which the tests are executed. one cannot make such an assumption.
#self.assertIn(json.loads(listimagesresponse.content)["images"][0]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
#self.assertIn(json.loads(listimagesresponse.content)["images"][1]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
#self.assertIn(json.loads(listimagesresponse.content)["images"][2]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
print(" ")
print('->>>>>>> test Nova List Images Details ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/images/detail"
listimagesdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(listimagesdetailsresponse.status_code, 200)
# deactivated: highly depends on the environment in which the tests are executed. one cannot make such an assumption.
#self.assertIn(json.loads(listimagesdetailsresponse.content)["images"][0]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
#self.assertIn(json.loads(listimagesdetailsresponse.content)["images"][1]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
#self.assertIn(json.loads(listimagesdetailsresponse.content)["images"][2]["name"],["google/cadvisor:latest", "ubuntu:trusty", "prom/pushgateway:latest"])
self.assertEqual(json.loads(listimagesdetailsresponse.content)["images"][0]["metadata"]["architecture"],"x86_64")
print(" ")
print('->>>>>>> test Nova List Image By Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/images/%s" % (json.loads(listimagesdetailsresponse.content)["images"][0]["id"])
listimagebyidresponse = requests.get(url, headers=headers)
self.assertEqual(listimagebyidresponse.status_code, 200)
self.assertEqual(json.loads(listimagebyidresponse.content)["image"]["id"],json.loads(listimagesdetailsresponse.content)["images"][0]["id"])
print(" ")
print('->>>>>>> test Nova List Image By Non-Existend Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/images/non_existing_id"
listimagebynonexistingidresponse = requests.get(url, headers=headers)
self.assertEqual(listimagebynonexistingidresponse.status_code, 404)
print(" ")
#find ubuntu id
for image in json.loads(listimagesresponse.content)["images"]:
if image["name"] == "ubuntu:trusty":
ubuntu_image_id = image["id"]
print('->>>>>>> test Nova Create Server Instance ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers"
data = '{"server": {"name": "X", "flavorRef": "%s", "imageRef":"%s"}}' % (json.loads(listflavorsresponse.content)["flavors"][0]["id"], ubuntu_image_id)
createserverinstance = requests.post(url, data=data, headers=headers)
self.assertEqual(createserverinstance.status_code, 200)
self.assertEqual(json.loads(createserverinstance.content)["server"]["image"]["id"], ubuntu_image_id)
print(" ")
print('->>>>>>> test Nova Create Server Instance With Already Existing Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers"
data = '{"server": {"name": "X", "flavorRef": "%s", "imageRef":"%s"}}' % (json.loads(listflavorsresponse.content)["flavors"][0]["id"], ubuntu_image_id)
createserverinstance = requests.post(url, data=data, headers=headers)
self.assertEqual(createserverinstance.status_code, 409)
print(" ")
print('->>>>>>> test Nova Version List Server APIs Detailed ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/detail"
listserverapisdetailedresponse = requests.get(url, headers=headers)
self.assertEqual(listserverapisdetailedresponse.status_code, 200)
self.assertEqual(json.loads(listserverapisdetailedresponse.content)["servers"][0]["status"], "ACTIVE")
print(" ")
print('->>>>>>> test Nova Show Server Details ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/%s" % (json.loads(listserverapisdetailedresponse.content)["servers"][0]["id"])
listserverdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(listserverdetailsresponse.status_code, 200)
self.assertEqual(json.loads(listserverdetailsresponse.content)["server"]["flavor"]["links"][0]["rel"], "bookmark")
print(" ")
print('->>>>>>> test Nova Show Non-Existing Server Details ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/non_existing_server_id"
listnonexistingserverdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(listnonexistingserverdetailsresponse.status_code, 404)
print(" ")
def testNeutronDummy(self):
print('->>>>>>> test Neutron Dummy Class->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_template_create_stack = open(os.path.join(os.path.dirname(__file__), "test_heatapi_template_create_stack.json")).read()
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
requests.post(url, data=json.dumps(json.loads(test_heatapi_template_create_stack)), headers=headers)
# test_heatapi_keystone_get_token = open("test_heatapi_keystone_get_token.json").read()
print('->>>>>>> test Neutron List Versions ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/"
listapiversionstackresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversionstackresponse.status_code, 200)
self.assertEqual(json.loads(listapiversionstackresponse.content)["versions"][0]["id"], "v2.0")
print(" ")
print('->>>>>>> test Neutron Show API v2.0 ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0"
listapiversionv20response = requests.get(url, headers=headers)
self.assertEqual(listapiversionv20response.status_code, 200)
self.assertEqual(json.loads(listapiversionv20response.content)["resources"][0]["name"], "subnet")
self.assertEqual(json.loads(listapiversionv20response.content)["resources"][1]["name"], "network")
self.assertEqual(json.loads(listapiversionv20response.content)["resources"][2]["name"], "ports")
print(" ")
print('->>>>>>> test Neutron List Networks ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks"
listnetworksesponse1 = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse1.status_code, 200)
self.assertEqual(json.loads(listnetworksesponse1.content)["networks"][0]["status"], "ACTIVE")
listNetworksId = json.loads(listnetworksesponse1.content)["networks"][0]["id"]
listNetworksName = json.loads(listnetworksesponse1.content)["networks"][0]["name"]
listNetworksId2 = json.loads(listnetworksesponse1.content)["networks"][1]["id"]
print(" ")
print('->>>>>>> test Neutron List Non-Existing Networks ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks?name=non_existent_network_name"
listnetworksesponse2 = requests.get(url,headers=headers)
self.assertEqual(listnetworksesponse2.status_code, 404)
print(" ")
print('->>>>>>> test Neutron List Networks By Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks?name=" + listNetworksName #tcpdump-vnf:input:net:9df6a98f-9e11-4cb7-b3c0-InAdUnitTest
listnetworksesponse3 = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse3.status_code, 200)
self.assertEqual(json.loads(listnetworksesponse3.content)["networks"][0]["name"], listNetworksName)
print(" ")
print('->>>>>>> test Neutron List Networks By Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks?id=" + listNetworksId # tcpdump-vnf:input:net:9df6a98f-9e11-4cb7-b3c0-InAdUnitTest
listnetworksesponse4 = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse4.status_code, 200)
self.assertEqual(json.loads(listnetworksesponse4.content)["networks"][0]["id"], listNetworksId)
print(" ")
print('->>>>>>> test Neutron List Networks By Multiple Ids ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks?id=" + listNetworksId + "&id="+ listNetworksId2 # tcpdump-vnf:input:net:9df6a98f-9e11-4cb7-b3c0-InAdUnitTest
listnetworksesponse5 = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse5.status_code, 200)
self.assertEqual(json.loads(listnetworksesponse5.content)["networks"][0]["id"], listNetworksId)
self.assertEqual(json.loads(listnetworksesponse5.content)["networks"][1]["id"], listNetworksId2)
print(" ")
print('->>>>>>> test Neutron Show Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks/"+listNetworksId
shownetworksesponse = requests.get(url, headers=headers)
self.assertEqual(shownetworksesponse.status_code, 200)
self.assertEqual(json.loads(shownetworksesponse.content)["network"]["status"], "ACTIVE")
print(" ")
print('->>>>>>> test Neutron Show Network Non-ExistendNetwork ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks/non_existent_network_id"
shownetworksesponse2 = requests.get(url, headers=headers)
self.assertEqual(shownetworksesponse2.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Create Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks"
createnetworkresponse = requests.post(url, data='{"network": {"name": "sample_network","admin_state_up": true}}', headers=headers)
self.assertEqual(createnetworkresponse.status_code, 201)
self.assertEqual(json.loads(createnetworkresponse.content)["network"]["status"], "ACTIVE")
print(" ")
print('->>>>>>> test Neutron Create Network With Existing Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks"
createnetworkresponsefailure = requests.post(url,data='{"network": {"name": "sample_network","admin_state_up": true}}',headers=headers)
self.assertEqual(createnetworkresponsefailure.status_code, 400)
print(" ")
print('->>>>>>> test Neutron Update Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks/%s" % (json.loads(createnetworkresponse.content)["network"]["id"])
updatenetworkresponse = requests.put(url, data='{"network": {"status": "ACTIVE", "admin_state_up":true, "tenant_id":"abcd123", "name": "sample_network_new_name", "shared":false}}' , headers=headers)
self.assertEqual(updatenetworkresponse.status_code, 200)
self.assertEqual(json.loads(updatenetworkresponse.content)["network"]["name"], "sample_network_new_name")
self.assertEqual(json.loads(updatenetworkresponse.content)["network"]["tenant_id"], "abcd123")
print(" ")
print('->>>>>>> test Neutron Update Non-Existing Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks/non-existing-name123"
updatenetworkresponse = requests.put(url, data='{"network": {"name": "sample_network_new_name"}}', headers=headers)
self.assertEqual(updatenetworkresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron List Subnets ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets"
listsubnetsresponse = requests.get(url, headers=headers)
listSubnetName = json.loads(listsubnetsresponse.content)["subnets"][0]["name"]
listSubnetId = json.loads(listsubnetsresponse.content)["subnets"][0]["id"]
listSubnetId2 = json.loads(listsubnetsresponse.content)["subnets"][1]["id"]
self.assertEqual(listsubnetsresponse.status_code, 200)
self.assertNotIn('None', listSubnetName)
print(" ")
print('->>>>>>> test Neutron List Subnets By Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets?name="+listSubnetName
listsubnetByNameresponse = requests.get(url, headers=headers)
self.assertEqual(listsubnetByNameresponse.status_code, 200)
self.assertNotIn('None', json.loads(listsubnetByNameresponse.content)["subnets"][0]["name"])
print(" ")
print('->>>>>>> test Neutron List Subnets By Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets?id=" + listSubnetId
listsubnetsbyidresponse = requests.get(url, headers=headers)
self.assertEqual(listsubnetsbyidresponse.status_code, 200)
self.assertNotIn("None", json.loads(listsubnetsbyidresponse.content)["subnets"][0]["name"])
print(" ")
print('->>>>>>> test Neutron List Subnets By Multiple Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets?id=" + listSubnetId +"&id="+listSubnetId2
listsubnetsbymultipleidsresponse = requests.get(url, headers=headers)
self.assertEqual(listsubnetsbymultipleidsresponse.status_code, 200)
self.assertNotIn("None", json.loads(listsubnetsbymultipleidsresponse.content)["subnets"][0]["name"])
print(" ")
print('->>>>>>> test Neutron Show Subnet->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets/%s" % (json.loads(listsubnetsresponse.content)["subnets"][0]["id"])
showsubnetsresponse = requests.get(url, headers=headers)
self.assertEqual(showsubnetsresponse.status_code, 200)
self.assertNotIn("None", json.loads(showsubnetsresponse.content)["subnet"]["name"])
print(" ")
print('->>>>>>> test Neutron Show Non-Existing Subnet->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets/non-existing-id123"
showsubnetsresponse = requests.get(url, headers=headers)
self.assertEqual(showsubnetsresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Create Subnet ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets"
createsubnetdata = '{"subnet": {"name": "new_subnet", "network_id": "%s","ip_version": 4,"cidr": "10.0.0.1/24"} }' % (json.loads(createnetworkresponse.content)["network"]["id"])
createsubnetresponse = requests.post(url, data=createsubnetdata, headers=headers)
self.assertEqual(createsubnetresponse.status_code, 201)
self.assertEqual(json.loads(createsubnetresponse.content)["subnet"]["name"], "new_subnet")
print(" ")
print('->>>>>>> test Neutron Create Second Subnet ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets"
createsubnetdata = '{"subnet": {"name": "new_subnet", "network_id": "%s","ip_version": 4,"cidr": "10.0.0.1/24"} }' % (json.loads(createnetworkresponse.content)["network"]["id"])
createsubnetfailureresponse = requests.post(url, data=createsubnetdata, headers=headers)
self.assertEqual(createsubnetfailureresponse.status_code, 409)
print(" ")
print('->>>>>>> test Neutron Update Subnet ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets/%s" % (json.loads(createsubnetresponse.content)["subnet"]["id"])
updatesubnetdata = '{"subnet": {"name": "new_subnet_new_name", "network_id":"some_id", "tenant_id":"new_tenant_id", "allocation_pools":"change_me", "gateway_ip":"192.168.1.120", "ip_version":4, "cidr":"10.0.0.1/24", "id":"some_new_id", "enable_dhcp":true} }'
updatesubnetresponse = requests.put(url, data=updatesubnetdata, headers=headers)
self.assertEqual(updatesubnetresponse.status_code, 200)
self.assertEqual(json.loads(updatesubnetresponse.content)["subnet"]["name"], "new_subnet_new_name")
print(" ")
print('->>>>>>> test Neutron Update Non-Existing Subnet ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/subnets/non-existing-subnet-12345"
updatenonexistingsubnetdata = '{"subnet": {"name": "new_subnet_new_name"} }'
updatenonexistingsubnetresponse = requests.put(url, data=updatenonexistingsubnetdata, headers=headers)
self.assertEqual(updatenonexistingsubnetresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron List Ports ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
listportsesponse = requests.get(url, headers=headers)
self.assertEqual(listportsesponse.status_code, 200)
self.assertEqual(json.loads(listportsesponse.content)["ports"][0]["status"], "ACTIVE")
listPortsName = json.loads(listportsesponse.content)["ports"][0]["name"]
listPortsId1 = json.loads(listportsesponse.content)["ports"][0]["id"]
listPortsId2 = json.loads(listportsesponse.content)["ports"][1]["id"]
print(" ")
print('->>>>>>> test Neutron List Ports By Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports?name=" + listPortsName
listportsbynameesponse = requests.get(url, headers=headers)
self.assertEqual(listportsbynameesponse.status_code, 200)
self.assertEqual(json.loads(listportsbynameesponse.content)["ports"][0]["name"], listPortsName)
print(" ")
print('->>>>>>> test Neutron List Ports By Id ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports?id=" + listPortsId1
listportsbyidesponse = requests.get(url, headers=headers)
self.assertEqual(listportsbyidesponse.status_code, 200)
self.assertEqual(json.loads(listportsbyidesponse.content)["ports"][0]["id"], listPortsId1)
print(" ")
print('->>>>>>> test Neutron List Ports By Multiple Ids ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports?id=" + listPortsId1 +"&id="+listPortsId2
listportsbymultipleidsesponse = requests.get(url, headers=headers)
self.assertEqual(listportsbymultipleidsesponse.status_code, 200)
self.assertEqual(json.loads(listportsbymultipleidsesponse.content)["ports"][0]["id"], listPortsId1)
print(" ")
print('->>>>>>> test Neutron List Non-Existing Ports ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports?id=non-existing-port-id"
listportsbynonexistingidsesponse = requests.get(url, headers=headers)
self.assertEqual(listportsbynonexistingidsesponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Show Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports/%s" % (json.loads(listportsesponse.content)["ports"][0]["id"])
showportresponse = requests.get(url, headers=headers)
self.assertEqual(showportresponse.status_code, 200)
self.assertEqual(json.loads(showportresponse.content)["port"]["status"], "ACTIVE")
print(" ")
print('->>>>>>> test Neutron Show Non-Existing Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports/non-existing-portid123"
shownonexistingportresponse = requests.get(url, headers=headers)
self.assertEqual(shownonexistingportresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Create Port In Non-Existing Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
createnonexistingportdata = '{"port": {"name": "new_port", "network_id": "non-existing-id"} }'
createnonexistingnetworkportresponse = requests.post(url, data=createnonexistingportdata, headers=headers)
self.assertEqual(createnonexistingnetworkportresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Create Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
createportdata = '{"port": {"name": "new_port", "network_id": "%s", "admin_state_up":true, "device_id":"device_id123", "device_owner":"device_owner123", "fixed_ips":"change_me","id":"new_id1234", "mac_address":"12:34:56:78:90", "status":"change_me", "tenant_id":"tenant_id123"} }' % (json.loads(createnetworkresponse.content)["network"]["id"])
createportresponse = requests.post(url, data=createportdata, headers=headers)
self.assertEqual(createportresponse.status_code, 201)
print (createportresponse.content)
self.assertEqual(json.loads(createportresponse.content)["port"]["name"], "new_port")
print(" ")
print('->>>>>>> test Neutron Create Port With Existing Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
createportwithexistingnamedata = '{"port": {"name": "new_port", "network_id": "%s"} }' % (json.loads(createnetworkresponse.content)["network"]["id"])
createportwithexistingnameresponse = requests.post(url, data=createportwithexistingnamedata, headers=headers)
self.assertEqual(createportwithexistingnameresponse.status_code, 500)
print(" ")
print('->>>>>>> test Neutron Create Port Without Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
createportdatawithoutname = '{"port": {"network_id": "%s"} }' % (json.loads(createnetworkresponse.content)["network"]["id"])
createportwithoutnameresponse = requests.post(url, data=createportdatawithoutname, headers=headers)
self.assertEqual(createportwithoutnameresponse.status_code, 201)
self.assertIn("port:cp", json.loads(createportwithoutnameresponse.content)["port"]["name"])
print(" ")
print('->>>>>>> test Neutron Update Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(json.loads(createportresponse.content)["port"]["name"])
url = "http://0.0.0.0:19696/v2.0/ports/%s" % (json.loads(createportresponse.content)["port"]["name"])
updateportdata = '{"port": {"name": "new_port_new_name", "admin_state_up":true, "device_id":"device_id123", "device_owner":"device_owner123", "fixed_ips":"change_me","mac_address":"12:34:56:78:90", "status":"change_me", "tenant_id":"tenant_id123", "network_id":"network_id123"} }'
updateportresponse = requests.put(url, data=updateportdata, headers=headers)
self.assertEqual(updateportresponse.status_code, 200)
self.assertEqual(json.loads(updateportresponse.content)["port"]["name"], "new_port_new_name")
print(" ")
print('->>>>>>> test Neutron Update Non-Existing Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports/non-existing-port-ip"
updatenonexistingportdata = '{"port": {"name": "new_port_new_name"} }'
updatenonexistingportresponse = requests.put(url, data=updatenonexistingportdata, headers=headers)
self.assertEqual(updatenonexistingportresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Delete Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
righturl = "http://0.0.0.0:19696/v2.0/ports/%s" % (json.loads(createportresponse.content)["port"]["id"])
deleterightportresponse = requests.delete(righturl, headers=headers)
self.assertEqual(deleterightportresponse.status_code, 204)
print(" ")
print('->>>>>>> test Neutron Delete Non-Existing Port ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
wrongurl = "http://0.0.0.0:19696/v2.0/ports/unknownid"
deletewrongportresponse = requests.delete(wrongurl, headers=headers)
self.assertEqual(deletewrongportresponse.status_code, 404)
print(" ")
print('->>>>>>> test Neutron Delete Subnet ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
wrongurl = "http://0.0.0.0:19696/v2.0/subnets/unknownid"
righturl = "http://0.0.0.0:19696/v2.0/subnets/%s" % (json.loads(updatesubnetresponse.content)["subnet"]["id"])
deletewrongsubnetresponse = requests.delete(wrongurl, headers=headers)
deleterightsubnetresponse = requests.delete(righturl, headers=headers)
self.assertEqual(deletewrongsubnetresponse.status_code, 404)
self.assertEqual(deleterightsubnetresponse.status_code, 204)
print(" ")
print('->>>>>>> test Neutron Delete Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
righturl = "http://0.0.0.0:19696/v2.0/networks/%s" % (json.loads(createnetworkresponse.content)["network"]["id"])
deleterightnetworkresponse = requests.delete(righturl, headers=headers)
self.assertEqual(deleterightnetworkresponse.status_code, 204)
print(" ")
print('->>>>>>> test Neutron Delete Non-Existing Network ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
wrongurl = "http://0.0.0.0:19696/v2.0/networks/unknownid"
deletewrongnetworkresponse = requests.delete(wrongurl, headers=headers)
self.assertEqual(deletewrongnetworkresponse.status_code, 404)
print(" ")
def testKeystomeDummy(self):
print('->>>>>>> test Keystone Dummy Class->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_keystone_get_token = open(os.path.join(os.path.dirname(__file__), "test_heatapi_keystone_get_token.json")).read()
print('->>>>>>> test Keystone List Versions ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:15000/"
listapiversionstackresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversionstackresponse.status_code, 200)
self.assertEqual(json.loads(listapiversionstackresponse.content)["versions"]["values"][0]["id"], "v2.0")
print(" ")
print('->>>>>>> test Keystone Show ApiV2 ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:15000/v2.0"
showapiversionstackresponse = requests.get(url, headers=headers)
self.assertEqual(showapiversionstackresponse.status_code, 200)
self.assertEqual(json.loads(showapiversionstackresponse.content)["version"]["id"], "v2.0")
print(" ")
print('->>>>>>> test Keystone Get Token ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:15000/v2.0/tokens"
gettokenstackresponse = requests.post(url, data=json.dumps(json.loads(test_heatapi_keystone_get_token)), headers=headers)
self.assertEqual(gettokenstackresponse.status_code, 200)
self.assertEqual(json.loads(gettokenstackresponse.content)["access"]["user"]["name"], "tenantName")
print(" ")
def testHeatDummy(self):
print('->>>>>>> test Heat Dummy Class->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_template_create_stack = open(os.path.join(os.path.dirname(__file__), "test_heatapi_template_create_stack.json")).read()
test_heatapi_template_update_stack = open(os.path.join(os.path.dirname(__file__), "test_heatapi_template_update_stack.json")).read()
print('->>>>>>> test Heat List API Versions Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/"
listapiversionstackresponse = requests.get(url, headers=headers)
self.assertEqual(listapiversionstackresponse.status_code, 200)
self.assertEqual(json.loads(listapiversionstackresponse.content)["versions"][0]["id"], "v1.0")
print(" ")
print('->>>>>>> test Create Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
createstackresponse = requests.post(url, data=json.dumps(json.loads(test_heatapi_template_create_stack)), headers=headers)
self.assertEqual(createstackresponse.status_code, 201)
self.assertNotEqual(json.loads(createstackresponse.content)["stack"]["id"], "")
print(" ")
print('->>>>>>> test Create Stack With Existing Name ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
createstackwithexistingnameresponse = requests.post(url, data='{"stack_name" : "s1"}', headers=headers)
self.assertEqual(createstackwithexistingnameresponse.status_code, 409)
print(" ")
print('->>>>>>> test Create Stack With Unsupported Version ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
createstackwitheunsupportedversionresponse = requests.post(url, data='{"stack_name" : "stackname123", "template" : {"heat_template_version": "2015-04-29"}}', headers=headers)
self.assertEqual(createstackwitheunsupportedversionresponse.status_code, 400)
print(" ")
print('->>>>>>> test List Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
liststackresponse = requests.get(url, headers=headers)
self.assertEqual(liststackresponse.status_code, 200)
self.assertEqual(json.loads(liststackresponse.content)["stacks"][0]["stack_status"], "CREATE_COMPLETE")
print(" ")
print('->>>>>>> test Show Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123showStack/stacks/%s"% json.loads(createstackresponse.content)['stack']['id']
liststackdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(liststackdetailsresponse.status_code, 200)
self.assertEqual(json.loads(liststackdetailsresponse.content)["stack"]["stack_status"], "CREATE_COMPLETE")
print(" ")
print('->>>>>>> test Show Non-Exisitng Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123showStack/stacks/non_exisitng_id123"
listnonexistingstackdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(listnonexistingstackdetailsresponse.status_code, 404)
print(" ")
print('->>>>>>> test Update Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123updateStack/stacks/%s"% json.loads(createstackresponse.content)['stack']['id']
updatestackresponse = requests.put(url, data=json.dumps(json.loads(test_heatapi_template_update_stack)),
headers=headers)
self.assertEqual(updatestackresponse.status_code, 202)
liststackdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(json.loads(liststackdetailsresponse.content)["stack"]["stack_status"], "UPDATE_COMPLETE")
print(" ")
print('->>>>>>> test Update Non-Existing Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123updateStack/stacks/non_existing_id_1234"
updatenonexistingstackresponse = requests.put(url, data={"non": "sense"}, headers=headers)
self.assertEqual(updatenonexistingstackresponse.status_code, 404)
print(" ")
print('->>>>>>> test Delete Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123showStack/stacks/%s" % \
json.loads(createstackresponse.content)['stack']['id']
deletestackdetailsresponse = requests.delete(url, headers=headers)
self.assertEqual(deletestackdetailsresponse.status_code, 204)
print(" ")
def test_CombinedTesting(self):
print('->>>>>>> test Combinded tests->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
print(" ")
headers = {'Content-type': 'application/json'}
test_heatapi_template_create_stack = open(os.path.join(os.path.dirname(__file__),
"test_heatapi_template_create_stack.json")).read()
test_heatapi_template_update_stack = open(os.path.join(os.path.dirname(__file__),
"test_heatapi_template_update_stack.json")).read()
print('->>>>>>> test Combined Create Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123/stacks"
createstackresponse = requests.post(url,
data=json.dumps(json.loads(test_heatapi_template_create_stack)),
headers=headers)
self.assertEqual(createstackresponse.status_code, 201)
self.assertNotEqual(json.loads(createstackresponse.content)["stack"]["id"], "")
print(" ")
print('->>>>>>> test Combined Neutron List Ports ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
listportsesponse = requests.get(url, headers=headers)
self.assertEqual(listportsesponse.status_code, 200)
self.assertEqual(len(json.loads(listportsesponse.content)["ports"]), 9)
for port in json.loads(listportsesponse.content)["ports"]:
self.assertEqual(len(str(port['fixed_ips'][0]['subnet_id'])), 36)
print(" ")
print('->>>>>>> test Combined Neutron List Networks ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks"
listnetworksesponse = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse.status_code, 200)
self.assertEqual(len(json.loads(listnetworksesponse.content)["networks"]), 10)
for net in json.loads(listnetworksesponse.content)["networks"]:
self.assertEqual(len(str(net['subnets'][0])), 36)
print(" ")
print('->>>>>>> test Combined Update Stack ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18004/v1/tenantabc123updateStack/stacks/%s"% \
json.loads(createstackresponse.content)['stack']['id']
updatestackresponse = requests.put(url,
data=json.dumps(json.loads(test_heatapi_template_update_stack)),
headers=headers)
self.assertEqual(updatestackresponse.status_code, 202)
liststackdetailsresponse = requests.get(url, headers=headers)
self.assertEqual(json.loads(liststackdetailsresponse.content)["stack"]["stack_status"], "UPDATE_COMPLETE")
print(" ")
print('->>>>>>> test Combined Neutron List Ports ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/ports"
listportsesponse = requests.get(url, headers=headers)
self.assertEqual(listportsesponse.status_code, 200)
self.assertEqual(len(json.loads(listportsesponse.content)["ports"]), 18)
for port in json.loads(listportsesponse.content)["ports"]:
self.assertEqual(len(str(port['fixed_ips'][0]['subnet_id'])), 36)
print(" ")
print('->>>>>>> test Combined Neutron List Networks ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/networks"
listnetworksesponse = requests.get(url, headers=headers)
self.assertEqual(listnetworksesponse.status_code, 200)
self.assertEqual(len(json.loads(listnetworksesponse.content)["networks"]), 14)
for net in json.loads(listnetworksesponse.content)["networks"]:
self.assertEqual(len(str(net['subnets'][0])), 36)
print(" ")
# workflow create floating ip and assign it to a server
print('->>>>>>> CombinedNeutronCreateFloatingIP ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:19696/v2.0/floatingips"
createflip = requests.post(url, headers=headers,
data='{"floatingip":{"floating_network_id":"default"}}')
self.assertEqual(createflip.status_code, 200)
self.assertIsNotNone(json.loads(createflip.content)["floatingip"].get("port_id"))
port_id = json.loads(createflip.content)["floatingip"].get("port_id")
print(" ")
print('->>>>>>> CombinedNovaGetServer ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/detail"
listserverapisdetailedresponse = requests.get(url, headers=headers)
self.assertEqual(listserverapisdetailedresponse.status_code, 200)
self.assertEqual(json.loads(listserverapisdetailedresponse.content)["servers"][0]["status"], "ACTIVE")
server_id = json.loads(listserverapisdetailedresponse.content)["servers"][0]["id"]
print(" ")
print('->>>>>>> CombinedNovaAssignInterface ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/%s/os-interface" % server_id
assign = requests.post(url, headers=headers,
data='{"interfaceAttachment":{"net_id": "default"}}')
self.assertEqual(assign.status_code, 202)
self.assertIsNotNone(json.loads(assign.content)["interfaceAttachment"].get("port_id"))
port_id = json.loads(assign.content)["interfaceAttachment"].get("port_id")
print(" ")
print('->>>>>>> CombinedNovaDeleteInterface ->>>>>>>>>>>>>>>')
print('->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>')
url = "http://0.0.0.0:18774/v2.1/id_bla/servers/%s/os-interface/%s" % (server_id, port_id)
getintfs = requests.delete(url, headers=headers)
self.assertEqual(getintfs.status_code, 202)
print(" ")
if __name__ == '__main__':
unittest.main()
| 61.079352 | 351 | 0.541524 | 6,608 | 71,585 | 5.789195 | 0.081265 | 0.018978 | 0.018821 | 0.021958 | 0.780813 | 0.758698 | 0.714573 | 0.597281 | 0.561025 | 0.53559 | 0 | 0.038447 | 0.163959 | 71,585 | 1,171 | 352 | 61.131512 | 0.600755 | 0.037228 | 0 | 0.545929 | 0 | 0.044885 | 0.389676 | 0.168113 | 0 | 0 | 0 | 0 | 0.23904 | 1 | 0.007307 | false | 0 | 0.006263 | 0 | 0.014614 | 0.397704 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c2eff3b4a9c98327585ed6129ba25d7436c202b | 982 | py | Python | 03_Unit_3/Sprint3/Module2-consuming-data-from-an-api/app/elephant_insert.py | mark-morelos/my_DataScience_notes | 2e05a3d77949cb8ccd9aac7c04cf7758d7b73d1c | [
"MIT"
] | null | null | null | 03_Unit_3/Sprint3/Module2-consuming-data-from-an-api/app/elephant_insert.py | mark-morelos/my_DataScience_notes | 2e05a3d77949cb8ccd9aac7c04cf7758d7b73d1c | [
"MIT"
] | null | null | null | 03_Unit_3/Sprint3/Module2-consuming-data-from-an-api/app/elephant_insert.py | mark-morelos/my_DataScience_notes | 2e05a3d77949cb8ccd9aac7c04cf7758d7b73d1c | [
"MIT"
] | null | null | null |
# DOESN'T WORK WITH PG
#import os
#import pandas as pd
#from dotenv import load_dotenv
#import psycopg2
#
#load_dotenv()
#
#DB_HOST = os.getenv("DB_HOST", default="OOPS")
#DB_NAME = os.getenv("DB_NAME", default="OOPS")
#DB_USER = os.getenv("DB_USER", default="OOPS")
#DB_PASSWORD = os.getenv("DB_PASSWORD", default="OOPS")
#
#connection = psycopg2.connect(dbname=DB_NAME, user=DB_USER, password=DB_PASSWORD, host=DB_HOST)
#print("CONNECTION", type(connection))
#
#CSV_FILEPATH = os.path.join(os.path.dirname(__file__), "..", "data", "buddymove_holidayiq.csv")
#df = pd.read_csv(CSV_FILEPATH)
#df.index.rename("id", inplace=True) # assigns a column label "id" for the index column
#df.index += 1 # starts ids at 1 instead of 0
#print(df.head())
#
#table_name = "reviews"
#df.to_sql(table_name, con=connection)
#
#cursor = connection.cursor()
#cursor.execute(f"SELECT count(distinct id) as review_count FROM {table_name};")
#results = cursor.fetchone()
#print(results, "RECORDS")
#
| 29.757576 | 96 | 0.724033 | 150 | 982 | 4.56 | 0.48 | 0.046784 | 0.05848 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005695 | 0.105906 | 982 | 32 | 97 | 30.6875 | 0.773349 | 0.933809 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c51dee98e88a8d981cdeafa80e89babc0cb0a21 | 691 | py | Python | examples/mem_profile.py | chrhck/contagion | ea67c5db3fce02ddbd85142fcf545af948595a69 | [
"MIT"
] | 2 | 2020-10-08T10:23:47.000Z | 2021-11-19T09:00:19.000Z | examples/mem_profile.py | chrhck/contagion | ea67c5db3fce02ddbd85142fcf545af948595a69 | [
"MIT"
] | null | null | null | examples/mem_profile.py | chrhck/contagion | ea67c5db3fce02ddbd85142fcf545af948595a69 | [
"MIT"
] | null | null | null | import numpy as np
import matplotlib.pyplot as plt
import sys
import pandas as pd
import seaborn as sns
import networkx as nx
import yaml
from networkx.algorithms import approximation as nx_approx
from tqdm.notebook import tqdm
from copy import deepcopy
# Adding path to module
sys.path.append("../")
# picture path
PICS = '../pics/'
# Module imports
from contagion import Contagion, config
from contagion.config import _baseconfig
from contagion.plotting import plot_infection_history
from memory_profiler import profile
my_config = yaml.safe_load(open("test_social_graph_cpp_params.yaml"))
contagion = Contagion(my_config)
contagion.sim()
results = pd.DataFrame(contagion.statistics)
| 23.827586 | 69 | 0.811867 | 100 | 691 | 5.49 | 0.54 | 0.071038 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12301 | 691 | 28 | 70 | 24.678571 | 0.905941 | 0.070912 | 0 | 0 | 0 | 0 | 0.069182 | 0.051887 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.7 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
6c5a9a02bdebda618d324a7975dbbc767cc08e48 | 371 | py | Python | pythonLearning/class.py | liusong-cn/python | f67933f0879021a595258e09c4cde5ca1f9f6aed | [
"Apache-2.0"
] | 1 | 2019-11-12T13:38:54.000Z | 2019-11-12T13:38:54.000Z | pythonLearning/class.py | liusong-cn/python | f67933f0879021a595258e09c4cde5ca1f9f6aed | [
"Apache-2.0"
] | null | null | null | pythonLearning/class.py | liusong-cn/python | f67933f0879021a595258e09c4cde5ca1f9f6aed | [
"Apache-2.0"
] | null | null | null | #可以继承父类,默认的父类为object,单继承
class Foo(object):
version = 1.0
#self是类实例的引用,类似其他语言中的this
def __init__(self, name = 'zhang'):
self.name = name
print('这个方法类似构造函数,但不同于其他语言,此处是类实例在创建后自动执行的第一个方法而已')
def show(self):
print(self.name)
def addMe2Me3(self, x):
print(str(x + x))
foo = Foo();
print(foo.name)
print(foo.addMe2Me3(2)) | 21.823529 | 59 | 0.633423 | 46 | 371 | 5.021739 | 0.543478 | 0.103896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.226415 | 371 | 17 | 60 | 21.823529 | 0.780488 | 0.126685 | 0 | 0 | 0 | 0 | 0.145511 | 0.130031 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.416667 | 0.416667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
6c60d1efb15222619fa51cd7e19381c8bf695850 | 960 | py | Python | pyairwatch/mdm/tags.py | jprichards/PyVMwareAirWatch | ac25185d1f6a001c9670d57d11a3047553f743a1 | [
"MIT"
] | 17 | 2018-03-19T05:52:37.000Z | 2022-03-09T16:41:03.000Z | pyairwatch/mdm/tags.py | marshall-brown/PyVMwareAirWatch | ac25185d1f6a001c9670d57d11a3047553f743a1 | [
"MIT"
] | 2 | 2018-12-06T17:12:54.000Z | 2019-08-27T09:57:13.000Z | pyairwatch/mdm/tags.py | marshall-brown/PyVMwareAirWatch | ac25185d1f6a001c9670d57d11a3047553f743a1 | [
"MIT"
] | 9 | 2018-04-02T18:42:51.000Z | 2020-06-10T02:11:05.000Z | class Tags(object):
"""A class to manage various AirWatch device tag functionalities"""
def __init__(self, client):
self.client = client
def get_id_by_name(self, name, og_id):
# mdm/tags/search?name={name}
response = self._get(path='/tags/search', params={'name':str(name), 'organizationgroupid':str(og_id)})
return response
def _get(self, module='mdm', path=None, version=None, params=None, header=None):
"""GET requests for the /MDM/Tags module."""
response = self.client.get(module=module, path=path, version=version, params=params, header=header)
return response
def _post(self, module='mdm', path=None, version=None, params=None, data=None, json=None, header=None):
"""POST requests for the /MDM/Tags module."""
response = self.client.post(module=module, path=path, version=version, params=params, data=data, json=json, header=header)
return response
| 45.714286 | 130 | 0.669792 | 129 | 960 | 4.891473 | 0.294574 | 0.063391 | 0.053883 | 0.053883 | 0.421553 | 0.421553 | 0.421553 | 0.421553 | 0.275753 | 0 | 0 | 0 | 0.189583 | 960 | 20 | 131 | 48 | 0.811054 | 0.176042 | 0 | 0.25 | 0 | 0 | 0.052903 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6c62129959a68d55c143f574207db30bb59790c9 | 5,811 | py | Python | FakeNewsNetDataset/scraper.py | stavIatrop/Thesis-Fake-News-Detection | 096fae9551ed836a1a03451ad6f32cf7f96218e2 | [
"MIT"
] | 9 | 2020-08-27T10:31:30.000Z | 2022-02-16T15:32:32.000Z | FakeNewsNetDataset/scraper.py | stavIatrop/Thesis-Fake-News-Detection | 096fae9551ed836a1a03451ad6f32cf7f96218e2 | [
"MIT"
] | null | null | null | FakeNewsNetDataset/scraper.py | stavIatrop/Thesis-Fake-News-Detection | 096fae9551ed836a1a03451ad6f32cf7f96218e2 | [
"MIT"
] | 2 | 2021-11-02T03:22:29.000Z | 2022-02-03T14:18:09.000Z | import requests
from bs4 import BeautifulSoup
import pandas as pd
list_art = list()
list_text = list()
list1 = [['id', 'text']]
headers = {'user-agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'}
f = open('real_urls_vol2', "r")
f2 = open('true_texts_vol2', "a")
i = 1
for data in f:
text_id = []
j = 16
while data[j].isdigit():
j += 1
text_id = data[:j]
url = data[j:len(data) -1]
list_text = []
if "http" not in url:
if "www." not in url:
url = "https://www." + url
else:
url = "https://" + url
print(url)
if "longroom.com" in url:
f2.write(str(text_id) + "-" + url + "\n")
f2.write("\n---------------------------------------------------------------------------------------------------------------\n")
i += 1
continue
if "foodandwine.com" not in url and "southernliving.com" not in url and "realsimple.com/" not in url and "heightline.com/" not in url and "sportskeeda.com/" not in url and "travelandleisure.com/" not in url and "tvovermind.com/" not in url and "dreadcentral.com/" not in url and "rarolae.com/" not in url and "metro.us/" not in url and "dailystar.co.uk/" not in url and "newbeauty.com" not in url and "manrepeller.com/" not in url and "pastemagazine.com/" not in url and "newsday.com/" not in url and "newsweek.com/" not in url and "overthemoon.com/" not in url and "turitmo.com" not in url and "worldstarsmag.com" not in url and "news.starbucks.com/" not in url and "entertainment.inquirer.net" not in url and "ebony.com" not in url and "digviral.com" not in url and "mindfood.com/" not in url and "newsday.com/" not in url and "huffpost.com" not in url and "newsweek.com" not in url and "tvovermind.com/" not in url and "hot1039.com" not in url and "cybersecurity-insiders.com" not in url and "purpleclover.com" not in url and "nbc.com" not in url and "newsweek.com" not in url and "health.com" not in url and "newbeauty.com" not in url and "rarolae.com" not in url and "cosmopolitan.com" not in url and "elledecor.com" not in url and "theblast.com" not in url and "vh1.com" not in url and "okayplayer.com" not in url and "mirror.co.uk" not in url and "time.com" not in url and "telegraph.co." not in url and "galoremag.com" not in url and "huffingtonpost.c" not in url and "parents.com" not in url and "people.com" not in url and "hellogiggles.com" not in url and "instyle.com" not in url and "/ew.com" not in url and "independent.co.uk" not in url and "omgcheckitout.com" not in url:
cookies = {'enwiki_session': '17ab96bd8ffbe8ca58a78657a918558'}
try :
r = requests.get(url, cookies=cookies)
except requests.exceptions.Timeout as tmout:
continue
except requests.exceptions.ConnectionError as conn:
continue
except KeyError:
continue
soup = BeautifulSoup(r.content, "html.parser")
for p in soup.select('p'):
list_text.append(p.get_text(strip=True))
print("Text ready.")
result = ' '.join(list_text)
list1.append([text_id , result])
f2.write(str(text_id) + "-" + url + "\n")
f2.write(result)
f2.write("\n---------------------------------------------------------------------------------------------------------------\n")
i += 1
continue
# if "longroom" in url:
# f2.write(str(text_id) + "-" + url + "\n")
# f2.write("\n---------------------------------------------------------------------------------------------------------------\n")
# i += 1
# continue
#url = "https://" + url
try :
s = requests.Session()
# if "money.com" in url:
# print("ye")
# r = s.get("https://money.com", headers=headers)
# else:
# r = s.get("https://time.com", headers=headers)
r = s.get(url, headers=headers)
cookies = r.cookies.get_dict()
#print(s.cookies.get_dict())
#print(type(r.cookies)
r = s.get(url, headers=headers , cookies=cookies)
#print(r.headers)
except requests.exceptions.Timeout as tmout: # Maybe set up for a retry, or continue in a retry loop
# print(tmout)
# archieve_url = "http://web.archive.org/cdx/search/cdx?url={}&output=json".format(url)
# try:
# r = requests.get(url, headers=headers, cookies=cookies)
# except requests.exceptions.Timeout as t:
# print(t)
# continue
# except requests.exceptions.ConnectionError as c:
# print(c)
continue
except requests.exceptions.ConnectionError as conn:
# print(conn)
# archieve_url = "http://web.archive.org/cdx/search/cdx?url={}&output=json".format(url)
# try:
# r = requests.get(url, headers=headers, cookies=cookies)
# except requests.exceptions.Timeout as t:
# print(t)
# continue
# except requests.exceptions.ConnectionError as c:
# print(c)
continue
except KeyError:
continue
soup = BeautifulSoup(r.content, "html.parser")
for p in soup.select('p'):
list_text.append(p.get_text(strip=True))
print("Text ready.")
result = ' '.join(list_text)
list1.append([text_id , result])
f2.write(str(text_id) + "-" + url + "\n")
f2.write(result)
f2.write("\n---------------------------------------------------------------------------------------------------------------\n")
i += 1
f.close()
f2.close()
# df1 = pd.DataFrame(list1)
# df1.to_csv('real_texts.csv',sep=',',index = False ,header = False)
| 42.727941 | 1,692 | 0.551024 | 775 | 5,811 | 4.095484 | 0.216774 | 0.091367 | 0.138626 | 0.180214 | 0.622558 | 0.496849 | 0.470384 | 0.408318 | 0.404537 | 0.404537 | 0 | 0.013266 | 0.247634 | 5,811 | 135 | 1,693 | 43.044444 | 0.712717 | 0.205989 | 0 | 0.534247 | 0 | 0.013699 | 0.292704 | 0.098078 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041096 | 0 | 0.041096 | 0.041096 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c65aa95bcf56bccfa67a8b0a7af75c77031eee1 | 1,771 | py | Python | mj_envs/hand_manipulation_suite/__init__.py | Jendker/mj_envs | 472d86974a891fc814702b75a6bdbd44780f2224 | [
"Apache-2.0"
] | null | null | null | mj_envs/hand_manipulation_suite/__init__.py | Jendker/mj_envs | 472d86974a891fc814702b75a6bdbd44780f2224 | [
"Apache-2.0"
] | null | null | null | mj_envs/hand_manipulation_suite/__init__.py | Jendker/mj_envs | 472d86974a891fc814702b75a6bdbd44780f2224 | [
"Apache-2.0"
] | null | null | null | from gym.envs.registration import register
from mjrl.envs.mujoco_env import MujocoEnv
# Swing the door open
register(
id='door-v0',
entry_point='mj_envs.hand_manipulation_suite:DoorEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.door_v0 import DoorEnvV0
# Hammer a nail into the board
register(
id='hammer-v0',
entry_point='mj_envs.hand_manipulation_suite:HammerEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.hammer_v0 import HammerEnvV0
# Reposition a pen in hand
register(
id='pen-v0',
entry_point='mj_envs.hand_manipulation_suite:PenEnvV0',
max_episode_steps=100,
)
from mj_envs.hand_manipulation_suite.pen_v0 import PenEnvV0
# Relocate an object to the target
register(
id='relocate-v0',
entry_point='mj_envs.hand_manipulation_suite:RelocateEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.relocate_v0 import RelocateEnvV0
# Remove an object from the initial position
register(
id='remove-v0',
entry_point='mj_envs.hand_manipulation_suite:RemoveEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.remove_v0 import RemoveEnvV0
# Remove an object from the initial position with old reward function - toy example
register(
id='remove_old_reward-v0',
entry_point='mj_envs.hand_manipulation_suite:RemoveOldRewardEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.remove_old_reward_v0 import RemoveOldRewardEnvV0
# Custom environment - relocate an object to the target
register(
id='relocate_custom-v0',
entry_point='mj_envs.hand_manipulation_suite:RelocateCustomEnvV0',
max_episode_steps=200,
)
from mj_envs.hand_manipulation_suite.relocate_custom_v0 import RelocateCustomEnvV0
| 30.016949 | 85 | 0.801242 | 250 | 1,771 | 5.368 | 0.232 | 0.062593 | 0.104322 | 0.229508 | 0.587183 | 0.587183 | 0.564083 | 0.510432 | 0.307004 | 0.23994 | 0 | 0.031593 | 0.124224 | 1,771 | 58 | 86 | 30.534483 | 0.833656 | 0.160926 | 0 | 0.295455 | 0 | 0 | 0.267253 | 0.213126 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.204545 | 0 | 0.204545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66687ee82078e35892dba2694bc97948c449f59c | 1,887 | py | Python | applicationfinder.py | KingCprey/applicationfinder | b6074e14d2b5ccac8163127baee80ef0d5414005 | [
"MIT"
] | null | null | null | applicationfinder.py | KingCprey/applicationfinder | b6074e14d2b5ccac8163127baee80ef0d5414005 | [
"MIT"
] | null | null | null | applicationfinder.py | KingCprey/applicationfinder | b6074e14d2b5ccac8163127baee80ef0d5414005 | [
"MIT"
] | 1 | 2020-11-21T18:57:00.000Z | 2020-11-21T18:57:00.000Z | #!/usr/bin/env python3
import os
import appcache,mimefinder
application_dir="/usr/share/applications"
cache_file=os.path.expanduser("~/.cache/")
#listdir but returns absolute paths
def ldirabs(dirpath):return [os.path.join(dirpath,f) for f in os.listdir(os.path.abspath(dirpath))]
def get_desktop_files():
try:
return [f for f in ldirabs(application_dir) if os.path.isfile(f) and ".desktop" in f.lower()]
except OSError:
return None
#TODO: check if cached applications still exist before passing off to user, then check if executable exists and is accessible
#TODO: check if the found application is on the PATH, then show the command to the user.
def parse_desktop(desktop_src):
lines=[f for f in desktop_src.split("\n") if len(f.strip())>0]
desk_keys={}
if lines[0].lower()=="[desktop entry]":
for l in lines[1:]:
lsplit=l.split("=")
key=lsplit[0]
value="=".join(lsplit[1:])
if "[" in key and "]" in key:
sub_key=key[key.index("["):key.rindex("]")]
if key in desk_keys:
#sub_key uses the root key which I use for no subkey (like a lil bich)
if sub_key=="_":sub_key="__"
if type(desk_keys[key])is dict:
if sub_key in desk_keys[key]:
else:desk_keys[key]={"_":[desk_keys[key]],sub_key:value}
else:
#2 lines have exact same key, so converting into a list
if key in desk_keys:
if type(desk_keys[key]) is list:desk_keys[key].append(value)
else:desk_keys[key]=[desk_keys[key],value]
else:raise ValueError("Invalid desktop file")
return desk_keys
def test():
print(mimefinder.get_mime("./README.md"))
def main():
test()
if __name__=="__main__":
main()
| 35.603774 | 125 | 0.604663 | 273 | 1,887 | 4.03663 | 0.413919 | 0.087114 | 0.079855 | 0.019056 | 0.108893 | 0.08167 | 0.047187 | 0 | 0 | 0 | 0 | 0.005087 | 0.2708 | 1,887 | 52 | 126 | 36.288462 | 0.795785 | 0.206147 | 0 | 0.054054 | 0 | 0 | 0.071046 | 0.015416 | 0 | 0 | 0 | 0.019231 | 0 | 0 | null | null | 0 | 0.054054 | null | null | 0.027027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66902dc9f11bf7a2e68b40ee0964ecbab45593ae | 265 | py | Python | tests/example_app/example/app.py | karmingc/flask-pg-extras | 4a3c386324d4290feae95332e31557b3ef4b582f | [
"MIT"
] | 31 | 2020-08-11T16:59:59.000Z | 2022-03-19T20:25:54.000Z | tests/example_app/example/app.py | karmingc/flask-pg-extras | 4a3c386324d4290feae95332e31557b3ef4b582f | [
"MIT"
] | 1 | 2022-03-24T18:06:11.000Z | 2022-03-28T23:53:53.000Z | tests/example_app/example/app.py | karmingc/flask-pg-extras | 4a3c386324d4290feae95332e31557b3ef4b582f | [
"MIT"
] | 5 | 2020-08-17T21:44:25.000Z | 2022-03-24T15:06:10.000Z | from flask import Flask
from flask_pg_extras import FlaskPGExtras
flask_pg_extras = FlaskPGExtras()
def create_app():
app = Flask(__name__)
flask_pg_extras.init_app(app)
@app.route('/')
def index():
return 'Hello world'
return app
| 15.588235 | 41 | 0.686792 | 35 | 265 | 4.857143 | 0.457143 | 0.123529 | 0.229412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222642 | 265 | 16 | 42 | 16.5625 | 0.825243 | 0 | 0 | 0 | 0 | 0 | 0.045283 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.1 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
669133f7f7a68da4004ce0c4b51a2d54f481ead7 | 756 | py | Python | app/backend/wells/migrations/0012_load_water_quality_codes.py | stephenhillier/gwells | 235d35f1f40dd845f8fecd0d7c3371c4564567c6 | [
"Apache-2.0"
] | null | null | null | app/backend/wells/migrations/0012_load_water_quality_codes.py | stephenhillier/gwells | 235d35f1f40dd845f8fecd0d7c3371c4564567c6 | [
"Apache-2.0"
] | null | null | null | app/backend/wells/migrations/0012_load_water_quality_codes.py | stephenhillier/gwells | 235d35f1f40dd845f8fecd0d7c3371c4564567c6 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.15 on 2018-08-14 17:48
from __future__ import unicode_literals
from django.db import migrations
import json
from io import open
import os
from gwells.codes import CodeFixture
from wells.models import WaterQualityCharacteristic, WaterQualityColour
def water_quality_codes():
fixture = '0012_water_quality_codes.json'
fixture_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), fixture)
return CodeFixture(fixture_path)
class Migration(migrations.Migration):
dependencies = [
('wells', '0011_auto_20180917_1728'),
]
operations = [
migrations.RunPython(water_quality_codes().load_fixture, reverse_code=water_quality_codes().unload_fixture),
]
| 26.068966 | 116 | 0.752646 | 96 | 756 | 5.65625 | 0.59375 | 0.088398 | 0.12523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059282 | 0.152116 | 756 | 28 | 117 | 27 | 0.787832 | 0.09127 | 0 | 0 | 1 | 0 | 0.083333 | 0.076023 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.388889 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
66987de6cd50b09efbfd3c52eb2c6a78cadf4458 | 1,860 | py | Python | cloudapi_digitalocean/digitaloceanobjects/account.py | zorani/cloudapi_digitalocean | 670da3b66fe343553194dd5f39a8209fd6012802 | [
"MIT"
] | null | null | null | cloudapi_digitalocean/digitaloceanobjects/account.py | zorani/cloudapi_digitalocean | 670da3b66fe343553194dd5f39a8209fd6012802 | [
"MIT"
] | null | null | null | cloudapi_digitalocean/digitaloceanobjects/account.py | zorani/cloudapi_digitalocean | 670da3b66fe343553194dd5f39a8209fd6012802 | [
"MIT"
] | null | null | null | from __future__ import annotations
from dataclasses import dataclass, field
from requests.models import Response
from ..digitaloceanapi.accounts import Accounts
from ..common.cloudapiexceptions import *
import json
import threading
import time
import re
@dataclass
class AccountAttributes:
droplet_limit: int = None
floating_ip_limit: int = None
volume_limit: int = None
email: str = None
uuid: str = None
email_verified: bool = None
status: str = None
status_message: str = None
class AccountManager:
def __init__(self):
self.accountapi = Accounts()
def retrieve_account_details(self):
response = self.accountapi.list_account_information()
if response:
content = json.loads(response.content.decode("utf-8"))
account_data = content["account"]
newaccount = Account()
newaccount.attributes = AccountAttributes(**account_data)
return newaccount
def droplet_limit(self):
return self.retrieve_account_details().attributes.droplet_limit
def floating_ip_limit(self):
return self.retrieve_account_details().attributes.floating_ip_limit
def volume_limit(self):
return self.retrieve_account_details().attributes.volume_limit
def email(self):
return self.retrieve_account_details().attributes.email
def uuid(self):
return self.retrieve_account_details().attributes.uuid
def email_verified(self):
return self.retrieve_account_details().attributes.email_verified
def status(self):
return self.retrieve_account_details().attributes.status
def status_message(self):
return self.retrieve_account_details().attributes.status_message
class Account:
def __init__(self, status=None):
self.attributes = AccountAttributes()
| 27.352941 | 75 | 0.713441 | 208 | 1,860 | 6.129808 | 0.259615 | 0.105882 | 0.155294 | 0.138039 | 0.317647 | 0.317647 | 0.317647 | 0.281569 | 0 | 0 | 0 | 0.00068 | 0.209677 | 1,860 | 67 | 76 | 27.761194 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0.006452 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.22449 | false | 0 | 0.183673 | 0.163265 | 0.816327 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
669b5de5cbe9e2654a30b9f2775dba773e4cd5a7 | 837 | py | Python | generatorpass/ext/site/main.py | es99/password-generator | edbbfcebfd5f083de88c1a4de9f1f67ff0f1010a | [
"Unlicense"
] | null | null | null | generatorpass/ext/site/main.py | es99/password-generator | edbbfcebfd5f083de88c1a4de9f1f67ff0f1010a | [
"Unlicense"
] | null | null | null | generatorpass/ext/site/main.py | es99/password-generator | edbbfcebfd5f083de88c1a4de9f1f67ff0f1010a | [
"Unlicense"
] | null | null | null | from flask import request, render_template
from flask import Blueprint
from generatorpass.ext.gerador import generate
from generatorpass.ext.db.models import Senhas
bp = Blueprint("site", __name__)
@bp.route('/', methods=['GET', 'POST'])
def index():
info = None
if request.method == 'POST':
conta = request.form['conta']
user = request.form['user']
length = int(request.form['num_caracteres'])
if generate(conta, user, length):
info = "Dados enviados com sucesso!"
else:
info = "Erro ao salvar no banco de dados"
return render_template("index.html", var=info)
@bp.route('/senhas')
def senhas():
items = Senhas.query.all()
return render_template("senhas.html", items=items)
@bp.route('/about')
def about():
return render_template("about.html")
| 27.9 | 54 | 0.657109 | 106 | 837 | 5.103774 | 0.490566 | 0.103512 | 0.110906 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205496 | 837 | 29 | 55 | 28.862069 | 0.813534 | 0 | 0 | 0 | 1 | 0 | 0.169654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.083333 | 0.166667 | 0.041667 | 0.416667 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
669fcc7aece753467906f215e6ce84a6ef99228f | 1,089 | py | Python | tests/test_openhab/test_plugins/test_thing/test_filter.py | DerOetzi/HABApp | a123fbfa9928ebb3cda9a84f6984dcba593c8236 | [
"Apache-2.0"
] | 44 | 2018-12-13T08:46:44.000Z | 2022-03-07T03:23:21.000Z | tests/test_openhab/test_plugins/test_thing/test_filter.py | DerOetzi/HABApp | a123fbfa9928ebb3cda9a84f6984dcba593c8236 | [
"Apache-2.0"
] | 156 | 2019-03-02T20:53:31.000Z | 2022-03-23T13:13:58.000Z | tests/test_openhab/test_plugins/test_thing/test_filter.py | DerOetzi/HABApp | a123fbfa9928ebb3cda9a84f6984dcba593c8236 | [
"Apache-2.0"
] | 18 | 2019-03-08T07:13:21.000Z | 2022-03-22T19:52:31.000Z | from HABApp.openhab.connection_logic.plugin_things.filters import ThingFilter, apply_filters
def test_thing_filter():
f = ThingFilter('thing_label', 'asdfasdf')
assert f.matches({'label': 'ASDFASDF'}, True)
assert f.matches({'label': 'ASDFASDF'}, False)
f = ThingFilter('thing_label', r'\d+')
assert not f.matches({'label': 'asdf1234'}, True)
assert not f.matches({'label': 'asdf1234'}, False)
f = ThingFilter('thing_label', r'asdf\d+')
assert f.matches({'label': 'asdf1234'}, True)
assert f.matches({'label': 'asdf1234'}, False)
def test_filters():
data = [{'label': '1'}, {'label': '2'}, {'label': 'a'}, {'label': 'b'}, ]
assert list(apply_filters([ThingFilter('thing_label', r'\d+')], data, True)) == [{'label': '1'}, {'label': '2'}]
assert list(apply_filters([ThingFilter('thing_label', r'\d+')], data, False)) == [{'label': '1'}, {'label': '2'}]
assert list(apply_filters([ThingFilter('thing_label', 'no_match')], data, True)) == []
assert list(apply_filters([ThingFilter('thing_label', 'no_match')], data, False)) == []
| 43.56 | 117 | 0.628099 | 136 | 1,089 | 4.889706 | 0.257353 | 0.168421 | 0.221053 | 0.114286 | 0.709774 | 0.572932 | 0.345865 | 0.345865 | 0.345865 | 0.345865 | 0 | 0.023555 | 0.142332 | 1,089 | 24 | 118 | 45.375 | 0.688437 | 0 | 0 | 0 | 0 | 0 | 0.223141 | 0 | 0 | 0 | 0 | 0 | 0.588235 | 1 | 0.117647 | false | 0 | 0.058824 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66bfb997a0732583e64fe7173567367bb5963be7 | 970 | py | Python | test/test_trie.py | paolodelia99/Python-C-Algorithms | 9113ad566e4e659c1f16135c2d3abd3a4c57a46e | [
"MIT"
] | 2 | 2021-02-13T10:58:58.000Z | 2021-03-16T09:56:01.000Z | test/test_trie.py | paolodelia99/Python-C-Algorithms | 9113ad566e4e659c1f16135c2d3abd3a4c57a46e | [
"MIT"
] | null | null | null | test/test_trie.py | paolodelia99/Python-C-Algorithms | 9113ad566e4e659c1f16135c2d3abd3a4c57a46e | [
"MIT"
] | null | null | null | import nose
import trie
Trie = trie.Trie
def test_trie_101():
t = Trie()
nose.tools.assert_is_instance(t, trie.Trie)
def test_trie_num_entries():
t = Trie()
nose.tools.assert_equal(t.num_entries(), 0)
def test_trie_append():
t = Trie()
t.append("ci", 2)
nose.tools.assert_equal(t.num_entries(), 1)
def test_trie_append_102():
t = Trie()
t.append("a", 40)
t.append("ab", 12)
t.append("abc", 14)
nose.tools.assert_equal(t.num_entries(), 3)
def test_trie_remove():
t = Trie()
t.append("ab", 34)
t.append("a", 12)
t.remove("ab")
nose.tools.assert_equal(t.num_entries(), 1)
def test_trie_lookup():
t = Trie()
t.append("a", 40)
t.append("ab", 12)
t.append("abc", 14)
t.append("b", 13)
nose.tools.assert_equal(t.lookup("a"), 40)
nose.tools.assert_equal(t.lookup("b"), 13)
nose.tools.assert_equal(t.lookup("ab"), 12)
nose.tools.assert_equal(t.lookup("abc"), 14) | 20.208333 | 48 | 0.615464 | 160 | 970 | 3.55625 | 0.2 | 0.123023 | 0.237258 | 0.281195 | 0.704745 | 0.594025 | 0.499121 | 0.390158 | 0.28471 | 0.28471 | 0 | 0.047558 | 0.197938 | 970 | 48 | 48 | 20.208333 | 0.683805 | 0 | 0 | 0.4 | 0 | 0 | 0.027806 | 0 | 0 | 0 | 0 | 0 | 0.257143 | 1 | 0.171429 | false | 0 | 0.057143 | 0 | 0.228571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66cb3adf0d524f2655f2e19b20eca3615106fd58 | 225 | py | Python | Reddit-scripts/GetRedditLink/main.py | cheeseywhiz/cheeseywhiz | 51f6651ddbaeebd14d9ce77776bc4cf3a95511c4 | [
"MIT"
] | null | null | null | Reddit-scripts/GetRedditLink/main.py | cheeseywhiz/cheeseywhiz | 51f6651ddbaeebd14d9ce77776bc4cf3a95511c4 | [
"MIT"
] | null | null | null | Reddit-scripts/GetRedditLink/main.py | cheeseywhiz/cheeseywhiz | 51f6651ddbaeebd14d9ce77776bc4cf3a95511c4 | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import sys
from PyQt5 import QtWidgets
from Dialogs import Input
def main():
app = QtWidgets.QApplication([])
Input.InputDialog()
sys.exit(app.exec_())
if __name__ == '__main__':
main()
| 15 | 36 | 0.675556 | 28 | 225 | 5.107143 | 0.678571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010989 | 0.191111 | 225 | 14 | 37 | 16.071429 | 0.774725 | 0.075556 | 0 | 0 | 0 | 0 | 0.038647 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
66d0521bd0bf72697556d30f3c6128855a5bb584 | 799 | py | Python | cms/forms.py | danizavtz/sisimob | d1692cb59d1d761560aeaca98f9185f4f3b91040 | [
"MIT"
] | null | null | null | cms/forms.py | danizavtz/sisimob | d1692cb59d1d761560aeaca98f9185f4f3b91040 | [
"MIT"
] | null | null | null | cms/forms.py | danizavtz/sisimob | d1692cb59d1d761560aeaca98f9185f4f3b91040 | [
"MIT"
] | null | null | null | from django.contrib.auth.forms import AuthenticationForm
from django.contrib.auth.models import User
from .models import Imovel
from django import forms
from django.utils.html import strip_tags
class AuthenticateForm(AuthenticationForm):
username = forms.CharField(widget=forms.widgets.TextInput(attrs={'placeholder': 'usuário'}))
password = forms.CharField(widget=forms.widgets.PasswordInput(attrs={'placeholder':'senha'}))
def is_valid(self):
form = super(AuthenticateForm,self).is_valid()
for f, error in self.errors.items():
if f != '__all__':
self.fields[f].widget.attrs.update({'class': 'error', 'value': strip_tags(error)})
return form
class ImovelCadastro(forms.ModelForm):
class Meta:
model = Imovel
fields = '__all__' | 38.047619 | 98 | 0.715895 | 96 | 799 | 5.833333 | 0.510417 | 0.071429 | 0.060714 | 0.075 | 0.114286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162703 | 799 | 21 | 99 | 38.047619 | 0.83707 | 0 | 0 | 0 | 0 | 0 | 0.07875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.055556 | 0.277778 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
66d63bb5e9cd306acdbb844fc662ce33baf6ae9d | 507 | py | Python | harris_corner_detector_plot.py | mariekevanbrussel/Harris_Corner_Detector | 6a299bcb45470a35bb13716eaebfec091cc694f7 | [
"MIT"
] | null | null | null | harris_corner_detector_plot.py | mariekevanbrussel/Harris_Corner_Detector | 6a299bcb45470a35bb13716eaebfec091cc694f7 | [
"MIT"
] | null | null | null | harris_corner_detector_plot.py | mariekevanbrussel/Harris_Corner_Detector | 6a299bcb45470a35bb13716eaebfec091cc694f7 | [
"MIT"
] | null | null | null | #%%
import cv2
import numpy as np
from scipy.signal import convolve2d
from harris_corner_detector import harris_corner_detector
def harris_corner_detector_plot(img_color_orig, thresh, sigma=1, window_size=5):
img_color = img_color_orig.copy()
Points = harris_corner_detector(img_color, thresh, sigma, window_size)
radius = 1
color = (0, 255, 0) # Green
thickness = 1
for p in Points:
cv2.circle(img_color, (p[1], p[0]), radius, color, thickness)
return img_color
| 23.045455 | 80 | 0.717949 | 76 | 507 | 4.539474 | 0.460526 | 0.13913 | 0.231884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03423 | 0.193294 | 507 | 21 | 81 | 24.142857 | 0.809291 | 0.015779 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
66dec5d4537c3a2a5a813aabc936ad12573196fe | 884 | py | Python | src/fl_simulation/server/aggregation/multi_model_aggregator/aggregator_factory.py | microsoft/fl-simulation | d177d329c82559c7efe82deae8dea8f9baa49495 | [
"MIT"
] | 5 | 2021-12-14T02:21:53.000Z | 2021-12-26T07:45:13.000Z | src/fl_simulation/server/aggregation/multi_model_aggregator/aggregator_factory.py | microsoft/fl-simulation | d177d329c82559c7efe82deae8dea8f9baa49495 | [
"MIT"
] | 1 | 2022-01-04T04:51:20.000Z | 2022-01-04T04:51:20.000Z | src/fl_simulation/server/aggregation/multi_model_aggregator/aggregator_factory.py | microsoft/fl-simulation | d177d329c82559c7efe82deae8dea8f9baa49495 | [
"MIT"
] | null | null | null | """Aggregator factory."""
from typing import Any, Generic
import torch
import torch.nn as nn
from typing_extensions import Protocol
from ._types import T_aggregator
class AggregatorFactory(Protocol, Generic[T_aggregator]):
"""An Aggregator factory.
Used to instansiate a new aggregator. Works with existing aggregators. For example:
```python
a: AggregatorFactory = FedAvgAggregator.__call__
```
"""
def __call__(
self, intial_model: nn.Module, device: torch.device = torch.device("cpu"), *args: Any, **kwds: Any
) -> T_aggregator:
"""Instansiate a new aggregator.
Args:
intial_model (nn.Module): initial model.
device (torch.device, optional): device used for computation. Defaults to torch.device("cpu").
Returns:
T_aggregator: Aggregator
"""
...
| 24.555556 | 110 | 0.649321 | 97 | 884 | 5.752577 | 0.463918 | 0.078853 | 0.091398 | 0.089606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 884 | 35 | 111 | 25.257143 | 0.841629 | 0.466063 | 0 | 0 | 0 | 0 | 0.007853 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.5 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
dd055670a60b4dea5bd55fd87990e16c258a6ded | 8,838 | py | Python | rolling/action/base.py | coolkat64/rolling | 4c3ee2401128e993a52ac9b52cdbd32e17728129 | [
"MIT"
] | null | null | null | rolling/action/base.py | coolkat64/rolling | 4c3ee2401128e993a52ac9b52cdbd32e17728129 | [
"MIT"
] | null | null | null | rolling/action/base.py | coolkat64/rolling | 4c3ee2401128e993a52ac9b52cdbd32e17728129 | [
"MIT"
] | null | null | null | # coding: utf-8
import abc
import dataclasses
import typing
from urllib.parse import urlencode
import serpyco
from guilang.description import Description
from rolling.model.event import ZoneEvent
from rolling.model.event import ZoneEventData
from rolling.rolling_types import ActionType
from rolling.server.controller.url import CHARACTER_ACTION
from rolling.server.controller.url import WITH_BUILD_ACTION
from rolling.server.controller.url import WITH_CHARACTER_ACTION
from rolling.server.controller.url import WITH_RESOURCE_ACTION
from rolling.server.controller.url import WITH_STUFF_ACTION
from rolling.server.link import CharacterActionLink
if typing.TYPE_CHECKING:
from rolling.kernel import Kernel
from rolling.game.base import GameConfig
from rolling.model.character import CharacterModel
from rolling.model.stuff import StuffModel
def remove_none_values(dict_: dict) -> dict:
new_dict = dict(dict_)
for key, value in list(new_dict.items()):
if value is None:
del new_dict[key]
return new_dict
def get_character_action_url(
character_id: str, action_type: ActionType, action_description_id: str, query_params: dict
) -> str:
query_params = remove_none_values(query_params)
base_url = CHARACTER_ACTION.format(
character_id=character_id,
action_type=action_type.value,
action_description_id=action_description_id,
)
return f"{base_url}?{urlencode(query_params)}"
def get_with_build_action_url(
character_id: str,
build_id: int,
action_type: ActionType,
action_description_id: str,
query_params: dict,
) -> str:
query_params = remove_none_values(query_params)
base_url = WITH_BUILD_ACTION.format(
character_id=character_id,
action_type=action_type.value,
build_id=str(build_id),
action_description_id=action_description_id,
)
return f"{base_url}?{urlencode(query_params)}"
def get_with_stuff_action_url(
character_id: str,
action_type: ActionType,
stuff_id: int,
query_params: dict,
action_description_id: str,
) -> str:
query_params = remove_none_values(query_params)
base_url = WITH_STUFF_ACTION.format(
character_id=character_id,
action_type=action_type.value,
stuff_id=str(stuff_id),
action_description_id=action_description_id,
)
return f"{base_url}?{urlencode(query_params)}"
def get_with_resource_action_url(
character_id: str,
action_type: ActionType,
resource_id: str,
query_params: dict,
action_description_id: str,
) -> str:
query_params = remove_none_values(query_params)
base_url = WITH_RESOURCE_ACTION.format(
character_id=character_id,
action_type=action_type.value,
resource_id=resource_id,
action_description_id=action_description_id,
)
return f"{base_url}?{urlencode(query_params)}"
def get_with_character_action_url(
character_id: str,
action_type: ActionType,
with_character_id: str,
query_params: dict,
action_description_id: str,
) -> str:
query_params = remove_none_values(query_params)
base_url = WITH_CHARACTER_ACTION.format(
character_id=character_id,
action_type=action_type.value,
with_character_id=with_character_id,
action_description_id=action_description_id,
)
return f"{base_url}?{urlencode(query_params)}"
@dataclasses.dataclass
class ActionDescriptionModel:
id: str
action_type: ActionType
base_cost: float
properties: typing.Dict[str, typing.Any]
name: typing.Optional[str] = None
class Action(abc.ABC):
input_model: typing.Type[object]
input_model_serializer: serpyco.Serializer
def __init__(self, kernel: "Kernel", description: ActionDescriptionModel) -> None:
self._kernel = kernel
self._description = description
self._character_lib = kernel.character_lib
self._effect_manager = kernel.effect_manager
@classmethod
@abc.abstractmethod
def get_properties_from_config(cls, game_config: "GameConfig", action_config_raw: dict) -> dict:
pass
@property
def description(self) -> ActionDescriptionModel:
return self._description
class WithStuffAction(Action):
@abc.abstractmethod
def check_is_possible(self, character: "CharacterModel", stuff: "StuffModel") -> None:
pass
@abc.abstractmethod
def check_request_is_possible(
self, character: "CharacterModel", stuff: "StuffModel", input_: typing.Any
) -> None:
pass
@abc.abstractmethod
def get_character_actions(
self, character: "CharacterModel", stuff: "StuffModel"
) -> typing.List[CharacterActionLink]:
pass
def get_cost(
self,
character: "CharacterModel",
stuff: "StuffModel",
input_: typing.Optional[typing.Any] = None,
) -> typing.Optional[float]:
return self._description.base_cost
@abc.abstractmethod
def perform(
self, character: "CharacterModel", stuff: "StuffModel", input_: typing.Any
) -> Description:
pass
class WithBuildAction(Action):
@abc.abstractmethod
def check_is_possible(self, character: "CharacterModel", build_id: int) -> None:
pass
@abc.abstractmethod
def check_request_is_possible(
self, character: "CharacterModel", build_id: int, input_: typing.Any
) -> None:
pass
@abc.abstractmethod
def get_character_actions(
self, character: "CharacterModel", build_id: int
) -> typing.List[CharacterActionLink]:
pass
def get_cost(
self, character: "CharacterModel", build_id: int, input_: typing.Optional[typing.Any] = None
) -> typing.Optional[float]:
return self._description.base_cost
@abc.abstractmethod
def perform(
self, character: "CharacterModel", build_id: int, input_: typing.Any
) -> Description:
pass
class WithResourceAction(Action):
@abc.abstractmethod
def check_is_possible(self, character: "CharacterModel", resource_id: str) -> None:
pass
@abc.abstractmethod
def check_request_is_possible(
self, character: "CharacterModel", resource_id: str, input_: typing.Any
) -> None:
pass
@abc.abstractmethod
def get_character_actions(
self, character: "CharacterModel", resource_id: str
) -> typing.List[CharacterActionLink]:
pass
def get_cost(
self,
character: "CharacterModel",
resource_id: str,
input_: typing.Optional[typing.Any] = None,
) -> typing.Optional[float]:
return self._description.base_cost
@abc.abstractmethod
def perform(
self, character: "CharacterModel", resource_id: str, input_: typing.Any
) -> Description:
pass
class CharacterAction(Action):
@abc.abstractmethod
def check_is_possible(self, character: "CharacterModel") -> None:
pass
@abc.abstractmethod
def check_request_is_possible(self, character: "CharacterModel", input_: typing.Any) -> None:
pass
@abc.abstractmethod
def get_character_actions(
self, character: "CharacterModel"
) -> typing.List[CharacterActionLink]:
pass
@abc.abstractmethod
def perform(self, character: "CharacterModel", input_: typing.Any) -> Description:
pass
def perform_from_event(
self, character: "CharacterModel", input_: typing.Any
) -> typing.Tuple[typing.List[ZoneEvent], typing.List[ZoneEvent]]:
"""
return: [0]: all zone websockets; [1]: sender socket
"""
pass
def get_cost(
self, character: "CharacterModel", input_: typing.Optional[typing.Any] = None
) -> typing.Optional[float]:
return self._description.base_cost
class WithCharacterAction(Action):
@abc.abstractmethod
def check_is_possible(
self, character: "CharacterModel", with_character: "CharacterModel"
) -> None:
pass
@abc.abstractmethod
def check_request_is_possible(
self, character: "CharacterModel", with_character: "CharacterModel", input_: typing.Any
) -> None:
pass
@abc.abstractmethod
def get_character_actions(
self, character: "CharacterModel", with_character: "CharacterModel"
) -> typing.List[CharacterActionLink]:
pass
def get_cost(
self,
character: "CharacterModel",
with_character: "CharacterModel",
input_: typing.Optional[typing.Any] = None,
) -> typing.Optional[float]:
return self._description.base_cost
@abc.abstractmethod
def perform(
self, character: "CharacterModel", with_character: "CharacterModel", input_: typing.Any
) -> Description:
pass
| 29.264901 | 100 | 0.695293 | 993 | 8,838 | 5.917422 | 0.106747 | 0.121341 | 0.119469 | 0.044929 | 0.755106 | 0.735194 | 0.697754 | 0.672907 | 0.625936 | 0.572668 | 0 | 0.000432 | 0.213397 | 8,838 | 301 | 101 | 29.362126 | 0.844793 | 0.007581 | 0 | 0.622951 | 0 | 0 | 0.077741 | 0.020578 | 0 | 0 | 0 | 0 | 0 | 1 | 0.143443 | false | 0.090164 | 0.077869 | 0.02459 | 0.327869 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
dd0806cc6929af588554bc76afba1cb03d0da903 | 635 | py | Python | djmo/tests/project_for_tests/apps/soccer/models.py | davidegriffon/djmo | 853212471e59a0fa49d22f17cf57caeb552783f8 | [
"MIT"
] | 5 | 2015-12-09T08:13:03.000Z | 2019-05-12T07:14:25.000Z | djmo/tests/project_for_tests/apps/soccer/models.py | davidegriffon/djmo | 853212471e59a0fa49d22f17cf57caeb552783f8 | [
"MIT"
] | null | null | null | djmo/tests/project_for_tests/apps/soccer/models.py | davidegriffon/djmo | 853212471e59a0fa49d22f17cf57caeb552783f8 | [
"MIT"
] | null | null | null | """
Only for test purpose
"""
from django.db import models
class SoccerTeam(models.Model):
name = models.CharField(max_length=50)
number_of_supporters = models.IntegerField()
@property
def all_players(self):
return SoccerPlayer.objects.filter(team=self)
def __str__(self):
return "<SoccerTeam `{}`>".format(self.name)
class SoccerPlayer(models.Model):
team = models.ForeignKey(SoccerTeam)
first_name = models.CharField(max_length=50)
last_name = models.CharField(max_length=50)
def __str__(self):
return "<SoccerPlayer `{} {}`>".format(self.first_name, self.last_name) | 25.4 | 79 | 0.696063 | 77 | 635 | 5.506494 | 0.467532 | 0.070755 | 0.134434 | 0.15566 | 0.212264 | 0.212264 | 0 | 0 | 0 | 0 | 0 | 0.011472 | 0.176378 | 635 | 25 | 79 | 25.4 | 0.799235 | 0.033071 | 0 | 0.133333 | 0 | 0 | 0.06425 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.066667 | 0.2 | 0.933333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
dd0b52fd3540b89515ea68476efa56f9b21e2667 | 295 | py | Python | app/users/tests/utils.py | noxITRS/family-budget | 987621256f3e8290ae58854d2ba22db8be3e73d3 | [
"MIT"
] | null | null | null | app/users/tests/utils.py | noxITRS/family-budget | 987621256f3e8290ae58854d2ba22db8be3e73d3 | [
"MIT"
] | null | null | null | app/users/tests/utils.py | noxITRS/family-budget | 987621256f3e8290ae58854d2ba22db8be3e73d3 | [
"MIT"
] | null | null | null | from unittest.mock import ANY
def get_expected_user(item, **kwargs):
return {
"id": kwargs.get("id", item.id),
"username": kwargs.get("username", item.username),
"email": kwargs.get("email", item.email),
"date_joined": kwargs.get("date_joined", ANY),
}
| 26.818182 | 58 | 0.60678 | 37 | 295 | 4.72973 | 0.459459 | 0.205714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216949 | 295 | 10 | 59 | 29.5 | 0.757576 | 0 | 0 | 0 | 0 | 0 | 0.176271 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0.125 | 0.375 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
dd1c7ac3b7fe58bc5fe19b4a35bfdf6242335a67 | 280 | py | Python | math/abstract-algebra/1.py | admariner/playground | 02a3104472c8fa3589fe87f7265e70c61d5728c7 | [
"MIT"
] | 3 | 2021-06-12T04:42:32.000Z | 2021-06-24T13:57:38.000Z | math/abstract-algebra/1.py | admariner/playground | 02a3104472c8fa3589fe87f7265e70c61d5728c7 | [
"MIT"
] | null | null | null | math/abstract-algebra/1.py | admariner/playground | 02a3104472c8fa3589fe87f7265e70c61d5728c7 | [
"MIT"
] | 1 | 2021-08-19T14:57:17.000Z | 2021-08-19T14:57:17.000Z | # Semigroup = non-empty set with an associative binary operation
# using addition
print((2 + 3) + 4 == 2 + (3 + 4) == 9)
# We get a property of closure, because the number we get is still a number of the same set,
# natural numbers, including 0
# (2 + 3) + 4
# 2 + (3 + 4)
# 9
| 23.333333 | 92 | 0.635714 | 49 | 280 | 3.632653 | 0.653061 | 0.044944 | 0.067416 | 0.044944 | 0.078652 | 0.078652 | 0.078652 | 0 | 0 | 0 | 0 | 0.070755 | 0.242857 | 280 | 11 | 93 | 25.454545 | 0.768868 | 0.796429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
dd2f9b1bc293903b0a611273338d842fb70f439e | 410 | py | Python | chap03/exercise/Loader.py | viekie/basic_deeplearning | 6c9e55cd621504da3d7ea1627e6783c9819a1916 | [
"Apache-2.0"
] | 3 | 2017-05-23T08:11:44.000Z | 2017-09-25T11:17:57.000Z | chap03/exercise/Loader.py | viekie/basic_deeplearning | 6c9e55cd621504da3d7ea1627e6783c9819a1916 | [
"Apache-2.0"
] | null | null | null | chap03/exercise/Loader.py | viekie/basic_deeplearning | 6c9e55cd621504da3d7ea1627e6783c9819a1916 | [
"Apache-2.0"
] | 1 | 2017-06-19T03:36:40.000Z | 2017-06-19T03:36:40.000Z | #!/usr/bin/env python
# -*- coding:utf8 -*-
# Power by viekie. 2017-05-19 15:36:33
import struct
class Loader(object):
def __init__(self, path, count):
self.path = path
self.count = count
def get_file_content(self):
with open(self.path, 'rb') as f:
content = f.read()
return content
def to_int(self, byte):
return struct.unpack('B', byte)[0]
| 21.578947 | 42 | 0.587805 | 59 | 410 | 3.966102 | 0.694915 | 0.102564 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053691 | 0.273171 | 410 | 18 | 43 | 22.777778 | 0.731544 | 0.187805 | 0 | 0 | 0 | 0 | 0.009091 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.090909 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dd304c144be6ea52163169a230cb7429ebc3d520 | 337 | py | Python | 30_day_leetcoding_challenge/2020_11/03-Consecutive_Characters.py | QuenLo/leecode | ce861103949510dc54fd5cb336bd992c40748de2 | [
"MIT"
] | 6 | 2018-06-13T06:48:42.000Z | 2020-11-25T10:48:13.000Z | 30_day_leetcoding_challenge/2020_11/03-Consecutive_Characters.py | QuenLo/leecode | ce861103949510dc54fd5cb336bd992c40748de2 | [
"MIT"
] | null | null | null | 30_day_leetcoding_challenge/2020_11/03-Consecutive_Characters.py | QuenLo/leecode | ce861103949510dc54fd5cb336bd992c40748de2 | [
"MIT"
] | null | null | null | class Solution:
def maxPower(self, s: str) -> int:
power = []
i, temp = 1, ""
for s_char in s:
if s_char == temp:
i += 1
else:
power.append( i )
i = 1
temp = s_char
power.append(i)
return max(power)
| 21.0625 | 38 | 0.373887 | 38 | 337 | 3.236842 | 0.526316 | 0.121951 | 0.195122 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018868 | 0.52819 | 337 | 15 | 39 | 22.466667 | 0.754717 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd34ed621068d713f8d551cffc83faae660649cc | 2,637 | py | Python | tutorials/python/decorators2020/decorators.py | joamatab/install_new_computer | 92080adc4bdd693bbd07c19e696a2a6cae8bb8f8 | [
"MIT"
] | 3 | 2019-09-30T22:33:45.000Z | 2021-06-02T03:01:55.000Z | tutorials/python/decorators2020/decorators.py | joamatab/install_new_computer | 92080adc4bdd693bbd07c19e696a2a6cae8bb8f8 | [
"MIT"
] | null | null | null | tutorials/python/decorators2020/decorators.py | joamatab/install_new_computer | 92080adc4bdd693bbd07c19e696a2a6cae8bb8f8 | [
"MIT"
] | 1 | 2019-12-27T11:19:36.000Z | 2019-12-27T11:19:36.000Z | import functools
import time
REGISTERED = {}
def register(func):
REGISTERED[func.__name__] = func
return func
def timer(func):
""" template for decorators """
@functools.wraps(func)
def _timer(*args, **kwargs):
t0 = time.perf_counter()
value = func(*args, **kwargs)
t1 = time.perf_counter()
print(f"elapsed time: {t1-t0} seconds")
return value
return _timer
def repeat_n(num_times=2):
""" repeat n times"""
def decorator_repeat(func):
@functools.wraps(func)
def _wrapper(*args, **kwargs):
for _ in range(num_times):
value = func(*args, **kwargs)
return value
return _wrapper
return decorator_repeat
def repeat(_func=None, *, num_times=2):
def decorator_repeat(func):
@functools.wraps(func)
def wrapper_repeat(*args, **kwargs):
for _ in range(num_times):
value = func(*args, **kwargs)
return value
return wrapper_repeat
if _func is None:
return decorator_repeat
else:
return decorator_repeat(_func)
def trace(func):
""" trace """
@functools.wraps(func)
def _wrapper(*args, **kwargs):
args_repr = [repr(a) for a in args]
kwargs_repr = [f"{k}={v!r}" for k, v in kwargs.items()]
signature = ", ".join(args_repr + kwargs_repr)
print(f"calling {func.__name__}({signature})")
value = func(*args, **kwargs)
print(f"{func.__name__!r}({signature}) returned {value!r}")
return value
return _wrapper
# def count_calls(func):
# print('called')
# if hasattr(func, 'num_calls'):
# func.num_calls += 1
# else:
# func.num_calls = 1
# return func
def count_calls(func):
""" count the number of calls to a function
shows how to keep state in your decorator
"""
@functools.wraps(func)
def _count_calls(*args, **kwargs):
_count_calls.num_calls += 1
return func(*args, **kwargs)
_count_calls.num_calls = 0
return _count_calls
class Adder:
def __init__(self, number):
self.number = number
def __call__(self, other):
return other + self.number
class CountCalls:
""" count number of calls to a function"""
def __init__(self, func):
self.func = func
self.num_calls = 0
functools.update_wrapper(self, func)
def __call__(self, *args, **kwargs):
self.num_calls += 1
return self.func(*args, **kwargs)
if __name__ == "__main__":
add_3 = Adder(3)
print(add_3(5))
| 20.928571 | 67 | 0.589306 | 325 | 2,637 | 4.52 | 0.236923 | 0.088496 | 0.057182 | 0.071477 | 0.281824 | 0.261402 | 0.190606 | 0.164738 | 0.164738 | 0.096664 | 0 | 0.008524 | 0.288206 | 2,637 | 125 | 68 | 21.096 | 0.774108 | 0.122867 | 0 | 0.328571 | 0 | 0 | 0.058772 | 0.02563 | 0 | 0 | 0 | 0 | 0 | 1 | 0.242857 | false | 0 | 0.028571 | 0.014286 | 0.528571 | 0.057143 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dd4ff723cfd9e94008f392b049f7cd0a5d450c66 | 6,608 | py | Python | convert_tf2npu/conver_by_ast.py | Judithsq/tensorflow | 82507eb26ab120c02bf217610975754100686970 | [
"Apache-2.0"
] | null | null | null | convert_tf2npu/conver_by_ast.py | Judithsq/tensorflow | 82507eb26ab120c02bf217610975754100686970 | [
"Apache-2.0"
] | 1 | 2021-04-10T08:28:59.000Z | 2021-06-15T01:46:05.000Z | convert_tf2npu/conver_by_ast.py | Judithsq/tensorflow | 82507eb26ab120c02bf217610975754100686970 | [
"Apache-2.0"
] | 1 | 2021-04-10T07:51:47.000Z | 2021-04-10T07:51:47.000Z | # Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless REQUIRED by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
import os
import sys
import ast
import astunparse
import util_global
from file_op import write_output_after_conver
from file_op import write_report_after_conver
from file_op import scan_file
from util import log_success_report
from util import log_migration_report
from ast_impl import attribute
from ast_impl import node_tree
from ast_impl import insert_config_pb2_import
from ast_impl import insert_npu_init_func
from ast_impl import insert_NPUBroadcastGlobalVariablesHook_import
from ast_impl import insert_npu_hooks_append_func
from ast_impl import insert_npu_run_config_func
from ast_impl import insert_npu_session_config_func
from ast_impl import insert_RewriterConfig_import
from ast_impl import insert_npu_import
from ast_impl import insert_npu_tf_opt_func
from ast_impl import insert_npu_keras_opt_func
from ast_impl import insert_empty_hook
from ast_impl import import_from
from ast_impl import ast_import
from ast_impl import ast_function_def
from ast_impl import ast_call
from ast_impl import ast_assign
from visit_by_ast import get_tf_api
class ConverByAst(ast.NodeTransformer):
def generic_visit(self, node):
ast.NodeTransformer.generic_visit(self, node)
return node
def visit_Attribute(self, node):
self.generic_visit(node)
#if node.attr in util_global.get_value('estimator') and isinstance(node.value, ast.Attribute):
# if node.value.attr == 'estimator':
# return attribute(node)
if node.attr in util_global.get_value('hvd'):
if isinstance(node.value, ast.Name):
if 'hvd' in str(node.value.id):
return attribute(node)
if isinstance(node.value, ast.Attribute):
if 'hvd' in str(node.value.attr):
return attribute(node)
if node.attr == 'run':
log_migration_report(getattr(node, "lineno", "None"), node.attr)
util_global.set_value('need_conver', True)
return node
return node
def visit_FunctionDef(self, node):
if node.name == 'gelu':
return ast_function_def(node)
self.generic_visit(node)
return node
def visit_Call(self, node):
node = ast_call(node)
self.generic_visit(node)
return node
def visit_ImportFrom(self, node):
import_from(node)
self.generic_visit(node)
return node
def visit_Import(self, node):
ast_import(node)
self.generic_visit(node)
return node
def visit_Assign(self, node):
for target in node.targets:
if (isinstance(target, ast.Name) and target.id == 'global_jit_level') or (isinstance(target, ast.Attribute) and target.attr == 'global_jit_level'):
return ast_assign(node)
if isinstance(node.value, ast.Call) and isinstance(node.value.func, ast.Attribute):
if (node.value.func.attr == 'ConfigProto') or (node.value.func.attr == 'GraphOptions') or (node.value.func.attr == 'OptimizerOptions'):
return ast_assign(node)
ast_assign(node)
self.generic_visit(node)
return node
def conver_ast(path, out_path_dst, file_name):
util_global.set_value('need_conver', False)
util_global.set_value('import_config_pb2', False)
util_global.set_value('insert_npu_init_func', False)
util_global.set_value('insert_estimator_add_hook_func', False)
util_global.set_value('insert_npu_hooks_append', False)
util_global.set_value('import_NPUBroadcastGlobalVariablesHook', False)
util_global.set_value('insert_npu_run_config_func', False)
util_global.set_value('insert_npu_session_config_func', False)
util_global.set_value('import_RewriterConfig', False)
util_global.set_value('insert_npu_tf_opt_func', False)
util_global.set_value('insert_npu_keras_opt_func', False)
util_global.set_value('insert_empty_hook', False)
with open(os.path.join(path, file_name), "r", encoding='utf-8') as file:
source = file.read()
r_node = ast.parse(source)
sys.setrecursionlimit(10000)
visitor = ConverByAst()
visitor.visit(r_node)
ast.fix_missing_locations(r_node)
(api, lineno) = get_tf_api(os.path.join(path, file_name))
if len(api) == 0:
print("No Tensorflow module is imported in script {}.".format(file_name))
scan_file(file_name, api, lineno)
if util_global.get_value('need_conver', False):
insert_npu_import(r_node)
if util_global.get_value('insert_npu_hooks_append', False):
insert_npu_hooks_append_func(r_node)
if util_global.get_value('import_NPUBroadcastGlobalVariablesHook', False):
insert_NPUBroadcastGlobalVariablesHook_import(r_node)
if util_global.get_value('insert_npu_run_config_func', False):
insert_npu_run_config_func(r_node)
if util_global.get_value('insert_npu_session_config_func', False):
insert_npu_session_config_func(r_node)
if util_global.get_value('import_config_pb2', False):
insert_config_pb2_import(r_node)
if util_global.get_value('insert_npu_init_func', False):
insert_npu_init_func(r_node)
if util_global.get_value('import_RewriterConfig', False):
insert_RewriterConfig_import(r_node)
if util_global.get_value('insert_npu_tf_opt_func', False):
insert_npu_tf_opt_func(r_node)
if util_global.get_value('insert_npu_keras_opt_func', False):
insert_npu_keras_opt_func(r_node)
if util_global.get_value('insert_empty_hook', False):
insert_empty_hook(r_node)
dst_content = astunparse.unparse(r_node)
write_output_after_conver(os.path.join(util_global.get_value('output'), out_path_dst, file_name), dst_content)
if file_name.endswith("a.py"):
write_report_after_conver("only_for_test", file_name, node_tree(ast.dump(r_node))) | 42.909091 | 159 | 0.71368 | 935 | 6,608 | 4.720856 | 0.189305 | 0.063435 | 0.044857 | 0.069325 | 0.527866 | 0.414363 | 0.308111 | 0.184413 | 0.142728 | 0.062755 | 0 | 0.003561 | 0.192494 | 6,608 | 154 | 160 | 42.909091 | 0.823651 | 0.120914 | 0 | 0.145161 | 0 | 0 | 0.123101 | 0.069061 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0 | 0.354839 | 0 | 0.532258 | 0.008065 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
dd500cc19d38d536b493d29c56a7e61833c9fba0 | 124 | py | Python | MUL.py | ritwik15416/SPOJ-Classical-Problems | 128c491afd6f0cec4d616e1f3cb784389101826b | [
"MIT"
] | null | null | null | MUL.py | ritwik15416/SPOJ-Classical-Problems | 128c491afd6f0cec4d616e1f3cb784389101826b | [
"MIT"
] | null | null | null | MUL.py | ritwik15416/SPOJ-Classical-Problems | 128c491afd6f0cec4d616e1f3cb784389101826b | [
"MIT"
] | null | null | null | # Used Python for handling large integers
for _ in range(int(input())):
a,b = list(map(int,input().split()))
print(a*b)
| 20.666667 | 41 | 0.66129 | 21 | 124 | 3.857143 | 0.761905 | 0.197531 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153226 | 124 | 5 | 42 | 24.8 | 0.771429 | 0.314516 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd528a57669cef77ac7a00f0db44093564eeadd2 | 311 | py | Python | Programmers/[cutz]2016.py | cutz-j/AlgorithmStudy | de0f81220e29bd5e109d174800f507b12a3bee36 | [
"MIT"
] | 3 | 2019-11-26T14:31:01.000Z | 2020-01-10T18:19:46.000Z | Programmers/[cutz]2016.py | cutz-j/AlgorithmStudy | de0f81220e29bd5e109d174800f507b12a3bee36 | [
"MIT"
] | null | null | null | Programmers/[cutz]2016.py | cutz-j/AlgorithmStudy | de0f81220e29bd5e109d174800f507b12a3bee36 | [
"MIT"
] | null | null | null | def solution(a, b):
answer = ''
month = {0:0, 1:31, 2:29, 3:31, 4:30, 5:31, 6:30, 7:31, 8:31, 9:30, 10:31, 11:30, 12:31}
day = {1:'FRI',2:'SAT',3:'SUN', 4:'MON',5:'TUE',6:'WED', 0:'THU'}
d = 0
for i in range(0, a):
d += month[i]
d += b
answer = day[(d % 7)]
return answer | 31.1 | 92 | 0.459807 | 66 | 311 | 2.166667 | 0.545455 | 0.097902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.226667 | 0.276527 | 311 | 10 | 93 | 31.1 | 0.408889 | 0 | 0 | 0 | 0 | 0 | 0.067308 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd55138352fcba509556e05eefd4de0090ed8535 | 953 | py | Python | src/matchingframework/scores/clusters.py | quoctoanpk2511/matching-framework | 03975cadfd67aa3d24f3a9ab83558633243d526d | [
"MIT"
] | null | null | null | src/matchingframework/scores/clusters.py | quoctoanpk2511/matching-framework | 03975cadfd67aa3d24f3a9ab83558633243d526d | [
"MIT"
] | null | null | null | src/matchingframework/scores/clusters.py | quoctoanpk2511/matching-framework | 03975cadfd67aa3d24f3a9ab83558633243d526d | [
"MIT"
] | null | null | null | import abc
class Clustering:
def add_matcher(self, matcher):
"""
Add the match object on the Clustering.
Args:
matcher: matchingframework.match.matchers.Matcher
Returns: None
"""
self.matcher = matcher
"""The matcher object"""
@abc.abstractmethod
def analyze(self):
"""
Analyze data using Cluster Analysis.
"""
def add_cluster(self):
"""
Adding cluster results to two matcher's dataset
"""
cluster_left, cluster_right = self.split_list_records()
self.matcher.left_data.df['match_cluster_id'] = cluster_left
self.matcher.right_data.df['match_cluster_id'] = cluster_right
def split_list_records(self):
"""
Get the split point.
"""
part = len(self.matcher.left_data.df['id_left'])
return self.matcher.clusters[:part], self.matcher.clusters[part:]
| 24.435897 | 73 | 0.60021 | 107 | 953 | 5.17757 | 0.392523 | 0.138989 | 0.057762 | 0.072202 | 0.162455 | 0.097473 | 0 | 0 | 0 | 0 | 0 | 0 | 0.294858 | 953 | 38 | 74 | 25.078947 | 0.824405 | 0.231899 | 0 | 0 | 0 | 0 | 0.066102 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0 | 0.076923 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dd5b41794980885802ed9a4901457028e14e12e1 | 512 | py | Python | Conteudo das Aulas/084/8rainhas.py | cerberus707/lab-python | ebba3c9cde873d70d4bb61084f79ce30b7f9e047 | [
"Apache-2.0"
] | null | null | null | Conteudo das Aulas/084/8rainhas.py | cerberus707/lab-python | ebba3c9cde873d70d4bb61084f79ce30b7f9e047 | [
"Apache-2.0"
] | null | null | null | Conteudo das Aulas/084/8rainhas.py | cerberus707/lab-python | ebba3c9cde873d70d4bb61084f79ce30b7f9e047 | [
"Apache-2.0"
] | null | null | null | """
Você deve resolver o clássico exercício das 8 rainhas
Nele o usuário lhe passa o tamanho do tabuleiro n
(lembrar que tabuleiros são quadrados então o usuário
só precisa lhe passar um inteiro) e você deve gerar
uma todas as distribuições de n rainhas neste tabuleiro
e imprimi-las de uma forma adequada.
Veja o livro Beginning Python, na descrição
do video para a explicação da solução, ou entre
no dropbox para ver a solução comentada
Esse exercício não é fácil!!
Não se preocupe se você não conseguir
"""
| 32 | 55 | 0.794922 | 88 | 512 | 4.625 | 0.738636 | 0.039312 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002375 | 0.177734 | 512 | 15 | 56 | 34.133333 | 0.964371 | 0.982422 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd5c98849c8f5e07ef866d7cb464355449c46746 | 1,473 | py | Python | code/visualisation/plot2d.py | yidanliu-psy/final_project | 2fadf3687ec1a47ca9920a05b32c231c0ba23752 | [
"MIT"
] | 3 | 2021-01-30T08:01:41.000Z | 2021-05-13T14:43:05.000Z | code/visualisation/plot2d.py | yidanliu-psy/GameVis | 2fadf3687ec1a47ca9920a05b32c231c0ba23752 | [
"MIT"
] | 1 | 2020-08-25T01:58:14.000Z | 2020-09-09T01:47:09.000Z | code/visualisation/plot2d.py | yidanliu-psy/GameVis | 2fadf3687ec1a47ca9920a05b32c231c0ba23752 | [
"MIT"
] | 1 | 2020-08-16T21:54:50.000Z | 2020-08-16T21:54:50.000Z | # plot 2d 2rd-order regression
import matplotlib.pyplot as plt
import numpy as np
I = np.arange(4.40990640657, 58.51715740979401, 0.1) # (min, max, step)
I_coef_list = [
[-0.13927776461608068, 0.002038919337462459, 2.3484253283211487], # doubles
[-0.09850865867608666, 0.001429693439506745, 1.673346834786426], # triples
[-0.04062711141502941, 0.0005561811330575912, 0.7243531132252763], # quadruples
[-0.09931019535824288, 0.0014388069537618795, 1.6896976115572775] # all
]
# coef for [order-1, order-2, constant]
OvI = np.arange(0.3978092294245647, 7.325677230721877, 0.01)
OvI_coef_list = [
[0.08149445158061253, 0.19664308140877118, -0.27567136158525557],
[0.1691745820752472, 0.07835210254462913, -0.23984672383683908],
[0.055910443720261264, 0.05469892798492301, -0.10648949308046807],
[0.15332557744227973, 0.09344338104295906, -0.24194502634113763]
]
# now set for fig. 4
x = I
coef_list = I_coef_list
name = "I"
type_list = ["Doubles", "Triples", "Quadruples", "All"]
pattern_list = ['--', '-.', ':', '']
fig, ax = plt.subplots()
for i in range(len(coef_list)):
y = coef_list[i][0] * x + coef_list[i][1] * np.power(x, 2) + coef_list[i][2] * 1 # prediction
ax.plot(x, y, pattern_list[i], label=type_list[i])
ax.set_xlabel(name, size=15)
ax.set_ylabel('Δ', size=15)
ax.tick_params(axis='both', direction='in')
plt.yticks(rotation=90)
ax.legend()
plt.savefig(name.replace('/', 'v') + ".png", dpi=300)
plt.show()
| 29.46 | 97 | 0.698574 | 199 | 1,473 | 5.080402 | 0.492462 | 0.063304 | 0.035608 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.405956 | 0.133741 | 1,473 | 49 | 98 | 30.061224 | 0.386364 | 0.098439 | 0 | 0 | 0 | 0 | 0.035034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd6b89f17c431e5b6b50bef3d69de51b6790cd10 | 8,052 | py | Python | int-tests/test_server.py | manimaul/xio | cd3e24fa61f0d4bd2a3582077071ea9f0218a5fe | [
"Apache-2.0"
] | 40 | 2016-08-18T21:09:50.000Z | 2022-01-26T11:38:19.000Z | int-tests/test_server.py | manimaul/xio | cd3e24fa61f0d4bd2a3582077071ea9f0218a5fe | [
"Apache-2.0"
] | 78 | 2015-01-27T05:45:46.000Z | 2021-03-31T18:47:13.000Z | int-tests/test_server.py | manimaul/xio | cd3e24fa61f0d4bd2a3582077071ea9f0218a5fe | [
"Apache-2.0"
] | 29 | 2015-10-09T16:52:37.000Z | 2019-04-10T01:07:44.000Z | import os.path as path
import unittest
from unittest import TestCase
from server_controller import Server, Initializer, module_dir
from unsafe_client import http_get, http_post, Response
back_init = Initializer('int-test-backend-server')
front_init = Initializer('int-test-proxy-server')
back_end = None
front_end = None
class TestReverseProxyServer(TestCase):
@classmethod
def tearDownClass(cls):
for each in [each for each in [back_end, front_end] if each is not None]:
each.kill()
@classmethod
def setup_back(cls, h2: bool, h1_tls: bool = True, verbose=False):
back_ready_str = "Active Connections"
global back_end
back_end = Server(back_init.init_script, back_ready_str, h2, h1_tls, name="backend1", port=8444, verbose=verbose).run()
@classmethod
def setup_front(cls, h2: bool, back_tsl: bool = True, verbose=False):
front_ready_str = "proxy accepting connections"
conf = path.abspath(path.join(module_dir, "proxy.conf"))
if h2:
proxy_config = 'xio.h2ReverseProxy'
else:
proxy_config = 'xio.h1ReverseProxy'
if back_tsl:
client_config = 'xio.testProxyRoute'
else:
client_config = 'xio.testProxyRoutePlainText'
global front_end
front_end = Server(front_init.init_script, front_ready_str, conf, proxy_config, client_config,
name="proxy", verbose=verbose).run()
def check_response(self, response: Response, method: str):
self.assertEqual('backend1', response.headers['x-tag'])
self.assertEqual(method, response.headers['x-method'])
self.assertEqual('echo', response.headers['x-echo'])
keys = { k.lower(): k for k in response.headers.keys() }
self.assertFalse('transfer-encoding' in keys and 'content-length' in keys)
self.assertEqual({'title': 'Release', 'description': 'the Kraken'}, response.json_body)
self.assertEqual(200, response.status)
class TestReverseProxyServerH1H1PlainText(TestReverseProxyServer):
@classmethod
def setUpClass(cls):
print("setup h1:h1")
cls.setup_front(h2=False, back_tsl=False)
cls.setup_back(h2=False, h1_tls=False)
# @skip
def test_backend_server_get_h1_works(self):
response = http_get(url='http://localhost:{}/'.format(back_end.port), headers={"x-echo": "echo"}, h2=False)
self.check_response(response, 'GET')
# @skip
def test_backend_server_post_h1_works(self):
response = http_post(url='http://localhost:{}/'.format(back_end.port), data={"key": "value"},
headers={"x-echo": "echo"}, h2=False)
self.check_response(response, 'POST')
# @skip
def test_proxy_get_h1_h1(self):
responses = [
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=False),
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=False),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_proxy_post_h1_h1(self):
responses = [
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=False),
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=False),
]
for response in responses:
self.check_response(response, 'POST')
class TestReverseProxyServerH1H1(TestReverseProxyServer):
@classmethod
def setUpClass(cls):
print("setup h1:h1")
cls.setup_front(h2=False)
cls.setup_back(h2=False)
# @skip
def test_backend_server_get_h1_works(self):
response = http_get(url='https://localhost:{}/'.format(back_end.port), headers={"x-echo": "echo"}, h2=False)
self.check_response(response, 'GET')
# @skip
def test_backend_server_post_h1_works(self):
response = http_post(url='https://localhost:{}/'.format(back_end.port), data={"key": "value"},
headers={"x-echo": "echo"}, h2=False)
self.check_response(response, 'POST')
# @skip
def test_proxy_get_h1_h1(self):
responses = [
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=False),
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=False),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_proxy_post_h1_h1(self):
responses = [
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=False),
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=False),
]
for response in responses:
self.check_response(response, 'POST')
class TestReverseProxyServerH2H1(TestReverseProxyServer):
@classmethod
def setUpClass(cls):
print("setup h2:h1")
cls.setup_front(h2=True)
cls.setup_back(h2=False)
# @skip
def test_proxy_get_h2_h1(self):
responses = [
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_proxy_post_h2_h1(self):
responses = [
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'POST')
class TestReverseProxyServerH2H1PlainText(TestReverseProxyServer):
@classmethod
def setUpClass(cls):
print("setup h2:h1")
cls.setup_front(h2=True, back_tsl=False)
cls.setup_back(h2=False, h1_tls=False)
# @skip
def test_proxy_get_h2_h1(self):
responses = [
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_proxy_post_h2_h1(self):
responses = [
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'POST')
class TestReverseProxyServerH2H2(TestReverseProxyServer):
@classmethod
def setUpClass(cls):
print("setup h2:h2")
cls.setup_front(h2=True)
cls.setup_back(h2=True)
# @skip
def test_backend_server_get_h2_works(self):
responses = [
http_get(url='https://localhost:{}/'.format(back_end.port), headers={"x-echo": "echo"}, h2=True),
http_get(url='https://localhost:{}/'.format(back_end.port), headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_backend_server_post_h2_works(self):
responses = [
http_post(url='https://localhost:{}/'.format(back_end.port), data={"key": "value"}, headers={"x-echo": "echo"},
h2=True),
http_post(url='https://localhost:{}/'.format(back_end.port), data={"key": "value"}, headers={"x-echo": "echo"},
h2=True),
]
for response in responses:
self.check_response(response, 'POST')
# @skip
def test_proxy_get_h2_h2(self):
responses = [
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
http_get(url='https://localhost:8443/', headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'GET')
# @skip
def test_proxy_post_h2_h2(self):
responses = [
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
http_post(url='https://localhost:8443/', data={"key": "value"}, headers={"x-echo": "echo"}, h2=True),
]
for response in responses:
self.check_response(response, 'POST')
if __name__ == '__main__':
unittest.main()
| 34.55794 | 123 | 0.65996 | 1,055 | 8,052 | 4.860664 | 0.109953 | 0.048362 | 0.067863 | 0.087363 | 0.707098 | 0.703588 | 0.697933 | 0.691303 | 0.676677 | 0.667512 | 0 | 0.027358 | 0.169275 | 8,052 | 232 | 124 | 34.706897 | 0.739273 | 0.011798 | 0 | 0.603448 | 0 | 0 | 0.179471 | 0.008942 | 0 | 0 | 0 | 0 | 0.034483 | 1 | 0.143678 | false | 0 | 0.028736 | 0 | 0.206897 | 0.028736 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd7fe2505156aff13677bf728abddadd6a2cd4aa | 212 | py | Python | src/platform/tomcat/fingerprints/Tomcat33.py | 0x27/clusterd | 0f04a4955c61aa523274e9ae35d750f4339b1e59 | [
"MIT"
] | 539 | 2015-01-08T23:59:32.000Z | 2022-03-29T17:53:02.000Z | src/platform/tomcat/fingerprints/Tomcat33.py | M31MOTH/clusterd | d190b2cbaa93820e928a7ce5471c661d4559fb7c | [
"MIT"
] | 21 | 2015-01-17T21:51:21.000Z | 2019-09-20T09:23:18.000Z | src/platform/tomcat/fingerprints/Tomcat33.py | M31MOTH/clusterd | d190b2cbaa93820e928a7ce5471c661d4559fb7c | [
"MIT"
] | 192 | 2015-01-26T20:44:14.000Z | 2021-12-22T01:39:50.000Z | from src.platform.tomcat.interfaces import AppInterface
class FPrint(AppInterface):
def __init__(self):
super(FPrint, self).__init__()
self.version = "3.3"
self.uri = "/doc/readme"
| 21.2 | 55 | 0.65566 | 25 | 212 | 5.24 | 0.72 | 0.122137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012121 | 0.221698 | 212 | 9 | 56 | 23.555556 | 0.781818 | 0 | 0 | 0 | 0 | 0 | 0.066038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd839bedadd3f904f4462a19bbb54565bde22dd2 | 539 | py | Python | frontend/maBlog/models.py | rishirajpurohit/Angular2withDjango-python | 3b293ae62fd671bd352c045f4876d735e36fcd4e | [
"MIT"
] | null | null | null | frontend/maBlog/models.py | rishirajpurohit/Angular2withDjango-python | 3b293ae62fd671bd352c045f4876d735e36fcd4e | [
"MIT"
] | null | null | null | frontend/maBlog/models.py | rishirajpurohit/Angular2withDjango-python | 3b293ae62fd671bd352c045f4876d735e36fcd4e | [
"MIT"
] | null | null | null | from django.db import models
# Cnreate your models here.
class posts(models.Model):
post = models.CharField(max_length=200)
summary = models.CharField(max_length=50)
author = models.CharField(max_length=50)
date = models.DateTimeField('date published')
title = models.CharField(max_length=50)
def __str__(self):
return self.summary
def __toDict(self):
dict = {}
dict['author'] = self.author
dict['summary'] = self.summary
dict['date'] = self.date
dict['title'] = self.title
dict['id'] = self.id
return dict
| 20.730769 | 46 | 0.706865 | 74 | 539 | 5.013514 | 0.405405 | 0.161725 | 0.19407 | 0.25876 | 0.210243 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019824 | 0.157699 | 539 | 25 | 47 | 21.56 | 0.797357 | 0.046382 | 0 | 0 | 0 | 0 | 0.074803 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.058824 | 0.058824 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd897381a3f20b3dfbaa0e11129a61ca25945c2c | 86 | py | Python | snakeflake/__init__.py | Roadcrosser/snakeflake | 699f48f8a415721dde5875b95401989384048f02 | [
"Apache-2.0"
] | 4 | 2020-03-04T22:08:25.000Z | 2021-06-09T09:47:00.000Z | snakeflake/__init__.py | Roadcrosser/snakeflake | 699f48f8a415721dde5875b95401989384048f02 | [
"Apache-2.0"
] | null | null | null | snakeflake/__init__.py | Roadcrosser/snakeflake | 699f48f8a415721dde5875b95401989384048f02 | [
"Apache-2.0"
] | null | null | null | """Snakeflake init file"""
__all__ = ["snakeflake", "config", "utils", "exceptions"]
| 21.5 | 57 | 0.651163 | 8 | 86 | 6.5 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 86 | 3 | 58 | 28.666667 | 0.684211 | 0.232558 | 0 | 0 | 0 | 0 | 0.516667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd8a85b273c8b372a78df9c0b81e423d6134955d | 212 | py | Python | practice/lc136.py | itsmeolivia/code_eval | b00c7c968343b1be2efeb138b58471833d24022d | [
"MIT"
] | null | null | null | practice/lc136.py | itsmeolivia/code_eval | b00c7c968343b1be2efeb138b58471833d24022d | [
"MIT"
] | null | null | null | practice/lc136.py | itsmeolivia/code_eval | b00c7c968343b1be2efeb138b58471833d24022d | [
"MIT"
] | null | null | null | class Solution:
# @param {integer[]} nums
# @return {integer}
def singleNumber(self, nums):
a = set(nums)
a = sum(a)*2
singleNumber = a - sum(nums)
return singleNumber
| 23.555556 | 36 | 0.551887 | 24 | 212 | 4.875 | 0.541667 | 0.17094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006993 | 0.325472 | 212 | 8 | 37 | 26.5 | 0.811189 | 0.193396 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd949b779c5e8d921abc137e847e3d4a29f97f72 | 1,485 | py | Python | tests/conftest.py | Kupoman/shadertest | 79b959e0ff00ac8c30918e83e0751f25bcc447ae | [
"MIT"
] | 7 | 2018-11-10T20:49:56.000Z | 2021-08-31T04:34:56.000Z | tests/conftest.py | Kupoman/shadertest | 79b959e0ff00ac8c30918e83e0751f25bcc447ae | [
"MIT"
] | null | null | null | tests/conftest.py | Kupoman/shadertest | 79b959e0ff00ac8c30918e83e0751f25bcc447ae | [
"MIT"
] | 2 | 2018-12-10T03:01:05.000Z | 2018-12-10T12:51:11.000Z | import pytest
from shadertest.shader_parser import (
Argument,
Function,
)
@pytest.fixture
def no_arg_function():
return Function(
'function',
'float',
[],
'float function () { return 1.0; }'
)
@pytest.fixture
def one_arg_function():
return Function(
'function',
'float',
[
Argument('float', 'a'),
],
'float function (float a) { return a * 2.0; }'
)
@pytest.fixture
def two_arg_function():
return Function(
'function',
'float',
[
Argument('float', 'a'),
Argument('float', 'b'),
],
'float function (float a, float b) { return a + b; }'
)
@pytest.fixture
def int_arg_function():
return Function(
'function',
'float',
[
Argument('int', 'a'),
],
'float function (int a) { return a / 2; }'
)
@pytest.fixture
def bool_arg_function():
return Function(
'function',
'float',
[
Argument('bool', 'a'),
],
'float function (bool a) { return (a) ? 1.0 : 0.0; }'
)
@pytest.fixture
def bool_return_function():
return Function(
'function',
'bool',
[],
'bool function () { return true; }'
)
@pytest.fixture
def int_return_function():
return Function(
'function',
'int',
[],
'int function () { return 1; }'
)
| 17.267442 | 61 | 0.485522 | 141 | 1,485 | 5.007092 | 0.170213 | 0.1983 | 0.15864 | 0.29745 | 0.433428 | 0.331445 | 0.27762 | 0.147309 | 0.147309 | 0 | 0 | 0.01065 | 0.367677 | 1,485 | 85 | 62 | 17.470588 | 0.741214 | 0 | 0 | 0.5 | 0 | 0.014286 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | true | 0 | 0.028571 | 0.1 | 0.228571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd95a279e4806da94547a2c3f13a776301e90249 | 795 | py | Python | rltime/acting/actor_wrapper.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 147 | 2019-09-05T10:41:15.000Z | 2021-12-28T23:41:54.000Z | rltime/acting/actor_wrapper.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 1 | 2020-10-18T14:55:53.000Z | 2021-11-18T10:41:36.000Z | rltime/acting/actor_wrapper.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 11 | 2019-09-08T09:18:28.000Z | 2020-11-30T12:41:37.000Z | from .acting_interface import ActingInterface
class ActorWrapper(ActingInterface):
"""Wrapper for a created actor
Allows overriding only specific actor methods while passing through the
rest, similar to gym wrappers
"""
def __init__(self, actor):
super().__init__(*actor.get_spaces())
self._actor = actor
def get_samples(self, min_samples):
return self._actor.get_samples(min_samples)
def get_env_count(self):
return self._actor.get_env_count()
def set_actor_policy(self, actor_policy):
return self._actor.set_actor_policy(actor_policy)
def update_state(self, progress, policy_state=None):
return self._actor.update_state(progress, policy_state)
def close(self):
return self._actor.close()
| 27.413793 | 75 | 0.709434 | 102 | 795 | 5.196078 | 0.431373 | 0.135849 | 0.141509 | 0.067925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206289 | 795 | 28 | 76 | 28.392857 | 0.839937 | 0.163522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.066667 | 0.333333 | 0.866667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
dd9d65c0b2d12ecb340ef6dba5a5cc0f64640c9e | 2,510 | py | Python | bug_tracker_v2/tracker/filters.py | ViMonks/bug_tracker_django_fullstack | 773fddcb9e99c8e0a8418ae92f2a2f23c22749e1 | [
"MIT"
] | null | null | null | bug_tracker_v2/tracker/filters.py | ViMonks/bug_tracker_django_fullstack | 773fddcb9e99c8e0a8418ae92f2a2f23c22749e1 | [
"MIT"
] | null | null | null | bug_tracker_v2/tracker/filters.py | ViMonks/bug_tracker_django_fullstack | 773fddcb9e99c8e0a8418ae92f2a2f23c22749e1 | [
"MIT"
] | null | null | null | from django.contrib.auth import get_user_model
import django_filters
from .models import Ticket, Project, Team
from django.forms import DateInput
User = get_user_model()
class TicketFilter(django_filters.FilterSet):
# STATUS_CHOICES = (
# ('open', 'Open'),
# ('assigned', 'Assigned'),
# ('in_progress', 'In progress'),
# )
developer = django_filters.ModelChoiceFilter(queryset=lambda request: Team.objects.get(slug=request.resolver_match.kwargs['team_slug']).members.all(), null_label = 'Unassigned')
title = django_filters.CharFilter(lookup_expr='icontains')
user = django_filters.ModelChoiceFilter(queryset=User.objects.all())
created_start_date = django_filters.DateFilter(field_name='created_on', lookup_expr='date__gte', widget=DateInput(attrs={'type': 'date'}))
created_end_date = django_filters.DateFilter(field_name='created_on', lookup_expr='date__lte', widget=DateInput(attrs={'type': 'date'}))
updated_start_date = django_filters.DateFilter(field_name='last_updated_on', lookup_expr='date__gte', widget=DateInput(attrs={'type': 'date'}))
updated_end_date = django_filters.DateFilter(field_name='last_updated_on', lookup_expr='date__lte', widget=DateInput(attrs={'type': 'date'}))
# status = django_filters.ChoiceFilter(choices=STATUS_CHOICES)
project = django_filters.ModelChoiceFilter(queryset=lambda request: Project.objects.filter_for_team_and_user(team_slug=request.resolver_match.kwargs['team_slug'], user=request.user).filter(is_archived=False))
class Meta:
model = Ticket
exclude = ('description', 'team', 'status' )
class TicketFilterArchivedProjects(TicketFilter):
project = django_filters.ModelChoiceFilter(queryset=lambda request: Project.objects.filter_for_team_and_user(team_slug=request.resolver_match.kwargs['team_slug'], user=request.user).filter(is_archived=True))
class ProjectFilter(django_filters.FilterSet):
manager = django_filters.ModelChoiceFilter(queryset=lambda request: Team.objects.get(slug=request.resolver_match.kwargs['team_slug']).get_managers())
title = django_filters.CharFilter(lookup_expr='icontains')
start_date = django_filters.DateFilter(field_name='created_on', lookup_expr='date__gte', widget=DateInput(attrs={'type': 'date'}))
end_date = django_filters.DateFilter(field_name='created_on', lookup_expr='date__lte', widget=DateInput(attrs={'type': 'date'}))
class Meta:
model = Project
exclude = ('description', 'team', )
| 58.372093 | 212 | 0.757371 | 306 | 2,510 | 5.918301 | 0.24183 | 0.122032 | 0.056322 | 0.089453 | 0.671452 | 0.671452 | 0.663722 | 0.607399 | 0.607399 | 0.607399 | 0 | 0 | 0.108765 | 2,510 | 42 | 213 | 59.761905 | 0.809566 | 0.067331 | 0 | 0.148148 | 0 | 0 | 0.116538 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.148148 | 0 | 0.814815 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
06b3fff75eadbc2a1ad78f51282e05b97e4cfc5f | 1,764 | py | Python | CAP3-Functions/thinkpythonEX3.py | falble/mythinkpython2 | 25de15c8657f32a8f85189d9cb0588c816e9e7d3 | [
"Apache-2.0"
] | 1 | 2020-11-20T16:28:32.000Z | 2020-11-20T16:28:32.000Z | CAP3-Functions/thinkpythonEX3.py | falble/mythinkpython2 | 25de15c8657f32a8f85189d9cb0588c816e9e7d3 | [
"Apache-2.0"
] | null | null | null | CAP3-Functions/thinkpythonEX3.py | falble/mythinkpython2 | 25de15c8657f32a8f85189d9cb0588c816e9e7d3 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Sat Feb 9 19:48:15 2019
@author: Utente
"""
#Exercises Chapter 3
#3.2 do_four
def do_twice(func, arg):
func(arg)
func(arg)
def print_twice(arg):
print(arg)
print(arg)
def do_four(func, arg):
do_twice(func, arg)
do_twice(func, arg)
do_twice(print_twice, 'cazzo vuoi')
print('')
do_four(print_twice, 'cazzo vuoi')
#3.3 grid
def re_do_twice(f):
f()
f()
def re_do_four(f):
re_do_twice(f)
re_do_twice(f)
def print_beam():
print('+ - - - -', end=' ')
def print_post():
print('| ', end=' ')
def print_beams():
re_do_twice(print_beam)
print('+')
def print_posts():
re_do_twice(print_post)
print('|')
def print_row():
print_beams()
re_do_four(print_posts)
def print_grid():
re_do_twice(print_row)
print_beams()
print_grid()
#3.3 grid part two
def one_four_one(f, g, h):
f()
re_do_four(g)
h()
def print_A():
print('+', end=' ')
def print_Z():
print('-', end=' ')
def print_C():
print('|', end=' ')
def print_O():
print('O', end=' ')
def print_end():
print(' ')
def nothing():
"do nothing"
def print1beam():
one_four_one(nothing, print_Z, print_A)
def print1post():
one_four_one(nothing, print_O, print_C)
def print4beams():
one_four_one(print_A, print1beam, print_end)
def print4posts():
one_four_one(print_C, print1post, print_end)
def print_row():
one_four_one(nothing, print4posts, print4beams)
def print_grid():
one_four_one(print4beams, print_row, nothing)
print_grid()
| 15.076923 | 52 | 0.561791 | 245 | 1,764 | 3.75102 | 0.204082 | 0.121872 | 0.083787 | 0.104461 | 0.116431 | 0.045702 | 0.045702 | 0.045702 | 0 | 0 | 0 | 0.022436 | 0.292517 | 1,764 | 116 | 53 | 15.206897 | 0.713942 | 0.080499 | 0 | 0.359375 | 0 | 0 | 0.040559 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0 | 0.375 | 0.71875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
06ecb83781bb61c11e01be77718a1aeb28fb280a | 8,264 | py | Python | long_tests/test_constrained_conjugate_gradients.py | ComputationalMechanics/ContactMechanics | c734edeae83f5b3369fd9a2bd1c4fc7d00e382f3 | [
"MIT"
] | 7 | 2020-06-03T20:01:42.000Z | 2021-03-03T09:24:37.000Z | long_tests/test_constrained_conjugate_gradients.py | ComputationalMechanics/ContactMechanics | c734edeae83f5b3369fd9a2bd1c4fc7d00e382f3 | [
"MIT"
] | 19 | 2020-06-02T13:02:18.000Z | 2021-03-19T09:48:12.000Z | long_tests/test_constrained_conjugate_gradients.py | ComputationalMechanics/ContactMechanics | c734edeae83f5b3369fd9a2bd1c4fc7d00e382f3 | [
"MIT"
] | 1 | 2021-09-10T22:20:41.000Z | 2021-09-10T22:20:41.000Z | from SurfaceTopography import make_sphere
import ContactMechanics as Solid
import numpy as np
import scipy.optimize as optim
from NuMPI.Optimization import ccg_without_restart, ccg_with_restart
import pytest
from NuMPI import MPI
pytestmark = pytest.mark.skipif(MPI.COMM_WORLD.Get_size() > 1,
reason="tests only serial funcionalities, "
"please execute with pytest")
def test_primal_obj():
nx = ny = 1024
sx, sy = 1., 1.
R = 10.
gtol = 1e-6
surface = make_sphere(R, (nx, ny), (sx, sy), kind="paraboloid")
Es = 50.
substrate = Solid.PeriodicFFTElasticHalfSpace((nx, ny), young=Es,
physical_sizes=(sx, sy))
system = Solid.Systems.NonSmoothContactSystem(substrate, surface)
offset = 0.005
lbounds = np.zeros((nx, ny))
bnds = system._reshape_bounds(lbounds, )
disp = np.zeros((nx, ny))
init_gap = disp - surface.heights() - offset
# ####################POLONSKY-KEER##############################
res = ccg_with_restart.constrained_conjugate_gradients(
system.primal_objective(offset, gradient=True),
system.primal_hessian_product, x0=init_gap, gtol=gtol)
assert res.success
polonsky_gap = res.x.reshape((nx, ny))
# ####################BUGNICOURT###################################
res = ccg_without_restart.constrained_conjugate_gradients(system.primal_objective
(offset,
gradient=True),
system.primal_hessian_product,
x0=init_gap,
mean_val=None,
gtol=gtol)
assert res.success
bugnicourt_gap = res.x.reshape((nx, ny))
# #####################LBFGSB#####################################
res = optim.minimize(system.primal_objective(offset, gradient=True),
init_gap,
method='L-BFGS-B', jac=True,
bounds=bnds,
options=dict(gtol=gtol, ftol=1e-20))
assert res.success
lbfgsb_gap = res.x.reshape((nx, ny))
np.testing.assert_allclose(polonsky_gap, bugnicourt_gap, atol=1e-3)
np.testing.assert_allclose(polonsky_gap, lbfgsb_gap, atol=1e-3)
np.testing.assert_allclose(lbfgsb_gap, bugnicourt_gap, atol=1e-3)
# ##########TEST MEAN VALUES#######################################
mean_val = np.mean(lbfgsb_gap)
# ####################POLONSKY-KEER##############################
res = ccg_with_restart.constrained_conjugate_gradients(
system.primal_objective(offset, gradient=True),
system.primal_hessian_product, init_gap, gtol=gtol,
mean_value=mean_val)
assert res.success
polonsky_gap_mean_cons = res.x.reshape((nx, ny))
# ####################BUGNICOURT###################################
ccg_without_restart.constrained_conjugate_gradients(system.primal_objective
(offset, gradient=True),
system.
primal_hessian_product,
x0=init_gap,
mean_val=mean_val,
gtol=gtol
)
assert res.success
bugnicourt_gap_mean_cons = res.x.reshape((nx, ny))
np.testing.assert_allclose(polonsky_gap_mean_cons, lbfgsb_gap, atol=1e-3)
np.testing.assert_allclose(bugnicourt_gap_mean_cons, lbfgsb_gap, atol=1e-3)
np.testing.assert_allclose(lbfgsb_gap, bugnicourt_gap, atol=1e-3)
np.testing.assert_allclose(lbfgsb_gap, bugnicourt_gap_mean_cons, atol=1e-3)
def test_dual_obj():
nx = ny = 1024
sx, sy = 1., 1.
R = 10.
gtol = 1e-7
surface = make_sphere(R, (nx, ny), (sx, sy), kind="paraboloid")
Es = 50.
substrate = Solid.PeriodicFFTElasticHalfSpace((nx, ny), young=Es,
physical_sizes=(sx, sy))
substrate_2 = Solid.PeriodicFFTElasticHalfSpace((nx, ny), young=Es,
physical_sizes=(sx, sy),
stiffness_q0=0.0)
system = Solid.Systems.NonSmoothContactSystem(substrate, surface)
system_2 = Solid.Systems.NonSmoothContactSystem(substrate_2, surface, )
offset = 0.005
lbounds = np.zeros((nx, ny))
bnds = system._reshape_bounds(lbounds, )
init_gap = np.zeros((nx, ny))
disp = init_gap + surface.heights() + offset
init_pressure = substrate.evaluate_force(disp)
# ####################LBFGSB########################################
res = optim.minimize(system.dual_objective(offset, gradient=True),
init_pressure,
method='L-BFGS-B', jac=True,
bounds=bnds,
options=dict(gtol=gtol, ftol=1e-20))
print(res.message, res.nfev)
assert res.success
lbfgsb_force = res.x.reshape((nx, ny))
CA_lbfgsb = res.x.reshape((nx, ny)) > 0 # Contact area
fun = system.dual_objective(offset, gradient=True)
gap_lbfgsb = fun(res.x)[1]
gap_lbfgsb = gap_lbfgsb.reshape((nx, ny))
# ###################BUGNICOURT########################################
ccg_without_restart.constrained_conjugate_gradients(
system.dual_objective(offset, gradient=True),
system.
dual_hessian_product,
init_pressure,
mean_val=None, gtol=gtol)
assert res.success
bugnicourt_force = res.x.reshape((nx, ny))
CA_bugnicourt = res.x.reshape((nx, ny)) > 0 # Contact area
gap_bugnicourt = fun(res.x)[1]
gap_bugnicourt = gap_bugnicourt.reshape((nx, ny))
# # ##################POLONSKY-KEER#####################################
res = ccg_with_restart.constrained_conjugate_gradients(
system.dual_objective(offset, gradient=True),
system.dual_hessian_product, init_pressure, gtol=gtol)
assert res.success
polonsky_force = res.x
CA_polonsky = res.x.reshape((nx, ny)) > 0 # Contact area
gap_polonsky = fun(res.x)[1]
gap_polonsky = gap_polonsky.reshape((nx, ny))
np.testing.assert_allclose(gap_lbfgsb, gap_polonsky, atol=1e-3)
np.testing.assert_allclose(gap_lbfgsb, gap_bugnicourt, atol=1e-3)
np.testing.assert_allclose(gap_bugnicourt, gap_polonsky, atol=1e-3)
np.testing.assert_allclose(lbfgsb_force, bugnicourt_force, atol=1e-3)
np.testing.assert_allclose(lbfgsb_force,
polonsky_force.reshape(lbfgsb_force.shape),
atol=1e-3)
np.testing.assert_allclose(polonsky_force.reshape(lbfgsb_force.shape),
bugnicourt_force, atol=1e-3)
# ##########TEST MEAN VALUES#######################################
mean_val = np.mean(lbfgsb_force)
# print('mean {}'.format(mean_val))
# ####################POLONSKY-KEER##############################
res = ccg_with_restart.constrained_conjugate_gradients(
system.dual_objective(offset, gradient=True),
system.dual_hessian_product, init_pressure, gtol=gtol,
mean_value=mean_val)
assert res.success
polonsky_force_mean_cons = res.x.reshape((nx, ny))
# # ####################BUGNICOURT###################################
ccg_without_restart.constrained_conjugate_gradients(
system.dual_objective(offset, gradient=True),
system.
dual_hessian_product,
init_pressure,
mean_val=mean_val,
gtol=gtol)
assert res.success
bugnicourt_force_mean_cons = res.x.reshape((nx, ny))
np.testing.assert_allclose(polonsky_force_mean_cons, lbfgsb_force,
atol=1e-3)
np.testing.assert_allclose(bugnicourt_force_mean_cons, lbfgsb_force,
atol=1e-3)
| 41.32 | 86 | 0.54562 | 870 | 8,264 | 4.952874 | 0.148276 | 0.024136 | 0.038292 | 0.080065 | 0.811557 | 0.784869 | 0.695057 | 0.672082 | 0.617545 | 0.554885 | 0 | 0.01432 | 0.290174 | 8,264 | 199 | 87 | 41.527638 | 0.720252 | 0.026864 | 0 | 0.473333 | 0 | 0 | 0.013012 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.013333 | false | 0 | 0.046667 | 0 | 0.06 | 0.006667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6605a2b78adeca45a2d0793e23aa804ff3b5c505 | 3,535 | py | Python | temboo/core/Library/Microsoft/Translator/GetToken.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Microsoft/Translator/GetToken.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | null | null | null | temboo/core/Library/Microsoft/Translator/GetToken.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z | # -*- coding: utf-8 -*-
###############################################################################
#
# GetToken
# Retrieves an access token that can be used to authenticate with the Microsoft Translator API.
#
# Python versions 2.6, 2.7, 3.x
#
# Copyright 2014, Temboo Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
#
#
###############################################################################
from temboo.core.choreography import Choreography
from temboo.core.choreography import InputSet
from temboo.core.choreography import ResultSet
from temboo.core.choreography import ChoreographyExecution
import json
class GetToken(Choreography):
def __init__(self, temboo_session):
"""
Create a new instance of the GetToken Choreo. A TembooSession object, containing a valid
set of Temboo credentials, must be supplied.
"""
super(GetToken, self).__init__(temboo_session, '/Library/Microsoft/Translator/GetToken')
def new_input_set(self):
return GetTokenInputSet()
def _make_result_set(self, result, path):
return GetTokenResultSet(result, path)
def _make_execution(self, session, exec_id, path):
return GetTokenChoreographyExecution(session, exec_id, path)
class GetTokenInputSet(InputSet):
"""
An InputSet with methods appropriate for specifying the inputs to the GetToken
Choreo. The InputSet object is used to specify input parameters when executing this Choreo.
"""
def set_ClientID(self, value):
"""
Set the value of the ClientID input for this Choreo. ((required, string) The Client ID obtained when signing up for Microsoft Translator on Azure Marketplace.)
"""
super(GetTokenInputSet, self)._set_input('ClientID', value)
def set_ClientSecret(self, value):
"""
Set the value of the ClientSecret input for this Choreo. ((required, string) The Client Secret obtained when signing up for Microsoft Translator on Azure Marketplace.)
"""
super(GetTokenInputSet, self)._set_input('ClientSecret', value)
class GetTokenResultSet(ResultSet):
"""
A ResultSet with methods tailored to the values returned by the GetToken Choreo.
The ResultSet object is used to retrieve the results of a Choreo execution.
"""
def getJSONFromString(self, str):
return json.loads(str)
def get_AccessToken(self):
"""
Retrieve the value for the "AccessToken" output from this Choreo execution. ((string) The access token returned from Microsoft.)
"""
return self._output.get('AccessToken', None)
def get_ExpiresIn(self):
"""
Retrieve the value for the "ExpiresIn" output from this Choreo execution. ((integer) The number of seconds for which the access token is valid.)
"""
return self._output.get('ExpiresIn', None)
class GetTokenChoreographyExecution(ChoreographyExecution):
def _make_result_set(self, response, path):
return GetTokenResultSet(response, path)
| 38.010753 | 175 | 0.685714 | 429 | 3,535 | 5.575758 | 0.370629 | 0.025084 | 0.023411 | 0.043478 | 0.25 | 0.155518 | 0.133779 | 0.112876 | 0.078595 | 0.078595 | 0 | 0.004921 | 0.195191 | 3,535 | 92 | 176 | 38.423913 | 0.835852 | 0.499293 | 0 | 0 | 0 | 0 | 0.054584 | 0.026592 | 0 | 0 | 0 | 0 | 0 | 1 | 0.344828 | false | 0 | 0.172414 | 0.172414 | 0.896552 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
6629618be28e08fac7d951d903379a55b389d3b4 | 3,242 | py | Python | NoteBooks/Curso de Python/Python/Paradigmas/Object Oriented Programming/Conceptos_clave.py | Alejandro-sin/Learning_Notebooks | 161d6bed4c7b1d171b45f61c0cc6fa91e9894aad | [
"MIT"
] | 1 | 2021-02-26T13:12:22.000Z | 2021-02-26T13:12:22.000Z | NoteBooks/Curso de Python/Python/Paradigmas/Object Oriented Programming/Conceptos_clave.py | Alejandro-sin/Learning_Notebooks | 161d6bed4c7b1d171b45f61c0cc6fa91e9894aad | [
"MIT"
] | null | null | null | NoteBooks/Curso de Python/Python/Paradigmas/Object Oriented Programming/Conceptos_clave.py | Alejandro-sin/Learning_Notebooks | 161d6bed4c7b1d171b45f61c0cc6fa91e9894aad | [
"MIT"
] | null | null | null | '''
Tipo: Concepto, Ejercicio, Pregunta...
Fuente: libro, curso, ...
Este chunk tiene como propósito revisar cocneptos fundamentales.
Variable de isntancia
Variable de clase
Una diferencia importante entre clase e instancia e sla forma de usarse, están fuera del constructor __init__()
Método de Instancias
Método de Clase
Método de Estpatico:
Se prefeire e a veces métodos estáticos, la funcioanldiad está estrechamente relacioanda con la clase, se cumple principio de abstracción.
Método especiales:
__init__():
Los dunder methods.
'''
class Persona():
# Variable de clase
edad = 18
# Sirve para personalzia la creación de las instancias de las clases.
# Podemos trabajr los inmutables
# New no necesita self, sino uan referencia a la clase que lo invoca, es un método estático que devuele un objeto.
# En teoria va primero, se ejecuta primero. El es el que realmente cra la clase y lo mandar a incialziar
def __new__(cls, nombre, nacionalidad):
print("New class")
# __new__ debe regresar un objeto. Se usa super para regresar una instancia
# Si no se retorna nada, la instancia no se inicialzia.
return super().__new__(cls)
# Constructor, inicializa
def __init__(self, nombre, nacionalidad):
print("Ininitialization of the instance")
self.nombre = nombre
self.nacionalidad = nacionalidad
# Método de instancia requiere de la variable self.
def nadar(self):
print("Nada de nada")
# Método de clase no posee la variable self. Pero si la variable "cls"
# Un método de clase solo funciona con una directiva espcial llamada classsmethod.
# Existen diferenters decoradores. Este nos permite no crear un objeto usarse sin instanciar
@classmethod
def saludar(cls, nombre):
print("Hola "+nombre)
# Método estático
# No requiere de ningún argumento y se usa el decorador staticmethod
@staticmethod
def brincar():
print("Brincar<!!!")
# El área de un círculo puede ser una propiedad no necesariamente un método
# Es decir peudo expresarlo como un método de instancia, clase, estático, variable.
# Lo correcto será como una propiedad.
class Circulo():
def __init__(self, radio):
self.radio = radio
# Para declarar propeidades usamos el decorador @property
# Nos va a permitir acceder a una propiedad de un objeto creado en python. objeto.propiedad
@property
def area(self):
return 3.1416*(self.radio**2)
# ■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
# La variable de isntancia de instancia son las precedidas del nombre de self.variable_instancia
stuff = Persona("José", "México")
print(stuff.edad) #18. No necesita crear una nstancia para poder acceder a este.
# print(Persona.nombre) #Arroha error, debido a que no está inicialziado el objeto.
#Método de clase.
Persona.saludar("Pepe")
# Método Estático
stuff.brincar()
# ■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
circle = Circulo(10) # 314.15999999999997
# se invoca como una propeidad del objeto instanciado.
print(circle.area) | 25.131783 | 138 | 0.670882 | 414 | 3,242 | 5.637681 | 0.442029 | 0.027421 | 0.022279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011315 | 0.209439 | 3,242 | 129 | 139 | 25.131783 | 0.825985 | 0.701727 | 0 | 0 | 0 | 0 | 0.089247 | 0 | 0 | 0 | 0 | 0.046512 | 0 | 1 | 0.241379 | false | 0 | 0 | 0.034483 | 0.413793 | 0.241379 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6630d9ac76ef4eb179b89352d9a06e753d7e877d | 545 | py | Python | tiles_generator.py | torquecoder/ChessVista | 722afe39ae4d3cb38d83904a333b541eaba3a6ea | [
"MIT"
] | null | null | null | tiles_generator.py | torquecoder/ChessVista | 722afe39ae4d3cb38d83904a333b541eaba3a6ea | [
"MIT"
] | null | null | null | tiles_generator.py | torquecoder/ChessVista | 722afe39ae4d3cb38d83904a333b541eaba3a6ea | [
"MIT"
] | null | null | null | import chess_board_recognizer
import os
import dataset_organizer
# Generating tiles for training
for image_name in os.listdir("training_chessboards"):
chess_board_recognizer.generateTileset(image_name, "training_chessboards", "training_tiles")
dataset_organizer.organize("train_data", "training_tiles")
# Generating tiles for testing
for image_name in os.listdir("testing_chessboards"):
chess_board_recognizer.generateTileset(image_name, "testing_chessboards", "testing_tiles")
dataset_organizer.organize("test_data", "testing_tiles")
| 38.928571 | 96 | 0.831193 | 67 | 545 | 6.41791 | 0.328358 | 0.083721 | 0.139535 | 0.065116 | 0.362791 | 0.362791 | 0.255814 | 0 | 0 | 0 | 0 | 0 | 0.080734 | 545 | 13 | 97 | 41.923077 | 0.858283 | 0.106422 | 0 | 0 | 1 | 0 | 0.311983 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
6632a9eff62780e3018c5b29a731bbc480fcf73d | 5,518 | py | Python | ooobuild/lo/awt/grid/x_grid_column.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/awt/grid/x_grid_column.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/awt/grid/x_grid_column.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
#
# Copyright 2022 :Barry-Thomas-Paul: Moss
#
# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http: // www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Interface Class
# this is a auto generated file generated by Cheetah
# Libre Office Version: 7.3
# Namespace: com.sun.star.awt.grid
import typing
from abc import abstractmethod, abstractproperty
from ...lang.x_component import XComponent as XComponent_98dc0ab5
from ...util.x_cloneable import XCloneable as XCloneable_99d00aa3
if typing.TYPE_CHECKING:
from .x_grid_column_listener import XGridColumnListener as XGridColumnListener_44350fba
from ...style.horizontal_alignment import HorizontalAlignment as HorizontalAlignment_1f800f02
class XGridColumn(XComponent_98dc0ab5, XCloneable_99d00aa3):
"""
The XGridColumn defines the properties and behavior of a column in a grid control.
**since**
OOo 3.3
See Also:
`API XGridColumn <https://api.libreoffice.org/docs/idl/ref/interfacecom_1_1sun_1_1star_1_1awt_1_1grid_1_1XGridColumn.html>`_
"""
__ooo_ns__: str = 'com.sun.star.awt.grid'
__ooo_full_ns__: str = 'com.sun.star.awt.grid.XGridColumn'
__ooo_type_name__: str = 'interface'
__pyunointerface__: str = 'com.sun.star.awt.grid.XGridColumn'
@abstractmethod
def addGridColumnListener(self, Listener: 'XGridColumnListener_44350fba') -> None:
"""
Adds a listener for the GridColumnEvent posted after the grid changes.
"""
@abstractmethod
def removeGridColumnListener(self, Listener: 'XGridColumnListener_44350fba') -> None:
"""
Removes a listener previously added with addColumnListener().
"""
@abstractproperty
def ColumnWidth(self) -> int:
"""
specifies the current width of the column.
"""
@abstractproperty
def DataColumnIndex(self) -> int:
"""
denotes the index of the data column which should be used to fetch this grid column's data
A grid control has a column model and a data model, both containing a possibly different number of columns. The DataColumnIndex attribute defines the index of the column within the data model, which should be used to retrieve actual data.
Using this, you can do runtime changes to the column model, i.e. insertion and removal of columns, without necessarily needing to adjust the data model, too.
If DataColumnIndex is negative, the it will be ignored, then the column's index within its column model, as determined by the Index attribute, will be used.
"""
@abstractproperty
def Flexibility(self) -> int:
"""
specifies the flexibility of the column when it is automatically resized due to the grid control as a whole being resized.
Specify 0 here if you do not want the column to be resized automatically.
If a column has a flexibility greater than 0, it is set in relationship to the flexibility of all other such columns, and the respective widths of the columns are changed in the same relationship.
Note that a column's flexibility is ignored if its Resizeable attribute is FALSE.
A column's flexibility cannot be negative, attempts to set a negative value will raise an exception.
"""
@abstractproperty
def HelpText(self) -> str:
"""
is the help text associated with the column.
A grid control will usually display a column's help text as tooltip.
"""
@abstractproperty
def HorizontalAlign(self) -> 'HorizontalAlignment_1f800f02':
"""
Specifies the horizontal alignment of the content in the control.
"""
@abstractproperty
def Identifier(self) -> object:
"""
specifies an identifier of the column
This identifier will not be evaluated by the grid control, or its model. It is merely for clients to identify particular columns.
"""
@abstractproperty
def Index(self) -> int:
"""
denotes the index of the column within the grid column model it belongs to
If the column is not yet part of a column model, Index is -1.
"""
@abstractproperty
def MaxWidth(self) -> int:
"""
specifies the maximal width the column can have.
"""
@abstractproperty
def MinWidth(self) -> int:
"""
specifies the minimal width the column can have.
"""
@abstractproperty
def Resizeable(self) -> bool:
"""
controls whether or not the column's width is fixed or not.
If this is TRUE, the user can interactively change the column's width. Also, the column is subject to auto-resizing, if its Flexibility attribute is greater 0.
"""
@abstractproperty
def Title(self) -> str:
"""
A title is displayed in the column header row if UnoControlGridModel.ShowColumnHeader() is set to TRUE
"""
__all__ = ['XGridColumn']
| 37.794521 | 246 | 0.685031 | 714 | 5,518 | 5.221289 | 0.37535 | 0.038627 | 0.014753 | 0.013949 | 0.108369 | 0.070547 | 0.070547 | 0 | 0 | 0 | 0 | 0.017387 | 0.249547 | 5,518 | 145 | 247 | 38.055172 | 0.882879 | 0.5917 | 0 | 0.333333 | 0 | 0 | 0.111176 | 0.099534 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.153846 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6633b229ea50c21dbdd245f31d534f2fe0e53414 | 5,834 | py | Python | tests/test_filter.py | duruyi/pycasbin | a16bfaa669c37ac1598684e36b0319430ab749e5 | [
"Apache-2.0"
] | null | null | null | tests/test_filter.py | duruyi/pycasbin | a16bfaa669c37ac1598684e36b0319430ab749e5 | [
"Apache-2.0"
] | null | null | null | tests/test_filter.py | duruyi/pycasbin | a16bfaa669c37ac1598684e36b0319430ab749e5 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 The casbin Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import casbin
from unittest import TestCase
from tests.test_enforcer import get_examples
class Filter:
# P,G are strings
P = []
G = []
class TestFilteredAdapter(TestCase):
def test_init_filtered_adapter(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("rbac_with_domains_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
self.assertFalse(e.has_policy(["admin", "domain1", "data1", "read"]))
def test_load_filtered_policy(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("rbac_with_domains_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
filter = Filter()
filter.P = ["", "domain1"]
filter.G = ["", "", "domain1"]
try:
e.load_policy()
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
self.assertTrue(e.has_policy(["admin", "domain1", "data1", "read"]))
self.assertTrue(e.has_policy(["admin", "domain2", "data2", "read"]))
try:
e.load_filtered_policy(filter)
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
if not e.is_filtered:
raise RuntimeError("adapter did not set the filtered flag correctly")
self.assertTrue(e.has_policy(["admin", "domain1", "data1", "read"]))
self.assertFalse(e.has_policy(["admin", "domain2", "data2", "read"]))
with self.assertRaises(RuntimeError):
e.save_policy()
with self.assertRaises(RuntimeError):
e.get_adapter().save_policy(e.get_model())
def test_append_filtered_policy(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("rbac_with_domains_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
filter = Filter()
filter.P = ["", "domain1"]
filter.G = ["", "", "domain1"]
try:
e.load_policy()
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
self.assertTrue(e.has_policy(["admin", "domain1", "data1", "read"]))
self.assertTrue(e.has_policy(["admin", "domain2", "data2", "read"]))
try:
e.load_filtered_policy(filter)
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
if not e.is_filtered:
raise RuntimeError("adapter did not set the filtered flag correctly")
self.assertTrue(e.has_policy(["admin", "domain1", "data1", "read"]))
self.assertFalse(e.has_policy(["admin", "domain2", "data2", "read"]))
filter.P = ["", "domain2"]
filter.G = ["", "", "domain2"]
try:
e.load_increment_filtered_policy(filter)
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
self.assertTrue(e.has_policy(["admin", "domain1", "data1", "read"]))
self.assertTrue(e.has_policy(["admin", "domain2", "data2", "read"]))
def test_filtered_policy_invalid_filter(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("rbac_with_domains_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
filter = ["", "domain1"]
with self.assertRaises(RuntimeError):
e.load_filtered_policy(filter)
def test_filtered_policy_empty_filter(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("rbac_with_domains_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
try:
e.load_filtered_policy(None)
except:
raise RuntimeError("unexpected error in LoadFilteredPolicy")
if e.is_filtered():
raise RuntimeError("adapter did not reset the filtered flag correctly")
try:
e.save_policy()
except:
raise RuntimeError("unexpected error in SavePolicy")
def test_unsupported_filtered_policy(self):
e = casbin.Enforcer(
get_examples("rbac_with_domains_model.conf"),
get_examples("rbac_with_domains_policy.csv"),
)
filter = Filter()
filter.P = ["", "domain1"]
filter.G = ["", "", "domain1"]
with self.assertRaises(ValueError):
e.load_filtered_policy(filter)
def test_filtered_adapter_empty_filepath(self):
adapter = casbin.persist.adapters.FilteredAdapter("")
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
with self.assertRaises(RuntimeError):
e.load_filtered_policy(None)
def test_filtered_adapter_invalid_filepath(self):
adapter = casbin.persist.adapters.FilteredAdapter(
get_examples("does_not_exist_policy.csv")
)
e = casbin.Enforcer(get_examples("rbac_with_domains_model.conf"), adapter)
with self.assertRaises(RuntimeError):
e.load_filtered_policy(None)
| 37.63871 | 83 | 0.644155 | 657 | 5,834 | 5.525114 | 0.200913 | 0.048485 | 0.057851 | 0.073278 | 0.740771 | 0.715427 | 0.711295 | 0.663085 | 0.57741 | 0.57741 | 0 | 0.008794 | 0.239801 | 5,834 | 154 | 84 | 37.883117 | 0.809696 | 0.101817 | 0 | 0.675439 | 0 | 0 | 0.212823 | 0.079809 | 0 | 0 | 0 | 0 | 0.149123 | 1 | 0.070175 | false | 0 | 0.026316 | 0 | 0.131579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6638ae9794518d29c0857e6982654a2a1b05c369 | 3,239 | py | Python | pysnmp/IANA-PWE3-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/IANA-PWE3-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/IANA-PWE3-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module IANA-PWE3-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/IANA-PWE3-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:30:09 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, ConstraintsUnion, ValueRangeConstraint, ValueSizeConstraint, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "ConstraintsUnion", "ValueRangeConstraint", "ValueSizeConstraint", "SingleValueConstraint")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Gauge32, Counter64, MibScalar, MibTable, MibTableRow, MibTableColumn, Unsigned32, IpAddress, iso, ModuleIdentity, mib_2, TimeTicks, Counter32, NotificationType, Bits, ObjectIdentity, MibIdentifier, Integer32 = mibBuilder.importSymbols("SNMPv2-SMI", "Gauge32", "Counter64", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Unsigned32", "IpAddress", "iso", "ModuleIdentity", "mib-2", "TimeTicks", "Counter32", "NotificationType", "Bits", "ObjectIdentity", "MibIdentifier", "Integer32")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
ianaPwe3MIB = ModuleIdentity((1, 3, 6, 1, 2, 1, 174))
ianaPwe3MIB.setRevisions(('2009-06-11 00:00',))
if mibBuilder.loadTexts: ianaPwe3MIB.setLastUpdated('200906110000Z')
if mibBuilder.loadTexts: ianaPwe3MIB.setOrganization('IANA')
class IANAPwTypeTC(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 32767))
namedValues = NamedValues(("other", 0), ("frameRelayDlciMartiniMode", 1), ("atmAal5SduVcc", 2), ("atmTransparent", 3), ("ethernetTagged", 4), ("ethernet", 5), ("hdlc", 6), ("ppp", 7), ("cem", 8), ("atmCellNto1Vcc", 9), ("atmCellNto1Vpc", 10), ("ipLayer2Transport", 11), ("atmCell1to1Vcc", 12), ("atmCell1to1Vpc", 13), ("atmAal5PduVcc", 14), ("frameRelayPortMode", 15), ("cep", 16), ("e1Satop", 17), ("t1Satop", 18), ("e3Satop", 19), ("t3Satop", 20), ("basicCesPsn", 21), ("basicTdmIp", 22), ("tdmCasCesPsn", 23), ("tdmCasTdmIp", 24), ("frDlci", 25), ("wildcard", 32767))
class IANAPwPsnTypeTC(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))
namedValues = NamedValues(("mpls", 1), ("l2tp", 2), ("udpOverIp", 3), ("mplsOverIp", 4), ("mplsOverGre", 5), ("other", 6))
class IANAPwCapabilities(TextualConvention, Bits):
status = 'current'
namedValues = NamedValues(("pwStatusIndication", 0), ("pwVCCV", 1))
mibBuilder.exportSymbols("IANA-PWE3-MIB", IANAPwCapabilities=IANAPwCapabilities, IANAPwPsnTypeTC=IANAPwPsnTypeTC, ianaPwe3MIB=ianaPwe3MIB, IANAPwTypeTC=IANAPwTypeTC, PYSNMP_MODULE_ID=ianaPwe3MIB)
| 98.151515 | 574 | 0.736338 | 334 | 3,239 | 7.131737 | 0.473054 | 0.057935 | 0.013854 | 0.065491 | 0.322418 | 0.239295 | 0.234257 | 0.234257 | 0.234257 | 0.144416 | 0 | 0.083333 | 0.096017 | 3,239 | 32 | 575 | 101.21875 | 0.730191 | 0.098796 | 0 | 0.136364 | 0 | 0 | 0.295876 | 0.023711 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.272727 | 0 | 0.772727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6652a0863985d73afb712246d97796fd54eb4c1a | 6,813 | py | Python | gffutils/test/expected.py | aswarren/gffutils | 19ec9167b04557cc228cec7d92a78bee9d6d15d0 | [
"MIT"
] | 171 | 2015-01-15T16:12:29.000Z | 2022-03-14T00:38:03.000Z | gffutils/test/expected.py | aswarren/gffutils | 19ec9167b04557cc228cec7d92a78bee9d6d15d0 | [
"MIT"
] | 124 | 2015-01-09T20:21:45.000Z | 2022-03-17T15:45:35.000Z | gffutils/test/expected.py | aswarren/gffutils | 19ec9167b04557cc228cec7d92a78bee9d6d15d0 | [
"MIT"
] | 60 | 2015-04-15T13:25:51.000Z | 2021-12-16T13:44:41.000Z | # expected data for tests using FBgn0031208.gff and FBgn0031208.gtf files
# list the children and their expected first-order parents for the GFF test file.
GFF_parent_check_level_1 = {'FBtr0300690':['FBgn0031208'],
'FBtr0300689':['FBgn0031208'],
'CG11023:1':['FBtr0300689','FBtr0300690'],
'five_prime_UTR_FBgn0031208:1_737':['FBtr0300689','FBtr0300690'],
'CDS_FBgn0031208:1_737':['FBtr0300689','FBtr0300690'],
'intron_FBgn0031208:1_FBgn0031208:2':['FBtr0300690'],
'intron_FBgn0031208:1_FBgn0031208:3':['FBtr0300689'],
'FBgn0031208:3':['FBtr0300689'],
'CDS_FBgn0031208:3_737':['FBtr0300689'],
'CDS_FBgn0031208:2_737':['FBtr0300690'],
'exon:chr2L:8193-8589:+':['FBtr0300690'],
'intron_FBgn0031208:2_FBgn0031208:4':['FBtr0300690'],
'three_prime_UTR_FBgn0031208:3_737':['FBtr0300689'],
'FBgn0031208:4':['FBtr0300690'],
'CDS_FBgn0031208:4_737':['FBtr0300690'],
'three_prime_UTR_FBgn0031208:4_737':['FBtr0300690'],
}
# and second-level . . . they should all be grandparents of the same gene.
GFF_parent_check_level_2 = {
'CG11023:1':['FBgn0031208'],
'five_prime_UTR_FBgn0031208:1_737':['FBgn0031208'],
'CDS_FBgn0031208:1_737':['FBgn0031208'],
'intron_FBgn0031208:1_FBgn0031208:2':['FBgn0031208'],
'intron_FBgn0031208:1_FBgn0031208:3':['FBgn0031208'],
'FBgn0031208:3':['FBgn0031208'],
'CDS_FBgn0031208:3_737':['FBgn0031208'],
'CDS_FBgn0031208:2_737':['FBgn0031208'],
'exon:chr2L:8193-8589:+':['FBgn0031208'],
'intron_FBgn0031208:2_FBgn0031208:4':['FBgn0031208'],
'three_prime_UTR_FBgn0031208:3_737':['FBgn0031208'],
'FBgn0031208:4':['FBgn0031208'],
'CDS_FBgn0031208:4_737':['FBgn0031208'],
'three_prime_UTR_FBgn0031208:4_737':['FBgn0031208'],
}
# Same thing for GTF test file . . .
GTF_parent_check_level_1 = {
'exon:chr2L:7529-8116:+':['FBtr0300689'],
'exon:chr2L:7529-8116:+_1':['FBtr0300690'],
'exon:chr2L:8193-9484:+':['FBtr0300689'],
'exon:chr2L:8193-8589:+':['FBtr0300690'],
'exon:chr2L:8668-9484:+':['FBtr0300690'],
'exon:chr2L:10000-11000:-':['transcript_Fk_gene_1'],
'exon:chr2L:11500-12500:-':['transcript_Fk_gene_2'],
'CDS:chr2L:7680-8116:+':['FBtr0300689'],
'CDS:chr2L:7680-8116:+_1':['FBtr0300690'],
'CDS:chr2L:8193-8610:+':['FBtr0300689'],
'CDS:chr2L:8193-8589:+':['FBtr0300690'],
'CDS:chr2L:8668-9276:+':['FBtr0300690'],
'CDS:chr2L:10000-11000:-':['transcript_Fk_gene_1'],
'FBtr0300689':['FBgn0031208'],
'FBtr0300690':['FBgn0031208'],
'transcript_Fk_gene_1':['Fk_gene_1'],
'transcript_Fk_gene_2':['Fk_gene_2'],
'start_codon:chr2L:7680-7682:+':['FBtr0300689'],
'start_codon:chr2L:7680-7682:+_1':['FBtr0300690'],
'start_codon:chr2L:10000-11002:-':['transcript_Fk_gene_1'],
'stop_codon:chr2L:8611-8613:+':['FBtr0300689'],
'stop_codon:chr2L:9277-9279:+':['FBtr0300690'],
'stop_codon:chr2L:11001-11003:-':['transcript_Fk_gene_1'],
}
GTF_parent_check_level_2 = {
'exon:chr2L:7529-8116:+':['FBgn0031208'],
'exon:chr2L:8193-9484:+':['FBgn0031208'],
'exon:chr2L:8193-8589:+':['FBgn0031208'],
'exon:chr2L:8668-9484:+':['FBgn0031208'],
'exon:chr2L:10000-11000:-':['Fk_gene_1'],
'exon:chr2L:11500-12500:-':['Fk_gene_2'],
'CDS:chr2L:7680-8116:+':['FBgn0031208'],
'CDS:chr2L:8193-8610:+':['FBgn0031208'],
'CDS:chr2L:8193-8589:+':['FBgn0031208'],
'CDS:chr2L:8668-9276:+':['FBgn0031208'],
'CDS:chr2L:10000-11000:-':['Fk_gene_1'],
'FBtr0300689':[],
'FBtr0300690':[],
'transcript_Fk_gene_1':[],
'transcript_Fk_gene_2':[],
'start_codon:chr2L:7680-7682:+':['FBgn0031208'],
'start_codon:chr2L:10000-11002:-':['Fk_gene_1'],
'stop_codon:chr2L:8611-8613:+':['FBgn0031208'],
'stop_codon:chr2L:9277-9279:+':['FBgn0031208'],
'stop_codon:chr2L:11001-11003:-':['Fk_gene_1'],
}
expected_feature_counts = {
'gff3':{'gene':3,
'mRNA':4,
'exon':6,
'CDS':5,
'five_prime_UTR':1,
'intron':3,
'pcr_product':1,
'protein':2,
'three_prime_UTR':2},
'gtf':{
#'gene':3,
# 'mRNA':4,
'CDS':6,
'exon':7,
'start_codon':3,
'stop_codon':3}
}
expected_features = {'gff3':['gene',
'mRNA',
'protein',
'five_prime_UTR',
'three_prime_UTR',
'pcr_product',
'CDS',
'exon',
'intron'],
'gtf':['gene',
'mRNA',
'CDS',
'exon',
'start_codon',
'stop_codon']}
| 53.645669 | 93 | 0.420666 | 526 | 6,813 | 5.18251 | 0.161597 | 0.035216 | 0.028247 | 0.037417 | 0.41526 | 0.218269 | 0.115187 | 0.043287 | 0 | 0 | 0 | 0.303893 | 0.43828 | 6,813 | 126 | 94 | 54.071429 | 0.408414 | 0.041098 | 0 | 0.108108 | 0 | 0 | 0.417011 | 0.224981 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b07e44a4d5538f9ef1412e0218121f3902309fda | 404 | py | Python | dmb/modeling/__init__.py | jiaw-z/DenseMatchingBenchmark | 177c56ca1952f54d28e6073afa2c16981113a2af | [
"MIT"
] | 160 | 2019-11-16T13:59:21.000Z | 2022-03-28T07:52:59.000Z | dmb/modeling/__init__.py | jiaw-z/DenseMatchingBenchmark | 177c56ca1952f54d28e6073afa2c16981113a2af | [
"MIT"
] | 22 | 2019-11-22T02:14:18.000Z | 2022-01-24T10:16:14.000Z | dmb/modeling/__init__.py | jiaw-z/DenseMatchingBenchmark | 177c56ca1952f54d28e6073afa2c16981113a2af | [
"MIT"
] | 38 | 2019-12-27T14:01:01.000Z | 2022-03-12T11:40:11.000Z | from .flow.models import _META_ARCHITECTURES as _FLOW_META_ARCHITECTURES
from .stereo.models import _META_ARCHITECTURES as _STEREO_META_ARCHITECTURES
_META_ARCHITECTURES = dict()
_META_ARCHITECTURES.update(_FLOW_META_ARCHITECTURES)
_META_ARCHITECTURES.update(_STEREO_META_ARCHITECTURES)
def build_model(cfg):
meta_arch = _META_ARCHITECTURES[cfg.model.meta_architecture]
return meta_arch(cfg)
| 31.076923 | 76 | 0.85396 | 51 | 404 | 6.215686 | 0.333333 | 0.536278 | 0.100946 | 0.182965 | 0.195584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089109 | 404 | 12 | 77 | 33.666667 | 0.861413 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b08e4cdc5897d61e76d27267edee5c12b035f60f | 347 | py | Python | fasttrips/Examples/Springfield/configs/C/config_ft.py | janzill/fast-trips | a0f82849196894f1695e6400698a5fd16431f674 | [
"Apache-2.0"
] | 21 | 2017-11-19T03:21:48.000Z | 2021-11-25T09:16:33.000Z | fasttrips/Examples/Springfield/configs/C/config_ft.py | MetropolitanTransportationCommission/transit-assignment | 02597df56ed152e9993374ba0502d0de444da234 | [
"Apache-2.0"
] | 95 | 2017-08-23T22:04:03.000Z | 2021-07-13T05:32:18.000Z | fasttrips/Examples/Springfield/configs/C/config_ft.py | MetropolitanTransportationCommission/transit-assignment | 02597df56ed152e9993374ba0502d0de444da234 | [
"Apache-2.0"
] | 12 | 2017-10-09T23:23:13.000Z | 2021-11-25T09:16:50.000Z |
def user_class(row_series):
"""
Defines the user class for this trip list.
This function takes a single argument, the pandas.Series with person, household and
trip_list attributes, and returns a user class string.
"""
# print row_series
if row_series["hh_id"] == "simpson":
return "not_real"
return "real" | 28.916667 | 87 | 0.674352 | 49 | 347 | 4.632653 | 0.653061 | 0.118943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.242075 | 347 | 12 | 88 | 28.916667 | 0.863118 | 0.576369 | 0 | 0 | 0 | 0 | 0.195122 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b08e8deaf7998ec3f4f946c6bb89a5b4a6ab5e2a | 2,242 | py | Python | store/migrations/0003_auto_20200419_1852.py | enviousoner1166/appstore | c9ead52476035f891264de53ffe21453cde9348d | [
"MIT"
] | 1 | 2021-03-05T13:17:35.000Z | 2021-03-05T13:17:35.000Z | store/migrations/0003_auto_20200419_1852.py | enviousoner1166/appstore | c9ead52476035f891264de53ffe21453cde9348d | [
"MIT"
] | null | null | null | store/migrations/0003_auto_20200419_1852.py | enviousoner1166/appstore | c9ead52476035f891264de53ffe21453cde9348d | [
"MIT"
] | null | null | null | # Generated by Django 3.0.5 on 2020-04-19 18:52
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('store', '0002_publisher'),
]
operations = [
migrations.AddField(
model_name='app',
name='copyright',
field=models.CharField(default='', max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='app',
name='description',
field=models.TextField(default='', max_length=140),
preserve_default=False,
),
migrations.AddField(
model_name='app',
name='name',
field=models.CharField(default='', max_length=100),
preserve_default=False,
),
migrations.AddField(
model_name='app',
name='publisher',
field=models.ForeignKey(default='', on_delete=django.db.models.deletion.CASCADE, related_name='apps', to='store.Publisher'),
preserve_default=False,
),
migrations.AddField(
model_name='app',
name='version',
field=models.CharField(default='', max_length=20),
preserve_default=False,
),
migrations.AddField(
model_name='publisher',
name='name',
field=models.CharField(default='', help_text='Name of publisher', max_length=100),
preserve_default=False,
),
migrations.AddField(
model_name='publisher',
name='privacy_policy_url',
field=models.URLField(default=1, help_text='Privacy policy link'),
preserve_default=False,
),
migrations.AddField(
model_name='publisher',
name='suppport_url',
field=models.URLField(default='', help_text='Link to useful app resources'),
preserve_default=False,
),
migrations.AddField(
model_name='publisher',
name='website',
field=models.URLField(default='', help_text='Official website of app'),
preserve_default=False,
),
]
| 32.492754 | 136 | 0.56512 | 213 | 2,242 | 5.793427 | 0.300469 | 0.13128 | 0.167747 | 0.196921 | 0.606159 | 0.580227 | 0.388979 | 0.388979 | 0.388979 | 0.095624 | 0 | 0.022179 | 0.316236 | 2,242 | 68 | 137 | 32.970588 | 0.782779 | 0.020071 | 0 | 0.612903 | 1 | 0 | 0.117084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.032258 | 0 | 0.080645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0911bf21fdb8cecdfcf2770701dc7197588b2ea | 317 | py | Python | 3-Python-Advanced (May 2021)/05-Functions/02_Exercises/01-Even-Numbers.py | karolinanikolova/SoftUni-Software-Engineering | 7891924956598b11a1e30e2c220457c85c40f064 | [
"MIT"
] | null | null | null | 3-Python-Advanced (May 2021)/05-Functions/02_Exercises/01-Even-Numbers.py | karolinanikolova/SoftUni-Software-Engineering | 7891924956598b11a1e30e2c220457c85c40f064 | [
"MIT"
] | null | null | null | 3-Python-Advanced (May 2021)/05-Functions/02_Exercises/01-Even-Numbers.py | karolinanikolova/SoftUni-Software-Engineering | 7891924956598b11a1e30e2c220457c85c40f064 | [
"MIT"
] | null | null | null | # 1. Even Numbers
# Write a program that receives a sequence of numbers (integers), separated by a single space.
# It should print a list of only the even numbers. Use filter().
def filter_even(iters):
return list(filter(lambda x: x % 2 == 0, iters))
nums = map(int, input().split())
print(filter_even(nums))
| 26.416667 | 94 | 0.70347 | 52 | 317 | 4.25 | 0.673077 | 0.099548 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011538 | 0.179811 | 317 | 11 | 95 | 28.818182 | 0.838462 | 0.539432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.5 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b09c0f819e33601edc63c9a78f79e89bc5886627 | 1,888 | py | Python | Staging/sandbox2.py | mccolm-robotics/Claver-AI-Assistant | 12b31633b4a45709403ed9599c626e63743318c7 | [
"MIT"
] | 1 | 2020-05-22T18:45:07.000Z | 2020-05-22T18:45:07.000Z | Staging/sandbox2.py | mccolm-robotics/Claver-AI-Assistant | 12b31633b4a45709403ed9599c626e63743318c7 | [
"MIT"
] | null | null | null | Staging/sandbox2.py | mccolm-robotics/Claver-AI-Assistant | 12b31633b4a45709403ed9599c626e63743318c7 | [
"MIT"
] | null | null | null | class Foo:
def __init__(self, id):
self.id = id
def __str__(self):
return 'Foo instance id: {}'.format(self.id)
class Bar:
def __init__(self, id):
self.id = id
def __str__(self):
return 'Bar instance id: {}'.format(self.id)
class Baz:
def __init__(self, id):
self.id = id
def __str__(self):
return 'Baz instance id: {}'.format(self.id)
foo_1 = Foo("to go")
foo_2 = Foo("to clean")
foo_3 = Foo("to avoid")
foo_4 = Foo("to destroy")
bar_1 = Bar("the bathroom")
bar_2 = Bar("the kitchen")
bar_3 = Bar("the bedroom")
bar_4 = Bar("the basement")
bar_5 = Bar("the garage")
my_dictionary = {Foo: [foo_1, foo_2, foo_3, foo_4], Bar: [bar_1, bar_2, bar_3, bar_4, bar_5]}
if Baz in my_dictionary:
textBatch = my_dictionary[Baz]
else:
textBatch = []
my_dictionary[Baz] = textBatch
textBatch.append(5) # Pass by reference
# test = my_dictionary.get(Foo)
# print(test)
# retest = my_dictionary[Foo]
# print(retest)
for model_type in my_dictionary:
for entity in my_dictionary[model_type]:
print(entity)
for unit in my_dictionary:
print(unit)
if Baz in my_dictionary:
del my_dictionary[Baz]
for model_type in my_dictionary:
for entity in my_dictionary[model_type]:
print(entity)
#
# if Foo in my_dictionary:
# # for entity in my_dictionary[Foo]:
# # print(entity)
# my_dictionary[Foo].append(Foo("to dirty"))
# for entity in my_dictionary[Foo]:
# print(entity)
# #
# # if Baz in my_dictionary:
# # print("Should not see this")
# # for entity in my_dictionary[Baz]:
# # print(entity)
# # else:
# # my_dictionary[Baz] = [Baz("big test")]
# #
# #
# # print("\nNow testing for Baz")
# # if Baz in my_dictionary:
# # for entity in my_dictionary[Baz]:
# # print(entity)
for unit in my_dictionary:
print(unit)
| 22.47619 | 93 | 0.630826 | 282 | 1,888 | 3.978723 | 0.198582 | 0.245989 | 0.187166 | 0.069519 | 0.555258 | 0.480392 | 0.432264 | 0.432264 | 0.285205 | 0.221925 | 0 | 0.013131 | 0.233581 | 1,888 | 83 | 94 | 22.746988 | 0.762267 | 0.31303 | 0 | 0.488372 | 0 | 0 | 0.114833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0 | 0 | 0.069767 | 0.27907 | 0.093023 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b09ce2d002feb3379e86e88aaf999aa361bcd469 | 29,520 | py | Python | GM2AUTOSAR_MM/Properties/positive/models/P1Connected_MDL.py | levilucio/SyVOLT | 7526ec794d21565e3efcc925a7b08ae8db27d46a | [
"MIT"
] | 3 | 2017-06-02T19:26:27.000Z | 2021-06-14T04:25:45.000Z | GM2AUTOSAR_MM/Properties/positive/models/P1Connected_MDL.py | levilucio/SyVOLT | 7526ec794d21565e3efcc925a7b08ae8db27d46a | [
"MIT"
] | 8 | 2016-08-24T07:04:07.000Z | 2017-05-26T16:22:47.000Z | GM2AUTOSAR_MM/Properties/positive/models/P1Connected_MDL.py | levilucio/SyVOLT | 7526ec794d21565e3efcc925a7b08ae8db27d46a | [
"MIT"
] | 1 | 2019-10-31T06:00:23.000Z | 2019-10-31T06:00:23.000Z | """
__P1Connected_MDL.py_____________________________________________________
Automatically generated AToM3 Model File (Do not modify directly)
Author: gehan
Modified: Mon Sep 16 17:32:40 2013
_________________________________________________________________________
"""
from stickylink import *
from widthXfillXdecoration import *
from MT_pre__ECU import *
from MT_pre__VirtualDevice import *
from MT_pre__Distributable import *
from MT_pre__ExecFrame import *
from MT_pre__Signal import *
from MT_pre__directLink_S import *
from LHS import *
from graph_MT_pre__ECU import *
from graph_LHS import *
from graph_MT_pre__ExecFrame import *
from graph_MT_pre__Distributable import *
from graph_MT_pre__directLink_S import *
from graph_MT_pre__VirtualDevice import *
from graph_MT_pre__Signal import *
from ATOM3Enum import *
from ATOM3String import *
from ATOM3BottomType import *
from ATOM3Constraint import *
from ATOM3Attribute import *
from ATOM3Float import *
from ATOM3List import *
from ATOM3Link import *
from ATOM3Connection import *
from ATOM3Boolean import *
from ATOM3Appearance import *
from ATOM3Text import *
from ATOM3Action import *
from ATOM3Integer import *
from ATOM3Port import *
from ATOM3MSEnum import *
def P1Connected_MDL(self, rootNode, MT_pre__GM2AUTOSAR_MMRootNode=None, MoTifRuleRootNode=None):
# --- Generating attributes code for ASG MT_pre__GM2AUTOSAR_MM ---
if( MT_pre__GM2AUTOSAR_MMRootNode ):
# author
MT_pre__GM2AUTOSAR_MMRootNode.author.setValue('Annonymous')
# description
MT_pre__GM2AUTOSAR_MMRootNode.description.setValue('\n')
MT_pre__GM2AUTOSAR_MMRootNode.description.setHeight(15)
# name
MT_pre__GM2AUTOSAR_MMRootNode.name.setValue('')
MT_pre__GM2AUTOSAR_MMRootNode.name.setNone()
# --- ASG attributes over ---
# --- Generating attributes code for ASG MoTifRule ---
if( MoTifRuleRootNode ):
# author
MoTifRuleRootNode.author.setValue('Annonymous')
# description
MoTifRuleRootNode.description.setValue('\n')
MoTifRuleRootNode.description.setHeight(15)
# name
MoTifRuleRootNode.name.setValue('P1Connected')
# --- ASG attributes over ---
self.obj61=MT_pre__ECU(self)
self.obj61.isGraphObjectVisual = True
if(hasattr(self.obj61, '_setHierarchicalLink')):
self.obj61._setHierarchicalLink(False)
# MT_pivotOut__
self.obj61.MT_pivotOut__.setValue('')
self.obj61.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj61.MT_subtypeMatching__.setValue(('True', 0))
self.obj61.MT_subtypeMatching__.config = 0
# MT_pre__classtype
self.obj61.MT_pre__classtype.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj61.MT_pre__classtype.setHeight(15)
# MT_pivotIn__
self.obj61.MT_pivotIn__.setValue('')
self.obj61.MT_pivotIn__.setNone()
# MT_label__
self.obj61.MT_label__.setValue('1')
# MT_pre__cardinality
self.obj61.MT_pre__cardinality.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj61.MT_pre__cardinality.setHeight(15)
# MT_pre__name
self.obj61.MT_pre__name.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj61.MT_pre__name.setHeight(15)
self.obj61.graphClass_= graph_MT_pre__ECU
if self.genGraphics:
new_obj = graph_MT_pre__ECU(140.0,80.0,self.obj61)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__ECU", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj61.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj61)
self.globalAndLocalPostcondition(self.obj61, rootNode)
self.obj61.postAction( rootNode.CREATE )
self.obj62=MT_pre__VirtualDevice(self)
self.obj62.isGraphObjectVisual = True
if(hasattr(self.obj62, '_setHierarchicalLink')):
self.obj62._setHierarchicalLink(False)
# MT_pivotOut__
self.obj62.MT_pivotOut__.setValue('')
self.obj62.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj62.MT_subtypeMatching__.setValue(('True', 0))
self.obj62.MT_subtypeMatching__.config = 0
# MT_pre__classtype
self.obj62.MT_pre__classtype.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj62.MT_pre__classtype.setHeight(15)
# MT_pivotIn__
self.obj62.MT_pivotIn__.setValue('')
self.obj62.MT_pivotIn__.setNone()
# MT_label__
self.obj62.MT_label__.setValue('2')
# MT_pre__cardinality
self.obj62.MT_pre__cardinality.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj62.MT_pre__cardinality.setHeight(15)
# MT_pre__name
self.obj62.MT_pre__name.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj62.MT_pre__name.setHeight(15)
self.obj62.graphClass_= graph_MT_pre__VirtualDevice
if self.genGraphics:
new_obj = graph_MT_pre__VirtualDevice(340.0,80.0,self.obj62)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__VirtualDevice", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj62.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj62)
self.globalAndLocalPostcondition(self.obj62, rootNode)
self.obj62.postAction( rootNode.CREATE )
self.obj63=MT_pre__Distributable(self)
self.obj63.isGraphObjectVisual = True
if(hasattr(self.obj63, '_setHierarchicalLink')):
self.obj63._setHierarchicalLink(False)
# MT_pivotOut__
self.obj63.MT_pivotOut__.setValue('')
self.obj63.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj63.MT_subtypeMatching__.setValue(('True', 0))
self.obj63.MT_subtypeMatching__.config = 0
# MT_pre__classtype
self.obj63.MT_pre__classtype.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj63.MT_pre__classtype.setHeight(15)
# MT_pivotIn__
self.obj63.MT_pivotIn__.setValue('')
self.obj63.MT_pivotIn__.setNone()
# MT_label__
self.obj63.MT_label__.setValue('3')
# MT_pre__cardinality
self.obj63.MT_pre__cardinality.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj63.MT_pre__cardinality.setHeight(15)
# MT_pre__name
self.obj63.MT_pre__name.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj63.MT_pre__name.setHeight(15)
self.obj63.graphClass_= graph_MT_pre__Distributable
if self.genGraphics:
new_obj = graph_MT_pre__Distributable(340.0,211.0,self.obj63)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__Distributable", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj63.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj63)
self.globalAndLocalPostcondition(self.obj63, rootNode)
self.obj63.postAction( rootNode.CREATE )
self.obj64=MT_pre__ExecFrame(self)
self.obj64.isGraphObjectVisual = True
if(hasattr(self.obj64, '_setHierarchicalLink')):
self.obj64._setHierarchicalLink(False)
# MT_pivotOut__
self.obj64.MT_pivotOut__.setValue('')
self.obj64.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj64.MT_subtypeMatching__.setValue(('True', 0))
self.obj64.MT_subtypeMatching__.config = 0
# MT_pre__classtype
self.obj64.MT_pre__classtype.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj64.MT_pre__classtype.setHeight(15)
# MT_pivotIn__
self.obj64.MT_pivotIn__.setValue('')
self.obj64.MT_pivotIn__.setNone()
# MT_label__
self.obj64.MT_label__.setValue('4')
# MT_pre__cardinality
self.obj64.MT_pre__cardinality.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj64.MT_pre__cardinality.setHeight(15)
# MT_pre__name
self.obj64.MT_pre__name.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj64.MT_pre__name.setHeight(15)
self.obj64.graphClass_= graph_MT_pre__ExecFrame
if self.genGraphics:
new_obj = graph_MT_pre__ExecFrame(340.0,338.0,self.obj64)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__ExecFrame", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj64.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj64)
self.globalAndLocalPostcondition(self.obj64, rootNode)
self.obj64.postAction( rootNode.CREATE )
self.obj65=MT_pre__Signal(self)
self.obj65.isGraphObjectVisual = True
if(hasattr(self.obj65, '_setHierarchicalLink')):
self.obj65._setHierarchicalLink(False)
# MT_pivotOut__
self.obj65.MT_pivotOut__.setValue('')
self.obj65.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj65.MT_subtypeMatching__.setValue(('True', 0))
self.obj65.MT_subtypeMatching__.config = 0
# MT_pre__classtype
self.obj65.MT_pre__classtype.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj65.MT_pre__classtype.setHeight(15)
# MT_pivotIn__
self.obj65.MT_pivotIn__.setValue('')
self.obj65.MT_pivotIn__.setNone()
# MT_label__
self.obj65.MT_label__.setValue('5')
# MT_pre__cardinality
self.obj65.MT_pre__cardinality.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj65.MT_pre__cardinality.setHeight(15)
# MT_pre__name
self.obj65.MT_pre__name.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj65.MT_pre__name.setHeight(15)
self.obj65.graphClass_= graph_MT_pre__Signal
if self.genGraphics:
new_obj = graph_MT_pre__Signal(140.0,337.0,self.obj65)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__Signal", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj65.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj65)
self.globalAndLocalPostcondition(self.obj65, rootNode)
self.obj65.postAction( rootNode.CREATE )
self.obj68=MT_pre__directLink_S(self)
self.obj68.isGraphObjectVisual = True
if(hasattr(self.obj68, '_setHierarchicalLink')):
self.obj68._setHierarchicalLink(False)
# MT_label__
self.obj68.MT_label__.setValue('6')
# MT_pivotOut__
self.obj68.MT_pivotOut__.setValue('')
self.obj68.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj68.MT_subtypeMatching__.setValue(('True', 0))
self.obj68.MT_subtypeMatching__.config = 0
# MT_pivotIn__
self.obj68.MT_pivotIn__.setValue('')
self.obj68.MT_pivotIn__.setNone()
# MT_pre__associationType
self.obj68.MT_pre__associationType.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj68.MT_pre__associationType.setHeight(15)
self.obj68.graphClass_= graph_MT_pre__directLink_S
if self.genGraphics:
new_obj = graph_MT_pre__directLink_S(410.0,154.0,self.obj68)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__directLink_S", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
else: new_obj = None
self.obj68.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj68)
self.globalAndLocalPostcondition(self.obj68, rootNode)
self.obj68.postAction( rootNode.CREATE )
self.obj69=MT_pre__directLink_S(self)
self.obj69.isGraphObjectVisual = True
if(hasattr(self.obj69, '_setHierarchicalLink')):
self.obj69._setHierarchicalLink(False)
# MT_label__
self.obj69.MT_label__.setValue('7')
# MT_pivotOut__
self.obj69.MT_pivotOut__.setValue('')
self.obj69.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj69.MT_subtypeMatching__.setValue(('True', 0))
self.obj69.MT_subtypeMatching__.config = 0
# MT_pivotIn__
self.obj69.MT_pivotIn__.setValue('')
self.obj69.MT_pivotIn__.setNone()
# MT_pre__associationType
self.obj69.MT_pre__associationType.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj69.MT_pre__associationType.setHeight(15)
self.obj69.graphClass_= graph_MT_pre__directLink_S
if self.genGraphics:
new_obj = graph_MT_pre__directLink_S(510.0,219.5,self.obj69)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__directLink_S", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
else: new_obj = None
self.obj69.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj69)
self.globalAndLocalPostcondition(self.obj69, rootNode)
self.obj69.postAction( rootNode.CREATE )
self.obj70=MT_pre__directLink_S(self)
self.obj70.isGraphObjectVisual = True
if(hasattr(self.obj70, '_setHierarchicalLink')):
self.obj70._setHierarchicalLink(False)
# MT_label__
self.obj70.MT_label__.setValue('8')
# MT_pivotOut__
self.obj70.MT_pivotOut__.setValue('')
self.obj70.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj70.MT_subtypeMatching__.setValue(('True', 0))
self.obj70.MT_subtypeMatching__.config = 0
# MT_pivotIn__
self.obj70.MT_pivotIn__.setValue('')
self.obj70.MT_pivotIn__.setNone()
# MT_pre__associationType
self.obj70.MT_pre__associationType.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj70.MT_pre__associationType.setHeight(15)
self.obj70.graphClass_= graph_MT_pre__directLink_S
if self.genGraphics:
new_obj = graph_MT_pre__directLink_S(510.0,348.5,self.obj70)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__directLink_S", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
else: new_obj = None
self.obj70.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj70)
self.globalAndLocalPostcondition(self.obj70, rootNode)
self.obj70.postAction( rootNode.CREATE )
self.obj71=MT_pre__directLink_S(self)
self.obj71.isGraphObjectVisual = True
if(hasattr(self.obj71, '_setHierarchicalLink')):
self.obj71._setHierarchicalLink(False)
# MT_label__
self.obj71.MT_label__.setValue('9')
# MT_pivotOut__
self.obj71.MT_pivotOut__.setValue('')
self.obj71.MT_pivotOut__.setNone()
# MT_subtypeMatching__
self.obj71.MT_subtypeMatching__.setValue(('True', 0))
self.obj71.MT_subtypeMatching__.config = 0
# MT_pivotIn__
self.obj71.MT_pivotIn__.setValue('')
self.obj71.MT_pivotIn__.setNone()
# MT_pre__associationType
self.obj71.MT_pre__associationType.setValue('\n#===============================================================================\n# This code is executed when evaluating if a node shall be matched by this rule.\n# You can access the value of the current node\'s attribute value by: attr_value.\n# You can access any attribute x of this node by: this[\'x\'].\n# If the constraint relies on attribute values from other nodes,\n# use the LHS/NAC constraint instead.\n# The given constraint must evaluate to a boolean expression.\n#===============================================================================\n\nreturn True\n')
self.obj71.MT_pre__associationType.setHeight(15)
self.obj71.graphClass_= graph_MT_pre__directLink_S
if self.genGraphics:
new_obj = graph_MT_pre__directLink_S(410.0,411.5,self.obj71)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("MT_pre__directLink_S", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
else: new_obj = None
self.obj71.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj71)
self.globalAndLocalPostcondition(self.obj71, rootNode)
self.obj71.postAction( rootNode.CREATE )
self.obj60=LHS(self)
self.obj60.isGraphObjectVisual = True
if(hasattr(self.obj60, '_setHierarchicalLink')):
self.obj60._setHierarchicalLink(False)
# constraint
self.obj60.constraint.setValue('#===============================================================================\n# This code is executed after the nodes in the LHS have been matched.\n# You can access a matched node labelled n by: PreNode(\'n\').\n# To access attribute x of node n, use: PreNode(\'n\')[\'x\'].\n# The given constraint must evaluate to a boolean expression:\n# returning True enables the rule to be applied,\n# returning False forbids the rule from being applied.\n#===============================================================================\n\nif PreNode(\'1\')[\'cardinality\']==\'+\' and PreNode(\'2\')[\'cardinality\']==\'+\' and PreNode(\'3\')[\'cardinality\']==\'+\' and PreNode(\'4\')[\'cardinality\']==\'+\' and PreNode(\'5\')[\'cardinality\']==\'1\' and PreNode(\'6\')[\'associationType\']==\'virtualDevice\' and PreNode(\'7\')[\'associationType\']==\'distributable\' and PreNode(\'8\')[\'associationType\']==\'execFrame\' and PreNode(\'9\')[\'associationType\']==\'provided\':\n return True\nreturn False\n')
self.obj60.constraint.setHeight(15)
self.obj60.graphClass_= graph_LHS
if self.genGraphics:
new_obj = graph_LHS(120.0,60.0,self.obj60)
new_obj.DrawObject(self.UMLmodel)
self.UMLmodel.addtag_withtag("LHS", new_obj.tag)
new_obj.layConstraints = dict() # Graphical Layout Constraints
new_obj.layConstraints['scale'] = [1.0, 1.0]
else: new_obj = None
self.obj60.graphObject_ = new_obj
# Add node to the root: rootNode
rootNode.addNode(self.obj60)
self.globalAndLocalPostcondition(self.obj60, rootNode)
self.obj60.postAction( rootNode.CREATE )
# Connections for obj61 (graphObject_: Obj13) of type MT_pre__ECU
self.drawConnections(
(self.obj61,self.obj68,[310.0, 154.0, 410.0, 154.0],"true", 2) )
# Connections for obj62 (graphObject_: Obj14) of type MT_pre__VirtualDevice
self.drawConnections(
(self.obj62,self.obj69,[510.0, 154.0, 510.0, 219.5],"true", 2) )
# Connections for obj63 (graphObject_: Obj15) of type MT_pre__Distributable
self.drawConnections(
(self.obj63,self.obj70,[510.0, 285.0, 510.0, 348.5],"true", 2) )
# Connections for obj64 (graphObject_: Obj16) of type MT_pre__ExecFrame
self.drawConnections(
(self.obj64,self.obj71,[510.0, 412.0, 410.0, 411.5],"true", 2) )
# Connections for obj65 (graphObject_: Obj17) of type MT_pre__Signal
self.drawConnections(
)
# Connections for obj68 (graphObject_: Obj18) of type MT_pre__directLink_S
self.drawConnections(
(self.obj68,self.obj62,[410.0, 154.0, 510.0, 154.0],"true", 2) )
# Connections for obj69 (graphObject_: Obj19) of type MT_pre__directLink_S
self.drawConnections(
(self.obj69,self.obj63,[510.0, 219.5, 510.0, 285.0],"true", 2) )
# Connections for obj70 (graphObject_: Obj20) of type MT_pre__directLink_S
self.drawConnections(
(self.obj70,self.obj64,[510.0, 348.5, 510.0, 412.0],"true", 2) )
# Connections for obj71 (graphObject_: Obj21) of type MT_pre__directLink_S
self.drawConnections(
(self.obj71,self.obj65,[410.0, 411.5, 310.0, 411.0],"true", 2) )
# Connections for obj60 (graphObject_: Obj12) of type LHS
self.drawConnections(
)
newfunction = P1Connected_MDL
loadedMMName = ['MT_pre__GM2AUTOSAR_MM_META', 'MoTifRule_META']
atom3version = '0.3'
| 55.698113 | 1,052 | 0.657927 | 3,823 | 29,520 | 4.818206 | 0.057808 | 0.033388 | 0.014821 | 0.027524 | 0.717535 | 0.637134 | 0.569815 | 0.550489 | 0.521173 | 0.499023 | 0 | 0.033195 | 0.141768 | 29,520 | 529 | 1,053 | 55.803403 | 0.693862 | 0.090989 | 0 | 0.184049 | 1 | 0.110429 | 0.201489 | 0.069657 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003067 | false | 0 | 0.09816 | 0 | 0.101227 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b09eb1730d5762c1bbc6722aad64f6fedbe56656 | 577 | py | Python | eod/fundamental_economic_data/economic_events_data_api/economic_events_data.py | gereon/eod-data | 4286a03cc08bc8b5dc42ebae0bb8eb22bdfa3230 | [
"Apache-2.0"
] | 19 | 2021-09-18T11:31:45.000Z | 2022-03-15T20:03:52.000Z | eod/fundamental_economic_data/economic_events_data_api/economic_events_data.py | gereon/eod-data | 4286a03cc08bc8b5dc42ebae0bb8eb22bdfa3230 | [
"Apache-2.0"
] | 2 | 2022-02-18T23:37:48.000Z | 2022-03-01T18:14:06.000Z | eod/fundamental_economic_data/economic_events_data_api/economic_events_data.py | gereon/eod-data | 4286a03cc08bc8b5dc42ebae0bb8eb22bdfa3230 | [
"Apache-2.0"
] | 8 | 2021-09-13T16:49:52.000Z | 2022-03-31T21:09:44.000Z | # -*- coding: utf-8 -*-
"""
Created on Tue Jan 4 16:28:22 2022
@author: lauta
"""
from eod.request_handler_class import RequestHandler
class EconomicEventsData(RequestHandler):
def __init__(self, api_key:str, timeout:int):
# base URL's of the API
self.URL_ECONOMIC_EVENT_DATA = 'https://eodhistoricaldata.com/api/economic-events/'
super().__init__(api_key, timeout)
def get_economic_events(self, **query_params):
self.endpoint = self.URL_ECONOMIC_EVENT_DATA
return super().handle_request(self.endpoint, query_params) | 32.055556 | 91 | 0.701906 | 76 | 577 | 5.026316 | 0.644737 | 0.031414 | 0.078534 | 0.104712 | 0.125654 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025478 | 0.183709 | 577 | 18 | 92 | 32.055556 | 0.785563 | 0.166378 | 0 | 0 | 0 | 0 | 0.105708 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b09ec5a50499dc25f03f9923941e0a467c11bd13 | 1,453 | py | Python | workspacemanager/test/setuptest.py | hayj/WorkspaceManager | 81a9b7fdb5b532ffed75742090a620447ee614eb | [
"MIT"
] | 3 | 2018-12-03T22:50:35.000Z | 2019-06-18T10:56:20.000Z | workspacemanager/test/setuptest.py | hayj/WorkspaceManager | 81a9b7fdb5b532ffed75742090a620447ee614eb | [
"MIT"
] | null | null | null | workspacemanager/test/setuptest.py | hayj/WorkspaceManager | 81a9b7fdb5b532ffed75742090a620447ee614eb | [
"MIT"
] | null | null | null | # coding: utf-8
import unittest
import doctest
import os
from workspacemanager import setup
from workspacemanager import generateSetup
from workspacemanager.utils import *
from shutil import *
from workspacemanager.test.utils import *
# The level allow the unit test execution to choose only the top level test
min = 0
max = 1
assert min <= max
if min <= 0 <= max:
class DocTest(unittest.TestCase):
def testDoctests(self):
"""Run doctests"""
doctest.testmod(setup)
if min <= 1 <= max:
class Test1(unittest.TestCase):
def setUp(self):
pass
def test1(self):
# Create a fake project:
theProjectDirectory = createFakeDir()
# Check the fake project:
assert os.path.isdir(theProjectDirectory) is True
# Generate the setup and others:
generateSetup(theProjectDirectory=theProjectDirectory)
# Check things:
self.assertTrue("__DES" not in fileToStr(theProjectDirectory + "/setup.py"))
self.assertTrue("<year>" not in fileToStr(theProjectDirectory + "/LICENCE.txt"))
self.assertTrue("version" in fileToStr(theProjectDirectory + "/projecttest/__init__.py"))
if min <= 2 <= max:
pass
if min <= 3 <= max:
pass
if __name__ == '__main__':
unittest.main() # Or execute as Python unit-test in eclipse
| 25.946429 | 101 | 0.623538 | 158 | 1,453 | 5.64557 | 0.474684 | 0.089686 | 0.100897 | 0.073991 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008712 | 0.289057 | 1,453 | 55 | 102 | 26.418182 | 0.854792 | 0.162423 | 0 | 0.09375 | 1 | 0 | 0.059167 | 0.02 | 0 | 0 | 0 | 0 | 0.15625 | 1 | 0.09375 | false | 0.09375 | 0.25 | 0 | 0.40625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b0a338a5f1a73271ed34d25f1ee73b316debec21 | 3,754 | py | Python | bundle.py | cmakler/kgjs | a29d194cfbfe3dcb0407b5281a34dd0ddd42bf68 | [
"MIT"
] | 20 | 2019-03-12T12:54:04.000Z | 2022-01-27T01:24:07.000Z | bundle.py | cmakler/kgjs | a29d194cfbfe3dcb0407b5281a34dd0ddd42bf68 | [
"MIT"
] | 5 | 2017-07-20T17:16:17.000Z | 2022-02-26T04:07:44.000Z | bundle.py | cmakler/kgjs | a29d194cfbfe3dcb0407b5281a34dd0ddd42bf68 | [
"MIT"
] | 3 | 2019-10-10T03:39:14.000Z | 2021-12-13T00:45:46.000Z | import json
import codecs
__author__ = 'cmakler'
js_directories = [
'build/bundled/',
'docs/js/',
'docs/playground/code/',
'../bh-textbook/code/',
'../core-interactives/code/',
'../econgraphs/static/js/'
]
js_local_directories = [
'build/bundled/',
'docs/js/',
'docs/playground/code/'
]
css_directories = [
'build/bundled/',
'docs/css/',
'docs/playground/code/',
'../bh-textbook/code/',
'../core-interactives/code/',
'../econgraphs/static/css/'
]
bundles = [
{
"name": "kg-lib.js",
"dest_directories": ["build/lib/"],
"order": [
"node_modules/katex/dist/katex.min.js",
"node_modules/katex/dist/contrib/auto-render.min.js",
"node_modules/d3/dist/d3.min.js",
"node_modules/mathjs/dist/math.min.js",
"node_modules/js-yaml/dist/js-yaml.min.js"
]
},
{
"name": "kg3d-lib.js",
"dest_directories": ["build/lib/"],
"order": [
"node_modules/katex/dist/katex.min.js",
"node_modules/katex/dist/contrib/auto-render.min.js",
"node_modules/d3/dist/d3.min.js",
"node_modules/mathjs/dist/math.min.js",
"node_modules/js-yaml/dist/js-yaml.min.js",
"build/lib/mathbox-bundle.min.js"
]
},
{
"name": "kg-lib.css",
"dest_directories": ["build/lib/"],
"order": [
"node_modules/katex/dist/katex.min.css"
]
},
{
"name": "kg-tufte.css",
"dest_directories": ["build/lib/"],
"order": [
"node_modules/katex/dist/katex.min.css",
"node_modules/tufte-css/tufte.min.css"
]
},
{
"name": "kg.0.2.6.js",
"dest_directories": js_directories,
"order": [
"build/lib/kg-lib.js",
"build/kg.js"
]
},
{
"name": "kg3d.0.2.6.js",
"dest_directories": js_directories,
"order": [
"build/lib/kg3d-lib.js",
"build/kg.js"
]
},
{
"name": "kg-lib.js",
"dest_directories": js_local_directories,
"order": [
"build/lib/kg-lib.js"
]
},
{
"name": "kg3d-lib.js",
"dest_directories": js_local_directories,
"order": [
"build/lib/kg3d-lib.js"
]
},
{
"name": "kg.js",
"dest_directories": js_local_directories,
"order": [
"build/kg.js"
]
},
{
"name": "kg.js.map",
"dest_directories": js_local_directories,
"order": [
"build/kg.js.map"
]
},
{
"name": "kg.0.2.6.css",
"dest_directories": css_directories,
"order": [
"node_modules/katex/dist/katex.min.css",
"build/kg.css"
]
},
{
"name": "kg-tufte.0.2.6.css",
"dest_directories": css_directories,
"order": [
"node_modules/katex/dist/katex.min.css",
"node_modules/tufte-css/tufte.min.css",
"build/kg.css"
]
}
]
for bundle in bundles:
for dest_directory in bundle['dest_directories']:
result = ''
bundle_name = bundle['name']
print 'Processing bundle ' + bundle_name + '\n'
for file_name in bundle['order']:
with codecs.open(file_name, 'r', encoding='utf8') as infile:
print ' Appending ' + file_name + '\n'
result += infile.read() + "\n\n"
infile.close()
with codecs.open(dest_directory + bundle_name, 'w', encoding='utf8') as outfile:
outfile.write(result)
outfile.close()
| 25.712329 | 88 | 0.499201 | 405 | 3,754 | 4.48642 | 0.165432 | 0.096863 | 0.070446 | 0.088057 | 0.731976 | 0.711062 | 0.69235 | 0.665933 | 0.623005 | 0.571271 | 0 | 0.009109 | 0.327384 | 3,754 | 145 | 89 | 25.889655 | 0.710495 | 0 | 0 | 0.398551 | 0 | 0 | 0.428343 | 0.224028 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.014493 | null | null | 0.014493 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0a9014475c08fce9f952140596ffecf867a6894 | 2,475 | py | Python | jump_bot/jumpbot/settings.py | beatyou/wechat_jump_game | ba1d1b88eacf9edd287c8418a753aa8b4c842945 | [
"Apache-2.0"
] | 17,238 | 2017-12-29T02:19:57.000Z | 2022-03-30T14:45:23.000Z | jump_bot/jumpbot/settings.py | lmtsunnie/wechat_jump_game | 143c35893ff0f5ddd07348a4b4c55ccd4e8165fd | [
"MIT"
] | 1,175 | 2017-12-29T04:54:10.000Z | 2022-03-11T23:17:36.000Z | jump_bot/jumpbot/settings.py | lmtsunnie/wechat_jump_game | 143c35893ff0f5ddd07348a4b4c55ccd4e8165fd | [
"MIT"
] | 5,918 | 2017-12-29T03:08:01.000Z | 2022-03-31T14:51:05.000Z | # Wechat Jump Bot (iOS)
# ----------------------------------------------------------------------------
import os
CURRENT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
PROJECT_ROOT = os.path.dirname(CURRENT_DIR)
PROJECT_DIR = "jumpbot/"
# ----------------------------------------------------------------------------
# Screenshot
DATA_DIR = "data/"
IMAGE = "screen.png"
IMAGE_DIR = PROJECT_DIR + DATA_DIR + IMAGE
# ----------------------------------------------------------------------------
# mode: ['auto', 'manual']
MODE = "manual"
# ----------------------------------------------------------------------------
# Params
def get_bot_params(model="ip"):
bot_params = {
"TIME_COEFF": 2.,
"COORD_Y_START_SCAN": 200,
"PIECE_BASE_HEIGHT_HALF": 13,
"PIECE_BODY_WIDTH": 49,
"SWIPE_X1": 375,
"SWIPE_Y1": 1055,
"SWIPE_X2": 375,
"SWIPE_Y2": 1055
}
if model == "ip":
bot_params["TIME_COEFF"] = 2.
bot_params["COORD_Y_START_SCAN"] = 200
bot_params["PIECE_BASE_HEIGHT_HALF"] = 13
bot_params["PIECE_BODY_WIDTH"] = 49
bot_params["SWIPE_X1"] = 375
bot_params["SWIPE_Y1"] = 1055
bot_params["SWIPE_X2"] = 375
bot_params["SWIPE_Y2"] = 1055
elif model == "plus":
bot_params["TIME_COEFF"] = 1.2
bot_params["COORD_Y_START_SCAN"] = 300
bot_params["PIECE_BASE_HEIGHT_HALF"] = 20
bot_params["PIECE_BODY_WIDTH"] = 70
bot_params["SWIPE_X1"] = 320
bot_params["SWIPE_Y1"] = 410
bot_params["SWIPE_X2"] = 320
bot_params["SWIPE_Y2"] = 410
elif model == "ipx":
bot_params["TIME_COEFF"] = 1.31
bot_params["COORD_Y_START_SCAN"] = 170
bot_params["PIECE_BASE_HEIGHT_HALF"] = 23
bot_params["PIECE_BODY_WIDTH"] = 70
bot_params["SWIPE_X1"] = 320
bot_params["SWIPE_Y1"] = 410
bot_params["SWIPE_X2"] = 320
bot_params["SWIPE_Y2"] = 410
elif model == "se":
bot_params["TIME_COEFF"] = 2.3
bot_params["COORD_Y_START_SCAN"] = 190
bot_params["PIECE_BASE_HEIGHT_HALF"] = 12
bot_params["PIECE_BODY_WIDTH"] = 50
bot_params["SWIPE_X1"] = 375
bot_params["SWIPE_Y1"] = 1055
bot_params["SWIPE_X2"] = 375
bot_params["SWIPE_Y2"] = 1055
else:
print("ParamError: Unknown model type, model should be [ip, plus, ipx, se]")
return bot_params
| 30.182927 | 84 | 0.528889 | 296 | 2,475 | 4.040541 | 0.260135 | 0.263378 | 0.187291 | 0.075251 | 0.682274 | 0.532609 | 0.398829 | 0.314381 | 0.314381 | 0.314381 | 0 | 0.068277 | 0.230707 | 2,475 | 81 | 85 | 30.555556 | 0.559874 | 0.150303 | 0 | 0.310345 | 0 | 0 | 0.286055 | 0.052531 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017241 | false | 0 | 0.017241 | 0 | 0.051724 | 0.017241 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0ce3fb606b60c976a97a3d7ed3337be7ed1e9e1 | 2,037 | py | Python | src/result/settings.py | fteychene/survey-server | c7c74e6229fdc7e8392fdf1aef570814131c2104 | [
"Apache-2.0"
] | null | null | null | src/result/settings.py | fteychene/survey-server | c7c74e6229fdc7e8392fdf1aef570814131c2104 | [
"Apache-2.0"
] | null | null | null | src/result/settings.py | fteychene/survey-server | c7c74e6229fdc7e8392fdf1aef570814131c2104 | [
"Apache-2.0"
] | null | null | null | '''
Created on Sep 21, 2015
@author: fteychene
'''
from result import hooks
class ResultConfiguration(object):
@staticmethod
def hooks():
return hooks.register
@staticmethod
def domain():
return {
'url': 'survey/<regex("[a-f0-9]{24}"):survey>/results',
'schema': ResultConfiguration.schema(),
'datasource': {
'source': 'results'
},
'resource_title': 'results',
'resource_methods': ['GET', 'POST'],
'item_methods': ['GET', 'DELETE'],
}
@staticmethod
def schema():
return {
'commiter': {
'type': 'dict',
'schema': {
'firstName': {
'type': 'string',
'minlength': 1,
},
'lastName': {
'type': 'string',
'minlength': 1,
'required': True,
}
},
'unique': True,
},
'survey': {
'type': 'objectid',
'data_relation': {
'resource': 'survey'
},
'required': True,
},
'responses': {
'type': 'list',
'schema': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'response_index': {
'type': 'integer',
},
'text': {
'type': 'string',
},
'valid': {
'type': 'boolean'
},
},
},
},
'required': True,
}
}
| 27.16 | 67 | 0.297005 | 103 | 2,037 | 5.825243 | 0.572816 | 0.075 | 0.046667 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013873 | 0.575356 | 2,037 | 74 | 68 | 27.527027 | 0.679769 | 0.021109 | 0 | 0.328125 | 0 | 0 | 0.20141 | 0.022659 | 0 | 0 | 0 | 0 | 0 | 1 | 0.046875 | true | 0 | 0.015625 | 0.046875 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0dff4e69c3bd7e000971db9965a685b09734712 | 153 | py | Python | 051 - 100/ex051.py | SocrammBR/Desafios-Python-CursoEmVideo | bd2454a24134500343ece91b936c169d3a66f89e | [
"MIT"
] | null | null | null | 051 - 100/ex051.py | SocrammBR/Desafios-Python-CursoEmVideo | bd2454a24134500343ece91b936c169d3a66f89e | [
"MIT"
] | null | null | null | 051 - 100/ex051.py | SocrammBR/Desafios-Python-CursoEmVideo | bd2454a24134500343ece91b936c169d3a66f89e | [
"MIT"
] | null | null | null | pt = int(input('Primeiro termo: '))
rz = int(input('Razão: '))
for c in range (pt, (pt + (10-1) * rz)+rz, rz):
print(f'{c}', end=" → ")
print('ACABOU') | 30.6 | 47 | 0.542484 | 27 | 153 | 3.111111 | 0.666667 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023622 | 0.169935 | 153 | 5 | 48 | 30.6 | 0.629921 | 0 | 0 | 0 | 0 | 0 | 0.227273 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0e017985ddca231d43a737d6e517583fd46747b | 516 | py | Python | Utils.py | statisticsnorway/raml-to-umlet-diagram | b8342ba78c00bbce44c27809f1cb1d9e2ed9a6cb | [
"Apache-2.0"
] | null | null | null | Utils.py | statisticsnorway/raml-to-umlet-diagram | b8342ba78c00bbce44c27809f1cb1d9e2ed9a6cb | [
"Apache-2.0"
] | null | null | null | Utils.py | statisticsnorway/raml-to-umlet-diagram | b8342ba78c00bbce44c27809f1cb1d9e2ed9a6cb | [
"Apache-2.0"
] | null | null | null | import pprint
import xml.etree.ElementTree as ET
class Utils():
def printDict(self, dict):
pp = pprint.PrettyPrinter(indent=4, compact=False)
pp.pprint(dict)
def printETree(self, eTree):
#print(ET.tostring(eTree, encoding='utf8', method='xml'))
print(ET.tostring(eTree, encoding='unicode', method='xml'))
#ElementTree.dump(root)
def printList(self, list):
strList = ""
for elem in list:
strList += elem + "\n"
print(strList)
| 27.157895 | 67 | 0.606589 | 61 | 516 | 5.131148 | 0.57377 | 0.051118 | 0.095847 | 0.127796 | 0.178914 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005222 | 0.257752 | 516 | 18 | 68 | 28.666667 | 0.81201 | 0.151163 | 0 | 0 | 0 | 0 | 0.027523 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.153846 | 0 | 0.461538 | 0.615385 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b0f109b2cc1a8d43d7cddf91f25602eaa7292c71 | 1,032 | py | Python | Pre-Interview Challenges/string_sum.py | Wryhder/solve-with-code | 0fd1ef4f1c46ad89d68a667b3aaa6b98c69da266 | [
"MIT"
] | null | null | null | Pre-Interview Challenges/string_sum.py | Wryhder/solve-with-code | 0fd1ef4f1c46ad89d68a667b3aaa6b98c69da266 | [
"MIT"
] | null | null | null | Pre-Interview Challenges/string_sum.py | Wryhder/solve-with-code | 0fd1ef4f1c46ad89d68a667b3aaa6b98c69da266 | [
"MIT"
] | null | null | null | # Andela
"""
Problem Statement
Write a program which accepts a sequence of comma-separated numbers from console and generate the string sum of the digits and the total sum in a list. Suppose the following input is supplied to the program: 34,67,55,33,12,98
Then, the output should be: [“3+4+6+7+5+5+3+3+1+2+9+8”, 56]
"""
def string_sum():
"""
This function accepts a sequence of comma-separated numbers from a console
and generates the string sum of the digits and the total sum in a list
"""
comma_sep = raw_input(">>>")
a_list = comma_sep.split(",")
a_string = "".join(a_list)
if a_string.isdigit == False:
print "Enter only numbers"
string_sum()
total = 0
string_sum = ""
for c in a_string:
total += int(c)
if c == (a_string[len(a_string) - 1]):
string_sum += c
break
temp = c + "+"
string_sum += temp
result = [string_sum, total]
print result
string_sum()
| 25.8 | 226 | 0.598837 | 156 | 1,032 | 3.852564 | 0.461538 | 0.134775 | 0.053245 | 0.0599 | 0.289517 | 0.289517 | 0.289517 | 0.289517 | 0.146423 | 0.146423 | 0 | 0.038674 | 0.29845 | 1,032 | 39 | 227 | 26.461538 | 0.791436 | 0.005814 | 0 | 0.105263 | 1 | 0 | 0.042279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.105263 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7c050b5dddc58a1b99c66ac8c5e5022b1b5bf3c0 | 4,422 | py | Python | packgen.py | Boundarybreaker/SwingChain | 6314b759303e1d9a08dd76a547e3cc05d33ab98a | [
"MIT"
] | null | null | null | packgen.py | Boundarybreaker/SwingChain | 6314b759303e1d9a08dd76a547e3cc05d33ab98a | [
"MIT"
] | null | null | null | packgen.py | Boundarybreaker/SwingChain | 6314b759303e1d9a08dd76a547e3cc05d33ab98a | [
"MIT"
] | null | null | null | # Generator used for all the Skinned Lanterns lantern variants!
lanterns = ["pufferfish", "zombie", "creeper", "skeleton", "wither_skeleton",
"bee", "jack_o_lantern", "ghost", "inky", "pinky", "blinky", "clyde", "pacman",
"paper_white", "paper_yellow", "paper_orange", "paper_blue", "paper_light_blue",
"paper_cyan", "paper_lime", "paper_green", "paper_red", "paper_pink",
"paper_brown", "paper_black", "paper_gray", "paper_light_gray", "paper_magenta",
"paper_purple", "guardian"]
shader_path = "assets/skinnedlanterns/materialmaps/"
handheld_path = "assets/skinnedlanterns/lights/item/"
material = """
{{
"defaultMaterial": "canvas:{type}_glow",
"variants": {{
"facing=north,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=north,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=north,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=north,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=east,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=east,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=east,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=east,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=south,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=south,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=south,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=south,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=west,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=west,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=west,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=west,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=up,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=up,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=up,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=up,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=down,hanging=false,waterlogged=true": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=down,hanging=false,waterlogged=false": {{ "defaultMaterial": "canvas:{type}_glow" }},
"facing=down,hanging=true,waterlogged=true": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }},
"facing=down,hanging=true,waterlogged=false": {{ "defaultMaterial": "swingchain:swinging_{type}_glow" }}
}}
}}
"""
normal_light = """
{
"intensity": 0.93,
"red": 1.0,
"green": 1.0,
"blue": 0.8,
"worksInFluid": false
}
"""
soul_light = """
{
"intensity": 0.93,
"red": 0.6,
"green": 0.8,
"blue": 1.0,
"worksInFluid": true
}
"""
for lantern in lanterns:
normal = open(shader_path + "block/" + lantern + "_lantern_block.json", "w")
soul = open(shader_path + "block/" + lantern + "_soul_lantern_block.json", "w")
normal_item = open(shader_path + "item/" + lantern + "_lantern_block.json", "w")
soul_item = open(shader_path + "item/" + lantern + "_soul_lantern_block.json", "w")
normal_handheld = open(handheld_path + lantern + "_lantern_block.json", "w")
soul_handheld = open(handheld_path + lantern + "_soul_lantern_block.json", "w")
normal.write(material.format(type = "warm"))
soul.write(material.format(type = "luminance"))
normal_item.write("{}")
soul_item.write("{}")
normal_handheld.write(normal_light)
soul_handheld.write(soul_light)
normal.close()
soul.close()
normal_item.close()
soul_item.close()
normal_handheld.close()
soul_handheld.close()
print("Filegen complete!")
| 50.827586 | 114 | 0.675486 | 478 | 4,422 | 6.046025 | 0.1841 | 0.069204 | 0.111419 | 0.13045 | 0.721107 | 0.673702 | 0.629412 | 0.575779 | 0.56263 | 0.279585 | 0 | 0.004697 | 0.133424 | 4,422 | 86 | 115 | 51.418605 | 0.749478 | 0.013795 | 0 | 0.090909 | 0 | 0.311688 | 0.77816 | 0.376233 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.012987 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9fec25bbe3e156e4bf72617f64c2dc32ae326fb6 | 387 | py | Python | recruit_app/ia/forms.py | tyler274/Recruitment-App | 439117cfc3e7ccc01064c688f295b01a542761fb | [
"BSD-3-Clause"
] | 8 | 2015-04-08T23:48:38.000Z | 2016-10-03T01:25:13.000Z | recruit_app/ia/forms.py | tyler274/Recruitment-App | 439117cfc3e7ccc01064c688f295b01a542761fb | [
"BSD-3-Clause"
] | 42 | 2015-03-31T14:48:49.000Z | 2015-11-26T18:18:46.000Z | recruit_app/ia/forms.py | tyler274/Recruitment-App | 439117cfc3e7ccc01064c688f295b01a542761fb | [
"BSD-3-Clause"
] | 5 | 2015-04-08T23:50:06.000Z | 2020-09-15T15:14:35.000Z | from flask_wtf import Form
from wtforms import StringField, TextAreaField, SubmitField
from wtforms.validators import DataRequired
class SubmitIssueForm(Form):
subject = StringField("Subject", validators=[DataRequired()])
body = TextAreaField("Body", validators=[DataRequired()])
logs = TextAreaField("Applicable Chat Logs")
submit = SubmitField(label='Submit Issue')
| 32.25 | 65 | 0.764858 | 39 | 387 | 7.564103 | 0.538462 | 0.074576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134367 | 387 | 11 | 66 | 35.181818 | 0.880597 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9ffbfd1f24433b6989ac05fe11fdfaf1c77bd016 | 1,114 | py | Python | db_create.py | yesan/openvpn-admin-ui | 0c054eb204421d62a89a2bbb8c8003eb9cf8b8d6 | [
"Apache-2.0"
] | 39 | 2018-07-18T09:11:34.000Z | 2022-03-07T13:32:00.000Z | db_create.py | yesan/openvpn-admin-ui | 0c054eb204421d62a89a2bbb8c8003eb9cf8b8d6 | [
"Apache-2.0"
] | 2 | 2018-09-07T08:58:58.000Z | 2022-01-23T22:45:11.000Z | db_create.py | sibosend/openvpn-admin-ui | 0c054eb204421d62a89a2bbb8c8003eb9cf8b8d6 | [
"Apache-2.0"
] | 16 | 2019-04-25T12:48:05.000Z | 2021-12-02T00:14:55.000Z | #!haoliVPNEnv/bin/python
from migrate.versioning import api
from application import db,create_app
import os
import subprocess
import sys
from firstApp.models import Role
app=create_app()
app.app_context().push()
db.create_all()
Role.insert_role()
if not os.path.exists(app.config['SQLALCHEMY_MIGRATE_REPO']):
api.create(app.config['SQLALCHEMY_MIGRATE_REPO'], 'database repository')
api.version_control(app.config['SQLALCHEMY_DATABASE_URI'], app.config['SQLALCHEMY_MIGRATE_REPO'])
else:
api.version_control(app.config['SQLALCHEMY_DATABASE_URI'], app.config['SQLALCHEMY_MIGRATE_REPO'], api.version(app.config['SQLALCHEMY_MIGRATE_REPO']))
# print DATABASE_URL
#
# try:
# output = subprocess.check_output('chown nobody:nobody ' + DATABASE_URL, stderr=subprocess.STDOUT)
# returncode = 0
# except subprocess.CalledProcessError as e:
# stderr = "command '{}' return with error (code {}): {}".format(e.cmd, e.returncode, e.output)
# returncode = -1
# except Exception, e:
# stderr = str(e)
# returncode = -1
#
# if returncode == 0:
# print 'success'
# else:
# print stderr
| 30.108108 | 153 | 0.731598 | 145 | 1,114 | 5.455172 | 0.42069 | 0.079646 | 0.168142 | 0.164349 | 0.316056 | 0.240202 | 0.19469 | 0.19469 | 0.19469 | 0.19469 | 0 | 0.004149 | 0.13465 | 1,114 | 36 | 154 | 30.944444 | 0.81639 | 0.406643 | 0 | 0 | 0 | 0 | 0.279503 | 0.25 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b0014c6dbbf9901708a16da7b6b7ae2f275ce80d | 662 | py | Python | pyinsteon/handlers/to_device/enter_unlinking_mode.py | michaeldavie/pyinsteon | e5b2e2910f4eff1474f158051fa71f75c2077dd6 | [
"MIT"
] | 15 | 2020-07-08T05:29:14.000Z | 2022-03-24T18:56:26.000Z | pyinsteon/handlers/to_device/enter_unlinking_mode.py | michaeldavie/pyinsteon | e5b2e2910f4eff1474f158051fa71f75c2077dd6 | [
"MIT"
] | 107 | 2019-06-03T09:23:02.000Z | 2022-03-31T23:12:38.000Z | pyinsteon/handlers/to_device/enter_unlinking_mode.py | michaeldavie/pyinsteon | e5b2e2910f4eff1474f158051fa71f75c2077dd6 | [
"MIT"
] | 16 | 2019-01-24T01:09:49.000Z | 2022-02-24T03:48:42.000Z | """Get Device Info command handler."""
from ...address import Address
from ...topics import ENTER_UNLINKING_MODE
from .direct_command import DirectCommandHandlerBase
class EnterUnlinkingModeCommand(DirectCommandHandlerBase):
"""Place a device in linking mode command handler."""
def __init__(self, address: Address):
"""Init the EnterUnlinkingModeCommand class."""
super().__init__(topic=ENTER_UNLINKING_MODE, address=address)
# pylint: disable=arguments-differ
async def async_send(self, group: int = 0):
"""Send the ENTER_UNLINKING_MODE request asyncronously."""
return await super().async_send(group=group)
| 36.777778 | 69 | 0.73716 | 74 | 662 | 6.364865 | 0.527027 | 0.089172 | 0.11465 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001802 | 0.161631 | 662 | 17 | 70 | 38.941176 | 0.846847 | 0.23565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b003bffe8650335acf9f55d3d98ea0154c292c5b | 1,347 | py | Python | test/conftest.py | Stranger6667/Flask-Postmark | 74818a6eb86278094c43235fd0faed9545569b9b | [
"MIT"
] | 6 | 2017-06-10T14:31:35.000Z | 2018-04-15T13:24:32.000Z | test/conftest.py | Stranger6667/Flask-Postmark | 74818a6eb86278094c43235fd0faed9545569b9b | [
"MIT"
] | 18 | 2017-05-12T08:50:18.000Z | 2020-09-01T15:31:34.000Z | test/conftest.py | Stranger6667/Flask-Postmark | 74818a6eb86278094c43235fd0faed9545569b9b | [
"MIT"
] | 1 | 2017-05-21T13:49:04.000Z | 2017-05-21T13:49:04.000Z | import pytest
from flask import Flask, json, request
from flask_postmark import Postmark
@pytest.fixture
def app(server_token, postmark_request):
app = Flask(__name__)
app.config["POSTMARK_SERVER_TOKEN"] = server_token
app.config["JSONIFY_PRETTYPRINT_REGULAR"] = False
postmark = Postmark(app)
def make_response():
return json.dumps(postmark_request.mock_calls[0][2]["json"])
@app.route("/token", methods=["GET"])
def token():
return postmark.client.server_token
@app.route("/send", methods=["POST"])
def send():
data = request.get_json()
postmark.send(**data)
return make_response()
@app.route("/is_same_client", methods=["POST"])
def is_same_client():
return json.dumps(postmark.client is postmark.client)
@app.route("/send_batch", methods=["POST"])
def send_batch():
data = request.get_json()
postmark.send_batch(*data)
return make_response()
return app
@pytest.fixture
def server_token():
return b"Foo"
@pytest.fixture
def test_client(app):
return app.test_client()
@pytest.fixture
def post(test_client):
def inner(url, data=None):
response = test_client.post(url, data=json.dumps(data), content_type="application/json")
return json.loads(response.data)
return inner
| 23.631579 | 96 | 0.669636 | 172 | 1,347 | 5.052326 | 0.267442 | 0.063291 | 0.073648 | 0.052934 | 0.069045 | 0.069045 | 0 | 0 | 0 | 0 | 0 | 0.001855 | 0.199703 | 1,347 | 56 | 97 | 24.053571 | 0.804267 | 0 | 0 | 0.2 | 0 | 0 | 0.091314 | 0.035635 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.075 | 0.125 | 0.575 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b00f1dd12d0a8ca1a614dcf27eedd99eb0f3b32e | 2,148 | py | Python | tests/test_api_network.py | er-vin/ComDaAn | 17f953d2b24953649605f066c6a2278631a9e909 | [
"Apache-2.0"
] | null | null | null | tests/test_api_network.py | er-vin/ComDaAn | 17f953d2b24953649605f066c6a2278631a9e909 | [
"Apache-2.0"
] | 4 | 2019-11-06T14:36:01.000Z | 2021-03-29T15:25:18.000Z | tests/test_api_network.py | er-vin/ComDaAn | 17f953d2b24953649605f066c6a2278631a9e909 | [
"Apache-2.0"
] | null | null | null | from comdaan import parse_issues, parse_mail, parse_repositories
from comdaan import network, Network
import os
PATH_TO_RESOURCES = os.path.join(os.path.dirname(__file__), "resources/")
def test_network_return_type():
repo = PATH_TO_RESOURCES + "repo"
if not os.listdir(repo):
raise Exception("Empty git submodule. Try: git submodule update --init")
data = parse_repositories(repo)
assert isinstance(network(data, "author_name", "files"), Network)
def test_network_on_repository_cols():
repo = PATH_TO_RESOURCES + "repo"
if not os.listdir(repo):
raise Exception("Empty git submodule. Try: git submodule update --init")
data = parse_repositories(repo)
a = network(data, "author_name", "files")
assert a.dataframe.columns.tolist() == ["centrality"]
def test_network_on_repository_row_count():
repo = PATH_TO_RESOURCES + "repo"
if not os.listdir(repo):
raise Exception("Empty git submodule. Try: git submodule update --init")
data = parse_repositories(repo)
a = network(data, "author_name", "files")
assert len(a.dataframe.index) == 36
def test_network_on_mailinglists_return_type():
data = parse_mail(PATH_TO_RESOURCES + "mailinglist.mbox")
assert isinstance(network(data, "sender_name", "references", "message_id"), Network)
def test_network_on_mailinglist_cols():
data = parse_mail(PATH_TO_RESOURCES + "mailinglist.mbox")
a = network(data, "sender_name", "references", "message_id")
assert a.dataframe.columns.tolist() == ["centrality"]
def test_network_on_mailinglist_row_count():
data = parse_mail(PATH_TO_RESOURCES + "mailinglist.mbox")
a = network(data, "sender_name", "references", "message_id")
assert len(a.dataframe.index) == 8
def test_network_on_issues_cols():
data = parse_issues(PATH_TO_RESOURCES + "issues.json")
a = network(data, "author", "discussion")
assert a.dataframe.columns.tolist() == ["centrality"]
def test_network_on_issues_row_count():
data = parse_issues(PATH_TO_RESOURCES + "issues.json")
a = network(data, "author", "discussion")
assert len(a.dataframe.index) == 30
| 35.213115 | 88 | 0.717877 | 282 | 2,148 | 5.195035 | 0.216312 | 0.03686 | 0.09215 | 0.076451 | 0.809556 | 0.675768 | 0.675768 | 0.648464 | 0.619113 | 0.619113 | 0 | 0.002762 | 0.157356 | 2,148 | 60 | 89 | 35.8 | 0.80663 | 0 | 0 | 0.604651 | 0 | 0 | 0.211359 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 1 | 0.186047 | false | 0 | 0.069767 | 0 | 0.255814 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b01aa872f5e6c7c2485155ec579855fcc634e0a3 | 1,105 | py | Python | gnome/gnome2/gedit/plugins.symlink/gedit-openfiles/GeditOpenFile.py | icebreaker/dotfiles | 5c3dc7f981069a728cc6b34ae39cd4c2da1122aa | [
"MIT"
] | 4 | 2015-03-17T14:36:49.000Z | 2019-06-10T09:34:35.000Z | gnome/gnome2/gedit/plugins.symlink/gedit-openfiles/GeditOpenFile.py | icebreaker/dotfiles | 5c3dc7f981069a728cc6b34ae39cd4c2da1122aa | [
"MIT"
] | null | null | null | gnome/gnome2/gedit/plugins.symlink/gedit-openfiles/GeditOpenFile.py | icebreaker/dotfiles | 5c3dc7f981069a728cc6b34ae39cd4c2da1122aa | [
"MIT"
] | 1 | 2019-03-01T13:21:55.000Z | 2019-03-01T13:21:55.000Z | from gedit import Plugin
from FileMonitor import FileMonitor
from DBWrapper import DBWrapper
from GeditOpenFileGui import GeditOpenFileGui
from Logger import log
from Config import Config
class GeditOpenFile(Plugin):
DATA_TAG = "GeditOpenFilePlugin"
def __init__(self):
Plugin.__init__(self)
# Create DB Wrapper and start the thread
self._db_wrapper = DBWrapper()
self._config = Config()
self._file_monitor = FileMonitor(self._db_wrapper,
self._config.root_path(), self._config)
def _get_instance(self, window):
return window.get_data(self.DATA_TAG)
def _set_instance(self, window, instance):
window.set_data(self.DATA_TAG, instance)
def activate(self, window):
self._set_instance(window, GeditOpenFileGui(self, window,
self._file_monitor, self._config))
def deactivate(self, window):
# self._get_instance(window).deactivate() # Longer supported in 2.28
self._set_instance(window, None)
def update_ui(self, window):
self._get_instance(window).update_ui()
| 28.333333 | 75 | 0.702262 | 134 | 1,105 | 5.492537 | 0.328358 | 0.081522 | 0.076087 | 0.040761 | 0.084239 | 0.084239 | 0 | 0 | 0 | 0 | 0 | 0.003452 | 0.213575 | 1,105 | 38 | 76 | 29.078947 | 0.843498 | 0.100452 | 0 | 0 | 0 | 0 | 0.019211 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0.24 | 0.04 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b01dc491fa86382b84a1088bdbfdbe094f6e886b | 1,497 | py | Python | testai_classifier/classifier_pb2_grpc.py | testdotai/classifier-client-python | d63c1406d10f73098eac3c7f0c1dacf4acdde46e | [
"Apache-2.0"
] | 12 | 2019-12-22T01:16:28.000Z | 2022-03-01T00:35:10.000Z | testai_classifier/classifier_pb2_grpc.py | testdotai/classifier-client-python | d63c1406d10f73098eac3c7f0c1dacf4acdde46e | [
"Apache-2.0"
] | 5 | 2020-03-12T20:38:44.000Z | 2021-06-02T01:18:11.000Z | testai_classifier/classifier_pb2_grpc.py | testdotai/classifier-client-python | d63c1406d10f73098eac3c7f0c1dacf4acdde46e | [
"Apache-2.0"
] | 2 | 2020-09-04T08:51:26.000Z | 2022-01-01T17:58:02.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
import classifier_pb2 as classifier__pb2
class ClassifierStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.ClassifyElements = channel.unary_unary(
'/Classifier/ClassifyElements',
request_serializer=classifier__pb2.ElementClassificationRequest.SerializeToString,
response_deserializer=classifier__pb2.ClassifiedElements.FromString,
)
class ClassifierServicer(object):
# missing associated documentation comment in .proto file
pass
def ClassifyElements(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ClassifierServicer_to_server(servicer, server):
rpc_method_handlers = {
'ClassifyElements': grpc.unary_unary_rpc_method_handler(
servicer.ClassifyElements,
request_deserializer=classifier__pb2.ElementClassificationRequest.FromString,
response_serializer=classifier__pb2.ClassifiedElements.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'Classifier', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
| 31.851064 | 90 | 0.763527 | 149 | 1,497 | 7.395973 | 0.416107 | 0.07078 | 0.08167 | 0.100726 | 0.157895 | 0.157895 | 0.157895 | 0.157895 | 0.110708 | 0.110708 | 0 | 0.004804 | 0.165665 | 1,497 | 46 | 91 | 32.543478 | 0.877502 | 0.189045 | 0 | 0.107143 | 1 | 0 | 0.083893 | 0.02349 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107143 | false | 0.107143 | 0.071429 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b039d0bddaa5f9a8fae190fb12bc5246fbcfa10f | 2,400 | py | Python | moff/parser/read_image.py | Tikubonn/moff | 1c363f60959138311068177fca177d0f0dc97380 | [
"MIT"
] | null | null | null | moff/parser/read_image.py | Tikubonn/moff | 1c363f60959138311068177fca177d0f0dc97380 | [
"MIT"
] | null | null | null | moff/parser/read_image.py | Tikubonn/moff | 1c363f60959138311068177fca177d0f0dc97380 | [
"MIT"
] | null | null | null |
from moff.builder import ImageBuilder, ImageAltAdapter, ImageTitleAdapter, ImageLinkAdapter, ImageCaseAdapter, ImageSrcCaseAdapter, ImageSizeCaseAdapter, ImageMediaCaseAdapter, ImageTypeCaseAdapter
from moff.attribute import Srcset, Sizes
from moff.util import read_until, read_whitespace, read_line, read_media_query, read_srcset_case, fix_path
# @image path
def read_image(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream).strip()
return ImageBuilder(src=fix_path(read1, options=options))
# @image @alt ...
def read_image_alt(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream).strip()
return ImageAltAdapter(read1)
# @image @title ...
def read_image_title(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream).strip()
return ImageTitleAdapter(read1)
# @image @link ...
def read_image_link(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream).strip()
return ImageLinkAdapter(fix_path(read1, options=options))
# @image @case
def read_image_case(preread, stream, indentation, parser, options=dict()):
return ImageCaseAdapter()
# @image @sizecase path width
def read_image_srccase(preread, stream, indentation, parser, options=dict()):
src = read_until(stream, " ", use_escape=True, eof_is_error=True).strip()
read_whitespace(stream)
width = read_srcset_case(stream)
return ImageSrcCaseAdapter(Srcset(src=fix_path(src), width=width))
# @image @case @type
def read_image_sizecase(preread, stream, indentation, parser, options=dict()):
read1 = read_media_query(stream, use_escape=True,
eof_is_error=True).strip()
read_whitespace(stream)
read2 = read_line(stream, use_escape=True, eof_is_error=True).strip()
if read2:
return ImageSizeCaseAdapter(Sizes(width=read2, media=read1))
else:
return ImageSizeCaseAdapter(Sizes(width=read1))
# @image @mediacase
def read_image_mediacase(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream, use_escape=True, eof_is_error=False)
return ImageMediaCaseAdapter(read1)
# @image @typecase
def read_image_typecase(preread, stream, indentation, parser, options=dict()):
read1 = read_line(stream, use_escape=True, eof_is_error=False)
return ImageTypeCaseAdapter(read1)
| 32 | 197 | 0.745 | 290 | 2,400 | 5.968966 | 0.196552 | 0.036395 | 0.062392 | 0.155979 | 0.478336 | 0.478336 | 0.395147 | 0.395147 | 0.366262 | 0.339688 | 0 | 0.008759 | 0.14375 | 2,400 | 74 | 198 | 32.432432 | 0.833577 | 0.065417 | 0 | 0.216216 | 0 | 0 | 0.000448 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.243243 | false | 0 | 0.081081 | 0.027027 | 0.594595 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b03d049623baac26deb441964f586638a9374f86 | 8,320 | py | Python | racer.py | nat-chan/BOAT_RACE | f171248ce9363a3af6d8725562900d3b353ffbdb | [
"MIT"
] | null | null | null | racer.py | nat-chan/BOAT_RACE | f171248ce9363a3af6d8725562900d3b353ffbdb | [
"MIT"
] | 1 | 2019-11-24T01:14:10.000Z | 2019-11-24T04:56:29.000Z | racer.py | nat-chan/BOAT_RACE | f171248ce9363a3af6d8725562900d3b353ffbdb | [
"MIT"
] | 2 | 2019-11-24T00:11:39.000Z | 2019-11-24T00:16:19.000Z | #!/usr/bin/env python3
import struct
layout = (
#key ,byte, descriptions
("登番" , 4 , "3415"),
("名前漢字" , 16 , "松井 繁"),
("名前カナ" , 15 , "マツイ シゲル"),
("支部" , 4 , "大阪"),
("級" , 2 , "A1"),
("年号" , 1 , "S(S:昭和, H:平成)"),
("生年月日" , 6 , "441111(昭和44年11月11日)"),
("性別" , 1 , "1(1:男, 2:女)"),
("年齢" , 2 , "44(才)"),
("身長" , 3 , "168(cm)"),
("体重" , 2 , "50(kg)"),
("血液型" , 2 , "O(型 A, B, AB, O)"),
("勝率" , 4 , "0756(7.56 小数点以下2桁)"),
("複勝率" , 4 , "0459(45.9 小数点以下1桁)"),
("1着回数" , 3 , "037(37回)"),
("2着回数" , 3 , "019(19回)"),
("出走回数" , 3 , "122(122回)"),
("優出回数" , 2 , "05(5回)"),
("優勝回数" , 2 , "02(2回)"),
("平均スタートタイミング" , 3 , "016(0.16 小数点以下2桁)"),
("1コース進入回数" , 3 , "046(46回)"),
("1コース複勝率" , 4 , "0739(73.9 小数点以下1桁)"),
("1コース平均スタートタイミング" , 3 , "015(0.15 小数点以下2桁)"),
("1コース平均スタート順位" , 3 , "240(2.40 小数点以下2桁)"),
("2コース進入回数" , 3 , "028(28回)"),
("2コース複勝率" , 4 , "0429(42.9 小数点以下1桁)"),
("2コース平均スタートタイミング" , 3 , "015(0.15 小数点以下2桁)"),
("2コース平均スタート順位" , 3 , "270(2.70 小数点以下2桁)"),
("3コース進入回数" , 3 , "024(24回)"),
("3コース複勝率" , 4 , "0417(41.7 小数点以下1桁)"),
("3コース平均スタートタイミング" , 3 , "016(0.16 小数点以下2桁)"),
("3コース平均スタート順位" , 3 , "270(2.70 小数点以下2桁)"),
("4コース進入回数" , 3 , "017(17回)"),
("4コース複勝率" , 4 , "0471(47.1 小数点以下1桁)"),
("4コース平均スタートタイミング" , 3 , "015(0.15 小数点以下2桁)"),
("4コース平均スタート順位" , 3 , "320(3.20 小数点以下2桁)"),
("5コース進入回数" , 3 , "009(09回)"),
("5コース複勝率" , 4 , "0222(22.2 小数点以下1桁)"),
("5コース平均スタートタイミング" , 3 , "020(0.20 小数点以下2桁)"),
("5コース平均スタート順位" , 3 , "330(3.30 小数点以下2桁)"),
("6コース進入回数" , 3 , "000(00回)"),
("6コース複勝率" , 4 , "0000(00.0 小数点以下1桁)"),
("6コース平均スタートタイミング" , 3 , "000(0.00 小数点以下2桁)"),
("6コース平均スタート順位" , 3 , "000(0.00 小数点以下2桁)"),
("前期級" , 2 , "A1"),
("前々期級" , 2 , "A1"),
("前々々期級" , 2 , "A1"),
("前期能力指数" , 4 , "7400(74.00 小数点以下2桁)"),
("今期能力指数" , 4 , "7500(75.00 小数点以下2桁)"),
("年" , 4 , "2014(2014年2期の成績)"),
("期" , 1 , "2(〃)"),
("算出期間(自)" , 8 , "20131101(2013年11月01日)"),
("算出期間(至)" , 8 , "20140430(2014年04月30日)"),
("養成期" , 3 , "064(64期)"),
("1コース1着回数" , 3 , "046(46回)"),
("1コース2着回数" , 3 , "046(46回)"),
("1コース3着回数" , 3 , "046(46回)"),
("1コース4着回数" , 3 , "046(46回)"),
("1コース5着回数" , 3 , "046(46回)"),
("1コース6着回数" , 3 , "046(46回)"),
("1コースF回数" , 2 , "01(1回)"),
("1コースL0回数" , 2 , "01(1回)"),
("1コースL1回数" , 2 , "01(1回)"),
("1コースK0回数" , 2 , "01(1回)"),
("1コースK1回数" , 2 , "01(1回)"),
("1コースS0回数" , 2 , "01(1回)"),
("1コースS1回数" , 2 , "01(1回)"),
("1コースS2回数" , 2 , "01(1回)"),
("2コース1着回数" , 3 , "046(46回)"),
("2コース2着回数" , 3 , "046(46回)"),
("2コース3着回数" , 3 , "046(46回)"),
("2コース4着回数" , 3 , "046(46回)"),
("2コース5着回数" , 3 , "046(46回)"),
("2コース6着回数" , 3 , "046(46回)"),
("2コースF回数" , 2 , "01(1回)"),
("2コースL0回数" , 2 , "01(1回)"),
("2コースL1回数" , 2 , "01(1回)"),
("2コースK0回数" , 2 , "01(1回)"),
("2コースK1回数" , 2 , "01(1回)"),
("2コースS0回数" , 2 , "01(1回)"),
("2コースS1回数" , 2 , "01(1回)"),
("2コースS2回数" , 2 , "01(1回)"),
("3コース1着回数" , 3 , "046(46回)"),
("3コース2着回数" , 3 , "046(46回)"),
("3コース3着回数" , 3 , "046(46回)"),
("3コース4着回数" , 3 , "046(46回)"),
("3コース5着回数" , 3 , "046(46回)"),
("3コース6着回数" , 3 , "046(46回)"),
("3コースF回数" , 2 , "01(1回)"),
("3コースL0回数" , 2 , "01(1回)"),
("3コースL1回数" , 2 , "01(1回)"),
("3コースK0回数" , 2 , "01(1回)"),
("3コースK1回数" , 2 , "01(1回)"),
("3コースS0回数" , 2 , "01(1回)"),
("3コースS1回数" , 2 , "01(1回)"),
("3コースS2回数" , 2 , "01(1回)"),
("4コース1着回数" , 3 , "046(46回)"),
("4コース2着回数" , 3 , "046(46回)"),
("4コース3着回数" , 3 , "046(46回)"),
("4コース4着回数" , 3 , "046(46回)"),
("4コース5着回数" , 3 , "046(46回)"),
("4コース6着回数" , 3 , "046(46回)"),
("4コースF回数" , 2 , "01(1回)"),
("4コースL0回数" , 2 , "01(1回)"),
("4コースL1回数" , 2 , "01(1回)"),
("4コースK0回数" , 2 , "01(1回)"),
("4コースK1回数" , 2 , "01(1回)"),
("4コースS0回数" , 2 , "01(1回)"),
("4コースS1回数" , 2 , "01(1回)"),
("4コースS2回数" , 2 , "01(1回)"),
("5コース1着回数" , 3 , "046(46回)"),
("5コース2着回数" , 3 , "046(46回)"),
("5コース3着回数" , 3 , "046(46回)"),
("5コース4着回数" , 3 , "046(46回)"),
("5コース5着回数" , 3 , "046(46回)"),
("5コース6着回数" , 3 , "046(46回)"),
("5コースF回数" , 2 , "01(1回)"),
("5コースL0回数" , 2 , "01(1回)"),
("5コースL1回数" , 2 , "01(1回)"),
("5コースK0回数" , 2 , "01(1回)"),
("5コースK1回数" , 2 , "01(1回)"),
("5コースS0回数" , 2 , "01(1回)"),
("5コースS1回数" , 2 , "01(1回)"),
("5コースS2回数" , 2 , "01(1回)"),
("6コース1着回数" , 3 , "046(46回)"),
("6コース2着回数" , 3 , "046(46回)"),
("6コース3着回数" , 3 , "046(46回)"),
("6コース4着回数" , 3 , "046(46回)"),
("6コース5着回数" , 3 , "046(46回)"),
("6コース6着回数" , 3 , "046(46回)"),
("6コースF回数" , 2 , "01(1回)"),
("6コースL0回数" , 2 , "01(1回)"),
("6コースL1回数" , 2 , "01(1回)"),
("6コースK0回数" , 2 , "01(1回)"),
("6コースK1回数" , 2 , "01(1回)"),
("6コースS0回数" , 2 , "01(1回)"),
("6コースS1回数" , 2 , "01(1回)"),
("6コースS2回数" , 2 , "01(1回)"),
("コースなしL0回数" , 2 , "01(1回)"),
("コースなしL1回数" , 2 , "01(1回)"),
("コースなしK0回数" , 2 , "01(1回)"),
("コースなしK1回数" , 2 , "01(1回)"),
("出身地" , 6 , "大阪"),
)
def load_fanXXXX(filename):
racers = dict()
with open(filename, "rb") as f:
while True:
try:
racer_id = int(f.read(layout[0][1]))
except ValueError:
break
X = dict()
for line in layout[1:]:
X_key = line[0]
X_value_buf = f.read(line[1])
try:
X_value = float(X_value_buf)
X.update({X_key: X_value})
except ValueError:
if X_key == "級":
if X_value_buf == b'A1':
X_value = 4
elif X_value_buf == b'A2':
X_value = 3
elif X_value_buf == b'B1':
X_value = 2
elif X_value_buf == b'B2':
X_value = 1
else:
X_value = 0
X.update({X_key: X_value})
pass
# X_value = X_value_buf.decode("shift-jis")
racers.update({racer_id: X})
f.read(2) # "\r\n"
return racers
if __name__ == "__main__":
racers = load_fanXXXX("./racer/fan1810.txt")
import IPython;IPython.embed()
| 41.393035 | 62 | 0.325481 | 776 | 8,320 | 3.440722 | 0.393041 | 0.058427 | 0.097378 | 0.014981 | 0.075655 | 0.012734 | 0 | 0 | 0 | 0 | 0 | 0.224099 | 0.466346 | 8,320 | 200 | 63 | 41.6 | 0.377027 | 0.016707 | 0 | 0.033149 | 0 | 0 | 0.289384 | 0.005137 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005525 | false | 0.005525 | 0.01105 | 0 | 0.022099 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b05d319137c129f4d609fbc3266074c123b0b121 | 354 | py | Python | md-graph/mdfile.py | barrettotte/md-graph | 3f70e064fa71f906232fb68493aebe137333cca8 | [
"MIT"
] | 1 | 2022-02-15T12:19:13.000Z | 2022-02-15T12:19:13.000Z | md-graph/mdfile.py | barrettotte/md-graph | 3f70e064fa71f906232fb68493aebe137333cca8 | [
"MIT"
] | null | null | null | md-graph/mdfile.py | barrettotte/md-graph | 3f70e064fa71f906232fb68493aebe137333cca8 | [
"MIT"
] | 1 | 2021-09-26T11:17:52.000Z | 2021-09-26T11:17:52.000Z |
class MdFile():
def __init__(self, file_path, base_name, title, mdlinks):
self.uid = 0
self.file_path = file_path
self.base_name = base_name
self.title = title if title else base_name
self.mdlinks = mdlinks
def __str__(self):
return f'{self.uid}: {self.file_path}, {self.title}, {self.mdlinks}'
| 27.230769 | 76 | 0.624294 | 49 | 354 | 4.183673 | 0.367347 | 0.156098 | 0.17561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003831 | 0.262712 | 354 | 12 | 77 | 29.5 | 0.781609 | 0 | 0 | 0 | 0 | 0 | 0.164306 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0.111111 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
c66648d485cda735f947935ba42b6ee02d483a2c | 990 | py | Python | tests/apps/app1/config/auth.py | coboyoshi/uvicore | 9cfdeeac83000b156fe48f068b4658edaf51c8de | [
"MIT"
] | 11 | 2021-03-22T22:07:49.000Z | 2022-03-08T16:18:33.000Z | tests/apps/app1/config/auth.py | coboyoshi/uvicore | 9cfdeeac83000b156fe48f068b4658edaf51c8de | [
"MIT"
] | 12 | 2021-03-04T05:51:24.000Z | 2021-09-22T05:16:18.000Z | tests/apps/app1/config/auth.py | coboyoshi/uvicore | 9cfdeeac83000b156fe48f068b4658edaf51c8de | [
"MIT"
] | 2 | 2021-03-25T14:49:56.000Z | 2021-11-17T23:20:29.000Z | config = {
# --------------------------------------------------------------------------
# Database Connections
# --------------------------------------------------------------------------
'database': {
'default': 'auth',
'connections': {
# SQLite
# 'auth': {
# 'driver': 'sqlite',
# 'dialect': None,
# 'host': None,
# 'port': None,
# 'database': ':memory',
# 'username': None,
# 'password': None,
# 'prefix': 'auth_',
# },
# MySQL
'auth': {
'driver': 'mysql',
'dialect': 'pymysql',
'host': '127.0.0.1',
'port': 3306,
'database': 'uvicore_test',
'username': 'root',
'password': 'techie',
'prefix': 'auth_',
},
},
},
}
| 29.117647 | 80 | 0.264646 | 47 | 990 | 5.510638 | 0.531915 | 0.07722 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018315 | 0.448485 | 990 | 33 | 81 | 30 | 0.456044 | 0.374747 | 0 | 0 | 0 | 0 | 0.220564 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c66ab303e4b8b5c14338155fea9dd8e461152247 | 1,401 | py | Python | status/migrations/0002_auto_20200807_2031.py | Ulorewien/covid19-india | 1e44d4b4f01207f4b03d43b075bcbff0735bae19 | [
"MIT"
] | 1 | 2020-11-29T13:17:17.000Z | 2020-11-29T13:17:17.000Z | status/migrations/0002_auto_20200807_2031.py | Ulorewien/covid19-india | 1e44d4b4f01207f4b03d43b075bcbff0735bae19 | [
"MIT"
] | 9 | 2020-06-25T19:03:16.000Z | 2021-07-15T17:15:47.000Z | status/migrations/0002_auto_20200807_2031.py | Ulorewien/covid19-india | 1e44d4b4f01207f4b03d43b075bcbff0735bae19 | [
"MIT"
] | 8 | 2020-06-25T11:15:21.000Z | 2021-10-04T15:43:43.000Z | # Generated by Django 2.2.13 on 2020-08-07 15:01
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('status', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='StateData',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('location', models.CharField(max_length=128)),
('confirmed', models.BigIntegerField()),
('confirmed_today', models.BigIntegerField()),
('active', models.BigIntegerField()),
('active_today', models.BigIntegerField()),
('recovered', models.BigIntegerField()),
('recovered_today', models.BigIntegerField()),
('deaths', models.BigIntegerField()),
('deaths_today', models.BigIntegerField()),
('tests', models.BigIntegerField()),
('timestamp', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
],
options={
'ordering': ['-active', 'timestamp'],
},
),
migrations.DeleteModel(
name='PastData',
),
migrations.DeleteModel(
name='State',
),
]
| 34.170732 | 114 | 0.533191 | 108 | 1,401 | 6.805556 | 0.546296 | 0.257143 | 0.141497 | 0.070748 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024313 | 0.324768 | 1,401 | 40 | 115 | 35.025 | 0.752643 | 0.032834 | 0 | 0.147059 | 1 | 0 | 0.133777 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.029412 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6977156ce630820b330edb126c34ccc390c5d83 | 396 | py | Python | src/api/error/basic.py | sonlhcsuit/togo | 68e79e1df3ac5b9b8b834a53345028f332abbda8 | [
"MIT"
] | null | null | null | src/api/error/basic.py | sonlhcsuit/togo | 68e79e1df3ac5b9b8b834a53345028f332abbda8 | [
"MIT"
] | null | null | null | src/api/error/basic.py | sonlhcsuit/togo | 68e79e1df3ac5b9b8b834a53345028f332abbda8 | [
"MIT"
] | null | null | null | from werkzeug.exceptions import HTTPException
from src.util import logger
class HTTPError(HTTPException):
def __init__(self, code: int, description: str):
self.code = code
self.description = description
def default_error_handler(error: HTTPError):
logger.error(str(error), exc_info=error)
return {"error": error.name, "description": error.description}, error.code
| 28.285714 | 78 | 0.734848 | 48 | 396 | 5.916667 | 0.5 | 0.056338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164141 | 396 | 13 | 79 | 30.461538 | 0.858006 | 0 | 0 | 0 | 0 | 0 | 0.040404 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c69cd4238b2cc19db8c5358eb3788d864ac787f6 | 1,886 | py | Python | metrics.py | SuhasShanbhogue/Sport-Prediction | ed9e1d693be99bfe6bda665866fe2aba97569a0f | [
"MIT"
] | 1 | 2020-12-11T03:00:58.000Z | 2020-12-11T03:00:58.000Z | metrics.py | SuhasShanbhogue/Sport-Prediction | ed9e1d693be99bfe6bda665866fe2aba97569a0f | [
"MIT"
] | null | null | null | metrics.py | SuhasShanbhogue/Sport-Prediction | ed9e1d693be99bfe6bda665866fe2aba97569a0f | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import confusion_matrix
from sklearn.metrics import roc_auc_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score
from sklearn.metrics import roc_curve
from matplotlib import pyplot
import seaborn as sn
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.metrics import plot_confusion_matrix
%matplotlib inline
def metrics(y_true,y_pred,y_probs):
roc = roc_auc_score(y_true,y_probs)
print("The ROC AUC Score is: {}".format(roc))
cf = confusion_matrix(y_true,y_preds)
print("The Confusion Matrix is:")
print(cf)
f1score = f1_score(y_true, y_preds)
print("The F1 score is: {}".format(f1score))
fpr,tpr,_ = roc_curve(y_true,y_preds)
print("ROC_AUC curve")
pyplot.plot(fpr, tpr, marker='.', label='ROC_AUC')
def metrics_combined(comb_y_true,comb_y_pred,comb_y_probs):
roc =0
f1score =0
roc_l=[]
f1_score_l=[]
cf_total=0
for i in range(4):
roc += roc_auc_score(comb_y_true[i],comb_y_probs[i])
roc_l.append(roc_auc_score(comb_y_true[i],comb_y_probs[i]))
f1score += f1_score(comb_y_true[i],comb_y_pred[i])
f1_score_l.append(f1_score(comb_y_true[i],comb_y_pred[i]))
cf = confusion_matrix(comb_y_true[i],comb_y_pred[i])
cf_total += cf
print("Mean ROC score:{}".format(roc/4))
print("Mean F1 score{}".format(f1score/4))
print("Confusion Matrix for all:")
print(cf_total)
x = list(range(1,5))
plt.plot(x,roc_l,label='ROC')
plt.plot(x,f1_score_l,label='F1_score')
plt.ylabel('Split Vs ROC')
plt.show() | 30.918033 | 63 | 0.749205 | 314 | 1,886 | 4.264331 | 0.232484 | 0.048544 | 0.040329 | 0.089619 | 0.250934 | 0.198656 | 0.170276 | 0.170276 | 0.170276 | 0.088125 | 0 | 0.014233 | 0.14316 | 1,886 | 61 | 64 | 30.918033 | 0.814356 | 0 | 0 | 0.075472 | 0 | 0 | 0.08903 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.358491 | null | null | 0.169811 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c6b5278f4bab4feaa6421b7cfd1134a5c362d276 | 5,939 | py | Python | ggongsul/membership/migrations/0001_initial.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | 2 | 2021-05-22T07:33:34.000Z | 2021-09-18T04:22:25.000Z | ggongsul/membership/migrations/0001_initial.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | null | null | null | ggongsul/membership/migrations/0001_initial.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | null | null | null | # Generated by Django 3.1.3 on 2021-01-10 17:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="Subscription",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"validity_days",
models.IntegerField(default=30, verbose_name="구독 유효 기간(일수)"),
),
("started_at", models.DateTimeField(verbose_name="구독 혜택 시작 날짜")),
("ended_at", models.DateTimeField(verbose_name="구독 혜택 종료 날짜")),
(
"created_on",
models.DateTimeField(auto_now_add=True, verbose_name="생성 날짜"),
),
(
"updated_on",
models.DateTimeField(auto_now=True, verbose_name="최근 정보 변경 날짜"),
),
(
"member",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="subscriptions",
to=settings.AUTH_USER_MODEL,
verbose_name="사용자",
),
),
],
options={
"verbose_name": "구독 정보",
"verbose_name_plural": "구독 정보",
"ordering": ["-ended_at"],
},
),
migrations.CreateModel(
name="Payment",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"payment_uid",
models.CharField(
max_length=64, unique=True, verbose_name="결제 고유 id"
),
),
(
"payment_type",
models.IntegerField(choices=[(1, "카카오페이")], verbose_name="결제 수단"),
),
("imp_uid", models.CharField(max_length=64, verbose_name="아임포트 고유 id")),
("amount", models.IntegerField(verbose_name="결제 금액")),
(
"canceled_amount",
models.IntegerField(default=0, verbose_name="결제 취소 금액"),
),
("paid_at", models.DateTimeField(verbose_name="결제 날짜")),
(
"canceled_at",
models.DateTimeField(null=True, verbose_name="결제 취소 날짜"),
),
(
"created_on",
models.DateTimeField(auto_now_add=True, verbose_name="생성 날짜"),
),
(
"updated_on",
models.DateTimeField(auto_now=True, verbose_name="최근 정보 변경 날짜"),
),
(
"subscription",
models.OneToOneField(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="payment",
to="membership.subscription",
verbose_name="구독 정보",
),
),
],
options={
"verbose_name": "결제 정보",
"verbose_name_plural": "결제 정보",
"ordering": ["-id"],
},
),
migrations.CreateModel(
name="Membership",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"is_active",
models.BooleanField(
default=False,
help_text="멤버십이 활성화 되어있지 않더라도 구독 혜택이 남아 있을 수 있습니다. 멤버십 활성화 여부는 다음달 구독 연장 여부를 결정합니다.",
verbose_name="멤버십 활성화 여부",
),
),
(
"last_activated_at",
models.DateTimeField(null=True, verbose_name="최근 활성화 날짜"),
),
(
"last_deactivated_at",
models.DateTimeField(null=True, verbose_name="최근 비활성화 날짜"),
),
(
"created_on",
models.DateTimeField(auto_now_add=True, verbose_name="생성 날짜"),
),
(
"updated_on",
models.DateTimeField(auto_now=True, verbose_name="최근 정보 변경 날짜"),
),
(
"member",
models.OneToOneField(
on_delete=django.db.models.deletion.CASCADE,
related_name="membership",
to=settings.AUTH_USER_MODEL,
verbose_name="멤버십 정보",
),
),
],
options={
"verbose_name": "멤버십 정보",
"verbose_name_plural": "멤버십 정보",
"ordering": ["-id"],
},
),
]
| 34.935294 | 109 | 0.388281 | 439 | 5,939 | 5.05467 | 0.291572 | 0.153673 | 0.067598 | 0.067598 | 0.492114 | 0.477693 | 0.438035 | 0.356918 | 0.319063 | 0.2758 | 0 | 0.007986 | 0.51507 | 5,939 | 169 | 110 | 35.142012 | 0.7625 | 0.007577 | 0 | 0.512346 | 1 | 0 | 0.125255 | 0.003904 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.018519 | 0 | 0.04321 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6baeb98fe4d72607618df5335df3aa9a5ea4d95 | 375 | py | Python | 01_python/exercices/ex5.py | icecodder/nsi | eeb08932c1aa11f31bbdaae01361a526c5279527 | [
"MIT"
] | 4 | 2021-09-24T16:19:06.000Z | 2021-10-06T16:21:53.000Z | 01_python/exercices/ex5.py | icecodder/nsi | eeb08932c1aa11f31bbdaae01361a526c5279527 | [
"MIT"
] | 1 | 2021-10-06T16:25:25.000Z | 2021-11-28T08:11:14.000Z | 01_python/exercices/ex5.py | icecodder/nsi | eeb08932c1aa11f31bbdaae01361a526c5279527 | [
"MIT"
] | null | null | null | from math import sqrt
def f(a, b):
print(f"{'a':<10}{'b':<10}{'b-a':<10}{'m':<10}{'m²':<15}{'Test m² > a'}")
while b - a > 0.1:
m = (a + b) / 2
print(f"{a:<10}{b:<10}{b-a:<10}{m:<10}{m ** 2:<15}{m ** 2 > a!s}")
if m ** 2 > sqrt(3):
b = m
else:
a = m
return(a, b)
print(f(1, 2)) # -> 1.375, 1.4375
| 19.736842 | 77 | 0.362667 | 72 | 375 | 1.888889 | 0.347222 | 0.088235 | 0.102941 | 0.117647 | 0.279412 | 0.279412 | 0.279412 | 0.279412 | 0.279412 | 0.279412 | 0 | 0.163265 | 0.346667 | 375 | 18 | 78 | 20.833333 | 0.391837 | 0.042667 | 0 | 0 | 0 | 0.166667 | 0.333333 | 0.240896 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.083333 | 0 | 0.166667 | 0.25 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6bbfca8711c6082de6863c5efbbbcfe325d8d17 | 342 | py | Python | ClassifyMembranes/setup_parallel.py | Rhoana/rhoana | b4027a57451d175ea02c2c7ef472cf9c4e1a0efc | [
"MIT"
] | 26 | 2015-01-08T08:30:10.000Z | 2021-07-08T06:21:35.000Z | ClassifyMembranes/setup_parallel.py | andudu/rhoana | b4027a57451d175ea02c2c7ef472cf9c4e1a0efc | [
"MIT"
] | null | null | null | ClassifyMembranes/setup_parallel.py | andudu/rhoana | b4027a57451d175ea02c2c7ef472cf9c4e1a0efc | [
"MIT"
] | 11 | 2015-01-23T23:30:26.000Z | 2019-02-06T21:56:33.000Z | from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import numpy as np
setup(
cmdclass = {'build_ext': build_ext},
ext_modules = [Extension("rf_classify_parallel", ["rf_classify_parallel.pyx"])],
extra_compile_args=['/openmp'],
include_dirs = [np.get_include()]
) | 31.090909 | 84 | 0.751462 | 45 | 342 | 5.444444 | 0.555556 | 0.097959 | 0.146939 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134503 | 342 | 11 | 85 | 31.090909 | 0.827703 | 0 | 0 | 0 | 0 | 0 | 0.174927 | 0.069971 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c6be912b01bc8016f81eb1d37c8741c3a348dcb9 | 2,780 | py | Python | song_generator/song_data.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | song_generator/song_data.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | song_generator/song_data.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | '''
# song_data.py
# William Ponton
# 5.8.21
# Dictionaries and other data structures for use in the Song class
song_data dictionary:
Keys:
- genre
- tempo
- time_signature
- key_signature
- chord_progression
- drum_machines
- instruments
- pedals
- synths
- samplers
'''
# import packages
import random
from collections import defaultdict
random.seed()
''''TODO - make the data into a JSON file and parse the JSON object into a Python Dictionary
import json
with open('my_dict.json', 'w') as f:
json.dump(my_dict, f)
# elsewhere...to load the file
with open('my_dict.json') as f:
my_dict = json.load(f)
'''
# test_output
def test_output():
test = "Hello world ~ from song_data.py"
return print(test)
# collections
# lists
genres = ['Jazz','Rock','Metal','Electronic','SynthPop','Future Funk','Electro-Boogie','Vaporwave','Lo-Fi','Chill','Ambient','Psychedelic','Blues-Rock', 'Afro-Beat', 'Latin-Jazz']
time_signatures = ['4/4','3/4','2/4','7/8','6/8','5/8']
key_signatures = ['A major','B major','C major','D major','E major','F major','G major','a minor','b minor','c minor','d minor','e minor','f minor','g minor']
drum_machines = ['Volca Beats','PO-12','Arturia Drumbrute','Volca Sample','Beat Thang','Elektron Digitakt','Elektron Model:Samples', 'DAW']
instruments = ['Gibson SG','Fender Telecaster (modified)','Ibanez RG (Blue Flame)','Ibanez RG (Burst)','Washburn Acoustic','Michael Kelly Mandolin']
pedals = ['Catalinbread Octopussy','Catalinbread Topanga','Keeley Neutrino','Keeley Sfocato','Wampler Ego Compressor','Walrus Audio Slo','Benson Preamp','Keeley Caverns','JHS Unicorn','JHS Kodiak','Electro Harmonix Small Stone','Line 6 Delay','Keeley 4-Knob Compressor']
synths = ['Arturia MicroBrute','Korg Volca Keys','Korg Volca FM','Elektron DigiTone','Electron Heat','Elektron Model:Cycles','Korg Volca Bass','Korg Monotron','Korg Monotron Duo','Korg Monotron Delay','Teenage Engineering PO-14 Sub','Teenage Engineering PO-12 Drum','Teenage Engineering PO-16 Factory','Teenage Engineering PO-24 Office','Teenage Engineering PO-20 Arcade','Teenage Engineering PO-28 Robot', 'DAW']
samplers = ['Korg Volca Sample','Elektron DigiTakt','Eketron Model:Sample','Beat Thang','DAW']
#TODO - make a list of lists lie chord_progressions = [[1 [I, V, VI, VII]], [2, [V, VII, ii, IV]]
# list of lists
chord_progressions = ['I','IV','III','V','VII','VI']
# song_data_dict
# a dictionary that holds all the song parameter data
song_data_dict = {
'genres' : genres,
'tempos' : random.randrange(75,150),
'time_signatures' : time_signatures,
'key_signatures' : key_signatures,
'chord_progressions' : chord_progressions,
'drum_machines' : drum_machines,
'instruments' : instruments,
'pedals' : pedals,
'synths' : synths,
'samplers' : samplers,
}
| 40.289855 | 413 | 0.710791 | 395 | 2,780 | 4.926582 | 0.493671 | 0.055498 | 0.061665 | 0.014388 | 0.018499 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01599 | 0.122662 | 2,780 | 68 | 414 | 40.882353 | 0.781878 | 0.17554 | 0 | 0 | 0 | 0 | 0.613208 | 0 | 0 | 0 | 0 | 0.029412 | 0 | 1 | 0.037037 | false | 0 | 0.074074 | 0 | 0.148148 | 0.037037 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6c1049d0dac03c34d9a49593fb2d8ae72a380b7 | 459 | py | Python | memoirs.py | bunchesofdonald/sixwordmemoirbot | 0b52c3ee504b8deb953332f4a0a99066e22dfcea | [
"MIT"
] | null | null | null | memoirs.py | bunchesofdonald/sixwordmemoirbot | 0b52c3ee504b8deb953332f4a0a99066e22dfcea | [
"MIT"
] | null | null | null | memoirs.py | bunchesofdonald/sixwordmemoirbot | 0b52c3ee504b8deb953332f4a0a99066e22dfcea | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import json
import re
from markov import markov, generate_text
with open('memoirs.json') as infile:
memoirs = json.load(infile)
memoirs = [re.sub('[",\.!?-]+', '', m).lower() for m in memoirs]
chain = None
for m in memoirs:
chain = markov(m, chain)
text = generate_text(chain)
while " ".join(text) in memoirs or len(text) != 6:
text = generate_text(chain)
print " ".join(text).capitalize()
| 19.956522 | 68 | 0.642702 | 67 | 459 | 4.358209 | 0.507463 | 0.123288 | 0.041096 | 0.089041 | 0.123288 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005333 | 0.183007 | 459 | 22 | 69 | 20.863636 | 0.773333 | 0.091503 | 0 | 0.153846 | 1 | 0 | 0.057831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.230769 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6c8f681f33bdd7eef0b3fe219495ba9ea9fbb15 | 395 | py | Python | ASE20_data/evaluation/baselines/b3.py | henryhchchc/MockSniffer | 6d6e845616004ca77ce6d73709cff94a1a32c6c5 | [
"MIT"
] | 4 | 2020-09-03T13:20:13.000Z | 2021-07-14T08:04:26.000Z | ASE20_data/evaluation/baselines/b3.py | henryhchchc/MockSniffer | 6d6e845616004ca77ce6d73709cff94a1a32c6c5 | [
"MIT"
] | null | null | null | ASE20_data/evaluation/baselines/b3.py | henryhchchc/MockSniffer | 6d6e845616004ca77ce6d73709cff94a1a32c6c5 | [
"MIT"
] | null | null | null | def baseline3(X):
return (
X['ABS']
| X['INT']
| X['UINT']
| (X['TDEP'] > X['TDEP'].mean())
| (X['FIELD'] > X['FIELD'].mean())
| ((X['UAPI']+X['TUAPI']) > (X['UAPI']+X['TUAPI']).mean())
| (X['EXPCAT'] > 0)
| (X['RBFA'] > 0)
| (X['CONDCALL'] > 0)
| (X['SYNC'] > X['SYNC'].mean())
| (X['AFPR'] > 0)
)
| 26.333333 | 66 | 0.334177 | 46 | 395 | 2.869565 | 0.391304 | 0.151515 | 0.090909 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019455 | 0.349367 | 395 | 14 | 67 | 28.214286 | 0.494163 | 0 | 0 | 0 | 0 | 0 | 0.192405 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0 | 0.071429 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6ca9f18216caa165d46b77b5522acc1d6aa1f07 | 813 | py | Python | practice_problems/sorting/search_in_rotated_array.py | YazzyYaz/codinginterviews | b8115c43507d738bccd90366f2bc02867ba7f13f | [
"MIT"
] | null | null | null | practice_problems/sorting/search_in_rotated_array.py | YazzyYaz/codinginterviews | b8115c43507d738bccd90366f2bc02867ba7f13f | [
"MIT"
] | null | null | null | practice_problems/sorting/search_in_rotated_array.py | YazzyYaz/codinginterviews | b8115c43507d738bccd90366f2bc02867ba7f13f | [
"MIT"
] | null | null | null | def search_in_rotated_array(alist, k, leftix=0, rightix=None):
if not rightix:
rightix = len(alist)
midpoint = (leftix + rightix) / 2
aleft, amiddle = alist[leftix], alist[midpoint]
if k == amiddle:
return midpoint
if k == aleft:
return leftix
if aleft > amiddle:
if amiddle < k and k < aleft:
return search_in_rotated_array(alist, k, midpoint+1, rightix)
else:
return search_in_rotated_array(alist, k, leftix+1, midpoint)
elif aleft < k and k < amiddle:
return search_in_rotated_array(alist, k, leftix+1, midpoint)
else:
return search_in_rotated_array(alist, k, midpoint+1, rightix)
array = [55, 60, 65, 70, 75, 80, 85, 90, 95, 15, 20, 25, 30, 35, 40, 45]
print(search_in_rotated_array(array, 40))
| 36.954545 | 73 | 0.633456 | 118 | 813 | 4.211864 | 0.338983 | 0.096579 | 0.181087 | 0.241449 | 0.462777 | 0.462777 | 0.462777 | 0.39839 | 0.382294 | 0.382294 | 0 | 0.066445 | 0.259533 | 813 | 21 | 74 | 38.714286 | 0.759136 | 0 | 0 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0 | 0 | 0.35 | 0.05 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6da43a318085ed5dc8a681c4d1d0e7003951169 | 1,614 | py | Python | ggongsul/partner/migrations/0003_partneragreement.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | 2 | 2021-05-22T07:33:34.000Z | 2021-09-18T04:22:25.000Z | ggongsul/partner/migrations/0003_partneragreement.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | null | null | null | ggongsul/partner/migrations/0003_partneragreement.py | blc-cruise/ggongsul-api | 0cdfc09ea75688ffc297bc0c0f08897091896f3e | [
"MIT"
] | null | null | null | # Generated by Django 3.1.3 on 2020-12-06 23:03
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("partner", "0002_auto_20201205_1630"),
]
operations = [
migrations.CreateModel(
name="PartnerAgreement",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"policy_agreed_at",
models.DateTimeField(null=True, verbose_name="정책 동의 날짜"),
),
(
"created_on",
models.DateTimeField(auto_now_add=True, verbose_name="생성 날짜"),
),
(
"updated_on",
models.DateTimeField(auto_now=True, verbose_name="최근 정보 변경 날짜"),
),
(
"partner",
models.OneToOneField(
on_delete=django.db.models.deletion.CASCADE,
related_name="agreement",
to="partner.partner",
verbose_name="동의서",
),
),
],
options={
"verbose_name": "이용약관 동의서",
"verbose_name_plural": "이용약관 동의서",
},
),
]
| 29.888889 | 84 | 0.405824 | 119 | 1,614 | 5.319328 | 0.537815 | 0.121643 | 0.07109 | 0.06951 | 0.088468 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.50062 | 1,614 | 53 | 85 | 30.45283 | 0.746898 | 0.027881 | 0 | 0.170213 | 1 | 0 | 0.121889 | 0.014678 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.042553 | 0 | 0.106383 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6da50f994fb6dec8552e2cb6d5017c97cbec004 | 208 | py | Python | chapter6/6.2/decorator_test.py | yifengyou/crazy-python | 28099bd5011de6981a7c5412783952cc7601ae0c | [
"Unlicense"
] | null | null | null | chapter6/6.2/decorator_test.py | yifengyou/crazy-python | 28099bd5011de6981a7c5412783952cc7601ae0c | [
"Unlicense"
] | null | null | null | chapter6/6.2/decorator_test.py | yifengyou/crazy-python | 28099bd5011de6981a7c5412783952cc7601ae0c | [
"Unlicense"
] | null | null | null | # coding:utf-8
# File Name: decorator_test
# Author : yifengyou
# Date : 2021/07/18
def funA(fn):
print("A")
fn()
return "hello"
@funA
def funB():
print("B")
print(funB)
| 12.235294 | 31 | 0.543269 | 28 | 208 | 4 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062069 | 0.302885 | 208 | 16 | 32 | 13 | 0.710345 | 0.447115 | 0 | 0 | 0 | 0 | 0.063636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.375 | 0.375 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6ded2462573fdea817e5bf0a2ae4d937f5d4db7 | 1,064 | py | Python | pyactus/domain/enums/interest_calculation_base.py | CasperLabs/actus-core-py | 070de8a31a29792f3e488db93300947a11f496b7 | [
"Apache-2.0"
] | null | null | null | pyactus/domain/enums/interest_calculation_base.py | CasperLabs/actus-core-py | 070de8a31a29792f3e488db93300947a11f496b7 | [
"Apache-2.0"
] | null | null | null | pyactus/domain/enums/interest_calculation_base.py | CasperLabs/actus-core-py | 070de8a31a29792f3e488db93300947a11f496b7 | [
"Apache-2.0"
] | null | null | null | import enum
class InterestCalculationBase(enum.Enum):
"""IPCB :: Interest Calculation Base.
This is important for amortizing instruments. The basis of interest calculation is normally the notional outstanding amount as per SD. This is considered the fair basis and in many countries the only legal basis. If NULL or NTSD is selected, this is the case.
Alternative bases (normally in order to favor the lending institution) are found. In the extreme case the original balance (PCDD=NT+PDCDD) never gets adjusted. In this case PCDD must be chosen.
An intermediate case exist wherre balances do get adjusted, however with lags. In this case NTL mut be selected and anchor dates and cycles must be set.
"""
# Notional Outstanding :: Interest accrues on the basis of the notional outstanding.
NT = 0
# Notional at Initial Exchange :: Interest accrues on the basis of the notional value at initial exchange.
NTIED = 1
# Notional Outstanding Lagged :: Interest accrues on the basis of the lagged notional outstanding.
NTL = 2
| 50.666667 | 264 | 0.759398 | 159 | 1,064 | 5.081761 | 0.540881 | 0.117574 | 0.049505 | 0.074257 | 0.131188 | 0.131188 | 0.131188 | 0.094059 | 0 | 0 | 0 | 0.003517 | 0.198308 | 1,064 | 20 | 265 | 53.2 | 0.943728 | 0.883459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c6e2591b3a9994c90718195618f8849a9904f2d7 | 128,118 | py | Python | tpipe.py | caseyjlaw/tpipe | d8337587230e334ee7f7147f64c2ae26566b0235 | [
"Apache-2.0"
] | 3 | 2015-01-05T01:23:07.000Z | 2018-09-30T07:53:58.000Z | tpipe.py | caseyjlaw/tpipe | d8337587230e334ee7f7147f64c2ae26566b0235 | [
"Apache-2.0"
] | null | null | null | tpipe.py | caseyjlaw/tpipe | d8337587230e334ee7f7147f64c2ae26566b0235 | [
"Apache-2.0"
] | null | null | null | #! /usr/bin/env python
"""
tpipe.py --- read and visualize visibility data to search for transients
Generalization of evlavis, etc.
Can read MS or Miriad formatted data. Will try to import the following:
- CASA and pyrap for MS reading
- miriad-python for Miriad reading
- aipy for imaging numpy array data
"""
import sys, string, os, shutil, types
from os.path import join
import pickle
import numpy as n
import pylab as p
# set up libraries for reading and imaging visibility data
try:
# for simplicity, we should use pyrap for reading MS
import pyrap
print 'Imported pyrap'
except ImportError:
try:
# as a backup, casa stuff can be imported if running casapy
# from casac import casac; # Recommended by Sanjay B. (Don't ask me why this is so! :)
# ms = casac.ms();
from casa import ms
from casa import quanta as qa
print 'Imported CASA'
except ImportError:
print 'No CASA or pyrap available. Can\'t read MS data.'
try:
# miriad-python can be used to read miriad format data
import mirtask
from mirexec import TaskInvert, TaskClean, TaskRestore, TaskImFit, TaskCgDisp, TaskImStat, TaskUVFit
import miriad
print 'Imported miriad-python'
except ImportError:
print 'No miriad-python available. Can\'t read miriad data.'
try:
# try aipy for imaging numpy array data
import aipy
print 'Imported aipy...'
except ImportError:
print 'No aipy available. Can\'t image in Python.'
try:
import ephem,pywcs
print 'Imported ephem and pywcs...'
except ImportError:
print 'Either ephem or pywcs not available. Can only do flat-sky coord transforms (no ra/dec info).'
class Reader:
""" Master class with basic functions.
self.params defines various tunable parameters for reading data and running pipelines.
A "profile" is a set of params that can be hardwired for easy access.
Can also set parameters giving them as keyword arguments (e.g., "chans=n.array(range(100,110))")
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def set_profile(self, profile='default'):
""" Method called by __init__ in subclasses. This sets all parameters needed elsewhere.
Can optionally use a profile which is a set of params.
Changing parameters done with set_params
"""
# parameters used by various subclasses
# each set is indexed by a name, called a profile
# Note that each parameter must also be listed in set_params method in order to get set
self.profile = profile
self.params = {
'default' : {
'chans': n.array(range(5,59)), # channels to read
'dmarr' : [44.,88.], # dm values to use for dedispersion (only for some subclasses)
'pulsewidth' : 0.0, # width of pulse in time (seconds)
'approxuvw' : True, # flag to make template visibility file to speed up writing of dm track data
'pathout': './', # place to put output files
'beam_params': [0], # flag=0 or list of parameters for twodgaussian parameter definition
'long': -107.6177, # longitude of the array center (vla)
'lat': 34.07875 # latitude of the array center (vla)
},
'vlacrab' : {
'chans': n.array(range(5,59)), # channels to read
'dmarr' : [29.,58.], # dm values to use for dedispersion (only for some subclasses)
'pulsewidth' : 0.0, # width of pulse in time (seconds)
'approxuvw' : True, # flag to make template visibility file to speed up writing of dm track data
'pathout': './', # place to put output files
'beam_params': [0], # flag=0 or list of parameters for twodgaussian parameter definition
'long': -107.6177, # longitude of the array center
'lat': 34.07875 # latitude of the array center
},
'psa' : {
'chans': n.array(range(140,150)), # channels to read
'dmarr' : [0.], # dm values to use for dedispersion (only for some subclasses)
'pulsewidth' : 0.0, # width of pulse in time (seconds)
'approxuvw' : True, # flag to make template visibility file to speed up writing of dm track data
'pathout': './', # place to put output files
'beam_params': [0], # flag=0 or list of parameters for twodgaussian parameter definition
'long': 21.411, # longitude of the array center
'lat': -30.721 # latitude of the array center
},
'pocob0329' : {
'chans': n.array(range(5,59)), # channels to read
'dmarr' : [0, 13.4, 26.8, 40.2, 53.5], # dm values to use for dedispersion (only for some subclasses)
'pulsewidth' : 0.005, # width of pulse in time (seconds)
'approxuvw' : True, # flag to make template visibility file to speed up writing of dm track data
'pathout': './', # place to put output files
'beam_params': [0], # flag=0 or list of parameters for twodgaussian parameter definition
'long': -121.470, # longitude of the array center
'lat': 40.817 # latitude of the array center
},
'mwa' : {
'chans': n.array(n.arange(128)), # channels to read
'dmarr' : [0, 50.], # dm values to use for dedispersion (only for some subclasses)
'pulsewidth' : 0.0, # width of pulse in time (seconds)
'approxuvw' : True, # flag to make template visibility file to speed up writing of dm track data
'pathout': './', # place to put output files
'beam_params': [0], # flag=0 or list of parameters for twodgaussian parameter definition
'long': 116.671, # longitude of the array center
'lat': -26.703 # latitude of the array center
}
}
self.pathout = self.params[self.profile]['pathout']
self.chans = self.params[self.profile]['chans']
self.dmarr = self.params[self.profile]['dmarr']
self.pulsewidth = self.params[self.profile]['pulsewidth'] * n.ones(len(self.chans))
self.approxuvw = self.params[self.profile]['approxuvw']
self.beam_params = self.params[self.profile]['beam_params']
self.long = self.params[self.profile]['long']
self.lat = self.params[self.profile]['lat']
def set_params(self, **kargs):
""" Method called by __init__ in subclasses. This allows one to change parameters.
Assumes set_profile already run on pipe.
"""
# may further modify parameters manually
if len(kargs) > 0:
for key in kargs:
if key in self.params[self.profile].keys():
self.params[self.profile][key] = kargs[key]
else:
print '%s not a standard key. Will not be used.' % (key)
self.pathout = self.params[self.profile]['pathout']
self.chans = self.params[self.profile]['chans']
self.dmarr = self.params[self.profile]['dmarr']
self.pulsewidth = self.params[self.profile]['pulsewidth'] * n.ones(len(self.chans))
self.approxuvw = self.params[self.profile]['approxuvw']
self.beam_params = self.params[self.profile]['beam_params']
self.long = self.params[self.profile]['long']
self.lat = self.params[self.profile]['lat']
def show_params(self):
""" Print parameters of pipeline that can be modified upon creation.
"""
return self.params[self.profile]
def spec(self, ind=[], save=0):
""" Plot spectrogram for phase center by taking mean over baselines and polarizations.
Optionally can zoom in on small range in time with ind parameter.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
reltime = self.reltime
bf = self.dataph
print 'Data mean, std: %f, %f' % (self.dataph.mean(), self.dataph.std())
(vmin, vmax) = sigma_clip(bf.data.ravel())
if ( (not(vmin >= 0)) & (not(vmin <= 0)) ):
print 'sigma_clip returning NaNs. Using data (min,max).'
vmin = bf.ravel().min()
vmax = bf.ravel().max()
p.figure()
p.clf()
ax = p.axes()
ax.set_position([0.2,0.2,0.7,0.7])
if len(ind) > 0:
for i in ind:
p.subplot(len(ind),1,list(ind).index(i)+1)
intmin = n.max([0,i-50])
intmax = n.min([len(self.reltime),i+50])
im = p.imshow(n.rot90(bf[intmin:intmax]), aspect='auto', origin='upper', interpolation='nearest', extent=(intmin,intmax,0,len(self.chans)), vmin=vmin, vmax=vmax)
p.subplot(len(ind),1,1)
else:
im = p.imshow(n.rot90(bf), aspect='auto', origin='upper', interpolation='nearest', extent=(0,len(self.reltime),0,len(self.chans)), vmin=vmin, vmax=vmax)
p.title(str(self.nskip/self.nbl) + ' nskip, candidates ' + str(ind))
cb = p.colorbar(im)
cb.set_label('Flux Density (Jy)',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
p.yticks(n.arange(0,len(self.chans),4), (self.chans[(n.arange(0,len(self.chans), 4))]))
p.xlabel('Time (integration)',fontsize=12,fontweight="bold")
p.ylabel('Frequency (channel)',fontsize=12,fontweight="bold")
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.scan) + '_' + str(self.nskip/self.nbl) + '_spec.png')
savename = string.join(savename,'.')
elif isinstance(save, types.StringType):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
def drops(self, data_type='ms', chan=0, pol=0, show=1):
""" Displays info on missing baselines by looking for zeros in data array.
data_type is needed to understand how to grab baseline info. options are 'ms' and 'mir'.
"""
nints = self.nints
bllen = []
if data_type == 'mir':
bls = self.preamble[:,4]
for bl in n.unique(bls):
bllen.append(n.shape(n.where(bls == bl))[1])
elif data_type == 'ms':
for i in xrange(len(self.blarr)):
bllen.append(len(n.where(self.data[:,i,chan,pol] != 0.00)[0]))
bllen = n.array(bllen)
if show:
p.clf()
for i in xrange(self.nbl):
p.text(self.blarr[i,0], self.blarr[i,1], s=str(100*(bllen[i]/nints - 1)), horizontalalignment='center', verticalalignment='center', fontsize=9)
p.axis((0,self.nants+1,0,self.nants+1))
p.plot([0,self.nants+1],[0,self.nants+1],'b--')
# p.xticks(int(self.blarr[:,0]), self.blarr[:,0])
# p.yticks(int(self.blarr[:,1]), self.blarr[:,1])
p.xlabel('Ant 1')
p.ylabel('Ant 2')
p.title('Drop fraction for chan %d, pol %d' % (chan, pol))
# p.show()
return self.blarr,bllen
def simple_image(self, i=0, c=0, cell=1.0, imagesize=4096):
"""
image a single integration and channel
cell size is in arcmin
returns image as an array
added by DLK 2013-04-05
"""
# select integration and channel
track=n.rollaxis(self.data[i,:,c,:].reshape((self.nbl,1,1)),2)
# take mean over frequency => introduces delay beam
truearr = n.ones( n.shape(track) )
falsearr = 1e-5*n.ones( n.shape(track) ) # need to set to small number so n.average doesn't go NaN
weightarr = n.where(track != 0j, truearr, falsearr) # ignore zeros in mean across channels # bit of a hack
track = n.average(track, axis=2, weights=weightarr)
# select integration and reduce pol axis
if ((pol == 'i') | (pol == 'I')):
if len(trackdata) == 2:
print 'Making Stokes I image as mean of two pols...'
else:
print 'Making Stokes I image as mean over all pols. Hope that\'s ok...'
tr=track.mean(axis=0)
elif isinstance(pol, types.IntType):
print 'Making image of pol %d' % (pol)
tr=track[pol]
# res and size in aipy units (lambda)
# size is pixel scale (cell size)
size=1/n.radians(cell/60.0)
# full field
res=size/imagesize
fov = n.degrees(1./res)*3600. # field of view in arcseconds
# form channel dependent uvw
u_ch = n.outer(self.u[i], self.freq/self.freq_orig[0])
v_ch = n.outer(self.v[i], self.freq/self.freq_orig[0])
w_ch = n.outer(self.w[i], self.freq/self.freq_orig[0])
# make image
ai = aipy.img.Img(size=size, res=res)
uvw_new, tr_new = ai.append_hermitian( (u_ch[:,c], v_ch[:,c], w_ch[:,c]), tr)
ai.put(uvw_new, tr_new)
image = ai.image(center = (size/res/2, size/res/2))
self.ai=ai
self.image_center=(size/res/2, size/res/2)
return image
def image_cube(self, i=0, channels=None, cell=1.0, imagesize=4096, pol='i'):
"""
Image a single integration across all channels.
Cell size is in arcmin. Channels can be an iterable object or None (which assumes all channels)
Returns images as 3d array (ra,dec,chan).
Added by DLK 2013-04-05
"""
# select integration and reduce pol axis
if ((pol == 'i') | (pol == 'I')):
if len(trackdata) == 2:
print 'Making Stokes I image as mean of two pols...'
else:
print 'Making Stokes I image as mean over all pols. Hope that\'s ok...'
tr=self.data.mean(axis=3)[0]
elif isinstance(pol, types.IntType):
print 'Making image of pol %d' % (pol)
tr=self.data[0,:,:,pol]
# res and size in aipy units (lambda)
# size is pixel scale (cell size)
size=1/n.radians(cell/60.0)
# full field
res=size/imagesize
fov = n.degrees(1./res)*3600. # field of view in arcseconds
if channels is None:
channels=range(self.nchan)
# form channel dependent uvw
u_ch = n.outer(self.u[i], self.freq/self.freq_orig[0])
v_ch = n.outer(self.v[i], self.freq/self.freq_orig[0])
w_ch = n.outer(self.w[i], self.freq/self.freq_orig[0])
# make image
image=n.zeros((len(channels),size/res,size/res))
for c,ic in zip(channels,xrange(len(channels))):
print 'Creating image for channel %d...' % c
ai = aipy.img.Img(size=size, res=res)
uvw_new, tr_new = ai.append_hermitian( (u_ch[:,c], v_ch[:,c], w_ch[:,c]), tr[:,c])
ai.put(uvw_new, tr_new)
image[ic] = ai.image(center = (size/res/2, size/res/2))
self.ai=ai
self.image_center=(size/res/2, size/res/2)
return image
def imagetrack(self, trackdata, mode='split', i=0, pol='i', size=48000, res=500, clean=True, gain=0.01, tol=1e-4, newbeam=0, save=0, show=0):
""" Use apiy to image trackdata returned by tracksub of dimensions (npol, nbl, nchan).
mode defines how frequency dependence is handled. 'split' means separate uv and data points in frequency (but not mfs). 'mean' means mean vis across frequency.
int is used to select uvw coordinates for track. default is first int.
pol can be 'i' for a Stokes I image (mean over pol dimension) or a pol index.
default params size and res are good for 1.4 GHz VLA, C-config image.
clean determines if image is cleaned and beam corrected. gain/tol are cleaning params.
newbeam forces the calculation of a new beam for restoring the cleaned image.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
# reduce pol axis
if ((pol == 'i') | (pol == 'I')):
if len(trackdata) == 2:
print 'Making Stokes I image as mean of two pols...'
else:
print 'Making Stokes I image as mean over all pols. Hope that\'s ok...'
td = trackdata.mean(axis=0)
elif isinstance(pol, types.IntType):
print 'Making image of pol %d' % (pol)
td = trackdata[pol]
# apply w phase rotation. generally this is done externally (e.g., by data writing software) and is not needed here.
# wrot = lambda w: n.exp(-2j*n.pi*n.outer(w, self.freq/self.freq_orig[0]))
# td = td*wrot(self.w[i])
# define handling of freq axis
if mode == 'split':
td = td.flatten()
uu = n.outer(self.u[i], self.freq/self.freq_orig[0]).flatten()
vv = n.outer(self.v[i], self.freq/self.freq_orig[0]).flatten()
ww = n.outer(self.w[i], self.freq/self.freq_orig[0]).flatten()
elif mode == 'mean':
td = td.mean(axis=1)
uu = self.u[i]
vv = self.v[i]
ww = self.w[i]
else:
print 'Mode must be \'mean\' or \'split\'.'
return 0
fov = n.degrees(1./res)*3600. # field of view in arcseconds
p.clf()
# make image
ai = aipy.img.Img(size=size, res=res)
uvw_new, td_new = ai.append_hermitian( (uu, vv, ww), td)
ai.put(uvw_new, td_new)
image = ai.image(center = (size/res/2, size/res/2))
image_final = image
# optionally clean image
if clean:
print 'Cleaning image...'
beam = ai.bm_image()
beamgain = aipy.img.beam_gain(beam[0])
(clean, dd) = aipy.deconv.clean(image, beam[0], verbose=True, gain=gain, tol=tol)
try:
import gaussfitter
if (len(self.beam_params) == 1) | (newbeam == 1) :
print 'Restoring image with new fit to beam shape...'
beam_centered = ai.bm_image(center=(size/res/2, size/res/2))
peak = n.where(beam_centered[0] >= 0.1*beam_centered[0].max(), beam_centered[0], 0.)
self.beam_params = gaussfitter.gaussfit(peak)
kernel = n.roll(n.roll(gaussfitter.twodgaussian(self.beam_params, shape=n.shape(beam[0])), size/res/2, axis=0), size/res/2, axis=1) # fit to beam at center, then roll to corners for later convolution step
except ImportError:
print 'Restoring image with peak of beam...'
kernel = n.where(beam[0] >= 0.4*beam[0].max(), beam[0], 0.) # take only peak (gaussian part) pixels of beam image
restored = aipy.img.convolve2d(clean, kernel)
image_restored = (restored + dd['res']).real/beamgain
image_final = image_restored
if show or save:
ax = p.axes()
ax.set_position([0.2,0.2,0.7,0.7])
# im = p.imshow(image_final, aspect='auto', origin='upper', interpolation='nearest', extent=[-fov/2, fov/2, -fov/2, fov/2])
im = p.imshow(image_final, aspect='auto', origin='lower', interpolation='nearest', extent=[fov/2, -fov/2, -fov/2, fov/2])
cb = p.colorbar(im)
cb.set_label('Flux Density (Jy)',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
p.xlabel('RA/l Offset (arcsec)',fontsize=12,fontweight="bold")
p.ylabel('Dec/m Offset (arcsec)',fontsize=12,fontweight="bold")
peak = n.where(n.max(image_final) == image_final)
print 'Image peak of %e at (%d,%d)' % (n.max(image_final), peak[0][0], peak[1][0])
print 'Peak/RMS = %e' % (image_final.max()/image_final[n.where(image_final <= 0.9*image_final.max())].std()) # estimate std without image peak
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.nskip/self.nbl) + '_im.png')
savename = string.join(savename,'.')
elif isinstance(save, string):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return image_final
def get_shift(self, ra, dec):
"""
Returns the shift (dl,dm) in radians required to move the phase center to the new ra,dec (in degrees).
Needs to know ra0, dec0 of phase center and lat, long of array.
Added by DLK 2013-04-04
First, get (ra0,dec0) and (ra,dec)
use these to compute (l,m) via:
l=cos(dec)*sin(ra-ra0)
m=sin(dec)*cos(dec0)-cos(dec)*sin(dec0)*cos(ra-ra0)
(Synthesis Imaging II, Eqn. 19-8 and 19-9)
Then need to warp the snapshot
l'=l + tan(Z)*sin(chi)*(sqrt(1-l**2-m**2)-1)
m'=m - tan(Z)*cos(chi)*(sqrt(1-l**2-m**2)-1)
(Cornwell et al. 2008, http://adsabs.harvard.edu/abs/2008ISTSP...2..647C, arXiv:0807.4161v1
Eqn. 6, 7)
although the sign in Eqn. 6 might be wrong
Z=zenith angle and chi=parallactic angle, and so these terms are in fact the PV2_1 and PV2_2
that I calculate below. However, this is also given in Synthesis Imaging II, Eqn. 19-21,19-22
in the same form, so maybe I have a sign wrong
This is also all discussed in Ord et al. (2010)
"""
if not self.__dict__.has_key('wcs'):
# set up a fake WCS to do the transformation
# likely the details do not matter much
wcs=pywcs.WCS(naxis=2)
wcs.wcs.ctype=['RA---SIN','DEC--SIN']
wcs.wcs.crval=[n.degrees(self.ra0),n.degrees(self.dec0)]
wcs.wcs.crpix=[2049,2049]
wcs.wcs.cdelt=[-1.0/60,1.0/60]
observer=ephem.Observer()
observer.long=n.radians(self.long)
observer.lat=n.radians(self.lat)
observer.epoch=ephem.J2000
J0 = ephem.julian_date(0)
observer.date=self.time[0]-J0
body=ephem.FixedBody()
body._ra=self.ra0
body._dec=self.dec0
body._epoch=ephem.J2000
body.compute(observer)
LST=observer.sidereal_time()
HA=LST-self.ra0
_dec=self.dec0
_lat=n.radians(self.lat)
# this calculation comes from Steve Ord's fixhdr.c
parallactic_angle=n.arctan2(n.sin(HA)*n.cos(_lat),
n.sin(_lat)*n.cos(_dec)-n.sin(_dec)*n.cos(_lat)*n.cos(HA))
cosz=n.sin(_lat)*n.sin(_dec)+n.cos(_lat)*n.cos(_dec)*n.cos(HA)
z=n.arccos(cosz)
sinz=n.sin(z)
tanz=sinz/cosz
PV2_1=tanz*n.sin(parallactic_angle)
PV2_2=tanz*n.cos(parallactic_angle)
wcs.wcs.set_pv([(2,1,PV2_1),(2,2,PV2_2)])
self.wcs=wcs
if isinstance(ra,n.ndarray):
sky=n.vstack((ra,dec)).T
else:
sky=n.array([[ra,dec]])
pix=self.wcs.wcs_sky2pix(sky,0)
if isinstance(ra,n.ndarray):
x=pix[:,0]
y=pix[:,1]
else:
x=pix[0,0]
y=pix[0,1]
dx=x-(self.wcs.wcs.crpix[0]-1)
dy=y-(self.wcs.wcs.crpix[1]-1)
dl=n.radians(dx*self.wcs.wcs.cdelt[0])
dm=n.radians(dy*self.wcs.wcs.cdelt[1])
return dl,dm
def phaseshift(self, dl=0, dm=0, im=[[0]], size=0):
""" Function to apply phase shift (l,m) coordinates of data array, by (dl, dm).
If dl,dm are arrays, will try to apply the given shift for each integration separately (courtesy DLK)
If instead a 2d-array image, im, is given, phase center is shifted to image peak. Needs size to know image scale.
Sets data and dataph arrays to new values.
Sum of all phase shifts for each integration is tracked in self.l0, self.m0.
"""
ang = lambda dl,dm,u,v,freq: (dl*n.outer(u,freq/self.freq_orig[0]) + dm*n.outer(v,freq/self.freq_orig[0])) # operates on single time of u,v
if ((len(im) != 1) & (size != 0)):
y,x = n.where(im == im.max())
length = len(im)
dl = (length/2 - x[0]) * 1./size
dm = (y[0] - length/2) * 1./size
print 'Shifting phase center to image peak: (dl,dm) = (%e,%e) = (%e,%e) arcsec' % (dl, dm, n.degrees(dl)*3600, n.degrees(dm)*3600)
elif isinstance(dl,n.ndarray) and isinstance(dm,n.ndarray):
if not len(dl) == self.nints:
raise ValueError('dl is an array but its length (%d) does not match the number of integrations (%d)' % (len(dl),self.nints))
elif ((dl != 0) | (dm != 0)):
print 'Shifting phase center by given (dl,dm) = (%e,%e) = (%e,%e) arcsec' % (dl, dm, n.degrees(dl)*3600, n.degrees(dm)*3600)
dl = dl * n.ones(self.nints)
dm = dm * n.ones(self.nints)
else:
raise ValueError('Need to give either dl or dm, or im and size.')
for i in xrange(self.nints):
for pol in xrange(self.npol):
self.data[i,:,:,pol] = self.data[i,:,:,pol] * n.exp(-2j*n.pi*ang(dl[i], dm[i], self.u[i], self.v[i], self.freq))
self.l0 = self.l0 + dl
self.m0 = self.m0 + dm
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real # multi-pol
self.min = self.dataph.min()
self.max = self.dataph.max()
print 'New dataph min, max:'
print self.min, self.max
def make_triples(self, amin=0, amax=0):
""" Calculates and returns data indexes (i,j,k) for all closed triples.
amin and amax define range of antennas (with index, in order). only used if nonzero.
"""
if amax == 0:
amax = self.nants
blarr = self.blarr
# first make triples indexes in antenna numbering
anttrips = []
for i in self.ants[amin:amax+1]:
for j in self.ants[list(self.ants).index(i)+1:amax+1]:
for k in self.ants[list(self.ants).index(j)+1:amax+1]:
anttrips.append([i,j,k])
# next return data indexes for triples
bltrips = []
for (ant1, ant2, ant3) in anttrips:
try:
bl1 = n.where( (blarr[:,0] == ant1) & (blarr[:,1] == ant2) )[0][0]
bl2 = n.where( (blarr[:,0] == ant2) & (blarr[:,1] == ant3) )[0][0]
bl3 = n.where( (blarr[:,0] == ant1) & (blarr[:,1] == ant3) )[0][0]
bltrips.append([bl1, bl2, bl3])
except IndexError:
continue
return n.array(bltrips)
class MiriadReader(Reader):
""" Class for reading Miriad format data with miriad-python.
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def read(self, file, nints, nskip, nocal, nopass, selectpol):
""" Reads in Miriad data using miriad-python.
"""
self.file = file
self.nints = nints
vis = miriad.VisData(self.file,)
# read data into python arrays
i = 0
for inp, preamble, data, flags in vis.readLowlevel ('dsl3', False, nocal=True, nopass=True):
# Loop to skip some data and read shifted data into original data arrays
if i == 0:
# get few general variables
self.nants0 = inp.getScalar ('nants', 0)
self.inttime0 = inp.getScalar ('inttime', 10.0)
self.nspect0 = inp.getScalar ('nspect', 0)
self.nwide0 = inp.getScalar ('nwide', 0)
self.sdf0 = inp.getScalar ('sdf', self.nspect0)
self.nschan0 = inp.getScalar ('nschan', self.nspect0)
self.ischan0 = inp.getScalar ('ischan', self.nspect0)
self.sfreq0 = inp.getScalar ('sfreq', self.nspect0)
self.restfreq0 = inp.getScalar ('restfreq', self.nspect0)
self.pol0 = inp.getScalar ('pol')
# DLK 2013-04-04
# get the initial phase center
self.ra0=inp.getScalar('ra')
self.dec0=inp.getScalar('dec')
self.sfreq = self.sfreq0
self.sdf = self.sdf0
self.nchan = len(data)
print 'Initializing nchan:', self.nchan
bls = []
# build complete list of baselines
bls.append(preamble[4])
# end here. assume at least one instance of each bl occurs before ~six integrations (accommodates MWA)
if len(bls) == 6*len(n.unique(bls)):
blarr = []
for bl in n.unique(bls):
blarr.append(mirtask.util.decodeBaseline (bl))
self.blarr = n.array(blarr)
bldict = dict( zip(n.unique(bls), n.arange(len(blarr))) )
break
i = i+1
# find number of pols in data
uvd = mirtask.UVDataSet(self.file, 'rw')
self.npol_orig = uvd.getNPol()
pols = []
for i in xrange(20): # loop over the first few spectra to find all polarizations in the data
pols.append(uvd.getPol())
uvd.next()
uvd.close()
upols = n.unique(pols) # get unique pols in first few spectra
polstr = mirtask.util.polarizationName(upols[0])
if len(upols) > 1:
for pol in upols[1:]:
polstr = polstr + ', ' + mirtask.util.polarizationName(pol)
self.npol = len(selectpol)
if self.npol > self.npol_orig:
raise ValueError('Trying to select %d pols from %d available.' % (self.npol, self.npol_orig))
for pol in selectpol:
if not pol in polstr:
raise ValueError('Trying to select %s, but %s available.' % (pol, polstr))
print 'Initializing npol: %d (of %d, %s)' % (self.npol, self.npol_orig, polstr)
# Initialize more stuff...
self.freq_orig = self.sfreq + self.sdf * n.arange(self.nchan)
self.freq = self.freq_orig[self.chans]
# good baselines
self.nbl = len(self.blarr)
print 'Initializing nbl:', self.nbl
self.ants = n.unique(self.blarr)
self.nants = len(self.ants)
print 'Initializing nants:', self.nants
self.nskip = int(nskip*self.nbl) # number of iterations to skip (for reading in different parts of buffer)
nskip = int(self.nskip)
# define data arrays
self.rawdata = n.zeros((nints, self.nbl, self.nchan, self.npol),dtype='complex64')
self.flags = n.zeros((nints, self.nbl, self.nchan, self.npol),dtype='bool')
self.u = n.zeros((nints,self.nbl),dtype='float64')
self.v = n.zeros((nints,self.nbl),dtype='float64')
self.w = n.zeros((nints,self.nbl),dtype='float64')
self.preamble = n.zeros((nints*self.nbl,5),dtype='float64')
# go back and read data into arrays
for polnum in range(self.npol):
stokes = selectpol[polnum]
i = 0
for inp, preamble, data, flags in vis.readLowlevel ('dsl3', False, nocal=nocal, nopass=nopass, stokes=stokes):
# Loop to skip some data and read shifted data into original data arrays
if i < nskip:
i = i+1
continue
# assumes ints in order, but may skip. after nbl iterations, it fills next row, regardless of number filled.
if (i-nskip) < nints*self.nbl:
self.preamble[i-nskip] = preamble
self.rawdata[(i-nskip)//self.nbl, bldict[preamble[4]], :, polnum] = data
self.flags[(i-nskip)//self.nbl, bldict[preamble[4]], :, polnum] = flags
# uvw stored in preamble index 0,1,2 in units of ns
# Assumes miriad files store uvw in ns. Set to lambda by multiplying by freq of first channel.
self.u[(i-nskip)//self.nbl, bldict[preamble[4]]] = preamble[0] * self.freq_orig[0]
self.v[(i-nskip)//self.nbl, bldict[preamble[4]]] = preamble[1] * self.freq_orig[0]
self.w[(i-nskip)//self.nbl, bldict[preamble[4]]] = preamble[2] * self.freq_orig[0]
else:
break # stop at nints
if not (i % (self.nbl*100)):
print 'Read spectrum ', str(i)
i = i+1
time = self.preamble[::self.nbl,3]
if ((not n.any(self.rawdata)) & (not n.any(time))):
raise ValueError('rawdata and time arrays at default values. No data read?')
# limit the data to actually real data (DLK)
maxgoodtime=max(n.where(time>0)[0])
if maxgoodtime+1 < nints:
print 'Requested to read %d integrations, but only found %d good integrations' % (nints,
maxgoodtime)
# need to trim off some of the data
time=time[:maxgoodtime]
self.nints=len(time)
self.u=self.u[:maxgoodtime]
self.v=self.v[:maxgoodtime]
self.w=self.w[:maxgoodtime]
self.rawdata=self.rawdata[:maxgoodtime]
self.flags=self.flags[:maxgoodtime]
self.reltime = 24*3600*(time - time[0]) # relative time array in seconds. evla times change...?
# preserve absolute time (DLK)
self.time=time
self.inttime = n.array([self.reltime[i+1] - self.reltime[i] for i in xrange(len(self.reltime)/5,len(self.reltime)-1)]).mean()
# define relative phase center for each integration
self.l0 = n.zeros(self.nints)
self.m0 = n.zeros(self.nints)
# print summary info
print
print 'Shape of raw data, time:'
print self.rawdata.shape, self.reltime.shape
def writetrack(self, dmbin, tbin, tshift=0, bgwindow=0, show=0, pol=0):
""" **Not tested recently** Writes data from track out as miriad visibility file.
Alternative to writetrack that uses stored, approximate preamble used from start of pulse, not middle.
Optional background subtraction bl-by-bl over bgwindow integrations. Note that this is bgwindow *dmtracks* so width is bgwindow+track width
"""
# create bgsub data
datadiffarr = self.tracksub(dmbin, tbin, bgwindow=bgwindow)
if n.shape(datadiffarr) == n.shape([0]): # if track doesn't cross band, ignore this iteration
return 0
data = n.zeros(self.nchan, dtype='complex64') # default data array. gets overwritten.
data0 = n.zeros(self.nchan, dtype='complex64') # zero data array for flagged bls
flags = n.zeros(self.nchan, dtype='bool')
# define output visibility file names
outname = string.join(self.file.split('.')[:-1], '.') + '.' + str(self.nskip/self.nbl) + '-' + 'dm' + str(dmbin) + 't' + str(tbin) + '.mir'
print outname
vis = miriad.VisData(self.file,)
int0 = int((tbin + tshift) * self.nbl)
flags0 = []
i = 0
for inp, preamble, data, flags in vis.readLowlevel ('dsl3', False, nocal=True, nopass=True):
if i == 0:
# prep for temp output vis file
shutil.rmtree(outname, ignore_errors=True)
out = miriad.VisData(outname)
dOut = out.open ('c')
# set variables
dOut.setPreambleType ('uvw', 'time', 'baseline')
dOut.writeVarInt ('nants', self.nants0)
dOut.writeVarFloat ('inttime', self.inttime0)
dOut.writeVarInt ('nspect', self.nspect0)
dOut.writeVarDouble ('sdf', self.sdf0)
dOut.writeVarInt ('nwide', self.nwide0)
dOut.writeVarInt ('nschan', self.nschan0)
dOut.writeVarInt ('ischan', self.ischan0)
dOut.writeVarDouble ('sfreq', self.sfreq0)
dOut.writeVarDouble ('restfreq', self.restfreq0)
dOut.writeVarInt ('pol', self.pol0)
# inp.copyHeader (dOut, 'history')
inp.initVarsAsInput (' ') # ???
inp.copyLineVars (dOut)
if i < self.nbl:
flags0.append(flags.copy())
i = i+1
else:
break
l = 0
for i in xrange(len(flags0)): # iterate over baselines
# write out track, if not flagged
if n.any(flags0[i]):
k = 0
for j in xrange(self.nchan):
if j in self.chans:
data[j] = datadiffarr[pol, l, k]
# flags[j] = flags0[i][j]
k = k+1
else:
data[j] = 0 + 0j
# flags[j] = False
l = l+1
else:
data = data0
# flags = n.zeros(self.nchan, dtype='bool')
dOut.write (self.preamble[int0 + i], data, flags0[i])
dOut.close ()
return 1
def writetrack2(self, dmbin, tbin, tshift=0, bgwindow=0, show=0, pol=0):
""" **Not tested recently** Writes data from track out as miriad visibility file.
Alternative to writetrack that uses stored, approximate preamble used from start of pulse, not middle.
Optional background subtraction bl-by-bl over bgwindow integrations. Note that this is bgwindow *dmtracks* so width is bgwindow+track width
"""
# create bgsub data
datadiffarr = self.tracksub(dmbin, tbin, bgwindow=bgwindow)
if n.shape(datadiffarr) == n.shape([0]): # if track doesn't cross band, ignore this iteration
return 0
data = n.zeros(self.nchan, dtype='complex64') # default data array. gets overwritten.
data0 = n.zeros(self.nchan, dtype='complex64') # zero data array for flagged bls
flags = n.zeros(self.nchan, dtype='bool')
# define output visibility file names
outname = string.join(self.file.split('.')[:-1], '.') + '.' + str(self.nskip/self.nbl) + '-' + 'dm' + str(dmbin) + 't' + str(tbin) + '.mir'
print outname
vis = miriad.VisData(self.file,)
int0 = int((tbin + tshift) * self.nbl)
flags0 = []
i = 0
for inp, preamble, data, flags in vis.readLowlevel ('dsl3', False, nocal=True, nopass=True):
if i == 0:
# prep for temp output vis file
shutil.rmtree(outname, ignore_errors=True)
out = miriad.VisData(outname)
dOut = out.open ('c')
# set variables
dOut.setPreambleType ('uvw', 'time', 'baseline')
dOut.writeVarInt ('nants', self.nants0)
dOut.writeVarFloat ('inttime', self.inttime0)
dOut.writeVarInt ('nspect', self.nspect0)
dOut.writeVarDouble ('sdf', self.sdf0)
dOut.writeVarInt ('nwide', self.nwide0)
dOut.writeVarInt ('nschan', self.nschan0)
dOut.writeVarInt ('ischan', self.ischan0)
dOut.writeVarDouble ('sfreq', self.sfreq0)
dOut.writeVarDouble ('restfreq', self.restfreq0)
dOut.writeVarInt ('pol', self.pol0)
# inp.copyHeader (dOut, 'history')
inp.initVarsAsInput (' ') # ???
inp.copyLineVars (dOut)
if i < self.nbl:
flags0.append(flags.copy())
i = i+1
else:
break
l = 0
for i in xrange(len(flags0)): # iterate over baselines
# write out track, if not flagged
if n.any(flags0[i]):
k = 0
for j in xrange(self.nchan):
if j in self.chans:
data[j] = datadiffarr[pol, l, k]
# flags[j] = flags0[i][j]
k = k+1
else:
data[j] = 0 + 0j
# flags[j] = False
l = l+1
else:
data = data0
# flags = n.zeros(self.nchan, dtype='bool')
dOut.write (self.preamble[int0 + i], data, flags0[i])
dOut.close ()
return 1
class MSReader(Reader):
""" Class for reading MS data with either CASA. (Will eventually use pyrap.)
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def read(self, file, nints, nskip, spw, selectpol, scan, datacol):
""" Reads in Measurement Set data using CASA.
spw is list of subbands. zero-based.
Scan is zero-based selection based on scan order, not actual scan number.
selectpol is list of polarization strings (e.g., ['RR','LL'])
"""
self.file = file
self.scan = scan
self.nints = nints
# get spw info. either load pickled version (if found) or make new one
pklname = string.join(file.split('.')[:-1], '.') + '_init.pkl'
# pklname = pklname.split('/')[-1] # hack to remove path and write locally
if os.path.exists(pklname):
print 'Pickle of initializing info found. Loading...'
pkl = open(pklname, 'r')
try:
(self.npol_orig, self.nbl, self.blarr, self.inttime, self.inttime0, spwinfo, scansummary) = pickle.load(pkl)
except EOFError:
print 'Bad pickle file. Exiting...'
return 1
# old way, casa 3.3?
# scanlist = scansummary['summary'].keys()
# starttime_mjd = scansummary['summary'][scanlist[scan]]['0']['BeginTime']
# new way, casa 4.0?
scanlist = scansummary.keys()
starttime_mjd = scansummary[scanlist[scan]]['0']['BeginTime']
self.nskip = int(nskip*self.nbl) # number of iterations to skip (for reading in different parts of buffer)
self.npol = len(selectpol)
else:
print 'No pickle of initializing info found. Making anew...'
pkl = open(pklname, 'wb')
ms.open(self.file)
spwinfo = ms.getspectralwindowinfo()
scansummary = ms.getscansummary()
# original (general version)
# scanlist = scansummary['summary'].keys()
# starttime_mjd = scansummary['summary'][scanlist[scan]]['0']['BeginTime']
# starttime0 = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+0/(24.*60*60),'d'),form=['ymd'], prec=9), 's'))
# stoptime0 = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+0.5/(24.*60*60), 'd'), form=['ymd'], prec=9), 's'))
# for casa 4.0 (?) and later
scanlist = scansummary.keys()
starttime_mjd = scansummary[scanlist[scan]]['0']['BeginTime']
starttime0 = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+0/(24.*60*60),'d'),form=['ymd'], prec=9)[0], 's'))[0]
stoptime0 = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+0.5/(24.*60*60), 'd'), form=['ymd'], prec=9)[0], 's'))[0]
ms.selectinit(datadescid=0) # initialize to initialize params
selection = {'time': [starttime0, stoptime0]}
ms.select(items = selection)
da = ms.getdata([datacol, 'axis_info'], ifraxis=True)
ms.close()
self.npol_orig = da[datacol].shape[0]
self.nbl = da[datacol].shape[2]
print 'Initializing nbl:', self.nbl
# good baselines
bls = da['axis_info']['ifr_axis']['ifr_shortname']
self.blarr = n.array([[int(bls[i].split('-')[0]),int(bls[i].split('-')[1])] for i in xrange(len(bls))])
self.nskip = int(nskip*self.nbl) # number of iterations to skip (for reading in different parts of buffer)
# set integration time
ti0 = da['axis_info']['time_axis']['MJDseconds']
# self.inttime = scansummary['summary'][scanlist[scan]]['0']['IntegrationTime'] # general way
self.inttime = scansummary[scanlist[scan]]['0']['IntegrationTime'] # subset way, or casa 4.0 way?
self.inttime0 = self.inttime
print 'Initializing integration time (s):', self.inttime
pickle.dump((self.npol_orig, self.nbl, self.blarr, self.inttime, self.inttime0, spwinfo, scansummary), pkl)
pkl.close()
self.ants = n.unique(self.blarr)
self.nants = len(n.unique(self.blarr))
self.nants0 = len(n.unique(self.blarr))
print 'Initializing nants:', self.nants
self.npol = len(selectpol)
print 'Initializing %d of %d polarizations' % (self.npol, self.npol_orig)
# set desired spw
if (len(spw) == 1) & (spw[0] == -1):
# spwlist = spwinfo['spwInfo'].keys() # old way
spwlist = spwinfo.keys() # new way
else:
spwlist = spw
self.freq_orig = n.array([])
for spw in spwlist:
# new way
nch = spwinfo[str(spw)]['NumChan']
ch0 = spwinfo[str(spw)]['Chan1Freq']
chw = spwinfo[str(spw)]['ChanWidth']
self.freq_orig = n.concatenate( (self.freq_orig, (ch0 + chw * n.arange(nch)) * 1e-9) )
# old way
# nch = spwinfo['spwInfo'][str(spw)]['NumChan']
# ch0 = spwinfo['spwInfo'][str(spw)]['Chan1Freq']
# chw = spwinfo['spwInfo'][str(spw)]['ChanWidth']
self.freq = self.freq_orig[self.chans]
self.nchan = len(self.freq)
print 'Initializing nchan:', self.nchan
# set requested time range based on given parameters
timeskip = self.inttime*nskip
# new way
starttime = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+timeskip/(24.*60*60),'d'),form=['ymd'], prec=9)[0], 's'))[0]
stoptime = qa.getvalue(qa.convert(qa.time(qa.quantity(starttime_mjd+(timeskip+nints*self.inttime)/(24.*60*60), 'd'), form=['ymd'], prec=9)[0], 's'))[0]
print 'First integration of scan:', qa.time(qa.quantity(starttime_mjd,'d'),form=['ymd'],prec=9)[0]
print
# new way
print 'Reading scan', str(scanlist[scan]) ,'for times', qa.time(qa.quantity(starttime_mjd+timeskip/(24.*60*60),'d'),form=['hms'], prec=9)[0], 'to', qa.time(qa.quantity(starttime_mjd+(timeskip+nints*self.inttime)/(24.*60*60), 'd'), form=['hms'], prec=9)[0]
# read data into data structure
ms.open(self.file)
ms.selectinit(datadescid=spwlist[0]) # reset select params for later data selection
selection = {'time': [starttime, stoptime]}
ms.select(items = selection)
print 'Reading %s column, SB %d, polarization %s...' % (datacol, spwlist[0], selectpol)
ms.selectpolarization(selectpol)
da = ms.getdata([datacol,'axis_info','u','v','w','flag'], ifraxis=True)
u = da['u']; v = da['v']; w = da['w']
if da == {}:
print 'No data found.'
return 1
newda = n.transpose(da[datacol], axes=[3,2,1,0]) # if using multi-pol data.
flags = n.transpose(da['flag'], axes=[3,2,1,0])
if len(spwlist) > 1:
for spw in spwlist[1:]:
ms.selectinit(datadescid=spw) # reset select params for later data selection
ms.select(items = selection)
print 'Reading %s column, SB %d, polarization %s...' % (datacol, spw, selectpol)
ms.selectpolarization(selectpol)
da = ms.getdata([datacol,'axis_info','flag'], ifraxis=True)
newda = n.concatenate( (newda, n.transpose(da[datacol], axes=[3,2,1,0])), axis=2 )
flags = n.concatenate( (flags, n.transpose(da['flag'], axes=[3,2,1,0])), axis=2 )
ms.close()
# Initialize more stuff...
self.nschan0 = self.nchan
# set variables for later writing data **some hacks here**
self.nspect0 = 1
self.nwide0 = 0
self.sdf0 = da['axis_info']['freq_axis']['resolution'][0][0] * 1e-9
self.sdf = self.sdf0
self.ischan0 = 1
self.sfreq0 = da['axis_info']['freq_axis']['chan_freq'][0][0] * 1e-9
self.sfreq = self.sfreq0
self.restfreq0 = 0.0
self.pol0 = -1 # assumes single pol?
# Assumes MS files store uvw in meters. Corrects by mean frequency of channels in use.
self.u = u.transpose() * self.freq_orig[0] * (1e9/3e8)
self.v = v.transpose() * self.freq_orig[0] * (1e9/3e8)
self.w = w.transpose() * self.freq_orig[0] * (1e9/3e8)
# set integration time and time axis
ti = da['axis_info']['time_axis']['MJDseconds']
self.reltime = ti - ti[0]
# define relative phase center for each integration
self.l0 = n.zeros(self.nints)
self.m0 = n.zeros(self.nints)
self.rawdata = newda
self.flags = n.invert(flags) # tests show that MS has opposite flag convention as Miriad! using complement of MS flag in tpipe.
print 'Shape of raw data, time:'
print self.rawdata.shape, self.reltime.shape
class SimulationReader(Reader):
""" Class for simulating visibility data for transients analysis.
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def simulate(self, nints, inttime, chans, freq, bw, array='vla10'):
""" Simulates data
array is the name of array config, nints is number of ints, inttime is integration duration, chans, freq, bw (in GHz) as normal.
array can be 'vla_d' and 'vla10', the latter is the first 10 of vla_d.
"""
self.file = 'sim'
self.chans = chans
self.nints = nints
self.nchan = len(chans)
print 'Initializing nchan:', self.nchan
self.sfreq = freq # in GHz
self.sdf = bw/self.nchan
self.npol = 1
self.freq_orig = self.sfreq + self.sdf * n.arange(self.nchan)
self.freq = self.freq_orig[self.chans]
self.inttime = inttime # in seconds
self.reltime = inttime*n.arange(nints)
# define relative phase center for each integration
self.l0 = n.zeros(self.nints)
self.m0 = n.zeros(self.nints)
# antennas and baselines
vla_d = 1e3*n.array([[ 0.00305045, 0.03486681], [ 0.00893224, 0.10209601], [ 0.01674565, 0.19140365], [ 0.02615514, 0.29895461], [ 0.03696303, 0.42248936], [ 0.04903413, 0.56046269], [ 0.06226816, 0.7117283 ], [ 0.07658673, 0.87539034], [ 0.09192633, 1.05072281], [ 0.02867032, -0.02007518], [ 0.08395162, -0.05878355], [ 0.1573876 , -0.11020398], [ 0.24582472, -0.17212832], [ 0.347405 , -0.2432556 ], [ 0.46085786, -0.32269615], [ 0.58524071, -0.40978995], [ 0.71981691, -0.50402122], [ 0.86398948, -0.60497195], [-0.03172077, -0.01479164], [-0.09288386, -0.04331245], [-0.17413325, -0.08119967], [-0.27197986, -0.12682629], [-0.38436803, -0.17923376], [-0.509892 , -0.23776654], [-0.64750886, -0.30193834], [-0.79640364, -0.37136912], [-0.95591581, -0.44575086]])
if array == 'vla_d':
antloc = vla_d
elif array == 'vla10':
antloc = vla_d[5:15] # 5:15 choses inner part of two arms
elif array == 'mwa':
antloc=1e3*n.array([[0,0]])
self.nants = len(antloc)
print 'Initializing nants:', self.nants
blarr = []; u = []; v = []; w = []
for i in range(1, self.nants+1):
for j in range(i, self.nants+1):
blarr.append([i,j])
u.append(antloc[i-1][0] - antloc[j-1][0]) # in meters (like MS, fwiw)
v.append(antloc[i-1][1] - antloc[j-1][1])
w.append(0.)
self.blarr = n.array(blarr)
self.nbl = len(self.blarr)
self.u = n.zeros((nints,self.nbl),dtype='float64')
self.v = n.zeros((nints,self.nbl),dtype='float64')
self.w = n.zeros((nints,self.nbl),dtype='float64')
print 'Initializing nbl:', self.nbl
self.ants = n.unique(self.blarr)
self.nskip = 0
# no earth rotation yet
for i in range(nints):
self.u[i] = n.array(u) * self.freq_orig[0] * (1e9/3e8)
self.v[i] = n.array(v) * self.freq_orig[0] * (1e9/3e8)
self.w[i] = n.array(w) * self.freq_orig[0] * (1e9/3e8)
# simulate data
self.rawdata = n.zeros((nints,self.nbl,self.nchan,self.npol),dtype='complex64')
self.flags = n.ones((nints,self.nbl,self.nchan,self.npol),dtype='bool')
self.rawdata.real = n.sqrt(self.nchan) * n.random.randn(nints,self.nbl,self.nchan,self.npol) # normal width=1 after channel mean
self.rawdata.imag = n.sqrt(self.nchan) * n.random.randn(nints,self.nbl,self.nchan,self.npol)
# print summary info
print
print 'Shape of raw data, time:'
print self.rawdata.shape, self.reltime.shape
def add_transient(self, dl, dm, s, i):
""" Add a transient to an integration.
dl, dm are relative direction cosines (location) of transient, s is brightness, and i is integration.
"""
ang = lambda dl,dm,u,v,freq: (dl*n.outer(u,freq/self.freq_orig[0]) + dm*n.outer(v,freq/self.freq_orig[0])) # operates on single time of u,v
for pol in range(self.npol):
self.data[i,:,:,pol] = self.data[i,:,:,pol] + s * n.exp(-2j*n.pi*ang(dl, dm, self.u[i], self.v[i], self.freq))
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real #dataph is summed and detected to form TP beam at phase center, multi-pol
class ProcessByIntegration():
""" Class defines methods for pipeline processing for integration-based (no dispersion) transients searches.
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def prep(self):
""" Sets up tracks used to select data in time. Setting them early helped speed up dedispersion done elsewhere.
"""
print
print 'Filtering rawdata to data as masked array...'
# using 0 as flag
# self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.rawdata[:self.nints,:, self.chans,:] == 0j)
# using standard flags
self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.flags[:self.nints,:, self.chans,:] == 0) # mask of True for flagged data (flags=0 in tpipe, which is flags=False in Miriad and flags=True in MS)
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real #dataph is summed and detected to form TP beam at phase center, multi-pol
self.min = self.dataph.min()
self.max = self.dataph.max()
print 'Shape of data:'
print self.data.shape
print 'Dataph min, max:'
print self.min, self.max
self.freq = self.freq_orig[self.chans]
self.track0 = self.track(0.)
self.twidth = 0
for k in self.track0[1]:
self.twidth = max(self.twidth, len(n.where(n.array(self.track0[1]) == k)[0]))
print 'Track width in time: %d. Iteration could step by %d/2.' % (self.twidth, self.twidth)
def track(self, t0 = 0., show=0):
""" Takes time offset from first integration in seconds.
t0 defined at first (unflagged) channel.
Returns an array of (timebin, channel) to select from the data array.
"""
reltime = self.reltime
chans = self.chans
tint = self.inttime
# calculate pulse time and duration
pulset = t0
pulsedt = self.pulsewidth[0] # dtime in seconds. just take one channel, since there is no freq dep
timebin = []
chanbin = []
ontime = n.where(((pulset + pulsedt) >= reltime - tint/2.) & (pulset <= reltime + tint/2.))
for ch in xrange(len(chans)):
timebin = n.concatenate((timebin, ontime[0]))
chanbin = n.concatenate((chanbin, (ch * n.ones(len(ontime[0]), dtype='int'))))
track = (list(timebin), list(chanbin))
if show:
p.plot(track[0], track[1], 'w*')
return track
def tracksub(self, tbin, bgwindow = 0):
""" Creates a background-subtracted set of visibilities.
For a given track (i.e., an integration number) and bg window, tracksub subtractes a background in time and returns an array with new data.
"""
data = self.data
track_t,track_c = self.track0 # get track time and channel arrays
trackon = (list(n.array(track_t)+tbin), track_c) # create new track during integration of interest
twidth = self.twidth
dataon = data[trackon[0], :, trackon[1]]
# set up bg track
if bgwindow:
# measure max width of pulse (to avoid in bgsub)
bgrange = range(tbin -(bgwindow/2+twidth)+1, tbin-twidth+1) + range(tbin + twidth, tbin + (twidth+bgwindow/2))
for k in bgrange: # build up super track for background subtraction
if bgrange.index(k) == 0: # first time through
trackoff = (list(n.array(track_t)+k), track_c)
else: # then extend arrays by next iterations
trackoff = (trackoff[0] + list(n.array(track_t)+k), list(trackoff[1]) + list(track_c))
dataoff = data[trackoff[0], :, trackoff[1]]
datadiffarr = n.ma.zeros((len(self.chans), self.nbl, self.npol),dtype='complex')
# compress time axis, then subtract on and off tracks
for ch in n.unique(trackon[1]):
indon = n.where(trackon[1] == ch)
meanon = dataon[indon].mean(axis=0) # include all zeros
if bgwindow:
indoff = n.where(trackoff[1] == ch)
meanoff = dataoff[indoff].mean(axis=0) # include all zeros
datadiffarr[ch] = meanon - meanoff
zeros = n.where( (meanon == 0j) | (meanoff == 0j) ) # find baselines and pols with zeros for meanon or meanoff
datadiffarr[ch][zeros] = 0j # set missing data to zero # hack! but could be ok if we can ignore zeros later...
else:
datadiffarr[ch] = meanon
return n.transpose(datadiffarr, axes=[2,1,0])
def make_bispectra(self, bgwindow=4):
""" Makes numpy array of bispectra for each integration. Subtracts visibilities in time in bgwindow.
Steps in Bispectrum Transient Detection Algorithm
1) Collect visibility spectra for some length of time. In Python, I read data into an array with a shape of (n_int, n_chan, n_bl).
2) Prepare visibility spectra to create bispectra. Optionally, one can form dedispersed spectra for each baseline. A simpler start (and probably more relevant for LOFAR) would be to instead select a single integration from the data array described above. Either way, this step changes the data shape to (n_chan, n_bl).
3) Subtract visibilities in time. If the sky has many (or complex) sources, the bispectrum is hard to interpret. Subtracting neighboring visibilities in time (or a rolling mean, like v_t2 - (v_t1+v_t3)/2) removes most constant emission. The only trick is that this assumes that the array has not rotated much and that gain and other effects have not changed. This should preserve the data shape as (n_chan, n_bl).
4) Calculate mean visibility for each baseline. After subtracting in time, one can measure the mean visibility across the band. This reduces the shape to (n_bl).
5) Form a bispectrum for every closed triple in the array. There are a total of n_a * (n_a-1) * (n_a-2) / 6 possible closed triples in the array, where n_a is the number of antennas. One way to form all bispectra is to iterate over antenna indices like this:
for i in range(0, len(n_a)-2):
for j in range(i, len(n_a)-1):
for k in range(k, len(n_a)):
bl1, bl2, bl3 = ant2bl(i, j, k)
bisp = vis[bl1] * vis[bl2] * vis[bl3]
As you can see, this loop needs a function to convert antenna triples to baseline triples (I call it "ant2bl" here). That is, for antennas (i, j, k), you need (bl_ij, bl_jk, bl_ki). Note that the order of the last baseline is flipped; this is a way of showing that the way you "close" a loop is by tracing a single line around all three baselines. This step changes the basic data product from a shape of (n_bl) to (n_tr).
6) Search the set of bispectra for sign of a source. Each bispectrum is complex, but if there is a point source in the (differenced) data, all bispectra will respond in the same way. This happens regardless of the location in the field of view.
The mean of all bispectra will scale with the source brightness to the third power, since it is formed from the product of three visibilities. Oddly, the standard deviation of the bispectra will *also* change with the source brightness, due to something called "self noise". The standard deviation of bispectra in the real-imaginary plane should be sqrt(3) S^2 sigma_bl, where S is the source brightness and sigma_bl is the noise on an individual baseline.
In practice, this search involves plotting the mean bispectrum versus time and searching for large deviations. At the same time, a plot of mean versus standard deviation of bispectra will show whether any significant deviation obeys the expected self-noise scaling. That scaling is only valid for a single point source in the field of view, which is what you expect for a fast transient. Any other behavior would be either noise-like or caused by RFI. In particular, RFI will look like a transient, but since it does not often look like a point source, it can be rejected in the plot of mean vs. standard deviation of bispectra. This is a point that I've demonstrated on a small scale, but would needs more testing, since RFI is so varied.
"""
bisp = lambda d, ij, jk, ki: d[:,ij] * d[:,jk] * n.conj(d[:,ki]) # bispectrum for pol data
self.triples = self.make_triples()
self.bispectra = n.ma.zeros((len(self.data), len(self.triples)), dtype='complex')
for i in xrange((bgwindow/2)+self.twidth, len(self.data)-( (bgwindow/2)+2*self.twidth )):
# for i in xrange((bgwindow/2)+self.twidth, len(self.data)-( (bgwindow/2)+self.twidth ), max(1,self.twidth)): # leaves gaps in data
diff = self.tracksub(i, bgwindow=bgwindow)
if len(n.shape(diff)) == 1: # no track
continue
diffmean = n.mean(diff, axis=2) # if all zeros, just make mean # bit of a hack
for trip in xrange(len(self.triples)):
ij, jk, ki = self.triples[trip]
self.bispectra[i, trip] = bisp(diffmean, ij, jk, ki).mean(axis=0) # Stokes I bispectrum. Note we are averaging after forming bispectrum, so not technically a Stokes I bispectrum.
def detect_bispectra(self, sigma=5., tol=1.3, Q=0, show=0, save=0):
"""Function to search for a transient in a bispectrum lightcurve.
Designed to be used by bisplc function or easily fed the output of that function.
sigma gives the threshold for SNR_bisp (apparent).
tol gives the amount of tolerance in the sigma_b cut for point-like sources (rfi filter).
Q is noise per baseline and can be input. Otherwise estimated from data.
Returns the SNR and integration number of any candidate events.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
try:
ba = self.bispectra
except AttributeError:
print 'Need to make bispectra first.'
return
# ntr = lambda num: num*(num-1)*(num-2)/6 # theoretical number of triples
ntr = lambda num: len(self.triples) # consider possibility of zeros in data and take mean number of good triples over all times
# using s=S/Q
# mu = lambda s: s/(1+s) # for independent bispectra, as in kulkarni 1989
mu = lambda s: 1. # for bispectra at high S/N from visibilities?
sigbQ3 = lambda s: n.sqrt((1 + 3*mu(s)**2) + 3*(1 + mu(s)**2)*s**2 + 3*s**4) # from kulkarni 1989, normalized by Q**3, also rogers et al 1995
s = lambda basnr, nants: (2.*basnr/n.sqrt(ntr(nants)))**(1/3.)
# measure SNR_bl==Q from sigma clipped times with normal mean and std of bispectra. put into time,dm order
bamean = ba.real.mean(axis=1)
bastd = ba.real.std(axis=1)
(meanmin,meanmax) = sigma_clip(bamean) # remove rfi
(stdmin,stdmax) = sigma_clip(bastd) # remove rfi
clipped = n.where((bamean > meanmin) & (bamean < meanmax) & (bastd > stdmin) & (bastd < stdmax) & (bamean != 0.0))[0] # remove rf
bameanstd = ba[clipped].real.mean(axis=1).std()
basnr = bamean/bameanstd
if Q:
print 'Using given Q =', Q
else:
Q = ((bameanstd/2.)*n.sqrt(ntr(self.nants)))**(1/3.)
# Q = n.median( bastd[clipped]**(1/3.) ) # alternate for Q
print 'Estimating noise per baseline from data. Q =', Q
self.Q = Q
# detect
cands = n.where( (bastd/Q**3 < tol*sigbQ3(s(basnr, self.nants))) & (basnr > sigma) )[0] # define compact sources with good snr
print cands
# plot snrb lc and expected snr vs. sigb relation
if show or save:
p.figure()
ax = p.axes()
p.subplot(211)
p.title(str(self.nskip/self.nbl)+' nskip, ' + str(len(cands))+' candidates', transform = ax.transAxes)
p.plot(basnr, 'b.')
if len(cands) > 0:
p.plot(cands, basnr[cands], 'r*')
p.ylim(-2*basnr[cands].max(),2*basnr[cands].max())
p.xlabel('Integration')
p.ylabel('SNR$_{bisp}$')
p.subplot(212)
p.plot(bastd/Q**3, basnr, 'b.')
# plot reference theory lines
smax = s(basnr.max(), self.nants)
sarr = smax*n.arange(0,51)/50.
p.plot(sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k')
p.plot(tol*sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k--')
p.plot(bastd[cands]/Q**3, basnr[cands], 'r*')
if len(cands) > 0:
p.axis([0, tol*sigbQ3(s(basnr[cands].max(), self.nants)), -0.5*basnr[cands].max(), 1.1*basnr[cands].max()])
# show spectral modulation next to each point
for candint in cands:
sm = n.single(round(self.specmod(candint),1))
p.text(bastd[candint]/Q**3, basnr[candint], str(sm), horizontalalignment='right', verticalalignment='bottom')
p.xlabel('$\sigma_b/Q^3$')
p.ylabel('SNR$_{bisp}$')
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.nskip/self.nbl) + '_bisp.png')
savename = string.join(savename,'.')
elif isinstance(save, string):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return basnr[cands], bastd[cands], cands
def specmod(self, tbin, bgwindow=4):
"""Calculate spectral modulation for given track.
Spectral modulation is basically the standard deviation of a spectrum.
This helps quantify whether the flux is located in a narrow number of channels or across all channels.
Narrow RFI has large (>5) modulation, while spectrally broad emission has low modulation.
See Spitler et al 2012 for details.
"""
diff = self.tracksub(tbin, bgwindow=bgwindow)
bfspec = diff.mean(axis=0).real # should be ok for multipol data...
sm = n.sqrt( ((bfspec**2).mean() - bfspec.mean()**2) / bfspec.mean()**2 )
return sm
def make_phasedbeam(self):
"""Like that of dispersion-based classes, but integration-based.
Not yet implemented.
"""
raise NotImplementedError('For now, you could instead used dispersion code with dmarr=[0.]...')
def detect_phasedbeam(self):
"""Like that of dispersion-based classes, but integration-based.
Not yet implemented.
"""
raise NotImplementedError('For now, you could instead used dispersion code with dmarr=[0.]...')
class ProcessByDispersion():
""" Class defines methods for pipeline processing for dispersion-based transients searches.
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def prep(self):
""" Sets up tracks used to speed up dedispersion code.
"""
print
print 'Filtering rawdata to data as masked array...'
# using 0 as flag
# self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.rawdata[:self.nints,:, self.chans,:] == 0j)
# using standard flags
self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.flags[:self.nints,:, self.chans,:] == 0) # mask of True for flagged data (flags=0 in tpipe, which is flags=False in Miriad and flags=True in MS)
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real #dataph is summed and detected to form TP beam at phase center, multi-pol
self.min = self.dataph.min()
self.max = self.dataph.max()
print 'Shape of data:'
print self.data.shape
print 'Dataph min, max:'
print self.min, self.max
self.freq = self.freq_orig[self.chans]
# set up ur tracks (lol)
self.dmtrack0 = {}
self.twidths = {}
for dmbin in xrange(len(self.dmarr)):
self.dmtrack0[dmbin] = self.dmtrack(self.dmarr[dmbin],0) # track crosses high-freq channel in first integration
self.twidths[dmbin] = 0
for k in self.dmtrack0[dmbin][1]:
self.twidths[dmbin] = max(self.twidths[dmbin], len(n.where(n.array(self.dmtrack0[dmbin][1]) == k)[0]))
print 'Track width in time: '
for dmbin in self.twidths:
print 'DM=%.1f, twidth=%d. Iteration could step by %d/2.' % (self.dmarr[dmbin], self.twidths[dmbin], self.twidths[dmbin])
def dmtrack(self, dm = 0., t0 = 0., show=0):
""" Takes dispersion measure in pc/cm3 and time offset from first integration in seconds.
t0 defined at first (unflagged) channel. Need to correct by flight time from there to freq=0 for true time.
Returns an array of (timebin, channel) to select from the data array.
"""
reltime = self.reltime
chans = self.chans
tint = self.inttime
# given freq, dm, dfreq, calculate pulse time and duration
pulset_firstchan = 4.2e-3 * dm * self.freq[len(self.chans)-1]**(-2) # used to start dmtrack at highest-freq unflagged channel
pulset_midchan = 4.2e-3 * dm * self.freq[len(self.chans)/2]**(-2) # used to start dmtrack at highest-freq unflagged channel. fails to find bright j0628 pulse
pulset = 4.2e-3 * dm * self.freq**(-2) + t0 - pulset_firstchan # time in seconds referenced to some frequency (first, mid, last)
pulsedt = n.sqrt( (8.3e-6 * dm * (1000*self.sdf) * self.freq**(-3))**2 + self.pulsewidth**2) # dtime in seconds
timebin = []
chanbin = []
for ch in xrange(len(chans)):
ontime = n.where(((pulset[ch] + pulsedt[ch]) >= reltime - tint/2.) & (pulset[ch] <= reltime + tint/2.))
timebin = n.concatenate((timebin, ontime[0]))
chanbin = n.concatenate((chanbin, (ch * n.ones(len(ontime[0]), dtype='int'))))
track = (list(timebin), list(chanbin))
if show:
p.plot(track[0], track[1], 'w*')
return track
def tracksub(self, dmbin, tbin, bgwindow = 0):
""" Creates a background-subtracted set of visibilities.
For a given track (i.e., an integration number) and bg window, tracksub subtractes a background in time and returns an array with new data.
Uses ur track for each dm, then shifts by tint. Faster than using n.where to find good integrations for each trial, but assumes int-aligned pulse.
"""
data = self.data
track0,track1 = self.dmtrack0[dmbin]
trackon = (list(n.array(track0)+tbin), track1)
twidth = self.twidths[dmbin]
dataon = data[trackon[0], :, trackon[1]]
truearron = n.ones( n.shape(dataon) )
falsearron = 1e-5*n.ones( n.shape(dataon) ) # small weight to keep n.average from giving NaN
# set up bg track
if bgwindow:
# measure max width of pulse (to avoid in bgsub)
bgrange = range(tbin -(bgwindow/2+twidth)+1, tbin-twidth+1) + range(tbin + twidth, tbin + (twidth+bgwindow/2))
for k in bgrange: # build up super track for background subtraction
if bgrange.index(k) == 0: # first time through
trackoff = (list(n.array(track0)+k), track1)
else: # then extend arrays by next iterations
trackoff = (trackoff[0] + list(n.array(track0)+k), list(trackoff[1]) + list(track1))
dataoff = data[trackoff[0], :, trackoff[1]]
truearroff = n.ones( n.shape(dataoff) )
falsearroff = 1e-5*n.ones( n.shape(dataoff) ) # small weight to keep n.average from giving NaN
datadiffarr = n.zeros((len(self.chans), self.nbl, self.npol),dtype='complex')
# compress time axis, then subtract on and off tracks
for ch in n.unique(trackon[1]):
indon = n.where(trackon[1] == ch)
weightarr = n.where(dataon[indon] != 0j, truearron[indon], falsearron[indon])
meanon = n.average(dataon[indon], axis=0, weights=weightarr)
# meanon = dataon[indon].mean(axis=0) # include all zeros
if bgwindow:
indoff = n.where(trackoff[1] == ch)
weightarr = n.where(dataoff[indoff] != 0j, truearroff[indoff], falsearroff[indoff])
meanoff = n.average(dataoff[indoff], axis=0, weights=weightarr)
# meanoff = dataoff[indoff].mean(axis=0) # include all zeros
datadiffarr[ch] = meanon - meanoff
zeros = n.where( (meanon == 0j) | (meanoff == 0j) ) # find baselines and pols with zeros for meanon or meanoff
datadiffarr[ch][zeros] = 0j # set missing data to zero # hack! but could be ok if we can ignore zeros later...
else:
datadiffarr[ch] = meanon
return n.transpose(datadiffarr, axes=[2,1,0])
def make_bispectra(self, bgwindow=4):
""" Makes numpy array of bispectra for each integration. Subtracts visibilities in time in bgwindow.
Steps in Bispectrum Transient Detection Algorithm
1) Collect visibility spectra for some length of time. In Python, I read data into an array with a shape of (n_int, n_chan, n_bl).
2) Prepare visibility spectra to create bispectra. Optionally, one can form dedispersed spectra for each baseline. A simpler start (and probably more relevant for LOFAR) would be to instead select a single integration from the data array described above. Either way, this step changes the data shape to (n_chan, n_bl).
3) Subtract visibilities in time. If the sky has many (or complex) sources, the bispectrum is hard to interpret. Subtracting neighboring visibilities in time (or a rolling mean, like v_t2 - (v_t1+v_t3)/2) removes most constant emission. The only trick is that this assumes that the array has not rotated much and that gain and other effects have not changed. This should preserve the data shape as (n_chan, n_bl).
4) Calculate mean visibility for each baseline. After subtracting in time, one can measure the mean visibility across the band. This reduces the shape to (n_bl).
5) Form a bispectrum for every closed triple in the array. There are a total of n_a * (n_a-1) * (n_a-2) / 6 possible closed triples in the array, where n_a is the number of antennas. One way to form all bispectra is to iterate over antenna indices like this:
for i in range(0, len(n_a)-2):
for j in range(i, len(n_a)-1):
for k in range(k, len(n_a)):
bl1, bl2, bl3 = ant2bl(i, j, k)
bisp = vis[bl1] * vis[bl2] * vis[bl3]
As you can see, this loop needs a function to convert antenna triples to baseline triples (I call it "ant2bl" here). That is, for antennas (i, j, k), you need (bl_ij, bl_jk, bl_ki). Note that the order of the last baseline is flipped; this is a way of showing that the way you "close" a loop is by tracing a single line around all three baselines. This step changes the basic data product from a shape of (n_bl) to (n_tr).
6) Search the set of bispectra for sign of a source. Each bispectrum is complex, but if there is a point source in the (differenced) data, all bispectra will respond in the same way. This happens regardless of the location in the field of view.
The mean of all bispectra will scale with the source brightness to the third power, since it is formed from the product of three visibilities. Oddly, the standard deviation of the bispectra will *also* change with the source brightness, due to something called "self noise". The standard deviation of bispectra in the real-imaginary plane should be sqrt(3) S^2 sigma_bl, where S is the source brightness and sigma_bl is the noise on an individual baseline.
In practice, this search involves plotting the mean bispectrum versus time and searching for large deviations. At the same time, a plot of mean versus standard deviation of bispectra will show whether any significant deviation obeys the expected self-noise scaling. That scaling is only valid for a single point source in the field of view, which is what you expect for a fast transient. Any other behavior would be either noise-like or caused by RFI. In particular, RFI will look like a transient, but since it does not often look like a point source, it can be rejected in the plot of mean vs. standard deviation of bispectra. This is a point that I've demonstrated on a small scale, but would needs more testing, since RFI is so varied.
"""
bisp = lambda d, ij, jk, ki: d[:,ij] * d[:,jk] * n.conj(d[:,ki]) # bispectrum for pol data
# bisp = lambda d, ij, jk, ki: n.complex(d[ij] * d[jk] * n.conj(d[ki])) # without pol axis
triples = self.make_triples()
meanbl = self.data.mean(axis=2).mean(axis=0) # find bls with no zeros in either pol to ignore in triples
self.triples = triples[n.all(meanbl[triples][:,0] != 0j, axis=1) & n.all(meanbl[triples][:,1] != 0j, axis=1) & n.all(meanbl[triples][:,2] != 0j, axis=1)] # only take triples if both pols are good. may be smaller than set for an individual pol
# set up arrays for bispectrum and for weighting data (ignoring zeros)
bispectra = n.zeros((len(self.dmarr), len(self.data), len(self.triples)), dtype='complex')
truearr = n.ones( (self.npol, self.nbl, len(self.chans)))
falsearr = n.zeros( (self.npol, self.nbl, len(self.chans)))
# iterate over dm trials and integrations
for d in xrange(len(self.dmarr)):
twidth = n.round(self.twidths[d])
dmwidth = int(n.round(n.max(self.dmtrack0[d][0]) - n.min(self.dmtrack0[d][0])))
for i in xrange((bgwindow/2)+twidth, len(self.data)-( (bgwindow/2)+2*twidth+dmwidth )): # dmwidth avoided at end, others are split on front and back side of time iteration
# for i in xrange((bgwindow/2)+twidth, len(self.data)-( (bgwindow/2)+twidth+dmwidth ), max(1,twidth/2)): # can step by twidth/2, but messes up data products
diff = self.tracksub(d, i, bgwindow=bgwindow)
if len(n.shape(diff)) == 1: # no track
continue
# **need to redo for self.flags**
weightarr = n.where(diff != 0j, truearr, falsearr) # ignore zeros in mean across channels # bit of a hack
try:
diffmean = n.average(diff, axis=2, weights=weightarr)
except ZeroDivisionError:
diffmean = n.mean(diff, axis=2) # if all zeros, just make mean # bit of a hack
for trip in xrange(len(self.triples)):
ij, jk, ki = self.triples[trip]
bispectra[d, i, trip] = bisp(diffmean, ij, jk, ki).mean(axis=0) # Stokes I bispectrum. Note we are averaging after forming bispectrum, so not technically a Stokes I bispectrum.
print 'dedispersed for ', self.dmarr[d]
self.bispectra = n.ma.masked_array(bispectra, bispectra == 0j)
def detect_bispectra(self, sigma=5., tol=1.3, Q=0, show=0, save=0):
""" Function to detect transient in bispectra
sigma gives the threshold for SNR_bisp (apparent).
tol gives the amount of tolerance in the sigma_b cut for point-like sources (rfi filter).
Q is noise per baseline and can be input. Otherwise estimated from data.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
try:
ba = self.bispectra
except AttributeError:
print 'Need to make bispectra first.'
return
# ntr = lambda num: num*(num-1)*(num-2)/6 # assuming all triples are present
ntr = lambda num: len(self.triples) # consider possibility of zeros in data and take mean number of good triples over all times
# using s=S/Q
mu = lambda s: 1. # for bispectra formed from visibilities
sigbQ3 = lambda s: n.sqrt((1 + 3*mu(s)**2) + 3*(1 + mu(s)**2)*s**2 + 3*s**4) # from kulkarni 1989, normalized by Q**3, also rogers et al 1995
s = lambda basnr, nants: (2.*basnr/n.sqrt(ntr(nants)))**(1/3.) # see rogers et al. 1995 for factor of 2
# measure SNR_bl==Q from sigma clipped times with normal mean and std of bispectra. put into time,dm order
bamean = ba.real.mean(axis=2).transpose()
bastd = ba.real.std(axis=2).transpose()
bameanstd = []
for dmind in xrange(len(self.dmarr)):
(meanmin,meanmax) = sigma_clip(bamean[:, dmind]) # remove rfi to estimate noise-like parts
(stdmin,stdmax) = sigma_clip(bastd[:, dmind])
clipped = n.where((bamean[:, dmind] > meanmin) & (bamean[:, dmind] < meanmax) & (bastd[:, dmind] > stdmin) & (bastd[:, dmind] < stdmax) & (bamean[:, dmind] != 0.0))[0] # remove rfi and zeros
bameanstd.append(ba[dmind][clipped].real.mean(axis=1).std())
bameanstd = n.array(bameanstd)
basnr = bamean/bameanstd # = S**3/(Q**3 / n.sqrt(n_tr)) = s**3 * n.sqrt(n_tr)
if Q:
print 'Using given Q =', Q
else:
Q = ((bameanstd/2.)*n.sqrt(ntr(self.nants)))**(1/3.)
# Q = n.median( bastd[clipped]**(1/3.) ) # alternate for Q
print 'Estimating noise per baseline from data. Q (per DM) =', Q
self.Q = Q
# detect
cands = n.where( (bastd/Q**3 < tol*sigbQ3(s(basnr, self.nants))) & (basnr > sigma) ) # get compact sources with high snr
# plot snrb lc and expected snr vs. sigb relation
if show or save:
for dmbin in xrange(len(self.dmarr)):
cands_dm = cands[0][n.where(cands[1] == dmbin)[0]] # find candidates for this dmbin
p.figure(range(len(self.dmarr)).index(dmbin)+1)
ax = p.axes()
p.subplot(211)
p.title(str(self.nskip/self.nbl) + ' nskip, ' + str(dmbin) + ' dmbin, ' + str(len(cands_dm))+' candidates', transform = ax.transAxes)
p.plot(basnr[:,dmbin], 'b.')
if len(cands_dm) > 0:
p.plot(cands_dm, basnr[cands_dm,dmbin], 'r*')
p.ylim(-2*basnr[cands_dm,dmbin].max(),2*basnr[cands_dm,dmbin].max())
p.xlabel('Integration',fontsize=12,fontweight="bold")
p.ylabel('SNR_b',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
p.subplot(212)
p.plot(bastd[:,dmbin]/Q[dmbin]**3, basnr[:,dmbin], 'b.')
# plot reference theory lines
smax = s(basnr[:,dmbin].max(), self.nants)
sarr = smax*n.arange(0,101)/100.
p.plot(sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k')
p.plot(tol*sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k--')
p.plot(bastd[cands_dm,dmbin]/Q[dmbin]**3, basnr[cands_dm,dmbin], 'r*')
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
if len(cands_dm) > 0:
p.axis([0, tol*sigbQ3(s(basnr[cands_dm,dmbin].max(), self.nants)), -0.5*basnr[cands_dm,dmbin].max(), 1.1*basnr[cands_dm,dmbin].max()])
# show spectral modulation next to each point
for candint in cands_dm:
sm = n.single(round(self.specmod(dmbin,candint),1))
p.text(bastd[candint,dmbin]/Q[dmbin]**3, basnr[candint,dmbin], str(sm), horizontalalignment='right', verticalalignment='bottom')
p.xlabel('sigma_b/Q^3',fontsize=12,fontweight="bold")
p.ylabel('SNR_b',fontsize=12,fontweight="bold")
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.nskip/self.nbl) + '_' + str(dmbin) + '_bisp.png')
savename = string.join(savename,'.')
elif isinstance(save, string):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return basnr[cands], bastd[cands], zip(cands[0],cands[1])
def specmod(self, dmbin, tbin, bgwindow=4):
"""Calculate spectral modulation for given dmtrack.
Narrow RFI has large (>5) modulation, while spectrally broad emission has low modulation.
See Spitler et al 2012 for details.
"""
# smarr = n.zeros(len(self.dataph)) # uncomment to do specmod lightcurve
# for int in range(len(self.dataph)-bgwindow):
diff = self.tracksub(dmbin, tbin, bgwindow=bgwindow)
bfspec = diff.mean(axis=0).real # should be ok for multipol data...
sm = n.sqrt( ((bfspec**2).mean() - bfspec.mean()**2) / bfspec.mean()**2 )
return sm
def make_phasedbeam(self):
""" Integrates data at dmtrack for each pair of elements in dmarr, time.
Not threaded. Uses dmthread directly.
Stores mean of detected signal after dmtrack, effectively forming beam at phase center.
Ignores zeros in any bl, freq, time.
"""
self.phasedbeam = n.zeros((len(self.dmarr),len(self.reltime)), dtype='float64')
for i in xrange(len(self.dmarr)):
for j in xrange(len(self.reltime)):
# for j in xrange(0, len(self.reltime), max(1,self.twidths[i]/2)): # can also step by twidth/2, but leaves gaps in data products
dmtrack = self.dmtrack(dm=self.dmarr[i], t0=self.reltime[j])
if ((dmtrack[1][0] == 0) & (dmtrack[1][len(dmtrack[1])-1] == len(self.chans)-1)): # use only tracks that span whole band
truearr = n.ones( (len(dmtrack[0]), self.nbl, self.npol))
falsearr = n.zeros( (len(dmtrack[0]), self.nbl, self.npol))
selection = self.data[dmtrack[0], :, dmtrack[1], :]
weightarr = n.where(selection != 0j, truearr, falsearr) # ignore zeros in mean across channels # bit of a hack
try:
self.phasedbeam[i,j] = n.average(selection, weights=weightarr).real
except ZeroDivisionError:
self.phasedbeam[i,j] = n.mean(selection).real # if all zeros, just make mean # bit of a hack
print 'dedispersed for ', self.dmarr[i]
def detect_phasedbeam(self, sig=5., show=1, save=0, clipplot=1):
""" Method to find transients in dedispersed data (in dmt0 space).
Clips noise then does sigma threshold.
returns array of candidates transients.
Optionally plots beamformed lightcurve.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
try:
arr = self.phasedbeam
except AttributeError:
print 'Need to make phasedbeam first.'
return
reltime = self.reltime
# single iteration of sigma clip to find mean and std, skipping zeros
mean = arr.mean()
std = arr.std()
print 'initial mean, std: ', mean, std
amin,amax = sigma_clip(arr.flatten())
clipped = arr[n.where((arr < amax) & (arr > amin) & (arr != 0.))]
mean = clipped.mean()
std = clipped.std()
print 'final mean, sig, std: ', mean, sig, std
# Recast arr as significance array
arr_snr = (arr-mean)/std # for real valued trial output, gaussian dis'n, zero mean
# Detect peaks
peaks = n.where(arr_snr > sig)
peakmax = n.where(arr_snr == arr_snr.max())
print 'peaks: ', peaks
# Plot
if show:
p.clf()
ax = p.axes()
ax.set_position([0.2,0.2,0.7,0.7])
if clipplot:
im = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)), vmin=amin, vmax=amax)
else:
im = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
cb = p.colorbar(im)
cb.set_label('Flux Density (Jy)',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
if len(peaks[0]) > 0:
print 'Peak of %f at DM=%f, t0=%f' % (arr.max(), self.dmarr[peakmax[0][0]], reltime[peakmax[1][0]])
for i in xrange(len(peaks[1])):
ax = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
p.axis((min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
p.plot([reltime[peaks[1][i]]], [self.dmarr[peaks[0][i]]], 'o', markersize=2*arr_snr[peaks[0][i],peaks[1][i]], markerfacecolor='white', markeredgecolor='blue', alpha=0.5)
p.xlabel('Time (s)', fontsize=12, fontweight='bold')
p.ylabel('DM (pc/cm3)', fontsize=12, fontweight='bold')
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.scan) + '_' + str(self.nskip/self.nbl) + '_disp.png')
savename = string.join(savename,'.')
elif isinstance(save, types.StringType):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return peaks,arr[peaks],arr_snr[peaks]
class ProcessByDispersion2():
""" Class defines methods for pipeline processing for dispersion-based transients searches.
Has several optimizations (esp for speed), including:
-- dedisperses using data.roll()
-- time windowing and bgsubtraction with mexican hat filter convolution
-- visibility interpolation in frequency
May want to produce data objects for each dm and filter dt...?
"""
def __init__(self):
raise NotImplementedError('Cannot instantiate class directly. Use \'pipe\' subclasses.')
def prep(self, deleteraw=False):
""" Sets up tracks used to speed up dedispersion code.
Has the option to delete raw data and flags to save memory.
"""
print
print 'Filtering rawdata to data as masked array...'
# using 0 as flag
# self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.rawdata[:self.nints,:, self.chans,:] == 0j)
# using standard flags
self.data = n.ma.masked_array(self.rawdata[:self.nints,:, self.chans,:], self.flags[:self.nints,:, self.chans,:] == 0) # mask of True for flagged data (flags=0 in tpipe, which is flags=False in Miriad and flags=True in MS)
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real #dataph is summed and detected to form TP beam at phase center, multi-pol
self.min = self.dataph.min()
self.max = self.dataph.max()
print 'Shape of data:'
print self.data.shape
print 'Dataph min, max:'
print self.min, self.max
if deleteraw:
del self.rawdata
del self.flags
self.freq = self.freq_orig[self.chans]
# set up ur tracks (lol)
self.dmtrack0 = {}
self.twidths = {}
self.delay = {}
for dmbin in xrange(len(self.dmarr)):
self.dmtrack0[dmbin] = self.dmtrack(self.dmarr[dmbin],0) # track crosses high-freq channel in first integration
(trackt, trackc) = self.dmtrack0[dmbin]
if len(trackc)<len(self.chans):
print 'Computed track for DM=%.1f is too long for the observation; only %d channels are computed' % (self.dmarr[dmbin],len(trackc))
continue
# old way
# self.twidths[dmbin] = [len(n.where(trackc == (chan-self.chans[0]))[0]) for chan in self.chans] # width of track for each unflagged channel
# self.delay[dmbin] = [n.int(trackt[n.where(trackc == (chan-self.chans[0]))[0][0]]) for chan in self.chans] # integration delay for each unflagged channel of a given dm.
# new way
self.twidths[dmbin] = [len(n.where(n.array(trackc) == chan)[0]) for chan in range(len(self.chans))] # width of track for each unflagged channel
self.delay[dmbin] = [n.int(trackt[n.where(n.array(trackc) == chan)[0][0]]) for chan in range(len(self.chans))] # integration delay for each unflagged channel of a given dm.
print 'Track width in time: '
for dmbin in self.twidths:
print 'DM=%.1f, max(twidth)=%d. Iteration could step by %d/2.' % (self.dmarr[dmbin], max(self.twidths[dmbin]), max(self.twidths[dmbin]))
def dmtrack(self, dm = 0., t0 = 0., show=0):
""" Takes dispersion measure in pc/cm3 and time offset from first integration in seconds.
t0 defined at first (unflagged) channel. Need to correct by flight time from there to freq=0 for true time.
Returns an array of (timebin, channel) to select from the data array.
"""
reltime = self.reltime
chans = self.chans
tint = self.inttime
# given freq, dm, dfreq, calculate pulse time and duration
pulset_firstchan = 4.2e-3 * dm * self.freq[len(self.chans)-1]**(-2) # used to start dmtrack at highest-freq unflagged channel
pulset_midchan = 4.2e-3 * dm * self.freq[len(self.chans)/2]**(-2) # used to start dmtrack at highest-freq unflagged channel. fails to find bright j0628 pulse
pulset = 4.2e-3 * dm * self.freq**(-2) + t0 - pulset_firstchan # time in seconds referenced to some frequency (first, mid, last)
pulsedt = n.sqrt( (8.3e-6 * dm * (1000*self.sdf) * self.freq**(-3))**2 + self.pulsewidth**2) # dtime in seconds
timebin = []
chanbin = []
for ch in xrange(len(chans)):
ontime = n.where(((pulset[ch] + pulsedt[ch]) >= reltime - tint/2.) & (pulset[ch] <= reltime + tint/2.))
timebin = n.concatenate((timebin, ontime[0]))
chanbin = n.concatenate((chanbin, (ch * n.ones(len(ontime[0]), dtype='int'))))
track = (list(timebin), list(chanbin))
if show:
p.plot(track[0], track[1], 'w*')
return track
def time_filter(self, width, kernel='t', bgwindow=4, show=0):
""" Replaces data array with filtered version via convolution in time. Note that this has trouble with zeroed data.
kernel specifies the convolution kernel. 'm' for mexican hat (a.k.a. ricker, effectively does bg subtraction), 'g' for gaussian. 't' for a tophat. 'b' is a tophat with bg subtraction (or square 'm'). 'w' is a tophat with width that varies with channel, as kept in 'self.twidth[dmbin]'.
width is the kernel width with length nchan. should be tuned to expected pulse width in each channel.
bgwindow is used by 'b' only.
An alternate design for this method would be to make a new data array after filtering, so this can be repeated for many assumed widths without reading data in anew. That would require more memory, so going with repalcement for now.
"""
print 'Applying fft time filter. Assumes no missing data in time.'
if not isinstance(width, types.ListType):
width = [width] * len(self.chans)
# time filter by convolution. functions have different normlizations. m has central peak integral=1 and total is 0. others integrate to 1, so they don't do bg subtraction.
kernelset = {} # optionally could make set of kernels. one per width needed. (used only by 'w' for now).
if kernel == 'm':
from scipy import signal
print 'Applying mexican hat filter. Note that effective width is somewhat larger than equivalent tophat width.'
for w in n.unique(width):
kernel = signal.wavelets.ricker(len(self.data), w) # mexican hat (ricker) function can have given width and integral=0, so good for smoothing in time and doing bg-subtraction at same time! width of averaging is tied to width of bgsub though...
kernelset[w] = kernel/n.where(kernel>0, kernel, 0).sum() # normalize to have peak integral=1, thus outside integral=-1.
elif kernel == 't':
import math
print 'Applying tophat filter.'
for w in n.unique(width):
kernel = n.zeros(len(self.data)) # tophat.
onrange = range(len(kernel)/2 - w/2, len(kernel)/2 + int(math.ceil(w/2.)))
kernel[onrange] = 1.
kernelset[w] = kernel/n.where(kernel>0, kernel, 0).sum() # normalize to have peak integral=1, thus outside integral=-1.
elif kernel == 'b':
import math
print 'Applying tophat filter with bg subtraction (square mexican hat).'
for w in n.unique(width):
kernel = n.zeros(len(self.data)) # tophat.
onrange = range(len(kernel)/2 - w/2, len(kernel)/2 + int(math.ceil(w/2.)))
kernel[onrange] = 1.
offrange = range(len(kernel)/2 - (bgwindow/2+w)+1, len(kernel)/2-w+1) + range(len(kernel)/2 + w, len(kernel)/2 + (w+bgwindow/2))
offrange = range(len(kernel)/2 - (bgwindow+w)/2, len(kernel)/2-w/2) + range(len(kernel)/2 + int(math.ceil(w/2.)), len(kernel)/2 + int(math.ceil((w+bgwindow)/2.)))
kernel[offrange] = -1.
posnorm = n.where(kernel>0, kernel, 0).sum() # find normalization of positive
negnorm = n.abs(n.where(kernel<0, kernel, 0).sum()) # find normalization of negative
kernelset[w] = n.where(kernel>0, kernel/posnorm, kernel/negnorm) # pos and neg both sum to 1/-1, so total integral=0
elif kernel == 'g':
from scipy import signal
print 'Applying gaussian filter. Note that effective width is much larger than equivalent tophat width.'
for w in n.unique(width):
kernel = signal.gaussian(len(self.data), w) # gaussian. peak not quite at 1 for widths less than 3, so it is later renormalized.
kernelset[w] = kernel / (w * n.sqrt(2*n.pi)) # normalize to pdf, not peak of 1.
elif kernel == 'w':
import math
print 'Applying tophat filter that varies with channel.'
for w in n.unique(width):
kernel = n.zeros(len(self.data)) # tophat.
onrange = range(len(kernel)/2 - w/2, len(kernel)/2 + int(math.ceil(w/2.)))
kernel[onrange] = 1.
kernelset[w] = kernel/n.where(kernel>0, kernel, 0).sum() # normalize to have peak integral=1, thus outside integral=-1.
if show:
for kernel in kernelset.values():
p.plot(kernel,'.')
p.title('Time filter kernel')
p.show()
# take ffts (in time)
datafft = n.fft.fft(self.data, axis=0)
kernelsetfft = {}
for w in n.unique(width):
kernelsetfft[w] = n.fft.fft(n.roll(kernelset[w], len(self.data)/2)) # seemingly need to shift kernel to have peak centered near first bin if convolving complex array (but not for real array?)
# filter by product in fourier space
for i in range(self.nbl): # **can't find matrix product I need, so iterating over nbl, chans, npol**
for j in range(len(self.chans)):
for k in range(self.npol):
datafft[:,i,j,k] = datafft[:,i,j,k]*kernelsetfft[width[j]] # index fft kernel by twidth
# ifft to restore time series
self.data = n.ma.masked_array(n.fft.ifft(datafft, axis=0), self.flags[:self.nints,:, self.chans,:] == 0)
self.dataph = (self.data.mean(axis=3).mean(axis=1)).real
def dedisperse(self, dmbin):
""" Creates dedispersed visibilities integrated over frequency.
Uses ur track for each dm, then shifts by tint. Faster than using n.where to find good integrations for each trial, but assumes int-aligned pulse.
"""
dddata = self.data.copy()
twidth = self.twidths[dmbin]
delay = self.delay[dmbin]
# dedisperse by rolling time axis for each channel
for i in xrange(len(self.chans)):
dddata[:,:,i,:] = n.roll(self.data[:,:,i,:], -delay[i], axis=0)
return dddata
def time_mean(self, width):
""" Tophat mean of width for each channel (as in self.twidths[dmbin])
"""
import math
for i in range(len(self.data)):
for j in range(len(self.chans)):
self.data[i,:,j,:] = self.data[i - width[j]/2 : i + int(math.ceil(width[j]/2.)), :, j, :].mean(axis=0)
def tracksub(self, dmbin, tbin):
""" Simplistic reproduction of tracksub used in older version of this class.
Does not have time integration.
"""
dddata = self.dedisperse(dmbin)
return n.rollaxis(dddata[tbin], 2)
def spectralInterpolate(self, Y, axis=0, maxturns=1, turnIncrement=0.125, weightNorm=2):
""" Function to interpolate visibilities across the bandwidth using an FFT.
From LANL colleagues Scott vd Wiel and Earl Lawrence.
Y is array of visibilities with one axis of frequency. Dimensions are 1d, 3d, or 4d (self.data-like)
axis defines spectral axis over which iterpolation is done. assumptions for input data: for 0, array in freq, for 2, is self.data structure.
maxturns is the number of phase wraps expected across band. This affects sensitivity of interpolation, since it effectively is a search space.
"""
if maxturns==0:
return(Y.mean(axis=axis))
# 1: DFT
# Number of visibilities
nr = Y.shape[axis]
t0 = nr/2
# Total length for fft
nt = nr/turnIncrement
# Half the number of Fourier frequencies considered
nf = min(nt/2, maxturns/turnIncrement)
if nf==nt/2:
freqID = range(int(nt))
else:
freqID = range(int(nf+1));
freqID.extend(range(int(nt-nf),int(nt)))
# Fourier frequencies and rotation vector
omega = n.array(freqID)/nt
rotation = n.exp(2*n.pi*omega*t0*1j)
# Some stuff that creates F
F = n.fft.fft(Y, n=int(nt), axis=axis)
F = F.take(freqID, axis=axis)
F = n.sqrt(2)*F/nr
# 2: Calculate weights
# 3: Rotate FFT per frequency, weight, and sum across frequencies
W = n.abs(F)
if axis == 0: # assume single visibility array
W = W/W.max()
W = W**weightNorm
W = W/W.sum()
Yhat = (rotation*(W*F)).sum(axis=axis)
elif axis == 2: # assume structure of self.data
W = W/W.max(axis=axis)[:, :, n.newaxis]
W = W**weightNorm
W = W/W.sum(axis=axis)[:, :, n.newaxis]
Yhat = ((W*F)*rotation[n.newaxis , n.newaxis, :, n.newaxis]).sum(axis=axis)
return Yhat
def make_bispectra(self, stokes='postbisp', maxturns=0):
""" Makes numpy array of bispectra for each integration.
stokes defines how polarizations are used (assumes two polarizations):
'postbisp' means form bispectra for each pol, then average pols.
'prebisp' means average visibility pols, then form bispectra.
'noavg' means calc for both stokes and store in bispectrum array
an index (0,1) means take that index from pol axis.
maxturns determines how to take visibility mean across channels. if > 0, does spectral interpolation (and loses sensitivity!).
"""
bisp = lambda d: d[:,:,0] * d[:,:,1] * n.conj(d[:,:,2]) # bispectrum for data referenced by triple (data[:,triples])
# set up triples and arrays for bispectrum considering flagged baselines (only having zeros).
triples = self.make_triples()
meanbl = self.data.mean(axis=2).mean(axis=0) # find bls with no zeros in either pol to ignore in triples
self.triples = triples[n.all(meanbl[triples][:,0] != 0j, axis=1) & n.all(meanbl[triples][:,1] != 0j, axis=1) & n.all(meanbl[triples][:,2] != 0j, axis=1) == True] # only take triples if both pols are good. may be smaller than set for an individual pol
# need to select path based on how polarization is handled. assumes only dual-pol data.
print 'Bispectrum made for stokes =', stokes
if ( (stokes == 'postbisp') | (stokes == 'prebisp') | (stokes == 'noavg') ): # case of combining two stokes
bispectra = n.zeros((len(self.dmarr), len(self.data), len(self.triples)), dtype='complex')
elif isinstance(stokes, types.IntType): # case of using single pol
if stokes >= self.npol:
raise IndexError, 'Stokes parameter larger than number of pols in data.'
bispectra = n.zeros((len(self.dmarr), len(self.data), len(self.triples)), dtype='complex')
elif stokes == 'noavg':
bispectra = n.zeros((len(self.dmarr), len(self.data), len(self.triples), self.npol), dtype='complex')
# iterate over dm trials
for dmbin in xrange(len(self.dmarr)):
if maxturns == 0:
dddata = self.dedisperse(dmbin).mean(axis=2) # average over channels
elif maxturns > 0:
dddata = self.spectralInterpolate(self.dedisperse(dmbin), axis=2, maxturns=maxturns) # interpolate over channels using fft
if stokes == 'prebisp':
dddata = dddata.mean(axis=2)
bispectra[dmbin] = bisp(dddata[:, self.triples])
elif stokes == 'postbisp':
bispectra[dmbin] = bisp(dddata[:, self.triples]).mean(axis=2)
elif stokes == 'noavg':
bispectra[dmbin] = bisp(dddata[:, self.triples])
elif isinstance(stokes, types.IntType): # case of using single pol
bispectra[dmbin] = bisp(dddata[:, self.triples, stokes])
print 'dedispersed for ', self.dmarr[dmbin]
self.bispectra = n.ma.masked_array(bispectra, bispectra == 0j)
def detect_bispectra(self, sigma=5., tol=1.3, Q=0, show=0, save=0):
""" Function to detect transient in bispectra
sigma gives the threshold for SNR_bisp (apparent).
tol gives the amount of tolerance in the sigma_b cut for point-like sources (rfi filter).
Q is noise per baseline and can be input. Otherwise estimated from data.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
try:
ba = self.bispectra
except AttributeError:
print 'Need to make bispectra first.'
return
# ntr = lambda num: num*(num-1)*(num-2)/6 # assuming all triples are present
ntr = lambda num: len(self.triples) # assume only good triples are present and use array size as input for noise estimate
# using s=S/Q
mu = lambda s: 1. # for bispectra formed from visibilities
sigbQ3 = lambda s: n.sqrt((1 + 3*mu(s)**2) + 3*(1 + mu(s)**2)*s**2 + 3*s**4) # from kulkarni 1989, normalized by Q**3, also rogers et al 1995
s = lambda basnr, nants: (2.*basnr/n.sqrt(ntr(nants)))**(1/3.) # see rogers et al. 1995 for factor of 2
# measure SNR_bl==Q from sigma clipped times with normal mean and std of bispectra. put into time,dm order
bamean = ba.real.mean(axis=2).transpose()
bastd = ba.real.std(axis=2).transpose()
bameanstd = []
for dmind in xrange(len(self.dmarr)):
(meanmin,meanmax) = sigma_clip(bamean[:, dmind]) # remove rfi to estimate noise-like parts
(stdmin,stdmax) = sigma_clip(bastd[:, dmind])
clipped = n.where((bamean[:, dmind] > meanmin) & (bamean[:, dmind] < meanmax) & (bastd[:, dmind] > stdmin) & (bastd[:, dmind] < stdmax) & (bamean[:, dmind] != 0.0))[0] # remove rfi and zeros
bameanstd.append(ba[dmind][clipped].real.mean(axis=1).std())
bameanstd = n.array(bameanstd)
basnr = bamean/bameanstd # = S**3/(Q**3 / n.sqrt(n_tr)) = s**3 * n.sqrt(n_tr)
if Q:
print 'Using given Q =', Q
else:
Q = ((bameanstd/2.)*n.sqrt(ntr(self.nants)))**(1/3.)
# Q = n.median( bastd[clipped]**(1/3.) ) # alternate for Q
print 'Estimating noise per baseline from data. Q (per DM) =', Q
self.Q = Q
# detect
cands = n.where( (bastd/Q**3 < tol*sigbQ3(s(basnr, self.nants))) & (basnr > sigma) ) # get compact sources with high snr
# plot snrb lc and expected snr vs. sigb relation
if show or save:
for dmbin in xrange(len(self.dmarr)):
cands_dm = cands[0][n.where(cands[1] == dmbin)[0]] # find candidates for this dmbin
p.figure(range(len(self.dmarr)).index(dmbin)+1)
ax = p.axes()
p.subplot(211)
p.title(str(self.nskip/self.nbl) + ' nskip, ' + str(dmbin) + ' dmbin, ' + str(len(cands_dm))+' candidates', transform = ax.transAxes)
p.plot(basnr[:,dmbin], 'b.')
if len(cands_dm) > 0:
p.plot(cands_dm, basnr[cands_dm,dmbin], 'r*')
p.ylim(-2*basnr[cands_dm,dmbin].max(),2*basnr[cands_dm,dmbin].max())
p.xlabel('Integration',fontsize=12,fontweight="bold")
p.ylabel('SNR_b',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
p.subplot(212)
p.plot(bastd[:,dmbin]/Q[dmbin]**3, basnr[:,dmbin], 'b.')
# plot reference theory lines
smax = s(basnr[:,dmbin].max(), self.nants)
sarr = smax*n.arange(0,101)/100.
p.plot(sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k')
p.plot(tol*sigbQ3(sarr), 1/2.*sarr**3*n.sqrt(ntr(self.nants)), 'k--')
p.plot(bastd[cands_dm,dmbin]/Q[dmbin]**3, basnr[cands_dm,dmbin], 'r*')
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
if len(cands_dm) > 0:
p.axis([0, tol*sigbQ3(s(basnr[cands_dm,dmbin].max(), self.nants)), -0.5*basnr[cands_dm,dmbin].max(), 1.1*basnr[cands_dm,dmbin].max()])
# show spectral modulation next to each point
for candint in cands_dm:
sm = n.single(round(self.specmod(dmbin,candint),1))
p.text(bastd[candint,dmbin]/Q[dmbin]**3, basnr[candint,dmbin], str(sm), horizontalalignment='right', verticalalignment='bottom')
p.xlabel('sigma_b/Q^3',fontsize=12,fontweight="bold")
p.ylabel('SNR_b',fontsize=12,fontweight="bold")
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.nskip/self.nbl) + '_' + str(dmbin) + '_bisp.png')
savename = string.join(savename,'.')
elif isinstance(save, types.StringType):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return basnr[cands], bastd[cands], zip(cands[0],cands[1])
def specmod(self, dmbin, tbin, bgwindow=4):
"""Calculate spectral modulation for given dmtrack.
Narrow RFI has large (>5) modulation, while spectrally broad emission has low modulation.
See Spitler et al 2012 for details.
"""
# smarr = n.zeros(len(self.dataph)) # uncomment to do specmod lightcurve
# for int in range(len(self.dataph)-bgwindow):
bfspec = self.dedisperse(dmbin)[tbin].mean(axis=0).real
sm = n.sqrt( ((bfspec**2).mean() - bfspec.mean()**2) / bfspec.mean()**2 )
return sm
def make_phasedbeam(self):
""" Integrates data at dmtrack for each pair of elements in dmarr, time.
Not threaded. Uses dmthread directly.
Stores mean of detected signal after dmtrack, effectively forming beam at phase center.
Ignores zeros in any bl, freq, time.
"""
self.phasedbeam = n.zeros((len(self.dmarr),len(self.reltime)), dtype='float64')
for i in xrange(len(self.dmarr)):
self.phasedbeam[i] = self.dedisperse(dmbin=i).mean(axis=3).mean(axis=2).mean(axis=1).real # dedisperse and mean
print 'dedispersed for ', self.dmarr[i]
def detect_phasedbeam(self, sig=5., show=1, save=0, clipplot=1):
""" Method to find transients in dedispersed data (in dmt0 space).
Clips noise then does sigma threshold.
returns array of candidates transients.
Optionally plots beamformed lightcurve.
save=0 is no saving, save=1 is save with default name, save=<string>.png uses custom name (must include .png).
"""
try:
arr = self.phasedbeam
except AttributeError:
print 'Need to make phasedbeam first.'
return
reltime = self.reltime
# single iteration of sigma clip to find mean and std, skipping zeros
mean = arr.mean()
std = arr.std()
print 'initial mean, std: ', mean, std
amin,amax = sigma_clip(arr.flatten())
clipped = arr[n.where((arr < amax) & (arr > amin) & (arr != 0.))]
mean = clipped.mean()
std = clipped.std()
print 'final mean, sig, std: ', mean, sig, std
# Recast arr as significance array
arr_snr = (arr-mean)/std # for real valued trial output, gaussian dis'n, zero mean
# Detect peaks
peaks = n.where(arr_snr > sig)
peakmax = n.where(arr_snr == arr_snr.max())
print 'peaks: ', peaks
# Plot
if show:
p.clf()
ax = p.axes()
ax.set_position([0.2,0.2,0.7,0.7])
if clipplot:
im = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)), vmin=amin, vmax=amax)
else:
im = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
cb = p.colorbar(im)
cb.set_label('Flux Density (Jy)',fontsize=12,fontweight="bold")
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['bottom'].set_position(('outward', 20))
ax.spines['left'].set_position(('outward', 30))
ax.yaxis.set_ticks_position('left')
ax.xaxis.set_ticks_position('bottom')
if len(peaks[0]) > 0:
print 'Peak of %f at DM=%f, t0=%f' % (arr.max(), self.dmarr[peakmax[0][0]], reltime[peakmax[1][0]])
for i in xrange(len(peaks[1])):
ax = p.imshow(arr, aspect='auto', origin='lower', interpolation='nearest', extent=(min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
p.axis((min(reltime),max(reltime),min(self.dmarr),max(self.dmarr)))
p.plot([reltime[peaks[1][i]]], [self.dmarr[peaks[0][i]]], 'o', markersize=2*arr_snr[peaks[0][i],peaks[1][i]], markerfacecolor='white', markeredgecolor='blue', alpha=0.5)
p.xlabel('Time (s)', fontsize=12, fontweight='bold')
p.ylabel('DM (pc/cm3)', fontsize=12, fontweight='bold')
if save:
if save == 1:
savename = self.file.split('.')[:-1]
savename.append(str(self.scan) + '_' + str(self.nskip/self.nbl) + '_disp.png')
savename = string.join(savename,'.')
elif isinstance(save, types.StringType):
savename = save
print 'Saving file as ', savename
p.savefig(self.pathout+savename)
return peaks,arr[peaks],arr_snr[peaks]
class pipe_msint(MSReader, ProcessByIntegration):
""" Create pipeline object for reading in MS data and doing integration-based analysis without dedispersion.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
spw is list of spectral windows to read from MS.
selectpol is list of polarization product names for reading from MS
scan is zero-based selection of scan for reading from MS. It is based on scan order, not actual scan number.
datacol is the name of the data column name to read from the MS.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, spw=[-1], selectpol=['RR','LL'], scan=0, datacol='data', **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, spw=spw, selectpol=selectpol, scan=scan, datacol=datacol)
self.prep()
class pipe_msdisp(MSReader, ProcessByDispersion):
""" Create pipeline object for reading in MS data and doing dispersion-based analysis
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
spw is list of spectral windows to read from MS.
selectpol is list of polarization product names for reading from MS
scan is zero-based selection of scan for reading from MS. It is based on scan order, not actual scan number.
datacol is the name of the data column name to read from the MS.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, spw=[-1], selectpol=['RR','LL'], scan=0, datacol='data', **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, spw=spw, selectpol=selectpol, scan=scan, datacol=datacol)
self.prep()
class pipe_msdisp2(MSReader, ProcessByDispersion2):
""" Create pipeline object for reading in MS data and doing dispersion-based analysis
This version uses optimized code for dedispersion, which also has some syntax changes.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
spw is list of spectral windows to read from MS.
selectpol is list of polarization product names for reading from MS
scan is zero-based selection of scan for reading from MS. It is based on scan order, not actual scan number.
datacol is the name of the data column name to read from the MS.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, spw=[-1], selectpol=['RR','LL'], scan=0, datacol='data', **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, spw=spw, selectpol=selectpol, scan=scan, datacol=datacol)
self.prep()
class pipe_mirint(MiriadReader, ProcessByIntegration):
""" Create pipeline object for reading in Miriad data and doing integration-based analysis without dedispersion.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, nocal=False, nopass=False, selectpol=['XX'], **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, nocal=nocal, nopass=nopass, selectpol=selectpol)
self.prep()
class pipe_mirdisp(MiriadReader, ProcessByDispersion):
""" Create pipeline object for reading in Miriad data and doing dispersion-based analysis.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, nocal=False, nopass=False, selectpol=['XX'], **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, nocal=nocal, nopass=nopass, selectpol=selectpol)
self.prep()
class pipe_mirdisp2(MiriadReader, ProcessByDispersion2):
""" Create pipeline object for reading in Miriad data and doing dispersion-based analysis.
This version uses optimized code for dedispersion, which also has some syntax changes.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
Can also set some parameters as key=value pairs.
"""
def __init__(self, file, profile='default', nints=1024, nskip=0, nocal=False, nopass=False, selectpol=['XX'], **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.read(file=file, nints=nints, nskip=nskip, nocal=nocal, nopass=nopass, selectpol=selectpol)
self.prep()
class pipe_simdisp2(SimulationReader, ProcessByDispersion2):
""" Create pipeline object for simulating data and doing dispersion-based analysis.
This version uses optimized code for dedispersion, which also has some syntax changes.
nints is the number of integrations to read.
nskip is the number of integrations to skip before reading.
nocal,nopass are options for applying calibration while reading Miriad data.
Can also set some parameters as key=value pairs.
"""
def __init__(self, profile='default', nints=256, inttime=0.001, freq=1.4, bw=0.128, array='vla10', **kargs):
self.set_profile(profile=profile)
self.set_params(**kargs)
self.simulate(nints, inttime, self.chans, freq, bw, array)
self.prep()
def sigma_clip(arr,sigma=3):
""" Function takes 1d array of values and returns the sigma-clipped min and max scaled by value "sigma".
"""
cliparr = range(len(arr)) # initialize
arr = n.append(arr,[1]) # append superfluous item to trigger loop
while len(cliparr) != len(arr):
arr = arr[cliparr]
mean = arr.mean()
std = arr.std()
cliparr = n.where((arr < mean + sigma*std) & (arr > mean - sigma*std) & (arr != 0) )[0]
# print 'Clipping %d from array of length %d' % (len(arr) - len(cliparr), len(arr))
return mean - sigma*std, mean + sigma*std
| 51.370489 | 811 | 0.591018 | 17,624 | 128,118 | 4.266909 | 0.089197 | 0.007819 | 0.005426 | 0.003976 | 0.690878 | 0.664508 | 0.644628 | 0.627128 | 0.611848 | 0.60633 | 0 | 0.026697 | 0.283114 | 128,118 | 2,493 | 812 | 51.391095 | 0.792065 | 0.144172 | 0 | 0.588725 | 0 | 0.009609 | 0.080684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.007047 | 0.019859 | null | null | 0.081358 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6e891651952b27226bb28fa1df0e725f78f1444 | 750 | py | Python | angr_platforms/tricore/simos_tricore.py | shahinsba/angr-platforms | 86f9ea90c396fb5561d0196a2d1a873e573b0294 | [
"BSD-2-Clause"
] | null | null | null | angr_platforms/tricore/simos_tricore.py | shahinsba/angr-platforms | 86f9ea90c396fb5561d0196a2d1a873e573b0294 | [
"BSD-2-Clause"
] | null | null | null | angr_platforms/tricore/simos_tricore.py | shahinsba/angr-platforms | 86f9ea90c396fb5561d0196a2d1a873e573b0294 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python3
""" simos_tricore.py
Define OS simulator for tricore architecture.
"""
from angr.calling_conventions import SimCC, register_default_cc, SimRegArg
from .arch_tricore import ArchTRICORE
class SimCCTricore(SimCC):
""" Calling convertion simulator for tricore architecture. """
ARG_REGS = ['d4', 'd5', 'd6', 'd7', 'a4', 'a5', 'a6', 'a7']
FP_ARG_REGS = []
CALLER_SAVED_REGS = ['d8', 'd9', 'd10', 'd11', 'd12', 'd13', 'd14',
'd15', 'a10', 'a11', 'a12', 'a13', 'a14', 'a15']
RETURN_ADDR = SimRegArg('ra', 4)
RETURN_VAL = SimRegArg('d2', 4) # scalar value
#RETURN_VAL = SimRegArg('a2', 4) # pointer value TODO
ARCH = ArchTRICORE
register_default_cc('TRICORE', SimCCTricore)
| 37.5 | 74 | 0.637333 | 94 | 750 | 4.925532 | 0.691489 | 0.051836 | 0.082073 | 0.133909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 0.193333 | 750 | 19 | 75 | 39.473684 | 0.699174 | 0.273333 | 0 | 0 | 0 | 0 | 0.126894 | 0 | 0 | 0 | 0 | 0.052632 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c6f402f3fa9e6d514fe5b7bdd530e7ec0e623479 | 658 | py | Python | house_robber.py | tusharsadhwani/leetcode | a17a8a7587c5654f05fcd13ae7cdf47263ab2ea8 | [
"MIT"
] | 6 | 2021-05-21T01:10:42.000Z | 2021-12-16T16:12:30.000Z | house_robber.py | tusharsadhwani/leetcode | a17a8a7587c5654f05fcd13ae7cdf47263ab2ea8 | [
"MIT"
] | null | null | null | house_robber.py | tusharsadhwani/leetcode | a17a8a7587c5654f05fcd13ae7cdf47263ab2ea8 | [
"MIT"
] | null | null | null | class Solution:
def rob(self, nums: list[int]) -> int:
if len(nums) == 0:
return 0
max_loot: list[int] = [0 for _ in nums]
for index, num in enumerate(nums):
if index == 0:
max_loot[index] = num
elif index == 1:
max_loot[index] = max(max_loot[index-1], num)
else:
max_loot[index] = max(
max_loot[index-1],
num + max_loot[index-2]
)
return max_loot[-1]
tests = [
(
([1, 2, 3, 1],),
4,
),
(
([2, 7, 9, 3, 1],),
12,
),
]
| 21.225806 | 61 | 0.389058 | 78 | 658 | 3.166667 | 0.371795 | 0.226721 | 0.291498 | 0.121457 | 0.251012 | 0.251012 | 0.251012 | 0.251012 | 0.251012 | 0 | 0 | 0.060694 | 0.474164 | 658 | 30 | 62 | 21.933333 | 0.653179 | 0 | 0 | 0.076923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
059db18f06e038424ee617d9555a718b37a6439d | 900 | py | Python | app/models/user_capsule_model.py | endlessdev/Capsule-Flask-App | 2d3faa789cbee3f59eb8a4f260c20ffd30ebf856 | [
"MIT"
] | null | null | null | app/models/user_capsule_model.py | endlessdev/Capsule-Flask-App | 2d3faa789cbee3f59eb8a4f260c20ffd30ebf856 | [
"MIT"
] | null | null | null | app/models/user_capsule_model.py | endlessdev/Capsule-Flask-App | 2d3faa789cbee3f59eb8a4f260c20ffd30ebf856 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import datetime
from sqlalchemy import Column, Integer, String, DateTime, Boolean, Enum, ForeignKey
from app import db
from app.models.user_model import UserModel
from app.models.capsule_model import CapsuleModel
class UserCapsuleModel(db.Model):
__tablename__ = 'user_capsule'
id = Column(Integer, primary_key=True)
user_id = Column(Integer, ForeignKey('users.id', onupdate="CASCADE", ondelete="CASCADE"), nullable=False)
capsule_id = Column(Integer, ForeignKey('capsules.id', onupdate="CASCADE", ondelete="CASCADE"), nullable=False)
created_at = Column(DateTime, default=datetime.datetime.now)
def __init__(self, user_id=None, capsule_id=None):
self.user_id = user_id
self.capsule_id = capsule_id
def get_users_num_by_capsule_id(capsule_id):
return UserCapsuleModel.query.filter(UserCapsuleModel.id == capsule_id).first()
| 34.615385 | 115 | 0.751111 | 117 | 900 | 5.538462 | 0.42735 | 0.111111 | 0.069444 | 0.067901 | 0.138889 | 0.138889 | 0.138889 | 0 | 0 | 0 | 0 | 0.001289 | 0.137778 | 900 | 25 | 116 | 36 | 0.833763 | 0.023333 | 0 | 0 | 0 | 0 | 0.067275 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.3125 | 0.0625 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
05ae9faff24430ed54db15d529d80cead1f24d29 | 1,741 | py | Python | proc/mac.py | linlinhaohao888/pcap_compress | 83acec7ba7a0d79e8afd40b9a28aa4dbbdbb373b | [
"BSD-3-Clause"
] | 2 | 2020-10-28T03:56:50.000Z | 2020-10-28T07:22:05.000Z | proc/mac.py | linlinhaohao888/pcap_compress | 83acec7ba7a0d79e8afd40b9a28aa4dbbdbb373b | [
"BSD-3-Clause"
] | null | null | null | proc/mac.py | linlinhaohao888/pcap_compress | 83acec7ba7a0d79e8afd40b9a28aa4dbbdbb373b | [
"BSD-3-Clause"
] | 1 | 2021-06-07T08:28:08.000Z | 2021-06-07T08:28:08.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
__author__ = "mengdj@outlook.com"
from proc.arp import ARP
from proc.ip import IP
from proc.ipv6 import IPV6
from proc.util import ProcData
class MAC(ProcData):
"""mac协议 14B+"""
_dst = None
_src = None
_type = None
_data = None
def __init__(self, data, upper):
super(MAC, self).__init__(upper)
size = len(data)
assert size > 18
self._dst = data[:6]
self._src = data[6:12]
self._type = data[12:14]
# fcs校验字段 self._fcs = data[size - 4:]
self._data = data[14:]
def __str__(self):
return "MAC dst=>%s src=>%s type:%s" % (self.dst_desc, self.src_desc, self.type_desc)
@property
def dst_desc(self):
return [hex(s).replace("0x", "").upper() for s in self._dst]
@property
def src_desc(self):
return [hex(s).replace("0x", "").upper() for s in self._src]
@property
def type_desc(self):
return [hex(i) for i in self._type]
@property
def dst(self):
return self._dst
@property
def src(self):
return self._src
@property
def type(self):
return self._type
@property
def data(self):
ret = None
if self._type[0] == 0x08:
if self._type[1] == 0x00:
# ipv4 0x0800
ret = IP(self._data, self)
elif self._type[1] == 0x06:
# arp 0x0806
ret = ARP(self._data, self)
elif self._type[0] == 0x86:
if self._type[1] == 0xdd:
# ipv6 0x86dd
ret = IPV6(self._data, self)
return ret
| 25.231884 | 94 | 0.51637 | 224 | 1,741 | 3.8125 | 0.303571 | 0.084309 | 0.04918 | 0.059719 | 0.245902 | 0.154567 | 0.098361 | 0.098361 | 0.098361 | 0.098361 | 0 | 0.049283 | 0.358989 | 1,741 | 68 | 95 | 25.602941 | 0.71595 | 0.071798 | 0 | 0.14 | 0 | 0 | 0.03188 | 0 | 0 | 0 | 0.013012 | 0 | 0.02 | 1 | 0.18 | false | 0 | 0.08 | 0.14 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
05b0a603e41e948cc4d0b554a17b95112cbe3f34 | 13,915 | py | Python | pysnmp/OADHCP-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/OADHCP-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/OADHCP-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module OADHCP-SERVER-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/OADHCP-SERVER-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 20:22:47 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ValueSizeConstraint, ConstraintsIntersection, SingleValueConstraint, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ValueSizeConstraint", "ConstraintsIntersection", "SingleValueConstraint", "ConstraintsUnion")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
IpAddress, Integer32, Unsigned32, iso, NotificationType, Gauge32, ModuleIdentity, TimeTicks, ObjectIdentity, Counter32, NotificationType, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter64, enterprises, MibIdentifier, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "IpAddress", "Integer32", "Unsigned32", "iso", "NotificationType", "Gauge32", "ModuleIdentity", "TimeTicks", "ObjectIdentity", "Counter32", "NotificationType", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter64", "enterprises", "MibIdentifier", "Bits")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
class HostName(DisplayString):
subtypeSpec = DisplayString.subtypeSpec + ValueSizeConstraint(0, 32)
class EntryStatus(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
namedValues = NamedValues(("valid", 1), ("invalid", 2), ("insert", 3))
class ObjectStatus(Integer32):
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
namedValues = NamedValues(("enable", 1), ("disable", 2), ("other", 3))
oaccess = MibIdentifier((1, 3, 6, 1, 4, 1, 6926))
oaManagement = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1))
oaDhcp = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1, 11))
oaDhcpServer = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1))
oaDhcpServerGeneral = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1))
oaDhcpServerStatus = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1, 1), ObjectStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpServerStatus.setStatus('mandatory')
oaDhcpNetbiosNodeType = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("other", 1), ("B-node", 2), ("P-node", 3), ("M-node", 4), ("H-node", 5)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpNetbiosNodeType.setStatus('mandatory')
oaDhcpDomainName = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1, 3), HostName()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpDomainName.setStatus('mandatory')
oaDhcpDefaultLeaseTime = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpDefaultLeaseTime.setStatus('mandatory')
oaDhcpMaxLeaseTime = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpMaxLeaseTime.setStatus('mandatory')
oaDhcpDNSTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 2), )
if mibBuilder.loadTexts: oaDhcpDNSTable.setStatus('mandatory')
oaDhcpDNSEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 2, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpDNSNum"))
if mibBuilder.loadTexts: oaDhcpDNSEntry.setStatus('mandatory')
oaDhcpDNSNum = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpDNSNum.setStatus('mandatory')
oaDhcpDNSIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 2, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpDNSIp.setStatus('mandatory')
oaDhcpDNSStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 2, 1, 3), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpDNSStatus.setStatus('mandatory')
oaDhcpNetbiosServersTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 3), )
if mibBuilder.loadTexts: oaDhcpNetbiosServersTable.setStatus('mandatory')
oaDhcpNetbiosServersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 3, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpNetbiosServerNum"))
if mibBuilder.loadTexts: oaDhcpNetbiosServersEntry.setStatus('mandatory')
oaDhcpNetbiosServerNum = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 3, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpNetbiosServerNum.setStatus('mandatory')
oaDhcpNetbiosServerIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 3, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpNetbiosServerIp.setStatus('mandatory')
oaDhcpNetbiosServerStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 3, 1, 3), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpNetbiosServerStatus.setStatus('mandatory')
oaDhcpSubnetConfigTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4), )
if mibBuilder.loadTexts: oaDhcpSubnetConfigTable.setStatus('mandatory')
oaDhcpSubnetConfigEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpInterfaceName"), (0, "OADHCP-SERVER-MIB", "oaDhcpSubnetIp"), (0, "OADHCP-SERVER-MIB", "oaDhcpSubnetMask"))
if mibBuilder.loadTexts: oaDhcpSubnetConfigEntry.setStatus('mandatory')
oaDhcpInterfaceName = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpInterfaceName.setStatus('mandatory')
oaDhcpSubnetIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpSubnetIp.setStatus('mandatory')
oaDhcpSubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpSubnetMask.setStatus('mandatory')
oaDhcpOptionSubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 4), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpOptionSubnetMask.setStatus('mandatory')
oaDhcpIsOptionMask = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 5), ObjectStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpIsOptionMask.setStatus('mandatory')
oaDhcpSubnetConfigStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 4, 1, 6), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpSubnetConfigStatus.setStatus('mandatory')
oaDhcpIpRangeTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5), )
if mibBuilder.loadTexts: oaDhcpIpRangeTable.setStatus('mandatory')
oaDhcpIpRangeEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpIpRangeSubnetIp"), (0, "OADHCP-SERVER-MIB", "oaDhcpIpRangeSubnetMask"), (0, "OADHCP-SERVER-MIB", "oaDhcpIpRangeStart"))
if mibBuilder.loadTexts: oaDhcpIpRangeEntry.setStatus('mandatory')
oaDhcpIpRangeSubnetIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpIpRangeSubnetIp.setStatus('mandatory')
oaDhcpIpRangeSubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpIpRangeSubnetMask.setStatus('mandatory')
oaDhcpIpRangeStart = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpIpRangeStart.setStatus('mandatory')
oaDhcpIpRangeEnd = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1, 4), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpIpRangeEnd.setStatus('mandatory')
oaDhcpIpRangeStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 5, 1, 5), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpIpRangeStatus.setStatus('mandatory')
oaDhcpDefaultGWTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6), )
if mibBuilder.loadTexts: oaDhcpDefaultGWTable.setStatus('mandatory')
oaDhcpDefaultGWEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpDefaultGWSubnetIp"), (0, "OADHCP-SERVER-MIB", "oaDhcpDefaultGWSubnetMask"), (0, "OADHCP-SERVER-MIB", "oaDhcpDefaultGWIp"))
if mibBuilder.loadTexts: oaDhcpDefaultGWEntry.setStatus('mandatory')
oaDhcpDefaultGWSubnetIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpDefaultGWSubnetIp.setStatus('mandatory')
oaDhcpDefaultGWSubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpDefaultGWSubnetMask.setStatus('mandatory')
oaDhcpDefaultGWIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpDefaultGWIp.setStatus('mandatory')
oaDhcpDefaultGWStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 1, 6, 1, 4), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpDefaultGWStatus.setStatus('mandatory')
oaDhcpRelay = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2))
oaDhcpRelayGeneral = MibIdentifier((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 1))
oaDhcpRelayStatus = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 1, 1), ObjectStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpRelayStatus.setStatus('mandatory')
oaDhcpRelayClearConfig = MibScalar((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("None", 1), ("ResetConfig", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpRelayClearConfig.setStatus('mandatory')
oaDhcpRelayServerTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 2), )
if mibBuilder.loadTexts: oaDhcpRelayServerTable.setStatus('mandatory')
oaDhcpRelayServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 2, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpRelayServerIp"))
if mibBuilder.loadTexts: oaDhcpRelayServerEntry.setStatus('mandatory')
oaDhcpRelayServerIp = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 2, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpRelayServerIp.setStatus('mandatory')
oaDhcpRelayServerStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 2, 1, 2), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpRelayServerStatus.setStatus('mandatory')
oaDhcpRelayInterfaceTable = MibTable((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 3), )
if mibBuilder.loadTexts: oaDhcpRelayInterfaceTable.setStatus('mandatory')
oaDhcpRelayInterfaceEntry = MibTableRow((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 3, 1), ).setIndexNames((0, "OADHCP-SERVER-MIB", "oaDhcpRelayIfName"))
if mibBuilder.loadTexts: oaDhcpRelayInterfaceEntry.setStatus('mandatory')
oaDhcpRelayIfName = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 3, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDhcpRelayIfName.setStatus('mandatory')
oaDhcpRelayIfStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 6926, 1, 11, 2, 3, 1, 2), EntryStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaDhcpRelayIfStatus.setStatus('mandatory')
mibBuilder.exportSymbols("OADHCP-SERVER-MIB", oaDhcpDNSNum=oaDhcpDNSNum, oaDhcpDNSIp=oaDhcpDNSIp, oaDhcpNetbiosServerStatus=oaDhcpNetbiosServerStatus, oaDhcpRelayStatus=oaDhcpRelayStatus, oaDhcpIpRangeTable=oaDhcpIpRangeTable, oaDhcpServer=oaDhcpServer, oaDhcpDefaultGWIp=oaDhcpDefaultGWIp, oaDhcpDefaultGWEntry=oaDhcpDefaultGWEntry, oaDhcpDefaultLeaseTime=oaDhcpDefaultLeaseTime, EntryStatus=EntryStatus, oaDhcpNetbiosServersTable=oaDhcpNetbiosServersTable, oaDhcpRelay=oaDhcpRelay, oaDhcpNetbiosServerNum=oaDhcpNetbiosServerNum, oaManagement=oaManagement, oaDhcpNetbiosNodeType=oaDhcpNetbiosNodeType, oaDhcpOptionSubnetMask=oaDhcpOptionSubnetMask, oaDhcpIpRangeStatus=oaDhcpIpRangeStatus, oaDhcpRelayInterfaceEntry=oaDhcpRelayInterfaceEntry, oaDhcpIpRangeSubnetMask=oaDhcpIpRangeSubnetMask, oaccess=oaccess, oaDhcpSubnetConfigEntry=oaDhcpSubnetConfigEntry, oaDhcpRelayServerIp=oaDhcpRelayServerIp, oaDhcpRelayClearConfig=oaDhcpRelayClearConfig, oaDhcpRelayGeneral=oaDhcpRelayGeneral, oaDhcpRelayIfName=oaDhcpRelayIfName, oaDhcpMaxLeaseTime=oaDhcpMaxLeaseTime, oaDhcpServerGeneral=oaDhcpServerGeneral, ObjectStatus=ObjectStatus, oaDhcpDefaultGWStatus=oaDhcpDefaultGWStatus, oaDhcpDefaultGWSubnetIp=oaDhcpDefaultGWSubnetIp, oaDhcpIpRangeEnd=oaDhcpIpRangeEnd, oaDhcpDNSEntry=oaDhcpDNSEntry, oaDhcpSubnetConfigTable=oaDhcpSubnetConfigTable, oaDhcpSubnetConfigStatus=oaDhcpSubnetConfigStatus, oaDhcpDefaultGWTable=oaDhcpDefaultGWTable, oaDhcpDNSStatus=oaDhcpDNSStatus, oaDhcpNetbiosServersEntry=oaDhcpNetbiosServersEntry, oaDhcp=oaDhcp, oaDhcpServerStatus=oaDhcpServerStatus, oaDhcpInterfaceName=oaDhcpInterfaceName, oaDhcpRelayServerTable=oaDhcpRelayServerTable, oaDhcpRelayInterfaceTable=oaDhcpRelayInterfaceTable, oaDhcpSubnetMask=oaDhcpSubnetMask, oaDhcpIpRangeEntry=oaDhcpIpRangeEntry, oaDhcpIsOptionMask=oaDhcpIsOptionMask, oaDhcpDomainName=oaDhcpDomainName, oaDhcpIpRangeSubnetIp=oaDhcpIpRangeSubnetIp, oaDhcpDNSTable=oaDhcpDNSTable, oaDhcpRelayServerStatus=oaDhcpRelayServerStatus, oaDhcpNetbiosServerIp=oaDhcpNetbiosServerIp, oaDhcpDefaultGWSubnetMask=oaDhcpDefaultGWSubnetMask, oaDhcpRelayServerEntry=oaDhcpRelayServerEntry, oaDhcpRelayIfStatus=oaDhcpRelayIfStatus, HostName=HostName, oaDhcpSubnetIp=oaDhcpSubnetIp, oaDhcpIpRangeStart=oaDhcpIpRangeStart)
| 111.32 | 2,261 | 0.763277 | 1,541 | 13,915 | 6.892278 | 0.112914 | 0.01224 | 0.016948 | 0.01996 | 0.397985 | 0.36748 | 0.329536 | 0.318614 | 0.202994 | 0.188118 | 0 | 0.076941 | 0.08746 | 13,915 | 124 | 2,262 | 112.217742 | 0.75949 | 0.023572 | 0 | 0.017544 | 0 | 0 | 0.126676 | 0.011636 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.122807 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
05b749ffcbe0ef54ac5f3bd6219b12aa1d8b2104 | 109 | py | Python | face_analyser/configuration/data_settings.py | SergioML9/emotion_recogniser | f519a1075d713c8cea0bfce9c746765e6ae0a232 | [
"Apache-2.0"
] | null | null | null | face_analyser/configuration/data_settings.py | SergioML9/emotion_recogniser | f519a1075d713c8cea0bfce9c746765e6ae0a232 | [
"Apache-2.0"
] | null | null | null | face_analyser/configuration/data_settings.py | SergioML9/emotion_recogniser | f519a1075d713c8cea0bfce9c746765e6ae0a232 | [
"Apache-2.0"
] | null | null | null |
sample_split=1.0
data_loader_usage = 'Training'
training_data = "train_train"
evaluate_data = "privatetest"
| 18.166667 | 30 | 0.798165 | 15 | 109 | 5.4 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020408 | 0.100917 | 109 | 5 | 31 | 21.8 | 0.806122 | 0 | 0 | 0 | 0 | 0 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
05e974701a4d74b99c7ade176b46b3b110ae9208 | 782 | py | Python | tupla_maior_menor.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | tupla_maior_menor.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | tupla_maior_menor.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | import random
def retorno():
resp=input('Deseja executar o programa novamente?[s/n] ')
if(resp=='S' or resp=='s'):
verificar()
else:
print('Processo finalizado com sucesso!')
pass
def cabecalho(titulo):
print('-'*30)
print(' '*9+titulo+' '*15)
print('-'*30)
pass
def mensagem_erro():
print('Dados inseridos são invalidos!')
pass
def verificar():
num=(random.randint(1,10),random.randint(1,10),random.randint(1,10),random.randint(1,10),random.randint(1,10))
print('Os valores sorteados foram: ',end='')
for i in num:
print(f'{i} ',end='')
print('\nO maior valor é {}'.format(max(num)))
print('O menor valor é {}'.format(min(num)))
retorno()
pass
verificar() | 15.038462 | 114 | 0.580563 | 104 | 782 | 4.355769 | 0.528846 | 0.143488 | 0.154525 | 0.1766 | 0.1766 | 0.1766 | 0.1766 | 0.1766 | 0.1766 | 0.1766 | 0 | 0.036975 | 0.23913 | 782 | 52 | 115 | 15.038462 | 0.72437 | 0 | 0 | 0.307692 | 0 | 0 | 0.231162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0.153846 | 0.038462 | 0 | 0.192308 | 0.346154 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
05ee6f7b9c3a5d6efc08fbe11ebf79d50754b8d5 | 12,716 | py | Python | tests/test_twoside_normal_factor_iso.py | jedludlow/tolerance_interval_py | cf3ecd1e2f9b870de3e71bc92e07e6394510c001 | [
"MIT"
] | 9 | 2021-12-15T22:54:24.000Z | 2022-03-12T00:25:07.000Z | tests/test_twoside_normal_factor_iso.py | jedludlow/tolerance_interval_py | cf3ecd1e2f9b870de3e71bc92e07e6394510c001 | [
"MIT"
] | 5 | 2019-11-07T15:26:21.000Z | 2022-02-25T19:40:50.000Z | tests/test_twoside_normal_factor_iso.py | jedludlow/tolerance_interval_py | cf3ecd1e2f9b870de3e71bc92e07e6394510c001 | [
"MIT"
] | 1 | 2021-03-11T19:28:22.000Z | 2021-03-11T19:28:22.000Z | # Author: Copyright (c) 2021 Jed Ludlow
# License: MIT License
"""
Test normla_factor against standard tables of tolerance factors
as published in ISO 16269-6:2014 Annex F.
A sampling of values from the tables is included here for brevity.
"""
import numpy as np
import toleranceinterval.twoside as ts
import unittest
def decimal_ceil(x, places):
"""
Apply ceiling function at a decimal place.
The tables of tolerance factors in ISO 16269-6:2014 provide
the tolerance factors to a certain number of decimal places. The values
at that final decimal place reflect the application of the ceiling function
at that decimal place.
"""
x *= 10 ** places
x = np.ceil(x)
x /= 10 ** places
return x
class BaseTestIso:
class TestIsoTableF(unittest.TestCase):
def test_tolerance_factor(self):
for row_idx, row in enumerate(self.factor_k5):
for col_idx, k5 in enumerate(row):
k = ts.normal_factor(
self.sample_size[row_idx],
self.coverage,
self.confidence,
method='exact',
m=self.number_of_samples[col_idx])
k = decimal_ceil(k, places=4)
self.assertAlmostEqual(k, k5, places=4)
class TestIsoF1(BaseTestIso.TestIsoTableF):
coverage = 0.90
confidence = 0.90
# This is n from the table.
sample_size = np.array([
2, 8, 16, 35, 100, 300, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 2
15.5124, 6.0755, 4.5088, 3.8875, 3.5544,
3.3461, 3.2032, 3.0989, 3.0193, 2.9565,
# n = 8
2.7542, 2.3600, 2.2244, 2.1530, 2.1081,
2.0769, 2.0539, 2.0361, 2.0220, 2.0104,
# n = 16
2.2537, 2.0574, 1.9833, 1.9426, 1.9163,
1.8977, 1.8837, 1.8727, 1.8638, 1.8564,
# n = 35
1.9906, 1.8843, 1.8417, 1.8176, 1.8017,
1.7902, 1.7815, 1.7747, 1.7690, 1.7643,
# n = 100
1.8232, 1.7697, 1.7473, 1.7343, 1.7256,
1.7193, 1.7144, 1.7105, 1.7073, 1.7047,
# n = 300
1.7401, 1.7118, 1.6997, 1.6925, 1.6877,
1.6842, 1.6815, 1.6793, 1.6775, 1.6760,
# n = 1000
1.6947, 1.6800, 1.6736, 1.6698, 1.6672,
1.6653, 1.6639, 1.6627, 1.6617, 1.6609,
# n = infinity
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF2(BaseTestIso.TestIsoTableF):
coverage = 0.95
confidence = 0.90
# This is n from the table.
sample_size = np.array([
3, 9, 15, 30, 90, 150, 400, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 3
6.8233, 4.3320, 3.7087, 3.4207, 3.2528,
3.1420, 3.0630, 3.0038, 2.9575, 2.9205,
# n = 9
3.1323, 2.7216, 2.5773, 2.5006, 2.4521,
2.4182, 2.3931, 2.3737, 2.3581, 2.3454,
# n = 15
2.7196, 2.4718, 2.3789, 2.3280, 2.2951,
2.2719, 2.2545, 2.2408, 2.2298, 2.2206,
# n = 30
2.4166, 2.2749, 2.2187, 2.1870, 2.1662,
2.1513, 2.1399, 2.1309, 2.1236, 2.1175,
# n = 90
2.1862, 2.1182, 2.0898, 2.0733, 2.0624,
2.0544, 2.0482, 2.0433, 2.0393, 2.0360,
# n = 150
2.1276, 2.0775, 2.0563, 2.0439, 2.0356,
2.0296, 2.0249, 2.0212, 2.0181, 2.0155,
# n = 400
2.0569, 2.0282, 2.0158, 2.0085, 2.0035,
1.9999, 1.9971, 1.9949, 1.9930, 1.9915,
# n = 1000
2.0193, 2.0018, 1.9942, 1.9897, 1.9866,
1.9844, 1.9826, 1.9812, 1.9800, 1.9791,
# n = infinity
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF3(BaseTestIso.TestIsoTableF):
coverage = 0.99
confidence = 0.90
# This is n from the table.
sample_size = np.array([
4, 8, 17, 28, 100, 300, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 4
6.3722, 4.6643, 4.1701, 3.9277, 3.7814,
3.6825, 3.6108, 3.5562, 3.5131, 3.4782,
# n = 8
4.2707, 3.6541, 3.4408, 3.3281, 3.2572,
3.2078, 3.1712, 3.1428, 3.1202, 3.1016,
# n = 17
3.4741, 3.1819, 3.0708, 3.0095, 2.9698,
2.9416, 2.9204, 2.9037, 2.8902, 2.8791,
# n = 28
3.2023, 3.0062, 2.9286, 2.8850, 2.8564,
2.8358, 2.8203, 2.8080, 2.7980, 2.7896,
# n = 100
2.8548, 2.7710, 2.7358, 2.7155, 2.7018,
2.6919, 2.6843, 2.6782, 2.6732, 2.6690,
# n = 300
2.7249, 2.6806, 2.6616, 2.6504, 2.6429,
2.6374, 2.6331, 2.6297, 2.6269, 2.6245,
# n = 1000
2.6538, 2.6308, 2.6208, 2.6148, 2.6108,
2.6079, 2.6056, 2.6037, 2.6022, 2.6009,
# n = infinity
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF4(BaseTestIso.TestIsoTableF):
coverage = 0.90
confidence = 0.95
# This is n from the table.
sample_size = np.array([
2, 8, 16, 35, 150, 500, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 2
31.0923, 8.7252, 5.8380, 4.7912, 4.2571,
3.9341, 3.7179, 3.5630, 3.4468, 3.3565,
# n = 8
3.1561, 2.5818, 2.3937, 2.2974, 2.2381,
2.1978, 2.1685, 2.1463, 2.1289, 2.1149,
# n = 16
2.4486, 2.1771, 2.0777, 2.0241, 1.9899,
1.9661, 1.9483, 1.9346, 1.9237, 1.9147,
# n = 35
2.0943, 1.9515, 1.8953, 1.8638, 1.8432,
1.8285, 1.8174, 1.8087, 1.8016, 1.7957,
# n = 150
1.8260, 1.7710, 1.7478, 1.7344, 1.7254,
1.7188, 1.7137, 1.7097, 1.7064, 1.7036,
# n = 500
1.7374, 1.7098, 1.6979, 1.6908, 1.6861,
1.6826, 1.6799, 1.6777, 1.6760, 1.6744,
# n = 1000
1.7088, 1.6898, 1.6816, 1.6767, 1.6734,
1.6709, 1.6690, 1.6675, 1.6663, 1.6652,
# n = infinity
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF5(BaseTestIso.TestIsoTableF):
coverage = 0.95
confidence = 0.95
# This is n from the table.
sample_size = np.array([
5, 10, 26, 90, 200, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 5
5.0769, 3.6939, 3.2936, 3.0986, 2.9820,
2.9041, 2.8482, 2.8062, 2.7734, 2.7472,
# n = 10
3.3935, 2.8700, 2.6904, 2.5964, 2.5377,
2.4973, 2.4677, 2.4450, 2.4271, 2.4125,
# n = 26
2.6188, 2.4051, 2.3227, 2.2771, 2.2476,
2.2266, 2.2108, 2.1985, 2.1886, 2.1803,
# n = 90
2.2519, 2.1622, 2.1251, 2.1037, 2.0895,
2.0792, 2.0713, 2.0650, 2.0598, 2.0555,
# n = 200
2.1430, 2.0877, 2.0642, 2.0505, 2.0413,
2.0346, 2.0294, 2.0253, 2.0219, 2.0190,
# n = 1000
2.0362, 2.0135, 2.0037, 1.9979, 1.9939,
1.9910, 1.9888, 1.9870, 1.9855, 1.9842,
# n = infinity
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF6(BaseTestIso.TestIsoTableF):
coverage = 0.99
confidence = 0.95
# This is n from the table.
sample_size = np.array([
3, 9, 17, 35, 100, 500, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 3
12.6472, 6.8474, 5.5623, 4.9943, 4.6711,
4.4612, 4.3133, 4.2032, 4.1180, 4.0500,
# n = 9
4.6329, 3.8544, 3.5909, 3.4534, 3.3677,
3.3085, 3.2651, 3.2317, 3.2052, 3.1837,
# n = 17
3.7606, 3.3572, 3.2077, 3.1264, 3.0743,
3.0377, 3.0104, 2.9892, 2.9722, 2.9582,
# n = 35
3.2762, 3.0522, 2.9638, 2.9143, 2.8818,
2.8586, 2.8411, 2.8273, 2.8161, 2.8068,
# n = 100
2.9356, 2.8253, 2.7794, 2.7529, 2.7352,
2.7224, 2.7126, 2.7048, 2.6984, 2.6930,
# n = 500
2.7208, 2.6775, 2.6588, 2.6478, 2.6403,
2.6349, 2.6307, 2.6273, 2.6245, 2.6221,
# n = infinity
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF7(BaseTestIso.TestIsoTableF):
coverage = 0.90
confidence = 0.99
# This is n from the table.
sample_size = np.array([
4, 10, 22, 80, 200, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 4
9.4162, 4.9212, 3.9582, 3.5449, 3.3166,
3.1727, 3.0742, 3.0028, 2.9489, 2.9068,
# n = 10
3.6167, 2.8193, 2.5709, 2.4481, 2.3748,
2.3265, 2.2923, 2.2671, 2.2477, 2.2324,
# n = 22
2.5979, 2.2631, 2.1429, 2.0791, 2.0393,
2.0120, 1.9921, 1.9770, 1.9652, 1.9558,
# n = 80
2.0282, 1.9056, 1.8562, 1.8281, 1.8097,
1.7964, 1.7864, 1.7785, 1.7721, 1.7668,
# n = 200
1.8657, 1.7973, 1.7686, 1.7520, 1.7409,
1.7328, 1.7266, 1.7216, 1.7176, 1.7142,
# n = 1000
1.7359, 1.7086, 1.6967, 1.6897, 1.6850,
1.6815, 1.6788, 1.6767, 1.6749, 1.6734,
# n = infinity
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
1.6449, 1.6449, 1.6449, 1.6449, 1.6449,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF8(BaseTestIso.TestIsoTableF):
coverage = 0.95
confidence = 0.99
# This is n from the table.
sample_size = np.array([
2, 9, 17, 40, 150, 500, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 2
182.7201, 23.1159, 11.9855, 8.7010, 7.1975,
6.3481, 5.8059, 5.4311, 5.1573, 4.9489,
# n = 9
4.5810, 3.4807, 3.1443, 2.9784, 2.8793,
2.8136, 2.7670, 2.7324, 2.7057, 2.6846,
# n = 17
3.3641, 2.8501, 2.6716, 2.5784, 2.5207,
2.4814, 2.4529, 2.4314, 2.4147, 2.4013,
# n = 40
2.6836, 2.4425, 2.3498, 2.2987, 2.2658,
2.2427, 2.2254, 2.2120, 2.2013, 2.1926,
# n = 150
2.2712, 2.1740, 2.1336, 2.1103, 2.0948,
2.0835, 2.0749, 2.0681, 2.0625, 2.0578,
# n = 500
2.1175, 2.0697, 2.0492, 2.0372, 2.0291,
2.0231, 2.0185, 2.0149, 2.0118, 2.0093,
# n = infinity
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
1.9600, 1.9600, 1.9600, 1.9600, 1.9600,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
class TestIsoF9(BaseTestIso.TestIsoTableF):
coverage = 0.99
confidence = 0.99
# This is n from the table.
sample_size = np.array([
3, 7, 15, 28, 70, 200, 1000, np.inf,
])
# This is m from the table.
number_of_samples = np.arange(1, 11)
factor_k5 = np.array([
# n = 3
28.5857, 10.6204, 7.6599, 6.4888, 5.8628,
5.4728, 5.2065, 5.0131, 4.8663, 4.7512,
# n = 7
7.1908, 5.0656, 4.4559, 4.1605, 3.9847,
3.8678, 3.7844, 3.7220, 3.6736, 3.6350,
# n = 15
4.6212, 3.8478, 3.5825, 3.4441, 3.3581,
3.2992, 3.2564, 3.2238, 3.1983, 3.1777,
# n = 28
3.8042, 3.3792, 3.2209, 3.1350, 3.0801,
3.0418, 3.0135, 2.9916, 2.9742, 2.9600,
# n = 70
3.2284, 3.0179, 2.9334, 2.8857, 2.8544,
2.8319, 2.8150, 2.8016, 2.7908, 2.7818,
# n = 200
2.9215, 2.8144, 2.7695, 2.7434, 2.7260,
2.7133, 2.7036, 2.6958, 2.6894, 2.6841,
# n = 1000
2.7184, 2.6756, 2.6570, 2.6461, 2.6387,
2.6332, 2.6290, 2.6257, 2.6229, 2.6205,
# n = infinity
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
2.5759, 2.5759, 2.5759, 2.5759, 2.5759,
])
factor_k5 = factor_k5.reshape(sample_size.size, number_of_samples.size)
| 29.232184 | 79 | 0.530513 | 2,211 | 12,716 | 3.008141 | 0.321122 | 0.022553 | 0.024357 | 0.040595 | 0.381597 | 0.377086 | 0.377086 | 0.329725 | 0.329725 | 0.329725 | 0 | 0.442453 | 0.311261 | 12,716 | 434 | 80 | 29.299539 | 0.316967 | 0.120006 | 0 | 0.361446 | 0 | 0 | 0.000452 | 0 | 0 | 0 | 0 | 0 | 0.004016 | 1 | 0.008032 | false | 0 | 0.012048 | 0 | 0.285141 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.