hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
7cbe3198f6071ec0d541441f81f18f624a937b6f | 5,044 | py | Python | t2k/bin/cmttags.py | tianluyuan/pyutils | 2cd3a90dbbd3d0eec3054fb9493ca0f6e0272e50 | [
"MIT"
] | 1 | 2019-02-22T10:57:13.000Z | 2019-02-22T10:57:13.000Z | t2k/bin/cmttags.py | tianluyuan/pyutils | 2cd3a90dbbd3d0eec3054fb9493ca0f6e0272e50 | [
"MIT"
] | null | null | null | t2k/bin/cmttags.py | tianluyuan/pyutils | 2cd3a90dbbd3d0eec3054fb9493ca0f6e0272e50 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""
A script to create tags for CMT managed packages.
Call from within cmt/ directory
"""
import subprocess
import sys
import os
from optparse import OptionParser
__author__ = 'Tianlu Yuan'
__email__ = 'tianlu.yuan [at] colorado.edu'
# Ignore large external packages for now
IGNORES = ['CMT', 'EXTERN', 'GSL', 'MYSQL', 'GEANT', 'CLHEP']
# Extensions for finding src files, must satisfy unix wildcard rules
EXTENSIONS = {'cpp': ('*.[hc]', '*.[hc]xx', '*.[hc]pp', '*.cc', '*.hh'),
'python':('*.py'),
'java':('*.java')}
# Ignore these files and dirs, key specifies argument to find
# (e.g. '-iname')
PRUNE = {'iname':['*_Dict.[hc]*', '*linkdef.h']}
def check_dir():
""" Are we inside cmt/
"""
if os.path.basename(os.getcwd()) != 'cmt':
sys.exit('Not inside cmt directory!')
def check_requirements():
""" Ensure that requirements file exists in cmt dir
"""
if not os.path.isfile('requirements'):
sys.exit('No requirements file!')
def init_use_dict():
"""Returns the initial use_dict which contains the current (cwd)
package and its path. 'cmt show uses' does not include the
package itself.
"""
# Must call os.path.dirname because the cwd should be inside a cmt
# directory
return {'this':os.path.dirname(os.getcwd())}
def parse_uses():
""" Returns a dict of used packages and their root dir paths.
e.g. {ROOT:/path/to/cmt/installed/ROOT/vXrY}
"""
check_dir()
check_requirements()
proc = subprocess.Popen(['cmt', 'show', 'uses'],
stdout=subprocess.PIPE)
use_dict = init_use_dict()
for line in iter(proc.stdout.readline, ''):
tokens = line.split()
# ignore lines that start with '#'
if line[0] != '#' and tokens[1] not in IGNORES:
basepath = tokens[-1].strip('()')
# highland and psyche do not strictly follow CMT path
# organization. They have subpackages within a master, so
# we need to take that into account
relpath_list = [master for master in tokens[3:-1]]
relpath_list.extend([tokens[1], tokens[2]])
use_dict[tokens[1]] = os.path.join(basepath, *relpath_list)
return use_dict
def get_exts(opts):
if opts.python:
return EXTENSIONS['python']
elif opts.java:
return EXTENSIONS['java']
else:
return EXTENSIONS['cpp']
def build_find_args(exts):
""" ext is a list of file extensions corresponding to the files we want
to search. This will return a list of arguments that can be passed to `find`
"""
find_args = []
for a_ext in exts:
# -o for "or"
find_args.extend(['-o', '-iname'])
find_args.append('{0}'.format(a_ext))
# replace first '-o' with '( for grouping matches
find_args[0] = '('
# append parens for grouping negation
find_args.extend([')', '('])
# Add prune files
for match_type in PRUNE:
for aprune in PRUNE[match_type]:
find_args.append('-not')
find_args.append('-'+match_type)
find_args.append('{0}'.format(aprune))
find_args.append(')')
return find_args
def build_find_cmd(opts, paths):
""" Builds teh cmd file using ctags. Returns cmd based on the following
template: 'find {0} -type f {1} | etags -'
"""
find_args = build_find_args(get_exts(opts))
return ['find']+paths+['-type', 'f']+find_args
def build_tags_cmd():
return ['etags', '-']
def main():
""" Uses ctags to generate TAGS file in cmt directory based on cmt show uses
"""
parser = OptionParser()
parser.add_option('--cpp',
dest='cpp',
action='store_true',
default=False,
help='tag only c/cpp files (default)')
parser.add_option('--python',
dest='python',
action='store_true',
default=False,
help='tag only python files')
parser.add_option('--java',
dest='java',
action='store_true',
default=False,
help='tag only java files')
parser.add_option('-n',
dest='dry_run',
action='store_true',
default=False,
help='dry run')
(opts, args) = parser.parse_args()
# get the cmt show uses dictionary of programs and paths
use_dict = parse_uses()
# build the commands
find_cmd = build_find_cmd(opts, list(use_dict.itervalues()))
tags_cmd = build_tags_cmd()
print 'Creating TAGS file based on dependencies:'
print use_dict
if not opts.dry_run:
find_proc = subprocess.Popen(find_cmd, stdout=subprocess.PIPE)
tags_proc = subprocess.Popen(tags_cmd, stdin=find_proc.stdout)
tags_proc.communicate()
if __name__ == '__main__':
main()
| 28.822857 | 80 | 0.585052 | 639 | 5,044 | 4.482003 | 0.353678 | 0.039106 | 0.024441 | 0.030726 | 0.076466 | 0.050628 | 0.039804 | 0.039804 | 0 | 0 | 0 | 0.003606 | 0.285289 | 5,044 | 174 | 81 | 28.988506 | 0.790846 | 0.126289 | 0 | 0.086022 | 0 | 0 | 0.136083 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.043011 | null | null | 0.021505 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cbea5a7d278dcb466c16a1d3e035b7e14f3c77c | 63,630 | py | Python | salt/daemons/masterapi.py | rickh563/salt | 02822d6466c47d0daafd6e98b4e767a396b0ed48 | [
"Apache-2.0"
] | null | null | null | salt/daemons/masterapi.py | rickh563/salt | 02822d6466c47d0daafd6e98b4e767a396b0ed48 | [
"Apache-2.0"
] | null | null | null | salt/daemons/masterapi.py | rickh563/salt | 02822d6466c47d0daafd6e98b4e767a396b0ed48 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
'''
This module contains all of the routines needed to set up a master server, this
involves preparing the three listeners and the workers needed by the master.
'''
from __future__ import absolute_import
# Import python libs
import fnmatch
import logging
import os
import re
import time
import stat
import tempfile
# Import salt libs
import salt.crypt
import salt.utils
import salt.client
import salt.payload
import salt.pillar
import salt.state
import salt.runner
import salt.auth
import salt.wheel
import salt.minion
import salt.search
import salt.key
import salt.fileserver
import salt.utils.atomicfile
import salt.utils.event
import salt.utils.verify
import salt.utils.minions
import salt.utils.gzip_util
import salt.utils.jid
from salt.pillar import git_pillar
from salt.utils.event import tagify
from salt.exceptions import SaltMasterError
# Import 3rd-party libs
import salt.ext.six as six
try:
import pwd
HAS_PWD = True
except ImportError:
# pwd is not available on windows
HAS_PWD = False
log = logging.getLogger(__name__)
# Things to do in lower layers:
# only accept valid minion ids
def init_git_pillar(opts):
'''
Clear out the ext pillar caches, used when the master starts
'''
pillargitfs = []
for opts_dict in [x for x in opts.get('ext_pillar', [])]:
if 'git' in opts_dict:
try:
import git
except ImportError:
return pillargitfs
parts = opts_dict['git'].strip().split()
try:
br = parts[0]
loc = parts[1]
except IndexError:
log.critical(
'Unable to extract external pillar data: {0}'
.format(opts_dict['git'])
)
else:
pillargitfs.append(
git_pillar.GitPillar(
br,
loc,
opts
)
)
return pillargitfs
def clean_fsbackend(opts):
'''
Clean out the old fileserver backends
'''
# Clear remote fileserver backend caches so they get recreated
for backend in ('git', 'hg', 'svn'):
if backend in opts['fileserver_backend']:
env_cache = os.path.join(
opts['cachedir'],
'{0}fs'.format(backend),
'envs.p'
)
if os.path.isfile(env_cache):
log.debug('Clearing {0}fs env cache'.format(backend))
try:
os.remove(env_cache)
except OSError as exc:
log.critical(
'Unable to clear env cache file {0}: {1}'
.format(env_cache, exc)
)
file_lists_dir = os.path.join(
opts['cachedir'],
'file_lists',
'{0}fs'.format(backend)
)
try:
file_lists_caches = os.listdir(file_lists_dir)
except OSError:
continue
for file_lists_cache in fnmatch.filter(file_lists_caches, '*.p'):
cache_file = os.path.join(file_lists_dir, file_lists_cache)
try:
os.remove(cache_file)
except OSError as exc:
log.critical(
'Unable to file_lists cache file {0}: {1}'
.format(cache_file, exc)
)
def clean_expired_tokens(opts):
'''
Clean expired tokens from the master
'''
serializer = salt.payload.Serial(opts)
for (dirpath, dirnames, filenames) in os.walk(opts['token_dir']):
for token in filenames:
token_path = os.path.join(dirpath, token)
with salt.utils.fopen(token_path) as token_file:
token_data = serializer.loads(token_file.read())
if 'expire' not in token_data or token_data.get('expire', 0) < time.time():
try:
os.remove(token_path)
except (IOError, OSError):
pass
def clean_pub_auth(opts):
try:
auth_cache = os.path.join(opts['cachedir'], 'publish_auth')
if not os.path.exists(auth_cache):
return
else:
for (dirpath, dirnames, filenames) in os.walk(auth_cache):
for auth_file in filenames:
auth_file_path = os.path.join(dirpath, auth_file)
if not os.path.isfile(auth_file_path):
continue
if os.path.getmtime(auth_file_path) - time.time() > opts['keep_jobs']:
os.remove(auth_file_path)
except (IOError, OSError):
log.error('Unable to delete pub auth file')
def clean_old_jobs(opts):
'''
Clean out the old jobs from the job cache
'''
# TODO: better way to not require creating the masterminion every time?
mminion = salt.minion.MasterMinion(
opts,
states=False,
rend=False,
)
# If the master job cache has a clean_old_jobs, call it
fstr = '{0}.clean_old_jobs'.format(opts['master_job_cache'])
if fstr in mminion.returners:
mminion.returners[fstr]()
def access_keys(opts):
'''
A key needs to be placed in the filesystem with permissions 0400 so
clients are required to run as root.
'''
users = []
keys = {}
acl_users = set(opts['client_acl'].keys())
if opts.get('user'):
acl_users.add(opts['user'])
acl_users.add(salt.utils.get_user())
if HAS_PWD:
for user in pwd.getpwall():
users.append(user.pw_name)
for user in acl_users:
log.info(
'Preparing the {0} key for local communication'.format(
user
)
)
if HAS_PWD:
if user not in users:
try:
user = pwd.getpwnam(user).pw_name
except KeyError:
log.error('ACL user {0} is not available'.format(user))
continue
keyfile = os.path.join(
opts['cachedir'], '.{0}_key'.format(user)
)
if os.path.exists(keyfile):
log.debug('Removing stale keyfile: {0}'.format(keyfile))
os.unlink(keyfile)
key = salt.crypt.Crypticle.generate_key_string()
cumask = os.umask(191)
with salt.utils.fopen(keyfile, 'w+') as fp_:
fp_.write(key)
os.umask(cumask)
# 600 octal: Read and write access to the owner only.
# Write access is necessary since on subsequent runs, if the file
# exists, it needs to be written to again. Windows enforces this.
os.chmod(keyfile, 0o600)
if HAS_PWD:
try:
os.chown(keyfile, pwd.getpwnam(user).pw_uid, -1)
except OSError:
# The master is not being run as root and can therefore not
# chown the key file
pass
keys[user] = key
return keys
def fileserver_update(fileserver):
'''
Update the fileserver backends, requires that a built fileserver object
be passed in
'''
try:
if not fileserver.servers:
log.error(
'No fileservers loaded, the master will not be able to '
'serve files to minions'
)
raise SaltMasterError('No fileserver backends available')
fileserver.update()
except Exception as exc:
log.error(
'Exception {0} occurred in file server update'.format(exc),
exc_info_on_loglevel=logging.DEBUG
)
class AutoKey(object):
'''
Implement the methods to run auto key acceptance and rejection
'''
def __init__(self, opts):
self.opts = opts
def check_permissions(self, filename):
'''
Check if the specified filename has correct permissions
'''
if salt.utils.is_windows():
return True
# After we've ascertained we're not on windows
try:
user = self.opts['user']
pwnam = pwd.getpwnam(user)
uid = pwnam[2]
gid = pwnam[3]
groups = salt.utils.get_gid_list(user, include_default=False)
except KeyError:
log.error(
'Failed to determine groups for user {0}. The user is not '
'available.\n'.format(
user
)
)
return False
fmode = os.stat(filename)
if os.getuid() == 0:
if fmode.st_uid == uid or fmode.st_gid != gid:
return True
elif self.opts.get('permissive_pki_access', False) \
and fmode.st_gid in groups:
return True
else:
if stat.S_IWOTH & fmode.st_mode:
# don't allow others to write to the file
return False
# check group flags
if self.opts.get('permissive_pki_access', False) and stat.S_IWGRP & fmode.st_mode:
return True
elif stat.S_IWGRP & fmode.st_mode:
return False
# check if writable by group or other
if not (stat.S_IWGRP & fmode.st_mode or
stat.S_IWOTH & fmode.st_mode):
return True
return False
def check_signing_file(self, keyid, signing_file):
'''
Check a keyid for membership in a signing file
'''
if not signing_file or not os.path.exists(signing_file):
return False
if not self.check_permissions(signing_file):
message = 'Wrong permissions for {0}, ignoring content'
log.warn(message.format(signing_file))
return False
with salt.utils.fopen(signing_file, 'r') as fp_:
for line in fp_:
line = line.strip()
if line.startswith('#'):
continue
else:
if salt.utils.expr_match(keyid, line):
return True
return False
def check_autosign_dir(self, keyid):
'''
Check a keyid for membership in a autosign directory.
'''
autosign_dir = os.path.join(self.opts['pki_dir'], 'minions_autosign')
# cleanup expired files
expire_minutes = self.opts.get('autosign_expire_minutes', 10)
if expire_minutes > 0:
min_time = time.time() - (60 * int(expire_minutes))
for root, dirs, filenames in os.walk(autosign_dir):
for f in filenames:
stub_file = os.path.join(autosign_dir, f)
mtime = os.path.getmtime(stub_file)
if mtime < min_time:
log.warn('Autosign keyid expired {0}'.format(stub_file))
os.remove(stub_file)
stub_file = os.path.join(autosign_dir, keyid)
if not os.path.exists(stub_file):
return False
os.remove(stub_file)
return True
def check_autoreject(self, keyid):
'''
Checks if the specified keyid should automatically be rejected.
'''
return self.check_signing_file(
keyid,
self.opts.get('autoreject_file', None)
)
def check_autosign(self, keyid):
'''
Checks if the specified keyid should automatically be signed.
'''
if self.opts['auto_accept']:
return True
if self.check_signing_file(keyid, self.opts.get('autosign_file', None)):
return True
if self.check_autosign_dir(keyid):
return True
return False
class RemoteFuncs(object):
'''
Funcitons made available to minions, this class includes the raw routines
post validation that make up the minion access to the master
'''
def __init__(self, opts):
self.opts = opts
self.event = salt.utils.event.get_event(
'master',
self.opts['sock_dir'],
self.opts['transport'],
opts=self.opts,
listen=False)
self.serial = salt.payload.Serial(opts)
self.ckminions = salt.utils.minions.CkMinions(opts)
# Create the tops dict for loading external top data
self.tops = salt.loader.tops(self.opts)
# Make a client
self.local = salt.client.get_local_client(mopts=self.opts)
# Create the master minion to access the external job cache
self.mminion = salt.minion.MasterMinion(
self.opts,
states=False,
rend=False)
self.__setup_fileserver()
def __setup_fileserver(self):
'''
Set the local file objects from the file server interface
'''
fs_ = salt.fileserver.Fileserver(self.opts)
self._serve_file = fs_.serve_file
self._file_hash = fs_.file_hash
self._file_list = fs_.file_list
self._file_list_emptydirs = fs_.file_list_emptydirs
self._dir_list = fs_.dir_list
self._symlink_list = fs_.symlink_list
self._file_envs = fs_.envs
def __verify_minion_publish(self, load):
'''
Verify that the passed information authorized a minion to execute
'''
# Verify that the load is valid
if 'peer' not in self.opts:
return False
if not isinstance(self.opts['peer'], dict):
return False
if any(key not in load for key in ('fun', 'arg', 'tgt', 'ret', 'id')):
return False
# If the command will make a recursive publish don't run
if re.match('publish.*', load['fun']):
return False
# Check the permissions for this minion
perms = []
for match in self.opts['peer']:
if re.match(match, load['id']):
# This is the list of funcs/modules!
if isinstance(self.opts['peer'][match], list):
perms.extend(self.opts['peer'][match])
if ',' in load['fun']:
# 'arg': [['cat', '/proc/cpuinfo'], [], ['foo']]
load['fun'] = load['fun'].split(',')
arg_ = []
for arg in load['arg']:
arg_.append(arg.split())
load['arg'] = arg_
good = self.ckminions.auth_check(
perms,
load['fun'],
load['tgt'],
load.get('tgt_type', 'glob'),
publish_validate=True)
if not good:
return False
return True
def _master_opts(self, load):
'''
Return the master options to the minion
'''
mopts = {}
file_roots = {}
envs = self._file_envs()
for saltenv in envs:
if saltenv not in file_roots:
file_roots[saltenv] = []
mopts['file_roots'] = file_roots
if load.get('env_only'):
return mopts
mopts['renderer'] = self.opts['renderer']
mopts['failhard'] = self.opts['failhard']
mopts['state_top'] = self.opts['state_top']
mopts['nodegroups'] = self.opts['nodegroups']
mopts['state_auto_order'] = self.opts['state_auto_order']
mopts['state_events'] = self.opts['state_events']
mopts['state_aggregate'] = self.opts['state_aggregate']
mopts['jinja_lstrip_blocks'] = self.opts['jinja_lstrip_blocks']
mopts['jinja_trim_blocks'] = self.opts['jinja_trim_blocks']
return mopts
def _ext_nodes(self, load, skip_verify=False):
'''
Return the results from an external node classifier if one is
specified
'''
if not skip_verify:
if 'id' not in load:
log.error('Received call for external nodes without an id')
return {}
if not salt.utils.verify.valid_id(self.opts, load['id']):
return {}
# Evaluate all configured master_tops interfaces
opts = {}
grains = {}
ret = {}
if 'opts' in load:
opts = load['opts']
if 'grains' in load['opts']:
grains = load['opts']['grains']
for fun in self.tops:
if fun not in self.opts.get('master_tops', {}):
continue
try:
ret.update(self.tops[fun](opts=opts, grains=grains))
except Exception as exc:
# If anything happens in the top generation, log it and move on
log.error(
'Top function {0} failed with error {1} for minion '
'{2}'.format(
fun, exc, load['id']
)
)
return ret
def _mine_get(self, load, skip_verify=False):
'''
Gathers the data from the specified minions' mine
'''
if not skip_verify:
if any(key not in load for key in ('id', 'tgt', 'fun')):
return {}
if 'mine_get' in self.opts:
# If master side acl defined.
if not isinstance(self.opts['mine_get'], dict):
return {}
perms = set()
for match in self.opts['mine_get']:
if re.match(match, load['id']):
if isinstance(self.opts['mine_get'][match], list):
perms.update(self.opts['mine_get'][match])
if not any(re.match(perm, load['fun']) for perm in perms):
return {}
ret = {}
if not salt.utils.verify.valid_id(self.opts, load['id']):
return ret
match_type = load.get('expr_form', 'glob')
if match_type.lower() == 'pillar':
match_type = 'pillar_exact'
if match_type.lower() == 'compound':
match_type = 'compound_pillar_exact'
checker = salt.utils.minions.CkMinions(self.opts)
minions = checker.check_minions(
load['tgt'],
match_type,
greedy=False
)
for minion in minions:
mine = os.path.join(
self.opts['cachedir'],
'minions',
minion,
'mine.p')
try:
with salt.utils.fopen(mine, 'rb') as fp_:
fdata = self.serial.load(fp_).get(load['fun'])
if fdata:
ret[minion] = fdata
except Exception:
continue
return ret
def _mine(self, load, skip_verify=False):
'''
Return the mine data
'''
if not skip_verify:
if 'id' not in load or 'data' not in load:
return False
if self.opts.get('minion_data_cache', False) or self.opts.get('enforce_mine_cache', False):
cdir = os.path.join(self.opts['cachedir'], 'minions', load['id'])
if not os.path.isdir(cdir):
os.makedirs(cdir)
datap = os.path.join(cdir, 'mine.p')
if not load.get('clear', False):
if os.path.isfile(datap):
with salt.utils.fopen(datap, 'rb') as fp_:
new = self.serial.load(fp_)
if isinstance(new, dict):
new.update(load['data'])
load['data'] = new
with salt.utils.fopen(datap, 'w+b') as fp_:
fp_.write(self.serial.dumps(load['data']))
return True
def _mine_delete(self, load):
'''
Allow the minion to delete a specific function from its own mine
'''
if 'id' not in load or 'fun' not in load:
return False
if self.opts.get('minion_data_cache', False) or self.opts.get('enforce_mine_cache', False):
cdir = os.path.join(self.opts['cachedir'], 'minions', load['id'])
if not os.path.isdir(cdir):
return False
datap = os.path.join(cdir, 'mine.p')
if os.path.isfile(datap):
try:
with salt.utils.fopen(datap, 'rb') as fp_:
mine_data = self.serial.load(fp_)
if isinstance(mine_data, dict):
if mine_data.pop(load['fun'], False):
with salt.utils.fopen(datap, 'w+b') as fp_:
fp_.write(self.serial.dumps(mine_data))
except OSError:
return False
return True
def _mine_flush(self, load, skip_verify=False):
'''
Allow the minion to delete all of its own mine contents
'''
if not skip_verify and 'id' not in load:
return False
if self.opts.get('minion_data_cache', False) or self.opts.get('enforce_mine_cache', False):
cdir = os.path.join(self.opts['cachedir'], 'minions', load['id'])
if not os.path.isdir(cdir):
return False
datap = os.path.join(cdir, 'mine.p')
if os.path.isfile(datap):
try:
os.remove(datap)
except OSError:
return False
return True
def _file_recv(self, load):
'''
Allows minions to send files to the master, files are sent to the
master file cache
'''
if any(key not in load for key in ('id', 'path', 'loc')):
return False
if not self.opts['file_recv'] or os.path.isabs(load['path']):
return False
if os.path.isabs(load['path']) or '../' in load['path']:
# Can overwrite master files!!
return False
if not salt.utils.verify.valid_id(self.opts, load['id']):
return False
file_recv_max_size = 1024*1024 * self.opts['file_recv_max_size']
if 'loc' in load and load['loc'] < 0:
log.error('Invalid file pointer: load[loc] < 0')
return False
if len(load['data']) + load.get('loc', 0) > file_recv_max_size:
log.error(
'Exceeding file_recv_max_size limit: {0}'.format(
file_recv_max_size
)
)
return False
# Normalize Windows paths
normpath = load['path']
if ':' in normpath:
# make sure double backslashes are normalized
normpath = normpath.replace('\\', '/')
normpath = os.path.normpath(normpath)
cpath = os.path.join(
self.opts['cachedir'],
'minions',
load['id'],
'files',
normpath)
cdir = os.path.dirname(cpath)
if not os.path.isdir(cdir):
try:
os.makedirs(cdir)
except os.error:
pass
if os.path.isfile(cpath) and load['loc'] != 0:
mode = 'ab'
else:
mode = 'wb'
with salt.utils.fopen(cpath, mode) as fp_:
if load['loc']:
fp_.seek(load['loc'])
fp_.write(load['data'])
return True
def _pillar(self, load):
'''
Return the pillar data for the minion
'''
if any(key not in load for key in ('id', 'grains')):
return False
pillar = salt.pillar.Pillar(
self.opts,
load['grains'],
load['id'],
load.get('saltenv', load.get('env')),
load.get('ext'),
self.mminion.functions,
pillar=load.get('pillar_override', {}))
pillar_dirs = {}
data = pillar.compile_pillar(pillar_dirs=pillar_dirs)
if self.opts.get('minion_data_cache', False):
cdir = os.path.join(self.opts['cachedir'], 'minions', load['id'])
if not os.path.isdir(cdir):
os.makedirs(cdir)
datap = os.path.join(cdir, 'data.p')
tmpfh, tmpfname = tempfile.mkstemp(dir=cdir)
os.close(tmpfh)
with salt.utils.fopen(tmpfname, 'w+b') as fp_:
fp_.write(
self.serial.dumps(
{'grains': load['grains'],
'pillar': data})
)
# On Windows, os.rename will fail if the destination file exists.
salt.utils.atomicfile.atomic_rename(tmpfname, datap)
return data
def _minion_event(self, load):
'''
Receive an event from the minion and fire it on the master event
interface
'''
if 'id' not in load:
return False
if 'events' not in load and ('tag' not in load or 'data' not in load):
return False
if 'events' in load:
for event in load['events']:
self.event.fire_event(event, event['tag']) # old dup event
if load.get('pretag') is not None:
if 'data' in event:
self.event.fire_event(event['data'], tagify(event['tag'], base=load['pretag']))
else:
self.event.fire_event(event, tagify(event['tag'], base=load['pretag']))
else:
tag = load['tag']
self.event.fire_event(load, tag)
return True
def _return(self, load):
'''
Handle the return data sent from the minions
'''
# Generate EndTime
endtime = salt.utils.jid.jid_to_time(salt.utils.jid.gen_jid())
# If the return data is invalid, just ignore it
if any(key not in load for key in ('return', 'jid', 'id')):
return False
if load['jid'] == 'req':
# The minion is returning a standalone job, request a jobid
prep_fstr = '{0}.prep_jid'.format(self.opts['master_job_cache'])
load['jid'] = self.mminion.returners[prep_fstr](nocache=load.get('nocache', False))
# save the load, since we don't have it
saveload_fstr = '{0}.save_load'.format(self.opts['master_job_cache'])
self.mminion.returners[saveload_fstr](load['jid'], load)
log.info('Got return from {id} for job {jid}'.format(**load))
self.event.fire_event(load, load['jid']) # old dup event
self.event.fire_event(load, tagify([load['jid'], 'ret', load['id']], 'job'))
self.event.fire_ret_load(load)
if not self.opts['job_cache'] or self.opts.get('ext_job_cache'):
return
fstr = '{0}.update_endtime'.format(self.opts['master_job_cache'])
if (self.opts.get('job_cache_store_endtime')
and fstr in self.mminion.returners):
self.mminion.returners[fstr](load['jid'], endtime)
fstr = '{0}.returner'.format(self.opts['master_job_cache'])
self.mminion.returners[fstr](load)
def _syndic_return(self, load):
'''
Receive a syndic minion return and format it to look like returns from
individual minions.
'''
# Verify the load
if any(key not in load for key in ('return', 'jid', 'id')):
return None
# if we have a load, save it
if 'load' in load:
fstr = '{0}.save_load'.format(self.opts['master_job_cache'])
self.mminion.returners[fstr](load['jid'], load['load'])
# Format individual return loads
for key, item in six.iteritems(load['return']):
ret = {'jid': load['jid'],
'id': key,
'return': item}
if 'out' in load:
ret['out'] = load['out']
self._return(ret)
def minion_runner(self, load):
'''
Execute a runner from a minion, return the runner's function data
'''
if 'peer_run' not in self.opts:
return {}
if not isinstance(self.opts['peer_run'], dict):
return {}
if any(key not in load for key in ('fun', 'arg', 'id')):
return {}
perms = set()
for match in self.opts['peer_run']:
if re.match(match, load['id']):
# This is the list of funcs/modules!
if isinstance(self.opts['peer_run'][match], list):
perms.update(self.opts['peer_run'][match])
good = False
for perm in perms:
if re.match(perm, load['fun']):
good = True
if not good:
# The minion is not who it says it is!
# We don't want to listen to it!
log.warn(
'Minion id {0} is not who it says it is!'.format(
load['id']
)
)
return {}
# Prepare the runner object
opts = {'fun': load['fun'],
'arg': load['arg'],
'id': load['id'],
'doc': False,
'conf_file': self.opts['conf_file']}
opts.update(self.opts)
runner = salt.runner.Runner(opts)
return runner.run()
def pub_ret(self, load, skip_verify=False):
'''
Request the return data from a specific jid, only allowed
if the requesting minion also initialted the execution.
'''
if not skip_verify and any(key not in load for key in ('jid', 'id')):
return {}
else:
auth_cache = os.path.join(
self.opts['cachedir'],
'publish_auth')
if not os.path.isdir(auth_cache):
os.makedirs(auth_cache)
jid_fn = os.path.join(auth_cache, load['jid'])
with salt.utils.fopen(jid_fn, 'r') as fp_:
if not load['id'] == fp_.read():
return {}
return self.local.get_cache_returns(load['jid'])
def minion_pub(self, load):
'''
Publish a command initiated from a minion, this method executes minion
restrictions so that the minion publication will only work if it is
enabled in the config.
The configuration on the master allows minions to be matched to
salt functions, so the minions can only publish allowed salt functions
The config will look like this:
peer:
.*:
- .*
This configuration will enable all minions to execute all commands.
peer:
foo.example.com:
- test.*
This configuration will only allow the minion foo.example.com to
execute commands from the test module
'''
if not self.__verify_minion_publish(load):
return {}
# Set up the publication payload
pub_load = {
'fun': load['fun'],
'arg': load['arg'],
'expr_form': load.get('tgt_type', 'glob'),
'tgt': load['tgt'],
'ret': load['ret'],
'id': load['id'],
}
if 'tgt_type' in load:
if load['tgt_type'].startswith('node'):
if load['tgt'] in self.opts['nodegroups']:
pub_load['tgt'] = self.opts['nodegroups'][load['tgt']]
pub_load['expr_form_type'] = 'compound'
pub_load['expr_form'] = load['tgt_type']
else:
return {}
else:
pub_load['expr_form'] = load['tgt_type']
ret = {}
ret['jid'] = self.local.cmd_async(**pub_load)
ret['minions'] = self.ckminions.check_minions(
load['tgt'],
pub_load['expr_form'])
auth_cache = os.path.join(
self.opts['cachedir'],
'publish_auth')
if not os.path.isdir(auth_cache):
os.makedirs(auth_cache)
jid_fn = os.path.join(auth_cache, str(ret['jid']))
with salt.utils.fopen(jid_fn, 'w+') as fp_:
fp_.write(load['id'])
return ret
def minion_publish(self, load):
'''
Publish a command initiated from a minion, this method executes minion
restrictions so that the minion publication will only work if it is
enabled in the config.
The configuration on the master allows minions to be matched to
salt functions, so the minions can only publish allowed salt functions
The config will look like this:
peer:
.*:
- .*
This configuration will enable all minions to execute all commands.
peer:
foo.example.com:
- test.*
This configuration will only allow the minion foo.example.com to
execute commands from the test module
'''
if not self.__verify_minion_publish(load):
return {}
# Set up the publication payload
pub_load = {
'fun': load['fun'],
'arg': load['arg'],
'expr_form': load.get('tgt_type', 'glob'),
'tgt': load['tgt'],
'ret': load['ret'],
'id': load['id'],
}
if 'tmo' in load:
try:
pub_load['timeout'] = int(load['tmo'])
except ValueError:
msg = 'Failed to parse timeout value: {0}'.format(
load['tmo'])
log.warn(msg)
return {}
if 'timeout' in load:
try:
pub_load['timeout'] = int(load['timeout'])
except ValueError:
msg = 'Failed to parse timeout value: {0}'.format(
load['timeout'])
log.warn(msg)
return {}
if 'tgt_type' in load:
if load['tgt_type'].startswith('node'):
if load['tgt'] in self.opts['nodegroups']:
pub_load['tgt'] = self.opts['nodegroups'][load['tgt']]
pub_load['expr_form_type'] = 'compound'
else:
return {}
else:
pub_load['expr_form'] = load['tgt_type']
pub_load['raw'] = True
ret = {}
for minion in self.local.cmd_iter(**pub_load):
if load.get('form', '') == 'full':
data = minion
if 'jid' in minion:
ret['__jid__'] = minion['jid']
data['ret'] = data.pop('return')
ret[minion['id']] = data
else:
ret[minion['id']] = minion['return']
if 'jid' in minion:
ret['__jid__'] = minion['jid']
for key, val in six.iteritems(self.local.get_cache_returns(ret['__jid__'])):
if key not in ret:
ret[key] = val
if load.get('form', '') != 'full':
ret.pop('__jid__')
return ret
def revoke_auth(self, load):
'''
Allow a minion to request revocation of its own key
'''
if 'id' not in load:
return False
keyapi = salt.key.Key(self.opts)
keyapi.delete_key(load['id'],
preserve_minions=load.get('preserve_minion_cache',
False))
return True
class LocalFuncs(object):
'''
Set up methods for use only from the local system
'''
# The ClearFuncs object encapsulates the functions that can be executed in
# the clear:
# publish (The publish from the LocalClient)
# _auth
def __init__(self, opts, key):
self.opts = opts
self.serial = salt.payload.Serial(opts)
self.key = key
# Create the event manager
self.event = salt.utils.event.get_event(
'master',
self.opts['sock_dir'],
self.opts['transport'],
opts=self.opts,
listen=False)
# Make a client
self.local = salt.client.get_local_client(mopts=self.opts)
# Make an minion checker object
self.ckminions = salt.utils.minions.CkMinions(opts)
# Make an Auth object
self.loadauth = salt.auth.LoadAuth(opts)
# Stand up the master Minion to access returner data
self.mminion = salt.minion.MasterMinion(
self.opts,
states=False,
rend=False)
# Make a wheel object
self.wheel_ = salt.wheel.Wheel(opts)
def runner(self, load):
'''
Send a master control function back to the runner system
'''
if 'token' in load:
try:
token = self.loadauth.get_tok(load['token'])
except Exception as exc:
msg = 'Exception occurred when generating auth token: {0}'.format(
exc)
log.error(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
if not token:
msg = 'Authentication failure of type "token" occurred.'
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
if token['eauth'] not in self.opts['external_auth']:
msg = 'Authentication failure of type "token" occurred.'
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
good = self.ckminions.runner_check(
self.opts['external_auth'][token['eauth']][token['name']]
if token['name'] in self.opts['external_auth'][token['eauth']]
else self.opts['external_auth'][token['eauth']]['*'],
load['fun'])
if not good:
msg = ('Authentication failure of type "token" occurred for '
'user {0}.').format(token['name'])
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
try:
fun = load.pop('fun')
runner_client = salt.runner.RunnerClient(self.opts)
return runner_client.async(
fun,
load.get('kwarg', {}),
token['name'])
except Exception as exc:
log.error('Exception occurred while '
'introspecting {0}: {1}'.format(fun, exc))
return dict(error=dict(name=exc.__class__.__name__,
args=exc.args,
message=str(exc)))
if 'eauth' not in load:
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
if load['eauth'] not in self.opts['external_auth']:
# The eauth system is not enabled, fail
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
try:
name = self.loadauth.load_name(load)
if not (name in self.opts['external_auth'][load['eauth']]) | ('*' in self.opts['external_auth'][load['eauth']]):
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
if not self.loadauth.time_auth(load):
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
good = self.ckminions.runner_check(
self.opts['external_auth'][load['eauth']][name] if name in self.opts['external_auth'][load['eauth']] else self.opts['external_auth'][load['eauth']]['*'],
load['fun'])
if not good:
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
try:
fun = load.pop('fun')
runner_client = salt.runner.RunnerClient(self.opts)
return runner_client.async(fun,
load.get('kwarg', {}),
load.get('username', 'UNKNOWN'))
except Exception as exc:
log.error('Exception occurred while '
'introspecting {0}: {1}'.format(fun, exc))
return dict(error=dict(name=exc.__class__.__name__,
args=exc.args,
message=str(exc)))
except Exception as exc:
log.error(
'Exception occurred in the runner system: {0}'.format(exc)
)
return dict(error=dict(name=exc.__class__.__name__,
args=exc.args,
message=str(exc)))
def wheel(self, load):
'''
Send a master control function back to the wheel system
'''
# All wheel ops pass through eauth
if 'token' in load:
try:
token = self.loadauth.get_tok(load['token'])
except Exception as exc:
msg = 'Exception occurred when generating auth token: {0}'.format(
exc)
log.error(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
if not token:
msg = 'Authentication failure of type "token" occurred.'
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
if token['eauth'] not in self.opts['external_auth']:
msg = 'Authentication failure of type "token" occurred.'
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
good = self.ckminions.wheel_check(
self.opts['external_auth'][token['eauth']][token['name']]
if token['name'] in self.opts['external_auth'][token['eauth']]
else self.opts['external_auth'][token['eauth']]['*'],
load['fun'])
if not good:
msg = ('Authentication failure of type "token" occurred for '
'user {0}.').format(token['name'])
log.warning(msg)
return dict(error=dict(name='TokenAuthenticationError',
message=msg))
jid = salt.utils.jid.gen_jid()
fun = load.pop('fun')
tag = tagify(jid, prefix='wheel')
data = {'fun': "wheel.{0}".format(fun),
'jid': jid,
'tag': tag,
'user': token['name']}
try:
self.event.fire_event(data, tagify([jid, 'new'], 'wheel'))
ret = self.wheel_.call_func(fun, **load)
data['return'] = ret
data['success'] = True
self.event.fire_event(data, tagify([jid, 'ret'], 'wheel'))
return {'tag': tag,
'data': data}
except Exception as exc:
log.error(exc)
log.error('Exception occurred while '
'introspecting {0}: {1}'.format(fun, exc))
data['return'] = 'Exception occurred in wheel {0}: {1}: {2}'.format(
fun,
exc.__class__.__name__,
exc,
)
data['success'] = False
self.event.fire_event(data, tagify([jid, 'ret'], 'wheel'))
return {'tag': tag,
'data': data}
if 'eauth' not in load:
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
if load['eauth'] not in self.opts['external_auth']:
# The eauth system is not enabled, fail
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
try:
name = self.loadauth.load_name(load)
if not ((name in self.opts['external_auth'][load['eauth']]) |
('*' in self.opts['external_auth'][load['eauth']])):
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
if not self.loadauth.time_auth(load):
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
good = self.ckminions.wheel_check(
self.opts['external_auth'][load['eauth']][name]
if name in self.opts['external_auth'][load['eauth']]
else self.opts['external_auth'][token['eauth']]['*'],
load['fun'])
if not good:
msg = ('Authentication failure of type "eauth" occurred for '
'user {0}.').format(load.get('username', 'UNKNOWN'))
log.warning(msg)
return dict(error=dict(name='EauthAuthenticationError',
message=msg))
jid = salt.utils.jid.gen_jid()
fun = load.pop('fun')
tag = tagify(jid, prefix='wheel')
data = {'fun': "wheel.{0}".format(fun),
'jid': jid,
'tag': tag,
'user': load.get('username', 'UNKNOWN')}
try:
self.event.fire_event(data, tagify([jid, 'new'], 'wheel'))
ret = self.wheel_.call_func(fun, **load)
data['return'] = ret
data['success'] = True
self.event.fire_event(data, tagify([jid, 'ret'], 'wheel'))
return {'tag': tag,
'data': data}
except Exception as exc:
log.error('Exception occurred while '
'introspecting {0}: {1}'.format(fun, exc))
data['return'] = 'Exception occurred in wheel {0}: {1}: {2}'.format(
fun,
exc.__class__.__name__,
exc,
)
self.event.fire_event(data, tagify([jid, 'ret'], 'wheel'))
return {'tag': tag,
'data': data}
except Exception as exc:
log.error(
'Exception occurred in the wheel system: {0}'.format(exc)
)
return dict(error=dict(name=exc.__class__.__name__,
args=exc.args,
message=str(exc)))
def mk_token(self, load):
'''
Create and return an authentication token, the clear load needs to
contain the eauth key and the needed authentication creds.
'''
if 'eauth' not in load:
log.warning('Authentication failure of type "eauth" occurred.')
return ''
if load['eauth'] not in self.opts['external_auth']:
# The eauth system is not enabled, fail
log.warning('Authentication failure of type "eauth" occurred.')
return ''
try:
name = self.loadauth.load_name(load)
if not ((name in self.opts['external_auth'][load['eauth']]) |
('*' in self.opts['external_auth'][load['eauth']])):
log.warning('Authentication failure of type "eauth" occurred.')
return ''
if not self.loadauth.time_auth(load):
log.warning('Authentication failure of type "eauth" occurred.')
return ''
return self.loadauth.mk_token(load)
except Exception as exc:
log.error(
'Exception occurred while authenticating: {0}'.format(exc)
)
return ''
def get_token(self, load):
'''
Return the name associated with a token or False if the token is invalid
'''
if 'token' not in load:
return False
return self.loadauth.get_tok(load['token'])
def publish(self, load):
'''
This method sends out publications to the minions, it can only be used
by the LocalClient.
'''
extra = load.get('kwargs', {})
# check blacklist/whitelist
good = True
# Check if the user is blacklisted
for user_re in self.opts['client_acl_blacklist'].get('users', []):
if re.match(user_re, load['user']):
good = False
break
# check if the cmd is blacklisted
for module_re in self.opts['client_acl_blacklist'].get('modules', []):
# if this is a regular command, its a single function
if isinstance(load['fun'], str):
funs_to_check = [load['fun']]
# if this a compound function
else:
funs_to_check = load['fun']
for fun in funs_to_check:
if re.match(module_re, fun):
good = False
break
if good is False:
log.error(
'{user} does not have permissions to run {function}. Please '
'contact your local administrator if you believe this is in '
'error.\n'.format(
user=load['user'],
function=load['fun']
)
)
return ''
# to make sure we don't step on anyone else's toes
del good
# Check for external auth calls
if extra.get('token', False):
# A token was passed, check it
try:
token = self.loadauth.get_tok(extra['token'])
except Exception as exc:
log.error(
'Exception occurred when generating auth token: {0}'.format(
exc
)
)
return ''
if not token:
log.warning('Authentication failure of type "token" occurred. \
Token could not be retrieved.')
return ''
if token['eauth'] not in self.opts['external_auth']:
log.warning('Authentication failure of type "token" occurred. \
Authentication type of {0} not present.').format(token['eauth'])
return ''
if not ((token['name'] in self.opts['external_auth'][token['eauth']]) |
('*' in self.opts['external_auth'][token['eauth']])):
log.warning('Authentication failure of type "token" occurred. \
Token does not verify against eauth provider: {0}').format(
self.opts['external_auth'])
return ''
good = self.ckminions.auth_check(
self.opts['external_auth'][token['eauth']][token['name']]
if token['name'] in self.opts['external_auth'][token['eauth']]
else self.opts['external_auth'][token['eauth']]['*'],
load['fun'],
load['tgt'],
load.get('tgt_type', 'glob'))
if not good:
# Accept find_job so the CLI will function cleanly
if load['fun'] != 'saltutil.find_job':
log.warning(
'Authentication failure of type "token" occurred.'
)
return ''
load['user'] = token['name']
log.debug('Minion tokenized user = "{0}"'.format(load['user']))
elif 'eauth' in extra:
if extra['eauth'] not in self.opts['external_auth']:
# The eauth system is not enabled, fail
log.warning(
'Authentication failure of type "eauth" occurred.'
)
return ''
try:
name = self.loadauth.load_name(extra)
if not ((name in self.opts['external_auth'][extra['eauth']]) |
('*' in self.opts['external_auth'][extra['eauth']])):
log.warning(
'Authentication failure of type "eauth" occurred.'
)
return ''
if not self.loadauth.time_auth(extra):
log.warning(
'Authentication failure of type "eauth" occurred.'
)
return ''
except Exception as exc:
log.error(
'Exception occurred while authenticating: {0}'.format(exc)
)
return ''
good = self.ckminions.auth_check(
self.opts['external_auth'][extra['eauth']][name]
if name in self.opts['external_auth'][extra['eauth']]
else self.opts['external_auth'][extra['eauth']]['*'],
load['fun'],
load['tgt'],
load.get('tgt_type', 'glob'))
if not good:
# Accept find_job so the CLI will function cleanly
if load['fun'] != 'saltutil.find_job':
log.warning(
'Authentication failure of type "eauth" occurred.'
)
return ''
load['user'] = name
# Verify that the caller has root on master
elif 'user' in load:
if load['user'].startswith('sudo_'):
# If someone can sudo, allow them to act as root
if load.get('key', 'invalid') == self.key.get('root'):
load.pop('key')
elif load.pop('key') != self.key[self.opts.get('user', 'root')]:
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
elif load['user'] == self.opts.get('user', 'root'):
if load.pop('key') != self.key[self.opts.get('user', 'root')]:
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
elif load['user'] == 'root':
if load.pop('key') != self.key.get(self.opts.get('user', 'root')):
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
elif load['user'] == salt.utils.get_user():
if load.pop('key') != self.key.get(load['user']):
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
else:
if load['user'] in self.key:
# User is authorised, check key and check perms
if load.pop('key') != self.key[load['user']]:
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
if load['user'] not in self.opts['client_acl']:
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
good = self.ckminions.auth_check(
self.opts['client_acl'][load['user']],
load['fun'],
load['tgt'],
load.get('tgt_type', 'glob'))
if not good:
# Accept find_job so the CLI will function cleanly
if load['fun'] != 'saltutil.find_job':
log.warning(
'Authentication failure of type "user" '
'occurred.'
)
return ''
else:
log.warning(
'Authentication failure of type "user" occurred.'
)
return ''
else:
if load.pop('key') != self.key[salt.utils.get_user()]:
log.warning(
'Authentication failure of type "other" occurred.'
)
return ''
# Retrieve the minions list
minions = self.ckminions.check_minions(
load['tgt'],
load.get('tgt_type', 'glob')
)
# If we order masters (via a syndic), don't short circuit if no minions
# are found
if not self.opts.get('order_masters'):
# Check for no minions
if not minions:
return {
'enc': 'clear',
'load': {
'jid': None,
'minions': minions
}
}
# Retrieve the jid
if not load['jid']:
fstr = '{0}.prep_jid'.format(self.opts['master_job_cache'])
load['jid'] = self.mminion.returners[fstr](nocache=extra.get('nocache', False))
self.event.fire_event({'minions': minions}, load['jid'])
new_job_load = {
'jid': load['jid'],
'tgt_type': load['tgt_type'],
'tgt': load['tgt'],
'user': load['user'],
'fun': load['fun'],
'arg': load['arg'],
'minions': minions,
}
# Announce the job on the event bus
self.event.fire_event(new_job_load, 'new_job') # old dup event
self.event.fire_event(new_job_load, tagify([load['jid'], 'new'], 'job'))
# Save the invocation information
if self.opts['ext_job_cache']:
try:
fstr = '{0}.save_load'.format(self.opts['ext_job_cache'])
self.mminion.returners[fstr](load['jid'], load)
except KeyError:
log.critical(
'The specified returner used for the external job cache '
'"{0}" does not have a save_load function!'.format(
self.opts['ext_job_cache']
)
)
except Exception:
log.critical(
'The specified returner threw a stack trace:\n',
exc_info=True
)
# always write out to the master job cache
try:
fstr = '{0}.save_load'.format(self.opts['master_job_cache'])
self.mminion.returners[fstr](load['jid'], load)
except KeyError:
log.critical(
'The specified returner used for the master job cache '
'"{0}" does not have a save_load function!'.format(
self.opts['master_job_cache']
)
)
except Exception:
log.critical(
'The specified returner threw a stack trace:\n',
exc_info=True
)
# Altering the contents of the publish load is serious!! Changes here
# break compatibility with minion/master versions and even tiny
# additions can have serious implications on the performance of the
# publish commands.
#
# In short, check with Thomas Hatch before you even think about
# touching this stuff, we can probably do what you want to do another
# way that won't have a negative impact.
pub_load = {
'fun': load['fun'],
'arg': load['arg'],
'tgt': load['tgt'],
'jid': load['jid'],
'ret': load['ret'],
}
if 'id' in extra:
pub_load['id'] = extra['id']
if 'tgt_type' in load:
pub_load['tgt_type'] = load['tgt_type']
if 'to' in load:
pub_load['to'] = load['to']
if 'kwargs' in load:
if 'ret_config' in load['kwargs']:
pub_load['ret_config'] = load['kwargs'].get('ret_config')
if 'metadata' in load['kwargs']:
pub_load['metadata'] = load['kwargs'].get('metadata')
if 'user' in load:
log.info(
'User {user} Published command {fun} with jid {jid}'.format(
**load
)
)
pub_load['user'] = load['user']
else:
log.info(
'Published command {fun} with jid {jid}'.format(
**load
)
)
log.debug('Published command details {0}'.format(pub_load))
return {'ret': {
'jid': load['jid'],
'minions': minions
},
'pub': pub_load
}
| 39.253547 | 173 | 0.49604 | 6,893 | 63,630 | 4.476861 | 0.098796 | 0.036553 | 0.027577 | 0.032373 | 0.547879 | 0.500891 | 0.476717 | 0.438413 | 0.407499 | 0.389643 | 0 | 0.002932 | 0.394232 | 63,630 | 1,620 | 174 | 39.277778 | 0.797665 | 0.055603 | 0 | 0.510989 | 0 | 0 | 0.154149 | 0.010159 | 0 | 0 | 0 | 0.000617 | 0 | 0 | null | null | 0.002355 | 0.027473 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cc20f1f6a53dbfc79dbca785199d6d05868daf1 | 25,440 | py | Python | tests/prep_post/test.py | Aslic/rmats_turbo_4.1.0 | c651509a5d32799315054fa37a2210fab2aae5e5 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | tests/prep_post/test.py | Aslic/rmats_turbo_4.1.0 | c651509a5d32799315054fa37a2210fab2aae5e5 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | tests/prep_post/test.py | Aslic/rmats_turbo_4.1.0 | c651509a5d32799315054fa37a2210fab2aae5e5 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | import glob
import os.path
import subprocess
import sys
import unittest
import tests.bam
import tests.base_test
import tests.gtf
import tests.output_parser as output_parser
import tests.test_config
import tests.util
class Test(tests.base_test.BaseTest):
def setUp(self):
super().setUp()
self._test_base_dir = tests.test_config.TEST_BASE_DIR
self._test_dir = os.path.join(self._test_base_dir, 'prep_post')
self._generated_input_dir = os.path.join(self._test_dir,
'generated_input')
self._out_dir = os.path.join(self._test_dir, 'out')
self._prep_1_tmp_dir = os.path.join(self._test_dir, 'tmp_prep_1')
self._prep_2_tmp_dir = os.path.join(self._test_dir, 'tmp_prep_2')
self._post_tmp_dir = os.path.join(self._test_dir, 'tmp_post')
self._dup_input_bam_tmp_dir = os.path.join(self._test_dir,
'tmp_dup_input_bam')
self._dup_prep_bam_tmp_dir = os.path.join(self._test_dir,
'tmp_dup_prep_bam')
self._miss_input_bam_tmp_dir = os.path.join(self._test_dir,
'tmp_miss_input_bam')
self._miss_prep_bam_tmp_dir = os.path.join(self._test_dir,
'tmp_miss_prep_bam')
tests.util.recreate_dirs([
self._generated_input_dir, self._out_dir, self._prep_1_tmp_dir,
self._prep_2_tmp_dir, self._post_tmp_dir,
self._dup_input_bam_tmp_dir, self._dup_prep_bam_tmp_dir,
self._miss_input_bam_tmp_dir, self._miss_prep_bam_tmp_dir,
self._command_output_dir()
])
self._read_type = 'paired'
self._read_length = 50
self._sample_1_bams_path = os.path.join(self._generated_input_dir,
'b1.txt')
self._sample_2_bams_path = os.path.join(self._generated_input_dir,
'b2.txt')
sample_1_bam_replicate_template = os.path.join(
self._generated_input_dir, 'sample_1_rep_{}.bam')
sample_2_bam_replicate_template = os.path.join(
self._generated_input_dir, 'sample_2_rep_{}.bam')
self._sample_1_bams = self._create_sample_1_bams(
self._sample_1_bams_path, sample_1_bam_replicate_template)
self._sample_2_bams = self._create_sample_2_bams(
self._sample_2_bams_path, sample_2_bam_replicate_template)
self._gtf_path = os.path.join(self._generated_input_dir, 'test.gtf')
self._gtf = self._create_gtf(self._gtf_path)
self._sub_steps = [
'prep_1',
'inte_1_fail',
'inte_1_pass',
'prep_2',
'inte_2_fail',
'inte_2_pass',
'post',
'duplicate_input_bam',
'duplicate_prep_bam',
'missing_input_bam',
'missing_prep_bam',
]
self._sub_step = None
def test(self):
for sub_step in self._sub_steps:
self._sub_step = sub_step
self._setup_sub_step()
self._run_test()
def _command_output_dir(self):
return os.path.join(self._test_dir, 'command_output')
def _rmats_arguments(self):
arguments = [
'--gtf',
self._gtf_path,
'--od',
self._out_dir,
'-t',
self._read_type,
'--readLength',
str(self._read_length),
]
if self._sub_step == 'prep_1':
arguments.extend([
'--tmp',
self._prep_1_tmp_dir,
'--b1',
self._sample_1_bams_path,
'--task',
'prep',
])
elif self._sub_step == 'inte_1_fail':
arguments.extend([
'--tmp',
self._post_tmp_dir,
'--b1',
self._sample_1_bams_path,
'--b2',
self._sample_2_bams_path,
'--task',
'inte',
])
elif self._sub_step == 'inte_1_pass':
arguments.extend([
'--tmp',
self._post_tmp_dir,
'--b1',
self._sample_1_bams_path,
'--task',
'inte',
'--statoff',
])
elif self._sub_step == 'prep_2':
arguments.extend([
'--tmp',
self._prep_2_tmp_dir,
'--b1',
self._sample_2_bams_path,
'--task',
'prep',
])
elif self._sub_step == 'inte_2_fail':
arguments.extend([
'--tmp',
self._post_tmp_dir,
'--b1',
self._sample_2_bams_path,
'--task',
'inte',
'--statoff',
])
elif self._sub_step == 'inte_2_pass':
arguments.extend([
'--tmp',
self._post_tmp_dir,
'--b1',
self._sample_1_bams_path,
'--b2',
self._sample_2_bams_path,
'--task',
'inte',
])
elif self._sub_step == 'post':
arguments.extend([
'--tmp',
self._post_tmp_dir,
'--b1',
self._sample_1_bams_path,
'--b2',
self._sample_2_bams_path,
'--task',
'post',
])
elif self._sub_step == 'duplicate_input_bam':
arguments.extend([
'--tmp',
self._dup_input_bam_tmp_dir,
'--b1',
self._dup_input_bam_path,
'--task',
'post',
'--statoff',
])
elif self._sub_step == 'duplicate_prep_bam':
arguments.extend([
'--tmp',
self._dup_prep_bam_tmp_dir,
'--b1',
self._dup_prep_bam_path,
'--task',
'post',
'--statoff',
])
elif self._sub_step == 'missing_input_bam':
arguments.extend([
'--tmp',
self._miss_input_bam_tmp_dir,
'--b1',
self._miss_input_bam_path,
'--task',
'post',
'--statoff',
])
elif self._sub_step == 'missing_prep_bam':
arguments.extend([
'--tmp',
self._miss_prep_bam_tmp_dir,
'--b1',
self._miss_prep_bam_path,
'--task',
'post',
'--statoff',
])
return arguments
def _setup_sub_step(self):
if self._sub_step == 'duplicate_input_bam':
self._setup_dup_input_bam()
elif self._sub_step == 'duplicate_prep_bam':
self._setup_dup_prep_bam()
elif self._sub_step == 'missing_input_bam':
self._setup_miss_input_bam()
elif self._sub_step == 'missing_prep_bam':
self._setup_miss_prep_bam()
def _setup_dup_input_bam(self):
self._dup_input_bam_path = os.path.join(self._generated_input_dir,
'dup_input.txt')
bams = self._sample_1_bams + [self._sample_1_bams[0]]
self._write_bams(bams, self._dup_input_bam_path)
self._cp_with_prefix('prep_1', self._prep_1_tmp_dir,
self._dup_input_bam_tmp_dir)
def _setup_dup_prep_bam(self):
self._dup_prep_bam_path = os.path.join(self._generated_input_dir,
'dup_prep.txt')
bams = self._sample_1_bams
self._write_bams(bams, self._dup_prep_bam_path)
self._cp_with_prefix('prep_1', self._prep_1_tmp_dir,
self._dup_prep_bam_tmp_dir)
self._cp_with_prefix('prep_1_again', self._prep_1_tmp_dir,
self._dup_prep_bam_tmp_dir)
def _setup_miss_input_bam(self):
self._miss_input_bam_path = os.path.join(self._generated_input_dir,
'miss_input.txt')
bams = [self._sample_1_bams[0]]
self._write_bams(bams, self._miss_input_bam_path)
self._cp_with_prefix('prep_1', self._prep_1_tmp_dir,
self._miss_input_bam_tmp_dir)
def _setup_miss_prep_bam(self):
self._miss_prep_bam_path = os.path.join(self._generated_input_dir,
'miss_prep.txt')
bams = self._sample_1_bams + self._sample_2_bams
self._write_bams(bams, self._miss_prep_bam_path)
self._cp_with_prefix('prep_1', self._prep_1_tmp_dir,
self._miss_prep_bam_tmp_dir)
def _create_gtf(self, gtf_path):
gtf = tests.gtf.GTF()
gtf.path = gtf_path
transcript_1 = tests.gtf.Transcript()
transcript_1.chromosome = '1'
transcript_1.strand = '+'
transcript_1.gene_id = tests.util.gene_id_str(1)
transcript_1.gene_name = tests.util.gene_name_str(1)
transcript_1.transcript_id = tests.util.transcript_id_str(1)
transcript_1.exons = [(1, 100), (201, 300), (401, 500)]
gtf.transcripts = [transcript_1]
error = gtf.write()
self.assertFalse(error)
return gtf
def _create_sample_1_bams(self, sample_1_bams_path,
sample_1_replicate_template):
rep_1_bam = tests.bam.BAM()
rep_1_bam.path = sample_1_replicate_template.format(1)
rep_2_bam = tests.bam.BAM()
rep_2_bam.path = sample_1_replicate_template.format(2)
sample_1_bams = [rep_1_bam, rep_2_bam]
rep_1_read_1 = tests.bam.Read()
rep_1_read_1.ref_seq_name = '1' # chromosome
rep_1_read_1.ref_seq_len = 1000 # chromosome length
rep_1_read_1.template_name = tests.util.template_name_str([1, 1])
rep_1_read_2 = tests.bam.Read()
error = tests.bam.set_read_pair_from_intervals(rep_1_read_1,
rep_1_read_2,
[[76, 100], [201, 300]],
[[401, 475]],
self._read_length)
self.assertFalse(error)
rep_1_bam.reads = [rep_1_read_1, rep_1_read_2]
rep_2_read_1 = tests.bam.Read()
rep_2_read_1.ref_seq_name = '1' # chromosome
rep_2_read_1.ref_seq_len = 1000 # chromosome length
rep_2_read_1.template_name = tests.util.template_name_str([1, 2])
rep_2_read_2 = tests.bam.Read()
error = tests.bam.set_read_pair_from_intervals(
rep_2_read_1, rep_2_read_2, [[26, 100]], [[201, 300], [401, 425]],
self._read_length)
self.assertFalse(error)
rep_2_bam.reads = [rep_2_read_1, rep_2_read_2]
self._write_bams(sample_1_bams, sample_1_bams_path)
return sample_1_bams
def _create_sample_2_bams(self, sample_2_bams_path,
sample_2_replicate_template):
rep_1_bam = tests.bam.BAM()
rep_1_bam.path = sample_2_replicate_template.format(1)
rep_2_bam = tests.bam.BAM()
rep_2_bam.path = sample_2_replicate_template.format(2)
sample_2_bams = [rep_1_bam, rep_2_bam]
rep_1_read_1 = tests.bam.Read()
rep_1_read_1.ref_seq_name = '1' # chromosome
rep_1_read_1.ref_seq_len = 1000 # chromosome length
rep_1_read_1.template_name = tests.util.template_name_str([2, 1])
rep_1_read_2 = tests.bam.Read()
error = tests.bam.set_read_pair_from_intervals(rep_1_read_1,
rep_1_read_2,
[[76, 100], [401, 500]],
[[401, 475]],
self._read_length)
self.assertFalse(error)
rep_1_bam.reads = [rep_1_read_1, rep_1_read_2]
rep_2_read_1 = tests.bam.Read()
rep_2_read_1.ref_seq_name = '1' # chromosome
rep_2_read_1.ref_seq_len = 1000 # chromosome length
rep_2_read_1.template_name = tests.util.template_name_str([2, 2])
rep_2_read_2 = tests.bam.Read()
error = tests.bam.set_read_pair_from_intervals(rep_2_read_1,
rep_2_read_2,
[[26, 100]],
[[1, 100], [401, 425]],
self._read_length)
self.assertFalse(error)
rep_2_bam.reads = [rep_2_read_1, rep_2_read_2]
self._write_bams(sample_2_bams, sample_2_bams_path)
return sample_2_bams
def _cp_with_prefix(self, prefix, source_dir, dest_dir):
source_paths = self._get_dot_rmats_paths(source_dir)
command = [
sys.executable, tests.test_config.CP_WITH_PREFIX, prefix, dest_dir
]
command.extend(source_paths)
subprocess.run(command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=True)
def _check_results(self):
if self._sub_step == 'prep_1':
self._check_results_prep_1()
elif self._sub_step == 'inte_1_fail':
self._check_results_inte_1_fail()
elif self._sub_step == 'inte_1_pass':
self._check_results_inte_1_pass()
elif self._sub_step == 'prep_2':
self._check_results_prep_2()
elif self._sub_step == 'inte_2_fail':
self._check_results_inte_2_fail()
elif self._sub_step == 'inte_2_pass':
self._check_results_inte_2_pass()
elif self._sub_step == 'post':
self._check_results_post()
elif self._sub_step == 'duplicate_input_bam':
self._check_results_dup_input_bam()
elif self._sub_step == 'duplicate_prep_bam':
self._check_results_dup_prep_bam()
elif self._sub_step == 'missing_input_bam':
self._check_results_miss_input_bam()
elif self._sub_step == 'missing_prep_bam':
self._check_results_miss_prep_bam()
else:
self.fail('unexpected sub_step: {}'.format(self._sub_step))
def _get_dot_rmats_paths(self, tmp_dir):
dot_rmats_file_paths = glob.glob(os.path.join(tmp_dir, '*.rmats'))
# filenames begin with a timestamp used for alphanumeric sort
return sorted(dot_rmats_file_paths)
def _check_results_prep_1(self):
self._check_no_error_results()
command_stdout_file_name = self._get_stdout_file_name()
with open(command_stdout_file_name, 'rt') as out_f_h:
out_lines = out_f_h.readlines()
tests.util.assert_no_line_has(self, out_lines,
'Processing count files')
test_gene_id = tests.util.gene_id_str(1)
quoted_test_gene_id = tests.util.double_quote(test_gene_id)
dot_rmats_paths = self._get_dot_rmats_paths(self._prep_1_tmp_dir)
self.assertEqual(len(dot_rmats_paths), 2)
for dot_rmats_i in range(2):
dot_rmats_contents, error = output_parser.parse_dot_rmats(
dot_rmats_paths[dot_rmats_i])
self.assertFalse(error)
self.assertEqual(dot_rmats_contents['bams'],
[self._sample_1_bams[dot_rmats_i].path])
self.assertEqual(dot_rmats_contents['read_length'],
self._read_length)
novel_juncs = dot_rmats_contents['novel_juncs']
self.assertEqual(novel_juncs, [dict()])
exons = dot_rmats_contents['exons']
if dot_rmats_i == 0:
self.assertEqual(exons, [{
quoted_test_gene_id: [{
'start_box': [401, 499],
'end_box': [401, 499],
'counts': [1, 0]
}]
}])
else:
self.assertEqual(exons, [{
quoted_test_gene_id: [{
'start_box': [1, 99],
'end_box': [1, 99],
'counts': [1, 0]
}]
}])
multis = dot_rmats_contents['multis']
if dot_rmats_i == 0:
self.assertEqual(multis, [{
quoted_test_gene_id: [{
'junction_pairs': [[1, 1], [100, 200], [299, 299]],
'count':
1
}]
}])
else:
self.assertEqual(multis, [{
quoted_test_gene_id: [{
'junction_pairs': [[201, 201], [300, 400], [499, 499]],
'count':
1
}]
}])
self._cp_with_prefix('prep_1_', self._prep_1_tmp_dir,
self._post_tmp_dir)
def _check_results_prep_2(self):
self._check_no_error_results()
command_stdout_file_name = self._get_stdout_file_name()
with open(command_stdout_file_name, 'rt') as out_f_h:
out_lines = out_f_h.readlines()
tests.util.assert_no_line_has(self, out_lines,
'Processing count files')
test_gene_id = tests.util.gene_id_str(1)
quoted_test_gene_id = tests.util.double_quote(test_gene_id)
dot_rmats_paths = self._get_dot_rmats_paths(self._prep_2_tmp_dir)
self.assertEqual(len(dot_rmats_paths), 2)
for dot_rmats_i in range(2):
dot_rmats_contents, error = output_parser.parse_dot_rmats(
dot_rmats_paths[dot_rmats_i])
self.assertFalse(error)
self.assertEqual(dot_rmats_contents['bams'],
[self._sample_2_bams[dot_rmats_i].path])
self.assertEqual(dot_rmats_contents['read_length'],
self._read_length)
novel_juncs = dot_rmats_contents['novel_juncs']
self.assertEqual(novel_juncs, [{quoted_test_gene_id: [[0, 0, 2]]}])
exons = dot_rmats_contents['exons']
if dot_rmats_i == 0:
self.assertEqual(exons, [{
quoted_test_gene_id: [{
'start_box': [401, 499],
'end_box': [401, 499],
'counts': [1, 0]
}]
}])
else:
self.assertEqual(exons, [{
quoted_test_gene_id: [{
'start_box': [1, 99],
'end_box': [1, 99],
'counts': [1, 0]
}]
}])
multis = dot_rmats_contents['multis']
if dot_rmats_i == 0:
self.assertEqual(multis, [{
quoted_test_gene_id: [{
'junction_pairs': [[1, 1], [100, 400], [499, 499]],
'count':
1
}]
}])
else:
self.assertEqual(multis, [{
quoted_test_gene_id: [{
'junction_pairs': [[1, 1], [100, 400], [499, 499]],
'count':
1
}]
}])
self._cp_with_prefix('prep_2_', self._prep_2_tmp_dir,
self._post_tmp_dir)
def _check_results_inte_1_fail(self):
self.assertNotEqual(self._rmats_return_code, 0)
command_stderr_file_name = self._get_stderr_file_name()
with open(command_stderr_file_name, 'rt') as err_f_h:
err_lines = err_f_h.readlines()
tests.util.assert_some_line_has(
self, err_lines, 'input bam files with no associated prep output')
def _check_results_inte_1_pass(self):
self._check_no_error_results()
def _check_results_inte_2_fail(self):
self.assertNotEqual(self._rmats_return_code, 0)
command_stderr_file_name = self._get_stderr_file_name()
with open(command_stderr_file_name, 'rt') as err_f_h:
err_lines = err_f_h.readlines()
tests.util.assert_some_line_has(
self, err_lines,
'bam files not in input but associated with prep output')
def _check_results_inte_2_pass(self):
self._check_no_error_results()
def _check_results_post(self):
self._check_no_error_results()
command_stdout_file_name = self._get_stdout_file_name()
with open(command_stdout_file_name, 'rt') as out_f_h:
out_lines = out_f_h.readlines()
tests.util.assert_some_line_has(self, out_lines,
'Processing count files')
from_gtf_se_path = os.path.join(self._out_dir, 'fromGTF.SE.txt')
from_gtf_se_header, from_gtf_se_rows, error = output_parser.parse_from_gtf(
from_gtf_se_path)
self.assertFalse(error)
self.assertEqual(len(from_gtf_se_rows), 1)
from_gtf_se_row = from_gtf_se_rows[0]
self.assertEqual(from_gtf_se_row['GeneID'],
tests.util.double_quote(tests.util.gene_id_str(1)))
self.assertEqual(from_gtf_se_row['exonStart_0base'], '200')
self.assertEqual(from_gtf_se_row['exonEnd'], '300')
jc_raw_se_path = os.path.join(self._out_dir, 'JC.raw.input.SE.txt')
jc_raw_se_header, jc_raw_se_rows, error = output_parser.parse_jc_raw(
jc_raw_se_path)
self.assertFalse(error)
self.assertEqual(len(jc_raw_se_rows), 1)
jc_raw_se_row = jc_raw_se_rows[0]
self.assertEqual(jc_raw_se_row['ID'], from_gtf_se_row['ID'])
self.assertEqual(jc_raw_se_row['IJC_SAMPLE_1'], '1,1')
self.assertEqual(jc_raw_se_row['SJC_SAMPLE_1'], '0,0')
self.assertEqual(jc_raw_se_row['IJC_SAMPLE_2'], '0,0')
self.assertEqual(jc_raw_se_row['SJC_SAMPLE_2'], '1,1')
se_mats_jc_path = os.path.join(self._out_dir, 'SE.MATS.JC.txt')
se_mats_jc_header, se_mats_jc_rows, error = output_parser.parse_mats_jc(
se_mats_jc_path)
self.assertFalse(error)
self._check_se_mats_jc_header(se_mats_jc_header)
self.assertEqual(len(se_mats_jc_rows), 1)
se_mats_jc_row = se_mats_jc_rows[0]
pvalue = float(se_mats_jc_row['PValue'])
tests.util.assert_within_bounds(self, pvalue, 0, 1)
fdr = float(se_mats_jc_row['FDR'])
tests.util.assert_within_bounds(self, fdr, 0, 1)
inc_level_1_splits = se_mats_jc_row['IncLevel1'].split(',')
self.assertEqual(len(inc_level_1_splits), 2)
self.assertAlmostEqual(float(inc_level_1_splits[0]), 1)
self.assertAlmostEqual(float(inc_level_1_splits[1]), 1)
inc_level_2_splits = se_mats_jc_row['IncLevel2'].split(',')
self.assertEqual(len(inc_level_2_splits), 2)
self.assertAlmostEqual(float(inc_level_2_splits[0]), 0)
self.assertAlmostEqual(float(inc_level_2_splits[1]), 0)
self.assertAlmostEqual(float(se_mats_jc_row['IncLevelDifference']), 1)
def _check_results_dup_input_bam(self):
self.assertNotEqual(self._rmats_return_code, 0)
command_stderr_file_name = self._get_stderr_file_name()
with open(command_stderr_file_name, 'rt') as err_f_h:
err_lines = err_f_h.readlines()
dup_bam_path = self._sample_1_bams[0].path
expected_error = '{} given 2 times'.format(dup_bam_path)
tests.util.assert_some_line_has(self, err_lines, expected_error)
def _check_results_dup_prep_bam(self):
self.assertNotEqual(self._rmats_return_code, 0)
command_stderr_file_name = self._get_stderr_file_name()
with open(command_stderr_file_name, 'rt') as err_f_h:
err_lines = err_f_h.readlines()
for bam in self._sample_1_bams:
dup_bam_path = bam.path
expected_error = '{} found 2 times in .rmats'.format(dup_bam_path)
tests.util.assert_some_line_has(self, err_lines, expected_error)
def _check_results_miss_input_bam(self):
self._check_no_error_results()
def _check_results_miss_prep_bam(self):
self.assertNotEqual(self._rmats_return_code, 0)
command_stderr_file_name = self._get_stderr_file_name()
with open(command_stderr_file_name, 'rt') as err_f_h:
err_lines = err_f_h.readlines()
for bam in self._sample_2_bams:
miss_bam_path = bam.path
expected_error = '{} not found in .rmats'.format(miss_bam_path)
tests.util.assert_some_line_has(self, err_lines, expected_error)
if __name__ == '__main__':
unittest.main(verbosity=2)
| 39.75 | 83 | 0.553263 | 3,131 | 25,440 | 3.984031 | 0.067071 | 0.020683 | 0.025573 | 0.025814 | 0.797819 | 0.735289 | 0.662819 | 0.618086 | 0.568462 | 0.531666 | 0 | 0.031469 | 0.347956 | 25,440 | 639 | 84 | 39.812207 | 0.720521 | 0.006879 | 0 | 0.527531 | 0 | 0 | 0.070521 | 0 | 0 | 0 | 0 | 0 | 0.104796 | 1 | 0.046181 | false | 0.017762 | 0.019538 | 0.001776 | 0.078153 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cc57c915f6cace046e0bbe739957206038f009f | 1,527 | py | Python | nltk/align/util.py | kruskod/nltk | dba7b5431b1d57a75d50e048961c1a203b98c3da | [
"Apache-2.0"
] | 1 | 2015-11-25T00:47:58.000Z | 2015-11-25T00:47:58.000Z | nltk/align/util.py | kruskod/nltk | dba7b5431b1d57a75d50e048961c1a203b98c3da | [
"Apache-2.0"
] | null | null | null | nltk/align/util.py | kruskod/nltk | dba7b5431b1d57a75d50e048961c1a203b98c3da | [
"Apache-2.0"
] | null | null | null | # Natural Language Toolkit: Aligner Utilities
#
# Copyright (C) 2001-2015 NLTK Project
# Author: Anna Garbar
# URL: <http://www.nltk.org/>
# For license information, see LICENSE.TXT
from nltk.align.api import Alignment
def pharaohtext2tuples(pharaoh_text):
"""
Converts pharaoh text format into an Alignment object (a list of tuples).
>>> pharaoh_text = '0-0 2-1 9-2 21-3 10-4 7-5'
>>> pharaohtext2tuples(pharaoh_text)
Alignment([(0, 0), (2, 1), (7, 5), (9, 2), (10, 4), (21, 3)])
:type pharaoh_text: str
:param pharaoh_text: the word alignment outputs in the pharaoh output format
:rtype: Alignment
:return: An Alignment object that contains a list of integer tuples
"""
# Converts integers to strings for a word alignment point.
list_of_tuples = [tuple(map(int,a.split('-'))) for a in pharaoh_text.split()]
return Alignment(list_of_tuples)
def alignment2pharaohtext(alignment):
"""
Converts an Alignment object (a list of tuples) into pharaoh text format.
>>> alignment = [(0, 0), (2, 1), (9, 2), (21, 3), (10, 4), (7, 5)]
>>> alignment2pharaohtext(alignment)
'0-0 2-1 9-2 21-3 10-4 7-5'
:type alignment: Alignment
:param alignment: An Alignment object that contains a list of integer tuples
:rtype: str
:return: the word alignment outputs in the pharaoh output format
"""
pharaoh_text = ' '.join(str(i) + "-" + str(j) for i,j in alignment)
return pharaoh_text
| 33.933333 | 81 | 0.642436 | 218 | 1,527 | 4.444954 | 0.348624 | 0.113519 | 0.070175 | 0.016512 | 0.335397 | 0.321981 | 0.321981 | 0.260062 | 0.260062 | 0.163055 | 0 | 0.058219 | 0.235102 | 1,527 | 44 | 82 | 34.704545 | 0.771404 | 0.693517 | 0 | 0 | 0 | 0 | 0.008357 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cd4c163b81a2a9f7a9f4fb51454b97b7933bffd | 1,565 | py | Python | dwh_analytic/dags/data_warehouse_prod/schema/dim_process.py | dnguyenngoc/analytic | d609a93e96e7c546ad3ee3ebd4e13309ddf575f8 | [
"MIT"
] | null | null | null | dwh_analytic/dags/data_warehouse_prod/schema/dim_process.py | dnguyenngoc/analytic | d609a93e96e7c546ad3ee3ebd4e13309ddf575f8 | [
"MIT"
] | null | null | null | dwh_analytic/dags/data_warehouse_prod/schema/dim_process.py | dnguyenngoc/analytic | d609a93e96e7c546ad3ee3ebd4e13309ddf575f8 | [
"MIT"
] | null | null | null | resource ='human ad machime'
class DimProcess:
def __init__(
self,
*kwargs,
process_key: int,
module: str,
type: str,
step: str,
sub_step: str,
resource: str = 'human',
):
def step(self):
return ['qc', 'auto_qc', 'apr_qc', 'keyer_input']
def example_data(self):
data = {
'process_key': 1,
'resource': 'human',
'module': 'keyed_data',
'step': 'qc',
'sub_step': None,
'process_key': 2,
'resource': 'machine',
'module': 'keyed_data',
'step': 'transform',
'sub_step': None,
}
class FactDataExtractionModel:
def __init__(
self,
*kwargs,
project_id: str,
document_id: str,
doc_set_id: str,
last_modified_time_key: int,
last_modified_date_key: int,
user_name: str = None,
process_key: int,
field_name: str,
field_value: str = None,
last_modified_timestamp: str
):
self.project_id = project_id
self.document_id = document_id
self.doc_set_id = doc_set_id
self.last_modified_time_key = last_modified_time_key
self.last_modified_date_key = last_modified_date_key
self.user_name = user_name
self.process_key = process_key
self.field_name = field_name
self.field_value = field_value
self.last_modified_timestamp = last_modified_timestamp | 27.946429 | 62 | 0.548882 | 173 | 1,565 | 4.572254 | 0.271676 | 0.136536 | 0.030341 | 0.072061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001988 | 0.357189 | 1,565 | 56 | 62 | 27.946429 | 0.784294 | 0 | 0 | 0.269231 | 0 | 0 | 0.104725 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cd6ff8a4443655a42df05eccf62b0e804763fb0 | 2,898 | py | Python | service.py | Tigge/script.filmtipset-grade | a5b438dc478d6ef40f611585e9cd196c2ff49cf6 | [
"BSD-2-Clause"
] | 1 | 2015-02-19T08:45:57.000Z | 2015-02-19T08:45:57.000Z | service.py | Tigge/script.filmtipset-grade | a5b438dc478d6ef40f611585e9cd196c2ff49cf6 | [
"BSD-2-Clause"
] | 1 | 2015-02-01T19:28:17.000Z | 2015-03-18T22:27:14.000Z | service.py | Tigge/script.filmtipset-grade | a5b438dc478d6ef40f611585e9cd196c2ff49cf6 | [
"BSD-2-Clause"
] | null | null | null | # Copyright (c) 2013, Gustav Tiger
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
#
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
import xbmc
import xbmcaddon
import xbmcgui
import filmtipset
FILMTIPSET_ACCESS_KEY = "7ndg3Q3qwW8dPzbJMrB5Rw"
class XBMCPlayer(xbmc.Player):
def __init__(self, *args):
self.imdb = None
self.time = None
self.time_total = None
def onPlayBackStarted(self):
self.update()
def onPlayBackEnded(self):
self.onDone()
def onPlayBackStopped(self):
self.onDone()
def update(self):
info = self.getVideoInfoTag()
self.imdb = info.getIMDBNumber()
self.time = self.getTime()
self.time_total = self.getTotalTime()
def onDone(self):
print "getTime", self.time
print "getTotalTime", self.time_total
print "imdb", self.imdb
addon = xbmcaddon.Addon(id='script.filmtipset-grade')
key = addon.getSetting("key")
user = addon.getSetting("user")
grader = filmtipset.Filmtipset(FILMTIPSET_ACCESS_KEY, key, user)
movie = grader.get_movie_imdb(self.imdb)
print movie
if movie["grade"]["type"] != "seen":
dialog = xbmcgui.Dialog()
grade = dialog.select("Grade " + movie["orgname"] + " on filmtipset:",
["Skip", "1", "2", "3", "4", "5"])
if grade != 0:
print dialog, grade
print grader.grade(movie["id"], grade)
player = XBMCPlayer()
while(not xbmc.abortRequested):
if player.isPlayingVideo():
player.update()
xbmc.sleep(1000)
| 34.094118 | 82 | 0.670807 | 352 | 2,898 | 5.485795 | 0.463068 | 0.024858 | 0.020197 | 0.023822 | 0.095287 | 0.07043 | 0.07043 | 0.07043 | 0.07043 | 0.07043 | 0 | 0.00866 | 0.242926 | 2,898 | 84 | 83 | 34.5 | 0.871468 | 0.446515 | 0 | 0.046512 | 0 | 0 | 0.080482 | 0.028517 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.093023 | null | null | 0.139535 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cda6328ac58b61f05923cca8623aa6b42f94561 | 3,591 | py | Python | lib/reindex/reporting.py | scality/utapi | 29475f1b9aa25cf3c883262bfb6f4573f846a5b7 | [
"Apache-2.0"
] | 13 | 2016-10-07T20:25:11.000Z | 2022-02-23T06:33:59.000Z | lib/reindex/reporting.py | scality/utapi | 29475f1b9aa25cf3c883262bfb6f4573f846a5b7 | [
"Apache-2.0"
] | 427 | 2016-08-17T18:03:32.000Z | 2022-03-31T10:46:12.000Z | lib/reindex/reporting.py | scality/utapi | 29475f1b9aa25cf3c883262bfb6f4573f846a5b7 | [
"Apache-2.0"
] | 5 | 2017-04-25T21:13:03.000Z | 2018-01-23T00:21:06.000Z | import requests
import redis
import json
import ast
import sys
import time
import urllib
import re
import sys
from threading import Thread
from concurrent.futures import ThreadPoolExecutor
import argparse
def get_options():
parser = argparse.ArgumentParser()
parser.add_argument("-i", "--sentinel-ip", default='127.0.0.1', help="Sentinel IP")
parser.add_argument("-p", "--sentinel-port", default="16379", help="Sentinel Port")
parser.add_argument("-v", "--redis-password", default=None, help="Redis AUTH Password")
parser.add_argument("-n", "--sentinel-cluster-name", default='scality-s3', help="Redis cluster name")
parser.add_argument("-b", "--bucketd-addr", default='http://127.0.0.1:9000', help="URL of the bucketd server")
return parser.parse_args()
def safe_print(content):
print("{0}".format(content))
class askRedis():
def __init__(self, ip="127.0.0.1", port="16379", sentinel_cluster_name="scality-s3", password=None):
self._password = password
r = redis.Redis(host=ip, port=port, db=0, password=password)
self._ip, self._port = r.sentinel_get_master_addr_by_name(sentinel_cluster_name)
def read(self, resource, name):
r = redis.Redis(host=self._ip, port=self._port, db=0, password=self._password)
res = 's3:%s:%s:storageUtilized:counter' % (resource, name)
total_size = r.get(res)
res = 's3:%s:%s:numberOfObjects:counter' % (resource, name)
files = r.get(res)
try:
return {'files': int(files), "total_size": int(total_size)}
except Exception as e:
return {'files': 0, "total_size": 0}
class S3ListBuckets():
def __init__(self, host='127.0.0.1:9000'):
self.bucketd_host = host
def run(self):
docs = []
url = "%s/default/bucket/users..bucket" % self.bucketd_host
session = requests.Session()
r = session.get(url, timeout=30)
if r.status_code == 200:
payload = json.loads(r.text)
for keys in payload['Contents']:
key = keys["key"]
r1 = re.match("(\w+)..\|..(\w+.*)", key)
docs.append(r1.groups())
return docs
return(self.userid, self.bucket, user, files, total_size)
if __name__ == '__main__':
options = get_options()
redis_conf = dict(
ip=options.sentinel_ip,
port=options.sentinel_port,
sentinel_cluster_name=options.sentinel_cluster_name,
password=options.redis_password
)
P = S3ListBuckets(options.bucketd_addr)
listbuckets = P.run()
userids = set([x for x, y in listbuckets])
executor = ThreadPoolExecutor(max_workers=1)
for userid, bucket in listbuckets:
U = askRedis(**redis_conf)
data = U.read('buckets', bucket)
content = "Account:%s|Bucket:%s|NumberOFfiles:%s|StorageCapacity:%s " % (
userid, bucket, data["files"], data["total_size"])
executor.submit(safe_print, content)
data = U.read('buckets', 'mpuShadowBucket'+bucket)
content = "Account:%s|Bucket:%s|NumberOFfiles:%s|StorageCapacity:%s " % (
userid, 'mpuShadowBucket'+bucket, data["files"], data["total_size"])
executor.submit(safe_print, content)
executor.submit(safe_print, "")
for userid in sorted(userids):
U = askRedis(**redis_conf)
data = U.read('accounts', userid)
content = "Account:%s|NumberOFfiles:%s|StorageCapacity:%s " % (
userid, data["files"], data["total_size"])
executor.submit(safe_print, content) | 35.554455 | 114 | 0.634085 | 452 | 3,591 | 4.889381 | 0.292035 | 0.032579 | 0.038462 | 0.01086 | 0.184163 | 0.175113 | 0.158371 | 0.133937 | 0.133937 | 0.133937 | 0 | 0.021708 | 0.217488 | 3,591 | 101 | 115 | 35.554455 | 0.764769 | 0 | 0 | 0.109756 | 0 | 0 | 0.190145 | 0.076837 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073171 | false | 0.073171 | 0.146341 | 0 | 0.292683 | 0.073171 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
7cdd5ddf7b7d2568fd208a60927251ae8e3ac857 | 10,399 | py | Python | retargeting/models/Kinematics.py | yujiatay/deep-motion-editing | 0a6fc5fd20059c5074f68a452cd49cf6ede36ea8 | [
"BSD-2-Clause"
] | 1 | 2021-07-06T14:34:12.000Z | 2021-07-06T14:34:12.000Z | retargeting/models/Kinematics.py | bmd080/deep-motion-editing | 19604abdc0ead66f8c82d9211b8c5862c6a68089 | [
"BSD-2-Clause"
] | null | null | null | retargeting/models/Kinematics.py | bmd080/deep-motion-editing | 19604abdc0ead66f8c82d9211b8c5862c6a68089 | [
"BSD-2-Clause"
] | null | null | null | import torch
import torch.nn as nn
import numpy as np
import math
class ForwardKinematics:
def __init__(self, args, edges):
self.topology = [-1] * (len(edges) + 1)
self.rotation_map = []
for i, edge in enumerate(edges):
self.topology[edge[1]] = edge[0]
self.rotation_map.append(edge[1])
self.world = args.fk_world
self.pos_repr = args.pos_repr
self.quater = args.rotation == 'quaternion'
def forward_from_raw(self, raw, offset, world=None, quater=None):
if world is None: world = self.world
if quater is None: quater = self.quater
if self.pos_repr == '3d':
position = raw[:, -3:, :]
rotation = raw[:, :-3, :]
elif self.pos_repr == '4d':
raise Exception('Not support')
if quater:
rotation = rotation.reshape((rotation.shape[0], -1, 4, rotation.shape[-1]))
identity = torch.tensor((1, 0, 0, 0), dtype=torch.float, device=raw.device)
else:
rotation = rotation.reshape((rotation.shape[0], -1, 3, rotation.shape[-1]))
identity = torch.zeros((3, ), dtype=torch.float, device=raw.device)
identity = identity.reshape((1, 1, -1, 1))
new_shape = list(rotation.shape)
new_shape[1] += 1
new_shape[2] = 1
rotation_final = identity.repeat(new_shape)
for i, j in enumerate(self.rotation_map):
rotation_final[:, j, :, :] = rotation[:, i, :, :]
return self.forward(rotation_final, position, offset, world=world, quater=quater)
'''
rotation should have shape batch_size * Joint_num * (3/4) * Time
position should have shape batch_size * 3 * Time
offset should have shape batch_size * Joint_num * 3
output have shape batch_size * Time * Joint_num * 3
'''
def forward(self, rotation: torch.Tensor, position: torch.Tensor, offset: torch.Tensor, order='xyz', quater=False, world=True):
if not quater and rotation.shape[-2] != 3: raise Exception('Unexpected shape of rotation')
if quater and rotation.shape[-2] != 4: raise Exception('Unexpected shape of rotation')
rotation = rotation.permute(0, 3, 1, 2)
position = position.permute(0, 2, 1)
result = torch.empty(rotation.shape[:-1] + (3, ), device=position.device)
norm = torch.norm(rotation, dim=-1, keepdim=True)
#norm[norm < 1e-10] = 1
rotation = rotation / norm
if quater:
transform = self.transform_from_quaternion(rotation)
else:
transform = self.transform_from_euler(rotation, order)
offset = offset.reshape((-1, 1, offset.shape[-2], offset.shape[-1], 1))
result[..., 0, :] = position
for i, pi in enumerate(self.topology):
if pi == -1:
assert i == 0
continue
transform[..., i, :, :] = torch.matmul(transform[..., pi, :, :], transform[..., i, :, :])
result[..., i, :] = torch.matmul(transform[..., i, :, :], offset[..., i, :, :]).squeeze()
if world: result[..., i, :] += result[..., pi, :]
return result
def from_local_to_world(self, res: torch.Tensor):
res = res.clone()
for i, pi in enumerate(self.topology):
if pi == 0 or pi == -1:
continue
res[..., i, :] += res[..., pi, :]
return res
@staticmethod
def transform_from_euler(rotation, order):
rotation = rotation / 180 * math.pi
transform = torch.matmul(ForwardKinematics.transform_from_axis(rotation[..., 1], order[1]),
ForwardKinematics.transform_from_axis(rotation[..., 2], order[2]))
transform = torch.matmul(ForwardKinematics.transform_from_axis(rotation[..., 0], order[0]), transform)
return transform
@staticmethod
def transform_from_axis(euler, axis):
transform = torch.empty(euler.shape[0:3] + (3, 3), device=euler.device)
cos = torch.cos(euler)
sin = torch.sin(euler)
cord = ord(axis) - ord('x')
transform[..., cord, :] = transform[..., :, cord] = 0
transform[..., cord, cord] = 1
if axis == 'x':
transform[..., 1, 1] = transform[..., 2, 2] = cos
transform[..., 1, 2] = -sin
transform[..., 2, 1] = sin
if axis == 'y':
transform[..., 0, 0] = transform[..., 2, 2] = cos
transform[..., 0, 2] = sin
transform[..., 2, 0] = -sin
if axis == 'z':
transform[..., 0, 0] = transform[..., 1, 1] = cos
transform[..., 0, 1] = -sin
transform[..., 1, 0] = sin
return transform
@staticmethod
def transform_from_quaternion(quater: torch.Tensor):
qw = quater[..., 0]
qx = quater[..., 1]
qy = quater[..., 2]
qz = quater[..., 3]
x2 = qx + qx
y2 = qy + qy
z2 = qz + qz
xx = qx * x2
yy = qy * y2
wx = qw * x2
xy = qx * y2
yz = qy * z2
wy = qw * y2
xz = qx * z2
zz = qz * z2
wz = qw * z2
m = torch.empty(quater.shape[:-1] + (3, 3), device=quater.device)
m[..., 0, 0] = 1.0 - (yy + zz)
m[..., 0, 1] = xy - wz
m[..., 0, 2] = xz + wy
m[..., 1, 0] = xy + wz
m[..., 1, 1] = 1.0 - (xx + zz)
m[..., 1, 2] = yz - wx
m[..., 2, 0] = xz - wy
m[..., 2, 1] = yz + wx
m[..., 2, 2] = 1.0 - (xx + yy)
return m
class InverseKinematics:
def __init__(self, rotations: torch.Tensor, positions: torch.Tensor, offset, parents, constrains):
self.rotations = rotations
self.rotations.requires_grad_(True)
self.position = positions
self.position.requires_grad_(True)
self.parents = parents
self.offset = offset
self.constrains = constrains
self.optimizer = torch.optim.Adam([self.position, self.rotations], lr=1e-3, betas=(0.9, 0.999))
self.crit = nn.MSELoss()
def step(self):
self.optimizer.zero_grad()
glb = self.forward(self.rotations, self.position, self.offset, order='', quater=True, world=True)
loss = self.crit(glb, self.constrains)
loss.backward()
self.optimizer.step()
self.glb = glb
return loss.item()
def tloss(self, time):
return self.crit(self.glb[time, :], self.constrains[time, :])
def all_loss(self):
res = [self.tloss(t).detach().numpy() for t in range(self.constrains.shape[0])]
return np.array(res)
'''
rotation should have shape batch_size * Joint_num * (3/4) * Time
position should have shape batch_size * 3 * Time
offset should have shape batch_size * Joint_num * 3
output have shape batch_size * Time * Joint_num * 3
'''
def forward(self, rotation: torch.Tensor, position: torch.Tensor, offset: torch.Tensor, order='xyz', quater=False,
world=True):
'''
if not quater and rotation.shape[-2] != 3: raise Exception('Unexpected shape of rotation')
if quater and rotation.shape[-2] != 4: raise Exception('Unexpected shape of rotation')
rotation = rotation.permute(0, 3, 1, 2)
position = position.permute(0, 2, 1)
'''
result = torch.empty(rotation.shape[:-1] + (3,), device=position.device)
norm = torch.norm(rotation, dim=-1, keepdim=True)
rotation = rotation / norm
if quater:
transform = self.transform_from_quaternion(rotation)
else:
transform = self.transform_from_euler(rotation, order)
offset = offset.reshape((-1, 1, offset.shape[-2], offset.shape[-1], 1))
result[..., 0, :] = position
for i, pi in enumerate(self.parents):
if pi == -1:
assert i == 0
continue
result[..., i, :] = torch.matmul(transform[..., pi, :, :], offset[..., i, :, :]).squeeze()
transform[..., i, :, :] = torch.matmul(transform[..., pi, :, :], transform[..., i, :, :])
if world: result[..., i, :] += result[..., pi, :]
return result
@staticmethod
def transform_from_euler(rotation, order):
rotation = rotation / 180 * math.pi
transform = torch.matmul(ForwardKinematics.transform_from_axis(rotation[..., 1], order[1]),
ForwardKinematics.transform_from_axis(rotation[..., 2], order[2]))
transform = torch.matmul(ForwardKinematics.transform_from_axis(rotation[..., 0], order[0]), transform)
return transform
@staticmethod
def transform_from_axis(euler, axis):
transform = torch.empty(euler.shape[0:3] + (3, 3), device=euler.device)
cos = torch.cos(euler)
sin = torch.sin(euler)
cord = ord(axis) - ord('x')
transform[..., cord, :] = transform[..., :, cord] = 0
transform[..., cord, cord] = 1
if axis == 'x':
transform[..., 1, 1] = transform[..., 2, 2] = cos
transform[..., 1, 2] = -sin
transform[..., 2, 1] = sin
if axis == 'y':
transform[..., 0, 0] = transform[..., 2, 2] = cos
transform[..., 0, 2] = sin
transform[..., 2, 0] = -sin
if axis == 'z':
transform[..., 0, 0] = transform[..., 1, 1] = cos
transform[..., 0, 1] = -sin
transform[..., 1, 0] = sin
return transform
@staticmethod
def transform_from_quaternion(quater: torch.Tensor):
qw = quater[..., 0]
qx = quater[..., 1]
qy = quater[..., 2]
qz = quater[..., 3]
x2 = qx + qx
y2 = qy + qy
z2 = qz + qz
xx = qx * x2
yy = qy * y2
wx = qw * x2
xy = qx * y2
yz = qy * z2
wy = qw * y2
xz = qx * z2
zz = qz * z2
wz = qw * z2
m = torch.empty(quater.shape[:-1] + (3, 3), device=quater.device)
m[..., 0, 0] = 1.0 - (yy + zz)
m[..., 0, 1] = xy - wz
m[..., 0, 2] = xz + wy
m[..., 1, 0] = xy + wz
m[..., 1, 1] = 1.0 - (xx + zz)
m[..., 1, 2] = yz - wx
m[..., 2, 0] = xz - wy
m[..., 2, 1] = yz + wx
m[..., 2, 2] = 1.0 - (xx + yy)
return m
| 36.233449 | 131 | 0.521877 | 1,272 | 10,399 | 4.203616 | 0.121855 | 0.005985 | 0.020946 | 0.026931 | 0.715728 | 0.695156 | 0.683935 | 0.662615 | 0.634187 | 0.625771 | 0 | 0.036943 | 0.315415 | 10,399 | 286 | 132 | 36.36014 | 0.714145 | 0.026637 | 0 | 0.671171 | 0 | 0 | 0.009929 | 0 | 0 | 0 | 0 | 0 | 0.009009 | 1 | 0.067568 | false | 0 | 0.018018 | 0.004505 | 0.153153 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ce0fee58832e03db2dedb448c80880f25c203aa | 800 | py | Python | mppi/Utilities/AttributeDict.py | marcodalessandro76/MPPI | ad60b73270b1f376ac501d47285146f1c3af457a | [
"MIT"
] | 1 | 2019-05-04T09:26:36.000Z | 2019-05-04T09:26:36.000Z | mppi/Utilities/AttributeDict.py | marcodalessandro76/MPPI | ad60b73270b1f376ac501d47285146f1c3af457a | [
"MIT"
] | null | null | null | mppi/Utilities/AttributeDict.py | marcodalessandro76/MPPI | ad60b73270b1f376ac501d47285146f1c3af457a | [
"MIT"
] | null | null | null | class AttributeDict(object):
"""
A class to convert a nested Dictionary into an object with key-values
accessibly using attribute notation (AttributeDict.attribute) instead of
key notation (Dict["key"]). This class recursively sets Dicts to objects,
allowing you to recurse down nested dicts (like: AttributeDict.attr.attr)
"""
def __init__(self, **entries):
self.add_entries(**entries)
def add_entries(self, **entries):
for key, value in entries.items():
if type(value) is dict:
self.__dict__[key] = AttributeDict(**value)
else:
self.__dict__[key] = value
def getAttributes(self):
"""
Return all the attributes of the object
"""
return self.__dict__.keys()
| 34.782609 | 77 | 0.62875 | 94 | 800 | 5.159574 | 0.531915 | 0.043299 | 0.045361 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.275 | 800 | 22 | 78 | 36.363636 | 0.836207 | 0.4125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ce16cb7ee2c1e090289468a70fd88401aba8ddc | 339 | py | Python | examples/xml-rpc/echoserver.py | keobox/yap101 | 26913da9f61ef3d0d9cb3ef54bbfc451a9ef9de9 | [
"MIT"
] | null | null | null | examples/xml-rpc/echoserver.py | keobox/yap101 | 26913da9f61ef3d0d9cb3ef54bbfc451a9ef9de9 | [
"MIT"
] | null | null | null | examples/xml-rpc/echoserver.py | keobox/yap101 | 26913da9f61ef3d0d9cb3ef54bbfc451a9ef9de9 | [
"MIT"
] | null | null | null | import SimpleXMLRPCServer as xmls
def echo(msg):
print 'Got', msg
return msg
class echoserver(xmls.SimpleXMLRPCServer):
allow_reuse_address = True
server = echoserver(('127.0.0.1', 8001))
server.register_function(echo, 'echo')
print 'Listening on port 8001'
try:
server.serve_forever()
except:
server.server_close()
| 19.941176 | 42 | 0.728614 | 45 | 339 | 5.377778 | 0.688889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049123 | 0.159292 | 339 | 16 | 43 | 21.1875 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.112094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.076923 | null | null | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ce32e38118a236e7f22400e28b670e7f2079e82 | 869 | py | Python | practice/2008/qualification/C-Fly_swatter/c.py | victorWeiFreelancer/CodeJam | edb8f921860a35985823cb3dbd3ebec8a8f3c12f | [
"MIT"
] | null | null | null | practice/2008/qualification/C-Fly_swatter/c.py | victorWeiFreelancer/CodeJam | edb8f921860a35985823cb3dbd3ebec8a8f3c12f | [
"MIT"
] | null | null | null | practice/2008/qualification/C-Fly_swatter/c.py | victorWeiFreelancer/CodeJam | edb8f921860a35985823cb3dbd3ebec8a8f3c12f | [
"MIT"
] | null | null | null | import sys
sys.dont_write_bytecode = True
def hitP(f, R, t, r, g):
if f>=g/2 :
return 0.0
missArea = 0.0
gridL = g+2*r
nGrids = (R - t) // gridL
missGridSideLength = g - 2*f
print("gridL %.12f; nGrids %d" %(gridL, nGrids) )
indentSquareLength = nGrids*gridL
remain = (R - t) - indentSquareLength
missArea += (nGrids * missGridSideLength)**2
remainMissArea = 0
if remain - 2*r > 2*f
if remain > g+r:
totalArea = R**2 / 4.0
print( "missed a %.12f, total area %.12f" %(missR**2, (R-t)**2) )
return (totalArea - missArea) / (R-t)**2
def main():
numTestCases = int(input())
for i in range(numTestCases):
f, R, t, r, g = list(map(float, input().split()))
p = hitP(f, R, t, r, g)
print( "Case #%d: %.6f" %(i+1, p))
if __name__ == '__main__':
main() | 25.558824 | 69 | 0.537399 | 127 | 869 | 3.598425 | 0.393701 | 0.030635 | 0.019694 | 0.026258 | 0.050328 | 0.039387 | 0 | 0 | 0 | 0 | 0 | 0.040519 | 0.289988 | 869 | 34 | 70 | 25.558824 | 0.700162 | 0 | 0 | 0 | 0 | 0 | 0.087356 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.037037 | null | null | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ce3c5ddd55aea55c48d0f942c54f7645b346e45 | 24,690 | py | Python | tests/test_date.py | andy-z/ged4py | 2270bd8366174dcc98424cc6671bdaecf770fda0 | [
"MIT"
] | 10 | 2017-07-25T22:39:34.000Z | 2022-03-01T04:40:38.000Z | tests/test_date.py | andy-z/ged4py | 2270bd8366174dcc98424cc6671bdaecf770fda0 | [
"MIT"
] | 20 | 2018-03-25T10:25:40.000Z | 2021-05-02T20:38:48.000Z | tests/test_date.py | andy-z/ged4py | 2270bd8366174dcc98424cc6671bdaecf770fda0 | [
"MIT"
] | 6 | 2018-04-29T12:45:34.000Z | 2021-09-14T14:30:52.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for `ged4py.date` module."""
import unittest
from ged4py.calendar import (
CalendarType, CalendarDate, FrenchDate, GregorianDate, HebrewDate, JulianDate,
CalendarDateVisitor
)
from ged4py.date import (
DateValue, DateValueAbout, DateValueAfter, DateValueBefore, DateValueCalculated,
DateValueEstimated, DateValueFrom, DateValueInterpreted, DateValuePeriod,
DateValuePhrase, DateValueRange, DateValueSimple, DateValueTo, DateValueTypes,
DateValueVisitor
)
class TestDateVisitor(CalendarDateVisitor, DateValueVisitor):
def visitGregorian(self, date):
if not isinstance(date, GregorianDate):
raise TypeError(str(type(date)))
return ("gregorian", date)
def visitJulian(self, date):
if not isinstance(date, JulianDate):
raise TypeError(str(type(date)))
return ("julian", date)
def visitHebrew(self, date):
if not isinstance(date, HebrewDate):
raise TypeError(str(type(date)))
return ("hebrew", date)
def visitFrench(self, date):
if not isinstance(date, FrenchDate):
raise TypeError(str(type(date)))
return ("french", date)
def visitSimple(self, date):
if not isinstance(date, DateValueSimple):
raise TypeError(str(type(date)))
return ("simple", date.date)
def visitPeriod(self, date):
if not isinstance(date, DateValuePeriod):
raise TypeError(str(type(date)))
return ("period", date.date1, date.date2)
def visitFrom(self, date):
if not isinstance(date, DateValueFrom):
raise TypeError(str(type(date)))
return ("from", date.date)
def visitTo(self, date):
if not isinstance(date, DateValueTo):
raise TypeError(str(type(date)))
return ("to", date.date)
def visitRange(self, date):
if not isinstance(date, DateValueRange):
raise TypeError(str(type(date)))
return ("range", date.date1, date.date2)
def visitBefore(self, date):
if not isinstance(date, DateValueBefore):
raise TypeError(str(type(date)))
return ("before", date.date)
def visitAfter(self, date):
if not isinstance(date, DateValueAfter):
raise TypeError(str(type(date)))
return ("after", date.date)
def visitAbout(self, date):
if not isinstance(date, DateValueAbout):
raise TypeError(str(type(date)))
return ("about", date.date)
def visitCalculated(self, date):
if not isinstance(date, DateValueCalculated):
raise TypeError(str(type(date)))
return ("calculated", date.date)
def visitEstimated(self, date):
if not isinstance(date, DateValueEstimated):
raise TypeError(str(type(date)))
return ("estimated", date.date)
def visitInterpreted(self, date):
if not isinstance(date, DateValueInterpreted):
raise TypeError(str(type(date)))
return ("interpreted", date.date, date.phrase)
def visitPhrase(self, date):
if not isinstance(date, DateValuePhrase):
raise TypeError(str(type(date)))
return ("phrase", date.phrase)
class TestDetailDate(unittest.TestCase):
"""Tests for `ged4py.date` module."""
def test_001_cal_date(self):
"""Test date.CalendarDate class."""
date = GregorianDate(2017, "OCT", 9)
self.assertEqual(date.year, 2017)
self.assertIsNone(date.dual_year)
self.assertFalse(date.bc)
self.assertEqual(date.year_str, "2017")
self.assertEqual(date.month, "OCT")
self.assertEqual(date.month_num, 10)
self.assertEqual(date.day, 9)
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = GregorianDate(2017, "OCT", bc=True)
self.assertEqual(date.year, 2017)
self.assertIsNone(date.dual_year)
self.assertTrue(date.bc)
self.assertEqual(date.year_str, "2017 B.C.")
self.assertEqual(date.month, "OCT")
self.assertEqual(date.month_num, 10)
self.assertIsNone(date.day)
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = GregorianDate(1699, "FEB", dual_year=1700)
self.assertEqual(date.year, 1699)
self.assertEqual(date.dual_year, 1700)
self.assertFalse(date.bc)
self.assertEqual(date.year_str, "1699/00")
self.assertEqual(date.month, "FEB")
self.assertEqual(date.month_num, 2)
self.assertIsNone(date.day)
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = HebrewDate(5000)
self.assertEqual(date.year, 5000)
self.assertFalse(date.bc)
self.assertEqual(date.year_str, "5000")
self.assertIsNone(date.month)
self.assertIsNone(date.month_num)
self.assertIsNone(date.day)
self.assertEqual(date.calendar, CalendarType.HEBREW)
date = FrenchDate(1, "FRUC", 1)
self.assertEqual(date.year, 1)
self.assertFalse(date.bc)
self.assertEqual(date.year_str, "1")
self.assertEqual(date.month, "FRUC")
self.assertEqual(date.month_num, 12)
self.assertEqual(date.day, 1)
self.assertEqual(date.calendar, CalendarType.FRENCH_R)
date = JulianDate(5, "JAN", bc=True)
self.assertEqual(date.year, 5)
self.assertTrue(date.bc)
self.assertEqual(date.year_str, "5 B.C.")
self.assertEqual(date.month, "JAN")
self.assertEqual(date.month_num, 1)
self.assertIsNone(date.day)
self.assertEqual(date.calendar, CalendarType.JULIAN)
def test_002_cal_date_key(self):
"""Test date.CalendarDate class."""
date = GregorianDate(2017, "OCT", 9)
self.assertEqual(date.key(), (2458035.5, 0))
date = GregorianDate(1699, "FEB", 1, dual_year=1700)
self.assertEqual(date.key(), (2342003.5, 0))
date = FrenchDate(2017, "VENT", bc=True)
self.assertEqual(date.key(), (1638959.5, 1))
date = HebrewDate(2017, "TSH", 22)
self.assertEqual(date.key(), (1084542.5, 0))
date = JulianDate(1000)
self.assertEqual(date.key(), (2086672.5, 1))
def test_003_cal_date_cmp(self):
"""Test date.CalendarDate class."""
self.assertTrue(GregorianDate(2016, "JAN", 1) < GregorianDate(2017, "JAN", 1))
self.assertTrue(GregorianDate(2017, "JAN", 1) < GregorianDate(2017, "FEB", 1))
self.assertTrue(GregorianDate(2017, "JAN", 1) < GregorianDate(2017, "JAN", 2))
self.assertTrue(GregorianDate(2017, "JAN", 1) <= GregorianDate(2017, "JAN", 2))
self.assertTrue(GregorianDate(2017, "JAN", 2) > GregorianDate(2017, "JAN", 1))
self.assertTrue(GregorianDate(2017, "JAN", 2) >= GregorianDate(2017, "JAN", 1))
self.assertTrue(GregorianDate(2017, "JAN", 1) == GregorianDate(2017, "JAN", 1))
self.assertTrue(GregorianDate(2017, "JAN", 1) != GregorianDate(2017, "JAN", 2))
# missing day compares as "past" the last day of month, but before next month
self.assertTrue(GregorianDate(2017, "JAN") > GregorianDate(2017, "JAN", 31))
self.assertTrue(GregorianDate(2017, "JAN") < GregorianDate(2017, "FEB", 1))
# missing month compares as "past" the last day of year, but before next year
self.assertTrue(GregorianDate(2017) > GregorianDate(2017, "DEC", 31))
self.assertTrue(GregorianDate(2017) < GregorianDate(2018, "JAN", 1))
# dual date
self.assertTrue(GregorianDate(1700, "JAN", 1) == GregorianDate(1699, "JAN", 1, dual_year=1700))
# compare Gregorian and Julian dates
self.assertTrue(GregorianDate(1582, "OCT", 15) == JulianDate(1582, "OCT", 5))
self.assertTrue(GregorianDate(1582, "OCT", 16) > JulianDate(1582, "OCT", 5))
self.assertTrue(JulianDate(1582, "OCT", 6) > GregorianDate(1582, "OCT", 15))
self.assertTrue(GregorianDate(2000, "JAN", 14) == JulianDate(2000, "JAN", 1))
# compare Gregorian and French dates
self.assertTrue(GregorianDate(1792, "SEP", 22) == FrenchDate(1, "VEND", 1))
self.assertTrue(GregorianDate(1792, "SEP", 23) > FrenchDate(1, "VEND", 1))
self.assertTrue(FrenchDate(1, "VEND", 2) > GregorianDate(1792, "SEP", 22))
self.assertTrue(GregorianDate(2020, "SEP", 21) == FrenchDate(228, "COMP", 5))
# compare Gregorian and Hebrew dates
self.assertTrue(GregorianDate(2020, "JAN", 1) == HebrewDate(5780, "SVN", 4))
def test_004_cal_date_str(self):
"""Test date.CalendarDate class."""
date = GregorianDate(2017, "OCT", 9)
self.assertEqual(str(date), "9 OCT 2017")
date = GregorianDate(2017, "OCT", bc=True)
self.assertEqual(str(date), "OCT 2017 B.C.")
date = GregorianDate(1699, "JAN", 1, dual_year=1700)
self.assertEqual(str(date), "1 JAN 1699/00")
date = HebrewDate(5000)
self.assertEqual(str(date), "@#DHEBREW@ 5000")
date = FrenchDate(1, "VEND", 1)
self.assertEqual(str(date), "@#DFRENCH R@ 1 VEND 1")
date = JulianDate(1582, "OCT", 5)
self.assertEqual(str(date), "@#DJULIAN@ 5 OCT 1582")
def test_005_cal_date_parse(self):
"""Test date.CalendarDate.parse method."""
date = CalendarDate.parse("31 MAY 2020")
self.assertIsInstance(date, GregorianDate)
self.assertEqual(date.year, 2020)
self.assertIsNone(date.dual_year)
self.assertFalse(date.bc)
self.assertEqual(date.month, "MAY")
self.assertEqual(date.month_num, 5)
self.assertEqual(date.day, 31)
self.assertEqual(date.original, "31 MAY 2020")
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = CalendarDate.parse("@#DGREGORIAN@ 10 MAR 1698/99")
self.assertIsInstance(date, GregorianDate)
self.assertEqual(date.year, 1698)
self.assertEqual(date.dual_year, 1699)
self.assertFalse(date.bc)
self.assertEqual(date.month, "MAR")
self.assertEqual(date.month_num, 3)
self.assertEqual(date.day, 10)
self.assertEqual(date.original, "@#DGREGORIAN@ 10 MAR 1698/99")
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = CalendarDate.parse("10 MAR 1699/00")
self.assertIsInstance(date, GregorianDate)
self.assertEqual(date.year, 1699)
self.assertEqual(date.dual_year, 1700)
self.assertEqual(date.original, "10 MAR 1699/00")
self.assertEqual(date.calendar, CalendarType.GREGORIAN)
date = CalendarDate.parse("@#DJULIAN@ 100 B.C.")
self.assertIsInstance(date, JulianDate)
self.assertEqual(date.year, 100)
self.assertTrue(date.bc)
self.assertIsNone(date.month)
self.assertIsNone(date.month_num)
self.assertIsNone(date.day)
self.assertEqual(date.original, "@#DJULIAN@ 100 B.C.")
self.assertEqual(date.calendar, CalendarType.JULIAN)
date = CalendarDate.parse("@#DFRENCH R@ 15 GERM 0001")
self.assertIsInstance(date, FrenchDate)
self.assertEqual(date.year, 1)
self.assertFalse(date.bc)
self.assertEqual(date.month, "GERM")
self.assertEqual(date.month_num, 7)
self.assertEqual(date.day, 15)
self.assertEqual(date.original, "@#DFRENCH R@ 15 GERM 0001")
self.assertEqual(date.calendar, CalendarType.FRENCH_R)
date = CalendarDate.parse("@#DHEBREW@ 7 NSN 5000")
self.assertIsInstance(date, HebrewDate)
self.assertEqual(date.year, 5000)
self.assertFalse(date.bc)
self.assertEqual(date.month, "NSN")
self.assertEqual(date.month_num, 8)
self.assertEqual(date.day, 7)
self.assertEqual(date.original, "@#DHEBREW@ 7 NSN 5000")
self.assertEqual(date.calendar, CalendarType.HEBREW)
# cannot handle ROMAN
with self.assertRaises(ValueError):
date = CalendarDate.parse("@#DROMAN@ 2020")
# cannot handle UNKNOWN
with self.assertRaises(ValueError):
date = CalendarDate.parse("@#DUNKNOWN@ 2020")
# dual year only works for GREGORIAN
with self.assertRaises(ValueError):
date = CalendarDate.parse("@#DJULIAN@ 2020/21")
# cannot parse nonsense
with self.assertRaises(ValueError):
date = CalendarDate.parse("start of time")
def test_006_cal_date_visitor(self):
"""Test date.CalendarDate.accept method."""
visitor = TestDateVisitor()
date = GregorianDate(2017, "OCT", 9)
value = date.accept(visitor)
self.assertEqual(value, ("gregorian", date))
date = HebrewDate(5000)
value = date.accept(visitor)
self.assertEqual(value, ("hebrew", date))
date = FrenchDate(1, "VEND", 1)
value = date.accept(visitor)
self.assertEqual(value, ("french", date))
date = JulianDate(1582, "OCT", 5)
value = date.accept(visitor)
self.assertEqual(value, ("julian", date))
def test_007_cal_date_hash(self):
"""Test date.CalendarDate hash."""
self.assertEqual(hash(GregorianDate(2017, "OCT", 9)),
hash(GregorianDate(2017, "OCT", 9)))
self.assertEqual(hash(GregorianDate(2017, "OCT", 9, bc=True)),
hash(GregorianDate(2017, "OCT", 9, bc=True)))
self.assertEqual(hash(FrenchDate(1, "VEND", 1)),
hash(FrenchDate(1, "VEND", 1)))
self.assertEqual(hash(FrenchDate(1)),
hash(FrenchDate(1)))
def test_010_date_no_date(self):
"""Test date.DateValue class."""
date = DateValue.parse("not a date")
self.assertIsInstance(date, DateValuePhrase)
self.assertEqual(date.kind, DateValueTypes.PHRASE)
self.assertEqual(date.phrase, "not a date")
self.assertEqual(str(date), "(not a date)")
def test_012_date_parse_period(self):
"""Test date.DateValue class."""
date = DateValue.parse("FROM 1967")
self.assertIsInstance(date, DateValueFrom)
self.assertEqual(date.kind, DateValueTypes.FROM)
self.assertEqual(date.date, GregorianDate(1967))
self.assertEqual(str(date), "FROM 1967")
date = DateValue.parse("TO 1 JAN 2017")
self.assertIsInstance(date, DateValueTo)
self.assertEqual(date.kind, DateValueTypes.TO)
self.assertEqual(date.date, GregorianDate(2017, "JAN", 1))
self.assertEqual(str(date), "TO 1 JAN 2017")
date = DateValue.parse("FROM 1920 TO 2000")
self.assertIsInstance(date, DateValuePeriod)
self.assertEqual(date.kind, DateValueTypes.PERIOD)
self.assertEqual(date.date1, GregorianDate(1920))
self.assertEqual(date.date2, GregorianDate(2000))
self.assertEqual(str(date), "FROM 1920 TO 2000")
date = DateValue.parse("from mar 1920 to 1 apr 2000")
self.assertIsInstance(date, DateValuePeriod)
self.assertEqual(date.kind, DateValueTypes.PERIOD)
self.assertEqual(date.date1, GregorianDate(1920, "MAR"))
self.assertEqual(date.date2, GregorianDate(2000, "APR", 1))
self.assertEqual(str(date), "FROM MAR 1920 TO 1 APR 2000")
def test_013_date_parse_range(self):
"""Test date.DateValue class."""
date = DateValue.parse("BEF 1967B.C.")
self.assertIsInstance(date, DateValueBefore)
self.assertEqual(date.kind, DateValueTypes.BEFORE)
self.assertEqual(date.date, GregorianDate(1967, bc=True))
self.assertEqual(str(date), "BEFORE 1967 B.C.")
date = DateValue.parse("AFT 1 JAN 2017")
self.assertIsInstance(date, DateValueAfter)
self.assertEqual(date.kind, DateValueTypes.AFTER)
self.assertEqual(date.date, GregorianDate(2017, "JAN", 1))
self.assertEqual(str(date), "AFTER 1 JAN 2017")
date = DateValue.parse("BET @#DJULIAN@ 1600 AND 2000")
self.assertIsInstance(date, DateValueRange)
self.assertEqual(date.kind, DateValueTypes.RANGE)
self.assertEqual(date.date1, JulianDate(1600))
self.assertEqual(date.date2, GregorianDate(2000))
self.assertEqual(str(date), "BETWEEN @#DJULIAN@ 1600 AND 2000")
date = DateValue.parse("bet mar 1920 and apr 2000")
self.assertIsInstance(date, DateValueRange)
self.assertEqual(date.kind, DateValueTypes.RANGE)
self.assertEqual(date.date1, GregorianDate(1920, "MAR"))
self.assertEqual(date.date2, GregorianDate(2000, "APR"))
self.assertEqual(str(date), "BETWEEN MAR 1920 AND APR 2000")
def test_014_date_parse_approx(self):
"""Test date.DateValue class."""
dates = {"500 B.C.": GregorianDate(500, bc=True),
"JAN 2017": GregorianDate(2017, "JAN"),
"31 JAN 2017": GregorianDate(2017, "JAN", 31)}
approx = [
("ABT", "ABOUT", DateValueAbout, DateValueTypes.ABOUT),
("CAL", "CALCULATED", DateValueCalculated, DateValueTypes.CALCULATED),
("EST", "ESTIMATED", DateValueEstimated, DateValueTypes.ESTIMATED)
]
for appr, fmt, klass, typeEnum in approx:
for datestr, value in dates.items():
date = DateValue.parse(appr + " " + datestr)
self.assertIsInstance(date, klass)
self.assertEqual(date.kind, typeEnum)
self.assertEqual(str(date), fmt + " " + datestr)
self.assertEqual(date.date, value)
def test_015_date_parse_phrase(self):
"""Test date.DateValue class."""
date = DateValue.parse("(some phrase)")
self.assertIsInstance(date, DateValuePhrase)
self.assertEqual(date.kind, DateValueTypes.PHRASE)
self.assertEqual(date.phrase, "some phrase")
date = DateValue.parse("INT 1967 B.C. (some phrase)")
self.assertIsInstance(date, DateValueInterpreted)
self.assertEqual(date.kind, DateValueTypes.INTERPRETED)
self.assertEqual(date.date, GregorianDate(1967, bc=True))
self.assertEqual(date.phrase, "some phrase")
self.assertEqual(str(date), "INTERPRETED 1967 B.C. (some phrase)")
date = DateValue.parse("INT @#DGREGORIAN@ 1 JAN 2017 (some phrase)")
self.assertIsInstance(date, DateValueInterpreted)
self.assertEqual(date.kind, DateValueTypes.INTERPRETED)
self.assertEqual(date.date, GregorianDate(2017, "JAN", 1))
self.assertEqual(date.phrase, "some phrase")
self.assertEqual(str(date), "INTERPRETED 1 JAN 2017 (some phrase)")
def test_016_date_parse_simple(self):
"""Test date.DateValue class."""
date = DateValue.parse("1967 B.C.")
self.assertIsInstance(date, DateValueSimple)
self.assertEqual(date.kind, DateValueTypes.SIMPLE)
self.assertEqual(date.date, GregorianDate(1967, bc=True))
self.assertEqual(str(date), "1967 B.C.")
date = DateValue.parse("@#DGREGORIAN@ 1 JAN 2017")
self.assertIsInstance(date, DateValueSimple)
self.assertEqual(date.kind, DateValueTypes.SIMPLE)
self.assertEqual(date.date, GregorianDate(2017, "JAN", 1))
self.assertEqual(str(date), "1 JAN 2017")
def test_017_date_cmp(self):
"""Test date.Date class."""
dv = DateValue.parse("2016")
self.assertIsInstance(dv.key(), tuple)
self.assertEqual(dv.key(), (GregorianDate(2016), GregorianDate(2016)))
dv = DateValue.parse("31 DEC 2000")
self.assertIsInstance(dv.key(), tuple)
self.assertEqual(dv.key(), (GregorianDate(2000, "DEC", 31), GregorianDate(2000, "DEC", 31)))
dv = DateValue.parse("BET 31 DEC 2000 AND 1 JAN 2001")
self.assertIsInstance(dv.key(), tuple)
self.assertEqual(dv.key(), (GregorianDate(2000, "DEC", 31), GregorianDate(2001, "JAN", 1)))
# order of dates is messed up
dv = DateValue.parse("BET 31 DEC 2000 AND 1 JAN 2000")
self.assertIsInstance(dv.key(), tuple)
self.assertEqual(dv.key(), (GregorianDate(2000, "DEC", 31), GregorianDate(2000, "JAN", 1)))
self.assertTrue(DateValue.parse("2016") < DateValue.parse("2017"))
self.assertTrue(DateValue.parse("2 JAN 2016") > DateValue.parse("1 JAN 2016"))
self.assertTrue(DateValue.parse("BET 1900 AND 2000") < DateValue.parse("FROM 1920 TO 1999"))
# comparing simple date with range
self.assertTrue(DateValue.parse("1 JAN 2000") > DateValue.parse("BET 1 JAN 1999 AND 1 JAN 2000"))
self.assertNotEqual(DateValue.parse("1 JAN 2000"), DateValue.parse("BET 1 JAN 2000 AND 1 JAN 2001"))
self.assertTrue(DateValue.parse("1 JAN 2000") < DateValue.parse("BET 1 JAN 2000 AND 1 JAN 2001"))
self.assertTrue(DateValue.parse("1 JAN 2000") > DateValue.parse("BEF 1 JAN 2000"))
self.assertTrue(DateValue.parse("1 JAN 2000") > DateValue.parse("TO 1 JAN 2000"))
self.assertTrue(DateValue.parse("1 JAN 2000") < DateValue.parse("AFT 1 JAN 2000"))
self.assertTrue(DateValue.parse("1 JAN 2000") < DateValue.parse("FROM 1 JAN 2000"))
# comparing ranges
self.assertEqual(DateValue.parse("FROM 1 JAN 2000 TO 1 JAN 2001"),
DateValue.parse("BET 1 JAN 2000 AND 1 JAN 2001"))
self.assertTrue(DateValue.parse("FROM 1 JAN 1999 TO 1 JAN 2001") <
DateValue.parse("BET 1 JAN 2000 AND 1 JAN 2001"))
self.assertTrue(DateValue.parse("FROM 1 JAN 2000 TO 1 JAN 2002") >
DateValue.parse("BET 1 JAN 2000 AND 1 JAN 2001"))
# Less specific date compares later than more specific
self.assertTrue(DateValue.parse("2000") > DateValue.parse("31 DEC 2000"))
self.assertTrue(DateValue.parse("DEC 2000") > DateValue.parse("31 DEC 2000"))
# phrase is always later than any regular date
self.assertTrue(DateValue.parse("(Could be 1996 or 1998)") > DateValue.parse("2000"))
# "empty" date is always later than any regular date
self.assertTrue(DateValue.parse("") > DateValue.parse("2000"))
def test_018_date_parse_empty(self):
"""Test date.DateValue class."""
for value in (None, ""):
date = DateValue.parse(value)
self.assertIsInstance(date, DateValuePhrase)
self.assertEqual(date.kind, DateValueTypes.PHRASE)
self.assertIsNone(date.phrase)
self.assertEqual(str(date), "")
def test_019_date_value_visitor(self):
"""Test date.DateValue class."""
visitor = TestDateVisitor()
date1 = GregorianDate(2017, "JAN", 1)
date2 = GregorianDate(2017, "DEC", 31)
value = DateValueSimple(date1).accept(visitor)
self.assertEqual(value, ("simple", date1))
value = DateValueFrom(date1).accept(visitor)
self.assertEqual(value, ("from", date1))
value = DateValueTo(date1).accept(visitor)
self.assertEqual(value, ("to", date1))
value = DateValuePeriod(date1, date2).accept(visitor)
self.assertEqual(value, ("period", date1, date2))
value = DateValueBefore(date1).accept(visitor)
self.assertEqual(value, ("before", date1))
value = DateValueAfter(date1).accept(visitor)
self.assertEqual(value, ("after", date1))
value = DateValueRange(date1, date2).accept(visitor)
self.assertEqual(value, ("range", date1, date2))
value = DateValueAbout(date1).accept(visitor)
self.assertEqual(value, ("about", date1))
value = DateValueCalculated(date1).accept(visitor)
self.assertEqual(value, ("calculated", date1))
value = DateValueEstimated(date1).accept(visitor)
self.assertEqual(value, ("estimated", date1))
value = DateValueInterpreted(date1, "phrase").accept(visitor)
self.assertEqual(value, ("interpreted", date1, "phrase"))
value = DateValuePhrase("phrase").accept(visitor)
self.assertEqual(value, ("phrase", "phrase"))
def test_020_date_hash(self):
"""Test date.Date hash"""
dv1 = DateValue.parse("2016")
dv2 = DateValue.parse("2016")
self.assertEqual(hash(dv1), hash(dv2))
dv1 = DateValue.parse("31 DEC 2000")
dv2 = DateValue.parse("31 DEC 2000")
self.assertEqual(hash(dv1), hash(dv2))
dv1 = DateValue.parse("BET 31 DEC 2000 AND 1 JAN 2001")
dv2 = DateValue.parse("BET 31 DEC 2000 AND 1 JAN 2001")
self.assertEqual(hash(dv1), hash(dv2))
| 41.356784 | 108 | 0.637708 | 2,825 | 24,690 | 5.540177 | 0.095221 | 0.147594 | 0.127468 | 0.029519 | 0.685515 | 0.588844 | 0.440419 | 0.387196 | 0.343684 | 0.308223 | 0 | 0.067049 | 0.22981 | 24,690 | 596 | 109 | 41.426175 | 0.755995 | 0.047833 | 0 | 0.314943 | 0 | 0 | 0.096256 | 0 | 0 | 0 | 0 | 0 | 0.565517 | 1 | 0.075862 | false | 0 | 0.006897 | 0 | 0.124138 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ce4ea0979a0d8bcdfade749e59f8ad94da264f2 | 3,487 | py | Python | var/spack/repos/builtin/packages/visionary-dev-tools/package.py | electronicvisions/spack | d6121eb35b4948f7d8aef7ec7a305a5123a7439e | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 2 | 2019-02-10T13:47:48.000Z | 2019-04-17T13:05:17.000Z | var/spack/repos/builtin/packages/visionary-dev-tools/package.py | einc-eu/spack | 15468b92ed21d970c0111ae19144e85e66746433 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 8 | 2021-05-28T06:39:59.000Z | 2022-03-30T15:12:35.000Z | var/spack/repos/builtin/packages/visionary-dev-tools/package.py | einc-eu/spack | 15468b92ed21d970c0111ae19144e85e66746433 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 2 | 2018-04-06T09:04:11.000Z | 2020-01-24T12:52:12.000Z | # Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path as osp
class VisionaryDevTools(Package):
"""Developer convenience packages common to all visionary
development meta packages. Application specific build tools belong
to the dedicated meta packages."""
homepage = ''
# some random tarball, to make `spack fetch --dependencies visionary-defaults` work
url = 'https://github.com/electronicvisions/spack/archive/v0.8.tar.gz'
# This is only a dummy tarball (see difference between version numbers)
# TODO: as soon as a MetaPackage-concept has been merged, please update this package
version('1.0', '372ce038842f20bf0ae02de50c26e85d', url='https://github.com/electronicvisions/spack/archive/v0.8.tar.gz')
depends_on('ack')
depends_on('autoconf')
depends_on('automake')
depends_on('bash-completion')
depends_on('bazel')
depends_on('bear')
depends_on('cairo +X')
depends_on('cloc')
depends_on('cmake')
depends_on('connect-proxy')
depends_on('cppcheck +htmlreport')
depends_on('cquery')
depends_on('doxygen+graphviz')
depends_on('emacs ~X')
depends_on('gdb')
depends_on('genpybind')
depends_on('git+tcltk')
depends_on('git-fat-git')
depends_on('gtkplus')
depends_on('imagemagick')
depends_on('jq')
depends_on('libpcap')
depends_on('libtool')
depends_on('llvm+visionary+python~libcxx build_type=Release')
depends_on('mercurial')
depends_on('mosh')
depends_on('munge')
depends_on('ncdu')
depends_on('node-js')
depends_on('octave+fftw')
depends_on('openssh')
depends_on('pigz')
depends_on('pkg-config')
depends_on('py-autopep8')
depends_on('py-black', when="^python@3.6.0:")
depends_on('py-configargparse')
depends_on('py-doxypypy')
depends_on('py-flake8')
depends_on('py-gdbgui')
depends_on('py-git-review')
depends_on('py-ipython')
depends_on('py-jedi')
depends_on('py-junit-xml')
depends_on('py-language-server')
depends_on('py-line-profiler')
depends_on('py-nose')
depends_on('py-nose2')
depends_on('py-memory-profiler')
depends_on('py-pudb')
depends_on('py-pylint@:1.999.999', when="^python@:2.999.999")
depends_on('py-pylint', when="^python@3.4.0:")
depends_on('py-pyserial')
depends_on('py-pytest')
depends_on('py-pytest-xdist')
depends_on('py-ranger-fm')
depends_on('py-sqlalchemy')
depends_on('py-virtualenv')
depends_on('py-xmlrunner')
depends_on('py-yq')
depends_on('rtags')
depends_on('tar')
depends_on('texinfo')
# ECM (2020-05-14): removed 'the-silver-searcher' due to build fail on gcc@10.1.0
depends_on('tig')
depends_on('time')
depends_on('tmux')
depends_on('units')
depends_on('valgrind')
depends_on('verilator')
depends_on('vim +python +ruby +perl +cscope +huge +x')
depends_on('visionary-xilinx')
depends_on('wget')
depends_on('yaml-cpp+shared')
depends_on('zsh')
def install(self, spec, prefix):
mkdirp(prefix.etc)
# store a copy of this package.
filename = osp.basename(osp.dirname(__file__)) # gives name of parent folder
install(__file__, join_path(prefix.etc, filename + '.py'))
# we could create some filesystem view here?
| 33.854369 | 124 | 0.677086 | 474 | 3,487 | 4.805907 | 0.478903 | 0.288411 | 0.125549 | 0.014925 | 0.04741 | 0.04741 | 0.04741 | 0.04741 | 0.04741 | 0.04741 | 0 | 0.024756 | 0.177516 | 3,487 | 102 | 125 | 34.186275 | 0.769526 | 0.217666 | 0 | 0 | 0 | 0 | 0.34507 | 0.022239 | 0 | 0 | 0 | 0.009804 | 0 | 1 | 0.012195 | false | 0 | 0.012195 | 0 | 0.060976 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7ceab378038506dba92e4b8d3ecd8a07fc74f4a2 | 1,469 | py | Python | tests/unit/peapods/runtimes/remote/ssh/test_ssh_remote.py | yk/jina | ab66e233e74b956390f266881ff5dc4e0110d3ff | [
"Apache-2.0"
] | 1 | 2020-12-23T12:34:00.000Z | 2020-12-23T12:34:00.000Z | tests/unit/peapods/runtimes/remote/ssh/test_ssh_remote.py | yk/jina | ab66e233e74b956390f266881ff5dc4e0110d3ff | [
"Apache-2.0"
] | null | null | null | tests/unit/peapods/runtimes/remote/ssh/test_ssh_remote.py | yk/jina | ab66e233e74b956390f266881ff5dc4e0110d3ff | [
"Apache-2.0"
] | null | null | null | import pytest
from jina.enums import RemoteAccessType
from jina.flow import Flow
from jina.parser import set_pea_parser, set_pod_parser
from jina.peapods.pods import BasePod
from jina.peapods.runtimes.remote.ssh import SSHRuntime
from jina.proto import jina_pb2
@pytest.mark.skip('works locally, but until I findout how to mock ssh, this has to be skipped')
def test_ssh_pea():
p = set_pea_parser().parse_args(['--host', 'pi@172.16.1.110', '--timeout', '5000'])
with SSHRuntime(p, kind='pea') as pp:
assert pp.status.envelope.status.code == jina_pb2.StatusProto.READY
assert pp.status is None
@pytest.mark.skip('works locally, but until I find out how to mock ssh, this has to be skipped')
def test_ssh_pod():
p = set_pod_parser().parse_args(['--host', 'pi@172.16.1.110', '--timeout', '5000'])
with SSHRuntime(p, kind='pod') as pp:
assert pp.status.envelope.status.code == jina_pb2.StatusProto.READY
assert pp.status is None
@pytest.mark.skip('not implemented yet')
def test_ssh_mutable_pod():
p = set_pod_parser().parse_args(['--host', 'pi@172.16.1.110', '--timeout', '5000'])
p = BasePod(p)
with SSHRuntime(p, kind='pod') as pp:
assert pp.status.envelope.status.code == jina_pb2.StatusProto.READY
assert pp.status is None
@pytest.mark.skip('not implemented yet')
def test_flow():
f = Flow().add().add(host='pi@172.16.1.110', remote_access=RemoteAccessType.SSH)
with f:
pass
| 32.644444 | 96 | 0.701157 | 235 | 1,469 | 4.27234 | 0.297872 | 0.047809 | 0.083665 | 0.043825 | 0.681275 | 0.681275 | 0.666335 | 0.666335 | 0.610558 | 0.610558 | 0 | 0.042037 | 0.157931 | 1,469 | 44 | 97 | 33.386364 | 0.769604 | 0 | 0 | 0.451613 | 0 | 0 | 0.21307 | 0 | 0 | 0 | 0 | 0 | 0.193548 | 1 | 0.129032 | false | 0.032258 | 0.225806 | 0 | 0.354839 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cec971c07d5ed98dc62f84b80e44472db92d7d3 | 531 | py | Python | Uber/validExpression.py | Nithanaroy/random_scripts | 908e539e2b7050a09e03b4fc0d2621b23733d65a | [
"MIT"
] | null | null | null | Uber/validExpression.py | Nithanaroy/random_scripts | 908e539e2b7050a09e03b4fc0d2621b23733d65a | [
"MIT"
] | null | null | null | Uber/validExpression.py | Nithanaroy/random_scripts | 908e539e2b7050a09e03b4fc0d2621b23733d65a | [
"MIT"
] | null | null | null | def main(expr):
openingParams = '({['
closingParams = ')}]'
stack = []
for c in expr:
if c in openingParams:
stack.append(c)
elif c in closingParams:
topOfStack = stack.pop()
openingIndex = openingParams.find(topOfStack)
closingIndex = closingParams.find(c)
if openingIndex is not closingIndex:
return False
if len(stack) == 0:
return True
return False
if __name__ =='__main__':
print main('{(abc})')
| 25.285714 | 57 | 0.551789 | 53 | 531 | 5.377358 | 0.509434 | 0.031579 | 0.091228 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002874 | 0.344633 | 531 | 20 | 58 | 26.55 | 0.816092 | 0 | 0 | 0.111111 | 0 | 0 | 0.039548 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cf38c52d649f28843e5da3730409c34a52dc82f | 8,026 | py | Python | platform/gcutil/lib/google_compute_engine/gcutil_lib/address_cmds_test.py | IsaacHuang/google-cloud-sdk | 52afa5d1a75dff08f4f5380c5cccc015bf796ca5 | [
"Apache-2.0"
] | null | null | null | platform/gcutil/lib/google_compute_engine/gcutil_lib/address_cmds_test.py | IsaacHuang/google-cloud-sdk | 52afa5d1a75dff08f4f5380c5cccc015bf796ca5 | [
"Apache-2.0"
] | null | null | null | platform/gcutil/lib/google_compute_engine/gcutil_lib/address_cmds_test.py | IsaacHuang/google-cloud-sdk | 52afa5d1a75dff08f4f5380c5cccc015bf796ca5 | [
"Apache-2.0"
] | 2 | 2020-07-25T05:03:06.000Z | 2020-11-04T04:55:57.000Z | # Copyright 2012 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unit tests for address collection commands."""
import path_initializer
path_initializer.InitSysPath()
import json
import unittest
import gflags as flags
from gcutil_lib import address_cmds
from gcutil_lib import gcutil_unittest
from gcutil_lib import mock_api
from gcutil_lib import mock_lists
FLAGS = flags.FLAGS
class AddressCmdsTest(gcutil_unittest.GcutilTestCase):
def setUp(self):
self.mock, self.api = mock_api.CreateApi(self.version)
def testReserveAddressPromptsForRegion(self):
expected_project = 'test_project'
expected_address = 'test_address'
expected_description = 'test address'
expected_region = 'test-region'
expected_source_address = '123.123.123.1'
set_flags = {
'project': expected_project,
'description': expected_description,
'source_address': expected_source_address,
}
command = self._CreateAndInitializeCommand(
address_cmds.ReserveAddress, 'reserveaddress', set_flags=set_flags)
mock_lists.GetSampleRegionListCall(
command, self.mock, num_responses=1, name=[expected_region])
call = self.mock.Respond('compute.addresses.insert', {})
command.Handle(expected_address)
request = call.GetRequest()
self.assertEqual('POST', request.method)
self.assertEqual(expected_project, request.parameters['project'])
self.assertEquals(expected_region, request.parameters['region'])
body = json.loads(request.body)
self.assertEqual(body['name'], expected_address)
self.assertEqual(body['description'], expected_description)
self.assertEquals(body['address'], expected_source_address)
def testReserveAddressGeneratesCorrectRequest(self):
expected_project = 'test_project'
expected_address = 'test_address'
expected_description = 'test address'
submitted_region = 'test-region'
expected_source_address = '123.123.123.1'
set_flags = {
'project': expected_project,
'description': expected_description,
'region': submitted_region,
'source_address': expected_source_address,
}
command = self._CreateAndInitializeCommand(
address_cmds.ReserveAddress, 'reserveaddress', set_flags=set_flags)
call = self.mock.Respond('compute.addresses.insert', {})
command.Handle(expected_address)
request = call.GetRequest()
self.assertEqual('POST', request.method)
self.assertEqual(expected_project, request.parameters['project'])
self.assertEquals(submitted_region, request.parameters['region'])
body = json.loads(request.body)
self.assertEqual(body['name'], expected_address)
self.assertEqual(body['description'], expected_description)
self.assertEquals(body['address'], expected_source_address)
def testGetAddressGeneratesCorrectRequest(self):
expected_project = 'test_project'
expected_address = 'test_address'
submitted_region = 'test-region'
set_flags = {
'project': expected_project,
'region': submitted_region,
}
command = self._CreateAndInitializeCommand(
address_cmds.GetAddress, 'getaddress', set_flags=set_flags)
call = self.mock.Respond('compute.addresses.get', {})
command.Handle(expected_address)
request = call.GetRequest()
self.assertEqual('GET', request.method)
self.assertEqual(None, request.body)
parameters = request.parameters
self.assertEqual(parameters['project'], expected_project)
self.assertEqual(parameters['region'], submitted_region)
self.assertEqual(parameters['address'], expected_address)
def testGetAddressPrintNonEmptyUsers(self):
expected_project = 'test_project'
submitted_region = 'test-region'
set_flags = {
'project': expected_project,
'region': submitted_region,
}
command = self._CreateAndInitializeCommand(
address_cmds.GetAddress, 'getaddress', set_flags=set_flags)
data = command.GetDetailRow({'users': ['fr-1', 'fr-2']})
expected_data = {
'v1': [
('users', ['fr-1', 'fr-2'])
],
}
self.assertEquals(
gcutil_unittest.SelectTemplateForVersion(
expected_data, command.api.version),
data)
def testGetAddressPrintEmptyUsers(self):
expected_project = 'test_project'
submitted_region = 'test-region'
set_flags = {
'project': expected_project,
'region': submitted_region,
}
command = self._CreateAndInitializeCommand(
address_cmds.GetAddress, 'getaddress', set_flags=set_flags)
data = command.GetDetailRow({'users': []})
expected_data = {
'v1': [
('users', [])
],
}
self.assertEquals(
gcutil_unittest.SelectTemplateForVersion(
expected_data, command.api.version),
data)
def testReleaseAddressGeneratesCorrectRequest(self):
expected_project = 'test_project'
expected_address = 'test_address'
submitted_region = 'test-region'
set_flags = {
'project': expected_project,
'region': submitted_region,
}
command = self._CreateAndInitializeCommand(
address_cmds.ReleaseAddress, 'releaseaddress', set_flags=set_flags)
call = self.mock.Respond('compute.addresses.delete', {})
command.Handle(expected_address)
request = call.GetRequest()
self.assertEqual('DELETE', request.method)
self.assertEqual(None, request.body)
parameters = request.parameters
self.assertEqual(parameters['project'], expected_project)
self.assertEqual(parameters['region'], submitted_region)
self.assertEqual(parameters['address'], expected_address)
def testReleaseAddressWithoutRegionFlag(self):
expected_project = 'test_project'
expected_region = 'test-region'
expected_address = 'test_address'
address = ('projects/%s/regions/%s/addresses/%s' %
(expected_project, expected_region, expected_address))
set_flags = {
'project': 'incorrect_project',
}
command = self._CreateAndInitializeCommand(
address_cmds.ReleaseAddress, 'releaseaddress', set_flags=set_flags)
call = self.mock.Respond('compute.addresses.delete', {})
command.Handle(address)
request = call.GetRequest()
self.assertEqual('DELETE', request.method)
self.assertEqual(None, request.body)
parameters = request.parameters
self.assertEqual(parameters['project'], expected_project)
self.assertEqual(parameters['region'], expected_region)
self.assertEqual(parameters['address'], expected_address)
def testReleaseMultipleAddresses(self):
expected_project = 'test_project'
expected_addresses = [
'test-addresses-%02d' % x for x in xrange(100)]
set_flags = {
'project': expected_project,
'region': 'region-a',
}
command = self._CreateAndInitializeCommand(
address_cmds.ReleaseAddress, 'releaseaddress', set_flags=set_flags)
calls = [self.mock.Respond('compute.addresses.delete', {})
for x in xrange(len(expected_addresses))]
_, exceptions = command.Handle(*expected_addresses)
self.assertEqual(0, len(exceptions))
sorted_calls = sorted([call.GetRequest().parameters['address'] for
call in calls])
self.assertEqual(expected_addresses, sorted_calls)
if __name__ == '__main__':
unittest.main(testLoader=gcutil_unittest.GcutilLoader())
| 32.362903 | 75 | 0.705457 | 825 | 8,026 | 6.66303 | 0.193939 | 0.068219 | 0.040022 | 0.033473 | 0.698563 | 0.681281 | 0.652538 | 0.652538 | 0.64235 | 0.629252 | 0 | 0.006286 | 0.187391 | 8,026 | 247 | 76 | 32.493927 | 0.836553 | 0.076501 | 0 | 0.672316 | 0 | 0 | 0.128888 | 0.023803 | 0 | 0 | 0 | 0 | 0.175141 | 1 | 0.050847 | false | 0 | 0.045198 | 0 | 0.101695 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cf46d5f8307606187101597e795384399b48446 | 804 | py | Python | vote/migrations/0005_auto_20210204_1900.py | jnegrete2005/JuradoFMS | 25848037e51de1781c419155615d0fb41edc07ec | [
"MIT"
] | 2 | 2021-02-24T21:57:50.000Z | 2021-03-15T08:44:09.000Z | vote/migrations/0005_auto_20210204_1900.py | jnegrete2005/JuradoFMS | 25848037e51de1781c419155615d0fb41edc07ec | [
"MIT"
] | null | null | null | vote/migrations/0005_auto_20210204_1900.py | jnegrete2005/JuradoFMS | 25848037e51de1781c419155615d0fb41edc07ec | [
"MIT"
] | null | null | null | # Generated by Django 3.1.5 on 2021-02-05 00:00
import django.contrib.postgres.fields
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('vote', '0004_auto_20210131_1621'),
]
operations = [
migrations.AlterField(
model_name='competitor',
name='min1',
field=django.contrib.postgres.fields.ArrayField(base_field=models.PositiveSmallIntegerField(), blank=True, null=True, size=9, verbose_name='minuto 1'),
),
migrations.AlterField(
model_name='competitor',
name='min2',
field=django.contrib.postgres.fields.ArrayField(base_field=models.PositiveSmallIntegerField(), blank=True, null=True, size=9, verbose_name='minuto 2'),
),
]
| 32.16 | 163 | 0.655473 | 88 | 804 | 5.886364 | 0.522727 | 0.07529 | 0.121622 | 0.156371 | 0.633205 | 0.633205 | 0.467181 | 0.467181 | 0.467181 | 0.467181 | 0 | 0.059295 | 0.223881 | 804 | 24 | 164 | 33.5 | 0.770833 | 0.05597 | 0 | 0.333333 | 1 | 0 | 0.093791 | 0.030383 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.277778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cfa416e684eef42a41f05552ac51704b017a9e1 | 1,471 | py | Python | arguments_setting.py | Projectoy/ml_framework | f3d37d632a1aec314eb186a3da6d174a5dc4beee | [
"Apache-2.0"
] | null | null | null | arguments_setting.py | Projectoy/ml_framework | f3d37d632a1aec314eb186a3da6d174a5dc4beee | [
"Apache-2.0"
] | null | null | null | arguments_setting.py | Projectoy/ml_framework | f3d37d632a1aec314eb186a3da6d174a5dc4beee | [
"Apache-2.0"
] | null | null | null | import argparse, os
class ArgumentManager:
def __init__(self, model_list):
self.model_list = model_list
self.args = self.get_input_arguments()
self.validate_arguments()
def get_input_arguments(self):
parser = argparse.ArgumentParser(description='Process some integers.')
parser.add_argument("--configuration", "-c", required=True, help="the path of a configuration file(json type)")
parser.add_argument("--model", "-m", required=True, help="the model to process")
parser.add_argument("--task", "-t", required=True, help="training/testing")
return parser.parse_args()
def validate_arguments(self):
self.validate_configuration_path()
self.validate_model()
self.validate_task()
def validate_task(self):
task = self.args.task
assert task == "training" or task == "testing", "task should be training or testing"
def validate_model(self):
model = self.args.model
assert model in self.model_list, "model is not in the prepared model list"
def validate_configuration_path(self):
config_path = self.args.configuration
assert os.path.exists(config_path), "configuration path is inappropriate (not found file)"
def get_configuraiton_file_path(self):
return self.args.configuration
def get_model_type(self):
return self.args.model
def get_task_type(self):
return self.args.task | 36.775 | 119 | 0.680489 | 185 | 1,471 | 5.221622 | 0.302703 | 0.057971 | 0.040373 | 0.055901 | 0.045549 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2155 | 1,471 | 40 | 120 | 36.775 | 0.837088 | 0 | 0 | 0 | 0 | 0 | 0.186821 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 1 | 0.290323 | false | 0 | 0.032258 | 0.096774 | 0.483871 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cfa745e3890fcda9ffd072f599dc7be286f99a5 | 11,039 | py | Python | fileHandler.py | Omer-Sella/ldpc | 955c0bc32236e171365cbbb88f00574302771610 | [
"MIT"
] | null | null | null | fileHandler.py | Omer-Sella/ldpc | 955c0bc32236e171365cbbb88f00574302771610 | [
"MIT"
] | null | null | null | fileHandler.py | Omer-Sella/ldpc | 955c0bc32236e171365cbbb88f00574302771610 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Thu Nov 28 12:10:11 2019
@author: Omer
"""
## File handler
## This file was initially intended purely to generate the matrices for the near earth code found in: https://public.ccsds.org/Pubs/131x1o2e2s.pdf
## The values from the above pdf were copied manually to a txt file, and it is the purpose of this file to parse it.
## The emphasis here is on correctness, I currently do not see a reason to generalise this file, since matrices will be saved in either json or some matrix friendly format.
import numpy as np
from scipy.linalg import circulant
#import matplotlib.pyplot as plt
import scipy.io
import common
import hashlib
import os
projectDir = os.environ.get('LDPC')
if projectDir == None:
import pathlib
projectDir = pathlib.Path(__file__).parent.absolute()
## Omer Sella: added on 01/12/2020, need to make sure this doesn't break anything.
import sys
sys.path.insert(1, projectDir)
FILE_HANDLER_INT_DATA_TYPE = np.int32
GENERAL_CODE_MATRIX_DATA_TYPE = np.int32
NIBBLE_CONVERTER = np.array([8, 4, 2, 1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
def nibbleToHex(inputArray):
n = NIBBLE_CONVERTER.dot(inputArray)
if n == 10:
h = 'A'
elif n== 11:
h = 'B'
elif n== 12:
h = 'C'
elif n== 13:
h = 'D'
elif n== 14:
h = 'E'
elif n== 15:
h = 'F'
else:
h = str(n)
return h
def binaryArraytoHex(inputArray):
d1 = len(inputArray)
assert (d1 % 4 == 0)
outputArray = np.zeros(d1//4, dtype = str)
outputString = ''
for j in range(d1//4):
nibble = inputArray[4 * j : 4 * j + 4]
h = nibbleToHex(nibble)
outputArray[j] = h
outputString = outputString + h
return outputArray, outputString
def hexStringToBinaryArray(hexString):
outputBinary = np.array([], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
for i in hexString:
if i == '0':
nibble = np.array([0,0,0,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '1':
nibble = np.array([0,0,0,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '2':
nibble = np.array([0,0,1,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '3':
nibble = np.array([0,0,1,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '4':
nibble = np.array([0,1,0,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '5':
nibble = np.array([0,1,0,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '6':
nibble = np.array([0,1,1,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '7':
nibble = np.array([0,1,1,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '8':
nibble = np.array([1,0,0,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == '9':
nibble = np.array([1,0,0,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'A':
nibble = np.array([1,0,1,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'B':
nibble = np.array([1,0,1,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'C':
nibble = np.array([1,1,0,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'D':
nibble = np.array([1,1,0,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'E':
nibble = np.array([1,1,1,0], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
elif i == 'F':
nibble = np.array([1,1,1,1], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
else:
#print('Error, 0-9 or A-F')
pass
nibble = np.array([], dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
outputBinary = np.hstack((outputBinary, nibble))
return outputBinary
def hexToCirculant(hexStr, circulantSize):
binaryArray = hexStringToBinaryArray(hexStr)
if len(binaryArray) < circulantSize:
binaryArray = np.hstack(np.zeros(circulantSize-len(binaryArray), dtype = GENERAL_CODE_MATRIX_DATA_TYPE))
else:
binaryArray = binaryArray[1:]
circulantMatrix = circulant(binaryArray)
circulantMatrix = circulantMatrix.T
return circulantMatrix
def hotLocationsToCirculant(locationList, circulantSize):
generatingVector = np.zeros(circulantSize, dtype = GENERAL_CODE_MATRIX_DATA_TYPE)
generatingVector[locationList] = 1
newCirculant = circulant(generatingVector)
newCirculant = newCirculant.T
return newCirculant
def readMatrixFromFile(fileName, dim0, dim1, circulantSize, isRow = True, isHex = True, isGenerator = True ):
# This function assumes that each line in the file contains the non zero locations of the first row of a circulant.
# Each line in the file then defines a circulant, and the order in which they are defined is top to bottom left to right, i.e.:
# line 0 defines circulant 0,0
with open(fileName) as fid:
lines = fid.readlines()
if isGenerator:
for i in range((dim0 // circulantSize) ):
bLeft = hexToCirculant(lines[2 * i], circulantSize)
bRight = hexToCirculant(lines[2 * i + 1], circulantSize)
newBlock = np.hstack((bLeft, bRight))
if i == 0:
accumulatedBlock = newBlock
else:
accumulatedBlock = np.vstack((accumulatedBlock, newBlock))
newMatrix = np.hstack((np.eye(dim0, dtype = GENERAL_CODE_MATRIX_DATA_TYPE), accumulatedBlock))
else:
for i in range((dim1 // circulantSize)):
locationList1 = list(lines[ i].rstrip('\n').split(','))
locationList1 = list(map(int, locationList1))
upBlock = hotLocationsToCirculant(locationList1, circulantSize)
if i == 0:
accumulatedUpBlock1 = upBlock
else:
accumulatedUpBlock1 = np.hstack((accumulatedUpBlock1, upBlock))
for i in range((dim1 // circulantSize)):
locationList = list(lines[(dim1 // circulantSize) + i].rstrip('\n').split(','))
locationList = list(map(int, locationList))
newBlock = hotLocationsToCirculant(locationList, circulantSize)
if i == 0:
accumulatedBlock2 = newBlock
else:
accumulatedBlock2 = np.hstack((accumulatedBlock2, newBlock))
newMatrix = np.vstack((accumulatedUpBlock1, accumulatedBlock2))
return newMatrix
def binaryMatrixToHexString(binaryMatrix, circulantSize):
leftPadding = np.array(4 - (circulantSize % 4))
m,n = binaryMatrix.shape
#print(m)
#print(n)
assert( m % circulantSize == 0)
assert (n % circulantSize == 0)
M = m // circulantSize
N = n // circulantSize
hexName = ''
for r in range(M):
for k in range(N):
nextLine = np.hstack((leftPadding, binaryMatrix[ r * circulantSize , k * circulantSize : (k + 1) * circulantSize]))
hexArray, hexString = binaryArraytoHex(nextLine)
hexName = hexName + hexString
return hexName
def saveCodeInstance(parityMatrix, circulantSize, codewordSize, evaluationData = None, path = None, evaluationTime = 0, numberOfNonZero = 0, fileName = None):
print("*** in saveCodeInstance ...")
m, n = parityMatrix.shape
M = m // circulantSize
N = n // circulantSize
if fileName == None:
fileName = binaryMatrixToHexString(parityMatrix, circulantSize)
fileNameSHA224 = str(circulantSize) + '_' + str(M) + '_' + str(N) + '_' + str(hashlib.sha224(str(fileName).encode('utf-8')).hexdigest())
fileNameWithPath = path + fileNameSHA224
else:
fileNameWithPath = path + fileName
print("*** " + fileName)
workspaceDict = {}
workspaceDict['parityMatrix'] = parityMatrix
workspaceDict['fileName'] = fileName
if evaluationData != None:
scatterSNR, scatterBER, scatterITR, snrAxis, averageSnrAxis, berData, averageNumberOfIterations = evaluationData.getStatsV2()
workspaceDict['snrData'] = scatterSNR
workspaceDict['berData'] = scatterBER
workspaceDict['itrData'] = scatterITR
workspaceDict['averageSnrAxis'] = averageSnrAxis
workspaceDict['averageNumberOfIterations'] = averageNumberOfIterations
workspaceDict['evaluationTime'] = evaluationTime
workspaceDict['nonZero'] = numberOfNonZero
scipy.io.savemat((fileNameWithPath + '.mat'), workspaceDict)
#evaluationData.plotStats(codewordSize, fileNameWithPath)
print("*** Finishing saveCodeInstance !")
return fileName
def testFileHandler():
nearEarthGenerator = readMatrixFromFile(projectDir + '/codeMatrices/nearEarthGenerator.txt', 7154, 8176, 511, True, True, True)
nearEarthParity = readMatrixFromFile(projectDir + '/codeMatrices/nearEarthParity.txt', 1022, 8176, 511, True, False, False)
return 'OK'
def plotResults(path, makeMat = False):
i = 10
evaluationFaildAt = np.zeros(4, dtype = FILE_HANDLER_INT_DATA_TYPE)
evalTimes = []
numberOfIterationsAtHigh = []
for root, dirs, files in os.walk(path):
for file in files:
if str(file).endswith('.mat'):
i = i + 1
mat = scipy.io.loadmat(str(os.path.join(root, file)))
snrAxis = mat['snrAxis']
snrActual = mat['averageSnrAxis']
if len(snrAxis) < 3:
evaluationFaildAt[len(snrAxis)] = evaluationFaildAt[len(snrAxis)] + 1
berAxis = mat['berData']
if ('evaluationTime' in mat.keys()):
evalTimes.append(mat['evaluationTime'])
averageNumberOfIterations = mat['averageNumberOfIterations']
numberOfIterationsAtHigh.append(averageNumberOfIterations[-1])
common.plotSNRvsBER(snrActual, berAxis, fileName = None, inputLabel = '', figureNumber = i, figureName = str(file))
else:
pass
return evalTimes, evaluationFaildAt, numberOfIterationsAtHigh
#plt.imshow(nearEarthParity)
#nearEarthParity = readMatrixFromFile('/home/oss22/swift/swift/codeMatrices/nearEarthParity.txt', 1022, 8176, 511, True, False, False)
#import networkx as nx
#from networkx.algorithms import bipartite
#B = nx.Graph()
#B.add_nodes_from(range(1022), bipartite=0)
#B.add_nodes_from(range(1022, 7156 + 1022), bipartite=1)
# Add edges only between nodes of opposite node sets
#for i in range(8176):
# for j in range(1022):
# if nearEarthParity[j,i] != 0:
# B.add_edges_from([(j, 7156 + i)])
#X, Y = bipartite.sets(B)
#pos = dict()
#pos.update( (n, (1, i)) for i, n in enumerate(X) )
#pos.update( (n, (2, i)) for i, n in enumerate(Y) )
#nx.draw(B, pos=pos)
#plt.show()
| 38.197232 | 172 | 0.621343 | 1,254 | 11,039 | 5.37799 | 0.250399 | 0.029656 | 0.057977 | 0.071619 | 0.213078 | 0.197805 | 0.126928 | 0.116696 | 0.105724 | 0.084371 | 0 | 0.031883 | 0.266963 | 11,039 | 288 | 173 | 38.329861 | 0.801532 | 0.151191 | 0 | 0.10101 | 1 | 0 | 0.039043 | 0.012764 | 0 | 0 | 0 | 0 | 0.015152 | 1 | 0.050505 | false | 0.010101 | 0.040404 | 0 | 0.141414 | 0.015152 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7cfaab0b77af0b6c7c138ff09a0a82244c391f57 | 12,133 | py | Python | stage/configuration/test_amazon_s3_origin.py | Sentienz/datacollector-tests | ca27988351dc3366488098b5db6c85a8be2f7b85 | [
"Apache-2.0"
] | null | null | null | stage/configuration/test_amazon_s3_origin.py | Sentienz/datacollector-tests | ca27988351dc3366488098b5db6c85a8be2f7b85 | [
"Apache-2.0"
] | null | null | null | stage/configuration/test_amazon_s3_origin.py | Sentienz/datacollector-tests | ca27988351dc3366488098b5db6c85a8be2f7b85 | [
"Apache-2.0"
] | 1 | 2019-10-29T08:46:11.000Z | 2019-10-29T08:46:11.000Z | import logging
import pytest
from streamsets.testframework.markers import aws, sdc_min_version
from streamsets.testframework.utils import get_random_string
logger = logging.getLogger(__name__)
S3_SANDBOX_PREFIX = 'sandbox'
LOG_FIELD_MAPPING = [{'fieldPath': '/date', 'group': 1},
{'fieldPath': '/time', 'group': 2},
{'fieldPath': '/timehalf', 'group': 3},
{'fieldPath': '/info', 'group': 4},
{'fieldPath': '/file', 'group': 5},
{'fieldPath': '/message', 'group': 6}]
REGULAR_EXPRESSION = r'(\S+) (\S+) (\S+) (\S+) (\S+) (.*)'
# log to be written int the file on s3
data_format_content = {
'COMMON_LOG_FORMAT': '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
'"GET /apache.gif HTTP/1.0" 200 232',
'LOG4J': '200 [main] DEBUG org.StreamSets.Log4j unknown - This is sample log message',
'APACHE_ERROR_LOG_FORMAT': '[Wed Oct 11 14:32:52 2000] [error] [client 127.0.0.1] client '
'denied by server configuration:/export/home/live/ap/htdocs/test',
'COMBINED_LOG_FORMAT': '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache.gif'
' HTTP/1.0" 200 2326 "http://www.example.com/strt.html" "Mozilla/4.08'
' [en] (Win98; I ;Nav)"',
'APACHE_CUSTOM_LOG_FORMAT': '10.185.248.71 - - [09/Jan/2015:9:12:06 +0000] "GET '
'/inventoryServic/inventory/purchaseItem?userId=20253471&itemId=23434300 '
'HTTP/1.1" 500 17 ',
'CEF': '10.217.31.247 CEF:0|Citrix|NetScaler|NS10.0|APPFW|APPFW_STARTURL|6|src=10.217.253.78 '
'spt=53743 method=GET request=http://vpx247.example.net/FFC/login.html msg=Disallow Illegal URL.',
'LEEF': 'LEEF: 2.0|Trend Micro|Deep Security Agent|<DSA version>|4000030|cat=Anti-Malware '
'name=HEU_AEGIS_CRYPT desc=HEU_AEGIS_CRYPT sev=6 cn1=241 msg=Realtime',
'REGEX': '2019-04-30 08:23:53 AM [INFO] [streamsets.sdk.sdc_api] Pipeline Filewriterpipeline53'}
# data to verify the output of amazon s3 origin.
get_data_to_verify_output = {
'LOG4J': {'severity': 'DEBUG', 'relativetime': '200', 'thread': 'main', 'category': 'org.StreamSets.Log4j',
'ndc': 'unknown', 'message': 'This is sample log message'},
'COMMON_LOG_FORMAT': {'request': '/apache.gif', 'auth': 'frank', 'ident': '-', 'response': '200', 'bytes':
'232', 'clientip': '127.0.0.1', 'verb': 'GET', 'httpversion': '1.0', 'rawrequest': None,
'timestamp': '10/Oct/2000:13:55:36 -0700'},
'APACHE_ERROR_LOG_FORMAT': {'message': 'client denied by server configuration:/export/home/live/ap/htdocs/'
'test', 'timestamp': 'Wed Oct 11 14:32:52 2000', 'loglevel': 'error',
'clientip': '127.0.0.1'},
'COMBINED_LOG_FORMAT': {'request': '/apache.gif', 'agent': '"Mozilla/4.08 [en] (Win98; I ;Nav)"', 'auth':
'frank', 'ident': '-', 'verb': 'GET', 'referrer': '"http://www.example.com/strt.'
'html"', 'response': '200', 'bytes': '2326', 'clientip': '127.0.0.1',
'httpversion': '1.0', 'rawrequest': None, 'timestamp': '10/Oct/2000:13:55:36 -0700'},
'APACHE_CUSTOM_LOG_FORMAT': {'remoteUser': '-', 'requestTime': '09/Jan/2015:9:12:06 +0000', 'request': 'GET '
'/inventoryServic/inventory/purchaseItem?userId=20253471&itemId=23434300 HTTP/1.1',
'logName': '-', 'remoteHost': '10.185.248.71', 'bytesSent': '17', 'status': '500'},
'CEF': {'severity': '6', 'product': 'NetScaler', 'extensions': {'msg': 'Disallow Illegal URL.', 'request':
'http://vpx247.example.net/FFC/login.html', 'method': 'GET', 'src': '10.217.253.78', 'spt': '53743'},
'signature': 'APPFW', 'vendor': 'Citrix', 'cefVersion': 0, 'name': 'APPFW_STARTURL',
'version': 'NS10.0'},
'GROK': {'request': '/inventoryServic/inventory/purchaseItem?userId=20253471&itemId=23434300', 'auth': '-',
'ident': '-', 'response': '500', 'bytes': '17', 'clientip': '10.185.248.71', 'verb': 'GET',
'httpversion': '1.1', 'rawrequest': None, 'timestamp': '09/Jan/2015:9:12:06 +0000'},
'LEEF': {'eventId': '4000030', 'product': 'Deep Security Agent', 'extensions': {'cat': 'Realtime'},
'leefVersion': 2.0, 'vendor': 'Trend Micro', 'version': '<DSA version>'},
'REGEX': {'/time': '08:23:53', '/date': '2019-04-30', '/timehalf': 'AM',
'/info': '[INFO]', '/message': 'Pipeline Filewriterpipeline53', '/file': '[streamsets.sdk.sdc_api]'}}
@pytest.mark.skip('Not yet implemented')
def test_configuration_access_key_id(sdc_builder, sdc_executor):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_bucket(sdc_builder, sdc_executor):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_connection_timeout(sdc_builder, sdc_executor):
pass
@pytest.mark.parametrize('task', ['CREATE_NEW_OBJECT'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_content(sdc_builder, sdc_executor, task):
pass
@pytest.mark.parametrize('task', ['COPY_OBJECT'])
@pytest.mark.parametrize('delete_original_object', [False, True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_delete_original_object(sdc_builder, sdc_executor, task, delete_original_object):
pass
@pytest.mark.parametrize('region', ['OTHER'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_endpoint(sdc_builder, sdc_executor, region):
pass
@pytest.mark.parametrize('task', ['COPY_OBJECT'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_new_object_path(sdc_builder, sdc_executor, task):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_object(sdc_builder, sdc_executor):
pass
@pytest.mark.parametrize('on_record_error', ['DISCARD', 'STOP_PIPELINE', 'TO_ERROR'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_on_record_error(sdc_builder, sdc_executor, on_record_error):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_preconditions(sdc_builder, sdc_executor):
pass
@pytest.mark.parametrize('use_proxy', [True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_proxy_host(sdc_builder, sdc_executor, use_proxy):
pass
@pytest.mark.parametrize('use_proxy', [True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_proxy_password(sdc_builder, sdc_executor, use_proxy):
pass
@pytest.mark.parametrize('use_proxy', [True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_proxy_port(sdc_builder, sdc_executor, use_proxy):
pass
@pytest.mark.parametrize('use_proxy', [True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_proxy_user(sdc_builder, sdc_executor, use_proxy):
pass
@pytest.mark.parametrize('region', ['AP_NORTHEAST_1', 'AP_NORTHEAST_2', 'AP_NORTHEAST_3', 'AP_SOUTHEAST_1', 'AP_SOUTHEAST_2', 'AP_SOUTH_1', 'CA_CENTRAL_1', 'CN_NORTHWEST_1', 'CN_NORTH_1', 'EU_CENTRAL_1', 'EU_WEST_1', 'EU_WEST_2', 'EU_WEST_3', 'OTHER', 'SA_EAST_1', 'US_EAST_1', 'US_EAST_2', 'US_GOV_WEST_1', 'US_WEST_1', 'US_WEST_2'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_region(sdc_builder, sdc_executor, region):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_required_fields(sdc_builder, sdc_executor):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_retry_count(sdc_builder, sdc_executor):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_secret_access_key(sdc_builder, sdc_executor):
pass
@pytest.mark.skip('Not yet implemented')
def test_configuration_socket_timeout(sdc_builder, sdc_executor):
pass
@pytest.mark.parametrize('task', ['CHANGE_EXISTING_OBJECT'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_tags(sdc_builder, sdc_executor, task):
pass
@pytest.mark.parametrize('task', ['CHANGE_EXISTING_OBJECT', 'COPY_OBJECT', 'CREATE_NEW_OBJECT'])
@pytest.mark.skip('Not yet implemented')
def test_configuration_task(sdc_builder, sdc_executor, task):
pass
@pytest.mark.parametrize('use_proxy', [False, True])
@pytest.mark.skip('Not yet implemented')
def test_configuration_use_proxy(sdc_builder, sdc_executor, use_proxy):
pass
@aws('s3')
@pytest.mark.parametrize('data_format', ['LOG'])
@pytest.mark.parametrize('log_format', ['COMMON_LOG_FORMAT', 'APACHE_ERROR_LOG_FORMAT', 'COMBINED_LOG_FORMAT',
'APACHE_CUSTOM_LOG_FORMAT', 'REGEX', 'GROK', 'LOG4J', 'CEF', 'LEEF'])
def test_configurations_data_format_log(sdc_executor, sdc_builder, aws, data_format, log_format):
"""Check whether S3 origin can parse different log format or not. A log file is being created in s3 bucket
mentioned below .S3 origin reads the log file and parse the same.
Pipeline for the same-
s3_origin >> trash
s3_origin >= pipeline_finisher_executor
"""
if log_format == 'GROK':
file_content = data_format_content['APACHE_CUSTOM_LOG_FORMAT']
else:
file_content = data_format_content[log_format]
client = aws.s3
s3_key = f'{S3_SANDBOX_PREFIX}/{get_random_string()}'
attributes = {'bucket': aws.s3_bucket_name,
'prefix_pattern': f'{s3_key}/*',
'number_of_threads': 1,
'read_order': 'LEXICOGRAPHICAL',
'data_format': data_format,
'log_format': log_format,
'custom_log_format': '%h %l %u [%t] "%r" %>s %b',
'regular_expression': REGULAR_EXPRESSION,
'field_path_to_regex_group_mapping': LOG_FIELD_MAPPING
}
pipeline = get_aws_origin_to_trash_pipeline(sdc_builder, attributes, aws)
s3_origin = pipeline.origin_stage
try:
client.put_object(Bucket=aws.s3_bucket_name, Key=f'{s3_key}/{get_random_string()}.log', Body=file_content)
output_records = execute_pipeline_and_get_output(sdc_executor, s3_origin, pipeline)
assert output_records[0].field == get_data_to_verify_output[log_format]
finally:
if sdc_executor.get_pipeline_status(pipeline).response.json().get('status') == 'RUNNING':
sdc_executor.stop_pipeline(pipeline)
# cleaning up s3 bucket
delete_aws_objects(client, aws, s3_key)
def get_aws_origin_to_trash_pipeline(sdc_builder, attributes, aws):
# Build pipeline.
builder = sdc_builder.get_pipeline_builder()
builder.add_error_stage('Discard')
s3_origin = builder.add_stage('Amazon S3', type='origin')
s3_origin.set_attributes(**attributes)
trash = builder.add_stage('Trash')
pipeline_finisher_executor = builder.add_stage('Pipeline Finisher Executor')
pipeline_finisher_executor.set_attributes(stage_record_preconditions=["${record:eventType() == 'no-more-data'}"])
s3_origin >> trash
s3_origin >= pipeline_finisher_executor
s3_origin_pipeline = builder.build().configure_for_environment(aws)
s3_origin_pipeline.configuration['shouldRetry'] = False
return s3_origin_pipeline
def delete_aws_objects(client, aws, s3_key):
# Clean up S3.
delete_keys = {'Objects': [{'Key': k['Key']}
for k in
client.list_objects_v2(Bucket=aws.s3_bucket_name, Prefix=s3_key)['Contents']]}
client.delete_objects(Bucket=aws.s3_bucket_name, Delete=delete_keys)
def execute_pipeline_and_get_output(sdc_executor, s3_origin, pipeline):
sdc_executor.add_pipeline(pipeline)
snapshot = sdc_executor.capture_snapshot(pipeline, start_pipeline=True).snapshot
output_records = snapshot[s3_origin].output
return output_records
| 46.84556 | 334 | 0.656556 | 1,539 | 12,133 | 4.925926 | 0.223522 | 0.050125 | 0.040628 | 0.049334 | 0.502308 | 0.45614 | 0.441367 | 0.397705 | 0.351801 | 0.311305 | 0 | 0.050808 | 0.193769 | 12,133 | 258 | 335 | 47.027132 | 0.724187 | 0.031979 | 0 | 0.26178 | 0 | 0.04712 | 0.359911 | 0.071111 | 0 | 0 | 0 | 0 | 0.005236 | 1 | 0.136126 | false | 0.120419 | 0.020942 | 0 | 0.167539 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
7cfcc11fbbb1d31705e442bed5fe7d622b04a2bd | 4,472 | py | Python | benchmark/AMS/HIGGSTES/TP.py | victor-estrade/SystGradDescent | 822e7094290301ec47a99433381a8d6406798aff | [
"MIT"
] | 2 | 2019-03-20T09:05:02.000Z | 2019-03-20T15:23:44.000Z | benchmark/AMS/HIGGSTES/TP.py | victor-estrade/SystGradDescent | 822e7094290301ec47a99433381a8d6406798aff | [
"MIT"
] | null | null | null | benchmark/AMS/HIGGSTES/TP.py | victor-estrade/SystGradDescent | 822e7094290301ec47a99433381a8d6406798aff | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
from __future__ import print_function
from __future__ import division
from __future__ import absolute_import
from __future__ import unicode_literals
# Command line :
# python -m benchmark.VAR.GG.TP
import os
import logging
from config import SEED
from config import _ERROR
from config import _TRUTH
import numpy as np
import pandas as pd
from visual.misc import set_plot_config
set_plot_config()
from utils.log import set_logger
from utils.log import flush
from utils.log import print_line
from utils.model import get_model
from utils.model import get_optimizer
from utils.model import train_or_load_neural_net
from utils.evaluation import evaluate_summary_computer
from utils.images import gather_images
from visual.misc import plot_params
from problem.higgs import HiggsConfigTesOnly as Config
from problem.higgs import get_generators_torch
from problem.higgs import GeneratorCPU
from problem.higgs import GeneratorTorch
from problem.higgs import HiggsNLL as NLLComputer
from model.tangent_prop import TangentPropClassifier
from archi.classic import L4 as ARCHI
from ...my_argparser import TP_parse_args
from collections import OrderedDict
from .common import measurement
DATA_NAME = 'HIGGSTES'
BENCHMARK_NAME = 'VAR-'+DATA_NAME
N_ITER = 30
class TrainGenerator:
def __init__(self, data_generator, cuda=False):
self.data_generator = data_generator
if cuda:
self.data_generator.cuda()
else:
self.data_generator.cpu()
self.mu = self.tensor(Config.CALIBRATED.mu, requires_grad=True)
self.tes = self.tensor(Config.CALIBRATED.tes, requires_grad=True)
self.jes = self.tensor(Config.CALIBRATED.jes, requires_grad=True)
self.les = self.tensor(Config.CALIBRATED.les, requires_grad=True)
self.params = (self.tes, self.jes, self.tes, self.mu)
self.nuisance_params = OrderedDict([
('tes', self.tes),
('jes', self.jes),
('les', self.les),
])
def generate(self, n_samples=None):
X, y, w = self.data_generator.diff_generate(*self.params, n_samples=n_samples)
return X, y, w
def reset(self):
self.data_generator.reset()
def tensor(self, data, requires_grad=False, dtype=None):
return self.data_generator.tensor(data, requires_grad=requires_grad, dtype=dtype)
def build_model(args, i_cv):
args.net = ARCHI(n_in=29, n_out=2, n_unit=args.n_unit)
args.optimizer = get_optimizer(args)
model = get_model(args, TangentPropClassifier)
model.set_info(DATA_NAME, BENCHMARK_NAME, i_cv)
return model
# =====================================================================
# MAIN
# =====================================================================
def main():
# BASIC SETUP
logger = set_logger()
args = TP_parse_args(main_description="Training launcher for INFERNO on GG benchmark")
logger.info(args)
flush(logger)
# INFO
model = build_model(args, -1)
os.makedirs(model.results_directory, exist_ok=True)
# RUN
logger.info(f'Running runs [{args.start_cv},{args.end_cv}[')
results = [run(args, i_cv) for i_cv in range(args.start_cv, args.end_cv)]
results = pd.concat(results, ignore_index=True)
# EVALUATION
results.to_csv(os.path.join(model.results_directory, 'threshold.csv'))
print(results)
print("DONE !")
def run(args, i_cv):
logger = logging.getLogger()
print_line()
logger.info('Running iter n°{}'.format(i_cv))
print_line()
# LOAD/GENERATE DATA
logger.info('Set up data generator')
config = Config()
seed = SEED + i_cv * 5
train_generator, valid_generator, test_generator = get_generators_torch(seed, cuda=args.cuda)
train_generator = TrainGenerator(train_generator, cuda=args.cuda)
valid_generator = GeneratorCPU(valid_generator)
test_generator = GeneratorCPU(test_generator)
# SET MODEL
logger.info('Set up classifier')
model = build_model(args, i_cv)
os.makedirs(model.results_path, exist_ok=True)
flush(logger)
# TRAINING / LOADING
train_or_load_neural_net(model, train_generator, retrain=args.retrain)
# MEASUREMENT
results = measurement(model, i_cv, config, valid_generator, test_generator)
print(results)
return results
if __name__ == '__main__':
main()
| 30.841379 | 97 | 0.687165 | 590 | 4,472 | 4.981356 | 0.283051 | 0.039809 | 0.04049 | 0.037428 | 0.059204 | 0.018374 | 0.018374 | 0 | 0 | 0 | 0 | 0.002498 | 0.19432 | 4,472 | 144 | 98 | 31.055556 | 0.812934 | 0.070662 | 0 | 0.06 | 0 | 0 | 0.046366 | 0.007486 | 0 | 0 | 0 | 0 | 0 | 1 | 0.07 | false | 0 | 0.31 | 0.01 | 0.43 | 0.07 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
7cffa5673d098c5404a18e4042db11fef2170e1f | 6,540 | py | Python | common/OpTestASM.py | kyle-ibm/op-test | df8dbf8cbff1390668c22632052adb46ebf277c1 | [
"Apache-2.0"
] | null | null | null | common/OpTestASM.py | kyle-ibm/op-test | df8dbf8cbff1390668c22632052adb46ebf277c1 | [
"Apache-2.0"
] | null | null | null | common/OpTestASM.py | kyle-ibm/op-test | df8dbf8cbff1390668c22632052adb46ebf277c1 | [
"Apache-2.0"
] | 1 | 2021-05-25T11:33:18.000Z | 2021-05-25T11:33:18.000Z | #!/usr/bin/env python3
# encoding=utf8
# IBM_PROLOG_BEGIN_TAG
# This is an automatically generated prolog.
#
# $Source: op-test-framework/common/OpTestASM.py $
#
# OpenPOWER Automated Test Project
#
# Contributors Listed Below - COPYRIGHT 2017
# [+] International Business Machines Corp.
#
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied. See the License for the specific language governing
# permissions and limitations under the License.
#
# IBM_PROLOG_END_TAG
'''
OpTestASM: Advanced System Management (FSP Web UI)
--------------------------------------------------
This class can contains common functions which are useful for
FSP ASM Web page. Some functionality is only accessible through
the FSP Web UI (such as progress codes), so we scrape it.
'''
import time
import subprocess
import os
import pexpect
import sys
import subprocess
from .OpTestConstants import OpTestConstants as BMC_CONST
from .OpTestError import OpTestError
import http.cookiejar
import urllib.request
import urllib.parse
import urllib.error
import re
import ssl
class OpTestASM:
def __init__(self, i_fspIP, i_fspUser, i_fspPasswd):
self.host_name = i_fspIP
self.user_name = i_fspUser
self.password = i_fspPasswd
self.url = "https://%s/cgi-bin/cgi?" % self.host_name
self.cj = http.cookiejar.CookieJar()
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
opener = urllib.request.build_opener(urllib.request.HTTPSHandler(context=context))
opener.addheaders = [('User-agent', 'LTCTest')]
opener.add_handler(urllib.request.HTTPCookieProcessor(self.cj))
urllib.request.install_opener(opener)
self.setforms()
def setforms(self):
if "FW860" in self.ver():
self.hrdwr = 'p8'
self.frms = {'pwr': '59',
'dbg': '78',
'immpwroff': '32'}
else:
self.hrdwr = 'p7'
self.frms = {'pwr': '60',
'dbg': '79',
'immpwroff': '33'}
def getcsrf(self, form):
while True:
try:
myurl = urllib.request.urlopen(self.url+form, timeout=10)
except urllib.error.URLError:
time.sleep(2)
continue
break
out = myurl.read().decode("utf-8")
if 'CSRF_TOKEN' in out:
return re.findall('CSRF_TOKEN.*value=\'(.*)\'', out)[0]
else:
return '0'
def getpage(self, form):
myurl = urllib.request.urlopen(self.url+form, timeout=60)
return myurl.read().decode("utf-8")
def submit(self, form, param):
param['CSRF_TOKEN'] = self.getcsrf(form)
data = urllib.parse.urlencode(param).encode("utf-8")
req = urllib.request.Request(self.url+form, data)
return urllib.request.urlopen(req)
def login(self):
if not len(self.cj) == 0:
return True
param = {'user': self.user_name,
'password': self.password,
'login': 'Log in',
'lang': '0',
'CSRF_TOKEN': ''}
form = "form=2"
resp = self.submit(form, param)
count = 0
while count < 2:
if not len(self.cj) == 0:
break
# the login can quietly fail because the FSP has 'too many users' logged in,
# even though it actually doesn't. let's check to see if this is the case
# by trying a request.
if "Too many users" in self.getpage("form=2"):
raise OpTestError("FSP reports 'Too many users', FSP needs power cycle")
time.sleep(10)
self.submit(form, param)
msg = "Login failed with user:{0} and password:{1}".format(
self.user_name, self.password)
print(msg)
count += 1
if count == 2:
print(msg)
return False
return True
def logout(self):
param = {'submit': 'Log out',
'CSRF_TOKEN': ''}
form = "form=1"
self.submit(form, param)
def ver(self):
form = "form=1"
return self.getpage(form)
def execommand(self, cmd):
if not self.login():
raise OpTestError("Failed to login ASM page")
param = {'form': '16',
'exe': 'Execute',
'CSRF_TOKEN': '',
'cmd': cmd}
form = "form=16&frm=0"
self.submit(form, param)
def disablefirewall(self):
if not self.login():
raise OpTestError("Failed to login ASM page")
self.execommand('iptables -F')
self.logout()
def clearlogs(self):
if not self.login():
raise OpTestError("Failed to login ASM page")
param = {'form': '30',
'clear': "Clear all error/event log entries",
'CSRF_TOKEN': ''}
form = "form=30"
self.submit(form, param)
self.logout()
def powerstat(self):
form = "form=%s" % self.frms['pwr']
return self.getpage(form)
def start_debugvtty_session(self, partitionId='0', sessionId='0',
sessionTimeout='600'):
if not self.login():
raise OpTestError("Failed to login ASM page")
param = {'form': '81',
'p': partitionId,
's': sessionId,
't': sessionTimeout,
'Save settings': 'Save settings',
'CSRF_TOKEN': ''}
form = "form=81"
self.submit(form, param)
self.logout()
def enable_err_injct_policy(self):
if not self.login():
raise OpTestError("Failed to login ASM page")
param = {'form': '56',
'p': '1',
'submit': 'Save settings',
'CSRF_TOKEN': ''}
form = "form=56"
self.submit(form, param)
self.logout()
| 31.902439 | 90 | 0.555657 | 762 | 6,540 | 4.711286 | 0.374016 | 0.032591 | 0.027298 | 0.037047 | 0.193593 | 0.157382 | 0.124791 | 0.106964 | 0.083008 | 0.083008 | 0 | 0.016988 | 0.324924 | 6,540 | 204 | 91 | 32.058824 | 0.796149 | 0.193119 | 0 | 0.284722 | 1 | 0 | 0.130509 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.097222 | false | 0.034722 | 0.097222 | 0 | 0.263889 | 0.013889 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b00216e5015b612b495eca186f46004bdc92b04 | 1,824 | py | Python | test/test_storage.py | jrabasco/PyPasser | 3cc6ecdfa9b5fe22f5a88c221517fe09d2df9db6 | [
"MIT"
] | null | null | null | test/test_storage.py | jrabasco/PyPasser | 3cc6ecdfa9b5fe22f5a88c221517fe09d2df9db6 | [
"MIT"
] | null | null | null | test/test_storage.py | jrabasco/PyPasser | 3cc6ecdfa9b5fe22f5a88c221517fe09d2df9db6 | [
"MIT"
] | null | null | null | #!/usr/bin/python3.4
__author__ = "Jeremy Rabasco"
import sys
import os
sys.path.append("..")
import unittest
from modules import storage
from modules.service import Service
from modules.database import Database
class TestStorage(unittest.TestCase):
def setUp(self):
self.service = Service()
self.database = Database()
open("test.service", "w+").close()
open("test.db", "w+").close()
def test_write_read_service(self):
self.service.service_name = "Hello"
self.service.username = "This"
self.service.password = "Works"
storage.write("test", self.service, "test.service")
service2 = Service()
storage.read("test", service2, "test.service")
self.assertEqual(service2.service_name, self.service.service_name)
self.assertEqual(service2.username, self.service.username)
self.assertEqual(service2.password, self.service.password)
def test_write_read_database(self):
self.database.add_service(Service())
self.database.add_service(Service())
self.database.name = "Hey"
storage.write("test", self.database, "test.db")
database2 = Database()
storage.read("test", database2, "test.db")
self.assertEqual(database2.name, self.database.name)
for i in range(len(self.database.services)):
self.assertEqual(database2.services[i].service_name, self.database.services[i].service_name)
self.assertEqual(database2.services[i].username, self.database.services[i].username)
self.assertEqual(database2.services[i].password, self.database.services[i].password)
def tearDown(self):
os.remove(os.getcwd() + "/test.service")
os.remove(os.getcwd() + "/test.db")
if __name__ == "__main__":
unittest.main() | 35.076923 | 104 | 0.668311 | 215 | 1,824 | 5.553488 | 0.246512 | 0.100503 | 0.050251 | 0.065327 | 0.204355 | 0.058626 | 0.058626 | 0 | 0 | 0 | 0 | 0.008844 | 0.194079 | 1,824 | 52 | 105 | 35.076923 | 0.803401 | 0.010417 | 0 | 0.04878 | 0 | 0 | 0.077008 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 1 | 0.097561 | false | 0.073171 | 0.146341 | 0 | 0.268293 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6b01058178b8f414abe46085a609e4696e9cb097 | 1,096 | py | Python | setup.py | ripiuk/fant_sizer | dcc0908c79ed76af3f4189ebd2a75cecf7a89e34 | [
"MIT"
] | null | null | null | setup.py | ripiuk/fant_sizer | dcc0908c79ed76af3f4189ebd2a75cecf7a89e34 | [
"MIT"
] | null | null | null | setup.py | ripiuk/fant_sizer | dcc0908c79ed76af3f4189ebd2a75cecf7a89e34 | [
"MIT"
] | null | null | null | from setuptools import setup, find_packages
from os.path import join, dirname
setup(
name="fant_sizer",
version="0.7",
author="Rypiuk Oleksandr",
author_email="ripiuk96@gmail.com",
description="fant_sizer command-line file-information",
url="https://github.com/ripiuk/fant_sizer",
keywords="file command-line information size tool recursively",
license="MIT",
classifiers=[
'Topic :: Utilities',
'Environment :: Console',
'Natural Language :: English',
'License :: OSI Approved :: MIT License',
'Intended Audience :: Developers',
'Intended Audience :: Information Technology',
'Development Status :: 5 - Production/Stable',
'Programming Language :: Python :: 3.6'
],
packages=find_packages(),
long_description=open(join(dirname(__file__), "README.rst")).read(),
entry_points={
"console_scripts":
['fant_sizer = fant_sizer.fant_sizer:_main'],
},
)
| 36.533333 | 76 | 0.581204 | 104 | 1,096 | 5.961538 | 0.673077 | 0.087097 | 0.041935 | 0.058065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009091 | 0.297445 | 1,096 | 29 | 77 | 37.793103 | 0.796104 | 0 | 0 | 0 | 0 | 0 | 0.457117 | 0.024635 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b04db30f6d56200725a9e9d3be9cbc67d645d65 | 2,074 | py | Python | tests/python/unittest/test_tir_pass_inject_double_buffer.py | 0xreza/tvm | f08d5d78ee000b2c113ac451f8d73817960eafd5 | [
"Zlib",
"Unlicense",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"ECL-2.0"
] | null | null | null | tests/python/unittest/test_tir_pass_inject_double_buffer.py | 0xreza/tvm | f08d5d78ee000b2c113ac451f8d73817960eafd5 | [
"Zlib",
"Unlicense",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"ECL-2.0"
] | 1 | 2020-07-29T00:21:19.000Z | 2020-07-29T00:21:19.000Z | tests/python/unittest/test_tir_pass_inject_double_buffer.py | 0xreza/tvm | f08d5d78ee000b2c113ac451f8d73817960eafd5 | [
"Zlib",
"Unlicense",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"ECL-2.0"
] | 1 | 2021-07-22T17:33:16.000Z | 2021-07-22T17:33:16.000Z | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import tvm
from tvm import te
def test_double_buffer():
dtype = 'int64'
n = 100
m = 4
tx = te.thread_axis("threadIdx.x")
ib = tvm.tir.ir_builder.create()
A = ib.pointer("float32", name="A")
C = ib.pointer("float32", name="C")
ib.scope_attr(tx, "thread_extent", 1)
with ib.for_range(0, n) as i:
B = ib.allocate("float32", m, name="B", scope="shared")
with ib.new_scope():
ib.scope_attr(B.asobject(), "double_buffer_scope", 1)
with ib.for_range(0, m) as j:
B[j] = A[i * 4 + j]
with ib.for_range(0, m) as j:
C[j] = B[j] + 1
stmt = ib.get()
stmt = tvm.tir.ir_pass.InjectDoubleBuffer(stmt, 2)
stmt = tvm.tir.ir_pass.Simplify(stmt)
assert isinstance(stmt.body.body, tvm.tir.Allocate)
assert stmt.body.body.extents[0].value == 2
mod = tvm.IRModule({
"db" : tvm.tir.PrimFunc([A.asobject(), C.asobject()], stmt)
})
f = tvm.tir.transform.ThreadSync("shared")(mod)["db"]
count = [0]
def count_sync(op):
if isinstance(op, tvm.tir.Call) and op.name == "tvm_storage_sync":
count[0] += 1
tvm.tir.ir_pass.PostOrderVisit(f.body, count_sync)
assert count[0] == 4
if __name__ == "__main__":
test_double_buffer()
| 36.385965 | 74 | 0.655738 | 318 | 2,074 | 4.18239 | 0.430818 | 0.03609 | 0.02406 | 0.031579 | 0.065414 | 0.041353 | 0.028571 | 0.028571 | 0 | 0 | 0 | 0.019171 | 0.220347 | 2,074 | 56 | 75 | 37.035714 | 0.80334 | 0.362584 | 0 | 0.055556 | 0 | 0 | 0.085824 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.055556 | false | 0.083333 | 0.055556 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6b052c373e2583931e7668595c831adfd5fed432 | 491 | py | Python | read_sensor.py | shivupoojar/openfaas-pi | 5eda501368a1ac321954cb2aaf58be617977bd58 | [
"Apache-2.0"
] | 1 | 2020-11-24T03:31:26.000Z | 2020-11-24T03:31:26.000Z | read_sensor.py | shivupoojar/openfaas-pi | 5eda501368a1ac321954cb2aaf58be617977bd58 | [
"Apache-2.0"
] | null | null | null | read_sensor.py | shivupoojar/openfaas-pi | 5eda501368a1ac321954cb2aaf58be617977bd58 | [
"Apache-2.0"
] | null | null | null | import requests
from sense_hat import SenseHat
import smbus
import time
while True:
try:
pressure=0
sense = SenseHat()
pressure = sense.get_pressure()
data = {'pressure':pressure}
print(pressure)
#send http request to sense serverless function with pressure
#data
r=requests.post('http://127.0.0.1:8080/function/sensor',data)
print(r.text)
sense=SenseHat()
sense.show_message(r.text)
except KeyboardInterrupt:
sys.exit()
| 21.347826 | 69 | 0.672098 | 63 | 491 | 5.190476 | 0.571429 | 0.079511 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028947 | 0.226069 | 491 | 22 | 70 | 22.318182 | 0.831579 | 0.130346 | 0 | 0.117647 | 0 | 0 | 0.105882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.235294 | null | null | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b0a2521796cb92f0d1e011306fd05dc969275cf | 355 | py | Python | origamibot/core/teletypes/poll_option.py | cmd410/OrigamiBot | 03667d069f0c0b088671936ce36bf8f85a029b93 | [
"MIT"
] | 4 | 2020-06-30T10:32:54.000Z | 2020-11-01T23:07:58.000Z | origamibot/core/teletypes/poll_option.py | cmd410/OrigamiBot | 03667d069f0c0b088671936ce36bf8f85a029b93 | [
"MIT"
] | 6 | 2020-06-26T23:14:59.000Z | 2020-07-26T11:48:07.000Z | origamibot/core/teletypes/poll_option.py | cmd410/OrigamiBot | 03667d069f0c0b088671936ce36bf8f85a029b93 | [
"MIT"
] | 1 | 2020-07-28T08:52:51.000Z | 2020-07-28T08:52:51.000Z | from .base import TelegramStructure, Field
class PollOption(TelegramStructure):
text = Field()
voter_count = Field()
def __init__(self,
text: str,
voter_count: int
):
self.text = \
Field(text, [str])
self.voter_count = \
Field(voter_count, [int])
| 19.722222 | 42 | 0.515493 | 33 | 355 | 5.30303 | 0.454545 | 0.228571 | 0.171429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.388732 | 355 | 17 | 43 | 20.882353 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.083333 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b0f57abb4c6963ae8d955c1ecf87495f2b1c219 | 12,193 | py | Python | plugins/modules/oci_blockstorage_volume_backup_policy_facts.py | LaudateCorpus1/oci-ansible-collection | 2b1cd87b4d652a97c1ca752cfc4fdc4bdb37a7e7 | [
"Apache-2.0"
] | null | null | null | plugins/modules/oci_blockstorage_volume_backup_policy_facts.py | LaudateCorpus1/oci-ansible-collection | 2b1cd87b4d652a97c1ca752cfc4fdc4bdb37a7e7 | [
"Apache-2.0"
] | null | null | null | plugins/modules/oci_blockstorage_volume_backup_policy_facts.py | LaudateCorpus1/oci-ansible-collection | 2b1cd87b4d652a97c1ca752cfc4fdc4bdb37a7e7 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
# Copyright (c) 2020, 2022 Oracle and/or its affiliates.
# This software is made available to you under the terms of the GPL 3.0 license or the Apache 2.0 license.
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Apache License v2.0
# See LICENSE.TXT for details.
# GENERATED FILE - DO NOT EDIT - MANUAL CHANGES WILL BE OVERWRITTEN
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {
"metadata_version": "1.1",
"status": ["preview"],
"supported_by": "community",
}
DOCUMENTATION = """
---
module: oci_blockstorage_volume_backup_policy_facts
short_description: Fetches details about one or multiple VolumeBackupPolicy resources in Oracle Cloud Infrastructure
description:
- Fetches details about one or multiple VolumeBackupPolicy resources in Oracle Cloud Infrastructure
- Lists all the volume backup policies available in the specified compartment.
- For more information about Oracle defined backup policies and user defined backup policies,
see L(Policy-Based Backups,https://docs.cloud.oracle.com/iaas/Content/Block/Tasks/schedulingvolumebackups.htm).
- If I(policy_id) is specified, the details of a single VolumeBackupPolicy will be returned.
version_added: "2.9.0"
author: Oracle (@oracle)
options:
policy_id:
description:
- The OCID of the volume backup policy.
- Required to get a specific volume_backup_policy.
type: str
aliases: ["id"]
compartment_id:
description:
- The OCID of the compartment.
If no compartment is specified, the Oracle defined backup policies are listed.
type: str
extends_documentation_fragment: [ oracle.oci.oracle, oracle.oci.oracle_display_name_option ]
"""
EXAMPLES = """
- name: Get a specific volume_backup_policy
oci_blockstorage_volume_backup_policy_facts:
# required
policy_id: "ocid1.policy.oc1..xxxxxxEXAMPLExxxxxx"
- name: List volume_backup_policies
oci_blockstorage_volume_backup_policy_facts:
# optional
compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
"""
RETURN = """
volume_backup_policies:
description:
- List of VolumeBackupPolicy resources
returned: on success
type: complex
contains:
display_name:
description:
- A user-friendly name. Does not have to be unique, and it's changeable.
Avoid entering confidential information.
returned: on success
type: str
sample: display_name_example
id:
description:
- The OCID of the volume backup policy.
returned: on success
type: str
sample: "ocid1.resource.oc1..xxxxxxEXAMPLExxxxxx"
schedules:
description:
- The collection of schedules that this policy will apply.
returned: on success
type: complex
contains:
backup_type:
description:
- The type of volume backup to create.
returned: on success
type: str
sample: FULL
offset_seconds:
description:
- The number of seconds that the volume backup start
time should be shifted from the default interval boundaries specified by
the period. The volume backup start time is the frequency start time plus the offset.
returned: on success
type: int
sample: 56
period:
description:
- The volume backup frequency.
returned: on success
type: str
sample: ONE_HOUR
offset_type:
description:
- Indicates how the offset is defined. If value is `STRUCTURED`,
then `hourOfDay`, `dayOfWeek`, `dayOfMonth`, and `month` fields are used
and `offsetSeconds` will be ignored in requests and users should ignore its
value from the responses.
- "`hourOfDay` is applicable for periods `ONE_DAY`,
`ONE_WEEK`, `ONE_MONTH` and `ONE_YEAR`."
- "`dayOfWeek` is applicable for period
`ONE_WEEK`."
- "`dayOfMonth` is applicable for periods `ONE_MONTH` and `ONE_YEAR`."
- "'month' is applicable for period 'ONE_YEAR'."
- They will be ignored in the requests for inapplicable periods.
- If value is `NUMERIC_SECONDS`, then `offsetSeconds`
will be used for both requests and responses and the structured fields will be
ignored in the requests and users should ignore their values from the responses.
- For clients using older versions of Apis and not sending `offsetType` in their
requests, the behaviour is just like `NUMERIC_SECONDS`.
returned: on success
type: str
sample: STRUCTURED
hour_of_day:
description:
- The hour of the day to schedule the volume backup.
returned: on success
type: int
sample: 56
day_of_week:
description:
- The day of the week to schedule the volume backup.
returned: on success
type: str
sample: MONDAY
day_of_month:
description:
- The day of the month to schedule the volume backup.
returned: on success
type: int
sample: 56
month:
description:
- The month of the year to schedule the volume backup.
returned: on success
type: str
sample: JANUARY
retention_seconds:
description:
- How long, in seconds, to keep the volume backups created by this schedule.
returned: on success
type: int
sample: 56
time_zone:
description:
- Specifies what time zone is the schedule in
returned: on success
type: str
sample: UTC
destination_region:
description:
- The paired destination region for copying scheduled backups to. Example `us-ashburn-1`.
See L(Region Pairs,https://docs.cloud.oracle.com/iaas/Content/Block/Tasks/schedulingvolumebackups.htm#RegionPairs) for details about paired
regions.
returned: on success
type: str
sample: us-phoenix-1
time_created:
description:
- The date and time the volume backup policy was created. Format defined by L(RFC3339,https://tools.ietf.org/html/rfc3339).
returned: on success
type: str
sample: "2013-10-20T19:20:30+01:00"
compartment_id:
description:
- The OCID of the compartment that contains the volume backup.
returned: on success
type: str
sample: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
defined_tags:
description:
- Defined tags for this resource. Each key is predefined and scoped to a
namespace. For more information, see L(Resource Tags,https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm).
- "Example: `{\\"Operations\\": {\\"CostCenter\\": \\"42\\"}}`"
returned: on success
type: dict
sample: {'Operations': {'CostCenter': 'US'}}
freeform_tags:
description:
- Free-form tags for this resource. Each tag is a simple key-value pair with no
predefined name, type, or namespace. For more information, see L(Resource
Tags,https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm).
- "Example: `{\\"Department\\": \\"Finance\\"}`"
returned: on success
type: dict
sample: {'Department': 'Finance'}
sample: [{
"display_name": "display_name_example",
"id": "ocid1.resource.oc1..xxxxxxEXAMPLExxxxxx",
"schedules": [{
"backup_type": "FULL",
"offset_seconds": 56,
"period": "ONE_HOUR",
"offset_type": "STRUCTURED",
"hour_of_day": 56,
"day_of_week": "MONDAY",
"day_of_month": 56,
"month": "JANUARY",
"retention_seconds": 56,
"time_zone": "UTC"
}],
"destination_region": "us-phoenix-1",
"time_created": "2013-10-20T19:20:30+01:00",
"compartment_id": "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx",
"defined_tags": {'Operations': {'CostCenter': 'US'}},
"freeform_tags": {'Department': 'Finance'}
}]
"""
from ansible.module_utils.basic import AnsibleModule
from ansible_collections.oracle.oci.plugins.module_utils import oci_common_utils
from ansible_collections.oracle.oci.plugins.module_utils.oci_resource_utils import (
OCIResourceFactsHelperBase,
get_custom_class,
)
try:
from oci.core import BlockstorageClient
HAS_OCI_PY_SDK = True
except ImportError:
HAS_OCI_PY_SDK = False
class VolumeBackupPolicyFactsHelperGen(OCIResourceFactsHelperBase):
"""Supported operations: get, list"""
def get_required_params_for_get(self):
return [
"policy_id",
]
def get_required_params_for_list(self):
return []
def get_resource(self):
return oci_common_utils.call_with_backoff(
self.client.get_volume_backup_policy,
policy_id=self.module.params.get("policy_id"),
)
def list_resources(self):
optional_list_method_params = [
"compartment_id",
"display_name",
]
optional_kwargs = dict(
(param, self.module.params[param])
for param in optional_list_method_params
if self.module.params.get(param) is not None
)
return oci_common_utils.list_all_resources(
self.client.list_volume_backup_policies, **optional_kwargs
)
VolumeBackupPolicyFactsHelperCustom = get_custom_class(
"VolumeBackupPolicyFactsHelperCustom"
)
class ResourceFactsHelper(
VolumeBackupPolicyFactsHelperCustom, VolumeBackupPolicyFactsHelperGen
):
pass
def main():
module_args = oci_common_utils.get_common_arg_spec()
module_args.update(
dict(
policy_id=dict(aliases=["id"], type="str"),
compartment_id=dict(type="str"),
display_name=dict(type="str"),
)
)
module = AnsibleModule(argument_spec=module_args)
if not HAS_OCI_PY_SDK:
module.fail_json(msg="oci python sdk required for this module.")
resource_facts_helper = ResourceFactsHelper(
module=module,
resource_type="volume_backup_policy",
service_client_class=BlockstorageClient,
namespace="core",
)
result = []
if resource_facts_helper.is_get():
result = [resource_facts_helper.get()]
elif resource_facts_helper.is_list():
result = resource_facts_helper.list()
else:
resource_facts_helper.fail()
module.exit_json(volume_backup_policies=result)
if __name__ == "__main__":
main()
| 38.342767 | 157 | 0.584598 | 1,269 | 12,193 | 5.459417 | 0.253743 | 0.04157 | 0.046622 | 0.057592 | 0.365762 | 0.285508 | 0.182448 | 0.167436 | 0.141599 | 0.113453 | 0 | 0.012455 | 0.341507 | 12,193 | 317 | 158 | 38.463722 | 0.850417 | 0.033954 | 0 | 0.278986 | 0 | 0.018116 | 0.784652 | 0.055579 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018116 | false | 0.003623 | 0.021739 | 0.01087 | 0.061594 | 0.003623 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b118924fc3b616de8ffa2d875b9ce842c00da9f | 2,141 | py | Python | api/app/models/bookings/exam.py | pixelater/queue-management | 9881505d4af2b9860aeaf76b9572315dd016c7dc | [
"Apache-2.0"
] | null | null | null | api/app/models/bookings/exam.py | pixelater/queue-management | 9881505d4af2b9860aeaf76b9572315dd016c7dc | [
"Apache-2.0"
] | 1 | 2019-02-26T00:27:31.000Z | 2019-02-26T00:27:31.000Z | api/app/models/bookings/exam.py | pixelater/queue-management | 9881505d4af2b9860aeaf76b9572315dd016c7dc | [
"Apache-2.0"
] | null | null | null | '''Copyright 2018 Province of British Columbia
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.'''
from app.models.bookings import Base
from qsystem import db
class Exam(Base):
exam_id = db.Column(db.Integer, primary_key=True, autoincrement=True, nullable=False)
booking_id = db.Column(db.Integer, db.ForeignKey("booking.booking_id", ondelete="set null"), nullable=True)
exam_type_id = db.Column(db.Integer, db.ForeignKey("examtype.exam_type_id"), nullable=False)
office_id = db.Column(db.Integer, db.ForeignKey("office.office_id"), nullable=False)
event_id = db.Column(db.String(25), nullable=False)
exam_name = db.Column(db.String(50), nullable=False)
examinee_name = db.Column(db.String(50), nullable=True)
expiry_date = db.Column(db.DateTime, nullable=True)
notes = db.Column(db.String(400), nullable=True)
exam_received_date = db.Column(db.DateTime, nullable=True)
session_number = db.Column(db.Integer, nullable=True)
number_of_students = db.Column(db.Integer, nullable=True)
exam_method = db.Column(db.String(15), nullable=False)
deleted_date = db.Column(db.String(50), nullable=True)
exam_returned_ind = db.Column(db.Integer, nullable=False, default=0)
exam_returned_tracking_number = db.Column(db.String(50), nullable=True)
offsite_location = db.Column(db.String(50), nullable=True)
booking = db.relationship("Booking")
exam_type = db.relationship("ExamType")
office = db.relationship("Office")
def __repr__(self):
return '<Exam Name: (name={self.exam_name!r})>'.format(self=self)
def __init__(self, **kwargs):
super(Exam, self).__init__(**kwargs)
| 43.693878 | 111 | 0.739841 | 316 | 2,141 | 4.882911 | 0.39557 | 0.08814 | 0.110175 | 0.082955 | 0.270253 | 0.241737 | 0.204148 | 0 | 0 | 0 | 0 | 0.014161 | 0.142457 | 2,141 | 48 | 112 | 44.604167 | 0.826253 | 0.263428 | 0 | 0 | 0 | 0 | 0.077658 | 0.029917 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.074074 | 0.037037 | 0.962963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6b1abd24dcce5c1b223e996046c73de1b7c697fc | 1,332 | py | Python | Concurrent/PipelineDecomposingTask.py | rafagarciac/ParallelProgrammingPython | bba91984018688f41049fd63961d3b8872876336 | [
"MIT"
] | null | null | null | Concurrent/PipelineDecomposingTask.py | rafagarciac/ParallelProgrammingPython | bba91984018688f41049fd63961d3b8872876336 | [
"MIT"
] | null | null | null | Concurrent/PipelineDecomposingTask.py | rafagarciac/ParallelProgrammingPython | bba91984018688f41049fd63961d3b8872876336 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""
Artesanal example Pipe without Pipe class.
"""
__author__ = "Rafael García Cuéllar"
__email__ = "r.gc@hotmail.es"
__copyright__ = "Copyright (c) 2018 Rafael García Cuéllar"
__license__ = "MIT"
from concurrent.futures import ProcessPoolExecutor
import time
import random
def worker(arg):
time.sleep(random.random())
return arg
def pipeline(future):
pools[1].submit(worker, future.result()).add_done_callback(printer)
def printer(future):
pools[2].submit(worker, future.result()).add_done_callback(spout)
def spout(future):
print(future.result())
def instanceProcessPool():
pools = []
for i in range(3):
pool = ProcessPoolExecutor(2)
pools.append(pool)
return pools
def shutdownPools(pools):
for pool in pools:
pool.shutdown()
def runThreadsInPipeline(pools):
for pool in pools:
pool.submit(worker, random.random()).add_done_callback(pipeline)
if __name__ == "__main__":
__spec__ = None # Fix multiprocessing in Spyder's IPython
pools = instanceProcessPool() # pool = ProcessPoolExecutor([max_workers])
runThreadsInPipeline(pools) # pools[0].submit(worker, random.random()).add_done_callback(pipeline)
shutdownPools(pools) # pool.shutdown() | 27.183673 | 107 | 0.678679 | 152 | 1,332 | 5.703947 | 0.473684 | 0.055363 | 0.069204 | 0.055363 | 0.251442 | 0.251442 | 0.198385 | 0.10842 | 0 | 0 | 0 | 0.008531 | 0.207958 | 1,332 | 49 | 108 | 27.183673 | 0.81327 | 0.173423 | 0 | 0.060606 | 0 | 0 | 0.080111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212121 | false | 0 | 0.090909 | 0 | 0.363636 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b1f450e0afe4b703c2e85c366b7453eddf2730b | 1,289 | py | Python | iota/commands/core/get_node_info.py | EasonC13/iota.py | f596c1ac0d9bcbceda1cf6109cd921943a6599b3 | [
"MIT"
] | 347 | 2016-12-23T14:28:06.000Z | 2019-09-30T13:46:30.000Z | iota/commands/core/get_node_info.py | EasonC13/iota.py | f596c1ac0d9bcbceda1cf6109cd921943a6599b3 | [
"MIT"
] | 194 | 2016-12-22T21:22:47.000Z | 2019-10-01T09:01:16.000Z | iota/commands/core/get_node_info.py | EasonC13/iota.py | f596c1ac0d9bcbceda1cf6109cd921943a6599b3 | [
"MIT"
] | 147 | 2017-01-08T13:14:47.000Z | 2019-10-01T22:27:31.000Z | import filters as f
from iota import TransactionHash, Address
from iota.commands import FilterCommand, RequestFilter, ResponseFilter
from iota.filters import Trytes
__all__ = [
'GetNodeInfoCommand',
]
class GetNodeInfoCommand(FilterCommand):
"""
Executes `getNodeInfo` command.
See :py:meth:`iota.api.StrictIota.get_node_info`.
"""
command = 'getNodeInfo'
def get_request_filter(self):
return GetNodeInfoRequestFilter()
def get_response_filter(self):
return GetNodeInfoResponseFilter()
class GetNodeInfoRequestFilter(RequestFilter):
def __init__(self) -> None:
# ``getNodeInfo`` does not accept any parameters.
# Using a filter here just to enforce that the request is empty.
super(GetNodeInfoRequestFilter, self).__init__({})
class GetNodeInfoResponseFilter(ResponseFilter):
def __init__(self) -> None:
super(GetNodeInfoResponseFilter, self).__init__({
'coordinatorAddress':
f.ByteString(encoding='ascii') | Trytes(Address),
'latestMilestone':
f.ByteString(encoding='ascii') | Trytes(TransactionHash),
'latestSolidSubtangleMilestone':
f.ByteString(encoding='ascii') | Trytes(TransactionHash),
})
| 28.644444 | 73 | 0.6827 | 115 | 1,289 | 7.426087 | 0.513043 | 0.028103 | 0.066745 | 0.084309 | 0.140515 | 0.105386 | 0 | 0 | 0 | 0 | 0 | 0 | 0.221102 | 1,289 | 44 | 74 | 29.295455 | 0.850598 | 0.150504 | 0 | 0.153846 | 0 | 0 | 0.098696 | 0.027002 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.153846 | 0.076923 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6b2003bf580dcb3f7c6fda11b1276e5b7d0fe837 | 4,925 | py | Python | Aplicacion/Presentacion/views.py | Juandiegordp/TPI | 427266f00745e9d9678110c1d01d3be4febca673 | [
"MIT"
] | null | null | null | Aplicacion/Presentacion/views.py | Juandiegordp/TPI | 427266f00745e9d9678110c1d01d3be4febca673 | [
"MIT"
] | null | null | null | Aplicacion/Presentacion/views.py | Juandiegordp/TPI | 427266f00745e9d9678110c1d01d3be4febca673 | [
"MIT"
] | null | null | null | from Negocio import controller
import forms, functions
from flask import Flask, render_template, request, redirect, url_for, flash
def register(mysql, request):
registerForm= forms.RegisterForm(request.form)
if request.method == 'POST' and registerForm.validate():
return controller.registraUsuario(mysql, request, registerForm)
return render_template('register.html', form=registerForm)
def Index(mysql, request):
if request.method=='GET':
success= request.args.get('success')
if success==None:
if controller.usuarioIniciado():
return redirect(url_for('home'))
else:
return render_template('Index.html')
else:
return render_template('Index.html', success=success)
return render_template('Index.html')
def home(mysql, request):
if request.method== 'POST':
controller.iniciarSesion(mysql, request)
if controller.usuarioIniciado() and request.method== 'GET':
return controller.mostrarRutinas(mysql, request)
else:
return redirect(url_for('Index'))
def historial_rutina(mysql, request):
if controller.usuarioIniciado() and request.method== 'GET':
return controller.mostrar_historial_rutina(mysql, request)
else:
return redirect(url_for('Index'))
def historial_usuario(mysql, request):
if controller.usuarioIniciado() and request.method== 'GET':
return controller.mostrar_historial_usuario(mysql, request)
else:
return redirect(url_for('Index'))
def perfil(mysql, request):
if controller.usuarioIniciado and request.method=='GET':
success= request.args.get('success')
usuario=controller.datosUsuario(mysql, request)
imc=functions.IMC(usuario[8], usuario[7])
m_basal= controller.calcular_metabolismo_basal(mysql, usuario[7], usuario[8])
return render_template('perfil.html', success=success, usuario=usuario, imc=imc, evaluacion=functions.evaluarIMC(imc), pg=functions.porcentajeGrasa(usuario[5], usuario[9], usuario[10], usuario[7], usuario[11]), m_basal=m_basal )
else:
return redirect(url_for('Index'))
def ActualizarPerfil(mysql, request):
actualize_form= forms.PerfilForm(request.form)
if request.method == 'POST' and controller.usuarioIniciado:
if actualize_form.validate():
return controller.actualizar_perfil(mysql, request)
else:
flash("Alguno de los datos es incorrecto")
return redirect(url_for('actualizar_perfil', success=False))
else:
if request.method == 'GET' and controller.usuarioIniciado:
datos=controller.formulario_perfil(mysql)
return render_template('actualizar_perfil.html', form=actualize_form, datos=datos)
return redirect(url_for('perfil'))
def administracionRutinas(mysql, request):
if controller.usuarioIniciado():
return render_template('administracion_rutinas.html')
else:
return redirect(url_for('Index'))
def crearRutina(mysql, request):
if request.method =='POST' and controller.usuarioIniciado():
return controller.agregarRutina(mysql, request)
else:
if controller.rutinaIniciada() and controller.usuarioIniciado():
return controller.rutinaEnCurso(mysql, request)
if controller.usuarioIniciado():
return redirect(url_for('adm_rutinas'))
else:
return redirect(url_for('Index'))
def registrarEjerciciosRutina(mysql, request):
if request.method == 'POST':
return controller.registrarEjerciciosRutina(mysql, request)
return redirect(url_for('adm_rutinas'))
def modificarRutina(mysql, request):
if controller.usuarioIniciado():
rutinas=controller.rutinasUsuario(mysql)
rutinaEjercicios=controller.rutinaEjercicios(mysql)
datosEjer=controller.datosEjercicios(mysql)
return render_template('modify_rutina.html', rutinas=rutinas , ejercicios=datosEjer, rutinaEjer=rutinaEjercicios)
else:
return redirect(url_for('Index'))
def registrarModiciaciones(mysql, request):
if request.method == 'POST':
return controller.registrarModificaciones(mysql, request)
return redirect(url_for('adm_rutinas'))
def eliminarRutina(mysql,request):
if controller.usuarioIniciado():
rutinas=controller.rutinasUsuario(mysql)
rutinaEjercicios=controller.rutinaEjercicios(mysql)
return render_template('delete_rutina.html', rutinas=rutinas , rutinaEjer=rutinaEjercicios)
else:
return redirect(url_for('Index'))
def registrarEliminacion(mysql, request):
if request.method=='POST' and controller.usuarioIniciado():
return controller.registrarEliminacion(mysql, request)
else:
return redirect(url_for('Index'))
def registrarEjercicios(mysql, request):
if request.method == 'POST':
return controller.registrarEjercicio(mysql, request)
return redirect(url_for('ejercicios')) | 40.702479 | 236 | 0.718376 | 527 | 4,925 | 6.616698 | 0.178368 | 0.096358 | 0.068254 | 0.091769 | 0.540006 | 0.515056 | 0.458847 | 0.378262 | 0.317178 | 0.216232 | 0 | 0.002699 | 0.172589 | 4,925 | 121 | 237 | 40.702479 | 0.853006 | 0 | 0 | 0.457143 | 0 | 0 | 0.071255 | 0.009947 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.028571 | 0 | 0.514286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6b22b1c8cec3284d72a98eea77ac255711ba8ec7 | 811 | py | Python | web/backend/backend_django/apps/capacity/models.py | tOverney/ADA-Project | 69221210b1f4f13f6979123c6a7a1a9813ea18e5 | [
"Apache-2.0"
] | null | null | null | web/backend/backend_django/apps/capacity/models.py | tOverney/ADA-Project | 69221210b1f4f13f6979123c6a7a1a9813ea18e5 | [
"Apache-2.0"
] | 1 | 2016-11-04T01:03:21.000Z | 2016-11-04T10:10:06.000Z | web/backend/backend_django/apps/capacity/models.py | tOverney/ADA-Project | 69221210b1f4f13f6979123c6a7a1a9813ea18e5 | [
"Apache-2.0"
] | null | null | null | from django.db import models
from multigtfs.models import (
Block, Fare, FareRule, Feed, Frequency, Route, Service, ServiceDate, Shape,
ShapePoint, Stop, StopTime, Trip, Agency)
class Path(models.Model):
trip = models.ForeignKey(Trip)
stop = models.ForeignKey(Stop)
path = models.CharField(max_length=1024, null=True, blank=True)
class Meta:
unique_together = ('trip', 'stop',)
class Capacity(models.Model):
trip = models.ForeignKey(Trip)
stop_time = models.ForeignKey(StopTime)
service_date = models.ForeignKey(ServiceDate)
capacity1st = models.IntegerField('capacity1st', null=True, blank=True)
capacity2nd = models.IntegerField('capacity2nd', null=True, blank=True)
class Meta:
unique_together = ('trip', 'stop_time', 'service_date') | 32.44 | 79 | 0.705302 | 94 | 811 | 6.010638 | 0.425532 | 0.141593 | 0.069027 | 0.090265 | 0.307965 | 0.307965 | 0.307965 | 0.169912 | 0.169912 | 0.169912 | 0 | 0.011976 | 0.176326 | 811 | 25 | 80 | 32.44 | 0.833832 | 0 | 0 | 0.222222 | 0 | 0 | 0.067734 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6b2889ee02cbc2db0ebf9270a48b091ad3ca3b59 | 8,237 | py | Python | core/views.py | Neelamegam2000/QRcode-for-license | a6d4c9655c5ba52b24c1ea737797557f06e0fcbf | [
"MIT"
] | null | null | null | core/views.py | Neelamegam2000/QRcode-for-license | a6d4c9655c5ba52b24c1ea737797557f06e0fcbf | [
"MIT"
] | null | null | null | core/views.py | Neelamegam2000/QRcode-for-license | a6d4c9655c5ba52b24c1ea737797557f06e0fcbf | [
"MIT"
] | null | null | null | from django.shortcuts import render, redirect
from django.conf import settings
from django.core.files.storage import FileSystemStorage,default_storage
from django.core.mail import send_mail, EmailMessage
from core.models import Document
from core.forms import DocumentForm
from django.contrib import messages
import os
import pyqrcode
import png
import random
import base64
import cv2
import numpy as np
import pyzbar.pyzbar as pyzbar
def home(request):
documents= Document.objects.all()
return render(request, 'home.html', { 'documents': documents })
"""def simple_upload(request):
if request.method == 'POST' and request.FILES['myfile']:
myfile = request.FILES['myfile']
fs = FileSystemStorage()
filename = fs.save(myfile.name, myfile)
uploaded_file_url = fs.url(filename)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
media_path = os.path.join(BASE_DIR,'media')
full_path=os.path.join(media_path,myfile.name)
qr=pyqrcode.create(uploaded_file_url)
filename_before=filename.rsplit(".")
filename1=filename_before[0]+".png"
s=qr.png(filename1,scale=6)
'''from fpdf import FPDF
pdf=FPDF()
pdf.add_page()
pdf.image(filename1,x=50,y=None,w=60,h=60,type="",link=uploaded_file_url)'''
return render(request, 'simple_upload.html', {
'uploaded_file_url': uploaded_file_url
})
return render(request, 'simple_upload.html')"""
def model_form_upload(request):
id=""
msg=""
if request.method == 'POST':
form = DocumentForm(request.POST, request.FILES,request.POST)
if form.is_valid():
form.save()
email=form.cleaned_data['Email']
document_count=Document.objects.values_list('document').count()
document_last=Document.objects.values_list('document')[document_count-1]
document_name=document_last[0]
print(email)
t=Document.objects.last()
num_list=['0','1','2','3','4','5','6','7','8','9']
password1=""
for i in range(0,8):
password1=password1+random.choice(num_list)
t.password=password1
print(type(document_name))
document_name1=document_name.encode('ascii')
document_encode=str(base64.b64encode(document_name1))
ax=document_encode[2:-1]
t.file_url=ax
print(ax)
t.save()
qr=pyqrcode.create(ax)
filename=document_name.rsplit(".")
filename1=filename[0].split("/")
filename2=filename1[1]+".png"
qr.png(filename2,scale=6)
"""mail=EmailMessage('QR',password1,'vmneelamegam2000@gmail.com',[email])
#mail.attach(filename2,filename2.content_type)
mail.send()"""
subject = 'QRcode scanner for license'
message = password1
email_from = settings.EMAIL_HOST_USER
recipient_list = [email, ]
mail=EmailMessage( subject, message, email_from, recipient_list )
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
mail.attach_file(os.path.join(BASE_DIR,filename2))
mail.send()
msg="your successfully uploaded"
return redirect('model_form_upload')
else:
form = DocumentForm()
return render(request, 'model_form_upload.html', {'form': form,'msg':msg})
def mypass(request):
m=""
if(request.POST.get("pswd")==request.POST.get("pswd3")):
user_data=Document.objects.filter(Email=request.POST.get("email"),password=request.POST.get("old_pswd")).update(password=request.POST.get("pswd"))
user_data1=Document.objects.filter(Email=request.POST.get("email"),password=request.POST.get("pswd"))
"""if(len_user_data==1):
userdata.password=request.POST.get("pswd")
return render(request,'mypass.html',{u:"you have change the password successfully"})
else:"""
c=0
if(user_data1):
subject = 'QRcode scanner for license'
message = "Password has succesfully changed"+" "+request.POST.get("pswd")
email_from = settings.EMAIL_HOST_USER
recipient_list = [request.POST.get("email"), ]
mail=EmailMessage( subject, message, email_from, recipient_list )
mail.send()
c=1
m="your password is changed succesfully"
elif(len(Document.objects.filter(Email=request.POST.get("email"),password=request.POST.get("old_pswd")))==0 and request.method=="POST"):
m="your email or password is incorrect"
else:
m=""
print(m)
return render(request,'mypass.html',{"m":m})
def user_req(request):
if("scanner" in request.POST and request.method=="POST"):
cap = cv2.VideoCapture(0+cv2.CAP_DSHOW)
font = cv2.FONT_HERSHEY_PLAIN
decodedObjects=[]
while decodedObjects==[]:
_, frame = cap.read()
decodedObjects = pyzbar.decode(frame)
for obj in decodedObjects:
points = obj.polygon
(x,y,w,h) = obj.rect
pts = np.array(points, np.int32)
pts = pts.reshape((-1, 1, 2))
cv2.polylines(frame, [pts], True, (0, 255, 0), 3)
cv2.putText(frame, str(obj.data), (50, 50), font, 2,
(255, 0, 0), 3)
id =obj.data.decode("utf-8")
cv2.imshow("QR Reader", frame)
key = cv2.waitKey(10) & 0xFF
if decodedObjects!=[] :
cv2.destroyAllWindows()
return render(request,"user_req.html",{"id":id})
if('proceed' in request.POST and request.method=="POST"):
userdata=Document.objects.filter(file_url=request.POST.get("id1")).filter(password=request.POST.get("password1"))
return render(request,"user_req.html",{"userdata":userdata})
return render(request,"user_req.html",)
def user(request):
return render(request,"user.html",)
def forget_pass(request):
msg=""
if(request.method=="POST"):
num_list=['0','1','2','3','4','5','6','7','8','9']
password1=""
for i in range(0,8):
password1=password1+random.choice(num_list)
user_data=Document.objects.filter(Email=request.POST.get("email")).update(password=password1)
subject = 'QRcode scanner for license Forget password'
message = "Password has succesfully changed"+" "+password1
email_from = settings.EMAIL_HOST_USER
recipient_list = [request.POST.get("email"), ]
mail=EmailMessage( subject, message, email_from, recipient_list )
mail.send()
if(user_data>0):
msg="your password is changed succesfully and mail sent"
elif(user_data==0):
msg="your email is incorrect or not found"
return render(request,"forget_pass.html",{"msg":msg})
def qrcode_miss(request):
msg=""
if(request.method=='POST' and Document.objects.filter(Email=request.POST.get('email'),password=request.POST.get('password1'))):
user_data=Document.objects.values_list('document').filter(Email=request.POST.get('email'),password=request.POST.get('password1'))
m=user_data[0][0]
p=m.split('/')
print(p)
t=p[1]
print(t)
subject = 'QRcode scanner for license'
message = "resend"
email_from = settings.EMAIL_HOST_USER
recipient_list = [request.POST.get('email'),]
mail=EmailMessage( subject, message, email_from, recipient_list )
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
k=os.path.join(BASE_DIR,t)
print(k)
mail.attach_file(k)
mail.send()
msg="your qrcode is sent to your email"
elif(request.method=='POST'and Document.objects.values_list('document').filter(Email=request.POST.get('email'),password=request.POST.get('password1')).count()==0):
msg="your email or password is incorrect"
return render(request,'qrcode_miss.html',{"msg":msg})
| 42.901042 | 167 | 0.617701 | 1,016 | 8,237 | 4.889764 | 0.20374 | 0.059783 | 0.064815 | 0.038245 | 0.466787 | 0.367552 | 0.298712 | 0.285427 | 0.285427 | 0.253019 | 0 | 0.020186 | 0.2422 | 8,237 | 191 | 168 | 43.125654 | 0.775713 | 0 | 0 | 0.223684 | 0 | 0 | 0.124964 | 0.003171 | 0.013158 | 0 | 0.000577 | 0 | 0 | 1 | 0.046053 | false | 0.164474 | 0.098684 | 0.006579 | 0.210526 | 0.046053 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6b2b5f9728064d787e0b3474fa79a57d993dda3b | 594 | py | Python | main.py | Meat0Project/ChatBot | 35ebadc71b100d861f9c9e211e1e751175f47c50 | [
"MIT"
] | 4 | 2020-10-30T07:46:39.000Z | 2020-10-30T18:20:57.000Z | main.py | Meat0Project/ChatBot | 35ebadc71b100d861f9c9e211e1e751175f47c50 | [
"MIT"
] | null | null | null | main.py | Meat0Project/ChatBot | 35ebadc71b100d861f9c9e211e1e751175f47c50 | [
"MIT"
] | null | null | null | '''
Made by - Aditya mangal
Purpose - Python mini project
Date - 18 october 2020
'''
from chatterbot import ChatBot
from chatterbot.trainers import ChatterBotCorpusTrainer
form termcolor import cprint
import time
chatbot = ChatBot('Bot')
trainer = ChatterBotCorpusTrainer(chatbot)
trainer.train('chatterbot.corpus.english')
cprint("#" * 50, "magenta")
cprint((f"A Chatot ").center(50), "yellow")
cprint("#" * 50, "magenta")
print('You can exit by type exit\n')
while True:
query = input(">> ")
if 'exit' in query:
exit()
else:
print(chatbot.get_response(query))
| 22.846154 | 55 | 0.69697 | 74 | 594 | 5.581081 | 0.662162 | 0.067797 | 0.072639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024341 | 0.170034 | 594 | 25 | 56 | 23.76 | 0.813387 | 0 | 0 | 0.117647 | 0 | 0 | 0.182711 | 0.049116 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.235294 | null | null | 0.352941 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b2cec5a2588f39302333a5f4dacaf75c507b16b | 3,344 | py | Python | backend/api/management/commands/create_testdb.py | INSRapperswil/nornir-web | 458e6b24bc373197044b4b7b5da74f16f93a9459 | [
"MIT"
] | 2 | 2021-06-01T08:33:04.000Z | 2021-08-20T04:22:39.000Z | backend/api/management/commands/create_testdb.py | INSRapperswil/nornir-web | 458e6b24bc373197044b4b7b5da74f16f93a9459 | [
"MIT"
] | null | null | null | backend/api/management/commands/create_testdb.py | INSRapperswil/nornir-web | 458e6b24bc373197044b4b7b5da74f16f93a9459 | [
"MIT"
] | null | null | null | """
Setup DB with example data for tests
"""
from django.contrib.auth.hashers import make_password
from django.contrib.auth.models import User, Group
from django.core.management.base import BaseCommand
from api import models
class Command(BaseCommand):
help = 'Setup DB with example data for tests'
def handle(self, *args, **options):
print('---- Creating Users ----')
User.objects.get_or_create(username='thomastest', password=make_password('imatestin'))
thomas = User.objects.get(username='thomastest')
User.objects.get_or_create(username='norbert', password=make_password('netzwerk'))
norbert = User.objects.get(username='norbert')
User.objects.get_or_create(username='stefan', password=make_password('helldesk'))
stefan = User.objects.get(username='stefan')
superuser = Group.objects.get(name='superuser')
superuser.user_set.add(thomas)
netadmin = Group.objects.get(name='netadmin')
netadmin.user_set.add(norbert)
support = Group.objects.get(name='support')
support.user_set.add(stefan)
print('---- Creating Inventory ----')
models.Inventory.objects.create(name='Example', hosts_file='web_nornir/nornir_config/example_config/hosts.yaml',
groups_file='web_nornir/nornir_config/example_config/groups.yaml', type=1)
models.Inventory.objects.create(name='INS Lab', hosts_file='web_nornir/nornir_config/inslab_config/hosts.yaml',
groups_file='web_nornir/nornir_config/inslab_config/groups.yaml', type=1)
print('---- Creating Job Templates ----')
models.JobTemplate.objects.create(name='hello_world', description='This prints a hello world',
file_name='hello_world.py', created_by_id=1)
models.JobTemplate.objects.create(name='Get CDP Neighbors', description='Lists all CDP neighbors',
file_name='get_cdp_neighbors.py', created_by_id=1)
models.JobTemplate.objects.create(name='Get Interfaces',
description='Gets brief information about all interfaces, sh ip int br',
file_name='get_interfaces.py', created_by_id=1)
models.JobTemplate.objects.create(name='Ping Device',
description='Pings a chosen network device and reports if reachable',
file_name='ping.py', variables=['target'], created_by_id=1)
models.JobTemplate.objects.create(name='Get Configuration', description='Gets all configuration from device',
file_name='get_configuration.py', created_by_id=1)
print('---- Creating Tasks ----')
models.Task.objects.create(name='Get Hello World', created_by_id=1, template_id=1, inventory_id=1)
models.Task.objects.create(name='Get CDP neighbors of INS lab', created_by_id=2, template_id=2, inventory_id=2)
models.Task.objects.create(name='Get interfaces of INS lab', created_by_id=2, template_id=3, inventory_id=2)
print('---- ALL DONE!! ----')
| 54.819672 | 121 | 0.62201 | 387 | 3,344 | 5.21447 | 0.284238 | 0.06442 | 0.084242 | 0.035679 | 0.433598 | 0.35332 | 0.247275 | 0.173935 | 0.173935 | 0.098612 | 0 | 0.006486 | 0.262261 | 3,344 | 60 | 122 | 55.733333 | 0.811512 | 0.010766 | 0 | 0 | 0 | 0 | 0.283333 | 0.061728 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023256 | false | 0.093023 | 0.093023 | 0 | 0.162791 | 0.139535 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6b2e543e1da4f0dd04f05a16bdaaac83f262d6ce | 1,505 | py | Python | ipuz/puzzlekinds/__init__.py | maiamcc/ipuz | fbe6f663b28ad42754622bf2d3bbe59a26be2615 | [
"MIT"
] | 5 | 2015-06-23T17:18:41.000Z | 2020-05-05T16:43:14.000Z | ipuz/puzzlekinds/__init__.py | maiamcc/ipuz | fbe6f663b28ad42754622bf2d3bbe59a26be2615 | [
"MIT"
] | 3 | 2015-08-21T05:17:22.000Z | 2021-03-20T18:39:31.000Z | ipuz/puzzlekinds/__init__.py | maiamcc/ipuz | fbe6f663b28ad42754622bf2d3bbe59a26be2615 | [
"MIT"
] | 3 | 2018-01-15T17:28:10.000Z | 2020-09-29T20:32:21.000Z | from .acrostic import IPUZ_ACROSTIC_VALIDATORS
from .answer import IPUZ_ANSWER_VALIDATORS
from .block import IPUZ_BLOCK_VALIDATORS
from .crossword import IPUZ_CROSSWORD_VALIDATORS
from .fill import IPUZ_FILL_VALIDATORS
from .sudoku import IPUZ_SUDOKU_VALIDATORS
from .wordsearch import IPUZ_WORDSEARCH_VALIDATORS
IPUZ_PUZZLEKINDS = {
"http://ipuz.org/acrostic": {
"mandatory": (
"puzzle",
),
"validators": {
1: IPUZ_ACROSTIC_VALIDATORS,
},
},
"http://ipuz.org/answer": {
"mandatory": (),
"validators": {
1: IPUZ_ANSWER_VALIDATORS,
},
},
"http://ipuz.org/block": {
"mandatory": (
"dimensions",
),
"validators": {
1: IPUZ_BLOCK_VALIDATORS,
},
},
"http://ipuz.org/crossword": {
"mandatory": (
"dimensions",
"puzzle",
),
"validators": {
1: IPUZ_CROSSWORD_VALIDATORS,
},
},
"http://ipuz.org/fill": {
"mandatory": (),
"validators": {
1: IPUZ_FILL_VALIDATORS,
},
},
"http://ipuz.org/sudoku": {
"mandatory": (
"puzzle",
),
"validators": {
1: IPUZ_SUDOKU_VALIDATORS,
},
},
"http://ipuz.org/wordsearch": {
"mandatory": (
"dimensions",
),
"validators": {
1: IPUZ_WORDSEARCH_VALIDATORS,
},
},
}
| 23.153846 | 50 | 0.509635 | 120 | 1,505 | 6.15 | 0.15 | 0.094851 | 0.104336 | 0.170732 | 0.173442 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007216 | 0.355482 | 1,505 | 64 | 51 | 23.515625 | 0.753608 | 0 | 0 | 0.403226 | 0 | 0 | 0.226578 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.112903 | 0 | 0.112903 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b385bed93debf4cc525192d73536e80c2566746 | 591 | py | Python | python/ray/train/__init__.py | jamesliu/ray | 11ab412db1fa3603a3006e8ed414e80dd1f11c0c | [
"Apache-2.0"
] | 33 | 2020-05-27T14:25:24.000Z | 2022-03-22T06:11:30.000Z | python/ray/train/__init__.py | jamesliu/ray | 11ab412db1fa3603a3006e8ed414e80dd1f11c0c | [
"Apache-2.0"
] | 227 | 2021-10-01T08:00:01.000Z | 2021-12-28T16:47:26.000Z | python/ray/train/__init__.py | gramhagen/ray | c18caa4db36d466718bdbcb2229aa0b2dc03da1f | [
"Apache-2.0"
] | 5 | 2020-08-06T15:53:07.000Z | 2022-02-09T03:31:31.000Z | from ray.train.backend import BackendConfig
from ray.train.callbacks import TrainingCallback
from ray.train.checkpoint import CheckpointStrategy
from ray.train.session import (get_dataset_shard, local_rank, load_checkpoint,
report, save_checkpoint, world_rank, world_size)
from ray.train.trainer import Trainer, TrainingIterator
__all__ = [
"BackendConfig", "CheckpointStrategy", "get_dataset_shard",
"load_checkpoint", "local_rank", "report", "save_checkpoint",
"TrainingIterator", "TrainingCallback", "Trainer", "world_rank",
"world_size"
]
| 42.214286 | 79 | 0.749577 | 64 | 591 | 6.640625 | 0.375 | 0.082353 | 0.141176 | 0.084706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153976 | 591 | 13 | 80 | 45.461538 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0.258883 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.416667 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
6b3881271e2eaf5752f4f95cac12eda083886a6f | 301 | py | Python | byurak/accounts/admin.py | LikeLion-CAU-9th/Django-fancy-coder | 53c770f4c1891f9076bed8c89d0b942b77e67667 | [
"MIT"
] | null | null | null | byurak/accounts/admin.py | LikeLion-CAU-9th/Django-fancy-coder | 53c770f4c1891f9076bed8c89d0b942b77e67667 | [
"MIT"
] | 2 | 2021-06-27T16:19:47.000Z | 2021-08-01T16:41:54.000Z | byurak/accounts/admin.py | LikeLion-CAU-9th/Django-fancy-coder | 53c770f4c1891f9076bed8c89d0b942b77e67667 | [
"MIT"
] | 2 | 2021-08-21T13:32:52.000Z | 2021-12-20T10:12:45.000Z | from django.contrib import admin
from accounts.models import User, Profile, UserFollow
@admin.register(User)
class UserAdmin(admin.ModelAdmin):
list_display = ['email', 'nickname']
list_display_links = ['email', 'nickname']
admin.site.register(Profile)
admin.site.register(UserFollow)
| 23.153846 | 53 | 0.750831 | 36 | 301 | 6.194444 | 0.555556 | 0.098655 | 0.152466 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129568 | 301 | 12 | 54 | 25.083333 | 0.851145 | 0 | 0 | 0 | 0 | 0 | 0.086667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6b3deda0113b8eb8f9bdf6272cc95e4fe0c53714 | 2,743 | py | Python | jupyanno/sheets.py | betatim/jupyanno | 11fbb1825c8e6966260620758768e0e1fa5cecc9 | [
"Apache-2.0"
] | 23 | 2018-08-24T16:48:20.000Z | 2021-02-26T02:52:40.000Z | jupyanno/sheets.py | L3-data/jupyanno | 6f6ec37e88b4d92f00bc359e7e39157b6b7f0eb5 | [
"Apache-2.0"
] | 73 | 2018-08-13T07:56:15.000Z | 2018-10-09T13:55:20.000Z | jupyanno/sheets.py | L3-data/jupyanno | 6f6ec37e88b4d92f00bc359e7e39157b6b7f0eb5 | [
"Apache-2.0"
] | 4 | 2018-08-13T07:55:50.000Z | 2020-09-30T12:04:27.000Z | """Code for reading and writing results to google sheets"""
from bs4 import BeautifulSoup
import requests
import warnings
import json
import pandas as pd
from six.moves.urllib.parse import urlparse, parse_qs
from six.moves.urllib.request import urlopen
_CELLSET_ID = "AIzaSyC8Zo-9EbXgHfqNzDxVb_YS_IIZBWtvoJ4"
def get_task_sheet(in_task):
return get_sheet_as_df(sheet_api_url(in_task.sheet_id), _CELLSET_ID)
def get_sheet_as_df(base_url, kk, columns="A:AG"):
"""
Gets the sheet as a list of Dicts (directly importable to Pandas)
:return:
"""
try:
# TODO: we should probably get the whole sheet
all_vals = "{base_url}/{cols}?key={kk}".format(base_url=base_url,
cols=columns,
kk=kk)
t_data = json.loads(urlopen(all_vals).read().decode('latin1'))[
'values']
frow = t_data.pop(0)
return pd.DataFrame([
dict([(key, '' if idx >= len(irow) else irow[idx])
for idx, key in enumerate(frow)]) for irow in
t_data])
except IOError as e:
warnings.warn(
'Sheet could not be accessed, check internet connectivity, \
proxies and permissions: {}'.format(
e))
return pd.DataFrame([{}])
def sheet_api_url(sheet_id):
return "https://sheets.googleapis.com/v4/spreadsheets/{id}/values".format(
id=sheet_id)
def get_questions(in_url):
res = urlopen(in_url)
soup = BeautifulSoup(res.read(), 'html.parser')
def get_names(f):
return [v for k, v in f.attrs.items() if 'label' in k]
def get_name(f):
return get_names(f)[0] if len(
get_names(f)) > 0 else 'unknown'
all_questions = soup.form.findChildren(
attrs={'name': lambda x: x and x.startswith('entry.')})
return {get_name(q): q['name'] for q in all_questions}
def submit_response(form_url, cur_questions, verbose=False, **answers):
submit_url = form_url.replace('/viewform', '/formResponse')
form_data = {'draftResponse': [],
'pageHistory': 0}
for v in cur_questions.values():
form_data[v] = ''
for k, v in answers.items():
if k in cur_questions:
form_data[cur_questions[k]] = v
else:
warnings.warn('Unknown Question: {}'.format(k), RuntimeWarning)
if verbose:
print(form_data)
user_agent = {'Referer': form_url,
'User-Agent': "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537\
.36 (KHTML, like Gecko) Chrome/28.0.1500.52 Safari/537.36"}
return requests.post(submit_url, data=form_data, headers=user_agent)
| 33.45122 | 79 | 0.606635 | 367 | 2,743 | 4.370572 | 0.427793 | 0.018703 | 0.016833 | 0.022444 | 0.009975 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018063 | 0.273423 | 2,743 | 81 | 80 | 33.864198 | 0.786754 | 0.063434 | 0 | 0 | 0 | 0.016949 | 0.101415 | 0.02555 | 0 | 0 | 0 | 0.012346 | 0 | 1 | 0.118644 | false | 0 | 0.118644 | 0.067797 | 0.372881 | 0.016949 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b3e1154af6f1eb866c2c34cdc822a0ff3902ab9 | 2,191 | py | Python | sorting/python/max_heap.py | zhou7rui/algorithm | 9b5500ac3d8bdfd223bf9aec55e68675f2df7c59 | [
"MIT"
] | 6 | 2017-08-31T07:13:34.000Z | 2018-09-10T08:54:43.000Z | sorting/python/max_heap.py | zhou7rui/algorithm | 9b5500ac3d8bdfd223bf9aec55e68675f2df7c59 | [
"MIT"
] | null | null | null | sorting/python/max_heap.py | zhou7rui/algorithm | 9b5500ac3d8bdfd223bf9aec55e68675f2df7c59 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*
'''
最大堆实现
98
/ \
96 84
/ \ / \
92 82 78 47
/ \ / \ / \ / \
33 26 51 85 50 15 44 60
/ \ / \ / \ / \ / \ / \ / \ / \
40 51 98 51 7 17 94 82 32 21 64 60 7 44 63 63
'''
import random
class Maxheap(object):
def __init__(self,cpacity,arr = None):
self.data = [None] * (cpacity + 1)
self.cpacity = cpacity
if arr is None:
self.count = 0
else:
for i in range(0,cpacity):
self.data[i + 1]= arr[i]
self.count = cpacity
for i in range(self.count / 2, 0, -1):
self.__shifDown(i)
def size(self):
return self.count
def isEmpty(self):
return self.count == 0
def __shiftUp(self,k):
while k > 1 and self.data[k] > self.data[int(k / 2)]:
self.data[k],self.data[int(k / 2)] = self.data[int(k / 2)], self.data[k]
k =int(k/2)
def insert(self,data):
self.data[self.count + 1] = data
self.count += 1
self.__shiftUp(self.count)
def __shifDown(self,k):
while k * 2 <= self.count:
j = k * 2
if self.count >= j + 1 and self.data[j + 1] > self.data[j]:
j += 1
if self.data[k] > self.data[j]:
break
self.data[k], self.data[j] = self.data[j],self.data[k]
k = j
def extractMax(self):
ret = self.data[1]
self.data[1], self.data[self.count] = self.data[self.count], self.data[1]
self.count -= 1
self.__shifDown(1)
return ret
if __name__ == '__main__':
N = 31
M = 100
heap = Maxheap(N)
for i in range(0,N):
k = random.randint(1, M)
heap.insert(k)
# arr = [random.randint(1,M) for i in range(N)]
# heap = Maxheap(len(arr),arr)
print(heap.size())
print(heap.data)
print(heap.extractMax())
| 24.076923 | 84 | 0.426289 | 281 | 2,191 | 3.245552 | 0.263345 | 0.201754 | 0.059211 | 0.048246 | 0.222588 | 0.157895 | 0.072368 | 0.072368 | 0.057018 | 0.057018 | 0 | 0.076735 | 0.440895 | 2,191 | 90 | 85 | 24.344444 | 0.667755 | 0.219534 | 0 | 0 | 0 | 0 | 0.004869 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.14 | false | 0 | 0.02 | 0.04 | 0.24 | 0.06 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b3e3c2d633954d06881dc1103a976a7248201f2 | 585 | py | Python | ink2canvas/svg/Use.py | greipfrut/pdftohtml5canvas | bd4b829a5fd02b503e6b32c268b265daa92e92e5 | [
"MIT"
] | 4 | 2016-05-06T21:29:39.000Z | 2020-02-25T08:47:48.000Z | ink2canvas/svg/Use.py | letw/pdftohtml5canvas | bd4b829a5fd02b503e6b32c268b265daa92e92e5 | [
"MIT"
] | null | null | null | ink2canvas/svg/Use.py | letw/pdftohtml5canvas | bd4b829a5fd02b503e6b32c268b265daa92e92e5 | [
"MIT"
] | null | null | null | from ink2canvas.svg.AbstractShape import AbstractShape
class Use(AbstractShape):
def drawClone(self):
drawables = self.rootTree.getDrawable()
OriginName = self.getCloneId()
OriginObject = self.rootTree.searchElementById(OriginName,drawables)
OriginObject.runDraw()
def draw(self, isClip=False):
if self.hasTransform():
transMatrix = self.getTransform()
self.canvasContext.transform(*transMatrix)
self.drawClone()
def getCloneId(self):
return self.attr("href","xlink")[1:] | 32.5 | 76 | 0.647863 | 53 | 585 | 7.150943 | 0.603774 | 0.063325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004545 | 0.247863 | 585 | 18 | 77 | 32.5 | 0.856818 | 0 | 0 | 0 | 0 | 0 | 0.015358 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.071429 | 0.071429 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b454d373a4daf57bd5eb97d08752d3322beb78a | 6,146 | py | Python | bcgs/disqus_objects.py | aeturnum/bcgs | e5ae4c9f4cdd45b47615f00581dcc3792c281ea3 | [
"MIT"
] | null | null | null | bcgs/disqus_objects.py | aeturnum/bcgs | e5ae4c9f4cdd45b47615f00581dcc3792c281ea3 | [
"MIT"
] | null | null | null | bcgs/disqus_objects.py | aeturnum/bcgs | e5ae4c9f4cdd45b47615f00581dcc3792c281ea3 | [
"MIT"
] | null | null | null | import requests
import aiohttp
from constants import API_KEY
class User(object):
def __init__(self, author_info):
# "author": {
# "about": "",
# "avatar": {
# "cache": "//a.disquscdn.com/1519942534/images/noavatar92.png",
# "isCustom": false,
# "large": {
# "cache": "//a.disquscdn.com/1519942534/images/noavatar92.png",
# "permalink": "https://disqus.com/api/users/avatars/felix1999.jpg"
# },
# "permalink": "https://disqus.com/api/users/avatars/felix1999.jpg",
# "small": {
# "cache": "//a.disquscdn.com/1519942534/images/noavatar32.png",
# "permalink": "https://disqus.com/api/users/avatars/felix1999.jpg"
# }
# },
# "disable3rdPartyTrackers": false,
# "id": "5472588",
# "isAnonymous": false,
# "isPowerContributor": false,
# "isPrimary": true,
# "isPrivate": true,
# "joinedAt": "2010-11-20T04:45:33",
# "location": "",
# "name": "felix1999",
# "profileUrl": "https://disqus.com/by/felix1999/",
# "signedUrl": "",
# "url": "",
# "username": "felix1999"
# },
self._basic_info = author_info
self._detailed_info = None
async def load(self):
async with aiohttp.ClientSession(connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
user_info = await session.get(
'https://disqus.com/api/3.0/users/details.json',
params={'user': self.id, 'api_key': API_KEY}
)
detail_json = await user_info.json()
if detail_json['code'] != 0:
print(f'Problem with getting user details from user {self.id}')
print(detail_json)
self._detailed_info = detail_json['response']
def _get_detailed_info(self):
# https://disqus.com/api/3.0/users/details.json?user=137780765&api_key=E8Uh5l5fHZ6gD8U3KycjAIAk46f68Zw7C6eW8WSjZvCLXebZ7p0r1yrYDrLilk2F
# {
# "code": 0,
# "response": {
# "about": "",
# "avatar": {
# "cache": "https://c.disquscdn.com/uploads/users/13778/765/avatar92.jpg?1433896551",
# "isCustom": true,
# "large": {
# "cache": "https://c.disquscdn.com/uploads/users/13778/765/avatar92.jpg?1433896551",
# "permalink": "https://disqus.com/api/users/avatars/disqus_FqhLpDGmTT.jpg"
# },
# "permalink": "https://disqus.com/api/users/avatars/disqus_FqhLpDGmTT.jpg",
# "small": {
# "cache": "https://c.disquscdn.com/uploads/users/13778/765/avatar32.jpg?1433896551",
# "permalink": "https://disqus.com/api/users/avatars/disqus_FqhLpDGmTT.jpg"
# }
# },
# "disable3rdPartyTrackers": false,
# "id": "137780765",
# "isAnonymous": false,
# "isPowerContributor": false,
# "isPrimary": true,
# "isPrivate": false,
# "joinedAt": "2015-01-02T18:40:14",
# "location": "",
# "name": "Bob",
# "numFollowers": 2,
# "numFollowing": 0,
# "numForumsFollowing": 0,
# "numLikesReceived": 8967,
# "numPosts": 4147,
# "profileUrl": "https://disqus.com/by/disqus_FqhLpDGmTT/",
# "rep": 3.5297520000000002,
# "reputation": 3.5297520000000002,
# "reputationLabel": "High",
# "signedUrl": "",
# "url": "",
# "username": "disqus_FqhLpDGmTT"
# }
# }
print("WARNING: auto-loading user in async version of code!!!!")
details = requests.get(
'https://disqus.com/api/3.0/users/details.json',
{'user': self.id, 'api_key': API_KEY}
)
detail_json = details.json()
if detail_json['code'] != 0:
print(f'Problem with getting user details from user {self.id}')
print(detail_json)
self._detailed_info = detail_json['response']
@property
def anonymous(self):
return 'id' not in self._basic_info
@property
def private(self):
return self.anonymous or self._basic_info.get('isPrivate')
@property
def id(self):
if self.private:
return 'Private'
return self._basic_info.get('id', 'Anonymous')
@property
def name(self):
return self._basic_info.get('name')
@property
def username(self):
return self._basic_info.get('username')
@property
def location(self):
return self._basic_info.get('location')
@property
def joined_at(self):
return self._basic_info.get('joinedAt')
@property
def profile_url(self):
return self._basic_info.get('profileUrl')
@property
def total_posts(self):
if self._detailed_info is None:
self._get_detailed_info()
return self._detailed_info.get('numPosts')
@property
def total_likes(self):
if self._detailed_info is None:
self._get_detailed_info()
return self._detailed_info.get('numLikesReceived')
@property
def user_info_row(self):
return [
self.id,
self.name,
self.username,
self.total_posts,
self.total_likes,
self.location,
self.joined_at,
self.profile_url
]
| 36.802395 | 143 | 0.493817 | 548 | 6,146 | 5.394161 | 0.24635 | 0.040934 | 0.052097 | 0.051759 | 0.515223 | 0.490189 | 0.434709 | 0.393437 | 0.362652 | 0.310893 | 0 | 0.064012 | 0.374715 | 6,146 | 166 | 144 | 37.024096 | 0.705178 | 0.432314 | 0 | 0.324675 | 0 | 0 | 0.113583 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.168831 | false | 0 | 0.038961 | 0.103896 | 0.376623 | 0.064935 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
6b496440b1b757ff1f65cdc922e139b550fcb6ef | 473 | py | Python | setup.py | aagaard/dbservice | 47daadab307e6744ef151dd4e0aacff27dcda881 | [
"MIT"
] | 1 | 2020-04-27T16:30:50.000Z | 2020-04-27T16:30:50.000Z | setup.py | aagaard/dbservice | 47daadab307e6744ef151dd4e0aacff27dcda881 | [
"MIT"
] | null | null | null | setup.py | aagaard/dbservice | 47daadab307e6744ef151dd4e0aacff27dcda881 | [
"MIT"
] | 1 | 2021-01-13T02:16:56.000Z | 2021-01-13T02:16:56.000Z | #!/usr/bin/env python3
# -*- encoding: utf-8 -*-
"""
Setup for the dbservice
"""
from setuptools import setup, find_packages
setup(
name='dbservice',
version='0.9',
description="Database service for storing meter data",
author="Søren Aagaard Mikkelsen",
author_email='smik@eng.au.dk',
url='https://github.com/dbservice/dbservice',
packages=find_packages(),
package_data={'': ['static/*.*', 'templates/*.*']},
scripts=['manage.py'],
)
| 22.52381 | 58 | 0.646934 | 56 | 473 | 5.392857 | 0.803571 | 0.07947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010101 | 0.162791 | 473 | 20 | 59 | 23.65 | 0.752525 | 0.145877 | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6b4af341d1bd006f2df5874fa788b8866cb5c77d | 800 | py | Python | venv/lib/python3.6/site-packages/ansible_collections/junipernetworks/junos/plugins/module_utils/network/junos/argspec/facts/facts.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | 1 | 2020-01-22T13:11:23.000Z | 2020-01-22T13:11:23.000Z | venv/lib/python3.6/site-packages/ansible_collections/junipernetworks/junos/plugins/module_utils/network/junos/argspec/facts/facts.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | 12 | 2020-02-21T07:24:52.000Z | 2020-04-14T09:54:32.000Z | venv/lib/python3.6/site-packages/ansible_collections/junipernetworks/junos/plugins/module_utils/network/junos/argspec/facts/facts.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | null | null | null | #
# -*- coding: utf-8 -*-
# Copyright 2019 Red Hat
# GNU General Public License v3.0+
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
"""
The arg spec for the junos facts module.
"""
from __future__ import absolute_import, division, print_function
__metaclass__ = type
class FactsArgs(object):
""" The arg spec for the junos facts module
"""
def __init__(self, **kwargs):
pass
argument_spec = {
"gather_subset": dict(
default=["!config"], type="list", elements="str"
),
"config_format": dict(
default="text", choices=["xml", "text", "set", "json"]
),
"gather_network_resources": dict(type="list", elements="str"),
"available_network_resources": {"type": "bool", "default": False},
}
| 25.806452 | 74 | 0.60625 | 95 | 800 | 4.884211 | 0.705263 | 0.025862 | 0.043103 | 0.056034 | 0.137931 | 0.137931 | 0.137931 | 0.137931 | 0 | 0 | 0 | 0.014706 | 0.235 | 800 | 30 | 75 | 26.666667 | 0.743464 | 0.2775 | 0 | 0.133333 | 0 | 0 | 0.233929 | 0.091071 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0.066667 | 0.066667 | 0 | 0.266667 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
860a91391db83eb979d9849bfe427e0dbb8bf3eb | 1,171 | py | Python | helpers.py | owenjones/CaBot | dd47c077b21cbcf52c0ffd2e30b47fb736a41ebc | [
"MIT"
] | 3 | 2020-03-26T11:43:40.000Z | 2021-12-27T18:26:06.000Z | helpers.py | owenjones/CaBot | dd47c077b21cbcf52c0ffd2e30b47fb736a41ebc | [
"MIT"
] | 2 | 2021-05-14T01:31:12.000Z | 2021-08-23T16:07:44.000Z | helpers.py | owenjones/CaBot | dd47c077b21cbcf52c0ffd2e30b47fb736a41ebc | [
"MIT"
] | 1 | 2020-04-22T19:06:43.000Z | 2020-04-22T19:06:43.000Z | from server import roles
def hasRole(member, roleID):
role = member.guild.get_role(roleID)
return role in member.roles
def gainedRole(before, after, roleID):
role = before.guild.get_role(roleID)
return (role not in before.roles) and (role in after.roles)
def isExplorer(ctx):
return hasRole(ctx.author, roles["explorer"])
def isNetwork(ctx):
return hasRole(ctx.author, roles["network"])
def isLeader(ctx):
return hasRole(ctx.author, roles["leader"])
def isAdmin(ctx):
return hasRole(ctx.author, roles["admin"])
def isBot(ctx):
return hasRole(ctx.author, roles["bot"])
class Colours:
DEFAULT = 0
AQUA = 1752220
GREEN = 3066993
BLUE = 3447003
PURPLE = 10181046
GOLD = 15844367
ORANGE = 15105570
RED = 15158332
GREY = 9807270
DARKER_GREY = 8359053
NAVY = 3426654
DARK_AQUA = 1146986
DARK_GREEN = 2067276
DARK_BLUE = 2123412
DARK_PURPLE = 7419530
DARK_GOLD = 12745742
DARK_ORANGE = 11027200
DARK_RED = 10038562
DARK_GREY = 9936031
LIGHT_GREY = 12370112
DARK_NAVY = 2899536
LUMINOUS_VIVID_PINK = 16580705
DARK_VIVID_PINK = 12320855
| 20.189655 | 63 | 0.679761 | 149 | 1,171 | 5.228188 | 0.47651 | 0.057766 | 0.102696 | 0.121951 | 0.264442 | 0.264442 | 0 | 0 | 0 | 0 | 0 | 0.18313 | 0.230572 | 1,171 | 57 | 64 | 20.54386 | 0.681465 | 0 | 0 | 0 | 0 | 0 | 0.024765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.170732 | false | 0 | 0.02439 | 0.121951 | 0.95122 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
860b82a531bcd228b8d28c903681d9b70c4a8b49 | 2,793 | py | Python | topology.py | destinysky/nsh_sfc | 290fa49df2880527e0b7844bf3bec4d55c4945a6 | [
"Apache-2.0"
] | 2 | 2020-10-26T17:22:04.000Z | 2020-11-11T13:19:08.000Z | topology.py | destinysky/nsh_sfc | 290fa49df2880527e0b7844bf3bec4d55c4945a6 | [
"Apache-2.0"
] | null | null | null | topology.py | destinysky/nsh_sfc | 290fa49df2880527e0b7844bf3bec4d55c4945a6 | [
"Apache-2.0"
] | 3 | 2020-03-28T12:53:35.000Z | 2021-06-29T18:13:43.000Z | #!/usr/bin/python
"""
"""
from mininet.net import Mininet
from mininet.node import Controller, RemoteController, OVSKernelSwitch,UserSwitch
#OVSLegacyKernelSwitch, UserSwitch
from mininet.cli import CLI
from mininet.log import setLogLevel
from mininet.link import Link, TCLink
#conf_port=50000
conf_ip_1='10.0.0.254'
conf_mac_1='11:12:13:14:15:16'
def topology():
"Create a network."
net = Mininet( controller=RemoteController, link=TCLink, switch=OVSKernelSwitch )
print "*** Creating nodes"
h1 = net.addHost( 'h1', mac='00:00:00:00:00:01', ip='10.0.0.1/24' )
h2 = net.addHost( 'h2', mac='00:00:00:00:00:02', ip='10.0.0.2/24' )
h3 = net.addHost( 'h3', mac='00:00:00:00:00:03', ip='10.0.0.3/24' )
h4 = net.addHost( 'h4', mac='00:00:00:00:00:04', ip='10.0.0.4/24' )
h5 = net.addHost( 'h5', mac='00:00:00:00:00:05', ip='10.0.0.5/24' )
s1 = net.addSwitch( 's1', listenPort=6671 )
s2 = net.addSwitch( 's2', listenPort=6672 )
s3 = net.addSwitch( 's3', listenPort=6673 )
s4 = net.addSwitch( 's4', listenPort=6674 )
s5 = net.addSwitch( 's5', listenPort=6675 )
c1 = net.addController( 'c1', controller=RemoteController, ip='127.0.0.1', port=6633 )
print "*** Creating links"
net.addLink(s1, h1)
net.addLink(s2, h2)
net.addLink(s3, h3)
net.addLink(s4, h4)
net.addLink(s5, h5)
net.addLink(s1, s2)
net.addLink(s2, s3)
net.addLink(s3, s4)
net.addLink(s4, s5)
print "*** Starting network"
net.build()
h1.cmd('ip route add '+conf_ip_1+'/32 dev h1-eth0')
h1.cmd('sudo arp -i h1-eth0 -s '+conf_ip_1+' '+conf_mac_1)
h1.cmd('sysctl -w net.ipv4.ip_forward=1')
h1.cmd('python3 listen.py &')
h2.cmd('ip route add '+conf_ip_1+'/32 dev h2-eth0')
h2.cmd('sudo arp -i h2-eth0 -s '+conf_ip_1+' '+conf_mac_1)
h2.cmd('sysctl -w net.ipv4.ip_forward=1')
h2.cmd('python3 listen.py &')
h3.cmd('ip route add '+conf_ip_1+'/32 dev h3-eth0')
h3.cmd('sudo arp -i h3-eth0 -s '+conf_ip_1+' '+conf_mac_1)
h3.cmd('sysctl -w net.ipv4.ip_forward=1')
h3.cmd('python3 listen.py &')
h4.cmd('ip route add '+conf_ip_1+'/32 dev h4-eth0')
h4.cmd('sudo arp -i h4-eth0 -s '+conf_ip_1+' '+conf_mac_1)
h4.cmd('sysctl -w net.ipv4.ip_forward=1')
h4.cmd('python3 listen.py &')
h5.cmd('ip route add '+conf_ip_1+'/32 dev h5-eth0')
h5.cmd('sudo arp -i h5-eth0 -s '+conf_ip_1+' '+conf_mac_1)
h5.cmd('sysctl -w net.ipv4.ip_forward=1')
h5.cmd('python3 listen.py &')
c1.start()
s1.start( [c1] )
s2.start( [c1] )
s3.start( [c1] )
s4.start( [c1] )
s5.start( [c1] )
print "*** Running CLI"
CLI( net )
print "*** Stopping network"
net.stop()
if __name__ == '__main__':
setLogLevel( 'info' )
topology() | 30.358696 | 90 | 0.617257 | 474 | 2,793 | 3.535865 | 0.219409 | 0.047733 | 0.053699 | 0.047733 | 0.25358 | 0.25358 | 0.214797 | 0.214797 | 0.074582 | 0 | 0 | 0.125 | 0.192266 | 2,793 | 92 | 91 | 30.358696 | 0.617908 | 0.022914 | 0 | 0 | 0 | 0 | 0.304412 | 0.038603 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.074627 | null | null | 0.074627 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
860eaaee93a0cd4aceb0ebc7da1f6e2b65f05589 | 224 | py | Python | boa3_test/test_sc/event_test/EventNep5Transfer.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 25 | 2020-07-22T19:37:43.000Z | 2022-03-08T03:23:55.000Z | boa3_test/test_sc/event_test/EventNep5Transfer.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 419 | 2020-04-23T17:48:14.000Z | 2022-03-31T13:17:45.000Z | boa3_test/test_sc/event_test/EventNep5Transfer.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 15 | 2020-05-21T21:54:24.000Z | 2021-11-18T06:17:24.000Z | from boa3.builtin import public
from boa3.builtin.contract import Nep5TransferEvent
transfer = Nep5TransferEvent
@public
def Main(from_addr: bytes, to_addr: bytes, amount: int):
transfer(from_addr, to_addr, amount)
| 18.666667 | 56 | 0.785714 | 30 | 224 | 5.733333 | 0.5 | 0.093023 | 0.174419 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020725 | 0.138393 | 224 | 11 | 57 | 20.363636 | 0.870466 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
86161f7e9f969066db82c2f68d6e2be07cfb7ad1 | 3,694 | py | Python | src/falconpy/_endpoint/_filevantage.py | kra-ts/falconpy | c7c4ed93cb3b56cdfd86757f573fde57e4ccf857 | [
"Unlicense"
] | null | null | null | src/falconpy/_endpoint/_filevantage.py | kra-ts/falconpy | c7c4ed93cb3b56cdfd86757f573fde57e4ccf857 | [
"Unlicense"
] | null | null | null | src/falconpy/_endpoint/_filevantage.py | kra-ts/falconpy | c7c4ed93cb3b56cdfd86757f573fde57e4ccf857 | [
"Unlicense"
] | null | null | null | """Internal API endpoint constant library.
_______ __ _______ __ __ __
| _ .----.-----.--.--.--.--| | _ | |_.----|__| |--.-----.
|. 1___| _| _ | | | | _ | 1___| _| _| | <| -__|
|. |___|__| |_____|________|_____|____ |____|__| |__|__|__|_____|
|: 1 | |: 1 |
|::.. . | CROWDSTRIKE FALCON |::.. . | FalconPy
`-------' `-------'
OAuth2 API - Customer SDK
This is free and unencumbered software released into the public domain.
Anyone is free to copy, modify, publish, use, compile, sell, or
distribute this software, either in source code form or as a compiled
binary, for any purpose, commercial or non-commercial, and by any
means.
In jurisdictions that recognize copyright laws, the author or authors
of this software dedicate any and all copyright interest in the
software to the public domain. We make this dedication for the benefit
of the public at large and to the detriment of our heirs and
successors. We intend this dedication to be an overt act of
relinquishment in perpetuity of all present and future rights to this
software under copyright law.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
For more information, please refer to <https://unlicense.org>
"""
_filevantage_endpoints = [
[
"getChanges",
"GET",
"/filevantage/entities/changes/v2",
"Retrieve information on changes",
"filevantage",
[
{
"type": "array",
"items": {
"type": "string"
},
"collectionFormat": "multi",
"description": "Comma separated values of change ids",
"name": "ids",
"in": "query",
"required": True
}
]
],
[
"queryChanges",
"GET",
"/filevantage/queries/changes/v2",
"Returns one or more change IDs",
"filevantage",
[
{
"minimum": 0,
"type": "integer",
"description": "The first change index to return in the response. "
"If not provided it will default to '0'. "
"Use with the `limit` parameter to manage pagination of results.",
"name": "offset",
"in": "query"
},
{
"type": "integer",
"description": "The maximum number of changes to return in the response "
"(default: 100; max: 500). "
"Use with the `offset` parameter to manage pagination of results",
"name": "limit",
"in": "query"
},
{
"type": "string",
"description": "Sort changes using options like:\n\n"
"- `action_timestamp` (timestamp of the change occurrence) \n\n "
"Sort either `asc` (ascending) or `desc` (descending). "
"For example: `action_timestamp|asc`.\n"
"The full list of allowed sorting options can be reviewed in our API documentation.",
"name": "sort",
"in": "query"
},
{
"type": "string",
"description": "Filter changes using a query in Falcon Query Language (FQL). \n\n"
"Common filter options include:\n\n - `host.host_name`\n - `action_timestamp`\n\n "
"The full list of allowed filter parameters can be reviewed in our API documentation.",
"name": "filter",
"in": "query"
}
]
]
]
| 35.180952 | 95 | 0.600704 | 426 | 3,694 | 4.997653 | 0.469484 | 0.01644 | 0.0155 | 0.023485 | 0.139032 | 0.093001 | 0.073274 | 0.035698 | 0 | 0 | 0 | 0.005631 | 0.278831 | 3,694 | 104 | 96 | 35.519231 | 0.793544 | 0.457228 | 0 | 0.181818 | 0 | 0.015152 | 0.624124 | 0.055055 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
861a31bb111c594972aeb70c462a963cf1fefdb9 | 5,215 | py | Python | pysnmp/HH3C-PPPOE-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/HH3C-PPPOE-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/HH3C-PPPOE-SERVER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module HH3C-PPPOE-SERVER-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/HH3C-PPPOE-SERVER-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 19:16:17 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, SingleValueConstraint, ValueRangeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
hh3cCommon, = mibBuilder.importSymbols("HH3C-OID-MIB", "hh3cCommon")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
ObjectIdentity, Integer32, IpAddress, NotificationType, Unsigned32, iso, MibIdentifier, Counter64, Counter32, MibScalar, MibTable, MibTableRow, MibTableColumn, Gauge32, ModuleIdentity, Bits, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "Integer32", "IpAddress", "NotificationType", "Unsigned32", "iso", "MibIdentifier", "Counter64", "Counter32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Gauge32", "ModuleIdentity", "Bits", "TimeTicks")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
hh3cPPPoEServer = ModuleIdentity((1, 3, 6, 1, 4, 1, 25506, 2, 102))
hh3cPPPoEServer.setRevisions(('2009-05-06 00:00',))
if mibBuilder.loadTexts: hh3cPPPoEServer.setLastUpdated('200905060000Z')
if mibBuilder.loadTexts: hh3cPPPoEServer.setOrganization('Hangzhou H3C Technologies Co., Ltd.')
hh3cPPPoEServerObject = MibIdentifier((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1))
hh3cPPPoEServerMaxSessions = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hh3cPPPoEServerMaxSessions.setStatus('current')
hh3cPPPoEServerCurrSessions = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hh3cPPPoEServerCurrSessions.setStatus('current')
hh3cPPPoEServerAuthRequests = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hh3cPPPoEServerAuthRequests.setStatus('current')
hh3cPPPoEServerAuthSuccesses = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hh3cPPPoEServerAuthSuccesses.setStatus('current')
hh3cPPPoEServerAuthFailures = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hh3cPPPoEServerAuthFailures.setStatus('current')
hh3cPPPoESAbnormOffsThreshold = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hh3cPPPoESAbnormOffsThreshold.setStatus('current')
hh3cPPPoESAbnormOffPerThreshold = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hh3cPPPoESAbnormOffPerThreshold.setStatus('current')
hh3cPPPoESNormOffPerThreshold = MibScalar((1, 3, 6, 1, 4, 1, 25506, 2, 102, 1, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hh3cPPPoESNormOffPerThreshold.setStatus('current')
hh3cPPPoEServerTraps = MibIdentifier((1, 3, 6, 1, 4, 1, 25506, 2, 102, 2))
hh3cPPPoeServerTrapPrefix = MibIdentifier((1, 3, 6, 1, 4, 1, 25506, 2, 102, 2, 0))
hh3cPPPoESAbnormOffsAlarm = NotificationType((1, 3, 6, 1, 4, 1, 25506, 2, 102, 2, 0, 1))
if mibBuilder.loadTexts: hh3cPPPoESAbnormOffsAlarm.setStatus('current')
hh3cPPPoESAbnormOffPerAlarm = NotificationType((1, 3, 6, 1, 4, 1, 25506, 2, 102, 2, 0, 2))
if mibBuilder.loadTexts: hh3cPPPoESAbnormOffPerAlarm.setStatus('current')
hh3cPPPoESNormOffPerAlarm = NotificationType((1, 3, 6, 1, 4, 1, 25506, 2, 102, 2, 0, 3))
if mibBuilder.loadTexts: hh3cPPPoESNormOffPerAlarm.setStatus('current')
mibBuilder.exportSymbols("HH3C-PPPOE-SERVER-MIB", hh3cPPPoEServerMaxSessions=hh3cPPPoEServerMaxSessions, hh3cPPPoEServerObject=hh3cPPPoEServerObject, hh3cPPPoeServerTrapPrefix=hh3cPPPoeServerTrapPrefix, hh3cPPPoEServerAuthFailures=hh3cPPPoEServerAuthFailures, hh3cPPPoEServer=hh3cPPPoEServer, PYSNMP_MODULE_ID=hh3cPPPoEServer, hh3cPPPoESAbnormOffsAlarm=hh3cPPPoESAbnormOffsAlarm, hh3cPPPoEServerAuthRequests=hh3cPPPoEServerAuthRequests, hh3cPPPoEServerAuthSuccesses=hh3cPPPoEServerAuthSuccesses, hh3cPPPoESNormOffPerThreshold=hh3cPPPoESNormOffPerThreshold, hh3cPPPoEServerCurrSessions=hh3cPPPoEServerCurrSessions, hh3cPPPoEServerTraps=hh3cPPPoEServerTraps, hh3cPPPoESAbnormOffPerThreshold=hh3cPPPoESAbnormOffPerThreshold, hh3cPPPoESAbnormOffPerAlarm=hh3cPPPoESAbnormOffPerAlarm, hh3cPPPoESAbnormOffsThreshold=hh3cPPPoESAbnormOffsThreshold, hh3cPPPoESNormOffPerAlarm=hh3cPPPoESNormOffPerAlarm)
| 115.888889 | 892 | 0.797124 | 519 | 5,215 | 8.00578 | 0.263969 | 0.007702 | 0.01083 | 0.01444 | 0.351865 | 0.282551 | 0.222383 | 0.222383 | 0.222383 | 0.218773 | 0 | 0.091964 | 0.074209 | 5,215 | 44 | 893 | 118.522727 | 0.768641 | 0.06443 | 0 | 0 | 0 | 0 | 0.141889 | 0.013347 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.189189 | 0 | 0.189189 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
861a472cf4ef7f924185a3fe1ea6569338502257 | 2,041 | py | Python | Pyshare2019/02 - if + Nesteed if/Nesteed-IF.py | suhaili99/python-share | 6c65faaff722b8bd9e381650a6b277f56d1ae4c9 | [
"MIT"
] | 4 | 2019-10-21T11:00:55.000Z | 2020-10-22T16:11:21.000Z | Pyshare2019/02 - if + Nesteed if/Nesteed-IF.py | suhaili99/python-share | 6c65faaff722b8bd9e381650a6b277f56d1ae4c9 | [
"MIT"
] | 1 | 2019-12-17T05:20:26.000Z | 2019-12-17T05:20:26.000Z | Pyshare2019/02 - if + Nesteed if/Nesteed-IF.py | suhaili99/python-share | 6c65faaff722b8bd9e381650a6b277f56d1ae4c9 | [
"MIT"
] | 9 | 2019-10-20T05:48:03.000Z | 2020-11-17T14:08:14.000Z | name = input("masukkan nama pembeli = ")
alamat= input("Alamat = ")
NoTelp = input("No Telp = ")
print("\n")
print("=================INFORMASI HARGA MOBIL DEALER JAYA ABADI===============")
print("Pilih Jenis Mobil :")
print("\t 1.Daihatsu ")
print("\t 2.Honda ")
print("\t 3.Toyota ")
print("")
pilihan = int(input("Pilih jenis mobil yang ingin dibeli : "))
print("")
if (pilihan==1):
print("<<<<<<<< Macam macam mobil pada Daihatsu >>>>>>>>>")
print("\ta.Grand New Xenia")
print("\tb.All New Terios")
print("\tc.New Ayla")
Pilih1 = input("Mana yang ingin anda pilih ?? = ")
if(Pilih1 == "a"):
print("Harga mobil Grand New Xenia adalah 183 juta ")
elif(Pilih1== "b"):
print("Harga mobil All New Terios adalah 215 juta")
elif(Pilih1== "c"):
print("Harga mobil New Ayla adalah 110 juta")
else:
print("Tidak terdefinisi")
elif (pilihan==2):
print("<<<<<<<< Macam macam mobil pada Honda >>>>>>>>>")
print("\ta.Honda Brio Satya S")
print("\tb.Honda Jazz ")
print("\tb.Honda Mobilio ")
pilih2 = input("Mana yang ingin anda pilih??")
if(pilih2=="a"):
print("Harga mobil HOnda Brio Satya S adalah 131 juta")
elif(pilih2=="b"):
print("Harga mobil Honda Jazz adalah 232 juta")
elif(pilih2=="c"):
print("Harga mobil Honda mobilio adalah 189 juta")
else:
print("Tidak terdefinisi")
elif (pilihan==3):
print("<<<<<<<< Macam macam mobil pada Toyota>>>>>>>>?")
print("\ta.Alphard")
print("\tb.Camry")
print("\tc.Fortuner")
pilih3 = input("Mana yang ingin anda pilih??")
if (pilih3=="a"):
print("Harga mobil Alphard adalah 870 juta")
elif (pilih3=="b"):
print("Harga mobil Camry adalah 560 Juta")
elif (pilih3=="c"):
print("Harga mobil Fortuner adalah 492 Juta")
| 34.59322 | 80 | 0.529152 | 241 | 2,041 | 4.481328 | 0.307054 | 0.092593 | 0.125 | 0.055556 | 0.221296 | 0.15463 | 0.15463 | 0 | 0 | 0 | 0 | 0.031228 | 0.293974 | 2,041 | 58 | 81 | 35.189655 | 0.718251 | 0 | 0 | 0.115385 | 0 | 0 | 0.476237 | 0.012739 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.596154 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
861d74d55db578d9eef6b283f432f055362e839e | 975 | py | Python | examples/given_data.py | GuoJingyao/cornac | e7529990ec1dfa586c4af3de98e4b3e00a786578 | [
"Apache-2.0"
] | null | null | null | examples/given_data.py | GuoJingyao/cornac | e7529990ec1dfa586c4af3de98e4b3e00a786578 | [
"Apache-2.0"
] | null | null | null | examples/given_data.py | GuoJingyao/cornac | e7529990ec1dfa586c4af3de98e4b3e00a786578 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Example to train and evaluate a model with given data
@author: Quoc-Tuan Truong <tuantq.vnu@gmail.com>
"""
from cornac.data import Reader
from cornac.eval_methods import BaseMethod
from cornac.models import MF
from cornac.metrics import MAE, RMSE
from cornac.utils import cache
# Download MovieLens 100K provided training and test splits
reader = Reader()
train_data = reader.read(cache(url='http://files.grouplens.org/datasets/movielens/ml-100k/u1.base'))
test_data = reader.read(cache(url='http://files.grouplens.org/datasets/movielens/ml-100k/u1.test'))
eval_method = BaseMethod.from_splits(train_data=train_data, test_data=test_data,
exclude_unknowns=False, verbose=True)
mf = MF(k=10, max_iter=25, learning_rate=0.01, lambda_reg=0.02,
use_bias=True, early_stop=True, verbose=True)
# Evaluation
result = eval_method.evaluate(model=mf, metrics=[MAE(), RMSE()], user_based=True)
print(result)
| 33.62069 | 100 | 0.73641 | 146 | 975 | 4.80137 | 0.541096 | 0.071327 | 0.039943 | 0.054208 | 0.194009 | 0.194009 | 0.194009 | 0.194009 | 0.194009 | 0.194009 | 0 | 0.026222 | 0.139487 | 975 | 28 | 101 | 34.821429 | 0.809297 | 0.2 | 0 | 0 | 0 | 0 | 0.158442 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.357143 | 0 | 0.357143 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
861f5e4cfdc98de2a394371bb5b02dd397322979 | 203 | py | Python | Python/Fibonacci.py | kennethsequeira/Hello-world | 464227bc7d9778a4a2a4044fe415a629003ea77f | [
"MIT"
] | 1 | 2018-12-19T11:42:09.000Z | 2018-12-19T11:42:09.000Z | Python/Fibonacci.py | kennethsequeira/Hello-world | 464227bc7d9778a4a2a4044fe415a629003ea77f | [
"MIT"
] | 1 | 2019-10-25T09:19:21.000Z | 2019-10-25T09:19:21.000Z | Python/Fibonacci.py | kennethsequeira/Hello-world | 464227bc7d9778a4a2a4044fe415a629003ea77f | [
"MIT"
] | 7 | 2019-09-11T07:17:32.000Z | 2019-09-25T12:23:52.000Z | #Doesn't work.
import time
fibonacci = [1, 1]
n = int(input())
while len(fibonacci) < n:
fibonacci.append(fibonacci[-1] + fibonacci[-2])
for i in range(n):
print(fibonacci[i], end=' ')
| 20.3 | 52 | 0.605911 | 30 | 203 | 4.1 | 0.666667 | 0.162602 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.211823 | 203 | 9 | 53 | 22.555556 | 0.74375 | 0.064039 | 0 | 0 | 0 | 0 | 0.005556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86205fe9ef8a0c045201301f18357ead5b9c92fc | 6,081 | py | Python | py/_log/log.py | EnjoyLifeFund/py36pkgs | 0ac677fbbfa7b6d8c527fe2c759ba05117b07fd2 | [
"MIT",
"BSD-2-Clause",
"BSD-3-Clause"
] | 2 | 2018-03-14T06:45:40.000Z | 2018-06-08T07:46:02.000Z | py/_log/log.py | EnjoyLifeFund/py36pkgs | 0ac677fbbfa7b6d8c527fe2c759ba05117b07fd2 | [
"MIT",
"BSD-2-Clause",
"BSD-3-Clause"
] | 1 | 2019-03-23T08:23:21.000Z | 2019-03-23T08:23:21.000Z | py/_log/log.py | EnjoyLifeFund/py36pkgs | 0ac677fbbfa7b6d8c527fe2c759ba05117b07fd2 | [
"MIT",
"BSD-2-Clause",
"BSD-3-Clause"
] | 2 | 2017-11-07T18:05:19.000Z | 2017-11-14T18:06:55.000Z | """
basic logging functionality based on a producer/consumer scheme.
XXX implement this API: (maybe put it into slogger.py?)
log = Logger(
info=py.log.STDOUT,
debug=py.log.STDOUT,
command=None)
log.info("hello", "world")
log.command("hello", "world")
log = Logger(info=Logger(something=...),
debug=py.log.STDOUT,
command=None)
"""
import py, sys
class Message(object):
def __init__(self, keywords, args):
self.keywords = keywords
self.args = args
def content(self):
return " ".join(map(str, self.args))
def prefix(self):
return "[%s] " % (":".join(self.keywords))
def __str__(self):
return self.prefix() + self.content()
class Producer(object):
""" (deprecated) Log producer API which sends messages to be logged
to a 'consumer' object, which then prints them to stdout,
stderr, files, etc. Used extensively by PyPy-1.1.
"""
Message = Message # to allow later customization
keywords2consumer = {}
def __init__(self, keywords, keywordmapper=None, **kw):
if hasattr(keywords, 'split'):
keywords = tuple(keywords.split())
self._keywords = keywords
if keywordmapper is None:
keywordmapper = default_keywordmapper
self._keywordmapper = keywordmapper
def __repr__(self):
return "<py.log.Producer %s>" % ":".join(self._keywords)
def __getattr__(self, name):
if '_' in name:
raise AttributeError(name)
producer = self.__class__(self._keywords + (name,))
setattr(self, name, producer)
return producer
def __call__(self, *args):
""" write a message to the appropriate consumer(s) """
func = self._keywordmapper.getconsumer(self._keywords)
if func is not None:
func(self.Message(self._keywords, args))
class KeywordMapper:
def __init__(self):
self.keywords2consumer = {}
def getstate(self):
return self.keywords2consumer.copy()
def setstate(self, state):
self.keywords2consumer.clear()
self.keywords2consumer.update(state)
def getconsumer(self, keywords):
""" return a consumer matching the given keywords.
tries to find the most suitable consumer by walking, starting from
the back, the list of keywords, the first consumer matching a
keyword is returned (falling back to py.log.default)
"""
for i in range(len(keywords), 0, -1):
try:
return self.keywords2consumer[keywords[:i]]
except KeyError:
continue
return self.keywords2consumer.get('default', default_consumer)
def setconsumer(self, keywords, consumer):
""" set a consumer for a set of keywords. """
# normalize to tuples
if isinstance(keywords, str):
keywords = tuple(filter(None, keywords.split()))
elif hasattr(keywords, '_keywords'):
keywords = keywords._keywords
elif not isinstance(keywords, tuple):
raise TypeError("key %r is not a string or tuple" % (keywords,))
if consumer is not None and not py.builtin.callable(consumer):
if not hasattr(consumer, 'write'):
raise TypeError(
"%r should be None, callable or file-like" % (consumer,))
consumer = File(consumer)
self.keywords2consumer[keywords] = consumer
def default_consumer(msg):
""" the default consumer, prints the message to stdout (using 'print') """
sys.stderr.write(str(msg)+"\n")
default_keywordmapper = KeywordMapper()
def setconsumer(keywords, consumer):
default_keywordmapper.setconsumer(keywords, consumer)
def setstate(state):
default_keywordmapper.setstate(state)
def getstate():
return default_keywordmapper.getstate()
#
# Consumers
#
class File(object):
""" log consumer wrapping a file(-like) object """
def __init__(self, f):
assert hasattr(f, 'write')
#assert isinstance(f, file) or not hasattr(f, 'open')
self._file = f
def __call__(self, msg):
""" write a message to the log """
self._file.write(str(msg) + "\n")
if hasattr(self._file, 'flush'):
self._file.flush()
class Path(object):
""" log consumer that opens and writes to a Path """
def __init__(self, filename, append=False,
delayed_create=False, buffering=False):
self._append = append
self._filename = str(filename)
self._buffering = buffering
if not delayed_create:
self._openfile()
def _openfile(self):
mode = self._append and 'a' or 'w'
f = open(self._filename, mode)
self._file = f
def __call__(self, msg):
""" write a message to the log """
if not hasattr(self, "_file"):
self._openfile()
self._file.write(str(msg) + "\n")
if not self._buffering:
self._file.flush()
def STDOUT(msg):
""" consumer that writes to sys.stdout """
sys.stdout.write(str(msg)+"\n")
def STDERR(msg):
""" consumer that writes to sys.stderr """
sys.stderr.write(str(msg)+"\n")
class Syslog:
""" consumer that writes to the syslog daemon """
def __init__(self, priority = None):
if priority is None:
priority = self.LOG_INFO
self.priority = priority
def __call__(self, msg):
""" write a message to the log """
py.std.syslog.syslog(self.priority, str(msg))
for _prio in "EMERG ALERT CRIT ERR WARNING NOTICE INFO DEBUG".split():
_prio = "LOG_" + _prio
try:
setattr(Syslog, _prio, getattr(py.std.syslog, _prio))
except AttributeError:
pass
| 32.518717 | 79 | 0.587568 | 682 | 6,081 | 5.093842 | 0.256598 | 0.037997 | 0.018998 | 0.017271 | 0.107369 | 0.090674 | 0.048071 | 0.035406 | 0.035406 | 0.035406 | 0 | 0.002831 | 0.302911 | 6,081 | 186 | 80 | 32.693548 | 0.816702 | 0.233021 | 0 | 0.132743 | 0 | 0 | 0.04698 | 0 | 0 | 0 | 0 | 0 | 0.00885 | 1 | 0.230089 | false | 0.00885 | 0.00885 | 0.053097 | 0.389381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86207cdb07326bc532b6b5b79d11a692b3f498c4 | 1,696 | py | Python | test/test_all_contacts.py | Sergggio/python_training | 6dfdbed9a503cf9a6810b31c57bdde76b15e4ec4 | [
"Apache-2.0"
] | null | null | null | test/test_all_contacts.py | Sergggio/python_training | 6dfdbed9a503cf9a6810b31c57bdde76b15e4ec4 | [
"Apache-2.0"
] | null | null | null | test/test_all_contacts.py | Sergggio/python_training | 6dfdbed9a503cf9a6810b31c57bdde76b15e4ec4 | [
"Apache-2.0"
] | null | null | null | import re
from model.contact import Contact
def test_all_contacts(app, db):
contacts_from_db = db.get_contact_list()
phone_list_from_db = db.phones_from_db()
#email_liset_from_db = db.emails_from_db()
phone_list = []
for phone in phone_list_from_db:
phone_list.append(merge_phones_like_on_home_page(phone))
email_list = []
#for email in email_liset_from_db:
# email_list.append(merge_mail_like_on_home_page(email))
contacts_from_home_page = sorted(app.contact.get_contact_list(), key=Contact.id_or_max)
phones_from_home_page = [con.all_phones_from_home_page for con in contacts_from_home_page]
#emails_from_home_page = [con.all_mail_from_home_page for con in contacts_from_home_page]
assert phone_list == phones_from_home_page
#assert email_list == emails_from_home_page
assert contacts_from_db == contacts_from_home_page
def clear(s):
return re.sub("[() -]", "", s)
def remove_spaces(s):
return re.sub(' +', ' ', s).rstrip()
def merge_phones_like_on_home_page(contact):
return "\n".join(filter(lambda x: x != "",
map(lambda x: clear(x),
filter(lambda x: x is not None,
[contact.home_phone, contact.mobile_phone,
contact.work_phone, contact.secondary_phone]))))
def merge_email_like_on_home_page(contact):
return "\n".join(filter(lambda x: x != "",
map(lambda x: remove_spaces(x),
filter(lambda x: x is not None,
[contact.email, contact.email2, contact.email3]))))
| 38.545455 | 94 | 0.635024 | 234 | 1,696 | 4.209402 | 0.235043 | 0.113706 | 0.121827 | 0.056853 | 0.35736 | 0.294416 | 0.257868 | 0.257868 | 0.257868 | 0.194924 | 0 | 0.001608 | 0.266509 | 1,696 | 43 | 95 | 39.44186 | 0.790193 | 0.154481 | 0 | 0.142857 | 0 | 0 | 0.009097 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 1 | 0.178571 | false | 0 | 0.071429 | 0.142857 | 0.392857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
86226920fab3327506a58d2f239e976d2e4d87d4 | 634 | py | Python | games/migrations/0002_auto_20201026_1221.py | IceArrow256/game-list | 5f06e0ff80023acdc0290a9a8f814f7c93b45e0e | [
"Unlicense"
] | 3 | 2020-10-19T12:33:37.000Z | 2020-10-21T05:28:35.000Z | games/migrations/0002_auto_20201026_1221.py | IceArrow256/gamelist | 5f06e0ff80023acdc0290a9a8f814f7c93b45e0e | [
"Unlicense"
] | null | null | null | games/migrations/0002_auto_20201026_1221.py | IceArrow256/gamelist | 5f06e0ff80023acdc0290a9a8f814f7c93b45e0e | [
"Unlicense"
] | null | null | null | # Generated by Django 3.1.2 on 2020-10-26 12:21
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('games', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='game',
name='score',
field=models.FloatField(null=True, verbose_name='Score'),
),
migrations.AlterField(
model_name='game',
name='series',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='games.series'),
),
]
| 25.36 | 111 | 0.600946 | 69 | 634 | 5.449275 | 0.57971 | 0.06383 | 0.074468 | 0.117021 | 0.196809 | 0.196809 | 0 | 0 | 0 | 0 | 0 | 0.041215 | 0.272871 | 634 | 24 | 112 | 26.416667 | 0.774403 | 0.070978 | 0 | 0.333333 | 1 | 0 | 0.09029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.277778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
862625f0bd5d6882a14a812018126e427778e14a | 11,603 | py | Python | build/lib.linux-x86_64-2.7_ucs4/mx/Misc/PackageTools.py | mkubux/egenix-mx-base | 3e6f9186334d9d73743b0219ae857564c7208247 | [
"eGenix"
] | null | null | null | build/lib.linux-x86_64-2.7_ucs4/mx/Misc/PackageTools.py | mkubux/egenix-mx-base | 3e6f9186334d9d73743b0219ae857564c7208247 | [
"eGenix"
] | null | null | null | build/lib.linux-x86_64-2.7_ucs4/mx/Misc/PackageTools.py | mkubux/egenix-mx-base | 3e6f9186334d9d73743b0219ae857564c7208247 | [
"eGenix"
] | null | null | null | """ PackageTools - A set of tools to aid working with packages.
Copyright (c) 1998-2000, Marc-Andre Lemburg; mailto:mal@lemburg.com
Copyright (c) 2000-2015, eGenix.com Software GmbH; mailto:info@egenix.com
See the documentation for further information on copyrights,
or contact the author. All Rights Reserved.
"""
__version__ = '0.4.0'
import os,types,sys,re,imp,__builtin__
import mx.Tools.NewBuiltins
# RE to identify Python modules
suffixes = projection(imp.get_suffixes(),0)
module_name = re.compile('(.*)(' + '|'.join(suffixes) + ')$')
initmodule_name = re.compile('__init__(' + '|'.join(suffixes) + ')$')
initmodule_names = []
for suffix in suffixes:
initmodule_names.append('__init__' + suffix)
def find_packages(dir=os.curdir, files_only=0, recursive=0, ignore_modules=0,
pkgbasename='', pkgdict=None,
isdir=os.path.isdir,exists=os.path.exists,
isfile=os.path.isfile,join=os.path.join,listdir=os.listdir,
module_name=module_name,initmodule_name=initmodule_name):
""" Return a list of package names found in dir.
Packages are Python modules and subdirectories that provide an
__init__ module. The .py extension is removed from the
files. The __init__ modules are not considered being seperate
packages.
If files_only is true, only Python files are included in the
search (subdirectories are *not* taken into account). If
ignore_modules is true (default is false), modules are
ignored. If recursive is true the search recurses into package
directories.
pkgbasename and pkgdict are only used during recursion.
"""
l = listdir(dir)
if pkgdict is None:
pkgdict = {}
if files_only:
for filename in l:
m = module_name.match(filename)
if m is not None and \
m.group(1) != '__init__':
pkgdict[pkgbasename + m.group(1)] = 1
else:
for filename in l:
path = join(dir, filename)
if isdir(path):
# Check for __init__ module(s)
for name in initmodule_names:
if isfile(join(path, name)):
pkgname = pkgbasename + filename
pkgdict[pkgname] = 1
if recursive:
find_packages(path,
recursive=1,
pkgbasename=pkgname + '.',
pkgdict=pkgdict)
break
elif not ignore_modules:
m = module_name.match(filename)
if m is not None and \
m.group(1) != '__init__':
pkgdict[pkgbasename + m.group(1)] = 1
return pkgdict.keys()
def find_subpackages(package, recursive=0,
splitpath=os.path.split):
""" Assuming that package points to a loaded package module, this
function tries to identify all subpackages of that package.
Subpackages are all Python files included in the same
directory as the module plus all subdirectories having an
__init__.py file. The modules name is prepended to all
subpackage names.
The module location is found by looking at the __file__
attribute that non-builtin modules define. The function uses
the __all__ attribute from the package __init__ module if
available.
If recursive is true (default is false), then subpackages of
subpackages are recursively also included in the search.
"""
if not recursive:
# Try the __all__ attribute...
try:
subpackages = list(package.__all__)
except (ImportError, AttributeError):
# Did not work, then let's try to find the subpackages by looking
# at the directory where package lives...
subpackages = find_packages(package.__path__[0], recursive=recursive)
else:
# XXX Recursive search does not support the __all__ attribute
subpackages = find_packages(package.__path__[0], recursive=recursive)
basename = package.__name__ + '.'
for i,name in irange(subpackages):
subpackages[i] = basename + name
return subpackages
def _thismodule(upcount=1,
exc_info=sys.exc_info,trange=trange):
""" Returns the module object that the callee is calling from.
upcount can be given to indicate how far up the execution
stack the function is supposed to look (1 == direct callee, 2
== callee of callee, etc.).
"""
try:
1/0
except:
frame = exc_info()[2].tb_frame
for i in trange(upcount):
frame = frame.f_back
name = frame.f_globals['__name__']
del frame
return sys.modules[name]
def _module_loader(name, locals, globals, sysmods, errors='strict',
importer=__import__, reloader=reload, from_list=['*']):
""" Internal API for loading a module
"""
if not sysmods.has_key(name):
is_new = 1
else:
is_new = 0
try:
mod = importer(name, locals, globals, from_list)
if reload and not is_new:
mod = reloader(mod)
except KeyboardInterrupt:
# Pass through; SystemExit will be handled by the error handler
raise
except Exception, why:
if errors == 'ignore':
pass
elif errors == 'strict':
raise
elif callable(errors):
errors(name, sys.exc_info()[0], sys.exc_info()[1])
else:
raise ValueError,'unknown errors value'
else:
return mod
return None
def import_modules(modnames,module=None,errors='strict',reload=0,
thismodule=_thismodule):
""" Import all modules given in modnames into module.
module defaults to the caller's module. modnames may contain
dotted package names.
If errors is 'strict' (default), then ImportErrors and
SyntaxErrors are raised. If set to 'ignore', they are silently
ignored. If errors is a callable object, then it is called
with arguments (modname, errorclass, errorvalue). If the
handler returns, processing continues.
If reload is true (default is false), all already modules
among the list will be forced to reload.
"""
if module is None:
module = _thismodule(2)
locals = module.__dict__
sysmods = sys.modules
for name in modnames:
mod = _module_loader(name, locals, locals, sysmods, errors=errors)
if mod is not None:
locals[name] = mod
def load_modules(modnames,locals=None,globals=None,errors='strict',reload=0):
""" Imports all modules in modnames using the given namespaces and returns
list of corresponding module objects.
If errors is 'strict' (default), then ImportErrors and
SyntaxErrors are raised. If set to 'ignore', they are silently
ignored. If errors is a callable object, then it is called
with arguments (modname, errorclass, errorvalue). If the
handler returns, processing continues.
If reload is true (default is false), all already modules
among the list will be forced to reload.
"""
modules = []
append = modules.append
sysmods = sys.modules
for name in modnames:
mod = _module_loader(name, locals, globals, sysmods, errors=errors)
if mod is not None:
append(mod)
return modules
def import_subpackages(module, reload=0, recursive=0,
import_modules=import_modules,
find_subpackages=find_subpackages):
""" Does a subpackages scan using find_subpackages(module) and then
imports all submodules found into module.
The module location is found by looking at the __file__
attribute that non-builtin modules define. The function uses
the __all__ attribute from the package __init__ module if
available.
If reload is true (default is false), all already modules
among the list will be forced to reload.
"""
import_modules(find_subpackages(module, recursive=recursive),
module, reload=reload)
def load_subpackages(module, locals=None, globals=None, errors='strict', reload=0,
recursive=0,
load_modules=load_modules,
find_subpackages=find_subpackages):
""" Same as import_subpackages but with load_modules
functionality, i.e. imports the modules and also returns a list of
module objects.
If errors is 'strict' (default), then ImportErrors are
raised. If set to 'ignore', they are silently ignored.
If reload is true (default is false), all already modules
among the list will be forced to reload.
"""
return load_modules(find_subpackages(module, recursive=recursive),
locals, globals,
errors=errors, reload=reload)
def modules(names,
extract=extract):
""" Converts a list of module names into a list of module objects.
The modules must already be loaded.
"""
return extract(sys.modules, names)
def package_modules(pkgname):
""" Returns a list of all modules belonging to the package with the
given name.
The package must already be loaded. Only the currently
registered modules are included in the list.
"""
match = pkgname + '.'
match_len = len(match)
mods = [sys.modules[pkgname]]
for k,v in sys.modules.items():
if k[:match_len] == match and v is not None:
mods.append(v)
return mods
def find_classes(mods,baseclass=None,annotated=0,
ClassType=types.ClassType,issubclass=issubclass):
""" Find all subclasses of baseclass or simply all classes (if baseclass
is None) defined by the module objects in list mods.
If annotated is true the returned list will contain tuples
(module_object,name,class_object) for each class found where
module_object is the module where the class is defined.
"""
classes = []
for mod in mods:
for name,obj in mod.__dict__.items():
if type(obj) is ClassType:
if baseclass and not issubclass(obj,baseclass):
continue
if annotated:
classes.append((mod, name, obj))
else:
classes.append(obj)
return classes
def find_instances(mods,baseclass,annotated=0,
InstanceType=types.InstanceType,issubclass=issubclass):
""" Find all instances of baseclass defined by the module objects
in list mods.
If annotated is true the returned list will contain tuples
(module_object,name,instances_object) for each instances found where
module_object is the module where the instances is defined.
"""
instances = []
for mod in mods:
for name,obj in mod.__dict__.items():
if isinstance(obj,baseclass):
if annotated:
instances.append((mod,name,obj))
else:
instances.append(obj)
return instances
| 35.375 | 82 | 0.613031 | 1,395 | 11,603 | 4.967025 | 0.21147 | 0.008659 | 0.011257 | 0.012989 | 0.340597 | 0.311156 | 0.297878 | 0.290807 | 0.254438 | 0.233223 | 0 | 0.00666 | 0.314143 | 11,603 | 327 | 83 | 35.48318 | 0.864036 | 0.026976 | 0 | 0.242038 | 0 | 0 | 0.018765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.006369 | 0.057325 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86273fb9e1a631cb61fc755f591bccb65bcc2063 | 553 | py | Python | gapipy/resources/tour/transport.py | wmak/gapipy | b6849606d4f6af24b9f871f65e87aaf0d0c013cc | [
"MIT"
] | null | null | null | gapipy/resources/tour/transport.py | wmak/gapipy | b6849606d4f6af24b9f871f65e87aaf0d0c013cc | [
"MIT"
] | null | null | null | gapipy/resources/tour/transport.py | wmak/gapipy | b6849606d4f6af24b9f871f65e87aaf0d0c013cc | [
"MIT"
] | null | null | null | # Python 2 and 3
from __future__ import unicode_literals
from ...models import Address, SeasonalPriceBand
from ..base import Product
class Transport(Product):
_resource_name = 'transports'
_is_listable = False
_as_is_fields = [
'id', 'href', 'availability', 'name', 'product_line', 'sku', 'type', 'sub_type'
]
_date_time_fields_utc = ['date_created', 'date_last_modified']
_model_fields = [('start_address', Address), ('finish_address', Address)]
_model_collection_fields = [('price_bands', SeasonalPriceBand)]
| 27.65 | 87 | 0.703436 | 63 | 553 | 5.714286 | 0.68254 | 0.077778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004357 | 0.169982 | 553 | 19 | 88 | 29.105263 | 0.779956 | 0.025316 | 0 | 0 | 0 | 0 | 0.236499 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
862dc531f725b524bb6846cb090205fc7468f382 | 1,166 | py | Python | src/backup/template/PositionalArgumentTemplate.py | ytyaru0/Python.TemplateFileMaker.20180314204216 | 4849f982acea5d86b711c5dec4cc046016ab1031 | [
"CC0-1.0"
] | null | null | null | src/backup/template/PositionalArgumentTemplate.py | ytyaru0/Python.TemplateFileMaker.20180314204216 | 4849f982acea5d86b711c5dec4cc046016ab1031 | [
"CC0-1.0"
] | null | null | null | src/backup/template/PositionalArgumentTemplate.py | ytyaru0/Python.TemplateFileMaker.20180314204216 | 4849f982acea5d86b711c5dec4cc046016ab1031 | [
"CC0-1.0"
] | null | null | null | from string import Template
import re
class PositionalArgumentTemplate(Template):
# (?i): 大文字小文字を区別しないモードを開始する
# (?-i): 大文字小文字を区別しないモードを無効にする
idpattern_default = Template.idpattern # (?-i:[_a-zA-Z][_a-zA-Z0-9]*)
idpattern = '([0-9]+)'
def find_place_holders(self, template:str):
#for m in re.findall(self.pattern, template):
#for m in re.finditer(self.pattern, template):
for m in self.pattern.finditer(template):
print(m, type(m))
#print(dir(m))
#print(len(m.groups()))
print(m[0])
#print(m.groups())
#print(m, m.groups(), m.group('named'), type(m))
#print(m.group('escaped'))
#print(m.group('named'))
#print(m.group('braced'))
#print(m.group('invalid'))
if __name__ == '__main__':
template_str = '${0} is Aug.'
t = PositionalArgumentTemplate(template_str)
print(template_str)
print(dir(t))
print(t.delimiter)
print(t.idpattern)
print(type(t.idpattern))
print(t.flags)
print(t.pattern)
print(t.substitute(**{'0':'V'}))
t.find_place_holders(template_str)
| 31.513514 | 73 | 0.587479 | 145 | 1,166 | 4.593103 | 0.344828 | 0.072072 | 0.066066 | 0.024024 | 0.075075 | 0.075075 | 0 | 0 | 0 | 0 | 0 | 0.007946 | 0.244425 | 1,166 | 36 | 74 | 32.388889 | 0.748014 | 0.316467 | 0 | 0 | 0 | 0 | 0.038314 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.095238 | 0 | 0.285714 | 0.47619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
862e0a0793ac26ff1693be29a952ce4f785121be | 1,020 | py | Python | cla-backend/cla/tests/unit/test_company.py | kdhaigud/easycla | f913f8dbf658acf4711b601f9312ca5663a4efe8 | [
"Apache-2.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | cla-backend/cla/tests/unit/test_company.py | kdhaigud/easycla | f913f8dbf658acf4711b601f9312ca5663a4efe8 | [
"Apache-2.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | cla-backend/cla/tests/unit/test_company.py | kdhaigud/easycla | f913f8dbf658acf4711b601f9312ca5663a4efe8 | [
"Apache-2.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | # Copyright The Linux Foundation and each contributor to CommunityBridge.
# SPDX-License-Identifier: MIT
import json
import os
import requests
import uuid
import hug
import pytest
from falcon import HTTP_200, HTTP_409
import cla
from cla import routes
ID_TOKEN = os.environ.get('ID_TOKEN')
API_URL = os.environ.get('API_URL')
def test_create_company_duplicate():
"""
Test creating duplicate company names
"""
import pdb;pdb.set_trace()
url = f'{API_URL}/v1/company'
company_name = 'test_company_name'
data = {
'company_id' : uuid.uuid4() ,
'company_name' : company_name ,
}
headers = {
'Authorization' : f'Bearer {ID_TOKEN}'
}
response = requests.post(url, data=data, headers=headers)
assert response.status == HTTP_200
# add duplicate company
data = {
'company_id' : uuid.uuid4(),
'company_name' : company_name
}
req = hug.test.post(routes, url, data=data, headers=headers)
assert req.status == HTTP_409
| 23.72093 | 73 | 0.673529 | 133 | 1,020 | 4.992481 | 0.428571 | 0.099398 | 0.036145 | 0.051205 | 0.225904 | 0.225904 | 0.13253 | 0.13253 | 0.13253 | 0 | 0 | 0.018916 | 0.222549 | 1,020 | 42 | 74 | 24.285714 | 0.818411 | 0.157843 | 0 | 0.133333 | 0 | 0 | 0.15 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.033333 | false | 0 | 0.333333 | 0 | 0.366667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
8630b3c80464d13f544a914873b82ed141f94bf1 | 9,098 | py | Python | qstklearn/1knn.py | elxavicio/QSTK | 4981506c37227a72404229d5e1e0887f797a5d57 | [
"BSD-3-Clause"
] | 339 | 2015-01-01T10:06:49.000Z | 2022-03-23T23:32:24.000Z | QSTK/qstklearn/1knn.py | jenniyanjie/QuantSoftwareToolkit | 0eb2c7a776c259a087fdcac1d3ff883eb0b5516c | [
"BSD-3-Clause"
] | 19 | 2015-01-04T13:12:33.000Z | 2021-07-19T11:13:47.000Z | QSTK/qstklearn/1knn.py | jenniyanjie/QuantSoftwareToolkit | 0eb2c7a776c259a087fdcac1d3ff883eb0b5516c | [
"BSD-3-Clause"
] | 154 | 2015-01-30T09:41:15.000Z | 2022-03-19T02:27:59.000Z | '''
(c) 2011, 2012 Georgia Tech Research Corporation
This source code is released under the New BSD license. Please see
http://wiki.quantsoftware.org/index.php?title=QSTK_License
for license details.
Created on Feb 20, 2011
@author: John Cornwell
@organization: Georgia Institute of Technology
@contact: JohnWCornwellV@gmail.com
@summary: This is an implementation of the 1-KNN algorithm for ranking features quickly.
It uses the knn implementation.
@status: oneKNN functions correctly, optimized to use n^2/2 algorithm.
'''
import matplotlib.pyplot as plt
from pylab import gca
import itertools
import string
import numpy as np
import math
import knn
from time import clock
'''
@summary: Query function for 1KNN, return value is a double between 0 and 1.
@param naData: A 2D numpy array. Each row is a data point with the final column containing the classification.
'''
def oneKnn( naData ):
if naData.ndim != 2:
raise Exception( "Data should have two dimensions" )
lLen = naData.shape[0]
''' # of dimensions, subtract one for classification '''
lDim = naData.shape[1] - 1
''' Start best distances as very large '''
ldDistances = [1E300] * lLen
llIndexes = [-1] * lLen
dDistance = 0.0;
''' Loop through finding closest neighbors '''
for i in range( lLen ):
for j in range( i+1, lLen ):
dDistance = 0.0
for k in range( 0, lDim ):
dDistance += (naData[i][k] - naData[j][k])**2
dDistance = math.sqrt( dDistance )
''' Two distances to check, for i's best, and j's best '''
if dDistance < ldDistances[i]:
ldDistances[i] = dDistance
llIndexes[i] = j
if dDistance < ldDistances[j]:
ldDistances[j] = dDistance
llIndexes[j] = i
lCount = 0
''' Now count # of matching pairs '''
for i in range( lLen ):
if naData[i][-1] == naData[ llIndexes[i] ][-1]:
lCount = lCount + 1
return float(lCount) / lLen
''' Test function to plot results '''
def _plotResults( naDist1, naDist2, lfOneKnn, lf5Knn ):
plt.clf()
plt.subplot(311)
plt.scatter( naDist1[:,0], naDist1[:,1] )
plt.scatter( naDist2[:,0], naDist2[:,1], color='r' )
#plt.ylabel( 'Feature 2' )
#plt.xlabel( 'Feature 1' )
#gca().annotate( '', xy=( .8, 0 ), xytext=( -.3 , 0 ), arrowprops=dict(facecolor='red', shrink=0.05) )
gca().annotate( '', xy=( .7, 0 ), xytext=( 1.5 , 0 ), arrowprops=dict(facecolor='black', shrink=0.05) )
plt.title( 'Data Distribution' )
plt.subplot(312)
plt.plot( range( len(lfOneKnn) ), lfOneKnn )
plt.ylabel( '1-KNN Value' )
#plt.xlabel( 'Distribution Merge' )
plt.title( '1-KNN Performance' )
plt.subplot(313)
plt.plot( range( len(lf5Knn) ), lf5Knn )
plt.ylabel( '% Correct Classification' )
#plt.xlabel( 'Distribution Merge' )
plt.title( '5-KNN Performance' )
plt.subplots_adjust()
plt.show()
''' Function to plot 2 distributions '''
def _plotDist( naDist1, naDist2, i ):
plt.clf()
plt.scatter( naDist1[:,0], naDist1[:,1] )
plt.scatter( naDist2[:,0], naDist2[:,1], color='r' )
plt.ylabel( 'Feature 2' )
plt.xlabel( 'Feature 1' )
plt.title( 'Iteration ' + str(i) )
plt.show()
''' Function to test KNN performance '''
def _knnResult( naData ):
''' Split up data into training/testing '''
lSplit = naData.shape[0] * .7
naTrain = naData[:lSplit, :]
naTest = naData[lSplit:, :]
knn.addEvidence( naTrain.astype(float), 1 );
''' Query with last column omitted and 5 nearest neighbors '''
naResults = knn.query( naTest[:,:-1], 5, 'mode')
''' Count returns which are correct '''
lCount = 0
for i, dVal in enumerate(naResults):
if dVal == naTest[i,-1]:
lCount = lCount + 1
dResult = float(lCount) / naResults.size
return dResult
''' Tests performance of 1-KNN '''
def _test1():
''' Generate three random samples to show the value of 1-KNN compared to 5KNN learner performance '''
for i in range(3):
''' Select one of three distributions '''
if i == 0:
naTest1 = np.random.normal( loc=[0,0],scale=.25,size=[500,2] )
naTest1 = np.hstack( (naTest1, np.zeros(500).reshape(-1,1) ) )
naTest2 = np.random.normal( loc=[1.5,0],scale=.25,size=[500,2] )
naTest2 = np.hstack( (naTest2, np.ones(500).reshape(-1,1) ) )
elif i == 1:
naTest1 = np.random.normal( loc=[0,0],scale=.25,size=[500,2] )
naTest1 = np.hstack( (naTest1, np.zeros(500).reshape(-1,1) ) )
naTest2 = np.random.normal( loc=[1.5,0],scale=.1,size=[500,2] )
naTest2 = np.hstack( (naTest2, np.ones(500).reshape(-1,1) ) )
else:
naTest1 = np.random.normal( loc=[0,0],scale=.25,size=[500,2] )
naTest1 = np.hstack( (naTest1, np.zeros(500).reshape(-1,1) ) )
naTest2 = np.random.normal( loc=[1.5,0],scale=.25,size=[250,2] )
naTest2 = np.hstack( (naTest2, np.ones(250).reshape(-1,1) ) )
naOrig = np.vstack( (naTest1, naTest2) )
naBoth = np.vstack( (naTest1, naTest2) )
''' Keep track of runtimes '''
t = clock()
cOneRuntime = t-t;
cKnnRuntime = t-t;
lfResults = []
lfKnnResults = []
for i in range( 15 ):
#_plotDist( naTest1, naBoth[100:,:], i )
t = clock()
lfResults.append( oneKnn( naBoth ) )
cOneRuntime = cOneRuntime + (clock() - t)
t = clock()
lfKnnResults.append( _knnResult( np.random.permutation(naBoth) ) )
cKnnRuntime = cKnnRuntime + (clock() - t)
naBoth[500:,0] = naBoth[500:,0] - .1
print 'Runtime OneKnn:', cOneRuntime
print 'Runtime 5-KNN:', cKnnRuntime
_plotResults( naTest1, naTest2, lfResults, lfKnnResults )
''' Tests performance of 1-KNN '''
def _test2():
''' Generate three random samples to show the value of 1-KNN compared to 5KNN learner performance '''
np.random.seed( 12345 )
''' Create 5 distributions for each of the 5 attributes '''
dist1 = np.random.uniform( -1, 1, 1000 ).reshape( -1, 1 )
dist2 = np.random.uniform( -1, 1, 1000 ).reshape( -1, 1 )
dist3 = np.random.uniform( -1, 1, 1000 ).reshape( -1, 1 )
dist4 = np.random.uniform( -1, 1, 1000 ).reshape( -1, 1 )
dist5 = np.random.uniform( -1, 1, 1000 ).reshape( -1, 1 )
lDists = [ dist1, dist2, dist3, dist4, dist5 ]
''' All features used except for distribution 4 '''
distY = np.sin( dist1 ) + np.sin( dist2 ) + np.sin( dist3 ) + np.sin( dist5 )
distY = distY.reshape( -1, 1 )
for i, fVal in enumerate( distY ):
if fVal >= 0:
distY[i] = 1
else:
distY[i] = 0
for i in range( 1, 6 ):
lsNames = []
lf1Vals = []
lfVals = []
for perm in itertools.combinations( '12345', i ):
''' set test distribution to first element '''
naTest = lDists[ int(perm[0]) - 1 ]
sPerm = perm[0]
''' stack other distributions on '''
for j in range( 1, len(perm) ):
sPerm = sPerm + str(perm[j])
naTest = np.hstack( (naTest, lDists[ int(perm[j]) - 1 ] ) )
''' finally stack y values '''
naTest = np.hstack( (naTest, distY) )
lf1Vals.append( oneKnn( naTest ) )
lfVals.append( _knnResult( np.random.permutation(naTest) ) )
lsNames.append( sPerm )
''' Plot results '''
plt1 = plt.bar( np.arange(len(lf1Vals)), lf1Vals, .2, color='r' )
plt2 = plt.bar( np.arange(len(lfVals)) + 0.2, lfVals, .2, color='b' )
plt.legend( (plt1[0], plt2[0]), ('1-KNN', 'KNN, K=5') )
plt.ylabel('1-KNN Value/KNN Classification')
plt.xlabel('Feature Set')
plt.title('Combinations of ' + str(i) + ' Features')
plt.ylim( (0,1) )
if len(lf1Vals) < 2:
plt.xlim( (-1,1) )
gca().xaxis.set_ticks( np.arange(len(lf1Vals)) + .2 )
gca().xaxis.set_ticklabels( lsNames )
plt.show()
if __name__ == '__main__':
_test1()
#_test2()
| 31.811189 | 112 | 0.523522 | 1,069 | 9,098 | 4.434051 | 0.282507 | 0.008017 | 0.022785 | 0.021519 | 0.277004 | 0.228481 | 0.203587 | 0.197468 | 0.197468 | 0.165823 | 0 | 0.056915 | 0.339525 | 9,098 | 285 | 113 | 31.922807 | 0.731902 | 0.029677 | 0 | 0.189189 | 0 | 0 | 0.041518 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.054054 | null | null | 0.013514 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86346fa63b7971b7ad956846f8bc8dcc94175283 | 2,679 | py | Python | server/cauth/views.py | mashaka/TravelHelper | 8a216dd13c253e138f241187dee46e6e53281a7b | [
"MIT"
] | null | null | null | server/cauth/views.py | mashaka/TravelHelper | 8a216dd13c253e138f241187dee46e6e53281a7b | [
"MIT"
] | 3 | 2020-02-11T23:38:20.000Z | 2021-06-10T19:10:53.000Z | server/cauth/views.py | mashaka/TravelHelper | 8a216dd13c253e138f241187dee46e6e53281a7b | [
"MIT"
] | 1 | 2018-09-19T11:19:48.000Z | 2018-09-19T11:19:48.000Z | from django.shortcuts import render
from django.contrib.auth.decorators import login_required
from django.contrib.auth.forms import AdminPasswordChangeForm, PasswordChangeForm, UserCreationForm
from django.contrib.auth import update_session_auth_hash, login, authenticate
from django.contrib import messages
from django.shortcuts import render, redirect
from social_django.models import UserSocialAuth
from django.http import HttpResponse
from django.shortcuts import get_object_or_404, redirect
from rest_framework.authtoken.models import Token
from app.methods import prepare_user
def get_token(request):
if request.user:
user = request.user
prepare_user(user)
token,_ = Token.objects.get_or_create(user=user)
url = "travel://?token=" + token.key + '&id=' + str(user.id)
else:
url = "travel://error"
response = HttpResponse(url, status=302)
response['Location'] = url
return response
@login_required
def get_facebook_token(request):
q = get_object_or_404(UserSocialAuth, user=request.user, provider='facebook')
return HttpResponse(str(q.extra_data))
def signup(request):
return render(request, 'signup.html')
@login_required
def home(request):
return render(request, 'home.html')
@login_required
def settings(request):
user = request.user
try:
github_login = user.social_auth.get(provider='github')
except UserSocialAuth.DoesNotExist:
github_login = None
try:
twitter_login = user.social_auth.get(provider='twitter')
except UserSocialAuth.DoesNotExist:
twitter_login = None
try:
facebook_login = user.social_auth.get(provider='facebook')
except UserSocialAuth.DoesNotExist:
facebook_login = None
can_disconnect = (user.social_auth.count() > 1 or user.has_usable_password())
return render(request, 'settings.html', {
'facebook_login': facebook_login,
'can_disconnect': can_disconnect
})
@login_required
def password(request):
if request.user.has_usable_password():
PasswordForm = PasswordChangeForm
else:
PasswordForm = AdminPasswordChangeForm
if request.method == 'POST':
form = PasswordForm(request.user, request.POST)
if form.is_valid():
form.save()
update_session_auth_hash(request, form.user)
messages.success(request, 'Your password was successfully updated!')
return redirect('password')
else:
messages.error(request, 'Please correct the error below.')
else:
form = PasswordForm(request.user)
return render(request, 'password.html', {'form': form})
| 31.892857 | 99 | 0.709966 | 312 | 2,679 | 5.942308 | 0.294872 | 0.04315 | 0.036677 | 0.040453 | 0.081985 | 0.048544 | 0 | 0 | 0 | 0 | 0 | 0.00464 | 0.195595 | 2,679 | 83 | 100 | 32.277108 | 0.855684 | 0 | 0 | 0.231884 | 0 | 0 | 0.086226 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086957 | false | 0.15942 | 0.15942 | 0.028986 | 0.347826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
863b23444fda9cb581afbddd6338c59075cfc887 | 1,793 | py | Python | tests/test_responder.py | craigderington/responder-persons-api | d2270d2f761c5dd3dbe253113d410f3e37d4d217 | [
"Apache-2.0"
] | null | null | null | tests/test_responder.py | craigderington/responder-persons-api | d2270d2f761c5dd3dbe253113d410f3e37d4d217 | [
"Apache-2.0"
] | null | null | null | tests/test_responder.py | craigderington/responder-persons-api | d2270d2f761c5dd3dbe253113d410f3e37d4d217 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
import pytest
import app as service
import yaml
import responder
from starlette.responses import PlainTextResponse
@pytest.fixture
def api():
return service.api
def test_hello_world(api):
r = api.requests.get("/api/v1.0/index")
assert r.text == "Hello, World!"
def test_basic_route(api):
@api.route("/api/v1.0/index")
def index(req, resp):
resp.text = "Hello, World!"
def test_requests_session(api):
assert api.session()
assert api.requests
def test_json_media(api):
dump = {"life": 42}
@api.route("/")
def media(req, resp):
resp.media = dump
r = api.requests.get("http://;/")
assert "json" in r.headers["Content-Type"]
assert r.json() == dump
def test_yaml_media(api):
dump = {"life": 42}
@api.route("/")
def media(req, resp):
resp.media = dump
r = api.requests.get("http://;/", headers={"Accept": "yaml"})
assert "yaml" in r.headers["Content-Type"]
assert yaml.load(r.content) == dump
def test_background(api):
@api.route("/")
def route(req, resp):
@api.background.task
def task():
import time
time.sleep(3)
task()
api.text = "ok"
r = api.requests.get(api.url_for(route))
assert r.ok
def test_500_error(api):
def catcher(req, exc):
return PlainTextResponse("Suppressed error", 500)
api.app.add_exception_handler(ValueError, catcher)
@api.route("/api/v1.0/index")
def view(req, resp):
raise ValueError
r = api.requests.get(api.url_for(view))
assert not r.ok
assert r.content == b'Suppressed error'
def test_404_error(api):
r = api.requests.get("/api/v1.0/foo")
assert r.status_code == responder.API.status_codes.HTTP_404
| 19.703297 | 65 | 0.621305 | 253 | 1,793 | 4.316206 | 0.284585 | 0.051282 | 0.065934 | 0.082418 | 0.349817 | 0.311355 | 0.261905 | 0.177656 | 0.1337 | 0.1337 | 0 | 0.018813 | 0.229225 | 1,793 | 90 | 66 | 19.922222 | 0.771346 | 0.00725 | 0 | 0.189655 | 0 | 0 | 0.106299 | 0 | 0 | 0 | 0 | 0 | 0.189655 | 1 | 0.275862 | false | 0 | 0.103448 | 0.034483 | 0.413793 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
863e8a2ed0006f7150de09f27d406b39ae986ad3 | 827 | py | Python | saleor/order/migrations/0081_auto_20200406_0456.py | fairhopeweb/saleor | 9ac6c22652d46ba65a5b894da5f1ba5bec48c019 | [
"CC-BY-4.0"
] | 15,337 | 2015-01-12T02:11:52.000Z | 2021-10-05T19:19:29.000Z | saleor/order/migrations/0081_auto_20200406_0456.py | fairhopeweb/saleor | 9ac6c22652d46ba65a5b894da5f1ba5bec48c019 | [
"CC-BY-4.0"
] | 7,486 | 2015-02-11T10:52:13.000Z | 2021-10-06T09:37:15.000Z | saleor/order/migrations/0081_auto_20200406_0456.py | aminziadna/saleor | 2e78fb5bcf8b83a6278af02551a104cfa555a1fb | [
"CC-BY-4.0"
] | 5,864 | 2015-01-16T14:52:54.000Z | 2021-10-05T23:01:15.000Z | # Generated by Django 3.0.4 on 2020-04-06 09:56
from django.db import migrations
from saleor.order import OrderStatus
def match_orders_with_users(apps, *_args, **_kwargs):
Order = apps.get_model("order", "Order")
User = apps.get_model("account", "User")
orders_without_user = Order.objects.filter(
user_email__isnull=False, user=None
).exclude(status=OrderStatus.DRAFT)
for order in orders_without_user:
try:
new_user = User.objects.get(email=order.user_email)
except User.DoesNotExist:
continue
order.user = new_user
order.save(update_fields=["user"])
class Migration(migrations.Migration):
dependencies = [
("order", "0080_invoice"),
]
operations = [
migrations.RunPython(match_orders_with_users),
]
| 25.060606 | 63 | 0.665054 | 102 | 827 | 5.176471 | 0.568627 | 0.051136 | 0.056818 | 0.075758 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029688 | 0.226119 | 827 | 32 | 64 | 25.84375 | 0.795313 | 0.054414 | 0 | 0 | 1 | 0 | 0.053846 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.090909 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
8648090bbe37bd69072d860284239d2be5f5a913 | 469 | py | Python | code-wars/010.moving-zeros-to-the-end.py | code-knayam/DataStructureAlgorithms | 8425911633d4d343c58798a123175289ed0df1fe | [
"MIT"
] | null | null | null | code-wars/010.moving-zeros-to-the-end.py | code-knayam/DataStructureAlgorithms | 8425911633d4d343c58798a123175289ed0df1fe | [
"MIT"
] | null | null | null | code-wars/010.moving-zeros-to-the-end.py | code-knayam/DataStructureAlgorithms | 8425911633d4d343c58798a123175289ed0df1fe | [
"MIT"
] | null | null | null | # Write an algorithm that takes an array and moves all of the zeros to the end, preserving the order of the other elements.
def move_zeros(array):
#your code here
new_array = []
new_index = 0
while len(array) > 0:
item = array.pop(0)
if item == 0 and not type(item) == bool :
new_array.append(item)
else:
new_array.insert(new_index, item)
new_index = new_index + 1
return new_array | 31.266667 | 123 | 0.603412 | 70 | 469 | 3.914286 | 0.557143 | 0.116788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015674 | 0.319829 | 469 | 15 | 124 | 31.266667 | 0.84326 | 0.287846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
864a44b2fa4b1d6dbb15ace15ef151c81922788f | 928 | py | Python | __main__.py | miezebieze/scott-launcher | a03597d0883af075128d1ea4ea53e7b5132807b1 | [
"MIT"
] | 1 | 2020-06-12T20:49:47.000Z | 2020-06-12T20:49:47.000Z | __main__.py | miezebieze/scott-launcher | a03597d0883af075128d1ea4ea53e7b5132807b1 | [
"MIT"
] | null | null | null | __main__.py | miezebieze/scott-launcher | a03597d0883af075128d1ea4ea53e7b5132807b1 | [
"MIT"
] | null | null | null | from enum import Enum
from window import Window
D = Enum ('Directions','N NE E SE S SW W NW')
selector_map = {
D.NW: [0.5,0.5], D.N: [1.5,0], D.NE: [2.5,0.5],
D.W: [0,1.5], D.E: [3,1.5],
D.SW: [0.5,2.5], D.S: [1.5,3], D.SE: [2.5,2.5],
}
selector_size = 100
window_size = selector_size*4
window = Window (window_size,window_size,selector_map,selector_size,selector_size)
# set actions here
from functools import partial
def say (something):
print (''.join (('Me: "',something,'"')))
window.actions[D.NW] = partial (say,'northwast')
window.actions[D.N] = partial (say,'north')
window.actions[D.NE] = partial (say,'neorthest')
window.actions[D.W] = partial (say,'western')
window.actions[D.E] = partial (say,'easy')
window.actions[D.SW] = partial (say,'suess whest')
window.actions[D.S] = partial (say,'sissy')
window.actions[D.SE] = partial (say,'seoul')
window.go ()
| 29 | 82 | 0.626078 | 158 | 928 | 3.620253 | 0.28481 | 0.181818 | 0.195804 | 0.013986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041721 | 0.173491 | 928 | 31 | 83 | 29.935484 | 0.704042 | 0.017241 | 0 | 0 | 0 | 0 | 0.098901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.130435 | 0 | 0.173913 | 0.043478 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
864a6447cb894e438d5dd8c26760c86abeb04746 | 760 | py | Python | cride/circles/serializers.py | monteals/C-Ride | 6e9368011f49ff619d1edaeaf1e8232685cc2095 | [
"MIT"
] | null | null | null | cride/circles/serializers.py | monteals/C-Ride | 6e9368011f49ff619d1edaeaf1e8232685cc2095 | [
"MIT"
] | 9 | 2020-04-24T01:29:38.000Z | 2022-03-12T00:25:50.000Z | cride/circles/serializers.py | monteals/C-Ride | 6e9368011f49ff619d1edaeaf1e8232685cc2095 | [
"MIT"
] | null | null | null | from rest_framework import serializers
from rest_framework.validators import UniqueValidator
from cride.circles.models import Circle
class CircleSerializer(serializers.Serializer):
name = serializers.CharField()
slug_name = serializers.SlugField()
rides_taken = serializers.IntegerField()
rides_offered = serializers.IntegerField()
members_limit = serializers.IntegerField()
class CreateCircleSerializer(serializers.Serializer):
name = serializers.CharField(max_length=140)
slug_name = serializers.CharField(max_length=40, validators=[UniqueValidator(queryset=Circle.objects.all())])
about = serializers.CharField(max_length=255, required=False)
def create(self, data):
return Circle.objects.create(**data)
| 36.190476 | 113 | 0.777632 | 80 | 760 | 7.2625 | 0.5125 | 0.10327 | 0.123924 | 0.149742 | 0.227194 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012121 | 0.131579 | 760 | 20 | 114 | 38 | 0.868182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.2 | 0.066667 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
865535238f10c51c669114bcf29b3699dd34b1e8 | 559 | py | Python | examples/django/hello_world/wsgi.py | liuyu81/SnapSearch-Client-Python | 41857806c2b26f0537de2dcc23a145107a4ecd04 | [
"MIT"
] | null | null | null | examples/django/hello_world/wsgi.py | liuyu81/SnapSearch-Client-Python | 41857806c2b26f0537de2dcc23a145107a4ecd04 | [
"MIT"
] | null | null | null | examples/django/hello_world/wsgi.py | liuyu81/SnapSearch-Client-Python | 41857806c2b26f0537de2dcc23a145107a4ecd04 | [
"MIT"
] | 1 | 2018-03-04T20:24:14.000Z | 2018-03-04T20:24:14.000Z | import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "hello_world.settings")
# django WSGI application
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
# load SnapSearch API credentials
api_email = "<email>"
api_key = "<key>"
# initialize the interceptor
from SnapSearch import Client, Detector, Interceptor
interceptor = Interceptor(Client(api_email, api_key), Detector())
# deploy the interceptor
from SnapSearch.wsgi import InterceptorMiddleware
application = InterceptorMiddleware(application, interceptor)
| 27.95 | 71 | 0.815742 | 65 | 559 | 6.846154 | 0.415385 | 0.101124 | 0.080899 | 0.125843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103757 | 559 | 19 | 72 | 29.421053 | 0.888224 | 0.187835 | 0 | 0 | 0 | 0 | 0.120267 | 0.048998 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
865b44ebd78e20ddd28ec532ea20204eaa6a07dc | 848 | py | Python | examples/run_merger.py | needlehaystack/needlestack | e00529a2a7c2d85059936a85f54dfb55e515b6ef | [
"Apache-2.0"
] | 3 | 2019-10-03T22:15:21.000Z | 2022-02-08T09:05:41.000Z | examples/run_merger.py | cungtv/needlestack | e00529a2a7c2d85059936a85f54dfb55e515b6ef | [
"Apache-2.0"
] | 1 | 2021-04-30T21:08:47.000Z | 2021-04-30T21:08:47.000Z | examples/run_merger.py | cungtv/needlestack | e00529a2a7c2d85059936a85f54dfb55e515b6ef | [
"Apache-2.0"
] | 2 | 2019-08-02T19:13:09.000Z | 2019-10-25T01:47:17.000Z | import logging
from grpc_health.v1 import health_pb2, health_pb2_grpc
from grpc_health.v1.health import HealthServicer
from needlestack.apis import servicers_pb2_grpc
from needlestack.servicers import factory
from needlestack.servicers.merger import MergerServicer
from examples import configs
logging.getLogger("kazoo").setLevel("WARN")
def main():
config = configs.LocalDockerConfig()
server = factory.create_server(config)
manager = factory.create_zookeeper_cluster_manager(config)
manager.startup()
servicers_pb2_grpc.add_MergerServicer_to_server(MergerServicer(config, manager), server)
health = HealthServicer()
health_pb2_grpc.add_HealthServicer_to_server(health, server)
health.set("Merger", health_pb2.HealthCheckResponse.SERVING)
factory.serve(server)
if __name__ == "__main__":
main()
| 25.69697 | 92 | 0.792453 | 101 | 848 | 6.356436 | 0.376238 | 0.056075 | 0.043614 | 0.049844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010811 | 0.127358 | 848 | 32 | 93 | 26.5 | 0.856757 | 0 | 0 | 0 | 0 | 0 | 0.027123 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.35 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
865b48e5b6d60c2c5b81fb4b0a827e80f5502ece | 4,482 | py | Python | engine_wrapper.py | lidevelopers/Lishogi-Bot-1 | 5e669870930fe497e323324f36ccdbf5b04d26d3 | [
"MIT"
] | null | null | null | engine_wrapper.py | lidevelopers/Lishogi-Bot-1 | 5e669870930fe497e323324f36ccdbf5b04d26d3 | [
"MIT"
] | 2 | 2021-06-28T11:09:19.000Z | 2021-06-30T16:59:13.000Z | engine_wrapper.py | lidevelopers/Lishogi-Bot-1 | 5e669870930fe497e323324f36ccdbf5b04d26d3 | [
"MIT"
] | 9 | 2021-06-28T08:06:08.000Z | 2021-10-06T05:01:57.000Z | import os
import shogi
import backoff
import subprocess
from util import *
import logging
logger = logging.getLogger(__name__)
import engine_ctrl
@backoff.on_exception(backoff.expo, BaseException, max_time=120)
def create_engine(config, board):
cfg = config["engine"]
engine_path = os.path.realpath(os.path.join(cfg["dir"], cfg["name"]))
engine_type = cfg.get("protocol")
engine_options = cfg.get("engine_options")
commands = [engine_path]
if engine_options:
for k, v in engine_options.items():
commands.append("--{}={}".format(k, v))
silence_stderr = cfg.get("silence_stderr", False)
return USIEngine(board, commands, cfg.get("usi_options", {}), cfg.get("go_commands", {}), silence_stderr)
class EngineWrapper:
def __init__(self, board, commands, options=None, silence_stderr=False):
pass
def search_for(self, board, movetime):
pass
def first_search(self, board, movetime):
pass
def search(self, game, board, btime, wtime, binc, winc):
pass
def print_stats(self):
pass
def get_opponent_info(self, game):
pass
def name(self):
return self.engine.name
def report_game_result(self, game, board):
pass
def quit(self):
self.engine.kill_process()
def print_handler_stats(self):
pass
def get_handler_stats(self):
pass
class USIEngine(EngineWrapper):
def __init__(self, board, commands, options, go_commands={}, silence_stderr=False):
commands = commands[0] if len(commands) == 1 else commands
self.go_commands = go_commands
self.engine = engine_ctrl.Engine(commands)
self.engine.usi()
if options:
for name, value in options.items():
self.engine.setoption(name, value)
self.engine.isready()
def first_search(self, board, movetime):
best_move, _ = self.engine.go(board.sfen(), "", movetime=movetime)
return best_move
def search_with_ponder(self, game, board, btime, wtime, binc, winc, byo, ponder=False):
moves = [m.usi() for m in list(board.move_stack)]
cmds = self.go_commands
if len(cmds) > 0:
best_move, ponder_move = self.engine.go(
game.initial_fen,
moves,
nodes=cmds.get("nodes"),
depth=cmds.get("depth"),
movetime=cmds.get("movetime"),
ponder=ponder
)
else:
best_move, ponder_move = self.engine.go(
game.initial_fen,
moves,
btime=btime,
wtime=wtime,
binc=binc,
winc=winc,
byo=byo,
ponder=ponder
)
return (best_move, ponder_move)
def search(self, game, board, btime, wtime, binc, winc):
cmds = self.go_commands
moves = [m.usi() for m in list(board.move_stack)]
best_move, _ = self.engine.go(
game.initial_fen,
moves,
btime=btime,
wtime=wtime,
binc=binc,
winc=winc,
depth=cmds.get("depth"),
nodes=cmds.get("nodes"),
movetime=cmds.get("movetime")
)
return best_move
def stop(self):
self.engine.kill_process()
def print_stats(self, stats=None):
if stats is None:
stats = ['score', 'depth', 'nodes', 'nps']
info = self.engine.info
for stat in stats:
if stat in info:
logger.info("{}: {}".format(stat, info[stat]))
def get_stats(self, stats=None):
if stats is None:
stats = ['score', 'depth', 'nodes', 'nps']
info = self.engine.info
stats_str = []
for stat in stats:
if stat in info:
stats_str.append("{}: {}".format(stat, info[stat]))
return stats_str
def get_opponent_info(self, game):
name = game.opponent.name
if name:
rating = game.opponent.rating if game.opponent.rating is not None else "none"
title = game.opponent.title if game.opponent.title else "none"
player_type = "computer" if title == "BOT" else "human"
def report_game_result(self, game, board):
self.engine.protocol._position(board)
| 29.486842 | 109 | 0.566934 | 528 | 4,482 | 4.660985 | 0.206439 | 0.056887 | 0.026412 | 0.026006 | 0.422999 | 0.369768 | 0.323446 | 0.222267 | 0.201138 | 0.168631 | 0 | 0.00197 | 0.320616 | 4,482 | 151 | 110 | 29.682119 | 0.80624 | 0 | 0 | 0.471074 | 0 | 0 | 0.041499 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173554 | false | 0.07438 | 0.057851 | 0.008264 | 0.297521 | 0.024793 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
865e9017c35669feb5f2b679820ab813bc9d8b73 | 533 | py | Python | scene_action2.py | encela95dus/ios_pythonista_examples | e136cdcb05126f0f9b9f6fb6365870876b419619 | [
"MIT"
] | 36 | 2019-01-12T04:17:49.000Z | 2022-03-31T05:33:29.000Z | scene_action2.py | Backup-eric645/ios_pythonista_examples | e136cdcb05126f0f9b9f6fb6365870876b419619 | [
"MIT"
] | null | null | null | scene_action2.py | Backup-eric645/ios_pythonista_examples | e136cdcb05126f0f9b9f6fb6365870876b419619 | [
"MIT"
] | 15 | 2018-12-30T21:18:05.000Z | 2022-01-30T13:17:07.000Z | import scene
class MyScene(scene.Scene):
def setup(self):
self.label_node = scene.LabelNode('A',
position=(100,400), parent=self)
self.start_flag = False
def update(self):
if self.start_flag:
x,y = self.label_node.position
if x < 340:
self.label_node.position = (x+2, y)
else:
self.start_flag = False
def touch_ended(self, touch):
self.start_flag = True
scene.run(MyScene())
| 25.380952 | 52 | 0.525328 | 64 | 533 | 4.25 | 0.453125 | 0.132353 | 0.191176 | 0.132353 | 0.154412 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029851 | 0.371482 | 533 | 20 | 53 | 26.65 | 0.78209 | 0 | 0 | 0.125 | 0 | 0 | 0.001876 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.0625 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
865e9adf437f79d36a8db146c26aeeb0ca4e34fa | 672 | py | Python | bot/commands/disconnect.py | aq1/vkPostman | db6b8d387d484ff53d12dcaf77ba3dcaa6da3822 | [
"MIT"
] | 1 | 2020-09-14T04:47:31.000Z | 2020-09-14T04:47:31.000Z | bot/commands/disconnect.py | aq1/vkPostman | db6b8d387d484ff53d12dcaf77ba3dcaa6da3822 | [
"MIT"
] | null | null | null | bot/commands/disconnect.py | aq1/vkPostman | db6b8d387d484ff53d12dcaf77ba3dcaa6da3822 | [
"MIT"
] | null | null | null | from bot.commands import BaseCommand
import mongo
class DisconnectCommand(BaseCommand):
_COMMAND = 'disconnect'
_DESCRIPTION = 'Close currently active chat.'
_SUCCESS_MESSAGE = 'Disconnected from chat'
def _callback(self, user, _bot, update, **kwargs):
return self._call(user, _bot, update, **kwargs)
def _call(self, user, _bot, update, **kwargs):
chat = mongo.chats.get_active_chat_by_telegram_id(user.id)
if chat:
mongo.chats.disable_chat(chat['_id'])
return True
_bot.send_message(
user.id,
'You are not connected to any vk user',
)
return False
| 25.846154 | 66 | 0.633929 | 79 | 672 | 5.151899 | 0.544304 | 0.051597 | 0.095823 | 0.140049 | 0.113022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.272321 | 672 | 25 | 67 | 26.88 | 0.832311 | 0 | 0 | 0 | 0 | 0 | 0.147321 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.055556 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
865f59e775e337c6b42c37791b8b1b83a1c4fa34 | 2,522 | py | Python | pysh/bash_vm/shell_command.py | JordanKoeller/Pysch | 6775db00e6d551328ce49a50a5987223a9e9a9c3 | [
"MIT"
] | null | null | null | pysh/bash_vm/shell_command.py | JordanKoeller/Pysch | 6775db00e6d551328ce49a50a5987223a9e9a9c3 | [
"MIT"
] | null | null | null | pysh/bash_vm/shell_command.py | JordanKoeller/Pysch | 6775db00e6d551328ce49a50a5987223a9e9a9c3 | [
"MIT"
] | null | null | null | from __future__ import annotations
import subprocess
import os
from typing import List, Dict, Iterator, Optional, Tuple
class ShellCommand:
def __init__(self, cmd: str):
self.run_args = [
"bash", "-c", f'{cmd}'
]
# self.run_args: List[str] = [executable, *args]
def exec(self, **extra_environ: str) -> ShellCommandOutput:
result = subprocess.run(self.run_args,
stdout=subprocess.PIPE,
env={
**os.environ,
**(extra_environ if extra_environ else {})
}
)
print("Finished shell command")
return ShellCommandOutput(str(result.stdout, 'utf-8'), result.returncode)
class ShellCommandOutput:
def __init__(self, output_body: str, code: int):
self._code = code
self._value = output_body
@property
def succeeded(self) -> bool:
return self._code == 0
@property
def code(self) -> int:
return self._code
@property
def value(self) -> str:
return self._value
def lines(self) -> List[ShellCommandOutput]:
return [
ShellCommandOutput(substr, self.code)
for substr in self.value.splitlines()
if substr
]
def __iter__(self) -> Iterator[str]:
return iter(self._split_tokens())
def __str__(self) -> str:
return f'<STDOUT value={self.value} code={self.code} >'
def _split_tokens(self) -> List[str]:
ret = []
in_quotes = None
accumulator: List[str] = []
for char in self.value:
if _whitespace(char) and not in_quotes and accumulator:
ret.append(''.join(accumulator))
accumulator = []
elif in_quotes == None and _quotes(char):
in_quotes = char
elif in_quotes and in_quotes == char:
in_quotes = None
if accumulator:
ret.append(''.join(accumulator))
accumulator = []
elif in_quotes and _quotes(char):
raise ValueError(
f"Found unmatched quote characters in string {self.value}")
else:
accumulator.append(char)
return ret
def _quotes(c: str) -> bool:
return c in ['"', "'"]
def _whitespace(c: str) -> bool:
return str.isspace(c)
| 28.022222 | 81 | 0.527756 | 257 | 2,522 | 4.984436 | 0.307393 | 0.049961 | 0.025761 | 0.037471 | 0.090554 | 0.090554 | 0.090554 | 0.090554 | 0.090554 | 0 | 0 | 0.001261 | 0.371134 | 2,522 | 89 | 82 | 28.337079 | 0.806431 | 0.018239 | 0 | 0.134328 | 0 | 0 | 0.056589 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.179104 | false | 0 | 0.059701 | 0.119403 | 0.41791 | 0.014925 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
865fae0cf0882393868b033ff9b36122ab7504f2 | 76,601 | py | Python | src/saml2/saml.py | masterapps-au/pysaml2 | 97ad6c066c93cb31a3c3b9d504877c02e93ca9a9 | [
"Apache-2.0"
] | null | null | null | src/saml2/saml.py | masterapps-au/pysaml2 | 97ad6c066c93cb31a3c3b9d504877c02e93ca9a9 | [
"Apache-2.0"
] | null | null | null | src/saml2/saml.py | masterapps-au/pysaml2 | 97ad6c066c93cb31a3c3b9d504877c02e93ca9a9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
#
# Generated Mon May 2 14:23:33 2011 by parse_xsd.py version 0.4.
#
# A summary of available specifications can be found at:
# https://wiki.oasis-open.org/security/FrontPage
#
# saml core specifications to be found at:
# if any question arise please query the following pdf.
# http://docs.oasis-open.org/security/saml/v2.0/saml-core-2.0-os.pdf
# The specification was later updated with errata, and the new version is here:
# https://www.oasis-open.org/committees/download.php/56776/sstc-saml-core-errata-2.0-wd-07.pdf
#
try:
from base64 import encodebytes as b64encode
except ImportError:
from base64 import b64encode
from saml2.validate import valid_ipv4, MustValueError
from saml2.validate import valid_ipv6
from saml2.validate import ShouldValueError
from saml2.validate import valid_domain_name
import saml2
from saml2 import SamlBase
import six
from saml2 import xmldsig as ds
from saml2 import xmlenc as xenc
# authentication information fields
NAMESPACE = 'urn:oasis:names:tc:SAML:2.0:assertion'
# xmlschema definition
XSD = "xs"
# xmlschema templates and extensions
XS_NAMESPACE = 'http://www.w3.org/2001/XMLSchema'
# xmlschema-instance, which contains several builtin attributes
XSI_NAMESPACE = 'http://www.w3.org/2001/XMLSchema-instance'
# xml soap namespace
NS_SOAP_ENC = "http://schemas.xmlsoap.org/soap/encoding/"
# type definitions for xmlschemas
XSI_TYPE = '{%s}type' % XSI_NAMESPACE
# nil type definition for xmlschemas
XSI_NIL = '{%s}nil' % XSI_NAMESPACE
# idp and sp communicate usually about a subject(NameID)
# the format determines the category the subject is in
# custom subject
NAMEID_FORMAT_UNSPECIFIED = (
"urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified")
# subject as email address
NAMEID_FORMAT_EMAILADDRESS = (
"urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress")
# subject as x509 key
NAMEID_FORMAT_X509SUBJECTNAME = (
"urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName")
# subject as windows domain name
NAMEID_FORMAT_WINDOWSDOMAINQUALIFIEDNAME = (
"urn:oasis:names:tc:SAML:1.1:nameid-format:WindowsDomainQualifiedName")
# subject from a kerberos instance
NAMEID_FORMAT_KERBEROS = (
"urn:oasis:names:tc:SAML:2.0:nameid-format:kerberos")
# subject as name
NAMEID_FORMAT_ENTITY = (
"urn:oasis:names:tc:SAML:2.0:nameid-format:entity")
# linked subject
NAMEID_FORMAT_PERSISTENT = (
"urn:oasis:names:tc:SAML:2.0:nameid-format:persistent")
# annonymous subject
NAMEID_FORMAT_TRANSIENT = (
"urn:oasis:names:tc:SAML:2.0:nameid-format:transient")
# subject avaiable in encrypted format
NAMEID_FORMAT_ENCRYPTED = (
"urn:oasis:names:tc:SAML:2.0:nameid-format:encrypted")
# dicc for avaiable formats
NAMEID_FORMATS_SAML2 = (
('NAMEID_FORMAT_EMAILADDRESS', NAMEID_FORMAT_EMAILADDRESS),
('NAMEID_FORMAT_ENCRYPTED', NAMEID_FORMAT_ENCRYPTED),
('NAMEID_FORMAT_ENTITY', NAMEID_FORMAT_ENTITY),
('NAMEID_FORMAT_PERSISTENT', NAMEID_FORMAT_PERSISTENT),
('NAMEID_FORMAT_TRANSIENT', NAMEID_FORMAT_TRANSIENT),
('NAMEID_FORMAT_UNSPECIFIED', NAMEID_FORMAT_UNSPECIFIED),
)
# a profile outlines a set of rules describing how to embed SAML assertions.
# https://docs.oasis-open.org/security/saml/v2.0/saml-profiles-2.0-os.pdf
# The specification was later updated with errata, and the new version is here:
# https://www.oasis-open.org/committees/download.php/56782/sstc-saml-profiles-errata-2.0-wd-07.pdf
# XML based values for SAML attributes
PROFILE_ATTRIBUTE_BASIC = (
"urn:oasis:names:tc:SAML:2.0:profiles:attribute:basic")
# an AuthnRequest is made to initiate authentication
# authenticate the request with login credentials
AUTHN_PASSWORD = "urn:oasis:names:tc:SAML:2.0:ac:classes:Password"
# authenticate the request with login credentials, over tls/https
AUTHN_PASSWORD_PROTECTED = \
"urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport"
# attribute statements is key:value metadata shared with your app
# custom format
NAME_FORMAT_UNSPECIFIED = (
"urn:oasis:names:tc:SAML:2.0:attrname-format:unspecified")
# uri format
NAME_FORMAT_URI = "urn:oasis:names:tc:SAML:2.0:attrname-format:uri"
# XML-based format
NAME_FORMAT_BASIC = "urn:oasis:names:tc:SAML:2.0:attrname-format:basic"
# dicc for avaiable formats
NAME_FORMATS_SAML2 = (
('NAME_FORMAT_BASIC', NAME_FORMAT_BASIC),
('NAME_FORMAT_URI', NAME_FORMAT_URI),
('NAME_FORMAT_UNSPECIFIED', NAME_FORMAT_UNSPECIFIED),
)
# the SAML authority's decision can be predetermined by arbitrary context
# the specified action is permitted
DECISION_TYPE_PERMIT = "Permit"
# the specified action is denied
DECISION_TYPE_DENY = "Deny"
# the SAML authority cannot determine if the action is permitted or denied
DECISION_TYPE_INDETERMINATE = "Indeterminate"
# consent attributes determine wether consent has been given and under
# what conditions
# no claim to consent is made
CONSENT_UNSPECIFIED = "urn:oasis:names:tc:SAML:2.0:consent:unspecified"
# consent has been obtained
CONSENT_OBTAINED = "urn:oasis:names:tc:SAML:2.0:consent:obtained"
# consent has been obtained before the message has been initiated
CONSENT_PRIOR = "urn:oasis:names:tc:SAML:2.0:consent:prior"
# consent has been obtained implicitly
CONSENT_IMPLICIT = "urn:oasis:names:tc:SAML:2.0:consent:current-implicit"
# consent has been obtained explicitly
CONSENT_EXPLICIT = "urn:oasis:names:tc:SAML:2.0:consent:current-explicit"
# no consent has been obtained
CONSENT_UNAVAILABLE = "urn:oasis:names:tc:SAML:2.0:consent:unavailable"
# no consent is needed.
CONSENT_INAPPLICABLE = "urn:oasis:names:tc:SAML:2.0:consent:inapplicable"
# Subject confirmation methods(scm), can be issued, besides the subject itself
# by third parties.
# http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.0.pdf
# the 3rd party is identified on behalf of the subject given private/public key
SCM_HOLDER_OF_KEY = "urn:oasis:names:tc:SAML:2.0:cm:holder-of-key"
# the 3rd party is identified by subject confirmation and must include a security header
# signing its content.
SCM_SENDER_VOUCHES = "urn:oasis:names:tc:SAML:2.0:cm:sender-vouches"
# a bearer token is issued instead.
SCM_BEARER = "urn:oasis:names:tc:SAML:2.0:cm:bearer"
class AttributeValueBase(SamlBase):
def __init__(self,
text=None,
extension_elements=None,
extension_attributes=None):
self._extatt = {}
SamlBase.__init__(self,
text=None,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
if self._extatt:
self.extension_attributes = self._extatt
if text:
self.set_text(text)
elif not extension_elements:
self.extension_attributes = {XSI_NIL: 'true'}
elif XSI_TYPE in self.extension_attributes:
del self.extension_attributes[XSI_TYPE]
def __setattr__(self, key, value):
if key == "text":
self.set_text(value)
else:
SamlBase.__setattr__(self, key, value)
def verify(self):
if not self.text and not self.extension_elements:
if not self.extension_attributes:
raise Exception(
"Attribute value base should not have extension attributes"
)
if self.extension_attributes[XSI_NIL] != "true":
raise Exception(
"Attribute value base should not have extension attributes"
)
return True
else:
SamlBase.verify(self)
def set_type(self, typ):
try:
del self.extension_attributes[XSI_NIL]
except (AttributeError, KeyError):
pass
try:
self.extension_attributes[XSI_TYPE] = typ
except AttributeError:
self._extatt[XSI_TYPE] = typ
if typ.startswith('xs:'):
try:
self.extension_attributes['xmlns:xs'] = XS_NAMESPACE
except AttributeError:
self._extatt['xmlns:xs'] = XS_NAMESPACE
if typ.startswith('xsd:'):
try:
self.extension_attributes['xmlns:xsd'] = XS_NAMESPACE
except AttributeError:
self._extatt['xmlns:xsd'] = XS_NAMESPACE
def get_type(self):
try:
return self.extension_attributes[XSI_TYPE]
except (KeyError, AttributeError):
try:
return self._extatt[XSI_TYPE]
except KeyError:
return ""
def clear_type(self):
try:
del self.extension_attributes[XSI_TYPE]
except KeyError:
pass
try:
del self._extatt[XSI_TYPE]
except KeyError:
pass
def set_text(self, value, base64encode=False):
def _wrong_type_value(xsd, value):
msg = 'Type and value do not match: {xsd}:{type}:{value}'
msg = msg.format(xsd=xsd, type=type(value), value=value)
raise ValueError(msg)
# only work with six.string_types
_str = unicode if six.PY2 else str
if isinstance(value, six.binary_type):
value = value.decode('utf-8')
type_to_xsd = {
_str: 'string',
int: 'integer',
float: 'float',
bool: 'boolean',
type(None): '',
}
# entries of xsd-types each declaring:
# - a corresponding python type
# - a function to turn a string into that type
# - a function to turn that type into a text-value
xsd_types_props = {
'string': {
'type': _str,
'to_type': _str,
'to_text': _str,
},
'integer': {
'type': int,
'to_type': int,
'to_text': _str,
},
'short': {
'type': int,
'to_type': int,
'to_text': _str,
},
'int': {
'type': int,
'to_type': int,
'to_text': _str,
},
'long': {
'type': int,
'to_type': int,
'to_text': _str,
},
'float': {
'type': float,
'to_type': float,
'to_text': _str,
},
'double': {
'type': float,
'to_type': float,
'to_text': _str,
},
'boolean': {
'type': bool,
'to_type': lambda x: {
'true': True,
'false': False,
}[_str(x).lower()],
'to_text': lambda x: _str(x).lower(),
},
'base64Binary': {
'type': _str,
'to_type': _str,
'to_text': (
lambda x: b64encode(x.encode()) if base64encode else x
),
},
'anyType': {
'type': type(value),
'to_type': lambda x: x,
'to_text': lambda x: x,
},
'': {
'type': type(None),
'to_type': lambda x: None,
'to_text': lambda x: '',
},
}
xsd_string = (
'base64Binary' if base64encode
else self.get_type()
or type_to_xsd.get(type(value)))
xsd_ns, xsd_type = (
['', type(None)] if xsd_string is None
else ['', ''] if xsd_string == ''
else [
XSD if xsd_string in xsd_types_props else '',
xsd_string
] if ':' not in xsd_string
else xsd_string.split(':', 1))
xsd_type_props = xsd_types_props.get(xsd_type, {})
valid_type = xsd_type_props.get('type', type(None))
to_type = xsd_type_props.get('to_type', str)
to_text = xsd_type_props.get('to_text', str)
# cast to correct type before type-checking
if type(value) is _str and valid_type is not _str:
try:
value = to_type(value)
except (TypeError, ValueError, KeyError):
# the cast failed
_wrong_type_value(xsd=xsd_type, value=value)
if type(value) is not valid_type:
_wrong_type_value(xsd=xsd_type, value=value)
text = to_text(value)
self.set_type(
'{ns}:{type}'.format(ns=xsd_ns, type=xsd_type) if xsd_ns
else xsd_type if xsd_type
else '')
SamlBase.__setattr__(self, 'text', text)
return self
def harvest_element_tree(self, tree):
# Fill in the instance members from the contents of the XML tree.
for child in tree:
self._convert_element_tree_to_member(child)
for attribute, value in iter(tree.attrib.items()):
self._convert_element_attribute_to_member(attribute, value)
# if we have added children to this node
# we consider whitespace insignificant
# and remove/trim/strip whitespace
# and expect to not have actual text content
text = (
tree.text.strip()
if tree.text and self.extension_elements
else tree.text
)
if text:
#print("set_text:", tree.text)
# clear type
#self.clear_type()
self.set_text(text)
# if we have added a text node
# or other children to this node
# remove the nil marker
if text or self.extension_elements:
if XSI_NIL in self.extension_attributes:
del self.extension_attributes[XSI_NIL]
class BaseIDAbstractType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:BaseIDAbstractType element """
c_tag = 'BaseIDAbstractType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_attributes['NameQualifier'] = ('name_qualifier', 'string', False)
c_attributes['SPNameQualifier'] = ('sp_name_qualifier', 'string', False)
def __init__(self,
name_qualifier=None,
sp_name_qualifier=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.name_qualifier = name_qualifier
self.sp_name_qualifier = sp_name_qualifier
class NameIDType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:NameIDType element """
c_tag = 'NameIDType'
c_namespace = NAMESPACE
c_value_type = {'base': 'string'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_attributes['NameQualifier'] = ('name_qualifier', 'string', False)
c_attributes['SPNameQualifier'] = ('sp_name_qualifier', 'string', False)
c_attributes['Format'] = ('format', 'anyURI', False)
c_attributes['SPProvidedID'] = ('sp_provided_id', 'string', False)
def __init__(self,
name_qualifier=None,
sp_name_qualifier=None,
format=None,
sp_provided_id=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.name_qualifier = name_qualifier
self.sp_name_qualifier = sp_name_qualifier
self.format = format
self.sp_provided_id = sp_provided_id
def name_id_type__from_string(xml_string):
return saml2.create_class_from_xml_string(NameIDType_, xml_string)
class EncryptedElementType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:EncryptedElementType element
"""
c_tag = 'EncryptedElementType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{http://www.w3.org/2001/04/xmlenc#}EncryptedData'] = (
'encrypted_data',
xenc.EncryptedData)
c_children['{http://www.w3.org/2001/04/xmlenc#}EncryptedKey'] = (
'encrypted_key',
[xenc.EncryptedKey])
c_cardinality['encrypted_key'] = {"min": 0}
c_child_order.extend(['encrypted_data', 'encrypted_key'])
def __init__(self,
encrypted_data=None,
encrypted_key=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.encrypted_data = encrypted_data
self.encrypted_key = encrypted_key or []
def encrypted_element_type__from_string(xml_string):
return saml2.create_class_from_xml_string(EncryptedElementType_, xml_string)
class EncryptedID(EncryptedElementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:EncryptedID element """
c_tag = 'EncryptedID'
c_namespace = NAMESPACE
c_children = EncryptedElementType_.c_children.copy()
c_attributes = EncryptedElementType_.c_attributes.copy()
c_child_order = EncryptedElementType_.c_child_order[:]
c_cardinality = EncryptedElementType_.c_cardinality.copy()
def encrypted_id_from_string(xml_string):
return saml2.create_class_from_xml_string(EncryptedID, xml_string)
class Issuer(NameIDType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Issuer element """
c_tag = 'Issuer'
c_namespace = NAMESPACE
c_children = NameIDType_.c_children.copy()
c_attributes = NameIDType_.c_attributes.copy()
c_child_order = NameIDType_.c_child_order[:]
c_cardinality = NameIDType_.c_cardinality.copy()
def issuer_from_string(xml_string):
return saml2.create_class_from_xml_string(Issuer, xml_string)
class AssertionIDRef(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AssertionIDRef element """
c_tag = 'AssertionIDRef'
c_namespace = NAMESPACE
c_value_type = {'base': 'NCName'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def assertion_id_ref_from_string(xml_string):
return saml2.create_class_from_xml_string(AssertionIDRef, xml_string)
class AssertionURIRef(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AssertionURIRef element """
c_tag = 'AssertionURIRef'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyURI'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def assertion_uri_ref_from_string(xml_string):
return saml2.create_class_from_xml_string(AssertionURIRef, xml_string)
class SubjectConfirmationDataType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectConfirmationDataType
element """
c_tag = 'SubjectConfirmationDataType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_attributes['NotBefore'] = ('not_before', 'dateTime', False)
c_attributes['NotOnOrAfter'] = ('not_on_or_after', 'dateTime', False)
c_attributes['Recipient'] = ('recipient', 'anyURI', False)
c_attributes['InResponseTo'] = ('in_response_to', 'NCName', False)
c_attributes['Address'] = ('address', 'string', False)
c_any = {"namespace": "##any", "processContents": "lax", "minOccurs": "0",
"maxOccurs": "unbounded"}
c_any_attribute = {"namespace": "##other", "processContents": "lax"}
def __init__(self,
not_before=None,
not_on_or_after=None,
recipient=None,
in_response_to=None,
address=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.not_before = not_before
self.not_on_or_after = not_on_or_after
self.recipient = recipient
self.in_response_to = in_response_to
self.address = address
def subject_confirmation_data_type__from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectConfirmationDataType_,
xml_string)
class KeyInfoConfirmationDataType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:KeyInfoConfirmationDataType
element """
c_tag = 'KeyInfoConfirmationDataType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{http://www.w3.org/2000/09/xmldsig#}KeyInfo'] = ('key_info',
[ds.KeyInfo])
c_cardinality['key_info'] = {"min": 1}
c_child_order.extend(['key_info'])
def __init__(self,
key_info=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.key_info = key_info or []
def key_info_confirmation_data_type__from_string(xml_string):
return saml2.create_class_from_xml_string(KeyInfoConfirmationDataType_,
xml_string)
class ConditionAbstractType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:ConditionAbstractType
element """
c_tag = 'ConditionAbstractType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
class Audience(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Audience element """
c_tag = 'Audience'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyURI'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def audience_from_string(xml_string):
return saml2.create_class_from_xml_string(Audience, xml_string)
class OneTimeUseType_(ConditionAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:OneTimeUseType element """
c_tag = 'OneTimeUseType'
c_namespace = NAMESPACE
c_children = ConditionAbstractType_.c_children.copy()
c_attributes = ConditionAbstractType_.c_attributes.copy()
c_child_order = ConditionAbstractType_.c_child_order[:]
c_cardinality = ConditionAbstractType_.c_cardinality.copy()
def one_time_use_type__from_string(xml_string):
return saml2.create_class_from_xml_string(OneTimeUseType_, xml_string)
class ProxyRestrictionType_(ConditionAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:ProxyRestrictionType element
"""
c_tag = 'ProxyRestrictionType'
c_namespace = NAMESPACE
c_children = ConditionAbstractType_.c_children.copy()
c_attributes = ConditionAbstractType_.c_attributes.copy()
c_child_order = ConditionAbstractType_.c_child_order[:]
c_cardinality = ConditionAbstractType_.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Audience'] = ('audience',
[Audience])
c_cardinality['audience'] = {"min": 0}
c_attributes['Count'] = ('count', 'nonNegativeInteger', False)
c_child_order.extend(['audience'])
def __init__(self,
audience=None,
count=None,
text=None,
extension_elements=None,
extension_attributes=None):
ConditionAbstractType_.__init__(
self, text=text, extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.audience = audience or []
self.count = count
def proxy_restriction_type__from_string(xml_string):
return saml2.create_class_from_xml_string(ProxyRestrictionType_, xml_string)
class EncryptedAssertion(EncryptedElementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:EncryptedAssertion element """
c_tag = 'EncryptedAssertion'
c_namespace = NAMESPACE
c_children = EncryptedElementType_.c_children.copy()
c_attributes = EncryptedElementType_.c_attributes.copy()
c_child_order = EncryptedElementType_.c_child_order[:]
c_cardinality = EncryptedElementType_.c_cardinality.copy()
def encrypted_assertion_from_string(xml_string):
return saml2.create_class_from_xml_string(EncryptedAssertion, xml_string)
class StatementAbstractType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:StatementAbstractType element
"""
c_tag = 'StatementAbstractType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
class SubjectLocalityType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectLocalityType element """
c_tag = 'SubjectLocalityType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_attributes['Address'] = ('address', 'string', False)
c_attributes['DNSName'] = ('dns_name', 'string', False)
def __init__(self,
address=None,
dns_name=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.address = address
self.dns_name = dns_name
def subject_locality_type__from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectLocalityType_, xml_string)
class AuthnContextClassRef(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnContextClassRef element
"""
c_tag = 'AuthnContextClassRef'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyURI'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def authn_context_class_ref_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnContextClassRef, xml_string)
class AuthnContextDeclRef(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnContextDeclRef element """
c_tag = 'AuthnContextDeclRef'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyURI'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def authn_context_decl_ref_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnContextDeclRef, xml_string)
class AuthnContextDecl(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnContextDecl element """
c_tag = 'AuthnContextDecl'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyType'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def authn_context_decl_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnContextDecl, xml_string)
class AuthenticatingAuthority(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthenticatingAuthority
element """
c_tag = 'AuthenticatingAuthority'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyURI'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def authenticating_authority_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthenticatingAuthority,
xml_string)
class DecisionType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:DecisionType element """
c_tag = 'DecisionType'
c_namespace = NAMESPACE
c_value_type = {'base': 'string', 'enumeration': ['Permit', 'Deny',
'Indeterminate']}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def decision_type__from_string(xml_string):
return saml2.create_class_from_xml_string(DecisionType_, xml_string)
class ActionType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:ActionType element """
c_tag = 'ActionType'
c_namespace = NAMESPACE
c_value_type = {'base': 'string'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_attributes['Namespace'] = ('namespace', 'anyURI', True)
def __init__(self,
namespace=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.namespace = namespace
def action_type__from_string(xml_string):
return saml2.create_class_from_xml_string(ActionType_, xml_string)
class AttributeValue(AttributeValueBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AttributeValue element """
c_tag = 'AttributeValue'
c_namespace = NAMESPACE
c_value_type = {'base': 'anyType'}
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
def attribute_value_from_string(xml_string):
return saml2.create_class_from_xml_string(AttributeValue, xml_string)
class EncryptedAttribute(EncryptedElementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:EncryptedAttribute element """
c_tag = 'EncryptedAttribute'
c_namespace = NAMESPACE
c_children = EncryptedElementType_.c_children.copy()
c_attributes = EncryptedElementType_.c_attributes.copy()
c_child_order = EncryptedElementType_.c_child_order[:]
c_cardinality = EncryptedElementType_.c_cardinality.copy()
def encrypted_attribute_from_string(xml_string):
return saml2.create_class_from_xml_string(EncryptedAttribute, xml_string)
class BaseID(BaseIDAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:BaseID element """
c_tag = 'BaseID'
c_namespace = NAMESPACE
c_children = BaseIDAbstractType_.c_children.copy()
c_attributes = BaseIDAbstractType_.c_attributes.copy()
c_child_order = BaseIDAbstractType_.c_child_order[:]
c_cardinality = BaseIDAbstractType_.c_cardinality.copy()
def base_id_from_string(xml_string):
return saml2.create_class_from_xml_string(BaseID, xml_string)
class NameID(NameIDType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:NameID element
From the Oasis SAML2 Technical Overview:
"The <NameID> element within a <Subject> offers the ability to provide name
identifiers in a number of different formats. SAML's predefined formats
include: Email address, X.509 subject name, Windows domain qualified name,
Kerberos principal name, Entity identifier, Persistent identifier,
Transient identifier."
"""
c_tag = 'NameID'
c_namespace = NAMESPACE
c_children = NameIDType_.c_children.copy()
c_attributes = NameIDType_.c_attributes.copy()
c_child_order = NameIDType_.c_child_order[:]
c_cardinality = NameIDType_.c_cardinality.copy()
def name_id_from_string(xml_string):
return saml2.create_class_from_xml_string(NameID, xml_string)
class SubjectConfirmationData(SubjectConfirmationDataType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectConfirmationData
element """
c_tag = 'SubjectConfirmationData'
c_namespace = NAMESPACE
c_children = SubjectConfirmationDataType_.c_children.copy()
c_attributes = SubjectConfirmationDataType_.c_attributes.copy()
c_child_order = SubjectConfirmationDataType_.c_child_order[:]
c_cardinality = SubjectConfirmationDataType_.c_cardinality.copy()
def subject_confirmation_data_from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectConfirmationData,
xml_string)
class Condition(ConditionAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Condition element """
c_tag = 'Condition'
c_namespace = NAMESPACE
c_children = ConditionAbstractType_.c_children.copy()
c_attributes = ConditionAbstractType_.c_attributes.copy()
c_child_order = ConditionAbstractType_.c_child_order[:]
c_cardinality = ConditionAbstractType_.c_cardinality.copy()
def condition_from_string(xml_string):
return saml2.create_class_from_xml_string(Condition, xml_string)
class AudienceRestrictionType_(ConditionAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AudienceRestrictionType
element """
c_tag = 'AudienceRestrictionType'
c_namespace = NAMESPACE
c_children = ConditionAbstractType_.c_children.copy()
c_attributes = ConditionAbstractType_.c_attributes.copy()
c_child_order = ConditionAbstractType_.c_child_order[:]
c_cardinality = ConditionAbstractType_.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Audience'] = ('audience',
[Audience])
c_cardinality['audience'] = {"min": 1}
c_child_order.extend(['audience'])
def __init__(self,
audience=None,
text=None,
extension_elements=None,
extension_attributes=None):
ConditionAbstractType_.__init__(
self, text=text, extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.audience = audience or []
def audience_restriction_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AudienceRestrictionType_,
xml_string)
class OneTimeUse(OneTimeUseType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:OneTimeUse element """
c_tag = 'OneTimeUse'
c_namespace = NAMESPACE
c_children = OneTimeUseType_.c_children.copy()
c_attributes = OneTimeUseType_.c_attributes.copy()
c_child_order = OneTimeUseType_.c_child_order[:]
c_cardinality = OneTimeUseType_.c_cardinality.copy()
def one_time_use_from_string(xml_string):
return saml2.create_class_from_xml_string(OneTimeUse, xml_string)
class ProxyRestriction(ProxyRestrictionType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:ProxyRestriction element """
c_tag = 'ProxyRestriction'
c_namespace = NAMESPACE
c_children = ProxyRestrictionType_.c_children.copy()
c_attributes = ProxyRestrictionType_.c_attributes.copy()
c_child_order = ProxyRestrictionType_.c_child_order[:]
c_cardinality = ProxyRestrictionType_.c_cardinality.copy()
def proxy_restriction_from_string(xml_string):
return saml2.create_class_from_xml_string(ProxyRestriction, xml_string)
class Statement(StatementAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Statement element """
c_tag = 'Statement'
c_namespace = NAMESPACE
c_children = StatementAbstractType_.c_children.copy()
c_attributes = StatementAbstractType_.c_attributes.copy()
c_child_order = StatementAbstractType_.c_child_order[:]
c_cardinality = StatementAbstractType_.c_cardinality.copy()
def statement_from_string(xml_string):
return saml2.create_class_from_xml_string(Statement, xml_string)
class SubjectLocality(SubjectLocalityType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectLocality element """
c_tag = 'SubjectLocality'
c_namespace = NAMESPACE
c_children = SubjectLocalityType_.c_children.copy()
c_attributes = SubjectLocalityType_.c_attributes.copy()
c_child_order = SubjectLocalityType_.c_child_order[:]
c_cardinality = SubjectLocalityType_.c_cardinality.copy()
def verify(self):
if self.address:
# dotted-decimal IPv4 or RFC3513 IPv6 address
if valid_ipv4(self.address) or valid_ipv6(self.address):
pass
else:
raise ShouldValueError("Not an IPv4 or IPv6 address")
elif self.dns_name:
valid_domain_name(self.dns_name)
return SubjectLocalityType_.verify(self)
def subject_locality_from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectLocality, xml_string)
class AuthnContextType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnContextType element """
c_tag = 'AuthnContextType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children[
'{urn:oasis:names:tc:SAML:2.0:assertion}AuthnContextClassRef'] = (
'authn_context_class_ref', AuthnContextClassRef)
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AuthnContextDecl'] = (
'authn_context_decl',
AuthnContextDecl)
c_cardinality['authn_context_decl'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AuthnContextDeclRef'] = (
'authn_context_decl_ref',
AuthnContextDeclRef)
c_cardinality['authn_context_decl_ref'] = {"min": 0, "max": 1}
c_children[
'{urn:oasis:names:tc:SAML:2.0:assertion}AuthenticatingAuthority'] = (
'authenticating_authority', [AuthenticatingAuthority])
c_cardinality['authenticating_authority'] = {"min": 0}
c_child_order.extend(['authn_context_class_ref', 'authn_context_decl',
'authn_context_decl_ref', 'authenticating_authority'])
def __init__(self,
authn_context_class_ref=None,
authn_context_decl=None,
authn_context_decl_ref=None,
authenticating_authority=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.authn_context_class_ref = authn_context_class_ref
self.authn_context_decl = authn_context_decl
self.authn_context_decl_ref = authn_context_decl_ref
self.authenticating_authority = authenticating_authority or []
def verify(self):
if self.authn_context_decl and self.authn_context_decl_ref:
raise Exception(
"Invalid Response: "
"Cannot have both <AuthnContextDecl> and <AuthnContextDeclRef>"
)
return SamlBase.verify(self)
def authn_context_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnContextType_, xml_string)
class Action(ActionType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Action element """
c_tag = 'Action'
c_namespace = NAMESPACE
c_children = ActionType_.c_children.copy()
c_attributes = ActionType_.c_attributes.copy()
c_child_order = ActionType_.c_child_order[:]
c_cardinality = ActionType_.c_cardinality.copy()
def action_from_string(xml_string):
return saml2.create_class_from_xml_string(Action, xml_string)
class AttributeType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AttributeType element """
c_tag = 'AttributeType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AttributeValue'] = (
'attribute_value',
[AttributeValue])
c_cardinality['attribute_value'] = {"min": 0}
c_attributes['Name'] = ('name', 'string', True)
c_attributes['NameFormat'] = ('name_format', 'anyURI', False)
c_attributes['FriendlyName'] = ('friendly_name', 'string', False)
c_child_order.extend(['attribute_value'])
c_any_attribute = {"namespace": "##other", "processContents": "lax"}
def __init__(self,
attribute_value=None,
name=None,
name_format=NAME_FORMAT_URI,
friendly_name=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.attribute_value = attribute_value or []
self.name = name
self.name_format = name_format
self.friendly_name = friendly_name
# when consuming such elements, default to NAME_FORMAT_UNSPECIFIED as NameFormat
def harvest_element_tree(self, tree):
tree.attrib.setdefault('NameFormat', NAME_FORMAT_UNSPECIFIED)
SamlBase.harvest_element_tree(self, tree)
def attribute_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AttributeType_, xml_string)
class SubjectConfirmationType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectConfirmationType
element """
c_tag = 'SubjectConfirmationType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}BaseID'] = ('base_id',
BaseID)
c_cardinality['base_id'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}NameID'] = ('name_id',
NameID)
c_cardinality['name_id'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}EncryptedID'] = (
'encrypted_id',
EncryptedID)
c_cardinality['encrypted_id'] = {"min": 0, "max": 1}
c_children[
'{urn:oasis:names:tc:SAML:2.0:assertion}SubjectConfirmationData'] = (
'subject_confirmation_data', SubjectConfirmationData)
c_cardinality['subject_confirmation_data'] = {"min": 0, "max": 1}
c_attributes['Method'] = ('method', 'anyURI', True)
c_child_order.extend(['base_id', 'name_id', 'encrypted_id',
'subject_confirmation_data'])
def __init__(self,
base_id=None,
name_id=None,
encrypted_id=None,
subject_confirmation_data=None,
method=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.base_id = base_id
self.name_id = name_id
self.encrypted_id = encrypted_id
self.subject_confirmation_data = subject_confirmation_data
self.method = method
def subject_confirmation_type__from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectConfirmationType_,
xml_string)
class AudienceRestriction(AudienceRestrictionType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AudienceRestriction element """
c_tag = 'AudienceRestriction'
c_namespace = NAMESPACE
c_children = AudienceRestrictionType_.c_children.copy()
c_attributes = AudienceRestrictionType_.c_attributes.copy()
c_child_order = AudienceRestrictionType_.c_child_order[:]
c_cardinality = AudienceRestrictionType_.c_cardinality.copy()
def audience_restriction_from_string(xml_string):
return saml2.create_class_from_xml_string(AudienceRestriction, xml_string)
class AuthnContext(AuthnContextType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnContext element """
c_tag = 'AuthnContext'
c_namespace = NAMESPACE
c_children = AuthnContextType_.c_children.copy()
c_attributes = AuthnContextType_.c_attributes.copy()
c_child_order = AuthnContextType_.c_child_order[:]
c_cardinality = AuthnContextType_.c_cardinality.copy()
def authn_context_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnContext, xml_string)
class Attribute(AttributeType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Attribute element """
c_tag = 'Attribute'
c_namespace = NAMESPACE
c_children = AttributeType_.c_children.copy()
c_attributes = AttributeType_.c_attributes.copy()
c_child_order = AttributeType_.c_child_order[:]
c_cardinality = AttributeType_.c_cardinality.copy()
def attribute_from_string(xml_string):
return saml2.create_class_from_xml_string(Attribute, xml_string)
class SubjectConfirmation(SubjectConfirmationType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectConfirmation element """
c_tag = 'SubjectConfirmation'
c_namespace = NAMESPACE
c_children = SubjectConfirmationType_.c_children.copy()
c_attributes = SubjectConfirmationType_.c_attributes.copy()
c_child_order = SubjectConfirmationType_.c_child_order[:]
c_cardinality = SubjectConfirmationType_.c_cardinality.copy()
def subject_confirmation_from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectConfirmation, xml_string)
class ConditionsType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:ConditionsType element """
c_tag = 'ConditionsType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Condition'] = (
'condition',
[Condition])
c_cardinality['condition'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AudienceRestriction'] = (
'audience_restriction',
[AudienceRestriction])
c_cardinality['audience_restriction'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}OneTimeUse'] = (
'one_time_use',
[OneTimeUse])
c_cardinality['one_time_use'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}ProxyRestriction'] = (
'proxy_restriction',
[ProxyRestriction])
c_cardinality['proxy_restriction'] = {"min": 0}
c_attributes['NotBefore'] = ('not_before', 'dateTime', False)
c_attributes['NotOnOrAfter'] = ('not_on_or_after', 'dateTime', False)
c_child_order.extend(['condition', 'audience_restriction', 'one_time_use',
'proxy_restriction'])
def __init__(self,
condition=None,
audience_restriction=None,
one_time_use=None,
proxy_restriction=None,
not_before=None,
not_on_or_after=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.condition = condition or []
self.audience_restriction = audience_restriction or []
self.one_time_use = one_time_use or []
self.proxy_restriction = proxy_restriction or []
self.not_before = not_before
self.not_on_or_after = not_on_or_after
def verify(self):
if self.one_time_use:
if len(self.one_time_use) != 1:
raise Exception("Cannot be used more than once")
if self.proxy_restriction:
if len(self.proxy_restriction) != 1:
raise Exception("Cannot be used more than once")
return SamlBase.verify(self)
def conditions_type__from_string(xml_string):
return saml2.create_class_from_xml_string(ConditionsType_, xml_string)
class AuthnStatementType_(StatementAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnStatementType element """
c_tag = 'AuthnStatementType'
c_namespace = NAMESPACE
c_children = StatementAbstractType_.c_children.copy()
c_attributes = StatementAbstractType_.c_attributes.copy()
c_child_order = StatementAbstractType_.c_child_order[:]
c_cardinality = StatementAbstractType_.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}SubjectLocality'] = (
'subject_locality', SubjectLocality)
c_cardinality['subject_locality'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AuthnContext'] = (
'authn_context', AuthnContext)
c_attributes['AuthnInstant'] = ('authn_instant', 'dateTime', True)
c_attributes['SessionIndex'] = ('session_index', 'string', False)
c_attributes['SessionNotOnOrAfter'] = ('session_not_on_or_after',
'dateTime', False)
c_child_order.extend(['subject_locality', 'authn_context'])
def __init__(self,
subject_locality=None,
authn_context=None,
authn_instant=None,
session_index=None,
session_not_on_or_after=None,
text=None,
extension_elements=None,
extension_attributes=None):
StatementAbstractType_.__init__(
self, text=text, extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.subject_locality = subject_locality
self.authn_context = authn_context
self.authn_instant = authn_instant
self.session_index = session_index
self.session_not_on_or_after = session_not_on_or_after
def authn_statement_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnStatementType_, xml_string)
class AttributeStatementType_(StatementAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AttributeStatementType
element """
c_tag = 'AttributeStatementType'
c_namespace = NAMESPACE
c_children = StatementAbstractType_.c_children.copy()
c_attributes = StatementAbstractType_.c_attributes.copy()
c_child_order = StatementAbstractType_.c_child_order[:]
c_cardinality = StatementAbstractType_.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Attribute'] = (
'attribute',
[Attribute])
c_cardinality['attribute'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}EncryptedAttribute'] = (
'encrypted_attribute',
[EncryptedAttribute])
c_cardinality['encrypted_attribute'] = {"min": 0}
c_child_order.extend(['attribute', 'encrypted_attribute'])
def __init__(self,
attribute=None,
encrypted_attribute=None,
text=None,
extension_elements=None,
extension_attributes=None):
StatementAbstractType_.__init__(
self, text=text, extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.attribute = attribute or []
self.encrypted_attribute = encrypted_attribute or []
def attribute_statement_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AttributeStatementType_,
xml_string)
class SubjectType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:SubjectType element """
c_tag = 'SubjectType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}BaseID'] = ('base_id',
BaseID)
c_cardinality['base_id'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}NameID'] = ('name_id',
NameID)
c_cardinality['name_id'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}EncryptedID'] = (
'encrypted_id', EncryptedID)
c_cardinality['encrypted_id'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}SubjectConfirmation'] = (
'subject_confirmation', [SubjectConfirmation])
c_cardinality['subject_confirmation'] = {"min": 0}
c_child_order.extend(['base_id', 'name_id', 'encrypted_id',
'subject_confirmation'])
def __init__(self,
base_id=None,
name_id=None,
encrypted_id=None,
subject_confirmation=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.base_id = base_id
self.name_id = name_id
self.encrypted_id = encrypted_id
self.subject_confirmation = subject_confirmation or []
def subject_type__from_string(xml_string):
return saml2.create_class_from_xml_string(SubjectType_, xml_string)
class Conditions(ConditionsType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Conditions element """
c_tag = 'Conditions'
c_namespace = NAMESPACE
c_children = ConditionsType_.c_children.copy()
c_attributes = ConditionsType_.c_attributes.copy()
c_child_order = ConditionsType_.c_child_order[:]
c_cardinality = ConditionsType_.c_cardinality.copy()
def conditions_from_string(xml_string):
return saml2.create_class_from_xml_string(Conditions, xml_string)
class AuthnStatement(AuthnStatementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthnStatement element """
c_tag = 'AuthnStatement'
c_namespace = NAMESPACE
c_children = AuthnStatementType_.c_children.copy()
c_attributes = AuthnStatementType_.c_attributes.copy()
c_child_order = AuthnStatementType_.c_child_order[:]
c_cardinality = AuthnStatementType_.c_cardinality.copy()
def authn_statement_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthnStatement, xml_string)
class AttributeStatement(AttributeStatementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AttributeStatement element """
c_tag = 'AttributeStatement'
c_namespace = NAMESPACE
c_children = AttributeStatementType_.c_children.copy()
c_attributes = AttributeStatementType_.c_attributes.copy()
c_child_order = AttributeStatementType_.c_child_order[:]
c_cardinality = AttributeStatementType_.c_cardinality.copy()
def attribute_statement_from_string(xml_string):
return saml2.create_class_from_xml_string(AttributeStatement, xml_string)
class Subject(SubjectType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Subject element """
c_tag = 'Subject'
c_namespace = NAMESPACE
c_children = SubjectType_.c_children.copy()
c_attributes = SubjectType_.c_attributes.copy()
c_child_order = SubjectType_.c_child_order[:]
c_cardinality = SubjectType_.c_cardinality.copy()
def subject_from_string(xml_string):
return saml2.create_class_from_xml_string(Subject, xml_string)
#..................
# ['AuthzDecisionStatement', 'EvidenceType', 'AdviceType', 'Evidence',
# 'Assertion', 'AssertionType', 'AuthzDecisionStatementType', 'Advice']
class EvidenceType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:EvidenceType element """
c_tag = 'EvidenceType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AssertionIDRef'] = (
'assertion_id_ref', [AssertionIDRef])
c_cardinality['assertion_id_ref'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AssertionURIRef'] = (
'assertion_uri_ref', [AssertionURIRef])
c_cardinality['assertion_uri_ref'] = {"min": 0}
c_cardinality['assertion'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}EncryptedAssertion'] = (
'encrypted_assertion', [EncryptedAssertion])
c_cardinality['encrypted_assertion'] = {"min": 0}
c_child_order.extend(['assertion_id_ref', 'assertion_uri_ref', 'assertion',
'encrypted_assertion'])
def __init__(self,
assertion_id_ref=None,
assertion_uri_ref=None,
assertion=None,
encrypted_assertion=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.assertion_id_ref = assertion_id_ref or []
self.assertion_uri_ref = assertion_uri_ref or []
self.assertion = assertion or []
self.encrypted_assertion = encrypted_assertion or []
def evidence_type__from_string(xml_string):
return saml2.create_class_from_xml_string(EvidenceType_, xml_string)
class Evidence(EvidenceType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Evidence element """
c_tag = 'Evidence'
c_namespace = NAMESPACE
c_children = EvidenceType_.c_children.copy()
c_attributes = EvidenceType_.c_attributes.copy()
c_child_order = EvidenceType_.c_child_order[:]
c_cardinality = EvidenceType_.c_cardinality.copy()
def evidence_from_string(xml_string):
return saml2.create_class_from_xml_string(Evidence, xml_string)
class AuthzDecisionStatementType_(StatementAbstractType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthzDecisionStatementType
element """
c_tag = 'AuthzDecisionStatementType'
c_namespace = NAMESPACE
c_children = StatementAbstractType_.c_children.copy()
c_attributes = StatementAbstractType_.c_attributes.copy()
c_child_order = StatementAbstractType_.c_child_order[:]
c_cardinality = StatementAbstractType_.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Action'] = (
'action', [Action])
c_cardinality['action'] = {"min": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Evidence'] = (
'evidence', Evidence)
c_cardinality['evidence'] = {"min": 0, "max": 1}
c_attributes['Resource'] = ('resource', 'anyURI', True)
c_attributes['Decision'] = ('decision', DecisionType_, True)
c_child_order.extend(['action', 'evidence'])
def __init__(self,
action=None,
evidence=None,
resource=None,
decision=None,
text=None,
extension_elements=None,
extension_attributes=None):
StatementAbstractType_.__init__(
self, text=text, extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.action = action or []
self.evidence = evidence
self.resource = resource
self.decision = decision
def authz_decision_statement_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AuthzDecisionStatementType_,
xml_string)
class AuthzDecisionStatement(AuthzDecisionStatementType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AuthzDecisionStatement
element """
c_tag = 'AuthzDecisionStatement'
c_namespace = NAMESPACE
c_children = AuthzDecisionStatementType_.c_children.copy()
c_attributes = AuthzDecisionStatementType_.c_attributes.copy()
c_child_order = AuthzDecisionStatementType_.c_child_order[:]
c_cardinality = AuthzDecisionStatementType_.c_cardinality.copy()
def authz_decision_statement_from_string(xml_string):
return saml2.create_class_from_xml_string(AuthzDecisionStatement,
xml_string)
#..................
# ['Assertion', 'AssertionType', 'AdviceType', 'Advice']
class AssertionType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AssertionType element """
c_tag = 'AssertionType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Issuer'] = ('issuer',
Issuer)
c_children['{http://www.w3.org/2000/09/xmldsig#}Signature'] = ('signature',
ds.Signature)
c_cardinality['signature'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Subject'] = ('subject',
Subject)
c_cardinality['subject'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Conditions'] = (
'conditions', Conditions)
c_cardinality['conditions'] = {"min": 0, "max": 1}
c_cardinality['advice'] = {"min": 0, "max": 1}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Statement'] = (
'statement', [Statement])
c_cardinality['statement'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AuthnStatement'] = (
'authn_statement', [AuthnStatement])
c_cardinality['authn_statement'] = {"min": 0}
c_children[
'{urn:oasis:names:tc:SAML:2.0:assertion}AuthzDecisionStatement'] = (
'authz_decision_statement', [AuthzDecisionStatement])
c_cardinality['authz_decision_statement'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AttributeStatement'] = (
'attribute_statement', [AttributeStatement])
c_cardinality['attribute_statement'] = {"min": 0}
c_attributes['Version'] = ('version', 'string', True)
c_attributes['ID'] = ('id', 'ID', True)
c_attributes['IssueInstant'] = ('issue_instant', 'dateTime', True)
c_child_order.extend(['issuer', 'signature', 'subject', 'conditions',
'advice', 'statement', 'authn_statement',
'authz_decision_statement', 'attribute_statement'])
def __init__(self,
issuer=None,
signature=None,
subject=None,
conditions=None,
advice=None,
statement=None,
authn_statement=None,
authz_decision_statement=None,
attribute_statement=None,
version=None,
id=None,
issue_instant=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.issuer = issuer
self.signature = signature
self.subject = subject
self.conditions = conditions
self.advice = advice
self.statement = statement or []
self.authn_statement = authn_statement or []
self.authz_decision_statement = authz_decision_statement or []
self.attribute_statement = attribute_statement or []
self.version = version
self.id = id
self.issue_instant = issue_instant
def verify(self):
# If no statement MUST contain a subject element
if self.attribute_statement or self.statement or \
self.authn_statement or self.authz_decision_statement:
pass
elif not self.subject:
raise MustValueError(
"If no statement MUST contain a subject element")
if self.authn_statement and not self.subject:
raise MustValueError(
"An assertion with an AuthnStatement must contain a Subject")
return SamlBase.verify(self)
def assertion_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AssertionType_, xml_string)
class Assertion(AssertionType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Assertion element """
c_tag = 'Assertion'
c_namespace = NAMESPACE
c_children = AssertionType_.c_children.copy()
c_attributes = AssertionType_.c_attributes.copy()
c_child_order = AssertionType_.c_child_order[:]
c_cardinality = AssertionType_.c_cardinality.copy()
def assertion_from_string(xml_string):
return saml2.create_class_from_xml_string(Assertion, xml_string)
class AdviceType_(SamlBase):
"""The urn:oasis:names:tc:SAML:2.0:assertion:AdviceType element """
c_tag = 'AdviceType'
c_namespace = NAMESPACE
c_children = SamlBase.c_children.copy()
c_attributes = SamlBase.c_attributes.copy()
c_child_order = SamlBase.c_child_order[:]
c_cardinality = SamlBase.c_cardinality.copy()
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AssertionIDRef'] = (
'assertion_id_ref', [AssertionIDRef])
c_cardinality['assertion_id_ref'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}AssertionURIRef'] = (
'assertion_uri_ref', [AssertionURIRef])
c_cardinality['assertion_uri_ref'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Assertion'] = (
'assertion', [Assertion])
c_cardinality['assertion'] = {"min": 0}
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}EncryptedAssertion'] = (
'encrypted_assertion', [EncryptedAssertion])
c_cardinality['encrypted_assertion'] = {"min": 0}
c_child_order.extend(['assertion_id_ref', 'assertion_uri_ref', 'assertion',
'encrypted_assertion'])
c_any = {"namespace": "##other", "processContents": "lax"}
def __init__(self,
assertion_id_ref=None,
assertion_uri_ref=None,
assertion=None,
encrypted_assertion=None,
text=None,
extension_elements=None,
extension_attributes=None):
SamlBase.__init__(self,
text=text,
extension_elements=extension_elements,
extension_attributes=extension_attributes)
self.assertion_id_ref = assertion_id_ref or []
self.assertion_uri_ref = assertion_uri_ref or []
self.assertion = assertion or []
self.encrypted_assertion = encrypted_assertion or []
def advice_type__from_string(xml_string):
return saml2.create_class_from_xml_string(AdviceType_, xml_string)
class Advice(AdviceType_):
"""The urn:oasis:names:tc:SAML:2.0:assertion:Advice element """
c_tag = 'Advice'
c_namespace = NAMESPACE
c_children = AdviceType_.c_children.copy()
c_attributes = AdviceType_.c_attributes.copy()
c_child_order = AdviceType_.c_child_order[:]
c_cardinality = AdviceType_.c_cardinality.copy()
def advice_from_string(xml_string):
return saml2.create_class_from_xml_string(Advice, xml_string)
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
EvidenceType_.c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Assertion'] = (
'assertion', [Assertion])
Evidence.c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Assertion'] = (
'assertion', [Assertion])
AssertionType_.c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Advice'] = (
'advice', Advice)
Assertion.c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Advice'] = (
'advice', Advice)
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
AG_IDNameQualifiers = [
('NameQualifier', 'string', False),
('SPNameQualifier', 'string', False),
]
ELEMENT_FROM_STRING = {
BaseID.c_tag: base_id_from_string,
NameID.c_tag: name_id_from_string,
NameIDType_.c_tag: name_id_type__from_string,
EncryptedElementType_.c_tag: encrypted_element_type__from_string,
EncryptedID.c_tag: encrypted_id_from_string,
Issuer.c_tag: issuer_from_string,
AssertionIDRef.c_tag: assertion_id_ref_from_string,
AssertionURIRef.c_tag: assertion_uri_ref_from_string,
Assertion.c_tag: assertion_from_string,
AssertionType_.c_tag: assertion_type__from_string,
Subject.c_tag: subject_from_string,
SubjectType_.c_tag: subject_type__from_string,
SubjectConfirmation.c_tag: subject_confirmation_from_string,
SubjectConfirmationType_.c_tag: subject_confirmation_type__from_string,
SubjectConfirmationData.c_tag: subject_confirmation_data_from_string,
SubjectConfirmationDataType_.c_tag:
subject_confirmation_data_type__from_string,
KeyInfoConfirmationDataType_.c_tag:
key_info_confirmation_data_type__from_string,
Conditions.c_tag: conditions_from_string,
ConditionsType_.c_tag: conditions_type__from_string,
Condition.c_tag: condition_from_string,
AudienceRestriction.c_tag: audience_restriction_from_string,
AudienceRestrictionType_.c_tag: audience_restriction_type__from_string,
Audience.c_tag: audience_from_string,
OneTimeUse.c_tag: one_time_use_from_string,
OneTimeUseType_.c_tag: one_time_use_type__from_string,
ProxyRestriction.c_tag: proxy_restriction_from_string,
ProxyRestrictionType_.c_tag: proxy_restriction_type__from_string,
Advice.c_tag: advice_from_string,
AdviceType_.c_tag: advice_type__from_string,
EncryptedAssertion.c_tag: encrypted_assertion_from_string,
Statement.c_tag: statement_from_string,
AuthnStatement.c_tag: authn_statement_from_string,
AuthnStatementType_.c_tag: authn_statement_type__from_string,
SubjectLocality.c_tag: subject_locality_from_string,
SubjectLocalityType_.c_tag: subject_locality_type__from_string,
AuthnContext.c_tag: authn_context_from_string,
AuthnContextType_.c_tag: authn_context_type__from_string,
AuthnContextClassRef.c_tag: authn_context_class_ref_from_string,
AuthnContextDeclRef.c_tag: authn_context_decl_ref_from_string,
AuthnContextDecl.c_tag: authn_context_decl_from_string,
AuthenticatingAuthority.c_tag: authenticating_authority_from_string,
AuthzDecisionStatement.c_tag: authz_decision_statement_from_string,
AuthzDecisionStatementType_.c_tag:
authz_decision_statement_type__from_string,
DecisionType_.c_tag: decision_type__from_string,
Action.c_tag: action_from_string,
ActionType_.c_tag: action_type__from_string,
Evidence.c_tag: evidence_from_string,
EvidenceType_.c_tag: evidence_type__from_string,
AttributeStatement.c_tag: attribute_statement_from_string,
AttributeStatementType_.c_tag: attribute_statement_type__from_string,
Attribute.c_tag: attribute_from_string,
AttributeType_.c_tag: attribute_type__from_string,
AttributeValue.c_tag: attribute_value_from_string,
EncryptedAttribute.c_tag: encrypted_attribute_from_string,
}
ELEMENT_BY_TAG = {
'BaseID': BaseID,
'NameID': NameID,
'NameIDType': NameIDType_,
'EncryptedElementType': EncryptedElementType_,
'EncryptedID': EncryptedID,
'Issuer': Issuer,
'AssertionIDRef': AssertionIDRef,
'AssertionURIRef': AssertionURIRef,
'Assertion': Assertion,
'AssertionType': AssertionType_,
'Subject': Subject,
'SubjectType': SubjectType_,
'SubjectConfirmation': SubjectConfirmation,
'SubjectConfirmationType': SubjectConfirmationType_,
'SubjectConfirmationData': SubjectConfirmationData,
'SubjectConfirmationDataType': SubjectConfirmationDataType_,
'KeyInfoConfirmationDataType': KeyInfoConfirmationDataType_,
'Conditions': Conditions,
'ConditionsType': ConditionsType_,
'Condition': Condition,
'AudienceRestriction': AudienceRestriction,
'AudienceRestrictionType': AudienceRestrictionType_,
'Audience': Audience,
'OneTimeUse': OneTimeUse,
'OneTimeUseType': OneTimeUseType_,
'ProxyRestriction': ProxyRestriction,
'ProxyRestrictionType': ProxyRestrictionType_,
'Advice': Advice,
'AdviceType': AdviceType_,
'EncryptedAssertion': EncryptedAssertion,
'Statement': Statement,
'AuthnStatement': AuthnStatement,
'AuthnStatementType': AuthnStatementType_,
'SubjectLocality': SubjectLocality,
'SubjectLocalityType': SubjectLocalityType_,
'AuthnContext': AuthnContext,
'AuthnContextType': AuthnContextType_,
'AuthnContextClassRef': AuthnContextClassRef,
'AuthnContextDeclRef': AuthnContextDeclRef,
'AuthnContextDecl': AuthnContextDecl,
'AuthenticatingAuthority': AuthenticatingAuthority,
'AuthzDecisionStatement': AuthzDecisionStatement,
'AuthzDecisionStatementType': AuthzDecisionStatementType_,
'DecisionType': DecisionType_,
'Action': Action,
'ActionType': ActionType_,
'Evidence': Evidence,
'EvidenceType': EvidenceType_,
'AttributeStatement': AttributeStatement,
'AttributeStatementType': AttributeStatementType_,
'Attribute': Attribute,
'AttributeType': AttributeType_,
'AttributeValue': AttributeValue,
'EncryptedAttribute': EncryptedAttribute,
'BaseIDAbstractType': BaseIDAbstractType_,
'ConditionAbstractType': ConditionAbstractType_,
'StatementAbstractType': StatementAbstractType_,
}
def factory(tag, **kwargs):
return ELEMENT_BY_TAG[tag](**kwargs)
| 38.185942 | 98 | 0.6752 | 8,490 | 76,601 | 5.754888 | 0.061602 | 0.029841 | 0.029043 | 0.038683 | 0.584478 | 0.513764 | 0.485479 | 0.472994 | 0.458196 | 0.411449 | 0 | 0.008473 | 0.223483 | 76,601 | 2,005 | 99 | 38.204988 | 0.812935 | 0.104711 | 0 | 0.402315 | 1 | 0.000681 | 0.141912 | 0.066663 | 0 | 0 | 0 | 0 | 0.078965 | 1 | 0.060585 | false | 0.005446 | 0.008169 | 0.03744 | 0.394826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
866017a177effed366d9a7810ad090cd23a963da | 1,163 | py | Python | ROS_packages/custom_ROS_envs/turtlebot2_maze_env/src/turtlebot2_maze_random.py | PierreExeter/custom_gym_envs | 2b6a1c16a4198c8d9fa64f10fe09a041826ac81a | [
"MIT"
] | 1 | 2020-09-25T01:51:58.000Z | 2020-09-25T01:51:58.000Z | ROS_packages/custom_ROS_envs/turtlebot2_maze_env/src/turtlebot2_maze_random.py | PierreExeter/custom_gym_envs | 2b6a1c16a4198c8d9fa64f10fe09a041826ac81a | [
"MIT"
] | null | null | null | ROS_packages/custom_ROS_envs/turtlebot2_maze_env/src/turtlebot2_maze_random.py | PierreExeter/custom_gym_envs | 2b6a1c16a4198c8d9fa64f10fe09a041826ac81a | [
"MIT"
] | 1 | 2021-07-16T02:55:59.000Z | 2021-07-16T02:55:59.000Z | #!/usr/bin/env python
import gym
import rospy
from openai_ros.openai_ros_common import StartOpenAI_ROS_Environment
# initialise environment
rospy.init_node('turtlebot2_maze_random', anonymous=True, log_level=rospy.WARN)
task_and_robot_environment_name = rospy.get_param('/turtlebot2/task_and_robot_environment_name')
env = StartOpenAI_ROS_Environment(task_and_robot_environment_name)
print("Environment: ", env)
print("Action space: ", env.action_space)
# print(env.action_space.high)
# print(env.action_space.low)
print("Observation space: ", env.observation_space)
print(env.observation_space.high)
print(env.observation_space.low)
for episode in range(20):
env.reset()
for t in range(100):
action = env.action_space.sample()
obs, reward, done, info = env.step(action)
print("episode: ", episode)
print("timestep: ", t)
print("obs: ", obs)
print("action:", action)
print("reward: ", reward)
print("done: ", done)
print("info: ", info)
if done:
print("Episode {} finished after {} timesteps".format(episode, t+1))
break
env.close() | 27.690476 | 96 | 0.687016 | 149 | 1,163 | 5.154362 | 0.395973 | 0.071615 | 0.072917 | 0.089844 | 0.105469 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008466 | 0.187446 | 1,163 | 42 | 97 | 27.690476 | 0.804233 | 0.085985 | 0 | 0 | 0 | 0 | 0.188679 | 0.061321 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.481481 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
86609708c6740fc5dcff69f746034012abb3d227 | 1,112 | py | Python | 01_basics/01_building_expressions/02_vector_mat_soln.py | johny-c/theano_exercises | 7fd43315bf7c475a6f218091316c0bd34e0688c4 | [
"BSD-3-Clause"
] | 711 | 2015-01-10T05:39:21.000Z | 2022-03-15T23:45:45.000Z | 01_basics/01_building_expressions/02_vector_mat_soln.py | rsingh2083/theano_exercises | 7fd43315bf7c475a6f218091316c0bd34e0688c4 | [
"BSD-3-Clause"
] | 2 | 2016-06-13T06:46:58.000Z | 2017-04-14T08:21:20.000Z | 01_basics/01_building_expressions/02_vector_mat_soln.py | rsingh2083/theano_exercises | 7fd43315bf7c475a6f218091316c0bd34e0688c4 | [
"BSD-3-Clause"
] | 371 | 2015-01-16T01:31:41.000Z | 2022-03-15T11:37:30.000Z | import numpy as np
from theano import function
import theano.tensor as T
def make_vector():
"""
Returns a new Theano vector.
"""
return T.vector()
def make_matrix():
"""
Returns a new Theano matrix.
"""
return T.matrix()
def elemwise_mul(a, b):
"""
a: A theano matrix
b: A theano matrix
Returns the elementwise product of a and b
"""
return a * b
def matrix_vector_mul(a, b):
"""
a: A theano matrix
b: A theano vector
Returns the matrix-vector product of a and b
"""
return T.dot(a, b)
if __name__ == "__main__":
a = make_vector()
b = make_vector()
c = elemwise_mul(a, b)
d = make_matrix()
e = matrix_vector_mul(d, c)
f = function([a, b, d], e)
rng = np.random.RandomState([1, 2, 3])
a_value = rng.randn(5).astype(a.dtype)
b_value = rng.rand(5).astype(b.dtype)
c_value = a_value * b_value
d_value = rng.randn(5, 5).astype(d.dtype)
expected = np.dot(d_value, c_value)
actual = f(a_value, b_value, d_value)
assert np.allclose(actual, expected)
print "SUCCESS!"
| 19.508772 | 48 | 0.607014 | 176 | 1,112 | 3.664773 | 0.289773 | 0.018605 | 0.023256 | 0.052713 | 0.20155 | 0.20155 | 0.083721 | 0.083721 | 0.083721 | 0.083721 | 0 | 0.008589 | 0.267086 | 1,112 | 56 | 49 | 19.857143 | 0.782822 | 0 | 0 | 0 | 0 | 0 | 0.019656 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0 | null | null | 0 | 0.111111 | null | null | 0.037037 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
866731500bf9de7d963d33a61b133cfd0fb18eda | 1,727 | py | Python | examples/src/python/join_streamlet_topology.py | aaronstjohn/incubator-heron | bdc35f8d23296472983956a477ea38da54d16b2b | [
"Apache-2.0"
] | 2 | 2016-07-04T07:10:31.000Z | 2018-03-28T16:59:02.000Z | examples/src/python/join_streamlet_topology.py | aaronstjohn/incubator-heron | bdc35f8d23296472983956a477ea38da54d16b2b | [
"Apache-2.0"
] | 1 | 2019-05-08T22:30:16.000Z | 2019-05-08T22:30:16.000Z | examples/src/python/join_streamlet_topology.py | aaronstjohn/incubator-heron | bdc35f8d23296472983956a477ea38da54d16b2b | [
"Apache-2.0"
] | 1 | 2017-06-05T17:55:45.000Z | 2017-06-05T17:55:45.000Z | #!/usr/bin/env python
# -*- encoding: utf-8 -*-
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
'''join_streamlet_topology.py: module is an example of how to use the join operator'''
import sys
from heronpy.streamlet.builder import Builder
from heronpy.streamlet.runner import Runner
from heronpy.streamlet.config import Config
from heronpy.streamlet.windowconfig import WindowConfig
from heronpy.connectors.mock.arraylooper import ArrayLooper
# pylint: disable=superfluous-parens
if __name__ == '__main__':
if len(sys.argv) != 2:
print("Topology's name is not specified")
sys.exit(1)
builder = Builder()
source_1 = builder.new_source(ArrayLooper([["key1", "a"], ["key1", "b"]], sleep=1))
source_2 = builder.new_source(ArrayLooper([["key1", "c"], ["key1", "d"]], sleep=1))
source_1.join(source_2, WindowConfig.create_sliding_window(2, 1), lambda x, y: x + y).log()
runner = Runner()
config = Config()
runner.run(sys.argv[1], config, builder)
| 35.979167 | 93 | 0.735379 | 251 | 1,727 | 4.988048 | 0.505976 | 0.047923 | 0.063898 | 0.025559 | 0.049521 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013774 | 0.159236 | 1,727 | 47 | 94 | 36.744681 | 0.848485 | 0.536769 | 0 | 0 | 0 | 0 | 0.07732 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.352941 | 0 | 0.352941 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
86695a9d77f6427fc910ad0a37e4e6d95359ee20 | 1,918 | py | Python | tests/test.py | Nekmo/spice | 717a2cc24ad969e1caec2aabeffc30a796c6ec91 | [
"MIT"
] | null | null | null | tests/test.py | Nekmo/spice | 717a2cc24ad969e1caec2aabeffc30a796c6ec91 | [
"MIT"
] | null | null | null | tests/test.py | Nekmo/spice | 717a2cc24ad969e1caec2aabeffc30a796c6ec91 | [
"MIT"
] | null | null | null | from bs4 import BeautifulSoup
import requests
import sys, os
from time import sleep
sys.path.insert(0, '/home/may/Dropbox/Programming/spice/')
import spice_api as spice
def main():
creds = spice.load_auth_from_file('auth')
print(creds)
results = spice.search('Re:Zero Kara Hajimeru Isekai Seikatsu', spice.get_medium('anime'), creds)
print(results[0].title)
souma = spice.search_id(1, spice.get_medium('manga'), creds)
print(souma.raw_data)
print(souma.title)
print(souma.chapters)
print(souma.volumes)
re_zero_data = spice.get_blank(spice.get_medium('anime'))
re_zero_data.episodes = 0
re_zero_data.status = spice.get_status('reading')
re_zero_data.score = 8
re_zero_data.tags = ['this the first time a show that made me cringe']
shokugeki_data = spice.get_blank(spice.get_medium('manga'))
shokugeki_data.chapters = 13
shokugeki_data.volumes = 1
shokugeki_data.status = 1
shokugeki_data.score = 8
spice.update(shokugeki_data, 45757, spice.get_medium('manga'), creds)
anime_list = spice.get_list(spice.get_medium('ANIME'), 'Utagai-', creds)
print(anime_list.avg_score())
print(anime_list.median_score())
print(anime_list.mode_score())
print(anime_list.extremes())
print(anime_list.p_stddev())
print(anime_list.p_var())
print(anime_list.get_num_status(1))
print(anime_list.get_total())
print(anime_list.get_days())
print(anime_list.exists(11734))
print(len(anime_list.get_ids()))
print(len(anime_list.get_titles()))
print(anime_list.get_status(1))
print(anime_list.get_score(10))
print(anime_list.exists_as_status(11734, 1))
print(anime_list.score_diff())
anime_list2 = spice.get_list(spice.get_medium('ANIME'), 'Pickleplatter', creds)
print("Similarity coefficient: {}".format(anime_list.compatibility(anime_list2)))
if __name__ == '__main__':
main()
| 34.25 | 101 | 0.717414 | 279 | 1,918 | 4.65233 | 0.333333 | 0.124807 | 0.151002 | 0.065485 | 0.200308 | 0.132512 | 0.095532 | 0 | 0 | 0 | 0 | 0.020258 | 0.150678 | 1,918 | 55 | 102 | 34.872727 | 0.77655 | 0 | 0 | 0 | 0 | 0 | 0.114181 | 0.01877 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020833 | false | 0 | 0.104167 | 0 | 0.125 | 0.479167 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
866b10a749a0729a15fdb56c60d4ea92a50f3344 | 3,270 | py | Python | neutron/db/models/l3ha.py | cleo4zheng/neutron | 6d65318308edfd984bdd0ff1ac7fef9486a040f7 | [
"Apache-2.0"
] | 4 | 2018-08-05T00:43:03.000Z | 2021-10-13T00:45:45.000Z | neutron/db/models/l3ha.py | cleo4zheng/neutron | 6d65318308edfd984bdd0ff1ac7fef9486a040f7 | [
"Apache-2.0"
] | 8 | 2018-06-14T14:50:16.000Z | 2018-11-13T16:30:42.000Z | neutron/db/models/l3ha.py | cleo4zheng/neutron | 6d65318308edfd984bdd0ff1ac7fef9486a040f7 | [
"Apache-2.0"
] | 7 | 2018-06-12T18:57:04.000Z | 2019-05-09T15:42:30.000Z | # Copyright (C) 2014 eNovance SAS <licensing@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
from neutron_lib.db import model_base
import sqlalchemy as sa
from sqlalchemy import orm
from neutron.common import constants as n_const
from neutron.db.models import agent as agent_model
from neutron.db import models_v2
class L3HARouterAgentPortBinding(model_base.BASEV2):
"""Represent agent binding state of a HA router port.
A HA Router has one HA port per agent on which it is spawned.
This binding table stores which port is used for a HA router by a
L3 agent.
"""
__tablename__ = 'ha_router_agent_port_bindings'
__table_args__ = (
sa.UniqueConstraint(
'router_id', 'l3_agent_id',
name='uniq_ha_router_agent_port_bindings0port_id0l3_agent_id'),
model_base.BASEV2.__table_args__
)
port_id = sa.Column(sa.String(36), sa.ForeignKey('ports.id',
ondelete='CASCADE'),
nullable=False, primary_key=True)
port = orm.relationship(models_v2.Port)
router_id = sa.Column(sa.String(36), sa.ForeignKey('routers.id',
ondelete='CASCADE'),
nullable=False)
l3_agent_id = sa.Column(sa.String(36),
sa.ForeignKey("agents.id",
ondelete='CASCADE'))
agent = orm.relationship(agent_model.Agent)
state = sa.Column(sa.Enum(n_const.HA_ROUTER_STATE_ACTIVE,
n_const.HA_ROUTER_STATE_STANDBY,
name='l3_ha_states'),
default=n_const.HA_ROUTER_STATE_STANDBY,
server_default=n_const.HA_ROUTER_STATE_STANDBY)
class L3HARouterNetwork(model_base.BASEV2, model_base.HasProjectPrimaryKey):
"""Host HA network for a tenant.
One HA Network is used per tenant, all HA router ports are created
on this network.
"""
__tablename__ = 'ha_router_networks'
network_id = sa.Column(sa.String(36),
sa.ForeignKey('networks.id', ondelete="CASCADE"),
nullable=False, primary_key=True)
network = orm.relationship(models_v2.Network)
class L3HARouterVRIdAllocation(model_base.BASEV2):
"""VRID allocation per HA network.
Keep a track of the VRID allocations per HA network.
"""
__tablename__ = 'ha_router_vrid_allocations'
network_id = sa.Column(sa.String(36),
sa.ForeignKey('networks.id', ondelete="CASCADE"),
nullable=False, primary_key=True)
vr_id = sa.Column(sa.Integer(), nullable=False, primary_key=True)
| 37.159091 | 76 | 0.644954 | 413 | 3,270 | 4.900726 | 0.360775 | 0.047431 | 0.034585 | 0.035573 | 0.242095 | 0.204545 | 0.1917 | 0.159091 | 0.089921 | 0.089921 | 0 | 0.014737 | 0.2737 | 3,270 | 87 | 77 | 37.586207 | 0.837474 | 0.296942 | 0 | 0.214286 | 0 | 0 | 0.10892 | 0.048857 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.547619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
8671b6a372caa3589eb77dcc566a9b3713aa80a9 | 6,499 | py | Python | genlist.py | truckli/technotes | 11d3cc0a1bd33141a22eaa2247cac1be1d74718a | [
"Apache-2.0"
] | null | null | null | genlist.py | truckli/technotes | 11d3cc0a1bd33141a22eaa2247cac1be1d74718a | [
"Apache-2.0"
] | null | null | null | genlist.py | truckli/technotes | 11d3cc0a1bd33141a22eaa2247cac1be1d74718a | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
import shutil, re, os, sys
file_model = "Model.template"
bookname = "TechNotes"
file_bibtex = "thebib.bib"
folder_target = "../pdf/"
#if name is a chapter, return its sections
def get_sections(name):
if not os.path.isdir(name):
return []
files = os.listdir(name)
sections = []
for section in files:
if re.match('.*\.tex$', section) and not re.match(".*lmz0610.*", section):
sections.append(name + "/" + section)
return sections
def is_updated(pdffile, texfiles):
def depend_modified(fname, ims):
depend_mtime = os.path.getmtime(fname)
if depend_mtime > ims:
print pdffile, ' mtime: ',ims
print fname, ' mtime: ', depend_mtime
return True
return False
old_pdffile = folder_target + pdffile
if not os.path.isfile(old_pdffile):
return False
pdf_mtime = os.path.getmtime(old_pdffile)
#if depend_modified(sys.argv[0], pdf_mtime):
#return False
#if depend_modified(file_model, pdf_mtime):
#return False
for section in texfiles:
if depend_modified(section, pdf_mtime):
return False
return True
def remove_tmp(tmpname):
if os.path.isfile(tmpname):
os.remove(tmpname)
def remove_latex_tmps(texname):
remove_tmp(texname + ".pdf")
remove_tmp(texname + ".tex")
remove_tmp(texname + ".blg")
remove_tmp(texname + ".bbl")
remove_tmp(texname + ".out")
remove_tmp(texname + ".toc")
remove_tmp(texname + ".aux")
remove_tmp(texname + ".idx")
remove_tmp(texname + ".log")
remove_tmp(texname + ".lof")
remove_tmp(texname + ".lot")
def read_bbl_file(object_name):
file_bbl = object_name + ".bbl"
if not os.path.isfile(file_bbl):
return ""
with open(file_bbl, 'r') as f:
return f.read()
#if depend_files contains citation
def need_bibtex(object_name, depend_files):
#if a file contains latex citation command \cite{}
def contain_citation(section_name):
with open(section_name, "r") as f:
content_section = f.read()
if content_section.find("\\cite{") == -1:
return False
return True
for section in depend_files:
if contain_citation(section):
return True
return False
def gen_pdf(object_name):
object_pdf = object_name + ".pdf"
if object_name == bookname:
depend_files = book_sections
targets = [folder_target + object_pdf, folder_target + "AAAAAAAAAAA.pdf"]
chapter_start_counter = 0
else:
depend_files = chap_sections[object_name]
targets = [folder_target + object_pdf]
chapter_start_counter = book_chapters.index(object_name)
# if is_updated(object_pdf, depend_files):
# print(object_pdf + " is updated")
# return False
obj_need_bibtex = need_bibtex(object_name, depend_files)
model = ''
with open(file_model) as model_file:
model = model_file.read()
model = model.replace("OBJECTNAME", object_name)
if object_name == 'Report':
model = model.replace("CHAPTERSTART", "0")
model = model.replace("\\tableofcontents", "%\\tableofcontents")
model = model.replace("ctexrep", "ctexart")
model = model.replace("\\setcounter{chapter}", "%\\setcounter{chapter}")
else:
model = model.replace("CHAPTERSTART", str(chapter_start_counter))
insert_word = "TOADD"
insert_pos = model.find(insert_word)
latex_text = model[:insert_pos] + insert_word
for section in depend_files:
latex_text = latex_text + "\n\\input{"+ section + "}"
#prepend text encoding mode line
section_text = ""
with open(section, 'r') as f:
line = f.readline()
if line[:6] != '%!Mode':
section_text = '%!Mode:: "TeX:UTF-8"\n' + line + f.read()
if section_text != "":
with open(section, 'w') as f:
f.write(section_text)
if obj_need_bibtex:
latex_text = latex_text + "\n\n"
latex_text = latex_text + "\\bibliographystyle{unsrt}\n"
latex_text = latex_text + "\\bibliography{thebib}\n"
latex_text = latex_text + model[insert_pos+len(insert_word):]
object_tex = object_name + ".tex"
with open(object_tex, "w") as f:
f.write(latex_text)
# os.system("xelatex " + object_name)
# if len(sys.argv) < 3 or sys.argv[2] != "fast":
# if obj_need_bibtex:
# old_bbl = read_bbl_file(object_name)
# os.system("bibtex " + object_name)
# if old_bbl != read_bbl_file(object_name):
# os.system("xelatex " + object_name)
# os.system("xelatex " + object_name)
#
# if os.path.isfile(object_pdf):
# for target in targets:
# shutil.copy(object_pdf, target)
return True
#trim trailing slash
def trim_chap_name(name):
if name[len(name) - 1] == '/':
name = name[:len(name)-1]
return name
def merge_chapter_pdfs():
mergecmd = 'pdftk '
for chap in book_chapters:
chappdf = folder_target + chap + '.pdf'
if os.path.isfile(chappdf):
mergecmd += chappdf + ' '
mergecmd += 'cat output ' + folder_target + 'AAABBBBBBBB.pdf'
print mergecmd
os.system(mergecmd)
##################################################
#now work starts
files = os.listdir('.')
chap_sections = {}
book_sections = []
book_chapters = []
for chap in files:
sections = get_sections(chap)
if len(sections):
chap_sections[chap] = sections
book_sections.extend(sections)
book_chapters.append(chap)
cmd = "one"
if cmd == "one":
gen_pdf(bookname)
elif cmd == "all":
modified = False
for chap in chap_sections:
modified = gen_pdf(chap) or modified
if modified:
merge_chapter_pdfs()
elif cmd == "clean":
for chap in chap_sections:
remove_latex_tmps(chap)
remove_latex_tmps(bookname)
else:
chap = trim_chap_name(cmd)
if chap in book_sections:
#chap is actually a section
section = chap
chap = 'Report'
chap_sections[chap] = [section]
book_chapters.append(chap)
if not chap_sections.has_key(chap):
print(chap + " is not a valid chapter name")
sys.exit(1)
modified = gen_pdf(chap)
if modified and chap != 'Report':
merge_chapter_pdfs()
| 27.892704 | 82 | 0.610248 | 810 | 6,499 | 4.687654 | 0.201235 | 0.047406 | 0.046352 | 0.023703 | 0.1596 | 0.053463 | 0.029497 | 0.018436 | 0.018436 | 0 | 0 | 0.003137 | 0.264348 | 6,499 | 232 | 83 | 28.012931 | 0.791048 | 0.142791 | 0 | 0.135484 | 0 | 0 | 0.088342 | 0.017304 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.006452 | null | null | 0.025806 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86735b1546f055d4c408acb184f9fb83f843ede1 | 2,272 | py | Python | Older Examples - enter at your own risk/lavender_pos/app/models.py | electricimp/examples | ebdd01baf64f3aa67f027194457432c7d7501d37 | [
"MIT"
] | 26 | 2015-01-17T23:43:06.000Z | 2021-09-28T18:24:28.000Z | Older Examples - enter at your own risk/lavender_pos/app/models.py | silver2row/examples | 0ef3bd4b4a875364db883c7c387f1f175a3ce61e | [
"MIT"
] | 5 | 2015-02-27T22:23:39.000Z | 2020-10-19T23:58:55.000Z | Older Examples - enter at your own risk/lavender_pos/app/models.py | silver2row/examples | 0ef3bd4b4a875364db883c7c387f1f175a3ce61e | [
"MIT"
] | 42 | 2015-01-22T16:33:12.000Z | 2021-01-14T02:33:15.000Z | import datetime
from database import Base
from sqlalchemy import Column, String, Integer, ForeignKey, DateTime, Float
class User(Base):
__tablename__ = 'users'
id = Column(Integer, primary_key=True)
first_name = Column(String(255))
last_name = Column(String(255))
email = Column(String(255), index=True, unique=True)
password = Column(String(255))
def get_id(self):
"""
Callback for Flask-Login. Represents that unique ID of a given user
object. It is unicoded as per specification.
Returns: the unique ID of an object
"""
return unicode(self.id)
def is_anonymous(self):
"""
Callback for Flask-Login. Default to False - we don't deal with any
anonymous users.
Returns: False
"""
return False
def is_active(self):
"""
Callback for Flask-Login. Default to True - we don't deal with
non-active users.
Returns: True
"""
return True
def is_authenticated(self):
"""
Callback for Flask-Login. Should return True unless the object
represents a user should not be authenticated.
Returns: True because all objects should be authenticated
"""
return True
class PendingTransaction(Base):
__tablename__ = 'pending_transactions'
id = Column(Integer, primary_key=True)
barcode = Column(String(10))
user_id = Column(Integer, ForeignKey('users.id'))
timestamp = Column(DateTime, nullable=False, default=datetime.datetime.now())
# Status: 0 - default, 1 - scanned, 2 - claimed
status = Column(Integer, default=0)
company = Column(Integer, ForeignKey('vendors.id'))
amount = Column(Float)
class Transaction(Base):
__tablename__ = 'transactions'
id = Column(Integer, primary_key=True)
user_id = Column(Integer, ForeignKey('users.id'))
company = Column(Integer, ForeignKey('vendors.id'))
amount = Column(Float)
timestamp = Column(DateTime, nullable=False, default=datetime.datetime.now())
class Vendor(Base):
__tablename__ = 'vendors'
id = Column(Integer, primary_key=True)
name = Column(String(255))
agent_url = Column(String(255))
secret = Column(String(255))
| 28.049383 | 81 | 0.652729 | 272 | 2,272 | 5.341912 | 0.341912 | 0.074329 | 0.072264 | 0.060564 | 0.408809 | 0.355127 | 0.31521 | 0.162423 | 0.162423 | 0 | 0 | 0.015762 | 0.246039 | 2,272 | 80 | 82 | 28.4 | 0.832458 | 0.247359 | 0 | 0.35 | 0 | 0 | 0.051513 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.025 | 0.075 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
8674487bc14ab6d974246602ccaa1b9927159028 | 4,724 | py | Python | rr_ml/nodes/end_to_end/train.py | ebretl/roboracing-software | 8803c97a885500069d04e70894b19f807ae5baf9 | [
"MIT"
] | null | null | null | rr_ml/nodes/end_to_end/train.py | ebretl/roboracing-software | 8803c97a885500069d04e70894b19f807ae5baf9 | [
"MIT"
] | null | null | null | rr_ml/nodes/end_to_end/train.py | ebretl/roboracing-software | 8803c97a885500069d04e70894b19f807ae5baf9 | [
"MIT"
] | null | null | null | import os
import math
import string
import numpy as np
import rospy
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, \
GaussianNoise, BatchNormalization
import cv2
import collections
import random
import time
from example_set import ExampleSet
from params import input_shape, expand_categories
n_examples_to_load = 8000 # if the number of training examples is below this, load more data
batch_size = 16
categories = [None]
def defineCategory(steering):
differences = [abs(steering - category) for category in categories]
category = np.argmin(differences)
oneHot = [1 if i == category else 0 for i in range(len(categories))]
return oneHot
def format_inputs(examples):
data2 = np.zeros((len(examples),) + input_shape, dtype='float32')
for i, ex in enumerate(examples):
data2[i] = ex.get_image()
data2 /= 255.0
return data2
def make_model():
model = Sequential()
# 128 x 48
model.add(GaussianNoise(0.05, input_shape=input_shape))
model.add(Conv2D(32, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D((4, 4)))
# 32 x 12
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D((2, 2)))
# 16 x 6
model.add(Flatten())
model.add(Dropout(0.25))
model.add(Dense(128, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dropout(0.35))
model.add(Dense(len(categories), activation='softmax'))
model.compile(loss=keras.losses.categorical_crossentropy,
optimizer=keras.optimizers.Adadelta(),
metrics=['accuracy'])
return model
def main():
global categories, n_examples_to_load, batch_size
rospy.init_node("nn_training")
startTime = time.time()
model_path = rospy.get_param("~model_output_path")
exampleSetDir = rospy.get_param("~example_set_dir")
epochs = int(rospy.get_param("~epochs"))
categories = rospy.get_param("~positive_nonzero_categories")
categories = string.strip(categories).split(" ")
categories = [float(x) for x in categories]
categories = expand_categories(categories)
model = make_model()
model.summary()
exampleSetFiles_const = tuple(f for f in os.listdir(exampleSetDir) if '.pkl.lz4' in f)
n_training_examples = 0
n_test_examples = 0
cnt = collections.Counter()
for f in exampleSetFiles_const:
data = ExampleSet.load(os.path.join(exampleSetDir, f))
n_training_examples += len(data.train)
n_test_examples += len(data.test)
for ex in data.train:
i = np.argmax(defineCategory(ex.angle))
cnt[i] += 1
print "total training examples:", n_training_examples
print "training label counts:", cnt
def batch_generator(isValidation = False):
gen_epochs = 1 if isValidation else epochs
for epoch in range(gen_epochs):
exampleSetFiles = list(exampleSetFiles_const)
random.shuffle(exampleSetFiles)
while len(exampleSetFiles) > 0:
D = []
while len(exampleSetFiles) > 0 and len(D) < n_examples_to_load:
data = ExampleSet.load(os.path.join(exampleSetDir, exampleSetFiles.pop()))
D += data.test if isValidation else data.train
if not isValidation: random.shuffle(D)
X = format_inputs(D)
# create output bins
labels = np.array([defineCategory(ex.angle) for ex in D])
if not isValidation:
for i in range(len(X)):
if random.random() < 0.4: # 40% of images are flipped
X[i] = cv2.flip(X[i], 1)
labels[i] = labels[i][::-1]
for i in range(0, len(X), batch_size):
xs = X[i: i + batch_size]
ys = labels[i: i + batch_size]
yield (xs, ys)
try:
n_minibatches = int(math.ceil(float(n_training_examples) / batch_size))
model.fit_generator(batch_generator(),
steps_per_epoch=n_minibatches,
epochs=epochs,
verbose=1)
print "elapsed time:", time.time() - startTime
n_minibatches = int(math.ceil(float(n_test_examples) / batch_size))
loss, acc = model.evaluate_generator(batch_generator(True), steps=n_minibatches)
print "validation loss:", loss, "| validation accuracy:", acc
finally:
model.save(model_path)
print "\nsaved model to", model_path
| 33.503546 | 94 | 0.622777 | 583 | 4,724 | 4.919383 | 0.319039 | 0.030683 | 0.018131 | 0.01569 | 0.108787 | 0.08159 | 0.08159 | 0.032775 | 0.032775 | 0 | 0 | 0.022932 | 0.270745 | 4,724 | 140 | 95 | 33.742857 | 0.809579 | 0.028154 | 0 | 0 | 0 | 0 | 0.054101 | 0.006108 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.130841 | null | null | 0.046729 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86798d0504dd04df9298eafb92e49de14fb4653a | 3,804 | py | Python | cloudferry/actions/prechecks/check_vmax_prerequisites.py | SVilgelm/CloudFerry | 4459c0d21ba7ccffe51176932197b352e426ba63 | [
"Apache-2.0"
] | 6 | 2017-04-20T00:49:49.000Z | 2020-12-20T16:27:10.000Z | cloudferry/actions/prechecks/check_vmax_prerequisites.py | SVilgelm/CloudFerry | 4459c0d21ba7ccffe51176932197b352e426ba63 | [
"Apache-2.0"
] | 3 | 2017-04-08T15:47:16.000Z | 2017-05-18T17:40:59.000Z | cloudferry/actions/prechecks/check_vmax_prerequisites.py | SVilgelm/CloudFerry | 4459c0d21ba7ccffe51176932197b352e426ba63 | [
"Apache-2.0"
] | 8 | 2017-04-07T23:42:36.000Z | 2021-08-10T11:05:10.000Z | # Copyright 2016 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import getpass
import logging
from cloudferry.lib.base import exception
from cloudferry.lib.base.action import action
from cloudferry.lib.utils import local
from cloudferry.lib.utils import remote_runner
LOG = logging.getLogger(__name__)
class CheckVMAXPrerequisites(action.Action):
"""This verifies prerequisites required for NFS to VMAX iSCSI cinder
volume migration"""
def _iscsiadm_is_installed_locally(self):
LOG.info("Checking if iscsiadm tool is installed")
try:
local.run('iscsiadm --help &>/dev/null')
except local.LocalExecutionFailed:
msg = ("iscsiadm is not available on the local host. Please "
"install iscsiadm tool on the node you running on or "
"choose other cinder backend for migration. iscsiadm is "
"mandatory for migrations with EMC VMAX cinder backend")
LOG.error(msg)
raise exception.AbortMigrationError(msg)
def _check_local_sudo_password_set(self):
current_user = getpass.getuser()
if current_user != 'root' and \
self.cfg.migrate.local_sudo_password is None:
try:
local.sudo('ls')
except local.LocalExecutionFailed:
msg = ("CloudFerry is running as '{user}' user, but "
"passwordless sudo does not seem to be configured on "
"current host. Please either specify password in "
"`local_sudo_password` config option, or run "
"CloudFerry as root user.").format(user=current_user)
LOG.error(msg)
raise exception.AbortMigrationError(msg)
def _ssh_connectivity_between_controllers(self):
src_host = self.cfg.src.ssh_host
src_user = self.cfg.src.ssh_user
dst_host = self.cfg.dst.ssh_host
dst_user = self.cfg.dst.ssh_user
LOG.info("Checking ssh connectivity between '%s' and '%s'",
src_host, dst_host)
rr = remote_runner.RemoteRunner(src_host, src_user)
ssh_opts = ('-o UserKnownHostsFile=/dev/null '
'-o StrictHostKeyChecking=no')
cmd = "ssh {opts} {user}@{host} 'echo ok'".format(opts=ssh_opts,
user=dst_user,
host=dst_host)
try:
rr.run(cmd)
except remote_runner.RemoteExecutionError:
msg = ("No ssh connectivity between source host '{src_host}' and "
"destination host '{dst_host}'. Make sure you have keys "
"and correct configuration on these nodes. To verify run "
"'{ssh_cmd}' from '{src_host}' node")
msg = msg.format(src_host=src_host, dst_host=dst_host, ssh_cmd=cmd)
LOG.error(msg)
raise exception.AbortMigrationError(msg)
def run(self, **kwargs):
if self.cfg.dst_storage.backend != 'iscsi-vmax':
return
self._iscsiadm_is_installed_locally()
self._ssh_connectivity_between_controllers()
self._check_local_sudo_password_set()
| 41.347826 | 79 | 0.624869 | 457 | 3,804 | 5.056893 | 0.380744 | 0.021203 | 0.023799 | 0.02077 | 0.168758 | 0.064907 | 0.064907 | 0.064907 | 0 | 0 | 0 | 0.002985 | 0.295478 | 3,804 | 91 | 80 | 41.802198 | 0.859328 | 0.166404 | 0 | 0.177419 | 0 | 0 | 0.268974 | 0.023182 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0.129032 | 0.096774 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
8681dc9beb7ce1fcfe008337221dc6feb16aedb5 | 1,888 | py | Python | conan/tools/env/virtualrunenv.py | dscole/conan | ff7b8e6703e8407773968517d68424b9ec59aa30 | [
"MIT"
] | null | null | null | conan/tools/env/virtualrunenv.py | dscole/conan | ff7b8e6703e8407773968517d68424b9ec59aa30 | [
"MIT"
] | 1 | 2019-06-07T03:02:02.000Z | 2019-06-07T03:02:02.000Z | conan/tools/env/virtualrunenv.py | dscole/conan | ff7b8e6703e8407773968517d68424b9ec59aa30 | [
"MIT"
] | 1 | 2021-08-20T19:47:51.000Z | 2021-08-20T19:47:51.000Z | from conan.tools.env import Environment
def runenv_from_cpp_info(conanfile, cpp_info):
""" return an Environment deducing the runtime information from a cpp_info
"""
dyn_runenv = Environment(conanfile)
if cpp_info is None: # This happens when the dependency is a private one = BINARY_SKIP
return dyn_runenv
if cpp_info.bin_paths: # cpp_info.exes is not defined yet
dyn_runenv.prepend_path("PATH", cpp_info.bin_paths)
# If it is a build_require this will be the build-os, otherwise it will be the host-os
if cpp_info.lib_paths:
dyn_runenv.prepend_path("LD_LIBRARY_PATH", cpp_info.lib_paths)
dyn_runenv.prepend_path("DYLD_LIBRARY_PATH", cpp_info.lib_paths)
if cpp_info.framework_paths:
dyn_runenv.prepend_path("DYLD_FRAMEWORK_PATH", cpp_info.framework_paths)
return dyn_runenv
class VirtualRunEnv:
""" captures the conanfile environment that is defined from its
dependencies, and also from profiles
"""
def __init__(self, conanfile):
self._conanfile = conanfile
def environment(self):
""" collects the runtime information from dependencies. For normal libraries should be
very occasional
"""
runenv = Environment(self._conanfile)
# FIXME: Missing profile info
# FIXME: Cache value?
host_req = self._conanfile.dependencies.host
test_req = self._conanfile.dependencies.test
for _, dep in list(host_req.items()) + list(test_req.items()):
if dep.runenv_info:
runenv.compose_env(dep.runenv_info)
runenv.compose_env(runenv_from_cpp_info(self._conanfile, dep.cpp_info))
return runenv
def generate(self, auto_activate=False):
run_env = self.environment()
if run_env:
run_env.save_script("conanrunenv", auto_activate=auto_activate)
| 37.76 | 94 | 0.697034 | 253 | 1,888 | 4.928854 | 0.351779 | 0.078589 | 0.028869 | 0.064154 | 0.158781 | 0.158781 | 0.056135 | 0.056135 | 0 | 0 | 0 | 0 | 0.227225 | 1,888 | 49 | 95 | 38.530612 | 0.854695 | 0.26589 | 0 | 0.068966 | 0 | 0 | 0.049364 | 0 | 0 | 0 | 0 | 0.020408 | 0 | 1 | 0.137931 | false | 0 | 0.034483 | 0 | 0.310345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
868508f96e8daab66111c1b9c301708733421f4d | 652 | py | Python | src/api/api_lists/models/list.py | rrickgauer/lists | 371de6af332789ef386392fd24857702794d05a6 | [
"Apache-2.0"
] | null | null | null | src/api/api_lists/models/list.py | rrickgauer/lists | 371de6af332789ef386392fd24857702794d05a6 | [
"Apache-2.0"
] | 67 | 2021-12-14T22:30:56.000Z | 2022-03-15T18:28:27.000Z | src/api/api_lists/models/list.py | rrickgauer/lists | 371de6af332789ef386392fd24857702794d05a6 | [
"Apache-2.0"
] | null | null | null | """
**********************************************************************************
List model
**********************************************************************************
"""
from enum import Enum
from dataclasses import dataclass
from uuid import UUID
from datetime import datetime
class ListType(str, Enum):
LIST : str = 'list'
TEMPLATE: str = 'template'
@classmethod
def _missing_(cls, value):
return ListType.LIST
@dataclass
class List:
id : UUID = None
user_id : UUID = None
name : str = None
created_on: datetime = None
type : ListType = ListType.LIST | 23.285714 | 82 | 0.460123 | 57 | 652 | 5.192982 | 0.45614 | 0.081081 | 0.067568 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.231595 | 652 | 28 | 83 | 23.285714 | 0.590818 | 0.269939 | 0 | 0 | 0 | 0 | 0.025586 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.235294 | 0.058824 | 0.882353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
86864787ee128fda8f0e696df8fc12952938543c | 4,356 | py | Python | run_mod.py | fpl-analytics/gr_crypto | 2b0ab451c9c205a9f572c4bca23fffbb68ca188f | [
"MIT"
] | null | null | null | run_mod.py | fpl-analytics/gr_crypto | 2b0ab451c9c205a9f572c4bca23fffbb68ca188f | [
"MIT"
] | null | null | null | run_mod.py | fpl-analytics/gr_crypto | 2b0ab451c9c205a9f572c4bca23fffbb68ca188f | [
"MIT"
] | null | null | null | """
Setup:
- Import Libraries
- Setup tf on multiple cores
- Import Data
"""
import pandas as pd
import numpy as np
import tensorflow as tf
import seaborn as sns
from time import time
import multiprocessing
import random
import os
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, ConvLSTM2D, Flatten
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from joblib import dump, load
from mod.prep import log_return, log_return_np, preprocess
from mod.model import return_pred
from mod.eval import evaluate_regression, evaluate_up_down
cores = multiprocessing.cpu_count()
tf.config.threading.set_inter_op_parallelism_threads(cores-1)
root_folder = "data"
wide_close = pd.read_csv(root_folder + "/working/wide_close.csv")
wide_target = pd.read_csv(root_folder + "/working/wide_target.csv")
asset_details = pd.read_csv(root_folder + "/asset_details.csv")
assets = [str(i) for i in asset_details["Asset_ID"]]
"""
Preprocess
"""
close_returns = wide_close[assets].apply(log_return)
close_returns["time"] = wide_close["time"]
close_returns[assets] = close_returns[assets].replace([np.inf,-np.inf],np.nan)
"""
Linear Regression
"""
x_steps, y_steps = 60, [1, 15]
col_in, col_out = "1", "1"
train_x, test_x, train_y, test_y, time_d = preprocess(data_in = wide_close, col_in,
col_out, time_col="time", x_steps, y_steps)
# 1 step
lr_1 = LinearRegression()
lr_1.fit(train_x.reshape(-1, x_steps), train_y[:,0,:].reshape(-1, 1))
true, pred = return_pred(test_x, test_y[:,0,:], lr_1)
evaluate_regression(true, pred)
evaluate_up_down(true, pred)
# 15 step
lr_15 = LinearRegression()
lr_15.fit(train_x.reshape(-1, x_steps), train_y[:,1,:].reshape(-1, 1))
true, pred = return_pred(test_x, test_y[:,1,:], lr_1)
evaluate_regression(true, pred)
evaluate_up_down(true, pred)
"""
calculate and store components seperately
process:
- first, get rolling values for each timestamp
- then, predict 1 and 15 gaps and store in array
"""
# Production
"""
Steps:
- Get train, val test and test indices. Importantly, this
needs to cover all assets (even though not all assets exist)
for the whole time period.
- Build models
"""
assets = list(asset_details["Asset_ID"].astype(str))
# Get indexes
i = np.select(
[
(wide_close.index >= 0) & (wide_close.index <= (len(wide_close)*0.7)),
(wide_close.index > (len(wide_close)*0.7)) & (wide_close.index <= (len(wide_close)*0.8))
],
["train", "val"],
default = "test")
indexes = pd.DataFrame({"time":wide_close["time"],
"set":i})
for a in assets:
print("asset", a)
filt = indexes["set"][~pd.isna(wide_close[a])]
counts = filt.value_counts()
df = pd.DataFrame({"counts":counts,
"pct":counts/np.sum(counts)})
print(df, "\n\n")
indexes_d = {}
for s in indexes["set"].unique():
indexes_d[s] = indexes["time"][indexes["set"] == s]
mkdir "model_files"
mkdir "model_files/linear_regression"
for a in assets:
print("Asset", a)
x_steps, y_steps = 60, [1, 16]
cols_in, cols_out = a, a
train_x, test_x, train_y, test_y, time_d = preprocess(wide_close, cols_in,
cols_out, "time", x_steps, y_steps)
# 1 step
lr_1 = LinearRegression()
lr_1.fit(train_x.reshape(-1, x_steps), train_y[:,0,:].reshape(-1, 1))
true, pred = return_pred(test_x, test_y[:,0,:], lr_1)
print("Model 1 Metrics")
evaluate_regression(true, pred)
evaluate_up_down(true, pred)
# 16 step
lr_16 = LinearRegression()
lr_16.fit(train_x.reshape(-1, x_steps), train_y[:,1,:].reshape(-1, 1))
true, pred = return_pred(test_x, test_y[:,1,:], lr_16)
print("Model 16 Metrics")
evaluate_regression(true, pred)
evaluate_up_down(true, pred)
dump(lr_1, f"model_files/linear_regression/lr_{a}_1")
dump(lr_16, f"model_files/linear_regression/lr_{a}_16")
dump(time_d, "model_files/linear_regression/lr_times")
"""
Random Forest
"""
rf = RandomForestRegressor(n_jobs=-1)
# start = time.time()
rf.fit(train_x.reshape(-1, x_steps), train_y.reshape(-1))
# print("Took:", round(start-time.time()))
| 25.623529 | 93 | 0.677456 | 657 | 4,356 | 4.266362 | 0.252664 | 0.048163 | 0.024973 | 0.028541 | 0.358188 | 0.34142 | 0.330717 | 0.271495 | 0.271495 | 0.261149 | 0 | 0.022166 | 0.181818 | 4,356 | 169 | 94 | 25.775148 | 0.76431 | 0.025941 | 0 | 0.186047 | 0 | 0 | 0.093323 | 0.05122 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.197674 | null | null | 0.05814 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
868928ba8707be83d375694167add597b520a225 | 616 | py | Python | SmartCache/sim/Utilities/setup.py | Cloud-PG/smart-cache | 467987abece3fd4830fd615288046359761229f8 | [
"Apache-2.0"
] | 1 | 2019-10-13T09:05:24.000Z | 2019-10-13T09:05:24.000Z | SmartCache/sim/Utilities/setup.py | Cloud-PG/smart-cache | 467987abece3fd4830fd615288046359761229f8 | [
"Apache-2.0"
] | null | null | null | SmartCache/sim/Utilities/setup.py | Cloud-PG/smart-cache | 467987abece3fd4830fd615288046359761229f8 | [
"Apache-2.0"
] | 1 | 2019-05-16T11:53:38.000Z | 2019-05-16T11:53:38.000Z | from distutils.core import setup
setup(
name='utils',
version='1.0.0',
author='Mirco Tracolli',
author_email='mirco.tracolli@pg.infn.it',
packages=[
'utils',
],
scripts=[],
url='https://github.com/Cloud-PG/smart-cache',
license='Apache 2.0 License',
description='Utils for the SmartCache project',
long_description="To do...",
install_requires=open("requirements.txt").read(),
classifier=[
"Operating System :: POSIX :: Linux",
"License :: OSI Approved :: Apache 2.0 License",
"Programming Language :: Python :: 3 :: Only"
]
)
| 26.782609 | 56 | 0.61039 | 71 | 616 | 5.253521 | 0.774648 | 0.069705 | 0.042895 | 0.080429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016807 | 0.227273 | 616 | 22 | 57 | 28 | 0.766807 | 0 | 0 | 0 | 0 | 0 | 0.469156 | 0.040584 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.047619 | 0 | 0.047619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
868f63854096baf68c5ff8cc2009603138d30b30 | 5,195 | py | Python | delphiIDE.py | JeisonJHA/Plugins-Development | cccb58908eed6114c569e53d5710e70b8d53f5c5 | [
"MIT"
] | null | null | null | delphiIDE.py | JeisonJHA/Plugins-Development | cccb58908eed6114c569e53d5710e70b8d53f5c5 | [
"MIT"
] | null | null | null | delphiIDE.py | JeisonJHA/Plugins-Development | cccb58908eed6114c569e53d5710e70b8d53f5c5 | [
"MIT"
] | null | null | null | import sublime_plugin
class MethodDeclaration(object):
"""docstring for MethodDeclaration"""
def __init__(self):
self._methodclass = None
self.has_implementation = False
self.has_interface = False
@property
def has_implementation(self):
return self._has_implementation
@has_implementation.setter
def has_implementation(self, value):
self._has_implementation = value
@property
def has_interface(self):
return self._has_interface
@has_interface.setter
def has_interface(self, value):
self._has_interface = value
@property
def methodname(self):
return self._methodname
@methodname.setter
def methodname(self, value):
self._methodname = value
@property
def methodregion(self):
return self._methodregion
@methodregion.setter
def methodregion(self, value):
self._methodregion = value
@property
def visibility(self):
return self._visibility
@visibility.setter
def visibility(self, value):
self._visibility = value
@property
def params(self):
return self._params
@params.setter
def params(self, value):
self._params = value
@property
def methodclass(self):
return self._methodclass
@methodclass.setter
def methodclass(self, value):
self._methodclass = value
class ClassDeclaration(object):
"""docstring for ClassDeclaration"""
@property
def classname(self):
return self._classname
@classname.setter
def classname(self, value):
self._classname = value
@property
def classregion(self):
return self._classregion
@classregion.setter
def classregion(self, value):
self._classregion = value
@property
def privateregion(self):
return self._privateregion
@privateregion.setter
def privateregion(self, value):
self._privateregion = value
@property
def protectedregion(self):
return self._protectedregion
@protectedregion.setter
def protectedregion(self, value):
self._protectedregion = value
@property
def publicregion(self):
return self._publicregion
@publicregion.setter
def publicregion(self, value):
self._publicregion = value
@property
def publishedregion(self):
return self._publishedregion
@publishedregion.setter
def publishedregion(self, value):
self._publishedregion = value
class DelphiIdeCommand(sublime_plugin.TextCommand):
# // { "keys": ["ctrl+shift+x"], "command": "delphi_ide", "args": {"teste": "delphimethodnav"}}
# view.window().run_command('show_panel',
# args={"panel": 'output.find_results', "toggle": True})
def run(self, edit, teste):
print('teste[0]:%s' % teste)
method = None
try:
method = getattr(self, teste)
except AttributeError:
raise NotImplementedError("Class `{}` does not implement `{}`".
format(self.__class__.__name__,
teste))
method()
def delphimethodnav(self):
print('vai doido')
def getMethodInformation(self):
view = self.view
cursor_region = view.sel()[0]
cursor_pt = view.sel()[0].begin()
if not view.match_selector(cursor_pt,
'function.implementation.delphi'):
# exit because it is not in a method
return None
def params(region):
params_region = view.find_by_selector(
'meta.function.parameters.delphi')
param_name_region = view.find_by_selector(
'variable.parameter.function.delphi')
params_region_filt = [
s for s in params_region if region.contains(s)]
params_region_filt = [
s for s in param_name_region if
params_region_filt[0].contains(s)]
return params_region_filt
def paramsFromRegion(region):
try:
params_region_filt = params(region)
x = [view.substr(x) for x in params_region_filt]
return x
except:
return []
def getFunctionName():
functionname = view.find_by_selector('entity.name.function')
functionnamefiltered = [
n for n in functionname if method.methodregion[0].contains(n)]
return view.substr(functionnamefiltered[0])
# has_implementation
# has_interface
# methodname
# methodregion
# visibility
# params
# methodclass
method = MethodDeclaration()
selector = view.find_by_selector
method.methodregion = [r for r in selector('meta.function.delphi')
if cursor_region.intersects(r)]
method.methodname = getFunctionName()
method.params = self.paramsFromRegion(method.methodregion[0])
return method
def getClassInformation(self):
pass
| 26.237374 | 99 | 0.609047 | 504 | 5,195 | 6.099206 | 0.212302 | 0.046519 | 0.059206 | 0.023422 | 0.030579 | 0.014964 | 0.014964 | 0 | 0 | 0 | 0 | 0.001937 | 0.304524 | 5,195 | 197 | 100 | 26.370558 | 0.848879 | 0.071992 | 0 | 0.124088 | 0 | 0 | 0.039375 | 0.019792 | 0 | 0 | 0 | 0 | 0 | 1 | 0.248175 | false | 0.007299 | 0.007299 | 0.094891 | 0.416058 | 0.014599 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
869f105dabe3ba66d48901d5ab1ef89fe7052f2e | 624 | py | Python | investment_report/migrations/0020_auto_20180911_1005.py | uktrade/pir-api | 79747ceab042c42c287e2b7471f6dade70f68693 | [
"MIT"
] | 1 | 2021-02-02T19:08:55.000Z | 2021-02-02T19:08:55.000Z | investment_report/migrations/0020_auto_20180911_1005.py | uktrade/invest-pir-api | be56efddf9dfdf81c8557441a9a54d9a4dd4bab1 | [
"MIT"
] | 21 | 2018-07-10T10:20:47.000Z | 2022-03-24T09:36:29.000Z | investment_report/migrations/0020_auto_20180911_1005.py | uktrade/pir-api | 79747ceab042c42c287e2b7471f6dade70f68693 | [
"MIT"
] | 1 | 2021-02-04T11:28:37.000Z | 2021-02-04T11:28:37.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.13 on 2018-09-11 10:05
from __future__ import unicode_literals
import config.s3
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('investment_report', '0019_auto_20180820_1304'),
]
operations = [
migrations.AddField(
model_name='contact',
name='website_href',
field=models.URLField(default='https://invest.great.gov.uk/contact/', help_text='Custom link for website (used for tracking)', max_length=255),
preserve_default=False,
)
]
| 27.130435 | 155 | 0.653846 | 74 | 624 | 5.324324 | 0.810811 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079002 | 0.229167 | 624 | 22 | 156 | 28.363636 | 0.740125 | 0.110577 | 0 | 0 | 1 | 0 | 0.25 | 0.041667 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86a51a9e777416f598dec47e504f16a9b1fa7744 | 433 | py | Python | myFirstApp/travello/models.py | cankush625/Django | a3e874a69fbf34bf9123a7d60697a2449c7591c6 | [
"MIT"
] | null | null | null | myFirstApp/travello/models.py | cankush625/Django | a3e874a69fbf34bf9123a7d60697a2449c7591c6 | [
"MIT"
] | 10 | 2020-02-12T03:08:48.000Z | 2022-02-10T11:27:50.000Z | myFirstApp/travello/models.py | cankush625/Django | a3e874a69fbf34bf9123a7d60697a2449c7591c6 | [
"MIT"
] | null | null | null | from django.db import models
# Create your models here.
class Destination(models.Model) :
name = models.CharField(max_length = 100)
img = models.ImageField(upload_to = 'pics')
desc = models.TextField()
price = models.IntegerField()
offer = models.BooleanField(default = False)
class News() :
id : int
img : str
date : int
month : str
headline : str
category : str
desc : str | 22.789474 | 48 | 0.637413 | 52 | 433 | 5.269231 | 0.711538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009434 | 0.265589 | 433 | 19 | 49 | 22.789474 | 0.852201 | 0.055427 | 0 | 0 | 0 | 0 | 0.009804 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
86a7a933257c5b58ca131b6e09db3e5af93d5f4e | 19,069 | py | Python | netesto/local/psPlot.py | fakeNetflix/facebook-repo-fbkutils | 16ec0c024322c163e7dbe691812ba8fdf5b511ad | [
"BSD-3-Clause"
] | 346 | 2016-04-08T17:04:29.000Z | 2021-09-30T06:05:47.000Z | netesto/local/psPlot.py | fakeNetflix/facebook-repo-fbkutils | 16ec0c024322c163e7dbe691812ba8fdf5b511ad | [
"BSD-3-Clause"
] | 38 | 2016-04-26T14:58:17.000Z | 2021-10-07T20:43:39.000Z | netesto/local/psPlot.py | fakeNetflix/facebook-repo-fbkutils | 16ec0c024322c163e7dbe691812ba8fdf5b511ad | [
"BSD-3-Clause"
] | 76 | 2016-04-08T17:59:23.000Z | 2021-09-05T13:18:27.000Z | #!/usr/bin/env python2
import sys
import random
import os.path
import shutil
import commands
import types
import math
#gsPath = '/usr/local/bin/gs'
gsPath = 'gs'
logFile = '/dev/null'
#logFile = 'plot.log'
#--- class PsPlot(fname, pageHeader, pageSubHeader, plotsPerPage)
#
class PsPlot(object):
def __init__(self, fname, pageHeader, pageSubHeader, plotsPerPage):
self.foutPath = os.path.dirname(fname)+'/'
if self.foutPath == '/':
self.foutPath = ''
self.foutName = os.path.basename(fname)
self.fname = fname+'.ps'
self.pageHeader = pageHeader
self.pageSubHeader = pageSubHeader
self.plotsPerPage = plotsPerPage
self.yfix1 = ''
self.yfix2 = ''
self.xGrid = 1
self.yGrid = 1
self.xUniform = False
self.xLen = 6.5 #inches
self.seriesTitle = ' '
self.x0 = 0
self.xInc = 0
self.xCount = 0
self.xList = []
self.xDict = {}
self.y1Inc = 0
self.y1Count = 0
self.y1LogScale = 0
self.y2Inc = 0
self.y2Count = 0
self.y2LogScale = 0
self.xOffset = 0
self.colors = [ (0.7,0.7,0.7), (0,0,0.8), (0.8,0,0),
(0.42,0.55,0.14), (0.6,0.5,0.3), (0.6,0.2,0.8),
(0,0.8,0),
(0.4,0.3,0.5), (0.5,0.5,0.5), (0.8,0.0,0.0), (0,0,0) ]
self.colorsN = 11
self.colorRed = (0.8,0,0)
self.colorGreen = (0,0.8,0)
self.colorBlue = (0,0,0.8)
self.colorAqua = (0,0.5,0.5)
self.colorWhite = (1,1,1)
self.ColorBlack = (0,0,0)
self.xSize = 1800
self.ySize = 900
shutil.copy('plot-header.ps', self.fname)
self.fout = open(self.fname, 'a')
self.flog = open(logFile, 'a')
# self.flog = open('./psPlot.out', 'a')
if plotsPerPage == 4:
print >>self.fout, '/doGraph { graph4v } def'
print >>self.fout, '/nextGraph { nextGraph4v } def'
elif plotsPerPage == 3:
print >>self.fout, '/doGraph { graph3v } def'
print >>self.fout, '/nextGraph { nextGraph3v } def'
elif plotsPerPage == 2:
print >>self.fout, '/doGraph { graph2v } def'
print >>self.fout, '/nextGraph { nextGraph2v } def'
else:
print >>self.fout, '/doGraph { graph1v } def'
print >>self.fout, '/nextGraph { nextGraph1v } def'
print >>self.fout, '/showpage {\n 40 742 moveto'
print >>self.fout, '/Helvetica findfont 12 scalefont setfont'
if self.pageHeader != '':
print >>self.fout, '(',self.pageHeader,') show'
if self.pageSubHeader != '':
print >>self.fout, '40 726 moveto\n (',self.pageSubHeader,') show'
print >>self.fout, 'showpage\n} bind def'
print >>self.fout, 'doGraph'
#--- End()
#
def End(self):
print >>self.fout, '\nshowpage\nend'
self.fout.close()
#--- GetInc(vMin, vMax)
def GetInc(self,vMin, vMax):
ff = 1.0
while vMax <= 1 and vMax > 0:
ff *= 0.10
vMin *= 10
vMax *= 10
v0 = int(vMin)
v1 = int(vMax+0.99)
f = 1
w = v1 - v0
if w == 0:
v1 = v0 + 1
w = 1
while w/f >= 100:
f *= 10
# w = int(w/f)
v0 = int(v0/f)
v1 = int(v1/f)
if (vMin % f) != 0 and vMax == v1:
v1 += 1
w = v1 - v0
if w <= 10:
vInc = 1
elif w <= 20:
vInc = 2
else:
m = 10
while w/m > 100:
m *= 10
if (v0 >= 0) and (v0 % m) != 0:
v0 = int(v0 / m) * m
if (v1 % m) != 0:
v1 = int(v1 / m) * m + m
w = v1 - v0
if w <= 5*m:
vInc = m/2
else:
vInc = m
else:
vInc = m
# if (vMax/f)%vInc != 0 or v1 % vInc != 0:
if v1 % vInc != 0:
v1 = int(v1/vInc)*vInc + vInc
if (v0 % vInc) != 0:
v0 = int(v0/vInc)*vInc
v0 += vInc
v0 *= (f*ff)
v1 *= (f*ff)
vInc *= (f*ff)
return v0, v1, vInc
#--- ValueConvert(v)
#
def ValueConvert(self, v, inc):
if inc > 0:
logInc = int(math.log10(v/inc))
d = math.pow(10,logInc)
if d == 0:
d = 10.0
else:
d = 10.0
if d == 1 and float(v)/inc > 1.0:
d = 10.0
if v >= 1000000000 and inc > 1:
s = int(v/(1000000000/d))/d
if s*d == int(s)*d:
s = int(s)
r = str(s) + 'G'
elif v >= 1000000 and inc > 1:
s = int(v/(1000000/d))/d
if s*d == int(s)*d:
s = int(s)
r = str(s) + 'M'
elif v >= 1000 and inc > 1:
s = int(v/(1000/d))/d
if s*d == int(s)*d:
s = int(s)
r = str(s) + 'K'
elif v >= 1:
s = int(v*d)/d
if s*d == int(s)*d:
s = int(s)
r = str(s)
else:
r = str(int(v*100)/100.0)
return r
#--- GetAxis(vBeg, vEnd, vInc, logFlag)
#
def GetAxis(self, vBeg, vEnd, vInc, logFlag):
fix = '{ 0 add }'
if isinstance(vBeg,list):
vList = vBeg
vList.append(' ')
self.xUniform = True
v0 = 1
v1 = len(vList)
vi = 1
fix = '{ '+str(v0-vi)+' sub '+str(vi)+' div }'
logFlag = 0
else:
if vInc == 0:
v0,v1,vi = self.GetInc(vBeg,vEnd)
else:
v0 = vBeg
v1 = vEnd
vi = vInc
if vBeg > 0 and (logFlag==1 or (logFlag==0 and (vEnd/vBeg > 100))):
v0 = vBeg
v1 = vEnd
logFlag = 1
v0Log = math.log10(v0)
t = math.ceil(v0Log)
ff = math.modf(v0Log)
if math.fabs(ff[0]) < math.fabs(v0Log)/1000 and t < 0:
t += 1
logOffset = 0
while t < 1:
logOffset += 1
t += 1
v0 = math.pow(10,math.floor(v0Log)+1)
v1 = math.pow(10,math.ceil(math.log10(v1)))
vi = 1
vList = []
v = v0
while v <= v1:
vList.append(self.ValueConvert(v,0))
v *= 10
if v0 > 1:
logOffset -= (math.log10(v0) - 1)
# substract 1 from above inside parent?
fix = '{ dup 0 eq { } { log '+str(logOffset)+' add } ifelse }'
else:
logFlag = 0
v = v0
vList = []
n = 0
while True:
vList.append(self.ValueConvert(v,vi))
if v > vEnd:
break
n += 1
v = v0 + n*vi
fix = '{ '+str(v0-vi)+' sub '+str(vi)+' div }'
print >>self.flog, 'v0:',v0,' vi:',vi,' v1:',v1,' (',vEnd,')'
print >>self.flog, 'vList: ', vList
print >>self.flog, 'logFlag: ', logFlag, ' fix: ', fix
return v0,v1,vi,vList,fix,logFlag
#--- SetXLen(xlen)
def SetXLen(self, xlen):
self.xLen = xlen
print >>self.fout, '/xAxisLen %.2f def' % self.xLen
print >>self.fout, 'doGraph'
return
#--- SetXSize(xsize)
def SetXSize(self, xsize):
self.xSize = xsize
return
#--- SetYSize(ysize)
def SetYSize(self, ysize):
self.ySize = ysize
return
#--- SetPlotBgLevel(level)
#
def SetPlotBgLevel(self,level):
print >>self.fout, '/plotBgLevel ', level, 'def\n'
return
#--- SetPlotPercentDir(value)
def SetPlotPercentDir(self,value):
if value == 'Vertical':
print >>self.fout, '/plotNumPercentDir 1 def\n'
else:
print >>self.fout, '/plotNumPercentDir 0 def\n'
return
#--- SetPlotYLogScale(axis,value)
#
def SetPlotYLogScale(self,axis,value):
if value == 'Off':
v = -1
elif value == 'On':
v = 1
else:
v = 0;
if axis == 1:
self.y1LogScale = v
else:
self.y2LogScale = v
return
#--- SetPlot(xbeg,xend,xinc,ybeg,yend,yinc,xtitle,ytitle,title)
#
def SetPlot(self,xbeg,xend,xinc,ybeg,yend,yinc,xtitle,ytitle,title):
print >>self.fout, '\n\nnextGraph\n1 setlinewidth\n'
(x0,x1,xi,xList,fix,logFlag) = self.GetAxis(xbeg,xend,xinc,0)
self.x0 = x0
self.xInc = xi
self.xCount = len(xList)
self.xList = xList
self.xDict = {}
k = 1
for x in xList:
self.xDict[x] = k
k=k+1
print >>self.fout, '/xfix ', fix, ' def\n'
(y0,y1,yi,yList,fix,logFlag) = self.GetAxis(ybeg,yend,yinc,
self.y1LogScale)
self.y1Inc = yi
self.y1Count = len(yList)
self.yfix1 = '/yfix '+fix+' def\n /yinc yinc1 def'
print >>self.fout, self.yfix1
print >>self.fout, '[ '
for x in xList:
self.fout.write('('+str(x)+') ')
self.fout.write(' ]\n[ ')
for y in yList:
self.fout.write('('+str(y)+') ')
print >>self.fout, ' ]'
print >>self.fout, '('+xtitle+')\n('+ytitle+')\naxes\n'
print >>self.fout, self.xGrid, self.yGrid, ' grid\n'
print >>self.fout, '/ymtitle ypos ylen add 10 add def\n'
# Multiple lines in title are separated by '|'
print >>self.flog, 'Main Title: '+title
titleLines = title.split('|')
for t in titleLines:
if len(t) > 0:
print >>self.flog, ' '+t
print >>self.fout, '('+t+')\n'
print >>self.fout, 'Mtitles\n'
# print >>self.fout, '('+title+')\nMtitles\n'
if logFlag == 1:
print >>self.fout, 'beginFunction\n'
for ys in yList:
factor = 1
if ys[-1:] == 'K':
yss = ys[:-1]
factor = 1000
elif ys[-1:] == 'M':
yss = ys[:-1]
factor = 1000000
else:
yss = ys
y = float(yss)*factor/10.0
k = 2
while k < 10:
print >>self.fout, 0, k*y
k += 1
print >>self.fout, 'endFunction\n'
print >>self.fout, '19 { 0 0 0 setrgbcolor } plotSymbolsC\n'
return y1
#--- SetPlot2(xbeg,xend,xinc,ybeg,yend,yinc,zbeg,zend,zinc,
# xtitle,ytitle,ztitle,title)
#
def SetPlot2(self,xbeg,xend,xinc,ybeg,yend,yinc,zbeg,zend,zinc,
xtitle,ytitle,ztitle,title):
rv = self.SetPlot(xbeg,xend,xinc,ybeg,yend,yinc,xtitle,ytitle,title)
(z0,z1,zi,zList,fix,logFlag) = self.GetAxis(zbeg,zend,zinc,self.y2LogScale)
self.y2Inc = zi
self.y2Count = len(zList)
print >>self.fout, '/Flag2Yaxes 1 def'
self.yfix2 = '/yfix '+fix+' def\n/yinc yinc2 def'
print >>self.fout, 'axpos axlen add aypos aylen'
self.fout.write('[ ')
for z in zList:
self.fout.write('('+str(z)+') ')
self.fout.write(' ]')
if ztitle != '':
print >>self.fout, '('+ztitle+') vaxis2'
if logFlag == 1:
print >>self.fout, self.yfix2
print >>self.fout, 'beginFunction\n'
for zs in zList:
factor = 1
if zs[-1:] == 'K':
zss = zs[:-1]
factor = 1000
elif zs[-1:] == 'M':
zss = zs[:-1]
factor = 1000000
else:
zss = zs
y = float(zss)*factor/10.0
k = 2
while k < 10:
print >>self.fout, self.xCount, k*y
k += 1
print >>self.fout, 'endFunction\n'
print >>self.fout, '18 { 0.72 0.52 0.5 setrgbcolor } plotSymbolsC\n'
return rv
#--- SetColor(color)
#
def SetColor(self, color):
rv = ' { '+str(color[0])+' '+str(color[1])+' '+str(color[2])+ \
' setrgbcolor } '
return rv
#--- GetColorIndx(indx)
#
def GetColorIndx(self, indx):
color = self.colors[indx % self.colorsN]
rv = ' { '+str(color[0])+' '+str(color[1])+' '+str(color[2])+ \
' setrgbcolor } '
return rv
#--- SetColorIndx(indx, r, g, b)
#
def SetColorIndx(self, indx, r, g, b):
self.colors[indx][0] = r
self.colors[indx][1] = g
self.colors[indx][2] = b
return rv
#--- outputPS(string)
#
def outputPS(self, s):
print >>self.fout, s
#--- SeriesNames(names)
#
def SeriesNames(self, names):
indx = len(names) - 1
if indx == 0:
return
print >>self.fout, '('+self.seriesTitle+')'
while indx >= 0:
if names[indx] != None:
print >>self.fout, '('+names[indx]+') '
print >>self.fout, self.SetColor(self.colors[indx % self.colorsN])
indx -= 1
print >>self.fout, 'fdescriptionsC'
#--- PlotVBars(xList, type)
#
def PlotVBars(self, xList, type):
flog = self.flog
print >>self.fout, self.yfix1
print >>self.fout, 'beginFunction\n'
endFun = 'endFunction\n'
indx = 0
for x in xList:
if x == ' ' and indx == len(xList)-1:
continue
indx += 1
print >>self.fout, x, 0.0
if (indx != 0) and (indx % 1000) == 0:
print >>self.fout, endFun+type+'\nbeginFunction\n'
print >>self.fout, x
print >>self.fout, endFun, type, '\n'
return
#--- PlotData(axis, xList, yList, zList, id, type)
#
def PlotData(self, axis, xList, yList, zList, id, type):
flog = self.flog
print >>flog, 'graph xList: ', self.xList, ' xList: ', xList, \
' yList: ', yList
print >>self.fout, '%\n% Plot '+id+'\n%\n'
print >>self.fout, '/xfix { ', self.x0 - self.xInc - self.xOffset,' sub ', self.xInc, ' div ', 0.0,' add } def\n'
if axis == 2:
print >>self.fout, self.yfix2
elif axis == 1:
print >>self.fout, self.yfix1
# else:
# print >>self.fout, '/yfix { 0 add } def\n'
print >>self.fout, 'beginFunction\n'
if isinstance(zList,list):
endFun = 'endFunctionW\n'
else:
endFun = 'endFunction\n'
indx = 0
for x in xList:
if x == ' ' and indx == len(xList)-1:
continue
if len(yList) <= indx:
continue
y = yList[indx]
if isinstance(zList,list):
if len(zList) <= indx:
continue
z = zList[indx]
else:
z = ''
indx += 1
if self.xUniform == True:
g_indx = self.xDict[x]
print >>self.fout, g_indx, y, z
else:
print >>self.fout, x, y, z
if (indx != 0) and (indx % 1000) == 0:
print >>self.fout, endFun+type+'\nbeginFunction\n'
if self.xUniform == True:
print >>self.fout, g_indx, y, z
else:
print >>self.fout, x, y, z
print >>self.fout, endFun, type, '\n'
return
#--- GetImage()
#
def GetImage(self):
flog = self.flog
print >>self.fout, 'showpage\n'
self.fout.flush()
os.fsync(self.fout)
if self.plotsPerPage == 1:
# size = ' -g1200x550 '
size = ' -g%dx%d ' % (self.xSize, self.ySize)
xres = int(100 * self.xSize * 6.5 / (1200 * self.xLen))
yres = int(110 * self.ySize / 550)
res = ' -r%dx%d ' % (xres, yres)
cmdStr = gsPath + ' -sDEVICE=jpeg'+size+'-sOutputFile='+self.foutPath+self.foutName+'.jpg -dNOPAUSE '+ res +self.fname+' -c quit'
# cmdStr = gsPath + ' -sDEVICE=jpeg'+size+'-sOutputFile='+self.foutPath+self.foutName+'.jpg -dNOPAUSE -r100x100 '+self.fname+' -c quit'
else:
size = ' -g1200x1100 '
cmdStr = gsPath + ' -sDEVICE=jpeg'+size+'-sOutputFile='+self.foutPath+self.foutName+'%d.jpg -dNOPAUSE -r100x100 '+self.fname+' -c quit'
print >>flog, 'cmdStr: ', cmdStr
output = commands.getoutput(cmdStr)
print >>flog, 'output from gs command: ', output
return self.foutPath+self.foutName+'.jpg'
#--- Main
#
def main():
tMin = 0
tMax = 100000
stateList = [0,1,2,2,3,3,3,3,4]
fname = 'sched.txt'
if len(sys.argv) == 2:
fname = sys.argv[1]
elif len(sys.argv) == 3:
tMin = int(sys.argv[1])
tMax = int(sys.argv[2])
elif len(sys.argv) == 4:
tMin = int(sys.argv[1])
tMax = int(sys.argv[2])
fname = sys.argv[3]
elif len(sys.argv) != 1:
print 'USAGE: psPlot.py [tMin tMax] [fname]'
sys.exit(1)
print 'tMin,tMax: ', tMin, tMax, 'fname: ', fname
p = PsPlot('./p', 'Header', 'SubHeader', 1)
fromStateList = []
toStateList = []
time1List = []
time2List = []
indx = 0
oldTime = 0
fin = open(fname, 'r')
for inputLine in fin:
inputLine = inputLine.replace(' ','')
inputLine = inputLine.replace("'", '')
i1 = inputLine.find('(')
i2 = inputLine.find(')')
inputList = inputLine[i1+1:i2-1].split(',')
s1 = stateList[int(inputList[0])]
s2 = stateList[int(inputList[1])]
t = int(inputList[2])
if indx != 0 and t >= tMin and t <= tMax:
fromStateList.append(s1)
toStateList.append(s2)
time1List.append(oldTime)
time2List.append(t)
oldTime = t
indx += 1
p.SetPlot(tMin, tMax, 0, 0, 2, 0, 'Time', 'Socket/State', 'Chavey\'s Plot')
state = 0
while state <= 4:
t1List = []
t2List = []
sList = []
indx = 0
for s in toStateList:
if s == state:
t1List.append(time1List[indx])
t2List.append(time2List[indx])
sList.append(0.10 + s*0.20)
indx += 1
p.PlotData(1,t1List, t2List, sList, 'Test',
'0.1 in 0 '+p.SetColor(p.colors[state])+' plotWbarsC',
sys.stdout)
state += 1
image = p.GetImage(sys.stdout)
print 'Image file: ', image
p.End()
if __name__ == "__main__":
main()
| 31.260656 | 147 | 0.457968 | 2,295 | 19,069 | 3.798693 | 0.156427 | 0.070658 | 0.099908 | 0.0195 | 0.254416 | 0.193164 | 0.161849 | 0.146708 | 0.133861 | 0.129158 | 0 | 0.054708 | 0.391316 | 19,069 | 609 | 148 | 31.311987 | 0.69639 | 0.063873 | 0 | 0.280079 | 0 | 0 | 0.090128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.013807 | null | null | 0.149901 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86a7e9fe107833a210f5b3b41b68cc42c51f48ee | 402 | py | Python | physio2go/exercises/migrations/0003_auto_20161128_1753.py | hamole/physio2go | ebd14c9406e2b6818dc649e4863a734bf812e9b0 | [
"MIT"
] | null | null | null | physio2go/exercises/migrations/0003_auto_20161128_1753.py | hamole/physio2go | ebd14c9406e2b6818dc649e4863a734bf812e9b0 | [
"MIT"
] | null | null | null | physio2go/exercises/migrations/0003_auto_20161128_1753.py | hamole/physio2go | ebd14c9406e2b6818dc649e4863a734bf812e9b0 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.3 on 2016-11-28 06:53
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('exercises', '0002_auto_20161128_1718'),
]
operations = [
migrations.RenameModel(
old_name='Exercises',
new_name='Exercise',
),
]
| 20.1 | 49 | 0.621891 | 44 | 402 | 5.454545 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111486 | 0.263682 | 402 | 19 | 50 | 21.157895 | 0.699324 | 0.169154 | 0 | 0 | 1 | 0 | 0.148036 | 0.069486 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86ab8849571d80e31e545baaa8fc3a7e45faa001 | 6,176 | py | Python | tests/test_agent/test_manhole.py | guidow/pyfarm-agent | bb5d464f9f6549a3db3529a93e3d9f388b365586 | [
"Apache-2.0"
] | null | null | null | tests/test_agent/test_manhole.py | guidow/pyfarm-agent | bb5d464f9f6549a3db3529a93e3d9f388b365586 | [
"Apache-2.0"
] | null | null | null | tests/test_agent/test_manhole.py | guidow/pyfarm-agent | bb5d464f9f6549a3db3529a93e3d9f388b365586 | [
"Apache-2.0"
] | null | null | null | # No shebang line, this module is meant to be imported
#
# Copyright 2014 Oliver Palmer
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from collections import namedtuple
from pprint import pprint
from random import randint
from StringIO import StringIO
from textwrap import dedent
try:
from unittest.mock import patch
except ImportError: # pragma: no cover
from mock import patch
from twisted.internet.protocol import ServerFactory
from twisted.cred.portal import Portal
from twisted.conch.telnet import (
ITelnetProtocol, TelnetBootstrapProtocol, TelnetTransport)
from pyfarm.agent.testutil import TestCase
from pyfarm.agent.manhole import (
LoggingManhole, TransportProtocolFactory, TelnetRealm,
manhole_factory, show)
Peer = namedtuple("Peer", ("host", "port"))
class FakeLoggingManhole(LoggingManhole):
QUIT = False
GET_PEER_CALLS = 0
class terminal(object):
RIGHT_ARROW, LEFT_ARROW = None, None
class transport(object):
@classmethod
def getPeer(cls):
FakeLoggingManhole.GET_PEER_CALLS += 1
return Peer(os.urandom(12).encode("hex"), randint(1024, 65535))
def handle_QUIT(self):
self.QUIT = True
class TestManholeBase(TestCase):
def setUp(self):
TelnetRealm.NAMESPACE = None
FakeLoggingManhole.GET_PEER_CALLS = 0
FakeLoggingManhole.QUIT = False
class TestManholeFactory(TestManholeBase):
def test_assertions(self):
with self.assertRaises(AssertionError):
manhole_factory(None, "", "")
with self.assertRaises(AssertionError):
manhole_factory({}, None, "")
with self.assertRaises(AssertionError):
manhole_factory({}, "", None)
def test_instance_one(self):
namespace = {"bob": None}
username = os.urandom(32).encode("hex")
password = os.urandom(32).encode("hex")
manhole_factory(namespace, username, password)
with self.assertRaises(AssertionError):
manhole_factory(namespace, username, password)
def test_instance(self):
namespace = {"bob": None}
username = os.urandom(32).encode("hex")
password = os.urandom(32).encode("hex")
manhole = manhole_factory(namespace, username, password)
self.assertEqual(namespace, {"bob": None})
self.assertEqual(
TelnetRealm.NAMESPACE,
{"bob": None, "pp": pprint, "show": show})
self.assertIsInstance(manhole, ServerFactory)
self.assertIsInstance(manhole.protocol, TransportProtocolFactory)
self.assertIsInstance(manhole.protocol.portal, Portal)
# There could be multiple password checkers, check for the one
# we know we should have added.
for _, instance in manhole.protocol.portal.checkers.items():
found = False
for user, passwd in instance.users.items():
if user == username and passwd == password:
found = True
if found:
break
else:
self.fail("Failed to find correct username and password.")
def test_request_avatar(self):
realm = TelnetRealm()
avatar = realm.requestAvatar(None, ITelnetProtocol)
self.assertEqual(len(avatar), 3)
self.assertIs(avatar[0], ITelnetProtocol)
self.assertIsInstance(avatar[1], TelnetBootstrapProtocol)
self.assertTrue(callable(avatar[2]))
def test_request_avatar_error(self):
realm = TelnetRealm()
with self.assertRaises(NotImplementedError):
realm.requestAvatar(None, None)
def test_protocol_factory(self):
factory = TransportProtocolFactory(None)
transport = factory()
self.assertIsInstance(transport, TelnetTransport)
class TestManholeShow(TestManholeBase):
def test_uses_namespace(self):
namespace = {"bob": None}
username = os.urandom(32).encode("hex")
password = os.urandom(32).encode("hex")
manhole_factory(namespace, username, password)
output = StringIO()
with patch("sys.stdout", output):
show()
output.seek(0)
output = output.getvalue().strip()
self.assertEqual(output, "objects: ['bob', 'pp', 'show']")
def test_custom_object(self):
class Foobar(object):
a, b, c, d, e = True, 1, "yes", {}, 0.0
output = StringIO()
with patch("sys.stdout", output):
show(Foobar)
output.seek(0)
output = output.getvalue().strip()
self.assertEqual(
output,
dedent("""
data attributes of <class 'tests.test_agent.test_manhole.Foobar'>
a : True
b : 1
c : yes
d : {} (0 elements)
e : 0.0
""").strip())
def test_wrap_long_line(self):
class Foobar(object):
a = " " * 90
output = StringIO()
with patch("sys.stdout", output):
show(Foobar)
output.seek(0)
output = output.getvalue().strip()
self.assertEqual(
output,
dedent("""
data attributes of <class 'tests.test_agent.test_manhole.Foobar'>
a : ' """ +
""" '...
""").strip())
class TestLoggingManhole(TestManholeBase):
def test_line_received(self):
f = FakeLoggingManhole()
f.lineReceived("exit")
self.assertTrue(f.QUIT)
| 32.505263 | 79 | 0.615771 | 647 | 6,176 | 5.812983 | 0.33694 | 0.018612 | 0.017549 | 0.02712 | 0.279713 | 0.250997 | 0.238235 | 0.238235 | 0.227067 | 0.227067 | 0 | 0.011322 | 0.284974 | 6,176 | 189 | 80 | 32.677249 | 0.840353 | 0.114475 | 0 | 0.316176 | 0 | 0 | 0.106221 | 0.014485 | 0 | 0 | 0 | 0 | 0.147059 | 1 | 0.095588 | false | 0.073529 | 0.102941 | 0 | 0.286765 | 0.014706 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
86abdce88613d6ee71e638ae7487297146c3e7a8 | 338 | py | Python | func-button/klSigmode.py | xcgoo/uiKLine | 80683401d7dc66262ae645db4c2780d6e71be551 | [
"MIT"
] | 232 | 2017-10-11T09:19:03.000Z | 2022-03-09T01:34:49.000Z | func-button/klSigmode.py | DON-2020-LEE/uiKLine-2 | fd1d0dca5fd6b1542af4b10c110e39361b29d378 | [
"MIT"
] | 8 | 2017-12-09T09:10:15.000Z | 2021-04-22T03:35:26.000Z | func-button/klSigmode.py | DON-2020-LEE/uiKLine-2 | fd1d0dca5fd6b1542af4b10c110e39361b29d378 | [
"MIT"
] | 132 | 2017-10-11T09:16:29.000Z | 2022-02-09T10:37:57.000Z | # coding: utf-8
"""
插入所有需要的库,和函数
"""
#----------------------------------------------------------------------
def klSigmode(self):
"""查找模式"""
if self.mode == 'deal':
self.canvas.updateSig(self.signalsOpen)
self.mode = 'dealOpen'
else:
self.canvas.updateSig(self.signals)
self.mode = 'deal'
| 21.125 | 71 | 0.446746 | 30 | 338 | 5.033333 | 0.6 | 0.15894 | 0.15894 | 0.304636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003759 | 0.213018 | 338 | 15 | 72 | 22.533333 | 0.56391 | 0.301775 | 0 | 0 | 0 | 0 | 0.072398 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86ae868b0b9598e5f2e99607cce26d99b3a34dc3 | 4,147 | py | Python | vantage6/server/resource/recover.py | jaspersnel/vantage6-server | 88ad40d23cc36eaba57c170929f7ccdd0011720a | [
"Apache-2.0"
] | 2 | 2020-10-19T08:59:08.000Z | 2022-03-07T10:30:21.000Z | vantage6/server/resource/recover.py | jaspersnel/vantage6-server | 88ad40d23cc36eaba57c170929f7ccdd0011720a | [
"Apache-2.0"
] | 67 | 2020-04-15T09:43:31.000Z | 2022-03-18T08:29:17.000Z | vantage6/server/resource/recover.py | jaspersnel/vantage6-server | 88ad40d23cc36eaba57c170929f7ccdd0011720a | [
"Apache-2.0"
] | 2 | 2021-01-21T15:09:26.000Z | 2021-04-19T14:58:10.000Z | # -*- coding: utf-8 -*-
import logging
import datetime
from flask import request, render_template
from flask_jwt_extended import (
create_access_token,
decode_token
)
from jwt.exceptions import DecodeError
from flasgger import swag_from
from http import HTTPStatus
from pathlib import Path
from sqlalchemy.orm.exc import NoResultFound
from vantage6.common import logger_name
from vantage6.server import db
from vantage6.server.resource import (
ServicesResources
)
module_name = logger_name(__name__)
log = logging.getLogger(module_name)
def setup(api, api_base, services):
path = "/".join([api_base, module_name])
log.info(f'Setting up "{path}" and subdirectories')
api.add_resource(
ResetPassword,
path+'/reset',
endpoint="reset_password",
methods=('POST',),
resource_class_kwargs=services
)
api.add_resource(
RecoverPassword,
path+'/lost',
endpoint='recover_password',
methods=('POST',),
resource_class_kwargs=services
)
# ------------------------------------------------------------------------------
# Resources / API's
# ------------------------------------------------------------------------------
class ResetPassword(ServicesResources):
"""user can use recover token to reset their password."""
@swag_from(str(Path(r"swagger/post_reset_password.yaml")),
endpoint='reset_password')
def post(self):
""""submit email-adress receive token."""
# retrieve user based on email or username
body = request.get_json()
reset_token = body.get("reset_token")
password = body.get("password")
if not reset_token or not password:
return {"msg": "reset token and/or password is missing!"}, \
HTTPStatus.BAD_REQUEST
# obtain user
try:
user_id = decode_token(reset_token)['identity'].get('id')
except DecodeError:
return {"msg": "Invalid recovery token!"}, HTTPStatus.BAD_REQUEST
log.debug(user_id)
user = db.User.get(user_id)
# set password
user.set_password(password)
user.save()
log.info(f"Successfull password reset for '{user.username}'")
return {"msg": "password successfully been reset!"}, \
HTTPStatus.OK
class RecoverPassword(ServicesResources):
"""send a mail containing a recover token"""
@swag_from(str(Path(r"swagger/post_recover_password.yaml")),
endpoint='recover_password')
def post(self):
"""username or email generates a token which is mailed."""
# default return string
ret = {"msg": "If the username or email is our database you "
"will soon receive an email"}
# obtain username/email from request'
body = request.get_json()
username = body.get("username")
email = body.get("email")
if not (email or username):
return {"msg": "No username or email provided!"}, \
HTTPStatus.BAD_REQUEST
# find user in the database, if not here we stop!
try:
if username:
user = db.User.get_by_username(username)
else:
user = db.User.get_by_email(email)
except NoResultFound:
# we do not tell them.... But we won't continue either
return ret
log.info(f"Password reset requested for '{user.username}'")
# generate a token that can reset their password
expires = datetime.timedelta(hours=1)
reset_token = create_access_token(
{"id": str(user.id)}, expires_delta=expires
)
self.mail.send_email(
"password reset",
sender="support@vantage6.ai",
recipients=[user.email],
text_body=render_template("mail/reset_password_token.txt",
token=reset_token),
html_body=render_template("mail/reset_password_token.html",
token=reset_token)
)
return ret
| 30.718519 | 80 | 0.590306 | 459 | 4,147 | 5.187364 | 0.346405 | 0.033599 | 0.01008 | 0.01638 | 0.107518 | 0.094918 | 0.094918 | 0 | 0 | 0 | 0 | 0.002001 | 0.276827 | 4,147 | 134 | 81 | 30.947761 | 0.791931 | 0.15674 | 0 | 0.175824 | 0 | 0 | 0.180375 | 0.036075 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032967 | false | 0.21978 | 0.131868 | 0 | 0.252747 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
86b4af0033c71e00f4e30f0ac3bfd045c1932aa8 | 760 | py | Python | examples/server/models/image_file_upload.py | ParikhKadam/django-angular | 1fdd2ab3211ed1655acc2d172d826ed7f3ad0574 | [
"MIT"
] | 941 | 2015-01-01T18:17:43.000Z | 2022-02-26T07:45:40.000Z | examples/server/models/image_file_upload.py | ParikhKadam/django-angular | 1fdd2ab3211ed1655acc2d172d826ed7f3ad0574 | [
"MIT"
] | 228 | 2015-01-11T16:36:34.000Z | 2022-03-11T23:17:15.000Z | examples/server/models/image_file_upload.py | ParikhKadam/django-angular | 1fdd2ab3211ed1655acc2d172d826ed7f3ad0574 | [
"MIT"
] | 294 | 2015-01-04T09:01:33.000Z | 2022-02-26T07:45:41.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
# start tutorial
from django.db import models
from djng.forms import NgModelFormMixin, NgFormValidationMixin
from djng.styling.bootstrap3.forms import Bootstrap3ModelForm
class SubscribeUser(models.Model):
full_name = models.CharField(
"Full name",
max_length=99)
avatar = models.ImageField("Avatar", blank=False, null=True)
permit = models.FileField("Permit", blank=True, null=True)
class SubscribeForm(NgModelFormMixin, NgFormValidationMixin, Bootstrap3ModelForm):
use_required_attribute = False
scope_prefix = 'subscribe_data'
form_name = 'my_form'
class Meta:
model = SubscribeUser
fields = ['full_name', 'avatar', 'permit']
| 28.148148 | 82 | 0.728947 | 83 | 760 | 6.506024 | 0.590361 | 0.044444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009569 | 0.175 | 760 | 26 | 83 | 29.230769 | 0.851675 | 0.047368 | 0 | 0 | 0 | 0 | 0.087379 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.235294 | 0 | 0.764706 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
86bd70e7874de4b570ed32325f28d65eaa058486 | 4,544 | py | Python | mandoline/line_segment3d.py | Spiritdude/mandoline-py | 702cd1f9264c7d5d814600ff919406387fd86185 | [
"BSD-2-Clause"
] | 5 | 2021-09-16T10:41:44.000Z | 2021-11-04T14:45:24.000Z | mandoline/line_segment3d.py | Spiritdude/mandoline-py | 702cd1f9264c7d5d814600ff919406387fd86185 | [
"BSD-2-Clause"
] | null | null | null | mandoline/line_segment3d.py | Spiritdude/mandoline-py | 702cd1f9264c7d5d814600ff919406387fd86185 | [
"BSD-2-Clause"
] | null | null | null |
class LineSegment3D(object):
"""A class to represent a 3D line segment."""
def __init__(self, p1, p2):
"""Initialize with two endpoints."""
if p1 > p2:
p1, p2 = (p2, p1)
self.p1 = p1
self.p2 = p2
self.count = 1
def __len__(self):
"""Line segment always has two endpoints."""
return 2
def __iter__(self):
"""Iterator generator for endpoints."""
yield self.p1
yield self.p2
def __getitem__(self, idx):
"""Given a vertex number, returns a vertex coordinate vector."""
if idx == 0:
return self.p1
if idx == 1:
return self.p2
raise LookupError()
def __hash__(self):
"""Returns hash value for endpoints"""
return hash((self.p1, self.p2))
def __lt__(self, p):
return self < p
def __cmp__(self, p):
"""Compare points for sort ordering in an arbitrary heirarchy."""
val = self[0].__cmp__(p[0])
if val != 0:
return val
return self[1].__cmp__(p[1])
def __format__(self, fmt):
"""Provides .format() support."""
pfx = ""
sep = " - "
sfx = ""
if "a" in fmt:
pfx = "["
sep = ", "
sfx = "]"
elif "s" in fmt:
pfx = ""
sep = " "
sfx = ""
p1 = self.p1.__format__(fmt)
p2 = self.p2.__format__(fmt)
return pfx + p1 + sep + p2 + sfx
def __repr__(self):
"""Standard string representation."""
return "<LineSegment3D: {0}>".format(self)
def __str__(self):
"""Returns a human readable coordinate string."""
return "{0:a}".format(self)
def translate(self,offset):
"""Translate the endpoint's vertices"""
self.p1 = (self.p1[a] + offset[a] for a in range(3))
self.p2 = (self.p2[a] + offset[a] for a in range(3))
def scale(self,scale):
"""Translate the endpoint's vertices"""
self.p1 = (self.p1[a] * scale[a] for a in range(3))
self.p2 = (self.p2[a] * scale[a] for a in range(3))
def length(self):
"""Returns the length of the line."""
return self.p1.distFromPoint(self.p2)
class LineSegment3DCache(object):
"""Cache class for 3D Line Segments."""
def __init__(self):
"""Initialize as an empty cache."""
self.endhash = {}
self.seghash = {}
def _add_endpoint(self, p, seg):
"""Remember that this segment has a given endpoint"""
if p not in self.endhash:
self.endhash[p] = []
self.endhash[p].append(seg)
def rehash(self):
"""Reset the hashes for changed edge vertices"""
oldseghash = self.seghash
self.seghash = {
(v[0], v[1]): v
for v in oldseghash.values()
}
oldendhash = self.endhash
self.endhash = {
k: v
for v in oldendhash.values()
for k in v
}
def translate(self,offset):
"""Translate vertices of all edges."""
for v in self.seghash.values():
v.translate(offset)
self.rehash()
def scale(self,scale):
"""Scale vertices of all edges."""
for v in self.seghash.values():
v.scale(scale)
self.rehash()
def endpoint_segments(self, p):
"""get list of edges that end at point p"""
if p not in self.endhash:
return []
return self.endhash[p]
def get(self, p1, p2):
"""Given 2 endpoints, return the cached LineSegment3D inst, if any."""
key = (p1, p2) if p1 < p2 else (p2, p1)
if key not in self.seghash:
return None
return self.seghash[key]
def add(self, p1, p2):
"""Given 2 endpoints, return the (new or cached) LineSegment3D inst."""
key = (p1, p2) if p1 < p2 else (p2, p1)
if key in self.seghash:
seg = self.seghash[key]
seg.count += 1
return seg
seg = LineSegment3D(p1, p2)
self.seghash[key] = seg
self._add_endpoint(p1, seg)
self._add_endpoint(p2, seg)
return seg
def __iter__(self):
"""Creates an iterator for the line segments in the cache."""
for pt in self.seghash.values():
yield pt
def __len__(self):
"""Length of sequence."""
return len(self.seghash)
# vim: expandtab tabstop=4 shiftwidth=4 softtabstop=4 nowrap
| 28.223602 | 79 | 0.53015 | 573 | 4,544 | 4.073298 | 0.230366 | 0.033419 | 0.027849 | 0.011997 | 0.218081 | 0.183376 | 0.167095 | 0.164524 | 0.116538 | 0.116538 | 0 | 0.028485 | 0.34331 | 4,544 | 160 | 80 | 28.4 | 0.753686 | 0.224692 | 0 | 0.205607 | 0 | 0 | 0.010291 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214953 | false | 0 | 0 | 0.009346 | 0.401869 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
86cc1ac4bd0be7cfc736232b574de4d27f85e0ca | 6,656 | py | Python | discord/types/interactions.py | Voxel-Fox-Ltd/Novus | 3e254115daf1c09455b26dc7819b73fbf5ee56e5 | [
"MIT"
] | 61 | 2021-08-30T05:30:31.000Z | 2022-03-24T11:24:38.000Z | discord/types/interactions.py | Voxel-Fox-Ltd/Novus | 3e254115daf1c09455b26dc7819b73fbf5ee56e5 | [
"MIT"
] | 30 | 2021-08-31T10:16:42.000Z | 2022-03-09T22:53:15.000Z | discord/types/interactions.py | Voxel-Fox-Ltd/Novus | 3e254115daf1c09455b26dc7819b73fbf5ee56e5 | [
"MIT"
] | 15 | 2021-09-02T09:40:58.000Z | 2022-02-25T12:19:02.000Z | """
The MIT License (MIT)
Copyright (c) 2015-2021 Rapptz
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
"""
from __future__ import annotations
from typing import Optional, TYPE_CHECKING, Dict, TypedDict, Union, List, Literal
from .snowflake import Snowflake
from .components import Component, SelectOption
from .embed import Embed
from .channel import ChannelType, Channel
from .member import Member
from .role import Role
from .user import User
if TYPE_CHECKING:
from .message import AllowedMentions, Message
ApplicationCommandType = Literal[1, 2, 3]
class ApplicationCommand(TypedDict):
id: Snowflake
application_id: Snowflake
name: str
description: str
options: Optional[List[ApplicationCommandOption]]
type: Optional[ApplicationCommandType]
ApplicationCommandOptionType = Literal[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
class ApplicationCommandOption(TypedDict):
type: ApplicationCommandOptionType
name: str
description: str
required: bool
choices: Optional[List[ApplicationCommandOptionChoice]]
options: Optional[List[ApplicationCommandOption]]
class ApplicationCommandOptionChoice(TypedDict):
name: str
value: Union[str, int]
ApplicationCommandPermissionType = Literal[1, 2]
class ApplicationCommandPermissions(TypedDict):
id: Snowflake
type: ApplicationCommandPermissionType
permission: bool
class BaseGuildApplicationCommandPermissions(TypedDict):
permissions: List[ApplicationCommandPermissions]
class PartialGuildApplicationCommandPermissions(BaseGuildApplicationCommandPermissions):
id: Snowflake
class GuildApplicationCommandPermissions(PartialGuildApplicationCommandPermissions):
application_id: Snowflake
guild_id: Snowflake
InteractionType = Literal[1, 2, 3]
class _ApplicationCommandInteractionDataOption(TypedDict):
name: str
class _ApplicationCommandInteractionDataOptionSubcommand(_ApplicationCommandInteractionDataOption):
type: Literal[1, 2]
options: List[ApplicationCommandInteractionDataOption]
class _ApplicationCommandInteractionDataOptionString(_ApplicationCommandInteractionDataOption):
type: Literal[3]
value: str
class _ApplicationCommandInteractionDataOptionInteger(_ApplicationCommandInteractionDataOption):
type: Literal[4]
value: int
class _ApplicationCommandInteractionDataOptionBoolean(_ApplicationCommandInteractionDataOption):
type: Literal[5]
value: bool
class _ApplicationCommandInteractionDataOptionSnowflake(_ApplicationCommandInteractionDataOption):
type: Literal[6, 7, 8, 9]
value: Snowflake
class _ApplicationCommandInteractionDataOptionNumber(_ApplicationCommandInteractionDataOption):
type: Literal[10]
value: float
ApplicationCommandInteractionDataOption = Union[
_ApplicationCommandInteractionDataOptionString,
_ApplicationCommandInteractionDataOptionInteger,
_ApplicationCommandInteractionDataOptionSubcommand,
_ApplicationCommandInteractionDataOptionBoolean,
_ApplicationCommandInteractionDataOptionSnowflake,
_ApplicationCommandInteractionDataOptionNumber,
]
class ApplicationCommandResolvedPartialChannel(TypedDict):
id: Snowflake
type: ChannelType
permissions: str
name: str
class ApplicationCommandInteractionDataResolved(TypedDict, total=False):
users: Dict[Snowflake, User]
members: Dict[Snowflake, Member]
roles: Dict[Snowflake, Role]
channels: Dict[Snowflake, ApplicationCommandResolvedPartialChannel]
class ApplicationCommandInteractionDataOption(TypedDict):
name: str
type: int
value: Optional[str] # Optional[ApplicationCommandOptionType]
options: Optional[ApplicationCommandInteractionDataOption]
focused: Optional[bool]
components: Optional[List[ApplicationCommandInteractionDataOption]]
class _InteractionDataOptional(TypedDict, total=False):
resolved: Dict[str, dict]
options: List[ApplicationCommandInteractionDataOption]
custom_id: str
component_type: int
values: List[str]
target_id: Snowflake
components: List[ApplicationCommandInteractionDataOption]
class InteractionData(_InteractionDataOptional):
id: Snowflake
name: str
type: ApplicationCommandType
class InteractionResolved(TypedDict):
users: List[Union[User, Member]]
members: List[Member]
roles: List[Role]
channels: List[Channel]
messages: List[Message]
class _InteractionOptional(TypedDict, total=False):
data: InteractionData
guild_id: Snowflake
channel_id: Snowflake
member: Member
user: User
message: Message
guild_locale: str
class Interaction(_InteractionOptional):
id: Snowflake
application_id: Snowflake
type: InteractionType
token: str
version: int
resolved: InteractionResolved
locale: str
class InteractionApplicationCommandCallbackData(TypedDict, total=False):
tts: bool
content: str
embeds: List[Embed]
allowed_mentions: AllowedMentions
flags: int
components: List[Component]
InteractionResponseType = Literal[1, 4, 5, 6, 7]
class _InteractionResponseOptional(TypedDict, total=False):
data: InteractionApplicationCommandCallbackData
class InteractionResponse(_InteractionResponseOptional):
type: InteractionResponseType
class MessageInteraction(TypedDict):
id: Snowflake
type: InteractionType
name: str
user: User
class _EditApplicationCommandOptional(TypedDict, total=False):
description: str
options: Optional[List[ApplicationCommandOption]]
type: ApplicationCommandType
class EditApplicationCommand(_EditApplicationCommandOptional):
name: str
default_permission: bool
| 27.618257 | 99 | 0.790415 | 614 | 6,656 | 8.495114 | 0.32899 | 0.029525 | 0.057515 | 0.005752 | 0.063842 | 0.02339 | 0.02339 | 0 | 0 | 0 | 0 | 0.007615 | 0.151593 | 6,656 | 240 | 100 | 27.733333 | 0.916062 | 0.167819 | 0 | 0.239726 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.068493 | 0 | 0.90411 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
86ce3c0225876fe3453133a9a0965f8f30c17a84 | 5,447 | py | Python | flask_unchained/bundles/session/config.py | achiang/flask-unchained | 12788a6e618904a25ff2b571eb05ff1dc8f1840f | [
"MIT"
] | null | null | null | flask_unchained/bundles/session/config.py | achiang/flask-unchained | 12788a6e618904a25ff2b571eb05ff1dc8f1840f | [
"MIT"
] | null | null | null | flask_unchained/bundles/session/config.py | achiang/flask-unchained | 12788a6e618904a25ff2b571eb05ff1dc8f1840f | [
"MIT"
] | null | null | null | import os
from datetime import timedelta
from flask_unchained import BundleConfig
try:
from flask_unchained.bundles.sqlalchemy import db
except ImportError:
db = None
class _DefaultFlaskConfigForSessions(BundleConfig):
SESSION_COOKIE_NAME = 'session'
"""
The name of the session cookie.
Defaults to ``'session'``.
"""
SESSION_COOKIE_DOMAIN = None
"""
The domain for the session cookie. If this is not set, the cookie will be
valid for all subdomains of ``SERVER_NAME``.
Defaults to ``None``.
"""
SESSION_COOKIE_PATH = None
"""
The path for the session cookie. If this is not set the cookie will be valid
for all of ``APPLICATION_ROOT`` or if that is not set for '/'.
Defaults to ``None``.
"""
SESSION_COOKIE_HTTPONLY = True
"""
Controls if the cookie should be set with the ``httponly`` flag. Browsers will
not allow JavaScript access to cookies marked as ``httponly`` for security.
Defaults to ``True``.
"""
SESSION_COOKIE_SECURE = False
"""
Controls if the cookie should be set with the ``secure`` flag. Browsers will
only send cookies with requests over HTTPS if the cookie is marked ``secure``.
The application must be served over HTTPS for this to make sense.
Defaults to ``False``.
"""
PERMANENT_SESSION_LIFETIME = timedelta(days=31)
"""
The lifetime of a permanent session as ``datetime.timedelta`` object or an
integer representing seconds.
Defaults to 31 days.
"""
SESSION_COOKIE_SAMESITE = None
"""
Restrict how cookies are sent with requests from external sites. Limits the
scope of the cookie such that it will only be attached to requests if those
requests are "same-site". Can be set to ``'Lax'`` (recommended) or ``'Strict'``.
Defaults to ``None``.
"""
SESSION_REFRESH_EACH_REQUEST = True
"""
Controls the set-cookie behavior. If set to ``True`` a permanent session will
be refreshed each request and get their lifetime extended, if set to ``False``
it will only be modified if the session actually modifies. Non permanent sessions
are not affected by this and will always expire if the browser window closes.
Defaults to ``True``.
"""
class Config(_DefaultFlaskConfigForSessions):
"""
Default configuration options for the Session Bundle.
"""
SESSION_TYPE = 'null'
"""
Specifies which type of session interface to use. Built-in session types:
- ``'null'``: :class:`~flask_unchained.bundles.session.session_interfaces.NullSessionInterface` (default)
- ``'redis'``: :class:`~flask_unchained.bundles.session.session_interfaces.RedisSessionInterface`
- ``'memcached'``: :class:`~flask_unchained.bundles.session.session_interfaces.MemcachedSessionInterface`
- ``'filesystem'``: :class:`~flask_unchained.bundles.session.session_interfaces.FileSystemSessionInterface`
- ``'mongodb'``: :class:`~flask_unchained.bundles.session.session_interfaces.MongoDBSessionInterface`
- ``'sqlalchemy'``: :class:`~flask_unchained.bundles.session.session_interfaces.SqlAlchemySessionInterface`
Defaults to ``'null'``.
"""
SESSION_PERMANENT = True
"""
Whether use permanent session or not.
Defaults to ``True``.
"""
SESSION_USE_SIGNER = False
"""
Whether sign the session cookie sid or not. If set to ``True``, you have to
set ``SECRET_KEY``.
Defaults to ``False``.
"""
SESSION_KEY_PREFIX = 'session:'
"""
A prefix that is added before all session keys. This makes it possible to use
the same backend storage server for different apps.
Defaults to ``'session:'``.
"""
SESSION_REDIS = None
"""
A :class:`redis.Redis` instance.
By default, connect to ``127.0.0.1:6379``.
"""
SESSION_MEMCACHED = None
"""
A :class:`memcached.Client` instance.
By default, connect to ``127.0.0.1:11211``.
"""
SESSION_FILE_DIR = os.path.join(os.getcwd(), 'flask_sessions')
"""
The folder where session files are stored.
Defaults to using a folder named ``flask_sessions`` in your current working
directory.
"""
SESSION_FILE_THRESHOLD = 500
"""
The maximum number of items the session stores before it starts deleting some.
Defaults to 500.
"""
SESSION_FILE_MODE = 0o600
"""
The file mode wanted for the session files. Should be specified as an octal,
eg ``0o600``.
Defaults to ``0o600``.
"""
SESSION_MONGODB = None
"""
A :class:`pymongo.MongoClient` instance.
By default, connect to ``127.0.0.1:27017``.
"""
SESSION_MONGODB_DB = 'flask_session'
"""
The MongoDB database you want to use.
Defaults to ``'flask_session'``.
"""
SESSION_MONGODB_COLLECT = 'sessions'
"""
The MongoDB collection you want to use.
Defaults to ``'sessions'``.
"""
SESSION_SQLALCHEMY = db
"""
A :class:`~flask_unchained.bundles.sqlalchemy.SQLAlchemy` extension instance.
"""
SESSION_SQLALCHEMY_TABLE = 'flask_sessions'
"""
The name of the SQL table you want to use.
Defaults to ``flask_sessions``.
"""
SESSION_SQLALCHEMY_MODEL = None
"""
Set this if you need to customize the
:class:`~flask_unchained.bundles.sqlalchemy.BaseModel` subclass used for
storing sessions in the database.
"""
| 27.371859 | 111 | 0.668258 | 672 | 5,447 | 5.315476 | 0.318452 | 0.050392 | 0.052912 | 0.058231 | 0.221165 | 0.18589 | 0.179731 | 0.080627 | 0.080627 | 0.033035 | 0 | 0.012814 | 0.226363 | 5,447 | 198 | 112 | 27.510101 | 0.834836 | 0.00973 | 0 | 0 | 0 | 0 | 0.054662 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.15625 | 0 | 0.9375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
86d39cbeb38ed832359d8101e1462aeccc15eee8 | 1,400 | py | Python | src/knownnodes.py | skeevey/PyBitmessage | 196d688b138393d1d540df3322844dfe7e7c02ba | [
"MIT"
] | 1 | 2018-04-25T08:08:47.000Z | 2018-04-25T08:08:47.000Z | src/knownnodes.py | skeevey/PyBitmessage | 196d688b138393d1d540df3322844dfe7e7c02ba | [
"MIT"
] | null | null | null | src/knownnodes.py | skeevey/PyBitmessage | 196d688b138393d1d540df3322844dfe7e7c02ba | [
"MIT"
] | 1 | 2018-04-25T08:08:48.000Z | 2018-04-25T08:08:48.000Z | import pickle
import threading
from bmconfigparser import BMConfigParser
import state
knownNodesLock = threading.Lock()
knownNodes = {}
knownNodesTrimAmount = 2000
def saveKnownNodes(dirName = None):
if dirName is None:
dirName = state.appdata
with knownNodesLock:
with open(dirName + 'knownnodes.dat', 'wb') as output:
pickle.dump(knownNodes, output)
def increaseRating(peer):
increaseAmount = 0.1
maxRating = 1
with knownNodesLock:
for stream in knownNodes.keys():
try:
knownNodes[stream][peer]["rating"] = min(knownNodes[stream][peer]["rating"] + increaseAmount, maxRating)
except KeyError:
pass
def decreaseRating(peer):
decreaseAmount = 0.1
minRating = -1
with knownNodesLock:
for stream in knownNodes.keys():
try:
knownNodes[stream][peer]["rating"] = max(knownNodes[stream][peer]["rating"] - decreaseAmount, minRating)
except KeyError:
pass
def trimKnownNodes(recAddrStream = 1):
if len(knownNodes[recAddrStream]) < BMConfigParser().get("knownnodes", "maxnodes"):
return
with knownNodesLock:
oldestList = sorted(knownNodes[recAddrStream], key=lambda x: x['lastseen'])[:knownNodesTrimAmount]
for oldest in oldestList:
del knownNodes[recAddrStream][oldest]
| 30.434783 | 120 | 0.648571 | 134 | 1,400 | 6.776119 | 0.440299 | 0.079295 | 0.088106 | 0.114537 | 0.160793 | 0.160793 | 0.160793 | 0.160793 | 0.160793 | 0.160793 | 0 | 0.010496 | 0.251429 | 1,400 | 45 | 121 | 31.111111 | 0.855916 | 0 | 0 | 0.315789 | 0 | 0 | 0.047143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0.052632 | 0.105263 | 0 | 0.236842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.