hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
f3f8f3a15cc9b7db02979a9f612225621dc97f24 | 3,751 | py | Python | nanopores/scripts/ahemIV.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | 8 | 2016-09-07T01:59:31.000Z | 2021-03-06T12:14:31.000Z | nanopores/scripts/ahemIV.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | null | null | null | nanopores/scripts/ahemIV.py | jhwnkim/nanopores | 98b3dbb5d36464fbdc03f59d224d38e4255324ce | [
"MIT"
] | 4 | 2017-12-06T17:43:01.000Z | 2020-05-01T05:41:14.000Z | import math, nanopores, dolfin
# @Benjamin, Gregor TODO:
# -) check permittivity and surface charge of ahem
# -) what biased voltage to use?
# some default values for parameters
### geo params [nm]
geo_name = "aHem"
domscale = 1.
l4 = 15.
l3 = 15.
R = 20.
r0 = 5.
z0 = 10.
exit_i = 1
badexit = {"upperbulkb"}
goodexit = {"exit"}
### phys params
phys_name = "pore_molecule"
bV = .5 # [V]
ahemqs = 0.0 # [C/m**2]
rTarget = 0.5e-9 # [m] for diffusion coeff.
bulkcon = 1000
### num params
clscale = 10.
refinement = False
maxcells = 50e3
newtondamp = 1.0
reuse_mesh = False
tolnewton = 1e-1
skip_stokes = True
iterative = True
def _update(dic, dic2): # conservative update
dic.update({key:dic2[key] for key in dic2 if not key in dic})
def _globals(): # globals except hidden ones ("_"), modules and functions
from types import ModuleType, FunctionType
return {key : var for key, var in globals().items() if not
(key.startswith("_") or isinstance(var, ModuleType) or isinstance(var, FunctionType))}
def calculate(**params):
# this time we do it the simple way: just pass every parameter
# have to be careful though
globals().update(params)
params.update(_globals())
# use some of the parameters
params["x0"] = [r0, 0., z0]
params["l3"] = l3*domscale
params["R"] = R*domscale
    # TODO: does this do anything?
nanopores.IllposedNonlinearSolver.newtondamp = newtondamp
nanopores.PNPS.tolnewton = tolnewton
t = dolfin.Timer("meshing")
geo = nanopores.geo_from_xml_threadsafe(geo_name, **params)
print "Mesh generation time:",t.stop()
#dolfin.plot(geo.submesh("solid"), interactive=True)
phys = nanopores.Physics(phys_name, geo, **params)
t = dolfin.Timer("PNPS")
pnps = nanopores.PNPS(geo, phys)
if skip_stokes:
pnps.solvers.pop("Stokes")
pnps.alwaysstokes = True
pnps.solve()
print "Time to calculate F:",t.stop()
#pnps.visualize("fluid")
(v, cp, cm, u, p) = pnps.solutions(deepcopy=True)
# F = phys.Feff(v, u)
# def avg(u, dx):
# return dolfin.assemble(u*dx)/dolfin.assemble(dolfin.Constant(1.)*dx)
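    # Current components along z: diffusive, drift (migration) and convective (Stokes)
    # contributions, each integrated over the pore region and divided by the pore length.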
Jcomp = ["Jzdiff", "Jzdrift", "Jzstokes"]
lPore = geo.params["ltop"]+geo.params["lctr"]+geo.params["lbtm"]
Jzdiff = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.D*phys.rDPore*phys.grad(-cp+cm)[2] /lPore * geo.dx("pore")
Jzdrift = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.mu*phys.rDPore*(-cp-cm)*phys.grad(v)[2]/lPore * geo.dx("pore")
Jzstokes = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.stokesdampPore*(cp-cm)*u[2]/lPore * geo.dx("pore")
Jcomponents = [j+p for j in Jcomp for p in ["top","btm"]]
Jzdifftop = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.D*phys.rDPore*phys.grad(-cp+cm)[2] /geo.params["ltop"] * geo.dx("poretop")
Jzdrifttop = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.mu*phys.rDPore*(-cp-cm)*phys.grad(v)[2]/geo.params["ltop"] * geo.dx("poretop")
Jzstokestop = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.stokesdampPore*(cp-cm)*u[2]/geo.params["ltop"] * geo.dx("poretop")
Jzdiffbtm = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.D*phys.rDPore*phys.grad(-cp+cm)[2] /geo.params["lbtm"] * geo.dx("porebottom")
Jzdriftbtm = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.mu*phys.rDPore*(-cp-cm)*phys.grad(v)[2]/geo.params["lbtm"] * geo.dx("porebottom")
Jzstokesbtm = dolfin.Constant((1.0/phys.lscale)**2) * phys.cFarad*phys.stokesdampPore*(cp-cm)*u[2]/geo.params["lbtm"] * geo.dx("porebottom")
result = pnps.get_functionals()
for j in Jcomp+Jcomponents:
result.update({j: 1e12*dolfin.assemble(locals()[j])})
return result
| 37.138614 | 155 | 0.662223 | 558 | 3,751 | 4.419355 | 0.333333 | 0.040146 | 0.060827 | 0.058394 | 0.316302 | 0.29927 | 0.29927 | 0.256691 | 0.256691 | 0.256691 | 0 | 0.026641 | 0.159424 | 3,751 | 100 | 156 | 37.51 | 0.755471 | 0.16369 | 0 | 0 | 0 | 0 | 0.071084 | 0 | 0 | 0 | 0 | 0.01 | 0 | 0 | null | null | 0 | 0.030303 | null | null | 0.030303 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f3fc3878013c76ce0d72ef7770e074772b56e099 | 9,459 | py | Python | pysynphot/test/test_tokenizer.py | dobos/pysynphot | 5d2e0b52ceda78890940ac9239c2d88e149e0bed | [
"BSD-3-Clause"
] | 24 | 2015-01-04T23:38:21.000Z | 2022-02-01T00:11:07.000Z | pysynphot/test/test_tokenizer.py | dobos/pysynphot | 5d2e0b52ceda78890940ac9239c2d88e149e0bed | [
"BSD-3-Clause"
] | 126 | 2015-01-29T14:50:37.000Z | 2022-02-15T01:58:13.000Z | pysynphot/test/test_tokenizer.py | dobos/pysynphot | 5d2e0b52ceda78890940ac9239c2d88e149e0bed | [
"BSD-3-Clause"
] | 25 | 2015-02-09T12:12:02.000Z | 2021-09-09T13:06:54.000Z | from __future__ import absolute_import, division, print_function
import pytest
from ..spparser import Scanner
scanner = Scanner()
# Test of a single instance of each token. Does not test them in
# context, but at least it tests that each one is recognized.
tokens = [
# bug: the original pysynphot could not recognize integer
# ('INTEGER', '1'),
# basic float
('FLOAT', '.1'),
('FLOAT', '1.1'),
('FLOAT', '1.'),
('FLOAT', '1'),
# basic float with e+
('FLOAT', '.1e+1'),
('FLOAT', '1.1e+1'),
('FLOAT', '1.e+1'),
('FLOAT', '1e+1'),
# basic float with e-
('FLOAT', '.1e-1'),
('FLOAT', '1.1e-1'),
('FLOAT', '1.e-1'),
('FLOAT', '1e-1'),
# basic float with e
('FLOAT', '.1e1'),
('FLOAT', '1.1e1'),
('FLOAT', '1.e1'),
('FLOAT', '1e1'),
# identifier
('IDENTIFIER', 'xyzzy'),
('IDENTIFIER', 'xyzzy20'),
('IDENTIFIER', '20xyzzy'),
('IDENTIFIER', '20xyzzy20'),
# special characters
('LPAREN', '('),
('RPAREN', ')'),
(',', ','),
('/', ' / '),
# filename
('IDENTIFIER', '/a/b/c'),
('IDENTIFIER', 'foo$bar'),
('IDENTIFIER', 'a/b'),
# file list
('FILELIST', '@arf'),
('FILELIST', '@narf')]
def print_token_list(tklist):
s = 'Token list: {} items\n'.format(len(tklist))
for x in tklist:
        s += '{:<20s} {}\n'.format(x.type, x.attr)
s += '---\n'
return s
def ptl2(tkl):
"""
Use this to generate the list of tokens in a form easy to copy/paste
into a test.
"""
s = ''
for x in tkl:
s += ' ( "{}", {} ), \n'.format(x.type, repr(x.attr))
s += '\n'
return s
def stream_t(text, result):
"""
Parse a bit of text and compare it to the expected token stream.
Each actual test function calls this.
"""
tkl = scanner.tokenize(text)
msg = print_token_list(tkl)
assert result is not None, \
msg + 'NO EXPECT LIST\n [\n' + ptl2(tkl) + ' ]\n'
for n, (expect, actual) in enumerate(zip(result, tkl)):
assert expect[0] == actual.type and expect[1] == actual.attr, \
(msg + '{} expect={} actual=({}, {})'.format(
n, expect, actual.type, actual.attr))
@pytest.mark.parametrize(
('text', 'result'),
[('spec($PYSYN_CDBS//calspec/gd71_mod_005.fits)',
[('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', '$PYSYN_CDBS//calspec/gd71_mod_005.fits'),
('RPAREN', None)]),
(('spec(earthshine.fits)*0.5+rn(spec(Zodi.fits),'
'band(johnson,v),22.7,vegamag)+(spec(el1215a.fits)+'
'spec(el1302a.fits)+spec(el1356a.fits)+'
'spec(el2471a.fits))*0.5'),
[('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'earthshine.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.5'),
('+', None),
('IDENTIFIER', 'rn'),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'Zodi.fits'),
('RPAREN', None),
(',', None),
('IDENTIFIER', 'band'),
('LPAREN', None),
('IDENTIFIER', 'johnson'),
(',', None),
('IDENTIFIER', 'v'),
('RPAREN', None),
(',', None),
('FLOAT', '22.7'),
(',', None),
('IDENTIFIER', 'vegamag'),
('RPAREN', None),
('+', None),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1215a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1302a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1356a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el2471a.fits'),
('RPAREN', None),
('RPAREN', None),
('*', None),
('FLOAT', '0.5')]),
(('spec(earthshine.fits)*0.5+rn(spec(Zodi.fits),'
'band(johnson,v),22.7,vegamag)+(spec(el1215a.fits)*0.1+'
'spec(el1302a.fits)*0.066666667+spec(el1356a.fits)*0.0060+'
'spec(el2471a.fits)*0.0050)'),
[('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'earthshine.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.5'),
('+', None),
('IDENTIFIER', 'rn'),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'Zodi.fits'),
('RPAREN', None),
(',', None),
('IDENTIFIER', 'band'),
('LPAREN', None),
('IDENTIFIER', 'johnson'),
(',', None),
('IDENTIFIER', 'v'),
('RPAREN', None),
(',', None),
('FLOAT', '22.7'),
(',', None),
('IDENTIFIER', 'vegamag'),
('RPAREN', None),
('+', None),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1215a.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.1'),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1302a.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.066666667'),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1356a.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.0060'),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el2471a.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.0050'),
('RPAREN', None)]),
(('spec(earthshine.fits)*0.5+rn(spec(Zodi.fits),band(johnson,v),'
'22.7,vegamag)+(spec(el1215a.fits)+spec(el1302a.fits)+'
'spec(el1356a.fits)+spec(el2471a.fits))'),
[('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'earthshine.fits'),
('RPAREN', None),
('*', None),
('FLOAT', '0.5'),
('+', None),
('IDENTIFIER', 'rn'),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'Zodi.fits'),
('RPAREN', None),
(',', None),
('IDENTIFIER', 'band'),
('LPAREN', None),
('IDENTIFIER', 'johnson'),
(',', None),
('IDENTIFIER', 'v'),
('RPAREN', None),
(',', None),
('FLOAT', '22.7'),
(',', None),
('IDENTIFIER', 'vegamag'),
('RPAREN', None),
('+', None),
('LPAREN', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1215a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1302a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el1356a.fits'),
('RPAREN', None),
('+', None),
('IDENTIFIER', 'spec'),
('LPAREN', None),
('IDENTIFIER', 'el2471a.fits'),
('RPAREN', None),
('RPAREN', None)])])
def test_stream(text, result):
stream_t(text, result)
@pytest.mark.xfail(reason='does not work')
@pytest.mark.parametrize(
('text', 'result'),
[('rn(unit(1.,flam),band(stis,ccd,g430m,c4451,52X0.2),10.000000,abmag)',
[('IDENTIFIER', 'rn'),
('LPAREN', None),
('IDENTIFIER', 'unit'),
('LPAREN', None),
('FLOAT', '1.'),
(',', None),
('IDENTIFIER', 'flam'),
('RPAREN', None),
(',', None),
('IDENTIFIER', 'band'),
('LPAREN', None),
('IDENTIFIER', 'stis'),
(',', None),
('IDENTIFIER', 'ccd'),
(',', None),
('IDENTIFIER', 'g430m'),
(',', None),
('IDENTIFIER', 'c4451'),
(',', None),
('IDENTIFIER', '52X0.2'),
('RPAREN', None),
(',', None),
('FLOAT', '10.000000'),
(',', None),
('IDENTIFIER', 'abmag'),
('RPAREN', None)]),
('rn(unit(1.,flam),band(stis,ccd,mirror,50CCD),10.000000,abmag)',
[('IDENTIFIER', 'rn'),
('LPAREN', None),
('IDENTIFIER', 'unit'),
('LPAREN', None),
('FLOAT', '1.'),
(',', None),
('IDENTIFIER', 'flam'),
('RPAREN', None),
(',', None),
('IDENTIFIER', 'band'),
('LPAREN', None),
('IDENTIFIER', 'stis'),
(',', None),
('IDENTIFIER', 'ccd'),
(',', None),
('IDENTIFIER', 'mirror'),
(',', None),
('IDENTIFIER', '50CCD'),
('RPAREN', None),
(',', None),
('FLOAT', '10.000000'),
(',', None),
('IDENTIFIER', 'abmag'),
('RPAREN', None)])])
def test_stream_xfail(text, result):
stream_t(text, result)
@pytest.mark.xfail(reason='does not work')
def test_tokens():
for x in tokens:
typ, val = x
tkl = scanner.tokenize(val)
assert len(tkl) == 1, 'too many tokens\n' + print_token_list(tkl)
assert tkl[0].type == typ, \
('wrong type: found {} want {}\n'.format(tkl[0].type, typ) +
print_token_list(tkl))
assert tkl[0].attr == val or tkl[0].attr is None or \
(val.startswith('@') and tkl[0].attr == val[1:]), \
('token value incorrect: found {} want {}'.format(
tkl[0].attr, val) + print_token_list(tkl))
| 27.417391 | 76 | 0.461571 | 930 | 9,459 | 4.663441 | 0.177419 | 0.213051 | 0.147567 | 0.105142 | 0.675121 | 0.649527 | 0.632695 | 0.588425 | 0.588425 | 0.588425 | 0 | 0.044946 | 0.291997 | 9,459 | 344 | 77 | 27.497093 | 0.602658 | 0.053071 | 0 | 0.699324 | 0 | 0.013514 | 0.338273 | 0.07859 | 0 | 0 | 0 | 0 | 0.016892 | 1 | 0.02027 | false | 0 | 0.010135 | 0 | 0.037162 | 0.02027 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f3fed70efe76821b13f574f22185b0c8fc08daa7 | 41,061 | py | Python | python/ccxt/tprexchange.py | ThePeoplesReserve/ccxt | 17c4678a07abac4cb972d968be2b79402921bc6e | [
"MIT"
] | null | null | null | python/ccxt/tprexchange.py | ThePeoplesReserve/ccxt | 17c4678a07abac4cb972d968be2b79402921bc6e | [
"MIT"
] | null | null | null | python/ccxt/tprexchange.py | ThePeoplesReserve/ccxt | 17c4678a07abac4cb972d968be2b79402921bc6e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# PLEASE DO NOT EDIT THIS FILE, IT IS GENERATED AND WILL BE OVERWRITTEN:
# https://github.com/ccxt/ccxt/blob/master/CONTRIBUTING.md#how-to-contribute-code
from ccxt.base.exchange import Exchange
from ccxt.base.errors import ExchangeError
from ccxt.base.errors import BadRequest
from ccxt.base.errors import BadSymbol
from ccxt.base.errors import InvalidOrder
from ccxt.base.decimal_to_precision import SIGNIFICANT_DIGITS
import json
import sys
from datetime import datetime
class tprexchange(Exchange):
def describe(self):
return self.deep_extend(super(tprexchange, self).describe(), {
'id': 'tprexchange',
'name': 'TPR Exchange',
# 'countries': ['US'],
# 'rateLimit': 500,
'version': 'v1',
'certified': False,
'has': {
'loadMarkets': True,
'cancelAllOrders': False,
'cancelOrder': True,
'cancelOrders': False,
'CORS': False,
'createDepositAddress': False,
'createLimitOrder': False,
'createMarketOrder': False,
'createOrder': True,
'deposit': False,
'editOrder': 'emulated',
'fetchBalance': True,
'fetchBidsAsks': False,
'fetchClosedOrders': True,
'fetchCurrencies': True,
'fetchDepositAddress': False,
'fetchDeposits': False,
'fetchFundingFees': False,
'fetchL2OrderBook': False,
'fetchLedger': False,
'fetchMarkets': True,
'fetchMyTrades': True,
'fetchOHLCV': True,
'fetchOpenOrders': True,
'fetchOrder': True,
'fetchOrderBook': True,
'fetchOrderBooks': False,
'fetchOrders': True,
'fetchOrderTrades': False,
'fetchStatus': True,
'fetchTicker': True,
'fetchTickers': True,
'fetchTime': False,
'fetchTrades': True,
'fetchTradingFee': False,
'fetchTradingFees': False,
'fetchTradingLimits': False,
'fetchTransactions': False,
'fetchWithdrawals': False,
'privateAPI': True,
'publicAPI': False,
'signIn': True,
'withdraw': False,
'getMarketPrice': True,
},
'timeframes': {
'1m': '1',
'1h': '60',
'1d': '1440',
'1w': '10080',
'1mn': '43200',
},
'urls': {
'logo': '',
'api': '{hostname}',
'www': '',
'doc': '',
'fees': '',
'referral': '',
},
'api': {
'private': {
'get': [
],
'post': [
'ucenter/api-login',
'ucenter/member/balance',
'market/symbol-thumb',
'market/coins-info',
'market/symbol-info',
'exchange/order/add',
'exchange/order/find',
'exchange/order/all',
'exchange/order/apicancel',
'exchange/order/trades',
'exchange/order/my-trades',
'exchange/exchange-coin/base-symbol',
],
'delete': [
],
},
'feed': {
'get': [
],
},
},
'fees': {
'trading': {
},
},
'requiredCredentials': {
'apiKey': True,
'secret': True,
'uid': False,
},
'precisionMode': SIGNIFICANT_DIGITS,
'options': {
'createMarketBuyOrderRequiresPrice': False,
},
'exceptions': {
'exact': {
'Invalid cost': InvalidOrder, # {"message":"Invalid cost","_links":{"self":{"href":"/orders","templated":false}}}
'Invalid order ID': InvalidOrder, # {"message":"Invalid order ID","_links":{"self":{"href":"/orders/4a151805-d594-4a96-9d64-e3984f2441f7","templated":false}}}
'Invalid market !': BadSymbol, # {"message":"Invalid market !","_links":{"self":{"href":"/markets/300/order-book","templated":false}}}
},
'broad': {
'Failed to convert argument': BadRequest,
},
},
})
def parse_ticker(self, response):
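        # Build a ccxt-style ticker from a list of parsed orders: high/low over all
        # orders, best bid/ask taken from open orders, VWAP from completed orders,
        # and base/quote volumes accumulated over the last 24 hours.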
if len(response) == 0:
return []
symbol = response[0].get('symbol')
high = 0
bidVolume = 0
askVolume = 0
vwap = 0
vwapCost = 0
vwapVolume = 0
open_ = 'None'
close = 0
last = close
previousClose = 'None'
change = 'None'
percentage = 'None'
average = 'None'
baseVolume = 0
quoteVolume = 0
time = 0
lastDayTime = int((datetime.now().timestamp() - 86400) * 1000)
currentTimestamp = int(datetime.now().timestamp() * 1000)
currentDatetime = str(datetime.fromtimestamp(currentTimestamp * 0.001))
low = response[0].get('price')
bid = 0
ask = sys.maxsize
openSellOrdersCount = 0
for order in response:
price = order.get('price')
amount = order.get('amount')
timestamp = order.get('timestamp')
if high < price:
high = price
if low > price:
low = price
if order.get('status') == 'open':
if order.get('side') == 'buy':
if bid < price:
bid = price
if bidVolume < amount:
bidVolume = amount
if order.get('status') == 'open':
if order.get('side') == 'sell':
openSellOrdersCount += 1
if ask > price:
ask = price
if askVolume < amount:
askVolume = amount
if order.get('info').get('status') == 'COMPLETED':
vwapCost += price * amount
vwapVolume += amount
if time < timestamp:
time = timestamp
close = price
if timestamp > lastDayTime:
quoteVolume += amount
baseVolume += price
if vwapVolume != 0:
vwap = vwapCost / vwapVolume
if openSellOrdersCount == 0:
ask = 0
last = close
result = {
'symbol': symbol,
'info': response,
'timestamp': currentTimestamp,
'datetime': currentDatetime,
'high': high,
'low': low,
'bid': bid,
'bidVolume': bidVolume,
'ask': ask,
'askVolume': askVolume,
'vwap': vwap,
'open': open_,
'close': close,
'last': last,
'previousClose': previousClose,
'change': change,
'percentage': percentage,
'average': average,
'baseVolume': baseVolume,
'quoteVolume': quoteVolume,
}
return result
def fetch_ticker(self, symbol, since=None, limit=None):
response = self.fetch_orders(symbol, since, limit)
# Response example:
# {
# 'symbol': 'BTC/USDT',
# 'info': [...],
# 'timestamp': 1615386851976,
# 'datetime': '2021-03-10 16:34:11.976000',
# 'high': 50.0,
# 'low': 1.0,
# 'bid': 30.0,
# 'bidVolume': 15.0,
# 'ask': 40.0,
# 'askVolume': 25.0,
# 'vwap': 11.0,
# 'open': 'None',
# 'close': 20.0,
# 'last': 20.0,
# 'previousClose': 'None',
# 'change': 'None',
# 'percentage': 'None',
# 'average': 'None',
# 'baseVolume': 60.0,
# 'quoteVolume': 30.0
# }
return self.parse_ticker(response)
def fetch_tickers(self, since=None, limit=None):
# Response example:
# [
# {
# 'symbol': 'BTC/USDT',
# 'info': [...],
# 'timestamp': 1615386851976,
# 'datetime': '2021-03-10 16:34:11.976000',
# 'high': 50.0,
# 'low': 1.0,
# 'bid': 30.0,
# 'bidVolume': 15.0,
# 'ask': 40.0,
# 'askVolume': 25.0,
# 'vwap': 11.0,
# 'open': 'None',
# 'close': 20.0,
# 'last': 20.0,
# 'previousClose': 'None',
# 'change': 'None',
# 'percentage': 'None',
# 'average': 'None',
# 'baseVolume': 60.0,
# 'quoteVolume': 30.0
# },
# ...
# ]
result = []
symbols = self.fetch_markets()
for symblol in symbols:
response = self.fetch_orders(symblol.get('symbol'), since, limit)
ticker = self.parse_ticker(response)
if len(ticker) != 0:
result.append(ticker)
return result
def fetch_order_book(self, symbol, limit, since=0):
# Response example:
# {
# 'bids':
# [
# [20.0, 10.0, 'E161538482263642'], // [price, amount, orderId]
# [30.0, 15.0, 'E161538482271646']
# ],
# 'asks':
# [
# [40.0, 20.0, 'E161538482278825'],
# [50.0, 25.0, 'E161538482286085']
# ],
# 'timestamp': 1615390711695,
# 'datetime': '2021-03-10 17:38:31.695000',
# 'nonce': 1615390711695
# }
orders = self.fetch_open_orders(symbol, since, limit)
bids = []
asks = []
for order in orders:
temp = []
temp.append(order.get('price'))
temp.append(order.get('amount'))
temp.append(order.get('id'))
if order.get('side') == 'buy':
bids.append(temp)
else:
asks.append(temp)
currentTimestamp = int(datetime.now().timestamp() * 1000)
currentDatetime = str(datetime.fromtimestamp(currentTimestamp * 0.001))
result = {
'bids': bids,
'asks': asks,
'timestamp': currentTimestamp,
'datetime': currentDatetime,
'nonce': currentTimestamp,
}
return result
def parse_markets(self, response):
listData = []
for value in response:
tmp = {
"id": value.get("coinSymbol"),
"symbol": value.get("symbol"),
"base": value.get("coinSymbol"),
"quote": value.get("baseSymbol"),
"baseId": value.get("coinSymbol"),
"quoteId": value.get("baseSymbol"),
"type": value.get("publishType"),
"active": value.get("enable"),
"precision": {
"amount": value.get("coinScale"),
"price": value.get("baseCoinScale"),
},
"limits": {
"amount": {"min": value.get("minVolume"), "max": value.get("maxVolume")},
"price": {"min": value.get("minSellPrice"), "max": value.get("maxBuyPrice")},
"cost": {"min": value.get("minVolume") * value.get("minSellPrice"), "max": value.get("maxVolume") * value.get("maxBuyPrice")},
},
"taker": value.get("fee"),
"maker": value.get("fee"),
"info": value,
}
listData.append(tmp)
return listData
def add_frame(self, timeFrameStart, timeFrameEnd, timeframe, highestPrice, lowestPrice, amount, result, openPrice, closePrice):
frame = []
frame.append(timeFrameStart)
frame.append(openPrice)
frame.append(highestPrice)
frame.append(lowestPrice)
frame.append(closePrice)
frame.append(amount)
result.append(frame)
def parse_ohlcv(self, response, since, timeframe):
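        # Bucket the fetched orders into fixed-width time frames and emit one
        # [frame_start, open, high, low, close, volume] entry per frame.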
highestPrice = 0
lowestPrice = sys.maxsize
price = 0
amount = 0
timeFrameStart = since
timeFrameEnd = int((since * 0.001 + timeframe) * 1000)
result = []
i = 0
orders = response.get('content')
isOpenPrice = True
openPrice = 0
closePrice = 0
while i < len(orders):
if isOpenPrice == True:
openPrice = orders[i].get('price')
isOpenPrice = False
time = orders[i].get('time')
if time >= timeFrameStart and time <= timeFrameEnd:
price = orders[i].get('price')
closePrice = price
if highestPrice < price:
highestPrice = price
if lowestPrice > price:
lowestPrice = price
amount += orders[i].get('amount')
i += 1
if i == len(orders):
self.add_frame(timeFrameStart, timeFrameEnd, timeframe, highestPrice, lowestPrice, amount, result, openPrice, closePrice)
else:
if lowestPrice == sys.maxsize:
lowestPrice = 0
openPrice = 0
closePrice = 0
i -= 1
self.add_frame(timeFrameStart, timeFrameEnd, timeframe, highestPrice, lowestPrice, amount, result, openPrice, closePrice)
timeFrameStart = timeFrameEnd + 1
timeFrameEnd = int((timeFrameEnd * 0.001 + timeframe) * 1000)
amount = 0
highestPrice = 0
lowestPrice = sys.maxsize
isOpenPrice = True
i += 1
return result
# timeframe variants:
# 1m (one minute);
# 1h (one hour);
# 1d (one day - 24 hours)
# 1w (one week - 7 days)
# 1mn (one mounth - 30 days)
def fetch_ohlcv(self, symbol, timeframe=None, since=0, limit=None, params={}):
# Response example:
# [
# [
# 1504541580000, // UTC timestamp in milliseconds, integer
# 4235.4, // (O)pen price, float
# 4240.6, // (H)ighest price, float
# 4230.0, // (L)owest price, float
# 4230.7, // (C)losing price, float
# 37.72941911 // (V)olume (in terms of the base currency), float
# ],
# ...
# ]
inputDataCheck = False
for frame in self.timeframes:
if frame == timeframe:
inputDataCheck = True
break
if inputDataCheck == False:
return {'error': 'Incorrect timeframe'}
tFrame = int(self.timeframes.get(timeframe)) * 60
default_order_amount_limit = 100
params['status'] = 'COMPLETED'
if 'page' in params:
params['pageNo'] = self.safe_string(params, 'page')
else:
params['pageNo'] = 0
if since is None:
since = 0
if limit is None:
limit = default_order_amount_limit
request = {
'symbol': symbol,
'since': since,
'pageSize': limit,
}
fullRequest = self.extend(request, params)
response = self.privatePostExchangeOrderAll(fullRequest)
return self.parse_ohlcv(response, since, tFrame)
def fetch_markets(self, symbol=''):
request = {
'symbol': symbol,
}
response = self.privatePostMarketSymbolInfo(request)
return self.parse_markets(response)
# RETURN EXAMPLE:
# [
# {
# 'id': 'BTC',
# 'symbol': 'BTC/USDT',
# 'base': 'BTC',
# 'quote': 'USDT',
# 'baseId': 'BTC',
# 'quoteId': 'USDT',
# 'type': 'NONE',
# 'active': 1,
# 'precision': { 'amount': 2, 'price': 2 },
# 'limits':
# {
# 'amount': { 'min': 0.0, 'max': 0.0 },
# 'price': { 'min': 0.0, 'max': 0.0 },
# 'cost': { 'min': 0.0, 'max': 0.0 }
# },
# 'taker': 0.001,
# 'maker': 0.001,
# 'info': {backend response}
# },
# ...
# ]
def load_markets(self, reload=False, symbol=''):
if not reload:
if self.markets:
if not self.markets_by_id:
return self.set_markets(self.markets)
return self.markets
currencies = None
if self.has['fetchCurrencies']:
currencies = self.fetch_currencies()
markets = self.fetch_markets(symbol)
return self.set_markets(markets, currencies)
def sign(self, path, api='public', method='GET', params={}, headers=None, body=None):
        # Check existence of the authentication token
        # Just use an empty one in case the application is not signed in yet
authToken = ''
if 'token' in self.options:
authToken = self.options['token']
# Get URL
url = self.implode_params(self.urls['api'], {'hostname': self.hostname}) + '/' + path
# Calculate body and content type depending on method type: GET or POST
keys = list(params.keys())
keysLength = len(keys)
        # In case the body is still not assigned, just make it an empty string
if body is None:
body = ''
# Prepare line for hashing
# This hash sum is checked on backend side to verify API user
# POST params should not be added as body
query = method + ' /' + path + ' ' + self.urlencode(params) + ' ' + authToken + '\n' + body
signed = self.hmac(self.encode(query), self.encode(self.secret))
contentType = None
if method == 'POST':
contentType = 'application/x-www-form-urlencoded'
if keysLength > 0:
body = self.urlencode(params)
else:
if keysLength > 0:
url += '?' + self.urlencode(params)
headers = {
'x-auth-sign': signed,
'x-auth-token': authToken,
}
if authToken != '':
headers['access-auth-token'] = authToken
if contentType is not None:
headers['Content-Type'] = contentType
return {'url': url, 'method': method, 'body': body, 'headers': headers}
def sign_in(self, params={}):
params = {
'key': self.key,
'token': self.token,
}
response = self.privatePostUcenterApiLogin(params)
loginData = response['data']
self.options['token'] = self.safe_string(loginData, 'token')
memberId = self.safe_string(loginData, 'id')
return memberId
def fetch_status(self):
        # Response examples:
# {'status': 'ok'}
# or
# {'status': 'shutdown', 'reason': 'ExchangeNotAvailable'}
# or
# {'status': 'shutdown', 'reason': 'Unknown reason'}
result = False
try:
response = self.privatePostExchangeExchangeCoinBaseSymbol()
for field in response.items():
if field[0] == 'message':
if field[1] == 'SUCCESS':
result = True
if result is True:
return {"status": "ok"}
else:
return {"status": "shutdown", "reason": "ExchangeNotAvailable"}
except:
reason = str(sys.exc_info()[0])
if reason.find('ExchangeNotAvailable') != -1:
return {"status": "shutdown", "reason": "ExchangeNotAvailable"}
else:
return {"status": "shutdown", "reason": "Unknown reason"}
def parse_currencies(self, response):
listData = []
for value in response:
tmp = {
'id': value.get('name'),
'code': value.get('name').upper(),
'name': value.get('name'),
'active': bool(value.get('status')),
'fee': 0.005,
'precision': 0,
'limits':
{
'amount':
{
'min': 'None',
'max': 'None',
},
'price':
{
'min': 'None',
'max': 'None',
},
'cost':
{
'min': 'None',
'max': 'None',
},
'withdraw':
{
'min': value.get('minWithdrawAmount'),
'max': value.get('maxWithdrawAmount'),
},
},
'info': value
}
listData.append(tmp)
return listData
def fetch_currencies(self):
        # Response example
#[
# {
# 'id': 'BTC',
# 'code': 'BTC',
# 'name': 'BTC',
# 'active': True,
# 'fee': 0.001,
# 'precision': 0,
# 'limits': // TPR exchange has no restrictions
# {
# 'amount':
# {
# 'min': 'None',
# 'max': 'None'
# },
# 'price':
# {
# 'min': 'None',
# 'max': 'None'
# },
# 'cost':
# {
# 'min': 'None',
# 'max': 'None'
# },
# 'withdraw':
# {
# 'min': 1.0,
# 'max': 5000.0
# }
# },
# 'info': { },
# },
# ...
#]
try:
response = self.privatePostMarketCoinsInfo()
return self.parse_currencies(response)
except:
reason = str(sys.exc_info()[0])
if reason.find('ExchangeNotAvailable') != -1:
return {"Error": "ExchangeNotAvailable"}
else:
return {"Error": "Unknown reason"}
def fetch_order(self, id, symbol=None, params={}):
request = {
'orderId': id,
}
response = self.privatePostExchangeOrderFind(request)
return self.parse_order(response)
def parse_order(self, order, market=None):
# {
# 'orderId':'E161183624377614',
# 'memberId':2,
# 'type':'LIMIT_PRICE',
# 'amount':1000.0,
# 'symbol':'BCH/USDT',
# 'tradedAmount':1000.0,
# 'turnover':1080.0,
# 'coinSymbol':'BCH',
# 'baseSymbol':'USDT',
# 'status':'COMPLETED',
# 'latestTradeTimestamp':1611836256242,
# 'direction':'SELL',
# 'price':1.0,
# 'time':1611836243776,
# 'completedTime':1611836256242,
# },
if not order:
return None
type = 'market'
if order['type'] == 'LIMIT_PRICE':
type = 'limit'
side = order['direction'].lower()
remaining = order['amount'] - order['tradedAmount']
status = order['status']
if status == 'COMPLETED':
status = 'closed'
elif status == 'TRADING' or status == 'PAUSED' or status == 'RESERVED':
status = 'open'
else:
status = 'canceled'
cost = order['tradedAmount'] * order['price']
result = {
'info': order,
'id': order['orderId'],
'clientOrderId': order['memberId'],
'timestamp': order['time'],
'datetime': self.iso8601(order['time']),
'latestTradeTimestamp': order['latestTradeTimestamp'],
'symbol': order['symbol'],
'type': type,
'timeInForce': None,
'postOnly': None,
'side': side,
'price': order['price'],
'stopPrice': None,
'cost': cost,
'average': None,
'amount': order['amount'],
'filled': order['tradedAmount'],
'remaining': remaining,
'status': status,
'fee': None,
'trades': None,
}
return result
def create_order(self, symbol, type, side, amount, price=None, params={}):
params['symbol'] = symbol
params['price'] = price
params['amount'] = amount
if side == 'buy':
params['direction'] = 'BUY'
else:
params['direction'] = 'SELL'
if type == 'market':
params['type'] = 'MARKET_PRICE'
else:
params['type'] = 'LIMIT_PRICE'
params['useDiscount'] = '0'
response = self.privatePostExchangeOrderAdd(params)
orderId = self.safe_string(response, 'data')
return self.fetch_order(orderId)
def cancel_order(self, id, symbol=None, params={}):
request = {
'orderId': id,
}
response = self.privatePostExchangeOrderApicancel(self.extend(request, params))
return self.parse_order(response['data'])
def fetch_orders(self, symbol=None, since=None, limit=None, params={}):
# Request structure
# {
# 'symbol': Parameter from method arguments
# 'since': Timestamp of first order in list in Unix epoch format
# 'limit': Response list size
# 'memberId': May be set in params. May be not set
# 'status': one of TRADING COMPLETED CANCELED OVERTIMED. May be set in params
        #   'page': for pagination. In this case limit is the size of every page. May be set in params
# }
default_order_amount_limit = 10000
if 'page' in params:
params['pageNo'] = self.safe_string(params, 'page')
else:
params['pageNo'] = 0
if symbol is None:
symbol = ''
if since is None:
since = 0
if limit is None:
limit = default_order_amount_limit
request = {
'symbol': symbol,
'since': since,
'pageSize': limit,
}
fullRequest = self.extend(request, params)
response = self.privatePostExchangeOrderAll(fullRequest)
# {
# 'content': [
# {
# 'orderId':'E161183624377614',
# 'memberId':2,
# 'type':'LIMIT_PRICE',
# 'amount':1000.0,
# 'symbol':'BCH/USDT',
# 'tradedAmount':1000.0,
# 'turnover':1080.0,
# 'coinSymbol':'BCH',
# 'baseSymbol':'USDT',
# 'status':'COMPLETED',
# 'direction':'SELL',
# 'price':1.0,
# 'time':1611836243776,
# 'completedTime':1611836256242,
# },
# ...
# ],
# 'totalElements':41,
# 'totalPages':3,
# 'last':False,
# 'size':20,
# 'number':1,
# 'first':False,
# 'numberOfElements':20,
# 'sort': [
# {
# 'direction':'DESC',
# 'property':'time',
# 'ignoreCase':False,
# 'nullHandling':'NATIVE',
# 'ascending':False,
# 'descending':True,
# }
# ]
# }
return self.parse_orders(response['content'])
def fetch_open_orders(self, symbol=None, since=None, limit=None, params={}):
# Request structure
# {
# 'symbol': Parameter from method arguments
# 'since': Timestamp of first order in list in Unix epoch format
# 'limit': Response list size
# 'memberId': May be set in params. May be not set
# 'status': one of TRADING COMPLETED CANCELED OVERTIMED. May be set in params
        #   'page': for pagination. In this case limit is the size of every page. May be set in params
# }
default_order_amount_limit = 20
if 'page' in params:
params['pageNo'] = self.safe_string(params, 'page')
else:
params['pageNo'] = 0
if symbol is None:
symbol = ''
if since is None:
since = 0
if limit is None:
limit = default_order_amount_limit
request = {
'symbol': symbol,
'since': since,
'pageSize': limit,
}
fullRequest = self.extend(request, params)
response = self.privatePostExchangeOrderAll(fullRequest)
# {
# 'content': [
# {
# 'orderId':'E161183624377614',
# 'memberId':2,
# 'type':'LIMIT_PRICE',
# 'amount':1000.0,
# 'symbol':'BCH/USDT',
# 'tradedAmount':1000.0,
# 'turnover':1080.0,
# 'coinSymbol':'BCH',
# 'baseSymbol':'USDT',
# 'status':'COMPLETED',
# 'direction':'SELL',
# 'price':1.0,
# 'time':1611836243776,
# 'completedTime':1611836256242,
# },
# ...
# ],
# 'totalElements':41,
# 'totalPages':3,
# 'last':False,
# 'size':20,
# 'number':1,
# 'first':False,
# 'numberOfElements':20,
# 'sort': [
# {
# 'direction':'DESC',
# 'property':'time',
# 'ignoreCase':False,
# 'nullHandling':'NATIVE',
# 'ascending':False,
# 'descending':True,
# }
# ]
# }
return self.parse_orders(response['content'])
def fetch_closed_orders(self, symbol=None, since=None, limit=None, params={}):
# Request structure
# {
# 'symbol': Parameter from method arguments
# 'since': Timestamp of first order in list in Unix epoch format
# 'limit': Response list size
# 'memberId': May be set in params. May be not set
# 'status': one of TRADING COMPLETED CANCELED OVERTIMED. May be set in params
        #   'page': for pagination. In this case limit is the size of every page. May be set in params
# }
default_order_amount_limit = 20
params['status'] = 'CANCELED'
if 'page' in params:
params['pageNo'] = self.safe_string(params, 'page')
else:
params['pageNo'] = 0
if symbol is None:
symbol = ''
if since is None:
since = 0
if limit is None:
limit = default_order_amount_limit
request = {
'symbol': symbol,
'since': since,
'pageSize': limit,
}
fullRequest = self.extend(request, params)
response = self.privatePostExchangeOrderAll(fullRequest)
# {
# 'content': [
# {
# 'orderId':'E161183624377614',
# 'memberId':2,
# 'type':'LIMIT_PRICE',
# 'amount':1000.0,
# 'symbol':'BCH/USDT',
# 'tradedAmount':1000.0,
# 'turnover':1080.0,
# 'coinSymbol':'BCH',
# 'baseSymbol':'USDT',
# 'status':'COMPLETED',
# 'direction':'SELL',
# 'price':1.0,
# 'time':1611836243776,
# 'completedTime':1611836256242,
# },
# ...
# ],
# 'totalElements':41,
# 'totalPages':3,
# 'last':False,
# 'size':20,
# 'number':1,
# 'first':False,
# 'numberOfElements':20,
# 'sort': [
# {
# 'direction':'DESC',
# 'property':'time',
# 'ignoreCase':False,
# 'nullHandling':'NATIVE',
# 'ascending':False,
# 'descending':True,
# }
# ]
# }
return self.parse_orders(response['content'])
    # If called without params, the function returns the balance of the current user
def fetch_balance(self, uid='-1', params={}):
params = {
'uid': uid
}
try:
response = self.privatePostUcenterMemberBalance(params)
except Exception as e:
return e
return self.parse_balance(response)
def parse_balance(self, response):
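        # Flatten the backend balance response into the ccxt balance structure:
        # top-level 'free'/'used'/'total' dicts plus one entry per coin symbol.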
data = json.loads(json.dumps(response))
if data['message'] == 'SUCCESS':
result = { "free":{}, "used":{}, "total":{}}
for row in data['data']['balances']:
result['free'].update({row['coinName']:row['free']})
result['used'].update({row['coinName']:row['used']})
result['total'].update({row['coinName']:row['total']})
result.update({row['coinName']:{'free':row['free'], 'used':row['used'], 'total':row['total']}})
return result
# Returns int or None
def get_market_price(self, symbol):
response = self.privatePostMarketSymbolThumb()
for i in response:
if i.get('symbol') == symbol:
return i.get('close')
def fetch_trades(self, orderId, since, pageNo=None, pageSize=None):
        # Response example:
# [
# {
# 'info': { backend response },
# 'id': 'E161460499516968',
# 'timestamp': 1614605187661,
# 'datetime': '2021-03-01 15:26:27.661000',
# 'symbol': 'BTC/USDT',
# 'order': 'E161460499516968',
# 'type': 'LIMIT_PRICE',
# 'side': 'SELL',
        #         'takerOrMaker': 'None', (this information is not available on the TPR exchange)
# 'price': 1.0,
# 'amount': 1.0,
# 'cost': 1.0,
# 'fee':
# {
# 'cost': 0.005,
# 'currency': 'BTC',
        #             'rate': 'None' (this information is not available on the TPR exchange)
# }
# }
# ]
if pageNo is None:
pageNo = 0
if pageSize is None:
pageSize = 100
request = { 'orderId': orderId,
'since': since,
'pageNo': pageNo,
'pageSize': pageSize }
return self.parse_trade(self.privatePostExchangeOrderTrades(request))
def parse_trade(self, response):
trades = []
content = response.get('content')
for exchangeTrade in content:
timestamp = exchangeTrade.get('time')
datetime_ = str(datetime.fromtimestamp(int(timestamp) * 0.001))
price = exchangeTrade.get('price')
amount = exchangeTrade.get('amount')
cost = price * amount
tmp = {
'info': exchangeTrade,
'id': exchangeTrade.get('orderId'),
'timestamp': timestamp,
'datetime': datetime_,
'symbol': exchangeTrade.get('symbol'),
'order': exchangeTrade.get('orderId'),
'type': exchangeTrade.get('type'),
'side': exchangeTrade.get('direction'),
'takerOrMaker': 'None',
'price': price,
'amount': amount,
'cost': cost,
'fee':
{
'cost': exchangeTrade.get('fee'),
'currency': exchangeTrade.get('coinSymbol'),
'rate': 'None',
}
}
trades.append(tmp)
return trades
def parse_my_trades(self, response):
listData = []
for value in response:
ExchangeOrder = response.get(value)
id_ = ExchangeOrder.get('orderId')
timestamp = ExchangeOrder.get('time')
datetime_ = str(datetime.fromtimestamp(int(timestamp) * 0.001))
price = ExchangeOrder.get('price')
amount = ExchangeOrder.get('amount')
cost = price * amount
tmp = {
'info': response.get(value),
'id': id_,
'timestamp': timestamp,
'datetime': datetime_,
'symbol': ExchangeOrder.get('symbol'),
'order': id_,
'type': ExchangeOrder.get('type'),
'side': ExchangeOrder.get('direction'),
'takerOrMaker': 'None',
'price': price,
'amount': amount,
'cost': cost,
'fee':
{
'cost': ExchangeOrder.get('fee'),
'currency': ExchangeOrder.get('coinSymbol'),
'rate': 'None',
}
}
listData.append(tmp)
return listData
def fetch_my_trades(self, pageNo=None, pageSize=None):
        # Response example:
# [
# {
# 'info': { backend response },
# 'id': 'E161460499516968',
# 'timestamp': 1614605187661,
# 'datetime': '2021-03-01 15:26:27.661000',
# 'symbol': 'BTC/USDT',
# 'order': 'E161460499516968',
# 'type': 'LIMIT_PRICE',
# 'side': 'SELL',
        #         'takerOrMaker': 'None', (this information is not available on the TPR exchange)
# 'price': 1.0,
# 'amount': 1.0,
# 'cost': 1.0,
# 'fee':
# {
# 'cost': 0.001,
# 'currency': 'BTC',
        #             'rate': 'None' (this information is not available on the TPR exchange)
# }
# },
# { ... },
# ]
if pageNo is None:
pageNo = 0
if pageSize is None:
pageSize = 100
request = { 'orderId': '',
'pageNo': pageNo,
'pageSize': pageSize }
return self.parse_my_trades(self.privatePostExchangeOrderMyTrades(request))
def handle_errors(self, httpCode, reason, url, method, headers, body, response, requestHeaders, requestBody):
if response is None:
return # fallback to default error handler
if httpCode == 200:
if 'code' in response:
if response['code'] == 0:
return
else:
return
# {
# "message": "Error text in case when HTTP code is not 200",
# ...
# }
message = self.safe_string(response, 'message')
if message is not None:
feedback = self.id + ' ' + body
self.throw_exactly_matched_exception(self.exceptions['exact'], message, feedback)
self.throw_broadly_matched_exception(self.exceptions['broad'], message, feedback)
raise ExchangeError(feedback) # unknown message
| 35.18509 | 179 | 0.445021 | 3,304 | 41,061 | 5.491223 | 0.167978 | 0.011464 | 0.009921 | 0.004961 | 0.392052 | 0.36284 | 0.357769 | 0.345092 | 0.340352 | 0.324808 | 0 | 0.041027 | 0.427169 | 41,061 | 1,166 | 180 | 35.215266 | 0.730326 | 0.241665 | 0 | 0.283149 | 1 | 0 | 0.11623 | 0.00621 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042818 | false | 0 | 0.012431 | 0.001381 | 0.116022 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d0625dd2e18c12063fa26852ff1ebfcbd46b4df | 396 | py | Python | losses/rmse.py | borhanMorphy/facial-keypoints-detection | d2a0fe077d04ae6f2701af107b8322566f27a432 | [
"MIT"
] | null | null | null | losses/rmse.py | borhanMorphy/facial-keypoints-detection | d2a0fe077d04ae6f2701af107b8322566f27a432 | [
"MIT"
] | null | null | null | losses/rmse.py | borhanMorphy/facial-keypoints-detection | d2a0fe077d04ae6f2701af107b8322566f27a432 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
class RMSELoss(nn.Module):
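    # Root-mean-square error: the square root of the (reduced) MSE between input and target.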
def __init__(self, reduction:str='mean'):
super(RMSELoss, self).__init__()
self.reduction = reduction
def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
mse = F.mse_loss(input, target, reduction=self.reduction)
return torch.sqrt(mse) | 33 | 81 | 0.691919 | 54 | 396 | 4.907407 | 0.444444 | 0.124528 | 0.098113 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191919 | 396 | 12 | 82 | 33 | 0.828125 | 0 | 0 | 0 | 0 | 0 | 0.010076 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.3 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6d07329e9ea1eb9741dbc8294a06425bd37de9b6 | 295 | py | Python | 33. Python Programs/FactorialOfNumbers.py | Ujjawalgupta42/Hacktoberfest2021-DSA | eccd9352055085973e3d6a1feb10dd193905584b | [
"MIT"
] | 225 | 2021-10-01T03:09:01.000Z | 2022-03-11T11:32:49.000Z | 33. Python Programs/FactorialOfNumbers.py | Ujjawalgupta42/Hacktoberfest2021-DSA | eccd9352055085973e3d6a1feb10dd193905584b | [
"MIT"
] | 252 | 2021-10-01T03:45:20.000Z | 2021-12-07T18:32:46.000Z | 33. Python Programs/FactorialOfNumbers.py | Ujjawalgupta42/Hacktoberfest2021-DSA | eccd9352055085973e3d6a1feb10dd193905584b | [
"MIT"
] | 911 | 2021-10-01T02:55:19.000Z | 2022-02-06T09:08:37.000Z | for i in range(int(input())):
fact=1
a=int(input())
for j in range(1,a+1,1):
fact=fact*j
print(fact)
def factorial(n):
    return 1 if (n == 1 or n == 0) else n * factorial(n - 1)
num = int(input('Enter number'))
print("Factorial of",num,"is",
factorial(num))
| 19.666667 | 57 | 0.559322 | 52 | 295 | 3.173077 | 0.461538 | 0.145455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036364 | 0.254237 | 295 | 14 | 58 | 21.071429 | 0.713636 | 0 | 0 | 0 | 0 | 0 | 0.088136 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.181818 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d075ac2628154f04da3d85c667770e6d9e9db4a | 9,620 | py | Python | Stocks.py | ya332/stock-advisor | 108138415f8aa5ce2dee26ca43d5a55b744e26e2 | [
"MIT"
] | 19 | 2015-06-27T17:23:36.000Z | 2019-06-30T03:09:34.000Z | Stocks.py | ya332/stock-advisor | 108138415f8aa5ce2dee26ca43d5a55b744e26e2 | [
"MIT"
] | null | null | null | Stocks.py | ya332/stock-advisor | 108138415f8aa5ce2dee26ca43d5a55b744e26e2 | [
"MIT"
] | 1 | 2018-10-24T04:45:05.000Z | 2018-10-24T04:45:05.000Z | __author__ = 'Robbie Barrat'
# I need the datetime module for telling the current date when I compare today's stock values to 5 days ago and such.
import datetime
# Yahoo Finance V. 1.1.4
from yahoo_finance import Share
# Arrow -- replaced the old date module
import arrow
# Checks to see if it is currently the weekend, and displays a message if so (since the stock market only operates on weekdays)
def weekend():
if datetime.datetime.today().weekday() == 6:
print "Hey, according to your computer it is Sunday, so the stock market is closed today."
print "The data you see will be from the last time the stock market was opened."
elif datetime.datetime.today().weekday() == 5:
print "Hey, according to your computer it is Saturday, so the stock market is closed today."
print "The data you see will be from the last time the stock market was opened."
# Just a nice greeting message#
def greeting():
print "Enter the abbreviation of the company you'd like to look into, for example, 'Google' would be 'GOOGL'"
takeinput()
# Here is the function that actually takes input #
def takeinput():
company = raw_input("Company Abbreviation: ")
if company == "":
print "You need to enter a company name, buddy."
takeinput()
elif company.isalpha() == False:
print "Try to keep it all letters, no numbers, spaces or symbols allowed."
takeinput()
elif not Share(str(company)).get_price():
print "That isn't a valid company, pal."
takeinput()
else:
print "Thank you, please wait a minute while the data for your company is being retrieved..."
# the reason there are two variables for this (companyname and company) is because there were some issues with the program
# using a bad input, so companyname is defined only if the input is correct and accepted.
companyname = company
recentdata(companyname)
# This function just gives some quick data about how the stock has been doing today. #
def recentdata(companyname):
try:
companydata = Share(companyname)
print "This morning, it opened for $" + companydata.get_open() + ", and right now is at $" + companydata.get_price() + "."
if companydata.get_open() > companydata.get_price():
difference = float(companydata.get_open()) - float(companydata.get_price())
if len(str(difference)) < 3:
print "Since this morning, the price has fallen by $" + str(difference) + "0"
else:
print "Since this morning, the price has fallen by $" + str(difference)
elif companydata.get_open() < companydata.get_price():
difference = float(companydata.get_price()) - float(companydata.get_open())
if len(str(difference)) < 3:
print "Since this morning, the price has risen by $" + str(difference) + "0"
else:
print "Since this morning, the price has risen by $" + str(difference)
print ""
selection = raw_input(
"Would you like some info about what the stock has been like in the past few days? Yes/No: ")
if str.lower(selection) == "no":
end()
elif str.lower(selection) == "yes":
print "Okay, please wait a moment"
except (RuntimeError, TypeError, NameError):
print "Whoops, something went wrong there. Are you sure you entered a valid company abbreviation?"
finally:
longterm(companyname)
def longterm(companyname):
currentdate = arrow.utcnow()
onedayago = currentdate.replace(days=-1).format('YYYY-MM-DD')
twodaysago = currentdate.replace(days=-2).format('YYYY-MM-DD')
threedaysago = currentdate.replace(days=-3).format('YYYY-MM-DD')
fivedaysago = currentdate.replace(days=-5).format('YYYY-MM-DD')
oneweekago = currentdate.replace(days=-7).format('YYYY-MM-DD')
combine(onedayago, twodaysago, threedaysago, fivedaysago, oneweekago, companyname)
# All this function does is put the data into a format that the yahoo_finance module can easily read.
def combine(onedayago, twodaysago, threedaysago, fivedaysago, oneweekago, companyname):
print "Analyzing data from the past few days..."
dates = [onedayago, twodaysago, threedaysago, fivedaysago, oneweekago]
for i in dates:
i == i.format('YYYY-MM-DD')
# Just gets the info and puts it into programmer friendly names
def getclosing(date, company):
# Thanks to stackoverflow user 'TessellatingHeckler' for helping me out with this next function! At the time dictionaries were a foreign concept to me.
readings = company.get_historical(date, date)
for reading in readings:
close = reading['Close']
return close
company = Share(companyname)
closingonedayago = getclosing(str(dates[0]), company)
closingtwodaysago = getclosing(str(dates[1]), company)
closingthreedaysago = getclosing(str(dates[2]), company)
closingfivedaysago = getclosing(str(dates[3]), company)
closingoneweekago = getclosing(str(dates[4]), company)
twohundredavg = company.get_200day_moving_avg()
fiftyavg = company.get_50day_moving_avg()
today = company.get_price()
decision(today, closingonedayago, closingtwodaysago, closingthreedaysago, closingfivedaysago, closingoneweekago, twohundredavg, fiftyavg)
# All this does is get information, display it in a readable fashion, and then determine based on short term and long term
# data if the stock is worth investing in. It uses a system of points (titled positive and negative), which it adds up
# and uses to determine if the stock is overall 'good' or 'bad'. If you don't understand just look at the code.
def decision(today, oneday, twoday, threeday, fiveday, oneweek, twohundredavg, fiftyavg):
# The 'negative' and 'positive' values will act like a score, and in the end will be used to determine an 'overall' score of good or bad.
negative = 0
positive = 0
print "Today's price: " + str(today)
print "Yesterday's price: " + str(oneday)
print "Two days ago's price: " + str(twoday)
print "Three days ago's price: " + str(threeday)
print "Five days ago's price: " + str(fiveday)
print "One week ago's price: " + str(oneweek)
print "Fifty day moving average: " + str(fiftyavg)
print "Two hundred day moving average: " + str(twohundredavg)
print ""
# These are just for doing short term stuff. Longer term stuff (50 and 200 day moving averages) comes 20 lines later
prices = filter(None, [oneweek, fiveday, threeday])
recentprices = filter(None, [twoday, oneday, today])
pricenumber = 0
recentpricenumber = 0
for i in prices:
pricenumber += float(i)
for i in recentprices:
recentpricenumber += float(i)
# pricenumber is just the average of oneweek, fiveday, and threeday.
# recentpricenumber is the average of twoday, oneday, and today.
# These variables give a pretty good average for different time periods.
pricenumber = pricenumber / len(prices)
recentpricenumber = recentpricenumber / len(recentprices)
if recentpricenumber > pricenumber:
print "In the past week, the stock has been moving upwards in price."
positive += 1
else:
print "In the past week, the stock has been decreasing in price."
negative += 1
# Now these are the longer term things
if fiftyavg > twohundredavg:
print "The fifty day average is greater than the two hundred day average, which signifies a longer term upwards trend."
positive += 2
elif fiftyavg < twohundredavg:
print "The two hundred day average is greater than the fifty day average, which signifies a longer term downwards trend."
negative += 2
elif fiftyavg == twohundredavg:
print "You probably didn't enter a valid company. Restart the program and try again."
end()
if negative > positive:
if negative == 3:
print "This stock looks like it is doing VERY poor! I wouldn't invest in it."
end()
if negative - positive == 1:
print "This stock is doing pretty bad. It might recover in the near future, but for now I wouldn't buy it."
end()
print "This stock looks like it isn't doing well... I wouldn't invest in it."
end()
elif negative < positive:
if positive == 3:
print "This stock looks like it is doing very well right now! I would invest in it!"
end()
if positive - negative == 1:
print "This stock is doing slightly good. It isn't making any serious growth at the moment."
print "If you buy it you probably won't lose any money, but you probably won't make much either."
end()
print "This stock looks like it is doing pretty well. I suggest investing in it."
end()
elif positive == negative:
print "This stock is doing mediocre. I would look for another stock to buy unless you are very confident in this one."
end()
# Very simple function: just exits the program and says goodbye to the user.
def end():
print ""
print "Alright, goodbye!"
print "Press the 'return' or 'enter' key to exit..."
pause = raw_input("")
raise SystemExit("Bye")
### ---------------------------------------------------------------------------------------------------------------- ###
# This is just the part where it runs all the functions after they've been defined #
weekend()
greeting()
| 46.699029 | 159 | 0.666632 | 1,294 | 9,620 | 4.934312 | 0.284389 | 0.013782 | 0.015348 | 0.013156 | 0.22177 | 0.186844 | 0.171809 | 0.127486 | 0.111511 | 0.101175 | 0 | 0.006005 | 0.238358 | 9,620 | 205 | 160 | 46.926829 | 0.865429 | 0.217152 | 0 | 0.15894 | 0 | 0.046358 | 0.340582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.019868 | null | null | 0.284768 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d07afd933b91df3022e8e6f670b97146b4008f1 | 1,600 | py | Python | old/Basics/exampleButtons.py | JCTLearning/Project-Runner | 3c36e0153900195cfa1736854b981b4f928d6043 | [
"MIT"
] | null | null | null | old/Basics/exampleButtons.py | JCTLearning/Project-Runner | 3c36e0153900195cfa1736854b981b4f928d6043 | [
"MIT"
] | null | null | null | old/Basics/exampleButtons.py | JCTLearning/Project-Runner | 3c36e0153900195cfa1736854b981b4f928d6043 | [
"MIT"
] | null | null | null | from tkinter import *
isOpen = True
x = True
"""This Script is the Basic logic for the GUI (It will obviously have
custom buttons and not be default buttons. Just created this to refer
to creating buttons. We aren't using Datalists. We'll be putting their
inputs into a .json file so we can call it back and average the numbers.
"""
class Window(Frame):
def __init__(self, master=None):
Frame.__init__(self, master)
self.master = master
self.init_window()
def init_window(self):
self.master.title("Track Thingy")
self.pack(fill=BOTH, expand=1)
#Creating Buttons for the GUI
addData = Button(self, text="Inputs", command=self.cstop)
stopProcess = Button(self, text = "Are you done?", command=self.output)
#Setting Position of Buttons on the geom.
addData.place(x = 5, y=0)
stopProcess.place(x = 450, y = 0)
def cstop(self):
"""Minor error (forever loops and you can't click the
GUI).
"""
global x  # module-level flag checked by output()
x = False
while(isOpen == True):
self.runnerName = input("[Runner Name]: ")
self.runnerTime = input("[Runner Time]: ")
self.rTime = []
self.rName = []
self.rTime.append(self.runnerTime)
self.rName.append(self.runnerName)
print(self.rTime)
print(self.rName)
def output(self):
global isOpen  # stop the input loop in cstop() once the user is done
if not x:
isOpen = False
root = Tk()
#size of the window
root.geometry("500x500")
app = Window(root)
root.mainloop()
| 30.769231 | 79 | 0.58875 | 206 | 1,600 | 4.524272 | 0.529126 | 0.042918 | 0.019313 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011712 | 0.30625 | 1,600 | 51 | 80 | 31.372549 | 0.827928 | 0.05375 | 0 | 0 | 0 | 0 | 0.059337 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.030303 | null | null | 0.060606 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d0e3402ff457c5aeb32f0b8f311699ddd2eab3e | 4,100 | py | Python | tests/graphql/objects/object_with_field_parameters/tests.py | ladal1/simple_api | 1b5d560476bccad9f68a7331d092dbdb68c48bf7 | [
"MIT"
] | 1 | 2021-02-24T22:14:59.000Z | 2021-02-24T22:14:59.000Z | tests/graphql/objects/object_with_field_parameters/tests.py | ladal1/simple_api | 1b5d560476bccad9f68a7331d092dbdb68c48bf7 | [
"MIT"
] | null | null | null | tests/graphql/objects/object_with_field_parameters/tests.py | ladal1/simple_api | 1b5d560476bccad9f68a7331d092dbdb68c48bf7 | [
"MIT"
] | null | null | null | from .objects import schema
from tests.graphql.graphql_test_utils import GraphQLTestCase, remove_ws
class Test(GraphQLTestCase):
GRAPHQL_SCHEMA = schema
REF_GRAPHQL_SCHEMA = """
schema {
query: Query
}
type ActionInfo {
name: String!
parameters: [FieldInfo!]!
data: [FieldInfo!]!
return_type: String!
permitted: Boolean!
deny_reason: String
retry_in: Duration
mutation: Boolean!
__str__: String!
}
scalar Duration
type FieldInfo {
name: String!
typename: String!
default: String
__str__: String!
}
type ObjectInfo {
name: String!
pk_field: String
actions: [ActionInfo!]!
__str__: String!
}
type Query {
get: TestObject!
__types: [TypeInfo!]!
__objects: [ObjectInfo!]!
__actions: [ActionInfo!]!
}
type TestObject {
number(num: Int): Int!
number_def(num: Int = 5): Int!
__str__: String!
__actions: [ActionInfo!]!
}
type TypeInfo {
typename: String!
fields: [FieldInfo!]!
__str__: String!
}
"""
REF_META_SCHEMA = {
"data": {
"__types": [
{
"typename": "TestObject",
"fields": [
{
"name": "number",
"typename": "Integer!"
},
{
"name": "number_def",
"typename": "Integer!"
}
]
}
],
"__objects": [],
"__actions": [
{
"name": "get",
"parameters": [],
"data": [],
"mutation": False,
"return_type": "TestObject!",
"permitted": True,
"deny_reason": None,
"retry_in": None
}
]
}
}
def test_request_with_param(self):
resp = self.query(
"""
query{
get{
number(num: 30)
}
}
"""
)
exp = {
"data": {
"get": {
"number": 30
}
}
}
self.assertResponseNoErrors(resp)
self.assertJSONEqual(resp.content, exp)
def test_request_no_param(self):
resp = self.query(
"""
query{
get{
number
}
}
"""
)
exp = {
"data": {
"get": {
"number": 20
}
}
}
self.assertResponseNoErrors(resp)
self.assertJSONEqual(resp.content, exp)
def test_request_no_param_def(self):
resp = self.query(
"""
query{
get{
number_def
}
}
"""
)
exp = {
"data": {
"get": {
"number_def": 5
}
}
}
self.assertResponseNoErrors(resp)
self.assertJSONEqual(resp.content, exp)
def test_request_no_param_def_with_value(self):
resp = self.query(
"""
query{
get{
number_def(num: 60)
}
}
"""
)
exp = {
"data": {
"get": {
"number_def": 60
}
}
}
self.assertResponseNoErrors(resp)
self.assertJSONEqual(resp.content, exp)
| 22.651934 | 71 | 0.360244 | 254 | 4,100 | 5.535433 | 0.26378 | 0.045519 | 0.039829 | 0.048364 | 0.351351 | 0.324324 | 0.324324 | 0.324324 | 0.183499 | 0.183499 | 0 | 0.006421 | 0.544146 | 4,100 | 180 | 72 | 22.777778 | 0.745853 | 0 | 0 | 0.3 | 0 | 0 | 0.395157 | 0 | 0 | 0 | 0 | 0 | 0.061538 | 1 | 0.030769 | false | 0 | 0.015385 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d112c6bc1fae1e06b84d5fca2b108ce3f6c8d0e | 3,001 | py | Python | wdfn-server/waterdata/tests/commands/lookup_generation/test_wqp_lookups.py | skaymen/waterdataui | 3f419360771a007f8b441d631c782cd67a4e78b4 | [
"CC0-1.0"
] | null | null | null | wdfn-server/waterdata/tests/commands/lookup_generation/test_wqp_lookups.py | skaymen/waterdataui | 3f419360771a007f8b441d631c782cd67a4e78b4 | [
"CC0-1.0"
] | null | null | null | wdfn-server/waterdata/tests/commands/lookup_generation/test_wqp_lookups.py | skaymen/waterdataui | 3f419360771a007f8b441d631c782cd67a4e78b4 | [
"CC0-1.0"
] | null | null | null |
from unittest import TestCase, mock
import requests
from waterdata.commands.lookup_generation.wqp_lookups import (
get_lookup_by_json, is_us_county, get_nwis_state_lookup, get_nwis_county_lookup)
@mock.patch('waterdata.commands.lookup_generation.wqp_lookups.execute_get_request')
class GetLookupByJsonTestCase(TestCase):
def test_sets_query_params_correctly(self, mrequest):
mrequest.return_value = requests.Response()
mrequest.return_value.status_code = 500
self.assertEqual(get_lookup_by_json('http://fakehost.com', path='codes', params={'param1': 'value1'}), {})
mrequest.assert_called_with('http://fakehost.com',
path='codes',
params={'param1': 'value1', 'mimeType': 'json'})
def test_bad_request(self, mrequest):
mrequest.return_value = requests.Response()
mrequest.return_value.status_code = 500
self.assertEqual(get_lookup_by_json('http://fakehost.com', path='codes'), {})
def test_good_request(self, mrequest):
def test_json():
return {'codes': []}
mrequest.return_value = requests.Response()
mrequest.return_value.status_code = 200
mrequest.return_value.json = test_json
self.assertEqual(get_lookup_by_json('http://fakehost.com', path='codes'), {'codes': []})
class IsUsCountyTestCase(TestCase):
def test_empty_lookup(self):
self.assertFalse(is_us_county({}))
def test_lookup_with_no_colon_in_value(self):
self.assertFalse(is_us_county({'value': '12US'}))
def test_lookup_with_colon_and_us(self):
self.assertTrue(is_us_county({'value': 'US:12'}))
def test_lookup_with_colon_and_not_us(self):
self.assertFalse(is_us_county({'value': 'CA:12'}))
class GetNwisStateLookupTestCase(TestCase):
def test_empty_list(self):
self.assertEqual(get_nwis_state_lookup([]), {})
def test_valid_lookup(self):
test_lookup = [
{'value': 'US:55', 'desc': 'Wisconsin'},
{'value': 'US:01', 'desc': 'Alabama'}
]
self.assertEqual(get_nwis_state_lookup(test_lookup),
{'55': {'name': 'Wisconsin'},
'01': {'name': 'Alabama'}}
)
class GetNwisCountyLookupTestCase(TestCase):
def test_empty_list(self):
self.assertEqual(get_nwis_county_lookup([]), {})
def test_valid_lookup(self):
test_lookup = [
{'value': 'US:01:001', 'desc': 'US, Alabama, Autauga County'},
{'value': 'US:01:002', 'desc': 'US, Alabama, Baldwin County'},
{'value': 'US:02:068', 'desc': 'US, Alaska, Denali Borough'}
]
self.assertEqual(get_nwis_county_lookup(test_lookup),
{'01': {'001': {'name': 'Autauga County'}, '002': {'name': 'Baldwin County'}},
'02': {'068': {'name': 'Denali Borough'}}}
)
| 36.597561 | 114 | 0.614795 | 336 | 3,001 | 5.196429 | 0.252976 | 0.04811 | 0.076174 | 0.034364 | 0.541237 | 0.541237 | 0.402062 | 0.363116 | 0.328751 | 0.328751 | 0 | 0.024155 | 0.241253 | 3,001 | 81 | 115 | 37.049383 | 0.742644 | 0 | 0 | 0.189655 | 0 | 0 | 0.170667 | 0.022667 | 0 | 0 | 0 | 0 | 0.206897 | 1 | 0.206897 | false | 0 | 0.051724 | 0.017241 | 0.344828 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d132e8c6c61ba288e7191f564d48638877b2dac | 6,553 | py | Python | src/SPARQL_BAMS_Store_Query_Example.py | rsoscia/BAMS-to-NeuroLex | e6c3b23725e63c0c9a70a7ef8c7a9ca0789ae153 | [
"MIT"
] | 1 | 2015-11-10T05:20:20.000Z | 2015-11-10T05:20:20.000Z | src/SPARQL_BAMS_Store_Query_Example.py | rsoscia/BAMS-to-NeuroLex | e6c3b23725e63c0c9a70a7ef8c7a9ca0789ae153 | [
"MIT"
] | null | null | null | src/SPARQL_BAMS_Store_Query_Example.py | rsoscia/BAMS-to-NeuroLex | e6c3b23725e63c0c9a70a7ef8c7a9ca0789ae153 | [
"MIT"
] | null | null | null | #SPARQL_BAMS_Store_Query_Example.py
#Accessing Python Interactive Mode:
#python -i SPARQL_BAMS_Store_Query_Example.py
#This program demonstrates a basic query pulling data out of a persisted SPARQL store
#For Parsing
import rdflib
from rdflib import plugin
#for getting the length of the files
import os
#for working with tempfiles
import os.path as op
import tempfile
plugin.register(
'sparql', rdflib.query.Processor,
'rdfextras.sparql.processor', 'Processor')
plugin.register(
'sparql', rdflib.query.Result,
'rdfextras.sparql.query', 'SPARQLQueryResult')
#Get a Graph object
g = rdflib.Graph('Sleepycat',identifier='BAMS')
print("loading up the BAMS file in memory...")
# assumes myRDF_BAMS_Store has been created
tempStore = op.join( tempfile.gettempdir(), 'myRDF_BAMS_Store')
g.open(tempStore)
print("going to get results...")
qres = g.query(
"""SELECT ?subject ?predicate ?object
WHERE {
?subject ?predicate ?object.
} LIMIT 5""")
print("printing results")
#Search through everything
#for i in qres:
# print("Definition: %s" %qres.result[i])
#print("Name: %s" %qres.result[0])
#TypeError: not all arguments converted during string formatting
print("The graph has " + str(len(g)) + " items in it")
print("Name--not necessarily in string format: ")
print(qres.result[0])
#The Fixed BAMS Thesaurus Has 178082 Items In It
#The Additional BAMS Content Has 246200 Items In It
# when done!
#g.close()
##################################################################
##################################################################
#Results from running this program on the BAMS Thesaurus:
##################################################################
#loading up the BAMS file in memory...
#going to get results...
#printing results
#The graph has 22176 items in it
#Name--not necessarily in string format:
#(rdflib.term.BNode('Ndf48c09cc76f48c2bc02ca3b687a8d06'),
#rdflib.term.URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#type'),
#rdflib.term.URIRef(u'file:///anchor'))
##################################################################
##################################################################
##################################################################
##################################################################
##################################################################
#Version 1
#Modified BAMS Query for fixed RDF:
qres = g.query(
"""PREFIX bams: <http://brancusi1.usc.edu/thesaurus/definition/>
SELECT ?predicate ?object
WHERE {
bams:corpora-quadrigemina/ ?predicate ?object.
} LIMIT 5""")
for r in qres.result:
#print str(r[0]), str(r[1])
#added on the object
print("added on the object")
print str(r[0]), str(r[1]), str(r[2])
##############################################################################################
##############################################################################################
#Version 2:
#Modified BAMS Query for fixed RDF:
#Instead of this prefix:
#http://brancusi1.usc.edu/thesaurus/
#Using this prefix:
#http://brancusi1.usc.edu/RDF/thesaurus
#PREFIX bams: <http://brancusi1.usc.edu/RDF/thesaurus/definition/>
qres = g.query(
"""PREFIX bams: <http://brancusi1.usc.edu/RDF/thesaurus/definition/>
SELECT ?predicate ?object
WHERE {
bams:Basal-ganglia/ ?predicate ?object.
} LIMIT 5""")
for r in qres.result:
#print str(r[0]), str(r[1])
#added on the object
print str(r[0]), str(r[1]), str(r[2])
#############################################################################################
#Version 2:
#Modified BAMS Query for fixed RDF:
#Instead of this prefix:
#http://brancusi1.usc.edu/thesaurus/
#Using this prefix:
#http://brancusi1.usc.edu/RDF/thesaurus
#http://brancusi1.usc.edu/RDF/thesaurusReference
##################################################
#Not sure if that's even the right description of the modification i'm about to make
#Found the node though for the basal glanglia triple:
#N8b46643d8653451da69a6a73576819d0
#Now inserting it into a query:
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
_:N8b46643d8653451da69a6a73576819d0 ?predicate ?object .
} LIMIT 5""")
for r in qres.result:
print str(r[0]), str(r[1])
#The above query returned the following:
#http://www.w3.org/1999/02/22-rdf-syntax-ns#type file:///anchor
#http://www.w3.org/1999/xlinkhref http://brancusi1.usc.edu/thesaurus/definition/tectum/
#http://www.w3.org/1999/xlinktype simple
#http://www.w3.org/1999/02/22-rdf-syntax-ns#type http://brancusi1.usc.edu/RDF/thesaurus
#http://brancusi1.usc.edu/RDF/definition N8b830f1c7f6c4002b3384eadd2196373
#############################################################################################
#############################################################################################
#Changing the limit to something >5
#N8b46643d8653451da69a6a73576819d0
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
_:N8b46643d8653451da69a6a73576819d0 ?predicate ?object .
} LIMIT 50""")
for r in qres.result:
print str(r[0]), str(r[1])
#############################################################################################
#############################################################################################
#Trying with a new ID
#Na88e23a883694760abef4476981f4573
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
_:Na88e23a883694760abef4476981f4573 ?predicate ?object .
} LIMIT 5""")
for r in qres.result:
print str(r[0]), str(r[1])
#############################################################################################
#############################################################################################
#New Node:
#N8b46643d8653451da69a6a73576819d0
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
_:N8b46643d8653451da69a6a73576819d0 ?predicate ?object .
} LIMIT 5""")
for r in qres.result:
print str(r[0]), str(r[1])
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
?predicate _:N8b46643d8653451da69a6a73576819d0 ?object .
} LIMIT 5""")
for r in qres.result:
print str(r[0]), str(r[1])
qres = g.query(
"""SELECT ?predicate ?object
WHERE {
?predicate ?object _:N8b46643d8653451da69a6a73576819d0 .
} LIMIT 5""")
for r in qres.result:
print str(r[0]), str(r[1])
| 29.518018 | 94 | 0.53548 | 700 | 6,553 | 4.987143 | 0.252857 | 0.025208 | 0.050415 | 0.059868 | 0.541106 | 0.514179 | 0.460327 | 0.411343 | 0.411343 | 0.368089 | 0 | 0.069821 | 0.14543 | 6,553 | 221 | 95 | 29.651584 | 0.553571 | 0.327789 | 0 | 0.465517 | 0 | 0 | 0.164973 | 0.028691 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.086207 | null | null | 0.258621 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d15f9fd513819715f4bc8c0abb5943c1d4656f1 | 3,767 | py | Python | basic_statistics.py | YitengZhou/cgcnn | e8de8d3c20ba7079314b25a6c5de33cedce4f783 | [
"MIT"
] | 1 | 2020-11-25T09:04:49.000Z | 2020-11-25T09:04:49.000Z | basic_statistics.py | YitengZhou/cgcnn | e8de8d3c20ba7079314b25a6c5de33cedce4f783 | [
"MIT"
] | null | null | null | basic_statistics.py | YitengZhou/cgcnn | e8de8d3c20ba7079314b25a6c5de33cedce4f783 | [
"MIT"
] | 1 | 2022-02-20T11:40:51.000Z | 2022-02-20T11:40:51.000Z | import csv
# # Calculate the proportion of Four types of datasets - energy, piezo, elasticity, diel
# # Energy
# energy = []
# with open('training/energy/energy.csv', 'r', encoding='utf-8') as en:
# reader = csv.reader(en)
# for row in reader:
# energy.append(row[0])
# print(len(energy))
#
# # elasticity
# elasticity = []
# with open('training/elasticity/elasticity.csv', 'r', encoding='utf-8') as el:
# reader = csv.reader(el)
# for row in reader:
# elasticity.append(row[0])
# print(len(elasticity))
#
# # diel
# diel = []
# with open('training/diel/diel.csv', 'r', encoding='utf-8') as di:
# reader = csv.reader(di)
# for row in reader:
# diel.append(row[0])
# print(len(diel))
#
# # piezo
# piezo = []
# with open('training/piezo/piezo.csv', 'r', encoding='utf-8') as pi:
# reader = csv.reader(pi)
# for row in reader:
# piezo.append(row[0])
# print(len(piezo))
#
# # energy & elasticity
# intersection = list(set(energy).intersection(set(elasticity)))
# print('energy & elasticity')
# print(len(intersection))
# # energy & diel
# intersection = list(set(energy).intersection(set(diel)))
# print('energy & diel')
# print(len(intersection))
# # energy & piezo
# intersection = list(set(energy).intersection(set(piezo)))
# print('energy & piezo')
# print(len(intersection))
# # elasticity & diel
# intersection = list(set(elasticity).intersection(set(diel)))
# print('elasticity & diel')
# print(len(intersection))
# # elasticity & piezo
# intersection = list(set(elasticity).intersection(set(piezo)))
# print('elasticity & piezo')
# print(len(intersection))
# # diel & piezo
# intersection = list(set(diel).intersection(set(piezo)))
# print('diel & piezo')
# print(len(intersection))
# # diel & piezo & elasticity
# intersection = list(set(diel).intersection(set(piezo).intersection(set(elasticity))))
# print('diel & piezo & elasticity')
# print(len(intersection))
#
# # nelement and nsite
# ne = {}
# ns = {}
# ns_st = {}
# with open('training/energy/energy.csv', 'r', encoding='utf-8') as en:
# reader = csv.reader(en)
# for row in reader:
# nelement = str(row[2])
# ne[nelement] = ne.get(nelement, 0) + 1
# nsite = row[3]
# ns[nsite] = ns.get(nsite, 0) + 1
# ns_new = int(row[3])//20
# ns_st[ns_new] = ns_st.get(ns_new,0)+1
# print('nelement')
# print(len(ne))
# print(ne)
# print('nsite')
# print(len(ns))
# print(ns)
# print(sorted(ns))
# print('nstie_new')
# print(len(ns_st))
# print(ns_st)
# count quality
energy = []
with open('training/energy/energy.csv', 'r', encoding='utf-8') as en:
reader = csv.reader(en)
count1 = 0
count2 = 0
count3 = 0
count4 = 0
for row in reader:
if row[10] == '0.0':
count1+=1
if row[11] == '0.0':
count2+=1
if row[12] == '0.0':
count3+=1
if row[13] =='0.0':
count4+=1
print(count1)
print(count2)
print(count3)
print(count4)
# count quality
with open('training/elasticity/elasticity.csv', 'r', encoding='utf-8') as en:
reader = csv.reader(en)
count1 = 0
count2 = 0
count3 = 0
count4 = 0
for row in reader:
if row[10] == '0.0':
count1+=1
if row[11] == '0.0':
count2+=1
if row[12] == '0.0':
count3+=1
if row[13] =='0.0':
count4+=1
print(count1)
print(count2)
print(count3)
print(count4)
# count quality
with open('training/piezo/piezo.csv', 'r', encoding='utf-8') as en:
reader = csv.reader(en)
count1 = 0
for row in reader:
if row[10] == '0.0':
count1+=1
print(count1)
| 26.907143 | 88 | 0.581099 | 498 | 3,767 | 4.37751 | 0.126506 | 0.051376 | 0.058716 | 0.055046 | 0.588073 | 0.555046 | 0.420183 | 0.380734 | 0.380734 | 0.380734 | 0 | 0.036376 | 0.241041 | 3,767 | 139 | 89 | 27.100719 | 0.726128 | 0.644014 | 0 | 0.893617 | 0 | 0 | 0.1042 | 0.067851 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.021277 | 0 | 0.021277 | 0.191489 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d1f45a9ce8e28e7011e284064528bb0f1a5eef0 | 443 | py | Python | authentication.py | Daniel-Timothy-Leads/dtleads-api-helper-reference | 0206ba796a4df07cf0c1ceca0489c58928cade1e | [
"MIT"
] | null | null | null | authentication.py | Daniel-Timothy-Leads/dtleads-api-helper-reference | 0206ba796a4df07cf0c1ceca0489c58928cade1e | [
"MIT"
] | null | null | null | authentication.py | Daniel-Timothy-Leads/dtleads-api-helper-reference | 0206ba796a4df07cf0c1ceca0489c58928cade1e | [
"MIT"
] | null | null | null | import dtleads_api_helper
dtl = dtleads_api_helper.DanielTimothyLeads() # create instance of DTLeads helper
## Authentication/Authorization
results = dtl.authorize_login("<username>", "<password>")
token_details = dtl.handle_response(results)
print(token_details)
## refresh the token
refresh_token = token_details['refresh_token']
results = dtl.refresh_token(refresh_token)
token_details = dtl.handle_response(results)
print(token_details) | 31.642857 | 81 | 0.817156 | 55 | 443 | 6.290909 | 0.418182 | 0.17341 | 0.092486 | 0.121387 | 0.439306 | 0.306358 | 0.306358 | 0.306358 | 0.306358 | 0 | 0 | 0 | 0.083521 | 443 | 14 | 82 | 31.642857 | 0.852217 | 0.180587 | 0 | 0.444444 | 0 | 0 | 0.092179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.111111 | 0.111111 | 0 | 0.111111 | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6d259ad8652754ae6ba08a88636138de14a40976 | 15,712 | py | Python | api/migrations/0001_initial.py | CraftyGirls/REST-Services | 30a435e113953ffe3984f611fcb595b38490e654 | [
"MIT"
] | null | null | null | api/migrations/0001_initial.py | CraftyGirls/REST-Services | 30a435e113953ffe3984f611fcb595b38490e654 | [
"MIT"
] | 22 | 2015-09-09T20:02:12.000Z | 2016-01-22T03:01:08.000Z | api/migrations/0001_initial.py | CraftyGirls/REST-Services | 30a435e113953ffe3984f611fcb595b38490e654 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Animation',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='Asset',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
('remoteUrl', models.CharField(default=b'', max_length=1000)),
('assetType', models.CharField(default=b'', max_length=255, choices=[(b'CHARACTER COMPONENT', b'Character Component'), (b'MESH', b'Mesh'), (b'ITEM', b'Item')])),
],
),
migrations.CreateModel(
name='Behaviour',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='Character',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='CharacterComponent',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('componentType', models.CharField(max_length=100, choices=[(b'UPPER ARM', b'Upper Arm'), (b'LOWER ARM', b'Lower Arm'), (b'HAND', b'Hand'), (b'UPPER LEG', b'Upper Leg'), (b'LOWER LEG', b'Lower Leg'), (b'FOOT', b'Foot'), (b'TORSO', b'Torso'), (b'LOWER JAW', b'Lower Jaw'), (b'UPPER JAW', b'Upper Jaw'), (b'NOSE', b'Node'), (b'LEFT PUPIL', b'Left Pupil'), (b'RIGHT PUPIL', b'Right Pupil'), (b'PELVIS', b'Pelvis')])),
],
),
migrations.CreateModel(
name='Check',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
],
),
migrations.CreateModel(
name='Component',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(auto_now_add=True)),
('name', models.CharField(default=b'', unique=True, max_length=100)),
('image', models.ImageField(upload_to=b'component_images', blank=True)),
('description', models.TextField(default=b'')),
('rating', models.FloatField(default=0.0)),
],
options={
'ordering': ('name',),
},
),
migrations.CreateModel(
name='Condition',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('numArgs', models.IntegerField(default=0)),
],
),
migrations.CreateModel(
name='ConditionalArguments',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('value', models.TextField(default=b'')),
('dataType', models.IntegerField(default=0)),
('index', models.IntegerField(default=0)),
('conditionCheck', models.ForeignKey(to='api.Check', null=True)),
],
),
migrations.CreateModel(
name='Connection',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
],
),
migrations.CreateModel(
name='Conversation',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
],
),
migrations.CreateModel(
name='Dialogue',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('conversation', models.ForeignKey(to='api.Conversation', null=True)),
('speaker', models.ForeignKey(to='api.Character', null=True)),
],
),
migrations.CreateModel(
name='FurnitureComponent',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('meshUrl', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='FurnitureType',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('furnitureComponent', models.ForeignKey(to='api.FurnitureComponent', null=True)),
],
),
migrations.CreateModel(
name='Item',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('character', models.ForeignKey(to='api.Character', null=True)),
],
),
migrations.CreateModel(
name='Joint',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('xPercentage', models.FloatField(default=0.0)),
('yPercentage', models.FloatField(default=0.0)),
],
),
migrations.CreateModel(
name='Line',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('text', models.CharField(default=b'', max_length=100)),
('dialogue', models.ForeignKey(to='api.Dialogue', null=True)),
],
),
migrations.CreateModel(
name='Option',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('text', models.TextField(default=b'')),
('conversation', models.ForeignKey(to='api.Conversation', null=True)),
],
),
migrations.CreateModel(
name='PDUser',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('avatar', models.ImageField(upload_to=b'profile_images', blank=True)),
('user', models.OneToOneField(to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Room',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('size', models.IntegerField(default=0)),
],
),
migrations.CreateModel(
name='Scenario',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(auto_now_add=True)),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
('script', models.TextField(default=b'')),
('jsonUrl', models.CharField(default=b'{}', max_length=1024)),
('owner', models.ForeignKey(related_name='scenarios', to='api.PDUser')),
],
options={
'ordering': ('created',),
},
),
migrations.CreateModel(
name='SkeletalConnection',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('component', models.ForeignKey(to='api.CharacterComponent', null=True)),
('outComponents', models.ManyToManyField(related_name='outComponents_rel_+', null=True, to='api.SkeletalConnection')),
],
),
migrations.CreateModel(
name='State',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('behaviour', models.ForeignKey(to='api.Behaviour', null=True)),
('character', models.ForeignKey(to='api.Character', null=True)),
('conversation', models.ForeignKey(to='api.Conversation', null=True)),
('idleAnimationOverride', models.ForeignKey(to='api.Animation', null=True)),
],
),
migrations.CreateModel(
name='Tag',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('value', models.CharField(default=b'', max_length=100)),
],
),
migrations.CreateModel(
name='Taggable',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
],
),
migrations.CreateModel(
name='Texture',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=100)),
('imageUrl', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='Trigger',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('function', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
],
),
migrations.CreateModel(
name='TriggerArgument',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('dataType', models.CharField(default=0, max_length=100, choices=[(b'INT', b'int'), (b'FLOAT', b'float'), (b'CHARACTER', b'character'), (b'ITEM', b'item'), (b'ROOM', b'room'), (b'CONVERSATION', b'conversation')])),
('field', models.CharField(default=0, max_length=100)),
('trigger', models.ForeignKey(to='api.Trigger', null=True)),
],
),
migrations.CreateModel(
name='UploadFile',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('file', models.FileField(upload_to=b'files/%Y/%m/%d')),
],
),
migrations.CreateModel(
name='ComponentSet',
fields=[
('taggable_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='api.Taggable')),
('name', models.CharField(default=b'', max_length=100)),
('jsonRepresentation', models.TextField(default=b'')),
('fileUrl', models.CharField(default=b'', max_length=512)),
('setType', models.CharField(default=b'', max_length=100, choices=[(b'ARM', b'Arm'), (b'LEG', b'Leg'), (b'HEAD', b'Head'), (b'TORSO', b'Torso'), (b'PELVIS', b'Pelvis')])),
],
bases=('api.taggable',),
),
migrations.CreateModel(
name='ItemDefinition',
fields=[
('taggable_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='api.Taggable')),
('name', models.CharField(default=b'', max_length=100)),
('description', models.TextField(default=b'')),
('interactable', models.BooleanField(default=False)),
('texture', models.OneToOneField(null=True, to='api.Texture')),
],
bases=('api.taggable',),
),
migrations.AddField(
model_name='tag',
name='owner',
field=models.ForeignKey(to='api.Taggable', null=True),
),
migrations.AddField(
model_name='room',
name='scenario',
field=models.ForeignKey(to='api.Scenario', null=True),
),
migrations.AddField(
model_name='item',
name='room',
field=models.ForeignKey(to='api.Room', null=True),
),
migrations.AddField(
model_name='item',
name='scenario',
field=models.ForeignKey(to='api.Scenario', null=True),
),
migrations.AddField(
model_name='furnituretype',
name='room',
field=models.ForeignKey(to='api.Room', null=True),
),
migrations.AddField(
model_name='conversation',
name='scenario',
field=models.ForeignKey(to='api.Scenario', null=True),
),
migrations.AddField(
model_name='component',
name='owner',
field=models.ForeignKey(related_name='components', to='api.PDUser'),
),
migrations.AddField(
model_name='check',
name='dialogue',
field=models.ForeignKey(to='api.Dialogue', null=True),
),
migrations.AddField(
model_name='charactercomponent',
name='texture',
field=models.OneToOneField(null=True, to='api.Texture'),
),
migrations.AddField(
model_name='character',
name='scenario',
field=models.ForeignKey(to='api.Scenario', null=True),
),
migrations.AddField(
model_name='item',
name='itemDef',
field=models.ForeignKey(to='api.ItemDefinition', null=True),
),
migrations.AddField(
model_name='component',
name='componentSet',
field=models.ForeignKey(to='api.ComponentSet', null=True),
),
migrations.AddField(
model_name='charactercomponent',
name='componentSet',
field=models.ForeignKey(to='api.ComponentSet', null=True),
),
]
| 45.542029 | 430 | 0.542261 | 1,473 | 15,712 | 5.67685 | 0.111337 | 0.035398 | 0.092681 | 0.081559 | 0.725544 | 0.656302 | 0.636809 | 0.587778 | 0.552858 | 0.533006 | 0 | 0.008463 | 0.300598 | 15,712 | 344 | 431 | 45.674419 | 0.75248 | 0.001337 | 0 | 0.707101 | 0 | 0 | 0.133915 | 0.005545 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008876 | 0 | 0.017751 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d2b3418afb5afe27f7faa195e05cfdac751a8a3 | 324 | py | Python | Codeforces/C_Killjoy.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | Codeforces/C_Killjoy.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | Codeforces/C_Killjoy.py | anubhab-code/Competitive-Programming | de28cb7d44044b9e7d8bdb475da61e37c018ac35 | [
"MIT"
] | null | null | null | for _ in range(int(input())):
n,x = map(int,input().split())
l = list(map(int,input().split()))
flag=2
if len(set(l)) == 1 and l[0] == x:
flag=0
elif x in l or sum([i-x for i in l])==0:
flag=1
if flag==0:
print(0)
elif flag==1:
print(1)
else:
print(2)
| 21.6 | 44 | 0.462963 | 58 | 324 | 2.568966 | 0.431034 | 0.161074 | 0.147651 | 0.214765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051643 | 0.342593 | 324 | 14 | 45 | 23.142857 | 0.647887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.214286 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d2dfab6aff32aa71e6750ca1bd9b42e41522133 | 1,130 | py | Python | msgcode/index2.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | 1 | 2019-07-09T09:59:39.000Z | 2019-07-09T09:59:39.000Z | msgcode/index2.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | null | null | null | msgcode/index2.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | null | null | null | import PIL.Image,PIL.ImageDraw,PIL.ImageFont,PIL.ImageFilter
import random
#random letter
def rndchar():
return chr(random.randint(65, 90))
#random.randint() generates a random integer between 65 and 90; in ASCII that range covers the uppercase letters A-Z
#chr(kk) takes an integer ASCII code kk and returns the corresponding character
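#e.g. chr(65) == 'A' and chr(90) == 'Z'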
#random color 1
def rndcolor():
return random.randint(64, 255),random.randint(64, 255),random.randint(64, 255)
#random color 2
def rndcolor2():
return random.randint(32, 127), random.randint(32, 127), random.randint(32, 127)
width = 60*4
height = 60
image = PIL.Image.new('RGB', (width, height), (255, 255, 255))
#RGB file: the RGB color model is an industry color standard; colors are produced by varying and mixing the red (R), green (G) and blue (B) channels
#create the font object
font = PIL.ImageFont.truetype('fonts.ttf', 36)
#loads a TrueType or OpenType font file and creates a font object of the given size; the font path can be found via Control Panel -> Fonts by copying the path of a chosen font style
#create the draw object
draw = PIL.ImageDraw.Draw(image)
#fill every pixel with a random background color
for x in range(width):
for y in range(height):
draw.point((x, y), fill=rndcolor())
#draw the text (four random letters)
for t in range(4):
draw.text((60*t+10, 10), rndchar(), font=font, fill=rndcolor2())
image = image.filter(PIL.ImageFilter.BLUR)
image.save('test2.png') | 26.904762 | 116 | 0.738053 | 158 | 1,130 | 5.28481 | 0.525316 | 0.124551 | 0.053892 | 0.064671 | 0.129341 | 0.129341 | 0.129341 | 0.129341 | 0 | 0 | 0 | 0.065476 | 0.107965 | 1,130 | 42 | 117 | 26.904762 | 0.761905 | 0.309735 | 0 | 0 | 0 | 0 | 0.027273 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.1 | 0.15 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
6d31e720a0395d6ea7884f350148553c622f24df | 2,066 | py | Python | google/ads/google_ads/v0/proto/services/keyword_view_service_pb2_grpc.py | jwygoda/google-ads-python | 863892b533240cb45269d9c2cceec47e2c5a8b68 | [
"Apache-2.0"
] | null | null | null | google/ads/google_ads/v0/proto/services/keyword_view_service_pb2_grpc.py | jwygoda/google-ads-python | 863892b533240cb45269d9c2cceec47e2c5a8b68 | [
"Apache-2.0"
] | null | null | null | google/ads/google_ads/v0/proto/services/keyword_view_service_pb2_grpc.py | jwygoda/google-ads-python | 863892b533240cb45269d9c2cceec47e2c5a8b68 | [
"Apache-2.0"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from google.ads.google_ads.v0.proto.resources import keyword_view_pb2 as google_dot_ads_dot_googleads__v0_dot_proto_dot_resources_dot_keyword__view__pb2
from google.ads.google_ads.v0.proto.services import keyword_view_service_pb2 as google_dot_ads_dot_googleads__v0_dot_proto_dot_services_dot_keyword__view__service__pb2
class KeywordViewServiceStub(object):
"""Service to manage keyword views.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetKeywordView = channel.unary_unary(
'/google.ads.googleads.v0.services.KeywordViewService/GetKeywordView',
request_serializer=google_dot_ads_dot_googleads__v0_dot_proto_dot_services_dot_keyword__view__service__pb2.GetKeywordViewRequest.SerializeToString,
response_deserializer=google_dot_ads_dot_googleads__v0_dot_proto_dot_resources_dot_keyword__view__pb2.KeywordView.FromString,
)
class KeywordViewServiceServicer(object):
"""Service to manage keyword views.
"""
def GetKeywordView(self, request, context):
"""Returns the requested keyword view in full detail.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_KeywordViewServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'GetKeywordView': grpc.unary_unary_rpc_method_handler(
servicer.GetKeywordView,
request_deserializer=google_dot_ads_dot_googleads__v0_dot_proto_dot_services_dot_keyword__view__service__pb2.GetKeywordViewRequest.FromString,
response_serializer=google_dot_ads_dot_googleads__v0_dot_proto_dot_resources_dot_keyword__view__pb2.KeywordView.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'google.ads.googleads.v0.services.KeywordViewService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
| 43.041667 | 167 | 0.810745 | 253 | 2,066 | 6.071146 | 0.288538 | 0.064453 | 0.046875 | 0.058594 | 0.479167 | 0.479167 | 0.419271 | 0.334635 | 0.334635 | 0.334635 | 0 | 0.009912 | 0.121007 | 2,066 | 47 | 168 | 43.957447 | 0.835903 | 0.117619 | 0 | 0 | 1 | 0 | 0.099497 | 0.065959 | 0 | 0 | 0 | 0 | 0 | 1 | 0.115385 | false | 0 | 0.115385 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d377b9f0e95cc7da3ba86cd55faed22a2297117 | 891 | py | Python | setup.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | 1 | 2019-03-15T10:32:21.000Z | 2019-03-15T10:32:21.000Z | setup.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | null | null | null | setup.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | 1 | 2016-03-25T03:57:49.000Z | 2016-03-25T03:57:49.000Z | #!/usr/bin/env python
from setuptools import setup, find_packages
setup(
name='django-git',
version='0.1.0',
description='Get git information for your django repository',
author='Serafeim Papastefanos',
author_email='spapas@gmail.com',
license='MIT',
url='https://github.com/spapas/django-git/',
zip_safe=False,
include_package_data=False,
packages=find_packages(exclude=['tests.*', 'tests', 'sample', ]),
install_requires=['Django >=1.4', 'six', 'GitPython > 1.0'],
classifiers=[
'Environment :: Web Environment',
'Framework :: Django',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Operating System :: OS Independent',
'Programming Language :: Python',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Software Development :: Libraries',
],
)
| 29.7 | 69 | 0.62514 | 95 | 891 | 5.789474 | 0.736842 | 0.043636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010057 | 0.218855 | 891 | 29 | 70 | 30.724138 | 0.780172 | 0.022447 | 0 | 0 | 0 | 0 | 0.504598 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.041667 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d3b89d7e2be00a2238a2e44ed78c4b9050bf029 | 2,450 | py | Python | ex09-pg/ex09-pg.py | s0tt/rl-course | e2094b3d4d6f4401c5eaee1f283dae9ddf516716 | [
"MIT"
] | null | null | null | ex09-pg/ex09-pg.py | s0tt/rl-course | e2094b3d4d6f4401c5eaee1f283dae9ddf516716 | [
"MIT"
] | null | null | null | ex09-pg/ex09-pg.py | s0tt/rl-course | e2094b3d4d6f4401c5eaee1f283dae9ddf516716 | [
"MIT"
] | null | null | null | import gym
import numpy as np
import matplotlib.pyplot as plt
def policy(state, theta):
""" TODO: return probabilities for actions under softmax action selection """
h = state @ theta
return np.exp(h)/np.sum(np.exp(h))
def generate_episode(env, theta, display=False):
""" enerates one episode and returns the list of states, the list of rewards and the list of actions of that episode """
state = env.reset()
states = [state]
actions = []
rewards = []
for t in range(500):
if display:
env.render()
p = policy(state, theta)
action = np.random.choice(len(p), p=p)
state, reward, done, info = env.step(action)
rewards.append(reward)
actions.append(action)
if done:
break
states.append(state)
return states, rewards, actions
def REINFORCE(env, gamma=0.99, alpha=0.05):
theta = np.random.rand(4, 2) # policy parameters
ep_len_list = []
mean_ep_len = []
for e in range(1000):
if e % 300 == 0:
states, rewards, actions = generate_episode(env, theta, True)  # display the policy every 300 episodes
else:
states, rewards, actions = generate_episode(env, theta, False)
# TODO: keep track of previous 100 episode lengths and compute mean
if len(ep_len_list) >= 100:
ep_len_list.pop(0) #remove last item
ep_len_list.append(len(states))
mean = sum(ep_len_list) / len(ep_len_list)
mean_ep_len.append(mean)
print("episode:\t" + str(e) + " length:\t" + str(len(states)) + " mean len:\t" + str(mean))
# TODO: implement the reinforce algorithm to improve the policy weights
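# Monte-Carlo return: G[t] = sum over k = t+1..T of gamma**(k-t-1) * reward[k].
# For the softmax policy, the gradient of log pi(a_t|s_t) with respect to theta[:, a_t] is s_t * (1 - pi(a_t|s_t)),
# so the update below adds alpha * gamma**t * G[t] * s_t * (1 - pi(a_t|s_t)) to the chosen action's column.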
nr_steps = len(states)
G = np.zeros([nr_steps])
for t in range(nr_steps):
for k in range(t+1,nr_steps+1):
G[t] += (gamma**(k-t-1)) * rewards[k-1]
action = actions[t]
theta[:,action] = theta[:,action] + alpha * (gamma**t) * G[t] * (states[t] * (1 - policy(states[t], theta)[action]))
return mean_ep_len
def main():
env = gym.make('CartPole-v1')
mean_ep_len = REINFORCE(env)
plt.plot(mean_ep_len)
plt.title("Mean Ep length over time")
plt.xlabel("Episodes")
plt.ylabel("Mean Episode Length")
plt.legend()
plt.savefig('ex09' + '.png')
plt.show()
env.close()
if __name__ == "__main__":
main()
| 29.518072 | 128 | 0.59551 | 343 | 2,450 | 4.145773 | 0.335277 | 0.038678 | 0.037975 | 0.048523 | 0.092827 | 0.092827 | 0.067511 | 0.067511 | 0 | 0 | 0 | 0.020857 | 0.275918 | 2,450 | 82 | 129 | 29.878049 | 0.780722 | 0.160408 | 0 | 0.033898 | 0 | 0 | 0.053922 | 0 | 0 | 0 | 0 | 0.012195 | 0 | 1 | 0.067797 | false | 0 | 0.050847 | 0 | 0.169492 | 0.016949 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d3dea70d27480b04ab234ec6603df5eec45e0a3 | 1,425 | py | Python | Bite 105. Slice and dice.py | Guznin/PyBites | 4a683a1d3aa6e742d28e96e8dfe5de7dd1f61e4a | [
"MIT"
] | 1 | 2019-07-16T15:51:19.000Z | 2019-07-16T15:51:19.000Z | Bite 105. Slice and dice.py | Guznin/PyBites | 4a683a1d3aa6e742d28e96e8dfe5de7dd1f61e4a | [
"MIT"
] | null | null | null | Bite 105. Slice and dice.py | Guznin/PyBites | 4a683a1d3aa6e742d28e96e8dfe5de7dd1f61e4a | [
"MIT"
] | 2 | 2019-10-02T07:30:17.000Z | 2020-04-07T20:38:58.000Z |
"""
Take the block of text provided and strip off the whitespace at both ends. Split the text by newline (\n) using split.
Loop through the lines and if the first character of each (stripped) line is lowercase, split the line into words and add the last word to the (given) results list, stripping the trailing dot (.) and exclamation mark (!) from the end of the word.
At the end of the function return the results list.
"""
text = """
One really nice feature of Python is polymorphism: using the same operation
on different types of objects.
Let's talk about an elegant feature: slicing.
You can use this on a string as well as a list for example
'pybites'[0:2] gives 'py'.
The first value is inclusive and the last one is exclusive so
here we grab indexes 0 and 1, the letter p and y.
When you have a 0 index you can leave it out so can write this as 'pybites'[:2]
but here is the kicker: you can use this on a list too!
['pybites', 'teaches', 'you', 'Python'][-2:] would gives ['you', 'Python']
and now you know about slicing from the end as well :)
keep enjoying our bites!
"""
def slice_and_dice(text=text):
results = []
strip_text = text.strip("\n")
split_strip = strip_text.split("\n")
for i in split_strip:
i = i.strip(" ")
if i[0].islower():
strip_i = i.strip('!.')
split_i = strip_i.split(" ")
results.append(split_i[-1])
return results | 39.583333 | 246 | 0.687719 | 245 | 1,425 | 3.959184 | 0.481633 | 0.018557 | 0.020619 | 0.02268 | 0.03299 | 0.03299 | 0 | 0 | 0 | 0 | 0 | 0.008072 | 0.217544 | 1,425 | 36 | 247 | 39.583333 | 0.861883 | 0.293333 | 0 | 0 | 0 | 0.08 | 0.652305 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0 | 0 | 0.08 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d3ec069650a07875b8e93b882c63f60bc92401c | 1,809 | py | Python | main/migrations/0009_auto_20180512_1403.py | jfilter/MDMA | 90feb5cf351787b1590a556764c4528daabfe051 | [
"MIT"
] | 4 | 2018-06-14T15:33:37.000Z | 2019-02-25T09:23:45.000Z | main/migrations/0009_auto_20180512_1403.py | jfilter/MDMA | 90feb5cf351787b1590a556764c4528daabfe051 | [
"MIT"
] | 1 | 2020-02-11T22:26:49.000Z | 2020-02-11T22:26:49.000Z | main/migrations/0009_auto_20180512_1403.py | jfilter/MDMA | 90feb5cf351787b1590a556764c4528daabfe051 | [
"MIT"
] | 1 | 2019-02-25T09:23:48.000Z | 2019-02-25T09:23:48.000Z | # Generated by Django 2.0.5 on 2018-05-12 14:03
import django.core.validators
from django.db import migrations, models
import main.models
class Migration(migrations.Migration):
dependencies = [
('main', '0008_auto_20180507_2011'),
]
operations = [
migrations.RemoveField(
model_name='job',
name='content_weight',
),
migrations.RemoveField(
model_name='job',
name='log',
),
migrations.RemoveField(
model_name='job',
name='num_steps',
),
migrations.AlterField(
model_name='inputimage',
name='image',
field=models.ImageField(upload_to=main.models.random_input_image_file_path, validators=[django.core.validators.FileExtensionValidator(['jpg', 'png'])]),
),
migrations.AlterField(
model_name='job',
name='output_image',
field=models.ImageField(null=True, upload_to=main.models.random_output_image_file_path),
),
migrations.AlterField(
model_name='job',
name='style_weight',
field=models.FloatField(default=1, null=True, validators=[django.core.validators.MaxValueValidator(1), django.core.validators.MinValueValidator(0)]),
),
migrations.AlterField(
model_name='job',
name='uuid',
field=models.CharField(blank=True, default='nnagbhXhWYbqtrkiFBESrmXiEaphfU6G7KYBw3LR69Vs8ghBP3', max_length=50, unique=True),
),
migrations.AlterField(
model_name='styleimage',
name='image',
field=models.ImageField(upload_to='static/images/style', validators=[django.core.validators.FileExtensionValidator(['jpg'])]),
),
]
| 34.132075 | 164 | 0.610835 | 173 | 1,809 | 6.231214 | 0.410405 | 0.06679 | 0.06679 | 0.089054 | 0.412801 | 0.375696 | 0.070501 | 0 | 0 | 0 | 0 | 0.032477 | 0.268104 | 1,809 | 52 | 165 | 34.788462 | 0.781722 | 0.024876 | 0 | 0.521739 | 1 | 0 | 0.11748 | 0.04143 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.065217 | 0 | 0.130435 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d4062683b884798ed07b82edb1e9d86a003556b | 21,319 | py | Python | recipes/Python/577462_A_Buttonbar_program_with_color_/recipe-577462.py | tdiprima/code | 61a74f5f93da087d27c70b2efe779ac6bd2a3b4f | [
"MIT"
] | 2,023 | 2017-07-29T09:34:46.000Z | 2022-03-24T08:00:45.000Z | recipes/Python/577462_A_Buttonbar_program_with_color_/recipe-577462.py | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 32 | 2017-09-02T17:20:08.000Z | 2022-02-11T17:49:37.000Z | recipes/Python/577462_A_Buttonbar_program_with_color_/recipe-577462.py | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 780 | 2017-07-28T19:23:28.000Z | 2022-03-25T20:39:41.000Z | '''
;#template` {-path} {-menu} {-s1} {-s2} {-s3}
;#option`-path`Path to controlling file`F`c:\source\python\projects\menu\buttonbar.py`
;#option`-menu`Path to menu file`F`c:\source\python\projects\menu\test.ini`
;#option`-s1`First section`X`info`
;#option`-s2`Second section`X`help`
;#option`-s3`Third section`X`data`
;#end
'''
mHelpText = '''
# ----------------------------------------------
# Name: ButtonBarV1Ex
# Description:
## D20G-76 Popup button bar using Qeditor style menus.ini file, selects sections to use (ExPopen)
#
# Author: Philip S. Rist
# Date: 10/27/2010
# Copyright 2010 by St. Thomas Software
# ----------------------------------------------
# This program is freeware. It may be used
# for any moral purpose. You use it at your
# own risk. St. Thomas Software makes no
# guarantees to its fitness to do anything.
#
# If you feel you must pay for it. Please
# send one million dollars to
#
# The International Rescue Committee
# 122 East 42nd Street
# New York, N.Y. 10168-1289
#
# Ok, we are in a recession. So, make it a half
# million.
#
# Example of .reg file entry
# [HKEY_CLASSES_ROOT\textfile\Shell\TextTools]
# @="T&ext Tools
# "EditFlags"=hex:01,00,00,00
# [HKEY_CLASSES_ROOT\textfile\Shell\TextTools\Command]
# @="c:\\sys\\python25\\python.exe c:\\bin\\ButtonBarV1Ex.py \"%1\" {p}\\menus.ini;{o}\\menus.ini;c:\\bin\\menus.ini ide "
#
#
# Syntax:
# ButtonBarV1Ex.py [options] <file path> <menu path> <first section> [ <another section> ... ]
#
# Options:
#
# -b <color> - default button background color
# -d <directory> - set working directory
# -e <name>=<value> - set environment variable
# -h - print help text
# -l <color> - default label background color
# -o - set orientation to horizontal
#
# The command line template extracted from the menu file may contain macros of the form '{x}' which
# are replaced by parts of the file path or other string. The Do.py and Do2.py modules are used.
# %1 is replaced by the selected file path. The file path string is also used for display
# at the top of the button bar. If no expansion is performed any string can be used and displayed.
#
# The second parameter is the name of an initialization file similar to the one shown below.
# Each line contains a prompt displayed on a button and a command separated by a comma.
# Macros are replaced before execution.
#
# The remaining parameters name sections to be used to make the button bar. The order in the
# initialization file is maintained. Selection is case insensitive.
#
# Do.py and Do2.py are used to expand the command line, the menu selection and each section
# name. The command line can also contain '{pr}' which will be replaced by the prompt
# displayed on the button. Also, any string enclosed in braces not a macro will be taken
# as a command name in the registry entry for the primary files file type.
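#
#  For example (assuming the selected file is c:\docs\notes.txt and the button is
#  labelled 'Run'), a menu line such as
#      Run,c:\bin\messagebox.exe {pr} "%1"
#  would expand to
#      c:\bin\messagebox.exe Run "c:\docs\notes.txt"
#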
#
# The Menu File:
#
# The menu file follows the syntax used by the QEditor menus.ini file. The same syntax as an .ini
# file except a ',' is used to separate key from value. In this case the values are command
# line templates usually containing one or more macro strings.
#
# Lines beginning with ';' are comments unless followed by '#' in which case the line may contain
# a special command. Command line options are also available for the same functions
#
# ;#cd`c:\bin - Sets the working directory to c:\bin or '-d c:\bin'
# ;#set`abc=def - Sets the environment variable abc to 'def' or '-s abc=def'
# ;#button`#00CCCC - Sets the background of succeeding buttons to #00CCCC. or '-b #00CCCC'
# ;#label`light blue - Sets the background of labels created by succeeding '-' lines
# to light blue. If included before the first section the
# color will also be used for section header labels or '-l "light blue"'
# Within sections not included in the button bar special commands are ignored. The '`' usually
# appears under the '~' on the keyboard.
#
# -o option will produce a horizontal button bar
#
# Section names and command prompts can contan a file pattern enclosed within two '/' or two '!'
# to include the section or line if the file path matches or does not match the pattern. Examples
# are shown below.
#
# Example initialization file (Text Tools.ini)
;#set`test=A Short String
;#set`use=c:\sys\python27\pythonw.exe
;#set`gain=none
;#cd`\source\tcl
;#cd`c:\source\python\Projects\Program Execution
;#label`yellow
[Test1]
;#button`#00EEEE
;#label`#00CCCC
test 1,c:\bin\messagebox.exe {a} {b} {p} "{f}" {n} {e} {o} %test%
-Directory List
Dir,c:\windows\system32\cmd.exe /C dir "{p}\*.{e}"
DirX,c:\windows\system32\cmd.exe /C dir *.*
Dir2,c:\windows\system32\cmd.exe /C dir c:\source\Python\*.*
;#label`light gray
-Program execution
/*.exe/Run,"{a}" I am running "{a}"
/*.py/Run,c:\bin\messagebox.exe I am running a Python program "{a}"
/*.py/Run 2,%use% "{a}"
/*.exe/Run 2,"{a}" I am running "{a}"
[/*.py/Test2]
Edit T,{o}\qeditor.exe "{a}"
Dir *,%comspec% /C dir {.}\*.*
'''
import sys
import os
import getopt
import fnmatch
import Tkinter
from Tkconstants import *
import tkMessageBox
from _winreg import *
import Do # Recipe: 577439
from Do2 import ExpandArg # Recipe: 577440
from DoCommand import BuildCommand # Recipe: 577441
import subprocess
mCmds = {}
def Expand(pCommand, pFilePath, pPrompt, pSep=','):
'''
Expand
'''
# ---- Standard macro expansion with added {pr} macro
    # Can separate arguments with ',' to force argument by argument
# expansion
if pCommand.find(pSep) >= 0:
try:
lArgs = pCommand.split(pSep)
lCommand = ''
lDefaultPath = pFilePath
for lArg in lArgs:
lVal = ExpandArg(lArg, pFilePath, lDefaultPath)
if lVal.find('{pr}') >= 0: # insert button prompt
lVal = lVal.replace('{pr}', pPrompt)
lCommand = lCommand + lVal + " "
except Exception, e:
print 'BBV1Ex',e
else:
lCommand = Do.Expand(pCommand, pFilePath)
if lCommand.find('{pr}') >= 0:
lCommand = lCommand.replace('{pr}', pPrompt)
# ---- Remove {} for compatibility with DoM.py
lCommand = lCommand.replace('{}', ' ')
# ---- Remaining macros will be replaced by command of same name
# in registry
lPos = lCommand.find("{")
if lPos >= 0:
lEndPos = lCommand.find("}")
if lEndPos > lPos:
try:
lKey = lCommand[lPos+1:lEndPos]
lDotPos = lKey.rfind('.')
if lDotPos > 0:
                    lExt = lKey[lDotPos:]
lKey = lKey[0:lDotPos]
else:
lDotPos = pFilePath.rfind('.')
lExt = pFilePath[lDotPos:]
lReplace = BuildCommand(pFilePath, lKey, lExt, [])
lCommand = lCommand[0:lPos] + lReplace + lCommand[lEndPos+1:]
except:
pass
# ---- replace %1 and %*
else:
lPos = lCommand.rfind("%1")
if lPos >= 0:
lCommand = lCommand.replace("%1", pFilePath)
lPos = lCommand.rfind("%*")
if lPos >= 0:
lCommand = lCommand.replace("%*", "")
# ---- Remove !! used by DoM.py
lPos = lCommand.rfind('!!')
if lPos > 0:
lCommand = lCommand[0:lPos]
return lCommand
def showit(pEvent):
'View expansion of selected command'
lPrompt = pEvent.widget.cget("text")
lCommand = mCmds[lPrompt]
lCommand = Expand(lCommand, mFileName, lPrompt)
tkMessageBox.showinfo('Expanded Command:', lPrompt + " = " + lCommand)
def submitnow(pEvent):
'Execute selected command'
lPrompt = pEvent.widget.cget("text")
lCommand = mCmds[lPrompt]
lCommand = Expand(lCommand, mFileName, lPrompt)
lCommand = '"' + lCommand + '"'
subprocess.Popen(lCommand, shell=True)
def Commands():
return mCmds
def Build(pFrame, pMenuName, pFileName, pLabelList, pSubmit=submitnow, pCallBack=None,
pDefaultLabel='yellow', pDefaultButton='light blue', pSide=TOP, pFill=X):
'''
Build menu from menu file
The menu file may contain commands to set the current directory and to
set specified environment variables. This is done while the file is
being read. They are not dependent on the selected button. The last
command encountered will be active.
set environment variable name to value
;#set`name=value
set current directory to path
;#cd`path
set label background color
;#label`color
set button background color
;#button`color
'''
#
lPos = pFileName.rfind('\\')
if lPos >= 0:
lFileName = pFileName[lPos+1:].lower()
else:
lFileName = pFileName.lower()
lExist = os.path.exists(pFileName)
lFile = open(pMenuName,'r')
lFileText = lFile.readlines()
    lFile.close()
lDefaultButton = lButtonColor = pDefaultButton
lDefaultLabel = lLabelColor = pDefaultLabel
label = Tkinter.Label(pFrame, text=pFileName, bg=lLabelColor)
#label.pack(side=TOP, fill=X)
label.pack(side=pSide, fill=pFill)
lKeep = False
lSectionCount = 0
for lText in lFileText:
if len(lText) > 1:
if lText[0] != "[":
# ---- Menu splitter line
if lText[0] == "-" and lKeep:
if len(lText) > 1:
lPrompt = lText[1:-1]
if lPrompt.find('{') >= 0:
lPrompt = ExpandCmd(lPrompt, pFileName)
# ---- Display label for - line
# can be removed without affecting remainder of program
if len(lPrompt) > 0 and lKeep == True:
label= Tkinter.Label(pFrame, text=lPrompt, bg=lLabelColor)
#label.pack(side=TOP, fill=X)
label.pack(side=pSide, fill=pFill)
# ---- process special command
elif lText.startswith(";#") and (lSectionCount == 0 or lKeep == True):
lPos = lText.find("`") # <---- Fields are separated by ` (below ~)
if lPos > 0:
lCh = lText[2]
# ---- Set environment variable
if lCh == 's': # set environment variable
lEqPos = lText.find("=")
if lEqPos > lPos:
lKey = lText[lPos+1:lEqPos].strip()
lValue = lText[lEqPos+1:].strip()
if lValue.find('{') >= 0:
lValue = ExpandCmd(lValue, pFileName)
os.environ[lKey] = lValue
# ---- Change working directory
elif lCh == 'c': # set working directory
if lPos > 0 and lPos < len(lText):
lText = lText[lPos+1:].strip()
if lText.find('{') >= 0:
lText = ExpandCmd(lText, pFileName)
os.chdir(lText)
elif lCh == 'b':
lButtonColor = lText[lPos+1:].strip().lower()
if lSectionCount == 0:
lDefaultButton = lButtonColor
elif lCh == 'l':
lLabelColor = lText[lPos+1:].strip().lower()
if lSectionCount == 0:
lDefaultLabel = lLabelColor
elif pCallBack != None:
pCallBack(lText)
else: # ignore
pass
# ---- Comment
elif lText[0] == ";":
pass
# ---- Menu section is being skipped
elif lKeep == False:
pass
# ---- Command template
else:
lPos = lText.find(",")
if lPos > 0:
lPrompt = lText[:lPos]
lCommand = lText[lPos+1:]
# ---- Filter commands based on match with file name
# /*3.py/Run,c:\source\python31\pyrhonw.exe ...
# can be removed without affecting remainder of program
if lPrompt[0] == '/':
try:
(lMask, lPrompt) = lPrompt[1:].split('/')
if not fnmatch.fnmatch(lFileName, lMask.lower()):
continue
except Exception, e:
continue
# ---- Filter commands based on mismatch with file name
# !*3.py!Run,c:\source\python27\pythonw.exe ...
# can be removed without affecting remainder of program
elif lPrompt[0] == '!':
try:
(lMask, lPrompt) = lPrompt[1:].split('!')
if fnmatch.fnmatch(lFileName, lMask.lower()):
continue
except Exception, e:
continue
# ---- Create toolbar button
mCmds[lPrompt] = lCommand
try:
button = Tkinter.Button(pFrame, text=lPrompt, bg=lButtonColor)
except:
button = Tkinter.Button(pFrame, text=lPrompt)
#button.pack(side=TOP,fill=X)
button.pack(side=pSide, fill=pFill)
button.bind("<Button-1>", pSubmit) ### (1)
button.bind("<Button-3>", showit)
# ---- Section Header
else:
lSectionCount += 1
lLabelColor = lDefaultLabel
lButtonColor = lDefaultButton
if lText.find('{') >= 0:
                lText = ExpandCmd(lText, pFileName)
label = lText[1:-2].lower()
# ---- Filter section based on match with file name
# [/*3.py/Special]
# can be removed without affecting remainder of program
if label[0] == '/':
try:
(lMask, label) = label[1:].split('/')
if not fnmatch.fnmatch(lFileName, lMask.lower()):
lKeep = False
continue
except Exception, e:
lKeep = False
continue
# ---- Filter section based on mismatch with file name
# [!*3.py!Special]
# can be removed without affecting remainder of program
elif label[0] == '!':
try:
(lMask, label) = label[1:].split('!')
if fnmatch.fnmatch(lFileName, lMask.lower()):
lKeep = False
continue
except Exception, e:
lKeep = False
continue
# ---- label must begin with letter
elif (label[0] < 'a') or (label[0] > 'z'):
label = label[1:]
# ---- Select/Unselect section
if pLabelList == [] or label in pLabelList:
lKeep = True
label = Tkinter.Label(pFrame,text=label.capitalize(), bg=lLabelColor)
#label.pack(side=TOP, fill=X)
label.pack(side=pSide, fill=pFill)
else:
lKeep = False
label = Tkinter.Label(pFrame,text="", bg=lLabelColor)
#label.pack(side=TOP, fill=X)
label.pack(side=pSide, fill=pFill)
button = Tkinter.Button(pFrame,text='Exit', command=pFrame.quit, bg=lButtonColor)
#button.pack(side=TOP, fill=X)
button.pack(side=pSide, fill=pFill)
if __name__ == '__main__':
(mOptions, mArgs) = getopt.getopt(sys.argv[1:], 'b:d:e:hl:o')
# ---- Set run options
mDefaultLabel = 'light green'
mDefaultButton = 'light blue'
mOrient = VERTICAL
for (mKey, mValue) in mOptions:
if mKey == '-b':
mDefaultButton = mValue
elif mKey == '-d': # Set current directory
if mValue.find('}') >= 0:
mValue = Expand(mValue, mFileName)
os.chdir(mValue)
elif mKey == '-e': # Set environment variable
Do.setenviron(mValue, mFileName)
elif mKey == '-h':
print mHelpText
elif mKey == '-l':
mDefaultLabel = mValue
elif mKey == '-o':
mOrient = HORIZONTAL
tk = Tkinter.Tk()
if len(mArgs) > 0:
try:
mDefaultMenu = os.environ['MENUPATH']
if mVerbose:
print 'BB1Ex Default menu path', mDefaultMenu
except:
mDefaultMenu = ''
try:
mDefaultSection = os.environ['SECTION']
if mVerbose:
print 'BB1Ex Default section', mDefaultSection
except:
mDefaultSection = '' # <------- Default section - Change this
if mDefaultSection == '':
mDefaultSection = ''
# ---- First argument is absolute path to file to be processed
mFileName = mArgs[0]
mFileName = mFileName.replace("/","\\")
# ---- Second argument is absolute path to initialization menu
# Selected menu can depend on file path (ie. {o}\menus.ini) See do2.py
if len(mArgs) > 1:
mMenuArg = mArgs[1] # generate button bar
mMenuArg += ';' + mDefaultMenu
else:
mMenuArg = mDefaultMenu
# ---- Prevent QEditor from performing replacement, may be removed
mMenuArg = mMenuArg.replace("(","{")
mMenuArg = mMenuArg.replace(")","}")
# ---- Expand menu name and/or select from list
if mMenuArg.find('{') >= 0:
mMenuName = ExpandArg(mMenuArg, mFileName, mDefaultMenu)
elif mMenuArg.find(';') >= 0:
for mMenu in mMenuArg.split(';'):
if os.path.exists(mMenu):
mMenuName = mMenu
break
else:
mMenuName = mDefaultMenu
else:
mMenuName = mMenuArg
# ---- Remaining arguments list sections to include in menu
if len(mArgs) > 2:
mLabelList = []
for lItem in mArgs[2:]:
if lItem == '*':
break
if lItem == '-':
lItem = mDefaultSection
lItem = lItem.lower()
# ---- Prevent QEditor from performing replacement, may be removed
lItem = lItem.replace("(","{")
lItem = lItem.replace(")","}")
# ---- Selected section can be dependent on file path
if lItem.find('{') >= 0:
lItem = ExpandArg(lItem, mFileName, '')
mLabelList.append(lItem)
elif mDefaultSection != '':
mLabelList = [ mDefaultSection ]
else:
mLabelList = [ ]
# ---- Build buttonbar
mFrame = Tkinter.Frame(tk, relief=RIDGE, borderwidth=2)
mFrame.pack(fill=BOTH, expand=1)
if mOrient == VERTICAL:
Build(mFrame, mMenuName, mFileName, mLabelList, pDefaultLabel=mDefaultLabel, pDefaultButton=mDefaultButton, pSide=TOP, pFill=X)
else:
Build(mFrame, mMenuName, mFileName, mLabelList, pDefaultLabel=mDefaultLabel, pDefaultButton=mDefaultButton, pSide=LEFT, pFill=Y)
# ---- Enter event loop
tk.mainloop()
else:
tkMessageBox.showinfo('Syntax Error','File name required')
| 39.045788 | 140 | 0.506262 | 2,170 | 21,319 | 4.967742 | 0.22765 | 0.008905 | 0.005195 | 0.008349 | 0.240724 | 0.215492 | 0.189981 | 0.170501 | 0.137291 | 0.109091 | 0 | 0.013885 | 0.385149 | 21,319 | 545 | 141 | 39.117431 | 0.808514 | 0.138656 | 0 | 0.226463 | 0 | 0.030534 | 0.307966 | 0.031245 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.010178 | 0.030534 | null | null | 0.012723 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d42d200de04bf75125a5f7cac6051062bfeb40b | 1,414 | py | Python | getting_started.py | amit-timalsina/Generative-Python-Transformer | 9353cccb084f7c01be5355eedf69094f91d30433 | [
"MIT"
] | null | null | null | getting_started.py | amit-timalsina/Generative-Python-Transformer | 9353cccb084f7c01be5355eedf69094f91d30433 | [
"MIT"
] | null | null | null | getting_started.py | amit-timalsina/Generative-Python-Transformer | 9353cccb084f7c01be5355eedf69094f91d30433 | [
"MIT"
] | null | null | null | from curtsies.fmtfuncs import cyan, bold
from github import Github
import time
from datetime import datetime
import os
ACCESS_TOKEN = open('token.txt', 'r').read()
g = Github(ACCESS_TOKEN)
print(g.get_user())
end_time = time.time() - 86400 * 2
start_time = end_time - 86400
for i in range(1):
start_time_str = datetime.utcfromtimestamp(start_time).strftime('%Y-%m-%d')
end_time_str = datetime.utcfromtimestamp(end_time).strftime('%Y-%m-%d')
query = f"language:Python created:{start_time_str}..{end_time_str}"
print(query)
end_time -=86400
start_time -= 86400
result = g.search_repositories(query)
print(result.totalCount)
for repository in result:
print(f"{repository.clone_url}")
print(f"{repository._clone_url}")
print(f"{repository.tags_url}")
# print(f"{dir(repository)}")
        os.system(f"git clone {repository.clone_url} repos/{repository.owner.login}/{repository.name}")
        print(cyan(bold(f"current start time {start_time}")))
print("Finished, your new end time must be:", start_time)
| 30.73913 | 103 | 0.687412 | 178 | 1,414 | 5.303371 | 0.353933 | 0.059322 | 0.076271 | 0.03178 | 0.256356 | 0.224576 | 0.224576 | 0.224576 | 0.15678 | 0.15678 | 0 | 0.095357 | 0.1471 | 1,414 | 45 | 104 | 31.422222 | 0.687396 | 0.019095 | 0 | 0.297297 | 0 | 0 | 0.272202 | 0.177617 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.135135 | null | null | 0.216216 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d451b82a61be9c8cca43334325abb31306bcadf | 760 | py | Python | sessions/session 4/queue_publish_read.py | IceCrew-Source/BachelorDIM-Lectures-Algorithms-2020 | 95d2761883feebada25f62c20bdfe405a1353f61 | [
"MIT"
] | null | null | null | sessions/session 4/queue_publish_read.py | IceCrew-Source/BachelorDIM-Lectures-Algorithms-2020 | 95d2761883feebada25f62c20bdfe405a1353f61 | [
"MIT"
] | null | null | null | sessions/session 4/queue_publish_read.py | IceCrew-Source/BachelorDIM-Lectures-Algorithms-2020 | 95d2761883feebada25f62c20bdfe405a1353f61 | [
"MIT"
] | null | null | null | import argparse
import os
import pika
from decouple import config
import importlib
simple_queue_read = importlib.import_module('simple_queue_read')
simple_queue_publish = importlib.import_module('simple_queue_publish')
URL = config('URL')
url = os.environ.get('CLOUDAMQP_URL', URL)
params = pika.URLParameters(url)
params.socket_timeout = 5
connection = pika.BlockingConnection(params)
channel = connection.channel() # start a channel
channel.queue_declare(queue='hello') # Declare a queue
parser = argparse.ArgumentParser(description='How to')
parser.add_argument('-read', action='store_true')
flags = parser.parse_args()
if flags.read:
simple_queue_read.read_queue(channel)
else:
simple_queue_publish.publish_queue(channel)
connection.close()
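# Illustrative sketches of what the two imported helper modules roughly do
# (assumption: simple_queue_read / simple_queue_publish are not shown here).
# Only standard pika calls are used; the queue name matches the one declared above
# and the message body is a placeholder. These functions are never called here.
def _publish_queue_sketch(channel, message='Hello World!'):
    # Publish one message to the 'hello' queue on the default exchange.
    channel.basic_publish(exchange='', routing_key='hello', body=message)

def _read_queue_sketch(channel):
    # Fetch a single message from the 'hello' queue, if one is available.
    method_frame, header_frame, body = channel.basic_get(queue='hello')
    if method_frame:
        print(body)
        channel.basic_ack(method_frame.delivery_tag)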
| 27.142857 | 70 | 0.794737 | 101 | 760 | 5.762376 | 0.445545 | 0.113402 | 0.07732 | 0.092784 | 0.109966 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001462 | 0.1 | 760 | 27 | 71 | 28.148148 | 0.849415 | 0.040789 | 0 | 0 | 0 | 0 | 0.108815 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.318182 | 0 | 0.318182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
6d4b0371bbc405789746fc2f894683115042f998 | 1,186 | py | Python | translate.py | KTH1234/deep_summ | f242733f55e0b42c4fe3ac469142f9586102599d | [
"MIT"
] | null | null | null | translate.py | KTH1234/deep_summ | f242733f55e0b42c4fe3ac469142f9586102599d | [
"MIT"
] | null | null | null | translate.py | KTH1234/deep_summ | f242733f55e0b42c4fe3ac469142f9586102599d | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from __future__ import division, unicode_literals
import argparse
from onmt.translate.Translator import make_translator
import onmt.io
import onmt.translate
import onmt
import onmt.ModelConstructor
import onmt.modules
import onmt.opts
import timeit
def main(opt):
translator = make_translator(opt, report_score=True)
start = timeit.default_timer()
_, attns_info, oov_info, copy_info, context_attns_info = translator.translate(opt.src_dir, opt.src, opt.tgt,
opt.batch_size, opt.attn_debug)
end = timeit.default_timer()
print("Translation takes {}s".format(end-start))
# currently attns_info,oov_info only contain first index data of batch
if len(context_attns_info) == 0:
return attns_info, oov_info, copy_info
else:
return attns_info, oov_info, copy_info, context_attns_info
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description='translate.py',
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
onmt.opts.add_md_help_argument(parser)
onmt.opts.translate_opts(parser)
opt = parser.parse_args()
main(opt)
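# Illustrative invocation (assumes the standard OpenNMT-py translate options
# registered by onmt.opts.translate_opts; the file names are placeholders):
#   python translate.py -model model.pt -src test_src.txt -output pred.txt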
| 27.581395 | 112 | 0.724283 | 154 | 1,186 | 5.285714 | 0.474026 | 0.077396 | 0.058968 | 0.078624 | 0.142506 | 0.142506 | 0.142506 | 0.09828 | 0.09828 | 0 | 0 | 0.001044 | 0.192243 | 1,186 | 42 | 113 | 28.238095 | 0.848643 | 0.075042 | 0 | 0 | 0 | 0 | 0.037443 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0.344828 | 0 | 0.448276 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
6d4e696d020c5cc19c2a78065f0bfe86189cfbb6 | 476 | py | Python | src/streaming-programs/01-wordsplit_map.py | gofore/aws-emr | 31c70abc92aa84cfa93841be59f546809310fe6c | [
"MIT"
] | 2 | 2017-02-12T07:14:37.000Z | 2019-08-12T19:13:40.000Z | src/streaming-programs/01-wordsplit_map.py | gofore/aws-emr | 31c70abc92aa84cfa93841be59f546809310fe6c | [
"MIT"
] | 1 | 2021-05-07T05:47:27.000Z | 2021-05-07T05:47:27.000Z | src/streaming-programs/01-wordsplit_map.py | gofore/aws-emr | 31c70abc92aa84cfa93841be59f546809310fe6c | [
"MIT"
] | 7 | 2015-02-23T18:57:52.000Z | 2019-11-06T09:31:24.000Z | #!/usr/bin/python
import sys
import re
# Test application to check whether EMR pipeline and reading the data works
# This code is from the EMR example:
# https://s3.amazonaws.com/elasticmapreduce/samples/wordcount/wordSplitter.py
def main(argv):
pattern = re.compile("[a-zA-Z][a-zA-Z0-9]*")
for line in sys.stdin:
for word in pattern.findall(line):
print "LongValueSum:" + word.lower() + "\t" + "1"
if __name__ == "__main__":
main(sys.argv)
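# The "LongValueSum:" prefix is understood by Hadoop Streaming's built-in
# aggregate reducer, so this mapper is normally paired with "-reducer aggregate".
# Illustrative streaming step (bucket names are placeholders):
#   hadoop-streaming -mapper 01-wordsplit_map.py -reducer aggregate \
#       -input s3://my-bucket/input -output s3://my-bucket/output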
| 28 | 77 | 0.67437 | 71 | 476 | 4.408451 | 0.774648 | 0.019169 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010336 | 0.186975 | 476 | 16 | 78 | 29.75 | 0.79845 | 0.422269 | 0 | 0 | 0 | 0 | 0.162362 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.222222 | null | null | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d5163ff7afac2854217dc08a8bc7b2702ea0ae5 | 1,550 | py | Python | query_research/bound_variables.py | sjuenger/WikiMETA | 13ed293b4bda8ff0fc10b532907ca35c24a12616 | [
"MIT"
] | null | null | null | query_research/bound_variables.py | sjuenger/WikiMETA | 13ed293b4bda8ff0fc10b532907ca35c24a12616 | [
"MIT"
] | null | null | null | query_research/bound_variables.py | sjuenger/WikiMETA | 13ed293b4bda8ff0fc10b532907ca35c24a12616 | [
"MIT"
] | null | null | null | import glob
import json
# to hand over the bound variables of a query
# e.g. ["var4", "<http://www.w3.org/ns/prov#wasDerivedFrom>"]
# for :
#SELECT ?var1 ?var2Label ?var3
#WHERE {
# BIND ( <http://www.w3.org/ns/prov#wasDerivedFrom> AS ?var4 ).
# ?var5 ?var6 ?var4 .
# ?var5 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://wikiba.se/ontology#Property> .
# ?var5 <http://www.wikidata.org/prop/direct/P1659> ?var1 .
# ?var1 <http://www.w3.org/2000/01/rdf-schema#label> ?var2Label .
# ?var1 <http://schema.org/description> ?var3 .
# FILTER ( ( ( LANG ( ?var2Label ) = "en" ) )
#) .
# FILTER ( ( ( LANG ( ?var3 ) = "en" ) )
#) .
#}
#
def find_bound_variables(json_object):
where = json_object["where"]
# find BIND Variables
bound_variables = []
for where_part in where:
if where_part["type"] == "bind":
#print(where_part)
if "termType" in where_part["expression"]:
if where_part["expression"]["termType"] == "NamedNode":
if where_part["variable"]["termType"] == "Variable":
bound_variables.append(
(where_part["variable"]["value"], where_part["expression"]["value"]))
#print("Bound Variables: ")
#print(bound_variables)
# TODO: Declare a reference in bound variables as an own scenario
    # TODO: Change this bound variables tuple array to a dictionary to get rid of the .__str__()
return bound_variables
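# Minimal illustrative check (the input below is an assumption that mirrors the
# sparqljs-style JSON this function expects); not part of the query-research pipeline.
if __name__ == "__main__":
    example_query = {
        "where": [
            {
                "type": "bind",
                "expression": {"termType": "NamedNode",
                               "value": "http://www.w3.org/ns/prov#wasDerivedFrom"},
                "variable": {"termType": "Variable", "value": "var4"},
            }
        ]
    }
    # Expected output: [('var4', 'http://www.w3.org/ns/prov#wasDerivedFrom')]
    print(find_bound_variables(example_query))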
| 36.046512 | 107 | 0.58129 | 186 | 1,550 | 4.736559 | 0.44086 | 0.143019 | 0.040863 | 0.054484 | 0.072645 | 0.072645 | 0.072645 | 0 | 0 | 0 | 0 | 0.034061 | 0.26129 | 1,550 | 42 | 108 | 36.904762 | 0.735371 | 0.547097 | 0 | 0 | 0 | 0 | 0.163447 | 0 | 0 | 0 | 0 | 0.02381 | 0 | 1 | 0.076923 | false | 0 | 0.153846 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d520b7ec88266cce313daeeffbe58a78a8cd7b3 | 337 | py | Python | metaci/build/migrations/0030_build_priority.py | sfdc-qbranch/MetaCI | 78ac0d2bccd2db381998321ebd71029dd5d9ab39 | [
"BSD-3-Clause"
] | 48 | 2018-10-24T14:52:06.000Z | 2022-03-25T21:14:50.000Z | metaci/build/migrations/0030_build_priority.py | sfdc-qbranch/MetaCI | 78ac0d2bccd2db381998321ebd71029dd5d9ab39 | [
"BSD-3-Clause"
] | 2,034 | 2018-10-31T20:59:16.000Z | 2022-03-22T21:38:03.000Z | metaci/build/migrations/0030_build_priority.py | sfdc-qbranch/MetaCI | 78ac0d2bccd2db381998321ebd71029dd5d9ab39 | [
"BSD-3-Clause"
] | 27 | 2018-12-24T18:16:23.000Z | 2021-12-15T17:57:27.000Z | # Generated by Django 2.2.5 on 2019-10-01 15:52
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("build", "0029_build_org_note")]
operations = [
migrations.AddField(
model_name="build", name="priority", field=models.IntegerField(default=0)
)
]
| 22.466667 | 85 | 0.661721 | 41 | 337 | 5.341463 | 0.780488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075758 | 0.216617 | 337 | 14 | 86 | 24.071429 | 0.753788 | 0.133531 | 0 | 0 | 1 | 0 | 0.127586 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d537aafddd30273e1241cbf12d001f8b117f913 | 3,392 | py | Python | work/MTRIX/submit_to_queue3.py | youdar/work | d5f32015e7be1d58ede255eeadbd1d12acb3f270 | [
"MIT"
] | null | null | null | work/MTRIX/submit_to_queue3.py | youdar/work | d5f32015e7be1d58ede255eeadbd1d12acb3f270 | [
"MIT"
] | null | null | null | work/MTRIX/submit_to_queue3.py | youdar/work | d5f32015e7be1d58ede255eeadbd1d12acb3f270 | [
"MIT"
] | null | null | null | from __future__ import division
from libtbx.command_line import easy_qsub
from misc_scripts import helpers
from misc_scripts.r_factor_calc import *
from iotbx import pdb
import cPickle as pickle
import os
'''
Submit to queue all files with good MTRIX records that also have structure factor files
'''
def run():
# set working environment
#phenix_source = "/net/chevy/raid1/youval/Work_chevy/phenix_build/setpaths.csh"
  where_to_run_dir = "/net/cci-filer2/raid1/home/youval/Work/work/junk/queue_job_3"
phenix_source = r"c:\Phenix\Dev\phenix_build\setpaths.csh"
#os.chdir(where_to_run_dir)
# collect files to work on
files = get_file_names()
#
commands = []
# Set command path
com_path = '/net/chevy/raid1/youval/Work_chevy/phenix_sources/misc_scripts/r_factor_calc.py'
#
for [pdb_file,sf_file,file_name] in files:
#print file_name
com_options = '--strOut=True --eps=0.01 --fromRCSB=False'.format(file_name)
outString = '{0} {1} {2} {3} &> log_{4}'.format(com_path,pdb_file,sf_file,com_options,file_name)
commands.append(
"python {}".format(outString))
easy_qsub.run(
phenix_source = phenix_source,
where = where_to_run_dir,
qsub_cmd = 'qsub -q all.q@morse',
commands = commands,
size_of_chunks= 300) # because jobs are quick
def get_file_names():
'''
returns the list of files that need to be processed
'''
#data_dir = '/net/cci-filer2/raid1/home/youval/Work/work/MTRIX/Data'
data_dir = r'c:\Phenix\Dev\Work\work\MTRIX\Data'
results = []
# all 157 files with good MTRIX also have structure factors files
# good MTRIX is tested with eps = 0.01
files_with_good_MTRIX = set(pickle.load(open(os.path.join(data_dir,'files_with_good_MTRIX'),'r')))
# file location dictionary
good_MTRIX_pdb_files = pickle.load(open(os.path.join(data_dir,'dict_good_MTRIX_pdb_files'),'r'))
structure_factors_files = pickle.load(open(os.path.join(data_dir,'dict_structure_factors_files'),'r'))
print 'File names and location dictionaries are loaded...'
# Just to explore data
mtrix_not_included_in_pdb = set(open(os.path.join(data_dir,'mtrix_not_included_in_pdb.txt')).read().splitlines())
# look at files that were processed
files_with_problems = pickle.load(open(os.path.join(data_dir,'files_with_problems'),"r"))
files_with_problems = {x[0] for x in files_with_problems}
Collect_tested_files = pickle.load(open(os.path.join(data_dir,'Collect_tested_files'),"r"))
Collect_tested_files = {x[0] for x in Collect_tested_files}
file_set = (files_with_good_MTRIX - Collect_tested_files)
print 'Number of files from the 157 that have issues: {}'.format(len(mtrix_not_included_in_pdb - Collect_tested_files))
print 'number of files_with_good_MTRIX: {}'.format(len(files_with_good_MTRIX))
print 'number of file records in Collect_tested_files: {}'.format(len(Collect_tested_files))
print 'number of files to run: {}'.format(len(file_set))
#file_set = set(['3dpr', '1o4z', '1bxb', '2bht', '4a5p', '4a5m', '2x9m',
#'4bl4', '2x9i', '4gme', '3lob', '2vvq', '1g31', '2wy3',
#'4iq1', '2wy6', '2w0c', '2uwa'])
for file_name in file_set:
pdb_file = good_MTRIX_pdb_files[file_name]
sf_file = structure_factors_files[file_name]
results.append([pdb_file,sf_file,file_name])
return results
if (__name__ == "__main__"):
run()
| 40.86747 | 121 | 0.721403 | 524 | 3,392 | 4.383588 | 0.305344 | 0.0431 | 0.06269 | 0.054854 | 0.276012 | 0.222464 | 0.195037 | 0.118415 | 0.118415 | 0.069656 | 0 | 0.020479 | 0.150649 | 3,392 | 82 | 122 | 41.365854 | 0.776814 | 0.204599 | 0 | 0 | 0 | 0 | 0.242761 | 0.118207 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.148936 | null | null | 0.106383 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d540b98a72ab68cdb241761ec9a3d49ae2599e1 | 426 | py | Python | czsc/__init__.py | vercity/czsc | 7a372baa3a550b18ff319008ac3fcab0f3faa684 | [
"MIT"
] | 1 | 2022-02-22T06:31:40.000Z | 2022-02-22T06:31:40.000Z | czsc/__init__.py | vercity/czsc | 7a372baa3a550b18ff319008ac3fcab0f3faa684 | [
"MIT"
] | 1 | 2021-09-25T02:32:39.000Z | 2021-09-25T02:32:39.000Z | czsc/__init__.py | vercity/czsc | 7a372baa3a550b18ff319008ac3fcab0f3faa684 | [
"MIT"
] | null | null | null | # coding: utf-8
from .analyze import CZSC
from .traders.advanced import CzscAdvancedTrader
from .utils.ta import SMA, EMA, MACD, KDJ
from .objects import Freq, Operate, Direction, Signal, Factor, Event, RawBar, NewBar
from . import aphorism
__version__ = "0.8.17"
__author__ = "zengbin93"
__email__ = "zeng_bin8888@163.com"
__date__ = "20220218"
print(f"欢迎使用CZSC!当前版本标识为 {__version__}@{__date__}\n")
aphorism.print_one()
| 23.666667 | 84 | 0.758216 | 57 | 426 | 5.210526 | 0.77193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058981 | 0.124413 | 426 | 17 | 85 | 25.058824 | 0.737265 | 0.030516 | 0 | 0 | 0 | 0 | 0.209756 | 0.063415 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.454545 | 0 | 0.454545 | 0.181818 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
6d546d1dc2cae3ea649edcb8b4f5382cc4bc67ee | 334 | py | Python | restaurant/migrations/0004_remove_recipe_ingreiends.py | turgayh/Recipe-Share | 858eb3e0e21c11b62249fbc9490cd7bb1f244b9e | [
"MIT"
] | 1 | 2021-01-20T20:52:47.000Z | 2021-01-20T20:52:47.000Z | restaurant/migrations/0004_remove_recipe_ingreiends.py | turgayh/Recipe-Share | 858eb3e0e21c11b62249fbc9490cd7bb1f244b9e | [
"MIT"
] | null | null | null | restaurant/migrations/0004_remove_recipe_ingreiends.py | turgayh/Recipe-Share | 858eb3e0e21c11b62249fbc9490cd7bb1f244b9e | [
"MIT"
] | null | null | null | # Generated by Django 3.0.2 on 2020-01-25 19:30
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('restaurant', '0003_recipe_ingreiends'),
]
operations = [
migrations.RemoveField(
model_name='recipe',
name='ingreiends',
),
]
| 18.555556 | 49 | 0.598802 | 34 | 334 | 5.794118 | 0.794118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080169 | 0.290419 | 334 | 17 | 50 | 19.647059 | 0.751055 | 0.134731 | 0 | 0 | 1 | 0 | 0.167247 | 0.076655 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d5ea38576f25db53776c642414c54041033cdde | 7,525 | py | Python | Project/src/uff/ic/mell/sentimentembedding/modelos/modelo_transformer.py | MeLL-UFF/tuning_sentiment | cb8683be115f694e02c12f8e1a90529a08c7d1d6 | [
"MIT"
] | 2 | 2021-09-23T18:06:10.000Z | 2021-09-23T18:15:11.000Z | Project/src/uff/ic/mell/sentimentembedding/modelos/modelo_transformer.py | MeLL-UFF/tuning_sentiment | cb8683be115f694e02c12f8e1a90529a08c7d1d6 | [
"MIT"
] | null | null | null | Project/src/uff/ic/mell/sentimentembedding/modelos/modelo_transformer.py | MeLL-UFF/tuning_sentiment | cb8683be115f694e02c12f8e1a90529a08c7d1d6 | [
"MIT"
] | null | null | null | from uff.ic.mell.sentimentembedding.utils.data_converstion_utils import convert_tensor2array
from uff.ic.mell.sentimentembedding.modelos.modelo import Modelo
import pandas as pd
import numpy as np
import torch
from enum import Enum
from tokenizers import ByteLevelBPETokenizer
class ModeloTransformer(Modelo):
    # mean of the tensors of the concatenation of the last 4 layers - CONTEXT_CONCAT,
    # mean of the tensors of the last layer - CONTEXT_LAST
    # embedding of the [CLS] token - CONTEXT_CLS
    # mean of the static word embeddings - STATIC_AVG
METHOD = Enum("METHOD", "CONTEXT_CONCAT CONTEXT_LAST CONTEXT_CLS STATIC_AVG")
def __init__(self, name:str, config, tokenizer, originalModel, embedMethod:METHOD):
"""
        Constructor.
        name: any string that identifies the model
        config: the Transformers model config, e.g. BertConfig()
        tokenizer: the model's tokenizer, e.g. BertTokenizer
        originalModel: the model itself, e.g. BertModel
        embedMethod: method used to generate the sentence embeddings; must be one of the
        METHOD enum options
"""
super().__init__(name)
self.config = config
self.tokenizer = tokenizer
self.originalModel = originalModel
self.embedMethod = embedMethod
def embTexts(self, dataSeries:pd.Series, **kwagars) -> pd.DataFrame:
        '''Generate embeddings for the TWEET DATASET using the mean of the token tensors
        Parameters:
            dataSeries: dataframe['tweet']
        return: dataframe with the mean of the tensors of each token that makes up the tweet
'''
retorno = []
if (self.embedMethod != ModeloTransformer.METHOD.STATIC_AVG):
            # TODO: check whether this size really needs to be set explicitly;
            # if it stays like this and a model produces embeddings of another size it will break
if (self.embedMethod == ModeloTransformer.METHOD.CONTEXT_CONCAT):
                # build the array that will hold the embeddings of the dataframe tweets
embeddings = np.ndarray((len(dataSeries),3072))
else:
                # build the array that will hold the embeddings of the dataframe tweets
embeddings = np.ndarray((len(dataSeries),768))
for i, text in enumerate(dataSeries):
                # generate the embedding of the text
tweet = self.get_tweet_embed(text, self.embedMethod)
                # convert to an array and store it in the array created above
embeddings[i] = convert_tensor2array(tweet.to(device="cpu"))
return pd.DataFrame(embeddings)
else:
for i, text in enumerate(dataSeries):
retorno.append(self.transform_sentence_to_avgembword(text))
return pd.DataFrame(retorno)
def get_tweet_embed(self, text, method:METHOD, add=True):
        '''Generate the embedding of a TWEET
        Parameters:
            text: tweet to be tokenized
            method: one of the METHOD enum options
            add: Boolean controlling whether special tokens such as [CLS] are added
        return: mean of the tensors of each token that makes up the tweet
'''
self.originalModel.cuda()
        # tokenize the text, turn it into a tensor and send it to the GPU
tokens_tensor = torch.tensor([self.tokenizer.encode(text, add_special_tokens=add)]).cuda()
if (method != ModeloTransformer.METHOD.STATIC_AVG):
with torch.no_grad():
out = self.originalModel(tokens_tensor)
hidden_states = out[2] # selecionando apenas os tensores
if (method == ModeloTransformer.METHOD.CONTEXT_CONCAT):
# get last four layers
last_four_layers = [hidden_states[i] for i in (-1, -2, -3, -4)]
# cast layers to a tuple and concatenate over the last dimension
cat_hidden_states = torch.cat(tuple(last_four_layers), dim= -1)
# take the mean of the concatenated vector over the token dimension
cat_sentence_embedding = torch.mean(cat_hidden_states, dim=1)
return cat_sentence_embedding # gerando o embedding da sentença pela média dos embeddings dos tokens concatenados dos 4 últimos layers
else:
if(method == ModeloTransformer.METHOD.CONTEXT_LAST):
return torch.mean(hidden_states[-1], dim=1) # gerando o embedding da sentença pela média dos embeddings dos tokens
else:
if(method == ModeloTransformer.METHOD.CONTEXT_CLS):
return hidden_states[-1][:,0,:]
def transform_sentence_to_avgembword(self, text:str):
"""
        Generate sentence embeddings from the model's static (input) embeddings
        using the mean.
        Parameters:
            texts: sentence to be embedded using the mean of the
            words that compose it
        Return:
            a list with the sentence embeddings computed as the mean of the
            embeddings of the tokens that compose them
"""
self.originalModel.cuda()
        # get the ids of each word in the text
input_ids = self.tokenizer.encode(text, add_special_tokens=False)
#print(input_ids)
        # get the embedding of each word in the text
#print("#####################")
ids_tensor = torch.tensor([input_ids]).cuda() #gera um tensor de ids das palavras das sentencas
#print(ids_tensor.shape)
embeddings_palavras = self.originalModel.get_input_embeddings()(ids_tensor) # me retorna um tensor de dim 1 x qtdIds x 768
#print("#####################")
#print(embeddings_palavras[0])
        # take the mean and convert from tensor to array
#t_stack = torch.stack(embeddings_palavras[0])
#print("#####################")
#print(t_stack)
mean = torch.mean(embeddings_palavras[0], dim=0) # tiro a primeira dimensao do tensor que esta vazia para fazer a media por coluna
#print("#####################")
#print (mean)
#print("#####################")
mean_arr = convert_tensor2array(torch.unsqueeze(mean, 0)) # recoloco a primeira dimensao para o convert funcionar
#print (mean_arr)
return mean_arr
def tokenize_sentences(self, sentences):
input_ids = [] # For every sentence...
for sent in sentences:
encoded_sent = self.tokenizer.encode(sent,add_special_tokens=True)
# Add the encoded sentence to the list.
input_ids.append(encoded_sent) # Print sentence 0, now as a list of IDs.
print('Original: ', sentences[0])
print('Token IDs:', input_ids[0])
return input_ids
def train_tokenizer(self,file_path,outDir):
# Initialize a tokenizer
tokenizer = ByteLevelBPETokenizer()
# Customize training
tokenizer.train(files=file_path, vocab_size=52_000, min_frequency=2, special_tokens=[
"<s>",
"<pad>",
"</s>",
"<unk>",
"<mask>",
])
self.tokenizer=tokenizer
tokenizer.save(outDir)
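# Illustrative usage sketch (assumptions: the HuggingFace `transformers` package is
# installed, a CUDA device is available because this class moves the model and the
# tensors to the GPU, and "bert-base-uncased" plus the example tweets are placeholders).
if __name__ == "__main__":
    from transformers import BertConfig, BertTokenizer, BertModel

    # output_hidden_states=True so the model returns all layer activations,
    # which get_tweet_embed reads from out[2].
    bert_config = BertConfig.from_pretrained("bert-base-uncased", output_hidden_states=True)
    bert_tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    bert_model = BertModel.from_pretrained("bert-base-uncased", config=bert_config)

    modelo = ModeloTransformer("bert-base", bert_config, bert_tokenizer, bert_model,
                               ModeloTransformer.METHOD.CONTEXT_LAST)
    tweets = pd.Series(["example tweet one", "example tweet two"])
    embeddings = modelo.embTexts(tweets)  # one 768-dim row per tweet for CONTEXT_LAST
    print(embeddings.shape)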
| 47.626582 | 154 | 0.612226 | 851 | 7,525 | 5.304348 | 0.310223 | 0.012406 | 0.017723 | 0.02747 | 0.179442 | 0.130704 | 0.099247 | 0.081967 | 0.081967 | 0.081967 | 0 | 0.007823 | 0.303522 | 7,525 | 157 | 155 | 47.929936 | 0.853463 | 0.385914 | 0 | 0.133333 | 0 | 0 | 0.024679 | 0 | 0 | 0 | 0 | 0.025478 | 0 | 1 | 0.08 | false | 0 | 0.093333 | 0 | 0.293333 | 0.026667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d61115da86ee25dd4b72746264e5411b0609670 | 7,889 | py | Python | sdk/python/pulumi_ovh/get_vps.py | legitbee/pulumi-ovh | 23205a76a958edfbfcd96242d9663b3d7c81ccd9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_ovh/get_vps.py | legitbee/pulumi-ovh | 23205a76a958edfbfcd96242d9663b3d7c81ccd9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_ovh/get_vps.py | legitbee/pulumi-ovh | 23205a76a958edfbfcd96242d9663b3d7c81ccd9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = [
'GetVpsResult',
'AwaitableGetVpsResult',
'get_vps',
'get_vps_output',
]
@pulumi.output_type
class GetVpsResult:
"""
A collection of values returned by getVps.
"""
def __init__(__self__, cluster=None, datacenter=None, displayname=None, id=None, ips=None, keymap=None, memory=None, model=None, name=None, netbootmode=None, offertype=None, service_name=None, slamonitoring=None, state=None, type=None, vcore=None, zone=None):
if cluster and not isinstance(cluster, str):
raise TypeError("Expected argument 'cluster' to be a str")
pulumi.set(__self__, "cluster", cluster)
if datacenter and not isinstance(datacenter, dict):
raise TypeError("Expected argument 'datacenter' to be a dict")
pulumi.set(__self__, "datacenter", datacenter)
if displayname and not isinstance(displayname, str):
raise TypeError("Expected argument 'displayname' to be a str")
pulumi.set(__self__, "displayname", displayname)
if id and not isinstance(id, str):
raise TypeError("Expected argument 'id' to be a str")
pulumi.set(__self__, "id", id)
if ips and not isinstance(ips, list):
raise TypeError("Expected argument 'ips' to be a list")
pulumi.set(__self__, "ips", ips)
if keymap and not isinstance(keymap, str):
raise TypeError("Expected argument 'keymap' to be a str")
pulumi.set(__self__, "keymap", keymap)
if memory and not isinstance(memory, int):
raise TypeError("Expected argument 'memory' to be a int")
pulumi.set(__self__, "memory", memory)
if model and not isinstance(model, dict):
raise TypeError("Expected argument 'model' to be a dict")
pulumi.set(__self__, "model", model)
if name and not isinstance(name, str):
raise TypeError("Expected argument 'name' to be a str")
pulumi.set(__self__, "name", name)
if netbootmode and not isinstance(netbootmode, str):
raise TypeError("Expected argument 'netbootmode' to be a str")
pulumi.set(__self__, "netbootmode", netbootmode)
if offertype and not isinstance(offertype, str):
raise TypeError("Expected argument 'offertype' to be a str")
pulumi.set(__self__, "offertype", offertype)
if service_name and not isinstance(service_name, str):
raise TypeError("Expected argument 'service_name' to be a str")
pulumi.set(__self__, "service_name", service_name)
if slamonitoring and not isinstance(slamonitoring, bool):
raise TypeError("Expected argument 'slamonitoring' to be a bool")
pulumi.set(__self__, "slamonitoring", slamonitoring)
if state and not isinstance(state, str):
raise TypeError("Expected argument 'state' to be a str")
pulumi.set(__self__, "state", state)
if type and not isinstance(type, str):
raise TypeError("Expected argument 'type' to be a str")
pulumi.set(__self__, "type", type)
if vcore and not isinstance(vcore, int):
raise TypeError("Expected argument 'vcore' to be a int")
pulumi.set(__self__, "vcore", vcore)
if zone and not isinstance(zone, str):
raise TypeError("Expected argument 'zone' to be a str")
pulumi.set(__self__, "zone", zone)
@property
@pulumi.getter
def cluster(self) -> str:
return pulumi.get(self, "cluster")
@property
@pulumi.getter
def datacenter(self) -> Mapping[str, str]:
return pulumi.get(self, "datacenter")
@property
@pulumi.getter
def displayname(self) -> str:
return pulumi.get(self, "displayname")
@property
@pulumi.getter
def id(self) -> str:
"""
The provider-assigned unique ID for this managed resource.
"""
return pulumi.get(self, "id")
@property
@pulumi.getter
def ips(self) -> Sequence[str]:
return pulumi.get(self, "ips")
@property
@pulumi.getter
def keymap(self) -> str:
return pulumi.get(self, "keymap")
@property
@pulumi.getter
def memory(self) -> int:
return pulumi.get(self, "memory")
@property
@pulumi.getter
def model(self) -> Mapping[str, str]:
return pulumi.get(self, "model")
@property
@pulumi.getter
def name(self) -> str:
return pulumi.get(self, "name")
@property
@pulumi.getter
def netbootmode(self) -> str:
return pulumi.get(self, "netbootmode")
@property
@pulumi.getter
def offertype(self) -> str:
return pulumi.get(self, "offertype")
@property
@pulumi.getter(name="serviceName")
def service_name(self) -> str:
return pulumi.get(self, "service_name")
@property
@pulumi.getter
def slamonitoring(self) -> bool:
return pulumi.get(self, "slamonitoring")
@property
@pulumi.getter
def state(self) -> str:
return pulumi.get(self, "state")
@property
@pulumi.getter
def type(self) -> str:
return pulumi.get(self, "type")
@property
@pulumi.getter
def vcore(self) -> int:
return pulumi.get(self, "vcore")
@property
@pulumi.getter
def zone(self) -> str:
return pulumi.get(self, "zone")
class AwaitableGetVpsResult(GetVpsResult):
# pylint: disable=using-constant-test
def __await__(self):
if False:
yield self
return GetVpsResult(
cluster=self.cluster,
datacenter=self.datacenter,
displayname=self.displayname,
id=self.id,
ips=self.ips,
keymap=self.keymap,
memory=self.memory,
model=self.model,
name=self.name,
netbootmode=self.netbootmode,
offertype=self.offertype,
service_name=self.service_name,
slamonitoring=self.slamonitoring,
state=self.state,
type=self.type,
vcore=self.vcore,
zone=self.zone)
def get_vps(service_name: Optional[str] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> AwaitableGetVpsResult:
"""
Use this data source to access information about an existing resource.
"""
__args__ = dict()
__args__['serviceName'] = service_name
if opts is None:
opts = pulumi.InvokeOptions()
if opts.version is None:
opts.version = _utilities.get_version()
__ret__ = pulumi.runtime.invoke('ovh:index/getVps:getVps', __args__, opts=opts, typ=GetVpsResult).value
return AwaitableGetVpsResult(
cluster=__ret__.cluster,
datacenter=__ret__.datacenter,
displayname=__ret__.displayname,
id=__ret__.id,
ips=__ret__.ips,
keymap=__ret__.keymap,
memory=__ret__.memory,
model=__ret__.model,
name=__ret__.name,
netbootmode=__ret__.netbootmode,
offertype=__ret__.offertype,
service_name=__ret__.service_name,
slamonitoring=__ret__.slamonitoring,
state=__ret__.state,
type=__ret__.type,
vcore=__ret__.vcore,
zone=__ret__.zone)
@_utilities.lift_output_func(get_vps)
def get_vps_output(service_name: Optional[pulumi.Input[str]] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> pulumi.Output[GetVpsResult]:
"""
Use this data source to access information about an existing resource.
"""
...
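# Example usage (illustrative; the service name below is a placeholder):
#
#     import pulumi
#     import pulumi_ovh
#
#     my_vps = pulumi_ovh.get_vps(service_name="vpsXXXXXX.ovh.net")
#     pulumi.export("vps_ips", my_vps.ips)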
| 34.3 | 263 | 0.63316 | 914 | 7,889 | 5.249453 | 0.140044 | 0.021259 | 0.05669 | 0.106294 | 0.31263 | 0.206961 | 0.137349 | 0.050021 | 0.024594 | 0.024594 | 0 | 0.000171 | 0.259222 | 7,889 | 229 | 264 | 34.449782 | 0.820842 | 0.058056 | 0 | 0.177419 | 1 | 0 | 0.135616 | 0.005979 | 0 | 0 | 0 | 0 | 0 | 1 | 0.112903 | false | 0 | 0.026882 | 0.086022 | 0.252688 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6d61124fdff6342f0f0ec0bd44dc8339e788d4a2 | 901 | py | Python | config_projector.py | edmcdonagh/drumminhands_projector | bcb32ccfd9b18889ea87b01e6ad32a6298e89fdb | [
"MIT"
] | 6 | 2016-11-19T21:15:57.000Z | 2020-03-07T13:26:38.000Z | config_projector.py | edmcdonagh/drumminhands_projector | bcb32ccfd9b18889ea87b01e6ad32a6298e89fdb | [
"MIT"
] | null | null | null | config_projector.py | edmcdonagh/drumminhands_projector | bcb32ccfd9b18889ea87b01e6ad32a6298e89fdb | [
"MIT"
] | 2 | 2017-11-30T07:59:21.000Z | 2019-02-09T22:29:43.000Z | #!/usr/bin/env python
# configure these settings to change projector behavior
server_mount_path = '//192.168.42.11/PiShare' # shared folder on other pi
user_name = 'pi' # shared drive login user name
user_password = 'raspberry' # shared drive login password
client_mount_path = '/mnt/pishare' # where to find the shared folder on this pi
pics_folder = '/mnt/pishare/pics' # where to find the pics to display
waittime = 2 # default time to wait between images (in seconds)
use_prime = True # Set to true to show the prime slide, false if otherwise
prime_slide = '/home/pi/photobooth/projector.png' # image to show regularly in slideshow
prime_freq = 16 # how many pics to show before showing the prime slide again
monitor_w = 800 # width in pixels of display (monitor or projector)
monitor_h = 600 # height in pixels of display (monitor or projector)
title = "SlideShow" # caption of the window...
| 56.3125 | 88 | 0.758047 | 144 | 901 | 4.659722 | 0.569444 | 0.026826 | 0.041729 | 0.041729 | 0.104322 | 0.104322 | 0.104322 | 0 | 0 | 0 | 0 | 0.025232 | 0.164262 | 901 | 15 | 89 | 60.066667 | 0.86587 | 0.622642 | 0 | 0 | 0 | 0 | 0.322086 | 0.171779 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
ed9aa65300cd780d6505d47d777d1fe467511db5 | 1,777 | py | Python | opt.py | ArmandB/RITnet | afe9524fdd795982c7e52761da68af2dfda589ea | [
"MIT"
] | null | null | null | opt.py | ArmandB/RITnet | afe9524fdd795982c7e52761da68af2dfda589ea | [
"MIT"
] | null | null | null | opt.py | ArmandB/RITnet | afe9524fdd795982c7e52761da68af2dfda589ea | [
"MIT"
] | 3 | 2021-03-05T04:49:42.000Z | 2022-02-16T20:20:33.000Z | from pprint import pprint
import argparse
def parse_args():
parser = argparse.ArgumentParser()
# Data input settings
parser.add_argument('--dataset', type=str, default='Semantic_Segmentation_Dataset/', help='name of dataset')
# Optimization: General
parser.add_argument('--bs', type=int, default = 8 )
parser.add_argument('--epochs', type=int,help='Number of epochs',default= 250)
parser.add_argument('--workers', type=int,help='Number of workers',default=4)
parser.add_argument('--model', help='model name',default='densenet')
parser.add_argument('--evalsplit', help='eval spolit',default='val')
parser.add_argument('--lr', type=float,default= 1e-3,help='Learning rate')
parser.add_argument('--save', help='save folder name',default='0try')
parser.add_argument('--seed', type=int, default=1111, help='random seed')
parser.add_argument('--load', type=str, default='best_model.pkl', help='load checkpoint file name')
parser.add_argument('--resume', action='store_true', help='resume train from load chkpoint')
parser.add_argument('--test', action='store_true', help='test only')
parser.add_argument('--savemodel',action='store_true',help='checkpoint save the model')
parser.add_argument('--testrun', action='store_true', help='test run with few dataset')
parser.add_argument('--expname', type=str, default='info', help='extra explanation of the method')
parser.add_argument('--useGPU', type=str, default=True, help='Set it as False if GPU is unavailable')
# parse
args = parser.parse_args()
opt = vars(args)
pprint('parsed input parameters:')
pprint(opt)
return args
if __name__ == '__main__':
opt = parse_args()
print('opt[\'dataset\'] is ', opt.dataset)
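# Illustrative invocation (opt.py is normally imported by the training/testing
# scripts; running it directly just prints the parsed options; values are examples):
#   python opt.py --dataset Semantic_Segmentation_Dataset/ --bs 8 --epochs 250 --expname trial1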
| 43.341463 | 112 | 0.694429 | 236 | 1,777 | 5.084746 | 0.385593 | 0.12 | 0.226667 | 0.063333 | 0.07 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007874 | 0.142375 | 1,777 | 40 | 113 | 44.425 | 0.779528 | 0.026449 | 0 | 0 | 0 | 0 | 0.32423 | 0.017432 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.071429 | 0 | 0.142857 | 0.142857 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eda26997e04dad71a5345443c6618c67bd6570e5 | 613 | py | Python | oops_fhir/r4/value_set/contact_point_system.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/value_set/contact_point_system.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/value_set/contact_point_system.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | from pathlib import Path
from fhir.resources.valueset import ValueSet as _ValueSet
from oops_fhir.utils import ValueSet
from oops_fhir.r4.code_system.contact_point_system import (
ContactPointSystem as ContactPointSystem_,
)
__all__ = ["ContactPointSystem"]
_resource = _ValueSet.parse_file(Path(__file__).with_suffix(".json"))
class ContactPointSystem(ContactPointSystem_):
"""
ContactPointSystem
Telecommunications form for contact point.
Status: active - Version: 4.0.1
http://hl7.org/fhir/ValueSet/contact-point-system
"""
class Meta:
resource = _resource
| 19.774194 | 69 | 0.748777 | 69 | 613 | 6.347826 | 0.536232 | 0.082192 | 0.073059 | 0.091324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009785 | 0.166395 | 613 | 30 | 70 | 20.433333 | 0.847358 | 0.238173 | 0 | 0 | 0 | 0 | 0.052392 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
eda269eeba1002830ddb7feba1403352015b2168 | 3,367 | py | Python | django_db_meter/views.py | djangothon/django-db-meter | 2a96b32b5cc1a926832316841afd5da7d90a0b8f | [
"Apache-2.0"
] | null | null | null | django_db_meter/views.py | djangothon/django-db-meter | 2a96b32b5cc1a926832316841afd5da7d90a0b8f | [
"Apache-2.0"
] | null | null | null | django_db_meter/views.py | djangothon/django-db-meter | 2a96b32b5cc1a926832316841afd5da7d90a0b8f | [
"Apache-2.0"
] | null | null | null | import json
from django.conf import settings
from django.core.urlresolvers import reverse
from django.http import HttpResponse
from django.db.models import get_apps, get_models
from django.shortcuts import render
from django.core.serializers.json import DjangoJSONEncoder
from django_db_meter.models import DBQueryMetric
from django_db_meter.models import AppWiseAggregatedMetric
from django_db_meter.models import TableWiseAggregatedMetric
from django_db_meter.models import DBWiseAggregatedMetric
# Create your views here.
def index(request):
ctx = {}
template = 'index.html'
return render(request, template, ctx)
def list_apps(request, app_name=None):
template = 'index.html'
if app_name is None:
ctx = {
'apps': [app.__package__.split('.')[-1] for app in get_apps()]
}
else:
ctx = {'url': reverse('app_wise_data', args=(app_name,))}
return render(request, template, ctx)
def list_tables(request, table_name=None):
template = 'index.html'
if table_name is None:
tables = []
for app in get_apps():
for model in get_models(app):
tables.append(model._meta.db_table)
ctx = {
'tables': tables
}
else:
ctx = {'url': reverse('table_wise_data', args=(table_name,))}
return render(request, template, ctx)
def list_dbs(request, db_name=None):
template = 'index.html'
if db_name is None:
dbs = [db.get('NAME') for db in settings.DATABASES.itervalues()]
ctx = {
'dbs': dbs
}
else:
ctx = {
'url': reverse('db_wise_data', args=(db_name,))
}
return render(request, template, ctx)
def create_response(data):
response = []
for data_point in data:
data_dict = {
'timestamp': data_point.timestamp,
'num_queries': data_point.num_queries,
'average_query_time': data_point.average_query_time,
'num_joined_queries': data_point.num_joined_queries,
}
data_dict['num_insert_queries'] = data_point.num_queries if data_point.query_type == 'INSERT' else 0
data_dict['num_update_queries'] = data_point.num_queries if data_point.query_type == 'UPDATE' else 0
data_dict['num_delete_queries'] = data_point.num_queries if data_point.query_type == 'DELETE' else 0
data_dict['num_select_queries'] = data_point.num_queries if data_point.query_type == 'SELECT' else 0
response.append(data_dict)
response = json.dumps(response, cls=DjangoJSONEncoder)
response = HttpResponse(response, content_type='application/json')
return response
def get_app_wise_data(request, app_name):
data = AppWiseAggregatedMetric.objects.filter(
        app_name=app_name).order_by('-timestamp')[:40]
return create_response(data)
def get_table_wise_data(request, table_name):
data = TableWiseAggregatedMetric.objects.filter(
table_name=table_name).order_by('-timestamp')[:40]
return create_response(data)
def get_db_wise_data(request, db_name):
data = DBWiseAggregatedMetric.objects.filter(
db_name=db_name).order_by('-timestamp')[:40]
return create_response(data)
def slow_queries(request):
slow_queries = DBQueryMetric.objects.filter(query_execution_time__gte=0.05)
template = 'index.html'
ctx = {'slow_queries': slow_queries}
return render(request, template, ctx)
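# Illustrative urls.py wiring for these views (a sketch assuming Django ~1.6-style
# URL patterns; the URL names must match the reverse() calls used above):
#
#     from django.conf.urls import url
#     from django_db_meter import views
#
#     urlpatterns = [
#         url(r'^apps/(?P<app_name>[\w.]+)/data/$', views.get_app_wise_data, name='app_wise_data'),
#         url(r'^tables/(?P<table_name>[\w.]+)/data/$', views.get_table_wise_data, name='table_wise_data'),
#         url(r'^dbs/(?P<db_name>[\w.]+)/data/$', views.get_db_wise_data, name='db_wise_data'),
#     ]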
| 33.67 | 101 | 0.703594 | 443 | 3,367 | 5.085779 | 0.198646 | 0.051931 | 0.04261 | 0.050599 | 0.365735 | 0.306258 | 0.218819 | 0.185974 | 0.149578 | 0.149578 | 0 | 0.005134 | 0.19008 | 3,367 | 99 | 102 | 34.010101 | 0.821049 | 0.006831 | 0 | 0.240964 | 0 | 0 | 0.097846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.13253 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edad2a21f58b87b3261ce84038ec82c4c7e7c0c6 | 57,471 | py | Python | ion/agents/mission_executive.py | ooici/coi-services | 43246f46a82e597345507afd7dfc7373cb346afa | [
"BSD-2-Clause"
] | 3 | 2016-09-20T09:50:06.000Z | 2018-08-10T01:41:38.000Z | ion/agents/mission_executive.py | ooici/coi-services | 43246f46a82e597345507afd7dfc7373cb346afa | [
"BSD-2-Clause"
] | null | null | null | ion/agents/mission_executive.py | ooici/coi-services | 43246f46a82e597345507afd7dfc7373cb346afa | [
"BSD-2-Clause"
] | 2 | 2016-03-16T22:25:49.000Z | 2016-11-26T14:54:21.000Z | """
@package ion.agents.mission_executive
@file ion/agents/mission_executive.py
@author Bob Fratantonio
@brief A class for the platform mission executive
"""
import calendar
import gevent
import yaml
import time
from time import gmtime
import pytz
from datetime import datetime
from pyon.agent.agent import ResourceAgentState
from pyon.agent.agent import ResourceAgentEvent
from pyon.agent.common import BaseEnum
from pyon.event.event import EventSubscriber
from pyon.public import log
from pyon.util.config import Config
from pyon.util.breakpoint import breakpoint
from ion.core.includes.mi import DriverEvent
from interface.objects import AgentCommand
from interface.objects import AgentCapability
from interface.objects import CapabilityType
from interface.objects import MissionExecutionStatus
class MissionErrorCode(BaseEnum):
"""
Mission executive error code
0 = No error
1 = Abort mission_thread
2 = Abort mission
"""
NO_ERROR = 0
ABORT_MISSION_THREAD = 1
ABORT_MISSION = 2
class MissionThreadStatus(BaseEnum):
"""
Mission thread status
starting = mission thread is starting
running = mission sequence is executing
stopped = mission sequence is waiting for next sequence
done = mission sequence has been terminated normally
aborted = mission sequence was aborted
"""
STARTING = 'starting'
RUNNING = 'running'
STOPPED = 'stopped'
DONE = 'done'
ABORTED = 'aborted'
class MissionLoader(object):
"""
MissionLoader class is used to parse a mission file, check the mission logic
and save the mission as a dict
"""
def __init__(self, platform_agent):
self.platform_agent = platform_agent
self.mission_entries = []
self.mission_id = None
self.accepted_error_values = ['abort', 'abortMission', 'retry', 'skip']
def add_entry(self, instrument_id=[], error_handling = {}, start_time=0, loop={}, event = {},
premission_cmds=[], mission_cmds=[], postmission_cmds=[]):
self.mission_entries.append({"mission_id": self.mission_id,
"instrument_id": instrument_id,
"error_handling": error_handling,
"start_time": start_time,
"loop": loop,
"event": event,
"premission_cmds": premission_cmds,
"mission_cmds": mission_cmds,
"postmission_cmds": postmission_cmds})
def count_entries(self):
return len(self.mission_entries)
def sort_entries(self):
self.mission_entries = sorted(self.mission_entries, key=lambda k: k["start_time"])
def get_entry_all(self, id_):
return self.mission_entries[id_]
def print_entry_all(self):
log.debug(self.mission_entries)
def delete_entry(self, id_):
self.mission_entries.pop(id_)
def convert_to_utc_from_timezone(self, seconds_from_epoch, tz):
"""
Convert user defined time zone to UTC
@param seconds_from_epoch Start time in seconds from epoch in timezone
@param tz Timezone from pytz.all_timezones
return Time in UTC seconds from epoch
"""
if tz in pytz.all_timezones:
user_timezone = pytz.timezone(tz)
else:
raise Exception('Time Zone not recognized')
# Get the time string into the user defined time zone
local_dt = user_timezone.localize(datetime.utcfromtimestamp(seconds_from_epoch))
# Convert to UTC datetime
utc_dt = local_dt.astimezone(pytz.utc)
# Return as seconds from the epoch
return calendar.timegm(utc_dt.timetuple())
def calculate_next_interval(self, id_):
current_time = time.time()
start_time = self.mission_entries[id_]["start_time"]
loop_duration = self.mission_entries[id_]["loop"]["loop_duration"]
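        # Illustrative example (hypothetical numbers): with current_time=1000,
        # start_time=900 and loop_duration=60, the loop below advances
        # 900 -> 960 -> 1020 and returns 20 seconds until the next interval.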
if start_time < current_time:
next_interval = start_time
while next_interval < current_time:
next_interval += loop_duration
log.debug("[mm] Current time is: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(time.time())))
log.debug("[mm] Next start at: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(start_time + loop_duration)))
return next_interval - current_time
else:
log.debug("[mm] Current time is: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(time.time())))
log.debug("[mm] Next start at: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(start_time)))
return (start_time - current_time) + loop_duration
def check_start_time(self, schedule, loop_duration):
"""
Check mission start time
"""
start_time = schedule['startTime']
tz = schedule['timeZone']
if start_time:
try:
start_time = calendar.timegm(start_time.timetuple())
except AttributeError:
self.publish_mission_loader_error_event('MissionLoader: validate_schedule: startTime format error')
# log.error("MissionLoader: validate_schedule: startTime format error: " + str(start_time_string))
log.error("[mm] MissionLoader: validate_schedule: startTime format error: " + str(start_time))
raise Exception('MissionLoader: validate_schedule: startTime format error')
else:
if tz:
start_time = self.convert_to_utc_from_timezone(start_time, tz)
current_time = time.time()
# Compare mission start time to current time
if (current_time > start_time):
if loop_duration > 0:
nloops = int((current_time-start_time)/loop_duration)+1
start_time += nloops*loop_duration
else:
log.debug("[mm] MissionLoader: validate_schedule: Start time has already elapsed")
# raise
log.debug("[mm] Current time is: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(current_time)))
log.debug("[mm] Start time is: %s", time.strftime("%Y-%m-%d %H:%M:%S", gmtime(start_time)))
return start_time
def check_intersections(self, indices):
"""
In the case of a single instrument with multiple missions,
check missions for schedule intersections
"""
start_times = []
mission_durations = []
loop_durations = []
num_loops = []
#Extract all the relevant info from the duplicate instrument missions
for index in indices:
if self.mission_entries[index]['start_time']:
start_times.append(self.mission_entries[index]['start_time'])
mission_durations.append(self.mission_entries[index]['duration'])
loop_durations.append(self.mission_entries[index]['loop_duration'])
num_loops.append(self.mission_entries[index]['num_loops'])
if start_times:
# Start times don't conflict with mission duration
# Now check possible loop conflicts
mission_all_times = []
for n in range(len(start_times)):
if num_loops[n] == -1:
#Checking the first 100 times should work
num_loops[n] = 100
#Create list of mission start times and end times
start = range(start_times[n], start_times[n] + (num_loops[n] * loop_durations[n]), loop_durations[n])
end = [x + mission_durations[n] for x in start]
mission_all_times.append(zip(start, end))
# This only compares adjacent missions (only works for 2 missions)
# TODO: Update logic to work for multiple instrument missions
for n in range(1, len(start_times)):
for sublist1 in mission_all_times[n-1]:
for sublist2 in mission_all_times[n]:
if (sublist1[0] >= sublist2[0] and sublist1[0] <= sublist2[1]) or (
sublist1[1] >= sublist2[0] and sublist1[1] <= sublist2[1]):
self.publish_mission_loader_error_event('Mission Error: Scheduling conflict')
log.error('[mm] Mission Error: Scheduling conflict: ' + str(sublist1) + str(sublist2))
raise Exception('Mission Error: Scheduling conflict')
return True
def verify_command_and_params(self, cmd='', params={}):
"""
Verify that specified command is defined.
"""
pass
# if cmd not in MissionCommands.all_cmds:
# raise Exception('Mission Error: %s Mission command not recognized' % cmd)
# for param in params:
# if param not in MissionCommands.all_cmds[cmd]:
# raise Exception('Mission Error: %s Mission parameter not recognized' % param)
def parse_loop_parameters(self, schedule):
"""
Parse loop parameters if given.
"""
num_loops = schedule['loop']['quantity']
loop_value = schedule['loop']['value']
if num_loops:
loop_units = schedule['loop']['units']
if loop_units.lower() == 'days':
loop_value *= 3600*24
elif loop_units.lower() == 'hrs':
loop_value *= 3600
elif loop_units.lower() == 'mins':
loop_value *= 60
else:
num_loops = None
loop_value = None
loop_parameters = {'loop_duration': loop_value, 'num_loops': num_loops}
return loop_parameters
def parse_error_parameters(self, error_parameters):
"""
Parse the error parameters - default and maxRetries
"""
if error_parameters['default'] not in self.accepted_error_values:
error_parameters['default'] = 'retry'
error_parameters['maxRetries'] = int(error_parameters['maxRetries'])
return error_parameters
def parse_mission_sequence(self, mission_sequence={}, instrument_id = []):
"""
Check the mission commands and parameters for duration
"""
mission_duration = 0
mission_params = []
if not mission_sequence:
return [], mission_duration
# Check mission duration
for index, items in enumerate(mission_sequence):
#Calculate mission duration
command = items['command']
error = items['onError']
if error not in self.accepted_error_values:
error = None
if ',' in command:
# Instrument ID is explicitly stated
instrument, command = command.strip().split(',')
if instrument not in list(instrument_id): # "in" operation not working as expected for DotList.
log.warn('[mm] instrument=%r not recognized from instrumentID list: %s', instrument, instrument_id)
else:
#First quick check for a wait command
cmd_method, rest = command.strip().split('(')
if cmd_method.lower() == 'wait':
instrument = None
elif len(instrument_id) == 1:
instrument = instrument_id[0]
else:
log.error('[mm] instrument_id not given in command: %s', command)
                    raise Exception('Error in mission command string: instrument_id not specified in command: %s' % command)
# self.verify_command_and_params(command, params)
if '(' in command and ')' in command:
cmd_method, rest = command.strip().split('(')
if '{' in rest and '}' in rest:
cmd, rest = rest.split('{')
param = rest.split('}')[0]
# Leave as string, doesn't have to be numeric
#param = float(param) if '.' in param else int(param)
else:
cmd = rest.split(')')[0]
param = None
else:
self.publish_mission_loader_error_event('Error in mission command string')
raise Exception('Error in mission command string')
if cmd_method.lower() == 'wait':
param = float(cmd) if '.' in cmd else int(cmd)
cmd = cmd_method
duration = param * 60
else:
duration = 0
mission_duration += duration
mission_params.append({'instrument_id': instrument,
'method': cmd_method,
'command': cmd,
'parameters': param,
'error': error})
return mission_params, mission_duration
def check_types(self, value, _type):
"""
Check the mission file contents types
"""
if type(value) != _type:
log.debug("[mm] Mission Executive Parser Warning: value %s is not %s", value, _type)
return _type(value)
else:
return value
def check_event(self, schedule):
"""
Verify the mission event
"""
event = schedule['event']
event_id = event['eventID']
parent_id = event['parentID']
if not event_id:
self.publish_mission_loader_error_event('Mission event not specified')
log.error('[mm] Mission event not specified')
raise Exception('Mission event not specified')
elif not parent_id:
self.publish_mission_loader_error_event('Mission event parentID not specified')
log.error('[mm] Mission event parentID not specified')
raise Exception('Mission event parentID not specified')
return event
def validate_schedule(self, mission={}):
"""
Check the mission parameters for scheduling conflicts
"""
for current_mission in mission:
# platform_id = current_mission['platformID']
instrument_id = current_mission['instrumentID']
schedule = current_mission['schedule']
if type(instrument_id) == str:
instrument_id = [instrument_id]
if 'preMissionSequence' in current_mission:
premission_sequence = current_mission['preMissionSequence']
else:
premission_sequence = None
mission_sequence = current_mission['missionSequence']
if 'postMissionSequence' in current_mission:
postmission_sequence = current_mission['postMissionSequence']
else:
postmission_sequence = None
error_parameters = current_mission['errorHandling']
premission_params, _ = self.parse_mission_sequence(premission_sequence, instrument_id)
mission_params, mission_duration = self.parse_mission_sequence(mission_sequence, instrument_id)
postmission_params, _ = self.parse_mission_sequence(postmission_sequence, instrument_id)
loop_params = self.parse_loop_parameters(schedule)
loop_duration = loop_params['loop_duration']
if (loop_duration and loop_duration < mission_duration):
self.publish_mission_loader_error_event('Mission File Error: Mission duration > scheduled loop duration')
log.error('[mm] Mission File Error: Mission duration > scheduled loop duration')
raise Exception('Mission Error: Mission duration greater than scheduled loop duration')
error = self.parse_error_parameters(error_parameters)
start_time = self.check_start_time(schedule, loop_duration)
if start_time:
# Timed mission
event = None
else:
# Event Driven Mission
event = self.check_event(schedule)
#Add mission entry
self.add_entry(instrument_id, error, start_time, loop_params, event,
premission_params, mission_params, postmission_params)
#Sort mission entries by start time
self.sort_entries()
instrument_id = []
for instrument in self.mission_entries:
instrument_id.append(instrument['instrument_id'])
#Return indices of duplicate instruments to check schedules
indices = [i for i, x in enumerate(instrument_id) if instrument_id.count(x) > 1]
#Now check timing schedule of duplicate instruments
if len(indices) > 1:
return self.check_intersections(indices)
else:
return True
def publish_mission_loader_error_event(self, description):
evt = dict(event_type='DeviceMissionEvent',
description=description,
mission_id=self.mission_id,
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
def load_mission_file(self, filename):
"""
Load, parse, and check the mission
"""
self.filename = filename
log.debug('[mm] Parsing %s', filename.split('/')[-1])
mission_dict = Config([filename]).data
# with open(filename) as f:
# mission_dict = yaml.safe_load(f)
self.raw_mission = mission_dict['mission']
return self.validate_schedule(self.raw_mission)
def load_mission(self, mission_id, mission_yml):
"""
Load, parse, and check the mission file contents
@param mission_id Mission id from RR
@param mission_yml Mission file contents as string
"""
        self.mission_id = mission_id
        log.debug('[mm] Parsing mission_id %s', self.mission_id)
mission_dict = yaml.safe_load(mission_yml)
self.raw_mission = mission_dict['mission']
return self.validate_schedule(self.raw_mission)
class MissionScheduler(object):
"""
MissionScheduler takes care of the command/control and associated timing for a
platform mission
"""
def __init__(self, platform_agent, instruments, mission):
import pprint
self.pformat = pprint.PrettyPrinter().pformat
log.debug('[mm] MissionScheduler: instruments=%s\nmission=%s',
self.pformat(instruments), self.pformat(mission))
self.platform_agent = platform_agent
self.instruments = instruments
self.mission = mission
self.mission_id = mission[0]['mission_id']
# Define max number of agent command retries
self.max_attempts = mission[0]['error_handling']['maxRetries']
self.default_error = mission[0]['error_handling']['default']
# Initialize error events
self.error_events_received = []
# Initialize list of mission event subscribers
self.mission_event_subscribers = []
# Initialize list of mission threads
self.threads = []
# Initialize mission thread ID
self.mission_thread_id = 0
# Boolean to hold global mission abort status
self.mission_aborted = False
# Boolean to hold global mission running status
self.mission_running = True
# Keep track of all running mission threads
self.mission_threads = []
def run_mission(self):
"""
Starts execution of the given mission file.
"""
log.debug('[mm] run_mission: mission=%s', self.mission)
# Start up the platform
# self.startup_platform()
self._schedule(self.mission)
def abort_mission(self):
"""
Terminates the ongoing mission execution by immediately stopping the
main mission sequence and then running instrument abort sequence
"""
# Only need the abort sequence once...
if not self.mission_aborted:
log.debug('[mm] abort_mission method called...')
self._publish_mission_abort_event()
# Setting this global to true will tell all running mission threads
# to stop ASAP
self.mission_aborted = True
self.mission_running = False
# Wait for all threads to finish current command...
for thread in self.mission_threads:
if thread['status'] == MissionThreadStatus.RUNNING:
# Wait for status change
thread_id = thread['mission_thread_id']
status = self.mission_threads[thread_id]['status']
while status == MissionThreadStatus.RUNNING:
gevent.sleep(1)
status = self.mission_threads[thread_id]['status']
# For event driven missions, stop event subscribers
self._stop_mission_event_subscribers()
# Start abort sequence
for instrument, client in self.instruments.iteritems():
self._instrument_abort_sequence(client)
log.error('[mm] Mission Aborted')
self._publish_mission_aborted_event()
def _abort_mission_thread(self, instruments):
"""
Terminates the missionThread execution and puts instruments in INACTIVE State
"""
log.debug('[mm] _abort_mission_thread called...')
# Start abort sequence
for instrument_id in instruments:
if instrument_id not in self.instruments:
continue
ia_client = self.instruments[instrument_id]
self._instrument_abort_sequence(ia_client)
log.error('[mm] Mission thread aborted')
def _check_mission_running(self):
"""
        This method checks all mission threads and, if all are complete, stops
        any event subscribers and publishes an event
"""
mission_complete = [MissionThreadStatus.ABORTED, MissionThreadStatus.DONE]
for thread in self.mission_threads:
if thread['status'] not in mission_complete:
log.debug('[mm] _check_mission_running - Mission threads are still running')
return True
self.mission_running = False
return False
def _stop_mission_event_subscribers(self):
"""
This method will stop all mission event subscribers when a mission has terminated
"""
# For event driven missions, stop event subscribers
for subscriber in self.mission_event_subscribers:
subscriber.stop()
self.mission_event_subscribers = []
def _schedule(self, missions):
"""
Set up gevent threads for each mission
"""
self._publish_mission_start_event()
for mission in missions:
start_time = mission['start_time']
# There are two types of mission schedules: timed and event
if start_time:
# Timed schedule
self.threads.append(gevent.spawn(self._run_timed_mission, mission))
else:
# Event driven scheduler
self.threads.append(gevent.spawn(self._run_event_driven_mission, mission))
self._publish_mission_started_event()
log.debug('[mm] schedule: waiting for mission to complete')
gevent.joinall(self.threads)
def _send_command(self, instrument_id, agent_client, cmd):
"""
Send agent command
@param instrument_id for logging
@param agent_client Instrument/platform agent client
@param cmd Mission command
"""
method = cmd['method']
command = cmd['command']
parameters = cmd['parameters']
# Three types of commands: wait, platform cmd, and instrument cmd
if command == 'wait':
log.debug('[mm] Send mission command = %s', command)
wait_duration = parameters * 60
now = time.time()
wait_end = now + wait_duration
while (time.time() < wait_end and not self.mission_aborted):
gevent.sleep(1)
elif agent_client is None:
# This indicates platform agent command
log.debug('[mm] Send mission command = %s - %s to platform agent %r',
method, command, instrument_id)
driver_event_class = self.platform_agent._plat_driver.get_platform_driver_event_class()
if command in driver_event_class.__dict__.keys():
kwargs = {}
if parameters:
# This must be a TURN_ON_PORT or TURN_OFF_PORT command
kwargs = dict(port_id=parameters)
cmd = AgentCommand(command=getattr(driver_event_class, command), kwargs=kwargs)
reply = getattr(self.platform_agent, method)(command=cmd)
else:
log.error('[mm] Mission Error: Command %s not recognized', command)
                raise Exception('Mission Error: Command %s not recognized' % command)
else:
# This indicates instrument agent command
log.debug('[mm] Send mission command = %s - %s to instrument agent resource_id=%r, instrument_id=%r',
method, command, agent_client.resource_id, instrument_id)
retval = agent_client.get_capabilities()
agt_cmds, agt_pars, res_cmds, res_iface, res_pars = self._sort_capabilities(retval)
if command in ResourceAgentEvent.__dict__.keys():
cmd = AgentCommand(command=getattr(ResourceAgentEvent, command))
reply = getattr(agent_client, method)(cmd)
elif command in DriverEvent.__dict__.keys():
cmd = AgentCommand(command=getattr(DriverEvent, command))
reply = getattr(agent_client, method)(cmd)
elif command in res_pars:
# Set parameters - check parameter first, then set if necessary
reply = getattr(agent_client, 'get_resource')(command)
log.debug('[mm] %s = %s', command, str(reply[command]))
if parameters and parameters != reply[command]:
parameters = float(parameters) if '.' in parameters else int(parameters)
getattr(agent_client, method)({command: parameters})
reply = getattr(agent_client, 'get_resource')(command)
log.debug('[mm] %s = %s', command, str(reply[command]))
if parameters != reply[command]:
msg = '[mm] Mission Error: Parameter %s not set' % parameters
log.error(msg)
raise Exception(msg)
else:
log.error('[mm] Mission Error: Command %s not recognized', command)
                raise Exception('Mission Error: Command %s not recognized' % command)
state = agent_client.get_agent_state()
log.debug('[mm] Agent State = %s', state)
def _execute_mission_commands(self, mission_cmds):
"""
Loop through the mission commands sequentially
@param mission_cmds mission command dict
return an error dict containing an error code from MissionErrorCode and error message
"""
for cmd in mission_cmds:
attempt = 0
instrument_id = cmd['instrument_id']
if instrument_id in self.instruments:
# This command is for a child instrument
ia_client = self.instruments[instrument_id]
elif instrument_id == self.platform_agent._platform_id or instrument_id is None:
# This command is for the parent platform agent or a 'wait'
ia_client = None
else:
log.warn('[mm] instrument_id=%r unrecognized', instrument_id)
continue
error_handling = cmd['error']
if not error_handling:
error_handling = self.default_error
log.debug('[mm] instrument_id=%r', instrument_id)
while attempt < self.max_attempts:
error_string = ''
if self.mission_aborted:
return dict(code=MissionErrorCode.ABORT_MISSION, message=error_string)
attempt += 1
log.debug('[mm] Mission command = %s, Attempt # %d', cmd['command'], attempt)
try:
self._send_command(instrument_id, ia_client, cmd)
except Exception, ex:
# Get a description of the error
error_string = str(ex) + ' Mission sequence command = ' + str(cmd)
if error_handling == 'skip':
log.warn('[mm] Mission command %s skipped on error', cmd['command'])
break
elif (error_handling == 'abort' or attempt >= self.max_attempts):
return dict(code=MissionErrorCode.ABORT_MISSION_THREAD, message=error_string)
elif error_handling == 'abortMission':
return dict(code=MissionErrorCode.ABORT_MISSION, message=error_string)
elif error_handling == 'retry':
# Wait 5 seconds before retrying
gevent.sleep(5)
else:
break
return dict(code=MissionErrorCode.NO_ERROR, message='')
def _run_timed_mission(self, mission):
"""
Run a timed mission
@param mission Mission dictionary
"""
current_mission_thread = self.mission_thread_id
self.mission_thread_id += 1
# Simple status update
self.mission_threads.append({"mission_thread_id": current_mission_thread,
"status": MissionThreadStatus.STARTING,
"type": 'timed'})
# Publish event mission thread started
self._publish_mission_thread_started_event(current_mission_thread)
error_code = MissionErrorCode.NO_ERROR
loop_count = 0
self.max_attempts = mission['error_handling']['maxRetries']
self.default_error = mission['error_handling']['default']
instrument_ids = mission['instrument_id']
# First execute premission if necessary
if mission['premission_cmds']:
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.RUNNING
# Execute commands
error = self._execute_mission_commands(mission['premission_cmds'])
error_code = error['code']
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.STOPPED
start_time = mission['start_time']
num_loops = mission['loop']['num_loops']
start_in = start_time - time.time() if (time.time() < start_time) else 0
log.debug('[mm] Mission start in ' + str(int(start_in)) + ' seconds')
# Master loop
while not error_code:
# Wait until next start (or abort)
while (time.time() < start_time) and not self.mission_aborted:
gevent.sleep(1)
# Publish event - missionSequence started
self._publish_mission_sequence_starting_event(current_mission_thread, "MissionSequence starting...")
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.RUNNING
# Execute commands
error = self._execute_mission_commands(mission['mission_cmds'])
error_code = error['code']
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.STOPPED
# Commands have been executed - increment loop count
loop_count += 1
if error_code or (loop_count >= num_loops and num_loops != -1):
# Mission was completed successfully or aborted
break
# Calculate next start time
start_time += mission['loop']['loop_duration']
log.debug("[mm] Next sequence starts at " + time.strftime("%Y-%m-%d %H:%M:%S", gmtime(start_time)))
event_description = ("MissionSequence loop {0} of {1} complete."
" Next start at {2}".format
(loop_count, num_loops, time.strftime("%Y-%m-%d %H:%M:%S", gmtime(start_time))))
self._publish_mission_sequence_complete_event(current_mission_thread, event_description)
if error_code:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
if error_code == MissionErrorCode.ABORT_MISSION_THREAD:
self._abort_mission_thread(instrument_ids)
if not self._check_mission_running():
self._stop_mission_event_subscribers()
self._publish_mission_complete_event()
elif error_code == MissionErrorCode.ABORT_MISSION:
self.abort_mission()
else:
# Execute postmission if specified
if mission['postmission_cmds']:
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.RUNNING
# Execute commands
error = self._execute_mission_commands(mission['postmission_cmds'])
error_code = error['code']
if error_code == MissionErrorCode.ABORT_MISSION_THREAD:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self._abort_mission_thread(instrument_ids)
if not self._check_mission_running():
self._stop_mission_event_subscribers()
self._publish_mission_complete_event()
elif error_code == MissionErrorCode.ABORT_MISSION:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self.abort_mission()
else:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.DONE
self._publish_mission_thread_complete_event(current_mission_thread)
if not self._check_mission_running():
self._stop_mission_event_subscribers()
self._publish_mission_complete_event()
else:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.DONE
self._publish_mission_thread_complete_event(current_mission_thread)
if not self._check_mission_running():
self._stop_mission_event_subscribers()
self._publish_mission_complete_event()
def _run_event_driven_mission(self, mission):
"""
Run an event driven mission
@param mission Mission dictionary
"""
current_mission_thread = self.mission_thread_id
self.mission_thread_id += 1
# Simple status update
self.mission_threads.append({"mission_thread_id": current_mission_thread,
"status": MissionThreadStatus.STARTING,
"type": 'event'})
# Publish event mission thread started
self._publish_mission_thread_started_event(current_mission_thread)
self.max_attempts = mission['error_handling']['maxRetries']
self.default_error = mission['error_handling']['default']
error_code = MissionErrorCode.NO_ERROR
instrument_ids = mission['instrument_id']
# Execute premission
if mission['premission_cmds']:
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.RUNNING
# Execute commands
error = self._execute_mission_commands(mission['premission_cmds'])
error_code = error['code']
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.DONE
        # Get the agent client for the device whose event needs monitoring
parent_id = mission['event']['parentID']
event_id = mission['event']['eventID']
if parent_id in self.instruments:
ia_event_client = self.instruments[parent_id]
origin = ia_event_client.resource_id
elif parent_id == self.platform_agent._platform_id:
origin = self.platform_agent._platform_id
else:
self._publish_mission_thread_failed_event(current_mission_thread, 'Parent ID unrecognized - {0}'.format(parent_id))
raise Exception('Parent ID unrecognized - {0}'.format(parent_id))
# Check that the event id is legitimate
driver_event_class = self.platform_agent._plat_driver.get_platform_driver_event_class()
if event_id in DriverEvent.__dict__.keys():
event_type = 'ResourceAgentCommandEvent'
event_id = getattr(DriverEvent, event_id)
elif event_id in driver_event_class.__dict__.keys():
event_type = ''
event_id = getattr(driver_event_class, event_id)
else:
# EXTERNAL event - check OMSDeviceStatusEvent
event_type = 'OMSDeviceStatusEvent'
#self._publish_mission_thread_failed_event(current_mission_thread, 'Event ID unrecognized - {0}'.format(event_id))
#raise Exception('Event ID unrecognized - %s', event_id)
#-------------------------------------------------------------------------------------
# Set up the subscriber to catch the mission event
#-------------------------------------------------------------------------------------
def callback_for_mission_events(event, *args, **kwargs):
#Check which type of event is being monitored
for attr in dir(event):
# An event was captured. Check that it is the correct event
try:
event_attr = event[attr]
except KeyError:
continue
else:
if event_id == event_attr:
# Execute the mission
log.debug('[mm] Mission Event %s received!', event_id)
log.debug('[mm] Event Driven Mission execution commenced')
# Publish event - missionSequence started
event_description = "Event {0} received! MissionSequence starting...".format(event_id)
self._publish_mission_sequence_starting_event(current_mission_thread, event_description)
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.RUNNING
# Execute commands
error = self._execute_mission_commands(mission['mission_cmds'])
error_code = error['code']
# Update internal mission status
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.DONE
if error_code == MissionErrorCode.ABORT_MISSION_THREAD:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self.mission_event_subscribers[mission_event_id].stop()
self._abort_mission_thread(instrument_ids)
if not self._check_mission_running():
self._publish_mission_complete_event()
self._stop_mission_event_subscribers()
elif error_code == MissionErrorCode.ABORT_MISSION:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self.abort_mission()
elif not self._check_mission_running():
self._publish_mission_thread_complete_event(current_mission_thread)
self._publish_mission_complete_event()
self._stop_mission_event_subscribers()
else:
event_description = "MissionSequence complete. Waiting for {0}".format(event_id)
self._publish_mission_sequence_complete_event(current_mission_thread, event_description)
if error_code:
self.mission_threads[current_mission_thread]['status'] = MissionThreadStatus.ABORTED
if error_code == MissionErrorCode.ABORT_MISSION_THREAD:
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self._abort_mission_thread(instrument_ids)
if not self._check_mission_running():
self._stop_mission_event_subscribers()
self._publish_mission_complete_event()
elif error_code == MissionErrorCode.ABORT_MISSION:
self._publish_mission_thread_failed_event(current_mission_thread, error['message'])
self.abort_mission()
else:
# Start an event subscriber to catch mission event
self.mission_event_subscribers.append(EventSubscriber(event_type=event_type,
origin=origin,
callback=callback_for_mission_events))
mission_event_id = len(self.mission_event_subscribers) - 1
self.mission_event_subscribers[mission_event_id].start()
log.debug('[mm] Event driven mission started. Waiting for %s', event_id)
while self.mission_running:
gevent.sleep(1)
#-------------------------------------------------------------------------------------
# Instrument state commands
#-------------------------------------------------------------------------------------
def _startup_instrument_into_command(self, agent_client):
state = agent_client.get_agent_state()
while state != ResourceAgentState.COMMAND:
# UNINITIALIZED -> INACTIVE -> IDLE ->
try:
if state == ResourceAgentState.UNINITIALIZED:
log.debug('[mm] startup_instrument_into_command - INITIALIZE')
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = agent_client.execute_agent(cmd)
elif state == ResourceAgentState.INACTIVE:
log.debug('[mm] startup_instrument_into_command - GO_ACTIVE')
cmd = AgentCommand(command=ResourceAgentEvent.GO_ACTIVE)
retval = agent_client.execute_agent(cmd)
elif state == ResourceAgentState.IDLE:
log.debug('[mm] startup_instrument_into_command - RUN')
cmd = AgentCommand(command=ResourceAgentEvent.RUN)
retval = agent_client.execute_agent(cmd)
else:
log.debug('[mm] startup_instrument_into_command - RESET')
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = agent_client.execute_agent(cmd)
state = agent_client.get_agent_state()
log.debug('[mm] Agent state = %s', state)
except:
return False
return True
def _shutdown_instrument_into_inactive(self, agent_client):
"""
Check state, stop streaming if necessary, and get into INACTIVE state
"""
state = agent_client.get_agent_state()
while state != ResourceAgentState.INACTIVE:
# STREAMING -> COMMAND -> IDLE -> INACTIVE
try:
# Get capabilities
retval = agent_client.get_capabilities()
agt_cmds, agt_pars, res_cmds, res_iface, res_pars = self._sort_capabilities(retval)
if state == ResourceAgentState.UNINITIALIZED:
log.debug('[mm] shutdown_instrument_into_inactive - INITIALIZE')
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = agent_client.execute_agent(cmd)
elif ResourceAgentEvent.GO_INACTIVE in agt_cmds:
log.debug('[mm] shutdown_instrument_into_inactive - GO_INACTIVE')
cmd = AgentCommand(command=ResourceAgentEvent.GO_INACTIVE)
retval = agent_client.execute_agent(cmd)
elif state == ResourceAgentState.IDLE:
log.debug('[mm] shutdown_instrument_into_inactive - GO_INACTIVE')
cmd = AgentCommand(command=ResourceAgentEvent.GO_INACTIVE)
retval = agent_client.execute_agent(cmd)
elif state == ResourceAgentState.COMMAND:
log.debug('[mm] shutdown_instrument_into_inactive - CLEAR')
cmd = AgentCommand(command=ResourceAgentEvent.CLEAR)
retval = agent_client.execute_agent(cmd)
elif state == ResourceAgentState.STREAMING:
log.debug('[mm] shutdown_instrument_into_inactive - STOP_AUTOSAMPLE')
cmd = AgentCommand(command=DriverEvent.STOP_AUTOSAMPLE)
retval = agent_client.execute_resource(cmd)
else:
# TODO: What about Calibrate?
log.debug('[mm] shutdown_instrument_into_inactive - RESET')
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = agent_client.execute_agent(cmd)
state = agent_client.get_agent_state()
log.debug('[mm] shutdown_instrument_into_inactive - Agent state = %s', state)
except:
return False
return True
def _instrument_abort_sequence(self, agent_client):
"""
Check state, and gracefully get into INACTIVE state
"""
state = agent_client.get_agent_state()
if state != ResourceAgentState.INACTIVE:
self._shutdown_instrument_into_inactive(agent_client)
def _sort_capabilities(self, caps_list):
agt_cmds = []
agt_pars = []
res_cmds = []
res_iface = []
res_pars = []
if len(caps_list) > 0 and isinstance(caps_list[0], AgentCapability):
agt_cmds = [x.name for x in caps_list if x.cap_type == CapabilityType.AGT_CMD]
agt_pars = [x.name for x in caps_list if x.cap_type == CapabilityType.AGT_PAR]
res_cmds = [x.name for x in caps_list if x.cap_type == CapabilityType.RES_CMD]
res_iface = [x.name for x in caps_list if x.cap_type == CapabilityType.RES_IFACE]
res_pars = [x.name for x in caps_list if x.cap_type == CapabilityType.RES_PAR]
elif len(caps_list) > 0 and isinstance(caps_list[0], dict):
agt_cmds = [x['name'] for x in caps_list if x['cap_type'] == CapabilityType.AGT_CMD]
agt_pars = [x['name'] for x in caps_list if x['cap_type'] == CapabilityType.AGT_PAR]
res_cmds = [x['name'] for x in caps_list if x['cap_type'] == CapabilityType.RES_CMD]
res_iface = [x['name'] for x in caps_list if x['cap_type'] == CapabilityType.RES_IFACE]
res_pars = [x['name'] for x in caps_list if x['cap_type'] == CapabilityType.RES_PAR]
return agt_cmds, agt_pars, res_cmds, res_iface, res_pars
#----------------------------------------------------------------------------------------------------
# Publish mission executive events
# Refer to https://confluence.oceanobservatories.org/display/CIDev/Platform+Agent+Mission+Executive
#----------------------------------------------------------------------------------------------------
# Mission is about to start execution
def _publish_mission_start_event(self):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
sub_type="STARTING",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description='Mission {0} is about to start execution'.format(self.mission_id))
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission has started
def _publish_mission_started_event(self):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
sub_type="STARTED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description='Mission {0} has started'.format(self.mission_id))
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission has been requested to be aborted
def _publish_mission_abort_event(self):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
sub_type="STOPPING",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.ABORTED,
description='Mission {0} abort sequence has started'.format(self.mission_id))
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission has been aborted
def _publish_mission_aborted_event(self):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
sub_type="STOPPED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.ABORTED)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission has completed, determine if it exited normally or due to exception
def _publish_mission_complete_event(self):
"""
This method can be called by a mission thread with no knowledge of
the status of other threads. So a check must be made to see if any
threads failed during mission execution!
"""
mission_threads_aborted = 0
mission_threads_successful = 0
for thread in self.mission_threads:
if thread['status'] == MissionThreadStatus.ABORTED:
mission_threads_aborted += 1
else:
mission_threads_successful += 1
if mission_threads_aborted > 0:
execution_status = MissionExecutionStatus.FAILED
description = ('Mission {0} has failed.'
' {1} mission threads completed successfully,'
' {2} mission threads aborted'
.format(self.mission_id, mission_threads_successful, mission_threads_aborted))
else:
execution_status = MissionExecutionStatus.OK
description = 'Mission {0} has exited normally'.format(self.mission_id)
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
sub_type="STOPPED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=execution_status,
description=description)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission thread has started
def _publish_mission_thread_started_event(self, mission_thread_id):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
mission_thread_id=str(mission_thread_id),
sub_type="STARTED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description='Mission thread {0} has started'.format(mission_thread_id))
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission thread has exited normally
def _publish_mission_thread_complete_event(self, mission_thread_id):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
mission_thread_id=str(mission_thread_id),
sub_type="STOPPED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description='Mission thread {0} has exited normally'.format(mission_thread_id))
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission thread has started a missionSequence
def _publish_mission_sequence_starting_event(self, mission_thread_id, description):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
mission_thread_id=str(mission_thread_id),
sub_type="STARTED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description=description)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission thread has completed a missionSequence
def _publish_mission_sequence_complete_event(self, mission_thread_id, description):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
mission_thread_id=str(mission_thread_id),
sub_type="STOPPED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.OK,
description=description)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
# Mission thread has exited due to some exception
def _publish_mission_thread_failed_event(self, mission_thread_id, description=''):
evt = dict(event_type='MissionLifecycleEvent',
mission_id=self.mission_id,
mission_thread_id=str(mission_thread_id),
sub_type="STOPPED",
origin_type=self.platform_agent.ORIGIN_TYPE,
origin=self.platform_agent.resource_id,
execution_status=MissionExecutionStatus.FAILED,
description=description)
self.platform_agent._event_publisher.publish_event(**evt)
log.debug('[mm] event published: %s', evt)
if __name__ == "__main__": # pragma: no cover
"""
Stand alone to check the mission loading/parsing capabilities
"""
p_agent = []
mission_id = 0
mission = MissionLoader(p_agent)
filename = "ion/agents/platform/test/mission_RSN_simulator1.yml"
with open(filename, 'r') as f:
mission_string = f.read()
mission.load_mission(mission_id, mission_string)
| 44.242494 | 127 | 0.599346 | 6,007 | 57,471 | 5.48277 | 0.083569 | 0.042629 | 0.015789 | 0.014999 | 0.511796 | 0.453287 | 0.418339 | 0.3816 | 0.368514 | 0.341157 | 0 | 0.0028 | 0.310243 | 57,471 | 1,298 | 128 | 44.276579 | 0.828032 | 0.09222 | 0 | 0.389499 | 0 | 0.001221 | 0.117105 | 0.014698 | 0 | 0 | 0 | 0.001541 | 0 | 0 | null | null | 0.001221 | 0.02442 | null | null | 0.003663 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edb43d40789e6373d5f99ed2947e668d34a00f63 | 1,499 | py | Python | setup.py | andrewwstephens/GNIRS-Pype | eb38f75e98d4f12aee51bab3c2058f28ab2a318f | [
"BSD-3-Clause"
] | 2 | 2020-01-05T03:28:02.000Z | 2020-03-22T04:35:40.000Z | setup.py | andrewwstephens/GNIRS-Pype | eb38f75e98d4f12aee51bab3c2058f28ab2a318f | [
"BSD-3-Clause"
] | 3 | 2019-08-25T20:53:40.000Z | 2019-08-25T20:56:28.000Z | setup.py | andrewwstephens/GNIRS-Pype | eb38f75e98d4f12aee51bab3c2058f28ab2a318f | [
"BSD-3-Clause"
] | 1 | 2019-08-21T08:35:09.000Z | 2019-08-21T08:35:09.000Z | # Based on STScI's JWST calibration pipeline.
from __future__ import print_function
from setuptools import setup, find_packages, Extension, Command
from glob import glob
# Open the README as the package long description
readme = open('README.rst', 'r')
README_TEXT = readme.read()
readme.close()
NAME = 'gnirs-pype'
SCRIPTS = glob('scripts/*')
PACKAGE_DATA = {
'': ['*.dat', '*.cfg', '*.fits', '*.txt']
}
setup(
name=NAME,
version="1.0.1",
author='mbusserolle',
author_email='mbussero@gemini.edu',
description='Gemini Instruments Data Reduction Framework.',
    long_description=README_TEXT,
url='http://www.gemini.edu',
license='MIT',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: MacOS X',
'Intended Audience :: Science/Research',
'Intended Audience :: Education',
'License :: OSI Approved :: MIT License',
'Operating System :: MacOS',
'Programming Language :: Python :: 2.7',
'Topic :: Scientific/Engineering :: Astronomy',
'Topic :: Scientific/Engineering :: Physics',
],
keywords='Gemini GNIRS gnirs pipeline reduction data IRAF iraf PYRAF pyraf astronomy IR cross-dispersed longslit spectroscopy xd',
python_requires='~=2.7',
scripts=SCRIPTS, # TODO(nat): Update this to use entry_points instead of scripts for better cross-platform performance
packages=find_packages(),
package_data=PACKAGE_DATA,
install_requires=[]
)
| 33.311111 | 134 | 0.67445 | 172 | 1,499 | 5.77907 | 0.639535 | 0.033199 | 0.042254 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00665 | 0.197465 | 1,499 | 44 | 135 | 34.068182 | 0.819618 | 0.127418 | 0 | 0 | 0 | 0 | 0.456288 | 0.033742 | 0 | 0 | 0 | 0.022727 | 0 | 1 | 0 | false | 0 | 0.078947 | 0 | 0.078947 | 0.026316 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edb88c87eacb260200fcfeff5b16ef69ed4f2641 | 1,476 | py | Python | setup.py | jeroyang/mcgocr | 6cc6967b32161ee17cd52b482c4e9d1c08200285 | [
"MIT"
] | 1 | 2021-12-14T18:31:00.000Z | 2021-12-14T18:31:00.000Z | setup.py | jeroyang/mcgocr | 6cc6967b32161ee17cd52b482c4e9d1c08200285 | [
"MIT"
] | 1 | 2021-03-31T18:52:56.000Z | 2021-03-31T18:52:56.000Z | setup.py | jeroyang/mcgocr | 6cc6967b32161ee17cd52b482c4e9d1c08200285 | [
"MIT"
] | 1 | 2019-05-15T17:43:47.000Z | 2019-05-15T17:43:47.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from setuptools import setup
import sys
import os
if sys.argv[-1] == 'publish':
os.system("python setup.py sdist bdist_wheel upload")
sys.exit()
with open('README.md') as readme_file:
readme = readme_file.read()
version = '1.0.2'
with open('requirements.txt') as f:
requirements = f.read().split('\n')
test_requirements = [
# TODO: put package test requirements here
]
input_fns = ['input/'+fn for fn in '11532192.txt 11597317.txt 11897010.txt 12079497.txt 12546709.txt 12585968.txt'.split(' ')]
setup(
name='ncgocr',
version=version,
description="Named Concept Gene Ontology Concept Recognition",
long_description=readme,
author="Chia-Jung, Yang",
author_email='jeroyang@gmail.com',
url='https://github.com/jeroyang/ncgocr',
packages=[
'ncgocr'
],
package_dir={'ncgocr': 'ncgocr'},
data_files=[('input', input_fns)],
include_package_data=True,
install_requires=requirements,
license="MIT",
zip_safe=False,
keywords='ncgocr',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
],
test_suite='tests',
tests_require=test_requirements
)
| 27.333333 | 126 | 0.645664 | 177 | 1,476 | 5.288136 | 0.615819 | 0.051282 | 0.080128 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050427 | 0.207317 | 1,476 | 53 | 127 | 27.849057 | 0.749573 | 0.056233 | 0 | 0.045455 | 0 | 0 | 0.399281 | 0 | 0 | 0 | 0 | 0.018868 | 0 | 1 | 0 | false | 0 | 0.068182 | 0 | 0.068182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edc2156240913ee79c59e19a1de39484351cfa7f | 3,701 | py | Python | frontends/pycc.py | Henny022p/cexplore | 2bd33c87d4373d4073dda8a0fe0abaef63709df2 | [
"MIT"
] | null | null | null | frontends/pycc.py | Henny022p/cexplore | 2bd33c87d4373d4073dda8a0fe0abaef63709df2 | [
"MIT"
] | 3 | 2021-08-28T21:42:23.000Z | 2021-11-10T22:30:49.000Z | frontends/pycc.py | Henny022p/cexplore | 2bd33c87d4373d4073dda8a0fe0abaef63709df2 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import argparse
import os
import subprocess
import sys
from shutil import copyfile
from parser import parse, generate_ast, apply_transformations, ASTDump
from parse_debug import process_debug_info
def parse_args(argv):
parser = argparse.ArgumentParser(description='Simplified CC1 frontend')
parser.add_argument('--qinclude', action='append', help='Include Paths for iquote', required=False)
parser.add_argument('--binclude', action='append', help='Include Paths for Block Include', required=False)
parser.add_argument('--cc1', help='<Required> cc1 Path', required=False)
parser.add_argument('--version', help='Get Version String of cc1', required=False)
parser.add_argument('--preproc', help='preproc path', required=False)
parser.add_argument('--charmap', help='preproc charmap', required=False)
parser.add_argument('-S', action='store_true', help='Ignore parameter as agbcc does not know it', required=False)
parser.add_argument('-o', help='Output Assembly file', required=False, dest='destination')
parser.add_argument('--no-parse', action='store_true', help='disable parsing of agbcc output (debug option)',
required=False)
return parser.parse_known_args(argv)
def compile(source, output_filename, args, remainder):
cpp_args = ["cpp", "-nostdinc", "-undef"]
# Add Block Includes and Quote Includes
if args.qinclude:
for q in args.qinclude:
cpp_args += ["-iquote", q]
if args.binclude:
for b in args.binclude:
cpp_args += ["-I", b]
cpp_args += [source, "-o", source + ".i"]
subprocess.call(cpp_args)
if args.preproc and args.charmap:
pprocess = subprocess.Popen([args.preproc, source + '.i', args.charmap], stdout=subprocess.PIPE)
subprocess.call([args.cc1] + ['-o', output_filename] + remainder, stdin=pprocess.stdout)
else:
with open(source + '.i', 'r') as a:
subprocess.call([args.cc1] + ['-o', output_filename] + remainder, stdin=a)
def process_asm(input_filename, output_filename):
tree, success = parse(input_filename)
if not success:
raise ValueError('could not parse file')
ast = generate_ast(tree)
apply_transformations(ast)
with open(output_filename, 'w') as destination_file:
ASTDump(destination_file).visit(ast)
def cleanup(args, source):
for file in [f'{source}.i', f'{args.destination}.tmp']:
if os.path.exists(file):
os.remove(file)
def main(argv):
status_code = 0
args, remainder = parse_args(argv)
if args.version:
git_proc = subprocess.run(['git', '--git-dir=' + args.version + '/.git', 'rev-parse', '--short', 'HEAD'],
stdout=subprocess.PIPE)
print("pycc frontend for agbcc1 " + os.path.basename(args.version) + "@" + git_proc.stdout.decode('utf-8'))
exit(0)
source = remainder.pop(-1)
try:
if source.endswith('.c'):
asm_file = args.destination + '.tmp'
compile(source, asm_file, args, remainder)
process_debug_info(asm_file)
else:
asm_file = source
if not args.no_parse:
try:
process_asm(asm_file, args.destination)
except Exception as e:
print(f'error cleaning assembly code: {e}\nOutputting unprocessed assembly', file=sys.stderr)
copyfile(asm_file, args.destination)
status_code = 1
else:
copyfile(asm_file, args.destination)
finally:
cleanup(args, source)
exit(status_code)
if __name__ == '__main__':
main(sys.argv[1:])
| 37.765306 | 117 | 0.64388 | 461 | 3,701 | 5.036876 | 0.321041 | 0.034884 | 0.065891 | 0.066322 | 0.189492 | 0.099053 | 0.043066 | 0.043066 | 0.043066 | 0 | 0 | 0.00488 | 0.224804 | 3,701 | 97 | 118 | 38.154639 | 0.804461 | 0.015942 | 0 | 0.089744 | 1 | 0 | 0.167033 | 0.006044 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064103 | false | 0 | 0.089744 | 0 | 0.166667 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edc41cd9ac050974b750582741dcf3766b738d8e | 2,702 | py | Python | server/ahj_app/migrations/0006_ahjusermaintains_engineeringreviewrequirement.py | btansy/ahj-registry | b5340c7474f610964e828588572846ce4fb088aa | [
"MIT"
] | 4 | 2020-11-06T04:42:07.000Z | 2021-07-28T18:09:26.000Z | server/ahj_app/migrations/0006_ahjusermaintains_engineeringreviewrequirement.py | btansy/ahj-registry | b5340c7474f610964e828588572846ce4fb088aa | [
"MIT"
] | 38 | 2020-08-19T20:20:08.000Z | 2022-01-23T03:22:51.000Z | server/ahj_app/migrations/0006_ahjusermaintains_engineeringreviewrequirement.py | btansy/ahj-registry | b5340c7474f610964e828588572846ce4fb088aa | [
"MIT"
] | 8 | 2020-05-22T17:04:16.000Z | 2021-01-15T19:14:36.000Z | # Generated by Django 3.1.3 on 2021-03-16 21:18
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('ahj_app', '0005_ahjdocumentsubmissionmethoduse_ahjpermitissuemethoduse_documentsubmissionmethod_permitissuemethod'),
]
operations = [
migrations.CreateModel(
name='EngineeringReviewRequirement',
fields=[
('EngineeringReviewRequirementID', models.AutoField(db_column='EngineeringReviewRequirementID', primary_key=True, serialize=False)),
('Description', models.CharField(blank=True, db_column='Description', max_length=255)),
('EngineeringReviewType', models.CharField(choices=[('', ''), ('StructuralEngineer', 'Structural Engineer'), ('ElectricalEngineer', 'Electrical Engineer'), ('PVEngineer', 'PV Engineer'), ('MasterElectrician', 'Master Electrician'), ('FireMarshal', 'Fire Marshal'), ('EnvironmentalEngineer', 'Environmental Engineer')], db_column='EngineeringReviewType', max_length=21)),
('RequirementLevel', models.CharField(choices=[('', ''), ('Required', 'Required'), ('Optional', 'Optional'), ('ConditionallyRequired', 'Conditionally Required')], db_column='RequirementLevel', max_length=21)),
('RequirementNotes', models.CharField(blank=True, db_column='RequirementNotes', max_length=255)),
('StampType', models.CharField(choices=[('', ''), ('Wet', 'Wet'), ('Digital', 'Digital'), ('Notary', 'Notary'), ('None', 'None')], db_column='StampType', max_length=7)),
('AHJPK', models.ForeignKey(db_column='AHJPK', on_delete=django.db.models.deletion.DO_NOTHING, to='ahj_app.ahj')),
],
options={
'db_table': 'EngineeringReviewRequirement',
'managed': True,
},
),
migrations.CreateModel(
name='AHJUserMaintains',
fields=[
('MaintainerID', models.AutoField(db_column='MaintainerID', primary_key=True, serialize=False)),
('MaintainerStatus', models.IntegerField(db_column='MaintainerStatus')),
('AHJPK', models.ForeignKey(db_column='AHJPK', on_delete=django.db.models.deletion.DO_NOTHING, to='ahj_app.ahj')),
('UserID', models.ForeignKey(db_column='UserID', on_delete=django.db.models.deletion.DO_NOTHING, to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'AHJUserMaintains',
'managed': True,
'unique_together': {('AHJPK', 'UserID')},
},
),
]
| 58.73913 | 386 | 0.634345 | 235 | 2,702 | 7.140426 | 0.408511 | 0.052443 | 0.033373 | 0.052443 | 0.196067 | 0.162694 | 0.124553 | 0.124553 | 0.124553 | 0.100119 | 0 | 0.014078 | 0.211325 | 2,702 | 45 | 387 | 60.044444 | 0.773346 | 0.016654 | 0 | 0.358974 | 1 | 0 | 0.33145 | 0.113748 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edc48cf5740bf064744042cdf10cd4eb422aed4d | 5,295 | py | Python | back/organization/views.py | kingzbauer/ChiefOnboarding | 202092a367aade9e032286466e06399ea07f1908 | [
"MIT"
] | null | null | null | back/organization/views.py | kingzbauer/ChiefOnboarding | 202092a367aade9e032286466e06399ea07f1908 | [
"MIT"
] | null | null | null | back/organization/views.py | kingzbauer/ChiefOnboarding | 202092a367aade9e032286466e06399ea07f1908 | [
"MIT"
] | null | null | null | import boto3
from botocore.config import Config
from django.conf import settings
from django.http import HttpResponse
from django.shortcuts import get_object_or_404, render
from django.utils.decorators import method_decorator
from django.views.decorators.csrf import ensure_csrf_cookie
from rest_framework import status
from rest_framework.permissions import IsAuthenticatedOrReadOnly, AllowAny
from rest_framework.response import Response
from rest_framework.views import APIView
from .models import Organization, Tag, WelcomeMessage
from misc.models import File
from .serializers import BaseOrganizationSerializer, DetailOrganizationSerializer, \
WelcomeMessageSerializer, ExportSerializer
from misc.serializers import FileSerializer
from users.permissions import NewHirePermission, AdminPermission
from django.core import management
from sequences.models import Sequence
def home(request):
return render(request, 'index.html')
class OrgView(APIView):
permission_classes = (IsAuthenticatedOrReadOnly,)
def get(self, request):
org = BaseOrganizationSerializer(Organization.object.get())
return Response(org.data)
class OrgDetailView(APIView):
def get(self, request):
org = DetailOrganizationSerializer(Organization.object.get())
return Response(org.data)
def patch(self, request):
serializer = DetailOrganizationSerializer(Organization.object.get(), data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
Sequence.objects.all().update(auto_add=False)
if 'auto_add_sequence' in request.data:
for i in request.data['auto_add_sequence']:
seq = Sequence.objects.get(id=i)
seq.auto_add = True
seq.save()
serializer.save()
return Response(serializer.data)
class WelcomeMessageView(APIView):
permission_classes = (IsAuthenticatedOrReadOnly,)
def get(self, request):
welcome_messages = WelcomeMessage.objects.all()
serializer = WelcomeMessageSerializer(welcome_messages, many=True)
return Response(serializer.data)
def post(self, request):
serializer = WelcomeMessageSerializer(data=request.data)
serializer.is_valid(raise_exception=True)
welcome_message = WelcomeMessage.objects.get(language=serializer.data['language'], message_type=serializer.data['message_type'])
welcome_message.message = serializer.data['message']
welcome_message.save()
return Response(serializer.data)
class TagView(APIView):
permission_classes = (IsAuthenticatedOrReadOnly,)
def get(self, request):
tags = [i.name for i in Tag.objects.all()]
return Response(tags)
class CSRFTokenView(APIView):
permission_classes = (AllowAny,)
@method_decorator(ensure_csrf_cookie)
def get(self, request):
return HttpResponse()
class FileView(APIView):
permission_classes = (AdminPermission, NewHirePermission)
def get(self, request, id, uuid):
file = get_object_or_404(File, uuid=uuid, id=id)
url = file.get_url()
return Response(url)
def post(self, request):
serializer = FileSerializer(data={'name': request.data['name'], 'ext': request.data['name'].split('.')[1]})
serializer.is_valid(raise_exception=True)
f = serializer.save()
key = str(f.id) + '-' + request.data['name'].split('.')[0] + '/' + request.data['name']
f.key = key
f.save()
s3 = boto3.client('s3',
settings.AWS_REGION,
endpoint_url=settings.AWS_S3_ENDPOINT_URL,
aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
config=Config(signature_version='s3v4')
)
url = s3.generate_presigned_url(ClientMethod='put_object', ExpiresIn=3600,
Params={'Bucket': settings.AWS_STORAGE_BUCKET_NAME, 'Key': key})
return Response({'url': url, 'id': f.id})
def put(self, request, id):
file = get_object_or_404(File, pk=id)
file.active = True
file.save()
return Response(FileSerializer(file).data)
def delete(self, request, id):
if request.user.role == 1:
file = get_object_or_404(File, pk=id)
file.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class LogoView(APIView):
def put(self, request, id):
file = get_object_or_404(File, pk=id)
file.active = True
file.save()
org = Organization.object.get()
org.logo = file
org.save()
return Response(FileSerializer(file).data)
class ExportView(APIView):
def post(self, request):
from io import StringIO
import json
from django.core.files.base import ContentFile
buf = StringIO()
serializer = ExportSerializer(data=request.data)
serializer.is_valid(raise_exception=True)
management.call_command('dumpdata', serializer.data['export_model'], stdout=buf, natural_foreign=True)
buf.seek(0)
return Response(json.loads(buf.read()))
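# --- Illustrative addition (not part of the original module) ---
# A minimal sketch of how a client could drive the FileView upload flow above:
# POST to create a File record and receive a presigned S3 PUT url, upload the bytes
# directly to S3, then PUT to mark the file active. The '/api/org/file' route and the
# token header are assumptions made for this example; the real url patterns live in
# the project's urls.py, which is not shown here.
def _example_file_upload(base_url, token, file_path):
    import requests as http  # local import keeps the sketch self-contained
    api = base_url + '/api/org/file'                       # hypothetical route
    headers = {'Authorization': 'Token ' + token}          # hypothetical auth scheme
    name = file_path.split('/')[-1]
    created = http.post(api, json={'name': name}, headers=headers).json()
    with open(file_path, 'rb') as fh:
        http.put(created['url'], data=fh)                  # upload via the presigned url
    return http.put('{}/{}'.format(api, created['id']), headers=headers).json()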
| 34.835526 | 136 | 0.675921 | 592 | 5,295 | 5.907095 | 0.263514 | 0.040892 | 0.017158 | 0.029168 | 0.247927 | 0.225336 | 0.156134 | 0.132113 | 0.075493 | 0.038319 | 0 | 0.008293 | 0.225685 | 5,295 | 151 | 137 | 35.066225 | 0.844634 | 0 | 0 | 0.264957 | 0 | 0 | 0.027951 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.119658 | false | 0 | 0.179487 | 0.017094 | 0.529915 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
edc7aee6d0b122d86c42cccd0bf4c885fa2fd7db | 4,790 | py | Python | src/infrastructure/services/scraper_services.py | rafaelcorazzi/game-scraper | 6543e6ee55f4776dd8f2c3cc49993d84401202c9 | [
"MIT"
] | null | null | null | src/infrastructure/services/scraper_services.py | rafaelcorazzi/game-scraper | 6543e6ee55f4776dd8f2c3cc49993d84401202c9 | [
"MIT"
] | null | null | null | src/infrastructure/services/scraper_services.py | rafaelcorazzi/game-scraper | 6543e6ee55f4776dd8f2c3cc49993d84401202c9 | [
"MIT"
] | null | null | null | import bs4
import requests
from bs4 import BeautifulSoup
import base64
import hashlib
from typing import List
import re
from src.helpers.utils import Utils
from src.domain.console_domain import ConsolePlataform
from src.domain.console_games_domain import ConsoleGames
from src.domain.game_domain import Game
import maya
import uuid
class ScraperServices:
@staticmethod
def __html_result(url: str = None) -> bs4.BeautifulSoup:
page = requests.get(url)
soup = BeautifulSoup(page.text, 'html.parser')
return soup
@staticmethod
def game_details(link: str = None, reference_id: str = None, console_code: str = None) -> Game:
page = ScraperServices.__html_result(f'https://jogorama.com.br/{link}/')
data_sheet = page.select('.ficha')
title = re.sub('[^A-Za-z0-9]+', ' ', str(page.findAll('span', attrs={"itemprop": "name"})[0].text))
owner = re.sub('[^A-Za-z0-9]+', ' ', str(page.findAll('span', attrs={"itemprop": "author"})[0].text))
publisher = re.sub('[^A-Za-z0-9]+', ' ', str(page.findAll('span', attrs={"itemprop": "publisher"})[0].text))
genre = '' if len(page.findAll('span', attrs={"itemprop": "genre"})) == 0 else str(page.findAll('span', attrs={"itemprop": "genre"})[0].text)
game_detail: Game = Game()
release_year = 0
release_month = 0
release_day = 0
release_date = ''
if re.search('<b>Lançamento:</b>([^,]+)<br/>', str(data_sheet[0])) is not None:
released = re.search('<b>Lançamento:</b>([^,]+)<br/>', str(data_sheet[0])).group(1)
result = re.findall(r'\d+', released)[0]
release_day = int(result) if len(result) == 2 else 0
release_year = result if len(result) == 4 else re.findall(r'\d+', released)[1]
if re.search(r'(?<=de)([\S\s]*)(?=de)', released) is not None:
release_month = Utils.month_converter(re.search(r'(?<=de)([\S\s]*)(?=de)', released).group(1).replace(' ', ''))
if release_month > 0 and release_day > 0:
release_date = maya.parse(f'{release_year}-{release_month}-{release_day}').datetime()
game_uuid = f'{reference_id} - {title} - {owner} - {publisher} - {release_year}'
game_detail.game_id = str(uuid.uuid5(uuid.NAMESPACE_URL, game_uuid))
game_detail.reference_id = reference_id
game_detail.title = title
game_detail.console_code = str(uuid.uuid5(uuid.NAMESPACE_URL, console_code))
game_detail.release_date = release_date
game_detail.release_year = release_year
game_detail.cover_image = base64.b64encode(requests.get(f"https://jogorama.com.br/thumbr.php?l=180&a=400&img=capas/{reference_id}.jpg").content)
game_detail.owner = owner
game_detail.publisher = publisher
game_detail.genre = genre
#print(f'{reference_id} - {title} - {owner} - {publisher} - {genre} - {release_date} - {release_year}')
return game_detail
@staticmethod
def list_of_games_by_plataform(console_code: str = None) -> List[ConsoleGames]:
page = ScraperServices.__html_result(f'https://jogorama.com.br/jogos/{console_code}/lista-de-jogos/')
rows_games = page.select('.lista')
glst = []
for r in rows_games:
consoles_games = r.select('li')
for c in consoles_games:
for z in c.find_all('a', href=True):
game_list: ConsoleGames = ConsoleGames()
game_list.console_code = console_code
game_list.reference_id = z['href'].split("/")[3]
game_list.title = z['title'].strip().replace("'", " ")
game_list.link = z['href']
#print(game_list.to_json())
glst.append(game_list.to_json())
return glst
@staticmethod
def list_of_console() -> List[ConsolePlataform]:
# TODO: move this URL to a config setting
page = ScraperServices.__html_result('https://jogorama.com.br/')
rows = page.select('.menu')
plt = []
clt = []
i = 0
for a in rows:
i += 1
if i == 2:
consoles = a.select('li') # a.select('')
for b in consoles:
console: ConsolePlataform = ConsolePlataform()
console.console_plataform_name = b.select_one('a').text.strip()
console.console_plataform_code = b.select_one('a').text.strip().replace(' ', '-').lower()
console.console_uuid = str(uuid.uuid5(uuid.NAMESPACE_URL, b.select_one('a').text.strip().replace(' ', '-').lower()))
clt.append(console)
plt.append(console.to_json())
return clt, plt
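# --- Illustrative addition (not part of the original module) ---
# A minimal usage sketch of ScraperServices, assuming jogorama.com.br is reachable.
# Each entry returned by list_of_games_by_plataform can then be passed to
# game_details(link, reference_id, console_code) for the full record.
if __name__ == '__main__':
    consoles, consoles_json = ScraperServices.list_of_console()
    print(f'found {len(consoles)} console platforms')
    if consoles:
        code = consoles[0].console_plataform_code
        games = ScraperServices.list_of_games_by_plataform(code)
        print(f'{code}: {len(games)} games listed')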
| 47.9 | 152 | 0.596868 | 601 | 4,790 | 4.585691 | 0.237937 | 0.043541 | 0.027213 | 0.036284 | 0.268505 | 0.231858 | 0.170537 | 0.145864 | 0.105951 | 0.071118 | 0 | 0.013056 | 0.248434 | 4,790 | 99 | 153 | 48.383838 | 0.7525 | 0.034864 | 0 | 0.045455 | 0 | 0.011364 | 0.129249 | 0.032042 | 0 | 0 | 0 | 0.010101 | 0 | 1 | 0.045455 | false | 0 | 0.147727 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edc9550648c671816c3a908aa883c11e67ae1921 | 1,373 | py | Python | badwolf/deploy/providers/pypi.py | messense/badwolf | c693785af101f68505769cd712bbf13e37423587 | [
"MIT"
] | 94 | 2016-09-19T08:40:47.000Z | 2021-07-03T12:55:11.000Z | badwolf/deploy/providers/pypi.py | messense/badwolf | c693785af101f68505769cd712bbf13e37423587 | [
"MIT"
] | 32 | 2016-09-25T07:22:53.000Z | 2018-12-20T08:13:19.000Z | badwolf/deploy/providers/pypi.py | messense/badwolf | c693785af101f68505769cd712bbf13e37423587 | [
"MIT"
] | 15 | 2016-09-20T08:14:40.000Z | 2021-03-23T07:59:09.000Z | # -*- coding: utf-8 -*-
import logging
from badwolf.utils import run_command
from badwolf.deploy.providers import Provider
logger = logging.getLogger(__name__)
class PypiProvider(Provider):
name = 'pypi'
def deploy(self):
username = self.config.get('username')
password = self.config.get('password')
repository = self.config.get('repository')
command = [
'twine',
'upload',
]
if repository:
command.extend([
'--repository',
repository,
'--repository-url',
repository
])
if username:
command.extend([
'--username',
username
])
if password:
command.extend([
'--password',
password,
])
command.extend([
'--skip-existing',
self.config['distributions']
])
exit_code, output = run_command(
command,
include_errors=True,
cwd=self.working_dir,
env=self.context.environment
)
return exit_code == 0, output
def url(self):
pkg_name = self.config.get('package', self.context.repo_name)
return '{}/pypi/{}'.format(self.config['repository'], pkg_name)
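# --- Illustrative addition (not part of the original module) ---
# A hedged sketch of the deploy configuration this provider reads. With a config like
# this, deploy() shells out to:
#   twine upload --repository <url> --repository-url <url> --username <u>
#       --password <p> --skip-existing dist/*
# 'package' is only used by url() to build the link reported after deployment.
# All values below are made-up examples.
EXAMPLE_PYPI_CONFIG = {
    'provider': 'pypi',
    'repository': 'https://upload.pypi.org/legacy/',
    'username': 'badwolf-bot',
    'password': 'secret',
    'distributions': 'dist/*',
    'package': 'badwolf',
}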
| 25.425926 | 71 | 0.504734 | 118 | 1,373 | 5.762712 | 0.440678 | 0.088235 | 0.076471 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00235 | 0.380189 | 1,373 | 53 | 72 | 25.90566 | 0.79671 | 0.015295 | 0 | 0.177778 | 0 | 0 | 0.106667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044444 | false | 0.088889 | 0.066667 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
edc95e42eea0407cdb0977716b363f03dc484545 | 15,706 | py | Python | lib/filewatcher/filewatchconfig.py | yinyin/FileWatcher | c415b201eaaeb2ab84d0a6ee4f6a290aeaa4eacf | [
"MIT"
] | null | null | null | lib/filewatcher/filewatchconfig.py | yinyin/FileWatcher | c415b201eaaeb2ab84d0a6ee4f6a290aeaa4eacf | [
"MIT"
] | null | null | null | lib/filewatcher/filewatchconfig.py | yinyin/FileWatcher | c415b201eaaeb2ab84d0a6ee4f6a290aeaa4eacf | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
""" 設定檔相關物件與共用函式定義 """
import os
import re
import time
import datetime
import yaml
from filewatcher import metadatum
class WatcherConfiguration(object):
""" global configuration """
def __init__(self, target_directory, recursive_watch, remove_unoperate_file, meta_db_path, meta_reserve_day_duplicatecheck, meta_reserve_day_missingcheck):
""" 建構子
參數:
target_directory - 要監視的目錄路徑
recursive_watch - 是否遞迴監視子目錄
remove_unoperate_file - 是否要移除規則檢查成功但因各種原因而未執行作業的檔案
meta_db_path - Meta 資料庫檔案路徑
meta_reserve_day_duplicatecheck - 重複檔案檢查資訊留存天數
meta_reserve_day_missingcheck - 已刪除檔案檢查資訊留存天數
"""
super(WatcherConfiguration, self).__init__()
self.target_directory = target_directory
self.recursive_watch = recursive_watch
self.remove_unoperate_file = remove_unoperate_file
self.meta_db_path = meta_db_path
self.meta_reserve_day_duplicatecheck = meta_reserve_day_duplicatecheck
self.meta_reserve_day_missingcheck = meta_reserve_day_missingcheck
self.metadb = None
self._setup_meta_db()
# ### def __init__
def _setup_meta_db(self):
""" 當指定了 Meta 資料庫檔案路徑時,建立 Meta 資料庫物件 """
if self.meta_db_path is not None:
self.metadb = metadatum.MetaStorage(self.meta_db_path, self.meta_reserve_day_duplicatecheck, self.meta_reserve_day_missingcheck)
# ### def _setup_meta_db
# ### class WatcherConfiguration
class OperationEntry(object):
""" configuration of operation """
def __init__(self, opname, argv, opmodule):
""" 建構子
參數:
opname - 作業名稱
argv - 作業參數字串
opmodule - 執行作業的模組
"""
super(OperationEntry, self).__init__()
self.opname = opname
self.argv = argv
self.opmodule = opmodule
# ### __init__
# ### class OperationEntry
class MonitorEntry(object):
""" monitor configuration """
def __init__(self, file_regex, path_regex, do_dupcheck, operation_update, operation_remove, process_as_uniqname=True, content_check_label=None, ignorance_checker=None):
""" 建構子
參數:
file_regex - 檔名正規表示式
path_regex - 路徑 (相對於 target_directory) 正規表示式
do_dupcheck - 是否進行重複性檢查
operation_update - 存有作業設定的串列 (針對新增或修改檔案)
operation_remove - 存有作業設定的串列 (針對移出或刪除檔案)
process_as_uniqname - 是否使用唯一檔名進行後續作業 (需要目錄的 write 權限)
content_check_label - 是否在進行重複性比對作業時使用指定的字串來覆蓋掉檔名 (不同檔名視為同一筆檔案)
ignorance_checker - 檢查所找到的目錄或檔案是否要忽略
"""
super(MonitorEntry, self).__init__()
self.file_regex = re.compile(file_regex)
self.path_regex = None if (path_regex is None) else re.compile(path_regex)
self.operation_update = operation_update
self.operation_remove = operation_remove
self.process_as_uniqname = process_as_uniqname
self.do_dupcheck = False
self.content_check_label = None
if do_dupcheck:
self.do_dupcheck = True
self.content_check_label = content_check_label
self.ignorance_checker = ignorance_checker
# ### __init__
# ### class MonitorEntry
class TimeInterval(object):
""" 僅含小時與分 (HH:MM) 的時間區間 """
def __init__(self, time_start, time_end):
""" 建構子
參數:
time_start - 起始時間
time_end - 終止時間
"""
super(TimeInterval, self).__init__()
ts = time.strptime(time_start, "%H:%M")
time_start = datetime.timedelta(minutes=ts.tm_min, hours=ts.tm_hour)
ts = time.strptime(time_end, "%H:%M")
time_end = datetime.timedelta(minutes=ts.tm_min, hours=ts.tm_hour)
if time_start < time_end:
self.time_start = time_start.total_seconds()
self.time_end = time_end.total_seconds()
else:
self.time_start = time_end.total_seconds()
self.time_end = time_start.total_seconds()
# ### __init__
def isIn(self, t):
""" 給定的時戳是否落在這個時間區間之內
參數:
t - 時戳 (number, time.timedelta)
回傳值:
True - 是在區間內
False - 否
"""
# {{{ convert format into seconds from mid-night
if isinstance(t, (int, long, float,)):
if t >= 86400:
t = t % 86400
elif isinstance(t, datetime.timedelta):
t = t.total_seconds()
elif isinstance(t, (datetime.datetime, datetime.time)):
t = datetime.timedelta(seconds=t.second, minutes=t.minute, hours=t.hour)
t = t.total_seconds()
# }}} convert format into seconds from mid-night
if (self.time_start <= t) and (self.time_end >= t):
return True
else:
return False
# ### def isIn
# ### class TimeInterval
_ignorance_checker_list = {}
def register_ignorance_checker(name, checker):
""" 註冊忽略路徑與檔案檢查器,由針對專案客製化的程式載入器呼叫
參數:
name - 要註冊的名字
checker - 進行路徑與檔案名稱檢查的函式,函數原型: (relpath=None, filename=None) 回傳 True 表要忽略所檢查的項目
"""
global _ignorance_checker_list
_ignorance_checker_list[name] = checker
# ### def register_ignorance_checker
def lookup_ignorance_checker(name):
""" 找尋以指定名稱註冊的檢查器
參數:
name - 要找尋的名字
回傳值:
檢查器,或是 None
"""
name = name.strip()
if len(name) < 1:
return None
return _ignorance_checker_list.get(name, None)
# ### def lookup_ignorance_checker
def _to_bool(v, default_value=False):
""" 將傳入的參數值轉換為 bool
參數:
v - 要轉換的參數值
回傳值:
True 或 False
"""
if isinstance(v, bool):
return v
elif isinstance(v, (int, long, float,)):
# non-zero numeric values count as True, zero as False
if int(v) == 0:
return False
else:
return True
elif isinstance(v, (str, unicode,)):
if len(v) >= 1:
return (True if (str(v[0]) in ('y', 'Y', 't', 'T',)) else False)
return default_value
# ### def _to_bool
def _load_config_impl_globalconfig(configMap):
""" 讀取全域設定資訊
參數:
configMap - 設定值資訊字典
回傳值:
WatcherConfiguration 物件
"""
target_directory = configMap['target_directory']
if (target_directory is None) or (False == os.path.isdir(target_directory)):
return None
target_directory = os.path.abspath(target_directory)
# set 'recursive_watch'
recursive_watch = _to_bool(configMap.get('recursive_watch', False), default_value=False)
# set 'remove_unoperate_file'
remove_unoperate_file = _to_bool(configMap.get('remove_unoperate_file', False), default_value=False)
# {{{ load meta storage options
meta_db_path = None
meta_reserve_day_duplicatecheck = 3
meta_reserve_day_missingcheck = 2
if ('meta' in configMap) and isinstance(configMap['meta'], dict):
meta_cfg = configMap['meta']
meta_db_path = meta_cfg['db_path']
meta_reserve_day_duplicatecheck = max(int(meta_cfg.get('duplicate_check_reserve_day', 3)), 1)
meta_reserve_day_missingcheck = max(int(meta_cfg.get('missing_detect_reserve_day', 2)), 1)
# }}} load meta storage options
global_config = WatcherConfiguration(target_directory, recursive_watch, remove_unoperate_file, meta_db_path, meta_reserve_day_duplicatecheck, meta_reserve_day_missingcheck)
return global_config
# ### _load_config_impl_globalconfig
def _load_config_impl_moduleconfig(configMap, config_reader, global_config):
""" 讀取模組設定資訊
參數:
configMap - 設定值資訊字典
config_reader - 以「模組的設定名稱」為鍵「模組實體」為值的 dict 結構體
global_config - 全域設定值
回傳值:
(無)
"""
for mod_cfgname, mod_object in config_reader.iteritems():
if mod_cfgname in configMap:
m = mod_object.get_module_prop()
cfg_content = configMap[mod_cfgname]
if m.isMonitor:
mod_object.monitor_configure(cfg_content, global_config.metadb)
if m.isOperator:
mod_object.operator_configure(cfg_content, global_config.metadb)
# ### _load_config_impl_moduleconfig
def _load_config_impl_watchentries_operation(operation_cfg, operation_deliver, operation_schedule_seq, operation_run_seq):
""" 載入 watch entries 的 operation 作業設定
參數:
operation_cfg - 作業設定
operation_deliver - 以作業名稱為 key 監視工作模組為 value 的字典
operation_schedule_seq - 排定作業塊先後順序用的作業名稱串列
operation_run_seq - 排定作業執行先後順序用的作業名稱串列
回傳值:
含有 OperationEntry 物件的串列
"""
ordered_operation_block = []
# {{{ sort the operation blocks according to operation_schedule_seq
remain_oprcfg = operation_cfg
#print ">>> remaiS %r" % (remain_oprcfg,)
for opname in operation_schedule_seq:
#print ">>> %r" % (opname,)
next_remain_oprcfg = []
for oprblk_cfg in remain_oprcfg:
if opname in oprblk_cfg:
ordered_operation_block.append(oprblk_cfg)
else:
next_remain_oprcfg.append(oprblk_cfg)
if len(next_remain_oprcfg) < 1:
remain_oprcfg = None
break
remain_oprcfg = next_remain_oprcfg
if remain_oprcfg is not None:
ordered_operation_block.extend(remain_oprcfg)
remain_oprcfg = None
#print ">>> remain %r" % (remain_oprcfg,)
# }}} sort the operation blocks according to operation_schedule_seq
organized_operation_block = []
# {{{ sort the operations within each operation block according to operation_run_seq
for oprblk_cfg in ordered_operation_block:
oprblock = []
for opname in operation_run_seq:
if opname in oprblk_cfg:
oparg = operation_deliver[opname].read_operation_argv(oprblk_cfg[opname])
if oparg is not None:
oprblock.append(OperationEntry(opname, oparg, operation_deliver[opname]))
if len(oprblock) > 0:
organized_operation_block.append(oprblock)
# }}} sort the operations within each operation block according to operation_run_seq
return organized_operation_block
# ### def _load_config_impl_watchentries_updateoprn
def _load_config_impl_watchentries(watch_entries_cfg, operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq):
""" 載入檔案規則設定
參數:
watch_entries_cfg - 檔案監看規則設定
operation_deliver - 以作業名稱為 key 監視工作模組為 value 的字典
operation_schedule_seq - 排定作業塊先後順序用的作業名稱串列
operation_run_newupdate_seq - 排定作業執行先後順序用的作業名稱串列
回傳值:
含有 OperationEntry 物件的串列
"""
watch_entries = []
for entry_cfg in watch_entries_cfg:
try:
file_regex = str(entry_cfg['file_regex'])
path_regex = None
if 'path_regex' in entry_cfg:
path_regex = str(entry_cfg['path_regex'])
do_dupcheck = _to_bool(entry_cfg.get('duplicate_check', False), default_value=False)
content_check_label = None
if (True == do_dupcheck) and ('duplicate_content_check_label' in entry_cfg):
v = str(entry_cfg['duplicate_content_check_label'])
v = v.strip()
if len(v) > 0:
content_check_label = v
process_as_uniqname = _to_bool(entry_cfg.get('process_as_uniqname', True), default_value=True)
ignorance_checker = None
if 'ignorance-checker' in entry_cfg:
ignorance_checker = lookup_ignorance_checker(str(entry_cfg['ignorance-checker']))
# {{{ load operations
operation_update = None
if 'update-operation' in entry_cfg:
operation_update = _load_config_impl_watchentries_operation(entry_cfg['update-operation'], operation_deliver, operation_schedule_seq, operation_run_newupdate_seq)
elif 'operation' in entry_cfg:
operation_update = _load_config_impl_watchentries_operation(entry_cfg['operation'], operation_deliver, operation_schedule_seq, operation_run_newupdate_seq)
operation_remove = None
if 'remove-operation' in entry_cfg:
operation_remove = _load_config_impl_watchentries_operation(entry_cfg['remove-operation'], operation_deliver, operation_schedule_seq, operation_run_dismiss_seq)
# }}} load operations
entryobj = MonitorEntry(file_regex, path_regex, do_dupcheck, operation_update, operation_remove, process_as_uniqname, content_check_label, ignorance_checker)
watch_entries.append(entryobj)
except:
print "Failed on loading watch entry: %r" % (entry_cfg,)
raise
return watch_entries
# ### def _load_config_impl_watchentries
def _load_config_impl_import_external_watchentries(watch_entries, external_files, base_folder_path, operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq):
for external_cfg_file in external_files:
expanded_external_cfg_path = os.path.join(base_folder_path, external_cfg_file)
with open(expanded_external_cfg_path, 'r') as fp:
ext_watchentries_cmap = yaml.load(fp)
ext_watch_entries = _load_config_impl_watchentries(ext_watchentries_cmap.get('watching_entries', ()), operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq)
if len(ext_watch_entries) > 0:
watch_entries.extend(ext_watch_entries)
# ### def _load_config_impl_import_external_watchentries
def load_config(config_filename, config_reader, operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq):
"""" 讀取設定檔內容
參數:
config_filename - 設定檔檔名
config_reader - 以「模組的設定名稱」為鍵「模組實體」為值的 dict 結構體
operation_deliver - 以作業名稱為 key 監視工作模組為 value 的字典
operation_schedule_seq - 排定作業塊先後順序用的作業名稱串列
operation_run_newupdate_seq - 排定作業執行先後順序用的作業名稱串列 (針對檔案新增或修改事件)
operation_run_dismiss_seq - 排定作業執行先後順序用的作業名稱串列 (針對檔案刪除或移出事件)
傳回值:
(global_config, watch_entries,) - 存放 WatcherConfiguration 物件 (global_config) 及 MonitorEntry 物件 list (watch_entries) 的 tuple
"""
with open(config_filename, 'r') as fp:
configMap = yaml.load(fp)
# Global Configuration
global_config = _load_config_impl_globalconfig(configMap)
if global_config is None:
return None
# Module Configuration
_load_config_impl_moduleconfig(configMap, config_reader, global_config)
# Watch Entries
watch_entries = _load_config_impl_watchentries(configMap.get('watching_entries', ()),
operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq)
# Import Watch Entries
_load_config_impl_import_external_watchentries(watch_entries,
configMap.get('import_watching_entries_from', ()), os.path.dirname(config_filename),
operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq)
return (global_config, watch_entries,)
# ### def load_config
def _keyfunction_prop_schedule(prop):
""" 排序用 key function (componentprop.OperatorProp 物件,針對 schedule_priority) """
return prop.schedule_priority
# ### def __keyfunction_schedule
def _keyfunction_prop_run(prop):
""" 排序用 key function (componentprop.OperatorProp 物件,針對 run_priority) """
return prop.run_priority
# ### def _keyfunction_prop_run
def get_module_interfaces(mods):
""" 取得工作模組的屬性,並依此設定相關參數
參數:
mods - 含有工作模組的串列
回傳值:
含有以下元素的 tuple: (config_readers, monitor_implement, operation_deliver, operation_schedule_seq, operation_run_seq)
config_readers - 以設定檔段落名稱為 key 工作模組為 value 的字典
monitor_implement - 以監視工作模組名稱 (同設定檔段落名稱) 作為 key 監視模組為 value 的字典
operation_deliver - 以作業名稱為 key 監視工作模組為 value 的字典
operation_schedule_seq - 排定作業塊先後順序用的作業名稱串列
operation_run_newupdate_seq - 排定作業執行先後順序用的作業名稱串列 (針對檔案新增或修改事件)
operation_run_dismiss_seq - 排定作業執行先後順序用的作業名稱串列 (針對檔案刪除或移出事件)
"""
config_readers = {}
monitor_implement = {}
operation_deliver = {}
operation_schedule_seq = []
operation_run_newupdate_seq = []
operation_run_dismiss_seq = []
for m in mods:
prop = m.get_module_prop()
config_sec_name = prop.module_name
config_readers[config_sec_name] = m
# {{{ if module is monitor module
if prop.isMonitor:
monitor_name = prop.module_name
monitor_implement[monitor_name] = m
# }}} if module is monitor module
# {{{ if module is operator module
if prop.isOperator:
operation_name = prop.operation_name
if prop.run_priority is not None: # only push into dict. if run-priority is available
operation_deliver[operation_name] = m
operation_run_newupdate_seq.append(prop)
if prop.schedule_priority is not None:
operation_schedule_seq.append(prop)
if prop.handle_dismiss:
operation_run_dismiss_seq.append(prop)
# }}} if module is operator module
# {{{ sort operator module lists
operation_schedule_seq = [x.operation_name for x in sorted(operation_schedule_seq, key=_keyfunction_prop_schedule)] # e.g.: operation blocks containing copy should run before blocks containing move
operation_run_newupdate_seq = [x.operation_name for x in sorted(operation_run_newupdate_seq, key=_keyfunction_prop_run)] # e.g.: copy or move should run before coderunner
operation_run_dismiss_seq = [x.operation_name for x in sorted(operation_run_dismiss_seq, key=_keyfunction_prop_run)]
# }}} sort operator module lists
return (config_readers, monitor_implement, operation_deliver, operation_schedule_seq, operation_run_newupdate_seq, operation_run_dismiss_seq,)
# ### def get_module_interfaces
# vim: ts=4 sw=4 ai nowrap
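# --- Illustrative addition (not part of the original module) ---
# A hedged sketch of the YAML layout that load_config() expects. The key names are
# taken from the loader code above; the values, operation names and file names are
# made-up examples. Module-specific sections (keyed by module name) and the argument
# format of each operation are defined by the loaded modules, not by this file.
EXAMPLE_CONFIG_YAML = """
target_directory: /var/spool/incoming
recursive_watch: yes
remove_unoperate_file: no
meta:
  db_path: /var/lib/filewatcher/meta.db
  duplicate_check_reserve_day: 3
  missing_detect_reserve_day: 2
watching_entries:
  - file_regex: '.*\\.csv$'
    path_regex: 'daily/.*'
    duplicate_check: yes
    process_as_uniqname: yes
    operation:
      - copy: 'arguments interpreted by the copy operator module'
import_watching_entries_from:
  - extra_entries.yaml
"""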
| 31.60161 | 202 | 0.768687 | 2,068 | 15,706 | 5.461799 | 0.161509 | 0.038247 | 0.040726 | 0.033997 | 0.39761 | 0.344577 | 0.29606 | 0.288181 | 0.229482 | 0.199026 | 0 | 0.002001 | 0.141093 | 15,706 | 496 | 203 | 31.665323 | 0.835285 | 0.096969 | 0 | 0.084071 | 0 | 0 | 0.043053 | 0.014625 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.039823 | null | null | 0.004425 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edd15c37e43dc01593a312943e481b1e92f7c6cb | 3,696 | py | Python | src/spark/io/stubs.py | jbalint/spark | caccf1cd9122dd4a7dc0f26a57ee4a649056aa6f | [
"CNRI-Jython"
] | 1 | 2015-05-21T20:00:12.000Z | 2015-05-21T20:00:12.000Z | src/spark/io/stubs.py | jbalint/spark | caccf1cd9122dd4a7dc0f26a57ee4a649056aa6f | [
"CNRI-Jython"
] | null | null | null | src/spark/io/stubs.py | jbalint/spark | caccf1cd9122dd4a7dc0f26a57ee4a649056aa6f | [
"CNRI-Jython"
] | null | null | null | #*****************************************************************************#
#* Copyright (c) 2004-2008, SRI International. *#
#* All rights reserved. *#
#* *#
#* Redistribution and use in source and binary forms, with or without *#
#* modification, are permitted provided that the following conditions are *#
#* met: *#
#* * Redistributions of source code must retain the above copyright *#
#* notice, this list of conditions and the following disclaimer. *#
#* * Redistributions in binary form must reproduce the above copyright *#
#* notice, this list of conditions and the following disclaimer in the *#
#* documentation and/or other materials provided with the distribution. *#
#* * Neither the name of SRI International nor the names of its *#
#* contributors may be used to endorse or promote products derived from *#
#* this software without specific prior written permission. *#
#* *#
#* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS *#
#* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT *#
#* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR *#
#* A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT *#
#* OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, *#
#* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT *#
#* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, *#
#* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY *#
#* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT *#
#* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE *#
#* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. *#
#*****************************************************************************#
#* "$Revision:: 26 $" *#
#* "$HeadURL:: https://svn.ai.sri.com/projects/spark/trunk/spark/src/spar#$" *#
#*****************************************************************************#
from spark.internal.version import *
def ask(message, args):
next = raw_input('foobar?')
print "received answer %s" % (next)
return next
# This is a stub for sending a message to the user through MMD
def inform(message, args):
print message%args
return True
# This is also an MMD stub
def inform_other(name, message, args):
print "Sending message to %s" % (name)
print message%args
return True
# This is a stub for requesting information from the user through MMD
def request1(question, options):
print question
# print "Options: %s" % (options)
return "ibook"
# This is a stub for requesting information from the user through MMD
def request2(question):
print question
return "yes"
# This is a really simple stub for sending a query to CA
def obtain(item, requirements):
print "Sending query to CA (fake results for now)"
quotes = ("ibook", "thinkpad", "notebook")
return quotes
# Not sure how this will normally be done... CA?
def fill_slot(form, slot):
print "Filling slot: %s (need KB query)" % (slot)
return "Owara" # Every slot gets filled in with "Owara"....
| 52.056338 | 80 | 0.559524 | 406 | 3,696 | 5.086207 | 0.482759 | 0.026634 | 0.013559 | 0.015981 | 0.190799 | 0.15109 | 0.15109 | 0.123002 | 0.123002 | 0.123002 | 0 | 0.00463 | 0.298701 | 3,696 | 70 | 81 | 52.8 | 0.792052 | 0.754329 | 0 | 0.24 | 0 | 0 | 0.202086 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.04 | null | null | 0.32 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edd295ef1c3dfcb502f8ddda52936db7370d0e3f | 448 | py | Python | tws_game/base_classes.py | tomreitsma/the-weird-science | 1aaa4f42852fa82c99d69df9a6b0448f054583d9 | [
"MIT"
] | 1 | 2016-02-20T23:44:53.000Z | 2016-02-20T23:44:53.000Z | tws_game/base_classes.py | tomreitsma/the-weird-science | 1aaa4f42852fa82c99d69df9a6b0448f054583d9 | [
"MIT"
] | null | null | null | tws_game/base_classes.py | tomreitsma/the-weird-science | 1aaa4f42852fa82c99d69df9a6b0448f054583d9 | [
"MIT"
] | null | null | null | from uuid import uuid4
class BaseLobby(object):
id = None
clients = []
game = None
factory = None
def __init__(self):
self.id = str(uuid4())
# per-instance client list, so lobbies do not share the mutable class-level default
self.clients = []
def set_factory(self, factory):
self.factory = factory
class BaseGame(object):
factory = []
commands = {
}
def handle_command(self, client, command, data):
if command in self.commands:
self.commands[command](client, data)
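# --- Illustrative addition (not part of the original module) ---
# A minimal sketch of how BaseGame.handle_command dispatches: subclasses register
# handlers in 'commands' keyed by command name. The class and command below are
# made up for illustration.
class EchoGame(BaseGame):
    def __init__(self):
        self.commands = {'echo': self.on_echo}

    def on_echo(self, client, data):
        print('client %r said %r' % (client, data))


if __name__ == '__main__':
    EchoGame().handle_command(client=None, command='echo', data='hello')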
| 16.592593 | 52 | 0.595982 | 51 | 448 | 5.117647 | 0.490196 | 0.084291 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006369 | 0.299107 | 448 | 26 | 53 | 17.230769 | 0.824841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0 | 0.058824 | 0 | 0.705882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
edd5d196bcf7dc43acd4440a4a14048214ab8a01 | 3,557 | py | Python | bash_source/bacsort/pairwise_identities_to_distance_matrix.py | jwang45/biosuite | e380951d01dff39a5c5d98c2c9e345f52e12b429 | [
"MIT"
] | null | null | null | bash_source/bacsort/pairwise_identities_to_distance_matrix.py | jwang45/biosuite | e380951d01dff39a5c5d98c2c9e345f52e12b429 | [
"MIT"
] | null | null | null | bash_source/bacsort/pairwise_identities_to_distance_matrix.py | jwang45/biosuite | e380951d01dff39a5c5d98c2c9e345f52e12b429 | [
"MIT"
] | 2 | 2021-03-28T01:33:56.000Z | 2022-01-03T22:35:42.000Z | #!/usr/bin/env python3
"""
Copyright 2018 Ryan Wick (rrwick@gmail.com)
https://github.com/rrwick/Bacsort
This script uses FastANI output to generate a PHYLIP distance matrix suitable for quicktree.
This file is part of Bacsort. Bacsort is free software: you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the Free Software Foundation,
either version 3 of the License, or (at your option) any later version. Bacsort is distributed in
the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
details. You should have received a copy of the GNU General Public License along with Bacsort. If
not, see <http://www.gnu.org/licenses/>.
"""
import argparse
import sys
def get_arguments():
parser = argparse.ArgumentParser(description='Distance matrix from pairwise identities')
parser.add_argument('identities', type=str,
help='FastANI output file (or similarly formatted file with three '
'whitespace-delimited columns of assembly 1, assembly 2, percent '
'identity')
parser.add_argument('--max_dist', type=float, required=False, default=1.0,
help='Maximum allowed genomic distance')
args = parser.parse_args()
return args
def main():
args = get_arguments()
clusters = set()
distances = {}
print('', file=sys.stderr)
print('Convert FastANI distances to PHYLIP matrix', file=sys.stderr)
print('------------------------------------------------', file=sys.stderr)
fastani_output_filename = args.identities
with open(fastani_output_filename, 'rt') as fastani_output:
for line in fastani_output:
parts = line.strip().split()
cluster_1 = parts[0]
cluster_2 = parts[1]
ani = float(parts[2])
if cluster_1 == cluster_2:
distance = 0.0
else:
distance = 1.0 - (ani / 100.0)
clusters.add(cluster_1)
clusters.add(cluster_2)
add_distance(distances, cluster_1, cluster_2, distance)
add_distance(distances, cluster_2, cluster_1, distance)
print('Found {} clusters and {} distances'.format(len(clusters), len(distances)),
file=sys.stderr)
print(len(clusters))
clusters = sorted(clusters)
for i in clusters:
print(i, end='')
for j in clusters:
print('\t', end='')
try:
distance = distances[(i, j)]
except KeyError:
distance = args.max_dist
if distance > args.max_dist:
distance = args.max_dist
print('%.6f' % distance, end='')
print()
print('', file=sys.stderr)
def add_distance(distances, cluster_1, cluster_2, distance):
# If this is the first time we've seen this pair, then we just add it to the dictionary.
if (cluster_1, cluster_2) not in distances:
distances[(cluster_1, cluster_2)] = distance
# If we've seen this pair before (the other way around), then we make sure the distances are
# close (sanity check) and then save the mean distance.
else:
assert abs(distance - distances[(cluster_1, cluster_2)]) < 0.1
distances[(cluster_1, cluster_2)] = (distances[(cluster_1, cluster_2)] + distance) / 2.0
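# --- Illustrative addition (not part of the original script) ---
# Worked example of the conversion performed above, with made-up ANI values: each pair
# is turned into a distance of 1 - ANI/100, and the two directions are averaged.
def _example_distance():
    ani_ab, ani_ba = 98.70, 98.50       # FastANI reports both directions separately
    d_ab = 1.0 - ani_ab / 100.0         # 0.013
    d_ba = 1.0 - ani_ba / 100.0         # 0.015
    return (d_ab + d_ba) / 2.0          # 0.014, the value written to the PHYLIP matrix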
if __name__ == '__main__':
main()
| 37.442105 | 97 | 0.634523 | 461 | 3,557 | 4.791757 | 0.403471 | 0.039837 | 0.054323 | 0.057945 | 0.169307 | 0.111815 | 0.056587 | 0.039837 | 0 | 0 | 0 | 0.018237 | 0.260051 | 3,557 | 94 | 98 | 37.840426 | 0.821049 | 0.295193 | 0 | 0.101695 | 1 | 0 | 0.145833 | 0.019231 | 0 | 0 | 0 | 0 | 0.016949 | 1 | 0.050847 | false | 0 | 0.033898 | 0 | 0.101695 | 0.169492 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edda94634a39ff6afa273e96315420641fb859a8 | 1,476 | py | Python | Python/Robot.py | zdynamics/zRobotics | aed0beac327710f319b996a34b42df37440d33cd | [
"MIT"
] | 2 | 2021-01-19T10:21:48.000Z | 2021-02-15T14:55:12.000Z | Python/Robot.py | zdynamics/zRobotics | aed0beac327710f319b996a34b42df37440d33cd | [
"MIT"
] | null | null | null | Python/Robot.py | zdynamics/zRobotics | aed0beac327710f319b996a34b42df37440d33cd | [
"MIT"
] | null | null | null | import DenavitHartenberg as dh
import Kinematics as k
import numpy as np
import matplotlib.pyplot as plt
import Movements as mv
from mpl_toolkits.mplot3d import Axes3D
from matplotlib.animation import FuncAnimation
from sympy import *
class System:
def __init__(self, jointsPositions, linksLengths, centersOfMass, dhParameters = [], symbolicDHParameters = [], dhParametersCOM = [], symbolicDHParametersCOM = [], xi = [], xid = [], name = ''):
"""
Robot's constructor
jointsPositions: np.array (two - dimensional)
linksLengths: list (one - dimensional)
centersOfMass: list (one - dimensional)
dhParameters, symbolicDHParameters: Denavit - Hartenberg parameters (optional)
dhParametersCOM, symbolicDHParametersCOM: Denavit - Hartenberg parameters for the centers of mass (optional)
xi, xid: actuation axes (optional)
name: string (optional)
"""
# Robot's name
self.name = name
# Kinematic Parameters
self.jointsPositions = jointsPositions
self.linksLengths = linksLengths
self.centersOfMass = centersOfMass
# Actuation axes
self.xi = xi
self.xid = xid
# Denavit - Hartenberg Parameters
self.dhParameters = dhParameters
self.symbolicDHParameters = symbolicDHParameters
self.dhParametersCOM = dhParametersCOM
self.symbolicDHParametersCOM = symbolicDHParametersCOM
# Symbolic parameters
self.symbolicJointsPositions = np.array([[Symbol(f"q{i + 1}"),] for i in range(jointsPositions.shape[0])])
self.symbolicLinksLengths = [Symbol(f"l{i + 1}") for i in range(len(linksLengths))]
self.symbolicCentersOfMass = [Symbol(f"lcom{i + 1}") for i in range(len(linksLengths))]
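# --- Illustrative addition (not part of the original module) ---
# A minimal instantiation sketch of System; joint and link values are made up and the
# optional Denavit - Hartenberg parameter lists are left at their defaults.
if __name__ == '__main__':
    joints = np.array([[0.0], [np.pi / 4], [0.0], [np.pi / 2]])  # 4 x 1 joint values
    links = [0.30, 0.25, 0.25, 0.10]                             # link lengths
    coms = [0.15, 0.125, 0.125, 0.05]                            # centers of mass
    robot = System(joints, links, coms, name='example arm')
    print(robot.name, robot.symbolicJointsPositions.T)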
| 35.142857 | 195 | 0.705285 | 153 | 1,476 | 6.771242 | 0.45098 | 0.040541 | 0.014479 | 0.017375 | 0.066602 | 0.066602 | 0.054054 | 0.054054 | 0 | 0 | 0 | 0.005076 | 0.199187 | 1,476 | 42 | 196 | 35.142857 | 0.871404 | 0.182927 | 0 | 0 | 0 | 0 | 0.023256 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.347826 | 0 | 0.434783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
eddb2b8cf0ca4625b7ca1cfc397fac955e5bf784 | 901 | py | Python | src/spaceone/inventory/libs/manager.py | jihyungSong/plugin-azure-power-state | d66bd5dfafa01659c877da11c0d18de6e55cb5ab | [
"Apache-2.0"
] | null | null | null | src/spaceone/inventory/libs/manager.py | jihyungSong/plugin-azure-power-state | d66bd5dfafa01659c877da11c0d18de6e55cb5ab | [
"Apache-2.0"
] | null | null | null | src/spaceone/inventory/libs/manager.py | jihyungSong/plugin-azure-power-state | d66bd5dfafa01659c877da11c0d18de6e55cb5ab | [
"Apache-2.0"
] | null | null | null | from spaceone.core.manager import BaseManager
from spaceone.inventory.libs.connector import AzureConnector
class AzureManager(BaseManager):
connector_name = None
cloud_service_types = []
response_schema = None
collected_region_codes = []
def verify(self, options, secret_data, **kwargs):
""" Check collector's status.
"""
connector: AzureConnector = self.locator.get_connector('AzureConnector', secret_data=secret_data)
connector.verify()
def collect_power_state(self, params) -> list:
raise NotImplementedError
def collect_resources(self, params) -> list:
return self.collect_power_state(params)
def list_all_resource_groups(self, secret_data):
connector: AzureConnector = self.locator.get_connector('AzureConnector')
connector.set_connect(secret_data)
return connector.list_resource_groups()
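# --- Illustrative addition (not part of the original module) ---
# A hedged sketch of a concrete manager: subclasses fill in connector_name and
# cloud_service_types and implement collect_power_state. The class name, connector
# name and the assumed contents of 'params' are hypothetical.
class ExampleVmPowerStateManager(AzureManager):
    connector_name = 'AzureVMConnector'   # hypothetical connector name
    cloud_service_types = []

    def collect_power_state(self, params) -> list:
        resources = []
        secret_data = params.get('secret_data', {})   # assumed key, mirroring verify()
        for resource_group in self.list_all_resource_groups(secret_data):
            # query the instances of each resource group and append resources here
            pass
        return resources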
| 31.068966 | 105 | 0.720311 | 98 | 901 | 6.377551 | 0.5 | 0.08 | 0.0864 | 0.1088 | 0.192 | 0.192 | 0.192 | 0 | 0 | 0 | 0 | 0 | 0.193119 | 901 | 28 | 106 | 32.178571 | 0.859697 | 0.027747 | 0 | 0 | 0 | 0 | 0.032596 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.055556 | 0.722222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eddccfe344220f40b8d37221eaf5337a8e696f89 | 1,745 | py | Python | project_name/common/serializers.py | brevetech/breve_drf_template | 125e476810641f919296cb878980f91f4c091cf2 | [
"MIT"
] | null | null | null | project_name/common/serializers.py | brevetech/breve_drf_template | 125e476810641f919296cb878980f91f4c091cf2 | [
"MIT"
] | 17 | 2021-04-05T00:22:13.000Z | 2022-01-11T04:53:47.000Z | project_name/common/serializers.py | brevetech/breve_drf_template | 125e476810641f919296cb878980f91f4c091cf2 | [
"MIT"
] | 1 | 2022-01-07T05:48:19.000Z | 2022-01-07T05:48:19.000Z | from rest_framework import serializers
class ErrorSerializer(serializers.Serializer):
"""
Defines a basic error serializer for
error responses
"""
error_code = serializers.IntegerField()
message = serializers.CharField(max_length=500)
def set_data(self, error_code, message):
self.error_code = error_code
self.message = message
def get_data(self):
return {"code_error": self.error_code, "message": self.message}
class Meta:
fields = ("error_code", "message")
class MultiErrorSerializer(serializers.Serializer):
"""Defines a basic structure for multi errors responses"""
errors = serializers.ListField(child=serializers.CharField(max_length=500))
error_type = serializers.CharField(max_length=100)
error_code = serializers.IntegerField()
def set_data(self, error_code, error_type, errors):
self.error_code = error_code
self.error_type = error_type
self.errors = errors
def get_data(self):
return {
"code_error": self.error_code,
"error_type": self.error_type,
"errors": self.errors,
}
class Meta:
fields = ("error_code", "error_type", "errors")
class CommonResponseSerializer(serializers.Serializer):
"""
Defines a basic common response serializer with
a status field and a message field
"""
status = serializers.IntegerField()
message = serializers.CharField(max_length=500)
def set_data(self, status, message):
self.status = status
self.message = message
def get_data(self):
return {"status": self.status, "message": self.message}
class Meta:
fields = ("code_error", "message")
| 26.846154 | 79 | 0.665903 | 197 | 1,745 | 5.730964 | 0.22335 | 0.09566 | 0.069088 | 0.102746 | 0.576616 | 0.387954 | 0.258636 | 0.258636 | 0.209035 | 0.209035 | 0 | 0.008982 | 0.234384 | 1,745 | 64 | 80 | 27.265625 | 0.836078 | 0.107736 | 0 | 0.378378 | 0 | 0 | 0.076669 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.162162 | false | 0 | 0.027027 | 0.081081 | 0.621622 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
ede26e4b3315b6449745d3f1642592836366342e | 12,015 | py | Python | refactored_modular/nest_tvb_pivot.py | sontheimer/EBRAINS_InterscaleHUB | 2ee323960fb9b0d0c8920fcaee996b56f8a23b56 | [
"BSD-3-Clause"
] | null | null | null | refactored_modular/nest_tvb_pivot.py | sontheimer/EBRAINS_InterscaleHUB | 2ee323960fb9b0d0c8920fcaee996b56f8a23b56 | [
"BSD-3-Clause"
] | null | null | null | refactored_modular/nest_tvb_pivot.py | sontheimer/EBRAINS_InterscaleHUB | 2ee323960fb9b0d0c8920fcaee996b56f8a23b56 | [
"BSD-3-Clause"
] | null | null | null | # ------------------------------------------------------------------------------
# Copyright 2020 Forschungszentrum Jülich GmbH
# "Licensed to the Apache Software Foundation (ASF) under one or more contributor
# license agreements; and to You under the Apache License, Version 2.0. "
#
# Forschungszentrum Jülich
# Institute: Institute for Advanced Simulation (IAS)
# Section: Jülich Supercomputing Centre (JSC)
# Division: High Performance Computing in Neuroscience
# Laboratory: Simulation Laboratory Neuroscience
# Team: Multi-scale Simulation and Design
#
# ------------------------------------------------------------------------------
from mpi4py import MPI
import time
import numpy as np
from EBRAINS_InterscaleHUB.refactored_modular.Communicator import Communicator
from EBRAINS_InterscaleHUB.refactored_modular import interscalehub_utils
from EBRAINS_InterscaleHUB.refactored_modular import interscalehub_mediator as mediator
#from EBRAINS_InterscaleHUB.Interscale_hub.transformer import spiketorate
from EBRAINS_ConfigManager.global_configurations_manager.xml_parsers.default_directories_enum import DefaultDirectories
from EBRAINS_RichEndpoint.Application_Companion.common_enums import Response
# NestTvbPivot and TvbNestPivot classes:
# TODO: proper abstraction -> extract the usecase details from the general implementation
# -> Init, start, stop are pretty much the same every time
# -> incoming (receive) and outgoing (send) loops (M:N mapping)
# -> the analyse (method) should be
# a) pivot, as raw data to cosim data
# b) transform (might be trivial) and
# c) analysis (might be trivial)
# TODO: rework on the receive and send loops (both, general coding style and usecase specifics)
class CommunicatorNestTvb(Communicator):
'''
Implements the PivotBaseClass for abstracting the pivot operations and
the underlying communication protocol. This class provides wrappers
for receiving the data from the NEST simulator and sending it to the TVB simulator
after processing/transforming to the required format.
'''
def __init__(self, configurations_manager, log_settings, name, databuffer,
intracomm, param, comm_receiver, comm_sender):
'''
'''
super().__init__(configurations_manager,
log_settings,
name,
databuffer
)
# Parameter for transformation and analysis
self.__param = param
# INTERcommunicator
# TODO: Revisit the protocol to TVB and NEST
# TODO: rank 0 and rank 1 hardcoded
if intracomm.Get_rank() == 0:
self.__comm_receiver = comm_receiver
self.__num_sending = self.__comm_receiver.Get_remote_size()
elif intracomm.Get_rank() == 1:
self.__comm_sender = comm_sender
self.__num_receiving = self.__comm_sender.Get_remote_size()
self.__logger.info("Initialised")
def start(self, intracomm):
'''
Starts the pivot operation.
M:N mapping of MPI ranks, receive data, further process data.
Receive on rank 0, do the rest on rest of the ranks.
'''
if intracomm.Get_rank() == 0: # Receiver from input sim, rank 0
self._receive()
elif intracomm.Get_rank() == 1: # Science/analyse and sender to TVB, rank 1-x
self._send()
def stop(self):
'''
TODO: proper execution of stop command
'''
self.__stop = True
def _receive(self):
'''
Receive data on rank 0. Put it into the shared mem buffer.
Replaces the former 'receive' function.
NOTE: First refactored version -> not pretty, not final.
'''
# The last two buffer entries are used for shared information
# --> they replace the status_data variable from previous version
# --> find more elegant solution?
self.__logger.info("setting up buffers")
self.__databuffer[-1] = 1 # set buffer to 'ready to receive from nest'
self.__databuffer[-2] = 0 # marks the 'head' of the buffer
# It seems the 'check' variable is used to receive tags from NEST, i.e. ready for send...
# change this in the future, also mentioned in the FatEndPoint solution from Wouter.
check = np.empty(1,dtype='b')
shape = np.empty(1, dtype='i')
count = 0
status_ = MPI.Status()
self.__logger.info("reading from buffer")
###########################################################
#TODO Refactor to move this functionality to appropriate location
#NOTE As per protocol, it should be the response message of 'init'
# command, and should return the PID and the port information
import os
from EBRAINS_RichEndpoint.Application_Companion.common_enums import INTEGRATED_SIMULATOR_APPLICATION as SIMULATOR
pid_and_local_minimum_step_size = \
{SIMULATOR.PID.name: os.getpid(),
SIMULATOR.LOCAL_MINIMUM_STEP_SIZE.name: 0.0}
print(f'{pid_and_local_minimum_step_size}')
###########################################################
# self.__logger.info("NESTtoTVB -- consumer/receiver -- Rank:"+str(self.__comm_receiver.Get_rank()))
while True:
head_ = 0 # head of the buffer, reset after each iteration
# TODO: This is still not correct. We only check for the Tag of the last rank.
# IF all ranks send always the same tag in one iteration (simulation step)
# then this works. But it should be handled differently!!!!
self.__comm_receiver.Recv([check, 1, MPI.CXX_BOOL], source=0, tag=MPI.ANY_TAG, status=status_)
status_rank_0 = status_.Get_tag()
for i in range(1, self.__num_sending):
# new: We do not care which source sends first, give MPI the freedom to send in whichever order.
# self.__comm_receiver.Recv([check, 1, MPI.CXX_BOOL], source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status_)
# self.__logger.info("checking status")
self.__comm_receiver.Recv([check, 1, MPI.CXX_BOOL], source=i, tag=MPI.ANY_TAG, status=status_)
if status_rank_0 != status_.Get_tag():
# Case: the state of the NEST is different between the ranks
# Log the exception with traceback
interscalehub_utils.log_exception(
log_message="Abnormal state : the state of Nest is different between rank. Tag received: ",
mpi_tag_received=status_.Get_tag())
# Terminate with Error
return Response.ERROR
if status_.Get_tag() == 0:
# wait until ready to receive new data (i.e. the sender has cleared the buffer)
while self.__databuffer[-1] != 1: # TODO: use MPI, remove the sleep
time.sleep(0.001)
pass
for source in range(self.__num_sending):
# send 'ready' to the nest rank
# self.__logger.info("send ready")
self.__comm_receiver.Send([np.array(True,dtype='b'),MPI.BOOL],dest=source,tag=0)
# receive package size info
# self.__logger.info("DEBUG 121 ====> receiving size in NEST_TVB_PIVOT")
self.__comm_receiver.Recv([shape, 1, MPI.INT], source=source, tag=0, status=status_)
# self.__comm_receiver.Recv([shape, 1, MPI.INT], source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status_)
# NEW: receive directly into the buffer
self.__comm_receiver.Recv([self.__databuffer[head_:], MPI.DOUBLE], source=source, tag=0, status=status_)
head_ += shape[0] # move head
# Mark as 'ready to do analysis'
self.__databuffer[-1] = 0
# important: head_ is first buffer index WITHOUT data.
self.__databuffer[-2] = head_
# continue receiving the data
continue
elif status_.Get_tag() == 1:
# increment the count and continue receiving the data
count += 1
continue
elif status_.Get_tag() == 2:
# NOTE: simulation ended
# everything goes fine, terminate the loop and respond with OK
return Response.OK
else:
# A 'bad' MPI tag is received,
# log the exception with traceback
interscalehub_utils.log_exception(
log_message="bad mpi tag :",
mpi_tag_received=status_.Get_tag())
# terminate with Error
return Response.ERROR
def _send(self):
'''
Send data to TVB (multiple MPI ranks possible).
Replaces the former 'send' function.
NOTE: First refactored version -> not pretty, not final.
'''
count=0 # simulation/iteration step
status_ = MPI.Status()
# self.__logger.info("NESTtoTVB -- producer/sender -- Rank:"+str(self.__comm_sender.Get_rank()))
while True:
# TODO: this communication has the 'rank 0' problem described in the beginning
accept = False
#logger.info("Nest to TVB : wait to send " )
while not accept:
req = self.__comm_sender.irecv(source=MPI.ANY_SOURCE,tag=MPI.ANY_TAG)
accept = req.wait(status_)
#logger.info(" Nest to TVB : send data status : " +str(status_.Get_tag()))
if status_.Get_tag() == 0:
# wait until the receiver has cleared the buffer, i.e. filled with new data
while self.__databuffer[-1] != 0: # TODO: use MPI, remove the sleep
time.sleep(0.001)
pass
# NOTE: calling the mediator which calls the corresponding transformer functions
times,data = mediator.spike_to_rate(self.__databuffer, count)
# Mark as 'ready to receive next simulation step'
self.__databuffer[-1] = 1
### OLD Code
#logger.info("Nest to TVB : send data :"+str(np.sum(data)) )
# time of sim step
self.__comm_sender.Send([times, MPI.DOUBLE], dest=status_.Get_source(), tag=0)
# send the size of the rate
size = np.array(int(data.shape[0]),dtype='i')
self.__comm_sender.Send([size,MPI.INT], dest=status_.Get_source(), tag=0)
# send the rates
self.__comm_sender.Send([data,MPI.DOUBLE], dest=status_.Get_source(), tag=0)
# increment the count
count+=1
# continue sending the data
continue
### OLD Code end
elif status_.Get_tag() == 1:
# NOTE: simulation ended
# everything goes fine, terminate the loop and respond with OK
return Response.OK
else:
# A 'bad' MPI tag is received,
# log the exception with traceback
interscalehub_utils.log_exception(
log_message="bad mpi tag :",
mpi_tag_received=status_.Get_tag())
# terminate with Error
return Response.ERROR
'''
def _transform(self, count):
#store: Python object, create the histogram
#analyse: Python object, calculate rates
spikerate = spiketorate(self.__param)
times, data = spikerate.spike_to_rate(count, self.__databuffer[-2], self.__databuffer)
return times, data
'''
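# --- Illustrative addition (not part of the original module) ---
# A hedged, MPI-free sketch of the shared-buffer convention used above: the last two
# slots carry control information, the rest holds the raw spike data. Sizes and values
# are made up.
def _example_buffer_convention(buffer_size=1000):
    import numpy as np
    databuffer = np.zeros(buffer_size + 2)
    databuffer[-1] = 1   # 1: ready to receive from NEST, 0: ready to transform/send
    databuffer[-2] = 0   # head: index of the first slot WITHOUT valid data
    # the receiver writes events into databuffer[0:head] and flips the flags; the
    # sender reads databuffer[0:int(databuffer[-2])] once databuffer[-1] == 0
    return databuffer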
| 47.868526 | 125 | 0.590179 | 1,390 | 12,015 | 4.918705 | 0.265468 | 0.019892 | 0.019307 | 0.017552 | 0.310663 | 0.279655 | 0.229048 | 0.196577 | 0.159865 | 0.134708 | 0 | 0.008917 | 0.30928 | 12,015 | 250 | 126 | 48.06 | 0.814917 | 0.426384 | 0 | 0.377358 | 0 | 0 | 0.030239 | 0.005336 | 0 | 0 | 0 | 0.02 | 0 | 1 | 0.04717 | false | 0.018868 | 0.09434 | 0 | 0.198113 | 0.009434 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ede494c59f84787efc51402d949603f354657f2e | 1,345 | py | Python | atlas/foundations_contrib/src/foundations_contrib/models/project_listing.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 296 | 2020-03-16T19:55:00.000Z | 2022-01-10T19:46:05.000Z | atlas/foundations_contrib/src/foundations_contrib/models/project_listing.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 57 | 2020-03-17T11:15:57.000Z | 2021-07-10T14:42:27.000Z | atlas/foundations_contrib/src/foundations_contrib/models/project_listing.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 38 | 2020-03-17T21:06:05.000Z | 2022-02-08T03:19:34.000Z |
class ProjectListing(object):
@staticmethod
def list_projects(redis_connection):
"""Returns a list of projects store in redis with their
creation timestamps
Arguments:
redis_connection {RedisConnection} -- Redis connection to use as a provider for data
Returns:
list -- The list of project names and creation dates
"""
from foundations_contrib.utils import string_from_bytes
projects = redis_connection.zrange('projects', 0, -1, withscores=True)
return [{'name': string_from_bytes(name), 'created_at': created_at} for name, created_at in projects]
@staticmethod
def find_project(redis_connection, project_name):
"""Returns a single of projects store in redis with it's
creation timestamp
Arguments:
redis_connection {RedisConnection} -- Redis connection to use as a provider for data
project_name {str} -- Name of the project to find
Returns:
dict -- A dictionary with the two attributes described above, or None if the project does not exist
"""
created_at = redis_connection.zscore('projects', project_name)
if created_at is None:
return None
return {'name': project_name, 'created_at': created_at}
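# --- Illustrative addition (not part of the original module) ---
# Usage sketch, assuming a redis client compatible with the zrange/zscore calls above
# (for example redis.Redis). Host, port and project name are made up.
def _example_listing():
    import redis
    connection = redis.Redis(host='localhost', port=6379)
    for project in ProjectListing.list_projects(connection):
        print(project['name'], project['created_at'])
    return ProjectListing.find_project(connection, 'my-project')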
| 34.487179 | 118 | 0.656506 | 163 | 1,345 | 5.269939 | 0.411043 | 0.139697 | 0.045402 | 0.039581 | 0.291036 | 0.239814 | 0.179278 | 0.179278 | 0.179278 | 0.179278 | 0 | 0.00309 | 0.278067 | 1,345 | 38 | 119 | 35.394737 | 0.881565 | 0.439405 | 0 | 0.166667 | 0 | 0 | 0.069182 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.083333 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
edf254f29e53bc8f9cd9417828c9783420853336 | 3,977 | py | Python | dialpad/client.py | dialpad/dialpad-python-sdk | 7a448ff9402c9bffd3809b311444619ed9bab2c1 | [
"MIT"
] | 5 | 2020-12-08T16:48:31.000Z | 2022-02-03T01:07:38.000Z | dialpad/client.py | dialpad/dialpad-python-sdk | 7a448ff9402c9bffd3809b311444619ed9bab2c1 | [
"MIT"
] | 7 | 2021-08-18T19:52:07.000Z | 2022-03-11T04:16:21.000Z | dialpad/client.py | dialpad/dialpad-python-sdk | 7a448ff9402c9bffd3809b311444619ed9bab2c1 | [
"MIT"
] | 5 | 2020-12-06T04:04:31.000Z | 2021-12-20T18:59:29.000Z |
import requests
from cached_property import cached_property
from .resources import (
SMSResource,
RoomResource,
UserResource,
CallResource,
NumberResource,
OfficeResource,
WebhookResource,
CompanyResource,
ContactResource,
CallbackResource,
CallCenterResource,
CallRouterResource,
DepartmentResource,
TranscriptResource,
UserDeviceResource,
StatsExportResource,
SubscriptionResource,
BlockedNumberResource,
EventSubscriptionResource
)
hosts = dict(
live='https://dialpad.com',
sandbox='https://sandbox.dialpad.com'
)
class DialpadClient(object):
def __init__(self, token, sandbox=False, base_url=None):
self._token = token
self._session = requests.Session()
self._base_url = base_url or hosts.get('sandbox' if sandbox else 'live')
def _url(self, *path):
path = ['%s' % p for p in path]
return '/'.join([self._base_url, 'api', 'v2'] + path)
def _cursor_iterator(self, response_json, path, method, data, headers):
for i in response_json['items']:
yield i
data = dict(data or {})
while 'cursor' in response_json:
data['cursor'] = response_json['cursor']
response = self._raw_request(path, method, data, headers)
response.raise_for_status()
response_json = response.json()
for i in response_json['items']:
yield i
def _raw_request(self, path, method='GET', data=None, headers=None):
url = self._url(*path)
headers = headers or dict()
headers.update({'Authorization': 'Bearer %s' % self._token})
if str(method).upper() in ['GET', 'POST', 'PUT', 'PATCH', 'DELETE']:
return getattr(self._session, str(method).lower())(
url,
headers=headers,
json=data if method != 'GET' else None,
params=data if method == 'GET' else None,
)
raise ValueError('Unsupported method "%s"' % method)
def request(self, path, method='GET', data=None, headers=None):
response = self._raw_request(path, method, data, headers)
response.raise_for_status()
if response.status_code == 204: # No Content
return None
response_json = response.json()
response_keys = set(k for k in response_json)
# If the response contains the 'items' key, (and maybe 'cursor'), then this is a cursorized
# list response.
if 'items' in response_keys and not response_keys - {'cursor', 'items'}:
return self._cursor_iterator(
response_json, path=path, method=method, data=data, headers=headers)
return response_json
@cached_property
def blocked_number(self):
return BlockedNumberResource(self)
@cached_property
def call(self):
return CallResource(self)
@cached_property
def call_router(self):
return CallRouterResource(self)
@cached_property
def callback(self):
return CallbackResource(self)
@cached_property
def callcenter(self):
return CallCenterResource(self)
@cached_property
def company(self):
return CompanyResource(self)
@cached_property
def contact(self):
return ContactResource(self)
@cached_property
def department(self):
return DepartmentResource(self)
@cached_property
def event_subscription(self):
return EventSubscriptionResource(self)
@cached_property
def number(self):
return NumberResource(self)
@cached_property
def office(self):
return OfficeResource(self)
@cached_property
def room(self):
return RoomResource(self)
@cached_property
def sms(self):
return SMSResource(self)
@cached_property
def stats(self):
return StatsExportResource(self)
@cached_property
def subscription(self):
return SubscriptionResource(self)
@cached_property
def transcript(self):
return TranscriptResource(self)
@cached_property
def user(self):
return UserResource(self)
@cached_property
def userdevice(self):
return UserDeviceResource(self)
@cached_property
def webhook(self):
return WebhookResource(self)
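# Minimal usage sketch (illustrative only; the token value and the ['users'] path are placeholders):
#   client = DialpadClient('YOUR_API_TOKEN', sandbox=True)
#   users = client.request(['users'])  # GET https://sandbox.dialpad.com/api/v2/users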
| 24.398773 | 95 | 0.702288 | 458 | 3,977 | 5.949782 | 0.270742 | 0.10789 | 0.118532 | 0.138716 | 0.13578 | 0.117431 | 0.10055 | 0.10055 | 0.079266 | 0.047706 | 0 | 0.001247 | 0.193613 | 3,977 | 162 | 96 | 24.549383 | 0.848457 | 0.028916 | 0 | 0.230159 | 0 | 0 | 0.048483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.02381 | 0.150794 | 0.412698 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
edf36a07d1b1f908a147c0df5e05785149976da1 | 3,253 | py | Python | pydart/apps/turtle/optimizer.py | Jungdam/AlphaCon | 2c70e848ab176537560484d28869cdc3c2fc8cd7 | [
"MIT"
] | 2 | 2020-03-16T05:43:56.000Z | 2021-11-22T05:52:29.000Z | pydart/apps/turtle/optimizer.py | Jungdam/AlphaCon | 2c70e848ab176537560484d28869cdc3c2fc8cd7 | [
"MIT"
] | null | null | null | pydart/apps/turtle/optimizer.py | Jungdam/AlphaCon | 2c70e848ab176537560484d28869cdc3c2fc8cd7 | [
"MIT"
] | null | null | null | import math
import numpy as np
import pydart
import action as ac
import myworld
import cma
import multiprocessing
from multiprocessing import Process, Queue
import mmMath
from numpy.linalg import inv
dt = 1.0/600.0
skel_file = '/home/jungdam/Research/AlphaCon/pydart/apps/turtle/data/skel/turtle.skel'
def obj_func_transform(q, idx, world, action, transform):
skel = world.get_skeleton()
a_default = skel.controller.get_action_default()
a_extra = ac.format(ac.flat(action))
a = ac.add(a_default, a_extra)
world.reset()
skel.controller.reset()
skel.controller.set_action_all(a)
	num_wingbeat = 2
while True:
world.step()
if skel.controller.get_num_wingbeat() >= num_wingbeat:
break
R_des,p_des = mmMath.T2Rp(transform)
R,p = mmMath.T2Rp(skel.body('trunk').T)
# difference on orientation
v0 = 1.0 * mmMath.length(mmMath.log(np.dot(inv(R_des),R)))
# difference on position
v1 = 1.0 * mmMath.length(p_des-p)
# penalty on large deviation
	v2 = 10 * np.dot(np.array(ac.flat(action)), np.array(ac.flat(action)))
# penalty on large torque
v3 = 0.00002 * skel.controller.get_tau_sum() / (num_wingbeat*a[2])
val = v0 + v1 + v2 + v3
print '\t', val, v0, v1, v2, v3
q.put([idx, val])
def obj_func_straight(q, idx, world, action):
skel = world.get_skeleton()
a_default = skel.controller.get_action_default()
a_l = action[0:6].tolist()
a_r = ac.mirror(a_l)
a_t = 0.0
a_extra = [a_l,a_r,a_t]
a = ac.add(a_default, a_extra)
world.reset()
skel.controller.reset()
skel.controller.set_action_all(a)
num_wingbeat = 8
t0 = skel.body('trunk').world_com()
while True:
world.step()
if skel.controller.get_num_wingbeat() >= num_wingbeat:
break
t1 = skel.body('trunk').world_com()
diff = t1-t0
v0 = 10.0 * (diff[0]*diff[0] + diff[1]*diff[1])
v1 = 30 * math.exp(-0.01*diff[2]*diff[2])
v2 = 100 * np.dot(np.array(a_l),np.array(a_l))
v3 = 0.0002 * skel.controller.get_tau_sum() / (num_wingbeat*a[2])
val = v0 + v1 + v2 + v3
print '\t', val, v0, v1, v2, v3
q.put([idx, val])
def result_func_straight(result,controller):
a_default = controller.get_action_default()
a_l = result[0].tolist()
a_r = ac.mirror(a_l)
a_t = 0.0
a_extra = [a_l,a_r,a_t]
return ac.add(a_default, a_extra)
def run(obj_func, result_func, options=None):
print('-----------Start Optimization-----------')
num_cores = multiprocessing.cpu_count()
num_pop = max(8, 1*num_cores)
num_gen = 100
myWorlds = []
for i in range(num_pop):
myWorlds.append(myworld.Myworld(dt, skel_file))
opts = cma.CMAOptions()
opts.set('pop', num_pop)
es = cma.CMAEvolutionStrategy(6*[0.0], 0.2, opts)
for i in range(num_gen):
X = es.ask()
fit = num_pop*[0.0]
q = Queue()
ps = []
for j in range(num_pop):
p = Process(target=obj_func, args=(q, j, myWorlds[j],X[j]))
p.start()
ps.append(p)
for j in range(num_pop):
ps[j].join()
for j in range(num_pop):
val = q.get()
fit[val[0]] = val[1]
print '[', i, val[0], ']', val[1]
es.tell(X,fit)
cma.pprint(es.result())
# es.optimize(obj_func, verb_disp=1)
# cma.pprint(es.result())
print('-----------End Optimization-------------')
return result_func(es.result(),myWorlds[0].skel.controller) | 25.414063 | 86 | 0.652936 | 551 | 3,253 | 3.6951 | 0.254083 | 0.075639 | 0.050098 | 0.017682 | 0.410118 | 0.375737 | 0.327112 | 0.327112 | 0.327112 | 0.327112 | 0 | 0.036982 | 0.177067 | 3,253 | 128 | 87 | 25.414063 | 0.723571 | 0.048571 | 0 | 0.360825 | 0 | 0.010309 | 0.056976 | 0.038848 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.103093 | null | null | 0.061856 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
edf750d72b79f5e3c826e2032c0c58c48d9ee5cb | 432 | py | Python | Semana07Atividade01/Sem07Atv01Q02.py | Marvingms7/IFPI | 0f6d50e4a7e1016e58cd227b477b3ce08e19c61a | [
"MIT"
] | null | null | null | Semana07Atividade01/Sem07Atv01Q02.py | Marvingms7/IFPI | 0f6d50e4a7e1016e58cd227b477b3ce08e19c61a | [
"MIT"
] | null | null | null | Semana07Atividade01/Sem07Atv01Q02.py | Marvingms7/IFPI | 0f6d50e4a7e1016e58cd227b477b3ce08e19c61a | [
"MIT"
] | null | null | null | v_carro = float(input("Informe o valor: "))
total_mes_carro = (0.004 * v_carro) +v_carro
p_inicial = 10000
renda_total = (0.007 * p_inicial) + p_inicial
mes = -1
while True:
mes += 1
total_mes_carro = (0.004 * total_mes_carro) + total_mes_carro
if renda_total < total_mes_carro:
renda_total = (0.007 * renda_total) + renda_total
else:
break
print(f'Você so podera comprar um carro com {mes} meses!') | 30.857143 | 65 | 0.678241 | 71 | 432 | 3.830986 | 0.43662 | 0.147059 | 0.238971 | 0.102941 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067449 | 0.210648 | 432 | 14 | 66 | 30.857143 | 0.730205 | 0 | 0 | 0 | 0 | 0 | 0.150115 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.076923 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
61077b955f71be1c35656e2b03e6c7df22bd07f0 | 7,922 | py | Python | logstash_conf_api.py | garyelephant/dolphin | 68bf3b3612c78b3094e332c34787751c948a0246 | [
"MIT"
] | null | null | null | logstash_conf_api.py | garyelephant/dolphin | 68bf3b3612c78b3094e332c34787751c948a0246 | [
"MIT"
] | null | null | null | logstash_conf_api.py | garyelephant/dolphin | 68bf3b3612c78b3094e332c34787751c948a0246 | [
"MIT"
] | null | null | null | # coding: utf-8
from string import Template
PRETTY_INDENT = 4
BASE_INDENT_SPACE = ' '
def _quote( s ):
return ''.join( [ '"', s, '"' ] )
class ConfBase( object ):
def __init__( self ):
super( ConfBase, self ).__init__()
def _dump( d, pretty=False, indent=0 ):
if isinstance( d, int ) or isinstance( d, float ) or isinstance( d, bool ):
return d
elif isinstance( d, basestring ): # str and unicode
return _quote( d )
elif isinstance( d, list ):
arr = []
for ele in d:
arr.append( _dump( ele ) )
t = Template( '[ $arr ]' )
return t.substitute( arr=', '.join( arr ) )
elif isinstance( d, dict ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
res = '{' + p
for k, v in d.items():
t = Template( '$indent$key => $val$p' )
key_indent_spaces = BASE_INDENT_SPACE * ( indent + PRETTY_INDENT )
val_indent = indent + PRETTY_INDENT * 2
res += t.substitute( indent=key_indent_spaces, key=k, val=_dump( v, pretty, indent=val_indent ), p=p )
res += indent_spaces + '}' + p
return res
class Plugin( ConfBase ):
def __init__( self, name, **kwargs ):
super( Plugin, self ).__init__()
self.name = name
self.settings = kwargs
def __str__( self ):
return self.name
def dumps( self, pretty=False, indent=0 ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
res = indent_spaces + self.name + ' {' + p
for k, v in self.settings.items():
sub_indent = indent + PRETTY_INDENT
v_dumped = _dump( v, pretty, indent=sub_indent )
res += BASE_INDENT_SPACE * sub_indent + Template( '$key => $val' ).substitute( key=k, val=v_dumped ) + p
res += indent_spaces + '}' + p
return res
class Comment( ConfBase ):
def __init__( self, c ):
super( Comment, self ).__init__()
self.c = c
def dumps( self, pretty=False, indent=0 ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
t = Template( '$indent#$comment$p' )
return t.substitute( indent=indent_spaces, comment=self.c, p=p)
# Consider merging If, ElseIf and Else into a single class:
# __init__(self, if_expr_blocks, elseif_expr_blocks, else_expr_blocks )
class If( ConfBase ):
def __init__( self, expression, *blocks ):
super( If, self ).__init__()
self.expression = expression
self.child_blocks = blocks
def dumps( self, pretty=False, indent=0 ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
res = indent_spaces + 'if ' + self.expression + ' {' + p
for blk in self.child_blocks:
res += blk.dumps( pretty, indent=indent + PRETTY_INDENT )
res += indent_spaces + '}' + p
return res
class ElseIf( ConfBase ):
def __init__( self, expression, *blocks ):
super( ElseIf, self ).__init__()
self.expression = expression
self.child_blocks = blocks
def dumps( self, pretty=False, indent=0 ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
res = indent_spaces + 'else if ' + self.expression + ' {' + p
for blk in self.child_blocks:
res += blk.dumps( pretty, indent=indent + PRETTY_INDENT )
res += indent_spaces + '}' + p
return res
class Else( ConfBase ):
def __init__( self, *blocks ):
super( Else, self ).__init__()
self.child_blocks = blocks
def dumps( self, pretty=False, indent=0 ):
p = ' '
indent_spaces = ''
if pretty:
p = '\n'
indent_spaces = BASE_INDENT_SPACE * indent
res = indent_spaces + 'else {' + p
for blk in self.child_blocks:
res += blk.dumps( pretty, indent=indent + PRETTY_INDENT )
res += indent_spaces + '}' + p
return res
class LogstashConf( object ):
"""
	The order of the configuration matters
"""
def __init__( self ):
self.conf = { 'input': [], 'filter': [], 'output': [] }
def input( self, *blocks ):
self.conf[ 'input' ].extend( blocks )
def filter( self, *blocks ):
self.conf[ 'filter' ].extend( blocks )
def output( self, *blocks ):
self.conf[ 'output' ].extend( blocks )
def dumps_dict( self, pretty=True ):
return self.conf
def dumps( self, pretty=False ):
p = ' ' if pretty is False else '\n'
res = 'input {' + p
res += self._dumps( self.conf[ 'input' ], pretty )
res += '}' + p
res += 'filter {' + p
res += self._dumps( self.conf[ 'filter' ], pretty )
res += '}' + p
res += 'output {' + p
res += self._dumps( self.conf[ 'output' ], pretty )
res += '}'
return res
def _dumps( self, d, pretty=False, indent=PRETTY_INDENT ):
p = ' ' if pretty is False else '\n'
res = ''
for blk in d:
res += blk.dumps( pretty, indent=indent )
return res
"""
# Let me show you an example. Run the following command:
$ python logstash_conf_api.py
# Then you get this output:
input {
#This is a test comment
kafka{
zk_connect => "<zk_connect>"
group_id => "<group_id>"
codec => "plain"
topic_id => "<topic_id>"
}
}
filter {
ruby{
code => "puts event"
}
grok{
match => {
"message" => [ "%{NUMBER:http_status} %{WORD:method}", "%{DATA:access_log}" ]
}
}
}
output {
if [type]== "type_1" {
elasticsearch{
index => "test.yingju1-%{type}-%{+YYYY.MM.dd}"
host => [ "1002.es.dip.sina.com.cn" ]
protocol => "http"
workers => 2
flush_size => 20000
}
}
else if [type]== "type_2" {
elasticsearch{
index => "test.yingju1-%{type}-%{+YYYY.MM.dd}"
host => [ "1002.es.dip.sina.com.cn" ]
protocol => "http"
workers => 2
flush_size => 20000
}
}
else {
elasticsearch{
index => "test.yingju1-%{type}-%{+YYYY.MM.dd}"
host => [ "1002.es.dip.sina.com.cn" ]
protocol => "http"
workers => 2
flush_size => 20000
}
}
}
"""
if __name__ == '__main__':
conf = LogstashConf()
conf.input( Comment( 'This is a test comment' ) )
kafka = { 'topic_id': '<topic_id>', 'group_id': '<group_id>', 'zk_connect': '<zk_connect>', 'codec': 'plain' }
conf.input( Plugin( 'kafka', **kafka ) )
ruby = { 'code': 'puts event' }
conf.filter( Plugin( 'ruby', **ruby ) )
# use _quote() if specific string is a field reference, such as _quote( 'message' )
grok = { 'match': { _quote( 'message' ): [ '%{NUMBER:http_status} %{WORD:method}', '%{DATA:access_log}' ] } }
conf.filter( Plugin( 'grok', **grok ) )
es = { 'host': [ '1002.es.dip.sina.com.cn' ], 'index': 'test.yingju1-%{type}-%{+YYYY.MM.dd}', 'protocol': 'http', 'flush_size': 20000, 'workers': 2 }
es_plugin = Plugin( 'elasticsearch', **es )
# you can do this
# conf.output( es_plugin )
# or using if block
if_blk = If( '[type]== "type_1"', es_plugin )
elseif_blk = ElseIf( '[type]== "type_2"', es_plugin )
else_blk = Else( es_plugin )
conf.output( if_blk, elseif_blk, else_blk )
# print conf.dumps()
print conf.dumps( pretty=True )
| 27.796491 | 153 | 0.525499 | 909 | 7,922 | 4.370737 | 0.172717 | 0.072489 | 0.033979 | 0.038762 | 0.46363 | 0.453813 | 0.410773 | 0.378052 | 0.350365 | 0.327209 | 0 | 0.010781 | 0.332618 | 7,922 | 284 | 154 | 27.894366 | 0.740685 | 0.036607 | 0 | 0.420382 | 0 | 0 | 0.08448 | 0.01264 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.006369 | null | null | 0.006369 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6113ea07a805e10d1e1d7f7c37ca1fcbc483ca36 | 8,608 | py | Python | src/indriya_msgs/python/physics_pb2.py | praveenv4k/Indriya | a74a4b8bb9f04b5e858305769e892edfcab32df5 | [
"MIT"
] | 1 | 2017-07-12T15:30:58.000Z | 2017-07-12T15:30:58.000Z | src/indriya_msgs/python/physics_pb2.py | praveenv4k/Indriya | a74a4b8bb9f04b5e858305769e892edfcab32df5 | [
"MIT"
] | null | null | null | src/indriya_msgs/python/physics_pb2.py | praveenv4k/Indriya | a74a4b8bb9f04b5e858305769e892edfcab32df5 | [
"MIT"
] | 1 | 2016-03-09T12:34:19.000Z | 2016-03-09T12:34:19.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: physics.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
import vector3d_pb2 as vector3d__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='physics.proto',
package='Indriya.Core.Msgs',
#syntax='proto2',
serialized_pb=_b('\n\rphysics.proto\x12\x11Indriya.Core.Msgs\x1a\x0evector3d.proto\"\xc2\x03\n\x07Physics\x12\x32\n\x04type\x18\x01 \x01(\x0e\x32\x1f.Indriya.Core.Msgs.Physics.Type:\x03ODE\x12\x13\n\x0bsolver_type\x18\x02 \x01(\t\x12\x15\n\rmin_step_size\x18\x03 \x01(\x01\x12\x14\n\x0cprecon_iters\x18\x04 \x01(\x05\x12\r\n\x05iters\x18\x05 \x01(\x05\x12\x0b\n\x03sor\x18\x06 \x01(\x01\x12\x0b\n\x03\x63\x66m\x18\x07 \x01(\x01\x12\x0b\n\x03\x65rp\x18\x08 \x01(\x01\x12\"\n\x1a\x63ontact_max_correcting_vel\x18\t \x01(\x01\x12\x1d\n\x15\x63ontact_surface_layer\x18\n \x01(\x01\x12,\n\x07gravity\x18\x0b \x01(\x0b\x32\x1b.Indriya.Core.Msgs.Vector3d\x12\x16\n\x0e\x65nable_physics\x18\x0c \x01(\x08\x12\x18\n\x10real_time_factor\x18\r \x01(\x01\x12\x1d\n\x15real_time_update_rate\x18\x0e \x01(\x01\x12\x15\n\rmax_step_size\x18\x0f \x01(\x01\"2\n\x04Type\x12\x07\n\x03ODE\x10\x01\x12\n\n\x06\x42ULLET\x10\x02\x12\x0b\n\x07SIMBODY\x10\x03\x12\x08\n\x04\x44\x41RT\x10\x04')
,
dependencies=[vector3d__pb2.DESCRIPTOR,])
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_PHYSICS_TYPE = _descriptor.EnumDescriptor(
name='Type',
full_name='Indriya.Core.Msgs.Physics.Type',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='ODE', index=0, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BULLET', index=1, number=2,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='SIMBODY', index=2, number=3,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='DART', index=3, number=4,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=453,
serialized_end=503,
)
_sym_db.RegisterEnumDescriptor(_PHYSICS_TYPE)
_PHYSICS = _descriptor.Descriptor(
name='Physics',
full_name='Indriya.Core.Msgs.Physics',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='type', full_name='Indriya.Core.Msgs.Physics.type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='solver_type', full_name='Indriya.Core.Msgs.Physics.solver_type', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='min_step_size', full_name='Indriya.Core.Msgs.Physics.min_step_size', index=2,
number=3, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='precon_iters', full_name='Indriya.Core.Msgs.Physics.precon_iters', index=3,
number=4, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='iters', full_name='Indriya.Core.Msgs.Physics.iters', index=4,
number=5, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sor', full_name='Indriya.Core.Msgs.Physics.sor', index=5,
number=6, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cfm', full_name='Indriya.Core.Msgs.Physics.cfm', index=6,
number=7, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='erp', full_name='Indriya.Core.Msgs.Physics.erp', index=7,
number=8, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='contact_max_correcting_vel', full_name='Indriya.Core.Msgs.Physics.contact_max_correcting_vel', index=8,
number=9, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='contact_surface_layer', full_name='Indriya.Core.Msgs.Physics.contact_surface_layer', index=9,
number=10, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gravity', full_name='Indriya.Core.Msgs.Physics.gravity', index=10,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='enable_physics', full_name='Indriya.Core.Msgs.Physics.enable_physics', index=11,
number=12, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='real_time_factor', full_name='Indriya.Core.Msgs.Physics.real_time_factor', index=12,
number=13, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='real_time_update_rate', full_name='Indriya.Core.Msgs.Physics.real_time_update_rate', index=13,
number=14, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='max_step_size', full_name='Indriya.Core.Msgs.Physics.max_step_size', index=14,
number=15, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
_PHYSICS_TYPE,
],
options=None,
is_extendable=False,
#syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=53,
serialized_end=503,
)
_PHYSICS.fields_by_name['type'].enum_type = _PHYSICS_TYPE
_PHYSICS.fields_by_name['gravity'].message_type = vector3d__pb2._VECTOR3D
_PHYSICS_TYPE.containing_type = _PHYSICS
DESCRIPTOR.message_types_by_name['Physics'] = _PHYSICS
Physics = _reflection.GeneratedProtocolMessageType('Physics', (_message.Message,), dict(
DESCRIPTOR = _PHYSICS,
__module__ = 'physics_pb2'
# @@protoc_insertion_point(class_scope:Indriya.Core.Msgs.Physics)
))
_sym_db.RegisterMessage(Physics)
# @@protoc_insertion_point(module_scope)
| 42.196078 | 970 | 0.730832 | 1,215 | 8,608 | 4.925926 | 0.156379 | 0.06817 | 0.052632 | 0.069841 | 0.573935 | 0.546032 | 0.520635 | 0.452464 | 0.40802 | 0.393985 | 0 | 0.053087 | 0.139986 | 8,608 | 203 | 971 | 42.403941 | 0.755369 | 0.028694 | 0 | 0.541436 | 1 | 0.005525 | 0.220879 | 0.193703 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.038674 | 0 | 0.038674 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
611cc4a74e1e739d5253846e13dbb876fb6c2bff | 1,719 | py | Python | src/mailer.py | jack139/takit | d647bf2e6400df681e701da7252f0cbecd1e6631 | [
"BSD-3-Clause"
] | 2 | 2019-02-14T09:14:56.000Z | 2019-07-16T09:46:47.000Z | src/mailer.py | jack139/takit | d647bf2e6400df681e701da7252f0cbecd1e6631 | [
"BSD-3-Clause"
] | null | null | null | src/mailer.py | jack139/takit | d647bf2e6400df681e701da7252f0cbecd1e6631 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
import smtplib, os, time, sys, gc
from email.MIMEMultipart import MIMEMultipart
from email.MIMEBase import MIMEBase
from email.MIMEText import MIMEText
from email.Utils import COMMASPACE, formatdate
from email import Encoders
from email.Header import Header
import helper
from config import setting
db = setting.db_web
HTML = u'<div><p>现在有 %%s 个事件等待人工处理</p><p>时间:%%s</p></div>' \
u'<div><p style="color:#E6E6E6;font-size:4px">%s</p></div>' % os.uname()[1]
def send_mail(send_from, send_to, subject, text):
assert type(send_to)==list
msg = MIMEMultipart()
msg['From'] = send_from
msg['To'] = COMMASPACE.join(send_to)
msg['Date'] = formatdate(localtime=True)
msg['Subject'] = Header(subject, 'gb2312')
msg.attach(MIMEText(text, 'html', 'gb2312'))
smtp = smtplib.SMTP()
try:
smtp.connect(setting.mail_server)
smtp.sendmail(send_from, send_to, msg.as_string())
except Exception,e:
print "%s: %s" % (type(e), str(e))
smtp.close()
if __name__ == "__main__":
gc.set_threshold(300,5,5)
print "MAILER: %s started" % helper.time_str()
last_count=0
try:
while 1:
			# search for events that need manual handling
db_alert=db.todo.find({'$and': [{'$and': [{'event': {'$ne':'ORDER_UI'}}, {'event': {'$ne':'ORDER_API'}}]}, {'man':1}]})
if db_alert.count()>last_count:
text = HTML % (db_alert.count(), helper.time_str())
send_mail(setting.sender, setting.worker, u'人工事件提醒'.encode('gb2312'),
text.encode('gb2312'))
print "%s MAILER: sent a mail" % (helper.time_str())
last_count=db_alert.count()
sys.stdout.flush()
time.sleep(5)
except KeyboardInterrupt:
print
print 'Ctrl-C!'
print "MAILER: %s exited" % helper.time_str()
| 25.279412 | 122 | 0.66434 | 253 | 1,719 | 4.379447 | 0.434783 | 0.048736 | 0.046931 | 0.025271 | 0.039711 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021306 | 0.153578 | 1,719 | 67 | 123 | 25.656716 | 0.740206 | 0.028505 | 0 | 0.043478 | 0 | 0.043478 | 0.166267 | 0.048019 | 0 | 0 | 0 | 0 | 0.021739 | 0 | null | null | 0 | 0.195652 | null | null | 0.130435 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
611cd063edd78a73e36adbd1a6e8ee077e569a16 | 5,901 | py | Python | src/oci/database/models/stack_monitoring_config.py | pabs3/oci-python-sdk | 437ba18ce39af2d1090e277c4bb8750c89f83021 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | src/oci/database/models/stack_monitoring_config.py | pabs3/oci-python-sdk | 437ba18ce39af2d1090e277c4bb8750c89f83021 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | src/oci/database/models/stack_monitoring_config.py | pabs3/oci-python-sdk | 437ba18ce39af2d1090e277c4bb8750c89f83021 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | # coding: utf-8
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs
@init_model_state_from_kwargs
class StackMonitoringConfig(object):
"""
The configuration of Stack Monitoring for the external database.
"""
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "ENABLING"
STACK_MONITORING_STATUS_ENABLING = "ENABLING"
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "ENABLED"
STACK_MONITORING_STATUS_ENABLED = "ENABLED"
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "DISABLING"
STACK_MONITORING_STATUS_DISABLING = "DISABLING"
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "NOT_ENABLED"
STACK_MONITORING_STATUS_NOT_ENABLED = "NOT_ENABLED"
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "FAILED_ENABLING"
STACK_MONITORING_STATUS_FAILED_ENABLING = "FAILED_ENABLING"
#: A constant which can be used with the stack_monitoring_status property of a StackMonitoringConfig.
#: This constant has a value of "FAILED_DISABLING"
STACK_MONITORING_STATUS_FAILED_DISABLING = "FAILED_DISABLING"
def __init__(self, **kwargs):
"""
Initializes a new StackMonitoringConfig object with values from keyword arguments.
The following keyword arguments are supported (corresponding to the getters/setters of this class):
:param stack_monitoring_status:
The value to assign to the stack_monitoring_status property of this StackMonitoringConfig.
Allowed values for this property are: "ENABLING", "ENABLED", "DISABLING", "NOT_ENABLED", "FAILED_ENABLING", "FAILED_DISABLING", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:type stack_monitoring_status: str
:param stack_monitoring_connector_id:
The value to assign to the stack_monitoring_connector_id property of this StackMonitoringConfig.
:type stack_monitoring_connector_id: str
"""
self.swagger_types = {
'stack_monitoring_status': 'str',
'stack_monitoring_connector_id': 'str'
}
self.attribute_map = {
'stack_monitoring_status': 'stackMonitoringStatus',
'stack_monitoring_connector_id': 'stackMonitoringConnectorId'
}
self._stack_monitoring_status = None
self._stack_monitoring_connector_id = None
@property
def stack_monitoring_status(self):
"""
**[Required]** Gets the stack_monitoring_status of this StackMonitoringConfig.
The status of Stack Monitoring.
Allowed values for this property are: "ENABLING", "ENABLED", "DISABLING", "NOT_ENABLED", "FAILED_ENABLING", "FAILED_DISABLING", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:return: The stack_monitoring_status of this StackMonitoringConfig.
:rtype: str
"""
return self._stack_monitoring_status
@stack_monitoring_status.setter
def stack_monitoring_status(self, stack_monitoring_status):
"""
Sets the stack_monitoring_status of this StackMonitoringConfig.
The status of Stack Monitoring.
:param stack_monitoring_status: The stack_monitoring_status of this StackMonitoringConfig.
:type: str
"""
allowed_values = ["ENABLING", "ENABLED", "DISABLING", "NOT_ENABLED", "FAILED_ENABLING", "FAILED_DISABLING"]
if not value_allowed_none_or_none_sentinel(stack_monitoring_status, allowed_values):
stack_monitoring_status = 'UNKNOWN_ENUM_VALUE'
self._stack_monitoring_status = stack_monitoring_status
@property
def stack_monitoring_connector_id(self):
"""
Gets the stack_monitoring_connector_id of this StackMonitoringConfig.
The `OCID`__ of the
:func:`create_external_database_connector_details`.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:return: The stack_monitoring_connector_id of this StackMonitoringConfig.
:rtype: str
"""
return self._stack_monitoring_connector_id
@stack_monitoring_connector_id.setter
def stack_monitoring_connector_id(self, stack_monitoring_connector_id):
"""
Sets the stack_monitoring_connector_id of this StackMonitoringConfig.
The `OCID`__ of the
:func:`create_external_database_connector_details`.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param stack_monitoring_connector_id: The stack_monitoring_connector_id of this StackMonitoringConfig.
:type: str
"""
self._stack_monitoring_connector_id = stack_monitoring_connector_id
def __repr__(self):
return formatted_flat_dict(self)
def __eq__(self, other):
if other is None:
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not self == other
| 42.15 | 245 | 0.72564 | 710 | 5,901 | 5.714085 | 0.204225 | 0.195958 | 0.16564 | 0.115356 | 0.644811 | 0.586394 | 0.519842 | 0.483362 | 0.444664 | 0.376633 | 0 | 0.003854 | 0.208609 | 5,901 | 139 | 246 | 42.453237 | 0.864882 | 0.561769 | 0 | 0.045455 | 0 | 0 | 0.138102 | 0.067926 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.045455 | 0.045455 | 0.522727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
612349f16fdeaf7c2205c36f0550978d6bdb88e9 | 1,142 | py | Python | app/hrcm/services/cli/output.py | bastienbot/hr-challenges-manager | 1fe8e4fa34f6866a724c2461bc17cc442fe50a4c | [
"MIT"
] | null | null | null | app/hrcm/services/cli/output.py | bastienbot/hr-challenges-manager | 1fe8e4fa34f6866a724c2461bc17cc442fe50a4c | [
"MIT"
] | null | null | null | app/hrcm/services/cli/output.py | bastienbot/hr-challenges-manager | 1fe8e4fa34f6866a724c2461bc17cc442fe50a4c | [
"MIT"
] | null | null | null | """
	@desc Prints information about the candidate: his/her profile and messages sent
@params candidate: instance of Candidate
"""
def show_candidate_informations(candidate):
print("## CANDIDATE PROFILE ##", end="\n\n")
print("Firstname: {}".format(candidate.firstname))
print("Lastname: {}".format(candidate.lastname))
print("Email address: {}".format(candidate.email))
print("Job: {}".format(candidate.job), end="\n\n")
print("## MESSAGE SENT ##", end="\n\n")
for message in candidate.messages:
print("{0} sent : {1}, {2} days ago".format(
message["name"],
message["ts_str"],
message["diff_to_today"])
)
def show_candidates(candidates):
for candidate in candidates:
print("\nFirstname: {0}, Lastname: {1}, Email address: {2}, Job: {3}".format(candidate.firstname, candidate.lastname, candidate.email, candidate.job))
for message in candidate.messages:
print("\t{0} sent : {1}, {2} days ago".format(
message["name"],
message["ts_str"],
message["diff_to_today"])
)
| 35.6875 | 158 | 0.601576 | 130 | 1,142 | 5.215385 | 0.338462 | 0.110619 | 0.022124 | 0.029499 | 0.280236 | 0.280236 | 0.179941 | 0.179941 | 0.179941 | 0.179941 | 0 | 0.011416 | 0.232925 | 1,142 | 31 | 159 | 36.83871 | 0.762557 | 0.107706 | 0 | 0.363636 | 0 | 0.045455 | 0.264095 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.090909 | 0.409091 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
6123ffcf6c686202f3bfd927ed19a030e1f7389a | 414 | py | Python | Fletnix/apps/profiles/migrations/0009_whoiswatching_user_age.py | FuryAndRage/Fletnix | 99cc015c799eda24d605ecb9706f809fa6a05392 | [
"MIT"
] | null | null | null | Fletnix/apps/profiles/migrations/0009_whoiswatching_user_age.py | FuryAndRage/Fletnix | 99cc015c799eda24d605ecb9706f809fa6a05392 | [
"MIT"
] | 1 | 2021-02-21T11:08:36.000Z | 2021-02-24T20:42:01.000Z | Fletnix/apps/profiles/migrations/0009_whoiswatching_user_age.py | FuryAndRage/Fletnix | 99cc015c799eda24d605ecb9706f809fa6a05392 | [
"MIT"
] | null | null | null | # Generated by Django 3.1.2 on 2021-02-15 05:21
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('profiles', '0008_whoiswatching_person_avatar'),
]
operations = [
migrations.AddField(
model_name='whoiswatching',
name='user_age',
field=models.CharField(max_length=30, null=True),
),
]
| 21.789474 | 61 | 0.618357 | 45 | 414 | 5.555556 | 0.844444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069536 | 0.270531 | 414 | 18 | 62 | 23 | 0.758278 | 0.108696 | 0 | 0 | 1 | 0 | 0.166213 | 0.087193 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
612aedd0be877a625c366cc32d1e904c74177e14 | 7,798 | py | Python | examples/api-samples/inc_samples/sample11.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | examples/api-samples/inc_samples/sample11.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | examples/api-samples/inc_samples/sample11.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | ### This sample will show how programmatically create and post an annotation into document. How to delete the annotation
# Import of classes from libraries
from pyramid.renderers import render_to_response
from groupdocs.ApiClient import ApiClient
from groupdocs.AntApi import AntApi
from groupdocs.StorageApi import StorageApi
from groupdocs.GroupDocsRequestSigner import GroupDocsRequestSigner
from groupdocs.FileStream import FileStream
# Check that the value is not None and not empty
def IsNotNull(value):
return value is not None and len(value) > 0
# Set variables and get POST data
def sample11(request):
clientId = request.POST.get('client_id')
privateKey = request.POST.get('private_key')
inputFile = request.POST.get('file')
url = request.POST.get('url')
basePath = request.POST.get('server_type')
fileId = request.POST.get('fileId')
guid = ""
iframe = ""
annotationType = request.POST.get('annotation_type')
# Checking required parameters
if IsNotNull(clientId) == False or IsNotNull(privateKey) == False or IsNotNull(annotationType) == False:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : 'You do not enter all parameters' })
### Create Signer, ApiClient and Annotation Api objects
# Create signer object
signer = GroupDocsRequestSigner(privateKey)
# Create apiClient object
apiClient = ApiClient(signer)
# Create Annotation object
ant = AntApi(apiClient)
api = StorageApi(apiClient)
if basePath == "":
basePath = 'https://api.groupdocs.com/v2.0'
#Set base path
ant.basePath = basePath
api.basePath = basePath
if url != "":
try:
			# Upload file to current user storage using entered URL to the file
upload = api.UploadWeb(clientId, url)
guid = upload.result.guid
fileId = ""
except Exception, e:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : str(e) })
if inputFile != "":
try:
#A hack to get uploaded file size
inputFile.file.seek(0, 2)
fileSize = inputFile.file.tell()
inputFile.file.seek(0)
fs = FileStream.fromStream(inputFile.file, fileSize)
####Make a request to Storage API using clientId
#Upload file to current user storage
response = api.Upload(clientId, inputFile.filename, fs)
guid = response.result.guid
fileId = ""
except Exception, e:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : str(e) })
if fileId != '':
guid = fileId
# Delete annotation if Delete Button clicked
if request.POST.get('delete_annotation') == "1":
try:
ant.DeleteAnnotation(clientId, request.POST.get('annotationId'))
except Exception, e:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : str(e) })
# Required parameters
allParams = ['box_x', 'box_y', 'text']
	# Add required parameters depending on the annotation type ['text' or 'area']
if annotationType == "text":
allParams = allParams + ['box_width', 'box_height', 'annotationPosition_x', 'annotationPosition_y', 'range_position', 'range_length']
elif annotationType == "area":
allParams = allParams + ['box_width', 'box_height']
# Checking required parameters
for param in allParams:
needParam = request.POST.get(param)
if IsNotNull(needParam) == False:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : 'You do not enter all parameters' })
types = {'text' : "0", "area" : "1", "point" : "2"}
# construct requestBody
requestBody = {
"type": types[request.POST.get('annotation_type')],
"replies": [ { "text": request.POST.get('text') } ],
}
	# construct requestBody depending on annotation type
# text annotation
if annotationType == "text":
requestBody = dict(requestBody.items() + {
"box": {
"x" : request.POST.get('box_x'),
"y" : request.POST.get('box_y'),
"width" : request.POST.get('box_width'),
"height" : request.POST.get('box_height')
},
"textRange":{
"position" : request.POST.get('range_position'),
"length" : request.POST.get('range_length')
},
"annotationPosition": {
"x" : request.POST.get('annotationPosition_x'),
"y" : request.POST.get('annotationPosition_y')
},
}.items())
# area annotation
elif annotationType == "area":
requestBody = dict(requestBody.items() + {
"box": {
"x" : request.POST.get('box_x'),
"y" : request.POST.get('box_y'),
"width" : request.POST.get('box_width'),
"height" : request.POST.get('box_height')
},
"annotationPosition": {
"x" : "0",
"y" : "0"
},
}.items())
# point annotation
elif annotationType == "point":
requestBody = dict(requestBody.items() + {
"box": {
"x" : request.POST.get('box_x'),
"y" : request.POST.get('box_y'),
"width" : "0",
"height" : "0"
},
"annotationPosition": {
"x" : "0",
"y" : "0"
},
}.items())
try:
# Make a request to Annotation API using clientId, fileId and requestBody
response = ant.CreateAnnotation(clientId, guid, requestBody)
if response.status == "Ok":
if response.result:
#Generation of iframe URL using fileGuId
if basePath == "https://api.groupdocs.com/v2.0":
iframe = 'https://apps.groupdocs.com/document-annotation2/embed/' + response.result.documentGuid
#iframe to dev server
elif basePath == "https://dev-api.groupdocs.com/v2.0":
iframe = 'https://dev-apps.groupdocs.com/document-annotation2/embed/' + response.result.documentGuid
#iframe to test server
elif basePath == "https://stage-api.groupdocs.com/v2.0":
iframe = 'https://stage-apps.groupdocs.com/document-annotation2/embed/' + response.result.documentGuid
#Iframe to realtime server
elif basePath == "http://realtime-api.groupdocs.com":
iframe = 'https://realtime-apps.groupdocs.com/document-annotation2/embed/' + response.result.documentGuid
iframe = signer.signUrl(iframe)
except Exception, e:
return render_to_response('__main__:templates/sample11.pt',
{ 'error' : str(e) })
	# If request was successful - set variables for template
return render_to_response('__main__:templates/sample11.pt',
{ 'userId' : clientId,
'privateKey' : privateKey,
'fileId' : fileId,
'annotationType' : annotationType,
'annotationText' : request.POST.get('text'),
'annotationId' : response.result.annotationGuid,
'iframe' : iframe,
'status' : response.status
},
request=request) | 40.827225 | 142 | 0.561682 | 762 | 7,798 | 5.650919 | 0.217848 | 0.068974 | 0.087784 | 0.03948 | 0.369949 | 0.340687 | 0.297956 | 0.267534 | 0.257083 | 0.257083 | 0 | 0.007953 | 0.322775 | 7,798 | 191 | 143 | 40.827225 | 0.807423 | 0.136189 | 0 | 0.373239 | 0 | 0 | 0.204147 | 0.032258 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.042254 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
612b757b9aaca93e9879dec08a653bb1d983c0f9 | 1,425 | py | Python | sherpa_client/models/new_user.py | kairntech/sherpa-client | cd259c87b7291eeec3f3ea025e368f2f069a06cd | [
"Apache-2.0"
] | null | null | null | sherpa_client/models/new_user.py | kairntech/sherpa-client | cd259c87b7291eeec3f3ea025e368f2f069a06cd | [
"Apache-2.0"
] | null | null | null | sherpa_client/models/new_user.py | kairntech/sherpa-client | cd259c87b7291eeec3f3ea025e368f2f069a06cd | [
"Apache-2.0"
] | null | null | null | from typing import Any, Dict, List, Type, TypeVar, Union, cast
import attr
from ..types import UNSET, Unset
T = TypeVar("T", bound="NewUser")
@attr.s(auto_attribs=True)
class NewUser:
""" """
password: str
permissions: List[str]
roles: List[str]
username: str
email: Union[Unset, str] = UNSET
def to_dict(self) -> Dict[str, Any]:
password = self.password
permissions = self.permissions
roles = self.roles
username = self.username
email = self.email
field_dict: Dict[str, Any] = {}
field_dict.update(
{
"password": password,
"permissions": permissions,
"roles": roles,
"username": username,
}
)
if email is not UNSET:
field_dict["email"] = email
return field_dict
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
password = d.pop("password")
permissions = cast(List[str], d.pop("permissions"))
roles = cast(List[str], d.pop("roles"))
username = d.pop("username")
email = d.pop("email", UNSET)
new_user = cls(
password=password,
permissions=permissions,
roles=roles,
username=username,
email=email,
)
return new_user
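# Round-trip sketch using only the methods defined above (all field values are placeholders):
#   user = NewUser(password='secret', permissions=['read'], roles=['annotator'], username='jane')
#   assert NewUser.from_dict(user.to_dict()).username == 'jane'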
| 21.923077 | 63 | 0.536842 | 154 | 1,425 | 4.896104 | 0.285714 | 0.026525 | 0.039788 | 0.037135 | 0.209549 | 0.169761 | 0.169761 | 0.169761 | 0 | 0 | 0 | 0 | 0.345965 | 1,425 | 64 | 64 | 22.265625 | 0.809013 | 0 | 0 | 0 | 0 | 0 | 0.057828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044444 | false | 0.111111 | 0.066667 | 0 | 0.288889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
612d8edf5c8e74a0adf08f4a0990ac0f4895b9ee | 9,417 | py | Python | app/main/User.py | simeonschuez/slurk | 5ece0ef0698c43ac274b94c4d9c87aa9bfaae021 | [
"BSD-3-Clause"
] | null | null | null | app/main/User.py | simeonschuez/slurk | 5ece0ef0698c43ac274b94c4d9c87aa9bfaae021 | [
"BSD-3-Clause"
] | null | null | null | app/main/User.py | simeonschuez/slurk | 5ece0ef0698c43ac274b94c4d9c87aa9bfaae021 | [
"BSD-3-Clause"
] | null | null | null | from calendar import timegm
from flask_login import UserMixin, login_user
from datetime import datetime
import flask_socketio as sio
from .Permissions import Permissions
from .Token import Token
from .database import Database
from .Room import Room, ROOMS
from .Logger import Logger
from .. import config
from .Layout import Layout
logged_users = {}
class User(UserMixin):
_id = None
def get_id(self):
return str(self.id())
def is_authenticated(self):
return self.token().valid()
def id(self):
return self._id
def token(self):
c = Database().get_cursor()
c.execute("SELECT TokenId FROM User WHERE Id = ?;", (self.id(),))
fetch = c.fetchone()
if fetch[0] is None:
return None
return Token.from_id(fetch[0])
def set_token(self, token: Token):
if not isinstance(token, Token):
raise TypeError(
f"Object of type `Token` expected, however type `{type(token)}` was passed")
db = Database()
db.get_cursor().execute(
'UPDATE User SET TokenId = ? WHERE Id = ?;', (token.id(), self.id()))
db.commit()
def name(self):
c = Database().get_cursor()
c.execute("SELECT Name FROM User WHERE Id = ?;", (self.id(),))
fetch = c.fetchone()
return fetch[0] if fetch and fetch[0] else None
def set_name(self, name: str):
if not isinstance(name, str):
raise TypeError(
f"Object of type `str` expected, however type `{type(name)}` was passed")
db = Database()
db.get_cursor().execute('UPDATE User SET Name = ? WHERE Id = ?;', (name, self.id()))
db.commit()
def sid(self):
c = Database().get_cursor()
c.execute(
'SELECT SessionId FROM SessionId WHERE UserId = ? ORDER BY Updated DESC LIMIT 1;', (self.id(),))
fetch = c.fetchone()
return fetch[0] if fetch and fetch[0] else None
def set_sid(self, sid: str):
if not isinstance(sid, str):
raise TypeError(
f"Object of type `str` expected, however type `{type(sid)}` was passed")
db = Database()
db.get_cursor().execute('INSERT OR REPLACE INTO SessionId(`UserId`, `SessionId`) VALUES(?, ?);',
(self.id(), sid))
db.commit()
def latest_room(self):
c = Database().get_cursor()
c.execute('SELECT LatestRoom FROM User WHERE Id = ?;', (self.id(),))
fetch = c.fetchone()
return Room(fetch[0]) if fetch and fetch[0] else None
def set_latest_room(self, latest_room: Room):
if not isinstance(latest_room, Room):
raise TypeError(
f"Object of type `Room` expected, however type `{type(latest_room)}` was passed")
db = Database()
db.get_cursor().execute('UPDATE User SET LatestRoom = ? WHERE Id = ?;',
(latest_room.id(), self.id()))
db.commit()
def join_room(self, room: Room):
if not isinstance(room, Room):
raise TypeError(
f"Object of type `Room` expected, however type `{type(room)}` was passed")
db = Database()
db.get_cursor().execute('INSERT OR REPLACE INTO UserRoom(`UserId`, `RoomId`) VALUES (?, ?);',
(self.id(), room.id()))
db.commit()
self.set_latest_room(room)
sio.join_room(room.name(), self.sid())
if room.id() not in ROOMS:
logfile_format = '%Y-%m-%d %H-%M-%S'
if "logfile-date-format" in config["server"]:
logfile_format = config["server"]["logfile-date-format"]
logfile_date_format = '{:'+logfile_format+"}"
logfile_date = logfile_date_format.format(datetime.now())
ROOMS[room.id()] = {
'log': Logger('log/{}-{}.log'.format(logfile_date, room.name())),
'users': {},
'listeners': {}
}
users = [User.from_id(id).serialize()
for id in ROOMS[room.id()]['users']]
ROOMS[room.id()]['users'][self.id()] = self
history = []
for event in ROOMS[room.id()]['log'].get_data():
if (event["type"] == "new_image" or event["type"] == "text") and ('receiver' not in event or event["receiver"] == self.id()):
history.append(event)
if event["type"] == "command" and event["user"]['id'] == self.id():
history.append(event)
sio.emit('status', {
'type': 'join',
'user': self.serialize(),
'room': room.serialize(),
'timestamp': timegm(datetime.now().utctimetuple())
}, room=room.name())
sio.emit('joined_room', {
'room': room.serialize(),
'layout': Layout.from_json_file(room.layout_path()).serialize(),
'users': users,
'history': history,
'self': self.serialize(),
'permissions': Permissions(self.token(), room).serialize()
}, room=self.sid())
ROOMS[room.id()]['log'].append(
{'type': "join", 'user': self.serialize(), 'room': room.serialize()})
print(self.name(), "joined room:", room.name())
def leave_room(self, room: Room):
if not isinstance(room, Room):
raise TypeError(
f"Object of type `Room` expected, however type `{type(room)}` was passed")
db = Database()
db.get_cursor().execute(
'DELETE FROM UserRoom WHERE UserId = ? AND RoomId = ?;', (self.id(), room.id()))
db.commit()
sio.leave_room(room.name(), self.sid())
sio.emit('left_room', {'room': room.serialize()}, room=self.sid())
ROOMS[room.id()]['log'].append(
{'type': "leave", 'user': self.serialize(), 'room': room.serialize()})
print(self.name(), "left room:", room.name())
if room.id() in ROOMS:
if self.id() in ROOMS[room.id()]['users']:
del ROOMS[room.id()]['users'][self.id()]
if not ROOMS[room.id()]:
del ROOMS[room.id()]
sio.close_room(room.name())
sio.emit('status', {
'type': 'leave',
'room': room.serialize(),
'user': self.serialize(),
'timestamp': timegm(datetime.now().utctimetuple())
}, room=room.name())
def rooms(self):
return [Room(id[0]) for id in Database().get_cursor().execute('SELECT RoomId FROM UserRoom WHERE UserId = ?',
(self.id(),))]
def in_room(self, room: Room):
if not isinstance(room, Room):
raise TypeError(
f"Object of type `Room` expected, however type `{type(room)}` was passed")
c = Database().get_cursor()
c.execute('SELECT COUNT(*) FROM UserRoom WHERE UserId = ? AND RoomId = ?',
(self.id(), room.id()))
fetch = c.fetchone()
return Room(fetch[0]) if fetch[0] else None
def serialize(self):
return {
'id': self.id(),
'name': self.name(),
'sid': self.sid(),
'token': self.token().serialize(),
'latest_room': self.latest_room().serialize(),
'rooms': [room.serialize() for room in self.rooms()]
}
@classmethod
def from_id(cls, id):
if not isinstance(id, int) and not isinstance(id, str):
raise TypeError(
f"Object of type `int` or `str` expected, however type `{type(id)}` was passed")
global logged_users
if id not in logged_users:
c = Database().get_cursor()
c.execute('SELECT COUNT(*) FROM User WHERE Id = ?', (id,))
logged_users[id] = cls(id) if c.fetchone()[0] != 0 else None
return logged_users[id]
@classmethod
def from_sid(cls, sid: str):
if not isinstance(sid, str):
raise TypeError(
f"Object of type `str` expected, however type `{type(sid)}` was passed")
c = Database().get_cursor()
c.execute('SELECT UserId FROM SessionId WHERE SessionId = ?', (sid,))
id = c.fetchone()
return cls(id[0]) if id[0] else None
@classmethod
def login(cls, name: str, token: Token):
if not token:
return None
if not isinstance(name, str):
raise TypeError(
f"Object of type `str` expected, however type `{type(name)}` was passed")
if not isinstance(token, Token):
raise TypeError(
f"Object of type `Token` expected, however type `{type(token)}` was passed")
if not token.valid():
return None
db = Database()
c = db.get_cursor()
c.execute('INSERT INTO User(`TokenId`, `Name`) VALUES (?, ?);',
(token.id(), name))
db.commit()
user = cls(c.lastrowid)
login_user(user)
return user
def __repr__(self):
return str(self.serialize())
def __init__(self, id: int):
if not isinstance(id, int) and not isinstance(id, str):
raise TypeError(
f"Object of type `int` or `str` expected, however type `{type(id)}` was passed")
self._id = int(id)
| 35.942748 | 137 | 0.538813 | 1,123 | 9,417 | 4.449688 | 0.11309 | 0.038423 | 0.036022 | 0.05043 | 0.548929 | 0.507304 | 0.477286 | 0.477286 | 0.441065 | 0.368821 | 0 | 0.002469 | 0.311883 | 9,417 | 261 | 138 | 36.08046 | 0.768673 | 0 | 0 | 0.406542 | 0 | 0.009346 | 0.211745 | 0.00223 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098131 | false | 0.056075 | 0.051402 | 0.028037 | 0.238318 | 0.009346 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
612d9b3d304302a8762350348e09a5291b61e673 | 2,825 | py | Python | tests/test_types.py | SupraSummus/petriish | 6e887ac067904bd95377c30aa04b74cb4478495b | [
"MIT"
] | null | null | null | tests/test_types.py | SupraSummus/petriish | 6e887ac067904bd95377c30aa04b74cb4478495b | [
"MIT"
] | 4 | 2018-09-05T21:31:12.000Z | 2018-09-11T20:13:20.000Z | tests/test_types.py | SupraSummus/petriish | 6e887ac067904bd95377c30aa04b74cb4478495b | [
"MIT"
] | null | null | null | from unittest import TestCase
from petriish.types import Resolver, TypeResolutionError, PolymorphicType, Record, Bytes
class TypesTestCase(TestCase):
def setUp(self):
self.resolver = Resolver()
def test_identity_bind(self):
x = PolymorphicType()
self.assertEqual(self.resolver.get_best_bind(x), x)
def test_unify_with_self(self):
x = PolymorphicType()
self.resolver.unify(x, x)
self.assertEqual(self.resolver.get_best_bind(x), x)
def test_unify(self):
x = PolymorphicType()
self.resolver.unify(x, Bytes())
self.assertEqual(self.resolver.get_best_bind(x), Bytes())
def test_unify_in_record(self):
x = PolymorphicType()
y = PolymorphicType()
self.resolver.unify(
Record({'a': x, 'b': Record()}),
Record({'a': Bytes(), 'b': y}),
)
self.assertEqual(self.resolver.get_best_bind(x), Bytes())
self.assertEqual(self.resolver.get_best_bind(y), Record())
def test_unify_variables(self):
x = PolymorphicType()
y = PolymorphicType()
z = PolymorphicType()
self.resolver.unify(x, y)
self.resolver.unify(y, z)
self.assertEqual(self.resolver.get_best_bind(x), self.resolver.get_best_bind(y))
self.assertEqual(self.resolver.get_best_bind(z), self.resolver.get_best_bind(y))
self.assertIn(self.resolver.get_best_bind(x), [x, y, z])
        # just to make sure it won't throw
self.resolver.unify(z, x)
self.resolver.unify(x, y)
def test_reuse_resolver(self):
x = PolymorphicType()
y = PolymorphicType()
self.resolver.unify(x, y)
self.resolver.unify(y, Record())
self.assertEqual(self.resolver.get_best_bind(x), Record())
r = Resolver()
r.unify(x, y)
r.unify(x, Record())
self.assertEqual(r.get_best_bind(y), Record())
def test_reuse_resolver_2(self):
x = PolymorphicType()
y = PolymorphicType()
self.resolver.unify(x, Record())
self.resolver.unify(x, y)
self.assertEqual(self.resolver.get_best_bind(y), Record())
r = Resolver()
r.unify(y, Record())
r.unify(x, y)
self.assertEqual(r.get_best_bind(x), Record())
def test_fail_on_recursive_type(self):
x = PolymorphicType()
with self.assertRaises(TypeResolutionError):
self.resolver.unify(x, Record({'aaa': x}))
self.assertEqual(self.resolver.get_best_bind(x), x)
def test_fail_on_reuse_resolver(self):
x = PolymorphicType()
self.resolver.unify(x, Record())
with self.assertRaises(TypeResolutionError):
self.resolver.unify(x, Bytes())
self.assertEqual(self.resolver.get_best_bind(x), Record())
| 33.235294 | 88 | 0.626549 | 352 | 2,825 | 4.869318 | 0.136364 | 0.203034 | 0.102684 | 0.155193 | 0.748541 | 0.672112 | 0.629522 | 0.510502 | 0.376896 | 0.222287 | 0 | 0.000467 | 0.242124 | 2,825 | 84 | 89 | 33.630952 | 0.800093 | 0.010973 | 0 | 0.537313 | 0 | 0 | 0.002507 | 0 | 0 | 0 | 0 | 0 | 0.238806 | 1 | 0.149254 | false | 0 | 0.029851 | 0 | 0.19403 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b61f0ab99f0d2949d7b1b8e0c18da681fa5ac75b | 759 | py | Python | bohr/config/appconfig.py | giganticode/bohr-framework | fd364a1f036123985ac96e9076e5dce3bbc2ca2c | [
"MIT"
] | null | null | null | bohr/config/appconfig.py | giganticode/bohr-framework | fd364a1f036123985ac96e9076e5dce3bbc2ca2c | [
"MIT"
] | 54 | 2021-02-17T13:36:51.000Z | 2021-08-25T05:06:57.000Z | bohr/config/appconfig.py | giganticode/bohr-framework | fd364a1f036123985ac96e9076e5dce3bbc2ca2c | [
"MIT"
] | null | null | null | from dataclasses import dataclass
from typing import Optional
from bohr.config.pathconfig import PathConfig, load_config_dict_from_file
from bohr.fs import find_project_root
from bohr.util.paths import AbsolutePath
@dataclass(frozen=True)
class AppConfig:
verbose: bool
paths: PathConfig
@staticmethod
def load(project_root: Optional[AbsolutePath] = None) -> "AppConfig":
project_root = project_root or find_project_root()
config_dict = load_config_dict_from_file(project_root)
try:
verbose_str = config_dict["core"]["verbose"]
verbose = verbose_str == "true" or verbose_str == "True"
except KeyError:
verbose = False
return AppConfig(verbose, PathConfig.load())
| 31.625 | 73 | 0.711462 | 91 | 759 | 5.703297 | 0.395604 | 0.127168 | 0.05395 | 0.069364 | 0.084778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212121 | 759 | 23 | 74 | 33 | 0.867893 | 0 | 0 | 0 | 0 | 0 | 0.036891 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.263158 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b61fd9bfa261de84307f305744905ff5ab44f6f3 | 4,470 | py | Python | genetic_algorithm/mutation.py | mig029/SnakeAI | 33dbbe1d5608a60a4c6f9d14c41d547f7dc98b3b | [
"MIT"
] | 181 | 2019-09-24T00:46:58.000Z | 2022-03-20T12:06:02.000Z | genetic_algorithm/mutation.py | mig029/SnakeAI | 33dbbe1d5608a60a4c6f9d14c41d547f7dc98b3b | [
"MIT"
] | 9 | 2019-09-25T19:58:11.000Z | 2022-03-04T05:10:37.000Z | genetic_algorithm/mutation.py | mig029/SnakeAI | 33dbbe1d5608a60a4c6f9d14c41d547f7dc98b3b | [
"MIT"
] | 76 | 2019-09-25T16:48:18.000Z | 2022-03-21T12:01:49.000Z | # 9.3.2
# 11.2.1
# 12.4.3
import numpy as np
from typing import List, Union, Optional
from .individual import Individual
def gaussian_mutation(chromosome: np.ndarray, prob_mutation: float,
mu: List[float] = None, sigma: List[float] = None,
scale: Optional[float] = None) -> None:
"""
Perform a gaussian mutation for each gene in an individual with probability, prob_mutation.
If mu and sigma are defined then the gaussian distribution will be drawn from that,
otherwise it will be drawn from N(0, 1) for the shape of the individual.
"""
# Determine which genes will be mutated
mutation_array = np.random.random(chromosome.shape) < prob_mutation
# If mu and sigma are defined, create gaussian distribution around each one
if mu and sigma:
gaussian_mutation = np.random.normal(mu, sigma)
# Otherwise center around N(0,1)
else:
gaussian_mutation = np.random.normal(size=chromosome.shape)
if scale:
gaussian_mutation[mutation_array] *= scale
# Update
chromosome[mutation_array] += gaussian_mutation[mutation_array]
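# Minimal usage sketch for the in-place mutation above (array shape and rates are
# assumed for illustration only, not part of the original module):
#
#   chromosome = np.random.uniform(-1.0, 1.0, size=(10,))
#   gaussian_mutation(chromosome, prob_mutation=0.05, scale=0.2)
#
# Roughly 5% of the genes are perturbed in place by a scaled sample from N(0, 1).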
def random_uniform_mutation(chromosome: np.ndarray, prob_mutation: float,
low: Union[List[float], float],
high: Union[List[float], float]
) -> None:
"""
Randomly mutate each gene in an individual with probability, prob_mutation.
If a gene is selected for mutation it will be assigned a value with uniform probability
between [low, high).
@Note [low, high) is defined for each gene to help get the full range of possible values
@TODO: Eq 11.4
"""
assert type(low) == type(high), 'low and high must have the same type'
mutation_array = np.random.random(chromosome.shape) < prob_mutation
if isinstance(low, list):
uniform_mutation = np.random.uniform(low, high)
else:
uniform_mutation = np.random.uniform(low, high, size=chromosome.shape)
chromosome[mutation_array] = uniform_mutation[mutation_array]
def uniform_mutation_with_respect_to_best_individual(chromosome: np.ndarray, best_chromosome: np.ndarray, prob_mutation: float) -> None:
"""
Randomly mutate each gene in an individual with probability, prob_mutation.
If a gene is selected for mutation it will nudged towards the gene from the best individual.
@TODO: Eq 11.6
"""
mutation_array = np.random.random(chromosome.shape) < prob_mutation
uniform_mutation = np.random.uniform(size=chromosome.shape)
chromosome[mutation_array] += uniform_mutation[mutation_array] * (best_chromosome[mutation_array] - chromosome[mutation_array])
def cauchy_mutation(individual: np.ndarray, scale: float) -> np.ndarray:
pass
def exponential_mutation(chromosome: np.ndarray, xi: Union[float, np.ndarray], prob_mutation: float) -> None:
mutation_array = np.random.random(chromosome.shape) < prob_mutation
# Fill xi if necessary
if not isinstance(xi, np.ndarray):
xi_val = xi
xi = np.empty(chromosome.shape)
xi.fill(xi_val)
# Change xi so we get E(0, 1), instead of E(0, xi)
xi_div = 1.0 / xi
xi.fill(1.0)
# Eq 11.17
y = np.random.uniform(size=chromosome.shape)
x = np.empty(chromosome.shape)
x[y <= 0.5] = (1.0 / xi[y <= 0.5]) * np.log(2 * y[y <= 0.5])
x[y > 0.5] = -(1.0 / xi[y > 0.5]) * np.log(2 * (1 - y[y > 0.5]))
# Eq 11.16
delta = np.empty(chromosome.shape)
delta[mutation_array] = (xi[mutation_array] / 2.0) * np.exp(-xi[mutation_array] * np.abs(x[mutation_array]))
# Update delta such that E(0, xi) = (1 / xi) * E(0 , 1)
delta[mutation_array] = xi_div[mutation_array] * delta[mutation_array]
# Update individual
chromosome[mutation_array] += delta[mutation_array]
def mmo_mutation(chromosome: np.ndarray, prob_mutation: float) -> None:
from scipy import stats
mutation_array = np.random.random(chromosome.shape) < prob_mutation
normal = np.random.normal(size=chromosome.shape) # Eq 11.21
cauchy = stats.cauchy.rvs(size=chromosome.shape) # Eq 11.22
# Eq 11.20
delta = np.empty(chromosome.shape)
delta[mutation_array] = normal[mutation_array] + cauchy[mutation_array]
# Update individual
chromosome[mutation_array] += delta[mutation_array] | 41.775701 | 137 | 0.660626 | 615 | 4,470 | 4.691057 | 0.214634 | 0.12617 | 0.055806 | 0.036395 | 0.501906 | 0.465165 | 0.414905 | 0.32825 | 0.283189 | 0.227036 | 0 | 0.021364 | 0.23557 | 4,470 | 107 | 138 | 41.775701 | 0.822944 | 0.245861 | 0 | 0.2 | 0 | 0 | 0.011349 | 0 | 0 | 0 | 0 | 0.018692 | 0.018182 | 1 | 0.109091 | false | 0.018182 | 0.072727 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b629487815f53b819e62caac70113be42192235e | 1,286 | py | Python | custom_object.py | Jerbuck/content-playground | 3760b1894370d9ffaaa9032e9ce0af6a4954e046 | [
"MIT"
] | null | null | null | custom_object.py | Jerbuck/content-playground | 3760b1894370d9ffaaa9032e9ce0af6a4954e046 | [
"MIT"
] | null | null | null | custom_object.py | Jerbuck/content-playground | 3760b1894370d9ffaaa9032e9ce0af6a4954e046 | [
"MIT"
] | null | null | null | import sys
username = None
password = None
base_url = None
objects = []
class CustomObject(object):
"""Data class for custom object."""
def __init__(self):
self.username = None
self.password = None
self.base_url = None
self.objects = []
def print(self):
"""Print the contents of the custom object."""
print(f"\n--> Username: {self.username}")
print(f"--> Password: {self.password}")
print(f"--> Base URL: {self.base_url}")
print(f"--> Objects: {self.objects}\n")
def load(self, data):
"""Load the custom object from a dictionary."""
try:
self.username = data["username"]
self.password = data["password"]
self.base_url = data["base-url"]
self.objects = data['objects']
if (self.username == None) or \
(self.password == None) or \
(self.base_url == None) or \
(self.objects == []):
print(f"\nERROR: Load called with null data.")
sys.exit()
except:
print(f"\nERROR: Unable to parse data into custom object.")
sys.exit()
if __name__ == '__main__':
custom_object = CustomObject()
custom_object.print() | 29.906977 | 71 | 0.53888 | 145 | 1,286 | 4.648276 | 0.289655 | 0.0727 | 0.065282 | 0.04451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.321151 | 1,286 | 43 | 72 | 29.906977 | 0.77205 | 0.087092 | 0 | 0 | 0 | 0 | 0.208801 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.088235 | false | 0.147059 | 0.029412 | 0 | 0.147059 | 0.235294 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b62dffd3ebf5c0ef3c4b470740acee6b457152ef | 310 | py | Python | zorro/__init__.py | tailhook/zorro | 8f004515ac934ba88b8df72b85c536e607d82712 | [
"MIT"
] | 15 | 2015-06-06T08:12:41.000Z | 2021-08-17T18:29:47.000Z | zorro/__init__.py | tailhook/zorro | 8f004515ac934ba88b8df72b85c536e607d82712 | [
"MIT"
] | null | null | null | zorro/__init__.py | tailhook/zorro | 8f004515ac934ba88b8df72b85c536e607d82712 | [
"MIT"
] | 3 | 2015-05-10T05:28:45.000Z | 2016-03-03T04:11:43.000Z | from functools import wraps
from contextlib import contextmanager
from .core import Hub, gethub, Future, Condition, Lock, TimeoutError
__version__ = '0.2.a0'
__all__ = [
'Hub',
'gethub',
'sleep',
'Future',
'Condition',
'Lock',
]
def sleep(value):
gethub().do_sleep(value)
| 14.761905 | 68 | 0.641935 | 35 | 310 | 5.428571 | 0.628571 | 0.094737 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012552 | 0.229032 | 310 | 20 | 69 | 15.5 | 0.782427 | 0 | 0 | 0 | 0 | 0 | 0.125806 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.214286 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6318c5fe30048e18ee5d9db9d7db7aac78ac012 | 887 | py | Python | tpDcc/libs/python/web.py | tpDcc/tpDcc-libs-python | bd7b39f3cbfb68e99705de02c0b7738fd9602917 | [
"MIT"
] | 1 | 2019-07-15T21:00:18.000Z | 2019-07-15T21:00:18.000Z | tpDcc/libs/python/web.py | tpoveda/tpPyUtils | bd7b39f3cbfb68e99705de02c0b7738fd9602917 | [
"MIT"
] | 1 | 2021-03-02T08:57:33.000Z | 2021-03-04T01:55:05.000Z | tpDcc/libs/python/web.py | tpoveda/tpPyUtils | bd7b39f3cbfb68e99705de02c0b7738fd9602917 | [
"MIT"
] | 1 | 2021-03-03T21:01:54.000Z | 2021-03-03T21:01:54.000Z | #! /usr/bin/env python
# -*- coding: utf-8 -*-
"""
Module that contains utility functions related with email
"""
from __future__ import print_function, division, absolute_import
import os
import webbrowser
try:
import urllib2 as urllib
except ImportError:
import urllib
from tpDcc.libs.python import osplatform
def open_web(url):
"""
Open the given URL in the user's web browser
:param url: str
"""
if osplatform.is_linux():
try:
os.system('gio open {}'.format(url))
except Exception:
webbrowser.open(url, 0)
else:
webbrowser.open(url, 0)
def safe_open_url(url):
"""
Opens given URL in a safe way
:param url: str
:return:
"""
try:
result = urllib.urlopen(url)
except urllib.HTTPError as exc:
raise Exception('{} : {}'.format(exc, exc.url))
return result
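# Minimal usage sketch (the URL is an assumed example, not part of the original module):
#
#   response = safe_open_url('https://www.python.org')
#   html = response.read()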
| 18.479167 | 64 | 0.62345 | 113 | 887 | 4.80531 | 0.548673 | 0.038674 | 0.040516 | 0.066298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006163 | 0.26832 | 887 | 47 | 65 | 18.87234 | 0.830508 | 0.237881 | 0 | 0.227273 | 0 | 0 | 0.0288 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.318182 | 0 | 0.454545 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b635195d33c8a5d05f49bbc5c4402fa9e68990e5 | 26,200 | py | Python | pymtl3/datatypes/bitstructs.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 152 | 2020-06-03T02:34:11.000Z | 2022-03-30T04:16:45.000Z | pymtl3/datatypes/bitstructs.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 139 | 2019-05-29T00:37:09.000Z | 2020-05-17T16:49:26.000Z | pymtl3/datatypes/bitstructs.py | kevinyuan/pymtl3 | 5949e6a4acc625c0ccbbb25be3af1d0db683df3c | [
"BSD-3-Clause"
] | 22 | 2020-05-18T13:42:05.000Z | 2022-03-11T08:37:51.000Z | """
==========================================================================
bitstruct.py
==========================================================================
APIs to generate a bitstruct type. Using decorators and type annotations
to create bit struct is much inspired by python3 dataclass implementation.
Note that the implementation (such as the _CAPITAL constants to add some
private metadata) in this file is very similar to the **original python3
dataclass implementation**. The syntax of creating bit struct is very
similar to that of python3 dataclass.
https://github.com/python/cpython/blob/master/Lib/dataclasses.py
For example,
@bitstruct
class Point:
x : Bits4
y : Bits4
will automatically generate some methods, such as __init__, __str__,
__repr__, for the Point class.
Similar to the built-in dataclasses module, we also provide a
mk_bitstruct function for user to dynamically generate bit struct types.
For example,
mk_bitstruct( 'Pixel',{
'r' : Bits4,
'g' : Bits4,
'b' : Bits4,
},
name_space = {
'__str__' : lambda self: f'({self.r},{self.g},{self.b})'
}
)
is equivalent to:
@bitstruct
class Pixel:
r : Bits4
g : Bits4
b : Bits4
def __str__( self ):
return f'({self.r},{self.g},{self.b})'
Author : Yanghui Ou, Shunning Jiang
Date : Oct 19, 2019
"""
import functools
import keyword
import operator
import types
import warnings
import py
from pymtl3.extra.pypy import custom_exec
from .bits_import import *
from .helpers import concat
#-------------------------------------------------------------------------
# Constants
#-------------------------------------------------------------------------
# Object with this attribute is considered as bit struct, as we assume
# only the bitstruct decorator will stamp this attribute to a class. This
# attribute also stores the field information and can be used for
# translation.
#
# The original dataclass use hasattr( cls, _FIELDS ) to check dataclass.
# We do this here as well
_FIELDS = '__bitstruct_fields__'
def is_bitstruct_inst( obj ):
"""Returns True if obj is an instance of a dataclass."""
return hasattr(type(obj), _FIELDS)
def is_bitstruct_class(cls):
"""Returns True if obj is a dataclass ."""
return isinstance(cls, type) and hasattr(cls, _FIELDS)
def get_bitstruct_inst_all_classes( obj ):
# list: put all types together
if isinstance( obj, list ):
return functools.reduce( operator.or_, [ get_bitstruct_inst_all_classes(x) for x in obj ] )
ret = { obj.__class__ }
# BitsN or int
if isinstance( obj, (Bits, int) ):
return ret
# BitStruct
assert is_bitstruct_inst( obj ), f"{obj} is not a valid PyMTL Bitstruct!"
return ret | functools.reduce( operator.or_, [ get_bitstruct_inst_all_classes(getattr(obj, v))
for v in obj.__bitstruct_fields__.keys() ] )
_DEFAULT_SELF_NAME = 's'
_ANTI_CONFLICT_SELF_NAME = '__bitstruct_self__'
#-------------------------------------------------------------------------
# _create_fn
#-------------------------------------------------------------------------
# A helper function that creates a function based on
# - fn_name : name of the function
# - args_lst : a list of arguments in string
# - body_lst : a list of statement of the function body in string
# Also note that this whole _create_fn thing is similar to the original
# dataclass implementation!
def _create_fn( fn_name, args_lst, body_lst, _globals=None ):
# Assemble argument string and body string
args = ', '.join(args_lst)
body = '\n'.join(f' {statement}' for statement in body_lst)
# Assemble the source code and execute it
src = f'def {fn_name}({args}):\n{body}'
if _globals is None: _globals = {}
_locals = {}
custom_exec( py.code.Source(src).compile(), _globals, _locals )
return _locals[fn_name]
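# As a hedged illustration (not part of the original module), a call such as
#   _create_fn( 'add_one', [ 'x' ], [ 'return x + 1' ] )
# assembles the source
#   def add_one(x):
#     return x + 1
# executes it, and returns the resulting function object.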
#-------------------------------------------------------------------------
# _mk_init_arg
#-------------------------------------------------------------------------
# Creates an init argument string from a field.
#
# Shunning: I revamped the whole thing because they are indeed mutable
# objects.
def _mk_init_arg( name, type_ ):
# default is always None
if isinstance( type_, list ) or is_bitstruct_class( type_ ):
return f'{name} = None'
return f'{name} = 0'
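# For example (illustrative only): _mk_init_arg( 'x', Bits4 ) returns 'x = 0',
# while a nested bitstruct or list field returns '<name> = None' so that a fresh
# default value can be constructed inside __init__.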
#-------------------------------------------------------------------------
# _mk_init_body
#-------------------------------------------------------------------------
# Creates one line of __init__ body from a field. The field's default type is
# expected to have been added to globals (as _type_<name>) by _mk_init_fn.
def _mk_init_body( self_name, name, type_ ):
def _recursive_generate_init( x ):
if isinstance( x, list ):
return f"[{', '.join( [ _recursive_generate_init(x[0]) ] * len(x) )}]"
return f"_type_{name}()"
if isinstance( type_, list ) or is_bitstruct_class( type_ ):
return f'{self_name}.{name} = {name} or {_recursive_generate_init(type_)}'
assert issubclass( type_, Bits )
return f'{self_name}.{name} = _type_{name}({name})'
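# For example (illustrative only), with self_name 's' and a field 'x' of type
# Bits4, _mk_init_body returns 's.x = _type_x(x)'; for a 2-element list field
# 'y' it returns 's.y = y or [_type_y(), _type_y()]'.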
#-------------------------------------------------------------------------
# _mk_tuple_str
#-------------------------------------------------------------------------
# Creates a tuple of string representations of each field. For example,
# if the self_name is 'self' and fields is [ 'x', 'y' ], it will return
# ('self.x', 'self.y'). This is used for creating the default __eq__ and
# __hash__ function.
def _mk_tuple_str( self_name, fields ):
return f'({",".join([f"{self_name}.{name}" for name in fields])},)'
#-------------------------------------------------------------------------
# _mk_init_fn
#-------------------------------------------------------------------------
# Creates a __init__ function based on fields. For example, if fields
# contains two field x (Bits4) and y (Bits4), _mk_init_fn will return a
# function that looks like the following:
#
# def __init__( s, x = 0, y = 0, z = None, p = None ):
# s.x = _type_x(x)
# s.y = _type_y(y)
# s.z = z or _type_z()
# s.p = p or [ _type_p(), _type_p() ]
#
# NOTE:
# _mk_init_fn also takes as argument the name of self in case there is a
# field with name 's' or 'self'.
#
# TODO: should we provide a __post_init__ function like dataclass does?
def _mk_init_fn( self_name, fields ):
# Register necessary types in _globals
_globals = {}
for name, type_ in fields.items():
if isinstance( type_, list ):
x = type_[0]
while isinstance( x, list ):
x = x[0]
_globals[ f"_type_{name}" ] = x
else:
assert issubclass( type_, Bits ) or is_bitstruct_class( type_ )
_globals[ f"_type_{name}" ] = type_
return _create_fn(
'__init__',
[ self_name ] + [ _mk_init_arg( *field ) for field in fields.items() ],
[ _mk_init_body( self_name, *field ) for field in fields.items() ],
_globals = _globals,
)
#-------------------------------------------------------------------------
# _mk_str_fn
#-------------------------------------------------------------------------
# Creates a __str__ function based on fields. For example, if fields
# contains two field x (Bits4) and y (Bits4), _mk_str_fn will return a
# function that looks like the following:
#
# def __str__( self ):
# return f'{self.x}:{self.y}'
def _mk_str_fn( fields ):
return _create_fn(
'__str__',
[ 'self' ],
[ 'return f"' +
':'.join([ f'{{self.{name}}}' for name in fields ]) + '"']
)
#-------------------------------------------------------------------------
# _mk_repr_fn
#-------------------------------------------------------------------------
# Creates a __repr__ function based on fields. For example, if fields
# contains two field x (Bits4) and y (Bits4), _mk_repr_fn will return a
# function that looks like the following:
#
# def __repr__( self ):
# return self.__class__.__name__ + f'(x={self.x!r}, y={self.y!r})'
def _mk_repr_fn( fields ):
return _create_fn(
'__repr__',
[ 'self' ],
[ 'return self.__class__.__name__ + f"(' +
','.join([ f'{{self.{name}!r}}' for name in fields ]) +
')"']
)
#-------------------------------------------------------------------------
# _mk_eq_fn
#-------------------------------------------------------------------------
# Creates a __eq__ function based on fields. By default it just compares
# each field. For example, if fields contains two field x (Bits4) and y
# (Bits4), _mk_eq_fn will return a function that looks like the
# following:
#
# def __eq__( self, other ):
# if other.__class__ is self.__class__:
# return (self.x,self.y,) == (other.x,other.y,)
# else:
# raise NotImplemented
def _mk_eq_fn( fields ):
self_tuple = _mk_tuple_str( 'self', fields )
other_tuple = _mk_tuple_str( 'other', fields )
return _create_fn(
'__eq__',
[ 'self', 'other' ],
[ f'return (other.__class__ is self.__class__) and {self_tuple} == {other_tuple}' ]
)
#-------------------------------------------------------------------------
# _mk_hash_fn
#-------------------------------------------------------------------------
# Creates a __hash__ function based on fields. By default it just hashes
# all fields. For example, if fields contains two field x (Bits4) and y
# (Bits4), _mk_hash_fn will return a function that looks like the
# following:
#
# def __hash__( self ):
# return hash((self.x,self.y,))
def _mk_hash_fn( fields ):
self_tuple = _mk_tuple_str( 'self', fields )
return _create_fn(
'__hash__',
[ 'self' ],
[ f'return hash({self_tuple})' ]
)
#--------------------------PyMTL3 specific--------------------------------
#-------------------------------------------------------------------------
# _mk_ff_fn
#-------------------------------------------------------------------------
# Creates __ilshift__ and _flip functions that looks like the following:
#
# def __ilshift__( self, other ):
# if self.__class__ is not other.__class__:
# other = self.__class__.from_bits( other.to_bits() )
# self.x <<= other.x
# self.y[0][0] <<= other.y[0][0]
#
# def _flip( self ):
# self.x._flip()
# self.y[i][j]._flip()
def _mk_ff_fn( fields ):
def _gen_list_ilshift_strs( type_, prefix='' ):
if isinstance( type_, list ):
ilshift_strs, flip_strs = [], []
for i in range(len(type_)):
ils, fls = _gen_list_ilshift_strs( type_[0], f"{prefix}[{i}]" )
ilshift_strs.extend( ils )
flip_strs.extend( fls )
return ilshift_strs, flip_strs
else:
return [ f"self.{prefix} <<= other.{prefix}" ], [f"self.{prefix}._flip()"]
ilshift_strs = [ 'if self.__class__ is not other.__class__:',
' other = self.__class__.from_bits( other.to_bits() )']
flip_strs = []
for name, type_ in fields.items():
ils, fls = _gen_list_ilshift_strs( type_, name )
ilshift_strs.extend( ils )
flip_strs.extend( fls )
return _create_fn(
'__ilshift__',
[ 'self', 'other' ],
ilshift_strs + [ "return self" ],
), _create_fn(
'_flip',
[ 'self' ],
flip_strs,
),
#-------------------------------------------------------------------------
# _mk_clone_fn
#-------------------------------------------------------------------------
# Creates clone function that looks like the following:
# Use this clone function in any place that you need to perform a
# deepcopy on a bitstruct.
#
# def clone( self ):
# return self.__class__( self.x.clone(), [ self.y[0].clone(), self.y[1].clone() ] )
def _gen_list_clone_strs( type_, prefix='' ):
if isinstance( type_, list ):
return "[" + ",".join( [ _gen_list_clone_strs( type_[0], f"{prefix}[{i}]" )
for i in range(len(type_)) ] ) + "]"
else:
return f"{prefix}.clone()"
def _mk_clone_fn( fields ):
clone_strs = [ 'return self.__class__(' ]
for name, type_ in fields.items():
clone_strs.append( " " + _gen_list_clone_strs( type_, f'self.{name}' ) + "," )
return _create_fn(
'clone',
[ 'self' ],
clone_strs + [ ')' ],
)
def _mk_deepcopy_fn( fields ):
clone_strs = [ 'return self.__class__(' ]
for name, type_ in fields.items():
clone_strs.append( " " + _gen_list_clone_strs( type_, f'self.{name}' ) + "," )
return _create_fn(
'__deepcopy__',
[ 'self', 'memo' ],
clone_strs + [ ')' ],
)
#-------------------------------------------------------------------------
# _mk_imatmul_fn
#-------------------------------------------------------------------------
# Creates @= function that copies the value over ...
# TODO create individual from_bits for imatmul and ilshift
# def __imatmul__( self, other ):
# if self.__class__ is not other.__class__:
# other = self.__class__.from_bits( other.to_bits() )
# self.x @= other.x
# self.y[0] @= other.y[0]
# self.y[1] @= other.y[1]
def _mk_imatmul_fn( fields ):
def _gen_list_imatmul_strs( type_, prefix='' ):
if isinstance( type_, list ):
ret = []
for i in range(len(type_)):
ret.extend( _gen_list_imatmul_strs( type_[0], f"{prefix}[{i}]" ) )
return ret
else:
return [ f"self.{prefix} @= other.{prefix}" ]
imatmul_strs = [ 'if self.__class__ is not other.__class__:',
' other = self.__class__.from_bits( other.to_bits() )']
for name, type_ in fields.items():
imatmul_strs.extend( _gen_list_imatmul_strs( type_, name ) )
return _create_fn(
'__imatmul__',
[ 'self', 'other' ],
imatmul_strs + [ "return self" ],
)
#-------------------------------------------------------------------------
# _mk_nbits_to_bits_fn
#-------------------------------------------------------------------------
# Creates nbits, to_bits function that copies the value over ...
#
# def to_bits( self ):
# return concat( self.x, self.y[0], self.y[1] )
#
# TODO packing order of array? x[0] is LSB or MSB of a list
# current we do LSB
def _mk_nbits_to_bits_fn( fields ):
def _gen_to_bits_strs( type_, prefix, start_bit ):
if isinstance( type_, list ):
to_strs = []
# The packing order is LSB, so we need to reverse the list to make x[-1] higher bits
for i in reversed(range(len(type_))):
start_bit, tos = _gen_to_bits_strs( type_[0], f"{prefix}[{i}]", start_bit )
to_strs.extend( tos )
return start_bit, to_strs
elif is_bitstruct_class( type_ ):
to_strs = []
for name, typ in getattr(type_, _FIELDS).items():
start_bit, tos = _gen_to_bits_strs( typ, f"{prefix}.{name}", start_bit )
to_strs.extend( tos )
return start_bit, to_strs
else:
end_bit = start_bit + type_.nbits
return end_bit, [ f"self.{prefix}" ]
to_bits_strs = []
total_nbits = 0
for name, type_ in fields.items():
total_nbits, tos = _gen_to_bits_strs( type_, name, total_nbits )
to_bits_strs.extend( tos )
return total_nbits, _create_fn( 'to_bits', [ 'self' ],
[ f"return concat({', '.join(to_bits_strs)})" ],
_globals={'concat':concat} )
#-------------------------------------------------------------------------
# _mk_from_bits_fn
#-------------------------------------------------------------------------
# Creates static method from_bits that creates a new bitstruct based on Bits
# and instance method _from_bits that copies the value over
#
# @staticmethod
# def from_bits( other ):
# return self.__class__( other[16:32], other[0:16] )
def _mk_from_bits_fns( fields, total_nbits ):
def _gen_from_bits_strs( type_, end_bit ):
if isinstance( type_, list ):
from_strs = []
# Since we are doing LSB for x[0], we need to unpack from the last
# element of the list, and then reverse it again to construct a list ...
for i in range(len(type_)):
end_bit, fs = _gen_from_bits_strs( type_[0], end_bit )
from_strs.extend( fs )
return end_bit, [ f"[{','.join(reversed(from_strs))}]" ]
elif is_bitstruct_class( type_ ):
if type_ in type_name_mapping:
type_name = type_name_mapping[ type_ ]
else:
type_name = f"_type{len(type_name_mapping)}"
type_name_mapping[ type_ ] = type_name
from_strs = []
for name, typ in getattr(type_, _FIELDS).items():
end_bit, fs = _gen_from_bits_strs( typ, end_bit )
from_strs.extend( fs )
return end_bit, [ f"{type_name}({','.join(from_strs)})" ]
else:
if type_ not in type_name_mapping:
type_name_mapping[ type_ ] = type_.__name__
else:
assert type_name_mapping[ type_ ] == type_.__name__
start_bit = end_bit - type_.nbits
return start_bit, [ f"other[{start_bit}:{end_bit}]" ]
from_bits_strs = []
end_bit = total_nbits
# This is to make sure we capture two types with the same name but different
# attributes
type_name_mapping = {}
type_count = 0
for _, type_ in fields.items():
end_bit, fs = _gen_from_bits_strs( type_, end_bit )
from_bits_strs.extend( fs )
assert end_bit == 0
_globals = { y: x for x,y in type_name_mapping.items() }
assert len(_globals) == len(type_name_mapping)
# TODO add assertion in bits
return _create_fn( 'from_bits', [ 'cls', 'other' ],
[ "assert cls.nbits == other.nbits, f'LHS bitstruct {cls.nbits}-bit <> RHS other {other.nbits}-bit'",
"other = other.to_bits()",
f"return cls({','.join(from_bits_strs)})" ], _globals )
#-------------------------------------------------------------------------
# _check_valid_array
#-------------------------------------------------------------------------
def _recursive_check_array_types( current ):
x = current[0]
if isinstance( x, list ):
x_len = len(x)
x_type = _recursive_check_array_types( x )
for y in current[1:]:
assert isinstance( y, list ) and len(y) == x_len
y_type = _recursive_check_array_types( y )
assert y_type is x_type
return x_type
assert issubclass( x, Bits ) or is_bitstruct_class( x )
for y in current[1:]:
assert y is x
return x
def _check_valid_array_of_types( arr ):
# Check if the provided list is a strict multidimensional array
try:
return _recursive_check_array_types( arr )
except Exception as e:
print(e)
return None
#-------------------------------------------------------------------------
# _check_field_annotation
#-------------------------------------------------------------------------
def _check_field_annotation( cls, name, type_ ):
# Make sure not default is annotated
if hasattr( cls, name ):
default = getattr( cls, name )
raise TypeError( "We don't allow subfields to have default value:\n"
f"- Field '{name}' of BitStruct {cls.__name__} has default value {default!r}." )
# Special case if the type is an instance of list
if isinstance( type_, list ):
if _check_valid_array_of_types( type_ ) is None:
raise TypeError( "The provided list spec should be a strict multidimensional ARRAY "
"with no varying sizes or types. All non-list elements should be VALID types." )
else:
# Now we work with types
if not isinstance( type_, type ):
raise TypeError(f"{type_} is not a type\n"\
f"- Field '{name}' of BitStruct {cls.__name__} is annotated as {type_}.")
# More specifically, Bits and BitStruct
if not issubclass( type_, Bits ) and not is_bitstruct_class( type_ ):
raise TypeError( "We currently only support BitsN, list, or another BitStruct as BitStruct field:\n"
f"- Field '{name}' of BitStruct {cls.__name__} is annotated as {type_}." )
#-------------------------------------------------------------------------
# _get_self_name
#-------------------------------------------------------------------------
# Return a self name based on fields.
def _get_self_name( fields ):
return( _ANTI_CONFLICT_SELF_NAME if _DEFAULT_SELF_NAME in fields else
_DEFAULT_SELF_NAME )
#-------------------------------------------------------------------------
# _process_cls
#-------------------------------------------------------------------------
# Process the input cls and add methods to it.
_bitstruct_hash_cache = {}
def _process_class( cls, add_init=True, add_str=True, add_repr=True,
add_hash=True ):
# Get annotations of the class
cls_annotations = cls.__dict__.get('__annotations__', {})
if not cls_annotations:
raise AttributeError( "No field is declared in the bit struct definition.\n"
f"Suggestion: check the definition of {cls.__name__} to"
" make sure it only contains 'field_name(string): Type(type).'" )
# Get field information from the annotation and prepare for hashing
fields = {}
hashable_fields = {}
def _convert_list_to_tuple( x ):
if isinstance( x, list ):
return tuple( [ _convert_list_to_tuple( y ) for y in x ] )
return x
reserved_fields = ['to_bits', 'from_bits', 'nbits']
for x in reserved_fields:
assert x not in cls.__dict__, f"Currently a bitstruct cannot have {reserved_fields}, but "\
f"{x} is provided as {cls.__dict__[x]}"
for a_name, a_type in cls_annotations.items():
assert a_name not in reserved_fields, f"Currently a bitstruct cannot have {reserved_fields}, but "\
f"{a_name} is annotated as {a_type}"
_check_field_annotation( cls, a_name, a_type )
fields[ a_name ] = a_type
hashable_fields[ a_name ] = _convert_list_to_tuple( a_type )
cls._hash = _hash = hash( (cls.__name__, *tuple(hashable_fields.items()),
add_init, add_str, add_repr, add_hash) )
if _hash in _bitstruct_hash_cache:
return _bitstruct_hash_cache[ _hash ]
_bitstruct_hash_cache[ _hash ] = cls
# Stamp the special attribute so that translation pass can identify it
# as bit struct.
setattr( cls, _FIELDS, fields )
# Add methods to the class
# Create __init__. Here I follow the dataclass convention that we only
# add our generated __init__ function when add_init is true and user
# did not define their own init.
if add_init:
if not '__init__' in cls.__dict__:
cls.__init__ = _mk_init_fn( _get_self_name(fields), fields )
# Create __str__
if add_str:
if not '__str__' in cls.__dict__:
cls.__str__ = _mk_str_fn( fields )
# Create __repr__
if add_repr:
if not '__repr__' in cls.__dict__:
cls.__repr__ = _mk_repr_fn( fields )
# Create __eq__. There is no need for a __ne__ method as python will
# call __eq__ and negate it.
# NOTE: if user overwrites __eq__ it may lead to different behavior for
# the translated verilog as in the verilog world two bit structs are
# equal only if all the fields are equal. We always try to add __eq__
if not '__eq__' in cls.__dict__:
cls.__eq__ = _mk_eq_fn( fields )
else:
w_msg = ( f'Overwriting {cls.__qualname__}\'s __eq__ may cause the '
'translated verilog behaves differently from PyMTL '
'simulation.')
warnings.warn( w_msg )
# Create __hash__.
if add_hash:
if not '__hash__' in cls.__dict__:
cls.__hash__ = _mk_hash_fn( fields )
# Shunning: add __ilshift__ and _flip for update_ff
assert not '__ilshift__' in cls.__dict__ and not '_flip' in cls.__dict__
cls.__ilshift__, cls._flip = _mk_ff_fn( fields )
# Shunning: add clone
assert not 'clone' in cls.__dict__ and not '__deepcopy__' in cls.__dict__
cls.clone = _mk_clone_fn( fields )
cls.__deepcopy__ = _mk_deepcopy_fn( fields )
# Shunning: add imatmul for assignment, as well as nbits/to_bits/from_bits
assert '__imatmul__' not in cls.__dict__ and 'to_bits' not in cls.__dict__ and \
'nbits' not in cls.__dict__ and 'from_bits' not in cls.__dict__
cls.__imatmul__ = _mk_imatmul_fn( fields )
cls.nbits, cls.to_bits = _mk_nbits_to_bits_fn( fields )
from_bits = _mk_from_bits_fns( fields, cls.nbits )
cls.from_bits = classmethod(from_bits)
assert not 'get_field_type' in cls.__dict__
def get_field_type( cls, name ):
if name in cls.__bitstruct_fields__:
return cls.__bitstruct_fields__[ name ]
raise AttributeError( f"{cls} has no field '{name}'" )
cls.get_field_type = classmethod(get_field_type)
# TODO: maybe add a to_bits and from bits function.
return cls
#-------------------------------------------------------------------------
# bitstruct
#-------------------------------------------------------------------------
# The actual class decorator. We add a * in the argument list so that the
# following argument can only be used as keyword arguments.
def bitstruct( _cls=None, *, add_init=True, add_str=True, add_repr=True, add_hash=True ):
def wrap( cls ):
return _process_class( cls, add_init, add_str, add_repr )
# Called as @bitstruct(...)
if _cls is None:
return wrap
# Called as @bitstruct without parens.
return wrap( _cls )
#-------------------------------------------------------------------------
# mk_bitstruct
#-------------------------------------------------------------------------
# Dynamically generate a bit struct class.
# TODO: should we add base parameters to support inheritance?
def mk_bitstruct( cls_name, fields, *, namespace=None, add_init=True,
add_str=True, add_repr=True, add_hash=True ):
# copy namespace since will mutate it
namespace = {} if namespace is None else namespace.copy()
# We assume fields is a dictionary and thus there won't be duplicate
# field names. So we only check if the field names are indeed strings
# and that they are not keywords.
annos = {}
for name, f in fields.items():
if not isinstance( name, str ) or not name.isidentifier():
raise TypeError( f'Field name {name!r} is not a valid identifier!' )
if keyword.iskeyword( name ):
raise TypeError( f'Field name {name!r} is a keyword!' )
annos[ name ] = f
namespace['__annotations__'] = annos
cls = types.new_class( cls_name, (), {}, lambda ns: ns.update( namespace ) )
return bitstruct( cls, add_init=add_init, add_str=add_str,
add_repr=add_repr, add_hash=add_hash )
| 34.748011 | 122 | 0.580496 | 3,360 | 26,200 | 4.159821 | 0.122619 | 0.015454 | 0.009659 | 0.012878 | 0.323961 | 0.26193 | 0.201903 | 0.167776 | 0.153896 | 0.119625 | 0 | 0.003233 | 0.197099 | 26,200 | 753 | 123 | 34.794157 | 0.6612 | 0.420038 | 0 | 0.23662 | 1 | 0.005634 | 0.185908 | 0.026272 | 0 | 0 | 0 | 0.001328 | 0.047887 | 1 | 0.095775 | false | 0 | 0.025352 | 0.014085 | 0.259155 | 0.002817 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b638eb605f71c87607689ad53bda58ff3e514601 | 491 | py | Python | 2009-01-03_datacenter-shared-host-migration/real.py | neoice/neoice-random | 9a0387babc4eec3ff5c808b818bea696e40a7129 | [
"WTFPL"
] | null | null | null | 2009-01-03_datacenter-shared-host-migration/real.py | neoice/neoice-random | 9a0387babc4eec3ff5c808b818bea696e40a7129 | [
"WTFPL"
] | null | null | null | 2009-01-03_datacenter-shared-host-migration/real.py | neoice/neoice-random | 9a0387babc4eec3ff5c808b818bea696e40a7129 | [
"WTFPL"
] | null | null | null | #!/usr/bin/python
import os
import re
infile = open('domains', 'r')
outfile = open('output', 'a')
for line in infile:
line = line.strip('\n') # yank the newline off to make printing pretty and play nice with shell one-liners
x = os.system('host ' + line + ' | grep PREFIX.*')
if not x: # that system call returns 0 if found, 256 if not
print line + ' is hosted by us'
outfile.write(line + '\t Y' + '\n')
else:
outfile.write(line + '\t N' + '\n')
infile.close()
outfile.close()
| 24.55 | 107 | 0.641548 | 81 | 491 | 3.888889 | 0.691358 | 0.031746 | 0.101587 | 0.107937 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010152 | 0.197556 | 491 | 19 | 108 | 25.842105 | 0.78934 | 0.295316 | 0 | 0 | 0 | 0 | 0.19242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6394e522e1dd0319db4306310a7c1dcf549d35a | 1,345 | py | Python | design-patterns/structural/adapter.py | verthais/exercise-python | d989647e8fbfe8a79b9b5f2c3ab003715d238851 | [
"MIT"
] | null | null | null | design-patterns/structural/adapter.py | verthais/exercise-python | d989647e8fbfe8a79b9b5f2c3ab003715d238851 | [
"MIT"
] | null | null | null | design-patterns/structural/adapter.py | verthais/exercise-python | d989647e8fbfe8a79b9b5f2c3ab003715d238851 | [
"MIT"
] | null | null | null | class Korean:
"""Korean speaker"""
def __init__(self):
self.name = "Korean"
def speak_korean(self):
return "An-neyong?"
class British:
"""English speaker"""
def __init__(self):
self.name = "British"
#Note the different method name here!
def speak_english(self):
return "Hello!"
class Adapter:
"""This changes the generic method name to individualized method names"""
def __init__(self, object, **adapted_method):
"""Change the name of the method"""
self._object = object
#Add a new dictionary item that establishes the mapping between the generic method name: speak() and the concrete method
#For example, speak() will be translated to speak_korean() if the mapping says so
self.__dict__.update(adapted_method)
def __getattr__(self, attr):
"""Simply return the rest of attributes!"""
return getattr(self._object, attr)
def main():
#List to store speaker objects
objects = []
#Create a Korean object
korean = Korean()
#Create a British object
british = British()
#Append the objects to the objects list
objects.append(Adapter(korean, speak=korean.speak_korean))
objects.append(Adapter(british, speak=british.speak_english))
for obj in objects:
print("{} says '{}'\n".format(obj.name, obj.speak()))
if __name__ == '__main__':
main()
| 24.907407 | 123 | 0.689963 | 178 | 1,345 | 5.005618 | 0.376404 | 0.049383 | 0.037037 | 0.040404 | 0.058361 | 0.058361 | 0 | 0 | 0 | 0 | 0 | 0 | 0.196283 | 1,345 | 53 | 124 | 25.377358 | 0.824237 | 0.382156 | 0 | 0.076923 | 0 | 0 | 0.068365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.269231 | false | 0 | 0 | 0.076923 | 0.5 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b63f097723b0813436ade51cfc6168ae3228c596 | 28,932 | py | Python | utils/surface.py | fsanges/glTools | 8ff0899de43784a18bd4543285655e68e28fb5e5 | [
"MIT"
] | 165 | 2015-01-26T05:22:04.000Z | 2022-03-22T02:50:41.000Z | utils/surface.py | qeeji/glTools | 8ff0899de43784a18bd4543285655e68e28fb5e5 | [
"MIT"
] | 5 | 2015-12-02T02:39:44.000Z | 2020-12-09T02:45:54.000Z | utils/surface.py | qeeji/glTools | 8ff0899de43784a18bd4543285655e68e28fb5e5 | [
"MIT"
] | 83 | 2015-02-10T17:18:24.000Z | 2022-02-10T07:16:47.000Z | import maya.cmds as mc
import maya.OpenMaya as OpenMaya
import glTools.utils.base
import glTools.utils.curve
import glTools.utils.component
import glTools.utils.mathUtils
import glTools.utils.matrix
import glTools.utils.shape
import glTools.utils.stringUtils
import math
def isSurface(surface):
'''
Check if the specified object is a nurbs surface or transform parent of a surface
@param surface: Object to query
@type surface: str
'''
# Check object exists
if not mc.objExists(surface): return False
# Check shape
if mc.objectType(surface) == 'transform': surface = mc.listRelatives(surface,s=True,ni=True,pa=True)[0]
if mc.objectType(surface) != 'nurbsSurface': return False
# Return result
return True
def getSurfaceFn(surface):
'''
Create an MFnNurbsSurface class object from the specified nurbs surface
@param surface: Surface to create function class for
@type surface: str
'''
# Checks
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!')
if mc.objectType(surface) == 'transform':
surface = mc.listRelatives(surface,s=True,ni=True,pa=True)[0]
# Get MFnNurbsSurface
selection = OpenMaya.MSelectionList()
OpenMaya.MGlobal.getSelectionListByName(surface,selection)
surfacePath = OpenMaya.MDagPath()
selection.getDagPath(0,surfacePath)
surfaceFn = OpenMaya.MFnNurbsSurface()
surfaceFn.setObject(surfacePath)
# Return result
return surfaceFn
def chordLength(surface,param=0.0,direction='u'):
'''
Return the length of a surface isoparm given a parameter and a direction
@param surface: Surface to query closest point from
@type surface: str
@param param: The parameter on the surface to query length of
@type param: float
@param direction: Direction along the surface to measure length of
@type direction: str
'''
# Check surface
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!')
# Duplicate surface curve
curve = mc.duplicateCurve(surface+'.'+direction+'['+str(param)+']',ch=0,rn=0,local=0)
# Measure curve length
length = mc.arclen(curve[0])
# Cleanup
mc.delete(curve)
# Return result
return length
def closestPoint(surface,pos=(0,0,0)):
'''
Return closest point on surface to target position
@param surface: Surface to query closest point from
@type surface: str
@param pos: Position to query surface from
@type pos: tuple/list
'''
# Check surface
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!')
# Get point world position
pos = glTools.utils.base.getPosition(pos)
pt = OpenMaya.MPoint(pos[0],pos[1],pos[2],1.0)
# Get surface function set
surfFn = getSurfaceFn(surface)
# Get uCoord and vCoord pointer objects
uCoord = OpenMaya.MScriptUtil()
uCoord.createFromDouble(0.0)
uCoordPtr = uCoord.asDoublePtr()
vCoord = OpenMaya.MScriptUtil()
vCoord.createFromDouble(0.0)
vCoordPtr = vCoord.asDoublePtr()
# get closest uCoord to edit point position
surfFn.closestPoint(pt,uCoordPtr,vCoordPtr,True,0.0001,OpenMaya.MSpace.kWorld)
return (OpenMaya.MScriptUtil(uCoordPtr).asDouble(),OpenMaya.MScriptUtil(vCoordPtr).asDouble())
def distToSurface(surface,pos=(0,0,0)):
'''
Return the distance from a world space position to the closest point on the specified surface
@param surface: Surface to measure distance to
@type surface: str
@param pos: Position to query surface from
@type pos: tuple/list
'''
# Get point world position
pos = glTools.utils.base.getPosition(pos)
# Get closest point to surface
uv = closestPoint(surface,pos)
pt = mc.pointOnSurface(surface,u=uv[0],v=uv[1],p=True)
# Get distance to surface point
dist = glTools.utils.mathUtils.distanceBetween(pos,pt)
# Return Result
return dist
def snapPtsToSurface(surface,pointList):
'''
@param surface: Nurbs surface to snap points to
@type surface: str
@param pointList: Point to snap to the specified surface
@type pointList: list
'''
# Check surface
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!')
# Check points
pointList = mc.ls(pointList,fl=True)
# Transform types
transform = ['transform','joint','ikHandle','effector']
# Snap points
for pt in pointList:
# Check Transform
if transform.count(mc.objectType(pt)):
snapToSurface(surface,pt,0.0,0.0,True,snapPivot=False)
continue
# Move Point
pos = mc.pointPosition(pt)
(uParam,vParam) = closestPoint(surface,pos)
sPt = mc.pointOnSurface(surface,u=uParam,v=vParam,position=True)
mc.move(sPt[0],sPt[1],sPt[2],pt,ws=True,a=True)
def locatorSurface(surface,controlPoints=[],locatorScale=0.075,prefix=''):
'''
Drive the control point positions of a surface with a set of control locators
@param surface: Input surface to connect locators positions to
@type surface: str
@param controlPoints: List of control points to be driven by locators. If left as default (None), all control points will be connected.
@type controlPoints: list
@param locatorScale: Scale of the locators relative to the area of the surface
@type locatorScale: float
@param prefix: Name prefix for newly created nodes
@type prefix: str
'''
# Check surface
if not glTools.utils.surface.isSurface(surface):
raise Exception('Object '+surface+' is not a valid surface!')
if mc.objectType(surface) == 'transform':
surface = mc.listRelatives(surface,s=True,ni=True,pa=True)[0]
# Check prefix
if not prefix: prefix = glTools.utils.stringUtils.stripSuffix(surface)
# Calculate locator scale
locatorScale *= math.sqrt(glTools.utils.surface.surfaceArea(surface))
# Get Control Points
if not controlPoints:
controlPoints = glTools.utils.component.getComponentIndexList(surface)[surface]
else:
controlPoints = glTools.utils.component.getComponentIndexList(controlPoints)[surface]
# Create locators and connect to control points
locatorList = []
for cv in controlPoints:
# Get index (string)
ind = glTools.utils.component.getSingleIndex(surface,cv)
indStr = glTools.utils.stringUtils.stringIndex(ind,2)
# Create locator
loc = prefix+'_cv'+indStr+'_loc'
loc = mc.spaceLocator(n=loc)[0]
locatorList.append(loc)
mc.setAttr(loc+'.localScale',locatorScale,locatorScale,locatorScale)
# Get control point world position
pos = mc.pointPosition(surface+'.cv['+str(cv[0])+']['+str(cv[1])+']')
mc.setAttr(loc+'.t',pos[0],pos[1],pos[2])
mc.makeIdentity(loc,apply=True,t=1,r=1,s=1,n=0)
# Connect locator position to control point
mc.connectAttr(loc+'.worldPosition[0]',surface+'.controlPoints['+str(ind)+']')
return locatorList
def surfaceArea(surface,worldSpace=True):
'''
Calculates the surface area of a specified nurbs surface.
@param surface: Nurbs surface to calculate the surface area for
@type surface: str
@param worldSpace: Calculate the surface area in world or local space units
@type worldSpace: bool
'''
# Check Surface
if not mc.objExists(surface): raise Exception('Object '+surface+' does not exist!')
if mc.objectType(surface) == 'transform':
surfaceShape = mc.listRelatives(surface,s=True,ni=True,pa=True)[0]
if mc.objectType(surfaceShape) != 'nurbsSurface':
raise Exception('Object '+surface+' is not a valid nurbs surface!')
surface = surfaceShape
# Get MFnNurbsSurface
surfaceFn = getSurfaceFn(surface)
# Get surface area
area = 0.0
if worldSpace: area = surfaceFn.area(OpenMaya.MSpace.kWorld)
else: area = surfaceFn.area(OpenMaya.MSpace.kObject)
# Return result
return area
def snapToSurface(surface,obj,uValue=0.0,vValue=0.0,useClosestPoint=False,snapPivot=False):
'''
Snap an object (or transform pivot) to a specified point on a surface.
@param surface: Surface to snap to
@type surface: str
@param obj: Object to move to point on surface
@type obj: str
@param uValue: U Parameter value of the surface to snap to
@type uValue: float
@param vValue: V Parameter value of the surface to snap to
@type vValue: float
@param useClosestPoint: Use the closest point on the surface instead of the specified uv parameter
@type useClosestPoint: bool
@param snapPivot: Move only the objects pivot to the surface point
@type snapPivot: bool
'''
# Check surface
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!!')
# Check uValue/vValue
minu = mc.getAttr(surface+'.minValueU')
maxu = mc.getAttr(surface+'.maxValueU')
minv = mc.getAttr(surface+'.minValueV')
maxv = mc.getAttr(surface+'.maxValueV')
# Closest Point
if useClosestPoint:
pos = mc.xform(obj,q=True,ws=True,rp=True)
(uValue,vValue) = closestPoint(surface,pos)
# Verify surface parameter
if uValue < minu or uValue > maxu: raise Exception('U parameter '+str(uValue)+' is not within the U parameter range for '+surface+'!!')
if vValue < minv or vValue > maxv: raise Exception('V parameter '+str(vValue)+' is not within the V parameter range for '+surface+'!!')
# Get surface point position
pnt = mc.pointPosition(surface+'.uv['+str(uValue)+']['+str(vValue)+']')
# Snap to Curve
piv = mc.xform(obj,q=True,ws=True,rp=True)
if snapPivot: mc.xform(obj,piv=pnt,ws=True)
else: mc.move(pnt[0]-piv[0],pnt[1]-piv[1],pnt[2]-piv[2],obj,r=True,ws=True)
def orientToSurface( surface,
obj,
uValue = 0.0,
vValue = 0.0,
useClosestPoint = False,
tangentUAxis = 'x',
tangentVAxis = 'y',
alignTo = 'u' ):
'''
Orient object to a specified point on a surface.
@param surface: Surface to orient object to
@type surface: str
@param obj: Object to orient
@type obj: str
@param uValue: U Parameter value of the surface to orient to
@type uValue: float
@param vValue: V Parameter value of the surface to orient to
@type vValue: float
@param useClosestPoint: Use the closest point on the surface instead of the specified uv parameter
@type useClosestPoint: bool
@param tangentUAxis: Basis axis that will be derived from the U tangent of the surface point
@type tangentUAxis: str
@param tangentVAxis: Basis axis that will be derived from the V tangent of the surface point
@type tangentVAxis: str
@param alignTo: Surface direction ('u' or 'v') that the object's aim axis is aligned to
@type alignTo: str
'''
# Check surface
if not isSurface(surface): raise Exception('Object '+surface+' is not a valid surface!!')
# Check Obj
transform = ['transform','joint','ikHandle','effector']
if not transform.count(mc.objectType(obj)):
raise Exception('Object '+obj+' is not a valid transform!!')
# Check uValue/vValue
minu = mc.getAttr(surface+'.minValueU')
maxu = mc.getAttr(surface+'.maxValueU')
minv = mc.getAttr(surface+'.minValueV')
maxv = mc.getAttr(surface+'.maxValueV')
# Closest Point
if useClosestPoint:
pos = mc.xform(obj,q=True,ws=True,rp=True)
(uValue,vValue) = closestPoint(surface,pos)
# Verify surface parameter
if uValue < minu or uValue > maxu: raise Exception('U parameter '+str(uValue)+' is not within the U parameter range for '+surface+'!!')
if vValue < minv or vValue > maxv: raise Exception('V parameter '+str(vValue)+' is not within the V parameter range for '+surface+'!!')
# Check object
if not mc.objExists(obj): raise Exception('Object '+obj+' does not exist!!')
rotateOrder = mc.getAttr(obj+'.ro')
# Get tangents at surface point
tanU = mc.pointOnSurface(surface,u=uValue,v=vValue,ntu=True)
tanV = mc.pointOnSurface(surface,u=uValue,v=vValue,ntv=True)
# Build rotation matrix
aimVector = tanU
if alignTo == 'v': aimVector = tanV
upVector = tanV
if alignTo == 'v': upVector = tanU
aimAxis = tangentUAxis
if alignTo == 'v': aimAxis = tangentVAxis
upAxis = tangentVAxis
if alignTo == 'v': upAxis = tangentUAxis
mat = glTools.utils.matrix.buildRotation(aimVector,upVector,aimAxis,upAxis)
rot = glTools.utils.matrix.getRotation(mat,rotateOrder)
# Orient object to surface
mc.rotate(rot[0],rot[1],rot[2],obj,a=True,ws=True)
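# Usage sketch (object and surface names assumed for illustration): orient a
# locator so its x axis follows the surface U tangent at the closest point.
#
#   orientToSurface('bodyPanel_srf', 'panel_loc', useClosestPoint=True,
#                   tangentUAxis='x', tangentVAxis='y', alignTo='u')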
def rebuild(surface,spansU=0,spansV=0,fullRebuildU=False,fullRebuildV=False,rebuildUfirst=True,replaceOrig=False):
'''
Do brute force surface rebuild for even parameterization
@param surface: Nurbs surface to rebuild
@type surface: str
@param spansU: Number of spans along U. If 0, keep original value.
@type spansU: int
@param spansV: Number of spans along V. If 0, keep original value.
@type spansV: int
@param replaceOrig: Replace original surface, or create new rebuilt surface.
@type replaceOrig: bool
'''
# ==========
# - Checks -
# ==========
# Check surface
if not isSurface(surface):
raise Exception('Object "'+surface+'" is not a valid surface!')
# Check spans
if not spansU: spansU = mc.getAttr(surface+'.spansU')
if not spansV: spansV = mc.getAttr(surface+'.spansV')
# =============
# - Rebuild U -
# =============
# Get V range
if rebuildUfirst:
dir = 'u'
opp = 'v'
spans = spansU
min = mc.getAttr(surface+'.minValueV')
max = mc.getAttr(surface+'.maxValueV')
else:
dir = 'v'
opp = 'u'
spans = spansV
min = mc.getAttr(surface+'.minValueU')
max = mc.getAttr(surface+'.maxValueU')
val = min + (max - min) * 0.5
# Calculate surface length
iso_crv = mc.duplicateCurve(surface+'.'+opp+'['+str(val)+']',ch=0,rn=0,local=0)[0]
iso_len = mc.arclen(iso_crv)
iso_inc = iso_len / spans
# Get spaced isoparm list
curveFn = glTools.utils.curve.getCurveFn(iso_crv)
iso_list = [surface+'.'+dir+'['+str(curveFn.findParamFromLength(iso_inc*i))+']' for i in range(spans+1)]
mc.delete(iso_crv)
# Check full rebuild
if fullRebuildV:
# Extract isoparm curves
iso_crv_list = [mc.duplicateCurve(iso,ch=False,rn=False,local=False)[0] for iso in iso_list]
# Rebuild isoparm curves
for iso_crv in iso_crv_list:
mc.rebuildCurve(iso_crv,ch=False,rpo=True,rt=0,end=1,kr=0,kcp=0,kep=1,kt=1,s=0,d=3,tol=0)
# Loft final surface
int_surface = mc.loft(iso_crv_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# Delete intermediate curves
mc.delete(iso_crv_list)
else:
# Loft intermediate surface
int_surface = mc.loft(iso_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# =============
# - Rebuild V -
# =============
# Get V range (intermediate surface)
if rebuildUfirst:
dir = 'u'
opp = 'v'
spans = spansV
min = mc.getAttr(int_surface+'.minValueU')
max = mc.getAttr(int_surface+'.maxValueU')
else:
dir = 'v'
opp = 'u'
spans = spansU
min = mc.getAttr(int_surface+'.minValueV')
max = mc.getAttr(int_surface+'.maxValueV')
val = min + (max - min) * 0.5
# Calculate surface length (intermediate surface)
iso_crv = mc.duplicateCurve(int_surface+'.'+opp+'['+str(val)+']',ch=0,rn=0,local=0)[0]
iso_len = mc.arclen(iso_crv)
iso_inc = iso_len / spans
# Get spaced isoparm list
curveFn = glTools.utils.curve.getCurveFn(iso_crv)
iso_list = [int_surface+'.'+dir+'['+str(curveFn.findParamFromLength(iso_inc*i))+']' for i in range(spans+1)]
mc.delete(iso_crv)
# Check full rebuild
if fullRebuildU:
# Extract isoparm curves
iso_crv_list = [mc.duplicateCurve(iso,ch=False,rn=False,local=False)[0] for iso in iso_list]
# Rebuild isoparm curves
for iso_crv in iso_crv_list:
mc.rebuildCurve(iso_crv,ch=False,rpo=True,rt=0,end=1,kr=0,kcp=0,kep=1,kt=1,s=0,d=3,tol=0)
# Loft final surface
rebuild_surface = mc.loft(iso_crv_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# Delete intermediate curves
mc.delete(iso_crv_list)
else:
# Loft final surface
rebuild_surface = mc.loft(iso_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# Rename rebuilt surface
rebuild_surface = mc.rename(rebuild_surface,surface+'_rebuild')
rebuild_surfaceShape = mc.listRelatives(surface,s=True,ni=True,pa=True)[0]
mc.delete(int_surface)
# Re-parameterize 0-1
mc.rebuildSurface(rebuild_surface,ch=False,rpo=True,dir=2,rt=0,end=1,kr=0,kcp=1,kc=1,tol=0,fr=0)
# Initialize return value
outputShape = rebuild_surfaceShape
# ====================
# - Replace Original -
# ====================
if replaceOrig:
"""
# Get original shape
shapes = glTools.utils.shape.getShapes(surface,nonIntermediates=True,intermediates=False)
if not shapes:
# Find Intermediate Shapes
shapes = glTools.utils.shape.listIntermediates(surface)
# Check shapes
if not shapes:
raise Exception('Unable to determine shape for surface "'+surface+'"!')
# Check connections
if mc.listConnections(shapes[0]+'.create',s=True,d=False):
# Find Intermediate Shapes
shapes = glTools.utils.shape.findInputShape(shapes[0])
"""
# Check history
shapes = mc.listRelatives(surface,s=True,ni=True,pa=True)
if not shapes: raise Exception('Unable to determine shape for surface "'+surface+'"!')
shape = shapes[0]
shapeHist = mc.listHistory(shape)
if shapeHist.count(shape): shapeHist.remove(shape)
if shapeHist: print('Surface "'+surface+'" contains construction history, creating new shape!')
# Override shape info and delete intermediate
mc.connectAttr(rebuild_surfaceShape+'.local',shape+'.create',f=True)
outputShape = shape
# =================
# - Return Result -
# =================
return outputShape
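# Usage sketch (illustrative only): 'loftedSurface1' is a hypothetical surface name,
# and this assumes the function above is named rebuild(), as suggested by rebuild_old() below.
# rebuild('loftedSurface1',spansU=8,spansV=8,replaceOrig=False)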
def rebuild_old(surface,spansU=0,spansV=0,fullRebuildU=False,fullRebuildV=False,replaceOrig=False):
'''
Do brute force surface rebuild for even parameterization
@param surface: Nurbs surface to rebuild
@type surface: str
@param spansU: Number of spans along U. If 0, keep original value.
@type spansU: int
@param spansV: Number of spans along V. If 0, keep original value.
@type spansV: int
@param replaceOrig: Replace original surface, or create new rebuilt surface.
@type replaceOrig: bool
'''
# Check surface
if not isSurface(surface):
raise Exception('Object "'+surface+'" is not a valid surface!')
# Check spans
if not spansU: spansU = mc.getAttr(surface+'.spansU')
if not spansV: spansV = mc.getAttr(surface+'.spansV')
# -------------
# - Rebuild V -
# Get V range
minu = mc.getAttr(surface+'.minValueU')
maxu = mc.getAttr(surface+'.maxValueU')
u = minu + (maxu - minu) * 0.5
# Extract isoparm curve
iso_crv = mc.duplicateCurve(surface+'.u['+str(u)+']',ch=0,rn=0,local=0)[0]
iso_len = mc.arclen(iso_crv)
iso_inc = iso_len / spansV
curveFn = glTools.utils.curve.getCurveFn(iso_crv)
iso_list = [surface+'.v['+str(curveFn.findParamFromLength(iso_inc*i))+']' for i in range(spansV+1)]
mc.delete(iso_crv)
# Check full rebuild
if fullRebuildU:
# Extract isoparm curves
iso_crv_list = [mc.duplicateCurve(iso,ch=False,rn=False,local=False)[0] for iso in iso_list]
# Rebuild isoparm curves
for iso_crv in iso_crv_list:
mc.rebuildCurve(iso_crv,ch=False,rpo=True,rt=0,end=1,kr=0,kcp=0,kep=1,kt=1,s=0,d=3,tol=0)
# Loft final surface
int_surface = mc.loft(iso_crv_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# Delete intermediate curves
mc.delete(iso_crv_list)
else:
# Loft intermediate surface
int_surface = mc.loft(iso_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# -------------
# - Rebuild U -
# Get V range (intermediate surface)
minv = mc.getAttr(int_surface+'.minValueV')
maxv = mc.getAttr(int_surface+'.maxValueV')
v = minv + (maxv - minv) * 0.5
# Extract isoparm curve (intermediate surface)
iso_crv = mc.duplicateCurve(int_surface+'.v['+str(v)+']',ch=0,rn=0,local=0)[0]
iso_len = mc.arclen(iso_crv)
iso_inc = iso_len / spansU
curveFn = glTools.utils.curve.getCurveFn(iso_crv)
iso_list = [int_surface+'.u['+str(curveFn.findParamFromLength(iso_inc*i))+']' for i in range(spansU+1)]
mc.delete(iso_crv)
# Check full rebuild
if fullRebuildV:
# Extract isoparm curves
iso_crv_list = [mc.duplicateCurve(iso,ch=False,rn=False,local=False)[0] for iso in iso_list]
# Rebuild isoparm curves
for iso_crv in iso_crv_list:
mc.rebuildCurve(iso_crv,ch=False,rpo=True,rt=0,end=1,kr=0,kcp=0,kep=1,kt=1,s=0,d=3,tol=0)
# Loft final surface
rebuild_surface = mc.loft(iso_crv_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
# Delete intermediate curves
mc.delete(iso_crv_list)
else:
# Loft final surface
rebuild_surface = mc.loft(iso_list,ch=0,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)[0]
rebuild_surface = mc.rename(rebuild_surface,surface+'_rebuild')
rebuild_surfaceShape = mc.listRelatives(rebuild_surface,s=True,ni=True,pa=True)[0]
mc.delete(int_surface)
# Initialize return value
outputShape = rebuild_surfaceShape
# --------------------
# - Replace Original -
if replaceOrig:
"""
# Get original shape
shapes = glTools.utils.shape.getShapes(surface,nonIntermediates=True,intermediates=False)
if not shapes:
# Find Intermediate Shapes
shapes = glTools.utils.shape.listIntermediates(surface)
# Check shapes
if not shapes:
raise Exception('Unable to determine shape for surface "'+surface+'"!')
# Check connections
if mc.listConnections(shapes[0]+'.create',s=True,d=False):
# Find Intermediate Shapes
shapes = glTools.utils.shape.findInputShape(shapes[0])
"""
# Check history
shapes = mc.listRelatives(surface,s=True,ni=True,pa=True)
if not shapes: raise Exception('Unable to determine shape for surface "'+surface+'"!')
shape = shapes[0]
shapeHist = mc.listHistory(shape)
if shapeHist.count(shape): shapeHist.remove(shape)
if shapeHist: print('Surface "'+surface+'" contains construction history, creating new shape!')
# Override shape info and delete intermediate
mc.connectAttr(rebuild_surfaceShape+'.local',shape+'.create',f=True)
outputShape = shape
# Return result
return outputShape
def intersect(surface,source,direction):
'''
Return the intersection point on a specified nurbs surface given a source point and direction
@param surface: Nurbs surface to perform intersection on
@type surface: str
@param source: Source point for the intersection ray
@type source: list or tuple or str
@param direction: Direction of the intersection ray
@type direction: list or tuple
'''
# Get surfaceFn
surfaceFn = getSurfaceFn(surface)
# Get source point
source = glTools.utils.base.getMPoint(source)
# Get direction vector
direction = OpenMaya.MVector(direction[0],direction[1],direction[2])
# Calculate intersection
hitPt = OpenMaya.MPoint()
hit = surfaceFn.intersect(source,direction,None,None,hitPt,0.0001,OpenMaya.MSpace.kWorld,False,None,None)
if not hit:
print('No intersection found!')
hitPt = OpenMaya.MPoint.origin
# Return intersection hit point
return [hitPt[0],hitPt[1],hitPt[2]]
def projectToSurface(surface,targetSurface,direction='u',keepOriginal=False,prefix=''):
'''
Project the edit points of the specified nurbs surface to another nurbs or polygon object
@param surface: Surface to project
@type surface: str
@param targetSurface: Surface to project onto
@type targetSurface: str
@param direction: Surface direction to extract isoparm curves from
@type direction: str
@param keepOriginal: Create new surface or replace original
@type keepOriginal: bool
@param prefix: Name prefix for all created nodes
@type prefix: str
'''
# Check surface
if not mc.objExists(surface):
raise Exception('Surface "'+surface+'" does not exist!!')
if not isSurface(surface):
raise Exception('Object "'+surface+'" is not a valid nurbs surface!!')
# Check target surface
if not mc.objExists(targetSurface):
raise Exception('Target surface "'+targetSurface+'" does not exist!!')
# Check prefix
if not prefix: prefix = glTools.utils.stringUtils.stripSuffix(surface)
# Check direction
direction = direction.upper()
if (direction != 'U') and (direction != 'V'):
raise Exception('Invalid surface direction specified! Must specify either "u" or "v"!!')
# Get surface information
spans = mc.getAttr(surface+'.spans'+direction)
minVal = mc.getAttr(surface+'.minValue'+direction)
maxVal = mc.getAttr(surface+'.maxValue'+direction)
# Create main surface group
mainGrp = mc.createNode('transform',n=prefix+'_grp')
# Extract curves
curveList = []
curveGrpList = []
curveLocList = []
geomConstraintList = []
spanInc = (maxVal - minVal)/spans
for i in range(spans+1):
# Curve prefix
strInd = glTools.utils.stringUtils.stringIndex(i,2)
crvPrefix = prefix+'_crv'+strInd
# Create curve group
curveGrp = crvPrefix+'_grp'
curveGrp = mc.createNode('transform',n=curveGrp)
curveGrp = mc.parent(curveGrp,mainGrp)[0]
curveGrpList.append(curveGrp)
# Get surface curve
srfCurveName = crvPrefix+'_crv'
srfCurve = mc.duplicateCurve(surface+'.'+direction.lower()+'['+str(i*spanInc)+']',ch=0,rn=0,local=0,n=srfCurveName)
srfCurve = mc.parent(srfCurve[0],curveGrp)[0]
curveList.append(srfCurve)
# Generate curve locators
curveLocatorList = glTools.utils.curve.locatorEpCurve(srfCurve,locatorScale=0.05,prefix=crvPrefix)
curveLocatorList = mc.parent(curveLocatorList,curveGrp)
curveLocList.append(curveLocatorList)
# Create geometry constraints
for loc in curveLocatorList:
geomConstraint = crvPrefix+'_geometryConstraint'
geomConstraint = mc.geometryConstraint(targetSurface,loc,n=geomConstraint)
geomConstraintList.append(geomConstraint[0])
# Center group pivot
mc.xform(curveGrp,cp=True)
# Delete original surface
surfaceName = prefix+'_surface'
if not keepOriginal:
surfaceName = surface
mc.delete(surface)
# Loft new surface
surfaceLoft = mc.loft(curveList,ch=1,u=1,c=0,ar=1,d=3,ss=1,rn=0,po=0,rsn=True)
surface = mc.rename(surfaceLoft[0],surfaceName)
surface = mc.parent(surface,mainGrp)[0]
mc.reorder(surface,f=True)
loft = mc.rename(surfaceLoft[1],prefix+'_loft')
# Return result
return[surface,loft,curveList,curveGrpList,curveLocList,geomConstraintList]
def rebuildFromExistingIsoparms(surface,direction='u',degree=3,close=False,keepHistory=False):
'''
Build a new nurbs surface from an existing surface's isoparms
@param surface: Surface to build from
@type surface: str
@param direction: Surface direction to build from
@type direction: str
@param degree: Degree to build new surface to
@type degree: int
@param close: Close lofted surface
@type close: bool
@param keepHistory: Keep loft surface history
@type keepHistory: bool
'''
# Check surface
if not mc.objExists(surface):
raise Exception('Surface "'+surface+'" does not exist!!')
if not isSurface(surface):
raise Exception('Object "'+surface+'" is not a valid nurbs surface!!')
# Check direction
direction = direction.lower()
if not direction == 'u' and not direction == 'v':
raise Exception('Invalid surface direction! Accepted values are "u" and "v"!')
# Get surface details
surfFn = getSurfaceFn(surface)
spans = mc.getAttr(surface+'.spans'+direction.upper())
degree = mc.getAttr(surface+'.degree'+direction.upper())
form = mc.getAttr(surface+'.form'+direction.upper())
knots = OpenMaya.MDoubleArray()
if direction == 'u': surfFn.getKnotsInU(knots)
if direction == 'v': surfFn.getKnotsInV(knots)
# Build iso list for surface rebuild
if degree > 1: knots = knots[(degree-1):-(degree-1)]
isoList = [surface+'.'+direction+'['+str(i)+']' for i in knots]
if not close and form:
#isoList.append(isoList[0])
isoList[-1] = surface+'.'+direction+'['+str(knots[-1]-0.0001)+']'
# Loft new rebuild surface
rebuild = mc.loft(isoList,ch=keepHistory,u=True,c=close,ar=False,d=degree,ss=1,rn=False,po=False,rsn=(direction=='v'))
rebuild = mc.rename(rebuild[0],surface+'_rebuild')
# Return result
return rebuild
def rebuildFromIsoparms(surface,spansU=0,spansV=0,degree=3,keepHistory=False):
'''
Build a new nurbs surface from an existing surface's isoparms
@param surface: Surface to build from
@type surface: str
@param spansU: Number of spans along U. If 0, use the existing knot count in U.
@type spansU: int
@param spansV: Number of spans along V. If 0, use the existing knot count in V.
@type spansV: int
@param degree: Degree to build new surface to
@type degree: int
@param keepHistory: Keep loft surface history
@type keepHistory: bool
'''
# Check Surface
if not mc.objExists(surface):
raise Exception('Surface "'+surface+'" does not exist!!')
if not isSurface(surface):
raise Exception('Object "'+surface+'" is not a valid nurbs surface!!')
# Initialize function pointers
uMinPtr = OpenMaya.MScriptUtil().asDoublePtr()
uMaxPtr = OpenMaya.MScriptUtil().asDoublePtr()
vMinPtr = OpenMaya.MScriptUtil().asDoublePtr()
vMaxPtr = OpenMaya.MScriptUtil().asDoublePtr()
# Get surface details
surfFn = getSurfaceFn(surface)
surfFn.getKnotDomain(uMinPtr,uMaxPtr,vMinPtr,vMaxPtr)
uMin = OpenMaya.MScriptUtil(uMinPtr).asDouble()
uMax = OpenMaya.MScriptUtil(uMaxPtr).asDouble()
vMin = OpenMaya.MScriptUtil(vMinPtr).asDouble()
vMax = OpenMaya.MScriptUtil(vMaxPtr).asDouble()
uDif = uMax - uMin
vDif = vMax - vMin
# Get surface form
closeU = bool(mc.getAttr(surface+'.formU'))
closeV = bool(mc.getAttr(surface+'.formV'))
# Check spans
if not spansU: spansU = surfFn.numKnotsInU()
if not spansV: spansV = surfFn.numKnotsInV()
# Get new knot values
uList = []
vList = []
uInc = uDif/(spansU-int(not closeU))
vInc = vDif/(spansV-int(not closeV))
for u in range(spansU): uList.append(uMin+(uInc*u))
for v in range(spansV): vList.append(vMin+(vInc*v))
# Rebuild in U
uLoft = mc.loft([surface+'.u['+str(i)+']' for i in uList],close=closeU,degree=degree)
uSurface = uLoft[0]
# Rebuild in V
vLoft = mc.loft([uSurface+'.v['+str(i)+']' for i in vList],close=closeV,degree=degree)
rebuildSurface = vLoft[0]
# Return result
return rebuildSurface
| 34.198582 | 136 | 0.721658 | 4,113 | 28,932 | 5.045222 | 0.113542 | 0.011566 | 0.020047 | 0.018216 | 0.56725 | 0.504409 | 0.486531 | 0.47482 | 0.462387 | 0.440509 | 0 | 0.011873 | 0.144097 | 28,932 | 845 | 137 | 34.239053 | 0.826112 | 0.110673 | 0 | 0.392308 | 0 | 0 | 0.098271 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.025641 | null | null | 0.007692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b64840e86cc87648b3f6e504a007b4a94f6552a2 | 308 | py | Python | page_edits/migrations/0015_delete_howweworktext.py | webspace95/studyhelp | 70e0978b4a97cdb45d1574924e7997932bb410fb | [
"MIT"
] | null | null | null | page_edits/migrations/0015_delete_howweworktext.py | webspace95/studyhelp | 70e0978b4a97cdb45d1574924e7997932bb410fb | [
"MIT"
] | null | null | null | page_edits/migrations/0015_delete_howweworktext.py | webspace95/studyhelp | 70e0978b4a97cdb45d1574924e7997932bb410fb | [
"MIT"
] | null | null | null | # Generated by Django 3.0.4 on 2022-03-02 19:32
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('page_edits', '0014_delete_whatsappnumber'),
]
operations = [
migrations.DeleteModel(
name='HowWeWorkText',
),
]
| 18.117647 | 53 | 0.62013 | 32 | 308 | 5.875 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084821 | 0.272727 | 308 | 16 | 54 | 19.25 | 0.754464 | 0.146104 | 0 | 0 | 1 | 0 | 0.187739 | 0.099617 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b655a40ad3e46599e2e209081b5a314b30cd5e8c | 5,597 | py | Python | rdr_service/model/questionnaire.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 39 | 2017-10-13T19:16:27.000Z | 2021-09-24T16:58:21.000Z | rdr_service/model/questionnaire.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 312 | 2017-09-08T15:42:13.000Z | 2022-03-23T18:21:40.000Z | rdr_service/model/questionnaire.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 19 | 2017-09-15T13:58:00.000Z | 2022-02-07T18:33:20.000Z | from sqlalchemy import (
Boolean,
Column,
ForeignKey,
ForeignKeyConstraint,
Integer,
String,
UniqueConstraint,
)
from sqlalchemy import BLOB # pylint: disable=unused-import
from sqlalchemy.orm import relationship
from typing import List
from rdr_service.model.base import Base
from rdr_service.model.code import Code
from rdr_service.model.utils import Enum, UTCDateTime
from rdr_service.participant_enums import QuestionnaireDefinitionStatus
from rdr_service.model.field_types import BlobUTF8
class QuestionnaireBase(object):
"""Mixin containing columns for Questionnaire and QuestionnaireHistory"""
questionnaireId = Column("questionnaire_id", Integer, primary_key=True)
"""RDR identifier for the questionnaire"""
# Incrementing version, starts at 1 and is incremented on each update.
version = Column("version", Integer, nullable=False)
"""RDR version of the questionnaire"""
semanticVersion = Column('semantic_version', String(100))
"""PTSC's version of the questionnaire (does not necessarily match RDR version)"""
semanticDesc = Column('semantic_desc', String(500))
irbMapping = Column('irb_mapping', String(500))
created = Column("created", UTCDateTime, nullable=False)
"""The date and time the questionnaire was created"""
lastModified = Column("last_modified", UTCDateTime, nullable=False)
"""The date and time the questionnaire was last modified"""
# The JSON representation of the questionnaire provided by the client.
# Concepts and questions can be be parsed out of this for use in querying.
resource = Column("resource", BlobUTF8, nullable=False)
status = Column("status", Enum(QuestionnaireDefinitionStatus), default=QuestionnaireDefinitionStatus.VALID)
externalId = Column('external_id', String(100))
def asdict_with_children(self):
return self.asdict(follow={"concepts": {}, "questions": {}})
class Questionnaire(QuestionnaireBase, Base):
"""A questionnaire containing questions to pose to participants."""
__tablename__ = "questionnaire"
concepts = relationship(
"QuestionnaireConcept",
cascade="expunge",
cascade_backrefs=False,
primaryjoin="Questionnaire.questionnaireId==" + "foreign(QuestionnaireConcept.questionnaireId)",
)
questions = relationship(
"QuestionnaireQuestion",
cascade="expunge",
cascade_backrefs=False,
primaryjoin="and_(Questionnaire.questionnaireId==" + "foreign(QuestionnaireQuestion.questionnaireId)," + \
"Questionnaire.version==" + "foreign(QuestionnaireQuestion.questionnaireVersion))",
)
class QuestionnaireHistory(QuestionnaireBase, Base):
__tablename__ = "questionnaire_history"
version = Column("version", Integer, primary_key=True)
concepts: List['QuestionnaireConcept'] = relationship("QuestionnaireConcept", cascade="all, delete-orphan")
questions: List['QuestionnaireQuestion'] = relationship("QuestionnaireQuestion", cascade="all, delete-orphan")
class QuestionnaireConcept(Base):
"""Concepts for the questionnaire as a whole.
These should be copied whenever a new version of
a questionnaire is created.
"""
__tablename__ = "questionnaire_concept"
questionnaireConceptId = Column("questionnaire_concept_id", Integer, primary_key=True)
"""An identifier to link the questionnaire to the CONCEPT table from OMOP"""
questionnaireId = Column("questionnaire_id", Integer, nullable=False)
"""Questionnaire identifier for the concept"""
questionnaireVersion = Column("questionnaire_version", Integer, nullable=False)
"""Questionnaire's RDR version for the concept"""
codeId = Column("code_id", Integer, ForeignKey("code.code_id"), nullable=False)
"""The corresponding concept for this item"""
__table_args__ = (
ForeignKeyConstraint(
["questionnaire_id", "questionnaire_version"],
["questionnaire_history.questionnaire_id", "questionnaire_history.version"],
),
UniqueConstraint("questionnaire_id", "questionnaire_version", "code_id"),
)
class QuestionnaireQuestion(Base):
"""A question in a questionnaire.
These should be copied whenever a new version of a
questionnaire is created.
Each question has a concept system and code defining what the question is
about. Questions on
different questionnaires can share the same concept code, but concept code is
unique within a
given questionnaire.
"""
__tablename__ = "questionnaire_question"
questionnaireQuestionId = Column("questionnaire_question_id", Integer, primary_key=True)
"""RDR identifier for the question"""
questionnaireId = Column("questionnaire_id", Integer)
"""RDR identifier for the containing questionnaire"""
questionnaireVersion = Column("questionnaire_version", Integer)
"""RDR version for the containing questionnaire"""
linkId = Column("link_id", String(40))
"""The unique ID for the item in the questionnaire"""
codeId = Column("code_id", Integer, ForeignKey("code.code_id"), nullable=False)
"""The corresponding concept for this question"""
repeats = Column("repeats", Boolean, nullable=False)
"""Whether the item may repeat"""
__table_args__ = (
ForeignKeyConstraint(
["questionnaire_id", "questionnaire_version"],
["questionnaire_history.questionnaire_id", "questionnaire_history.version"],
),
UniqueConstraint("questionnaire_id", "questionnaire_version", "link_id"),
)
code = relationship(Code)
| 41.768657 | 114 | 0.729319 | 575 | 5,597 | 6.96 | 0.290435 | 0.033733 | 0.041979 | 0.018991 | 0.298101 | 0.235882 | 0.213393 | 0.213393 | 0.213393 | 0.192404 | 0 | 0.003665 | 0.171342 | 5,597 | 133 | 115 | 42.082707 | 0.859207 | 0.144899 | 0 | 0.202532 | 0 | 0 | 0.273597 | 0.168638 | 0.025316 | 0 | 0 | 0 | 0 | 1 | 0.012658 | false | 0 | 0.113924 | 0.012658 | 0.607595 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b656fab2ba1aca0c087abbf0334456183b4b1cfe | 463 | py | Python | src/preprocessing.py | ppisarski/DaaS-docker | 390b628ff87e6a67b3aa1c1fe91786203bb1168c | [
"MIT"
] | null | null | null | src/preprocessing.py | ppisarski/DaaS-docker | 390b628ff87e6a67b3aa1c1fe91786203bb1168c | [
"MIT"
] | null | null | null | src/preprocessing.py | ppisarski/DaaS-docker | 390b628ff87e6a67b3aa1c1fe91786203bb1168c | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
from src.features import *
def get_data():
train_data = pd.read_csv('../data/train.csv')
test_data = pd.read_csv('../data/test.csv')
return train_data, test_data
def main():
train_data, test_data = get_data()
train_data = impute_all_features(train_data)
test_data = impute_all_features(test_data)
print(train_data.head())
print(test_data.head())
if __name__ == '__main__':
main()
| 19.291667 | 49 | 0.688985 | 70 | 463 | 4.157143 | 0.342857 | 0.185567 | 0.134021 | 0.175258 | 0.116838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185745 | 463 | 23 | 50 | 20.130435 | 0.771883 | 0 | 0 | 0 | 0 | 0 | 0.088553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.2 | 0 | 0.4 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b658a27b5fb49841c9263f0bf1698ebf168e9254 | 265 | py | Python | Python/Week2/regularExpression.py | CoSandu/PythonCourses | 0cec199f8f9f335e2c8daf5733eb6685c978fff8 | [
"Apache-2.0"
] | null | null | null | Python/Week2/regularExpression.py | CoSandu/PythonCourses | 0cec199f8f9f335e2c8daf5733eb6685c978fff8 | [
"Apache-2.0"
] | null | null | null | Python/Week2/regularExpression.py | CoSandu/PythonCourses | 0cec199f8f9f335e2c8daf5733eb6685c978fff8 | [
"Apache-2.0"
] | null | null | null | import re
f = open(r'C:\Constantin\Corsi_Online\Python\regx.txt', 'r')
interi = []
tot = 0
for line in f:
x = re.findall('[0-9]+', line)
if x:
interi.append(x)
for val in interi:
for v in val:
tot = int(v) + tot
print tot
f.close()
| 13.947368 | 60 | 0.566038 | 47 | 265 | 3.170213 | 0.595745 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015544 | 0.271698 | 265 | 18 | 61 | 14.722222 | 0.756477 | 0 | 0 | 0 | 0 | 0 | 0.184906 | 0.158491 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.076923 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b65f7582cacbcbf99d936c43a7adb0e8f588732f | 2,827 | py | Python | src/device/handler.py | OSLL/apagescan | c2d993c0260697aeb2ed70d2f3d2a805b4f8a5f4 | [
"MIT"
] | 6 | 2020-08-03T06:17:50.000Z | 2021-11-08T20:21:29.000Z | src/device/handler.py | OSLL/apagescan | c2d993c0260697aeb2ed70d2f3d2a805b4f8a5f4 | [
"MIT"
] | 17 | 2020-03-04T14:13:10.000Z | 2020-03-23T19:12:25.000Z | src/device/handler.py | OSLL/apagescan | c2d993c0260697aeb2ed70d2f3d2a805b4f8a5f4 | [
"MIT"
] | null | null | null | import re
from itertools import chain
from src.utilities import exec_command, list_difference
class DeviceHandler:
"""Class to handle Android devices, connected to PC
:ivar __listeners: list of listeners - objects (implementing Listener interface), that are tracking devices' changes
:ivar __serial_numbers: list of all connected devices' serial numbers, each serial number is generated by adb as id
:ivar __current_device_id: serial number of selected connected device
"""
def __init__(self):
self.__listeners = []
self.__serial_numbers = []
self.__current_device_id = None
def is_device_selected(self):
"""Checks if current device is still connected to PC
:return: True if device is connected, else False
:rtype: Bool
"""
return self.__current_device_id in self.__serial_numbers
def has_devices(self):
""" Checks if there are connected to PC devices
:return: True if there are connected devices, else False
:rtype: Bool
"""
return len(self.__serial_numbers) != 0
@property
def current_device(self):
"""The unique serial number of selected device
:getter: returns serial number of current selected device
:type: String
"""
return self.__current_device_id
def switch(self, device_number):
"""Switches working device to new device
@param device_number: serial number of the new working device
:return: None
"""
self.__current_device_id = device_number
if not self.is_device_selected():
self.__current_device_id = None
@property
def devices_list(self):
"""List of all connected devices' serial numbers
:getter: returns list of all connected devices
:type: List
"""
return self.__serial_numbers
def add_listener(self, listener):
"""Adds new listener
:param listener: listener to be added
:return: None
"""
self.__listeners.append(listener)
def update(self):
"""Updates list of all connected devices and notifies listeners
:return: None
"""
res = exec_command('adb devices')
prev_state = self.__serial_numbers
output = [x for x in res.decode('UTF-8').split('\n') if x]
number_pattern = re.compile(r'(.*)\t')
serial_numbers = list(map(lambda x: number_pattern.findall(x), output[1:]))
# flatten list
self.__serial_numbers = list(chain.from_iterable(serial_numbers))
if len(self.__serial_numbers) == 0:
self.__current_device_id = None
if list_difference(self.__serial_numbers, prev_state):
for listener in self.__listeners:
listener.react()
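# Usage sketch (illustrative only; MyListener is hypothetical and just needs a react() method):
#   handler = DeviceHandler()
#   handler.add_listener(MyListener())
#   handler.update()  # polls `adb devices` and notifies listeners when the list changes
#   if handler.has_devices():
#       handler.switch(handler.devices_list[0])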
| 31.065934 | 120 | 0.645207 | 345 | 2,827 | 5.028986 | 0.310145 | 0.097406 | 0.078386 | 0.065706 | 0.189625 | 0.043804 | 0.043804 | 0 | 0 | 0 | 0 | 0.001952 | 0.275203 | 2,827 | 90 | 121 | 31.411111 | 0.844802 | 0.378493 | 0 | 0.138889 | 0 | 0 | 0.015676 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.083333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b66357baffc9f27b646286c589544f0c9a94af93 | 2,142 | py | Python | ShapeLearnerAPI/_server.py | DEKHTIARJonathan/ShapeLearnerAPI | 6b8824da28cc5f481b550c8c7912c86097e20536 | [
"MIT"
] | null | null | null | ShapeLearnerAPI/_server.py | DEKHTIARJonathan/ShapeLearnerAPI | 6b8824da28cc5f481b550c8c7912c86097e20536 | [
"MIT"
] | null | null | null | ShapeLearnerAPI/_server.py | DEKHTIARJonathan/ShapeLearnerAPI | 6b8824da28cc5f481b550c8c7912c86097e20536 | [
"MIT"
] | null | null | null | ################################# Loading config File ################################
import sys
sys.path.append('../Config/')
from loadConf import loadConf
################################# Import Libraries ################################
import os.path
import urllib2
import time
from sqlalchemy import *
from sqlalchemy.sql import select, column
from bottle import route, run, response, static_file, request, error
from json import dumps
#################################### WebService Route / #####################################
@route('/static/<filename:path>')
def getStaticFile(filename):
extension = str.lower(os.path.splitext(filename)[1][1:])
if extension == 'jpeg' or extension == 'jpg':
return static_file(filename, root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\static', mimetype='image/jpg')
elif extension == 'png':
return static_file(filename, root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\static', mimetype='image/png')
elif extension == 'css':
return static_file(filename, root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\static', mimetype='text/css')
elif extension == 'js':
return static_file(filename, root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\static', mimetype='text/javascript')
@route('/test')
def homepage():
for i in range (1,5):
print "processing ..."
time.sleep(1)
return "Process Finished"
@route('/')
def homepage():
return static_file("index.html", root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\html')
@route('<request:path>')
def reroute(request):
return urllib2.urlopen('http://127.0.0.1:8888'+str(request))
@error(404)
def error404(error):
return static_file("404.html", root='C:\\Users\\Administrator\\Desktop\\ShapeLearnerPackage\\ShapeLearnerAPI\\html')
@error(500)
def error500(error):
return error
################################# Server Initialization #####################################
config = loadConf()
run(server='paste', host='0.0.0.0', port=80, debug=True, reloader=True)
| 36.305085 | 147 | 0.643324 | 232 | 2,142 | 5.909483 | 0.37931 | 0.051058 | 0.070022 | 0.100656 | 0.415755 | 0.415755 | 0.415755 | 0.415755 | 0.415755 | 0.310722 | 0 | 0.019467 | 0.088702 | 2,142 | 58 | 148 | 36.931034 | 0.682889 | 0.037348 | 0 | 0.04878 | 0 | 0 | 0.369334 | 0.275881 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.219512 | null | null | 0.02439 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b66c5f4a049b6ad77a78e2d4cae2ce8fbcea9dea | 2,346 | py | Python | simple_scheduler/jobs/curl_job.py | In2ItChicago/ndscheduler | f9cc3eadc0fabc0cb9c4250ae65c4c11b81125c2 | [
"BSD-2-Clause"
] | null | null | null | simple_scheduler/jobs/curl_job.py | In2ItChicago/ndscheduler | f9cc3eadc0fabc0cb9c4250ae65c4c11b81125c2 | [
"BSD-2-Clause"
] | 20 | 2019-01-27T18:40:26.000Z | 2019-10-05T21:50:06.000Z | simple_scheduler/jobs/curl_job.py | In2ItChicago/ndscheduler | f9cc3eadc0fabc0cb9c4250ae65c4c11b81125c2 | [
"BSD-2-Clause"
] | null | null | null | """A job to send a HTTP (GET or DELETE) periodically."""
import logging
import requests
from ndscheduler.corescheduler import job
logger = logging.getLogger(__name__)
class CurlJob(job.JobBase):
TIMEOUT = 30
@classmethod
def meta_info(cls):
return {
'job_class_string': '%s.%s' % (cls.__module__, cls.__name__),
'notes': 'This sends a HTTP request to a particular URL',
'arguments': [
# url
{'type': 'string', 'description': 'What URL you want to make a GET call?'},
# Request Type
{'type': 'string', 'description': 'What request type do you want? '
'(currently supported: GET/POST/DELETE)'},
{'type': 'string', 'description': 'Username'},
{'type': 'string', 'description': 'Password'},
{'type': 'string', 'description': 'Login endpoint'}
],
'example_arguments': ('["http://localhost:8888/api/v1/jobs", "GET"]'
'["http://localhost:8888/api/v1/jobs/ba12e", "DELETE"]')
}
def run(self, url, request_type, username, password, login_endpoint, data={}):
print('Calling %s on url: %s' % (request_type, url))
session = requests.Session()
authorization = None
if username is not None and password is not None:
token = session.post(login_endpoint,
json={'email': username, 'password': password}).content.decode()
authorization = f'Bearer {token}'
if request_type == 'POST':
result = session.request(request_type, url,
timeout=self.TIMEOUT,
headers={'Content-Type': 'application/json',
'Authorization': authorization},
json=data)
return result.text
else:
result = session.request(request_type, url, timeout=self.TIMEOUT,
headers={'Authorization': authorization})
return result.text
if __name__ == "__main__":
job = CurlJob.create_test_instance()
job.run('http://localhost:8888/api/v1/jobs', 'GET', {})
| 39.762712 | 97 | 0.518755 | 223 | 2,346 | 5.309417 | 0.403587 | 0.065034 | 0.088682 | 0.050676 | 0.170608 | 0.170608 | 0.148649 | 0.099662 | 0.099662 | 0.099662 | 0 | 0.012541 | 0.35422 | 2,346 | 58 | 98 | 40.448276 | 0.768977 | 0.028986 | 0 | 0.045455 | 0 | 0 | 0.257596 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0.090909 | 0.068182 | 0.022727 | 0.227273 | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b67146d928f69a5146e1d26fe26e5079cb833b63 | 1,272 | py | Python | za.py | zweed4u/residue | b4c397af34d0e67932db454eb486a9935a5d9d39 | [
"MIT"
] | null | null | null | za.py | zweed4u/residue | b4c397af34d0e67932db454eb486a9935a5d9d39 | [
"MIT"
] | null | null | null | za.py | zweed4u/residue | b4c397af34d0e67932db454eb486a9935a5d9d39 | [
"MIT"
] | null | null | null | #!/usr/bin/python
#sitekey: 6Lcm qwIT AAAA AD_8 PDdc M-9A hyft RNtF Xn1U 5e_8
import json, random, requests
def decrypt_1(cyphertext):
plaintext = ''
for letter in cyphertext:
letter = chr(ord(letter) - 1)
plaintext += letter
return plaintext
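# Example (for illustration): each character is shifted back by one code point,
# so decrypt_1('ifmmp') returns 'hello'.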
url = decrypt_1('iuuqt;00psefs/epnjopt/dpn0qpxfs0hjgudbse.cbmbodf')
host = decrypt_1('psefs/epnjopt/dpn')
UA = decrypt_1('EpnjoptjPT05/1/1!)jQipof!PT!:/4/4<!jQipof<!fo.VT*')
session = requests.session()
headers = {
'Host': host,
'Content-Type': 'application/json',
'Connection': 'keep-alive',
'Proxy-Connection': 'keep-alive',
'Accept': 'application/json',
'User-Agent': UA,
'Accept-Language': 'en-us',
'Accept-Encoding': 'gzip, deflate'
}
#while 1: #no looping - for demonstration purposes only
number = random.randint('''''','''''') #stubbed for a reason - not for malicious use!
pin = random.randint('''''','''''')
data = {
"GiftCards": [
{
"Number": str(number),
"Pin": str(pin)
}
]
}
r = session.post(url, json=data, headers=headers)
print r.status_code
if r.json()["Status"] == 0:
print json.dumps(r.json(), indent=4, sort_keys=True)
else:
print str(number),'::',str(pin),'is not a valid combination'
print
| 28.266667 | 85 | 0.629717 | 166 | 1,272 | 4.777108 | 0.596386 | 0.040353 | 0.047919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023346 | 0.191824 | 1,272 | 44 | 86 | 28.909091 | 0.748054 | 0.13522 | 0 | 0 | 0 | 0 | 0.29589 | 0.088584 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.026316 | null | null | 0.105263 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6729cf227c800433d7bb6a4d03bad320ebb49d2 | 11,447 | py | Python | lib/carbon/cache.py | zillow/carbon | 07244f98e8ddf305a0b2cc2da1bcc1a86b613ce6 | [
"Apache-2.0"
] | null | null | null | lib/carbon/cache.py | zillow/carbon | 07244f98e8ddf305a0b2cc2da1bcc1a86b613ce6 | [
"Apache-2.0"
] | 3 | 2017-02-17T04:11:13.000Z | 2017-05-03T03:22:16.000Z | lib/carbon/cache.py | zillow/carbon | 07244f98e8ddf305a0b2cc2da1bcc1a86b613ce6 | [
"Apache-2.0"
] | null | null | null | """Copyright 2009 Chris Davis
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License."""
import time
import os
import threading
from operator import itemgetter
from random import choice
from collections import defaultdict
from carbon.conf import settings
from carbon import events, log
from carbon.pipeline import Processor
from carbon_index.index import CarbonIndex
try:
import cPickle as pickle
except ImportError:
import pickle
def by_timestamp((timestamp, (value, is_flushed))): # useful sort key function
return timestamp
class CacheFeedingProcessor(Processor):
plugin_name = 'write'
def process(self, metric, datapoint):
MetricCache.store(metric, datapoint)
return Processor.NO_OUTPUT
class DrainStrategy(object):
"""Implements the strategy for writing metrics.
The strategy chooses what order (if any) metrics
will be popped from the backing cache"""
def __init__(self, cache):
self.cache = cache
def choose_item(self):
raise NotImplementedError
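# Illustrative sketch (not part of carbon): a custom strategy only needs to subclass
# DrainStrategy and implement choose_item(), returning a metric name held in self.cache.
# For example, a naive "first key" strategy could look like:
#
# class FirstKeyStrategy(DrainStrategy):
#     def choose_item(self):
#         return iter(self.cache).next()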
class NaiveStrategy(DrainStrategy):
"""Pop points in an unordered fashion."""
def __init__(self, cache):
super(NaiveStrategy, self).__init__(cache)
def _generate_queue():
while True:
metric_names = self.cache.keys()
while metric_names:
yield metric_names.pop()
self.queue = _generate_queue()
def choose_item(self):
return self.queue.next()
class MaxStrategy(DrainStrategy):
"""Always pop the metric with the greatest number of points stored.
This method leads to less variance in pointsPerUpdate but may mean
that infrequently or irregularly updated metrics may not be written
until shutdown """
def choose_item(self):
metric_name, size = max(self.cache.items(), key=lambda x: len(itemgetter(1)(x)))
return metric_name
class RandomStrategy(DrainStrategy):
"""Pop points randomly"""
def choose_item(self):
return choice(self.cache.keys())
class SortedStrategy(DrainStrategy):
""" The default strategy which prefers metrics with a greater number
of cached points but guarantees every point gets written exactly once during
a loop of the cache """
def __init__(self, cache):
super(SortedStrategy, self).__init__(cache)
def _generate_queue():
while True:
t = time.time()
metric_counts = sorted(self.cache.unflush_counts, key=lambda x: x[1])
if settings.LOG_CACHE_QUEUE_SORTS:
log.msg("Sorted %d cache queues in %.6f seconds" % (len(metric_counts), time.time() - t))
while metric_counts:
yield itemgetter(0)(metric_counts.pop())
self.queue = _generate_queue()
def choose_item(self):
return self.queue.next()
class _MetricCache(defaultdict):
"""A Singleton dictionary of metric names and lists of their datapoints"""
def __init__(self, strategy=None):
self.lock = threading.Lock()
self.size = 0
# Signal to shutdown
self.will_shutdown = False
# Let's also keep track of unflush_counts for each metric
self.metric_unflush_counts = defaultdict(int)
# If we haven't received a metric for two hours,
# let's drop its datapoints. Before dropping, make sure
# its datapoints have been flushed.
# {metric: timestamp}
self.last_received_timestamps = defaultdict(int)
# Build an index for carbon cache
self.index = CarbonIndex()
self.strategy = None
if strategy:
self.strategy = strategy(self)
super(_MetricCache, self).__init__(dict)
@property
def counts(self):
return [(metric, len(datapoints)) for (metric, datapoints) in self.items()]
@property
def unflush_counts(self):
return [(metric, self._count_unflushed(metric)) for (metric, datapoints) in self.items()]
@property
def is_full(self):
if settings.MAX_CACHE_SIZE == float('inf'):
return False
else:
return self.size >= settings.MAX_CACHE_SIZE
def _count_unflushed(self, metric):
return self.metric_unflush_counts[metric]
def _check_available_space(self):
if state.cacheTooFull and self.size < settings.CACHE_SIZE_LOW_WATERMARK:
log.msg("MetricCache below watermark: self.size=%d" % self.size)
events.cacheSpaceAvailable()
def drain_metric(self):
"""Returns a metric and it's datapoints in order determined by the
`DrainStrategy`_"""
if not self:
return (None, [])
if self.strategy:
metric = self.strategy.choose_item()
else:
# Avoid .keys() as it dumps the whole list
metric = self.iterkeys().next()
# Do not pop all datapoints for this metric.
# Instead, retain last N datapoints in carbon cache.
datapoints = self.pop(metric)
# 1. Retain last N datapoints in carbon cache,
# let's push them back
# 2. Do not push back during shutdown
if not self.will_shutdown:
for i in range(1, settings.MAX_RETAINED_LATEST_DATAPOINTS + 1):
if i > len(datapoints):
break
timestamp, tup = datapoints[-i]
value, is_flushed = tup
self.store(metric, (timestamp, value), is_flushed=True)
# Actions before returning
# 1. filter out datapoints that have been already flushed.
# 2. strip out is_flushed field, as for keeping same interface as before
datapoints = [(timestamp, value) for (timestamp, (value, is_flushed)) in datapoints if not is_flushed]
# Update unflushed
with self.lock:
self.metric_unflush_counts[metric] -= len(datapoints)
# Return directly during shutdown
if self.will_shutdown:
return (metric, datapoints)
# 1) latest datatpoints receivced two hours ago or even older.
# 2) all dataponts of that metric have been flushed already.
if len(datapoints) == 0:
time_now = int(time.time())
oldest_timestamp = time_now - settings.MAX_TIME_GAP_FOR_MISSING_DATA
if self.last_received_timestamps[metric] < oldest_timestamp:
with self.lock:
# remove the entry in unflush_counts map
self.metric_unflush_counts.pop(metric, None)
# remove the entry in last_received_timestamps map
self.last_received_timestamps.pop(metric, None)
# remove the index from CarbonIndex
self.index.delete(metric)
if metric in self:
self.pop(metric)
if settings.LOG_DROP_LATEST_DATAPOINTS:
log.msg("MetricCache drops latest datapoints for metric {0}...".format(metric))
return (metric, datapoints)
def get_datapoints(self, metric):
"""Return a list of currently cached datapoints sorted by timestamp"""
# let's keep old interface, strip out is_flushed before return
return sorted(self.get_unsorted_datapoints(metric), key=lambda tup: tup[0])
def get_unsorted_datapoints(self, metric):
"""Return a list of currently unsorted cached datapoints"""
# let's keep old interface, strip out is_flushed before return
return [(timestamp, value) for (timestamp, (value, is_flushed)) in self.get(metric, {}).items()]
def expand_wildcard_query(self, metric):
""" expand wildcard query """
return self.index.expand_pattern(metric)
def pop(self, metric):
with self.lock:
datapoint_index = defaultdict.pop(self, metric)
self.size -= len(datapoint_index)
self._check_available_space()
return sorted(datapoint_index.items(), key=by_timestamp)
def store(self, metric, datapoint, is_flushed=False):
timestamp, value = datapoint
if timestamp not in self[metric]:
# Not a duplicate, hence process if cache is not full
if self.is_full:
log.msg("MetricCache is full: self.size=%d" % self.size)
events.cacheFull()
else:
with self.lock:
self.size += 1
if not is_flushed:
self.metric_unflush_counts[metric] += 1
if self.last_received_timestamps[metric] < timestamp:
self.last_received_timestamps[metric] = timestamp
# create an index for new metrics
if len(self[metric]) == 0:
self.index.insert(metric)
self[metric][timestamp] = (value, is_flushed)
else:
# Updating a duplicate does not increase the cache size
self[metric][timestamp] = (value, is_flushed)
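# Usage sketch (illustrative only; metric name and datapoint are hypothetical):
#   cache = _MetricCache(SortedStrategy)
#   cache.store('servers.host1.load', (1234567890, 0.42))
#   metric, datapoints = cache.drain_metric()
# Datapoints are held per metric keyed by timestamp and drained in the order
# chosen by the configured DrainStrategy.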
# Initialize a singleton cache instance
write_strategy = None
if settings.CACHE_WRITE_STRATEGY == 'naive':
write_strategy = NaiveStrategy
if settings.CACHE_WRITE_STRATEGY == 'max':
write_strategy = MaxStrategy
if settings.CACHE_WRITE_STRATEGY == 'sorted':
write_strategy = SortedStrategy
if settings.CACHE_WRITE_STRATEGY == 'random':
write_strategy = RandomStrategy
def _cleanup_files(files):
for f in files:
if os.path.isfile(f):
os.unlink(f)
PICKLE_DUMP_PATH = os.environ.get("PICKLE_DUMP_PATH")
INSTANCE_NAME = os.environ.get("CARBON_CACHE_INSTANCE_NAME")
if PICKLE_DUMP_PATH and INSTANCE_NAME:
instance_pickle_file_path = os.path.join(PICKLE_DUMP_PATH, "carbon-instance-{}.pickle".format(INSTANCE_NAME))
metric_unflush_counts_pickle_file_path = os.path.join(PICKLE_DUMP_PATH, "carbon-instance-{}-metric_unflush_counts.pickle".format(INSTANCE_NAME))
last_received_timestamps_pickle_file_path = os.path.join(PICKLE_DUMP_PATH, "carbon-instance-{}-last_received_timestamps.pickle".format(INSTANCE_NAME))
carbon_index_pickle_file_path = os.path.join(PICKLE_DUMP_PATH, "carbon-instance-{}-carbon_index.pickle".format(INSTANCE_NAME))
size_pickle_file_path = os.path.join(PICKLE_DUMP_PATH, "carbon-instance-{}-size.pickle".format(INSTANCE_NAME))
pickled_files = [
instance_pickle_file_path,
metric_unflush_counts_pickle_file_path,
last_received_timestamps_pickle_file_path,
carbon_index_pickle_file_path,
size_pickle_file_path,
]
if os.path.isfile(instance_pickle_file_path):
try:
with open(instance_pickle_file_path, 'rb') as handle:
MetricCache = pickle.load(handle)
if os.path.isfile(metric_unflush_counts_pickle_file_path):
with open(metric_unflush_counts_pickle_file_path, 'rb') as handle:
MetricCache.metric_unflush_counts = pickle.load(handle)
if os.path.isfile(last_received_timestamps_pickle_file_path):
with open(last_received_timestamps_pickle_file_path, 'rb') as handle:
MetricCache.last_received_timestamps = pickle.load(handle)
if os.path.isfile(carbon_index_pickle_file_path):
with open(carbon_index_pickle_file_path, 'rb') as handle:
MetricCache.index = pickle.load(handle)
if os.path.isfile(size_pickle_file_path):
with open(size_pickle_file_path, 'rb') as handle:
MetricCache.size = pickle.load(handle)
MetricCache.strategy = write_strategy(MetricCache)
except:
MetricCache = _MetricCache(write_strategy)
finally:
# remove all pickled files
_cleanup_files(pickled_files)
else:
MetricCache = _MetricCache(write_strategy)
else:
MetricCache = _MetricCache(write_strategy)
# Avoid import circularities
from carbon import state
| 33.47076 | 152 | 0.71617 | 1,520 | 11,447 | 5.201974 | 0.217763 | 0.025294 | 0.035412 | 0.017453 | 0.28342 | 0.212723 | 0.150626 | 0.104717 | 0.063488 | 0.063488 | 0 | 0.002934 | 0.195947 | 11,447 | 341 | 153 | 33.568915 | 0.85615 | 0.115052 | 0 | 0.185366 | 0 | 0 | 0.050511 | 0.025081 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.068293 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b67314edb609b2d661c738a36e7968705ff7d30d | 995 | py | Python | danbooru_dump/danbooru_dump_bulkdata_download_missing.py | danya02/booru-mirror-unified | 2e4ee827c6b6dd0d2a3fe6b9f18220aafd552abe | [
"MIT"
] | null | null | null | danbooru_dump/danbooru_dump_bulkdata_download_missing.py | danya02/booru-mirror-unified | 2e4ee827c6b6dd0d2a3fe6b9f18220aafd552abe | [
"MIT"
] | null | null | null | danbooru_dump/danbooru_dump_bulkdata_download_missing.py | danya02/booru-mirror-unified | 2e4ee827c6b6dd0d2a3fe6b9f18220aafd552abe | [
"MIT"
] | null | null | null | import subprocess
import os
import sys
sys.path.append('..')
from database import *
import time
import random
import danbooru_dump_transfer_queue_into_redis_from_find
os.chdir('/hugedata/booru/danbooru_temp')
def download_content(id):
print (['rsync', '--recursive', '--verbose', f'rsync://78.46.86.149:873/danbooru2020/original/0{str(id)[-3:].zfill(3)}/{id}.*', './danbooru2020/original/'])
subprocess.run(['rsync', '--recursive', '--verbose', f'rsync://78.46.86.149:873/danbooru2020/original/0{str(id)[-3:].zfill(3)}/{id}.*', './danbooru2020/original/'])
while 1:
#time.sleep(3)
query = Post.select().where(~fn.EXISTS(Content.select().where(Content.post_id == Post.id))).where(Post.board == Imageboard.get(Imageboard.name == 'danbooru')).order_by(random.choice([Post.id, Post.local_id])).limit(200)
for post in query:
print(post.__data__)
download_content(post.local_id)
danbooru_dump_transfer_queue_into_redis_from_find.find_transfer()
| 41.458333 | 223 | 0.702513 | 140 | 995 | 4.807143 | 0.457143 | 0.118871 | 0.059435 | 0.074294 | 0.40416 | 0.40416 | 0.40416 | 0.40416 | 0.279346 | 0.279346 | 0 | 0.057432 | 0.107538 | 995 | 23 | 224 | 43.26087 | 0.70045 | 0.013065 | 0 | 0 | 0 | 0.111111 | 0.29898 | 0.237755 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.388889 | 0 | 0.444444 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b6771c944826e5be81f3d435b7cc708dd5ba54c5 | 1,282 | py | Python | beautiful_soup/page_parser.py | akellermann97/college-dump | 5c82d93767038709ad71b8f212fdb6243eeb0aec | [
"MIT"
] | null | null | null | beautiful_soup/page_parser.py | akellermann97/college-dump | 5c82d93767038709ad71b8f212fdb6243eeb0aec | [
"MIT"
] | null | null | null | beautiful_soup/page_parser.py | akellermann97/college-dump | 5c82d93767038709ad71b8f212fdb6243eeb0aec | [
"MIT"
] | null | null | null | # Author: Alexander Kellermann Nieves
# Date: September 18th, 2018
# This is a placeholder file.
# In theory everything that has to do with beautifulsoup should find its way
# here in this file. This is where the parsing of the individual webpage should happen.
from bs4 import BeautifulSoup
import sys
import requests
import typing
def get_page(link: str):
"""
Given a http url, attempt to grab a page using GET.
Returns None if there is an error
link: a string containing the URL
"""
try:
return requests.get(link)
except requests.exceptions.MissingSchema:
return None
except:
print("Unhandled exception", file=sys.stderr)
return None
def turn_request_into_soup(page: requests.models.Response):
"""
Takes a request from Python Requests, and turns it into a bs4
document. Returns None if there is an error
page: Python Request Object
returns:
bs4.BeautifulSoup object
"""
try:
return BeautifulSoup(page.text, 'html.parser')
except:
return None
def find_forms(soup: BeautifulSoup):
"""
Returns a list of forms found on a particular webpage
"""
return soup.findAll('form')
def find_links(soup: BeautifulSoup):
return soup.findAll('a')
| 23.740741 | 87 | 0.691108 | 175 | 1,282 | 5.028571 | 0.52 | 0.034091 | 0.029545 | 0.040909 | 0.061364 | 0.061364 | 0.061364 | 0 | 0 | 0 | 0 | 0.009231 | 0.23947 | 1,282 | 53 | 88 | 24.188679 | 0.893333 | 0.465679 | 0 | 0.333333 | 0 | 0 | 0.057283 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.190476 | 0.047619 | 0.714286 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b67b06ba8e6dd02cf4ca0a2c1aadbd51e315110b | 5,694 | py | Python | ComputingAllSpectralIntegrals.py | llimonta/Alaska2014 | 2022d37209719c665dc72a5b59fd58ed0b2a867b | [
"MIT"
] | null | null | null | ComputingAllSpectralIntegrals.py | llimonta/Alaska2014 | 2022d37209719c665dc72a5b59fd58ed0b2a867b | [
"MIT"
] | null | null | null | ComputingAllSpectralIntegrals.py | llimonta/Alaska2014 | 2022d37209719c665dc72a5b59fd58ed0b2a867b | [
"MIT"
] | null | null | null | import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from matplotlib.lines import Line2D
from scipy import interpolate
from scipy.integrate import trapz
from scipy.optimize import curve_fit
import itertools
#dictionary for plotting
color_line = {'Iron':'k','Normal':'orange', 'Na-poor':'g','Na-enhanced':'y','Na-rich':'b', 'Na-free':'r','Fe-poor':'cyan'}
#dictionary for creating mean spectral line
meteor_types_counter = {'Iron':0,'Normal':0, 'Na-poor':0,'Na-enhanced':0,'Na-rich':0, 'Na-free':0,'Fe-poor':0,'Other':0} # NB this is an updatable dictionary used later on in the code
def main():
text_file_sporadic = open('/home/limo/PHD/PokerFlat2014/MeanSpectrum/MeteorShowerVsSporadic.txt', 'r')
text_file_characteristic = open('/home/limo/PHD/PokerFlat2014/MeanSpectrum/MeteorSpecifications.txt', 'r')
# obtain response spectra by Camera
qeBVdata = pd.read_csv('/home/limo/Documents/Thesis/CodeForPlots/CameraQEBvBvf.csv')
qeBVx_points = qeBVdata['x']*10
qeBVy_points = qeBVdata['Curve2']
params1,cov1=curve_fit(QECurveFit,qeBVx_points,qeBVy_points,)
# print params1
# plt.plot(qeBVx_points,qeBVy_points)
# plt.plot(np.linspace(min(qeBVx_points),max(qeBVx_points),1000),QECurveFit(np.linspace(min(qeBVx_points),max(qeBVx_points),1000),*params1))
# plt.plot(qeBVx_points,QECurveFit(qeBVx_points,*params1))
# plt.show()
computing_integrals_all_spectra(text_file_sporadic,text_file_characteristic,params1)
return 0
def computing_integrals_all_spectra(text_file_sporadic,text_file_characteristic,params1):
# put file at beginning
text_file_sporadic.seek(0)
text_file_characteristic.seek(0)
x_text_file_sporadic = text_file_sporadic.readlines()
x_text_file_characteristic = text_file_characteristic.readlines()
# finding the minimum and maximum wavelengths on which to interpolate for mean_spectra
meteor_min_spectrum = np.zeros(1)
meteor_max_spectrum = np.zeros(1)
all_meteor_integral_ratios=np.zeros(1)
color_meteor=[]
meteor_types_legend=[]
dummy_counter=0
fig = plt.figure(figsize=(8, 6))
for iii in range(0,len(x_text_file_sporadic)):
is_sporadic = x_text_file_sporadic[iii][:].split('|')[-1].rstrip() #rstrip removes the \n at the end, spaces et al
if is_sporadic == str('SPO'):
# print iii
dummy_counter+=1
name_meteor = x_text_file_sporadic[iii][:].split('|')[0].rstrip()
spo_characteristic = x_text_file_characteristic[iii][:].split('|')[-1].rstrip()
meteor_data = pd.read_csv("".join(['/home/limo/PHD/PokerFlat2014/MeanSpectrum/',name_meteor,'.dat']),delim_whitespace = True, skiprows =[-1], header = None)
meteor_spectrum=np.float64(meteor_data[0][:])
meteor_spectral_response=np.float64(meteor_data[2][:])
# meteor_spectral_response[meteor_spectral_response<0]=np.NaN
# meteor_spectral_response[meteor_spectral_response<0]=np.nan
# print meteor_spectrum,meteor_spectral_response
if spo_characteristic in color_line:
color_meteor.append(color_line[spo_characteristic])
meteor_types_counter[spo_characteristic]=meteor_types_counter[spo_characteristic]+1
meteor_types_legend.append(spo_characteristic)
else:
color_meteor.append('magenta')
meteor_types_counter['Other']=meteor_types_counter['Other']+1
meteor_types_legend.append('Other')
# Compute spectral response seen by camera
Meteor_Camera_Response=meteor_spectral_response*QECurveFit(meteor_spectrum[:],*params1)/100
# Compute integrals
MeteorIntegral = trapz(meteor_spectral_response,meteor_spectrum)
CameraMeteorIntegral = trapz(Meteor_Camera_Response,meteor_spectrum)
all_meteor_integral_ratios= np.append(all_meteor_integral_ratios,CameraMeteorIntegral/MeteorIntegral)
plt.plot(dummy_counter,CameraMeteorIntegral/MeteorIntegral,color_meteor[dummy_counter-1],marker='.',markersize=10)
custom_lines = [Line2D([0], [0], color=color_line['Normal'],marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Iron'],marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Fe-poor'],marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Na-rich'], marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Na-enhanced'],marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Na-poor'], marker='.',linestyle='None'),
Line2D([0], [0], color=color_line['Na-free'], marker='.',linestyle='None'),
Line2D([0], [0], color='magenta',marker='.',linestyle='None')]
plt.xlabel('Meteor #',fontsize=20)
plt.ylabel('Integral Ratio',fontsize=24)
plt.title('Ratio between Camera Response and Ideal Response',fontsize=24)
plt.legend(custom_lines, ['Normal','Fe','Fe-poor','Na-rich','Na-enhanced','Na-poor','Na-free', 'other'],fontsize=15)
plt.show()
print all_meteor_integral_ratios
print np.mean(all_meteor_integral_ratios[1::]),np.std(all_meteor_integral_ratios[1::])
return 0 #min(meteor_min_spectrum[1::]),max(meteor_max_spectrum[1::])
def QECurveFit(x,a,b,c,d,e,f,g,h,n,l,m):
# print type(x)
# print a,b,c,d,e,f,g,h,n,l,m
# print a*x,b*x**2,c*x**3,d*x**4,e*x**5,f*x**6,g*x**7,h*x**8,n*x**9,l*x**10,m*x**11
return a*x+b*x**2+c*x**3+d*x**4+e*x**5+f*x**6+g*x**7+h*x**8+n*x**9+l*x**10+m*x**11
if __name__ == '__main__':
mat=main()
| 54.75 | 183 | 0.679487 | 790 | 5,694 | 4.682278 | 0.249367 | 0.034604 | 0.038929 | 0.028116 | 0.329549 | 0.260341 | 0.219248 | 0.187078 | 0.187078 | 0.115166 | 0 | 0.027848 | 0.167545 | 5,694 | 103 | 184 | 55.281553 | 0.752532 | 0.173516 | 0 | 0.027397 | 0 | 0 | 0.132366 | 0.049957 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.109589 | null | null | 0.027397 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b67b35eb818dc4e5baf99aad96bee0144538fed9 | 2,696 | py | Python | midnight_news/models.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 1 | 2015-11-20T12:42:39.000Z | 2015-11-20T12:42:39.000Z | midnight_news/models.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 3 | 2020-02-11T21:21:12.000Z | 2021-06-10T17:23:56.000Z | midnight_news/models.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 1 | 2015-11-04T09:23:31.000Z | 2015-11-04T09:23:31.000Z | from django.core.urlresolvers import reverse
from django.db import models
from midnight_main.models import BaseTree, Base, BreadCrumbsMixin, BaseComment
from ckeditor.fields import RichTextField
from django.utils.translation import ugettext_lazy as _
from sorl.thumbnail import ImageField
from mptt.fields import TreeManyToManyField
class Section(BreadCrumbsMixin, BaseTree):
"""
News category model
"""
title = models.CharField(max_length=255, verbose_name=_('Title'))
slug = models.SlugField(max_length=255, unique=True, verbose_name=_('Slug'))
sort = models.IntegerField(default=500, verbose_name=_('Sort'))
metatitle = models.CharField(max_length=2000, blank=True, verbose_name=_('Title'))
keywords = models.CharField(max_length=2000, blank=True, verbose_name=_('Keywords'))
description = models.CharField(max_length=2000, blank=True, verbose_name=_('Description'))
def get_absolute_url(self):
return reverse('midnight_news:news_list', kwargs={'slug': self.slug})
def __str__(self):
return self.title
class MPTTMeta:
order_insertion_by = ['sort']
class Meta:
verbose_name = _('NewsSection')
verbose_name_plural = _('NewsSections')
class News(Base):
"""
News item model
"""
title = models.CharField(max_length=255, verbose_name=_('Title'))
slug = models.SlugField(max_length=255, unique=True, verbose_name=_('Slug'))
date = models.DateField(verbose_name=_('Date'), blank=False)
sections = TreeManyToManyField(Section, verbose_name=_('Sections'))
image = ImageField(upload_to='news', verbose_name=_('Image'), blank=True)
annotation = models.TextField(blank=True, verbose_name=_('Annotation'))
text = RichTextField(blank=True, verbose_name=_('Text'))
comments = models.BooleanField(default=False, verbose_name=_('Comments'))
metatitle = models.CharField(max_length=2000, blank=True, verbose_name=_('Title'))
keywords = models.CharField(max_length=2000, blank=True, verbose_name=_('Keywords'))
description = models.CharField(max_length=2000, blank=True, verbose_name=_('Description'))
def get_absolute_url(self):
return reverse('midnight_news:news_detail', kwargs={'section_slug': self.sections.all()[0].slug, 'slug': self.slug})
def __str__(self):
return self.title
class Meta:
verbose_name = _('NewsItem')
verbose_name_plural = _('News')
class NewsComment(BaseComment):
"""
News comment model
"""
obj = models.ForeignKey(News)
class Meta:
verbose_name = _('NewsComment')
verbose_name_plural = _('NewsComments')
| 27.793814 | 124 | 0.706602 | 309 | 2,696 | 5.899676 | 0.300971 | 0.138782 | 0.082282 | 0.105321 | 0.43006 | 0.43006 | 0.43006 | 0.43006 | 0.43006 | 0.43006 | 0 | 0.017841 | 0.168398 | 2,696 | 96 | 125 | 28.083333 | 0.795272 | 0.025593 | 0 | 0.404255 | 0 | 0 | 0.094186 | 0.018605 | 0 | 0 | 0 | 0 | 0 | 1 | 0.085106 | false | 0 | 0.148936 | 0.085106 | 0.851064 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b67cca26a73f81dbfa87c94981ac89667eb242b4 | 1,704 | py | Python | Src/Morse.py | DragonixAlpha/Kal | 2ca941ae91ffc459c821ac1b9139dd96abfec6e2 | [
"MIT"
] | null | null | null | Src/Morse.py | DragonixAlpha/Kal | 2ca941ae91ffc459c821ac1b9139dd96abfec6e2 | [
"MIT"
] | null | null | null | Src/Morse.py | DragonixAlpha/Kal | 2ca941ae91ffc459c821ac1b9139dd96abfec6e2 | [
"MIT"
] | 1 | 2021-05-16T15:32:56.000Z | 2021-05-16T15:32:56.000Z | MORSE_CODE_DICT = { 'A':'.-', 'B':'-...',
'C':'-.-.', 'D':'-..', 'E':'.',
'F':'..-.', 'G':'--.', 'H':'....',
'I':'..', 'J':'.---', 'K':'-.-',
'L':'.-..', 'M':'--', 'N':'-.',
'O':'---', 'P':'.--.', 'Q':'--.-',
'R':'.-.', 'S':'...', 'T':'-',
'U':'..-', 'V':'...-', 'W':'.--',
'X':'-..-', 'Y':'-.--', 'Z':'--..',
'1':'.----', '2':'..---', '3':'...--',
'4':'....-', '5':'.....', '6':'-....',
'7':'--...', '8':'---..', '9':'----.',
'0':'-----', ', ':'--..--', '.':'.-.-.-',
'?':'..--..', '/':'-..-.', '-':'-....-',
'(':'-.--.', ')':'-.--.-'}
# Function to encrypt the string
# according to the morse code chart
def encrypt(message):
cipher = ''
for letter in message:
if letter != ' ':
# Looks up the dictionary and adds the
# corresponding morse code
# along with a space to separate
# morse codes for different characters
cipher += MORSE_CODE_DICT[letter] + ' '
else:
# 1 space indicates different characters
# and 2 indicates different words
cipher += ' '
return cipher
# Hard-coded driver function to run the program
def main():
message = input("Enter a Word or Phrase")
result = encrypt(message.upper())
print (result)
# Executes the main function
if __name__ == '__main__':
main()
| 36.255319 | 63 | 0.311033 | 131 | 1,704 | 3.954198 | 0.656489 | 0.069498 | 0.050193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011685 | 0.3973 | 1,704 | 46 | 64 | 37.043478 | 0.492697 | 0.204225 | 0 | 0 | 0 | 0 | 0.191834 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0 | 0 | 0 | 0.103448 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b683e527a05953ee9bd12970d53f000c8b55f886 | 934 | py | Python | gui/helpers/native_saver.py | intdata-bsc/idact-gui | 68df38dd231b5bd6ce45fd5e6f6450b9951efc50 | [
"MIT"
] | 3 | 2019-06-11T19:15:45.000Z | 2020-05-16T17:07:43.000Z | gui/helpers/native_saver.py | intdata-bsc/idact-gui | 68df38dd231b5bd6ce45fd5e6f6450b9951efc50 | [
"MIT"
] | 21 | 2019-06-24T19:47:41.000Z | 2020-02-22T14:39:46.000Z | gui/helpers/native_saver.py | intdata-bsc/idact-gui | 68df38dd231b5bd6ce45fd5e6f6450b9951efc50 | [
"MIT"
] | 1 | 2019-08-20T20:20:20.000Z | 2019-08-20T20:20:20.000Z | """ One of the helpers for the gui application.
Similar modules: :class:`.DataProvider`, :class:`.ParameterSaver`,
:class:`.UiLoader`, :class:`.Worker`, :class:`.ConfigurationProvider`
"""
import json
import os
class NativeArgsSaver:
""" Manages the native arguments.
"""
def __init__(self):
current_directory = os.path.dirname(os.path.abspath(__file__))
self.filename = os.path.join(current_directory, '../../native_args.json')
with open(self.filename) as json_file:
self.native_args = json.load(json_file)
def save(self, native_args):
""" Saves the native arguments into a json file.
:param native_args: Native arguments to be saved.
"""
self.native_args = native_args
with open(self.filename, 'w') as json_file:
json.dump(native_args, json_file)
def get_native_args(self):
return self.native_args
| 30.129032 | 81 | 0.650964 | 115 | 934 | 5.078261 | 0.434783 | 0.15411 | 0.09589 | 0.068493 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229122 | 934 | 30 | 82 | 31.133333 | 0.811111 | 0.332976 | 0 | 0 | 0 | 0 | 0.04007 | 0.038328 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.142857 | 0.071429 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6861069d6eb9f5bbebb4e747de8688fa5393f07 | 10,444 | py | Python | cinder/tests/unit/volume/drivers/datacore/test_datacore_fc.py | alexisries/openstack-cinder | 7cc6e45c5ddb8bf771bdb01b867628e41761ae11 | [
"Apache-2.0"
] | 2 | 2019-05-24T14:13:50.000Z | 2019-05-24T14:21:13.000Z | cinder/tests/unit/volume/drivers/datacore/test_datacore_fc.py | vexata/cinder | 7b84c0842b685de7ee012acec40fb4064edde5e9 | [
"Apache-2.0"
] | 5 | 2019-08-14T06:46:03.000Z | 2021-12-13T20:01:25.000Z | cinder/tests/unit/volume/drivers/datacore/test_datacore_fc.py | vexata/cinder | 7b84c0842b685de7ee012acec40fb4064edde5e9 | [
"Apache-2.0"
] | 2 | 2020-03-15T01:24:15.000Z | 2020-07-22T20:34:26.000Z | # Copyright (c) 2017 DataCore Software Corp. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Unit tests for the Fibre Channel Driver for DataCore SANsymphony
storage array.
"""
import mock
from cinder import exception as cinder_exception
from cinder import test
from cinder.tests.unit.volume.drivers.datacore import test_datacore_driver
from cinder.volume.drivers.datacore import fc
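# Mocked SANsymphony objects shared by the tests below: FC ports (initiators and targets),
# served logical units, and the target devices that link them.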
PORTS = [
mock.Mock(Id='initiator_port_id1',
PortType='FibreChannel',
PortMode='Initiator',
PortName='AA-AA-AA-AA-AA-AA-AA-AA',
HostId='client_id1'),
mock.Mock(Id='initiator_port_id2',
PortType='FibreChannel',
PortMode='Initiator',
PortName='BB-BB-BB-BB-BB-BB-BB-BB'),
mock.Mock(Id='target_port_id1',
PortMode='Target',
PortName='CC-CC-CC-CC-CC-CC-CC-CC',
HostId='server_id1'),
mock.Mock(Id='target_port_id2',
PortMode='Target',
PortName='DD-DD-DD-DD-DD-DD-DD-DD',
HostId='server_id1'),
]
LOGICAL_UNITS = [
mock.Mock(VirtualTargetDeviceId='target_device_id1',
Lun=mock.Mock(Quad=4)),
mock.Mock(VirtualTargetDeviceId='target_device_id2',
Lun=mock.Mock(Quad=3)),
mock.Mock(VirtualTargetDeviceId='target_device_id3',
Lun=mock.Mock(Quad=2)),
mock.Mock(VirtualTargetDeviceId='target_device_id4',
Lun=mock.Mock(Quad=1)),
]
TARGET_DEVICES = [
mock.Mock(Id='target_device_id1',
TargetPortId='target_port_id1',
InitiatorPortId='initiator_port_id1'),
mock.Mock(Id='target_device_id2',
TargetPortId='target_port_id2',
InitiatorPortId='initiator_port_id1'),
mock.Mock(Id='target_device_id3',
TargetPortId='target_port_id2',
InitiatorPortId='initiator_port_id1'),
mock.Mock(Id='target_device_id4',
TargetPortId='target_port_id2',
InitiatorPortId='initiator_port_id2'),
]
class FibreChannelVolumeDriverTestCase(
test_datacore_driver.DataCoreVolumeDriverTestCase, test.TestCase):
"""Tests for the FC Driver for DataCore SANsymphony storage array."""
def setUp(self):
super(FibreChannelVolumeDriverTestCase, self).setUp()
self.mock_client.get_ports.return_value = PORTS
self.mock_client.get_target_devices.return_value = TARGET_DEVICES
@staticmethod
def init_driver(config):
driver = fc.FibreChannelVolumeDriver(configuration=config)
driver.do_setup(None)
return driver
def test_validate_connector(self):
driver = self.init_driver(self.setup_default_configuration())
connector = {
'host': 'host_name',
'wwpns': ['AA-AA-AA-AA-AA-AA-AA-AA'],
}
driver.validate_connector(connector)
def test_validate_connector_failed(self):
driver = self.init_driver(self.setup_default_configuration())
connector = {}
self.assertRaises(cinder_exception.InvalidConnectorException,
driver.validate_connector,
connector)
connector = {'host': 'host_name'}
self.assertRaises(cinder_exception.InvalidConnectorException,
driver.validate_connector,
connector)
connector = {'wwpns': ['AA-AA-AA-AA-AA-AA-AA-AA']}
self.assertRaises(cinder_exception.InvalidConnectorException,
driver.validate_connector,
connector)
def test_initialize_connection(self):
(self.mock_client.serve_virtual_disks_to_host
.return_value) = LOGICAL_UNITS
virtual_disk = test_datacore_driver.VIRTUAL_DISKS[0]
client = test_datacore_driver.CLIENTS[0]
driver = self.init_driver(self.setup_default_configuration())
volume = test_datacore_driver.VOLUME.copy()
volume['provider_location'] = virtual_disk.Id
initiator_wwpns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Initiator']
connector = {
'host': client.HostName,
'wwpns': initiator_wwpns,
}
result = driver.initialize_connection(volume, connector)
self.assertEqual('fibre_channel', result['driver_volume_type'])
target_wwns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Target']
self.assertIn(result['data']['target_wwn'], target_wwns)
target_wwn = result['data']['target_wwn']
target_port_id = next((
port.Id for port
in PORTS
if port.PortName.replace('-', '').lower() == target_wwn), None)
target_device_id = next((
device.Id for device
in TARGET_DEVICES
if device.TargetPortId == target_port_id), None)
target_lun = next((
unit.Lun.Quad for unit
in LOGICAL_UNITS
if unit.VirtualTargetDeviceId == target_device_id), None)
self.assertEqual(target_lun, result['data']['target_lun'])
self.assertFalse(result['data']['target_discovered'])
self.assertEqual(volume['id'], result['data']['volume_id'])
self.assertEqual('rw', result['data']['access_mode'])
def test_initialize_connection_unknown_client(self):
client = test_datacore_driver.CLIENTS[0]
self.mock_client.register_client.return_value = client
(self.mock_client.get_clients
.return_value) = test_datacore_driver.CLIENTS[1:]
(self.mock_client.serve_virtual_disks_to_host
.return_value) = LOGICAL_UNITS
virtual_disk = test_datacore_driver.VIRTUAL_DISKS[0]
driver = self.init_driver(self.setup_default_configuration())
volume = test_datacore_driver.VOLUME.copy()
volume['provider_location'] = virtual_disk.Id
initiator_wwpns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Initiator']
connector = {
'host': client.HostName,
'wwpns': initiator_wwpns,
}
result = driver.initialize_connection(volume, connector)
self.assertEqual('fibre_channel', result['driver_volume_type'])
target_wwns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Target']
self.assertIn(result['data']['target_wwn'], target_wwns)
target_wwn = result['data']['target_wwn']
target_port_id = next((
port.Id for port
in PORTS
if port.PortName.replace('-', '').lower() == target_wwn), None)
target_device_id = next((
device.Id for device
in TARGET_DEVICES
if device.TargetPortId == target_port_id), None)
target_lun = next((
unit.Lun.Quad for unit
in LOGICAL_UNITS
if unit.VirtualTargetDeviceId == target_device_id), None)
self.assertEqual(target_lun, result['data']['target_lun'])
self.assertFalse(result['data']['target_discovered'])
self.assertEqual(volume['id'], result['data']['volume_id'])
self.assertEqual('rw', result['data']['access_mode'])
def test_initialize_connection_failed_not_found(self):
client = test_datacore_driver.CLIENTS[0]
driver = self.init_driver(self.setup_default_configuration())
volume = test_datacore_driver.VOLUME.copy()
volume['provider_location'] = 'wrong_virtual_disk_id'
initiator_wwpns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Initiator']
connector = {
'host': client.HostName,
'wwpns': initiator_wwpns,
}
self.assertRaises(cinder_exception.VolumeDriverException,
driver.initialize_connection,
volume,
connector)
def test_initialize_connection_failed_initiator_not_found(self):
(self.mock_client.serve_virtual_disks_to_host
.return_value) = LOGICAL_UNITS
virtual_disk = test_datacore_driver.VIRTUAL_DISKS[0]
client = test_datacore_driver.CLIENTS[0]
driver = self.init_driver(self.setup_default_configuration())
volume = test_datacore_driver.VOLUME.copy()
volume['provider_location'] = virtual_disk.Id
connector = {
'host': client.HostName,
'wwpns': ['0000000000000000'],
}
self.assertRaises(cinder_exception.VolumeDriverException,
driver.initialize_connection,
volume,
connector)
def test_initialize_connection_failed_on_serve(self):
self.mock_client.serve_virtual_disks_to_host.return_value = []
virtual_disk = test_datacore_driver.VIRTUAL_DISKS[0]
client = test_datacore_driver.CLIENTS[0]
driver = self.init_driver(self.setup_default_configuration())
volume = test_datacore_driver.VOLUME.copy()
volume['provider_location'] = virtual_disk.Id
initiator_wwpns = [port.PortName.replace('-', '').lower() for port
in PORTS
if port.PortMode == 'Initiator']
connector = {
'host': client.HostName,
'wwpns': initiator_wwpns,
}
self.assertRaises(cinder_exception.VolumeDriverException,
driver.initialize_connection,
volume,
connector)
| 40.638132 | 78 | 0.618058 | 1,105 | 10,444 | 5.600905 | 0.158371 | 0.013572 | 0.01745 | 0.019389 | 0.753595 | 0.685248 | 0.664889 | 0.642107 | 0.639522 | 0.609953 | 0 | 0.008094 | 0.278437 | 10,444 | 256 | 79 | 40.796875 | 0.813163 | 0.072578 | 0 | 0.665072 | 0 | 0 | 0.110433 | 0.016456 | 0 | 0 | 0 | 0 | 0.086124 | 1 | 0.043062 | false | 0 | 0.023923 | 0 | 0.076555 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6862baa381a49c5ef1854fb8bd9b911475f5c62 | 1,580 | py | Python | tests/test_DepthEstimator.py | melkimble/OpticalRS | 54404f6c1e4e4a6f625e7b15e9f0489cb3600d79 | [
"BSD-3-Clause"
] | 17 | 2016-06-13T02:29:09.000Z | 2022-03-22T13:30:39.000Z | tests/test_DepthEstimator.py | melkimble/OpticalRS | 54404f6c1e4e4a6f625e7b15e9f0489cb3600d79 | [
"BSD-3-Clause"
] | 7 | 2017-09-02T12:50:49.000Z | 2019-02-14T18:32:52.000Z | tests/test_DepthEstimator.py | melkimble/OpticalRS | 54404f6c1e4e4a6f625e7b15e9f0489cb3600d79 | [
"BSD-3-Clause"
] | 14 | 2016-04-02T14:03:34.000Z | 2021-05-18T06:00:45.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_DepthEstimator
-------------------
pytest Tests for `OpticalRS.DepthEstimator` module. To run these tests, install
pytest and run `py.test` in this test directory.
"""
from OpticalRS.RasterDS import RasterDS
from OpticalRS.DepthEstimator import DepthEstimator
import pytest
import numpy as np
rds = RasterDS('data/eReefWV2.tif')
drds = RasterDS('data/eReefDepth.tif')
imarr = rds.band_array
darr = drds.band_array.squeeze()
imflat = imarr.reshape(-1,imarr.shape[-1])
dflat = darr.ravel()
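# The fixtures parametrize each test over the three accepted input forms:
# RasterDS object, full band array, and flattened array.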
@pytest.fixture(params=[rds,imarr,imflat],
ids=['imgRasterDS','imgArray','imgFlat'])
def image(request):
return request.param
@pytest.fixture(params=[drds,darr,dflat],
ids=['depRasterDS','depArray','depFlat'])
def depth(request):
return request.param
@pytest.fixture
def dep_est(image,depth):
print "Img Type: {}, Depth Type: {}".format(type(image).__name__,type(depth).__name__)
return DepthEstimator(image,depth)
class TestDepthEstimator:
def test_nbands(self,dep_est):
assert dep_est.nbands == 8
def test_image_masking(self,dep_est):
if np.ma.isMA(dep_est.imarr) and np.ma.isMA(dep_est.known_depth_arr):
imgmask = dep_est.known_imarr.mask[...,0]
depmask = dep_est.known_depth_arr.mask
assert np.array_equiv(imgmask,depmask)
imgmask = dep_est.known_imarr_flat.mask[...,0]
depmask = dep_est.known_depth_arr_flat.mask
assert np.array_equiv(imgmask,depmask)
| 28.214286 | 90 | 0.675316 | 207 | 1,580 | 4.980676 | 0.410628 | 0.058196 | 0.053346 | 0.046557 | 0.28807 | 0.203686 | 0.129971 | 0.060136 | 0 | 0 | 0 | 0.005447 | 0.186709 | 1,580 | 55 | 91 | 28.727273 | 0.796887 | 0.026582 | 0 | 0.121212 | 0 | 0 | 0.085483 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0 | null | null | 0 | 0.121212 | null | null | 0.030303 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b68858b3359450e821dd2966444e4abc86685776 | 26,328 | py | Python | test/test_say.py | andrewvaughan/ansible-message | 73b45fd9a09ff69a4221c898b575b1b17701b936 | [
"MIT"
] | 8 | 2018-02-27T11:33:33.000Z | 2020-07-11T23:59:33.000Z | test/test_say.py | andrewvaughan/ansible-prompt | 73b45fd9a09ff69a4221c898b575b1b17701b936 | [
"MIT"
] | 65 | 2017-11-01T20:38:30.000Z | 2017-12-02T00:19:28.000Z | test/test_say.py | andrewvaughan/ansible-message | 73b45fd9a09ff69a4221c898b575b1b17701b936 | [
"MIT"
] | 1 | 2017-11-30T22:02:17.000Z | 2017-11-30T22:02:17.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright 2017 Andrew Vaughan <hello@andrewvaughan.io>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
# documentation files (the "Software"), to deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
# Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
Test suite for the Ansible prompt action plugin.
.. moduleauthor:: Andrew Vaughan <hello@andrewvaughan.io>
"""
import ansible
import mock
import StringIO
import sys
import unittest
from action_plugins import Prompt
from ansible.playbook.task import Task as AnsibleTask
from ansible.playbook.play_context import PlayContext as AnsiblePlayContext
class TestSay(unittest.TestCase):
"""
Tests the messaging portion of the Ansible prompt action plugin.
.. class:: TestSay
.. versionadded:: 0.1.0
"""
def setUp(self):
"""
Sets up a prompt object before each test.
.. versionadded:: 0.1.0
.. function:: setUp()
"""
self.prompt = self._getPrompt()
self.outstr = StringIO.StringIO()
self.prompt.setOutput(self.outstr)
self.response = {
"changed": False
}
self.expected = self.response.copy()
def _getPrompt(self):
"""
Return a generic Prompt object.
:returns: generic Prompt object
.. versionadded:: 0.1.0
.. function:: _getPrompt()
"""
return Prompt(
task=AnsibleTask(),
connection=None,
play_context=AnsiblePlayContext(),
loader=None,
templar=None,
shared_loader_obj=None
)
# __init__(task, connection, play_context, loader, templar, shared_loader_obj)
def test_prompt_init_default_outstr_valid(self):
"""
Test that the default output stream is stdout.
.. versionadded:: 0.2.0
.. function:: test_prompt_init_default_outstr_valid()
"""
self.assertEqual(self._getPrompt()._outstr, sys.stdout)
# setOutput(outstr)
def test_prompt_setOutput_default_valid(self):
"""
Test that the default setting for a prompt is stdout.
.. versionadded:: 0.1.0
.. function:: test_setOutput_default_valid()
"""
prompt = self._getPrompt()
prompt.setOutput()
self.assertEquals(
prompt._outstr,
sys.stdout
)
def test_prompt_setOutput_stringio_valid(self):
"""
Test that an updated setting for setOutput() sticks.
.. versionadded:: 0.1.0
.. function:: test_setOutput_stringio_valid()
"""
prompt = self._getPrompt()
outstr = StringIO.StringIO()
prompt.setOutput(outstr)
self.assertEquals(outstr, prompt._outstr)
self.assertEquals(outstr.getvalue(), "")
prompt._prompt({}, "test")
self.assertEquals(outstr.getvalue(), "test\n")
# _fail(result, msg, args*)
def test_prompt_fail_params_missing_exception(self):
"""
Test that the _fail() method throws an exception if all parameters are missing.
.. versionadded:: 0.1.0
.. function:: test_fail_params_missing_exception()
"""
with self.assertRaises(TypeError):
self.prompt._fail()
def test_prompt_fail_response_missing_exception(self):
"""
Test that the _fail() method throws an exception if the response is missing.
.. versionadded:: 0.1.0
.. function:: test_fail_response_missing_exception()
"""
with self.assertRaises(TypeError):
self.prompt._fail(None, "Failure Message")
def test_prompt_fail_response_invalid_exception(self):
"""
Test that the _fail() method throws an exception if the response is not a dict.
.. versionadded:: 0.1.0
.. function:: test_fail_response_invalid_exception()
"""
with self.assertRaises(TypeError):
self.prompt._fail("Bad Result", "Failure Message")
def test_prompt_fail_message_missing_exception(self):
"""
Test that the _fail() method throws an exception if no message is provided.
.. versionadded:: 0.1.0
.. function:: test_fail_message_missing_exception()
"""
with self.assertRaises(TypeError):
self.prompt._fail(self.response)
def test_prompt_fail_message_empty_exception(self):
"""
Test that the _fail() method throws an exception if an empty message is provided.
.. versionadded:: 0.1.0
.. function:: test_fail_message_empty_exception()
"""
with self.assertRaises(ValueError):
self.prompt._fail(self.response, "")
def test_prompt_fail_message_nonstring_exception(self):
"""
Test that the _fail() method throws an exception if a non-string message is provided.
.. versionadded:: 0.1.0
.. function:: test_fail_message_nonstring_exception()
"""
with self.assertRaises(TypeError):
self.prompt._fail(self.response, 45)
self.prompt._fail(self.response, {"hello": "bar"})
self.prompt._fail(self.response, ["a", "b"])
def test_prompt_fail_message_valid_success(self):
"""
Tests that the _fail() method returns an expected response.
.. versionadded:: 0.1.0
.. function:: test_fail_message_valid_success()
"""
self.expected['failed'] = True
self.expected['msg'] = "Failure Message"
self.assertEquals(
self.prompt._fail(self.response, "Failure Message"),
self.expected
)
def test_prompt_fail_message_multiple_valid_success(self):
"""
Test that the _fail() method returns an expected response with multiple variables.
.. versionadded:: 0.1.0
.. function:: test_fail_message_multiple_valid_success()
"""
self.expected['failed'] = True
self.expected['msg'] = "Failure Message A 3.14 Cats"
self.assertEquals(
self.prompt._fail(self.response, "Failure Message %s %.2f %s", "A", 3.14159, "Cats"),
self.expected
)
# _prompt(result, msg)
def test_prompt_param_invalid_fails(self):
"""
Test that the _prompt() method returns a failure if given a bad, top-level parameter name.
.. versionadded:: 0.1.0
.. function:: test_prompt_param_invalid_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "Unexpected parameter 'foo'"
self.assertEquals(
self.prompt._prompt(self.response, {
"foo": "bar"
}),
self.expected
)
def test_prompt_msg_emptystring_fails(self):
"""
Test that the _prompt() method returns a failure if an empty string is provided for the message.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_emptystring_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "No message provided"
self.assertEquals(
self.prompt._prompt(self.response, ""),
self.expected
)
def test_prompt_msg_emptylist_fails(self):
"""
Test that the _prompt() method returns a failure if an empty list is provided for the message.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_emptylist_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "No message provided"
self.assertEquals(
self.prompt._prompt(self.response, []),
self.expected
)
def test_prompt_msg_emptyobject_fails(self):
"""
Test that the _prompt() method returns a failure if an empty object is provided for the message.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_emptyobject_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "No message provided"
self.assertEquals(
self.prompt._prompt(self.response, {}),
self.expected
)
def test_prompt_msg_none_fails(self):
"""
Test that the _prompt() method returns a failure if no message is provided.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_none_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "No message provided"
self.assertEquals(
self.prompt._prompt(self.response, None),
self.expected
)
def test_prompt_msg_string_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple string.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_string_succeeds()
"""
msg = "Hello World"
self.assertTrue(isinstance(msg, str))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % msg)
def test_prompt_msg_unicode_succeeds(self):
"""
Test that the _prompt() method is successful if given a unicode string.
.. versionadded:: 0.2.0
.. function:: test_prompt_msg_unicode_succeeds()
"""
msg = u"Hello World"
self.assertTrue(isinstance(msg, unicode))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % msg)
def test_prompt_msg_int_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple integer.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_int_succeeds()
"""
msg = 1
self.assertTrue(isinstance(msg, int))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_long_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple long.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_long_succeeds()
"""
msg = sys.maxint + 1
self.assertTrue(isinstance(msg, long))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_float_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple float.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_float_succeeds()
"""
msg = 1.1
self.assertTrue(isinstance(msg, float))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_complex_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple complex.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_complex_succeeds()
"""
msg = complex(1, 5)
self.assertTrue(isinstance(msg, complex))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_tuple_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple tuple.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_tuple_succeeds()
"""
msg = (1, 5)
self.assertTrue(isinstance(msg, tuple))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_set_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple set.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_set_succeeds()
"""
msg = set([1, 5, 50, "alpha"])
self.assertTrue(isinstance(msg, set))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_frozenset_succeeds(self):
"""
Test that the _prompt() method is successful if given a simple frozenset.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_frozenset_succeeds()
"""
msg = frozenset([-5, 5, 5.0, "beta"])
self.assertTrue(isinstance(msg, frozenset))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % str(msg))
def test_prompt_msg_listparam_invalid_fails(self):
"""
Test that the _prompt() method fails if given a list with an invalid parameter in it.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_listparam_invalid_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "Unexpected parameter 'foo'"
self.assertEquals(
self.prompt._prompt(self.response, [
{"say": "valid"},
{"foo": "bar"}
]),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "valid\n")
def test_prompt_msg_list_empty_fails(self):
"""
Test that the _prompt() method fails if given an empty list.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_list_empty_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "No message provided"
self.assertEquals(
self.prompt._prompt(self.response, []),
self.expected
)
def test_prompt_msg_list_succeeds(self):
"""
Test that the _prompt() method is successful if given multiple, simple strings.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_list_succeeds()
"""
msg = [
"alpha",
"bravo",
-7,
1.5
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % ("\n".join([str(m) for m in msg])))
def test_prompt_msg_listsay_succeeds(self):
"""
Test that the _prompt() method is successful if given multiple, simple strings.
.. versionadded:: 0.1.0
.. function:: test_prompt_msg_listsay_succeeds()
"""
msg = [
{"say": "a"},
{"say": "B"},
{"say": -20},
{"say": 5.5},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s\n" % ("\n".join([str(m['say']) for m in msg])))
def test_prompt_msg_saynewline_succeeds(self):
"""
Test that the _prompt() method is successful if given a string that has no trailing newline.
.. versionadded:: 0.3.0
.. function:: test_prompt_msg_saynewline_succeeds()
"""
msg = [
{"say": "Hello World", "newline": False},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "Hello World")
def test_prompt_msg_saynewlinemultiple_succeeds(self):
"""
Test that the _prompt() method is successful if given multiple strings that have no trailing newline.
.. versionadded:: 0.3.0
.. function:: test_prompt_msg_saynewlinemultiple_succeeds()
"""
msg = [
{"say": "Hello ", "newline": False},
{"say": "World", "newline": False},
{"say": ", How are we?", "newline": True},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "Hello World, How are we?\n")
def test_prompt_msg_align_invalid_fails(self):
"""
Test that the _prompt() method is fails if given an invalid align option.
.. versionadded:: 0.3.0
.. function:: test_prompt_msg_align_invalid_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "Align 'foobar' invalid. Expected 'left', 'center', or 'right'."
self.assertEquals(
self.prompt._prompt(self.response, {
"say": "Hello World",
"align": "foobar"
}),
self.expected
)
def test_prompt_param_align_left_valid(self):
"""
Test that the param() method is successful with a left alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_param_align_left_valid()
"""
with mock.patch('subprocess.check_output', return_value='10 50'):
msg = [
{"say": "Hello World", "align": "left"},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "Hello World\n")
def test_prompt_param_align_center_valid(self):
"""
Test that the param() method is successful with a center alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_param_align_center_valid()
"""
with mock.patch('subprocess.check_output', return_value='10 75'):
msg = [
{"say": "Hello World", "align": "center"},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s%s" % ("Hello World".center(74), "\n"))
def test_prompt_param_align_right_valid(self):
"""
Test that the param() method is successful with a right alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_param_align_right_valid()
"""
with mock.patch('subprocess.check_output', return_value='10 88'):
msg = [
{"say": "Hello World", "align": "right"},
]
self.assertTrue(isinstance(msg, list))
self.assertEquals(
self.prompt._prompt(self.response, msg),
self.expected
)
self.assertEquals(self.outstr.getvalue(), "%s%s" % ("Hello World".rjust(87), "\n"))
# run(tmp=None, task_vars=None)
def test_prompt_run_msg_missing_fails(self):
"""
Test that the run() method will fail if missing the msg parameter.
.. versionadded:: 0.1.0
.. function:: test_run_msg_missing_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "Required 'msg' parameter missing."
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
self.assertEquals(
prompt.run(),
self.expected
)
def test_prompt_run_multiple_params_fails(self):
"""
Test that the run() method will fail if there are too many parameters.
.. versionadded:: 0.1.0
.. function:: test_run_multiple_params_fails()
"""
self.expected['failed'] = True
self.expected['msg'] = "Expected single 'msg' parameter. Multiple parameters given."
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": "Hello World",
"foo": "bar"
}
self.assertEquals(
prompt.run(),
self.expected
)
def test_prompt_run_singlemessage_valid(self):
"""
Test that the run() method will succeed with a single message.
.. versionadded:: 0.1.0
.. function:: test_run_singlemessage_valid()
"""
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": "Hello World"
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World\n"
)
def test_prompt_run_multimessage_valid(self):
"""
Test that the run() method will succeed with a multiple messages.
.. versionadded:: 0.1.0
.. function:: test_run_multimessage_valid()
"""
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": [
"Hello World",
"Hi there"
]
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World\nHi there\n"
)
def test_prompt_run_saynewlinemultiple_valid(self):
"""
Test that the run() method will succeed with a multiple messages.
.. versionadded:: 0.3.0
.. function:: test_prompt_run_saynewlinemultiple_valid()
"""
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": [
{
"say": "Hello ",
"newline": False,
},
{
"say": "World",
"newline": True,
}
]
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World\n"
)
def test_prompt_run_align_left_valid(self):
"""
Test that the run() method is successful with a left alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_run_align_left_valid()
"""
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": [
{
"say": "Hello World",
"align": "left",
}
]
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World\n"
)
def test_prompt_run_align_center_valid(self):
"""
Test that the run() method is successful with a center alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_run_align_center_valid()
"""
with mock.patch('subprocess.check_output', return_value='10 88'):
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": [
{
"say": "Hello World",
"align": "center",
"newline": False,
}
]
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World".center(88)
)
def test_prompt_run_align_right_valid(self):
"""
Test that the run() method is successful with a right alignment.
.. versionadded:: 0.3.0
.. function:: test_prompt_run_align_right_valid()
"""
with mock.patch('subprocess.check_output', return_value='10 52'):
del(self.expected['changed'])
prompt = self._getPrompt()
prompt.setOutput(self.outstr)
prompt._task.args = {
"msg": [
{
"say": "Hello World",
"align": "right",
"newline": False,
}
]
}
self.assertEquals(
prompt.run(),
self.expected
)
self.assertEquals(
self.outstr.getvalue(),
"Hello World".rjust(52)
)
| 27.568586 | 118 | 0.564342 | 2,798 | 26,328 | 5.137598 | 0.10579 | 0.050087 | 0.068174 | 0.042783 | 0.781913 | 0.725774 | 0.687374 | 0.64967 | 0.577809 | 0.554852 | 0 | 0.011546 | 0.322356 | 26,328 | 954 | 119 | 27.597484 | 0.794182 | 0.307315 | 0 | 0.481818 | 0 | 0 | 0.086403 | 0.007002 | 0 | 0 | 0 | 0 | 0.190909 | 1 | 0.102273 | false | 0 | 0.018182 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b68cd9bab578328a565f2b6f75b5874b0fc24f5f | 511 | py | Python | opendata/tasks.py | OpenData-NC/open-data-nc | 5e1964d5c986f79055e6f018f0f2f9a15a3eaf99 | [
"MIT"
] | 5 | 2015-10-31T02:01:24.000Z | 2021-03-02T12:34:59.000Z | opendata/tasks.py | OpenData-NC/open-data-nc | 5e1964d5c986f79055e6f018f0f2f9a15a3eaf99 | [
"MIT"
] | null | null | null | opendata/tasks.py | OpenData-NC/open-data-nc | 5e1964d5c986f79055e6f018f0f2f9a15a3eaf99 | [
"MIT"
] | 4 | 2016-04-05T06:07:15.000Z | 2020-10-18T01:17:17.000Z | from celery import task
from django.core.mail import send_mail
@task
def send_email(subject, message, from_email, recipient_list):
"""Send email async using a celery worker
args: Take sames args as django send_mail function.
"""
send_mail(subject, message, from_email, recipient_list)
@task
def update_object(index, instance, using=None):
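# Asynchronously refresh a single object in the given search index backend.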
index.update_object(instance, using=using)
@task
def remove_object(index, instance, using=None):
index.remove_object(instance, using=using) | 26.894737 | 61 | 0.753425 | 73 | 511 | 5.109589 | 0.39726 | 0.13941 | 0.096515 | 0.123324 | 0.369973 | 0.369973 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156556 | 511 | 19 | 62 | 26.894737 | 0.865429 | 0.178082 | 0 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.181818 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b690e2d5eb8179f10745b17c20d29455f2af53df | 3,316 | py | Python | store/adminshop/views/__init__.py | vallemrv/my_store_test | 2da624fd02c5f1784464f15b751b488f3dd2bae6 | [
"Apache-2.0"
] | null | null | null | store/adminshop/views/__init__.py | vallemrv/my_store_test | 2da624fd02c5f1784464f15b751b488f3dd2bae6 | [
"Apache-2.0"
] | null | null | null | store/adminshop/views/__init__.py | vallemrv/my_store_test | 2da624fd02c5f1784464f15b751b488f3dd2bae6 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# @Author: Manuel Rodriguez <valle>
# @Date: 28-Sep-2017
# @Email: valle.mrv@gmail.com
# @Last modified by: valle
# @Last modified time: 31-Jan-2018
# @License: Apache license version 2.0
from django.shortcuts import render, redirect
from django.contrib.auth.decorators import login_required, permission_required
from django.template import RequestContext
from django.contrib.auth import login as login_auth, logout as logout_auth, authenticate
from adminshop.forms import LoginForm, ConfigSiteForm
from adminshop.models import ConfigSite
from .categorias import *
from .clientes import *
from .categorizacion import *
from .productos import *
from .usuarios import *
from .compras import *
from .taller import *
from .ventas import *
from .abonos import *
from .presupuesto import *
from .sign import *
from .proveedores import *
from .garantias import *
from .balances import *
from .documentos import *
# Create your views here.
@login_required(login_url='login_tk')
def tienda(request):
return render(request, "menus/tienda.html")
@login_required(login_url='login_tk')
def taller(request):
return render(request, "menus/taller.html")
@login_required(login_url='login_tk')
def gestion(request):
return render(request, "menus/gestion.html")
@login_required(login_url='login_tk')
def informes(request):
return render(request, "menus/informes.html")
@login_required(login_url='login_tk')
def almacen(request):
return render(request, "menus/almacen.html")
@login_required(login_url='login_tk')
def configuracion(request):
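# GET: render the configuration form, creating a default ConfigSite record if none exists yet;
# POST: validate and save the submitted form, then redirect back to the management menu.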
if not request.method == "POST":
f_categoria = CategoriasForm()
try:
config = ConfigSite.objects.get(pk=1)
f_config = ConfigSiteForm(instance=config)
except:
config = ConfigSite(codigo_compra="4000")
config.save()
f_config = ConfigSiteForm(instance=config)
return render(request, "gestion/configuracion.html",
{"form": f_config,
"mensaje": "Editar configuracion"})
else:
config = ConfigSite.objects.get(pk=1)
f_config = ConfigSiteForm(request.POST, request.FILES, instance=config)
if f_config.is_valid():
f_config.save()
else:
print(f_config.errors)
return redirect("gestion")
def login(request):
if not request.method == "POST":
#if request.user.is_authenticated:
# return redirect("/gestion")
form_log = LoginForm()
return render(request, "gestion/login/login.html", {"form": form_log})
else:
form_log = LoginForm(request.POST)
if form_log.is_valid():
user = authenticate(username=form_log.cleaned_data['username'],
password=form_log.cleaned_data['password'])
if user != None:
login_auth(request, user)
return redirect("tienda")
else:
return render(request, "gestion/login/login.html",
{"form": form_log, "mensaje": "El usuario o la contraseña no son validos"})
def logout(request):
logout_auth(request)
return redirect("tienda")
def en_construccion(request):
return render(request, "gestion/404.html")
| 31.580952 | 105 | 0.666466 | 391 | 3,316 | 5.534527 | 0.327366 | 0.064695 | 0.07902 | 0.058226 | 0.309612 | 0.215342 | 0.18854 | 0.174214 | 0.093346 | 0.047135 | 0 | 0.009317 | 0.22316 | 3,316 | 104 | 106 | 31.884615 | 0.830745 | 0.087153 | 0 | 0.230769 | 0 | 0 | 0.119695 | 0.024536 | 0 | 0 | 0 | 0 | 0 | 1 | 0.115385 | false | 0.012821 | 0.269231 | 0.076923 | 0.538462 | 0.012821 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b69231e529d3179611550965950d64e9a25ebb98 | 378 | py | Python | dmriprep/workflows/dwi/conversions/nii_to_mif/edges.py | GalBenZvi/dmriprep | 7c4c0e1b01e3d941a7fafcbb6b001605cb1f2b0b | [
"Apache-2.0"
] | null | null | null | dmriprep/workflows/dwi/conversions/nii_to_mif/edges.py | GalBenZvi/dmriprep | 7c4c0e1b01e3d941a7fafcbb6b001605cb1f2b0b | [
"Apache-2.0"
] | 1 | 2022-03-22T13:22:18.000Z | 2022-03-22T13:22:18.000Z | dmriprep/workflows/dwi/conversions/nii_to_mif/edges.py | GalBenZvi/dmriprep | 7c4c0e1b01e3d941a7fafcbb6b001605cb1f2b0b | [
"Apache-2.0"
] | null | null | null | INPUT_TO_DWI_CONVERSION_EDGES = [("dwi_file", "in_file")]
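# Each edge below is a (source output field, destination input field) pair,
# presumably used when connecting nodes of the NIfTI-to-MIF conversion workflow.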
INPUT_TO_FMAP_CONVERSION_EDGES = [("fmap_file", "in_file")]
LOCATE_ASSOCIATED_TO_COVERSION_EDGES = [
("json_file", "json_import"),
("bvec_file", "in_bvec"),
("bval_file", "in_bval"),
]
DWI_CONVERSION_TO_OUTPUT_EDGES = [("out_file", "dwi_file")]
FMAP_CONVERSION_TO_OUTPUT_EDGES = [("out_file", "fmap_file")]
| 34.363636 | 61 | 0.719577 | 53 | 378 | 4.490566 | 0.320755 | 0.10084 | 0.084034 | 0.193277 | 0.252101 | 0.252101 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103175 | 378 | 10 | 62 | 37.8 | 0.702065 | 0 | 0 | 0 | 0 | 0 | 0.306878 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6971975b9eaf770ff217af0628bef84c7d15d9a | 611 | py | Python | modelconfig.py | divyanshrm/Polyth-Net-Classification-of-Polythene-Bags-Using-Deep-Dearning | f52c0887cb12cf1322a37d1042917be5d679c725 | [
"MIT"
] | null | null | null | modelconfig.py | divyanshrm/Polyth-Net-Classification-of-Polythene-Bags-Using-Deep-Dearning | f52c0887cb12cf1322a37d1042917be5d679c725 | [
"MIT"
] | null | null | null | modelconfig.py | divyanshrm/Polyth-Net-Classification-of-Polythene-Bags-Using-Deep-Dearning | f52c0887cb12cf1322a37d1042917be5d679c725 | [
"MIT"
] | null | null | null | from tensorflow.keras.applications.xception import Xception
import tensorflow.keras as k
def modelconfig(dropout_rate):
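# Xception backbone (no pretrained weights, no top) followed by global max pooling,
# two dropout-regularized dense layers, and a 3-class softmax head.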
model=k.models.Sequential()
model_efficient=Xception(include_top=False,input_shape=(224,224,3),weights=None)
model.add(k.layers.InputLayer((224,224,3)))
model.add(model_efficient)
model.add(k.layers.GlobalMaxPool2D())
model.add(k.layers.Dense(1024,activation='relu'))
model.add(k.layers.Dropout(dropout_rate))
model.add(k.layers.Dense(256,activation='relu'))
model.add(k.layers.Dropout(dropout_rate))
model.add(k.layers.Dense(3,activation='softmax'))
return model | 38.1875 | 82 | 0.769231 | 90 | 611 | 5.144444 | 0.4 | 0.138229 | 0.136069 | 0.226782 | 0.332613 | 0.289417 | 0.289417 | 0.289417 | 0.289417 | 0.289417 | 0 | 0.040925 | 0.080196 | 611 | 16 | 83 | 38.1875 | 0.782918 | 0 | 0 | 0.142857 | 0 | 0 | 0.025126 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b6991871b344a5cf774115d7f86c5484afbe76ab | 1,064 | py | Python | examples/data/gen_olf_input.py | neurokernel/sensory_int | 1e9f690a3e09812f441313433252eecdfc33f78c | [
"BSD-3-Clause"
] | null | null | null | examples/data/gen_olf_input.py | neurokernel/sensory_int | 1e9f690a3e09812f441313433252eecdfc33f78c | [
"BSD-3-Clause"
] | null | null | null | examples/data/gen_olf_input.py | neurokernel/sensory_int | 1e9f690a3e09812f441313433252eecdfc33f78c | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
"""
Generate sample olfactory model stimulus.
"""
import numpy as np
import h5py
osn_num = 1375
dt = 1e-4 # time step
Ot = 2000 # number of data points during reset period
Rt = 1000 # number of data points during odor delivery period
#Nt = 4*Ot + 3*Rt # number of data points in time
#Nt = 10000
#t = np.arange(0, dt*Nt, dt)
I = 0.5195 # amplitude of odorant concentration
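# Build the stimulus as alternating odor-off (zeros) and odor-on (amplitude I) segments;
# segment lengths are given in time steps of dt.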
u_1 = np.zeros(500, np.float64)
u_2 = I*np.ones(5000, np.float64)
u_3 = np.zeros(4500,np.float64)
u_4 = I*np.ones(1000,np.float64)
u_5 = np.zeros(1000, np.float64)
u_6 = I*np.ones(1500, np.float64)
u_7 = np.zeros(500,np.float64)
#u_on = I*np.ones(Ot, dtype=np.float64)
#u_off = np.zeros(Ot, dtype=np.float64)
#u_reset = np.zeros(Rt, dtype=np.float64)
u = np.concatenate((u_1,u_2,u_3,u_4,u_5,u_6,u_7))
Nt = u.size
#print Nt
u_all = np.transpose(np.kron(np.ones((osn_num, 1)), u))
with h5py.File('olfactory_input.h5', 'w') as f:
f.create_dataset('array', (Nt, osn_num),
dtype=np.float64,
data=u_all)
| 26.6 | 60 | 0.652256 | 202 | 1,064 | 3.316832 | 0.391089 | 0.147761 | 0.149254 | 0.067164 | 0.179104 | 0.059701 | 0 | 0 | 0 | 0 | 0 | 0.108645 | 0.195489 | 1,064 | 39 | 61 | 27.282051 | 0.674065 | 0.386278 | 0 | 0 | 1 | 0 | 0.037915 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.095238 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b69c4b997fca33a3fb149b9f7a489a2143e12900 | 608 | py | Python | airnow/api.py | briandconnelly/airnow-py | ebe1b0b18b77038aa3473c5814a34b5eeb908bb6 | [
"MIT"
] | 1 | 2020-09-22T15:23:59.000Z | 2020-09-22T15:23:59.000Z | airnow/api.py | briandconnelly/airnow-py | ebe1b0b18b77038aa3473c5814a34b5eeb908bb6 | [
"MIT"
] | 6 | 2020-09-21T16:01:20.000Z | 2020-09-25T01:01:26.000Z | airnow/api.py | briandconnelly/airnow-py | ebe1b0b18b77038aa3473c5814a34b5eeb908bb6 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import requests
def get_airnow_data(endpoint: str, **kwargs) -> str:
"""Query the AirNow API and return the contents
:param str endpoint: AirNow API endpoint (e.g., "/aq/observation/zipCode/current")
Additional named arguments are passed on as query parameters.
All queries should at least have `API_KEY` set.
See the AirNow API documentation for parameter lists and descriptions.
"""
with requests.Session() as s:
result = s.get(url=f"http://www.airnowapi.org{endpoint}", params=kwargs)
return result.content.decode("UTF-8")
| 32 | 87 | 0.694079 | 84 | 608 | 4.988095 | 0.72619 | 0.064439 | 0.057279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004073 | 0.192434 | 608 | 18 | 88 | 33.777778 | 0.849287 | 0.547697 | 0 | 0 | 0 | 0 | 0.157258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b69fbe77289f7235767b0bd63da00b1ed896dee8 | 459 | py | Python | CursoEmVideo-Python3-Mundo1/desafio035.py | martinsnathalia/Python | 0cabf894dc9c0d0307bd1136831e011580b497c5 | [
"MIT"
] | null | null | null | CursoEmVideo-Python3-Mundo1/desafio035.py | martinsnathalia/Python | 0cabf894dc9c0d0307bd1136831e011580b497c5 | [
"MIT"
] | null | null | null | CursoEmVideo-Python3-Mundo1/desafio035.py | martinsnathalia/Python | 0cabf894dc9c0d0307bd1136831e011580b497c5 | [
"MIT"
] | null | null | null | # Desenvolva um programa que leia o comprimento de três retas e diga ao usuário se elas podem ou não formar um triângulo.
print('Suas retas formam um triângulo?')
r1 = float(input('Digite a primeira reta: '))
r2 = float(input('Digite a segunda reta: '))
r3 = float(input('Digite a terceira reta: '))
if r1 < (r2+r3) and r2 < (r1 + r3) and r3 < (r2 + r1):
print('Essas retas formam um triângulo!')
else:
print('Essas retas NÃO formam um triângulo!')
| 38.25 | 121 | 0.690632 | 75 | 459 | 4.226667 | 0.52 | 0.138801 | 0.160883 | 0.160883 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0.189542 | 459 | 11 | 122 | 41.727273 | 0.819892 | 0.259259 | 0 | 0 | 0 | 0 | 0.502959 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |