hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b42110e69fbba6f3cc1175f605afe65f09844634 | 5,211 | py | Python | validation_tests/analytical_exact/river_at_rest_varying_topo_width/numerical_varying_width.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 136 | 2015-05-07T05:47:43.000Z | 2022-02-16T03:07:40.000Z | validation_tests/analytical_exact/river_at_rest_varying_topo_width/numerical_varying_width.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 184 | 2015-05-03T09:27:54.000Z | 2021-12-20T04:22:48.000Z | validation_tests/analytical_exact/river_at_rest_varying_topo_width/numerical_varying_width.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 70 | 2015-03-18T07:35:22.000Z | 2021-11-01T07:07:29.000Z | """Simple water flow example using ANUGA
Water driven up a linear slope and time varying boundary,
similar to a beach environment
"""
#------------------------------------------------------------------------------
# Import necessary modules
#------------------------------------------------------------------------------
import sys
import anuga
from anuga import myid, finalize, distribute
from anuga import Domain as Domain
from math import cos
from numpy import zeros, ones, array, interp, polyval, ones_like, zeros_like
from numpy import where, logical_and
from time import localtime, strftime, gmtime
from scipy.interpolate import interp1d
from anuga.geometry.polygon import inside_polygon, is_inside_triangle
#from balanced_dev import *
#-------------------------------------------------------------------------------
# Copy scripts to time stamped output directory and capture screen
# output to file
#-------------------------------------------------------------------------------
time = strftime('%Y%m%d_%H%M%S',localtime())
#output_dir = 'varying_width'+time
output_dir = '.'
output_file = 'varying_width'
#anuga.copy_code_files(output_dir,__file__)
#start_screen_catcher(output_dir+'_')
args = anuga.get_args()
alg = args.alg
verbose = args.verbose
#------------------------------------------------------------------------------
# Setup domain
#------------------------------------------------------------------------------
dx = 1.
dy = dx
L = 1500.
W = 60.
#===============================================================================
# Create sequential domain
#===============================================================================
if myid == 0:
    # structured mesh
    points, vertices, boundary = anuga.rectangular_cross(int(L/dx), int(W/dy), L, W, (0., -W/2.))

    # domain = anuga.Domain(points, vertices, boundary)
    domain = Domain(points, vertices, boundary)

    domain.set_name(output_file)
    domain.set_datadir(output_dir)

    #------------------------------------------------------------------------------
    # Setup Algorithm, either using command line arguments
    # or override manually yourself
    #------------------------------------------------------------------------------
    domain.set_flow_algorithm(alg)

    #------------------------------------------------------------------------------
    # Setup initial conditions
    #------------------------------------------------------------------------------
    domain.set_quantity('friction', 0.0)
    domain.set_quantity('stage', 12.0)

    XX = array([0., 50., 100., 150., 250., 300., 350., 400., 425., 435., 450., 470., 475., 500.,
                505., 530., 550., 565., 575., 600., 650., 700., 750., 800., 820., 900., 950.,
                1000., 1500.])
    ZZ = array([0., 0., 2.5, 5., 5., 3., 5., 5., 7.5, 8., 9., 9., 9., 9.1, 9., 9., 6., 5.5, 5.5, 5.,
                4., 3., 3., 2.3, 2., 1.2, 0.4, 0., 0.])
    WW = array([40., 40., 30., 30., 30., 30., 25., 25., 30., 35., 35., 40., 40., 40., 45., 45., 50.,
                45., 40., 40., 30., 40., 40., 5., 40., 35., 25., 40., 40.])/2.

    depth = interp1d(XX, ZZ)
    width = interp1d(XX, WW)

    def bed_elevation(x, y):
        z = 25.0*ones_like(x)
        wid = width(x)
        dep = depth(x)
        z = where(logical_and(y < wid, y > -wid), dep, z)
        return z

    domain.set_quantity('elevation', bed_elevation)
else:
    domain = None
#===========================================================================
# Create Parallel domain
#===========================================================================
domain = distribute(domain)
#-----------------------------------------------------------------------------
# Setup boundary conditions
#------------------------------------------------------------------------------
from math import sin, pi, exp
Br = anuga.Reflective_boundary(domain) # Solid reflective wall
#Bt = anuga.Transmissive_boundary(domain) # Continue all values on boundary
#Bd = anuga.Dirichlet_boundary([1,0.,0.]) # Constant boundary values
# Associate boundary tags with boundary objects
domain.set_boundary({'left': Br, 'right': Br, 'top': Br, 'bottom': Br})
#------------------------------------------------------------------------------
# Produce a documentation of parameters
#------------------------------------------------------------------------------
if myid == 0:
    parameter_file = open('parameters.tex', 'w')
    parameter_file.write('\\begin{verbatim}\n')
    from pprint import pprint
    pprint(domain.get_algorithm_parameters(), parameter_file, indent=4)
    parameter_file.write('\\end{verbatim}\n')
    parameter_file.close()
#------------------------------------------------------------------------------
# Evolve system through time
#------------------------------------------------------------------------------
import time
t0 = time.time()
for t in domain.evolve(yieldstep=0.1, finaltime=5.0):
    # print(domain.timestepping_statistics(track_speeds=True))
    if myid == 0 and verbose: print(domain.timestepping_statistics())
    # vis.update()

if myid == 0 and verbose: print('That took %s sec' % str(time.time() - t0))
domain.sww_merge(delete_old=True)
finalize()
| 36.440559 | 96 | 0.459797 | 538 | 5,211 | 4.351301 | 0.431227 | 0.026912 | 0.011961 | 0.023921 | 0.047843 | 0.018795 | 0 | 0 | 0 | 0 | 0 | 0.047788 | 0.136634 | 5,211 | 142 | 97 | 36.697183 | 0.472549 | 0.485895 | 0 | 0.03125 | 0 | 0 | 0.051048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015625 | false | 0 | 0.203125 | 0 | 0.234375 | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
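The `bed_elevation` helper in the record above builds the channel bed by linearly interpolating height (`ZZ`) and half-width (`WW`) tables over distance (`XX`), then masking everything outside the channel to a 25 m wall. The same idea can be exercised standalone with plain numpy (`np.interp` in place of `scipy.interpolate.interp1d`); the small breakpoint tables below are made up for illustration, not the ones from the file:

```python
import numpy as np

# Hypothetical breakpoints: distance along channel (m), bed height (m), half-width (m)
XX = np.array([0., 100., 200., 300.])
ZZ = np.array([0., 2.5, 5.0, 3.0])
WW = np.array([20., 15., 15., 20.])

def bed_elevation(x, y):
    """Interpolated bed height inside the channel; 25 m walls outside."""
    z = 25.0 * np.ones_like(x)      # default: high ground (channel wall)
    wid = np.interp(x, XX, WW)      # channel half-width at each x
    dep = np.interp(x, XX, ZZ)      # bed height at each x
    return np.where((y < wid) & (y > -wid), dep, z)

x = np.array([50., 150., 250.])
y = np.array([0., 0., 30.])
print(bed_elevation(x, y))
```

The third point sits outside the interpolated half-width, so it lands on the 25 m wall rather than the interpolated bed.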
b426f99a8bac6c3327cab3da97ce79ef51269da3 | 1,068 | py | Python | commands/calc.py | periodicaidan/dalton-cli | 6a83e1a2675e335bf807c43c4201d78e5b389837 | [
"MIT"
] | 2 | 2018-12-21T19:09:49.000Z | 2018-12-22T10:41:36.000Z | commands/calc.py | periodicaidan/dalton-cli | 6a83e1a2675e335bf807c43c4201d78e5b389837 | [
"MIT"
] | null | null | null | commands/calc.py | periodicaidan/dalton-cli | 6a83e1a2675e335bf807c43c4201d78e5b389837 | [
"MIT"
] | null | null | null | """
File: commands/calc.py
Purpose: Performs calculations in response to user input, and outputs the result
"""
from sys import argv
import click
from calculator import *
from models import History
from models.Config import Config
from help_menus import calc_help
@click.group("calc", invoke_without_command=True)
@click.option("-M", "--mass-spec",
              is_flag=True, default=False,
              help="Get a theoretical mass spectrum of a molecule")
@click.option("-i", "--histogram",
              is_flag=True, default=False,
              help="Use with -M/--mass-spec to display the mass spec as a histogram")
@click.argument("formula", required=False)
def calc(mass_spec, histogram, formula):
    config = Config.setup()  # todo: Pass as context

    if not any(locals().items()) or len(argv) == 2:
        calc_help()
    else:
        if mass_spec:
            click.echo(get_mass_spec(formula, histogram))
        else:
            mass = History.get(formula)["mass"] or get_mass(formula)
            click.echo("%.3f %s" % (mass, config.units))
| 31.411765 | 85 | 0.652622 | 146 | 1,068 | 4.691781 | 0.5 | 0.070073 | 0.026277 | 0.049635 | 0.075912 | 0.075912 | 0 | 0 | 0 | 0 | 0 | 0.002413 | 0.223783 | 1,068 | 33 | 86 | 32.363636 | 0.823884 | 0.117978 | 0 | 0.166667 | 0 | 0 | 0.167024 | 0 | 0 | 0 | 0 | 0.030303 | 0 | 1 | 0.041667 | false | 0 | 0.25 | 0 | 0.291667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
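The `get_mass` call in `calc.py` above comes from the repo's own `calculator` package, which is not part of this record. As a rough, hypothetical illustration of what such a helper computes, here is a toy formula-mass function for flat formulas like `H2O` (truncated atomic-weight table, no nested groups or isotopes):

```python
import re

# Hypothetical, truncated atomic-weight table (g/mol); the real calculator
# package would carry a complete one.
ATOMIC_WEIGHTS = {"H": 1.008, "C": 12.011, "N": 14.007, "O": 15.999}

def get_mass(formula):
    """Sum atomic weights for a flat formula like 'H2O' or 'C6H12O6'."""
    total = 0.0
    for symbol, count in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        total += ATOMIC_WEIGHTS[symbol] * (int(count) if count else 1)
    return total

print("%.3f" % get_mass("H2O"))
```

The real command additionally consults a `History` cache before computing, as the source shows.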
b4372d11f9380b54abe868161855c4d8eb68fe8d | 3,301 | py | Python | peter_lists/blog/views.py | pvize1/peter_lists | 77e9f30cfc45f500e059b7b163db541335180332 | [
"MIT"
] | null | null | null | peter_lists/blog/views.py | pvize1/peter_lists | 77e9f30cfc45f500e059b7b163db541335180332 | [
"MIT"
] | 8 | 2021-05-12T05:53:42.000Z | 2022-03-31T04:08:18.000Z | peter_lists/blog/views.py | pvize1/peter_lists | 77e9f30cfc45f500e059b7b163db541335180332 | [
"MIT"
] | null | null | null | from django.contrib.auth.mixins import LoginRequiredMixin
from django.contrib.auth.mixins import PermissionRequiredMixin
from django.views.generic import (
    ListView,
    DetailView,
    CreateView,
    UpdateView,
    DeleteView,
)
from django.shortcuts import render
from django.db.models import Count
from django.db.models.functions import Trim, Lower
from django.urls import reverse_lazy
from .models import Blog
from .forms import EditBlogForm
def tag_count(blog_user, topn=0):
    # TODO Move to model manager
    raw_tags = (
        Blog.blog.filter(user=blog_user)
        .order_by("tag")
        .values("tag")
        .annotate(count=Count("tag"), tag_new=Trim(Lower("tag")))
    )

    count_tags = dict()
    # TODO Split by tags with "," and those without
    for record in raw_tags:
        for tag in record["tag_new"].split(","):
            k = tag.strip()
            if len(k) > 0:
                count_tags[k] = count_tags.get(k, 0) + record["count"]

    # TODO Sort by value (desc) and then key (ascend) for common values
    if topn == 0:
        return {
            k: count_tags[k]
            for k in sorted(count_tags, key=count_tags.get, reverse=True)
        }
    else:
        return {
            k: count_tags[k]
            for k in sorted(count_tags, key=count_tags.get, reverse=True)[:topn]
        }
# Create your views here.
def BlogHome(request):
    blog_all = Blog.blog.filter(user=request.user)
    blogs = blog_all.order_by("-modified")[:3]
    blog_count = blog_all.count()
    tag_sorted = tag_count(request.user, topn=5)
    return render(
        request,
        "blog/blog_home.html",
        {"blogs": blogs, "tags": tag_sorted, "blog_count": blog_count},
    )
class BlogListView(PermissionRequiredMixin, ListView):
    model = Blog
    paginate_by = 3
    template_name = "blog/blog_list.html"
    permission_required = "blog.view_blog"

    def get_queryset(self):
        return Blog.blog.filter(user=self.request.user)


def BlogAllTagsView(request):
    # TODO turn into ListView with paginate
    tag_sorted = tag_count(request.user)
    return render(request, "blog/blog_tags.html", {"tags": tag_sorted})


class BlogTagListView(PermissionRequiredMixin, ListView):
    model = Blog
    paginate_by = 3
    template_name = "blog/blog_list.html"
    permission_required = "blog.view_blog"

    def get_queryset(self):
        return Blog.blog.filter(tag__contains=self.kwargs["tag_name"], user=self.request.user)


class BlogDetailView(PermissionRequiredMixin, DetailView):
    model = Blog
    template_name = "blog/blog_detail.html"
    permission_required = "blog.view_blog"


class BlogCreateView(PermissionRequiredMixin, LoginRequiredMixin, CreateView):
    form_class = EditBlogForm
    model = Blog
    action = "Add"
    template_name = "blog/blog_form.html"
    permission_required = "blog.add_blog"


class BlogUpdateView(PermissionRequiredMixin, LoginRequiredMixin, UpdateView):
    form_class = EditBlogForm
    model = Blog
    action = "Edit"
    template_name = "blog/blog_form.html"
    permission_required = "blog.change_blog"


class BlogDeleteView(PermissionRequiredMixin, LoginRequiredMixin, DeleteView):
    model = Blog
    success_url = reverse_lazy("blog:list")
    permission_required = "blog.delete_blog"
| 29.212389 | 94 | 0.684035 | 409 | 3,301 | 5.359413 | 0.278729 | 0.040146 | 0.060219 | 0.04562 | 0.366788 | 0.342153 | 0.238139 | 0.238139 | 0.238139 | 0.192518 | 0 | 0.003079 | 0.212966 | 3,301 | 112 | 95 | 29.473214 | 0.840647 | 0.060588 | 0 | 0.270588 | 0 | 0 | 0.0979 | 0.006785 | 0 | 0 | 0 | 0.008929 | 0 | 1 | 0.058824 | false | 0 | 0.105882 | 0.023529 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
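The `tag_count` helper in the record above flattens comma-separated tag strings (already trimmed and lowercased by the queryset's `Trim(Lower(...))` annotation) and accumulates per-tag totals, optionally keeping only the top N. The core aggregation can be sketched without Django, assuming each record is a `(tag_string, count)` pair:

```python
def count_tags(records, topn=0):
    """records: iterable of (comma-separated tag string, occurrence count) pairs."""
    counts = {}
    for tag_string, n in records:
        for tag in tag_string.split(","):
            k = tag.strip().lower()   # mimics the Trim(Lower(...)) annotation
            if k:
                counts[k] = counts.get(k, 0) + n
    ordered = sorted(counts, key=counts.get, reverse=True)  # by count, descending
    if topn:
        ordered = ordered[:topn]
    return {k: counts[k] for k in ordered}

print(count_tags([("python, django", 2), ("Python", 1), ("lists,", 3)]))
```

Empty fragments produced by trailing commas are skipped, and tags differing only in case merge into one count.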
b4378b3e91302a7b53287f43ef0ed313d4ff8c2f | 1,992 | py | Python | tests/test_pythonpath.py | browniebroke/pytest-srcpaths | c0bf4a9b521c8f7af029f9923b344936cf425bf1 | [
"MIT"
] | 26 | 2021-02-18T20:49:41.000Z | 2022-02-08T21:06:20.000Z | tests/test_pythonpath.py | browniebroke/pytest-srcpaths | c0bf4a9b521c8f7af029f9923b344936cf425bf1 | [
"MIT"
] | null | null | null | tests/test_pythonpath.py | browniebroke/pytest-srcpaths | c0bf4a9b521c8f7af029f9923b344936cf425bf1 | [
"MIT"
] | 2 | 2021-04-04T01:45:37.000Z | 2022-02-07T11:28:51.000Z | import sys
from typing import Generator
from typing import List
from typing import Optional
import pytest
from _pytest.pytester import Pytester
def test_one_dir_pythonpath(pytester: Pytester, file_structure) -> None:
    pytester.makefile(".ini", pytest="[pytest]\npythonpath=sub\n")
    result = pytester.runpytest("test_foo.py")
    assert result.ret == 0
    result.assert_outcomes(passed=1)


def test_two_dirs_pythonpath(pytester: Pytester, file_structure) -> None:
    pytester.makefile(".ini", pytest="[pytest]\npythonpath=sub sub2\n")
    result = pytester.runpytest("test_foo.py", "test_bar.py")
    assert result.ret == 0
    result.assert_outcomes(passed=2)


def test_unconfigure_unadded_dir_pythonpath(pytester: Pytester) -> None:
    pytester.makeconftest(
        """
        def pytest_configure(config):
            config.addinivalue_line("pythonpath", "sub")
        """
    )
    pytester.makepyfile(
        """
        import sys

        def test_something():
            pass
        """
    )
    result = pytester.runpytest()
    result.assert_outcomes(passed=1)


def test_clean_up_pythonpath(pytester: Pytester) -> None:
    """Test that the srcpaths plugin cleans up after itself."""
    pytester.makefile(".ini", pytest="[pytest]\npythonpath=I_SHALL_BE_REMOVED\n")
    pytester.makepyfile(test_foo="""def test_foo(): pass""")

    before: Optional[List[str]] = None
    after: Optional[List[str]] = None

    class Plugin:
        @pytest.hookimpl(hookwrapper=True, tryfirst=True)
        def pytest_unconfigure(self) -> Generator[None, None, None]:
            nonlocal before, after
            before = sys.path.copy()
            yield
            after = sys.path.copy()

    result = pytester.runpytest_inprocess(plugins=[Plugin()])
    assert result.ret == 0
    assert before is not None
    assert after is not None
    assert any("I_SHALL_BE_REMOVED" in entry for entry in before)
    assert not any("I_SHALL_BE_REMOVED" in entry for entry in after)
| 30.181818 | 81 | 0.676205 | 246 | 1,992 | 5.321138 | 0.325203 | 0.032086 | 0.07945 | 0.057296 | 0.36822 | 0.36822 | 0.336134 | 0.255157 | 0.255157 | 0.18793 | 0 | 0.004456 | 0.211345 | 1,992 | 65 | 82 | 30.646154 | 0.828771 | 0.026606 | 0 | 0.121951 | 0 | 0 | 0.114171 | 0.052209 | 0 | 0 | 0 | 0 | 0.243902 | 1 | 0.121951 | false | 0.097561 | 0.146341 | 0 | 0.292683 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
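The `test_clean_up_pythonpath` case above snapshots `sys.path` before and after the plugin's unconfigure phase and asserts the added entry is gone. The snapshot-around-a-phase pattern its hookwrapper uses can be sketched in plain Python with a context manager (the names here are illustrative, not part of the plugin's API):

```python
import contextlib
import sys

added = []

@contextlib.contextmanager
def watch_sys_path(report):
    """Record which sys.path entries appear while the wrapped block runs."""
    before = sys.path.copy()    # snapshot before (pre-yield, like the hookwrapper)
    yield
    after = sys.path.copy()     # snapshot after (post-yield)
    report.extend(p for p in after if p not in before)

with watch_sys_path(added):
    sys.path.append("I_SHALL_BE_REMOVED")

sys.path.remove("I_SHALL_BE_REMOVED")  # clean up, as the plugin is expected to
print(added)
```

The `yield` splits the manager into the two snapshot points, exactly as `yield` does in the pytest hookwrapper.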
b439967634fbd815c14f34a574722d653f74e466 | 367 | py | Python | distributed_social_network/posts/migrations/0003_auto_20190308_2055.py | leevtori/CMPUT404-project | 52214288855ae4b3f05b8d17e67a2686debffb19 | [
"Apache-2.0"
] | null | null | null | distributed_social_network/posts/migrations/0003_auto_20190308_2055.py | leevtori/CMPUT404-project | 52214288855ae4b3f05b8d17e67a2686debffb19 | [
"Apache-2.0"
] | 51 | 2019-03-22T00:31:06.000Z | 2021-06-10T21:17:30.000Z | distributed_social_network/posts/migrations/0003_auto_20190308_2055.py | leevtori/CMPUT404-project | 52214288855ae4b3f05b8d17e67a2686debffb19 | [
"Apache-2.0"
] | 1 | 2019-02-08T01:33:57.000Z | 2019-02-08T01:33:57.000Z | # Generated by Django 2.1.7 on 2019-03-08 20:55
from django.db import migrations
class Migration(migrations.Migration):

    dependencies = [
        ('posts', '0002_auto_20190221_0234'),
    ]

    operations = [
        migrations.RenameField(
            model_name='post',
            old_name='visiblilty',
            new_name='visibility',
        ),
    ]
| 19.315789 | 47 | 0.588556 | 39 | 367 | 5.384615 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120623 | 0.299728 | 367 | 18 | 48 | 20.388889 | 0.696498 | 0.122616 | 0 | 0 | 1 | 0 | 0.1625 | 0.071875 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b43e6c43008ba217cff97642ff4168d07bf643bc | 23,644 | py | Python | policy.py | nyu-dl/dl4mt-simul-trans | 392ff3148e944be6fbc475d5285441807902e2e0 | [
"BSD-3-Clause"
] | 34 | 2016-12-01T07:59:43.000Z | 2021-09-13T10:46:15.000Z | policy.py | yifanjun233/dl4mt-simul-trans | 392ff3148e944be6fbc475d5285441807902e2e0 | [
"BSD-3-Clause"
] | 1 | 2020-09-14T08:35:00.000Z | 2020-09-14T08:35:00.000Z | policy.py | yifanjun233/dl4mt-simul-trans | 392ff3148e944be6fbc475d5285441807902e2e0 | [
"BSD-3-Clause"
] | 18 | 2016-12-15T01:43:33.000Z | 2021-09-29T07:24:08.000Z | """
-- Policy Network for decision making [more general]
"""
from nmt_uni import *
from layers import _p
import os
import time, datetime
import cPickle as pkl
# hyper params
TINY = 1e-7
PI = numpy.pi
E = numpy.e
A = 0.2
B = 1
class Controller(object):
    def __init__(self, trng,
                 options,
                 n_in=None, n_out=None,
                 recurrent=False, id=None):

        self.WORK = options['workspace']

        self.trng = trng
        self.options = options
        self.recurrent = recurrent
        self.type = options.get('type', 'categorical')

        self.n_hidden = 128
        self.n_in = n_in
        self.n_out = n_out

        if self.options.get('layernorm', True):
            self.rec = 'lngru'
        else:
            self.rec = 'gru'

        if not n_in:
            self.n_in = options['readout_dim']

        if not n_out:
            if self.type == 'categorical':
                self.n_out = 2  # initially it is a WAIT/COMMIT action.
            elif self.type == 'gaussian':
                self.n_out = 100
            else:
                raise NotImplementedError

        # build the policy network
        print 'parameter initialization'

        params = OrderedDict()

        if not self.recurrent:
            print 'building a feedforward controller'
            params = get_layer('ff')[0](options, params, prefix='policy_net_in',
                                        nin=self.n_in, nout=self.n_hidden)
        else:
            print 'building a recurrent controller'
            params = get_layer(self.rec)[0](options, params, prefix='policy_net_in',
                                            nin=self.n_in, dim=self.n_hidden)

        params = get_layer('ff')[0](options, params, prefix='policy_net_out',
                                    nin=self.n_hidden,
                                    nout=self.n_out if self.type == 'categorical' else self.n_out * 2)

        # bias the forget probability
        # if self.n_out == 3:
        #     params[_p('policy_net_out', 'b')][-1] = -2

        # for the baseline network.
        params_b = OrderedDict()

        # using a scalar baseline [**]
        # params_b['b0'] = numpy.array(numpy.random.rand() * 0.0, dtype='float32')

        # using a MLP as a baseline
        params_b = get_layer('ff')[0](options, params_b, prefix='baseline_net_in',
                                      nin=self.n_in, nout=128)
        params_b = get_layer('ff')[0](options, params_b, prefix='baseline_net_out',
                                      nin=128, nout=1)

        if id is not None:
            print 'reload the saved model: {}'.format(id)
            params = load_params(self.WORK + '.policy/{}-{}.current.npz'.format(id, self.options['base']), params)
            params_b = load_params(self.WORK + '.policy/{}-{}.current.npz'.format(id, self.options['base']), params_b)
        else:
            id = datetime.datetime.fromtimestamp(time.time()).strftime('%y%m%d-%H%M%S')
            print 'start from a new model: {}'.format(id)

        self.id = id
        self.model = self.WORK + '.policy/{}-{}'.format(id, self.options['base'])

        # theano shared params
        tparams = init_tparams(params)
        tparams_b = init_tparams(params_b)

        self.tparams = tparams
        self.tparams_b = tparams_b

        # build the policy network
        self.build_sampler(options=options)
        self.build_discriminator(options=options)

        print 'policy network'
        for p in params:
            print p, params[p].shape

    def build_batchnorm(self, observation, mask=None):
        raise NotImplementedError
    def build_sampler(self, options):

        # ==================================================================================== #
        # Build Action function: samplers
        # ==================================================================================== #
        observation = tensor.matrix('observation', dtype='float32')  # batch_size x readout_dim (seq_steps=1)
        prev_hidden = tensor.matrix('p_hidden', dtype='float32')

        if not self.recurrent:
            hiddens = get_layer('ff')[1](self.tparams, observation,
                                         options, prefix='policy_net_in',
                                         activ='tanh')
        else:
            hiddens = get_layer(self.rec)[1](self.tparams, observation,
                                             options, prefix='policy_net_in', mask=None,
                                             one_step=True, _init_state=prev_hidden)[0]

        act_inps = [observation, prev_hidden]

        if self.type == 'categorical':
            act_prob = get_layer('ff')[1](self.tparams, hiddens, options,
                                          prefix='policy_net_out',
                                          activ='softmax')  # batch_size x n_out
            act_prob2 = tensor.clip(act_prob, TINY, 1 - TINY)

            # compiling the sampling function for action
            # action = self.trng.binomial(size=act_prop.shape, p=act_prop)
            action = self.trng.multinomial(pvals=act_prob).argmax(1)  # 0, 1, ...

            print 'build action sampling function [Discrete]'
            self.f_action = theano.function(act_inps, [action, act_prob, hiddens, act_prob2],
                                            on_unused_input='ignore')  # action/dist/hiddens

        elif self.type == 'gaussian':
            _temp = get_layer('ff')[1](self.tparams, hiddens, options,
                                       prefix='policy_net_out',
                                       activ='linear'
                                       )  # batch_size x n_out
            mean, log_std = _temp[:, :self.n_out], _temp[:, self.n_out:]
            mean, log_std = -A * tanh(mean), -B - relu(log_std)

            action0 = self.trng.normal(size=mean.shape, dtype='float32')
            action = action0 * tensor.exp(log_std) + mean

            print 'build action sampling function [Gaussian]'
            self.f_action = theano.function(act_inps, [action, mean, log_std, hiddens],
                                            on_unused_input='ignore')  # action/dist/hiddens
        else:
            raise NotImplementedError
    def build_discriminator(self, options):

        # ==================================================================================== #
        # Build Action Discriminator
        # ==================================================================================== #
        observations = tensor.tensor3('observations', dtype='float32')
        mask = tensor.matrix('mask', dtype='float32')

        if self.type == 'categorical':
            actions = tensor.matrix('actions', dtype='int64')
        elif self.type == 'gaussian':
            actions = tensor.tensor3('actions', dtype='float32')
        else:
            raise NotImplementedError

        if not self.recurrent:
            hiddens = get_layer('ff')[1](self.tparams, observations,
                                         options, prefix='policy_net_in',
                                         activ='tanh')
        else:
            hiddens = get_layer(self.rec)[1](self.tparams, observations,
                                             options, prefix='policy_net_in', mask=mask)[0]

        act_inputs = [observations, mask]

        if self.type == 'categorical':
            act_probs = get_layer('ff')[1](self.tparams, hiddens, options, prefix='policy_net_out',
                                           activ='softmax')  # seq_steps x batch_size x n_out
            act_probs = tensor.clip(act_probs, TINY, 1 - TINY)

            print 'build action distribution'
            self.f_probs = theano.function(act_inputs, act_probs,
                                           on_unused_input='ignore')  # get the action probabilities
        elif self.type == 'gaussian':
            _temps = get_layer('ff')[1](self.tparams, hiddens, options,
                                        prefix='policy_net_out',
                                        activ='linear'
                                        )  # batch_size x n_out
            means, log_stds = _temps[:, :, :self.n_out], _temps[:, :, self.n_out:]
            means, log_stds = -A * tanh(means), -B - relu(log_stds)
            act_probs = [means, log_stds]

            print 'build Gaussian PDF'
            self.f_pdf = theano.function(act_inputs, [means, log_stds],
                                         on_unused_input='ignore')  # get the action probabilities
        else:
            raise NotImplementedError

        # ==================================================================================== #
        # Build Baseline Network (Input-dependent Value Function) & Advantages
        # ==================================================================================== #
        print 'setup the advantages & baseline network'

        reward = tensor.matrix('reward')  # seq_steps x batch_size :: rewards for each step

        # baseline is estimated with a 2-layer neural network.
        hiddens_b = get_layer('ff')[1](self.tparams_b, observations, options,
                                       prefix='baseline_net_in',
                                       activ='tanh')
        baseline = get_layer('ff')[1](self.tparams_b, hiddens_b, options,
                                      prefix='baseline_net_out',
                                      activ='linear')[:, :, 0]  # seq_steps x batch_size or batch_size

        advantages = self.build_advantages(act_inputs, reward, baseline, normalize=True)

        # ==================================================================================== #
        # Build Policy Gradient (here we provide two options)
        # ==================================================================================== #
        if self.options['updater'] == 'REINFORCE':
            print 'build REINFORCE.'
            self.build_reinforce(act_inputs, act_probs, actions, advantages)
        elif self.options['updater'] == 'TRPO':
            print 'build TRPO'
            self.build_trpo(act_inputs, act_probs, actions, advantages)
        else:
            raise NotImplementedError
    # ==================================================================================== #
    # Controller Actions
    # ==================================================================================== #
    def random(self, states, p=0.5):
        live_k = states.shape[0]
        return (numpy.random.random(live_k) > p).astype('int64'), \
               numpy.ones(live_k) * p

    def action(self, states, prevhidden):
        return self.f_action(states, prevhidden)

    def init_hidden(self, n_samples=1):
        return numpy.zeros((n_samples, self.n_hidden), dtype='float32')

    def init_action(self, n_samples=1):
        states0 = numpy.zeros((n_samples, self.n_in), dtype='float32')
        return self.f_action(states0, self.init_hidden(n_samples))

    def get_learner(self):
        if self.options['updater'] == 'REINFORCE':
            return self.run_reinforce
        elif self.options['updater'] == 'TRPO':
            return self.run_trpo
        else:
            raise NotImplementedError
    @staticmethod
    def kl(prob0, prob1):
        p1 = (prob0 + TINY) / (prob1 + TINY)
        # p2 = (1 - prob0 + TINY) / (1 - prob1 + TINY)
        return tensor.sum(prob0 * tensor.log(p1), axis=-1)

    @staticmethod
    def _grab_prob(probs, X):
        assert probs.ndim == 3

        batch_size = probs.shape[1]
        max_len = probs.shape[0]
        vocab_size = probs.shape[2]

        probs = probs.reshape((batch_size * max_len, vocab_size))
        return probs[tensor.arange(batch_size * max_len), X.flatten(1)].reshape(X.shape)  # advanced indexing

    def cross(self, probs, actions):
        # return tensor.log(probs) * actions + tensor.log(1 - probs) * (1 - actions)
        return self._grab_prob(tensor.log(probs), actions)
def build_advantages(self, act_inputs, reward, baseline, normalize=True):
# TODO: maybe we need a discount factor gamma for advantages.
# TODO: we can also rewrite advantages with value functions (GAE)
# Advantages and Normalization the return
reward_adv = reward - baseline
mask = act_inputs[1]
if normalize:
reward_mean = tensor.sum(mask * reward_adv) / tensor.sum(mask)
reward_mean2 = tensor.sum(mask * (reward_adv ** 2)) / tensor.sum(mask)
reward_std = tensor.sqrt(tensor.maximum(reward_mean2 - reward_mean ** 2, TINY)) + TINY
# reward_std = tensor.maximum(reward_std, 1)
reward_c = reward_adv - reward_mean # independent mean
advantages = reward_c / reward_std
else:
advantages = reward_adv
print 'build advantages and baseline gradient'
L = tensor.sum(mask * (reward_adv ** 2)) / tensor.sum(mask)
dL = tensor.grad(L, wrt=itemlist(self.tparams_b))
lr = tensor.scalar(name='lr')
inps_b = act_inputs + [reward]
oups_b = [L, advantages]
f_adv, f_update_b = adam(lr, self.tparams_b, dL, inps_b, oups_b)
self.f_adv = f_adv
self.f_update_b = f_update_b
return advantages
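The masked whitening done in `build_advantages` can be sketched in plain NumPy (the function name is ours; `TINY` is an assumed constant as above):

```python
import numpy as np

TINY = 1e-8  # assumed constant

def normalize_advantages(reward, baseline, mask, tiny=TINY):
    """Masked mean/std normalization of (reward - baseline)."""
    adv = reward - baseline
    mean = np.sum(mask * adv) / np.sum(mask)
    mean2 = np.sum(mask * adv ** 2) / np.sum(mask)
    std = np.sqrt(max(mean2 - mean ** 2, tiny)) + tiny
    return (adv - mean) / std

reward = np.array([1.0, 2.0, 3.0, 0.0])
mask = np.array([1.0, 1.0, 1.0, 0.0])  # last step is padding
adv = normalize_advantages(reward, np.zeros(4), mask)
# valid steps have ~zero mean after normalization
assert abs(np.sum(mask * adv) / np.sum(mask)) < 1e-6
```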
# ===================================================================
# Policy Gradient: REINFORCE with Adam
# ===================================================================
def build_reinforce(self, act_inputs, act_probs, actions, advantages):
mask = act_inputs[1]
if self.type == 'categorical':
negEntropy = tensor.sum(tensor.log(act_probs) * act_probs, axis=-1)
logLikelihood = self.cross(act_probs, actions)
elif self.type == 'gaussian':
means, log_stds = act_probs
negEntropy = -tensor.sum(log_stds + tensor.log(tensor.sqrt(2 * PI * E)), axis=-1)
actions0 = (actions - means) / tensor.exp(log_stds)
logLikelihood = -tensor.sum(log_stds, axis=-1) - \
0.5 * tensor.sum(tensor.sqr(actions0), axis=-1) - \
0.5 * means.shape[-1] * tensor.log(2 * PI)
else:
raise NotImplementedError
# tensor.log(act_probs) * actions + tensor.log(1 - act_probs) * (1 - actions)
H = tensor.sum(mask * negEntropy, axis=0).mean() * 0.001 # penalty
J = tensor.sum(mask * -logLikelihood * advantages, axis=0).mean() + H
dJ = grad_clip(tensor.grad(J, wrt=itemlist(self.tparams)))
print 'build REINFORCE optimizer'
lr = tensor.scalar(name='lr')
inps = act_inputs + [actions, advantages]
outps = [J, H]
if self.type == 'gaussian':
outps += [actions0.mean(), actions.mean()]
f_cost, f_update = adam(lr, self.tparams, dJ, inps, outps)
self.f_cost = f_cost
self.f_update = f_update
print 'done'
def run_reinforce(self, act_inputs, actions, reward, update=True, lr=0.0002):
# sub baseline
inps_adv = act_inputs + [reward]
L, advantages = self.f_adv(*inps_adv)
inps_reinforce = act_inputs + [actions, advantages]
if self.type == 'gaussian':
J, H, m, s = self.f_cost(*inps_reinforce)
info = {'J': J, 'G_norm': H, 'B_loss': L, 'Adv': advantages.mean(), 'm': m, 's': s}
else:
J, H = self.f_cost(*inps_reinforce)
info = {'J': J, 'Entropy': H, 'B_loss': L, 'Adv': advantages.mean()}
info['advantages'] = advantages
if update: # update the parameters
self.f_update_b(lr)
self.f_update(lr)
return info
# ==================================================================================== #
# Trust Region Policy Optimization
# ==================================================================================== #
def build_trpo(self, act_inputs, act_probs, actions, advantages):
assert self.type == 'categorical', 'TRPO only supports categorical policies at this stage'
# probability distribution
mask = act_inputs[1]
probs = act_probs
probs_old = tensor.matrix(dtype='float32')
logp = self.cross(probs, actions)
logp_old = self.cross(probs_old, actions)
# policy gradient
J = tensor.sum(mask * -tensor.exp(logp - logp_old) * advantages, axis=0).mean()
dJ = flatgrad(J, self.tparams)
probs_fix = theano.gradient.disconnected_grad(probs)
kl_fix = tensor.sum(mask * self.kl(probs_fix, probs), axis=0).mean()
kl_grads = tensor.grad(kl_fix, wrt=itemlist(self.tparams))
ftangents = tensor.fvector(name='flat_tan')
shapes = [self.tparams[var].get_value(borrow=True).shape for var in self.tparams]
start = 0
tangents = []
for shape in shapes:
size = numpy.prod(shape)
tangents.append(tensor.reshape(ftangents[start:start + size], shape))
start += size
gvp = tensor.add(*[tensor.sum(g * t) for (g, t) in zipsame(kl_grads, tangents)])
# Fisher-vector product
fvp = flatgrad(gvp, self.tparams)
entropy = tensor.sum(mask * -self.cross(probs, probs), axis=0).mean()
kl = tensor.sum(mask * self.kl(probs_old, probs), axis=0).mean()
print 'compile the functions'
inps = act_inputs + [actions, advantages, probs_old]
loss = [J, kl, entropy]
self.f_pg = theano.function(inps, dJ)
self.f_loss = theano.function(inps, loss)
self.f_fisher = theano.function([ftangents] + inps, fvp, on_unused_input='ignore')
# get/set flatten params
print 'compiling flat updater'
self.get_flat = theano.function([], tensor.concatenate([self.tparams[v].flatten() for v in self.tparams]))
theta = tensor.vector()
start = 0
updates = []
for v in self.tparams:
p = self.tparams[v]
shape = p.shape
size = tensor.prod(shape)
updates.append((p, theta[start:start + size].reshape(shape)))
start += size
self.set_flat = theano.function([theta], [], updates=updates)
def run_trpo(self, act_inputs, actions, reward,
update=True, cg_damping=1e-3, max_kl=1e-2, lr=0.0002):
# sub baseline
inps_adv = act_inputs + [reward]
L, advantages = self.f_adv(*inps_adv)
self.f_update_b(lr)
# get current action distributions
probs = self.f_probs(*act_inputs)
inps = act_inputs + [actions, advantages, probs]
thprev = self.get_flat()
def fisher_vector_product(p):
return self.f_fisher(p, *inps) + cg_damping * p
g = self.f_pg(*inps)
losses_before = self.f_loss(*inps)
if numpy.allclose(g, 0):
print 'zero gradient, not updating'
else:
stepdir = self.cg(fisher_vector_product, -g)
shs = .5 * stepdir.dot(fisher_vector_product(stepdir))
lm = numpy.sqrt(shs / max_kl)
print "\nlagrange multiplier:", lm, "gnorm:", numpy.linalg.norm(g)
fullstep = stepdir / lm
neggdotstepdir = -g.dot(stepdir)
def loss(th):
self.set_flat(th)
return self.f_loss(*inps)[0]
print 'do line search'
success, theta = self.linesearch(loss, thprev, fullstep, neggdotstepdir / lm)
print "success", success
self.set_flat(theta)
losses_after = self.f_loss(*inps)
info = OrderedDict()
for (lname, lbefore, lafter) in zipsame(['J', 'KL', 'entropy'], losses_before, losses_after):
info[lname + "_before"] = lbefore
info[lname + "_after"] = lafter
# add the baseline loss into full information
info['B_loss'] = L
return info
@staticmethod
def linesearch(f, x, fullstep, expected_improve_rate, max_backtracks=10, accept_ratio=.1):
"""
Backtracking linesearch, where expected_improve_rate is the slope dy/dx at the initial point
"""
fval = f(x)
print "fval before", fval
for (_n_backtracks, stepfrac) in enumerate(.5 ** numpy.arange(max_backtracks)):
xnew = x + stepfrac * fullstep
newfval = f(xnew)
actual_improve = fval - newfval
expected_improve = expected_improve_rate * stepfrac
ratio = actual_improve / expected_improve
print "a/e/r", actual_improve, expected_improve, ratio
if ratio > accept_ratio and actual_improve > 0:
print "fval after", newfval
return True, xnew
return False, x
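A standalone Python 3 version of the backtracking search above (prints dropped) behaves as expected on a toy quadratic; the step and slope values below are arbitrary:

```python
import numpy as np

def linesearch(f, x, fullstep, expected_improve_rate,
               max_backtracks=10, accept_ratio=.1):
    """Backtracking line search, mirroring the staticmethod above."""
    fval = f(x)
    for stepfrac in .5 ** np.arange(max_backtracks):
        xnew = x + stepfrac * fullstep
        actual_improve = fval - f(xnew)
        expected_improve = expected_improve_rate * stepfrac
        if actual_improve / expected_improve > accept_ratio and actual_improve > 0:
            return True, xnew
    return False, x

# Toy quadratic: the full step overshoots the minimum, so backtracking kicks in
f = lambda x: float(x ** 2)
ok, xnew = linesearch(f, np.array(2.0), np.array(-6.0), expected_improve_rate=8.0)
assert ok
assert f(xnew) < f(np.array(2.0))
```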
@staticmethod
def cg(f_Ax, b, cg_iters=10, callback=None, verbose=False, residual_tol=1e-10):
"""
Conjugate Gradient
"""
p = b.copy()
r = b.copy()
x = numpy.zeros_like(b)
rdotr = r.dot(r)
fmtstr = "%10i %10.3g %10.3g"
titlestr = "%10s %10s %10s"
if verbose: print titlestr % ("iter", "residual norm", "soln norm")
for i in xrange(cg_iters):
if callback is not None:
callback(x)
if verbose: print fmtstr % (i, rdotr, numpy.linalg.norm(x))
z = f_Ax(p)
v = rdotr / p.dot(z)
x += v * p
r -= v * z
newrdotr = r.dot(r)
mu = newrdotr / rdotr
p = r + mu * p
rdotr = newrdotr
if rdotr < residual_tol:
break
if callback is not None:
callback(x)
if verbose: print fmtstr % (i + 1, rdotr, numpy.linalg.norm(x))
return x
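The same conjugate-gradient iteration in plain NumPy solves a small symmetric positive-definite system (the matrix below is a hypothetical example):

```python
import numpy as np

def cg(f_Ax, b, cg_iters=10, residual_tol=1e-10):
    """Plain-NumPy conjugate gradient, mirroring the staticmethod above."""
    p, r = b.copy(), b.copy()
    x = np.zeros_like(b)
    rdotr = r.dot(r)
    for _ in range(cg_iters):
        z = f_Ax(p)
        v = rdotr / p.dot(z)
        x += v * p
        r -= v * z
        newrdotr = r.dot(r)
        p = r + (newrdotr / rdotr) * p
        rdotr = newrdotr
        if rdotr < residual_tol:
            break
    return x

# Hypothetical SPD system A x = b; CG converges in at most 2 steps here
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg(lambda v: A.dot(v), b)
assert np.allclose(A.dot(x), b)
```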
# ====================================================================== #
# Save & Load
# ====================================================================== #
def save(self, history, it):
_params = OrderedDict()
_params = unzip(self.tparams, _params)
_params = unzip(self.tparams_b, _params)
print 'save the policy network >> {}'.format(self.model)
numpy.savez('%s.current' % (self.model),
history=history,
it=it,
**_params)
numpy.savez('{}.iter={}'.format(self.model, it),
history=history,
it=it,
**_params)
def load(self):
if os.path.exists(self.model):
print 'loading from the existing model (current)'
rmodel = numpy.load(self.model)
history = rmodel['history']
it = rmodel['it']
self.params = load_params(rmodel, self.params)
self.params_b = load_params(rmodel, self.params_b)
self.tparams = init_tparams(self.params)
self.tparams_b = init_tparams(self.params_b)
print 'the dataset needs to go over {} lines'.format(it)
return history, it
else:
return [], -1

b442fb148ab72708b2f20e85644d227c7977348c | 453 | py | Python | ejercicio 14.py | Davidpadilla1234/taller_estructura-secuencial | MIT
"""
Inputs:
current reading ---> float ---> lect2
previous reading ---> float ---> lect1
kW price ---> float ---> valorkw
Outputs:
consumption ---> float ---> consumo
total bill ---> float ---> total
"""
lect2 = float(input("Digite lectura real: "))
lect1 = float(input("Digite lectura anterior: "))
valorkw = float(input("Valor del kilowatio: "))
consumo = lect2 - lect1
total = consumo * valorkw
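The billing computation can also be expressed as a small reusable function (the name `factura` is ours):

```python
def factura(lect2, lect1, valorkw):
    """Return (consumption, total) for current/previous readings and kW price."""
    consumo = lect2 - lect1
    total = consumo * valorkw
    return consumo, total

# 50 kW consumed at 2.0 per kW -> total 100.0
assert factura(150.0, 100.0, 2.0) == (50.0, 100.0)
```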
print("El valor a pagar es: " + str(total))

b443e69cd16f1827fe9ba10cb1499425321f1ac2 | 1,059 | py | Python | manage.py | xinbingliang/dockertest | MIT
# manage.py
import unittest
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
from skeleton.server import app, db
from skeleton.server.models import User
migrate = Migrate(app, db)
manager = Manager(app)
# migrations
manager.add_command('db', MigrateCommand)
@manager.command
def test():
"""Runs the unit tests without coverage."""
tests = unittest.TestLoader().discover('tests', pattern='test*.py')
result = unittest.TextTestRunner(verbosity=2).run(tests)
if result.wasSuccessful():
return 0
else:
return 1
@manager.command
def create_db():
"""Creates the db tables."""
db.create_all()
@manager.command
def drop_db():
"""Drops the db tables."""
db.drop_all()
@manager.command
def create_admin():
"""Creates the admin user."""
db.session.add(User(email='admin@cisco.com', password='admin', admin=True))
db.session.commit()
@manager.command
def create_data():
"""Creates sample data."""
pass
if __name__ == '__main__':
manager.run()

b444035780c265816dfc1fd4e30cb0ee8b926672 | 610 | py | Python | client/middleware.py | uktrade/directory-forms-api | MIT
import sigauth.middleware
import sigauth.helpers
from client import helpers
class SignatureCheckMiddleware(sigauth.middleware.SignatureCheckMiddlewareBase):
secret = None
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.request_checker = helpers.RequestSignatureChecker(self.secret)
def should_check(self, request):
if request.resolver_match.namespace in [
'admin', 'healthcheck', 'authbroker_client'
] or request.path_info.startswith('/admin/login'):
return False
return super().should_check(request)

b4484ab703976e8f170a719cc81c5d0146cb13ba | 533 | py | Python | dictionaries/lab/06_students.py | Galchov/python-fundamentals | MIT
data = input()
courses = {}
while ":" in data:
student_name, id, course_name = data.split(":")
if course_name not in courses:
courses[course_name] = {}
courses[course_name][id] = student_name
data = input()
searched_course = data
searched_course_name_as_list = searched_course.split("_")
searched_course = " ".join(searched_course_name_as_list)
for course_name in courses:
if course_name == searched_course:
for id, name in courses[course_name].items():
print(f"{name} - {id}")
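With hypothetical sample records, the grouping above builds a per-course dict keyed by student id (a sketch using `setdefault`, avoiding the shadowed `id` builtin):

```python
courses = {}
for record in ["John:1234:Math", "Jane:5678:Math", "Bob:9999:History"]:
    student_name, sid, course_name = record.split(":")
    # nested dict: course name -> {student id -> student name}
    courses.setdefault(course_name, {})[sid] = student_name

assert courses["Math"] == {"1234": "John", "5678": "Jane"}
assert courses["History"] == {"9999": "Bob"}
```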

b44954b2c2b3e9462c5ae4cfc721ce64071a8588 | 1,184 | py | Python | 04.Encapsulation/Exe/pizza_maker/project/main.py | nmoskova/Python-OOP | MIT
from encapsulation_04.exe.pizza_maker.project.dough import Dough
from encapsulation_04.exe.pizza_maker.project.pizza import Pizza
from encapsulation_04.exe.pizza_maker.project.topping import Topping
tomato_topping = Topping("Tomato", 60)
print(tomato_topping.topping_type)
print(tomato_topping.weight)
mushrooms_topping = Topping("Mushroom", 75)
print(mushrooms_topping.topping_type)
print(mushrooms_topping.weight)
mozzarella_topping = Topping("Mozzarella", 80)
print(mozzarella_topping.topping_type)
print(mozzarella_topping.weight)
cheddar_topping = Topping("Cheddar", 150)
pepperoni_topping = Topping("Pepperoni", 120)
white_flour_dough = Dough("White Flour", "Mixing", 200)
print(white_flour_dough.flour_type)
print(white_flour_dough.weight)
print(white_flour_dough.baking_technique)
whole_wheat_dough = Dough("Whole Wheat Flour", "Mixing", 200)
print(whole_wheat_dough.weight)
print(whole_wheat_dough.flour_type)
print(whole_wheat_dough.baking_technique)
p = Pizza("Margherita", whole_wheat_dough, 2)
p.add_topping(tomato_topping)
print(p.calculate_total_weight())
p.add_topping(mozzarella_topping)
print(p.calculate_total_weight())
p.add_topping(mozzarella_topping)

b44998685fc665e80493c8e5ef4cef6084f68ca9 | 4,875 | py | Python | ludopediaAnuncios.py | christianbobsin/LudopediaDataMiner | MIT
# -*- coding: utf-8 -*-
from lxml import html
from time import sleep
from datetime import datetime
import requests
import os
import sqlite3
import sys
# Usage from the terminal: python ludopedia.py [idIni] [regs]
# e.g.: python ludopedia.py 451 3000
con = sqlite3.connect('ludopedia.db')
cursor = con.cursor()
cursor.execute("""SELECT (ANUNCIO + 1) FROM JOGOS WHERE ANUNCIO=(SELECT MAX(ANUNCIO) FROM JOGOS WHERE TIPO='ANUNCIO') """)
anuncios = cursor.fetchall()
con.close()
idIni = int(anuncios[0][0])
#idIni = 75691
#regs = int(sys.argv[2])
regs = 9999
idMax = ( idIni + regs )
jogosAdicionados = 0
for id in range(idIni, idMax):
# 'http://www.ludopedia.com.br/anuncio?id_anuncio='+str(id)
#url = 'http://www.ludopedia.com.br/anuncio?id_anuncio=' % id
try:
page = requests.get('http://www.ludopedia.com.br/anuncio?id_anuncio='+str(id))
tree = html.fromstring(page.content)
except:
print 'retrying in 10s'
sleep(10)
page = requests.get('http://www.ludopedia.com.br/anuncio?id_anuncio='+str(id))
tree = html.fromstring(page.content)
#jogoNome = tree.xpath('//div[@class="col-xs-10"]/h3/a/text()')
jogoNome = tree.xpath('//*[@id="page-content"]/div/div/div/div[2]/h3/a/text()')
#jogoFlavor = tree.xpath('//div[@class="col-xs-10"]/h3/span/text()')
jogoFlavor = tree.xpath('//*[@id="page-content"]/div/div/div/div[2]/h3/span/text()')
if len(jogoFlavor):
detalhes = jogoFlavor[0]
else:
detalhes = 'NA'
jogoPreco = tree.xpath('//span[@class="negrito proximo_lance"]/text()')
if len(jogoPreco):
jogoPreco =jogoPreco[0].split()
jogoPreco[1] = jogoPreco[1].replace('.','')
preco = float( jogoPreco[1].replace( ',','.' ) )
else:
preco = 0.0
status = tree.xpath('//td/span/text()')
validadeAnuncio = tree.xpath('//td/text()')
if len(validadeAnuncio):
validadeAnuncio[4] = validadeAnuncio[4].replace(',',' ')
data = validadeAnuncio[4].split()
ano = data[0].split('/')
hora = data[1].split(':')
data = datetime( int(ano[2]), int(ano[1]),int(ano[0]), int(hora[0]), int(hora[1]))
if ( data > datetime.now() and status[1] == 'Vendido'):
data = datetime.now()
else:
data = datetime( 1979, 8, 10 )
pessoa = tree.xpath('//td/a/text()')
if len(pessoa):
vendedor = pessoa[1]
if len(pessoa) < 3:
comprador = 'NA'
else:
comprador = pessoa[2]
current = id - idIni + 1
total = idMax - idIni
progress = (current/float(total))*100
#print str(current) + ' / ' + str(total) + " : " + "%.2f" % round(progress,2) + "%"
#print 'Id: ', id
#jogoCount = id - idIni
if len(jogoNome):
jogosAdicionados = jogosAdicionados + 1
if ( len(status[1]) > 15 ):
status[1] = 'Ativo'
#print 'Jogo: ', jogoNome[0]
#print 'Detalhes ', detalhes
#print 'Preco: ', str(preco)
#print 'Status: ', status[1]
#print 'Validade: ', data
#print 'Estado: ', validadeAnuncio[6]
#print 'Local: ', validadeAnuncio[8]
#print 'Vendedor: ', vendedor
#print 'Comprador:', comprador
print str( current ).zfill( 4 ) + ' '+ str ( id ) + ' ' + ano[2] + '-' +str( ano[1] ).zfill(2) + '-'+ str( ano[0] ).zfill(2) + ' ' + status[1] + '\t\t' + validadeAnuncio[6] + '\t' + str(preco) + '\t ' + jogoNome[0]
con = sqlite3.connect('ludopedia.db')
cursor = con.cursor()
cursor.execute("""INSERT INTO JOGOS ( ANUNCIO, JOGO, SUBTITULO, PRECO, STATUS, VALIDADE, ESTADO, ORIGEM, VENDEDOR, COMPRADOR, TIPO )
VALUES (?,?,?,?,?,?,?,?,?,?,?)""", (id, jogoNome[0], detalhes, preco, status[1], data, validadeAnuncio[6],
validadeAnuncio[8], vendedor, comprador, 'ANUNCIO' ) )
try:
con.commit()
except:
print 'Commit failed, retrying in 10s.'
sleep(10)
con.commit()
con.close()
#print '-----------------------'
#print 'Jogos Adicionados: ' + str( jogosAdicionados )
#print '-----------------------'
else:
print str( current ).zfill( 4 ) + ' ' + str ( id ) + '\t ' + '-------' + ' \t ' + '-------' + ' \t ' + '------' + '\t ' + '---'
sleep(0.05)
#os.system('clear')
print '---------------------------------------------------------------'
print 'Jogos Adicionados: ' + str( jogosAdicionados )
print '---------------------------------------------------------------'
########################################################################
#sTable = sorted( table, key = getKey )
#print tabulate(sTable, tablefmt="plain" )
#f = open ( 'LudopediaLeaks %s-%s.csv' % ( idIni, idMax) , 'w' )
#for x in range ( 0, len( sTable ) ):
# row = "%s;%s;%s;%s;%s;%s;%s;%s;%s;%s" % ( sTable[x][0],
# sTable[x][1].encode('utf8'),
# sTable[x][2].encode('utf8'),
# sTable[x][3],
# sTable[x][4].encode('utf8'),
# sTable[x][5],
# sTable[x][6].encode('utf8'),
# sTable[x][7].encode('utf8'),
# sTable[x][8].encode('utf8'),
# sTable[x][9].encode('utf8') )
# print row
# f.write(row + '\n' )
#f.close()
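The inline date handling above (splitting a 'dd/mm/yyyy, hh:mm' string) can be factored into a helper; `parse_validade` is a hypothetical name, the script itself does this inline:

```python
from datetime import datetime

def parse_validade(raw):
    """Parse the listing expiry format 'dd/mm/yyyy, hh:mm' into a datetime."""
    date_part, time_part = raw.replace(',', ' ').split()
    day, month, year = (int(x) for x in date_part.split('/'))
    hour, minute = (int(x) for x in time_part.split(':'))
    return datetime(year, month, day, hour, minute)

assert parse_validade('10/08/1979, 23:15') == datetime(1979, 8, 10, 23, 15)
```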

b44ef5d465bb9fde348df90c5e65dba1ad7814be | 67,560 | py | Python | pandas/core/internals.py | lodagro/pandas | PSF-2.0, BSD-2-Clause, BSD-3-Clause
import itertools
from datetime import datetime
from numpy import nan
import numpy as np
from pandas.core.common import _possibly_downcast_to_dtype, isnull
from pandas.core.index import Index, MultiIndex, _ensure_index, _handle_legacy_indexes
from pandas.core.indexing import _check_slice_bounds, _maybe_convert_indices
import pandas.core.common as com
import pandas.lib as lib
import pandas.tslib as tslib
import pandas.core.expressions as expressions
from pandas.tslib import Timestamp
from pandas.util import py3compat
class Block(object):
"""
Canonical n-dimensional unit of homogeneous dtype contained in a pandas
data structure
Index-ignorant; let the container take care of that
"""
__slots__ = ['items', 'ref_items', '_ref_locs', 'values', 'ndim']
is_numeric = False
is_bool = False
is_object = False
_can_hold_na = False
_downcast_dtype = None
def __init__(self, values, items, ref_items, ndim=2):
if issubclass(values.dtype.type, basestring):
values = np.array(values, dtype=object)
if values.ndim != ndim:
raise ValueError('Wrong number of dimensions')
if len(items) != len(values):
raise ValueError('Wrong number of items passed %d, indices imply %d'
% (len(items), len(values)))
self._ref_locs = None
self.values = values
self.ndim = ndim
self.items = _ensure_index(items)
self.ref_items = _ensure_index(ref_items)
def _gi(self, arg):
return self.values[arg]
@property
def ref_locs(self):
if self._ref_locs is None:
indexer = self.ref_items.get_indexer(self.items)
indexer = com._ensure_platform_int(indexer)
if (indexer == -1).any():
raise AssertionError('Some block items were not in block '
'ref_items')
self._ref_locs = indexer
return self._ref_locs
def set_ref_items(self, ref_items, maybe_rename=True):
"""
If maybe_rename=True, need to set the items for this guy
"""
if not isinstance(ref_items, Index):
raise AssertionError('block ref_items must be an Index')
if maybe_rename:
self.items = ref_items.take(self.ref_locs)
self.ref_items = ref_items
def __repr__(self):
shape = ' x '.join([com.pprint_thing(s) for s in self.shape])
name = type(self).__name__
result = '%s: %s, %s, dtype %s' % (
name, com.pprint_thing(self.items), shape, self.dtype)
if py3compat.PY3:
return unicode(result)
return com.console_encode(result)
def __contains__(self, item):
return item in self.items
def __len__(self):
return len(self.values)
def __getstate__(self):
# should not pickle generally (want to share ref_items), but here for
# completeness
return (self.items, self.ref_items, self.values)
def __setstate__(self, state):
items, ref_items, values = state
self.items = _ensure_index(items)
self.ref_items = _ensure_index(ref_items)
self.values = values
self.ndim = values.ndim
@property
def shape(self):
return self.values.shape
@property
def itemsize(self):
return self.values.itemsize
@property
def dtype(self):
return self.values.dtype
def copy(self, deep=True):
values = self.values
if deep:
values = values.copy()
return make_block(values, self.items, self.ref_items)
def merge(self, other):
if not self.ref_items.equals(other.ref_items):
raise AssertionError('Merge operands must have same ref_items')
# Not sure whether to allow this or not
# if not union_ref.equals(other.ref_items):
# union_ref = self.ref_items + other.ref_items
return _merge_blocks([self, other], self.ref_items)
def reindex_axis(self, indexer, axis=1, fill_value=np.nan, mask_info=None):
"""
Reindex using pre-computed indexer information
"""
if axis < 1:
raise AssertionError('axis must be at least 1, got %d' % axis)
new_values = com.take_nd(self.values, indexer, axis,
fill_value=fill_value, mask_info=mask_info)
return make_block(new_values, self.items, self.ref_items)
def reindex_items_from(self, new_ref_items, copy=True):
"""
Reindex to only those items contained in the input set of items
E.g. if you have ['a', 'b'], and the input items are ['b', 'c', 'd'],
then the resulting items will be ['b']
Returns
-------
reindexed : Block
"""
new_ref_items, indexer = self.items.reindex(new_ref_items)
if indexer is None:
new_items = new_ref_items
new_values = self.values.copy() if copy else self.values
else:
masked_idx = indexer[indexer != -1]
new_values = com.take_nd(self.values, masked_idx, axis=0,
allow_fill=False)
new_items = self.items.take(masked_idx)
return make_block(new_values, new_items, new_ref_items)
def get(self, item):
loc = self.items.get_loc(item)
return self.values[loc]
def set(self, item, value):
"""
Modify Block in-place with new item value
Returns
-------
None
"""
loc = self.items.get_loc(item)
self.values[loc] = value
def delete(self, item):
"""
Returns
-------
y : Block (new object)
"""
loc = self.items.get_loc(item)
new_items = self.items.delete(loc)
new_values = np.delete(self.values, loc, 0)
return make_block(new_values, new_items, self.ref_items)
def split_block_at(self, item):
"""
Split block into zero or more blocks around columns with given label,
for "deleting" a column without having to copy data by returning views
on the original array.
Returns
-------
generator of Block
"""
loc = self.items.get_loc(item)
if type(loc) == slice or type(loc) == int:
mask = [True] * len(self)
mask[loc] = False
else: # already a mask, inverted
mask = -loc
for s, e in com.split_ranges(mask):
yield make_block(self.values[s:e],
self.items[s:e].copy(),
self.ref_items)
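`com.split_ranges` is expected to yield the contiguous runs of True in the mask; a plain-Python stand-in (not the actual pandas implementation) illustrates the behaviour `split_block_at` relies on:

```python
def split_ranges(mask):
    """Yield (start, end) half-open ranges of contiguous True runs in mask."""
    start = None
    for i, keep in enumerate(mask):
        if keep and start is None:
            start = i
        elif not keep and start is not None:
            yield start, i
            start = None
    if start is not None:
        yield start, len(mask)

# Dropping the item at position 2 leaves two views: [0:2) and [3:5)
mask = [True, True, False, True, True]
assert list(split_ranges(mask)) == [(0, 2), (3, 5)]
```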
def fillna(self, value, inplace=False, downcast=None):
if not self._can_hold_na:
if inplace:
return self
else:
return self.copy()
new_values = self.values if inplace else self.values.copy()
mask = com.isnull(new_values)
np.putmask(new_values, mask, value)
block = make_block(new_values, self.items, self.ref_items)
if downcast:
block = block.downcast()
return block
def downcast(self, dtypes = None):
""" try to downcast each item to the dtype given in the dtypes dict, if present """
if dtypes is None:
dtypes = dict()
values = self.values
blocks = []
for i, item in enumerate(self.items):
dtype = dtypes.get(item,self._downcast_dtype)
if dtype is None:
nv = _block_shape(values[i])
blocks.append(make_block(nv, [ item ], self.ref_items))
continue
nv = _possibly_downcast_to_dtype(values[i], np.dtype(dtype))
nv = _block_shape(nv)
blocks.append(make_block(nv, [ item ], self.ref_items))
return blocks
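`_possibly_downcast_to_dtype` only keeps the conversion when it is lossless; a simplified sketch of that idea (not the actual pandas logic):

```python
import numpy as np

def possibly_downcast(values, dtype):
    """Downcast only when the conversion round-trips losslessly."""
    result = values.astype(dtype)
    if np.array_equal(result.astype(values.dtype), values):
        return result
    return values

floats = np.array([1.0, 2.0, 3.0])
assert possibly_downcast(floats, np.int64).dtype == np.int64   # lossless, downcast
mixed = np.array([1.5, 2.0])
assert possibly_downcast(mixed, np.int64).dtype == mixed.dtype  # lossy, kept as-is
```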
def astype(self, dtype, copy = True, raise_on_error = True):
"""
Coerce to the new type (if copy=True, return a new copy)
raise on an except if raise == True
"""
try:
newb = make_block(com._astype_nansafe(self.values, dtype, copy = copy),
self.items, self.ref_items)
except:
if raise_on_error is True:
raise
newb = self.copy() if copy else self
if newb.is_numeric and self.is_numeric:
if (newb.shape != self.shape or
(not copy and newb.itemsize < self.itemsize)):
raise TypeError("cannot set astype for copy = [%s] for dtype "
"(%s [%s]) with smaller itemsize that current "
"(%s [%s])" % (copy, self.dtype.name,
self.itemsize, newb.dtype.name, newb.itemsize))
return newb
def convert(self, copy = True, **kwargs):
""" attempt to coerce any object types to better types
return a copy of the block (if copy = True)
by definition we are not an ObjectBlock here! """
return self.copy() if copy else self
def _can_hold_element(self, value):
raise NotImplementedError()
def _try_cast(self, value):
raise NotImplementedError()
def _try_cast_result(self, result):
""" try to cast the result to our original type,
we may have roundtripped thru object in the mean-time """
return result
def _try_coerce_args(self, values, other):
""" provide coercion to our input arguments """
return values, other
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
return result
def to_native_types(self, slicer=None, na_rep='', **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:,slicer]
values = np.array(values,dtype=object)
mask = isnull(values)
values[mask] = na_rep
return values.tolist()
def replace(self, to_replace, value, inplace=False, filter=None):
""" replace the to_replace value with value; it is possible to create new blocks here
(this is just a call to putmask) """
mask = com.mask_missing(self.values, to_replace)
if filter is not None:
for i, item in enumerate(self.items):
if item not in filter:
mask[i] = False
if not mask.any():
if inplace:
return [ self ]
return [ self.copy() ]
return self.putmask(mask, value, inplace=inplace)
def putmask(self, mask, new, inplace=False):
""" putmask the data to the block; it is possible that we may create a new dtype of block
return the resulting block(s) """
new_values = self.values if inplace else self.values.copy()
# may need to align the new
if hasattr(new, 'reindex_axis'):
axis = getattr(new, '_het_axis', 0)
new = new.reindex_axis(self.items, axis=axis, copy=False).values.T
# may need to align the mask
if hasattr(mask, 'reindex_axis'):
axis = getattr(mask, '_het_axis', 0)
mask = mask.reindex_axis(self.items, axis=axis, copy=False).values.T
if self._can_hold_element(new):
new = self._try_cast(new)
np.putmask(new_values, mask, new)
# maybe upcast me
elif mask.any():
# need to go column by column
new_blocks = []
for i, item in enumerate(self.items):
m = mask[i]
# need a new block
if m.any():
n = new[i] if isinstance(new, np.ndarray) else new
# type of the new block
dtype, _ = com._maybe_promote(np.array(n).dtype)
# we need to explicitly astype here to make a copy
nv = new_values[i].astype(dtype)
# we create a new block type
np.putmask(nv, m, n)
else:
nv = new_values[i] if inplace else new_values[i].copy()
nv = _block_shape(nv)
new_blocks.append(make_block(nv, [ item ], self.ref_items))
return new_blocks
if inplace:
return [ self ]
return [ make_block(new_values, self.items, self.ref_items) ]
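The upcasting branch above exists because `np.putmask` writes in the target array's dtype. A minimal standalone illustration (plain NumPy, independent of this module):

```python
import numpy as np

values = np.array([1, 2, 3], dtype='int64')
mask = np.array([False, True, False])

truncated = values.copy()
np.putmask(truncated, mask, 7.5)     # int64 target: 7.5 is silently truncated to 7

promoted = values.astype('float64')  # upcast a copy first, as the block code does
np.putmask(promoted, mask, 7.5)      # float64 target keeps 7.5

print(truncated.tolist())  # [1, 7, 3]
print(promoted.tolist())   # [1.0, 7.5, 3.0]
```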
def interpolate(self, method='pad', axis=0, inplace=False,
limit=None, missing=None, coerce=False):
# if we are coercing, then don't force the conversion
# if the block can't hold the type
if coerce:
if not self._can_hold_na:
if inplace:
return self
else:
return self.copy()
values = self.values if inplace else self.values.copy()
if values.ndim != 2:
raise NotImplementedError
transf = (lambda x: x) if axis == 0 else (lambda x: x.T)
if missing is None:
mask = None
else: # todo create faster fill func without masking
mask = com.mask_missing(transf(values), missing)
if method == 'pad':
com.pad_2d(transf(values), limit=limit, mask=mask)
else:
com.backfill_2d(transf(values), limit=limit, mask=mask)
return make_block(values, self.items, self.ref_items)
def take(self, indexer, axis=1):
if axis < 1:
raise AssertionError('axis must be at least 1, got %d' % axis)
new_values = com.take_nd(self.values, indexer, axis=axis,
allow_fill=False)
return make_block(new_values, self.items, self.ref_items)
def get_values(self, dtype):
return self.values
def diff(self, n):
""" return block for the diff of the values """
new_values = com.diff(self.values, n, axis=1)
return make_block(new_values, self.items, self.ref_items)
def shift(self, indexer, periods):
""" shift the block by periods, possibly upcast """
new_values = self.values.take(indexer, axis=1)
# convert integer to float if necessary; need to do a lot more than
# that (handle booleans etc. as well)
new_values, fill_value = com._maybe_upcast(new_values)
if periods > 0:
new_values[:, :periods] = fill_value
else:
new_values[:, periods:] = fill_value
return make_block(new_values, self.items, self.ref_items)
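Outside of this module, the shift-and-fill pattern can be sketched with plain NumPy: take along the last axis with a rolled indexer, upcast so NaN is representable, then overwrite the wrapped-around slots (`shift_sketch` is a hypothetical name, not this module's API):

```python
import numpy as np

def shift_sketch(values, periods):
    # roll the column indexer, then blank out the wrapped-around positions
    indexer = np.roll(np.arange(values.shape[1]), periods)
    out = values.take(indexer, axis=1).astype('float64')  # upcast so NaN fits
    if periods > 0:
        out[:, :periods] = np.nan
    else:
        out[:, periods:] = np.nan
    return out

print(shift_sketch(np.array([[1, 2, 3]]), 1))  # [[nan  1.  2.]]
```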
def eval(self, func, other, raise_on_error = True, try_cast = False):
"""
evaluate the block; return result block from the result
Parameters
----------
func : how to combine self, other
other : a ndarray/object
raise_on_error : if True (the default), raise when the function cannot be
performed on the block values; otherwise just return the data that we had coming in
Returns
-------
a new block, the result of the func
"""
values = self.values
# see if we can align other
if hasattr(other, 'reindex_axis'):
axis = getattr(other, '_het_axis', 0)
other = other.reindex_axis(self.items, axis=axis, copy=True).values
# make sure that we can broadcast
is_transposed = False
if hasattr(other, 'ndim') and hasattr(values, 'ndim'):
if values.ndim != other.ndim or values.shape == other.shape[::-1]:
values = values.T
is_transposed = True
values, other = self._try_coerce_args(values, other)
args = [ values, other ]
try:
result = self._try_coerce_result(func(*args))
except Exception as detail:
if raise_on_error:
raise TypeError('Could not operate [%s] with block values [%s]'
% (repr(other),str(detail)))
else:
# return the values
result = np.empty(values.shape,dtype='O')
result.fill(np.nan)
if not isinstance(result, np.ndarray):
raise TypeError('Could not compare [%s] with block values'
% repr(other))
if is_transposed:
result = result.T
# try to cast if requested
if try_cast:
result = self._try_cast_result(result)
return make_block(result, self.items, self.ref_items)
def where(self, other, cond, raise_on_error = True, try_cast = False):
"""
evaluate the block; return result block(s) from the result
Parameters
----------
other : a ndarray/object
cond : the condition to respect
raise_on_error : if True (the default), raise when the function cannot be
performed on the block values; otherwise just return the data that we had coming in
Returns
-------
a new block(s), the result of the func
"""
values = self.values
# see if we can align other
if hasattr(other,'reindex_axis'):
axis = getattr(other,'_het_axis',0)
other = other.reindex_axis(self.items, axis=axis, copy=True).values
# make sure that we can broadcast
is_transposed = False
if hasattr(other, 'ndim') and hasattr(values, 'ndim'):
if values.ndim != other.ndim or values.shape == other.shape[::-1]:
values = values.T
is_transposed = True
# see if we can align cond
if not hasattr(cond,'shape'):
raise ValueError("where must have a condition that is ndarray like")
if hasattr(cond,'reindex_axis'):
axis = getattr(cond,'_het_axis',0)
cond = cond.reindex_axis(self.items, axis=axis, copy=True).values
else:
cond = cond.values
# may need to undo transpose of values
if hasattr(values, 'ndim'):
if values.ndim != cond.ndim or values.shape == cond.shape[::-1]:
values = values.T
is_transposed = not is_transposed
# our where function
def func(c,v,o):
if c.ravel().all():
return v
v, o = self._try_coerce_args(v, o)
try:
return self._try_coerce_result(expressions.where(c, v, o, raise_on_error=True))
except Exception as detail:
if raise_on_error:
raise TypeError('Could not operate [%s] with block values [%s]'
% (repr(o),str(detail)))
else:
# return the values
result = np.empty(v.shape,dtype='float64')
result.fill(np.nan)
return result
def create_block(result, items, transpose = True):
if not isinstance(result, np.ndarray):
raise TypeError('Could not compare [%s] with block values'
% repr(other))
if transpose and is_transposed:
result = result.T
# try to cast if requested
if try_cast:
result = self._try_cast_result(result)
return make_block(result, items, self.ref_items)
# see if we can operate on the entire block, or need item-by-item
if not self._can_hold_na:
axis = cond.ndim-1
result_blocks = []
for item in self.items:
loc = self.items.get_loc(item)
item = self.items.take([loc])
v = values.take([loc],axis=axis)
c = cond.take([loc],axis=axis)
o = other.take([loc],axis=axis) if hasattr(other,'shape') else other
result = func(c,v,o)
if len(result) == 1:
result = np.repeat(result,self.shape[1:])
result = _block_shape(result,ndim=self.ndim,shape=self.shape[1:])
result_blocks.append(create_block(result, item, transpose = False))
return result_blocks
else:
result = func(cond,values,other)
return create_block(result, self.items)
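The core of the inner `func` reduces to a three-way select; with `np.where` standing in for `expressions.where` in a standalone sketch:

```python
import numpy as np

cond = np.array([True, False, True])
values = np.array([1.0, 2.0, 3.0])
other = -1.0

# keep values where cond holds, substitute other elsewhere
result = np.where(cond, values, other)
print(result.tolist())  # [1.0, -1.0, 3.0]
```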
class NumericBlock(Block):
is_numeric = True
_can_hold_na = True
def _try_cast_result(self, result):
return _possibly_downcast_to_dtype(result, self.dtype)
class FloatBlock(NumericBlock):
_downcast_dtype = 'int64'
def _can_hold_element(self, element):
if isinstance(element, np.ndarray):
return issubclass(element.dtype.type, (np.floating, np.integer))
return isinstance(element, (float, int))
def _try_cast(self, element):
try:
return float(element)
except: # pragma: no cover
return element
def to_native_types(self, slicer=None, na_rep='', float_format=None, **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:,slicer]
values = np.array(values,dtype=object)
mask = isnull(values)
values[mask] = na_rep
if float_format:
imask = (~mask).ravel()
values.flat[imask] = np.array([ float_format % val for val in values.ravel()[imask] ])
return values.tolist()
def should_store(self, value):
# when inserting a column should not coerce integers to floats
# unnecessarily
return issubclass(value.dtype.type, np.floating) and value.dtype == self.dtype
class ComplexBlock(NumericBlock):
def _can_hold_element(self, element):
return isinstance(element, complex)
def _try_cast(self, element):
try:
return complex(element)
except: # pragma: no cover
return element
def should_store(self, value):
return issubclass(value.dtype.type, np.complexfloating)
class IntBlock(NumericBlock):
_can_hold_na = False
def _can_hold_element(self, element):
if isinstance(element, np.ndarray):
return issubclass(element.dtype.type, np.integer)
return com.is_integer(element)
def _try_cast(self, element):
try:
return int(element)
except: # pragma: no cover
return element
def should_store(self, value):
return com.is_integer_dtype(value) and value.dtype == self.dtype
class BoolBlock(NumericBlock):
is_bool = True
_can_hold_na = False
def _can_hold_element(self, element):
return isinstance(element, (int, bool))
def _try_cast(self, element):
try:
return bool(element)
except: # pragma: no cover
return element
def should_store(self, value):
return issubclass(value.dtype.type, np.bool_)
class ObjectBlock(Block):
is_object = True
_can_hold_na = True
@property
def is_bool(self):
""" we can be a bool if we have only bool values but are of type object """
return lib.is_bool_array(self.values.ravel())
def convert(self, convert_dates = True, convert_numeric = True, copy = True):
""" attempt to coerce any object types to better types
return a copy of the block (if copy = True)
by definition we ARE an ObjectBlock!!!!!
can return multiple blocks!
"""
# attempt to create new type blocks
blocks = []
for i, c in enumerate(self.items):
values = self.get(c)
values = com._possibly_convert_objects(values, convert_dates=convert_dates, convert_numeric=convert_numeric)
values = _block_shape(values)
items = self.items.take([i])
newb = make_block(values, items, self.ref_items)
blocks.append(newb)
return blocks
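The per-column soft conversion relies on `com._possibly_convert_objects`; a rough standalone analogue (hypothetical `soft_convert_sketch`, plain NumPy) tries progressively wider dtypes and keeps the object dtype on failure:

```python
import numpy as np

def soft_convert_sketch(col):
    # try int64 first, then float64; keep the object dtype on failure
    for dtype in ('int64', 'float64'):
        try:
            return col.astype(dtype)
        except (TypeError, ValueError):
            continue
    return col

print(soft_convert_sketch(np.array(['1', '2'], dtype=object)).dtype)    # int64
print(soft_convert_sketch(np.array(['1.5', 'x'], dtype=object)).dtype)  # object
```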
def _can_hold_element(self, element):
return True
def _try_cast(self, element):
return element
def should_store(self, value):
return not issubclass(value.dtype.type,
(np.integer, np.floating, np.complexfloating,
np.datetime64, np.bool_))
_NS_DTYPE = np.dtype('M8[ns]')
_TD_DTYPE = np.dtype('m8[ns]')
class DatetimeBlock(Block):
_can_hold_na = True
def __init__(self, values, items, ref_items, ndim=2):
if values.dtype != _NS_DTYPE:
values = tslib.cast_to_nanoseconds(values)
Block.__init__(self, values, items, ref_items, ndim=ndim)
def _gi(self, arg):
return lib.Timestamp(self.values[arg])
def _can_hold_element(self, element):
return com.is_integer(element) or isinstance(element, datetime)
def _try_cast(self, element):
try:
return int(element)
except (TypeError, ValueError):
return element
def _try_coerce_args(self, values, other):
""" provide coercion to our input arguments
we are going to compare vs i8, so coerce to integer
values is always ndarra like, other may not be """
values = values.view('i8')
if isinstance(other, datetime):
other = lib.Timestamp(other).asm8.view('i8')
elif isnull(other):
other = tslib.iNaT
else:
other = other.view('i8')
return values, other
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
if isinstance(result, np.ndarray):
if result.dtype == 'i8':
result = tslib.array_to_datetime(result.astype(object).ravel()).reshape(result.shape)
elif isinstance(result, np.integer):
result = lib.Timestamp(result)
return result
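The i8 coercion round-trip above is just a dtype reinterpretation; a standalone NumPy check:

```python
import numpy as np

# datetime64[ns] values compare and mask through their int64 representation;
# .view('i8') is a zero-copy reinterpretation, and viewing back restores them
stamps = np.array(['2013-01-01', '2013-01-02'], dtype='M8[ns]')
as_i8 = stamps.view('i8')
roundtrip = as_i8.view('M8[ns]')
print(bool((roundtrip == stamps).all()))  # True
```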
def to_native_types(self, slicer=None, na_rep=None, **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:,slicer]
mask = isnull(values)
rvalues = np.empty(self.shape,dtype=object)
if na_rep is None:
na_rep = 'NaT'
rvalues[mask] = na_rep
imask = (~mask).ravel()
if self.dtype == 'datetime64[ns]':
rvalues.flat[imask] = np.array([ Timestamp(val)._repr_base for val in values.ravel()[imask] ],dtype=object)
elif self.dtype == 'timedelta64[ns]':
rvalues.flat[imask] = np.array([ lib.repr_timedelta64(val) for val in values.ravel()[imask] ],dtype=object)
return rvalues.tolist()
def should_store(self, value):
return issubclass(value.dtype.type, np.datetime64)
def set(self, item, value):
"""
Modify Block in-place with new item value
Returns
-------
None
"""
loc = self.items.get_loc(item)
if value.dtype != _NS_DTYPE:
value = tslib.cast_to_nanoseconds(value)
self.values[loc] = value
def get_values(self, dtype):
if dtype == object:
flat_i8 = self.values.ravel().view(np.int64)
res = tslib.ints_to_pydatetime(flat_i8)
return res.reshape(self.values.shape)
return self.values
def make_block(values, items, ref_items):
dtype = values.dtype
vtype = dtype.type
klass = None
if issubclass(vtype, np.floating):
klass = FloatBlock
elif issubclass(vtype, np.complexfloating):
klass = ComplexBlock
elif issubclass(vtype, np.datetime64):
klass = DatetimeBlock
elif issubclass(vtype, np.integer):
klass = IntBlock
elif dtype == np.bool_:
klass = BoolBlock
# try to infer a datetimeblock
if klass is None and np.prod(values.shape):
flat = values.ravel()
inferred_type = lib.infer_dtype(flat)
if inferred_type == 'datetime':
# we have an object array that has been inferred as datetime, so
# convert it
try:
values = tslib.array_to_datetime(flat).reshape(values.shape)
klass = DatetimeBlock
except: # it is already object, so leave it
pass
if klass is None:
klass = ObjectBlock
return klass(values, items, ref_items, ndim=values.ndim)
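The dispatch order in `make_block` matters (bool must not be caught by the integer check, and object is the catch-all); a standalone sketch of just the dtype dispatch, returning class names instead of constructing blocks:

```python
import numpy as np

def block_class_sketch(values):
    # mirror of the dispatch order above: float, complex, datetime64,
    # integer, bool, then object as the catch-all
    vtype = values.dtype.type
    if issubclass(vtype, np.floating):
        return 'FloatBlock'
    if issubclass(vtype, np.complexfloating):
        return 'ComplexBlock'
    if issubclass(vtype, np.datetime64):
        return 'DatetimeBlock'
    if issubclass(vtype, np.integer):   # np.bool_ is NOT an np.integer subclass
        return 'IntBlock'
    if values.dtype == np.bool_:
        return 'BoolBlock'
    return 'ObjectBlock'

print(block_class_sketch(np.array([1.5])))   # FloatBlock
print(block_class_sketch(np.array([True])))  # BoolBlock
```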
# TODO: flexible with index=None and/or items=None
class BlockManager(object):
"""
Core internal data structure to implement DataFrame
Manage a bunch of labeled 2D mixed-type ndarrays. Essentially it's a
lightweight blocked set of labeled data to be manipulated by the DataFrame
public API class
Notes
-----
This is *not* a public API class
"""
__slots__ = ['axes', 'blocks', '_known_consolidated', '_is_consolidated']
def __init__(self, blocks, axes, do_integrity_check=True):
self.axes = [_ensure_index(ax) for ax in axes]
self.blocks = blocks
ndim = len(axes)
for block in blocks:
if ndim != block.values.ndim:
raise AssertionError(('Number of Block dimensions (%d) must '
'equal number of axes (%d)')
% (block.values.ndim, ndim))
if do_integrity_check:
self._verify_integrity()
self._consolidate_check()
@classmethod
def make_empty(self):
return BlockManager([], [[], []])
def __nonzero__(self):
return True
@property
def ndim(self):
return len(self.axes)
def set_axis(self, axis, value):
cur_axis = self.axes[axis]
value = _ensure_index(value)
if len(value) != len(cur_axis):
raise Exception('Length mismatch (%d vs %d)'
% (len(value), len(cur_axis)))
self.axes[axis] = value
if axis == 0:
for block in self.blocks:
block.set_ref_items(self.items, maybe_rename=True)
# make items read only for now
def _get_items(self):
return self.axes[0]
items = property(fget=_get_items)
def get_dtype_counts(self):
""" return a dict of the counts of dtypes in BlockManager """
self._consolidate_inplace()
counts = dict()
for b in self.blocks:
counts[b.dtype.name] = counts.get(b.dtype.name, 0) + b.shape[0]
return counts
def __getstate__(self):
block_values = [b.values for b in self.blocks]
block_items = [b.items for b in self.blocks]
axes_array = [ax for ax in self.axes]
return axes_array, block_values, block_items
def __setstate__(self, state):
# discard anything after 3rd, support beta pickling format for a little
# while longer
ax_arrays, bvalues, bitems = state[:3]
self.axes = [_ensure_index(ax) for ax in ax_arrays]
self.axes = _handle_legacy_indexes(self.axes)
self._is_consolidated = False
self._known_consolidated = False
blocks = []
for values, items in zip(bvalues, bitems):
blk = make_block(values, items, self.axes[0])
blocks.append(blk)
self.blocks = blocks
def __len__(self):
return len(self.items)
def __repr__(self):
output = 'BlockManager'
for i, ax in enumerate(self.axes):
if i == 0:
output += '\nItems: %s' % ax
else:
output += '\nAxis %d: %s' % (i, ax)
for block in self.blocks:
output += '\n%s' % repr(block)
return output
@property
def shape(self):
return tuple(len(ax) for ax in self.axes)
def _verify_integrity(self):
mgr_shape = self.shape
tot_items = sum(len(x.items) for x in self.blocks)
for block in self.blocks:
if block.ref_items is not self.items:
raise AssertionError("Block ref_items must be BlockManager "
"items")
if block.values.shape[1:] != mgr_shape[1:]:
construction_error(tot_items,block.values.shape[1:],self.axes)
if len(self.items) != tot_items:
raise AssertionError('Number of manager items must equal union of '
'block items')
def apply(self, f, *args, **kwargs):
""" iterate over the blocks, collect and create a new block manager
Parameters
----------
f : the callable or function name to operate on at the block level
axes : optional (if not supplied, use self.axes)
filter : list, if supplied, only apply to blocks containing at least one of the filter items
"""
axes = kwargs.pop('axes',None)
filter = kwargs.get('filter')
result_blocks = []
for blk in self.blocks:
if filter is not None:
kwargs['filter'] = set(kwargs['filter'])
if not blk.items.isin(filter).any():
result_blocks.append(blk)
continue
if callable(f):
applied = f(blk, *args, **kwargs)
else:
applied = getattr(blk,f)(*args, **kwargs)
if isinstance(applied,list):
result_blocks.extend(applied)
else:
result_blocks.append(applied)
bm = self.__class__(result_blocks, axes or self.axes)
bm._consolidate_inplace()
return bm
def where(self, *args, **kwargs):
return self.apply('where', *args, **kwargs)
def eval(self, *args, **kwargs):
return self.apply('eval', *args, **kwargs)
def putmask(self, *args, **kwargs):
return self.apply('putmask', *args, **kwargs)
def diff(self, *args, **kwargs):
return self.apply('diff', *args, **kwargs)
def interpolate(self, *args, **kwargs):
return self.apply('interpolate', *args, **kwargs)
def shift(self, *args, **kwargs):
return self.apply('shift', *args, **kwargs)
def fillna(self, *args, **kwargs):
return self.apply('fillna', *args, **kwargs)
def downcast(self, *args, **kwargs):
return self.apply('downcast', *args, **kwargs)
def astype(self, *args, **kwargs):
return self.apply('astype', *args, **kwargs)
def convert(self, *args, **kwargs):
return self.apply('convert', *args, **kwargs)
def replace(self, *args, **kwargs):
return self.apply('replace', *args, **kwargs)
def replace_list(self, src_lst, dest_lst, inplace=False):
""" do a list replace """
# figure out our mask a-priori to avoid repeated replacements
values = self.as_matrix()
def comp(s):
if isnull(s):
return isnull(values)
return values == s
masks = [ comp(s) for i, s in enumerate(src_lst) ]
result_blocks = []
for blk in self.blocks:
# it's possible to get multiple result blocks here;
# replace ALWAYS returns a list
rb = [ blk if inplace else blk.copy() ]
for i, d in enumerate(dest_lst):
new_rb = []
for b in rb:
# get our mask for this element, sized to this
# particular block
m = masks[i][b.ref_locs]
if m.any():
new_rb.extend(b.putmask(m, d, inplace=True))
else:
new_rb.append(b)
rb = new_rb
result_blocks.extend(rb)
bm = self.__class__(result_blocks, self.axes)
bm._consolidate_inplace()
return bm
def is_consolidated(self):
"""
Return True if no two blocks share the same dtype
"""
if not self._known_consolidated:
self._consolidate_check()
return self._is_consolidated
def _consolidate_check(self):
dtypes = [blk.dtype.type for blk in self.blocks]
self._is_consolidated = len(dtypes) == len(set(dtypes))
self._known_consolidated = True
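Stated standalone, the check above is just a duplicate test on the dtype list:

```python
import numpy as np

# consolidation check: a manager is consolidated when no dtype appears in
# more than one block, i.e. the dtype list has no duplicates
block_dtypes = [np.dtype('float64'), np.dtype('int64'), np.dtype('float64')]
is_consolidated = len(block_dtypes) == len(set(block_dtypes))
print(is_consolidated)  # False: the two float64 blocks should be merged
```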
@property
def is_mixed_type(self):
self._consolidate_inplace()
return len(self.blocks) > 1
@property
def is_numeric_mixed_type(self):
self._consolidate_inplace()
return all([ block.is_numeric for block in self.blocks ])
def get_numeric_data(self, copy=False, type_list=None, as_blocks = False):
"""
Parameters
----------
copy : boolean, default False
Whether to copy the blocks
type_list : tuple of type, default None
Numeric types by default (Float/Complex/Int but not Datetime)
"""
if type_list is None:
filter_blocks = lambda block: block.is_numeric
else:
type_list = self._get_clean_block_types(type_list)
filter_blocks = lambda block: isinstance(block, type_list)
maybe_copy = lambda b: b.copy() if copy else b
num_blocks = [maybe_copy(b) for b in self.blocks if filter_blocks(b)]
if as_blocks:
return num_blocks
if len(num_blocks) == 0:
return BlockManager.make_empty()
indexer = np.sort(np.concatenate([b.ref_locs for b in num_blocks]))
new_items = self.items.take(indexer)
new_blocks = []
for b in num_blocks:
b = b.copy(deep=False)
b.ref_items = new_items
new_blocks.append(b)
new_axes = list(self.axes)
new_axes[0] = new_items
return BlockManager(new_blocks, new_axes, do_integrity_check=False)
def _get_clean_block_types(self, type_list):
if not isinstance(type_list, tuple):
try:
type_list = tuple(type_list)
except TypeError:
type_list = (type_list,)
type_map = {int: IntBlock, float: FloatBlock,
complex: ComplexBlock,
np.datetime64: DatetimeBlock,
datetime: DatetimeBlock,
bool: BoolBlock,
object: ObjectBlock}
type_list = tuple([type_map.get(t, t) for t in type_list])
return type_list
def get_bool_data(self, copy=False, as_blocks=False):
return self.get_numeric_data(copy=copy, type_list=(BoolBlock,),
as_blocks=as_blocks)
def get_slice(self, slobj, axis=0, raise_on_error=False):
new_axes = list(self.axes)
if raise_on_error:
_check_slice_bounds(slobj, new_axes[axis])
new_axes[axis] = new_axes[axis][slobj]
if axis == 0:
new_items = new_axes[0]
if len(self.blocks) == 1:
blk = self.blocks[0]
newb = make_block(blk.values[slobj], new_items,
new_items)
new_blocks = [newb]
else:
return self.reindex_items(new_items)
else:
new_blocks = self._slice_blocks(slobj, axis)
return BlockManager(new_blocks, new_axes, do_integrity_check=False)
def _slice_blocks(self, slobj, axis):
new_blocks = []
slicer = [slice(None, None) for _ in range(self.ndim)]
slicer[axis] = slobj
slicer = tuple(slicer)
for block in self.blocks:
newb = make_block(block.values[slicer], block.items,
block.ref_items)
new_blocks.append(newb)
return new_blocks
def get_series_dict(self):
# For DataFrame
return _blocks_to_series_dict(self.blocks, self.axes[1])
def __contains__(self, item):
return item in self.items
@property
def nblocks(self):
return len(self.blocks)
def copy(self, deep=True):
"""
Make deep or shallow copy of BlockManager
Parameters
----------
deep : boolean, default True
If False, return shallow copy (do not copy data)
Returns
-------
copy : BlockManager
"""
copy_blocks = [block.copy(deep=deep) for block in self.blocks]
# copy_axes = [ax.copy() for ax in self.axes]
copy_axes = list(self.axes)
return BlockManager(copy_blocks, copy_axes, do_integrity_check=False)
def as_matrix(self, items=None):
if len(self.blocks) == 0:
mat = np.empty(self.shape, dtype=float)
elif len(self.blocks) == 1:
blk = self.blocks[0]
if items is None or blk.items.equals(items):
# if not, then just call interleave per below
mat = blk.values
else:
mat = self.reindex_items(items).as_matrix()
else:
if items is None:
mat = self._interleave(self.items)
else:
mat = self.reindex_items(items).as_matrix()
return mat
def _interleave(self, items):
"""
Return ndarray from blocks with specified item order
Items must be contained in the blocks
"""
dtype = _interleaved_dtype(self.blocks)
items = _ensure_index(items)
result = np.empty(self.shape, dtype=dtype)
itemmask = np.zeros(len(items), dtype=bool)
# By construction, all of the items should be covered by one of the
# blocks
if items.is_unique:
for block in self.blocks:
indexer = items.get_indexer(block.items)
if (indexer == -1).any():
raise AssertionError('Items must contain all block items')
result[indexer] = block.get_values(dtype)
itemmask[indexer] = 1
else:
for block in self.blocks:
mask = items.isin(block.items)
indexer = mask.nonzero()[0]
if (len(indexer) != len(block.items)):
raise AssertionError('All items must be in block items')
result[indexer] = block.get_values(dtype)
itemmask[indexer] = 1
if not itemmask.all():
raise AssertionError('Some items were not contained in blocks')
return result
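The scatter step of `_interleave` can be sketched with plain dicts and NumPy (the block layout here is a stand-in, not this module's Block type): each block's rows land at the positions its items occupy in the overall item index, with values cast to a common upcast dtype.

```python
import numpy as np

items = ['a', 'b', 'c']
float_block = {'items': ['a', 'c'], 'values': np.array([[1.0, 2.0], [5.0, 6.0]])}
int_block = {'items': ['b'], 'values': np.array([[3, 4]])}

result = np.empty((3, 2), dtype='float64')  # common upcast dtype for float + int
for blk in (float_block, int_block):
    # scatter this block's rows into their item positions
    indexer = [items.index(it) for it in blk['items']]
    result[indexer] = blk['values']

print(result.tolist())  # [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
```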
def xs(self, key, axis=1, copy=True):
if axis < 1:
raise AssertionError('Can only take xs across axis >= 1, got %d'
% axis)
loc = self.axes[axis].get_loc(key)
slicer = [slice(None, None) for _ in range(self.ndim)]
slicer[axis] = loc
slicer = tuple(slicer)
new_axes = list(self.axes)
# could be an array indexer!
if isinstance(loc, (slice, np.ndarray)):
new_axes[axis] = new_axes[axis][loc]
else:
new_axes.pop(axis)
new_blocks = []
if len(self.blocks) > 1:
if not copy:
raise Exception('cannot get view of mixed-type or '
'non-consolidated DataFrame')
for blk in self.blocks:
newb = make_block(blk.values[slicer], blk.items, blk.ref_items)
new_blocks.append(newb)
elif len(self.blocks) == 1:
vals = self.blocks[0].values[slicer]
if copy:
vals = vals.copy()
new_blocks = [make_block(vals, self.items, self.items)]
return BlockManager(new_blocks, new_axes)
def fast_2d_xs(self, loc, copy=False):
"""
"""
if len(self.blocks) == 1:
result = self.blocks[0].values[:, loc]
if copy:
result = result.copy()
return result
if not copy:
raise Exception('cannot get view of mixed-type or '
'non-consolidated DataFrame')
dtype = _interleaved_dtype(self.blocks)
items = self.items
n = len(items)
result = np.empty(n, dtype=dtype)
for blk in self.blocks:
for j, item in enumerate(blk.items):
i = items.get_loc(item)
result[i] = blk._gi((j, loc))
return result
def consolidate(self):
"""
Join together blocks having same dtype
Returns
-------
y : BlockManager
"""
if self.is_consolidated():
return self
new_blocks = _consolidate(self.blocks, self.items)
return BlockManager(new_blocks, self.axes)
def _consolidate_inplace(self):
if not self.is_consolidated():
self.blocks = _consolidate(self.blocks, self.items)
self._is_consolidated = True
self._known_consolidated = True
def get(self, item):
_, block = self._find_block(item)
return block.get(item)
def iget(self, i):
item = self.items[i]
if self.items.is_unique:
return self.get(item)
else:
# ugh
try:
inds, = (self.items == item).nonzero()
except AttributeError: # MultiIndex
inds, = self.items.map(lambda x: x == item).nonzero()
_, block = self._find_block(item)
try:
binds, = (block.items == item).nonzero()
except AttributeError: # MultiIndex
binds, = block.items.map(lambda x: x == item).nonzero()
for j, (k, b) in enumerate(zip(inds, binds)):
if i == k:
return block.values[b]
raise Exception('Cannot have duplicate column names '
'split across dtypes')
def get_scalar(self, tup):
"""
Retrieve single item
"""
item = tup[0]
_, blk = self._find_block(item)
# this could obviously be seriously sped up in cython
item_loc = blk.items.get_loc(item),  # trailing comma: make a 1-tuple
full_loc = item_loc + tuple(ax.get_loc(x)
for ax, x in zip(self.axes[1:], tup[1:]))
return blk.values[full_loc]
def delete(self, item):
i, _ = self._find_block(item)
loc = self.items.get_loc(item)
self._delete_from_block(i, item)
if com._is_bool_indexer(loc): # dupe keys may return mask
loc = [i for i, v in enumerate(loc) if v]
new_items = self.items.delete(loc)
self.set_items_norename(new_items)
self._known_consolidated = False
def set(self, item, value):
"""
Set new item in-place. Does not consolidate. Adds new Block if not
contained in the current set of items
"""
value = _block_shape(value,self.ndim-1)
if value.shape[1:] != self.shape[1:]:
raise AssertionError('Shape of new values must be compatible '
'with manager shape')
def _set_item(item, arr):
i, block = self._find_block(item)
if not block.should_store(value):
# delete from block, create and append new block
self._delete_from_block(i, item)
self._add_new_block(item, arr, loc=None)
else:
block.set(item, arr)
try:
loc = self.items.get_loc(item)
if isinstance(loc, int):
_set_item(self.items[loc], value)
else:
subset = self.items[loc]
if len(value) != len(subset):
raise AssertionError(
'Number of items to set did not match')
for i, (item, arr) in enumerate(zip(subset, value)):
_set_item(item, arr[None, :])
except KeyError:
# insert at end
self.insert(len(self.items), item, value)
self._known_consolidated = False
def insert(self, loc, item, value):
if item in self.items:
raise Exception('cannot insert %s, already exists' % item)
try:
new_items = self.items.insert(loc, item)
self.set_items_norename(new_items)
# new block
self._add_new_block(item, value, loc=loc)
except:
# so our insertion operation failed, so back out of the new items
# GH 3010
new_items = self.items.delete(loc)
self.set_items_norename(new_items)
# re-raise
raise
if len(self.blocks) > 100:
self._consolidate_inplace()
self._known_consolidated = False
def set_items_norename(self, value):
value = _ensure_index(value)
self.axes[0] = value
for block in self.blocks:
block.set_ref_items(value, maybe_rename=False)
def _delete_from_block(self, i, item):
"""
Delete and maybe remove the whole block
"""
block = self.blocks.pop(i)
for b in block.split_block_at(item):
self.blocks.append(b)
def _add_new_block(self, item, value, loc=None):
# Do we care about dtype at the moment?
# hm, elaborate hack?
if loc is None:
loc = self.items.get_loc(item)
new_block = make_block(value, self.items[loc:loc + 1].copy(),
self.items)
self.blocks.append(new_block)
def _find_block(self, item):
self._check_have(item)
for i, block in enumerate(self.blocks):
if item in block:
return i, block
def _check_have(self, item):
if item not in self.items:
raise KeyError('no item named %s' % com.pprint_thing(item))
def reindex_axis(self, new_axis, method=None, axis=0, copy=True):
new_axis = _ensure_index(new_axis)
cur_axis = self.axes[axis]
if new_axis.equals(cur_axis):
if copy:
result = self.copy(deep=True)
result.axes[axis] = new_axis
if axis == 0:
# patch ref_items, #1823
for blk in result.blocks:
blk.ref_items = new_axis
return result
else:
return self
if axis == 0:
if method is not None:
raise AssertionError('method argument not supported for '
'axis == 0')
return self.reindex_items(new_axis)
new_axis, indexer = cur_axis.reindex(new_axis, method)
return self.reindex_indexer(new_axis, indexer, axis=axis)
def reindex_indexer(self, new_axis, indexer, axis=1, fill_value=np.nan):
"""
pandas-indexer with -1's only.
"""
if axis == 0:
return self._reindex_indexer_items(new_axis, indexer, fill_value)
new_blocks = []
for block in self.blocks:
newb = block.reindex_axis(indexer, axis=axis, fill_value=fill_value)
new_blocks.append(newb)
new_axes = list(self.axes)
new_axes[axis] = new_axis
return BlockManager(new_blocks, new_axes)
def _reindex_indexer_items(self, new_items, indexer, fill_value):
# TODO: less efficient than I'd like
item_order = com.take_1d(self.items.values, indexer)
# keep track of what items aren't found anywhere
mask = np.zeros(len(item_order), dtype=bool)
new_blocks = []
for blk in self.blocks:
blk_indexer = blk.items.get_indexer(item_order)
selector = blk_indexer != -1
# update with observed items
mask |= selector
if not selector.any():
continue
new_block_items = new_items.take(selector.nonzero()[0])
new_values = com.take_nd(blk.values, blk_indexer[selector], axis=0,
allow_fill=False)
new_blocks.append(make_block(new_values, new_block_items,
new_items))
if not mask.all():
na_items = new_items[-mask]
na_block = self._make_na_block(na_items, new_items,
fill_value=fill_value)
new_blocks.append(na_block)
new_blocks = _consolidate(new_blocks, new_items)
return BlockManager(new_blocks, [new_items] + self.axes[1:])
def reindex_items(self, new_items, copy=True, fill_value=np.nan):
"""
"""
new_items = _ensure_index(new_items)
data = self
if not data.is_consolidated():
data = data.consolidate()
return data.reindex_items(new_items)
# TODO: this part could be faster (!)
new_items, indexer = self.items.reindex(new_items)
# could have some pathological (MultiIndex) issues here
new_blocks = []
if indexer is None:
for blk in self.blocks:
if copy:
new_blocks.append(blk.reindex_items_from(new_items))
else:
blk.ref_items = new_items
new_blocks.append(blk)
else:
for block in self.blocks:
newb = block.reindex_items_from(new_items, copy=copy)
if len(newb.items) > 0:
new_blocks.append(newb)
mask = indexer == -1
if mask.any():
extra_items = new_items[mask]
na_block = self._make_na_block(extra_items, new_items,
fill_value=fill_value)
new_blocks.append(na_block)
new_blocks = _consolidate(new_blocks, new_items)
return BlockManager(new_blocks, [new_items] + self.axes[1:])
def _make_na_block(self, items, ref_items, fill_value=np.nan):
# TODO: infer dtypes other than float64 from fill_value
block_shape = list(self.shape)
block_shape[0] = len(items)
dtype, fill_value = com._infer_dtype_from_scalar(fill_value)
block_values = np.empty(block_shape, dtype=dtype)
block_values.fill(fill_value)
na_block = make_block(block_values, items, ref_items)
return na_block
def take(self, indexer, axis=1, verify=True):
if axis < 1:
raise AssertionError('axis must be at least 1, got %d' % axis)
indexer = com._ensure_platform_int(indexer)
n = len(self.axes[axis])
if verify:
indexer = _maybe_convert_indices(indexer, n)
if ((indexer == -1) | (indexer >= n)).any():
raise Exception('Indices must be nonzero and less than '
'the axis length')
new_axes = list(self.axes)
new_axes[axis] = self.axes[axis].take(indexer)
new_blocks = []
for blk in self.blocks:
new_values = com.take_nd(blk.values, indexer, axis=axis,
allow_fill=False)
newb = make_block(new_values, blk.items, self.items)
new_blocks.append(newb)
return BlockManager(new_blocks, new_axes)
def merge(self, other, lsuffix=None, rsuffix=None):
if not self._is_indexed_like(other):
raise AssertionError('Must have same axes to merge managers')
this, other = self._maybe_rename_join(other, lsuffix, rsuffix)
cons_items = this.items + other.items
consolidated = _consolidate(this.blocks + other.blocks, cons_items)
new_axes = list(this.axes)
new_axes[0] = cons_items
return BlockManager(consolidated, new_axes)
def _maybe_rename_join(self, other, lsuffix, rsuffix, copydata=True):
to_rename = self.items.intersection(other.items)
if len(to_rename) > 0:
if not lsuffix and not rsuffix:
raise Exception('columns overlap: %s' % to_rename)
def lrenamer(x):
if x in to_rename:
return '%s%s' % (x, lsuffix)
return x
def rrenamer(x):
if x in to_rename:
return '%s%s' % (x, rsuffix)
return x
this = self.rename_items(lrenamer, copydata=copydata)
other = other.rename_items(rrenamer, copydata=copydata)
else:
this = self
return this, other
def _is_indexed_like(self, other):
"""
Check all axes except items
"""
if self.ndim != other.ndim:
raise AssertionError(('Number of dimensions must agree '
'got %d and %d') % (self.ndim, other.ndim))
for ax, oax in zip(self.axes[1:], other.axes[1:]):
if not ax.equals(oax):
return False
return True
def rename_axis(self, mapper, axis=1):
index = self.axes[axis]
if isinstance(index, MultiIndex):
new_axis = MultiIndex.from_tuples([tuple(mapper(y) for y in x) for x in index], names=index.names)
else:
new_axis = Index([mapper(x) for x in index], name=index.name)
if not new_axis.is_unique:
raise AssertionError('New axis must be unique to rename')
new_axes = list(self.axes)
new_axes[axis] = new_axis
return BlockManager(self.blocks, new_axes)
def rename_items(self, mapper, copydata=True):
new_items = Index([mapper(x) for x in self.items])
if not new_items.is_unique:
raise AssertionError('New items must be unique')
new_blocks = []
for block in self.blocks:
newb = block.copy(deep=copydata)
newb.set_ref_items(new_items, maybe_rename=True)
new_blocks.append(newb)
new_axes = list(self.axes)
new_axes[0] = new_items
return BlockManager(new_blocks, new_axes)
def add_prefix(self, prefix):
f = (('%s' % prefix) + '%s').__mod__
return self.rename_items(f)
def add_suffix(self, suffix):
f = ('%s' + ('%s' % suffix)).__mod__
return self.rename_items(f)
@property
def block_id_vector(self):
# TODO
result = np.empty(len(self.items), dtype=int)
result.fill(-1)
for i, blk in enumerate(self.blocks):
indexer = self.items.get_indexer(blk.items)
if (indexer == -1).any():
raise AssertionError('Block items must be in manager items')
result.put(indexer, i)
if (result < 0).any():
raise AssertionError('Some items were not in any block')
return result
@property
def item_dtypes(self):
result = np.empty(len(self.items), dtype='O')
mask = np.zeros(len(self.items), dtype=bool)
for i, blk in enumerate(self.blocks):
indexer = self.items.get_indexer(blk.items)
result.put(indexer, blk.values.dtype.name)
mask.put(indexer, 1)
if not (mask.all()):
raise AssertionError('Some items were not in any block')
return result
def construction_error(tot_items, block_shape, axes):
""" raise a helpful message about our construction """
raise ValueError("Shape of passed values is %s, indices imply %s" % (
tuple(map(int, [tot_items] + list(block_shape))),
tuple(map(int, [len(ax) for ax in axes]))))
def create_block_manager_from_blocks(blocks, axes):
try:
# if we are passed values, make the blocks
if len(blocks) == 1 and not isinstance(blocks[0], Block):
blocks = [ make_block(blocks[0], axes[0], axes[0]) ]
mgr = BlockManager(blocks, axes)
mgr._consolidate_inplace()
return mgr
except (ValueError):
blocks = [ getattr(b,'values',b) for b in blocks ]
tot_items = sum(b.shape[0] for b in blocks)
construction_error(tot_items,blocks[0].shape[1:],axes)
def create_block_manager_from_arrays(arrays, names, axes):
try:
blocks = form_blocks(arrays, names, axes)
mgr = BlockManager(blocks, axes)
mgr._consolidate_inplace()
return mgr
except (ValueError):
construction_error(len(arrays),arrays[0].shape[1:],axes)
def form_blocks(arrays, names, axes):
# pre-filter out items if we passed it
items = axes[0]
if len(arrays) < len(items):
extra_items = items - Index(names)
else:
extra_items = []
# put "leftover" items in float bucket, where else?
# generalize?
float_items = []
complex_items = []
int_items = []
bool_items = []
object_items = []
datetime_items = []
for k, v in zip(names, arrays):
if issubclass(v.dtype.type, np.floating):
float_items.append((k, v))
elif issubclass(v.dtype.type, np.complexfloating):
complex_items.append((k, v))
elif issubclass(v.dtype.type, np.datetime64):
if v.dtype != _NS_DTYPE:
v = tslib.cast_to_nanoseconds(v)
if hasattr(v, 'tz') and v.tz is not None:
object_items.append((k, v))
else:
datetime_items.append((k, v))
elif issubclass(v.dtype.type, np.integer):
if v.dtype == np.uint64:
# HACK #2355 definite overflow
if (v > 2 ** 63 - 1).any():
object_items.append((k, v))
continue
int_items.append((k, v))
elif v.dtype == np.bool_:
bool_items.append((k, v))
else:
object_items.append((k, v))
blocks = []
if len(float_items):
float_blocks = _multi_blockify(float_items, items)
blocks.extend(float_blocks)
if len(complex_items):
complex_blocks = _simple_blockify(complex_items, items, np.complex128)
blocks.extend(complex_blocks)
if len(int_items):
int_blocks = _multi_blockify(int_items, items)
blocks.extend(int_blocks)
if len(datetime_items):
datetime_blocks = _simple_blockify(datetime_items, items, _NS_DTYPE)
blocks.extend(datetime_blocks)
if len(bool_items):
bool_blocks = _simple_blockify(bool_items, items, np.bool_)
blocks.extend(bool_blocks)
if len(object_items) > 0:
object_blocks = _simple_blockify(object_items, items, np.object_)
blocks.extend(object_blocks)
if len(extra_items):
shape = (len(extra_items),) + tuple(len(x) for x in axes[1:])
# empty items -> dtype object
block_values = np.empty(shape, dtype=object)
block_values.fill(nan)
na_block = make_block(block_values, extra_items, items)
blocks.append(na_block)
blocks = _consolidate(blocks, items)
return blocks
def _simple_blockify(tuples, ref_items, dtype):
""" return a single array of a block that has a single dtype; if dtype is not None, coerce to this dtype """
block_items, values = _stack_arrays(tuples, ref_items, dtype)
# CHECK DTYPE?
if dtype is not None and values.dtype != dtype: # pragma: no cover
values = values.astype(dtype)
return [ make_block(values, block_items, ref_items) ]
def _multi_blockify(tuples, ref_items, dtype = None):
""" return an array of blocks that potentially have different dtypes """
# group by dtype
grouper = itertools.groupby(tuples, lambda x: x[1].dtype)
new_blocks = []
for dtype, tup_block in grouper:
block_items, values = _stack_arrays(list(tup_block), ref_items, dtype)
block = make_block(values, block_items, ref_items)
new_blocks.append(block)
return new_blocks
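A small standalone illustration of the itertools.groupby pattern used in _multi_blockify above: groupby only merges adjacent equal keys, which is why _consolidate below sorts by dtype before grouping (the data values in this sketch are illustrative):

```python
import itertools

data = [('a', 1), ('b', 2), ('a', 3)]

# Without sorting, the two 'a' runs stay separate groups.
unsorted_groups = [k for k, _ in itertools.groupby(data, key=lambda t: t[0])]

# Sorting by the grouping key first merges them, as _consolidate does
# with sorted(blocks, key=get_dtype).
data.sort(key=lambda t: t[0])
sorted_groups = [k for k, _ in itertools.groupby(data, key=lambda t: t[0])]

assert unsorted_groups == ['a', 'b', 'a']
assert sorted_groups == ['a', 'b']
```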
def _stack_arrays(tuples, ref_items, dtype):
from pandas.core.series import Series
# helpers to treat plain ndarrays and (Sparse)Series uniformly
def _asarray_compat(x):
# asarray shouldn't be called on SparseSeries
if isinstance(x, Series):
return x.values
else:
return np.asarray(x)
def _shape_compat(x):
# sparseseries
if isinstance(x, Series):
return len(x),
else:
return x.shape
names, arrays = zip(*tuples)
# index may box values
items = ref_items[ref_items.isin(names)]
first = arrays[0]
shape = (len(arrays),) + _shape_compat(first)
stacked = np.empty(shape, dtype=dtype)
for i, arr in enumerate(arrays):
stacked[i] = _asarray_compat(arr)
return items, stacked
def _blocks_to_series_dict(blocks, index=None):
from pandas.core.series import Series
series_dict = {}
for block in blocks:
for item, vec in zip(block.items, block.values):
series_dict[item] = Series(vec, index=index, name=item)
return series_dict
def _interleaved_dtype(blocks):
if not len(blocks): return None
from collections import defaultdict
counts = defaultdict(lambda: [])
for x in blocks:
counts[type(x)].append(x)
def _lcd_dtype(l):
""" find the lowest dtype that can accomodate the given types """
m = l[0].dtype
for x in l[1:]:
if x.dtype.itemsize > m.itemsize:
m = x.dtype
return m
have_int = len(counts[IntBlock]) > 0
have_bool = len(counts[BoolBlock]) > 0
have_object = len(counts[ObjectBlock]) > 0
have_float = len(counts[FloatBlock]) > 0
have_complex = len(counts[ComplexBlock]) > 0
have_dt64 = len(counts[DatetimeBlock]) > 0
have_numeric = have_float or have_complex or have_int
if (have_object or
(have_bool and have_numeric) or
(have_numeric and have_dt64)):
return np.dtype(object)
elif have_bool:
return np.dtype(bool)
elif have_int and not have_float and not have_complex:
return _lcd_dtype(counts[IntBlock])
elif have_dt64 and not have_float and not have_complex:
return np.dtype('M8[ns]')
elif have_complex:
return np.dtype('c16')
else:
return _lcd_dtype(counts[FloatBlock])
def _consolidate(blocks, items):
"""
Merge blocks having same dtype
"""
get_dtype = lambda x: x.dtype.name
# sort by dtype
grouper = itertools.groupby(sorted(blocks, key=get_dtype),
lambda x: x.dtype)
new_blocks = []
for dtype, group_blocks in grouper:
new_block = _merge_blocks(list(group_blocks), items, dtype)
new_blocks.append(new_block)
return new_blocks
def _merge_blocks(blocks, items, dtype=None):
if len(blocks) == 1:
return blocks[0]
if dtype is None:
if len(set([ b.dtype for b in blocks ])) != 1:
raise AssertionError("_merge_blocks are invalid!")
dtype = blocks[0].dtype
new_values = _vstack([ b.values for b in blocks ], dtype)
new_items = blocks[0].items.append([b.items for b in blocks[1:]])
new_block = make_block(new_values, new_items, items)
return new_block.reindex_items_from(items)
def _block_shape(values, ndim=1, shape=None):
""" guarantee the shape of the values to be at least 1 d """
if values.ndim == ndim:
if shape is None:
shape = values.shape
values = values.reshape(tuple((1,) + shape))
return values
def _vstack(to_stack, dtype):
# work around NumPy 1.6 bug
if dtype == _NS_DTYPE or dtype == _TD_DTYPE:
new_values = np.vstack([x.view('i8') for x in to_stack])
return new_values.view(dtype)
else:
return np.vstack(to_stack)
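The NumPy 1.6 workaround in _vstack can be exercised on its own. This sketch (assuming NumPy is available) shows the same round trip: view the datetime64 data as int64, stack, then view back:

```python
import numpy as np

a = np.array(['2012-01-01', '2012-01-02'], dtype='M8[ns]')
b = np.array(['2012-02-01', '2012-02-02'], dtype='M8[ns]')

# View as int64, stack, view back -- the trick _vstack uses for
# _NS_DTYPE/_TD_DTYPE blocks to sidestep the NumPy 1.6 vstack bug.
stacked = np.vstack([x.view('i8') for x in (a, b)]).view('M8[ns]')

assert stacked.shape == (2, 2)
assert stacked[0, 0] == a[0]
```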
# Sketches/MH/PipeBuilder/BuildViewer.py
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Copyright 2010 British Broadcasting Corporation and Kamaelia Contributors(1)
#
# (1) Kamaelia Contributors are listed in the AUTHORS file and at
# http://www.kamaelia.org/AUTHORS - please extend this file,
# not this notice.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# -------------------------------------------------------------------------
#
# Simple control window for a looping audio player
import pygame
from Axon.Ipc import producerFinished, shutdownMicroprocess
from Kamaelia.Visualisation.PhysicsGraph.TopologyViewerComponent import TopologyViewerComponent
from Kamaelia.Physics.Simple import SimpleLaws, Particle
import time
class ComponentParticle(Particle):
"""Version of Physics.Particle designed to represent components in a simple pipeline"""
def __init__(self, ID, position, name):
super(ComponentParticle,self).__init__(position=position, ID = ID )
self.radius = 20
self.labelText = name # strip up to the first pipe only
self.name = name
font = pygame.font.Font(None, 24)
self.label = font.render(self.labelText, False, (0,0,0))
self.left = 0
self.top = 0
self.selected = False
def render(self, surface):
"""Rendering passes. A generator method that renders in multiple passes.
Use yields to specify a wait until the pass the next stage of rendering
should take place at.
Example, that renders bonds 'behind' the blobs.
def render(self, surface):
yield 1
self.renderBonds(surface) # render bonds on pass 1
yield 5
self.renderSelf(surface) # render 'blob' on pass 5
If another particle type rendered, for example, on pass 3, then it
would be rendered on top of the bonds, but behind the blobs.
Use this mechanism to order rendering into layers.
"""
sx = int(self.pos[0]) - self.left
sy = int(self.pos[1]) - self.top
yield 1
phase = (time.time()*4) % 2.0
off = phase > 1.0
phase = phase % 1.0
for p in self.bondedTo:
ex = int(p.pos[0] -self.left)
ey = int(p.pos[1] - self.top)
# 'make a crawling dotted line' appearance, to give an animated indication
# directionality of the link
dx = ex-sx
dy = ey-sy
length = (dx*dx + dy*dy)**0.5
dx = dx/length
dy = dy/length
p=0
while p<length:
newp = min(length, p+ phase * 10.0 )
phase = 1.0
if not off:
pygame.draw.line( surface,
(128,128,255),
(sx+dx*p,sy+dy*p),
(sx+dx*newp,sy+dy*newp)
)
off = not off
p=newp
yield 2
if self.selected:
pygame.draw.circle(surface, (255,255,128), (sx,sy), self.radius)
else:
pygame.draw.circle(surface, (192,192,192), (sx,sy), self.radius)
surface.blit(self.label, (sx - self.label.get_width()/2, sy - self.label.get_height()/2))
def setOffset( self, (left,top) ):
"""Inform of a change to the coords of the top left of the drawing surface,
so that this entity can render, as if the top left had moved
"""
self.left = left
self.top = top
def select( self ):
"""Tell this particle it is selected"""
self.selected = True
def deselect( self ):
"""Tell this particle it is selected"""
self.selected = False
def BuildViewer(screensize = (800,600), fullscreen = False, transparency = None):
laws = SimpleLaws(bondLength=100)
return TopologyViewerComponent( screensize=screensize,
fullscreen=fullscreen,
caption = "The pipeline",
particleTypes = {"component":ComponentParticle},
laws = laws
)
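The multi-pass rendering scheme described in ComponentParticle.render's docstring can be sketched in miniature; the names below are illustrative, not part of Kamaelia:

```python
def draw(name, passes):
    # Each "particle" yields (pass_number, thing_drawn) events.
    for p in passes:
        yield (p, name)

def render_all(gens):
    # Collect every event, then replay in pass order: lower pass
    # numbers are drawn first and so end up "behind" later passes.
    events = [ev for g in gens for ev in g]
    events.sort(key=lambda ev: ev[0])
    return [name for _, name in events]

layers = render_all([draw('bonds', [1]), draw('blob', [5]), draw('label', [3])])
assert layers == ['bonds', 'label', 'blob']
```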
# onnx/backend/test/case/node/constant.py
# SPDX-License-Identifier: Apache-2.0
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import numpy as np
import onnx
from ..base import Base
from . import expect
class Constant(Base):
@staticmethod
def export(): # type: () -> None
values = np.random.randn(5, 5).astype(np.float32)
node = onnx.helper.make_node(
'Constant',
inputs=[],
outputs=['values'],
value=onnx.helper.make_tensor(
name='const_tensor',
data_type=onnx.TensorProto.FLOAT,
dims=values.shape,
vals=values.flatten().astype(float),
),
)
expect(node, inputs=[], outputs=[values],
name='test_constant')
# pyvalidator/is_strong_password.py
from typing import TypedDict
from .utils.Classes.String import String
from .utils.assert_string import assert_string
from .utils.merge import merge
class _IsStrongPasswordOptions(TypedDict):
min_length: int
min_uppercase: int
min_lowercase: int
min_numbers: int
min_symbols: int
return_score: bool
points_per_unique: int
points_per_repeat: float
points_for_containing_upper: int
points_for_containing_lower: int
points_for_containing_number: int
points_for_containing_symbol: int
class _Analysis(TypedDict):
length: int
unique_chars: int
uppercase_count: int
lowercase_count: int
number_count: int
symbol_count: int
default_options: _IsStrongPasswordOptions = {
"min_length": 8,
"min_uppercase": 1,
"min_lowercase": 1,
"min_numbers": 1,
"min_symbols": 1,
"return_score": False,
"points_per_unique": 1,
"points_per_repeat": 0.5,
"points_for_containing_lower": 10,
"points_for_containing_upper": 10,
"points_for_containing_number": 10,
"points_for_containing_symbol": 10,
}
def count_chars(pw: String):
result = {}
for char in pw:
if char in result:
result[char] += 1
else:
result[char] = 1
return result
def analyze_password(pw: String) -> _Analysis:
upper_case_regex = r"^[A-Z]$"
lower_case_regex = r"^[a-z]$"
number_regex = r"^[0-9]$"
symbol_regex = r"^[-#!$@%^&*()_+|~=`{}\[\]:\";'<>?,./ ]$"
char_map = count_chars(pw)
analysis: _Analysis = {
"length": pw.length,
"unique_chars": len([*char_map]),
"uppercase_count": 0,
"lowercase_count": 0,
"number_count": 0,
"symbol_count": 0,
}
for char in [*char_map]:
char = String(char)
if char.match(upper_case_regex):
analysis["uppercase_count"] += char_map[char]
elif char.match(lower_case_regex):
analysis["lowercase_count"] += char_map[char]
elif char.match(number_regex):
analysis["number_count"] += char_map[char]
elif char.match(symbol_regex):
analysis["symbol_count"] += char_map[char]
return analysis
def score_password(analysis: _Analysis, options: _IsStrongPasswordOptions):
points = 0
points += analysis["unique_chars"] * options["points_per_unique"]
points += (analysis["length"] - analysis["unique_chars"]) * options["points_per_unique"]
if analysis["uppercase_count"] > 0:
points += options["points_for_containing_upper"]
if analysis["lowercase_count"] > 0:
points += options["points_for_containing_lower"]
if analysis["number_count"] > 0:
points += options["points_for_containing_number"]
if analysis["symbol_count"] > 0:
points += options["points_for_containing_symbol"]
return points
def is_strong_password(input: str, options: _IsStrongPasswordOptions = {}) -> bool:
input = assert_string(input)
options = merge(options, default_options)
analysis = analyze_password(input)
if options["return_score"]:
return score_password(analysis, options)
return (
analysis["length"] >= options["min_length"] and
analysis["uppercase_count"] >= options["min_uppercase"] and
analysis["lowercase_count"] >= options["min_lowercase"] and
analysis["number_count"] >= options["min_numbers"] and
analysis["symbol_count"] >= options["min_symbols"]
)
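The per-character classification performed by analyze_password can be shown standalone, without the String wrapper; classify and its category names here are illustrative:

```python
import re

def classify(pw):
    # Bucket each character using the same single-character regexes
    # as analyze_password above.
    counts = {'upper': 0, 'lower': 0, 'digit': 0, 'symbol': 0}
    for ch in pw:
        if re.match(r'^[A-Z]$', ch):
            counts['upper'] += 1
        elif re.match(r'^[a-z]$', ch):
            counts['lower'] += 1
        elif re.match(r'^[0-9]$', ch):
            counts['digit'] += 1
        else:
            counts['symbol'] += 1
    return counts

assert classify('Abc123!') == {'upper': 1, 'lower': 2, 'digit': 3, 'symbol': 1}
```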
# cxphasing/CXFileReader.py
import Image
import readMDA
import h5py
import os
import numpy
from mmpad_image import open_mmpad_tif
import numpy as np
import scipy as sp
import sys
#import libtiff
from cxparams import CXParams as CXP
class CXFileReader(object):
"""
CXFileReader
A generic and configurable file reader.
The file reader determines the file type from the extension.
For hierarchical data files a method for extracting the data must be specified.
Inputs
------
filename - the name of the file to read
h5_file_path - hdf5 files: a string describing the location of the data inside a hierarchical data format
mda_file_path - mda files: must specify whether to read a detector channel or positioner number.
For e.g. detector channel 5 mda_filepath='d5'
positioner number 2 mda_filepath='p2'
Outputs
-------
data - the 2 or 3D array read from the data file.
Example Usage:
fr = CXFileReader()
data=fr.openup('filename.h5', h5_file_path='/some/string')
data=fr.openup('filename.mda', mda_file_path='d4') for detector channel 4
"""
def __init__(self, *args, **kwargs):
self.args = args
for key in kwargs.keys():
setattr(self, key, kwargs[key])
def openup(self, filename, **kwargs):
if not os.path.isfile(filename):
CXP.log.error('{} is not a valid file'.format(filename))
sys.exit(1)
self.extension = filename.split('.')[-1].lower()
for key in kwargs.keys():
setattr(self, key, kwargs[key])
try:
action = {
'mda': self.read_mda,
'h5': self.read_h5,
'hdf5': self.read_h5,
'jpg': self.read_image,
'jpeg': self.read_image,
'png': self.read_image,
'tif': self.read_image,
'tiff': self.read_tif,
'npy': self.read_npy,
'npz': self.read_npz,
'dat': self.read_dat,
'pkl': self.read_pickle,
'mmpd': self.read_mmpad,
'pil': self.read_pilatus
}[self.extension]
except NameError:
CXP.log.error('Unknown file extension {}'.format(self.extension))
raise
return action(filename=filename)
def read_mda(self, filename=None):
if not filename:
filename = self.filename
source = self.mda_file_path[0].lower()
if source not in ['d', 'p']:
CXP.log.error("mda_file_path first character must be 'd' or 'p'")
raise
channel = self.mda_file_path[1]
if not channel.isdigit():
CXP.log.error("mda_file_path second character must be numeric.")
raise
try:
return readMDA.readMDA(filename)[2][source].data
except:
CXP.log.error('Could not extract array from mda file')
raise
def read_h5(self, filename=None, h5_file_path='/entry/instrument/detector/data'):
if not filename:
filename = self.filename
try:
h5_file_path = self.h5_file_path
except:
pass
try:
return h5py.File(filename)[h5_file_path].value
except:
CXP.log.error('Could not extract data from h5 file.')
raise
def read_image(self, filename=None):
if not filename:
filename = self.filename
try:
return sp.misc.fromimage(Image.open(filename))
except:
CXP.log.error('Unable to read data from {}'.format(filename))
raise
def read_npy(self, filename=None):
if not filename:
filename = self.filename
try:
return numpy.load(filename)
except IOError as e:
print e
CXP.log.error('Could not extract data from numpy file.')
raise
def read_npz(self, filename=None):
if not filename:
filename = self.filename
l=[]
try:
d= dict(numpy.load(filename))
# Return list in the right order
for i in range(len(d)):
l.append(d['arr_{:d}'.format(i)])
return l
except IOError:
CXP.log.error('Could not extract data from numpy file.')
raise
def read_dat(self, filename=None):
if not filename:
filename = self.filename
try:
return sp.fromfile(filename)
except:
CXP.log.error('Could not extract data from data file.')
raise
def read_pickle(self, filename=None):
if not filename:
filename = self.filename
try:
return pickle.load(open(filename, 'rb'))
except:
CXP.log.error('Could not load data from pickle')
raise
def read_mmpad(self, filename=None):
if not filename:
filename = self.filename
try:
return open_mmpad_tif(filename)
except:
CXP.log.error('Could not load data from pickle')
raise
def read_pilatus(self, filename=None):
if not filename:
filename = self.filename
try:
return sp.misc.fromimage(Image.open(filename))[:-1,:-1]
except:
CXP.log.error('Unable to read data from {}'.format(filename))
raise
def read_tif(self, filename=None):
if not filename:
filename = self.filename
try:
return libtiff.TIFF.open(filename).read_image()
except:
CXP.log.error('Unable to read data from {}'.format(filename))
raise
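The extension-to-reader dispatch table in openup can be sketched in isolation; open_any and the toy readers below are illustrative, not part of CXFileReader:

```python
def open_any(filename, readers):
    # Look up a reader callable by lowercased extension, mirroring the
    # action-table lookup in openup().
    ext = filename.split('.')[-1].lower()
    try:
        action = readers[ext]
    except KeyError:
        raise ValueError('Unknown file extension {0}'.format(ext))
    return action(filename)

readers = {'npy': lambda f: 'numpy', 'h5': lambda f: 'hdf5', 'hdf5': lambda f: 'hdf5'}
result = open_any('/data/scan.NPY', readers)
assert result == 'numpy'
```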
# pkg/tests/helpers_test.py
# Copyright 2019 The Bazel Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import tempfile
import unittest
from private import helpers
class GetFlagValueTestCase(unittest.TestCase):
def testNonStripped(self):
self.assertEqual(helpers.GetFlagValue('value ', strip=False), 'value ')
def testStripped(self):
self.assertEqual(helpers.GetFlagValue('value ', strip=True), 'value')
def testNonStripped_fromFile(self):
with tempfile.TemporaryDirectory() as temp_d:
argfile_path = os.path.join(temp_d, 'argfile')
with open(argfile_path, 'wb') as f:
f.write(b'value ')
self.assertEqual(
helpers.GetFlagValue('@'+argfile_path, strip=False), 'value ')
def testStripped_fromFile(self):
with tempfile.TemporaryDirectory() as temp_d:
argfile_path = os.path.join(temp_d, 'argfile')
with open(argfile_path, 'wb') as f:
f.write(b'value ')
self.assertEqual(
helpers.GetFlagValue('@'+argfile_path, strip=True), 'value')
class SplitNameValuePairAtSeparatorTestCase(unittest.TestCase):
def testNoSep(self):
key, val = helpers.SplitNameValuePairAtSeparator('abc', '=')
self.assertEqual(key, 'abc')
self.assertEqual(val, '')
def testNoSepWithEscape(self):
key, val = helpers.SplitNameValuePairAtSeparator('a\\=bc', '=')
self.assertEqual(key, 'a=bc')
self.assertEqual(val, '')
def testNoSepWithDanglingEscape(self):
key, val = helpers.SplitNameValuePairAtSeparator('abc\\', '=')
self.assertEqual(key, 'abc')
self.assertEqual(val, '')
def testHappyCase(self):
key, val = helpers.SplitNameValuePairAtSeparator('abc=xyz', '=')
self.assertEqual(key, 'abc')
self.assertEqual(val, 'xyz')
def testHappyCaseWithEscapes(self):
key, val = helpers.SplitNameValuePairAtSeparator('a\\=\\=b\\=c=xyz', '=')
self.assertEqual(key, 'a==b=c')
self.assertEqual(val, 'xyz')
def testStopsAtFirstSep(self):
key, val = helpers.SplitNameValuePairAtSeparator('a=b=c', '=')
self.assertEqual(key, 'a')
self.assertEqual(val, 'b=c')
def testDoesntUnescapeVal(self):
key, val = helpers.SplitNameValuePairAtSeparator('abc=x\\=yz\\', '=')
self.assertEqual(key, 'abc')
# the val doesn't get unescaped at all
self.assertEqual(val, 'x\\=yz\\')
def testUnescapesNonsepCharsToo(self):
key, val = helpers.SplitNameValuePairAtSeparator('na\\xffme=value', '=')
# this behaviour is surprising
self.assertEqual(key, 'naxffme')
self.assertEqual(val, 'value')
if __name__ == '__main__':
unittest.main()
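The tests above pin down the escape handling of helpers.SplitNameValuePairAtSeparator; a hypothetical minimal reimplementation that satisfies them (the real helper may differ) looks like:

```python
def split_pair(s, sep):
    key = []
    i = 0
    while i < len(s):
        c = s[i]
        if c == '\\':
            # Escapes are consumed while scanning the key; a dangling
            # trailing escape is simply dropped.
            if i + 1 < len(s):
                key.append(s[i + 1])
            i += 2
        elif c == sep:
            # Stop at the first unescaped separator; the value is
            # returned verbatim, without unescaping.
            return ''.join(key), s[i + 1:]
        else:
            key.append(c)
            i += 1
    return ''.join(key), ''

assert split_pair('a\\=\\=b\\=c=xyz', '=') == ('a==b=c', 'xyz')
```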
# scripts/sha3.py
#/*
# * Copyright (C) 2017 - This file is part of libecc project
# *
# * Authors:
# * Ryad BENADJILA <ryadbenadjila@gmail.com>
# * Arnaud EBALARD <arnaud.ebalard@ssi.gouv.fr>
# * Jean-Pierre FLORI <jean-pierre.flori@ssi.gouv.fr>
# *
# * Contributors:
# * Nicolas VIVET <nicolas.vivet@ssi.gouv.fr>
# * Karim KHALFALLAH <karim.khalfallah@ssi.gouv.fr>
# *
# * This software is licensed under a dual BSD and GPL v2 license.
# * See LICENSE file at the root folder of the project.
# */
import struct
keccak_rc = [
0x0000000000000001, 0x0000000000008082, 0x800000000000808A, 0x8000000080008000,
0x000000000000808B, 0x0000000080000001, 0x8000000080008081, 0x8000000000008009,
0x000000000000008A, 0x0000000000000088, 0x0000000080008009, 0x000000008000000A,
0x000000008000808B, 0x800000000000008B, 0x8000000000008089, 0x8000000000008003,
0x8000000000008002, 0x8000000000000080, 0x000000000000800A, 0x800000008000000A,
0x8000000080008081, 0x8000000000008080, 0x0000000080000001, 0x8000000080008008
]
keccak_rot = [
[ 0, 36, 3, 41, 18 ],
[ 1, 44, 10, 45, 2 ],
[ 62, 6, 43, 15, 61 ],
[ 28, 55, 25, 21, 56 ],
[ 27, 20, 39, 8, 14 ],
]
# Keccak function
def keccak_rotl(x, l):
return (((x << l) ^ (x >> (64 - l))) & (2**64-1))
def keccakround(bytestate, rc):
# Import little endian state
state = [0] * 25
for i in range(0, 25):
(state[i],) = struct.unpack('<Q', ''.join(bytestate[(8*i):(8*i)+8]))
# Proceed with the KECCAK core
bcd = [0] * 25
# Theta
for i in range(0, 5):
bcd[i] = state[i] ^ state[i + (5*1)] ^ state[i + (5*2)] ^ state[i + (5*3)] ^ state[i + (5*4)]
for i in range(0, 5):
tmp = bcd[(i+4)%5] ^ keccak_rotl(bcd[(i+1)%5], 1)
for j in range(0, 5):
state[i + (5 * j)] = state[i + (5 * j)] ^ tmp
# Rho and Pi
for i in range(0, 5):
for j in range(0, 5):
bcd[j + (5*(((2*i)+(3*j)) % 5))] = keccak_rotl(state[i + (5*j)], keccak_rot[i][j])
# Chi
for i in range(0, 5):
for j in range(0, 5):
state[i + (5*j)] = bcd[i + (5*j)] ^ (~bcd[((i+1)%5) + (5*j)] & bcd[((i+2)%5) + (5*j)])
# Iota
state[0] = state[0] ^ keccak_rc[rc]
# Pack the output state
output = [0] * (25 * 8)
for i in range(0, 25):
output[(8*i):(8*i)+1] = struct.pack('<Q', state[i])
return output
def keccakf(bytestate):
for rnd in range(0, 24):
bytestate = keccakround(bytestate, rnd)
return bytestate
# SHA-3 context class
class Sha3_ctx(object):
def __init__(self, digest_size):
self.digest_size = digest_size / 8
self.block_size = (25*8) - (2 * (digest_size / 8))
self.idx = 0
self.state = [chr(0)] * (25 * 8)
def digest_size(self):
return self.digest_size
def block_size(self):
return self.block_size
def update(self, message):
for i in range(0, len(message)):
self.state[self.idx] = chr(ord(self.state[self.idx]) ^ ord(message[i]))
self.idx = self.idx + 1
if (self.idx == self.block_size):
self.state = keccakf(self.state)
self.idx = 0
def digest(self):
self.state[self.idx] = chr(ord(self.state[self.idx]) ^ 0x06)
self.state[self.block_size - 1] = chr(ord(self.state[self.block_size - 1]) ^ 0x80)
self.state = keccakf(self.state)
return ''.join(self.state[:self.digest_size])
| 32.63 | 97 | 0.62274 | 490 | 3,263 | 4.095918 | 0.312245 | 0.053812 | 0.043847 | 0.038366 | 0.169905 | 0.128052 | 0.078226 | 0.078226 | 0.078226 | 0.078226 | 0 | 0.211013 | 0.204107 | 3,263 | 99 | 98 | 32.959596 | 0.561802 | 0.190316 | 0 | 0.19697 | 0 | 0 | 0.001528 | 0 | 0 | 0 | 0.168067 | 0 | 0 | 1 | 0.121212 | false | 0 | 0.015152 | 0.045455 | 0.242424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
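The Sha3_ctx class above is Python 2 code (its state is a list of one-character strings). As a hedged aside, not part of the original script: on Python 3 the standard library exposes SHA-3 directly, which provides a reference digest to check any port of this class against.

```python
import hashlib

# SHA3-256 of the empty message, per the FIPS 202 test vectors;
# a correctly ported Sha3_ctx(256) should produce the same bytes.
ctx = hashlib.sha3_256()
ctx.update(b"")
print(ctx.hexdigest())
# a7ffc6f8bf1ed76651c14756a061d662f580ff4de43b49fa82d80a4b80f8434a
```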
81eb4b0e294989b02c9358c7a2349765725c6844 | 970 | py | Python | app/mod_check/MySQL.py | RITC3/Hermes | 7df5cf1cbeaca949918ace9278b2d5c1138d4eac | [
"MIT"
] | 2 | 2018-03-06T03:39:00.000Z | 2018-03-06T04:31:39.000Z | app/mod_check/MySQL.py | RITC3/Hermes | 7df5cf1cbeaca949918ace9278b2d5c1138d4eac | [
"MIT"
] | 15 | 2018-01-01T20:55:22.000Z | 2018-06-09T21:37:39.000Z | app/mod_check/MySQL.py | RITC3/Hermes | 7df5cf1cbeaca949918ace9278b2d5c1138d4eac | [
"MIT"
] | null | null | null | import pymysql.cursors
from ..mod_check import app
@app.task
def check(host, port, username, password, db):
result = None
connection = None
try:
connection = pymysql.connect(host=host,
port=port,
user=username,
password=password,
db=db,
charset='utf8mb4',
autocommit=True,
cursorclass=pymysql.cursors.DictCursor)
with connection.cursor() as cursor:
cursor.execute('SELECT @@version AS version')
res = cursor.fetchone()
if isinstance(res, dict):
result = res.get('version', None)
except pymysql.Error:
result = False
finally:
if connection is not None:
connection.close()
return result
| 29.393939 | 76 | 0.464948 | 81 | 970 | 5.555556 | 0.567901 | 0.062222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003824 | 0.460825 | 970 | 32 | 77 | 30.3125 | 0.856597 | 0 | 0 | 0 | 0 | 0 | 0.042268 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0.076923 | 0.076923 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
81ee84f6b82809aa8e7f47a4b2060161548b3aab | 1,353 | py | Python | sitri/providers/contrib/ini.py | Elastoo-Team/sitri | d5470d9a37d3c944c0976793fce80a630e5625b1 | [
"MIT"
] | 11 | 2020-12-16T07:00:29.000Z | 2021-05-25T16:24:50.000Z | sitri/providers/contrib/ini.py | Elastoo-Team/sitri | d5470d9a37d3c944c0976793fce80a630e5625b1 | [
"MIT"
] | 1 | 2021-06-30T05:42:46.000Z | 2021-09-03T11:45:56.000Z | sitri/providers/contrib/ini.py | Elastoo-Team/sitri | d5470d9a37d3c944c0976793fce80a630e5625b1 | [
"MIT"
] | null | null | null | import configparser
import os
import typing
from sitri.providers.base import ConfigProvider
class IniConfigProvider(ConfigProvider):
"""Config provider for Initialization file (Ini)."""
provider_code = "ini"
def __init__(
self,
ini_path: str = "./config.ini",
):
"""
:param ini_path: path to ini file
"""
self.configparser = configparser.ConfigParser()
with open(os.path.abspath(ini_path)) as f:
self.configparser.read_file(f)
self._sections = None
@property
def sections(self):
if not self._sections:
self._sections = list(self.configparser.keys())
return self._sections
def get(self, key: str, section: str, **kwargs) -> typing.Optional[typing.Any]: # type: ignore
"""Get value from ini file.
:param key: key or path for search
:param section: section of ini file
"""
if section not in self.sections:
return None
return self.configparser[section].get(key)
def keys(self, section: str, **kwargs) -> typing.List[str]: # type: ignore
"""Get keys of section.
:param section: section of ini file
"""
if section not in self.sections:
return []
return list(self.configparser[section].keys())
| 24.6 | 99 | 0.602365 | 157 | 1,353 | 5.10828 | 0.33121 | 0.089776 | 0.049875 | 0.054863 | 0.149626 | 0.149626 | 0.149626 | 0.149626 | 0.149626 | 0.149626 | 0 | 0 | 0.291944 | 1,353 | 54 | 100 | 25.055556 | 0.837161 | 0.193644 | 0 | 0.074074 | 0 | 0 | 0.014881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148148 | false | 0 | 0.148148 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
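The provider above is a thin wrapper over the standard configparser module. A minimal standalone sketch (my illustration, not part of sitri) of the same `get` behaviour — None for a missing section, the raw value otherwise:

```python
import configparser

ini_text = """
[db]
host = localhost
port = 5432
"""

cp = configparser.ConfigParser()
cp.read_string(ini_text)

def get(key, section):
    # mirrors IniConfigProvider.get: missing section -> None,
    # missing key within an existing section -> None as well
    if section not in cp:
        return None
    return cp[section].get(key)

print(get("host", "db"))    # localhost
print(get("host", "nope"))  # None
```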
81f8d698a3ddfe36ef13f1113078ded3a3fb3cf5 | 865 | py | Python | checkov/terraform/checks/resource/aws/EKSSecretsEncryption.py | cclauss/checkov | 60a385fcaff1499cf00c2d0018575fe5ab71f556 | [
"Apache-2.0"
] | 1 | 2021-01-26T12:46:32.000Z | 2021-01-26T12:46:32.000Z | checkov/terraform/checks/resource/aws/EKSSecretsEncryption.py | cclauss/checkov | 60a385fcaff1499cf00c2d0018575fe5ab71f556 | [
"Apache-2.0"
] | 1 | 2021-06-02T02:53:31.000Z | 2021-06-02T02:53:31.000Z | checkov/terraform/checks/resource/aws/EKSSecretsEncryption.py | cclauss/checkov | 60a385fcaff1499cf00c2d0018575fe5ab71f556 | [
"Apache-2.0"
] | null | null | null | from checkov.common.models.enums import CheckResult, CheckCategories
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
class EKSSecretsEncryption(BaseResourceCheck):
def __init__(self):
name = "Ensure EKS Cluster has Secrets Encryption Enabled"
id = "CKV_AWS_58"
supported_resources = ['aws_eks_cluster']
categories = [CheckCategories.KUBERNETES]
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
def scan_resource_conf(self, conf):
if "encryption_config" in conf.keys() and "resources" in conf["encryption_config"][0] and \
"secrets" in conf["encryption_config"][0]["resources"][0]:
return CheckResult.PASSED
else:
return CheckResult.FAILED
check = EKSSecretsEncryption()
| 39.318182 | 106 | 0.713295 | 93 | 865 | 6.397849 | 0.516129 | 0.090756 | 0.053782 | 0.07395 | 0.077311 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007174 | 0.19422 | 865 | 21 | 107 | 41.190476 | 0.846485 | 0 | 0 | 0 | 0 | 0 | 0.17341 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.0625 | 0.125 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
81fd3d016f2f7329e2389892dcfbd3f365d1769d | 844 | py | Python | pip-check.py | Urucas/pip-check | 777d8208bb89f566b95885a6711c773580a9c80f | [
"MIT"
] | null | null | null | pip-check.py | Urucas/pip-check | 777d8208bb89f566b95885a6711c773580a9c80f | [
"MIT"
] | null | null | null | pip-check.py | Urucas/pip-check | 777d8208bb89f566b95885a6711c773580a9c80f | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import json
import pip
import os
import sys
def err(msg):
print "\033[31m✗ \033[0m%s" % msg
def ok(msg):
print "\033[32m✓ \033[0m%s" % msg
def main():
cwd = os.getcwd()
json_file = os.path.join(cwd, 'dependencies.json')
if os.path.isfile(json_file) == False:
err("dependencies.json not found in current folder")
sys.exit(1)
with open(json_file) as data_file:
data = json.load(data_file)
dependencies = data["dependencies"]
for lib in dependencies:
command = pip.commands.install.InstallCommand()
opts, args = command.parser.parse_args()
requirements_set = command.run(opts, [lib])
requirements_set.install(opts)
ok("Successfuly installed mising dependencies")
if __name__ == "__main__":
main()
| 23.444444 | 60 | 0.640995 | 117 | 844 | 4.504274 | 0.538462 | 0.045541 | 0.041746 | 0.034156 | 0.045541 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030488 | 0.222749 | 844 | 35 | 61 | 24.114286 | 0.769817 | 0.049763 | 0 | 0 | 0 | 0 | 0.20125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.16 | null | null | 0.08 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
81fdb0e1136255e877c9ae2c151c33d3b0b0ee1d | 338 | py | Python | 1801-1900/1807.evaluate-thebracket-pairs-of-a-string.py | guangxu-li/leetcode-in-python | 8a5a373b32351500342705c141591a1a8f5f1cb1 | [
"MIT"
] | null | null | null | 1801-1900/1807.evaluate-thebracket-pairs-of-a-string.py | guangxu-li/leetcode-in-python | 8a5a373b32351500342705c141591a1a8f5f1cb1 | [
"MIT"
] | null | null | null | 1801-1900/1807.evaluate-thebracket-pairs-of-a-string.py | guangxu-li/leetcode-in-python | 8a5a373b32351500342705c141591a1a8f5f1cb1 | [
"MIT"
] | null | null | null | #
# @lc app=leetcode id=1807 lang=python3
#
# [1807] Evaluate the Bracket Pairs of a String
#
# @lc code=start
import re
class Solution:
def evaluate(self, s: str, knowledge: list[list[str]]) -> str:
mapping = dict(knowledge)
return re.sub(r"\((\w+?)\)", lambda m: mapping.get(m.group(1), "?"), s)
# @lc code=end
| 18.777778 | 79 | 0.612426 | 51 | 338 | 4.058824 | 0.745098 | 0.057971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037313 | 0.207101 | 338 | 17 | 80 | 19.882353 | 0.735075 | 0.328402 | 0 | 0 | 0 | 0 | 0.050228 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
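A quick usage sketch of the solution above (my addition; the input mirrors LeetCode's published example, and the type hints are dropped so it runs on older Python 3):

```python
import re

class Solution:
    def evaluate(self, s, knowledge):
        mapping = dict(knowledge)
        # replace each (key) with its known value, or "?" if unknown
        return re.sub(r"\((\w+?)\)", lambda m: mapping.get(m.group(1), "?"), s)

sol = Solution()
print(sol.evaluate("(name) is (age) years old", [["name", "bob"], ["age", "two"]]))
# bob is two years old
```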
81ff4f468611ece2f0ec909a6f48f5be0e5338fb | 404 | py | Python | articles/migrations/0003_article_published_at.py | mosalaheg/django3.2 | 551ecd0c8f633bcd9c37a95688e7bed958c0b91c | [
"MIT"
] | null | null | null | articles/migrations/0003_article_published_at.py | mosalaheg/django3.2 | 551ecd0c8f633bcd9c37a95688e7bed958c0b91c | [
"MIT"
] | null | null | null | articles/migrations/0003_article_published_at.py | mosalaheg/django3.2 | 551ecd0c8f633bcd9c37a95688e7bed958c0b91c | [
"MIT"
] | null | null | null | # Generated by Django 3.2.7 on 2021-10-02 08:24
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('articles', '0002_auto_20211002_1019'),
]
operations = [
migrations.AddField(
model_name='article',
name='published_at',
field=models.DateTimeField(blank=True, null=True),
),
]
| 21.263158 | 62 | 0.608911 | 44 | 404 | 5.477273 | 0.840909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106164 | 0.277228 | 404 | 18 | 63 | 22.444444 | 0.719178 | 0.111386 | 0 | 0 | 1 | 0 | 0.140056 | 0.064426 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
81ffc4260214e21a8fbb8d247a68944ab547969b | 643 | py | Python | example/usage/example_kate.py | vodka2/vkaudiotoken-python | 5720e4cf77f5e1b20c3bf57f3df0717638a539e0 | [
"MIT"
] | 32 | 2020-07-21T18:32:59.000Z | 2022-03-20T21:16:11.000Z | example/usage/example_kate.py | vodka2/vkaudiotoken-python | 5720e4cf77f5e1b20c3bf57f3df0717638a539e0 | [
"MIT"
] | 1 | 2020-10-04T04:41:06.000Z | 2020-10-05T11:43:48.000Z | example/usage/example_kate.py | vodka2/vkaudiotoken-python | 5720e4cf77f5e1b20c3bf57f3df0717638a539e0 | [
"MIT"
] | 2 | 2021-09-21T01:17:05.000Z | 2022-03-17T10:17:22.000Z | from __future__ import print_function
try:
import vkaudiotoken
except ImportError:
import path_hack
from vkaudiotoken import supported_clients
import sys
import requests
import json
token = sys.argv[1]
user_agent = supported_clients.KATE.user_agent
sess = requests.session()
sess.headers.update({'User-Agent': user_agent})
def prettyprint(result):
print(json.dumps(json.loads(result.content.decode('utf-8')), indent=2))
prettyprint(sess.get(
"https://api.vk.com/method/audio.getById",
params=[('access_token', token),
('audios', '371745461_456289486,-41489995_202246189'),
('v', '5.95')]
))
| 21.433333 | 75 | 0.715397 | 82 | 643 | 5.439024 | 0.670732 | 0.080717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075092 | 0.150855 | 643 | 29 | 76 | 22.172414 | 0.741758 | 0 | 0 | 0 | 0 | 0 | 0.180404 | 0.060653 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.380952 | 0 | 0.428571 | 0.190476 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
c301529eb7d8f8a6047d8e286ff806d7da8427d3 | 2,235 | py | Python | tools/testrunner/outproc/message.py | LancerWang001/v8 | 42ff4531f590b901ade0a18bfd03e56485fe2452 | [
"BSD-3-Clause"
] | 20,995 | 2015-01-01T05:12:40.000Z | 2022-03-31T21:39:18.000Z | tools/testrunner/outproc/message.py | Andrea-MariaDB-2/v8 | a0f0ebd7a876e8cb2210115adbfcffe900e99540 | [
"BSD-3-Clause"
] | 333 | 2020-07-15T17:06:05.000Z | 2021-03-15T12:13:09.000Z | tools/testrunner/outproc/message.py | Andrea-MariaDB-2/v8 | a0f0ebd7a876e8cb2210115adbfcffe900e99540 | [
"BSD-3-Clause"
] | 4,523 | 2015-01-01T15:12:34.000Z | 2022-03-28T06:23:41.000Z | # Copyright 2018 the V8 project authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import itertools
import os
import re
from . import base
class OutProc(base.ExpectedOutProc):
def __init__(self, expected_outcomes, basepath, expected_fail,
expected_filename, regenerate_expected_files):
super(OutProc, self).__init__(expected_outcomes, expected_filename,
regenerate_expected_files)
self._basepath = basepath
self._expected_fail = expected_fail
def _is_failure_output(self, output):
fail = output.exit_code != 0
if fail != self._expected_fail:
return True
expected_lines = []
# Can't use utils.ReadLinesFrom() here because it strips whitespace.
with open(self._basepath + '.out') as f:
for line in f:
if line.startswith("#") or not line.strip():
continue
expected_lines.append(line)
raw_lines = output.stdout.splitlines()
actual_lines = [ s for s in raw_lines if not self._ignore_line(s) ]
if len(expected_lines) != len(actual_lines):
return True
# Try .js first, and fall back to .mjs.
# TODO(v8:9406): clean this up by never separating the path from
# the extension in the first place.
base_path = self._basepath + '.js'
if not os.path.exists(base_path):
base_path = self._basepath + '.mjs'
env = {
'basename': os.path.basename(base_path),
}
for (expected, actual) in itertools.izip_longest(
expected_lines, actual_lines, fillvalue=''):
pattern = re.escape(expected.rstrip() % env)
pattern = pattern.replace('\\*', '.*')
pattern = pattern.replace('\\{NUMBER\\}', '\d+(?:\.\d*)?')
pattern = '^%s$' % pattern
if not re.match(pattern, actual):
return True
return False
def _ignore_line(self, string):
"""Ignore empty lines, valgrind output, Android output."""
return (
not string or
not string.strip() or
string.startswith("==") or
string.startswith("**") or
string.startswith("ANDROID") or
# Android linker warning.
string.startswith('WARNING: linker:')
)
| 32.867647 | 72 | 0.648322 | 282 | 2,235 | 4.968085 | 0.425532 | 0.034261 | 0.038544 | 0.048537 | 0.094218 | 0.038544 | 0 | 0 | 0 | 0 | 0 | 0.006471 | 0.239374 | 2,235 | 67 | 73 | 33.358209 | 0.817647 | 0.195526 | 0 | 0.061224 | 0 | 0 | 0.045378 | 0 | 0 | 0 | 0 | 0.014925 | 0 | 1 | 0.061224 | false | 0 | 0.081633 | 0 | 0.265306 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
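The `_ignore_line` predicate above is self-contained, so its filtering can be sketched in isolation (a standalone rewrite for illustration, not V8 code):

```python
def ignore_line(string):
    """Mirror of OutProc._ignore_line: drop blanks, valgrind and Android noise."""
    return (
        not string
        or not string.strip()
        or string.startswith("==")
        or string.startswith("**")
        or string.startswith("ANDROID")
        # Android linker warning.
        or string.startswith("WARNING: linker:")
    )

lines = ["ok", "", "==123== leak", "ANDROID: x", "WARNING: linker: y"]
print([s for s in lines if not ignore_line(s)])
# ['ok']
```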
c3027f734157db362e121ea8ce2b5d36ad4e6075 | 604 | py | Python | gemtown/users/urls.py | doramong0926/gemtown | 2c39284e3c68f0cc11994bed0ee2abaad0ea06b6 | [
"MIT"
] | null | null | null | gemtown/users/urls.py | doramong0926/gemtown | 2c39284e3c68f0cc11994bed0ee2abaad0ea06b6 | [
"MIT"
] | 5 | 2020-09-04T20:13:39.000Z | 2022-02-17T22:03:33.000Z | gemtown/users/urls.py | doramong0926/gemtown | 2c39284e3c68f0cc11994bed0ee2abaad0ea06b6 | [
"MIT"
] | null | null | null | from django.urls import path
from . import views
app_name = "users"
urlpatterns = [
path("all/", view=views.UserList.as_view(), name="all_user"),
path("<int:user_id>/password/", view=views.ChangePassword.as_view(), name="change password"),
path("<int:user_id>/follow/", view=views.FollowUser.as_view(), name="follow user"),
path("<int:user_id>/unfollow/", view=views.UnfollowUser.as_view(), name="unfollow user"),
path("<int:user_id>/", view=views.UserFeed.as_view(), name="user_detail_infomation"),
path("login/facebook/", view=views.FacebookLogin.as_view(), name="fb_login"),
] | 50.333333 | 97 | 0.701987 | 85 | 604 | 4.811765 | 0.364706 | 0.132029 | 0.146699 | 0.127139 | 0.124694 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099338 | 604 | 12 | 98 | 50.333333 | 0.751838 | 0 | 0 | 0 | 0 | 0 | 0.300826 | 0.147107 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.090909 | 0.181818 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c302fe24cced11c5bc506098882205738bad2b79 | 3,132 | py | Python | Packs/Thycotic/Integrations/Thycotic/Thycotic_test.py | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 799 | 2016-08-02T06:43:14.000Z | 2022-03-31T11:10:11.000Z | Packs/Thycotic/Integrations/Thycotic/Thycotic_test.py | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 9,317 | 2016-08-07T19:00:51.000Z | 2022-03-31T21:56:04.000Z | Packs/Thycotic/Integrations/Thycotic/Thycotic_test.py | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 1,297 | 2016-08-04T13:59:00.000Z | 2022-03-31T23:43:06.000Z | import pytest
from Thycotic import Client, \
secret_password_get_command, secret_username_get_command, \
secret_get_command, secret_password_update_command, secret_checkout_command, secret_checkin_command, \
secret_delete_command, folder_create_command, folder_delete_command, folder_update_command
from test_data.context import GET_PASSWORD_BY_ID_CONTEXT, GET_USERNAME_BY_ID_CONTENT, \
SECRET_GET_CONTENT, SECRET_PASSWORD_UPDATE_CONTEXT, SECRET_CHECKOUT_CONTEXT, SECRET_CHECKIN_CONTEXT, \
SECRET_DELETE_CONTEXT, FOLDER_CREATE_CONTEXT, FOLDER_DELETE_CONTEXT, FOLDER_UPDATE_CONTEXT
from test_data.http_responses import GET_PASSWORD_BY_ID_RAW_RESPONSE, GET_USERNAME_BY_ID_RAW_RESPONSE, \
SECRET_GET_RAW_RESPONSE, SECRET_PASSWORD_UPDATE_RAW_RESPONSE, SECRET_CHECKOUT_RAW_RESPONSE, \
SECRET_CHECKIN_RAW_RESPONSE, SECRET_DELETE_RAW_RESPONSE, FOLDER_CREATE_RAW_RESPONSE, FOLDER_DELETE_RAW_RESPONSE, \
FOLDER_UPDATE_RAW_RESPONSE
GET_PASSWORD_BY_ID_ARGS = {"secret_id": "4"}
GET_USERNAME_BY_ID_ARGS = {"secret_id": "4"}
SECRET_GET_ARGS = {"secret_id": "4"}
SECRET_PASSWORD_UPDATE_ARGS = {"secret_id": "4", "newpassword": "NEWPASSWORD1"}
SECRET_CHECKOUT_ARGS = {"secret_id": "4"}
SECRET_CHECKIN_ARGS = {"secret_id": "4"}
SECRET_DELETE_ARGS = {"id": "9"}
FOLDER_CREATE_ARGS = {"folderName": "xsoarFolderTest3", "folderTypeId": "1", "parentFolderId": "3"}
FOLDER_DELETE_ARGS = {"folder_id": "9"}
FOLDER_UPDATE_ARGS = {"id": "12", "folderName": "xsoarTF3New"}
@pytest.mark.parametrize('command, args, http_response, context', [
(secret_password_get_command, GET_PASSWORD_BY_ID_ARGS, GET_PASSWORD_BY_ID_RAW_RESPONSE, GET_PASSWORD_BY_ID_CONTEXT),
(secret_username_get_command, GET_USERNAME_BY_ID_ARGS, GET_USERNAME_BY_ID_RAW_RESPONSE, GET_USERNAME_BY_ID_CONTENT),
(secret_get_command, SECRET_GET_ARGS, SECRET_GET_RAW_RESPONSE, SECRET_GET_CONTENT),
(secret_password_update_command, SECRET_PASSWORD_UPDATE_ARGS, SECRET_PASSWORD_UPDATE_RAW_RESPONSE,
SECRET_PASSWORD_UPDATE_CONTEXT),
(secret_checkout_command, SECRET_CHECKOUT_ARGS, SECRET_CHECKOUT_RAW_RESPONSE, SECRET_CHECKOUT_CONTEXT),
(secret_checkin_command, SECRET_CHECKIN_ARGS, SECRET_CHECKIN_RAW_RESPONSE, SECRET_CHECKIN_CONTEXT),
(secret_delete_command, SECRET_DELETE_ARGS, SECRET_DELETE_RAW_RESPONSE, SECRET_DELETE_CONTEXT),
(folder_create_command, FOLDER_CREATE_ARGS, FOLDER_CREATE_RAW_RESPONSE, FOLDER_CREATE_CONTEXT),
(folder_delete_command, FOLDER_DELETE_ARGS, FOLDER_DELETE_RAW_RESPONSE, FOLDER_DELETE_CONTEXT),
(folder_update_command, FOLDER_UPDATE_ARGS, FOLDER_UPDATE_RAW_RESPONSE, FOLDER_UPDATE_CONTEXT)
])
def test_thycotic_commands(command, args, http_response, context, mocker):
mocker.patch.object(Client, '_generate_token')
client = Client(server_url="https://thss.softwarium.net/SecretServer", username="xsoar1", password="HfpuhXjv123",
proxy=False, verify=False)
mocker.patch.object(Client, '_http_request', return_value=http_response)
outputs = command(client, **args)
results = outputs.to_context()
assert results.get("EntryContext") == context
| 60.230769 | 120 | 0.814815 | 412 | 3,132 | 5.616505 | 0.165049 | 0.095073 | 0.073466 | 0.038894 | 0.52809 | 0.185825 | 0.063526 | 0.025929 | 0 | 0 | 0 | 0.006714 | 0.096424 | 3,132 | 51 | 121 | 61.411765 | 0.810954 | 0 | 0 | 0 | 0 | 0 | 0.098659 | 0 | 0 | 0 | 0 | 0 | 0.023256 | 1 | 0.023256 | false | 0.27907 | 0.093023 | 0 | 0.116279 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c304c12fe37620c738efd7817690de209aad07c4 | 1,190 | py | Python | src/pynnet/test.py | RalphMao/kaldi-pynnet | a8c050e976a138b43ff0c2ea2a1def72f51f9177 | [
"Apache-2.0"
] | null | null | null | src/pynnet/test.py | RalphMao/kaldi-pynnet | a8c050e976a138b43ff0c2ea2a1def72f51f9177 | [
"Apache-2.0"
] | null | null | null | src/pynnet/test.py | RalphMao/kaldi-pynnet | a8c050e976a138b43ff0c2ea2a1def72f51f9177 | [
"Apache-2.0"
] | null | null | null | import _nnet
import numpy as np
import IPython
net = _nnet.Nnet()
net.read('/home/maohz12/online_50h_Tsinghua/exp_train_50h/lstm_karel_bak/nnet/nnet_iter14_learnrate7.8125e-07_tr1.2687_cv1.6941')
# Test1
blobs = net.layers[0].get_params()
x = blobs[1].data.flatten()
x_test = np.fromfile('test/1.bin', 'f')
assert np.sum(abs(x-x_test)) < 1e-5
x = blobs[4].data.flatten()
x_test = np.fromfile('test/4.bin', 'f')
assert np.sum(abs(x-x_test)) < 1e-5
blobs[1].data[:] = np.arange(blobs[1].data.size).reshape(blobs[1].data.shape)
blobs[4].data[:] = np.arange(blobs[4].data.size).reshape(blobs[4].data.shape)
net.layers[0].set_params(blobs)
net.write('test/test_nnet', 0)
pointer, read_only_flag = blobs[1].data.__array_interface__['data']
# Test 2
data_copy = blobs[1].data.copy()
del net
pointer, read_only_flag = blobs[1].data.__array_interface__['data']
assert np.sum(abs(blobs[1].data - data_copy)) < 1e-5
# Test 3
net = _nnet.Nnet()
net.read('test/test_nnet')
blobs_new = net.layers[0].get_params()
x = blobs[1].data
x_test = blobs_new[1].data
assert np.sum(abs(x-x_test)) < 1e-5
x = blobs[4].data
x_test = blobs_new[4].data
assert np.sum(abs(x-x_test)) < 1e-5
print "Test passed"
| 27.045455 | 129 | 0.715966 | 224 | 1,190 | 3.602679 | 0.285714 | 0.061958 | 0.111524 | 0.086741 | 0.510533 | 0.416357 | 0.416357 | 0.351921 | 0.351921 | 0.277571 | 0 | 0.055659 | 0.094118 | 1,190 | 43 | 130 | 27.674419 | 0.69295 | 0.015966 | 0 | 0.258065 | 0 | 0.032258 | 0.159383 | 0.100257 | 0 | 0 | 0 | 0 | 0.16129 | 0 | null | null | 0.032258 | 0.096774 | null | null | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c307055a5d64c20c7212a67b032444ffbf9d764a | 569 | py | Python | Linear_Insertion_Sort.py | toppassion/python-master-app | 21d854186664440f997bfe53010b242f62979e7f | [
"MIT"
] | null | null | null | Linear_Insertion_Sort.py | toppassion/python-master-app | 21d854186664440f997bfe53010b242f62979e7f | [
"MIT"
] | null | null | null | Linear_Insertion_Sort.py | toppassion/python-master-app | 21d854186664440f997bfe53010b242f62979e7f | [
"MIT"
] | 1 | 2021-12-08T11:38:20.000Z | 2021-12-08T11:38:20.000Z | def Linear_Search(Test_arr, val):
index = 0
for i in range(len(Test_arr)):
if val > Test_arr[i]:
index = i+1
return index
def Insertion_Sort(Test_arr):
for i in range(1, len(Test_arr)):
val = Test_arr[i]
j = Linear_Search(Test_arr[:i], val)
Test_arr.pop(i)
Test_arr.insert(j, val)
return Test_arr
if __name__ == "__main__":
Test_list = input("Enter the list of Numbers: ").split()
Test_list = [int(i) for i in Test_list]
    print(f"Linear Insertion Sort: {Insertion_Sort(Test_list)}") | 27.095238 | 64 | 0.616872 | 91 | 569 | 3.571429 | 0.373626 | 0.215385 | 0.055385 | 0.116923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007126 | 0.260105 | 569 | 21 | 64 | 27.095238 | 0.764846 | 0 | 0 | 0 | 0 | 0 | 0.149123 | 0.047368 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0 | 0 | 0.235294 | 0.058824 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
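Insertion_Sort above locates each insert position with a linear scan. A true binary insertion sort swaps that scan for a binary search; a sketch (my addition, not from the repo) using the stdlib bisect module:

```python
import bisect

def binary_insertion_sort(arr):
    """Insertion sort that finds each insert position by binary search."""
    result = []
    for val in arr:
        bisect.insort(result, val)  # O(log n) search, O(n) shift on insert
    return result

print(binary_insertion_sort([5, 2, 9, 1, 5]))  # [1, 2, 5, 5, 9]
```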
c308e55ef9a8f6ca2122399901177b70c65eef30 | 1,208 | py | Python | test/test_everything.py | jameschapman19/Eigengame | 165d1bf35076fbfc6e65a987cb2e09a174776927 | [
"MIT"
] | null | null | null | test/test_everything.py | jameschapman19/Eigengame | 165d1bf35076fbfc6e65a987cb2e09a174776927 | [
"MIT"
] | null | null | null | test/test_everything.py | jameschapman19/Eigengame | 165d1bf35076fbfc6e65a987cb2e09a174776927 | [
"MIT"
] | null | null | null | import jax.numpy as jnp
import numpy as np
from jax import random
from algorithms import Game, GHA, Oja, Krasulina, Numpy
def test_pca():
"""
At the moment just checks they all run.
Returns
-------
"""
n = 10
p = 2
n_components = 2
batch_size = 2
epochs = 10
key = random.PRNGKey(0)
X = random.normal(key, (n, p))
X = X / jnp.linalg.norm(X, axis=0)
numpy = Numpy(n_components=n_components).fit(X)
game = Game(
n_components=n_components, batch_size=batch_size, epochs=epochs
).fit(X)
gha = GHA(n_components=n_components, batch_size=batch_size, epochs=epochs).fit(
X
)
oja = Oja(n_components=n_components, batch_size=batch_size, epochs=epochs).fit(
X
)
krasulina = Krasulina(
n_components=n_components, batch_size=batch_size, epochs=epochs
).fit(X)
assert (
np.testing.assert_almost_equal(
[
game.score(X),
gha.score(X),
oja.score(X),
krasulina.score(X),
],
numpy.score(X),
decimal=0,
)
is None
)
| 24.16 | 83 | 0.543874 | 152 | 1,208 | 4.171053 | 0.328947 | 0.190852 | 0.094637 | 0.173502 | 0.353312 | 0.353312 | 0.353312 | 0.353312 | 0.353312 | 0.353312 | 0 | 0.012723 | 0.349338 | 1,208 | 49 | 84 | 24.653061 | 0.793893 | 0.046358 | 0 | 0.102564 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 1 | 0.025641 | false | 0 | 0.102564 | 0 | 0.128205 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c3119f2506c627ca857b498eb0bfe45c4bd66fbc | 9,582 | py | Python | dataanalysis.py | Rev-Jiang/Python | c91d5724a6843f095bfe1a05f65d9fc885e01b88 | [
"MIT"
] | null | null | null | dataanalysis.py | Rev-Jiang/Python | c91d5724a6843f095bfe1a05f65d9fc885e01b88 | [
"MIT"
] | null | null | null | dataanalysis.py | Rev-Jiang/Python | c91d5724a6843f095bfe1a05f65d9fc885e01b88 | [
"MIT"
] | null | null | null | #-*- coding: UTF-8 -*-
#上句表示可用中文注释,否则默认ASCII码保存
# Filename : dataanalysis.py
# author by : Rev_997
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
def isiterable(obj):
try:
iter(obj)
return True
except TypeError:#not iterable
return False
#if it is not a list but is iterable, convert it to a list
if not isinstance(x,list) and isiterable(x):
x=list(x)
#is and is not are used to test whether a variable is None, since None is a singleton
a=None
a is None
from datetime import datetime
dt=datetime(2011,10,29,20,30,21)
dt.day
dt.minute
dt.date()
dt.time()
#a datetime can be converted to a string with strftime
dt.strftime('%m/%d/%Y %H:%M')
#a string can be parsed into a datetime with strptime
datetime.strptime('20091031','%Y%m%d')
#substitute 0 for minutes and seconds
dt.replace(minute=0,second=0)
#the difference of two datetime objects produces a datetime.timedelta
dt2=datetime(2011,11,15,22,30)
delta=dt2-dt
delta
type(delta)
#add a timedelta to a datetime -- get a new datetime
dt+delta
#if elif else
if x:
pass
elif y:
pass
else:
pass
#for
for value in collection:
    #do something with value
    #continue skips to the next iteration; break exits the loop
    pass
for a,b,c in iterator:
    #do something
    pass
#while
x=256
total=0
while x>0:
if total>500:
break
total+=x
x=x//2
def attempt_float(x):
try:
return float(x)
except:
return x
#if float(x) raises, the bare except returns x unchanged
def attempt_float(x):
try:
return float(x)
except(TypeError,ValueError):
return x
#catch only the specific exceptions
#value=true-expr if condition else false-expr
#same as
'''
if condition:
value=true-expr
else:
value=false-expr
'''
#about tuple
tup=4,5,6
tup
#(4,5,6)
#convert to tuple
tuple([4,0,2])
tuple('string')
#use + to concatenate tuples into a longer tuple
#tuples are immutable: there is no tuple.append(); only tuple.count() and tuple.index()
#list.append()
#list.insert()
#list.pop()
#list.remove()
#list.extend()
#list.sort()
import bisect
c=[1,2,2,2,3,4,7]
#find the suitable position
bisect.bisect(c,2)
#insert the new number
bisect.insort(c,6)
###attention: bisect does not check ordering; it only gives correct results on a sorted sequence
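To make the caveat above concrete (my example, continuing the notes): bisect/bisect_right and bisect_left differ in where a duplicate value is placed.

```python
import bisect

c = [1, 2, 2, 2, 3, 4, 7]
print(bisect.bisect(c, 2))       # 4 - after the run of equal values (alias of bisect_right)
print(bisect.bisect_left(c, 2))  # 1 - before the run of equal values
```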
#----------------------------------------------------------------
#some function of list
#enumerate
for i,value in enumerate(collection):
    #do something with value
    pass
some_list=['foo','bar','baz']
mapping=dict((v,i) for i,v in enumerate(some_list))
mapping
#sorted
sorted([7,2,4,6,3,5,2])
sorted('horse race')
#powerful with set
sorted(set('this is just some string'))
#zip
seq1=['foo','bar','baz']
seq2=['one','two','three']
zip(seq1,seq2)
seq3=[False,True]
zip(seq1,seq2,seq3)
#iterate over several sequences in parallel with zip
for i,(a,b) in enumerate(zip(seq1,seq2)):
print('%d: %s, %s' % (i,a,b))
#unzip
pitchers=[('Nolan','Ryan'),('Roger','Clemens'),('Schilling','Curt')]
first_names,last_names=zip(*pitchers)# zip(*pitchers) unpacks the list: same as zip(pitchers[0],pitchers[1],pitchers[2])
first_names
last_names
#reversed
list(reversed(range(10)))
#dictionary
empty_dict={}
d1={'a':'some value','b':[1,2,3,4]}
d1
#add entries, then delete
d1[5]='some value'
d1['dummy']='another value'
del d1[5]
#or
ret=d1.pop('dummy')
ret
#get keys and values
d1.keys()
d1.values()
#combine two dictionaries
d1.update({'b':'foo','c':12})
d1
#pair two lists into a dictionary
'''
mapping={}
for key,value in zip(key_list,value_list):
    mapping[key]=value
'''
mapping=dict(zip(range(5),reversed(range(5))))
mapping
#dict.get: a concise replacement for this common lookup pattern
'''
if key in some_dict:
    value=some_dict[key]
else:
    value=default_value
'''
value=some_dict.get(key,default_value)
#dictionary values can be collections, e.g. grouping words by first letter
'''
words=['apple','bat','bar','atom','book']
by_letter={}
for word in words:
    letter=word[0]
    if letter not in by_letter:
        by_letter[letter]=[word]
    else:
        by_letter[letter].append(word)
by_letter
'''
by_letter.setdefault(letter,[]).append(word)
#or use defaultdict class in Module collections
from collections import defaultdict
by_letter=defaultdict(list)
words=['apple','bat','bar','atom','book']
for word in words:
    by_letter[word[0]].append(word)
#dictionary keys must be hashable, i.e. immutable
hash('string')
hash((1,2,(2,3)))
#hash((1,2,[3,4])) raises TypeError: a list is mutable, so not hashable
#to change a list to tuple is the easiest way to make it a key
d={}
d[tuple([1,2,3])]=5
d
#set
set([2,2,2,1,3,3])
{2,2,2,1,3,3}
a={1,2,3,4,5}
b={3,4,5,6,7,8}
#union
a|b
#intersection
a&b
#difference
a-b
#symmetric difference
a^b
#subset / superset tests
a_set={1,2,3,4,5}
{1,2,3}.issubset(a_set)
a_set.issuperset({1,2,3})
#sets compare equal with == when they contain the same elements
{1,2,3}=={3,2,1}
#set methods (x is an element, b is another set)
a.add(x)
a.remove(x)
a.union(b)
a.intersection(b)
a.difference(b)
a.symmetric_difference(b)
a.issubset(b)
a.issuperset(b)
a.isdisjoint(b)
#list, set, and dictionary comprehensions
'''
[expr for val in collection if condition]
is the same as
result=[]
for val in collection:
    if condition:
        result.append(expr)
'''
#list
#[expr for val in collection if condition]
strings=['a','as','bat','car','dove','python']
[x.upper() for x in strings if len(x)>2]
#dictionary
#dict_comp={key-expr:value-expr for value in collection if condition}
loc_mapping={val:index for index,val in enumerate(strings)}
loc_mapping
#or
loc_mapping=dict((val,idx) for idx,val in enumerate(strings))
#set
#set_comp={expr for value in collection if condition}
unique_lengths={len(x) for x in strings}
unique_lengths
#nested list comprehensions
all_data=[['Tom','Billy','Jeffery','Andrew','Wesley','Steven','Joe'],
['Susie','Casey','Jill','Ana','Eva','Jennifer','Stephanie']]
#find the names with two or more 'e' and put them in a new list
names_of_interest=[]
for names in all_data:
    enough_es=[name for name in names if name.count('e')>=2]
    names_of_interest.extend(enough_es)
#which can be shortened to:
result=[name for names in all_data for name in names
if name.count('e')>=2]
result
#flatten a list of tuples
some_tuples=[(1,2,3),(4,5,6),(7,8,9)]
flattened=[x for tup in some_tuples for x in tup]
flattened
'''
flattened=[]
for tup in some_tuples:
    for x in tup:
        flattened.append(x)
'''
#which is different from:
[[x for x in tup] for tup in some_tuples]
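The standard library also offers `itertools.chain.from_iterable` for flattening; this alternative is an addition here, not part of the original notes:

```python
from itertools import chain

some_tuples = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]  # same list as above
flattened = list(chain.from_iterable(some_tuples))
print(flattened)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```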
#clean function
import re
def clean_strings(strings):
    result=[]
    for value in strings:
        value=value.strip()
        value=re.sub('[!#?]','',value) #remove punctuation marks
        value=value.title()
        result.append(value)
    return result
states=[' Alabama ','Georgia!','Georgia','georgia','FlOrIda','south carolina##','West virginia?']
clean_strings(states)
#or
def remove_punctuation(value):
    return re.sub('[!#?]','',value)
clean_ops=[str.strip,remove_punctuation,str.title]
def clean_strings(strings,ops):
    result=[]
    for value in strings:
        for function in ops:
            value=function(value)
        result.append(value)
    return result
clean_strings(states,clean_ops)
#anonymous function
#lambda [arg1[, arg2, ... argN]]: expression
#example 1
#use def to define a function
def add(x,y):
    return x+y
#the same with a lambda expression
lambda x,y: x+y
#lambda permits default parameters
lambda x,y=2: x+y
lambda *z: z
#call lambda function
a = lambda x, y: x + y
a( 1, 3 )
b = lambda x, y = 2: x + y
b( 1 )
b( 1, 3 )
c = lambda *z: z
c( 10, 'test')
#example 2
def apply_to_list(some_list,f):
    return [f(x) for x in some_list]
ints=[4,0,1,5,6]
apply_to_list(ints,lambda x:x*2)
#example 3
strings=['foo','card','bar','aaaa','abab']
strings.sort(key=lambda x: len(set(list(x))))
strings
#currying
def add_numbers(x,y):
    return x+y
#fix the first argument by hand:
add_five=lambda y:add_numbers(5,y)
#functools.partial simplifies this
from functools import partial
add_five=partial(add_numbers,5)
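`partial` can fix keyword arguments as well as positional ones; an illustrative sketch using the built-in `int()`:

```python
from functools import partial

# fix the base keyword argument of int()
parse_binary = partial(int, base=2)
print(parse_binary('1010'))  # 10
```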
#generator expression
gen=(x**2 for x in range(100))
gen
#the same as:
def _make_gen():
    for x in range(100):
        yield x**2
gen=_make_gen()
#a generator expression can be passed to any function that accepts a generator
sum(x**2 for x in range(100))
dict((i,i**2) for i in range(5))
#itertools module
import itertools
first_letter=lambda x:x[0]
names=['Alan','Adam','Wes','Will','Albert','Steven']
for letter,group in itertools.groupby(names,first_letter):
    print(letter,list(group)) #group is a generator
#some functions in itertools (imap/ifilter exist only in Python 2; Python 3 uses the built-in map/filter)
#imap(func,*iterables)
#ifilter(func,iterable)
#combinations(iterable,k)
#permutations(iterable,k)
#groupby(iterable[,keyfunc])
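A small Python 3 sketch of `combinations` and `permutations` from the list above:

```python
import itertools

# all 2-element combinations, order ignored
print(list(itertools.combinations('abc', 2)))
# [('a', 'b'), ('a', 'c'), ('b', 'c')]

# all orderings of the elements
print(list(itertools.permutations('ab')))
# [('a', 'b'), ('b', 'a')]
```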
#files and the operating system
path='xxx.txt'
f=open(path)
for line in f:
    pass
#strip the end-of-line characters from every line
lines=[x.rstrip() for x in open(path)]
lines
#write a copy of the file without blank lines
with open('tmp.txt','w') as handle:
    handle.writelines(x for x in open(path) if len(x)>1)
open('tmp.txt').readlines()
#common file-object methods
#read([size])
#readlines([size])
#write(str)
#close()
#flush()
#seek(pos)
#tell()
#closed (attribute, not a method)
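An illustrative sketch of `seek`/`tell` using a temporary file (the temp file is an assumption here, used so the example is self-contained):

```python
import tempfile

tmp = tempfile.TemporaryFile(mode='w+')
tmp.write('hello')
tmp.seek(0)           # move back to the start of the file
print(tmp.tell())     # 0
print(tmp.read())     # 'hello'
tmp.close()
```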
| 20.08805 | 100 | 0.644124 | 1,546 | 9,582 | 3.944373 | 0.259379 | 0.007215 | 0.010823 | 0.00328 | 0.156937 | 0.122171 | 0.11348 | 0.091177 | 0.091177 | 0.070187 | 0 | 0.02935 | 0.207055 | 9,582 | 476 | 101 | 20.130252 | 0.77323 | 0.286266 | 0 | 0.2103 | 0 | 0 | 0.07698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.017167 | 0.038627 | null | null | 0.008584 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c311dcd3f870bbdf6b67118d6ccc561653945f40 | 259 | py | Python | show_model_info.py | panovr/Brain-Tumor-Segmentation | bf1ac2360af46a484d632474ce93de339ad2b496 | [
"MIT"
] | null | null | null | show_model_info.py | panovr/Brain-Tumor-Segmentation | bf1ac2360af46a484d632474ce93de339ad2b496 | [
"MIT"
] | null | null | null | show_model_info.py | panovr/Brain-Tumor-Segmentation | bf1ac2360af46a484d632474ce93de339ad2b496 | [
"MIT"
] | null | null | null | import bts.model as model
import torch
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
BATCH_SIZE = 6
FILTER_LIST = [16,32,64,128,256]
unet_model = model.DynamicUNet(FILTER_LIST)
unet_model.summary(batch_size=BATCH_SIZE, device=device)
| 28.777778 | 69 | 0.783784 | 43 | 259 | 4.534884 | 0.581395 | 0.138462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.096525 | 259 | 8 | 70 | 32.375 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c3159e702eacd0f494cdd9cb0e3428247b34b8ae | 669 | py | Python | tests/biology/test_join_fasta.py | shandou/pyjanitor | d7842613b4e4a7532a88f673fd54e94c3ba5a96b | [
"MIT"
] | 1 | 2021-03-25T10:46:57.000Z | 2021-03-25T10:46:57.000Z | tests/biology/test_join_fasta.py | shandou/pyjanitor | d7842613b4e4a7532a88f673fd54e94c3ba5a96b | [
"MIT"
] | null | null | null | tests/biology/test_join_fasta.py | shandou/pyjanitor | d7842613b4e4a7532a88f673fd54e94c3ba5a96b | [
"MIT"
] | null | null | null | import importlib
import os
import pytest
from helpers import running_on_ci
import janitor.biology # noqa: F403, F401
# Skip all tests if Biopython not installed
pytestmark = pytest.mark.skipif(
(importlib.util.find_spec("Bio") is None) & ~running_on_ci(),
reason="Biology tests relying on Biopython only required for CI",
)
@pytest.mark.biology
def test_join_fasta(biodf):
    """Test adding sequence from FASTA file in ``sequence`` column."""
    df = biodf.join_fasta(
        filename=os.path.join(pytest.TEST_DATA_DIR, "sequences.fasta"),
        id_col="sequence_accession",
        column_name="sequence",
    )
    assert "sequence" in df.columns
| 25.730769 | 71 | 0.714499 | 92 | 669 | 5.054348 | 0.619565 | 0.03871 | 0.047312 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010949 | 0.180867 | 669 | 25 | 72 | 26.76 | 0.837591 | 0.179372 | 0 | 0 | 0 | 0 | 0.197417 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.058824 | false | 0 | 0.352941 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
c31bd0f2505a1c4be1c52fbd6469723bb696bfa9 | 2,470 | py | Python | account/models.py | Hasanozzaman-Khan/Django-User-Authentication | 96482a51ed01bbdc7092d6ca34383054967a8aa0 | [
"MIT"
] | null | null | null | account/models.py | Hasanozzaman-Khan/Django-User-Authentication | 96482a51ed01bbdc7092d6ca34383054967a8aa0 | [
"MIT"
] | null | null | null | account/models.py | Hasanozzaman-Khan/Django-User-Authentication | 96482a51ed01bbdc7092d6ca34383054967a8aa0 | [
"MIT"
] | null | null | null |
from django.db import models
from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin, BaseUserManager
from PIL import Image
# Create your models here.
class Home(models.Model):
    pass
class CustomUserManager(BaseUserManager):
    """Manager for user profiles"""

    def create_user(self, email, first_name, last_name, password=None):
        """Create a new user profile"""
        if not email:
            raise ValueError("User must have an email address.")
        email = self.normalize_email(email)
        user = self.model(email=email, first_name=first_name, last_name=last_name)
        user.set_password(password)
        user.save(using=self._db)
        return user

    def create_superuser(self, email, first_name, last_name, password):
        """Create and save a new superuser with given details"""
        user = self.create_user(email, first_name, last_name, password)
        user.is_superuser = True
        user.is_staff = True
        user.save(using=self._db)
        return user
class CustomRegisterModel(AbstractBaseUser, PermissionsMixin):
    """Database model for users in the system"""

    email = models.EmailField(max_length=255, unique=True)
    first_name = models.CharField(max_length=255)
    last_name = models.CharField(max_length=255)
    is_active = models.BooleanField(default=True)
    is_staff = models.BooleanField(default=False)
    is_email_verified = models.BooleanField(default=False)

    objects = CustomUserManager()

    USERNAME_FIELD = 'email'
    REQUIRED_FIELDS = ['first_name', 'last_name']

    def get_full_name(self):
        """Retrieve full name of user"""
        return self.first_name + " " + self.last_name

    def get_short_name(self):
        """Retrieve short name of user"""
        return self.first_name

    def __str__(self):
        """Return string representation of our user"""
        return self.email
class ProfileModel(models.Model):
    user = models.OneToOneField(CustomRegisterModel, on_delete=models.CASCADE)
    image = models.ImageField(default='default.jpg', upload_to='profile_picture')

    def __str__(self):
        return f"{self.user.first_name}'s profile"

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        img = Image.open(self.image.path)
        if img.height > 300 or img.width > 300:
            output_size = (300, 300)
            img.thumbnail(output_size)
            img.save(self.image.path)
| 30.493827 | 90 | 0.676923 | 309 | 2,470 | 5.239482 | 0.36246 | 0.05559 | 0.044472 | 0.052502 | 0.170476 | 0.170476 | 0.11365 | 0 | 0 | 0 | 0 | 0.010898 | 0.219838 | 2,470 | 80 | 91 | 30.875 | 0.829268 | 0.106883 | 0 | 0.125 | 0 | 0 | 0.053044 | 0.01107 | 0 | 0 | 0 | 0 | 0 | 1 | 0.145833 | false | 0.104167 | 0.0625 | 0.020833 | 0.645833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c32367d43e08138167f815beb65fbee346856f66 | 1,965 | py | Python | old_test/test-large.py | briandobbins/pynio | 1dd5fc0fc133f2b8d329ae68929bd3c6c1c5fa7c | [
"Apache-2.0"
] | null | null | null | old_test/test-large.py | briandobbins/pynio | 1dd5fc0fc133f2b8d329ae68929bd3c6c1c5fa7c | [
"Apache-2.0"
] | null | null | null | old_test/test-large.py | briandobbins/pynio | 1dd5fc0fc133f2b8d329ae68929bd3c6c1c5fa7c | [
"Apache-2.0"
] | null | null | null | from __future__ import print_function, division
import numpy as np
import Nio
import time, os
#
# Creating a file
#
init_time = time.clock()
ncfile = 'test-large.nc'
if (os.path.exists(ncfile)):
    os.system("/bin/rm -f " + ncfile)
opt = Nio.options()
opt.Format = "LargeFile"
opt.PreFill = False
file = Nio.open_file(ncfile, 'w', options=opt)
file.title = "Testing large files and dimensions"
file.create_dimension('big', 2500000000)
bigvar = file.create_variable('bigvar', "b", ('big',))
print("created bigvar")
# note it is incredibly slow to write a scalar to a large file variable
# so create an temporary variable x that will get assigned in steps
x = np.empty(1000000,dtype = 'int8')
#print x
x[:] = 42
t = list(range(0,2500000000,1000000))
ii = 0
for i in t:
    if (i == 0):
        continue
    print(t[ii],i)
    bigvar[t[ii]:i] = x[:]
    ii += 1
x[:] = 84
bigvar[2499000000:2500000000] = x[:]
bigvar[-1] = 84
bigvar.units = "big var units"
#print bigvar[-1]
print(bigvar.dimensions)
# check unlimited status
for dim in list(file.dimensions.keys()):
    print(dim, " unlimited: ",file.unlimited(dim))
print(file)
print("closing file")
print('elapsed time: ',time.clock() - init_time)
file.close()
#quit()
#
# Reading a file
#
print('opening file for read')
print('elapsed time: ',time.clock() - init_time)
file = Nio.open_file(ncfile, 'r')
print('file is open')
print('elapsed time: ',time.clock() - init_time)
print(file.dimensions)
print(list(file.variables.keys()))
print(file)
print("reading variable")
print('elapsed time: ',time.clock() - init_time)
x = file.variables['bigvar']
print(x[0],x[1000000],x[249000000],x[2499999999])
print("max and min")
min = x[:].min()
max = x[:].max()
print(min, max)
print('elapsed time: ',time.clock() - init_time)
# check unlimited status
for dim in list(file.dimensions.keys()):
    print(dim, " unlimited: ",file.unlimited(dim))
print("closing file")
print('elapsed time: ',time.clock() - init_time)
file.close()
| 23.674699 | 71 | 0.689567 | 303 | 1,965 | 4.419142 | 0.339934 | 0.041822 | 0.067961 | 0.089619 | 0.344287 | 0.315907 | 0.315907 | 0.241972 | 0.214339 | 0.214339 | 0 | 0.05549 | 0.137913 | 1,965 | 82 | 72 | 23.963415 | 0.734947 | 0.122646 | 0 | 0.271186 | 0 | 0 | 0.181871 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.067797 | 0 | 0.067797 | 0.40678 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
c324c7d6ffabe1bf0c4f2f6e3eba09b511032c92 | 7,470 | py | Python | Mask/Interpolate slider without prepolate.py | typedev/RoboFont-1 | 307c3c953a338f58cd0070aa5b1bb737bde08cc9 | [
"MIT"
] | 1 | 2016-03-27T17:07:16.000Z | 2016-03-27T17:07:16.000Z | Mask/Interpolate slider without prepolate.py | typedev/RoboFont-1 | 307c3c953a338f58cd0070aa5b1bb737bde08cc9 | [
"MIT"
] | null | null | null | Mask/Interpolate slider without prepolate.py | typedev/RoboFont-1 | 307c3c953a338f58cd0070aa5b1bb737bde08cc9 | [
"MIT"
] | null | null | null | """
This slider controls interpolation between foreground and mask layers.
Initial position for slider is at 1.0 (current foreground outline)
Sliding left to 0.0 interpolates to mask
Sliding right to 3.0 extrapolates away from mask.
NOTE:
Running this script opens an observer on the current glyph in the Glyph View window.
The slider window must then be closed before it can be used on another glyph.
"""
from fontTools.misc.transform import Transform
from vanilla import *
g = CurrentGlyph()
g.prepareUndo('interpolate with mask')
################### PREPOLATION ###################################
## Auto contour order and startpoints for foreground:
#g.autoContourOrder()
#for c in g:
# c.autoStartSegment()
## Auto contour order and startpoints for mask:
g.flipLayers("foreground", "mask")
#g.autoContourOrder()
#for c in g:
# c.autoStartSegment()
## Gather point info for mask layer:
maskpoints = []
for i in range(len(g)):
maskpoints.append([])
for j in range(len(g[i])):
maskpoints[i].append((g[i][j].onCurve.x,g[i][j].onCurve.y))
## Gather point info for foreground layer:
g.flipLayers("mask", "foreground")
forepoints = []
for i in range(len(g)):
forepoints.append([])
for j in range(len(g[i])):
forepoints[i].append((g[i][j].onCurve.x,g[i][j].onCurve.y))
## Compare length of each contour in mask and foreground:
n = 0
print '-------------------------------'
print 'Checking ' + str(g.name) + ' without auto ordering'
def gradient(point1, point2):
grad = (point2[1] - point1[1])/(point2[0] - point1[0] + 0.9)
return grad
mismatched = []
if len(maskpoints) == len(forepoints):
for i in range(len(forepoints)):
print '-------------------------------'
if len(forepoints[i]) == len(maskpoints[i]):
print 'Contour ' + str(i) + ' matches'
else:
n = n + 1
print 'Contour ' + str(i) + ':'
print str(len(forepoints[i])) + ' points in foreground'
print str(len(maskpoints[i])) + ' points in mask'
print '-------------------------------'
if len(forepoints[i]) > len(maskpoints[i]):
count = len(maskpoints[i])
prob = 'mask'
else:
count = len(forepoints[i])
prob = 'foreground'
for j in range(-1,count - 1):
def foregradient(a,b):
foregrad = gradient(forepoints[a][b],forepoints[a][b+1])
return foregrad
def maskgradient(a,b):
maskgrad = gradient(maskpoints[a][b],maskpoints[a][b+1])
return maskgrad
foregrad = foregradient(i,j)
maskgrad = maskgradient(i,j)
if foregrad > 20:
foregrad = 100
if maskgrad > 20:
maskgrad = 100
if foregrad < -20:
foregrad = -100
if maskgrad < -20:
maskgrad = -100
if abs(foregrad - maskgrad) > 0.4:
mismatched.append(j+1)
mismatched = [mismatched[0]]
## Find second problem:
if prob == 'foreground':
foregrad = foregradient(i,j)
maskgrad = maskgradient(i,j+1)
else:
foregrad = foregradient(i,j+1)
maskgrad = maskgradient(i,j)
if foregrad > 20:
foregrad = 100
if maskgrad > 20:
maskgrad = 100
if foregrad < -20:
foregrad = -100
if maskgrad < -20:
maskgrad = -100
if abs(foregrad - maskgrad) > 0.4:
mismatched.append(j+1)
if abs(len(forepoints[i]) - len(maskpoints[i])) == 1:
if len(mismatched) == 1:
print 'Check between points ' + str(mismatched[0]) + ' and ' + str(mismatched[0] + 1)
else:
print 'Check amongst the last few points'
else:
if len(mismatched) == 2:
print 'Check between points ' + str(mismatched[0]) + ' and ' + str(mismatched[0] + 1)
print 'Check between points ' + str(mismatched[1]) + ' and ' + str(mismatched[1] + 1)
elif len(mismatched) == 1:
print 'Check between points ' + str(mismatched[0]) + ' and ' + str(mismatched[0] + 1)
print 'Check amongst the last few points'
else:
print 'Check amongst the last few points'
else:
print '-------------------------------'
print 'Foreground has ' + str(len(forepoints)) + ' contours'
print 'Mask has ' + str(len(maskpoints)) + ' contours'
print '-------------------------------'
################### INTERP SLIDER ###################################
## Collect mask points:
g.flipLayers("foreground", "mask")
all_mask_points = []
all_mask_points_length = []
for i in range(len(g)):
all_mask_points.append([])
for j in range(len(g[i].points)):
all_mask_points[i].append((g[i].points[j].x, g[i].points[j].y))
all_mask_points_length.append(j)
## Collect initial foreground points:
g.flipLayers("mask", "foreground")
all_fore_points = []
all_fore_points_length = []
for i in range(len(g)):
all_fore_points.append([])
for j in range(len(g[i].points)):
all_fore_points[i].append((g[i].points[j].x, g[i].points[j].y))
all_fore_points_length.append(j)
## Check for compatibility:
if n > 0:
pass
else:
## if compatible, interpolate:
def interp_fore(Glif, int_val):
for i in range(len(Glif)):
for j in range(len(Glif[i].points)):
fore_point = all_fore_points[i][j]
mask_point = all_mask_points[i][j]
Glif[i].points[j].x = mask_point[0] + ((fore_point[0] - mask_point[0]) * int_val)
Glif[i].points[j].y = mask_point[1] + ((fore_point[1] - mask_point[1]) * int_val)
class InterpWithMaskWindow:
def __init__(self, glyph):
if glyph is None:
print "There should be a glyph window selected."
return
self.glyph = glyph
self.w = Window((600, 36),"Interpolate Foreground with Mask (no AutoOrder):")
self.w.int = Slider((10, 6, -10, 22), value=1,
maxValue=3,
minValue=0,
callback=self.adjust)
self.w.open()
def adjust(self, sender):
int_val = self.w.int.get()
print round(int_val, 2)
Glif = self.glyph
interp_fore(Glif, int_val)
Glif.update()
OpenWindow(InterpWithMaskWindow, CurrentGlyph())
g.update()
g.performUndo()
t = Transform().translate(0, 0)
g.transform(t, doComponents=True)
g.update()
| 30.614754 | 105 | 0.493574 | 826 | 7,470 | 4.405569 | 0.2046 | 0.023083 | 0.030228 | 0.024182 | 0.387744 | 0.369057 | 0.325639 | 0.318494 | 0.246496 | 0.19978 | 0 | 0.023829 | 0.359572 | 7,470 | 243 | 106 | 30.740741 | 0.736831 | 0.065863 | 0 | 0.393103 | 0 | 0 | 0.105844 | 0.024091 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.006897 | 0.013793 | null | null | 0.144828 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c327543b799027a0d190954bd8149ab8b7d7603f | 809 | py | Python | scrapets/extract.py | ownport/scrapets | e52609aae4d55fb9d4315f90d4e2fe3804ef8ff6 | [
"MIT"
] | 2 | 2017-06-22T15:45:52.000Z | 2019-08-23T03:34:40.000Z | scrapets/extract.py | ownport/scrapets | e52609aae4d55fb9d4315f90d4e2fe3804ef8ff6 | [
"MIT"
] | 9 | 2016-10-23T17:56:34.000Z | 2016-12-12T10:39:23.000Z | scrapets/extract.py | ownport/scrapets | e52609aae4d55fb9d4315f90d4e2fe3804ef8ff6 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from HTMLParser import HTMLParser
# -------------------------------------------------------
#
# LinkExtractor: extract links from html page
#
class BaseExtractor(HTMLParser):
def __init__(self):
HTMLParser.__init__(self)
self._links = []
@property
def links(self):
return self._links
class LinkExtractor(BaseExtractor):
def handle_starttag(self, tag, attrs):
if tag == 'a':
links = [v for k,v in attrs if k == 'href' and v not in self._links]
self._links.extend(links)
class ImageLinkExtractor(BaseExtractor):
def handle_starttag(self, tag, attrs):
if tag == 'img':
links = [v for k,v in attrs if k == 'src' and v not in self._links]
self._links.extend(links)
| 20.74359 | 80 | 0.566131 | 95 | 809 | 4.652632 | 0.368421 | 0.122172 | 0.099548 | 0.135747 | 0.479638 | 0.479638 | 0.479638 | 0.479638 | 0.479638 | 0.171946 | 0 | 0.001675 | 0.262052 | 809 | 38 | 81 | 21.289474 | 0.738693 | 0.15204 | 0 | 0.222222 | 0 | 0 | 0.016176 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.055556 | 0.055556 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c3283cdb2fefed11f9dc322c324670fa2d4fbccd | 1,069 | py | Python | tests/unit/utils/filebuffer_test.py | gotcha/salt | 7b84c704777d3d2062911895dc3fdf93d40e9848 | [
"Apache-2.0"
] | 2 | 2019-03-30T02:12:56.000Z | 2021-03-08T18:59:46.000Z | tests/unit/utils/filebuffer_test.py | gotcha/salt | 7b84c704777d3d2062911895dc3fdf93d40e9848 | [
"Apache-2.0"
] | null | null | null | tests/unit/utils/filebuffer_test.py | gotcha/salt | 7b84c704777d3d2062911895dc3fdf93d40e9848 | [
"Apache-2.0"
] | 1 | 2020-12-04T11:28:06.000Z | 2020-12-04T11:28:06.000Z | # -*- coding: utf-8 -*-
'''
tests.unit.utils.filebuffer_test
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:codeauthor: :email:`Pedro Algarvio (pedro@algarvio.me)`
:copyright: © 2012 by the SaltStack Team, see AUTHORS for more details.
:license: Apache 2.0, see LICENSE for more details.
'''
# Import salt libs
from saltunittest import TestCase, TestLoader, TextTestRunner
from salt.utils.filebuffer import BufferedReader, InvalidFileMode
class TestFileBuffer(TestCase):
def test_read_only_mode(self):
with self.assertRaises(InvalidFileMode):
BufferedReader('/tmp/foo', mode='a')
with self.assertRaises(InvalidFileMode):
BufferedReader('/tmp/foo', mode='ab')
with self.assertRaises(InvalidFileMode):
BufferedReader('/tmp/foo', mode='w')
with self.assertRaises(InvalidFileMode):
BufferedReader('/tmp/foo', mode='wb')
if __name__ == "__main__":
loader = TestLoader()
tests = loader.loadTestsFromTestCase(TestFileBuffer)
TextTestRunner(verbosity=1).run(tests)
| 30.542857 | 75 | 0.663237 | 111 | 1,069 | 6.288288 | 0.558559 | 0.045845 | 0.114613 | 0.200573 | 0.338109 | 0.338109 | 0.338109 | 0.338109 | 0 | 0 | 0 | 0.009249 | 0.190833 | 1,069 | 34 | 76 | 31.441176 | 0.796532 | 0.268475 | 0 | 0.25 | 0 | 0 | 0.061089 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.0625 | false | 0 | 0.125 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c3292201406d3697087e8916c4dd2621e50dc55a | 192 | py | Python | src/wwucs/bot/__init__.py | reillysiemens/wwucs-bot | 9e48ba5dc981e36cd8b18345bcbd3768c3deeeb8 | [
"0BSD"
] | null | null | null | src/wwucs/bot/__init__.py | reillysiemens/wwucs-bot | 9e48ba5dc981e36cd8b18345bcbd3768c3deeeb8 | [
"0BSD"
] | null | null | null | src/wwucs/bot/__init__.py | reillysiemens/wwucs-bot | 9e48ba5dc981e36cd8b18345bcbd3768c3deeeb8 | [
"0BSD"
] | null | null | null | """WWUCS Bot module."""
__all__ = [
"__author__",
"__email__",
"__version__",
]
__author__ = "Reilly Tucker Siemens"
__email__ = "reilly@tuckersiemens.com"
__version__ = "0.1.0"
| 16 | 38 | 0.651042 | 19 | 192 | 5.105263 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019108 | 0.182292 | 192 | 11 | 39 | 17.454545 | 0.598726 | 0.088542 | 0 | 0 | 0 | 0 | 0.473373 | 0.142012 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c329db170d0245164f12a99cffcce2a4d1c0ef5a | 551 | py | Python | plugins/google_cloud_compute/komand_google_cloud_compute/actions/disk_detach/action.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 46 | 2019-06-05T20:47:58.000Z | 2022-03-29T10:18:01.000Z | plugins/google_cloud_compute/komand_google_cloud_compute/actions/disk_detach/action.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 386 | 2019-06-07T20:20:39.000Z | 2022-03-30T17:35:01.000Z | plugins/google_cloud_compute/komand_google_cloud_compute/actions/disk_detach/action.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 43 | 2019-07-09T14:13:58.000Z | 2022-03-28T12:04:46.000Z | import insightconnect_plugin_runtime
from .schema import DiskDetachInput, DiskDetachOutput, Input, Component
class DiskDetach(insightconnect_plugin_runtime.Action):
def __init__(self):
super(self.__class__, self).__init__(
name="disk_detach", description=Component.DESCRIPTION, input=DiskDetachInput(), output=DiskDetachOutput()
)
def run(self, params={}):
return self.connection.client.disk_detach(
params.get(Input.ZONE), params.get(Input.INSTANCE), params.get(Input.DEVICENAME)
)
| 34.4375 | 117 | 0.718693 | 57 | 551 | 6.631579 | 0.54386 | 0.071429 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176044 | 551 | 15 | 118 | 36.733333 | 0.832599 | 0 | 0 | 0 | 0 | 0 | 0.019964 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.181818 | 0.090909 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c332e2fe6b727044df2454bc3e05a8e3dca73a1d | 4,773 | py | Python | examples/authentication/demo_auth.py | jordiyeh/safrs | eecfaf6d63ed44b9dc44b7b86c600db02989b512 | [
"MIT"
] | null | null | null | examples/authentication/demo_auth.py | jordiyeh/safrs | eecfaf6d63ed44b9dc44b7b86c600db02989b512 | [
"MIT"
] | null | null | null | examples/authentication/demo_auth.py | jordiyeh/safrs | eecfaf6d63ed44b9dc44b7b86c600db02989b512 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
#
# This is a demo application to demonstrate the functionality of the safrs_rest REST API with authentication
#
# you will have to install the requirements:
# pip3 install passlib flask_httpauth flask_login
#
# This script can be ran standalone like this:
# python3 demo_auth.py [Listener-IP]
# This will run the example on http://Listener-Ip:5000
#
# - A database is created and a item is added
# - User is created and the User endpoint is protected by user:admin & pass: adminPASS
# - swagger2 documentation is generated
#
import sys
import os
import logging
import builtins
from functools import wraps
from flask import Flask, redirect, jsonify, make_response
from flask import abort, request, g, url_for
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import Column, Integer, String
from safrs import SAFRSBase, SAFRSJSONEncoder, Api, jsonapi_rpc
from flask_swagger_ui import get_swaggerui_blueprint
from flask_sqlalchemy import SQLAlchemy
from flask_httpauth import HTTPBasicAuth
from passlib.apps import custom_app_context as pwd_context
from itsdangerous import (TimedJSONWebSignatureSerializer as Serializer, BadSignature, SignatureExpired)
# the flask.ext namespace was removed in Flask 1.0; import flask_login directly
from flask_login import LoginManager, UserMixin, \
    login_required, login_user, logout_user
db = SQLAlchemy()
auth = HTTPBasicAuth()
# Example sqla database object
class Item(SAFRSBase, db.Model):
    '''
    description: Item description
    '''
    __tablename__ = 'items'
    id = Column(String, primary_key=True)
    name = Column(String, default='')
class User(SAFRSBase, db.Model):
    '''
    description: User description
    '''
    __tablename__ = 'users'
    id = db.Column(String, primary_key=True)
    username = db.Column(db.String(32), index=True)
    password_hash = db.Column(db.String(64))
    custom_decorators = [auth.login_required]

    @jsonapi_rpc(http_methods=['POST'])
    def hash_password(self, password):
        self.password_hash = pwd_context.encrypt(password)

    @jsonapi_rpc(http_methods=['POST'])
    def verify_password(self, password):
        return pwd_context.verify(password, self.password_hash)

    @jsonapi_rpc(http_methods=['POST'])
    def generate_auth_token(self, expiration=600):
        s = Serializer(app.config['SECRET_KEY'], expires_in=expiration)
        return s.dumps({'id': self.id})

    @staticmethod
    @jsonapi_rpc(http_methods=['POST'])
    def verify_auth_token(token):
        s = Serializer(app.config['SECRET_KEY'])
        try:
            data = s.loads(token)
        except SignatureExpired:
            return None # valid token, but expired
        except BadSignature:
            return None # invalid token
        user = User.query.get(data['id'])
        return user
def start_app(app):
    api = Api(app, api_spec_url='/api/swagger', host='{}:{}'.format(HOST, PORT), schemes=["http"])
    item = Item(name='test', email='em@il')
    user = User(username='admin')
    user.hash_password('adminPASS')
    api.expose_object(Item)
    api.expose_object(User)
    # Set the JSON encoder used for object to json marshalling
    app.json_encoder = SAFRSJSONEncoder
    # Register the API at /api/docs
    swaggerui_blueprint = get_swaggerui_blueprint('/api', '/api/swagger.json')
    app.register_blueprint(swaggerui_blueprint, url_prefix='/api')
    print('Starting API: http://{}:{}/api'.format(HOST, PORT))
    app.run(host=HOST, port=PORT)
#
# APP Initialization
#
app = Flask('demo_app')
app.config.update( SQLALCHEMY_DATABASE_URI = 'sqlite://',
SQLALCHEMY_TRACK_MODIFICATIONS = False,
SECRET_KEY = b'sdqfjqsdfqizroqnxwc',
DEBUG = True)
HOST = sys.argv[1] if len(sys.argv) > 1 else '0.0.0.0'
PORT = 5000
db.init_app(app)
#
# Authentication and custom routes
#
@auth.verify_password
def verify_password(username_or_token, password):
user = User.verify_auth_token(username_or_token)
if not user:
# try to authenticate with username/password
user = User.query.filter_by(username=username_or_token).first()
if not user or not user.verify_password(password):
return False
print('Authentication Successful for "{}"'.format(user.username))
return True
@app.route('/')
def goto_api():
return redirect('/api')
@app.teardown_appcontext
def shutdown_session(exception=None):
'''cfr. http://flask.pocoo.org/docs/0.12/patterns/sqlalchemy/'''
db.session.remove()
# Start the application
with app.app_context():
db.create_all()
start_app(app)
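The demo above delegates token handling to itsdangerous' `Serializer`. As a rough, self-contained Python 3 sketch of the same idea (an HMAC-signed payload with an expiry timestamp), where `generate_token` and `verify_token` are hypothetical names rather than SAFRS or itsdangerous API:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET_KEY = b'sdqfjqsdfqizroqnxwc'  # same demo secret as the app config above

def generate_token(payload, secret=SECRET_KEY, expiration=600):
    """Serialize a payload with an expiry time and an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(
        {'data': payload, 'exp': time.time() + expiration}, sort_keys=True).encode())
    sig = base64.urlsafe_b64encode(hmac.new(secret, body, hashlib.sha256).digest())
    return body + b'.' + sig  # urlsafe base64 never contains '.', so it is a safe separator

def verify_token(token, secret=SECRET_KEY):
    """Return the payload if the signature is valid and unexpired, else None."""
    body, _, sig = token.partition(b'.')
    good = base64.urlsafe_b64encode(hmac.new(secret, body, hashlib.sha256).digest())
    if not hmac.compare_digest(good, sig):
        return None  # invalid token
    record = json.loads(base64.urlsafe_b64decode(body))
    if record['exp'] < time.time():
        return None  # valid token, but expired
    return record['data']
```

The real `Serializer` adds versioning and salt handling; this sketch only illustrates why an expired or tampered token comes back as `None`, mirroring `verify_auth_token` above.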
| 31.82 | 109 | 0.673581 | 591 | 4,773 | 5.284264 | 0.368866 | 0.020173 | 0.017931 | 0.026897 | 0.099904 | 0.083253 | 0.021774 | 0 | 0 | 0 | 0 | 0.007297 | 0.224806 | 4,773 | 149 | 110 | 32.033557 | 0.836757 | 0.191284 | 0 | 0.089888 | 0 | 0 | 0.063479 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.089888 | false | 0.11236 | 0.179775 | 0.022472 | 0.47191 | 0.05618 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c333f525069086ebb8689eece355d91dd6b64f69 | 8,757 | py | Python | model/BPE.py | djmhunt/TTpy | 0f0997314bf0f54831494b2ef1a64f1bff95c097 | [
"MIT"
] | null | null | null | model/BPE.py | djmhunt/TTpy | 0f0997314bf0f54831494b2ef1a64f1bff95c097 | [
"MIT"
] | 4 | 2020-04-19T11:43:41.000Z | 2020-07-21T09:57:51.000Z | model/BPE.py | djmhunt/TTpy | 0f0997314bf0f54831494b2ef1a64f1bff95c097 | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
"""
:Author: Dominic Hunt
"""
import logging

import numpy as np
import scipy as sp
import scipy.stats  # make sp.stats available; "import scipy" alone does not import submodules
import collections
import itertools

from model.modelTemplate import Model


class BPE(Model):
    """The Bayesian predictor model

    Attributes
    ----------
    Name : string
        The name of the class used when recording what has been used.

    Parameters
    ----------
    alpha : float, optional
        Learning rate parameter
    epsilon : float, optional
        Noise parameter. The larger it is the less likely the model is to
        choose the highest expected reward
    number_actions : integer, optional
        The maximum number of valid actions the model can expect to receive.
        Default 2.
    number_cues : integer, optional
        The initial maximum number of stimuli the model can expect to receive.
        Default 1.
    number_critics : integer, optional
        The number of different reaction learning sets.
        Default number_actions*number_cues
    validRewards : list, np.ndarray, optional
        The different reward values that can occur in the task.
        Default ``array([0, 1])``
    action_codes : dict with string or int as keys and int values, optional
        A dictionary used to convert between the action references used by the
        task or dataset and references used in the models to describe the order
        in which the action information is stored.
    dirichletInit : float, optional
        The initial values for values of the dirichlet distribution.
        Normally 0, 1/2 or 1. Default 1
    prior : array of floats in ``[0, 1]``, optional
        Ignored in this case
    stimFunc : function, optional
        The function that transforms the stimulus into a form the model can
        understand and a string to identify it later. Default is blankStim
    rewFunc : function, optional
        The function that transforms the reward into a form the model can
        understand. Default is blankRew
    decFunc : function, optional
        The function that takes the internal values of the model and turns them
        in to a decision. Default is model.decision.discrete.weightProb

    See Also
    --------
    model.BP : This model is heavily based on that one
    """

    def __init__(self, alpha=0.3, epsilon=0.1, dirichletInit=1,
                 validRewards=np.array([0, 1]), **kwargs):
        super(BPE, self).__init__(**kwargs)

        self.alpha = alpha
        self.epsilon = epsilon
        self.validRew = validRewards
        self.rewLoc = collections.OrderedDict(
            (k, v) for k, v in itertools.izip(self.validRew, range(len(self.validRew))))

        self.dirichletVals = np.ones(
            (self.number_actions, self.number_cues, len(self.validRew))) * dirichletInit
        self.expectations = self.updateExpectations(self.dirichletVals)

        self.parameters["epsilon"] = self.epsilon
        self.parameters["alpha"] = self.alpha
        self.parameters["dirichletInit"] = dirichletInit

        # Recorded information
        self.recDirichletVals = []

    def returnTaskState(self):
        """Returns all the relevant data for this model

        Returns
        -------
        results : dict
            The dictionary contains a series of keys including Name,
            Probabilities, Actions and Events.
        """
        results = self.standardResultOutput()
        results["dirichletVals"] = np.array(self.recDirichletVals)
        return results

    def storeState(self):
        """
        Stores the state of all the important variables so that they can be
        accessed later
        """
        self.storeStandardResults()
        self.recDirichletVals.append(self.dirichletVals.copy())

    def rewardExpectation(self, observation):
        """Calculate the estimated reward based on the action and stimuli

        This contains parts that are task dependent

        Parameters
        ----------
        observation : {int | float | tuple}
            The set of stimuli

        Returns
        -------
        actionExpectations : array of floats
            The expected rewards for each action
        stimuli : list of floats
            The processed observations
        activeStimuli : list of [0, 1] mapping to [False, True]
            A list of the stimuli that were or were not present
        """
        activeStimuli, stimuli = self.stimulus_shaper.processStimulus(observation)
        actionExpectations = self._actExpectations(self.dirichletVals, stimuli)
        return actionExpectations, stimuli, activeStimuli

    def delta(self, reward, expectation, action, stimuli):
        """
        Calculates the comparison between the reward and the expectation

        Parameters
        ----------
        reward : float
            The reward value
        expectation : float
            The expected reward value
        action : int
            The chosen action
        stimuli : {int | float | tuple | None}
            The stimuli received

        Returns
        -------
        delta
        """
        modReward = self.reward_shaper.processFeedback(reward, action, stimuli)
        return modReward

    def updateModel(self, delta, action, stimuli, stimuliFilter):
        """
        Parameters
        ----------
        delta : float
            The difference between the reward and the expected reward
        action : int
            The action chosen by the model in this trialstep
        stimuli : list of float
            The weights of the different stimuli in this trialstep
        stimuliFilter : list of bool
            A list describing if a stimulus cue is present in this trialstep
        """
        # Find the new activities
        self._newExpect(action, delta, stimuli)

        # Calculate the new probabilities
        # We need to combine the expectations before calculating the probabilities
        actionExpectations = self._actExpectations(self.dirichletVals, stimuli)
        self.probabilities = self.calcProbabilities(actionExpectations)

    def _newExpect(self, action, delta, stimuli):
        self.dirichletVals[action, :, self.rewLoc[delta]] += self.alpha * stimuli / np.sum(stimuli)
        self.expectations = self.updateExpectations(self.dirichletVals)

    def _actExpectations(self, dirichletVals, stimuli):
        # If there are multiple possible stimuli, filter by the active stimuli
        # and calculate the expectations associated with each action.
        if self.number_cues > 1:
            actionExpectations = self.calcActExpectations(
                self.actStimMerge(dirichletVals, stimuli))
        else:
            actionExpectations = self.calcActExpectations(dirichletVals[:, 0, :])
        return actionExpectations

    def calcProbabilities(self, actionValues):
        # type: (np.ndarray) -> np.ndarray
        """
        Calculate the probabilities associated with the actions

        Parameters
        ----------
        actionValues : 1D ndArray of floats

        Returns
        -------
        probArray : 1D ndArray of floats
            The probabilities associated with the actionValues
        """
        cbest = actionValues == max(actionValues)
        # use a float literal so this does not truncate to 0 under
        # Python 2 integer division
        deltaEpsilon = self.epsilon * (1.0 / self.number_actions)
        bestEpsilon = (1 - self.epsilon) / np.sum(cbest) + deltaEpsilon
        probArray = bestEpsilon * cbest + deltaEpsilon * (1 - cbest)
        return probArray

    def actorStimulusProbs(self):
        """
        Calculates in the model-appropriate way the probability of each action.

        Returns
        -------
        probabilities : 1D ndArray of floats
            The probabilities associated with the action choices
        """
        probabilities = self.calcProbabilities(self.expectedRewards)
        return probabilities

    def actStimMerge(self, dirichletVals, stimuli):
        dirVals = dirichletVals * np.expand_dims(
            np.repeat([stimuli], self.number_actions, axis=0), 2)
        actDirVals = np.sum(dirVals, 1)
        return actDirVals

    def calcActExpectations(self, dirichletVals):
        actExpect = np.fromiter(
            (np.sum(sp.stats.dirichlet(d).mean() * self.validRew) for d in dirichletVals),
            float, count=self.number_actions)
        return actExpect

    def updateExpectations(self, dirichletVals):
        def meanFunc(p, r=[]):
            return np.sum(sp.stats.dirichlet(p).mean() * r)

        expectations = np.apply_along_axis(meanFunc, 2, dirichletVals, r=self.validRew)
        return expectations
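The epsilon-greedy rule in `calcProbabilities` above can be restated without NumPy, which makes the arithmetic easy to check by hand (`epsilon_greedy_probabilities` is a hypothetical name for this sketch, not part of TTpy):

```python
def epsilon_greedy_probabilities(action_values, epsilon=0.1):
    """Spread `epsilon` uniformly over all actions, then split the
    remaining 1 - epsilon evenly among the best-valued actions."""
    n = len(action_values)
    best = max(action_values)
    n_best = sum(1 for v in action_values if v == best)
    base = epsilon / n                      # deltaEpsilon in calcProbabilities
    best_p = (1 - epsilon) / n_best + base  # bestEpsilon in calcProbabilities
    return [best_p if v == best else base for v in action_values]
```

For two actions with values [0.2, 0.8] and epsilon = 0.1 this yields roughly [0.05, 0.95], and the probabilities always sum to 1 even when several actions tie for the maximum.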
| 33.680769 | 145 | 0.628183 | 934 | 8,757 | 5.857602 | 0.296574 | 0.03418 | 0.008042 | 0.014805 | 0.127216 | 0.099799 | 0.057394 | 0.018278 | 0.018278 | 0 | 0 | 0.005361 | 0.29702 | 8,757 | 259 | 146 | 33.810811 | 0.883366 | 0.454836 | 0 | 0.059701 | 0 | 0 | 0.010109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208955 | false | 0 | 0.089552 | 0.014925 | 0.462687 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c336028d3170491bb761554d05258241830c82fc | 1,688 | py | Python | affiliates/banners/tests/__init__.py | glogiotatidis/affiliates | 34d0ded8e24be9dd207d6419a5157dc8ce34bc06 | [
"BSD-3-Clause"
] | 15 | 2015-01-01T07:17:44.000Z | 2020-11-09T06:28:29.000Z | affiliates/banners/tests/__init__.py | glogiotatidis/affiliates | 34d0ded8e24be9dd207d6419a5157dc8ce34bc06 | [
"BSD-3-Clause"
] | 16 | 2015-02-25T23:17:27.000Z | 2015-08-20T10:28:18.000Z | affiliates/banners/tests/__init__.py | glogiotatidis/affiliates | 34d0ded8e24be9dd207d6419a5157dc8ce34bc06 | [
"BSD-3-Clause"
] | 12 | 2015-01-17T20:57:03.000Z | 2019-11-03T15:04:31.000Z |
from django.db.models.signals import post_init

from factory import DjangoModelFactory, Sequence, SubFactory
from factory.django import mute_signals

from affiliates.banners import models


class CategoryFactory(DjangoModelFactory):
    FACTORY_FOR = models.Category

    name = Sequence(lambda n: 'test{0}'.format(n))


class BannerFactory(DjangoModelFactory):
    ABSTRACT_FACTORY = True

    category = SubFactory(CategoryFactory)
    name = Sequence(lambda n: 'test{0}'.format(n))
    destination = 'https://mozilla.org/'
    visible = True


class ImageBannerFactory(BannerFactory):
    FACTORY_FOR = models.ImageBanner


@mute_signals(post_init)
class ImageVariationFactory(DjangoModelFactory):
    ABSTRACT_FACTORY = True

    color = 'Blue'
    locale = 'en-us'
    image = 'uploads/image_banners/test.png'


class ImageBannerVariationFactory(ImageVariationFactory):
    FACTORY_FOR = models.ImageBannerVariation

    banner = SubFactory(ImageBannerFactory)


class TextBannerFactory(BannerFactory):
    FACTORY_FOR = models.TextBanner


class TextBannerVariationFactory(DjangoModelFactory):
    FACTORY_FOR = models.TextBannerVariation

    banner = SubFactory(TextBannerFactory)
    locale = 'en-us'
    text = Sequence(lambda n: 'test{0}'.format(n))


class FirefoxUpgradeBannerFactory(BannerFactory):
    FACTORY_FOR = models.FirefoxUpgradeBanner


@mute_signals(post_init)
class FirefoxUpgradeBannerVariationFactory(ImageVariationFactory):
    FACTORY_FOR = models.FirefoxUpgradeBannerVariation

    banner = SubFactory(FirefoxUpgradeBannerFactory)
    image = 'uploads/firefox_upgrade_banners/test.png'
    upgrade_image = 'uploads/firefox_upgrade_banners/test_upgrade.png'
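The `Sequence(lambda n: 'test{0}'.format(n))` declarations above produce a fresh value per created object by feeding an auto-incrementing counter to the lambda. A minimal stdlib-only sketch of that behavior (`MiniSequence` is a hypothetical stand-in, not factory_boy's actual implementation):

```python
import itertools

class MiniSequence(object):
    """Stand-in for factory.Sequence: calls `function` with an
    auto-incrementing counter each time a value is drawn."""
    def __init__(self, function):
        self.function = function
        self._counter = itertools.count()

    def evaluate(self):
        # each call consumes the next integer from the counter
        return self.function(next(self._counter))

name = MiniSequence(lambda n: 'test{0}'.format(n))
values = [name.evaluate() for _ in range(3)]  # 'test0', 'test1', 'test2'
```

In factory_boy proper the counter is shared across a factory class (and resettable), which is why successive `CategoryFactory()` calls above never collide on `name`.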
| 25.575758 | 70 | 0.776659 | 166 | 1,688 | 7.76506 | 0.349398 | 0.054306 | 0.086889 | 0.04422 | 0.171451 | 0.134213 | 0.076804 | 0.076804 | 0 | 0 | 0 | 0.002069 | 0.140995 | 1,688 | 65 | 71 | 25.969231 | 0.886897 | 0 | 0 | 0.205128 | 0 | 0 | 0.102488 | 0.069905 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.102564 | 0 | 0.948718 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c33b670e9c5af9440c581f7412728d80706d9eb8 | 5,240 | py | Python | bin/runinterpret.py | christine-liu/somaticCNVpipeline | 254b709e611e56e5c891c663508ac79fa1093c07 | [
"MIT"
] | null | null | null | bin/runinterpret.py | christine-liu/somaticCNVpipeline | 254b709e611e56e5c891c663508ac79fa1093c07 | [
"MIT"
] | 2 | 2018-03-09T00:22:18.000Z | 2019-03-12T11:26:42.000Z | bin/runinterpret.py | christine-liu/somaticCNVpipeline | 254b709e611e56e5c891c663508ac79fa1093c07 | [
"MIT"
] | 6 | 2018-03-09T02:10:49.000Z | 2020-05-14T09:19:11.000Z |
#!/usr/bin/python

import os

import numpy as np

import common
from interpret import qcfile, funcfile, analyzefiles


def runAll(args):
    print('\n\n\nYou have requested to analyze CNV call data')
    print('\tWARNING:')
    print('\t\tIF USING ANY REFERENCES OTHER THAN THOSE I PROVIDE I CANNOT GUARANTEE RESULT ACCURACY')
    print('\n')

    # Set up environment #
    args.AnalysisDirectory = common.fixDirName(args.AnalysisDirectory)
    folderDict = {'LowessBinCounts': args.lowess,
                  'Segments': args.segments,
                  'PipelineStats': args.countstats}
    for i in list(folderDict.keys()):
        if not folderDict[i]:
            folderDict[i] = args.AnalysisDirectory + i + '/'
        else:
            folderDict[i] = common.fixDirName(folderDict[i])

    QCdir = args.AnalysisDirectory + 'QC/'
    CNVdir = args.AnalysisDirectory + 'CNVlists/'
    summaryDir = args.AnalysisDirectory + 'SummaryFiles/'
    PloidyPlotDir = args.AnalysisDirectory + 'PloidyDeterminationPlots/'
    CNplotDir = args.AnalysisDirectory + 'CopyNumberProfilePlots/'
    ChromPlotDir = args.AnalysisDirectory + 'ChromosomeCopyNumberPlots/'
    for i in [args.AnalysisDirectory, QCdir, CNVdir, summaryDir, PloidyPlotDir, CNplotDir, ChromPlotDir]:
        common.makeDir(i)

    # get list of samples to process
    # will involve checking infofile (if present) and whether required input files exist
    sampleFiles = common.getSampleList(folderDict['Segments'], args.samples, 'segments')
    sampleNames = [x.split('/')[-1].split('.')[0] for x in sampleFiles]

    # info = common.importInfoFile(args.infofile, args.columns, 'interpret')
    # if args.infofile:
    #     refArray = info
    # else:
    #     thisDtype = info
    #     refArray = np.array(
    #         [(x, 1, 'unk',) for x in sampleNames],
    #         dtype=thisDtype)

    # QC assessment #
    # qcfile.runQCone(sampleNames[0], args.species, folderDict['PipelineStats'], folderDict['LowessBinCounts'], folderDict['Segments'], QCdir, PloidyPlotDir)
    argList = [(x, args.species, folderDict['PipelineStats'], folderDict['LowessBinCounts'],
                folderDict['Segments'], QCdir, PloidyPlotDir) for x in sampleNames]
    common.daemon(qcfile.runQCone, argList, 'assess sample quality')

    analysisSamples = []
    ploidyDict = {}
    genderDict = {}
    mergeQCfile = summaryDir + 'QCmetrics.txt'
    OUT = open(mergeQCfile, 'w')
    OUT.write('Name\tReads\tMAPD\tCS\tPloidy\tGender\tPASS\n')
    for i in sampleNames:
        IN = open(QCdir + i + '.qcTEMP.txt', 'r')
        data = IN.readline()
        OUT.write(data)
        data = data.rstrip().split('\t')
        if data[-1] == 'True':
            analysisSamples.append(i)
            ploidyDict[i] = float(data[4])
            genderDict[i] = data[-2]
        IN.close()
        os.remove(QCdir + i + '.qcTEMP.txt')
    OUT.close()
    os.rmdir(QCdir)

    # FUnC: CNV filtering #
    if args.nofilter:
        print('\nFURTHER CODE IS ONLY DEVELOPED FOR WHEN FUnC IS IMPLEMENTED, EXITING NOW\n\n\n')
        raise SystemExit
    # funcfile.FUnCone(analysisSamples[0], args.species, folderDict['Segments'], CNVdir,
    #                  ploidyDict[analysisSamples[0]], genderDict[analysisSamples[0]])
    argList = [(x, args.species, folderDict['Segments'], CNVdir, ploidyDict[x], genderDict[x])
               for x in analysisSamples]
    common.daemon(funcfile.FUnCone, argList, 'remove unreliable CNV calls')

    # CNV analysis #
    # summaryStats = analyzefiles.analyzeOne(analysisSamples[0], args.species, CNVdir, folderDict['LowessBinCounts'], CNplotDir, ChromPlotDir, ploidyDict[analysisSamples[0]], genderDict[analysisSamples[0]])
    # summaryStats = [summaryStats]
    argList = [(x, args.species, CNVdir, folderDict['LowessBinCounts'], CNplotDir, ChromPlotDir,
                ploidyDict[x], genderDict[x]) for x in analysisSamples]
    summaryStats = common.daemon(analyzefiles.analyzeOne, argList, 'create summary files')

    cellStatsFile = summaryDir + 'CellStats.txt'
    chromAmpFile = summaryDir + 'ChromosomeAmplifiedPercent.txt'
    chromDelFile = summaryDir + 'ChromosomeDeletedPercent.txt'

    # write summary statistics files #
    with open(cellStatsFile, 'w') as CELL, open(chromAmpFile, 'w') as AMP, open(chromDelFile, 'w') as DEL:
        CELL.write('Sample\tDeletionNumber\tAmplificationNumber\tTotalCNVnumber\tDeletedMB\tAmplifiedMB\tNetDNAalterdMB\n')
        chromHeader = 'Sample\t' + '\t'.join(summaryStats[0]['chroms']) + '\n'
        AMP.write(chromHeader)
        DEL.write(chromHeader)
        for i, j in enumerate(analysisSamples):
            CELL.write(str(j + '\t'))
            cellOut = [summaryStats[i]['cellStats']['delCount'],
                       summaryStats[i]['cellStats']['ampCount'],
                       summaryStats[i]['cellStats']['delCount'] + summaryStats[i]['cellStats']['ampCount'],
                       np.round(summaryStats[i]['cellStats']['delMB'], 3),
                       np.round(summaryStats[i]['cellStats']['ampMB'], 3),
                       np.round(summaryStats[i]['cellStats']['ampMB'] - summaryStats[i]['cellStats']['delMB'], 3)]
            cellOut = '\t'.join(map(str, cellOut)) + '\n'
            CELL.write(cellOut)
            AMP.write(str(j + '\t'))
            ampOut = [np.round(summaryStats[i]['chromAmp'][x], 3) for x in summaryStats[0]['chroms']]
            ampOut = '\t'.join(map(str, ampOut)) + '\n'
            AMP.write(ampOut)
            DEL.write(str(j + '\t'))
            delOut = [np.round(summaryStats[i]['chromDel'][x], 3) for x in summaryStats[0]['chroms']]
            delOut = '\t'.join(map(str, delOut)) + '\n'
            DEL.write(delOut)

    print('\nCNV analysis complete\n\n\n')
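The per-cell row written to CellStats.txt above concatenates counts, rounded megabase totals, and their net difference into one tab-separated line. A small stdlib-only sketch of that formatting step (`cell_stats_row` is a hypothetical helper, not part of the pipeline):

```python
def cell_stats_row(sample, stats, ndigits=3):
    """Format one CellStats.txt line from a per-cell stats dict with
    keys delCount, ampCount, delMB and ampMB."""
    fields = [sample,
              stats['delCount'],
              stats['ampCount'],
              stats['delCount'] + stats['ampCount'],       # TotalCNVnumber
              round(stats['delMB'], ndigits),
              round(stats['ampMB'], ndigits),
              round(stats['ampMB'] - stats['delMB'], ndigits)]  # net altered MB
    return '\t'.join(map(str, fields)) + '\n'
```

Keeping the rounding next to the join mirrors the `np.round(..., 3)` calls in the loop above, so every numeric column lands in the file with at most three decimal places.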
| 31.566265 | 202 | 0.698473 | 602 | 5,240 | 6.079734 | 0.330565 | 0.057377 | 0.048087 | 0.027322 | 0.245355 | 0.224044 | 0.179235 | 0.160109 | 0.051913 | 0.051913 | 0 | 0.0047 | 0.147328 | 5,240 | 165 | 203 | 31.757576 | 0.814458 | 0.184542 | 0 | 0 | 0 | 0 | 0.233396 | 0.065473 | 0.012048 | 0 | 0 | 0 | 0 | 0 | null | null | 0.012048 | 0.048193 | null | null | 0.072289 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c33c7a593798637e5989678bfdadfbeb83157154 | 29,527 | py | Python | mbio/EM/mrc.py | wzmao/mbio | af78cfdb47577199585179c3b04cc6cf3d6b401c | [
"MIT"
] | 2 | 2015-05-28T12:23:02.000Z | 2018-05-25T14:01:17.000Z | mbio/EM/mrc.py | wzmao/mbio | af78cfdb47577199585179c3b04cc6cf3d6b401c | [
"MIT"
] | null | null | null | mbio/EM/mrc.py | wzmao/mbio | af78cfdb47577199585179c3b04cc6cf3d6b401c | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
"""This module contains the MRC file class.
"""

__author__ = 'Wenzhi Mao'

__all__ = ['MRC']


class MRCHeader():

    """A header class for mrc file."""

    def __init__(self, filename=None, **kwargs):
        """Provide the filename to parse or set it later."""
        self.nx = self.ny = self.nz = None
        self.mode = None
        self.nxstart = self.nystart = self.nzstart = None
        self.mx = self.my = self.mz = None
        self.cella = [None] * 3
        self.cellb = [None] * 3
        self.mapc = None
        self.mapr = None
        self.maps = None
        self.dmin = self.dmax = self.dmean = None
        self.ispg = None
        self.nsymbt = None
        self.extra = None
        self.origin = [None] * 3
        self.map = None
        self.machst = None
        self.rms = None
        self.nlabels = None
        self.label = [None] * 10
        self.symdata = None
        self.xstart = self.ystart = self.zstart = None
        if filename:
            from os.path import exists, isfile
            if exists(filename) and isfile(filename):
                from .Cmrc import readHeader
                compress = 1 if filename.lower().endswith('.gz') else 0
                temp = readHeader(filename=filename, header=self, compress=compress)
                if isinstance(temp, tuple):
                    from ..IO.output import printError
                    if temp[0] is None:
                        printError(temp[1])
                    else:
                        printError("Couldn't parse the Error information.")
                    return None
                else:
                    from numpy import array, argsort
                    self = temp
                    for i in xrange(10):
                        self.label[i] = self.label[i][:80]
                        if self.label[i].find('\0') != -1:
                            self.label[i] = self.label[i][:self.label[i].find("\0")]
                        elif self.label[i] == ' ' * 80:
                            self.label[i] = ''
                        self.label[i] = self.label[i].rstrip()
                    if self.symdata:
                        self.symdata = self.symdata[:80]
                        if self.symdata.find('\0') != -1:
                            self.symdata = self.symdata[:self.symdata.find('\0')]
                    if self.extra:
                        self.extra = self.extra[:80]
                        if self.extra.find('\0') != -1:
                            self.extra = self.extra[:self.extra.find('\0')]
                    if self.origin == [0, 0, 0]:
                        self.xstart, self.ystart, self.zstart = array(
                            [self.nxstart * self.cella[0] / self.mx,
                             self.nystart * self.cella[1] / self.my,
                             self.nzstart * self.cella[2] / self.mz])[
                            argsort([self.mapc, self.mapr, self.maps])]
                        self.origin = list(array([self.xstart, self.ystart, self.zstart])[
                            [self.mapc - 1, self.mapr - 1, self.maps - 1]])
                        self.nxstart = self.nystart = self.nzstart = 0
                    else:
                        self.nxstart = self.nystart = self.nzstart = 0
                        self.xstart, self.ystart, self.zstart = array(
                            self.origin)[argsort([self.mapc, self.mapr, self.maps])]
            else:
                from ..IO.output import printError
                printError("The file doesn't exist or is not a file.")
    def parseHeader(self, filename=None, **kwargs):
        """Parse the MRC header information from the given file."""
        if filename:
            from os.path import exists, isfile
            if exists(filename) and isfile(filename):
                from .Cmrc import readHeader
                compress = 1 if filename.lower().endswith('.gz') else 0
                temp = readHeader(filename=filename, header=self, compress=compress)
                if isinstance(temp, tuple):
                    from ..IO.output import printError
                    if temp[0] is None:
                        printError(temp[1])
                    else:
                        printError("Couldn't parse the Error information.")
                    return None
                else:
                    from numpy import array, argsort
                    self = temp
                    for i in xrange(10):
                        self.label[i] = self.label[i][:80]
                        if self.label[i].find('\0') != -1:
                            self.label[i] = self.label[i][:self.label[i].find("\0")]
                        elif self.label[i] == ' ' * 80:
                            self.label[i] = ''
                        self.label[i] = self.label[i].rstrip()
                    if self.symdata:
                        self.symdata = self.symdata[:80]
                        if self.symdata.find('\0') != -1:
                            self.symdata = self.symdata[:self.symdata.find('\0')]
                    if self.extra:
                        self.extra = self.extra[:80]
                        if self.extra.find('\0') != -1:
                            self.extra = self.extra[:self.extra.find('\0')]
                    if self.origin == [0, 0, 0]:
                        self.xstart, self.ystart, self.zstart = array(
                            [self.nxstart * self.cella[0] / self.mx,
                             self.nystart * self.cella[1] / self.my,
                             self.nzstart * self.cella[2] / self.mz])[
                            argsort([self.mapc, self.mapr, self.maps])]
                        self.origin = list(array([self.xstart, self.ystart, self.zstart])[
                            [self.mapc - 1, self.mapr - 1, self.maps - 1]])
                        self.nxstart = self.nystart = self.nzstart = 0
                    else:
                        self.nxstart = self.nystart = self.nzstart = 0
                        self.xstart, self.ystart, self.zstart = array(
                            self.origin)[argsort([self.mapc, self.mapr, self.maps])]
            else:
                from ..IO.output import printError
                printError("The file doesn't exist or is not a file.")
        else:
            from ..IO.output import printError
            printError("The filename must be provided.")
    def printInformation(self, **kwargs):
        """Print the information from the header."""
        from ..IO.output import printInfo as p
        p("Num of columns, rows and sections: {0} {1} {2}".format(self.nx, self.ny, self.nz))
        p("Mode: {0}".format(self.mode))
        p("Num of first column, row, section: {0} {1} {2}".format(
            self.nxstart, self.nystart, self.nzstart))
        p("Num of intervals along x, y, z: {0} {1} {2}".format(self.mx, self.my, self.mz))
        p("Cell dimensions in angstroms: {0:.2f} {1:.2f} {2:.2f}".format(
            self.cella[0], self.cella[1], self.cella[2]))
        p("Cell angles in degrees: {0:.2f} {1:.2f} {2:.2f}".format(
            self.cellb[0], self.cellb[1], self.cellb[2]))
        p("Axis for cols, rows, sections: {0} {1} {2}".format(self.mapc, self.mapr, self.maps))
        p("Min, max, mean density value: {0:.6f} {1:.6f} {2:.6f}".format(
            self.dmin, self.dmax, self.dmean))
        p("Space group number: {0}".format(self.ispg))
        p("Origin in X, Y, Z: {0:.4f} {1:.4f} {2:.4f}".format(
            self.origin[0], self.origin[1], self.origin[2]))
        p("Machine stamp: {0}".format(self.machst))
        p("RMS deviation from mean density: {0}".format(self.rms))
        p("Num of labels being used: {0}".format(self.nlabels))
        if self.nlabels != 0:
            p("Labels:")
            for i in self.label:
                if i != "":
                    p("\t{0}".format(i))
        p("Num of bytes for symmetry data: {0}".format(self.nsymbt))
        if self.nsymbt != 0:
            p("\t{0}".format(self.symdata))
    def getMatrixShape(self, **kwargs):
        """Get the data shape from the header information.
        Caution: it could differ from the data array."""
        if (isinstance(self.nx, int) and
                isinstance(self.ny, int) and isinstance(self.nz, int)):
            return (self.nx, self.ny, self.nz)
        else:
            from ..IO.output import printError
            printError("There is no header information here.")
            return None

    def __repr__(self):
        return "MRCHeader"

    def setValue(self, label, value=None, **kwargs):
        """Set the value for a label."""
        setattr(self, label, value)

    def getValue(self, label, default=None, **kwargs):
        """Get the value for a label."""
        return getattr(self, label, default)
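The header's `mode` field selects how each density sample is stored on disk; `MRC.parseData` below branches on it to pick a NumPy dtype. The same dispatch can be expressed as a lookup table (`MODE_DTYPES` and `dtype_for_mode` are hypothetical names for this sketch, not part of mbio):

```python
# Map the MRC "mode" field to a NumPy-style dtype name, mirroring the
# branches in MRC.parseData. Modes 3 and 4 are complex formats that the
# parser rejects, so they are deliberately absent.
MODE_DTYPES = {
    0: 'int8',     # 8-bit signed integer
    1: 'int16',    # 16-bit signed integer
    2: 'float32',  # 32-bit IEEE float
    5: 'uint8',    # 8-bit unsigned integer
    6: 'uint16',   # 16-bit unsigned integer
}

def dtype_for_mode(mode):
    """Return the dtype name for a supported MRC mode, or None if the
    mode is unsupported (e.g. the complex modes 3 and 4)."""
    return MODE_DTYPES.get(mode)
```

A table like this keeps the mode handling in one place, which is easier to extend than a chain of elif branches when new modes need support.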
class MRC():

    """This is a class to read and write MRC file.
    The data will always be stored in x, y, z order."""

    def __init__(self, filename=None, **kwargs):
        """Parse data from the given file."""
        self.header = MRCHeader()
        self.data = None
        if filename:
            self.parseData(filename=filename, **kwargs)

    def __getattr__(self, name, **kwargs):
        if name in ('data', 'header'):
            # __getattr__ is only reached when normal lookup fails, so
            # calling getattr() again here would recurse forever.
            raise AttributeError(name)
        try:
            return getattr(self.header, name)
        except AttributeError:
            return None

    def __setattr__(self, name, value, **kwargs):
        if name in ('data', 'header'):
            self.__dict__[name] = value
        elif name in self.header.__dict__.keys():
            setattr(self.header, name, value)
        elif name in self.__dict__.keys():
            # assign through __dict__ directly; calling setattr() here
            # would re-enter __setattr__ and loop forever
            self.__dict__[name] = value
        else:
            pass

    def __repr__(self):
        return "MRC"

    def __str__(self):
        return "MRC"

    def __dir__(self, **kwargs):
        return self.__dict__.keys() + self.header.__dict__.keys()
    def parseHeader(self, filename=None, **kwargs):
        """Parse the header only from a given file.
        If data is parsed in the future, this header will be overwritten
        by the new data file's header."""
        if filename:
            from os.path import exists, isfile
            if exists(filename) and isfile(filename):
                self.header = MRCHeader(filename=filename)
            else:
                from ..IO.output import printError
                printError("The file doesn't exist or is not a file.")
        else:
            from ..IO.output import printError
            printError("The filename must be provided.")
    def parseData(self, filename=None, **kwargs):
        """Parse the data and header from a given file.
        If the header or data already exist, both will be overwritten."""
        # import the output helpers up front so the error branches below
        # can use printError even when the file does not exist
        from ..IO.output import printInfo, printError, printUpdateInfo
        if filename:
            from os.path import exists, isfile
            if exists(filename) and isfile(filename):
                from .Cmrc import readData
                from numpy import zeros, int8, int16, float32, uint8, uint16
                if getattr(self, 'header', None):
                    del self.header
                if kwargs.get('output', True):
                    printUpdateInfo("Parsing the Header from file {0}.".format(filename))
                self.header = MRCHeader(filename=filename)
                if getattr(self, 'data', None):
                    printInfo("Some data exists already, overwrite it.")
                    del self.data
                if self.header.mode in [3, 4]:
                    printError("Sorry, we don't support the complex format yet.")
                    self.data = None
                    return None
                else:
                    if self.header.mode == 0:
                        self.data = zeros(
                            (self.header.nz, self.header.ny, self.header.nx), dtype=int8)
                    elif self.header.mode == 1:
                        self.data = zeros(
                            (self.header.nz, self.header.ny, self.header.nx), dtype=int16)
                    elif self.header.mode == 2:
                        self.data = zeros(
                            (self.header.nz, self.header.ny, self.header.nx), dtype=float32)
                    elif self.header.mode == 5:
                        self.data = zeros(
                            (self.header.nz, self.header.ny, self.header.nx), dtype=uint8)
                    elif self.header.mode == 6:
                        self.data = zeros(
                            (self.header.nz, self.header.ny, self.header.nx), dtype=uint16)
                    else:
                        printError("Couldn't understand the mode {0}".format(self.header.mode))
                        self.data = None
                        return None
                if kwargs.get('output', True):
                    printUpdateInfo("Parsing the Data from file {0}.".format(filename))
                self.data = self.data - 1
                compress = 1 if filename.lower().endswith('.gz') else 0
                temp = readData(filename=filename, nsymbt=self.header.nsymbt,
                                datamode=self.header.mode, data=self.data,
                                size=self.header.nz * self.header.ny * self.header.nx,
                                compress=compress)
                if isinstance(temp, tuple):
                    del self.data
                    self.data = None
                    if temp[0] is None:
                        printError(temp[1])
                    else:
                        printError("Couldn't parse the Error information.")
                    return None
                else:
                    from numpy import transpose, argsort
                    if set([self.header.mapc, self.header.mapr, self.header.maps]) != set([1, 2, 3]):
                        printError("The MRC header contains no clear axis. "
                                   "(mapc, mapr and maps must contain all of 1, 2, 3.)")
                        printError("Keep the data as it is.")
                        self.data = temp
                        return None
                    else:
                        temporder = [self.header.maps, self.header.mapr, self.header.mapc]
                        self.data = transpose(temp, argsort(temporder))
                        del temp
                        if self.header.transend:
                            self.data.byteswap(True)
            else:
                printError("The file doesn't exist or is not a file.")
                return None
        else:
            printError("The filename must be provided.")
            return None
    def writeData(self, filename, skipupdate=False, force=False, **kwargs):
        """Write the MRC data to a file.
        The header and data format are updated automatically.
        You can skip the update with the `skipupdate` option, and
        force it to overwrite existing files with the `force` option."""
        from ..IO.output import printInfo, printError
        from os.path import exists, isfile
        from numpy import transpose, array
        if filename:
            if exists(filename):
                if not isfile(filename):
                    printError("The path is not a file.")
                    return None
                else:
                    if not force:
                        back = raw_input(
                            "* File {0} exists, do you want to overwrite it?(y/n)".format(filename))
                        while back.strip().lower() not in ['y', 'n']:
                            back = raw_input(
                                "* File {0} exists, do you want to overwrite it?(y/n)".format(filename))
                        if back.strip().lower() == 'n':
                            printInfo("File not written.")
                            return None
        else:
            printError("The filename must be provided.")
            return None
        if isinstance(self.data, type(None)):
            printError("No data to write.")
            return None
        find = False
        for i in xrange(10):
            if self.label[i].startswith("Written by mbio"):
                find = True
                from time import ctime
                from .. import __version__
                self.label[i] = "Written by mbio {0} {1}".format(__version__, ctime())
                # move the mbio stamp to the end and re-pack the labels
                self.label = self.label[:i] + self.label[i + 1:] + [self.label[i]]
                self.label = [j for j in self.label if j != ""]
                self.label = self.label + [""] * (10 - len(self.label))
                break
        if not find:
            if self.nlabels != 10:
                from time import ctime
                from .. import __version__
                self.label[self.nlabels] = "Written by mbio {0} {1}".format(
                    __version__, ctime())
                self.nlabels += 1
        if not skipupdate:
            self.update()
        from .Cmrc import writeData
        if set([self.header.mapc, self.header.mapr, self.header.maps]) != set([1, 2, 3]):
            printError("The MRC header contains no clear axis. "
                       "(mapc, mapr and maps must contain all of 1, 2, 3.)")
            printError("Change it automatically.")
            self.header.mapc, self.header.mapr, self.header.maps = 1, 2, 3
        self.header.nxstart, self.header.nystart, self.header.nzstart = array(
            [self.header.nxstart, self.header.nystart, self.header.nzstart])[
            [self.header.mapc - 1, self.header.mapr - 1, self.header.maps - 1]]
        if kwargs.get('output', True):
            printInfo("Writing MRC to {0}".format(filename))
        compress = 1 if filename.lower().endswith('.gz') else 0
        temp = writeData(header=self.header,
                         data=transpose(self.data, (self.header.maps - 1,
                                                    self.header.mapr - 1,
                                                    self.header.mapc - 1)),
                         filename=filename, compress=compress)
        if isinstance(temp, tuple):
            if temp[0] is None:
                printError(temp[1])
            else:
                printError("Couldn't parse the Error information.")
            return None
        elif temp == 0:
            return None
        else:
            printError("Couldn't parse the Error information.")
def update(self, **kwargs):
"""Update the MRC header information from the data array.
Update the MRC data format based on the `header.mode`
Include: nx, ny, nz, dmin, dmax, dmean, rms, nsymbt, nlabels and sort label
nxstart, nystart, nzstart, xstart, ystart, zstart, map.
Correct mapc, mapr and maps automaticly."""
from numpy import array, int8, int16, float32, uint8, uint16, argsort
from ..IO.output import printError
from platform import architecture
if set([self.header.mapc, self.header.mapr, self.header.maps]) != set([1, 2, 3]):
            printError(
                "The MRC header contains no clear axis (mapc, mapr and maps must contain all of 1, 2, 3).")
            printError("Changing it automatically.")
self.header.mapc, self.header.mapr, self.header.maps = 1, 2, 3
self.header.nx, self.header.ny, self.header.nz = array(
self.data.shape)[[self.header.mapc - 1, self.header.mapr - 1, self.header.maps - 1]]
if self.header.origin != [0., 0., 0.]:
self.header.nxstart = self.header.nystart = self.header.nzstart = 0
self.header.xstart, self.header.ystart, self.header.zstart = array(
self.header.origin)[argsort([self.header.mapc, self.header.mapr, self.header.maps])]
elif self.header.nxstart != 0 or self.header.nystart != 0 or self.header.nzstart != 0:
self.header.xstart, self.header.ystart, self.header.zstart = array(
[self.header.nxstart * self.header.cella[0] / self.header.mx, self.header.nystart * self.header.cella[1] / self.header.my, self.header.nzstart * self.header.cella[2] / self.header.mz])[argsort([self.header.mapc, self.header.mapr, self.header.maps])]
# self.header.nxstart, self.header.nystart, self.header.nzstart = array(
# [self.header.nxstart, self.header.nystart, self.header.nzstart])[[self.header.mapc - 1, self.header.mapr - 1, self.header.maps - 1]]
else:
self.header.xstart, self.header.ystart, self.header.zstart = 0., 0., 0.
self.header.dmin = self.data.min()
self.header.dmax = self.data.max()
self.header.dmean = self.data.mean()
self.header.rms = (((self.data - self.data.mean()) ** 2).mean()) ** .5
# if architecture()[0].find('32')!=-1:
# temp1=0.
# temp2=0.
# temp3=0.
# for i in self.data:
# for j in i:
# for k in j:
# temp1+=k**2
# temp2+=k
# temp3+=1
# self.header.rms = (temp1/temp3-(temp2/temp3)**2)**.5
# else:
# self.header.rms = (((self.data - self.data.mean())**2).mean())**.5
if self.header.symdata:
self.header.nsymbt = 80
self.header.symdata = self.header.symdata[:80]
else:
self.header.nsymbt = 0
self.header.symdata = None
self.header.nlabels = sum(
[1 if i != "" else 0 for i in self.header.label])
self.header.label = [i[:80] for i in self.header.label if i != ""]
self.header.label = self.header.label + \
[""] * (10 - len(self.header.label))
self.header.map = "MAP "
if {0: int8, 1: int16, 2: float32, 5: uint8, 6: uint16}[self.header.mode] != self.data.dtype:
self.data = array(self.data,
dtype={0: int8, 1: int16, 2: float32, 5: uint8, 6: uint16}[self.header.mode])
def truncMatrix(self, index=[None, None, None, None, None, None], **kwargs):
        """Truncate the matrix by index. Related header values change accordingly.
        You need to provide the start and end index (end included) for x, y and z.
        Example:
            MRC.truncMatrix([xstart, xend, ystart, yend, zstart, zend])
        You could use *None* to indicate "from the beginning" or "to the end".
        """
from ..IO.output import printError, printInfo
from numpy import array
if len(index) != 6:
            printError("Must provide 6 indices.")
return None
if index == [None] * 6:
printInfo("Nothing changed.")
return None
xstart, xend, ystart, yend, zstart, zend = index
        if xstart is None:
            xstart = 0
        if ystart is None:
            ystart = 0
        if zstart is None:
            zstart = 0
        if xend is None:
            # To the end: shape[0] is already one past the last index,
            # and it passes the range checks below (shape[0] + 1 would not).
            xend = self.data.shape[0]
        else:
            xend += 1
        if yend is None:
            yend = self.data.shape[1]
        else:
            yend += 1
        if zend is None:
            zend = self.data.shape[2]
        else:
            zend += 1
if not 0 <= xstart <= self.data.shape[0]:
printError("xstart is not in the range of x.")
return None
if not 0 <= xend <= self.data.shape[0]:
printError("xend is not in the range of x.")
return None
if not xstart < xend:
            printError("xstart must be less than xend.")
return None
if not 0 <= ystart <= self.data.shape[1]:
printError("ystart is not in the range of y.")
return None
if not 0 <= yend <= self.data.shape[1]:
printError("yend is not in the range of y.")
return None
if not ystart < yend:
            printError("ystart must be less than yend.")
return None
if not 0 <= zstart <= self.data.shape[2]:
printError("zstart is not in the range of z.")
return None
if not 0 <= zend <= self.data.shape[2]:
printError("zend is not in the range of z.")
return None
if not zstart < zend:
            printError("zstart must be less than zend.")
return None
self.data = self.data[xstart:xend, ystart:yend, zstart:zend]
xstep, ystep, zstep = array(
self.header.cella) * 1.0 / array([self.header.mx, self.header.my, self.header.mz])
self.header.xstart += xstart * xstep
self.header.ystart += ystart * ystep
self.header.zstart += zstart * zstep
if self.header.origin == [0, 0, 0]:
self.header.nxstart += xstart
self.header.nystart += ystart
self.header.nzstart += zstart
else:
self.header.nxstart = 0
self.header.nystart = 0
self.header.nzstart = 0
self.header.origin = list(array([self.header.xstart, self.header.ystart, self.header.zstart])[
[self.header.mapc - 1, self.header.mapr - 1, self.header.maps - 1]])
self.header.nx, self.header.ny, self.header.nz = array(
self.data.shape)[[self.header.mapc - 1, self.header.mapr - 1, self.header.maps - 1]]
# def getMatrixShape(self, **kwargs):
# """Get the data shape from the header information.
# Caution: it could be different with the data array."""
# if (isinstance(self.header.nx, int) and
# isinstance(self.header.ny, int) and isinstance(self.header.nz, int)):
# return (self.header.nx, self.header.ny, self.header.nz)
# else:
# from ..IO.output import printError
# printError("There is no header information here.")
# return None
def getGridCoords(self, **kwargs):
"""Return the x, y and z coordinate for the whole grid."""
from numpy import array, arange, argsort
xstep, ystep, zstep = array(
self.header.cella) * 1.0 / array([self.header.mx, self.header.my, self.header.mz])
if self.header.origin == [0, 0, 0]:
xcoor = (self.header.nxstart + arange(self.header.nx)) * xstep
ycoor = (self.header.nystart + arange(self.header.ny)) * ystep
zcoor = (self.header.nzstart + arange(self.header.nz)) * zstep
coor = array([xcoor, ycoor, zcoor])[
argsort([self.header.mapc, self.header.mapr, self.header.maps])]
return list(coor)
else:
xcoor = arange(self.header.nx) * xstep + self.header.origin[0]
ycoor = arange(self.header.ny) * ystep + self.header.origin[1]
zcoor = arange(self.header.nz) * zstep + self.header.origin[2]
coor = array([xcoor, ycoor, zcoor])[
argsort([self.header.mapc, self.header.mapr, self.header.maps])]
return list(coor)
def getGridSteps(self, **kwargs):
"""Return the x, y and z coordinate steps."""
from numpy import array, arange, argsort
step = array(array(self.header.cella) * 1.0 /
array([self.header.mx, self.header.my, self.header.mz]))
step = step[
argsort([self.header.mapc, self.header.mapr, self.header.maps])]
return step
def getArray(self, **kwargs):
"""Get the data from the MRC class"""
return self.data
def setMode(self, mode=2, **kwargs):
"""Set the data format for the data.
The data will be change the format accordingly.
Data type :
0 image : signed 8-bit bytes range -128 to 127
1 image : 16-bit halfwords
2 image : 32-bit reals
3 transform : complex 16-bit integers (not support now)
4 transform : complex 32-bit reals (not support now)
5 image : unsigned 8-bit range 0 to 255
6 image : unsigned 16-bit range 0 to 65535"""
from numpy import array, int8, int16, float32, uint8, uint16
from ..IO.output import printError
        if mode not in range(7):
            printError("Mode must be 0,1,2,3,4,5,6.")
            return None
        elif mode in [3, 4]:
            printError("Sorry, the complex formats are not supported now.")
            return None
        self.header.mode = mode
if {0: int8, 1: int16, 2: float32, 5: uint8, 6: uint16}[self.header.mode] != self.data.dtype:
self.data = array(self.data,
dtype={0: int8, 1: int16, 2: float32, 5: uint8, 6: uint16}[self.header.mode])
def printInformation(self, **kwargs):
"""Print the information from the header."""
self.header.printInformation()
    def __del__(self):
        del self.data
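The header bookkeeping in `update()` and `truncMatrix()` above repeatedly reorders (x, y, z) quantities with the MRC axis keys `mapc`, `mapr` and `maps`. This minimal standalone sketch shows the two NumPy indexing idioms involved; the concrete axis values and array shape here are illustrative only, not taken from any particular file:

```python
import numpy as np

# Example axis keys: columns vary along y, rows along x, sections along z.
mapc, mapr, maps = 2, 1, 3
shape = np.array([10, 20, 30])  # shape of the data array as stored

# Fancy indexing with [mapc-1, mapr-1, maps-1] applies the permutation,
# as in update(): nx, ny, nz = array(self.data.shape)[[mapc-1, ...]]
nx, ny, nz = shape[[mapc - 1, mapr - 1, maps - 1]]
print(nx, ny, nz)  # 20 10 30

# argsort([mapc, mapr, maps]) yields the inverse permutation, as used for
# xstart/ystart/zstart: applying it to the permuted triple restores the
# original (x, y, z) order.
inverse = np.argsort([mapc, mapr, maps])
restored = np.array([nx, ny, nz])[inverse].tolist()
print(restored)  # [10, 20, 30]
```

Because `[mapc-1, mapr-1, maps-1]` is a permutation of `[0, 1, 2]`, indexing with its argsort always undoes it, which is why the class can round-trip shapes and origins between file order and axis order.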
# fb/forms.py (pure-python/brainmate, Apache-2.0)
from django.forms import (
Form, CharField, Textarea, PasswordInput, ChoiceField, DateField,
ImageField, BooleanField, IntegerField, MultipleChoiceField
)
from django import forms
from fb.models import UserProfile
class UserPostForm(Form):
    text = CharField(widget=Textarea(
        attrs={'rows': 1, 'cols': 40, 'class': 'form-control', 'placeholder': "What's on your mind?"}))
class UserPostCommentForm(Form):
    text = CharField(widget=Textarea(
        attrs={'rows': 1, 'cols': 50, 'class': 'form-control', 'placeholder': "Write a comment..."}))
class UserLogin(Form):
username = CharField(max_length=30)
password = CharField(widget=PasswordInput)
class UserProfileForm(Form):
first_name = CharField(max_length=100, required=False)
last_name = CharField(max_length=100, required=False)
gender = ChoiceField(choices=UserProfile.GENDERS, required=False)
date_of_birth = DateField(required=False)
avatar = ImageField(required=False)
OPTIONS = (
("Cars", "Cars"),
("Dogs", "Dogs"),
("Sports", "Sports"),
)
interests = MultipleChoiceField(widget=forms.CheckboxSelectMultiple,
choices=OPTIONS, required=False)
class QuestionFrom(Form):
question_description = CharField(max_length=300)
points = IntegerField()
class AddAnswerForm(Form):
answer_description = CharField(max_length=30)
correct_answer = BooleanField(required=False)
# Galaxy_Invander/user23_fTVPDKIDhRdCfUp.py (triump0870/Interactive_Programming_Python, Apache-2.0)
# Simple implementation of the Galaxy Invaders game
# Rohan Roy (India) - 3 Nov 2013
# www.codeskulptor.org/#user23_fTVPDKIDhRdCfUp
VER = "1.0"
# "add various aliens"
import simplegui, math, random, time
#Global const
FIELD_WIDTH = 850
FIELD_HEIGHT = 500
TOP_MARGIN = 75
LEFT_MARGIN = 25
ALIEN_WIDTH = 48
ALIEN_HEIGHT = 55
PLAYER_SPEED = 10
BULLET_SPEED = 10
BULLET_POWER = 1
BONUS_SPEED = 10
ALIEN_SPEED = [3, 5]
# Images:
pImage = simplegui.load_image('https://dl.dropbox.com/s/zhnjucatewcmfs4/player.png')
aImages = []
for i in range(7):
aImages.append([])
aImages[0].append(simplegui.load_image('https://dl.dropbox.com/s/0cck7w6r0mt8pzz/alien_1_1.png'))
aImages[0].append(simplegui.load_image('https://dl.dropbox.com/s/j0kubnhzajbdngu/alien_1_2.png'))
aImages[0].append(simplegui.load_image('https://dl.dropbox.com/s/zkeu6hqh9bakj25/alien_1_3.png'))
aImages[1].append(simplegui.load_image('https://dl.dropbox.com/s/e75mkcylat70lnd/alien_2_1.png'))
aImages[1].append(simplegui.load_image('https://dl.dropbox.com/s/pgjvaxg0z6rhco9/alien_2_2.png'))
aImages[1].append(simplegui.load_image('https://dl.dropbox.com/s/en0hycfsi3cuzuo/alien_2_3.png'))
aImages[2].append(simplegui.load_image('https://dl.dropbox.com/s/fu9weoll70acs8f/alien_3_1.png'))
aImages[2].append(simplegui.load_image('https://dl.dropbox.com/s/b2rxru2nt5q2r1u/alien_3_2.png'))
aImages[2].append(simplegui.load_image('https://dl.dropbox.com/s/x66vgj9fc2jlg53/alien_3_3.png'))
aImages[3].append(simplegui.load_image('https://dl.dropbox.com/s/7o04ljg52kniyac/alien_4_1.png'))
aImages[3].append(simplegui.load_image('https://dl.dropbox.com/s/b3v6tvami0rvl6r/alien_4_2.png'))
aImages[3].append(simplegui.load_image('https://dl.dropbox.com/s/j451arcevsag36h/alien_4_3.png'))
aImages[4].append(simplegui.load_image('https://dl.dropbox.com/s/jlhdigkm79nncnm/alien_5_1.png'))
aImages[4].append(simplegui.load_image('https://dl.dropbox.com/s/wvlvjsa8yl6gka3/alien_5_2.png'))
aImages[4].append(simplegui.load_image('https://dl.dropbox.com/s/rrg4y1tnsbrh04r/alien_5_3.png'))
aImages[5].append(simplegui.load_image('https://dl.dropbox.com/s/oufyfy590tzf7cx/alien_6_1.png'))
aImages[5].append(simplegui.load_image('https://dl.dropbox.com/s/p4ehd9f6mo2xfzc/alien_6_2.png'))
aImages[5].append(simplegui.load_image('https://dl.dropbox.com/s/815gq3xyh6wmc0t/alien_6_3.png'))
aImages[6].append(simplegui.load_image('https://dl.dropbox.com/s/bv4ycocuomsvj50/alien_7_1.png'))
aImages[6].append(simplegui.load_image('https://dl.dropbox.com/s/krs2gtvdxxve79z/alien_7_2.png'))
aImages[6].append(simplegui.load_image('https://dl.dropbox.com/s/v2wczi8lxwczq87/alien_7_3.png'))
#backgrounds
bckg = []
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/ibfu2t9vrh4bhxd/back01.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/pcl8vzby25ovis8/back02.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/g8nwo1t9s4i9usg/back03.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/ee8oilluf7pe98h/back04.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/7jfgjoxinzwwlx4/back05.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/wh01g2q3607snvz/back06.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/b72ltp2xii9utnr/back07.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/av73jek8egezs1w/back08.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/ik54ttfklv3x3ai/back09.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/e9e6kpyg3yuoenc/back10.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/zrabwnnvlwvn7it/back11.jpg"))
bckg.append(simplegui.load_image("https://dl.dropbox.com/s/a2infkx0rmn8b8m/back12.jpg"))
# sounds
sndPlayer = simplegui.load_sound('https://dl.dropbox.com/s/vl3as0o2m2wvlwu/player_shoot.wav')
sndAlien = simplegui.load_sound('https://dl.dropbox.com/s/m4x0tldpze29hcr/alien_shoot.wav')
sndPlayerExplosion = simplegui.load_sound('https://dl.dropbox.com/s/10fn2wh7kk7uoxh/explosion%2001.wav')
sndAlienHit = simplegui.load_sound('https://dl.dropbox.com/s/80qdvup27n8j6r1/alien_hit.wav')
sndAlienExplosion = simplegui.load_sound('https://dl.dropbox.com/s/qxm3je9vdlb469g/explosion_02.wav')
sndBonus = simplegui.load_sound('https://dl.dropbox.com/s/tzp7e20e5v19l01/bonus.wav')
sndPause = simplegui.load_sound('https://dl.dropbox.com/s/uzs9nixpd22asno/pause.wav')
sndTheme = simplegui.load_sound('https://dl.dropbox.com/s/52zo892uemfkuzm/theme_01.mp3')
sounds = [sndPlayer, sndAlien, sndPlayerExplosion, sndAlienExplosion, \
sndBonus, sndPause, sndTheme, sndAlienHit]
#Global variables
GameRunning = False
GameEnded = False
player_speed = 0
mes = ""
timer_counter = 0
lives = 0
level = 1
scores = 0
killed = 0
current_back = 0
paused = False
shoot_count = 0
level_time = []
ready, go = False, False
#player = [FIELD_WIDTH //2, FIELD_HEIGHT - 30 + TOP_MARGIN]
#game objects
user_bullet = []
weapon_level = 1
weapon_speed = BULLET_SPEED
alien_bullets = []
alien_fleet = None
player = None
frame = None
aTimer = None
dTimer = None
bonuses = []
dCounter = 0
back = False
bonus_count = [0, 0, 0, 0]
player_killed = False
player_killed_at = 0
level_map = []
for i in range(7):
level_map.append([])
level_map[0] = [ 0, 0, 0, 0]
level_map[1] = [129, 0, 0, 0]
level_map[2] = [195, 129, 0, 0]
level_map[3] = [255, 195, 60, 0]
level_map[4] = [255, 231, 195, 195]
level_map[5] = [255, 255, 231, 195]
level_map[6] = [255, 255, 255, 231]
def draw_text(canvas, text, point, size, delta, color):
canvas.draw_text(text, point, size, color[0])
canvas.draw_text(text, [point[0]-delta[0], \
point[1]-delta[1]], size, color[1])
class Bonus:
def __init__ (self, kind, point):
self.kind = kind
self.x = point[0]
self.y = point[1]
self.v = BONUS_SPEED #velocity
self.width = 36
self.height = 36
def move(self):
self.y += self.v
return self
def draw(self, canvas):
if self.kind == 0: #speed of bullet
canvas.draw_circle([self.x, self.y], 15, 3, "LightBlue")
canvas.draw_text("WS", [self.x-12, self.y+5], self.width //2, "LightBlue")
elif self.kind == 1: #weapon level
canvas.draw_circle([self.x, self.y], 15, 3, "Red")
canvas.draw_text("WL", [self.x-12, self.y+5], self.width //2, "Red")
elif self.kind == 2: #life
canvas.draw_circle([self.x, self.y], 15, 3, "LightGreen")
canvas.draw_text("LF", [self.x-12, self.y+5], self.width //2, "LightGreen")
elif self.kind == 3: #weapon power
            canvas.draw_circle([self.x, self.y], 15, 3, "#8010df")
            canvas.draw_text("WP", [self.x-12, self.y+5], self.width // 2, "#8010df")
return self
def execute(self):
global weapon_speed, weapon_level, player, scores, bonus_count
bonus_count[self.kind] += 1
if self.kind == 0: #speed of bullet
weapon_speed += 1
delta = round(math.pow(20, (1 + (1.0*level-1)/32))*5)
scores = scores + delta
elif self.kind == 1: #weapon level
weapon_level += 1
delta = round(math.pow(30, (1 + (1.0*level-1)/32))*5)
scores = scores + delta
elif self.kind == 2: #life
player.lives += 1
delta = round(math.pow(100, (1 + (1.0*level-1)/32))*5)
scores = scores + delta
elif self.kind == 3: #weapon power
player.power += 0.1
delta = round(math.pow(100, (1 + (1.0*level-1)/32))*5)
scores = scores + delta
sndBonus.play()
return self
def dHandler():
global dCounter, back, player_killed
dCounter += 1
if dCounter % 10 == 0:
if back:
frame.set_canvas_background("Red")
else:
frame.set_canvas_background("black")
        back = not back
if dCounter > 50:
dCounter = 0
player_killed = False
dTimer.stop()
frame.set_canvas_background("black")
class Bullet:
def __init__ (self, point, color, velocity):
self.x = point[0]
self.y = point[1]
self.color = color
self.v = velocity
self.width = 1
self.height = 1
def draw(self, canvas):
canvas.draw_line([self.x, self.y-5], [self.x, self.y+5], 3, self.color)
def move(self):
self.y += self.v
class Alien:
def __init__(self, point, kind):
self.x = point[0]
self.y = point[1]
self.kind = kind
self.flying = False
self.vy = 0
self.vx = 0
self.health = self.get_max_health()
self.width = 20
self.height = 20
def get_max_health(self):
return 1+0.6 * self.kind[1]
def shoot(self):
if len(alien_bullets)<level*2:
            bullet = Bullet([self.x, self.y], "Red", BULLET_SPEED)
alien_bullets.append(bullet)
sndAlien.play()
def move(self, point):
if self.flying:
koef = 1.5
self.y += (self.vy / koef)
if self.x>player.x:
self.x -= (self.vx / koef)
else:
self.x += (self.vx / koef)
if self.vx<ALIEN_SPEED[0]:
self.vx += 1
if self.vy<ALIEN_SPEED[1]:
self.vy += 1
else:
self.x = point[0]
self.y = point[1]
def draw(self, canvas):
if aImages[self.kind[1]][self.kind[0]].get_width()==0:
w = 15
h = 15
canvas.draw_circle([self.x, self.y], 15, 5, "Red")
else:
img = aImages[self.kind[1]][self.kind[0]]
self.width = w = img.get_width()
self.height = h = img.get_height()
canvas.draw_image(img, (w//2, h//2), (w, h), (self.x, self.y), (w, h))
            if self.health != self.get_max_health():
ratio = w * (self.health*1.0) / self.get_max_health()
canvas.draw_line([self.x-w//2, self.y-h//2-3], [self.x+w//2, self.y-h//2-3], 4, "red")
canvas.draw_line([self.x-w//2, self.y-h//2-3], [self.x-w//2+ratio, self.y-h//2-3], 4, "green")
return canvas
class AliensFleet:
def __init__ (self, point):
def is_high_level(place):
map_ = (level-1)%7
row = level_map[map_][place[1]] #255 - 0
            return (row & (1 << place[0])) != 0
self.x = point[0]
self.y = point[1]
self.aliens = []
self.pattern = [255, 255, 255, 255]
self.y_velocity = ALIEN_HEIGHT//3 + 1
self.x_velocity = - ALIEN_WIDTH//3 + 1
for i in range(self.get_aliens_count()):
point = self.get_alien_position(i)
place = self.get_alien_place(i)
alien_level = (level-1)//7 + is_high_level(place)
alien = Alien(point, [random.randrange(3), alien_level])
self.aliens.append(alien)
def get_aliens_count(self):
c = 0
for i in range(4):
for j in range(8):
                if (self.pattern[i] & (1 << j)) != 0:
c+=1
return c
def get_alien_position(self, n):
#returns a screen x, y of alien with number n
point = self.get_alien_place(n)
x = point[0]*(ALIEN_WIDTH + 3) + self.x
y = point[1]*(ALIEN_HEIGHT + 3) +self.y
point = [x, y]
return point
def get_alien_place(self, n):
#returns a fleet x, y of alien with number n
x, y, c = 0, 0, 0
for i in range(4):
for j in range(8):
                if (self.pattern[i] & (1 << j)) != 0:
if c==n:
x, y = j, i
c+=1
point = [x, y]
return point
def move_aliens(self):
i = 0
for alien in self.aliens:
point = self.get_alien_position(i)
alien.move(point)
i += 1
return self
def move_down(self):
self.y += self.y_velocity
if self.y>400:
player.explode()
self.y = 100
self.move_aliens()
def move_side(self):
self.x -= self.x_velocity
# check borders of fleet:
left = 8
right = -1
for i in range(len(self.aliens)):
point = self.get_alien_place(i)
if point[0]<left:
left = point[0]
if point[0]>right:
right = point[0]
if (self.x+(left+1)*60 < LEFT_MARGIN + 10) or (self.x + (right+1)*45>FIELD_WIDTH-LEFT_MARGIN-60):
self.x_velocity = -self.x_velocity
self.move_aliens()
def draw(self, canvas):
for alien in self.aliens:
alien.draw(canvas)
def make_shoot(self):
for alien in self.aliens:
if len(alien_bullets) < level * 3 + 1:
                if random.randrange(101) < 2:
alien.shoot()
return self
def alien_fly(self):
i = 0
for alien in self.aliens:
if alien.flying:
i += 1
if (i<1+level) and (random.randrange(1000)<3) and (time.time()-level_time[len(level_time)-1]>60):
alien.flying=True
def check_death(self):
        global scores, killed, player
        for bullet in list(user_bullet):  # iterate over a copy: bullets are removed inside
            for i in range(len(self.aliens)):
                alien = self.aliens[i]
                if isBulletHit(bullet, alien):
                    if alien.health - player.power <= 0:
                        point = self.get_alien_place(i)
                        sndAlienExplosion.play()
                        self.aliens.remove(alien)
                        self.pattern[point[1]] = self.pattern[point[1]] & ~int(1 << point[0])
                        user_bullet.remove(bullet)
                        delta = round(math.pow(5, (1 + (1.0 * level - 1) / 32)) * 5)
                        scores = scores + delta
                        killed += 1
                        x = random.randrange(1000)
                        if x < 5:
                            bonuses.append(Bonus(3, [alien.x, alien.y]))
                        elif x < 50:
                            bonuses.append(Bonus(2, [alien.x, alien.y]))
                        elif x < 120:
                            bonuses.append(Bonus(1, [alien.x, alien.y]))
                        elif x < 200:
                            bonuses.append(Bonus(0, [alien.x, alien.y]))
                        if killed % 500 == 0:
                            player.lives += 1
                            sndBonus.play()
                    else:
                        user_bullet.remove(bullet)
                        alien.health -= player.power
                        sndAlienHit.play()
                    break  # this bullet is spent either way; move on to the next one
class Player:
def __init__(self, point, lives):
self.x = point[0]
self.y = point[1]
        self.lives = lives
self.speed = player_speed
self.power = BULLET_POWER
self.width = 20
self.height = 20
def draw(self, canvas):
draw_user_image(canvas, [self.x, self.y])
def move(self):
self.x += player_speed
if self.x<LEFT_MARGIN*2:
self.x = LEFT_MARGIN*2
if self.x>FIELD_WIDTH:
self.x=FIELD_WIDTH
def draw_lives_counter(self, canvas):
if self.lives < 5:
for i in range(self.lives):
draw_user_image(canvas, [150+i*35, 15])
else:
draw_user_image(canvas, [150, 15])
canvas.draw_text(" x "+str(int(self.lives)), [170, 25], 25, "Yellow")
def explode(self):
        global dTimer, alien_bullets, user_bullet, weapon_level, weapon_speed
        global alien_fleet, player_killed_at, player_killed, player_speed, bonuses
player_speed = 0
player_killed_at = time.time()
sndPlayerExplosion.play()
for alien in alien_fleet.aliens:
alien.flying = False
player_killed = True
alien_bullets = []
user_bullet = []
bonuses = []
weapon_level = level // 10 + 1
weapon_speed = BULLET_SPEED
self.lives -= 1
if self.lives<0:
stop_game()
dTimer = simplegui.create_timer(25, dHandler)
dTimer.start()
#helper functions
def dummy(key):
return key
def pause():
global paused
paused = not paused
sndPause.play()
def draw_user_image(canvas, point):
    # draw an image of the user ship
global player
if pImage.get_width()==0:
canvas.draw_circle(point, 12, 5, "Yellow")
else:
canvas.draw_image(pImage, (25, 36), (49, 72), point, (34, 50))
player.width = pImage.get_width()
player.height = pImage.get_height()
return canvas
def draw_lives(canvas):
# draw lives counter
canvas.draw_text("Lives : ", [30, 25], 25, "Red")
    if player is not None:
player.draw_lives_counter(canvas)
return canvas
def draw_weapons(canvas):
canvas.draw_text("Weapon : ", [30, 60], 25, "Red")
canvas.draw_text("Rocket lvl: "+str(int(weapon_level)), [135, 60], 25, "Yellow")
    canvas.draw_text("WS:" + str(weapon_speed / 10.0), [280, 48], 10, "#00c5fe")
    canvas.draw_text("WP:" + str(player.power), [280, 61], 10, "#00c5fe")
return canvas
def draw_level(canvas):
canvas.draw_text("Level : ", [FIELD_WIDTH-200, 50], 50, "Red")
canvas.draw_text(str(level), [FIELD_WIDTH-50, 50], 50, "Yellow")
return canvas
def draw_scores(canvas):
canvas.draw_text(str(int(scores)), [400, 50], 50, "LightBlue")
return canvas
def draw_screen(canvas):
# border of board
canvas.draw_image(bckg[current_back], (425, 250), (850, 500), \
(LEFT_MARGIN+FIELD_WIDTH//2, TOP_MARGIN+FIELD_HEIGHT//2),\
(FIELD_WIDTH, FIELD_HEIGHT))
canvas.draw_polygon([[LEFT_MARGIN, TOP_MARGIN],
[LEFT_MARGIN, FIELD_HEIGHT+TOP_MARGIN],
[FIELD_WIDTH+LEFT_MARGIN, FIELD_HEIGHT+TOP_MARGIN],
[FIELD_WIDTH+LEFT_MARGIN, TOP_MARGIN]], 2, 'Orange')
return canvas
def draw_start_screen(canvas):
img_count = 1 + len(aImages)*(len(aImages[0])) + len(bckg)
loaded_img_count = 0
    if pImage.get_width() != 0:
loaded_img_count += 1
for bImage in bckg:
        if bImage.get_width() != 0:
loaded_img_count += 1
for aImg in aImages:
for img in aImg:
            if img.get_width() != 0:
loaded_img_count += 1
loaded_sounds = 0
for snd in sounds:
        if snd is not None:
loaded_sounds += 1
    draw_text(canvas, "SPACE INVADERS", [220, 150], 50, [3, 3], ["blue", "yellow"])
canvas.draw_text("ver. - "+VER, [600, 170], 20, "yellow")
canvas.draw_text("03 nov. 2013", [600, 190], 20, "yellow")
draw_text(canvas, "CONTROLS:", [110, 210], 24, [2, 2], ["green", "yellow"])
draw_text(canvas, "Arrows - to left and right, space - to fire, P to pause game", [110, 240], 24, [2, 2], ["green", "yellow"])
draw_text(canvas, "Bonuses: ", [110, 280], 24, [2, 2], ["green", "yellow"])
b = Bonus(0, [125, 310])
b.draw(canvas)
draw_text(canvas, " - increase user's bullet speed", [150, 320], 24, [2, 2], ["green", "yellow"])
b = Bonus(1, [125, 350])
b.draw(canvas)
draw_text(canvas, " - increase user's bullet number", [150, 360], 24, [2, 2], ["green", "yellow"])
b = Bonus(2, [125, 390])
b.draw(canvas)
draw_text(canvas, " - add life", [150, 400], 24, [2, 2], ["green", "yellow"])
b = Bonus(3, [125, 430])
b.draw(canvas)
draw_text(canvas, " - increase weapon power", [150, 440], 24, [2, 2], ["green", "yellow"])
if loaded_img_count<img_count:
draw_text(canvas, "Please, wait for loading...", [280, 500], 40, [3, 3], ["Blue", "Yellow"])
s = "Loaded "+str(loaded_img_count)+" images of "+str(img_count)
draw_text(canvas, s, [110, 550], 20, [2, 2], ["Blue", "yellow"])
s = "Loaded "+str(loaded_sounds)+" sounds of "+str(len(sounds))
draw_text(canvas, s, [510, 550], 20, [2, 2], ["Blue", "yellow"])
else:
draw_text(canvas, "Click to start game", [300, 500], 40, [3, 3], ["Blue", "yellow"])
frame.set_mouseclick_handler(click_handler)
return canvas
def draw_end_screen(canvas):
draw_text(canvas, "Game over!", [350, 180], 50, [2, 2], ["Blue", "Yellow"])
draw_text(canvas, "Your score is "+str(int(scores)), [330, 240], 35, [2, 2], ["blue", "Yellow"])
draw_text(canvas, "You shoot "+str(int(shoot_count))+" times", [150, 320], 24, [2, 2], ["blue", "Yellow"])
draw_text(canvas, "You kill a "+str(killed)+" aliens", [150, 360], 24, [2, 2], ["blue", "Yellow"])
if shoot_count == 0:
s = "0"
else:
s = str(int(10000*float(killed)/shoot_count)/100.0)
draw_text(canvas, "Your accuracy is "+s+"%", [150, 400], 24, [2, 2], ["blue", "Yellow"])
i = 0
for bc in bonus_count:
b = Bonus(i, [505, 310 + 40*i])
b.draw(canvas)
draw_text(canvas, " - used "+str(bonus_count[i])+" times", [530, 320+40*i], 24, [2, 2], ["blue", "yellow"])
i += 1
draw_text(canvas, "Click to start new game", [300, 500], 40, [2, 2], ["blue", "Yellow"])
    canvas.draw_text("ver. - " + VER, [600, 540], 15, "yellow")
return canvas
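The accuracy figure shown by `draw_end_screen` truncates the hit ratio to two decimal places via `int(10000*float(killed)/shoot_count)/100.0` rather than rounding it. The standalone sketch below isolates that formula; the `accuracy_pct` helper is illustrative, not part of the game:

```python
def accuracy_pct(killed, shots):
    # Truncate the percentage to two decimal places, as draw_end_screen does,
    # guarding against division by zero when the player never fired.
    if shots == 0:
        return "0"
    return str(int(10000.0 * killed / shots) / 100.0)

print(accuracy_pct(37, 120))  # 30.83 (37/120 = 30.833...%, truncated)
print(accuracy_pct(0, 50))    # 0.0
```

Truncation (not rounding) means 30.839% still displays as 30.83, which matches the in-game behavior.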
def draw_game_objects(canvas):
player.draw(canvas)
#draw_user_image(canvas, Player)
for bullet in alien_bullets:
bullet.draw(canvas)
for bullet in user_bullet:
bullet.draw(canvas)
for bonus in bonuses:
bonus.draw(canvas)
alien_fleet.draw(canvas)
readyGo()
if paused:
draw_text(canvas, "P A U S E", [380, 350], 50, [2, 2], ["Green", "Yellow"])
if ready:
draw_text(canvas, "R E A D Y", [380, 350], 50, [2, 2], ["Green", "Yellow"])
if go:
draw_text(canvas, "G O ! ! !", [380, 350], 50, [2, 2], ["Green", "Yellow"])
sndTheme.play()
return canvas
def moving_objects():
global timer_counter
if not GameRunning:
return None
if paused or ready or go or player_killed:
return None
timer_counter += 1
player.move()
for alien in alien_fleet.aliens:
if alien.flying:
alien.move([0,0])
if isBulletHit(alien, player):
player.explode()
if alien.y>FIELD_HEIGHT + TOP_MARGIN+20:
alien.y = TOP_MARGIN
for bonus in bonuses:
        bonus.move()
if bonus.y > FIELD_HEIGHT + TOP_MARGIN+20:
bonuses.remove(bonus)
if isBulletHit(bonus, player):
bonus.execute()
bonuses.remove(bonus)
for bullet in user_bullet:
bullet.move()
alien_fleet.check_death()
for bullet in user_bullet:
if bullet.y<TOP_MARGIN+25:
user_bullet.remove(bullet)
bullets_to_delete = []
for bullet in list(alien_bullets):
bullet.move()
if bullet.y > FIELD_HEIGHT + TOP_MARGIN -10:
bullets_to_delete.append(bullet)
if isBulletHit(bullet, player):
player.explode()
for bullet in bullets_to_delete:
if bullet in alien_bullets:
alien_bullets.remove(bullet)
alien_fleet.make_shoot()
alien_fleet.alien_fly()
if level<30:
x = 60 - level
else:
x = 1
if timer_counter % x == 0:
alien_fleet.move_side()
if timer_counter % (100 + x) == 0:
alien_fleet.move_down()
if alien_fleet.get_aliens_count() == 0:
new_level()
# Handler to draw on canvas
def draw(canvas):
draw_screen(canvas)
canvas.draw_text(mes, [250, 250], 40, "Yellow")
######################
#check a begin of game
#
if GameEnded:
draw_end_screen(canvas)
elif not GameRunning:
draw_start_screen(canvas)
else:
##################
# game info
draw_lives(canvas)
draw_weapons(canvas)
draw_level(canvas)
draw_scores(canvas)
draw_game_objects(canvas)
return canvas
def readyGo():
    global ready, go, player_killed
    ready = time.time() - level_time[len(level_time) - 1] < 0.7
    go = (not ready) and time.time() - level_time[len(level_time) - 1] < 1.5
    player_killed = time.time() - player_killed_at < 1.2
#Initialization and start of game
def start_game():
global GameRunning, alien_fleet, player, GameEnded
global scores, killed, level, level_time, bonus_count
scores = 0
bonus_count = [0, 0, 0, 0]
killed = 0
level = 0
GameEnded = False
GameRunning = True
new_level()
player = Player([FIELD_WIDTH //2, FIELD_HEIGHT + TOP_MARGIN-20], 3)
return None
def stop_game():
global GameRunning, GameEnded
# aTimer.stop()
GameEnded = True
GameRunning = False
level_time.append(time.time())
frame.set_keydown_handler(dummy)
frame.set_keyup_handler(dummy)
return None
# Handler for mouse click
def click_handler(position):
if not GameRunning:
start_game()
#else:
# stop_game()
return position
#### keydown_handler
def keydown(key):
    global keypressed, mes, shoot_count, player_speed
    keypressed = key
    if (key == simplegui.KEY_MAP['p']) or \
       (key == simplegui.KEY_MAP['P']):
        pause()
    else:
        if key == simplegui.KEY_MAP['right']:
            # player.move('right')
            player_speed = PLAYER_SPEED
        elif key == simplegui.KEY_MAP['left']:
            # player.move('left')
            player_speed = -PLAYER_SPEED
        if (key == simplegui.KEY_MAP['space']) and GameRunning:
            if len(user_bullet) < weapon_level:
                b = Bullet([player.x, player.y], "LightBlue", -weapon_speed)
                user_bullet.append(b)
                sndPlayer.play()
                shoot_count += 1
    return

#### keyup_handler to stop keydown
def keyup(key):
    global player_speed
    # if keytimer.is_running():
    #     keytimer.stop()
    if (key == simplegui.KEY_MAP['right']) or (key == simplegui.KEY_MAP['left']):
        player_speed = 0
    return
def isBulletHit(bullet, obj):
    # Axis-aligned bounding-box overlap test with a 2-pixel margin.
    if (bullet.y + bullet.height // 2 + 2 > obj.y - obj.height // 2) and \
       (bullet.y - bullet.height // 2 - 2 < obj.y + obj.height // 2):
        if (bullet.x + bullet.width // 2 + 2 > obj.x - obj.width // 2) and \
           (bullet.x - bullet.width // 2 - 2 < obj.x + obj.width // 2):
            return True
    return False
def new_level():
    global level, alien_fleet, user_bullet, alien_bullets, current_back, player
    global level_time, player_speed
    level_time.append(time.time())
    current_back = random.randrange(12)
    level += 1
    player_speed = 0
    user_bullet = []
    alien_bullets = []
    alien_fleet = AliensFleet([250, 100])
    if level % 10 == 0:
        player.lives += 1
        sndBonus.play()
# Create a frame and assign callbacks to event handlers
frame = simplegui.create_frame("Galaxian", 900, 600, 0)
frame.set_draw_handler(draw)
frame.set_keydown_handler(keydown)
frame.set_keyup_handler(keyup)
aTimer = simplegui.create_timer(60, moving_objects)
aTimer.start()
# Start the frame animation
frame.start()
# File: d373c7/pytorch/models/classifiers.py (repo t0kk35/d373c7, Apache-2.0)
"""
Module for classifier Models
(c) 2020 d373c7
"""
import logging
import torch
import torch.nn as nn
from .common import PyTorchModelException, ModelDefaults, _History, _ModelGenerated, _ModelStream
from .encoders import GeneratedAutoEncoder
from ..layers import LSTMBody, ConvolutionalBody1d, AttentionLastEntry, LinearEncoder, TensorDefinitionHead
from ..layers import TransformerBody, TailBinary
from ..loss import SingleLabelBCELoss
from ...features import TensorDefinition, TensorDefinitionMulti
from typing import List, Dict, Union
logger = logging.getLogger(__name__)
class BinaryClassifierHistory(_History):
    loss_key = 'loss'
    acc_key = 'acc'

    def __init__(self, *args):
        dl = self._val_argument(args)
        h = {m: [] for m in [BinaryClassifierHistory.loss_key, BinaryClassifierHistory.acc_key]}
        _History.__init__(self, dl, h)
        self._running_loss = 0
        self._running_correct_cnt = 0
        self._running_count = 0

    @staticmethod
    def _reshape_label(pr: torch.Tensor, lb: torch.Tensor) -> torch.Tensor:
        if pr.shape == lb.shape:
            return lb
        elif len(pr.shape) - 1 == len(lb.shape) and pr.shape[-1] == 1:
            return torch.unsqueeze(lb, dim=len(pr.shape) - 1)
        else:
            raise PyTorchModelException(
                f'Incompatible shapes for prediction and label. Got {pr.shape} and {lb.shape}. Can not safely compare'
            )

    def end_step(self, *args):
        BinaryClassifierHistory._val_is_tensor(args[0])
        BinaryClassifierHistory._val_is_tensor_list(args[1])
        BinaryClassifierHistory._val_is_tensor(args[2])
        pr, lb, loss = args[0], args[1][0], args[2]
        lb = BinaryClassifierHistory._reshape_label(pr, lb)
        self._running_loss += loss.item()
        self._running_correct_cnt += torch.sum(torch.eq(torch.ge(pr, 0.5), lb)).item()
        self._running_count += pr.shape[0]
        super(BinaryClassifierHistory, self).end_step(pr, lb, loss)

    def end_epoch(self):
        self._history[BinaryClassifierHistory.loss_key].append(round(self._running_loss / self.steps, 4))
        self._history[BinaryClassifierHistory.acc_key].append(round(self._running_correct_cnt / self.samples, 4))
        self._running_correct_cnt = 0
        self._running_count = 0
        self._running_loss = 0
        super(BinaryClassifierHistory, self).end_epoch()

    def step_stats(self) -> Dict:
        r = {
            BinaryClassifierHistory.loss_key: round(self._running_loss / self.step, 4),
            BinaryClassifierHistory.acc_key: round(self._running_correct_cnt / self._running_count, 4)
        }
        return r

    def early_break(self) -> bool:
        return False
class ClassifierDefaults(ModelDefaults):
    def __init__(self):
        super(ClassifierDefaults, self).__init__()
        self.emb_dim(4, 100, 0.2)
        self.linear_batch_norm = True
        self.inter_layer_drop_out = 0.1
        self.default_series_body = 'recurrent'
        self.attention_drop_out = 0.0
        self.convolutional_dense = True
        self.convolutional_drop_out = 0.1
        self.transformer_positional_logic = 'encoding'
        self.transformer_positional_size = 16
        self.transformer_drop_out = 0.2

    def emb_dim(self, minimum: int, maximum: int, dropout: float):
        self.set('emb_min_dim', minimum)
        self.set('emb_max_dim', maximum)
        self.set('emb_dropout', dropout)

    @property
    def linear_batch_norm(self) -> bool:
        """Define if a batch norm layer will be added before the final hidden layer.

        :return: bool
        """
        return self.get_bool('lin_batch_norm')

    @linear_batch_norm.setter
    def linear_batch_norm(self, flag: bool):
        """Set if a batch norm layer will be added before the final hidden layer.

        :param flag: bool
        """
        self.set('lin_batch_norm', flag)

    @property
    def inter_layer_drop_out(self) -> float:
        """Defines a value for the inter-layer dropout between linear layers. If set, then dropout will be applied
        between linear layers.

        :return: A float value, the dropout aka p value to apply in the nn.Dropout layers.
        """
        return self.get_float('lin_interlayer_drop_out')

    @inter_layer_drop_out.setter
    def inter_layer_drop_out(self, dropout: float):
        """Define a value for the inter-layer dropout between linear layers. If set, then dropout will be applied
        between linear layers.

        :param dropout: The dropout aka p value to apply in the nn.Dropout layers.
        """
        self.set('lin_interlayer_drop_out', dropout)

    @property
    def default_series_body(self) -> str:
        """Defines the default body type for series, which is a tensor of rank 3 (including batch).
        This could be for instance 'recurrent'.

        :return: A string value, the default body type to apply to a rank 3 tensor stream.
        """
        return self.get_str('def_series_body')

    @default_series_body.setter
    def default_series_body(self, def_series_body: str):
        """Defines the default body type for series, which is a tensor of rank 3 (including batch).
        This could be for instance 'recurrent'.

        :param def_series_body: A string value, the default body type to apply to a rank 3 tensor stream.
        """
        self.set('def_series_body', def_series_body)

    @property
    def attention_drop_out(self) -> float:
        """Define a value for the attention dropout. If set, then dropout will be applied after the attention layer.

        :return: The dropout aka p value to apply in the nn.Dropout layers.
        """
        return self.get_float('attn_drop_out')

    @attention_drop_out.setter
    def attention_drop_out(self, dropout: float):
        """Define a value for the attention dropout. If set, then dropout will be applied after the attention layer.

        :param dropout: The dropout aka p value to apply in the nn.Dropout layers.
        """
        self.set('attn_drop_out', dropout)

    @property
    def convolutional_drop_out(self) -> float:
        """Define a value for the convolutional dropout. If set, then dropout will be applied after the
        convolutional layers.

        :return: The dropout aka p value to apply in the nn.Dropout layers.
        """
        return self.get_float('conv_body_dropout')

    @convolutional_drop_out.setter
    def convolutional_drop_out(self, dropout: float):
        """Define a value for the convolutional dropout. If set, then dropout will be applied after the
        convolutional layers.

        :param dropout: The dropout aka p value to apply in the nn.Dropout layers.
        """
        self.set('conv_body_dropout', dropout)

    @property
    def convolutional_dense(self) -> bool:
        """Defines if convolutional bodies are dense. Dense bodies mean that the input to the layer is added to the
        output. It forms a sort of residual connection. The input is concatenated along the features axis. This
        allows the model to work with the input if that turns out to be useful.

        :return: A boolean value, indicating if the input will be added to the output or not.
        """
        return self.get_bool('conv_body_dense')

    @convolutional_dense.setter
    def convolutional_dense(self, dense: bool):
        """Defines if convolutional bodies are dense. Dense bodies mean that the input to the layer is added to the
        output. It forms a sort of residual connection. The input is concatenated along the features axis. This
        allows the model to work with the input if that turns out to be useful.

        :param dense: A boolean value, indicating if the input will be added to the output or not.
        """
        self.set('conv_body_dense', dense)

    @property
    def transformer_positional_logic(self) -> str:
        """Sets which positional logic is used in transformer blocks. 'encoding': the system will use positional
        encoding, 'embedding': the system will use an embedding layer.

        :return: A string value defining which positional logic to use.
        """
        return self.get_str('trans_pos_logic')

    @transformer_positional_logic.setter
    def transformer_positional_logic(self, positional_logic: str):
        """Sets which positional logic is used in transformer blocks. 'encoding': the system will use positional
        encoding, 'embedding': the system will use an embedding layer.

        :param positional_logic: A string value defining which positional logic to use.
        """
        self.set('trans_pos_logic', positional_logic)

    @property
    def transformer_positional_size(self) -> int:
        """Sets the positional size of transformer blocks. The size is the number of elements added to each
        transaction in the series to help the model determine the position of transactions in the series.

        :return: An integer value. The number of elements output by the positional logic.
        """
        return self.get_int('trans_pos_size')

    @transformer_positional_size.setter
    def transformer_positional_size(self, positional_size: int):
        """Sets the positional size of transformer blocks. The size is the number of elements added to each
        transaction in the series to help the model determine the position of transactions in the series.

        :param positional_size: An integer value. The number of elements output by the positional logic.
        """
        self.set('trans_pos_size', positional_size)

    @property
    def transformer_drop_out(self) -> float:
        """Defines the dropout to apply in the transformer layers.

        :return: A float value. The dropout value to apply in transformer layers.
        """
        return self.get_float('trans_dropout')

    @transformer_drop_out.setter
    def transformer_drop_out(self, dropout: float):
        """Defines the dropout to apply in the transformer layers.

        :param dropout: The dropout value to apply in transformer layers.
        """
        self.set('trans_dropout', dropout)
class GeneratedClassifier(_ModelGenerated):
    """Generate a Pytorch classifier model. This class will create a model that fits the input and label definition
    of the TensorDefinition.

    Args:
        tensor_def: A TensorDefinition or TensorDefinitionMulti object describing the various input and output
            features.
        c_defaults: (Optional) ClassifierDefaults object defining the defaults which need to be used.
        kwargs: Various named parameters which can be used to drive the type of classifier and the capacity of
            the model.
    """
    def __init__(self, tensor_def: Union[TensorDefinition, TensorDefinitionMulti],
                 c_defaults=ClassifierDefaults(), **kwargs):
        tensor_def_m = self.val_is_td_multi(tensor_def)
        super(GeneratedClassifier, self).__init__(tensor_def_m, c_defaults)
        # Set-up stream per tensor_definition
        label_td = self.label_tensor_def(tensor_def_m)
        feature_td = [td for td in self._tensor_def.tensor_definitions if td not in label_td]
        streams = [_ModelStream(td.name) for td in feature_td]
        if self.is_param_defined('transfer_from', kwargs):
            # We're being asked to do transfer learning.
            # TODO we'll need a bunch of validation here.
            om = self.get_gen_model_parameter('transfer_from', kwargs)
            logger.info(f'Transferring from model {om.__class__}')
            # The source model is an auto-encoder
            if isinstance(om, GeneratedAutoEncoder):
                self.set_up_heads(c_defaults, feature_td, streams)
                # Copy and freeze the TensorDefinitionHead, this should normally be the first item.
                for s, oms in zip(streams, om.streams):
                    for sly in oms:
                        if isinstance(sly, TensorDefinitionHead):
                            src = self.is_tensor_definition_head(sly)
                            trg = self.is_tensor_definition_head(s.layers[0])
                            trg.copy_state_dict(src)
                            trg.freeze()
                            logger.info(f'Transferred and froze TensorDefinitionHead {trg.tensor_definition.name}')
                        elif isinstance(sly, LinearEncoder):
                            # If no linear layers defined then try and copy the encoder linear_layers
                            if not self.is_param_defined('linear_layers', kwargs):
                                linear_layers = sly.layer_definition
                                # Add last layer. Because this is binary, it has to have size of 1.
                                linear_layers.append((1, 0.0))
                                tail = TailBinary(
                                    sum(s.out_size for s in streams), linear_layers, c_defaults.linear_batch_norm
                                )
                                tail_state = tail.state_dict()
                                # Get state of the target layer, remove last item. (popitem)
                                source_state = list(sly.state_dict().values())
                                for i, sk in enumerate(tail_state.keys()):
                                    if i < 2:
                                        tail_state[sk].copy_(source_state[i])
                                # Load target Dict in the target layer.
                                tail.load_state_dict(tail_state)
                                for i, p in enumerate(tail.parameters()):
                                    if i < 2:
                                        p.requires_grad = False
                                logger.info(f'Transferred and froze Linear Encoder layers {sly.layer_definition}')
        else:
            # Set-up a head layer to each stream. This is done in the parent class.
            self.set_up_heads(c_defaults, feature_td, streams)
            # Add Body to each stream.
            for td, s in zip(feature_td, streams):
                self._add_body(s, td, kwargs, c_defaults)
            # Create tail.
            linear_layers = self.get_list_parameter('linear_layers', int, kwargs)
            # Add dropout parameter; this will make a list of tuples of (layer_size, dropout)
            linear_layers = [(i, c_defaults.inter_layer_drop_out) for i in linear_layers]
            # Add last layer. Because this is binary, it has to have size of 1.
            linear_layers.append((1, 0.0))
            tail = TailBinary(sum(s.out_size for s in streams), linear_layers, c_defaults.linear_batch_norm)
        # Assume the last entry is the label
        self._y_index = self._x_indexes[-1] + 1
        self.streams = nn.ModuleList(
            [s.create() for s in streams]
        )
        self.tail = tail
        # Last but not least, set-up the loss function
        self.set_loss_fn(SingleLabelBCELoss())
    def _add_body(self, stream: _ModelStream, tensor_def: TensorDefinition, kwargs: dict,
                  defaults: ClassifierDefaults):
        if tensor_def.rank == 2:
            # No need to add anything to the body, rank 2 goes directly to the tail.
            return
        elif tensor_def.rank == 3:
            # Figure out which body to use.
            if self.is_param_defined('recurrent_layers', kwargs):
                body_type = 'recurrent'
            elif self.is_param_defined('convolutional_layers', kwargs):
                body_type = 'convolutional'
            elif self.is_param_defined('attention_heads', kwargs):
                body_type = 'transformer'
            else:
                body_type = defaults.default_series_body
            # Set-up the body.
            if body_type.lower() == 'recurrent':
                self._add_recurrent_body(stream, kwargs, defaults)
            elif body_type.lower() == 'convolutional':
                self._add_convolutional_body(stream, tensor_def, kwargs, defaults)
            elif body_type.lower() == 'transformer':
                self._add_transformer_body(stream, tensor_def, kwargs, defaults)
            else:
                raise PyTorchModelException(
                    f'Do not know how to build body of type {body_type}'
                )

    def _add_recurrent_body(self, stream: _ModelStream, kwargs: dict, defaults: ClassifierDefaults):
        attn_heads = self.get_int_parameter('attention_heads', kwargs, 0)
        # attn_do = defaults.attention_drop_out
        rnn_features = self.get_int_parameter(
            'recurrent_features', kwargs, self.closest_power_of_2(int(stream.out_size / 3))
        )
        rnn_layers = self.get_int_parameter('recurrent_layers', kwargs, 1)
        # Add attention if requested
        if attn_heads > 0:
            attn = AttentionLastEntry(stream.out_size, attn_heads, rnn_features)
            stream.add('Attention', attn, attn.output_size)
        # Add main rnn layer
        rnn = LSTMBody(stream.out_size, rnn_features, rnn_layers, True, False)
        stream.add('Recurrent', rnn, rnn.output_size)

    def _add_convolutional_body(self, stream: _ModelStream, tensor_def: TensorDefinition, kwargs: dict,
                                defaults: ClassifierDefaults):
        s_length = [s[1] for s in tensor_def.shapes if len(s) == 3][0]
        convolutional_layers = self.get_list_of_tuples_parameter('convolutional_layers', int, kwargs, None)
        dropout = defaults.convolutional_drop_out
        dense = defaults.convolutional_dense
        cnn = ConvolutionalBody1d(stream.out_size, s_length, convolutional_layers, dropout, dense)
        stream.add('Convolutional', cnn, cnn.output_size)

    def _add_transformer_body(self, stream: _ModelStream, tensor_def: TensorDefinition, kwargs: dict,
                              defaults: ClassifierDefaults):
        s_length = [s[1] for s in tensor_def.shapes if len(s) == 3][0]
        attention_head = self.get_int_parameter('attention_heads', kwargs, 1)
        feedforward_size = self.get_int_parameter(
            'feedforward_size', kwargs, self.closest_power_of_2(int(stream.out_size / 3))
        )
        drop_out = defaults.transformer_drop_out
        positional_size = defaults.transformer_positional_size
        positional_logic = defaults.transformer_positional_logic
        trans = TransformerBody(
            stream.out_size,
            s_length,
            positional_size,
            positional_logic,
            attention_head,
            feedforward_size,
            drop_out
        )
        stream.add('Transformer', trans, trans.output_size)

    def get_y(self, ds: List[torch.Tensor]) -> List[torch.Tensor]:
        return ds[self._y_index: self._y_index + 1]

    def history(self, *args) -> _History:
        return BinaryClassifierHistory(*args)

    def forward(self, x: List[torch.Tensor]):
        y = [s([x[i] for i in hi]) for hi, s in zip(self.head_indexes, self.streams)]
        y = self.tail(y)
        return y
# File: files/spam-filter/tracspamfilter/captcha/keycaptcha.py (repo Puppet-Finland/puppet-trac, BSD-2-Clause)
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015 Dirk Stöcker <trac@dstoecker.de>
# All rights reserved.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at http://trac.edgewall.com/license.html.
#
# This software consists of voluntary contributions made by many
# individuals. For the exact contribution history, see the revision
# history and logs, available at http://projects.edgewall.com/trac/.
import hashlib
import random
import urllib2
from trac.config import Option
from trac.core import Component, implements
from trac.util.html import tag
from tracspamfilter.api import user_agent
from tracspamfilter.captcha import ICaptchaMethod
class KeycaptchaCaptcha(Component):
    """KeyCaptcha implementation"""

    implements(ICaptchaMethod)

    private_key = Option('spam-filter', 'captcha_keycaptcha_private_key', '',
        """Private key for KeyCaptcha usage.""", doc_domain="tracspamfilter")

    user_id = Option('spam-filter', 'captcha_keycaptcha_user_id', '',
        """User id for KeyCaptcha usage.""", doc_domain="tracspamfilter")

    def generate_captcha(self, req):
        session_id = "%d-3.4.0.001" % random.randint(1, 10000000)
        sign1 = hashlib.md5(session_id + req.remote_addr +
                            self.private_key).hexdigest()
        sign2 = hashlib.md5(session_id + self.private_key).hexdigest()

        varblock = "var s_s_c_user_id = '%s';\n" % self.user_id
        varblock += "var s_s_c_session_id = '%s';\n" % session_id
        varblock += "var s_s_c_captcha_field_id = 'keycaptcha_response_field';\n"
        varblock += "var s_s_c_submit_button_id = 'keycaptcha_response_button';\n"
        varblock += "var s_s_c_web_server_sign = '%s';\n" % sign1
        varblock += "var s_s_c_web_server_sign2 = '%s';\n" % sign2
        varblock += "document.s_s_c_debugmode=1;\n"

        fragment = tag(tag.script(varblock, type='text/javascript'))
        fragment.append(
            tag.script(type='text/javascript',
                       src='http://backs.keycaptcha.com/swfs/cap.js')
        )
        fragment.append(
            tag.input(type='hidden', id='keycaptcha_response_field',
                      name='keycaptcha_response_field')
        )
        fragment.append(
            tag.input(type='submit', id='keycaptcha_response_button',
                      name='keycaptcha_response_button')
        )
        req.session['captcha_key_session'] = session_id
        return None, fragment

    def verify_key(self, private_key, user_id):
        if private_key is None or user_id is None:
            return False
        # FIXME - Not yet implemented
        return True

    def verify_captcha(self, req):
        session = None
        if 'captcha_key_session' in req.session:
            session = req.session['captcha_key_session']
            del req.session['captcha_key_session']
        response_field = req.args.get('keycaptcha_response_field')
        val = response_field.split('|')
        s = hashlib.md5('accept' + val[1] + self.private_key +
                        val[2]).hexdigest()
        self.log.debug("KeyCaptcha response: %s .. %s .. %s",
                       response_field, s, session)
        if s == val[0] and session == val[3]:
            try:
                request = urllib2.Request(
                    url=val[2],
                    headers={"User-agent": user_agent}
                )
                response = urllib2.urlopen(request)
                return_values = response.read()
                response.close()
            except Exception, e:
                self.log.warning("Exception in KeyCaptcha handling (%s)", e)
            else:
                self.log.debug("KeyCaptcha check result: %s", return_values)
                if return_values == '1':
                    return True
                self.log.warning("KeyCaptcha returned invalid check result: "
                                 "%s (%s)", return_values, response_field)
        else:
            self.log.warning("KeyCaptcha returned invalid data: "
                             "%s (%s,%s)", response_field, s, session)
        return False

    def is_usable(self, req):
        return self.private_key and self.user_id
# File: var/spack/repos/builtin/packages/bcache/package.py (repo milljm/spack)
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Bcache(MakefilePackage):
    """Bcache is a patch for the Linux kernel to use SSDs to cache other block
    devices."""

    homepage = "http://bcache.evilpiepirate.org"
    url = "https://github.com/g2p/bcache-tools/archive/v1.0.8.tar.gz"

    version('1.0.8', sha256='d56923936f37287efc57a46315679102ef2c86cd0be5874590320acd48c1201c')
    version('1.0.7', sha256='64d76d1085afba8c3d5037beb67bf9d69ee163f357016e267bf328c0b1807abd')
    version('1.0.6', sha256='9677c6da3ceac4e1799d560617c4d00ea7e9d26031928f8f94b8ab327496d4e0')
    version('1.0.5', sha256='1449294ef545b3dc6f715f7b063bc2c8656984ad73bcd81a0dc048cbba416ea9')
    version('1.0.4', sha256='102ffc3a8389180f4b491188c3520f8a4b1a84e5a7ca26d2bd6de1821f4d913d')

    depends_on('libuuid')
    depends_on('util-linux')
    depends_on('gettext')
    depends_on('pkgconfig', type='build')

    patch('func_crc64.patch', sha256='558b35cadab4f410ce8f87f0766424a429ca0611aa2fd247326ad10da115737d')

    def setup_build_environment(self, env):
        env.append_flags('LDFLAGS', '-lintl')

    def install(self, spec, prefix):
        mkdirp(prefix.bin)
        install('bcache-register', prefix.bin)
        install('bcache-super-show', prefix.bin)
        install('make-bcache', prefix.bin)
        install('probe-bcache', prefix.bin)
# File: home/pedrosenarego/zorba/zorba1.0.py (repo rv8flyboy/pyrobotlab, Apache-2.0)
from java.lang import String
import threading
import random
import codecs
import io
import itertools
import time
import os
import urllib2
import textwrap
import socket
import shutil
#############################################################
# This is the ZOrba
#
#############################################################
# All bot specific configuration goes here.
leftPort = "/dev/ttyACM1"
rightPort = "/dev/ttyACM0"
headPort = leftPort
gesturesPath = "/home/pedro/Dropbox/pastaPessoal/3Dprinter/inmoov/scripts/zorba/gestures"
botVoice = "WillBadGuy"
#starting the INMOOV
i01 = Runtime.createAndStart("i01", "InMoov")
i01.setMute(True)
##############STARTING THE RIGHT HAND#########
i01.rightHand = Runtime.create("i01.rightHand", "InMoovHand")
#tweaking defaults settings of right hand
i01.rightHand.thumb.setMinMax(20,155)
i01.rightHand.index.setMinMax(30,130)
i01.rightHand.majeure.setMinMax(38,150)
i01.rightHand.ringFinger.setMinMax(30,170)
i01.rightHand.pinky.setMinMax(30,150)
i01.rightHand.thumb.map(0,180,20,155)
i01.rightHand.index.map(0,180,30,130)
i01.rightHand.majeure.map(0,180,38,150)
i01.rightHand.ringFinger.map(0,180,30,175)
i01.rightHand.pinky.map(0,180,30,150)
#################
#################STARTING RIGHT ARM###############
i01.startRightArm(rightPort)
#i01.rightArm = Runtime.create("i01.rightArm", "InMoovArm")
## tweak default RightArm
i01.detach()
i01.rightArm.bicep.setMinMax(0,60)
i01.rightArm.bicep.map(0,180,0,60)
i01.rightArm.rotate.setMinMax(46,130)
i01.rightArm.rotate.map(0,180,46,130)
i01.rightArm.shoulder.setMinMax(0,155)
i01.rightArm.shoulder.map(0,180,0,155)
i01.rightArm.omoplate.setMinMax(8,85)
i01.rightArm.omoplate.map(0,180,8,85)
########STARTING SIDE NECK CONTROL########
def neckMoveTo(restPos, delta):
    leftneckServo.moveTo(restPos + delta)
    rightneckServo.moveTo(restPos - delta)
leftneckServo = Runtime.start("leftNeck","Servo")
rightneckServo = Runtime.start("rightNeck","Servo")
right = Runtime.start("i01.right", "Arduino")
#right.connect(rightPort)
leftneckServo.attach(right, 13)
rightneckServo.attach(right, 12)
restPos = 90
delta = 20
neckMoveTo(restPos,delta)
#############STARTING THE HEAD##############
i01.head = Runtime.create("i01.head", "InMoovHead")
#weaking defaults settings of head
i01.head.jaw.setMinMax(35,75)
i01.head.jaw.map(0,180,35,75)
i01.head.jaw.setRest(35)
#tweaking default settings of eyes
i01.head.eyeY.setMinMax(0,180)
i01.head.eyeY.map(0,180,70,110)
i01.head.eyeY.setRest(90)
i01.head.eyeX.setMinMax(0,180)
i01.head.eyeX.map(0,180,70,110)
i01.head.eyeX.setRest(90)
i01.head.neck.setMinMax(40,142)
i01.head.neck.map(0,180,40,142)
i01.head.neck.setRest(70)
i01.head.rothead.setMinMax(21,151)
i01.head.rothead.map(0,180,21,151)
i01.head.rothead.setRest(88)
#########STARTING MOUTH CONTROL###############
i01.startMouthControl(leftPort)
i01.mouthControl.setmouth(0,180)
######################################################################
# mouth service, speech synthesis
mouth = Runtime.createAndStart("i01.mouth", "AcapelaSpeech")
mouth.setVoice(botVoice)
######################################################################
# helper function help debug the recognized text from webkit/sphinx
######################################################################
def heard(data):
    print "Speech Recognition Data:" + str(data)
######################################################################
# Create ProgramAB chat bot ( This is the inmoov "brain" )
######################################################################
zorba2 = Runtime.createAndStart("zorba", "ProgramAB")
zorba2.startSession("Pedro", "zorba")
######################################################################
# Html filter to clean the output from programab. (just in case)
htmlfilter = Runtime.createAndStart("htmlfilter", "HtmlFilter")
######################################################################
# the "ear" of the inmoov TODO: replace this with just base inmoov ear?
ear = Runtime.createAndStart("i01.ear", "WebkitSpeechRecognition")
ear.addListener("publishText", python.name, "heard");
ear.addMouth(mouth)
######################################################################
# MRL Routing webkitspeechrecognition/ear -> program ab -> htmlfilter -> mouth
######################################################################
ear.addTextListener(zorba2)
zorba2.addTextListener(htmlfilter)
htmlfilter.addTextListener(mouth)
#starting the INMOOV
i01 = Runtime.createAndStart("i01", "InMoov")
i01.setMute(True)
i01.mouth = mouth
######################################################################
# Launch the web gui and create the webkit speech recognition gui
# This service works in Google Chrome only with the WebGui
#################################################################
webgui = Runtime.createAndStart("webgui","WebGui")
######################################################################
# Helper functions and various gesture definitions
######################################################################
i01.loadGestures(gesturesPath)
ear.startListening()
######################################################################
# starting services
######################################################################
i01.startRightHand(rightPort)
i01.detach()
leftneckServo.detach()
rightneckServo.detach()
i01.startHead(leftPort)
i01.detach()
| 31.95858 | 89 | 0.587854 | 583 | 5,401 | 5.445969 | 0.339623 | 0.021417 | 0.030866 | 0.008504 | 0.141732 | 0.052283 | 0.052283 | 0.040315 | 0.040315 | 0.040315 | 0 | 0.065204 | 0.071468 | 5,401 | 168 | 90 | 32.14881 | 0.567896 | 0.179411 | 0 | 0.074468 | 0 | 0 | 0.114802 | 0.030635 | 0 | 0 | 0 | 0.005952 | 0 | 0 | null | null | 0 | 0.12766 | null | null | 0.021277 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c359a6fbb849b989ceb5b8e12f21bfb4e4e866fd | 1,729 | py | Python | PAL/Cross/client/sources-linux/build_library_zip.py | infosecsecurity/OSPTF | df3f63dc882db6d7e0b7bd80476e9bbc8471ac1f | [
"MIT"
] | 2 | 2017-11-23T01:07:37.000Z | 2021-06-25T05:03:49.000Z | PAL/Cross/client/sources-linux/build_library_zip.py | infosecsecurity/OSPTF | df3f63dc882db6d7e0b7bd80476e9bbc8471ac1f | [
"MIT"
] | null | null | null | PAL/Cross/client/sources-linux/build_library_zip.py | infosecsecurity/OSPTF | df3f63dc882db6d7e0b7bd80476e9bbc8471ac1f | [
"MIT"
] | 1 | 2018-05-22T02:28:43.000Z | 2018-05-22T02:28:43.000Z | import sys
from distutils.core import setup
import os
from glob import glob
import zipfile
import shutil
sys.path.insert(0, os.path.join('resources','library_patches'))
sys.path.insert(0, os.path.join('..','..','pupy'))
import pp
import additional_imports
import Crypto
all_dependencies=set([x.split('.')[0] for x,m in sys.modules.iteritems() if not '(built-in)' in str(m) and x != '__main__'])
print "ALLDEPS: ", all_dependencies
zf = zipfile.ZipFile(os.path.join('resources','library.zip'), mode='w', compression=zipfile.ZIP_DEFLATED)
try:
for dep in all_dependencies:
mdep = __import__(dep)
print "DEPENDENCY: ", dep, mdep
if hasattr(mdep, '__path__'):
print('adding package %s'%dep)
path, root = os.path.split(mdep.__path__[0])
for root, dirs, files in os.walk(mdep.__path__[0]):
for f in list(set([x.rsplit('.',1)[0] for x in files])):
found=False
for ext in ('.pyc', '.so', '.pyo', '.py'):
if ext == '.py' and found:
continue
if os.path.exists(os.path.join(root,f+ext)):
zipname = os.path.join(root[len(path)+1:], f.split('.', 1)[0] + ext)
print('adding file : {}'.format(zipname))
zf.write(os.path.join(root, f+ext), zipname)
found=True
else:
if '<memimport>' in mdep.__file__:
continue
_, ext = os.path.splitext(mdep.__file__)
print('adding %s -> %s'%(mdep.__file__, dep+ext))
zf.write(mdep.__file__, dep+ext)
finally:
zf.close()
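The extension loop above prefers compiled artefacts (`.pyc`, `.so`, `.pyo`) and only falls back to `.py` when nothing else exists for a module stem. A small self-contained sketch of that selection rule, simplified to first-match and using hypothetical file lists (no `zipfile` or filesystem access):

```python
def pick_module_files(filenames, exts=('.pyc', '.so', '.pyo', '.py')):
    # For each module stem, keep the first extension found in priority
    # order -- a simplified, first-match variant of the loop above.
    stems = {f.rsplit('.', 1)[0] for f in filenames}
    existing = set(filenames)
    picked = {}
    for stem in sorted(stems):
        for ext in exts:
            if stem + ext in existing:
                picked[stem] = stem + ext
                break
    return picked

files = ['mod.py', 'mod.pyc', 'other.py', 'native.so']
result = pick_module_files(files)
```

Here `mod.pyc` shadows `mod.py`, while `other.py` survives because no compiled form exists for it.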
| 36.020833 | 124 | 0.54251 | 221 | 1,729 | 4.049774 | 0.371041 | 0.060335 | 0.067039 | 0.046927 | 0.156425 | 0.109497 | 0.109497 | 0 | 0 | 0 | 0 | 0.008326 | 0.305379 | 1,729 | 47 | 125 | 36.787234 | 0.736886 | 0 | 0 | 0.05 | 0 | 0 | 0.103528 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.275 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c35a9f8a6f746b1900b91c33a9b1be7d36fdde7f | 4,086 | py | Python | data_collection/json2mongodb.py | kwond2/hedgehogs | 58dbed549a1e78e401fc90c7a7041d9979cfc2e4 | [
"MIT"
] | 9 | 2018-02-06T19:08:16.000Z | 2022-03-15T13:31:57.000Z | data_collection/json2mongodb.py | kwond2/hedgehogs | 58dbed549a1e78e401fc90c7a7041d9979cfc2e4 | [
"MIT"
] | 37 | 2018-02-09T21:22:58.000Z | 2021-12-13T19:51:24.000Z | data_collection/json2mongodb.py | kwond2/hedgehogs | 58dbed549a1e78e401fc90c7a7041d9979cfc2e4 | [
"MIT"
] | 10 | 2018-02-27T20:26:55.000Z | 2021-02-06T02:26:30.000Z | #-*- coding: utf-8 -*-
# import os
# from optparse import OptionParser
# from pymongo import MongoClient, bulk
# import json
# import collections
# import sys
from import_hedgehogs import *
HOST = '45.55.48.43'
PORT = 27017
DB = 'SEC_EDGAR'
class OrderedDictWithKeyEscaping(collections.OrderedDict):
def __setitem__(self, key, value, dict_setitem=dict.__setitem__):
# MongoDB complains when keys contain dots, so we call json.load with
# a modified OrderedDict class which escapes dots in keys on the fly
key = key.replace('.', '<DOT>')
        super(OrderedDictWithKeyEscaping, self).__setitem__(key, value)
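MongoDB rejects document keys containing dots, which is why the class above escapes them while the JSON is still being parsed. The same hook in isolation, assuming only the stdlib `json` module (no MongoDB connection):

```python
import json
import collections


class DotEscapingDict(collections.OrderedDict):
    # Escape dots in keys as they are inserted, like the class above.
    # OrderedDict's constructor routes through __setitem__, so keys are
    # already escaped by the time json.loads returns.
    def __setitem__(self, key, value):
        super(DotEscapingDict, self).__setitem__(
            key.replace('.', '<DOT>'), value)


doc = json.loads('{"a.b": 1, "plain": 2}',
                 object_pairs_hook=DotEscapingDict)
```

`object_pairs_hook` is applied to every JSON object, including nested ones, so the escaping happens at every level of the document.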
def save_to_mongodb(input_file_name, collectionID, usernameID, passwordID):
with open(input_file_name) as fp:
data = fp.read()
json_ = json.loads(data, encoding='utf-8', object_pairs_hook=OrderedDictWithKeyEscaping)
client = MongoClient(HOST, PORT, username=usernameID, password=passwordID, authMechanism ='SCRAM-SHA-1')
# client.admin.authenticate('jgeorge','123',source= 'SEC_EDGAR')
# print("arguments to function:", input_file_name, usernameID, collectionID)
db = client[DB]
collection = db[collectionID]
# print(type(input_file_name))
# file = open(input_file_name, "r")
# data = json.load(file)
# print(type(data))
# print(type(file))
# data = json_util.loads(file.read())
# print(json_)
for item in json_:
collection.insert_one(item)
# file.close()
def get_collection_name(input_file_name):
data_list = json.load(open(input_file_name))
data = dict(data_list[0])
    ticker = "TICKER"
    quarter = "QUARTER"
    year = "YEAR"
try:
# year = data.get("Document And Entity Information [Abstract]")
# print(year)
year = data.get("Document And Entity Information [Abstract]").get("Document Fiscal Year Focus").get("value")
quarter = data.get("Document And Entity Information [Abstract]").get("Document Fiscal Period Focus").get("value")
ticker = data.get("Document And Entity Information [Abstract]").get("Entity Trading Symbol").get("value")
except AttributeError:
        print("[EXCEPT] Issues with ", input_file_name)
# except AttributeError:
# year = data.get("Document And Entity Information").get("Document Fiscal Year Focus").get("value")
# quarter = data.get("Document And Entity Information").get("Document Fiscal Period Focus").get("value")
# try:
# ticker = data.get("Document And Entity Information [Abstract]").get("Entity Trading Symbol").get("value")
# except:
# ticker = data.get("Document And Entity Information [Abstract]").get("Trading Symbol").get("value")
# try:
# ticker = data.get("Document And Entity Information [Abstract]").get("Entity Trading Symbol").get("value")
# except:
# ticker = data.get("Document And Entity Information [Abstract]").get("Trading Symbol").get("value")
# quarter = data.get("Document And Entity Information [Abstract]").get("Document Fiscal Period Focus").get("value")
return str(ticker) + "_" + str(year) + "_" + str(quarter)
def main():
cli_parser = OptionParser(
usage='usage: %prog <input.json> <username> <password>'
)
(options, args) = cli_parser.parse_args()
# Input file checks
    if len(args) < 3:
        cli_parser.error("You have to supply 3 arguments, USAGE: <input.json> <username> <password>")
input_file_name = args[0]
if not os.path.exists(input_file_name):
cli_parser.error("The input file %s you supplied does not exist" % input_file_name)
# JAROD's FUNCTION
collection = get_collection_name(input_file_name)
#collection = (sys.argv[1]).strip('.')
    username = args[1]
    password = args[2]
print("Adding to MongoDB...")
#save_to_mongodb(input_file_name, collection, username)
if __name__ == "__main__":
print("[WARNING] STILL UNDER DEVELOPMENT")
main()
| 41.272727 | 121 | 0.670338 | 502 | 4,086 | 5.290837 | 0.304781 | 0.066265 | 0.058735 | 0.074548 | 0.387801 | 0.387801 | 0.323042 | 0.317395 | 0.298193 | 0.278614 | 0 | 0.007929 | 0.197504 | 4,086 | 98 | 122 | 41.693878 | 0.802074 | 0.417523 | 0 | 0 | 0 | 0 | 0.213675 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086957 | false | 0.086957 | 0.021739 | 0 | 0.152174 | 0.065217 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c35c97b552a6619198e65898ccb72250776063d5 | 1,867 | py | Python | molecule/default/tests/test_default.py | escalate/ansible-influxdb-docker | bbb2c259bd1de3c4c40322103a05894494af7104 | [
"MIT"
] | null | null | null | molecule/default/tests/test_default.py | escalate/ansible-influxdb-docker | bbb2c259bd1de3c4c40322103a05894494af7104 | [
"MIT"
] | null | null | null | molecule/default/tests/test_default.py | escalate/ansible-influxdb-docker | bbb2c259bd1de3c4c40322103a05894494af7104 | [
"MIT"
] | null | null | null | """Role testing files using testinfra"""
def test_config_directory(host):
"""Check config directory"""
f = host.file("/etc/influxdb")
assert f.is_directory
assert f.user == "influxdb"
assert f.group == "root"
assert f.mode == 0o775
def test_data_directory(host):
"""Check data directory"""
d = host.file("/var/lib/influxdb")
assert d.is_directory
assert d.user == "influxdb"
assert d.group == "root"
assert d.mode == 0o700
def test_backup_directory(host):
"""Check backup directory"""
b = host.file("/var/backups/influxdb")
assert b.is_directory
assert b.user == "influxdb"
assert b.group == "root"
assert b.mode == 0o775
def test_influxdb_service(host):
"""Check InfluxDB service"""
s = host.service("influxdb")
assert s.is_running
assert s.is_enabled
def test_influxdb_docker_container(host):
"""Check InfluxDB docker container"""
d = host.docker("influxdb.service").inspect()
assert d["HostConfig"]["Memory"] == 1073741824
assert d["Config"]["Image"] == "influxdb:latest"
assert d["Config"]["Labels"]["maintainer"] == "me@example.com"
assert "INFLUXD_REPORTING_DISABLED=true" in d["Config"]["Env"]
assert "internal" in d["NetworkSettings"]["Networks"]
assert \
"influxdb" in d["NetworkSettings"]["Networks"]["internal"]["Aliases"]
def test_backup(host):
"""Check if the backup runs successfully"""
cmd = host.run("/usr/local/bin/backup-influxdb.sh")
assert cmd.succeeded
def test_backup_cron_job(host):
"""Check backup cron job"""
f = host.file("/var/spool/cron/crontabs/root")
assert "/usr/local/bin/backup-influxdb.sh" in f.content_string
def test_restore(host):
"""Check if the restore runs successfully"""
cmd = host.run("/usr/local/bin/restore-influxdb.sh")
assert cmd.succeeded
| 28.287879 | 77 | 0.664167 | 245 | 1,867 | 4.963265 | 0.306122 | 0.046053 | 0.044408 | 0.026316 | 0.134046 | 0.096217 | 0.060855 | 0.060855 | 0 | 0 | 0 | 0.01437 | 0.179968 | 1,867 | 65 | 78 | 28.723077 | 0.779882 | 0.136583 | 0 | 0.05 | 0 | 0 | 0.277778 | 0.115581 | 0 | 0 | 0 | 0 | 0.575 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c368cab3b6e074a25c4387726e3ddcf458b2da2f | 384 | py | Python | sapextractor/utils/fields_corresp/extract_dd03t.py | aarkue/sap-meta-explorer | 613bf657bbaa72a3781a84664e5de7626516532f | [
"Apache-2.0"
] | 2 | 2021-02-10T08:09:35.000Z | 2021-05-21T06:25:34.000Z | sapextractor/utils/fields_corresp/extract_dd03t.py | aarkue/sap-meta-explorer | 613bf657bbaa72a3781a84664e5de7626516532f | [
"Apache-2.0"
] | null | null | null | sapextractor/utils/fields_corresp/extract_dd03t.py | aarkue/sap-meta-explorer | 613bf657bbaa72a3781a84664e5de7626516532f | [
"Apache-2.0"
] | 3 | 2021-11-22T13:27:00.000Z | 2022-03-16T22:08:51.000Z | def apply(con, target_language="E"):
dict_field_desc = {}
try:
df = con.prepare_and_execute_query("DD03T", ["DDLANGUAGE", "FIELDNAME", "DDTEXT"], " WHERE DDLANGUAGE = '"+target_language+"'")
stream = df.to_dict("records")
for el in stream:
dict_field_desc[el["FIELDNAME"]] = el["DDTEXT"]
    except Exception:
pass
return dict_field_desc
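`apply()` reduces a DD03T result set to a FIELDNAME -> DDTEXT mapping. The same reshaping without any database dependency, assuming rows already fetched as a list of dicts (the shape `DataFrame.to_dict("records")` returns); the sample rows are illustrative:

```python
def rows_to_field_desc(rows):
    # Mirror of the loop above: index DDTEXT descriptions by FIELDNAME.
    return {row["FIELDNAME"]: row["DDTEXT"] for row in rows}


rows = [
    {"DDLANGUAGE": "E", "FIELDNAME": "MATNR", "DDTEXT": "Material Number"},
    {"DDLANGUAGE": "E", "FIELDNAME": "WERKS", "DDTEXT": "Plant"},
]
desc = rows_to_field_desc(rows)
```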
| 34.909091 | 135 | 0.611979 | 46 | 384 | 4.847826 | 0.630435 | 0.121076 | 0.174888 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006849 | 0.239583 | 384 | 10 | 136 | 38.4 | 0.756849 | 0 | 0 | 0 | 0 | 0 | 0.195313 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.1 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c3693f12a03bbf78b7f7bcf22ea6cd2fd4184fd8 | 1,043 | py | Python | app/forms/fields/month_year_date_field.py | ons-eq-team/eq-questionnaire-runner | 8d029097faa2b9d53d9621064243620db60c62c7 | [
"MIT"
] | null | null | null | app/forms/fields/month_year_date_field.py | ons-eq-team/eq-questionnaire-runner | 8d029097faa2b9d53d9621064243620db60c62c7 | [
"MIT"
] | null | null | null | app/forms/fields/month_year_date_field.py | ons-eq-team/eq-questionnaire-runner | 8d029097faa2b9d53d9621064243620db60c62c7 | [
"MIT"
] | null | null | null | import logging
from werkzeug.utils import cached_property
from wtforms import FormField, Form, StringField
logger = logging.getLogger(__name__)
def get_form_class(validators):
class YearMonthDateForm(Form):
year = StringField(validators=validators)
month = StringField()
@cached_property
def data(self):
data = super().data
try:
return "{year:04d}-{month:02d}".format(
year=int(data["year"]), month=int(data["month"])
)
except (TypeError, ValueError):
return None
return YearMonthDateForm
class MonthYearDateField(FormField):
def __init__(self, validators, **kwargs):
form_class = get_form_class(validators)
super().__init__(form_class, **kwargs)
def process(self, formdata, data=None):
if data is not None:
substrings = data.split("-")
data = {"year": substrings[0], "month": substrings[1]}
super().process(formdata, data)
| 26.74359 | 68 | 0.607862 | 106 | 1,043 | 5.792453 | 0.433962 | 0.058632 | 0.039088 | 0.071661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008011 | 0.281879 | 1,043 | 38 | 69 | 27.447368 | 0.811749 | 0 | 0 | 0 | 0 | 0 | 0.03931 | 0.021093 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148148 | false | 0 | 0.111111 | 0 | 0.518519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c37f533b46624d83873bcd5b9e4314c8ccb4405c | 11,734 | py | Python | myo/device_listener.py | ehliang/myo-unlock | 059e130a90e44df3869dd892e216c020d6d97a7e | [
"MIT"
] | 1 | 2021-06-25T02:27:31.000Z | 2021-06-25T02:27:31.000Z | myo/device_listener.py | ehliang/myo-unlock | 059e130a90e44df3869dd892e216c020d6d97a7e | [
"MIT"
] | null | null | null | myo/device_listener.py | ehliang/myo-unlock | 059e130a90e44df3869dd892e216c020d6d97a7e | [
"MIT"
] | null | null | null | # Copyright (c) 2015 Niklas Rosenstein
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import abc
import six
import time
import threading
import warnings
from .lowlevel.enums import EventType, Pose, Arm, XDirection
from .utils.threading import TimeoutClock
from .vector import Vector
from .quaternion import Quaternion
class DeviceListener(six.with_metaclass(abc.ABCMeta)):
"""
Interface for listening to data sent from a Myo device.
Return False from one of its callback methods to instruct
the Hub to stop processing.
The *DeviceListener* operates between the high and low level
of the myo Python bindings. The ``myo`` object that is passed
to callback methods is a :class:`myo.lowlevel.ctyping.Myo`
object.
"""
def on_event(self, kind, event):
"""
Called before any of the event callbacks.
"""
def on_event_finished(self, kind, event):
"""
Called after the respective event callbacks have been
invoked. This method is *always* triggered, even if one of
the callbacks requested the stop of the Hub.
"""
def on_pair(self, myo, timestamp):
pass
def on_unpair(self, myo, timestamp):
pass
def on_connect(self, myo, timestamp):
pass
def on_disconnect(self, myo, timestamp):
pass
def on_pose(self, myo, timestamp, pose):
pass
def on_orientation_data(self, myo, timestamp, orientation):
pass
def on_accelerometor_data(self, myo, timestamp, acceleration):
pass
def on_gyroscope_data(self, myo, timestamp, gyroscope):
pass
def on_rssi(self, myo, timestamp, rssi):
pass
def on_emg(self, myo, timestamp, emg):
pass
def on_unsync(self, myo, timestamp):
pass
def on_sync(self, myo, timestamp, arm, x_direction):
pass
def on_unlock(self, myo, timestamp):
pass
def on_lock(self, myo, timestamp):
pass
class Feed(DeviceListener):
"""
This class implements the :class:`DeviceListener` interface
to collect all data and make it available to another thread
on-demand.
.. code-block:: python
import myo as libmyo
feed = libmyo.device_listener.Feed()
hub = libmyo.Hub()
hub.run(1000, feed)
try:
while True:
myos = feed.get_connected_devices()
if myos:
print myos[0], myos[0].orientation
time.sleep(0.5)
finally:
hub.stop(True)
hub.shutdown()
"""
class MyoProxy(object):
__slots__ = ('synchronized,_pair_time,_unpair_time,_connect_time,'
'_disconnect_time,_myo,_emg,_orientation,_acceleration,'
'_gyroscope,_pose,_arm,_xdir,_rssi,_firmware_version').split(',')
def __init__(self, low_myo, timestamp, firmware_version):
super(Feed.MyoProxy, self).__init__()
self.synchronized = threading.Condition()
self._pair_time = timestamp
self._unpair_time = None
self._connect_time = None
self._disconnect_time = None
self._myo = low_myo
self._emg = None
self._orientation = Quaternion.identity()
self._acceleration = Vector(0, 0, 0)
self._gyroscope = Vector(0, 0, 0)
self._pose = Pose.rest
self._arm = None
self._xdir = None
self._rssi = None
self._firmware_version = firmware_version
def __repr__(self):
result = '<MyoProxy ('
with self.synchronized:
if self.connected:
result += 'connected) at 0x{0:x}>'.format(self._myo.value)
else:
result += 'disconnected)>'
return result
def __assert_connected(self):
if not self.connected:
raise RuntimeError('Myo was disconnected')
@property
def connected(self):
with self.synchronized:
return (self._connect_time is not None and
self._disconnect_time is None)
@property
def paired(self):
with self.synchronized:
                # paired while we still hold a Myo handle and no unpair was seen
                return self._myo is not None and self._unpair_time is None
@property
def pair_time(self):
return self._pair_time
@property
def unpair_time(self):
with self.synchronized:
return self._unpair_time
@property
def connect_time(self):
return self._connect_time
@property
def disconnect_time(self):
with self.synchronized:
return self._disconnect_time
@property
def firmware_version(self):
return self._firmware_version
@property
def orientation(self):
with self.synchronized:
return self._orientation.copy()
@property
def acceleration(self):
with self.synchronized:
return self._acceleration.copy()
@property
def gyroscope(self):
with self.synchronized:
return self._gyroscope.copy()
@property
def pose(self):
with self.synchronized:
return self._pose
@property
def arm(self):
with self.synchronized:
return self._arm
@property
def x_direction(self):
with self.synchronized:
return self._xdir
@property
def rssi(self):
with self.synchronized:
return self._rssi
def set_locking_policy(self, locking_policy):
with self.synchronized:
self.__assert_connected()
self._myo.set_locking_policy(locking_policy)
def set_stream_emg(self, emg):
with self.synchronized:
self.__assert_connected()
self._myo.set_stream_emg(emg)
def vibrate(self, vibration_type):
with self.synchronized:
self.__assert_connected()
self._myo.vibrate(vibration_type)
def request_rssi(self):
"""
Requests the RSSI of the Myo armband. Until the RSSI is
retrieved, :attr:`rssi` returns None.
"""
with self.synchronized:
self.__assert_connected()
self._rssi = None
self._myo.request_rssi()
def __init__(self):
super(Feed, self).__init__()
self.synchronized = threading.Condition()
self._myos = {}
def get_devices(self):
"""
get_devices() -> list of Feed.MyoProxy
Returns a list of paired and connected Myo's.
"""
with self.synchronized:
return list(self._myos.values())
def get_connected_devices(self):
"""
get_connected_devices(self) -> list of Feed.MyoProxy
Returns a list of connected Myo's.
"""
with self.synchronized:
return [myo for myo in self._myos.values() if myo.connected]
def wait_for_single_device(self, timeout=None, interval=0.5):
"""
wait_for_single_device(timeout) -> Feed.MyoProxy or None
Waits until a Myo is was paired **and** connected with the Hub
and returns it. If the *timeout* is exceeded, returns None.
This function will not return a Myo that is only paired but
not connected.
:param timeout: The maximum time to wait for a device.
:param interval: The interval at which the function should
exit sleeping. We can not sleep endlessly, otherwise
the main thread can not be exit, eg. through a
KeyboardInterrupt.
"""
timer = TimeoutClock(timeout)
start = time.time()
with self.synchronized:
# As long as there are no Myo's connected, wait until we
# get notified about a change.
while not timer.exceeded:
# Check if we found a Myo that is connected.
for myo in six.itervalues(self._myos):
if myo.connected:
return myo
remaining = timer.remaining
if interval is not None and remaining > interval:
remaining = interval
self.synchronized.wait(remaining)
return None
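`wait_for_single_device` sleeps on a condition in short intervals so the main thread stays interruptible (e.g. by KeyboardInterrupt). A stripped-down sketch of that wait loop using only the stdlib, with `TimeoutClock` replaced by a plain deadline and hypothetical state names:

```python
import threading
import time


def wait_for(predicate, cond, timeout, interval=0.05):
    # Wait until predicate() is true under cond, sleeping at most
    # `interval` seconds at a time, as wait_for_single_device does.
    deadline = time.time() + timeout
    with cond:
        while True:
            if predicate():
                return True
            remaining = deadline - time.time()
            if remaining <= 0:
                return False
            cond.wait(min(remaining, interval))


cond = threading.Condition()
state = {"connected": False}


def connect_later():
    # Simulates a device connecting shortly after we start waiting.
    time.sleep(0.1)
    with cond:
        state["connected"] = True
        cond.notify_all()


threading.Thread(target=connect_later).start()
found = wait_for(lambda: state["connected"], cond, timeout=2.0)
```

Capping each `wait()` at `interval` means the loop wakes periodically even if no `notify_all()` ever arrives, which is exactly why the original can be interrupted cleanly.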
# DeviceListener
def on_event(self, kind, event):
myo = event.myo
timestamp = event.timestamp
with self.synchronized:
if kind == EventType.paired:
fmw_version = event.firmware_version
self._myos[myo.value] = self.MyoProxy(myo, timestamp, fmw_version)
self.synchronized.notify_all()
return True
elif kind == EventType.unpaired:
try: proxy = self._myos.pop(myo.value)
except KeyError:
message = "Myo 0x{0:x} was not in the known Myo's list"
warnings.warn(message.format(myo.value), RuntimeWarning)
else:
# Remove the reference handle from the Myo proxy.
with proxy.synchronized:
proxy._unpair_time = timestamp
proxy._myo = None
finally:
self.synchronized.notify_all()
return True
else:
try: proxy = self._myos[myo.value]
except KeyError:
message = "Myo 0x{0:x} was not in the known Myo's list"
warnings.warn(message.format(myo.value), RuntimeWarning)
return True
with proxy.synchronized:
if kind == EventType.connected:
proxy._connect_time = timestamp
elif kind == EventType.disconnected:
proxy._disconnect_time = timestamp
elif kind == EventType.emg:
proxy._emg = event.emg
elif kind == EventType.arm_synced:
proxy._arm = event.arm
proxy._xdir = event.x_direction
elif kind == EventType.rssi:
proxy._rssi = event.rssi
elif kind == EventType.pose:
proxy._pose = event.pose
elif kind == EventType.orientation:
proxy._orientation = event.orientation
proxy._gyroscope = event.gyroscope
proxy._acceleration = event.acceleration
| 32.325069 | 82 | 0.587183 | 1,317 | 11,734 | 5.081245 | 0.22779 | 0.059773 | 0.059773 | 0.050508 | 0.204573 | 0.191721 | 0.106246 | 0.06306 | 0.046623 | 0.031978 | 0 | 0.003357 | 0.339952 | 11,734 | 362 | 83 | 32.414365 | 0.860684 | 0.260695 | 0 | 0.349057 | 0 | 0 | 0.03748 | 0.018861 | 0 | 0 | 0 | 0 | 0.023585 | 1 | 0.198113 | false | 0.066038 | 0.042453 | 0.014151 | 0.358491 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c3806b9e128d8474be2a0c8c16ed645a6cd61414 | 333 | py | Python | utilities/poisson.py | lukepinkel/pylmm | b9e896222f077b000f9a752be77cfc9e60b49f19 | [
"MIT"
] | null | null | null | utilities/poisson.py | lukepinkel/pylmm | b9e896222f077b000f9a752be77cfc9e60b49f19 | [
"MIT"
] | null | null | null | utilities/poisson.py | lukepinkel/pylmm | b9e896222f077b000f9a752be77cfc9e60b49f19 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Wed Aug 12 13:34:49 2020
@author: lukepinkel
"""
import numpy as np
import scipy as sp
import scipy.special
def poisson_logp(x, mu, logp=True):
p = sp.special.xlogy(x, mu) - sp.special.gammaln(x + 1) - mu
if logp==False:
p = np.exp(p)
return p
| 19.588235 | 65 | 0.618619 | 57 | 333 | 3.596491 | 0.684211 | 0.107317 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058366 | 0.228228 | 333 | 17 | 66 | 19.588235 | 0.7393 | 0.3003 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
5ed9ef5b5cccf956209757de81563a4bc4e12b59 | 43,492 | py | Python | oscar/apps/offer/models.py | endgame/django-oscar | e5d78436e20b55902537a6cc82edf4e22568f9d6 | [
"BSD-3-Clause"
] | null | null | null | oscar/apps/offer/models.py | endgame/django-oscar | e5d78436e20b55902537a6cc82edf4e22568f9d6 | [
"BSD-3-Clause"
] | null | null | null | oscar/apps/offer/models.py | endgame/django-oscar | e5d78436e20b55902537a6cc82edf4e22568f9d6 | [
"BSD-3-Clause"
] | 1 | 2019-07-10T06:32:14.000Z | 2019-07-10T06:32:14.000Z | from decimal import Decimal as D, ROUND_DOWN, ROUND_UP
import math
import datetime
from django.core import exceptions
from django.template.defaultfilters import slugify
from django.db import models
from django.utils.translation import ungettext, ugettext as _
from django.utils.importlib import import_module
from django.core.exceptions import ValidationError
from django.core.urlresolvers import reverse
from django.conf import settings
from oscar.apps.offer.managers import ActiveOfferManager
from oscar.templatetags.currency_filters import currency
from oscar.models.fields import PositiveDecimalField, ExtendedURLField
def load_proxy(proxy_class):
module, classname = proxy_class.rsplit('.', 1)
try:
mod = import_module(module)
except ImportError, e:
raise exceptions.ImproperlyConfigured(
"Error importing module %s: %s" % (module, e))
try:
return getattr(mod, classname)
except AttributeError:
raise exceptions.ImproperlyConfigured(
"Module %s does not define a %s" % (module, classname))
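`load_proxy` resolves a dotted path such as `'myapp.conditions.CustomCondition'` into a class object. The same pattern, demonstrated against a module that certainly exists (`collections`) rather than a hypothetical offer app:

```python
from importlib import import_module


def load_class(dotted_path):
    # Split "package.module.ClassName" and resolve it, as load_proxy does.
    module_path, class_name = dotted_path.rsplit('.', 1)
    module = import_module(module_path)
    return getattr(module, class_name)


OrderedDict = load_class('collections.OrderedDict')
d = OrderedDict([('a', 1)])
```

`rsplit('.', 1)` splits on the last dot only, so arbitrarily deep package paths still resolve correctly.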
class ConditionalOffer(models.Model):
"""
A conditional offer (eg buy 1, get 10% off)
"""
name = models.CharField(
_("Name"), max_length=128, unique=True,
help_text=_("This is displayed within the customer's basket"))
slug = models.SlugField(_("Slug"), max_length=128, unique=True, null=True)
description = models.TextField(_("Description"), blank=True, null=True)
# Offers come in a few different types:
# (a) Offers that are available to all customers on the site. Eg a
# 3-for-2 offer.
# (b) Offers that are linked to a voucher, and only become available once
# that voucher has been applied to the basket
# (c) Offers that are linked to a user. Eg, all students get 10% off. The
# code to apply this offer needs to be coded
# (d) Session offers - these are temporarily available to a user after some
# trigger event. Eg, users coming from some affiliate site get 10% off.
SITE, VOUCHER, USER, SESSION = ("Site", "Voucher", "User", "Session")
TYPE_CHOICES = (
(SITE, _("Site offer - available to all users")),
(VOUCHER, _("Voucher offer - only available after entering the appropriate voucher code")),
(USER, _("User offer - available to certain types of user")),
(SESSION, _("Session offer - temporary offer, available for a user for the duration of their session")),
)
offer_type = models.CharField(_("Type"), choices=TYPE_CHOICES, default=SITE, max_length=128)
condition = models.ForeignKey('offer.Condition', verbose_name=_("Condition"))
benefit = models.ForeignKey('offer.Benefit', verbose_name=_("Benefit"))
# Some complicated situations require offers to be applied in a set order.
priority = models.IntegerField(_("Priority"), default=0,
help_text=_("The highest priority offers are applied first"))
# AVAILABILITY
# Range of availability. Note that if this is a voucher offer, then these
# dates are ignored and only the dates from the voucher are used to
# determine availability.
start_date = models.DateField(_("Start Date"), blank=True, null=True)
end_date = models.DateField(
_("End Date"), blank=True, null=True,
help_text=_("Offers are not active on their end date, only "
"the days preceding"))
# Use this field to limit the number of times this offer can be applied in
# total. Note that a single order can apply an offer multiple times so
# this is not the same as the number of orders that can use it.
max_global_applications = models.PositiveIntegerField(
_("Max global applications"),
help_text=_("The number of times this offer can be used before it "
"is unavailable"), blank=True, null=True)
# Use this field to limit the number of times this offer can be used by a
# single user. This only works for signed-in users - it doesn't really
# make sense for sites that allow anonymous checkout.
max_user_applications = models.PositiveIntegerField(
_("Max user applications"),
help_text=_("The number of times a single user can use this offer"),
blank=True, null=True)
# Use this field to limit the number of times this offer can be applied to
# a basket (and hence a single order).
max_basket_applications = models.PositiveIntegerField(
blank=True, null=True,
help_text=_("The number of times this offer can be applied to a "
"basket (and order)"))
# Use this field to limit the amount of discount an offer can lead to.
# This can be helpful with budgeting.
max_discount = models.DecimalField(
_("Max discount"), decimal_places=2, max_digits=12, null=True,
blank=True,
help_text=_("When an offer has given more discount to orders "
"than this threshold, then the offer becomes "
"unavailable"))
# TRACKING
total_discount = models.DecimalField(
_("Total Discount"), decimal_places=2, max_digits=12,
default=D('0.00'))
num_applications = models.PositiveIntegerField(
_("Number of applications"), default=0)
num_orders = models.PositiveIntegerField(
_("Number of Orders"), default=0)
redirect_url = ExtendedURLField(_("URL redirect (optional)"), blank=True)
date_created = models.DateTimeField(_("Date Created"), auto_now_add=True)
objects = models.Manager()
active = ActiveOfferManager()
# We need to track the voucher that this offer came from (if it is a
# voucher offer)
_voucher = None
class Meta:
ordering = ['-priority']
verbose_name = _("Conditional Offer")
verbose_name_plural = _("Conditional Offers")
def save(self, *args, **kwargs):
if not self.slug:
self.slug = slugify(self.name)
return super(ConditionalOffer, self).save(*args, **kwargs)
def get_absolute_url(self):
return reverse('offer:detail', kwargs={'slug': self.slug})
def __unicode__(self):
return self.name
def clean(self):
if self.start_date and self.end_date and self.start_date > self.end_date:
raise exceptions.ValidationError(_('End date should be later than start date'))
def is_active(self, test_date=None):
"""
Test whether this offer is active and can be used by customers
"""
if test_date is None:
test_date = datetime.date.today()
predicates = [self.get_max_applications() > 0]
if self.start_date:
predicates.append(self.start_date <= test_date)
if self.end_date:
predicates.append(test_date < self.end_date)
if self.max_discount:
predicates.append(self.total_discount < self.max_discount)
return all(predicates)
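`is_active` collects independent availability checks into a list and ANDs them with `all()`, which keeps each rule optional. The shape of that check in isolation, with plain `date` objects and hypothetical offer values standing in for model fields:

```python
import datetime


def is_active(start_date, end_date, remaining_applications,
              total_discount, max_discount, test_date=None):
    # Same predicate-list pattern as ConditionalOffer.is_active above.
    if test_date is None:
        test_date = datetime.date.today()
    predicates = [remaining_applications > 0]
    if start_date:
        predicates.append(start_date <= test_date)
    if end_date:
        predicates.append(test_date < end_date)  # end date itself excluded
    if max_discount:
        predicates.append(total_discount < max_discount)
    return all(predicates)


active = is_active(datetime.date(2024, 1, 1), datetime.date(2024, 2, 1),
                   remaining_applications=5, total_discount=10,
                   max_discount=100, test_date=datetime.date(2024, 1, 15))
```

Because unset limits simply never enter the list, an offer with no dates and no discount cap reduces to the single "applications remaining" check.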
def is_condition_satisfied(self, basket):
return self._proxy_condition().is_satisfied(basket)
def is_condition_partially_satisfied(self, basket):
return self._proxy_condition().is_partially_satisfied(basket)
def get_upsell_message(self, basket):
return self._proxy_condition().get_upsell_message(basket)
def apply_benefit(self, basket):
"""
Applies the benefit to the given basket and returns the discount.
"""
if not self.is_condition_satisfied(basket):
return D('0.00')
return self._proxy_benefit().apply(basket, self._proxy_condition(),
self)
def set_voucher(self, voucher):
self._voucher = voucher
def get_voucher(self):
return self._voucher
def get_max_applications(self, user=None):
"""
Return the number of times this offer can be applied to a basket
"""
limits = [10000]
if self.max_user_applications and user:
limits.append(max(0, self.max_user_applications -
self.get_num_user_applications(user)))
if self.max_basket_applications:
limits.append(self.max_basket_applications)
if self.max_global_applications:
limits.append(
max(0, self.max_global_applications - self.num_applications))
return min(limits)

    def get_num_user_applications(self, user):
        OrderDiscount = models.get_model('order', 'OrderDiscount')
        aggregates = OrderDiscount.objects.filter(
            offer_id=self.id, order__user=user).aggregate(
                total=models.Sum('frequency'))
        return aggregates['total'] if aggregates['total'] is not None else 0

    def shipping_discount(self, charge):
        return self._proxy_benefit().shipping_discount(charge)

    def _proxy_condition(self):
        """
        Returns the appropriate proxy model for the condition
        """
        field_dict = dict(self.condition.__dict__)
        for field in field_dict.keys():
            if field.startswith('_'):
                del field_dict[field]
        if self.condition.proxy_class:
            klass = load_proxy(self.condition.proxy_class)
            return klass(**field_dict)
        klassmap = {
            self.condition.COUNT: CountCondition,
            self.condition.VALUE: ValueCondition,
            self.condition.COVERAGE: CoverageCondition}
        if self.condition.type in klassmap:
            return klassmap[self.condition.type](**field_dict)
        return self.condition

    def _proxy_benefit(self):
        """
        Returns the appropriate proxy model for the benefit
        """
        field_dict = dict(self.benefit.__dict__)
        for field in field_dict.keys():
            if field.startswith('_'):
                del field_dict[field]
        klassmap = {
            self.benefit.PERCENTAGE: PercentageDiscountBenefit,
            self.benefit.FIXED: AbsoluteDiscountBenefit,
            self.benefit.MULTIBUY: MultibuyDiscountBenefit,
            self.benefit.FIXED_PRICE: FixedPriceBenefit,
            self.benefit.SHIPPING_ABSOLUTE: ShippingAbsoluteDiscountBenefit,
            self.benefit.SHIPPING_FIXED_PRICE: ShippingFixedPriceBenefit,
            self.benefit.SHIPPING_PERCENTAGE: ShippingPercentageDiscountBenefit}
        if self.benefit.type in klassmap:
            return klassmap[self.benefit.type](**field_dict)
        return self.benefit

    def record_usage(self, discount):
        self.num_applications += discount['freq']
        self.total_discount += discount['discount']
        self.num_orders += 1
        self.save()
    record_usage.alters_data = True

    def availability_description(self):
        """
        Return a description of when this offer is available
        """
        sentences = []
        if self.max_global_applications:
            desc = _(
                "Can be used %(total)d times "
                "(%(remainder)d remaining)") % {
                    'total': self.max_global_applications,
                    'remainder': self.max_global_applications - self.num_applications}
            sentences.append(desc)
        if self.max_user_applications:
            if self.max_user_applications == 1:
                desc = _("Can be used once per user")
            else:
                desc = _(
                    "Can be used %(total)d times per user") % {
                        'total': self.max_user_applications}
            sentences.append(desc)
        if self.max_basket_applications:
            # Test the basket limit here (not the per-user limit)
            if self.max_basket_applications == 1:
                desc = _("Can be used once per basket")
            else:
                desc = _(
                    "Can be used %(total)d times per basket") % {
                        'total': self.max_basket_applications}
            sentences.append(desc)
        if self.start_date and self.end_date:
            desc = _("Available between %(start)s and %(end)s") % {
                'start': self.start_date,
                'end': self.end_date}
            sentences.append(desc)
        elif self.start_date:
            sentences.append(_("Available from %(start)s") % {
                'start': self.start_date})
        elif self.end_date:
            sentences.append(_("Available until %(end)s") % {
                'end': self.end_date})
        if self.max_discount:
            sentences.append(_("Available until a discount of %(max)s "
                               "has been awarded") % {
                                   'max': currency(self.max_discount)})
        return "<br/>".join(sentences)


class Condition(models.Model):
    COUNT, VALUE, COVERAGE = ("Count", "Value", "Coverage")
    TYPE_CHOICES = (
        (COUNT, _("Depends on number of items in basket that are in "
                  "condition range")),
        (VALUE, _("Depends on value of items in basket that are in "
                  "condition range")),
        (COVERAGE, _("Needs to contain a set number of DISTINCT items "
                     "from the condition range")))
    range = models.ForeignKey(
        'offer.Range', verbose_name=_("Range"), null=True, blank=True)
    type = models.CharField(_('Type'), max_length=128, choices=TYPE_CHOICES,
                            null=True, blank=True)
    value = PositiveDecimalField(_('Value'), decimal_places=2, max_digits=12,
                                 null=True, blank=True)
    proxy_class = models.CharField(_("Custom class"), null=True, blank=True,
                                   max_length=255, unique=True, default=None)

    class Meta:
        verbose_name = _("Condition")
        verbose_name_plural = _("Conditions")

    def __unicode__(self):
        if self.proxy_class:
            return load_proxy(self.proxy_class).name
        if self.type == self.COUNT:
            return _("Basket includes %(count)d item(s) from %(range)s") % {
                'count': self.value, 'range': unicode(self.range).lower()}
        elif self.type == self.COVERAGE:
            return _("Basket includes %(count)d distinct products from "
                     "%(range)s") % {
                'count': self.value, 'range': unicode(self.range).lower()}
        return _("Basket includes %(amount)s from %(range)s") % {
            'amount': currency(self.value),
            'range': unicode(self.range).lower()}
    description = __unicode__

    def consume_items(self, basket, affected_lines):
        pass

    def is_satisfied(self, basket):
        """
        Determines whether a given basket meets this condition. This is
        stubbed in this top-class object. The subclassing proxies are
        responsible for implementing it correctly.
        """
        return False

    def is_partially_satisfied(self, basket):
        """
        Determine if the basket partially meets the condition. This is useful
        for up-selling messages to entice customers to buy something more in
        order to qualify for an offer.
        """
        return False

    def get_upsell_message(self, basket):
        return None

    def can_apply_condition(self, product):
        """
        Determines whether the condition can be applied to a given product
        """
        return (self.range.contains_product(product)
                and product.is_discountable and product.has_stockrecord)

    def get_applicable_lines(self, basket, most_expensive_first=True):
        """
        Return line data for the lines that can be consumed by this condition
        """
        line_tuples = []
        for line in basket.all_lines():
            product = line.product
            if not self.can_apply_condition(product):
                continue
            price = line.unit_price_incl_tax
            if not price:
                continue
            line_tuples.append((price, line))
        if most_expensive_first:
            return sorted(line_tuples, reverse=True)
        return sorted(line_tuples)


class Benefit(models.Model):
    range = models.ForeignKey(
        'offer.Range', null=True, blank=True, verbose_name=_("Range"))

    # Benefit types
    PERCENTAGE, FIXED, MULTIBUY, FIXED_PRICE = (
        "Percentage", "Absolute", "Multibuy", "Fixed price")
    SHIPPING_PERCENTAGE, SHIPPING_ABSOLUTE, SHIPPING_FIXED_PRICE = (
        'Shipping percentage', 'Shipping absolute', 'Shipping fixed price')
    TYPE_CHOICES = (
        (PERCENTAGE, _("Discount is a % of the product's value")),
        (FIXED, _("Discount is a fixed amount off the product's value")),
        (MULTIBUY, _("Discount is to give the cheapest product for free")),
        (FIXED_PRICE, _("Get the products that meet the condition for a "
                        "fixed price")),
        (SHIPPING_ABSOLUTE, _("Discount is a fixed amount off the shipping "
                              "cost")),
        (SHIPPING_FIXED_PRICE, _("Get shipping for a fixed price")),
        (SHIPPING_PERCENTAGE, _("Discount is a % off the shipping cost")),
    )
    type = models.CharField(_("Type"), max_length=128, choices=TYPE_CHOICES)
    value = PositiveDecimalField(_("Value"), decimal_places=2, max_digits=12,
                                 null=True, blank=True)

    # If this is not set, then there is no upper limit on how many products
    # can be discounted by this benefit.
    max_affected_items = models.PositiveIntegerField(
        _("Max Affected Items"), blank=True, null=True,
        help_text=_("Set this to prevent the discount consuming all items "
                    "within the range that are in the basket."))

    class Meta:
        verbose_name = _("Benefit")
        verbose_name_plural = _("Benefits")

    def __unicode__(self):
        if self.type == self.PERCENTAGE:
            desc = _("%(value)s%% discount on %(range)s") % {
                'value': self.value,
                'range': unicode(self.range).lower()}
        elif self.type == self.MULTIBUY:
            desc = _("Cheapest product is free from %s") % (
                unicode(self.range).lower(),)
        elif self.type == self.FIXED_PRICE:
            desc = _("The products that meet the condition are "
                     "sold for %(amount)s") % {
                         'amount': currency(self.value)}
        elif self.type == self.SHIPPING_PERCENTAGE:
            desc = _("%(value)s%% off shipping cost") % {
                'value': self.value}
        elif self.type == self.SHIPPING_ABSOLUTE:
            desc = _("%(amount)s off shipping cost") % {
                'amount': currency(self.value)}
        elif self.type == self.SHIPPING_FIXED_PRICE:
            desc = _("Get shipping for %(amount)s") % {
                'amount': currency(self.value)}
        else:
            desc = _("%(amount)s discount on %(range)s") % {
                'amount': currency(self.value),
                'range': unicode(self.range).lower()}
        if self.max_affected_items:
            desc += ungettext(" (max %d item)", " (max %d items)",
                              self.max_affected_items) % self.max_affected_items
        return desc
    description = __unicode__

    def apply(self, basket, condition, offer=None):
        return D('0.00')

    def clean(self):
        if not self.type:
            # The guard tests the type field, so say so in the message
            raise ValidationError(_("Benefit requires a type"))
        method_name = 'clean_%s' % self.type.lower().replace(' ', '_')
        if hasattr(self, method_name):
            getattr(self, method_name)()

    def clean_multibuy(self):
        if not self.range:
            raise ValidationError(
                _("Multibuy benefits require a product range"))
        if self.value:
            raise ValidationError(
                _("Multibuy benefits don't require a value"))
        if self.max_affected_items:
            raise ValidationError(
                _("Multibuy benefits don't require a 'max affected items' "
                  "attribute"))

    def clean_percentage(self):
        if not self.range:
            raise ValidationError(
                _("Percentage benefits require a product range"))
        if self.value > 100:
            raise ValidationError(
                _("Percentage discount cannot be greater than 100"))

    def clean_shipping_absolute(self):
        if not self.value:
            raise ValidationError(
                _("A discount value is required"))
        if self.range:
            raise ValidationError(
                _("No range should be selected as this benefit does not "
                  "apply to products"))
        if self.max_affected_items:
            raise ValidationError(
                _("Shipping discounts don't require a 'max affected items' "
                  "attribute"))

    def clean_shipping_percentage(self):
        if self.value > 100:
            raise ValidationError(
                _("Percentage discount cannot be greater than 100"))
        if self.range:
            raise ValidationError(
                _("No range should be selected as this benefit does not "
                  "apply to products"))
        if self.max_affected_items:
            raise ValidationError(
                _("Shipping discounts don't require a 'max affected items' "
                  "attribute"))

    def clean_shipping_fixed_price(self):
        if self.range:
            raise ValidationError(
                _("No range should be selected as this benefit does not "
                  "apply to products"))
        if self.max_affected_items:
            raise ValidationError(
                _("Shipping discounts don't require a 'max affected items' "
                  "attribute"))

    def clean_fixed_price(self):
        if self.range:
            raise ValidationError(
                _("No range should be selected as the condition range will "
                  "be used instead."))

    def clean_absolute(self):
        if not self.range:
            raise ValidationError(
                _("Absolute benefits require a product range"))

    def round(self, amount):
        """
        Apply rounding to discount amount
        """
        if hasattr(settings, 'OSCAR_OFFER_ROUNDING_FUNCTION'):
            return settings.OSCAR_OFFER_ROUNDING_FUNCTION(amount)
        return amount.quantize(D('.01'), ROUND_DOWN)

    def _effective_max_affected_items(self):
        """
        Return the maximum number of items that can have a discount applied
        during the application of this benefit
        """
        return self.max_affected_items if self.max_affected_items else 10000

    def can_apply_benefit(self, product):
        """
        Determines whether the benefit can be applied to a given product
        """
        return product.has_stockrecord and product.is_discountable

    def get_applicable_lines(self, basket, range=None):
        """
        Return the basket lines that are available to be discounted

        :basket: The basket
        :range: The range of products to use for filtering. The fixed-price
                benefit ignores its range and uses the condition range
        """
        if range is None:
            range = self.range
        line_tuples = []
        for line in basket.all_lines():
            product = line.product
            if (not range.contains(product) or
                    not self.can_apply_benefit(product)):
                continue
            price = line.unit_price_incl_tax
            if not price:
                # Avoid zero price products
                continue
            if line.quantity_without_discount == 0:
                continue
            line_tuples.append((price, line))
        # We sort lines to be cheapest first to ensure consistent applications
        return sorted(line_tuples)

    def shipping_discount(self, charge):
        return D('0.00')


class Range(models.Model):
    """
    Represents a range of products that can be used within an offer
    """
    name = models.CharField(_("Name"), max_length=128, unique=True)
    includes_all_products = models.BooleanField(
        _('Includes All Products'), default=False)
    included_products = models.ManyToManyField(
        'catalogue.Product', related_name='includes', blank=True,
        verbose_name=_("Included Products"))
    excluded_products = models.ManyToManyField(
        'catalogue.Product', related_name='excludes', blank=True,
        verbose_name=_("Excluded Products"))
    classes = models.ManyToManyField(
        'catalogue.ProductClass', related_name='classes', blank=True,
        verbose_name=_("Product Classes"))
    included_categories = models.ManyToManyField(
        'catalogue.Category', related_name='includes', blank=True,
        verbose_name=_("Included Categories"))
    # Allow a custom range instance to be specified
    proxy_class = models.CharField(_("Custom class"), null=True, blank=True,
                                   max_length=255, default=None, unique=True)
    date_created = models.DateTimeField(_("Date Created"), auto_now_add=True)

    __included_product_ids = None
    __excluded_product_ids = None
    __class_ids = None

    class Meta:
        verbose_name = _("Range")
        verbose_name_plural = _("Ranges")

    def __unicode__(self):
        return self.name

    def contains_product(self, product):
        """
        Check whether the passed product is part of this range
        """
        # We look for shortcircuit checks first before
        # the tests that require more database queries.
        if settings.OSCAR_OFFER_BLACKLIST_PRODUCT and \
                settings.OSCAR_OFFER_BLACKLIST_PRODUCT(product):
            return False
        # Delegate to a proxy class if one is provided
        if self.proxy_class:
            return load_proxy(self.proxy_class)().contains_product(product)
        excluded_product_ids = self._excluded_product_ids()
        if product.id in excluded_product_ids:
            return False
        if self.includes_all_products:
            return True
        if product.product_class_id in self._class_ids():
            return True
        included_product_ids = self._included_product_ids()
        if product.id in included_product_ids:
            return True
        test_categories = self.included_categories.all()
        if test_categories:
            for category in product.categories.all():
                for test_category in test_categories:
                    if (category == test_category or
                            category.is_descendant_of(test_category)):
                        return True
        return False
    # Shorter alias
    contains = contains_product

    def _included_product_ids(self):
        if self.__included_product_ids is None:
            self.__included_product_ids = [
                row['id'] for row in self.included_products.values('id')]
        return self.__included_product_ids

    def _excluded_product_ids(self):
        if not self.id:
            return []
        if self.__excluded_product_ids is None:
            self.__excluded_product_ids = [
                row['id'] for row in self.excluded_products.values('id')]
        return self.__excluded_product_ids

    def _class_ids(self):
        if self.__class_ids is None:
            self.__class_ids = [row['id'] for row in self.classes.values('id')]
        return self.__class_ids

    def num_products(self):
        if self.includes_all_products:
            return None
        return self.included_products.all().count()

    @property
    def is_editable(self):
        """
        Test whether this range can be edited in the dashboard
        """
        return self.proxy_class is None


# ==========
# Conditions
# ==========

class CountCondition(Condition):
    """
    An offer condition dependent on the NUMBER of matching items from the
    basket.
    """

    class Meta:
        proxy = True
        verbose_name = _("Count Condition")
        verbose_name_plural = _("Count Conditions")

    def is_satisfied(self, basket):
        """
        Determines whether a given basket meets this condition
        """
        num_matches = 0
        for line in basket.all_lines():
            if (self.can_apply_condition(line.product)
                    and line.quantity_without_discount > 0):
                num_matches += line.quantity_without_discount
            if num_matches >= self.value:
                return True
        return False

    def _get_num_matches(self, basket):
        if hasattr(self, '_num_matches'):
            return getattr(self, '_num_matches')
        num_matches = 0
        for line in basket.all_lines():
            if (self.can_apply_condition(line.product)
                    and line.quantity_without_discount > 0):
                num_matches += line.quantity_without_discount
        self._num_matches = num_matches
        return num_matches

    def is_partially_satisfied(self, basket):
        num_matches = self._get_num_matches(basket)
        return 0 < num_matches < self.value

    def get_upsell_message(self, basket):
        num_matches = self._get_num_matches(basket)
        delta = self.value - num_matches
        return ungettext('Buy %(delta)d more product from %(range)s',
                         'Buy %(delta)d more products from %(range)s',
                         delta) % {'delta': delta, 'range': self.range}

    def consume_items(self, basket, affected_lines):
        """
        Marks items within the basket lines as consumed so they
        can't be reused in other offers.

        :basket: The basket
        :affected_lines: The lines that have been affected by the discount.
                         This should be list of tuples (line, discount, qty)
        """
        # We need to count how many items have already been consumed as part of
        # applying the benefit, so we don't consume too many items.
        num_consumed = 0
        for line, __, quantity in affected_lines:
            num_consumed += quantity
        to_consume = max(0, self.value - num_consumed)
        if to_consume == 0:
            return
        for __, line in self.get_applicable_lines(basket,
                                                  most_expensive_first=True):
            quantity_to_consume = min(line.quantity_without_discount,
                                      to_consume)
            line.consume(quantity_to_consume)
            to_consume -= quantity_to_consume
            if to_consume == 0:
                break


class CoverageCondition(Condition):
    """
    An offer condition dependent on the number of DISTINCT matching items
    from the basket.
    """

    class Meta:
        proxy = True
        verbose_name = _("Coverage Condition")
        verbose_name_plural = _("Coverage Conditions")

    def is_satisfied(self, basket):
        """
        Determines whether a given basket meets this condition
        """
        covered_ids = []
        for line in basket.all_lines():
            if not line.is_available_for_discount:
                continue
            product = line.product
            if (self.can_apply_condition(product)
                    and product.id not in covered_ids):
                covered_ids.append(product.id)
            if len(covered_ids) >= self.value:
                return True
        return False

    def _get_num_covered_products(self, basket):
        covered_ids = []
        for line in basket.all_lines():
            if not line.is_available_for_discount:
                continue
            product = line.product
            if (self.can_apply_condition(product)
                    and product.id not in covered_ids):
                covered_ids.append(product.id)
        return len(covered_ids)

    def get_upsell_message(self, basket):
        delta = self.value - self._get_num_covered_products(basket)
        return ungettext('Buy %(delta)d more product from %(range)s',
                         'Buy %(delta)d more products from %(range)s',
                         delta) % {'delta': delta, 'range': self.range}

    def is_partially_satisfied(self, basket):
        return 0 < self._get_num_covered_products(basket) < self.value

    def consume_items(self, basket, affected_lines):
        """
        Marks items within the basket lines as consumed so they
        can't be reused in other offers.
        """
        # Determine products that have already been consumed by applying the
        # benefit
        consumed_products = []
        for line, __, quantity in affected_lines:
            consumed_products.append(line.product)
        to_consume = max(0, self.value - len(consumed_products))
        if to_consume == 0:
            return
        for line in basket.all_lines():
            product = line.product
            if not self.can_apply_condition(product):
                continue
            if product in consumed_products:
                continue
            if not line.is_available_for_discount:
                continue
            # Only consume a quantity of 1 from each line
            line.consume(1)
            consumed_products.append(product)
            to_consume -= 1
            if to_consume == 0:
                break

    def get_value_of_satisfying_items(self, basket):
        covered_ids = []
        value = D('0.00')
        for line in basket.all_lines():
            if (self.can_apply_condition(line.product)
                    and line.product.id not in covered_ids):
                covered_ids.append(line.product.id)
                value += line.unit_price_incl_tax
            if len(covered_ids) >= self.value:
                return value
        return value


class ValueCondition(Condition):
    """
    An offer condition dependent on the VALUE of matching items from the
    basket.
    """

    class Meta:
        proxy = True
        verbose_name = _("Value Condition")
        verbose_name_plural = _("Value Conditions")

    def is_satisfied(self, basket):
        """
        Determine whether a given basket meets this condition
        """
        value_of_matches = D('0.00')
        for line in basket.all_lines():
            product = line.product
            if (self.can_apply_condition(product) and product.has_stockrecord
                    and line.quantity_without_discount > 0):
                price = line.unit_price_incl_tax
                value_of_matches += price * int(line.quantity_without_discount)
            if value_of_matches >= self.value:
                return True
        return False

    def _get_value_of_matches(self, basket):
        if hasattr(self, '_value_of_matches'):
            return getattr(self, '_value_of_matches')
        value_of_matches = D('0.00')
        for line in basket.all_lines():
            product = line.product
            if (self.can_apply_condition(product) and product.has_stockrecord
                    and line.quantity_without_discount > 0):
                price = line.unit_price_incl_tax
                value_of_matches += price * int(line.quantity_without_discount)
        self._value_of_matches = value_of_matches
        return value_of_matches

    def is_partially_satisfied(self, basket):
        value_of_matches = self._get_value_of_matches(basket)
        return D('0.00') < value_of_matches < self.value

    def get_upsell_message(self, basket):
        value_of_matches = self._get_value_of_matches(basket)
        return _('Spend %(value)s more from %(range)s') % {
            'value': currency(self.value - value_of_matches),
            'range': self.range}

    def consume_items(self, basket, affected_lines):
        """
        Marks items within the basket lines as consumed so they
        can't be reused in other offers.

        We allow lines to be passed in as sometimes we want them sorted
        in a specific order.
        """
        # Determine value of items already consumed as part of discount
        value_consumed = D('0.00')
        for line, __, qty in affected_lines:
            price = line.unit_price_incl_tax
            value_consumed += price * qty
        to_consume = max(0, self.value - value_consumed)
        if to_consume == 0:
            return
        for price, line in self.get_applicable_lines(
                basket, most_expensive_first=True):
            quantity_to_consume = min(
                line.quantity_without_discount,
                (to_consume / price).quantize(D(1), ROUND_UP))
            line.consume(quantity_to_consume)
            to_consume -= price * quantity_to_consume
            if to_consume == 0:
                break


# ========
# Benefits
# ========

class PercentageDiscountBenefit(Benefit):
    """
    An offer benefit that gives a percentage discount
    """

    class Meta:
        proxy = True
        verbose_name = _("Percentage discount benefit")
        verbose_name_plural = _("Percentage discount benefits")

    def apply(self, basket, condition, offer=None):
        line_tuples = self.get_applicable_lines(basket)
        discount = D('0.00')
        affected_items = 0
        max_affected_items = self._effective_max_affected_items()
        affected_lines = []
        for price, line in line_tuples:
            if affected_items >= max_affected_items:
                break
            quantity_affected = min(line.quantity_without_discount,
                                    max_affected_items - affected_items)
            line_discount = self.round(self.value / D('100.0') * price
                                       * int(quantity_affected))
            line.discount(line_discount, quantity_affected)
            affected_lines.append((line, line_discount, quantity_affected))
            affected_items += quantity_affected
            discount += line_discount
        if discount > 0:
            condition.consume_items(basket, affected_lines)
        return discount


class AbsoluteDiscountBenefit(Benefit):
    """
    An offer benefit that gives an absolute discount
    """

    class Meta:
        proxy = True
        verbose_name = _("Absolute discount benefit")
        verbose_name_plural = _("Absolute discount benefits")

    def apply(self, basket, condition, offer=None):
        line_tuples = self.get_applicable_lines(basket)
        if not line_tuples:
            return self.round(D('0.00'))
        discount = D('0.00')
        affected_items = 0
        max_affected_items = self._effective_max_affected_items()
        affected_lines = []
        for price, line in line_tuples:
            if affected_items >= max_affected_items:
                break
            remaining_discount = self.value - discount
            quantity_affected = min(
                line.quantity_without_discount,
                max_affected_items - affected_items,
                int(math.ceil(remaining_discount / price)))
            line_discount = self.round(min(remaining_discount,
                                           quantity_affected * price))
            line.discount(line_discount, quantity_affected)
            affected_lines.append((line, line_discount, quantity_affected))
            affected_items += quantity_affected
            discount += line_discount
        if discount > 0:
            condition.consume_items(basket, affected_lines)
        return discount


class FixedPriceBenefit(Benefit):
    """
    An offer benefit that gives the items in the condition for a
    fixed price.  This is useful for "bundle" offers.

    Note that we ignore the benefit range here and only give a fixed price
    for the products in the condition range.  The condition cannot be a value
    condition.

    We also ignore the max_affected_items setting.
    """

    class Meta:
        proxy = True
        verbose_name = _("Fixed price benefit")
        verbose_name_plural = _("Fixed price benefits")

    def apply(self, basket, condition, offer=None):
        if isinstance(condition, ValueCondition):
            return self.round(D('0.00'))
        line_tuples = self.get_applicable_lines(basket, range=condition.range)
        if not line_tuples:
            return self.round(D('0.00'))

        # Determine the lines to consume
        num_permitted = int(condition.value)
        num_affected = 0
        value_affected = D('0.00')
        covered_lines = []
        for price, line in line_tuples:
            if isinstance(condition, CoverageCondition):
                quantity_affected = 1
            else:
                quantity_affected = min(
                    line.quantity_without_discount,
                    num_permitted - num_affected)
            num_affected += quantity_affected
            value_affected += quantity_affected * price
            covered_lines.append((price, line, quantity_affected))
            if num_affected >= num_permitted:
                break
        discount = max(value_affected - self.value, D('0.00'))
        if not discount:
            return self.round(discount)

        # Apply discount to the affected lines
        discount_applied = D('0.00')
        # Index 1 of each covered_lines tuple is the line (index 0 is the
        # price), so compare against the line, not the price.
        last_line = covered_lines[-1][1]
        for price, line, quantity in covered_lines:
            if line == last_line:
                # If last line, we just take the difference to ensure that
                # rounding doesn't lead to an off-by-one error
                line_discount = discount - discount_applied
            else:
                line_discount = self.round(
                    discount * (price * quantity) / value_affected)
            line.discount(line_discount, quantity)
            discount_applied += line_discount
        return discount


class MultibuyDiscountBenefit(Benefit):

    class Meta:
        proxy = True
        verbose_name = _("Multibuy discount benefit")
        verbose_name_plural = _("Multibuy discount benefits")

    def apply(self, basket, condition, offer=None):
        line_tuples = self.get_applicable_lines(basket)
        if not line_tuples:
            return self.round(D('0.00'))
        # Cheapest line gives free product
        discount, line = line_tuples[0]
        line.discount(discount, 1)
        affected_lines = [(line, discount, 1)]
        condition.consume_items(basket, affected_lines)
        return discount


# =================
# Shipping benefits
# =================

class ShippingBenefit(Benefit):

    def apply(self, basket, condition, offer=None):
        # Attach offer to basket to indicate that it qualifies for a shipping
        # discount.  At this point, we only allow one shipping offer per
        # basket.
        basket.shipping_offer = offer
        condition.consume_items(basket, affected_lines=())
        return D('0.00')


class ShippingAbsoluteDiscountBenefit(ShippingBenefit):

    class Meta:
        proxy = True
        verbose_name = _("Shipping absolute discount benefit")
        verbose_name_plural = _("Shipping absolute discount benefits")

    def shipping_discount(self, charge):
        return min(charge, self.value)


class ShippingFixedPriceBenefit(ShippingBenefit):

    class Meta:
        proxy = True
        verbose_name = _("Fixed price shipping benefit")
        verbose_name_plural = _("Fixed price shipping benefits")

    def shipping_discount(self, charge):
        if charge < self.value:
            return D('0.00')
        return charge - self.value


class ShippingPercentageDiscountBenefit(ShippingBenefit):

    class Meta:
        proxy = True
        verbose_name = _("Shipping percentage discount benefit")
        verbose_name_plural = _("Shipping percentage discount benefits")

    def shipping_discount(self, charge):
        return charge * self.value / D('100.0')
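The default `Benefit.round` above truncates to two decimal places with `Decimal.quantize`. A standalone sketch of that rule, runnable outside Django (the `OSCAR_OFFER_ROUNDING_FUNCTION` settings hook is omitted here):

```python
from decimal import Decimal as D, ROUND_DOWN

def round_discount(amount):
    # Truncate (never round up) to two decimal places, mirroring
    # Benefit.round when no custom rounding function is configured.
    return amount.quantize(D('.01'), rounding=ROUND_DOWN)

print(round_discount(D('3.149')))  # 3.14
print(round_discount(D('10')))     # 10.00
```

Rounding down ensures the shop never gives away a fraction of a cent more than the computed discount.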
# scripts/core/soldier.py (whackashoe/entwinement, Unlicense)

d_soldiers = []
# `com` is the command string the host game engine flushes and executes;
# it is assumed here to start empty (methods below append to it via
# `global com`).
com = ''

class Soldier:
    def __init__(self, id, name, team):
        self.id = id
        self.name = name
        self.team = team
        self.x = 0
        self.y = 0
        self.xVelo = 0
        self.yVelo = 0
        self.kills = 0
        self.deaths = 0
        self.alive = 'true'
        self.driving = 'false'
        self.gun = 0
        self.ammo = 0
        self.reloading = 'false'

    def setPosition(self, x, y, xv, yv):
        self.x = x
        self.y = y
        self.xVelo = xv
        self.yVelo = yv

    def setName(self, name):
        self.name = name

    def setTeam(self, team):
        self.team = team

    def setGun(self, gun):
        self.gun = gun

    def setGunInfo(self, gun, ammo, reloading):
        self.gun = gun
        self.ammo = ammo
        self.reloading = reloading

    def die(self):
        self.alive = 'false'
        self.driving = 'false'
        self.deaths += 1

    def respawn(self):
        self.alive = 'true'

    def teleport(self, x, y):
        global com
        self.x = x
        self.y = y
        com += 'f_t s '+str(self.id)+' '+str(self.x)+' '+str(self.y)+';'

    def applyForce(self, xf, yf):
        global com
        com += 'f_af s '+str(self.id)+' '+str(xf)+' '+str(yf)+';'

    def setVelocity(self, xf, yf):
        global com
        self.xVelo = xf
        self.yVelo = yf
        com += 'f_v s '+str(self.id)+' '+str(self.xVelo)+' '+str(self.yVelo)+';'

    def changeTeam(self, team):
        global com
        self.team = team
        com += 's_ct '+str(self.id)+' '+str(self.team)+';'

    def changeGun(self, gun):
        global com
        self.gun = gun
        com += 's_cg '+str(self.id)+' '+str(self.gun)+';'

    def changeAttachment(self, type, amount):
        global com
        com += 's_ca '+str(self.id)+' '+str(type)+' '+str(amount)+';'

    def killSoldier(self):
        global com
        # Use the string flag and the instance id (was: bare `false` and `id`)
        self.alive = 'false'
        com += 's_ks '+str(self.id)+';'

    def respawnSoldier(self, spawn):
        global com
        com += 's_rs '+str(self.id)+' '+str(spawn)+';'

    def enterVehicle(self, vehicleId):
        global com
        com += 's_en '+str(self.id)+' '+str(vehicleId)+';'

    def exitVehicle(self):
        global com
        com += 's_ex '+str(self.id)+';'

    def addKill(self):
        global com
        self.kills += 1
        com += 's_ak '+str(self.id)+';'

    def addDeath(self):
        global com
        self.deaths += 1
        com += 's_ad '+str(self.id)+';'

    def dropGun(self):
        global com
        com += 's_dg '+str(self.id)+';'


def addSoldier(team):
    global com
    com += 'a s '+str(team)+';'

def getSoldier(n):
    global d_soldiers
    return d_soldiers[n]

def getSoldierById(id):
    global d_soldiers
    for n in xrange(len(d_soldiers)):
        s = d_soldiers[n]
        if s.id == id:
            return s

def getSoldiers():
    global d_soldiers
    return d_soldiers

def getSoldierCount():
    global d_soldiers
    return len(d_soldiers)

def getTeamKills(team):
    amount = 0
    for n in xrange(len(d_soldiers)):
        s = d_soldiers[n]
        if s.team == team:
            amount += s.kills
    return amount

def getTeamDeaths(team):
    amount = 0
    for n in xrange(len(d_soldiers)):
        s = d_soldiers[n]
        if s.team == team:
            amount += s.deaths
    return amount

def getTeamSize(team):
    amount = 0
    for n in xrange(len(d_soldiers)):
        s = d_soldiers[n]
        if s.team == team:
            amount += 1
    return amount
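The methods in this file all follow one pattern: append a semicolon-terminated engine command to a shared buffer that the host (assumed) flushes and executes each tick. A minimal, self-contained sketch of that protocol, with a hypothetical helper name:

```python
# Command-buffer sketch: commands accumulate in a module-level string that
# the game engine is assumed to read and clear.
com = ''

def teleport_command(soldier_id, x, y):
    # Same wire format as Soldier.teleport above: 'f_t s <id> <x> <y>;'
    global com
    com += 'f_t s ' + str(soldier_id) + ' ' + str(x) + ' ' + str(y) + ';'

teleport_command(3, 10, 20)
print(com)  # f_t s 3 10 20;
```

Batching commands into one string keeps the Python-to-engine boundary to a single handoff per frame instead of one call per action.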
# CursoEmVideo/Aula22/ex109/ex109.py (lucashsouza/Desafios-Python, MIT)

"""
Modify the functions created in challenge 107 so that they accept one
extra parameter, indicating whether the value they return should be
formatted by the moeda() function developed in challenge 108.
"""
from Aula22.ex109 import moeda
from Aula22.ex109.titulo import titulo
preco = float(input("Preço: R$"))
titulo('Informações Calculadas: ')
print(f"Metade: {moeda.metade(preco, True)}")
print(f"Dobro: {moeda.dobro(preco, True)}")
print(f"10% Acréscimo: {moeda.aumentar(preco, 10, True)}")
print(f"10% Desconto: {moeda.diminuir(preco, 10, True)}")
| 28.6 | 59 | 0.737762 | 87 | 572 | 4.850575 | 0.62069 | 0.056872 | 0.07109 | 0.07109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048193 | 0.129371 | 572 | 19 | 60 | 30.105263 | 0.799197 | 0.370629 | 0 | 0 | 0 | 0 | 0.558405 | 0.125356 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
5eeebe655d0529cd4e57b3684dd0b12853503ba1 | 442 | py | Python | greedy_algorithms/6_maximum_salary/largest_number.py | Desaiakshata/Algorithms-problems | 90f4e40ba05e4bdfc783614bb70b9156b05eec0b | [
"MIT"
] | null | null | null | greedy_algorithms/6_maximum_salary/largest_number.py | Desaiakshata/Algorithms-problems | 90f4e40ba05e4bdfc783614bb70b9156b05eec0b | [
"MIT"
] | null | null | null | greedy_algorithms/6_maximum_salary/largest_number.py | Desaiakshata/Algorithms-problems | 90f4e40ba05e4bdfc783614bb70b9156b05eec0b | [
"MIT"
] | null | null | null | #Uses python3
import sys
def largest_number(a):
#write your code here
res = ""
while len(a)!=0:
maxa = a[0]
for x in a:
if int(str(x)+str(maxa))>int(str(maxa)+str(x)):
maxa = x
res += str(maxa)
a.remove(str(maxa))
return res
if __name__ == '__main__':
#input = sys.stdin.read()
data = input().split(' ')
a = data[1:]
print(largest_number(a))
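The repeated-scan loop above runs in O(n²). The same pairwise concatenation test can be expressed as a comparator with `functools.cmp_to_key`, sorting in O(n log n). This is a sketch of that alternative, not part of the original exercise; `largest_number_fast` is an illustrative name:

```python
from functools import cmp_to_key

def largest_number_fast(digits):
    # Order x before y when the concatenation x+y forms the larger number;
    # the comparator returns negative when x should come first.
    cmp = lambda x, y: (int(y + x) > int(x + y)) - (int(y + x) < int(x + y))
    return "".join(sorted(digits, key=cmp_to_key(cmp)))

print(largest_number_fast(["21", "2"]))            # 221
print(largest_number_fast(["9", "4", "6", "1", "9"]))  # 99641
```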
| 19.217391 | 59 | 0.506787 | 63 | 442 | 3.396825 | 0.539683 | 0.130841 | 0.130841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013605 | 0.334842 | 442 | 22 | 60 | 20.090909 | 0.714286 | 0.126697 | 0 | 0 | 0 | 0 | 0.023499 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 1 | 0.066667 | false | 0 | 0.066667 | 0 | 0.2 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5ef260b5bf84eb695b2bd8138b23ebab7ec1405b | 4,779 | py | Python | cno/chrutils.py | CherokeeLanguage/cherokee-audio-data | a10b7b38c0c1b56338561c917cef18a078ca573c | [
"CC0-1.0",
"MIT"
] | 2 | 2021-09-15T19:41:01.000Z | 2022-01-12T17:57:08.000Z | cno/chrutils.py | CherokeeLanguage/cherokee-audio-data | a10b7b38c0c1b56338561c917cef18a078ca573c | [
"CC0-1.0",
"MIT"
] | 1 | 2021-10-08T18:06:29.000Z | 2021-10-08T18:48:44.000Z | cno/chrutils.py | CherokeeLanguage/cherokee-audio-data | a10b7b38c0c1b56338561c917cef18a078ca573c | [
"CC0-1.0",
"MIT"
] | null | null | null | #!/usr/bin/env python3
def test():
cedTest = ["U²sgal²sdi ạ²dv¹ne²³li⁴sgi.", "Ụ²wo²³dị³ge⁴ɂi gi²hli a¹ke²³he³²ga na ạ²chu⁴ja.",
"Ạ²ni²³tạɂ³li ạ²ni²sgạ²ya a¹ni²no²hạ²li²³do³²he, ạ²hwi du¹ni²hyọ²he.",
"Sa¹gwu⁴hno ạ²sgạ²ya gạ²lo¹gwe³ ga²ne²he sọ³ɂị³hnv³ hla².",
"Na³hnv³ gạ²lo¹gwe³ ga²ne⁴hi u²dlv²³kwsạ²ti ge¹se³, ạ²le go²hu⁴sdi yu²³dv³²ne⁴la a¹dlv²³kwsge³.",
"A¹na³ɂi²sv⁴hnv go²hu⁴sdi wu²³ni³go²he do²jụ²wạ³ɂị²hlv,",
"na³hnv³ gạ²lo¹gwe³ ga²ne⁴hi kị²lạ²gwu ị²yv⁴da wị²du²³sdạ³yo²hle³ o²³sdạ²gwu nu²³ksẹ²stạ²nv⁴na ị²yu³sdi da¹sdạ²yo²hị²hv⁴.",
"U²do²hị²yu⁴hnv³ wu²³yo³hle³ ạ²le u¹ni²go²he³ gạ²nv³gv⁴.",
"Na³hnv³ gạ²lo¹gwe³ nị²ga²³ne³hv⁴na \"ạ²hwi e¹ni²yo³ɂa!\" u¹dv²hne.",
"\"Ji²yo³ɂe³²ga\" u¹dv²hne na³ gạ²lo¹gwe³ ga²ne⁴hi, a¹dlv²³kwsgv³.",
"U¹na³ne²lu²³gi³²se do²jụ²wạ³ɂị²hlv³ di³dla, nạ²ɂv²³hnị³ge⁴hnv wu²³ni³luh²ja u¹ni²go²he³ so²³gwị³li gạɂ³nv⁴.",
"\"So²³gwị³lị³le³² i¹nạ²da²hị³si\" u¹dv²hne³ na³ u²yo²hlv⁴.", "\"Hạ²da²hị³se³²ga³\" a¹go¹se²³le³."]
for a in cedTest:
print("_______________");
print();
print(a);
print(ced2mco(a));
asciiCedText = ["ga.2da.2de3ga", "ha.2da.2du1ga", "u2da.2di23nv32di", "u1da.2di23nv32sv23?i", "a1da.2de3go3?i"]
for a in asciiCedText:
print("_______________");
print();
print(a);
print(ascii_ced2mco(a));
return
# Converts MCO annotation into pseudo English phonetics for use by the aeneas alignment package
# lines prefixed with '#' are returned with the '#' removed, but otherwise unchanged.
def mco2espeak(text: str):
import unicodedata as ud
import re
if (len(text.strip()) == 0):
return ""
# Handle specially flagged text
if (text[0].strip() == "#"):
if text[1] != "!":
return text.strip()[1:]
else:
text = text[2:]
newText = ud.normalize('NFD', text.strip()).lower()
if (newText[0] == ""):
newText = newText[1:]
# remove all tone indicators
newText = re.sub("[\u030C\u0302\u0300\u0301\u030b]", "", newText)
newText = "[[" + newText.strip() + "]]"
newText = newText.replace(" ", "]] [[")
newText = newText.replace("'", "]]'[[")
newText = newText.replace(".]]", "]].")
newText = newText.replace(",]]", "]],")
newText = newText.replace("!]]", "]]!")
newText = newText.replace("?]]", "]]?")
newText = newText.replace(":]]", "]]:")
newText = newText.replace(";]]", "]];")
newText = newText.replace("\"]]", "]]\"")
newText = newText.replace("']]", "]]'")
newText = newText.replace(" ]]", "]] ")
newText = newText.replace("[[ ", " [[")
newText = re.sub("(?i)([aeiouv]):", "\\1", newText)
# convert all vowels into approximate espeak x-sampa escaped forms
newText = newText.replace("A", "0")
newText = newText.replace("a", "0")
newText = newText.replace("v", "V")
newText = newText.replace("tl", "tl#")
newText = newText.replace("hl", "l#")
newText = newText.replace("J", "dZ")
newText = newText.replace("j", "dZ")
newText = newText.replace("Y", "j")
newText = newText.replace("y", "j")
newText = newText.replace("Ch", "tS")
newText = newText.replace("ch", "tS")
newText = newText.replace("ɂ", "?")
return newText
def ced2mco(text: str):
import unicodedata as ud
import re
tones2mco = [("²³", "\u030C"), ("³²", "\u0302"), ("¹", "\u0300"), ("²", ""), ("³", "\u0301"), ("⁴", "\u030b")]
text = ud.normalize('NFD', text)
text = re.sub("(?i)([aeiouv])([^¹²³⁴\u0323]+)", "\\1\u0323\\2", text)
text = re.sub("(?i)([aeiouv])([¹²³⁴]+)$", "\\1\u0323\\2", text)
text = re.sub("(?i)([aeiouv])([¹²³⁴]+)([^¹²³⁴a-zɂ])", "\\1\u0323\\2\\3", text)
text = re.sub("(?i)([^aeiouv\u0323¹²³⁴]+)([¹²³⁴]+)", "\\2\\1", text)
text = re.sub("(?i)([aeiouv])([¹²³⁴]+)", "\\1\\2:", text)
text = text.replace("\u0323", "")
text = re.sub("(?i)([aeiouv])²$", "\\1\u0304", text)
text = re.sub("(?i)([aeiouv])²([^a-zɂ¹²³⁴:])", "\\1\u0304\\2", text)
for ced2mcotone in tones2mco:
text = text.replace(ced2mcotone[0], ced2mcotone[1])
#
return ud.normalize('NFC', text)
def ascii_ced2mco(text: str):
import unicodedata as ud
text = ud.normalize('NFD', text)
return ced2mco(ascii_ced2ced(text))
def ascii_ced2ced(text: str):
import unicodedata as ud
text = ud.normalize('NFD', text)
text = text.replace(".", "\u0323")
text = text.replace("1", "¹")
text = text.replace("2", "²")
text = text.replace("3", "³")
text = text.replace("4", "⁴")
text = text.replace("?", "ɂ")
return text
if __name__ == "__main__":
test()
| 38.232 | 138 | 0.586943 | 510 | 4,779 | 5.417647 | 0.347059 | 0.136808 | 0.18241 | 0.121607 | 0.345277 | 0.330076 | 0.285197 | 0.271444 | 0.152371 | 0.152371 | 0 | 0.111052 | 0.214271 | 4,779 | 124 | 139 | 38.540323 | 0.624767 | 0.06675 | 0 | 0.154639 | 0 | 0 | 0.310043 | 0.051898 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051546 | false | 0 | 0.061856 | 0 | 0.185567 | 0.082474 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5ef2f309d751c48873dcfc34c92ab93f2ef03256 | 1,793 | py | Python | app/db_con.py | bmugenya/Zup | 1677c1e4e263409f9f5fcaac7411dd403e32650e | [
"MIT"
] | null | null | null | app/db_con.py | bmugenya/Zup | 1677c1e4e263409f9f5fcaac7411dd403e32650e | [
"MIT"
] | 1 | 2020-03-06T17:32:15.000Z | 2020-03-06T17:32:15.000Z | app/db_con.py | bmugenya/Zup | 1677c1e4e263409f9f5fcaac7411dd403e32650e | [
"MIT"
] | null | null | null | import psycopg2
url = "dbname='da43n1slakcjkc' user='msqgxzgmcskvst' host='ec2-54-80-184-43.compute-1.amazonaws.com' port=5432 password='9281f925b1e2298e8d62812d9d4e430c1054db62e918c282d7039fa85b1759fa'"
class database_setup(object):
def __init__(self):
self.conn = psycopg2.connect(url)
self.cursor = self.conn.cursor()
def destroy_tables(self):
self.cursor.execute("""DROP TABLE IF EXISTS user CASCADE;""")
self.conn.commit()
def create_tables(self):
self.cursor.execute("""CREATE TABLE IF NOT EXISTS Users (
user_id SERIAL NOT NULL,
fname VARCHAR(25) NOT NULL,
lname VARCHAR(25) NOT NULL,
post_date DATE NOT NULL DEFAULT CURRENT_DATE,
email VARCHAR(50) UNIQUE NOT NULL,
password VARCHAR(256) NOT NULL,
photo VARCHAR(255) NOT NULL,
PRIMARY KEY (email)
);""")
self.cursor.execute("""CREATE TABLE IF NOT EXISTS Report (
report_id SERIAL NOT NULL,
num_tweet INT NOT NULL,
tweet VARCHAR(255) NOT NULL,
plot_bar VARCHAR(255) NOT NULL,
plot_pie VARCHAR(255) NOT NULL,
post_date DATE NOT NULL DEFAULT CURRENT_DATE,
email VARCHAR(50) REFERENCES Users(email) NOT NULL,
PRIMARY KEY (report_id)
);""")
self.cursor.execute("""CREATE TABLE IF NOT EXISTS Config (
config_id SERIAL NOT NULL,
consumerKey TEXT NOT NULL,
consumerSecret TEXT NOT NULL,
accessToken TEXT NOT NULL,
accessSecret TEXT NOT NULL,
email VARCHAR(50) REFERENCES Users(email) NOT NULL,
PRIMARY KEY (config_id)
);""")
self.conn.commit()
| 34.480769 | 187 | 0.605131 | 210 | 1,793 | 5.07619 | 0.361905 | 0.131332 | 0.06379 | 0.06379 | 0.375235 | 0.301126 | 0.301126 | 0.301126 | 0.19137 | 0.19137 | 0 | 0.072581 | 0.308422 | 1,793 | 51 | 188 | 35.156863 | 0.787097 | 0 | 0 | 0.225 | 0 | 0.025 | 0.760178 | 0.092582 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0.05 | 0.025 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5efb1967191c3b432f3eb4d402361c056b7541a9 | 4,085 | py | Python | linux-distro/package/nuxleus/Source/Vendor/Microsoft/IronPython-2.0.1/Lib/Kamaelia/Protocol/Torrent/TorrentIPC.py | mdavid/nuxleus | 653f1310d8bf08eaa5a7e3326c2349e56a6abdc2 | [
"BSD-3-Clause"
] | 1 | 2017-03-28T06:41:51.000Z | 2017-03-28T06:41:51.000Z | linux-distro/package/nuxleus/Source/Vendor/Microsoft/IronPython-2.0.1/Lib/Kamaelia/Protocol/Torrent/TorrentIPC.py | mdavid/nuxleus | 653f1310d8bf08eaa5a7e3326c2349e56a6abdc2 | [
"BSD-3-Clause"
] | null | null | null | linux-distro/package/nuxleus/Source/Vendor/Microsoft/IronPython-2.0.1/Lib/Kamaelia/Protocol/Torrent/TorrentIPC.py | mdavid/nuxleus | 653f1310d8bf08eaa5a7e3326c2349e56a6abdc2 | [
"BSD-3-Clause"
] | 1 | 2016-12-13T21:08:58.000Z | 2016-12-13T21:08:58.000Z | #!/usr/bin/env python
#
# Copyright (C) 2006 British Broadcasting Corporation and Kamaelia Contributors(1)
# All Rights Reserved.
#
# You may only modify and redistribute this under the terms of any of the
# following licenses(2): Mozilla Public License, V1.1, GNU General
# Public License, V2.0, GNU Lesser General Public License, V2.1
#
# (1) Kamaelia Contributors are listed in the AUTHORS file and at
# http://kamaelia.sourceforge.net/AUTHORS - please extend this file,
# not this notice.
# (2) Reproduced in the COPYING file, and at:
# http://kamaelia.sourceforge.net/COPYING
# Under section 3.5 of the MPL, we are using this text since we deem the MPL
# notice inappropriate for this file. As per MPL/GPL/LGPL removal of this
# notice is prohibited.
#
# Please contact us via: kamaelia-list-owner@lists.sourceforge.net
# to discuss alternative licensing.
# -------------------------------------------------------------------------
# Licensed to the BBC under a Contributor Agreement: RJL
"""(Bit)Torrent IPC messages"""
from Kamaelia.BaseIPC import IPC
# ====================== Messages to send to TorrentMaker =======================
class TIPCMakeTorrent(IPC):
"Create a .torrent file"
Parameters = [ "trackerurl", "log2piecesizebytes", "title", "comment", "srcfile" ]
#Parameters:
# trackerurl - the URL of the BitTorrent tracker that will be used
# log2piecesizebytes - log base 2 of the hash-piece-size, sensible value: 18
# title - name of the torrent
# comment - a field that can be read by users when they download the torrent
# srcfile - the file that the .torrent file will have metainfo about
# ========= Messages for TorrentPatron to send to TorrentService ================
# a message for TorrentClient (i.e. to be passed on by TorrentService)
class TIPCServicePassOn(IPC):
"Add a client to TorrentService"
Parameters = [ "replyService", "message" ]
#Parameters: replyService, message
# request to add a TorrentPatron to a TorrentService's list of clients
class TIPCServiceAdd(IPC):
"Add a client to TorrentService"
Parameters = [ "replyService" ]
#Parameters: replyService
# request to remove a TorrentPatron from a TorrentService's list of clients
class TIPCServiceRemove(IPC):
"Remove a client from TorrentService"
Parameters = [ "replyService" ]
#Parameters: replyService
# ==================== Messages for TorrentClient to produce ====================
# a new torrent has been added with id torrentid
class TIPCNewTorrentCreated(IPC):
"New torrent %(torrentid)d created in %(savefolder)s"
Parameters = [ "torrentid", "savefolder" ]
#Parameters: torrentid, savefolder
# the torrent you requested me to download is already being downloaded as torrentid
class TIPCTorrentAlreadyDownloading(IPC):
"That torrent is already downloading!"
Parameters = [ "torrentid" ]
#Parameters: torrentid
# for some reason the torrent could not be started
class TIPCTorrentStartFail(object):
"Torrent failed to start!"
Parameters = []
#Parameters: (none)
# message containing the current status of a particular torrent
class TIPCTorrentStatusUpdate(IPC):
"Current status of a single torrent"
def __init__(self, torrentid, statsdictionary):
super(TIPCTorrentStatusUpdate, self).__init__()
self.torrentid = torrentid
self.statsdictionary = statsdictionary
def __str__(self):
return "Torrent %d status : %s" % (self.torrentid, str(int(self.statsdictionary.get("fractionDone",0) * 100)) + "%")
# ====================== Messages to send to TorrentClient ======================
# create a new torrent (a new download session) from a .torrent file's binary contents
class TIPCCreateNewTorrent(IPC):
"Create a new torrent"
Parameters = [ "rawmetainfo" ]
#Parameters: rawmetainfo - the contents of a .torrent file
# close a running torrent
class TIPCCloseTorrent(IPC):
"Close torrent %(torrentid)d"
Parameters = [ "torrentid" ]
#Parameters: torrentid
| 40.04902 | 124 | 0.682742 | 484 | 4,085 | 5.737603 | 0.423554 | 0.047533 | 0.008642 | 0.015844 | 0.115232 | 0.086424 | 0.086424 | 0.03673 | 0 | 0 | 0 | 0.007505 | 0.184578 | 4,085 | 101 | 125 | 40.445545 | 0.826178 | 0.674663 | 0 | 0.166667 | 0 | 0 | 0.302384 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.027778 | 0.027778 | 0.027778 | 0.638889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6f011e9d1e6d5fe45f9c159871d9be7ae9ea35b9 | 1,111 | py | Python | snakes/help_info.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | 12 | 2019-04-15T07:20:31.000Z | 2019-05-18T22:03:35.000Z | snakes/help_info.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | null | null | null | snakes/help_info.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | null | null | null | """Module help_info."""
__author__ = 'Joan A. Pinol (japinol)'
class HelpInfo:
"""Manages information used for help purposes."""
def print_help_keys(self):
print(' F1: \t show a help screen while playing the game'
' t: \t stats on/off\n'
' L_Ctrl + R_Alt + g: grid\n'
' p: \t pause\n'
' ESC: exit game\n'
' ^m: \t pause/resume music\n'
' ^s: \t sound effects on/off\n'
' Alt + Enter: change full screen / normal screen mode\n'
' ^h: \t shows this help\n'
' \t left, a: move snake to the left\n'
' \t right, d: move snake to the right\n'
' \t up, w: move snake up\n'
' \t down, s: move snake down\n'
' \t u 4: fire a light shot\n'
' \t i 5: fire a medium shot\n'
' \t j 1: fire a strong shot\n'
' \t k 2: fire a heavy shot\n'
)
| 41.148148 | 73 | 0.417642 | 145 | 1,111 | 3.137931 | 0.537931 | 0.035165 | 0.03956 | 0.061538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008547 | 0.473447 | 1,111 | 26 | 74 | 42.730769 | 0.769231 | 0.054905 | 0 | 0 | 0 | 0 | 0.627838 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0 | 0 | 0.095238 | 0.095238 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f03742065f7d2c3fc2369fb406d4426cdddbeab | 459 | py | Python | Exercicios em Python/ex080.py | Raphael-Azevedo/Exercicios_Python | dece138f38edd02b0731aed78e44acccb021b3cb | [
"MIT"
] | null | null | null | Exercicios em Python/ex080.py | Raphael-Azevedo/Exercicios_Python | dece138f38edd02b0731aed78e44acccb021b3cb | [
"MIT"
] | null | null | null | Exercicios em Python/ex080.py | Raphael-Azevedo/Exercicios_Python | dece138f38edd02b0731aed78e44acccb021b3cb | [
"MIT"
] | null | null | null | n = []
i = 0
for c in range(0, 5):
n1 = int(input('Digite um valor: '))
if c == 0 or n1 > n[-1]:
n.append(n1)
print(f'Adicionado na posição {c} da lista...')
else:
pos = 0
while pos < len(n):
if n1 <= n[pos]:
n.insert(pos, n1)
print(f'Adicionado na posição {pos} da lista...')
break
pos += 1
print(f'Os valores digitados em ordem foram {n}')
| 25.5 | 65 | 0.461874 | 68 | 459 | 3.117647 | 0.544118 | 0.084906 | 0.075472 | 0.169811 | 0.254717 | 0.254717 | 0 | 0 | 0 | 0 | 0 | 0.043011 | 0.392157 | 459 | 17 | 66 | 27 | 0.716846 | 0 | 0 | 0 | 0 | 0 | 0.287582 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f03aa2ab2aaee70b468bb66183fe442925a1018 | 13,132 | py | Python | rawal_stuff/src/demo.py | rawalkhirodkar/traffic_light_detection | 0e1e99962477bcf271b22d5205b1e7afab8635ba | [
"MIT"
] | null | null | null | rawal_stuff/src/demo.py | rawalkhirodkar/traffic_light_detection | 0e1e99962477bcf271b22d5205b1e7afab8635ba | [
"MIT"
] | null | null | null | rawal_stuff/src/demo.py | rawalkhirodkar/traffic_light_detection | 0e1e99962477bcf271b22d5205b1e7afab8635ba | [
"MIT"
] | null | null | null | import cv2
import numpy as np
import random
import copy
import dlib
from keras.models import Sequential
from keras.optimizers import SGD
from keras.datasets import cifar10
from keras.preprocessing.image import ImageDataGenerator
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
from keras.utils import np_utils
from keras.models import load_model
from convnetskeras.convnets import preprocess_image_batch, convnet
from convnetskeras.imagenet_tool import synset_to_dfs_ids
np.set_printoptions(threshold=np.inf)
#----------------------------Globals------------------------------------------------------------
MIN_AREA = 20
MAX_AREA = 500
MIN_RED_DENSITY = 0.4
MIN_BLACk_DENSITY_BELOW = 0
MIN_POLYAPPROX = 3
WIDTH_HEIGHT_RATIO = [0.333, 1.5] #range
#------------------------------------------------------------------------------------------------
tracker_list = []
TRACK_FRAME = 10
VOTE_FRAME = 3
frame0_detections = []
frame1_detections = []
frame2_detections = []
frame_detections = []
RADIAL_DIST = 10
#------------------------------------------------------------------------------------------------
def dist(x1,y1,x2,y2):
a = np.array((x1 ,y1))
b = np.array((x2, y2))
return np.linalg.norm(a-b)
#------------------------------------------------------------------------------------------------
BOUNDING_BOX = [0,0,0,0] #x1, y1, x2, y2
#------------------------------------------------------------------------------------------------
def prune_detection(detections):
ans = []
size = len(detections)
for i in range(0,size):
(x,y,w,h) = detections[i]
found = -1
for j in range(i+1,size):
(x1,y1,w1,h1) = detections[j]
if(dist(x,y,x1,y1) < RADIAL_DIST):
found = 1
break
if found == -1:
ans.append(detections[i])
return ans
#------------------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------------------
def inside(p):
(x,y) = p
if(x < BOUNDING_BOX[2] and x > BOUNDING_BOX[0] and y < BOUNDING_BOX[3] and y > BOUNDING_BOX[1]):
return True
return False
#------------------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------------------
def is_violation(frame_detections):
for (x,y,w,h) in frame_detections:
p1 = (x,y)
p2 = (x+w,y)
p3 = (x,y+h)
p4 = (x+w,y+h)
if(inside(p1) and inside(p2) and inside(p3) and inside(p4)):
continue
elif(not(inside(p1)) and not(inside(p2)) and not(inside(p3)) and not(inside(p4))):
continue
else:
return True
return False
#------------------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------------------
def create_model():
nb_classes = 2
# Create the model
model = Sequential()
model.add(Convolution2D(32, 3, 3, input_shape=(3, 128, 128), border_mode='same'))
model.add(Activation('relu'))
model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,3)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(nb_classes))
model.add(Activation('softmax'))
return model
#------------------------------------------------------------------------------------------------
print "Loading model"
model = create_model()
model.load_weights("../model/traffic_light_weights.h5")
#------------------------------------------------------------------------------------------------
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model_heatmap = convnet('vgg_19',weights_path="../model/weights/vgg19_weights.h5", heatmap=True)
model_heatmap.compile(optimizer=sgd, loss='mse')
traffic_light_synset = "n06874185"
ids = synset_to_dfs_ids(traffic_light_synset)
#------------------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------------------
clipnum = raw_input("Enter Clip number:\n")
f=open('../../dayTrain/dayClip'+str(clipnum)+'/frameAnnotationsBULB.csv','r')
inputs=f.read()
f.close();
inputs=inputs.split()
inputs=[i.split(";") for i in inputs]
for i in range(21):
inputs.pop(0)
# fourcc = cv2.VideoWriter_fourcc(*'XVID')
fourcc = cv2.cv.CV_FOURCC(*'XVID')
out = cv2.VideoWriter('output'+str(clipnum)+'.avi',fourcc, 20.0, (1280,960))
#------------------------------------------------------------------------------------------------
frame_num = -1
VIOLATION = -1
for i in inputs:
if i[1]=="stop":
filename="../../dayTrain/dayClip"+str(clipnum)+"/frames/"+i[0][12:len(i[0])]
original_img=cv2.imread(filename)
img=copy.copy(original_img)
height, width, channels = img.shape
if(frame_num == -1):
center_x = width/2
center_y = height/2
BB_width = width/4
BB_height = height/4
BOUNDING_BOX = [center_x-BB_width,center_y-BB_height,center_x + BB_width, center_y + BB_height ]
frame_num += 1
#------------------detection begins--------------------------------------------------------
if(frame_num % TRACK_FRAME < VOTE_FRAME): #VOTE_FRAME = 3, then 0,1,2 allowed
#------------------reset------------------------
if(frame_num % TRACK_FRAME == 0):
tracker_list = []
frame0_detections = []
frame1_detections = []
frame2_detections = []
#------------------reset------------------------
#-----------preprocess------------------------------------
img = cv2.medianBlur(img,3) # Median Blur to Remove Noise
img = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
b,g,r = cv2.split(img)
clahe = cv2.createCLAHE(clipLimit=7.0, tileGridSize=(8,8)) # Adaptive histogram equilization
clahe = clahe.apply(r)
img = cv2.merge((b,g,clahe))
#----------------------------------------------------------
#----------red threshold the HSV image--------------------
img1 = cv2.inRange(img, np.array([0, 100, 100]), np.array([10,255,255])) #lower red hue
img2 = cv2.inRange(img, np.array([160, 100, 100]), np.array([179,255,255])) #upper red hue
img3 = cv2.inRange(img, np.array([160, 40, 60]), np.array([180,70,80]))
img4 = cv2.inRange(img, np.array([0, 150, 40]), np.array([20,190,75]))
img5 = cv2.inRange(img, np.array([145, 35, 65]), np.array([170,65,90]))
img = cv2.bitwise_or(img1,img3)
img = cv2.bitwise_or(img,img2)
img = cv2.bitwise_or(img,img4)
img = cv2.bitwise_or(img,img5)
cv2.medianBlur(img,7)
ret,thresh = cv2.threshold(img,127,255,0)
#----------------------------------------------------------
#--------------------Heatmap------------------------------------
im_heatmap = preprocess_image_batch([filename], color_mode="bgr")
out_heatmap = model_heatmap.predict(im_heatmap)
heatmap = out_heatmap[0,ids].sum(axis=0)
my_range = np.max(heatmap) - np.min(heatmap)
heatmap = heatmap / my_range
heatmap = heatmap * 255
heatmap = cv2.resize(heatmap,(width,height))
cv2.imwrite("heatmap.png",heatmap)
cv2.imwrite("image.png",original_img)
heatmap[heatmap < 128] = 0 # Black
heatmap[heatmap >= 128] = 255 # White
heatmap = np.asarray(heatmap,dtype=np.uint8)
#----------------------------------------------------------
thresh = cv2.bitwise_and(thresh,heatmap)
#----------------------------------------------------------
contours, hierarchy = cv2.findContours(thresh,cv2.RETR_EXTERNAL,cv2.CHAIN_APPROX_NONE)
for cnt in contours:
area = cv2.contourArea(cnt)
x,y,w,h = cv2.boundingRect(cnt)
red_density = (area*1.0)/(w*h)
width_height_ratio = (w*1.0)/h
perimeter = cv2.arcLength(cnt, True)
approx = cv2.approxPolyDP(cnt, 0.04 * perimeter, True)
temp=cv2.cvtColor(original_img[y+h:y+2*h,x:x+w], cv2.COLOR_RGB2GRAY)
(thresh, temp) = cv2.threshold(temp, 128, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
black_density_below = ((w*h - cv2.countNonZero(temp))*1.0)/(w*h)
if area>MIN_AREA and area<MAX_AREA and len(approx) > MIN_POLYAPPROX and red_density > MIN_RED_DENSITY and width_height_ratio < WIDTH_HEIGHT_RATIO[1] and width_height_ratio > WIDTH_HEIGHT_RATIO[0] and black_density_below > MIN_BLACk_DENSITY_BELOW:
try:
r_x1=x-50
r_y1=y-50
r_x2=x+w+50
r_y2=y+h+50
temp=original_img[r_y1:r_y2,r_x1:r_x2]
xx=cv2.resize(temp,(128,128))
xx=np.asarray(xx)
xx=np.transpose(xx,(2,0,1))
xx=np.reshape(xx,(1,3,128,128))
if model.predict_classes(xx,verbose=0)==[1]:
cv2.rectangle(original_img, (x,y), (x+w,y+h),(0,255,0), 2)
#append detections
if frame_num % TRACK_FRAME == 0:
frame0_detections.append((x,y,w,h))
elif frame_num%TRACK_FRAME == 1:
frame1_detections.append((x,y,w,h))
elif frame_num%TRACK_FRAME == 2:
frame2_detections.append((x,y,w,h))
else:
cv2.rectangle(original_img, (x,y), (x+w,y+h),(255,0,0), 1)
except Exception as e:
cv2.rectangle(original_img, (x,y), (x+w,y+h),(0,255,0), 2) #edges are allowed
print e
pass
#--------------------Violation in Detect Phase------------------------------
frame_detections = []
if(frame_num % TRACK_FRAME == 0):
frame_detections = frame0_detections
if(frame_num % TRACK_FRAME == 1):
frame_detections = frame1_detections
if(frame_num % TRACK_FRAME == 2):
frame_detections = frame2_detections
#--------------------Violation in Detect Phase------------------------------
#compute and start tracking
if frame_num % TRACK_FRAME == 2:
all_detections = frame0_detections + frame1_detections + frame2_detections
final_detections = prune_detection(all_detections)
for (x,y,w,h) in final_detections:
tracker = dlib.correlation_tracker()
tracker.start_track(original_img, dlib.rectangle(x,y,(x+w),(y+h)))
tracker_list.append(tracker)
#------------------detection end----------------------------------------------------
#------------------tracking begins----------------------------------------------------
else:
frame_detections = []
for tracker in tracker_list:
tracker.update(original_img)
rect = tracker.get_position()
pt1 = (int(rect.left()), int(rect.top()))
pt2 = (int(rect.right()), int(rect.bottom()))
cv2.rectangle(original_img, pt1, pt2, (255, 255, 255), 2)
frame_detections.append((pt1[0], pt1[1], pt2[0]-pt1[0], pt2[1]-pt1[1]))
#------------------ tracking end----------------------------------------------------
if(is_violation(frame_detections) == True):
cv2.rectangle(original_img, (BOUNDING_BOX[0],BOUNDING_BOX[1]), (BOUNDING_BOX[2],BOUNDING_BOX[3]),(0, 0, 255), 2)
else:
cv2.rectangle(original_img, (BOUNDING_BOX[0],BOUNDING_BOX[1]), (BOUNDING_BOX[2],BOUNDING_BOX[3]),(60, 255, 255), 2)
cv2.imshow("Annotated",original_img)
out.write(original_img)
ch = 0xFF & cv2.waitKey(1)
if ch == 27:
break
cv2.destroyAllWindows()
#------------------------------------------------------------------------------------------------
| 43.919732 | 262 | 0.456823 | 1,388 | 13,132 | 4.172911 | 0.240634 | 0.00518 | 0.0202 | 0.02797 | 0.220822 | 0.194406 | 0.109289 | 0.066126 | 0.056112 | 0.056112 | 0 | 0.046642 | 0.253884 | 13,132 | 298 | 263 | 44.067114 | 0.544499 | 0.223881 | 0 | 0.144144 | 0 | 0 | 0.026513 | 0.013306 | 0 | 0 | 0.000394 | 0 | 0 | 0 | null | null | 0.004505 | 0.072072 | null | null | 0.013514 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f073d830bc26d55a9b16a99438ab898d40254be | 3,418 | py | Python | mcpyrate/markers.py | Technologicat/mcpyrate | 8182a8d246554b152e281d0f6c912e35ea58c316 | [
"MIT"
] | 34 | 2020-10-13T19:22:36.000Z | 2022-01-28T00:53:55.000Z | mcpyrate/markers.py | Technologicat/mcpyrate | 8182a8d246554b152e281d0f6c912e35ea58c316 | [
"MIT"
] | 32 | 2020-10-16T16:29:54.000Z | 2022-01-27T15:45:51.000Z | mcpyrate/markers.py | Technologicat/mcpyrate | 8182a8d246554b152e281d0f6c912e35ea58c316 | [
"MIT"
] | 2 | 2020-10-17T19:07:26.000Z | 2021-02-20T01:43:50.000Z |
# -*- coding: utf-8; -*-
"""AST markers for internal communication.

*Internal* here means they are never to be passed to Python's `compile`;
macros may use them to work together.
"""

__all__ = ["ASTMarker", "get_markers", "delete_markers", "check_no_markers_remaining"]

import ast

from . import core, utils, walkers


class ASTMarker(ast.AST):
    """Base class for AST markers.

    Markers are AST-node-like objects meant for communication between
    co-operating, related macros. They are also used by the macro expander
    to talk with itself during expansion.

    We inherit from `ast.AST`, so that during macro expansion, a marker
    behaves like a single AST node.

    It is a postcondition of a completed macro expansion that no markers
    remain in the AST.

    To help fail-fast, if you define your own marker types, use `get_markers`
    to check (at an appropriate point) that the expanded AST has no instances
    of your own markers remaining. (You'll want a base class for your own markers.)

    A typical usage example is in the quasiquote system, where the unquote
    operators (some of which expand to markers) may only appear inside a quoted
    section. So just before the quote operator exits, it checks that all
    quasiquote markers within that section have been compiled away.
    """
    # TODO: Silly default `None`, because `copy` and `deepcopy` call `__init__` without arguments,
    # TODO: though the docs say they behave like `pickle` (and wouldn't thus need to call `__init__` at all!).
    def __init__(self, body=None):
        """body: the actual AST that is annotated by this marker"""
        self.body = body
        self._fields = ["body"]  # support ast.iter_fields


def get_markers(tree, cls=ASTMarker):
    """Return a `list` of any `cls` instances found in `tree`. For output validation."""
    class ASTMarkerCollector(walkers.ASTVisitor):
        def examine(self, tree):
            if isinstance(tree, cls):
                self.collect(tree)
            self.generic_visit(tree)
    w = ASTMarkerCollector()
    w.visit(tree)
    return w.collected


def delete_markers(tree, cls=ASTMarker):
    """Delete any `cls` ASTMarker instances found in `tree`.

    The deletion takes place by replacing each marker node with
    the actual AST node stored in its `body` attribute.
    """
    class ASTMarkerDeleter(walkers.ASTTransformer):
        def transform(self, tree):
            if isinstance(tree, cls):
                return self.visit(tree.body)
            return self.generic_visit(tree)
    return ASTMarkerDeleter().visit(tree)


def check_no_markers_remaining(tree, *, filename, cls=None):
    """Check that `tree` has no AST markers remaining.

    If a class `cls` is provided, only check for markers for which
    `isinstance(marker, cls)` holds. If there are any, raise
    `MacroExpansionError`. No return value.

    `filename` is the full path to the `.py` file, for error reporting.

    Convenience function.
    """
    cls = cls or ASTMarker
    remaining_markers = get_markers(tree, cls)
    if remaining_markers:
        codes = [utils.format_context(node, n=5) for node in remaining_markers]
        locations = [utils.format_location(filename, node, code) for node, code in zip(remaining_markers, codes)]
        report = "\n\n".join(locations)
        raise core.MacroExpansionError(f"{filename}: AST markers remaining after expansion:\n{report}")
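For readers outside the mcpyrate codebase, the marker pattern above can be demonstrated with the standard library alone. The sketch below is illustrative, not mcpyrate's API: it re-creates `get_markers` and `delete_markers` on top of `ast.walk` and `ast.NodeTransformer` instead of `mcpyrate.walkers`.

```python
import ast

class DemoMarker(ast.AST):
    """Toy stand-in for ASTMarker: wraps a single AST node in `body`."""
    def __init__(self, body=None):
        self.body = body
        self._fields = ["body"]  # lets ast.iter_fields / ast.walk see the child

def find_markers(tree, cls=DemoMarker):
    """Like get_markers: collect all `cls` instances in `tree`."""
    return [node for node in ast.walk(tree) if isinstance(node, cls)]

class MarkerStripper(ast.NodeTransformer):
    """Like delete_markers: replace each marker with the node it annotates."""
    def visit_DemoMarker(self, node):
        return self.visit(node.body)

tree = ast.parse("x = 1")
tree.body[0] = DemoMarker(body=tree.body[0])  # annotate the assignment
assert len(find_markers(tree)) == 1           # a postcondition check would fail here

tree = MarkerStripper().visit(tree)
assert find_markers(tree) == []               # all markers compiled away
assert isinstance(tree.body[0], ast.Assign)
```

The same shape underlies `check_no_markers_remaining`: expand, then assert that the collector comes back empty.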
| 37.977778 | 113 | 0.693681 | 475 | 3,418 | 4.911579 | 0.429474 | 0.034291 | 0.018003 | 0.019717 | 0.023146 | 0.023146 | 0 | 0 | 0 | 0 | 0 | 0.00075 | 0.219427 | 3,418 | 89 | 114 | 38.404494 | 0.873688 | 0.550614 | 0 | 0.064516 | 0 | 0 | 0.091298 | 0.018545 | 0 | 0 | 0 | 0.011236 | 0 | 1 | 0.193548 | false | 0 | 0.064516 | 0 | 0.483871 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f0bb8acf71ebb128d83c12c5909aa37ad5afe8a | 940 | py | Python | sizer.py | riffcc/librarian | f3cf8f4cc9f9a717e5f807a1d8558eb8c4e4d528 | [
"MIT"
] | null | null | null | sizer.py | riffcc/librarian | f3cf8f4cc9f9a717e5f807a1d8558eb8c4e4d528 | [
"MIT"
] | null | null | null | sizer.py | riffcc/librarian | f3cf8f4cc9f9a717e5f807a1d8558eb8c4e4d528 | [
"MIT"
] | null | null | null |
#!/usr/bin/python3
# Fetch torrent sizes
# TODO: Report number of files before we go etc
import os
from torrentool.api import Torrent
from fnmatch import fnmatch

root = '/opt/radio/collections'
pattern = "*.torrent"
alltorrentsize = 0

print("Thanks for using The Librarian.")
for path, subdirs, files in os.walk(root):
    for name in files:
        if fnmatch(name, pattern):
            torrentstats = Torrent.from_file(os.path.join(path, name))
            alltorrentsize += torrentstats.total_size
            print(f'Torrent size {torrentstats.total_size} for a total so far of {alltorrentsize}')
            print('DEBUG ' + os.path.join(path, name))

# Reading filesize
my_torrent = Torrent.from_file('/opt/radio/collections/arienscompanymanuals/archive.org/download/collection_01_ariens_manuals/collection_01_ariens_manuals_archive.torrent')
size = my_torrent.total_size  # Total files size in bytes.
print(size)
| 34.814815 | 172 | 0.726596 | 127 | 940 | 5.267717 | 0.488189 | 0.049327 | 0.056801 | 0.041854 | 0.053812 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007702 | 0.171277 | 940 | 27 | 173 | 34.814815 | 0.851091 | 0.135106 | 0 | 0 | 0 | 0 | 0.297899 | 0.197775 | 0 | 0 | 0 | 0.037037 | 0 | 1 | 0 | false | 0 | 0.176471 | 0 | 0.176471 | 0.235294 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
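Parsing torrent metadata needs the third-party `torrentool` package, but the directory-walking half of the script above can be sketched with the standard library alone. The helper name below is made up for illustration; it sums on-disk file sizes instead of `total_size` from the torrent payload.

```python
import os
import tempfile
from fnmatch import fnmatch

def total_matching_size(root, pattern):
    """Sum the on-disk size of every file under `root` whose name matches `pattern`."""
    total = 0
    for path, _subdirs, files in os.walk(root):
        for name in files:
            if fnmatch(name, pattern):
                total += os.path.getsize(os.path.join(path, name))
    return total

# Exercise it against a throwaway directory tree.
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "a.torrent"), "wb") as f:
        f.write(b"x" * 10)
    with open(os.path.join(root, "b.torrent"), "wb") as f:
        f.write(b"x" * 5)
    with open(os.path.join(root, "notes.txt"), "wb") as f:
        f.write(b"ignored")
    assert total_matching_size(root, "*.torrent") == 15
```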
6f1051aadde1f5582ce2b30a763b8cd2ec505a2e | 1,373 | py | Python | tests/test_renderer.py | 0xflotus/maildown | fa17ce6a29458da549a145741db8e5092def2176 | [
"MIT"
] | 626 | 2019-05-08T22:34:45.000Z | 2022-03-31T07:29:35.000Z | tests/test_renderer.py | pythonthings/maildown | 4e0caf297bdf264ab5ead537eb45d20f187971a1 | [
"MIT"
] | 12 | 2019-04-30T20:47:17.000Z | 2019-06-27T11:19:46.000Z | tests/test_renderer.py | pythonthings/maildown | 4e0caf297bdf264ab5ead537eb45d20f187971a1 | [
"MIT"
] | 36 | 2019-05-08T23:50:41.000Z | 2021-07-30T17:46:24.000Z |
import mock
from maildown import renderer
import mistune
import pygments
from pygments import lexers
from pygments.formatters import html
import premailer
import jinja2


def test_highlight_renderer(monkeypatch):
    monkeypatch.setattr(mistune, "escape", mock.MagicMock())
    monkeypatch.setattr(lexers, "get_lexer_by_name", mock.MagicMock())
    monkeypatch.setattr(html, "HtmlFormatter", mock.MagicMock())
    monkeypatch.setattr(pygments, "highlight", mock.MagicMock())

    lexers.get_lexer_by_name.return_value = True
    html.HtmlFormatter.return_value = {}

    r = renderer.HighlightRenderer()

    r.block_code("code")
    mistune.escape.assert_called_with("code")

    r.block_code("code", "python")
    lexers.get_lexer_by_name.assert_called_with("python", stripall=True)
    pygments.highlight.assert_called_with("code", True, {})


def test_generate_content(monkeypatch):
    monkeypatch.setattr(mistune, "Markdown", mock.MagicMock())
    monkeypatch.setattr(premailer, "transform", mock.MagicMock())
    monkeypatch.setattr(renderer, "HighlightRenderer", mock.MagicMock())
    monkeypatch.setattr(jinja2, "Template", mock.MagicMock())

    renderer.HighlightRenderer.return_value = 1
    premailer.transform.return_value = ""
    jinja2.Template.render.return_value = ""

    renderer.generate_content("")
    mistune.Markdown.assert_called_with(renderer=1)
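These tests lean on pytest's `monkeypatch` fixture to swap module attributes for `MagicMock`s. Outside pytest, the same stub-then-assert pattern can be sketched with `unittest.mock.patch.object`; the `FakeLexers` class below is a made-up stand-in for a real module, not part of maildown.

```python
from unittest import mock

class FakeLexers:
    """Hypothetical stand-in for a module whose attribute we want to stub."""
    @staticmethod
    def get_lexer_by_name(name, stripall=False):
        return "real lexer"

# Inside the context manager the attribute is a MagicMock we can interrogate.
with mock.patch.object(FakeLexers, "get_lexer_by_name", mock.MagicMock(return_value=True)) as stub:
    result = FakeLexers.get_lexer_by_name("python", stripall=True)
    stub.assert_called_with("python", stripall=True)
    assert result is True

# On exit the original is restored automatically, like monkeypatch teardown.
assert FakeLexers.get_lexer_by_name("python") == "real lexer"
```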
| 33.487805 | 72 | 0.758194 | 156 | 1,373 | 6.487179 | 0.275641 | 0.142292 | 0.142292 | 0.183794 | 0.059289 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004163 | 0.125273 | 1,373 | 40 | 73 | 34.325 | 0.838468 | 0 | 0 | 0 | 0 | 0 | 0.083758 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 1 | 0.064516 | false | 0 | 0.258065 | 0 | 0.322581 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f149a0dd9e45b60d9d630858342198ce7d83ebf | 1,709 | py | Python | xen/xen-4.2.2/tools/xm-test/tests/xapi/01_xapi-vm_basic.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | 1 | 2018-02-02T00:15:26.000Z | 2018-02-02T00:15:26.000Z | xen/xen-4.2.2/tools/xm-test/tests/xapi/01_xapi-vm_basic.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | null | null | null | xen/xen-4.2.2/tools/xm-test/tests/xapi/01_xapi-vm_basic.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | 1 | 2019-05-27T09:47:18.000Z | 2019-05-27T09:47:18.000Z |
#!/usr/bin/python

# Copyright (C) International Business Machines Corp., 2006
# Author: Stefan Berger <stefanb@us.ibm.com>

# Basic VM creation test

from XmTestLib import xapi
from XmTestLib.XenAPIDomain import XmTestAPIDomain
from XmTestLib import *
from xen.xend import XendAPIConstants
import commands
import os

try:
    # XmTestAPIDomain tries to establish a connection to XenD
    domain = XmTestAPIDomain()
except Exception, e:
    SKIP("Skipping test. Error: %s" % str(e))

vm_uuid = domain.get_uuid()
session = xapi.connect()
domain.start(startpaused=True)

res = session.xenapi.VM.get_power_state(vm_uuid)
if res != XendAPIConstants.XEN_API_VM_POWER_STATE[XendAPIConstants.XEN_API_VM_POWER_STATE_PAUSED]:
    FAIL("VM was not started in 'paused' state")

res = session.xenapi.VM.unpause(vm_uuid)
res = session.xenapi.VM.get_power_state(vm_uuid)
if res != XendAPIConstants.XEN_API_VM_POWER_STATE[XendAPIConstants.XEN_API_VM_POWER_STATE_RUNNING]:
    FAIL("VM could not be put into 'running' state")

console = domain.getConsole()
try:
    run = console.runCmd("cat /proc/interrupts")
except ConsoleError, e:
    saveLog(console.getHistory())
    FAIL("Could not access proc-filesystem")

res = session.xenapi.VM.pause(vm_uuid)
res = session.xenapi.VM.get_power_state(vm_uuid)
if res != XendAPIConstants.XEN_API_VM_POWER_STATE[XendAPIConstants.XEN_API_VM_POWER_STATE_PAUSED]:
    FAIL("VM could not be put into 'paused' state")

res = session.xenapi.VM.unpause(vm_uuid)
res = session.xenapi.VM.get_power_state(vm_uuid)
if res != XendAPIConstants.XEN_API_VM_POWER_STATE[XendAPIConstants.XEN_API_VM_POWER_STATE_RUNNING]:
    FAIL("VM could not be 'unpaused'")

domain.stop()
domain.destroy()
| 27.564516 | 99 | 0.774137 | 250 | 1,709 | 5.08 | 0.352 | 0.094488 | 0.138583 | 0.151181 | 0.491339 | 0.491339 | 0.491339 | 0.472441 | 0.472441 | 0.472441 | 0 | 0.00267 | 0.123464 | 1,709 | 61 | 100 | 28.016393 | 0.845127 | 0.114687 | 0 | 0.333333 | 0 | 0 | 0.143899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f177aacdeb67b4df7640983b24e1411fe279553 | 2,853 | py | Python | app/models/fragment.py | saury2013/Memento | dbb2031a5aff3064f40bcb5afe631de8724a547e | [
"MIT"
] | null | null | null | app/models/fragment.py | saury2013/Memento | dbb2031a5aff3064f40bcb5afe631de8724a547e | [
"MIT"
] | null | null | null | app/models/fragment.py | saury2013/Memento | dbb2031a5aff3064f40bcb5afe631de8724a547e | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
from datetime import datetime

from sqlalchemy.dialects.mysql import LONGTEXT
from sqlalchemy.orm import load_only
from sqlalchemy import func
from flask import abort
from markdown import Markdown, markdown

from app.models import db, fragment_tags_table
from app.models.tag import Tag
from app.whoosh import search_helper


class Fragment(db.Model):
    '''Knowledge fragment.'''
    __tablename__ = 'fragment'
    __table_args__ = {
        "mysql_engine": "InnoDB",
        "mysql_charset": "utf8"
    }

    id = db.Column(db.Integer, nullable=False, primary_key=True, autoincrement=True)
    title = db.Column(db.String(255), nullable=False, default="", index=True)
    access = db.Column(db.Integer, nullable=False, default=1)
    status = db.Column(db.Integer, nullable=False, default=0)
    markdown = db.deferred(db.Column(LONGTEXT, default="", nullable=False))
    html = db.deferred(db.Column(LONGTEXT, default="", nullable=False))
    publish_markdown = db.deferred(db.Column(LONGTEXT, default="", nullable=False))
    publish_html = db.deferred(db.Column(LONGTEXT, default="", nullable=False))
    publish_timestamp = db.Column(db.DateTime, default=datetime.now, nullable=False)
    updatetime = db.Column(db.DateTime, default=datetime.now, nullable=False)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    tags = db.relationship('Tag', secondary=fragment_tags_table, backref=db.backref('fragments'))
    # branch = db.relationship('Branch', back_populates='fragment', uselist=False)
    branch_id = db.Column(db.Integer, db.ForeignKey('branch.id'))
    # branch = db.relationship('Branch', foreign_keys=branch_id)

    def get(self, id):
        return Fragment.query.get(id)

    @staticmethod
    def get_or_404(id):
        fragment = Fragment.query.get(id)
        if fragment:
            return fragment
        abort(404)

    def save(self):
        self.html = self.markdown2html(self.markdown)
        db.session.add(self)
        db.session.commit()
        search_helper.add_document(self.title, str(self.id), self.markdown)

    def markdown2html(self, content):
        # md = Markdown(['codehilite', 'fenced_code', 'meta', 'tables'])
        # html = md.convert(content)
        html = markdown(content, extensions=[
            'markdown.extensions.extra',
            'markdown.extensions.codehilite',
            'markdown.extensions.toc',
        ])
        return html

    @staticmethod
    def get_nearest_fragments(num=5):
        fragments = Fragment.query.filter().order_by(Fragment.updatetime.desc()).limit(num)
        res = []
        from app.models.branch import Branch
        for fragment in fragments:
            fragment.branch = Branch.get(fragment.branch_id)
            res.append(fragment)
        return res
| 38.04 | 95 | 0.660007 | 340 | 2,853 | 5.435294 | 0.326471 | 0.051948 | 0.04329 | 0.045996 | 0.267857 | 0.266775 | 0.250541 | 0.176948 | 0.176948 | 0.061688 | 0 | 0.007159 | 0.216614 | 2,853 | 74 | 96 | 38.554054 | 0.819687 | 0.087627 | 0 | 0.035088 | 0 | 0 | 0.057529 | 0.030116 | 0 | 0 | 0 | 0 | 0 | 1 | 0.087719 | false | 0 | 0.175439 | 0.017544 | 0.614035 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6f1b8a527ec012630d1bead41b940dac1320a132 | 4,617 | py | Python | source1/bsp/entities/portal2_entity_handlers.py | tltneon/SourceIO | 418224918c2b062a4c78a41d4d65329ba2decb22 | [
"MIT"
] | 199 | 2019-04-02T02:30:58.000Z | 2022-03-30T21:29:49.000Z | source1/bsp/entities/portal2_entity_handlers.py | syborg64/SourceIO | e4ba86d801f518e192260af08ef533759c2e1cc3 | [
"MIT"
] | 113 | 2019-03-03T19:36:25.000Z | 2022-03-31T19:44:05.000Z | source1/bsp/entities/portal2_entity_handlers.py | syborg64/SourceIO | e4ba86d801f518e192260af08ef533759c2e1cc3 | [
"MIT"
] | 38 | 2019-05-15T16:49:30.000Z | 2022-03-22T03:40:43.000Z |
import math

from mathutils import Euler
import bpy

from .portal2_entity_classes import *
from .portal_entity_handlers import PortalEntityHandler

local_entity_lookup_table = PortalEntityHandler.entity_lookup_table.copy()
local_entity_lookup_table.update(entity_class_handle)


class Portal2EntityHandler(PortalEntityHandler):
    entity_lookup_table = local_entity_lookup_table
    pointlight_power_multiplier = 1000

    def handle_prop_weighted_cube(self, entity: prop_weighted_cube, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_weighted_cube', obj, 'props')

    def handle_prop_testchamber_door(self, entity: prop_testchamber_door, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_testchamber_door', obj, 'props')

    def handle_prop_floor_button(self, entity: prop_floor_button, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_floor_button', obj, 'props')

    def handle_prop_floor_ball_button(self, entity: prop_floor_ball_button, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_floor_ball_button', obj, 'props')

    def handle_prop_floor_cube_button(self, entity: prop_floor_cube_button, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_floor_cube_button', obj, 'props')

    def handle_prop_under_floor_button(self, entity: prop_under_floor_button, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_under_floor_button', obj, 'props')

    def handle_prop_tractor_beam(self, entity: prop_tractor_beam, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_tractor_beam', obj, 'props')

    def handle_logic_playmovie(self, entity: logic_playmovie, entity_raw: dict):
        obj = bpy.data.objects.new(self._get_entity_name(entity), None)
        self._set_location(obj, entity.origin)
        self._set_icon_if_present(obj, entity)
        self._set_entity_data(obj, {'entity': entity_raw})
        self._put_into_collection('logic_playmovie', obj, 'logic')

    def handle_trigger_paint_cleanser(self, entity: trigger_paint_cleanser, entity_raw: dict):
        if 'model' not in entity_raw:
            return
        model_id = int(entity_raw.get('model')[1:])
        mesh_object = self._load_brush_model(model_id, self._get_entity_name(entity))
        self._set_location_and_scale(mesh_object, parse_float_vector(entity_raw.get('origin', '0 0 0')))
        self._set_rotation(mesh_object, parse_float_vector(entity_raw.get('angles', '0 0 0')))
        self._set_entity_data(mesh_object, {'entity': entity_raw})
        self._put_into_collection('trigger_paint_cleanser', mesh_object, 'triggers')

    def handle_trigger_catapult(self, entity: trigger_catapult, entity_raw: dict):
        if 'model' not in entity_raw:
            return
        model_id = int(entity_raw.get('model')[1:])
        mesh_object = self._load_brush_model(model_id, self._get_entity_name(entity))
        self._set_location_and_scale(mesh_object, parse_float_vector(entity_raw.get('origin', '0 0 0')))
        self._set_rotation(mesh_object, parse_float_vector(entity_raw.get('angles', '0 0 0')))
        self._set_entity_data(mesh_object, {'entity': entity_raw})
        self._put_into_collection('trigger_catapult', mesh_object, 'triggers')

    def handle_npc_wheatley_boss(self, entity: npc_wheatley_boss, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('npc_wheatley_boss', obj, 'npc')

    def handle_prop_exploding_futbol(self, entity: prop_exploding_futbol, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_exploding_futbol', obj, 'props')

    def handle_prop_exploding_futbol_socket(self, entity: prop_exploding_futbol_socket, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_exploding_futbol', obj, 'props')

    def handle_prop_exploding_futbol_spawner(self, entity: prop_exploding_futbol_spawner, entity_raw: dict):
        obj = self._handle_entity_with_model(entity, entity_raw)
        self._put_into_collection('prop_exploding_futbol_spawner', obj, 'props')
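Every method here follows the `handle_<entity classname>` naming convention, which suggests the expander resolves handlers by name from the lookup table. A minimal, hypothetical sketch of that dispatch style (not the actual SourceIO lookup code) — note how a misspelled method name silently falls through to the "no handler" branch:

```python
class EntityHandler:
    """Sketch of name-based dispatch: handle_<classname> methods, getattr lookup."""

    def dispatch(self, entity_classname, entity_raw):
        handler = getattr(self, f"handle_{entity_classname}", None)
        if handler is None:
            # A typo in a handler's name lands here: the entity is simply skipped.
            return f"no handler for {entity_classname}"
        return handler(entity_raw)

    def handle_prop_tractor_beam(self, entity_raw):
        return f"placed {entity_raw['targetname']} into props"

h = EntityHandler()
assert h.dispatch("prop_tractor_beam", {"targetname": "beam01"}) == "placed beam01 into props"
assert h.dispatch("npc_unknown", {}) == "no handler for npc_unknown"
```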
| 53.068966 | 109 | 0.753736 | 637 | 4,617 | 4.971743 | 0.142857 | 0.102305 | 0.057468 | 0.083991 | 0.695295 | 0.598042 | 0.581307 | 0.539627 | 0.539627 | 0.539627 | 0 | 0.005101 | 0.150747 | 4,617 | 86 | 110 | 53.686047 | 0.802601 | 0 | 0 | 0.402985 | 0 | 0 | 0.09465 | 0.039203 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208955 | false | 0 | 0.074627 | 0 | 0.358209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f1f9754bb7f6d41b30e4a4c10cead5e654ca04e | 2,743 | py | Python | edexOsgi/com.raytheon.edex.plugin.gfe/utility/cave_static/user/GFETEST/gfe/userPython/smartTools/ExUtil1.py | srcarter3/awips2 | 37f31f5e88516b9fd576eaa49d43bfb762e1d174 | [
"Apache-2.0"
] | null | null | null | edexOsgi/com.raytheon.edex.plugin.gfe/utility/cave_static/user/GFETEST/gfe/userPython/smartTools/ExUtil1.py | srcarter3/awips2 | 37f31f5e88516b9fd576eaa49d43bfb762e1d174 | [
"Apache-2.0"
] | null | null | null | edexOsgi/com.raytheon.edex.plugin.gfe/utility/cave_static/user/GFETEST/gfe/userPython/smartTools/ExUtil1.py | srcarter3/awips2 | 37f31f5e88516b9fd576eaa49d43bfb762e1d174 | [
"Apache-2.0"
] | 1 | 2021-10-30T00:03:05.000Z | 2021-10-30T00:03:05.000Z |
##
# This software was developed and / or modified by Raytheon Company,
# pursuant to Contract DG133W-05-CQ-1067 with the US Government.
#
# U.S. EXPORT CONTROLLED TECHNICAL DATA
# This software product contains export-restricted data whose
# export/transfer/disclosure is restricted by U.S. law. Dissemination
# to non-U.S. persons whether in the United States or abroad requires
# an export license or other authorization.
#
# Contractor Name:        Raytheon Company
# Contractor Address:     6825 Pine Street, Suite 340
#                         Mail Stop B8
#                         Omaha, NE 68106
#                         402.291.0100
#
# See the AWIPS II Master Rights File ("Master Rights File.pdf") for
# further licensing information.
##

# ----------------------------------------------------------------------------
# This software is in the public domain, furnished "as is", without technical
# support, and with no warranty, express or implied, as to its usefulness for
# any purpose.
#
# ExUtil1
#
# Author:
# ----------------------------------------------------------------------------

ToolType = "numeric"
WeatherElementEdited = "T"

from numpy import *
import SmartScript
import Common

VariableList = [("Model:", "", "D2D_model")]


class Tool (SmartScript.SmartScript):
    def __init__(self, dbss):
        self._dbss = dbss
        SmartScript.SmartScript.__init__(self, dbss)

    def execute(self, GridTimeRange, Topo, varDict):
        "This tool accesses T grids directly"
        self._common = Common.Common(self._dbss)
        model = varDict["Model:"]

        # Convert Topo to meters
        topo_M = self._common._convertFtToM(Topo)

        # Make a sounding cube for T.
        # Height will increase in the sounding and be the
        # first dimension
        levels = ["MB1000", "MB850", "MB700", "MB500"]
        gh_Cube, t_Cube = self.makeNumericSounding(
            model, "t", levels, GridTimeRange)
        print "Cube shapes ", gh_Cube.shape, t_Cube.shape

        # Make an initial T grid with values of -200
        # This is an out-of-range value to help us identify values that
        # have already been set.
        T = (Topo * 0) - 200

        # Work "upward" in the cubes to assign T
        # We will only set the value once, i.e. the first time the
        # gh height is greater than the Topo
        # For each level
        for i in xrange(gh_Cube.shape[0]):
            # where (gh > topo and T == -200),
            # set to t_Cube value, otherwise keep value already set
            T = where(logical_and(greater(gh_Cube[i], topo_M), equal(T, -200)), t_Cube[i], T)

        # Convert from K to F
        T_F = self.convertKtoF(T)
        return T_F
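The "-200 sentinel, set once" loop above implements a simple rule: for each grid point, take the temperature from the first sounding level whose geopotential height clears the terrain. A scalar, pure-Python sketch of that rule (illustrative names and made-up numbers, no numpy or GFE runtime required):

```python
def first_temp_above(topo_m, level_heights, level_temps):
    """Pick the temperature from the first level whose height exceeds the terrain."""
    for gh, t in zip(level_heights, level_temps):
        if gh > topo_m:
            return t
    return None  # terrain pokes above every level in the cube

# Heights (m) and temps (K) for MB1000, MB850, MB700 at one point (made-up values).
heights = [110.0, 1457.0, 3012.0]
temps = [288.0, 281.0, 271.0]

assert first_temp_above(500.0, heights, temps) == 281.0  # terrain above the 1000 mb surface
assert first_temp_above(50.0, heights, temps) == 288.0
```

The numeric tool does the same thing per grid cell at once; the `equal(T, -200)` mask plays the role of "have we already returned for this point".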
| 34.2875 | 96 | 0.596792 | 346 | 2,743 | 4.653179 | 0.523121 | 0.012422 | 0.019876 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030258 | 0.265038 | 2,743 | 79 | 97 | 34.721519 | 0.768353 | 0.566168 | 0 | 0 | 0 | 0 | 0.08559 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.125 | null | null | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f24c0d9627e8e593e0f3f03a5c6df58f6f65c2e | 2,922 | py | Python | lib/vapi_cli/users.py | nogayama/vision-tools | f3041b519f30037d5b6390bce36a7f5efd3ed6ae | [
"Apache-2.0"
] | 15 | 2020-03-22T18:25:27.000Z | 2021-12-03T05:49:32.000Z | lib/vapi_cli/users.py | nogayama/vision-tools | f3041b519f30037d5b6390bce36a7f5efd3ed6ae | [
"Apache-2.0"
] | 8 | 2020-04-04T18:11:56.000Z | 2021-07-27T18:06:47.000Z | lib/vapi_cli/users.py | nogayama/vision-tools | f3041b519f30037d5b6390bce36a7f5efd3ed6ae | [
"Apache-2.0"
] | 19 | 2020-03-20T23:36:32.000Z | 2022-01-10T20:38:48.000Z |
#!/usr/bin/env python3
# IBM_PROLOG_BEGIN_TAG
#
# Copyright 2019,2020 IBM International Business Machines Corp.
#
#  Licensed under the Apache License, Version 2.0 (the "License");
#  you may not use this file except in compliance with the License.
#  You may obtain a copy of the License at
#
#  http://www.apache.org/licenses/LICENSE-2.0
#
#  Unless required by applicable law or agreed to in writing, software
#  distributed under the License is distributed on an "AS IS" BASIS,
#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
#  implied.
#  See the License for the specific language governing permissions and
#  limitations under the License.
#
# IBM_PROLOG_END_TAG

import logging as logger
import sys

import vapi
import vapi_cli.cli_utils as cli_utils
from vapi_cli.cli_utils import reportSuccess, reportApiError, translate_flags

# All of Vision Tools requires python 3.6 due to format string
# Make the check in a common location
if sys.hexversion < 0x03060000:
    sys.exit("Python 3.6 or newer is required to run this program.")

token_usage = """
Usage:
  users token --user=<user-name> --password=<password>

Where:
  --user      Required parameter containing the user login name
  --password  Required parameter containing the user's password

Gets an authentication token for the given user"""

server = None


# --- Token Operation ----------------------------------------------
def token(params):
    """Handles getting an authentication token for a specific user"""
    user = params.get("--user", None)
    pw = params.get("--password", None)

    rsp = server.users.get_token(user, pw)
    if rsp is None or rsp.get("result", "fail") == "fail":
        reportApiError(server, f"Failed to get token for user '{user}'")
    else:
        reportSuccess(server, rsp["token"])


cmd_usage = f"""
Usage:  users {cli_utils.common_cmd_flags} <operation> [<args>...]

Where:
{cli_utils.common_cmd_flag_descriptions}

   <operation> is required and must be one of:
      token -- gets an authentication token for the given user

Use 'users <operation> --help' for more information on a specific command."""

usage_stmt = {
    "usage": cmd_usage,
    "token": token_usage
}

operation_map = {
    "token": token
}


def main(params, cmd_flags=None):
    global server

    args = cli_utils.get_valid_input(usage_stmt, operation_map, argv=params, cmd_flags=cmd_flags)
    if args is not None:
        # When requesting a token, we need to ignore any existing token info
        if args.cmd_params["<operation>"] == "token":
            cli_utils.token = ""
        try:
            server = vapi.connect_to_server(cli_utils.host_name, cli_utils.token)
        except Exception as e:
            print("Error: Failed to setup server.", file=sys.stderr)
            logger.debug(e)
            return 1

        args.operation(args.op_params)


if __name__ == "__main__":
    main(None)
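The `usage_stmt`/`operation_map` pair is a small command-dispatch pattern: look the operation name up in a dict of handlers and fall back to usage text when it is unknown. A stdlib-only sketch of the idea (the names below are illustrative, not the vapi_cli API):

```python
def token_op(params):
    """Toy handler standing in for the real `token` operation."""
    return f"token for {params.get('--user')}"

operations = {"token": token_op}

def dispatch(operation, params):
    handler = operations.get(operation)
    if handler is None:
        return "usage: users <operation> [<args>...]"
    return handler(params)

assert dispatch("token", {"--user": "alice"}) == "token for alice"
assert dispatch("bogus", {}).startswith("usage:")
```

Keeping the map as data (rather than an if/elif chain) means adding an operation is one dict entry plus one function.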
| 29.816327 | 97 | 0.687543 | 411 | 2,922 | 4.766423 | 0.425791 | 0.036753 | 0.032159 | 0.036753 | 0.075549 | 0.040837 | 0.040837 | 0.040837 | 0 | 0 | 0 | 0.011583 | 0.202259 | 2,922 | 97 | 98 | 30.123711 | 0.828829 | 0.326489 | 0 | 0.038462 | 0 | 0 | 0.382156 | 0.0459 | 0 | 0 | 0.005157 | 0 | 0 | 1 | 0.038462 | false | 0.057692 | 0.096154 | 0 | 0.153846 | 0.019231 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6f2fda5d1a7f7912eef13fc0ff8b8f413ac5c9a7 | 1,373 | py | Python | corehq/form_processor/migrations/0049_case_attachment_props.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2020-05-05T13:10:01.000Z | 2020-05-05T13:10:01.000Z | corehq/form_processor/migrations/0049_case_attachment_props.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2019-12-09T14:00:14.000Z | 2019-12-09T14:00:14.000Z | corehq/form_processor/migrations/0049_case_attachment_props.py | MaciejChoromanski/commcare-hq | fd7f65362d56d73b75a2c20d2afeabbc70876867 | [
"BSD-3-Clause"
] | 5 | 2015-11-30T13:12:45.000Z | 2019-07-01T19:27:07.000Z |
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from __future__ import absolute_import

from django.db import models, migrations
import jsonfield.fields


class Migration(migrations.Migration):

    dependencies = [
        ('form_processor', '0048_attachment_content_length_blob_id'),
    ]

    operations = [
        migrations.AddField(
            model_name='xformattachmentsql',
            name='properties',
            field=jsonfield.fields.JSONField(default=dict),
            preserve_default=True,
        ),
        migrations.AddField(
            model_name='caseattachmentsql',
            name='attachment_from',
            field=models.TextField(null=True),
            preserve_default=True,
        ),
        migrations.AddField(
            model_name='caseattachmentsql',
            name='properties',
            field=jsonfield.fields.JSONField(default=dict),
            preserve_default=True,
        ),
        migrations.AddField(
            model_name='caseattachmentsql',
            name='attachment_src',
            field=models.TextField(null=True),
            preserve_default=True,
        ),
        migrations.AddField(
            model_name='caseattachmentsql',
            name='identifier',
            field=models.CharField(default='', max_length=255),
            preserve_default=False,
        ),
    ]
| 29.212766 | 69 | 0.600874 | 117 | 1,373 | 6.803419 | 0.393162 | 0.113065 | 0.144472 | 0.169598 | 0.562814 | 0.562814 | 0.562814 | 0.562814 | 0.562814 | 0.562814 | 0 | 0.008325 | 0.300073 | 1,373 | 46 | 70 | 29.847826 | 0.819979 | 0.015295 | 0 | 0.6 | 0 | 0 | 0.145926 | 0.028148 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.175 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
6f35ce7e4cec8e809fb6bd6d1db0395eade06403 | 633 | py | Python | misc/fill_blanks.py | netotz/codecamp | ff6b5ce1af1d99bbb00f7e095ca6beac92020b1c | [
"Unlicense"
] | null | null | null | misc/fill_blanks.py | netotz/codecamp | ff6b5ce1af1d99bbb00f7e095ca6beac92020b1c | [
"Unlicense"
] | null | null | null | misc/fill_blanks.py | netotz/codecamp | ff6b5ce1af1d99bbb00f7e095ca6beac92020b1c | [
"Unlicense"
] | 1 | 2020-04-05T06:22:18.000Z | 2020-04-05T06:22:18.000Z |
# Given an array containing None values fill in the None values with most recent
# non None value in the array
from random import random


def generate_sample(n):
    rand = 0.9
    while n:
        yield int(rand * 10) if rand % 1 > 1 / 3 else None
        rand = random()
        n -= 1


def fill1(array):
    for i in range(len(array)):
        if array[i] is None:
            array[i] = array[i - 1]
    return array


def fill2(array):
    for i, num in enumerate(array):
        if num is None:
            array[i] = array[i - 1]
    return array


test = list(map(int, input().split()))
print(fill1(test))
print(fill2(test))
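Both fills copy from index `i - 1`, so a leading `None` wraps around under Python's negative indexing and copies `array[-1]`. A variant that carries the last seen value explicitly (with a chosen default for leading `None`s) avoids that edge case; this is an illustrative alternative, not part of the original script:

```python
def fill_forward(array, leading=0):
    """Replace each None with the most recent non-None value.

    `leading` is used before any non-None value has been seen, so a
    leading None never wraps around to the end of the list.
    """
    last = leading
    out = []
    for item in array:
        if item is None:
            out.append(last)
        else:
            last = item
            out.append(item)
    return out

assert fill_forward([1, None, None, 4, None]) == [1, 1, 1, 4, 4]
assert fill_forward([None, 2, None]) == [0, 2, 2]  # leading None takes the default
```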
| 22.607143 | 81 | 0.593997 | 101 | 633 | 3.712871 | 0.475248 | 0.08 | 0.048 | 0.064 | 0.16 | 0.16 | 0.16 | 0.16 | 0.16 | 0 | 0 | 0.031461 | 0.296998 | 633 | 27 | 82 | 23.444444 | 0.811236 | 0.169036 | 0 | 0.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.05 | 0 | 0.3 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f3d9e5be4e02104620356819d1fd22753eef212 | 3,349 | py | Python | dbSchema.py | zikasak/ReadOnlyBot | 912403a5d6386c1ce691bbe22dad660af49b26e8 | [
"MIT"
] | 1 | 2020-12-17T20:50:29.000Z | 2020-12-17T20:50:29.000Z | dbSchema.py | zikasak/ReadOnlyBot | 912403a5d6386c1ce691bbe22dad660af49b26e8 | [
"MIT"
] | null | null | null | dbSchema.py | zikasak/ReadOnlyBot | 912403a5d6386c1ce691bbe22dad660af49b26e8 | [
"MIT"
] | null | null | null | import datetime
from sqlalchemy import Column, Integer, Boolean, ForeignKey, String, DateTime, UniqueConstraint, ForeignKeyConstraint
from sqlalchemy.orm import relationship
from dbConfig import Base, engine
class GroupStatus(Base):
__tablename__ = "groupstatus"
id = Column(Integer, primary_key=True)
status = Column(Boolean, default=False)
wel_message = Column(String)
new_users_blocked = Column(Boolean, default=False)
time_to_mute = Column(Integer, default=30)
messages = relationship("GroupMessage", cascade="save-update, merge, delete, delete-orphan")
banned_users = relationship("BannedUser", cascade="save-update, merge, delete, delete-orphan")
mutted_users = relationship("MutedUser", backref="chat", cascade="save-update, merge, delete, delete-orphan")
blocked_phrases = relationship("BlockedPhrases", backref="chat", cascade="save-update, merge, delete, delete-orphan")
def add_muted(self, user_id, message_id):
m = MutedUser()
m.chat_id = self.id
        m.user_id = user_id
        m.welcome_msg_id = message_id
        m.mute_date = datetime.datetime.utcnow()
        if m not in self.mutted_users:
            self.mutted_users.append(m)


class GroupMessage(Base):
    __tablename__ = "groupmessage"

    chat_id = Column(Integer, ForeignKey("groupstatus.id"), primary_key=True)
    message = Column(String)
    command = Column(String, primary_key=True)
    description = Column(String, default="")

    UniqueConstraint('chat_id', 'command')

    def __repr__(self):
        return "{!r} - {!r}".format(self.command, self.description)


class MutedUser(Base):
    __tablename__ = "muted"

    chat_id = Column(Integer, ForeignKey("groupstatus.id"), primary_key=True)
    user_id = Column(Integer, primary_key=True)
    # NOTE: pass the callable itself (utcnow, not utcnow()) so the default is
    # evaluated per INSERT instead of once at import time.
    mute_date = Column(DateTime(timezone=True), nullable=False, default=datetime.datetime.utcnow)
    welcome_msg_id = Column(Integer, nullable=False)
    time_messages = relationship("TimeExceededMessage", cascade="save-update, merge, delete, delete-orphan",
                                 primaryjoin="and_(MutedUser.chat_id==TimeExceededMessage.chat_id, "
                                             "MutedUser.user_id==TimeExceededMessage.user_id)")

    def __eq__(self, obj: object) -> bool:
        if type(obj) != MutedUser:
            return super().__eq__(obj)
        return (self.chat_id == obj.chat_id) and (self.user_id == obj.user_id)


class TimeExceededMessage(Base):
    __tablename__ = "mutedMessages"

    id = Column(Integer, primary_key=True)
    chat_id = Column(Integer)
    user_id = Column(Integer)
    welcome_msg_id = Column(Integer, ForeignKey("muted.welcome_msg_id"))
    msg_id = Column(Integer, nullable=False)

    __table_args__ = (ForeignKeyConstraint([chat_id, user_id], [MutedUser.chat_id, MutedUser.user_id]), {})


class BannedUser(Base):
    __tablename__ = "bannedusers"

    chat_id = Column(Integer, ForeignKey("groupstatus.id"), primary_key=True)
    user_id = Column(Integer, primary_key=True)
    username = Column(String)
    reason = Column(String)


class BlockedPhrases(Base):
    __tablename__ = "blockedPhrases"

    id = Column(Integer, primary_key=True)
    chat_id = Column(Integer, ForeignKey("groupstatus.id"))
    blockedPhrase = Column(String, nullable=False)


Base.metadata.create_all(engine)
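A subtle point in column defaults like `mute_date` above: passing `datetime.datetime.utcnow()` (called) evaluates the timestamp once at import time, while passing the callable lets every new row get a fresh value. A minimal stdlib sketch of the difference (`FakeColumn` is a hypothetical stand-in for SQLAlchemy's per-insert handling of callable defaults, not SQLAlchemy itself):

```python
import datetime
import time

class FakeColumn:
    """Hypothetical stand-in: ORMs invoke callable defaults per INSERT,
    while plain values are reused as-is for every row."""
    def __init__(self, default):
        self.default = default

    def value_for_new_row(self):
        return self.default() if callable(self.default) else self.default

frozen = FakeColumn(default=datetime.datetime.utcnow())   # evaluated once, right here
per_row = FakeColumn(default=datetime.datetime.utcnow)    # evaluated for every new row

t0 = frozen.value_for_new_row()
time.sleep(0.01)
assert frozen.value_for_new_row() == t0   # stuck at definition time forever
assert per_row.value_for_new_row() > t0   # fresh timestamp on each call
```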
# File: tests/ws/TestWebsocketRegisterAgent.py (repo: sinri/nehushtan, license: MIT)
import uuid
from typing import Dict, List

from nehushtan.ws.NehushtanWebsocketConnectionEntity import NehushtanWebsocketConnectionEntity


class TestWebsocketRegisterAgent:
    def __init__(self):
        self.__map: Dict[str, NehushtanWebsocketConnectionEntity] = {}
        self.agent_identity = str(uuid.uuid4())

    def register(self, websocket):
        entity = NehushtanWebsocketConnectionEntity(websocket)
        self.__map[entity.get_key()] = entity
        print(f"TestWebsocketRegisterAgent[{self.agent_identity}] registered [{entity.get_key()}]")
        return entity

    def unregister(self, key: str):
        if self.__map.get(key):
            del self.__map[key]
            print(f"TestWebsocketRegisterAgent[{self.agent_identity}] unregistered [{key}]")

    def read(self, key: str):
        print(f"TestWebsocketRegisterAgent[{self.agent_identity}] reading [{key}]")
        return self.__map.get(key)

    def list_for_server(self, local_key: str) -> List[NehushtanWebsocketConnectionEntity]:
        print(f"TestWebsocketRegisterAgent[{self.agent_identity}] listing for [{local_key}]")
        entities = []
        for k, v in self.__map.items():
            if v.get_local_key() == local_key:
                entities.append(v)
        return entities
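The register/read/unregister lifecycle above can be exercised without a live websocket by substituting a minimal stand-in for the connection entity (`StubEntity` below is hypothetical; the real `NehushtanWebsocketConnectionEntity` wraps an actual socket):

```python
import uuid
from typing import Dict

class StubEntity:
    """Hypothetical stand-in exposing the two accessors the agent relies on."""
    def __init__(self, local_key: str):
        self._key = str(uuid.uuid4())
        self._local_key = local_key

    def get_key(self) -> str:
        return self._key

    def get_local_key(self) -> str:
        return self._local_key

registry: Dict[str, StubEntity] = {}
e = StubEntity("server-1")
registry[e.get_key()] = e                  # register
assert registry.get(e.get_key()) is e      # read
del registry[e.get_key()]                  # unregister
assert registry.get(e.get_key()) is None   # read after unregister misses
```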
# File: src/geo_testing/test_scripts/psgs_big.py (repo: hpgl/hpgl, license: BSD-3-Clause)
#
#
# Copyright 2009 HPGL Team
#
# This file is part of HPGL (High Performance Geostatistics Library).
#
# HPGL is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, version 2 of the License.
#
# HPGL is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with HPGL. If not, see http://www.gnu.org/licenses/.
#

from geo import *
from sys import *
import os
import time

if not os.path.exists("results/"):
    os.mkdir("results/")
if not os.path.exists("results/medium/"):
    os.mkdir("results/medium/")

#grid = SugarboxGrid(166, 141, 225)
#prop = load_cont_property("test_data/BIG_HARD_DATA.INC", -99)

grid = SugarboxGrid(166, 141, 20)
prop = load_cont_property("test_data/BIG_SOFT_DATA_CON_160_141_20.INC", -99)

sgs_params = {
    "prop": prop,
    "grid": grid,
    "seed": 3439275,
    "kriging_type": "sk",
    "radiuses": (20, 20, 20),
    "max_neighbours": 12,
    "covariance_type": covariance.exponential,
    "ranges": (10, 10, 10),
    "sill": 0.4
}

for x in xrange(1):
    time1 = time.time()
    psgs_result = sgs_simulation(workers_count = x+2, use_new_psgs = True, **sgs_params)
    time2 = time.time()
    print "Workers: %s" % (x+2)
    print "Time: %s" % (time2 - time1)

write_property(psgs_result, "results/medium/PSGS_workers_1.inc", "PSIS_MEDIUM_workers_1", -99)
# File: partycipe/migrations/0001_initial.py (repo: spexxsoldier51/PartyCipe, license: CC0-1.0)
# Generated by Django 4.0.3 on 2022-04-02 17:32
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='cocktail',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('last_updated', models.DateTimeField(auto_now=True)),
                ('created', models.DateTimeField(auto_now_add=True)),
                ('id_api', models.PositiveIntegerField()),
            ],
        ),
        migrations.CreateModel(
            name='party',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('created', models.DateTimeField(auto_now_add=True)),
                ('paypal', models.URLField()),
                ('name', models.CharField(max_length=50)),
                ('resume', models.CharField(max_length=500)),
                ('place', models.CharField(max_length=150)),
                ('datehour', models.DateTimeField()),
                ('last_updated', models.DateTimeField(auto_now=True)),
                ('price', models.FloatField()),
                ('drink', models.ManyToManyField(to='partycipe.cocktail')),
                ('organisate', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
        migrations.CreateModel(
            name='participate',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('last_updated', models.DateTimeField(auto_now=True)),
                ('created', models.DateTimeField(auto_now_add=True)),
                ('etat', models.BooleanField()),
                ('party', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='partycipe.party')),
                ('utilisateur', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]
# File: synthesizing/gui/python-portmidi-0.0.7/test_pyportmidi.py (repo: Chiel92/MusicTheory, license: MIT)
#!/usr/bin/env python
# test code for PyPortMidi
# a port of a subset of test.c provided with PortMidi
# John Harrison
# harrison [at] media [dot] mit [dot] edu
# March 15, 2005: accommodate for SysEx messages and preferred list formats
#                 SysEx test code contributed by Markus Pfaff
# February 27, 2005: initial release

import pypm
import array
import time

NUM_MSGS = 100  # number of MIDI messages for input before closing
INPUT = 0
OUTPUT = 1


def PrintDevices(InOrOut):
    for loop in range(pypm.CountDevices()):
        interf, name, inp, outp, opened = pypm.GetDeviceInfo(loop)
        if ((InOrOut == INPUT) & (inp == 1) |
                (InOrOut == OUTPUT) & (outp == 1)):
            print loop, name, " ",
            if (inp == 1): print "(input) ",
            else: print "(output) ",
            if (opened == 1): print "(opened)"
            else: print "(unopened)"
    print


def TestInput():
    PrintDevices(INPUT)
    dev = int(raw_input("Type input number: "))
    MidiIn = pypm.Input(dev)
    print "Midi Input opened. Reading ", NUM_MSGS, " Midi messages..."
    # MidiIn.SetFilter(pypm.FILT_ACTIVE | pypm.FILT_CLOCK)
    for cntr in range(1, NUM_MSGS + 1):
        while not MidiIn.Poll(): pass
        MidiData = MidiIn.Read(1)  # read only 1 message at a time
        print "Got message ", cntr, ": time ", MidiData[0][1], ", ",
        print MidiData[0][0][0], " ", MidiData[0][0][1], " ", MidiData[0][0][2], MidiData[0][0][3]
        # NOTE: most Midi messages are 1-3 bytes, but the 4th byte is returned for use with SysEx messages.
    del MidiIn


def TestOutput():
    latency = int(raw_input("Type latency: "))
    print
    PrintDevices(OUTPUT)
    dev = int(raw_input("Type output number: "))
    MidiOut = pypm.Output(dev, latency)
    print "Midi Output opened with ", latency, " latency"
    dummy = raw_input("ready to send program 1 change... (type RETURN):")
    MidiOut.Write([[[0xc0, 0, 0], pypm.Time()]])
    dummy = raw_input("ready to note-on... (type RETURN):")
    MidiOut.Write([[[0x90, 60, 100], pypm.Time()]])
    dummy = raw_input("ready to note-off... (type RETURN):")
    MidiOut.Write([[[0x90, 60, 0], pypm.Time()]])
    dummy = raw_input("ready to note-on (short form)... (type RETURN):")
    MidiOut.WriteShort(0x90, 60, 100)
    dummy = raw_input("ready to note-off (short form)... (type RETURN):")
    MidiOut.WriteShort(0x90, 60, 0)
    print
    print "chord will arpeggiate if latency > 0"
    dummy = raw_input("ready to chord-on/chord-off... (type RETURN):")
    chord = [60, 67, 76, 83, 90]
    ChordList = []
    MidiTime = pypm.Time()
    for i in range(len(chord)):
        ChordList.append([[0x90, chord[i], 100], MidiTime + 1000 * i])
    MidiOut.Write(ChordList)
    while pypm.Time() < MidiTime + 1000 + len(chord) * 1000: pass
    ChordList = []
    # seems a little odd that they don't update MidiTime here...
    for i in range(len(chord)):
        ChordList.append([[0x90, chord[i], 0], MidiTime + 1000 * i])
    MidiOut.Write(ChordList)
    print("Sending SysEx messages...")
    # sending with timestamp = 0 should be the same as sending with
    # timestamp = pypm.Time()
    dummy = raw_input("ready to send a SysEx string with timestamp = 0 ... (type RETURN):")
    MidiOut.WriteSysEx(0, '\xF0\x7D\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1A\xF7')
    dummy = raw_input("ready to send a SysEx list with timestamp = pypm.Time() ... (type RETURN):")
    MidiOut.WriteSysEx(pypm.Time(), [0xF0, 0x7D, 0x10, 0x11, 0x12, 0x13, 0xF7])
    dummy = raw_input("ready to close and terminate... (type RETURN):")
    del MidiOut


# main code begins here
pypm.Initialize()  # always call this first, or OS may crash when you try to open a stream
x = 0
while (x < 1) | (x > 2):
    print """
enter your choice...
1: test input
2: test output
"""
    x = int(raw_input())

if x == 1: TestInput()
else: TestOutput()

pypm.Terminate()
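The arpeggiation in `TestOutput` works because each note-on in the chord is stamped 1000 ms (PortMidi ticks) after the previous one; with latency > 0 the device honors those timestamps. The schedule construction itself needs no hardware to verify (Python 3 here, with `MidiTime` fixed at 0 in place of `pypm.Time()` for illustration):

```python
chord = [60, 67, 76, 83, 90]
MidiTime = 0  # stand-in for pypm.Time()

# same [[status, note, velocity], timestamp] layout pypm.Output.Write expects
ChordList = [[[0x90, note, 100], MidiTime + 1000 * i] for i, note in enumerate(chord)]

assert ChordList[0] == [[0x90, 60, 100], 0]      # first note fires immediately
assert ChordList[-1] == [[0x90, 90, 100], 4000]  # last note scheduled 4 s later
```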
# File: ck_airport.py (repo: 58565856/checkinpanel, license: MIT)
# -*- coding: utf-8 -*-
"""
:author @Icrons
cron: 20 10 * * *
new Env('机场签到');
"""
import json
import re
import traceback
import requests
import urllib3
from notify_mtr import send
from utils import get_data
urllib3.disable_warnings()
class SspanelQd(object):
def __init__(self, check_items):
self.check_items = check_items
@staticmethod
def checkin(url, email, password):
url = url.rstrip("/")
email = email.split("@")
if len(email) > 1:
email = email[0] + "%40" + email[1]
else:
email = email[0]
session = requests.session()
"""
以下 except 都是用来捕获当 requests 请求出现异常时,
通过捕获然后等待网络情况的变化,以此来保护程序的不间断运行
"""
try:
session.get(url, verify=False)
except requests.exceptions.ConnectionError:
msg = url + "\n" + "网络不通"
return msg
except requests.exceptions.ChunkedEncodingError:
msg = url + "\n" + "分块编码错误"
return msg
except Exception:
msg = url + "\n" + "未知错误,请查看日志"
print(f"未知错误,错误信息:\n{traceback.format_exc()}")
return msg
login_url = url + "/auth/login"
headers = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36",
"Content-Type": "application/x-www-form-urlencoded; charset=UTF-8",
}
post_data = "email=" + email + "&passwd=" + password + "&code="
post_data = post_data.encode()
try:
res = session.post(login_url, post_data, headers=headers, verify=False)
res_str = res.text.encode("utf-8").decode("unicode_escape")
print(f"{url} 接口登录返回信息:{res_str}")
res_dict = json.loads(res_str)
if res_dict.get("ret") == 0:
msg = url + "\n" + str(res_dict.get("msg"))
return msg
except Exception:
msg = url + "\n" + "登录失败,请查看日志"
print(f"登录失败,错误信息:\n{traceback.format_exc()}")
return msg
headers = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36",
"Referer": url + "/user",
}
try:
response = session.post(
url + "/user/checkin", headers=headers, verify=False
)
res_str = response.text.encode("utf-8").decode("unicode_escape")
print(f"{url} 接口签到返回信息:{res_str}")
res_dict = json.loads(res_str)
check_msg = res_dict.get("msg")
if check_msg:
msg = url + "\n" + str(check_msg)
else:
msg = url + "\n" + str(res_dict)
except Exception:
msg = url + "\n" + "签到失败,请查看日志"
print(f"签到失败,错误信息:\n{traceback.format_exc()}")
info_url = url + "/user"
response = session.get(info_url, verify=False)
"""
以下只适配了editXY主题
"""
try:
level = re.findall(r'\["Class", "(.*?)"],', response.text)[0]
day = re.findall(r'\["Class_Expire", "(.*)"],', response.text)[0]
rest = re.findall(r'\["Unused_Traffic", "(.*?)"]', response.text)[0]
msg = (
url
+ "\n- 今日签到信息:"
+ str(msg)
+ "\n- 用户等级:"
+ str(level)
+ "\n- 到期时间:"
+ str(day)
+ "\n- 剩余流量:"
+ str(rest)
)
except Exception:
pass
return msg
def main(self):
msg_all = ""
for check_item in self.check_items:
# 机场地址
url = str(check_item.get("url"))
# 登录信息
email = str(check_item.get("email"))
password = str(check_item.get("password"))
if url and email and password:
msg = self.checkin(url=url, email=email, password=password)
else:
msg = "配置错误"
msg_all += msg + "\n\n"
return msg_all
if __name__ == "__main__":
data = get_data()
_check_items = data.get("AIRPORT", [])
res = SspanelQd(check_items=_check_items).main()
send("机场签到", res)
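The manual `%40` splice in `checkin` percent-encodes the `@` of the e-mail address for the form body; the stdlib does the same encoding more generally (the address below is made up):

```python
from urllib.parse import quote

email = "user@example.com"  # hypothetical address
encoded = quote(email, safe="")

# '@' is never in quote()'s safe set, so it always becomes %40;
# unreserved characters like '.' pass through unchanged.
assert encoded == "user%40example.com"
```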
# File: setup.py (repo: astrodeepnet/sbi_experiments, license: MIT)
from setuptools import setup, find_packages
setup(
    name='SBIExperiments',
    version='0.0.1',
    url='https://github.com/astrodeepnet/sbi_experiments',
    author='Justine Zeghal and friends',
    description='Package for numerical experiments of SBI tools',
    packages=find_packages(),
    install_requires=[
        'numpy>=1.19.2',
        'jax>=0.2.0',
        'tensorflow_probability>=0.14.1',
        'scikit-learn>=0.21',
        'jaxopt>=0.2'
    ],
)
# File: serialTest.py (repo: fmuno003/SeniorDesign, license: BSD-3-Clause)
import serial
import RPi.GPIO as GPIO
import time

ser = serial.Serial("/dev/ttyACM0", 9600)
start_time = time.time()
imu = open("IMU.txt", "w")

# discard the first second of readings while the sensor settles
while time.time() - start_time <= 1:
    ser.readline()

# for the next seven seconds, log every non-zero reading
while time.time() - start_time <= 8:
    read_ser = ser.readline()
    if float(read_ser) == 0.00:
        pass
    else:
        read = read_ser.strip('\n')
        imu.write(read)
        imu.write('\n')

imu.close()
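The logging loop drops readings that parse to exactly 0.00; that predicate can be checked without the serial port by feeding canned lines (the sample values below are made up):

```python
def keep_reading(line: str) -> bool:
    # mirrors the filter in the loop above: drop exact-zero readings
    return float(line) != 0.0

samples = ["0.00\n", "1.57\n", "0.00\n", "-0.42\n"]
kept = [s.strip('\n') for s in samples if keep_reading(s)]
assert kept == ["1.57", "-0.42"]
```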
# File: linkit/models.py (repo: what-digital/linkit, license: MIT)
from django.db import models
from filer.fields.file import FilerFileField


class FakeLink(models.Model):
    """
    In our widget we need to manually render an AdminFileFormField. Basically for every other field type this is not
    a problem at all, but Filer needs a rel attribute which consists of a reverse relationship. We fake it
    with this model.
    """
    fake_file = FilerFileField(blank=True, null=True, on_delete=models.CASCADE)
# File: ecommerce/User/admin.py (repo: AwaleRohin/commerce-fm, license: MIT)
from django.contrib import admin
from django.conf import settings
from django.core.exceptions import ImproperlyConfigured

from . import models

if settings.HAS_ADDITIONAL_USER_DATA:
    try:
        class UserProfileInline(admin.TabularInline):
            model = models.UserProfile
            extra = 0
    except (Exception, KeyError) as e:
        raise ImproperlyConfigured("User/admin.py:: Multi Vendor is turned on.")


class UserAdmin(admin.ModelAdmin):
    list_display = ['get_full_name', 'email', 'is_verified']
    search_fields = ['get_full_name', 'email', 'date_joined', 'username']
    list_filter = ('groups',)

    if settings.HAS_ADDITIONAL_USER_DATA:
        inlines = [UserProfileInline, ]

    def save_model(self, request, obj, form, change):
        # hash the submitted password instead of saving it verbatim
        if 'password' in form.changed_data:
            obj.set_password(request.POST['password'])
        obj.save()


admin.site.register(models.User, UserAdmin)
admin.site.register(models.IpAddress)
admin.site.register(models.CityFromIpAddress)
admin.site.register(models.Marketing)

# File: python/svm.py (repo: mwalton/em-machineLearning, license: MIT)
import numpy as np
import argparse
import os.path
import plots as plot
from sklearn.preprocessing import StandardScaler
from sklearn.grid_search import GridSearchCV
import time
from sklearn import svm
from sklearn.metrics import classification_report
from sklearn.metrics import accuracy_score
from sklearn.externals import joblib
from sklearn.cross_validation import StratifiedKFold


def loadData(XPath, yPath):
    X = np.genfromtxt(XPath, delimiter=",", dtype="float32")
    y = np.genfromtxt(yPath, delimiter=",", dtype="float32")
    return (X, y)


def convertToClasses(targetVector):
    return np.argmax(targetVector[:, 1:5], axis=1)


def standardize(featureVector):
    scaler = StandardScaler()
    return scaler.fit_transform(featureVector)


# construct the argument parser and parse the arguments
ap = argparse.ArgumentParser()
ap.add_argument("-x", "--xTrain", required = True,
    help = "path to training feature set")
ap.add_argument("-y", "--yTrain", required = True,
    help = "path to training target set")
ap.add_argument("-X", "--xTest", required = True,
    help = "path to testing feature set")
ap.add_argument("-Y", "--yTest", required = True,
    help = "path to testing target set")
ap.add_argument("-o", "--optimize", type = int, default = 0,
    help = "optimization mode: 0 use default, 1 optimize, 2 use pkl model if possible")
ap.add_argument("-m", "--multiClass", type = int, default=1,
    help = "exclusive multi class or regression")
ap.add_argument("-p", "--pickle", default="models/svmModel.pkl",
    help = "pickle dump of model (output if optimize = 1, input if optimize = 0)")
ap.add_argument("-v", "--visualize", type=int, default=0,
    help = "whether or not to show visualizations after a run")
args = vars(ap.parse_args())

(trainX, trainY) = loadData(args["xTrain"], args["yTrain"])
(testX, testY) = loadData(args["xTest"], args["yTest"])

# required scaling for SVM
trainX = standardize(trainX)
testX = standardize(testX)

if (args["multiClass"] == 1):
    trainY = convertToClasses(trainY)
    testY = convertToClasses(testY)

# check to see if a grid search should be done
if args["optimize"] == 1:
    # configure stratified k-fold cross validation
    cv = StratifiedKFold(y=trainY, n_folds=4, shuffle=True)

    # perform a grid search on the 'C' and 'gamma' parameter
    # of SVM
    print "SEARCHING SVM"
    C_range = 2. ** np.arange(-15, 15, step=1)
    gamma_range = 2. ** np.arange(-15, 15, step=1)
    param_grid = dict(gamma=gamma_range, C=C_range)

    start = time.time()
    gs = GridSearchCV(svm.SVC(), param_grid=param_grid, cv=cv, n_jobs = -1, verbose = 2)
    gs.fit(trainX, trainY)

    # print diagnostic information to the user and grab the
    # best model
    print "done in %0.3fs" % (time.time() - start)
    print "best score: %0.3f" % (gs.best_score_)
    print "SVM PARAMETERS"
    bestParams = gs.best_estimator_.get_params()

    # loop over the parameters and print each of them out
    # so they can be manually set
    print("Best Estimator: %s" % gs.best_estimator_)
    #for p in sorted(params.keys()):
    #    print "\t %s: %f" % (p, bestParams[p])
    print("Accuracy Score On Validation Set: %s\n" % accuracy_score(testY, gs.predict(testX)))

    # show a reminder message
    print "\nIMPORTANT"
    print "Now that your parameters have been searched, manually set"
    print "them and re-run this script with --optimize 0"

    joblib.dump(gs.best_estimator_, args["pickle"])

# otherwise, use the manually specified parameters
else:
    # evaluate using SVM
    if (os.path.isfile(args["pickle"]) and args["optimize"] == 2):
        clf = joblib.load(args["pickle"])
    else:
        clf = svm.SVC()
        clf.fit(trainX, trainY)

    print "SVM PERFORMANCE"
    pred = clf.predict(testX)
    print classification_report(testY, pred)
    print("Accuracy Score: %s\n" % accuracy_score(testY, pred))

    if (args["visualize"] == 1):
        plot.accuracy(testY, pred, "SVM")
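`standardize` relies on `StandardScaler`, which centers each feature and divides by its (population) standard deviation; the equivalent numpy arithmetic makes the effect explicit (the toy matrix below is made up):

```python
import numpy as np

X = np.array([[1., 10.], [2., 20.], [3., 30.]])

# same computation StandardScaler.fit_transform performs per column
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

assert np.allclose(Xs.mean(axis=0), 0.0)  # zero mean per feature
assert np.allclose(Xs.std(axis=0), 1.0)   # unit variance per feature
```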
# File: safe_control_gym/math_and_models/normalization.py (repo: catgloss/safe-control-gym, license: MIT)
"""Perform normalization on inputs or rewards.
"""
import numpy as np
import torch
from gym.spaces import Box
def normalize_angle(x):
"""Wraps input angle to [-pi, pi].
"""
return ((x + np.pi) % (2 * np.pi)) - np.pi
class RunningMeanStd():
"""Calulates the running mean and std of a data stream.
Attributes:
mean (np.array): mean of data stream.
var (np.array): variance of data stream.
count (float): total count of data steam.
"""
def __init__(self, epsilon=1e-4, shape=()):
"""Initializes containers for data mean and variance.
Args:
epsilon (float): helps with arithmetic issues.
shape (tuple): the shape of the data stream's output.
"""
self.mean = np.zeros(shape, np.float64)
self.var = np.ones(shape, np.float64)
self.count = epsilon
def update(self, arr):
"""Update current stats with a new stream of data.
Args:
arr (np.array): 1D array of data, (batch_size, *shape).
"""
batch_mean = np.mean(arr, axis=0)
batch_var = np.var(arr, axis=0)
batch_count = arr.shape[0]
self.update_from_moments(batch_mean, batch_var, batch_count)
def update_from_moments(self, batch_mean, batch_var, batch_count):
"""Util function for `update` method.
"""
delta = batch_mean - self.mean
tot_count = self.count + batch_count
new_mean = self.mean + delta * batch_count / tot_count
m_a = self.var * self.count
m_b = batch_var * batch_count
m_2 = m_a + m_b + np.square(delta) * self.count * batch_count / (self.count + batch_count)
new_var = m_2 / (self.count + batch_count)
new_count = batch_count + self.count
self.mean = new_mean
self.var = new_var
self.count = new_count
class BaseNormalizer(object):
"""Template/default normalizer.
Attributes:
read_only (bool): if to freeze the current stats being tracked.
"""
def __init__(self, read_only=False):
self.read_only = read_only
def set_read_only(self):
self.read_only = True
def unset_read_only(self):
self.read_only = False
def __call__(self, x, *args, **kwargs):
"""Invokes normalization on the given input.
"""
return x
def state_dict(self):
"""Returns snapshot of current stats.
"""
return {}
def load_state_dict(self, _):
"""Restores the stats from a snapshot.
"""
pass
class MeanStdNormalizer(BaseNormalizer):
"""Normalize by the running average.
"""
def __init__(self, shape=(), read_only=False, clip=10.0, epsilon=1e-8):
"""Initializes the data stream tracker.
Args:
shape (tuple): shape of data being tracked.
read_only (bool): if to freeze the tracker.
clip (float): bounds on the data.
epsilon (float): offset to provide divide-by-zero.
"""
super().__init__(read_only)
self.read_only = read_only
self.rms = RunningMeanStd(shape=shape)
self.clip = clip
self.epsilon = epsilon
def __call__(self, x):
"""Update tracker given data, optionally normalize the data.
"""
x = np.asarray(x)
if not self.read_only:
self.rms.update(x)
return np.clip(
(x - self.rms.mean) / np.sqrt(self.rms.var + self.epsilon),
-self.clip, self.clip)
def state_dict(self):
return {'mean': self.rms.mean, 'var': self.rms.var}
def load_state_dict(self, saved):
self.rms.mean = saved['mean']
self.rms.var = saved['var']
class RewardStdNormalizer(MeanStdNormalizer):
    """Reward normalization by running average of returns.

    Papers:
        * arxiv.org/pdf/1808.04355.pdf
        * arxiv.org/pdf/1810.12894.pdf

    Also see:
        * github.com/openai/baselines/issues/538
    """

    def __init__(self, gamma=0.99, read_only=False, clip=10.0, epsilon=1e-8):
        """Initializes the data stream tracker.

        Args:
            gamma (float): discount factor for rewards.
            read_only (bool): whether to freeze the tracker.
            clip (float): bounds on the data.
            epsilon (float): offset to prevent divide-by-zero.
        """
        # Reward has default shape (1,) or just ().
        super().__init__((), read_only, clip, epsilon)
        self.gamma = gamma
        self.ret = None

    def __call__(self, x, dones):
        """Updates the tracker with the given reward and returns the normalized
        reward (scaling only, no mean subtraction).
        """
        x = np.asarray(x)
        if not self.read_only:
            # Track running average of forward discounted returns.
            if self.ret is None:
                self.ret = np.zeros(x.shape[0])
            self.ret = self.ret * self.gamma + x
            self.rms.update(self.ret)
            # Prevent information leak from previous episodes. A boolean mask
            # is needed here; the original cast to np.long, which indexed by
            # position instead of masking the finished episodes.
            self.ret[np.asarray(dones, dtype=bool)] = 0
        return np.clip(x / np.sqrt(self.rms.var + self.epsilon), -self.clip, self.clip)
class RescaleNormalizer(BaseNormalizer):
    """Apply constant scaling.
    """

    def __init__(self, coef=1.0):
        """Initializes with fixed scaling constant.

        Args:
            coef (float): scaling coefficient.
        """
        # Note: the original passed `self` as the read_only argument.
        super().__init__()
        self.coef = coef

    def __call__(self, x):
        """Scale the input.
        """
        if not isinstance(x, torch.Tensor):
            x = np.asarray(x)
        return self.coef * x
class ImageNormalizer(RescaleNormalizer):
    """Scale image pixels from [0, 255] to [0, 1].
    """

    def __init__(self):
        # Note: the original passed `self` explicitly to super().__init__.
        super().__init__(1.0 / 255)
class ActionUnnormalizer(BaseNormalizer):
    """Assumes the policy output action is in [-1, 1]; unnormalizes it for the gym env.
    """

    def __init__(self, action_space):
        """Defines the mean and std for the bounded action space.
        """
        super().__init__()
        assert isinstance(action_space, Box), "action space must be gym.spaces.Box"
        low, high = action_space.low, action_space.high
        self.mean = (low + high) / 2.0
        self.std = (high - low) / 2.0

    def __call__(self, action):
        """Unnormalizes given input action.
        """
        x = np.asarray(action)
        return self.mean + x * self.std
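The normalizers above depend on a `RunningMeanStd` tracker that is not included in this chunk; only its `mean`, `var`, and `update` members are actually used. A minimal sketch of what such a tracker might look like, using the parallel-variance merge popularized by OpenAI Baselines (the update rule here is an assumption, not the original implementation):

```python
import numpy as np

class RunningMeanStd:
    """Tracks a running mean and variance over batches of data (sketch)."""

    def __init__(self, shape=()):
        self.mean = np.zeros(shape)
        self.var = np.ones(shape)
        self.count = 1e-4  # avoids division by zero on the first update

    def update(self, x):
        # Merge the batch statistics into the running statistics
        # (Chan et al. parallel algorithm).
        batch_mean = np.mean(x, axis=0)
        batch_var = np.var(x, axis=0)
        batch_count = x.shape[0]
        delta = batch_mean - self.mean
        total = self.count + batch_count
        new_mean = self.mean + delta * batch_count / total
        m_a = self.var * self.count
        m_b = batch_var * batch_count
        m2 = m_a + m_b + delta ** 2 * self.count * batch_count / total
        self.mean, self.var, self.count = new_mean, m2 / total, total

# Feeding N(5, 2^2) samples drives the tracked stats toward mean 5, var 4.
rms = RunningMeanStd(shape=(1,))
rng = np.random.RandomState(0)
for _ in range(100):
    rms.update(rng.normal(loc=5.0, scale=2.0, size=(64, 1)))
print(float(rms.mean[0]), float(rms.var[0]))  # approaches 5.0 and 4.0
```

With a tracker of this shape, `MeanStdNormalizer()` can be fed batches directly and will return clipped, standardized values.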
# =====================================================================
# networking/pycat.py  (repo: itsbriany/PythonSec, license: MIT)
# =====================================================================

#!/usr/bin/python
import socket
import threading
import sys      # Support command line args
import getopt   # Support command line option parsing
import os       # Kill the application
import signal   # Catch an interrupt
import time     # Thread sleeping

# Global variable definitions
target = ""
port = False
listen = False
command = ""
upload = False

# This tool should be able to replace netcat
# The tool should be able to act as a server and as a client depending on the arguments

###############################################################################
# Start menu
def menu():
    print "pycat, a python implementation of netcat"
    print ""
    print "Usage:"
    print ""
    print "-h, --help: Display this menu"
    print "-t, --target: The IP to bind to"
    print "-l, --listen: Listen mode (act as a server)"
    print "-p, --port: The port number to bind to"
    print "-c, --command: The command you wish to execute via pycat"
    print "-u, --upload: Set this flag to upload a file"
    print ""
    print ""
    print "By default, pycat will act as a client unless the -l flag is specified"
    print ""
    print "Examples will happen later..."
    print ""
    sys.exit(0)
###############################################################################
# Connect as a client
def connectMode(client_socket, address):
    global kill_thread
    # Get raw input which is terminated with \n
    try:
        while True:
            buffer = raw_input()
            buffer += "\n"
            if buffer == "quit\n" or buffer == "q\n":
                client_socket.close()
                sys.exit(0)
            if not client_socket:
                print "[!!] No connection on the other end!"
                client_socket.close()
                break
            client_socket.send(buffer)
    except Exception as err:
        print "[!!] Caught exception in client thread: %s!" % err
        client_socket.close()
###############################################################################
# Handle the connection from the client.
def handle_client(client_socket, address):
    print "[*] Got a connection from %s:%d" % (address[0], address[1])
    try:
        while True:
            # Wait for a response
            request = client_socket.recv(4096)
            # If the client disconnects, request is 0
            if not request:
                break
            # Output what the client has given us
            print request
        client_socket.close()
    except Exception as err:
        print "[!!] Caught exception in server thread: %s" % err
        client_socket.close()
        sys.exit(0)
###############################################################################
# This is the listening functionality of the program
def serverMode():
    global target
    global port
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    if not len(target):
        target = "0.0.0.0"
    try:
        server.bind((target, port))
    except socket.error as err:
        print err
        sys.exit(0)
    server.listen(5)
    print "[*] Listening on %s:%d" % (target, port)
    while True:
        try:
            # This will wait until we get a connection
            client, address = server.accept()
            # Create a thread to handle incoming responses
            # Daemonic threads will die as soon as the main thread dies
            listen_thread = threading.Thread(target=handle_client, args=(client, address))
            listen_thread.daemon = True
            listen_thread.start()
            # Create a thread to handle outgoing requests
            client_thread = threading.Thread(target=connectMode, args=(client, address))
            client_thread.daemon = True
            client_thread.start()
            time.sleep(1)
            '''
            The problem is that python does NOT pass by reference!
            This means that the sockets are simply copies, so closing
            the copied socket does not do anything!
            '''
        except (KeyboardInterrupt, SystemExit):
            print "Cleaning up sockets..."
            client.close()
            sys.stdout.write("Exiting from main thread...\n")
            sys.exit(0)
###############################################################################
# main definition
def main():
    global target
    global listen
    global port
    global command
    global upload
    # Set the options
    # If the options are not parsing properly, then try gnu_getopt
    if not len(sys.argv[1:]):
        menu()
    try:
        # -c takes a value, so it needs ':' like -t and -p; long options that
        # take values need a trailing '=' (the original omitted both).
        options, remainder = getopt.getopt(
            sys.argv[1:], 'ht:lp:c:u',
            ['help', 'target=', 'listen', 'port=', 'command=', 'upload'])
    except getopt.GetoptError as err:
        print str(err)
        menu()
    for opt, arg in options:
        if opt in ('-h', '--help'):
            menu()
        elif opt in ('-t', '--target'):
            target = arg
        elif opt in ('-l', '--listen'):
            listen = True
        elif opt in ('-p', '--port'):
            port = int(arg)
        elif opt in ('-c', '--command'):
            command = arg
        elif opt in ('-u', '--upload'):
            upload = True
        else:
            assert False, "Invalid option"  # This throws an error
    print "Target: %s" % target
    print "Listen: %s" % listen
    print "Port: %d" % port
    if port > 0:
        if not listen and len(target):
            print "Client mode"
        elif listen:
            serverMode()
        else:  # This could probably be cleaned up a little since the functions will have looping
            menu()
    else:
        menu()
###############################################################################
# Program execution
try:
    main()
except KeyboardInterrupt:
    print ""
    sys.exit(0)
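A subtlety in pycat's `getopt` usage is worth illustrating: options that consume a value need a trailing `:` in the short-option string and a trailing `=` in the long-option names, otherwise the value is treated as a stray positional argument. A small self-contained check (Python 3 syntax here, unlike the Python 2 script above; the argv is hypothetical):

```python
import getopt

# Hypothetical argv for a listening pycat instance.
argv = ['-t', '127.0.0.1', '-l', '-p', '9999']

# ':' (short) and '=' (long) mark options that consume a value.
options, remainder = getopt.getopt(
    argv, 'ht:lp:c:u',
    ['help', 'target=', 'listen', 'port=', 'command=', 'upload'])
print(options)    # [('-t', '127.0.0.1'), ('-l', ''), ('-p', '9999')]
print(remainder)  # []
```

Flags without a value, like `-l`, come back paired with an empty string, which is why the script only inspects `opt` and ignores `arg` for them.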
# =====================================================================
# code/striatal_model/neuron_model_tuning.py
# (repo: weidel-p/go-robot-nogo-robot, license: MIT)
# =====================================================================

import nest
import pylab as pl
import pickle
from nest import voltage_trace
from nest import raster_plot as rplt
import numpy as np
from params import *

seed = [np.random.randint(0, 9999999)] * num_threads


def calcFI():
    # amplitudesList = np.arange(3.5, 4.5, 0.1)
    amplitudesList = np.arange(100, 500, 50.)
    listD1 = []
    listD2 = []
    for amp in amplitudesList:
        nest.ResetKernel()
        nest.SetKernelStatus({"resolution": timestep, "overwrite_files": True, "rng_seeds": seed,
                              "print_time": True, "local_num_threads": num_threads})
        nest.CopyModel("iaf_cond_alpha", "d1", d1_params)
        # nest.CopyModel("izhikevich", "d1", d1_params_iz)
        nest.CopyModel("iaf_cond_alpha", "d2", d2_params)
        # nest.CopyModel("izhikevich", "d2", d2_params_iz)
        d1 = nest.Create("d1", 1)
        d2 = nest.Create("d2", 1)
        dc = nest.Create("dc_generator", 1)
        sd = nest.Create("spike_detector", 2)
        mult = nest.Create("multimeter", 1, params={
            "withgid": True, "withtime": True, "record_from": ["V_m"]})
        nest.Connect(d1, [sd[0]])
        nest.Connect(d2, [sd[1]])
        nest.Connect(dc, d1)
        nest.Connect(dc, d2)
        nest.Connect(mult, d1)
        nest.Connect(mult, d2)
        nest.SetStatus(dc, params={"amplitude": amp})
        nest.Simulate(10000.)
        evs_d1 = nest.GetStatus([sd[0]])[0]["events"]["senders"]
        ts_d1 = nest.GetStatus([sd[0]])[0]["events"]["times"]
        evs_d2 = nest.GetStatus([sd[1]])[0]["events"]["senders"]
        ts_d2 = nest.GetStatus([sd[1]])[0]["events"]["times"]
        listD1.append(len(ts_d1) / 10.0)
        listD2.append(len(ts_d2) / 10.0)
        # voltage_trace.from_device(mult)
        # pl.show()
    FI = dict()
    FI["d1"] = listD1
    FI["d2"] = listD2
    pickle.dump(FI, open("../../data/FI.pickle", "w"))
    pl.figure()
    pl.text(70, 62, "A", fontweight='bold', fontsize=15)
    pl.plot(amplitudesList, listD1, 'bo-', label='D1', linewidth=1.5)
    pl.plot(amplitudesList, listD2, 'go-', label='D2', linewidth=1.5)
    pl.legend(loc='best')
    pl.xlabel("Amplitude(pA)", fontweight='bold', fontsize=14)
    pl.ylabel("Firing rate (sps)", fontweight='bold', fontsize=14)
    for x in pl.gca().get_xticklabels():
        x.set_fontweight('bold')
        x.set_fontsize(10)
    for x in pl.gca().get_yticklabels():
        x.set_fontweight('bold')
        x.set_fontsize(10)
    pl.savefig("../../data/FI.pdf")
    print "d1", FI["d1"], "d2", FI["d2"], amplitudesList
    pl.figure()
    voltage_trace.from_device(mult)
    pl.show()
def checkConninMV():
    nest.ResetKernel()
    nest.SetKernelStatus({"resolution": timestep, "overwrite_files": True, "rng_seeds": seed,
                          "print_time": True, "local_num_threads": num_threads})
    nest.CopyModel("iaf_cond_alpha", "d21", d2_params)
    # nest.CopyModel("izhikevich", "d1", d1_params_iz)
    nest.CopyModel("iaf_cond_alpha", "d22", d2_params)
    # nest.CopyModel("izhikevich", "d2", d2_params_iz)
    d21 = nest.Create("d21", 1)
    d22 = nest.Create("d22", 1)
    nest.SetStatus(d22, {'I_e': 27.})  # Has to be tuned so that d2 is at -80
    # nest.SetStatus(d1, {'I_e': 69.})  # Has to be tuned so that d1 is at -80
    dc = nest.Create("dc_generator", 1)
    sd = nest.Create("spike_detector", 2)
    mult = nest.Create("multimeter", 1, params={
        "withgid": True, "withtime": True, "record_from": ["V_m"]})
    nest.Connect(d21, [sd[0]])
    nest.Connect(d22, [sd[1]])
    nest.Connect(dc, d21)
    # nest.Connect(dc, d2)
    # nest.Connect(mult, d1)
    nest.Connect(mult, d22)
    nest.Connect(d21, d22, syn_spec={'weight': jd2d2})
    nest.SetStatus(dc, params={"amplitude": 250.})
    nest.Simulate(1000.)
    evs_d1 = nest.GetStatus([sd[0]])[0]["events"]["senders"]
    ts_d1 = nest.GetStatus([sd[0]])[0]["events"]["times"]
    V_m = nest.GetStatus(mult)[0]["events"]["V_m"]
    ts = nest.GetStatus(mult)[0]["events"]["times"]
    inds = np.where(ts > 400.)
    Vmmin = np.min(V_m[inds])
    print "conn_strength", Vmmin + 80.
    # pl.figure(1)
    # rplt.from_device(sd)
    pl.figure(2)
    voltage_trace.from_device(mult)
    pl.plot(ts_d1, np.ones(len(ts_d1)) * -80., 'r|', markersize=10)
    pl.show()


calcFI()
# checkConninMV()
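`calcFI` converts each recorded spike train into a firing rate by dividing the spike count by the 10 s simulated interval (`len(ts_d1) / 10.0`). The same arithmetic on a hypothetical spike train, standalone:

```python
# Hypothetical spike times (in ms) as a spike detector would report them.
spike_times_ms = [12.5, 250.0, 1100.0, 4999.9, 9000.0]
sim_seconds = 10.0  # nest.Simulate(10000.) runs for 10 000 ms

# Rate in spikes per second (sps), as plotted on the F-I curve's y-axis.
rate_sps = len(spike_times_ms) / sim_seconds
print(rate_sps)  # 0.5
```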
# =====================================================================
# students/admin.py  (repo: eustone/sms, license: Apache-2.0)
# =====================================================================

from django.contrib import admin
from .models import Student

# Register your models here.
class StudentAdmin(admin.ModelAdmin):
    list_display = ('first_name', 'middle_name',
                    'last_name', 'identification_number')
    search_fields = ('first_name', 'middle_name',
                     'last_name', 'identification_number')

admin.site.register(Student, StudentAdmin)
# =====================================================================
# proof_of_work/multiagent/turn_based/v6/environmentv6.py
# (repo: michaelneuder/parkes_lab_fa19, license: MIT)
# =====================================================================

import numpy as np
np.random.seed(0)


class Environment(object):
    def __init__(self, alpha, T, mining_cost=0.5):
        self.alpha = alpha
        self.T = T
        self.current_state = None
        self.mining_cost = mining_cost

    def reset(self):
        self.current_state = (0, 0)
        return self.current_state

    def getNextStateAdopt(self, rand_val):
        self.current_state = (0, 0)
        return np.asarray(self.current_state), 0

    def getNextStateOverride(self, rand_val):
        a, h = self.current_state
        if a <= h:
            self.current_state = (0, 0)
            return np.asarray(self.current_state), -100
        self.current_state = (a - h - 1, 0)
        return np.asarray(self.current_state), h + 1

    def getNextStateMine(self, rand_val):
        a, h = self.current_state
        if (a == self.T) or (h == self.T):
            return self.getNextStateAdopt(rand_val)
        if rand_val < self.alpha:
            self.current_state = (a + 1, h)
        else:
            self.current_state = (a, h + 1)
        return np.asarray(self.current_state), -1 * self.alpha * self.mining_cost

    def takeAction(self, action, rand_val=None):
        assert(action in [0, 1, 2])
        # Check against None: `if not rand_val` would also redraw when the
        # caller passes rand_val == 0.0.
        if rand_val is None:
            rand_val = np.random.uniform()
        if action == 0:
            return self.getNextStateAdopt(rand_val)
        elif action == 1:
            return self.getNextStateOverride(rand_val)
        else:
            return self.getNextStateMine(rand_val)


def main():
    env = Environment(alpha=0.35, T=9)
    # reset() takes no argument; the original passed 0.01, which raised a TypeError.
    print(env.reset())
    print(env.takeAction(2, 0.01))
    print(env.takeAction(1, 0.01))


if __name__ == "__main__":
    main()
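In `getNextStateMine`, the attacker's chain grows whenever the uniform draw falls below `alpha`, and the honest chain grows otherwise. A quick Monte Carlo sanity check of that branch (standalone sketch, not part of the original file):

```python
import numpy as np

rng = np.random.RandomState(1)
alpha = 0.35
draws = rng.uniform(size=100000)

# Fraction of mining steps in which the attacker's chain would advance;
# by construction this converges to alpha.
attacker_fraction = float(np.mean(draws < alpha))
print(attacker_fraction)
```

The printed fraction lands very close to 0.35, matching the `alpha=0.35` used in `main()`.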
48a84cb7d32acc3cbc3af963ca0e81cc7ff163d9 | 424 | py | Python | poem/Poem/urls_public.py | kzailac/poem | 9f898e3cc3378ef1c49517b4cf6335a93a3f49b0 | [
"Apache-2.0"
] | null | null | null | poem/Poem/urls_public.py | kzailac/poem | 9f898e3cc3378ef1c49517b4cf6335a93a3f49b0 | [
"Apache-2.0"
] | null | null | null | poem/Poem/urls_public.py | kzailac/poem | 9f898e3cc3378ef1c49517b4cf6335a93a3f49b0 | [
"Apache-2.0"
] | null | null | null | from django.conf.urls import include
from django.http import HttpResponseRedirect
from django.urls import re_path

from Poem.poem_super_admin.admin import mysuperadmin

urlpatterns = [
    re_path(r'^$', lambda x: HttpResponseRedirect('/poem/superadmin/')),
    re_path(r'^superadmin/', mysuperadmin.urls),
    re_path(r'^saml2/', include(('djangosaml2.urls', 'poem'),
                                namespace='saml2')),
]
# =====================================================================
# osu/osu_overlay.py  (repo: HQupgradeHQ/Daylight, license: MIT)
# =====================================================================

import mpv
import keyboard
import time

p = mpv.MPV()
p.play("song_name.mp4")


def play_pause():
    p.pause = not p.pause

keyboard.add_hotkey("e", play_pause)


def full():
    p.fullscreen = not p.fullscreen

keyboard.add_hotkey("2", full)


def go_to_start():
    p.time_pos = 2

keyboard.add_hotkey("1", go_to_start)

# Keep the script alive so the hotkeys stay registered.
while 1:
    time.sleep(40)
# =====================================================================
# test/test_memory_leaks.py  (repo: elventear/psutil, license: BSD-3-Clause)
# =====================================================================

#!/usr/bin/env python
#
# $Id$
#

"""
Note: this is targeted for python 2.x.
To run it under python 3.x you need to use 2to3 tool first:

$ 2to3 -w test/test_memory_leaks.py
"""

import os
import gc
import sys
import unittest

import psutil
from test_psutil import reap_children, skipUnless, skipIf, \
                        POSIX, LINUX, WINDOWS, OSX, BSD

LOOPS = 1000
TOLERANCE = 4096


class TestProcessObjectLeaks(unittest.TestCase):
    """Test leaks of Process class methods and properties"""

    def setUp(self):
        gc.collect()

    def tearDown(self):
        reap_children()

    def execute(self, method, *args, **kwarks):
        # step 1
        p = psutil.Process(os.getpid())
        for x in xrange(LOOPS):
            obj = getattr(p, method)
            if callable(obj):
                retvalue = obj(*args, **kwarks)
            else:
                retvalue = obj  # property
        del x, p, obj, retvalue
        gc.collect()
        rss1 = psutil.Process(os.getpid()).get_memory_info()[0]

        # step 2
        p = psutil.Process(os.getpid())
        for x in xrange(LOOPS):
            obj = getattr(p, method)
            if callable(obj):
                retvalue = obj(*args, **kwarks)
            else:
                retvalue = obj  # property
        del x, p, obj, retvalue
        gc.collect()
        rss2 = psutil.Process(os.getpid()).get_memory_info()[0]

        # comparison
        difference = rss2 - rss1
        if difference > TOLERANCE:
            self.fail("rss1=%s, rss2=%s, difference=%s" % (rss1, rss2, difference))

    def test_name(self):
        self.execute('name')

    def test_cmdline(self):
        self.execute('cmdline')

    def test_ppid(self):
        self.execute('ppid')

    def test_uid(self):
        self.execute('uid')

    def test_gid(self):
        # The original defined a second test_uid here, which silently
        # shadowed the first definition.
        self.execute('gid')

    @skipIf(POSIX)
    def test_username(self):
        self.execute('username')

    def test_create_time(self):
        self.execute('create_time')

    def test_get_num_threads(self):
        self.execute('get_num_threads')

    def test_get_threads(self):
        # The original executed 'get_num_threads' here, duplicating the
        # previous test instead of exercising get_threads.
        self.execute('get_threads')

    def test_get_cpu_times(self):
        self.execute('get_cpu_times')

    def test_get_memory_info(self):
        self.execute('get_memory_info')

    def test_is_running(self):
        self.execute('is_running')

    @skipUnless(WINDOWS)
    def test_resume(self):
        self.execute('resume')

    @skipUnless(WINDOWS)
    def test_getcwd(self):
        self.execute('getcwd')

    @skipUnless(WINDOWS)
    def test_get_open_files(self):
        self.execute('get_open_files')

    @skipUnless(WINDOWS)
    def test_get_connections(self):
        self.execute('get_connections')
class TestModuleFunctionsLeaks(unittest.TestCase):
    """Test leaks of psutil module functions."""

    def setUp(self):
        gc.collect()

    def execute(self, function, *args, **kwarks):
        # step 1
        for x in xrange(LOOPS):
            obj = getattr(psutil, function)
            if callable(obj):
                retvalue = obj(*args, **kwarks)
            else:
                retvalue = obj  # property
        del x, obj, retvalue
        gc.collect()
        rss1 = psutil.Process(os.getpid()).get_memory_info()[0]

        # step 2
        for x in xrange(LOOPS):
            obj = getattr(psutil, function)
            if callable(obj):
                retvalue = obj(*args, **kwarks)
            else:
                retvalue = obj  # property
        del x, obj, retvalue
        gc.collect()
        rss2 = psutil.Process(os.getpid()).get_memory_info()[0]

        # comparison
        difference = rss2 - rss1
        if difference > TOLERANCE:
            self.fail("rss1=%s, rss2=%s, difference=%s" % (rss1, rss2, difference))

    def test_get_pid_list(self):
        self.execute('get_pid_list')

    @skipIf(POSIX)
    def test_pid_exists(self):
        self.execute('pid_exists', os.getpid())

    def test_process_iter(self):
        self.execute('process_iter')

    def test_used_phymem(self):
        self.execute('used_phymem')

    def test_avail_phymem(self):
        self.execute('avail_phymem')

    def test_total_virtmem(self):
        self.execute('total_virtmem')

    def test_used_virtmem(self):
        self.execute('used_virtmem')

    def test_avail_virtmem(self):
        self.execute('avail_virtmem')

    def test_cpu_times(self):
        self.execute('cpu_times')


def test_main():
    test_suite = unittest.TestSuite()
    test_suite.addTest(unittest.makeSuite(TestProcessObjectLeaks))
    test_suite.addTest(unittest.makeSuite(TestModuleFunctionsLeaks))
    unittest.TextTestRunner(verbosity=2).run(test_suite)

if __name__ == '__main__':
    test_main()
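The two-pass pattern in `execute` (run a batch, snapshot RSS, run an equal batch, snapshot again, compare against a tolerance) can be reproduced on Python 3 with `tracemalloc` instead of RSS. A sketch, with the 4096-byte tolerance borrowed from the test above and deliberately leaky/non-leaky callables as hypothetical test subjects:

```python
import tracemalloc

def grows(fn, loops=1000, tolerance=4096):
    """Returns True if memory retained by fn keeps growing between two
    equal batches of calls, mirroring the rss1/rss2 comparison."""
    tracemalloc.start()
    for _ in range(loops):
        fn()
    size1, _ = tracemalloc.get_traced_memory()
    for _ in range(loops):
        fn()
    size2, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return (size2 - size1) > tolerance

leak = []
print(grows(lambda: sum(range(10))))              # False: nothing is retained
print(grows(lambda: leak.append(bytearray(64))))  # True: ~64 B retained per call
```

Comparing two equal batches, rather than a batch against a cold start, filters out one-time allocations (caches, lazily imported modules) that would otherwise look like leaks.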
# =====================================================================
# tests/molecular/molecules/building_block/test_with_functional_groups.py
# (repo: andrewtarzia/stk, license: MIT)
# =====================================================================

from ..utilities import (
    has_same_structure,
    is_equivalent_molecule,
    is_equivalent_building_block,
    are_equivalent_functional_groups,
)


def test_with_functional_groups(building_block, get_functional_groups):
    """
    Test :meth:`.BuildingBlock.with_functional_groups`.

    Parameters
    ----------
    building_block : :class:`.BuildingBlock`
        The building block to test.

    get_functional_groups : :class:`callable`
        Takes a single parameter, `building_block` and returns the
        `functional_groups` parameter to use for this test.

    Returns
    -------
    None : :class:`NoneType`

    """

    # Save clone to check immutability.
    clone = building_block.clone()
    _test_with_functional_groups(
        building_block=building_block,
        functional_groups=tuple(get_functional_groups(building_block)),
    )
    is_equivalent_building_block(building_block, clone)
    has_same_structure(building_block, clone)


def _test_with_functional_groups(building_block, functional_groups):
    """
    Test :meth:`.BuildingBlock.with_functional_groups`.

    Parameters
    ----------
    building_block : :class:`.BuildingBlock`
        The building block to test.

    functional_groups : :class:`tuple` of :class:`.FunctionalGroup`
        The functional groups the new building block should hold.

    Returns
    -------
    None : :class:`NoneType`

    """

    new = building_block.with_functional_groups(functional_groups)
    are_equivalent_functional_groups(
        new.get_functional_groups(),
        functional_groups,
    )
    is_equivalent_molecule(building_block, new)
    has_same_structure(building_block, new)