hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1b5ae15c5f93d6a5c129b200254b544ee0bfa0ee | 1,577 | py | Python | setup.py | ilkka/nap | 3ea7b41ef6b24b7e127bc87bb010d8a8bb18a4bd | [
"MIT"
] | 31 | 2015-02-11T22:36:26.000Z | 2019-03-26T17:00:36.000Z | setup.py | ilkka/nap | 3ea7b41ef6b24b7e127bc87bb010d8a8bb18a4bd | [
"MIT"
] | 6 | 2015-11-16T13:29:34.000Z | 2019-10-28T13:37:57.000Z | setup.py | ilkka/nap | 3ea7b41ef6b24b7e127bc87bb010d8a8bb18a4bd | [
"MIT"
] | 11 | 2015-04-15T00:09:08.000Z | 2020-08-25T13:50:21.000Z | #!/usr/bin/env python
from pip.req import parse_requirements
# parse_requirements() returns a generator of pip.req.InstallRequirement objects
install_reqs = parse_requirements('requirements.txt')
# reqs is a list of requirement strings,
# e.g. ['django==1.5.1', 'mezzanine==1.4.6']
reqs = [str(ir.req) for ir in install_reqs]
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
readme = """Read docs from GitHub_
.. _GitHub: https://github.com/kimmobrunfeldt/nap
"""
setup(
name='nap',
version='2.0.0-dev',
description='Convenient way to request HTTP APIs',
long_description=readme,
author='Kimmo Brunfeldt',
author_email='kimmobrunfeldt@gmail.com',
url='https://github.com/kimmobrunfeldt/nap',
packages=[
'nap',
],
package_dir={'nap': 'nap'},
include_package_data=True,
install_requires=reqs,
license='MIT',
zip_safe=False,
keywords='nap rest requests http',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Natural Language :: English',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.2',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: Implementation :: PyPy',
],
)
| 28.160714 | 78 | 0.641725 | 182 | 1,577 | 5.483516 | 0.549451 | 0.152305 | 0.200401 | 0.104208 | 0.116232 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017857 | 0.21877 | 1,577 | 55 | 79 | 28.672727 | 0.792208 | 0.1078 | 0 | 0.047619 | 0 | 0 | 0.487527 | 0.017106 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.095238 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
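The setup.py above imports `pip.req`, an internal pip module that was removed in pip 10, so this file no longer runs on modern pip. A minimal stdlib-only sketch of reading a simple requirements file without touching pip internals (the helper name `read_requirements` is mine, not part of the original file):

```python
def read_requirements(path="requirements.txt"):
    """Return the non-empty, non-comment lines of a simple requirements file."""
    with open(path, encoding="utf-8") as fh:
        # Strip inline comments and surrounding whitespace, keep real specs only.
        lines = (line.split("#", 1)[0].strip() for line in fh)
        return [line for line in lines if line]
```

The result can be passed directly to `install_requires=`; it handles plain `name==version` lines only, not `-r` includes or environment markers.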
1b62a0d4ec377c212288a73dec2ea0b7124e85ec | 659 | py | Python | examples/notiltangle.py | jckw/Adafruit_LSM9DS0 | 98ff135fbf1702160a9277df1fd637022f91e234 | [
"MIT"
] | 6 | 2017-11-14T07:21:58.000Z | 2018-08-24T03:47:58.000Z | examples/notiltangle.py | jckw/Adafruit_LSM9DS0 | 98ff135fbf1702160a9277df1fd637022f91e234 | [
"MIT"
] | null | null | null | examples/notiltangle.py | jckw/Adafruit_LSM9DS0 | 98ff135fbf1702160a9277df1fd637022f91e234 | [
"MIT"
] | 2 | 2017-09-26T16:57:16.000Z | 2018-12-06T12:33:11.000Z | # Simple example whereby the angle is calculated, assuming the magnetometer is flat,
# i.e. there is no tilt-compensation.
# Author: Jack Weatherilt
# License: Public Domain
import math
import time
# Import the LSM9DS0 module
import Adafruit_LSM9DS0
# Create new LSM9DS0 instance
imu = Adafruit_LSM9DS0.LSM9DS0()
while True:
# Unpack (x, y, z) readings from magnetometer
(mag_x, mag_y, mag_z) = imu.readMag()
# Calculate the angle using trigonometry
angle_deg = math.degrees(math.atan2(mag_y, mag_x))
# NOTE: this method does not account for tilt!
print("Non-tilt: deg:", angle_deg)
    # Wait 50 milliseconds before repeating
time.sleep(0.05)
| 21.258065 | 84 | 0.743551 | 100 | 659 | 4.81 | 0.66 | 0.033264 | 0.029106 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025594 | 0.169954 | 659 | 30 | 85 | 21.966667 | 0.853748 | 0.582701 | 0 | 0 | 0 | 0 | 0.054264 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
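The notiltangle.py example above leaves the `atan2` result in the (-180, 180] range; compass headings are conventionally normalized to [0, 360). A small sketch of that normalization (the function name is illustrative, not part of the Adafruit example):

```python
import math

def heading_degrees(mag_x, mag_y):
    """Compass heading from flat (untilted) magnetometer X/Y readings, in [0, 360)."""
    angle = math.degrees(math.atan2(mag_y, mag_x))
    return angle % 360  # map atan2's (-180, 180] output onto [0, 360)
```

As in the original example, this assumes the sensor is level; any tilt requires compensation using accelerometer data.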
1b730a1ad9ad1fa78cb7e47435bcdc5ea1daee8d | 1,279 | py | Python | lab01/authserver/lab01_authserver/client_test.py | Boris-Barboris/rsoi | 30b03f50549f7977d5ecb7788b8e22b789f8859f | [
"MIT"
] | null | null | null | lab01/authserver/lab01_authserver/client_test.py | Boris-Barboris/rsoi | 30b03f50549f7977d5ecb7788b8e22b789f8859f | [
"MIT"
] | null | null | null | lab01/authserver/lab01_authserver/client_test.py | Boris-Barboris/rsoi | 30b03f50549f7977d5ecb7788b8e22b789f8859f | [
"MIT"
] | null | null | null | #!/usr/bin/python3
from lab01_authserver_app.oauthclient import *
import json
import requests
import time
import requests
import logging
import http.client
http.client.HTTPConnection.debuglevel = 1
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
pp = PasswordPlugin('admin', 'admin')
client = OAuthClient('http://127.0.0.1:39000', pp, 'debug_client', 'mysecret', 'localhost')
print('\nVerification using password plugin\n')
print('\n\n' + repr(client.verify()) + '\n')
print('\nissuing tokens...\n')
tokens = client.issue_tokens()
print('\n\ntokens:\n')
print('\n' + repr(tokens) + '\n')
tp = TokenPlugin(atoken = tokens['access_token'], rtoken = tokens['refresh_token'])
client.auth_plugin = tp
print('\nVerification using token plugin...\n')
print('\n\n' + repr(client.verify()) + '\n')
#time.sleep(1)
#print('\nVerification using token plugin again...\n')
#print('\n\n' + repr(client.verify()) + '\n')
print('\nrefreshing tokens...\n')
tokens = client.issue_tokens()
print('\n\ntokens:\n')
print('\n' + repr(tokens) + '\n')
print('\nme information...\n')
me = client.me()
print('\n\nme:\n')
print('\n' + repr(me) + '\n')
| 27.804348 | 91 | 0.702893 | 174 | 1,279 | 5.103448 | 0.350575 | 0.060811 | 0.047297 | 0.027027 | 0.386261 | 0.246622 | 0.246622 | 0.246622 | 0.246622 | 0.137387 | 0 | 0.01468 | 0.094605 | 1,279 | 45 | 92 | 28.422222 | 0.752159 | 0.099296 | 0 | 0.30303 | 0 | 0 | 0.271777 | 0.021777 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.060606 | 0.212121 | 0 | 0.212121 | 0.393939 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1b7b00a69a0d287e97e0ec900905293bcb8524cc | 1,973 | py | Python | tests/regressiontests/get_or_create_regress/models.py | huicheese/Django-test3 | ac11d2dce245b48392e52d1f4acfd5e7433b243e | [
"BSD-3-Clause"
] | 23 | 2015-01-26T12:16:59.000Z | 2022-02-10T10:58:40.000Z | tests/regressiontests/get_or_create_regress/models.py | joetyson/django | c3699190186561d5c216b2a77ecbfc487d42a734 | [
"BSD-3-Clause"
] | 1 | 2018-01-03T15:26:49.000Z | 2018-01-03T15:26:49.000Z | tests/regressiontests/get_or_create_regress/models.py | joetyson/django | c3699190186561d5c216b2a77ecbfc487d42a734 | [
"BSD-3-Clause"
] | 30 | 2015-03-25T19:40:07.000Z | 2021-05-28T22:59:26.000Z | from django.db import models
class Publisher(models.Model):
name = models.CharField(max_length=100)
class Author(models.Model):
name = models.CharField(max_length=100)
class Book(models.Model):
name = models.CharField(max_length=100)
authors = models.ManyToManyField(Author, related_name='books')
publisher = models.ForeignKey(Publisher, related_name='books')
__test__ = {'one':"""
#
# RelatedManager
#
# First create a Publisher.
>>> p = Publisher.objects.create(name='Acme Publishing')
# Create a book through the publisher.
>>> book, created = p.books.get_or_create(name='The Book of Ed & Fred')
>>> created
True
# The publisher should have one book.
>>> p.books.count()
1
# Try get_or_create again, this time nothing should be created.
>>> book, created = p.books.get_or_create(name='The Book of Ed & Fred')
>>> created
False
# And the publisher should still have one book.
>>> p.books.count()
1
#
# ManyRelatedManager
#
# Add an author to the book.
>>> ed, created = book.authors.get_or_create(name='Ed')
>>> created
True
# Book should have one author.
>>> book.authors.count()
1
# Try get_or_create again, this time nothing should be created.
>>> ed, created = book.authors.get_or_create(name='Ed')
>>> created
False
# And the book should still have one author.
>>> book.authors.count()
1
# Add a second author to the book.
>>> fred, created = book.authors.get_or_create(name='Fred')
>>> created
True
# The book should have two authors now.
>>> book.authors.count()
2
# Create an Author not tied to any books.
>>> Author.objects.create(name='Ted')
<Author: Author object>
# There should be three Authors in total. The book object should have two.
>>> Author.objects.count()
3
>>> book.authors.count()
2
# Try creating a book through an author.
>>> ed.books.get_or_create(name="Ed's Recipies", publisher=p)
(<Book: Book object>, True)
# Now Ed has two Books, Fred just one.
>>> ed.books.count()
2
>>> fred.books.count()
1
"""}
| 21.445652 | 74 | 0.706031 | 297 | 1,973 | 4.606061 | 0.252525 | 0.05848 | 0.064327 | 0.065789 | 0.432018 | 0.415936 | 0.415936 | 0.323099 | 0.292398 | 0.223684 | 0 | 0.010766 | 0.15256 | 1,973 | 91 | 75 | 21.681319 | 0.807416 | 0 | 0 | 0.357143 | 0 | 0 | 0.798277 | 0.151039 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014286 | 0 | 0.128571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1b7cb10575222f004342449819d9940438677034 | 3,613 | py | Python | python_files/generate_figures/7.2_Perturbations.py | TiKeil/Masterthesis-LOD | 03c1c6748a0464165a666bd9f4f933bc8e4f233b | [
"Apache-2.0"
] | null | null | null | python_files/generate_figures/7.2_Perturbations.py | TiKeil/Masterthesis-LOD | 03c1c6748a0464165a666bd9f4f933bc8e4f233b | [
"Apache-2.0"
] | null | null | null | python_files/generate_figures/7.2_Perturbations.py | TiKeil/Masterthesis-LOD | 03c1c6748a0464165a666bd9f4f933bc8e4f233b | [
"Apache-2.0"
] | 1 | 2020-03-30T08:49:13.000Z | 2020-03-30T08:49:13.000Z | # This file is part of the master thesis "Variational crimes in the Localized orthogonal decomposition method":
# https://github.com/TiKeil/Masterthesis-LOD.git
# Copyright holder: Tim Keil
# License: BSD 2-Clause License (http://opensource.org/licenses/BSD-2-Clause)
import numpy as np
import buildcoef2d
import matplotlib.pyplot as plt
from visualize import drawCoefficient, ExtradrawCoefficient
bg = 0.05 #background
val = 1 #values
NWorldFine = np.array([42, 42])
CoefClass = buildcoef2d.Coefficient2d(NWorldFine,
bg = bg, # background
val = val, # values
length = 2, # length
thick = 2, # thickness
space = 2, # space between values
probfactor = 1, # probability of an value
right = 1, # shape 1
down = 0, # shape 2
diagr1 = 0, # shape 3
diagr2 = 0, # shape 4
diagl1 = 0, # shape 5
diagl2 = 0, # shape 6
LenSwitch = None, # various length
thickSwitch = None, # various thickness
ChannelHorizontal = None, # horizontal Channels
ChannelVertical = None, # vertical Channels
BoundarySpace = True # additional space on the boundary
)
A = CoefClass.BuildCoefficient() # coefficient in a numpy array
numbers = [13,20,27,44,73] #What entries will be changed
# Change in Value
B = CoefClass.SpecificValueChange( Number = numbers,
ratio = -0.4,
randomvalue = None,
negative = None,
ShapeRestriction = True,
ShapeWave = None,
probfactor = 1,
Original = True)
# Disappearance
C = CoefClass.SpecificVanish( Number = numbers,
PartlyVanish = None,
probfactor = 1,
Original = True)
# Shift
D = CoefClass.SpecificMove( Number = numbers,
steps = 1,
randomstep = None,
randomDirection = None,
Right = 1,
BottomRight = 1,
Bottom = 1,
BottomLeft = 1,
Left = 1,
TopLeft = 1,
Top = 1,
TopRight = 1,
Original = True)
A = A.flatten()
B = B.flatten()
C = C.flatten()
D = D.flatten()
plt.figure("original")
drawCoefficient(NWorldFine, A)
plt.figure("1")
drawCoefficient(NWorldFine, B)
plt.figure("2")
drawCoefficient(NWorldFine, C)
plt.figure("3")
drawCoefficient(NWorldFine, D)
plt.figure('all')
ExtradrawCoefficient(NWorldFine, A, B, C, D)
plt.show() | 39.703297 | 111 | 0.4124 | 272 | 3,613 | 5.477941 | 0.503676 | 0.020134 | 0.026175 | 0.030872 | 0.036242 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034183 | 0.522281 | 3,613 | 91 | 112 | 39.703297 | 0.829085 | 0.166897 | 0 | 0.088235 | 0 | 0 | 0.004695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1b7d3e825fb41006762ca249a60864707434a95c | 516 | py | Python | entertainment_tonight/migrations/0003_event_upload_photo.py | ashleyf1996/OOP_Web_Application_Assignment3 | 5c398bf282e8decefaac2ff54a5ec50aff3ab32f | [
"MIT"
] | null | null | null | entertainment_tonight/migrations/0003_event_upload_photo.py | ashleyf1996/OOP_Web_Application_Assignment3 | 5c398bf282e8decefaac2ff54a5ec50aff3ab32f | [
"MIT"
] | null | null | null | entertainment_tonight/migrations/0003_event_upload_photo.py | ashleyf1996/OOP_Web_Application_Assignment3 | 5c398bf282e8decefaac2ff54a5ec50aff3ab32f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.6 on 2017-04-06 20:39
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('entertainment_tonight', '0002_auto_20170321_1517'),
]
operations = [
migrations.AddField(
model_name='event',
name='upload_photo',
field=models.CharField(default=1, max_length=200),
preserve_default=False,
),
]
| 23.454545 | 62 | 0.629845 | 57 | 516 | 5.473684 | 0.824561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096859 | 0.25969 | 516 | 21 | 63 | 24.571429 | 0.719895 | 0.131783 | 0 | 0 | 1 | 0 | 0.137079 | 0.098876 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1b850dbfb0c60c1aeafce298fa429a15013488c3 | 20,047 | py | Python | PassPY0.2Win.py | IvaldiS6/PassPYWin | 25ffc73392166fd85a8590866c5b65fd0aece72c | [
"MIT"
] | null | null | null | PassPY0.2Win.py | IvaldiS6/PassPYWin | 25ffc73392166fd85a8590866c5b65fd0aece72c | [
"MIT"
] | 1 | 2021-09-15T20:02:36.000Z | 2021-09-15T20:02:36.000Z | PassPY0.2Win.py | IvaldiS6/PassPYWin | 25ffc73392166fd85a8590866c5b65fd0aece72c | [
"MIT"
] | 1 | 2021-09-15T21:49:40.000Z | 2021-09-15T21:49:40.000Z | import codecs
import os
import os.path
import string
import random
from random import shuffle
import csv
import time
import hashlib
import struct
import binascii
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
DefaultSize = "mode con: cols=100 lines=20"
os.system(DefaultSize)
pre = "C:\ProgramData\PassPY"
if not os.path.exists(pre):
os.makedirs(pre)
account = ""
cypher = ""
username = ""
user_name = ""
m = ""
def clrscr():
# Check if Operating System is Mac and Linux or Windows
if os.name == 'posix':
_ = os.system('clear')
else:
# Else Operating System is Windows (os.name = nt)
_ = os.system('cls')
def logo():
print("________ __________ __\n___ __ \_____ _________________ __ \ \/ /\n__ /_/ / __ `/_ ___/_ ___/_ /_/ /_ / \n_ ____// /_/ /_(__ )_(__ )_ ____/_ / \n/_/ \__,_/ /____/ /____/ /_/ /_/ \n\n\n\n\n")
def header():
clrscr()
os.system(DefaultSize)
logo()
def PassGen(user_name,acc,uN,pre):
header()
x = ''
x = input("1: Have PassPY generate a password with a length you choose for " + acc + "\n2: Type your own password for " + acc + "\n")
if x == '1':
header()
length = float(input("How many characters would you like the password to be for " + acc + "? \n"))
div = int(length/3)
r = int(length%3)
seed = string.ascii_letters # Generating letters
letters = ( ''. join(random.choice(seed) for i in range(div)) )
seed = string.digits # generating digits
numbers = ( ''.join(random.choice(seed) for i in range(div)) )
seed = string.punctuation # generating punctuation
punctuation = ( ''.join(random.choice(seed) for i in range(div + r)) )
hold = letters + numbers + punctuation
pW = ( ''.join(random.sample(hold, len(hold))))
print("here is the generated password: " + pW)
preKey = acc + uN + pW
lineHash = hashlib.sha256(preKey.encode('utf-8'))
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
lineHashHexidecimal = lineHash.hexdigest()
smosh = hashlib.sha256(bytes(half + lineHashHexidecimal, 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
acc = bytes(acc, 'utf8')
uN = bytes(uN, 'utf8')
pW = bytes(pW, 'utf8')
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
encryptor = cipher.encryptor()
uN = encryptor.update(uN) + encryptor.finalize()
uN = bytes.hex(uN)
encryptor = cipher.encryptor()
acc = encryptor.update(acc) + encryptor.finalize()
acc = bytes.hex(acc)
encryptor = cipher.encryptor()
pW = encryptor.update(pW) + encryptor.finalize()
pW = bytes.hex(pW)
lineEncrypted = bytes(acc + uN + pW, 'utf8')
lineChecksum = hashlib.sha256(lineEncrypted).hexdigest()
newline = acc + "\t" + uN + "\t" + pW + "\t" + str(lineHashHexidecimal) + "\t" + str(lineChecksum) + "\n"
post = user_name + "50" + ".passpy"
location = os.path.join(pre, post)
with open(location, "a", newline="\n") as filea:
filea.write(newline + "\n")
        input("Press Enter once the password is memorized (don't worry if you forget, it was saved in your password directory)\n")
MainMenu(user_name)
elif x == '2':
header()
pW = input("Type the password for " + acc + ", then press Enter: \n")
preKey = acc + uN + pW
lineHash = hashlib.sha256(preKey.encode('utf-8'))
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
lineHashHexidecimal = lineHash.hexdigest()
smosh = hashlib.sha256(bytes(half + lineHashHexidecimal, 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
acc = bytes(acc, 'utf8')
uN = bytes(uN, 'utf8')
pW = bytes(pW, 'utf8')
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
smosh = ''
key = ''
iv = ''
encryptor = cipher.encryptor()
uN = encryptor.update(uN) + encryptor.finalize()
uN = bytes.hex(uN)
encryptor = cipher.encryptor()
acc = encryptor.update(acc) + encryptor.finalize()
acc = bytes.hex(acc)
encryptor = cipher.encryptor()
pW = encryptor.update(pW) + encryptor.finalize()
pW = bytes.hex(pW)
lineEncrypted = bytes(acc + uN + pW, 'utf8')
lineChecksum = hashlib.sha256(lineEncrypted).hexdigest()
newline = acc + "\t" + uN + "\t" + pW + "\t" + str(lineHashHexidecimal) + "\t" + str(lineChecksum) + "\n"
post = user_name + "50" + ".passpy"
location = os.path.join(pre, post)
with open(location, "a", newline="\n") as filea:
filea.write(newline)
MainMenu(user_name)
else:
PassGen(user_name,acc,uN,pre)
def Signin(pre):
header()
user_name = input("Enter Username: ").encode("utf-8").hex()
if user_name == "":
        input("Press enter to return to the Sign In screen and enter a user name\n")
Signin(pre)
nametest2 = user_name + "4c" + ".passpy"
location = os.path.join(pre, nametest2)
try: #check to see if the account exists
usersearch = open(location,"r") #search for user's password file
lst = list(usersearch.readlines())
confirm = lst[-1]
print("Hello " + str(codecs.decode(user_name, "hex"), "utf-8"))
password = input("Enter Password: ").encode("utf-8").hex()
s(user_name,password)
compare = line
if compare == confirm:
print("Access Granted")
MainMenu(user_name)
else:
print("Access Denied")
Signin(pre)
except FileNotFoundError:
header()
print("Username not found!")
input("please press enter to continue")
Login(pre)
def AddEntry(user_name,pre):
header()
acc = input("what account is this password for? (e.g. GitHub)\n")
uN = input("What is the username for " + acc + "?\n")
PassGen(user_name,acc,uN,pre)
print("Done!")
def PasswordSearch(user_name,pre):
c = ""
header()
post = user_name + "50" + ".passpy"
location = os.path.join(pre, post)
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
SearchColumn = input("Password Search Menu:\nPress 1 to show all passwords\nPress 2 to search by account\nAll of the following options will NOT work!\nPress 3 to search by username\nPress 4 to search by password\nPress 5 to return to the Main Menu\n ")
try: #make sure there is a password file to search through
with open(location) as csv_file:
csv_reader = csv.reader(csv_file, delimiter="\t")
next(csv_reader)
if SearchColumn == '1':
header()
print("Here are all of the stored passwords: ")
for row in csv_reader: # !!!START HERE!!! Decrypt single item line by line
smosh = hashlib.sha256(bytes(half + str(row[3]), 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
decryptor = cipher.decryptor()
bEntry = bytes.fromhex(str(row[2]).lower())
bct = str(decryptor.update(bEntry), "utf8")
print(bct)
input("Press Enter to continue to the Main Menu")
MainMenu(user_name)
elif SearchColumn == '2':
header()
search = bytes(input("What Account are you looking for? \n"), 'utf8')
for row in csv_reader:
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
smosh = hashlib.sha256(bytes(half + str(row[3]), 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
decryptor = cipher.decryptor()
encryptor = cipher.encryptor()
sup = encryptor.update(search) + encryptor.finalize()
sable = bytes.hex(sup)
if sable == row[0]:
decryptor = cipher.decryptor()
a = bytes.fromhex(str(row[0]).lower())
a = str(decryptor.update(a), "utf8")
decryptor = cipher.decryptor()
u = bytes.fromhex(str(row[1]).lower())
u = str(decryptor.update(u), "utf8")
decryptor = cipher.decryptor()
p = bytes.fromhex(str(row[2]).lower())
p = str(decryptor.update(p), "utf8")
header()
c = input("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n\nEnter 1 if you want to copy the password to the clipboard\nEnter 2 if you want to continue searching\n")
if c == '1':
target = p
header()
print("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n")
Clipboard(target)
MainMenu(user_name)
elif c == '2':
print("Password NOT copied, continuing to search")
time.sleep(2)
continue
else:
print("Returning to the Main Menu")
time.sleep(1)
MainMenu(user_name)
MainMenu(user_name)
elif SearchColumn == '3':
header()
search = bytes(input("What Username are you looking for? \n"), 'utf8')
for row in csv_reader:
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
smosh = hashlib.sha256(bytes(half + str(row[3]), 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
decryptor = cipher.decryptor()
encryptor = cipher.encryptor()
sup = encryptor.update(search) + encryptor.finalize()
sable = bytes.hex(sup)
if sable == row[1]:
decryptor = cipher.decryptor()
a = bytes.fromhex(str(row[0]).lower())
a = str(decryptor.update(a), "utf8")
decryptor = cipher.decryptor()
u = bytes.fromhex(str(row[1]).lower())
u = str(decryptor.update(u), "utf8")
decryptor = cipher.decryptor()
p = bytes.fromhex(str(row[2]).lower())
p = str(decryptor.update(p), "utf8")
header()
c = input("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n\nEnter 1 if you want to copy the password to the clipboard\nEnter 2 if you do not\n")
if c == '1':
target = p
header()
print("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n")
Clipboard(target)
MainMenu(user_name)
elif c == '2':
input("Password NOT copied, Press enter to return to continue searching")
continue
else:
input("Password NOT copied, Press enter to return to the Main Menu")
MainMenu(user_name)
continue
MainMenu(user_name)
elif SearchColumn == '4':
header()
search = bytes(input("What password are you looking for? \n"), 'utf8')
for row in csv_reader:
half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
smosh = hashlib.sha256(bytes(half + str(row[3]), 'utf8'))
key = smosh.digest()
iv = bytes(int(len(key)/2))
cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
decryptor = cipher.decryptor()
encryptor = cipher.encryptor()
sup = encryptor.update(search) + encryptor.finalize()
sable = bytes.hex(sup)
if sable == row[2]:
decryptor = cipher.decryptor()
a = bytes.fromhex(str(row[0]).lower())
a = str(decryptor.update(a), "utf8")
decryptor = cipher.decryptor()
u = bytes.fromhex(str(row[1]).lower())
u = str(decryptor.update(u), "utf8")
decryptor = cipher.decryptor()
p = bytes.fromhex(str(row[2]).lower())
p = str(decryptor.update(p), "utf8")
header()
c = input("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n\nEnter 1 if you want to copy the password to the clipboard\nEnter 2 if you do not\n")
if c == '1':
target = p
header()
print("The Account, Username and Password information for " + a + " are:\n\nAccount------" + a + "\nUser Name----" + u + "\nPassword-----" + p + "\n")
Clipboard(target)
MainMenu(user_name)
elif c == '2':
input("Password NOT copied, Press enter to return to continue")
continue
else:
input("Password NOT copied, Press enter to return to the Main Menu")
MainMenu(user_name)
continue
MainMenu(user_name)
elif SearchColumn == '5':
MainMenu(user_name)
else:
m = input("enter 1, 2, 3 or 4:\n")
PasswordSearch(user_name,pre)
MainMenu(user_name)
except FileNotFoundError:
header()
print("Please register some passwords for me to search through.")
input("please press enter to continue")
MainMenu(user_name)
def Clipboard(target):
command = 'echo ' + target.strip() + '| clip'
os.system(command)
time.sleep(1)
print("The clipboard will be cleared in 5 seconds")
time.sleep(1)
print("The clipboard will be cleared in 4 seconds")
time.sleep(1)
print("The clipboard will be cleared in 3 seconds")
time.sleep(1)
print("The clipboard will be cleared in 2 seconds")
time.sleep(1)
print("The clipboard will be cleared in 1 seconds")
time.sleep(1)
print("The clipboard will be cleared now")
os.system("echo.| clip")
def MainMenu(user_name):
header()
print("Menu:\n 1: New password - register new password\n 2: List - show passwords\n 3: Exit")
menu = input("Enter a number:\n")
if menu == '1':
AddEntry(user_name,pre)
elif menu == '2':
PasswordSearch(user_name,pre)
elif menu == '3':
clrscr()
exit()
elif menu == '':
MainMenu(user_name)
else:
MainMenu(user_name)
def s(user_name, password):
    uhold = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
    phold = hashlib.sha256(password.encode('utf-8')).hexdigest()
    # Use the first digit found in each digest as a repetition count
    # (default to '1' so the names are always bound)
    ucount = pcount = '1'
    for i in uhold:
        if i.isdigit():
            ucount = i
            break
    for i in phold:
        if i.isdigit():
            pcount = i
            break
    global line
    if int(pcount) % 2 == 0:
        line = uhold * int(pcount) + phold * int(ucount)
    else:
        line = phold * int(pcount) + uhold * int(pcount)
    line = hashlib.sha256(line.encode('utf-8')).hexdigest()
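`s()` stretches credentials by repeating and re-hashing plain SHA-256 digests. For contrast, the standard library ships a purpose-built password-derivation primitive, `hashlib.pbkdf2_hmac`. The sketch below is an alternative scheme, not PassPY's actual one, and the function and parameter names are illustrative:

```python
import hashlib

def derive_record(user_name: str, password: str, salt: bytes, rounds: int = 200_000) -> str:
    # PBKDF2-HMAC-SHA256: deliberately slow, salted, and standardised.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                 salt + user_name.encode("utf-8"), rounds)
    return digest.hex()
```

A random per-user salt (e.g. `os.urandom(16)`) stored next to the record would normally replace the fixed salt used in the test below.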
def Register(pre):
    header()
    user_name = input("Enter Username: ").encode("utf-8").hex()
    if user_name == "":
        input("Press enter to return to the Sign In screen and enter a user name\n")
        return Register(pre)
    nametest1 = user_name + "4c" + ".passpy"
    location = os.path.join(pre, nametest1)
    try:
        usersearch = open(location)  # search for the user's password file
        usersearch.close()
        header()
        print("User name not available")
        input("Press Enter to try again: ")
        Register(pre)
    except FileNotFoundError:
        header()
        print("User name is available")
        with open(location, "a") as create:  # create the user's password file
            password = input("enter desired password:\n").encode("utf-8").hex()
            while password == "":
                header()
                password = input("An empty password is not useful\nPlease enter desired password:\n").encode("utf-8").hex()
            s(user_name, password)
            create.write(line)
        second = user_name + "50" + ".passpy"
        location = os.path.join(pre, second)
        with open(location, "a", newline="\n") as create:
            first = "count"
            b = "0"
            third = "empty"
            hold = first + b + third
            fourth = hashlib.sha256(bytes(hold, 'utf8')).hexdigest()
            half = hashlib.sha256(user_name.encode('utf-8')).hexdigest()
            smosh = hashlib.sha256(bytes(half + hold, 'utf8'))
            key = smosh.digest()
            iv = bytes(int(len(key)/2))  # 16 zero bytes: fixed all-zero CTR nonce
            cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
            encryptor = cipher.encryptor()
            first = bytes(first, 'utf8')
            first = bytes.hex(encryptor.update(first) + encryptor.finalize())
            encryptor = cipher.encryptor()
            b = bytes(b, 'utf8')
            b = bytes.hex(encryptor.update(b) + encryptor.finalize())
            encryptor = cipher.encryptor()
            third = bytes(third, 'utf8')
            third = bytes.hex(encryptor.update(third) + encryptor.finalize())
            hold = bytes(first + b + third, "utf8")
            fifth = hashlib.sha256(hold).hexdigest()
            firstLine = first + "\t" + b + "\t" + third + "\t" + fourth + "\t" + fifth + "\n"
            create.write(firstLine)
        header()
        input("Done! \nNew account created!\nWelcome!")
        MainMenu(user_name)
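`Register()` derives its AES-CTR key deterministically from the (hex-encoded) user name plus the constant header string, and zero-fills the IV (`bytes(n)` yields `n` NUL bytes). That derivation, isolated below with the standard library only; the helper name is invented, and the `Cipher`/`encryptor` calls from the `cryptography` package are omitted:

```python
import hashlib

def derive_key_iv(user_name_hex: str, hold: str = "count0empty"):
    half = hashlib.sha256(user_name_hex.encode("utf-8")).hexdigest()
    key = hashlib.sha256(bytes(half + hold, "utf8")).digest()  # 32-byte AES-256 key
    iv = bytes(len(key) // 2)                                  # 16 zero bytes
    return key, iv
```

Deriving the key purely from public data and reusing an all-zero CTR nonce is a weakness; a per-record random nonce stored alongside the ciphertext would be the usual repair.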
def Login(pre):
    header()
    print("Welcome!\n 1: New users - register your account\n 2: Existing users - log in\n 3: Exit - close the application.")
    login = input("Enter a number:\n")
    if login == '1':
        Register(pre)
    elif login == '2':
        Signin(pre)
    elif login == '3':
        clrscr()
        exit()
    else:
        Login(pre)
# Startup Phase
header()
print("Welcome to PassPY, the python based, opensource password storage\nIf you like PassPY, share it with a friend github.com/kayakers6/passpy\nIf you love PassPY, BTC: bc1qsqc3v2jt3lh0kq9addf4gu6e2uq5vxxfk35pl\n SNX: 0x05E8813B7dc3c4e039D898CB13f21A6E4d675bc1")
start = input("Press ENTER to start")
Login(pre)
# ---- cloudtunes-server/cloudtunes/async.py (repo: skymemoryGit/cloudtunes, license: BSD-3-Clause) ----
"""Asynchronous MongoDB and Redis connections."""
from functools import partial
import motor
import tornadoredis
from cloudtunes import settings
RedisClient = partial(tornadoredis.Client, **settings.REDIS)
mongo = motor.MotorClient(**settings.MONGODB).cloudtunes
redis = RedisClient()
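`RedisClient` above is a `functools.partial` that pre-binds the keyword arguments in `settings.REDIS` onto `tornadoredis.Client`. The same idiom with stand-in values (the names below are invented for illustration):

```python
from functools import partial

def connect(host="localhost", port=6379, db=0):
    # Stand-in for a real client constructor.
    return (host, port, db)

REDIS_SETTINGS = {"host": "redis.internal", "port": 6380}
Client = partial(connect, **REDIS_SETTINGS)

print(Client())      # ('redis.internal', 6380, 0)
print(Client(db=2))  # ('redis.internal', 6380, 2)
```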
# ---- scripts/convert_ts_model_to_onnx.py (repo: SakodaShintaro/Miacis, license: BSD-3-Clause) ----
#!/usr/bin/env python3
import os
import argparse  # used below; presumably also re-exported by the wildcard imports
from generate_cnn_model import *
from generate_transformer_model import *

parser = argparse.ArgumentParser()
parser.add_argument("model_path", type=str)
parser.add_argument("--batch_size", type=int, default=128)
args = parser.parse_args()

input_channel_num = 42
board_size = 9
policy_channel_num = 27
input_tensor = torch.randn([args.batch_size, input_channel_num, board_size, board_size]).cuda()
script_model = torch.jit.load(args.model_path)

# Recover block/channel counts from a filename such as "..._10bl_256ch..."
filename = os.path.splitext(os.path.basename(args.model_path))[0]
parts = filename.split("_")
block_num = None
channel_num = None
for part in parts:
    if "bl" in part:
        block_num = int(part.replace("bl", ""))
    if "ch" in part:
        channel_num = int(part.replace("ch", ""))
print(f"block_num = {block_num}, channel_num = {channel_num}")

model = None
if "transformer" in args.model_path:
    model = TransformerModel(input_channel_num, block_num=block_num, channel_num=channel_num,
                             policy_channel_num=policy_channel_num,
                             board_size=board_size)
else:
    model = CategoricalNetwork(input_channel_num, block_num=block_num, channel_num=channel_num,
                               policy_channel_num=policy_channel_num,
                               board_size=board_size)
model.load_state_dict(script_model.state_dict())
model.eval()
model.cuda()

save_path = args.model_path.replace(".model", ".onnx")
torch.onnx.export(model, input_tensor, save_path)
print(f"export to {save_path}")
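The block/channel extraction above is pure string handling, so it can be checked without torch. The filename in the example is made up, but it follows the `*_{N}bl_{M}ch*` convention the script expects:

```python
def parse_model_filename(filename: str):
    # Mirror of the script's for-loop over "_"-separated filename parts.
    block_num = channel_num = None
    for part in filename.split("_"):
        if "bl" in part:
            block_num = int(part.replace("bl", ""))
        if "ch" in part:
            channel_num = int(part.replace("ch", ""))
    return block_num, channel_num

print(parse_model_filename("categorical_10bl_256ch"))  # (10, 256)
```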
# ---- utils/configs.py (repo: realeu/ImgBB-Bot, license: MIT) ----
import os
import time
class Var(object):
    # Get a bot token from botfather
    BOT_TOKEN = os.environ.get("BOT_TOKEN", "")
    # Get from my.telegram.org
    API_ID = int(os.environ.get("API_ID", 12345))
    # Get from my.telegram.org
    API_HASH = os.environ.get("API_HASH", "")
    # To record start time of bot
    BOT_START_TIME = time.time()
    # You Can Get An API Key From https://api.imgbb.com.
    API = os.environ.get("API", None)
    OWNER_ID = int(os.environ.get("OWNER_ID", "1453690249"))
    BOT_NAME = os.environ.get("BOT_NAME", "ImgBB")
    START_PIC = "https://telegra.ph/file/e162f5f8554a9bf66e830.jpg"
    HELP_PIC = "https://telegra.ph/file/e162f5f8554a9bf66e830.jpg"


class Tr(object):
    START_TEXT = """
👋 Hi {},
I’m **[ImgBB](telegram.me/xImgBBbot)**. I can upload images on **ImgBB.com** & generate shareable link for it!
BTW, do press **Help** for more information about the process.
"""
    ABOUT_TEXT = """🤖 **My Name:** [ImgBB](telegram.me/xImgBBbot)
📝 **Language:** [Python 3](https://www.python.org)
📚 **Framework:** [Pyrogram](https://github.com/pyrogram/pyrogram)
📡 **Hosted On:** [Railway](https://railway.app)
👨💻 **Developer:** [𖤍 Λℓσηє 𖤍](t.me/xDune)
👥 **Support Group:** [Marine Support](https://t.me/MarineChats)
📢 **Updates Channel:** [Marine Bots](https://t.me/MarineBots)
"""
    HELP_TEXT = """You may have already known my function. As you have seen in the start message, I can upload images on **ImgBB.com** & generate shareable link for it, which can be deleted after a specific time or stay there forever ~ according to your selection...🙃
Steps:
• Post/Forward an image...
• Select an option ~ whether to delete it automatically within the given period or keep it permanently...
• BOOM!💥 Your image is uploaded! You will be provided with a link to view the image, as well as, a link to delete it."""
    ERR_TEXT = "⚠️ API Not Found"
    ERRTOKEN_TEXT = "😶 The Access Token Provided Has Expired, Revoked, Malformed Or Invalid For Other Reasons. Report this at @MarineBots"
    WAIT = "💬 Please Wait !!"
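`Var` reads each setting once at import time with `os.environ.get`, casting numeric values via `int()`. The same pattern in isolation — the helper and environment-variable names here are invented:

```python
import os

def read_setting(name: str, default, cast=str):
    # Fall back to `default` when the variable is unset; otherwise cast the raw string.
    raw = os.environ.get(name)
    return cast(raw) if raw is not None else default

os.environ["DEMO_API_ID"] = "42"
os.environ.pop("DEMO_UNSET_SETTING", None)
print(read_setting("DEMO_API_ID", 12345, int))       # 42
print(read_setting("DEMO_UNSET_SETTING", 12345, int))  # 12345
```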
# ---- exifGPSimplant.py (repo: Bahrd/AppliedPythonology, license: MIT) ----
## EXIF GPS tags hand-crafted modification ('autografts')...
from exif import Image
from sys import argv as names
from math import floor


def dd_GPS_dms(coordinate):
    latlonitude = float(coordinate)
    degrees = floor(latlonitude)
    residuum = (latlonitude - degrees) * 60
    minutes = floor(residuum)
    seconds = (residuum - minutes) * 60
    return (degrees, minutes, seconds)


if len(names) < 4:
    print('USAGE: exifGPSimplant filename latitude [0-360) longitude [0 - 180)')
    exit(-1)
else:
    (recipient, latitude, longitude) = names[1:4]
    with open(recipient, 'rb') as image_file:
        img = Image(image_file)
        img.gps_latitude = dd_GPS_dms(latitude)
        img.gps_longitude = dd_GPS_dms(longitude)
        # img.gps_altitude = 1200  # An orphan...
        print(img.gps_latitude, img.gps_longitude)
    with open(recipient, 'wb') as image_file:
        image_file.write(img.get_file())
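A quick sanity check of `dd_GPS_dms` (re-declared here so the snippet runs standalone): 34.5 decimal degrees should split into 34° 30′ 0″.

```python
from math import floor

def dd_GPS_dms(coordinate):
    # Decimal degrees -> (degrees, minutes, seconds)
    latlonitude = float(coordinate)
    degrees = floor(latlonitude)
    residuum = (latlonitude - degrees) * 60
    minutes = floor(residuum)
    seconds = (residuum - minutes) * 60
    return (degrees, minutes, seconds)

print(dd_GPS_dms("34.5"))  # (34, 30, 0.0)
```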
## Note the GPS tags format
# 34°56'43.386"N 109°46'32.447"W
## Other locations...
# https://www.gps-coordinates.net/gps-coordinates-converter
# Whitewater, CA: 33.923685, -116.640324
# Yosemite Valley: 37° 43′ 18″ N, 119° 38′ 47″ W
# Mocassin (Tuolumne Count, CA): 37° 48′ 39″ N, 120° 18′ 0″ W
# Hollywood Sign Puzzle View: 34°06'18.3"N 118°19'52.0"W
# Hoover Dam: 36° 0′ 56″ N, 114° 44′ 16″ W
# Rainbow Canyon: 36° 21′ 56.88″ N, 117° 30′ 5.4″ W
# Route 66 (AZ): 35°14'15.5"N 113°12'22.6"W
# Las Vegas' Replica of the Statue of Liberty 36°6'3.58"N 115°10'23.029"W
# The Tepees in Petrified Forest 34°56'43.386"N 109°46'32.447"W
# Golden gate & Alcatraz: 37.7764931, 122.5042172
## Shortcuts...
# Target folder: C:\Users\Przem\OneDrive\Images\OOW\Arizona, California & Nevada\Journey, not a destination
# ---- infra_validation_engine/infra_tests/components/docker.py (repo: WLCG-Lightweight-Sites/simple_grid_infra_validation_engine, license: Apache-2.0) ----
# coding: utf-8
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from infra_validation_engine.core import InfraTest, InfraTestType
from infra_validation_engine.utils.constants import Constants
from infra_validation_engine.core.exceptions import PackageNotFoundError, ServiceNotRunningError, \
    CommandExecutionError
class DockerConstants(Constants):
    DOCKER_PKG_NAME = "docker-ce"


class DockerImageNotFoundError(Exception):
    """ Raised if Docker Image is not found """
    pass


class DockerContainerNotFoundError(Exception):
    """ Raised if container is not present on a node """
    pass


class DockerContainerNotRunningError(Exception):
    """ Raised if container is present but is not running """
    pass
class DockerInstallationTest(InfraTest):
    """Test if Docker is installed on the nodes"""
    __metaclass__ = InfraTestType

    def __init__(self, host, fqdn):
        InfraTest.__init__(self,
                           "Docker Installation Test",
                           "Check if {pkg} is installed on {fqdn}".format(pkg=DockerConstants.DOCKER_PKG_NAME,
                                                                          fqdn=fqdn),
                           host,
                           fqdn)

    def run(self):
        cmd = self.host.run("docker --version")
        return cmd.rc == 0

    def fail(self):
        err_msg = "Package {pkg} is not installed on {fqdn}".format(pkg=DockerConstants.DOCKER_PKG_NAME, fqdn=self.fqdn)
        raise PackageNotFoundError(err_msg)
class DockerServiceTest(InfraTest):
    """
    Test if docker is running on a node
    """
    __metaclass__ = InfraTestType

    def __init__(self, host, fqdn):
        InfraTest.__init__(self,
                           "Docker Service Test",
                           "Check if docker is running on {fqdn}".format(fqdn=fqdn),
                           host,
                           fqdn)

    def run(self):
        cmd = self.host.run("docker ps -a")
        return cmd.rc == 0

    def fail(self):
        err_msg = "Docker is not running on {fqdn}".format(fqdn=self.fqdn)
        raise ServiceNotRunningError(err_msg)
class DockerImageTest(InfraTest):
    """
    Check if a given image is present on the host
    """

    def __init__(self, host, fqdn, image):
        InfraTest.__init__(self,
                           "Docker Image Test",
                           "Check if {image} is present on {fqdn}".format(image=image, fqdn=fqdn),
                           host,
                           fqdn)
        self.image = image

    def run(self):
        cmd_str = 'docker image ls -q -f "reference={image}"'.format(image=self.image)
        cmd = self.host.run(cmd_str)
        print(cmd.rc, cmd.stdout == "")
        if cmd.stdout == "":
            return False
        self.out = cmd.stdout.split("\n")
        print(self.out)
        # stdout contains one extra line
        is_single_image = len(self.out) == 2
        if is_single_image:
            self.out = self.out
            self.message = "The Image ID for {image} on {fqdn} is {id}".format(image=self.image, fqdn=self.fqdn,
                                                                               id=self.out)
        else:
            self.message = "Multiple docker images found for {image}".format(image=self.image)
            self.warn = True
        return True

    def fail(self):
        err_msg = "Docker Image {image} was not found on {fqdn}".format(image=self.image, fqdn=self.fqdn)
        raise DockerImageNotFoundError(err_msg)
class DockerContainerStatusTest(InfraTest):
    """ Tests if container is running """

    def __init__(self, host, fqdn, container):
        InfraTest.__init__(self,
                           "Docker Container Status Test",
                           "Check if {container} is running on {fqdn}".format(container=container, fqdn=fqdn),
                           host,
                           fqdn)
        self.container = container
        self.cmd_str = ""

    def run(self):
        cmd_str = "docker inspect -f '{{.State.Running}}' " + "{container}".format(container=self.container)
        cmd = self.host.run(cmd_str)
        self.rc = cmd.rc
        self.err = cmd.stderr
        self.out = cmd.stdout
        test_status = False
        if self.out.strip() == "true":
            test_status = True
        return test_status

    def fail(self):
        if self.rc == 1:
            err_msg = "Container {container} could not be found on {fqdn}".format(container=self.container,
                                                                                  fqdn=self.fqdn)
            raise DockerContainerNotFoundError(err_msg)
        elif self.rc == 127:
            err_msg = "Command {cmd_str} could not be executed on {fqdn}".format(cmd_str=self.cmd_str, fqdn=self.fqdn)
            raise CommandExecutionError(err_msg)
        elif self.rc == 0:
            err_msg = "Docker container {container} is present but is not running on {fqdn}".format(
                container=self.container,
                fqdn=self.fqdn)
            raise DockerContainerNotRunningError(err_msg)
# class DockerContainerSanityTest(InfraTest):
# """
# Executes sanity check script for a container
# """
# pass
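`DockerContainerStatusTest.run()` shells out to `docker inspect` with a Go template that prints only the container's running flag. The command construction itself can be unit-tested without a Docker daemon; the helper name below is invented:

```python
def build_inspect_cmd(container: str) -> str:
    # '{{.State.Running}}' makes the docker CLI print just "true" or "false".
    return "docker inspect -f '{{.State.Running}}' " + container

print(build_inspect_cmd("web"))  # docker inspect -f '{{.State.Running}}' web
```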
# ---- hipyelp/migrations/0014_rename_tagname_drinktag_drinktagname.py (repo: buggydev1/Hip_yelp_Back, license: MIT) ----
# Generated by Django 3.2 on 2021-04-25 02:49
from django.db import migrations
class Migration(migrations.Migration):

    dependencies = [
        ('hipyelp', '0013_alter_drinktag_tagname'),
    ]

    operations = [
        migrations.RenameField(
            model_name='drinktag',
            old_name='tagName',
            new_name='drinktagName',
        ),
    ]
# ---- webapp/models/req_error_log.py (repo: myfreshcity/mystock, license: MIT) ----
from datetime import datetime
from webapp.services import db
class ReqErrorLog(db.Model):
    __tablename__ = 'req_error_log'

    id = db.Column(db.Integer, primary_key=True)
    action = db.Column(db.String(50))
    key = db.Column(db.String(255))
    msg = db.Column(db.String(2000))
    created_time = db.Column(db.DateTime, default=datetime.now)

    def __init__(self, action, key, msg):
        self.action = action
        self.key = key
        self.msg = msg

    def __repr__(self):
        return '<ReqErrorLog %r>' % self.id
# ---- frappe-bench/apps/erpnext/erpnext/buying/doctype/supplier_scorecard_standing/supplier_scorecard_standing.py (repo: Semicheche/foa_frappe_docker, license: MIT) ----
# -*- coding: utf-8 -*-
# Copyright (c) 2017, Frappe Technologies Pvt. Ltd. and contributors
# For license information, please see license.txt
from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
class SupplierScorecardStanding(Document):
    pass


@frappe.whitelist()
def get_scoring_standing(standing_name):
    standing = frappe.get_doc("Supplier Scorecard Standing", standing_name)
    return standing


@frappe.whitelist()
def get_standings_list():
    standings = frappe.db.sql("""
        SELECT
            scs.name
        FROM
            `tabSupplier Scorecard Standing` scs""",
        {}, as_dict=1)
    return standings
return standings | 21.793103 | 72 | 0.761076 | 78 | 632 | 6 | 0.615385 | 0.064103 | 0.076923 | 0.089744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011029 | 0.139241 | 632 | 29 | 73 | 21.793103 | 0.849265 | 0.21519 | 0 | 0.111111 | 0 | 0 | 0.192698 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0.055556 | 0.166667 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1baabd5bd4355d53876aed7f693222e894bfb08a | 5,524 | py | Python | backend/hqlib/metric_source/issue_log/jira_action_list.py | ICTU/quality-report | f6234e112228ee7cfe6476c2d709fe244579bcfe | [
"Apache-2.0"
] | 25 | 2016-11-25T10:41:24.000Z | 2021-07-03T14:02:49.000Z | backend/hqlib/metric_source/issue_log/jira_action_list.py | ICTU/quality-report | f6234e112228ee7cfe6476c2d709fe244579bcfe | [
"Apache-2.0"
] | 783 | 2016-09-19T12:10:21.000Z | 2021-01-04T20:39:15.000Z | backend/hqlib/metric_source/issue_log/jira_action_list.py | ICTU/quality-report | f6234e112228ee7cfe6476c2d709fe244579bcfe | [
"Apache-2.0"
] | 15 | 2015-03-25T13:52:49.000Z | 2021-03-08T17:17:56.000Z | """
Copyright 2012-2019 Ministerie van Sociale Zaken en Werkgelegenheid
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import logging
import datetime
from typing import List, Tuple
import dateutil.parser
from dateutil.relativedelta import relativedelta
from hqlib import utils
from hqlib.metric_source.abstract.issue_log import ActionLog
from hqlib.metric_source import JiraFilter
class JiraActionList(ActionLog):
    """ Jira used as an action list """
    metric_source_name = 'Jira Action List'

    def __init__(self, url: str, username: str, password: str, field_name: str = 'duedate', *args, **kwargs) -> None:
        self._fields_to_ignore = kwargs.pop('fields_to_ignore', [])
        self.__url = url
        self.__field_name = field_name
        self.__jira_filter = JiraFilter(self.__url, username, password, self.__field_name)
        super().__init__(*args, **kwargs)

    @classmethod
    def _is_str_date_before(cls, str_date: str, limit_date: datetime.datetime) -> bool:
        return utils.parse_iso_datetime_local_naive(str_date) < limit_date

    def _get_issues_older_than(self, *metric_source_ids: str,
                               limit_date: datetime.datetime) -> List[Tuple[str, str, str]]:
        try:
            extra_fields = ['updated', 'created'] + [list(field.keys())[0] for field in self._fields_to_ignore]
            issues = self.__jira_filter.issues_with_field_exceeding_value(
                *metric_source_ids,
                extra_fields=extra_fields,
                compare=self._is_str_date_before, limit_value=limit_date)
            return [i for i in issues if not self.__should_issue_be_ignored(i)]
        except IndexError as reason:
            logging.error("Jira filter result for overdue issues inadequate. Reason: %s.", reason)
            return None

    def __should_issue_be_ignored(self, issue) -> bool:
        for index, ignore in enumerate(self._fields_to_ignore):
            if issue[index + 5] == list(ignore.values())[0]:
                return True
        return False

    def ignored_lists(self) -> List[str]:
        """ Return the ignored lists. """
        return self._fields_to_ignore

    def nr_of_over_due_actions(self, *metric_source_ids: str) -> int:
        """ Return the number of over due actions. """
        issue_list = self._get_issues_older_than(*metric_source_ids, limit_date=datetime.datetime.now())
        return len(issue_list) if issue_list is not None else -1

    def over_due_actions_url(self, *metric_source_ids: str) -> List[Tuple[str, str, str]]:
        """ Return the urls to the over due actions. """
        issue_list = self._get_issues_older_than(*metric_source_ids, limit_date=datetime.datetime.now())
        return [(issue[0], issue[1], self.__get_formatted_time_delta(issue[2])) for issue in issue_list] \
            if issue_list is not None else []

    def nr_of_inactive_actions(self, *metric_source_ids: str) -> int:
        """ Return the number of inactive actions. """
        issue_list = self._get_issues_inactive_for(*metric_source_ids)
        return len(issue_list) if issue_list is not None else -1

    def inactive_actions_url(self, *metric_source_ids: str) -> List[Tuple[str, str, str]]:
        """ Return the urls for the inactive actions. """
        issue_list = self._get_issues_inactive_for(*metric_source_ids)
        return [(issue[0], issue[1], self.__get_formatted_time_delta(issue[3])) for issue in issue_list] \
            if issue_list is not None else []

    def _get_issues_inactive_for(self, *metric_source_ids, delta: relativedelta = relativedelta(days=14)):
        try:
            overdue_issue_list = self._get_issues_older_than(*metric_source_ids, limit_date=datetime.datetime.now())
            return [issue for issue in overdue_issue_list if issue[3] is not None
                    and utils.parse_iso_datetime_local_naive(issue[3]) <= (datetime.datetime.now() - delta)]
        except IndexError as reason:
            logging.error("Jira filter result for inactive issues inadequate. Reason: %s.", reason)
            return None

    @classmethod
    def __get_formatted_time_delta(cls, date_to_parse) -> str:
        return utils.format_timedelta(datetime.datetime.now().astimezone() - dateutil.parser.parse(date_to_parse))

    def metric_source_urls(self, *metric_source_ids: str) -> List[str]:
        """ Return the url(s) to the metric source for the metric source id. """
        return self.__jira_filter.metric_source_urls(*metric_source_ids)

    def datetime(self, *metric_source_ids: str) -> datetime.datetime:  # pylint: disable=unused-argument,no-self-use
        """ Return the date and time of the last measurement. """
        issue_list = self._get_issues_older_than(*metric_source_ids, limit_date=datetime.datetime.now())
        return max([max(utils.parse_iso_datetime_local_naive(issue[4]),
                        utils.parse_iso_datetime_local_naive(issue[3]) if issue[3] else datetime.datetime.min)
                    for issue in issue_list])
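The overdue/inactive tests above boil down to parsing an ISO timestamp and comparing it against "now minus a grace period". A dependency-free sketch with `datetime.fromisoformat` (hqlib's `parse_iso_datetime_local_naive` may accept more formats than this):

```python
from datetime import datetime, timedelta

def is_inactive(last_updated_iso: str, now: datetime,
                period: timedelta = timedelta(days=14)) -> bool:
    # True when the issue was last touched more than `period` before `now`.
    return datetime.fromisoformat(last_updated_iso) <= now - period

now = datetime(2019, 2, 1)
print(is_inactive("2019-01-01T00:00:00", now))  # True
print(is_inactive("2019-01-25T00:00:00", now))  # False
```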
# ---- v2x_solution/car/admin.py (repo: Michaelwwgo/V2X_Project, license: MIT) ----
from django.contrib import admin
from . import models


@admin.register(models.Car)
class CarAdmin(admin.ModelAdmin):
    list_display = (
        'number',
        'isFind',
        'created_at',
        'updated_at',
    )


@admin.register(models.CriminalCar)
class CriminalCarAdmin(admin.ModelAdmin):
    list_display = (
        'car',
        'image',
        'created_at',
        'updated_at',
    )


@admin.register(models.PostCar)
class PostCarAdmin(admin.ModelAdmin):
    list_display = (
        'name',
        'number',
        'owner',
        'created_at',
        'updated_at',
    )

# ---- ziggurat_foundations/ext/pyramid/sign_in.py (repo: fmigneault/ziggurat_foundations, license: BSD-3-Clause) ----
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

import importlib
import logging

import pyramid.security

from ziggurat_foundations.models.base import get_db_session
from ziggurat_foundations.models.services.user import UserService

CONFIG_KEY = "ziggurat_foundations"

log = logging.getLogger(__name__)


class ZigguratSignInSuccess(object):
    def __contains__(self, other):
        return True

    def __init__(self, headers, came_from, user):
        self.headers = headers
        self.came_from = came_from
        self.user = user


class ZigguratSignInBadAuth(object):
    def __contains__(self, other):
        return False

    def __init__(self, headers, came_from):
        self.headers = headers
        self.came_from = came_from


class ZigguratSignOut(object):
    def __contains__(self, other):
        return True

    def __init__(self, headers):
        self.headers = headers


def includeme(config):
    settings = config.registry.settings
    sign_in_path = settings.get("%s.sign_in.sign_in_pattern" % CONFIG_KEY, "/sign_in")
    sign_out_path = settings.get(
        "%s.sign_in.sign_out_pattern" % CONFIG_KEY, "/sign_out"
    )
    session_provider_callable_config = settings.get(
        "%s.session_provider_callable" % CONFIG_KEY
    )
    signin_came_from_key = settings.get(
        "%s.sign_in.came_from_key" % CONFIG_KEY, "came_from"
    )
    signin_username_key = settings.get("%s.sign_in.username_key" % CONFIG_KEY, "login")
    signin_password_key = settings.get(
        "%s.sign_in.password_key" % CONFIG_KEY, "password"
    )

    if not session_provider_callable_config:

        def session_provider_callable(request):
            return get_db_session()

    else:
        parts = session_provider_callable_config.split(":")
        _tmp = importlib.import_module(parts[0])
        session_provider_callable = getattr(_tmp, parts[1])

    endpoint = ZigguratSignInProvider(
        settings=settings,
        session_getter=session_provider_callable,
        signin_came_from_key=signin_came_from_key,
        signin_username_key=signin_username_key,
        signin_password_key=signin_password_key,
    )

    config.add_route(
        "ziggurat.routes.sign_in",
        sign_in_path,
        use_global_views=True,
        factory=endpoint.sign_in,
    )
    config.add_route(
        "ziggurat.routes.sign_out",
        sign_out_path,
        use_global_views=True,
        factory=endpoint.sign_out,
    )


class ZigguratSignInProvider(object):
    def __init__(self, *args, **kwargs):
        for k, v in kwargs.items():
            setattr(self, k, v)

    def sign_in(self, request):
        came_from = request.params.get(self.signin_came_from_key, "/")
        db_session = self.session_getter(request)
        user = UserService.by_user_name(
            request.params.get(self.signin_username_key), db_session=db_session
        )
        if user is None:
            # if no result, test to see if email exists
            user = UserService.by_email(
                request.params.get(self.signin_username_key), db_session=db_session
            )
        if user:
            password = request.params.get(self.signin_password_key)
            if UserService.check_password(user, password):
                headers = pyramid.security.remember(request, user.id)
return ZigguratSignInSuccess(
headers=headers, came_from=came_from, user=user
)
headers = pyramid.security.forget(request)
return ZigguratSignInBadAuth(headers=headers, came_from=came_from)
def sign_out(self, request):
headers = pyramid.security.forget(request)
return ZigguratSignOut(headers=headers)
def session_getter(self, request):
raise NotImplementedError()
| 30.448 | 87 | 0.672097 | 447 | 3,806 | 5.364653 | 0.228188 | 0.056714 | 0.067139 | 0.033361 | 0.382819 | 0.323186 | 0.183486 | 0.161802 | 0.095913 | 0.095913 | 0 | 0.001037 | 0.239622 | 3,806 | 124 | 88 | 30.693548 | 0.827574 | 0.016553 | 0 | 0.1875 | 0 | 0 | 0.069251 | 0.052941 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.052083 | 0.072917 | 0.041667 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1bbe66e627f5fbbe75b9f195c03efc8b24413bf8 | 489 | py | Python | solutions/2019/prob_15.py | PolPtoAmo/HPCodeWarsBCN | 8a98b1feb6d8b7d2d5b8b4dace3e02af9e6bb4e8 | [
"MIT"
] | 1 | 2021-02-27T09:46:06.000Z | 2021-02-27T09:46:06.000Z | solutions/2019/prob_15.py | PolPtoAmo/HPCodeWarsBCN | 8a98b1feb6d8b7d2d5b8b4dace3e02af9e6bb4e8 | [
"MIT"
] | null | null | null | solutions/2019/prob_15.py | PolPtoAmo/HPCodeWarsBCN | 8a98b1feb6d8b7d2d5b8b4dace3e02af9e6bb4e8 | [
"MIT"
] | 1 | 2021-02-27T12:03:33.000Z | 2021-02-27T12:03:33.000Z | entrada = input()
def fibonacci(n):
fib = [0, 1, 1]
if n > 0 and n <= 2:
return fib[1]
elif n == 0:
return fib[0]
else:
for x in range(3, n+1):
fib.append(0)
fib[x] = fib[x-1] + fib[x-2]
return fib[n]
numeros = entrada.split(' ')
numeros = list(map(int, numeros))
print(str(fibonacci(numeros[0])) + " " + str(fibonacci(numeros[1])) + " "
+ str(fibonacci(numeros[2])) + " " + str(fibonacci(numeros[3])))
| 19.56 | 73 | 0.503067 | 71 | 489 | 3.464789 | 0.380282 | 0.195122 | 0.308943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050147 | 0.306748 | 489 | 24 | 74 | 20.375 | 0.675516 | 0 | 0 | 0 | 0 | 0 | 0.00818 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0 | 0 | 0.25 | 0.0625 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bbeef824c196490b52c030d0f2de427178e48b0 | 441 | py | Python | DomJudge/practica10/EsMonti.py | Camiloasc1/AlgorithmsUNAL | 1542b8f2c170f9b5a24638f05ae50fa2c85cfc7b | [
"MIT"
] | null | null | null | DomJudge/practica10/EsMonti.py | Camiloasc1/AlgorithmsUNAL | 1542b8f2c170f9b5a24638f05ae50fa2c85cfc7b | [
"MIT"
] | null | null | null | DomJudge/practica10/EsMonti.py | Camiloasc1/AlgorithmsUNAL | 1542b8f2c170f9b5a24638f05ae50fa2c85cfc7b | [
"MIT"
] | null | null | null | import sys
def isHeap(H):
    # Verify the min-heap property for every node, not just the first child:
    # the original 'else: return True' inside the loop exited after one check.
    for i in xrange(1, len(H)):
        if H[parent(i)] > H[i]:
            return False
    return True
def parent(i):
return (i - 1) / 2
def read():
while True:
if int(sys.stdin.readline()) == 0:
return
line = sys.stdin.readline()
hp = map(int, line.split())
if isHeap(hp):
print 'Yes'
else:
print 'No'
read()
| 16.961538 | 42 | 0.46712 | 59 | 441 | 3.491525 | 0.525424 | 0.067961 | 0.15534 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015038 | 0.396825 | 441 | 25 | 43 | 17.64 | 0.759399 | 0 | 0 | 0.1 | 0 | 0 | 0.011338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.05 | null | null | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bca72364354424e756eb0857899935691ba79bb | 422 | py | Python | HIS_void/patient/migrations/0007_auto_20210429_2316.py | YuanchenZhu2020/HIS_void | 7289bf537e9fc4b09750bbca76a4cc8354dc770f | [
"MIT"
] | null | null | null | HIS_void/patient/migrations/0007_auto_20210429_2316.py | YuanchenZhu2020/HIS_void | 7289bf537e9fc4b09750bbca76a4cc8354dc770f | [
"MIT"
] | null | null | null | HIS_void/patient/migrations/0007_auto_20210429_2316.py | YuanchenZhu2020/HIS_void | 7289bf537e9fc4b09750bbca76a4cc8354dc770f | [
"MIT"
] | null | null | null | # Generated by Django 3.1.7 on 2021-04-29 15:16
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('rbac', '0002_auto_20210426_2345'),
('patient', '0006_auto_20210429_1431'),
]
operations = [
migrations.RenameModel(
old_name='PatientURLPermissions',
new_name='PatientURLPermission',
),
]
| 22.210526 | 48 | 0.597156 | 41 | 422 | 5.95122 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158249 | 0.296209 | 422 | 18 | 49 | 23.444444 | 0.6633 | 0.106635 | 0 | 0 | 1 | 0 | 0.27451 | 0.187675 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bd09b12f21d9acb607142944ec3b8ac94113d29 | 3,236 | py | Python | xero_python/payrolluk/models/timesheet_line_object.py | sromero84/xero-python | 89558c0baa8080c3f522701eb1b94f909248dbd7 | [
"MIT"
] | null | null | null | xero_python/payrolluk/models/timesheet_line_object.py | sromero84/xero-python | 89558c0baa8080c3f522701eb1b94f909248dbd7 | [
"MIT"
] | null | null | null | xero_python/payrolluk/models/timesheet_line_object.py | sromero84/xero-python | 89558c0baa8080c3f522701eb1b94f909248dbd7 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
Xero Payroll UK
This is the Xero Payroll API for orgs in the UK region. # noqa: E501
OpenAPI spec version: 2.3.4
Contact: api@xero.com
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
from xero_python.models import BaseModel
class TimesheetLineObject(BaseModel):
"""NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
"""
Attributes:
openapi_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
openapi_types = {
"pagination": "Pagination",
"problem": "Problem",
"timesheet_line": "TimesheetLine",
}
attribute_map = {
"pagination": "pagination",
"problem": "problem",
"timesheet_line": "timesheetLine",
}
def __init__(
self, pagination=None, problem=None, timesheet_line=None
): # noqa: E501
"""TimesheetLineObject - a model defined in OpenAPI""" # noqa: E501
self._pagination = None
self._problem = None
self._timesheet_line = None
self.discriminator = None
if pagination is not None:
self.pagination = pagination
if problem is not None:
self.problem = problem
if timesheet_line is not None:
self.timesheet_line = timesheet_line
@property
def pagination(self):
"""Gets the pagination of this TimesheetLineObject. # noqa: E501
:return: The pagination of this TimesheetLineObject. # noqa: E501
:rtype: Pagination
"""
return self._pagination
@pagination.setter
def pagination(self, pagination):
"""Sets the pagination of this TimesheetLineObject.
:param pagination: The pagination of this TimesheetLineObject. # noqa: E501
:type: Pagination
"""
self._pagination = pagination
@property
def problem(self):
"""Gets the problem of this TimesheetLineObject. # noqa: E501
:return: The problem of this TimesheetLineObject. # noqa: E501
:rtype: Problem
"""
return self._problem
@problem.setter
def problem(self, problem):
"""Sets the problem of this TimesheetLineObject.
:param problem: The problem of this TimesheetLineObject. # noqa: E501
:type: Problem
"""
self._problem = problem
@property
def timesheet_line(self):
"""Gets the timesheet_line of this TimesheetLineObject. # noqa: E501
:return: The timesheet_line of this TimesheetLineObject. # noqa: E501
:rtype: TimesheetLine
"""
return self._timesheet_line
@timesheet_line.setter
def timesheet_line(self, timesheet_line):
"""Sets the timesheet_line of this TimesheetLineObject.
:param timesheet_line: The timesheet_line of this TimesheetLineObject. # noqa: E501
:type: TimesheetLine
"""
self._timesheet_line = timesheet_line
| 26.096774 | 92 | 0.621755 | 343 | 3,236 | 5.758017 | 0.230321 | 0.125063 | 0.151899 | 0.132152 | 0.434937 | 0.352405 | 0.317975 | 0.112911 | 0.038481 | 0.038481 | 0 | 0.01886 | 0.295426 | 3,236 | 123 | 93 | 26.308943 | 0.847368 | 0.40204 | 0 | 0.068182 | 1 | 0 | 0.084958 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.159091 | false | 0 | 0.045455 | 0 | 0.340909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bd0cba8c6f81882ac13039efa512c0dfba5b8e9 | 4,492 | py | Python | monocle/resources.py | bigjust/django-monocle | f546a9061834b41a1a701ef89f8371bfaf4f1691 | [
"MIT"
] | 1 | 2019-04-21T17:25:16.000Z | 2019-04-21T17:25:16.000Z | monocle/resources.py | bigjust/django-monocle | f546a9061834b41a1a701ef89f8371bfaf4f1691 | [
"MIT"
] | null | null | null | monocle/resources.py | bigjust/django-monocle | f546a9061834b41a1a701ef89f8371bfaf4f1691 | [
"MIT"
] | null | null | null | import json
import os
import time
from django.template import Context
from django.template.loader import get_template
from django.utils.safestring import mark_safe
from monocle.settings import settings
class Resource(object):
"""
A JSON compatible response from an OEmbed provider
"""
def __init__(self, url, data=None):
self.url = url
self.created = time.time()
self._data = data or {}
def __getitem__(self, key):
if key == 'cache_age':
return self.ttl
return self._data.get(key, '')
def __setitem__(self, key, value):
if key == 'cache_age':
self.ttl = value
else:
self._data[key] = value
def __contains__(self, key):
return key in self._data
def render(self):
"""
Renders this resource to the template corresponding to this resource type.
The template is rendered with variables ``url`` and ``resource`` that represent
the original requested URL and this resource respectively.
If the resource is considered invalid from :func:`is_valid`, the original
requested URL is returned unless ``RESOURCE_URLIZE_INVALID`` is configured
in :mod:`monocle.settings`. If so, then the original URL is returned hyperlinked
:returns: Rendered oembed content
"""
if not self.is_valid:
if settings.RESOURCE_URLIZE_INVALID:
template_name = 'monocle/link.html'
else:
return self.url
else:
template_name = os.path.join('monocle', '%s.html' % self._data['type'])
template = get_template(template_name)
return mark_safe(template.render(Context({'url': self.url, 'resource': self})))
@property
def is_valid(self):
"""
Perform validation against this resource object. The resource is considered
valid if it meets the following criteria:
* It has oembed response data
* It is a valid oembed resource type
* It has the required attributes based on its type
"""
# We can create resources without valid data
if not self._data:
return False
# Must be a valid type
if self._data.get('type') not in settings.RESOURCE_TYPES:
return False
# Must have required fields
has_required = True
for field in settings.RESOURCE_REQUIRED_ATTRS[self._data['type']]:
has_required = has_required and (field in self._data)
if not has_required:
return False
return True
@property
def is_stale(self):
"""
True of the current timestamp is greater than the sum of the resource's
creation timestamp plus its TTL, False otherwise.
"""
return (time.time() - self.created) > self.ttl
def refresh(self):
"""
Returns a version of this resource that is considered fresh by updating
its internal timestamp to now
"""
self.created = time.time()
return self
@property
def json(self):
"""
A JSON string without any empty or null keys
"""
return json.dumps(dict([(k, v) for k, v in self._data.items() if v]))
def get_ttl(self):
"""
Returns the TTL of this resource ensuring that it at minimum the value
of ``RESOURCE_MIN_TTL`` from :mod:`monocle.settings`.
This value could be specified by the provider via the property ``cache_age``.
If it is not, the value ``RESOURCE_DEFAULT_TTL`` from :mod:`monocle.settings`
is used.
:returns: TTL in seconds
"""
try:
return max(settings.RESOURCE_MIN_TTL,
int(self._data.get('cache_age', settings.RESOURCE_DEFAULT_TTL)))
except (ValueError, TypeError):
return settings.RESOURCE_DEFAULT_TTL
def set_ttl(self, value):
"""
Sets the TTL value of this resource ensuring that it is at minimum the value
of ``RESOURCE_MIN_TTL`` from :mod:`monocle.settings`. If it is not, the value
of ``RESOURCE_DEFAULT_TTL`` from :mod:`monocle.settings` is used.
"""
try:
value = max(settings.RESOURCE_MIN_TTL, int(value))
except (ValueError, TypeError):
value = settings.RESOURCE_DEFAULT_TTL
self._data['cache_age'] = value
ttl = property(get_ttl, set_ttl)
| 32.085714 | 88 | 0.616652 | 567 | 4,492 | 4.749559 | 0.276896 | 0.035648 | 0.03342 | 0.025251 | 0.129224 | 0.129224 | 0.075009 | 0.075009 | 0.075009 | 0.040847 | 0 | 0 | 0.300089 | 4,492 | 139 | 89 | 32.316547 | 0.856552 | 0.370659 | 0 | 0.253731 | 0 | 0 | 0.036058 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164179 | false | 0 | 0.104478 | 0.014925 | 0.507463 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1bd33e1813ca1443a89a2ae8c80e1f64e10a3043 | 316 | py | Python | tods/sk_interface/detection_algorithm/SOD_skinterface.py | ZhuangweiKang/tods | fe3f55f8ccb306dd292c668e0f1154f1afdfa556 | [
"Apache-2.0"
] | 544 | 2020-09-21T06:02:33.000Z | 2022-03-27T07:16:32.000Z | tods/sk_interface/detection_algorithm/SOD_skinterface.py | ZhuangweiKang/tods | fe3f55f8ccb306dd292c668e0f1154f1afdfa556 | [
"Apache-2.0"
] | 35 | 2020-09-21T06:33:13.000Z | 2022-03-11T14:20:21.000Z | tods/sk_interface/detection_algorithm/SOD_skinterface.py | ZhuangweiKang/tods | fe3f55f8ccb306dd292c668e0f1154f1afdfa556 | [
"Apache-2.0"
] | 86 | 2020-09-21T16:44:33.000Z | 2022-03-11T18:20:22.000Z | import numpy as np
from ..base import BaseSKI
from tods.detection_algorithm.PyodSOD import SODPrimitive
class SODSKI(BaseSKI):
def __init__(self, **hyperparams):
super().__init__(primitive=SODPrimitive, **hyperparams)
self.fit_available = True
self.predict_available = True
self.produce_available = False
| 28.727273 | 57 | 0.787975 | 39 | 316 | 6.076923 | 0.666667 | 0.109705 | 0.14346 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123418 | 316 | 10 | 58 | 31.6 | 0.855596 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1bd36a5a050517e6c0b779b2d7a6a6b544da049f | 294 | py | Python | app/db_fill.py | justincredble/Circulation | 9c7537261a89acf737268943ab69fcb6f7d2af7f | [
"MIT"
] | null | null | null | app/db_fill.py | justincredble/Circulation | 9c7537261a89acf737268943ab69fcb6f7d2af7f | [
"MIT"
] | null | null | null | app/db_fill.py | justincredble/Circulation | 9c7537261a89acf737268943ab69fcb6f7d2af7f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from app import app, db
from app.models import User, Role
app_ctx = app.app_context()
app_ctx.push()
db.create_all()
Role.insert_roles()
admin = User(name=u'root', email='root@gmail.com', password='password')
db.session.add(admin)
db.session.commit()
app_ctx.pop()
| 17.294118 | 71 | 0.707483 | 49 | 294 | 4.122449 | 0.591837 | 0.089109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003846 | 0.115646 | 294 | 16 | 72 | 18.375 | 0.773077 | 0.071429 | 0 | 0 | 0 | 0 | 0.095941 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.1 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1bd98169236fadcdd2f4be3bc8f9290368ceb686 | 15,192 | py | Python | test/test_api.py | archoversight/u2fval | ec8b6b9e65f880fd609c7e9f82638696341c781f | [
"BSD-2-Clause"
] | 75 | 2015-02-10T08:33:28.000Z | 2021-11-16T21:10:56.000Z | test/test_api.py | archoversight/u2fval | ec8b6b9e65f880fd609c7e9f82638696341c781f | [
"BSD-2-Clause"
] | 39 | 2015-04-09T04:34:26.000Z | 2021-09-04T23:18:58.000Z | test/test_api.py | archoversight/u2fval | ec8b6b9e65f880fd609c7e9f82638696341c781f | [
"BSD-2-Clause"
] | 31 | 2015-02-26T08:38:29.000Z | 2022-02-17T20:03:25.000Z | from u2fval import app, exc
from u2fval.model import db, Client
from .soft_u2f_v2 import SoftU2FDevice, CERT
from six.moves.urllib.parse import quote
from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.serialization import Encoding
import unittest
import json
class RestApiTest(unittest.TestCase):
def setUp(self):
app.config['TESTING'] = True
app.config['ALLOW_UNTRUSTED'] = True
db.session.close()
db.drop_all()
db.create_all()
db.session.add(Client('fooclient', 'https://example.com',
['https://example.com']))
db.session.commit()
self.app = app.test_client()
def test_call_without_client(self):
resp = self.app.get('/')
self.assertEqual(resp.status_code, 400)
err = json.loads(resp.data.decode('utf8'))
self.assertEqual(err['errorCode'], exc.BadInputException.code)
def test_call_with_invalid_client(self):
resp = self.app.get('/', environ_base={'REMOTE_USER': 'invalid'})
self.assertEqual(resp.status_code, 404)
err = json.loads(resp.data.decode('utf8'))
self.assertEqual(err['errorCode'], exc.BadInputException.code)
def test_get_trusted_facets(self):
resp = json.loads(
self.app.get('/', environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertIn('https://example.com', resp['trustedFacets'][0]['ids'])
def test_list_empty_devices(self):
resp = json.loads(
self.app.get('/foouser', environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(resp, [])
def test_begin_auth_without_devices(self):
resp = self.app.get('/foouser/sign',
environ_base={'REMOTE_USER': 'fooclient'})
self.assertEqual(resp.status_code, 400)
err = json.loads(resp.data.decode('utf8'))
self.assertEqual(err['errorCode'], exc.NoEligibleDevicesException.code)
def test_register(self):
device = SoftU2FDevice()
self.do_register(device, {'foo': 'bar'})
def test_sign(self):
device = SoftU2FDevice()
self.do_register(device, {'foo': 'bar', 'baz': 'one'})
descriptor = self.do_sign(device, {'baz': 'two'})
self.assertEqual(descriptor['properties'],
{'foo': 'bar', 'baz': 'two'})
def test_get_properties(self):
device = SoftU2FDevice()
descriptor = self.do_register(device, {'foo': 'bar', 'baz': 'foo'})
descriptor2 = json.loads(
self.app.get('/foouser/' + descriptor['handle'],
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(descriptor2['properties'],
{'foo': 'bar', 'baz': 'foo'})
def test_update_properties(self):
device = SoftU2FDevice()
desc = self.do_register(device,
{'foo': 'one', 'bar': 'one', 'baz': 'one'})
self.assertEqual({
'foo': 'one',
'bar': 'one',
'baz': 'one'
}, desc['properties'])
desc2 = json.loads(self.app.post(
'/foouser/' + desc['handle'],
environ_base={'REMOTE_USER': 'fooclient'},
data=json.dumps({'bar': 'two', 'baz': None})
).data.decode('utf8'))
self.assertEqual({
'foo': 'one',
'bar': 'two'
}, desc2['properties'])
desc3 = json.loads(self.app.get(
'/foouser/' + desc['handle'],
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(desc2['properties'], desc3['properties'])
def test_get_devices(self):
self.do_register(SoftU2FDevice())
self.do_register(SoftU2FDevice())
self.do_register(SoftU2FDevice())
resp = json.loads(
self.app.get('/foouser', environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(resp), 3)
def test_get_device_descriptor_and_cert(self):
desc = self.do_register(SoftU2FDevice())
desc2 = json.loads(
self.app.get('/foouser/' + desc['handle'],
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(desc, desc2)
cert = x509.load_pem_x509_certificate(self.app.get(
'/foouser/' + desc['handle'] + '/certificate',
environ_base={'REMOTE_USER': 'fooclient'}
).data, default_backend())
self.assertEqual(CERT, cert.public_bytes(Encoding.DER))
def test_get_invalid_device(self):
resp = self.app.get('/foouser/' + ('ab' * 16),
environ_base={'REMOTE_USER': 'fooclient'}
)
self.assertEqual(resp.status_code, 404)
self.do_register(SoftU2FDevice())
resp = self.app.get('/foouser/' + ('ab' * 16),
environ_base={'REMOTE_USER': 'fooclient'}
)
self.assertEqual(resp.status_code, 404)
resp = self.app.get('/foouser/InvalidHandle',
environ_base={'REMOTE_USER': 'fooclient'}
)
self.assertEqual(resp.status_code, 400)
def test_delete_user(self):
self.do_register(SoftU2FDevice())
self.do_register(SoftU2FDevice())
self.do_register(SoftU2FDevice())
self.app.delete('/foouser',
environ_base={'REMOTE_USER': 'fooclient'})
resp = json.loads(
self.app.get('/foouser', environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(resp, [])
def test_delete_devices(self):
d1 = self.do_register(SoftU2FDevice())
d2 = self.do_register(SoftU2FDevice())
d3 = self.do_register(SoftU2FDevice())
self.app.delete('/foouser/' + d2['handle'],
environ_base={'REMOTE_USER': 'fooclient'})
resp = json.loads(
self.app.get('/foouser',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(resp), 2)
self.app.delete('/foouser/' + d1['handle'],
environ_base={'REMOTE_USER': 'fooclient'})
resp = json.loads(
self.app.get('/foouser',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(resp), 1)
self.assertEqual(d3, resp[0])
self.app.delete('/foouser/' + d3['handle'],
environ_base={'REMOTE_USER': 'fooclient'})
resp = json.loads(
self.app.get('/foouser',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(resp, [])
def test_set_properties_during_register(self):
device = SoftU2FDevice()
reg_req = json.loads(self.app.get(
'/foouser/register?properties=' + quote(json.dumps(
{'foo': 'one', 'bar': 'one'})),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
reg_resp = device.register('https://example.com', reg_req['appId'],
reg_req['registerRequests'][0]).json
desc = json.loads(self.app.post(
'/foouser/register',
data=json.dumps({
'registerResponse': reg_resp,
'properties': {'baz': 'two', 'bar': 'two'}
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual({'foo': 'one', 'bar': 'two', 'baz': 'two'},
desc['properties'])
def test_set_properties_during_sign(self):
device = SoftU2FDevice()
self.do_register(device, {'foo': 'one', 'bar': 'one', 'baz': 'one'})
aut_req = json.loads(self.app.get(
'/foouser/sign?properties=' + quote(json.dumps(
{'bar': 'two', 'boo': 'two'})),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
aut_resp = device.getAssertion('https://example.com', aut_req['appId'],
aut_req['challenge'],
aut_req['registeredKeys'][0]).json
desc = json.loads(self.app.post(
'/foouser/sign',
data=json.dumps({
'signResponse': aut_resp,
'properties': {'baz': 'three', 'boo': None}
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual({
'foo': 'one',
'bar': 'two',
'baz': 'three',
}, desc['properties'])
def test_register_and_sign_with_custom_challenge(self):
device = SoftU2FDevice()
reg_req = json.loads(self.app.get(
'/foouser/register?challenge=ThisIsAChallenge',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(reg_req['registerRequests'][0]['challenge'],
'ThisIsAChallenge')
reg_resp = device.register('https://example.com', reg_req['appId'],
reg_req['registerRequests'][0]).json
desc1 = json.loads(self.app.post(
'/foouser/register',
data=json.dumps({
'registerResponse': reg_resp
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
aut_req = json.loads(self.app.get(
'/foouser/sign?challenge=ThisIsAChallenge',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(aut_req['challenge'], 'ThisIsAChallenge')
aut_resp = device.getAssertion('https://example.com', aut_req['appId'],
aut_req['challenge'],
aut_req['registeredKeys'][0]).json
desc2 = json.loads(self.app.post(
'/foouser/sign',
data=json.dumps({
'signResponse': aut_resp
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(desc1['handle'], desc2['handle'])
def test_sign_with_handle_filtering(self):
dev = SoftU2FDevice()
h1 = self.do_register(dev)['handle']
h2 = self.do_register(dev)['handle']
self.do_register(dev)['handle']
aut_req = json.loads(
self.app.get('/foouser/sign',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(aut_req['registeredKeys']), 3)
self.assertEqual(len(aut_req['descriptors']), 3)
aut_req = json.loads(
self.app.get('/foouser/sign?handle=' + h1,
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(aut_req['registeredKeys']), 1)
self.assertEqual(aut_req['descriptors'][0]['handle'], h1)
aut_req = json.loads(
self.app.get(
'/foouser/sign?handle=' + h1 + '&handle=' + h2,
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(aut_req['registeredKeys']), 2)
self.assertIn(aut_req['descriptors'][0]['handle'], [h1, h2])
self.assertIn(aut_req['descriptors'][1]['handle'], [h1, h2])
def test_sign_with_invalid_handle(self):
dev = SoftU2FDevice()
self.do_register(dev)
resp = self.app.get('/foouser/sign?handle=foobar',
environ_base={'REMOTE_USER': 'fooclient'})
self.assertEqual(resp.status_code, 400)
def test_device_compromised_on_counter_error(self):
dev = SoftU2FDevice()
self.do_register(dev)
self.do_sign(dev)
self.do_sign(dev)
self.do_sign(dev)
dev.counter = 1
aut_req = json.loads(
self.app.get('/foouser/sign',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
aut_resp = dev.getAssertion('https://example.com', aut_req['appId'],
aut_req['challenge'],
aut_req['registeredKeys'][0]).json
resp = self.app.post(
'/foouser/sign',
data=json.dumps({
'signResponse': aut_resp
}),
environ_base={'REMOTE_USER': 'fooclient'}
)
self.assertEqual(400, resp.status_code)
self.assertEqual(12, json.loads(resp.data.decode('utf8'))['errorCode'])
resp = self.app.get('/foouser/sign',
environ_base={'REMOTE_USER': 'fooclient'})
self.assertEqual(400, resp.status_code)
self.assertEqual(11, json.loads(resp.data.decode('utf8'))['errorCode'])
def do_register(self, device, properties=None):
reg_req = json.loads(
self.app.get('/foouser/register',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(len(reg_req['registeredKeys']),
len(reg_req['descriptors']))
reg_resp = device.register('https://example.com', reg_req['appId'],
reg_req['registerRequests'][0]).json
if properties is None:
properties = {}
descriptor = json.loads(self.app.post(
'/foouser/register',
data=json.dumps({
'registerResponse': reg_resp,
'properties': properties
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
self.assertEqual(descriptor['properties'], properties)
return descriptor
def do_sign(self, device, properties=None):
aut_req = json.loads(
self.app.get('/foouser/sign',
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
aut_resp = device.getAssertion('https://example.com', aut_req['appId'],
aut_req['challenge'],
aut_req['registeredKeys'][0]).json
if properties is None:
properties = {}
return json.loads(self.app.post(
'/foouser/sign',
data=json.dumps({
'signResponse': aut_resp,
'properties': properties
}),
environ_base={'REMOTE_USER': 'fooclient'}
).data.decode('utf8'))
| 39.769634 | 79 | 0.535874 | 1,516 | 15,192 | 5.208443 | 0.103562 | 0.038121 | 0.08612 | 0.106383 | 0.739235 | 0.707953 | 0.674265 | 0.633359 | 0.619808 | 0.558384 | 0 | 0.014659 | 0.312994 | 15,192 | 381 | 80 | 39.874016 | 0.74188 | 0 | 0 | 0.584848 | 0 | 0 | 0.177001 | 0.015074 | 0 | 0 | 0 | 0 | 0.142424 | 1 | 0.069697 | false | 0 | 0.027273 | 0 | 0.106061 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bdb708dd35b54eb9305795bb23aa7f10f6fd2ff | 656 | py | Python | setup.py | suchanlee/typekit-python | 45b1d09b1934ea87b752405c6b2290ce477afa0d | [
"MIT"
] | 6 | 2016-03-17T10:56:22.000Z | 2020-09-10T02:55:19.000Z | setup.py | suchanlee/typekit-python | 45b1d09b1934ea87b752405c6b2290ce477afa0d | [
"MIT"
] | null | null | null | setup.py | suchanlee/typekit-python | 45b1d09b1934ea87b752405c6b2290ce477afa0d | [
"MIT"
] | 6 | 2015-03-13T12:11:59.000Z | 2020-05-22T20:42:36.000Z | try:
from setuptools import setup
except ImportError:
from distutils.core import setup
from typekit._version import __version__ as version
setup(
name = 'typekit',
packages = ['typekit'],
version = version,
license='MIT',
description = 'Python wrapper for Typekit Developer API',
author = 'Suchan Lee',
author_email = 'lee.suchan@gmail.com',
url = 'https://github.com/suchanlee/typekit-python',
download_url = 'https://github.com/suchanlee/typekit-python/tarball/{}'.format(version),
keywords = ['typekit', 'typekit-python', 'typekit python'],
install_requires= [
'requests >= 2.2.1',
],
) | 28.521739 | 92 | 0.667683 | 74 | 656 | 5.810811 | 0.554054 | 0.12093 | 0.065116 | 0.07907 | 0.181395 | 0.181395 | 0.181395 | 0 | 0 | 0 | 0 | 0.005682 | 0.195122 | 656 | 23 | 93 | 28.521739 | 0.808712 | 0 | 0 | 0 | 0 | 0 | 0.359209 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
59edeae5675e92a4f2deb7a85fba31c6c92a2875 | 1,315 | py | Python | organization/migrations/0002_auto_20180325_1529.py | H0neyBadger/pmapi | d34dad32170e53f49e14611f5bfbfcb4eb7b8d4d | [
"MIT"
] | null | null | null | organization/migrations/0002_auto_20180325_1529.py | H0neyBadger/pmapi | d34dad32170e53f49e14611f5bfbfcb4eb7b8d4d | [
"MIT"
] | 1 | 2017-09-07T09:15:07.000Z | 2017-09-07T09:15:07.000Z | organization/migrations/0002_auto_20180325_1529.py | H0neyBadger/cmdb | d34dad32170e53f49e14611f5bfbfcb4eb7b8d4d | [
"MIT"
] | null | null | null | # Generated by Django 2.0.3 on 2018-03-25 15:29
from django.conf import settings
import django.contrib.auth.models
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('organization', '0001_initial'),
    ]

    operations = [
        migrations.AlterModelManagers(
            name='team',
            managers=[
                ('objects', django.contrib.auth.models.GroupManager()),
            ],
        ),
        migrations.AlterField(
            model_name='team',
            name='ciso',
            field=models.ForeignKey(help_text='chief information security officer', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='ciso', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AlterField(
            model_name='team',
            name='manager',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, related_name='manager', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AlterField(
            model_name='team',
            name='technical_contact',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, related_name='technical_contact', to=settings.AUTH_USER_MODEL),
        ),
    ]
| 34.605263 | 190 | 0.638783 | 142 | 1,315 | 5.78169 | 0.394366 | 0.048721 | 0.06821 | 0.107186 | 0.47503 | 0.447016 | 0.401949 | 0.401949 | 0.401949 | 0.401949 | 0 | 0.019115 | 0.244106 | 1,315 | 37 | 191 | 35.540541 | 0.806841 | 0.034221 | 0 | 0.322581 | 1 | 0 | 0.108044 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.129032 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
59f56393703c13188954083843d72db2967ca6cd | 607 | py | Python | pychain/node_discovery.py | tylermzeller/pychain | 3cbf49a762cacb9d4698a83a9f5851c601e57bd0 | [
"MIT"
] | null | null | null | pychain/node_discovery.py | tylermzeller/pychain | 3cbf49a762cacb9d4698a83a9f5851c601e57bd0 | [
"MIT"
] | 10 | 2018-06-21T01:15:14.000Z | 2018-07-30T01:02:35.000Z | pychain/node_discovery.py | tylermzeller/pychain | 3cbf49a762cacb9d4698a83a9f5851c601e57bd0 | [
"MIT"
] | null | null | null | '''
"Node discovery". *Hack*
This implementation of p2p node discovery takes
advantage of docker and information sharing through
environment variables. REAL discovery should probably happen
through DNS.
'''
# returns a list of node addresses
def discoverNodes():
    import os
    # Use .get() so a missing variable yields None instead of raising KeyError,
    # which would have bypassed the None check below.
    numNodes, serviceName = os.environ.get('NUMNODES'), os.environ.get('SERVICENAME')
    if numNodes is None or serviceName is None:
        print("Node discovery could not be performed.")
        return []
    return [serviceName + '_' + str(i) for i in range(1, int(numNodes) + 1)]
| 33.722222 | 89 | 0.693575 | 76 | 607 | 5.526316 | 0.657895 | 0.092857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006263 | 0.210873 | 607 | 17 | 90 | 35.705882 | 0.870564 | 0.382208 | 0 | 0 | 0 | 0 | 0.165714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | true | 0 | 0.25 | 0 | 0.625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9400dcaa88a2929b28b3ed85154b671a3195ca83 | 265 | py | Python | upload_test.py | binocular-vision/container | 35e86b51c430ccbf5f6a9776e33c21e586eda448 | [
"MIT"
] | null | null | null | upload_test.py | binocular-vision/container | 35e86b51c430ccbf5f6a9776e33c21e586eda448 | [
"MIT"
] | null | null | null | upload_test.py | binocular-vision/container | 35e86b51c430ccbf5f6a9776e33c21e586eda448 | [
"MIT"
] | null | null | null | from google.cloud import storage
import json
client = storage.Client()
bucket = client.get_bucket('ibvdata')
blob = bucket.get_blob("experiments/2018-04-14-03-06-01/outputs/json/a0.05_r3.00_p0.05_t1.00")
check = json.loads(blob.download_as_string())
print(check)
| 26.5 | 94 | 0.773585 | 45 | 265 | 4.4 | 0.688889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106557 | 0.079245 | 265 | 9 | 95 | 29.444444 | 0.704918 | 0 | 0 | 0 | 0 | 0.142857 | 0.283019 | 0.256604 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
94021a64302eb30d02fce61d892fb9559a6b654c | 2,932 | py | Python | S4/S4 Library/simulation/gsi_handlers/lot_handlers.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | 1 | 2021-05-20T19:33:37.000Z | 2021-05-20T19:33:37.000Z | S4/S4 Library/simulation/gsi_handlers/lot_handlers.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | null | null | null | S4/S4 Library/simulation/gsi_handlers/lot_handlers.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | null | null | null | from gsi_handlers.commodity_tracker_gsi_util import generate_data_from_commodity_tracker, create_schema_for_commodity_tracker
from sims4.gsi.dispatcher import GsiHandler
from sims4.gsi.schema import GsiGridSchema, GsiFieldVisualizers
import services
import sims4
import build_buy
lot_info_schema = GsiGridSchema(label='Lot Info', auto_refresh=False)
lot_info_schema.add_field('neighborhood', label='Neighborhood', unique_field=True)
lot_info_schema.add_field('cur_lot', label='Current Lot', width=0.4)
lot_info_schema.add_field('region_id', label='Region ID', type=GsiFieldVisualizers.INT, width=0.5)
lot_info_schema.add_field('lot_desc_id', label='Description ID', type=GsiFieldVisualizers.INT, width=0.5)
lot_info_schema.add_field('zone_id', label='Zone ID')
lot_info_schema.add_field('venue', label='Venue')
lot_info_schema.add_field('lot_name', label='Lot Name')
with lot_info_schema.add_has_many('statistics', GsiGridSchema, label='Statistics (Current Lot Only)') as sub_schema:
    sub_schema.add_field('statistic', label='Statistic')
    sub_schema.add_field('value', label='Statistic Value', type=GsiFieldVisualizers.FLOAT, width=0.5)

@GsiHandler('lot_info', lot_info_schema)
def generate_lot_info_data(*args, zone_id: int=None, **kwargs):
    lot_infos = []
    current_zone = services.current_zone()
    lot = current_zone.lot
    venue_manager = services.get_instance_manager(sims4.resources.Types.VENUE)
    for neighborhood_proto in services.get_persistence_service().get_neighborhoods_proto_buf_gen():
        for lot_owner_info in neighborhood_proto.lots:
            zone_id = lot_owner_info.zone_instance_id
            if zone_id is not None:
                venue_tuning_id = build_buy.get_current_venue(zone_id)
                venue_tuning = venue_manager.get(venue_tuning_id)
                if venue_tuning is not None:
                    is_current_lot = lot_owner_info.zone_instance_id == lot.zone_id
                    cur_info = {'neighborhood': neighborhood_proto.name, 'region_id': neighborhood_proto.region_id, 'lot_desc_id': lot_owner_info.lot_description_id, 'zone_id': str(hex(zone_id)), 'venue': venue_tuning.__name__, 'lot_name': lot_owner_info.lot_name, 'cur_lot': 'X' if is_current_lot else ''}
                    if is_current_lot:
                        stat_entries = []
                        for stat in lot.get_all_stats_gen():
                            stat_entries.append({'statistic': stat.stat_type.__name__, 'value': stat.get_value()})
                        cur_info['statistics'] = stat_entries
                    lot_infos.append(cur_info)
    return lot_infos

commodity_data_schema = create_schema_for_commodity_tracker('Lot Statistics/Continuous Statistic Data')

@GsiHandler('lot_commodity_data_view', commodity_data_schema)
def generate_lot_commodity_data_view():
    lot = services.active_lot()
    return generate_data_from_commodity_tracker(lot.commodity_tracker)
| 61.083333 | 306 | 0.74045 | 405 | 2,932 | 4.953086 | 0.22716 | 0.045364 | 0.064806 | 0.063809 | 0.1999 | 0.095214 | 0.055833 | 0.055833 | 0.055833 | 0.055833 | 0 | 0.00488 | 0.161323 | 2,932 | 47 | 307 | 62.382979 | 0.810899 | 0 | 0 | 0 | 1 | 0 | 0.124488 | 0.015007 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.136364 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
94074a7f69d0726f20e9c615de67c7046452f8cb | 778 | py | Python | Day 1/Puzzle 1.py | rookuu/AdventOfCode-2015 | b94e98f5711d0901412b22342395457a8185bc10 | [
"MIT"
] | null | null | null | Day 1/Puzzle 1.py | rookuu/AdventOfCode-2015 | b94e98f5711d0901412b22342395457a8185bc10 | [
"MIT"
] | null | null | null | Day 1/Puzzle 1.py | rookuu/AdventOfCode-2015 | b94e98f5711d0901412b22342395457a8185bc10 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""
Solution to Day 1 - Puzzle 1 of the Advent Of Code 2015 series of challenges.
--- Day 1: Not Quite Lisp ---
An opening parenthesis represents an increase in floor and a closing parenthesis represents a decrease in floor.
After taking a 7000 character long input string of assorted parenthesis, determine the resulting floor.
-----------------------------
Author: Luke "rookuu" Roberts
"""
inputData = raw_input("Puzzle Input: ") # Copy and Paste in the input string, hint: use CTRL-A
floor = 0 # Holds the variable to decrement/increment depending on the flavor of parenthesis.
for char in inputData:
    if char == "(":
        floor += 1
    elif char == ")":
        floor -= 1
print "The floor Santa arrives at is... Floor " + str(floor)
| 29.923077 | 112 | 0.676093 | 112 | 778 | 4.6875 | 0.616071 | 0.015238 | 0.038095 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022581 | 0.203085 | 778 | 25 | 113 | 31.12 | 0.824194 | 0.199229 | 0 | 0 | 0 | 0 | 0.245536 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
94169a27bf1cb9a9f17f568c26d91e83bff1fa33 | 825 | py | Python | src/pretalx/event/migrations/0023_update_featured_visibility.py | lili668668/pretalx | 5ba2185ffd7c5f95254aafe25ad3de340a86eadb | [
"Apache-2.0"
] | 418 | 2017-10-05T05:52:49.000Z | 2022-03-24T09:50:06.000Z | src/pretalx/event/migrations/0023_update_featured_visibility.py | lili668668/pretalx | 5ba2185ffd7c5f95254aafe25ad3de340a86eadb | [
"Apache-2.0"
] | 1,049 | 2017-09-16T09:34:55.000Z | 2022-03-23T16:13:04.000Z | src/pretalx/event/migrations/0023_update_featured_visibility.py | lili668668/pretalx | 5ba2185ffd7c5f95254aafe25ad3de340a86eadb | [
"Apache-2.0"
] | 155 | 2017-10-16T18:32:01.000Z | 2022-03-15T12:48:33.000Z | # Generated by Django 3.0.5 on 2020-07-26 15:45
from django.db import migrations
def update_show_featured(apps, schema_editor):
    Event = apps.get_model("event", "Event")
    EventSettings = apps.get_model("event", "Event_SettingsStore")
    for event in Event.objects.all():
        old_value = EventSettings.objects.filter(
            object=event, key="show_sneak_peek"
        ).first()
        if old_value and old_value.value == "False":
            EventSettings.objects.create(
                object=event,
                key="show_featured",
                value="never",
            )


class Migration(migrations.Migration):

    dependencies = [
        ("event", "0022_auto_20200124_1213"),
    ]

    operations = [
        migrations.RunPython(update_show_featured, migrations.RunPython.noop),
    ]
| 26.612903 | 78 | 0.624242 | 92 | 825 | 5.413043 | 0.597826 | 0.072289 | 0.072289 | 0.068273 | 0.088353 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051071 | 0.264242 | 825 | 30 | 79 | 27.5 | 0.769358 | 0.054545 | 0 | 0 | 1 | 0 | 0.128535 | 0.029563 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.047619 | 0 | 0.238095 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
94187ed0c8700f9512490144c63712ffb0927710 | 2,575 | py | Python | meiduo_mall24/meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | MarioKarting/Django_DRF_meiduo_mall | b9dc85d6d538e4655dd02ef1027524bdcbe497d7 | [
"MIT"
] | 1 | 2020-04-15T03:22:18.000Z | 2020-04-15T03:22:18.000Z | meiduo_mall24/meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | MarioKarting/Django_DRF_meiduo_mall | b9dc85d6d538e4655dd02ef1027524bdcbe497d7 | [
"MIT"
] | 5 | 2020-05-11T20:29:00.000Z | 2021-11-02T15:46:12.000Z | meiduo_mall24/meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | MarioKarting/Django_DRF_meiduo_mall | b9dc85d6d538e4655dd02ef1027524bdcbe497d7 | [
"MIT"
] | null | null | null | from rest_framework.permissions import IsAdminUser
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
from goods.models import SPU, Brand, GoodsCategory, SKU
from meiduo_admin.serializers.specs import SPUSerializer
from meiduo_admin.serializers.spus import SPUGoodsSerialzier, SPUBrandsSerizliser, CategorysSerizliser
from meiduo_admin.utils import PageNum
from django.conf import settings
from fdfs_client.client import Fdfs_client
# CRUD for the SPU table, plus first/second/third level categories
class SPUGoodsView(ModelViewSet):
    """
    CRUD for the SPU table
    """
    # permission check
    permission_classes = [IsAdminUser]
    # serializer
    serializer_class = SPUGoodsSerialzier
    # queryset
    queryset = SPU.objects.all()
    # pagination
    pagination_class = PageNum

    def get_queryset(self):
        keyword = self.request.query_params.get('keyword')
        if keyword == '' or keyword is None:
            return SPU.objects.all()
        else:
            return SPU.objects.filter(name=keyword)

    # method returning brand data
    def brand(self, request):
        # 1. query all brand data
        data = Brand.objects.all()
        # 2. serialize and return the brand data
        ser = SPUBrandsSerizliser(data, many=True)
        return Response(ser.data)

    def channel(self, request):
        # 1. fetch the first-level category data
        data = GoodsCategory.objects.filter(parent=None)
        # 2. serialize and return the category data
        ser = CategorysSerizliser(data, many=True)
        return Response(ser.data)

    def channels(self, request, pk):
        # 1. fetch the second- and third-level category data
        data = GoodsCategory.objects.filter(parent_id=pk)
        # 2. serialize and return the category data
        ser = CategorysSerizliser(data, many=True)
        return Response(ser.data)


class SPUSView(ModelViewSet):
    """
    CRUD for the SPU table
    """
    serializer_class = SPUSerializer
    queryset = SPU.objects.all()
    pagination_class = PageNum

    def image(self, request):
        """
        Save an uploaded image.
        :param request:
        :return:
        """
        # 1. get the image data
        data = request.FILES.get('image')
        # validate the image data
        if data is None:
            return Response(status=500)
        # 2. create a FastDFS connection object
        client = Fdfs_client(settings.FASTDFS_CONF)
        # 3. upload the image
        res = client.upload_by_buffer(data.read())
        # 4. check the upload status
        if res['Status'] != 'Upload successed.':
            return Response({'error': 'upload failed'}, status=501)
        # 5. get the uploaded image path
        image_url = res['Remote file_id']
        # 6. return the result
        return Response(
            {
                'img_url': settings.FDFS_URL + image_url
            },
            status=201
) | 26.822917 | 102 | 0.624854 | 270 | 2,575 | 5.866667 | 0.414815 | 0.05303 | 0.032197 | 0.034091 | 0.152146 | 0.106692 | 0.106692 | 0.106692 | 0.082071 | 0.082071 | 0 | 0.011382 | 0.283495 | 2,575 | 96 | 103 | 26.822917 | 0.847154 | 0.095922 | 0 | 0.176471 | 0 | 0 | 0.029109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098039 | false | 0 | 0.176471 | 0 | 0.607843 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
941a1a67e6695f7c1740b6f37424502f9989d6c5 | 1,061 | py | Python | vframe_cli/commands/synthetic/utils/relabel.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | vframe_cli/commands/synthetic/utils/relabel.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | vframe_cli/commands/synthetic/utils/relabel.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | #############################################################################
#
# VFRAME
# MIT License
# Copyright (c) 2019 Adam Harvey and VFRAME
# https://vframe.io
#
#############################################################################
import click
from vframe.settings import app_cfg
ext_choices = ['jpg', 'png']
@click.command()
@click.option('-i', '--input', 'opt_input', required=True,
  help='Input file CSV')
@click.option('-o', '--output', 'opt_output',
  help='Output file CSV')
@click.option('--label', 'opt_labels_from_to', required=True, type=(str, str),
  multiple=True, help='Label from, to')
@click.pass_context
def cli(ctx, opt_input, opt_output, opt_labels_from_to):
  """Relabel label enum in annotation CSV"""

  import pandas as pd

  log = app_cfg.LOG

  opt_output = opt_output if opt_output else opt_input
  df_meta = pd.read_csv(opt_input)
  for label_from, label_to in opt_labels_from_to:
    df_meta.loc[(df_meta.label_enum == label_from), 'label_enum'] = label_to

  # write csv
  df_meta.to_csv(opt_output, index=False)
| 27.205128 | 77 | 0.607917 | 145 | 1,061 | 4.206897 | 0.427586 | 0.088525 | 0.063934 | 0.07377 | 0.088525 | 0.088525 | 0 | 0 | 0 | 0 | 0 | 0.004301 | 0.123468 | 1,061 | 38 | 78 | 27.921053 | 0.651613 | 0.118756 | 0 | 0.105263 | 0 | 0 | 0.157347 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.052632 | 0.157895 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
941cea816007d01476d7254fd327defb9c84327e | 1,030 | py | Python | Problem-No-23/Approach-No-1.py | sidhant-sriv/comp-project-grade-11 | a95a39e4624ec8f19e800296f3ab025de49b0d64 | [
"MIT"
] | 3 | 2020-10-23T05:21:23.000Z | 2022-01-10T10:58:42.000Z | Problem-No-23/Approach-No-1.py | sidhant-sriv/comp-project-grade-11 | a95a39e4624ec8f19e800296f3ab025de49b0d64 | [
"MIT"
] | 3 | 2020-10-25T10:44:32.000Z | 2020-10-25T16:44:15.000Z | Problem-No-23/Approach-No-1.py | sidhant-sriv/comp-project-grade-11 | a95a39e4624ec8f19e800296f3ab025de49b0d64 | [
"MIT"
] | 6 | 2020-10-23T05:18:56.000Z | 2021-01-14T09:32:35.000Z | '''
| Write a program that should prompt the user to type some sentences. It should then print number of words, number of characters, number of digits and number of special characters in it. |
|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| We use the input() function to receive input from the user and print() function to print it |
'''
string = input("Enter a sentence...\n")
numbers = '1234567890'
chars = '!@#$%^&*()<>?:-\"\'}+=_{|\][;//.,`~'
num_words = len(string.split())
res = [0,0,0]
for i in string:
    if i.isalpha():
        res[0] += 1
    elif i in numbers:
        res[1] += 1
    elif i in chars:
        res[2] += 1
    else:
        pass
print(f'There are {num_words} word(s), {res[0]} alphabets, {res[1]} digits and {res[2]} special characters in the given string.')
| 44.782609 | 188 | 0.446602 | 116 | 1,030 | 3.939655 | 0.491379 | 0.070022 | 0.083151 | 0.035011 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028609 | 0.253398 | 1,030 | 23 | 189 | 44.782609 | 0.56567 | 0.549515 | 0 | 0 | 0 | 0 | 0.367615 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.066667 | 0 | 0 | 0 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
94239941557b0ef32f8bdc3e929a8b7a0ca9bab9 | 1,087 | py | Python | calc/calculations/calculation.py | dhruvshah1996/Project3 | d87ad37f6cf2de0d3402c71d21b25258946aad69 | [
"MIT"
] | null | null | null | calc/calculations/calculation.py | dhruvshah1996/Project3 | d87ad37f6cf2de0d3402c71d21b25258946aad69 | [
"MIT"
] | null | null | null | calc/calculations/calculation.py | dhruvshah1996/Project3 | d87ad37f6cf2de0d3402c71d21b25258946aad69 | [
"MIT"
] | null | null | null | """Calculation Class"""
class Calculation:
    """calculation abstract base class"""
    # pylint: disable=too-few-public-methods

    def __init__(self, values: tuple):
        """constructor method"""
        self.values = Calculation.convert_args_to_tuple_of_float(values)

    @classmethod
    def create(cls, values: tuple):
        """factory method"""
        return cls(values)

    @staticmethod
    def convert_args_to_tuple_of_float(values: tuple):
        """standardize values to a tuple of floats"""
        # Convert the raw values (potentially mixed types) to floats so the
        # data format stays consistent, then freeze them back into a tuple:
        # tuples are immutable and faster, and the calculation values are
        # never edited afterwards.
        list_values_float = []
        for item in values:
            list_values_float.append(float(item))
        return tuple(list_values_float) | 47.26087 | 100 | 0.678933 | 151 | 1,087 | 4.754967 | 0.503311 | 0.045961 | 0.062674 | 0.050139 | 0.086351 | 0.086351 | 0.086351 | 0 | 0 | 0 | 0 | 0 | 0.24655 | 1,087 | 23 | 101 | 47.26087 | 0.876679 | 0.49586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
942599ab210e3bb5b077aab6360a6c4298208c75 | 507 | py | Python | zhihuer/celery.py | guojy1314/stw1209 | 043889688c2ed55884b8ddde9cdf6949ee52f905 | [
"MIT"
] | 85 | 2018-07-15T06:45:59.000Z | 2021-06-26T06:51:38.000Z | zhihuer/celery.py | guojy1314/stw1209 | 043889688c2ed55884b8ddde9cdf6949ee52f905 | [
"MIT"
] | 8 | 2020-02-12T02:26:58.000Z | 2022-03-12T00:08:05.000Z | zhihuer/celery.py | guojy1314/stw1209 | 043889688c2ed55884b8ddde9cdf6949ee52f905 | [
"MIT"
] | 26 | 2019-01-26T18:05:11.000Z | 2021-06-26T06:51:39.000Z | # From the zhihuer project directory,
# run: celery -A zhihuer worker -l info  (-A looks for the celery module in that package)
# to start the celery worker
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module environment variable
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'zhihuer.settings')
# instantiate the Celery app
app = Celery('zhihuer')
# configure celery from the Django settings file
app.config_from_object('django.conf:settings', 'CELERY')
# autodiscover tasks.py in all registered apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
| 25.35 | 67 | 0.808679 | 60 | 507 | 6.633333 | 0.6 | 0.050251 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098619 | 507 | 19 | 68 | 26.684211 | 0.870897 | 0.309665 | 0 | 0 | 0 | 0 | 0.207602 | 0.064327 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
94267a216f2fcaedda83675c1ece9c6b384d5d17 | 7,273 | py | Python | pysnmp/CTRON-SSR-CONFIG-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CTRON-SSR-CONFIG-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CTRON-SSR-CONFIG-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CTRON-SSR-CONFIG-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CTRON-SSR-CONFIG-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:15:46 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, ValueSizeConstraint, ConstraintsIntersection, SingleValueConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "ValueSizeConstraint", "ConstraintsIntersection", "SingleValueConstraint", "ValueRangeConstraint")
ssrMibs, = mibBuilder.importSymbols("CTRON-SSR-SMI-MIB", "ssrMibs")
ModuleCompliance, ObjectGroup, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "ObjectGroup", "NotificationGroup")
MibScalar, MibTable, MibTableRow, MibTableColumn, ObjectIdentity, Counter32, MibIdentifier, IpAddress, Gauge32, Counter64, ModuleIdentity, iso, Bits, NotificationType, TimeTicks, Unsigned32, Integer32 = mibBuilder.importSymbols("SNMPv2-SMI", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ObjectIdentity", "Counter32", "MibIdentifier", "IpAddress", "Gauge32", "Counter64", "ModuleIdentity", "iso", "Bits", "NotificationType", "TimeTicks", "Unsigned32", "Integer32")
TextualConvention, TruthValue, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "TruthValue", "DisplayString")
ssrConfigMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230))
ssrConfigMIB.setRevisions(('2000-07-15 00:00', '2000-02-20 00:00', '1998-08-17 00:00',))
if mibBuilder.loadTexts: ssrConfigMIB.setLastUpdated('200007150000Z')
if mibBuilder.loadTexts: ssrConfigMIB.setOrganization('Cabletron Systems, Inc')
class SSRErrorCode(TextualConvention, Integer32):
    status = 'current'
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))
    namedValues = NamedValues(("noStatus", 1), ("timeout", 2), ("networkError", 3), ("noSpace", 4), ("invalidConfig", 5), ("commandCompleted", 6), ("internalError", 7), ("tftpServerError", 8))
cfgGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231))
cfgTransferOp = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("noop", 1), ("sendConfigToAgent", 2), ("receiveConfigFromAgent", 3), ("receiveBootlogFromAgent", 4)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cfgTransferOp.setStatus('current')
cfgManagerAddress = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cfgManagerAddress.setStatus('current')
cfgFileName = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cfgFileName.setStatus('current')
cfgActivateTransfer = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 4), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cfgActivateTransfer.setStatus('current')
cfgTransferStatus = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("idle", 1), ("sending", 2), ("receiving", 3), ("transferComplete", 4), ("error", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cfgTransferStatus.setStatus('current')
cfgActivateFile = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 6), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: cfgActivateFile.setStatus('current')
cfgLastError = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 7), SSRErrorCode()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cfgLastError.setStatus('current')
cfgLastErrorReason = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cfgLastErrorReason.setStatus('current')
cfgActiveImageVersion = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 9), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cfgActiveImageVersion.setStatus('current')
cfgActiveImageBootLocation = MibScalar((1, 3, 6, 1, 4, 1, 52, 2501, 1, 231, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cfgActiveImageBootLocation.setStatus('current')
configConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3))
configCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 1))
configGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 2))
configCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 1, 1)).setObjects(("CTRON-SSR-CONFIG-MIB", "configGroup10"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    configCompliance = configCompliance.setStatus('obsolete')
configCompliance2 = ModuleCompliance((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 1, 2)).setObjects(("CTRON-SSR-CONFIG-MIB", "configGroup20"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    configCompliance2 = configCompliance2.setStatus('current')
configGroup10 = ObjectGroup((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 2, 1)).setObjects(("CTRON-SSR-CONFIG-MIB", "cfgTransferOp"), ("CTRON-SSR-CONFIG-MIB", "cfgManagerAddress"), ("CTRON-SSR-CONFIG-MIB", "cfgFileName"), ("CTRON-SSR-CONFIG-MIB", "cfgActivateTransfer"), ("CTRON-SSR-CONFIG-MIB", "cfgTransferStatus"), ("CTRON-SSR-CONFIG-MIB", "cfgActivateFile"), ("CTRON-SSR-CONFIG-MIB", "cfgLastError"), ("CTRON-SSR-CONFIG-MIB", "cfgLastErrorReason"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    configGroup10 = configGroup10.setStatus('deprecated')
configGroup20 = ObjectGroup((1, 3, 6, 1, 4, 1, 52, 2501, 1, 230, 3, 2, 2)).setObjects(("CTRON-SSR-CONFIG-MIB", "cfgTransferOp"), ("CTRON-SSR-CONFIG-MIB", "cfgManagerAddress"), ("CTRON-SSR-CONFIG-MIB", "cfgFileName"), ("CTRON-SSR-CONFIG-MIB", "cfgActivateTransfer"), ("CTRON-SSR-CONFIG-MIB", "cfgTransferStatus"), ("CTRON-SSR-CONFIG-MIB", "cfgActivateFile"), ("CTRON-SSR-CONFIG-MIB", "cfgLastError"), ("CTRON-SSR-CONFIG-MIB", "cfgLastErrorReason"), ("CTRON-SSR-CONFIG-MIB", "cfgActiveImageVersion"), ("CTRON-SSR-CONFIG-MIB", "cfgActiveImageBootLocation"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    configGroup20 = configGroup20.setStatus('current')
mibBuilder.exportSymbols("CTRON-SSR-CONFIG-MIB", cfgManagerAddress=cfgManagerAddress, cfgActiveImageVersion=cfgActiveImageVersion, cfgActivateFile=cfgActivateFile, configGroups=configGroups, cfgLastErrorReason=cfgLastErrorReason, ssrConfigMIB=ssrConfigMIB, SSRErrorCode=SSRErrorCode, configCompliance=configCompliance, configGroup20=configGroup20, cfgGroup=cfgGroup, cfgActiveImageBootLocation=cfgActiveImageBootLocation, configConformance=configConformance, configCompliance2=configCompliance2, cfgFileName=cfgFileName, cfgLastError=cfgLastError, configGroup10=configGroup10, cfgTransferOp=cfgTransferOp, PYSNMP_MODULE_ID=ssrConfigMIB, cfgActivateTransfer=cfgActivateTransfer, cfgTransferStatus=cfgTransferStatus, configCompliances=configCompliances)
| 115.444444 | 751 | 0.74371 | 826 | 7,273 | 6.546005 | 0.2046 | 0.03551 | 0.059552 | 0.072314 | 0.45626 | 0.374884 | 0.314962 | 0.314962 | 0.304975 | 0.279822 | 0 | 0.081928 | 0.087034 | 7,273 | 62 | 752 | 117.306452 | 0.73238 | 0.045923 | 0 | 0.076923 | 0 | 0 | 0.248918 | 0.019625 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.134615 | 0 | 0.211538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
942894d912ee4a7ba9f0aefb2767a546ced6f224 | 19,551 | py | Python | demisto_client/demisto_api/models/widget.py | guytest/demisto-py | 8ca4f56a6177668151b5656cbe675a377003c0e9 | [
"Apache-2.0"
] | 59 | 2017-05-04T05:48:00.000Z | 2022-02-27T21:06:01.000Z | demisto_client/demisto_api/models/widget.py | guytest/demisto-py | 8ca4f56a6177668151b5656cbe675a377003c0e9 | [
"Apache-2.0"
] | 44 | 2017-05-09T17:42:43.000Z | 2022-03-30T05:55:44.000Z | demisto_client/demisto_api/models/widget.py | guytest/demisto-py | 8ca4f56a6177668151b5656cbe675a377003c0e9 | [
"Apache-2.0"
] | 37 | 2017-05-06T04:30:32.000Z | 2022-02-15T04:59:00.000Z | # coding: utf-8
"""
Demisto API
This is the public REST API to integrate with the demisto server. HTTP request can be sent using any HTTP-client. For an example dedicated client take a look at: https://github.com/demisto/demisto-py. Requests must include API-key that can be generated in the Demisto web client under 'Settings' -> 'Integrations' -> 'API keys' Optimistic Locking and Versioning\\: When using Demisto REST API, you will need to make sure to work on the latest version of the item (incident, entry, etc.), otherwise, you will get a DB version error (which not allow you to override a newer item). In addition, you can pass 'version\\: -1' to force data override (make sure that other users data might be lost). Assume that Alice and Bob both read the same data from Demisto server, then they both changed the data, and then both tried to write the new versions back to the server. Whose changes should be saved? Alice’s? Bob’s? To solve this, each data item in Demisto has a numeric incremental version. If Alice saved an item with version 4 and Bob trying to save the same item with version 3, Demisto will rollback Bob request and returns a DB version conflict error. Bob will need to get the latest item and work on it so Alice work will not get lost. Example request using 'curl'\\: ``` curl 'https://hostname:443/incidents/search' -H 'content-type: application/json' -H 'accept: application/json' -H 'Authorization: <API Key goes here>' --data-binary '{\"filter\":{\"query\":\"-status:closed -category:job\",\"period\":{\"by\":\"day\",\"fromValue\":7}}}' --compressed ``` # noqa: E501
OpenAPI spec version: 2.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
import pprint
import re # noqa: F401
import six
from demisto_client.demisto_api.models.date_range import DateRange # noqa: F401,E501
from demisto_client.demisto_api.models.order import Order # noqa: F401,E501
class Widget(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
"""
"""
Attributes:
swagger_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
swagger_types = {
'category': 'str',
'commit_message': 'str',
'data_type': 'str',
'date_range': 'DateRange',
'description': 'str',
'id': 'str',
'is_predefined': 'bool',
'locked': 'bool',
'modified': 'datetime',
'name': 'str',
'params': 'dict(str, object)',
'prev_name': 'str',
'query': 'str',
'should_commit': 'bool',
'size': 'int',
'sort': 'list[Order]',
'sort_values': 'list[str]',
'vc_should_ignore': 'bool',
'version': 'int',
'widget_type': 'str'
}
attribute_map = {
'category': 'category',
'commit_message': 'commitMessage',
'data_type': 'dataType',
'date_range': 'dateRange',
'description': 'description',
'id': 'id',
'is_predefined': 'isPredefined',
'locked': 'locked',
'modified': 'modified',
'name': 'name',
'params': 'params',
'prev_name': 'prevName',
'query': 'query',
'should_commit': 'shouldCommit',
'size': 'size',
'sort': 'sort',
'sort_values': 'sortValues',
'vc_should_ignore': 'vcShouldIgnore',
'version': 'version',
'widget_type': 'widgetType'
}
def __init__(self, category=None, commit_message=None, data_type=None, date_range=None, description=None, id=None, is_predefined=None, locked=None, modified=None, name=None, params=None, prev_name=None, query=None, should_commit=None, size=None, sort=None, sort_values=None, vc_should_ignore=None, version=None, widget_type=None): # noqa: E501
"""Widget - a model defined in Swagger""" # noqa: E501
self._category = None
self._commit_message = None
self._data_type = None
self._date_range = None
self._description = None
self._id = None
self._is_predefined = None
self._locked = None
self._modified = None
self._name = None
self._params = None
self._prev_name = None
self._query = None
self._should_commit = None
self._size = None
self._sort = None
self._sort_values = None
self._vc_should_ignore = None
self._version = None
self._widget_type = None
self.discriminator = None
if category is not None:
self.category = category
if commit_message is not None:
self.commit_message = commit_message
if data_type is not None:
self.data_type = data_type
if date_range is not None:
self.date_range = date_range
if description is not None:
self.description = description
if id is not None:
self.id = id
if is_predefined is not None:
self.is_predefined = is_predefined
if locked is not None:
self.locked = locked
if modified is not None:
self.modified = modified
self.name = name
if params is not None:
self.params = params
if prev_name is not None:
self.prev_name = prev_name
if query is not None:
self.query = query
if should_commit is not None:
self.should_commit = should_commit
if size is not None:
self.size = size
if sort is not None:
self.sort = sort
if sort_values is not None:
self.sort_values = sort_values
if vc_should_ignore is not None:
self.vc_should_ignore = vc_should_ignore
if version is not None:
self.version = version
self.widget_type = widget_type
@property
def category(self):
"""Gets the category of this Widget. # noqa: E501
Category the widget is related to. Used to display in widget library under category or dataType if empty. # noqa: E501
:return: The category of this Widget. # noqa: E501
:rtype: str
"""
return self._category
@category.setter
def category(self, category):
"""Sets the category of this Widget.
Category the widget is related to. Used to display in widget library under category or dataType if empty. # noqa: E501
:param category: The category of this Widget. # noqa: E501
:type: str
"""
self._category = category
@property
def commit_message(self):
"""Gets the commit_message of this Widget. # noqa: E501
:return: The commit_message of this Widget. # noqa: E501
:rtype: str
"""
return self._commit_message
@commit_message.setter
def commit_message(self, commit_message):
"""Sets the commit_message of this Widget.
:param commit_message: The commit_message of this Widget. # noqa: E501
:type: str
"""
self._commit_message = commit_message
@property
def data_type(self):
"""Gets the data_type of this Widget. # noqa: E501
Data type of the widget. Describes what data the widget queries. Supported data types: \"incidents\", \"messages\", \"system\", \"entries\", \"tasks\", \"audit\". # noqa: E501
:return: The data_type of this Widget. # noqa: E501
:rtype: str
"""
return self._data_type
@data_type.setter
def data_type(self, data_type):
"""Sets the data_type of this Widget.
Data type of the widget. Describes what data the widget queries. Supported data types: \"incidents\", \"messages\", \"system\", \"entries\", \"tasks\", \"audit\". # noqa: E501
:param data_type: The data_type of this Widget. # noqa: E501
:type: str
"""
self._data_type = data_type
@property
def date_range(self):
"""Gets the date_range of this Widget. # noqa: E501
:return: The date_range of this Widget. # noqa: E501
:rtype: DateRange
"""
return self._date_range
@date_range.setter
def date_range(self, date_range):
"""Sets the date_range of this Widget.
:param date_range: The date_range of this Widget. # noqa: E501
:type: DateRange
"""
self._date_range = date_range
@property
def description(self):
"""Gets the description of this Widget. # noqa: E501
The description of the widget's usage and data representation. # noqa: E501
:return: The description of this Widget. # noqa: E501
:rtype: str
"""
return self._description
@description.setter
def description(self, description):
"""Sets the description of this Widget.
The description of the widget's usage and data representation. # noqa: E501
:param description: The description of this Widget. # noqa: E501
:type: str
"""
self._description = description
@property
def id(self):
"""Gets the id of this Widget. # noqa: E501
:return: The id of this Widget. # noqa: E501
:rtype: str
"""
return self._id
@id.setter
def id(self, id):
"""Sets the id of this Widget.
:param id: The id of this Widget. # noqa: E501
:type: str
"""
self._id = id
@property
def is_predefined(self):
"""Gets the is_predefined of this Widget. # noqa: E501
Is the widget a system widget. # noqa: E501
:return: The is_predefined of this Widget. # noqa: E501
:rtype: bool
"""
return self._is_predefined
@is_predefined.setter
def is_predefined(self, is_predefined):
"""Sets the is_predefined of this Widget.
Is the widget a system widget. # noqa: E501
:param is_predefined: The is_predefined of this Widget. # noqa: E501
:type: bool
"""
self._is_predefined = is_predefined
@property
def locked(self):
"""Gets the locked of this Widget. # noqa: E501
Is the widget locked for editing. # noqa: E501
:return: The locked of this Widget. # noqa: E501
:rtype: bool
"""
return self._locked
@locked.setter
def locked(self, locked):
"""Sets the locked of this Widget.
Is the widget locked for editing. # noqa: E501
:param locked: The locked of this Widget. # noqa: E501
:type: bool
"""
self._locked = locked
@property
def modified(self):
"""Gets the modified of this Widget. # noqa: E501
:return: The modified of this Widget. # noqa: E501
:rtype: datetime
"""
return self._modified
@modified.setter
def modified(self, modified):
"""Sets the modified of this Widget.
:param modified: The modified of this Widget. # noqa: E501
:type: datetime
"""
self._modified = modified
@property
def name(self):
"""Gets the name of this Widget. # noqa: E501
Default name of the widget. # noqa: E501
:return: The name of this Widget. # noqa: E501
:rtype: str
"""
return self._name
@name.setter
def name(self, name):
"""Sets the name of this Widget.
Default name of the widget. # noqa: E501
:param name: The name of this Widget. # noqa: E501
:type: str
"""
if name is None:
raise ValueError("Invalid value for `name`, must not be `None`") # noqa: E501
self._name = name
@property
def params(self):
"""Gets the params of this Widget. # noqa: E501
Additional parameters for this widget, depending on widget type and data. # noqa: E501
:return: The params of this Widget. # noqa: E501
:rtype: dict(str, object)
"""
return self._params
@params.setter
def params(self, params):
"""Sets the params of this Widget.
Additional parameters for this widget, depending on widget type and data. # noqa: E501
:param params: The params of this Widget. # noqa: E501
:type: dict(str, object)
"""
self._params = params
@property
def prev_name(self):
"""Gets the prev_name of this Widget. # noqa: E501
The previous name of the widget. # noqa: E501
:return: The prev_name of this Widget. # noqa: E501
:rtype: str
"""
return self._prev_name
@prev_name.setter
def prev_name(self, prev_name):
"""Sets the prev_name of this Widget.
The previous name of the widget. # noqa: E501
:param prev_name: The prev_name of this Widget. # noqa: E501
:type: str
"""
self._prev_name = prev_name
@property
def query(self):
"""Gets the query of this Widget. # noqa: E501
Query to search on the dataType. # noqa: E501
:return: The query of this Widget. # noqa: E501
:rtype: str
"""
return self._query
@query.setter
def query(self, query):
"""Sets the query of this Widget.
Query to search on the dataType. # noqa: E501
:param query: The query of this Widget. # noqa: E501
:type: str
"""
self._query = query
@property
def should_commit(self):
"""Gets the should_commit of this Widget. # noqa: E501
:return: The should_commit of this Widget. # noqa: E501
:rtype: bool
"""
return self._should_commit
@should_commit.setter
def should_commit(self, should_commit):
"""Sets the should_commit of this Widget.
:param should_commit: The should_commit of this Widget. # noqa: E501
:type: bool
"""
self._should_commit = should_commit
@property
def size(self):
"""Gets the size of this Widget. # noqa: E501
Maximum size of the data returned for this widget. # noqa: E501
:return: The size of this Widget. # noqa: E501
:rtype: int
"""
return self._size
@size.setter
def size(self, size):
"""Sets the size of this Widget.
Maximum size of the data returned for this widget. # noqa: E501
:param size: The size of this Widget. # noqa: E501
:type: int
"""
self._size = size
@property
def sort(self):
"""Gets the sort of this Widget. # noqa: E501
Sorting array to sort the data received by the given Order parameters. # noqa: E501
:return: The sort of this Widget. # noqa: E501
:rtype: list[Order]
"""
return self._sort
@sort.setter
def sort(self, sort):
"""Sets the sort of this Widget.
Sorting array to sort the data received by the given Order parameters. # noqa: E501
:param sort: The sort of this Widget. # noqa: E501
:type: list[Order]
"""
self._sort = sort
@property
def sort_values(self):
"""Gets the sort_values of this Widget. # noqa: E501
:return: The sort_values of this Widget. # noqa: E501
:rtype: list[str]
"""
return self._sort_values
@sort_values.setter
def sort_values(self, sort_values):
"""Sets the sort_values of this Widget.
:param sort_values: The sort_values of this Widget. # noqa: E501
:type: list[str]
"""
self._sort_values = sort_values
@property
def vc_should_ignore(self):
"""Gets the vc_should_ignore of this Widget. # noqa: E501
:return: The vc_should_ignore of this Widget. # noqa: E501
:rtype: bool
"""
return self._vc_should_ignore
@vc_should_ignore.setter
def vc_should_ignore(self, vc_should_ignore):
"""Sets the vc_should_ignore of this Widget.
:param vc_should_ignore: The vc_should_ignore of this Widget. # noqa: E501
:type: bool
"""
self._vc_should_ignore = vc_should_ignore
@property
def version(self):
"""Gets the version of this Widget. # noqa: E501
:return: The version of this Widget. # noqa: E501
:rtype: int
"""
return self._version
@version.setter
def version(self, version):
"""Sets the version of this Widget.
:param version: The version of this Widget. # noqa: E501
:type: int
"""
self._version = version
@property
def widget_type(self):
"""Gets the widget_type of this Widget. # noqa: E501
Widget type describes how the widget should receive the data and display it. Supported types: \"bar\", \"column\", \"pie\", \"list\", \"number\", \"trend\", \"text\", \"duration\", \"image\", \"line\", and \"table\". # noqa: E501
:return: The widget_type of this Widget. # noqa: E501
:rtype: str
"""
return self._widget_type
@widget_type.setter
def widget_type(self, widget_type):
"""Sets the widget_type of this Widget.
Widget type describes how the widget should receive the data and display it. Supported types: \"bar\", \"column\", \"pie\", \"list\", \"number\", \"trend\", \"text\", \"duration\", \"image\", \"line\", and \"table\". # noqa: E501
:param widget_type: The widget_type of this Widget. # noqa: E501
:type: str
"""
if widget_type is None:
raise ValueError("Invalid value for `widget_type`, must not be `None`") # noqa: E501
self._widget_type = widget_type
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
if issubclass(Widget, dict):
for key, value in self.items():
result[key] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, Widget):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""Returns true if both objects are not equal"""
return not self == other
| 30.596244 | 1,584 | 0.591223 | 2,462 | 19,551 | 4.574736 | 0.119009 | 0.063216 | 0.085235 | 0.085235 | 0.487437 | 0.410725 | 0.383468 | 0.297257 | 0.1734 | 0.120039 | 0 | 0.021936 | 0.309805 | 19,551 | 638 | 1,585 | 30.644201 | 0.812732 | 0.447036 | 0 | 0.082397 | 0 | 0 | 0.080311 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.172285 | false | 0 | 0.018727 | 0 | 0.299625 | 0.007491 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
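The optimistic-locking scheme described in the Widget module's docstring (incremental per-item versions, with `version: -1` forcing an override) can be sketched as a tiny in-memory store. Everything here (the `Store` class, its methods, the `VersionConflict` error) is illustrative and not part of demisto-py:

```python
class VersionConflict(Exception):
    """Raised when a write carries a stale version number."""


class Store:
    """Toy key-value store with per-item incremental versions."""

    def __init__(self):
        self._items = {}  # key -> (version, data)

    def get(self, key):
        return self._items.get(key, (0, None))

    def save(self, key, data, version):
        current, _ = self.get(key)
        if version != -1 and version < current:
            # The server-side "DB version conflict" error from the docstring.
            raise VersionConflict(f"stale version {version} < {current}")
        new_version = current + 1
        self._items[key] = (new_version, data)
        return new_version


store = Store()
store.save("incident-1", {"status": "open"}, version=0)        # Alice: ok, now v1
try:
    store.save("incident-1", {"status": "closed"}, version=0)  # Bob: stale, rejected
except VersionConflict:
    pass
store.save("incident-1", {"status": "closed"}, version=-1)     # forced override
```

Alice's write succeeds against the version she read; Bob's write with the same stale version is rejected, as in the docstring's example, unless he forces it with `-1`.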
9428c808c828141ff45952921a511681e35a77df | 410 | py | Python | saliency_web_mapper/config/environment.py | HeosSacer/saliency_web_mapper | a2fd744b821086dc1a0af0498361207f7bcddee6 | [
"MIT"
] | null | null | null | saliency_web_mapper/config/environment.py | HeosSacer/saliency_web_mapper | a2fd744b821086dc1a0af0498361207f7bcddee6 | [
"MIT"
] | null | null | null | saliency_web_mapper/config/environment.py | HeosSacer/saliency_web_mapper | a2fd744b821086dc1a0af0498361207f7bcddee6 | [
"MIT"
] | null | null | null | from saliency_web_mapper.config.typesafe_dataclass import TypesafeDataclass
from typing import List, Dict, Tuple, Sequence
class SaliencyWebMapperEnvironment(TypesafeDataclass):
# Defaults with type
url: str = 'http://localhost:3001/'
window_name: str = 'Cart 4.0'
# Debug
debug: bool = False
debug_address: str = "192.168.178.33"
def __init__(self):
super().__init__()
| 25.625 | 75 | 0.707317 | 49 | 410 | 5.653061 | 0.836735 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05136 | 0.192683 | 410 | 15 | 76 | 27.333333 | 0.785498 | 0.058537 | 0 | 0 | 0 | 0 | 0.114883 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
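The environment class above inherits from a `TypesafeDataclass` and relies on class-level type annotations with defaults. A minimal sketch of how such a base class could enforce those annotations on assignment (the real `TypesafeDataclass` implementation is not shown in this file and may well differ):

```python
class TypesafeBase:
    """Sketch: reject assignments that do not match the class-level
    type annotations. Illustrative only, not the real TypesafeDataclass."""

    def __setattr__(self, name, value):
        expected = type(self).__annotations__.get(name)
        if expected is not None and not isinstance(value, expected):
            raise TypeError(
                f"{name} expects {expected.__name__}, got {type(value).__name__}"
            )
        super().__setattr__(name, value)


class Env(TypesafeBase):
    url: str = "http://localhost:3001/"
    debug: bool = False


env = Env()
env.debug = True   # OK: matches the bool annotation
# env.url = 42     # would raise TypeError: url expects str, got int
```

Only simple annotations such as `str` and `bool` work with `isinstance` here; generics like `List[str]` (which the original file imports) would need extra handling.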
942c0a693ba17d410d8999d5102376d8d5967101 | 8,825 | py | Python | eurlex2lexparency/transformation/utils/liap.py | Lexparency/eurlex2lexparency | b4958f6fea5c2207eb06d2c3b91be798720c94bd | [
"MIT"
] | null | null | null | eurlex2lexparency/transformation/utils/liap.py | Lexparency/eurlex2lexparency | b4958f6fea5c2207eb06d2c3b91be798720c94bd | [
"MIT"
] | null | null | null | eurlex2lexparency/transformation/utils/liap.py | Lexparency/eurlex2lexparency | b4958f6fea5c2207eb06d2c3b91be798720c94bd | [
"MIT"
] | null | null | null | """
"""
import re
from collections import namedtuple
from functools import lru_cache
from lexref.model import Value
__all__ = ['ListItemsAndPatterns']
romans_pattern = Value.tag_2_pattern('EN')['ROM_L'].pattern.strip('b\\()')
_eur_lex_item_patterns_en = { # key: (itemization-character-pattern, ordered [bool], first two items, decorations)
# TODO: Amendments could cause Itemizations of the type "5a. ". Keep that in mind and see if / how the code
# TODO: can cope with that.
'nump': (re.compile(r'^[1-9][0-9]{,3}\.' + chr(160) + '{,3}', flags=re.UNICODE), True,
('1', '2'), '(). ' + chr(160)),
'numpt': (re.compile(r'^[0-9]{1,3}\.?(?!([0-9/();]| of))'), True, # TODO: This pattern does not belong here!
('1', '2'), '.'), # TODO: => get rid of it!
'numbr': (re.compile(r'^\([0-9]{1,3}\)'), True,
('1', '2'), '()'), # 2
'alpha': (re.compile(r'^\([a-z]\)'), True,
('a', 'b'), '()'), # 3
'roman': (re.compile(r'^\((' + romans_pattern + r')\)'), True,
('i', 'ii'), '()'),
'dash': (re.compile(u'^(—|' + chr(8212) + ')', flags=re.UNICODE), False, None, None)
}
_eur_lex_item_patterns_es = { # key: (itemization-character-pattern, ordered [bool], first two items, decorations)
'nump': (re.compile(r'^[1-9][0-9]{,3}\.' + chr(160) + '{,3}', flags=re.UNICODE), True,
('1', '2'), '(). ' + chr(160)),
'numpt': (re.compile(r'^[0-9]{1,3}\.?(?!([0-9/();]| de))'), True,
# TODO: This pattern does not belong here!
('1', '2'), '.'), # TODO: => get rid of it!
'numbr': (re.compile(r'^\([0-9]{1,3}\)'), True,
('1', '2'), '()'), # 2
'alpha': (re.compile(r'^\([a-z]\)'), True,
('a', 'b'), '()'), # 3
'roman': (re.compile(r'^\((' + romans_pattern + r')\)'), True,
('i', 'ii'), '()'),
'dash': (re.compile(u'^(—|' + chr(8212) + ')', flags=re.UNICODE),
False, None, None)
}
_eur_lex_item_patterns_de = { # key: (itemization-character-pattern, ordered [bool], first two items, decorations)
'nump': (re.compile(r'^\([0-9]{1,3}\)'), True, ('1', '2'), '()'), # 2
'alpha': (re.compile(r'^\([a-z]\)'), True, ('a', 'b'), '()'), # 3
'roman': (re.compile(r'^\((' + romans_pattern + r')\)'), True, ('i', 'ii'), '()'),
'dash': (re.compile(u'^(—|' + chr(8212) + ')', flags=re.UNICODE),
False, None, None)
}
_eur_lex_item_patterns_hierarchy = ['nump', 'numpt', 'numbr', 'alpha', 'roman', 'dash']
class ListItemPattern:
FirstSecond = namedtuple('FirstSecond', ['first', 'second'])
def __init__(self, tag, # Tag is used as CSS class on the surface
item_pattern, ordered, first_two_items, decoration):
self.item_pattern = item_pattern
self.tag = tag
self.ordered = ordered
self.first_two_items = (None if first_two_items is None
else self.FirstSecond(*first_two_items))
self.decoration = decoration
@classmethod
@lru_cache()
def create(cls, tag, # Tag is used as CSS class on the surface
item_pattern, ordered, first_two_items, decoration):
return cls(tag, item_pattern, ordered, first_two_items, decoration)
@lru_cache()
class ListItemsAndPatterns:
TagProposal = namedtuple('TagProposal', ['tags', 'inner'])
def __init__(self, language, document_domain, known_firsts=False):
if document_domain.lower() == 'eu':
try:
_eur_lex_item_patterns = eval(
f'_eur_lex_item_patterns_{language.lower()}')
except NameError:
raise NotImplementedError(
f'It seems that the time has come to implement '
f'language {language} for domain eu.'
)
else:
self.list_item_patterns = {
key: ListItemPattern.create(key, *value)
for key, value in _eur_lex_item_patterns.items()
}
self.known_firsts = known_firsts
self.list_label_generic = re.compile('^(' + '|'.join(
['(' + x.item_pattern.pattern.strip('^') + ')'
for x in self.list_item_patterns.values()]) + ')')
self.tag_hierarchy = _eur_lex_item_patterns_hierarchy
else:
raise NotImplementedError(f'It seems that the time has come to '
f'implement domain {document_domain}')
def get_list_item_tag(self, arg, force_ambivalence_resolution=True):
if type(arg) is str:
if force_ambivalence_resolution:
return self.get_list_item_tag([arg])[0]
else:
tag_candidates = set()
inner = None
for list_item_pattern in self.list_item_patterns.values():
m = list_item_pattern.item_pattern.match(arg)
if m is not None:
if inner is None:
inner = m.group(0).strip(list_item_pattern.decoration)
elif inner != m.group(0).strip(list_item_pattern.decoration):
raise RuntimeError("Unexpected ambivalence (type 0) "
"within ListItemsHandler")
tag_candidates |= {list_item_pattern.tag}
return self.TagProposal(tag_candidates, inner)
elif type(arg) is list:
tags_list = [
self.get_list_item_tag(it, force_ambivalence_resolution=False)
for it in arg
]
self._resolve_ambivalences(tags_list)
return tags_list
def __getitem__(self, item):
return self.list_item_patterns[item]
def _resolve_ambivalences(self, tag_candidates_list):
"""
1. Identify
:param tag_candidates_list:
:return:
TODO: This routine works more or less fine. However, it does not
really take into account all the context sensitivity that may arise.
Furthermore, at least two of the test cases have no unique solution,
yet this routine simply chooses one possible solution, which is more
than questionable. Finally, this routine does not take into account
the full nested structure of the itemization, which would clearly
help to make the outcome of this task more correct for all possible
input cases.
def ambivalence_resolvable(tag_list):
for tag_l in tag_list:
for tag_r in tag_list:
if tag_r > tag_l:
if self[tag_l] != self[tag_r]:
return True
return False
# TODO: distinction between two types of ambivalence:
ambivalent_cases = [k for k, (tags, inner) in enumerate(tag_candidates_list)
if ambivalence_resolvable(tags)]
# TODO: Not resolvable cases must be handled via the hierarchy
for k in ambivalent_cases:
case = tag_candidates_list[k]
if k < len(tag_candidates_list) - 1:
subsequent = tag_candidates_list[k+1]
if k + 1 not in ambivalent_cases:
# If the adjacent item is not ambivalent. The tag of the subsequent is it
if subsequent.tags.issubset(case.tags) \
and self[subsequent.tags.copy().pop()].first_two_items.first != subsequent.inner:
tag_candidates_list[k] = self.TagProposal(subsequent.tags, case.inner)
continue
if k > 0: # No successor of case but a precedent (of course)
preceding = tag_candidates_list[k-1]
if k - 1 not in ambivalent_cases:
if preceding.tags.issubset(case.tags): # and case is not first with respect to preceding tag
if self[preceding.tags.copy().pop()].first_two_items.first != case.inner:
tag_candidates_list[k] = self.TagProposal(preceding.tags, case.inner)
continue
else:
case.tags.remove(preceding.tags.copy().pop())
continue
for tag in self.tag_hierarchy: # map to hierarchy and take the first one.
if tag in case.tags:
tag_candidates_list[k] = self.TagProposal({tag}, case.inner)
continue
if len([_ for _ in tag_candidates_list if len(_.tags) > 1]) > 0:
self._resolve_ambivalences(tag_candidates_list)
| 47.446237 | 115 | 0.546062 | 1,045 | 8,825 | 4.435407 | 0.221053 | 0.03301 | 0.028047 | 0.031068 | 0.372168 | 0.34671 | 0.327508 | 0.289752 | 0.289752 | 0.271629 | 0 | 0.015823 | 0.31966 | 8,825 | 185 | 116 | 47.702703 | 0.756163 | 0.170425 | 0 | 0.237762 | 0 | 0 | 0.100584 | 0.013495 | 0 | 0 | 0 | 0.016216 | 0 | 1 | 0.048951 | false | 0 | 0.027972 | 0.013986 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
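The ambivalence handling above exists because some labels match more than one pattern: `(i)` is both a single letter for `alpha` and a roman numeral for `roman`. A stripped-down reproduction of that collision, with a hand-written roman alternation standing in for the one derived from `lexref.model.Value`:

```python
import re

# Simplified stand-ins for the compiled patterns in the tables above; the
# roman alternation here is hand-written, not taken from lexref.
PATTERNS = {
    "numbr": re.compile(r"^\([0-9]{1,3}\)"),
    "alpha": re.compile(r"^\([a-z]\)"),
    "roman": re.compile(r"^\((i{1,3}|iv|vi{0,3}|ix|xi{0,3})\)"),
}


def candidate_tags(label):
    """All itemization tags whose pattern matches the label."""
    return {tag for tag, pat in PATTERNS.items() if pat.match(label)}


candidate_tags("(i)")   # ambiguous: {'alpha', 'roman'}
candidate_tags("(ii)")  # {'roman'}
candidate_tags("(3)")   # {'numbr'}
```

`_resolve_ambivalences` then narrows such candidate sets using the neighbouring items and, as a last resort, the tag hierarchy.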
942c240cacbca2c4496639534336a931e698439a | 409 | py | Python | setup.py | JulianGindi/auto-semver | 8e98a341155ac44a698310b08177c3e8c27aa201 | [
"MIT"
] | 5 | 2019-12-18T17:35:09.000Z | 2021-07-06T01:20:58.000Z | setup.py | JulianGindi/auto-increment-semver | 8e98a341155ac44a698310b08177c3e8c27aa201 | [
"MIT"
] | null | null | null | setup.py | JulianGindi/auto-increment-semver | 8e98a341155ac44a698310b08177c3e8c27aa201 | [
"MIT"
] | 1 | 2020-11-13T05:51:35.000Z | 2020-11-13T05:51:35.000Z | from setuptools import setup
setup(
name="auto-semver",
version="0.8.0",
description="Semver swiss-army knife",
url="http://github.com/juliangindi/auto-semver",
author="Julian Gindi",
author_email="julian@gindi.io",
license="MIT",
packages=["auto_semver"],
zip_safe=False,
entry_points={
"console_scripts": ["auto-semver=auto_semver.__main__:main",]
},
)
| 22.722222 | 69 | 0.650367 | 50 | 409 | 5.12 | 0.7 | 0.195313 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009009 | 0.185819 | 409 | 17 | 70 | 24.058824 | 0.75976 | 0 | 0 | 0 | 0 | 0 | 0.42402 | 0.090686 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9430c7d26f1a2d5ac3fef69dacab439ea6ec2d2d | 477 | py | Python | shortly/settings.py | fengsp/shortly | 29532523c2db297c995a7e94c84df6d884ce240e | [
"BSD-3-Clause"
] | 11 | 2015-01-01T03:22:09.000Z | 2021-02-12T14:08:06.000Z | shortly/settings.py | fengsp/shortly | 29532523c2db297c995a7e94c84df6d884ce240e | [
"BSD-3-Clause"
] | null | null | null | shortly/settings.py | fengsp/shortly | 29532523c2db297c995a7e94c84df6d884ce240e | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
shortly.settings
~~~~~~~~~~~~~~~~
Shortly config.
:copyright: (c) 2014 by fsp.
:license: BSD.
"""
import os
DEBUG = False
# Detect environment by whether debug named file exists or not
if os.path.exists(os.path.join(os.path.dirname(__file__), 'debug')):
DEBUG = True
if DEBUG:
REDIS_HOST = 'localhost'
REDIS_PORT = 6379
REDIS_DB = 0
else:
REDIS_HOST = 'localhost'
REDIS_PORT = 6379
REDIS_DB = 0
| 17.035714 | 68 | 0.607966 | 63 | 477 | 4.444444 | 0.603175 | 0.064286 | 0.128571 | 0.164286 | 0.278571 | 0.278571 | 0.278571 | 0.278571 | 0.278571 | 0 | 0 | 0.041436 | 0.24109 | 477 | 27 | 69 | 17.666667 | 0.732044 | 0.375262 | 0 | 0.5 | 0 | 0 | 0.085502 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9436ee02169dab30dcc57c297b85baa82e5a02a7 | 246 | py | Python | deeptrade/__init__.py | deeptrade-tech/deeptrade_api | 4612f1057ff4532b3f4b96f3fe07c2b7150f80b7 | [
"MIT"
] | 1 | 2019-09-30T05:33:24.000Z | 2019-09-30T05:33:24.000Z | deeptrade/__init__.py | deeptrade-tech/deeptrade_api | 4612f1057ff4532b3f4b96f3fe07c2b7150f80b7 | [
"MIT"
] | null | null | null | deeptrade/__init__.py | deeptrade-tech/deeptrade_api | 4612f1057ff4532b3f4b96f3fe07c2b7150f80b7 | [
"MIT"
] | null | null | null | from __future__ import absolute_import, division, print_function
# configuration variables
api_key = None
api_base = "https://www.deeptrade.ch/"
# API sentiment
from deeptrade.sentiment import *
# API stocks
from deeptrade.stocks import *
| 15.375 | 64 | 0.776423 | 31 | 246 | 5.903226 | 0.612903 | 0.142077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146341 | 246 | 15 | 65 | 16.4 | 0.871429 | 0.195122 | 0 | 0 | 0 | 0 | 0.128866 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
94388603ebcb4df77f1d83d5d6abd9980c49bf05 | 261 | py | Python | maad/cluster/__init__.py | jflatorreg/scikit-maad | f7c4ac1370dcf416b7014f94784d71549623593f | [
"BSD-3-Clause"
] | 3 | 2021-04-17T21:13:57.000Z | 2021-04-25T00:55:18.000Z | maad/cluster/__init__.py | jflatorreg/scikit-maad | f7c4ac1370dcf416b7014f94784d71549623593f | [
"BSD-3-Clause"
] | null | null | null | maad/cluster/__init__.py | jflatorreg/scikit-maad | f7c4ac1370dcf416b7014f94784d71549623593f | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
""" cluster functions for scikit-maad
Cluster regions of interest using High Dimensional Data Clsutering (HDDC).
"""
from .hdda import (HDDC)
from .cluster_func import (do_PCA)
__all__ = ['HDDC',
'do_PCA']
| 18.642857 | 76 | 0.628352 | 32 | 261 | 4.90625 | 0.75 | 0.101911 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005076 | 0.245211 | 261 | 13 | 77 | 20.076923 | 0.791878 | 0.505747 | 0 | 0 | 0 | 0 | 0.093458 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
943af4c9b4f0d02449e03edde4cb90f8a5ee5e58 | 929 | py | Python | pipng/imagescale/Globals.py | nwiizo/joke | 808c4c998cc7f5b7f6f3fb5a3ce421588a70c087 | [
"MIT"
] | 1 | 2017-01-11T06:12:24.000Z | 2017-01-11T06:12:24.000Z | pipng/imagescale/Globals.py | ShuyaMotouchi/joke | 808c4c998cc7f5b7f6f3fb5a3ce421588a70c087 | [
"MIT"
] | null | null | null | pipng/imagescale/Globals.py | ShuyaMotouchi/joke | 808c4c998cc7f5b7f6f3fb5a3ce421588a70c087 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# Copyright © 2012-13 Qtrac Ltd. All rights reserved.
# This program or module is free software: you can redistribute it
# and/or modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version. It is provided for
# educational purposes and is distributed in the hope that it will be
# useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
ABOUT = "About"
APPNAME = "ImageScale"
GENERAL = "General"
PAD = "0.75m"
POSITION = "position"
RESTORE = "Restore"
SOURCE, TARGET = ("SOURCE", "TARGET")
VERSION = "1.0.0"
WORKING, CANCELED, TERMINATING, IDLE = ("WORKING", "CANCELED",
"TERMINATING", "IDLE")
class Canceled(Exception): pass
| 38.708333 | 73 | 0.722282 | 133 | 929 | 5.052632 | 0.661654 | 0.035714 | 0.03869 | 0.056548 | 0.077381 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018592 | 0.189451 | 929 | 23 | 74 | 40.391304 | 0.87251 | 0.631862 | 0 | 0 | 0 | 0 | 0.288026 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.090909 | 0 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
943b27a6307e3dd5baf6b7f00cd5eae08fb5833d | 861 | py | Python | examples/old-examples/pygame/01_hello_world.py | Vallentin/ModernGL | 18ce7d3f28c68b2d305da332ae0afd221dc043d1 | [
"MIT"
] | null | null | null | examples/old-examples/pygame/01_hello_world.py | Vallentin/ModernGL | 18ce7d3f28c68b2d305da332ae0afd221dc043d1 | [
"MIT"
] | null | null | null | examples/old-examples/pygame/01_hello_world.py | Vallentin/ModernGL | 18ce7d3f28c68b2d305da332ae0afd221dc043d1 | [
"MIT"
] | 1 | 2020-07-10T23:26:36.000Z | 2020-07-10T23:26:36.000Z | import struct
import ModernGL
import pygame
from pygame.locals import DOUBLEBUF, OPENGL
pygame.init()
pygame.display.set_mode((800, 600), DOUBLEBUF | OPENGL)
ctx = ModernGL.create_context()
vert = ctx.vertex_shader('''
#version 330
in vec2 vert;
void main() {
gl_Position = vec4(vert, 0.0, 1.0);
}
''')
frag = ctx.fragment_shader('''
#version 330
out vec4 color;
void main() {
color = vec4(0.30, 0.50, 1.00, 1.0);
}
''')
prog = ctx.program([vert, frag])
vbo = ctx.buffer(struct.pack('6f', 0.0, 0.8, -0.6, -0.8, 0.6, -0.8))
vao = ctx.simple_vertex_array(prog, vbo, ['vert'])
running = True
while running:
for event in pygame.event.get():
if event.type == pygame.QUIT:
running = False
ctx.clear(0.9, 0.9, 0.9)
vao.render()
pygame.display.flip()
pygame.time.wait(10)
# --- app/core/tests/test_models.py (prafullkumar41/recipe-app-api, MIT) ---
from django.test import TestCase
from django.contrib.auth import get_user_model
class ModelTests(TestCase):
def test_create_user_with_email(self):
        '''Test creating a new user with an email is successful'''
email = 'test@gmail.com'
password = 'test123'
user = get_user_model().objects.create_user(
email=email,
password=password
)
self.assertEqual(user.email, email)
self.assertTrue(user.check_password(password))
    def test_user_email_is_normalized(self):
email = 'test@DEV.COM'
user = get_user_model().objects.create_user(email, 'test123')
self.assertEqual(user.email, email.lower())
def test_email_field_not_empty(self):
'''Raises Error if email is not provided'''
with self.assertRaises(ValueError):
get_user_model().objects.create_user(None, 'test123')
def test_create_super_user(self):
        '''Test creating a new superuser'''
        user = get_user_model().objects.create_superuser(
            'vj@dev.com',
'tst123'
)
self.assertTrue(user.is_superuser)
self.assertTrue(user.is_staff)
# --- uni_ticket/migrations/0176_ticketcategorywsprotocollo_protocollo_uo_rpa_matricola.py (biotech2021/uniTicket, Apache-2.0) ---
# Generated by Django 3.2.7 on 2021-11-11 09:18
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('uni_ticket', '0175_alter_ticketcategorywsprotocollo_protocollo_uo_rpa'),
]
operations = [
migrations.AddField(
model_name='ticketcategorywsprotocollo',
name='protocollo_uo_rpa_matricola',
field=models.CharField(blank=True, default='', help_text='Matricola RPA sul sistema di protocollo', max_length=255, verbose_name='RPA matricola'),
),
]
| 29.894737 | 158 | 0.684859 | 63 | 568 | 5.968254 | 0.730159 | 0.06383 | 0.079787 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049217 | 0.213028 | 568 | 18 | 159 | 31.555556 | 0.791946 | 0.079225 | 0 | 0 | 1 | 0 | 0.326296 | 0.207294 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9446b9ff584bcbd9ff011026042c303a28b108ae | 7,988 | py | Python | tests/run-tests.py | notofonts/NotoSansDuployan | d3bf2d4436d376501d0720e5d9f1fe032f8c5ffc | [
"Apache-2.0"
] | 6 | 2020-04-06T02:14:07.000Z | 2022-03-22T09:13:47.000Z | tests/run-tests.py | dscorbett/duployan-font | 966e4e233f56d818bbbcb4548f1cf232cd3fe4a1 | [
"Apache-2.0"
] | 2 | 2021-07-19T10:20:41.000Z | 2021-12-16T01:25:02.000Z | tests/run-tests.py | notofonts/NotoSansDuployan | d3bf2d4436d376501d0720e5d9f1fe032f8c5ffc | [
"Apache-2.0"
] | 1 | 2019-08-04T03:40:57.000Z | 2019-08-04T03:40:57.000Z | #!/usr/bin/env python3
# Copyright 2018-2019 David Corbett
# Copyright 2020-2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import argparse
import difflib
import json
import os
import re
import subprocess
import sys
CI = os.getenv('CI') == 'true'
DISAMBIGUATION_SUFFIX_PATTERN = re.compile(r'\._[0-9A-F]+$')
GLYPH_POSITION_PATTERN = re.compile(r'@-?[0-9]+,-?[0-9]+')
NOTDEF_PATTERN = re.compile(r'[\[|]\.notdef@')
SPACE_NAME_COMPONENT_PATTERN = re.compile(r'(?<=[\[|])(?:uni00A0|uni200[0-9A]|uni202F|uni205F|uni3000)(?![0-9A-Za-z_])')
FULL_FONT_CODE_POINTS = [0x034F]
NAME_PREFIX = r'(?:(?:dupl|u(?:ni(?:[0-9A-F]{4})+|[0-9A-F]{4,6})(?:_[^.]*)?)\.)'
UNSTABLE_NAME_COMPONENT_PATTERN = re.compile(fr'(?<=[\[|])(?:{NAME_PREFIX}[0-9A-Za-z_]+|(?!{NAME_PREFIX})[0-9A-Za-z_]+)')
def parse_color(color):
if color == 'auto':
return CI or sys.stdout.isatty()
if color == 'no':
return False
if color == 'yes':
return True
raise ValueError(f'Invalid --color value: {color}')
def parse_json(s):
x = 0
y = 0
for glyph in json.loads(s):
if not (name := glyph['g']).startswith('_'):
yield f'''{
DISAMBIGUATION_SUFFIX_PATTERN.sub('', name)
}@{
x + glyph["dx"]
},{
y + glyph["dy"]
}'''
x += int(glyph['ax'])
y += int(glyph['ay'])
yield f'_@{x},{y}'
def munge(output, regular, incomplete):
if incomplete:
output = UNSTABLE_NAME_COMPONENT_PATTERN.sub('dupl', output)
if not regular:
output = GLYPH_POSITION_PATTERN.sub('', output)
return output
def print_diff(code_points, options, actual_output, expected_output, color):
if color:
highlighted_actual_output = []
highlighted_expected_output = []
matcher = difflib.SequenceMatcher(None, actual_output, expected_output, False)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
if tag == 'equal':
highlighted_actual_output.append(actual_output[i1:i2])
highlighted_expected_output.append(expected_output[j1:j2])
elif tag == 'delete':
highlighted_actual_output.append('\x1B[1;96m')
highlighted_actual_output.append(actual_output[i1:i2])
highlighted_actual_output.append('\x1B[0m')
elif tag == 'insert':
highlighted_expected_output.append('\x1B[1;93m')
highlighted_expected_output.append(expected_output[j1:j2])
highlighted_expected_output.append('\x1B[0m')
elif tag == 'replace':
highlighted_actual_output.append('\x1B[1;96m')
highlighted_actual_output.append(actual_output[i1:i2])
highlighted_actual_output.append('\x1B[0m')
highlighted_expected_output.append('\x1B[1;93m')
highlighted_expected_output.append(expected_output[j1:j2])
highlighted_expected_output.append('\x1B[0m')
else:
assert False, f'Unknown tag: {tag}'
actual_output = ''.join(highlighted_actual_output)
expected_output = ''.join(highlighted_expected_output)
print()
print(f'Input: {code_points}:{options}')
print('Actual: ' + actual_output)
print('Expected: ' + expected_output)
def run_test(font, line, png_file, color, incomplete, view_all):
code_points, options, expected_output = line.split(':')
p = subprocess.Popen(
[
'hb-shape',
font,
'-u',
code_points,
'-O',
'json',
'--remove-default-ignorables',
*options.split(),
],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
)
stdout_data, stderr_data = p.communicate()
print(stderr_data.decode('utf-8'), end='', file=sys.stderr)
actual_output = f'[{"|".join(parse_json(stdout_data.decode("utf-8")))}]'
regular = font.endswith('-Regular.otf')
passed = (munge(actual_output, regular, incomplete) == munge(expected_output, regular, incomplete)
or incomplete and (
NOTDEF_PATTERN.search(actual_output)
or SPACE_NAME_COMPONENT_PATTERN.search(expected_output)
or any(int(cp, 16) in FULL_FONT_CODE_POINTS for cp in code_points.split())
)
)
if not passed or view_all:
if not passed:
print_diff(code_points, options, actual_output, expected_output, color)
if not CI:
os.makedirs(os.path.dirname(png_file), exist_ok=True)
png_file = '{}-{}.png'.format(png_file, code_points.replace(' ', '-'))
p = subprocess.Popen(
[
'hb-view',
'--font-file',
font,
'--font-size',
'upem',
'-u',
f'E000 {code_points} E000',
'--remove-default-ignorables',
'-o',
png_file,
'-O',
'png',
'--margin',
'800 0',
*options.split(),
],
stderr=subprocess.PIPE,
stdout=subprocess.PIPE)
p.wait()
print(p.stderr.read().decode('utf-8'), end='', file=sys.stderr)
return (passed, ':'.join([code_points, options, actual_output]))
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Run shaping tests.')
parser.add_argument('--color', default='auto', help='Whether to print diffs in color: "yes", "no", or "auto".')
parser.add_argument('--incomplete', action='store_true', help='Whether the font is less than the complete font. Do not fail a test if the actual result contains `.notdef`. Ignore the parts of glyph names that indicate code points.')
parser.add_argument('--view', action='store_true', help='Render all test cases, not just the failures.')
parser.add_argument('font', help='The path to a font.')
parser.add_argument('tests', nargs='*', help='The paths to test files.')
args = parser.parse_args()
color = parse_color(args.color.lower())
passed_all = True
failed_dir = os.path.join(os.path.dirname(sys.argv[0]), 'failed', os.path.basename(args.font))
os.makedirs(failed_dir, exist_ok=True)
for fn in args.tests:
result_lines = []
passed_file = True
with open(fn) as f:
for line_number, line in enumerate(f, start=1):
line = line.rstrip()
if line and line[0] != '#':
passed_line, result_line = run_test(
args.font,
line,
os.path.join(failed_dir, 'png', os.path.basename(fn), '{:03}'.format(line_number)),
color,
args.incomplete,
args.view,
)
passed_file = passed_file and passed_line
result_lines.append(result_line + '\n')
else:
result_lines.append(line + '\n')
if not passed_file:
with open(os.path.join(failed_dir, os.path.basename(fn)), 'w') as f:
f.writelines(result_lines)
passed_all = passed_all and passed_file
if not passed_all:
sys.exit(1)
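To make the glyph-stream format concrete, here is a self-contained sketch of the `parse_json` logic from above applied to a hand-written hb-shape JSON payload; the payload itself is invented for illustration.

```python
import json
import re

DISAMBIGUATION_SUFFIX_PATTERN = re.compile(r'\._[0-9A-F]+$')


def parse_json(s):
    # Same logic as above: accumulate advances, drop '_'-prefixed glyphs,
    # strip disambiguation suffixes, and emit a final '_' cursor position.
    x = 0
    y = 0
    for glyph in json.loads(s):
        if not (name := glyph['g']).startswith('_'):
            yield f'{DISAMBIGUATION_SUFFIX_PATTERN.sub("", name)}@{x + glyph["dx"]},{y + glyph["dy"]}'
        x += int(glyph['ax'])
        y += int(glyph['ay'])
    yield f'_@{x},{y}'


payload = json.dumps([
    {'g': 'u1BC02._0A', 'dx': 0, 'dy': 0, 'ax': 500, 'ay': 0},
    {'g': '_helper', 'dx': 10, 'dy': 0, 'ax': 0, 'ay': 0},
])
print('[{}]'.format('|'.join(parse_json(payload))))  # → [u1BC02@0,0|_@500,0]
```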
# --- test_migrations/contrib/pytest_plugin/plugin.py (skarzi/django-test-migrations, MIT) ---
import pytest
from test_migrations import constants
from .fixtures import migrator # pylint: disable=W0611
pytest_plugins = ['pytest_django'] # pylint: disable=C0103
def pytest_load_initial_conftests(early_config):
# Register the marks
early_config.addinivalue_line(
'markers',
(
"{marker}: Mark the test as a"
"Django migration test. Dynamically add `transactional_db` "
"fixture to marked item. Migration tests are run only when "
"`--test-migrations` pytest's CLI option passed."
).format(marker=constants.MIGRATIONS_TEST_MARKER),
)
def pytest_addoption(parser):
"""Add option for running migration tests.
"""
group = parser.getgroup('django_test_migrations')
group._addoption( # pylint: disable=W0212
'--test-migrations',
action='store_true',
dest='test_migrations',
default=False,
help=(
"Run Django migration tests. This does the following: "
" ensure migrations are enabled, skip all test not marked with "
"`{marker}` marker."
).format(marker=constants.MIGRATIONS_TEST_MARKER)
)
def pytest_sessionstart(session):
if session.config.getoption('test_migrations', False):
        # TODO: consider raising AssertionError when `nomigrations` is falsy
session.config.option.nomigrations = False
def pytest_collection_modifyitems(session, items):
migration_test_skip_marker = pytest.mark.skip(
reason=(
            'Skipped because the `--test-migrations` option was passed; '
            'only tests marked as migration tests are run.'
),
)
for item in items:
# mark all tests using `migrator` fixture with `MIGRATION_TEST_MARKER`
if 'migrator' in getattr(item, 'fixturenames', list()):
item.add_marker(constants.MIGRATIONS_TEST_MARKER)
# skip all no migration tests when option `--test-migrations` passed
if (
session.config.getoption('test_migrations', False)
and not item.get_closest_marker(constants.MIGRATIONS_TEST_MARKER)
):
item.add_marker(migration_test_skip_marker)
@pytest.fixture(autouse=True, scope='function')
def _django_migration_marker(request):
"""Implement the migration marker, internal to `django_test_migrations`.
This will dynamically request the `transactional_db` fixture
    and skip tests marked with the migration marker if they were not
    explicitly requested via the `--test-migrations` option.
"""
marker = request.node.get_closest_marker(constants.MIGRATIONS_TEST_MARKER)
if marker:
if request.config.getoption('test_migrations', False):
request.getfixturevalue('transactional_db')
else:
pytest.skip(
msg=(
                    'Migration tests require `migrations` to be enabled and '
                    'can be slow, hence they should be run separately with '
                    'the pytest `--test-migrations` option.'
),
)
# --- hrp/models.py (paleocore/paleocore110, MIT) ---
import os
from django.contrib.gis.db import models
from hrp.ontologies import *
# from hrp.ontologies import ITEM_TYPE_VOCABULARY, HRP_COLLECTOR_CHOICES, \
# HRP_COLLECTING_METHOD_VOCABULARY, HRP_BASIS_OF_RECORD_VOCABULARY, HRP_COLLECTION_CODES
from django.contrib.gis.geos import Point
import projects.models
class TaxonRank(projects.models.TaxonRank):
class Meta:
verbose_name = "HRP Taxon Rank"
verbose_name_plural = "HRP Taxon Ranks"
class Taxon(projects.models.Taxon):
parent = models.ForeignKey('self', null=True, blank=True)
rank = models.ForeignKey(TaxonRank)
class Meta:
verbose_name = "HRP Taxon"
verbose_name_plural = "HRP Taxa"
class IdentificationQualifier(projects.models.IdentificationQualifier):
class Meta:
verbose_name = "HRP ID Qualifier"
verbose_name_plural = "HRP ID Qualifiers"
# Locality Class
class Locality(projects.models.PaleoCoreLocalityBaseClass):
id = models.CharField(primary_key=True, max_length=255)
collection_code = models.CharField(null=True, blank=True, choices=HRP_COLLECTION_CODES, max_length=10)
locality_number = models.IntegerField(null=True, blank=True)
sublocality = models.CharField(null=True, blank=True, max_length=50)
description = models.TextField(null=True, blank=True, max_length=255)
stratigraphic_section = models.CharField(null=True, blank=True, max_length=50)
upper_limit_in_section = models.IntegerField(null=True, blank=True)
lower_limit_in_section = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
error_notes = models.CharField(max_length=255, null=True, blank=True)
notes = models.CharField(max_length=254, null=True, blank=True)
geom = models.PointField(srid=4326, blank=True, null=True)
date_last_modified = models.DateTimeField("Date Last Modified", auto_now=True)
objects = models.GeoManager()
def __str__(self):
nice_name = str(self.collection_code) + " " + str(self.locality_number) + str(self.sublocality)
return nice_name.replace("None", "").replace("--", "")
class Meta:
verbose_name = "HRP Locality"
verbose_name_plural = "HRP Localities"
ordering = ("locality_number", "sublocality")
class Person(projects.models.Person):
last_name = models.CharField("Last Name", null=True, blank=True, max_length=256)
first_name = models.CharField("First Name", null=True, blank=True, max_length=256)
class Meta:
verbose_name = "HRP Person"
verbose_name_plural = "HRP People"
ordering = ["last_name", "first_name"]
def __str__(self):
if self.last_name and self.first_name:
name = self.last_name+', '+self.first_name
else:
name = self.last_name
return name
# Occurrence Class and Subclasses
class Occurrence(projects.models.PaleoCoreOccurrenceBaseClass):
"""
Occurrence == Specimen, a general class for things discovered in the field.
Find's have three subtypes: Archaeology, Biology, Geology
Fields are grouped by comments into logical sets (i.e. ontological classes)
"""
basis_of_record = models.CharField("Basis of Record", max_length=50, blank=True, null=False,
help_text='e.g. Observed item or Collected item',
choices=HRP_BASIS_OF_RECORD_VOCABULARY) # NOT NULL dwc:basisOfRecord
field_number = models.CharField("Field Number", max_length=50, null=True, blank=True)
item_type = models.CharField("Item Type", max_length=255, blank=True, null=False,
choices=ITEM_TYPE_VOCABULARY) # NOT NULL
# TODO merge with taxon
item_scientific_name = models.CharField("Sci Name", max_length=255, null=True, blank=True)
# TODO merge with element
item_description = models.CharField("Description", max_length=255, blank=True, null=True)
item_count = models.IntegerField("Item Count", blank=True, null=True, default=1)
collector = models.CharField("Collector", max_length=50, blank=True, null=True, choices=HRP_COLLECTOR_CHOICES)
recorded_by = models.ForeignKey("Person", null=True, blank=True, related_name="occurrence_recorded_by")
finder = models.CharField("Finder", null=True, blank=True, max_length=50, choices=HRP_COLLECTOR_CHOICES)
found_by = models.ForeignKey("Person", null=True, blank=True, related_name="occurrence_found_by")
collecting_method = models.CharField("Collecting Method", max_length=50,
choices=HRP_COLLECTING_METHOD_VOCABULARY,
null=True, blank=True)
locality = models.ForeignKey("Locality", null=True, blank=True) # dwc:sampling_protocol
item_number = models.IntegerField("Item #", null=True, blank=True)
item_part = models.CharField("Item Part", max_length=10, null=True, blank=True)
cat_number = models.CharField("Cat Number", max_length=255, blank=True, null=True)
disposition = models.CharField("Disposition", max_length=255, blank=True, null=True)
preparation_status = models.CharField("Prep Status", max_length=50, blank=True, null=True)
# TODO rename collection remarks to find remarks
collection_remarks = models.TextField("Collection Remarks", null=True, blank=True, max_length=255)
# Geological Context
stratigraphic_formation = models.CharField("Formation", max_length=255, blank=True, null=True)
stratigraphic_member = models.CharField("Member", max_length=255, blank=True, null=True)
analytical_unit_1 = models.CharField(max_length=255, blank=True, null=True)
analytical_unit_2 = models.CharField(max_length=255, blank=True, null=True)
analytical_unit_3 = models.CharField(max_length=255, blank=True, null=True)
analytical_unit_found = models.CharField(max_length=255, blank=True, null=True)
analytical_unit_likely = models.CharField(max_length=255, blank=True, null=True)
analytical_unit_simplified = models.CharField(max_length=255, blank=True, null=True)
in_situ = models.BooleanField(default=False)
ranked = models.BooleanField(default=False)
weathering = models.SmallIntegerField(blank=True, null=True)
surface_modification = models.CharField("Surface Mod", max_length=255, blank=True, null=True)
geology_remarks = models.TextField("Geol Remarks", max_length=500, null=True, blank=True)
# Location
collection_code = models.CharField("Collection Code", max_length=20, blank=True, null=True)
drainage_region = models.CharField("Drainage Region", null=True, blank=True, max_length=255)
# Media
image = models.FileField(max_length=255, blank=True, upload_to="uploads/images/hrp", null=True)
class Meta:
verbose_name = "HRP Occurrence"
verbose_name_plural = "HRP Occurrences"
ordering = ["collection_code", "locality", "item_number", "item_part"]
def catalog_number(self):
"""
Generate a pretty string formatted catalog number from constituent fields
:return: catalog number as string
"""
if self.basis_of_record == 'Collection':
            # Create catalog number string. Null values become None when converted to string
if self.item_number:
if self.item_part:
item_text = '-' + str(self.item_number) + str(self.item_part)
else:
item_text = '-' + str(self.item_number)
else:
item_text = ''
catalog_number_string = str(self.collection_code) + " " + str(self.locality_id) + item_text
return catalog_number_string.replace('None', '').replace('- ', '') # replace None with empty string
else:
return None
@staticmethod
def fields_to_display():
fields = ("id", "barcode")
return fields
@staticmethod
def method_fields_to_export():
"""
Method to store a list of fields that should be added to data exports.
Called by export admin actions.
These fields are defined in methods and are not concrete fields in the DB so have to be declared.
:return:
"""
return ['longitude', 'latitude', 'easting', 'northing', 'catalog_number', 'photo']
def get_all_field_names(self):
"""
Field names from model
:return: list with all field names
"""
field_list = self._meta.get_fields() # produce a list of field objects
return [f.name for f in field_list] # return a list of names from each field
def get_foreign_key_field_names(self):
"""
Get foreign key fields
:return: returns a list of for key field names
"""
field_list = self._meta.get_fields() # produce a list of field objects
return [f.name for f in field_list if f.is_relation] # return a list of names for fk fields
def get_concrete_field_names(self):
"""
Get field names that correspond to columns in the DB
:return: returns a lift
"""
field_list = self._meta.get_fields()
return [f.name for f in field_list if f.concrete]
def photo(self):
try:
return u'<a href="%s"><img src="%s" style="width:600px" /></a>' \
% (os.path.join(self.image.url), os.path.join(self.image.url))
        except ValueError:  # image field has no file associated
            return None
photo.short_description = 'Photo'
photo.allow_tags = True
photo.mark_safe = True
def thumbnail(self):
try:
return u'<a href="%s"><img src="%s" style="width:100px" /></a>' \
% (os.path.join(self.image.url), os.path.join(self.image.url))
        except ValueError:  # image field has no file associated
            return None
thumbnail.short_description = 'Thumb'
thumbnail.allow_tags = True
thumbnail.mark_safe = True
class Biology(Occurrence):
# Biology
sex = models.CharField("Sex", null=True, blank=True, max_length=50)
life_stage = models.CharField("Life Stage", null=True, blank=True, max_length=50, choices=HRP_LIFE_STAGE_CHOICES)
size_class = models.CharField("Size Class", null=True, blank=True, max_length=50, choices=HRP_SIZE_CLASS_CHOICES)
# Taxon
taxon = models.ForeignKey(Taxon,
default=0, on_delete=models.SET_DEFAULT, # prevent deletion when taxa deleted
related_name='hrp_taxon_bio_occurrences')
identification_qualifier = models.ForeignKey(IdentificationQualifier, null=True, blank=True,
on_delete=models.SET_NULL,
related_name='hrp_id_qualifier_bio_occurrences')
qualifier_taxon = models.ForeignKey(Taxon, null=True, blank=True,
on_delete=models.SET_NULL,
related_name='hrp_qualifier_taxon_bio_occurrences')
verbatim_taxon = models.CharField(null=True, blank=True, max_length=1024)
verbatim_identification_qualifier = models.CharField(null=True, blank=True, max_length=255)
taxonomy_remarks = models.TextField(max_length=500, null=True, blank=True)
# Identification
identified_by = models.CharField(null=True, blank=True, max_length=100, choices=HRP_IDENTIFIER_CHOICES)
year_identified = models.IntegerField(null=True, blank=True)
type_status = models.CharField(null=True, blank=True, max_length=50)
fauna_notes = models.TextField(null=True, blank=True, max_length=64000)
# Element
side = models.CharField("Side", null=True, blank=True, max_length=50, choices=HRP_SIDE_CHOICES)
element = models.CharField("Element", null=True, blank=True, max_length=50, choices=HRP_ELEMENT_CHOICES)
# TODO add element_modifier choices once field is cleaned
element_modifier = models.CharField("Element Mod", null=True, blank=True, max_length=50,
choices=HRP_ELEMENT_MODIFIER_CHOICES)
# TODO populate portion after migrate
element_portion = models.CharField("Element Portion", null=True, blank=True, max_length=50,
choices=HRP_ELEMENT_PORTION_CHOICES)
# TODO populate number choices after migrate
element_number = models.CharField(null=True, blank=True, max_length=50, choices=HRP_ELEMENT_NUMBER_CHOICES)
element_remarks = models.TextField(max_length=500, null=True, blank=True)
tooth_upper_or_lower = models.CharField(null=True, blank=True, max_length=50)
tooth_number = models.CharField(null=True, blank=True, max_length=50)
tooth_type = models.CharField(null=True, blank=True, max_length=50)
# upper dentition fields
uli1 = models.BooleanField(default=False)
uli2 = models.BooleanField(default=False)
uli3 = models.BooleanField(default=False)
uli4 = models.BooleanField(default=False)
uli5 = models.BooleanField(default=False)
uri1 = models.BooleanField(default=False)
uri2 = models.BooleanField(default=False)
uri3 = models.BooleanField(default=False)
uri4 = models.BooleanField(default=False)
uri5 = models.BooleanField(default=False)
ulc = models.BooleanField(default=False)
urc = models.BooleanField(default=False)
ulp1 = models.BooleanField(default=False)
ulp2 = models.BooleanField(default=False)
ulp3 = models.BooleanField(default=False)
ulp4 = models.BooleanField(default=False)
urp1 = models.BooleanField(default=False)
urp2 = models.BooleanField(default=False)
urp3 = models.BooleanField(default=False)
urp4 = models.BooleanField(default=False)
ulm1 = models.BooleanField(default=False)
ulm2 = models.BooleanField(default=False)
ulm3 = models.BooleanField(default=False)
urm1 = models.BooleanField(default=False)
urm2 = models.BooleanField(default=False)
urm3 = models.BooleanField(default=False)
# lower dentition fields
lli1 = models.BooleanField(default=False)
lli2 = models.BooleanField(default=False)
lli3 = models.BooleanField(default=False)
lli4 = models.BooleanField(default=False)
lli5 = models.BooleanField(default=False)
lri1 = models.BooleanField(default=False)
lri2 = models.BooleanField(default=False)
lri3 = models.BooleanField(default=False)
lri4 = models.BooleanField(default=False)
lri5 = models.BooleanField(default=False)
llc = models.BooleanField(default=False)
lrc = models.BooleanField(default=False)
llp1 = models.BooleanField(default=False)
llp2 = models.BooleanField(default=False)
llp3 = models.BooleanField(default=False)
llp4 = models.BooleanField(default=False)
lrp1 = models.BooleanField(default=False)
lrp2 = models.BooleanField(default=False)
lrp3 = models.BooleanField(default=False)
lrp4 = models.BooleanField(default=False)
llm1 = models.BooleanField(default=False)
llm2 = models.BooleanField(default=False)
llm3 = models.BooleanField(default=False)
lrm1 = models.BooleanField(default=False)
lrm2 = models.BooleanField(default=False)
lrm3 = models.BooleanField(default=False)
# indeterminate dental fields
indet_incisor = models.BooleanField(default=False)
indet_canine = models.BooleanField(default=False)
indet_premolar = models.BooleanField(default=False)
indet_molar = models.BooleanField(default=False)
indet_tooth = models.BooleanField(default=False)
deciduous = models.BooleanField(default=False)
# Measurements
um_tooth_row_length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_1_length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_1_width_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_2_length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_2_width_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_3_length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
um_3_width_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_tooth_row_length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_1_length = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_1_width = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_2_length = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_2_width = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_3_length = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
lm_3_width = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
# TODO delete attributes, preparations and morphobank number
attributes = models.CharField(null=True, blank=True, max_length=50)
preparations = models.CharField(null=True, blank=True, max_length=50)
morphobank_number = models.IntegerField(null=True, blank=True) # empty, ok to delete
def __str__(self):
return str(self.taxon.__str__())
class Meta:
verbose_name = "HRP Biology"
verbose_name_plural = "HRP Biology"
class Archaeology(Occurrence):
find_type = models.CharField(null=True, blank=True, max_length=255)
length_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
width_mm = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
class Meta:
verbose_name = "HRP Archaeology"
verbose_name_plural = "HRP Archaeology"
class Geology(Occurrence):
find_type = models.CharField(null=True, blank=True, max_length=255)
dip = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
strike = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
color = models.CharField(null=True, blank=True, max_length=255)
texture = models.CharField(null=True, blank=True, max_length=255)
class Meta:
verbose_name = "HRP Geology"
verbose_name_plural = "HRP Geology"

# Hydrology Class
class Hydrology(models.Model):
    length = models.DecimalField(max_digits=38, decimal_places=8, null=True, blank=True)
    name = models.CharField(null=True, blank=True, max_length=50)
    size = models.IntegerField(null=True, blank=True)
    map_sheet = models.CharField(null=True, blank=True, max_length=50)
    geom = models.LineStringField(srid=4326)
    objects = models.GeoManager()  # GeoDjango manager (pre-Django-2.0 API)

    def __str__(self):
        return str(self.name)

    class Meta:
        verbose_name = "HRP Hydrology"
        verbose_name_plural = "HRP Hydrology"

# Media Classes
class Image(models.Model):
    # Django < 2.0 style; Django 2.0+ requires an explicit on_delete argument here.
    occurrence = models.ForeignKey("Occurrence", related_name='hrp_occurrences')
    image = models.ImageField(upload_to="uploads/images", null=True, blank=True)
    description = models.TextField(null=True, blank=True)


class File(models.Model):
    occurrence = models.ForeignKey("Occurrence")
    file = models.FileField(upload_to="uploads/files", null=True, blank=True)
    description = models.TextField(null=True, blank=True)

# --- File: object-oriented-programming/src/oop-interface.py (giserh/book-python, MIT) ---

class MLModelInterface:
    def fit(self, features, labels):
        raise NotImplementedError

    def predict(self, data):
        raise NotImplementedError


class KNeighborsClassifier(MLModelInterface):
    def fit(self, features, labels):
        pass

    def predict(self, data):
        pass


class LinearRegression(MLModelInterface):
    def fit(self, features, labels):
        pass

    def predict(self, data):
        pass


class LogisticRegression(MLModelInterface):
    pass
# Input to the classifier
features = [
    (5.1, 3.5, 1.4, 0.2),
    (4.9, 3.0, 1.4, 0.2),
    (4.7, 3.2, 1.3, 0.2),
    (7.0, 3.2, 4.7, 1.4),
    (6.4, 3.2, 4.5, 1.5),
    (6.9, 3.1, 4.9, 1.5),
    (6.3, 3.3, 6.0, 2.5),
    (5.8, 2.7, 5.1, 1.9),
    (7.1, 3.0, 5.9, 2.1),
]
# 0: I. setosa
# 1: I. versicolor
# 2: I. virginica
labels = [0, 0, 0, 1, 1, 1, 2, 2, 2]
to_predict = [
    (5.7, 2.8, 4.1, 1.3),
    (4.9, 2.5, 4.5, 1.7),
    (4.6, 3.4, 1.4, 0.3),
]
model = LinearRegression()
model.fit(features, labels)
model.predict(to_predict)
# [1, 2, 0]
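The interface above only fails at call time, when `raise NotImplementedError` actually runs. A stricter variant (a sketch using the standard `abc` module; `ConstantModel` and `Incomplete` are hypothetical toy classes, not part of the original file) refuses to even instantiate a subclass that forgot a method:

```python
from abc import ABC, abstractmethod
from collections import Counter

class MLModel(ABC):
    @abstractmethod
    def fit(self, features, labels): ...

    @abstractmethod
    def predict(self, data): ...

class ConstantModel(MLModel):
    """Toy model: always predicts the most frequent training label."""
    def fit(self, features, labels):
        self.most_common = Counter(labels).most_common(1)[0][0]
        return self

    def predict(self, data):
        return [self.most_common for _ in data]

class Incomplete(MLModel):   # forgot to implement predict()
    def fit(self, features, labels):
        return self

preds = ConstantModel().fit([(1.0,), (2.0,), (3.0,)], [0, 0, 1]).predict([(4.0,), (5.0,)])
try:
    Incomplete()             # abc blocks instantiation of incomplete subclasses
    abstract_enforced = False
except TypeError:
    abstract_enforced = True
```

With `abc`, a missing `predict` surfaces as a `TypeError` at construction rather than an exception deep inside a training loop.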

# --- File: tests/python/pants_test/backend/jvm/tasks/jvm_compile/test_resource_mapping.py (WamBamBoozle/pants, Apache-2.0) ---

# coding=utf-8
# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from __future__ import (absolute_import, division, generators, nested_scopes, print_function,
                        unicode_literals, with_statement)

import os

from pants.backend.jvm.tasks.jvm_compile.resource_mapping import ResourceMapping
from pants_test.base_test import BaseTest

class ResourceMappingTest(BaseTest):
    def test_resource_mapping_ok(self):
        rel_dir = 'tests/python/pants_test/backend/jvm/tasks/jvm_compile/test-data/resource_mapping'
        resource_mapping = ResourceMapping(rel_dir)
        self.assertEquals(2, len(resource_mapping.mappings))

    def test_resource_mapping_short(self):
        rel_dir = 'tests/python/pants_test/backend/jvm/tasks/jvm_compile/test-data/resource_mapping-broken-short'
        resource_mapping = ResourceMapping(rel_dir)
        with self.assertRaises(ResourceMapping.TruncatedFileException):
            resource_mapping.mappings

    def test_resource_mapping_long(self):
        rel_dir = 'tests/python/pants_test/backend/jvm/tasks/jvm_compile/test-data/resource_mapping-broken-long'
        resource_mapping = ResourceMapping(rel_dir)
        with self.assertRaises(ResourceMapping.TooLongFileException):
            resource_mapping.mappings

    def test_resource_mapping_mangled(self):
        rel_dir = 'tests/python/pants_test/backend/jvm/tasks/jvm_compile/test-data/resource_mapping-broken-mangled'
        resource_mapping = ResourceMapping(rel_dir)
        with self.assertRaises(ResourceMapping.UnparseableLineException):
            resource_mapping.mappings

    def test_resource_mapping_noitems(self):
        rel_dir = 'tests/python/pants_test/backend/jvm/tasks/jvm_compile/test-data/resource_mapping-broken-missing-items'
        resource_mapping = ResourceMapping(rel_dir)
        with self.assertRaises(ResourceMapping.MissingItemsLineException):
            resource_mapping.mappings
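Each test above uses the same `assertRaises` context-manager idiom: trigger the lazy `mappings` property inside the `with` block and let `unittest` verify the expected exception. A self-contained illustration of the pattern (`parse_mapping` and `TruncatedFileException` are hypothetical stand-ins, not the real pants `ResourceMapping` API):

```python
import unittest

class TruncatedFileException(Exception):
    pass

def parse_mapping(lines):
    # Raise when the input is cut short, mirroring a truncated mapping file.
    if len(lines) < 2:
        raise TruncatedFileException("expected at least 2 lines")
    return dict(line.split(':', 1) for line in lines)

class ParseMappingTest(unittest.TestCase):
    def test_truncated(self):
        with self.assertRaises(TruncatedFileException):
            parse_mapping(['only-one-line:x'])

    def test_ok(self):
        self.assertEqual(parse_mapping(['a:1', 'b:2']), {'a': '1', 'b': '2'})

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParseMappingTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The context-manager form is preferred over `assertRaises(Exc, callable, *args)` because it pinpoints exactly which statement must raise.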

# --- File: openport/common/tee.py (Deshdeepak1/openport, MIT) ---

import sys

class TeeStdOut(object):
    def __init__(self, name, mode):
        self.file = open(name, mode)
        self.stdout = sys.stdout
        sys.stdout = self

    def close(self):
        if self.stdout is not None:
            sys.stdout = self.stdout
            self.stdout = None
        if self.file is not None:
            self.file.close()
            self.file = None

    def write(self, data):
        self.file.write(data)
        self.stdout.write(data)

    def flush(self):
        self.file.flush()
        self.stdout.flush()

    def __del__(self):
        self.close()

class TeeStdErr(object):
    def __init__(self, name, mode):
        self.file = open(name, mode)
        self.stderr = sys.stderr
        sys.stderr = self

    def close(self):
        if self.stderr is not None:
            sys.stderr = self.stderr
            self.stderr = None
        if self.file is not None:
            self.file.close()
            self.file = None

    def write(self, data):
        self.file.write(data)
        self.stderr.write(data)

    def flush(self):
        self.file.flush()
        self.stderr.flush()

    def __del__(self):
        self.close()
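A standalone sketch of the same tee idea, using an in-memory `StringIO` as the secondary sink instead of a file so nothing touches the disk:

```python
import io
import sys

class Tee(object):
    """Duplicate every write to a primary stream and a secondary log stream."""
    def __init__(self, primary, log):
        self.primary = primary
        self.log = log

    def write(self, data):
        self.primary.write(data)
        self.log.write(data)

    def flush(self):
        self.primary.flush()
        self.log.flush()

log = io.StringIO()
original_stdout = sys.stdout
sys.stdout = Tee(original_stdout, log)
try:
    print("hello tee")           # goes to the real stdout *and* the log
finally:
    sys.stdout = original_stdout  # always restore, even on error

captured = log.getvalue()
```

Restoring `sys.stdout` in a `finally` block is the safer variant of what `TeeStdOut.close()` does, since an exception between install and close would otherwise leave the stream hijacked.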

# --- File: nickleback.py (helanan/kill_nickleback, MIT) ---

songs = {('Nickelback', 'How You Remind Me'), ('Will.i.am', 'That Power'), ('Miles Davis', 'Stella by Starlight'), ('Nickelback', 'Animals')}
# Using a set comprehension, create a new set that contains all songs that were not performed by Nickelback.
nonNickelback = {song for song in songs if song[0] != 'Nickelback'}

# --- File: main.py (mneeman/Removing_atmospheric_turbulence, MIT) ---

import multiprocessing
multiprocessing.set_start_method('spawn', True)
import argparse
import os
import numpy as np
import math
import sys
import torchvision
import torchvision.transforms as transforms
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.utils import make_grid
from torch.utils.tensorboard import SummaryWriter
from torch.autograd import Variable
import torch.nn as nn
import torch.nn.functional as F
import torch.autograd as autograd
import torch
from train import train
from test import test, test_moving
from models import UNet, Discriminator, weights_init
from functions import *
from data_util import MyDataset
torch.backends.cudnn.benchmark = True
torch.backends.cudnn.enabled = True
def main(opt):
    writer = SummaryWriter()
    log_dir = writer.get_logdir()
    os.makedirs(os.path.join(log_dir, "images"), exist_ok=True)
    os.makedirs(os.path.join(log_dir, "test"), exist_ok=True)
    device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")

    # Initialize generator and discriminator
    generator = UNet(opt.sample_num, opt.channels, opt.batch_size, opt.alpha)
    discriminator = Discriminator(opt.batch_size, opt.alpha)
    generator.to(device=device)
    discriminator.to(device=device)

    # Optimizers
    optimizer_G = torch.optim.Adam(generator.parameters(), lr=opt.lr_g, betas=(opt.b1, opt.b2))
    optimizer_D = torch.optim.Adam(discriminator.parameters(), lr=opt.lr_d, betas=(opt.b1, opt.b2))

    if opt.mode == 'train':
        generator = train(writer, log_dir, device, generator, discriminator, optimizer_G, optimizer_D, opt)
        test(opt, log_dir, generator=generator)
    if opt.mode == 'test':
        test(opt, log_dir)
        test_moving(opt, log_dir)

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("--n_epochs", type=int, default=100, help="number of epochs of training")
    parser.add_argument("--n_epochs_g", type=int, default=3, help="number of epochs of training only g")
    parser.add_argument("--batch_size", type=int, default=1, help="size of the batches")
    parser.add_argument("--lr_d", type=float, default=0.000001, help="adam: learning rate d")
    parser.add_argument("--lr_g", type=float, default=0.00004, help="adam: learning rate g")
    parser.add_argument("--b1", type=float, default=0.5, help="adam: decay of first order momentum of gradient")
    parser.add_argument("--b2", type=float, default=0.99, help="adam: decay of first order momentum of gradient")
    parser.add_argument("--alpha", type=float, default=0.2, help="Randomized Leaky ReLU activation layer")
    parser.add_argument("--lambda_gp", type=float, default=10, help="Loss weight for gradient penalty")
    parser.add_argument("--img_size", type=int, default=256, help="size of each image dimension")
    parser.add_argument("--channels", type=int, default=3, help="number of image channels")
    parser.add_argument("--sample_interval", type=int, default=30, help="interval between image samples")
    parser.add_argument("--load_model", type=str, default='', help="path to model to continue training")
    parser.add_argument("--windows", type=bool, default=False, help="run on windows")
    parser.add_argument("--mode", type=str, default='test', help="train, test")
    parser.add_argument("--dataset_dir", type=str, default='/Users/Maayan/Documents/databases/mit_100_frames', help="path to dataset directory")
    parser.add_argument("--reference_dataset_path", type=str, default='/Users/Maayan/Documents/databases/mit', help="path to ground truth dataset")
    parser.add_argument("--test_dataset_path", type=str, default='/Users/Maayan/Documents/databases/test/frames_256', help="path to test_dataset")
    parser.add_argument("--num_workers_dataloader", type=int, default=0, help="num workers for dataloader")
    parser.add_argument("--sample_num", type=int, default=20, help="number of images to random sample from each video")
    opt = parser.parse_args()
    print(opt)
    main(opt)

# --- File: handlers/changing_stickerpack_handl.py (bbt-t/bot-pet-project, Apache-2.0) ---

from aiogram.dispatcher import FSMContext
from aiogram.types import CallbackQuery
from handlers.states_in_handlers import UserSettingStates
from loader import dp
from utils.keyboards.start_handl_choice_kb import get_start_keyboard

@dp.callback_query_handler(text='set_skin', state=UserSettingStates.settings)
async def choose_a_sticker_pack(call: CallbackQuery, state: FSMContext) -> None:
    async with state.proxy() as data:
        if data.get('lang') == 'ru':
            await call.message.answer(
                "На какой меняем?",  # Russian for "Which one are we changing to?"
                reply_markup=await get_start_keyboard(is_choice_skin=True)
            )
        else:
            await call.message.answer(
                "What are we changing to?",
                reply_markup=await get_start_keyboard(is_choice_skin=True, lang='en')
            )
    await state.finish()

# --- File: objectModel/Python/tests/storage/test_github.py (rt112000/CDM, CC-BY-4.0 / MIT) ---

# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
import json
import unittest
import unittest.mock as mock
import random
from tests.common import async_test
from cdm.storage.github import GithubAdapter
class GithubStorageAdapterTestCase(unittest.TestCase):
    def test_make_corpus_path(self):
        adapter = GithubAdapter()
        adapter.timeout = 2000
        adapter.maximum_timeout = 5000
        adapter.number_of_retries = 0

        # Valid path.
        self.assertEqual(adapter.create_corpus_path(
            'https://raw.githubusercontent.com/Microsoft/CDM/master/schemaDocuments/dir1/dir2/file.json'), '/dir1/dir2/file.json')

        # Invalid path.
        self.assertIsNone(adapter.create_corpus_path('https://raw.githubusercontent.com/Microsoft/CDM/master/schemaDocument/dir1/dir2/file.json'))

    @mock.patch('cdm.utilities.network.cdm_http_client.urllib.request.urlopen', new_callable=mock.mock_open, read_data=json.dumps({'Ḽơᶉëᶆ': 'ȋṕšᶙṁ'}).encode())
    @async_test
    async def test_read(self, mock_urlopen):
        adapter = GithubAdapter()
        adapter.timeout = 2000
        adapter.maximum_timeout = 5000
        raw_data = await adapter.read_async('/dir1/dir2/file.json')
        data = json.loads(raw_data)

        # Verify URL.
        self.assertEqual(mock_urlopen.call_args[0][0].full_url, 'https://raw.githubusercontent.com/Microsoft/CDM/master/schemaDocuments/dir1/dir2/file.json')

        self.assertEqual(data, {'Ḽơᶉëᶆ': 'ȋṕšᶙṁ'})  # Verify data.


if __name__ == '__main__':
    unittest.main()
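The `mock.patch(..., new_callable=mock.mock_open, read_data=...)` trick above works because extra keyword arguments to `patch` are forwarded to the replacement factory, so `urlopen` is swapped for a `mock_open` object whose handle returns the canned bytes. A minimal standalone version against the standard library's `urllib.request.urlopen` (the URL is a deliberately unreachable placeholder):

```python
import json
import unittest.mock as mock
import urllib.request

canned = json.dumps({'key': 'value'}).encode()

with mock.patch('urllib.request.urlopen',
                new_callable=mock.mock_open, read_data=canned) as mock_urlopen:
    # No network traffic happens: urlopen() now returns a mock handle
    # whose .read() yields `canned`.
    data = json.loads(urllib.request.urlopen('https://example.invalid/x.json').read())

requested_url = mock_urlopen.call_args[0][0]  # first positional arg of the call
```

Inspecting `call_args` afterwards, as the CDM test does with `full_url`, is how you assert on the request without ever making one.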

# --- File: l5q1.py (gonewithharshwinds/itt-lab, MIT) ---

#!/usr/bin/python
def add(a, b):
    return a + b

def sub(a, b):
    return a - b

def mul(a, b):
    return a * b

def div(a, b):
    return a / b

a = int(input("Enter first number: "))
b = int(input("Enter second number: "))

print("Please select operation : \n"
      "1. Addition \n"
      "2. Subtraction \n"
      "3. Multiplication \n"
      "4. Division \n")

select = int(input("Select operations from 1, 2, 3, 4 :"))
if select == 1:
    print(a, "+", b, "=", add(a, b))
elif select == 2:
    print(a, "-", b, "=", sub(a, b))
elif select == 3:
    print(a, "*", b, "=", mul(a, b))
elif select == 4:
    if b == 0:
        # Guard against division by zero.
        print("Cannot divide by zero")
    else:
        print(a, "/", b, "=", div(a, b))
else:
    print("Invalid input")
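The if/elif chain can also be phrased as a dispatch table, which keeps the menu of operations in one place; this refactor is a sketch, not part of the original exercise:

```python
def add(a, b): return a + b
def sub(a, b): return a - b
def mul(a, b): return a * b
def div(a, b): return a / b

# Operation number -> (display symbol, function).
operations = {1: ('+', add), 2: ('-', sub), 3: ('*', mul), 4: ('/', div)}

def calculate(select, a, b):
    if select not in operations:
        return "Invalid input"
    if select == 4 and b == 0:
        return "Cannot divide by zero"
    symbol, func = operations[select]
    return "{} {} {} = {}".format(a, symbol, b, func(a, b))
```

Adding a fifth operation then means one new dict entry instead of another `elif` branch.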

# --- File: Task/Sokoban/Python/sokoban.py (LaudateCorpus1/RosettaCodeData, Info-ZIP) ---
# Note: this file is Python 2 (psyco, xrange, print statements, array type code "c").

from array import array
from collections import deque
import psyco
data = []
nrows = 0
px = py = 0
sdata = ""
ddata = ""
def init(board):
    global data, nrows, sdata, ddata, px, py
    data = filter(None, board.splitlines())
    nrows = max(len(r) for r in data)

    maps = {' ':' ', '.': '.', '@':' ', '#':'#', '$':' '}
    mapd = {' ':' ', '.': ' ', '@':'@', '#':' ', '$':'*'}

    for r, row in enumerate(data):
        for c, ch in enumerate(row):
            sdata += maps[ch]
            ddata += mapd[ch]
            if ch == '@':
                px = c
                py = r

def push(x, y, dx, dy, data):
    if sdata[(y+2*dy) * nrows + x+2*dx] == '#' or \
       data[(y+2*dy) * nrows + x+2*dx] != ' ':
        return None
    data2 = array("c", data)
    data2[y * nrows + x] = ' '
    data2[(y+dy) * nrows + x+dx] = '@'
    data2[(y+2*dy) * nrows + x+2*dx] = '*'
    return data2.tostring()

def is_solved(data):
    for i in xrange(len(data)):
        if (sdata[i] == '.') != (data[i] == '*'):
            return False
    return True

def solve():
    open = deque([(ddata, "", px, py)])
    visited = set([ddata])
    dirs = ((0, -1, 'u', 'U'), ( 1, 0, 'r', 'R'),
            (0, 1, 'd', 'D'), (-1, 0, 'l', 'L'))

    lnrows = nrows
    while open:
        cur, csol, x, y = open.popleft()

        for di in dirs:
            temp = cur
            dx, dy = di[0], di[1]

            if temp[(y+dy) * lnrows + x+dx] == '*':
                temp = push(x, y, dx, dy, temp)
                if temp and temp not in visited:
                    if is_solved(temp):
                        return csol + di[3]
                    open.append((temp, csol + di[3], x+dx, y+dy))
                    visited.add(temp)
            else:
                if sdata[(y+dy) * lnrows + x+dx] == '#' or \
                   temp[(y+dy) * lnrows + x+dx] != ' ':
                    continue
                data2 = array("c", temp)
                data2[y * lnrows + x] = ' '
                data2[(y+dy) * lnrows + x+dx] = '@'
                temp = data2.tostring()
                if temp not in visited:
                    if is_solved(temp):
                        return csol + di[2]
                    open.append((temp, csol + di[2], x+dx, y+dy))
                    visited.add(temp)
    return "No solution"

level = """\
#######
#     #
#     #
#. #  #
#. $$ #
#.$$  #
#.#  @#
#######"""

psyco.full()
init(level)
print level, "\n\n", solve()
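The core of `solve()` is a breadth-first search over board states with a `visited` set, which guarantees the first solution found is a shortest one. The same pattern in isolation (Python 3, generic over any state space; the integer toy state space below is an illustration, not part of the Sokoban code):

```python
from collections import deque

def bfs_shortest(start, goal, neighbors):
    """Generic BFS: return the shortest list of moves from start to goal."""
    queue = deque([(start, [])])
    visited = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        for move, nxt in neighbors(state):
            if nxt not in visited:      # mark on enqueue, exactly as solve() does
                visited.add(nxt)
                queue.append((nxt, path + [move]))
    return None

# Toy state space: integers 0..5, moves are +1 / -1.
def neighbors(n):
    return [(m, n + d) for m, d in (('+1', 1), ('-1', -1)) if 0 <= n + d <= 5]

path = bfs_shortest(0, 3, neighbors)
```

Marking states as visited when they are enqueued (not when they are dequeued) is what keeps the frontier free of duplicates, the same reason `solve()` calls `visited.add(temp)` immediately after `open.append(...)`.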

# --- File: Chapter09/others/convertor.py (PacktPublishing/Machine-Learning-for-Mobile, MIT) ---

import tfcoreml as tf_converter
tf_converter.convert(tf_model_path='retrained_graph.pb',
                     mlmodel_path='converted.mlmodel',
                     output_feature_names=['final_result:0'],
                     image_input_names='input:0',
                     class_labels='retrained_labels.txt',
                     red_bias=-1,
                     green_bias=-1,
                     blue_bias=-1,
                     image_scale=2.0 / 224.0)

# --- File: pythem-master/pythem/modules/fuzzer.py (Marzooq13579/Hack-Gadgets, MIT) ---
# Note: this file is Python 2 (print statements).

#!/usr/bin/env python2.7
# Copyright (c) 2016-2018 Angelo Moura
#
# This file is part of the program pythem
#
# pythem is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307
# USA
import os
import sys
import struct
import resource
import time
from netaddr import IPAddress, AddrFormatError
from subprocess import *
import socket
class SimpleFuzz(object):
    name = "Fuzzer"
    desc = "Used in the xploit module. simple 'A' generation through tcp or stdin"
    version = "0.3"

    def __init__(self, target, type, offset):
        self.offset = offset
        self.target = target
        if type == "test":
            return
        if type == "tcp":
            self.port = input("[+]Enter the tcp port to fuzz: ")
            self.tcpfuzz()
        elif type == "stdin":
            self.stdinfuzz()
        else:
            print "[!] Select a valid fuzzer type (stdin or tcp)."

    def stdinfuzz(self):
        buf = ''
        while True:
            try:
                buf += '\x41' * self.offset
                resource.setrlimit(resource.RLIMIT_STACK, (-1, -1))
                resource.setrlimit(resource.RLIMIT_CORE, (-1, -1))
                P = Popen(self.target, stdin=PIPE)
                print "[*] Sending buffer with length: " + str(len(buf))
                P.stdin.write(buf + '\n')
                line = sys.stdin.readline()
                P.poll()
                ret = P.returncode
                if ret is None:
                    continue
                else:
                    if ret == -4:
                        print "\n[+] Instruction Pointer may be at: {}\n".format(str(len(buf)))
                        break
                    elif ret == -7:
                        print "\n[+] Instruction Pointer may be near: {}\n".format(str(len(buf)))
                        print "[*] Child program crashed with code: %d\n" % ret
                        continue
                    elif ret == -11:
                        print "[*] Child program crashed with SIGSEGV code: %d\n" % ret
                        print "\n[*] Hit enter to continue.\n"
                        continue
                    else:
                        print "[*] Child program exited with code: %d\n" % ret
                        print "\n[*] Hit enter to continue.\n"
                        continue
            except KeyboardInterrupt:
                break

    def tcpfuzz(self):
        buf = ''
        try:
            self.target = str(IPAddress(self.target))
        except AddrFormatError as e:
            try:
                self.target = socket.gethostbyname(self.target)
            except Exception as e:
                print "[-] Select a valid IP Address as target."
                print "[!] Exception caught: {}".format(e)
                return
        buf = '\x41' * self.offset
        print "[+] TCP fuzzing initialized, wait until crash."
        while True:
            try:
                self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                self.socket.settimeout(2)
                self.socket.connect((self.target, self.port))
                print "[+] Fuzzing with [{}] bytes.".format(len(buf))
                try:
                    response = self.socket.recv(1024)
                    print "[*] Response: {}".format(response)
                    self.socket.send(buf)
                    try:
                        response = self.socket.recv(1024)
                        print "[*] Response: {}".format(response)
                        self.socket.close()
                        buf += '\x41' * self.offset
                    except:
                        self.socket.close()
                        buf += '\x41' * self.offset
                except:
                    self.socket.send(buf)
                    try:
                        response = self.socket.recv(1024)
                        print "[*] Response: {}".format(response)
                        self.socket.close()
                        buf += '\x41' * self.offset
                    except:
                        self.socket.close()
                        buf += '\x41' * self.offset
            except KeyboardInterrupt:
                break
            except Exception as e:
                if 'Connection refused' in e:
                    print "[-] Connection refused."
                    time.sleep(4)
                else:
                    try:
                        response = self.socket.recv(1024)
                        print "[*] Response: {}".format(response)
                    except Exception as e:
                        if 'timed out' in e:
                            print "[-] Timed out."
                            time.sleep(2)
                        print "[+] Crash occurred with buffer length: {}".format(str(len(buf)))
                        print "[!] Exception caught: {}".format(e)

# --- File: attention/utils/metadata.py (fbickfordsmith/attention-iclr, MIT) ---

"""
Define metadata variables used throughout the repository.
"""
import numpy as np
import pandas as pd
from scipy.spatial.distance import squareform
from sklearn.metrics.pairwise import cosine_distances
from ..utils.paths import path_metadata, path_representations, path_results
wnids = np.loadtxt(path_metadata/'imagenet_class_wnids.txt', dtype=str)
ind2wnid = {ind:wnid for ind, wnid in enumerate(wnids)}
wnid2ind = {wnid:ind for ind, wnid in enumerate(wnids)}
acc_baseline = pd.read_csv(path_results/'baseline_attn_results.csv')
acc_vgg = pd.read_csv(path_results/'vgg16_results.csv')
mean_acc = np.mean(acc_vgg['accuracy'])
std_acc = np.std(acc_vgg['accuracy'])
distances_represent = cosine_distances(np.load(path_representations))
mean_dist = np.mean(squareform(distances_represent, checks=False))
std_dist = np.std(squareform(distances_represent, checks=False))
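For reference, a pure-Python sketch of what the last three lines compute: `cosine_distances` builds a square pairwise-distance matrix, and `squareform(..., checks=False)` collapses it to one entry per unordered pair of rows before the mean/std are taken. The tiny 2-D vectors here are toy stand-ins for the real representation matrix:

```python
import math

def cosine_distance(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return 1.0 - dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Toy 3x2 "representation matrix".
vectors = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

# Condensed form: one cosine distance per unordered pair (i, j), i < j —
# the same values squareform keeps from the square matrix.
condensed = [cosine_distance(vectors[i], vectors[j])
             for i in range(len(vectors)) for j in range(i + 1, len(vectors))]
mean_dist = sum(condensed) / len(condensed)
```

Using the condensed form avoids counting the zero diagonal (and each pair twice) when summarizing the distances.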

# --- File: Grokking-Algorithms/Greedy/classRoomScheduling.py (javokhirbek1999/AlgorithmsDS, MIT) ---

# The classroom scheduling problem
# Suppose you have a classroom and you want to hold as many classes as possible
# __________________________
#| class | start | end |
#|_______|_________|________|
#| Art | 9:00 am | 10:30am|
#|_______|_________|________|
#| Eng | 9:30am | 10:30am|
#|_______|_________|________|
#| Math  | 10:00am | 11:00am|
#|_______|_________|________|
#| CS    | 10:30am | 11:30am|
#|_______|_________|________|
#| Music | 11:00am | 12:00pm|
#|_______|_________|________|
# Greedy rule: assumes `classes` is a list of (start, end) pairs already
# sorted by end time; keep every class that starts after the last pick ends.
def schedule(classes):
    possible_classes = [classes[0]]
    for i in range(1, len(classes)):
        if possible_classes[-1][1] <= classes[i][0]:
            possible_classes.append(classes[i])
    return possible_classes
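The greedy idea in the comment table can be exercised with a self-contained sketch. Times here are encoded as minutes since midnight (an assumption for illustration), and the class list is pre-sorted by end time, which the greedy rule requires:

```python
# Self-contained sketch of the greedy classroom scheduler.
# Input must be sorted by end time; times are minutes since midnight.
def pick_classes(classes):
    chosen = [classes[0]]
    for start, end in classes[1:]:
        if chosen[-1][1] <= start:  # starts after the last pick ends
            chosen.append((start, end))
    return chosen

# The table above, sorted by end time: Art, Eng, Math, CS, Music.
timetable = [(540, 630), (570, 630), (600, 660), (630, 690), (660, 720)]
print(pick_classes(timetable))  # [(540, 630), (630, 690)] -> Art and CS
```

Sorting by end time first is what makes the single pass optimal: the class that frees the room soonest always leaves the most space for later picks.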
| 27.392857 | 80 | 0.698827 | 85 | 767 | 4.235294 | 0.541176 | 0.25 | 0.122222 | 0.155556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054348 | 0.160365 | 767 | 27 | 81 | 28.407407 | 0.504658 | 0.624511 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.25 | 0.125 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
947bb882a6bca2a6316713846edc3e060262afd0 | 656 | py | Python | test/supporting/test_processor.py | holitics/phenome-extensions | 230d1494d0f0d7a1d747cb8af9413df46d8d0e25 | [
"Apache-2.0"
] | 1 | 2019-11-22T18:15:39.000Z | 2019-11-22T18:15:39.000Z | test/supporting/test_processor.py | holitics/phenome-extensions | 230d1494d0f0d7a1d747cb8af9413df46d8d0e25 | [
"Apache-2.0"
] | 1 | 2020-02-29T03:13:39.000Z | 2020-02-29T03:19:29.000Z | test/supporting/test_processor.py | holitics/phenome-extensions | 230d1494d0f0d7a1d747cb8af9413df46d8d0e25 | [
"Apache-2.0"
] | null | null | null | # test_processor.py, Copyright (c) 2019, Phenome Project - Nicholas Saparoff <nick.saparoff@gmail.com>
from phenome_core.core.base.base_processor import BaseProcessor
class TestProcessor(BaseProcessor):
__test__ = False
def __init__(self):
super(TestProcessor, self).__init__()
    def process(self, results):
        from phenome.test.supporting.test_mockobject import MockObject
        test_value = 45
        obj = MockObject()  # renamed from `object` to avoid shadowing the builtin
        obj.id = 1
        # here we would normally POLL the object
        # populate the value with 45
        results.set_result(obj, 'test_value', test_value)
        return results
| 22.62069 | 102 | 0.685976 | 77 | 656 | 5.584416 | 0.584416 | 0.062791 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018 | 0.237805 | 656 | 28 | 103 | 23.428571 | 0.842 | 0.253049 | 0 | 0 | 0 | 0 | 0.020619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
947d9383137b613729b449880f7118b84411c66a | 344 | py | Python | Pyspark_Sample_ML_programs/simpleapp.py | mihaque313/pyspark_mllib | 8342108384d15b7f2169aff190038e176d7533a9 | [
"Apache-2.0"
] | 7 | 2018-01-31T07:44:13.000Z | 2021-09-17T11:07:08.000Z | Pyspark_Sample_ML_programs/simpleapp.py | mihaque313/pyspark_mllib | 8342108384d15b7f2169aff190038e176d7533a9 | [
"Apache-2.0"
] | 2 | 2017-11-24T05:57:47.000Z | 2021-06-29T10:42:24.000Z | Pyspark_Sample_ML_programs/simpleapp.py | mihaque313/pyspark_mllib | 8342108384d15b7f2169aff190038e176d7533a9 | [
"Apache-2.0"
] | 6 | 2017-11-02T12:28:37.000Z | 2019-07-06T08:03:33.000Z | from pyspark import SparkContext
logFile = "D:/Spark/spark-1.6.1-bin-hadoop2.6/README.md"
sc = SparkContext("local", "Simple App")
logData = sc.textFile(logFile).cache()
numAs = logData.filter(lambda s: 'a' in s).count()
numBs = logData.filter(lambda s: 'b' in s).count()
print("Lines with a: %i, lines with b: %i" % (numAs, numBs))
sc.stop()  # release the SparkContext when done
9482473a1126c6399fbe546129eda6043f695cbf | 490 | py | Python | scg-scrape.py | josteinstraume/python-capstone | a7cf2e3d6e0cf83686b999f9877d06446c1176af | [
"BSD-3-Clause"
] | 1 | 2017-09-01T22:54:00.000Z | 2017-09-01T22:54:00.000Z | scg-scrape.py | josteinstraume/python-capstone | a7cf2e3d6e0cf83686b999f9877d06446c1176af | [
"BSD-3-Clause"
] | null | null | null | scg-scrape.py | josteinstraume/python-capstone | a7cf2e3d6e0cf83686b999f9877d06446c1176af | [
"BSD-3-Clause"
] | null | null | null | import urllib, csv, numpy
from BeautifulSoup import *
url = raw_input('Enter URL to crawl: ')
if len(url) < 1:
url = 'http://sales.starcitygames.com//deckdatabase/deckshow.php?&t%5BC1%5D=3&start_num=0&start_num=0&limit=limit'
html = urllib.urlopen(url).read()
soup = BeautifulSoup(html)
# Retrieve all of the anchor tags
tags = soup('a')
for tag in tags:
print 'TAG:', tag
print 'URL:', tag.get('href', None)
print 'Contents:', tag.contents[0]
print 'Attrs:', tag.attrs
| 27.222222 | 115 | 0.685714 | 77 | 490 | 4.324675 | 0.649351 | 0.048048 | 0.054054 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019277 | 0.153061 | 490 | 17 | 116 | 28.823529 | 0.783133 | 0.063265 | 0 | 0 | 0 | 0.076923 | 0.33698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.153846 | null | null | 0.307692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
948d415b98380c188931709b7710a872ebb746cb | 263 | py | Python | client.py | mguidon/socket-logger | ff74a7974a2965794a4b90068b9ecb4aedf95346 | [
"MIT"
] | null | null | null | client.py | mguidon/socket-logger | ff74a7974a2965794a4b90068b9ecb4aedf95346 | [
"MIT"
] | null | null | null | client.py | mguidon/socket-logger | ff74a7974a2965794a4b90068b9ecb4aedf95346 | [
"MIT"
] | null | null | null | import socketio
sio = socketio.Client()
@sio.event
def connect():
print('connection established')
@sio.event
def log(data):
print(data)
@sio.event
def disconnect():
print('disconnected from server')
sio.connect('http://localhost:8080')
sio.wait() | 14.611111 | 37 | 0.69962 | 34 | 263 | 5.411765 | 0.588235 | 0.130435 | 0.179348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017778 | 0.144487 | 263 | 18 | 38 | 14.611111 | 0.8 | 0 | 0 | 0.230769 | 0 | 0 | 0.253788 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.076923 | 0 | 0.307692 | 0.230769 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84699eaf44ad2cc9577c8cecad8b2d9e9e2a0a25 | 355 | py | Python | example.py | badmutex/CoPipes | b329a381f0dbaef238b9e36c7103fd72bee80138 | [
"BSD-2-Clause"
] | 1 | 2020-11-27T23:18:58.000Z | 2020-11-27T23:18:58.000Z | example.py | badi/CoPipes | b329a381f0dbaef238b9e36c7103fd72bee80138 | [
"BSD-2-Clause"
] | null | null | null | example.py | badi/CoPipes | b329a381f0dbaef238b9e36c7103fd72bee80138 | [
"BSD-2-Clause"
] | null | null | null | from copipes import coroutine, pipeline, null
from copipes.macros.pipe import pipe
@pipe
def putStrLn():
"""doc"""
[x]
print x
send(x)
@pipe
def replicate(n):
[x]
for i in xrange(n):
send(x)
if __name__ == '__main__':
pipeline(
putStrLn,
replicate.params(3),
putStrLn,
).feed(range(10))
| 14.791667 | 45 | 0.574648 | 45 | 355 | 4.355556 | 0.622222 | 0.112245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011952 | 0.292958 | 355 | 23 | 46 | 15.434783 | 0.768924 | 0 | 0 | 0.444444 | 0 | 0 | 0.023121 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
846eee9785f4109a7b7f8c3840b04ea8fc5f8ad8 | 8,187 | py | Python | sliceMaker.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 82 | 2015-01-23T04:20:31.000Z | 2022-02-18T22:33:53.000Z | sliceMaker.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 2 | 2015-03-27T22:24:46.000Z | 2017-02-20T08:19:12.000Z | sliceMaker.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 7 | 2015-06-04T20:37:02.000Z | 2021-03-10T02:41:08.000Z | #!/usr/bin/env python
# Copyright (c) 02004, The Long Now Foundation
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from __future__ import absolute_import, division, with_statement
import sys
import string
import math
import csv
import gd
__author__ = "John Tangney and Dan Mosedale"
__maintainer__ = "Ben Keating"
__email__ = "oss+longview@longnow.org"
__version__ = "1.1"
__copyright__ = "Copyright (c) 02004 The Long Now Foundation"
__license__ = "BSD-style"
__status__ = "Beta"
class SlicedImage:
def __init__(self, width, height):
self.__image = gd.image((width, height))
self.__xOffset = 0
self.__height = height
self.diamondWidth = 7
"""Width of any diamonds rendered."""
self.diamondLeftMargin = 2
"""Width of horizontal margin on the left side a diamond"""
self.diamondVerticalMargin = 0
"""Height of vertical margin above and below a diamond"""
self.diamondColor = (0xff, 0xff, 0xff)
"""Color used to render diamonds."""
return
def addSlice(self, fields):
"""Add a slice to this image"""
# Returns a tuple of adjusted r, g, b integers given tuple of integers.
def computeRgbValues(rgb, saturation, brightness):
return map((lambda x:
adjustColorComponent(x, saturation, brightness)), rgb)
# Returns an integer value for the given component, adjusted for
# brightness and saturation.
def adjustColorComponent(component, saturation, brightness):
answer = component * saturation + brightness * 255.0 * \
(1 - saturation)
return int(round(answer))
width = fields["width"]
height = fields["height"]
saturation = fields["saturation"]
brightness = fields["brightness"]
upperHeight = int(round(height - (height * fields["lowerSize"])))
tl = (self.__xOffset, upperHeight + 1)
br = (self.__xOffset + width, height)
lowerColor = self.__image.colorAllocate(
computeRgbValues(fields["lowerColor"], saturation, brightness))
self.__image.filledRectangle(tl, br, lowerColor)
tl = (self.__xOffset, upperHeight)
br = (self.__xOffset + width, upperHeight + 1)
dividerColor = self.__image.colorAllocate(
computeRgbValues(fields["dividerColor"], saturation, brightness))
self.__image.filledRectangle(tl, br, dividerColor)
tl = (self.__xOffset, 0)
br = (self.__xOffset + width, upperHeight)
upperColor = self.__image.colorAllocate(
computeRgbValues(fields["upperColor"], saturation, brightness))
self.__image.filledRectangle(tl, br, upperColor)
self.__xOffset += width
return
def drawDiamond(self, startX):
"""Draw a diamond whose leftmost point is at startX."""
# allocate diamond color, if we haven't already
if not hasattr(self, '__diamondColorIndex'):
self.__diamondColorIndex = self.__image.colorAllocate(
self.diamondColor)
# calculate the points
bottomPoint = (startX + int(math.ceil(self.diamondWidth/2)),
self.__height-1 - self.diamondVerticalMargin)
rightPoint = (startX + self.diamondWidth-1, int((self.__height-1)/2))
topPoint = (startX + int(math.ceil(self.diamondWidth/2)),
self.diamondVerticalMargin)
leftPoint = (startX + self.diamondLeftMargin, int(self.__height/2))
# draw the diamond
diamond = (bottomPoint, rightPoint, topPoint, leftPoint)
self.__image.filledPolygon(diamond, self.__diamondColorIndex)
return
def drawFilledSlice(self, startX, width, color):
"""Draw a filled slice
startX -- X coordinate where slice should start
width -- width in pixels
color -- color tuple
"""
colorIndex = self.__image.colorAllocate(color)
self.__image.filledRectangle( (startX, 0),
(startX + width - 1, self.__height),
colorIndex )
return
def generate(self, filename):
"""Write out a PNG file for this image"""
# write out the file
file = open(filename, "w")
self.__image.writePng(file)
file.close()
return
# Returns a tuple of red, green, and blue integer values, given a hex string
def parseHexColor(hex):
justHex = hex.split('#')[-1]
return map((lambda x: string.atoi(x, 16)), (justHex[0:2], justHex[2:4], justHex[4:6]))
def calculateTotalSize(inputFileName):
width = 0
height = 0
reader = csv.reader(file(inputFileName))
for row in reader:
fields = mapInputFields(row)
width += fields["width"]
if fields["height"] > height:
height = fields["height"]
return int(width), int(height)
# Given a list of input values, return a dictionary of same. This allows us to
# deal with column positions in one place only.
def mapInputFields(row):
fields = ["height", "width", "lowerSize", "lowerColor", "dividerColor", "upperColor", "saturation", "brightness"]
numericFieldNos = [0, 1, 2, 6, 7]
percentFieldNos = [2, 6, 7]
hexFieldNos = [3, 4, 5]
i = 0
answer = {}
for field in fields:
value = row[i]
if i in numericFieldNos:
value = string.atof(value)
if i in percentFieldNos:
value = value / 100.0
else:
if i in hexFieldNos:
value = parseHexColor(value)
answer[field] = value
i = i + 1
return answer
def makeSlices(width, height, inputFileName, outputFileName):
slices = SlicedImage(width, height);
reader = csv.reader(file(inputFileName))
for row in reader:
fields = mapInputFields(row)
slices.addSlice(fields)
slices.generate(outputFileName)
return
# We write an image per csv file.
def generateImage(inFile, outFile):
print "Reading from %s and writing to %s" % (inFile, outFile)
width, height = calculateTotalSize(inFile)
slices = makeSlices(width, height, inFile, outFile)
if __name__ == "__main__":
if len(sys.argv) == 1:
print "Usage: %s csvfile [pngfile]" % sys.argv[0]
print " If pngfile is not specified, its name is derived from csvfile"
else:
inFile = sys.argv[1]
if len(sys.argv) == 2:
outFile = inFile.split(".")[0] + ".png"
else:
outFile = sys.argv[2]
generateImage(inFile, outFile)
| 34.544304 | 117 | 0.623183 | 907 | 8,187 | 5.512679 | 0.339581 | 0.0216 | 0.022 | 0.0108 | 0.1604 | 0.1128 | 0.1128 | 0.07 | 0.0548 | 0.0548 | 0 | 0.011956 | 0.284842 | 8,187 | 236 | 118 | 34.690678 | 0.842015 | 0.224136 | 0 | 0.128788 | 0 | 0 | 0.07554 | 0.004111 | 0 | 0 | 0.002056 | 0 | 0 | 0 | null | null | 0 | 0.045455 | null | null | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
8479d2ec291f95cbf616deccc6b6e4ba086c6d91 | 973 | py | Python | generation_rs/process_output.py | microsoft/MRS | 1213bf21b1f76d9822e0426dfd81222112a40e05 | [
"MIT"
] | 2 | 2021-08-11T18:00:33.000Z | 2021-09-04T09:38:05.000Z | generation_rs/process_output.py | microsoft/MRS | 1213bf21b1f76d9822e0426dfd81222112a40e05 | [
"MIT"
] | null | null | null | generation_rs/process_output.py | microsoft/MRS | 1213bf21b1f76d9822e0426dfd81222112a40e05 | [
"MIT"
] | 1 | 2021-09-13T12:29:10.000Z | 2021-09-13T12:29:10.000Z | """Convert the output format of fairseq.generate to the input format of the evaluation script."""
from argparse import ArgumentParser
from collections import defaultdict
def main():
parser = ArgumentParser()
parser.add_argument('src', help='path to source')
parser.add_argument('tgt', help='path to target')
parser.add_argument('file', help='path to fairseq generate output')
args = parser.parse_args()
with open(args.src, 'r') as f:
src = [line.strip() for line in f]
with open(args.tgt, 'r') as f:
tgt = [line.strip() for line in f]
hyp = defaultdict(list)
with open(args.file, 'r') as f:
for line in f:
if line.startswith('H-'):
idx, _, text = line.split('\t')
hyp[int(idx[2:])].append(text.strip())
for i, k in enumerate(sorted(hyp.keys())):
fields = [src[i], tgt[i]] + hyp[k]
print('\t'.join(fields))
if __name__ == '__main__':
main()
| 32.433333 | 97 | 0.605344 | 138 | 973 | 4.173913 | 0.442029 | 0.046875 | 0.088542 | 0.052083 | 0.065972 | 0.065972 | 0 | 0 | 0 | 0 | 0 | 0.001361 | 0.244604 | 973 | 29 | 98 | 33.551724 | 0.782313 | 0.093525 | 0 | 0 | 1 | 0 | 0.098174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.086957 | 0 | 0.130435 | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
847c54faa0fb9230a80f8d346585c5ab5678996e | 2,118 | py | Python | home/views.py | pantsocksboots/rackandstack | 9681da91fbaac08a83b008e4632c9ba876f0f5c5 | [
"MIT"
] | null | null | null | home/views.py | pantsocksboots/rackandstack | 9681da91fbaac08a83b008e4632c9ba876f0f5c5 | [
"MIT"
] | 5 | 2022-02-05T00:10:27.000Z | 2022-03-23T19:45:38.000Z | home/views.py | pantsocksboots/rackandstack | 9681da91fbaac08a83b008e4632c9ba876f0f5c5 | [
"MIT"
] | null | null | null | from django.http import HttpResponseRedirect
from django.shortcuts import get_object_or_404
from django.template.response import TemplateResponse
from rubric.models import Evolution
from stucon.models import Student
from gradebook.models import ObjectiveScore, TraitScore
from django.contrib.auth.forms import PasswordChangeForm
from django.contrib.auth.models import User
from django.contrib import messages
from django.urls import reverse
def home_view(request):
evos = Evolution.objects.all().count()
students = Student.objects.all().count()
scores = ObjectiveScore.objects.all().count() + TraitScore.objects.all().count()
user_is_student = request.user.groups.filter(name="student").exists()
user_is_grader = request.user.groups.filter(name="grader").exists()
user_is_course_admin = request.user.groups.filter(name="course_admin").exists()
return TemplateResponse(
request,
"home/main.html",
{
"num_evos": evos,
"num_students": students,
"num_scores": scores,
"user_is_student": user_is_student,
"user_is_grader": user_is_grader,
"user_is_course_admin": user_is_course_admin,
},
)
def unauthorized_view(request):
return TemplateResponse(
request,
"home/unauthorized.html",
{},
)
def reset_password(request):
if request.method == "GET":
pw_change_form = PasswordChangeForm(user=request.user)
return TemplateResponse(request, "home/password_reset.html",{
"pw_change_form": pw_change_form,
})
if request.method == "POST":
pw_change_form = PasswordChangeForm(user=request.user, data=request.POST)
if pw_change_form.is_valid():
pw_change_form.save()
messages.add_message(request, messages.SUCCESS, "Successfully changed password.")
else:
print(pw_change_form.error_messages)
messages.add_message(request, messages.ERROR, "Error: Password not changed successfully.")
return HttpResponseRedirect(reverse("home:home_page")) | 37.157895 | 102 | 0.695467 | 244 | 2,118 | 5.827869 | 0.29918 | 0.037975 | 0.059072 | 0.048523 | 0.206048 | 0.063291 | 0.063291 | 0 | 0 | 0 | 0 | 0.001777 | 0.203022 | 2,118 | 57 | 103 | 37.157895 | 0.84064 | 0 | 0 | 0.08 | 0 | 0 | 0.127419 | 0.021708 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06 | false | 0.14 | 0.2 | 0.02 | 0.34 | 0.02 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
848440f9712c0a01ee428394d7d4cc5695b4f759 | 1,855 | py | Python | assignment/pubsub.py | nilaysaha/multitasking | a87eb24c81e7a784f278eff8209e9356c8da9754 | [
"Apache-2.0"
] | null | null | null | assignment/pubsub.py | nilaysaha/multitasking | a87eb24c81e7a784f278eff8209e9356c8da9754 | [
"Apache-2.0"
] | null | null | null | assignment/pubsub.py | nilaysaha/multitasking | a87eb24c81e7a784f278eff8209e9356c8da9754 | [
"Apache-2.0"
] | null | null | null | #!/bin/python3
import os, amqp
USERNAME=os.environ['MQTT_USERNAME']
PASSWORD=os.environ['MQTT_PASSWORD']
AMQP_URL=f"amqp://{MQTT_USERNAME}:{MQTT_PASSWORD}@finch.rmq.cloudamqp.com/wwarpzsg"
class PubSub:
def __init__(self, url=AMQP_URL):
this.connection = amqp.Connection(url)
this.channels = {}
def _close(self):
this.connection.close()
def open_channel(id):
try:
ch = this.connection.channel(id)
this.channels[id] = ch
this._create_auth(id)
except:
print(f"Error in opening channel with id:{id} ")
def _create_auth(self, id):
ch = this.channels[id]
tkt = ch.access_request('/data', active=True, write=True, read=True)
ch.exchange_declare('myfan', 'fanout', auto_delete=True, ticket=tkt)
def create_exchange(self,id, exchange_name):
ch = this.channels[id]
ch.exchange_declare(exchange_name, type='fanout')
def publish_message(self, exchange_name, channel_id, sample_message):
msg = amqp.Message(sample_message)
ch = this.channels[id]
ch.publish_message(sample_message, exchange_name)
def queue(self, id, exchange_name, queue_name):
ch = this.channels[id]
ch.queue_declare(queue_name)
ch.queue_bind(queue_name, exchange_name)
def rcv_msg(self, id, queue_name):
ch = this.channels[id]
msg = ch.basic_get(queue_name)
if msg is not None:
# do something
ch.basic_ack(msg.delivery_tag)
def wait_for_msg(self, id, queue_name):
ch = this.channels[id]
def mycallback(msg):
print 'received', msg.body, 'from channel #', msg.channel.channel_id
ch.basic_consume(queue_name, callback=mycallback, no_ack=True)
while True:
ch.wait()
| 31.440678 | 83 | 0.632884 | 244 | 1,855 | 4.602459 | 0.340164 | 0.085485 | 0.087266 | 0.085485 | 0.120214 | 0.104185 | 0.060552 | 0.060552 | 0.060552 | 0 | 0 | 0.000719 | 0.250674 | 1,855 | 58 | 84 | 31.982759 | 0.807194 | 0.014016 | 0 | 0.136364 | 0 | 0 | 0.097975 | 0.038862 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.045455 | 0.022727 | null | null | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
8487557b661ba93e451a0dbc3862a8a6a3503d2d | 578 | py | Python | assembler/arguments.py | ITD27M01/hack-assembler | 58f691adf979106209b407be08c7d31b59088957 | [
"Apache-2.0"
] | 1 | 2021-01-18T07:31:39.000Z | 2021-01-18T07:31:39.000Z | assembler/arguments.py | ITD27M01/hack-assembler | 58f691adf979106209b407be08c7d31b59088957 | [
"Apache-2.0"
] | null | null | null | assembler/arguments.py | ITD27M01/hack-assembler | 58f691adf979106209b407be08c7d31b59088957 | [
"Apache-2.0"
] | null | null | null | import argparse
def args_parser():
parser = argparse.ArgumentParser(description='Generates hack machine binary code from symbolic form.')
parser.add_argument('file', type=str, action='store',
help='File path with assembly code')
parser.add_argument('--debug', default=False, action='store_true',
help='debug process of assembler')
parser.add_argument('--dry-run', default=False, action='store_true',
help='Dry run assemble process. Useful with --debug')
return parser.parse_args()
| 36.125 | 106 | 0.645329 | 67 | 578 | 5.462687 | 0.58209 | 0.07377 | 0.139344 | 0.125683 | 0.169399 | 0.169399 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238754 | 578 | 15 | 107 | 38.533333 | 0.831818 | 0 | 0 | 0 | 1 | 0 | 0.342561 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84888b176eb03a9243b2cb5e2358400c50f22984 | 479 | py | Python | pyGLLib/light.py | juanmcasillas/pyGLLib | eb91e5f50ed2e05b7822de8a7125e57eafa91eca | [
"Apache-2.0"
] | null | null | null | pyGLLib/light.py | juanmcasillas/pyGLLib | eb91e5f50ed2e05b7822de8a7125e57eafa91eca | [
"Apache-2.0"
] | null | null | null | pyGLLib/light.py | juanmcasillas/pyGLLib | eb91e5f50ed2e05b7822de8a7125e57eafa91eca | [
"Apache-2.0"
] | null | null | null |
# ///////////////////////////////////////////////////////////////////////////
#
#
#
# ///////////////////////////////////////////////////////////////////////////
class GLLight:
def __init__(self, pos=(0.0,0.0,0.0), color=(1.0,1.0,1.0)):
self.pos = pos
self.color = color
self.ambient = (1.0, 1.0, 1.0)
self.diffuse = (1.0, 1.0, 1.0)
# attenuation
self.constant = 1.0
self.linear = 0.09
self.quadratic = 0.032
| 26.611111 | 77 | 0.334029 | 54 | 479 | 2.888889 | 0.314815 | 0.128205 | 0.115385 | 0.153846 | 0.205128 | 0.166667 | 0.128205 | 0 | 0 | 0 | 0 | 0.089431 | 0.229645 | 479 | 17 | 78 | 28.176471 | 0.333333 | 0.340292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
848bcf190a3b0c3ad5ed94d7892b725d609ef60b | 1,105 | py | Python | map/forms.py | mpbrown/buffalo-community-engagement-map | 04cba068de9a22db4fe8708d1bd3c10d5a6adfbd | [
"MIT"
] | 1 | 2021-04-11T16:40:31.000Z | 2021-04-11T16:40:31.000Z | map/forms.py | mpbrown/buffalo-community-engagement-map | 04cba068de9a22db4fe8708d1bd3c10d5a6adfbd | [
"MIT"
] | null | null | null | map/forms.py | mpbrown/buffalo-community-engagement-map | 04cba068de9a22db4fe8708d1bd3c10d5a6adfbd | [
"MIT"
] | null | null | null | from django import forms
class EmailOrganizationForm(forms.Form):
organization_name = forms.CharField(label="Organization you'd be most interested in working with")
why_interesting = forms.CharField(
label="Write one sentence about why this organization is interesting. How does it/could it connect to your life?",
widget=forms.Textarea(attrs={'rows': '3'}))
services = forms.CharField(
label="Review the programs and services offered by this organization. Which ones could you contribute to?",
widget=forms.Textarea(attrs={'rows': '3'}))
contribute = forms.CharField(
label="Write one sentence about how you believe you can contribute to the organization.",
widget=forms.Textarea(attrs={'rows': '3'}))
staff_member_name = forms.CharField(label="Staff member name")
staff_member_title = forms.CharField(label="Staff member title")
staff_member_email = forms.EmailField(label="Staff member email")
student_name = forms.CharField(label="Your name")
student_email = forms.EmailField(label="Your UB email address")
| 40.925926 | 123 | 0.721267 | 142 | 1,105 | 5.542254 | 0.422535 | 0.124524 | 0.168996 | 0.087675 | 0.288437 | 0.212198 | 0.101652 | 0 | 0 | 0 | 0 | 0.003297 | 0.176471 | 1,105 | 26 | 124 | 42.5 | 0.861538 | 0 | 0 | 0.176471 | 0 | 0.058824 | 0.39276 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
848d88b5383b8a146bb61bbf80914cbd12720899 | 1,163 | py | Python | demo.py | itsrobli/depreciation-rate-classifier | ab1ef346d577a3ee64d200ee41e097bba12f6b14 | [
"BSD-3-Clause"
] | null | null | null | demo.py | itsrobli/depreciation-rate-classifier | ab1ef346d577a3ee64d200ee41e097bba12f6b14 | [
"BSD-3-Clause"
] | null | null | null | demo.py | itsrobli/depreciation-rate-classifier | ab1ef346d577a3ee64d200ee41e097bba12f6b14 | [
"BSD-3-Clause"
] | null | null | null | # 25 February 2019 - Robert Li <robertwli@gmail.com>
import sys
from src.text_classifier_deprn_rates import DeprnPredictor
predict = DeprnPredictor()
print('Evaluate using user input.\n')
user_description = ['']
print('\"QQ\" to quit.')
print('\"CR\" to see classification report.')
print('Otherwise...')
while True:
user_description = input('Enter a depreciable asset description: \n')
if user_description == 'QQ':
print('====================GOODBYE====================\n')
sys.exit()
elif user_description == 'CR':
predict.report_results()
else:
result, predicted_account = predict.predict_description(user_description)
rate_perc = str(result.rate_perc_text) + '% prime cost'
life = str(result.life_years) + ' years effective life'
tax_cat = result.tax_cat
print(f'Input from user:\n\t {user_description}')
print(f'Result:')
print(f'\taccount: \t\t\t{predicted_account}')
print(f'\tdeprn rate: \t\t{rate_perc}')
print(f'\teffective life: \t{life}')
print(f'\ttax category: \t\t{tax_cat}')
print('END of Result')
print()
| 33.228571 | 81 | 0.625107 | 145 | 1,163 | 4.868966 | 0.462069 | 0.127479 | 0.056657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006486 | 0.204643 | 1,163 | 34 | 82 | 34.205882 | 0.756757 | 0.042992 | 0 | 0 | 0 | 0 | 0.357336 | 0.066607 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.464286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
849193606996f031236d222d383c8ca454a60778 | 796 | py | Python | django_slack/templatetags/django_slack.py | lociii/django-slack | 409521f427a68d12ad7e7ac3e19d517d2ba131ba | [
"BSD-3-Clause"
] | 237 | 2015-01-24T01:03:13.000Z | 2022-03-31T16:38:06.000Z | django_slack/templatetags/django_slack.py | lociii/django-slack | 409521f427a68d12ad7e7ac3e19d517d2ba131ba | [
"BSD-3-Clause"
] | 106 | 2015-02-25T12:09:02.000Z | 2022-03-31T09:57:31.000Z | django_slack/templatetags/django_slack.py | lociii/django-slack | 409521f427a68d12ad7e7ac3e19d517d2ba131ba | [
"BSD-3-Clause"
] | 83 | 2015-02-11T15:20:19.000Z | 2022-03-30T01:58:16.000Z | from django import template
from django.utils.encoding import force_str
from django.utils.functional import keep_lazy
from django.utils.safestring import SafeText, mark_safe
from django.template.defaultfilters import stringfilter
register = template.Library()
_slack_escapes = {
ord('&'): u'&',
ord('<'): u'<',
ord('>'): u'>',
}
@keep_lazy(str, SafeText)
@register.filter(is_safe=True)
@stringfilter
def escapeslack(value):
"""
Returns the given text with ampersands and angle brackets encoded for use in
the Slack API, per the Slack API documentation:
<https://api.slack.com/docs/formatting#how_to_escape_characters>
This is based on django.template.defaultfilters.escapejs.
"""
return mark_safe(force_str(value).translate(_slack_escapes))
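The translate-table trick used by `escapeslack` can be seen in isolation with plain Python (no Django required); the mapping below mirrors the module's `_slack_escapes` table:

```python
# Standalone illustration of the Slack escaping rule: only &, <, > are
# replaced, per the Slack formatting docs, using str.translate with a
# codepoint-to-replacement mapping.
slack_escapes = {
    ord('&'): '&amp;',
    ord('<'): '&lt;',
    ord('>'): '&gt;',
}
print("<@U123> & friends".translate(slack_escapes))  # &lt;@U123&gt; &amp; friends
```

`str.translate` maps Unicode codepoints to replacement strings in a single pass, which is why the table keys are `ord(...)` integers rather than one-character strings.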
| 28.428571 | 80 | 0.73995 | 107 | 796 | 5.373832 | 0.579439 | 0.086957 | 0.078261 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145729 | 796 | 27 | 81 | 29.481481 | 0.845588 | 0.311558 | 0 | 0 | 0 | 0 | 0.030769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.3125 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
84977020945cbac5f5269bef5d81627f58f39aa2 | 293 | py | Python | 2/solution-a.py | gmodena/adventofcode2017 | 6fb72b2220f38e5f65b04f491c8787f1b999ba60 | [
"Unlicense"
] | 1 | 2017-12-08T21:36:50.000Z | 2017-12-08T21:36:50.000Z | 2/solution-a.py | gmodena/adventofcode2017 | 6fb72b2220f38e5f65b04f491c8787f1b999ba60 | [
"Unlicense"
] | null | null | null | 2/solution-a.py | gmodena/adventofcode2017 | 6fb72b2220f38e5f65b04f491c8787f1b999ba60 | [
"Unlicense"
] | null | null | null | if __name__ == '__main__':
checksum = 0
while True:
try:
numbers = input()
except EOFError:
break
numbers = map(int, numbers.split('\t'))
numbers = sorted(numbers)
checksum += numbers[-1] - numbers[0]
print(checksum)
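The same checksum can be computed from rows supplied in memory rather than stdin; the sample rows below are the AoC 2017 day 2 example:

```python
# Same max-minus-min checksum, computed from an in-memory list of rows.
rows = ["5\t1\t9\t5", "7\t5\t3", "2\t4\t6\t8"]
checksum = sum(
    max(ns) - min(ns)
    for ns in (list(map(int, row.split('\t'))) for row in rows)
)
print(checksum)  # 18  (8 + 4 + 6)
```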
| 24.416667 | 47 | 0.522184 | 29 | 293 | 5 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015957 | 0.358362 | 293 | 11 | 48 | 26.636364 | 0.755319 | 0 | 0 | 0 | 0 | 0 | 0.03413 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84ab6e8a9d52009482f08103bea85204c90f5e58 | 383 | py | Python | cp_validator/__init__.py | klashxx/cp_validator | 17dcb0a5b999190add1c6139101694e514d43676 | [
"MIT"
] | null | null | null | cp_validator/__init__.py | klashxx/cp_validator | 17dcb0a5b999190add1c6139101694e514d43676 | [
"MIT"
] | null | null | null | cp_validator/__init__.py | klashxx/cp_validator | 17dcb0a5b999190add1c6139101694e514d43676 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import os
import sys
__author__ = 'Juan Diego Godoy Robles'
__version__ = '0.1'
__ppath__ = os.path.dirname(os.path.realpath(__file__))
if __ppath__ not in sys.path:
sys.path.append(os.path.dirname(__ppath__))
from flask import Flask
app = Flask(__name__)
from cp_validator import extractor
postals = extractor.get_postal()
import cp_validator.views
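The `sys.path` juggling above is a common idiom for making a package's parent directory importable; sketched in isolation (the literal path here is illustrative — the real module derives it from `__file__`):

```python
import os
import sys

# In the real module this is os.path.dirname(os.path.realpath(__file__)).
pkg_dir = os.path.realpath("cp_validator")
parent = os.path.dirname(pkg_dir)
# Append the parent directory exactly once, guarding against duplicates.
if parent not in sys.path:
    sys.path.append(parent)
print(parent in sys.path)  # True
```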
| 20.157895 | 55 | 0.751958 | 56 | 383 | 4.589286 | 0.607143 | 0.070039 | 0.101167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009063 | 0.13577 | 383 | 18 | 56 | 21.277778 | 0.767372 | 0.05483 | 0 | 0 | 0 | 0 | 0.072222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.416667 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
84adba39a6a4c0e1a2f47db291c44ae8499e12cc | 513 | py | Python | common/helpers/client_helpers.py | srrokib/aws-cloudformation-resource-providers-frauddetector | 08c049a57e4f9d54666ef7bc250d878853be3cd4 | [
"Apache-2.0"
] | 4 | 2021-05-24T05:35:05.000Z | 2021-11-08T09:43:48.000Z | common/helpers/client_helpers.py | srrokib/aws-cloudformation-resource-providers-frauddetector | 08c049a57e4f9d54666ef7bc250d878853be3cd4 | [
"Apache-2.0"
] | 3 | 2021-04-29T20:30:17.000Z | 2021-05-14T16:28:19.000Z | common/helpers/client_helpers.py | srrokib/aws-cloudformation-resource-providers-frauddetector | 08c049a57e4f9d54666ef7bc250d878853be3cd4 | [
"Apache-2.0"
] | 3 | 2021-04-07T16:03:03.000Z | 2021-10-30T03:25:33.000Z | from cloudformation_cli_python_lib import (
SessionProxy,
exceptions,
)
# Use this global and use `afd_client = get_singleton_afd_client(session)` for singleton
afd_client = None
def get_singleton_afd_client(session):
global afd_client
if afd_client is not None:
return afd_client
if isinstance(session, SessionProxy):
afd_client = session.client("frauddetector")
return afd_client
raise exceptions.InternalFailure("Error: failed to get frauddetector client.")
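The lazy module-level singleton can be exercised without AWS at all; this sketch uses a stand-in session class (names here are illustrative, not part of the real bundle):

```python
# Minimal reproduction of the lazy module-level singleton pattern.
class FakeSession:
    """Stand-in for SessionProxy; returns a fresh object on every call."""
    def client(self, service_name):
        return object()

_client = None

def get_singleton_client(session):
    global _client
    if _client is None:
        _client = session.client("frauddetector")
    return _client

first = get_singleton_client(FakeSession())
second = get_singleton_client(FakeSession())
print(first is second)  # True: the first client is cached and reused
```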
| 28.5 | 88 | 0.746589 | 65 | 513 | 5.646154 | 0.492308 | 0.220708 | 0.147139 | 0.114441 | 0.152589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191033 | 513 | 17 | 89 | 30.176471 | 0.884337 | 0.167641 | 0 | 0.153846 | 0 | 0 | 0.129412 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.076923 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84af899d6b5f8151362343ed14a77e305bd145de | 9,116 | py | Python | flask_unchained/bundles/security/config.py | briancappello/flask-unchained | bff296b5c808f5b1db10f7dddb81054600545749 | [
"MIT"
] | 69 | 2018-10-10T01:59:11.000Z | 2022-03-29T17:29:30.000Z | flask_unchained/bundles/security/config.py | briancappello/flask-unchained | bff296b5c808f5b1db10f7dddb81054600545749 | [
"MIT"
] | 18 | 2018-11-17T12:42:02.000Z | 2021-05-22T18:45:27.000Z | flask_unchained/bundles/security/config.py | briancappello/flask-unchained | bff296b5c808f5b1db10f7dddb81054600545749 | [
"MIT"
] | 7 | 2018-10-12T16:20:25.000Z | 2021-10-06T12:18:21.000Z | from datetime import datetime, timezone
from flask import abort
from flask_unchained import BundleConfig
from http import HTTPStatus
from .forms import (
LoginForm,
RegisterForm,
ForgotPasswordForm,
ResetPasswordForm,
ChangePasswordForm,
SendConfirmationForm,
)
from .models import AnonymousUser
class AuthenticationConfig:
"""
Config options for logging in and out.
"""
SECURITY_LOGIN_FORM = LoginForm
"""
The form class to use for the login view.
"""
SECURITY_DEFAULT_REMEMBER_ME = False
"""
Whether or not the login form should default to checking the
"Remember me?" option.
"""
SECURITY_REMEMBER_SALT = 'security-remember-salt'
"""
Salt used for the remember me cookie token.
"""
SECURITY_USER_IDENTITY_ATTRIBUTES = ['email'] # FIXME-identity
"""
List of attributes on the user model that can be used for logging in.
Each must be unique.
"""
SECURITY_POST_LOGIN_REDIRECT_ENDPOINT = '/'
"""
The endpoint or url to redirect to after a successful login.
"""
SECURITY_POST_LOGOUT_REDIRECT_ENDPOINT = '/'
"""
The endpoint or url to redirect to after a user logs out.
"""
class ChangePasswordConfig:
"""
Config options for changing passwords
"""
SECURITY_CHANGEABLE = False
"""
Whether or not to enable change password functionality.
"""
SECURITY_CHANGE_PASSWORD_FORM = ChangePasswordForm
"""
Form class to use for the change password view.
"""
SECURITY_POST_CHANGE_REDIRECT_ENDPOINT = None
"""
Endpoint or url to redirect to after the user changes their password.
"""
SECURITY_SEND_PASSWORD_CHANGED_EMAIL = \
'mail_bundle' in BundleConfig.current_app.unchained.bundles
"""
Whether or not to send the user an email when their password has been changed.
Defaults to True, and it's strongly recommended to leave this option enabled.
"""
class EncryptionConfig:
"""
Config options for encryption hashing.
"""
SECURITY_PASSWORD_SALT = 'security-password-salt'
"""
Specifies the HMAC salt. This is only used if the password hash type is
set to something other than plain text.
"""
SECURITY_PASSWORD_HASH = 'bcrypt'
"""
Specifies the password hash algorithm to use when hashing passwords.
Recommended values for production systems are ``argon2``, ``bcrypt``,
or ``pbkdf2_sha512``. May require extra packages to be installed.
"""
SECURITY_PASSWORD_SINGLE_HASH = False
"""
Specifies that passwords should only be hashed once. By default, passwords
are hashed twice, first with SECURITY_PASSWORD_SALT, and then with a random
salt. May be useful for integrating with other applications.
"""
SECURITY_PASSWORD_SCHEMES = ['argon2',
'bcrypt',
'pbkdf2_sha512',
# and always the last one...
'plaintext']
"""
List of algorithms that can be used for hashing passwords.
"""
SECURITY_PASSWORD_HASH_OPTIONS = {}
"""
Specifies additional options to be passed to the hashing method.
"""
SECURITY_DEPRECATED_PASSWORD_SCHEMES = ['auto']
"""
List of deprecated algorithms for hashing passwords.
"""
SECURITY_HASHING_SCHEMES = ['sha512_crypt']
"""
List of algorithms that can be used for creating and validating tokens.
"""
SECURITY_DEPRECATED_HASHING_SCHEMES = []
"""
List of deprecated algorithms for creating and validating tokens.
"""
class ForgotPasswordConfig:
"""
Config options for recovering forgotten passwords
"""
SECURITY_RECOVERABLE = False
"""
Whether or not to enable forgot password functionality.
"""
SECURITY_FORGOT_PASSWORD_FORM = ForgotPasswordForm
"""
Form class to use for the forgot password form.
"""
# reset password (when the user clicks the link from the email sent by forgot pw)
# --------------
SECURITY_RESET_PASSWORD_FORM = ResetPasswordForm
"""
Form class to use for the reset password form.
"""
SECURITY_RESET_SALT = 'security-reset-salt'
"""
Salt used for the reset token.
"""
SECURITY_RESET_PASSWORD_WITHIN = '5 days'
"""
Specifies the amount of time a user has before their password reset link
expires. Always pluralize the time unit for this value. Defaults to 5 days.
"""
SECURITY_POST_RESET_REDIRECT_ENDPOINT = None
"""
Endpoint or url to redirect to after the user resets their password.
"""
SECURITY_INVALID_RESET_TOKEN_REDIRECT = 'security_controller.forgot_password'
"""
Endpoint or url to redirect to if the reset token is invalid.
"""
SECURITY_EXPIRED_RESET_TOKEN_REDIRECT = 'security_controller.forgot_password'
"""
Endpoint or url to redirect to if the reset token is expired.
"""
SECURITY_API_RESET_PASSWORD_HTTP_GET_REDIRECT = None
"""
Endpoint or url to redirect to if a GET request is made to the reset password
view. Defaults to None, meaning no redirect. Useful for single page apps.
"""
SECURITY_SEND_PASSWORD_RESET_NOTICE_EMAIL = \
'mail_bundle' in BundleConfig.current_app.unchained.bundles
"""
Whether or not to send the user an email when their password has been reset.
Defaults to True, and it's strongly recommended to leave this option enabled.
"""
class RegistrationConfig:
"""
Config options for user registration
"""
SECURITY_REGISTERABLE = False
"""
Whether or not to enable registration.
"""
SECURITY_REGISTER_FORM = RegisterForm
"""
The form class to use for the register view.
"""
SECURITY_POST_REGISTER_REDIRECT_ENDPOINT = None
"""
The endpoint or url to redirect to after a user completes the
registration form.
"""
SECURITY_SEND_REGISTER_EMAIL = \
'mail_bundle' in BundleConfig.current_app.unchained.bundles
"""
Whether or not to send a welcome email after a user completes the
registration form.
"""
# email confirmation options
# --------------------------
SECURITY_CONFIRMABLE = False
"""
Whether or not to enable required email confirmation for new users.
"""
SECURITY_SEND_CONFIRMATION_FORM = SendConfirmationForm
"""
Form class to use for the (re)send confirmation email form.
"""
SECURITY_CONFIRM_SALT = 'security-confirm-salt'
"""
Salt used for the confirmation token.
"""
SECURITY_LOGIN_WITHOUT_CONFIRMATION = False
"""
Allow users to login without confirming their email first. (This option
only applies when :attr:`SECURITY_CONFIRMABLE` is True.)
"""
SECURITY_CONFIRM_EMAIL_WITHIN = '5 days'
"""
How long to wait until considering the token in confirmation emails to
be expired.
"""
SECURITY_POST_CONFIRM_REDIRECT_ENDPOINT = None
"""
Endpoint or url to redirect to after the user confirms their email.
Defaults to :attr:`SECURITY_POST_LOGIN_REDIRECT_ENDPOINT`.
"""
SECURITY_CONFIRM_ERROR_REDIRECT_ENDPOINT = None
"""
Endpoint to redirect to if there's an error confirming the user's email.
"""
class TokenConfig:
"""
Config options for token authentication.
"""
SECURITY_TOKEN_AUTHENTICATION_KEY = 'auth_token'
"""
Specifies the query string parameter to read when using token authentication.
"""
SECURITY_TOKEN_AUTHENTICATION_HEADER = 'Authentication-Token'
"""
Specifies the HTTP header to read when using token authentication.
"""
SECURITY_TOKEN_MAX_AGE = None
"""
Specifies the number of seconds before an authentication token expires.
Defaults to None, meaning the token never expires.
"""
class Config(AuthenticationConfig,
ChangePasswordConfig,
EncryptionConfig,
ForgotPasswordConfig,
RegistrationConfig,
TokenConfig,
BundleConfig):
"""
Config options for the Security Bundle.
"""
SECURITY_ANONYMOUS_USER = AnonymousUser
"""
Class to use for representing anonymous users.
"""
SECURITY_UNAUTHORIZED_CALLBACK = lambda: abort(HTTPStatus.UNAUTHORIZED)
"""
This callback gets called when authorization fails. By default we abort with
an HTTP status code of 401 (UNAUTHORIZED).
"""
# make datetimes timezone-aware by default
SECURITY_DATETIME_FACTORY = lambda: datetime.now(timezone.utc)
"""
Factory function to use when creating new dates. By default we use
``datetime.now(timezone.utc)`` to create a timezone-aware datetime.
"""
ADMIN_CATEGORY_ICON_CLASSES = {
'Security': 'fa fa-lock',
}
class TestConfig(Config):
"""
Default test settings for the Security Bundle.
"""
SECURITY_PASSWORD_HASH = 'plaintext'
"""
Disable password-hashing in tests (shaves about 30% off the test-run time)
"""
| 27.130952 | 85 | 0.672554 | 1,058 | 9,116 | 5.646503 | 0.258979 | 0.011048 | 0.020087 | 0.022598 | 0.290425 | 0.234851 | 0.20472 | 0.180783 | 0.154335 | 0.154335 | 0 | 0.003082 | 0.252523 | 9,116 | 335 | 86 | 27.21194 | 0.873643 | 0.06176 | 0 | 0.036585 | 0 | 0 | 0.083925 | 0.035517 | 0 | 0 | 0 | 0.002985 | 0 | 1 | 0 | false | 0.280488 | 0.073171 | 0 | 0.743902 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
84b76a8676a19ebc9fa379b48d1d23747d4e43c3 | 6,256 | py | Python | monkPriorityList.py | skasch/ffxiv-sim | eb69fa55a8004f064937eeaffec8f1c7e31f7db8 | [
"MIT"
] | null | null | null | monkPriorityList.py | skasch/ffxiv-sim | eb69fa55a8004f064937eeaffec8f1c7e31f7db8 | [
"MIT"
] | null | null | null | monkPriorityList.py | skasch/ffxiv-sim | eb69fa55a8004f064937eeaffec8f1c7e31f7db8 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Tue May 31 17:02:24 2016
@author: rmondoncancel
"""
# Deprecated
priorityList = [
{
'name': 'fistOfFire',
'group': 'monk',
'prepull': True,
'condition': {
'type': 'buffPresent',
'name': 'fistOfFire',
'comparison': 'is',
'value': False,
}
},
{
'name': 'perfectBalance',
'group': 'pugilist',
'prepull': True,
},
{
'name': 'bloodForBlood',
'group': 'lancer',
'condition': {
'logic': 'and',
'list': [
{
'type': 'buffAtMaxStacks',
'name': 'greasedLightning',
'comparison': 'is',
'value': True,
},
]
}
},
{
'name': 'perfectBalance',
'group': 'pugilist',
'condition': {
'type': 'cooldownPresent',
'name': 'tornadoKick',
'comparison': 'is',
'value': False,
}
},
{
'name': 'tornadoKick',
'group': 'monk',
'condition': {
'logic': 'and',
'list': [
{
'type': 'buffTimeLeft',
'name': 'perfectBalance',
'comparison': '>=',
'value': 7,
},
{
'type': 'buffAtMaxStacks',
'name': 'greasedLightning',
'comparison': 'is',
'value': True,
}
],
}
},
{
'name': 'internalRelease',
'group': 'pugilist',
'condition': {
'logic': 'and',
'list': [
{
'type': 'buffAtMaxStacks',
'name': 'greasedLightning',
'comparison': 'is',
'value': True,
},
{
'type': 'cooldownTimeLeft',
'name': 'elixirField',
'comparison': '<=',
'value': 6,
},
]
}
},
{
'name': 'potionOfStrengthHQ',
'group': 'item',
'condition': {
'logic': 'and',
'list': [
{
'type': 'buffAtMaxStacks',
'name': 'greasedLightning',
'comparison': 'is',
'value': True,
},
]
}
},
{
'name': 'howlingFist',
'group': 'pugilist',
'condition': {
'type': 'buffPresent',
'name': 'internalRelease',
'comparison': 'is',
'value': True,
}
},
{
'name': 'elixirField',
'group': 'monk',
'condition': {
'logic': 'or',
'list': [
{
'type': 'buffPresent',
'name': 'internalRelease',
'comparison': 'is',
'value': True,
},
{
'type': 'cooldownTimeLeft',
'name': 'internalRelease',
'comparison': '>=',
'value': 20,
},
],
}
},
{
'name': 'steelPeak',
'group': 'pugilist',
'condition': {
'logic': 'and',
'list': [
{
'type': 'buffAtMaxStacks',
'name': 'greasedLightning',
'comparison': 'is',
'value': True,
},
]
}
},
{
'name': 'touchOfDeath',
'group': 'pugilist',
'condition': {
'logic': 'and',
'list': [
{
'type': 'debuffTimeLeft',
'name': 'touchOfDeath',
'comparison': '<=',
'value': 1.5,
},
{
'type': 'debuffPresent',
'name': 'dragonKick',
'comparison': 'is',
'value': True,
},
{
'type': 'buffPresent',
'name': 'twinSnakes',
'comparison': 'is',
'value': True,
},
]
}
},
{
'name': 'fracture',
'group': 'marauder',
'condition': {
'logic': 'and',
'list': [
{
'type': 'debuffTimeLeft',
'name': 'fracture',
'comparison': '<=',
'value': 1.5,
},
{
'type': 'debuffPresent',
'name': 'dragonKick',
'comparison': 'is',
'value': True,
},
{
'type': 'buffPresent',
'name': 'twinSnakes',
'comparison': 'is',
'value': True,
},
]
}
},
{
'name': 'demolish',
'group': 'pugilist',
'condition': {
'type': 'debuffTimeLeft',
'name': 'demolish',
'comparison': '<=',
'value': 4,
},
},
{
'name': 'twinSnakes',
'group': 'pugilist',
'condition': {
'type': 'buffTimeLeft',
'name': 'twinSnakes',
'comparison': '<=',
'value': 5,
},
},
{
'name': 'snapPunch',
'group': 'pugilist',
},
{
'name': 'dragonKick',
'group': 'monk',
'condition': {
'type': 'debuffTimeLeft',
'name': 'dragonKick',
'comparison': '<=',
'value': 5,
},
},
{
'name': 'trueStrike',
'group': 'pugilist',
},
{
'name': 'bootshine',
'group': 'pugilist',
},
]
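One way a simulator might consume entries of this shape is a small condition evaluator; this sketch is not part of the original project and only covers the leaf condition forms used above:

```python
# Hypothetical evaluator for a single leaf condition from the list above.
def check(condition, state):
    """state maps (type, name) pairs to current simulator values."""
    value = state[(condition['type'], condition['name'])]
    op, target = condition['comparison'], condition['value']
    if op == 'is':
        return value is target
    if op == '<=':
        return value <= target
    if op == '>=':
        return value >= target
    raise ValueError(f"unknown comparison: {op}")

state = {('buffTimeLeft', 'twinSnakes'): 3.2}
cond = {'type': 'buffTimeLeft', 'name': 'twinSnakes',
        'comparison': '<=', 'value': 5}
print(check(cond, state))  # True: 3.2 <= 5, so twinSnakes should be refreshed
```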
| 24.825397 | 47 | 0.307385 | 292 | 6,256 | 6.585616 | 0.226027 | 0.081123 | 0.114925 | 0.120125 | 0.525221 | 0.483099 | 0.483099 | 0.406656 | 0.349454 | 0.316173 | 0 | 0.008245 | 0.534687 | 6,256 | 251 | 48 | 24.924303 | 0.652353 | 0.014866 | 0 | 0.570248 | 0 | 0 | 0.290543 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84b7a85139287ab51f816065cf0d6ff27012f826 | 2,740 | py | Python | robotto.py | sectrimte/pkmn_robotto | 4c9e4bbcf91e96ca4e8808c5f7d2d0a235016e69 | [
"Apache-2.0"
] | null | null | null | robotto.py | sectrimte/pkmn_robotto | 4c9e4bbcf91e96ca4e8808c5f7d2d0a235016e69 | [
"Apache-2.0"
] | null | null | null | robotto.py | sectrimte/pkmn_robotto | 4c9e4bbcf91e96ca4e8808c5f7d2d0a235016e69 | [
"Apache-2.0"
] | null | null | null | import pyautogui
import random
from state import State
import time
fight_screen_bg_color = [52, 52, 52]
fight_screen_blank_xy = [533,733]
button_run_xy = [486, 755]
def path_circle_by_sizes(height, length):
path = ['right']*length + ['down']*height + ['left']*length + ['up']*height
return path
def path_rand_by_sizes(height, length, current_position=[0, 0]):
directions = ['right', 'left', 'down', 'up']
delta_x = [1, -1, 0, 0]
delta_y = [0, 0, 1, -1]
path = []
for i in range(100):
next_step = random.randint(0, 3)
if (0 <= current_position[0] + delta_x[next_step] < length) and (0 <= current_position[1] + delta_y[next_step] < height):
path.append(directions[next_step])
current_position[0] = current_position[0] + delta_x[next_step]
current_position[1] = current_position[1] + delta_y[next_step]
return path
def move(path):
#pyautogui.typewrite(path, 0.5)
for key in path:
hold_key(key, 0.2)
print('pressed '+key+'\n')
def automove(h, l):
move(path_circle_by_sizes(h, l))
move(path_rand_by_sizes(h, l))
def hold_key(key, hold_time):
start = time.time()
pyautogui.keyDown(key)
while time.time() - start < hold_time:
pass
pyautogui.keyUp(key)
#definition state machine
class MoveState(State):
def on_event(self, event):
if event == 'encounter':
return EncounterState()
return self
class EncounterState(State):
def on_event(self, event):
if event == 'pokemon_ok':
return FightingState()
if event == 'pokemon_ko':
return MoveState()
return self
class FightingState(State):
def on_event(self, event):
if event == 'encounter':
return UnlockedState()
return self
class WaitingState(State):
def on_event(self, event):
if event == 'encounter':
return UnlockedState()
return self
class LearningState(State):
def on_event(self, event):
if event == 'encounter':
return UnlockedState()
return self
states = ['move', 'encounter', 'check_pokemon', 'run']
current_state = MoveState()
#move(path_circle_by_sizes(4, 4))
#exit()
try:
while True:
#get mouse coordinates
x, y = pyautogui.position()
#screenshot and get pixel color under the mouse cursor
pixelColor = pyautogui.screenshot().getpixel((x, y))
if repr(current_state) == 'MoveState':
pass  # TODO: detect encounters from pixelColor and drive the state machine
except KeyboardInterrupt:
print('\nDone')
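The state-machine half of the script can be exercised on its own; this sketch supplies a minimal `State` base class in place of the imported `state.State` (assumed to expose `on_event`):

```python
# Self-contained event-driven state machine in the same style.
class State:
    def __repr__(self):
        return self.__class__.__name__
    def on_event(self, event):
        return self

class MoveState(State):
    def on_event(self, event):
        return EncounterState() if event == 'encounter' else self

class EncounterState(State):
    def on_event(self, event):
        if event == 'pokemon_ok':
            return FightingState()
        if event == 'pokemon_ko':
            return MoveState()
        return self

class FightingState(State):
    pass  # inherits on_event, which stays in this state

current = MoveState()
for event in ('encounter', 'pokemon_ok'):
    current = current.on_event(event)
print(repr(current))  # FightingState
```

Each `on_event` returns the next state object, so the driver loop only ever holds one `current` reference.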
| 26.346154 | 130 | 0.583212 | 330 | 2,740 | 4.672727 | 0.318182 | 0.068093 | 0.032425 | 0.048638 | 0.297017 | 0.26978 | 0.26978 | 0.230869 | 0.170558 | 0.170558 | 0 | 0.024454 | 0.29854 | 2,740 | 104 | 131 | 26.346154 | 0.777836 | 0.060584 | 0 | 0.267606 | 0 | 0 | 0.056772 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.014085 | 0.056338 | null | null | 0.028169 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84b947747158c22cea0ee583bdecc74083bcd0ad | 2,209 | py | Python | accounts/tests.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | accounts/tests.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | accounts/tests.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | from django.contrib.auth.hashers import check_password
from django.test import TestCase
from accounts.forms import CreateBaseUserInstance
from accounts.models import User
# Unit tests for SignUp and Registration
class UserManagementTestCase(TestCase):
def test_create_base_user_form(self):
'''
Test form behavior: failure and validation.
Checks that a mismatched password confirmation fails validation, and that
valid data passes, saves a User, and stores a hashed password.
:return:
'''
invalid_base_data = {
'email': 'max.mustermann@jagdreisencheck.de',
'username': 'maxmustermann',
'password': 'testword123',
'confirm_passwd': 'testwort123',
'first_name': 'Max',
'last_name': 'Mustermann',
'country_of_residence': 'DE',
'agree_to_privacy': True,
'agree_to_tos': True
}
valid_base_data = {
'email': 'max.mustermann@jagdreisencheck.de',
'username': 'maxmustermann',
'password': 'testword123',
'confirm_passwd': 'testword123',
'first_name': 'Max',
'last_name': 'Mustermann',
'country_of_residence': 'DE',
'agree_to_privacy': True,
'agree_to_tos': True
}
# Test form with invalid data
form = CreateBaseUserInstance(data=invalid_base_data)
form.is_valid()
self.assertTrue(form.errors)
# Test form with valid data and write it to the database. Then retrieve the base user instance and compare it to
# the submitted form.
form = CreateBaseUserInstance(data=valid_base_data)
form.is_valid()
self.assertFalse(form.errors)
form = form.save(commit=False)
self.assertTrue(isinstance(form, User))
form.save()
maxi = User.objects.get(email='max.mustermann@jagdreisencheck.de')
self.assertEqual(maxi, form, msg="User found in DB.")
self.assertTrue(check_password(valid_base_data['password'], maxi.password))
def test_create_individual_user(self):
'''
Test the individual user creation procedure. Uses the base user created in the base user test.
:return:
'''
pass | 35.063492 | 120 | 0.616569 | 236 | 2,209 | 5.605932 | 0.394068 | 0.030234 | 0.040816 | 0.07483 | 0.33031 | 0.303855 | 0.269085 | 0.269085 | 0.269085 | 0.269085 | 0 | 0.007595 | 0.284744 | 2,209 | 63 | 121 | 35.063492 | 0.829747 | 0.17474 | 0 | 0.428571 | 0 | 0 | 0.244571 | 0.056571 | 0 | 0 | 0 | 0 | 0.119048 | 1 | 0.047619 | false | 0.166667 | 0.095238 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
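Stripped of the Django form machinery, the password-confirmation check the first test exercises reduces to:

```python
# Plain-Python sketch of the confirm-password validation under test.
def validate_passwords(data):
    errors = {}
    if data['password'] != data['confirm_passwd']:
        errors['confirm_passwd'] = 'Passwords do not match.'
    return errors

invalid = {'password': 'testword123', 'confirm_passwd': 'testwort123'}
valid = {'password': 'testword123', 'confirm_passwd': 'testword123'}
print(bool(validate_passwords(invalid)), bool(validate_passwords(valid)))  # True False
```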
84ba755e68d6f8aed3b5d61cd0febbfe9a2346b4 | 3,971 | py | Python | hio-yocto-bsp/sources/poky/scripts/lib/mic/3rdparty/pykickstart/commands/multipath.py | qiangzai00001/hio-prj | 060ff97fe21093b1369db78109d5b730b2b181c8 | [
"MIT"
] | null | null | null | hio-yocto-bsp/sources/poky/scripts/lib/mic/3rdparty/pykickstart/commands/multipath.py | qiangzai00001/hio-prj | 060ff97fe21093b1369db78109d5b730b2b181c8 | [
"MIT"
] | null | null | null | hio-yocto-bsp/sources/poky/scripts/lib/mic/3rdparty/pykickstart/commands/multipath.py | qiangzai00001/hio-prj | 060ff97fe21093b1369db78109d5b730b2b181c8 | [
"MIT"
] | null | null | null | #
# Chris Lumens <clumens@redhat.com>
# Peter Jones <pjones@redhat.com>
#
# Copyright 2006, 2007 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use, modify,
# copy, or redistribute it subject to the terms and conditions of the GNU
# General Public License v.2. This program is distributed in the hope that it
# will be useful, but WITHOUT ANY WARRANTY expressed or implied, including the
# implied warranties of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc., 51
# Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA. Any Red Hat
# trademarks that are incorporated in the source code or documentation are not
# subject to the GNU General Public License and may only be used or replicated
# with the express permission of Red Hat, Inc.
#
from pykickstart.base import *
from pykickstart.errors import *
from pykickstart.options import *
import gettext
_ = lambda x: gettext.ldgettext("pykickstart", x)
class FC6_MpPathData(BaseData):
removedKeywords = BaseData.removedKeywords
removedAttrs = BaseData.removedAttrs
def __init__(self, *args, **kwargs):
BaseData.__init__(self, *args, **kwargs)
self.mpdev = kwargs.get("mpdev", "")
self.device = kwargs.get("device", "")
self.rule = kwargs.get("rule", "")
def __str__(self):
return " --device=%s --rule=\"%s\"" % (self.device, self.rule)
class FC6_MultiPathData(BaseData):
removedKeywords = BaseData.removedKeywords
removedAttrs = BaseData.removedAttrs
def __init__(self, *args, **kwargs):
BaseData.__init__(self, *args, **kwargs)
self.name = kwargs.get("name", "")
self.paths = kwargs.get("paths", [])
def __str__(self):
retval = BaseData.__str__(self)
for path in self.paths:
retval += "multipath --mpdev=%s %s\n" % (self.name, path.__str__())
return retval
class FC6_MultiPath(KickstartCommand):
removedKeywords = KickstartCommand.removedKeywords
removedAttrs = KickstartCommand.removedAttrs
def __init__(self, writePriority=50, *args, **kwargs):
KickstartCommand.__init__(self, writePriority, *args, **kwargs)
self.op = self._getParser()
self.mpaths = kwargs.get("mpaths", [])
def __str__(self):
retval = ""
for mpath in self.mpaths:
retval += mpath.__str__()
return retval
def _getParser(self):
op = KSOptionParser()
op.add_option("--name", dest="name", action="store", type="string",
required=1)
op.add_option("--device", dest="device", action="store", type="string",
required=1)
op.add_option("--rule", dest="rule", action="store", type="string",
required=1)
return op
def parse(self, args):
(opts, extra) = self.op.parse_args(args=args, lineno=self.lineno)
dd = FC6_MpPathData()
self._setToObj(self.op, opts, dd)
dd.lineno = self.lineno
dd.mpdev = dd.mpdev.split('/')[-1]
parent = None
for x in range(0, len(self.mpaths)):
mpath = self.mpaths[x]
for path in mpath.paths:
if path.device == dd.device:
mapping = {"device": path.device, "multipathdev": path.mpdev}
raise KickstartValueError, formatErrorMsg(self.lineno, msg=_("Device '%(device)s' is already used in multipath '%(multipathdev)s'") % mapping)
if mpath.name == dd.mpdev:
parent = x
if parent is None:
mpath = FC6_MultiPathData()
return mpath
else:
mpath = self.mpaths[parent]
return dd
def dataList(self):
return self.mpaths
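`KSOptionParser` is an `optparse` subclass, so the `--name/--device/--rule` parse step above can be approximated with stdlib `optparse` alone:

```python
# Rough stand-in for the _getParser()/parse() pair using plain optparse.
from optparse import OptionParser

op = OptionParser()
op.add_option("--name", dest="name", action="store", type="string")
op.add_option("--device", dest="device", action="store", type="string")
op.add_option("--rule", dest="rule", action="store", type="string")

# parse_args accepts an explicit argv-style list, as parse() does above.
opts, extra = op.parse_args(["--name=mpath0", "--device=sda", "--rule=failover"])
print(opts.name, opts.device, opts.rule)  # mpath0 sda failover
```

The `required=1` keyword in the original is a `KSOptionParser` extension; plain `optparse` has no required-option support, which is one reason pykickstart subclasses it.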
| 35.455357 | 162 | 0.636867 | 479 | 3,971 | 5.150313 | 0.361169 | 0.019457 | 0.021078 | 0.030807 | 0.193758 | 0.172679 | 0.137819 | 0.137819 | 0.137819 | 0.10458 | 0 | 0.010753 | 0.250567 | 3,971 | 111 | 163 | 35.774775 | 0.818212 | 0.238983 | 0 | 0.225352 | 0 | 0 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.056338 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84be222814483ab383927adcdabea5a7a57603ff | 9,142 | py | Python | tests/samples/test_sample.py | FoxoTech/methylcheck | 881d14d78e6086aab184716e0b79cdf87e9be8bf | [
"MIT"
] | 1 | 2020-07-30T14:16:47.000Z | 2020-07-30T14:16:47.000Z | tests/samples/test_sample.py | FoxoTech/methylcheck | 881d14d78e6086aab184716e0b79cdf87e9be8bf | [
"MIT"
] | 11 | 2021-04-08T16:14:54.000Z | 2022-03-09T00:22:13.000Z | tests/samples/test_sample.py | FoxoTech/methylcheck | 881d14d78e6086aab184716e0b79cdf87e9be8bf | [
"MIT"
] | 1 | 2022-02-10T09:06:45.000Z | 2022-02-10T09:06:45.000Z | import methylcheck
import pandas as pd
import methylprep # for manifest support
from pathlib import Path
PATH = Path('docs/example_data/mouse')
class TestProcessedSample():
manifest = methylprep.Manifest(methylprep.ArrayType('mouse'))
manifest_mouse_design_types = dict(manifest.mouse_data_frame['design'].value_counts())
manifest_control_probe_types = dict(manifest.control_data_frame['Control_Type'].value_counts())
def test_mouse_probes(self):
pd_mu = pd.read_pickle(Path(PATH,'mouse_probes.pkl'))
mu = methylcheck.load(Path(PATH,'mouse_probes.pkl'), verbose=False, silent=True)
if not (isinstance(mu, dict) and isinstance(list(mu.values())[0], pd.DataFrame)):
raise AssertionError()
mouse_probe_countA = len(list(mu.values())[0])
mouse_probe_countB = len(list(pd_mu.values())[0])
if len(mu) != len(pd_mu):
raise AssertionError(f"Got {len(mu)} items in mouse_probes.pkl; expected {len(pd_mu)}.")
if mouse_probe_countA != mouse_probe_countB:
raise AssertionError(f"Got {mouse_probe_countA} probes in mouse_probes.pkl vs {mouse_probe_countB}.")
# mouse_probes_v146 = 10067
mouse_probes_v155 = 32753
if mu['204879580038_R06C02'].shape[0] != mouse_probes_v155:
raise AssertionError(f"Got {mu['204879580038_R06C02'].shape[0]} probes in mouse_probes.pkl; expected {mouse_probes_v155}.")
print(f"sample and probe count OK")
df0 = list(mu.values())[0]
#probes_by_type = dict(df0.index.str[:2].value_counts())
#if probes_by_type['cg'] != 5881 and probes_by_type['uk'] != 4186:
# raise AssertionError(f"Mismatch in number of cg and uk(nown) probes in mouse_probes.pkl; expected 5881 cg and 4186 uk.")
probes_by_type = dict(df0.design.value_counts())
if probes_by_type['Random'] != 27305 or probes_by_type['Multi'] != 5448:
raise AssertionError(f"Mismatch in number of Random/Multi CpG probes in mouse_probes.pkl; expected 27305 Random and 5448 Multi.")
print('mouse probe counts OK')
def __test_mouse_probes_v146(self):
pd_mu = pd.read_pickle(Path(PATH,'mouse_probes.pkl'))
mu = methylcheck.load(Path(PATH,'mouse_probes.pkl'), verbose=False, silent=True)
if not (isinstance(mu, dict) and isinstance(list(mu.values())[0], pd.DataFrame)):
raise AssertionError()
if len(mu) != len(pd_mu):
raise AssertionError(f"Got a {len(mu)} items in mouse_probes.pkl; expected {len(pd_mu)}.")
print(f"sample count OK")
df0 = list(mu.values())[0]
probes_by_type = dict(df0.index.str[:2].value_counts())
if probes_by_type['cg'] != 5881 and probes_by_type['uk'] != 4186:
raise AssertionError(f"Mismatch in number of cg and uk(nown) probes in mouse_probes.pkl; expected 5881 cg and 4186 uk.")
print('mouse cg,uk count OK')
actual_mouse_probes = dict(df0['Probe_Type'].value_counts())
probe_counts = {'mu': 6332, 'rp': 4514, 'ch': 2851, 'rs': 291} # based on C20 manifest
probe_count_errors = {}
for probe_type, probe_count in probe_counts.items():
if self.manifest_mouse_probe_types[probe_type] != probe_count:
probe_count_errors[probe_type] = {'actual': self.manifest_mouse_probe_types[probe_type], 'expected': probe_count}
if probe_count_errors:
raise AssertionError(f"mouse probe count errors: {probe_count_errors}")
print('mouse mu,rp,ch,rs count OK')
# compare with manifest
diffs = []
for probe_type, probe_count in self.manifest_mouse_probe_types.items():
if probe_count != actual_mouse_probes[probe_type]:
diffs.append(f"{probe_type}: {actual_mouse_probes[probe_type]} / {probe_count}")
if diffs:
print("Probes in manifest NOT in control probes saved:")
print('\n'.join(diffs))
        # not part of newer (>v1.4.6) mouse_probes.pkl
        #actual_mouse_probes = dict(df0['Probe_Type'].value_counts())
        #probe_counts_C20 = {'mu': 6332, 'rp': 4514, 'ch': 2851, 'rs': 291} # based on C20 manifest, v1.4.6
        #probe_counts_mm285_v2 = {'mu': 4821, 'rp': 3048, 'ch': 2085, 'rs': 113} # v1.5.5
        #probe_count_errors = {}
        #for probe_type, probe_count in probe_counts_mm285_v2.items():
        #    if self.manifest_mouse_probe_types[probe_type] != probe_count:
        #        probe_count_errors[probe_type] = {'actual': self.manifest_mouse_probe_types[probe_type], 'expected': probe_count}
        #if probe_count_errors:
        #    raise AssertionError(f"mouse probe count errors: {probe_count_errors}")
        #print('mouse mu,rp,ch,rs count OK')
        # compare with manifest -- done in test_Array_processed.py already
        #diffs = []
        #for probe_type, probe_count in self.manifest_mouse_probe_types.items():
        #    if probe_count != actual_mouse_probes[probe_type]:
        #        diffs.append(f"{probe_type}: {actual_mouse_probes[probe_type]} / {probe_count}")
        #if diffs:
        #    print("Probes in manifest NOT in control probes saved:")
        #    print('\n'.join(diffs))
    def test_control_probes(self):
        con = methylcheck.load(Path(PATH, 'control_probes.pkl'), verbose=False, silent=True)
        con0 = list(con.values())[0]
        con_types = dict(con0['Control_Type'].value_counts())
        #probe_counts_v146 = {'NEGATIVE': 179, 'NORM_T': 24, 'NORM_C': 22, 'NORM_A': 10, 'NORM_G': 9, 'NON-POLYMORPHIC': 3, 'SPECIFICITY I': 3, 'BISULFITE CONVERSION I': 3, 'BISULFITE CONVERSION II': 2, 'SPECIFICITY II': 2, 'HYBRIDIZATION': 2, 'RESTORATION': 1}
        probe_counts = {'NEGATIVE': 411, 'NORM_T': 58, 'NORM_C': 58, 'NORM_A': 27, 'NORM_G': 27, 'NON-POLYMORPHIC': 9, 'SPECIFICITY I': 12, 'BISULFITE CONVERSION I': 10, 'BISULFITE CONVERSION II': 4, 'SPECIFICITY II': 3, 'HYBRIDIZATION': 3, 'RESTORATION': 1}
        probe_count_errors = {}
        for probe_type, probe_count in probe_counts.items():
            if con_types[probe_type] != probe_count:
                probe_count_errors[probe_type] = {'actual': con_types[probe_type], 'expected': probe_count}
        if probe_count_errors:
            raise AssertionError(f"control probe count differed: {probe_count_errors}")
        print('mouse control_probes count OK')
        diffs = []
        for probe_type, probe_count in self.manifest_control_probe_types.items():
            if not con_types.get(probe_type):
                print(f"ERROR: control output data is missing {probe_type} found in manifest.")
                continue
            if probe_count != con_types[probe_type]:
                diffs.append(f"{probe_type}: {con_types[probe_type]} / {probe_count}")
        if diffs:
            print("Probes in manifest NOT in control probes saved:")
            print('\n'.join(diffs))
    def test_plot_mouse_betas_from_pickle(self):
        """ tests SAVE too """
        mu = methylcheck.load(Path(PATH, 'mouse_probes.pkl'), verbose=False, silent=True)
        df0 = list(mu.values())[0]
        df = df0[['beta_value']]
        methylcheck.sample_plot(df, silent=True, save=True)
        methylcheck.beta_density_plot(df, silent=True)
        methylcheck.sample_plot(df, silent=True)
        methylcheck.mean_beta_plot(df, silent=True)
        methylcheck.cumulative_sum_beta_distribution(df, silent=True)
        methylcheck.beta_mds_plot(df, silent=True, save=False)
        methylcheck.mean_beta_compare(df, df, silent=True)
        df = df0[['cm_value']]
        methylcheck.beta_density_plot(df, silent=True)
        df = df0[['m_value']]
        methylcheck.beta_density_plot(df, silent=True)
        Path('./beta.png').unlink()
        # Path('./beta_mds_n=*.png').unlink() -- causes error if this function doesn't create the file to remove
        methylcheck.beta_mds_plot(df, silent=True, save=True)
        for saved_png in Path('.').rglob('./beta_mds_n=*.png'):
            print(saved_png)
            saved_png.unlink()
    def ignore_test(self):
        with np.errstate(all='raise'):
            df = methylcheck.load(Path(PATH, 'beta_values.pkl'), verbose=False, silent=True)
            methylcheck.sample_plot(df, silent=True, save=True)
            methylcheck.beta_density_plot(df, silent=True)
            methylcheck.sample_plot(df, silent=True)
            methylcheck.mean_beta_plot(df, silent=True)
            methylcheck.cumulative_sum_beta_distribution(df, silent=True)
            methylcheck.beta_mds_plot(df, silent=True, save=False)
            methylcheck.mean_beta_compare(df, df, silent=True)
            #df = df0[['cm_value']]
            #methylcheck.beta_density_plot(df, silent=True)
            #df = df0[['m_value']]
            #methylcheck.beta_density_plot(df, silent=True)
            Path('./beta.png').unlink()
            # Path('./beta_mds_n=*.png').unlink() -- causes error if this function doesn't create the file to remove
            methylcheck.beta_mds_plot(df, silent=True, save=True)
            for saved_png in Path('.').rglob('./beta_mds_n=*.png'):
                print(saved_png)
                saved_png.unlink()
| 57.496855 | 261 | 0.654233 | 1,245 | 9,142 | 4.570281 | 0.15502 | 0.059754 | 0.042179 | 0.044991 | 0.715817 | 0.684886 | 0.662566 | 0.650615 | 0.648858 | 0.641652 | 0 | 0.033742 | 0.215489 | 9,142 | 158 | 262 | 57.860759 | 0.759621 | 0.220083 | 0 | 0.469565 | 0 | 0.017391 | 0.211313 | 0.019185 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.043478 | false | 0 | 0.034783 | 0 | 0.113043 | 0.113043 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84c335fc458875fff1f67c92200084f00cd6ddd4 | 420 | py | Python | server/apps/api/migrations/0002_allow_null.py | m3xan1k/censortracker_backend | 998f71efd65d7d1e02c0aecd20b7a59e0df35fd0 | [
"MIT"
] | null | null | null | server/apps/api/migrations/0002_allow_null.py | m3xan1k/censortracker_backend | 998f71efd65d7d1e02c0aecd20b7a59e0df35fd0 | [
"MIT"
] | null | null | null | server/apps/api/migrations/0002_allow_null.py | m3xan1k/censortracker_backend | 998f71efd65d7d1e02c0aecd20b7a59e0df35fd0 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.5 on 2020-04-20 15:21
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('api', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='domain',
            name='client_ip',
            field=models.GenericIPAddressField(blank=True, null=True, verbose_name='Client IP'),
        ),
    ]
| 22.105263 | 96 | 0.609524 | 46 | 420 | 5.478261 | 0.782609 | 0.079365 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062092 | 0.271429 | 420 | 18 | 97 | 23.333333 | 0.761438 | 0.107143 | 0 | 0 | 1 | 0 | 0.104558 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84d06320c996d5932ddb1765740ed2bedba84b6f | 2,842 | py | Python | noronha/db/movers.py | pierodesenzi/noronha | ee7cba8d0d29d0dc5484d2000e1a42c9954c20e4 | [
"Apache-2.0"
] | 43 | 2021-04-14T00:41:15.000Z | 2022-01-02T23:32:58.000Z | noronha/db/movers.py | pierodesenzi/noronha | ee7cba8d0d29d0dc5484d2000e1a42c9954c20e4 | [
"Apache-2.0"
] | 19 | 2021-04-14T00:35:21.000Z | 2022-01-12T14:24:48.000Z | noronha/db/movers.py | pierodesenzi/noronha | ee7cba8d0d29d0dc5484d2000e1a42c9954c20e4 | [
"Apache-2.0"
] | 8 | 2021-04-14T00:31:02.000Z | 2022-01-02T23:33:08.000Z | # -*- coding: utf-8 -*-
# Copyright Noronha Development Team
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""TODO: {{module description}}
"""
from mongoengine import CASCADE
from mongoengine.fields import StringField, DictField, ReferenceField, EmbeddedDocumentField, BooleanField
from noronha.common.constants import DBConst, OnBoard
from noronha.db.main import SmartDoc, SmartEmbeddedDoc
from noronha.db.ds import EmbeddedDataset
from noronha.db.model import Model, EmbeddedModel
from noronha.db.train import EmbeddedTraining
class ProtoModelVersion(object):
    PK_FIELDS = ['model.name', 'name']
    FILE_NAME = OnBoard.Meta.MV


class EmbeddedModelVersion(SmartEmbeddedDoc):

    PK_FIELDS = ProtoModelVersion.PK_FIELDS
    FILE_NAME = ProtoModelVersion.FILE_NAME

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.use_as_pretrained = False

    name = StringField(max_length=DBConst.MAX_NAME_LEN)
    model = EmbeddedDocumentField(EmbeddedModel, default=None)
    train = EmbeddedDocumentField(EmbeddedTraining, default=None)
    ds = EmbeddedDocumentField(EmbeddedDataset, default=None)
    compressed = BooleanField(default=False)
    details = DictField(default={})
    pretrained = StringField(default=None)
    lightweight = BooleanField(default=False)


class ModelVersion(SmartDoc):

    PK_FIELDS = ProtoModelVersion.PK_FIELDS
    FILE_NAME = ProtoModelVersion.FILE_NAME
    EMBEDDED_SCHEMA = EmbeddedModelVersion

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

    name = StringField(required=True, max_length=DBConst.MAX_NAME_LEN)
    model = ReferenceField(Model, required=True, reverse_delete_rule=CASCADE)
    train = EmbeddedDocumentField(EmbeddedTraining, default=None)
    ds = EmbeddedDocumentField(EmbeddedDataset, default=None)
    compressed = BooleanField(default=False)
    details = DictField(default={})
    pretrained = EmbeddedDocumentField(EmbeddedModelVersion, default=None)
    lightweight = BooleanField(default=False)

    def to_embedded(self):
        emb: EmbeddedModelVersion = super().to_embedded()
        if isinstance(self.pretrained, EmbeddedModelVersion):
            emb.pretrained = self.pretrained.show()
        return emb
| 34.240964 | 106 | 0.732935 | 311 | 2,842 | 6.572347 | 0.414791 | 0.037671 | 0.02544 | 0.015656 | 0.344423 | 0.344423 | 0.299413 | 0.26908 | 0.26908 | 0.229941 | 0 | 0.00215 | 0.181562 | 2,842 | 82 | 107 | 34.658537 | 0.876612 | 0.213934 | 0 | 0.418605 | 0 | 0 | 0.006323 | 0 | 0 | 0 | 0 | 0.012195 | 0 | 1 | 0.069767 | false | 0 | 0.162791 | 0 | 0.860465 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84d217b619e298e8238b55059389e2293157ea76 | 3,852 | py | Python | Lib/background.py | ProjectZeroDays/Pyto | d5d77f3541f329bbb28142d18606b22f115b7df6 | [
"MIT"
] | 2 | 2020-08-25T13:55:00.000Z | 2020-08-25T16:36:03.000Z | Lib/background.py | ProjectZeroDays/Pyto | d5d77f3541f329bbb28142d18606b22f115b7df6 | [
"MIT"
] | null | null | null | Lib/background.py | ProjectZeroDays/Pyto | d5d77f3541f329bbb28142d18606b22f115b7df6 | [
"MIT"
] | null | null | null | """
Run code in background indefinitely
This module allows you to keep running a script in the background indefinitely.
A great usage of this is fetching data in background and sending notifications with :py:mod:`notifications`. You can also run a server or a Discord bot for example.
Note: Because of privacy, apps cannot access to the clipboard in background, so coding a clipboard manager is not possible.
"""
from pyto import __Class__
from datetime import datetime
from time import sleep
from os.path import abspath
import sys
import threading
class BackgroundTask:
"""
Represents a task to run in background.
When started, the audio at the path passed to the initializer is played. If no audio is passed, a blank audio is used so Pyto isn't killed by the system.
Usage:
.. highlight:: python
.. code-block:: python
import background as bg
with bg.BackgroundTask() as b:
while True:
print(b.execution_time())
b.wait(1)
"""
start_date = None
__end_date__ = None
def execution_time(self) -> int:
"""
Returns the total execution time of the task in seconds.
:rtype: int
"""
if self.__end_date__ is not None:
date = self.__end_date__
else:
date = datetime.now()
return int((date - self.start_date).total_seconds())
@property
def notification_delay(self) -> int:
"""
The delay in seconds since each reminder notification.
If set to 3600, a notification will be sent every hour while the task is running.
The default value is ``21600`` (6 hours).
:rtype: int
"""
return self.__background_task__.delay
@notification_delay.setter
def notification_delay(self, new_value: int):
self.__background_task__.delay = new_value
@property
def reminder_notifications(self) -> bool:
"""
A boolean indicating whether a notification should be sent while the task is running.
By default, a notification is sent every 6 hours while the task is running, set this property to ``False`` to disable that,
:rtype: bool
"""
return self.__background_task__.sendNotification
@reminder_notifications.setter
def reminder_notifications(self, new_value: bool):
self.__background_task__.sendNotification = new_value
def __init__(self, audio_path=None):
self.__background_task__ = __Class__("BackgroundTask").new()
if audio_path is not None:
self.__background_task__.soundPath = abspath(audio_path)
def start(self):
"""
Starts the background task. After calling this function, Pyto will not be killed by the system.
"""
self.start_date = datetime.now()
self.__end_date__ = None
try:
self.__background_task__.scriptName = threading.current_thread().script_path.split("/")[-1]
except AttributeError:
pass
except IndexError:
pass
self.__background_task__.startBackgroundTask()
def stop(self):
"""
Stops the background task. After calling this function, Pyto can be killed by the system to free memory.
"""
self.__end_date__ = datetime.now()
self.__background_task__.stopBackgroundTask()
def wait(self, delay: float):
"""
Waits n seconds. Does the same thing as ``time.sleep``.
:param delay: Seconds to wait.
"""
sleep(delay)
def __enter__(self):
self.start()
return self
def __exit__(self, type, value, traceback):
self.stop()
if type is not None and value is not None and traceback is not None:
sys.excepthook(type, value, traceback)
| 28.746269 | 164 | 0.647456 | 485 | 3,852 | 4.913402 | 0.342268 | 0.064624 | 0.067982 | 0.021402 | 0.080151 | 0.037768 | 0.037768 | 0.037768 | 0 | 0 | 0 | 0.004676 | 0.278297 | 3,852 | 133 | 165 | 28.962406 | 0.852518 | 0.408619 | 0 | 0.075472 | 0 | 0 | 0.007407 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.207547 | false | 0.037736 | 0.113208 | 0 | 0.45283 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84d93da7a53d5949d64f3873e6e7cfb952e9165e | 2,231 | py | Python | db.py | changwang/knowledge | b5d014de1c9cf70991bd721015db5883ff670cda | [
"MIT"
] | null | null | null | db.py | changwang/knowledge | b5d014de1c9cf70991bd721015db5883ff670cda | [
"MIT"
] | null | null | null | db.py | changwang/knowledge | b5d014de1c9cf70991bd721015db5883ff670cda | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import os
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from logger import logger
__all__ = [
    'fetch_from_table',
    'fetch_one_row',
    'TableNotFoundException'
]

DATABASE_PASSWORD = os.getenv('DB_PASSWORD')
DATABASE_URI = 'mysql+pymysql://knowledge:{0}@localhost/knowledge'.format(DATABASE_PASSWORD)

engine = create_engine(
    DATABASE_URI,
    convert_unicode=True,
    pool_recycle=3600, pool_size=10
)

db_session = scoped_session(sessionmaker(
    autocommit=False, autoflush=False, bind=engine
))


def _check_table(table_name):
    """
    checks if the given table is available in the knowledge database.
    """
    check_table_sql = "SELECT exists(SELECT * FROM information_schema.tables WHERE table_name='{0}')".format(table_name)
    logger.info(check_table_sql)
    rs = db_session.execute(check_table_sql)
    table_row = rs.fetchone()[0]
    if not table_row:
        raise TableNotFoundException("Given table name {0} cannot be found".format(table_name))


def _get_pk(table_name, default_pk='id'):
    """
    gets the primary key name from the given table;
    if nothing is found, the default primary key is used.
    """
    get_pk_sql = "SELECT k.COLUMN_NAME FROM information_schema.table_constraints t " \
        "LEFT JOIN information_schema.key_column_usage k USING (constraint_name, table_schema, table_name) " \
        "WHERE t.constraint_type = 'PRIMARY KEY' AND t.table_schema=DATABASE() AND t.table_name = '{0}'".format(
            table_name)
    logger.info(get_pk_sql)
    rs = db_session.execute(get_pk_sql)
    pk_name = rs.fetchone()[0]
    if not pk_name:
        pk_name = default_pk
    return pk_name


def fetch_from_table(table_name, limit, offset):
    _check_table(table_name)
    rs = db_session.execute("SELECT * FROM {0} LIMIT {1} OFFSET {2}".format(table_name, limit, offset))
    return rs.fetchall()


def fetch_one_row(table_name, id, pk='id'):
    _check_table(table_name)
    pk = _get_pk(table_name, pk)
    rs = db_session.execute("SELECT * FROM {0} WHERE {1} = {2}".format(table_name, pk, id))
    return rs.fetchone()


class TableNotFoundException(Exception):
    pass
| 29.746667 | 121 | 0.701927 | 307 | 2,231 | 4.824104 | 0.315961 | 0.097232 | 0.050641 | 0.048616 | 0.136394 | 0.086428 | 0.086428 | 0.047265 | 0 | 0 | 0 | 0.010509 | 0.189601 | 2,231 | 74 | 122 | 30.148649 | 0.808628 | 0.079337 | 0 | 0.041667 | 0 | 0.020833 | 0.275384 | 0.095097 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0.0625 | 0.083333 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
84da5879b36efd297ed4f89e12576786ac06a517 | 318 | py | Python | tool.py | LooDaHu/net_drive | 0e265cef6b70d5e3e4e1b0a6db68d465a8ee2fe8 | [
"MIT"
] | 2 | 2019-12-31T07:47:03.000Z | 2020-01-17T00:00:38.000Z | tool.py | LooDaHu/net_drive | 0e265cef6b70d5e3e4e1b0a6db68d465a8ee2fe8 | [
"MIT"
] | null | null | null | tool.py | LooDaHu/net_drive | 0e265cef6b70d5e3e4e1b0a6db68d465a8ee2fe8 | [
"MIT"
] | null | null | null | import socket
import re
def get_local_adr():
    address_set = socket.getaddrinfo(socket.gethostname(), None, family=2)
    for address in address_set:
        # Escape the dots so "." doesn't match arbitrary characters.
        if re.match(r"192\.168\.", address[4][0]):
            local_network_addr = address[4][0]
            return local_network_addr
    return "ADDRESS_NOT_FOUND"
| 26.5 | 74 | 0.669811 | 44 | 318 | 4.613636 | 0.613636 | 0.098522 | 0.08867 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044534 | 0.22327 | 318 | 11 | 75 | 28.909091 | 0.777328 | 0 | 0 | 0 | 0 | 0 | 0.078616 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
84ef25a6781dddc7b93337eee537701562bd82e0 | 886 | py | Python | login_required_middleware.py | IBM/omnia | 588f380e04c697ca9d5ed84822c14b37dad92d27 | [
"Apache-2.0"
] | 1 | 2021-11-25T16:07:38.000Z | 2021-11-25T16:07:38.000Z | login_required_middleware.py | IBM/omnia | 588f380e04c697ca9d5ed84822c14b37dad92d27 | [
"Apache-2.0"
] | null | null | null | login_required_middleware.py | IBM/omnia | 588f380e04c697ca9d5ed84822c14b37dad92d27 | [
"Apache-2.0"
] | null | null | null | from django.contrib.auth.decorators import login_required
from django.urls import reverse
def login_exempt(view):
    view.login_exempt = True
    return view


class LoginRequiredMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        return self.get_response(request)

    def process_view(self, request, view_func, view_args, view_kwargs):
        if getattr(view_func, 'login_exempt', False):
            return
        if request.user.is_authenticated:
            return
        if request.GET.get('siteindex', '') == 'y':
            return
        # Exclude sign in/sign out to prevent a redirect loop
        if request.path == reverse('research:signin') or request.path == reverse('research:signout'):
            return
return login_required(view_func)(request, *view_args, **view_kwargs) | 28.580645 | 101 | 0.662528 | 106 | 886 | 5.292453 | 0.433962 | 0.078431 | 0.080214 | 0.064171 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.244921 | 886 | 31 | 102 | 28.580645 | 0.838565 | 0.044018 | 0 | 0.2 | 0 | 0 | 0.062648 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0.05 | 0.65 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
84ef970c45f901d834ef512adc2e8980501798a3 | 2,296 | py | Python | yoo.py | Bayurzx/translateReadme | 2ffb7ad206b1b8b463dc3859da0281b0a2bbd78c | [
"MIT"
] | null | null | null | yoo.py | Bayurzx/translateReadme | 2ffb7ad206b1b8b463dc3859da0281b0a2bbd78c | [
"MIT"
] | null | null | null | yoo.py | Bayurzx/translateReadme | 2ffb7ad206b1b8b463dc3859da0281b0a2bbd78c | [
"MIT"
] | null | null | null | yoo = {
"af": "Afrikaans",
"sq": "Albanian - shqip",
"am": "Amharic - አማርኛ",
"ar": "Arabic - العربية",
"hy": "Armenian - հայերեն",
"az": "Azerbaijani - azərbaycan dili",
"bn": "Bengali - বাংলা",
"bs": "Bosnian - bosanski",
"bg": "Bulgarian - български",
"ca": "Catalan - català",
"zh": "Chinese - 中文",
"zh-Hans": "Chinese (Simplified) - 中文(简体)",
"zh-Hant": "Chinese (Traditional) - 中文(繁體)",
"hr": "Croatian - hrvatski",
"cs": "Czech - čeština",
"da": "Danish - dansk",
"nl": "Dutch - Nederlands",
"en": "English",
"et": "Estonian - eesti",
"fj": "Fijian - føroyskt",
"fil": "Filipino",
"fi": "Finnish - suomi",
"fr": "French - français",
"de": "German - Deutsch",
"el": "Greek - Ελληνικά",
"gu": "Gujarati - ગુજરાતી",
"ht": "Haitian Creole",
"he": "Hebrew - עברית",
"hi": "Hindi - हिन्दी",
"mww": "Hmong Daw",
"hu": "Hungarian - magyar",
"is": "Icelandic - íslenska",
"id": "Indonesian - Indonesia",
"iu": "Inuktitut",
"ga": "Irish - Gaeilge",
"it": "Italian - italiano",
"ja": "Japanese - 日本語",
"kn": "Kannada - ಕನ್ನಡ",
"kk": "Kazakh - қазақ тілі",
"km": "Khmer - ខ្មែរ",
"ko": "Korean - 한국어",
"ku": "Kurdish - Kurdî",
"lo": "Lao - ລາວ",
"la": "Latin",
"lv": "Latvian - latviešu",
"lt": "Lithuanian - lietuvių",
"mg": "Malagasy - मराठी",
"ms": "Malay - Bahasa Melayu",
"ml": "Malayalam - മലയാളം",
"mt": "Maltese - Malti",
"mn": "Mongolian - монгол",
"ne": "Nepali - नेपाली",
"no": "Norwegian - norsk",
"or": "Oriya - ଓଡ଼ିଆ",
"ps": "Pashto - پښتو",
"fa": "Persian - فارسی",
"pl": "Polish - polski",
"pt": "Portuguese - português",
"pt-pt": "Portuguese (Portugal)",
"pa": "Punjabi - ਪੰਜਾਬੀ",
"ro": "Romanian - română",
"ru": "Russian - русский",
"sr-Cyrl": "Serbian - српски",
"sr-Latn": "Serbian - српски",
"sd": "Sindhi",
"si": "Sinhala - සිංහල",
"sk": "Slovak - slovenčina",
"sl": "Slovenian - slovenščina",
"es": "Spanish - español",
"sw": "Swahili - Kiswahili",
"sv": "Swedish - svenska",
"tg": "Tajik - тоҷикӣ",
"ta": "Tamil - தமிழ்",
"tt": "Tatar",
"te": "Telugu - తెలుగు",
"th": "Thai - ไทย",
"ti": "Tigrinya - ትግርኛ",
"to": "Tongan - lea fakatonga",
"tr": "Turkish - Türkçe",
"uk": "Ukrainian - українська",
"ur": "Urdu - اردو",
"vi": "Vietnamese - Tiếng Việt",
"cy": "Welsh - Cymraeg",
}
def language_name(lang):
    return yoo[lang]


if __name__ == "__main__":
    import sys

    # Look up the language name for a code passed on the command line.
    print(language_name(sys.argv[1]))
| 24.688172 | 44 | 0.574042 | 317 | 2,296 | 4.223975 | 0.927445 | 0.017924 | 0.023898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150697 | 2,296 | 92 | 45 | 24.956522 | 0.670769 | 0 | 0 | 0 | 0 | 0 | 0.666376 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011236 | false | 0 | 0 | 0.011236 | 0.022472 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84f0fd3aaa5e1acb4ea3de86e12e62747b00b241 | 2,508 | py | Python | spacer/tests/test_train_classifier.py | beijbom/PySpacer | 09ec5d41aa3d239e252e351b18a0835956c21903 | [
"MIT"
] | 3 | 2020-03-09T00:30:20.000Z | 2021-03-08T22:30:19.000Z | spacer/tests/test_train_classifier.py | beijbom/PySpacer | 09ec5d41aa3d239e252e351b18a0835956c21903 | [
"MIT"
] | 28 | 2020-02-28T17:19:07.000Z | 2022-03-25T04:59:26.000Z | spacer/tests/test_train_classifier.py | beijbom/PySpacer | 09ec5d41aa3d239e252e351b18a0835956c21903 | [
"MIT"
] | null | null | null | import random
import unittest
import numpy as np
from spacer import config
from spacer.messages import DataLocation
from spacer.train_classifier import trainer_factory
from spacer.train_utils import make_random_data, train
@unittest.skipUnless(config.HAS_S3_TEST_ACCESS, 'No access to test bucket')
class TestDefaultTrainerDummyData(unittest.TestCase):
    def setUp(self):
        config.filter_warnings()
        np.random.seed(0)
        random.seed(0)

    def test_simple(self):
        n_valdata = 20
        n_traindata = 200
        points_per_image = 20
        feature_dim = 5
        class_list = [1, 2]
        num_epochs = 4

        # First create data to train on.
        feature_loc = DataLocation(storage_type='memory', key='')
        train_data = make_random_data(n_traindata,
                                      class_list,
                                      points_per_image,
                                      feature_dim,
                                      feature_loc)
        val_data = make_random_data(n_valdata,
                                    class_list,
                                    points_per_image,
                                    feature_dim,
                                    feature_loc)

        trainer = trainer_factory('minibatch')
        for clf_type in config.CLASSIFIER_TYPES:
            pc_clf1, _ = train(train_data, feature_loc, 1, clf_type)
            pc_clf2, _ = train(train_data, feature_loc, 1, clf_type)
            clf, val_results, return_message = trainer(train_data,
                                                       val_data,
                                                       num_epochs,
                                                       [pc_clf1, pc_clf2],
                                                       feature_loc,
                                                       clf_type)

            # The way we rendered the data, accuracy is usually around 90%.
            # Adding some margin to account for randomness.
            # TODO: fix random seed; somehow the seed set above didn't work.
            self.assertGreater(return_message.acc,
                               0.75,
                               "Failure may be due to randomly generated numbers; "
                               "re-run tests.")
            self.assertEqual(len(return_message.pc_accs), 2)
            self.assertEqual(len(return_message.ref_accs), num_epochs)


if __name__ == '__main__':
    unittest.main()
| 37.432836 | 80 | 0.508373 | 251 | 2,508 | 4.788845 | 0.458167 | 0.049917 | 0.034942 | 0.02995 | 0.207987 | 0.124792 | 0.124792 | 0.124792 | 0.071547 | 0 | 0 | 0.018271 | 0.432616 | 2,508 | 66 | 81 | 38 | 0.826423 | 0.07815 | 0 | 0.163265 | 0 | 0 | 0.046381 | 0 | 0 | 0 | 0 | 0.015152 | 0.061224 | 1 | 0.040816 | false | 0 | 0.142857 | 0 | 0.204082 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
84fcdea20c3ac270c0917795bed84e0caff0bc61 | 891 | py | Python | src/harness/wires/base.py | vmagamedov/harness | 0e9d64295f937aa4476dbe5f084e80a3783edce7 | [
"BSD-3-Clause"
] | 6 | 2020-03-26T16:49:54.000Z | 2022-01-13T09:13:40.000Z | src/harness/wires/base.py | vmagamedov/harness | 0e9d64295f937aa4476dbe5f084e80a3783edce7 | [
"BSD-3-Clause"
] | 1 | 2020-03-14T16:47:51.000Z | 2020-03-14T16:47:51.000Z | src/harness/wires/base.py | vmagamedov/harness | 0e9d64295f937aa4476dbe5f084e80a3783edce7 | [
"BSD-3-Clause"
] | null | null | null | import asyncio
from types import TracebackType
from typing import Optional, Type, Any
class Wire:
    def configure(self, value: Any) -> None:
        pass

    async def __aenter__(self) -> None:
        pass

    async def __aexit__(
        self,
        exc_type: Optional[Type[BaseException]],
        exc_val: Optional[BaseException],
        exc_tb: Optional[TracebackType],
    ) -> None:
        self.close()
        await self.wait_closed()

    def close(self) -> None:
        pass

    async def wait_closed(self) -> None:
        pass


class WaitMixin:
    _event: asyncio.Event

    def close(self) -> None:
        if not hasattr(self, "_event"):
            self._event = asyncio.Event()
        self._event.set()

    async def wait_closed(self) -> None:
        if not hasattr(self, "_event"):
            self._event = asyncio.Event()
        await self._event.wait()
| 21.731707 | 48 | 0.59596 | 103 | 891 | 4.951456 | 0.330097 | 0.105882 | 0.076471 | 0.094118 | 0.345098 | 0.282353 | 0.196078 | 0.196078 | 0.196078 | 0.196078 | 0 | 0 | 0.298541 | 891 | 40 | 49 | 22.275 | 0.816 | 0 | 0 | 0.4 | 0 | 0 | 0.013468 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.133333 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
84fd59316076ee88ed20da5294439d509f4f0499 | 1,377 | py | Python | tickets/migrations/0011_auto_20190804_2139.py | jdevera/pythoncanarias_web | 465e8b0a054726e29b1029f1dffe11f913e40bcc | [
"MIT"
] | 5 | 2018-08-14T14:59:38.000Z | 2020-06-14T14:56:21.000Z | tickets/migrations/0011_auto_20190804_2139.py | jdevera/pythoncanarias_web | 465e8b0a054726e29b1029f1dffe11f913e40bcc | [
"MIT"
] | 189 | 2018-07-22T19:42:05.000Z | 2021-05-24T21:37:48.000Z | tickets/migrations/0011_auto_20190804_2139.py | jdevera/pythoncanarias_web | 465e8b0a054726e29b1029f1dffe11f913e40bcc | [
"MIT"
] | 11 | 2018-07-30T16:11:40.000Z | 2020-10-18T20:40:53.000Z | # Generated by Django 2.2.4 on 2019-08-04 20:39
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    dependencies = [
        ('tickets', '0010_auto_20190804_1818'),
    ]

    operations = [
        migrations.CreateModel(
            name='Gift',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=256)),
                ('description', models.TextField(blank=True)),
                ('awarded_at', models.DateTimeField(blank=True, null=True)),
                ('awarded_ticket', models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='gift', to='tickets.Ticket')),
            ],
            options={
                'ordering': ['awarded_at', 'name', 'description'],
            },
        ),
        migrations.AlterModelOptions(
            name='raffle',
            options={'ordering': ['created_at']},
        ),
        migrations.DeleteModel(
            name='Present',
        ),
        migrations.AddField(
            model_name='gift',
            name='raffle',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='gifts', to='tickets.Raffle'),
        ),
    ]
| 34.425 | 167 | 0.574437 | 134 | 1,377 | 5.783582 | 0.5 | 0.04129 | 0.054194 | 0.085161 | 0.123871 | 0.123871 | 0.123871 | 0.123871 | 0.123871 | 0 | 0 | 0.034518 | 0.284677 | 1,377 | 39 | 168 | 35.307692 | 0.752284 | 0.03268 | 0 | 0.181818 | 1 | 0 | 0.141353 | 0.017293 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.060606 | 0 | 0.151515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ca0d6e949501a9ffb2556c5d406d3b3f90523bdb | 733 | py | Python | apps/events/migrations/0005_auto_20180312_1245.py | Strand94/WhatsMappening | 92a297d531d0fd34940ec8b9ea035bbde00a1b24 | [
"MIT"
] | null | null | null | apps/events/migrations/0005_auto_20180312_1245.py | Strand94/WhatsMappening | 92a297d531d0fd34940ec8b9ea035bbde00a1b24 | [
"MIT"
] | 5 | 2018-05-14T18:18:20.000Z | 2021-06-10T20:29:51.000Z | apps/events/migrations/0005_auto_20180312_1245.py | Strand94/WhatsMappening | 92a297d531d0fd34940ec8b9ea035bbde00a1b24 | [
"MIT"
] | null | null | null | # Generated by Django 2.0.2 on 2018-03-12 11:45
import datetime
import django.contrib.gis.db.models.fields
from django.db import migrations, models
from django.utils.timezone import utc
class Migration(migrations.Migration):
    dependencies = [
        ('events', '0004_auto_20180309_1804'),
    ]

    operations = [
        migrations.AddField(
            model_name='event',
            name='location',
            field=django.contrib.gis.db.models.fields.PointField(null=True, srid=4326),
        ),
        migrations.AlterField(
            model_name='event',
            name='timestamp',
            field=models.DateTimeField(default=datetime.datetime(2018, 3, 12, 11, 45, 2, 775370, tzinfo=utc)),
        ),
    ]
| 27.148148 | 110 | 0.630286 | 85 | 733 | 5.376471 | 0.588235 | 0.017505 | 0.026258 | 0.078775 | 0.131291 | 0.131291 | 0 | 0 | 0 | 0 | 0 | 0.096539 | 0.251023 | 733 | 26 | 111 | 28.192308 | 0.735883 | 0.061392 | 0 | 0.2 | 1 | 0 | 0.081633 | 0.033528 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.35 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |