hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4ed9066c72e27b2452eb11f959682224bfe19f4e | 7,727 | py | Python | state_workflow_sdk/api/state_workflow/filterInstanceOfStateWorkflow_pb2.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | 5 | 2019-07-31T04:11:05.000Z | 2021-01-07T03:23:20.000Z | state_workflow_sdk/api/state_workflow/filterInstanceOfStateWorkflow_pb2.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | null | null | null | state_workflow_sdk/api/state_workflow/filterInstanceOfStateWorkflow_pb2.py | easyopsapis/easyops-api-python | adf6e3bad33fa6266b5fa0a449dd4ac42f8447d0 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: filterInstanceOfStateWorkflow.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import struct_pb2 as google_dot_protobuf_dot_struct__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='filterInstanceOfStateWorkflow.proto',
package='state_workflow',
syntax='proto3',
serialized_options=None,
serialized_pb=_b('\n#filterInstanceOfStateWorkflow.proto\x12\x0estate_workflow\x1a\x1cgoogle/protobuf/struct.proto\"`\n$FilterInstanceOfStateWorkflowRequest\x12\x10\n\x08objectId\x18\x01 \x01(\t\x12&\n\x05query\x18\x02 \x01(\x0b\x32\x17.google.protobuf.Struct\";\n%FilterInstanceOfStateWorkflowResponse\x12\x12\n\ninstanceId\x18\x01 \x01(\t\"\xa5\x01\n,FilterInstanceOfStateWorkflowResponseWrapper\x12\x0c\n\x04\x63ode\x18\x01 \x01(\x05\x12\x13\n\x0b\x63odeExplain\x18\x02 \x01(\t\x12\r\n\x05\x65rror\x18\x03 \x01(\t\x12\x43\n\x04\x64\x61ta\x18\x04 \x01(\x0b\x32\x35.state_workflow.FilterInstanceOfStateWorkflowResponseb\x06proto3')
,
dependencies=[google_dot_protobuf_dot_struct__pb2.DESCRIPTOR,])
_FILTERINSTANCEOFSTATEWORKFLOWREQUEST = _descriptor.Descriptor(
name='FilterInstanceOfStateWorkflowRequest',
full_name='state_workflow.FilterInstanceOfStateWorkflowRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='objectId', full_name='state_workflow.FilterInstanceOfStateWorkflowRequest.objectId', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='query', full_name='state_workflow.FilterInstanceOfStateWorkflowRequest.query', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=85,
serialized_end=181,
)
_FILTERINSTANCEOFSTATEWORKFLOWRESPONSE = _descriptor.Descriptor(
name='FilterInstanceOfStateWorkflowResponse',
full_name='state_workflow.FilterInstanceOfStateWorkflowResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='instanceId', full_name='state_workflow.FilterInstanceOfStateWorkflowResponse.instanceId', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=183,
serialized_end=242,
)
_FILTERINSTANCEOFSTATEWORKFLOWRESPONSEWRAPPER = _descriptor.Descriptor(
name='FilterInstanceOfStateWorkflowResponseWrapper',
full_name='state_workflow.FilterInstanceOfStateWorkflowResponseWrapper',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='code', full_name='state_workflow.FilterInstanceOfStateWorkflowResponseWrapper.code', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='codeExplain', full_name='state_workflow.FilterInstanceOfStateWorkflowResponseWrapper.codeExplain', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='error', full_name='state_workflow.FilterInstanceOfStateWorkflowResponseWrapper.error', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='data', full_name='state_workflow.FilterInstanceOfStateWorkflowResponseWrapper.data', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=245,
serialized_end=410,
)
_FILTERINSTANCEOFSTATEWORKFLOWREQUEST.fields_by_name['query'].message_type = google_dot_protobuf_dot_struct__pb2._STRUCT
_FILTERINSTANCEOFSTATEWORKFLOWRESPONSEWRAPPER.fields_by_name['data'].message_type = _FILTERINSTANCEOFSTATEWORKFLOWRESPONSE
DESCRIPTOR.message_types_by_name['FilterInstanceOfStateWorkflowRequest'] = _FILTERINSTANCEOFSTATEWORKFLOWREQUEST
DESCRIPTOR.message_types_by_name['FilterInstanceOfStateWorkflowResponse'] = _FILTERINSTANCEOFSTATEWORKFLOWRESPONSE
DESCRIPTOR.message_types_by_name['FilterInstanceOfStateWorkflowResponseWrapper'] = _FILTERINSTANCEOFSTATEWORKFLOWRESPONSEWRAPPER
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
FilterInstanceOfStateWorkflowRequest = _reflection.GeneratedProtocolMessageType('FilterInstanceOfStateWorkflowRequest', (_message.Message,), {
'DESCRIPTOR' : _FILTERINSTANCEOFSTATEWORKFLOWREQUEST,
'__module__' : 'filterInstanceOfStateWorkflow_pb2'
# @@protoc_insertion_point(class_scope:state_workflow.FilterInstanceOfStateWorkflowRequest)
})
_sym_db.RegisterMessage(FilterInstanceOfStateWorkflowRequest)
FilterInstanceOfStateWorkflowResponse = _reflection.GeneratedProtocolMessageType('FilterInstanceOfStateWorkflowResponse', (_message.Message,), {
'DESCRIPTOR' : _FILTERINSTANCEOFSTATEWORKFLOWRESPONSE,
'__module__' : 'filterInstanceOfStateWorkflow_pb2'
# @@protoc_insertion_point(class_scope:state_workflow.FilterInstanceOfStateWorkflowResponse)
})
_sym_db.RegisterMessage(FilterInstanceOfStateWorkflowResponse)
FilterInstanceOfStateWorkflowResponseWrapper = _reflection.GeneratedProtocolMessageType('FilterInstanceOfStateWorkflowResponseWrapper', (_message.Message,), {
'DESCRIPTOR' : _FILTERINSTANCEOFSTATEWORKFLOWRESPONSEWRAPPER,
'__module__' : 'filterInstanceOfStateWorkflow_pb2'
# @@protoc_insertion_point(class_scope:state_workflow.FilterInstanceOfStateWorkflowResponseWrapper)
})
_sym_db.RegisterMessage(FilterInstanceOfStateWorkflowResponseWrapper)
# @@protoc_insertion_point(module_scope)
| 42.690608 | 634 | 0.792546 | 789 | 7,727 | 7.457541 | 0.185044 | 0.032631 | 0.039259 | 0.03569 | 0.524813 | 0.408736 | 0.381713 | 0.368457 | 0.361999 | 0.361999 | 0 | 0.026676 | 0.102498 | 7,727 | 180 | 635 | 42.927778 | 0.821774 | 0.061343 | 0 | 0.605263 | 1 | 0.006579 | 0.259939 | 0.234125 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.039474 | 0 | 0.039474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4edb091a886897f03cb82ebe1a3cbf9db99936e7 | 296 | py | Python | Anima.py | Daz-Riza-Seriog/Op_Heat_transfer | 3ee27a049a2de5deaf13356ed3e675562233ac29 | [
"MIT"
] | null | null | null | Anima.py | Daz-Riza-Seriog/Op_Heat_transfer | 3ee27a049a2de5deaf13356ed3e675562233ac29 | [
"MIT"
] | null | null | null | Anima.py | Daz-Riza-Seriog/Op_Heat_transfer | 3ee27a049a2de5deaf13356ed3e675562233ac29 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import numpy as np
sns.set()
path = r'C:\Users\HP\PycharmProjects\Operaciones_Calor\Data.xlsx'
df = pd.read_excel("Data.xlsx")
df2 = pd.read_excel("time.xlsx")
df3 = np.transpose(df)
ax = sns.heatmap(data=df3)
plt.show() | 22.769231 | 65 | 0.75 | 52 | 296 | 4.211538 | 0.615385 | 0.073059 | 0.100457 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01145 | 0.114865 | 296 | 13 | 66 | 22.769231 | 0.824427 | 0 | 0 | 0 | 0 | 0 | 0.245791 | 0.185185 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
14c8cbde65a563f12d5db26bfb9fc708e55a51a3 | 738 | py | Python | Examples/BasicUsage/WorkingWithFolder/MoveFolder.py | groupdocs-annotation-cloud/groupdocs-annotation-cloud-python-samples | 5fb6c88d0e173198753d8483ea0a75606479fa41 | [
"MIT"
] | null | null | null | Examples/BasicUsage/WorkingWithFolder/MoveFolder.py | groupdocs-annotation-cloud/groupdocs-annotation-cloud-python-samples | 5fb6c88d0e173198753d8483ea0a75606479fa41 | [
"MIT"
] | null | null | null | Examples/BasicUsage/WorkingWithFolder/MoveFolder.py | groupdocs-annotation-cloud/groupdocs-annotation-cloud-python-samples | 5fb6c88d0e173198753d8483ea0a75606479fa41 | [
"MIT"
] | 2 | 2019-07-08T12:50:55.000Z | 2019-07-08T13:21:54.000Z | # Import modules
import groupdocs_annotation_cloud
from Common import Common
class MoveFolder:
@classmethod
def Run(cls):
# Create instance of the API
api = groupdocs_annotation_cloud.FolderApi.from_config(Common.GetConfig())
try:
request = groupdocs_annotation_cloud.MoveFolderRequest("annotationdocs1", "annotationdocs1\\annotationdocs", Common.myStorage, Common.myStorage)
api.move_folder(request)
print("Expected response type is Void: 'annotationdocs1' folder moved to 'annotationdocs/annotationdocs1'.")
except groupdocs_annotation_cloud.ApiException as e:
print("Exception while calling API: {0}".format(e.message)) | 41 | 156 | 0.697832 | 75 | 738 | 6.733333 | 0.626667 | 0.150495 | 0.190099 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008726 | 0.223577 | 738 | 18 | 157 | 41 | 0.8726 | 0.055556 | 0 | 0 | 0 | 0 | 0.254676 | 0.092086 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.333333 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14ca45583c2227b2168e4f681b76f28fa240e335 | 1,203 | py | Python | ex098.py | paulo-caixeta/Exercicios_Curso_Python | 3b77925499c174ea9ff81dec65d6319125219b9a | [
"MIT"
] | null | null | null | ex098.py | paulo-caixeta/Exercicios_Curso_Python | 3b77925499c174ea9ff81dec65d6319125219b9a | [
"MIT"
] | null | null | null | ex098.py | paulo-caixeta/Exercicios_Curso_Python | 3b77925499c174ea9ff81dec65d6319125219b9a | [
"MIT"
] | null | null | null | """
Exercício Python 098: Faça um programa que tenha uma função chamada contador(), que receba três parâmetros: início, fim e passo.
Seu programa tem que realizar três contagens através da função criada:
a) de 1 até 10, de 1 em 1
b) de 10 até 0, de 2 em 2
c) uma contagem personalizada
"""
from time import sleep
def linha():
sleep(0.5)
print('-=' * 20)
def escreva(txt):
caracteres = len(txt)
print('=' * (caracteres + 4))
print(f' {txt}')
print('=' * (caracteres + 4))
def contador(i, f, p):
if p == 0:
p = 1
elif p < 0:
p = -p
print(f'Contagem de {i} até {f} de {p} em {p}:')
if f > i:
for v in range(i, f+1, p):
print(f'{v}', end=' ')
sleep(0.1)
print('FIM!')
elif f < i:
for v in range(i, f-1, -p):
print(f'{v}', end=' ')
sleep(0.1)
print('FIM!')
# Main program:
escreva('CONTADOR')
linha()
contador(1, 10, 1)
linha()
contador(10, 0, 2)
linha()
# Read the custom parameters:
print(f'Agora é sua vez de personalizar a contagem!')
início = int(input('Início: '))
fim = int(input('Fim: '))
passo = int(input('Passo: '))
contador(início, fim, passo)
linha()
| 23.588235 | 128 | 0.564422 | 189 | 1,203 | 3.592593 | 0.375661 | 0.044183 | 0.030928 | 0.055965 | 0.123711 | 0.123711 | 0.123711 | 0.123711 | 0.123711 | 0.123711 | 0 | 0.041049 | 0.270989 | 1,203 | 50 | 129 | 24.06 | 0.733181 | 0.266833 | 0 | 0.324324 | 0 | 0 | 0.155785 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0.054054 | 0.027027 | 0 | 0.108108 | 0.27027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
14cc5e75bc05398b3bb53a79b496548b99c38a35 | 327 | py | Python | direct/nn/varnet/config.py | cnmy-ro/direct-custom | a354e82d4f4b7598037e7b9dc73456fc361820ac | [
"Apache-2.0"
] | 57 | 2021-12-21T23:11:46.000Z | 2022-03-26T23:25:36.000Z | direct/nn/varnet/config.py | cnmy-ro/direct-custom | a354e82d4f4b7598037e7b9dc73456fc361820ac | [
"Apache-2.0"
] | 102 | 2020-06-07T19:10:06.000Z | 2021-12-18T11:34:27.000Z | direct/nn/varnet/config.py | cnmy-ro/direct-custom | a354e82d4f4b7598037e7b9dc73456fc361820ac | [
"Apache-2.0"
] | 24 | 2020-06-06T12:06:58.000Z | 2021-12-21T10:42:50.000Z | # coding=utf-8
# Copyright (c) DIRECT Contributors
from dataclasses import dataclass
from direct.config.defaults import ModelConfig
@dataclass
class EndToEndVarNetConfig(ModelConfig):
num_layers: int = 8
regularizer_num_filters: int = 18
regularizer_num_pull_layers: int = 4
regularizer_dropout: float = 0.0
| 23.357143 | 46 | 0.7737 | 41 | 327 | 6 | 0.658537 | 0.073171 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025547 | 0.16208 | 327 | 13 | 47 | 25.153846 | 0.872263 | 0.140673 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14d7c8670ef0ff23366e664847eb0ddc0db4e76b | 4,924 | py | Python | src/lib/extra.py | bentherien/measevalcompetition | 1d285991eb26403682a633a728629a9900923d80 | [
"MIT"
] | null | null | null | src/lib/extra.py | bentherien/measevalcompetition | 1d285991eb26403682a633a728629a9900923d80 | [
"MIT"
] | null | null | null | src/lib/extra.py | bentherien/measevalcompetition | 1d285991eb26403682a633a728629a9900923d80 | [
"MIT"
] | null | null | null | import src.exerptController
import networkx as nx
def getGraphPaths(sentences, sourceSpan, targetSpan, getSrc, getTrg, reverse=False, pos=False, docName = ""):
nonePassed = False
    if sentences is None:
        print("sentences passed to getGraphPaths() as None")
        nonePassed = True
    if sourceSpan is None:
        print("quantity passed to getGraphPaths() as None")
        nonePassed = True
    if targetSpan is None:
        print("measuredProperty passed to getGraphPaths() as None")
        nonePassed = True
if(nonePassed):
return None, None
if len(sentences) != 1:
        # can't handle more than one sentence, so bail out
#print("more than one sentence")
return None, None
edges = []
for sent in sentences:
for token in sent:
for child in token.children:
edges.append(((token.text,token,token.dep_,token.i),
(child.text,child,child.dep_,child.i)))
graph = nx.Graph(edges)
path = []
src = getSrc(sourceSpan)
trg = getTrg(targetSpan)
    if src is None or trg is None:
return None, None
source = (src.text, src, src.dep_,src.i)
target = (trg.text, trg, trg.dep_,trg.i)
try:
shortestPath = nx.shortest_path(graph, source=source, target=target)
except nx.exception.NodeNotFound:
print("docname",docName)
print("source span",sourceSpan)
print("target span",targetSpan)
print(f"Either source {source} or target {target} is not in G")
print(sentences)
return None, None
if(len(shortestPath) < 2):
print("error path shorter than 2")
print("Source:", source, "target:",target)
print("shortestpath:",shortestPath)
print("docName:",docName)
return shortestPath, {"source":src,"target":trg}
def getPathFreq(cont,source="Quantity",target="MeasuredEntity"):
def getSource(srcSpan):
temp = []
for tok in srcSpan:
if tok.tag_ == "CD":
temp.append(tok)
if temp == []:
#print("No CD found in",[(x,x.pos_) for x in srcSpan])
for tok in srcSpan:
#print("$$Source replaced with:",(tok,tok.pos_), "span:",[(x,x.pos_) for x in srcSpan])
return tok
else:
return temp[-1]
def getTargetME(trgSpan):
for x in reversed(trgSpan):
if(x.pos_ in ["NOUN","PROPN"]):
return x
if(len(trgSpan) == 1):
for tok in trgSpan:
print("@@Target replaced with:",(tok,tok.pos_), "span:",[(x,x.pos_) for x in trgSpan])
return tok
#print("No noun found in",[(x,x.pos_) for x in trgSpan])
return None
def getTarget(trgSpan):
for x in reversed(trgSpan):
if(x.pos_ in ["NOUN","PROPN"]):
return x
if(len(trgSpan) == 1):
for tok in trgSpan:
#print("@@Target replaced with:",(tok,tok.pos_), "span:",[(x,x.pos_) for x in trgSpan])
return tok
#print("No noun found in",[(x,x.pos_) for x in trgSpan])
return None
accumPaths = []
for e in cont.data.values():
for x in e.doc._.meAnnots.values():
try:
path, info = getGraphPaths( x["sentences"], x[source], x[target],getSource, getTarget, reverse=False, pos=False, docName=e.name)
                if path is None or info is None:
continue
accumPaths.append({
"path":path,
"doc":e.name,
"source":info["source"],
"target":info["target"]
})
except KeyError:
pass
def getPath(l1):
#print(l1)
x=0
path = []
while(x<len(l1)-1):
if(l1[x][1].head.text == l1[x+1][1].text):
#if x+1 is governor of x then
path.append(l1[x][1].dep_)
elif(l1[x][1].text == l1[x+1][1].head.text):
path.append("r"+l1[x+1][1].dep_)
else:
print("error")
x+=1
#print(",".join(path))
return tuple(path),l1[0][1].sent
d={}
for path in accumPaths:
        if path:
tempPath, tempSent = getPath(path["path"])
del path["path"]
path["sent"] = tempSent
try:
d[tempPath][0] += 1
d[tempPath][1].append(path)
except KeyError:
d[tempPath] = [1,[path]]
return {k: v for k, v in sorted(d.items(), key=lambda item : item[1][0],reverse=True)}
| 28.298851 | 144 | 0.50792 | 579 | 4,924 | 4.284974 | 0.221071 | 0.01451 | 0.021765 | 0.019347 | 0.279323 | 0.234986 | 0.234986 | 0.229343 | 0.204756 | 0.166868 | 0 | 0.012003 | 0.357027 | 4,924 | 173 | 145 | 28.462428 | 0.771636 | 0.094639 | 0 | 0.289474 | 0 | 0 | 0.093771 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.078947 | 0.017544 | 0 | 0.201754 | 0.122807 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
14dfb2340b865c6fae4d40bddd82ee9333c4b009 | 1,024 | py | Python | (Ex4)_Neural_Network_model_implementation/initializeWeight.py | HarryPham0123/Coursera_Machine_learning_AndrewNg | ae1fa34969fa0dafd44aa6606f6749c09b447239 | [
"MIT"
] | 1 | 2021-11-10T07:23:30.000Z | 2021-11-10T07:23:30.000Z | (Ex4)_Neural_Network_model_implementation/initializeWeight.py | HarryPham0123/Coursera_Machine_learning_AndrewNg | ae1fa34969fa0dafd44aa6606f6749c09b447239 | [
"MIT"
] | null | null | null | (Ex4)_Neural_Network_model_implementation/initializeWeight.py | HarryPham0123/Coursera_Machine_learning_AndrewNg | ae1fa34969fa0dafd44aa6606f6749c09b447239 | [
"MIT"
] | null | null | null | import numpy as np
# TODO: Initialize weights with fixed (Can not change) dimension -> Useful for debugging (NOT DONE)
# "fan_in": # of nodes in layer "l"
# "fan_out": # of nodes in layer "l+1"
def debug_initialize_weight(fan_in, fan_out):
W = np.zeros((fan_out, fan_in+1))
W = np.reshape(range(len(W.ravel(order="F"))), W.shape)/10
return W
# TODO: Initialize randomly weight of layer with L_in (#_nodes layer l) and L_out (#_nodes layer l+1))
def random_initialize_weight(L_in, L_out):
# Confirm the size of weight parameters
W = np.zeros((L_out, L_in+1))
# Initialize random weight W in [-epsilon, epsilon]
epsilon = 0.1
W = np.random.rand(L_out, L_in+1) * (2*epsilon) - epsilon
return W
# (NOTE) Difference between np.flatten() (1) & np.ravel() (2) (Both outputs the same values)
# "np.flatten()" always return a copy
# "np.ravel()": Return view of original array. If I modify array returned by ravel, it may modify the entries in the
# original array
| 37.925926 | 117 | 0.668945 | 171 | 1,024 | 3.888889 | 0.421053 | 0.03609 | 0.027068 | 0.042105 | 0.069173 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014797 | 0.208008 | 1,024 | 26 | 118 | 39.384615 | 0.805179 | 0.595703 | 0 | 0.2 | 0 | 0 | 0.002681 | 0 | 0 | 0 | 0 | 0.038462 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14e4c678bbb5e1b2e735a299680f167c2f0679dc | 603 | py | Python | tests/unit/simulation/test_reset.py | AlexanderGrooff/discrete-event-simulation | 2160cebfd6106d4a1299b3a2ff5480affa63aed0 | [
"MIT"
] | null | null | null | tests/unit/simulation/test_reset.py | AlexanderGrooff/discrete-event-simulation | 2160cebfd6106d4a1299b3a2ff5480affa63aed0 | [
"MIT"
] | null | null | null | tests/unit/simulation/test_reset.py | AlexanderGrooff/discrete-event-simulation | 2160cebfd6106d4a1299b3a2ff5480affa63aed0 | [
"MIT"
] | null | null | null | from simulation.framework import DiscreteSimulation
from tests.helpers import TestCase
class TestSimulationFramework(TestCase):
def setUp(self):
self.sim = DiscreteSimulation(max_duration=8, available_actions=[])
def test_reset_creates_new_timeline(self):
original_timeline = self.sim.timeline
self.sim.reset()
self.assertNotEqual(self.sim.timeline, original_timeline)
def test_reset_applies_initial_values(self):
self.sim.reset(initial_values={"water": 5000})
self.assertDictEqual(self.sim.timeline.current_state.values, {"water": 5000})
| 35.470588 | 85 | 0.74461 | 70 | 603 | 6.214286 | 0.485714 | 0.096552 | 0.103448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017717 | 0.157546 | 603 | 16 | 86 | 37.6875 | 0.838583 | 0 | 0 | 0 | 0 | 0 | 0.016584 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.25 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14ea36dc58776c58fc1d52d85063b3ce57d19adb | 184 | py | Python | src/main.py | M3nin0/PyBot | f2e4237d9d0f66eb070a5a7d3f2e503d75ca70f5 | [
"BSD-2-Clause"
] | null | null | null | src/main.py | M3nin0/PyBot | f2e4237d9d0f66eb070a5a7d3f2e503d75ca70f5 | [
"BSD-2-Clause"
] | null | null | null | src/main.py | M3nin0/PyBot | f2e4237d9d0f66eb070a5a7d3f2e503d75ca70f5 | [
"BSD-2-Clause"
] | null | null | null |
from ChatBot import ChatBot
Bot = ChatBot("Felipe")
while True:
frase = Bot.escuta()
resp = Bot.pensa(frase)
Bot.fala(resp)
if resp == "tchau":
break
| 12.266667 | 27 | 0.581522 | 23 | 184 | 4.652174 | 0.652174 | 0.149533 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.298913 | 184 | 14 | 28 | 13.142857 | 0.829457 | 0 | 0 | 0 | 0 | 0 | 0.060109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14ea947c43c04f7bd20b6be9cbe2dc86a9730dfe | 507 | py | Python | bazaar/decorators.py | Jonas-Quinn/deliverance | 9a99cf9d24a4711dc055f7578df0ba48bdc9bbee | [
"MIT"
] | 1 | 2020-02-11T07:25:47.000Z | 2020-02-11T07:25:47.000Z | bazaar/decorators.py | Jonas-Quinn/deliverance | 9a99cf9d24a4711dc055f7578df0ba48bdc9bbee | [
"MIT"
] | 9 | 2020-02-27T22:40:07.000Z | 2022-03-12T00:14:39.000Z | bazaar/decorators.py | Jonas-Quinn/deliverance | 9a99cf9d24a4711dc055f7578df0ba48bdc9bbee | [
"MIT"
] | null | null | null | from django.core.exceptions import PermissionDenied
from django.utils import timezone
from .models import Item
from datetime import datetime
def active_auction(function):
def wrap(request, *args, **kwargs):
item = Item.objects.get(slug=kwargs['slug'])
if item.end_of_auction > timezone.now():
return function(request, *args, **kwargs)
else:
raise PermissionDenied
wrap.__doc__ = function.__doc__
wrap.__name__ = function.__name__
return wrap | 31.6875 | 53 | 0.696252 | 60 | 507 | 5.566667 | 0.516667 | 0.05988 | 0.101796 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21499 | 507 | 16 | 54 | 31.6875 | 0.839196 | 0 | 0 | 0 | 0 | 0 | 0.007874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
14ebdfe49e8339b78e7bbac1a688d3fec00e758b | 1,435 | py | Python | examples/storageDownloadFile.py | topcats/python-o365 | 0378cee0120279d195a18b14bf8ca46577990417 | [
"Apache-2.0"
] | null | null | null | examples/storageDownloadFile.py | topcats/python-o365 | 0378cee0120279d195a18b14bf8ca46577990417 | [
"Apache-2.0"
] | 1 | 2021-03-03T00:49:55.000Z | 2021-03-03T00:49:55.000Z | examples/storageDownloadFile.py | topcats/python-o365 | 0378cee0120279d195a18b14bf8ca46577990417 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python3
# To generate an Office365 token:
# python3
# from O365 import Account
# account = Account(credentials=('yourregisteredappname', 'yoursecret'))
# account.authenticate(scopes=['files.read', 'user.read', 'offline_access'])
# It will return a URL, go to this in a browser, accept the permissions, then paste in the URL you are redirected to
# YOU MAY HAVE TO SWITCH TO THE 'OLD' VIEW TO DO THIS!
import pandas as pd
from O365 import Account
# Generated on the app registration portal
registered_app_name='yourregisteredappname'
registered_app_secret='yoursecret'
# File to download, and location to download to
dl_path='/path/to/download'
f_name='myfile.xlsx'
print("Connecting to O365")
account = Account(credentials=(registered_app_name, registered_app_secret), scopes=['files.read', 'user.read', 'offline_access'])
storage = account.storage() # here we get the storage instance that handles all the storage options.
# get the default drive
my_drive = storage.get_default_drive()
print(f"Searching for {f_name}...")
files = my_drive.search(f_name, limit=1)
if files:
    numberDoc = files[0]
    print("... copying to local machine")
    operation = numberDoc.download(to_path=dl_path)
else:
    print("File not found!")
    exit()
print("Reading sheet to dataframe")
df = pd.read_excel(f'{dl_path}/{f_name}')
with pd.option_context('display.max_rows', None, 'display.max_columns', None):
    print(df)
| 30.531915 | 129 | 0.743554 | 211 | 1,435 | 4.933649 | 0.511848 | 0.049952 | 0.026897 | 0.040346 | 0.069164 | 0.069164 | 0.069164 | 0 | 0 | 0 | 0 | 0.012935 | 0.137979 | 1,435 | 46 | 130 | 31.195652 | 0.828618 | 0.402091 | 0 | 0 | 1 | 0 | 0.303783 | 0.024823 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.086957 | 0 | 0.086957 | 0.26087 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14ec1bcbf7c1252dcbd344ba1ea2da3aca1c06b6 | 437 | py | Python | app/api_1_0/__init__.py | thinkingserious/apithyself | b6dcb595d4074eb9a5da9c2b5f0b8a4b68899504 | [
"MIT"
] | null | null | null | app/api_1_0/__init__.py | thinkingserious/apithyself | b6dcb595d4074eb9a5da9c2b5f0b8a4b68899504 | [
"MIT"
] | null | null | null | app/api_1_0/__init__.py | thinkingserious/apithyself | b6dcb595d4074eb9a5da9c2b5f0b8a4b68899504 | [
"MIT"
] | null | null | null | from flask import Blueprint
from flask_restful import Api
from .endpoints.goal import Goal
from .endpoints.weight import Weight
from .endpoints.calories import Calories
api_blueprint = Blueprint('api', __name__)
api = Api(prefix='/api/v1.0')
# Register the endpoints
api.add_resource(Goal, '/goal')
api.add_resource(Weight, '/weight/<string:id>', '/weight')
api.add_resource(Calories, '/calories/<string:id>', '/calories') | 33.615385 | 64 | 0.757437 | 61 | 437 | 5.295082 | 0.344262 | 0.120743 | 0.130031 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005051 | 0.093822 | 437 | 13 | 64 | 33.615385 | 0.810606 | 0.050343 | 0 | 0 | 0 | 0 | 0.188406 | 0.050725 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
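For readers unfamiliar with Flask-RESTful, `Api.add_resource` maps one resource class onto several URL rules under a common prefix. A minimal stdlib sketch of that idea (the toy `Api` and `Weight` classes below are illustrative stand-ins, not the real library):

```python
class Api:
    """Toy stand-in mimicking the add_resource(resource, *urls) shape."""

    def __init__(self, prefix=""):
        self.prefix = prefix
        self.routes = {}

    def add_resource(self, resource, *urls):
        # one resource class can serve several URL rules
        for url in urls:
            self.routes[self.prefix + url] = resource


class Weight:
    pass


api = Api(prefix="/api/v1.0")
api.add_resource(Weight, "/weight/<string:id>", "/weight")
print(sorted(api.routes))
# → ['/api/v1.0/weight', '/api/v1.0/weight/<string:id>']
```

This is why the module above can register both the collection URL and the item URL for `Weight` and `Calories` in a single call.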
14edc171ee825a9b639ac66387fb241efbce3400 | 2,952 | py | Python | src/scraper/scraper.py | jion/python-scraper | 264f9cbc4145409a6391cdac37975c91275a09b6 | [
"MIT"
] | null | null | null | src/scraper/scraper.py | jion/python-scraper | 264f9cbc4145409a6391cdac37975c91275a09b6 | [
"MIT"
] | null | null | null | src/scraper/scraper.py | jion/python-scraper | 264f9cbc4145409a6391cdac37975c91275a09b6 | [
"MIT"
] | null | null | null | from dombuilder import DomBuilder
from dom import SimpleDOM
from urllib2 import urlopen, URLError, HTTPError
# Scraper responsibility:
# orchestrate the steps of the scraping process:
# Construction phase:
# 1. create the proper reader (input)
# 2. construct the HTML parser
# 3. customize the HTML parser by adding operations
# 4. construct the printer
#
# Running phase:
# 1. obtain an html input
# 2. pass the input to the HTML parser
# 3. run the parser
# 4. print the results
def simpleUrlReader(url, consumer):
    """
    This function handles the process of reading from a URL
    and passing data to a consumer (via a feed method on the
    consumer).
    Ideally this would be asynchronous, but for now it takes
    the whole HTML at once and only then passes it to the consumer.
    The function is decoupled from the Scraper class in order to be
    able to support other types of feeding (maybe an html file, etc.)
    """
    try:
        handler = urlopen(url)
        html = handler.read()
    except HTTPError as e:
        raise Exception("There was a problem getting the specified url - HTTP code %s" % e.code)
    except URLError as e:
        raise Exception("There was a problem getting the specified url: %s" % str(e))
    else:
        consumer.feed(html)
def defaultUrlScraper(url):
    reader = simpleUrlReader
    dom = SimpleDOM()
    domBuilder = DomBuilder(dom)
    scraper = Scraper(reader, domBuilder)
    scraper.setUrl(url)
    return scraper
class Scraper(object):
    """
    This class orchestrates the whole process of scraping the specified HTML.
    You can customize the process by adding Operations that will be executed
    on the fly while the DOM object is created from parsing the HTML page.
    When results are ready, they are output in a decoupled fashion by passing
    the desired implementation of a printer to the printResults method.
    The initializer receives a reader callable and a DOM builder.
    """
    url = None

    def __init__(self, reader, domBuilder):
        self.operations = []
        self.domBuilder = domBuilder
        self.reader = reader

    def setUrl(self, url):
        self.url = url

    def addOperation(self, operation):
        operation.attachTo(self.domBuilder.dom)
        self.operations.append(operation)
        return self

    def _feed_parser(self):
        """
        This is the core method of the whole scraper application.
        It feeds the builder with the HTML data; the builder will build
        a DOM representation, performing analysis on the fly.
        """
        self.reader(self.url, self.domBuilder)

    def run(self):
        if self.url is None:
            raise Exception("Scraper Error - URL missing")
        self._feed_parser()

    def printResults(self, printer):
        for operation in self.operations:
            printer.printResults(operation)
| 30.122449 | 96 | 0.666667 | 386 | 2,952 | 5.07772 | 0.404145 | 0.014286 | 0.013265 | 0.020408 | 0.054082 | 0.054082 | 0.054082 | 0.054082 | 0.054082 | 0.054082 | 0 | 0.004097 | 0.255759 | 2,952 | 97 | 97 | 30.43299 | 0.888029 | 0.115515 | 0 | 0 | 0 | 0 | 0.093197 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.073171 | null | null | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
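The `simpleUrlReader`/`Scraper` split above hinges on a small protocol: a reader is any callable taking `(url, consumer)` that calls `consumer.feed(html)`. A self-contained sketch showing how an alternative in-memory reader can be swapped in for testing (`RecordingConsumer` and `in_memory_reader` are illustrative names, not part of the module):

```python
class RecordingConsumer:
    """Minimal consumer honoring the feed() protocol, recording input."""

    def __init__(self):
        self.chunks = []

    def feed(self, html):
        self.chunks.append(html)


def in_memory_reader(pages):
    """Build a reader that serves canned HTML instead of hitting the network."""
    def reader(url, consumer):
        consumer.feed(pages[url])
    return reader


reader = in_memory_reader({"http://example.com": "<html></html>"})
consumer = RecordingConsumer()
reader("http://example.com", consumer)
print(consumer.chunks)  # → ['<html></html>']
```

This is exactly the seam the docstring alludes to when it says the reader is decoupled "to support other types of feeding".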
14f1cc3f25971e2c47d21da7964f4150b859ae2f | 906 | py | Python | Week -2/Day-4 Maximum Product Subarray.py | Jeevan-kumar-Raj/September-LeetCoding-Challenge | cf5f4abaee4eb5d4292e95ad7252c5dff43cb518 | [
"MIT"
] | null | null | null | Week -2/Day-4 Maximum Product Subarray.py | Jeevan-kumar-Raj/September-LeetCoding-Challenge | cf5f4abaee4eb5d4292e95ad7252c5dff43cb518 | [
"MIT"
] | null | null | null | Week -2/Day-4 Maximum Product Subarray.py | Jeevan-kumar-Raj/September-LeetCoding-Challenge | cf5f4abaee4eb5d4292e95ad7252c5dff43cb518 | [
"MIT"
] | null | null | null | # Day-4 Maximum Product Subarray.py
"""
Given an integer array nums, find the contiguous subarray within an array
(containing at least one number) which has the largest product.

Example 1:
Input: [2,3,-2,4]
Output: 6
Explanation: [2,3] has the largest product 6.

Example 2:
Input: [-2,0,-1]
Output: 0
Explanation: The result cannot be 2, because [-2,-1] is not a subarray.
"""
class Solution(object):
    # @param A, a list of integers
    # @return an integer
    def maxProduct(self, A):
        global_max, local_max, local_min = float("-inf"), 1, 1
        for x in A:
            local_max = max(1, local_max)
            if x > 0:
                local_max, local_min = local_max * x, local_min * x
            else:
                local_max, local_min = local_min * x, local_max * x
            global_max = max(global_max, local_max)
        return global_max
| 26.647059 | 138 | 0.601545 | 136 | 906 | 3.882353 | 0.448529 | 0.121212 | 0.073864 | 0.090909 | 0.079545 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034921 | 0.304636 | 906 | 33 | 139 | 27.454545 | 0.803175 | 0.051876 | 0 | 0 | 0 | 0 | 0.00486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
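To make the max/min product tracking above easy to exercise outside the LeetCode class wrapper, here is a standalone restatement with identical logic (the free-function `max_product` name is an assumption):

```python
def max_product(nums):
    """Largest product of a contiguous subarray (same logic as Solution.maxProduct)."""
    global_max, local_max, local_min = float("-inf"), 1, 1
    for x in nums:
        # reset the running max after a zero (or a losing negative run)
        local_max = max(1, local_max)
        if x > 0:
            local_max, local_min = local_max * x, local_min * x
        else:
            # a negative x turns the smallest product into the largest
            local_max, local_min = local_min * x, local_max * x
        global_max = max(global_max, local_max)
    return global_max


print(max_product([2, 3, -2, 4]))  # → 6
print(max_product([-2, 0, -1]))    # → 0
```

Tracking both the running maximum and minimum is what lets a pair of negatives combine into a large positive product, e.g. `max_product([2, -5, -2])` is 20.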
14f2dc93a5e49f9af2ce7eb52d4a30803c8251b4 | 350 | py | Python | recipes/forms.py | Bytlot/foodgram-project | fb5c39ff17c2e3fb0245036ec02a9626954e0684 | [
"MIT"
] | null | null | null | recipes/forms.py | Bytlot/foodgram-project | fb5c39ff17c2e3fb0245036ec02a9626954e0684 | [
"MIT"
] | 2 | 2022-01-13T03:43:22.000Z | 2022-03-12T00:56:53.000Z | recipes/forms.py | Bytlot/reshare | 4be75616c4ed6127349a5d2c707581daa3535d78 | [
"MIT"
] | null | null | null | from django import forms
from .models import Recipe
class RecipeForm(forms.ModelForm):
    class Meta:
        model = Recipe
        fields = (
            'title',
            'tags',
            'cooking_time',
            'text',
            'image',
        )
        widgets = {
            'tags': forms.CheckboxSelectMultiple(),
        }
| 18.421053 | 51 | 0.474286 | 27 | 350 | 6.111111 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.425714 | 350 | 18 | 52 | 19.444444 | 0.820896 | 0 | 0 | 0 | 0 | 0 | 0.097143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.133333 | 0 | 0.266667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
14f35957e7f1afff3cefdbd23a93d695f4c8deb8 | 387 | py | Python | qzzzme_app/urls.py | aboudzein/Qzzz.me-API | b5ee8e63fb7cf58d26fb5b6e4c9f22c04e90df08 | [
"MIT"
] | 1 | 2021-04-22T08:50:14.000Z | 2021-04-22T08:50:14.000Z | qzzzme_app/urls.py | aboudzein/Qzzz.me-API | b5ee8e63fb7cf58d26fb5b6e4c9f22c04e90df08 | [
"MIT"
] | 10 | 2020-04-17T02:31:35.000Z | 2022-02-10T11:49:33.000Z | qzzzme_app/urls.py | aboudzein/Qzzz.me-API | b5ee8e63fb7cf58d26fb5b6e4c9f22c04e90df08 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
# Python imports.
import logging
# Django imports.
from django.conf.urls import url
# Rest Framework imports.
# Third Party Library imports
# local imports.
from qzzzme_app.swagger import schema_view
app_name = 'qzzzme_app'
urlpatterns = [
    url(r'^docs/$', schema_view, name="schema_view"),
]
14f55735d20e9103dcb557592b91446372d9a5e6 | 17,635 | py | Python | external/snake-0.3/snake/ibamr/plotField2dVisIt.py | mesnardo/cuIBM | 0b63f86c58e93e62f9dc720c08510cc88b10dd04 | [
"MIT"
] | 67 | 2015-01-25T18:33:39.000Z | 2022-02-17T17:50:00.000Z | external/snake-0.3/snake/ibamr/plotField2dVisIt.py | mesnardo/cuIBM | 0b63f86c58e93e62f9dc720c08510cc88b10dd04 | [
"MIT"
] | 20 | 2015-01-25T05:29:30.000Z | 2018-05-14T02:07:03.000Z | external/snake-0.3/snake/ibamr/plotField2dVisIt.py | mesnardo/cuIBM | 0b63f86c58e93e62f9dc720c08510cc88b10dd04 | [
"MIT"
] | 39 | 2015-01-22T16:32:11.000Z | 2022-02-17T08:17:09.000Z | """
Calls VisIt in batch mode to generate .png files of the 2D field contour.
cli: visit -nowin -cli -s plotField2dVisIt.py <arguments>
"""
import os
import sys
import math
import argparse
sys.path.append(os.environ['SNAKE'])
from snake import miscellaneous
def parse_command_line():
    """
    Parses the command-line.
    """
    print('[info] parsing the command-line ...'),
    # create the parser
    formatter_class = argparse.ArgumentDefaultsHelpFormatter
    parser = argparse.ArgumentParser(description='Plots the vorticity field '
                                                 'with VisIt',
                                     formatter_class=formatter_class)
    # fill the parser with arguments
    parser.add_argument('--directory',
                        dest='directory',
                        type=str,
                        default=os.getcwd(),
                        help='directory of the IBAMR simulation')
    parser.add_argument('--solution-folder',
                        dest='solution_folder',
                        type=str,
                        default='numericalSolution',
                        help='name of folder containing the solution in time')
    parser.add_argument('--body',
                        dest='body',
                        type=str,
                        default=None,
                        help='name of the body file (without the .vertex '
                             'extension)')
    parser.add_argument('--field',
                        dest='field_name',
                        type=str,
                        choices=['vorticity', 'pressure',
                                 'x-velocity', 'y-velocity',
                                 'velocity-magnitude'],
                        help='name of the field to plot')
    parser.add_argument('--range',
                        dest='field_range',
                        type=float, nargs=2,
                        default=(-1.0, 1.0),
                        metavar=('min', 'max'),
                        help='range of the field to plot')
    parser.add_argument('--states',
                        dest='states',
                        type=int, nargs=3,
                        default=(0, 2**10000, 1),
                        metavar=('min', 'max', 'increment'),
                        help='states to plot')
    parser.add_argument('--view',
                        dest='view',
                        type=float,
                        nargs=4,
                        default=(-2.0, -2.0, 2.0, 2.0),
                        metavar=('x-bl', 'y-bl', 'x-tr', 'y-tr'),
                        help='bottom-left coordinates followed by top-right '
                             'coordinates of the view to display')
    parser.add_argument('--width',
                        dest='width',
                        type=int,
                        default=800,
                        help='figure width in pixels')
    # parse given options file
    parser.add_argument('--options',
                        type=open,
                        action=miscellaneous.ReadOptionsFromFile,
                        help='path of the file with options to parse')
    print('done')
    return parser.parse_args()
def check_version():
    """
    Checks the VisIt version and prints a warning if the version has not
    been tested.
    """
    script_version = '2.8.2'
    tested_versions = ['2.8.2', '2.10.2', '2.12.1']
    current_version = Version()
    print('VisIt version: {}\n'.format(current_version))
    if current_version not in tested_versions:
        print('[warning] You are using VisIt-{}'.format(current_version))
        print('[warning] This script was created with '
              'VisIt-{}.'.format(script_version))
        print('[warning] This script was tested with versions: '
              '{}.'.format(tested_versions))
        print('[warning] It may not work as expected')
def plot_field_contours(field_name, field_range,
                        body=None,
                        directory=os.getcwd(),
                        solution_folder='numericalSolution',
                        states=(0, 2**10000, 1),
                        view=(-2.0, -2.0, 2.0, 2.0),
                        width=800):
    """
    Plots the contour of a given field using VisIt.

    Parameters
    ----------
    field_name: string
        Name of the field to plot;
        choices: vorticity, pressure, velocity-magnitude, x-velocity,
        y-velocity.
    field_range: 2-tuple of floats
        Range of the field to plot (min, max).
    body: string, optional
        Name of the immersed body;
        default: None.
    directory: string, optional
        Directory of the IBAMR simulation;
        default: current directory.
    solution_folder: string, optional
        Relative path of the folder containing the numerical solution;
        default: 'numericalSolution'.
    states: 3-tuple of integers, optional
        Limits of the state indices to plot, followed by the increment;
        default: (0, 2**10000, 1).
    view: 4-tuple of floats, optional
        Bottom-left and top-right coordinates of the view to display;
        default: (-2.0, -2.0, 2.0, 2.0).
    width: integer, optional
        Width (in pixels) of the figure;
        default: 800.
    """
    info = {}
    info['vorticity'] = {'variable': 'Omega',
                         'color-table': 'RdBu',
                         'invert-color-table': 1}
    info['pressure'] = {'variable': 'P',
                        'color-table': 'hot',
                        'invert-color-table': 0}
    info['velocity-magnitude'] = {'variable': 'U_magnitude',
                                  'color-table': 'RdBu',
                                  'invert-color-table': 1}
    info['x-velocity'] = {'variable': 'U_x',
                          'color-table': 'RdBu',
                          'invert-color-table': 1}
    info['y-velocity'] = {'variable': 'U_y',
                          'color-table': 'RdBu',
                          'invert-color-table': 1}
    # define the dimensions of the domain to plot
    height = int(math.floor(width * (view[3] - view[1]) / (view[2] - view[0])))
    # create the images directory
    view_string = '{:.2f}_{:.2f}_{:.2f}_{:.2f}'.format(*view)
    images_directory = os.path.join(directory,
                                    'images',
                                    '_'.join([field_name, view_string]))
    if not os.path.isdir(images_directory):
        print('[info] creating images directory {} ...'.format(images_directory))
        os.makedirs(images_directory)
    ShowAllWindows()
    # display the body
    if body:
        OpenDatabase(GetLocalHostName() + ':' + os.path.join(directory,
                                                             solution_folder,
                                                             'lag_data.visit'), 0)
        AddPlot('Mesh', body + '_vertices', 1, 1)
        DrawPlots()
        MeshAtts = MeshAttributes()
        MeshAtts.legendFlag = 0
        MeshAtts.lineStyle = MeshAtts.SOLID  # SOLID, DASH, DOT, DOTDASH
        MeshAtts.lineWidth = 0
        MeshAtts.meshColor = (0, 0, 0, 255)
        try:
            MeshAtts.outlineOnlyFlag = 0
            MeshAtts.errorTolerance = 0.01
        except AttributeError:
            # these attributes do not exist in every VisIt version
            pass
        MeshAtts.meshColorSource = MeshAtts.Foreground  # Foreground, MeshCustom
        MeshAtts.opaqueColorSource = MeshAtts.Background  # Background, OpaqueCustom
        MeshAtts.opaqueMode = MeshAtts.Auto  # Auto, On, Off
        MeshAtts.pointSize = 0.05
        MeshAtts.opaqueColor = (255, 255, 255, 255)
        # MeshAtts.smoothingLevel = MeshAtts.None  # None, Fast, High
        MeshAtts.pointSizeVarEnabled = 0
        MeshAtts.pointSizeVar = 'default'
        MeshAtts.pointType = MeshAtts.Point  # Box, Axis, Icosahedron, Octahedron, Tetrahedron, SphereGeometry, Point, Sphere
        MeshAtts.showInternal = 0
        MeshAtts.pointSizePixels = 2
        MeshAtts.opacity = 1
        SetPlotOptions(MeshAtts)
    # display the field
    OpenDatabase(GetLocalHostName() + ':' + os.path.join(directory,
                                                         solution_folder,
                                                         'dumps.visit'), 0)
    HideActivePlots()
    AddPlot('Pseudocolor', info[field_name]['variable'], 1, 1)
    DrawPlots()
    PseudocolorAtts = PseudocolorAttributes()
    PseudocolorAtts.scaling = PseudocolorAtts.Linear  # Linear, Log, Skew
    PseudocolorAtts.skewFactor = 1
    PseudocolorAtts.limitsMode = PseudocolorAtts.OriginalData  # OriginalData, CurrentPlot
    PseudocolorAtts.minFlag = 1
    PseudocolorAtts.min = field_range[0]
    PseudocolorAtts.maxFlag = 1
    PseudocolorAtts.max = field_range[1]
    PseudocolorAtts.centering = PseudocolorAtts.Natural  # Natural, Nodal, Zonal
    PseudocolorAtts.colorTableName = info[field_name]['color-table']
    PseudocolorAtts.invertColorTable = info[field_name]['invert-color-table']
    PseudocolorAtts.opacityType = PseudocolorAtts.FullyOpaque  # ColorTable, FullyOpaque, Constant, Ramp, VariableRange
    PseudocolorAtts.opacityVariable = ''
    PseudocolorAtts.opacity = 1
    PseudocolorAtts.opacityVarMin = 0
    PseudocolorAtts.opacityVarMax = 1
    PseudocolorAtts.opacityVarMinFlag = 0
    PseudocolorAtts.opacityVarMaxFlag = 0
    PseudocolorAtts.pointSize = 0.05
    PseudocolorAtts.pointType = PseudocolorAtts.Point  # Box, Axis, Icosahedron, Octahedron, Tetrahedron, SphereGeometry, Point, Sphere
    PseudocolorAtts.pointSizeVarEnabled = 0
    PseudocolorAtts.pointSizeVar = 'default'
    PseudocolorAtts.pointSizePixels = 2
    PseudocolorAtts.lineType = PseudocolorAtts.Line  # Line, Tube, Ribbon
    PseudocolorAtts.lineStyle = PseudocolorAtts.SOLID  # SOLID, DASH, DOT, DOTDASH
    PseudocolorAtts.lineWidth = 0
    if Version() in ['2.8.2', '2.10.2']:
        PseudocolorAtts.tubeDisplayDensity = 10
    elif Version() in ['2.12.1']:
        PseudocolorAtts.tubeResolution = 10
    else:
        PseudocolorAtts.tubeDisplayDensity = 10
    PseudocolorAtts.tubeRadiusSizeType = PseudocolorAtts.FractionOfBBox  # Absolute, FractionOfBBox
    PseudocolorAtts.tubeRadiusAbsolute = 0.125
    PseudocolorAtts.tubeRadiusBBox = 0.005
    if Version() in ['2.8.2', '2.10.2']:
        PseudocolorAtts.varyTubeRadius = 0
        PseudocolorAtts.varyTubeRadiusVariable = ''
        PseudocolorAtts.varyTubeRadiusFactor = 10
        # PseudocolorAtts.endPointType = PseudocolorAtts.None
        PseudocolorAtts.endPointStyle = PseudocolorAtts.Spheres
    elif Version() in ['2.12.1']:
        PseudocolorAtts.tubeRadiusVarEnabled = 0
        PseudocolorAtts.tubeRadiusVar = ''
        PseudocolorAtts.tubeRadiusVarRatio = 10
        # PseudocolorAtts.tailStyle = PseudocolorAtts.None
        # PseudocolorAtts.headStyle = PseudocolorAtts.None
    else:
        PseudocolorAtts.varyTubeRadius = 0
        PseudocolorAtts.varyTubeRadiusVariable = ''
        PseudocolorAtts.varyTubeRadiusFactor = 10
        # PseudocolorAtts.endPointType = PseudocolorAtts.None
        PseudocolorAtts.endPointStyle = PseudocolorAtts.Spheres
    PseudocolorAtts.endPointRadiusSizeType = PseudocolorAtts.FractionOfBBox
    PseudocolorAtts.endPointRadiusAbsolute = 1
    PseudocolorAtts.endPointRadiusBBox = 0.005
    PseudocolorAtts.endPointRatio = 2
    PseudocolorAtts.renderSurfaces = 1
    PseudocolorAtts.renderWireframe = 0
    PseudocolorAtts.renderPoints = 0
    PseudocolorAtts.smoothingLevel = 0
    PseudocolorAtts.legendFlag = 1
    PseudocolorAtts.lightingFlag = 1
    SetPlotOptions(PseudocolorAtts)
    # colorbar of the pseudocolor plot
    legend = GetAnnotationObject(GetPlotList().GetPlots(2).plotName)
    legend.xScale = 1.5
    legend.yScale = 0.5
    legend.numberFormat = '%# -9.2g'
    legend.orientation = legend.HorizontalBottom
    legend.managePosition = 0
    legend.position = (0.10, 0.10)
    legend.fontFamily = legend.Courier
    legend.fontBold = 1
    legend.fontHeight = 0.1
    legend.drawMinMax = 0
    legend.drawTitle = 0
    print('[info] legend settings:')
    print(legend)
    # set up the view
    View2DAtts = View2DAttributes()
    View2DAtts.windowCoords = (view[0], view[2], view[1], view[3])
    View2DAtts.viewportCoords = (0, 1, 0, 1)
    View2DAtts.fullFrameActivationMode = View2DAtts.Auto  # On, Off, Auto
    View2DAtts.fullFrameAutoThreshold = 100
    View2DAtts.xScale = View2DAtts.LINEAR  # LINEAR, LOG
    View2DAtts.yScale = View2DAtts.LINEAR  # LINEAR, LOG
    View2DAtts.windowValid = 1
    print('[info] view settings:')
    print(View2DAtts)
    SetView2D(View2DAtts)
    # Logging for SetAnnotationObjectOptions is not implemented yet.
    AnnotationAtts = AnnotationAttributes()
    AnnotationAtts.axes2D.visible = 1
    AnnotationAtts.axes2D.autoSetTicks = 1
    AnnotationAtts.axes2D.autoSetScaling = 1
    AnnotationAtts.axes2D.lineWidth = 0
    AnnotationAtts.axes2D.tickLocation = AnnotationAtts.axes2D.Inside  # Inside, Outside, Both
    AnnotationAtts.axes2D.tickAxes = AnnotationAtts.axes2D.BottomLeft  # Off, Bottom, Left, BottomLeft, All
    # x-axis
    AnnotationAtts.axes2D.xAxis.title.visible = 0  # hide x-axis title
    AnnotationAtts.axes2D.xAxis.label.visible = 0  # hide x-axis label
    AnnotationAtts.axes2D.xAxis.tickMarks.visible = 0  # hide x-axis tick marks
    AnnotationAtts.axes2D.xAxis.grid = 0  # no grid
    # y-axis
    AnnotationAtts.axes2D.yAxis.title.visible = 0  # hide y-axis title
    AnnotationAtts.axes2D.yAxis.label.visible = 0  # hide y-axis label
    AnnotationAtts.axes2D.yAxis.tickMarks.visible = 0  # hide y-axis tick marks
    AnnotationAtts.axes2D.yAxis.grid = 0  # no grid
    AnnotationAtts.userInfoFlag = 0  # hide text with user's name
    # settings for the legend
    AnnotationAtts.databaseInfoFlag = 0
    AnnotationAtts.timeInfoFlag = 0
    AnnotationAtts.legendInfoFlag = 1
    AnnotationAtts.backgroundColor = (255, 255, 255, 255)
    AnnotationAtts.foregroundColor = (0, 0, 0, 255)
    AnnotationAtts.gradientBackgroundStyle = AnnotationAtts.Radial  # TopToBottom, BottomToTop, LeftToRight, RightToLeft, Radial
    AnnotationAtts.gradientColor1 = (0, 0, 255, 255)
    AnnotationAtts.gradientColor2 = (0, 0, 0, 255)
    AnnotationAtts.backgroundMode = AnnotationAtts.Solid  # Solid, Gradient, Image, ImageSphere
    AnnotationAtts.backgroundImage = ''
    AnnotationAtts.imageRepeatX = 1
    AnnotationAtts.imageRepeatY = 1
    AnnotationAtts.axesArray.visible = 0
    SetAnnotationAttributes(AnnotationAtts)
    print('[info] annotation settings:')
    print(AnnotationAtts)
    SetActiveWindow(1)
    # create the time-annotation
    time_annotation = CreateAnnotationObject('Text2D')
    time_annotation.position = (0.05, 0.90)
    time_annotation.fontFamily = 1
    time_annotation.fontBold = 0
    time_annotation.height = 0.05
    print('[info] time-annotation:')
    print(time_annotation)
    # check the number of states available
    states = list(states)  # make mutable so the final state can be clipped
    if states[1] > TimeSliderGetNStates():
        print('[warning] maximum number of states available is '
              '{}'.format(TimeSliderGetNStates()))
        print('[warning] setting new final state ...')
        states[1] = TimeSliderGetNStates()
    # loop over the saved time-steps
    for state in xrange(states[0], states[1], states[2]):
        SetTimeSliderState(state)
        time = float(Query('Time')[:-1].split()[-1])
        print('\n[state {}] time: {} - creating and saving the field ...'
              ''.format(state, time))
        time_annotation.text = 'Time: {0:.3f}'.format(time)
        RenderingAtts = RenderingAttributes()
        RenderingAtts.antialiasing = 0
        RenderingAtts.multiresolutionMode = 0
        RenderingAtts.multiresolutionCellSize = 0.002
        RenderingAtts.geometryRepresentation = RenderingAtts.Surfaces  # Surfaces, Wireframe, Points
        RenderingAtts.displayListMode = RenderingAtts.Auto  # Never, Always, Auto
        RenderingAtts.stereoRendering = 0
        RenderingAtts.stereoType = RenderingAtts.CrystalEyes  # RedBlue, Interlaced, CrystalEyes, RedGreen
        RenderingAtts.notifyForEachRender = 0
        RenderingAtts.scalableActivationMode = RenderingAtts.Auto  # Never, Always, Auto
        RenderingAtts.scalableAutoThreshold = 2000000
        RenderingAtts.specularFlag = 0
        RenderingAtts.specularCoeff = 0.6
        RenderingAtts.specularPower = 10
        RenderingAtts.specularColor = (255, 255, 255, 255)
        RenderingAtts.doShadowing = 0
        RenderingAtts.shadowStrength = 0.5
        RenderingAtts.doDepthCueing = 0
        RenderingAtts.depthCueingAutomatic = 1
        RenderingAtts.startCuePoint = (-10, 0, 0)
        RenderingAtts.endCuePoint = (10, 0, 0)
        RenderingAtts.compressionActivationMode = RenderingAtts.Never  # Never, Always, Auto
        RenderingAtts.colorTexturingFlag = 1
        RenderingAtts.compactDomainsActivationMode = RenderingAtts.Never  # Never, Always, Auto
        RenderingAtts.compactDomainsAutoThreshold = 256
        SetRenderingAttributes(RenderingAtts)
        SaveWindowAtts = SaveWindowAttributes()
        SaveWindowAtts.outputToCurrentDirectory = 0
        SaveWindowAtts.outputDirectory = images_directory
        SaveWindowAtts.fileName = '{}{:0>7}'.format(field_name, state)
        SaveWindowAtts.family = 0
        SaveWindowAtts.format = SaveWindowAtts.PNG  # BMP, CURVE, JPEG, OBJ, PNG, POSTSCRIPT, POVRAY, PPM, RGB, STL, TIFF, ULTRA, VTK, PLY
        SaveWindowAtts.width = width
        SaveWindowAtts.height = height
        SaveWindowAtts.screenCapture = 0
        SaveWindowAtts.saveTiled = 0
        SaveWindowAtts.quality = 100
        SaveWindowAtts.progressive = 0
        SaveWindowAtts.binary = 0
        SaveWindowAtts.stereo = 0
        SaveWindowAtts.compression = SaveWindowAtts.PackBits  # None, PackBits, Jpeg, Deflate
        SaveWindowAtts.forceMerge = 0
        SaveWindowAtts.resConstraint = SaveWindowAtts.NoConstraint  # NoConstraint, EqualWidthHeight, ScreenProportions
        SaveWindowAtts.advancedMultiWindowSave = 0
        SetSaveWindowAttributes(SaveWindowAtts)
        SaveWindow()
def main(args):
    check_version()
    plot_field_contours(args.field_name, args.field_range,
                        directory=args.directory,
                        body=args.body,
                        solution_folder=args.solution_folder,
                        states=args.states,
                        view=args.view,
                        width=args.width)
    os.remove('visitlog.py')


if __name__ == '__main__':
    args = parse_command_line()
    main(args)
    sys.exit(0)
| 41.494118 | 134 | 0.655685 | 1,760 | 17,635 | 6.522159 | 0.281818 | 0.027877 | 0.013329 | 0.003136 | 0.154804 | 0.120045 | 0.09574 | 0.087464 | 0.055493 | 0.039289 | 0 | 0.030332 | 0.240998 | 17,635 | 424 | 135 | 41.591981 | 0.827269 | 0.163992 | 0 | 0.1 | 1 | 0 | 0.115139 | 0.00185 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011765 | false | 0.002941 | 0.014706 | 0 | 0.029412 | 0.055882 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
0901166f5b96cd272e1a7d9a0f266a5057bca1e0 | 1,725 | py | Python | tests/factories.py | UNINETT/django-rtdb | a8af13c1756581fee0a02a9da9cbb4d252d77dab | [
"MIT"
] | null | null | null | tests/factories.py | UNINETT/django-rtdb | a8af13c1756581fee0a02a9da9cbb4d252d77dab | [
"MIT"
] | null | null | null | tests/factories.py | UNINETT/django-rtdb | a8af13c1756581fee0a02a9da9cbb4d252d77dab | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
import factory
from faker import Factory as Faker
from rtdb.models import Customfield
from rtdb.models import Queue
from rtdb.models import Ticket
from rtdb.models import TicketCustomfieldValue
__all__ = [
    'QueueFactory',
    'TicketFactory',
    'CustomfieldFactory',
    'TicketCustomfieldValueFactory',
]
fake = Faker.create()
class QueueFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Queue

    name = factory.LazyAttribute(lambda t: fake.name())
    initialpriority = 0
    finalpriority = 0
    defaultduein = 0
    creator = 0
    lastupdatedby = 0
    disabled = 0


class TicketFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Ticket

    effectiveid = 0
    queue = factory.SubFactory(QueueFactory)
    issuestatement = 0
    resolution = 0
    owner = 0
    initialpriority = 0
    finalpriority = 0
    priority = 0
    timeestimated = 0
    timeworked = 0
    status = factory.LazyAttribute(lambda t: fake.tld())
    timeleft = 0
    lastupdatedby = 0
    creator = 0
    disabled = 0


class CustomfieldFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Customfield

    name = factory.LazyAttribute(lambda t: fake.tld())
    sortorder = 0
    creator = 0
    lastupdatedby = 0
    disabled = 0
    lookuptype = 'RT::Queue-RT::Ticket'


class TicketCustomfieldValueFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = TicketCustomfieldValue

    ticket = factory.SubFactory(TicketFactory)
    customfield = factory.SubFactory(CustomfieldFactory)
    content = factory.LazyAttribute(lambda t: fake.domain_name())
    creator = 0
    lastupdatedby = 0
    objecttype = 'RT::Ticket'
    sortorder = 0
    disabled = 0
| 21.036585 | 71 | 0.692174 | 178 | 1,725 | 6.651685 | 0.303371 | 0.027027 | 0.047297 | 0.067568 | 0.322635 | 0.296453 | 0.054054 | 0 | 0 | 0 | 0 | 0.020408 | 0.233043 | 1,725 | 81 | 72 | 21.296296 | 0.874528 | 0 | 0 | 0.366667 | 0 | 0 | 0.031903 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.116667 | 0 | 0.85 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
09073779c1cb3424b3bf597d5ef5057396d4f95d | 3,475 | py | Python | Documentation_generator/ivml_documentation_from_file.py | jegentile/IVML | 6f37f71c3eafd62bf8eae8b81906d4dec024a337 | [
"Apache-2.0"
] | 8 | 2015-01-04T22:24:04.000Z | 2021-03-02T22:52:13.000Z | Documentation_generator/ivml_documentation_from_file.py | jegentile/IVML | 6f37f71c3eafd62bf8eae8b81906d4dec024a337 | [
"Apache-2.0"
] | null | null | null | Documentation_generator/ivml_documentation_from_file.py | jegentile/IVML | 6f37f71c3eafd62bf8eae8b81906d4dec024a337 | [
"Apache-2.0"
] | 5 | 2015-01-15T01:42:34.000Z | 2015-06-10T20:14:07.000Z | __author__ = 'jgentile'
class IVMLDocumentationFromFile:
    def __init__(self, file_name):
        self.__file_name = file_name
        self.__file = open(file_name)
        print ' Document generator looking at ', file_name
        self.__ivml_object_documentation = '\0'
        self.__type = ''
        self.__ivml_non_required_attributes = {}
        self.__ivml_required_attributes = {}
        self.__svg_attributes = {}
        self.__events = {}
        self.__name = ""
        self.__description = ""

    def change_from_camel_case(self, string):
        # insert a dash before each uppercase letter and lowercase it,
        # e.g. 'barChart' -> 'bar-chart'
        result = string
        for i in range(65, 91):  # ASCII codes for 'A'..'Z'
            result = result.replace(chr(i), '-' + chr(i + 32))
        return result
    def parse(self):
        on_line = 0
        for line in self.__file:
            # check to see if the first line annotates an IVML object
            if on_line == 0:
                stripped_line = line.replace(' ', '').replace('\t', '').rstrip()
                array = stripped_line.split(':')
                # if the first line in the JavaScript file is not annotated
                # for the documentation, return
                if not array[0] == '//@defining' or not len(array) == 3:
                    print " WARNING: First line of", self.__file_name, "file does not have proper annotation. Skipping."
                    return
                print array
                if array[1] == 'ivml.chart':
                    self.__type = 'chart'
                if array[1] == 'ivml.visualElement':
                    self.__type = 'visual_element'
                if array[1] == 'ivml.group':
                    self.__type = 'group'
                print 'Name: ', array[2], " --- type:", self.__type
                self.__name = self.change_from_camel_case(array[2])
            if on_line > 0 and self.__type:
                if '//@' in line:
                    struct = 0
                    offset = 1
                    if '//@i' in line:
                        struct = self.__ivml_non_required_attributes
                    if '//@ir' in line:
                        struct = self.__ivml_required_attributes
                        offset = 2
                    if '//@s' in line:
                        struct = self.__svg_attributes
                    if '//@e' in line:
                        struct = self.__events
                    if '//@description' in line:
                        self.__description = line.split('//@description')[1].rstrip()
                    if not struct == 0:
                        attribute = line.strip().replace(' ', '').replace('\t', '').split(':')[0]
                        struct[self.change_from_camel_case(attribute)] = line.split('//@')[len(line.split('//@')) - 1][offset:].strip().rstrip().replace('#', '\#').replace('_', '\_')
            on_line += 1
    def get_type(self):
        if self.__type:
            return self.__type

    def get_ivml_required_attributes(self):
        return self.__ivml_required_attributes

    def get_ivml_non_required_attributes(self):
        return self.__ivml_non_required_attributes

    def get_svg_attributes(self):
        return self.__svg_attributes

    def get_event_attributes(self):
        return self.__events

    def get_name(self):
        return self.__name

    def get_description(self):
        return self.__description
| 33.413462 | 178 | 0.524892 | 370 | 3,475 | 4.562162 | 0.245946 | 0.037915 | 0.049763 | 0.059242 | 0.14218 | 0.042654 | 0 | 0 | 0 | 0 | 0 | 0.011251 | 0.360576 | 3,475 | 103 | 179 | 33.737864 | 0.748425 | 0.040576 | 0 | 0 | 0 | 0 | 0.087175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.056338 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
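The parser above keys its attribute tables by dash-separated names derived from camelCase identifiers, and expects sources whose first line looks like `//@defining:ivml.chart:barChart` (a hypothetical example). A standalone restatement of the camel-case helper (as a free function, which is an assumption) makes the conversion easy to verify:

```python
def change_from_camel_case(string):
    """Insert a dash before each uppercase ASCII letter and lowercase it."""
    result = string
    for i in range(65, 91):  # ASCII codes for 'A'..'Z'
        result = result.replace(chr(i), '-' + chr(i + 32))
    return result


print(change_from_camel_case("barChart"))    # → bar-chart
print(change_from_camel_case("xAxisLabel"))  # → x-axis-label
```

So an attribute annotated as `strokeWidth: ... //@i line thickness` would be stored under the key `stroke-width`, matching the HTML-attribute naming convention IVML documents.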
# File: tests/silhouette_test.py (repo: JesusFreke/fscad, license: Apache-2.0)

# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import adsk.fusion
from adsk.core import Point3D, Vector3D
import unittest
# note: load_tests is required for the "pattern" test filtering functionality in loadTestsFromModule in run()
from fscad.test_utils import FscadTestCase, load_tests
from fscad.fscad import *
class SilhouetteTest(FscadTestCase):
    def test_orthogonal_face_silhouette(self):
        rect = Rect(1, 1)
        silhouette = Silhouette(rect.faces[0], adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), rect.size().asArray())

    def test_non_orthogonal_face_silhouette(self):
        rect = Rect(1, 1)
        rect.ry(45)
        silhouette = Silhouette(rect.faces[0], adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), (rect.size().x, rect.size().y, 0))

    def test_parallel_face_silhouette(self):
        rect = Rect(1, 1)
        rect.ry(90)
        silhouette = Silhouette(rect.faces[0], adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), (0, 0, 0))

    def test_body_silhouette(self):
        box = Box(1, 1, 1)
        box.ry(45)
        silhouette = Silhouette(box.bodies[0], adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), (box.size().x, box.size().y, 0))

    def test_component_silhouette(self):
        rect = Rect(1, 1)
        rect.ry(45)
        silhouette = Silhouette(rect, adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), (rect.size().x, rect.size().y, 0))

    def test_multiple_disjoint_faces_silhouette(self):
        rect1 = Rect(1, 1)
        rect2 = Rect(1, 1)
        rect2.ry(45)
        rect2.tx(2)
        assembly = Group([rect1, rect2])
        silhouette = Silhouette(assembly.faces, adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertTrue(abs(silhouette.size().x - assembly.size().x) < app().pointTolerance)
        self.assertTrue(abs(silhouette.size().y - assembly.size().y) < app().pointTolerance)
        self.assertEquals(silhouette.size().z, 0)

    def test_multiple_overlapping_faces_silhouette(self):
        rect1 = Rect(1, 1)
        rect2 = Rect(1, 1)
        rect2.ry(45)
        rect2.translate(.5, .5)
        assembly = Group([rect1, rect2])
        silhouette = Silhouette(assembly.faces, adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertTrue(abs(silhouette.size().x - assembly.size().x) < app().pointTolerance)
        self.assertTrue(abs(silhouette.size().y - assembly.size().y) < app().pointTolerance)
        self.assertEquals(silhouette.size().z, 0)

    def test_cylinder_silhouette(self):
        cyl = Cylinder(1, 1)
        silhouette = Silhouette(cyl, adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), (cyl.size().x, cyl.size().y, 0))

    def test_single_edge(self):
        circle = Circle(1)
        silhouette = Silhouette(circle.edges[0], adsk.core.Plane.create(
            Point3D.create(0, 0, -1),
            Vector3D.create(0, 0, 1)))
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), circle.size().asArray())

    def test_multiple_edges(self):
        rect = Rect(1, 1)
        hole1 = Circle(.1)
        hole2 = Circle(.2)
        hole1.place(
            (-hole1 == -rect) + .1,
            (-hole1 == -rect) + .1,
            ~hole1 == ~rect)
        hole2.place(
            (+hole2 == +rect) - .1,
            (+hole2 == +rect) - .1,
            ~hole2 == ~rect)
        assembly = Difference(rect, hole1, hole2)
        silhouette = Silhouette(assembly.faces[0].outer_edges, assembly.get_plane())
        silhouette.create_occurrence(True)
        self.assertEquals(silhouette.size().asArray(), rect.size().asArray())
        self.assertEquals(len(silhouette.edges), 4)

    def test_named_edges(self):
        box = Box(1, 1, 1)
        silhouette = Silhouette(
            box,
            adsk.core.Plane.create(
                Point3D.create(0, 0, -1),
                Vector3D.create(0, 0, 1)),
            named_edges={
                "front": box.shared_edges(box.bottom, box.front),
                "back": box.shared_edges(box.bottom, box.back),
                "left": box.shared_edges(box.bottom, box.left),
                "right": box.shared_edges(box.bottom, box.right)})
        silhouette.create_occurrence(create_children=True)

        edge_finder = Box(.1, .1, .1)
        edge_finder.place(
            ~edge_finder == ~silhouette,
            -edge_finder == -silhouette,
            ~edge_finder == ~silhouette)
        found_edges = silhouette.find_edges(edge_finder)
        named_edges = silhouette.named_edges("front")
        self.assertEquals(len(found_edges), 1)
        self.assertEquals(found_edges, named_edges)

        edge_finder.place(
            ~edge_finder == ~silhouette,
            +edge_finder == +silhouette,
            ~edge_finder == ~silhouette)
        found_edges = silhouette.find_edges(edge_finder)
        named_edges = silhouette.named_edges("back")
        self.assertEquals(len(found_edges), 1)
        self.assertEquals(found_edges, named_edges)

        edge_finder.place(
            +edge_finder == +silhouette,
            ~edge_finder == ~silhouette,
            ~edge_finder == ~silhouette)
        found_edges = silhouette.find_edges(edge_finder)
        named_edges = silhouette.named_edges("right")
        self.assertEquals(len(found_edges), 1)
        self.assertEquals(found_edges, named_edges)

        edge_finder.place(
            -edge_finder == -silhouette,
            ~edge_finder == ~silhouette,
            ~edge_finder == ~silhouette)
        found_edges = silhouette.find_edges(edge_finder)
        named_edges = silhouette.named_edges("left")
        self.assertEquals(len(found_edges), 1)
        self.assertEquals(found_edges, named_edges)


def run(context):
    import sys
    test_suite = unittest.defaultTestLoader.loadTestsFromModule(
        sys.modules[__name__],
        # pattern="named_edges",
    )
    unittest.TextTestRunner(failfast=True).run(test_suite)
# File: core/configs.py (repo: firehose-dataset/congrad, license: MIT)

import os
import time
import argparse
def get_basic_parser():
    parser = argparse.ArgumentParser(description='Transformer Language Model', add_help=False)
    parser.add_argument('--gpu0_bsz', type=int, default=-1,
                        help='batch size on gpu 0')
    parser.add_argument('--n_layer', type=int, default=12,
                        help='number of total layers')
    parser.add_argument('--n_head', type=int, default=10,
                        help='number of heads')
    parser.add_argument('--d_head', type=int, default=50,
                        help='head dimension')
    parser.add_argument('--d_embed', type=int, default=-1,
                        help='embedding dimension')
    parser.add_argument('--d_user_embed', type=int, default=-1,
                        help='user_embedding dimension')
    parser.add_argument('--mtl_depth', type=int, default=1,
                        help='depth of the mtl model')
    parser.add_argument('--mtl_width', type=float, default=0.5,
                        help='width of the mtl model')
    parser.add_argument('--d_model', type=int, default=500,
                        help='model dimension')
    parser.add_argument('--d_inner', type=int, default=1000,
                        help='inner dimension in FF')
    parser.add_argument('--dropout', type=float, default=0.0,
                        help='global dropout rate')
    parser.add_argument('--dropatt', type=float, default=0.0,
                        help='attention probability dropout rate')
    parser.add_argument('--init', default='normal', type=str,
                        help='parameter initializer to use.')
    parser.add_argument('--emb_init', default='normal', type=str,
                        help='parameter initializer to use.')
    parser.add_argument('--init_range', type=float, default=0.01,
                        help='parameters initialized by U(-init_range, init_range)')
    parser.add_argument('--emb_init_range', type=float, default=0.01,
                        help='parameters initialized by U(-init_range, init_range)')
    parser.add_argument('--init_std', type=float, default=0.001,
                        help='parameters initialized by N(0, init_std)')
    parser.add_argument('--proj_init_std', type=float, default=0.01,
                        help='parameters initialized by N(0, init_std)')
    parser.add_argument('--optim', default='adam', type=str,
                        choices=['adam', 'sgd', 'adagrad'],
                        help='optimizer to use.')
    parser.add_argument('--lr', type=float, default=0.00025,
                        help='initial learning rate (0.00025|5 for adam|sgd)')
    parser.add_argument('--mom', type=float, default=0.0,
                        help='momentum for sgd')
    parser.add_argument('--scheduler', default='cosine', type=str,
                        choices=['cosine', 'inv_sqrt', 'dev_perf', 'constant'],
                        help='lr scheduler to use.')
    parser.add_argument('--warmup_step', type=int, default=0,
                        help='upper epoch limit')
    parser.add_argument('--decay_rate', type=float, default=0.5,
                        help='decay factor when ReduceLROnPlateau is used')
    parser.add_argument('--lr_min', type=float, default=0.0,
                        help='minimum learning rate during annealing')
    parser.add_argument('--clip', type=float, default=0.25,
                        help='gradient clipping')
    parser.add_argument('--clip_nonemb', action='store_true',
                        help='only clip the gradient of non-embedding params')
    parser.add_argument('--parsimonious', action='store_true',
                        help='parsimonious mtl params')
    parser.add_argument('--max_step', type=int, default=50000,
                        help='upper step limit')
    parser.add_argument('--batch_size', type=int, default=60,
                        help='the data chunk size. In continual learning, this is the rate of '
                             'data arrival at each model learning step; In offline batch '
                             'learning, this is the equivalent term to the mini-batch size of '
                             'the model.')
    parser.add_argument('--eval_batch_size', type=int, default=60,
                        help='evaluation batch size')
    parser.add_argument('--eval_initial_model', action='store_true',
                        help='evaluate the initialized model')
    parser.add_argument('--batch_chunk', type=int, default=1,
                        help='split batch into chunks to save memory')
    parser.add_argument('--max_seqlen', type=int, default=280,
                        help='number of tokens to predict')
    parser.add_argument('--not_tied', action='store_true',
                        help='do not tie the word embedding and softmax weights')
    parser.add_argument('--seed', type=int, default=1111,
                        help='random seed')
    parser.add_argument('--cuda', action='store_true',
                        help='use CUDA')
    parser.add_argument('--varlen', action='store_true',
                        help='use variable length')
    parser.add_argument('--multi_gpu', action='store_true',
                        help='use multiple GPU')
    parser.add_argument('--log-interval', type=int, default=200,
                        help='report interval')
    parser.add_argument('--eval-interval', type=int, default=5000,
                        help='evaluation interval')
    parser.add_argument('--work_dir', default='exp/LM', type=str,
                        help='experiment directory.')
    parser.add_argument('--model_class', type=str, default='MemTransformerLM',
                        choices=['MTLMemTransformerLM', 'MemTransformerLM'],
                        help='choose transformer model')
    parser.add_argument('--clamp_len', type=int, default=-1,
                        help='use the same pos embeddings after clamp_len')
    parser.add_argument('--eta_min', type=float, default=0.0,
                        help='min learning rate for cosine scheduler')
    parser.add_argument('--max_eval_steps', type=int, default=-1,
                        help='max eval steps')
    parser.add_argument('--break_ratio', default=1.0, type=float,
                        help='number of data to break down')
    parser.add_argument('--subword-augment', action='store_true',
                        help='perform subword augmentation')
    parser.add_argument('--mtl-type', type=str, default='layerwise',
                        choices=['multi_encoder', 'multi_decoder', 'layerwise', 'all'],
                        help='types of multitask learning architecture')
    parser.add_argument('--snapshot_dir', type=str, default=None,
                        help='resume snapshot dir')
    parser.add_argument('--init_weights', type=str, default=None,
                        help='weights to init')
    parser.add_argument('--postfix', type=str, default=None,
                        help='postfix of experiment')
    parser.add_argument('--use_tb_logger', action='store_true',
                        help='Turn on tensorboard logger.')
    parser.add_argument('--async_lr', action='store_true',
                        help='Smaller lr for backbone and larger lr for mtl.')
    parser.add_argument('--debug', action='store_true',
                        help='run in debug mode (do not create exp dir)')
    parser.add_argument('--resume', action='store_true',
                        help='resume training from the saved checkpoint')
    parser.add_argument('--resume_dir', type=str, default='',
                        help='resume directory')
    parser.add_argument('--eval_online_fit', action='store_false',
                        help='Evaluate online loss as metric for online fit.')
    return parser
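The `add_help=False` on the parser above is the usual sign that it is meant to be composed into a concrete script's CLI via `parents=[...]`. A minimal self-contained sketch of that pattern, reimplementing only two of the flags for illustration (the composition step is an assumption about how the project's scripts use it):

```python
import argparse

# Shared "basic" parser with add_help=False, mirroring get_basic_parser() above.
basic = argparse.ArgumentParser(description='Transformer Language Model', add_help=False)
basic.add_argument('--lr', type=float, default=0.00025, help='initial learning rate')
basic.add_argument('--cuda', action='store_true', help='use CUDA')

# A concrete script composes it: parents=[...] inherits all of its arguments,
# and only this outer parser supplies -h/--help (hence add_help=False above).
cli = argparse.ArgumentParser(parents=[basic])
args = cli.parse_args(['--lr', '0.001', '--cuda'])
print(args.lr, args.cuda)
```

Parsing an empty argument list instead would yield the declared defaults (`lr=0.00025`, `cuda=False`).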
# File: main.py (repo: ragequitcc/Ragequit.FritzBoxStatus, license: MIT)

from os import environ
from http.server import BaseHTTPRequestHandler, HTTPServer

from fritzconnection.lib.fritzstatus import FritzStatus
from fritzconnection import FritzConnection

if "FritzBoxUri" not in environ:
    print("FritzBoxUri Missing")
    quit()
if "FritzBoxUser" not in environ:
    print("FritzBoxUser Missing")
    quit()
if "FritzBoxPassword" not in environ:
    print("FritzBoxPassword Missing")
    quit()

fc = FritzConnection(address=environ["FritzBoxUri"], user=environ["FritzBoxUser"], password=environ["FritzBoxPassword"])
fs = FritzStatus(fc)


class Server(BaseHTTPRequestHandler):
    def do_GET(self):
        ip = fs.external_ip if fs.external_ip else fc.call_action("WANPPPConnection1", "GetInfo")["NewExternalIPAddress"]
        if self.path == "/ip/v4":
            self.send_response(200)
            self.send_header("Content-type", "text/html")
            self.end_headers()
            self.wfile.write(ip.encode())
        elif self.path == "/ip/v6":
            ip = fs.external_ipv6 if fs.external_ipv6 else "Not Available"
            self.send_response(200)
            self.send_header("Content-type", "text/html")
            self.end_headers()
            self.wfile.write(ip.encode())


httpd = HTTPServer(("0.0.0.0", 8080), Server)
httpd.serve_forever()
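The `do_GET` routing above can be exercised without a FritzBox by substituting a stub handler with the same shape; this sketch (hard-coded address and `StubServer` name are ours, not the project's) binds to port 0 so the OS picks a free port, serves from a background thread, and fetches `/ip/v4`:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubServer(BaseHTTPRequestHandler):
    """Same routing shape as Server above, with a fixed IP instead of a FritzBox."""
    def do_GET(self):
        if self.path == "/ip/v4":
            self.send_response(200)
            self.send_header("Content-type", "text/html")
            self.end_headers()
            self.wfile.write(b"203.0.113.7")  # TEST-NET-3 documentation address
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, format, *args):
        pass  # keep the demo quiet

httpd = HTTPServer(("127.0.0.1", 0), StubServer)  # port 0 -> OS-assigned port
threading.Thread(target=httpd.serve_forever, daemon=True).start()

body = urllib.request.urlopen(
    "http://127.0.0.1:%d/ip/v4" % httpd.server_port).read().decode()
httpd.shutdown()
print(body)
```

Note that the original handler sends no response at all for unmatched paths; the stub's 404 branch is one way to close that gap.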
# File: status/get_status_from_s3_path.py (repo: oslokommune/dataplatform-status-api, license: MIT)

import base64
import simplejson

from aws_xray_sdk.core import patch_all, xray_recorder
from botocore.exceptions import ClientError
from okdata.aws.logging import (
    logging_wrapper,
    log_add,
    log_exception,
)
from okdata.resource_auth import ResourceAuthorizer

from status.StatusData import StatusData
from status.common import (
    response,
    response_error,
    extract_bearer_token,
    extract_dataset_id,
)

patch_all()

resource_authorizer = ResourceAuthorizer()


@logging_wrapper
@xray_recorder.capture("get_status_from_s3_path")
def handler(event, context):
    params = event["pathParameters"]
    # The s3_path parameter MUST be base64 encoded since it can contain "/"
    # and any other character known to man.....
    path = base64.b64decode(params["s3_path"]).decode("utf-8", "ignore")
    log_add(s3_path=path)
    db = StatusData()
    try:
        status_item = db.get_status_from_s3_path(path)
        if status_item is None:
            error = "Could not find item"
            return response_error(404, error)

        dataset_id = extract_dataset_id(status_item)
        bearer_token = extract_bearer_token(event)
        log_add(trace_id=status_item["trace_id"], dataset_id=dataset_id)
        if dataset_id and resource_authorizer.has_access(
            bearer_token,
            "okdata:dataset:write",
            f"okdata:dataset:{dataset_id}",
            use_whitelist=True,
        ):
            ret = {
                # TODO: Return both id and trace_id until
                # all clients are updated
                "id": status_item["trace_id"],
                "trace_id": status_item["trace_id"],
            }
            return response(200, simplejson.dumps(ret))

        error = "Access denied"
        return response_error(403, error)
    except ClientError as ce:
        log_exception(ce)
        error_msg = f"Could not get status: {ce}"
        return response_error(404, error_msg)
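A caller of this endpoint has to base64-encode the S3 key before putting it in the `{s3_path}` path parameter, exactly reversing the `b64decode` at the top of the handler. A sketch of that round trip (the sample key is made up for illustration):

```python
import base64

# Hypothetical S3 key; a raw "/" would be ambiguous in a URL path parameter.
s3_path = "processed/green/my-dataset/version=1/part-0000.parquet"

# Encode for use as the {s3_path} path parameter.
encoded = base64.b64encode(s3_path.encode("utf-8")).decode("ascii")

# The handler reverses this with base64.b64decode(...).decode("utf-8", "ignore").
decoded = base64.b64decode(encoded).decode("utf-8", "ignore")
print(decoded == s3_path)
```

Note the handler uses the standard base64 alphabet, whose output can itself contain `/` and `+`; clients would additionally need to URL-encode the result (or the API would need `urlsafe_b64encode`) for keys whose encoding happens to include those characters.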
# File: 0901-1000/0931-Median of K Sorted Arrays/0931-Median of K Sorted Arrays.py (repo: jiadaizhao/LintCode, license: MIT)

import bisect
class Solution:
    """
    @param nums: the given k sorted arrays
    @return: the median of the given k sorted arrays
    """
    def findMedian(self, nums):
        # write your code here
        n = sum(len(arr) for arr in nums)
        if n == 0:
            return 0

        def findKth(k):
            # Binary search on the value range: find the largest value `mid`
            # such that at least k elements across all arrays are >= mid,
            # i.e. the k-th largest element overall.
            result = low = 0
            high = (1 << 31) - 1
            while low <= high:
                mid = (low + high) // 2
                # Count elements >= mid in each sorted array via bisect.
                count = sum(len(arr) - bisect.bisect_left(arr, mid) for arr in nums)
                if count >= k:
                    result = mid
                    low = mid + 1
                else:
                    high = mid - 1
            return result

        if n & 1:
            return findKth(n // 2 + 1)
        else:
            return (findKth(n // 2) + findKth(n // 2 + 1)) / 2
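To see the value-range binary search in action, here is a standalone function using the same approach as `Solution.findMedian` (reimplemented so the snippet runs on its own; like the original, it assumes the arrays hold non-negative integers, since the search starts at 0):

```python
import bisect

def find_median_k_sorted(nums):
    """Median of k sorted arrays via binary search on the value range,
    mirroring Solution.findMedian above."""
    n = sum(len(arr) for arr in nums)
    if n == 0:
        return 0

    def find_kth(k):
        # Largest value with at least k elements >= it: the k-th largest.
        result, low, high = 0, 0, (1 << 31) - 1
        while low <= high:
            mid = (low + high) // 2
            count = sum(len(arr) - bisect.bisect_left(arr, mid) for arr in nums)
            if count >= k:
                result, low = mid, mid + 1
            else:
                high = mid - 1
        return result

    if n & 1:
        return find_kth(n // 2 + 1)
    return (find_kth(n // 2) + find_kth(n // 2 + 1)) / 2

print(find_median_k_sorted([[1, 2, 3], [4, 5]]))  # odd total of 5 -> 3
print(find_median_k_sorted([[1, 2], [3, 4]]))     # even total of 4 -> 2.5
```

Each probe costs O(k log m) bisections, giving O(k log m log V) overall for value range V, which avoids merging the arrays.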
# File: Lib/unittest/case.py (repo: sireliah/polish-python, license: PSF-2.0)
# Note: this project mechanically replaces Python keywords with Polish ones
# (zaimportuj=import, klasa=class, jeżeli=if, zwróć=return, etc.); the
# keywords are kept below, while docstrings and comments are restored to
# the English the replacement had corrupted.

"""Test case implementation"""
zaimportuj sys
zaimportuj functools
zaimportuj difflib
zaimportuj logging
zaimportuj pprint
zaimportuj re
zaimportuj warnings
zaimportuj collections
zaimportuj contextlib
zaimportuj traceback
z . zaimportuj result
z .util zaimportuj (strclass, safe_repr, _count_diff_all_purpose,
                    _count_diff_hashable, _common_shorten_repr)
__unittest = Prawda
DIFF_OMITTED = ('\nDiff is %s characters long. '
                'Set self.maxDiff to None to see it.')
klasa SkipTest(Exception):
    """
    Raise this exception in a test to skip it.

    Usually you can use TestCase.skipTest() or one of the skipping decorators
    instead of raising this directly.
    """


klasa _ShouldStop(Exception):
    """
    The test should stop.
    """


klasa _UnexpectedSuccess(Exception):
    """
    The test was supposed to fail, but it didn't!
    """
klasa _Outcome(object):
    def __init__(self, result=Nic):
        self.expecting_failure = Nieprawda
        self.result = result
        self.result_supports_subtests = hasattr(result, "addSubTest")
        self.success = Prawda
        self.skipped = []
        self.expectedFailure = Nic
        self.errors = []

    @contextlib.contextmanager
    def testPartExecutor(self, test_case, isTest=Nieprawda):
        old_success = self.success
        self.success = Prawda
        spróbuj:
            uzyskaj
        wyjąwszy KeyboardInterrupt:
            podnieś
        wyjąwszy SkipTest jako e:
            self.success = Nieprawda
            self.skipped.append((test_case, str(e)))
        wyjąwszy _ShouldStop:
            dalej
        wyjąwszy:
            exc_info = sys.exc_info()
            jeżeli self.expecting_failure:
                self.expectedFailure = exc_info
            inaczej:
                self.success = Nieprawda
                self.errors.append((test_case, exc_info))
            # explicitly break a reference cycle:
            # exc_info -> frame -> exc_info
            exc_info = Nic
        inaczej:
            jeżeli self.result_supports_subtests oraz self.success:
                self.errors.append((test_case, Nic))
        w_końcu:
            self.success = self.success oraz old_success
def _id(obj):
    zwróć obj


def skip(reason):
    """
    Unconditionally skip a test.
    """
    def decorator(test_item):
        jeżeli nie isinstance(test_item, type):
            @functools.wraps(test_item)
            def skip_wrapper(*args, **kwargs):
                podnieś SkipTest(reason)
            test_item = skip_wrapper

        test_item.__unittest_skip__ = Prawda
        test_item.__unittest_skip_why__ = reason
        zwróć test_item
    zwróć decorator


def skipIf(condition, reason):
    """
    Skip a test if the condition is true.
    """
    jeżeli condition:
        zwróć skip(reason)
    zwróć _id


def skipUnless(condition, reason):
    """
    Skip a test unless the condition is true.
    """
    jeżeli nie condition:
        zwróć skip(reason)
    zwróć _id


def expectedFailure(test_item):
    test_item.__unittest_expecting_failure__ = Prawda
    zwróć test_item


def _is_subtype(expected, basetype):
    jeżeli isinstance(expected, tuple):
        zwróć all(_is_subtype(e, basetype) dla e w expected)
    zwróć isinstance(expected, type) oraz issubclass(expected, basetype)
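The skip helpers above behave exactly like their standard-library counterparts (the Polish keywords are a mechanical rename, not a behavior change). A quick standard-Python demo of all three decorators, runnable anywhere:

```python
import unittest

class SkipDemo(unittest.TestCase):
    @unittest.skip("demonstrating unconditional skip")
    def test_always_skipped(self):
        self.fail("never runs")

    @unittest.skipIf(1 + 1 == 2, "condition is true, so skipped")
    def test_conditionally_skipped(self):
        self.fail("never runs")

    @unittest.skipUnless(1 + 1 == 3, "condition is false, so skipped")
    def test_skipped_unless(self):
        self.fail("never runs")

result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(SkipDemo).run(result)
print(len(result.skipped))
```

All three bodies call `self.fail`, yet the run succeeds because each test is skipped before its body executes; the skip reasons end up in `result.skipped`.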
klasa _BaseTestCaseContext:

    def __init__(self, test_case):
        self.test_case = test_case

    def _raiseFailure(self, standardMsg):
        msg = self.test_case._formatMessage(self.msg, standardMsg)
        podnieś self.test_case.failureException(msg)


klasa _AssertRaisesBaseContext(_BaseTestCaseContext):

    def __init__(self, expected, test_case, expected_regex=Nic):
        _BaseTestCaseContext.__init__(self, test_case)
        self.expected = expected
        self.test_case = test_case
        jeżeli expected_regex jest nie Nic:
            expected_regex = re.compile(expected_regex)
        self.expected_regex = expected_regex
        self.obj_name = Nic
        self.msg = Nic

    def handle(self, name, args, kwargs):
        """
        If args is empty, assertRaises/Warns is being used as a
        context manager, so check for a 'msg' kwarg and return self.
        If args is not empty, call a callable passing positional and keyword
        arguments.
        """
        jeżeli nie _is_subtype(self.expected, self._base_type):
            podnieś TypeError('%s() arg 1 must be %s' %
                              (name, self._base_type_str))
        jeżeli args oraz args[0] jest Nic:
            warnings.warn("callable is None",
                          DeprecationWarning, 3)
            args = ()
        jeżeli nie args:
            self.msg = kwargs.pop('msg', Nic)
            jeżeli kwargs:
                warnings.warn('%r is an invalid keyword argument for '
                              'this function' % next(iter(kwargs)),
                              DeprecationWarning, 3)
            zwróć self

        callable_obj, *args = args
        spróbuj:
            self.obj_name = callable_obj.__name__
        wyjąwszy AttributeError:
            self.obj_name = str(callable_obj)
        przy self:
            callable_obj(*args, **kwargs)
klasa _AssertRaisesContext(_AssertRaisesBaseContext):
    """A context manager used to implement TestCase.assertRaises* methods."""

    _base_type = BaseException
    _base_type_str = 'an exception type or tuple of exception types'

    def __enter__(self):
        zwróć self

    def __exit__(self, exc_type, exc_value, tb):
        jeżeli exc_type jest Nic:
            spróbuj:
                exc_name = self.expected.__name__
            wyjąwszy AttributeError:
                exc_name = str(self.expected)
            jeżeli self.obj_name:
                self._raiseFailure("{} not raised by {}".format(exc_name,
                                                                self.obj_name))
            inaczej:
                self._raiseFailure("{} not raised".format(exc_name))
        inaczej:
            traceback.clear_frames(tb)
        jeżeli nie issubclass(exc_type, self.expected):
            # let unexpected exceptions pass through
            zwróć Nieprawda
        # store exception, without traceback, for later retrieval
        self.exception = exc_value.with_traceback(Nic)
        jeżeli self.expected_regex jest Nic:
            zwróć Prawda

        expected_regex = self.expected_regex
        jeżeli nie expected_regex.search(str(exc_value)):
            self._raiseFailure('"{}" does not match "{}"'.format(
                expected_regex.pattern, str(exc_value)))
        zwróć Prawda
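In standard-keyword Python, this context manager is what backs the familiar `assertRaises` idiom: `__exit__` swallows only the expected exception type, stores it (minus traceback) on the context object, and optionally matches `str(exception)` against a regex. A minimal self-contained demo (not part of this file):

```python
import unittest

class RaisesDemo(unittest.TestCase):
    def test_value_error_is_caught_and_stored(self):
        # assertRaises as a context manager: the context object keeps the
        # exception (without its traceback) for later inspection.
        with self.assertRaises(ValueError) as cm:
            int("not a number")
        self.assertIn("not a number", str(cm.exception))

    def test_regex_variant(self):
        # assertRaisesRegex additionally searches str(exception) for a pattern,
        # matching the expected_regex branch of __exit__ above.
        with self.assertRaisesRegex(ZeroDivisionError, "division"):
            1 / 0

result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(RaisesDemo).run(result)
print(result.wasSuccessful())
```

If the block raised nothing, the "`{} not raised`" failure path would trigger instead; an unrelated exception type would propagate unchanged, per the `issubclass` check.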
klasa _AssertWarnsContext(_AssertRaisesBaseContext):
    """A context manager used to implement TestCase.assertWarns* methods."""

    _base_type = Warning
    _base_type_str = 'a warning type or tuple of warning types'

    def __enter__(self):
        # The __warningregistry__'s need to be in a pristine state for tests
        # to work properly.
        dla v w sys.modules.values():
            jeżeli getattr(v, '__warningregistry__', Nic):
                v.__warningregistry__ = {}
        self.warnings_manager = warnings.catch_warnings(record=Prawda)
        self.warnings = self.warnings_manager.__enter__()
        warnings.simplefilter("always", self.expected)
        zwróć self

    def __exit__(self, exc_type, exc_value, tb):
        self.warnings_manager.__exit__(exc_type, exc_value, tb)
        jeżeli exc_type jest nie Nic:
            # let unexpected exceptions pass through
            zwróć
        spróbuj:
            exc_name = self.expected.__name__
        wyjąwszy AttributeError:
            exc_name = str(self.expected)
        first_matching = Nic
        dla m w self.warnings:
            w = m.message
            jeżeli nie isinstance(w, self.expected):
                kontynuuj
            jeżeli first_matching jest Nic:
                first_matching = w
            jeżeli (self.expected_regex jest nie Nic oraz
                    nie self.expected_regex.search(str(w))):
                kontynuuj
            # store warning for later retrieval
            self.warning = w
            self.filename = m.filename
            self.lineno = m.lineno
            zwróć
        # Now we simply try to choose a helpful failure message
        jeżeli first_matching jest nie Nic:
            self._raiseFailure('"{}" does not match "{}"'.format(
                self.expected_regex.pattern, str(first_matching)))
        jeżeli self.obj_name:
            self._raiseFailure("{} not triggered by {}".format(exc_name,
                                                               self.obj_name))
        inaczej:
            self._raiseFailure("{} not triggered".format(exc_name))
_LoggingWatcher = collections.namedtuple("_LoggingWatcher",
                                         ["records", "output"])


klasa _CapturingHandler(logging.Handler):
    """
    A logging handler capturing all (raw and formatted) logging output.
    """
    def __init__(self):
        logging.Handler.__init__(self)
        self.watcher = _LoggingWatcher([], [])

    def flush(self):
        dalej

    def emit(self, record):
        self.watcher.records.append(record)
        msg = self.format(record)
        self.watcher.output.append(msg)
klasa _AssertLogsContext(_BaseTestCaseContext):
    """A context manager used to implement TestCase.assertLogs()."""

    LOGGING_FORMAT = "%(levelname)s:%(name)s:%(message)s"

    def __init__(self, test_case, logger_name, level):
        _BaseTestCaseContext.__init__(self, test_case)
        self.logger_name = logger_name
        jeżeli level:
            self.level = logging._nameToLevel.get(level, level)
        inaczej:
            self.level = logging.INFO
        self.msg = Nic

    def __enter__(self):
        jeżeli isinstance(self.logger_name, logging.Logger):
            logger = self.logger = self.logger_name
        inaczej:
            logger = self.logger = logging.getLogger(self.logger_name)
        formatter = logging.Formatter(self.LOGGING_FORMAT)
        handler = _CapturingHandler()
        handler.setFormatter(formatter)
        self.watcher = handler.watcher
        self.old_handlers = logger.handlers[:]
        self.old_level = logger.level
        self.old_propagate = logger.propagate
        logger.handlers = [handler]
        logger.setLevel(self.level)
        logger.propagate = Nieprawda
        zwróć handler.watcher

    def __exit__(self, exc_type, exc_value, tb):
        self.logger.handlers = self.old_handlers
        self.logger.propagate = self.old_propagate
        self.logger.setLevel(self.old_level)
        jeżeli exc_type jest nie Nic:
            # let unexpected exceptions pass through
            zwróć Nieprawda
        jeżeli len(self.watcher.records) == 0:
            self._raiseFailure(
                "no logs of level {} or higher triggered on {}"
                .format(logging.getLevelName(self.level), self.logger.name))
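From the test author's side, this machinery is reached through `assertLogs`: it swaps in a capturing handler (like `_CapturingHandler` above), disables propagation, and hands back a watcher exposing `.records` and `.output` formatted with `LOGGING_FORMAT`. A small standard-Python demo:

```python
import logging
import unittest

class LogsDemo(unittest.TestCase):
    def test_assert_logs_captures_records(self):
        # The watcher returned by assertLogs collects every record emitted
        # on the named logger at or above the given level.
        with self.assertLogs("demo.logger", level="INFO") as cm:
            logging.getLogger("demo.logger").warning("disk %s is full", "sda1")
        # Output lines follow the LOGGING_FORMAT "LEVEL:name:message".
        self.assertEqual(cm.output, ["WARNING:demo.logger:disk sda1 is full"])

result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(LogsDemo).run(result)
print(result.wasSuccessful())
```

If the `with` block had logged nothing at INFO or above, `__exit__` would raise the "no logs of level ... triggered" failure shown above.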
klasa TestCase(object):
"""A klasa whose instances are single test cases.
By default, the test code itself should be placed w a method named
'runTest'.
If the fixture may be used dla many test cases, create as
many test methods jako are needed. When instantiating such a TestCase
subclass, specify w the constructor arguments the name of the test method
that the instance jest to execute.
Test authors should subclass TestCase dla their own tests. Construction
oraz deconstruction of the test's environment ('fixture') can be
implemented by overriding the 'setUp' oraz 'tearDown' methods respectively.
If it jest necessary to override the __init__ method, the base class
__init__ method must always be called. It jest important that subclasses
should nie change the signature of their __init__ method, since instances
of the classes are instantiated automatically by parts of the framework
w order to be run.
When subclassing TestCase, you can set these attributes:
* failureException: determines which exception will be podnieśd when
the instance's assertion methods fail; test methods raising this
exception will be deemed to have 'failed' rather than 'errored'.
* longMessage: determines whether long messages (including repr of
objects used w assert methods) will be printed on failure w *addition*
to any explicit message dalejed.
* maxDiff: sets the maximum length of a diff w failure messages
by assert methods using difflib. It jest looked up jako an instance
attribute so can be configured by individual tests jeżeli required.
"""
failureException = AssertionError
longMessage = Prawda
maxDiff = 80*8
# If a string jest longer than _diffThreshold, use normal comparison instead
# of difflib. See #11763.
_diffThreshold = 2**16
# Attribute used by TestSuite dla classSetUp
_classSetupFailed = Nieprawda
    def __init__(self, methodName='runTest'):
        """Create an instance of the class that will use the named test
        method when executed. Raises a ValueError if the instance does
        not have a method with the specified name.
        """
        self._testMethodName = methodName
        self._outcome = None
        self._testMethodDoc = 'No test'
        try:
            testMethod = getattr(self, methodName)
        except AttributeError:
            if methodName != 'runTest':
                # we allow instantiation with no explicit method name
                # but not an *incorrect* or missing method name
                raise ValueError("no such test method in %s: %s" %
                                 (self.__class__, methodName))
        else:
            self._testMethodDoc = testMethod.__doc__
        self._cleanups = []
        self._subtest = None

        # Map types to custom assertEqual functions that will compare
        # instances of said type in more detail to generate a more useful
        # error message.
        self._type_equality_funcs = {}
        self.addTypeEqualityFunc(dict, 'assertDictEqual')
        self.addTypeEqualityFunc(list, 'assertListEqual')
        self.addTypeEqualityFunc(tuple, 'assertTupleEqual')
        self.addTypeEqualityFunc(set, 'assertSetEqual')
        self.addTypeEqualityFunc(frozenset, 'assertSetEqual')
        self.addTypeEqualityFunc(str, 'assertMultiLineEqual')
    def addTypeEqualityFunc(self, typeobj, function):
        """Add a type specific assertEqual style function to compare a type.

        This method is for use by TestCase subclasses that need to register
        their own type equality functions to provide nicer error messages.

        Args:
            typeobj: The data type to call this function on when both values
                    are of the same type in assertEqual().
            function: The callable taking two arguments and an optional
                    msg= argument that raises self.failureException with a
                    useful error message when the two arguments are not equal.
        """
        self._type_equality_funcs[typeobj] = function
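A minimal sketch of how a subclass might use addTypeEqualityFunc through the public unittest API; the Point class and test names here are illustrative, not part of this module:

```python
import unittest

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

class PointTest(unittest.TestCase):
    def setUp(self):
        # Register a comparer so assertEqual uses it whenever both
        # arguments are Point instances.
        self.addTypeEqualityFunc(Point, self.assertPointEqual)

    def assertPointEqual(self, first, second, msg=None):
        if first != second:
            standard = 'Point(%r, %r) != Point(%r, %r)' % (
                first.x, first.y, second.x, second.y)
            raise self.failureException(self._formatMessage(msg, standard))

    def test_equal_points(self):
        self.assertEqual(Point(1, 2), Point(1, 2))

result = unittest.TestResult()
PointTest('test_equal_points').run(result)
```

Registering a bound method (rather than a method name string) also works, since _getAssertEqualityFunc only resolves string values via getattr.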
    def addCleanup(self, function, *args, **kwargs):
        """Add a function, with arguments, to be called when the test is
        completed. Functions added are called on a LIFO basis and are
        called after tearDown on test failure or success.

        Cleanup items are called even if setUp fails (unlike tearDown)."""
        self._cleanups.append((function, args, kwargs))
    def setUp(self):
        "Hook method for setting up the test fixture before exercising it."
        pass

    def tearDown(self):
        "Hook method for deconstructing the test fixture after testing it."
        pass

    @classmethod
    def setUpClass(cls):
        "Hook method for setting up class fixture before running tests in the class."

    @classmethod
    def tearDownClass(cls):
        "Hook method for deconstructing the class fixture after running all tests in the class."
    def countTestCases(self):
        return 1

    def defaultTestResult(self):
        return result.TestResult()

    def shortDescription(self):
        """Returns a one-line description of the test, or None if no
        description has been provided.

        The default implementation of this method returns the first line of
        the specified test method's docstring.
        """
        doc = self._testMethodDoc
        return doc and doc.split("\n")[0].strip() or None

    def id(self):
        return "%s.%s" % (strclass(self.__class__), self._testMethodName)

    def __eq__(self, other):
        if type(self) is not type(other):
            return NotImplemented

        return self._testMethodName == other._testMethodName

    def __hash__(self):
        return hash((type(self), self._testMethodName))

    def __str__(self):
        return "%s (%s)" % (self._testMethodName, strclass(self.__class__))

    def __repr__(self):
        return "<%s testMethod=%s>" % \
               (strclass(self.__class__), self._testMethodName)
    def _addSkip(self, result, test_case, reason):
        addSkip = getattr(result, 'addSkip', None)
        if addSkip is not None:
            addSkip(test_case, reason)
        else:
            warnings.warn("TestResult has no addSkip method, skips not reported",
                          RuntimeWarning, 2)
            result.addSuccess(test_case)
    @contextlib.contextmanager
    def subTest(self, msg=None, **params):
        """Return a context manager that will return the enclosed block
        of code in a subtest identified by the optional message and
        keyword parameters.  A failure in the subtest marks the test
        case as failed but resumes execution at the end of the enclosed
        block, allowing further test code to be executed.
        """
        if not self._outcome.result_supports_subtests:
            yield
            return
        parent = self._subtest
        if parent is None:
            params_map = collections.ChainMap(params)
        else:
            params_map = parent.params.new_child(params)
        self._subtest = _SubTest(self, msg, params_map)
        try:
            with self._outcome.testPartExecutor(self._subtest, isTest=True):
                yield
            if not self._outcome.success:
                result = self._outcome.result
                if result is not None and result.failfast:
                    raise _ShouldStop
            elif self._outcome.expectedFailure:
                # If the test is expecting a failure, we really want to
                # stop now and register the expected failure.
                raise _ShouldStop
        finally:
            self._subtest = parent
    def _feedErrorsToResult(self, result, errors):
        for test, exc_info in errors:
            if isinstance(test, _SubTest):
                result.addSubTest(test.test_case, test, exc_info)
            elif exc_info is not None:
                if issubclass(exc_info[0], self.failureException):
                    result.addFailure(test, exc_info)
                else:
                    result.addError(test, exc_info)
    def _addExpectedFailure(self, result, exc_info):
        try:
            addExpectedFailure = result.addExpectedFailure
        except AttributeError:
            warnings.warn("TestResult has no addExpectedFailure method, reporting as passes",
                          RuntimeWarning)
            result.addSuccess(self)
        else:
            addExpectedFailure(self, exc_info)

    def _addUnexpectedSuccess(self, result):
        try:
            addUnexpectedSuccess = result.addUnexpectedSuccess
        except AttributeError:
            warnings.warn("TestResult has no addUnexpectedSuccess method, reporting as failure",
                          RuntimeWarning)
            # We need to pass an actual exception and traceback to addFailure,
            # otherwise the legacy result can choke.
            try:
                raise _UnexpectedSuccess from None
            except _UnexpectedSuccess:
                result.addFailure(self, sys.exc_info())
        else:
            addUnexpectedSuccess(self)
    def run(self, result=None):
        orig_result = result
        if result is None:
            result = self.defaultTestResult()
            startTestRun = getattr(result, 'startTestRun', None)
            if startTestRun is not None:
                startTestRun()

        result.startTest(self)

        testMethod = getattr(self, self._testMethodName)
        if (getattr(self.__class__, "__unittest_skip__", False) or
            getattr(testMethod, "__unittest_skip__", False)):
            # If the class or method was skipped.
            try:
                skip_why = (getattr(self.__class__, '__unittest_skip_why__', '')
                            or getattr(testMethod, '__unittest_skip_why__', ''))
                self._addSkip(result, self, skip_why)
            finally:
                result.stopTest(self)
            return
        expecting_failure = getattr(testMethod,
                                    "__unittest_expecting_failure__", False)
        outcome = _Outcome(result)
        try:
            self._outcome = outcome

            with outcome.testPartExecutor(self):
                self.setUp()
            if outcome.success:
                outcome.expecting_failure = expecting_failure
                with outcome.testPartExecutor(self, isTest=True):
                    testMethod()
                outcome.expecting_failure = False
                with outcome.testPartExecutor(self):
                    self.tearDown()

            self.doCleanups()
            for test, reason in outcome.skipped:
                self._addSkip(result, test, reason)
            self._feedErrorsToResult(result, outcome.errors)
            if outcome.success:
                if expecting_failure:
                    if outcome.expectedFailure:
                        self._addExpectedFailure(result, outcome.expectedFailure)
                    else:
                        self._addUnexpectedSuccess(result)
                else:
                    result.addSuccess(self)
            return result
        finally:
            result.stopTest(self)
            if orig_result is None:
                stopTestRun = getattr(result, 'stopTestRun', None)
                if stopTestRun is not None:
                    stopTestRun()

            # explicitly break reference cycles:
            # outcome.errors -> frame -> outcome -> outcome.errors
            # outcome.expectedFailure -> frame -> outcome -> outcome.expectedFailure
            outcome.errors.clear()
            outcome.expectedFailure = None

            # clear the outcome, no more needed
            self._outcome = None
    def doCleanups(self):
        """Execute all cleanup functions. Normally called for you after
        tearDown."""
        outcome = self._outcome or _Outcome()
        while self._cleanups:
            function, args, kwargs = self._cleanups.pop()
            with outcome.testPartExecutor(self):
                function(*args, **kwargs)

        # return this for backwards compatibility
        # even though we no longer use it internally
        return outcome.success

    def __call__(self, *args, **kwds):
        return self.run(*args, **kwds)

    def debug(self):
        """Run the test without collecting errors in a TestResult"""
        self.setUp()
        getattr(self, self._testMethodName)()
        self.tearDown()
        while self._cleanups:
            function, args, kwargs = self._cleanups.pop(-1)
            function(*args, **kwargs)
    def skipTest(self, reason):
        """Skip this test."""
        raise SkipTest(reason)

    def fail(self, msg=None):
        """Fail immediately, with the given message."""
        raise self.failureException(msg)

    def assertFalse(self, expr, msg=None):
        """Check that the expression is false."""
        if expr:
            msg = self._formatMessage(msg, "%s is not false" % safe_repr(expr))
            raise self.failureException(msg)

    def assertTrue(self, expr, msg=None):
        """Check that the expression is true."""
        if not expr:
            msg = self._formatMessage(msg, "%s is not true" % safe_repr(expr))
            raise self.failureException(msg)
    def _formatMessage(self, msg, standardMsg):
        """Honour the longMessage attribute when generating failure messages.
        If longMessage is False this means:
        * Use only an explicit message if it is provided
        * Otherwise use the standard message for the assert

        If longMessage is True:
        * Use the standard message
        * If an explicit message is provided, plus ' : ' and the explicit message
        """
        if not self.longMessage:
            return msg or standardMsg
        if msg is None:
            return standardMsg
        try:
            # don't switch to '{}' formatting in Python 2.X
            # it changes the way unicode input is handled
            return '%s : %s' % (standardMsg, msg)
        except UnicodeDecodeError:
            return '%s : %s' % (safe_repr(standardMsg), safe_repr(msg))
    def assertRaises(self, expected_exception, *args, **kwargs):
        """Fail unless an exception of class expected_exception is raised
        by the callable when invoked with specified positional and
        keyword arguments. If a different type of exception is
        raised, it will not be caught, and the test case will be
        deemed to have suffered an error, exactly as for an
        unexpected exception.

        If called with the callable and arguments omitted, will return a
        context object used like this::

             with self.assertRaises(SomeException):
                 do_something()

        An optional keyword argument 'msg' can be provided when assertRaises
        is used as a context object.

        The context manager keeps a reference to the exception as
        the 'exception' attribute. This allows you to inspect the
        exception after the assertion::

            with self.assertRaises(SomeException) as cm:
                do_something()
            the_exception = cm.exception
            self.assertEqual(the_exception.error_code, 3)
        """
        context = _AssertRaisesContext(expected_exception, self)
        return context.handle('assertRaises', args, kwargs)
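A runnable version of both calling forms from the docstring, using concrete built-in exceptions in place of the hypothetical SomeException:

```python
import unittest

class RaisesDemo(unittest.TestCase):
    def test_context_manager_form(self):
        # The raised exception is kept on the context manager.
        with self.assertRaises(ZeroDivisionError) as cm:
            1 / 0
        self.assertIsInstance(cm.exception, ZeroDivisionError)

    def test_callable_form(self):
        # Callable-and-arguments form: the call is made for us.
        self.assertRaises(KeyError, {}.__getitem__, 'missing')

result = unittest.TestResult()
for name in ('test_context_manager_form', 'test_callable_form'):
    RaisesDemo(name).run(result)
```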
    def assertWarns(self, expected_warning, *args, **kwargs):
        """Fail unless a warning of class warnClass is triggered
        by the callable when invoked with specified positional and
        keyword arguments. If a different type of warning is
        triggered, it will not be handled: depending on the other
        warning filtering rules in effect, it might be silenced, printed
        out, or raised as an exception.

        If called with the callable and arguments omitted, will return a
        context object used like this::

             with self.assertWarns(SomeWarning):
                 do_something()

        An optional keyword argument 'msg' can be provided when assertWarns
        is used as a context object.

        The context manager keeps a reference to the first matching
        warning as the 'warning' attribute; similarly, the 'filename'
        and 'lineno' attributes give you information about the line
        of Python code from which the warning was triggered.
        This allows you to inspect the warning after the assertion::

            with self.assertWarns(SomeWarning) as cm:
                do_something()
            the_warning = cm.warning
            self.assertEqual(the_warning.some_attribute, 147)
        """
        context = _AssertWarnsContext(expected_warning, self)
        return context.handle('assertWarns', args, kwargs)
    def assertLogs(self, logger=None, level=None):
        """Fail unless a log message of level *level* or higher is emitted
        on *logger_name* or its children.  If omitted, *level* defaults to
        INFO and *logger* defaults to the root logger.

        This method must be used as a context manager, and will yield
        a recording object with two attributes: `output` and `records`.
        At the end of the context manager, the `output` attribute will
        be a list of the matching formatted log messages and the
        `records` attribute will be a list of the corresponding LogRecord
        objects.

        Example::

            with self.assertLogs('foo', level='INFO') as cm:
                logging.getLogger('foo').info('first message')
                logging.getLogger('foo.bar').error('second message')
            self.assertEqual(cm.output, ['INFO:foo:first message',
                                         'ERROR:foo.bar:second message'])
        """
        return _AssertLogsContext(self, logger, level)
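The docstring's example, packaged as a runnable test; note that messages emitted on the child logger 'foo.bar' propagate to 'foo' and are captured too:

```python
import logging
import unittest

class LogsDemo(unittest.TestCase):
    def test_assert_logs(self):
        with self.assertLogs('foo', level='INFO') as cm:
            logging.getLogger('foo').info('first message')
            logging.getLogger('foo.bar').error('second message')
        # cm.output holds the formatted messages, cm.records the LogRecords.
        self.assertEqual(cm.output, ['INFO:foo:first message',
                                     'ERROR:foo.bar:second message'])
        self.assertEqual(cm.records[1].levelno, logging.ERROR)

result = unittest.TestResult()
LogsDemo('test_assert_logs').run(result)
```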
    def _getAssertEqualityFunc(self, first, second):
        """Get a detailed comparison function for the types of the two args.

        Returns: A callable accepting (first, second, msg=None) that will
        raise a failure exception if first != second with a useful human
        readable error message for those types.
        """
        #
        # NOTE(gregory.p.smith): I considered isinstance(first, type(second))
        # and vice versa.  I opted for the conservative approach in case
        # subclasses are not intended to be compared in detail to their super
        # class instances using a type equality func.  This means testing
        # subtypes won't automagically use the detailed comparison.  Callers
        # should use their type specific assertSpamEqual method to compare
        # subclasses if the detailed comparison is desired and appropriate.
        # See the discussion in http://bugs.python.org/issue2578.
        #
        if type(first) is type(second):
            asserter = self._type_equality_funcs.get(type(first))
            if asserter is not None:
                if isinstance(asserter, str):
                    asserter = getattr(self, asserter)
                return asserter

        return self._baseAssertEqual
    def _baseAssertEqual(self, first, second, msg=None):
        """The default assertEqual implementation, not type specific."""
        if not first == second:
            standardMsg = '%s != %s' % _common_shorten_repr(first, second)
            msg = self._formatMessage(msg, standardMsg)
            raise self.failureException(msg)

    def assertEqual(self, first, second, msg=None):
        """Fail if the two objects are unequal as determined by the '=='
        operator.
        """
        assertion_func = self._getAssertEqualityFunc(first, second)
        assertion_func(first, second, msg=msg)
    def assertNotEqual(self, first, second, msg=None):
        """Fail if the two objects are equal as determined by the '!='
        operator.
        """
        if not first != second:
            msg = self._formatMessage(msg, '%s == %s' % (safe_repr(first),
                                                         safe_repr(second)))
            raise self.failureException(msg)
    def assertAlmostEqual(self, first, second, places=None, msg=None,
                          delta=None):
        """Fail if the two objects are unequal as determined by their
        difference rounded to the given number of decimal places
        (default 7) and comparing to zero, or by comparing that the
        difference between the two objects is more than the given delta.

        Note that decimal places (from zero) are usually not the same
        as significant digits (measured from the most significant digit).

        If the two objects compare equal then they will automatically
        compare almost equal.
        """
        if first == second:
            # shortcut
            return
        if delta is not None and places is not None:
            raise TypeError("specify delta or places not both")

        if delta is not None:
            if abs(first - second) <= delta:
                return

            standardMsg = '%s != %s within %s delta' % (safe_repr(first),
                                                        safe_repr(second),
                                                        safe_repr(delta))
        else:
            if places is None:
                places = 7

            if round(abs(second-first), places) == 0:
                return

            standardMsg = '%s != %s within %r places' % (safe_repr(first),
                                                         safe_repr(second),
                                                         places)
        msg = self._formatMessage(msg, standardMsg)
        raise self.failureException(msg)
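A short sketch of the two comparison modes (rounding to decimal places versus an absolute delta); class and test names are illustrative:

```python
import unittest

class AlmostEqualDemo(unittest.TestCase):
    def test_default_places(self):
        # The float error here (~5.6e-17) rounds to zero at 7 places.
        self.assertAlmostEqual(0.1 + 0.2, 0.3)

    def test_explicit_places(self):
        # round(abs(1.002 - 1.001), 2) == 0.0, so this passes.
        self.assertAlmostEqual(1.001, 1.002, places=2)

    def test_delta(self):
        # With delta=, |first - second| is compared against the bound instead.
        self.assertAlmostEqual(100, 104, delta=5)

result = unittest.TestResult()
for name in ('test_default_places', 'test_explicit_places', 'test_delta'):
    AlmostEqualDemo(name).run(result)
```

Passing both places= and delta= raises TypeError, as the implementation above shows.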
    def assertNotAlmostEqual(self, first, second, places=None, msg=None,
                             delta=None):
        """Fail if the two objects are equal as determined by their
        difference rounded to the given number of decimal places
        (default 7) and comparing to zero, or by comparing that the
        difference between the two objects is less than the given delta.

        Note that decimal places (from zero) are usually not the same
        as significant digits (measured from the most significant digit).

        Objects that are equal automatically fail.
        """
        if delta is not None and places is not None:
            raise TypeError("specify delta or places not both")
        if delta is not None:
            if not (first == second) and abs(first - second) > delta:
                return
            standardMsg = '%s == %s within %s delta' % (safe_repr(first),
                                                        safe_repr(second),
                                                        safe_repr(delta))
        else:
            if places is None:
                places = 7
            if not (first == second) and round(abs(second-first), places) != 0:
                return
            standardMsg = '%s == %s within %r places' % (safe_repr(first),
                                                         safe_repr(second),
                                                         places)

        msg = self._formatMessage(msg, standardMsg)
        raise self.failureException(msg)
    def assertSequenceEqual(self, seq1, seq2, msg=None, seq_type=None):
        """An equality assertion for ordered sequences (like lists and tuples).

        For the purposes of this function, a valid ordered sequence type is one
        which can be indexed, has a length, and has an equality operator.

        Args:
            seq1: The first sequence to compare.
            seq2: The second sequence to compare.
            seq_type: The expected datatype of the sequences, or None if no
                    datatype should be enforced.
            msg: Optional message to use on failure instead of a list of
                    differences.
        """
        if seq_type is not None:
            seq_type_name = seq_type.__name__
            if not isinstance(seq1, seq_type):
                raise self.failureException('First sequence is not a %s: %s'
                                            % (seq_type_name, safe_repr(seq1)))
            if not isinstance(seq2, seq_type):
                raise self.failureException('Second sequence is not a %s: %s'
                                            % (seq_type_name, safe_repr(seq2)))
        else:
            seq_type_name = "sequence"

        differing = None
        try:
            len1 = len(seq1)
        except (TypeError, NotImplementedError):
            differing = 'First %s has no length. Non-sequence?' % (
                seq_type_name)

        if differing is None:
            try:
                len2 = len(seq2)
            except (TypeError, NotImplementedError):
                differing = 'Second %s has no length. Non-sequence?' % (
                    seq_type_name)

        if differing is None:
            if seq1 == seq2:
                return

            differing = '%ss differ: %s != %s\n' % (
                (seq_type_name.capitalize(),) +
                _common_shorten_repr(seq1, seq2))

            for i in range(min(len1, len2)):
                try:
                    item1 = seq1[i]
                except (TypeError, IndexError, NotImplementedError):
                    differing += ('\nUnable to index element %d of first %s\n' %
                                  (i, seq_type_name))
                    break

                try:
                    item2 = seq2[i]
                except (TypeError, IndexError, NotImplementedError):
                    differing += ('\nUnable to index element %d of second %s\n' %
                                  (i, seq_type_name))
                    break

                if item1 != item2:
                    differing += ('\nFirst differing element %d:\n%s\n%s\n' %
                                  (i, item1, item2))
                    break
            else:
                if (len1 == len2 and seq_type is None and
                    type(seq1) != type(seq2)):
                    # The sequences are the same, but have differing types.
                    return

            if len1 > len2:
                differing += ('\nFirst %s contains %d additional '
                              'elements.\n' % (seq_type_name, len1 - len2))
                try:
                    differing += ('First extra element %d:\n%s\n' %
                                  (len2, seq1[len2]))
                except (TypeError, IndexError, NotImplementedError):
                    differing += ('Unable to index element %d '
                                  'of first %s\n' % (len2, seq_type_name))
            elif len1 < len2:
                differing += ('\nSecond %s contains %d additional '
                              'elements.\n' % (seq_type_name, len2 - len1))
                try:
                    differing += ('First extra element %d:\n%s\n' %
                                  (len1, seq2[len1]))
                except (TypeError, IndexError, NotImplementedError):
                    differing += ('Unable to index element %d '
                                  'of second %s\n' % (len1, seq_type_name))
        standardMsg = differing
        diffMsg = '\n' + '\n'.join(
            difflib.ndiff(pprint.pformat(seq1).splitlines(),
                          pprint.pformat(seq2).splitlines()))

        standardMsg = self._truncateMessage(standardMsg, diffMsg)
        msg = self._formatMessage(msg, standardMsg)
        self.fail(msg)
    def _truncateMessage(self, message, diff):
        max_diff = self.maxDiff
        if max_diff is None or len(diff) <= max_diff:
            return message + diff
        return message + (DIFF_OMITTED % len(diff))
    def assertListEqual(self, list1, list2, msg=None):
        """A list-specific equality assertion.

        Args:
            list1: The first list to compare.
            list2: The second list to compare.
            msg: Optional message to use on failure instead of a list of
                    differences.
        """
        self.assertSequenceEqual(list1, list2, msg, seq_type=list)

    def assertTupleEqual(self, tuple1, tuple2, msg=None):
        """A tuple-specific equality assertion.

        Args:
            tuple1: The first tuple to compare.
            tuple2: The second tuple to compare.
            msg: Optional message to use on failure instead of a list of
                    differences.
        """
        self.assertSequenceEqual(tuple1, tuple2, msg, seq_type=tuple)
    def assertSetEqual(self, set1, set2, msg=None):
        """A set-specific equality assertion.

        Args:
            set1: The first set to compare.
            set2: The second set to compare.
            msg: Optional message to use on failure instead of a list of
                    differences.

        assertSetEqual uses ducktyping to support different types of sets, and
        is optimized for sets specifically (parameters must support a
        difference method).
        """
        try:
            difference1 = set1.difference(set2)
        except TypeError as e:
            self.fail('invalid type when attempting set difference: %s' % e)
        except AttributeError as e:
            self.fail('first argument does not support set difference: %s' % e)

        try:
            difference2 = set2.difference(set1)
        except TypeError as e:
            self.fail('invalid type when attempting set difference: %s' % e)
        except AttributeError as e:
            self.fail('second argument does not support set difference: %s' % e)

        if not (difference1 or difference2):
            return

        lines = []
        if difference1:
            lines.append('Items in the first set but not the second:')
            for item in difference1:
                lines.append(repr(item))
        if difference2:
            lines.append('Items in the second set but not the first:')
            for item in difference2:
                lines.append(repr(item))

        standardMsg = '\n'.join(lines)
        self.fail(self._formatMessage(msg, standardMsg))
    def assertIn(self, member, container, msg=None):
        """Just like self.assertTrue(a in b), but with a nicer default message."""
        if member not in container:
            standardMsg = '%s not found in %s' % (safe_repr(member),
                                                  safe_repr(container))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertNotIn(self, member, container, msg=None):
        """Just like self.assertTrue(a not in b), but with a nicer default message."""
        if member in container:
            standardMsg = '%s unexpectedly found in %s' % (safe_repr(member),
                                                           safe_repr(container))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertIs(self, expr1, expr2, msg=None):
        """Just like self.assertTrue(a is b), but with a nicer default message."""
        if expr1 is not expr2:
            standardMsg = '%s is not %s' % (safe_repr(expr1),
                                            safe_repr(expr2))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertIsNot(self, expr1, expr2, msg=None):
        """Just like self.assertTrue(a is not b), but with a nicer default message."""
        if expr1 is expr2:
            standardMsg = 'unexpectedly identical: %s' % (safe_repr(expr1),)
            self.fail(self._formatMessage(msg, standardMsg))
    def assertDictEqual(self, d1, d2, msg=None):
        self.assertIsInstance(d1, dict, 'First argument is not a dictionary')
        self.assertIsInstance(d2, dict, 'Second argument is not a dictionary')

        if d1 != d2:
            standardMsg = '%s != %s' % _common_shorten_repr(d1, d2)
            diff = ('\n' + '\n'.join(difflib.ndiff(
                pprint.pformat(d1).splitlines(),
                pprint.pformat(d2).splitlines())))
            standardMsg = self._truncateMessage(standardMsg, diff)
            self.fail(self._formatMessage(msg, standardMsg))
    def assertDictContainsSubset(self, subset, dictionary, msg=None):
        """Checks whether dictionary is a superset of subset."""
        warnings.warn('assertDictContainsSubset is deprecated',
                      DeprecationWarning)
        missing = []
        mismatched = []
        for key, value in subset.items():
            if key not in dictionary:
                missing.append(key)
            elif value != dictionary[key]:
                mismatched.append('%s, expected: %s, actual: %s' %
                                  (safe_repr(key), safe_repr(value),
                                   safe_repr(dictionary[key])))

        if not (missing or mismatched):
            return

        standardMsg = ''
        if missing:
            standardMsg = 'Missing: %s' % ','.join(safe_repr(m) for m in
                                                   missing)
        if mismatched:
            if standardMsg:
                standardMsg += '; '
            standardMsg += 'Mismatched values: %s' % ','.join(mismatched)

        self.fail(self._formatMessage(msg, standardMsg))
    def assertCountEqual(self, first, second, msg=None):
        """An unordered sequence comparison asserting that the same elements
        appear in both sequences, regardless of order.  If the same element
        occurs more than once, it verifies that the elements occur the same
        number of times.

            self.assertEqual(Counter(list(first)),
                             Counter(list(second)))

        Example:
            - [0, 1, 1] and [1, 0, 1] compare equal.
            - [0, 0, 1] and [0, 1] compare unequal.
        """
        first_seq, second_seq = list(first), list(second)
        try:
            first = collections.Counter(first_seq)
            second = collections.Counter(second_seq)
        except TypeError:
            # Handle case with unhashable elements
            differences = _count_diff_all_purpose(first_seq, second_seq)
        else:
            if first == second:
                return
            differences = _count_diff_hashable(first_seq, second_seq)

        if differences:
            standardMsg = 'Element counts were not equal:\n'
            lines = ['First has %d, Second has %d:  %r' % diff for diff in differences]
            diffMsg = '\n'.join(lines)
            standardMsg = self._truncateMessage(standardMsg, diffMsg)
            msg = self._formatMessage(msg, standardMsg)
            self.fail(msg)
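The two cases from the docstring, made executable (the failing comparison is wrapped in assertRaises so the demo test itself passes):

```python
import unittest

class CountEqualDemo(unittest.TestCase):
    def test_order_ignored(self):
        # Same elements, same multiplicities, different order: equal.
        self.assertCountEqual([0, 1, 1], [1, 0, 1])

    def test_multiplicity_checked(self):
        # [0, 0, 1] has two zeros but [0, 1] only one, so the
        # assertion fails with self.failureException.
        with self.assertRaises(self.failureException):
            self.assertCountEqual([0, 0, 1], [0, 1])

result = unittest.TestResult()
for name in ('test_order_ignored', 'test_multiplicity_checked'):
    CountEqualDemo(name).run(result)
```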
    def assertMultiLineEqual(self, first, second, msg=None):
        """Assert that two multi-line strings are equal."""
        self.assertIsInstance(first, str, 'First argument is not a string')
        self.assertIsInstance(second, str, 'Second argument is not a string')

        if first != second:
            # don't use difflib if the strings are too long
            if (len(first) > self._diffThreshold or
                len(second) > self._diffThreshold):
                self._baseAssertEqual(first, second, msg)
            firstlines = first.splitlines(keepends=True)
            secondlines = second.splitlines(keepends=True)
            if len(firstlines) == 1 and first.strip('\r\n') == first:
                firstlines = [first + '\n']
                secondlines = [second + '\n']
            standardMsg = '%s != %s' % _common_shorten_repr(first, second)
            diff = '\n' + ''.join(difflib.ndiff(firstlines, secondlines))
            standardMsg = self._truncateMessage(standardMsg, diff)
            self.fail(self._formatMessage(msg, standardMsg))
    def assertLess(self, a, b, msg=None):
        """Just like self.assertTrue(a < b), but with a nicer default message."""
        if not a < b:
            standardMsg = '%s not less than %s' % (safe_repr(a), safe_repr(b))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertLessEqual(self, a, b, msg=None):
        """Just like self.assertTrue(a <= b), but with a nicer default message."""
        if not a <= b:
            standardMsg = '%s not less than or equal to %s' % (safe_repr(a), safe_repr(b))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertGreater(self, a, b, msg=None):
        """Just like self.assertTrue(a > b), but with a nicer default message."""
        if not a > b:
            standardMsg = '%s not greater than %s' % (safe_repr(a), safe_repr(b))
            self.fail(self._formatMessage(msg, standardMsg))

    def assertGreaterEqual(self, a, b, msg=None):
        """Just like self.assertTrue(a >= b), but with a nicer default message."""
        if not a >= b:
            standardMsg = '%s not greater than or equal to %s' % (safe_repr(a), safe_repr(b))
            self.fail(self._formatMessage(msg, standardMsg))
    def assertIsNone(self, obj, msg=None):
        """Same as self.assertTrue(obj is None), with a nicer default message."""
        if obj is not None:
            standardMsg = '%s is not None' % (safe_repr(obj),)
            self.fail(self._formatMessage(msg, standardMsg))

    def assertIsNotNone(self, obj, msg=None):
        """Included for symmetry with assertIsNone."""
        if obj is None:
            standardMsg = 'unexpectedly None'
            self.fail(self._formatMessage(msg, standardMsg))

    def assertIsInstance(self, obj, cls, msg=None):
        """Same as self.assertTrue(isinstance(obj, cls)), with a nicer
        default message."""
        if not isinstance(obj, cls):
            standardMsg = '%s is not an instance of %r' % (safe_repr(obj), cls)
            self.fail(self._formatMessage(msg, standardMsg))

    def assertNotIsInstance(self, obj, cls, msg=None):
        """Included for symmetry with assertIsInstance."""
        if isinstance(obj, cls):
            standardMsg = '%s is an instance of %r' % (safe_repr(obj), cls)
            self.fail(self._formatMessage(msg, standardMsg))
    def assertRaisesRegex(self, expected_exception, expected_regex,
                          *args, **kwargs):
        """Asserts that the message in a raised exception matches a regex.

        Args:
            expected_exception: Exception class expected to be raised.
            expected_regex: Regex (re pattern object or string) expected
                    to be found in error message.
            args: Function to be called and extra positional args.
            kwargs: Extra kwargs.
            msg: Optional message used in case of failure. Can only be used
                    when assertRaisesRegex is used as a context manager.
        """
        context = _AssertRaisesContext(expected_exception, self, expected_regex)
        return context.handle('assertRaisesRegex', args, kwargs)

    def assertWarnsRegex(self, expected_warning, expected_regex,
                         *args, **kwargs):
        """Asserts that the message in a triggered warning matches a regexp.
        Basic functioning is similar to assertWarns() with the addition
        that only warnings whose messages also match the regular expression
        are considered successful matches.

        Args:
            expected_warning: Warning class expected to be triggered.
            expected_regex: Regex (re pattern object or string) expected
                    to be found in error message.
            args: Function to be called and extra positional args.
            kwargs: Extra kwargs.
            msg: Optional message used in case of failure. Can only be used
                    when assertWarnsRegex is used as a context manager.
        """
        context = _AssertWarnsContext(expected_warning, self, expected_regex)
        return context.handle('assertWarnsRegex', args, kwargs)
    def assertRegex(self, text, expected_regex, msg=None):
        """Fail the test unless the text matches the regular expression."""
        if isinstance(expected_regex, (str, bytes)):
            assert expected_regex, "expected_regex must not be empty."
            expected_regex = re.compile(expected_regex)
        if not expected_regex.search(text):
            msg = msg or "Regex didn't match"
            msg = '%s: %r not found in %r' % (msg, expected_regex.pattern, text)
            raise self.failureException(msg)

    def assertNotRegex(self, text, unexpected_regex, msg=None):
        """Fail the test if the text matches the regular expression."""
        if isinstance(unexpected_regex, (str, bytes)):
            unexpected_regex = re.compile(unexpected_regex)
        match = unexpected_regex.search(text)
        if match:
            msg = msg or "Regex matched"
            msg = '%s: %r matches %r in %r' % (msg,
                                               text[match.start():match.end()],
                                               unexpected_regex.pattern,
                                               text)
            raise self.failureException(msg)
    def _deprecate(original_func):
        def deprecated_func(*args, **kwargs):
            warnings.warn(
                'Please use {0} instead.'.format(original_func.__name__),
                DeprecationWarning, 2)
            return original_func(*args, **kwargs)
        return deprecated_func

    # see #9424
    failUnlessEqual = assertEquals = _deprecate(assertEqual)
    failIfEqual = assertNotEquals = _deprecate(assertNotEqual)
    failUnlessAlmostEqual = assertAlmostEquals = _deprecate(assertAlmostEqual)
    failIfAlmostEqual = assertNotAlmostEquals = _deprecate(assertNotAlmostEqual)
    failUnless = assert_ = _deprecate(assertTrue)
    failUnlessRaises = _deprecate(assertRaises)
    failIf = _deprecate(assertFalse)
    assertRaisesRegexp = _deprecate(assertRaisesRegex)
    assertRegexpMatches = _deprecate(assertRegex)


class FunctionTestCase(TestCase):
    """A test case that wraps a test function.

    This is useful for slipping pre-existing test functions into the
    unittest framework. Optionally, set-up and tidy-up functions can be
    supplied. As with TestCase, the tidy-up ('tearDown') function will
    always be called if the set-up ('setUp') function ran successfully.
    """

    def __init__(self, testFunc, setUp=None, tearDown=None, description=None):
        super(FunctionTestCase, self).__init__()
        self._setUpFunc = setUp
        self._tearDownFunc = tearDown
        self._testFunc = testFunc
        self._description = description

    def setUp(self):
        if self._setUpFunc is not None:
            self._setUpFunc()

    def tearDown(self):
        if self._tearDownFunc is not None:
            self._tearDownFunc()

    def runTest(self):
        self._testFunc()

    def id(self):
        return self._testFunc.__name__

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            return NotImplemented

        return self._setUpFunc == other._setUpFunc and \
               self._tearDownFunc == other._tearDownFunc and \
               self._testFunc == other._testFunc and \
               self._description == other._description

    def __hash__(self):
        return hash((type(self), self._setUpFunc, self._tearDownFunc,
                     self._testFunc, self._description))

    def __str__(self):
        return "%s (%s)" % (strclass(self.__class__),
                            self._testFunc.__name__)

    def __repr__(self):
        return "<%s tec=%s>" % (strclass(self.__class__),
                                self._testFunc)

    def shortDescription(self):
        if self._description is not None:
            return self._description
        doc = self._testFunc.__doc__
        return doc and doc.split("\n")[0].strip() or None
klasa _SubTest(TestCase):
def __init__(self, test_case, message, params):
super().__init__()
self._message = message
self.test_case = test_case
self.params = params
self.failureException = test_case.failureException
def runTest(self):
podnieś NotImplementedError("subtests cannot be run directly")
def _subDescription(self):
parts = []
jeżeli self._message:
parts.append("[{}]".format(self._message))
jeżeli self.params:
params_desc = ', '.join(
"{}={!r}".format(k, v)
dla (k, v) w sorted(self.params.items()))
parts.append("({})".format(params_desc))
zwróć " ".join(parts) albo '(<subtest>)'
def id(self):
zwróć "{} {}".format(self.test_case.id(), self._subDescription())
def shortDescription(self):
"""Returns a one-line description of the subtest, albo Nic jeżeli no
description has been provided.
"""
zwróć self.test_case.shortDescription()
def __str__(self):
zwróć "{} {}".format(self.test_case, self._subDescription())
# Source: RabbitMQooz.py from khalilovcmd/sd-rabbitmqooz (Unlicense)
# NOTE: this module targets Python 2 (urllib2, print statement).
import urllib2
import base64
import json
import logging
import os
import platform
import sys
import time


class RabbitMQooz(object):
    def __init__(self, agent_config, checks_logger, raw_config):
        self.agent_config = agent_config
        self.checks_logger = checks_logger
        self.raw_config = raw_config
        self.version = platform.python_version_tuple()
        self.api_url = '/api/overview'
        self.host = self.raw_config['RabbitMQooz'].get('host', 'localhost')
        self.port = self.raw_config['RabbitMQooz'].get('port', '55672')
        self.username = self.raw_config['RabbitMQooz'].get('username', '')
        self.password = self.raw_config['RabbitMQooz'].get('password', '')

    def make_base64(self):
        # Create a base64 encoding of the username:password pair
        # (for HTTP basic authentication).
        base64string = base64.encodestring('%s:%s' % (self.username, self.password))[:-1]
        return base64string

    def make_http_request(self, base64string):
        # Create the HTTP request.
        request = urllib2.Request(self.host + ':' + self.port + self.api_url)
        # Add the authorization header.
        request.add_header("Authorization", "Basic %s" % base64string)
        # Make the request.
        return urllib2.urlopen(request)

    def make_metrics(self, key, value, source):
        source[key] = value

    def run(self):
        data = {}
        base64string = self.make_base64()
        http = self.make_http_request(base64string)
        content = http.read()
        parsed_json = json.loads(content)
        print(content)
        if content:
            if 'publish' in parsed_json['message_stats']:
                data['publish'] = parsed_json['message_stats']['publish']
                data['publish_rate'] = parsed_json['message_stats']['publish_details']['rate']
            else:
                data['publish'] = 0
                data['publish_rate'] = 0
            if 'deliver' in parsed_json['message_stats']:
                # Count of messages delivered in acknowledgement mode to consumers.
                data['delivered_messages'] = parsed_json['message_stats']['deliver']
                data['delivered_messages_rate'] = parsed_json['message_stats']['deliver_details']['rate']
            else:
                data['delivered_messages'] = 0
                data['delivered_messages_rate'] = 0
            if 'deliver_noack' in parsed_json['message_stats']:
                # Count of messages delivered in no-acknowledgement mode to consumers.
                data['delivered_noAck_messages'] = parsed_json['message_stats']['deliver_noack']
                data['delivered_noAck_messages_rate'] = parsed_json['message_stats']['deliver_noack_details']['rate']
            else:
                data['delivered_noAck_messages'] = 0
                data['delivered_noAck_messages_rate'] = 0
            if 'get_ack' in parsed_json['message_stats']:
                # Count of messages delivered in acknowledgement mode in response to basic.get.
                data['delivered_basicGet_messages'] = parsed_json['message_stats']['get_ack']
                data['delivered_basicGet_messages_rate'] = parsed_json['message_stats']['get_ack_details']['rate']
            else:
                data['delivered_basicGet_messages'] = 0
                data['delivered_basicGet_messages_rate'] = 0
            if 'get_no_ack' in parsed_json['message_stats']:
                # Count of messages delivered in no-acknowledgement mode in response to basic.get.
                data['delivered_basicGet_noAck_messages'] = parsed_json['message_stats']['get_no_ack']
                data['delivered_basicGet_noAck_messages_rate'] = parsed_json['message_stats']['get_no_ack_details']['rate']
            else:
                data['delivered_basicGet_noAck_messages'] = 0
                # Fixed: this assignment originally overwrote the count key instead of the rate key.
                data['delivered_basicGet_noAck_messages_rate'] = 0
            if 'deliver_get' in parsed_json['message_stats']:
                # Sum of all four of deliver + deliver_noack + get + get_noack.
                data['all_delivered'] = parsed_json['message_stats']['deliver_get']
                data['all_delivered_rate'] = parsed_json['message_stats']['deliver_get_details']['rate']
            else:
                data['all_delivered'] = 0
                data['all_delivered_rate'] = 0
            # Fixed: 'queue_totals' is a top-level key of the /api/overview
            # response, not part of 'message_stats'; the original check could
            # never succeed.
            if 'queue_totals' in parsed_json:
                # Queued messages.
                data['total_messages_in_queues'] = parsed_json['queue_totals']['messages']
                # Queued messages in acknowledge mode.
                data['total_messages_ready_in_queues'] = parsed_json['queue_totals']['messages_ready']
                # Queued messages in no-acknowledge mode.
                data['total_messages_noAck_in_queues'] = parsed_json['queue_totals']['messages_unacknowledged']
            else:
                data['total_messages_in_queues'] = 0
                data['total_messages_ready_in_queues'] = 0
                data['total_messages_noAck_in_queues'] = 0
        return data


if __name__ == '__main__':
    """Standalone test"""
    raw_agent_config = {
        'RabbitMQooz': {
            'host': 'localhost',
            'port': '55672',
            'username': 'guest',
            'password': '123456'
        }
    }
    main_checks_logger = logging.getLogger('RabbitMQooz')
    main_checks_logger.setLevel(logging.DEBUG)
    main_checks_logger.addHandler(logging.StreamHandler(sys.stdout))
    rabbitmq = RabbitMQooz({}, main_checks_logger, raw_agent_config)
    while True:
        try:
            print json.dumps(rabbitmq.run(), indent=4, sort_keys=True)
        except:
            main_checks_logger.exception("Unhandled exception")
        finally:
            time.sleep(5)
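The `make_base64`/`make_http_request` pair above simply implements HTTP basic authentication against the management API. Here is a Python 3 sketch of the same header construction (the `guest`/`guest` credential pair is only a placeholder):

```python
import base64

def basic_auth_header(username, password):
    # The API expects: Authorization: Basic base64("user:pass")
    token = base64.b64encode(
        "{}:{}".format(username, password).encode("ascii")).decode("ascii")
    return "Basic " + token

print(basic_auth_header("guest", "guest"))  # Basic Z3Vlc3Q6Z3Vlc3Q=
```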
# Source: annotation-tool/filter_annotations.py from robotika/subt-artf (MIT)
"""Filter and convert annotations"""
import json
import os

import cv2

USED_ARTF_NAME = ["backpack", "phone", "survivor", "drill", "fire_extinguisher", "vent", "helmet", "rope", "breadcrumb",
                  "robot", "cube", "nothing"]

BLACKLIST = [
    "../virtual/helmet/local_J01_000.jpg",
    "../virtual/helmet/local_J01_001.jpg",
    "../virtual/helmet/local_J01_012.jpg",
    "../virtual/helmet/local_J01_013.jpg",
    "../virtual/helmet/local_J01_014.jpg",
    "../virtual/helmet/local_J01_015.jpg",
    "../virtual/helmet/local_J01_016.jpg",
    "../virtual/helmet/local_J03_000.jpg",
    "../virtual/helmet/local_J03_001.jpg",
    "../virtual/helmet/local_J03_002.jpg",
    "../virtual/helmet/local_J03_003.jpg",
    "../virtual/helmet/local_J03_004.jpg",
    "../virtual/helmet/local_J03_005.jpg",
    "../virtual/helmet/local_J03_006.jpg",
    "../virtual/helmet/local_J03_007.jpg",
    "../virtual/helmet/local_J03_008.jpg",
    "../virtual/helmet/local_J03_009.jpg",
    "../virtual/helmet/local_J03_010.jpg",
    "../virtual/helmet/local_J03_011.jpg",
    "../virtual/helmet/local_J03_012.jpg",
    "../virtual/helmet/local_J03_013.jpg",
    "../virtual/helmet/local_J03_014.jpg",
    "../virtual/helmet/local_J03_015.jpg",
    "../virtual/helmet/local_J03_016.jpg",
    "../virtual/helmet/local_J03_017.jpg",
    "../virtual/helmet/local_J03_018.jpg",
    "../virtual/drill/drill_tunel_p_013.jpg",
    "../virtual/drill/drill_tunel_p_014.jpg",
    "../virtual/drill/drill_tunel_p_015.jpg",
    "../virtual/drill/drill_tunel_p_016.jpg",
    "../virtual/drill/drill_tunel_p_039.jpg",
    "../virtual/drill/drill_tunel_p_040.jpg",
    "../virtual/extinguisher/ext_tunel_p_000.jpg",
    "../virtual/extinguisher/ext_tunel_p_001.jpg",
    "../virtual/extinguisher/ext_tunel_p_002.jpg",
    "../virtual/extinguisher/ext_tunel_p_003.jpg",
    "../virtual/drill/drill_tunel_s2_000.jpg",
    "../virtual/drill/drill_tunel_s2_001.jpg",
    "../virtual/drill/drill_tunel_s2_002.jpg",
    "../virtual/drill/drill_tunel_s2_003.jpg",
    "../virtual/drill/drill_tunel_s2_004.jpg",
    "../virtual/extinguisher/ext_tunel_s2_000.jpg",
    "../virtual/extinguisher/ext_tunel_s2_001.jpg",
    "../virtual/extinguisher/ext_tunel_s2_002.jpg",
    "../virtual/extinguisher/ext_tunel_s2_004.jpg",
    "../virtual/backpack/fq-99-freyja-000.jpg",
    "../virtual/backpack/fq-99-freyja-001.jpg",
    "../virtual/backpack/fq-99-freyja-002.jpg",
    "../virtual/backpack/fq-99-freyja-003.jpg",
    "../virtual/backpack/fq-99-freyja-004.jpg",
    "../virtual/backpack/fq-99-freyja-005.jpg",
    "../virtual/backpack/fq-99-freyja-006.jpg",
    "../virtual/backpack/fq-99-freyja-007.jpg",
    "../virtual/backpack/fq-99-freyja-008.jpg",
    "../virtual/backpack/fq-99-freyja-029.jpg",
    "../virtual/backpack/fq-99-freyja-032.jpg",
    "../virtual/backpack/backpack_fp3_000.jpg",
    "../virtual/backpack/backpack_fp3_001.jpg",
    "../virtual/backpack/backpack_fp3_002.jpg",
    "../virtual/backpack/backpack_fp3_004.jpg",
    "../virtual/backpack/backpack_fp3_014.jpg",
    "../virtual/backpack/backpack_fp3_015.jpg",
    "../virtual/helmet/helmet_fp3_001.jpg",
    "../virtual/helmet/helmet_fp3_002.jpg"
]


def manual_sorting(data, annotations_dir):
    ii = 0
    while True:
        if ii < 0:
            ii = 0
        if ii >= len(data):
            ii = len(data) - 1
        file_name, artf_name, bbox, use_for = data[ii]
        x, y, xw, yh = bbox
        img = cv2.imread(os.path.join(annotations_dir, file_name), 1)
        cv2.rectangle(img, (x, y), (xw, yh), (0, 0, 255))
        cv2.putText(img, artf_name, (x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, color=(0, 0, 255))
        cv2.putText(img, use_for, (10, 50), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, color=(255, 0, 0))
        cv2.imshow("win", img)
        k = cv2.waitKey(0) & 0xFF
        if k == ord("n"):  # next img
            ii += 1
        elif k == ord("b"):  # back one img
            ii -= 1
        elif k == ord("t"):
            data[ii][3] = "train"  # use for training
        elif k == ord("e"):
            data[ii][3] = "eval"  # use for evaluation
        elif k == ord("d"):
            data[ii][3] = "None"  # do not use
        elif k == ord("q"):  # close and save
            break
    cv2.destroyAllWindows()
    return data


def main(annotation_file, out_prefix):
    annotations_dir = os.path.dirname(annotation_file)
    data = []
    with open(annotation_file) as f:
        json_data = json.load(f)
        for item in json_data.values():
            file_name = item['filename']
            if file_name in BLACKLIST:
                print(file_name)
                continue
            regions = item['regions']
            for reg in regions:
                artf_name = reg['region_attributes']['artifact']
                if artf_name not in USED_ARTF_NAME:
                    continue
                x = reg['shape_attributes']['x']
                y = reg['shape_attributes']['y']
                width = reg['shape_attributes']['width']
                height = reg['shape_attributes']['height']
                if g_manual:
                    # Store annotations and add a label about a future use. "None" in the beginning (do not use).
                    data.append([file_name, artf_name, [x, y, x + width, y + height], "None"])
                else:
                    data.append([file_name, artf_name, [x, y, x + width, y + height], "train"])

    out_train = open(out_prefix + "_train.csv", "w")
    out_train.write("filename,class,xmin,ymin,xmax,ymax\r\n")
    if g_manual:
        data = manual_sorting(data, annotations_dir)
        out_eval = open(out_prefix + "_eval.csv", "w")
        out_eval.write("filename,class,xmin,ymin,xmax,ymax\r\n")
    for item in data:
        file_name, artf_name, bbox, use_for = item
        x, y, xw, yh = bbox
        output_string = "%s,%s,%d,%d,%d,%d\r\n" % (file_name, artf_name, x, y, xw, yh)
        if use_for == "train":
            out_train.write(output_string)
        elif use_for == "eval":
            out_eval.write(output_string)
    out_train.close()
    if g_manual:
        out_eval.close()


if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser(description='Filter and convert annotations.')
    parser.add_argument('annotation', help='json - annotations')
    parser.add_argument('--out', help='output csv filename', default='annotation')
    parser.add_argument('--manual', help='Manual sorting', action='store_true')
    args = parser.parse_args()

    annotation_file = args.annotation
    out_csv = args.out
    g_manual = False
    if args.manual:
        g_manual = True
    main(annotation_file, out_csv)
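The inner loop of `main` above maps one VIA-style region dict to one CSV row. A minimal sketch of that mapping in isolation (the sample file name and box values are invented):

```python
def region_to_row(file_name, reg):
    # VIA 'shape_attributes' give x, y, width, height;
    # the CSV wants xmin, ymin, xmax, ymax.
    sa = reg['shape_attributes']
    x, y, w, h = sa['x'], sa['y'], sa['width'], sa['height']
    artf = reg['region_attributes']['artifact']
    return "%s,%s,%d,%d,%d,%d" % (file_name, artf, x, y, x + w, y + h)

row = region_to_row("img.jpg", {
    'shape_attributes': {'x': 10, 'y': 20, 'width': 30, 'height': 40},
    'region_attributes': {'artifact': 'backpack'},
})
print(row)  # img.jpg,backpack,10,20,40,60
```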
# -*- coding: utf-8 -*-
# Source: django_mobile_app_distribution/migrations/0005_auto_20160118_1759.py from chingmeng/django-mobile-app-distribution (MIT)
from __future__ import unicode_literals

from django.db import migrations, models
import django_mobile_app_distribution.storage
import django_mobile_app_distribution.models


class Migration(migrations.Migration):

    dependencies = [
        ('django_mobile_app_distribution', '0004_auto_20150921_0813'),
    ]

    operations = [
        migrations.AlterField(
            model_name='androidapp',
            name='app_binary',
            field=models.FileField(upload_to=django_mobile_app_distribution.models.normalize_android_filename, verbose_name='APK file', storage=django_mobile_app_distribution.storage.CustomFileSystemStorage()),
        ),
    ]
# Source: yayun.py from Gold-Sea/Poems (MIT)
def clarify_yayun():
    with open("字表拼音.txt", "r", encoding="utf-8") as f:
        lines = f.readlines()
    zhengti = ["hi", "ri", "zi", "ci", "si"]  # last two letters of the "whole syllable" (zhengti rendu) pinyins
    u_to_v = ["yu", "xu", "qu"]  # pronounced as "v" but written as "u"
    for line in lines:
        c = line[0]
        # _i is the final (rhyme) formed by the last i letters
        _1 = line[-2]
        _2 = line[-3:-1]
        _3 = line[-4:-1]
        if _1 == "a":
            yun_mu[1].append(c)
        elif _2 == "ai":
            yun_mu[2].append(c)
        elif _2 == "an":
            yun_mu[3].append(c)
        elif _3 == "ang":
            yun_mu[4].append(c)
        elif _2 == "ao":
            yun_mu[5].append(c)
        elif _1 == "o" or (_1 == "e" and _2 != "ie" and _2 != "ye" and _2 != "ue"):
            yun_mu[6].append(c)
        elif _2 == "ei" or _2 == "ui":
            yun_mu[7].append(c)
        elif _2 == "en" or _2 == "in" or _2 == "un":
            yun_mu[8].append(c)
        elif _3 == "eng" or _3 == "ing" or _3 == "ong":
            yun_mu[9].append(c)
        elif (_1 == "i" and _2 not in zhengti) or _2 == "er":
            yun_mu[10].append(c)
        elif _1 == "i" and _2 in zhengti:
            yun_mu[11].append(c)
        elif _2 == "ie" or _2 == "ye":
            yun_mu[12].append(c)
        elif _2 == "ou" or _2 == "iu":
            yun_mu[13].append(c)
        elif _1 == "u" and _2 not in u_to_v:
            yun_mu[14].append(c)
        elif _1 == "v" or _2 in u_to_v:
            yun_mu[15].append(c)
        elif _2 == "ue":
            yun_mu[16].append(c)
    sum = 0
    for lst in yun_mu.values():
        sum += len(lst)


'''def output():
    with open("押韵分类结果.txt", "w", encoding="utf-8") as file:
        for i in range(1, 17):
            file.write(str(i))
            file.write(": ")
            for c in yun_mu[i]:
                file.write(c)
                file.write(" ")
            file.write("\n")'''


def get_yun(c):  # return the rhyme group the character belongs to
    for i in range(1, 17):
        if c in yun_mu[i]:
            return i
    return 0


yun_mu = {i: [] for i in range(1, 17)}  # finals (rhyme groups)
clarify_yayun()

if __name__ == "__main__":
    clarify_yayun()
    # output()
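A tiny self-contained illustration of the `get_yun` reverse lookup, with a couple of hand-filled (hypothetical) entries standing in for the real 字表拼音.txt data:

```python
yun_mu = {i: [] for i in range(1, 17)}
yun_mu[3].extend(["安", "山"])  # hypothetical characters with the "an" final
yun_mu[5].append("高")          # hypothetical character with the "ao" final

def get_yun(c):
    for i in range(1, 17):
        if c in yun_mu[i]:
            return i
    return 0  # character was never classified

print(get_yun("山"))  # 3
print(get_yun("高"))  # 5
print(get_yun("x"))   # 0
```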
# -*- coding: utf-8 -*-
# Source: analyzer_project/source_parser/abstract_source_parser.py from Dakhnovskiy/linguistic_analyzer_projects (Apache-2.0)
__author__ = 'Dmitriy.Dakhnovskiy'

import itertools
from abc import ABCMeta, abstractmethod


# Python 3 metaclass syntax; the old __metaclass__ class attribute had no effect in Python 3.
class AbstractSourceParser(metaclass=ABCMeta):

    def __init__(self, source):
        """
        :param source: source code
        """
        self.__type_identificator_funcs = {
            'variable': self.__get_local_variables_from_functions,
            'function': self.__get_functions
        }
        self.__parse_source(source)

    @staticmethod
    @abstractmethod
    def _parse_source(source):
        """
        returns the code parsed into a structure
        :param source: source code
        """

    def __parse_source(self, source):
        """
        parses the source code and stores the result in __parsed_source
        :param source: source code
        """
        self.__parsed_source = self._parse_source(source)

    @staticmethod
    @abstractmethod
    def _walk_element(element):
        """
        generator over the sub-elements of the given element
        :param element: an element of the parsed structure
        """

    def __walk_parsed_source(self):
        """
        generator over the elements of the parsed structure
        """
        if self.__parsed_source:
            for element in self._walk_element(self.__parsed_source):
                yield element

    @staticmethod
    @abstractmethod
    def _is_variable(element):
        """
        checks whether a structure element is a variable
        :param element: element
        """

    @staticmethod
    @abstractmethod
    def _is_function(element):
        """
        checks whether a structure element is a function
        :param element: element
        """

    def __get_functions(self):
        """
        generator over functions
        """
        for element in self.__walk_parsed_source():
            if self._is_function(element):
                yield element

    def __get_local_variables_from_function(self, function):
        """
        generator over the variables declared inside a function
        :param function: a function (AST object)
        """
        for element in self._walk_element(function):
            if self._is_variable(element):
                yield element

    def __get_local_variables_from_functions(self):
        """
        generator over the local variables of all functions
        """
        # TODO: nested functions are traversed repeatedly, so unique values have to be selected. Rethink this.
        set_variables = set()
        for function in self.__get_functions():
            for variable in self.__get_local_variables_from_function(function):
                set_variables.add(variable)
        yield from set_variables

    @staticmethod
    @abstractmethod
    def _get_words_from_identificator(identificator):
        """
        get the words that make up an identifier
        :param identificator: identifier
        """

    @staticmethod
    @abstractmethod
    def _get_element_name(element):
        """
        get the name (identifier) of a parsed-structure element
        :param element: a structure element
        """

    def __get_identificators(self, types_identificators):
        """
        generator over identifiers of the given types
        :param types_identificators: list of identifier types
        """
        gens = [self.__type_identificator_funcs[types_identificator]() for types_identificator in types_identificators]
        elements = itertools.chain(*gens)
        yield from (self._get_element_name(element) for element in elements)

    def get_words(self, types_identificators):
        """
        generator over the words of identifiers of the given types
        :param types_identificators: list of identifier types
        """
        assert set(types_identificators).issubset(set(self.__type_identificator_funcs.keys())), \
            'types_identificators must be subset ({0})'.format(', '.join(self.__type_identificator_funcs.keys()))
        for identificator in self.__get_identificators(types_identificators):
            yield from self._get_words_from_identificator(identificator)
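For concreteness, here is what a Python implementation of these hooks typically boils down to. This standalone sketch (not part of the original project) uses the standard `ast` module to pull snake_case words out of function names and their local variables:

```python
import ast

source = (
    "def total_price(items):\n"
    "    tax_rate = 0.2\n"
    "    return sum(items) * (1 + tax_rate)\n"
)
tree = ast.parse(source)

# _is_function would match FunctionDef nodes; _is_variable would match
# Name nodes that are assignment targets (ctx is Store).
functions = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
local_vars = sorted({n.id for f in functions for n in ast.walk(f)
                     if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Store)})

# _get_words_from_identificator for snake_case identifiers:
words = [w for name in [f.name for f in functions] + local_vars
         for w in name.split('_')]
print(words)  # ['total', 'price', 'tax', 'rate']
```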
# Source: our_scripts/setup.py from shrivats-pu/Prescient (BSD-3-Clause)
from setuptools import setup
setup(name='prescient_helpers',
      version='0.1',
      description='Use prescient helper functions',
      url='#',
      author='Ethan Reese',
      author_email='ereese@princeton.edu',
      packages=['prescient_helpers'],
      zip_safe=False)
# Source: modelling/modelling_sample.py from PREDICTFORCE/Toolkit-for-Professional-Data-Science-Workflow-Example (MIT)
import argparse
import pickle
from pathlib import Path

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix, classification_report
from sklearn.model_selection import train_test_split

# The original help strings were copy-paste leftovers from the argparse docs
# ('an integer for the accumulator'); they now describe the actual arguments.
parser = argparse.ArgumentParser(description='Train a logistic regression model on the Titanic data.')
parser.add_argument('--input-train-data-path', type=str, help='path to the training CSV')
parser.add_argument('--input-test-data-path', type=str, help='path to the test CSV')
parser.add_argument('--output-predictions-data-path', type=str, help='where to write the predictions CSV')
parser.add_argument('--output-model-path', type=str, help='where to write the pickled model')
args = parser.parse_args()

Path(args.output_predictions_data_path).parent.mkdir(parents=True, exist_ok=True)
Path(args.output_model_path).parent.mkdir(parents=True, exist_ok=True)

train = pd.read_csv(args.input_train_data_path)
test = pd.read_csv(args.input_test_data_path)

train, valid = train_test_split(train, train_size=0.8, random_state=150)

x_train = train.drop(['Survived'], axis=1).copy()
y_train = train['Survived'].copy()
x_valid = valid.drop(['Survived'], axis=1).copy()
y_valid = valid['Survived'].copy()
x_test = test.drop(['Survived', 'PassengerId'], axis=1).copy()
y_test = test['Survived'].copy()

model = LogisticRegression(solver='liblinear', random_state=42)
model.fit(x_train, y_train)

pred_valid = model.predict(x_valid)
# The accuracy and confusion matrix were originally computed but discarded;
# print them alongside the classification report.
print(accuracy_score(y_valid, pred_valid))
print(confusion_matrix(y_valid, pred_valid))
print(classification_report(y_valid, pred_valid))

# Actual Test Prediction
pred_test = model.predict(x_test).astype(int)
output = pd.DataFrame({'PassengerId': test.PassengerId, 'Survived': pred_test})
output.to_csv(args.output_predictions_data_path, index=False)
print("Your submission was successfully saved!")

pickle.dump(model, open(args.output_model_path, 'wb'))
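The `accuracy_score` call above reduces to a simple fraction of matching labels; a dependency-free sketch of the same computation (with made-up labels):

```python
def accuracy(y_true, y_pred):
    # fraction of positions where the prediction equals the true label
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```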
# coding=utf-8
# Source: sdk/python/pulumi_rancher2/get_project_role_template_binding.py from mitchellmaler/pulumi-rancher2 (ECL-2.0, Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***

import json
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from . import utilities, tables


class GetProjectRoleTemplateBindingResult:
    """
    A collection of values returned by getProjectRoleTemplateBinding.
    """
    def __init__(__self__, annotations=None, group_id=None, group_principal_id=None, id=None, labels=None, name=None, project_id=None, role_template_id=None, user_id=None, user_principal_id=None):
        if annotations and not isinstance(annotations, dict):
            raise TypeError("Expected argument 'annotations' to be a dict")
        __self__.annotations = annotations
        """
        (Computed) Annotations of the resource (map)
        """
        if group_id and not isinstance(group_id, str):
            raise TypeError("Expected argument 'group_id' to be a str")
        __self__.group_id = group_id
        """
        (Computed) The group ID to assign project role template binding (string)
        """
        if group_principal_id and not isinstance(group_principal_id, str):
            raise TypeError("Expected argument 'group_principal_id' to be a str")
        __self__.group_principal_id = group_principal_id
        """
        (Computed) The group_principal ID to assign project role template binding (string)
        """
        if id and not isinstance(id, str):
            raise TypeError("Expected argument 'id' to be a str")
        __self__.id = id
        """
        id is the provider-assigned unique ID for this managed resource.
        """
        if labels and not isinstance(labels, dict):
            raise TypeError("Expected argument 'labels' to be a dict")
        __self__.labels = labels
        """
        (Computed) Labels of the resource (map)
        """
        if name and not isinstance(name, str):
            raise TypeError("Expected argument 'name' to be a str")
        __self__.name = name
        if project_id and not isinstance(project_id, str):
            raise TypeError("Expected argument 'project_id' to be a str")
        __self__.project_id = project_id
        if role_template_id and not isinstance(role_template_id, str):
            raise TypeError("Expected argument 'role_template_id' to be a str")
        __self__.role_template_id = role_template_id
        if user_id and not isinstance(user_id, str):
            raise TypeError("Expected argument 'user_id' to be a str")
        __self__.user_id = user_id
        """
        (Computed) The user ID to assign project role template binding (string)
        """
        if user_principal_id and not isinstance(user_principal_id, str):
            raise TypeError("Expected argument 'user_principal_id' to be a str")
        __self__.user_principal_id = user_principal_id
        """
        (Computed) The user_principal ID to assign project role template binding (string)
        """


class AwaitableGetProjectRoleTemplateBindingResult(GetProjectRoleTemplateBindingResult):
    # pylint: disable=using-constant-test
    def __await__(self):
        if False:
            yield self
        return GetProjectRoleTemplateBindingResult(
            annotations=self.annotations,
            group_id=self.group_id,
            group_principal_id=self.group_principal_id,
            id=self.id,
            labels=self.labels,
            name=self.name,
            project_id=self.project_id,
            role_template_id=self.role_template_id,
            user_id=self.user_id,
            user_principal_id=self.user_principal_id)


def get_project_role_template_binding(name=None, project_id=None, role_template_id=None, opts=None):
    """
    Use this data source to retrieve information about a Rancher v2 project role template binding.

    > This content is derived from https://github.com/terraform-providers/terraform-provider-rancher2/blob/master/website/docs/d/projectRole.html.markdown.

    :param str name: The name of the project role template binding (string)
    :param str project_id: The project id where bind project role template (string)
    :param str role_template_id: The role template id from create project role template binding (string)
    """
    __args__ = dict()

    __args__['name'] = name
    __args__['projectId'] = project_id
    __args__['roleTemplateId'] = role_template_id
    if opts is None:
        opts = pulumi.InvokeOptions()
    if opts.version is None:
        opts.version = utilities.get_version()
    __ret__ = pulumi.runtime.invoke('rancher2:index/getProjectRoleTemplateBinding:getProjectRoleTemplateBinding', __args__, opts=opts).value

    return AwaitableGetProjectRoleTemplateBindingResult(
        annotations=__ret__.get('annotations'),
        group_id=__ret__.get('groupId'),
        group_principal_id=__ret__.get('groupPrincipalId'),
        id=__ret__.get('id'),
        labels=__ret__.get('labels'),
        name=__ret__.get('name'),
        project_id=__ret__.get('projectId'),
        role_template_id=__ret__.get('roleTemplateId'),
        user_id=__ret__.get('userId'),
        user_principal_id=__ret__.get('userPrincipalId'))
# ---- main.py (repo: DebNatkh/tinkoff-colloquium-bot, license: Unlicense) ----
#! /usr/bin/env python3
import os
from random import choice

import filetype
from telegram.ext import Updater, CommandHandler
from tinydb import TinyDB, where

db = TinyDB('db.json')
admins_table = db.table("admins")
given_questions = db.table("given")
quiet_categories = ['meme']


def register_admin(update, context):
    chat_id = update.message.chat_id
    admins_table.insert({'chat_id': chat_id})
    context.bot.send_message(chat_id=chat_id, text="Added you to admins list")


def start(update, context):
    chat_id = update.message.chat_id
    context.bot.send_message(chat_id=chat_id, text="Использование: /question 3")  # "Usage: /question 3"


def question(update, context):
    chat_id = update.message.chat_id
    category = update.message.text.split()[-1].lower()
    question_requested: bool = False
    if category not in os.listdir('questions'):
        text = "Не смог найти такую категорию вопросов"  # "Could not find such a question category"
    else:
        if given_questions.search((where('chat_id') == chat_id) & (where('category') == category)):
            text = "Уже выдал вам задачу на эту категорию"  # "Already gave you a problem for this category"
        else:
            text = f"Вот вопрос на {category}"  # "Here is a question on {category}"
            question_requested = True
    context.bot.send_message(chat_id=chat_id, text=text)
    if not question_requested:
        return
    filename = f"questions/{category}/" + choice(os.listdir(f'questions/{category}/'))
    print(filename)
    with open(filename, 'rb') as doc:
        if category not in quiet_categories:
            for admin in admins_table.all():
                context.bot.send_message(chat_id=admin['chat_id'],
                                         text=f'User {update.message.from_user} (category: {category}):')
                context.bot.send_document(chat_id=admin['chat_id'], document=doc)
                doc.seek(0)
        given_questions.insert({'chat_id': chat_id, 'category': category})
        if filetype.is_image(filename):
            context.bot.send_photo(chat_id=chat_id, photo=doc)
        else:
            context.bot.send_document(chat_id=chat_id, document=doc)


def main():
    telegram_token = os.getenv('TELEGRAM_BOT_TOKEN')
    assert telegram_token is not None, "env TELEGRAM_BOT_TOKEN is not set"
    telegram_password = os.getenv('TELEGRAM_BOT_ADMIN_PASSWORD')
    assert telegram_password is not None, "env TELEGRAM_BOT_ADMIN_PASSWORD is not set"
    updater = Updater(telegram_token, use_context=True)
    dp = updater.dispatcher
    dp.add_handler(CommandHandler('question', question))
    dp.add_handler(CommandHandler('start', start))
    dp.add_handler(CommandHandler(f'register_admin_{telegram_password}', register_admin))
    updater.start_polling()
    updater.idle()


if __name__ == '__main__':
    main()
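The per-user bookkeeping in `question()` (at most one question per chat and category, tracked in the TinyDB `given` table) can be sketched without TinyDB. This is a simplified stand-in with hypothetical names (`GivenQuestions`, `handle_request`) and English reply strings, stdlib only:

```python
# Minimal stand-in for the TinyDB "given" table: remember which
# (chat_id, category) pairs have already received a question.
class GivenQuestions:
    def __init__(self):
        self._given = set()  # {(chat_id, category), ...}

    def already_given(self, chat_id, category):
        return (chat_id, category) in self._given

    def mark_given(self, chat_id, category):
        self._given.add((chat_id, category))


def handle_request(store, chat_id, category, known_categories):
    """Return the reply text, mirroring the branching in question()."""
    if category not in known_categories:
        return "Unknown question category"
    if store.already_given(chat_id, category):
        return "Already gave you a question in this category"
    store.mark_given(chat_id, category)
    return f"Here is a question on {category}"
```

A set of tuples is enough here because the bot only ever asks "was this pair seen before"; TinyDB buys persistence across restarts, which the sketch deliberately omits.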
# ---- py/mockDensData.py (repo: jobovy/apogee-maps, license: BSD-3-Clause) ----
###############################################################################
# mockDensData.py: generate mock data following a given density
###############################################################################
import os, os.path
import pickle
import multiprocessing
from optparse import OptionParser
import numpy
from scipy import ndimage
import fitsio
from galpy.util import bovy_coords, multi
import mwdust
import define_rcsample
import fitDens
import densprofiles
dmap= None
dmapg15= None
apo= None
def generate(locations,
             type='exp',
             sample='lowlow',
             extmap='green15',
             nls=101,
             nmock=1000,
             H0=-1.49,
             _dmapg15=None,
             ncpu=1):
    """
    NAME:
       generate
    PURPOSE:
       generate mock data following a given density
    INPUT:
       locations - locations to be included in the sample
       type= ('exp') type of density profile to sample from
       sample= ('lowlow') for selecting mock parameters
       extmap= ('green15') extinction map to use ('marshall06' and others use Green15 to fill in unobserved regions)
       nls= (101) number of longitude bins to use for each field
       nmock= (1000) number of mock data points to generate
       H0= (-1.49) absolute magnitude (can be array w/ sampling spread)
       ncpu= (1) number of cpus to use to compute the probability
    OUTPUT:
       mockdata recarray with tags 'RC_GALR_H', 'RC_GALPHI_H', 'RC_GALZ_H'
    HISTORY:
       2015-04-03 - Written - Bovy (IAS)
    """
    if isinstance(H0, float): H0 = [H0]
    # Setup the density function and its initial parameters
    rdensfunc = fitDens._setup_densfunc(type)
    mockparams = _setup_mockparams_densfunc(type, sample)
    densfunc = lambda x, y, z: rdensfunc(x, y, z, params=mockparams)
    # Setup the extinction map
    global dmap
    global dmapg15
    if _dmapg15 is None: dmapg15 = mwdust.Green15(filter='2MASS H')
    else: dmapg15 = _dmapg15
    if isinstance(extmap, mwdust.DustMap3D.DustMap3D):
        dmap = extmap
    elif extmap.lower() == 'green15':
        dmap = dmapg15
    elif extmap.lower() == 'marshall06':
        dmap = mwdust.Marshall06(filter='2MASS H')
    elif extmap.lower() == 'sale14':
        dmap = mwdust.Sale14(filter='2MASS H')
    elif extmap.lower() == 'drimmel03':
        dmap = mwdust.Drimmel03(filter='2MASS H')
    # Use brute-force rejection sampling to make no approximations
    # First need to estimate the max probability to use in rejection;
    # Loop through all locations and compute sampling probability on grid in
    # (l,b,D)
    # First restore the APOGEE selection function (assumed pre-computed)
    global apo
    selectFile = '../savs/selfunc-nospdata.sav'
    if os.path.exists(selectFile):
        with open(selectFile, 'rb') as savefile:
            apo = pickle.load(savefile)
    # Now compute the necessary coordinate transformations and evaluate the
    # maximum probability
    distmods = numpy.linspace(7., 15.5, 301)
    ds = 10.**(distmods/5-2.)
    nbs = nls
    lnprobs = numpy.empty((len(locations), len(distmods), nbs, nls))
    radii = []
    lcens, bcens = [], []
    lnprobs = multi.parallel_map(lambda x: _calc_lnprob(locations[x], nls, nbs,
                                                        ds, distmods,
                                                        H0,
                                                        densfunc),
                                 range(len(locations)),
                                 numcores=numpy.amin([len(locations),
                                                      multiprocessing.cpu_count(), ncpu]))
    lnprobs = numpy.array(lnprobs)
    for ll, loc in enumerate(locations):
        lcen, bcen = apo.glonGlat(loc)
        rad = apo.radius(loc)
        radii.append(rad)  # save for later
        lcens.append(lcen[0])
        bcens.append(bcen[0])
    maxp = (numpy.exp(numpy.nanmax(lnprobs))-10.**-8.)*1.1  # Just to be sure
    # Now generate mock data using rejection sampling
    nout = 0
    arlocations = numpy.array(locations)
    arradii = numpy.array(radii)
    arlcens = numpy.array(lcens)
    arbcens = numpy.array(bcens)
    out = numpy.recarray((nmock,),
                         dtype=[('RC_DIST_H', 'f8'),
                                ('RC_DM_H', 'f8'),
                                ('RC_GALR_H', 'f8'),
                                ('RC_GALPHI_H', 'f8'),
                                ('RC_GALZ_H', 'f8')])
    while nout < nmock:
        nnew = 2*(nmock-nout)
        # nnew new locations
        locIndx = numpy.floor(numpy.random.uniform(size=nnew)*len(locations)).astype('int')
        newlocations = arlocations[locIndx]
        # Point within these locations
        newds_coord = numpy.random.uniform(size=nnew)
        newds = 10.**((newds_coord*(numpy.amax(distmods)-numpy.amin(distmods))
                       + numpy.amin(distmods))/5.-2.)
        newdls_coord = numpy.random.uniform(size=nnew)
        newdls = newdls_coord*2.*arradii[locIndx] - arradii[locIndx]
        newdbs_coord = numpy.random.uniform(size=nnew)
        newdbs = newdbs_coord*2.*arradii[locIndx] - arradii[locIndx]
        newr2s = newdls**2. + newdbs**2.
        keepIndx = newr2s < arradii[locIndx]**2.
        newlocations = newlocations[keepIndx]
        newds_coord = newds_coord[keepIndx]
        newdls_coord = newdls_coord[keepIndx]
        newdbs_coord = newdbs_coord[keepIndx]
        newds = newds[keepIndx]
        newdls = newdls[keepIndx]
        newdbs = newdbs[keepIndx]
        newls = newdls + arlcens[locIndx][keepIndx]
        newbs = newdbs + arbcens[locIndx][keepIndx]
        # Reject?
        tps = numpy.zeros_like(newds)
        for nloc in list(set(newlocations)):
            lindx = newlocations == nloc
            pindx = arlocations == nloc
            coord = numpy.array([newds_coord[lindx]*(len(distmods)-1.),
                                 newdbs_coord[lindx]*(nbs-1.),
                                 newdls_coord[lindx]*(nls-1.)])
            tps[lindx] = \
                numpy.exp(ndimage.interpolation.map_coordinates(
                    lnprobs[pindx][0],
                    coord, cval=-10.,
                    order=1)) - 10.**-8.
        XYZ = bovy_coords.lbd_to_XYZ(newls, newbs, newds, degree=True)
        Rphiz = bovy_coords.XYZ_to_galcencyl(XYZ[:, 0], XYZ[:, 1], XYZ[:, 2],
                                             Xsun=define_rcsample._R0,
                                             Ysun=0.,
                                             Zsun=define_rcsample._Z0)
        testp = numpy.random.uniform(size=len(newds))*maxp
        keepIndx = tps > testp
        if numpy.sum(keepIndx) > nmock-nout:
            rangeIndx = numpy.zeros(len(keepIndx), dtype='int')
            rangeIndx[keepIndx] = numpy.arange(numpy.sum(keepIndx))
            keepIndx *= (rangeIndx < nmock-nout)
        out['RC_DIST_H'][nout:nout+numpy.sum(keepIndx)] = newds[keepIndx]
        out['RC_DM_H'][nout:nout+numpy.sum(keepIndx)] = \
            newds_coord[keepIndx]*(numpy.amax(distmods)-numpy.amin(distmods)) \
            + numpy.amin(distmods)
        out['RC_GALR_H'][nout:nout+numpy.sum(keepIndx)] = Rphiz[0][keepIndx]
        out['RC_GALPHI_H'][nout:nout+numpy.sum(keepIndx)] = Rphiz[1][keepIndx]
        out['RC_GALZ_H'][nout:nout+numpy.sum(keepIndx)] = Rphiz[2][keepIndx]
        nout = nout + numpy.sum(keepIndx)
    return (out,lnprobs)
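`generate()` draws mock stars by brute-force rejection sampling: propose points uniformly in the sampling volume, then accept each with probability equal to its density divided by a uniform envelope of height `maxp`. The same pattern, stripped to one dimension with a hypothetical triangular target density (stdlib only, none of the APOGEE machinery above):

```python
import random


def rejection_sample(density, xmin, xmax, maxp, n, rng=random.Random(4)):
    """Draw n samples from an unnormalized density on [xmin, xmax] by
    brute-force rejection against a uniform envelope of height maxp."""
    out = []
    while len(out) < n:
        x = rng.uniform(xmin, xmax)             # propose uniformly in the domain
        if rng.uniform(0., maxp) < density(x):  # accept with prob density(x)/maxp
            out.append(x)
    return out


# Hypothetical target: a triangular density peaked at x = 0, max height 1
samples = rejection_sample(lambda x: 1. - abs(x), -1., 1., 1., 500)
```

As in `generate()`, correctness only requires that the envelope really dominates the density everywhere; a too-tall envelope just wastes proposals, which is why the code above inflates the estimated maximum by 10%.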
def _setup_mockparams_densfunc(type, sample):
    """Return the parameters of the mock density for this type"""
    if type.lower() == 'exp':
        if sample.lower() == 'lowlow':
            return [0., 1./0.3]
        elif sample.lower() == 'solar':
            return [1./3., 1./0.3]
        else:
            return [1./3., 1./0.3]
    elif type.lower() == 'expplusconst':
        if sample.lower() == 'lowlow':
            return [0., 1./0.3, numpy.log(0.1)]
        else:
            return [1./3., 1./0.3, numpy.log(0.1)]
    elif type.lower() == 'twoexp':
        return [1./3., 1./0.3, 1./4., 1./0.5, densprofiles.logit(0.5)]
    elif type.lower() == 'brokenexp':
        if sample.lower() == 'lowlow':
            return [-0.2, 1./.3, 0.2, numpy.log(11.)]
        elif sample.lower() == 'solar':
            return [-1./6., 1./0.3, 1./2., numpy.log(8.)]
        else:
            return [-1./6., 1./0.3, 1./2., numpy.log(6.)]
    elif type.lower() == 'brokenexpflare':
        if sample.lower() == 'lowlow':
            return [-0.2, 1./.3, 0.2, numpy.log(11.), -0.1]
        elif sample.lower() == 'solar':
            return [-1./6., 1./0.3, 1./2., numpy.log(8.), -0.1]
        else:
            return [-1./6., 1./0.3, 1./2., numpy.log(6.), -0.1]
    elif type.lower() == 'gaussexp':
        if sample.lower() == 'lowlow':
            return [.4, 1./0.3, numpy.log(11.)]
        else:
            return [1./3., 1./0.3, numpy.log(10.)]


def _calc_lnprob(loc, nls, nbs, ds, distmods, H0, densfunc):
    lcen, bcen = apo.glonGlat(loc)
    rad = apo.radius(loc)
    ls = numpy.linspace(lcen-rad, lcen+rad, nls)
    bs = numpy.linspace(bcen-rad, bcen+rad, nbs)
    # Tile these
    tls = numpy.tile(ls, (len(ds), len(bs), 1))
    tbs = numpy.swapaxes(numpy.tile(bs, (len(ds), len(ls), 1)), 1, 2)
    tds = numpy.tile(ds, (len(ls), len(bs), 1)).T
    XYZ = bovy_coords.lbd_to_XYZ(tls.flatten(),
                                 tbs.flatten(),
                                 tds.flatten(),
                                 degree=True)
    Rphiz = bovy_coords.XYZ_to_galcencyl(XYZ[:, 0], XYZ[:, 1], XYZ[:, 2],
                                         Xsun=define_rcsample._R0,
                                         Ysun=0.,
                                         Zsun=define_rcsample._Z0)
    # Evaluate probability density
    tH = numpy.tile(distmods.T, (1, len(ls), len(bs), 1))[0].T
    for ii in range(tH.shape[1]):
        for jj in range(tH.shape[2]):
            try:
                tH[:, ii, jj] += dmap(ls[jj], bs[ii], ds)
            except (IndexError, TypeError, ValueError):
                try:
                    tH[:, ii, jj] += dmapg15(ls[jj], bs[ii], ds)
                except IndexError:  # assume zero outside
                    pass
    tH = tH.flatten() + H0[0]
    ps = densfunc(Rphiz[0], Rphiz[1], Rphiz[2])*apo(loc, tH) \
        *numpy.fabs(numpy.cos(tbs.flatten()/180.*numpy.pi)) \
        *tds.flatten()**3.
    return numpy.log(numpy.reshape(ps, (len(distmods), nbs, nls))
                     + 10.**-8.)


def get_options():
    usage = "usage: %prog [options] <savefilename>\n\nsavefilename= name of the file that the mock data will be saved to"
    parser = OptionParser(usage=usage)
    parser.add_option("--type", dest='type', default='exp',
                      help="Type of density profile")
    parser.add_option("--sample", dest='sample', default='lowlow',
                      help="Sample parameter for mock parameters")
    parser.add_option("--H0", dest='H0', default=-1.49, type='float',
                      help="RC absolute magnitude")
    parser.add_option("--nls", dest='nls', default=101, type='int',
                      help="Number of longitudes to bin each field in")
    parser.add_option("--nmock", dest='nmock', default=20000, type='int',
                      help="Number of mock samples to generate")
    # Dust map to use
    parser.add_option("--extmap", dest='extmap', default='green15',
                      help="Dust map to use ('Green15', 'Marshall03', 'Drimmel03', 'Sale14', or 'zero')")
    # Multiprocessing?
    parser.add_option("-m", "--multi", dest='multi', default=1, type='int',
                      help="number of cpus to use")
    return parser


if __name__ == '__main__':
    parser = get_options()
    options, args = parser.parse_args()
    data = define_rcsample.get_rcsample()
    locations = list(set(list(data['LOCATION_ID'])))
    # locations = [4240, 4242]
    out = generate(locations,
                   type=options.type,
                   sample=options.sample,
                   extmap=options.extmap,
                   nls=options.nls,
                   nmock=options.nmock,
                   H0=options.H0,
                   ncpu=options.multi)
    fitsio.write(args[0], out[0], clobber=True)
# ---- authors/apps/articles/migrations/0007_report.py (repo: andela/ah-backend-summer, license: BSD-3-Clause) ----
# Generated by Django 2.1.5 on 2019-02-19 13:24
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('profiles', '0003_merge_20190214_1334'),
        ('articles', '0006_article_tag_list'),
    ]

    operations = [
        migrations.CreateModel(
            name='Report',
            fields=[
                ('id', models.AutoField(
                    auto_created=True,
                    primary_key=True,
                    serialize=False,
                    verbose_name='ID')),
                ('reason', models.TextField()),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('article', models.ForeignKey(
                    blank=True,
                    on_delete=django.db.models.deletion.CASCADE,
                    to='articles.Article')),
                ('reporter', models.ForeignKey(
                    blank=True,
                    on_delete=django.db.models.deletion.CASCADE,
                    to='profiles.Profile')),
            ],
        ),
    ]
# ---- src/py_moveit/scripts/line_cartesian_5DOF.py (repo: ERP1234/5DOF_Robot_arm, license: MIT) ----
#!/usr/bin/env python
import sys
import copy
import rospy
import moveit_commander
import moveit_msgs.msg
import geometry_msgs.msg
from math import pi
from std_msgs.msg import String
from moveit_commander.conversions import pose_to_list
## END_SUB_TUTORIAL


def all_close(goal, actual, tolerance):
    all_equal = True
    if type(goal) is list:
        for index in range(len(goal)):
            if abs(actual[index] - goal[index]) > tolerance:
                return False
    elif type(goal) is geometry_msgs.msg.PoseStamped:
        return all_close(goal.pose, actual.pose, tolerance)
    elif type(goal) is geometry_msgs.msg.Pose:
        return all_close(pose_to_list(goal), pose_to_list(actual), tolerance)
    return True


class MoveGroupPythonIntefaceTutorial(object):
    def __init__(self):
        super(MoveGroupPythonIntefaceTutorial, self).__init__()
        ## BEGIN_SUB_TUTORIAL setup
        ##
        ## First initialize `moveit_commander`_ and a `rospy`_ node:
        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node('move_group_python_interface_tutorial',
                        anonymous=True)
        ## Instantiate a `RobotCommander`_ object. This object is the outer-level interface to
        ## the robot:
        robot = moveit_commander.RobotCommander()
        ## Instantiate a `PlanningSceneInterface`_ object. This object is an interface
        ## to the world surrounding the robot:
        scene = moveit_commander.PlanningSceneInterface()
        ## Instantiate a `MoveGroupCommander`_ object. This object is an interface
        ## to one group of joints. In this case the group is the joints in the Panda
        ## arm so we set ``group_name = panda_arm``. If you are using a different robot,
        ## you should change this value to the name of your robot arm planning group.
        ## This interface can be used to plan and execute motions on the Panda:
        group_name = "arm"
        group = moveit_commander.MoveGroupCommander(group_name)
        planning_frame = group.get_planning_frame()
        ## We create a `DisplayTrajectory`_ publisher which is used later to publish
        ## trajectories for RViz to visualize:
        display_trajectory_publisher = rospy.Publisher('/move_group/display_planned_path',
                                                       moveit_msgs.msg.DisplayTrajectory,
                                                       queue_size=20)

        self.robot = robot
        self.scene = scene
        self.group = group
        self.display_trajectory_publisher = display_trajectory_publisher
        self.planning_frame = planning_frame

    def go_to_joint_state(self):
        # Copy class variables to local variables to make the web tutorials more clear.
        # In practice, you should use the class variables directly unless you have a good
        # reason not to.
        group = self.group

        ## BEGIN_SUB_TUTORIAL plan_to_joint_state
        ##
        ## Planning to a Joint Goal
        ## ^^^^^^^^^^^^^^^^^^^^^^^^
        ## The Panda's zero configuration is at a `singularity <https://www.quora.com/Robotics-What-is-meant-by-kinematic-singularity>`_ so the first
        ## thing we want to do is move it to a slightly better configuration.
        # We can get the joint values from the group and adjust some of the values:
        joint_goal = group.get_current_joint_values()
        joint_goal[0] = 0
        joint_goal[1] = 0
        joint_goal[2] = 0
        joint_goal[3] = 0
        joint_goal[4] = 3.1
        group.go(joint_goal, wait=True)
        rospy.sleep(1)

        print "============ Press `Enter` to move to next point ..."
        raw_input()
        joint_goal = group.get_current_joint_values()
        #joint_goal[0] = -0.57
        #joint_goal[1] = -0.75
        #joint_goal[2] = -0.94
        #joint_goal[3] = 0.2
        #joint_goal[4] = 0
        joint_goal[0] = -0.57
        joint_goal[1] = -0.75
        joint_goal[2] = -0.94
        joint_goal[3] = -1.32
        joint_goal[4] = -0.57
        group.go(joint_goal, wait=True)
        rospy.sleep(3)
        group.stop()
        ## END_SUB_TUTORIAL

        # For testing:
        # Note that since this section of code will not be included in the tutorials
        # we use the class variable rather than the copied state variable
        current_joints = self.group.get_current_joint_values()
        current_pose = self.group.get_current_pose().pose
        print current_pose
        return all_close(joint_goal, current_joints, 0.01)

    def plan_cartesian_path(self, scale=1):
        # Copy class variables to local variables to make the web tutorials more clear.
        # In practice, you should use the class variables directly unless you have a good
        # reason not to.
        group = self.group

        ## BEGIN_SUB_TUTORIAL plan_cartesian_path
        ##
        ## Cartesian Paths
        ## ^^^^^^^^^^^^^^^
        ## You can plan a Cartesian path directly by specifying a list of waypoints
        ## for the end-effector to go through:
        ##
        waypoints = []
        wpose = group.get_current_pose().pose
        #wpose.position.z -= scale * 0.1  # First move up (z)
        #wpose.position.x += scale * 0.2  # Third move sideways (y)
        #waypoints.append(copy.deepcopy(wpose))
        wpose.position.y += scale * 1  # and sideways (y)
        waypoints.append(copy.deepcopy(wpose))
        #wpose.position.x += scale * 0.2  # Third move sideways (y)
        #waypoints.append(copy.deepcopy(wpose))
        #wpose.position.z += scale * 0.3  # Second move forward/backwards in (x)
        #waypoints.append(copy.deepcopy(wpose))

        # We want the Cartesian path to be interpolated at a resolution of 1 cm
        # which is why we will specify 0.01 as the eef_step in Cartesian
        # translation. We will disable the jump threshold by setting it to 0.0 disabling:
        (plan, fraction) = group.compute_cartesian_path(
                               waypoints,  # waypoints to follow
                               0.01,       # eef_step
                               0)          # jump_threshold

        # Note: We are just planning, not asking move_group to actually move the robot yet:
        return plan, fraction
        ## END_SUB_TUTORIAL

    def display_trajectory(self, plan):
        # Copy class variables to local variables to make the web tutorials more clear.
        # In practice, you should use the class variables directly unless you have a good
        # reason not to.
        robot = self.robot
        display_trajectory_publisher = self.display_trajectory_publisher

        ## BEGIN_SUB_TUTORIAL display_trajectory
        ##
        ## Displaying a Trajectory
        ## ^^^^^^^^^^^^^^^^^^^^^^^
        ## You can ask RViz to visualize a plan (aka trajectory) for you. But the
        ## group.plan() method does this automatically so this is not that useful
        ## here (it just displays the same trajectory again):
        ##
        ## A `DisplayTrajectory`_ msg has two primary fields, trajectory_start and trajectory.
        ## We populate the trajectory_start with our current robot state to copy over
        ## any AttachedCollisionObjects and add our plan to the trajectory.
        display_trajectory = moveit_msgs.msg.DisplayTrajectory()
        display_trajectory.trajectory_start = robot.get_current_state()
        display_trajectory.trajectory.append(plan)
        # Publish
        display_trajectory_publisher.publish(display_trajectory)

    def execute_plan(self, plan):
        # Copy class variables to local variables to make the web tutorials more clear.
        # In practice, you should use the class variables directly unless you have a good
        # reason not to.
        group = self.group

        ## BEGIN_SUB_TUTORIAL execute_plan
        ##
        ## Executing a Plan
        ## ^^^^^^^^^^^^^^^^
        ## Use execute if you would like the robot to follow
        ## the plan that has already been computed:
        group.execute(plan, wait=True)
        ## **Note:** The robot's current joint state must be within some tolerance of the
        ## first waypoint in the `RobotTrajectory`_ or ``execute()`` will fail
        ## END_SUB_TUTORIAL


def main():
    try:
        print "============ Press `Enter` to begin and launch moveit_commander (press ctrl-d to exit) ..."
        raw_input()
        tutorial = MoveGroupPythonIntefaceTutorial()

        print "============ Press `Enter` to execute a movement using a joint state goal ..."
        raw_input()
        tutorial.go_to_joint_state()

        print "============ Press `Enter` to plan and display a Cartesian path ..."
        raw_input()
        cartesian_plan, fraction = tutorial.plan_cartesian_path()

        print "============ Press `Enter` to display a saved trajectory (this will replay the Cartesian path) ..."
        raw_input()
        tutorial.display_trajectory(cartesian_plan)

        print "============ Press `Enter` to execute a saved path ..."
        raw_input()
        tutorial.execute_plan(cartesian_plan)

        print "============ Python_moveit complete!"
    except rospy.ROSInterruptException:
        return
    except KeyboardInterrupt:
        return


if __name__ == '__main__':
    main()
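The `all_close` helper above dispatches on the goal's message type; for plain joint-value lists it reduces to an element-wise tolerance check. A minimal, ROS-free restatement (the name `all_close_list` is mine, not part of the original file):

```python
def all_close_list(goal, actual, tolerance):
    """Element-wise tolerance check, as in the list branch of all_close():
    fail as soon as any joint deviates by more than the tolerance."""
    return all(abs(a - g) <= tolerance for g, a in zip(goal, actual))
```

The Pose and PoseStamped branches of the original just convert the messages to flat 7-element lists (position + quaternion) and recurse into this same comparison.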
# ---- reviews/migrations/0003_auto_20180610_1456.py (repo: UrbanBogger/horrorexplosion, license: MIT) ----
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-06-10 13:56
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('reviews', '0002_auto_20180610_1406'),
    ]

    operations = [
        migrations.AddField(
            model_name='movie',
            name='human_readable_url',
            field=models.SlugField(help_text="Enter the 'slug',i.e., the human-readable URL for the movie", null=True, unique=True),
        ),
        migrations.AddField(
            model_name='moviereview',
            name='human_readable_url',
            field=models.SlugField(help_text="Enter the 'slug',i.e., the human-readable URL for the movie review", null=True, unique=True),
        ),
    ]
# ---- coinbasepro/auth.py (repo: apfelnutzer/coinbasepro, license: MIT) ----
import time
import base64
import hashlib
import hmac

from requests.auth import AuthBase


class CoinbaseProAuth(AuthBase):
    """Request authorization.

    Provided by Coinbase Pro:
    https://docs.pro.coinbase.com/?python#signing-a-message
    """

    def __init__(self, api_key, secret_key, passphrase):
        self.api_key = api_key
        self.secret_key = secret_key
        self.passphrase = passphrase

    def __call__(self, request):
        timestamp = str(time.time())
        message = timestamp + request.method + request.path_url + (request.body or "")
        message = message.encode("ascii")
        hmac_key = base64.b64decode(self.secret_key)
        signature = hmac.new(hmac_key, message, hashlib.sha256)
        signature_b64 = base64.b64encode(signature.digest())
        request.headers.update(
            {
                "CB-ACCESS-SIGN": signature_b64,
                "CB-ACCESS-TIMESTAMP": timestamp,
                "CB-ACCESS-KEY": self.api_key,
                "CB-ACCESS-PASSPHRASE": self.passphrase,
                "Content-Type": "application/json",
            }
        )
        return request
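The `__call__` method above builds the `CB-ACCESS-SIGN` header from a prehash string `timestamp + method + path + body`, HMAC-SHA256-signed under the base64-decoded secret. That computation can be exercised standalone with a dummy secret (no network, no `requests`); the helper name `sign` is mine:

```python
import base64
import hashlib
import hmac


def sign(secret_key_b64, timestamp, method, path, body=""):
    """Reproduce the CB-ACCESS-SIGN computation:
    base64(HMAC-SHA256(base64decode(secret), timestamp + method + path + body))."""
    message = (timestamp + method + path + body).encode("ascii")
    hmac_key = base64.b64decode(secret_key_b64)
    digest = hmac.new(hmac_key, message, hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")


# Dummy secret for illustration only (base64 of b"not-a-real-key")
demo_secret = base64.b64encode(b"not-a-real-key").decode("ascii")
signature = sign(demo_secret, "1533828000.0", "GET", "/accounts")
```

Because the timestamp is part of the signed message, replaying a captured signature with a different timestamp fails server-side validation, which is the point of including it.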
# ---- src/rbx2/rbx2_tasks/src/rbx2_tasks/clean_house_tasks_tree.py (repo: fujy/ROS-Project, license: MIT) ----
#!/usr/bin/env python
""" clean_house_tasks_tree.py - Version 1.0 2013-12-20
Create a number of simulated cleaning tasks.
Created for the Pi Robot Project: http://www.pirobot.org
Copyright (c) 2013 Patrick Goebel. All rights reserved.
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.5
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details at:
http://www.gnu.org/licenses/gpl.html
"""
import rospy
from pi_trees_lib.pi_trees_lib import *
from geometry_msgs.msg import Twist
class Vacuum(Task):
def __init__(self, room=None, timer=3, *args):
name = "VACUUM_" + room.upper()
super(Vacuum, self).__init__(name)
self.name = name
self.room = room
self.counter = timer
self.finished = False
self.cmd_vel_pub = rospy.Publisher('cmd_vel', Twist)
self.cmd_vel_msg = Twist()
self.cmd_vel_msg.linear.x = 0.05
def run(self):
if self.finished:
return TaskStatus.SUCCESS
else:
rospy.loginfo('Vacuuming the floor in the ' + str(self.room))
while self.counter > 0:
self.cmd_vel_pub.publish(self.cmd_vel_msg)
self.cmd_vel_msg.linear.x *= -1
rospy.loginfo(self.counter)
self.counter -= 1
rospy.sleep(1)
return TaskStatus.RUNNING
self.finished = True
self.cmd_vel_pub.publish(Twist())
message = "Finished vacuuming the " + str(self.room) + "!"
rospy.loginfo(message)
class Mop(Task):
    def __init__(self, room=None, timer=3, *args):
        name = "MOP_" + room.upper()
        super(Mop, self).__init__(name)
        self.name = name
        self.room = room
        self.counter = timer
        self.finished = False
        self.cmd_vel_pub = rospy.Publisher('cmd_vel', Twist)
        self.cmd_vel_msg = Twist()
        self.cmd_vel_msg.linear.x = 0.05
        self.cmd_vel_msg.angular.z = 1.2

    def run(self):
        if self.finished:
            return TaskStatus.SUCCESS
        else:
            rospy.loginfo('Mopping the floor in the ' + str(self.room))
            while self.counter > 0:
                self.cmd_vel_pub.publish(self.cmd_vel_msg)
                self.cmd_vel_msg.linear.x *= -1
                rospy.loginfo(self.counter)
                self.counter -= 1
                rospy.sleep(1)
                return TaskStatus.RUNNING
            self.finished = True
            self.cmd_vel_pub.publish(Twist())
            message = "Done mopping the " + str(self.room) + "!"
            rospy.loginfo(message)
class Scrub(Task):
    def __init__(self, room=None, timer=7, *args):
        name = "SCRUB_" + room.upper()
        super(Scrub, self).__init__(name)
        self.name = name
        self.room = room
        self.finished = False
        self.counter = timer
        self.cmd_vel_pub = rospy.Publisher('cmd_vel', Twist)
        self.cmd_vel_msg = Twist()
        self.cmd_vel_msg.linear.x = 0.3
        self.cmd_vel_msg.angular.z = 0.2

    def run(self):
        if self.finished:
            return TaskStatus.SUCCESS
        else:
            rospy.loginfo('Cleaning the tub...')
            while self.counter > 0:
                self.cmd_vel_pub.publish(self.cmd_vel_msg)
                self.cmd_vel_msg.linear.x *= -1
                if self.counter % 2 == 0:  # was "== 5", which a % 2 result can never equal
                    self.cmd_vel_msg.angular.z *= -1
                rospy.loginfo(self.counter)
                self.counter -= 1
                rospy.sleep(0.2)
                return TaskStatus.RUNNING
            self.finished = True
            self.cmd_vel_pub.publish(Twist())
            message = "The tub is clean!"
            rospy.loginfo(message)
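The three tasks above share one behavior-tree pattern: `run()` does a unit of work and returns `RUNNING` while a counter ticks down, then latches `finished` and reports `SUCCESS` on every later tick. A minimal sketch of that pattern without ROS or pi_trees (the `TaskStatus` and `CountdownTask` names here are illustrative, not the library's API):

```python
class TaskStatus:
    RUNNING, SUCCESS = "RUNNING", "SUCCESS"

class CountdownTask:
    """Minimal stand-in for the pi_trees-style tasks above (no ROS)."""
    def __init__(self, timer=3):
        self.counter = timer
        self.finished = False

    def run(self):
        if self.finished:
            return TaskStatus.SUCCESS
        if self.counter > 0:
            self.counter -= 1           # one unit of work per tick
            return TaskStatus.RUNNING
        self.finished = True            # counter exhausted: latch success
        return TaskStatus.SUCCESS

task = CountdownTask(timer=2)
results = [task.run() for _ in range(4)]
# → ['RUNNING', 'RUNNING', 'SUCCESS', 'SUCCESS']
```

Unlike the tasks above, this sketch returns `SUCCESS` on the completion tick itself rather than falling through and returning `None` once.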
| 34.015873 | 73 | 0.569995 | 545 | 4,286 | 4.324771 | 0.269725 | 0.068731 | 0.101824 | 0.082732 | 0.640645 | 0.618583 | 0.609673 | 0.57955 | 0.547306 | 0.547306 | 0 | 0.01652 | 0.336211 | 4,286 | 126 | 74 | 34.015873 | 0.811951 | 0.175688 | 0 | 0.715909 | 0 | 0 | 0.048471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068182 | false | 0 | 0.034091 | 0 | 0.204545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11898903547e244ea87eaac3f750b792be78816a | 1,895 | py | Python | blog/models.py | alegume/furry-succotash | 2f8a49d4f11a50ae4884246984dc22346d6af328 | [
"BSD-3-Clause"
] | null | null | null | blog/models.py | alegume/furry-succotash | 2f8a49d4f11a50ae4884246984dc22346d6af328 | [
"BSD-3-Clause"
] | 7 | 2021-04-08T19:41:49.000Z | 2022-03-12T00:38:27.000Z | blog/models.py | alegume/furry-succotash | 2f8a49d4f11a50ae4884246984dc22346d6af328 | [
"BSD-3-Clause"
] | null | null | null | from django.conf import settings
from django.db import models
from django.utils import timezone
class Post(models.Model):
    author = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    title = models.CharField(max_length=200)
    text = models.TextField()
    created_date = models.DateTimeField(auto_now_add=True)  # auto_now would bump this on every save
    published_date = models.DateTimeField(blank=True, null=True)
    views = models.BigIntegerField(default=0)
    cover = models.ImageField(blank=True, default=None, upload_to='images/%Y/%m/')
    attachment = models.FileField(blank=True, default=None, upload_to='attachment')
    tags = models.ManyToManyField('blog.Tag')

    def publish(self):
        self.published_date = timezone.now()
        self.save()

    def approved_comments(self):
        return self.comments.filter(approved_comment=True)

    def likes_count(self):
        return PostLike.objects.filter(post=self).count()

    def __str__(self):
        return '{} ({})'.format(self.title, self.author)


class Tag(models.Model):
    name = models.CharField(max_length=30)

    def __str__(self):
        return self.name


class PostLike(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    post = models.ForeignKey('blog.Post', on_delete=models.CASCADE)
    created_date = models.DateTimeField(auto_now_add=True)  # see note on Post.created_date

    def __str__(self):
        return '{} - {}'.format(self.post.title, self.user)


class Comment(models.Model):
    post = models.ForeignKey('blog.Post', on_delete=models.CASCADE, related_name='comments')
    author = models.CharField(max_length=200)
    text = models.TextField()
    created_date = models.DateTimeField(default=timezone.now)
    approved_comment = models.BooleanField(default=False)

    def approve(self):
        self.approved_comment = True
        self.save()

    def __str__(self):
        return self.text
| 33.839286 | 92 | 0.709763 | 237 | 1,895 | 5.49789 | 0.312236 | 0.046048 | 0.042978 | 0.064467 | 0.434382 | 0.403684 | 0.320798 | 0.28089 | 0.28089 | 0.205679 | 0 | 0.005718 | 0.169393 | 1,895 | 55 | 93 | 34.454545 | 0.822109 | 0 | 0 | 0.232558 | 0 | 0 | 0.037467 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.186047 | false | 0 | 0.069767 | 0.139535 | 0.906977 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
118b7cd89d4c450e3759129b6c9154abc809c419 | 573 | py | Python | ex036.py | MarcelleFranca/exercicios-python | e65c1d1f2d92db0e8f5035dea8062a53defa899f | [
"MIT"
] | null | null | null | ex036.py | MarcelleFranca/exercicios-python | e65c1d1f2d92db0e8f5035dea8062a53defa899f | [
"MIT"
] | null | null | null | ex036.py | MarcelleFranca/exercicios-python | e65c1d1f2d92db0e8f5035dea8062a53defa899f | [
"MIT"
] | null | null | null | print('-=-' * 25)
print('WELCOME TO YOUR LOAN SIMULATOR! TO START, FOLLOW THE INSTRUCTIONS BELOW.')
print('-=-' * 25)
salario = float(input('What is your current salary? R$'))
casa = float(input('What is the price of the house you want to buy? R$'))
anos = int(input('Over how many years do you want to finance the house? '))
fina = casa / (anos * 12)
minimo = salario * 30 / 100
print('For a R${:.2f} house financed over {} year(s), the monthly payment will be R${:.2f}.'.format(casa, anos, fina))
if fina <= minimo:
    print('Loan APPROVED!')
else:
    print('Loan DENIED!')
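The approval rule above is simple arithmetic: the monthly payment is the price divided by the number of months, and it is approved only when it fits within 30% of the salary. The same logic as a reusable function (the `loan_approved` name is mine, not part of the exercise):

```python
def loan_approved(salary, house_price, years):
    """Approve when the monthly payment fits within 30% of the salary."""
    payment = house_price / (years * 12.0)
    return payment <= salary * 0.30

# payment 120000/240 = 500.00, limit 0.30*2000 = 600.00 -> approved
ok = loan_approved(2000.0, 120000.0, 20)
```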
| 40.928571 | 111 | 0.663176 | 91 | 573 | 4.175824 | 0.593407 | 0.036842 | 0.073684 | 0.078947 | 0.084211 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027311 | 0.169284 | 573 | 13 | 112 | 44.076923 | 0.771008 | 0 | 0 | 0.153846 | 0 | 0.076923 | 0.554974 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.461538 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
1196985810c472b20b259b654187b432d0eed6b4 | 986 | py | Python | src/__main__.py | MarioBonse/MulticameraTraking | 8f87c5fc3c807b2fa6ed18aacd153310855a31bf | [
"MIT"
] | null | null | null | src/__main__.py | MarioBonse/MulticameraTraking | 8f87c5fc3c807b2fa6ed18aacd153310855a31bf | [
"MIT"
] | null | null | null | src/__main__.py | MarioBonse/MulticameraTraking | 8f87c5fc3c807b2fa6ed18aacd153310855a31bf | [
"MIT"
] | null | null | null | import threeDplot
import camera as cam
import HSVObjTracking as HSVTr
import numpy as np
import matplotlib.pyplot as plt
import mpl_toolkits.mplot3d.axes3d as p3
import matplotlib.animation as animation
def main():
    # red ball HSV thresholds
    ballLower = (137, 88, 55)
    ballUpper = (183, 255, 255)
    camera1 = cam.camera(1, "/dev/video0")
    camera2 = cam.camera(2, "/dev/video3")
    if camera1.createCameraMatrixUndistort() == False:
        return False
    if camera2.createCameraMatrixUndistort() == False:
        return False
    camera1.showvideo()
    camera2.showvideo()
    trackingObject = HSVTr.HSVObjTracking(ballLower, ballUpper, [camera1, camera2])
    x, y, z = trackingObject.threedMovementsRecontstruction()
    threeDplot.displayanimation(x, y, z)


if __name__ == "__main__":
    main()
###yellow ball
#ballLower = (14, 67, 34)
#ballUpper = (57, 255, 255)
#
######red ball
#ballLower = (137, 88, 55)
#ballUpper = (183, 255, 255)
| 27.388889 | 84 | 0.6643 | 112 | 986 | 5.767857 | 0.473214 | 0.060372 | 0.049536 | 0.058824 | 0.126935 | 0.126935 | 0.126935 | 0.126935 | 0.126935 | 0.126935 | 0 | 0.07871 | 0.213996 | 986 | 35 | 85 | 28.171429 | 0.754839 | 0.130832 | 0 | 0.086957 | 0 | 0 | 0.037406 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.304348 | 0 | 0.434783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1198641ba2be85478e0534ef9825ff1b7092342f | 931 | py | Python | src/genie/libs/parser/iosxe/tests/ShowIsisHostname/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 204 | 2018-06-27T00:55:27.000Z | 2022-03-06T21:12:18.000Z | src/genie/libs/parser/iosxe/tests/ShowIsisHostname/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 468 | 2018-06-19T00:33:18.000Z | 2022-03-31T23:23:35.000Z | src/genie/libs/parser/iosxe/tests/ShowIsisHostname/cli/equal/golden_output_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 309 | 2019-01-16T20:21:07.000Z | 2022-03-30T12:56:41.000Z | expected_output = {
"tag": {
"VRF1": {
"hostname_db": {
"hostname": {
"7777.77ff.eeee": {"hostname": "R7", "level": 2},
"2222.22ff.4444": {"hostname": "R2", "local_router": True},
}
}
},
"test": {
"hostname_db": {
"hostname": {
"9999.99ff.3333": {"hostname": "R9", "level": 2},
"8888.88ff.1111": {"hostname": "R8", "level": 2},
"7777.77ff.eeee": {"hostname": "R7", "level": 2},
"5555.55ff.aaaa": {"hostname": "R5", "level": 2},
"3333.33ff.6666": {"hostname": "R3", "level": 2},
"1111.11ff.2222": {"hostname": "R1", "level": 1},
"2222.22ff.4444": {"hostname": "R2", "local_router": True},
}
}
},
}
}
| 35.807692 | 79 | 0.360902 | 76 | 931 | 4.355263 | 0.5 | 0.108761 | 0.108761 | 0.120846 | 0.392749 | 0.392749 | 0.392749 | 0.223565 | 0 | 0 | 0 | 0.179584 | 0.431794 | 931 | 25 | 80 | 37.24 | 0.446125 | 0 | 0 | 0.32 | 0 | 0 | 0.348013 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
119d49f803cf21333256f78210d4c7fe2ef2ff51 | 7,596 | py | Python | python/biograph/internal/breadth_find_near.py | spiralgenetics/biograph | 33c78278ce673e885f38435384f9578bfbf9cdb8 | [
"BSD-2-Clause"
] | 16 | 2021-07-14T23:32:31.000Z | 2022-03-24T16:25:15.000Z | python/biograph/internal/breadth_find_near.py | spiralgenetics/biograph | 33c78278ce673e885f38435384f9578bfbf9cdb8 | [
"BSD-2-Clause"
] | 9 | 2021-07-20T20:39:47.000Z | 2021-09-16T20:57:59.000Z | python/biograph/internal/breadth_find_near.py | spiralgenetics/biograph | 33c78278ce673e885f38435384f9578bfbf9cdb8 | [
"BSD-2-Clause"
] | 9 | 2021-07-15T19:38:35.000Z | 2022-01-31T19:24:56.000Z |
# coding: utf-8
# In[195]:
from biograph import BioGraph, Sequence
cseq = "GGTTTAAGGCGTTTCCGTTCTTCTTCGTCATAACTTAATG"
diff = " dd * i"
qseq = "GGTTTAAGGTTTCCGTTTTTCTTCAGTCATAACTTAATG"
diff = " dd * i ****"
qseq = "GGTTTAAGGTTTCCGTTTTTCTTCAGTCATAACTTTTTT"
qseq = "GGTTTAAGGCGTTTCCGTTCTTCTTCGTCATAACTTCCCC"
diff = " ****"
qseq = "acattttaaacaatttatactagccccccccgcccttttttaacagctctcataaatatgtagttcatttctctTgaatacttaagtcaactgtatga".upper()
diff = "acattttaaacaatttatactagcccccccc..gcccttttttaacagctctcataaatatgtagttcatttctcttgaatacttaagtcaactgtatga"
bases = ["A", "T", "C", "G"]
bg_file = "/home/english/temp/lambdaToyData/proband.bg"
bg_file = "/home/english/ajtrio/HG002-NA24385-50x.bg/"
my_bg = BioGraph(bg_file)
max_diff = 3
""""
>ref
ACATTTTAAACAATTTATACTAGCCCCCCCCGCCCTTTTTTAACAGCTCTCATAAATATGTAGTTCATTTCTCTTGAATACTTAAGTCAACTGTATGA
>query1
ACATTTTAAACAATTTATACTAGCCCCCCCCCCGCCCTTTTTTAACAGCTCTCATAAATATGTAGTTCATTTCTCTTGAATACTTAAGTCAACTGTATGA
>query2
ACATTTTAAACAATTTATACTAGCCCCCCCCCCGCCCTTTTTTAACAGCTCTCATAAATATGTAGTTCATTTCTCTGGAATACTTAAGTCAACTGTATGA
ACTIONS QUERY SCORE START END QSIZE IDENTITY CHRO STRAND START END SPAN
---------------------------------------------------------------------------------------------------
browser details query1 97 1 100 100 96.0% 17 + 57832317 57832414 98
browser details query2 95 1 100 100 94.9% 17 + 57832317 57832414 98
browser details ref 98 1 98 98 100.0% 17 + 57832317 57832414 98
query1 - details
00000001 acattttaaacaatttatactagccccccccccgcccttttttaacagct 00000050
>>>>>>>> ||||||||||||||||||||||||||||||| ||||||||||||||||| >>>>>>>>
57832317 acattttaaacaatttatactagcccccccc..gcccttttttaacagct 57832364
^
57832348
00000051 ctcataaatatgtagttcatttctcttgaatacttaagtcaactgtatga 00000100
>>>>>>>> |||||||||||||||||||||||||||||||||||||||||||||||||| >>>>>>>>
57832365 ctcataaatatgtagttcatttctcttgaatacttaagtcaactgtatga 57832414
query2 - details
00000001 acattttaaacaatttatactagccccccccccgcccttttttaacagct 00000050
>>>>>>>> ||||||||||||||||||||||||||||||| ||||||||||||||||| >>>>>>>>
57832317 acattttaaacaatttatactagcccccccc..gcccttttttaacagct 57832364
^
57832348
00000051 ctcataaatatgtagttcatttctctggaatacttaagtcaactgtatga 00000100
>>>>>>>> |||||||||||||||||||||||||| ||||||||||||||||||||||| >>>>>>>>
57832365 ctcataaatatgtagttcatttctcttgaatacttaagtcaactgtatga 57832414
^
57832392
Both variants exist!
https://genome.ucsc.edu/cgi-bin/hgTracks?db=hg19&lastVirtModeType=default&lastVirtModeExtraState=&virtModeType=default&virtMode=0&nonVirtPosition=&position=chr17%3A57832317-57832414&hgsid=596652695_gh1Wcz1EruLV7qLCcogBY3CdjlV9
"""
# hotcache
import sys
sys.stderr.write("hotcache")
with open(bg_file + "/seqset", "rb") as fh:
    for chunk in iter(lambda: fh.read(4096), b""):
        pass
sys.stderr.write(" fin\n")
# In[196]:
print my_bg.find(qseq)
def find_diff(my_bg, search_seq, max_diff=3):
    """
    Search for a string with up to max_diff mis/ins/del.
    Returns a list of seqset_ranges.
    """
    search_seq = search_seq[::-1]  # Reverse it for pop_front
    candidates = []
    cur_diff = 0
    if max_diff == 0:
        ret = my_bg.find(search_seq)
        if ret.valid:
            return [ret]
        return []
    for base in bases:
        new = my_bg.find(base[0])
        if new.valid:
            # print base, cur_diff, search_seq[0], base == search_seq[0]
            if base == search_seq[0]:
                candidates.append((new, 0, search_seq[1:], False))  # mat
            else:
                candidates.append((new, 1, search_seq, False))  # ins err
                candidates.append((new, 1, search_seq[1:], True))  # mis err
    candidates.append((my_bg.find(""), 1, search_seq[1:], True))  # del err
    # print "start" redundancy filtering
    ret = find_diff_recursive(candidates, max_diff, len(search_seq))
    for i in ret:
        print i[0].sequence, i
    return candidate_unique(ret, False)
def candidate_unique(candidates, with_remain=True):
    """
    Remove redundant candidates.
    If with_remain: we're still building, so we need to take remaining sequence into account.
    """
    uhash = {}
    for i in candidates:
        if with_remain:
            key = "%d_%d_%s" % (i[0].start, i[0].end, i[2])
        else:
            key = "%d_%d" % (i[0].start, i[0].end)
        if key not in uhash:
            uhash[key] = i
        else:
            if uhash[key][1] > i[1]:
                uhash[key] = i
    print 'filt', len(candidates), '->', len(uhash)
    return uhash.values()
def find_diff_recursive(candidates, max_diff, query_len):
    """
    Candidate = (sequence being built, its difference count, sequence remaining to explore, previous base indel).
    Can't have side-by-side insertions/deletions.
    """
    if len(candidates) == 0:
        return []
    # reduce redundancy -- hardly worth it?
    candidates = candidate_unique(candidates)
    new_candidates = []
    finished = []
    for cand, diff_cnt, remain_seq, pindel in candidates:  # for each of the candidates
        # print cand.sequence, diff_cnt, remain_seq
        if len(remain_seq) == 0:  # nothing left to do
            # print('completed')
            finished.append((cand, diff_cnt))
        elif diff_cnt >= max_diff:  # can only try matching
            # print("maxdiff'd")
            pos = 0
            while cand.valid and pos < len(remain_seq):
                cand = cand.push_front(remain_seq[pos])
                pos += 1
            if cand.valid and pos == len(remain_seq):
                # print('but good', pos, len(remain_seq), cand.sequence)
                finished.append((cand, diff_cnt))
        else:
            has_match = False
            for base in bases:  # try pushing on a new base
                new = cand.push_front(base[0])
                if new.valid:
                    if base == remain_seq[0]:
                        has_match = True
                        new_candidates.append((new, diff_cnt, remain_seq[1:], False))  # mat
                        # you can't ins or del a match
                        # left-aligned
                    else:
                        new_candidates.append((new, diff_cnt + 1, remain_seq[1:], False))  # mis
                        if not pindel:
                            new_candidates.append((new, diff_cnt + 1, remain_seq, True))  # ins
            # didn't find this base, give a candidate with a del
            if not has_match and not pindel:
                new_candidates.append((cand, diff_cnt + 1, remain_seq[1:], True))  # del
    return finished + find_diff_recursive(new_candidates, max_diff, query_len)
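The candidate expansion above is a breadth-first search over mismatch/insert/delete edits against the BioGraph seqset. The acceptance criterion it implements — "within max_diff edits" — can be checked for plain strings with the classic dynamic-programming recurrence; this self-contained sketch (written in Python 3 style, unlike the Python 2 file above) is an analogue for testing intuition, not a BioGraph API:

```python
def within_edit_distance(a, b, max_diff):
    """True if strings a and b are within max_diff edits (mis/ins/del),
    via the standard Levenshtein dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))  # distances for the empty prefix of a
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # match/mismatch
        prev = cur
    return prev[-1] <= max_diff
```

For example, `within_edit_distance("GATTACA", "GATTTACA", 1)` holds because a single inserted `T` separates the two strings.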
# In[199]:
import time
sys.stderr.write(time.asctime() + ' first\n')
j = find_diff(my_bg, qseq, 5)
sys.stderr.write(time.asctime() + ' second\n')
j = find_diff(my_bg, qseq, 5)
sys.stderr.write(time.asctime() + ' finish\n')
# In[200]:
print cseq
print diff
print qseq, len(j)
for x in j:
print x[0].sequence, x
# In[ ]:
exit()
# In[136]:
def ne(self, other):
    return not self == other
Sequence.__ne__ = ne


def hash(self):
    return str(self).__hash__()
Sequence.__hash__ = hash
# In[137]:
print(len(j))
# In[135]:
"j".__hash__
# In[ ]:
| 35.330233 | 226 | 0.590837 | 805 | 7,596 | 5.444721 | 0.313043 | 0.024641 | 0.02601 | 0.013689 | 0.235455 | 0.180698 | 0.120009 | 0.107689 | 0.107689 | 0.089893 | 0 | 0.065609 | 0.269616 | 7,596 | 214 | 227 | 35.495327 | 0.724405 | 0.081885 | 0 | 0.155963 | 0 | 0 | 0.143386 | 0.101989 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.009174 | 0.027523 | null | null | 0.073395 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11a349d89eed4479b108886885b0685929e596e4 | 21,670 | py | Python | fepy/pycopia/fepy/UI.py | kdart/pycopia | 1446fabaedf8c6bdd4ab1fc3f0ea731e0ef8da9d | [
"Apache-2.0"
] | 89 | 2015-03-26T11:25:20.000Z | 2022-01-12T06:25:14.000Z | fepy/pycopia/fepy/UI.py | kdart/pycopia | 1446fabaedf8c6bdd4ab1fc3f0ea731e0ef8da9d | [
"Apache-2.0"
] | 1 | 2015-07-05T03:27:43.000Z | 2015-07-11T06:21:20.000Z | fepy/pycopia/fepy/UI.py | kdart/pycopia | 1446fabaedf8c6bdd4ab1fc3f0ea731e0ef8da9d | [
"Apache-2.0"
] | 30 | 2015-04-30T01:35:54.000Z | 2022-01-12T06:19:49.000Z | # -*- coding: utf-8 -*-
# vim:ts=4:sw=4:softtabstop=4:smarttab:expandtab
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
User Interface base classes and themes.
"""
import sys
import os
import time
import StringIO
from pycopia import environ
from pycopia import cliutils
from pycopia.fsm import FSM, ANY
from pycopia.aid import IF
# set the PROMPT ignore depending on whether or not readline module is
# available.
#try:
# import readline
# PROMPT_START_IGNORE = '\001'
# PROMPT_END_IGNORE = '\002'
#except ImportError:
# readline = None
# PROMPT_START_IGNORE = ''
# PROMPT_END_IGNORE = ''
from types import MethodType
class UIError(Exception):
    pass

class UIFindError(UIError):
    pass
# themes define some basic "look and feel" for a CLI. This includes prompt
# srtrings and color set.
class Theme(object):
    NORMAL = RESET = ""
    BOLD = BRIGHT = ""
    BLACK = ""
    RED = ""
    GREEN = ""
    YELLOW = ""
    BLUE = ""
    MAGENTA = ""
    CYAN = ""
    WHITE = ""
    DEFAULT = ""
    GREY = ""
    BRIGHTRED = ""
    BRIGHTGREEN = ""
    BRIGHTYELLOW = ""
    BRIGHTBLUE = ""
    BRIGHTMAGENTA = ""
    BRIGHTCYAN = ""
    BRIGHTWHITE = ""
    UNDERSCORE = ""
    BLINK = ""
    help_local = WHITE
    help_inherited = YELLOW
    help_created = GREEN

    @classmethod
    def _setcolors(cls):
        pass

    def __init__(self, ps1="> ", ps2="more> ", ps3="choose", ps4="-> "):
        self._ps1 = ps1  # main prompt
        self._ps2 = ps2  # more input needed
        self._ps3 = ps3  # choose prompt
        self._ps4 = ps4  # input prompt
        self._setcolors()

    def _set_ps1(self, new):
        self._ps1 = str(new)

    def _set_ps2(self, new):
        self._ps2 = str(new)

    def _set_ps3(self, new):
        self._ps3 = str(new)

    def _set_ps4(self, new):
        self._ps4 = str(new)

    ps1 = property(lambda s: s._ps1, _set_ps1, None, "primary prompt")
    ps2 = property(lambda s: s._ps2, _set_ps2, None, "more input needed")
    ps3 = property(lambda s: s._ps3, _set_ps3, None, "choose prompt")
    ps4 = property(lambda s: s._ps4, _set_ps4, None, "text input prompt")
class BasicTheme(Theme):
    def _setcolors(cls):
        "Base class for themes. Defines interface."
        cls.NORMAL = cls.RESET = "\x1b[0m"
        cls.BOLD = cls.BRIGHT = "\x1b[1m"
        cls.BLACK = ""
        cls.RED = ""
        cls.GREEN = ""
        cls.YELLOW = ""
        cls.BLUE = ""
        cls.MAGENTA = ""
        cls.CYAN = ""
        cls.WHITE = ""
        cls.DEFAULT = ""
        cls.GREY = ""
        cls.BRIGHTRED = ""
        cls.BRIGHTGREEN = ""
        cls.BRIGHTYELLOW = ""
        cls.BRIGHTBLUE = ""
        cls.BRIGHTMAGENTA = ""
        cls.BRIGHTCYAN = ""
        cls.BRIGHTWHITE = ""
        cls.UNDERSCORE = "\x1b[4m"
        cls.BLINK = "\x1b[5m"
        cls.help_local = cls.WHITE
        cls.help_inherited = cls.YELLOW
        cls.help_created = cls.GREEN
    _setcolors = classmethod(_setcolors)
class ANSITheme(BasicTheme):
    """Defines tunable parameters for the UserInterface, to provide
    different color schemes and prompts.
    """
    def _setcolors(cls):
        # ANSI escapes for color terminals
        cls.NORMAL = cls.RESET = "\x1b[0m"
        cls.BOLD = cls.BRIGHT = "\x1b[01m"
        cls.BLACK = "\x1b[30m"
        cls.RED = "\x1b[31m"
        cls.GREEN = "\x1b[32m"
        cls.YELLOW = "\x1b[33m"
        cls.BLUE = "\x1b[34m"
        cls.MAGENTA = "\x1b[35m"
        cls.CYAN = "\x1b[36m"
        cls.WHITE = "\x1b[37m"
        cls.GREY = "\x1b[30;01m"
        cls.BRIGHTRED = "\x1b[31;01m"
        cls.BRIGHTGREEN = "\x1b[32;01m"
        cls.BRIGHTYELLOW = "\x1b[33;01m"
        cls.BRIGHTBLUE = "\x1b[34;01m"
        cls.BRIGHTMAGENTA = "\x1b[35;01m"
        cls.BRIGHTCYAN = "\x1b[36;01m"
        cls.BRIGHTWHITE = "\x1b[37;01m"
        cls.DEFAULT = "\x1b[39;49m"
        cls.UNDERSCORE = "\x1b[4m"
        cls.BLINK = "\x1b[5m"
        cls.help_local = cls.BRIGHTWHITE
        cls.help_inherited = cls.YELLOW
        cls.help_created = cls.GREEN
    _setcolors = classmethod(_setcolors)
DefaultTheme = ANSITheme
class UserInterface(object):
    """An ANSI terminal user interface for CLIs. """
    def __init__(self, io, env=None, theme=None):
        self.set_IO(io)
        self._env = env or environ.Environ()
        assert hasattr(self._env, "get")
        self._env["_"] = None
        self._cache = {}
        self.set_theme(theme)
        self._initfsm()
        self.initialize()

    def set_IO(self, io):
        self._io = io
        self._termlen, self._termwidth = 24, 80  # TODO dotnet Terminal window size?

    def get_IO(self):
        return self._io

    def _del_IO(self):
        self._io = None

    IO = property(get_IO, set_IO, _del_IO)

    def __del__(self):
        try:
            self.finalize()
        except:
            pass

    def initialize(self, *args):
        pass

    def finalize(self):
        pass

    def close(self):
        if self._io is not None:
            self._io.close()
            self._io = None

    def set_environ(self, env):
        assert hasattr(env, "get")
        self._env = env
        self._env["_"] = None

    def set_theme(self, theme):
        self._theme = theme or DefaultTheme()
        assert isinstance(self._theme, Theme), "must supply a Theme object."
        self._env.setdefault("PS1", self._theme.ps1)
        self._env.setdefault("PS2", self._theme.ps2)
        self._env.setdefault("PS3", self._theme.ps3)
        self._env.setdefault("PS4", self._theme.ps4)

    def clone(self, theme=None):
        return self.__class__(self._io, self._env.copy(), theme or self._theme)

    # output methods
    def Print(self, *objs):
        wr = self._io.write
        if objs:
            for obj in objs[:-1]:
                wr(str(obj))
                wr(" ")
            last = objs[-1]
            if last is not None:  # don't NL if last value is None (works like trailing comma).
                wr(str(last))
                wr("\n")
        else:
            wr("\n")
        self._io.flush()

    def pprint(self, obj):
        self._format(obj, 0, 0, {}, 0)
        self._io.write("\n")
        self._io.flush()

    def printf(self, text):
        "Print text run through the prompt formatter."
        self.Print(self.format(text))

    def print_obj(self, obj, nl=1):
        if nl:
            self._io.write("%s\n" % (obj,))
        else:
            self._io.write(str(obj))
        self._io.flush()

    def print_list(self, clist, indent=0):
        if clist:
            width = self._termwidth - 9
            indent = min(max(indent, 0), width)
            ps = " " * indent
            for c in clist[:-1]:
                cs = "%s, " % (c,)
                if len(ps) + len(cs) > width:
                    self.print_obj(ps)
                    ps = "%s%s" % (" " * indent, cs)
                else:
                    ps += cs
            self.print_obj("%s%s" % (ps, clist[-1]))

    def error(self, text):
        self.printf("%%r%s%%N" % (text,))

    # report-like methods for test framework
    def write(self, text):
        self._io.write(text)

    def writeline(self, text=""):
        self._io.writeline(text)

    def writelines(self, lines):
        self._io.writelines(lines)

    def add_heading(self, text, level=1):
        s = ["\n"]
        s.append("%s%s" % (" "*(level-1), text))
        s.append("%s%s" % (" "*(level-1), "-"*len(text)))
        self.Print("\n".join(s))

    def add_title(self, title):
        self.add_heading(title, 0)

    # called with the name of a logfile to report
    def logfile(self, filename):
        self._io.write("LOGFILE: <%s>\n" % (filename,))

    def add_message(self, msgtype, msg, level=1):
        self._io.write("%s%s: %s\n" % (" "*(level-1), msgtype, msg))

    def add_summary(self, text):
        self._io.write(text)

    def add_text(self, text):
        self._io.write(text)

    def add_url(self, text, url):
        self._io.write("%s: <%s>\n" % (text, url))

    def passed(self, msg="", level=1):
        return self.add_message(self.format("%GPASSED%N"), msg, level)

    def failed(self, msg="", level=1):
        return self.add_message(self.format("%RFAILED%N"), msg, level)

    def incomplete(self, msg="", level=1):
        return self.add_message(self.format("%yINCOMPLETE%N"), msg, level)

    def abort(self, msg="", level=1):
        return self.add_message(self.format("%YABORT%N"), msg, level)

    def info(self, msg, level=1):
        return self.add_message("INFO", msg, level)

    def diagnostic(self, msg, level=1):
        return self.add_message(self.format("%yDIAGNOSTIC%N"), msg, level)

    def newpage(self):
        self._io.write("\x0c")  # FF

    def newsection(self):
        self._io.write("\x0c")  # FF

    # user input
    def _get_prompt(self, name, prompt=None):
        return self.format(prompt or self._env[name])

    def user_input(self, prompt=None):
        return self._io.raw_input(self._get_prompt("PS1", prompt))

    def more_user_input(self):
        return self._io.raw_input(self._get_prompt("PS2"))

    def choose(self, somelist, defidx=0, prompt=None):
        return cliutils.choose(somelist,
                    defidx,
                    self._get_prompt("PS3", prompt),
                    input=self._io.raw_input, error=self.error)

    def get_text(self, msg=None):
        return cliutils.get_text(self._get_prompt("PS4"), msg, input=self._io.raw_input)

    def get_value(self, prompt, default=None):
        return cliutils.get_input(self.format(prompt), default, self._io.raw_input)

    def yes_no(self, prompt, default=True):
        yesno = cliutils.get_input(self.format(prompt), IF(default, "Y", "N"), self._io.raw_input)
        return yesno.upper().startswith("Y")

    # docstring/help formatters
    def _format_doc(self, s, color):
        i = s.find("\n")
        if i > 0:
            return color + s[:i] + self._theme.NORMAL + s[i:] + "\n"
        else:
            return color + s + self._theme.NORMAL + "\n"

    def help_local(self, text):
        self.Print(self._format_doc(text, self._theme.help_local))

    def help_inherited(self, text):
        self.Print(self._format_doc(text, self._theme.help_inherited))

    def help_created(self, text):
        self.Print(self._format_doc(text, self._theme.help_created))

    def format(self, ps):
        "Expand percent-expansions in a string and return the result."
        self._fsm.process_string(ps)
        return self._getarg()

    def register_expansion(self, key, func):
        """Register a percent-expansion function for the format method. The
        function must take one argument, and return a string. The argument is
        the character expanded on."""
        key = str(key)[0]
        if not self._EXPANSIONS.has_key(key):
            self._EXPANSIONS[key] = func
        else:
            raise ValueError, "expansion key %r already exists." % (key, )

    # FSM for prompt expansion
    def _initfsm(self):
        # maps percent-expansion items to some value.
        self._EXPANSIONS = {
            "I": self._theme.BRIGHT,
            "N": self._theme.NORMAL,
            "D": self._theme.DEFAULT,
            "R": self._theme.BRIGHTRED,
            "G": self._theme.BRIGHTGREEN,
            "Y": self._theme.BRIGHTYELLOW,
            "B": self._theme.BRIGHTBLUE,
            "M": self._theme.BRIGHTMAGENTA,
            "C": self._theme.BRIGHTCYAN,
            "W": self._theme.BRIGHTWHITE,
            "r": self._theme.RED,
            "g": self._theme.GREEN,
            "y": self._theme.YELLOW,
            "b": self._theme.BLUE,
            "m": self._theme.MAGENTA,
            "c": self._theme.CYAN,
            "w": self._theme.WHITE,
            "n": "\n", "h": self._hostname, "u": self._username,
            "$": self._priv, "d": self._cwd, "L": self._shlvl, "t": self._time,
            "T": self._date}
        f = FSM(0)
        f.add_default_transition(self._error, 0)
        # add text to args
        f.add_transition(ANY, 0, self._addtext, 0)
        # percent escapes
        f.add_transition("%", 0, None, 1)
        f.add_transition("%", 1, self._addtext, 0)
        f.add_transition("{", 1, self._startvar, 2)
        f.add_transition("}", 2, self._endvar, 0)
        f.add_transition(ANY, 2, self._vartext, 2)
        f.add_transition(ANY, 1, self._expand, 0)
        f.arg = ''
        self._fsm = f

    def _startvar(self, c, fsm):
        fsm.varname = ""

    def _vartext(self, c, fsm):
        fsm.varname += c

    def _endvar(self, c, fsm):
        fsm.arg += str(self._env.get(fsm.varname, fsm.varname))

    def _expand(self, c, fsm):
        try:
            arg = self._cache[c]
        except KeyError:
            try:
                arg = self._EXPANSIONS[c]
            except KeyError:
                arg = c
            else:
                if callable(arg):
                    arg = str(arg(c))
                else:
                    pass
                    #arg = PROMPT_START_IGNORE + arg + PROMPT_END_IGNORE
        fsm.arg += arg

    def _username(self, c):
        un = os.environ.get("USERNAME") or os.environ.get("USER")
        if un:
            self._cache[c] = un
        return un

    def _shlvl(self, c):
        return str(self._env.get("SHLVL", ""))

    def _hostname(self, c):
        hn = os.uname()[1]
        self._cache[c] = hn
        return hn

    def _priv(self, c):
        if os.getuid() == 0:
            arg = "#"
        else:
            arg = ">"
        self._cache[c] = arg
        return arg

    def _cwd(self, c):
        return os.getcwd()

    def _time(self, c):
        return time.strftime("%H:%M:%S", time.localtime())

    def _date(self, c):
        return time.strftime("%m/%d/%Y", time.localtime())

    def _error(self, input_symbol, fsm):
        self._io.errlog('Prompt string error: %s\n%r' % (input_symbol, fsm.stack))
        fsm.reset()

    def _addtext(self, c, fsm):
        fsm.arg += c

    def _getarg(self):
        if self._fsm.arg:
            arg = self._fsm.arg
            self._fsm.arg = ''
            return arg
        else:
            return None

    # pretty printing
    def _format(self, obj, indent, allowance, context, level):
        level = level + 1
        objid = id(obj)
        if objid in context:
            self._io.write(_recursion(obj))
            return
        rep = self._repr(obj, context, level - 1)
        typ = type(obj)
        sepLines = len(rep) > (self._termwidth - 1 - indent - allowance)
        write = self._io.write
        if sepLines:
            if typ is dict:
                write('{\n  ')
                length = len(obj)
                if length:
                    context[objid] = 1
                    indent = indent + 2
                    items = obj.items()
                    items.sort()
                    key, ent = items[0]
                    rep = self._repr(key, context, level)
                    write(rep)
                    write(': ')
                    self._format(ent, indent + len(rep) + 2, allowance + 1, context, level)
                    if length > 1:
                        for key, ent in items[1:]:
                            rep = self._repr(key, context, level)
                            write(',\n%s%s: ' % (' '*indent, rep))
                            self._format(ent, indent + len(rep) + 2, allowance + 1, context, level)
                    indent = indent - 2
                    del context[objid]
                write('\n}')
                return
            if typ is list:
                write('[\n')
                self.print_list(obj, 2)
                write(']')
                return
            if typ is tuple:
                write('(\n')
                self.print_list(obj, 2)
                if len(obj) == 1:
                    write(',')
                write(')')
                return
        write(rep)

    def _repr(self, obj, context, level):
        return self._safe_repr(obj, context.copy(), None, level)

    def _safe_repr(self, obj, context, maxlevels, level):
        return _safe_repr(obj, context, maxlevels, level)
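The FSM-driven `format()` above walks a prompt string and replaces percent escapes via the `_EXPANSIONS` table. A tiny standalone sketch of the same idea without an FSM, for intuition only (the `expand_prompt` name and the mapping-based design are mine, not the pycopia API):

```python
def expand_prompt(ps, expansions):
    """Replace '%X' via the expansions mapping; '%%' yields a literal '%'.
    Unknown escape characters pass through unchanged."""
    out, i = [], 0
    while i < len(ps):
        c = ps[i]
        if c == "%" and i + 1 < len(ps):
            nxt = ps[i + 1]
            out.append("%" if nxt == "%" else expansions.get(nxt, nxt))
            i += 2
        else:
            out.append(c)
            i += 1
    return "".join(out)

prompt = expand_prompt("%u@%h%% > ", {"u": "kdart", "h": "box"})
# → 'kdart@box% > '
```

The real implementation additionally supports `%{varname}` lookups into the environment and per-character caching, which this sketch omits.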
# Return repr_string
def _safe_repr(obj, context, maxlevels, level):
    typ = type(obj)
    if typ is str:
        if 'locale' not in sys.modules:
            return repr(obj)
        if "'" in obj and '"' not in obj:
            closure = '"'
            quotes = {'"': '\\"'}
        else:
            closure = "'"
            quotes = {"'": "\\'"}
        qget = quotes.get
        sio = StringIO.StringIO()  # module was imported as "import StringIO"
        write = sio.write
        for char in obj:
            if char.isalpha():
                write(char)
            else:
                write(qget(char, `char`[1:-1]))
        return ("%s%s%s" % (closure, sio.getvalue(), closure))
    if typ is dict:
        if not obj:
            return "{}"
        objid = id(obj)
        if maxlevels and level > maxlevels:
            return "{...}"
        if objid in context:
            return _recursion(obj)
        context[objid] = 1
        components = []
        append = components.append
        level += 1
        saferepr = _safe_repr
        for k, v in obj.iteritems():
            krepr = saferepr(k, context, maxlevels, level)
            vrepr = saferepr(v, context, maxlevels, level)
            append("%s: %s" % (krepr, vrepr))
        del context[objid]
        return "{%s}" % ", ".join(components)
    if typ is list or typ is tuple:
        if typ is list:
            if not obj:
                return "[]"
            format = "[%s]"
        elif len(obj) == 1:
            format = "(%s,)"
        else:
            if not obj:
                return "()"
            format = "(%s)"
        objid = id(obj)
        if maxlevels and level > maxlevels:
            return format % "..."
        if objid in context:
            return _recursion(obj)
        context[objid] = 1
        components = []
        append = components.append
        level += 1
        for o in obj:
            orepr = _safe_repr(o, context, maxlevels, level)
            append(orepr)
        del context[objid]
        return format % ", ".join(components)
    if typ is MethodType:
        return method_repr(obj)
    rep = repr(obj)
    return rep
def _recursion(obj):
    return ("<Recursion on %s with id=%s>" % (type(obj).__name__, id(obj)))

def safe_repr(value):
    return _safe_repr(value, {}, None, 0)
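The `_recursion` marker above is what keeps `_safe_repr` from looping forever on self-referential containers. A small Python 3 sketch of the same idea (hypothetical `safe_repr_demo`, not part of this module, and deliberately simpler than the real function):

```python
def safe_repr_demo(obj, context=None):
    # Track container ids in `context`; on a revisit, emit the same
    # recursion marker the module's _recursion() produces.
    if context is None:
        context = set()
    if isinstance(obj, (list, tuple, dict)):
        objid = id(obj)
        if objid in context:
            return "<Recursion on %s with id=%s>" % (type(obj).__name__, objid)
        context.add(objid)
        if isinstance(obj, dict):
            body = ", ".join(
                "%s: %s" % (safe_repr_demo(k, context), safe_repr_demo(v, context))
                for k, v in obj.items()
            )
            out = "{%s}" % body
        else:
            body = ", ".join(safe_repr_demo(o, context) for o in obj)
            out = ("[%s]" if isinstance(obj, list) else "(%s)") % body
        context.discard(objid)
        return out
    return repr(obj)

cycle = [1, 2]
cycle.append(cycle)
print(safe_repr_demo(cycle))  # the cycle is reported instead of overflowing
```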
def method_repr(method):
    methname = method.im_func.func_name
    # formal parameter names, skipping `self`
    varnames = list(method.im_func.func_code.co_varnames)[1:method.im_func.func_code.co_argcount]
    if method.im_func.func_defaults:
        ld = len(method.im_func.func_defaults)
        varlist = [", ".join(varnames[:-ld]),
                   ", ".join(["%s=%r" % (n, v) for n, v in
                              zip(varnames[-ld:], method.im_func.func_defaults)])]
        return "%s(%s)" % (methname, ", ".join(varlist))
    else:
        return "%s(%s)" % (methname, ", ".join(varnames))
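`method_repr` reads Python 2-only attributes (`im_func`, `func_code`, `func_defaults`); under Python 3 the same "name(params)" string can be produced with the standard `inspect` module. A sketch (hypothetical `method_repr_py3` and `Greeter`, not a drop-in replacement):

```python
import inspect

class Greeter:
    def greet(self, name, punct="!"):
        return "hi " + name + punct

def method_repr_py3(method):
    # For a bound method, inspect.signature already excludes `self`,
    # and str(Signature) renders defaults with repr, like the code above.
    return "%s%s" % (method.__name__, inspect.signature(method))

print(method_repr_py3(Greeter().greet))
```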
def _get_object(name):
    try:
        return getattr(sys.modules[__name__], name)
    except AttributeError:
        i = name.rfind(".")
        if i >= 0:
            modname = name[:i]
            try:
                mod = sys.modules[modname]
            except KeyError:
                try:
                    mod = __import__(modname, globals(), locals(), ["*"])
                except ImportError, err:
                    raise UIFindError, \
                        "Could not find UI module %s: %s" % (modname, err)
            try:
                return getattr(mod, name[i+1:])
            except AttributeError:
                raise UIFindError, \
                    "Could not find UI object %r in module %r." % (name, modname)
        else:
            raise UIFindError, "%s is not a valid object path." % (name,)
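`_get_object` resolves a dotted path like `"IO.ConsoleIO"` at runtime; in modern Python the same lookup is usually written with `importlib`. A sketch (hypothetical `get_object_by_path`; unlike the original it does not first check the current module's attributes):

```python
import importlib

def get_object_by_path(name):
    # Split "pkg.mod.attr" into a module part and an attribute part,
    # import the module, and fetch the attribute from it.
    modname, _, attr = name.rpartition(".")
    if not modname:
        raise ValueError("%s is not a valid object path." % name)
    mod = importlib.import_module(modname)
    return getattr(mod, attr)

print(get_object_by_path("os.path.join"))
```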
# Construct a user interface from object names given as strings.
def get_userinterface(uiname="UserInterface",
                      ioname="IO.ConsoleIO", themename=None):
    if type(ioname) is str:
        ioobj = _get_object(ioname)
    elif hasattr(ioname, "write"):
        ioobj = ioname
    else:
        raise ValueError("ioname not a valid type")
    if not hasattr(ioobj, "close"):
        raise UIFindError, "not a valid IO object: %r" % (ioobj,)
    uiobj = _get_object(uiname)
    if not hasattr(uiobj, "Print"):
        raise UIFindError, "not a valid UI object: %r" % (uiobj,)
    if themename is not None:
        themeobj = _get_object(themename)
        if not issubclass(themeobj, Theme):
            raise UIFindError, "not a valid Theme object: %r." % (themeobj,)
        return uiobj(ioobj(), theme=themeobj())
    else:
        return uiobj(ioobj())
def _test(argv):
    ui = get_userinterface()
    ui.Print("Hello world!")
    inp = ui.user_input("Type something> ")
    ui.Print("You typed:", inp)

if __name__ == "__main__":
    _test(sys.argv)
| 31.045845 | 111 | 0.534702 | 2,626 | 21,670 | 4.275324 | 0.176314 | 0.018705 | 0.013717 | 0.006948 | 0.195778 | 0.148838 | 0.118197 | 0.106262 | 0.092278 | 0.084439 | 0 | 0.015306 | 0.333687 | 21,670 | 697 | 112 | 31.090387 | 0.762241 | 0.070743 | 0 | 0.202186 | 0 | 0 | 0.062443 | 0 | 0 | 0 | 0 | 0.001435 | 0.005464 | 0 | null | null | 0.016393 | 0.020036 | null | null | 0.016393 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11a52312ae117471edddd3a1746b612e1f57bc72 | 257 | py | Python | airflow/plugins/helpers/fetch_page.py | makism/find-a-ps5 | 3f004be40554595c687a0661998065b08b1f9009 | [
"MIT"
] | 1 | 2021-02-10T14:18:49.000Z | 2021-02-10T14:18:49.000Z | airflow/plugins/helpers/fetch_page.py | makism/find-a-ps5 | 3f004be40554595c687a0661998065b08b1f9009 | [
"MIT"
] | null | null | null | airflow/plugins/helpers/fetch_page.py | makism/find-a-ps5 | 3f004be40554595c687a0661998065b08b1f9009 | [
"MIT"
] | null | null | null | import requests
from bs4 import BeautifulSoup


def fetch_page(url: str) -> str:
    """Fetch the page from the given URL."""
    try:
        r = requests.get(url)
        return r.text
    except Exception as err:
        print(err)
        return None
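The try/except-return-`None` contract above can be exercised offline. This sketch reproduces the same pattern with only the standard library (hypothetical `fetch_page_stdlib`, not part of the plugin, which uses `requests`):

```python
from urllib.request import urlopen

def fetch_page_stdlib(url):
    """Same contract as fetch_page, sketched with only the standard library."""
    try:
        with urlopen(url) as resp:
            charset = resp.headers.get_content_charset() or "utf-8"
            return resp.read().decode(charset)
    except Exception as err:
        # Any failure (bad URL, network error) is reported and mapped to None.
        print(err)
        return None

print(fetch_page_stdlib("not-a-url"))  # error path: no network access needed
```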
| 17.133333 | 44 | 0.607004 | 35 | 257 | 4.428571 | 0.657143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005618 | 0.307393 | 257 | 14 | 45 | 18.357143 | 0.865169 | 0.132296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.555556 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
11aa968f92e21fea76b989e7ea6ae511caa8f46d | 12,094 | py | Python | tests/denon/test_response.py | JPHutchins/pyavreceiver | 2c86d0ab1f3bca886d2a876096ac760ffb1dcd5f | [
"Apache-2.0"
] | 2 | 2020-12-28T06:09:18.000Z | 2021-01-09T22:36:57.000Z | tests/denon/test_response.py | JPHutchins/pyavreceiver | 2c86d0ab1f3bca886d2a876096ac760ffb1dcd5f | [
"Apache-2.0"
] | 1 | 2021-02-03T22:59:49.000Z | 2021-02-03T22:59:49.000Z | tests/denon/test_response.py | JPHutchins/pyavreceiver | 2c86d0ab1f3bca886d2a876096ac760ffb1dcd5f | [
"Apache-2.0"
] | null | null | null | """Test responses from Denon/Marantz."""
from pyavreceiver.denon.response import DenonMessage


def test_separate(message_none):
    """Test separation of messages."""
    assert message_none.separate("PWON") == ("PW", None, "ON")
    assert message_none.separate("PWSTANDBY") == ("PW", None, "STANDBY")
    assert message_none.separate("MVMAX 80") == ("MV", "MAX", "80")
    assert message_none.separate("CVFL 60 ") == ("CV", "FL", "60")
    assert message_none.separate("CVFL60") == ("CV", "FL", "60")
    assert message_none.separate("CV FHL 44") == ("CV", "FHL", "44")
    assert message_none.separate("CVNEW SPEC 55") == ("CV", "NEW SPEC", "55")
    assert message_none.separate("CVUNKNOWNCOMMAND55") == (
        "CV",
        "UNKNOWNCOMMAND55",
        None,
    )
    assert message_none.separate("MUON") == ("MU", None, "ON")
    assert message_none.separate("SIPHONO") == ("SI", None, "PHONO")
    assert message_none.separate("SI PHONO ") == ("SI", None, "PHONO")
    assert message_none.separate("SIUSB DIRECT") == ("SI", None, "USB DIRECT")
    assert message_none.separate("SINEW SOURCE VARIETY") == (
        "SI",
        None,
        "NEW SOURCE VARIETY",
    )
    assert message_none.separate("SLPOFF") == ("SLP", None, "OFF")
    assert message_none.separate("SLP OFF") == ("SLP", None, "OFF")
    assert message_none.separate("MSDOLBY D+ +PL2X C") == (
        "MS",
        None,
        "DOLBY D+ +PL2X C",
    )
    assert message_none.separate("MSYET ANOTHER POINTLESS DSP") == (
        "MS",
        None,
        "YET ANOTHER POINTLESS DSP",
    )
    assert message_none.separate("PSDELAY 000") == ("PS", "DELAY", "000")
    assert message_none.separate("PSTONE CTRL ON") == ("PS", "TONE CTRL", "ON")
    assert message_none.separate("PSTONE CTRLOFF") == ("PS", "TONE CTRL", "OFF")
    assert message_none.separate("PSSB MTRX ON") == ("PS", "SB", "MTRX ON")
    assert message_none.separate("PSSB ON") == ("PS", "SB", "ON")
    assert message_none.separate("PSMULTEQ BYP.LR") == ("PS", "MULTEQ", "BYP.LR")
    assert message_none.separate("PSDCO OFF") == ("PS", "DCO", "OFF")
    assert message_none.separate("PSLFE -8") == ("PS", "LFE", "-8")
    assert message_none.separate("PSNEWPARAM OK") == ("PS", "NEWPARAM", "OK")
    assert message_none.separate("PSUNKNOWNCOMMAND55") == (
        "PS",
        "UNKNOWNCOMMAND55",
        None,
    )
    assert message_none.separate("MV60") == ("MV", None, "60")
    assert message_none.separate("MV595") == ("MV", None, "595")
    assert message_none.separate("Z2PSBAS 51") == ("Z2PS", "BAS", "51")
    assert message_none.separate("Z260") == ("Z2", None, "60")
    assert message_none.separate("Z2ON") == ("Z2", None, "ON")
    assert message_none.separate("Z2PHONO") == ("Z2", None, "PHONO")
    assert message_none.separate("Z3PSBAS 51") == ("Z3PS", "BAS", "51")
    assert message_none.separate("Z360") == ("Z3", None, "60")
    assert message_none.separate("Z3ON") == ("Z3", None, "ON")
    assert message_none.separate("Z3PHONO") == ("Z3", None, "PHONO")
    assert message_none.separate("NEWCMD 50") == ("NEWCMD", None, "50")
    assert message_none.separate("NEWCMD WITH PARAMS 50") == (
        "NEWCMD WITH PARAMS",
        None,
        "50",
    )
    assert message_none.separate("UNPARSABLE") == ("UNPARSABLE", None, None)
    assert message_none.separate("FAKEFOR TESTS") == ("FAKEFO", None, "R TESTS")
    assert message_none.separate("FAKENORTEST") == ("FAKEN", "OR", "TEST")
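The separation rules under test live in `pyavreceiver` itself; a deliberately simplified sketch of the prefix-matching idea (hypothetical `separate_sketch`, covering only a handful of commands and no parameter extraction) might look like:

```python
def separate_sketch(message, commands=("PW", "MV", "CV", "SLP", "MU")):
    # Hypothetical simplification: match the longest known command prefix,
    # then treat whatever remains (stripped) as the value. The real parser
    # additionally splits out parameters like "MAX" or "TONE CTRL".
    cmd = max((c for c in commands if message.startswith(c)), key=len, default=None)
    if cmd is None:
        return (message, None, None)
    rest = message[len(cmd):].strip()
    return (cmd, None, rest or None)

print(separate_sketch("PWON"))
print(separate_sketch("SLP OFF"))
```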
def test_format_db(message_none):
    """Test format to decibel."""
    assert message_none.parse_value("MV", None, "60") == -20
    assert message_none.parse_value("MV", None, "595") == -20.5
    assert message_none.parse_value("MV", None, "80") == 0
    assert message_none.parse_value("MV", None, "805") == 0.5
    assert message_none.parse_value("MV", None, "00") == -80
    assert message_none.parse_value("MV", "MAX", "80") == 0
    assert message_none.parse_value("CV", "FL", "50") == 0
    assert message_none.parse_value("CV", "SL", "39") == -11
    assert message_none.parse_value("CV", "FHL", "545") == 4.5
    assert message_none.parse_value("SSLEV", "FL", "50") == 0
    assert message_none.parse_value("PS", "BAS", "50") == 0
    assert message_none.parse_value("PS", "BAS", "39") == -11
    assert message_none.parse_value("PS", "TRE", "545") == 4.5
    assert message_none.parse_value("PS", "LFE", "-6") == -6
    assert message_none.parse_value("Z2", None, "60") == -20
    assert message_none.parse_value("Z2", None, "595") == -20.5
    assert message_none.parse_value("Z2", None, "80") == 0
    assert message_none.parse_value("Z2", None, "805") == 0.5
    assert message_none.parse_value("Z2", None, "00") == -80
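The cases above imply the conversion rule: the raw value is an absolute level where an extra trailing digit is a half-step (so `"595"` means 59.5), and the reference point (80 for master volume, 50 for channel levels and tone) is subtracted to get decibels. A hedged sketch of that arithmetic (hypothetical `denon_db`, not the library's parser):

```python
def denon_db(raw, zero_point=80):
    # Three digits means the last one is a tenth: "595" -> 59.5.
    # Subtracting the reference point yields the decibel value.
    value = int(raw)
    if len(raw) == 3:
        value = value / 10.0
    return value - zero_point

print(denon_db("595"))
print(denon_db("545", zero_point=50))
```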
def test_attribute_assignment(command_dict):
    """Test assignment of attr."""
    msg = DenonMessage("PWON", command_dict)
    assert msg.parsed == ("PW", None, "ON")
    assert str(msg) == "PWON"
    assert repr(msg) == "PWON"
    assert msg.group == "PW"
    msg = DenonMessage("MV75", command_dict)
    assert msg.parsed == ("MV", None, -5)
    assert msg.message == "MV75"
    assert msg.raw_value == "75"
    msg = DenonMessage("MVMAX 80", command_dict)
    assert msg.parsed == ("MV", "MAX", 0)
    assert msg.message == "MVMAX 80"
    assert msg.raw_value == "80"
    msg = DenonMessage("CVFL 51", command_dict)
    assert msg.parsed == ("CV", "FL", 1)
    assert msg.message == "CVFL 51"
    assert msg.raw_value == "51"
    assert msg.group == "CVFL"
    msg = DenonMessage("MSDOLBY D+ +PL2X C", command_dict)
    assert msg.parsed == ("MS", None, "DOLBY D+ +PL2X C")
    msg = DenonMessage("PSDYNVOL LOW", command_dict)
    assert msg.parsed == ("PS", "DYNVOL", "LOW")
    assert msg.message == "PSDYNVOL LOW"
    assert msg.raw_value == "LOW"
    assert msg.group == "PSDYNVOL"
    msg = DenonMessage("PSDELAY 000", command_dict)
    assert msg.parsed == ("PS", "DELAY", "000")
    assert msg.message == "PSDELAY 000"
    assert msg.raw_value == "000"
    assert msg.group == "PSDELAY"
def test_state_update_dict(command_dict):
    """Test create the update dict."""
    assert DenonMessage("PWON", command_dict).state_update == {"power": True}
    assert DenonMessage("MVMAX 80", command_dict).state_update == {"max_volume": 0}
    assert DenonMessage("PWSTANDBY", command_dict).state_update == {"power": False}
    assert DenonMessage("MV75", command_dict).state_update == {"volume": -5}
    assert DenonMessage("MV56", command_dict).state_update == {"volume": -24}
    assert DenonMessage("CVFL 51", command_dict).state_update == {"channel_level_fl": 1}
    assert DenonMessage("SSLEVFL 50", command_dict).state_update == {
        "channel_level_fl": 0
    }
    assert DenonMessage("PSNEWPARAM LOW", command_dict).state_update == {
        "PS_NEWPARAM": "LOW"
    }
    assert DenonMessage("MSDOLBY D+ +PL2X C", command_dict).state_update == {
        "sound_mode": "DOLBY D+ +PL2X C"
    }
    assert DenonMessage("PSBAS 39", command_dict).state_update == {"bass": -11}
    assert DenonMessage("MUON", command_dict).state_update == {"mute": True}
    assert DenonMessage("SIPHONO", command_dict).state_update == {"source": "PHONO"}
    assert DenonMessage("SIBD", command_dict).state_update == {"source": "BD"}
    assert DenonMessage("SINEW SOURCE TYPE", command_dict).state_update == {
        "source": "NEW SOURCE TYPE"
    }
    assert DenonMessage("DCAUTO", command_dict).state_update == {
        "digital_signal_mode": "AUTO"
    }
    assert DenonMessage("PSTONE CTRL ON", command_dict).state_update == {
        "tone_control": True
    }
    assert DenonMessage("PSSBMTRX ON", command_dict).state_update == {
        "surround_back": "MTRX ON"
    }
    assert DenonMessage("PSDYNVOL MED", command_dict).state_update == {
        "dsp_dynamic_range_control": "medium"
    }
    assert DenonMessage("NEWPARAM ANYVALUE", command_dict).state_update == {
        "NEWPARAM": "ANYVALUE"
    }
    assert DenonMessage("PSNEWPARAM ANYVALUE", command_dict).state_update == {
        "PS_NEWPARAM": "ANYVALUE"
    }
    assert DenonMessage("PSNEWPARAM", command_dict).state_update == {
        "PS_NEWPARAM": None
    }
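The pattern in these assertions is that known commands map to friendly state keys while unknown ones fall back to a raw `CMD` or `CMD_PRM` key. A toy sketch of that mapping (hypothetical `state_update_sketch`, with a made-up three-entry name table):

```python
def state_update_sketch(cmd, prm, value):
    # Known commands get friendly names; unknown commands fall back to a
    # raw "CMD" / "CMD_PRM" key, mirroring the PS_NEWPARAM cases above.
    names = {"PW": "power", "MV": "volume", "MU": "mute"}
    key = names.get(cmd)
    if key is None:
        key = cmd if prm is None else "%s_%s" % (cmd, prm)
    return {key: value}

print(state_update_sketch("PW", None, True))
print(state_update_sketch("PS", "NEWPARAM", "LOW"))
```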
def test_bad_value_handling(command_dict):
    """Test error handling for values that don't conform to spec."""
    assert DenonMessage("MVSTRING", command_dict).state_update == {
        "volume_string": None
    }
    assert DenonMessage("MV1000", command_dict).state_update == {}


def test_multiple_types(command_dict):
    """Test handling multiple types of value."""
    assert DenonMessage("PSDIL OFF", command_dict).state_update == {
        "dialog_level": False
    }
    assert DenonMessage("PSDIL ON", command_dict).state_update == {"dialog_level": True}
    assert DenonMessage("PSDIL 55", command_dict).state_update == {"dialog_level": 5}
    assert DenonMessage("PSDIL 45", command_dict).state_update == {"dialog_level": -5}


def test_unnamed_param(command_dict):
    """Test an unnamed parsed parameter."""
    assert DenonMessage("PSDELAY 000", command_dict).state_update == {"PS_DELAY": "000"}
def test_zones(command_dict):
    """Test parsing zone commands."""
    assert DenonMessage("ZMON", command_dict).state_update == {"zone1_power": True}
    assert DenonMessage("ZMOFF", command_dict).state_update == {"zone1_power": False}
    assert DenonMessage("Z2PSBAS 51", command_dict).state_update == {"zone2_bass": 1}
    assert DenonMessage("Z3PSTRE 445", command_dict).state_update == {
        "zone3_treble": -5.5
    }
    assert DenonMessage("Z260", command_dict).state_update == {"zone2_volume": -20}
    assert DenonMessage("Z2ON", command_dict).state_update == {"zone2_power": True}
    assert DenonMessage("Z2PHONO", command_dict).state_update == {
        "zone2_source": "PHONO"
    }
    assert DenonMessage("Z2SOURCE", command_dict).state_update == {
        "zone2_source": "SOURCE"
    }
    assert DenonMessage("Z360", command_dict).state_update == {"zone3_volume": -20}
    assert DenonMessage("Z3OFF", command_dict).state_update == {"zone3_power": False}
    assert DenonMessage("Z3SOURCE", command_dict).state_update == {
        "zone3_source": "SOURCE"
    }
def test_sequence(command_dict):
    """Test a long sequence."""
    seq = [
        "PW?",
        "PWON",
        "MV56",
        "MVMAX 80",
        "MUOFF",
        "SITV",
        "SVOFF",
        "PSDYNVOL OFF",
        "PWON",
        "PWON",
        "MV56",
        "MVMAX 80",
    ]
    for command in seq:
        DenonMessage(command, command_dict)
    invalid_seq = [
        "90f9jf3^F*)UF(U(*#fjliuF(#)U(F@ujniljf(@#)&%T^GHkjbJBVKjY*(Y#*(@&5-00193ljl",
        "",
        " ",
        " b b b ",
        ".:':>,",
        "578934",
        "None",
        "\r",
        "MV ",
        " MV",
    ]
    for command in invalid_seq:
        DenonMessage(command, command_dict)
def test_learning_commands(command_dict):
    """Test saving learned commands."""
    assert DenonMessage("PWON", command_dict).new_command is None
    assert DenonMessage("PWSCREENSAVER", command_dict).new_command == {
        "cmd": "PW",
        "prm": None,
        "val": "SCREENSAVER",
    }
    assert DenonMessage("PSNEW", command_dict).new_command == {
        "cmd": "PS",
        "prm": "NEW",
        "val": None,
    }
    # The parser matches param to "EFF" and then sees "ECT" as value
    # - this is not ideal behavior - the parser should know that "ECT"
    # as an argument should be preceded by a space
    assert DenonMessage("PSEFFECT", command_dict).parsed == ("PS", "EFF", "ECT")
    assert DenonMessage("PSEFF ECT", command_dict).parsed == ("PS", "EFF", "ECT")
    assert DenonMessage("CVATMOS RIGHT 52", command_dict).new_command == {
        "cmd": "CV",
        "prm": "ATMOS RIGHT",
        "val": "52",
    }
    assert DenonMessage("NEWCMD MEDIUM", command_dict).new_command == {
        "cmd": "NEWCMD",
        "prm": None,
        "val": "MEDIUM",
    }
    assert DenonMessage("UNPARSABLE", command_dict).new_command is None
| 38.15142 | 88 | 0.620308 | 1,440 | 12,094 | 5.036111 | 0.193056 | 0.097077 | 0.142995 | 0.144788 | 0.470077 | 0.297297 | 0.156095 | 0.076393 | 0.029785 | 0 | 0 | 0.032035 | 0.210187 | 12,094 | 316 | 89 | 38.272152 | 0.727178 | 0.043741 | 0 | 0.130435 | 0 | 0.003953 | 0.208221 | 0.00869 | 0 | 0 | 0 | 0 | 0.517787 | 1 | 0.039526 | false | 0 | 0.003953 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11ad425f227427bac1da25255aac829b9be20c1e | 397 | py | Python | stayhome/business/migrations/0031_request_lang.py | mageo/stayhomech | 5afe922b13f0350a79eaff0401709f99c5a31e8b | [
"MIT"
] | 3 | 2020-03-20T11:01:57.000Z | 2020-03-20T16:29:12.000Z | stayhome/business/migrations/0031_request_lang.py | stayhomech/stayhomech | 5afe922b13f0350a79eaff0401709f99c5a31e8b | [
"MIT"
] | 74 | 2020-03-23T21:35:07.000Z | 2020-04-27T12:55:50.000Z | stayhome/business/migrations/0031_request_lang.py | mageo/stayhomech | 5afe922b13f0350a79eaff0401709f99c5a31e8b | [
"MIT"
] | 3 | 2020-03-20T11:02:35.000Z | 2020-03-20T16:29:23.000Z | # Generated by Django 3.0.4 on 2020-03-26 14:52
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('business', '0030_auto_20200326_1533'),
    ]

    operations = [
        migrations.AddField(
            model_name='request',
            name='lang',
            field=models.CharField(default='en', max_length=2),
        ),
    ]
| 20.894737 | 63 | 0.596977 | 44 | 397 | 5.272727 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112281 | 0.282116 | 397 | 18 | 64 | 22.055556 | 0.701754 | 0.11335 | 0 | 0 | 1 | 0 | 0.125714 | 0.065714 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11ae23e035181c98b7394d75acf9ffb867b14de0 | 1,739 | py | Python | swan/__init__.py | INM-6/swan | ecd426657d6e0ee67e8ea31f0298daf2ea065158 | [
"BSD-3-Clause"
] | 3 | 2020-06-08T12:12:40.000Z | 2021-06-11T05:40:37.000Z | swan/__init__.py | INM-6/swan | ecd426657d6e0ee67e8ea31f0298daf2ea065158 | [
"BSD-3-Clause"
] | 29 | 2018-10-25T16:10:05.000Z | 2021-12-18T21:41:50.000Z | swan/__init__.py | INM-6/swan | ecd426657d6e0ee67e8ea31f0298daf2ea065158 | [
"BSD-3-Clause"
] | 2 | 2015-10-07T08:00:16.000Z | 2021-06-14T08:28:13.000Z | """
This module contains all the modules and subpackages required to run SWAN.
The contents are organized in five folders:
* src: contains all the important scripts, including src.main
* gui: contains code that renders the graphical aspects of the tool
* base: contains important classes for data base and access
* resources: contains miscellaneous resources to complete the tool's functionality
* tools: contains additional tools that add functionality to SWAN
"""
from .version import version as swan_version
title = "SWAN - Sequential Waveform Analyzer"
description_short = 'Python based tool for tracking single units in spike sorted data across several ' \
                    'electrophysiological recording sessions. '
version = swan_version
author = "SWAN authors and contributors"
copy_right = "2013-2018"
about = """
{0}
{1}
Features
========
* Completely written in Python 3 and PyQt5.
* Shows an overview of spike sorted data from one channel over multiple recording sessions.
* Interactively displays various characteristics of single units in different graphical widgets, called views.
* Provides a flexible and modular graphical user interface, ideal for multiple desktop systems.
* Utilizes efficient plotting libraries to smoothly visualize large datasets.
* Creates a virtual mapping of units across sessions which can be exported to csv and odml formats.
* Calculates this virtual unit mapping automatically by home-grown and published algorithms.
Version: {2}
Copyright (c) {3}, {4}
""".format(title, description_short, version, copy_right, author) | 37.804348 | 118 | 0.706153 | 212 | 1,739 | 5.764151 | 0.641509 | 0.018003 | 0.022913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011346 | 0.239793 | 1,739 | 46 | 119 | 37.804348 | 0.913011 | 0.275446 | 0 | 0 | 0 | 0 | 0.808307 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.045455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11afbf00aed2c5c05359ce94bca8f8577f0a9d7a | 559 | py | Python | python_toolbox/wx_tools/drawing_tools/pens.py | hboshnak/python_toolbox | cb9ef64b48f1d03275484d707dc5079b6701ad0c | [
"MIT"
] | 119 | 2015-02-05T17:59:47.000Z | 2022-02-21T22:43:40.000Z | python_toolbox/wx_tools/drawing_tools/pens.py | hboshnak/python_toolbox | cb9ef64b48f1d03275484d707dc5079b6701ad0c | [
"MIT"
] | 4 | 2019-04-24T14:01:14.000Z | 2020-05-21T12:03:29.000Z | python_toolbox/wx_tools/drawing_tools/pens.py | hboshnak/python_toolbox | cb9ef64b48f1d03275484d707dc5079b6701ad0c | [
"MIT"
] | 14 | 2015-03-30T06:30:42.000Z | 2021-12-24T23:45:11.000Z | # Copyright 2009-2017 Ram Rachum.
# This program is distributed under the MIT license.
import wx

from python_toolbox import caching


is_mac = (wx.Platform == '__WXMAC__')
is_gtk = (wx.Platform == '__WXGTK__')
is_win = (wx.Platform == '__WXMSW__')


@caching.cache(max_size=100)
def get_focus_pen(color='black', width=1, dashes=[1, 4]):
    '''Get a dashed pen, caching identical pens for reuse.'''
    if isinstance(color, basestring):
        color = wx.NamedColour(color)
    # todo: do `if is_mac`, also gtk maybe
    pen = wx.Pen(color, width, wx.USER_DASH)
    pen.SetDashes(dashes)
    return pen
| 21.5 | 57 | 0.674419 | 80 | 559 | 4.45 | 0.65 | 0.08427 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030905 | 0.189624 | 559 | 25 | 58 | 22.36 | 0.754967 | 0.21288 | 0 | 0 | 0 | 0 | 0.074766 | 0 | 0 | 0 | 0 | 0.04 | 0 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11bdd932e749e2c81c59b1a1e9b7fdd80018e7d5 | 1,320 | py | Python | tests/test_pre_gen_project/test_check_valid_email_address_format.py | lorenzwalthert/govcookiecutter | 57300c5e3c7f1e9f4bcf5189f3ea74363c5cc993 | [
"MIT"
] | 41 | 2020-04-07T13:10:49.000Z | 2021-07-22T07:43:38.000Z | tests/test_pre_gen_project/test_check_valid_email_address_format.py | lorenzwalthert/govcookiecutter | 57300c5e3c7f1e9f4bcf5189f3ea74363c5cc993 | [
"MIT"
] | 36 | 2020-04-22T15:20:19.000Z | 2021-07-23T10:36:01.000Z | tests/test_pre_gen_project/test_check_valid_email_address_format.py | lorenzwalthert/govcookiecutter | 57300c5e3c7f1e9f4bcf5189f3ea74363c5cc993 | [
"MIT"
] | 18 | 2020-04-15T17:27:54.000Z | 2021-07-21T11:44:23.000Z | from hooks.pre_gen_project import check_valid_email_address_format
import pytest

# Define test cases for the `TestCheckValidEmailAddressFormat` test class
args_invalid_email_addresses = ["hello.world", "foo_bar"]
args_valid_email_addresses = ["hello@world.com", "foo@bar"]


class TestCheckValidEmailAddressFormat:
    @pytest.mark.parametrize("test_input_email", args_invalid_email_addresses)
    def test_raises_assertion_error_for_invalid_emails(
        self, test_input_email: str
    ) -> None:
        """Test an `AssertionError` is raised for invalid email addresses."""
        # Execute the `check_valid_email_address_format` function, and check it raises
        # an `AssertionError`
        with pytest.raises(AssertionError):
            check_valid_email_address_format(test_input_email)

    @pytest.mark.parametrize("test_input_email", args_valid_email_addresses)
    def test_passes_for_valid_emails(self, test_input_email: str) -> None:
        """Test no errors are raised for valid email addresses."""
        # Execute the `check_valid_email_address_format` function, which should not
        # raise any exceptions for a valid email address
        try:
            check_valid_email_address_format(test_input_email)
        except Exception as e:
            pytest.fail(f"Error raised: {e}")
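The `check_valid_email_address_format` hook under test is imported, not shown. A minimal sketch that would satisfy both parametrized cases above (note `"foo@bar"` is expected to pass, so the check must not require a dot in the domain) might look like:

```python
import re

def check_valid_email_address_format(email: str) -> None:
    # Deliberately loose: require a non-empty local part, a single "@",
    # and a non-empty domain — enough for the test cases above.
    assert re.fullmatch(r"[^@\s]+@[^@\s]+", email), (
        "invalid email address format: %r" % email
    )

check_valid_email_address_format("hello@world.com")  # passes silently
```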
| 42.580645 | 86 | 0.738636 | 168 | 1,320 | 5.464286 | 0.386905 | 0.098039 | 0.111111 | 0.119826 | 0.413943 | 0.383442 | 0.383442 | 0.298475 | 0.130719 | 0.130719 | 0 | 0 | 0.184848 | 1,320 | 30 | 87 | 44 | 0.85316 | 0.308333 | 0 | 0.117647 | 0 | 0 | 0.099109 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 1 | 0.117647 | false | 0.058824 | 0.117647 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
11bf1ed62908bd631ac2eab3d56f1d3f3b63feaa | 3,811 | py | Python | bin/SampleQCI_pca_convert.py | jkaessens/gwas-assoc | 1053c94222701f108362e33c99155cfc148f4ca2 | [
"MIT"
] | null | null | null | bin/SampleQCI_pca_convert.py | jkaessens/gwas-assoc | 1053c94222701f108362e33c99155cfc148f4ca2 | [
"MIT"
] | null | null | null | bin/SampleQCI_pca_convert.py | jkaessens/gwas-assoc | 1053c94222701f108362e33c99155cfc148f4ca2 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import sys
import re
import os
from os.path import *
import string
import gzip
import math
import decimal
import datetime
from os import listdir
import subprocess

# may also need some of these:

# import Ingos lib
#sys.path.append(join(sys.path[0], "../../all_scripts"))
sys.path.append(os.environ['PYLIB_DIR'] + "/all_scripts")
from all_common import *

# import my lib
#sys.path.append(join(sys.path[0], "../lib"))
sys.path.append(os.environ['PYLIB_DIR'] + "/lib")
from plink_classes import *
from eigenstrat_classes import *
def pca_convert(plink, eigenstrat_parameter_file, annotation_file):
    """Convert a PLINK data set to eigenstrat format."""

    # ----------------------------- #
    # - generate parameter file m - #
    # ----------------------------- #
    packedped = PackedPed(write_file=eigenstrat_parameter_file)
    packedped.set_input_PLINK_binary(
        bed=plink + ".bed",
        bim=plink + ".bim",
        fam=plink + ".fam")
    packedped.write_par_file()
    del packedped

    # ------------------------ #
    # - run convertf program - #
    # ------------------------ #
    cmd = Command("convertf -p %s" % (eigenstrat_parameter_file))
    cmd.run()
    del cmd

    os.system("mv %s.ind %s.ind.orig" % (plink, plink))

    # read individual IDs and batch info from the annotation file
    try:
        fh_anno = file(annotation_file, "r")
    except IOError, e:
        print e
        sys.exit(1)

    individuals2batch_id = {}
    # skip header
    line = fh_anno.readline().replace("\n", "")
    line = fh_anno.readline().replace("\n", "")
    while line:
        list = re.split("\s+", line)
        IID = list[1]
        batch_id = list[6]
        individuals2batch_id[IID] = batch_id
        line = fh_anno.readline().replace("\n", "")
    fh_anno.close()

    # re-write ind file with info on HapMap samples and batch_info
    try:
        fh_ind = file(plink + ".ind.orig", "r")
        fh_ind_new = file(plink + ".ind", "w")
    except IOError, e:
        print e
        sys.exit(1)

    batches = []
    batches_dict = {}
    # no header line
    line = fh_ind.readline().replace("\n", "")
    while line:
        list = re.split("\s+", line)
        if list[0] == "":
            del list[0]
        # change info last column from "Case/Control" to batch_id
        if individuals2batch_id.has_key(list[0]):
            batch_id = individuals2batch_id[list[0]]
            if not batches_dict.has_key(batch_id):
                batches.append(batch_id)
                batches_dict[batch_id] = True
            if list[-1] == "Case":
                line = line.replace("Case", batch_id)
            elif list[-1] == "Control":
                line = line.replace("Control", batch_id)
            else:
                # nothing to replace
                print >> sys.stderr, "\n warning: could not replace case/control status for sample " + list[0] + " by batch_id in file " + plink + ".ind ...\n\n"
            fh_ind_new.writelines(line + "\n")
        else:
            # sample not found in annotation file
            print >> sys.stderr, "\n warning: could not find sample " + list[0] + " in annotation file " + annotation_file + " ...\n\n"
            fh_ind_new.writelines(line + "\n")
        line = fh_ind.readline().replace("\n", "")
    fh_ind.close()
    fh_ind_new.close()
    del batches_dict
# Main
if __name__ == "__main__":
    # check args
    if len(sys.argv) != 4:
        print "Usage: " + sys.argv[0] + " <input plink basename> <eigenstrat parameter file> <annotations>\n"
        sys.exit(1)
    pca_convert(sys.argv[1], sys.argv[2], sys.argv[3])
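The inner loop above rewrites each `.ind` line, swapping a trailing `Case`/`Control` label for the sample's batch id. That substitution can be sketched as a small pure function (hypothetical `relabel_status`, Python 3, not part of the pipeline):

```python
def relabel_status(line, sample_to_batch):
    # Replace a trailing "Case"/"Control" label with the sample's batch id;
    # lines for unknown samples pass through unchanged.
    fields = line.split()
    if fields and fields[0] in sample_to_batch and fields[-1] in ("Case", "Control"):
        fields[-1] = sample_to_batch[fields[0]]
    return " ".join(fields)

print(relabel_status("SAMPLE1 U Case", {"SAMPLE1": "batch_03"}))
```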
| 27.817518 | 184 | 0.561795 | 476 | 3,811 | 4.342437 | 0.310924 | 0.033866 | 0.038703 | 0.023222 | 0.241896 | 0.241896 | 0.195452 | 0.166425 | 0.08805 | 0.08805 | 0 | 0.009479 | 0.280241 | 3,811 | 136 | 185 | 28.022059 | 0.744076 | 0.160063 | 0 | 0.296296 | 0 | 0 | 0.12925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.185185 | null | null | 0.061728 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11d9fdaa6e790a81a67a442676505557118d1780 | 859 | py | Python | home/kwatters/harry/gestures/howdoyoudo.py | rv8flyboy/pyrobotlab | 4e04fb751614a5cb6044ea15dcfcf885db8be65a | [
"Apache-2.0"
] | 63 | 2015-02-03T18:49:43.000Z | 2022-03-29T03:52:24.000Z | home/kwatters/harry/gestures/howdoyoudo.py | rv8flyboy/pyrobotlab | 4e04fb751614a5cb6044ea15dcfcf885db8be65a | [
"Apache-2.0"
] | 16 | 2016-01-26T19:13:29.000Z | 2018-11-25T21:20:51.000Z | home/kwatters/harry/gestures/howdoyoudo.py | rv8flyboy/pyrobotlab | 4e04fb751614a5cb6044ea15dcfcf885db8be65a | [
"Apache-2.0"
] | 151 | 2015-01-03T18:55:54.000Z | 2022-03-04T07:04:23.000Z | def howdoyoudo():
    global helvar
    if helvar <= 2:
        i01.mouth.speak("I'm fine thank you")
        helvar += 1
    elif helvar == 3:
        i01.mouth.speak("you have already said that at least twice")
        i01.moveArm("left", 43, 88, 22, 10)
        i01.moveArm("right", 20, 90, 30, 10)
        i01.moveHand("left", 0, 0, 0, 0, 0, 119)
        i01.moveHand("right", 0, 0, 0, 0, 0, 119)
        sleep(2)
        relax()
        helvar += 1
    elif helvar == 4:
        i01.mouth.speak("what is your problem stop saying how do you do all the time")
        i01.moveArm("left", 30, 83, 22, 10)
        i01.moveArm("right", 40, 85, 30, 10)
        i01.moveHand("left", 130, 180, 180, 180, 180, 119)
        i01.moveHand("right", 130, 180, 180, 180, 180, 119)
        sleep(2)
        relax()
        helvar += 1
    elif helvar == 5:
        i01.mouth.speak("i will ignore you if you say how do you do one more time")
        unhappy()
        sleep(4)
        relax()
        helvar += 1
| 28.633333 | 82 | 0.615832 | 150 | 859 | 3.526667 | 0.42 | 0.030246 | 0.034026 | 0.030246 | 0.347826 | 0.204159 | 0.117202 | 0.117202 | 0 | 0 | 0 | 0.177347 | 0.218859 | 859 | 29 | 83 | 29.62069 | 0.611028 | 0 | 0 | 0.310345 | 0 | 0 | 0.24447 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | true | 0 | 0 | 0 | 0.034483 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
11ec09f5a8b06f26f7b9880df4b5fb7cd6fc7cf0 | 4,711 | py | Python | tests/typed_list/test_opt.py | canyon289/Theano-PyMC | 1a9b04bfe480b758ddfa54ba49c88bee3bec419c | [
"BSD-3-Clause"
] | 1 | 2020-12-30T19:12:52.000Z | 2020-12-30T19:12:52.000Z | tests/typed_list/test_opt.py | canyon289/Theano-PyMC | 1a9b04bfe480b758ddfa54ba49c88bee3bec419c | [
"BSD-3-Clause"
] | null | null | null | tests/typed_list/test_opt.py | canyon289/Theano-PyMC | 1a9b04bfe480b758ddfa54ba49c88bee3bec419c | [
"BSD-3-Clause"
] | 1 | 2020-08-15T17:09:10.000Z | 2020-08-15T17:09:10.000Z | import numpy as np
import theano
import theano.tensor as tt
import theano.typed_list
from tests.tensor.utils import rand_ranged
from theano import In
from theano.typed_list.basic import Append, Extend, Insert, Remove, Reverse
from theano.typed_list.type import TypedListType


class TestInplace:
    def test_reverse_inplace(self):
        mySymbolicMatricesList = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        z = Reverse()(mySymbolicMatricesList)
        m = theano.compile.mode.get_default_mode().including("typed_list_inplace_opt")
        f = theano.function(
            [In(mySymbolicMatricesList, borrow=True, mutable=True)],
            z,
            accept_inplace=True,
            mode=m,
        )
        assert f.maker.fgraph.toposort()[0].op.inplace
        x = rand_ranged(-1000, 1000, [100, 101])
        y = rand_ranged(-1000, 1000, [100, 101])
        assert np.array_equal(f([x, y]), [y, x])

    def test_append_inplace(self):
        mySymbolicMatricesList = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        mySymbolicMatrix = tt.matrix()
        z = Append()(mySymbolicMatricesList, mySymbolicMatrix)
        m = theano.compile.mode.get_default_mode().including("typed_list_inplace_opt")
        f = theano.function(
            [
                In(mySymbolicMatricesList, borrow=True, mutable=True),
                In(mySymbolicMatrix, borrow=True, mutable=True),
            ],
            z,
            accept_inplace=True,
            mode=m,
        )
        assert f.maker.fgraph.toposort()[0].op.inplace
        x = rand_ranged(-1000, 1000, [100, 101])
        y = rand_ranged(-1000, 1000, [100, 101])
        assert np.array_equal(f([x], y), [x, y])

    def test_extend_inplace(self):
        mySymbolicMatricesList1 = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        mySymbolicMatricesList2 = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        z = Extend()(mySymbolicMatricesList1, mySymbolicMatricesList2)
        m = theano.compile.mode.get_default_mode().including("typed_list_inplace_opt")
        f = theano.function(
            [
                In(mySymbolicMatricesList1, borrow=True, mutable=True),
                mySymbolicMatricesList2,
            ],
            z,
            mode=m,
        )
        assert f.maker.fgraph.toposort()[0].op.inplace
        x = rand_ranged(-1000, 1000, [100, 101])
        y = rand_ranged(-1000, 1000, [100, 101])
        assert np.array_equal(f([x], [y]), [x, y])

    def test_insert_inplace(self):
        mySymbolicMatricesList = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        mySymbolicIndex = tt.scalar(dtype="int64")
        mySymbolicMatrix = tt.matrix()
        z = Insert()(mySymbolicMatricesList, mySymbolicIndex, mySymbolicMatrix)
        m = theano.compile.mode.get_default_mode().including("typed_list_inplace_opt")
        f = theano.function(
            [
                In(mySymbolicMatricesList, borrow=True, mutable=True),
                mySymbolicIndex,
                mySymbolicMatrix,
            ],
            z,
            accept_inplace=True,
            mode=m,
        )
        assert f.maker.fgraph.toposort()[0].op.inplace
        x = rand_ranged(-1000, 1000, [100, 101])
        y = rand_ranged(-1000, 1000, [100, 101])
        assert np.array_equal(f([x], np.asarray(1, dtype="int64"), y), [x, y])

    def test_remove_inplace(self):
        mySymbolicMatricesList = TypedListType(
            tt.TensorType(theano.config.floatX, (False, False))
        )()
        mySymbolicMatrix = tt.matrix()
        z = Remove()(mySymbolicMatricesList, mySymbolicMatrix)
        m = theano.compile.mode.get_default_mode().including("typed_list_inplace_opt")
        f = theano.function(
            [
                In(mySymbolicMatricesList, borrow=True, mutable=True),
                In(mySymbolicMatrix, borrow=True, mutable=True),
            ],
            z,
            accept_inplace=True,
            mode=m,
        )
        assert f.maker.fgraph.toposort()[0].op.inplace
        x = rand_ranged(-1000, 1000, [100, 101])
        y = rand_ranged(-1000, 1000, [100, 101])
        assert np.array_equal(f([x, y], y), [x])


def test_constant_folding():
    m = tt.ones((1,), dtype="int8")
    l = theano.typed_list.make_list([m, m])
    f = theano.function([], l)
    topo = f.maker.fgraph.toposort()
    assert len(topo)
    assert isinstance(topo[0].op, theano.compile.ops.DeepCopyOp)
    assert f() == [1, 1]
#
# Copyright (c) 2010 by nexB, Inc. http://www.nexb.com/ - All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# 3. Neither the names of Django, nexB, Django-tasks nor the names of the contributors may be used
# to endorse or promote products derived from this software without
# specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
# ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
# ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from __future__ import with_statement
import sys
import StringIO
import os
import unittest
import tempfile
import time
import inspect
from os.path import join, dirname, basename, exists
import re
DATETIME_REGEX = re.compile('[a-zA-Z]+ \d+\, \d\d\d\d at \d+(\:\d+)? [ap]\.m\.')
from models import Task, TaskManager
class StandardOutputCheck(object):
    def __init__(self, test, expected_stdout=None, fail_if_different=True):
        self.test = test
        self.expected_stdout = expected_stdout or ''
        self.fail_if_different = fail_if_different

    def __enter__(self):
        self.stdout = sys.stdout
        sys.stdout = StringIO.StringIO()

    def __exit__(self, type, value, traceback):
        # Restore state
        self.stdoutvalue = sys.stdout.getvalue()
        sys.stdout = self.stdout
        # Check the output only if no exception occurred (to avoid "eating" test failures)
        if type:
            return
        if self.fail_if_different:
            self.test.assertEquals(self.expected_stdout, self.stdoutvalue)


class TestModel(object):
    ''' A mock Model object for task tests'''

    class Manager(object):
        def get(self, pk):
            if basename(pk) not in ['key1', 'key2', 'key3', 'key with space', 'key-more']:
                raise Exception("Not a good object loaded")
            return TestModel(pk)

    objects = Manager()

    def __init__(self, pk):
        self.pk = pk

    def run_something_long(self, msg=''):
        print "running something..."
        sys.stdout.flush()
        self._trigger("run_something_long_1")
        time.sleep(0.2)
        print "still running..."
        sys.stdout.flush()
        time.sleep(0.2)
        self._trigger("run_something_long_2")
        return "finished"

    def run_something_else(self):
        pass

    def run_something_failing(self):
        print "running, will fail..."
        sys.stdout.flush()
        time.sleep(0.2)
        self._trigger("run_something_failing")
        raise Exception("Failed !")

    def run_something_with_required(self):
        print "running required..."
        sys.stdout.flush()
        time.sleep(0.2)
        self._trigger("run_something_with_required")
        return "finished required"

    def run_something_with_required_failing(self):
        print "running required..."
        sys.stdout.flush()
        time.sleep(0.2)
        self._trigger("run_something_with_required")
        return "finished required"

    def run_something_with_two_required(self):
        # not called in the tests
        pass

    def run_a_method_that_is_not_registered(self):
        # not called in the tests
        pass

    def run_something_fast(self):
        print "Doing something fast"
        time.sleep(0.1)
        self._trigger("run_something_fast")

    def check_database_settings(self):
        from django.db import connection
        print connection.settings_dict["NAME"]
        time.sleep(0.1)
        self._trigger("check_database_settings")

    def _trigger(self, event):
        open(self.pk + event, 'w').writelines(["."])


class ViewsTestCase(unittest.TestCase):
    def failUnlessRaises(self, excClassOrInstance, callableObj, *args, **kwargs):
        # improved method compared to unittest.TestCase.failUnlessRaises:
        # also check the content of the exception
        if inspect.isclass(excClassOrInstance):
            return unittest.TestCase.failUnlessRaises(self, excClassOrInstance, callableObj, *args, **kwargs)
        excClass = excClassOrInstance.__class__
        try:
            callableObj(*args, **kwargs)
        except excClass, e:
            self.assertEquals(str(excClassOrInstance), str(e))
        else:
            if hasattr(excClass, '__name__'):
                excName = excClass.__name__
            else:
                excName = str(excClass)
            raise self.failureException, "%s not raised" % excName

    assertRaises = failUnlessRaises

    def setUp(self):
        TaskManager.DEFINED_TASKS['djangotasks.tests.TestModel'] = \
            [('run_something_long', "Run a successful task", ''),
             ('run_something_else', "Run an empty task", ''),
             ('run_something_failing', "Run a failing task", ''),
             ('run_something_with_required', "Run a task with a required task", 'run_something_long'),
             ('run_something_with_two_required', "Run a task with two required task", 'run_something_long,run_something_with_required'),
             ('run_something_fast', "Run a fast task", ''),
             ('run_something_with_required_failing', "Run a task with a required task that fails", 'run_something_failing'),
             ('check_database_settings', "Checks the database settings", ''),
             ]
        import tempfile
        self.tempdir = tempfile.mkdtemp()
        import os
        os.environ['DJANGOTASKS_TESTING'] = "YES"

    def tearDown(self):
        del TaskManager.DEFINED_TASKS['djangotasks.tests.TestModel']
        for task in Task.objects.filter(model='djangotasks.tests.TestModel'):
            task.delete()
        import shutil
        shutil.rmtree(self.tempdir)
        import os
        del os.environ['DJANGOTASKS_TESTING']

    def test_tasks_import(self):
        from djangotasks.models import _my_import
        self.assertEquals(TestModel, _my_import('djangotasks.tests.TestModel'))

    def _create_task(self, method, object_id):
        from djangotasks.models import _qualified_class_name
        return Task.objects._create_task(_qualified_class_name(method.im_class),
                                         method.im_func.__name__,
                                         object_id)

    def test_tasks_invalid_method(self):
        self.assertRaises(Exception("Method 'run_a_method_that_is_not_registered' not registered for model 'djangotasks.tests.TestModel'"),
                          self._create_task, TestModel.run_a_method_that_is_not_registered, 'key1')

        class NotAValidModel(object):
            def a_method(self):
                pass

        self.assertRaises(Exception("'module' object has no attribute 'NotAValidModel'"),
                          self._create_task, NotAValidModel.a_method, 'key1')
        self.assertRaises(Exception("Not a good object loaded"),
                          self._create_task, TestModel.run_something_long, 'key_that_does_not_exist')

    def test_tasks_register(self):
        class MyClass(object):
            def mymethod1(self):
                pass

            def mymethod2(self):
                pass

            def mymethod3(self):
                pass

        from djangotasks.models import register_task
        try:
            register_task(MyClass.mymethod3, None, MyClass.mymethod1, MyClass.mymethod2)
            register_task(MyClass.mymethod1, '''Some documentation''', MyClass.mymethod2)
            register_task(MyClass.mymethod2, '''Some other documentation''')
            self.assertEquals([('mymethod3', '', 'mymethod1,mymethod2'),
                               ('mymethod1', 'Some documentation', 'mymethod2'),
                               ('mymethod2', 'Some other documentation', '')],
                              TaskManager.DEFINED_TASKS['djangotasks.tests.MyClass'])
        finally:
            del TaskManager.DEFINED_TASKS['djangotasks.tests.MyClass']

    def _wait_until(self, key, event):
        max = 50  # 5 seconds max
        while not exists(join(self.tempdir, key + event)) and max:
            time.sleep(0.1)
            max -= 1
        if not max:
            self.fail("Timeout")

    def _reset(self, key, event):
        os.remove(join(self.tempdir, key + event))

    def _assert_status(self, expected_status, task):
        task = Task.objects.get(pk=task.pk)
        self.assertEquals(expected_status, task.status)

    def test_tasks_run_successful(self):
        task = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key1', 'run_something_long_2')
        time.sleep(0.5)
        new_task = Task.objects.get(pk=task.pk)
        self.assertEquals(u'running something...\nstill running...\n', new_task.log)
        self.assertEquals("successful", new_task.status)

    def test_tasks_run_check_database(self):
        task = self._create_task(TestModel.check_database_settings, join(self.tempdir, 'key1'))
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key1', 'check_database_settings')
        time.sleep(0.5)
        new_task = Task.objects.get(pk=task.pk)
        from django.db import connection
        self.assertEquals(connection.settings_dict["NAME"] + u'\n', new_task.log)  # May fail if your Django settings define a different test database for each run: in which case you should modify it, to ensure it's always the same.
        self.assertEquals("successful", new_task.status)

    def test_tasks_run_with_space_fast(self):
        task = self._create_task(TestModel.run_something_fast, join(self.tempdir, 'key with space'))
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key with space', "run_something_fast")
        time.sleep(0.5)
        new_task = Task.objects.get(pk=task.pk)
        self.assertEquals(u'Doing something fast\n', new_task.log)
        self.assertEquals("successful", new_task.status)

    def test_tasks_run_cancel_running(self):
        task = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key1', "run_something_long_1")
        Task.objects.cancel_task(task.pk)
        output_check = StandardOutputCheck(self, fail_if_different=False)
        with output_check:
            Task.objects._do_schedule()
        time.sleep(0.3)
        self.assertTrue(("Cancelling task " + str(task.pk) + "...") in output_check.stdoutvalue)
        self.assertTrue("cancelled.\n" in output_check.stdoutvalue)
        #self.assertTrue('INFO: failed to mark tasked as finished, from status "running" to "unsuccessful" for task 3. May have been finished in a different thread already.\n'
        #                in output_check.stdoutvalue)
        new_task = Task.objects.get(pk=task.pk)
        self.assertEquals("cancelled", new_task.status)
        self.assertTrue(u'running something...' in new_task.log)
        self.assertFalse(u'still running...' in new_task.log)
        self.assertFalse('finished' in new_task.log)

    def test_tasks_run_cancel_scheduled(self):
        task = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        with StandardOutputCheck(self):
            Task.objects._do_schedule()
        Task.objects.run_task(task.pk)
        Task.objects.cancel_task(task.pk)
        with StandardOutputCheck(self, "Cancelling task " + str(task.pk) + "... cancelled.\n"):
            Task.objects._do_schedule()
        new_task = Task.objects.get(pk=task.pk)
        self.assertEquals("cancelled", new_task.status)
        self.assertEquals("", new_task.log)

    def test_tasks_run_failing(self):
        task = self._create_task(TestModel.run_something_failing, join(self.tempdir, 'key1'))
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key1', "run_something_failing")
        time.sleep(0.5)
        new_task = Task.objects.get(pk=task.pk)
        self.assertEquals("unsuccessful", new_task.status)
        self.assertTrue(u'running, will fail...' in new_task.log)
        self.assertTrue(u'raise Exception("Failed !")' in new_task.log)
        self.assertTrue(u'Exception: Failed !' in new_task.log)

    def test_tasks_get_tasks_for_object(self):
        tasks = Task.objects.tasks_for_object(TestModel, 'key2')
        self.assertEquals(8, len(tasks))
        self.assertEquals('defined', tasks[0].status)
        self.assertEquals('defined', tasks[1].status)
        self.assertEquals('run_something_long', tasks[0].method)
        self.assertEquals('run_something_else', tasks[1].method)

    def test_tasks_get_task_for_object(self):
        self.assertRaises(Exception("Method 'run_doesn_not_exists' not registered for model 'djangotasks.tests.TestModel'"),
                          Task.objects.task_for_object, TestModel, 'key2', 'run_doesn_not_exists')
        task = Task.objects.task_for_object(TestModel, 'key2', 'run_something_long')
        self.assertEquals('defined', task.status)
        self.assertEquals('run_something_long', task.method)

    def test_tasks_get_task_for_object_required(self):
        task = Task.objects.task_for_object(TestModel, 'key-more', 'run_something_with_two_required')
        self.assertEquals('run_something_long,run_something_with_required', task.required_methods)

    def test_tasks_archive_task(self):
        tasks = Task.objects.tasks_for_object(TestModel, 'key3')
        task = tasks[0]
        self.assertTrue(task.pk)
        task.status = 'successful'
        task.save()
        self.assertEquals(False, task.archived)
        new_task = self._create_task(TestModel.run_something_long,
                                     'key3')
        self.assertTrue(new_task.pk)
        self.assertTrue(task.pk != new_task.pk)
        old_task = Task.objects.get(pk=task.pk)
        self.assertEquals(True, old_task.archived, "Task should have been archived once a new one has been created")

    def test_tasks_get_required_tasks(self):
        task = self._create_task(TestModel.run_something_with_required, 'key1')
        self.assertEquals(['run_something_long'],
                          [required_task.method for required_task in task.get_required_tasks()])
        task = self._create_task(TestModel.run_something_with_two_required, 'key1')
        self.assertEquals(['run_something_long', 'run_something_with_required'],
                          [required_task.method for required_task in task.get_required_tasks()])

    def test_tasks_run_required_task_successful(self):
        required_task = Task.objects.task_for_object(TestModel, join(self.tempdir, 'key1'), 'run_something_long')
        task = self._create_task(TestModel.run_something_with_required, join(self.tempdir, 'key1'))
        self.assertEquals("defined", required_task.status)
        Task.objects.run_task(task.pk)
        self._assert_status("scheduled", task)
        self._assert_status("scheduled", required_task)
        with StandardOutputCheck(self, "Starting task " + str(required_task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        time.sleep(0.5)
        self._assert_status("scheduled", task)
        self._assert_status("running", required_task)
        self._wait_until('key1', 'run_something_long_2')
        time.sleep(0.5)
        self._assert_status("scheduled", task)
        self._assert_status("successful", required_task)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        time.sleep(0.5)
        self._assert_status("running", task)
        self._assert_status("successful", required_task)
        self._wait_until('key1', "run_something_with_required")
        time.sleep(0.5)
        self._assert_status("successful", task)
        task = Task.objects.get(pk=task.pk)
        complete_log, _ = DATETIME_REGEX.subn('', task.complete_log())
        self.assertEquals(u'Run a successful task started on \nrunning something...\nstill running...\n\n' +
                          u'Run a successful task finished successfully on \n' +
                          u'Run a task with a required task started on \nrunning required...\n\n' +
                          u'Run a task with a required task finished successfully on ', complete_log)

    def test_tasks_run_required_task_failing(self):
        required_task = Task.objects.task_for_object(TestModel, join(self.tempdir, 'key1'), 'run_something_failing')
        task = self._create_task(TestModel.run_something_with_required_failing, join(self.tempdir, 'key1'))
        self.assertEquals("defined", required_task.status)
        Task.objects.run_task(task.pk)
        self._assert_status("scheduled", task)
        self._assert_status("scheduled", required_task)
        with StandardOutputCheck(self, "Starting task " + str(required_task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        time.sleep(0.5)
        self._assert_status("scheduled", task)
        self._assert_status("running", required_task)
        self._wait_until('key1', 'run_something_failing')
        time.sleep(0.5)
        self._assert_status("scheduled", task)
        self._assert_status("unsuccessful", required_task)
        with StandardOutputCheck(self):
            Task.objects._do_schedule()
        time.sleep(0.5)
        self._assert_status("unsuccessful", task)
        task = Task.objects.get(pk=task.pk)
        complete_log, _ = DATETIME_REGEX.subn('', task.complete_log())
        self.assertTrue(complete_log.startswith('Run a failing task started on \nrunning, will fail...\nTraceback (most recent call last):'))
        self.assertTrue(complete_log.endswith(u', in run_something_failing\n raise Exception("Failed !")\nException: Failed !\n\n' +
                                              u'Run a failing task finished with error on \n' +
                                              u'Run a task with a required task that fails started\n' +
                                              u'Run a task with a required task that fails finished with error'))
        self.assertEquals("unsuccessful", task.status)

    def test_tasks_run_again(self):
        tasks = Task.objects.tasks_for_object(TestModel, join(self.tempdir, 'key1'))
        task = tasks[5]
        self.assertEquals('run_something_fast', task.method)
        Task.objects.run_task(task.pk)
        with StandardOutputCheck(self, "Starting task " + str(task.pk) + "... started.\n"):
            Task.objects._do_schedule()
        self._wait_until('key1', "run_something_fast")
        time.sleep(0.5)
        self._reset('key1', "run_something_fast")
        self._assert_status("successful", task)
        Task.objects.run_task(task.pk)
        output_check = StandardOutputCheck(self, fail_if_different=False)
        with output_check:
            Task.objects._do_schedule()
        self._wait_until('key1', "run_something_fast")
        time.sleep(0.5)
        import re
        pks = re.findall(r'(\d+)', output_check.stdoutvalue)
        self.assertEquals(1, len(pks))
        self.assertEquals("Starting task " + pks[0] + "... started.\n", output_check.stdoutvalue)
        new_task = Task.objects.get(pk=int(pks[0]))
        self.assertTrue(new_task.pk != task.pk)
        self.assertEquals("successful", new_task.status)
        tasks = Task.objects.tasks_for_object(TestModel, join(self.tempdir, 'key1'))
        self.assertEquals(new_task.pk, tasks[5].pk)

    def NOtest_tasks_exception_in_thread(self):
        task = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        Task.objects.run_task(task.pk)
        task = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        task_delete = self._create_task(TestModel.run_something_long, join(self.tempdir, 'key1'))
        task_delete.delete()
        try:
            Task.objects.get(pk=task.pk)
            self.fail("Should throw an exception")
        except Exception, e:
            self.assertEquals("Task matching query does not exist.", str(e))
        output_check = StandardOutputCheck(self, fail_if_different=False)
        with output_check:
            task.do_run()
        time.sleep(0.5)
        self.assertTrue("Exception: Failed to mark task with " in output_check.stdoutvalue)
        self.assertTrue("as started, task does not exist" in output_check.stdoutvalue)
        self.assertTrue('INFO: failed to mark tasked as finished, from status "running" to "unsuccessful" for task' in output_check.stdoutvalue)
# -*- coding: utf-8 -*-
# Copyright 2018 Zhao Xingyu & An Yuexuan. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS-IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import numpy as np
from ..utils import to_categorical
from .activations import softmax, sigmoid
# Softmax cross-entropy
def softmax_cross_entropy(out, label):
    # out: neuron output values (logits)
    # label: actual classes or one-hot encoding
    out, label = np.array(out), np.array(label)
    assert len(out.shape) == 2  # wrong output shape
    assert len(label.shape) == 1 or len(label.shape) == 2  # wrong label shape
    if len(label.shape) == 1:  # convert to one-hot encoding
        y = to_categorical(label, num_classes=out.shape[1])
    else:
        if label.shape[1] == 1:
            y = to_categorical(label.squeeze(), num_classes=out.shape[1])
        else:
            assert label.max() == 1 and label.sum(1).mean() == 1  # invalid one-hot label encoding
            y = label
    yhat = softmax(out)
    return -np.mean(y * np.log(yhat))


# Cross-entropy (inputs are already probabilities)
def cross_entropy(out, label):
    # out: neuron output values (probabilities)
    # label: actual classes or one-hot encoding
    yhat, label = np.array(out), np.array(label)
    assert len(yhat.shape) == 2  # wrong output shape
    assert len(label.shape) == 1 or len(label.shape) == 2  # wrong label shape
    if len(label.shape) == 1:  # convert to one-hot encoding
        y = to_categorical(label, num_classes=yhat.shape[1])
    else:
        if label.shape[1] == 1:
            y = to_categorical(label.squeeze(), num_classes=yhat.shape[1])
        else:
            assert label.max() == 1 and label.sum(1).mean() == 1  # invalid one-hot label encoding
            y = label
    return -np.mean(y * np.log(yhat))


# Binary classification
def sigmoid_binary_cross_entropy(out, label):
    # out: neuron output values (logits)
    # label: actual classes
    out, y = np.array(out), np.array(label)
    assert len(out.shape) == 2 and out.shape[1] == 1  # wrong output shape
    assert len(y.shape) == 1  # wrong label shape
    yhat = sigmoid(out)
    return -np.mean(y * np.log(yhat) + (1 - y) * np.log(1 - yhat))


# Binary classification (inputs are already probabilities)
def binary_cross_entropy(out, label):
    # out: neuron output values (probabilities)
    # label: actual classes
    yhat, y = np.array(out), np.array(label)
    assert len(yhat.shape) == 2 and yhat.shape[1] == 1  # wrong output shape
    assert len(y.shape) == 1  # wrong label shape
    return -np.mean(y * np.log(yhat) + (1 - y) * np.log(1 - yhat))


# Least-squares loss
def square_loss(prediction, y):
    # prediction: predicted values
    # y: actual values
    prediction, y = np.array(prediction), np.array(y)
    assert (len(prediction.shape) == 2 and prediction.shape[1] == 1) or len(prediction.shape) == 1  # wrong output shape
    assert len(y.shape) == 1 or (len(y.shape) == 2 and y.shape[1] == 1)  # wrong ground-truth shape
    return np.sum(np.sum(np.square(prediction.reshape(-1, 1) - y.reshape(-1, 1)), 1))


# Mean squared error
def mse(prediction, y):
    # prediction: predicted values
    # y: actual values
    prediction, y = np.array(prediction), np.array(y)
    assert (len(prediction.shape) == 2 and prediction.shape[1] == 1) or len(prediction.shape) == 1  # wrong output shape
    assert len(y.shape) == 1 or (len(y.shape) == 2 and y.shape[1] == 1)  # wrong ground-truth shape
    return np.mean(np.sum(np.square(prediction.reshape(-1, 1) - y.reshape(-1, 1)), 1))
import unittest
import prody
import numpy as np
import pytest
import itertools
from path import Path

from ..mhc_peptide import BasePDB
from ..sampling.generate_peptides import PeptideSampler
from .. import utils
from ..helpers import isolate, isolated_filesystem


@pytest.fixture()
def default_mhc():
    return utils.load_gdomains_mhc('1ao7')


@pytest.fixture()
def default_pep():
    return utils.load_gdomains_peptide('1ao7')


@isolate
def test_instantiate_with_seq():
    sampler = PeptideSampler('ADCHTRTAC')
    assert sampler.pep.numAtoms() > 10


@isolate
def test_instantiate_with_short_seq():
    with pytest.raises(RuntimeError):
        PeptideSampler('ADCH')


@isolate
def test_instantiate_with_long_seq():
    with pytest.raises(RuntimeError):
        PeptideSampler('ADCHLKKKKKKKKKKKK')


@isolate
def test_instantiate_with_wrong_letters_seq():
    with pytest.raises(RuntimeError):
        PeptideSampler('ADCHLBBKK')


@isolate
def test_instantiate_with_pdb():
    prody.writePDB('pep.pdb', utils.load_gdomains_peptide('1ao7'))
    sampler = PeptideSampler(pep='pep.pdb')
    assert sampler.pep.numAtoms() > 10


@isolate
def test_instantiate_with_pep_and_mhc():
    prody.writePDB('pep.pdb', utils.load_gdomains_peptide('1ao7'))
    prody.writePDB('mhc.pdb', utils.load_gdomains_mhc('1ao7'))
    sampler = PeptideSampler(pep='pep.pdb', rec='mhc.pdb')
    assert sampler.pep.numAtoms() > 10
    assert sampler.rec.numAtoms() > 100


@isolate
def test_instantiate_with_seq_and_custom_template():
    prody.writePDB('template.pdb', utils.load_gdomains_peptide('1ao7'))
    sampler = PeptideSampler('ADCHTRTAC', custom_template='template.pdb')
    assert sampler.pep.numAtoms() > 10


@pytest.mark.parametrize('nsamples', [1, 10, 100, 1000, 15000])
def test_generate_simple(nsamples):
    with isolated_filesystem():
        sampler = PeptideSampler(pep=utils.load_gdomains_peptide('1ao7'))
        sampler.generate_peptides(nsamples, 1, 0.3, 123)
        assert sampler.brikard.numCoordsets() == nsamples


@isolate
def test_generate_with_template():
    prody.writePDB('template.pdb', utils.load_gdomains_peptide('1ao7'))
    sampler = PeptideSampler('ADCHTRTAC', custom_template='template.pdb')
    sampler.generate_peptides(10, 1, 0.2, 123)
    assert sampler.brikard.numCoordsets() == 10


@pytest.mark.parametrize('pep,rec', itertools.product(['1a1m', '1t22', '2bvo'], ['1a1m', '1t22', '2bvo']))
def test_generate_with_rec(pep, rec):
    with isolated_filesystem():
        sampler = PeptideSampler(pep=utils.load_gdomains_peptide(pep), rec=utils.load_gdomains_mhc(rec))
        sampler.generate_peptides(10, 1, 0.2, 123)
        assert sampler.brikard.numCoordsets() == 10


# check that receptor is fixed by default during sampling
def test_generate_receptor_fixed(default_mhc, default_pep):
    with isolated_filesystem():
        sampler = PeptideSampler(pep=default_pep, rec=default_mhc)
        sampler.generate_peptides(10, 1, 0.2, 123)
        assert sampler.brikard.numCoordsets() == 10
        rec_fixed = sampler.brikard.select('chain A')
        assert np.all(rec_fixed.getCoordsets(0) == rec_fixed.getCoordsets(1))


# check that receptor is flexible with sample_resi_within parameter set
def test_generate_receptor_flexible(default_mhc, default_pep):
    with isolated_filesystem():
        sampler = PeptideSampler(pep=default_pep, rec=default_mhc)
        sampler.generate_peptides(10, 1, 0.2, 123, sample_resi_within=7)
        assert sampler.brikard.numCoordsets() == 10
        rec_flex = sampler.brikard.select('chain A')
        assert np.any(rec_flex.getCoordsets(0) != rec_flex.getCoordsets(1))


@pytest.mark.parametrize('radius', range(1, 7, 2))
def test_generate_receptor_variable_radius(default_mhc, default_pep, radius):
    with isolated_filesystem():
        sampler = PeptideSampler(pep=default_pep, rec=default_mhc)
        sampler.generate_peptides(10, 1, 0.2, 123, sample_resi_within=radius)
        assert sampler.brikard.numCoordsets() == 10
# -*- coding: utf-8 -*-
"""
API для формирования запросов к хранилищу
"""
class Simple(object):
"""
Простейший API
Запросы выглядят примерно так: data.query('a', 1)('b', 2)
"""
def __init__(self, dep, steps=None):
self._dep = dep
self._steps = steps or []
def __call__(self, name, *args, **kwargs):
return self.__class__(
self._dep,
self._steps + [(name, args, kwargs)]
)
def __iter__(self):
if not self._steps:
ids = self._dep.indexmap.keys()
else:
            def do(step):
                # Python 3 removed tuple parameters; unpack the
                # (name, args, kwargs) step explicitly.
                n, args, kwargs = step
                return self._dep.indexes[n].query(*args, **kwargs)
ids = set(do(self._steps[0]))
for s in self._steps[1:]:
ids.intersection_update(do(s))
for i in ids:
yield i, self._dep.get(i)
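# A self-contained sketch of the intersection semantics behind __iter__ above,
# restated so it can run without the storage layer. FakeIndex is a
# hypothetical stand-in for dep.indexes[name]; the real class does the same
# thing lazily while iterating.
class FakeIndex(object):
    def __init__(self, mapping):
        self._mapping = mapping  # value -> set of record ids

    def query(self, value):
        return self._mapping.get(value, set())

_indexes = {
    'a': FakeIndex({1: {1, 2}, 2: {3}}),
    'b': FakeIndex({2: {1, 3}, 3: {2}}),
}
# The chain data.query('a', 1)('b', 2) accumulates these steps:
_steps = [('a', (1,), {}), ('b', (2,), {})]
ids = set(_indexes[_steps[0][0]].query(*_steps[0][1], **_steps[0][2]))
for _name, _args, _kwargs in _steps[1:]:
    ids.intersection_update(_indexes[_name].query(*_args, **_kwargs))
# Only record id 1 satisfies both a == 1 and b == 2.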
| 23.944444 | 66 | 0.519722 | 106 | 862 | 3.971698 | 0.518868 | 0.099762 | 0.057007 | 0.095012 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008741 | 0.336427 | 862 | 35 | 67 | 24.628571 | 0.727273 | 0.024362 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eea01a1945cbf6c9fab5e0649c019841a2a0827c | 540 | py | Python | server/code/models/article.py | Social-Do-Gooders/localy | c9f11a38a9a6cffbeaedba9e93ee5058b13657ce | [
"MIT"
] | null | null | null | server/code/models/article.py | Social-Do-Gooders/localy | c9f11a38a9a6cffbeaedba9e93ee5058b13657ce | [
"MIT"
] | null | null | null | server/code/models/article.py | Social-Do-Gooders/localy | c9f11a38a9a6cffbeaedba9e93ee5058b13657ce | [
"MIT"
] | 1 | 2020-12-11T10:43:30.000Z | 2020-12-11T10:43:30.000Z | import json
from db import db
class ArticleModel(db.Document):
title = db.StringField(required=True)
content = db.StringField()
author = db.StringField()
date = db.DateTimeField()
article_type = db.StringField(default='other', choices=['business news', 'tech news', 'science news', 'other'])
meta = {'collection': 'articles'}
def json(self):
return json.loads(self.to_json())
def save_to_db(self):
self.save()
def delete_from_db(self):
self.delete()
| 24.545455 | 116 | 0.618519 | 64 | 540 | 5.125 | 0.53125 | 0.158537 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.248148 | 540 | 21 | 117 | 25.714286 | 0.807882 | 0 | 0 | 0 | 0 | 0 | 0.119461 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.133333 | 0.066667 | 0.866667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
eea13e48b604e45150618ed61c1603c72eb12c7f | 1,081 | py | Python | config.py | taeram/ineffable | fcaf1cc405f9e4dc3346fa91338b75b195325f84 | [
"MIT"
] | 7 | 2015-05-08T21:37:27.000Z | 2019-01-01T22:11:07.000Z | config.py | taeram/ineffable | fcaf1cc405f9e4dc3346fa91338b75b195325f84 | [
"MIT"
] | null | null | null | config.py | taeram/ineffable | fcaf1cc405f9e4dc3346fa91338b75b195325f84 | [
"MIT"
] | null | null | null | from os import getenv, \
path
from time import time
from datetime import timedelta
class Config(object):
AWS_ACCESS_KEY_ID = getenv('AWS_ACCESS_KEY_ID')
AWS_REGION = getenv('AWS_REGION')
AWS_S3_BUCKET = getenv('AWS_S3_BUCKET')
AWS_SECRET_ACCESS_KEY = getenv('AWS_SECRET_ACCESS_KEY')
CACHE_BUSTER = time()
DEBUG = getenv('DEBUG', False)
GALLERIES_PER_PAGE = getenv('GALLERIES_PER_PAGE', 5)
GOOGLE_ANALYTICS_ID = getenv('GOOGLE_ANALYTICS_ID', False)
LAMBDA_INSTRUCTIONS = getenv('LAMBDA_INSTRUCTIONS')
MAX_UPLOAD_SIZE = getenv('MAX_UPLOAD_SIZE')
PERMANENT_SESSION_LIFETIME = timedelta(minutes=30)
REMEMBER_COOKIE_DURATION = timedelta(days=30)
SECRET_KEY = getenv('SECRET_KEY')
SEND_FILE_MAX_AGE_DEFAULT = 365 * 86400
SITE_NAME = getenv('SITE_NAME', 'Ineffable')
SQLALCHEMY_DATABASE_URI = getenv('DATABASE_URL', 'sqlite:///' + path.dirname(__file__) + '/app/app.db').replace('mysql2:', 'mysql:')
SQLALCHEMY_ECHO = getenv('SQLALCHEMY_ECHO', False)
SQLALCHEMY_POOL_RECYCLE = 60
TESTING = False
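# Caveat on the getenv defaults above: environment variables always arrive as
# strings, so an exported DEBUG=False reaches Python as the truthy string
# 'False'. A small coercion helper (env_bool is a hypothetical addition, not
# part of this config) avoids the surprise:
import os

def env_bool(name, default=False):
    """Read an environment variable as a boolean flag."""
    raw = os.getenv(name)
    if raw is None:
        return default
    return raw.strip().lower() in ('1', 'true', 'yes', 'on')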
| 41.576923 | 136 | 0.725254 | 139 | 1,081 | 5.23741 | 0.489209 | 0.049451 | 0.032967 | 0.038462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019934 | 0.164662 | 1,081 | 25 | 137 | 43.24 | 0.786268 | 0 | 0 | 0 | 0 | 0 | 0.209066 | 0.019426 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.958333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
eea232085074a49d948920f72605827aea0e2dc0 | 3,050 | py | Python | tests/test_wins.py | Acamol/connect-four | 9db0123ad4b6527ee39b198a42ec1d537e0ecb85 | [
"MIT"
] | null | null | null | tests/test_wins.py | Acamol/connect-four | 9db0123ad4b6527ee39b198a42ec1d537e0ecb85 | [
"MIT"
] | null | null | null | tests/test_wins.py | Acamol/connect-four | 9db0123ad4b6527ee39b198a42ec1d537e0ecb85 | [
"MIT"
] | null | null | null | from ConnectFour import ConnectFour
import pytest
def test_empty_cell():
my = ConnectFour.Game()
assert not my.check_for_win(2, 2)
def test_diagonal_win():
# positive slope
my = ConnectFour.Game()
my.move(1) # X
my.move(2) # Y
my.move(2) # YX
my.move(3) # Y
my.move(3) # YX
my.move(4) # Y
my.move(3) # YXX
my.move(4) # YY
my.move(4) # YYX
my.move(6) # Y
my.move(4) # YYXX
assert my.check_for_win(3, 4)
assert my.winner == 'X'
winning_discs = [(3, 4), (2, 3), (1, 2), (0, 1)]
returned_winning_discs = my.get_winning_discs()
for disc in returned_winning_discs:
assert disc in winning_discs
for disc in winning_discs:
assert disc in returned_winning_discs
# game was already decided
assert my.move(4) == None
# negative slope
my = ConnectFour.Game()
my.move(6) # X
my.move(5) # Y
my.move(5) # YX
my.move(0) # Y
my.move(4) # X
my.move(4) # XY
my.move(4) # XYX
my.move(3) # Y
my.move(3) # YX
my.move(3) # YXY
my.move(3) # YXYX
assert my.check_for_win(3, 3)
assert my.winner == 'X'
returned_winning_discs = my.get_winning_discs()
winning_discs = [(3, 3), (2, 4), (1, 5), (0, 6)]
for disc in returned_winning_discs:
assert disc in winning_discs
for disc in winning_discs:
assert disc in returned_winning_discs
def test_horizontal_win():
my = ConnectFour.Game()
# fill the first row
for i in range(7):
my.move(i)
my.move(2)
my.move(2)
my.move(3)
my.move(3)
my.move(4)
my.move(4)
my.move(5)
assert my.check_for_win(1, 5)
assert my.winner == 'Y'
winning_discs = [(1, 2), (1, 3), (1, 4), (1, 5)]
returned_winning_discs = my.get_winning_discs()
for disc in returned_winning_discs:
assert disc in winning_discs
for disc in winning_discs:
assert disc in returned_winning_discs
# game was already decided
assert my.move(4) == None
def test_vertical_win():
my = ConnectFour.Game()
my.move(2)
my.move(2)
my.move(2)
my.move(5)
my.move(2)
my.move(5)
my.move(2)
my.move(5)
my.move(2)
assert my.check_for_win(5, 2)
assert my.winner == 'X'
winning_discs = [(2, 2), (3, 2), (4, 2), (5, 2)]
returned_winning_discs = my.get_winning_discs()
for disc in returned_winning_discs:
assert disc in winning_discs
for disc in winning_discs:
assert disc in returned_winning_discs
# game was already decided
assert my.move(4) == None
def test_draw():
my = ConnectFour.Game()
for col in range(3):
for _ in range(6):
my.move(col)
my.move(6)
for col in range(3, 7):
for _ in range(6):
my.move(col)
# now the board is full, but no winners
assert my.winner == 'D'
my.print_board()
test_draw() | 26.293103 | 53 | 0.569836 | 475 | 3,050 | 3.517895 | 0.145263 | 0.16158 | 0.050269 | 0.100539 | 0.676242 | 0.609814 | 0.520048 | 0.473968 | 0.453621 | 0.453621 | 0 | 0.042654 | 0.308197 | 3,050 | 116 | 54 | 26.293103 | 0.749289 | 0.07377 | 0 | 0.737374 | 0 | 0 | 0.001865 | 0 | 0 | 0 | 0 | 0 | 0.212121 | 1 | 0.050505 | false | 0 | 0.020202 | 0 | 0.070707 | 0.010101 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eea5e23c40bf133792e4c4fcbc142f8cd3d91172 | 4,326 | py | Python | sdk/python/pulumi_gcp/dns/record_set.py | pulumi-bot/pulumi-gcp | 43ff11bf1c99b4e9e493f61d9755e359b686ae67 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_gcp/dns/record_set.py | pulumi-bot/pulumi-gcp | 43ff11bf1c99b4e9e493f61d9755e359b686ae67 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_gcp/dns/record_set.py | pulumi-bot/pulumi-gcp | 43ff11bf1c99b4e9e493f61d9755e359b686ae67 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import pulumi
import pulumi.runtime
class RecordSet(pulumi.CustomResource):
"""
Manages a set of DNS records within Google Cloud DNS. For more information see [the official documentation](https://cloud.google.com/dns/records/) and
[API](https://cloud.google.com/dns/api/v1/resourceRecordSets).
~> **Note:** The Google Cloud DNS API requires NS records be present at all
times. To accommodate this, when creating NS records, the default records
Google automatically creates will be silently overwritten. Also, when
destroying NS records, Terraform will not actually remove NS records, but will
report that it did.
"""
def __init__(__self__, __name__, __opts__=None, managed_zone=None, name=None, project=None, rrdatas=None, ttl=None, type=None):
"""Create a RecordSet resource with the given unique name, props, and options."""
if not __name__:
raise TypeError('Missing resource name argument (for URN creation)')
if not isinstance(__name__, basestring):
raise TypeError('Expected resource name to be a string')
if __opts__ and not isinstance(__opts__, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
__props__ = dict()
if not managed_zone:
raise TypeError('Missing required property managed_zone')
elif not isinstance(managed_zone, basestring):
raise TypeError('Expected property managed_zone to be a basestring')
__self__.managed_zone = managed_zone
"""
The name of the zone in which this record set will
reside.
"""
__props__['managedZone'] = managed_zone
if name and not isinstance(name, basestring):
raise TypeError('Expected property name to be a basestring')
__self__.name = name
"""
The DNS name this record set will apply to.
"""
__props__['name'] = name
if project and not isinstance(project, basestring):
raise TypeError('Expected property project to be a basestring')
__self__.project = project
"""
The ID of the project in which the resource belongs. If it
is not provided, the provider project is used.
"""
__props__['project'] = project
if not rrdatas:
raise TypeError('Missing required property rrdatas')
elif not isinstance(rrdatas, list):
raise TypeError('Expected property rrdatas to be a list')
__self__.rrdatas = rrdatas
"""
The string data for the records in this record set
whose meaning depends on the DNS type. For TXT record, if the string data contains spaces, add surrounding `\"` if you don't want your string to get split on spaces.
"""
__props__['rrdatas'] = rrdatas
if not ttl:
raise TypeError('Missing required property ttl')
elif not isinstance(ttl, int):
raise TypeError('Expected property ttl to be a int')
__self__.ttl = ttl
"""
The time-to-live of this record set (seconds).
"""
__props__['ttl'] = ttl
if not type:
raise TypeError('Missing required property type')
elif not isinstance(type, basestring):
raise TypeError('Expected property type to be a basestring')
__self__.type = type
"""
The DNS record set type.
"""
__props__['type'] = type
super(RecordSet, __self__).__init__(
'gcp:dns/recordSet:RecordSet',
__name__,
__props__,
__opts__)
def set_outputs(self, outs):
if 'managedZone' in outs:
self.managed_zone = outs['managedZone']
if 'name' in outs:
self.name = outs['name']
if 'project' in outs:
self.project = outs['project']
if 'rrdatas' in outs:
self.rrdatas = outs['rrdatas']
if 'ttl' in outs:
self.ttl = outs['ttl']
if 'type' in outs:
self.type = outs['type']
| 40.055556 | 173 | 0.625751 | 525 | 4,326 | 4.939048 | 0.291429 | 0.070189 | 0.067875 | 0.069418 | 0.190513 | 0.037794 | 0.037794 | 0 | 0 | 0 | 0 | 0.00065 | 0.289182 | 4,326 | 107 | 174 | 40.429907 | 0.842602 | 0.182386 | 0 | 0 | 1 | 0 | 0.231776 | 0.009554 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032787 | false | 0 | 0.032787 | 0 | 0.081967 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eea79038aa68bb587fd566127c104d4607b6cbfe | 762 | py | Python | run.py | nunenuh/crnn.pytorch | b0f8bc7fc43622f7396fa0550fb2cc6e17849551 | [
"MIT"
] | 1 | 2020-09-30T04:37:39.000Z | 2020-09-30T04:37:39.000Z | run.py | nunenuh/crnn.pytorch | b0f8bc7fc43622f7396fa0550fb2cc6e17849551 | [
"MIT"
] | null | null | null | run.py | nunenuh/crnn.pytorch | b0f8bc7fc43622f7396fa0550fb2cc6e17849551 | [
"MIT"
] | null | null | null | import os
import sys
# from lmdb.cffi import version as ver
sys.path.append(os.getcwd())
import torch
from iqra.models.crnn import *
from iqra.modules.feature import *
if __name__ == '__main__':
image_data = torch.rand(3,1,224,224)
text_data = torch.rand(3,512).long()
# text_data = torch.LongTensor(text_data)
# fe = FeatureExtraction(in_channels=1, version=50)
# hype = fe.feature.last_channels
# print(fe)
# print(fe(image_data))
# print()
# print(fe(image_data).shape)
# out = enc(test_data)
# # print(out)
num_class = 96
im_size = (32, 100)
model = OCRNet(num_class = num_class, im_size=im_size)
out = model(image_data, text_data)
print(out)
print(out.shape)
| 21.166667 | 58 | 0.641732 | 110 | 762 | 4.218182 | 0.472727 | 0.077586 | 0.056034 | 0.060345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037607 | 0.232283 | 762 | 36 | 59 | 21.166667 | 0.755556 | 0.339895 | 0 | 0 | 0 | 0 | 0.01626 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
eeaa9f209da716a2a49b70a314e3045acf4c8cf7 | 440 | py | Python | foodplanapp/migrations/0006_dish_active.py | 949027/food-plan | 382791059469058614bfd028d674ba1f69b25e2b | [
"MIT"
] | null | null | null | foodplanapp/migrations/0006_dish_active.py | 949027/food-plan | 382791059469058614bfd028d674ba1f69b25e2b | [
"MIT"
] | null | null | null | foodplanapp/migrations/0006_dish_active.py | 949027/food-plan | 382791059469058614bfd028d674ba1f69b25e2b | [
"MIT"
] | 1 | 2022-03-22T02:04:59.000Z | 2022-03-22T02:04:59.000Z | # Generated by Django 4.0.3 on 2022-03-17 18:37
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('foodplanapp', '0005_merge_0002_delete_user_0004_alter_dish_calories'),
]
operations = [
migrations.AddField(
model_name='dish',
name='active',
field=models.BooleanField(default=True, verbose_name='Статус'),
),
]
| 23.157895 | 80 | 0.636364 | 49 | 440 | 5.510204 | 0.836735 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082317 | 0.254545 | 440 | 18 | 81 | 24.444444 | 0.740854 | 0.102273 | 0 | 0 | 1 | 0 | 0.201018 | 0.132316 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eeb34e7812fdfc4153e4f5ac57a1441f64d25f20 | 539 | py | Python | attack/attacker.py | smallflyingpig/adversarial_attack_and_defense | 485e7cf6269c184a4ed1f3281b21a4242d5ed199 | [
"MIT"
] | null | null | null | attack/attacker.py | smallflyingpig/adversarial_attack_and_defense | 485e7cf6269c184a4ed1f3281b21a4242d5ed199 | [
"MIT"
] | null | null | null | attack/attacker.py | smallflyingpig/adversarial_attack_and_defense | 485e7cf6269c184a4ed1f3281b21a4242d5ed199 | [
"MIT"
] | null | null | null | import torch
from torch import nn
from abc import ABC, abstractmethod
class AdditiveAttacker(ABC):
def __init__(self, eps, data_min=-1, data_max=1):
self.eps = eps
self.data_min = data_min
self.data_max = data_max
@abstractmethod
def _get_perturbation(self, x, y=None):
pass
def generate(self, x, y=None):
delta = self._get_perturbation(x, y)
x = x + delta * self.eps
x[x<self.data_min] = self.data_min
x[x>self.data_max] = self.data_max
return x
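# A concrete instance of the pattern above, sketched with numpy so it runs
# without torch (the clipping logic is the same on tensors). RandomSignAttacker
# is a hypothetical example class, not part of this module.
import numpy as np

class RandomSignAttacker:
    """Additive attacker whose perturbation is a random +/-1 sign pattern."""

    def __init__(self, eps, data_min=-1, data_max=1):
        self.eps = eps
        self.data_min = data_min
        self.data_max = data_max

    def _get_perturbation(self, x, y=None):
        return np.sign(np.random.uniform(-1, 1, size=x.shape))

    def generate(self, x, y=None):
        x = x + self._get_perturbation(x, y) * self.eps
        # Same clamping as AdditiveAttacker.generate, vectorised via np.clip.
        return np.clip(x, self.data_min, self.data_max)

adv = RandomSignAttacker(eps=0.1).generate(np.zeros((4, 4)))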
| 24.5 | 53 | 0.627087 | 81 | 539 | 3.950617 | 0.308642 | 0.15 | 0.103125 | 0.09375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005115 | 0.274583 | 539 | 21 | 54 | 25.666667 | 0.813299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0.058824 | 0.176471 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
eeb6d7d3008cc66f20e2d95cde377248ad372b18 | 1,036 | py | Python | fixture/orm.py | Emeraldany/python_training | 619565e88f0c138d58fa486df60396481daaba26 | [
"Apache-2.0"
] | null | null | null | fixture/orm.py | Emeraldany/python_training | 619565e88f0c138d58fa486df60396481daaba26 | [
"Apache-2.0"
] | null | null | null | fixture/orm.py | Emeraldany/python_training | 619565e88f0c138d58fa486df60396481daaba26 | [
"Apache-2.0"
] | null | null | null | import datetime
from pymysql.converters import encoders, decoders, convert_mysql_timestamp
from pony.orm import *
class ORMFicture:
db=Database()
class ORMGroup(db.Entity):
_table_= 'grouplist'
id= PrimaryKey(int,column='group_id')
name=Optional(str,column='group_name')
header=Optional(str,column='group_header')
footer=Optional(str,column='group_Footer')
class ORMContact(db.Entity):
_table_= 'adressbook'
id= PrimaryKey(int,column='id')
firstname=Optional(str,column='firstname')
lastname=Optional(str,column='lastname')
deprecated=Optional(datetime,column='deprecated')
def __init__(self,host,name,user,password):
conv = encoders
conv.update(decoders)
conv[datetime] = convert_mysql_timestamp
self.db.bind('mysql', host=host, database=name, user=user, password=password, conv=conv)
self.db.generate_mapping()
def get_group_list(self):
list(select(g for g in ORMFicture.ORMGroup)) | 34.533333 | 96 | 0.682432 | 123 | 1,036 | 5.593496 | 0.422764 | 0.079942 | 0.123547 | 0.09593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.200772 | 1,036 | 30 | 97 | 34.533333 | 0.830918 | 0 | 0 | 0 | 0 | 0 | 0.09161 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0.08 | 0.12 | 0 | 0.36 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
eeb9d69641a4150d26f149bcd1a060a9161a6d9a | 1,114 | py | Python | base/modules/tmp/sb_utils/root/sb_utils/serialize/enums.py | ScreamBun/openc2-oif-device | 1666b2d202e3adb8baab618ead7fa1de556c1c58 | [
"Apache-2.0"
] | 2 | 2019-07-02T14:06:24.000Z | 2021-07-07T09:45:54.000Z | base/modules/tmp/sb_utils/root/sb_utils/serialize/enums.py | ScreamBun/openc2-oif-device | 1666b2d202e3adb8baab618ead7fa1de556c1c58 | [
"Apache-2.0"
] | 22 | 2020-03-24T16:58:17.000Z | 2022-02-27T15:36:57.000Z | base/modules/tmp/sb_utils/root/sb_utils/serialize/enums.py | ScreamBun/openc2-oif-device | 1666b2d202e3adb8baab618ead7fa1de556c1c58 | [
"Apache-2.0"
] | 10 | 2019-04-26T12:22:22.000Z | 2021-08-05T09:16:05.000Z | from shutil import which
from typing import Dict
from ..enums import EnumBase
class SerialFormats(str, EnumBase):
"""
The format of an OpenC2 Serialization
"""
# Binary Format
CBOR = 'cbor'
# Text Format
JSON = 'json'
# Extra
# Binary
BINN = 'binn'
BSON = 'bson'
ION = 'ion'
MSGPACK = 'msgpack'
SMILE = 'smile'
# Text
BENCODE = 'bencode'
EDN = 'edn'
S_EXPRESSION = 'sexp'
TOML = 'toml'
UBJSON = 'ubjson'
XML = 'xml'
YAML = 'yaml'
@classmethod
def is_binary(cls, fmt: 'SerialFormats') -> bool:
"""
Determine if the format is binary or text based
:param fmt: Serialization
"""
bins = (cls.BINN, cls.BSON, cls.CBOR, cls.ION, cls.MSGPACK, cls.SMILE, cls.UBJSON)
if vp := getattr(cls, 'VPACK', None):
bins = (*bins, vp)
return fmt in bins
def _optional_values(self) -> Dict[str, str]:
vals = {}
# VPACK - Binary
if which("json-to-vpack") and which("vpack-to-json"):
vals['VPACK'] = 'vpack'
return vals
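# The binary/text split above can be exercised without the project's EnumBase;
# MiniFormats below is a stripped-down, hypothetical stdlib sketch of the same
# membership-based check.
from enum import Enum

class MiniFormats(str, Enum):
    CBOR = 'cbor'
    JSON = 'json'
    MSGPACK = 'msgpack'
    XML = 'xml'

    @classmethod
    def is_binary(cls, fmt):
        return fmt in (cls.CBOR, cls.MSGPACK)

binary_fmt = MiniFormats.is_binary(MiniFormats.CBOR)  # True
text_fmt = MiniFormats.is_binary(MiniFormats.XML)     # False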
| 23.208333 | 90 | 0.552065 | 131 | 1,114 | 4.664122 | 0.458015 | 0.02946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001325 | 0.322262 | 1,114 | 47 | 91 | 23.702128 | 0.807947 | 0.153501 | 0 | 0 | 0 | 0 | 0.130191 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0 | 0.103448 | 0 | 0.758621 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
eeba9b0c7624cec94e9b08d7ae61fc16c23c4ac8 | 3,120 | py | Python | selectolx.py | buritica06/Scraping_Project | 07532f9cef114a511ed31b6c4ade75cb989a47f8 | [
"MIT"
] | null | null | null | selectolx.py | buritica06/Scraping_Project | 07532f9cef114a511ed31b6c4ade75cb989a47f8 | [
"MIT"
] | null | null | null | selectolx.py | buritica06/Scraping_Project | 07532f9cef114a511ed31b6c4ade75cb989a47f8 | [
"MIT"
] | null | null | null | import random
from time import sleep
from selenium import webdriver
# Instantiate the Selenium driver that will control the browser.
# All of the scraping and interaction below happens through this object.
driver = webdriver.Chrome(r"C:\Users\rburi\AppData\Local\Programs\Python\Python39\Proyectos\Scraping_Project\chromedriver.exe")
# Go to the page we need
driver.get('https://www.olx.com.co/atlantico_g2007003/carros_c378?sorting=desc-creation')
'''
# Find the button that loads more listings
boton = driver.find_element_by_xpath('//button[@data-aut-id="btnLoadMore"]')
for i in range(1):  # click "load more" this many times
    try:
        # click it
        boton.click()
        # wait for the dynamic content to load
        sleep(random.uniform(8.0, 10.0))
        # find the button again so it can be clicked on the next iteration
        boton = driver.find_element_by_xpath('//button[@data-aut-id="btnLoadMore"]')
    except:
        # on any error, just break out of the loop
        break
# Find the XPATH of each element holding the information to extract.
# This is a LIST, hence the plural method name.
'''
'''
# Find the car listing element to click on
boton_auto = driver.find_element_by_xpath('//li[@data-aut-id="itemBox"]')
for i in range(1):  # click it this many times
    try:
        # click it
        boton_auto.click()
        # wait for the dynamic content to load
        sleep(random.uniform(8.0, 10.0))
        # find the element again so it can be clicked on the next iteration
        boton_auto = driver.find_element_by_xpath('//li[@data-aut-id="itemBox"]')
    except:
        # on any error, just break out of the loop
        break
'''
# OLX LOGIN
boton_login = driver.find_element_by_xpath('//button[@data-aut-id="btnLogin"]')
boton_login.click()
sleep(random.uniform(8.0, 10.0))
boton_email = driver.find_element_by_xpath('//button[@data-aut-id="emailLogin"]')
boton_email.click()
sleep(random.uniform(6.0, 10.0))
e_mail = driver.find_element_by_xpath('//input[@id="email_input_field"]')
e_mail.send_keys("r.buritica@outloook.com")
sleep(random.uniform(6.0, 11.0))
boton_sgte_mail = driver.find_element_by_xpath('//button[@class="rui-3sH3b rui-2yJ_A rui-1zK8h _2_t7-"]')
boton_sgte_mail.click()
sleep(random.uniform(6.0, 11.0))
'''
# UNCOMMENT TO CONTINUE THE SEQUENCE
autos = driver.find_elements_by_xpath('//li[@data-aut-id="itemBox"]')
# Iterate over each ad found (find_elements, plural, returns a list)
for auto in autos:
    # For each ad, open it and pull out the price, seller name and description
    auto.click()
    sleep(random.uniform(8.0, 10.0))
    precio = driver.find_element_by_xpath('.//span[@data-aut-id="itemPrice"]').text
    nombre_vendedor = driver.find_element_by_xpath('.//div[@class="_3oOe9"]').text
    descripcion = driver.find_element_by_xpath('.//span[@data-aut-id="itemTitle"]').text
    print(nombre_vendedor)
    print(precio)
    print(descripcion)
    sleep(random.uniform(8.0, 10.0))
    driver.back()
'''
| 37.142857 | 127 | 0.708013 | 479 | 3,120 | 4.484342 | 0.356994 | 0.055866 | 0.094972 | 0.106145 | 0.517225 | 0.506052 | 0.463222 | 0.431099 | 0.405028 | 0.334264 | 0 | 0.025039 | 0.167949 | 3,120 | 83 | 128 | 37.590361 | 0.802388 | 0.058654 | 0 | 0.117647 | 0 | 0.058824 | 0.392377 | 0.275785 | 0 | 0 | 0 | 0.012048 | 0 | 1 | 0 | false | 0 | 0.176471 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eecf77c099ce9359a5ed298f84c5eefaa7076883 | 3,839 | py | Python | numba/tests/test_overlap.py | auderson/numba | 3d67c9850ab56457f418cf40af6245fd9c337705 | [
"BSD-2-Clause"
] | 6,620 | 2015-01-04T08:51:04.000Z | 2022-03-31T12:52:18.000Z | numba/tests/test_overlap.py | auderson/numba | 3d67c9850ab56457f418cf40af6245fd9c337705 | [
"BSD-2-Clause"
] | 6,457 | 2015-01-04T03:18:41.000Z | 2022-03-31T17:38:42.000Z | numba/tests/test_overlap.py | auderson/numba | 3d67c9850ab56457f418cf40af6245fd9c337705 | [
"BSD-2-Clause"
] | 930 | 2015-01-25T02:33:03.000Z | 2022-03-30T14:10:32.000Z | import numpy as np
from numba import jit
from numba.core import types
from numba.tests.support import TestCase, tag
import unittest
# Array overlaps involving a displacement
def array_overlap1(src, dest, k=1):
assert src.shape == dest.shape
dest[k:] = src[:-k]
def array_overlap2(src, dest, k=1):
assert src.shape == dest.shape
dest[:-k] = src[k:]
def array_overlap3(src, dest, k=1):
assert src.shape == dest.shape
dest[:,:-k] = src[:,k:]
def array_overlap4(src, dest, k=1):
assert src.shape == dest.shape
dest[:,k:] = src[:,:-k]
def array_overlap5(src, dest, k=1):
assert src.shape == dest.shape
dest[...,:-k] = src[...,k:]
def array_overlap6(src, dest, k=1):
assert src.shape == dest.shape
dest[...,k:] = src[...,:-k]
# Array overlaps involving an in-place reversal
def array_overlap11(src, dest):
assert src.shape == dest.shape
dest[::-1] = src
def array_overlap12(src, dest):
assert src.shape == dest.shape
dest[:] = src[::-1]
def array_overlap13(src, dest):
assert src.shape == dest.shape
dest[:,::-1] = src
def array_overlap14(src, dest):
assert src.shape == dest.shape
dest[:] = src[:,::-1]
def array_overlap15(src, dest):
assert src.shape == dest.shape
dest[...,::-1] = src
def array_overlap16(src, dest):
assert src.shape == dest.shape
dest[:] = src[...,::-1]
class TestArrayOverlap(TestCase):
def check_overlap(self, pyfunc, min_ndim, have_k_argument=False):
N = 4
def vary_layouts(orig):
yield orig.copy(order='C')
yield orig.copy(order='F')
a = orig[::-1].copy()[::-1]
assert not a.flags.c_contiguous and not a.flags.f_contiguous
yield a
def check(pyfunc, cfunc, pydest, cdest, kwargs):
pyfunc(pydest, pydest, **kwargs)
cfunc(cdest, cdest, **kwargs)
self.assertPreciseEqual(pydest, cdest)
cfunc = jit(nopython=True)(pyfunc)
# Check for up to 3d arrays
for ndim in range(min_ndim, 4):
shape = (N,) * ndim
orig = np.arange(0, N**ndim).reshape(shape)
# Note we cannot copy a 'A' layout array exactly (bitwise),
# so instead we call vary_layouts() twice
for pydest, cdest in zip(vary_layouts(orig), vary_layouts(orig)):
if have_k_argument:
for k in range(1, N):
check(pyfunc, cfunc, pydest, cdest, dict(k=k))
else:
check(pyfunc, cfunc, pydest, cdest, {})
def check_overlap_with_k(self, pyfunc, min_ndim):
self.check_overlap(pyfunc, min_ndim=min_ndim, have_k_argument=True)
def test_overlap1(self):
self.check_overlap_with_k(array_overlap1, min_ndim=1)
def test_overlap2(self):
self.check_overlap_with_k(array_overlap2, min_ndim=1)
def test_overlap3(self):
self.check_overlap_with_k(array_overlap3, min_ndim=2)
def test_overlap4(self):
self.check_overlap_with_k(array_overlap4, min_ndim=2)
def test_overlap5(self):
self.check_overlap_with_k(array_overlap5, min_ndim=1)
def test_overlap6(self):
self.check_overlap_with_k(array_overlap6, min_ndim=1)
def test_overlap11(self):
self.check_overlap(array_overlap11, min_ndim=1)
def test_overlap12(self):
self.check_overlap(array_overlap12, min_ndim=1)
def test_overlap13(self):
self.check_overlap(array_overlap13, min_ndim=2)
def test_overlap14(self):
self.check_overlap(array_overlap14, min_ndim=2)
def test_overlap15(self):
self.check_overlap(array_overlap15, min_ndim=1)
def test_overlap16(self):
self.check_overlap(array_overlap16, min_ndim=1)
if __name__ == '__main__':
unittest.main()
| 28.437037 | 77 | 0.631154 | 539 | 3,839 | 4.306122 | 0.189239 | 0.093063 | 0.089617 | 0.093063 | 0.504093 | 0.316243 | 0.316243 | 0.23869 | 0.23869 | 0.23869 | 0 | 0.02914 | 0.240167 | 3,839 | 134 | 78 | 28.649254 | 0.766541 | 0.054441 | 0 | 0.130435 | 0 | 0 | 0.002759 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 1 | 0.304348 | false | 0 | 0.054348 | 0 | 0.369565 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
eed6e091bea827a49ac882826a400f0dcb0c0328 | 3,070 | py | Python | app/authenticationUtils.py | spravesh1818/globalFantasyLeague | 296eba4e450a60ef9df1664de9a59e95b9351758 | [
"BSD-3-Clause"
] | null | null | null | app/authenticationUtils.py | spravesh1818/globalFantasyLeague | 296eba4e450a60ef9df1664de9a59e95b9351758 | [
"BSD-3-Clause"
] | null | null | null | app/authenticationUtils.py | spravesh1818/globalFantasyLeague | 296eba4e450a60ef9df1664de9a59e95b9351758 | [
"BSD-3-Clause"
] | null | null | null | from datetime import timedelta, datetime
from typing import Optional
from fastapi import HTTPException, Depends
from fastapi.security import OAuth2PasswordBearer
from jose import jwt, JWTError
from passlib.context import CryptContext
from pydantic import BaseModel
from db import database as adb
from usermanagement.models import users
from usermanagement.schema import UserCreate, User
from starlette import status
SECRET_KEY = "8ee7b057761c29f8ca0a336f850aa73abf9eeb81c6a5b015c893af74d7e6a948"
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 30
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
class Token(BaseModel):
access_token: str
token_type: str
class TokenData(BaseModel):
username: Optional[str] = None
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
def verify_password(plain_password, hashed_password) -> bool:
return pwd_context.verify(plain_password, hashed_password)
def get_password_hash(password) -> str:
return pwd_context.hash(password)
async def get_user(username: Optional[str]) -> Optional[UserCreate]:
    query = users.select()
    user_list = await adb.fetch_all(query)
    for user in user_list:
        if user["username"] == username:
            return UserCreate(**user)
    # Returning an HTTPException instance (instead of raising it) was a bug;
    # return None so callers can handle the missing user themselves.
    return None
"""authenticating user"""
async def authenticate_user(username: str, password: str):
user = await get_user(username)
if not user:
return False
if not verify_password(password, user.password):
return False
return user
"""code to create the access token"""
def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
to_encode = data.copy()
if expires_delta:
        expire = datetime.utcnow() + expires_delta
else:
expire = datetime.utcnow() + timedelta(minutes=15)
to_encode.update({"exp": expire})
encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
return encoded_jwt
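# What jwt.encode produces for HS256, sketched with only the standard library
# so the token structure is visible. This is illustrative (hs256_encode and
# b64url are hypothetical helpers), not a substitute for python-jose: no
# validation or decoding is implemented here.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()

def hs256_encode(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    signature = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(signature)}"

demo_token = hs256_encode({"sub": "alice"}, "not-the-real-secret")
# header.payload.signature, each segment base64url-encoded without padding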
"""getting the current user details"""
async def get_current_user(token: str = Depends(oauth2_scheme)) -> User:
credentials_exception = HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Could not validate credentials",
headers={"WWW-Authenticate": "Bearer"},
)
try:
payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
username: str = payload.get("sub")
if username is None:
raise credentials_exception
token_data = TokenData(username=username)
except JWTError:
raise credentials_exception
user = await get_user(username=token_data.username)
if not user:
raise credentials_exception
return user
"""Checking users if they are active or not"""
async def get_current_active_user(
current_user: User = Depends(get_current_user)
) -> User:
if current_user.disabled:
raise HTTPException(status_code=400, detail="Inactive user")
return current_user
def testFunc():
return "Hello"
| 26.929825 | 88 | 0.727687 | 370 | 3,070 | 5.87027 | 0.345946 | 0.030387 | 0.015193 | 0.024862 | 0.077348 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023219 | 0.186319 | 3,070 | 113 | 89 | 27.168142 | 0.846277 | 0 | 0 | 0.121622 | 0 | 0 | 0.062244 | 0.021888 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054054 | false | 0.121622 | 0.148649 | 0.040541 | 0.418919 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
# eed6fb2923d095b3cf5b34bdf82bbffb01e0753a | avg.py | subhendu17620/codechef | MIT
for t in range(int(input())):
    n, k, v = map(int, input().split())
    a = list(map(int, input().split()))
    s = sum(a)
    x = ((n + k) * v - s) / k
    if x > 0 and x.is_integer():
        print(int(x))
    else:
        print(-1)
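The loop solves for x in (s + k·x)/(n + k) = v, i.e. x = ((n + k)·v − s)/k, printing −1 when x is not a positive integer. The same logic as a hypothetical standalone function (the name `needed_value` is an assumption for illustration):

```python
def needed_value(a, k, v):
    """Value x such that appending k copies of x to list a gives mean v.

    Returns the positive integer x, or -1 if no such integer exists.
    """
    n = len(a)
    s = sum(a)
    x = ((n + k) * v - s) / k
    if x > 0 and x.is_integer():
        return int(x)
    return -1

print(needed_value([1, 2, 3], 3, 4))  # 6, since (6 + 3*6) / 6 == 4
print(needed_value([1, 2, 3], 1, 1))  # -1, the required x would be negative
print(needed_value([1, 2], 2, 1))     # -1, the required x is not an integer
```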
# eedaaadaf698ed98d36ec2ed6561901b827de588 | codes/Python/mqtt_pub.py | abhaysbharadwaj/pscmr_vijayawada | MIT
import paho.mqtt.client as mqtt
import time
from random import randint
statusTopic = 'abhay/home/room1/status'
tempTopic = 'abhay/home/room1/temp'
humdTopic = 'abhay/home/room1/humd'
mqttBrokerIp = '23.20.0.61'
mqttBrokerPort = 1883
client = mqtt.Client()
client.connect(mqttBrokerIp, mqttBrokerPort)  # 52.204.21.160
print("Connected to MQTT")
client.publish(statusTopic, '1')
while True:
    x = randint(10, 26)
    print(x)
    client.publish(tempTopic, str(x))
    y = randint(0, 100)
    print(y)
    client.publish(humdTopic, str(y))
    print("Publish done")
    time.sleep(5)
# eeeb81ab94ba964715b0fef143338ba1f8bf0c4a | Indexer/EacCpf.py | atbradley/eaccpf-indexer | Apache-2.0
"""
This file is subject to the terms and conditions defined in the
LICENSE file, which is part of this source code package.
"""
from DigitalObject import DigitalObject
from lxml import etree
import Cfg
import Utils
import hashlib
import logging
import os
# namespaces
DOC_KEY = "doc"
DOC_NS = "urn:isbn:1-931666-33-4"
ESRC_KEY = "ns0"
ESRC_NS = "http://www.esrc.unimelb.edu.au"
XLINK_KEY = "xlink"
XLINK_NS = "http://www.w3.org/1999/xlink"
XSI_KEY = "xsi"
XSI_NS = "http://www.w3.org/2001/XMLSchema-instance"
class EacCpf(object):
"""
EAC-CPF documents provide metadata and references to external entities
that are the subject of indexing. This class wraps the EAC-CPF document
and provides convenience methods for extracting required metadata. The
content of an EAC-CPF document is typically presented by a separate HTML
document, referred to here as the presentation.
"""
def __init__(self, Source, MetadataUrl=None, PresentationUrl=None):
"""
Source is a file system path or URL to the EAC-CPF document file. The
Source is used to load the content of the document. MetadataUrl is the
public URL to the EAC-CPF document. PresentationUrl is the public URL
to the HTML presentation.
"""
self.log = logging.getLogger()
self.metadata = MetadataUrl
self.ns = { DOC_KEY: DOC_NS, ESRC_KEY: ESRC_NS, XLINK_KEY: XLINK_NS }
self.presentation = PresentationUrl
self.source = Source
data = Utils.load_from_source(Source)
self.xml = etree.fromstring(data)
# some documents may be missing the fully specified eac-cpf document
# namespace attributes, which will result in failures during subsequent
# operations. we'll check for the missing attribute here so that we can
# make the problem and its resolution obvious in the log
root = self.xml.xpath('//doc:eac-cpf', namespaces=self.ns)
if len(root) == 0:
self.log.error("Missing EAC-CPF namespace declaration in {0}".format(Source))
raise Exception
def getAbstract(self):
"""
Get document abstract.
"""
try:
abstract = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:biogHist/doc:abstract", namespaces=self.ns)
return abstract[0].text if abstract[0].text else None
except:
pass
def getBiogHist(self):
"""
Get the non-abstract portion of the biogHist entry.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:biogHist/doc:p", namespaces=self.ns)
if val:
ps = []
for p in val:
if p.text is not None:
ps.append(p.text)
return ' '.join(ps)
except:
pass
return None
def getCpfRelations(self):
"""
Get list of CPF relations.
"""
rels = []
try:
cpfr = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:relations/doc:cpfRelation", namespaces=self.ns)
rels.extend(cpfr)
except:
pass
return rels
def getCpfRelationLinks(self):
"""
"""
links = []
target = "{{{0}}}href".format(XLINK_NS)
try:
rels = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:relations/doc:cpfRelation", namespaces=self.ns)
for rel in rels:
for attr in rel.attrib:
if target in attr:
url = rel.attrib[attr]
relationEntry = rel.xpath("./doc:relationEntry[1]", namespaces=self.ns)
if relationEntry and len(relationEntry) > 0:
links.append((url, relationEntry[0].text))
except:
pass
return links
def getData(self):
"""
Get the raw XML data.
"""
return etree.tostring(self.xml, pretty_print=True)
def getDigitalObjects(self, Thumbnail=False):
"""
Get the list of digital objects referenced in the document. Transform
the metadata contained in the HTML page to an intermediate YML digital
object representation.
"""
dobjects = []
rels = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:relations/doc:resourceRelation", namespaces=self.ns)
for rel in rels:
try:
if rel.attrib['resourceRelationType'] == 'other':
relEntry = rel.xpath("./doc:relationEntry", namespaces=self.ns)
descNote = rel.xpath("./doc:descriptiveNote/doc:p", namespaces=self.ns)
if relEntry[0].attrib['localType'] == 'digitalObject':
# if the descriptiveNote does not contain the string "<p>Include in Gallery</p>",
# then it is not a thumbnail for this record
if Thumbnail and len(descNote) > 0 and not "Include in Gallery" in descNote[0].text:
continue
nz = {
"doc": "urn:isbn:1-931666-33-4",
"obj": "urn:isbn:1-931666-22-9",
}
# ISSUE #30 in some cases, the title string contains
# markup in it, which results in only a portion of the
# title string being returned. Here we concat the text
# content of all the child nodes together to create a
# single title string
title = ''
title_elements = rel.xpath("./doc:relationEntry", namespaces=self.ns)
if title_elements:
for e in title_elements.pop().itertext():
title += e
# ISSUE #30: abstract may contain markup. concat all
# the child elements on to the abstract value.
abstract = ''
abstract_elements = rel.xpath("./doc:objectXMLWrap/obj:archref/obj:abstract", namespaces=nz)
if abstract_elements:
for e in abstract_elements.pop().itertext():
abstract += e
alternate_title = self.getTitle()
localtype = self.getLocalType()
presentation = rel.attrib['{http://www.w3.org/1999/xlink}href']
unitdate = rel.xpath("./doc:objectXMLWrap/obj:archref/obj:unitdate", namespaces=nz)
# create the digital object
if unitdate and not hasattr(unitdate, 'lower'):
unitdate = unitdate[0].text
dobj = DigitalObject(self.source, self.metadata, presentation, title, abstract, localtype, UnitDate=unitdate, AlternateTitle=alternate_title)
else:
fromDate, toDate = self.getExistDates()
dobj = DigitalObject(self.source, self.metadata, presentation, title, abstract, localtype, FromDate=fromDate, ToDate=toDate, AlternateTitle=alternate_title)
dobjects.append(dobj)
except:
self.log.error("Could not retrieve digital object {0}".format(self.source), exc_info=Cfg.LOG_EXC_INFO)
return dobjects
def getEntityId(self):
"""
Get the record entity Id. If a value can not be found None is returned.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:identity/doc:entityId", namespaces=self.ns)
return val[0].text if val[0].text else None
except:
pass
def getEntityType(self):
"""
Get the entity type.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:identity/doc:entityType", namespaces=self.ns)
return val[0].text if val[0].text else None
except:
pass
def getExistDates(self):
"""
Get entity exist dates. Returns 'from date', 'to date' tuple.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:existDates", namespaces=self.ns)
if val:
fromDate = val[0].xpath("./doc:dateRange/doc:fromDate", namespaces=self.ns)
toDate = val[0].xpath("./doc:dateRange/doc:toDate", namespaces=self.ns)
if fromDate and len(fromDate) > 0 and 'standardDate' in fromDate[0].attrib:
fromDate = fromDate[0].attrib['standardDate']
else:
fromDate = None
if toDate and len(toDate) > 0 and 'standardDate' in toDate[0].attrib:
toDate = toDate[0].attrib['standardDate']
else:
toDate = None
# ensure dates are in ISO format
if fromDate and not 'T00:00:00Z' in fromDate:
fromDate += "T00:00:00Z"
if toDate and not 'T00:00:00Z' in toDate:
toDate += "T00:00:00Z"
return fromDate, toDate
except:
pass
return None, None
def getFileName(self):
"""
Get document file name.
"""
return Utils.getFileName(self.source)
def getFreeText(self):
"""
Get content from free text fields.
"""
freeText = ''
names = self.getNameEntries()
if names:
freeText = ' '.join(names)
abstract = self.getAbstract()
if abstract:
freeText += self.getAbstract() + ' '
biog = self.getBiogHist()
if biog:
freeText += biog + ' '
functions = self.getFunctions()
if functions:
freeText += ' '.join(functions)
return freeText
def getFunctions(self):
"""
Get the functions.
"""
functions = []
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:functions/doc:function/doc:term", namespaces=self.ns)
for func in val:
if func.text is not None:
functions.append(func.text)
return functions
except:
pass
return functions
def getHash(self):
"""
Get a secure hash for the content in hexadecimal format.
"""
h = hashlib.sha1()
data = etree.tostring(self.xml)
h.update(data)
return h.hexdigest()
def getLocalType(self):
"""
Get the local type.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:control/doc:localControl/doc:term", namespaces=self.ns)
return val[0].text if val[0].text else None
except:
pass
def getLocations(self):
"""
Get locations.
"""
locations = []
try:
places = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:places/doc:place", namespaces=self.ns)
for place in places:
location = {}
placeEntry = place.xpath("./doc:placeEntry", namespaces=self.ns)
if placeEntry:
location['placeentry'] = placeEntry[0].text
if 'latitude' in placeEntry[0].attrib:
location['latitude'] = placeEntry[0].attrib['latitude']
if 'longitude' in placeEntry[0].attrib:
location['longitude'] = placeEntry[0].attrib['longitude']
locations.append(location)
except:
pass
return locations
def getChronLocations(self):
"""
Get locations.
"""
locations = []
try:
chronItems = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:description/doc:biogHist/doc:chronList/doc:chronItem", namespaces=self.ns)
for chronItem in chronItems:
location = {}
fromDate = chronItem.xpath("./doc:dateRange/doc:fromDate", namespaces=self.ns)
toDate = chronItem.xpath("./doc:dateRange/doc:toDate", namespaces=self.ns)
if fromDate and len(fromDate) > 0 and 'standardDate' in fromDate[0].attrib:
fromDate = fromDate[0].attrib['standardDate']
fromDate = Utils.fixIncorrectDateEncoding(fromDate)
location['fromDate'] = fromDate
if toDate and len(toDate) and 'standardDate' in toDate[0].attrib:
toDate = toDate[0].attrib['standardDate']
toDate = Utils.fixIncorrectDateEncoding(toDate)
location['toDate'] = toDate
placeEntry = chronItem.xpath("./doc:placeEntry", namespaces=self.ns)
if placeEntry:
location['placeentry'] = placeEntry[0].text
if 'latitude' in placeEntry[0].attrib:
location['latitude'] = placeEntry[0].attrib['latitude']
if 'longitude' in placeEntry[0].attrib:
location['longitude'] = placeEntry[0].attrib['longitude']
event = chronItem.xpath("./doc:event", namespaces=self.ns)
if event:
location['event'] = event[0].text
locations.append(location)
except:
pass
return locations
def getMetadataUrl(self):
"""
Get the URL to the EAC-CPF document.
"""
try:
            if self.source.startswith(('http://', 'https://')):
return self.source
elif self.metadata:
return self.metadata
except:
pass
return None
def getNameEntries(self):
"""
Get name entry.
"""
names = []
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:identity/doc:nameEntry/doc:part", namespaces=self.ns)
for part in val:
for t in part.itertext():
names.append(t)
return names
except:
pass
return names
def getPresentationUrl(self):
"""
Get the URL to the HTML presentation of the EAC-CPF document.
"""
if self.presentation:
return self.presentation
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:identity/doc:entityId", namespaces=self.ns)
return val[0].text if val[0].text else None
except:
pass
def getRecordId(self):
"""
Get the record identifier.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:control/doc:recordId", namespaces=self.ns)
return val[0].text if val[0].text else None
except:
pass
def getResourceRelations(self):
"""
Get list of resource relations.
"""
rels = []
try:
val = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:relations/doc:resourceRelation", namespaces=self.ns)
rels.extend(val)
except:
pass
return rels
def getResourceRelationLinks(self):
"""
Get links from resource relation entries to external documents.
"""
links = []
target = "{{{0}}}href".format(XLINK_NS)
try:
rels = self.xml.xpath("//doc:eac-cpf/doc:cpfDescription/doc:relations/doc:resourceRelation", namespaces=self.ns)
for rel in rels:
for attr in rel.attrib:
if target in attr:
url = rel.attrib[attr]
relationEntry = rel.xpath("./doc:relationEntry[1]", namespaces=self.ns)
if relationEntry and len(relationEntry) > 0:
links.append((url, relationEntry[0].text))
except:
pass
return links
def getTitle(self):
"""
Get the record title.
"""
names = self.getNameEntries()
if names:
return ' '.join(names)
return None
def getThumbnail(self):
"""
Get the digital object that acts as a thumbnail image for this record.
"""
try:
obj = self.getDigitalObjects(Thumbnail=True)
return obj[0]
except:
return None
def hasDigitalObjects(self):
"""
Determine if the EAC-CPF record has digital object references.
"""
objects = self.getDigitalObjects()
if objects and len(objects) > 0:
return True
return False
def hasLocation(self):
"""
Determine if the record has a location.
"""
locations = self.getLocations()
if len(locations) > 0:
return True
return False
def hasMaintenanceRecord(self):
"""
Determine if the record has a maintenance history section.
"""
try:
val = self.xml.xpath("//doc:eac-cpf/doc:control/doc:maintenanceHistory/doc:maintenanceEvent", namespaces=self.ns)
if val and len(val) > 0:
return True
except:
pass
return False
def hasResourceRelations(self):
"""
Determine if the record has one or more resource relations.
"""
cr = self.getCpfRelations()
rr = self.getResourceRelations()
if cr and rr and len(cr) > 0 and len(rr) > 0:
return True
return False
def write(self, Path):
"""
Write the EAC-CPF data to the specified path. Add the metadata,
presentation source URLs as attributes to the eac-cpf node.
"""
# add the metadata and presentation source URLs to the eac-cpf node
root = self.xml.xpath('//doc:eac-cpf', namespaces=self.ns)
metadata = '{' + ESRC_NS + '}metadata'
presentation = '{' + ESRC_NS + '}presentation'
source = '{' + ESRC_NS + '}source'
root[0].set(metadata, self.metadata)
root[0].set(presentation, self.presentation)
root[0].set(source, self.source)
# write the data to the specified path
path = Path + os.sep + self.getFileName()
        with open(path, 'wb') as outfile:
data = etree.tostring(self.xml, pretty_print=True)
outfile.write(data)
self.log.info("Stored EAC-CPF document " + self.getFileName())
return path
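`getExistDates` normalizes extracted dates by appending `T00:00:00Z` when the ISO time component is missing. The same normalization, pulled out as a hypothetical standalone helper (the name `to_iso_datetime` is an assumption for illustration):

```python
def to_iso_datetime(date):
    """Append a zero UTC time component unless one is already present."""
    if date and 'T00:00:00Z' not in date:
        return date + 'T00:00:00Z'
    return date

print(to_iso_datetime('1901-05-21'))            # 1901-05-21T00:00:00Z
print(to_iso_datetime('1901-05-21T00:00:00Z'))  # unchanged
print(to_iso_datetime(None))                    # None passes through
```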
# eeed374847556e345a4199cee72d0448be3fae3c | nodes/ssn_node.py | aneeshads/EPAi4.0-Capstone | CNRI-Python
import bpy
from bpy.types import NodeTree, Node, NodeSocket
from ..customnodetree import MyCustomTreeNode
class MyCustomNodeSSN(Node, MyCustomTreeNode):
'''A custom node that accepts input of fake social security numbers (SSN) \
from the Faker library.'''
bl_idname = 'CustomNodeSSN'
bl_label = "Social Security Number"
bl_icon = 'ACTION'
def update_ssn_node_attribute(self, context):
'''This function updates the output of the current node.'''
self.outputs[0].ssn_include = self.ssn_include
ssn_include: bpy.props.BoolProperty(name="Include SSN", update=update_ssn_node_attribute)
def init(self, context):
'''This function creates the sockets that will house the input and the output \
nodes.'''
self.outputs.new('CustomSocketTypeSSN', "SSN")
self.ssn_include = True
def update(self):
'''This function connects the output of the current node to the input socket of the \
next node.'''
        if self.outputs[0].is_linked and self.outputs[0].links:
self.outputs[0].links[0].to_socket.ssn_include = self.outputs[0].ssn_include
print("**************** NODE: SSN *********************")
print("Input :: Include SSN :: ", self.ssn_include)
print("Output :: Social Security Number :: ", self.outputs[0].ssn_include)
print("***************************************************")
def copy(self, node):
'''This function facilitates the copying of a node.'''
print("Copying from node ", node)
def free(self):
'''This function facilitates the removal of a node.'''
print("Removing node ", self, ", Goodbye!")
# Additional buttons displayed on the node.
def draw_buttons(self, context, layout):
'''This function creates the labels for the display panels within the node.'''
layout.label(text="Social Security Number")
layout.prop(self, "ssn_include")
def draw_buttons_ext(self, context, layout):
pass
def draw_label(self):
'''This function determines the name of the node which appears at its head.'''
return "Social Security Number"
# eeed5850537376677fe6ee3bac5bb19233632501 | api/app/reviews/models.py | martinoywa/Baobab | Apache-2.0
from datetime import datetime
from app import db
class ReviewForm(db.Model):
id = db.Column(db.Integer(), primary_key=True)
application_form_id = db.Column(db.Integer(), db.ForeignKey('application_form.id'), nullable=False)
is_open = db.Column(db.Boolean(), nullable=False)
deadline = db.Column(db.DateTime(), nullable=False)
application_form = db.relationship('ApplicationForm', foreign_keys=[application_form_id])
review_questions = db.relationship('ReviewQuestion')
def __init__(self, application_form_id, deadline):
self.application_form_id = application_form_id
self.is_open = True
self.deadline = deadline
def close(self):
self.is_open = False
class ReviewQuestion(db.Model):
id = db.Column(db.Integer, primary_key=True)
review_form_id = db.Column(db.Integer(), db.ForeignKey('review_form.id'), nullable=False)
question_id = db.Column(db.Integer(), db.ForeignKey('question.id'), nullable=True)
description = db.Column(db.String(), nullable=True)
headline = db.Column(db.String(), nullable=True)
type = db.Column(db.String(), nullable=False)
placeholder = db.Column(db.String(), nullable=True)
options = db.Column(db.JSON(), nullable=True)
is_required = db.Column(db.Boolean(), nullable=False)
order = db.Column(db.Integer(), nullable=False)
validation_regex = db.Column(db.String(), nullable=True)
validation_text = db.Column(db.String(), nullable=True)
weight = db.Column(db.Float(), nullable=False)
review_form = db.relationship('ReviewForm', foreign_keys=[review_form_id])
question = db.relationship('Question', foreign_keys=[question_id])
def __init__(self,
review_form_id,
question_id,
description,
headline,
type,
placeholder,
options,
is_required,
order,
validation_regex,
validation_text,
weight):
self.review_form_id = review_form_id
self.question_id = question_id
self.description = description
self.headline = headline
self.type = type
self.placeholder = placeholder
self.options = options
self.is_required = is_required
self.order = order
self.validation_regex = validation_regex
self.validation_text = validation_text
self.weight = weight
class ReviewResponse(db.Model):
id = db.Column(db.Integer(), primary_key=True)
review_form_id = db.Column(db.Integer(), db.ForeignKey('review_form.id'), nullable=False)
reviewer_user_id = db.Column(db.Integer(), db.ForeignKey('app_user.id'), nullable=False)
response_id = db.Column(db.Integer(), db.ForeignKey('response.id'), nullable=False)
submitted_timestamp = db.Column(db.DateTime(), nullable=False)
review_form = db.relationship('ReviewForm', foreign_keys=[review_form_id])
reviewer_user = db.relationship('AppUser', foreign_keys=[reviewer_user_id])
response = db.relationship('Response', foreign_keys=[response_id])
review_scores = db.relationship('ReviewScore')
def __init__(self,
review_form_id,
reviewer_user_id,
response_id):
self.review_form_id = review_form_id
self.reviewer_user_id = reviewer_user_id
self.response_id = response_id
self.submitted_timestamp = datetime.now()
class ReviewScore(db.Model):
id = db.Column(db.Integer(), primary_key=True)
review_response_id = db.Column(db.Integer(), db.ForeignKey('review_response.id'), nullable=False)
review_question_id = db.Column(db.Integer(), db.ForeignKey('review_question.id'), nullable=False)
value = db.Column(db.String(), nullable=False)
review_response = db.relationship('ReviewResponse', foreign_keys=[review_response_id])
review_question = db.relationship('ReviewQuestion', foreign_keys=[review_question_id])
def __init__(self,
review_question_id,
value):
self.review_question_id = review_question_id
self.value = value
class ReviewConfiguration(db.Model):
id = db.Column(db.Integer(), primary_key=True)
review_form_id = db.Column(db.Integer(), db.ForeignKey('review_form.id'), nullable=False)
num_reviews_required = db.Column(db.Integer(), nullable=False)
num_optional_reviews = db.Column(db.Integer(), nullable=False)
drop_optional_question_id = db.Column(db.Integer(), db.ForeignKey('review_question.id'), nullable=True)
drop_optional_agreement_values = db.Column(db.String(), nullable=True)
review_form = db.relationship('ReviewForm', foreign_keys=[review_form_id])
review_question = db.relationship('ReviewQuestion', foreign_keys=[drop_optional_question_id])
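`ReviewQuestion` carries a `weight` and each `ReviewScore` holds a value, which suggests an overall review mark is some weighted combination. The models themselves do not define that aggregation, so the helper below is purely a hypothetical sketch — the name `weighted_mean` and the choice of a weighted average are assumptions, not Baobab's actual scoring rule:

```python
def weighted_mean(scores, weights):
    """Weighted mean of numeric scores; weights need not sum to 1."""
    total_weight = sum(weights)
    if total_weight == 0:
        raise ValueError("weights must not all be zero")
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Two questions: the first weighted 3x as heavily as the second.
print(weighted_mean([4, 2], [3.0, 1.0]))  # 3.5
```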
eeeed8395b6cdcbae78e475e2cea1bc6d9944357 | 1,397 | py | Python | cli.py | shiftsayan/plane | 008ea5d5bbe82d9b08d5a6cd56fd9071ee0c3e2b | [
"MIT"
] | null | null | null | cli.py | shiftsayan/plane | 008ea5d5bbe82d9b08d5a6cd56fd9071ee0c3e2b | [
"MIT"
] | 1 | 2021-10-02T18:43:46.000Z | 2021-10-02T18:44:45.000Z | cli.py | shiftsayan/plane | 008ea5d5bbe82d9b08d5a6cd56fd9071ee0c3e2b | [
"MIT"
] | 2 | 2020-09-26T02:31:31.000Z | 2021-09-23T21:22:52.000Z | from PyInquirer import prompt
from schema import schema
from profiles import profiles
def prompt_schema():
questions = [
{
'type': 'list',
'name': 'schema',
'message': "What type of email do you want to send?",
'choices': [ ps.id for ps in schema ],
}
]
answers = prompt(questions)
for ps in schema:
if ps.id == answers['schema']:
return ps
def prompt_profile():
# return profiles[1] # TODO: remove hardcode testing
questions = [
{
'type': 'list',
'name': 'profile',
'message': "Which profile do you want to use?",
'choices': [ profile.id for profile in profiles ],
}
]
answers = prompt(questions)
for profile in profiles:
if profile.id == answers['profile']:
return profile
def prompt_confirm(message="Do you want to confirm?"):
questions = [
{
'type': 'confirm',
'name': 'confirm',
'message': message,
}
]
answers = prompt(questions)
return answers['confirm']
def prompt_subject():
questions = [
{
'type': 'input',
'name': 'subject',
'message': "What is the subject of the email?",
}
]
answers = prompt(questions)
return answers['subject']
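All four functions above follow the same pattern: build a one-question list, call `prompt`, and unpack the single answer. That shape could be factored into one helper; the sketch below is a hypothetical, library-free illustration of a list question (the `ask_list` name and the injectable `input_fn` parameter are assumptions made so the helper can be exercised without a TTY):

```python
def ask_list(message, choices, input_fn=input):
    """Print numbered choices and return the selected one."""
    print(message)
    for i, choice in enumerate(choices, 1):
        print(f"{i}. {choice}")
    while True:
        raw = input_fn("> ")
        if raw.isdigit() and 1 <= int(raw) <= len(choices):
            return choices[int(raw) - 1]
        print("Please enter a number from the list.")

# Non-interactive demonstration with a canned answer:
picked = ask_list("Which profile?", ["work", "personal"], input_fn=lambda _: "2")
print(picked)  # personal
```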
# e10176b41a0d6addbcd1d97e5f90c0f957932b32 | python/protobufs/services/post/actions/add_to_collections_pb2.py | getcircle/protobuf-registry | MIT
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: protobufs/services/post/actions/add_to_collections.proto
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from protobufs.services.post import containers_pb2 as protobufs_dot_services_dot_post_dot_containers__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='protobufs/services/post/actions/add_to_collections.proto',
package='services.post.actions.add_to_collections',
syntax='proto3',
serialized_pb=b'\n8protobufs/services/post/actions/add_to_collections.proto\x12(services.post.actions.add_to_collections\x1a(protobufs/services/post/containers.proto\"\x82\x01\n\tRequestV1\x12\x38\n\x04item\x18\x01 \x01(\x0b\x32*.services.post.containers.CollectionItemV1\x12;\n\x0b\x63ollections\x18\x02 \x03(\x0b\x32&.services.post.containers.CollectionV1\"G\n\nResponseV1\x12\x39\n\x05items\x18\x01 \x03(\x0b\x32*.services.post.containers.CollectionItemV1b\x06proto3'
,
dependencies=[protobufs_dot_services_dot_post_dot_containers__pb2.DESCRIPTOR,])
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_REQUESTV1 = _descriptor.Descriptor(
name='RequestV1',
full_name='services.post.actions.add_to_collections.RequestV1',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='item', full_name='services.post.actions.add_to_collections.RequestV1.item', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='collections', full_name='services.post.actions.add_to_collections.RequestV1.collections', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=145,
serialized_end=275,
)
_RESPONSEV1 = _descriptor.Descriptor(
name='ResponseV1',
full_name='services.post.actions.add_to_collections.ResponseV1',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='items', full_name='services.post.actions.add_to_collections.ResponseV1.items', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=277,
serialized_end=348,
)
_REQUESTV1.fields_by_name['item'].message_type = protobufs_dot_services_dot_post_dot_containers__pb2._COLLECTIONITEMV1
_REQUESTV1.fields_by_name['collections'].message_type = protobufs_dot_services_dot_post_dot_containers__pb2._COLLECTIONV1
_RESPONSEV1.fields_by_name['items'].message_type = protobufs_dot_services_dot_post_dot_containers__pb2._COLLECTIONITEMV1
DESCRIPTOR.message_types_by_name['RequestV1'] = _REQUESTV1
DESCRIPTOR.message_types_by_name['ResponseV1'] = _RESPONSEV1
RequestV1 = _reflection.GeneratedProtocolMessageType('RequestV1', (_message.Message,), dict(
DESCRIPTOR = _REQUESTV1,
__module__ = 'protobufs.services.post.actions.add_to_collections_pb2'
# @@protoc_insertion_point(class_scope:services.post.actions.add_to_collections.RequestV1)
))
_sym_db.RegisterMessage(RequestV1)
ResponseV1 = _reflection.GeneratedProtocolMessageType('ResponseV1', (_message.Message,), dict(
DESCRIPTOR = _RESPONSEV1,
__module__ = 'protobufs.services.post.actions.add_to_collections_pb2'
# @@protoc_insertion_point(class_scope:services.post.actions.add_to_collections.ResponseV1)
))
_sym_db.RegisterMessage(ResponseV1)
# @@protoc_insertion_point(module_scope)
| 37.378151 | 472 | 0.786646 | 551 | 4,448 | 6.00363 | 0.214156 | 0.068924 | 0.080411 | 0.093108 | 0.62364 | 0.57769 | 0.528114 | 0.510278 | 0.480653 | 0.369408 | 0 | 0.03355 | 0.102068 | 4,448 | 118 | 473 | 37.694915 | 0.794692 | 0.083858 | 0 | 0.536842 | 1 | 0.010526 | 0.257375 | 0.228368 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.063158 | 0 | 0.063158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e102b0eed20ef03e1186cc1cda6e7219fc900815 | 420 | py | Python | tests/models.py | padeny/tastypie_api | 696a17535d921fabe35d693565684803d39c451a | [
"MIT"
] | 2 | 2019-07-10T12:09:25.000Z | 2019-07-10T12:09:26.000Z | tests/models.py | padeny/tastypie_api | 696a17535d921fabe35d693565684803d39c451a | [
"MIT"
] | 4 | 2020-06-05T21:24:48.000Z | 2021-11-08T00:57:37.000Z | tests/models.py | padeny/tastypie_api | 696a17535d921fabe35d693565684803d39c451a | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
class Entry(models.Model):
user = models.ForeignKey(User, on_delete=models.CASCADE, null=True)
title = models.CharField(max_length=128, unique=True)
slug = models.CharField(max_length=128)
created = models.DateTimeField()
    # ImageField would require Pillow, so a plain FileField is used instead
image = models.FileField(upload_to='tests/images', null=True, blank=True)
| 35 | 77 | 0.747619 | 57 | 420 | 5.438596 | 0.631579 | 0.064516 | 0.116129 | 0.154839 | 0.174194 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016667 | 0.142857 | 420 | 11 | 78 | 38.181818 | 0.844444 | 0.052381 | 0 | 0 | 0 | 0 | 0.030303 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
e10a9c0c234a42a9fa6849dd35ed3d7aa98e2198 | 989 | bzl | Python | src/cpp/model_benchmark.bzl | SanggunLee/edgetpu | d3cf166783265f475c1ddba5883e150ee84f7bfe | [
"Apache-2.0"
] | 320 | 2019-09-19T07:10:48.000Z | 2022-03-12T01:48:56.000Z | src/cpp/model_benchmark.bzl | Machine-Learning-Practice/edgetpu | 6d699665efdc5d84944b5233223c55fe5d3bd1a6 | [
"Apache-2.0"
] | 563 | 2019-09-27T06:40:40.000Z | 2022-03-31T23:12:15.000Z | src/cpp/model_benchmark.bzl | Machine-Learning-Practice/edgetpu | 6d699665efdc5d84944b5233223c55fe5d3bd1a6 | [
"Apache-2.0"
] | 119 | 2019-09-25T02:51:10.000Z | 2022-03-03T08:11:12.000Z | """Generate model benchmark source file using template.
"""
_TEMPLATE = "//src/cpp:models_benchmark.cc.template"
def _generate_models_benchmark_src_impl(ctx):
ctx.actions.expand_template(
template = ctx.file._template,
output = ctx.outputs.source_file,
substitutions = {
"{BENCHMARK_NAME}": ctx.attr.benchmark_name,
"{TFLITE_CPU_FILEPATH}": ctx.attr.tflite_cpu_filepath,
"{TFLITE_EDGETPU_FILEPATH}": ctx.attr.tflite_edgetpu_filepath,
},
)
generate_models_benchmark_src = rule(
implementation = _generate_models_benchmark_src_impl,
attrs = {
"benchmark_name": attr.string(mandatory = True),
"tflite_cpu_filepath": attr.string(mandatory = True),
"tflite_edgetpu_filepath": attr.string(mandatory = True),
"_template": attr.label(
default = Label(_TEMPLATE),
allow_single_file = True,
),
},
outputs = {"source_file": "%{name}.cc"},
)
| 32.966667 | 74 | 0.652174 | 104 | 989 | 5.836538 | 0.336538 | 0.098847 | 0.113674 | 0.128501 | 0.258649 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228514 | 989 | 29 | 75 | 34.103448 | 0.795544 | 0.052578 | 0 | 0 | 1 | 0 | 0.2 | 0.115054 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
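The rule can then be loaded from a BUILD file. The target below is a hypothetical usage (target name and model paths are illustrative); per the rule's outputs attribute it would generate mobilenet_benchmark_src.cc:

```python
# BUILD file (Starlark) -- hypothetical usage of generate_models_benchmark_src
load("//src/cpp:model_benchmark.bzl", "generate_models_benchmark_src")

generate_models_benchmark_src(
    name = "mobilenet_benchmark_src",  # output file becomes mobilenet_benchmark_src.cc
    benchmark_name = "MobilenetBenchmark",
    tflite_cpu_filepath = "test_data/mobilenet_cpu.tflite",
    tflite_edgetpu_filepath = "test_data/mobilenet_edgetpu.tflite",
)
```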
e10e3c3e8fbfcd2aafba548c2fbd85769b01995c | 422 | py | Python | circle/migrations/0028_person_tags.py | Acids-Bases/Marketplace | 31f42a077279d891cdb6bb86abb2b8c6e841a889 | [
"MIT"
] | null | null | null | circle/migrations/0028_person_tags.py | Acids-Bases/Marketplace | 31f42a077279d891cdb6bb86abb2b8c6e841a889 | [
"MIT"
] | null | null | null | circle/migrations/0028_person_tags.py | Acids-Bases/Marketplace | 31f42a077279d891cdb6bb86abb2b8c6e841a889 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.5 on 2021-08-09 15:44
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('circle', '0027_auto_20210801_1047'),
]
operations = [
migrations.AddField(
model_name='person',
name='tags',
field=models.ManyToManyField(blank=True, related_name='tagz', to='circle.Tag'),
),
]
| 22.210526 | 91 | 0.606635 | 47 | 422 | 5.340426 | 0.829787 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100324 | 0.267773 | 422 | 18 | 92 | 23.444444 | 0.711974 | 0.106635 | 0 | 0 | 1 | 0 | 0.141333 | 0.061333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1140990f95743f4d6963bba2460dc66551dff80 | 783 | py | Python | examples/perceptron.py | 7enTropy7/ravml | de24fd5b65ebb5f045b3f49a4eb0c09b6601595e | [
"MIT"
] | 7 | 2021-05-06T12:27:19.000Z | 2021-11-15T09:58:32.000Z | examples/perceptron.py | 7enTropy7/ravml | de24fd5b65ebb5f045b3f49a4eb0c09b6601595e | [
"MIT"
] | 3 | 2021-04-12T08:32:18.000Z | 2021-05-31T11:18:41.000Z | examples/perceptron.py | 7enTropy7/ravml | de24fd5b65ebb5f045b3f49a4eb0c09b6601595e | [
"MIT"
] | 8 | 2021-03-19T07:36:41.000Z | 2021-09-24T04:34:41.000Z | from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder
from sklearn.preprocessing import StandardScaler
import ravop.core as R
from ravml.linear.perceptron import Perceptron
iris_data = load_iris()
x_ = iris_data.data
y_ = iris_data.target.reshape(-1, 1)  # Reshape the targets into a single column
sc = StandardScaler()
x_ = sc.fit_transform(x_)
encoder = OneHotEncoder(sparse=False)
y_ = encoder.fit_transform(y_)
X_train, X_test, y_train, y_test = train_test_split(x_, y_, shuffle=True, test_size=0.3)
model = Perceptron(input_dims=4, hidden_dims=10, output_dims=3)
model.fit(X_train, y_train, alpha=0.01, epoch=3)
pr = model.predict(X_test[1])
print('Prediction:', pr)
model.plot_metrics() | 27.964286 | 88 | 0.782886 | 125 | 783 | 4.648 | 0.464 | 0.075732 | 0.048193 | 0.10327 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018786 | 0.11622 | 783 | 28 | 89 | 27.964286 | 0.820809 | 0.039591 | 0 | 0 | 0 | 0 | 0.01731 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.315789 | 0 | 0.315789 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
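Internally, each perceptron unit computes a weighted sum of its inputs plus a bias, then applies an activation. A minimal pure-Python sketch, independent of ravml, with hand-picked weights for illustration:

```python
def step(z):
    # Heaviside step activation
    return 1 if z >= 0 else 0

def perceptron_forward(inputs, weights, bias):
    # Weighted sum of the inputs plus bias, squashed by the activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(z)

# An AND gate realised with hand-picked weights
weights, bias = [1.0, 1.0], -1.5
print(perceptron_forward([1, 1], weights, bias))  # 1
print(perceptron_forward([1, 0], weights, bias))  # 0
```

The ravml Perceptron above stacks such units into hidden and output layers and learns the weights via fit() instead of fixing them by hand.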
e11455c8dcd5dceff5966d7acd171e55b042079a | 1,554 | py | Python | src/tools/name.py | MorganeAudrain/Calcium_analysis | ea1ac64c5e021fe59f953e3ec09ac5a1abbcf871 | [
"MIT"
] | null | null | null | src/tools/name.py | MorganeAudrain/Calcium_analysis | ea1ac64c5e021fe59f953e3ec09ac5a1abbcf871 | [
"MIT"
] | null | null | null | src/tools/name.py | MorganeAudrain/Calcium_analysis | ea1ac64c5e021fe59f953e3ec09ac5a1abbcf871 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
@author: Morgane
This program creates an Excel sheet where all the behavioral paths are stored.
"""
# Imports
import pandas as pd
df = pd.read_excel(r'calcium_analysis_checked_videos.xlsx')
mouse = pd.DataFrame(df, columns=['mouse'])
date = pd.DataFrame(df, columns=['date'])
trial = pd.DataFrame(df, columns=['trial'])
datetime = pd.DataFrame(df, columns=['timestamp'], dtype='float64')
path = pd.DataFrame(df, columns=['Calcium_video'])
date_b = pd.DataFrame(df, columns=['date_bis'], dtype='str')
df['behavioral_path'] = path
b_path = pd.DataFrame(df, columns=['behavioral_path'])
for i in range(0,len(mouse)):
    mouse1 = mouse.iloc[i]
    date1 = date.iloc[i]
    trial1 = trial.iloc[i]
    datetime1 = datetime.iloc[i]
    path0 = path.iloc[i]
    date_b0 = date_b.iloc[i]
    b_path0 = b_path.iloc[i]
data=[]
data1=[]
for p in path0:
for m in mouse1:
            for t in trial1:
for d in date1:
for db in date_b0:
for dt in datetime1:
for bp in b_path0:
data=p
                                # Keep everything up to the last '/' (the directory part);
                                # this also avoids shadowing the outer loop variable i
                                path1 = data[:data.rfind('/')]
file_name=f'{d}_{m}_Trial{t}_{db}-%.f_0000.avi' % dt
file=path1+'/'+file_name
print(file)
| 33.06383 | 84 | 0.521879 | 200 | 1,554 | 3.945 | 0.395 | 0.097592 | 0.115336 | 0.17744 | 0.159696 | 0.038023 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.346847 | 1,554 | 46 | 85 | 33.782609 | 0.748768 | 0.100386 | 0 | 0 | 0 | 0 | 0.112473 | 0.050469 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.028571 | 0 | 0.028571 | 0.028571 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
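The manual scan for the last '/' rebuilds the directory part of the video path; os.path does the same portably. A standalone sketch with illustrative stand-in values for one spreadsheet row:

```python
import os.path

# Illustrative stand-ins for one row of the spreadsheet
calcium_video = "videos/mouse32/Trial5/calcium_0000.avi"
d, m, t, db, dt = "2021_08_09", "32363", "5", "2021-08-09", 1628516640.0

directory = os.path.dirname(calcium_video)
file_name = f"{d}_{m}_Trial{t}_{db}-%.f_0000.avi" % dt  # %.f prints the timestamp without decimals
behavioral_path = os.path.join(directory, file_name)
print(behavioral_path)
```

os.path.dirname replaces the character-by-character search, and os.path.join replaces the manual '/' concatenation.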
e116916e8644e66c66dc9895052a5749be621e8c | 2,072 | py | Python | reminder_system/event_manager.py | ProneToAdjust/Reminder-System | fbed927f8ee1aba8371e1145764d5e53241994ca | [
"MIT"
] | null | null | null | reminder_system/event_manager.py | ProneToAdjust/Reminder-System | fbed927f8ee1aba8371e1145764d5e53241994ca | [
"MIT"
] | null | null | null | reminder_system/event_manager.py | ProneToAdjust/Reminder-System | fbed927f8ee1aba8371e1145764d5e53241994ca | [
"MIT"
] | null | null | null | from .google_calendar import GoogleCalendar
import datetime
from . import utils
class EventManager(GoogleCalendar):
def __init__(self, service_account_file):
super().__init__(service_account_file)
self.upcoming_events = []
self.unconfirmed_events = []
self.confirmed_events = []
def get_events_to_be_reminded(self):
upcoming_events = self.check_for_upcoming_events()
unconfirmed_events = self.unconfirmed_events
return upcoming_events, unconfirmed_events
def check_for_upcoming_events(self):
events = self.get_events()
# iterate through the events to add the events due to be reminded into the upcoming events list
for event in events:
mins_to_event = utils.get_mins_between_now_and_event(event)
# if the event is using the default reminder time
if(event['reminders']['useDefault']):
# default reminder time is 30 minutes
reminder_mins = 30
else:
reminder_mins = event['reminders']['overrides'][0]['minutes']
# check if the event is ready to be in a reminder
if (mins_to_event < reminder_mins) and (mins_to_event > 0):
if (event not in self.unconfirmed_events) and (event not in self.confirmed_events):
self.upcoming_events.append(event)
return self.upcoming_events
def move_upcoming_events_to_confirmed_events(self):
self.confirmed_events.extend(self.upcoming_events)
self.upcoming_events.clear()
def move_unconfirmed_events_to_confirmed_events(self):
self.confirmed_events.extend(self.unconfirmed_events)
self.unconfirmed_events.clear()
def move_upcoming_events_to_unconfirmed_events(self):
self.unconfirmed_events.extend(self.upcoming_events)
self.upcoming_events.clear()
    def clear_confirmed_list_of_past(self):
        # Iterate over a copy: calling remove() on the list being iterated
        # would skip the element immediately after each removal
        for event in list(self.confirmed_events):
            time_difference_mins = utils.get_mins_between_now_and_event(event)
            if time_difference_mins < 0:
                self.confirmed_events.remove(event)
if __name__ == "__main__":
event_manager = EventManager(service_account_file='credentials.json')
print(event_manager.check_for_upcoming_events())
pass | 31.393939 | 97 | 0.781853 | 287 | 2,072 | 5.28223 | 0.247387 | 0.138522 | 0.094987 | 0.058047 | 0.275066 | 0.191293 | 0.191293 | 0.191293 | 0.191293 | 0.14248 | 0 | 0.003917 | 0.137548 | 2,072 | 66 | 98 | 31.393939 | 0.844432 | 0.108591 | 0 | 0.130435 | 0 | 0 | 0.036896 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.152174 | false | 0.086957 | 0.065217 | 0 | 0.282609 | 0.021739 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
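A subtle Python pitfall relevant to clear_confirmed_list_of_past: removing items from a list while iterating over it skips elements, so the loop must run over a copy. A standalone illustration of the pattern:

```python
def remove_negatives(values):
    # list(values) makes a shallow copy, so .remove() on the original
    # cannot shift elements out from under the running iterator
    for v in list(values):
        if v < 0:
            values.remove(v)
    return values

print(remove_negatives([-1, -2, 3, -4, 5]))  # [3, 5]
```

Iterating over `values` directly would leave `-2` behind: after removing `-1`, the iterator advances past the shifted `-2`.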
e11ae01770df3546201a14c42658c6d78131d1fd | 742 | py | Python | server/openslides/assignments/migrations/0017_vote_to_y.py | Gersdorfa/OpenSlides | fca1e94759ba32de2e57fec083d8973d095f1cd0 | [
"MIT"
] | 3 | 2021-02-11T20:45:58.000Z | 2022-02-09T21:59:42.000Z | server/openslides/assignments/migrations/0017_vote_to_y.py | Gersdorfa/OpenSlides | fca1e94759ba32de2e57fec083d8973d095f1cd0 | [
"MIT"
] | 2 | 2021-11-02T15:48:16.000Z | 2022-03-02T08:38:19.000Z | server/openslides/assignments/migrations/0017_vote_to_y.py | DLRG-Jugend-NDS/OpenSlides | 03704e4852821ccd67fe23adb6e2c38b67d93732 | [
"MIT"
] | 3 | 2021-01-18T11:44:05.000Z | 2022-01-19T16:00:23.000Z | # Generated by Finn Stutzenstein on 2020-11-24 06:44
from django.db import migrations
def votes_to_y(apps, schema_editor):
AssignmentPoll = apps.get_model("assignments", "AssignmentPoll")
for poll in AssignmentPoll.objects.all():
changed = False
if poll.pollmethod == "votes":
poll.pollmethod = "Y"
changed = True
if poll.onehundred_percent_base == "votes":
poll.onehundred_percent_base = "Y"
changed = True
if changed:
poll.save(skip_autoupdate=True)
class Migration(migrations.Migration):
dependencies = [
("assignments", "0016_negative_votes"),
]
operations = [
migrations.RunPython(votes_to_y),
]
| 23.935484 | 68 | 0.628032 | 80 | 742 | 5.6625 | 0.6125 | 0.030905 | 0.03532 | 0.06181 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029685 | 0.273585 | 742 | 30 | 69 | 24.733333 | 0.810761 | 0.067385 | 0 | 0.1 | 1 | 0 | 0.097101 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.05 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
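The core of the votes_to_y data migration is a pure value mapping, which can be sketched independently of Django (the function name and tuple shape are illustrative):

```python
def migrate_poll(pollmethod, onehundred_percent_base):
    # Map the legacy "votes" value to "Y" in either field and report
    # whether anything changed, so callers can skip needless saves
    changed = False
    if pollmethod == "votes":
        pollmethod, changed = "Y", True
    if onehundred_percent_base == "votes":
        onehundred_percent_base, changed = "Y", True
    return pollmethod, onehundred_percent_base, changed

print(migrate_poll("votes", "YN"))  # ('Y', 'YN', True)
```

The migration applies the same mapping per poll and saves with skip_autoupdate=True only when a field actually changed.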
e11e4871477974dc938efcc467ec95e8585f0631 | 25,317 | py | Python | src/Sephrasto/UI/CharakterInfo.py | Ilaris-Tools/Sephrasto | 8574a5b45da8ebfa5f69a775066fd3136da1c718 | [
"MIT"
] | 1 | 2022-02-02T16:15:59.000Z | 2022-02-02T16:15:59.000Z | src/Sephrasto/UI/CharakterInfo.py | Ilaris-Tools/Sephrasto | 8574a5b45da8ebfa5f69a775066fd3136da1c718 | [
"MIT"
] | 1 | 2022-01-14T11:04:19.000Z | 2022-01-14T11:04:19.000Z | src/Sephrasto/UI/CharakterInfo.py | lukruh/Sephrasto | 8574a5b45da8ebfa5f69a775066fd3136da1c718 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'CharakterInfo.ui'
#
# Created by: PyQt5 UI code generator 5.15.6
#
# WARNING: Any manual changes made to this file will be lost when pyuic5 is
# run again. Do not edit this file unless you know what you are doing.
from PyQt5 import QtCore, QtGui, QtWidgets
class Ui_Form(object):
def setupUi(self, Form):
Form.setObjectName("Form")
Form.resize(974, 721)
self.gridLayout = QtWidgets.QGridLayout(Form)
self.gridLayout.setContentsMargins(20, 20, 20, 20)
self.gridLayout.setHorizontalSpacing(20)
self.gridLayout.setVerticalSpacing(10)
self.gridLayout.setObjectName("gridLayout")
self.verticalLayout_4 = QtWidgets.QVBoxLayout()
self.verticalLayout_4.setObjectName("verticalLayout_4")
self.labelEinstellungen = QtWidgets.QLabel(Form)
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.labelEinstellungen.setFont(font)
self.labelEinstellungen.setObjectName("labelEinstellungen")
self.verticalLayout_4.addWidget(self.labelEinstellungen)
self.groupBox_3 = QtWidgets.QGroupBox(Form)
self.groupBox_3.setTitle("")
self.groupBox_3.setObjectName("groupBox_3")
self.gridLayout_5 = QtWidgets.QGridLayout(self.groupBox_3)
self.gridLayout_5.setContentsMargins(20, 20, 20, 20)
self.gridLayout_5.setObjectName("gridLayout_5")
self.checkReq = QtWidgets.QCheckBox(self.groupBox_3)
self.checkReq.setChecked(True)
self.checkReq.setObjectName("checkReq")
self.gridLayout_5.addWidget(self.checkReq, 1, 0, 1, 2)
self.comboHausregeln = QtWidgets.QComboBox(self.groupBox_3)
self.comboHausregeln.setObjectName("comboHausregeln")
self.gridLayout_5.addWidget(self.comboHausregeln, 4, 1, 1, 1)
self.label_5 = QtWidgets.QLabel(self.groupBox_3)
self.label_5.setObjectName("label_5")
self.gridLayout_5.addWidget(self.label_5, 4, 0, 1, 1)
self.label_7 = QtWidgets.QLabel(self.groupBox_3)
self.label_7.setObjectName("label_7")
self.gridLayout_5.addWidget(self.label_7, 9, 0, 1, 1)
self.checkUeberPDF = QtWidgets.QCheckBox(self.groupBox_3)
self.checkUeberPDF.setObjectName("checkUeberPDF")
self.gridLayout_5.addWidget(self.checkUeberPDF, 3, 0, 1, 2)
self.label_6 = QtWidgets.QLabel(self.groupBox_3)
self.label_6.setObjectName("label_6")
self.gridLayout_5.addWidget(self.label_6, 6, 0, 1, 1)
self.checkFinanzen = QtWidgets.QCheckBox(self.groupBox_3)
self.checkFinanzen.setChecked(True)
self.checkFinanzen.setObjectName("checkFinanzen")
self.gridLayout_5.addWidget(self.checkFinanzen, 2, 0, 1, 2)
self.comboCharsheet = QtWidgets.QComboBox(self.groupBox_3)
self.comboCharsheet.setObjectName("comboCharsheet")
self.gridLayout_5.addWidget(self.comboCharsheet, 6, 1, 1, 1)
self.labelReload = QtWidgets.QLabel(self.groupBox_3)
self.labelReload.setStyleSheet("background-color: rgb(255, 255, 0); color: black;")
self.labelReload.setAlignment(QtCore.Qt.AlignLeading|QtCore.Qt.AlignLeft|QtCore.Qt.AlignVCenter)
self.labelReload.setWordWrap(True)
self.labelReload.setObjectName("labelReload")
self.gridLayout_5.addWidget(self.labelReload, 11, 0, 1, 2)
self.comboRegelnGroesse = QtWidgets.QComboBox(self.groupBox_3)
self.comboRegelnGroesse.setObjectName("comboRegelnGroesse")
self.comboRegelnGroesse.addItem("")
self.comboRegelnGroesse.addItem("")
self.comboRegelnGroesse.addItem("")
self.gridLayout_5.addWidget(self.comboRegelnGroesse, 9, 1, 1, 1)
self.checkRegeln = QtWidgets.QCheckBox(self.groupBox_3)
self.checkRegeln.setChecked(True)
self.checkRegeln.setTristate(False)
self.checkRegeln.setObjectName("checkRegeln")
self.gridLayout_5.addWidget(self.checkRegeln, 8, 0, 1, 2)
self.label_10 = QtWidgets.QLabel(self.groupBox_3)
self.label_10.setObjectName("label_10")
self.gridLayout_5.addWidget(self.label_10, 10, 0, 1, 1)
self.listRegelKategorien = QtWidgets.QListView(self.groupBox_3)
self.listRegelKategorien.setMaximumSize(QtCore.QSize(280, 80))
self.listRegelKategorien.setHorizontalScrollBarPolicy(QtCore.Qt.ScrollBarAlwaysOff)
self.listRegelKategorien.setObjectName("listRegelKategorien")
self.gridLayout_5.addWidget(self.listRegelKategorien, 10, 1, 1, 1)
self.verticalLayout_4.addWidget(self.groupBox_3)
spacerItem = QtWidgets.QSpacerItem(20, 20, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
self.verticalLayout_4.addItem(spacerItem)
self.labelEP = QtWidgets.QLabel(Form)
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.labelEP.setFont(font)
self.labelEP.setObjectName("labelEP")
self.verticalLayout_4.addWidget(self.labelEP)
self.groupBox_2 = QtWidgets.QGroupBox(Form)
self.groupBox_2.setTitle("")
self.groupBox_2.setObjectName("groupBox_2")
self.gridLayout_4 = QtWidgets.QGridLayout(self.groupBox_2)
self.gridLayout_4.setContentsMargins(20, 20, 20, 20)
self.gridLayout_4.setObjectName("gridLayout_4")
self.gridLayout_2 = QtWidgets.QGridLayout()
self.gridLayout_2.setObjectName("gridLayout_2")
self.spinFertigkeitenSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinFertigkeitenSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinFertigkeitenSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinFertigkeitenSpent.setReadOnly(True)
self.spinFertigkeitenSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinFertigkeitenSpent.setMaximum(999999)
self.spinFertigkeitenSpent.setObjectName("spinFertigkeitenSpent")
self.gridLayout_2.addWidget(self.spinFertigkeitenSpent, 3, 1, 1, 1)
self.spinUebernatuerlichPercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUebernatuerlichPercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUebernatuerlichPercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUebernatuerlichPercent.setReadOnly(True)
self.spinUebernatuerlichPercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUebernatuerlichPercent.setMaximum(100)
self.spinUebernatuerlichPercent.setObjectName("spinUebernatuerlichPercent")
self.gridLayout_2.addWidget(self.spinUebernatuerlichPercent, 6, 2, 1, 1)
self.labelUeber3 = QtWidgets.QLabel(self.groupBox_2)
self.labelUeber3.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setItalic(False)
self.labelUeber3.setFont(font)
self.labelUeber3.setObjectName("labelUeber3")
self.gridLayout_2.addWidget(self.labelUeber3, 8, 0, 1, 1)
self.spinProfanPercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinProfanPercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinProfanPercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinProfanPercent.setReadOnly(True)
self.spinProfanPercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinProfanPercent.setMaximum(100)
self.spinProfanPercent.setObjectName("spinProfanPercent")
self.gridLayout_2.addWidget(self.spinProfanPercent, 2, 2, 1, 1)
self.spinVorteileSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinVorteileSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinVorteileSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinVorteileSpent.setReadOnly(True)
self.spinVorteileSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinVorteileSpent.setMaximum(99999999)
self.spinVorteileSpent.setObjectName("spinVorteileSpent")
self.gridLayout_2.addWidget(self.spinVorteileSpent, 1, 1, 1, 1)
self.spinAttributeSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinAttributeSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinAttributeSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinAttributeSpent.setReadOnly(True)
self.spinAttributeSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinAttributeSpent.setMaximum(99999999)
self.spinAttributeSpent.setObjectName("spinAttributeSpent")
self.gridLayout_2.addWidget(self.spinAttributeSpent, 0, 1, 1, 1)
self.spinUeberTalenteSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUeberTalenteSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUeberTalenteSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUeberTalenteSpent.setReadOnly(True)
self.spinUeberTalenteSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUeberTalenteSpent.setMaximum(999999)
self.spinUeberTalenteSpent.setObjectName("spinUeberTalenteSpent")
self.gridLayout_2.addWidget(self.spinUeberTalenteSpent, 8, 1, 1, 1)
self.spinFreieSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinFreieSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinFreieSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinFreieSpent.setReadOnly(True)
self.spinFreieSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinFreieSpent.setMaximum(999999)
self.spinFreieSpent.setObjectName("spinFreieSpent")
self.gridLayout_2.addWidget(self.spinFreieSpent, 5, 1, 1, 1)
self.spinUeberFertigkeitenPercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUeberFertigkeitenPercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUeberFertigkeitenPercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUeberFertigkeitenPercent.setReadOnly(True)
self.spinUeberFertigkeitenPercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUeberFertigkeitenPercent.setMaximum(100)
self.spinUeberFertigkeitenPercent.setObjectName("spinUeberFertigkeitenPercent")
self.gridLayout_2.addWidget(self.spinUeberFertigkeitenPercent, 7, 2, 1, 1)
self.label_2 = QtWidgets.QLabel(self.groupBox_2)
self.label_2.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.label_2.setFont(font)
self.label_2.setObjectName("label_2")
self.gridLayout_2.addWidget(self.label_2, 1, 0, 1, 1)
self.spinAttributePercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinAttributePercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinAttributePercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinAttributePercent.setReadOnly(True)
self.spinAttributePercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinAttributePercent.setMaximum(100)
self.spinAttributePercent.setObjectName("spinAttributePercent")
self.gridLayout_2.addWidget(self.spinAttributePercent, 0, 2, 1, 1)
self.spinUeberTalentePercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUeberTalentePercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUeberTalentePercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUeberTalentePercent.setReadOnly(True)
self.spinUeberTalentePercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUeberTalentePercent.setMaximum(100)
self.spinUeberTalentePercent.setObjectName("spinUeberTalentePercent")
self.gridLayout_2.addWidget(self.spinUeberTalentePercent, 8, 2, 1, 1)
self.labelUeber1 = QtWidgets.QLabel(self.groupBox_2)
self.labelUeber1.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.labelUeber1.setFont(font)
self.labelUeber1.setObjectName("labelUeber1")
self.gridLayout_2.addWidget(self.labelUeber1, 6, 0, 1, 1)
self.label_4 = QtWidgets.QLabel(self.groupBox_2)
self.label_4.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setItalic(False)
self.label_4.setFont(font)
self.label_4.setObjectName("label_4")
self.gridLayout_2.addWidget(self.label_4, 5, 0, 1, 1)
self.spinUebernatuerlichSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUebernatuerlichSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUebernatuerlichSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUebernatuerlichSpent.setReadOnly(True)
self.spinUebernatuerlichSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUebernatuerlichSpent.setMaximum(999999)
self.spinUebernatuerlichSpent.setObjectName("spinUebernatuerlichSpent")
self.gridLayout_2.addWidget(self.spinUebernatuerlichSpent, 6, 1, 1, 1)
self.spinUeberFertigkeitenSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinUeberFertigkeitenSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinUeberFertigkeitenSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinUeberFertigkeitenSpent.setReadOnly(True)
self.spinUeberFertigkeitenSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinUeberFertigkeitenSpent.setMaximum(999999)
self.spinUeberFertigkeitenSpent.setObjectName("spinUeberFertigkeitenSpent")
self.gridLayout_2.addWidget(self.spinUeberFertigkeitenSpent, 7, 1, 1, 1)
self.spinFreiePercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinFreiePercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinFreiePercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinFreiePercent.setReadOnly(True)
self.spinFreiePercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinFreiePercent.setMaximum(100)
self.spinFreiePercent.setObjectName("spinFreiePercent")
self.gridLayout_2.addWidget(self.spinFreiePercent, 5, 2, 1, 1)
self.spinFertigkeitenPercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinFertigkeitenPercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinFertigkeitenPercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinFertigkeitenPercent.setReadOnly(True)
self.spinFertigkeitenPercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinFertigkeitenPercent.setMaximum(100)
self.spinFertigkeitenPercent.setObjectName("spinFertigkeitenPercent")
self.gridLayout_2.addWidget(self.spinFertigkeitenPercent, 3, 2, 1, 1)
self.spinTalentePercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinTalentePercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinTalentePercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinTalentePercent.setReadOnly(True)
self.spinTalentePercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinTalentePercent.setMaximum(100)
self.spinTalentePercent.setObjectName("spinTalentePercent")
self.gridLayout_2.addWidget(self.spinTalentePercent, 4, 2, 1, 1)
self.spinProfanSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinProfanSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinProfanSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinProfanSpent.setReadOnly(True)
self.spinProfanSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinProfanSpent.setMaximum(999999)
self.spinProfanSpent.setObjectName("spinProfanSpent")
self.gridLayout_2.addWidget(self.spinProfanSpent, 2, 1, 1, 1)
self.label = QtWidgets.QLabel(self.groupBox_2)
self.label.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.label.setFont(font)
self.label.setObjectName("label")
self.gridLayout_2.addWidget(self.label, 0, 0, 1, 1)
self.label_9 = QtWidgets.QLabel(self.groupBox_2)
self.label_9.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setItalic(False)
self.label_9.setFont(font)
self.label_9.setObjectName("label_9")
self.gridLayout_2.addWidget(self.label_9, 4, 0, 1, 1)
self.labelUeber2 = QtWidgets.QLabel(self.groupBox_2)
self.labelUeber2.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setItalic(False)
self.labelUeber2.setFont(font)
self.labelUeber2.setObjectName("labelUeber2")
self.gridLayout_2.addWidget(self.labelUeber2, 7, 0, 1, 1)
self.label_8 = QtWidgets.QLabel(self.groupBox_2)
self.label_8.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setItalic(False)
self.label_8.setFont(font)
self.label_8.setObjectName("label_8")
self.gridLayout_2.addWidget(self.label_8, 3, 0, 1, 1)
self.label_3 = QtWidgets.QLabel(self.groupBox_2)
self.label_3.setMinimumSize(QtCore.QSize(230, 0))
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.label_3.setFont(font)
self.label_3.setObjectName("label_3")
self.gridLayout_2.addWidget(self.label_3, 2, 0, 1, 1)
self.spinTalenteSpent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinTalenteSpent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinTalenteSpent.setAlignment(QtCore.Qt.AlignCenter)
self.spinTalenteSpent.setReadOnly(True)
self.spinTalenteSpent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinTalenteSpent.setMaximum(999999)
self.spinTalenteSpent.setObjectName("spinTalenteSpent")
self.gridLayout_2.addWidget(self.spinTalenteSpent, 4, 1, 1, 1)
self.spinVorteilePercent = QtWidgets.QSpinBox(self.groupBox_2)
self.spinVorteilePercent.setFocusPolicy(QtCore.Qt.NoFocus)
self.spinVorteilePercent.setAlignment(QtCore.Qt.AlignCenter)
self.spinVorteilePercent.setReadOnly(True)
self.spinVorteilePercent.setButtonSymbols(QtWidgets.QAbstractSpinBox.NoButtons)
self.spinVorteilePercent.setMaximum(100)
self.spinVorteilePercent.setObjectName("spinVorteilePercent")
self.gridLayout_2.addWidget(self.spinVorteilePercent, 1, 2, 1, 1)
self.gridLayout_4.addLayout(self.gridLayout_2, 0, 0, 1, 1)
self.verticalLayout_4.addWidget(self.groupBox_2)
spacerItem1 = QtWidgets.QSpacerItem(20, 40, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Expanding)
self.verticalLayout_4.addItem(spacerItem1)
self.gridLayout.addLayout(self.verticalLayout_4, 0, 1, 1, 1)
self.verticalLayout_3 = QtWidgets.QVBoxLayout()
self.verticalLayout_3.setObjectName("verticalLayout_3")
self.labelNotiz = QtWidgets.QLabel(Form)
font = QtGui.QFont()
font.setBold(True)
font.setWeight(75)
self.labelNotiz.setFont(font)
self.labelNotiz.setObjectName("labelNotiz")
self.verticalLayout_3.addWidget(self.labelNotiz)
self.groupBox = QtWidgets.QGroupBox(Form)
self.groupBox.setTitle("")
self.groupBox.setObjectName("groupBox")
self.gridLayout_3 = QtWidgets.QGridLayout(self.groupBox)
self.gridLayout_3.setContentsMargins(20, 20, 20, 20)
self.gridLayout_3.setObjectName("gridLayout_3")
self.teNotiz = QtWidgets.QPlainTextEdit(self.groupBox)
self.teNotiz.setPlainText("")
self.teNotiz.setObjectName("teNotiz")
self.gridLayout_3.addWidget(self.teNotiz, 0, 0, 1, 1)
self.verticalLayout_3.addWidget(self.groupBox)
self.gridLayout.addLayout(self.verticalLayout_3, 0, 0, 1, 1)
self.retranslateUi(Form)
QtCore.QMetaObject.connectSlotsByName(Form)
Form.setTabOrder(self.teNotiz, self.checkReq)
Form.setTabOrder(self.checkReq, self.checkFinanzen)
Form.setTabOrder(self.checkFinanzen, self.checkUeberPDF)
Form.setTabOrder(self.checkUeberPDF, self.comboHausregeln)
Form.setTabOrder(self.comboHausregeln, self.comboCharsheet)
Form.setTabOrder(self.comboCharsheet, self.checkRegeln)
Form.setTabOrder(self.checkRegeln, self.comboRegelnGroesse)
def retranslateUi(self, Form):
_translate = QtCore.QCoreApplication.translate
Form.setWindowTitle(_translate("Form", "Form"))
self.labelEinstellungen.setText(_translate("Form", "Charakter-Einstellungen"))
self.checkReq.setToolTip(_translate("Form", "Falls abgewählt, werden sämtliche Voraussetzungsprüfungen für Vorteile, übernatürliche Fertigkeiten usw. deaktiviert."))
self.checkReq.setText(_translate("Form", "Voraussetzungen überprüfen"))
self.label_5.setText(_translate("Form", "Hausregeln:"))
self.label_7.setText(_translate("Form", "Regelschriftgröße:"))
self.checkUeberPDF.setToolTip(_translate("Form", "<html><head/><body><p>Sephrasto übernimmt automatisch alle übernatürlichen Fertigkeiten in den Charakterbogen, deren FW mindestens 1 beträgt und für welche du mindestens ein Talent aktiviert hast. Wenn du diese Option aktivierst, zeigt Sephrasto eine PDF-Spalte bei den übernatürlichen Fertigkeiten an. Mit dieser kannst du selbst entscheiden, welche Fertigkeiten in den Charakterbogen übernommen werden sollen.</p></body></html>"))
self.checkUeberPDF.setText(_translate("Form", "PDF-Ausgabe von übernatürlichen Fertigkeiten manuell auswählen"))
self.label_6.setText(_translate("Form", "Charakterbogen:"))
self.checkFinanzen.setToolTip(_translate("Form", "<html><head/><body><p>Die Finanzen spielen nur bei einem neuen Charakter eine Rolle und können nach dem ersten Abenteuer ausgeblendet werden. Auch die aktuellen Schicksalspunkte werden dann nicht mehr ausgegeben, da diese ab dem ersten Abenteuer händisch verwaltet werden.</p></body></html>"))
self.checkFinanzen.setText(_translate("Form", "Finanzen anzeigen und aktuelle Schicksalspunkte ausgeben"))
self.labelReload.setText(_translate("Form", "Der Charakter muss gespeichert und neu geladen werden, damit alle Änderungen übernommen werden können!"))
self.comboRegelnGroesse.setItemText(0, _translate("Form", "Klein"))
self.comboRegelnGroesse.setItemText(1, _translate("Form", "Mittel"))
self.comboRegelnGroesse.setItemText(2, _translate("Form", "Groß"))
self.checkRegeln.setText(_translate("Form", "Dem Charakterbogen relevante Ilaris Regeln anhängen"))
self.label_10.setText(_translate("Form", "Regelkategorien:"))
self.labelEP.setText(_translate("Form", "EP-Verteilung"))
self.spinFertigkeitenSpent.setSuffix(_translate("Form", " EP"))
self.spinUebernatuerlichPercent.setSuffix(_translate("Form", " %"))
self.labelUeber3.setText(_translate("Form", " Talente"))
self.spinProfanPercent.setSuffix(_translate("Form", " %"))
self.spinVorteileSpent.setSuffix(_translate("Form", " EP"))
self.spinAttributeSpent.setSuffix(_translate("Form", " EP"))
self.spinUeberTalenteSpent.setSuffix(_translate("Form", " EP"))
self.spinFreieSpent.setSuffix(_translate("Form", " EP"))
self.spinUeberFertigkeitenPercent.setSuffix(_translate("Form", " %)"))
self.spinUeberFertigkeitenPercent.setPrefix(_translate("Form", "("))
self.label_2.setText(_translate("Form", "Vorteile"))
self.spinAttributePercent.setSuffix(_translate("Form", " %"))
self.spinUeberTalentePercent.setSuffix(_translate("Form", " %)"))
self.spinUeberTalentePercent.setPrefix(_translate("Form", "("))
self.labelUeber1.setText(_translate("Form", "Übernatürliche Fertigkeiten und Talente"))
self.label_4.setText(_translate("Form", " Freie Fertigkeiten"))
self.spinUebernatuerlichSpent.setSuffix(_translate("Form", " EP"))
self.spinUeberFertigkeitenSpent.setSuffix(_translate("Form", " EP"))
self.spinFreiePercent.setSuffix(_translate("Form", " %)"))
self.spinFreiePercent.setPrefix(_translate("Form", "("))
self.spinFertigkeitenPercent.setSuffix(_translate("Form", " %)"))
self.spinFertigkeitenPercent.setPrefix(_translate("Form", "("))
self.spinTalentePercent.setSuffix(_translate("Form", " %)"))
self.spinTalentePercent.setPrefix(_translate("Form", "("))
self.spinProfanSpent.setSuffix(_translate("Form", " EP"))
self.label.setText(_translate("Form", "Attribute"))
self.label_9.setText(_translate("Form", " Talente"))
self.labelUeber2.setText(_translate("Form", " Fertigkeiten"))
self.label_8.setText(_translate("Form", " Fertigkeiten"))
self.label_3.setText(_translate("Form", "Profane Fertigkeiten und Talente"))
self.spinTalenteSpent.setSuffix(_translate("Form", " EP"))
self.spinVorteilePercent.setSuffix(_translate("Form", " %"))
self.labelNotiz.setText(_translate("Form", "Notiz"))
if __name__ == "__main__":
import sys
app = QtWidgets.QApplication(sys.argv)
Form = QtWidgets.QWidget()
ui = Ui_Form()
ui.setupUi(Form)
Form.show()
sys.exit(app.exec_())
| 60.566986 | 490 | 0.727258 | 2,544 | 25,317 | 7.141509 | 0.119497 | 0.047006 | 0.01288 | 0.0262 | 0.39531 | 0.168538 | 0.090324 | 0.05218 | 0.047556 | 0.047556 | 0 | 0.028286 | 0.166331 | 25,317 | 417 | 491 | 60.71223 | 0.832512 | 0.010981 | 0 | 0.084788 | 1 | 0.004988 | 0.099197 | 0.013743 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004988 | false | 0 | 0.004988 | 0 | 0.012469 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e12046cf567efa3da5dc880cccd66651d00b755e | 740 | py | Python | src/lab2/rawdata/WEB.py | ntuaha/NewsInsight2 | 86f6e4ce9cfa37a6a98fe48b01809784274fa0de | [
"MIT"
] | 1 | 2016-04-06T06:11:34.000Z | 2016-04-06T06:11:34.000Z | src/lab2/rawdata/WEB.py | ntuaha/NewsInsight2 | 86f6e4ce9cfa37a6a98fe48b01809784274fa0de | [
"MIT"
] | null | null | null | src/lab2/rawdata/WEB.py | ntuaha/NewsInsight2 | 86f6e4ce9cfa37a6a98fe48b01809784274fa0de | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#import re
# work around unicode/str encoding issues with ascii
import sys
#import os
import psycopg2
import cookielib, urllib2, urllib
from lxml import html,etree
import StringIO
import datetime
import json
reload(sys)
sys.setdefaultencoding('utf8')
class WEB:
def getRawData(self,url,d=None):
if d is not None:
value = urllib.urlencode(d)
response = urllib2.build_opener().open(url,value)
else:
response = urllib2.build_opener().open(url)
the_page = response.read()
response.close()
return the_page
if __name__ =="__main__":
url = 'http://news.cnyes.com/Ajax.aspx?Module=GetRollNews'
d= {'date' : datetime.datetime.now().strftime("%Y%m%d")}
web = WEB()
print web.getRawData(url,d)
| 20 | 60 | 0.690541 | 103 | 740 | 4.84466 | 0.621359 | 0.016032 | 0.08016 | 0.104208 | 0.132265 | 0.132265 | 0 | 0 | 0 | 0 | 0 | 0.009836 | 0.175676 | 740 | 36 | 61 | 20.555556 | 0.808197 | 0.089189 | 0 | 0 | 0 | 0 | 0.107946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.291667 | null | null | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1235ebaaf02dc6938e8f75c93cf458c758f80ac | 1,208 | py | Python | sparse/repos/shane-breeze/atuproot/binder/atuproot/EventBuilder.py | yuvipanda/mybinder.org-analytics | 7b654e3e21dea790505c626d688aa15640ea5808 | [
"BSD-3-Clause"
] | 1 | 2021-03-18T23:33:35.000Z | 2021-03-18T23:33:35.000Z | sparse/repos/shane-breeze/atuproot/binder/atuproot/EventBuilder.py | yuvipanda/mybinder.org-analytics | 7b654e3e21dea790505c626d688aa15640ea5808 | [
"BSD-3-Clause"
] | 17 | 2020-01-28T22:33:27.000Z | 2021-06-10T21:05:49.000Z | sparse/repos/shane-breeze/atuproot/binder/atuproot/EventBuilder.py | yuvipanda/mybinder.org-analytics | 7b654e3e21dea790505c626d688aa15640ea5808 | [
"BSD-3-Clause"
] | 1 | 2021-07-17T12:55:22.000Z | 2021-07-17T12:55:22.000Z | import uproot
from .BEvents import BEvents
class EventBuilder(object):
def __init__(self, config):
self.config = config
def __repr__(self):
return '{}({!r})'.format(
self.__class__.__name__,
self.config,
)
def __call__(self):
if len(self.config.inputPaths) != 1:
# TODO - support multiple inputPaths
raise AttributeError("Multiple inputPaths not yet supported")
# Try to open the tree - some machines have configured limitations
# which prevent memmaps from being created. Use a fallback - the
# localsource option
try:
rootfile = uproot.open(self.config.inputPaths[0])
tree = rootfile[self.config.treeName]
except Exception:
rootfile = uproot.open(self.config.inputPaths[0],
localsource = uproot.FileSource.defaults)
tree = rootfile[self.config.treeName]
events = BEvents(tree,
self.config.nevents_per_block,
self.config.start_block,
self.config.stop_block)
events.config = self.config
return events
| 32.648649 | 74 | 0.577815 | 121 | 1,208 | 5.570248 | 0.504132 | 0.178042 | 0.089021 | 0.065282 | 0.204748 | 0.115727 | 0.115727 | 0 | 0 | 0 | 0 | 0.003764 | 0.340232 | 1,208 | 36 | 75 | 33.555556 | 0.841907 | 0.149834 | 0 | 0.076923 | 0 | 0 | 0.044031 | 0 | 0 | 0 | 0 | 0.027778 | 0 | 1 | 0.115385 | false | 0 | 0.076923 | 0.038462 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e125a598376f22c1c04a68632e9c2fe16ac81ad0 | 449 | py | Python | posts/migrations/0011_auto_20200530_1544.py | olifirovai/yatube | dc08ca61099a83fffebd34b34f6608864dacaf39 | [
"MIT"
] | null | null | null | posts/migrations/0011_auto_20200530_1544.py | olifirovai/yatube | dc08ca61099a83fffebd34b34f6608864dacaf39 | [
"MIT"
] | null | null | null | posts/migrations/0011_auto_20200530_1544.py | olifirovai/yatube | dc08ca61099a83fffebd34b34f6608864dacaf39 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.6 on 2020-05-30 22:44
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('posts', '0010_auto_20200530_1531'),
]
operations = [
migrations.AlterField(
model_name='follow',
name='created',
field=models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='beginning_following_date'),
),
]
| 23.631579 | 114 | 0.636971 | 51 | 449 | 5.411765 | 0.784314 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092262 | 0.25167 | 449 | 18 | 115 | 24.944444 | 0.729167 | 0.100223 | 0 | 0 | 1 | 0 | 0.161692 | 0.116915 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bbfffefb8adc54405bffab540060d3a0ee21a1b2 | 1,171 | py | Python | alchemist_lib/database/aum_history.py | Dodo33/alchemist-lib | 40c2d3b48d5f46315eb09e7f572d578b7e5324b4 | [
"MIT"
] | 5 | 2018-07-11T05:38:51.000Z | 2021-12-19T03:06:51.000Z | alchemist_lib/database/aum_history.py | Dodo33/alchemist-lib | 40c2d3b48d5f46315eb09e7f572d578b7e5324b4 | [
"MIT"
] | null | null | null | alchemist_lib/database/aum_history.py | Dodo33/alchemist-lib | 40c2d3b48d5f46315eb09e7f572d578b7e5324b4 | [
"MIT"
] | 2 | 2019-07-12T08:51:11.000Z | 2021-09-29T22:22:46.000Z | from sqlalchemy import DateTime, String, ForeignKey, Integer, Column, Float
from sqlalchemy.orm import relationship
from . import Base
class AumHistory(Base):
"""
Map class for table AumHistory.
- **aum_id**: Integer, primary_key.
- **aum_datetime**: DateTime, not null.
- **aum**: Float(20, 8), not null.
- **ts_name**: String(150), not null, foreign_key(ts.ts_name).
Relationships:
- **ts**: TradingSystem instance. (Many-to-One)
"""
__tablename__ = "aum_history"
aum_id = Column(Integer, primary_key = True)
aum_datetime = Column(DateTime, nullable = False)
aum = Column(Float(precision = 20, scale = 8, asdecimal = True), nullable = False)
ts_name = Column(String(150), ForeignKey("ts.ts_name"), nullable = False)
ts = relationship("Ts")
def __repr__(self):
return "<AumHistory(datetime={}, aum={}, ts={})>".format(self.aum_datetime,
self.aum,
self.ts_name
)
| 30.815789 | 86 | 0.530316 | 117 | 1,171 | 5.119658 | 0.410256 | 0.050083 | 0.056761 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015748 | 0.349274 | 1,171 | 37 | 87 | 31.648649 | 0.770341 | 0.248506 | 0 | 0 | 0 | 0 | 0.075359 | 0.028708 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.2 | 0.066667 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
010001948d0e40a2baed1fda1c45ba791b3826c3 | 877 | py | Python | ext.py | taipeithomas/adventure-wows | 302b98c1d57689759f1e1c74b71d4d2717c92852 | [
"BSD-3-Clause"
] | null | null | null | ext.py | taipeithomas/adventure-wows | 302b98c1d57689759f1e1c74b71d4d2717c92852 | [
"BSD-3-Clause"
] | null | null | null | ext.py | taipeithomas/adventure-wows | 302b98c1d57689759f1e1c74b71d4d2717c92852 | [
"BSD-3-Clause"
] | null | null | null | # coding: utf-8
"""
ext
~~~
Good place for pluggable extensions.
:copyright: (c) 2015 by Roman Zaiev.
:license: BSD, see LICENSE for more details.
"""
from flask.ext.debugtoolbar import DebugToolbarExtension
from flask.ext.gravatar import Gravatar
from flask.ext.login import LoginManager
from flask.ext.sqlalchemy import SQLAlchemy
from flask.ext.assets import Environment
from flask.ext.restplus import Api
db = SQLAlchemy()
assets = Environment()
login_manager = LoginManager()
gravatar = Gravatar(size=50)
toolbar = DebugToolbarExtension()
api = Api(default='api')
# Almost any modern Flask extension has special init_app()
# method for deferred app binding. But there are a couple of
# popular extensions that know nothing about such a use case.
# Or, maybe, you have to use some app.config settings
# gravatar = lambda app: Gravatar(app, size=50)
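# The deferred init_app() binding mentioned in the comments above can be
# sketched without Flask at all; this minimal imitation (class and names are
# illustrative, not part of this module) shows the pattern the real
# extensions follow:

```python
class Extension:
    """Minimal imitation of the init_app() pattern used by modern
    Flask extensions: construct the object at import time, bind it
    to a concrete app later, inside the application factory."""

    def __init__(self, app=None):
        self.app = None
        if app is not None:
            # Eager binding still works for simple, single-app setups.
            self.init_app(app)

    def init_app(self, app):
        # Deferred binding: called once the app object finally exists.
        self.app = app


ext = Extension()       # created at import time, unbound
ext.init_app("my-app")  # bound later, e.g. inside create_app()
```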
| 25.794118 | 60 | 0.752566 | 121 | 877 | 5.438017 | 0.603306 | 0.082067 | 0.109422 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012262 | 0.163056 | 877 | 33 | 61 | 26.575758 | 0.884196 | 0.470924 | 0 | 0 | 0 | 0 | 0.006961 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
0100806c56992391d4c163857f323078e375795a | 1,357 | py | Python | notes/algo-ds-practice/problems/greedy/stock_buy_sell.py | Anmol-Singh-Jaggi/interview-notes | 65af75e2b5725894fa5e13bb5cd9ecf152a0d652 | [
"MIT"
] | 6 | 2020-07-05T05:15:19.000Z | 2021-01-24T20:17:14.000Z | notes/algo-ds-practice/problems/greedy/stock_buy_sell.py | Anmol-Singh-Jaggi/interview-notes | 65af75e2b5725894fa5e13bb5cd9ecf152a0d652 | [
"MIT"
] | null | null | null | notes/algo-ds-practice/problems/greedy/stock_buy_sell.py | Anmol-Singh-Jaggi/interview-notes | 65af75e2b5725894fa5e13bb5cd9ecf152a0d652 | [
"MIT"
] | 2 | 2020-09-14T06:46:37.000Z | 2021-06-15T09:17:21.000Z | '''
The cost of a stock on each day is given in an array.
Find the max profit that you can make by buying and selling in those days.
Only 1 stock can be held at a time.
For example:
Array = {100, 180, 260, 310, 40, 535, 695}
The maximum profit can be earned by buying on day 0 and selling on day 3.
Again buy on day 4 and sell on day 6.
If the given array of prices is sorted in decreasing order, then profit cannot be earned at all.
'''
'''
If we were allowed to buy and sell only once, we could simply find the maximum difference between two elements (buying at the earlier, smaller price). Here we are allowed to buy and sell multiple times.
The following is an algorithm for this problem.
1. Find the local minima and store it as the starting index. If none exists, return.
2. Find the local maxima and store it as the ending index. If we reach the end, set the end as the ending index.
3. Update the solution (increment the count of buy-sell pairs).
4. Repeat the above steps if the end has not been reached.
Alternate solution:
class Solution {
public int maxProfit(int[] prices) {
int maxprofit = 0;
for (int i = 1; i < prices.length; i++) {
if (prices[i] > prices[i - 1])
maxprofit += prices[i] - prices[i - 1];
}
return maxprofit;
}
}
Explanation - https://leetcode.com/problems/best-time-to-buy-and-sell-stock-ii/solution/
'''
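# A minimal runnable Python version of the alternate solution quoted above
# (the function name is illustrative). Summing every positive day-over-day
# increase is equivalent to the local-minimum/local-maximum strategy, since
# each upward run decomposes into consecutive positive differences:

```python
def max_profit(prices):
    """Return the maximum profit from unlimited buy/sell pairs,
    holding at most one stock at a time."""
    profit = 0
    for today, tomorrow in zip(prices, prices[1:]):
        if tomorrow > today:
            # Equivalent to buying at each local minimum and
            # selling at the following local maximum.
            profit += tomorrow - today
    return profit


print(max_profit([100, 180, 260, 310, 40, 535, 695]))  # 865
```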
| 38.771429 | 174 | 0.686072 | 231 | 1,357 | 4.030303 | 0.493506 | 0.021482 | 0.025779 | 0.038668 | 0.083781 | 0.051557 | 0.051557 | 0 | 0 | 0 | 0 | 0.031853 | 0.236551 | 1,357 | 34 | 175 | 39.911765 | 0.866795 | 0.312454 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
01074bb3d271a68f45647040e5048810dd38869d | 2,668 | py | Python | myproxy.py | nolink/penetration | c7cf9b3c7338b3907e5c2c0ef11f76bb9210cbb7 | [
"MIT"
] | null | null | null | myproxy.py | nolink/penetration | c7cf9b3c7338b3907e5c2c0ef11f76bb9210cbb7 | [
"MIT"
] | null | null | null | myproxy.py | nolink/penetration | c7cf9b3c7338b3907e5c2c0ef11f76bb9210cbb7 | [
"MIT"
] | null | null | null | import os
import sys
import socket
import threading
def hexdump(src, length=16):
result = []
digits = 4 if isinstance(src, unicode) else 2
for i in xrange(0, len(src), length):
s = src[i:i+length]
hexa = b' '.join(["%0*X" % (digits, ord(x)) for x in s])
text = b''.join([x if 0x20 <= ord(x) < 0x7F else b'.' for x in s])
result.append(b"%04X %-*s %s" % (i, length*(digits + 1), hexa, text))
print b'\n'.join(result)
def request_handler(buffer):
return buffer
def response_handler(buffer):
return buffer
def receive_from(sock):
buffer = ""
sock.settimeout(2)
try:
while True:
data = sock.recv(1024)
if not data:
break
buffer += data
except:
pass
return buffer
def client_handler(client_sock, target_host, target_port, receive_first):
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((target_host, target_port))
if receive_first:
buffer = receive_from(sock)
hexdump(buffer)
buffer = response_handler(buffer)
if len(buffer):
client_sock.send(buffer)
while True:
local_buffer = receive_from(client_sock)
if len(local_buffer):
hexdump(local_buffer)
local_buffer = request_handler(local_buffer)
sock.send(local_buffer)
buffer = receive_from(sock)
if len(buffer):
hexdump(buffer)
buffer = response_handler(buffer)
client_sock.send(buffer)
if not len(buffer) or not len(local_buffer):
client_sock.close()
sock.close()
break
def server_loop(local_host, local_port, target_host, target_port, receive_first):
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind((local_host, local_port))
sock.listen(5)
while True:
client_sock, addr = sock.accept()
print 'accept on: %s:%s' % (addr[0], addr[1])
client_thread = threading.Thread(target=client_handler, args=(client_sock,target_host, target_port, receive_first,))
client_thread.start()
def main():
if len(sys.argv[1:]) != 5:
print 'usage: myproxy.py local_host local_port target_host target_port receive_first'
sys.exit(1)
local_host = sys.argv[1]
local_port = int(sys.argv[2])
target_host = sys.argv[3]
target_port = int(sys.argv[4])
if 'True' in sys.argv[5]:
receive_first = True
else:
receive_first = False
server_loop(local_host, local_port, target_host, target_port, receive_first)
main()
| 28.084211 | 124 | 0.613568 | 358 | 2,668 | 4.391061 | 0.25419 | 0.061069 | 0.061069 | 0.076336 | 0.338422 | 0.273537 | 0.222646 | 0.222646 | 0.189567 | 0.189567 | 0 | 0.015504 | 0.274738 | 2,668 | 94 | 125 | 28.382979 | 0.796899 | 0 | 0 | 0.263158 | 0 | 0 | 0.044619 | 0 | 0 | 0 | 0.003 | 0 | 0 | 0 | null | null | 0.013158 | 0.052632 | null | null | 0.039474 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
010cf4217c2fdf758fdaa2410e3d37d6f9d1bef0 | 774 | py | Python | src/discord/checks.py | ttgc/zigotoland | 0f1910e9853761a0f8187bb20c79a467f19ff3e2 | [
"MIT"
] | 2 | 2019-06-27T22:43:05.000Z | 2021-07-08T13:22:52.000Z | src/discord/checks.py | ttgc/zigotoland | 0f1910e9853761a0f8187bb20c79a467f19ff3e2 | [
"MIT"
] | 2 | 2019-06-28T08:34:52.000Z | 2019-06-28T13:46:23.000Z | src/discord/checks.py | ttgc/zigotoland | 0f1910e9853761a0f8187bb20c79a467f19ff3e2 | [
"MIT"
] | null | null | null | #!usr/bin/env python3.7
#-*-coding:utf-8-*-
from src.utils.config import *
from src.games.poker.lobby import *
from src.discord.manage import *
from src.utils.logs import getlogger
def check_botowner(ctx):
config = Config()
return ctx.author.id in config.owners
def check_inserv(ctx):
config = Config()
return (ctx.guild == config.guild)
def check_inpokerlobby(ctx):
return ctx.channel in PokerLobby.instances
def check_pokerlobbyowner(ctx):
if not check_inpokerlobby(ctx): return False
return PokerLobby.instances[ctx.channel].owner == ctx.author
async def check_darkmember(ctx):
if not check_inserv(ctx): return False
logger = getlogger()
gold, plat, dark = await manage_roles(ctx, logger)
return dark in ctx.author.roles
| 26.689655 | 64 | 0.729974 | 111 | 774 | 5.018018 | 0.432432 | 0.071813 | 0.070018 | 0.075404 | 0.086176 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00463 | 0.162791 | 774 | 28 | 65 | 27.642857 | 0.854938 | 0.05168 | 0 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.05 | 0.65 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
0114543aa4948cb2423e6ddcc3d9a9d38a3c9822 | 26,361 | py | Python | python/ymt_components/ymt_arm_2jnt_02/settingsUI.py | yamahigashi/mgear_shifter_components | c4e4c19d8a972e4d78df46f4bdf0b3319da5a792 | [
"MIT"
] | 10 | 2020-01-24T10:10:39.000Z | 2021-09-16T06:20:55.000Z | python/ymt_components/ymt_arm_2jnt_02/settingsUI.py | yamahigashi/mgear_shifter_components | c4e4c19d8a972e4d78df46f4bdf0b3319da5a792 | [
"MIT"
] | null | null | null | python/ymt_components/ymt_arm_2jnt_02/settingsUI.py | yamahigashi/mgear_shifter_components | c4e4c19d8a972e4d78df46f4bdf0b3319da5a792 | [
"MIT"
] | 2 | 2020-01-24T10:11:07.000Z | 2020-04-21T18:17:09.000Z | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'D:/Pipeline/rez-packages/third/github.com/yamahigashi/ymtshiftercomponents/mgear_shifter_components/python/ymt_components/ymt_arm_2jnt_02/settingsUI.ui',
# licensing of 'D:/Pipeline/rez-packages/third/github.com/yamahigashi/ymtshiftercomponents/mgear_shifter_components/python/ymt_components/ymt_arm_2jnt_02/settingsUI.ui' applies.
#
# Created: Mon Jan 27 15:04:51 2020
# by: pyside2-uic running on PySide2 5.12.5
#
# WARNING! All changes made in this file will be lost!
from PySide2 import QtCore, QtGui, QtWidgets
class Ui_Form(object):
def setupUi(self, Form):
Form.setObjectName("Form")
Form.resize(268, 861)
self.gridLayout = QtWidgets.QGridLayout(Form)
self.gridLayout.setObjectName("gridLayout")
self.groupBox = QtWidgets.QGroupBox(Form)
self.groupBox.setTitle("")
self.groupBox.setObjectName("groupBox")
self.gridLayout_2 = QtWidgets.QGridLayout(self.groupBox)
self.gridLayout_2.setObjectName("gridLayout_2")
self.verticalLayout = QtWidgets.QVBoxLayout()
self.verticalLayout.setObjectName("verticalLayout")
self.formLayout = QtWidgets.QFormLayout()
self.formLayout.setFieldGrowthPolicy(QtWidgets.QFormLayout.AllNonFixedFieldsGrow)
self.formLayout.setObjectName("formLayout")
self.ikfk_label = QtWidgets.QLabel(self.groupBox)
self.ikfk_label.setObjectName("ikfk_label")
self.formLayout.setWidget(0, QtWidgets.QFormLayout.LabelRole, self.ikfk_label)
self.horizontalLayout_3 = QtWidgets.QHBoxLayout()
self.horizontalLayout_3.setObjectName("horizontalLayout_3")
self.ikfk_slider = QtWidgets.QSlider(self.groupBox)
self.ikfk_slider.setMinimumSize(QtCore.QSize(0, 15))
self.ikfk_slider.setMaximum(100)
self.ikfk_slider.setOrientation(QtCore.Qt.Horizontal)
self.ikfk_slider.setObjectName("ikfk_slider")
self.horizontalLayout_3.addWidget(self.ikfk_slider)
self.ikfk_spinBox = QtWidgets.QSpinBox(self.groupBox)
self.ikfk_spinBox.setMaximum(100)
self.ikfk_spinBox.setObjectName("ikfk_spinBox")
self.horizontalLayout_3.addWidget(self.ikfk_spinBox)
self.formLayout.setLayout(0, QtWidgets.QFormLayout.FieldRole, self.horizontalLayout_3)
self.maxStretch_label = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.MinimumExpanding, QtWidgets.QSizePolicy.MinimumExpanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.maxStretch_label.sizePolicy().hasHeightForWidth())
self.maxStretch_label.setSizePolicy(sizePolicy)
self.maxStretch_label.setObjectName("maxStretch_label")
self.formLayout.setWidget(1, QtWidgets.QFormLayout.LabelRole, self.maxStretch_label)
self.maxStretch_spinBox = QtWidgets.QDoubleSpinBox(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.MinimumExpanding, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.maxStretch_spinBox.sizePolicy().hasHeightForWidth())
self.maxStretch_spinBox.setSizePolicy(sizePolicy)
self.maxStretch_spinBox.setMinimum(1.0)
self.maxStretch_spinBox.setProperty("value", 1.5)
self.maxStretch_spinBox.setObjectName("maxStretch_spinBox")
self.formLayout.setWidget(1, QtWidgets.QFormLayout.FieldRole, self.maxStretch_spinBox)
self.maxStretch_label_2 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.MinimumExpanding, QtWidgets.QSizePolicy.MinimumExpanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.maxStretch_label_2.sizePolicy().hasHeightForWidth())
self.maxStretch_label_2.setSizePolicy(sizePolicy)
self.maxStretch_label_2.setObjectName("maxStretch_label_2")
self.formLayout.setWidget(2, QtWidgets.QFormLayout.LabelRole, self.maxStretch_label_2)
self.elbowThickness_spinBox = QtWidgets.QDoubleSpinBox(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.MinimumExpanding, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.elbowThickness_spinBox.sizePolicy().hasHeightForWidth())
self.elbowThickness_spinBox.setSizePolicy(sizePolicy)
self.elbowThickness_spinBox.setDecimals(3)
self.elbowThickness_spinBox.setMinimum(0.0)
self.elbowThickness_spinBox.setMaximum(1.0)
self.elbowThickness_spinBox.setSingleStep(0.1)
self.elbowThickness_spinBox.setProperty("value", 0.0)
self.elbowThickness_spinBox.setObjectName("elbowThickness_spinBox")
self.formLayout.setWidget(2, QtWidgets.QFormLayout.FieldRole, self.elbowThickness_spinBox)
self.verticalLayout.addLayout(self.formLayout)
self.horizontalLayout = QtWidgets.QHBoxLayout()
self.horizontalLayout.setObjectName("horizontalLayout")
self.divisions_label = QtWidgets.QLabel(self.groupBox)
self.divisions_label.setObjectName("divisions_label")
self.horizontalLayout.addWidget(self.divisions_label)
self.div0_spinBox = QtWidgets.QSpinBox(self.groupBox)
self.div0_spinBox.setMinimum(0)
self.div0_spinBox.setProperty("value", 2)
self.div0_spinBox.setObjectName("div0_spinBox")
self.horizontalLayout.addWidget(self.div0_spinBox)
self.div1_spinBox = QtWidgets.QSpinBox(self.groupBox)
self.div1_spinBox.setMinimum(0)
self.div1_spinBox.setProperty("value", 2)
self.div1_spinBox.setObjectName("div1_spinBox")
self.horizontalLayout.addWidget(self.div1_spinBox)
self.verticalLayout.addLayout(self.horizontalLayout)
self.ikTR_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.ikTR_checkBox.setObjectName("ikTR_checkBox")
self.verticalLayout.addWidget(self.ikTR_checkBox)
self.mirrorIK_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.mirrorIK_checkBox.setObjectName("mirrorIK_checkBox")
self.verticalLayout.addWidget(self.mirrorIK_checkBox)
self.mirrorMid_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.mirrorMid_checkBox.setObjectName("mirrorMid_checkBox")
self.verticalLayout.addWidget(self.mirrorMid_checkBox)
self.extraTweak_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.extraTweak_checkBox.setObjectName("extraTweak_checkBox")
self.verticalLayout.addWidget(self.extraTweak_checkBox)
self.smoothStep_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.smoothStep_checkBox.setObjectName("smoothStep_checkBox")
self.verticalLayout.addWidget(self.smoothStep_checkBox)
self.supportJoints_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.supportJoints_checkBox.setChecked(True)
self.supportJoints_checkBox.setObjectName("supportJoints_checkBox")
self.verticalLayout.addWidget(self.supportJoints_checkBox)
self.guideOrientWrist_checkBox = QtWidgets.QCheckBox(self.groupBox)
self.guideOrientWrist_checkBox.setChecked(True)
self.guideOrientWrist_checkBox.setObjectName("guideOrientWrist_checkBox")
self.verticalLayout.addWidget(self.guideOrientWrist_checkBox)
self.horizontalLayout_2 = QtWidgets.QHBoxLayout()
self.horizontalLayout_2.setObjectName("horizontalLayout_2")
self.squashStretchProfile_pushButton = QtWidgets.QPushButton(self.groupBox)
self.squashStretchProfile_pushButton.setObjectName("squashStretchProfile_pushButton")
self.horizontalLayout_2.addWidget(self.squashStretchProfile_pushButton)
self.verticalLayout.addLayout(self.horizontalLayout_2)
self.gridLayout_2.addLayout(self.verticalLayout, 0, 0, 1, 1)
self.gridLayout.addWidget(self.groupBox, 0, 0, 1, 1)
self.ikRefArray_groupBox = QtWidgets.QGroupBox(Form)
self.ikRefArray_groupBox.setObjectName("ikRefArray_groupBox")
self.gridLayout_3 = QtWidgets.QGridLayout(self.ikRefArray_groupBox)
self.gridLayout_3.setObjectName("gridLayout_3")
self.ikRefArray_horizontalLayout = QtWidgets.QHBoxLayout()
self.ikRefArray_horizontalLayout.setObjectName("ikRefArray_horizontalLayout")
self.ikRefArray_verticalLayout_1 = QtWidgets.QVBoxLayout()
self.ikRefArray_verticalLayout_1.setObjectName("ikRefArray_verticalLayout_1")
self.ikRefArray_listWidget = QtWidgets.QListWidget(self.ikRefArray_groupBox)
self.ikRefArray_listWidget.setDragDropOverwriteMode(True)
self.ikRefArray_listWidget.setDragDropMode(QtWidgets.QAbstractItemView.InternalMove)
self.ikRefArray_listWidget.setDefaultDropAction(QtCore.Qt.MoveAction)
self.ikRefArray_listWidget.setAlternatingRowColors(True)
self.ikRefArray_listWidget.setSelectionMode(QtWidgets.QAbstractItemView.ExtendedSelection)
self.ikRefArray_listWidget.setSelectionRectVisible(False)
self.ikRefArray_listWidget.setObjectName("ikRefArray_listWidget")
self.ikRefArray_verticalLayout_1.addWidget(self.ikRefArray_listWidget)
self.ikRefArray_copyRef_pushButton = QtWidgets.QPushButton(self.ikRefArray_groupBox)
self.ikRefArray_copyRef_pushButton.setObjectName("ikRefArray_copyRef_pushButton")
self.ikRefArray_verticalLayout_1.addWidget(self.ikRefArray_copyRef_pushButton)
self.ikRefArray_horizontalLayout.addLayout(self.ikRefArray_verticalLayout_1)
self.ikRefArray_verticalLayout_2 = QtWidgets.QVBoxLayout()
self.ikRefArray_verticalLayout_2.setObjectName("ikRefArray_verticalLayout_2")
self.ikRefArrayAdd_pushButton = QtWidgets.QPushButton(self.ikRefArray_groupBox)
self.ikRefArrayAdd_pushButton.setObjectName("ikRefArrayAdd_pushButton")
self.ikRefArray_verticalLayout_2.addWidget(self.ikRefArrayAdd_pushButton)
self.ikRefArrayRemove_pushButton = QtWidgets.QPushButton(self.ikRefArray_groupBox)
self.ikRefArrayRemove_pushButton.setObjectName("ikRefArrayRemove_pushButton")
self.ikRefArray_verticalLayout_2.addWidget(self.ikRefArrayRemove_pushButton)
spacerItem = QtWidgets.QSpacerItem(20, 40, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Expanding)
self.ikRefArray_verticalLayout_2.addItem(spacerItem)
self.ikRefArray_horizontalLayout.addLayout(self.ikRefArray_verticalLayout_2)
self.gridLayout_3.addLayout(self.ikRefArray_horizontalLayout, 0, 0, 1, 1)
self.gridLayout.addWidget(self.ikRefArray_groupBox, 1, 0, 1, 1)
self.upvRefArray_groupBox = QtWidgets.QGroupBox(Form)
self.upvRefArray_groupBox.setObjectName("upvRefArray_groupBox")
self.gridLayout_5 = QtWidgets.QGridLayout(self.upvRefArray_groupBox)
self.gridLayout_5.setObjectName("gridLayout_5")
self.upvRefArray_horizontalLayout = QtWidgets.QHBoxLayout()
self.upvRefArray_horizontalLayout.setObjectName("upvRefArray_horizontalLayout")
self.upvRefArray_verticalLayout_1 = QtWidgets.QVBoxLayout()
self.upvRefArray_verticalLayout_1.setObjectName("upvRefArray_verticalLayout_1")
self.upvRefArray_listWidget = QtWidgets.QListWidget(self.upvRefArray_groupBox)
self.upvRefArray_listWidget.setDragDropOverwriteMode(True)
self.upvRefArray_listWidget.setDragDropMode(QtWidgets.QAbstractItemView.InternalMove)
self.upvRefArray_listWidget.setDefaultDropAction(QtCore.Qt.MoveAction)
self.upvRefArray_listWidget.setAlternatingRowColors(True)
self.upvRefArray_listWidget.setSelectionMode(QtWidgets.QAbstractItemView.ExtendedSelection)
self.upvRefArray_listWidget.setSelectionRectVisible(False)
self.upvRefArray_listWidget.setObjectName("upvRefArray_listWidget")
self.upvRefArray_verticalLayout_1.addWidget(self.upvRefArray_listWidget)
self.upvRefArray_copyRef_pushButton = QtWidgets.QPushButton(self.upvRefArray_groupBox)
self.upvRefArray_copyRef_pushButton.setObjectName("upvRefArray_copyRef_pushButton")
self.upvRefArray_verticalLayout_1.addWidget(self.upvRefArray_copyRef_pushButton)
self.upvRefArray_horizontalLayout.addLayout(self.upvRefArray_verticalLayout_1)
self.upvRefArray_verticalLayout_2 = QtWidgets.QVBoxLayout()
self.upvRefArray_verticalLayout_2.setObjectName("upvRefArray_verticalLayout_2")
self.upvRefArrayAdd_pushButton = QtWidgets.QPushButton(self.upvRefArray_groupBox)
self.upvRefArrayAdd_pushButton.setObjectName("upvRefArrayAdd_pushButton")
self.upvRefArray_verticalLayout_2.addWidget(self.upvRefArrayAdd_pushButton)
self.upvRefArrayRemove_pushButton = QtWidgets.QPushButton(self.upvRefArray_groupBox)
self.upvRefArrayRemove_pushButton.setObjectName("upvRefArrayRemove_pushButton")
self.upvRefArray_verticalLayout_2.addWidget(self.upvRefArrayRemove_pushButton)
spacerItem1 = QtWidgets.QSpacerItem(20, 40, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Expanding)
self.upvRefArray_verticalLayout_2.addItem(spacerItem1)
self.upvRefArray_horizontalLayout.addLayout(self.upvRefArray_verticalLayout_2)
self.gridLayout_5.addLayout(self.upvRefArray_horizontalLayout, 0, 0, 1, 1)
self.gridLayout.addWidget(self.upvRefArray_groupBox, 2, 0, 1, 1)
self.pinRefArray_groupBox = QtWidgets.QGroupBox(Form)
self.pinRefArray_groupBox.setObjectName("pinRefArray_groupBox")
self.gridLayout_4 = QtWidgets.QGridLayout(self.pinRefArray_groupBox)
self.gridLayout_4.setObjectName("gridLayout_4")
self.pinRefArray_horizontalLayout = QtWidgets.QHBoxLayout()
self.pinRefArray_horizontalLayout.setObjectName("pinRefArray_horizontalLayout")
self.pinRefArray_verticalLayout = QtWidgets.QVBoxLayout()
self.pinRefArray_verticalLayout.setObjectName("pinRefArray_verticalLayout")
self.pinRefArray_listWidget = QtWidgets.QListWidget(self.pinRefArray_groupBox)
self.pinRefArray_listWidget.setDragDropOverwriteMode(True)
self.pinRefArray_listWidget.setDragDropMode(QtWidgets.QAbstractItemView.InternalMove)
self.pinRefArray_listWidget.setDefaultDropAction(QtCore.Qt.MoveAction)
self.pinRefArray_listWidget.setAlternatingRowColors(True)
self.pinRefArray_listWidget.setSelectionMode(QtWidgets.QAbstractItemView.ExtendedSelection)
self.pinRefArray_listWidget.setSelectionRectVisible(False)
self.pinRefArray_listWidget.setObjectName("pinRefArray_listWidget")
self.pinRefArray_verticalLayout.addWidget(self.pinRefArray_listWidget)
self.pinRefArray_copyRef_pushButton = QtWidgets.QPushButton(self.pinRefArray_groupBox)
self.pinRefArray_copyRef_pushButton.setObjectName("pinRefArray_copyRef_pushButton")
self.pinRefArray_verticalLayout.addWidget(self.pinRefArray_copyRef_pushButton)
self.pinRefArray_horizontalLayout.addLayout(self.pinRefArray_verticalLayout)
self.pinRefArray_verticalLayout_2 = QtWidgets.QVBoxLayout()
self.pinRefArray_verticalLayout_2.setObjectName("pinRefArray_verticalLayout_2")
self.pinRefArrayAdd_pushButton = QtWidgets.QPushButton(self.pinRefArray_groupBox)
self.pinRefArrayAdd_pushButton.setObjectName("pinRefArrayAdd_pushButton")
self.pinRefArray_verticalLayout_2.addWidget(self.pinRefArrayAdd_pushButton)
self.pinRefArrayRemove_pushButton = QtWidgets.QPushButton(self.pinRefArray_groupBox)
self.pinRefArrayRemove_pushButton.setObjectName("pinRefArrayRemove_pushButton")
self.pinRefArray_verticalLayout_2.addWidget(self.pinRefArrayRemove_pushButton)
spacerItem2 = QtWidgets.QSpacerItem(20, 40, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Expanding)
self.pinRefArray_verticalLayout_2.addItem(spacerItem2)
self.pinRefArray_horizontalLayout.addLayout(self.pinRefArray_verticalLayout_2)
self.gridLayout_4.addLayout(self.pinRefArray_horizontalLayout, 0, 0, 1, 1)
self.gridLayout.addWidget(self.pinRefArray_groupBox, 3, 0, 1, 1)
self.fkRefArray_groupBox = QtWidgets.QGroupBox(Form)
self.fkRefArray_groupBox.setObjectName("fkRefArray_groupBox")
self.gridLayout_51 = QtWidgets.QGridLayout(self.fkRefArray_groupBox)
self.gridLayout_51.setObjectName("gridLayout_51")
self.fkRefArray_horizontalLayout = QtWidgets.QHBoxLayout()
self.fkRefArray_horizontalLayout.setObjectName("fkRefArray_horizontalLayout")
self.fkRefArray_verticalLayout_1 = QtWidgets.QVBoxLayout()
self.fkRefArray_verticalLayout_1.setObjectName("fkRefArray_verticalLayout_1")
self.fkRefArray_listWidget = QtWidgets.QListWidget(self.fkRefArray_groupBox)
self.fkRefArray_listWidget.setDragDropOverwriteMode(True)
self.fkRefArray_listWidget.setDragDropMode(QtWidgets.QAbstractItemView.InternalMove)
self.fkRefArray_listWidget.setDefaultDropAction(QtCore.Qt.MoveAction)
self.fkRefArray_listWidget.setAlternatingRowColors(True)
self.fkRefArray_listWidget.setSelectionMode(QtWidgets.QAbstractItemView.ExtendedSelection)
self.fkRefArray_listWidget.setSelectionRectVisible(False)
self.fkRefArray_listWidget.setObjectName("fkRefArray_listWidget")
self.fkRefArray_verticalLayout_1.addWidget(self.fkRefArray_listWidget)
self.fkRefArray_copyRef_pushButton = QtWidgets.QPushButton(self.fkRefArray_groupBox)
self.fkRefArray_copyRef_pushButton.setObjectName("fkRefArray_copyRef_pushButton")
self.fkRefArray_verticalLayout_1.addWidget(self.fkRefArray_copyRef_pushButton)
self.fkRefArray_horizontalLayout.addLayout(self.fkRefArray_verticalLayout_1)
self.fkRefArray_verticalLayout_2 = QtWidgets.QVBoxLayout()
self.fkRefArray_verticalLayout_2.setObjectName("fkRefArray_verticalLayout_2")
self.fkRefArrayAdd_pushButton = QtWidgets.QPushButton(self.fkRefArray_groupBox)
self.fkRefArrayAdd_pushButton.setObjectName("fkRefArrayAdd_pushButton")
self.fkRefArray_verticalLayout_2.addWidget(self.fkRefArrayAdd_pushButton)
self.fkRefArrayRemove_pushButton = QtWidgets.QPushButton(self.fkRefArray_groupBox)
self.fkRefArrayRemove_pushButton.setObjectName("fkRefArrayRemove_pushButton")
self.fkRefArray_verticalLayout_2.addWidget(self.fkRefArrayRemove_pushButton)
spacerItem3 = QtWidgets.QSpacerItem(20, 40, QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Expanding)
self.fkRefArray_verticalLayout_2.addItem(spacerItem3)
self.fkRefArray_horizontalLayout.addLayout(self.fkRefArray_verticalLayout_2)
self.gridLayout_51.addLayout(self.fkRefArray_horizontalLayout, 0, 0, 1, 1)
self.gridLayout.addWidget(self.fkRefArray_groupBox, 4, 0, 1, 1)
self.retranslateUi(Form)
QtCore.QObject.connect(self.ikfk_slider, QtCore.SIGNAL("sliderMoved(int)"), self.ikfk_spinBox.setValue)
QtCore.QObject.connect(self.ikfk_spinBox, QtCore.SIGNAL("valueChanged(int)"), self.ikfk_slider.setValue)
QtCore.QMetaObject.connectSlotsByName(Form)
def retranslateUi(self, Form):
    Form.setWindowTitle(QtWidgets.QApplication.translate("Form", "Form", None, -1))
    self.ikfk_label.setText(QtWidgets.QApplication.translate("Form", "FK/IK Blend", None, -1))
    self.maxStretch_label.setText(QtWidgets.QApplication.translate("Form", "Max Stretch", None, -1))
    self.maxStretch_label_2.setText(QtWidgets.QApplication.translate("Form", "Elbow Thickness", None, -1))
    self.divisions_label.setText(QtWidgets.QApplication.translate("Form", "Divisions", None, -1))
    self.ikTR_checkBox.setText(QtWidgets.QApplication.translate("Form", "IK separated Trans and Rot ctl", None, -1))
    self.mirrorIK_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorIK_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorIK_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorIK_checkBox.setText(QtWidgets.QApplication.translate("Form", "Mirror IK Ctl axis behaviour", None, -1))
    self.mirrorMid_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorMid_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorMid_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.mirrorMid_checkBox.setText(QtWidgets.QApplication.translate("Form", "Mirror Mid Ctl and UPV axis behaviour", None, -1))
    self.extraTweak_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.extraTweak_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.extraTweak_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.extraTweak_checkBox.setText(QtWidgets.QApplication.translate("Form", "Add Extra Tweak Ctl", None, -1))
    self.supportJoints_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.supportJoints_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.supportJoints_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.supportJoints_checkBox.setText(QtWidgets.QApplication.translate("Form", "Support Elbow Joints", None, -1))
    self.guideOrientWrist_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.guideOrientWrist_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.guideOrientWrist_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "This option sets the axis of the mid CTL (elbow) and the up vector control so they move with mirrored behaviour.", None, -1))
    self.guideOrientWrist_checkBox.setText(QtWidgets.QApplication.translate("Form", "This option aligns the wrist with the guide.", None, -1))
    self.smoothStep_checkBox.setToolTip(QtWidgets.QApplication.translate("Form", "Smoothstepped interpolation for the connected shoulder.", None, -1))
    self.smoothStep_checkBox.setStatusTip(QtWidgets.QApplication.translate("Form", "Smoothstepped interpolation for the connected shoulder.", None, -1))
    self.smoothStep_checkBox.setWhatsThis(QtWidgets.QApplication.translate("Form", "Smoothstepped interpolation for the connected shoulder.", None, -1))
    self.smoothStep_checkBox.setText(QtWidgets.QApplication.translate("Form", "Smoothstepped interpolation for the connected shoulder.", None, -1))
    self.squashStretchProfile_pushButton.setText(QtWidgets.QApplication.translate("Form", "Squash and Stretch Profile", None, -1))
    self.ikRefArray_groupBox.setTitle(QtWidgets.QApplication.translate("Form", "IK Reference Array", None, -1))
    self.ikRefArray_copyRef_pushButton.setText(QtWidgets.QApplication.translate("Form", "Copy from UpV Ref", None, -1))
    self.ikRefArrayAdd_pushButton.setText(QtWidgets.QApplication.translate("Form", "<<", None, -1))
    self.ikRefArrayRemove_pushButton.setText(QtWidgets.QApplication.translate("Form", ">>", None, -1))
    self.upvRefArray_groupBox.setTitle(QtWidgets.QApplication.translate("Form", "UpV Reference Array", None, -1))
    self.upvRefArray_copyRef_pushButton.setText(QtWidgets.QApplication.translate("Form", "Copy from IK Ref", None, -1))
    self.upvRefArrayAdd_pushButton.setText(QtWidgets.QApplication.translate("Form", "<<", None, -1))
    self.upvRefArrayRemove_pushButton.setText(QtWidgets.QApplication.translate("Form", ">>", None, -1))
    self.pinRefArray_groupBox.setTitle(QtWidgets.QApplication.translate("Form", "Pin Elbow Reference Array", None, -1))
    self.pinRefArray_copyRef_pushButton.setText(QtWidgets.QApplication.translate("Form", "Copy from IK Ref", None, -1))
    self.pinRefArrayAdd_pushButton.setText(QtWidgets.QApplication.translate("Form", "<<", None, -1))
    self.pinRefArrayRemove_pushButton.setText(QtWidgets.QApplication.translate("Form", ">>", None, -1))
    self.fkRefArray_groupBox.setTitle(QtWidgets.QApplication.translate("Form", "FK Reference Array", None, -1))
    self.fkRefArray_copyRef_pushButton.setText(QtWidgets.QApplication.translate("Form", "Copy from IK Ref", None, -1))
    self.fkRefArrayAdd_pushButton.setText(QtWidgets.QApplication.translate("Form", "<<", None, -1))
    self.fkRefArrayRemove_pushButton.setText(QtWidgets.QApplication.translate("Form", ">>", None, -1))
| 81.613003 | 211 | 0.774402 | 2,712 | 26,361 | 7.356195 | 0.088496 | 0.015789 | 0.070677 | 0.0801 | 0.615539 | 0.484411 | 0.362857 | 0.264712 | 0.260351 | 0.226566 | 0 | 0.012322 | 0.134934 | 26,361 | 322 | 212 | 81.86646 | 0.862524 | 0.020409 | 0 | 0.038961 | 1 | 0.048701 | 0.143029 | 0.032542 | 0 | 0 | 0 | 0 | 0 | 1 | 0.006494 | false | 0 | 0.003247 | 0 | 0.012987 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
011558cdd32ae8aa3765a951ce4a7171f77bdc57 | 200 | py | Python | examples/simple.py | bytewax/bytewax | c935aac8d6eabb0af5a331f59fc4a1300fa6d1e3 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2022-02-05T02:38:35.000Z | 2022-03-31T16:42:52.000Z | examples/simple.py | yutiansut/bytewax | 54dba26cbab2afd24007865f69083a92c05cbdc1 | [
"ECL-2.0",
"Apache-2.0"
] | 9 | 2022-02-08T22:13:56.000Z | 2022-03-31T18:16:12.000Z | examples/simple.py | yutiansut/bytewax | 54dba26cbab2afd24007865f69083a92c05cbdc1 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2022-02-14T23:45:03.000Z | 2022-03-22T06:58:47.000Z | from bytewax import Dataflow, run
flow = Dataflow()
flow.map(lambda x: x * x)
flow.capture()
if __name__ == "__main__":
    for epoch, y in sorted(run(flow, enumerate(range(10)))):
        print(y)
| 20 | 60 | 0.655 | 30 | 200 | 4.1 | 0.733333 | 0.113821 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012422 | 0.195 | 200 | 9 | 61 | 22.222222 | 0.751553 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
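The `examples/simple.py` record above queues a `map` step on a dataflow, captures the output, and runs it over `(epoch, item)` pairs. As a rough stand-in for that pattern (this is *not* the bytewax API beyond what the example itself shows — `MiniFlow` is hypothetical), the same shape can be sketched in plain Python:

```python
# Hypothetical stand-in for the dataflow pattern: steps queued with map()
# are applied in order to each item when the flow is run.
class MiniFlow:
    def __init__(self):
        self.steps = []

    def map(self, fn):
        self.steps.append(fn)

    def run(self, inputs):
        # inputs is an iterable of (epoch, item) pairs, as in the example
        for epoch, item in inputs:
            for fn in self.steps:
                item = fn(item)
            yield epoch, item

flow = MiniFlow()
flow.map(lambda x: x * x)

if __name__ == "__main__":
    for epoch, y in sorted(flow.run(enumerate(range(10)))):
        print(y)
```

The point of the pattern is that building the flow is separate from running it, so the same pipeline can be replayed over different inputs.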
011b1337ba396fd41d130511c15d49e96da40575 | 4,342 | py | Python | scitwi/users/user.py | vahndi/scitwi | e873ea8b21710fd7b9a1cec0da594ccef91c54a2 | [
"MIT"
] | null | null | null | scitwi/users/user.py | vahndi/scitwi | e873ea8b21710fd7b9a1cec0da594ccef91c54a2 | [
"MIT"
] | null | null | null | scitwi/users/user.py | vahndi/scitwi | e873ea8b21710fd7b9a1cec0da594ccef91c54a2 | [
"MIT"
] | null | null | null | from scitwi.users.user_entities import UserEntities
from scitwi.users.user_profile import UserProfile
from scitwi.utils.attrs import bool_attr, datetime_attr, str_attr, obj_attr
from scitwi.utils.attrs import int_attr
from scitwi.utils.strs import obj_string
class User(object):
    """
    Users can be anyone or anything. They tweet, follow, create lists,
    have a home_timeline, can be mentioned, and can be looked up in bulk.

    https://dev.twitter.com/overview/api/users
    """
    def __init__(self, user: dict):
        self.contributors_enabled = bool_attr(user, 'contributors_enabled')
        self.created_at = datetime_attr(user, 'created_at')
        self.default_profile = bool_attr(user, 'default_profile')
        self.default_profile_image = bool_attr(user, 'default_profile_image')
        self.description = str_attr(user, 'description')
        self.entities = obj_attr(user, 'entities', UserEntities)
        self.favourites_count = int_attr(user, 'favourites_count')
        self.follow_request_sent = bool_attr(user, 'follow_request_sent')
        self.following = bool_attr(user, 'following')  # deprecated
        self.followers_count = int_attr(user, 'followers_count')
        self.friends_count = int_attr(user, 'friends_count')
        self.geo_enabled = bool_attr(user, 'geo_enabled')
        self.has_extended_profile = bool_attr(user, 'has_extended_profile')
        self.id_ = int_attr(user, 'id')
        self.is_translation_enabled = bool_attr(user, 'is_translation_enabled')  # not in docs
        self.is_translator = bool_attr(user, 'is_translator')
        self.lang = str_attr(user, 'lang')
        self.listed_count = int_attr(user, 'listed_count')
        self.location = str_attr(user, 'location')
        self.name = str_attr(user, 'name')
        self.notifications = bool_attr(user, 'notifications')  # deprecated
        self.profile = UserProfile(user)
        self.protected = bool_attr(user, 'protected')
        self.screen_name = str_attr(user, 'screen_name')
        self.show_all_inline_media = bool_attr(user, 'show_all_inline_media')
        self.statuses_count = user['statuses_count']
        self.time_zone = user['time_zone']
        self.url = user['url']
        self.utc_offset = user['utc_offset']
        self.verified = user['verified']

    def __str__(self):
        str_out = ''
        str_out += obj_string('Contributors Enabled', self.contributors_enabled)
        str_out += obj_string('Created At', self.created_at)
        str_out += obj_string('Default Profile', self.default_profile)
        str_out += obj_string('Default Profile Image', self.default_profile_image)
        str_out += obj_string('Description', self.description)
        str_out += obj_string('Entities', self.entities)
        str_out += obj_string('Favourites Count', self.favourites_count)
        str_out += obj_string('Follow Request Sent', self.follow_request_sent)
        str_out += obj_string('Following', self.following)
        str_out += obj_string('Followers Count', self.followers_count)
        str_out += obj_string('Friends Count', self.friends_count)
        str_out += obj_string('Geo Enabled', self.geo_enabled)
        str_out += obj_string('Has Extended Profile', self.has_extended_profile)
        str_out += obj_string('Id', self.id_)
        str_out += obj_string('Is Translation Enabled', self.is_translation_enabled)
        str_out += obj_string('Is Translator', self.is_translator)
        str_out += obj_string('Language', self.lang)
        str_out += obj_string('Listed Count', self.listed_count)
        str_out += obj_string('Location', self.location)
        str_out += obj_string('Name', self.name)
        str_out += obj_string('Notifications', self.notifications)
        str_out += obj_string('Profile', self.profile)
        str_out += obj_string('Protected', self.protected)
        str_out += obj_string('Screen Name', self.screen_name)
        str_out += obj_string('Show All Inline Media', self.show_all_inline_media)
        str_out += obj_string('Statuses Count', self.statuses_count)
        str_out += obj_string('Time Zone', self.time_zone)
        str_out += obj_string('Url', self.url)
        str_out += obj_string('UTC Offset', self.utc_offset)
        str_out += obj_string('Verified', self.verified)
        return str_out
| 50.488372 | 94 | 0.68655 | 570 | 4,342 | 4.917544 | 0.168421 | 0.068498 | 0.096325 | 0.160542 | 0.196575 | 0.042098 | 0.021406 | 0 | 0 | 0 | 0 | 0 | 0.199678 | 4,342 | 85 | 95 | 51.082353 | 0.806619 | 0.049516 | 0 | 0 | 0 | 0 | 0.173945 | 0.015614 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | false | 0 | 0.071429 | 0 | 0.128571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
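`User.__init__` above leans on small helpers (`bool_attr`, `int_attr`, `str_attr`, …) to pull optionally-present keys out of the raw API dict and coerce them to one type. A self-contained sketch of that extraction pattern (the real scitwi helpers may behave differently — these bodies are assumptions for illustration):

```python
# Hypothetical re-creation of the dict-extraction helper pattern used by
# User.__init__: each helper returns the key's value coerced to one type,
# or None when the key is absent or null.
def bool_attr(d, key):
    return bool(d[key]) if key in d and d[key] is not None else None

def int_attr(d, key):
    return int(d[key]) if key in d and d[key] is not None else None

def str_attr(d, key):
    return str(d[key]) if key in d and d[key] is not None else None

user = {"id": 42, "name": "Ada", "protected": 0}
assert int_attr(user, "id") == 42
assert str_attr(user, "name") == "Ada"
assert bool_attr(user, "protected") is False
assert bool_attr(user, "verified") is None  # absent key -> None
```

Centralising the coercion keeps `__init__` flat and makes missing fields explicit (`None`) rather than raising `KeyError` mid-construction.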
01289ec9d619dfeb01a3deb3962077cae286d130 | 1,031 | py | Python | futura_ui/app/ui/widgets/recipe_actions.py | pjamesjoyce/futura | fc4558bd07626b0d1e89093c0107ccd989ceaa6a | [
"BSD-3-Clause"
] | 6 | 2020-05-04T16:48:03.000Z | 2022-03-29T14:58:16.000Z | futura_ui/app/ui/widgets/recipe_actions.py | pjamesjoyce/futura | fc4558bd07626b0d1e89093c0107ccd989ceaa6a | [
"BSD-3-Clause"
] | 1 | 2021-09-13T06:41:21.000Z | 2021-09-13T06:41:21.000Z | futura_ui/app/ui/widgets/recipe_actions.py | pjamesjoyce/futura | fc4558bd07626b0d1e89093c0107ccd989ceaa6a | [
"BSD-3-Clause"
] | 1 | 2020-11-13T23:02:18.000Z | 2020-11-13T23:02:18.000Z | from PySide2 import QtWidgets
from ..utils import load_ui_file
from ...signals import signals
import os
from functools import partial
emit_0 = partial(signals.start_status_progress.emit, 0)
class RecipeWidget(QtWidgets.QWidget):
    def __init__(self, parent=None):
        super(RecipeWidget, self).__init__(parent)

        ui_path = 'recipe_actions.ui'
        ui_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), ui_path)
        load_ui_file(ui_path, self)

        #self.baseDatabaseButton.pressed.connect(signals.show_load_actions.emit)
        self.baseDatabaseButton.pressed.connect(signals.add_base_database.emit)
        self.technologyManualButton.pressed.connect(signals.test_filter.emit)
        self.regionaliseButton.pressed.connect(signals.regionalisation_wizard.emit)
        self.exportButton.pressed.connect(signals.export_recipe.emit)
        self.technologyFileButton.pressed.connect(signals.add_technology_file.emit)
        self.marketsButton.pressed.connect(signals.markets_wizard.emit)
| 38.185185 | 83 | 0.766246 | 127 | 1,031 | 5.952756 | 0.401575 | 0.12963 | 0.194444 | 0.095238 | 0.113757 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003382 | 0.13967 | 1,031 | 26 | 84 | 39.653846 | 0.848929 | 0.068865 | 0 | 0 | 0 | 0 | 0.017745 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.277778 | 0 | 0.388889 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
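`RecipeWidget` above connects each button's `pressed` signal directly to another signal's `emit`, and the module-level `emit_0 = partial(signals.start_status_progress.emit, 0)` pre-binds an argument so the result is a zero-argument callable that can be hooked anywhere a plain slot is expected. The pre-binding trick can be sketched without Qt (this `Signal` class is a hypothetical stand-in, not PySide2's):

```python
from functools import partial

# Hypothetical minimal signal: callbacks subscribe via connect() and all
# run when emit() fires.
class Signal:
    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)

progress = Signal()
seen = []
progress.connect(seen.append)

# Pre-bind the argument, mirroring emit_0 = partial(...emit, 0) above.
emit_0 = partial(progress.emit, 0)
emit_0()  # fires progress with 0; no arguments needed at call time
assert seen == [0]
```

`partial` is handy here because argument-less signals (like a button press) cannot pass the `0` themselves; the binding supplies it at connect time.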
0128e338ffaee1248aaf4b27db3db28655ac577f | 280 | py | Python | frontend/Untitled.py | hanisaf/ODIN | 8be6af05caa43a7effd3878c8bcc0147080e70b1 | [
"MIT"
] | null | null | null | frontend/Untitled.py | hanisaf/ODIN | 8be6af05caa43a7effd3878c8bcc0147080e70b1 | [
"MIT"
] | null | null | null | frontend/Untitled.py | hanisaf/ODIN | 8be6af05caa43a7effd3878c8bcc0147080e70b1 | [
"MIT"
] | null | null | null |
# coding: utf-8
# In[5]:
from flask import Flask
#from werkzeug.utils import find_modules, import_str
# In[ ]:
# In[2]:
def create_app(config=None):
    app = Flask(__name__)
    app.config.update(config or {})
    #register_blueprints(app)
    return app
# In[ ]:
| 10 | 52 | 0.635714 | 39 | 280 | 4.358974 | 0.641026 | 0.105882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013889 | 0.228571 | 280 | 27 | 53 | 10.37037 | 0.773148 | 0.414286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
012993199dbd76e26b0f1ab66da3cb902a1d8d4e | 1,141 | py | Python | using_insert_many.py | julio-coelho/python-playground | 730981d74471449ecdd811bfa6c862b5d9c1981f | [
"MIT"
] | null | null | null | using_insert_many.py | julio-coelho/python-playground | 730981d74471449ecdd811bfa6c862b5d9c1981f | [
"MIT"
] | null | null | null | using_insert_many.py | julio-coelho/python-playground | 730981d74471449ecdd811bfa6c862b5d9c1981f | [
"MIT"
] | null | null | null |
import sys
import pymongo
connection = pymongo.MongoClient('127.0.0.1', 27017)
def insert():
    print("insert many, reporting for duty")
    db = connection.school
    people = db.people
    richard = {
        'name': 'Richard Kreuter',
        'company': '10gen',
        'interest': [
            'horsing',
            'skydiving',
            'fencing'
        ]
    }
    andrew = {
        '_id': 'erlichson',
        'name': 'Andrew Erlichson',
        'company': '10gen',
        'interest': [
            'running',
            'cycling',
            'photography'
        ]
    }
    people_to_insert = [andrew, richard]
    try:
        people.insert_many(people_to_insert)
    except Exception as e:
        print("Unexpected error:", type(e), e)

def print_people():
    db = connection.school
    people = db.people
    try:
        cursor = people.find()
    except Exception as e:
        print("Unexpected error:", type(e), e)
    for doc in cursor:
        print(doc)

print("Before the insert_many, here are the people")
print_people()
insert()
print("After the insert_many, here are the people")
print_people()
| 18.111111 | 52 | 0.555653 | 122 | 1,141 | 5.106557 | 0.42623 | 0.064205 | 0.057785 | 0.077047 | 0.372392 | 0.372392 | 0.269663 | 0.269663 | 0.269663 | 0.141252 | 0 | 0.019608 | 0.329536 | 1,141 | 62 | 53 | 18.403226 | 0.794771 | 0 | 0 | 0.355556 | 0 | 0 | 0.260526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.044444 | null | null | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
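`using_insert_many.py` wraps `insert_many` in try/except because a duplicate `_id` makes the batch fail partway. That failure mode can be sketched with an in-memory stand-in collection (hypothetical — this is not the pymongo API, which raises its own bulk-write error type):

```python
# In-memory stand-in illustrating why the script above guards insert_many:
# an ordered insert stops at the first duplicate _id, leaving earlier
# documents of the batch already inserted. Hypothetical sketch, not pymongo.
class FakeCollection:
    def __init__(self):
        self.docs = {}

    def insert_many(self, docs):
        for doc in docs:
            _id = doc.setdefault("_id", object())  # auto-id when missing
            if _id in self.docs:
                raise ValueError("duplicate key: %r" % (_id,))
            self.docs[_id] = doc

people = FakeCollection()
people.insert_many([{"_id": "erlichson", "name": "Andrew Erlichson"}])
try:
    # the second batch re-uses an existing _id, so it fails partway through
    people.insert_many([{"_id": "kreuter"}, {"_id": "erlichson"}])
except ValueError as e:
    print("Unexpected error:", type(e), e)

assert "kreuter" in people.docs  # documents before the duplicate were kept
```

The partial-insert behaviour is why real code often inspects the bulk-write error details rather than assuming the whole batch was rolled back.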