hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c61cab3c865673fe6c842b370482d357d813efac | 749 | py | Python | scripts/deploy_esp/seo/error.py | smart-edge-open/open-developer-experience-kits | e33837157ff9b8f255be946e6c24e51bc76d8c88 | [
"Apache-2.0"
] | 3 | 2021-11-23T06:30:33.000Z | 2022-02-23T08:50:55.000Z | scripts/deploy_esp/seo/error.py | smart-edge-open/open-developer-experience-kits | e33837157ff9b8f255be946e6c24e51bc76d8c88 | [
"Apache-2.0"
] | 7 | 2021-10-13T02:59:30.000Z | 2022-03-14T13:06:36.000Z | scripts/deploy_esp/seo/error.py | smart-edge-open/open-developer-experience-kits | e33837157ff9b8f255be946e6c24e51bc76d8c88 | [
"Apache-2.0"
] | 7 | 2021-12-04T04:44:29.000Z | 2022-03-20T02:11:33.000Z | # SPDX-License-Identifier: Apache-2.0
# Copyright (c) 2021-2022 Intel Corporation
""" Error handling related utilities """
import enum
TS_REF = "See the Troubleshooting section of the Intel® Smart Edge Open Provisioning Process document"
class Codes(enum.Enum):
""" Script exit codes """
NO_ERROR = 0
GENERIC_ERROR = 1
MISSING_PREREQUISITE = 2
ARGUMENT_ERROR = 3
CONFIG_ERROR = 4
RUNTIME_ERROR = 5
class AppException(Exception):
"""
Exception indicating application error which, if not handled, should result in the
application exit with the error message printed to the screen
"""
def __init__(self, code, msg=None):
super().__init__()
self.code = code
self.msg = msg
| 24.16129 | 102 | 0.684913 | 98 | 749 | 5.091837 | 0.714286 | 0.032064 | 0.048096 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027826 | 0.23231 | 749 | 30 | 103 | 24.966667 | 0.838261 | 0.368491 | 0 | 0 | 0 | 0 | 0.206818 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.071429 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c624e476645db0e6cf50b94ea175b922b41212d1 | 608 | py | Python | src/core/config.py | week-with-me/fastapi-graphql | faf0ab428e47f09c1d4d3939c03e98320962a45b | [
"MIT"
] | null | null | null | src/core/config.py | week-with-me/fastapi-graphql | faf0ab428e47f09c1d4d3939c03e98320962a45b | [
"MIT"
] | null | null | null | src/core/config.py | week-with-me/fastapi-graphql | faf0ab428e47f09c1d4d3939c03e98320962a45b | [
"MIT"
] | null | null | null | from functools import lru_cache
from pydantic import BaseSettings, Field
class Settings(BaseSettings):
LEVEL: str
PROJECT_TITLE: str = 'FastAPI with GraphQL and REST'
GRAPHQL_API: str = '/graphql'
REST_API: str = '/rest'
COMMON_API: str = '/api'
class Config:
env_file = ".env"
class DevelopSettings(Settings):
DB_URL: str = Field(env="DEVELOP_DB_URL")
class ProductSettings(Settings):
DB_URL: str = Field(env="PRODUCT_DB_URL")
@lru_cache
def get_settings():
return DevelopSettings() if Settings().LEVEL == 'DEVELOP' \
else ProductSettings() | 21.714286 | 63 | 0.677632 | 74 | 608 | 5.378378 | 0.472973 | 0.050251 | 0.065327 | 0.080402 | 0.105528 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217105 | 608 | 28 | 64 | 21.714286 | 0.836134 | 0 | 0 | 0 | 0 | 0 | 0.139573 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.111111 | 0.055556 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c62683defad49be62b8ff124abb0d5537a0522a7 | 537 | py | Python | src/spaceone/notification/manager/notification_manager.py | jihyungSong/plugin-megazone-sms-notification-protocol | 93c12937137da6f06fd94c1fa152666792bac72b | [
"Apache-2.0"
] | 1 | 2021-06-22T08:00:05.000Z | 2021-06-22T08:00:05.000Z | src/spaceone/notification/manager/notification_manager.py | jihyungSong/plugin-megazone-sms-notification-protocol | 93c12937137da6f06fd94c1fa152666792bac72b | [
"Apache-2.0"
] | 1 | 2022-02-18T03:47:44.000Z | 2022-02-28T01:54:22.000Z | src/spaceone/notification/manager/notification_manager.py | jihyungSong/plugin-megazone-sms-notification-protocol | 93c12937137da6f06fd94c1fa152666792bac72b | [
"Apache-2.0"
] | 1 | 2021-06-22T08:00:07.000Z | 2021-06-22T08:00:07.000Z | from spaceone.core.manager import BaseManager
from spaceone.notification.manager.megazone_sms_manager import MegazoneSMSManager
class NotificationManager(BaseManager):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
def dispatch(self, access_key, secret_key, title, body, to, **kwargs):
mz_sms_mgr: MegazoneSMSManager = self.locator.get_manager('MegazoneSMSManager')
mz_sms_mgr.set_connector(access_key, secret_key)
mz_sms_mgr.request_send_sms(title, body, to, **kwargs) | 41.307692 | 87 | 0.752328 | 66 | 537 | 5.757576 | 0.5 | 0.039474 | 0.063158 | 0.094737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143389 | 537 | 13 | 88 | 41.307692 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0.033457 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c628b43cf9b0fb37bb58af6755daabef9e05b1e2 | 2,636 | py | Python | second/zp/config.py | zhangpur/second.pytorch | a2addd2d9f96126d4089427faf8db9607b413e17 | [
"MIT"
] | null | null | null | second/zp/config.py | zhangpur/second.pytorch | a2addd2d9f96126d4089427faf8db9607b413e17 | [
"MIT"
] | null | null | null | second/zp/config.py | zhangpur/second.pytorch | a2addd2d9f96126d4089427faf8db9607b413e17 | [
"MIT"
] | null | null | null | from second.zp.utils_zp import *
'''
g_data_root='/home/zp/data/nuscene/nuscene'
g_data_name='nuscene-trainval'
g_version='v1.0-trainval'
'''
g_data_root='/home/zp/data/nuscene/nuscene-mini'
g_data_name='nuscene-mini'
g_version='v1.0-mini'
g_DataPrepare_config=g_data_root + '/DataPrepare.config'
g_VoxelGenerator_config=g_data_root + '/VoxelGenerator.config'
def get_parser_DataPrepare():
parser = argparse.ArgumentParser(
description='DataPrepare')
parser.add_argument(
'--data_name',default=g_data_name)
parser.add_argument(
'--cache_name',default='DataPrepare_cache.pkl')
parser.add_argument(
'--data_root',default=g_data_root)
parser.add_argument(
'--version',default=g_version)
parser.add_argument(
'--phase',default='train')
parser.add_argument(
'--verbose',default=True)
parser.add_argument(
'--seq_length',default=40)
parser.add_argument(
'--obs_length',default=20)
parser.add_argument(
'--pred_length',default=20)
parser.add_argument(
'--interval',default=1)
parser.add_argument(
'--use_image',default='last_image',# 'last_image' or 'key_images'
help='last_image or key_images')
return parser
def get_parser_VoxelGenerator():
parser = argparse.ArgumentParser(
description='VoxelGenerator')
parser.add_argument(
'--full_empty_part_with_mean',default=False)
parser.add_argument(
'--point_cloud_range',default=[-72, -40, -2, 72, 40, 5])
parser.add_argument(
'--voxel_size',default=[0.2, 0.2, 0.2])
parser.add_argument(
'--max_number_of_points_per_voxel',default=40)
parser.add_argument(
'--block_filtering',default=False,
help='filter voxels by block height')
parser.add_argument(
'--block_factor',default=1,
help='height calc width: voxel_size * block_factor * block_size= (0.2 * 1 * 8) ')
parser.add_argument(
'--block_size',default=3)
parser.add_argument(
'--height_threshold',default=0.2,
help='locations with height < height_threshold will be removed.')
parser.add_argument(
'--bev_data',default=['bev_img','bev_index'])
return parser
def Configuration(*config_items):
args={}
for config_item in config_items:
get_parser=globals()['get_parser_'+config_item]
config_path=globals()['g_'+config_item+'_config']
parser = get_parser()
p = parser.parse_args()
if not load_arg(p,config_path):
save_arg(p,config_path)
args[config_item]=p
return args
| 32.54321 | 89 | 0.665402 | 337 | 2,636 | 4.902077 | 0.305638 | 0.108959 | 0.205811 | 0.039952 | 0.134383 | 0.078692 | 0.039952 | 0.039952 | 0 | 0 | 0 | 0.017469 | 0.19651 | 2,636 | 80 | 90 | 32.95 | 0.762512 | 0.010622 | 0 | 0.347826 | 0 | 0 | 0.261809 | 0.054444 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.014493 | 0 | 0.101449 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c628d640e4fd06ace30f86bf4dfc6b7cd4a832ef | 32,199 | py | Python | TAO/Firewall/BUZZDIRECTION/BUZZ_1120/LP/Scripts/Lp_UserInterface.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 46 | 2017-05-15T11:15:08.000Z | 2018-07-02T03:32:52.000Z | TAO/Firewall/BUZZDIRECTION/BUZZ_1120/LP/Scripts/Lp_UserInterface.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | null | null | null | TAO/Firewall/BUZZDIRECTION/BUZZ_1120/LP/Scripts/Lp_UserInterface.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 24 | 2017-05-17T03:26:17.000Z | 2018-07-09T07:00:50.000Z | import cmd
import os
import Lp_FrontEndFunctions
import Lp_CursesDriver
import Lp_XmlParser
import Lp_RpcDispatcher
import string
import sys
import socket
import textwrap
import time
import subprocess
import signal
import platform
import threading
from datetime import datetime
BLOCKER_PORT = 1340
RPC_DISPATCH_PORT = 1339
PRINT_PORT = 1338
BACKEND_PORT = 1337
FRONTEND_PORT = 1336
#Prints anything received from the backend on port 1338.
class PrintThread(threading.Thread):
def __init__(self,processor,lFile):
self.sock=socket.socket(socket.AF_INET,socket.SOCK_DGRAM)
self.sock.bind(('127.0.0.1',PRINT_PORT))
self.sock.settimeout(.1)
self.cmdLoop=processor
self.log=lFile
threading.Thread.__init__(self)
def run(self):
while 1:
try:
stringIn,addr=self.sock.recvfrom(1024)
if stringIn.find('RECV')>=0:
self.sock.sendto(stringIn,('127.0.0.1',RPC_DISPATCH_PORT))
elif stringIn == '!!#QUIT':
break
else:
print stringIn,
self.log.write(stringIn)
sys.stdout.flush()
except:
continue
self.sock.close()
#Parses input and executes the appropriate command.
class LpInputProcessing(cmd.Cmd):
def __init__(self,functionsIn,lFile,lark,lsock):
self.prompt="LP> "
self.functions=functionsIn
self.helpDict={}
self.Modules={}
self.architecture=''
self.fMaps={}
self.logFile=lFile
self.lpArch=lark
self.defaultOutDir=''
self.lpSock=lsock
self.printBlocker = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
self.printBlocker.bind(('127.0.0.1',BLOCKER_PORT))
cmd.Cmd.__init__(self)
def setDefaultOutDir(self):
self.defaultOutDir = Lp_XmlParser.parseDefaultDir()
def preloop(self):
self.logFile.write(self.prompt)
self.do_help("")
#This function is executed immediately before the command entered by the user is handled by the
#cmd class. For a command that is a key in the function dictionary, the user input is handled
#by the genericCmd function.
def precmd(self,line):
usrIn=line.split(" ")
fName=self.__resolveFuncName(usrIn[0])
if fName=='quit':
return 'quit'
for mod in self.Modules:
#Is the commanded function a key in the dictionary for the module 'mod'?
if fName in self.Modules[mod]:
oldDict={}
oldDict=self.Modules[mod][fName].copy()
cmdRes=self.genericCmd(mod,usrIn)
if cmdRes>=0 and fName=='burn':
self.lpSock.settimeout(60)
try:
burnConf = self.lpSock.recv(1024)
while burnConf.find('RECV RPC')<0:
burnConf = self.printBlocker.recv(1024)
retCode = burnConf[burnConf.find('rc=')+3:len(burnConf)-1]
if retCode == '0':
print "Burn Successful"
self.logFile.write("Burn Successful.\n")
return 'abort'
else:
print "Burn rpc returned an error: %s"%retCode
print "The implant may have burned."
self.lpSock.setblocking(1)
return ''
except socket.timeout:
print "Did not receive confirmation of burn. The implant may have burned."
self.logFile.write("Did not receive confirmation of burn. The implant may have burned.\n")
self.lpSock.setblocking(1)
return 'abort'
elif cmdRes>=0 and fName=="upgrade":
stillActive=1
rpcRes=-1
ret=self.functions.cmdGeneric('port',
{'command':'!port','outport':str(FRONTEND_PORT),
'endOnString':'DONE'},
{})
self.lpSock.settimeout(60)
while stillActive==1:
try:
self.lpSock.sendto('!stat\n',('127.0.0.1',BACKEND_PORT))
stillActive=0
lineIn=self.lpSock.recv(1024)
while lineIn.find('DONE')<0:
if lineIn.find('RECV')>=0:
rpcRes=int(lineIn[lineIn.find('rc=')+3:len(lineIn)])
elif lineIn.find('RPC ID')>=0:
newLine=lineIn.find('\n')
rpcN=int(lineIn[7:newLine])
if rpcN==cmdRes:
prog = lineIn[lineIn.find('Sent:')+6:
lineIn.find('\nTotal')]
totalPkts = lineIn[lineIn.find('RPC:')+5:]
print "\rProgress: %.1f%s"%((float(prog)/float(totalPkts)*
100),'%'),
sys.stdout.flush()
stillActive=1
lineIn=self.lpSock.recv(1024)
line=''
time.sleep(.5)
except socket.timeout:
line='abort'
print 'Socket timed out while receiving status. Backend is not responding.'
self.logFile.write('Socket timed out while receiving status. Backend is not responding.\n')
break
except KeyboardInterrupt:
print "\nUpfile transfer aborted."
self.logFile.write('\nUpfile transfer aborted.\n')
return 'abort'
if rpcRes == 0:
print "Upgrade file sent."
self.logFile.write("Upgrade file sent.\n")
return 'abort'
else:
self.functions.cmdGeneric('port',{'command':'!port','outport':str(PRINT_PORT),
'endOnString':'DONE'},{})
self.lpSock.setblocking(1)
cmdRes = rpcRes
if cmdRes == -2:
line='abort'
break
self.Modules[mod][fName]=oldDict
return ""
return line
def deleteMod(self,toUnload,okToDelete):
if okToDelete>0:
return
keys=self.Modules.keys()
for key in keys:
if self.Modules[key]['iface']==toUnload[0]:
del self.Modules[key]
del self.helpDict[key]
def printTunnelCmd(self,tunnelCommand,okToPrint):
if okToPrint==0:
print tunnelCommand[0]
#Handles user input if the commanded function has an entry in the function dictionary.
#Arguments:
# mod: the module that contains this function
# usrIn: raw input that the user entered at the command prompt
#Return:
# An integer representing the result of the function call
def genericCmd(self,mod,usrIn):
fName=self.__resolveFuncName(usrIn[0])
newIn=[]
#Strip empty strings from usrIn
for el in usrIn:
if el!='':
newIn.append(el)
usrIn=newIn
try:
#Check if this function takes input arguments
if self.Modules[mod][fName]['noargs']=="true":
if len(usrIn)>1:
print "The command %s does not accept input arguments."%fName
self.logFile.write("The command %s does not accept input arguments.\n"%fName)
return
except KeyError:
print "No functions found for module '%s'"%mod
self.logFile.write("No functions found for module '%s'"%mod)
return -1
if self.Modules[mod][fName]['confirm']=='1':
try:
confirmation=raw_input("Are you sure you want to execute %s?\nY\N: "%fName)
except KeyboardInterrupt:
print
return -1
self.logFile.write("Are you sure you want to execute %s?"%fName)
self.logFile.write((confirmation+'\n'))
if confirmation!='y' and confirmation!='Y':
return -1
#Check if this function should use a curses form. If so, then extract the necessary arguments
#from the curses return and call the function.
if self.Modules[mod][fName]['curses']=="true" and len(usrIn)==1:
self.logFile.write('Curses form entered.\n')
arguments=[]
formType=''
if fName=='redirtunnel':
formType='redir'
elif fName=='outtunnel':
formType='out'
else:
formType='default'
form=Lp_CursesDriver.CursesDriver(
self.Modules[mod][fName]['cursesPrompts'],
len(self.Modules[mod][fName]['cursesPrompts']),
formType
)
arguments=form.runCurses()
self.logFile.write('***Raw output from curses form***\n')
self.logFile.write('%s\n%s\n******\n'%(arguments[0],arguments[1]))
#clear the screen to eliminate any oddities that result from the terminal being resized.
os.system('clear')
res=0
for arg in arguments:
validArg=self.__checkArg(arg)
if arg!=[] and validArg==1:
argDict=dict(arg)
if 'transprot' in argDict:
protocol=argDict['transprot']
if protocol=='udp' or protocol=='UDP':
argDict['transprot']='17'
elif protocol=='tcp' or protocol=='TCP':
argDict['transprot']='6'
else:
print "Bad Protocol Selection."
self.logFile.write("Bad Protocol Selection.\n")
return -1
argList=argDict.items()
#build string for creating this tunnel on command line
if fName=='redirtunnel' or fName=='outtunnel':
cmdString=fName+' netprot 2048'
for el in argList:
cmdString+=' '+el[0]+' '+el[1]
tunnelCmd='\nCommand to create this tunnel via command line:\n%s\n'%cmdString
keys=argDict.keys()
for key in keys:
try:
self.Modules[mod][fName][key]=argDict[key].rstrip()
except KeyError:
print "assignment failed"
self.logFile.write("assignment failed\n")
return
res=self.functions.cmdGeneric(fName,self.Modules[mod][fName],{})
if fName=='redirtunnel' or fName=='outtunnel':
self.printTunnelCmd([tunnelCmd],0)
elif arg!=[] and validArg!=0:
print "Error. Input not provided for %s."%validArg
self.logFile.write("Error. Input not provided for %s.\n"%validArg)
return res
if self.Modules[mod][fName]['useDirList']=='1' and len(usrIn)==1:
try:
dirListParams=self.Modules[mod][fName]['dirListParams']
directory=dirListParams['baseDir']
if dirListParams['prependCWD']=='1':
directory='%s%s'%(os.getcwd(),directory)
if dirListParams['appendImplantArch']=='1':
directory='%s/%s'%(directory,self.architecture)
dirListing=[]
initialList=os.listdir(directory)
for item in initialList:
if item.find(dirListParams['fileEx'])>=0:
dirListing.append(item)
moduleIndexes={}
while 1:
print dirListParams['prePrint']
try:
if dirListParams['showIfaceNumbers']=='1':
for i in range(0, len(dirListing),1):
modName=dirListing[i].split(dirListParams['modNameSplitChar'])[0]
ifaceArgs=[modName,self.architecture, self.lpArch]
iface=Lp_XmlParser.parseIface(ifaceArgs)
if iface<0:
print "Unable to find iface number for %s."%modName
print "Ensure the xml file for this module is located in the proper directory."
else:
print "%d: %s"%(iface,dirListing[i])
#Create the mapping of iface number to position in list
#This allows user to enter iface number to select mod
moduleIndexes[iface]=i
input=raw_input(dirListParams['listPrompt'])
input=int(input)
selection=moduleIndexes[input]
self.Modules[mod][fName][dirListParams['promptToSet']]='%s/%s'%(directory,
dirListing[int(selection)])
break
else:
for i in range(0, len(dirListing),1):
print "%d: %s"%(i,dirListing[i])
selection=raw_input(dirListParams['listPrompt'])
self.Modules[mod][fName][dirListParams['promptToSet']]='%s/%s'%(directory,
dirListing[int(selection)])
break
except (IndexError, ValueError, KeyError):
print "\nInvalid selection.\n"
if dirListParams['requireXml']=='1':
modStripped=dirListing[int(selection)].split(dirListParams['modNameSplitChar'])[0]
modFd=open('%s/%s.xml'%(directory,modStripped))
modFd.close()
except IOError:
print 'No xml configuration file found for this module.'
return -1
except OSError:
print "%s not found."%directory
self.logFile.write("%s not found.\n"%directory)
except KeyboardInterrupt:
print
return -1
if self.Modules[mod][fName]['useSwitch']=='1' and len(usrIn)==1:
try:
switchParams=self.Modules[mod][fName]['switchParams']
while 1:
response=raw_input(str(switchParams['prompt']))
possibleInputs=switchParams['switchOpts'].keys()
correctKey=0
for input in possibleInputs:
input=str(input)
res=input.find(response)
if res>=0:
correctKey=input
if correctKey==0:
print "Invalid selection."
else:
for argument in switchParams['switchOpts'][correctKey]:
try:
self.Modules[mod][fName][str(argument[0])]=str(argument[1])
except KeyError:
print "Internal Error: argument specified in switch not found as an argument for %s."%fName
break
except KeyboardInterrupt:
print
return -1
if self.Modules[mod][fName]['printFunc'] != [] and len(usrIn)==1:
self.printFunctionOut(self.Modules[mod][fName]['printFunc'])
if self.Modules[mod][fName]['useDefaultDir'] != [] and len(usrIn) == 1:
argument = self.Modules[mod][fName]['useDefaultDir']
self.Modules[mod][fName][argument] = self.defaultOutDir
#Extract arguments from the entered command and assign values in the function dictionary
#if no other method of obtaining arguments is defined for this command.
if len(usrIn)>1:
for i in range(1,len(usrIn)-1,2):
try:
res=self.Modules[mod][fName][usrIn[i]]
self.Modules[mod][fName][usrIn[i]]=usrIn[i+1]
except:
print "Incorrect argument: %s"%usrIn[i]
self.logFile.write("Incorrect argument: %s\n"%usrIn[i])
return
if self.Modules[mod][fName]['useArgConfirm']=='1':
res=self.functions.cmdGeneric(fName, self.Modules[mod][fName],
self.Modules[mod][fName]['argConfirmParams'])
else:
res=self.functions.cmdGeneric(fName, self.Modules[mod][fName],{})
return res
#Used to print the result of a function when the lp needs to confirm the completion of the function that
#caused the print.
def printFunctionOut(self,func):
self.printBlocker.sendto('!!#TURN_OFF_PRINTING',('127.0.0.1',RPC_DISPATCH_PORT))
if func=='mods':
print "******************Loaded Modules*****************"
self.logFile.write("******************Loaded Modules*****************\n")
res = self.functions.cmdGeneric('mods',{'command':'!mods'},{})
self.printBlocker.sendto('!!#REG_BLOCK%d'%res,('127.0.0.1',RPC_DISPATCH_PORT))
result = self.printBlocker.recv(1024)
elif func=='listtunnels':
res = self.functions.cmdGeneric('listtunnels',{'command':'!call','ciface':'34',
'cfunc':'2','cprov':'1'},{})
self.printBlocker.sendto('!!#REG_BLOCK%d'%res,('127.0.0.1',RPC_DISPATCH_PORT))
result = self.printBlocker.recv(1024)
self.printBlocker.sendto('!!#TURN_ON_PRINTING',('127.0.0.1',RPC_DISPATCH_PORT))
def parseXml(self, mod, requireLpEx):
Lp_XmlParser.parseMod([self.helpDict, self.Modules, mod, self.architecture, self.lpArch, self.fMaps,
requireLpEx, self.functions], 0)
#The call command can be used to browse loaded modules and call any function from any loaded module.
#It is intended as an advanced/developer command, and therefore will not be shown by help.
#It can be called by entering 'call' at the lp prompt.
def do_call(self,line):
usrIn=line.split(" ")
if len(usrIn)!=1:
return
#print "Incorrect number of arguments. Enter \'help call\' for usage information."
elif usrIn[0]=="":
res=self.functions.cmdCall(self.printBlocker)
else:
return
#print "Incorrect number of arguments. Enter \'help call\' for usage information."
#Prints all of the functions available from each module by going through helpDict
def do_help(self,line):
usrIn=line.split(" ")
if line=="":
print "****Available Commands****"
self.logFile.write("****Available Commands****\n")
keys=self.helpDict.keys()
for key in keys:
self.__printModuleFunctions(key)
else:
#Check if user entered a module name
if usrIn[0] in self.helpDict:
self.__printModuleFunctions(usrIn[0])
return
keys=self.helpDict.keys()
#Check if user entered module iface number
for key in keys:
try:
modStr='padding/%s_debug.mo'%key
ifaceNum=Lp_XmlParser.parseIface([modStr, self.architecture, self.lpArch])
if str(ifaceNum)==usrIn[0]:
self.__printModuleFunctions(key)
return
except KeyError:
break
fName=self.__resolveFuncName(usrIn[0])
#Check if user entered a function name or number
for key in keys:
if fName in self.helpDict[key]:
use='\n%s\n'%textwrap.fill(self.helpDict[key][fName]['usage'])
text='%s\n'%textwrap.fill(self.helpDict[key][fName]['text'])
print use
self.logFile.write('%s\n'%use)
print text
self.logFile.write('%s\n'%text)
return
print "No help information found for: %s"%fName
self.logFile.write("No help information found for: %s\n"%fName)
#Binding for help command
def do_h(self,line):
self.do_help(line)
#Exits the LP without printing modules after receiving confirmation of burn.
def do_abort(self,line):
print "Goodbye"
return True
#Prints modules loaded and then exits the LP
def do_exit(self,line):
#Display loaded modules on exit. Port is changed to 1336 so that the front end can make sure
#that the entire list is printed before exiting.
try:
self.printFunctionOut('mods')
res=functions.cmdGeneric('term',{'command':'!term','endOnString':'DONE'},{})
print "Goodbye"
self.logFile.write("Goodbye\n")
self.logFile.write('Session terminated at %s'%str(datetime.now()))
return True
except KeyboardInterrupt:
print "Goodbye"
self.logFile.write("Goodbye\n")
self.logFile.write('Session terminated at %s'%str(datetime.now()))
return True
#Command binding to exit LP
def do_quit(self,line):
self.do_exit(line)
return True
#Command binding to exit LP
def do_logout(self,line):
self.do_exit(line)
return True
#Command binding to exit LP
def do_EOF(self,line):
self.do_exit(line)
return True
def emptyline(self):
pass
def setArch(self,inArch):
self.architecture=inArch
def __sort(self,modKey,toSort):
toReturn=[]
fMaps={}
fNums=[]
for fName in toSort:
fNum=self.helpDict[modKey][fName]['fnum']
fMaps[int(str(fNum).split('.')[1])]=fName
fNums.append(int(str(fNum).split('.')[1]))
fNums=sorted(fNums)
for num in fNums:
toReturn.append(fMaps[num])
return toReturn
#Checks if there are any empty strings in a list returned by CursesDriver
def __checkArg(self,toCheck):
for element in toCheck:
if element[1]=='':
return element[0]
return 1
def __resolveFuncName(self,name):
try:
function=self.fMaps[name]
except KeyError:
function=name
return function
def __printModuleFunctions(self,modName):
fKeys=self.helpDict[modName].keys()
if len(fKeys)<=0:
return
print 'Module: %s'%modName
self.logFile.write('Module: %s\n'%modName)
sortedfKeys=self.__sort(modName,fKeys)
for fKey in sortedfKeys:
if self.helpDict[modName][fKey]['nodisplay']!="true":
disp=" %s: %s"%(self.helpDict[modName][fKey]['fnum'],fKey)
print disp
self.logFile.write((disp+'\n'))
print
#Forces backend to ignore TERM signals sent to the front end on ctrl-c
def preexec_fcn():
signal.signal(signal.SIGINT,signal.SIG_IGN)
#Connects to the implant and prints the currently loaded modules and implant uptime.
#Arguments:
# proc: the Lp Input Processing object
# sock: the socket used to communicate with backend
#Return:
# 1 on success
# -1 in fail
def showWelcome(proc,func,sock,outFile,lpArch):
currentDirectory=os.getcwd()
supportedArchs={'062':'x86_64','003':'i386','020':'ppc','021':'ppc64',
'002':'sparc','008':'mips_be','010':'mips_le',
'040':'arm','043':'sparcv9'}
openArgs={"command":"!open","dstip":sys.argv[1],"dstport":sys.argv[2],"srcip":sys.argv[3],
"srcport":sys.argv[4],"keyfile":sys.argv[5],'endOnString':'DONE'}
res = func.cmdGeneric('open',openArgs, {})
try:
line=sock.recv(1024)
except:
print "Failed to connect to implant."
return -1
lpexList = []
allArches = []
print "Loading Lp Extensions...",
sys.stdout.flush()
#load all lp extension files available
try:
allArches = os.listdir('%s/../Mods/App/Buzzdirection/'%(currentDirectory))
except OSError:
pass
for oneArch in allArches:
lpexDir = '%s/../Mods/App/Buzzdirection/%s/'%(currentDirectory,oneArch)
lpexList += os.listdir(lpexDir)
if lpArch == 'i386':
lpexExtension = '.lx32'
else:
lpexExtension = '.lx64'
for lpex in lpexList:
if lpex.find(lpexExtension)>=0:
fileLoc = '%s%s'%(lpexDir,lpex)
func.cmdGeneric('lpex',{'command':'!lpex','lpexfile':fileLoc,'endOnString':'DONE'},{})
print "\r",
#Parse xml files for preloaded modules and print currently loaded modules
func.cmdGeneric('port',{'command':'!port','outport':str(FRONTEND_PORT),'endOnString':'DONE'},{})
res=func.cmdGeneric('mods',{'command':'!mods','endOnString':'DONE'},{})
print "******************Loaded Modules*****************",
try:
line=sock.recv(1024)
#while line.find("Device ID")<0:
while line.find("RECV")<0:
print line,
outFile.write(line)
if line.find('name')<0 and line.find('--')<0 and line.find("Device")<0:
modName=line[8:25]
modName=modName.strip()
module=(modName+'.mo')
if len(module)>3:
proc.parseXml(module,0)
if modName=='PlatCore':
platCoreArch=line[46:len(line)].strip()
try:
proc.setArch(supportedArchs[platCoreArch])
except KeyError:
print "The reported implant architecture type of %s is not supported."%platCoreArch
return -1
line=sock.recv(1024)
except socket.timeout:
print "Failed to receive module list."
outFile.write("Failed to receive module list.")
except KeyboardInterrupt:
return -1
func.cmdGeneric('port',{'command':'!port','outport':str(PRINT_PORT),'endOnString':'DONE'},{})
res=func.cmdGeneric('uptime',{'command':'!call','ciface':'2','cprov':'0','cfunc':'15'},{})
proc.printBlocker.sendto('!!#REG_BLOCK%d'%res,('127.0.0.1',RPC_DISPATCH_PORT))
result = proc.printBlocker.recv(1024)
return 1
if __name__=='__main__':
    try:
        functions=0
        processor=0
        printThread=0
        log=0
        lpSock=0
        out=0
        ark=platform.architecture()[0]
        if ark=='32bit':
            lpArk='i386'
        else:
            lpArk='x86_64'
        curDir=os.getcwd()
        out=open((curDir+'/back.log'),'w+')
        try:
            logFiles=os.listdir('%s/Logs'%os.getcwd())
        except OSError:
            os.mkdir('%s/Logs'%curDir)
            logFiles=os.listdir('%s/Logs'%os.getcwd())
        numLogs=len(logFiles)
        logFiles.sort()
        #If there are more than 20 log files, delete the oldest one.
        if numLogs>20 and sys.argv[6]=='1':
            try:
                os.remove('%s/Logs/%s'%(curDir,logFiles[0]))
            except OSError:
                print "Unable to remove oldest logfile."
        date='%s'%datetime.date(datetime.now())
        cTime='%s'%datetime.time(datetime.now())
        cTime=cTime[:cTime.find('.')]
        logname='%s_%s_lp.log'%(date,cTime)
        log=open('%s/Logs/%s'%(curDir,logname),'w+')
        lpSock=socket.socket(socket.AF_INET,socket.SOCK_DGRAM)
        lpSock.bind(('127.0.0.1',FRONTEND_PORT))
        try:
            subprocess.call([(curDir+'/'+lpArk+'/ThrowUser_LinuxUser'),(curDir+'/'+lpArk+'/blob.lp')],
                            stdout=out,preexec_fn=preexec_fcn)
        except:
            print "Unable to locate back end executable. This should be located in Lp/<LP Architecture>"
            out.close()
            ThreadExit=0
            lpSock.close()
            sys.exit()
        time.sleep(1)
        functions=Lp_FrontEndFunctions.Lp_FrontEndFcns(lpSock,log)
        processor=LpInputProcessing(functions,log,lpArk,lpSock)
        processor.parseXml('Lp.mo',0)
        processor.setDefaultOutDir()
        functions.setProc(processor)
        printThread=PrintThread(processor,log)
        printThread.daemon=True
        printThread.start()
        rpcDispatch=Lp_RpcDispatcher.RpcDispatcher(processor)
        rpcDispatch.start()
        res=showWelcome(processor,functions,lpSock,log,lpArk)
        lpSock.sendto("!!#TURN_ON_PRINTING",('127.0.0.1',RPC_DISPATCH_PORT))
        if res>0:
            processor.cmdloop()
        lpSock.sendto("!!#QUIT",('127.0.0.1',RPC_DISPATCH_PORT))
        lpSock.sendto("!!#QUIT",('127.0.0.1',PRINT_PORT))
        subprocess.call(['killall','ThrowUser_LinuxUser'])
        out.close()
        log.close()
        lpSock.close()
    except KeyboardInterrupt:
        if functions!=0:
            res=functions.cmdGeneric('term',{'command':'!term','endOnString':'DONE'},{})
        if out !=0:
            try:
                out.close()
            except IOError:
                pass
        print "Goodbye"
        if log!=0:
            log.write('Goodbye\n')
            log.write('Session terminated at %s'%str(datetime.now()))
            log.close()
        lpSock.sendto("!!#QUIT",('127.0.0.1',RPC_DISPATCH_PORT))
        lpSock.sendto("!!#QUIT",('127.0.0.1',PRINT_PORT))
        subprocess.call(['killall','ThrowUser_LinuxUser'])
        if lpSock != 0:
            lpSock.close()
# ===== File: desktop/core/ext-py/chardet-3.0.4/setup.py | Repo: kokosing/hue | License: Apache-2.0 =====
#!/usr/bin/env python
import re
import sys

from setuptools import find_packages, setup

needs_pytest = set(['pytest', 'test', 'ptr']).intersection(sys.argv)
pytest_runner = ['pytest-runner'] if needs_pytest else []


# Get version without importing, which avoids dependency issues
def get_version():
    with open('chardet/version.py') as version_file:
        return re.search(r"""__version__\s+=\s+(['"])(?P<version>.+?)\1""",
                         version_file.read()).group('version')


def readme():
    with open('README.rst') as f:
        return f.read()


setup(name='chardet',
      version=get_version(),
      description='Universal encoding detector for Python 2 and 3',
      long_description=readme(),
      author='Mark Pilgrim',
      author_email='mark@diveintomark.org',
      maintainer='Daniel Blanchard',
      maintainer_email='dan.blanchard@gmail.com',
      url='https://github.com/chardet/chardet',
      license="LGPL",
      keywords=['encoding', 'i18n', 'xml'],
      classifiers=["Development Status :: 4 - Beta",
                   "Intended Audience :: Developers",
                   ("License :: OSI Approved :: GNU Library or Lesser General"
                    " Public License (LGPL)"),
                   "Operating System :: OS Independent",
                   "Programming Language :: Python",
                   'Programming Language :: Python :: 2',
                   'Programming Language :: Python :: 2.6',
                   'Programming Language :: Python :: 2.7',
                   'Programming Language :: Python :: 3',
                   'Programming Language :: Python :: 3.3',
                   'Programming Language :: Python :: 3.4',
                   'Programming Language :: Python :: 3.5',
                   'Programming Language :: Python :: 3.6',
                   ("Topic :: Software Development :: Libraries :: Python "
                    "Modules"),
                   "Topic :: Text Processing :: Linguistic"],
      packages=find_packages(),
      setup_requires=pytest_runner,
      tests_require=['pytest', 'hypothesis'],
      entry_points={'console_scripts':
                    ['chardetect = chardet.cli.chardetect:main']})
# ===== File: src/radical/pilot/agent/launch_method/archive/dplace.py | Repo: eirrgang/radical.pilot | License: MIT =====
__copyright__ = "Copyright 2016, http://radical.rutgers.edu"
__license__   = "MIT"

import radical.utils as ru

from .base import LaunchMethod


# ------------------------------------------------------------------------------
#
class DPlace(LaunchMethod):

    # --------------------------------------------------------------------------
    #
    def __init__(self, cfg, session):

        LaunchMethod.__init__(self, cfg, session)

    # --------------------------------------------------------------------------
    #
    def _configure(self):

        # dplace: job launcher for SGI systems (e.g. on Blacklight)
        self.launch_command = ru.which('dplace')

    # --------------------------------------------------------------------------
    #
    def construct_command(self, t, launch_script_hop):

        slots       = t['slots']
        td          = t['description']
        task_exec   = td['executable']
        task_cores  = td['cpu_processes']  # FIXME: also use cpu_threads
        task_args   = td.get('arguments') or []
        task_argstr = self._create_arg_string(task_args)

        if 'task_offsets' not in slots:
            raise RuntimeError('insufficient information to launch via %s: %s'
                               % (self.name, slots))

        # FIXME: This is broken due to changes in the slot structure
        task_offsets  = slots['task_offsets']
        assert(len(task_offsets) == 1)
        dplace_offset = task_offsets[0]

        task_command = "%s %s" % (task_exec, task_argstr)

        dplace_command = "%s -c %d-%d %s" % (self.launch_command, dplace_offset,
                                             dplace_offset + task_cores - 1,
                                             task_command)

        return dplace_command, None
# ------------------------------------------------------------------------------
# ===== File: Archive Formats/fmt_davidsonassoc_res.py | Repo: TheDeverEric/noesis-scripts | License: MIT =====
#-------------------------------------------------------------------------------
# Name: Davidson & Associates *.res
# Purpose: Extract Archive
#
# Author: Eric Van Hoven
#
# Created: 01/09/2017
# Copyright: (c) Eric Van Hoven 2017
# Licence: <MIT License>
#-------------------------------------------------------------------------------
from inc_noesis import *
def registerNoesisTypes():
    handle = noesis.register("Davidson & Associates", ".res")
    noesis.setHandlerExtractArc(handle, resExtract)
    return 1

def resExtract(fileName, fileLen, justChecking):
    with open(fileName, "rb") as fs:
        if justChecking:
            return 1
        verID = noeUnpack("<I", fs.read(4))[0]
        if verID == 2:
            fs.read(8) #null uint64
            TB1 = noeUnpack("<I", fs.read(4))[0] + 0x8
            TB1SZ = noeUnpack("<I", fs.read(4))[0] - 0x8
            TB2 = noeUnpack("<I", fs.read(4))[0] + 0x8
            TB2SZ = noeUnpack("<I", fs.read(4))[0] - 0x8
            endtb = TB1 + TB1SZ
            filenum = -1
            fs.seek(TB1, 0)
            while (TB1 != endtb):
                offset = noeUnpack("<I", fs.read(4))[0]
                size = noeUnpack("<I", fs.read(4))[0]
                TB1 = fs.tell()
                fs.seek(offset, 0)
                filenum += 1
                fileName = str(filenum) + ".dat"
                print("Writing", fileName)
                rapi.exportArchiveFile(fileName, fs.read(size))
                fs.seek(TB1, 0)
            endtb = TB2 + TB2SZ
            fs.seek(TB2, 0)
            while (TB2 != endtb):
                offset = noeUnpack("<I", fs.read(4))[0]
                size = noeUnpack("<I", fs.read(4))[0]
                TB2 = fs.tell()
                fs.seek(offset, 0)
                filenum += 1
                fileName = str(filenum) + ".dat"
                print("Writing", fileName)
                rapi.exportArchiveFile(fileName, fs.read(size))
                fs.seek(TB2, 0)
        elif verID == 3:
            while True:
                if(noeUnpack("<B", fs.read(1))[0] != 0x0):
                    break
            fs.seek(-0x1, 1)
            TB_A = noeUnpack("<I", fs.read(4))[0] + 0x8
            TB_A_SZ = noeUnpack("<I", fs.read(4))[0] - 0x8
            while True:
                if(noeUnpack("<B", fs.read(1))[0] != 0x0):
                    break
            fs.seek(-0x1, 1)
            TB_B = noeUnpack("<I", fs.read(4))[0]
            TB_B_SZ = noeUnpack("<I", fs.read(4))[0]
            fs.seek(TB_B, 0)
            FNTB = noeUnpack("<I", fs.read(4))[0]
            FNTB_SZ = noeUnpack("<I", fs.read(4))[0]
            endoftb = TB_A + TB_A_SZ
            while(TB_A != endoftb):
                fs.seek(TB_A, 0)
                offset = noeUnpack("<I", fs.read(4))[0]
                size = noeUnpack("<I", fs.read(4))[0]
                TB_A = fs.tell()
                fs.seek(FNTB)
                fs.read(4) #uint filenum
                fs.read(4) #uint fakesize
                fs.read(4) #float file date + time
                fs.read(4) #uint fileID
                fnsz = noeUnpack("<I", fs.read(4))[0]
                fileName = noeStrFromBytes(noeParseToZero(fs.read(fnsz)))
                FNTB = fs.tell()
                fs.seek(offset, 0)
                print("Writing", fileName)
                rapi.exportArchiveFile(fileName, fs.read(size))
        elif verID == 4:
            fs.read(8)
            fs.read(4) #useless table offset
            fs.read(4) #useless table size
            TB = (noeUnpack("<I", fs.read(4))[0]) + 0x8
            TB_SZ = (noeUnpack("<I", fs.read(4))[0]) - 0x8
            TB_END = TB + TB_SZ
            i = -1
            while(TB != TB_END):
                fs.seek(TB)
                offset = noeUnpack("<I", fs.read(4))[0]
                size = noeUnpack("<I", fs.read(4))[0]
                TB = fs.tell()
                fs.seek(offset, 0)
                i += 1
                fileName = str(i) + ".dat"
                print("Writing", fileName)
                rapi.exportArchiveFile(fileName, fs.read(size))
    return 1
# ===== File: capture.py | Repo: LEON6156SCOTT/Hand-Cricket | License: MIT =====
import cv2 as cv
import numpy as np
import matplotlib.pyplot as plt
import os
class Capture:
def Collect(num_samples):
global one,two,three,four,five,none
capture = cv.VideoCapture(0)
capture.set(cv.CAP_PROP_FRAME_WIDTH, 1024)
capture.set(cv.CAP_PROP_FRAME_HEIGHT, 768)
count = 0
switch = False
# This is the ROI size; the size of images saved will be box_size - 10
box_size = 234
# Getting the width of the frame from the camera properties
width = int(capture.get(3))
while True:
# Read frame by frame
ret, frame = capture.read()
# Flip the frame laterally
frame = cv.flip(frame, 1)
# Break the loop if there is trouble reading the frame.
if not ret:
break
# If the counter is equal to the number of samples then reset the trigger and the counter
if count == num_samples:
switch = not switch
count = 0
cv.rectangle(frame, (width - box_size, 0), (width, box_size), (0, 0, 0), 2)
cv.namedWindow("Collecting images", cv.WINDOW_NORMAL)
if switch == True:
ROI = frame[5: box_size-5 , width-box_size+5: width-5] # LFU
if class_name == "1":
one.append([ROI])
if class_name == "2":
two.append([ROI])
if class_name == "3":
three.append([ROI])
if class_name == "4":
four.append([ROI])
if class_name == "5":
five.append([ROI])
if class_name == "none":
none.append([ROI])
# Increment the counter
count += 1
# Text for the counter
text = "Collected Samples of {}: {}".format(class_name, count)
else:
text = "Press 1-0 and n, for collecting data-set."
# Show the counter on the image
cv.putText(frame, text, (3, 350), cv.FONT_HERSHEY_SIMPLEX, 0.45, (0, 0, 0), 1, cv.LINE_AA)
# Display the window
cv.imshow("Collecting images", frame)
# Wait 1 ms
k = cv.waitKey(1)
if k == ord('1'):
switch = not switch
class_name = "1" # this is defined for eval(): This will represent the one[]
one = []
if k == ord('2'):
switch = not switch
class_name = "2"
two = []
if k == ord('3'):
switch = not switch
class_name = "3"
three = []
if k == ord('4'):
switch = not switch
class_name = "4"
four = []
if k == ord('5'):
switch = not switch
class_name = "5"
five = []
if k == ord('n'):
switch = not switch
class_name = "none"
none = []
if k == ord('q'):
break
# Release the camera and destroy the window
capture.release()
cv.destroyAllWindows()
data = [one,two,three,four,five,none]
# return all the lists containing our dataset!
return data
def Show(data):
# Set the figure size
plt.figure(figsize=[30,20])
# Set the rows and columns
rows, cols = 6,10
# Iterate for each class
for class_index, class_name in enumerate(data):
r = np.random.randint(10, size=8);
for i, example_index in enumerate(r,1):
plt.subplot(rows,cols,class_index*cols + i);
plt.imshow(class_name[example_index][0][:,:,::-1]); # converting BGR to RGB
plt.axis('off');
c644f0ddcd2301b8bf29ad85fabec4ff05aa6d7d | 576 | py | Python | wagtail_blog/admin.py | pizzapanther/wagtail-blog-app | ba8984a9ab7ff7d91927ba70b85bec40c3dbc025 | [
"MIT"
] | null | null | null | wagtail_blog/admin.py | pizzapanther/wagtail-blog-app | ba8984a9ab7ff7d91927ba70b85bec40c3dbc025 | [
"MIT"
] | null | null | null | wagtail_blog/admin.py | pizzapanther/wagtail-blog-app | ba8984a9ab7ff7d91927ba70b85bec40c3dbc025 | [
"MIT"
] | null | null | null | from wagtail.admin.edit_handlers import FieldPanel
from wagtail.images.edit_handlers import ImageChooserPanel
from wagtail_blog.models import BlogPage, BlogIndexPage
# Add your Wagtail panels here.
BlogIndexPage.content_panels = [
FieldPanel('title', classname="full title"),
FieldPanel('headline'),
]
BlogPage.content_panels = [
FieldPanel('title', classname="full title"),
FieldPanel('author'),
FieldPanel('date'),
FieldPanel('date_updated'),
FieldPanel('content', classname="full"),
ImageChooserPanel('image'),
FieldPanel('tags'),
]
| 26.181818 | 58 | 0.732639 | 59 | 576 | 7.050847 | 0.457627 | 0.079327 | 0.086538 | 0.134615 | 0.269231 | 0.269231 | 0.269231 | 0.269231 | 0 | 0 | 0 | 0 | 0.140625 | 576 | 21 | 59 | 27.428571 | 0.840404 | 0.050347 | 0 | 0.125 | 0 | 0 | 0.146789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.1875 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# ===== File: porcupine/dirs.py | Repo: rscales02/porcupine | License: MIT =====
# TODO: move this to __init__.py? this was in a separate file because
# setup.py used to import porcupine but it doesn't do it anymore
import os
import platform
import appdirs
from porcupine import __author__ as _author
if platform.system() in {'Windows', 'Darwin'}:
    # these platforms like path names like "Program Files" or
    # "Application Support"
    _appname = 'Porcupine'
else:
    _appname = 'porcupine'
    _author = _author.lower()
cachedir = appdirs.user_cache_dir(_appname, _author)
configdir = appdirs.user_config_dir(_appname, _author)
# this hack shouldn't be a problem because porcupine isn't distributed
# with tools like pyinstaller, and it doesn't need to be because people
# using porcupine have python installed anyway
installdir = os.path.dirname(os.path.abspath(__file__))
def makedirs():
    all_paths = [cachedir, configdir, os.path.join(configdir, 'plugins')]
    for path in all_paths:
        os.makedirs(path, exist_ok=True)
# ===== File: Python3_Mundo1_Aula7/Desafio009.py | Repo: AgladeJesus/python | License: MIT =====
a = int(input('Construa a tabuada do Número: '))
aux = 0
print('*' * 18)
print('Tabuada de {}'.format(a))
print('*' * 18)
while (aux <= 10):
    print('{} X {:2} = {:4}'.format(a, aux, (a * aux)))
    aux = aux + 1
# ===== File: custom_components/hue_sync_box/const.py | Repo: nitobuendia/hue-sync-box-custom-component | License: Apache-2.0 =====
"""Constants and common variables for Philips Hue Sync Box."""
from homeassistant import const
# Set up.
DOMAIN = 'hue_sync_box'
PLATFORMS = ['remote']
TOKEN_FILE = 'hue-sync-box-token-cache-{}'
# Platform config.
CONF_ENTITY_ID = const.CONF_ENTITY_ID
CONF_IP_ADDRESS = const.CONF_IP_ADDRESS
CONF_NAME = const.CONF_NAME
# Services.
SERVICE_GET_ACCESS_TOKEN = 'get_access_token'
SERVICE_SET_AREA = 'set_area'
SERVICE_SET_BRIGHTNESS = 'set_brightness'
SERVICE_SET_HDMI_INPUT = 'set_hdmi_input'
SERVICE_SET_INTENSITY = 'set_intensity'
SERVICE_SET_SYNC_MODE = 'set_sync_mode'
SERVICE_TOGGLE = const.SERVICE_TOGGLE
SERVICE_TURN_OFF = const.SERVICE_TURN_OFF
SERVICE_TURN_ON = const.SERVICE_TURN_ON
SERVICE_UPDATE = 'update'
# Service attributes.
ATTR_AREA_NAME = 'area_name'
ATTR_BRIGHTNESS = const.CONF_BRIGHTNESS
ATTR_ENTITY_ID = const.ATTR_ENTITY_ID
ATTR_HDMI_INPUT = 'hdmi_input'
ATTR_INTENSITY = 'intensity'
ATTR_SYNC_MODE = 'sync_mode'
# Default values.
DEFAULT_STR_VALUE = 'undefined'
DEVICE_DEFAULT_NAME = 'Philips Hue Sync Box'
# Accepted API values.
INPUT_VALUES = ('1', '2', '3', '4')
ACTIVE_SYNC_MODES = ('video', 'music', 'game')
DEFAULT_SYNC_MODE = 'passthrough'
SYNC_MODE_VALUES = ('passthrough', 'powersave') + ACTIVE_SYNC_MODES
INTENSITY_VALUES = (
    'subtle', 'moderate', 'high', 'extreme', 'intense')  # Extreme = Intense.
# ===== File: maskpw.py | Repo: harmony5/maskpw | License: MIT =====
"""A simple library to ask the user for a password. Similar to getpass.getpass() but allows to specify a default mask (like '*' instead of blank)."""
__version__ = "0.5.5"
from sys import platform, stdin
if platform == "win32":
    from msvcrt import getch as __getch

    def getch():
        return __getch().decode()
else:
    # taken from https://stackoverflow.com/questions/1052107/reading-a-single-character-getch-style-in-python-is-not-working-in-unix
    from termios import tcgetattr, tcsetattr, TCSADRAIN
    from tty import setraw as tty_setraw

    def getch():
        old_settings = tcgetattr(stdin)
        try:
            tty_setraw(stdin)
            char = stdin.read(1)
        finally:
            tcsetattr(stdin, TCSADRAIN, old_settings)
        return char


def get_password(prompt="Password: ", mask="*"):
    print(prompt, end="", flush=True)
    password = ""
    while True:
        char = getch()
        # Enter
        if ord(char) == 13:
            print()
            break
        # Ctrl-C, Ctrl-D, Ctrl-Z
        elif ord(char) in [3, 4, 26]:
            exit(0)
        # Backspace, Delete
        elif ord(char) in [8, 127]:
            if len(password) > 0:
                print("\b \b", end="", flush=True)
                password = password[:-1]
        else:
            print(mask, end="", flush=True)
            password += char
    return password


if __name__ == "__main__":
    username = input("Username: ")
    password = get_password(mask="#")
    print(f"Username: {username}; Password: {password}")
# ===== File: evaluation/__init__.py | Repo: Leinadh/PeruvianImageGenerator | License: Apache-2.0 =====
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from .utils_tsne import apply_tsne, generate_scatter
def tsne_evaluation(ls_feature_arrays, ls_array_names, pca_components=None, perplexity=30, n_iter=1000, save_image=False, output_dir='./', save_wandb=False, plot_title='t-SNE evaluation'):
    assert len(ls_feature_arrays) == len(ls_array_names)
    feature_vectors = np.concatenate(ls_feature_arrays)
    # concatenate names in a df with same length as feature_vectors
    feature_vector_names = []
    list( map(feature_vector_names.extend, [[name]*ls_feature_arrays[i].shape[0] for i, name in enumerate(ls_array_names)]) )
    df_feature_vector_info = pd.DataFrame({'name':feature_vector_names})
    tsne_results, df_feature_vector_info = apply_tsne(df_feature_vector_info , feature_vectors, perplexity, n_iter, pca_components=pca_components)
    tsne_results_norm = StandardScaler().fit_transform(tsne_results)
    scatter_plot = None
    wandb_scatter_plot = None
    img_scatter_plot = None
    if save_image or save_wandb:
        wandb_scatter_plot, img_scatter_plot = generate_scatter(tsne_results_norm, df_feature_vector_info, save_image, output_dir, save_wandb, plot_title)
    return tsne_results_norm, df_feature_vector_info, wandb_scatter_plot, img_scatter_plot
###############################
# distance_threshold, stats_df = calc_jaccard_index(df)
# logger.info("stats %s", stats_df)
# stats_df.to_csv(os.path.join(output_dir, "stats.csv"), index=False)
# if enable_rmse:
# df_distances = calc_rmse(df, image_shape)
# df_distances.to_csv(os.path.join(output_dir, "distances.csv"), index=False)
# return stats_df
if __name__ == "__main__":
    output_dir = "./"
    # pca_components = 10
    pca_components = None
    with open('dataset.npy', 'rb') as f:
        np_a = np.load(f)
    with open('model_a.npy', 'rb') as f:
        np_b = np.load(f)
    with open('model_b.npy', 'rb') as f:
        np_c = np.load(f)
    tsne_evaluation([np_a, np_b, np_c],['dataset', 'm_a','m_b'], pca_components=pca_components, save_image=True, output_dir= output_dir, save_wandb = True)
# ===== File: aiida_ase3/calculations.py | Repo: sudarshanv01/aiida-ase-test | License: MIT =====
# -*- coding: utf-8 -*-
"""
Calculations provided by aiida_ase3.
Register calculations via the "aiida.calculations" entry point in setup.json.
"""
from aiida.common import datastructures
from aiida.engine import CalcJob
from aiida.orm import SinglefileData, Str
from aiida.plugins import DataFactory
DiffParameters = DataFactory('ase3')
class Ase3Calculation(CalcJob):
"""
ASE calculation which operates currently on:
1. inout: This mode takes the input file and reads in
the output file, nothing fancy does what it is
asked to do - Thanks to Leopold for this idea!
Other possibilities:
2. gpaw-ready: A gpaw compatibility based setup which
will automatically take care of a bunch of parsing
and input output options
"""
@classmethod
def define(cls, spec):
"""Define inputs and outputs of the calculation."""
# yapf: disable
super(Ase3Calculation, cls).define(spec)
# set default values for AiiDA options
spec.inputs['metadata']['options']['resources'].default = {
'num_machines': 1,
'num_mpiprocs_per_machine': 1,
}
spec.inputs['metadata']['options']['parser_name'].default = 'ase3'
# new ports
spec.input('metadata.options.output_filename', valid_type=str, default='aiida.out')
spec.input('operation_mode', valid_type=Str, default=lambda: Str('inout'))
spec.input('input_file', valid_type=SinglefileData, help='Input file which will be used', required=False)
spec.input('output_filename', valid_type=Str, default=lambda: Str('aiida.txt'), help='AiiDA output file by default')
# outputs
spec.output('ase3_output', valid_type=SinglefileData, help='Output file which will be read in')
# Error messages
spec.exit_code(100, 'ERROR_MISSING_OUTPUT_FILES', message='Calculation did not produce all expected output files.')
def prepare_for_submission(self, folder):
"""
Create input files.
TODO: Currently implemented only for input-output options
:param folder: an `aiida.common.folders.Folder` where the plugin should temporarily place all files
needed by the calculation.
:return: `aiida.common.datastructures.CalcInfo` instance
"""
codeinfo = datastructures.CodeInfo()
codeinfo.code_uuid = self.inputs.code.uuid
codeinfo.stdout_name = self.metadata.options.output_filename
codeinfo.withmpi = self.inputs.metadata.options.withmpi
codeinfo.cmdline_params = ['python', self.inputs.input_file.filename]
# Prepare a `CalcInfo` to be returned to the engine
calcinfo = datastructures.CalcInfo()
calcinfo.codes_info = [codeinfo]
calcinfo.local_copy_list = [
(self.inputs.input_file.uuid, self.inputs.input_file.filename, self.inputs.input_file.filename),
]
calcinfo.retrieve_list = [self.metadata.options.output_filename, self.inputs.output_filename.value]
return calcinfo
# --- pygomas/bdisoldier.py (sfp932705/pygomas, MIT license) ---
from collections import deque
from .vector import Vector3D
from .bditroop import BDITroop, CLASS_SOLDIER
from .config import BACKUP_SERVICE, DESTINATION, VELOCITY, HEADING
from agentspeak import Actions
from agentspeak import grounded
from agentspeak.stdlib import actions as asp_action
class BDISoldier(BDITroop):
def __init__(self, *args, **kwargs):
soldier_actions = Actions(asp_action)
@soldier_actions.add(".reinforce", 3)
def _reinforce(agent, term, intention):
"""Same as a .goto"""
args = grounded(term.args, intention.scope)
self.movement.destination.x = args[0]
self.movement.destination.y = args[1]
self.movement.destination.z = args[2]
start = (self.movement.position.x, self.movement.position.z)
end = (self.movement.destination.x, self.movement.destination.z)
path = self.path_finder.get_path(start, end)
if path:
self.destinations = deque(path)
x, z = path[0]
self.movement.calculate_new_orientation(Vector3D(x=x, y=0, z=z))
self.bdi.set_belief(DESTINATION, args[0], args[1], args[2])
self.bdi.set_belief(VELOCITY, self.movement.velocity.x, self.movement.velocity.y,
self.movement.velocity.z)
self.bdi.set_belief(HEADING, self.movement.heading.x, self.movement.heading.y, self.movement.heading.z)
else:
self.destinations = deque()
self.movement.destination.x = self.movement.position.x
self.movement.destination.y = self.movement.position.y
self.movement.destination.z = self.movement.position.z
yield
super().__init__(actions=soldier_actions, *args, **kwargs)
self.services.append(BACKUP_SERVICE)
self.eclass = CLASS_SOLDIER
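The `.reinforce` action above stores the computed path as a deque of waypoints and consumes it from the front. A standalone sketch of that pattern (waypoint values are made up for illustration):

```python
from collections import deque

# waypoints as a path finder might return them (made-up values)
path = [(0, 0), (1, 0), (1, 1)]
destinations = deque(path)

visited = []
while destinations:
    # the head of the deque is always the next waypoint to steer toward
    visited.append(destinations.popleft())

print(visited)  # [(0, 0), (1, 0), (1, 1)]
```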
# --- saleor/core/migrations/0005_alter_eventdelivery_webhook.py (eanknd/saleor, CC-BY-4.0 license) ---
# Generated by Django 3.2.12 on 2022-04-08 12:37
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("webhook", "0008_webhook_subscription_query"),
("core", "0004_delete_delivery_without_webhook"),
]
operations = [
migrations.AlterField(
model_name="eventdelivery",
name="webhook",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="webhook.webhook"
),
),
]
# --- ExpenseProject/ExpenseApp/migrations/0016_remove_expensemodel_expenses.py (cs-fullstack-fall-2018/project3-django-psanon19, Apache-2.0 license) ---
# Generated by Django 2.0.6 on 2018-10-27 04:42
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('ExpenseApp', '0015_auto_20181026_1800'),
]
operations = [
migrations.RemoveField(
model_name='expensemodel',
name='expenses',
),
]
# --- b3/guess_type.py (oddy/b3, MIT license) ---
# Python-Obj to B3-Type guesser for composite_dynamic (pack)
import datetime, decimal
from six import PY2
from b3.datatypes import *
def guess_type(obj):
if isinstance(obj, bytes): # Note this will catch also *str* on python2. If you want unicode out, pass unicode in.
return B3_BYTES
if PY2 and isinstance(obj, unicode): # py2 unicode string
return B3_UTF8
if isinstance(obj, str): # Py3 unicode str only, py2 str/bytes is caught by above test.
return B3_UTF8
if obj is True or obj is False: # Note: make sure this check is BEFORE int checks!
return B3_BOOL # Note: because bools are a subclass of int (!?) in python :S
if isinstance(obj, int):
return B3_SVARINT # Policy: fixed to svarint to make this deterministic for better interop.
# alternatives: uvarint, int64
if PY2 and isinstance(obj, long):
return B3_SVARINT # the zigzag size diff is only noticeable with small numbers.
if isinstance(obj, dict):
return B3_COMPOSITE_DICT
if isinstance(obj, list):
return B3_COMPOSITE_LIST
if isinstance(obj, float):
return B3_FLOAT64
if isinstance(obj, decimal.Decimal):
return B3_DECIMAL
if isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
return B3_SCHED
if isinstance(obj, complex):
return B3_COMPLEX
raise TypeError('Could not map type of object %r to a viable B3 type' % type(obj))
# Policy: Currently guessed types are fixed and 1:1 with python types.
# - There is/was an idea to have guess_type select the 'best' type based on value (e.g. SVARINT or UVARINT depending on sign)
# - But that would make interop difficult between the Dynamic and Schema packers, so we've dropped it for now.
# The 'best type' selector would have 3 settings -
# 'fixed' (default, as now), 'compact' (e.g. prefer var-types for small numbers), 'fast' (prefer the xxx64 types)
# The wastefulness of using svarint for everything hurts a little, but compactness-obsessed people should be using schemas anyway.
# Policy: we are NOT auto-converting stuff to DECIMAL, caller's responsibility
#         - because we'd have to fix a precision for the user and I don't know if we want to be opinionated about that.
#         - just because I hate IEEE754 doesn't mean anyone else does.
# Note: no NULL type - the item header has a NULL flag instead. More info in item_header.
# --- problem/01000~09999/02862/2862.py3.py (njw1204/BOJ-AC, MIT license) ---
def ans(n):
    global fib
    # if n is a Fibonacci number we are done; otherwise subtract the
    # largest Fibonacci number below it and recurse on the remainder
    for i in range(1, 99):
        if fib[i] == n:
            return n
        if fib[i + 1] > n:
            return ans(n - fib[i])

# precompute the first 100 Fibonacci numbers (fib[0] = fib[1] = 1)
fib = [1] * 100
for i in range(2, 100):
    fib[i] = fib[i - 1] + fib[i - 2]
n = int(input())
print(ans(n))
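For clarity, the recursive reduction above can also be written iteratively: while n is not a Fibonacci number, subtract the largest Fibonacci number below it (`ans_iterative` is a name introduced here, not part of the submitted solution):

```python
import bisect

def ans_iterative(n):
    fib = [1] * 100
    for i in range(2, 100):
        fib[i] = fib[i - 1] + fib[i - 2]
    fib_set = set(fib)
    while n not in fib_set:
        # bisect_right - 1 locates the largest Fibonacci number <= n
        n -= fib[bisect.bisect_right(fib, n) - 1]
    return n

print(ans_iterative(4))  # 1, same as ans(4)
print(ans_iterative(7))  # 2, same as ans(7)
```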
# --- price_picker/models/shop.py (M0r13n/price_picker, MIT license) ---
from price_picker.common.database import CRUDMixin
from price_picker import db
class Shop(CRUDMixin, db.Model):
""" Shops """
__tablename__ = 'shops'
name = db.Column(db.String(128), primary_key=True, unique=True, default="Zentrale")
@classmethod
def query_factory_all(cls):
# insert default if no shop exists
if cls.query.first() is None:
cls.create()
return cls.query.order_by(cls.name)
def __str__(self):
return self.name
__repr__ = __str__
| 24.809524 | 87 | 0.658349 | 68 | 521 | 4.720588 | 0.632353 | 0.056075 | 0.093458 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007557 | 0.238004 | 521 | 20 | 88 | 26.05 | 0.801008 | 0.076775 | 0 | 0 | 0 | 0 | 0.027426 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.153846 | 0.076923 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
# --- bridgeClient.py (omega3love/BridgeProject, FTL license) ---
#! /usr/bin/python
import socket
import threading
from time import sleep
import sys, os
import inputbox
import pygame
from bridgeSprites import Button
class userInterfaceWindow():
def __init__(self, screen):
self.screen = screen
self.clients = []
self.userName = inputbox.ask(screen, "Type your name ")
self.buttonColor = (200,20,20)
self.buttonSize = (200,50)
self.buttonPos = (50,50)
self.myButton = Button(self.buttonPos, self.buttonSize,
self.buttonColor, self.userName)
self.brokenError = False
def lobby(self, clients):
self.screen.fill(-1)
self.buttonList = [ self.myButton ]
i = 1
for client in clients:
newButtonPos = (self.buttonPos[0], self.buttonPos[1] + 70*i)
newUserName = client.split(";")[0]
if self.userName == newUserName:
continue
self.buttonList.append( Button(newButtonPos, self.buttonSize,
self.buttonColor, newUserName) )
i += 1
for button in self.buttonList:
button.draw(self.screen)
for event in pygame.event.get():
if event.type == pygame.QUIT:
self.brokenError = True
pygame.display.update()
#pygame.time.Clock().tick(30)
def askToPlay(self):
mouseDownPos, mouseUpPos = None, None
buttonDowned = None
self.waitingForAns = False
self.switch = True
while True:
pygame.event.clear()
ev = pygame.event.wait()
#print pygame.event.event_name(ev.type)
mouseDownPos = None
mouseUpPos = None
if ev.type == pygame.KEYDOWN and ev.key == pygame.K_ESCAPE or ev.type == pygame.QUIT:
break
elif ev.type == pygame.MOUSEBUTTONDOWN:
mouseDownPos = pygame.mouse.get_pos()
elif ev.type == pygame.MOUSEBUTTONUP:
mouseUpPos = pygame.mouse.get_pos()
if not self.waitingForAns:
isMousePressed = pygame.mouse.get_pressed()[0]
for button in self.buttonList:
xBdry = (button.pos[0], button.pos[0] + button.rect[2])
yBdry = (button.pos[1], button.pos[1] + button.rect[3])
if mouseDownPos:
isInBdry = (xBdry[0] <= mouseDownPos[0] < xBdry[1]) and (yBdry[0] <= mouseDownPos[1] < yBdry[1])
if isMousePressed:
if not buttonDowned and isInBdry:
buttonDowned = button
elif buttonDowned == button and not isInBdry:
buttonDowned = None
else:
buttonDowned = None
if mouseUpPos:
isInBdry = (xBdry[0] <= mouseUpPos[0] < xBdry[1]) and (yBdry[0] <= mouseUpPos[1] < yBdry[1])
if buttonDowned == button and isInBdry:
print "Clicked button : " + button.text
display_pos = ( button.pos[0]+button.rect[2]+20, button.pos[1] )
inputbox.display_msg_custum(self.screen, display_pos, "Asked '%s' to play. Hang on a sec..." %button.text)
buttonDowned = None
self.waitingForAns = button.text
#else:
class bridgeConnection(userInterfaceWindow):
def __init__(self, screen):
#self.HOST = raw_input("HOST IP : ")
self.HOST = "143.248.12.11"
self.PORT = 50000
        self.DATA_SIZE = 256  # maximum data length that can be sent at once
self.myIP = myIPaddress()
self.endThread = False
self.startGame = False
userInterfaceWindow.__init__(self, screen)
self.makeConnection()
self.sendData("info:connMade:%s;%s"%(self.userName, self.myIP))
self.dataGrave = [] # processed data will be saved here
self.dataList = {'cmd':[],'grid':[], 'ask':[], 'pick':[], 'turn':[], 'info':[]} #Sort the type of the data
if not self.soc:
print "Server is not opened"
print "waiting an event..."
self.lobby(self.clients)
self.askToPlay()
def makeConnection(self):
# make socket and connect to the server
self.soc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.soc.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
self.soc.settimeout(5.0) # maximum wating time (seconds)
connected = False
while not connected:
try:
print "trying to connect " + self.HOST
self.soc.connect( (self.HOST, self.PORT) )
connected = True
print "Connected!"
#soc.settimeout(None)
break
except socket.timeout:
print "Exceeded time limit"
connectAgain = raw_input("try again?(y/n)")
if connectAgain == "y" or connectAgain == "Y":
continue
else:
return
except socket.error:
print "Access denied"
sleep(1)
# [ NOT YET ] if QUIT command is received, call 'sys.exit'
self.soc = False
return
# Threading allows to get data whenever it's delievered
self.T = threading.Thread(target = self.receiveData)
self.T.start()
self.T2 = threading.Thread(target = self.selfConnectedSend)
self.T2.start()
def sendData(self, data):
""" Send data (string type) to the server """
if len(data) <= self.DATA_SIZE:
self.soc.send(data.encode('UTF-8'))
#print "Data '%s' is sent successfully" %data
else:
print "Data packet size exceeded!"
def receiveData(self):
""" Receive data (string type) from the server """
while not self.endThread:
try:
data = self.soc.recv(self.DATA_SIZE)# receive data whose length <= DATA_SIZE
print "raw data is : %s" %data
                # route each complete '^'-terminated message by its type tag
                for realData in data.split("^")[:-1]:
                    if "info" in realData:
                        self.dataList['info'].append(realData)
                    elif "ask" in realData:
                        self.dataList['ask'].append(realData[4:])
                    elif "pick" in realData:
                        self.dataList['pick'].append(realData[5:])
                    elif "cmd" in realData:
                        self.dataList['cmd'].append(realData[4:])
                    elif "grid" in realData:
                        self.dataList['grid'].append(realData[5:])
except socket.timeout:
#print "socket timed out"
continue
except:
print "Connection is lost"
break
self.dataProcessing()
self.soc.close() # disconnect the connection
def disconnect(self):
self.endThread = True
print "joining the thread..."
self.T.join()
self.T2.join()
print "thread is joined"
pygame.quit()
sys.exit()
def dataProcessing(self):
# for reading
# dataList['info'] part
for data in self.dataList['info'][:]:
if "info:connList" in data:
self.clients = eval(data.split(":")[-1])
self.lobby(self.clients)
elif "info:askPlay" in data:
self.opponent = data.split(":")[-1].split(";")[0]
answer = inputbox.ask(self.screen, "'%s' has asked you to play. Accept?(y/n) " %self.opponent)
if answer in ["Y", "Yes", "y", "yes"]:
self.sendData("info:gameAccept:%s;%s" %(self.opponent, self.userName))
self.sendData("1")
self.sendData("0")
else:
self.opponent = None
self.waitingForAns = False
self.switch = True
self.dataList['info'].remove(data)
self.dataGrave.append(data)
def selfConnectedSend(self):
# for sending
# if self.# is changed, send data.
while not self.endThread:
try:
if self.waitingForAns and self.switch:
self.sendData("info:askPlay:%s;%s" %(self.userName, self.waitingForAns))
self.switch = False
except:
pass
self.soc.close()
def myIPaddress():
try:
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("gmail.com",80))
myip = s.getsockname()[0]
s.close()
return myip
except:
print "Internet disconnected?"
return 0
if __name__ == "__main__":
client = bridgeConnection(pygame.display.set_mode((600,600)))
#client = bridgeConnection(pygame.Surface((600,600)))
print "now main"
sleep(3)
print "end session"
client.disconnect()
| 27.535055 | 113 | 0.64728 | 963 | 7,462 | 4.971963 | 0.265836 | 0.01462 | 0.01462 | 0.022974 | 0.091061 | 0.063074 | 0.030911 | 0 | 0 | 0 | 0 | 0.017643 | 0.217636 | 7,462 | 270 | 114 | 27.637037 | 0.802501 | 0.092469 | 0 | 0.19598 | 0 | 0 | 0.08446 | 0.003156 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.005025 | 0.035176 | null | null | 0.075377 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# --- class3/exercise2.py (befthimi/be_pynet_course, Apache-2.0 license) ---
#!/usr/bin/env python
"""
Script that graphs interface stats
"""
import time
import snmp_helper
import pygal
intfInOctets_fa4 = '1.3.6.1.2.1.2.2.1.10.5'
intfInUcastPkts_fa4 = '1.3.6.1.2.1.2.2.1.11.5'
intfOutOctets_fa4 = '1.3.6.1.2.1.2.2.1.16.5'
intfOutUcastPkts_fa4 = '1.3.6.1.2.1.2.2.1.17.5'
router1 = ('50.76.53.27', 7961)
a_user = 'pysnmp'
auth_key = 'galileo1'
encrypt_key = 'galileo1'
snmp_user = (a_user, auth_key, encrypt_key)
def snmp_get(whichrouter, which_oid):
"""
Get snmp result based on which router and OID
"""
snmp_data = snmp_helper.snmp_get_oid_v3(whichrouter, snmp_user, oid=which_oid)
snmp_result = int(snmp_helper.snmp_extract(snmp_data))
return snmp_result
def plotter(rawlist):
"""
From rawlist calculate delta between each element in the list and
return list of deltas in a new list.
"""
newlist = [rawlist[i+1]-rawlist[i] for i in range(len(rawlist)-1)]
return newlist
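For example, feeding plotter() four cumulative counter samples yields three per-interval deltas; the same comprehension in isolation (sample numbers are made up):

```python
def deltas(raw):
    # identical comprehension to plotter() above
    return [raw[i + 1] - raw[i] for i in range(len(raw) - 1)]

print(deltas([100, 250, 250, 400]))  # [150, 0, 150]
```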
ListInOctets = []
ListOutOctets = []
ListInUcastP = []
ListOutUcastP = []
repeat = 13
print "Grabbing data. Please wait...\n"
while repeat > 0:
ListInOctets.append(snmp_get(router1, intfInOctets_fa4))
ListOutOctets.append(snmp_get(router1, intfOutOctets_fa4))
ListInUcastP.append(snmp_get(router1, intfInUcastPkts_fa4))
ListOutUcastP.append(snmp_get(router1, intfOutUcastPkts_fa4))
time.sleep(300)
repeat = repeat - 1
print "Finished grabbing data."
print "-" * 20
print "Creating graphs...."
PlotInOctets = plotter(ListInOctets)
PlotOutOctets = plotter(ListOutOctets)
PlotInUcastP = plotter(ListInUcastP)
PlotOutUcastP = plotter(ListOutUcastP)
#---------------------------------------------
# Create a Chart of type Line for byte count
byteline_chart = pygal.Line()
# Title
byteline_chart.title = 'Input/Output Bytes'
# X-axis labels (samples were every five minutes)
byteline_chart.x_labels = ['5', '10', '15', '20', '25', '30', '35', '40', '45', '50', '55', '60']
# Add each one of the above lists into the graph as a line with corresponding label
byteline_chart.add('InBytes', PlotInOctets)
byteline_chart.add('OutBytes', PlotOutOctets)
# Create an output image file from this
byteline_chart.render_to_file('Bytes.svg')
#---------------------------------------------
# Create a Chart of type Line for packet count
pcktline_chart = pygal.Line()
# Title
pcktline_chart.title = 'Input/Output Packets'
# X-axis labels (samples were every five minutes)
pcktline_chart.x_labels = ['5', '10', '15', '20', '25', '30', '35', '40', '45', '50', '55', '60']
# Add each one of the above lists into the graph as a line with corresponding label
pcktline_chart.add('In Unicast Packets', PlotInUcastP)
pcktline_chart.add('Out Unicast Packets', PlotOutOctets)
# Create an output image file from this
pcktline_chart.render_to_file('Pckts.svg')
| 30.565217 | 97 | 0.698435 | 411 | 2,812 | 4.652068 | 0.364964 | 0.008368 | 0.01046 | 0.012552 | 0.242678 | 0.242678 | 0.242678 | 0.216527 | 0.130753 | 0.130753 | 0 | 0.055118 | 0.141892 | 2,812 | 91 | 98 | 30.901099 | 0.737257 | 0.194168 | 0 | 0 | 0 | 0.078431 | 0.172091 | 0.043393 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.058824 | null | null | 0.078431 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# --- testapp/wagtail_wordpress_processor/management/commands/base_command.py (nickmoreton/wagtail_wordpress_importer, MIT license) ---
from django.core.management import BaseCommand
from wagtail_wordpress_importer.utils import spinner
class BaseProcessCommand(BaseCommand):
def output_start(self, message, newline=''):
self.stdout.write(message, ending=newline)
def output_message_success(self, message, newline=''):
self.stdout.write(self.style.SUCCESS(message), ending=newline)
def output_message_warning(self, message, newline=''):
self.stdout.write(self.style.WARNING(message), ending=newline)
def output_message_end(self, message, newline='\n', symbol='tick'):
SYMBOLS = {
'tick': ' ✅',
'wait': ' ⌛️',
'boom': ' 💥',
'popper': ' 🎉',
'flag': ' 🏁'
}
self.stdout.write(message + SYMBOLS[symbol], ending=newline)
def output_spinner(self):
self.stdout.write(next(spinner), ending='\b')
| 31.892857 | 71 | 0.6271 | 100 | 893 | 5.56 | 0.41 | 0.080935 | 0.134892 | 0.158273 | 0.404676 | 0.404676 | 0.151079 | 0.151079 | 0 | 0 | 0 | 0 | 0.227324 | 893 | 27 | 72 | 33.074074 | 0.797101 | 0 | 0 | 0 | 0 | 0 | 0.045913 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.1 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# --- Labs/LineSweep/plots.py (jessicaleete/numerical_computing, CC-BY-3.0 license) ---
# This plotting file is rather old code.
# It generates a full set of plots illustrating the two different linesweep
# algorithms on randomly chosen points.
# It should be updated at some point so that the random number generator
# is seeded with some specifically chosen seed that generates plots that
# illustrate the algorithms well.
# This isn't currently an issue since the plots already conform to
# project standards, but if we ever want to standardize plot generation
# this will have to be taken care of.
# There isn't currently code to generate the voronoi 1-norm and supnorm plots.
import matplotlib
matplotlib.rcParams = matplotlib.rc_params_from_file('../../matplotlibrc')
from matplotlib import pyplot as plt
import numpy as np
from numpy.random import rand
import bisect as bs
# This generates the plots for the simplified linsweep.
# It generates more plots than are actually used in the lab.
# I just picked the plots that were useful in illustrating the algorithm.
def multidist(p0, p1):
l = len(p0)
return (sum([(p0[i] - p1[i])**2 for i in range(l)]))**(.5)
def mindist_simple_plot(Y):
X = Y.take(Y[:,0].argsort(), axis=0)
n = len(X)
actives = []
pt = tuple(X[0])
actives.append(pt)
pt = tuple(X[1])
actives.append(pt)
r = multidist(actives[0], actives[1])
for i in xrange(2, len(X)):
pt = tuple(X[i])
l = len(actives)
while l > 0:
if actives[0][0] > pt[0] + r:
actives.pop(0)
l -= 1
else:
break
plt.scatter(X[:,0], X[:,1])
res = 15
T = np.linspace(-.2, 1.2, res)
res2 = 201
theta = np.linspace(np.pi/2, 3*np.pi/2, res2)
plt.plot([pt[0]]*res, T, color='r')
plt.plot([pt[0]-r]*res, T, color='r')
X0 = np.array([pt + r * np.array([np.cos(t), np.sin(t)]) for t in theta])
plt.plot(X0[:,0], X0[:,1], color='g')
plt.xlim((-.2, 1.2))
plt.ylim((-.2, 1.2))
plt.show()
for k in xrange(len(actives)):
d = multidist(pt, actives[k])
if d < r:
r = d
actives.append(pt)
return r
# This generates the plots for the full version.
# It generates more plots than are actually used in the lab.
# I just picked the plots that were useful in illustrating the algorithm.
def mindist_plot(Y):
X = Y.take(Y[:,0].argsort(), axis=0)
n = len(X)
actives = []
pt = X[0]
actives.insert(bs.bisect_left(actives, tuple(reversed(tuple(pt)))), tuple(reversed(tuple(pt))))
pt = X[1]
actives.insert(bs.bisect_left(actives, tuple(reversed(tuple(pt)))), tuple(reversed(tuple(pt))))
r = multidist(actives[0], actives[1])
for i in xrange(2, n):
plt.scatter(X[:,0], X[:,1])
pt = tuple(X[i])
res = 1401
x = np.linspace(-.2, 1.2, res)
plt.plot(x, [pt[1] - r] * res, color='r')
plt.plot(x, [pt[1] + r] * res, color='r')
plt.plot([pt[0]] * res, x, color='b')
plt.plot([pt[0] - r] * res, x, color='b')
T = np.linspace(np.pi / 2, 3 * np.pi / 2, res)
pt = np.array(pt)
X0 = np.array([pt + r * np.array([np.cos(t), np.sin(t)]) for t in T])
plt.plot(X0[:,0], X0[:,1], color='g')
block = actives[bs.bisect_left(actives, (pt[1] - r, pt[0] - r)): bs.bisect_right(actives, (pt[1] + r, pt[0]))]
for k in xrange(len(block)):
d = multidist(tuple(reversed(tuple(pt))), block[k])
if d < r:
r = d
removalidx = 0
while removalidx < len(actives):
if abs(actives[removalidx][1] - pt[0]) > r:
actives.pop(removalidx)
else:
removalidx += 1
if len(actives) > 0:
plt.scatter(np.fliplr(np.array(actives))[:,0], np.fliplr(np.array(actives))[:,1])
if len(block) > 0:
plt.scatter(np.fliplr(np.array(block))[:,0], np.fliplr(np.array(block))[:,1])
plt.show()
actives.insert(bs.bisect_left(actives, tuple(reversed(tuple(pt)))), tuple(reversed(tuple(pt))))
return r
def pnorm(pt, X, p=2):
# Take the p-norm distance between a point 'pt'
# and an array of points 'X'.
if p == "inf":
return np.absolute(pt - X).max(axis=-1)
return (np.absolute(pt - X)**p).sum(axis=-1)**(1./p)
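To make the p-norm behavior concrete, here is a pure-Python scalar sketch for two points (the array version above vectorizes the same formula with numpy over many points at once):

```python
def pnorm_point(a, b, p=2):
    # scalar version of pnorm() above, for two points only
    diffs = [abs(x - y) for x, y in zip(a, b)]
    if p == "inf":
        return max(diffs)
    return sum(d ** p for d in diffs) ** (1.0 / p)

print(pnorm_point((0, 0), (3, 4)))           # 5.0 (Euclidean)
print(pnorm_point((0, 0), (3, 4), p=1))      # 7.0 (taxicab)
print(pnorm_point((0, 0), (3, 4), p="inf"))  # 4 (sup norm)
```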
def brute_force_voronoi(n, res, p=2, filename=None):
# Generates a grid of points and tests to find the nearest
# neighbor for each of them.
pts = rand(n, 2)
X = np.linspace(0, 1, res)
# Make an array to store the indices of the nearest points.
indices = np.zeros((res, res))
for i in xrange(res):
for j in xrange(res):
indices[i, j] = pnorm(np.array([X[j], X[i]]), pts, p).argmin()
# Make a colorplot of the results.
X, Y = np.meshgrid(X, X, copy=False)
plt.pcolormesh(X, Y, indices)
plt.scatter(pts[:,0], pts[:,1])
plt.xlim((0,1))
plt.ylim((0,1))
if filename is None:
plt.show()
else:
plt.savefig(filename)
plt.clf()
if __name__=="__main__":
# Generate the plots for the simplified algorithm.
X = rand(10, 2)
    mindist_simple_plot(X)
# Generate the plots for the full algorithm.
X = rand(25, 2)
    mindist_plot(X)
# The 1-norm voronoi diagram.
brute_force_voronoi(10, 401, 1, "voronoi_1norm.png")
# The oo-norm voronoi diagram.
brute_force_voronoi(10, 401, "inf", "voronoi_supnorm.png")
# --- setup.py (eshandas/simple_django_logger, MIT license) ---
from setuptools import find_packages, setup
# Read more here: https://pypi.org/project/twine/
setup(
name='simple_django_logger',
# packages=[
# 'simple_django_logger', # this must be the same as the name above
# 'simple_django_logger.middleware',
# 'simple_django_logger.migrations'],
packages=find_packages(),
include_package_data=True,
version='3.1.0',
description='A basic logger for Django',
author='Eshan Das',
author_email='eshandasnit@gmail.com',
url='https://github.com/eshandas/simple_django_logger', # use the URL to the github repo
download_url='https://github.com/eshandas/simple_django_logger/archive/3.1.0.tar.gz', # Create a tag in github
keywords=['django', 'logger'],
classifiers=[],
install_requires=[
'Django>=2.0',
'requests>=2.0',
'djangorestframework>=3.8',
'user-agents>=1.1.0',
'django-user-agents>=0.3.2'],
)
| 35.37037 | 115 | 0.657592 | 126 | 955 | 4.833333 | 0.547619 | 0.137931 | 0.17734 | 0.055829 | 0.141215 | 0.141215 | 0.141215 | 0.141215 | 0 | 0 | 0 | 0.023438 | 0.195812 | 955 | 26 | 116 | 36.730769 | 0.769531 | 0.273298 | 0 | 0 | 0 | 0.05 | 0.437956 | 0.10219 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.05 | 0 | 0.05 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d692f8f38cc997b2f8bfdb6f4463f32491f25ba9 | 639 | py | Python | rotas/serializers.py | fchevitarese/routecalc | 4222d61511885acaad58595689af05eef5ef4218 | [
"MIT"
] | null | null | null | rotas/serializers.py | fchevitarese/routecalc | 4222d61511885acaad58595689af05eef5ef4218 | [
"MIT"
] | null | null | null | rotas/serializers.py | fchevitarese/routecalc | 4222d61511885acaad58595689af05eef5ef4218 | [
"MIT"
] | null | null | null | # encoding: utf-8
from rest_framework import serializers

from .models import Rota


class RotaSerializer(serializers.ModelSerializer):
    class Meta:
        model = Rota
        fields = ('nome', 'origem', 'destino', 'distancia',
                  'created', 'updated')


class MenorRotaSerializer(serializers.Serializer):
    nome = serializers.CharField()
    origem = serializers.CharField()
    destino = serializers.CharField()
    autonomia = serializers.IntegerField()
    preco = serializers.FloatField()
    valor_frete = serializers.FloatField()
    caminho = serializers.CharField()
    distancia = serializers.IntegerField()
| 29.045455 | 59 | 0.699531 | 56 | 639 | 7.946429 | 0.571429 | 0.179775 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001949 | 0.197183 | 639 | 21 | 60 | 30.428571 | 0.865497 | 0.023474 | 0 | 0 | 0 | 0 | 0.064309 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.8125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d6986e27cf802a59743b1e891aaab14d1df9b133 | 2,715 | py | Python | backend/src/msg/jsonMsg.py | frost917/customer-manager | d7d4c16f99e1548989bff85c20c307a844711eda | [
"Apache-2.0"
] | null | null | null | backend/src/msg/jsonMsg.py | frost917/customer-manager | d7d4c16f99e1548989bff85c20c307a844711eda | [
"Apache-2.0"
] | 1 | 2021-09-18T05:56:45.000Z | 2021-09-18T05:56:45.000Z | backend/src/msg/jsonMsg.py | frost917/customer-manager | d7d4c16f99e1548989bff85c20c307a844711eda | [
"Apache-2.0"
] | null | null | null | import json
from datetime import datetime


# Dict in List
def authFailedJson():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "AuthFailed"
    convDict['msg'] = "Authentication Failed!"
    convList.append(convDict)
    payload['failed'] = convList
    return str(json.dumps(payload))


def dataMissingJson():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "MissingData"
    convDict['msg'] = "some data is missing!"
    convList.append(convDict)
    payload['failed'] = convList
    return str(json.dumps(payload))


def customerNotFound(customerID):
    convDict = dict()
    convDict['error'] = 'CustomerNotFound'
    convDict['msg'] = 'customer is not found!'
    convDict['customerID'] = customerID
    convList = list()
    convList.append(convDict)
    payload = dict()
    payload['failed'] = convList
    # The built payload was silently dropped before; return it like the
    # other helpers do.
    return str(json.dumps(payload))


def jobNotFound(jobID):
    convDict = dict()
    convDict['error'] = 'jobNotFound'
    convDict['msg'] = 'job is not found!'
    convDict['jobID'] = jobID
    convList = list()
    convList.append(convDict)
    payload = dict()
    payload['failed'] = convList
    # Missing return added, matching the sibling builders.
    return str(json.dumps(payload))


def reserveNotFound(reserveID):
    convDict = dict()
    convDict['error'] = 'reserveNotFound'
    convDict['msg'] = 'reserve is not found!'
    convDict['reserveID'] = reserveID
    convList = list()
    convList.append(convDict)
    payload = dict()
    payload['failed'] = convList
    # Missing return added, matching the sibling builders.
    return str(json.dumps(payload))


def dataNotJSON():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "DataMustJSON"
    convDict['msg'] = "data must be JSON object!"
    convList.append(convDict)
    payload['failed'] = convList
    return str(json.dumps(payload))


def tokenInvalid():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "TokenInvalid"
    convDict['msg'] = "token is invalid!"
    convList.append(convDict)
    payload['failed'] = convList
    return json.dumps(payload)


def databaseIsGone():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "DatabaseIsGone"
    convDict['msg'] = "database is dead!"
    convDict['queryDate'] = datetime.now().strftime('%Y-%m-%d')
    convList.append(convDict)
    payload['failed'] = convList
    return json.dumps(payload)


def redisIsGone():
    payload = dict()
    convDict = dict()
    convList = list()
    convDict['error'] = "redisIsGone"
    convDict['msg'] = "redis is dead!"
    convDict['queryDate'] = datetime.now().strftime('%Y-%m-%d')
    convList.append(convDict)
    payload['failed'] = convList
    return json.dumps(payload)
def queryingResult(data: dict):
payload = dict() | 22.438017 | 63 | 0.632413 | 274 | 2,715 | 6.270073 | 0.215328 | 0.064028 | 0.11525 | 0.151921 | 0.583236 | 0.583236 | 0.583236 | 0.583236 | 0.4156 | 0.4156 | 0 | 0 | 0.220626 | 2,715 | 121 | 64 | 22.438017 | 0.811437 | 0.00442 | 0 | 0.62069 | 0 | 0 | 0.17475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.022989 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
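Every builder in this module produces the same `{'failed': [{...}]}` shape; a hedged refactor sketch collapsing them into one helper (hypothetical name `error_payload`, extra fields like `customerID` passed as keyword arguments):

```python
import json

def error_payload(error, msg, **extra):
    # Hedged refactor sketch: wrap one error dict in {'failed': [...]},
    # the shape shared by all builders above.
    conv = {'error': error, 'msg': msg}
    conv.update(extra)
    return json.dumps({'failed': [conv]})
```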
d69d1427809e16ff71adf65c1d277b74efb15e53 | 537 | py | Python | village_api/migrations/0008_alter_relationship_people.py | roseDickinson/dnd-village-api | ddcaf2ab9c1d496f150160ef8638526634752368 | [
"MIT"
] | 1 | 2021-08-10T21:40:35.000Z | 2021-08-10T21:40:35.000Z | village_api/migrations/0008_alter_relationship_people.py | roseDickinson/dnd-village-api | ddcaf2ab9c1d496f150160ef8638526634752368 | [
"MIT"
] | null | null | null | village_api/migrations/0008_alter_relationship_people.py | roseDickinson/dnd-village-api | ddcaf2ab9c1d496f150160ef8638526634752368 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.6 on 2021-08-30 14:17
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("village_api", "0007_auto_20210830_1327"),
    ]

    operations = [
        migrations.AlterField(
            model_name="relationship",
            name="people",
            field=models.ManyToManyField(
                related_name="relationships",
                through="village_api.Relation",
                to="village_api.Person",
            ),
        ),
    ]
| 23.347826 | 51 | 0.56797 | 51 | 537 | 5.823529 | 0.784314 | 0.10101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085635 | 0.325885 | 537 | 22 | 52 | 24.409091 | 0.734807 | 0.083799 | 0 | 0.125 | 1 | 0 | 0.210204 | 0.046939 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6b9635d03c20a241e2abb35126dd57faba06e46 | 811 | py | Python | 296-Best-Meeting-Point/Python/Solution01.py | Eroica-cpp/LeetCode | 07276bd11558f3d0e32bec768b09e886de145f9e | [
"CC-BY-3.0",
"MIT"
] | 7 | 2015-05-05T22:21:30.000Z | 2021-03-13T04:04:15.000Z | 296-Best-Meeting-Point/Python/Solution01.py | Eroica-cpp/LeetCode | 07276bd11558f3d0e32bec768b09e886de145f9e | [
"CC-BY-3.0",
"MIT"
] | null | null | null | 296-Best-Meeting-Point/Python/Solution01.py | Eroica-cpp/LeetCode | 07276bd11558f3d0e32bec768b09e886de145f9e | [
"CC-BY-3.0",
"MIT"
] | 2 | 2018-12-26T08:13:25.000Z | 2020-07-18T20:18:24.000Z | #!/usr/bin/python
"""
https://leetcode.com/problems/best-meeting-point/
Time O(n^2), Space O(n)
"""


class Solution(object):
    def minTotalDistance(self, grid):
        """
        :type grid: List[List[int]]
        :rtype: int
        """
        if not grid or not grid[0]: return 0
        addr = []
        m, n = len(grid), len(grid[0])
        for i in xrange(m):
            for j in xrange(n):
                if grid[i][j] == 1:
                    addr.append([i, j])
        N = len(addr)
        # The median (not the mean) minimizes the total Manhattan distance
        # along each axis, so take the middle coordinate per dimension.
        x_mid = sorted(i[0] for i in addr)[N / 2]
        y_mid = sorted(i[1] for i in addr)[N / 2]
        return sum(abs(i[0] - x_mid) + abs(i[1] - y_mid) for i in addr)


if __name__ == '__main__':
    grid = [[1,0,0,0,1],[0,0,0,0,0],[0,0,1,0,0]]
    # grid = [[0,1], [1,0]]
print Solution().minTotalDistance(grid) | 25.34375 | 69 | 0.491985 | 133 | 811 | 2.909774 | 0.368421 | 0.046512 | 0.046512 | 0.041344 | 0.093023 | 0.036176 | 0 | 0 | 0 | 0 | 0 | 0.048913 | 0.319359 | 811 | 32 | 70 | 25.34375 | 0.652174 | 0.046856 | 0 | 0 | 0 | 0 | 0.012924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
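For reference, the 1D median minimizes `sum(|x - xi|)` and the Manhattan distance separates per axis; a Python 3 sketch of the median-based solution:

```python
def min_total_distance(grid):
    # Collect row and column indices of all 1-cells; rows come out sorted
    # because of the row-major scan, columns are sorted explicitly.
    rows = [i for i, row in enumerate(grid) for v in row if v == 1]
    cols = sorted(j for row in grid for j, v in enumerate(row) if v == 1)

    def dist_to_median(coords):
        # The median minimizes the total 1D Manhattan distance.
        m = coords[len(coords) // 2]
        return sum(abs(c - m) for c in coords)

    return dist_to_median(rows) + dist_to_median(cols)
```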
d6ba5be930a7a3f8ad3c3a97a5f88e0d91793099 | 2,056 | py | Python | message_models.py | stefanSuYiGuo/FlaskAPI | b7a194391d481ffdcd542d24eedde3537f9d782e | [
"MIT"
] | null | null | null | message_models.py | stefanSuYiGuo/FlaskAPI | b7a194391d481ffdcd542d24eedde3537f9d782e | [
"MIT"
] | null | null | null | message_models.py | stefanSuYiGuo/FlaskAPI | b7a194391d481ffdcd542d24eedde3537f9d782e | [
"MIT"
] | null | null | null | from flask_sqlalchemy import SQLAlchemy
from flask import Flask
from datetime import datetime

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///" + "E:/Moses/College_Life/Year3_2/Software_Development_Workshop_III/Code/practice1/second.db"
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
app.config["SECRET_KEY"] = "theSecondPractice"

db = SQLAlchemy(app)  # instantiate the database


# Admin
class Admin(db.Model):
    __tablename__ = "admin"
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(32), nullable=False, unique=True)  # admin usernames must be unique
    password = db.Column(db.String(64), nullable=False)
    tags = db.relationship("Tag", backref="admin")


# Tag
class Tag(db.Model):
    __tablename__ = "tag"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10), nullable=False, unique=True)  # tag name
    admin_id = db.Column(db.Integer, db.ForeignKey("admin.id"))  # owning admin


# User
class User(db.Model):
    __tablename__ = "user"
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(32), nullable=False, unique=True)  # usernames must be unique
    password = db.Column(db.String(64), nullable=False)
    messages = db.relationship("Message", backref="user")  # reverse access (was misnamed "tags")


# Message
class Message(db.Model):
    __tablename__ = "message"
    id = db.Column(db.Integer, primary_key=True)
    content = db.Column(db.String(256), nullable=False)
    create_time = db.Column(db.DateTime, default=datetime.now)  # time the message was posted
    user_id = db.Column(db.Integer, db.ForeignKey("user.id"))  # owning user
    tags = db.relationship("Tag", secondary="message_to_tag", backref="messages")


# Association table
class MessageToTag(db.Model):
    __tablename__ = "message_to_tag"
    id = db.Column(db.Integer, primary_key=True)
    message_id = db.Column(db.Integer, db.ForeignKey("message.id"))  # owning message
    tag_id = db.Column(db.Integer, db.ForeignKey("tag.id"))  # owning tag


if __name__ == "__main__":
    db.create_all()
    # db.drop_all()
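MessageToTag is a classic many-to-many join table; the same pattern in raw SQL using stdlib sqlite3 with made-up sample rows:

```python
import sqlite3

# In-memory database mirroring the message / tag / message_to_tag schema;
# the sample rows below are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE message (id INTEGER PRIMARY KEY, content TEXT);
CREATE TABLE tag (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE message_to_tag (
    id INTEGER PRIMARY KEY,
    message_id INTEGER REFERENCES message(id),
    tag_id INTEGER REFERENCES tag(id)
);
INSERT INTO message VALUES (1, 'hello');
INSERT INTO tag VALUES (1, 'greeting'), (2, 'misc');
INSERT INTO message_to_tag VALUES (1, 1, 1), (2, 1, 2);
""")

# Resolve the tags of message 1 through the join table, as the
# secondary="message_to_tag" relationship does under the hood.
tags_for_msg = [r[0] for r in con.execute(
    "SELECT t.name FROM tag t JOIN message_to_tag mt ON mt.tag_id = t.id "
    "WHERE mt.message_id = 1 ORDER BY t.id")]
```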
| 33.704918 | 145 | 0.701362 | 278 | 2,056 | 4.97482 | 0.302158 | 0.092552 | 0.115691 | 0.078091 | 0.433839 | 0.433839 | 0.366594 | 0.276934 | 0.253073 | 0.201012 | 0 | 0.009169 | 0.151265 | 2,056 | 60 | 146 | 34.266667 | 0.783381 | 0.082198 | 0 | 0.230769 | 0 | 0 | 0.157388 | 0.075482 | 0.025641 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.051282 | 0.076923 | 0 | 0.820513 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
d6bbce0090eceace8041216de150e51ec2e8444c | 3,786 | py | Python | Step10_test_ensemble.py | lyoshiwo/resume_job_matching | 3abc59d5011abc96d0aba96d35fe8ddebd5dda25 | [
"Apache-2.0"
] | 37 | 2016-07-22T03:52:54.000Z | 2021-08-30T13:44:01.000Z | Step10_test_ensemble.py | ezcareer/resume_job_matching | 3abc59d5011abc96d0aba96d35fe8ddebd5dda25 | [
"Apache-2.0"
] | 3 | 2017-02-09T22:58:06.000Z | 2019-07-10T01:42:09.000Z | Step10_test_ensemble.py | ezcareer/resume_job_matching | 3abc59d5011abc96d0aba96d35fe8ddebd5dda25 | [
"Apache-2.0"
] | 26 | 2017-01-08T05:23:10.000Z | 2021-12-20T10:18:42.000Z | # encoding=utf8
import numpy as np
from sklearn import cross_validation
import pandas as pd
import os
import time
from keras.models import Sequential, model_from_json
import util


def score_lists(list_1, list_2):
    count = 0
    total = len(list_1)
    print total
    for i in range(total):
        if list_1[i] == list_2[i]:
            count += 1
    return float(count) / total


print time.strftime("%Y-%m-%d %H:%M:%S", time.localtime())


def get_esembel_score(name):
    if os.path.exists(util.features_prefix + name + "_XXXYYY.pkl") is False:
        print 'file does not exist'
        exit()
    [X_train, X_validate, X_test, y_train, y_validate, y_test] = pd.read_pickle(
        util.features_prefix + name + '_XXXYYY.pkl')
    import xgboost as xgb
    rf_clf_2 = pd.read_pickle(util.models_prefix + name + '_rf.pkl')
    list_all = []
    rf_2_list = rf_clf_2.predict(X_test)
    from sklearn.feature_selection import SelectFromModel
    model = SelectFromModel(rf_clf_2, prefit=True)
    temp = model.get_support()
    print sum(temp)
    list_all.append(rf_2_list)
    print rf_clf_2.score(X_test, y_test)
    xgb_2 = xgb.Booster({'nthread': 4})  # init model
    xgb_2.load_model(util.models_prefix + name + '_xgb.pkl')  # load data
    print len(xgb_2.get_fscore().keys())
    dtest = xgb.DMatrix(X_test)
    xgb_2_test = xgb_2.predict(dtest)
    list_all.append(xgb_2_test)
    print score_lists(xgb_2_test, y_test)
    from keras.utils import np_utils
    import copy
    [train_X, train_Y] = pd.read_pickle(util.features_prefix + name + '_XY.pkl')
    X_semantic = np.array(copy.deepcopy(X_test[:, range(95, 475)]))
    X_manual = np.array(copy.deepcopy(X_test[:, range(0, 95)]))
    X_cluster = np.array(copy.deepcopy(X_test[:, range(475, 545)]))
    X_document = np.array(copy.deepcopy(X_test[:, range(545, 547)]))
    X_document[:, [0]] = X_document[:, [0]] + train_X[:, [-1]].max()
    X_semantic = X_semantic.reshape(X_semantic.shape[0], 10, -1)
    X_semantic_1 = np.zeros((X_semantic.shape[0], X_semantic.shape[2], X_semantic.shape[1]))
    for i in range(int(X_semantic.shape[0])):
        X_semantic_1[i] = np.transpose(X_semantic[i])
    json_string = pd.read_pickle(util.models_prefix + name + '_json_string_cnn.pkl')
    model_cnn = model_from_json(json_string)
    model_cnn.load_weights(util.models_prefix + name + '_nn_weight_cnn.h5')
    cnn_list = model_cnn.predict_classes([X_document, X_cluster, X_manual, X_semantic_1])
    # cnn_list_prob = model_cnn.predict_proba([X_document, X_cluster, X_manual, X_semantic_1])
    kk = list(cnn_list)
    list_all.append(kk)
    print score_lists(kk, y_test)
    json_string = pd.read_pickle(util.models_prefix + name + '_json_string_lstm.pkl')
    model_lstm = model_from_json(json_string)
    model_lstm.load_weights(util.models_prefix + name + '_nn_weight_lstm.h5')
    lstm_list = model_lstm.predict_classes([X_document, X_cluster, X_manual, X_semantic_1])
    # lstm_list_prob = model_lstm.predict_proba([X_document, X_cluster, X_manual, X_semantic_1])
    kk = list(lstm_list)
    list_all.append(kk)
    print score_lists(kk, y_test)
    list_ensemble = []
    for i in range(len(y_test)):
        dict_all = {}
        for z in range(len(list_all)):
            dict_all[list_all[z][i]] = dict_all.setdefault(list_all[z][i], 0) + 1
        tmp_list = dict_all.items()
        list_ensemble.append(sorted(tmp_list, lambda a, b: -cmp(a[1], b[1]))[0][0])
    print score_lists(list_ensemble, y_test)
    print '**************************'


if __name__ == "__main__":
    for name in ['degree', 'position', 'salary', 'size']:
        get_esembel_score(name)
# xg
# 2016 - 07 - 16
# 23:39:28
# 2016 - 07 - 16
# 23:58:37
# 2016 - 07 - 17
# 00:34:06
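The final loop in `get_esembel_score` is a per-sample majority vote over the four models' predictions; a minimal Python 3 sketch of that vote (hypothetical helper name):

```python
from collections import Counter

def majority_vote(prediction_lists):
    # One list of predicted labels per model, all of equal length; returns
    # the most common label per sample (ties broken by first occurrence).
    n = len(prediction_lists[0])
    return [Counter(preds[i] for preds in prediction_lists).most_common(1)[0][0]
            for i in range(n)]
```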
| 38.632653 | 94 | 0.666931 | 602 | 3,786 | 3.878738 | 0.252492 | 0.057816 | 0.041113 | 0.051392 | 0.369165 | 0.369165 | 0.307495 | 0.214989 | 0.181585 | 0.181585 | 0 | 0.035761 | 0.194929 | 3,786 | 97 | 95 | 39.030928 | 0.730315 | 0.075806 | 0 | 0.051948 | 0 | 0 | 0.063396 | 0.013483 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6c65e57e0ed098807d853ba6fc1c8062316ebe8 | 3,437 | py | Python | helper/markup.py | sushi-irc/nigiri | 9e1137a80f350ea05ae76df93061d3dc188e1ba7 | [
"BSD-2-Clause"
] | 1 | 2017-07-24T19:31:19.000Z | 2017-07-24T19:31:19.000Z | helper/markup.py | sushi-irc/nigiri | 9e1137a80f350ea05ae76df93061d3dc188e1ba7 | [
"BSD-2-Clause"
] | null | null | null | helper/markup.py | sushi-irc/nigiri | 9e1137a80f350ea05ae76df93061d3dc188e1ba7 | [
"BSD-2-Clause"
] | 1 | 2019-01-31T19:16:16.000Z | 2019-01-31T19:16:16.000Z |
def get_occurances(haystack, needle, l=[], start=0, end=-1):
    """ return a list with all indices of needle in haystack """
    i = haystack.find(needle, start)
    if i == -1: return l
    return get_occurances(haystack, needle, l + [i], i + 1)


def tuple_to_list(t):
    """ Convert a markup tuple:
        (text,[(color,pos),...])
    To a markup list:
        [(color,textpart),...]
    This is the opposite to urwid.util.decompose_tagmarkup
    """
    (text, attrs) = t
    pc = 0
    l = []
    for (attr, pos) in attrs:
        if attr == None:
            l.append(text[pc:pc + pos])
        else:
            l.append((attr, text[pc:pc + pos]))
        pc += pos
    return l


def colorize_mult(text, needles, colors, base_color="default"):
    """ colorize more than one needle with a color.
    If base_color is given, the text not matching a needle
    is colored with the base_color.
    Example: colorize_mult("foo bar baz",["foo","baz"],
        {"foo":"green","baz":"blue"}) ->
        [("green","foo")," bar ",("blue","baz")]
    colorize_mult("foo bar baz",["foo","baz"],
        {"foo":"green","baz":"blue"}, base_color="red") ->
        [("green","foo"),("red"," bar "),("blue","baz")]
    Returns a markup list.
    """
    # list sorted descending by needle length
    needles = sorted(needles, cmp=lambda a, b: (
        (len(a) > len(b) and -1) or (len(a) < len(b) and 1) or 0))

    def join(l, elem):
        # interleave elem between the non-empty split pieces
        nl = []
        for i in l[:-1]:
            if i != '':
                nl.append(i)
            nl.append(elem)
        if l and l[-1] != '':
            nl.append(l[-1])
        return nl

    l = [text]
    for needle in needles:
        nl = []
        for i in l:
            if isinstance(i, basestring):
                split = i.split(needle)
                if len(split) > 1:
                    x = join(split, (colors[needle], needle))
                    nl = nl + x
                else:
                    nl.append(i)
            else:
                nl.append(i)
        l = nl
    if base_color != "default":
        nl = []
        for i in l:
            if isinstance(i, basestring):
                nl.append((base_color, i))
            else:
                nl.append(i)
        l = nl
    return l


def colorize(text, needle, color, base_color, start=0, end=-1):
    """ Color all occurances of needle in text in the given
    color. The markup will have the tuple-form like:
        (thetext,[(attr1,pos),...])
    """
    occ = get_occurances(text, needle, start=start, end=end)
    p = 0
    l = []
    for i in occ:
        if i - p > 0:
            l.append((base_color, i - p))
        l.append((color, len(needle)))
        # advance past the needle so run lengths stay correct when the
        # needle occurs more than once (p = i dropped the needle's width)
        p = i + len(needle)
    rest = len(text) - p
    if rest > 0:
        l.append((base_color, rest))
    return (text, l)


if __name__ == "__main__":
    print "Testing tuple_to_list."
    r = tuple_to_list(("foobarbaz", [("red", 3), ("green", 3), ("blue", 3)]))
    assert r == [("red", "foo"), ("green", "bar"), ("blue", "baz")]
    r = tuple_to_list(("foobarbaz", [("red", 3), (None, 3), ("blue", 3)]))
    assert r == [("red", "foo"), "bar", ("blue", "baz")]
    print "Success."

    print "Testing get_occurances."
    assert [0, 1, 2, 3] == get_occurances("aaaa", "a")
    assert [3] == get_occurances("abcde", "d")
    assert [] == get_occurances("foo", "x")
    print "Success."

    print "Testing colorize."
    text = "00:11 <nick> Das ist ein Test."
    color = "green"
    nick = "nick"
    base_color = "default"
    assert (text, [("default", 7), ("green", 4), ("default", 19)]) == colorize(
        text, nick, color, base_color)
    print "Success."

    print "Testing colorize_mult."
    assert colorize_mult("foo bar baz", ["foo", "baz"],
        {"foo": "green", "baz": "blue"}) == [
        ("green", "foo"), " bar ", ("blue", "baz")]
    assert colorize_mult("foo bar baz", ["foo", "baz"],
        {"foo": "green", "baz": "blue"}, base_color="red") == [
        ("green", "foo"), ("red", " bar "), ("blue", "baz")]
    print "Success."
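A Python 3 rendering of the `colorize` run-length idea, advancing past each match so the run lengths always sum to `len(text)` (hypothetical name to avoid clashing with the Python 2 original):

```python
def colorize3(text, needle, color, base_color):
    # Emit (attr, run_length) pairs covering the whole text; needle spans
    # get `color`, everything else gets `base_color`.
    runs, p = [], 0
    i = text.find(needle)
    while i != -1:
        if i > p:
            runs.append((base_color, i - p))
        runs.append((color, len(needle)))
        p = i + len(needle)          # skip past the match
        i = text.find(needle, p)
    if p < len(text):
        runs.append((base_color, len(text) - p))
    return (text, runs)
```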
| 24.034965 | 70 | 0.595578 | 535 | 3,437 | 3.747664 | 0.214953 | 0.053865 | 0.029925 | 0.03591 | 0.345636 | 0.247382 | 0.247382 | 0.172569 | 0.172569 | 0.140648 | 0 | 0.013181 | 0.183299 | 3,437 | 142 | 71 | 24.204225 | 0.701104 | 0.011347 | 0 | 0.293478 | 0 | 0 | 0.156299 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 0 | null | null | 0 | 0 | null | null | 0.086957 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6cba1917ffb02b7cd3b93a824a1ac1db2f02038 | 1,719 | py | Python | UE4Parse/Provider/Vfs/DirectoryStorageProvider.py | zbx911/pyUE4Parse | eb30d27a4341ad68a7104458a0fdfae6a360d153 | [
"MIT"
] | 13 | 2021-06-09T09:21:00.000Z | 2022-03-30T22:13:24.000Z | UE4Parse/Provider/Vfs/DirectoryStorageProvider.py | zbx911/pyUE4Parse | eb30d27a4341ad68a7104458a0fdfae6a360d153 | [
"MIT"
] | 3 | 2021-09-04T22:23:02.000Z | 2022-03-04T09:45:45.000Z | UE4Parse/Provider/Vfs/DirectoryStorageProvider.py | zbx911/pyUE4Parse | eb30d27a4341ad68a7104458a0fdfae6a360d153 | [
"MIT"
] | 6 | 2021-09-02T10:28:21.000Z | 2022-03-30T22:13:37.000Z | from functools import singledispatchmethod
from typing import Union, TYPE_CHECKING, List, Dict, Iterator, Optional, Tuple

from .DirectoryStorage import DirectoryStorage
from ..Common import GameFile
from ...IoObjects.FImportedPackage import FPackageId

if TYPE_CHECKING:
    from ...IO import FFileIoStoreReader
    from ...PakFile.PakReader import PakReader


class DirectoryStorageProvider:
    Storage: List[DirectoryStorage]
    IsCaseInsensitive: bool

    def __init__(self, is_case_insensitive: bool):
        self.IsCaseInsensitive = is_case_insensitive
        self.Storage = []

    # A generator method yields pairs, so the annotation is Iterator[...]
    # rather than a bare Tuple.
    def __iter__(self) -> Iterator[Tuple[str, GameFile]]:
        for storage in self.Storage:
            for k, v in storage.files.items():
                yield k, v

    def add_storage(self, storage: DirectoryStorage):
        self.Storage.append(storage)

    def add_index(self, index: Dict[str, GameFile], container: Union['FFileIoStoreReader', 'PakReader']):
        self.add_storage(DirectoryStorage(index, container, self.IsCaseInsensitive))

    def resolve_relative_path(self, path: str):
        for storage in self.Storage:
            result = storage.get_full_path(path)
            if result is not None:
                return result
        return None

    @singledispatchmethod
    def get(self, path) -> Optional[GameFile]:
        for storage in self.Storage:
            result = storage.try_get(path)
            if result is not None:
                return result
        return None

    @get.register
    def _(self, id: FPackageId) -> Optional[GameFile]:
        for storage in self.Storage:
            result = storage.try_get(id)
            if result is not None:
                return result
        return None
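`get` picks an overload based on the argument's runtime type via `functools.singledispatchmethod`; a minimal self-contained sketch of the same pattern (hypothetical `Lookup` class, Python 3.8+):

```python
from functools import singledispatchmethod

class Lookup:
    # The overload that runs depends on the runtime type of the first
    # argument after self, as in DirectoryStorageProvider.get.
    @singledispatchmethod
    def get(self, key):
        return "by-path:%s" % key

    @get.register
    def _(self, key: int):
        return "by-id:%d" % key
```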
| 31.833333 | 105 | 0.662013 | 191 | 1,719 | 5.832461 | 0.308901 | 0.06912 | 0.043088 | 0.057451 | 0.276481 | 0.276481 | 0.248654 | 0.216338 | 0.216338 | 0.181329 | 0 | 0 | 0.260617 | 1,719 | 53 | 106 | 32.433962 | 0.876475 | 0 | 0 | 0.309524 | 0 | 0 | 0.015707 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.547619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d6cd6aea5b21781c6a52e0b4e89c0caaee5c73ac | 1,389 | py | Python | almoxarifado/migrations/0013_auto_20171009_1843.py | rvmoura96/projeto-almoxarifado | 4ca5e5d00f449a940f7c601479bb3fe14c54f012 | [
"MIT"
] | 1 | 2019-05-24T17:39:01.000Z | 2019-05-24T17:39:01.000Z | almoxarifado/migrations/0013_auto_20171009_1843.py | rvmoura96/projeto-almoxarifado | 4ca5e5d00f449a940f7c601479bb3fe14c54f012 | [
"MIT"
] | null | null | null | almoxarifado/migrations/0013_auto_20171009_1843.py | rvmoura96/projeto-almoxarifado | 4ca5e5d00f449a940f7c601479bb3fe14c54f012 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.5 on 2017-10-09 21:43
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('almoxarifado', '0012_auto_20171009_1821'),
    ]

    operations = [
        migrations.CreateModel(
            name='Item',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('tipo', models.CharField(choices=[('Consumíveis', 'Consumíveis'), ('Periféricos', 'Periféricos')], default='Consumíveis', max_length=12)),
                ('item', models.CharField(max_length=170)),
                ('fabricante', models.CharField(default=None, max_length=90)),
                ('quantidade', models.IntegerField(default=0)),
                ('status', models.CharField(choices=[('Disponivel', ' Disponível'), ('Indisponivel', 'Indisponível')], default='Disponivel', max_length=12)),
                ('prateleira', models.IntegerField(default=0)),
            ],
        ),
        migrations.AlterField(
            model_name='equipamento',
            name='status',
            field=models.CharField(choices=[('Disponivel', 'Disponível'), ('Indisponivel', 'Indisponível')], default='Disponivel', max_length=12),
        ),
    ]
| 42.090909 | 158 | 0.590353 | 126 | 1,389 | 6.373016 | 0.563492 | 0.0934 | 0.082192 | 0.064757 | 0.234122 | 0.234122 | 0.234122 | 0.234122 | 0.234122 | 0.234122 | 0 | 0.044487 | 0.25558 | 1,389 | 32 | 159 | 43.40625 | 0.732108 | 0.048956 | 0 | 0.08 | 1 | 0 | 0.208398 | 0.017885 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.08 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6d5ff78a1c2500ab9f3c80b9b788fe32710508f | 1,815 | py | Python | taiwanpeaks/peaks/models.py | bhomnick/taiwanpeaks | 0ac3aee6493bc5e982646efd4afad5ede5db005f | [
"MIT"
] | 1 | 2020-11-17T15:17:01.000Z | 2020-11-17T15:17:01.000Z | taiwanpeaks/peaks/models.py | bhomnick/taiwanpeaks | 0ac3aee6493bc5e982646efd4afad5ede5db005f | [
"MIT"
] | 16 | 2020-12-04T11:05:25.000Z | 2020-12-05T09:00:05.000Z | taiwanpeaks/peaks/models.py | bhomnick/taiwanpeaks | 0ac3aee6493bc5e982646efd4afad5ede5db005f | [
"MIT"
] | null | null | null | from django.db import models

from common import constants


class Peak(models.Model):
    name = models.CharField(max_length=255)
    slug = models.SlugField(unique=True)
    name_zh = models.CharField(max_length=255, verbose_name='Name (中文)', help_text=(
        "This peak's Chinese name, i.e. 南湖大山"))
    height = models.IntegerField(help_text="Height in meters")
    latitude = models.DecimalField(max_digits=8, decimal_places=5, help_text="Positive number, 5 decimal places")
    longitude = models.DecimalField(max_digits=8, decimal_places=5, help_text="Positive number, 5 decimal places")
    rank = models.IntegerField(help_text="Peak rank from 1-100")
    difficulty = models.CharField(max_length=50, choices=constants.DIFFICULTY_CHOICES, help_text=(
        "A = beginner<br>B = intermediate<br>C = advanced<br>C+ = expert"
    ))
    list_photo = models.ForeignKey('photos.Photo', related_name='list_peaks', on_delete=models.PROTECT,
        help_text="Choose a good photo that's representative of this peak")
    locations = models.ManyToManyField('common.Location', help_text=(
        "Select any locations this peak falls in or borders. Most peaks have more than one."))
    national_park = models.CharField(max_length=50, choices=constants.NP_CHOICES, null=True, blank=True,
        help_text="If this peak is in a national park, select it here")

    class Meta:
        ordering = ['rank']

    def __str__(self):
        return self.name

    @property
    def location_list_short(self):
        return [l.name_short for l in self.locations.all()]

    @property
    def difficulty_index(self):
        return constants.DIFFICULTY_CHOICES.index_of(self.difficulty)

    @property
    def filter_tags(self):
        # Level
        tags = [f'filter-level-{self.difficulty}']
        return tags
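`common.constants` is not shown here; judging from the `index_of` call and the choices usage, `DIFFICULTY_CHOICES` could be a list of `(value, label)` pairs with a positional lookup — a hypothetical sketch, not the project's actual implementation:

```python
class Choices(list):
    # Hypothetical stand-in for common.constants.DIFFICULTY_CHOICES: a list
    # of (value, label) pairs plus a positional lookup by stored value.
    def index_of(self, value):
        return [v for v, _ in self].index(value)

DIFFICULTY_CHOICES = Choices([('A', 'beginner'), ('B', 'intermediate'),
                              ('C', 'advanced'), ('C+', 'expert')])
```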
| 40.333333 | 114 | 0.704683 | 247 | 1,815 | 5.02834 | 0.445344 | 0.057971 | 0.057971 | 0.077295 | 0.236715 | 0.193237 | 0.193237 | 0.125604 | 0.125604 | 0.125604 | 0 | 0.013559 | 0.187328 | 1,815 | 44 | 115 | 41.25 | 0.828475 | 0.002755 | 0 | 0.088235 | 0 | 0.029412 | 0.257743 | 0.016593 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.058824 | 0.088235 | 0.676471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d6d6fb750180ece205973e32db09b105c80ac154 | 514 | py | Python | src/collect_results_mutable_bitmap.py | jermp/mutable_rank_select | 0b7d59d283cc752eac3778548672acdf7823d325 | [
"MIT"
] | 22 | 2020-09-29T07:40:07.000Z | 2022-02-17T23:45:28.000Z | src/collect_results_mutable_bitmap.py | kiminh/mutable_rank_select | 25eeb4071bd198e35e09359a75fe3fbc06f34b21 | [
"MIT"
] | null | null | null | src/collect_results_mutable_bitmap.py | kiminh/mutable_rank_select | 25eeb4071bd198e35e09359a75fe3fbc06f34b21 | [
"MIT"
] | 3 | 2020-09-29T10:38:38.000Z | 2021-07-26T03:20:22.000Z | import sys, os

output_filename = sys.argv[1]

types = [
    "avx2_256_a",
    "avx512_256_a",
    "avx2_256_b",
    "avx512_256_b",
    "avx2_512_a",
    "avx512_512_a",
    "avx2_512_b",
    "avx512_512_b",
    "avx2_256_c",
    "avx512_256_c"]

for t in types:
    os.system("./perf_mutable_bitmap " + t + " flip 0.3 2>> " + output_filename + ".flip.txt")
    os.system("./perf_mutable_bitmap " + t + " rank 0.3 2>> " + output_filename + ".rank.txt")
    os.system("./perf_mutable_bitmap " + t + " select 0.3 2>> " + output_filename + ".select.txt")
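The three `os.system` lines differ only in the operation name; a sketch of a loop-based refactor that builds the same command strings (hypothetical helper name):

```python
def build_commands(types, output_filename, ops=("flip", "rank", "select")):
    # One command per (type, operation) pair, matching the strings the
    # os.system calls above assemble by hand.
    return ["./perf_mutable_bitmap %s %s 0.3 2>> %s.%s.txt" % (t, op, output_filename, op)
            for t in types for op in ops]
```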
| 23.363636 | 98 | 0.651751 | 85 | 514 | 3.588235 | 0.341176 | 0.183607 | 0.118033 | 0.186885 | 0.442623 | 0.27541 | 0.190164 | 0 | 0 | 0 | 0 | 0.138568 | 0.157588 | 514 | 21 | 99 | 24.47619 | 0.56582 | 0 | 0 | 0 | 0 | 0 | 0.484436 | 0.122568 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6daa8f5ccd7dce93b9ead99394ae7a2a57bf95b | 3,941 | py | Python | Flask_security/flask_1.py | SAVE-POlNT/Flask_shared_auth | 95374bedfd914f7cbe34424619409ddc7e2ecb95 | [
"MIT"
] | 2 | 2020-10-27T14:56:39.000Z | 2021-12-07T13:00:59.000Z | Flask_security/flask_1.py | SAVE-POlNT/Flask_shared_auth | 95374bedfd914f7cbe34424619409ddc7e2ecb95 | [
"MIT"
] | null | null | null | Flask_security/flask_1.py | SAVE-POlNT/Flask_shared_auth | 95374bedfd914f7cbe34424619409ddc7e2ecb95 | [
"MIT"
] | null | null | null | from flask import Flask, redirect, url_for, render_template, request, session, url_for, flash
from captchacreater import create_image_captcha
from sendmail_func import sendMail, validMail
from TokenGenerator import getTokenUser, setToken, ChangeTokenUser
from mysqlhostedwithpython import getUserPassword, setUserData, init
from MainPage import getClientData, setClientData
from flask import g
import os
import time
from datetime import timedelta
from models import app, db
from Cryptage import encode, decode
app.secret_key = os.urandom(21)
# session.permanent = True
########################################################################################
@app.route("/", methods=['GET', 'POST'])
def register():
email = request.form.get('email')
name = request.form.get('name')
password = request.form.get('passw')
prénom = request.form.get('prénom')
if(email is not None):
try:
if(validMail(email)):
if(len(password) < 8):
return render_template("register.html", error="Password too short please make sure it is at least 8 characters")
else:
tk = setToken(email=encode(email, 11))
ud = setUserData(email=encode(email, 12), login=encode(
password, 12), passwords=init())
cd = setClientData(email=encode(email, 13), nom=encode(name, 13), prénom=encode(prénom, 13),
sex='H', balance='0.0', incomes='0.0', expenses='0.0')
return render_template("signin.html", wrongpassword="")
else:
return render_template("register.html", error="entrez un mail valide ! ")
except:
return render_template("register.html", error="votre mail existe deja")
return render_template("register.html", error="")
@app.route("/account", methods=['POST'])
def account():
    tkn = request.form["token"]
    if session.get('userMail') is not None:
        z = getClientData(encode(session['userMail'], 13))
        user_tkn = str(int(getTokenUser(encode(session['userMail'], 11))))
        print("user token = " + user_tkn)
        if tkn == user_tkn:
            ChangeTokenUser(encode(session['userMail'], 11))
            return render_template("MainPage.html", name=decode(z[0], 13), firstname=decode(z[1], 13), balance=z[2], Incomes=z[3], Expenses=z[4])
        else:
            return render_template("token.html", error='error : wrong token!')
    else:
        return redirect('/signin')
@app.route("/signin/")
def signin():
    return render_template("signin.html", wrongpassword='')
@app.route("/token", methods=['POST'])
def token():
    with open("message.html", "r") as f:
        ms = f.read()
    mail = request.form["email"]
    z = getClientData(encode(request.form["email"], 13))
    ms = str.replace(ms, "name", decode(z[0], 13))
    print(ms)
    print("coded mail : " + mail)
    print("coded mail : " + encode(mail, 11))
    print("coded mail : " + encode(mail, 12))
    print("coded mail : " + encode(mail, 13))
    session['userMail'] = mail
    password = request.form["passw"]
    print("password entered is : " + password)
    print("password is : " + decode(str(getUserPassword(encode(mail, 12), init())), 12))
    if password == decode(str(getUserPassword(encode(mail, 12), init())), 12):
        sendMail(mail, "Bank Token", ms)
        return render_template("token.html", error='')
    return render_template("signin.html", wrongpassword='Invalid Password Or Mail retry !')
@app.route("/resend")
def resendToken():
    with open("./securité/message.html", "r") as f:
        ms = f.read()
    sendMail(session['userMail'], "Bank Token", ms)
    return render_template("token.html")
if __name__ == "__main__":
    app.run(debug=True)
| 37.179245 | 146 | 0.598325 | 456 | 3,941 | 5.107456 | 0.302632 | 0.072134 | 0.094461 | 0.048089 | 0.270502 | 0.227565 | 0.091885 | 0.091885 | 0 | 0 | 0 | 0.018623 | 0.236996 | 3,941 | 105 | 147 | 37.533333 | 0.755903 | 0.00609 | 0 | 0.096386 | 0 | 0 | 0.163622 | 0.006179 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060241 | false | 0.144578 | 0.144578 | 0.012048 | 0.349398 | 0.096386 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
d6de9c1169e887eca336cb92869b544db4b46ed3 | 480 | py | Python | tripcore/inventory/migrations/0002_fixture_source.py | robotlightsyou/trip | 5a58babe399febb476cfb42a530ead20937597fd | [
"MIT"
] | null | null | null | tripcore/inventory/migrations/0002_fixture_source.py | robotlightsyou/trip | 5a58babe399febb476cfb42a530ead20937597fd | [
"MIT"
] | null | null | null | tripcore/inventory/migrations/0002_fixture_source.py | robotlightsyou/trip | 5a58babe399febb476cfb42a530ead20937597fd | [
"MIT"
] | null | null | null | # Generated by Django 3.1.1 on 2020-09-17 20:12
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('inventory', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='fixture',
            name='source',
            field=models.CharField(choices=[('led', 'LED'), ('conventional', 'Conventional'), ('mover', 'Mover')], default='conventional', max_length=20),
        ),
    ]
| 25.263158 | 154 | 0.597917 | 49 | 480 | 5.795918 | 0.755102 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058011 | 0.245833 | 480 | 18 | 155 | 26.666667 | 0.726519 | 0.09375 | 0 | 0 | 1 | 0 | 0.198614 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6e6ceef6e599caf236cd792e310118de765b012 | 474 | py | Python | project_name/settings/prod.py | ar4s/django-good-settings-template | 9f465361a0b3aeb08bf4f0ee2690a5f0952bbdd3 | [
"MIT"
] | 3 | 2016-10-15T18:52:09.000Z | 2017-02-04T07:07:15.000Z | project_name/settings/prod.py | ar4s/django-good-settings-template | 9f465361a0b3aeb08bf4f0ee2690a5f0952bbdd3 | [
"MIT"
] | null | null | null | project_name/settings/prod.py | ar4s/django-good-settings-template | 9f465361a0b3aeb08bf4f0ee2690a5f0952bbdd3 | [
"MIT"
] | null | null | null | from {{ project_name }}.settings.base import * # noqa
DEBUG = False
ALLOWED_HOSTS = ['{{ project_name }}.local']
ROOT_URLCONF = '{{ project_name }}.urls'
CSRF_COOKIE_SECURE = True
CSRF_COOKIE_HTTPONLY = True
X_FRAME_OPTIONS = 'DENY'
SECURE_SSL_REDIRECT = True
SECURE_BROWSER_XSS_FILTER = True
SECURE_CONTENT_TYPE_NOSNIFF = True
SESSION_COOKIE_SECURE = True
SESSION_ENGINE = 'redis_sessions.session'
SESSION_REDIS_UNIX_DOMAIN_SOCKET_PATH = '/var/run/redis/redis.sock'
| 23.7 | 67 | 0.7827 | 65 | 474 | 5.276923 | 0.661538 | 0.09621 | 0.093294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116034 | 474 | 19 | 68 | 24.947368 | 0.818616 | 0.008439 | 0 | 0 | 0 | 0 | 0.209402 | 0.100427 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.076923 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6e90ca78df35fc9a551f63bf42adf35795859be | 1,294 | py | Python | pagapp/service_pages/forms.py | eugeneandrienko/PyArtistsGallery | b75114955859d45d9dfb5c901213f25a6e09f488 | [
"MIT"
] | null | null | null | pagapp/service_pages/forms.py | eugeneandrienko/PyArtistsGallery | b75114955859d45d9dfb5c901213f25a6e09f488 | [
"MIT"
] | null | null | null | pagapp/service_pages/forms.py | eugeneandrienko/PyArtistsGallery | b75114955859d45d9dfb5c901213f25a6e09f488 | [
"MIT"
] | null | null | null | """Forms used on the "first run" page."""
from flask_wtf import Form
from wtforms import SubmitField, StringField, PasswordField, TextAreaField
from wtforms.validators import DataRequired, EqualTo
class FirstRunForm(Form):
    """First-run form.

    Form for the web page that is shown
    at the first run of the application.
    """
    gallery_title = StringField(
        "Gallery title:",
        validators=[DataRequired()],
        description="Gallery title name")
    gallery_description = TextAreaField(
        "Gallery description:",
        validators=[DataRequired()],
        description="Gallery description")
    username = StringField(
        "Administrator name:",
        validators=[DataRequired()],
        description="Name of gallery administrator")
    password = PasswordField(
        "Administrator's password",
        validators=[DataRequired()],
        description="Password of the gallery administrator")
    password2 = PasswordField(
        "Retype administrator's password",
        validators=[DataRequired(),
                    EqualTo('password',
                            message="Passwords must match")],
        description="Retype password")
    submit_button = SubmitField(
        "Save settings",
        description="Save settings")
| 32.35 | 74 | 0.647604 | 114 | 1,294 | 7.315789 | 0.403509 | 0.131894 | 0.158273 | 0.095923 | 0.105516 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001036 | 0.25425 | 1,294 | 39 | 75 | 33.179487 | 0.863212 | 0.098918 | 0 | 0.137931 | 0 | 0 | 0.24606 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.310345 | 0.103448 | 0 | 0.344828 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
d6f548e7f01a1da4ed24bbe693aac24e3c441cd0 | 11,940 | py | Python | CLI.py | Cryptoc1/CMS | 05f867b2cc4b0d40684059b600e35b479c107baf | [
"MIT"
] | 1 | 2015-03-24T05:21:54.000Z | 2015-03-24T05:21:54.000Z | CLI.py | Cryptoc1/CMS | 05f867b2cc4b0d40684059b600e35b479c107baf | [
"MIT"
] | 4 | 2015-01-27T17:53:23.000Z | 2015-02-01T19:54:30.000Z | CLI.py | Cryptoc1/CMS | 05f867b2cc4b0d40684059b600e35b479c107baf | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from CMS import CMS
import sys
import os
# Initiate the CMS Class
manager = CMS(hostname="localhost", username="test", password="test123", db="CMS")
# Misc strings
tchar = ": "
working = tchar + "Working..."
# Checks if user input is an in-program command
def verify_input(cmd):
    new_cmd = str(cmd).lower()
    if new_cmd == "exit":
        print tchar + "Exiting CMS..."
        sys.exit()
    elif new_cmd == "help" or new_cmd == "?":
        print tchar + "\n" + tchar + "clear :: Clear the screen\n" + tchar + "exit :: Exit the program\n" + tchar + "help or ? :: Display this help text"
        print tchar + "get <arg>\n" + tchar + "\tall :: Display all entries\n" + tchar + "\tpost :: Display entry that matches specified Selection Query"
        print tchar + "new <arg>\n" + tchar + "\tpost :: Create a new entry"
        print tchar + "update <arg>\n" + tchar + "\tpost :: Update specified fields of an entry"
        print tchar + "delete <arg>\n" + tchar + "\tpost :: Delete an entry"
        print tchar
        main()
    elif new_cmd == "clear":
        os.system("clear")
        main()
    elif new_cmd == "get":
        print tchar + "Get what?\n" + tchar + "\tget <arg>\n" + tchar + "\t\tall :: Display all entries\n" + tchar + "\t\tpost :: Display entry that matches specified Selection Query"
        main()
    elif new_cmd == "new":
        print tchar + "New what?\n" + tchar + "\tnew <arg>\n" + tchar + "\t\tpost :: Create a new entry"
        main()
    elif new_cmd == "update":
        print tchar + "Update what?\n" + tchar + "\tupdate <arg>\n" + tchar + "\t\tpost :: Update specified fields of an entry"
        main()
    elif new_cmd == "delete":
        print tchar + "Delete what?\n" + tchar + "\tdelete <arg>\n" + tchar + "\t\tpost :: Deletes an entry"
        main()
    else:
        if new_cmd not in commands:
            default()
        else:
            commands[new_cmd]()
# Interactively gets a post using CMS.get_post_by_*(), then prints formatted results
def get_post():
    print tchar + "Select by pid, title, or author?"
    method = str(raw_input(tchar)).lower()
    if method == "pid":
        print tchar + "Enter Post ID (pid)"
        pid = raw_input(tchar)
        try:
            pid = int(pid)
        except ValueError:
            print tchar + "Value entered was not a number."
            get_post()
            return
        print working
        post = manager.get_post_by_id(pid)
        print organize_post_data(post)
        main()
    elif method == "title":
        print tchar + "Enter Post Title (title)"
        title = str(raw_input(tchar))
        print working
        posts = manager.get_posts_by_title(title)
        print organize_post_data(posts)
        main()
    elif method == "author":
        print tchar + "Enter Post Author (author)"
        author = str(raw_input(tchar))
        print working
        posts = manager.get_posts_by_author(author)
        print organize_post_data(posts)
        main()
    else:
        print "Invalid Selection Method."
        get_post()
# Prints all post entries in the Post table with JSON-like formatting
def get_all():
    count = manager.get_entry_count()
    if count == 1:
        print tchar + "There is " + str(count) + " total entry.\n" + tchar + "Are you sure you want to list them all? (y/n)"
    else:
        print tchar + "There are " + str(count) + " total entries\n" + tchar + "Are you sure you want to list them all? (y/n)"
    choice = raw_input(tchar).lower()
    if choice == "y":
        print working
        print organize_post_data(manager.get_all_posts())
        main()
    elif choice == "n":
        print tchar + "Okay, exiting \"get all\" command."
        main()
    else:
        print tchar + "There was an error... Exiting \"get all\" command."
        main()
# Interactively creates a new post using CMS.new_post()
def new_post():
    print tchar + "You are about to create a new post! A series of prompts will ask you to enter some information.\n Continue? (y/n)"
    choice = raw_input(tchar).lower()
    if choice == "y":
        print tchar + "Enter the title of this Post"
        title = raw_input(tchar + "\ttitle: ")
        print tchar + "Enter author of this Post"
        author = raw_input(tchar + "\tauthor: ")
        print tchar + "Enter path to markdown file"
        path = raw_input(tchar + "\tpath to md: ")
        f = open(path, 'r')
        content = f.read()
        print tchar + "You are about to create the post " + title + ". \n Continue? (y/n)"
        choice = raw_input(tchar).lower()
        if choice == "y":
            print working
            if manager.new_post(title, author, content):
                f.close()
                print tchar + "New Post created. To view it, use \"get post\" with pid: " + str(manager.get_entry_count())
                main()
            else:
                print tchar + "Failed to create new post."
                main()
        elif choice == "n":
            print tchar + "Okay, exiting \"new post\" command."
        else:
            print tchar + "There was an error... Exiting \"new post\" command."
    elif choice == "n":
        print tchar + "Okay, exiting \"new post\" command."
        main()
    else:
        print tchar + "There was an error... Exiting \"new post\" command."
        main()
# Interactively updates specified values of a post using CMS.update_post_*()
def update_post():
    print tchar + "You're about to update a post! A series of prompts will ask you for update information.\n Continue? (y/n)"
    choice = raw_input(tchar).lower()
    if choice == "y":
        print tchar + "Enter the Post ID (pid) of the post to update"
        pid = raw_input(tchar)
        print tchar + "What attribute do you want to update: title, author, or content?"
        attr = raw_input(tchar).lower()
        if attr == "title":
            print tchar + "Enter the new title for Post with pid: " + str(pid)
            title = raw_input(tchar + "\ttitle: ")
            print tchar + "You are about to update Post with pid: " + str(pid) + " with the new title:\"" + title + "\". \nContinue? (y/n)"
            choice = raw_input(tchar).lower()
            if choice == "y":
                print working
                if manager.update_post_title(pid, title):
                    print tchar + "Updated post. Use \"get post\" with pid: " + str(pid) + " to view changes."
                    main()
                else:
                    print tchar + "Failed to update post."
            elif choice == "n":
                print tchar + "Okay, exiting \"update post\" command."
                main()
            else:
                print tchar + "There was an error... Exiting \"update post\" command."
                main()
        elif attr == "author":
            print tchar + "Enter the new author for Post with pid: " + str(pid)
            author = raw_input(tchar + "\tauthor: ")
            print tchar + "You are about to update Post with pid: " + str(pid) + " with the new author:\"" + author + "\". \nContinue? (y/n)"
            choice = raw_input(tchar).lower()
            if choice == "y":
                print working
                if manager.update_post_author(pid, author):
                    print tchar + "Updated post. Use \"get post\" with pid: " + str(pid) + " to view changes."
                    main()
                else:
                    print tchar + "Failed to update post."
            elif choice == "n":
                print tchar + "Okay, exiting \"update post\" command."
                main()
            else:
                print tchar + "There was an error... Exiting \"update post\" command."
                main()
        elif attr == "content":
            print tchar + "Enter the path to the markdown file containing the new post content for post with pid: " + str(pid)
            path = raw_input(tchar + "\tpath to md: ")
            f = open(path, 'r')
            content = f.read()
            print tchar + "You are about to update Post with pid: " + str(pid) + " with the new content.\nContinue? (y/n)"
            choice = raw_input(tchar).lower()
            if choice == "y":
                print working
                if manager.update_post_content(pid, content):
                    f.close()
                    print tchar + "Post content updated. Use \"get post\" with pid: " + str(pid) + " to view changes."
                    main()
                else:
                    print tchar + "Failed to update content."
                    main()
            elif choice == "n":
                print tchar + "Okay, exiting \"update post\" command."
                main()
            else:
                print tchar + "There was an error... Exiting \"update post\" command."
                main()
        else:
            print tchar + "Invalid attribute."
            update_post()
    elif choice == "n":
        print tchar + "Okay, exiting \"update post\" command."
        main()
    else:
        print tchar + "There was an error... Exiting \"update post\" command."
        main()
# Interactively removes a specified post entry
def delete_post():
    print tchar + "You are about to delete a post! This action can not be reversed. \nContinue? (y/n)"
    choice = raw_input(tchar).lower()
    if choice == "y":
        print tchar + "Enter Post ID (pid) of post to delete"
        pid = raw_input(tchar)
        print tchar + "Are you sure you want to delete Post with pid:\"" + str(pid) + "\"? (y/n)"
        choice = raw_input(tchar)
        if choice == "y":
            if manager.remove_post(pid):
                print tchar + "Post with pid:\"" + str(pid) + "\" deleted."
                main()
            else:
                print tchar + "Failed to delete post."
                main()
        elif choice == "n":
            print tchar + "Okay, exiting \"delete post\" command."
            main()
        else:
            print tchar + "There was an error... Exiting \"delete post\" command."
            main()
    elif choice == "n":
        print tchar + "Okay, exiting \"delete post\" command."
        main()
    else:
        print tchar + "There was an error... Exiting \"delete post\" command."
        main()
# Called whenever an unrecognized command is used
def default():
    print tchar + "Unrecognized command, please try again."
    main()

# Dictionary of possible commands, used for a sly C-style switch/case
commands = {
    "get post": get_post,
    "get all": get_all,
    "new post": new_post,
    "update post": update_post,
    "delete post": delete_post
}
# Formats a post entry into a JSON-like string for printing on the screen
def organize_post_data(post_data):
    post_as_string = ""
    print tchar + "The content property is returned as inline markdown,\n" + tchar + "meaning escape characters (\\n, \\t, etc.) will be processed."
    if type(post_data) == list:
        for i in post_data:
            post_as_string += "{\n\t \"pid\": " + str(i["pid"]) + "\n\t \"title\": \"" + i["title"] + "\"\n\t \"author\": \"" + i["author"] + "\"\n\t \"content\": \"" + i["content"] + "\"\n}"
    else:
        post_as_string = "{\n\t \"pid\": " + str(post_data["pid"]) + "\n\t \"title\": \"" + post_data["title"] + "\"\n\t \"author\": \"" + post_data["author"] + "\"\n\t \"content\": \"" + post_data["content"] + "\"\n}"
    return post_as_string

# Program 'loop'
def main():
    cmd = raw_input(tchar)
    verify_input(cmd)

# Starting the program... Now!
os.system("clear")
print tchar + "Starting CMS..."
print tchar + "To exit, type 'exit'. To view more commands, type 'help' or '?'."
print tchar + "Enter a command to begin."
main()
| 42.795699 | 217 | 0.552094 | 1,507 | 11,940 | 4.305906 | 0.143331 | 0.104793 | 0.046078 | 0.033287 | 0.514255 | 0.458776 | 0.411003 | 0.37186 | 0.337648 | 0.335799 | 0 | 0.00049 | 0.315662 | 11,940 | 278 | 218 | 42.94964 | 0.793661 | 0.056114 | 0 | 0.517928 | 0 | 0.011952 | 0.326048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.003984 | 0.011952 | null | null | 0.326693 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6fc3049a49a39b90a00f7d33f6531de3096f159 | 500 | py | Python | boards/migrations/0002_board_category.py | AxioDL/axiodl-www | f5011f3ee3ad8e45b72884c6ea238fb4cea7929c | [
"MIT"
] | null | null | null | boards/migrations/0002_board_category.py | AxioDL/axiodl-www | f5011f3ee3ad8e45b72884c6ea238fb4cea7929c | [
"MIT"
] | 7 | 2021-03-18T23:27:44.000Z | 2022-03-11T23:52:23.000Z | boards/migrations/0002_board_category.py | AxioDL/axiodl-www | f5011f3ee3ad8e45b72884c6ea238fb4cea7929c | [
"MIT"
] | null | null | null | # Generated by Django 2.2.3 on 2019-07-09 17:10
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('boards', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='board',
            name='category',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, to='boards.Category'),
        ),
    ]
| 25 | 144 | 0.642 | 59 | 500 | 5.372881 | 0.661017 | 0.07571 | 0.088328 | 0.138801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049608 | 0.234 | 500 | 19 | 145 | 26.315789 | 0.778068 | 0.09 | 0 | 0 | 1 | 0 | 0.101545 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6fd19e1a3336009f28a9d61374e30c956f808cc | 16,569 | py | Python | design.py | ilyxych96/ScanEP | daca880fcf9f186dd175b82fa8e1c80776c9e6e6 | [
"MIT"
] | 2 | 2020-08-16T06:39:18.000Z | 2021-05-03T11:39:32.000Z | design.py | ilyxych96/ScanEP | daca880fcf9f186dd175b82fa8e1c80776c9e6e6 | [
"MIT"
] | 2 | 2022-01-13T03:47:51.000Z | 2022-03-12T00:57:46.000Z | design.py | ilyxych96/ScanEP | daca880fcf9f186dd175b82fa8e1c80776c9e6e6 | [
"MIT"
] | 1 | 2020-08-25T00:20:27.000Z | 2020-08-25T00:20:27.000Z | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'design.ui'
#
# Created by: PyQt5 UI code generator 5.14.1
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore, QtGui, QtWidgets
class Ui_MainWindow(object):
    def setupUi(self, MainWindow):
        MainWindow.setObjectName("MainWindow")
        MainWindow.resize(685, 643)
        MainWindow.setMinimumSize(QtCore.QSize(685, 0))
        self.centralwidget = QtWidgets.QWidget(MainWindow)
        self.centralwidget.setObjectName("centralwidget")
        self.verticalLayout_3 = QtWidgets.QVBoxLayout(self.centralwidget)
        self.verticalLayout_3.setObjectName("verticalLayout_3")
        self.gridLayout = QtWidgets.QGridLayout()
        self.gridLayout.setObjectName("gridLayout")
        self.toolButton_6 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_6.setObjectName("toolButton_6")
        self.gridLayout.addWidget(self.toolButton_6, 10, 2, 1, 1)
        self.toolButton_2 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_2.setObjectName("toolButton_2")
        self.gridLayout.addWidget(self.toolButton_2, 2, 2, 1, 1)
        self.label_4 = QtWidgets.QLabel(self.centralwidget)
        self.label_4.setAlignment(QtCore.Qt.AlignCenter)
        self.label_4.setObjectName("label_4")
        self.gridLayout.addWidget(self.label_4, 23, 1, 1, 1)
        self.verticalLayout = QtWidgets.QVBoxLayout()
        self.verticalLayout.setObjectName("verticalLayout")
        self.gridLayout.addLayout(self.verticalLayout, 17, 0, 1, 1)
        self.toolButton = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton.setObjectName("toolButton")
        self.gridLayout.addWidget(self.toolButton, 1, 2, 1, 1)
        self.pushButton_5 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_5.setObjectName("pushButton_5")
        self.gridLayout.addWidget(self.pushButton_5, 26, 0, 1, 4)
        self.label_8 = QtWidgets.QLabel(self.centralwidget)
        self.label_8.setObjectName("label_8")
        self.gridLayout.addWidget(self.label_8, 6, 0, 1, 1)
        self.label_22 = QtWidgets.QLabel(self.centralwidget)
        self.label_22.setObjectName("label_22")
        self.gridLayout.addWidget(self.label_22, 15, 0, 1, 1)
        self.horizontalLayout_2 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_2.setObjectName("horizontalLayout_2")
        self.label_24 = QtWidgets.QLabel(self.centralwidget)
        self.label_24.setObjectName("label_24")
        self.horizontalLayout_2.addWidget(self.label_24)
        self.spinBox = QtWidgets.QSpinBox(self.centralwidget)
        self.spinBox.setObjectName("spinBox")
        self.horizontalLayout_2.addWidget(self.spinBox)
        self.label_23 = QtWidgets.QLabel(self.centralwidget)
        self.label_23.setObjectName("label_23")
        self.horizontalLayout_2.addWidget(self.label_23, 0, QtCore.Qt.AlignRight)
        self.spinBox_2 = QtWidgets.QSpinBox(self.centralwidget)
        self.spinBox_2.setObjectName("spinBox_2")
        self.horizontalLayout_2.addWidget(self.spinBox_2)
        self.gridLayout.addLayout(self.horizontalLayout_2, 20, 0, 1, 4)
        self.label_5 = QtWidgets.QLabel(self.centralwidget)
        self.label_5.setObjectName("label_5")
        self.gridLayout.addWidget(self.label_5, 1, 0, 1, 1)
        self.comboBox = QtWidgets.QComboBox(self.centralwidget)
        self.comboBox.setObjectName("comboBox")
        self.comboBox.addItem("")
        self.comboBox.addItem("")
        self.gridLayout.addWidget(self.comboBox, 15, 1, 1, 1)
        self.toolButton_5 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_5.setObjectName("toolButton_5")
        self.gridLayout.addWidget(self.toolButton_5, 9, 2, 1, 1)
        self.lineEdit_2 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_2.setObjectName("lineEdit_2")
        self.gridLayout.addWidget(self.lineEdit_2, 2, 1, 1, 1)
        self.pushButton_2 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_2.setObjectName("pushButton_2")
        self.gridLayout.addWidget(self.pushButton_2, 6, 3, 1, 1)
        self.toolButton_7 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_7.setObjectName("toolButton_7")
        self.gridLayout.addWidget(self.toolButton_7, 11, 2, 1, 1)
        self.lineEdit_6 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_6.setObjectName("lineEdit_6")
        self.gridLayout.addWidget(self.lineEdit_6, 10, 1, 1, 1)
        self.label_18 = QtWidgets.QLabel(self.centralwidget)
        self.label_18.setAlignment(QtCore.Qt.AlignCenter)
        self.label_18.setObjectName("label_18")
        self.gridLayout.addWidget(self.label_18, 17, 1, 1, 1)
        self.toolButton_10 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_10.setObjectName("toolButton_10")
        self.gridLayout.addWidget(self.toolButton_10, 18, 2, 1, 1)
        self.toolButton_3 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_3.setObjectName("toolButton_3")
        self.gridLayout.addWidget(self.toolButton_3, 5, 2, 1, 1)
        self.toolButton_8 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_8.setObjectName("toolButton_8")
        self.gridLayout.addWidget(self.toolButton_8, 12, 2, 1, 1)
        self.lineEdit_5 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_5.setObjectName("lineEdit_5")
        self.gridLayout.addWidget(self.lineEdit_5, 9, 1, 1, 1)
        self.label_15 = QtWidgets.QLabel(self.centralwidget)
        self.label_15.setObjectName("label_15")
        self.gridLayout.addWidget(self.label_15, 3, 0, 1, 4)
        self.label_17 = QtWidgets.QLabel(self.centralwidget)
        self.label_17.setObjectName("label_17")
        self.gridLayout.addWidget(self.label_17, 21, 0, 1, 4)
        self.lineEdit_11 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_11.setObjectName("lineEdit_11")
        self.gridLayout.addWidget(self.lineEdit_11, 18, 1, 1, 1)
        self.lineEdit_10 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_10.setObjectName("lineEdit_10")
        self.gridLayout.addWidget(self.lineEdit_10, 14, 1, 1, 1)
        self.label_16 = QtWidgets.QLabel(self.centralwidget)
        self.label_16.setObjectName("label_16")
        self.gridLayout.addWidget(self.label_16, 7, 0, 1, 4)
        self.label = QtWidgets.QLabel(self.centralwidget)
        self.label.setAlignment(QtCore.Qt.AlignCenter)
        self.label.setObjectName("label")
        self.gridLayout.addWidget(self.label, 0, 1, 1, 1)
        self.label_6 = QtWidgets.QLabel(self.centralwidget)
        self.label_6.setObjectName("label_6")
        self.gridLayout.addWidget(self.label_6, 2, 0, 1, 1)
        self.label_9 = QtWidgets.QLabel(self.centralwidget)
        self.label_9.setObjectName("label_9")
        self.gridLayout.addWidget(self.label_9, 9, 0, 1, 1)
        self.lineEdit = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit.setObjectName("lineEdit")
        self.gridLayout.addWidget(self.lineEdit, 1, 1, 1, 1)
        self.pushButton_3 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_3.setObjectName("pushButton_3")
        self.gridLayout.addWidget(self.pushButton_3, 15, 3, 1, 1)
        self.label_14 = QtWidgets.QLabel(self.centralwidget)
        self.label_14.setObjectName("label_14")
        self.gridLayout.addWidget(self.label_14, 25, 0, 1, 4)
        self.pushButton_4 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_4.setObjectName("pushButton_4")
        self.gridLayout.addWidget(self.pushButton_4, 24, 0, 1, 4)
        self.lineEdit_8 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_8.setObjectName("lineEdit_8")
        self.gridLayout.addWidget(self.lineEdit_8, 12, 1, 1, 1)
        self.label_2 = QtWidgets.QLabel(self.centralwidget)
        self.label_2.setAlignment(QtCore.Qt.AlignCenter)
        self.label_2.setObjectName("label_2")
        self.gridLayout.addWidget(self.label_2, 4, 1, 1, 1)
        self.label_3 = QtWidgets.QLabel(self.centralwidget)
        self.label_3.setAlignment(QtCore.Qt.AlignCenter)
        self.label_3.setObjectName("label_3")
        self.gridLayout.addWidget(self.label_3, 8, 1, 1, 1)
        self.label_7 = QtWidgets.QLabel(self.centralwidget)
        self.label_7.setObjectName("label_7")
        self.gridLayout.addWidget(self.label_7, 5, 0, 1, 1)
        self.label_20 = QtWidgets.QLabel(self.centralwidget)
        self.label_20.setObjectName("label_20")
        self.gridLayout.addWidget(self.label_20, 16, 0, 1, 4)
        self.verticalLayout_2 = QtWidgets.QVBoxLayout()
        self.verticalLayout_2.setObjectName("verticalLayout_2")
        self.label_19 = QtWidgets.QLabel(self.centralwidget)
        self.label_19.setObjectName("label_19")
        self.verticalLayout_2.addWidget(self.label_19)
        self.gridLayout.addLayout(self.verticalLayout_2, 18, 0, 1, 1)
        self.pushButton_7 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_7.setObjectName("pushButton_7")
        self.gridLayout.addWidget(self.pushButton_7, 14, 3, 1, 1)
        self.lineEdit_3 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_3.setObjectName("lineEdit_3")
        self.gridLayout.addWidget(self.lineEdit_3, 5, 1, 1, 1)
        self.toolButton_4 = QtWidgets.QToolButton(self.centralwidget)
        self.toolButton_4.setObjectName("toolButton_4")
        self.gridLayout.addWidget(self.toolButton_4, 6, 2, 1, 1)
        self.label_10 = QtWidgets.QLabel(self.centralwidget)
        self.label_10.setObjectName("label_10")
        self.gridLayout.addWidget(self.label_10, 10, 0, 1, 1)
        self.lineEdit_7 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_7.setObjectName("lineEdit_7")
        self.gridLayout.addWidget(self.lineEdit_7, 11, 1, 1, 1)
        self.label_11 = QtWidgets.QLabel(self.centralwidget)
        self.label_11.setObjectName("label_11")
        self.gridLayout.addWidget(self.label_11, 11, 0, 1, 1)
        self.lineEdit_4 = QtWidgets.QLineEdit(self.centralwidget)
        self.lineEdit_4.setObjectName("lineEdit_4")
        self.gridLayout.addWidget(self.lineEdit_4, 6, 1, 1, 1)
        self.pushButton_6 = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton_6.setObjectName("pushButton_6")
        self.gridLayout.addWidget(self.pushButton_6, 18, 3, 1, 1)
        self.label_12 = QtWidgets.QLabel(self.centralwidget)
        self.label_12.setObjectName("label_12")
        self.gridLayout.addWidget(self.label_12, 12, 0, 1, 1)
        self.label_21 = QtWidgets.QLabel(self.centralwidget)
        self.label_21.setObjectName("label_21")
        self.gridLayout.addWidget(self.label_21, 14, 0, 1, 1)
        self.pushButton = QtWidgets.QPushButton(self.centralwidget)
        self.pushButton.setObjectName("pushButton")
        self.gridLayout.addWidget(self.pushButton, 2, 3, 1, 1)
        self.label_13 = QtWidgets.QLabel(self.centralwidget)
        self.label_13.setAlignment(QtCore.Qt.AlignCenter)
        self.label_13.setObjectName("label_13")
        self.gridLayout.addWidget(self.label_13, 11, 3, 1, 1)
        self.doubleSpinBox = QtWidgets.QDoubleSpinBox(self.centralwidget)
        self.doubleSpinBox.setObjectName("doubleSpinBox")
        self.gridLayout.addWidget(self.doubleSpinBox, 12, 3, 1, 1)
        self.verticalLayout_3.addLayout(self.gridLayout)
        MainWindow.setCentralWidget(self.centralwidget)
        self.menubar = QtWidgets.QMenuBar(MainWindow)
        self.menubar.setGeometry(QtCore.QRect(0, 0, 685, 22))
        self.menubar.setObjectName("menubar")
        MainWindow.setMenuBar(self.menubar)
        self.statusbar = QtWidgets.QStatusBar(MainWindow)
        self.statusbar.setObjectName("statusbar")
        MainWindow.setStatusBar(self.statusbar)

        self.retranslateUi(MainWindow)
        QtCore.QMetaObject.connectSlotsByName(MainWindow)
def retranslateUi(self, MainWindow):
_translate = QtCore.QCoreApplication.translate
MainWindow.setWindowTitle(_translate("MainWindow", "MainWindow"))
self.toolButton_6.setText(_translate("MainWindow", "..."))
self.toolButton_2.setText(_translate("MainWindow", "..."))
self.label_4.setText(_translate("MainWindow", "QDPT result processing"))
self.toolButton.setText(_translate("MainWindow", "..."))
self.pushButton_5.setText(_translate("MainWindow", "Instruction"))
self.label_8.setText(_translate("MainWindow", "New method file"))
self.label_22.setText(_translate("MainWindow", "dimer/monomer"))
self.label_24.setText(_translate("MainWindow", "Minimum atoms in one molecule"))
self.label_23.setText(_translate("MainWindow", "Maximum contact length"))
self.label_5.setText(_translate("MainWindow", "Main file"))
self.comboBox.setItemText(0, _translate("MainWindow", "Dimer"))
self.comboBox.setItemText(1, _translate("MainWindow", "Monomer"))
self.toolButton_5.setText(_translate("MainWindow", "..."))
self.pushButton_2.setText(_translate("MainWindow", "Method Changer"))
self.toolButton_7.setText(_translate("MainWindow", "..."))
self.label_18.setText(_translate("MainWindow", "Molecular Delimiter"))
self.toolButton_10.setText(_translate("MainWindow", "..."))
self.toolButton_3.setText(_translate("MainWindow", "..."))
self.toolButton_8.setText(_translate("MainWindow", "..."))
self.label_15.setText(_translate("MainWindow", "-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------"))
self.label_17.setText(_translate("MainWindow", "--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------"))
self.label_16.setText(_translate("MainWindow", "--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------"))
self.label.setText(_translate("MainWindow", "VEC Changer"))
self.label_6.setText(_translate("MainWindow", "VEC file"))
self.label_9.setText(_translate("MainWindow", "Settings"))
self.pushButton_3.setText(_translate("MainWindow", "Files generator"))
self.label_14.setText(_translate("MainWindow", "--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------"))
self.pushButton_4.setText(_translate("MainWindow", "Plot Editor"))
self.label_2.setText(_translate("MainWindow", "Method Changer"))
self.label_3.setText(_translate("MainWindow", "Inp files generator"))
self.label_7.setText(_translate("MainWindow", "Directory with files to change"))
self.label_20.setText(_translate("MainWindow", "--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------"))
self.label_19.setText(_translate("MainWindow", "Pack file"))
self.pushButton_7.setText(_translate("MainWindow", "Open Folder"))
self.toolButton_4.setText(_translate("MainWindow", "..."))
self.label_10.setText(_translate("MainWindow", "Geometry 1"))
self.label_11.setText(_translate("MainWindow", "Geometry 2"))
self.pushButton_6.setText(_translate("MainWindow", "Start"))
self.label_12.setText(_translate("MainWindow", "Directory for generated files"))
self.label_21.setText(_translate("MainWindow", "inp filenames mask"))
self.pushButton.setText(_translate("MainWindow", "VEC Changer"))
self.label_13.setText(_translate("MainWindow", "Step"))
ba33640b845fe1710edd1263fa90d2214da751b9 | 9,719 | py | Python | main.py | steviezhang/COG403-Project | 4f7ba3e6ab760c219a4722c21350711ab072f82f | ["Apache-1.1"]
import numpy as np
import re
from nltk import Tree
from nltk import induce_pcfg
from nltk import Nonterminal
from nltk.parse.generate import generate
epsilon = 1e-20
class corpus:
# stores all sentence forms in data
def __init__(self):
self.sentence_forms = {}
for i in range(6): # init six levels
self.sentence_forms[i + 1] = {}
self.corp = []
    def sort_sentence_types(self, types):
        # A sentence form with frequency >= thresholds[level - 1] belongs to
        # that level and to every level above it; level 6 keeps every form.
        # Newlines are stripped because they should not count as terminals.
        thresholds = [500, 300, 100, 50, 10, 0]
        for t in types:
            freq = types[t]
            key = t.rstrip("\n")
            for level, threshold in enumerate(thresholds, start=1):
                if freq >= threshold:
                    self.sentence_forms[level][key] = freq
FREE = "Free"
PRG = "Regular"
PCFG = "Context Free"
def geometric(n, p):
return p * np.power(1.0 - p, n - 1, dtype=np.float64)
def compute_prior(G, corpus, n, level, flag=False): # flag for NLTK
# P : number of productions for grammar G
# n: number of non terminals for grammar G
# V: Vocabulary size = # num non terminals + # num terminals = len(corpus[level])
productions = None
if flag:
productions = corpus
else:
productions = G
P = len(productions)
V = None
if flag:
V = len(corpus)
else:
V = len(corpus.sentence_forms[level])
prob_P = np.log(geometric(P, 0.5)+epsilon)
prob_n = np.log(geometric(n, 0.5)+epsilon)
log_prior = prob_P + prob_n
for i in range(P):
if flag:
N_i = len(productions[i])
else:
            N_i = len(list(productions.keys())[i])  # num symbols for production i
        prob_N_i = np.log(geometric(N_i, 0.5) + epsilon)  # log-space, matching prob_P and prob_n
        log_prior -= (N_i * np.log(V))
        log_prior += prob_N_i
return log_prior
def compute_log_likelihood(corpus, G, T, level, flag=False):
# k: number of unique sentence types in corpus
log_likelihood = 0
D = None
k = None
if flag:
k = len(corpus)
D = corpus
else:
D = corpus.corp # sentence forms at specified level in corpus
k = len(D) # get num diff sentence forms at given level
productions = G
    for i in range(k):
        if flag:
            # the NLTK variant already returns a log-probability,
            # so add it directly instead of taking log() a second time
            log_likelihood += compute_sentence_likelihood_nltk(productions, D[:50])
        else:
            sentence_i = D[i].split(" ")
            sl = compute_sentence_likelihood(sentence_i, productions)
            if sl != 0:
                log_likelihood += np.log(sl)
    return log_likelihood
def compute_sentence_likelihood(S_i, productions):
# sum of probability of generating S_i under all possible parses
# productions = "S -> U" # example
prob = 0
prods = list(productions.keys())
for p in prods:
        p_split = p.split("->") # change based on how the prod symbols are separated
s1 = p_split[0]
s2 = p_split[1] # should be only two prod symbols per production
for i, token in enumerate(S_i[:-1]):
if s1 == token and s2 == S_i[i + 1]:
prob += productions[p]
return prob
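`compute_sentence_likelihood` sums the weights of every production whose two symbols match an adjacent token pair in the sentence. A standalone toy run of the same matching logic (the production strings and weights below are made up for illustration, not taken from the Adam corpus):

```python
def sentence_likelihood(tokens, productions):
    # Sum the weight of every "s1->s2" production that matches an
    # adjacent pair of tokens in the sentence.
    prob = 0
    for p, weight in productions.items():
        s1, s2 = p.split("->")
        for i, token in enumerate(tokens[:-1]):
            if s1 == token and s2 == tokens[i + 1]:
                prob += weight
    return prob

toy_productions = {"pro->v": 0.5, "v->det": 0.25}
print(sentence_likelihood(["pro", "v", "det"], toy_productions))  # 0.75
```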
def compute_sentence_likelihood_nltk(G, productions):
prob = 0
prods = list(G.keys())
S_i = productions
for p in prods:
p_split = p.split(" -> ")
s1 = p_split[0]
s2 = p_split[1]
for i, token in enumerate(S_i[:-1]):
if s1 == token and s2 == S_i[i + 1]:
prob += np.log(G[p])
return prob
def compute_log_posterior(log_prior, log_likelihood):
return log_prior + log_likelihood + np.log((1.0 / 3.0))
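The prior above penalizes grammar size through geometric distributions over the number of productions, the number of non-terminals, and the symbols per production. A quick standalone check of the geometric mass function (pure Python here, mirroring the numpy version above, with p = 0.5 as in `compute_prior`):

```python
def geometric(n, p):
    # P(N = n) for a geometric distribution over n = 1, 2, 3, ...
    return p * (1.0 - p) ** (n - 1)

# With p = 0.5 the mass halves at each step, so larger grammars
# (more productions, more non-terminals) receive a lower prior.
probs = [geometric(n, 0.5) for n in range(1, 5)]
print(probs)  # [0.5, 0.25, 0.125, 0.0625]
```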
def test_functions(adam_levelk, k):
terminal_pattern = "[.?!]"
levelk_terminal = 0
for j in adam_levelk.keys():
terminal = re.search(terminal_pattern, j)
if terminal:
levelk_terminal += 1
    # turn grammar into probabilities
total = sum(adam_levelk.values())
adam_levelk_probabilities = {}
for j in adam_levelk.keys():
adam_levelk_probabilities[j] = adam_levelk[j]/total
levelk_nonterminal = (len(adam_levelk) - levelk_terminal)
prior = compute_prior(adam_levelk_probabilities, data, levelk_nonterminal, k)
likelihood = compute_log_likelihood(data, adam_levelk_probabilities, PCFG, k)
logpost = compute_log_posterior(prior, likelihood)
return prior, likelihood, logpost
import os
directory = "Adam/"
people = ["*MOT", "*URS", "*RIC", "*COL", "*CHI"]
def read_and_return(directory):
speakers = {}
struct = {}
append_next = False
for file_path in os.listdir(directory):
with open("Adam/" + file_path, "r") as f:
speakers[file_path] = []
struct[file_path] = []
for line in f:
split = line.split(" ")
if append_next and split[0][:4] == "%mor":
content = split[0].split("\t")[-1]
struct[file_path].append(content.split(" "))
elif split[0][:4] in people[:-1]:
speakers[file_path].append(split)
append_next = True
else:
append_next = False
return speakers, struct
def loadTrees(path):
    with open(path, 'r') as f:
data = f.read().split("\n\n")
flattened_data = []
for i in range(len(data)):
#flatten it and strip extra whitespace
flattened_data.append(" ".join(data[i].replace("\n", "").split()))
tree = []
for i, s in enumerate(flattened_data[:-2]):
if "R" in s:
tree.append(Tree.fromstring(s))
return tree
def productionsFromTrees(trees):
productions = []
for tree in trees:
productions += tree.productions()
return productions
def inducePCFGFromProductions(productions):
S = Nonterminal('S')
grammar = induce_pcfg(S, productions)
return grammar
if __name__ == "__main__":
    speakers, struct = read_and_return(directory) # this function was used before Perfors sent his data
corp = []
types = {}
for fp in struct:
for segments in struct[fp]:
t = ""
for s in segments[:-1]:
token = s.split("|")[0].split(":")[0]
if ("#" in token):
token = token.split("#")[1]
t += token + " "
corp.append(t[:-1])
splitter = t.split(" ")[:-1]
for i in range(len(splitter)):
if (i < (len(splitter) - 1)):
tok = splitter[i] + "->" + splitter[i+1]
if tok in types:
types[tok] += 1
else:
types[tok] = 1
data = corpus()
data.sort_sentence_types(types)
data.corp = corp
adam_level1 = data.sentence_forms[1]
adam_level2 = data.sentence_forms[2]
adam_level3 = data.sentence_forms[3]
adam_level4 = data.sentence_forms[4]
adam_level5 = data.sentence_forms[5]
adam_level6 = data.sentence_forms[6]
print("FREQUENCY WEIGHTED CFG")
for i in range(6):
print("----------------")
print("LEVEL " + str(i+1))
prior, likelihood, logpost = test_functions(data.sentence_forms[i+1], i+1)
print("Log Prior: " + str(prior))
print("Log Likelihood: " + str(likelihood))
print("Log Posterior: " + str(logpost))
trees = loadTrees("Parsetree/brown-adam.parsed")
productions = productionsFromTrees(trees)
nltkgrammar = inducePCFGFromProductions(productions)
grammarToParse = str(nltkgrammar).split("\n")
finalGrammar = []
grammarDict = {}
for g in grammarToParse:
finalGrammar.append(g[4:])
for fg in finalGrammar[1:]:
gg = fg.split("[")
rule = gg[0][:-1]
value = gg[1][:-1]
grammarDict[rule] = float(value)
terminal_pattern = "[.?!]"
terminal_sum = 0
for j in grammarDict.keys():
terminal = re.search(terminal_pattern, j)
if terminal:
terminal_sum += 1
    print("PROBABILISTIC PCFG")
prior = compute_prior(grammarDict, productions, terminal_sum, 0, True)
print("Log Prior: " + str(prior))
likelihood = compute_log_likelihood(productions, grammarDict, PCFG, 0, True)
print("Log Likelihood: " + str(likelihood))
logpost = compute_log_posterior(prior, likelihood)
print("Log Posterior: " + str(logpost))
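The `__main__` loop above reduces each `%mor`-style token to a bare part-of-speech category before counting adjacent-category bigrams. The same reduction on a hand-made token list (illustrative only, not from the Adam files):

```python
# Hand-made %mor-style tokens: "category|word", with ":" subcategories
# and "#" prefix markers, plus a trailing punctuation mark.
segments = "pro:sub|you un#v|lock n|door .".split(" ")

categories = []
for s in segments[:-1]:
    token = s.split("|")[0].split(":")[0]   # keep only the category tag
    if "#" in token:
        token = token.split("#")[1]         # drop prefix markers like "un#"
    categories.append(token)

print(categories)  # ['pro', 'v', 'n']
```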
ba37657616231320b90e3ca5582011ae51d8375d | 2,877 | py | Python | Editor/pini/ATL.py | RangHo/pini-engine | e1407724de32a433b7b46e0ee2469240b70d960b | ["MIT"]
# -*- coding: utf-8 -*-
import sys
reload(sys)
sys.setdefaultencoding("utf-8")
import math
M_PI_2 = math.pi/2.0
M_PI = math.pi
def interpolation(t,o1,o2):
return round(( 1 - t )*o1 + t*o2,2)
def linear(time) :
return time
def sineEaseIn(time):
return -1 * math.cos(time * M_PI_2) + 1
def sineEaseOut(time) :
return math.sin(time * M_PI_2)
def sineEaseInOut(time) :
return -0.5 * (math.cos(M_PI * time) - 1)
def EaseImmediately(time):
if time >= 1 :
return 1
return 0
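The easing functions map a normalized time t in [0, 1] onto an eased progress value, which `interpolation` then blends between two keyframe values. A minimal self-contained sketch of that pipeline:

```python
import math

def interpolation(t, o1, o2):
    # Linear blend between o1 and o2, rounded to 2 decimals as above.
    return round((1 - t) * o1 + t * o2, 2)

def sine_ease_in_out(t):
    return -0.5 * (math.cos(math.pi * t) - 1)

# Ease the time parameter first, then interpolate the animated value.
eased = sine_ease_in_out(0.25)
print(interpolation(eased, 0.0, 100.0))  # 14.64
```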
#########################################
## types
line_Interval = [
    u"위치X",  # position X
    u"위치Y",  # position Y
    u"크기X",  # scale X
    u"크기Y",  # scale Y
    u"회전",   # rotation
    u"색상R",  # color R
    u"색상G",  # color G
    u"색상B",  # color B
    u"색상A",  # color A
]
line_Interval_Default = [
0,
0,
1,
1,
0,
255,
255,
255,
255
]
line_Instant = [
    u"매크로",  # macro
    u"루아",    # Lua
    u"이미지",  # image
]
line_type = line_Interval + line_Instant
############################################
#### ease type!
line_ease = [
    u"기본",        # default (linear)
    u"사인인",      # sine ease-in
    u"사인아웃",    # sine ease-out
    u"사인인아웃",  # sine ease-in-out
    u"즉시",        # immediate
]
##############################################
#### set type
line_increment = [
    u"증가",  # increment
    u"변경",  # set
]
from ctypes import *
atl = cdll.LoadLibrary("ATL.so")
atl.getNumberVal.restype = c_float
atl.getNumberSetVal.restype = c_float
atl.getNumberSet.restype = c_float
atl.isValue.restype = c_bool
atl.getFrame.restype = c_int
atl.getMaxFrame.restype = c_int
atl.getMarkedFrames.restype = c_char_p
atl.getStringVal.restype = c_char_p
atl.registAnimation.argtypes = [c_char_p]
atl.getMarkedFrames.argtypes = [c_char_p,c_int]
atl.getFrame.argtypes = [c_char_p,c_int,c_int,c_char_p,c_char_p]
atl.getMaxFrame.argtypes = [c_char_p,c_int]
atl.isExists.argtypes = [c_char_p]
atl.numNode.argtypes = [c_char_p]
atl.registStringValue.argtypes = [c_char_p,c_char_p,c_char_p]
atl.registNumberValue.argtypes = [c_char_p,c_char_p,c_float]
atl.deleteNodeValue.argtypes = [c_char_p]
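Declaring `restype`/`argtypes` up front, as done for `ATL.so` above, is what lets ctypes convert Python values correctly at the call boundary. The same pattern shown against the C standard library's `abs` (since `ATL.so` itself is not available outside the engine):

```python
import ctypes
import ctypes.util

# Load libc and declare the signature before calling, exactly as with atl.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.abs.restype = ctypes.c_int
libc.abs.argtypes = [ctypes.c_int]

print(libc.abs(-7))  # 7
```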
def FAL_REGIST(json):
atl.registAnimation(json)
def FAL_GETFRAME(idx,node,frame,nodeName,_hash):
return atl.getFrame(idx,node,frame,nodeName,_hash)
def FAL_GETVALUE(frame,key):
return atl.getNumberVal(frame,key), atl.getNumberSetVal(frame,key), atl.getNumberSet(frame, key)
def FAL_GETSTRVALUE(frame,key):
return atl.getStringVal(frame,key).decode("mbcs")
def FAL_ISVALUE(frame,key):
return atl.isValue(frame,key)
def FAL_DELETEFRAME(frame):
atl.deleteFrame(frame)
def FAL_MAXFRAME(idx,node):
return atl.getMaxFrame(idx,node)
def FAL_MARKEDFRAMES(idx,node):
frames = atl.getMarkedFrames(idx,node)
frames = frames.split(",")
if len(frames) > 0 :
frames = frames[0:-1]
frames = [int(v) for v in frames]
return frames
def FAL_ISEXISTS(idx):
return atl.isExists(idx)
def FAL_NUMNODE(idx):
return atl.numNode(idx)
def FAL_REGISTSTRINGVALUE(node,idx,value):
atl.registStringValue(node,idx,value)
def FAL_REGISTNUMBERVALUE(node,idx,value):
atl.registNumberValue(node,idx,value)
def FAL_DELETENODEVALUE(node):
atl.deleteNodeValue(node)
def FAL_CLEARFRAME():
atl.clearFrame()
ba3949b07159d8a545e8c858e6970f71b387618c | 554 | py | Python | setup.py | stweil/tensorflow_gpu_to_tensorflow | f0e7a644aca970e98fa5b1ae1e46fd15e2aedc28 | ["Apache-2.0"] | stars: 2
# -*- coding: utf-8 -*-
"""
Dummy Python package for tensorflow-gpu
"""
import codecs
import setuptools
setuptools.setup(
name='tensorflow_gpu',
version='1.15.2',
description='dummy for tensorflow-gpu',
long_description=codecs.open('README.md', encoding='utf-8').read(),
long_description_content_type='text/markdown',
author='Stefan Weil',
author_email='sw@weilnetz.de',
url='https://github.com/stweil/tensorflow_gpu_to_tensorflow',
license='Apache License 2.0',
install_requires=[
'tensorflow'
],
)
ba3950293138df1b235e10dc0343e10a2d0752b1 | 439 | py | Python | ohjelma/migrations/0008_track_track_danceability.py | katrii/ohsiha | 257e15071e3887a3dd87bf598cdd02db6e1e46fe | ["MIT"]
# Generated by Django 3.0.2 on 2020-04-11 18:57
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('ohjelma', '0007_track_track_id'),
]
operations = [
migrations.AddField(
model_name='track',
name='track_danceability',
field=models.FloatField(default=0, max_length=10),
preserve_default=False,
),
]
ba3974b9d90d9c8e422ca27f10585f44f60e8f91 | 786 | py | Python | gogolook/models/task.py | chenjr0719/Gogolook-Exercise | 864c4e6c06d42858f465e1986e931832594859e0 | ["MIT"] | issues: 13
from enum import IntEnum
from typing import Optional
from pydantic import Field
from sqlalchemy import Column, Enum, String
from gogolook.models import Base, BaseSchema
class TaskStatus(IntEnum):
Incomplete = 0
Complete = 1
class Task(Base):
name = Column(String(length=100))
status = Column(Enum(TaskStatus), default=TaskStatus.Incomplete)
class TaskSchema(BaseSchema):
id: int = Field(description="The id of Task")
name: str = Field(description="The name of Task")
status: TaskStatus = Field(
description="The status of Task", default=TaskStatus.Incomplete
)
class TaskUpdateSchema(TaskSchema):
name: Optional[str] = Field(description="The name of Task")
status: Optional[TaskStatus] = Field(description="The status of Task")
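The same `TaskStatus` enum backs both the SQLAlchemy column and the pydantic schema above; that works because `IntEnum` members behave as plain ints. A stdlib-only check of that behavior:

```python
from enum import IntEnum

class TaskStatus(IntEnum):
    Incomplete = 0
    Complete = 1

# IntEnum members compare equal to ints and round-trip from raw values,
# so 0/1 stored in the database map cleanly back to enum members.
print(TaskStatus.Complete == 1)  # True
print(TaskStatus(0).name)        # Incomplete
```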
ba3cc577432550c07729032ff5a895ea35ca90bf | 899 | py | Python | astroutils/writer_module.py | nithyanandan/AstroUtils | 97473f52d4247bb9c8507598899215d0662e8d6f | ["MIT"] | stars: 1 | issues: 5 | forks: 1
from blessings import Terminal
term = Terminal()
class Writer(object):
"""
---------------------------------------------------------------------------
Create an object with a write method that writes to a
specific place on the screen, defined at instantiation.
This is the glue between blessings and progressbar.
---------------------------------------------------------------------------
"""
def __init__(self, location):
"""
-----------------------------------------------------------------------
Input: location - tuple of ints (x, y), the position
of the bar in the terminal
-----------------------------------------------------------------------
"""
self.location = location
def write(self, string):
with term.location(*self.location):
print(string)
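Under the hood, `term.location` emits an ANSI cursor-positioning escape sequence before the text. A rough stdlib-only sketch of the same idea, writing into a buffer so the escape sequence stays visible (`make_writer` is a hypothetical name, not part of the module above):

```python
import io

def make_writer(location, stream):
    x, y = location
    def write(text):
        # CSI "row;colH" moves the cursor (1-based), like term.location(x, y)
        stream.write("\x1b[{};{}H{}".format(y + 1, x + 1, text))
    return write

buf = io.StringIO()
write = make_writer((0, 3), buf)
write("progress: 50%")
print(repr(buf.getvalue()))  # '\x1b[4;1Hprogress: 50%'
```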
ba3e6f882956569f961b1c785b3e7a17c7f9793d | 576 | py | Python | 1201-1300/1215-Magical String/1215-Magical String.py | jiadaizhao/LintCode | a8aecc65c47a944e9debad1971a7bc6b8776e48b | ["MIT"] | stars: 77 | issues: 1 | forks: 39
class Solution:
"""
@param n: an integer
@return: the number of '1's in the first N number in the magical string S
"""
def magicalString(self, n):
# write your code here
if n == 0:
return 0
seed = list('122')
count = 1
i = 2
        while len(seed) < n:
            num = int(seed[i])
            if seed[-1] == '1':
                seed += ['2'] * num
            else:
                seed += ['1'] * num
                # count only the '1's that fall within the first n characters
                count += (num if len(seed) <= n else 1)
            i += 1
        return count
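A brute-force cross-check: build the magical string directly from its self-describing run lengths and count the '1's (small n only):

```python
def magical_prefix(n):
    # s describes its own run lengths: s[i] is the length of run i,
    # and runs alternate between 1s and 2s.
    s = [1, 2, 2]
    i = 2
    while len(s) < n:
        nxt = 2 if s[-1] == 1 else 1
        s += [nxt] * s[i]
        i += 1
    return s[:n]

print(magical_prefix(6))           # [1, 2, 2, 1, 1, 2]
print(magical_prefix(6).count(1))  # 3
```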
ba3f133ecd9fb728e46e4fc79ada11e7484148e2 | 3,897 | py | Python | minibench/benchmark.py | noirbizarre/minibench | a1ac66dc075181c62bb3c0d3a26beb5c46d5f4ab | ["MIT"] | stars: 4
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
import time
import sys
from collections import namedtuple
from .utils import humanize
DEFAULT_TIMES = 5
if sys.platform == "win32":
# On Windows, the best timer is time.clock()
timer = time.clock
else:
# On most other platforms the best timer is time.time()
timer = time.time
#: Store a single method execution result
RunResult = namedtuple('RunResult', ('duration', 'success', 'result'))
class Result(object):
''' Store an aggregated result for a single method'''
def __init__(self):
self.total = 0
self.has_success = False
self.has_errors = False
self.error = None
class Benchmark(object):
'''Base class for all benchmark suites'''
times = DEFAULT_TIMES
def __init__(self, times=None, prefix="bench_", debug=False,
before=None, before_each=None,
after=None, after_each=None,
**kwargs):
self.times = times or self.times
self.results = {}
self.debug = debug
self._prefix = prefix
self._before = before or self._noop
self._before_each = before_each or self._noop
self._after = after or self._noop
self._after_each = after_each or self._noop
@property
def label(self):
'''A human readable label'''
if self.__doc__ and self.__doc__.strip():
return self.__doc__.strip().splitlines()[0]
return humanize(self.__class__.__name__)
def label_for(self, name):
'''Get a human readable label for a method given its name'''
method = getattr(self, name)
if method.__doc__ and method.__doc__.strip():
return method.__doc__.strip().splitlines()[0]
return humanize(name.replace(self._prefix, ''))
def _noop(self, *args, **kwargs):
pass
def before_class(self):
'''Hook called before each class'''
pass
def before(self):
'''Hook called once before each method'''
pass
def before_each(self):
'''Hook called before each method'''
pass
def after_each(self):
'''Hook called after each method once'''
pass
def after(self):
'''Hook called once after each method'''
pass
def after_class(self):
'''Hook called after each class'''
pass
def _collect(self):
return [test for test in dir(self) if test.startswith(self._prefix)]
def _run_one(self, func):
self.before_each()
tick = timer()
success = True
try:
result = func()
except Exception as e:
success = False
result = e
duration = timer() - tick
self.after_each()
return RunResult(duration, success, result)
def run(self):
'''
Collect all tests to run and run them.
Each method will be run :attr:`Benchmark.times`.
'''
tests = self._collect()
        if not tests:
            return
self.before_class()
for test in tests:
func = getattr(self, test)
results = self.results[test] = Result()
self._before(self, test)
self.before()
for i in range(self.times):
self._before_each(self, test, i)
result = self._run_one(func)
results.total += result.duration
if result.success:
results.has_success = True
else:
results.has_errors = True
self._after_each(self, test, i)
if self.debug and not result.success:
results.error = result.result
break
self.after()
self._after(self, test)
self.after_class()
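The core of `_run_one` — time a call with the chosen timer and never let an exception escape — can be sketched on its own. Note `time.clock` was removed in Python 3.8, so this sketch uses `perf_counter` instead of the platform switch above:

```python
import time
from collections import namedtuple

RunResult = namedtuple('RunResult', ('duration', 'success', 'result'))

def run_one(func):
    # Time a single call; an exception is captured as the result.
    tick = time.perf_counter()
    try:
        success, result = True, func()
    except Exception as e:
        success, result = False, e
    return RunResult(time.perf_counter() - tick, success, result)

r = run_one(lambda: sum(range(1000)))
print(r.success, r.result)  # True 499500
```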
ba5f8c54b58e2039ba5050a39da943245f582c52 | 568 | py | Python | conftest.py | d34dm8/chime | 1438245b3510b682345cd18f60150e975a54233d | ["MIT"] | stars: 149 | issues: 17 | forks: 8
import importlib
import pathlib
import tempfile
import _pytest.monkeypatch
import pytest
import chime
@pytest.fixture(scope='function', autouse=True)
def reload_chime():
importlib.reload(chime)
@pytest.fixture(scope='function', autouse=True)
def mock_pathlib_home(monkeypatch: _pytest.monkeypatch.MonkeyPatch):
with tempfile.TemporaryDirectory() as home_dir:
home_dir_path = pathlib.Path(home_dir)
monkeypatch.setattr(pathlib.Path, name='home', value=lambda: home_dir_path)
monkeypatch.setenv('APPDATA', value=str(home_dir_path))
ba60a6e988ca3407a19a72d7fe7fe9d52a6f5fc1 | 2,307 | py | Python | atropos/commands/error/reports.py | plijnzaad/atropos | dcc79d0aa89b5d2b87f02ee841525fa154898ae6 | ["CC0-1.0"] | issues: 1
"""Report generator for the error command.
TODO: move reporting functionality out of the ErrorEstimator class.
"""
from itertools import repeat
from atropos.commands.reports import BaseReportGenerator
from atropos.io import open_output
from atropos.commands.legacy_report import Printer, TitlePrinter
class ReportGenerator(BaseReportGenerator):
def generate_text_report(self, fmt, summary, outfile, **kwargs):
if fmt == 'txt':
with open_output(outfile, context_wrapper=True) as out:
generate_reports(out, summary)
else:
super().generate_from_template(fmt, summary, outfile, **kwargs)
def generate_reports(outstream, summary):
names = summary['input']['input_names'] or repeat(None)
estimates = summary['errorrate']['estimate']
_print = Printer(outstream)
_print_title = TitlePrinter(outstream)
input_idx = 0
for input_idx, (estimate, details, name) in enumerate(zip(
estimates, summary['errorrate']['details'], names), 1):
generate_estimator_report(
outstream, input_idx, estimate, details, _print, _print_title, name)
if input_idx > 1:
_print.newline()
_print_title("Overall", level=0)
total_lens = summary['errorrate']['total_len']
overall_err = (
sum(err * total_len for err, total_len in zip(estimates, total_lens)) /
sum(total_lens))
print("Error rate: {:.2%}".format(overall_err), file=outstream)
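The overall figure above is a length-weighted mean of the per-input error estimates. With toy numbers (illustrative, not real atropos output):

```python
# Input 1: 1% error over 1000 bases; input 2: 3% error over 3000 bases.
estimates = [0.01, 0.03]
total_lens = [1000, 3000]

overall_err = (
    sum(err * total_len for err, total_len in zip(estimates, total_lens)) /
    sum(total_lens))
print("Error rate: {:.2%}".format(overall_err))  # Error rate: 2.50%
```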
def generate_estimator_report(
outstream, input_idx, estimate, details, _print, _print_title,
input_name=None):
_print_indent = Printer(outstream, indent=' ')
_print.newline()
_print_title("Input {}".format(input_idx), level=0)
if input_name:
_print("File: {}".format(input_name))
_print("Error rate: {:.2%}".format(estimate))
if details:
_print("Details:\n")
per_read = details['per_read']
per_cycle = details['per_cycle']
_print_indent("StdErr: {:.2%}".format(per_read['standard error']))
_print_indent("Per-cycle rates:")
for cycle in per_cycle:
_print_indent(
"Cycle: {}, Error: {:.2%}, StdErr: {:.2%}".format(*cycle),
indent=2)
ba62fad903a85520d54d8e5c15dd90d984f9d167 | 4,071 | py | Python | pypy/objspace/std/iterobject.py | camillobruni/pygirl | ddbd442d53061d6ff4af831c1eab153bcc771b5a | ["MIT"] | stars: 12 | forks: 2
"""
Reviewed 03-06-22
Sequence-iteration is correctly implemented, thoroughly
tested, and complete. The only missing feature is support
for function-iteration.
"""
from pypy.objspace.std.objspace import *


class W_AbstractSeqIterObject(W_Object):
    from pypy.objspace.std.itertype import iter_typedef as typedef

    def __init__(w_self, w_seq, index=0):
        if index < 0:
            index = 0
        w_self.w_seq = w_seq
        w_self.index = index


class W_SeqIterObject(W_AbstractSeqIterObject):
    """Sequence iterator implementation for general sequences."""


class W_FastSeqIterObject(W_AbstractSeqIterObject):
    """Sequence iterator specialized for lists or tuples, accessing
    directly their RPython-level list of wrapped objects.
    """

    def __init__(w_self, w_seq, wrappeditems):
        W_AbstractSeqIterObject.__init__(w_self, w_seq)
        w_self.wrappeditems = wrappeditems


class W_ReverseSeqIterObject(W_Object):
    from pypy.objspace.std.itertype import reverse_iter_typedef as typedef

    def __init__(w_self, space, w_seq, index=-1):
        w_self.w_seq = w_seq
        w_self.w_len = space.len(w_seq)
        w_self.index = space.int_w(w_self.w_len) + index


registerimplementation(W_SeqIterObject)
registerimplementation(W_FastSeqIterObject)
registerimplementation(W_ReverseSeqIterObject)


def iter__SeqIter(space, w_seqiter):
    return w_seqiter

def next__SeqIter(space, w_seqiter):
    if w_seqiter.w_seq is None:
        raise OperationError(space.w_StopIteration, space.w_None)
    try:
        w_item = space.getitem(w_seqiter.w_seq, space.wrap(w_seqiter.index))
    except OperationError, e:
        w_seqiter.w_seq = None
        if not e.match(space, space.w_IndexError):
            raise
        raise OperationError(space.w_StopIteration, space.w_None)
    w_seqiter.index += 1
    return w_item

def len__SeqIter(space, w_seqiter):
    if w_seqiter.w_seq is None:
        return space.wrap(0)
    index = w_seqiter.index
    w_length = space.len(w_seqiter.w_seq)
    w_len = space.sub(w_length, space.wrap(index))
    if space.is_true(space.lt(w_len, space.wrap(0))):
        w_len = space.wrap(0)
    return w_len

def iter__FastSeqIter(space, w_seqiter):
    return w_seqiter

def next__FastSeqIter(space, w_seqiter):
    if w_seqiter.wrappeditems is None:
        raise OperationError(space.w_StopIteration, space.w_None)
    index = w_seqiter.index
    try:
        w_item = w_seqiter.wrappeditems[index]
    except IndexError:
        w_seqiter.wrappeditems = None
        w_seqiter.w_seq = None
        raise OperationError(space.w_StopIteration, space.w_None)
    w_seqiter.index = index + 1
    return w_item

def len__FastSeqIter(space, w_seqiter):
    if w_seqiter.wrappeditems is None:
        return space.wrap(0)
    totallength = len(w_seqiter.wrappeditems)
    remaining = totallength - w_seqiter.index
    if remaining < 0:
        remaining = 0
    return space.wrap(remaining)

def iter__ReverseSeqIter(space, w_seqiter):
    return w_seqiter

def next__ReverseSeqIter(space, w_seqiter):
    if w_seqiter.w_seq is None or w_seqiter.index < 0:
        raise OperationError(space.w_StopIteration, space.w_None)
    try:
        w_item = space.getitem(w_seqiter.w_seq, space.wrap(w_seqiter.index))
        w_seqiter.index -= 1
    except OperationError, e:
        w_seqiter.w_seq = None
        if not e.match(space, space.w_IndexError):
            raise
        raise OperationError(space.w_StopIteration, space.w_None)
    return w_item

def len__ReverseSeqIter(space, w_seqiter):
    if w_seqiter.w_seq is None:
        return space.wrap(0)
    index = w_seqiter.index + 1
    w_length = space.len(w_seqiter.w_seq)
    # if length of sequence is less than index, exhaust the iterator
    if space.is_true(space.gt(space.wrap(w_seqiter.index), w_length)):
        w_len = space.wrap(0)
        w_seqiter.w_seq = None
    else:
        w_len = space.wrap(index)
    if space.is_true(space.lt(w_len, space.wrap(0))):
        w_len = space.wrap(0)
    return w_len

register_all(vars())
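The reverse-iterator multimethods above (`next__ReverseSeqIter`, `len__ReverseSeqIter`) map directly onto CPython's ordinary iterator protocol: start at `len(seq) - 1`, walk the index down, and stop once it goes negative. A minimal pure-Python sketch of that same logic (illustrative only, not part of the PyPy source; class name is my own):

```python
class ReverseSeqIter:
    """Plain-Python analogue of W_ReverseSeqIterObject's index-walking logic."""

    def __init__(self, seq):
        self.seq = seq
        self.index = len(seq) - 1  # mirrors space.int_w(w_len) + (-1)

    def __iter__(self):
        return self

    def __next__(self):
        # Exhausted when the sequence was dropped or the index went negative.
        if self.seq is None or self.index < 0:
            raise StopIteration
        try:
            item = self.seq[self.index]
            self.index -= 1
        except IndexError:
            self.seq = None
            raise StopIteration
        return item

    def __length_hint__(self):
        # Remaining items, clamped at zero, mirroring len__ReverseSeqIter.
        return 0 if self.seq is None else max(self.index + 1, 0)
```

As in the RPython version, dropping the reference to the sequence (`self.seq = None`) is what marks the iterator exhausted, so repeated `next()` calls stay cheap.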
ba7117574bb20ecd7e041b405becc3a15e9db17c | 348 | py | Python | Homework/HW7.py | Javascript-void0/hxgv | 09d04a5bb4f74a476d652bfd8ae5ff56588593c4 | [
"MIT"
] | null | null | null | Homework/HW7.py | Javascript-void0/hxgv | 09d04a5bb4f74a476d652bfd8ae5ff56588593c4 | [
"MIT"
] | null | null | null | Homework/HW7.py | Javascript-void0/hxgv | 09d04a5bb4f74a476d652bfd8ae5ff56588593c4 | [
"MIT"
] | null | null | null | '''
num1 = 1
num2 = 1
num3 = num1 + num2
print(num3)
sum = num1 + num2 + num3
for i in range(1, 18, 1):
    num1 = num2
    num2 = num3
    num3 = num1 + num2
    if num3 % 2 == 0:
        print(num3)
'''
'''
for i in range(1, 101):
    if 100 % i == 0:
        print(i)
'''
num1 = int(input("Please input a number: "))
for i in range(1, num1 + 1):
    if num1 % i == 0:
        print(i)
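The final loop prints every divisor of the entered number. The same check packaged as a small reusable function (an illustrative rewrite, not part of the original homework file):

```python
def divisors(n):
    """Return all positive divisors of n, in increasing order."""
    return [i for i in range(1, n + 1) if n % i == 0]

# Same result as the homework loop for an input of 100.
hundred = divisors(100)
```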
ba7896e39b36a9f391aa37af0d8d529de9f57847 | 579 | py | Python | plugins/modules/files_attributes.py | manala/ansible-roles | 30dc7d0bcea10ac4b38c6ad85ad66dbd098131f4 | [
"MIT"
] | 138 | 2017-05-18T13:45:45.000Z | 2022-03-23T02:33:45.000Z | plugins/modules/files_attributes.py | manala/ansible-roles | 30dc7d0bcea10ac4b38c6ad85ad66dbd098131f4 | [
"MIT"
] | 159 | 2017-05-11T09:05:26.000Z | 2022-03-04T07:36:59.000Z | plugins/modules/files_attributes.py | manala/ansible-roles | 30dc7d0bcea10ac4b38c6ad85ad66dbd098131f4 | [
"MIT"
] | 35 | 2017-06-29T09:01:42.000Z | 2021-11-18T11:35:00.000Z | #!/usr/bin/python
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# This is a virtual module that is entirely implemented as an action plugin and runs on the controller
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = r'''
---
module: files_attributes
author: Manala (@manala)
short_description: Manage files attributes
description:
    - Manage files attributes
'''

EXAMPLES = r'''
- name: Touch file
  manala.roles.files_attributes:
    path: /tmp/touch
    state: touch
'''
ba79b2422cf56667368b3ef2dbac850bed5b6ef4 | 1,210 | py | Python | tests/ui/menus/test_opmenu.py | Hengle/Houdini-Toolbox | a1fd7d3dd73d3fc4cea78e29aeff1d190c41bae3 | [
"MIT"
] | 136 | 2015-01-03T04:03:23.000Z | 2022-02-07T11:08:57.000Z | tests/ui/menus/test_opmenu.py | Hengle/Houdini-Toolbox | a1fd7d3dd73d3fc4cea78e29aeff1d190c41bae3 | [
"MIT"
] | 11 | 2017-02-09T20:05:04.000Z | 2021-01-24T22:25:59.000Z | tests/ui/menus/test_opmenu.py | Hengle/Houdini-Toolbox | a1fd7d3dd73d3fc4cea78e29aeff1d190c41bae3 | [
"MIT"
] | 26 | 2015-08-18T12:11:02.000Z | 2020-12-19T01:53:31.000Z | """Tests for ht.ui.menus.opmenu module."""
# =============================================================================
# IMPORTS
# =============================================================================
# Houdini Toolbox
import ht.ui.menus.opmenu
# Houdini
import hou
# =============================================================================
# TESTS
# =============================================================================
def test_create_absolute_reference_copy(mocker):
    """Test ht.ui.menus.opmenu.create_absolute_reference_copy."""
    mock_node = mocker.MagicMock(spec=hou.Node)

    scriptargs = {"node": mock_node}

    ht.ui.menus.opmenu.create_absolute_reference_copy(scriptargs)

    mock_node.parent.return_value.copyItems.assert_called_with(
        [mock_node], channel_reference_originals=True, relative_references=False
    )


def test_save_item_to_file(mocker):
    """Test ht.ui.menus.opmenu.save_item_to_file."""
    mock_copy = mocker.patch("ht.ui.menus.opmenu.copy_item")
    mock_node = mocker.MagicMock(spec=hou.Node)

    scriptargs = {"node": mock_node}

    ht.ui.menus.opmenu.save_item_to_file(scriptargs)

    mock_copy.assert_called_with(mock_node)
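Both tests follow the same arrange-act-assert shape around `mocker`, pytest-mock's thin wrapper over the standard library. The pattern can be reproduced with `unittest.mock` alone (a standalone sketch; `save_item_to_file` here is a stand-in function, not the Toolbox implementation):

```python
from unittest import mock

def save_item_to_file(scriptargs, copy_item):
    """Stand-in for the function under test: hands the node to a collaborator."""
    copy_item(scriptargs["node"])

mock_copy = mock.MagicMock()
mock_node = mock.MagicMock()

save_item_to_file({"node": mock_node}, mock_copy)

# Verify the collaborator was called exactly once, with the node.
mock_copy.assert_called_once_with(mock_node)
```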
ba81f31f40fcfb95bb592d0946c0dd4a71c9a4b2 | 500 | py | Python | Curso_Em_Video_Python/ex013.py | ThallesTorres/Curso_Em_Video_Python | 95ffbff5a03f11fee41df746604dfe435f385a3b | [
"MIT"
] | null | null | null | Curso_Em_Video_Python/ex013.py | ThallesTorres/Curso_Em_Video_Python | 95ffbff5a03f11fee41df746604dfe435f385a3b | [
"MIT"
] | null | null | null | Curso_Em_Video_Python/ex013.py | ThallesTorres/Curso_Em_Video_Python | 95ffbff5a03f11fee41df746604dfe435f385a3b | [
"MIT"
] | null | null | null | print('''
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
--Welcome!
--Exercise 013
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
''')
salario = float(input('Enter the salary: R$'))
aumento = float(input('Enter the raise percentage: '))
total = salario * aumento / 100
input(f'Raise: R${total:.2f} \nRaise + Salary: R${total + salario:.2f} ')
print('''
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
--Thank you for using!
--Developed by Thalles Torres
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-''')
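The arithmetic above (raise = salary * percent / 100, then new salary = salary + raise) can be factored into a function for reuse (an illustrative refactor; the function name is my own):

```python
def apply_raise(salary, percent):
    """Return (raise_amount, new_salary) for a percentage raise."""
    raise_amount = salary * percent / 100
    return raise_amount, salary + raise_amount

# A 10% raise on R$1000.00.
amount, new_salary = apply_raise(1000.0, 10.0)
```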
ba83e623f6a24c365a0bdff2420df7abb9e275a9 | 54,639 | py | Python | PyLibFTDI/_ftd2xx64.py | paretech/PyLibFTDI | b0c3a1196e36a3820b950f1fd0cbd3a917365322 | [
"Unlicense"
] | 2 | 2019-03-04T16:10:28.000Z | 2020-01-29T22:36:53.000Z | PyLibFTDI/_ftd2xx64.py | paretech/PyLibFTDI | b0c3a1196e36a3820b950f1fd0cbd3a917365322 | [
"Unlicense"
] | null | null | null | PyLibFTDI/_ftd2xx64.py | paretech/PyLibFTDI | b0c3a1196e36a3820b950f1fd0cbd3a917365322 | [
"Unlicense"
] | null | null | null | # generated by 'clang2py'
# flags '-c -d -l ftd2xx64.dll ftd2xx.h -vvv -o _ftd2xx64.py'
# -*- coding: utf-8 -*-
#
# TARGET arch is: []
# WORD_SIZE is: 4
# POINTER_SIZE is: 8
# LONGDOUBLE_SIZE is: 8
#
import ctypes
# if local wordsize is same as target, keep ctypes pointer function.
if ctypes.sizeof(ctypes.c_void_p) == 8:
    POINTER_T = ctypes.POINTER
else:
    # required to access _ctypes
    import _ctypes
    # Emulate a pointer class using the appropriate c_int32/c_int64 type
    # The new class should have :
    # ['__module__', 'from_param', '_type_', '__dict__', '__weakref__', '__doc__']
    # but the class should be submitted to a unique instance for each base type
    # so that if A == B, POINTER_T(A) == POINTER_T(B)
    ctypes._pointer_t_type_cache = {}
    def POINTER_T(pointee):
        # a pointer should have the same length as LONG
        fake_ptr_base_type = ctypes.c_uint32
        # specific case for c_void_p
        if pointee is None: # VOID pointer type. c_void_p.
            pointee = type(None) # ctypes.c_void_p # ctypes.c_ulong
            clsname = 'c_void'
        else:
            clsname = pointee.__name__
        if clsname in ctypes._pointer_t_type_cache:
            return ctypes._pointer_t_type_cache[clsname]
        # make template
        class _T(_ctypes._SimpleCData,):
            _type_ = 'L'
            _subtype_ = pointee
            def _sub_addr_(self):
                return self.value
            def __repr__(self):
                return '%s(%d)'%(clsname, self.value)
            def contents(self):
                raise TypeError('This is not a ctypes pointer.')
            def __init__(self, **args):
                raise TypeError('This is not a ctypes pointer. It is not instanciable.')
        _class = type('LP_%d_%s'%(8, clsname), (_T,),{})
        ctypes._pointer_t_type_cache[clsname] = _class
        return _class
c_int128 = ctypes.c_ubyte*16
c_uint128 = c_int128
void = None
if ctypes.sizeof(ctypes.c_longdouble) == 8:
    c_long_double_t = ctypes.c_longdouble
else:
    c_long_double_t = ctypes.c_ubyte*8
_libraries = {}
_libraries['ftd2xx64.dll'] = ctypes.CDLL('ftd2xx64.dll')
PULONG = POINTER_T(ctypes.c_uint32)
PUCHAR = POINTER_T(ctypes.c_ubyte)
DWORD = ctypes.c_uint32
BOOL = ctypes.c_int32
WORD = ctypes.c_uint16
LPWORD = POINTER_T(ctypes.c_uint16)
LPLONG = POINTER_T(ctypes.c_int32)
LPDWORD = POINTER_T(ctypes.c_uint32)
LPVOID = POINTER_T(None)
ULONG = ctypes.c_uint32
UCHAR = ctypes.c_ubyte
USHORT = ctypes.c_uint16
class struct__SECURITY_ATTRIBUTES(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('nLength', ctypes.c_uint32),
        ('PADDING_0', ctypes.c_ubyte * 4),
        ('lpSecurityDescriptor', POINTER_T(None)),
        ('bInheritHandle', ctypes.c_int32),
        ('PADDING_1', ctypes.c_ubyte * 4),
    ]
LPSECURITY_ATTRIBUTES = POINTER_T(struct__SECURITY_ATTRIBUTES)
class struct__OVERLAPPED(ctypes.Structure):
    pass

class union__OVERLAPPED_0(ctypes.Union):
    pass

class struct__OVERLAPPED_0_0(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('Offset', ctypes.c_uint32),
        ('OffsetHigh', ctypes.c_uint32),
    ]

union__OVERLAPPED_0._pack_ = True # source:False
union__OVERLAPPED_0._fields_ = [
    ('_0', struct__OVERLAPPED_0_0),
    ('Pointer', POINTER_T(None)),
]
struct__OVERLAPPED._pack_ = True # source:False
struct__OVERLAPPED._fields_ = [
    ('Internal', ctypes.c_uint64),
    ('InternalHigh', ctypes.c_uint64),
    ('_2', union__OVERLAPPED_0),
    ('hEvent', POINTER_T(None)),
]
LPOVERLAPPED = POINTER_T(struct__OVERLAPPED)
PVOID = POINTER_T(None)
PCHAR = POINTER_T(ctypes.c_char)
LPCTSTR = POINTER_T(ctypes.c_char)
HANDLE = POINTER_T(None)
FT_HANDLE = POINTER_T(None)
FT_STATUS = ctypes.c_uint32
# values for enumeration 'c__Ea_FT_OK'
FT_OK = 0
FT_INVALID_HANDLE = 1
FT_DEVICE_NOT_FOUND = 2
FT_DEVICE_NOT_OPENED = 3
FT_IO_ERROR = 4
FT_INSUFFICIENT_RESOURCES = 5
FT_INVALID_PARAMETER = 6
FT_INVALID_BAUD_RATE = 7
FT_DEVICE_NOT_OPENED_FOR_ERASE = 8
FT_DEVICE_NOT_OPENED_FOR_WRITE = 9
FT_FAILED_TO_WRITE_DEVICE = 10
FT_EEPROM_READ_FAILED = 11
FT_EEPROM_WRITE_FAILED = 12
FT_EEPROM_ERASE_FAILED = 13
FT_EEPROM_NOT_PRESENT = 14
FT_EEPROM_NOT_PROGRAMMED = 15
FT_INVALID_ARGS = 16
FT_NOT_SUPPORTED = 17
FT_OTHER_ERROR = 18
FT_DEVICE_LIST_NOT_READY = 19
c__Ea_FT_OK = ctypes.c_int # enum
PFT_EVENT_HANDLER = POINTER_T(ctypes.CFUNCTYPE(None, ctypes.c_uint32, ctypes.c_uint32))
FT_DEVICE = ctypes.c_uint32
# values for enumeration 'c__Ea_FT_DEVICE_BM'
FT_DEVICE_BM = 0
FT_DEVICE_AM = 1
FT_DEVICE_100AX = 2
FT_DEVICE_UNKNOWN = 3
FT_DEVICE_2232C = 4
FT_DEVICE_232R = 5
FT_DEVICE_2232H = 6
FT_DEVICE_4232H = 7
FT_DEVICE_232H = 8
FT_DEVICE_X_SERIES = 9
FT_DEVICE_4222H_0 = 10
FT_DEVICE_4222H_1_2 = 11
FT_DEVICE_4222H_3 = 12
FT_DEVICE_4222_PROG = 13
FT_DEVICE_900 = 14
FT_DEVICE_930 = 15
FT_DEVICE_UMFTPD3A = 16
c__Ea_FT_DEVICE_BM = ctypes.c_int # enum
FT_Open = _libraries['ftd2xx64.dll'].FT_Open
FT_Open.restype = FT_STATUS
# FT_Open(deviceNumber, pHandle)
FT_Open.argtypes = [ctypes.c_int32, POINTER_T(POINTER_T(None))]
FT_Open.__doc__ = \
"""FT_STATUS FT_Open(c_int32 deviceNumber, LP_LP_None pHandle)
ftd2xx.h:334"""
FT_OpenEx = _libraries['ftd2xx64.dll'].FT_OpenEx
FT_OpenEx.restype = FT_STATUS
# FT_OpenEx(pArg1, Flags, pHandle)
FT_OpenEx.argtypes = [PVOID, DWORD, POINTER_T(POINTER_T(None))]
FT_OpenEx.__doc__ = \
"""FT_STATUS FT_OpenEx(PVOID pArg1, DWORD Flags, LP_LP_None pHandle)
ftd2xx.h:340"""
FT_ListDevices = _libraries['ftd2xx64.dll'].FT_ListDevices
FT_ListDevices.restype = FT_STATUS
# FT_ListDevices(pArg1, pArg2, Flags)
FT_ListDevices.argtypes = [PVOID, PVOID, DWORD]
FT_ListDevices.__doc__ = \
"""FT_STATUS FT_ListDevices(PVOID pArg1, PVOID pArg2, DWORD Flags)
ftd2xx.h:347"""
FT_Close = _libraries['ftd2xx64.dll'].FT_Close
FT_Close.restype = FT_STATUS
# FT_Close(ftHandle)
FT_Close.argtypes = [FT_HANDLE]
FT_Close.__doc__ = \
"""FT_STATUS FT_Close(FT_HANDLE ftHandle)
ftd2xx.h:354"""
FT_Read = _libraries['ftd2xx64.dll'].FT_Read
FT_Read.restype = FT_STATUS
# FT_Read(ftHandle, lpBuffer, dwBytesToRead, lpBytesReturned)
FT_Read.argtypes = [FT_HANDLE, LPVOID, DWORD, LPDWORD]
FT_Read.__doc__ = \
"""FT_STATUS FT_Read(FT_HANDLE ftHandle, LPVOID lpBuffer, DWORD dwBytesToRead, LPDWORD lpBytesReturned)
ftd2xx.h:359"""
FT_Write = _libraries['ftd2xx64.dll'].FT_Write
FT_Write.restype = FT_STATUS
# FT_Write(ftHandle, lpBuffer, dwBytesToWrite, lpBytesWritten)
FT_Write.argtypes = [FT_HANDLE, LPVOID, DWORD, LPDWORD]
FT_Write.__doc__ = \
"""FT_STATUS FT_Write(FT_HANDLE ftHandle, LPVOID lpBuffer, DWORD dwBytesToWrite, LPDWORD lpBytesWritten)
ftd2xx.h:367"""
FT_IoCtl = _libraries['ftd2xx64.dll'].FT_IoCtl
FT_IoCtl.restype = FT_STATUS
# FT_IoCtl(ftHandle, dwIoControlCode, lpInBuf, nInBufSize, lpOutBuf, nOutBufSize, lpBytesReturned, lpOverlapped)
FT_IoCtl.argtypes = [FT_HANDLE, DWORD, LPVOID, DWORD, LPVOID, DWORD, LPDWORD, LPOVERLAPPED]
FT_IoCtl.__doc__ = \
"""FT_STATUS FT_IoCtl(FT_HANDLE ftHandle, DWORD dwIoControlCode, LPVOID lpInBuf, DWORD nInBufSize, LPVOID lpOutBuf, DWORD nOutBufSize, LPDWORD lpBytesReturned, LPOVERLAPPED lpOverlapped)
ftd2xx.h:375"""
FT_SetBaudRate = _libraries['ftd2xx64.dll'].FT_SetBaudRate
FT_SetBaudRate.restype = FT_STATUS
# FT_SetBaudRate(ftHandle, BaudRate)
FT_SetBaudRate.argtypes = [FT_HANDLE, ULONG]
FT_SetBaudRate.__doc__ = \
"""FT_STATUS FT_SetBaudRate(FT_HANDLE ftHandle, ULONG BaudRate)
ftd2xx.h:387"""
FT_SetDivisor = _libraries['ftd2xx64.dll'].FT_SetDivisor
FT_SetDivisor.restype = FT_STATUS
# FT_SetDivisor(ftHandle, Divisor)
FT_SetDivisor.argtypes = [FT_HANDLE, USHORT]
FT_SetDivisor.__doc__ = \
"""FT_STATUS FT_SetDivisor(FT_HANDLE ftHandle, USHORT Divisor)
ftd2xx.h:393"""
FT_SetDataCharacteristics = _libraries['ftd2xx64.dll'].FT_SetDataCharacteristics
FT_SetDataCharacteristics.restype = FT_STATUS
# FT_SetDataCharacteristics(ftHandle, WordLength, StopBits, Parity)
FT_SetDataCharacteristics.argtypes = [FT_HANDLE, UCHAR, UCHAR, UCHAR]
FT_SetDataCharacteristics.__doc__ = \
"""FT_STATUS FT_SetDataCharacteristics(FT_HANDLE ftHandle, UCHAR WordLength, UCHAR StopBits, UCHAR Parity)
ftd2xx.h:399"""
FT_SetFlowControl = _libraries['ftd2xx64.dll'].FT_SetFlowControl
FT_SetFlowControl.restype = FT_STATUS
# FT_SetFlowControl(ftHandle, FlowControl, XonChar, XoffChar)
FT_SetFlowControl.argtypes = [FT_HANDLE, USHORT, UCHAR, UCHAR]
FT_SetFlowControl.__doc__ = \
"""FT_STATUS FT_SetFlowControl(FT_HANDLE ftHandle, USHORT FlowControl, UCHAR XonChar, UCHAR XoffChar)
ftd2xx.h:407"""
FT_ResetDevice = _libraries['ftd2xx64.dll'].FT_ResetDevice
FT_ResetDevice.restype = FT_STATUS
# FT_ResetDevice(ftHandle)
FT_ResetDevice.argtypes = [FT_HANDLE]
FT_ResetDevice.__doc__ = \
"""FT_STATUS FT_ResetDevice(FT_HANDLE ftHandle)
ftd2xx.h:415"""
FT_SetDtr = _libraries['ftd2xx64.dll'].FT_SetDtr
FT_SetDtr.restype = FT_STATUS
# FT_SetDtr(ftHandle)
FT_SetDtr.argtypes = [FT_HANDLE]
FT_SetDtr.__doc__ = \
"""FT_STATUS FT_SetDtr(FT_HANDLE ftHandle)
ftd2xx.h:420"""
FT_ClrDtr = _libraries['ftd2xx64.dll'].FT_ClrDtr
FT_ClrDtr.restype = FT_STATUS
# FT_ClrDtr(ftHandle)
FT_ClrDtr.argtypes = [FT_HANDLE]
FT_ClrDtr.__doc__ = \
"""FT_STATUS FT_ClrDtr(FT_HANDLE ftHandle)
ftd2xx.h:425"""
FT_SetRts = _libraries['ftd2xx64.dll'].FT_SetRts
FT_SetRts.restype = FT_STATUS
# FT_SetRts(ftHandle)
FT_SetRts.argtypes = [FT_HANDLE]
FT_SetRts.__doc__ = \
"""FT_STATUS FT_SetRts(FT_HANDLE ftHandle)
ftd2xx.h:430"""
FT_ClrRts = _libraries['ftd2xx64.dll'].FT_ClrRts
FT_ClrRts.restype = FT_STATUS
# FT_ClrRts(ftHandle)
FT_ClrRts.argtypes = [FT_HANDLE]
FT_ClrRts.__doc__ = \
"""FT_STATUS FT_ClrRts(FT_HANDLE ftHandle)
ftd2xx.h:435"""
FT_GetModemStatus = _libraries['ftd2xx64.dll'].FT_GetModemStatus
FT_GetModemStatus.restype = FT_STATUS
# FT_GetModemStatus(ftHandle, pModemStatus)
FT_GetModemStatus.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32)]
FT_GetModemStatus.__doc__ = \
"""FT_STATUS FT_GetModemStatus(FT_HANDLE ftHandle, LP_c_uint32 pModemStatus)
ftd2xx.h:440"""
FT_SetChars = _libraries['ftd2xx64.dll'].FT_SetChars
FT_SetChars.restype = FT_STATUS
# FT_SetChars(ftHandle, EventChar, EventCharEnabled, ErrorChar, ErrorCharEnabled)
FT_SetChars.argtypes = [FT_HANDLE, UCHAR, UCHAR, UCHAR, UCHAR]
FT_SetChars.__doc__ = \
"""FT_STATUS FT_SetChars(FT_HANDLE ftHandle, UCHAR EventChar, UCHAR EventCharEnabled, UCHAR ErrorChar, UCHAR ErrorCharEnabled)
ftd2xx.h:446"""
FT_Purge = _libraries['ftd2xx64.dll'].FT_Purge
FT_Purge.restype = FT_STATUS
# FT_Purge(ftHandle, Mask)
FT_Purge.argtypes = [FT_HANDLE, ULONG]
FT_Purge.__doc__ = \
"""FT_STATUS FT_Purge(FT_HANDLE ftHandle, ULONG Mask)
ftd2xx.h:455"""
FT_SetTimeouts = _libraries['ftd2xx64.dll'].FT_SetTimeouts
FT_SetTimeouts.restype = FT_STATUS
# FT_SetTimeouts(ftHandle, ReadTimeout, WriteTimeout)
FT_SetTimeouts.argtypes = [FT_HANDLE, ULONG, ULONG]
FT_SetTimeouts.__doc__ = \
"""FT_STATUS FT_SetTimeouts(FT_HANDLE ftHandle, ULONG ReadTimeout, ULONG WriteTimeout)
ftd2xx.h:461"""
FT_GetQueueStatus = _libraries['ftd2xx64.dll'].FT_GetQueueStatus
FT_GetQueueStatus.restype = FT_STATUS
# FT_GetQueueStatus(ftHandle, dwRxBytes)
FT_GetQueueStatus.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32)]
FT_GetQueueStatus.__doc__ = \
"""FT_STATUS FT_GetQueueStatus(FT_HANDLE ftHandle, LP_c_uint32 dwRxBytes)
ftd2xx.h:468"""
FT_SetEventNotification = _libraries['ftd2xx64.dll'].FT_SetEventNotification
FT_SetEventNotification.restype = FT_STATUS
# FT_SetEventNotification(ftHandle, Mask, Param)
FT_SetEventNotification.argtypes = [FT_HANDLE, DWORD, PVOID]
FT_SetEventNotification.__doc__ = \
"""FT_STATUS FT_SetEventNotification(FT_HANDLE ftHandle, DWORD Mask, PVOID Param)
ftd2xx.h:474"""
FT_GetStatus = _libraries['ftd2xx64.dll'].FT_GetStatus
FT_GetStatus.restype = FT_STATUS
# FT_GetStatus(ftHandle, dwRxBytes, dwTxBytes, dwEventDWord)
FT_GetStatus.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32), POINTER_T(ctypes.c_uint32), POINTER_T(ctypes.c_uint32)]
FT_GetStatus.__doc__ = \
"""FT_STATUS FT_GetStatus(FT_HANDLE ftHandle, LP_c_uint32 dwRxBytes, LP_c_uint32 dwTxBytes, LP_c_uint32 dwEventDWord)
ftd2xx.h:481"""
FT_SetBreakOn = _libraries['ftd2xx64.dll'].FT_SetBreakOn
FT_SetBreakOn.restype = FT_STATUS
# FT_SetBreakOn(ftHandle)
FT_SetBreakOn.argtypes = [FT_HANDLE]
FT_SetBreakOn.__doc__ = \
"""FT_STATUS FT_SetBreakOn(FT_HANDLE ftHandle)
ftd2xx.h:489"""
FT_SetBreakOff = _libraries['ftd2xx64.dll'].FT_SetBreakOff
FT_SetBreakOff.restype = FT_STATUS
# FT_SetBreakOff(ftHandle)
FT_SetBreakOff.argtypes = [FT_HANDLE]
FT_SetBreakOff.__doc__ = \
"""FT_STATUS FT_SetBreakOff(FT_HANDLE ftHandle)
ftd2xx.h:494"""
FT_SetWaitMask = _libraries['ftd2xx64.dll'].FT_SetWaitMask
FT_SetWaitMask.restype = FT_STATUS
# FT_SetWaitMask(ftHandle, Mask)
FT_SetWaitMask.argtypes = [FT_HANDLE, DWORD]
FT_SetWaitMask.__doc__ = \
"""FT_STATUS FT_SetWaitMask(FT_HANDLE ftHandle, DWORD Mask)
ftd2xx.h:499"""
FT_WaitOnMask = _libraries['ftd2xx64.dll'].FT_WaitOnMask
FT_WaitOnMask.restype = FT_STATUS
# FT_WaitOnMask(ftHandle, Mask)
FT_WaitOnMask.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32)]
FT_WaitOnMask.__doc__ = \
"""FT_STATUS FT_WaitOnMask(FT_HANDLE ftHandle, LP_c_uint32 Mask)
ftd2xx.h:505"""
FT_GetEventStatus = _libraries['ftd2xx64.dll'].FT_GetEventStatus
FT_GetEventStatus.restype = FT_STATUS
# FT_GetEventStatus(ftHandle, dwEventDWord)
FT_GetEventStatus.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32)]
FT_GetEventStatus.__doc__ = \
"""FT_STATUS FT_GetEventStatus(FT_HANDLE ftHandle, LP_c_uint32 dwEventDWord)
ftd2xx.h:511"""
FT_ReadEE = _libraries['ftd2xx64.dll'].FT_ReadEE
FT_ReadEE.restype = FT_STATUS
# FT_ReadEE(ftHandle, dwWordOffset, lpwValue)
FT_ReadEE.argtypes = [FT_HANDLE, DWORD, LPWORD]
FT_ReadEE.__doc__ = \
"""FT_STATUS FT_ReadEE(FT_HANDLE ftHandle, DWORD dwWordOffset, LPWORD lpwValue)
ftd2xx.h:517"""
FT_WriteEE = _libraries['ftd2xx64.dll'].FT_WriteEE
FT_WriteEE.restype = FT_STATUS
# FT_WriteEE(ftHandle, dwWordOffset, wValue)
FT_WriteEE.argtypes = [FT_HANDLE, DWORD, WORD]
FT_WriteEE.__doc__ = \
"""FT_STATUS FT_WriteEE(FT_HANDLE ftHandle, DWORD dwWordOffset, WORD wValue)
ftd2xx.h:524"""
FT_EraseEE = _libraries['ftd2xx64.dll'].FT_EraseEE
FT_EraseEE.restype = FT_STATUS
# FT_EraseEE(ftHandle)
FT_EraseEE.argtypes = [FT_HANDLE]
FT_EraseEE.__doc__ = \
"""FT_STATUS FT_EraseEE(FT_HANDLE ftHandle)
ftd2xx.h:531"""
class struct_ft_program_data(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('Signature1', ctypes.c_uint32),
        ('Signature2', ctypes.c_uint32),
        ('Version', ctypes.c_uint32),
        ('VendorId', ctypes.c_uint16),
        ('ProductId', ctypes.c_uint16),
        ('Manufacturer', POINTER_T(ctypes.c_char)),
        ('ManufacturerId', POINTER_T(ctypes.c_char)),
        ('Description', POINTER_T(ctypes.c_char)),
        ('SerialNumber', POINTER_T(ctypes.c_char)),
        ('MaxPower', ctypes.c_uint16),
        ('PnP', ctypes.c_uint16),
        ('SelfPowered', ctypes.c_uint16),
        ('RemoteWakeup', ctypes.c_uint16),
        ('Rev4', ctypes.c_ubyte),
        ('IsoIn', ctypes.c_ubyte),
        ('IsoOut', ctypes.c_ubyte),
        ('PullDownEnable', ctypes.c_ubyte),
        ('SerNumEnable', ctypes.c_ubyte),
        ('USBVersionEnable', ctypes.c_ubyte),
        ('USBVersion', ctypes.c_uint16),
        ('Rev5', ctypes.c_ubyte),
        ('IsoInA', ctypes.c_ubyte),
        ('IsoInB', ctypes.c_ubyte),
        ('IsoOutA', ctypes.c_ubyte),
        ('IsoOutB', ctypes.c_ubyte),
        ('PullDownEnable5', ctypes.c_ubyte),
        ('SerNumEnable5', ctypes.c_ubyte),
        ('USBVersionEnable5', ctypes.c_ubyte),
        ('USBVersion5', ctypes.c_uint16),
        ('AIsHighCurrent', ctypes.c_ubyte),
        ('BIsHighCurrent', ctypes.c_ubyte),
        ('IFAIsFifo', ctypes.c_ubyte),
        ('IFAIsFifoTar', ctypes.c_ubyte),
        ('IFAIsFastSer', ctypes.c_ubyte),
        ('AIsVCP', ctypes.c_ubyte),
        ('IFBIsFifo', ctypes.c_ubyte),
        ('IFBIsFifoTar', ctypes.c_ubyte),
        ('IFBIsFastSer', ctypes.c_ubyte),
        ('BIsVCP', ctypes.c_ubyte),
        ('UseExtOsc', ctypes.c_ubyte),
        ('HighDriveIOs', ctypes.c_ubyte),
        ('EndpointSize', ctypes.c_ubyte),
        ('PullDownEnableR', ctypes.c_ubyte),
        ('SerNumEnableR', ctypes.c_ubyte),
        ('InvertTXD', ctypes.c_ubyte),
        ('InvertRXD', ctypes.c_ubyte),
        ('InvertRTS', ctypes.c_ubyte),
        ('InvertCTS', ctypes.c_ubyte),
        ('InvertDTR', ctypes.c_ubyte),
        ('InvertDSR', ctypes.c_ubyte),
        ('InvertDCD', ctypes.c_ubyte),
        ('InvertRI', ctypes.c_ubyte),
        ('Cbus0', ctypes.c_ubyte),
        ('Cbus1', ctypes.c_ubyte),
        ('Cbus2', ctypes.c_ubyte),
        ('Cbus3', ctypes.c_ubyte),
        ('Cbus4', ctypes.c_ubyte),
        ('RIsD2XX', ctypes.c_ubyte),
        ('PullDownEnable7', ctypes.c_ubyte),
        ('SerNumEnable7', ctypes.c_ubyte),
        ('ALSlowSlew', ctypes.c_ubyte),
        ('ALSchmittInput', ctypes.c_ubyte),
        ('ALDriveCurrent', ctypes.c_ubyte),
        ('AHSlowSlew', ctypes.c_ubyte),
        ('AHSchmittInput', ctypes.c_ubyte),
        ('AHDriveCurrent', ctypes.c_ubyte),
        ('BLSlowSlew', ctypes.c_ubyte),
        ('BLSchmittInput', ctypes.c_ubyte),
        ('BLDriveCurrent', ctypes.c_ubyte),
        ('BHSlowSlew', ctypes.c_ubyte),
        ('BHSchmittInput', ctypes.c_ubyte),
        ('BHDriveCurrent', ctypes.c_ubyte),
        ('IFAIsFifo7', ctypes.c_ubyte),
        ('IFAIsFifoTar7', ctypes.c_ubyte),
        ('IFAIsFastSer7', ctypes.c_ubyte),
        ('AIsVCP7', ctypes.c_ubyte),
        ('IFBIsFifo7', ctypes.c_ubyte),
        ('IFBIsFifoTar7', ctypes.c_ubyte),
        ('IFBIsFastSer7', ctypes.c_ubyte),
        ('BIsVCP7', ctypes.c_ubyte),
        ('PowerSaveEnable', ctypes.c_ubyte),
        ('PullDownEnable8', ctypes.c_ubyte),
        ('SerNumEnable8', ctypes.c_ubyte),
        ('ASlowSlew', ctypes.c_ubyte),
        ('ASchmittInput', ctypes.c_ubyte),
        ('ADriveCurrent', ctypes.c_ubyte),
        ('BSlowSlew', ctypes.c_ubyte),
        ('BSchmittInput', ctypes.c_ubyte),
        ('BDriveCurrent', ctypes.c_ubyte),
        ('CSlowSlew', ctypes.c_ubyte),
        ('CSchmittInput', ctypes.c_ubyte),
        ('CDriveCurrent', ctypes.c_ubyte),
        ('DSlowSlew', ctypes.c_ubyte),
        ('DSchmittInput', ctypes.c_ubyte),
        ('DDriveCurrent', ctypes.c_ubyte),
        ('ARIIsTXDEN', ctypes.c_ubyte),
        ('BRIIsTXDEN', ctypes.c_ubyte),
        ('CRIIsTXDEN', ctypes.c_ubyte),
        ('DRIIsTXDEN', ctypes.c_ubyte),
        ('AIsVCP8', ctypes.c_ubyte),
        ('BIsVCP8', ctypes.c_ubyte),
        ('CIsVCP8', ctypes.c_ubyte),
        ('DIsVCP8', ctypes.c_ubyte),
        ('PullDownEnableH', ctypes.c_ubyte),
        ('SerNumEnableH', ctypes.c_ubyte),
        ('ACSlowSlewH', ctypes.c_ubyte),
        ('ACSchmittInputH', ctypes.c_ubyte),
        ('ACDriveCurrentH', ctypes.c_ubyte),
        ('ADSlowSlewH', ctypes.c_ubyte),
        ('ADSchmittInputH', ctypes.c_ubyte),
        ('ADDriveCurrentH', ctypes.c_ubyte),
        ('Cbus0H', ctypes.c_ubyte),
        ('Cbus1H', ctypes.c_ubyte),
        ('Cbus2H', ctypes.c_ubyte),
        ('Cbus3H', ctypes.c_ubyte),
        ('Cbus4H', ctypes.c_ubyte),
        ('Cbus5H', ctypes.c_ubyte),
        ('Cbus6H', ctypes.c_ubyte),
        ('Cbus7H', ctypes.c_ubyte),
        ('Cbus8H', ctypes.c_ubyte),
        ('Cbus9H', ctypes.c_ubyte),
        ('IsFifoH', ctypes.c_ubyte),
        ('IsFifoTarH', ctypes.c_ubyte),
        ('IsFastSerH', ctypes.c_ubyte),
        ('IsFT1248H', ctypes.c_ubyte),
        ('FT1248CpolH', ctypes.c_ubyte),
        ('FT1248LsbH', ctypes.c_ubyte),
        ('FT1248FlowControlH', ctypes.c_ubyte),
        ('IsVCPH', ctypes.c_ubyte),
        ('PowerSaveEnableH', ctypes.c_ubyte),
        ('PADDING_0', ctypes.c_ubyte),
    ]
FT_PROGRAM_DATA = struct_ft_program_data
PFT_PROGRAM_DATA = POINTER_T(struct_ft_program_data)
FT_EE_Program = _libraries['ftd2xx64.dll'].FT_EE_Program
FT_EE_Program.restype = FT_STATUS
# FT_EE_Program(ftHandle, pData)
FT_EE_Program.argtypes = [FT_HANDLE, PFT_PROGRAM_DATA]
FT_EE_Program.__doc__ = \
"""FT_STATUS FT_EE_Program(FT_HANDLE ftHandle, PFT_PROGRAM_DATA pData)
ftd2xx.h:700"""
FT_EE_ProgramEx = _libraries['ftd2xx64.dll'].FT_EE_ProgramEx
FT_EE_ProgramEx.restype = FT_STATUS
# FT_EE_ProgramEx(ftHandle, pData, Manufacturer, ManufacturerId, Description, SerialNumber)
FT_EE_ProgramEx.argtypes = [FT_HANDLE, PFT_PROGRAM_DATA, POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char)]
FT_EE_ProgramEx.__doc__ = \
"""FT_STATUS FT_EE_ProgramEx(FT_HANDLE ftHandle, PFT_PROGRAM_DATA pData, LP_c_char Manufacturer, LP_c_char ManufacturerId, LP_c_char Description, LP_c_char SerialNumber)
ftd2xx.h:706"""
FT_EE_Read = _libraries['ftd2xx64.dll'].FT_EE_Read
FT_EE_Read.restype = FT_STATUS
# FT_EE_Read(ftHandle, pData)
FT_EE_Read.argtypes = [FT_HANDLE, PFT_PROGRAM_DATA]
FT_EE_Read.__doc__ = \
"""FT_STATUS FT_EE_Read(FT_HANDLE ftHandle, PFT_PROGRAM_DATA pData)
ftd2xx.h:716"""
FT_EE_ReadEx = _libraries['ftd2xx64.dll'].FT_EE_ReadEx
FT_EE_ReadEx.restype = FT_STATUS
# FT_EE_ReadEx(ftHandle, pData, Manufacturer, ManufacturerId, Description, SerialNumber)
FT_EE_ReadEx.argtypes = [FT_HANDLE, PFT_PROGRAM_DATA, POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char)]
FT_EE_ReadEx.__doc__ = \
"""FT_STATUS FT_EE_ReadEx(FT_HANDLE ftHandle, PFT_PROGRAM_DATA pData, LP_c_char Manufacturer, LP_c_char ManufacturerId, LP_c_char Description, LP_c_char SerialNumber)
ftd2xx.h:722"""
FT_EE_UASize = _libraries['ftd2xx64.dll'].FT_EE_UASize
FT_EE_UASize.restype = FT_STATUS
# FT_EE_UASize(ftHandle, lpdwSize)
FT_EE_UASize.argtypes = [FT_HANDLE, LPDWORD]
FT_EE_UASize.__doc__ = \
"""FT_STATUS FT_EE_UASize(FT_HANDLE ftHandle, LPDWORD lpdwSize)
ftd2xx.h:732"""
FT_EE_UAWrite = _libraries['ftd2xx64.dll'].FT_EE_UAWrite
FT_EE_UAWrite.restype = FT_STATUS
# FT_EE_UAWrite(ftHandle, pucData, dwDataLen)
FT_EE_UAWrite.argtypes = [FT_HANDLE, PUCHAR, DWORD]
FT_EE_UAWrite.__doc__ = \
"""FT_STATUS FT_EE_UAWrite(FT_HANDLE ftHandle, PUCHAR pucData, DWORD dwDataLen)
ftd2xx.h:738"""
FT_EE_UARead = _libraries['ftd2xx64.dll'].FT_EE_UARead
FT_EE_UARead.restype = FT_STATUS
# FT_EE_UARead(ftHandle, pucData, dwDataLen, lpdwBytesRead)
FT_EE_UARead.argtypes = [FT_HANDLE, PUCHAR, DWORD, LPDWORD]
FT_EE_UARead.__doc__ = \
"""FT_STATUS FT_EE_UARead(FT_HANDLE ftHandle, PUCHAR pucData, DWORD dwDataLen, LPDWORD lpdwBytesRead)
ftd2xx.h:745"""
class struct_ft_eeprom_header(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('deviceType', ctypes.c_uint32),
        ('VendorId', ctypes.c_uint16),
        ('ProductId', ctypes.c_uint16),
        ('SerNumEnable', ctypes.c_ubyte),
        ('PADDING_0', ctypes.c_ubyte),
        ('MaxPower', ctypes.c_uint16),
        ('SelfPowered', ctypes.c_ubyte),
        ('RemoteWakeup', ctypes.c_ubyte),
        ('PullDownEnable', ctypes.c_ubyte),
        ('PADDING_1', ctypes.c_ubyte),
    ]
FT_EEPROM_HEADER = struct_ft_eeprom_header
PFT_EEPROM_HEADER = POINTER_T(struct_ft_eeprom_header)
class struct_ft_eeprom_232b(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
    ]
FT_EEPROM_232B = struct_ft_eeprom_232b
PFT_EEPROM_232B = POINTER_T(struct_ft_eeprom_232b)
class struct_ft_eeprom_2232(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('AIsHighCurrent', ctypes.c_ubyte),
        ('BIsHighCurrent', ctypes.c_ubyte),
        ('AIsFifo', ctypes.c_ubyte),
        ('AIsFifoTar', ctypes.c_ubyte),
        ('AIsFastSer', ctypes.c_ubyte),
        ('BIsFifo', ctypes.c_ubyte),
        ('BIsFifoTar', ctypes.c_ubyte),
        ('BIsFastSer', ctypes.c_ubyte),
        ('ADriverType', ctypes.c_ubyte),
        ('BDriverType', ctypes.c_ubyte),
        ('PADDING_0', ctypes.c_ubyte * 2),
    ]
FT_EEPROM_2232 = struct_ft_eeprom_2232
PFT_EEPROM_2232 = POINTER_T(struct_ft_eeprom_2232)
class struct_ft_eeprom_232r(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('IsHighCurrent', ctypes.c_ubyte),
        ('UseExtOsc', ctypes.c_ubyte),
        ('InvertTXD', ctypes.c_ubyte),
        ('InvertRXD', ctypes.c_ubyte),
        ('InvertRTS', ctypes.c_ubyte),
        ('InvertCTS', ctypes.c_ubyte),
        ('InvertDTR', ctypes.c_ubyte),
        ('InvertDSR', ctypes.c_ubyte),
        ('InvertDCD', ctypes.c_ubyte),
        ('InvertRI', ctypes.c_ubyte),
        ('Cbus0', ctypes.c_ubyte),
        ('Cbus1', ctypes.c_ubyte),
        ('Cbus2', ctypes.c_ubyte),
        ('Cbus3', ctypes.c_ubyte),
        ('Cbus4', ctypes.c_ubyte),
        ('DriverType', ctypes.c_ubyte),
    ]
FT_EEPROM_232R = struct_ft_eeprom_232r
PFT_EEPROM_232R = POINTER_T(struct_ft_eeprom_232r)
class struct_ft_eeprom_2232h(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('ALSlowSlew', ctypes.c_ubyte),
        ('ALSchmittInput', ctypes.c_ubyte),
        ('ALDriveCurrent', ctypes.c_ubyte),
        ('AHSlowSlew', ctypes.c_ubyte),
        ('AHSchmittInput', ctypes.c_ubyte),
        ('AHDriveCurrent', ctypes.c_ubyte),
        ('BLSlowSlew', ctypes.c_ubyte),
        ('BLSchmittInput', ctypes.c_ubyte),
        ('BLDriveCurrent', ctypes.c_ubyte),
        ('BHSlowSlew', ctypes.c_ubyte),
        ('BHSchmittInput', ctypes.c_ubyte),
        ('BHDriveCurrent', ctypes.c_ubyte),
        ('AIsFifo', ctypes.c_ubyte),
        ('AIsFifoTar', ctypes.c_ubyte),
        ('AIsFastSer', ctypes.c_ubyte),
        ('BIsFifo', ctypes.c_ubyte),
        ('BIsFifoTar', ctypes.c_ubyte),
        ('BIsFastSer', ctypes.c_ubyte),
        ('PowerSaveEnable', ctypes.c_ubyte),
        ('ADriverType', ctypes.c_ubyte),
        ('BDriverType', ctypes.c_ubyte),
        ('PADDING_0', ctypes.c_ubyte * 3),
    ]
FT_EEPROM_2232H = struct_ft_eeprom_2232h
PFT_EEPROM_2232H = POINTER_T(struct_ft_eeprom_2232h)
class struct_ft_eeprom_4232h(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('ASlowSlew', ctypes.c_ubyte),
        ('ASchmittInput', ctypes.c_ubyte),
        ('ADriveCurrent', ctypes.c_ubyte),
        ('BSlowSlew', ctypes.c_ubyte),
        ('BSchmittInput', ctypes.c_ubyte),
        ('BDriveCurrent', ctypes.c_ubyte),
        ('CSlowSlew', ctypes.c_ubyte),
        ('CSchmittInput', ctypes.c_ubyte),
        ('CDriveCurrent', ctypes.c_ubyte),
        ('DSlowSlew', ctypes.c_ubyte),
        ('DSchmittInput', ctypes.c_ubyte),
        ('DDriveCurrent', ctypes.c_ubyte),
        ('ARIIsTXDEN', ctypes.c_ubyte),
        ('BRIIsTXDEN', ctypes.c_ubyte),
        ('CRIIsTXDEN', ctypes.c_ubyte),
        ('DRIIsTXDEN', ctypes.c_ubyte),
        ('ADriverType', ctypes.c_ubyte),
        ('BDriverType', ctypes.c_ubyte),
        ('CDriverType', ctypes.c_ubyte),
        ('DDriverType', ctypes.c_ubyte),
    ]
FT_EEPROM_4232H = struct_ft_eeprom_4232h
PFT_EEPROM_4232H = POINTER_T(struct_ft_eeprom_4232h)
class struct_ft_eeprom_232h(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('ACSlowSlew', ctypes.c_ubyte),
        ('ACSchmittInput', ctypes.c_ubyte),
        ('ACDriveCurrent', ctypes.c_ubyte),
        ('ADSlowSlew', ctypes.c_ubyte),
        ('ADSchmittInput', ctypes.c_ubyte),
        ('ADDriveCurrent', ctypes.c_ubyte),
        ('Cbus0', ctypes.c_ubyte),
        ('Cbus1', ctypes.c_ubyte),
        ('Cbus2', ctypes.c_ubyte),
        ('Cbus3', ctypes.c_ubyte),
        ('Cbus4', ctypes.c_ubyte),
        ('Cbus5', ctypes.c_ubyte),
        ('Cbus6', ctypes.c_ubyte),
        ('Cbus7', ctypes.c_ubyte),
        ('Cbus8', ctypes.c_ubyte),
        ('Cbus9', ctypes.c_ubyte),
        ('FT1248Cpol', ctypes.c_ubyte),
        ('FT1248Lsb', ctypes.c_ubyte),
        ('FT1248FlowControl', ctypes.c_ubyte),
        ('IsFifo', ctypes.c_ubyte),
        ('IsFifoTar', ctypes.c_ubyte),
        ('IsFastSer', ctypes.c_ubyte),
        ('IsFT1248', ctypes.c_ubyte),
        ('PowerSaveEnable', ctypes.c_ubyte),
        ('DriverType', ctypes.c_ubyte),
        ('PADDING_0', ctypes.c_ubyte * 3),
    ]
FT_EEPROM_232H = struct_ft_eeprom_232h
PFT_EEPROM_232H = POINTER_T(struct_ft_eeprom_232h)
class struct_ft_eeprom_x_series(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('common', FT_EEPROM_HEADER),
        ('ACSlowSlew', ctypes.c_ubyte),
        ('ACSchmittInput', ctypes.c_ubyte),
        ('ACDriveCurrent', ctypes.c_ubyte),
        ('ADSlowSlew', ctypes.c_ubyte),
        ('ADSchmittInput', ctypes.c_ubyte),
        ('ADDriveCurrent', ctypes.c_ubyte),
        ('Cbus0', ctypes.c_ubyte),
        ('Cbus1', ctypes.c_ubyte),
        ('Cbus2', ctypes.c_ubyte),
        ('Cbus3', ctypes.c_ubyte),
        ('Cbus4', ctypes.c_ubyte),
        ('Cbus5', ctypes.c_ubyte),
        ('Cbus6', ctypes.c_ubyte),
        ('InvertTXD', ctypes.c_ubyte),
        ('InvertRXD', ctypes.c_ubyte),
        ('InvertRTS', ctypes.c_ubyte),
        ('InvertCTS', ctypes.c_ubyte),
        ('InvertDTR', ctypes.c_ubyte),
        ('InvertDSR', ctypes.c_ubyte),
        ('InvertDCD', ctypes.c_ubyte),
        ('InvertRI', ctypes.c_ubyte),
        ('BCDEnable', ctypes.c_ubyte),
        ('BCDForceCbusPWREN', ctypes.c_ubyte),
        ('BCDDisableSleep', ctypes.c_ubyte),
        ('I2CSlaveAddress', ctypes.c_uint16),
        ('PADDING_0', ctypes.c_ubyte * 2),
        ('I2CDeviceId', ctypes.c_uint32),
        ('I2CDisableSchmitt', ctypes.c_ubyte),
        ('FT1248Cpol', ctypes.c_ubyte),
        ('FT1248Lsb', ctypes.c_ubyte),
        ('FT1248FlowControl', ctypes.c_ubyte),
        ('RS485EchoSuppress', ctypes.c_ubyte),
        ('PowerSaveEnable', ctypes.c_ubyte),
        ('DriverType', ctypes.c_ubyte),
        ('PADDING_1', ctypes.c_ubyte),
    ]
FT_EEPROM_X_SERIES = struct_ft_eeprom_x_series
PFT_EEPROM_X_SERIES = POINTER_T(struct_ft_eeprom_x_series)
FT_EEPROM_Read = _libraries['ftd2xx64.dll'].FT_EEPROM_Read
FT_EEPROM_Read.restype = FT_STATUS
# FT_EEPROM_Read(ftHandle, eepromData, eepromDataSize, Manufacturer, ManufacturerId, Description, SerialNumber)
FT_EEPROM_Read.argtypes = [FT_HANDLE, POINTER_T(None), DWORD, POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char)]
FT_EEPROM_Read.__doc__ = \
"""FT_STATUS FT_EEPROM_Read(FT_HANDLE ftHandle, LP_None eepromData, DWORD eepromDataSize, LP_c_char Manufacturer, LP_c_char ManufacturerId, LP_c_char Description, LP_c_char SerialNumber)
ftd2xx.h:968"""
FT_EEPROM_Program = _libraries['ftd2xx64.dll'].FT_EEPROM_Program
FT_EEPROM_Program.restype = FT_STATUS
# FT_EEPROM_Program(ftHandle, eepromData, eepromDataSize, Manufacturer, ManufacturerId, Description, SerialNumber)
FT_EEPROM_Program.argtypes = [FT_HANDLE, POINTER_T(None), DWORD, POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char), POINTER_T(ctypes.c_char)]
FT_EEPROM_Program.__doc__ = \
"""FT_STATUS FT_EEPROM_Program(FT_HANDLE ftHandle, LP_None eepromData, DWORD eepromDataSize, LP_c_char Manufacturer, LP_c_char ManufacturerId, LP_c_char Description, LP_c_char SerialNumber)
ftd2xx.h:980"""
FT_SetLatencyTimer = _libraries['ftd2xx64.dll'].FT_SetLatencyTimer
FT_SetLatencyTimer.restype = FT_STATUS
# FT_SetLatencyTimer(ftHandle, ucLatency)
FT_SetLatencyTimer.argtypes = [FT_HANDLE, UCHAR]
FT_SetLatencyTimer.__doc__ = \
"""FT_STATUS FT_SetLatencyTimer(FT_HANDLE ftHandle, UCHAR ucLatency)
ftd2xx.h:992"""
FT_GetLatencyTimer = _libraries['ftd2xx64.dll'].FT_GetLatencyTimer
FT_GetLatencyTimer.restype = FT_STATUS
# FT_GetLatencyTimer(ftHandle, pucLatency)
FT_GetLatencyTimer.argtypes = [FT_HANDLE, PUCHAR]
FT_GetLatencyTimer.__doc__ = \
"""FT_STATUS FT_GetLatencyTimer(FT_HANDLE ftHandle, PUCHAR pucLatency)
ftd2xx.h:998"""
FT_SetBitMode = _libraries['ftd2xx64.dll'].FT_SetBitMode
FT_SetBitMode.restype = FT_STATUS
# FT_SetBitMode(ftHandle, ucMask, ucEnable)
FT_SetBitMode.argtypes = [FT_HANDLE, UCHAR, UCHAR]
FT_SetBitMode.__doc__ = \
"""FT_STATUS FT_SetBitMode(FT_HANDLE ftHandle, UCHAR ucMask, UCHAR ucEnable)
ftd2xx.h:1004"""
FT_GetBitMode = _libraries['ftd2xx64.dll'].FT_GetBitMode
FT_GetBitMode.restype = FT_STATUS
# FT_GetBitMode(ftHandle, pucMode)
FT_GetBitMode.argtypes = [FT_HANDLE, PUCHAR]
FT_GetBitMode.__doc__ = \
"""FT_STATUS FT_GetBitMode(FT_HANDLE ftHandle, PUCHAR pucMode)
ftd2xx.h:1011"""
FT_SetUSBParameters = _libraries['ftd2xx64.dll'].FT_SetUSBParameters
FT_SetUSBParameters.restype = FT_STATUS
# FT_SetUSBParameters(ftHandle, ulInTransferSize, ulOutTransferSize)
FT_SetUSBParameters.argtypes = [FT_HANDLE, ULONG, ULONG]
FT_SetUSBParameters.__doc__ = \
"""FT_STATUS FT_SetUSBParameters(FT_HANDLE ftHandle, ULONG ulInTransferSize, ULONG ulOutTransferSize)
ftd2xx.h:1017"""
FT_SetDeadmanTimeout = _libraries['ftd2xx64.dll'].FT_SetDeadmanTimeout
FT_SetDeadmanTimeout.restype = FT_STATUS
# FT_SetDeadmanTimeout(ftHandle, ulDeadmanTimeout)
FT_SetDeadmanTimeout.argtypes = [FT_HANDLE, ULONG]
FT_SetDeadmanTimeout.__doc__ = \
"""FT_STATUS FT_SetDeadmanTimeout(FT_HANDLE ftHandle, ULONG ulDeadmanTimeout)
ftd2xx.h:1024"""
FT_GetDeviceInfo = _libraries['ftd2xx64.dll'].FT_GetDeviceInfo
FT_GetDeviceInfo.restype = FT_STATUS
# FT_GetDeviceInfo(ftHandle, lpftDevice, lpdwID, SerialNumber, Description, Dummy)
FT_GetDeviceInfo.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32), LPDWORD, PCHAR, PCHAR, LPVOID]
FT_GetDeviceInfo.__doc__ = \
"""FT_STATUS FT_GetDeviceInfo(FT_HANDLE ftHandle, LP_c_uint32 lpftDevice, LPDWORD lpdwID, PCHAR SerialNumber, PCHAR Description, LPVOID Dummy)
ftd2xx.h:1053"""
FT_StopInTask = _libraries['ftd2xx64.dll'].FT_StopInTask
FT_StopInTask.restype = FT_STATUS
# FT_StopInTask(ftHandle)
FT_StopInTask.argtypes = [FT_HANDLE]
FT_StopInTask.__doc__ = \
"""FT_STATUS FT_StopInTask(FT_HANDLE ftHandle)
ftd2xx.h:1063"""
FT_RestartInTask = _libraries['ftd2xx64.dll'].FT_RestartInTask
FT_RestartInTask.restype = FT_STATUS
# FT_RestartInTask(ftHandle)
FT_RestartInTask.argtypes = [FT_HANDLE]
FT_RestartInTask.__doc__ = \
"""FT_STATUS FT_RestartInTask(FT_HANDLE ftHandle)
ftd2xx.h:1068"""
FT_SetResetPipeRetryCount = _libraries['ftd2xx64.dll'].FT_SetResetPipeRetryCount
FT_SetResetPipeRetryCount.restype = FT_STATUS
# FT_SetResetPipeRetryCount(ftHandle, dwCount)
FT_SetResetPipeRetryCount.argtypes = [FT_HANDLE, DWORD]
FT_SetResetPipeRetryCount.__doc__ = \
"""FT_STATUS FT_SetResetPipeRetryCount(FT_HANDLE ftHandle, DWORD dwCount)
ftd2xx.h:1073"""
FT_ResetPort = _libraries['ftd2xx64.dll'].FT_ResetPort
FT_ResetPort.restype = FT_STATUS
# FT_ResetPort(ftHandle)
FT_ResetPort.argtypes = [FT_HANDLE]
FT_ResetPort.__doc__ = \
"""FT_STATUS FT_ResetPort(FT_HANDLE ftHandle)
ftd2xx.h:1079"""
FT_CyclePort = _libraries['ftd2xx64.dll'].FT_CyclePort
FT_CyclePort.restype = FT_STATUS
# FT_CyclePort(ftHandle)
FT_CyclePort.argtypes = [FT_HANDLE]
FT_CyclePort.__doc__ = \
"""FT_STATUS FT_CyclePort(FT_HANDLE ftHandle)
ftd2xx.h:1084"""
FT_W32_CreateFile = _libraries['ftd2xx64.dll'].FT_W32_CreateFile
FT_W32_CreateFile.restype = FT_HANDLE
# FT_W32_CreateFile(lpszName, dwAccess, dwShareMode, lpSecurityAttributes, dwCreate, dwAttrsAndFlags, hTemplate)
FT_W32_CreateFile.argtypes = [LPCTSTR, DWORD, DWORD, LPSECURITY_ATTRIBUTES, DWORD, DWORD, HANDLE]
FT_W32_CreateFile.__doc__ = \
"""FT_HANDLE FT_W32_CreateFile(LPCTSTR lpszName, DWORD dwAccess, DWORD dwShareMode, LPSECURITY_ATTRIBUTES lpSecurityAttributes, DWORD dwCreate, DWORD dwAttrsAndFlags, HANDLE hTemplate)
ftd2xx.h:1094"""
FT_W32_CloseHandle = _libraries['ftd2xx64.dll'].FT_W32_CloseHandle
FT_W32_CloseHandle.restype = BOOL
# FT_W32_CloseHandle(ftHandle)
FT_W32_CloseHandle.argtypes = [FT_HANDLE]
FT_W32_CloseHandle.__doc__ = \
"""BOOL FT_W32_CloseHandle(FT_HANDLE ftHandle)
ftd2xx.h:1105"""
FT_W32_ReadFile = _libraries['ftd2xx64.dll'].FT_W32_ReadFile
FT_W32_ReadFile.restype = BOOL
# FT_W32_ReadFile(ftHandle, lpBuffer, nBufferSize, lpBytesReturned, lpOverlapped)
FT_W32_ReadFile.argtypes = [FT_HANDLE, LPVOID, DWORD, LPDWORD, LPOVERLAPPED]
FT_W32_ReadFile.__doc__ = \
"""BOOL FT_W32_ReadFile(FT_HANDLE ftHandle, LPVOID lpBuffer, DWORD nBufferSize, LPDWORD lpBytesReturned, LPOVERLAPPED lpOverlapped)
ftd2xx.h:1110"""
FT_W32_WriteFile = _libraries['ftd2xx64.dll'].FT_W32_WriteFile
FT_W32_WriteFile.restype = BOOL
# FT_W32_WriteFile(ftHandle, lpBuffer, nBufferSize, lpBytesWritten, lpOverlapped)
FT_W32_WriteFile.argtypes = [FT_HANDLE, LPVOID, DWORD, LPDWORD, LPOVERLAPPED]
FT_W32_WriteFile.__doc__ = \
"""BOOL FT_W32_WriteFile(FT_HANDLE ftHandle, LPVOID lpBuffer, DWORD nBufferSize, LPDWORD lpBytesWritten, LPOVERLAPPED lpOverlapped)
ftd2xx.h:1119"""
FT_W32_GetLastError = _libraries['ftd2xx64.dll'].FT_W32_GetLastError
FT_W32_GetLastError.restype = DWORD
# FT_W32_GetLastError(ftHandle)
FT_W32_GetLastError.argtypes = [FT_HANDLE]
FT_W32_GetLastError.__doc__ = \
"""DWORD FT_W32_GetLastError(FT_HANDLE ftHandle)
ftd2xx.h:1128"""
FT_W32_GetOverlappedResult = _libraries['ftd2xx64.dll'].FT_W32_GetOverlappedResult
FT_W32_GetOverlappedResult.restype = BOOL
# FT_W32_GetOverlappedResult(ftHandle, lpOverlapped, lpdwBytesTransferred, bWait)
FT_W32_GetOverlappedResult.argtypes = [FT_HANDLE, LPOVERLAPPED, LPDWORD, BOOL]
FT_W32_GetOverlappedResult.__doc__ = \
"""BOOL FT_W32_GetOverlappedResult(FT_HANDLE ftHandle, LPOVERLAPPED lpOverlapped, LPDWORD lpdwBytesTransferred, BOOL bWait)
ftd2xx.h:1133"""
FT_W32_CancelIo = _libraries['ftd2xx64.dll'].FT_W32_CancelIo
FT_W32_CancelIo.restype = BOOL
# FT_W32_CancelIo(ftHandle)
FT_W32_CancelIo.argtypes = [FT_HANDLE]
FT_W32_CancelIo.__doc__ = \
"""BOOL FT_W32_CancelIo(FT_HANDLE ftHandle)
ftd2xx.h:1141"""
class struct__FTCOMSTAT(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('fCtsHold', ctypes.c_uint32, 1),
        ('fDsrHold', ctypes.c_uint32, 1),
        ('fRlsdHold', ctypes.c_uint32, 1),
        ('fXoffHold', ctypes.c_uint32, 1),
        ('fXoffSent', ctypes.c_uint32, 1),
        ('fEof', ctypes.c_uint32, 1),
        ('fTxim', ctypes.c_uint32, 1),
        ('fReserved', ctypes.c_uint32, 25),
        ('cbInQue', ctypes.c_uint32),
        ('cbOutQue', ctypes.c_uint32),
    ]
FTCOMSTAT = struct__FTCOMSTAT
LPFTCOMSTAT = POINTER_T(struct__FTCOMSTAT)
class struct__FTDCB(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('DCBlength', ctypes.c_uint32),
        ('BaudRate', ctypes.c_uint32),
        ('fBinary', ctypes.c_uint32, 1),
        ('fParity', ctypes.c_uint32, 1),
        ('fOutxCtsFlow', ctypes.c_uint32, 1),
        ('fOutxDsrFlow', ctypes.c_uint32, 1),
        ('fDtrControl', ctypes.c_uint32, 2),
        ('fDsrSensitivity', ctypes.c_uint32, 1),
        ('fTXContinueOnXoff', ctypes.c_uint32, 1),
        ('fOutX', ctypes.c_uint32, 1),
        ('fInX', ctypes.c_uint32, 1),
        ('fErrorChar', ctypes.c_uint32, 1),
        ('fNull', ctypes.c_uint32, 1),
        ('fRtsControl', ctypes.c_uint32, 2),
        ('fAbortOnError', ctypes.c_uint32, 1),
        ('fDummy2', ctypes.c_uint32, 17),
        ('wReserved', ctypes.c_uint16),
        ('XonLim', ctypes.c_uint16),
        ('XoffLim', ctypes.c_uint16),
        ('ByteSize', ctypes.c_ubyte),
        ('Parity', ctypes.c_ubyte),
        ('StopBits', ctypes.c_ubyte),
        ('XonChar', ctypes.c_char),
        ('XoffChar', ctypes.c_char),
        ('ErrorChar', ctypes.c_char),
        ('EofChar', ctypes.c_char),
        ('EvtChar', ctypes.c_char),
        ('wReserved1', ctypes.c_uint16),
    ]
FTDCB = struct__FTDCB
LPFTDCB = POINTER_T(struct__FTDCB)
class struct__FTTIMEOUTS(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('ReadIntervalTimeout', ctypes.c_uint32),
        ('ReadTotalTimeoutMultiplier', ctypes.c_uint32),
        ('ReadTotalTimeoutConstant', ctypes.c_uint32),
        ('WriteTotalTimeoutMultiplier', ctypes.c_uint32),
        ('WriteTotalTimeoutConstant', ctypes.c_uint32),
    ]
FTTIMEOUTS = struct__FTTIMEOUTS
LPFTTIMEOUTS = POINTER_T(struct__FTTIMEOUTS)
FT_W32_ClearCommBreak = _libraries['ftd2xx64.dll'].FT_W32_ClearCommBreak
FT_W32_ClearCommBreak.restype = BOOL
# FT_W32_ClearCommBreak(ftHandle)
FT_W32_ClearCommBreak.argtypes = [FT_HANDLE]
FT_W32_ClearCommBreak.__doc__ = \
"""BOOL FT_W32_ClearCommBreak(FT_HANDLE ftHandle)
ftd2xx.h:1203"""
FT_W32_ClearCommError = _libraries['ftd2xx64.dll'].FT_W32_ClearCommError
FT_W32_ClearCommError.restype = BOOL
# FT_W32_ClearCommError(ftHandle, lpdwErrors, lpftComstat)
FT_W32_ClearCommError.argtypes = [FT_HANDLE, LPDWORD, LPFTCOMSTAT]
FT_W32_ClearCommError.__doc__ = \
"""BOOL FT_W32_ClearCommError(FT_HANDLE ftHandle, LPDWORD lpdwErrors, LPFTCOMSTAT lpftComstat)
ftd2xx.h:1208"""
FT_W32_EscapeCommFunction = _libraries['ftd2xx64.dll'].FT_W32_EscapeCommFunction
FT_W32_EscapeCommFunction.restype = BOOL
# FT_W32_EscapeCommFunction(ftHandle, dwFunc)
FT_W32_EscapeCommFunction.argtypes = [FT_HANDLE, DWORD]
FT_W32_EscapeCommFunction.__doc__ = \
"""BOOL FT_W32_EscapeCommFunction(FT_HANDLE ftHandle, DWORD dwFunc)
ftd2xx.h:1215"""
FT_W32_GetCommModemStatus = _libraries['ftd2xx64.dll'].FT_W32_GetCommModemStatus
FT_W32_GetCommModemStatus.restype = BOOL
# FT_W32_GetCommModemStatus(ftHandle, lpdwModemStatus)
FT_W32_GetCommModemStatus.argtypes = [FT_HANDLE, LPDWORD]
FT_W32_GetCommModemStatus.__doc__ = \
"""BOOL FT_W32_GetCommModemStatus(FT_HANDLE ftHandle, LPDWORD lpdwModemStatus)
ftd2xx.h:1221"""
FT_W32_GetCommState = _libraries['ftd2xx64.dll'].FT_W32_GetCommState
FT_W32_GetCommState.restype = BOOL
# FT_W32_GetCommState(ftHandle, lpftDcb)
FT_W32_GetCommState.argtypes = [FT_HANDLE, LPFTDCB]
FT_W32_GetCommState.__doc__ = \
"""BOOL FT_W32_GetCommState(FT_HANDLE ftHandle, LPFTDCB lpftDcb)
ftd2xx.h:1227"""
FT_W32_GetCommTimeouts = _libraries['ftd2xx64.dll'].FT_W32_GetCommTimeouts
FT_W32_GetCommTimeouts.restype = BOOL
# FT_W32_GetCommTimeouts(ftHandle, pTimeouts)
FT_W32_GetCommTimeouts.argtypes = [FT_HANDLE, POINTER_T(struct__FTTIMEOUTS)]
FT_W32_GetCommTimeouts.__doc__ = \
"""BOOL FT_W32_GetCommTimeouts(FT_HANDLE ftHandle, LP_struct__FTTIMEOUTS pTimeouts)
ftd2xx.h:1233"""
FT_W32_PurgeComm = _libraries['ftd2xx64.dll'].FT_W32_PurgeComm
FT_W32_PurgeComm.restype = BOOL
# FT_W32_PurgeComm(ftHandle, dwMask)
FT_W32_PurgeComm.argtypes = [FT_HANDLE, DWORD]
FT_W32_PurgeComm.__doc__ = \
"""BOOL FT_W32_PurgeComm(FT_HANDLE ftHandle, DWORD dwMask)
ftd2xx.h:1239"""
FT_W32_SetCommBreak = _libraries['ftd2xx64.dll'].FT_W32_SetCommBreak
FT_W32_SetCommBreak.restype = BOOL
# FT_W32_SetCommBreak(ftHandle)
FT_W32_SetCommBreak.argtypes = [FT_HANDLE]
FT_W32_SetCommBreak.__doc__ = \
"""BOOL FT_W32_SetCommBreak(FT_HANDLE ftHandle)
ftd2xx.h:1245"""
FT_W32_SetCommMask = _libraries['ftd2xx64.dll'].FT_W32_SetCommMask
FT_W32_SetCommMask.restype = BOOL
# FT_W32_SetCommMask(ftHandle, ulEventMask)
FT_W32_SetCommMask.argtypes = [FT_HANDLE, ULONG]
FT_W32_SetCommMask.__doc__ = \
"""BOOL FT_W32_SetCommMask(FT_HANDLE ftHandle, ULONG ulEventMask)
ftd2xx.h:1250"""
FT_W32_GetCommMask = _libraries['ftd2xx64.dll'].FT_W32_GetCommMask
FT_W32_GetCommMask.restype = BOOL
# FT_W32_GetCommMask(ftHandle, lpdwEventMask)
FT_W32_GetCommMask.argtypes = [FT_HANDLE, LPDWORD]
FT_W32_GetCommMask.__doc__ = \
"""BOOL FT_W32_GetCommMask(FT_HANDLE ftHandle, LPDWORD lpdwEventMask)
ftd2xx.h:1256"""
FT_W32_SetCommState = _libraries['ftd2xx64.dll'].FT_W32_SetCommState
FT_W32_SetCommState.restype = BOOL
# FT_W32_SetCommState(ftHandle, lpftDcb)
FT_W32_SetCommState.argtypes = [FT_HANDLE, LPFTDCB]
FT_W32_SetCommState.__doc__ = \
"""BOOL FT_W32_SetCommState(FT_HANDLE ftHandle, LPFTDCB lpftDcb)
ftd2xx.h:1262"""
FT_W32_SetCommTimeouts = _libraries['ftd2xx64.dll'].FT_W32_SetCommTimeouts
FT_W32_SetCommTimeouts.restype = BOOL
# FT_W32_SetCommTimeouts(ftHandle, pTimeouts)
FT_W32_SetCommTimeouts.argtypes = [FT_HANDLE, POINTER_T(struct__FTTIMEOUTS)]
FT_W32_SetCommTimeouts.__doc__ = \
"""BOOL FT_W32_SetCommTimeouts(FT_HANDLE ftHandle, LP_struct__FTTIMEOUTS pTimeouts)
ftd2xx.h:1268"""
FT_W32_SetupComm = _libraries['ftd2xx64.dll'].FT_W32_SetupComm
FT_W32_SetupComm.restype = BOOL
# FT_W32_SetupComm(ftHandle, dwReadBufferSize, dwWriteBufferSize)
FT_W32_SetupComm.argtypes = [FT_HANDLE, DWORD, DWORD]
FT_W32_SetupComm.__doc__ = \
"""BOOL FT_W32_SetupComm(FT_HANDLE ftHandle, DWORD dwReadBufferSize, DWORD dwWriteBufferSize)
ftd2xx.h:1274"""
FT_W32_WaitCommEvent = _libraries['ftd2xx64.dll'].FT_W32_WaitCommEvent
FT_W32_WaitCommEvent.restype = BOOL
# FT_W32_WaitCommEvent(ftHandle, pulEvent, lpOverlapped)
FT_W32_WaitCommEvent.argtypes = [FT_HANDLE, PULONG, LPOVERLAPPED]
FT_W32_WaitCommEvent.__doc__ = \
"""BOOL FT_W32_WaitCommEvent(FT_HANDLE ftHandle, PULONG pulEvent, LPOVERLAPPED lpOverlapped)
ftd2xx.h:1281"""
class struct__ft_device_list_info_node(ctypes.Structure):
    _pack_ = True # source:False
    _fields_ = [
        ('Flags', ctypes.c_uint32),
        ('Type', ctypes.c_uint32),
        ('ID', ctypes.c_uint32),
        ('LocId', ctypes.c_uint32),
        ('SerialNumber', ctypes.c_char * 16),
        ('Description', ctypes.c_char * 64),
        ('ftHandle', POINTER_T(None)),
    ]
FT_DEVICE_LIST_INFO_NODE = struct__ft_device_list_info_node
# values for enumeration 'c__Ea_FT_FLAGS_OPENED'
FT_FLAGS_OPENED = 1
FT_FLAGS_HISPEED = 2
c__Ea_FT_FLAGS_OPENED = ctypes.c_int # enum
FT_CreateDeviceInfoList = _libraries['ftd2xx64.dll'].FT_CreateDeviceInfoList
FT_CreateDeviceInfoList.restype = FT_STATUS
# FT_CreateDeviceInfoList(lpdwNumDevs)
FT_CreateDeviceInfoList.argtypes = [LPDWORD]
FT_CreateDeviceInfoList.__doc__ = \
"""FT_STATUS FT_CreateDeviceInfoList(LPDWORD lpdwNumDevs)
ftd2xx.h:1310"""
FT_GetDeviceInfoList = _libraries['ftd2xx64.dll'].FT_GetDeviceInfoList
FT_GetDeviceInfoList.restype = FT_STATUS
# FT_GetDeviceInfoList(pDest, lpdwNumDevs)
FT_GetDeviceInfoList.argtypes = [POINTER_T(struct__ft_device_list_info_node), LPDWORD]
FT_GetDeviceInfoList.__doc__ = \
"""FT_STATUS FT_GetDeviceInfoList(LP_struct__ft_device_list_info_node pDest, LPDWORD lpdwNumDevs)
ftd2xx.h:1315"""
FT_GetDeviceInfoDetail = _libraries['ftd2xx64.dll'].FT_GetDeviceInfoDetail
FT_GetDeviceInfoDetail.restype = FT_STATUS
# FT_GetDeviceInfoDetail(dwIndex, lpdwFlags, lpdwType, lpdwID, lpdwLocId, lpSerialNumber, lpDescription, pftHandle)
FT_GetDeviceInfoDetail.argtypes = [DWORD, LPDWORD, LPDWORD, LPDWORD, LPDWORD, LPVOID, LPVOID, POINTER_T(POINTER_T(None))]
FT_GetDeviceInfoDetail.__doc__ = \
"""FT_STATUS FT_GetDeviceInfoDetail(DWORD dwIndex, LPDWORD lpdwFlags, LPDWORD lpdwType, LPDWORD lpdwID, LPDWORD lpdwLocId, LPVOID lpSerialNumber, LPVOID lpDescription, LP_LP_None pftHandle)
ftd2xx.h:1321"""
FT_GetDriverVersion = _libraries['ftd2xx64.dll'].FT_GetDriverVersion
FT_GetDriverVersion.restype = FT_STATUS
# FT_GetDriverVersion(ftHandle, lpdwVersion)
FT_GetDriverVersion.argtypes = [FT_HANDLE, LPDWORD]
FT_GetDriverVersion.__doc__ = \
"""FT_STATUS FT_GetDriverVersion(FT_HANDLE ftHandle, LPDWORD lpdwVersion)
ftd2xx.h:1338"""
FT_GetLibraryVersion = _libraries['ftd2xx64.dll'].FT_GetLibraryVersion
FT_GetLibraryVersion.restype = FT_STATUS
# FT_GetLibraryVersion(lpdwVersion)
FT_GetLibraryVersion.argtypes = [LPDWORD]
FT_GetLibraryVersion.__doc__ = \
"""FT_STATUS FT_GetLibraryVersion(LPDWORD lpdwVersion)
ftd2xx.h:1344"""
FT_Rescan = _libraries['ftd2xx64.dll'].FT_Rescan
FT_Rescan.restype = FT_STATUS
# FT_Rescan()
FT_Rescan.argtypes = []
FT_Rescan.__doc__ = \
"""FT_STATUS FT_Rescan()
ftd2xx.h:1350"""
FT_Reload = _libraries['ftd2xx64.dll'].FT_Reload
FT_Reload.restype = FT_STATUS
# FT_Reload(wVid, wPid)
FT_Reload.argtypes = [WORD, WORD]
FT_Reload.__doc__ = \
"""FT_STATUS FT_Reload(WORD wVid, WORD wPid)
ftd2xx.h:1355"""
FT_GetComPortNumber = _libraries['ftd2xx64.dll'].FT_GetComPortNumber
FT_GetComPortNumber.restype = FT_STATUS
# FT_GetComPortNumber(ftHandle, lpdwComPortNumber)
FT_GetComPortNumber.argtypes = [FT_HANDLE, LPLONG]
FT_GetComPortNumber.__doc__ = \
"""FT_STATUS FT_GetComPortNumber(FT_HANDLE ftHandle, LPLONG lpdwComPortNumber)
ftd2xx.h:1361"""
FT_EE_ReadConfig = _libraries['ftd2xx64.dll'].FT_EE_ReadConfig
FT_EE_ReadConfig.restype = FT_STATUS
# FT_EE_ReadConfig(ftHandle, ucAddress, pucValue)
FT_EE_ReadConfig.argtypes = [FT_HANDLE, UCHAR, PUCHAR]
FT_EE_ReadConfig.__doc__ = \
"""FT_STATUS FT_EE_ReadConfig(FT_HANDLE ftHandle, UCHAR ucAddress, PUCHAR pucValue)
ftd2xx.h:1372"""
FT_EE_WriteConfig = _libraries['ftd2xx64.dll'].FT_EE_WriteConfig
FT_EE_WriteConfig.restype = FT_STATUS
# FT_EE_WriteConfig(ftHandle, ucAddress, ucValue)
FT_EE_WriteConfig.argtypes = [FT_HANDLE, UCHAR, UCHAR]
FT_EE_WriteConfig.__doc__ = \
"""FT_STATUS FT_EE_WriteConfig(FT_HANDLE ftHandle, UCHAR ucAddress, UCHAR ucValue)
ftd2xx.h:1379"""
FT_EE_ReadECC = _libraries['ftd2xx64.dll'].FT_EE_ReadECC
FT_EE_ReadECC.restype = FT_STATUS
# FT_EE_ReadECC(ftHandle, ucOption, lpwValue)
FT_EE_ReadECC.argtypes = [FT_HANDLE, UCHAR, LPWORD]
FT_EE_ReadECC.__doc__ = \
"""FT_STATUS FT_EE_ReadECC(FT_HANDLE ftHandle, UCHAR ucOption, LPWORD lpwValue)
ftd2xx.h:1386"""
FT_GetQueueStatusEx = _libraries['ftd2xx64.dll'].FT_GetQueueStatusEx
FT_GetQueueStatusEx.restype = FT_STATUS
# FT_GetQueueStatusEx(ftHandle, dwRxBytes)
FT_GetQueueStatusEx.argtypes = [FT_HANDLE, POINTER_T(ctypes.c_uint32)]
FT_GetQueueStatusEx.__doc__ = \
"""FT_STATUS FT_GetQueueStatusEx(FT_HANDLE ftHandle, LP_c_uint32 dwRxBytes)
ftd2xx.h:1393"""
FT_ComPortIdle = _libraries['ftd2xx64.dll'].FT_ComPortIdle
FT_ComPortIdle.restype = FT_STATUS
# FT_ComPortIdle(ftHandle)
FT_ComPortIdle.argtypes = [FT_HANDLE]
FT_ComPortIdle.__doc__ = \
"""FT_STATUS FT_ComPortIdle(FT_HANDLE ftHandle)
ftd2xx.h:1399"""
FT_ComPortCancelIdle = _libraries['ftd2xx64.dll'].FT_ComPortCancelIdle
FT_ComPortCancelIdle.restype = FT_STATUS
# FT_ComPortCancelIdle(ftHandle)
FT_ComPortCancelIdle.argtypes = [FT_HANDLE]
FT_ComPortCancelIdle.__doc__ = \
"""FT_STATUS FT_ComPortCancelIdle(FT_HANDLE ftHandle)
ftd2xx.h:1404"""
FT_VendorCmdGet = _libraries['ftd2xx64.dll'].FT_VendorCmdGet
FT_VendorCmdGet.restype = FT_STATUS
# FT_VendorCmdGet(ftHandle, Request, Buf, Len)
FT_VendorCmdGet.argtypes = [FT_HANDLE, UCHAR, POINTER_T(ctypes.c_ubyte), USHORT]
FT_VendorCmdGet.__doc__ = \
"""FT_STATUS FT_VendorCmdGet(FT_HANDLE ftHandle, UCHAR Request, LP_c_ubyte Buf, USHORT Len)
ftd2xx.h:1409"""
FT_VendorCmdSet = _libraries['ftd2xx64.dll'].FT_VendorCmdSet
FT_VendorCmdSet.restype = FT_STATUS
# FT_VendorCmdSet(ftHandle, Request, Buf, Len)
FT_VendorCmdSet.argtypes = [FT_HANDLE, UCHAR, POINTER_T(ctypes.c_ubyte), USHORT]
FT_VendorCmdSet.__doc__ = \
"""FT_STATUS FT_VendorCmdSet(FT_HANDLE ftHandle, UCHAR Request, LP_c_ubyte Buf, USHORT Len)
ftd2xx.h:1417"""
FT_VendorCmdGetEx = _libraries['ftd2xx64.dll'].FT_VendorCmdGetEx
FT_VendorCmdGetEx.restype = FT_STATUS
# FT_VendorCmdGetEx(ftHandle, wValue, Buf, Len)
FT_VendorCmdGetEx.argtypes = [FT_HANDLE, USHORT, POINTER_T(ctypes.c_ubyte), USHORT]
FT_VendorCmdGetEx.__doc__ = \
"""FT_STATUS FT_VendorCmdGetEx(FT_HANDLE ftHandle, USHORT wValue, LP_c_ubyte Buf, USHORT Len)
ftd2xx.h:1425"""
FT_VendorCmdSetEx = _libraries['ftd2xx64.dll'].FT_VendorCmdSetEx
FT_VendorCmdSetEx.restype = FT_STATUS
# FT_VendorCmdSetEx(ftHandle, wValue, Buf, Len)
FT_VendorCmdSetEx.argtypes = [FT_HANDLE, USHORT, POINTER_T(ctypes.c_ubyte), USHORT]
FT_VendorCmdSetEx.__doc__ = \
"""FT_STATUS FT_VendorCmdSetEx(FT_HANDLE ftHandle, USHORT wValue, LP_c_ubyte Buf, USHORT Len)
ftd2xx.h:1433"""
__all__ = \
['struct_ft_eeprom_232r', 'FT_SetDtr', 'FT_INVALID_BAUD_RATE',
'FT_EEPROM_NOT_PRESENT', 'FT_DEVICE_232R', 'PULONG',
'FT_GetBitMode', 'FT_EE_ReadECC', 'PFT_EEPROM_2232H',
'FT_EEPROM_2232', 'FT_EE_UARead', 'FT_CyclePort',
'FT_EEPROM_X_SERIES', 'FT_W32_ReadFile', 'FT_DEVICE_4222_PROG',
'FT_WriteEE', 'struct_ft_eeprom_4232h', 'FT_VendorCmdGet',
'FT_EE_ReadEx', 'FT_DEVICE_930', 'FT_EraseEE', 'PFT_EEPROM_4232H',
'FT_DEVICE_NOT_FOUND', 'PFT_EEPROM_232B', 'FT_W32_SetCommMask',
'PUCHAR', 'FT_SetBreakOff', 'FT_EE_ProgramEx',
'FT_ComPortCancelIdle', 'c__Ea_FT_OK', 'PFT_EEPROM_X_SERIES',
'struct__FTDCB', 'FT_W32_GetOverlappedResult',
'FT_EEPROM_READ_FAILED', 'FT_SetWaitMask', 'FT_DEVICE',
'FT_EE_Read', 'FT_W32_CancelIo', 'FT_DEVICE_NOT_OPENED',
'FT_DEVICE_NOT_OPENED_FOR_ERASE', 'c__Ea_FT_FLAGS_OPENED',
'FT_GetDeviceInfoDetail', 'union__OVERLAPPED_0', 'FT_ListDevices',
'LPLONG', 'FT_W32_GetCommMask', 'FT_DEVICE_X_SERIES',
'FT_W32_ClearCommBreak', 'FT_ClrRts', 'FT_INVALID_PARAMETER',
'struct_ft_eeprom_232h', 'FT_GetDriverVersion',
'FT_INSUFFICIENT_RESOURCES', 'FT_RestartInTask',
'FT_W32_ClearCommError', 'FT_OTHER_ERROR', 'FT_SetRts',
'FT_DEVICE_4222H_0', 'FT_GetQueueStatusEx',
'FT_SetDataCharacteristics', 'struct_ft_eeprom_2232', 'PVOID',
'FT_W32_GetCommModemStatus', 'FT_DEVICE_100AX',
'FT_W32_WriteFile', 'FT_GetDeviceInfo', 'LPFTDCB',
'FT_EEPROM_WRITE_FAILED', 'FT_W32_GetCommTimeouts',
'PFT_PROGRAM_DATA', 'LPFTTIMEOUTS', 'FT_EEPROM_Read', 'BOOL',
'FT_DEVICE_4222H_1_2', 'FT_DEVICE_LIST_INFO_NODE',
'FT_GetComPortNumber', 'FT_INVALID_ARGS', 'FT_EE_WriteConfig',
'struct_ft_program_data', 'FT_DEVICE_LIST_NOT_READY',
'FT_WaitOnMask', 'FT_FAILED_TO_WRITE_DEVICE',
'FT_SetDeadmanTimeout', 'FT_StopInTask', 'struct__FTCOMSTAT',
'FT_EEPROM_NOT_PROGRAMMED', 'FT_GetModemStatus', 'LPDWORD',
'struct_ft_eeprom_2232h', 'FT_SetFlowControl', 'FT_EEPROM_2232H',
'PFT_EEPROM_2232', 'FT_EE_Program', 'FT_VendorCmdSet', 'FT_Purge',
'LPCTSTR', 'FT_GetQueueStatus', 'FT_SetEventNotification',
'FT_EEPROM_Program', 'FT_W32_PurgeComm', 'FT_GetLatencyTimer',
'FT_DEVICE_232H', 'FT_SetDivisor', 'PCHAR', 'HANDLE',
'struct_ft_eeprom_header', 'FTTIMEOUTS', 'FT_IO_ERROR',
'FT_ReadEE', 'USHORT', 'struct_ft_eeprom_x_series', 'FT_STATUS',
'FT_Close', 'struct__OVERLAPPED', 'FT_DEVICE_UMFTPD3A',
'FT_W32_CreateFile', 'struct__ft_device_list_info_node',
'FT_ComPortIdle', 'c__Ea_FT_DEVICE_BM', 'FT_Reload', 'WORD',
'FT_EE_ReadConfig', 'FT_SetBaudRate', 'FT_EEPROM_232B', 'FT_OK',
'ULONG', 'FT_OpenEx', 'FT_SetUSBParameters',
'FT_W32_GetLastError', 'FT_W32_EscapeCommFunction', 'FT_Open',
'FT_DEVICE_NOT_OPENED_FOR_WRITE', 'FT_SetChars',
'FT_DEVICE_4232H', 'struct__FTTIMEOUTS', 'FT_DEVICE_BM',
'FT_EEPROM_HEADER', 'struct__OVERLAPPED_0_0', 'FT_HANDLE',
'PFT_EVENT_HANDLER', 'FT_ClrDtr', 'FT_W32_SetCommState',
'FT_W32_WaitCommEvent', 'FT_GetLibraryVersion', 'FT_SetBitMode',
'FT_DEVICE_AM', 'struct_ft_eeprom_232b', 'FT_EEPROM_232R',
'FT_EEPROM_4232H', 'FT_Write', 'FT_W32_GetCommState',
'FT_DEVICE_2232H', 'PFT_EEPROM_HEADER', 'FT_W32_CloseHandle',
'PFT_EEPROM_232H', 'FT_W32_SetCommTimeouts', 'FT_EE_UASize',
'LPVOID', 'FT_DEVICE_900', 'LPOVERLAPPED',
'FT_CreateDeviceInfoList', 'LPSECURITY_ATTRIBUTES',
'struct__SECURITY_ATTRIBUTES', 'FT_W32_SetupComm',
'FT_VendorCmdGetEx', 'LPFTCOMSTAT', 'FT_VendorCmdSetEx',
'FT_EEPROM_ERASE_FAILED', 'FT_PROGRAM_DATA',
'FT_SetResetPipeRetryCount', 'UCHAR', 'FT_DEVICE_2232C',
'FT_FLAGS_HISPEED', 'FT_DEVICE_UNKNOWN', 'FT_SetLatencyTimer',
'FT_ResetDevice', 'FT_GetEventStatus', 'DWORD',
'FT_INVALID_HANDLE', 'FT_GetStatus', 'FT_EE_UAWrite',
'FT_SetBreakOn', 'FT_FLAGS_OPENED', 'FT_W32_SetCommBreak',
'FT_Rescan', 'LPWORD', 'FT_DEVICE_4222H_3', 'FT_SetTimeouts',
'PFT_EEPROM_232R', 'FT_IoCtl', 'FT_GetDeviceInfoList',
'FT_NOT_SUPPORTED', 'FT_ResetPort', 'FTDCB', 'FT_EEPROM_232H',
'FTCOMSTAT', 'FT_Read']
# -*- coding: utf-8 -*-
# *****************************************************************************
# NICOS, the Networked Instrument Control System of the MLZ
# Copyright (c) 2009-2021 by the NICOS contributors (see AUTHORS)
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Georg Brandl <georg.brandl@frm2.tum.de>
# Alexander Lenz <alexander.lenz@frm2.tum.de>
# Christian Felder <c.felder@fz-juelich.de>
# Björn Pedersen <bjoern.pedersen@frm2.tum.de>
#
# *****************************************************************************
from nicos.core import ACCESS_LEVELS
def _access_level_list():
return ', '.join(repr(l) for l in ACCESS_LEVELS.values())
def UserPassLevelAuthEntry(val=None):
"""Provide a 3-tuple of user, password, and level
* user: string
* password: string
* level: oneof(ACCESS_LEVELS)
currently: GUEST, USER, ADMIN
"""
val = list(val)
if len(val) != 3:
raise ValueError('UserPassLevelAuthEntry entry needs to be '
'a 3-tuple (name, password, accesslevel)')
if not isinstance(val[0], str):
raise ValueError('user name must be a string')
val[0] = val[0].strip()
if not isinstance(val[1], str):
raise ValueError('user password must be a string')
val[1] = val[1].strip()
if isinstance(val[2], str):
for i, name in ACCESS_LEVELS.items():
if name == val[2].strip():
val[2] = i
break
else:
raise ValueError('access level must be one of %s' % _access_level_list())
    elif not isinstance(val[2], int):
        raise ValueError('access level must be one of %s' % _access_level_list())
    else:
        # for backwards compatibility: allow integer values as well
        if val[2] not in ACCESS_LEVELS:
            raise ValueError('access level must be one of %s' % _access_level_list())
return tuple(val)
def UserLevelAuthEntry(val=None):
"""Provide a 2-tuple of user and level
* user: string
* level: oneof(ACCESS_LEVELS)
currently: GUEST, USER, ADMIN
"""
if len(val) != 2:
raise ValueError('UserLevelAuthEntry entry needs to be a 2-tuple '
'(name, accesslevel)')
# pylint: disable=unbalanced-tuple-unpacking
user, _p, level = UserPassLevelAuthEntry((val[0], '', val[1]))
return tuple((user, level))
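The level-normalization logic in `UserPassLevelAuthEntry` (accept either a level name or its numeric key, validated against `ACCESS_LEVELS`) can be sketched on its own. The mapping below is a hypothetical stand-in; the real one is imported from `nicos.core`:

```python
# Hypothetical stand-in for nicos.core.ACCESS_LEVELS (numeric key -> name).
ACCESS_LEVELS = {0: 'guest', 10: 'user', 20: 'admin'}

def normalize_level(level):
    # Mirror of the str/int branches in UserPassLevelAuthEntry above.
    if isinstance(level, str):
        for key, name in ACCESS_LEVELS.items():
            if name == level.strip():
                return key
        raise ValueError('access level must be one of %s'
                         % ', '.join(repr(n) for n in ACCESS_LEVELS.values()))
    if level not in ACCESS_LEVELS:
        raise ValueError('access level must be one of %s'
                         % ', '.join(repr(n) for n in ACCESS_LEVELS.values()))
    return level

print(normalize_level('admin'), normalize_level(10))
```

Strings are stripped before comparison, so entries like `' user '` still resolve.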
| 37.780488 | 85 | 0.619109 | 408 | 3,098 | 4.654412 | 0.397059 | 0.040548 | 0.031596 | 0.030016 | 0.21327 | 0.16693 | 0.137441 | 0.137441 | 0.137441 | 0.083728 | 0 | 0.019393 | 0.234345 | 3,098 | 81 | 86 | 38.246914 | 0.781197 | 0.49419 | 0 | 0.151515 | 0 | 0 | 0.200957 | 0.015038 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.151515 | 0.030303 | 0.030303 | 0.212121 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
ba8eb6b446da9b5861b5d55aaa9e826bc5dac1a4 | 894 | py | Python | tests/test_doctest.py | allaudet/python-diskcache | 2774689c60bac3ebd06246943bca2014779ee2c6 | [
"Apache-2.0"
] | null | null | null | tests/test_doctest.py | allaudet/python-diskcache | 2774689c60bac3ebd06246943bca2014779ee2c6 | [
"Apache-2.0"
] | null | null | null | tests/test_doctest.py | allaudet/python-diskcache | 2774689c60bac3ebd06246943bca2014779ee2c6 | [
"Apache-2.0"
] | null | null | null | import doctest
import shutil
import diskcache.core
import diskcache.djangocache
import diskcache.fanout
import diskcache.memo
import diskcache.persistent
def rmdir(directory):
try:
shutil.rmtree(directory)
except OSError:
pass
def test_core():
rmdir('/tmp/diskcache')
failures, _ = doctest.testmod(diskcache.core)
assert failures == 0
def test_djangocache():
rmdir('/tmp/diskcache')
failures, _ = doctest.testmod(diskcache.djangocache)
assert failures == 0
def test_fanout():
rmdir('/tmp/diskcache')
failures, _ = doctest.testmod(diskcache.fanout)
assert failures == 0
def test_memo():
rmdir('/tmp/diskcache')
failures, _ = doctest.testmod(diskcache.memo)
assert failures == 0
def test_persistent():
rmdir('/tmp/diskcache')
failures, _ = doctest.testmod(diskcache.persistent)
assert failures == 0
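Each test above follows the same pattern: wipe the cache directory, run the module's docstring examples, and assert zero failures. A self-contained illustration of driving doctest programmatically, using a toy function rather than the diskcache modules:

```python
import doctest

def double(x):
    """
    >>> double(21)
    42
    """
    return 2 * x

# Parse and run the docstring examples explicitly (doctest.testmod does
# the same over a whole module).
parser = doctest.DocTestParser()
test = parser.get_doctest(double.__doc__, {'double': double}, 'double', None, 0)
runner = doctest.DocTestRunner(verbose=False)
runner.run(test)
print(runner.failures, runner.tries)
```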
| 19.434783 | 56 | 0.696868 | 99 | 894 | 6.191919 | 0.232323 | 0.122349 | 0.138662 | 0.203915 | 0.535073 | 0.391517 | 0.391517 | 0 | 0 | 0 | 0 | 0.006935 | 0.193512 | 894 | 45 | 57 | 19.866667 | 0.843273 | 0 | 0 | 0.3125 | 0 | 0 | 0.0783 | 0 | 0 | 0 | 0 | 0 | 0.15625 | 1 | 0.1875 | false | 0.03125 | 0.21875 | 0 | 0.40625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba902e13fa38f2d48dee6ee78ad8615cef0e6563 | 591 | py | Python | OverApp/migrations/0004_auto_20160617_1924.py | sheissage/overnightasiasg | 83351e51b26a0ff3759030d5165cc87e987ff07d | [
"Apache-2.0"
] | null | null | null | OverApp/migrations/0004_auto_20160617_1924.py | sheissage/overnightasiasg | 83351e51b26a0ff3759030d5165cc87e987ff07d | [
"Apache-2.0"
] | null | null | null | OverApp/migrations/0004_auto_20160617_1924.py | sheissage/overnightasiasg | 83351e51b26a0ff3759030d5165cc87e987ff07d | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.9.2 on 2016-06-17 19:24
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('OverApp', '0003_hotelavailability_currency'),
]
operations = [
migrations.RemoveField(
model_name='hotelavailability',
name='currency',
),
migrations.AddField(
model_name='hotelinfo',
name='currency',
field=models.CharField(default='USD', max_length=3),
),
]
| 23.64 | 64 | 0.604061 | 58 | 591 | 5.982759 | 0.741379 | 0.051873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049296 | 0.279188 | 591 | 24 | 65 | 24.625 | 0.765258 | 0.113367 | 0 | 0.235294 | 1 | 0 | 0.159309 | 0.059501 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.117647 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba929360696b6edf5a6ea4d1c28a8dae5537d68d | 466 | py | Python | detect.py | nemero/py_neural | 87f151097f8c331a06f13b96c4cec9a1ee663abf | [
"MIT"
] | null | null | null | detect.py | nemero/py_neural | 87f151097f8c331a06f13b96c4cec9a1ee663abf | [
"MIT"
] | 1 | 2017-01-18T18:35:03.000Z | 2017-01-25T08:55:49.000Z | detect.py | nemero/py_neural | 87f151097f8c331a06f13b96c4cec9a1ee663abf | [
"MIT"
] | null | null | null | # coding: utf8
import numpy as np
from loadsample import *
np.set_printoptions(suppress=True)
def nonlin(x,deriv=False):
if(deriv==True):
return x*(1-x)
return 1/(1+np.exp(-x))
# load synapse weights
syn0 = np.load('synapse/syn0.npy')
syn1 = np.load('synapse/syn1.npy')
X = np.array(collection,dtype=float)
# forward pass through layers 0, 1 and 2
l0 = X
l1 = nonlin(np.dot(l0,syn0))
l2 = nonlin(np.dot(l1,syn1))
print teacher
print l2
#print syn0
#print syn1 | 16.642857 | 36 | 0.684549 | 82 | 466 | 3.878049 | 0.536585 | 0.037736 | 0.081761 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053846 | 0.16309 | 466 | 28 | 37 | 16.642857 | 0.761538 | 0.167382 | 0 | 0 | 0 | 0 | 0.083551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.133333 | null | null | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
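The script above is a plain feed-forward pass: each layer multiplies by a weight matrix and squashes through the sigmoid `nonlin`. A self-contained sketch with random stand-in weights (the real `syn0`/`syn1` are loaded from `.npy` files):

```python
import numpy as np

def nonlin(x, deriv=False):
    # sigmoid, or its derivative expressed in terms of the sigmoid output
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

rng = np.random.RandomState(0)
X = np.array([[0.5, 0.1, -0.3]])    # one sample, three features
syn0 = rng.uniform(-1, 1, (3, 4))   # stand-in weights, 3 inputs -> 4 hidden
syn1 = rng.uniform(-1, 1, (4, 2))   # stand-in weights, 4 hidden -> 2 outputs
l1 = nonlin(X.dot(syn0))            # hidden layer activations
l2 = nonlin(l1.dot(syn1))           # output layer activations
print(l2.shape)
```

Every activation lands in (0, 1) because of the sigmoid.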
ba97cdb62a5afe7e84e6d10f9deb473a487ebf38 | 12,194 | py | Python | Main.py | nntropy/HonoursProject | 7fa62ffb65b386706b9f99ccb83f20800393e077 | [
"Apache-2.0"
] | null | null | null | Main.py | nntropy/HonoursProject | 7fa62ffb65b386706b9f99ccb83f20800393e077 | [
"Apache-2.0"
] | null | null | null | Main.py | nntropy/HonoursProject | 7fa62ffb65b386706b9f99ccb83f20800393e077 | [
"Apache-2.0"
] | null | null | null | # this is the main method for my entire program.
import itertools
import math
import os
import sys
import numpy as np
import pandas as pd
import torch
import torch.utils.data
from Utility import feature_selection as fs
from Utility import ngram
from Utility.opcodeseq_creator import run_opcode_seq_creation
from models.BNN import train, test
import models.DBN as dbn
import xml.etree.ElementTree as et
import lxml.etree as ET
##
## (opt) INPUT: apk dir, tmp dir, opcode dir, fcbl dir, model dir
##
##
sys.path.insert(1, os.path.join(sys.path[0], '../..'))
def converttoopcode(apkdir, tmpdir, opdir, suplib):
    """
    :param apkdir: the directory to get apks from
    :param tmpdir: the temporary dir to store while converting to opcode
    :param opdir: the directory to store the opcodeseq
    :param suplib: whether to use support libraries. default is no
    output: None
    This function takes apkdir, tmpdir, opdir and suplib and converts all apk's in apkdir to opcode sequences
    in opdir. See opcode based malware detection for more details.
    """
    # Simple function. Just run opcode seq creation
    run_opcode_seq_creation.main(apkdir=apkdir, tmpdir=tmpdir, opdir=opdir, suplib=suplib)
def getdataset(dire, nograms):
"""
:param dire: dir of texts to be converted to ngrams (opcodeseq file)
:param nograms: the number (n) of ngrams to be split into
output: None
"""
# appends all txtfiles in array
txt = []
num = 0
    # Second param doesn't matter right now, it's just for easy customizability
# Get all opseq locations
for txtfile in os.listdir(dire):
if os.path.isfile(os.path.join(dire, txtfile)):
txt.append(txtfile)
nofiles = len(txt)
perms = pd.DataFrame(columns=['Malware'])
print("Filling dataframe with counts")
# Go into opseq locations
for txt_file in txt:
# Get all possible ngrams for a file
if num % 40 == 0 and num != 0:
print(str(num) + " of " + str(nofiles) + "files have been finished.")
tmp, nowords = getngrams(txt_file, dire, nograms)
# if none of first two are digits, it's benign
if not (txt_file[0].isdigit()) and not (txt_file[1].isdigit()):
perms.at[num, 'Malware'] = 0
else:
perms.at[num, 'Malware'] = 1
        for gram in tmp:
            if gram not in perms.columns:
                perms[gram] = 0.0  # create the n-gram column zero-filled
            current = perms.at[num, gram]
            if pd.isna(current):
                current = 0.0
            perms.at[num, gram] = current + (1 / nowords)
num += 1
print("Extracting features")
# Reduces it to 2048 features
features = feature_extraction(perms)
print("Creating input for model")
nninput = pd.DataFrame(index=[np.arange(nofiles)], columns=[features.columns])
for col in features:
nninput[[col]] = perms[[col]].values
print(nninput)
dataset = torch.tensor(nninput[:].values).double()
dataset = dataset[torch.randperm(dataset.size()[0])]
print("Input for model created")
return dataset
def getperms():
    pass  # unused stub
def feature_extraction(arr):
# Drops malware column
X = arr.drop('Malware', axis=1)
# Initialize array to store values
val = pd.DataFrame(index=[0], columns=X.columns, dtype="float64")
# Gets malware column
Y = arr['Malware']
# Gets SU of all features
for feature in X:
Fi = X[feature]
val.loc[0, feature] = fs.SU(Fi, Y)
# Sort it based on index 0 value 0 (.at[0,:]
result = val.sort_values(0, axis=1, ascending=False)
    # Get top 2048 values
result = result.iloc[:, 0:2048]
result[['Malware']] = arr[['Malware']]
return result
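`feature_extraction` ranks every n-gram column by its symmetrical uncertainty with the malware label (`fs.SU`) and keeps the top 2048. A stdlib-only sketch of one common SU definition, SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)); the project's `fs.SU` may differ in detail:

```python
import math
from collections import Counter

def entropy(values):
    # Shannon entropy (base 2) of a discrete sample.
    n = len(values)
    return -sum(c / n * math.log(c / n, 2) for c in Counter(values).values())

def symmetric_uncertainty(x, y):
    # SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)), with IG = H(X) + H(Y) - H(X, Y)
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))
    ig = hx + hy - hxy
    denom = hx + hy
    return 2 * ig / denom if denom else 0.0

print(symmetric_uncertainty([0, 0, 1, 1], [0, 0, 1, 1]))  # identical features
print(symmetric_uncertainty([0, 1, 0, 1], [0, 0, 1, 1]))  # independent features
```

SU is 1 for perfectly correlated features and 0 for independent ones, which is what makes it usable as a ranking score here.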
def getngrams(fname, dire, nograms):
    """
    :param fname: file name to be read
    :param dire: directory containing the file
    :param nograms: number of grams wanted to be partitioned into
    :return: list of all grams in the file
    """
    ngram_generator = ngram.Ngram()
    ngramcomplete = []
# open file
fname = os.path.join(dire, fname)
with open(fname, mode="r") as bigfile:
# read it as txt
reader = bigfile.read()
# removes newlines as to not mess with n-gram
reader = reader.replace('\\', "")
reader = reader.replace('\n', "")
# append list to list
nowords = len(reader) // 2
ngramcomplete = ngram_generator.letterNgram(reader, nograms)
return ngramcomplete, nowords
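`getngrams` ultimately reduces to a sliding character window over the cleaned opcode text. A minimal stand-in for `Utility.ngram.Ngram.letterNgram` (the real implementation may differ):

```python
def letter_ngrams(text, n):
    # Sliding character window: every contiguous substring of length n.
    return [text[i:i + n] for i in range(len(text) - n + 1)]

print(letter_ngrams('0A1B2C', 4))
```

For a string of length L this yields L - n + 1 grams, which is why the counts above are normalized by the number of opcode words.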
###############################################
# UNUSED, decided to use another one as data #
# is just too sparse. I would look into using #
# elmo. Issue is takes forever to train. #
###############################################
def converttofcbl(csvdir, nofeatures):
    """
    :param csvdir: the directory of the csv files
    :param nofeatures: the number of features you want the csv to be reduced to
    output: numpy array with proper attributes
    This function takes a csvdir and the nofeatures and takes all csv's in csvdir and reduces it to the number of
    features using entropy. See Fast Correlation Based Entropy for data dimensionality reduction for more information.
    """
    data = pd.read_csv(csvdir)
    # fs.fcbf(data, label, threshold = 0.2, base=2)
def parsepermissiondirectory(dire):
#Get list of all permissions (166 of them)
perms = getpermissions()
txt = []
i = 0
#get file names
for xmlfile in os.listdir(dire):
if os.path.isfile(os.path.join(dire, xmlfile)):
txt.append(xmlfile)
#get number of files
nofiles = len(txt)
#create dataframe
perm = pd.DataFrame(0, columns=perms + ['Malware'], index=np.arange(nofiles))
for xmlfile in txt:
# get malware flag
if (i % 50 == 0):
print("On file " + str(i))
if not (xmlfile[0].isdigit()) and not (xmlfile[1].isdigit()) and not (xmlfile[2].isdigit()) and not (xmlfile[3].isdigit()) \
and not (xmlfile[4].isdigit()) and not (xmlfile[5].isdigit()):
malwareflag=0
else:
malwareflag=1
# get permission array
permfile = parsepermissionfile(os.path.join(dire, xmlfile))
# append to big array
perm.at[i, 'Malware'] = malwareflag
for permission in permfile:
if permission in perm.columns:
perm.at[i, permission] = 1
i = i + 1
return perm
def parsepermissionfile(f):
"""
:param f: file to parse
:return: list of all perms in this file
"""
# get list of all possible permissions
t = getpermissions()
# list to add to
listofperms = []
# edge case where no manifest
if os.stat(f).st_size == 0:
return []
# parse tree
parser = ET.XMLParser(recover=True)
tree = ET.parse(f, parser=parser)
et.register_namespace('android', 'http://schemas.android.com/apk/res/android')
root = tree.getroot()
#loop through all permissions
for node in root.findall('uses-permission'):
# get all uses-permissions
list = node.items()
perm = list[0][1]
# make sure it's android.permission.PERMISSION
if "android.permission." in perm:
# get only permission, append to listofperms
onlyperm = perm.replace("android.permission.", "")
listofperms.append(onlyperm)
# replace values in dataframe
return listofperms
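The extraction in `parsepermissionfile` boils down to reading the first attribute of every `uses-permission` node and stripping the `android.permission.` prefix. A self-contained sketch on a simplified, hypothetical manifest (real manifests put the attribute in the `android:` namespace, which the code above registers):

```python
import xml.etree.ElementTree as ET

manifest = """
<manifest>
  <uses-permission name="android.permission.INTERNET"/>
  <uses-permission name="android.permission.CAMERA"/>
  <uses-permission name="com.example.CUSTOM"/>
</manifest>
"""
root = ET.fromstring(manifest)
perms = []
for node in root.findall('uses-permission'):
    value = node.items()[0][1]  # first attribute value, as in the code above
    if "android.permission." in value:
        perms.append(value.replace("android.permission.", ""))
print(perms)
```

Non-standard permissions (the `com.example.CUSTOM` entry) are dropped, matching the filtering above.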
def getpermissions():
t = ['ACCEPT_HANDOVER', 'ACCESS_BACKGROUND_LOCATION', 'ACCESS_CHECKIN_PROPERTIES', 'ACCESS_CHECKIN_PROPERTIES',
'ACCESS_COARSE_LOCATION', 'ACCESS_FINE_LOCATION', 'ACCESS_LOCATION_EXTRA_COMMANDS',
'ACCESS_MEDIA_LOCATION', 'ACCESS_NETWORK_STATE', 'ACCESS_NOTOFICATION_POLICY', 'ACCESS_WIFI_STATE',
'ACCOUNT_MANAGER', 'ACTIVITY_RECOGNITION', 'ADD_VOICEMAIL', 'ANSWER_PHONE_CALLS', 'BATTERY_STATE',
'BIND_ACCESSIBILITY_SERVICE'
, 'BIND_APPWIDGET', 'BIND_AUTOFILL_SERVICE', 'BIND_CALL_REDIRECTION_SERVICE',
'BIND_CARRIER_MESSAGING_CLIENT_SERVICE', 'BIND_CARRIER_MESSAING_SERVICE', 'BIND_CARRIER_SERVICES',
'BIND_CHOOSER_TARGET_SERVICE', 'BIND_CONDITION_PROVIDER_SERVICE', 'BIND_CONTROLS'
, 'BIND_DEVICE_ADMIN', 'BIND_DREAM_SERVICE', 'BIND_INCALL_SERVICE', 'BIND_INPUT_METHOD',
'BIND_MIDI_DEVICE_SERVICE', 'BIND_NFC_SERVICE', 'BIND_NOTIFICATION_LISTENER_SERVICE', 'BIND_PRINT_SERVICE',
'BIND_QUICK_ACCESS_WALLET_SERVICE',
'BIND_QUICK_SETTINGS_TILE', 'BIND_REMOTEVIEWS', 'BIND_SCREENING_SERVICE', 'BIND_TELECOM_CONNECTION_SERVICE',
'BIND_TEXT_SERVICE', 'BIND_TV_INPUT', 'BIND_VISUAL_VOICEMAIL_SERVICE', 'BIND_VOICE_INTERACTION',
'BIND_VPN_SERVICE',
'BIND_VR_LISTENER_SERVICE', 'BIND_WALLPAPER', 'BLUETOOTH', 'BLUETOOTH_ADMIN', 'BLUETOOTH_PRIVILEGED',
'BODY_SENSORS', 'BROADCAST_PACKAGE_REMOVED', 'BROADCAST_SMS', 'BROADCAST_STICKY', 'BROADCAST_WAP_PUSH',
'CALL_COMPANION_APP',
'CALL_PHONE', 'CALL_PRIVILEGED', 'CAMERA', 'CAPTURE_AUDIO_OUTPUT', 'CHANGE_COMPONENT_ENABLED_STATE',
'CHANGE_CONFIGURATION', 'CHANGE_NETWORK_STATE', 'CHANGE_WIFI_MULTICAST_STATE', 'CHANGE_WIFI_STATE',
'CLEAR_APP_CACHE', 'CONTROL_LOCATION_UPDATES', 'DELETE_CACHE_FILES', 'DELETE_PACKAGES', 'DIAGNOSTIC',
'DISABLE_KEYGUARD', 'DUMP', 'EXPAND_STATUS_BAR', 'FACTORY_TEST', 'FOREGROUND_SERVICE', 'GET_ACCOUNTS',
'GET_ACCOUNTS_PRIVILEGED', 'GET_PACKAGE_SIZE', 'GET_TASKS', 'GLOBAL_SEARCH', 'INSTALL_LOCATION_PROVIDER',
'INSTALL_PACKAGES', 'INSTALL_SHORTCUT', 'INSTANT_APP_FOREGROUND_SERVICE', 'INTERACT_ACROSS_PROFILES',
'INTERNET',
'KILL_BACKGROUND_PROCESSES', 'LOADER_USAGE_STATS', 'LOCATION_HARDWARE', 'MANAGE_DOCUMENTS',
'MANAGE_EXTERNAL_STORAGE', 'MANAGE_OWN_CALLS', 'MASTER_CLEAR', 'MEDIA_CONTENT_CONTROL',
'MODIFY_AUDIO_SETTINGS',
'MODIFY_PHONE_STATE', 'MOUNT_FORMAT_FILESYSTEMS', 'MOUNT_UNMOUN_FILESYSTEMS', 'NFC',
'NFC_PREFERRED_PAYMENT_INFO', 'NFC_TRANSACTION_EVENT', 'PACKAGE_USAGE_STATS', 'PERSISTENT_ACTIVITY',
'PROCESS_OUTGOING_CALLS', 'QUERY_ALL_PACKAGES',
'READ_CALENDER', 'READ_CALL_LOG', 'READ_CONTACTS', 'READ_EXTERNAL_STORAGE', 'READ_INPUT_STATE', 'READ_LOGS',
'READ_PHONE_NUMBERS', 'READ_PHONE_STATE', 'READ_PRECISE_PHONE_STATE', 'READ_SMS', 'READ_SYNC_SETTINGS',
'READ_VOICEMAIL', 'REBOOT', 'RECEIVE_BOOT_COMPLETED', 'RECEIVE_MMS', 'RECEIVE_SMS', 'RECEIVE_WAP_PUSH',
'RECORD_AUDIO', 'REORDER_TASKS', 'REQUEST_COMPANION_RUN_IN_BACKGROUND',
'REQUEST_COMPANION_USE_DATA_IN_BACKGROUND',
'REQUEST_DELETE_PACKAGES', 'REQUEST_IGNORE_BATTERY_OPTIMIZATIONS', 'REQUEST_INSTALL_PACKAGES',
'REQUEST_PASSWORD_COMPLEXITY', 'RESTART_PACKAGES', 'SEND_RESPOND_VIA_MESSAGE', 'SEND_SMS', 'SET_ALARM',
'SET_ALWAYS_FINISH', 'SET_ANIMATION_SCALE',
'SET_DEBUG_APP', 'SET_PREFERRED_APPLICATIONS', 'SET_PROCESS_LIMIT', 'SET_TIME', 'SET_TIME_ZONE',
'SET_WALLPAPER', 'SET_WALLPAPER_HINTS', 'SIGNAL_PERSISTENT_PROCESSES', 'SMS_FINANCIAL_TRANSACTIONS',
'START_VIEW_PERMISSION_USAGE',
'STATUS_BAR', 'SYSTEM_ALERT_WINDOW', 'TRANSMIT_IR', 'UNINSTALL_SHORTCUT', 'UPDATE_DEVICE_STATS',
'USE_BIOMETRIC', 'USE_FINGERPRINT', 'USE_FULL_SCREEN_INTENT', 'USE_SIP', 'VIBRATE', 'WAKE_LOCK',
'WRITE_APN_SETTINGS', 'WRITE_CALENDER',
'WRITE_CALL_LOG', 'WRITE_CONTACTS', 'WRITE_EXTERNAL_STORAGE', 'WRITE_GSERVICES', 'WRITE_SECURE_SETTINGS',
'WRITE_SETTINGS', 'WRITE_SYNC_SETTINGS', 'WRITE_VOICEMAIL']
return t
def dbndataset():
    f = os.path.join(r"D:\5th\Honours\Code\manidataset\benign", "com.coafit.apk.smali.xml")
    p = parsepermissionfile(f)  # f is a single manifest file, not a directory
if __name__ == "__main__":
    dataset = getdataset(r"D:\5th\Honours\Code\opdataset\mixed", 4)
    train(dataset)
    #model = dbn.DBN()
    #dataset = parsepermissiondirectory(r"D:\5th\Honours\Code\manidataset\mixed")
    #dataset = torch.tensor(dataset[:].values).double()
    #dataset = dataset[torch.randperm(dataset.size()[0])]
    #model.train_static(dataset)
    #train(dataset)
ba9ea81348075859e6add3d9ac0ec7f1ce621f0f | 179 | py | Python | src/actions_server/__init__.py | rzarajczyk/actions-server | 88580df33ee9dd089dab364e9fd90952a607f03d | [
"MIT"
] | null | null | null | src/actions_server/__init__.py | rzarajczyk/actions-server | 88580df33ee9dd089dab364e9fd90952a607f03d | [
"MIT"
] | null | null | null | src/actions_server/__init__.py | rzarajczyk/actions-server | 88580df33ee9dd089dab364e9fd90952a607f03d | [
"MIT"
] | null | null | null | from .server import *
__all__ = [
'http_server',
'Response',
'Action',
'JsonGet',
'JsonPost',
'Redirect',
'StaticResources',
'ServerController'
]
| 13.769231 | 22 | 0.569832 | 13 | 179 | 7.461538 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.273743 | 179 | 12 | 23 | 14.916667 | 0.746154 | 0 | 0 | 0 | 0 | 0 | 0.441341 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
baa77433e641424f6986c86518d3286d51a8882c | 662 | py | Python | my_pca.py | siddharthtelang/Face-and-Pose-Classification | 742953dd6d05791a9ee643248863e8d9d5a05b9a | [
"MIT"
] | null | null | null | my_pca.py | siddharthtelang/Face-and-Pose-Classification | 742953dd6d05791a9ee643248863e8d9d5a05b9a | [
"MIT"
] | null | null | null | my_pca.py | siddharthtelang/Face-and-Pose-Classification | 742953dd6d05791a9ee643248863e8d9d5a05b9a | [
"MIT"
] | null | null | null | from sklearn.decomposition import PCA
import matplotlib.pyplot as plt
import numpy as np
def get_min_dimensions(flattened):
pca = PCA().fit(flattened)
# plt.figure()
# plt.title('PCA')
# plt.xlabel('Dimensions')
# plt.ylabel('Variance Retention')
# plt.plot(pca.explained_variance_ratio_.cumsum(), lw=3)
# plt.show()
min_dim = (np.where(pca.explained_variance_ratio_.cumsum() > 0.95))[0][0]
print('Minimum dimensions required for 95% retention = ', min_dim)
return min_dim
def doPCA(flattened):
dim = get_min_dimensions(flattened)
pca = PCA(dim)
projected = pca.fit_transform(flattened)
return projected
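`get_min_dimensions` finds the smallest dimensionality whose cumulative explained-variance ratio exceeds 0.95. The same quantity can be computed directly from the singular values of the centered data, which is what sklearn's PCA does internally; a numpy-only sketch on synthetic rank-deficient data:

```python
import numpy as np

rng = np.random.RandomState(0)
data = rng.normal(size=(100, 5))
data[:, 3] = data[:, 0] * 2           # redundant column
data[:, 4] = data[:, 1] - data[:, 2]  # redundant column
centered = data - data.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
ratios = (s ** 2) / (s ** 2).sum()    # explained-variance ratios
min_dim = int(np.argmax(ratios.cumsum() > 0.95)) + 1
print(min_dim)
```

Because two of the five columns are linear combinations of the others, the data has rank 3 and at most 3 dimensions are ever needed.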
| 30.090909 | 77 | 0.696375 | 89 | 662 | 5.022472 | 0.47191 | 0.040268 | 0.071588 | 0.111857 | 0.277405 | 0.138702 | 0 | 0 | 0 | 0 | 0 | 0.014706 | 0.178248 | 662 | 21 | 78 | 31.52381 | 0.806985 | 0.231118 | 0 | 0 | 0 | 0 | 0.095618 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.230769 | 0 | 0.538462 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
baaa839d3e982230b9fe73cd6d5e7dd4fc350545 | 1,474 | py | Python | tests/test_container_deps.py | hazbottles/flonb | 8cb1fd4926c8ab14cf579292b06f02ac4568a55e | [
"MIT"
] | 3 | 2020-11-02T14:32:11.000Z | 2022-02-03T04:38:53.000Z | tests/test_container_deps.py | hazbottles/flonb | 8cb1fd4926c8ab14cf579292b06f02ac4568a55e | [
"MIT"
] | 5 | 2021-07-28T02:00:48.000Z | 2021-07-28T02:44:45.000Z | tests/test_container_deps.py | hazbottles/flonb | 8cb1fd4926c8ab14cf579292b06f02ac4568a55e | [
"MIT"
] | null | null | null | import pytest
import flonb
def test_nested_list_deps():
@flonb.task_func()
def multiply(x, y):
return x * y
@flonb.task_func()
def collect(
container=flonb.Dep(
[[multiply.partial(x=x, y=y + 2) for x in range(3)] for y in range(2)]
)
):
return container
assert collect.compute() == [
[0 * 2, 1 * 2, 2 * 2],
[0 * 3, 1 * 3, 2 * 3],
]
def test_heterogenous_deps():
@flonb.task_func()
def multiply(x, y):
return x * y
@flonb.task_func()
def add(x, y):
return x + y
@flonb.task_func()
def collect(
container=flonb.Dep(
[
[{"cows": 4}, multiply.partial(x=2), [add.partial(x=2)]],
5,
add.partial(x=3),
"here is a string",
]
)
):
return container
assert collect.compute(y=4) == [
[{"cows": 4}, 8, [6]],
5,
7,
"here is a string",
]
@pytest.mark.xfail # dicts are not parsed for tasks by dask - do we want to implement that on top?
def test_dict_deps():
@flonb.task_func()
def add_y(x, y):
return x + y
@flonb.task_func()
def collect(
container={y + 10: add_y.partial(y=y) for y in range(5)},
):
return container
# Note how we don't specify `y`!
result = collect.compute(x=3)
assert result == {13: 3, 14: 4, 15: 5, 16: 6, 17: 7}
| 21.057143 | 99 | 0.497965 | 207 | 1,474 | 3.468599 | 0.333333 | 0.02507 | 0.126741 | 0.155989 | 0.445682 | 0.311978 | 0.311978 | 0.311978 | 0.311978 | 0.311978 | 0 | 0.04772 | 0.360244 | 1,474 | 69 | 100 | 21.362319 | 0.71368 | 0.07327 | 0 | 0.518519 | 0 | 0 | 0.029347 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 1 | 0.185185 | false | 0 | 0.037037 | 0.12963 | 0.351852 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
baaaaa85690bcd79d5052208eaadb37b07276d7b | 719 | py | Python | submissions/valid-parentheses/solution.py | Wattyyy/LeetCode | 13a9be056d0a0c38c2f8c8222b11dc02cb25a935 | [
"MIT"
] | null | null | null | submissions/valid-parentheses/solution.py | Wattyyy/LeetCode | 13a9be056d0a0c38c2f8c8222b11dc02cb25a935 | [
"MIT"
] | 1 | 2022-03-04T20:24:32.000Z | 2022-03-04T20:31:58.000Z | submissions/valid-parentheses/solution.py | Wattyyy/LeetCode | 13a9be056d0a0c38c2f8c8222b11dc02cb25a935 | [
"MIT"
] | null | null | null | # https://leetcode.com/problems/valid-parentheses
from collections import deque
class Solution:
def isValid(self, s: str):
if not s:
return True
N = len(s)
st = deque([s[0]])
for i in range(1, N):
if not st:
st.append(s[i])
else:
top = st[-1]
if (
(top == "(" and s[i] == ")")
or (top == "{" and s[i] == "}")
or (top == "[" and s[i] == "]")
):
st.pop()
else:
st.append(s[i])
if len(st) == 0:
return True
else:
return False
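An equivalent, slightly more idiomatic formulation keeps a dict of closing-to-opening pairs and never special-cases the first character:

```python
def is_valid(s):
    pairs = {')': '(', '}': '{', ']': '['}
    stack = []
    for ch in s:
        if ch in pairs:                       # closing bracket: must match top
            if not stack or stack.pop() != pairs[ch]:
                return False
        else:                                 # opening bracket: push
            stack.append(ch)
    return not stack

print(is_valid("({[]})"), is_valid("(]"))
```

This assumes the input contains only bracket characters, as in the LeetCode problem.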
| 23.966667 | 51 | 0.346314 | 77 | 719 | 3.233766 | 0.480519 | 0.040161 | 0.084337 | 0.096386 | 0.11245 | 0.11245 | 0.11245 | 0.11245 | 0.11245 | 0 | 0 | 0.011461 | 0.514604 | 719 | 29 | 52 | 24.793103 | 0.702006 | 0.065369 | 0 | 0.291667 | 0 | 0 | 0.008955 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.041667 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bab1ab98b5b9a432147ce95b3a0dad2f4a850934 | 10,929 | py | Python | Efficiency Test/arcadameVM_dictionaries.py | bernardotc/Arcadame | 258a26498fc9de7f95c577704bba8c7709b2d4fc | [
"MIT"
] | null | null | null | Efficiency Test/arcadameVM_dictionaries.py | bernardotc/Arcadame | 258a26498fc9de7f95c577704bba8c7709b2d4fc | [
"MIT"
] | null | null | null | Efficiency Test/arcadameVM_dictionaries.py | bernardotc/Arcadame | 258a26498fc9de7f95c577704bba8c7709b2d4fc | [
"MIT"
] | null | null | null | # -----------------------------------------------------------------------------
# Bernardo Daniel Trevino Caballero A00813175
# Myriam Maria Gutierrez Aburto A00617060
# arcadame.py
#
# Virtual Machine for the language Arcadame
# -----------------------------------------------------------------------------
import xml.etree.ElementTree as ET
import timeit
timer = timeit.default_timer
totalTimeAccess = 0
totalTimeAssign = 0
timeIt = False
debug = False
functionDictionary = {}
constants = {}
quadruplets = []
instructionCounter = 1
instructionStack = []
functionScope = ""
parametersMemoryValues = {101: 7000, 102: 8000, 103: 9000, 104: 10000, 105: 11000}
# Memory of execution: global scope, local scopes for sections 1 and 2
# (each a dict of stacked activation records plus an offset stack), and constants.
memory = [{},
          {'values': [{}], 'offsetStack': [0]},
          {'values': [{}], 'offsetStack': [0]},
          constants]
def getSection(value):
    if (value < 7000):
        return 0
    elif (value < 12000):
        return 1
    elif (value < 17000):
        return 2
    else:
        return 3
def getIndirectDirection(result):
    result = result.replace('[', '')
    result = result.replace(']', '')
    return int(result)
def assignValueInMemory(memoryKey, value):
    global memory
    ##if (isinstance(memoryKey, str)):
        ##memoryKey = accessValueInMemory(getIndirectDirection(memoryKey))
    section = getSection(memoryKey)
    if (debug):
        print "SET = assigning value in section: ", section
    if (section == 0):
        memory[section][memoryKey] = value
    else:
        offset = memory[section]['offsetStack'][-1]
        memory[section]['values'][offset][memoryKey] = value
def accessValueInMemory(memoryKey):
    global memory
    ##if (isinstance(memoryKey, str)):
        ##memoryKey = accessValueInMemory(getIndirectDirection(memoryKey))
    section = getSection(memoryKey)
    if (debug):
        print "GET = accessing value in section: ", section
    if (section == 0):
        return memory[section][memoryKey]
    elif (section == 3):
        return memory[section][memoryKey]['value']
    else:
        offset = memory[section]['offsetStack'][-1]
        return memory[section]['values'][offset][memoryKey]
def createERAInMemory():
global memory
memory[1]['values'].append({})
memory[2]['values'].append({})
def addOffsetsInMemory():
global memory
offset = len(memory[1]['values']) - 1
memory[1]['offsetStack'].append(offset)
memory[2]['offsetStack'].append(offset)
def deleteERAInMemory():
global memory
memory[1]['offsetStack'].pop()
memory[1]['values'].pop()
memory[2]['offsetStack'].pop()
memory[2]['values'].pop()
def assignParamInMemory(memoryKey1, memoryKey2):
global memory
if (isinstance(memoryKey2, str)):
memoryKey2 = accessValueInMemory(getIndirectDirection(memoryKey2))
section = getSection(memoryKey2)
value = accessValueInMemory(memoryKey1)
if (debug):
print "Assigning parameters: ", memoryKey1, memoryKey2, value
offset = len(memory[section]['values']) - 1
memory[section]['values'][offset][memoryKey2] = value
def getParamMemoryValue(paramType):
global parametersMemoryValues
value = parametersMemoryValues[paramType]
parametersMemoryValues[paramType] += 1
return value
def resetParametersMemoryValues():
global parametersMemoryValues
parametersMemoryValues = {101: 7000, 102: 8000, 103: 9000, 104: 10000, 105: 11000}
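Parameter passing above hands each declared parameter type (codes 101-105) its own running address counter, reset after every call. The counters in isolation:

```python
# Per-type parameter address counters, as in getParamMemoryValue /
# resetParametersMemoryValues above (101-105 = int, float, string, char, boolean).
BASE = {101: 7000, 102: 8000, 103: 9000, 104: 10000, 105: 11000}
counters = dict(BASE)

def next_param_address(type_code):
    addr = counters[type_code]
    counters[type_code] += 1
    return addr

def reset_counters():
    counters.update(BASE)

print(next_param_address(101), next_param_address(101), next_param_address(102))
```

Two int parameters land at 7000 and 7001 while the first float parameter lands at 8000, so parameter slots of different types never collide.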
def readRawCode(fileName):
global functionDictionary, constants, quadruplets
    tree = ET.parse(fileName)
    for function in tree.find('functions').findall('function'):
        name = function.find('functionName').text
        parameters = []  # reset per function; a single shared list would accumulate across functions
for parameter in function.findall('parameter'):
parameters.append(int(parameter.text))
returnType = function.find('return').text
memory = function.find('memory').text
quadruplet = function.find('quadruplet').text
eraRoot = function.find('era')
era = [int(eraRoot.find('int').text), int(eraRoot.find('float').text), int(eraRoot.find('string').text), int(eraRoot.find('char').text), int(eraRoot.find('boolean').text)]
functionDictionary[name] = {'parameters': parameters, 'return': returnType, 'memory': int(memory), 'quadruplet': int(quadruplet), 'era': era}
for constant in tree.find('constants').findall('constant'):
value = constant.find('constantValue').text
type = int(constant.find('type').text)
memory = int(constant.find('memory').text)
if (int(type) == 101):
constants[memory] = {'type': type, 'value': int(value)}
elif (int(type) == 102):
constants[memory] = {'type': type, 'value': float(value)}
else:
constants[memory] = {'type': type, 'value': value}
for quadruplet in tree.find('quadruplets').findall('quadruplet'):
operator = int(quadruplet.find('operation').text)
elem1 = int(quadruplet.find('element1').text)
elem2 = int(quadruplet.find('element2').text)
try:
result = int(quadruplet.find('result').text)
except ValueError:
result = quadruplet.find('result').text
quadruplets.append([operator, elem1, elem2, result])
if (debug):
print "functions: ", functionDictionary
print "constants: ", constants
print "quadruplets: ", quadruplets
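`readRawCode` walks three sections of the compiler's XML output. A reduced, runnable illustration of the constants section only, using a hypothetical two-constant document in the same layout:

```python
import xml.etree.ElementTree as ET

xml = """
<root><constants>
  <constant><constantValue>42</constantValue><type>101</type><memory>18000</memory></constant>
  <constant><constantValue>3.5</constantValue><type>102</type><memory>18001</memory></constant>
</constants></root>
"""
constants = {}
root = ET.fromstring(xml)
for c in root.find('constants').findall('constant'):
    value = c.find('constantValue').text
    ctype = int(c.find('type').text)
    mem = int(c.find('memory').text)
    # type 101 = int, 102 = float, anything else kept as text
    constants[mem] = {'type': ctype,
                      'value': int(value) if ctype == 101 else float(value)}
print(constants[18000]['value'], constants[18001]['value'])
```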
def doOperation(quadruplet):
global instructionCounter, functionDictionary, instructionStack, functionScope, totalTimeAccess, totalTimeAssign
if (debug):
print quadruplet
if (quadruplet[0] < 10):
t1 = timer();
elem1 = accessValueInMemory(quadruplet[1])
elem2 = accessValueInMemory(quadruplet[2])
t2 = timer();
if (quadruplet[0] == 0):
result = elem1 + elem2
elif (quadruplet[0] == 1):
result = elem1 * elem2
elif (quadruplet[0] == 2):
result = elem1 - elem2
elif (quadruplet[0] == 3):
result = 1.0 * elem1 / elem2
elif (quadruplet[0] == 4):
result = elem1 and elem2
elif (quadruplet[0] == 5):
result = elem1 or elem2
elif (quadruplet[0] == 6):
result = elem1 < elem2
elif (quadruplet[0] == 7):
result = elem1 > elem2
elif (quadruplet[0] == 8):
result = elem1 != elem2
elif (quadruplet[0] == 9):
result = elem1 == elem2
if (debug):
print "elem1: ", elem1
print "elem2: ", elem2
print "result: ", result
t3 = timer()
assignValueInMemory(quadruplet[3], result)
t4 = timer()
if (timeIt):
print "Access operands : ", t2-t1
print "Assign result : ", t4-t3
totalTimeAccess += t2-t1
totalTimeAssign += t4-t3
return True
elif (quadruplet[0] == 10):
t1 = timer()
result = accessValueInMemory(quadruplet[1])
t2 = timer()
assignValueInMemory(quadruplet[3], result)
t3 = timer()
if (timeIt):
print "Access operand : ", t2-t1
print "Assign result : ", t3-t2
totalTimeAccess += t2-t1
totalTimeAssign += t3-t2
return True
elif (quadruplet[0] == 11):
instructionCounter = quadruplet[3] - 1
return True
elif (quadruplet[0] == 12):
result = accessValueInMemory(quadruplet[1])
if (debug):
print "GOTOF result: ", result
if (result == False):
instructionCounter = quadruplet[3] - 1
return True
elif (quadruplet[0] == 13):
result = accessValueInMemory(quadruplet[3])
print "-----> AVM PRINT: ", result
return True
elif (quadruplet[0] == 14):
scan = raw_input('-----> AVM GET_VALUE: ')
try:
if (quadruplet[3] < 15000):
result = int(scan.strip())
else:
result = float(scan.strip())
assignValueInMemory(quadruplet[3], result)
return True
except:
# TODO
# raise ERROR
return False
elif (quadruplet[0] == 15):
scan = raw_input('-----> AVM GET_LINE: ')
assignValueInMemory(quadruplet[3], scan)
return True
elif (quadruplet[0] == 16):
scan = raw_input('-----> AVM GET_BOOLEAN: ')
result = scan.strip()
if (result == 'True' or result == 'False'):
assignValueInMemory(quadruplet[3], result)
return True
else:
# TODO
# raise ERROR
return False
elif (quadruplet[0] == 17):
scan = raw_input('-----> AVM GET_CHAR: ')
        if (len(scan) > 0):
            result = scan[0]
            assignValueInMemory(quadruplet[3], result)
            return True
else:
# TODO
# raise ERROR
return False
elif (quadruplet[0] == 18):
deleteERAInMemory()
if (debug):
print "Memory after deleting ERA: ", memory
instructionCounter = instructionStack.pop()
functionScope = ""
resetParametersMemoryValues()
return True
elif (quadruplet[0] == 19):
createERAInMemory()
functionScope = quadruplet[3]
if (debug):
print "Memory after creating ERA: ", memory
return True
elif (quadruplet[0] == 20):
function = quadruplet[3]
instructionStack.append(instructionCounter)
addOffsetsInMemory()
instructionCounter = functionDictionary[function]['quadruplet'] - 1
resetParametersMemoryValues()
return True
elif (quadruplet[0] == 21):
value = quadruplet[1]
if (debug):
print "Function scope Name for Params: ", functionScope
paramType = functionDictionary[functionScope]['parameters'][quadruplet[3] - 1]
param = getParamMemoryValue(paramType)
assignParamInMemory(value, param)
return True
elif (quadruplet[0] == 22):
result = accessValueInMemory(quadruplet[3])
assignValueInMemory(functionDictionary[functionScope]['memory'], result)
deleteERAInMemory()
if (debug):
print "Memory after deleting ERA: ", memory
instructionCounter = instructionStack.pop()
return True
elif (quadruplet[0] == 23):
result = accessValueInMemory(quadruplet[1])
if (result >= quadruplet[2] and result <= quadruplet[3]):
return True
else:
# TODO: - raise ERROR
return False
elif (quadruplet[0] == 30):
return False
# Main.
readRawCode('rawCode.xml')
if (debug):
print "Initial memory: ", memory
count = 100000
while count > 0:
instructionCounter = 1
    memory = [{}, [{}], [{}], constants]
while 1:
if (doOperation(quadruplets[instructionCounter - 1])):
            instructionCounter += 1
else:
            break
if (debug):
print "Final memory: ", memory
count -= 1
print "Total access time: ", totalTimeAccess
print "Total assign time: ", totalTimeAssign
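The opcode table above (0 = add, 1 = mul, 2 = sub, 3 = div, 4/5 = and/or, 6–9 = comparisons) amounts to a quadruplet dispatch loop. A minimal Python 3 sketch of that idea — the flat address-to-value dict and the quadruplet layout `(opcode, op1, op2, target)` are simplifications of the AVM's segmented memory, not the original implementation:

```python
# Minimal quadruplet interpreter: each instruction is (opcode, op1, op2, target).
# Memory is a flat dict from address/name to value; the real AVM resolves
# addresses across global/local/temporary/constant segments instead.
OPS = {
    0: lambda a, b: a + b,
    1: lambda a, b: a * b,
    2: lambda a, b: a - b,
    3: lambda a, b: a / b,
    4: lambda a, b: a and b,
    5: lambda a, b: a or b,
    6: lambda a, b: a < b,
    7: lambda a, b: a > b,
    8: lambda a, b: a != b,
    9: lambda a, b: a == b,
}

def run(quadruplets, memory):
    """Execute arithmetic/relational quadruplets against a flat memory dict."""
    for opcode, op1, op2, target in quadruplets:
        memory[target] = OPS[opcode](memory[op1], memory[op2])
    return memory

program = [
    (0, 'a', 'b', 't1'),   # t1 = a + b
    (1, 't1', 'b', 't2'),  # t2 = t1 * b
    (6, 't2', 'a', 't3'),  # t3 = t2 < a
]
mem = run(program, {'a': 2, 'b': 3})
```

The jump opcodes (11/12) and call/return opcodes (18–22) would additionally mutate an instruction counter and an instruction stack, as the full `doOperation` does.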

# File: web/web/urls.py | repo: danieltrt/UnchartIt_UI | license: BSD-3-Clause-Clear
"""web URL Configuration
The `urlpatterns` list routes URLs to views. For more information please see:
https://docs.djangoproject.com/en/3.0/topics/http/urls/
Examples:
Function views
1. Add an import: from my_app import views
2. Add a URL to urlpatterns: path('', views.home, name='home')
Class-based views
1. Add an import: from other_app.views import Home
2. Add a URL to urlpatterns: path('', Home.as_view(), name='home')
Including another URLconf
1. Import the include() function: from django.urls import include, path
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""
from django.contrib import admin
from django.urls import include, path
from django.views.generic import RedirectView
from django.contrib.staticfiles.urls import static
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from . import settings
urlpatterns = [
path('dist/', include('dist.urls')),
path('synth/', include('synth.urls')),
path('data/', include('data.urls')),
path('home/', include('home.urls')),
path('admin/', admin.site.urls),
path('', RedirectView.as_view(url='/home/', permanent=True)),
]
urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT) | 37.939394 | 77 | 0.718051 | 177 | 1,252 | 5.033898 | 0.367232 | 0.06734 | 0.016835 | 0.026936 | 0.286195 | 0.286195 | 0.084175 | 0 | 0 | 0 | 0 | 0.007484 | 0.146166 | 1,252 | 33 | 78 | 37.939394 | 0.826006 | 0.494409 | 0 | 0 | 0 | 0 | 0.111643 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
bab6a18249c041770c616493b83bf7abb970132e | 3,538 | py | Python | 20180715/FeatureSelection/FeatureSelection.py | fengjiaxin/Home_Credit_Default_Risk | 3407e76b4e5cfb8dd6056d24675b80fe0e82c123 | [
"Apache-2.0"
] | 26 | 2018-06-13T07:34:16.000Z | 2020-12-07T16:38:25.000Z | 20180715/FeatureSelection/FeatureSelection.py | fengjiaxin/Home_Credit_Default_Risk | 3407e76b4e5cfb8dd6056d24675b80fe0e82c123 | [
"Apache-2.0"
] | 1 | 2019-05-02T12:48:31.000Z | 2019-05-25T09:06:22.000Z | 20180715/FeatureSelection/FeatureSelection.py | fengjiaxin/Home_Credit_Default_Risk | 3407e76b4e5cfb8dd6056d24675b80fe0e82c123 | [
"Apache-2.0"
] | 6 | 2018-08-02T11:03:33.000Z | 2021-11-09T10:42:11.000Z | # coding:utf-8
import os
import re
import sys
import tqdm
import numpy as np
import pandas as pd
from category_encoders import TargetEncoder
def filter_nan_feature(feature):
"""
:param feature: feature pd.Series
    :return: True if more than 90% of the values are NaN
"""
return (np.sum(feature.isna()) / len(feature)) > 0.9
class FeatureSelection(object):
def __init__(self, *, input_path, output_path):
# init
self.__input_path, self.__output_path = input_path, output_path
# data prepare
self.__train_feature_before, self.__train_feature_after = [None for _ in range(2)];
self.__train, self.__test = [None for _ in range(2)]
self.__train_label = None
self.__train_feature, self.__test_feature = [None for _ in range(2)]
self.__categorical_columns = None
# data output
self.__train_select_feature, self.__test_select_feature = [None for _ in range(2)]
def data_prepare(self):
self.__train_feature_before = pd.read_csv(os.path.join(self.__input_path, "train_feature_before_df.csv"))
self.__train_feature_after = pd.read_csv(os.path.join(self.__input_path, "train_feature_after_df.csv"))
self.__train = pd.concat([self.__train_feature_before, self.__train_feature_after])
self.__test = pd.read_csv(os.path.join(self.__input_path, "test_feature_df.csv"))
self.__train_label = self.__train["TARGET"].copy()
self.__train_feature = (
self.__train.drop(
["TARGET"] + [col for col in self.__train.columns.tolist() if re.search(r"SK_ID", col)], axis=1
)
).copy()
self.__test_feature = self.__test[self.__train_feature.columns.tolist()].copy()
self.__categorical_columns = self.__train_feature.select_dtypes(include="object").columns.tolist()
encoder = TargetEncoder()
encoder.fit(self.__train_feature[self.__categorical_columns], self.__train_label)
self.__train_feature[self.__categorical_columns] = encoder.transform(
self.__train_feature[self.__categorical_columns]
)
def feature_filter(self):
# np.nan feature filter
flag_list = []
for col in tqdm.tqdm(self.__train_feature.columns):
flag_list.append(filter_nan_feature(self.__train_feature[col]))
self.__train_feature = self.__train_feature[
[col for col, flag in zip(self.__train_feature.columns, flag_list) if flag is not True]]
# std filter
flag_list = []
for col in tqdm.tqdm(self.__train_feature.columns):
flag_list.append(self.__train_feature[col].std() < 0.01)
self.__train_feature = self.__train_feature[
[col for col, flag in zip(self.__train_feature.columns, flag_list) if flag is not True]]
def data_output(self):
self.__train_select_feature = (
self.__train[["TARGET"] + self.__train_feature.columns.tolist()]
)
self.__test_select_feature = (
self.__test[self.__train_feature.columns.tolist()]
)
self.__train_select_feature.to_csv(
os.path.join(self.__output_path, "train_select_feature_df.csv"), index=False
)
self.__test_select_feature.to_csv(
os.path.join(self.__output_path, "test_select_feature_df.csv"), index=False
)
if __name__ == "__main__":
fs = FeatureSelection(
input_path=sys.argv[1],
output_path=sys.argv[2]
)
fs.data_prepare()
fs.feature_filter()
    fs.data_output()
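`feature_filter` above applies two screens: drop columns that are more than 90% missing, and drop columns whose standard deviation is below 0.01. A dependency-free sketch of the same two thresholds (the thresholds come from the code; the plain-list column representation is a simplification of the pandas DataFrame):

```python
import statistics

def mostly_missing(values, threshold=0.9):
    # Mirrors filter_nan_feature: flag a column whose missing fraction exceeds the threshold.
    return sum(v is None for v in values) / len(values) > threshold

def near_constant(values, min_std=0.01):
    # Mirrors the std filter: flag a column whose observed values barely vary.
    present = [v for v in values if v is not None]
    return statistics.pstdev(present) < min_std

columns = {
    'mostly_nan': [None] * 95 + [1.0] * 5,          # 95% missing -> dropped
    'constant': [3.0] * 100,                        # zero spread -> dropped
    'useful': [float(i) for i in range(100)],       # survives both filters
}
kept = [name for name, vals in columns.items()
        if not mostly_missing(vals) and not near_constant(vals)]
```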

# File: bin/collect.py | repo: nw-engineer/collectiontools | license: Apache-2.0
import sys
import requests
from bs4 import BeautifulSoup
from urllib import request
args = sys.argv[1]
url = args
response = request.urlopen(url)
soup = BeautifulSoup(response, features = "html.parser")
response.close()
print(soup.title.text)
print(soup.pre.text)
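collect.py leans on BeautifulSoup for `soup.title.text`; the same title extraction can be done with only the standard library's `html.parser`, which is a useful fallback when bs4 isn't installed. This is a sketch of that alternative, not part of the original tool:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ''

    def handle_starttag(self, tag, attrs):
        if tag == 'title':
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == 'title':
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

parser = TitleParser()
parser.feed('<html><head><title>Example Page</title></head>'
            '<body><pre>hi</pre></body></html>')
```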

# File: main/python/model/Base.py | repo: ShangxuanWu/MT_python | license: CNRI-Python
# Shangxuan Wu @ Myraid of Things
# 31 Jun 2017
# add path for root ('tf_code/') directory if not in sys.path
import sys, os
root_path = os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
if root_path not in sys.path:
sys.path.append(root_path)
from main.python.utils import FileUtils, LogUtils
from main.python.dataloader import DataLoader
from main.resource.dataloader import DataLoaderConfig
import pdb
import os, logging
# this is actually an abstract class with no actual use
class BaseModel():
# init function for
def __init__(self, fd_path, child_class_name):
# check path valid
        assert FileUtils.isFolderExists(fd_path)
self.root_fd = fd_path
# check three necessary files exist
self.data_loader = DataLoader(self.root_fd)
# already checked file existence
#assert self.data_loader.hasNecessaryTrainFiles()
# create folders
        self.model_fd = os.path.join(self.root_fd, model_fd_basename)
        FileUtils.makeOrClearFolder(self.model_fd)
# handle logging
self.train_log_fn = os.path.join(self.root_fd, train_log_basename)
# create logger with 'spam_application'
self.logger = logging.getLogger(child_class_name)
self.logger.setLevel(logging.DEBUG)
# create file handler which logs even debug messages
fh = logging.FileHandler(self.train_log_fn)
fh.setLevel(logging.DEBUG)
# create console handler with a higher log level
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
ch.setFormatter(formatter)
# add the handlers to the logger
self.logger.addHandler(fh)
self.logger.addHandler(ch)
return
# Following are some abstract functions needed to be implemented:
def parseParames(self):
raise NotImplementedError
def saveModel(self):
raise NotImplementedError
def loadModel(self):
raise NotImplementedError
def forward(self):
raise NotImplementedError
def train(self):
raise NotImplementedError
def evaluate(self):
raise NotImplementedError
def send(self):
raise NotImplementedError
# make assertion to logger instead of to screen
def assertToLogger(self, assertion, err_str):
try:
assert assertion
except AssertionError as err:
self.logger.exception(err_str)
raise err
return
def submitModel(self):
        return
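The handler wiring in `BaseModel.__init__` — one file handler, one console handler, both at DEBUG, sharing a formatter — is a standard `logging` pattern. A self-contained Python 3 version, writing to a temporary file instead of the model folder (the logger name and path here are illustrative, not from the original):

```python
import logging
import os
import tempfile

def build_logger(name, log_path):
    """Attach a file handler and a console handler with a shared formatter."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    fh = logging.FileHandler(log_path)   # file handler: keeps everything, debug included
    ch = logging.StreamHandler()         # console handler
    fh.setLevel(logging.DEBUG)
    ch.setLevel(logging.DEBUG)
    fmt = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fh.setFormatter(fmt)
    ch.setFormatter(fmt)
    logger.addHandler(fh)
    logger.addHandler(ch)
    return logger

log_file = os.path.join(tempfile.mkdtemp(), 'train.log')
logger = build_logger('BaseModelDemo', log_file)
logger.info('hello')
for h in logger.handlers:
    h.flush()
with open(log_file) as f:
    contents = f.read()
```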
bac65dbd6a7060ed2081b0cb0b52496151e704ac | 292 | py | Python | Software/__init__.py | justins0923/zephyrus-iaq | 10e5f14f4dc0d1d72a2eff928df4ad572fc99e4c | [
"MIT"
] | 1 | 2019-11-05T18:45:52.000Z | 2019-11-05T18:45:52.000Z | Software/__init__.py | justins0923/zephyrus-iaq | 10e5f14f4dc0d1d72a2eff928df4ad572fc99e4c | [
"MIT"
] | null | null | null | Software/__init__.py | justins0923/zephyrus-iaq | 10e5f14f4dc0d1d72a2eff928df4ad572fc99e4c | [
"MIT"
] | null | null | null | Software/
Config/
GUI/
IAQ_GUI.py
HAT/
IAQ_DAC43608.py
IAQ_Mux.py
Sensors/
IAQ_Sensor.py
IAQ_MqGas.py
third_party/
bme680-python
IAQ_AnalogPortController.py
IAQ_Exceptions.py
IAQ_FileHandler.py
IAQ_Logger.py

# File: repoxplorer/controllers/tags.py | repo: Priya-100/repoxplorer | license: Apache-2.0
# Copyright 2017, Fabien Boucher
# Copyright 2017, Red Hat
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pecan import expose
from repoxplorer.controllers import utils
from repoxplorer import index
from repoxplorer.index.projects import Projects
from repoxplorer.index.tags import Tags
from repoxplorer.index.contributors import Contributors
class TagsController(object):
@expose('json')
def tags(self, pid=None, tid=None,
dfrom=None, dto=None, inc_repos=None):
t = Tags(index.Connector(index_suffix='tags'))
projects_index = Projects()
idents = Contributors()
query_kwargs = utils.resolv_filters(
projects_index, idents, pid, tid, None, None,
dfrom, dto, inc_repos, None, None, None, None)
p_filter = [":".join(r.split(':')[:-1]) for r in query_kwargs['repos']]
dfrom = query_kwargs['fromdate']
dto = query_kwargs['todate']
ret = [r['_source'] for r in t.get_tags(p_filter, dfrom, dto)]
# TODO: if tid is given we can include user defined releases
# for repo tagged with tid.
if not pid:
return ret
# now append user defined releases
ur = {}
project = projects_index.get(pid, source=['refs', 'releases'])
for release in project.get('releases', []):
ur[release['name']] = release
for ref in project['refs']:
for release in ref.get('releases', []):
ur[release['name']] = release
for rel in ur.values():
ret.append(rel)
return ret

# File: tests/factories/input/types/float_input.py | repo: TheLabbingProject/django_analyses | license: Apache-2.0
from factory import SubFactory
from .anchor_head_single import AnchorHeadSingle
from .anchor_head_template import AnchorHeadTemplate
from .point_head_box import PointHeadBox
from .point_head_simple import PointHeadSimple
from .point_intra_part_head import PointIntraPartOffsetHead
from .anchor_head_seg import AnchorHeadSeg
from .center_head import CenterHead
from .mm_head import MMHead
from .e2e_head import E2EHead
from .fusion_head import FusionHead
from .attention_fusion_head import AttnFusionHead
from .e2e_fusion_head import E2EFusionHead
from .e2e_seqfuse_head import E2ESeqFusionHead
from .e2e_seq_head import E2ESeqHead
from .e2e_refine_head import E2ERefinementHead
from .e2e_seq_token_head import E2ESeqTokenHead
__all__ = {
'AnchorHeadTemplate': AnchorHeadTemplate,
'AnchorHeadSingle': AnchorHeadSingle,
'PointIntraPartOffsetHead': PointIntraPartOffsetHead,
'PointHeadSimple': PointHeadSimple,
'PointHeadBox': PointHeadBox,
'AnchorHeadMulti': AnchorHeadMulti,
'AnchorHeadSeg': AnchorHeadSeg,
'CenterHead': CenterHead,
'MMHead': MMHead,
'E2EHead': E2EHead,
'FusionHead': FusionHead,
'AttnFusionHead': AttnFusionHead,
'E2EFusionHead': E2EFusionHead,
'E2ESeqFusionHead': E2ESeqFusionHead,
'E2ESeqHead': E2ESeqHead,
'E2ESeqTokenHead': E2ESeqTokenHead,
'E2ERefinementHead': E2ERefinementHead
}
| 33.142857 | 59 | 0.808908 | 135 | 1,392 | 8.074074 | 0.296296 | 0.100917 | 0.051376 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019786 | 0.128592 | 1,392 | 41 | 60 | 33.95122 | 0.878813 | 0 | 0 | 0 | 0 | 0 | 0.165948 | 0.017241 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.472222 | 0 | 0.472222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
bad7d648118bd2858bf49d2a44a2f4cfe07ecd25 | 465 | py | Python | tests/factories/input/types/float_input.py | TheLabbingProject/django_analyses | 08cac40a32754a265b37524f08ec6160c69ebea8 | [
"Apache-2.0"
] | 1 | 2020-12-30T12:43:34.000Z | 2020-12-30T12:43:34.000Z | tests/factories/input/types/float_input.py | TheLabbingProject/django_analyses | 08cac40a32754a265b37524f08ec6160c69ebea8 | [
"Apache-2.0"
] | 59 | 2019-12-25T13:14:56.000Z | 2021-07-22T12:24:46.000Z | tests/factories/input/types/float_input.py | TheLabbingProject/django_analyses | 08cac40a32754a265b37524f08ec6160c69ebea8 | [
"Apache-2.0"
] | 2 | 2020-05-24T06:44:27.000Z | 2020-07-09T15:47:31.000Z | from factory import SubFactory
from factory.django import DjangoModelFactory
from factory.faker import Faker
class FloatInputFactory(DjangoModelFactory):
run = SubFactory("tests.factories.run.RunFactory")
definition = SubFactory(
"tests.factories.input.definitions.float_input_definition.FloatInputDefinitionFactory"
)
value = Faker("pyfloat", min_value=-1000, max_value=1000)
class Meta:
model = "django_analyses.FloatInput"
| 31 | 94 | 0.765591 | 48 | 465 | 7.3125 | 0.5625 | 0.094017 | 0.136752 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020305 | 0.152688 | 465 | 14 | 95 | 33.214286 | 0.870558 | 0 | 0 | 0 | 0 | 0 | 0.316129 | 0.301075 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.272727 | 0 | 0.727273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
bad8f5ba5bba6f1c09743a472987af8c5e7d904e | 291 | py | Python | tests/supply-test.py | perpetualCreations/bandage | 1b61b9b9ac68a5a59dbbb4b5586c22f6961e935b | [
"MIT"
] | 1 | 2021-03-21T05:04:42.000Z | 2021-03-21T05:04:42.000Z | tests/supply-test.py | perpetualCreations/bandage | 1b61b9b9ac68a5a59dbbb4b5586c22f6961e935b | [
"MIT"
] | null | null | null | tests/supply-test.py | perpetualCreations/bandage | 1b61b9b9ac68a5a59dbbb4b5586c22f6961e935b | [
"MIT"
] | null | null | null | """unit test for bandage.Supply"""
import bandage
supplier = bandage.Supply(
"https://github.com/perpetualCreations/bandage/releases/tag/BANDAGE",
"F://bandage//tests//test_target//VERSION")
print(supplier.realize())
print(supplier.pre_collect_dump())
print(supplier.version_gap)

# File: desktop/core/ext-py/greenlet-0.3.1/tests/test_weakref.py | repo: digideskio/hortonworks-sandbox | license: Apache-2.0
import gc
import greenlet
import weakref
import unittest
class WeakRefTests(unittest.TestCase):
def test_dead_weakref(self):
def _dead_greenlet():
g = greenlet.greenlet(lambda:None)
g.switch()
return g
o = weakref.ref(_dead_greenlet())
gc.collect()
self.assertEquals(o(), None)
def test_inactive_weakref(self):
o = weakref.ref(greenlet.greenlet())
gc.collect()
self.assertEquals(o(), None)
| 24.6 | 46 | 0.617886 | 56 | 492 | 5.285714 | 0.392857 | 0.047297 | 0.074324 | 0.141892 | 0.256757 | 0.256757 | 0.256757 | 0 | 0 | 0 | 0 | 0 | 0.276423 | 492 | 19 | 47 | 25.894737 | 0.831461 | 0 | 0 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 1 | 0.176471 | false | 0 | 0.235294 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
bade483fcb1821220372bd41e8ce8d6fdbb4b560 | 1,699 | py | Python | BendersDecomposition/cplex.py | prameshk/Discrete-Optimization | 7cfb1e0977ab071ba5faf2b515e7838ff09fe579 | [
"MIT"
] | 4 | 2021-06-16T07:04:08.000Z | 2021-11-08T21:35:01.000Z | BendersDecomposition/cplex.py | prameshk/Discrete-Optimization | 7cfb1e0977ab071ba5faf2b515e7838ff09fe579 | [
"MIT"
] | null | null | null | BendersDecomposition/cplex.py | prameshk/Discrete-Optimization | 7cfb1e0977ab071ba5faf2b515e7838ff09fe579 | [
"MIT"
] | 1 | 2021-11-08T21:51:30.000Z | 2021-11-08T21:51:30.000Z | # -*- coding: utf-8 -*-
"""
Created on Wed Dec 30 22:26:17 2020
@author: Pramesh Kumar
"""
import numpy as np
import time
def generateFacilityLocationData(C, F):
# Unbounded ray instance seed 159
np.random.seed(15645)
p = np.random.randint(1000, size=(C, F))
f = np.random.randint(1000, size=(F))
for j in range(F):
for i in range(C):
f[j] += round(0.05*p[i,j])
return C, F, p, f
# Step 1: Initialize variables
C = 1000
F = 10
# Step 2: Start clock
ts = time.time()
# Step 3: Generate instance
C, F, p, f = generateFacilityLocationData(C,F)
############################################################################################################################
##############################################################################################################################
def solveModelCplex():
    # The API used below (addVar, GRB.BINARY, addConstr, optimize, .objVal) is
    # Gurobi's, not docplex's, so gurobipy is the import that matches the code.
    from gurobipy import Model, GRB
    m2 = Model('benders')
    bigM = 1  # x and y are binary, so a big-M of 1 suffices for y[i, j] <= bigM * x[j]
x = {j: m2.addVar(lb=0, vtype=GRB.BINARY) for j in range(F)}
y = {(i, j): m2.addVar(lb=0, vtype=GRB.BINARY) for i in range(C) for j in range(F)}
for i in range(C):
m2.addConstr(sum([y[i, j] for j in range(F)]) == 1)
for j in range(F):
for i in range(C):
m2.addConstr(y[i, j] <= bigM*x[j])
obj = 0
for j in range(F):
obj = obj -f[j] * x[j]
for i in range(C):
obj += p[i, j] * y[i, j]
m2.setObjective(obj, sense=GRB.MAXIMIZE)
m2.update()
m2.Params.OutputFlag = 0
m2.setParam('OutputFlag', False)
m2.optimize()
xVal= [x[j].x for j in range(F)]
yVal =[y[i, j].x for i in range(C) for j in range(F)]
return m2.objVal, xVal, yVal
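The model above is an uncapacitated facility-location MIP: maximize assignment profit Σ p[i][j]·y[i][j] minus opening costs Σ f[j]·x[j], with each customer assigned to exactly one open facility. For a *given* set of open facilities the best assignment is just each customer's best open facility, so a candidate solution's objective can be evaluated without a solver. This evaluator is a sketch for checking solutions on a tiny hypothetical instance, not a replacement for the MIP:

```python
def evaluate(open_facilities, p, f):
    """Assign each customer to its best open facility; return profit minus fixed costs."""
    fixed = sum(f[j] for j in open_facilities)
    profit = sum(max(row[j] for j in open_facilities) for row in p)
    return profit - fixed

# Toy instance: 3 customers, 2 facilities (illustrative numbers only).
p = [[5, 9], [7, 3], [4, 4]]   # p[i][j]: profit of serving customer i from facility j
f = [6, 8]                     # f[j]: fixed cost of opening facility j

candidates = [[0], [1], [0, 1]]            # all nonempty facility subsets
best = max(candidates, key=lambda s: evaluate(s, p, f))
```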

# File: beproductive/pomodoro.py | repo: JohannesStutz/beproductive | license: Apache-2.0
# AUTOGENERATED! DO NOT EDIT! File to edit: 02_pomodoro.ipynb (unless otherwise specified).
from nilearn import image
from skimage import segmentation
img = image.image.load_img('assets/BraTS19_2013_10_1_flair.nii').get_data()
viewer = napari.view_image(img)
pix = segmentation.slic(img, n_segments=10000, compactness=0.002, multichannel=False,
)
pix_boundaries = segmentation.find_boundaries(pix)
viewer.add_labels(pix_boundaries)
| 31.666667 | 85 | 0.768421 | 52 | 380 | 5.384615 | 0.653846 | 0.092857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055046 | 0.139474 | 380 | 11 | 86 | 34.545455 | 0.801223 | 0 | 0 | 0 | 0 | 0 | 0.089474 | 0.089474 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
baec0cd869d7afb2b17b039efdaef8410a2bebc6 | 249 | py | Python | ws4py/_asyncio_compat.py | diveyez/WebSocket-for-Python | a3e6d157b7bb1da1009e66aa750170f1c07aa143 | [
"BSD-3-Clause"
] | 733 | 2015-01-02T12:11:25.000Z | 2022-03-20T21:00:45.000Z | ws4py/_asyncio_compat.py | diveyez/WebSocket-for-Python | a3e6d157b7bb1da1009e66aa750170f1c07aa143 | [
"BSD-3-Clause"
] | 124 | 2015-01-11T10:22:48.000Z | 2021-11-24T17:44:18.000Z | ws4py/_asyncio_compat.py | diveyez/WebSocket-for-Python | a3e6d157b7bb1da1009e66aa750170f1c07aa143 | [
"BSD-3-Clause"
] | 251 | 2015-01-11T09:39:16.000Z | 2022-02-22T17:22:20.000Z | """Provide compatibility over different versions of asyncio."""
import asyncio
if hasattr(asyncio, "async"):
# Compatibility for Python 3.3 and older
ensure_future = getattr(asyncio, "async")
else:
ensure_future = asyncio.ensure_future | 27.666667 | 63 | 0.742972 | 31 | 249 | 5.870968 | 0.645161 | 0.197802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009569 | 0.160643 | 249 | 9 | 64 | 27.666667 | 0.861244 | 0.389558 | 0 | 0 | 0 | 0 | 0.068027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
baf129a810bf414995d2f5915ec8689447b32c3e | 1,093 | py | Python | beproductive/pomodoro.py | JohannesStutz/beproductive | 119e766ba4eb29d16908dd9ff2e9b677a9cec955 | [
"Apache-2.0"
] | 2 | 2021-02-01T23:36:38.000Z | 2021-03-08T04:26:37.000Z | beproductive/pomodoro.py | JohannesStutz/beproductive | 119e766ba4eb29d16908dd9ff2e9b677a9cec955 | [
"Apache-2.0"
] | null | null | null | beproductive/pomodoro.py | JohannesStutz/beproductive | 119e766ba4eb29d16908dd9ff2e9b677a9cec955 | [
"Apache-2.0"
] | null | null | null | # AUTOGENERATED! DO NOT EDIT! File to edit: 02_pomodoro.ipynb (unless otherwise specified).
__all__ = ['WORK_TIME', 'BREAK_TIME', 'POMODOROS', 'pomodoro']
# Cell
from time import sleep
from .blocker import *
import sys
WORK_TIME = 25 # minutes
BREAK_TIME = 5 # minutes
POMODOROS = 4
# Cell
def pomodoro(work_time=WORK_TIME, break_time=BREAK_TIME, pomodoros=POMODOROS):
blocker = Blocker()
if not blocker.adminrights:
return False
turn = 1
while turn <= pomodoros:
if blocker.block():
blocker.notify(f"Pomodoro no. {turn} of {pomodoros} started, work for {work_time} minutes")
else:
            blocker.notify("An error occurred. Please exit with ctrl+c")
sleep(work_time*60)
blocker.unblock()
if turn < pomodoros:
blocker.notify(f"Pomodoro no. {turn} ended, take a {break_time} minutes break")
sleep(break_time*60)
else:
blocker.notify(f"Pomodoro session ended, take a longer break. All websites unblocked.", duration=10)
        turn += 1

# File: AmexTest.py | repo: n1cfury/ViolentPython | license: MIT
#!/usr/bin/env python
import re
def banner():
print "[***] Amex finder p 176 [***]"
def findCreditCard(pkt):
	# Amex cards start with 34 or 37 followed by 13 more digits
	americaRE = re.findall("3[47][0-9]{13}", pkt)
	if americaRE:
		print "[+] Found American Express Card: "+americaRE[0]
def main():
	banner()
	tests = []
	tests.append("I would like to buy 1337 copies of that dvd")
	tests.append("Bill my card: 378282246310005 for $2600")
	for test in tests:
		findCreditCard(test)
if __name__ == '__main__':
main()
| 22.65 | 60 | 0.668874 | 67 | 453 | 4.402985 | 0.731343 | 0.074576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08971 | 0.163355 | 453 | 19 | 61 | 23.842105 | 0.688654 | 0.04415 | 0 | 0 | 0 | 0 | 0.386574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.066667 | null | null | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
baf5cd31e38dc263cdc4440453a88e75480f1571 | 7,049 | py | Python | analysis/export.py | sunlightlabs/regulations-scraper | 5f2644a3cf54f915d7d90957645073737ab91022 | [
"BSD-3-Clause"
] | 13 | 2015-09-28T16:59:31.000Z | 2020-06-20T01:23:13.000Z | analysis/export.py | sunlightlabs/regulations-scraper | 5f2644a3cf54f915d7d90957645073737ab91022 | [
"BSD-3-Clause"
] | 1 | 2019-02-18T20:26:21.000Z | 2021-02-25T01:43:44.000Z | analysis/export.py | sunlightlabs/regulations-scraper | 5f2644a3cf54f915d7d90957645073737ab91022 | [
"BSD-3-Clause"
] | 6 | 2015-07-08T22:41:34.000Z | 2019-09-03T18:19:32.000Z | #!/usr/bin/env python
import sys
import os
import csv
import time
import multiprocessing
from Queue import Empty
from datetime import datetime
from collections import namedtuple
from pymongo import Connection
import StringIO
pid = os.getpid()
import_start = time.time()
print '[%s] Loading trie...' % pid
from oxtail.matching import match
print '[%s] Loaded trie in %s seconds.' % (pid, time.time() - import_start)
F = namedtuple('F', ['csv_column', 'transform'])
def deep_get(key, dict, default=None):
if '.' in key:
first, rest = key.split('.', 1)
return deep_get(rest, dict.get(first, {}), default)
else:
out = dict.get(key, default)
return out if out else default
def getter(key, default=''):
return lambda d: deep_get(key, d, default)
DOCS_QUERY = {'deleted': False}
DOCS_FIELDS = [
F('document_id', getter('document_id')),
F('docket_id', getter('docket_id')),
F('agency', getter('agency')),
F('date_posted', getter('details.receive_date', None)),
F('date_due', getter('details.comment_end_date', None)),
F('title', getter('title')),
F('type', getter('type')),
F('org_name', getter('details.organization')),
F('submitter_name', lambda d: ' '.join(filter(bool, [deep_get('details.first_name', d, None), deep_get('details.mid_initial', d, None), deep_get('details.last_name', d, None)]))),
F('on_type', getter('comment_on.type')),
F('on_id', getter('comment_on.id')),
F('on_title', getter('comment_on.title')),
]
def filter_for_postgres(v):
    if v is None:
        return r'\N'
    if isinstance(v, datetime):
        return str(v)
    return v.encode('utf8').replace(r"\.", ".")
def process_doc(doc, fields=DOCS_FIELDS):
# field extraction
output = {
'metadata': [filter_for_postgres(f.transform(doc)) for f in fields],
'matches': [],
'submitter_matches': []
}
# entity extraction
if 'views' in doc and doc['views']:
for view in doc['views']:
if 'extracted' in view and view['extracted'] == True:
for entity_id in match(view['text']).keys():
# hack to deal with documents whose scrapes failed but still got extracted
object_id = doc['object_id'] if 'object_id' in doc else view['file'].split('/')[-1].split('.')[0]
output['matches'].append([doc['document_id'], object_id, view['type'], 'view', entity_id])
if 'attachments' in doc and doc['attachments']:
for attachment in doc['attachments']:
if 'views' in attachment and attachment['views']:
for view in attachment['views']:
if 'extracted' in view and view['extracted'] == True:
for entity_id in match(view['text']).keys():
output['matches'].append([doc['document_id'], attachment['object_id'], view['type'], 'attachment', entity_id])
# submitter matches
for entity_id in match('\n'.join([output['metadata'][7], output['metadata'][8]])).keys():
output['submitter_matches'].append([doc['document_id'], entity_id])
return output
# single-core version
def dump_cursor(c, fields, filename):
    metadata_writer = csv.writer(open(filename + '_meta.csv', 'w'))
    metadata_writer.writerow([f.csv_column for f in fields])
    match_writer = csv.writer(open(filename + '_text_matches.csv', 'w'))
    match_writer.writerow(['document_id', 'object_id', 'file_type', 'view_type', 'entity_id'])
    submitter_writer = csv.writer(open(filename + '_submitter_matches.csv', 'w'))
submitter_writer.writerow(['document_id', 'entity_id'])
for doc in c:
doc_data = process_doc(doc)
metadata_writer.writerow(doc_data['metadata'])
match_writer.writerows(doc_data['matches'])
submitter_writer.writerows(doc_data['submitter_matches'])
# multi-core version and helpers
def write_worker(done_queue, filename, fields=DOCS_FIELDS):
print '[%s] Writer started.' % os.getpid()
    metadata_writer = csv.writer(open(filename + '_meta.csv', 'w'))
    metadata_writer.writerow([f.csv_column for f in fields])
    match_writer = csv.writer(open(filename + '_text_matches.csv', 'w'))
    match_writer.writerow(['document_id', 'object_id', 'file_type', 'view_type', 'entity_id'])
    submitter_writer = csv.writer(open(filename + '_submitter_matches.csv', 'w'))
submitter_writer.writerow(['document_id', 'entity_id'])
while True:
try:
doc_data = done_queue.get(timeout=20)
except Empty:
print '[%s] CSV writes complete.' % os.getpid()
return
metadata_writer.writerow(doc_data['metadata'])
match_writer.writerows(doc_data['matches'])
submitter_writer.writerows(doc_data['submitter_matches'])
done_queue.task_done()
def process_worker(todo_queue, done_queue):
print '[%s] Worker started.' % os.getpid()
while True:
try:
doc = todo_queue.get(timeout=20)
except Empty:
print '[%s] Processing complete.' % os.getpid()
return
doc_data = process_doc(doc)
done_queue.put(doc_data)
todo_queue.task_done()
def dump_cursor_multi(c, fields, filename, num_workers):
todo_queue = multiprocessing.JoinableQueue(num_workers * 3)
done_queue = multiprocessing.JoinableQueue(num_workers * 3)
for i in range(num_workers):
proc = multiprocessing.Process(target=process_worker, args=(todo_queue, done_queue))
proc.start()
proc = multiprocessing.Process(target=write_worker, args=(done_queue, filename))
proc.start()
for doc in c:
todo_queue.put(doc)
todo_queue.join()
done_queue.join()
if __name__ == '__main__':
# set up options
from optparse import OptionParser
parser = OptionParser(usage="usage: %prog [options] host dbname file_prefix")
parser.add_option("-l", "--limit", dest="limit", action="store", type="int", default=None, help="Limit number of records for testing.")
parser.add_option("-m", "--multi", dest="multi", action="store", type="int", default=None, help="Set number of worker processes. Single-process model used if not specified.")
(options, args) = parser.parse_args()
# fetch options, args
host = args[0]
dbname = args[1]
prefix = args[2]
# do request and analysis
if options.limit:
cursor = Connection(host=host)[dbname].docs.find(DOCS_QUERY, limit=options.limit)
else:
cursor = Connection(host=host)[dbname].docs.find(DOCS_QUERY)
run_start = time.time()
print '[%s] Starting analysis...' % pid
if options.multi:
dump_cursor_multi(cursor, DOCS_FIELDS, prefix, options.multi)
else:
dump_cursor(cursor, DOCS_FIELDS, prefix)
print '[%s] Completed analysis in %s seconds.' % (pid, time.time() - run_start)
| 36.335052 | 183 | 0.632714 | 912 | 7,049 | 4.712719 | 0.22807 | 0.018613 | 0.02094 | 0.026524 | 0.359702 | 0.322476 | 0.277338 | 0.261982 | 0.246161 | 0.22429 | 0 | 0.003819 | 0.219889 | 7,049 | 193 | 184 | 36.523316 | 0.777778 | 0.036317 | 0 | 0.276596 | 0 | 0 | 0.194899 | 0.010025 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.099291 | null | null | 0.056738 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bafb09d54ced0389dfe8042f970928a1b6e44637 | 777 | py | Python | altair_recipes/stripplot.py | piccolbo/altair_recipes | d9013ba60373c353ea14ce192c7dc978cc6836e5 | [
"BSD-3-Clause"
] | 89 | 2018-08-15T16:49:16.000Z | 2022-03-30T12:22:46.000Z | altair_recipes/stripplot.py | piccolbo/altair_recipes | d9013ba60373c353ea14ce192c7dc978cc6836e5 | [
"BSD-3-Clause"
] | 16 | 2018-08-14T17:56:36.000Z | 2022-02-23T05:56:11.000Z | altair_recipes/stripplot.py | piccolbo/altair_recipes | d9013ba60373c353ea14ce192c7dc978cc6836e5 | [
"BSD-3-Clause"
] | 4 | 2018-11-19T05:00:21.000Z | 2020-02-28T19:16:54.000Z | """Generate stripplots."""
from .common import multivariate_preprocess
from .signatures import multivariate_recipe, opacity, color
import altair as alt
from autosig import autosig, Signature
@autosig(
multivariate_recipe
+ Signature(color=color(default=None, position=3), opacity=opacity(position=4))
)
def stripplot(
data=None, columns=None, group_by=None, color=None, opacity=1, height=600, width=800
):
"""Generate a stripplot."""
data, key, value = multivariate_preprocess(data, columns, group_by)
enc_args = dict()
if color is not None:
enc_args["color"] = color
return (
alt.Chart(data, height=height, width=width)
.mark_tick(opacity=opacity, thickness=2)
.encode(x=key + ":N", y=value, **enc_args)
)
| 31.08 | 88 | 0.693694 | 100 | 777 | 5.29 | 0.51 | 0.039698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015748 | 0.182754 | 777 | 24 | 89 | 32.375 | 0.817323 | 0.054054 | 0 | 0 | 1 | 0 | 0.009669 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.2 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2406dd887fe5b4ed719d2434768385bde8958704 | 3,281 | py | Python | glue_jupyter/widgets/tests/test_linked_dropdown.py | pkgw/glue-jupyter | f8fff629fa5859c69f9c71264042ff9c0e347076 | [
"BSD-3-Clause"
] | null | null | null | glue_jupyter/widgets/tests/test_linked_dropdown.py | pkgw/glue-jupyter | f8fff629fa5859c69f9c71264042ff9c0e347076 | [
"BSD-3-Clause"
] | null | null | null | glue_jupyter/widgets/tests/test_linked_dropdown.py | pkgw/glue-jupyter | f8fff629fa5859c69f9c71264042ff9c0e347076 | [
"BSD-3-Clause"
] | null | null | null | from glue.core.state_objects import State
from glue.core.data_combo_helper import ComponentIDComboHelper
from glue.external.echo import SelectionCallbackProperty
from ..linked_dropdown import LinkedDropdown, LinkedDropdownMaterial
class DummyState(State):
"""Mock state class for testing only."""
x_att = SelectionCallbackProperty(docstring='x test attribute')
y_att = SelectionCallbackProperty(docstring='y test attribute', default_index=-1)
def test_component(app, dataxz, dataxyz):
# setup
state = DummyState()
helper = ComponentIDComboHelper(state, 'x_att', app.data_collection)
helper.append_data(dataxz)
state.helper = helper
# main object we test
dropdown = LinkedDropdown(state, 'x_att', 'x test attribute')
# simple sanity tests
assert dropdown.description == 'x test attribute'
assert [item[0] for item in dropdown.options] == ['x', 'z']
# initial state
assert state.x_att is dataxz.id['x']
assert dropdown.value is dataxz.id['x']
# glue state -> ui
state.x_att = dataxz.id['z']
assert dropdown.value is dataxz.id['z']
# ui -> glue state
dropdown.value = dataxz.id['x']
assert state.x_att is dataxz.id['x']
# same, but now be ok with strings
state.x_att = 'z'
assert dropdown.value is dataxz.id['z']
state.x_att = 'x'
assert dropdown.value is dataxz.id['x']
def test_component_default_index(app, dataxz, dataxyz):
# Regression test for a bug that caused the incorrect element to be selected
# when default_index is involved.
# setup
state = DummyState()
helper = ComponentIDComboHelper(state, 'y_att', app.data_collection)
state.helper = helper
dropdown = LinkedDropdown(state, 'y_att', 'y test attribute')
assert [item[0] for item in dropdown.options] == []
helper.append_data(dataxz)
assert [item[0] for item in dropdown.options] == ['x', 'z']
assert state.y_att is dataxz.id['z']
assert dropdown.value is dataxz.id['z']
def test_component_material(app, dataxz, dataxyz):
# setup
state = DummyState()
helper = ComponentIDComboHelper(state, 'x_att', app.data_collection)
helper.append_data(dataxz)
state.helper = helper
# main object we test
dropdown = LinkedDropdownMaterial(state, 'x_att', 'x test attribute')
# simple sanity tests
assert dropdown.widget_input_label.description == 'x test attribute'
items = getattr(type(state), 'x_att').get_choice_labels(state)
assert len(dropdown.widget_select.children) == len(items)
assert [item.description for item in dropdown.widget_select.children] == ['x', 'z']
# initial state
assert str(state.x_att) == 'x'
assert dropdown.widget_select.value == 0
# glue state -> ui
state.x_att = dataxz.id['z']
assert dropdown.widget_select.value == 1
# ui -> glue state
assert str(state.x_att) == 'z'
assert dropdown.widget_select.value == 1
dropdown.widget_select.value = 0
assert str(state.x_att) == 'x'
# same, but now be ok with strings
assert dropdown.widget_select.value == 0
assert str(state.x_att) == 'x'
state.x_att = 'z'
assert dropdown.widget_select.value == 1
state.x_att = 'x'
assert dropdown.widget_select.value == 0
| 30.663551 | 87 | 0.689424 | 444 | 3,281 | 4.975225 | 0.204955 | 0.032594 | 0.069262 | 0.031689 | 0.582164 | 0.572657 | 0.525124 | 0.516976 | 0.429153 | 0.429153 | 0 | 0.004175 | 0.196891 | 3,281 | 106 | 88 | 30.95283 | 0.834156 | 0.122524 | 0 | 0.59322 | 0 | 0 | 0.060203 | 0 | 0 | 0 | 0 | 0 | 0.423729 | 1 | 0.050847 | false | 0 | 0.067797 | 0 | 0.169492 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
240d3483ae4f14b0494d34d78828bf09f96b22a8 | 465 | py | Python | djangocms_newsletter/testsettings.py | nephila/djangocms-newsletter | 5ebd8d3e1e2c85b2791d0261a954469f2548c840 | [
"BSD-3-Clause"
] | null | null | null | djangocms_newsletter/testsettings.py | nephila/djangocms-newsletter | 5ebd8d3e1e2c85b2791d0261a954469f2548c840 | [
"BSD-3-Clause"
] | null | null | null | djangocms_newsletter/testsettings.py | nephila/djangocms-newsletter | 5ebd8d3e1e2c85b2791d0261a954469f2548c840 | [
"BSD-3-Clause"
] | 2 | 2021-03-15T13:33:53.000Z | 2021-05-18T20:34:47.000Z | """Settings for testing emencia.django.newsletter"""
SITE_ID = 1
USE_I18N = False
ROOT_URLCONF = 'emencia.django.newsletter.urls'
DATABASES = {'default': {'NAME': 'newsletter_tests.db',
'ENGINE': 'django.db.backends.sqlite3'}}
INSTALLED_APPS = ['django.contrib.contenttypes',
'django.contrib.sites',
'django.contrib.auth',
'tagging',
'emencia.django.newsletter']
| 27.352941 | 65 | 0.584946 | 44 | 465 | 6.068182 | 0.681818 | 0.146067 | 0.258427 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011905 | 0.277419 | 465 | 16 | 66 | 29.0625 | 0.782738 | 0.098925 | 0 | 0 | 0 | 0 | 0.460048 | 0.261501 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2412e937f76bf39a77720b16f02c64cdeb94da86 | 855 | py | Python | MyExtenstion.extension/Gaochao.tab/Gaochao.panel/Wall_Modify.pulldown/Wall_CurtainWall_Test.pushbutton/script.py | gaochaowyq/MyPyRevitExtentision | 0c8b134744550889dd95b2709299b2ef5a0cea6a | [
"MIT"
] | null | null | null | MyExtenstion.extension/Gaochao.tab/Gaochao.panel/Wall_Modify.pulldown/Wall_CurtainWall_Test.pushbutton/script.py | gaochaowyq/MyPyRevitExtentision | 0c8b134744550889dd95b2709299b2ef5a0cea6a | [
"MIT"
] | null | null | null | MyExtenstion.extension/Gaochao.tab/Gaochao.panel/Wall_Modify.pulldown/Wall_CurtainWall_Test.pushbutton/script.py | gaochaowyq/MyPyRevitExtentision | 0c8b134744550889dd95b2709299b2ef5a0cea6a | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
__doc__="Create structures based on a curved surface and vertical lines"
import System
from System.Collections.Generic import List, Dictionary,IList
import sys
import clr
import os
import rpw
from rpw import revit, db, ui,DB,UI
from rpw.ui.forms import FlexForm, Label, ComboBox, TextBox, Separator, Button, Alert
import json
from MyLib import Helper
doc = __revit__.ActiveUIDocument.Document
uidoc = __revit__.ActiveUIDocument
#pick Surface
Picked= uidoc.Selection.PickObject(UI.Selection.ObjectType.Element)
PickedElementId=Picked.ElementId
Picked_Selection=db.Element.from_id(PickedElementId)
parameter=Picked_Selection.parameters.all
Unwrap_Element=Picked_Selection.unwrap()
Picked_Geometry=Unwrap_Element.get_Geometry(DB.Options())
for i in Picked_Geometry:
try:
print(Helper.CovertToM2(i.SurfaceArea))
except:
print("bad")
#print(Picked_Geometry)
| 20.853659 | 93 | 0.804678 | 112 | 855 | 5.946429 | 0.535714 | 0.067568 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002604 | 0.101754 | 855 | 41 | 94 | 20.853659 | 0.864583 | 0.064327 | 0 | 0 | 0 | 0 | 0.018821 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.416667 | 0 | 0.416667 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
2420d25edc02fc425d58bf39ce86f78c2a795cad | 325 | py | Python | blog/migrations/0003_remove_comment_approved_comment.py | noodleslove/django_blog | 301d3190e70e32fd4637720ae1ed58882ccda876 | [
"MIT"
] | null | null | null | blog/migrations/0003_remove_comment_approved_comment.py | noodleslove/django_blog | 301d3190e70e32fd4637720ae1ed58882ccda876 | [
"MIT"
] | 9 | 2020-08-16T08:04:41.000Z | 2022-03-12T00:44:59.000Z | blog/migrations/0003_remove_comment_approved_comment.py | noodleslove/django_blog | 301d3190e70e32fd4637720ae1ed58882ccda876 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.5 on 2020-05-02 01:41
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('blog', '0002_comment'),
]
operations = [
migrations.RemoveField(
model_name='comment',
name='approved_comment',
),
]
| 18.055556 | 47 | 0.587692 | 34 | 325 | 5.529412 | 0.794118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0.298462 | 325 | 17 | 48 | 19.117647 | 0.741228 | 0.138462 | 0 | 0 | 1 | 0 | 0.140288 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2427de993e3f47e157e8a27409af3be831c1ba4d | 1,286 | py | Python | src/products/api/serializers.py | bchuey/justapair | 4d71e4b80c1d48679741b1751c7a04f358e0b6a4 | [
"MIT"
] | null | null | null | src/products/api/serializers.py | bchuey/justapair | 4d71e4b80c1d48679741b1751c7a04f358e0b6a4 | [
"MIT"
] | null | null | null | src/products/api/serializers.py | bchuey/justapair | 4d71e4b80c1d48679741b1751c7a04f358e0b6a4 | [
"MIT"
] | null | null | null | from rest_framework import serializers
from products.models import Jean, Brand, Size, Style, Color
class BrandModelSerializer(serializers.ModelSerializer):
class Meta:
model = Brand
fields = [
'id',
'name',
]
class SizeModelSerializer(serializers.ModelSerializer):
class Meta:
model = Size
fields = [
'id',
'waist',
'length',
]
class StyleModelSerializer(serializers.ModelSerializer):
class Meta:
model = Style
fields = [
'id',
'style_type',
]
class ColorModelSerializer(serializers.ModelSerializer):
class Meta:
model = Color
fields = [
'id',
'hex_value',
]
class JeanModelSerializer(serializers.ModelSerializer):
colors = ColorModelSerializer(many=True)
sizes = SizeModelSerializer(many=True)
style = StyleModelSerializer()
brand_name = BrandModelSerializer()
class Meta:
model = Jean
fields = [
'id',
'title',
'description',
'price',
'colors',
'sizes',
'brand_name',
'style',
]
| 18.911765 | 59 | 0.520995 | 92 | 1,286 | 7.228261 | 0.391304 | 0.195489 | 0.105263 | 0.210526 | 0.240602 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.38958 | 1,286 | 67 | 60 | 19.19403 | 0.847134 | 0 | 0 | 0.3125 | 0 | 0 | 0.070762 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
242b9e45000310ad143682612663be0dfb107314 | 8,882 | py | Python | operator_api/transactor/tasks/process_passive_transfer_finalizations.py | liquidity-network/nocust-hub | 76f49f9b8a6c264fcbe9e0c110e98031d463c0a8 | [
"MIT"
] | 1 | 2021-08-04T06:09:46.000Z | 2021-08-04T06:09:46.000Z | operator_api/transactor/tasks/process_passive_transfer_finalizations.py | liquidity-network/nocust-hub | 76f49f9b8a6c264fcbe9e0c110e98031d463c0a8 | [
"MIT"
] | 8 | 2020-11-01T19:48:21.000Z | 2022-02-10T14:12:25.000Z | operator_api/transactor/tasks/process_passive_transfer_finalizations.py | liquidity-network/nocust-hub | 76f49f9b8a6c264fcbe9e0c110e98031d463c0a8 | [
"MIT"
] | 3 | 2020-11-01T15:59:56.000Z | 2021-09-16T07:18:18.000Z | from __future__ import absolute_import, unicode_literals
import logging
from django.conf import settings
from django.db import transaction, IntegrityError
from celery import shared_task
from celery.utils.log import get_task_logger
from contractor.interfaces import LocalViewInterface
from operator_api.crypto import hex_value
from operator_api.email import send_admin_email
from ledger.context.wallet_transfer import WalletTransferContext
from ledger.models import Transfer, ActiveState, RootCommitment, MinimumAvailableBalanceMarker
from operator_api.celery import operator_celery
logger = get_task_logger(__name__)
logger.setLevel(logging.INFO)
@shared_task
def process_passive_transfer_finalizations():
    raise NotImplementedError()  # TODO Implement
logger.info('Processing passive transfers')
if not LocalViewInterface.get_contract_parameters():
logger.error('Contract parameters not yet populated.')
return
latest_eon_number = LocalViewInterface.latest().eon_number()
# This lock is required because the ledger will be mutated as the transfers are processed
with RootCommitment.global_lock():
logger.info('Start')
process_passive_transfer_finalizations_for_eon(latest_eon_number)
def process_passive_transfer_finalizations_for_eon(operator_eon_number):
checkpoint_created = RootCommitment.objects.filter(
eon_number=operator_eon_number).exists()
with transaction.atomic():
transfers = Transfer.objects\
.filter(processed=False, swap=False, passive=True)\
.select_for_update()\
.order_by('eon_number', 'id')
for transfer in transfers:
try:
with transaction.atomic():
process_passive_transfer(
transfer, operator_eon_number, checkpoint_created)
except IntegrityError as e:
send_admin_email(
subject='Transfer Integrity Error',
content='{}'.format(e))
logger.error(e)
def should_void_transfer(transfer, wallet_view_context: WalletTransferContext, recipient_view_context: WalletTransferContext, operator_eon_number, is_checkpoint_created):
if transfer.eon_number != operator_eon_number and is_checkpoint_created:
logger.error('Transfer {} eon mismatch ({}, {})'.format(
transfer.id, transfer.eon_number, operator_eon_number))
return True
if transfer.amount < 0:
logger.error('Transfer {} has negative amount'.format(transfer.id))
return True
# Unauthorized transfer
if transfer.sender_active_state is None:
logger.error('Transfer {} no authorization'.format(transfer.id))
return True
# Invalid signature by sender
if not transfer.sender_active_state.wallet_signature.is_valid():
logger.error(
'Transfer {} invalid sender signature.'.format(transfer.id))
return True
# Ensure sender log consistency
can_append_to_sender_log = wallet_view_context.can_append_transfer()
if can_append_to_sender_log is not True:
logger.error('Sender: {}'.format(can_append_to_sender_log))
return True
# Ensure recipient log consistency
can_append_to_recipient_log = recipient_view_context.can_append_transfer()
if can_append_to_recipient_log is not True:
logger.error('Recipient: {}'.format(can_append_to_recipient_log))
return True
# Ensure transfer consistency
can_spend, currently_available_funds = wallet_view_context.can_send_transfer(
current_eon_number=operator_eon_number,
using_only_appended_funds=True)
if can_spend is not True:
logger.error(can_spend)
return True
last_sent_transfer = wallet_view_context.last_appended_outgoing_active_transfer(
operator_eon_number)
last_sent_transfer_active_state = WalletTransferContext.appropriate_transfer_active_state(
transfer=last_sent_transfer,
is_outgoing=True)
previous_spendings = last_sent_transfer_active_state.updated_spendings if last_sent_transfer else 0
updated_spendings = transfer.sender_active_state.updated_spendings
# Incorrect updated spendings
if last_sent_transfer:
if updated_spendings != previous_spendings + transfer.amount:
logger.error('Transfer {} invalid updated spendings. Expected {}, found {}.'.format(
transfer.id, previous_spendings + transfer.amount, updated_spendings))
return True
elif updated_spendings != transfer.amount:
logger.error('Transfer {} invalid initial spendings. Expected {}, found {}.'.format(
transfer.id, transfer.amount, updated_spendings))
return True
# Incorrect transfer position
last_passively_received = recipient_view_context.last_appended_incoming_passive_transfer(
operator_eon_number)
if last_passively_received:
if transfer.position != last_passively_received.position + last_passively_received.amount:
logger.error('Transfer {} invalid offset. Expected {}, found {}.'.format(
transfer.id, last_passively_received.position + last_passively_received.amount, transfer.position))
return True
elif transfer.position != 0:
logger.error('Transfer {} invalid offset. Expected {}, found {}.'.format(
transfer.id, 0, transfer.position))
return True
if transfer.sender_balance_marker.amount > currently_available_funds - transfer.amount:
logger.error(
'Transfer {} invalid concise marker balance.'.format(transfer.id))
return True
concise_balance_marker = MinimumAvailableBalanceMarker(
wallet=transfer.wallet,
eon_number=transfer.eon_number,
amount=transfer.sender_balance_marker.amount)
concise_balance_marker_checksum = hex_value(
concise_balance_marker.checksum())
if transfer.sender_balance_marker.signature.checksum != concise_balance_marker_checksum:
logger.error(
'Transfer {} invalid concise marker checksum.'.format(transfer.id))
return True
return False
def process_passive_transfer(transfer, operator_eon_number, checkpoint_created):
if transfer.wallet == transfer.recipient:
logger.info('Voiding self transfer.')
transfer.close(voided=True)
return
with transfer.lock(auto_renewal=True), transfer.wallet.lock(auto_renewal=True), transfer.recipient.lock(auto_renewal=True):
wallet_view_context = WalletTransferContext(
wallet=transfer.wallet, transfer=transfer)
recipient_view_context = WalletTransferContext(
wallet=transfer.recipient, transfer=transfer)
if should_void_transfer(transfer, wallet_view_context, recipient_view_context, operator_eon_number, checkpoint_created):
logger.info('Voiding transfer.')
transfer.close(voided=True)
return
# Invalid active state update
tx_set_list = wallet_view_context.authorized_transfers_list_shorthand(
only_appended=True,
force_append=True,
last_transfer_is_finalized=False)
tx_set_tree = WalletTransferContext.authorized_transfers_tree_from_list(
tx_set_list)
tx_set_hash = hex_value(tx_set_tree.root_hash())
highest_spendings, highest_gains = wallet_view_context.off_chain_actively_sent_received_amounts(
eon_number=transfer.eon_number,
only_appended=True)
active_state = ActiveState(
wallet=transfer.wallet,
updated_spendings=transfer.sender_active_state.updated_spendings,
updated_gains=highest_gains,
tx_set_hash=tx_set_hash,
eon_number=transfer.eon_number)
raw_checksum = active_state.checksum()
encoded_checksum = hex_value(raw_checksum)
wallet_active_state = transfer.sender_active_state
if wallet_active_state.wallet_signature.checksum != encoded_checksum:
logger.error('Transfer {} invalid sender active state checksum for {}'.format(
transfer.id, transfer.wallet.address))
transfer.close(voided=True)
return
try:
wallet_active_state.operator_signature = wallet_active_state.sign_active_state(
settings.HUB_OWNER_ACCOUNT_ADDRESS,
settings.HUB_OWNER_ACCOUNT_KEY)
except LookupError as e:
logger.error(e)
return
transfer.sender_active_state.save()
transfer.close(
complete=True,
appended=True)
operator_celery.send_task(
'auditor.tasks.on_transfer_confirmation', args=[transfer.id])
logger.info('Transfer {} processed.'.format(transfer.id))
| 40.743119 | 170 | 0.711101 | 988 | 8,882 | 6.069838 | 0.197368 | 0.037519 | 0.032016 | 0.034684 | 0.384359 | 0.221444 | 0.123729 | 0.093047 | 0.055361 | 0.020344 | 0 | 0.000575 | 0.216843 | 8,882 | 217 | 171 | 40.930876 | 0.861558 | 0.036816 | 0 | 0.228916 | 0 | 0 | 0.084738 | 0.004448 | 0 | 0 | 0 | 0.004608 | 0 | 1 | 0.024096 | false | 0.066265 | 0.072289 | 0 | 0.210843 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
243131bd4bd5c1dfc4db15f9b978b0ccdee06977 | 851 | py | Python | lib/x509Support.py | bbockelm/glideinWMS | a2b39e3d4ff6c4527efad54b1eefe728a4ec9d18 | [
"BSD-3-Clause"
] | null | null | null | lib/x509Support.py | bbockelm/glideinWMS | a2b39e3d4ff6c4527efad54b1eefe728a4ec9d18 | [
"BSD-3-Clause"
] | 3 | 2015-12-02T19:37:45.000Z | 2016-01-20T03:21:48.000Z | lib/x509Support.py | bbockelm/glideinWMS | a2b39e3d4ff6c4527efad54b1eefe728a4ec9d18 | [
"BSD-3-Clause"
] | 1 | 2015-12-01T23:02:41.000Z | 2015-12-01T23:02:41.000Z | import sys
import M2Crypto
def extract_DN(fname):
"""
Extract a Distinguished Name from an X.509 proxy.
@type fname: string
@param fname: Filename containing the X.509 proxy
"""
fd = open(fname,"r")
try:
data = fd.read()
finally:
fd.close()
while 1:
try:
data_idx = data.rindex('-----BEGIN CERTIFICATE-----')
old_data = data[:data_idx]
data = data[data_idx:]
except ValueError:
print "%s not a valid certificate file" % fname
sys.exit(3)
m = M2Crypto.X509.load_cert_string(data)
if m.check_ca():
# oops, this is the CA part
# get the previous in the chain
data = old_data
else:
break # ok, found it, end the loop
return str(m.get_subject())
| 23.638889 | 65 | 0.541716 | 109 | 851 | 4.137615 | 0.642202 | 0.070953 | 0.039911 | 0.066519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023723 | 0.356052 | 851 | 35 | 66 | 24.314286 | 0.79927 | 0.096357 | 0 | 0.090909 | 0 | 0 | 0.095008 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.090909 | null | null | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
24318cc53a456f38e8fd47e63bf8cf78000efea4 | 496 | py | Python | saxo/Utils.py | ayxue/BaiduSaxoOpenAPI | d042366bb33ebdc4471b29e167b01c4cb7cb298d | [
"Apache-2.0"
] | null | null | null | saxo/Utils.py | ayxue/BaiduSaxoOpenAPI | d042366bb33ebdc4471b29e167b01c4cb7cb298d | [
"Apache-2.0"
] | null | null | null | saxo/Utils.py | ayxue/BaiduSaxoOpenAPI | d042366bb33ebdc4471b29e167b01c4cb7cb298d | [
"Apache-2.0"
] | null | null | null | from collections import namedtuple
class Utils:
def __init__(self):
pass
@staticmethod
  def DicToObj(name, dic=None):
    dic = dic if dic is not None else {}
    keys = list(dic.keys())
    vals = list(dic.values())
    Data = namedtuple(name, keys)
    return Data._make(vals)
@staticmethod
  def DicToObjList(name, dicList=None):
    dicList = dicList if dicList is not None else []
    ret = []
for item in dicList:
if isinstance(item, dict):
ret.append(Utils.DicToObj(name, item))
return ret | 24.8 | 54 | 0.5625 | 53 | 496 | 5.169811 | 0.584906 | 0.109489 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.324597 | 496 | 20 | 55 | 24.8 | 0.81791 | 0 | 0 | 0.117647 | 0 | 0 | 0.008048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0.058824 | 0.058824 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
2431b5a56f94a08acd174c2d3c95aea9a187bdfa | 1,168 | py | Python | All Python Practice - IT 89/Python psets/pset2/ps2a.py | mrouhi13/my-mit-python-practice | f3b29418576fec54d3f9f55155aa8f2096ad974a | [
"MIT"
] | null | null | null | All Python Practice - IT 89/Python psets/pset2/ps2a.py | mrouhi13/my-mit-python-practice | f3b29418576fec54d3f9f55155aa8f2096ad974a | [
"MIT"
] | null | null | null | All Python Practice - IT 89/Python psets/pset2/ps2a.py | mrouhi13/my-mit-python-practice | f3b29418576fec54d3f9f55155aa8f2096ad974a | [
"MIT"
] | null | null | null | #!/usr/bin/python
#----------------------------------------
# Name: ps2a.py
# Coded by Majid Roohi
# Last Modified: 03/12/2010 , 03:28 PM
# Description: -
#----------------------------------------
def Diophantine():
    '''Read the number of McNuggets N from the user and test N, N-1, ...
    iteratively. If some combination 6*a + 9*b + 20*c equals the current N
    (A becomes False), that N can be bought in exact quantity, so decrement
    N and repeat. The first N for which A stays True is the answer: the
    largest number of McNuggets that cannot be bought in exact quantity.'''
    N = input('\nHow many McNuggets you need to buy? ')
    while N >= 0:
        A = True
        for a in range(0, (N/6)+1):
            for b in range(0, ((N-6*a)/9)+1):
                for c in range(0, ((N-6*a-9*b)/20)+1):
                    if N == 6*a+9*b+20*c:
                        A = False
        if A == True:
            print '\nLargest number of McNuggets that cannot be bought in exact quantity: ', N
            break
        N -= 1
| 30.736842 | 145 | 0.455479 | 155 | 1,168 | 3.432258 | 0.509677 | 0.037594 | 0.095865 | 0.084586 | 0.216165 | 0.154135 | 0.045113 | 0 | 0 | 0 | 0 | 0.04742 | 0.38613 | 1,168 | 37 | 146 | 31.567568 | 0.694561 | 0.160103 | 0 | 0 | 0 | 0 | 0.167692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
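The Python 2 search above relies on `/` being integer division. A Python 3 rework with explicit `//`, using only the stdlib; for packs of 6, 9 and 20 the classical answer is 43:

```python
def can_buy(n):
    # n is buyable if n = 6*a + 9*b + 20*c for some non-negative a, b, c;
    # enumerate a and c, then check whether the remainder is a multiple of 9
    return any(
        (n - 6 * a - 20 * c) >= 0 and (n - 6 * a - 20 * c) % 9 == 0
        for a in range(n // 6 + 1)
        for c in range(n // 20 + 1)
    )

def largest_unbuyable(limit):
    # scan downwards, like ps2a.py: the first n with no combination wins
    for n in range(limit, -1, -1):
        if not can_buy(n):
            return n
    return None

print(largest_unbuyable(100))  # 43
```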
243867cc13462e0b51862074214a0f0d1328286c | 503 | py | Python | shop/migrations/0034_auto_20180321_1311.py | sumangaire52/dammideal | 83c0a6e962b51405016ed83c4597ebcd9dee0d04 | [
"Apache-2.0"
] | null | null | null | shop/migrations/0034_auto_20180321_1311.py | sumangaire52/dammideal | 83c0a6e962b51405016ed83c4597ebcd9dee0d04 | [
"Apache-2.0"
] | 7 | 2020-06-05T17:55:00.000Z | 2022-01-13T00:44:26.000Z | shop/migrations/0034_auto_20180321_1311.py | sumangaire52/dammideal | 83c0a6e962b51405016ed83c4597ebcd9dee0d04 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.1 on 2018-03-21 07:26

from django.db import migrations
import django_resized.forms


class Migration(migrations.Migration):

    dependencies = [
        ('shop', '0033_auto_20180321_1242'),
    ]

    operations = [
        migrations.AlterField(
            model_name='carousal',
            name='image1',
            field=django_resized.forms.ResizedImageField(crop=None, force_format=None, keep_meta=True, quality=100, size=[1170, 360], upload_to=''),
        ),
    ]
| 25.15 | 148 | 0.646123 | 59 | 503 | 5.355932 | 0.813559 | 0.082278 | 0.113924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109091 | 0.234592 | 503 | 19 | 149 | 26.473684 | 0.711688 | 0.089463 | 0 | 0 | 1 | 0 | 0.089912 | 0.050439 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2438e1e3b56ec7e46d207f4d8d21dee6ca8b3830 | 2,242 | py | Python | LAB/predictive_keyboard/medical/run_mimic.py | ipavlopoulos/lm | b9ad7d98be47c0f1a6b446a090d1fce488bb2e3f | [
"Apache-2.0"
] | null | null | null | LAB/predictive_keyboard/medical/run_mimic.py | ipavlopoulos/lm | b9ad7d98be47c0f1a6b446a090d1fce488bb2e3f | [
"Apache-2.0"
] | 5 | 2020-02-11T21:23:02.000Z | 2022-02-10T02:04:14.000Z | LAB/predictive_keyboard/medical/run_mimic.py | ipavlopoulos/lm | b9ad7d98be47c0f1a6b446a090d1fce488bb2e3f | [
"Apache-2.0"
] | 1 | 2020-09-25T15:47:08.000Z | 2020-09-25T15:47:08.000Z | import pandas as pd
from sklearn.model_selection import train_test_split
from markov import models as markov_models
from neural import models as neural_models
from collections import Counter
from scipy.stats import sem
from toolkit import *

# todo: add FLAGS
if __name__ == "__main__":
    use_radiology_only = True
    use_impressions_only = False
    use_preprocessing = True
    test_size = 100
    train_size = 2853
    vocab_size = 10000

    # PARSE MIMIC
    data = pd.read_csv("./DATA/NOTEEVENTS.csv.gz")

    # filter the DATA
    if use_radiology_only:
        # Use only reports about Radiology
        data = data[data.CATEGORY == "Radiology"]
    if use_impressions_only:
        # Use only the IMPRESSION section from each report
        data = data[data.TEXT.str.contains("IMPRESSION:")]
        data.TEXT = data.TEXT.apply(lambda report: report.split("IMPRESSION:")[1])
    data = data.sample(train_size + test_size, random_state=42)
    train, test = train_test_split(data, test_size=test_size, random_state=42)
    texts = train.TEXT

    # preprocess
    if use_preprocessing:
        texts = texts.apply(preprocess)

    # take the words (efficiently)
    words = " ".join(texts.to_list()).split()

    # create a vocabulary
    WF = Counter(words)
    print("|V|:", len(WF), "(before reducing the vocabulary)")
    print("FREQUENT WORDS:", WF.most_common(10))
    print("RARE WORDS:", WF.most_common()[:-10:-1])
    V, _ = zip(*Counter(words).most_common(vocab_size))
    V = set(V)

    # substitute any unknown words in the texts
    _words = fill_unk(words_to_fill=words, lexicon=V)
    assert len(set(words)) == len(set(_words))
    test["WORDS"] = test.apply(lambda row: fill_unk(lexicon=V, words_to_fill=preprocess(row.TEXT).split()), 1)

    # train the N-Grams for N: 1 to 9
    for N in range(1, 10):
        wlm = markov_models.LM(gram=markov_models.WORD, n=N).train(words)
        print(f"WER of {N}-GRAM:{test.WORDS.apply(lambda words: accuracy(words, wlm)).mean()}")

    # Run a Neural LM
    rnn = neural_models.RNN(epochs=1000)
    rnn.train(words)
    accuracies = test.WORDS.apply(lambda words: 1 - rnn.accuracy(" ".join(words)))
    print(f'WER(RNNLM):{accuracies.mean()}±{sem(accuracies.to_list())}')
| 33.462687 | 110 | 0.67752 | 323 | 2,242 | 4.55418 | 0.362229 | 0.027192 | 0.019035 | 0.024473 | 0.093814 | 0.03399 | 0 | 0 | 0 | 0 | 0 | 0.018878 | 0.196699 | 2,242 | 66 | 111 | 33.969697 | 0.797335 | 0.123104 | 0 | 0 | 0 | 0.02381 | 0.134596 | 0.058854 | 0 | 0 | 0 | 0.015152 | 0.02381 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.119048 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
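`markov_models.LM` above is project code, but the word-level n-gram idea it implements can be sketched with a stdlib bigram model that predicts the next word for the keyboard (the toy radiology phrases below are invented, not drawn from MIMIC):

```python
from collections import Counter, defaultdict

def train_bigram(words):
    # map each word to a Counter of the words observed right after it
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    # most frequent continuation seen in training, or None for unseen words
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

# toy corpus, invented for illustration
corpus = "no acute process no acute process no focal consolidation".split()
model = train_bigram(corpus)
print(predict(model, "no"))  # acute
```

Word-prediction accuracy (the complement of the script's WER) is then the fraction of positions where `predict(model, prev)` equals the word that actually followed.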
243b2fd050a284e57fb7cc123d04a641abd1bd34 | 6,397 | py | Python | endpoint/endpoint.py | BU-NU-CLOUD-SP16/Container-Safety-Determination | e99a5039ddf1d27362fb6db0c1c1a69814bdc36d | [
"MIT"
] | 14 | 2016-05-26T23:12:24.000Z | 2022-03-13T18:58:06.000Z | endpoint/endpoint.py | miradam/docker-introspection | e99a5039ddf1d27362fb6db0c1c1a69814bdc36d | [
"MIT"
] | 4 | 2016-02-25T20:29:50.000Z | 2016-04-25T03:59:49.000Z | endpoint/endpoint.py | BU-NU-CLOUD-SP16/Container-Safety-Determination | e99a5039ddf1d27362fb6db0c1c1a69814bdc36d | [
"MIT"
] | 4 | 2016-06-20T19:40:10.000Z | 2021-07-26T13:13:21.000Z | #####################################################################
# File: endpoint.py
# Author: Rahul Sharma <rahuls@ccs.neu.edu>
# Desc:  Configures a REST API endpoint listening on the configured port.
#        Captures the notifications sent by Docker registry v2,
#        processes them and identifies the newly added or modified
#        image's name and tag. This information is then passed to
#        other modules to download the image and calculate the sdhash
#        of all the files within the docker image.
# Target platform: Linux
#
# Dependencies
# -----------------
# 1. flask
# 2. flask-cors
#
# Installing dependencies
# ----------------------------------
# bash$ sudo pip install flask && pip install -U flask-cors
#
#####################################################################
from flask import Flask
from flask import request
from flask_cors import CORS
from elasticsearch import Elasticsearch
import ConfigParser
import requests
import hashlib
import json
import sys
import os
sys.path.append(os.getcwd() + "/../")
from utils import hash_and_index, check_container
from scripts.elasticdatabase import ElasticDatabase
from scripts.esCfg import EsCfg
CUR_DIR = ""
app = Flask(__name__)
CORS(app)
APP_ROOT = os.path.dirname(os.path.abspath(__file__))
CONFIG_FILE = os.path.join(APP_ROOT, 'settings.ini')
config = ConfigParser.ConfigParser()
config.read(CONFIG_FILE)
username = config.get('registry', 'username')
password = config.get('registry', 'password')
auth = (username, password)
@app.route("/")
def registry_endpoint():
    return "Docker registry endpoint!\n"
@app.route("/test", methods=['POST'])
def test():
    # log()
    # change to CUR_DIR
    os.chdir(CUR_DIR)
    data = json.loads(request.data)
    for event in data["events"]:
        # modifications to images are push events
        if event["action"] == "push":
            repository = event["target"]["repository"]
            url = event["target"]["url"]
            if "manifests" in url:
                # Get the image-blocks in this manifest
                image_blocks = []
                image_manifest = requests.get(url, verify=False, auth=auth)
                for entry in image_manifest.json()["layers"]:
                    image_blocks.append(entry["digest"])
                # Get all tags. Syntax: GET /v2/<name>/tags/list
                temp = url.split("manifests/")
                tags_url = temp[0] + "tags/list"
                tags = requests.get(tags_url, verify=False, auth=auth)
                # Iterate over each tag and get its blocks. If the blocks of a
                # tag match those of the manifest, this tag is the latest
                # added/modified one. This is just a hack, since a proper API
                # is not available.
                # Syntax for fetching a manifest: GET /v2/<name>/manifests/<tag>
                for tag in tags.json()["tags"]:
                    temp_req = temp[0] + "manifests/" + tag
                    tag_manifest = requests.get(temp_req, verify=False, auth=auth)
                    blocks = []
                    fsLayers = tag_manifest.json()["fsLayers"]
                    for layer in fsLayers:
                        blocks.append(layer["blobSum"])
                    if sorted(image_blocks) == sorted(blocks):
                        print "New image uploaded is: %s | tag: %s" % (repository, tag)
                        host = temp[0].split("/")[2]
                        image = host + "/" + repository + ":" + tag
                        if tag == "golden":
                            hash_and_index(image, "store")
                        else:
                            hash_and_index(image, "compare")
                        break
    return "Done", 200
@app.route('/scan/<container_id>')
def scan_container(container_id):
    result = ''
    try:
        result = check_container(container_id)
    except Exception as e:
        result = json.dumps({'error': 'exception: ' + str(e)})
    return result, 200
@app.route('/get_judge_res/<judge_image_dir>')
def get_judge_res(judge_image_dir):
    es = Elasticsearch(EsCfg)
    judge_image_dir = 'judgeresult:' + judge_image_dir
    search_size = 20
    search_offset = 0
    print request.args
    try:
        if 'offset' in request.args:
            search_offset = int(request.args.get('offset'))
        if 'size' in request.args:
            search_size = int(request.args.get('size'))
        res_index = es.search(
            index=judge_image_dir,
            size=search_size,
            from_=search_offset
        )
    except:
        del(es)
        return 'Error: index does not exist\n'
    res_lst = []
    for item in res_index['hits']['hits']:
        res_lst.append(item['_source']['file'])
    res_dict = {
        'total': res_index['hits']['total'],
        'file_list': res_lst,
        'from_': search_offset,
        'size': len(res_index['hits']['hits'])
    }
    json_res = json.dumps(res_dict)
    del(es)
    return json_res
@app.route('/correct_false_warning/<judge_image_dir>')
def correct_false_warning(judge_image_dir):
    es = Elasticsearch(EsCfg)
    if 'file_name' in request.args:
        md5_file_name = hashlib.md5(request.args['file_name']).hexdigest()
        print md5_file_name + ' for ' + request.args['file_name']
    else:
        del(es)
        return 'Error: no file name in request\n'
    judge_image_dir = 'judgeresult:' + judge_image_dir
    try:
        res = es.delete(index=judge_image_dir, doc_type='judgeResult', id=md5_file_name)
    except:
        del(es)
        return 'Error: file does not exist\n'
    del(es)
    return json.dumps(res['_shards'])
# which machine should run the docker image, remote or local?
# and should the args be passed as a list?
@app.route('/docker_run/')
def docker_run():
    es = Elasticsearch(EsCfg)
    # check es again
    # check finished, run docker image
    try:
        image_name = request.args.get('image_name')
    except:
        return 'can not get image_name in request\n'
    arg_lst = request.args.getlist('args')
    cmd = ['docker run', image_name] + arg_lst
    cmd = ' '.join(cmd)
    os.system(cmd)
    return 'done\n'
def log():
    print request.headers
    print request.args
    print request.data

if __name__ == "__main__":
    CUR_DIR = os.getcwd()
    app.run("0.0.0.0", 9999)
| 31.357843 | 94 | 0.5859 | 784 | 6,397 | 4.630102 | 0.298469 | 0.033333 | 0.035813 | 0.015702 | 0.086501 | 0.062259 | 0.020386 | 0 | 0 | 0 | 0 | 0.006457 | 0.273722 | 6,397 | 203 | 95 | 31.512315 | 0.77486 | 0.182273 | 0 | 0.153285 | 0 | 0 | 0.137208 | 0.014235 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.014599 | 0.094891 | null | null | 0.043796 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
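The `/test` route above reads `events[*].action` and `events[*].target` out of the registry v2 notification body. That extraction can be sketched self-contained with the stdlib (the sample payload values are made up; only the key names come from the code above):

```python
import json

# a minimal stand-in for a Docker Registry v2 notification envelope,
# trimmed to the fields the endpoint actually reads
payload = json.dumps({
    "events": [
        {"action": "push",
         "target": {"repository": "library/alpine",
                    "url": "https://registry.test/v2/library/alpine/manifests/latest"}},
        {"action": "pull",
         "target": {"repository": "library/busybox",
                    "url": "https://registry.test/v2/library/busybox/manifests/latest"}},
    ]
})

def pushed_repositories(raw):
    # return the repository names of all push events in one notification
    data = json.loads(raw)
    return [e["target"]["repository"]
            for e in data.get("events", [])
            if e.get("action") == "push"]

print(pushed_repositories(payload))  # ['library/alpine']
```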