hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b918c27f2b168efd69908773e44475244b686dd0 | 3,379 | py | Python | imageproc_OE_IF_quant/2_annotate_extracted_cells.py | hshayya/2022_Shayya_UPR_Guidance | b9a305a147a105c3ac9c0173e06b94f66e4a6102 | [
"MIT"
] | null | null | null | imageproc_OE_IF_quant/2_annotate_extracted_cells.py | hshayya/2022_Shayya_UPR_Guidance | b9a305a147a105c3ac9c0173e06b94f66e4a6102 | [
"MIT"
] | null | null | null | imageproc_OE_IF_quant/2_annotate_extracted_cells.py | hshayya/2022_Shayya_UPR_Guidance | b9a305a147a105c3ac9c0173e06b94f66e4a6102 | [
"MIT"
] | null | null | null | import xml.etree.ElementTree as ET
import csv
import os
import re
from ij import IJ
from loci.plugins.in import ImporterOptions
from loci.plugins import BF
from ij.plugin import ImagesToStack
from ij import io
#Records metadata (x,y location) for cells that were extracted with 1_find_extract_cells.py
#metadata will be used in subsequent analysis to cluster cells from similar locations on the section -> semi-quantitative, local analysis
def parse_cellcounter_to_dict(fpath):
'''Parse Cell-Counter Xml file to Dictionary
Inputs:
fpath (str) path to xml file on disk
Values:
(dict). Keys 'x_cal', 'y_cal' = (float) calibrations in each axis.
Keys '1'-'8' = (lists) of tuples containing cell positions in the form (x,y)
'''
tree = ET.parse(fpath)
cells_dict = {}
cells_dict['x_cal'] = float(tree.find('./Image_Properties/X_Calibration').text)
cells_dict['y_cal'] = float(tree.find('./Image_Properties/Y_Calibration').text)
rt = tree.find('Marker_Data') #re-root the tree
for type_ in rt.iter('Marker_Type'):
cells = []
for marker_ in type_.iter('Marker'):
cells.append((int(marker_[0].text), int(marker_[1].text)))
#
cells_dict[type_.find('Type').text] = cells
return cells_dict
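#Example usage (hypothetical path; values are illustrative only, shape follows the docstring above):
# d = parse_cellcounter_to_dict('/path/to/CellCounter_anim1-Downsampled.xml')
# d['x_cal'] -> e.g. 1.29 (um/px); d['7'] -> [(512, 204), ...] marker positions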
#Load Xml Files
xml_locs = ['/path/to/xml/files'] #same as used in find_extract_cells
xml_files = [os.path.join(base_, f) for base_ in xml_locs for f in os.listdir(base_) if f[-3:] == 'xml' and f[0] != '.']
#Work through each xml file
f_out_path = '/path/to/annotation/out.tsv'
with open(f_out_path,'w') as fout:
fout.write('\t'.join(['cell','x_um','y_um']))
for e,xml_ in enumerate(xml_files):
print 'Working on file: ' + os.path.split(xml_)[1] + '...' + str(e+1) + '/' + str(len(xml_files))
#Find the orig .nd2 file, copied from find_extract_cells.py, see that code for more details.
orig_f_name = re.search('(?<=CellCounter_).*(?=\\-Downsampled)', os.path.split(xml_)[1]).group() + '.nd2'
search_dir = '/'.join(os.path.split(xml_)[0].split('/')[:-1])
files_found = [os.path.join(root, f) for (root, dirs, files) in os.walk(search_dir) for f in files if f == orig_f_name]
if len(files_found) == 1:
fullres_image = files_found[0]
else:
print "Could not find fullres image."
raise ValueError('Found 0 or >1 matching file')
#Generate the original inputs that were passed to extract_cells
input_item = (re.search('(?<=_).*',orig_f_name[:-4]).group(), {'fullres':fullres_image, 'counter':parse_cellcounter_to_dict(xml_)})
input_dict = input_item
types_of_interest={'7':'tdtom','8':'gfp'}
#Copied from the "Extract Cells" step, recovering positional info and writing it to disk instead of extracting each cell to a small image.
anim, vals = input_dict
#Loop through Cells and Annotate.
for cell_type, cell_label in types_of_interest.iteritems():
print 'Working on cell_type ' + cell_label
for i in range(len(vals['counter'][cell_type])):
print 'Iteration ' + str(i+1) + '/' + str(len(vals['counter'][cell_type]))
#Convert downsampled px -> calibrated um (matches the x_um/y_um output columns)
x_full_px = vals['counter'][cell_type][i][0] * vals['counter']['x_cal'] #in um
y_full_px = vals['counter'][cell_type][i][1] * vals['counter']['y_cal'] #in um
#Write Information
out_title = '_'.join([anim, cell_label, str(i)])
fout.write('\n' + '\t'.join([out_title, str(x_full_px), str(y_full_px)]))
#Final tsv of form cell_label,x,y.
 | 40.710843 | 137 | 0.693992 | 550 | 3,379 | 4.081818 | 0.321818 | 0.021381 | 0.026726 | 0.033853 | 0.083742 | 0.05078 | 0.023163 | 0 | 0 | 0 | 0 | 0.008354 | 0.149748 | 3,379 | 83 | 138 | 40.710843 | 0.77306 | 0.213081 | 0 | 0 | 0 | 0 | 0.171895 | 0.053534 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1875 | null | null | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9206e8febc3abecc98cfdec65d8f8f8f61e43fc | 782 | py | Python | graphql_social_auth/mutations.py | deepsourcelabs/django-graphql-social-auth | a0cc7715144dc289ccb4d2430e7c3b94fc1dffba | [
"MIT"
] | 1 | 2021-09-03T11:55:33.000Z | 2021-09-03T11:55:33.000Z | graphql_social_auth/mutations.py | deepsourcelabs/django-graphql-social-auth | a0cc7715144dc289ccb4d2430e7c3b94fc1dffba | [
"MIT"
] | null | null | null | graphql_social_auth/mutations.py | deepsourcelabs/django-graphql-social-auth | a0cc7715144dc289ccb4d2430e7c3b94fc1dffba | [
"MIT"
] | null | null | null | import graphene
from graphql_jwt.decorators import setup_jwt_cookie
from . import mixins, types
from .decorators import social_auth
class SocialAuthMutation(mixins.SocialAuthMixin, graphene.Mutation):
social = graphene.Field(types.SocialType)
class Meta:
abstract = True
class Arguments:
provider = graphene.String(required=True)
code = graphene.String(required=True)
@classmethod
@setup_jwt_cookie
@social_auth
def mutate(cls, root, info, social, **kwargs):
return cls.resolve(root, info, social, **kwargs)
class SocialAuth(mixins.ResolveMixin, SocialAuthMutation):
"""Social Auth Mutation"""
class SocialAuthJWT(mixins.JSONWebTokenMixin, SocialAuthMutation):
"""Social Auth for JSON Web Token (JWT)"""
| 25.225806 | 68 | 0.726343 | 86 | 782 | 6.523256 | 0.5 | 0.071301 | 0.049911 | 0.092692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181586 | 782 | 30 | 69 | 26.066667 | 0.876563 | 0.07289 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.222222 | 0.055556 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b923cd998b5a122c2fa8e86b09305b2b291d6507 | 3,873 | py | Python | platformio/commands/home/run.py | Granjow/platformio-core | 71ae579bc07b2e11fec16acda482dea04bc3a359 | [
"Apache-2.0"
] | 4,744 | 2016-11-28T14:37:47.000Z | 2022-03-31T12:35:56.000Z | platformio/commands/home/run.py | Granjow/platformio-core | 71ae579bc07b2e11fec16acda482dea04bc3a359 | [
"Apache-2.0"
] | 3,424 | 2016-11-27T22:45:41.000Z | 2022-03-31T21:40:03.000Z | platformio/commands/home/run.py | Granjow/platformio-core | 71ae579bc07b2e11fec16acda482dea04bc3a359 | [
"Apache-2.0"
] | 576 | 2016-12-01T18:48:22.000Z | 2022-03-30T02:27:35.000Z | # Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from urllib.parse import urlparse
import click
import uvicorn
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.responses import PlainTextResponse
from starlette.routing import Mount, Route, WebSocketRoute
from starlette.staticfiles import StaticFiles
from starlette.status import HTTP_403_FORBIDDEN
from platformio.commands.home.rpc.handlers.account import AccountRPC
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.handlers.project import ProjectRPC
from platformio.commands.home.rpc.server import WebSocketJSONRPCServerFactory
from platformio.compat import aio_get_running_loop
from platformio.exception import PlatformioException
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import force_exit
class ShutdownMiddleware:
def __init__(self, app):
self.app = app
async def __call__(self, scope, receive, send):
if scope["type"] == "http" and b"__shutdown__" in scope.get("query_string", {}):
await shutdown_server()
await self.app(scope, receive, send)
async def shutdown_server(_=None):
aio_get_running_loop().call_later(0.5, force_exit)
return PlainTextResponse("Server has been shutdown!")
async def protected_page(_):
return PlainTextResponse(
"Protected PlatformIO Home session", status_code=HTTP_403_FORBIDDEN
)
def run_server(host, port, no_open, shutdown_timeout, home_url):
contrib_dir = get_core_package_dir("contrib-piohome")
if not os.path.isdir(contrib_dir):
raise PlatformioException("Invalid path to PIO Home Contrib")
ws_rpc_factory = WebSocketJSONRPCServerFactory(shutdown_timeout)
ws_rpc_factory.addObjectHandler(AccountRPC(), namespace="account")
ws_rpc_factory.addObjectHandler(AppRPC(), namespace="app")
ws_rpc_factory.addObjectHandler(IDERPC(), namespace="ide")
ws_rpc_factory.addObjectHandler(MiscRPC(), namespace="misc")
ws_rpc_factory.addObjectHandler(OSRPC(), namespace="os")
ws_rpc_factory.addObjectHandler(PIOCoreRPC(), namespace="core")
ws_rpc_factory.addObjectHandler(ProjectRPC(), namespace="project")
path = urlparse(home_url).path
routes = [
WebSocketRoute(path + "wsrpc", ws_rpc_factory, name="wsrpc"),
Route(path + "__shutdown__", shutdown_server, methods=["POST"]),
Mount(path, StaticFiles(directory=contrib_dir, html=True), name="static"),
]
if path != "/":
routes.append(Route("/", protected_page))
uvicorn.run(
Starlette(
middleware=[Middleware(ShutdownMiddleware)],
routes=routes,
on_startup=[
lambda: click.echo(
"PIO Home has been started. Press Ctrl+C to shutdown."
),
lambda: None if no_open else click.launch(home_url),
],
),
host=host,
port=port,
log_level="warning",
)
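# Example invocation (illustrative values only; real defaults come from the
# `pio home` CLI, not from this sketch):
#   run_server("127.0.0.1", 8008, no_open=True, shutdown_timeout=60,
#              home_url="http://127.0.0.1:8008/")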
| 38.73 | 88 | 0.737155 | 479 | 3,873 | 5.803758 | 0.386221 | 0.060432 | 0.038849 | 0.07482 | 0.103597 | 0.093165 | 0 | 0 | 0 | 0 | 0 | 0.004992 | 0.172476 | 3,873 | 99 | 89 | 39.121212 | 0.862403 | 0.150529 | 0 | 0.028169 | 0 | 0 | 0.079365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028169 | false | 0 | 0.309859 | 0 | 0.380282 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b924107dfd6ae9e56411cce662afa3db86b021e5 | 11,450 | py | Python | appengine/components/components/machine_provider/rpc_messages.py | stefb965/luci-py | e0a8a5640c4104e5c90781d833168aa8a8d1f24d | [
"Apache-2.0"
] | 1 | 2017-10-30T15:08:10.000Z | 2017-10-30T15:08:10.000Z | appengine/components/components/machine_provider/rpc_messages.py | stefb965/luci-py | e0a8a5640c4104e5c90781d833168aa8a8d1f24d | [
"Apache-2.0"
] | null | null | null | appengine/components/components/machine_provider/rpc_messages.py | stefb965/luci-py | e0a8a5640c4104e5c90781d833168aa8a8d1f24d | [
"Apache-2.0"
] | 1 | 2020-07-05T19:54:40.000Z | 2020-07-05T19:54:40.000Z | # Copyright 2015 The LUCI Authors. All rights reserved.
# Use of this source code is governed under the Apache License, Version 2.0
# that can be found in the LICENSE file.
"""Messages for the Machine Provider API."""
# pylint: disable=unused-wildcard-import, wildcard-import
from protorpc import messages
from components.machine_provider.dimensions import *
from components.machine_provider.instructions import *
from components.machine_provider.policies import *
class CatalogMachineRetrievalRequest(messages.Message):
"""Represents a request to retrieve a machine from the catalog."""
# Hostname of the machine to retrieve.
hostname = messages.StringField(1, required=True)
# Backend which added the machine.
backend = messages.EnumField(Backend, 2)
class CatalogMachineRetrievalResponse(messages.Message):
"""Represents a response to a catalog machine retrieval request."""
# Dimensions instance specifying what sort of machine this is.
dimensions = messages.MessageField(Dimensions, 1)
# Policies governing this machine.
policies = messages.MessageField(Policies, 2)
# State of the CatalogMachineEntry.
state = messages.StringField(3)
# Cloud Pub/Sub subscription the machine must listen to for instructions.
pubsub_subscription = messages.StringField(4)
# Project the Cloud Pub/Sub subscription exists in.
pubsub_subscription_project = messages.StringField(5)
# Cloud Pub/Sub topic the machine must be subscribed to.
pubsub_topic = messages.StringField(6)
# Project the Cloud Pub/Sub topic exists in.
pubsub_topic_project = messages.StringField(7)
# Timestamp indicating lease expiration seconds from epoch in UTC.
lease_expiration_ts = messages.IntegerField(8)
class CatalogMachineAdditionRequest(messages.Message):
"""Represents a request to add a machine to the catalog.
dimensions.backend must be specified.
dimensions.hostname must be unique per backend.
"""
# Dimensions instance specifying what sort of machine this is.
dimensions = messages.MessageField(Dimensions, 1, required=True)
# Policies instance specifying machine-specific configuration.
policies = messages.MessageField(Policies, 2, required=True)
class CatalogMachineBatchAdditionRequest(messages.Message):
"""Represents a batched set of CatalogMachineAdditionRequests.
dimensions.backend must be specified in each CatalogMachineAdditionRequest.
dimensions.hostname must be unique per backend.
"""
# CatalogMachineAdditionRequest instances to batch together.
requests = messages.MessageField(
CatalogMachineAdditionRequest, 1, repeated=True)
class CatalogMachineDeletionRequest(messages.Message):
"""Represents a request to delete a machine in the catalog."""
# Dimensions instance specifying what sort of machine this is.
dimensions = messages.MessageField(Dimensions, 1, required=True)
class CatalogManipulationRequestError(messages.Enum):
"""Represents an error in a catalog manipulation request."""
# Per backend, hostnames must be unique in the catalog.
HOSTNAME_REUSE = 1
# Tried to lookup an entry that didn't exist.
ENTRY_NOT_FOUND = 2
# Didn't specify a backend.
UNSPECIFIED_BACKEND = 3
# Specified backend didn't match the backend originating the request.
MISMATCHED_BACKEND = 4
# Didn't specify a hostname.
UNSPECIFIED_HOSTNAME = 5
# Proposed Cloud Pub/Sub topic was invalid.
INVALID_TOPIC = 6
# Proposed Cloud Pub/Sub project was invalid.
INVALID_PROJECT = 7
# Didn't specify a Cloud Pub/Sub topic.
UNSPECIFIED_TOPIC = 8
# Attempted to delete a leased machine.
LEASED = 9
class CatalogManipulationResponse(messages.Message):
"""Represents a response to a catalog manipulation request."""
# CatalogManipulationRequestError instance indicating an error with the
# request, or None if there is no error.
error = messages.EnumField(CatalogManipulationRequestError, 1)
# CatalogMachineAdditionRequest this response is in reference to.
machine_addition_request = messages.MessageField(
CatalogMachineAdditionRequest, 2)
# CatalogMachineDeletionRequest this response is in reference to.
machine_deletion_request = messages.MessageField(
CatalogMachineDeletionRequest, 3)
class CatalogBatchManipulationResponse(messages.Message):
"""Represents a response to a batched catalog manipulation request."""
responses = messages.MessageField(
CatalogManipulationResponse, 1, repeated=True)
class LeaseRequest(messages.Message):
"""Represents a request for a lease on a machine."""
# Per-user unique ID used to deduplicate requests.
request_id = messages.StringField(1, required=True)
# Dimensions instance specifying what sort of machine to lease.
dimensions = messages.MessageField(Dimensions, 2, required=True)
# Desired length of the lease in seconds.
duration = messages.IntegerField(3)
# Cloud Pub/Sub topic name to communicate on regarding this request.
pubsub_topic = messages.StringField(4)
# Cloud Pub/Sub project name to communicate on regarding this request.
pubsub_project = messages.StringField(5)
# Instructions to give the machine once it's been leased.
on_lease = messages.MessageField(Instruction, 6)
# UTC seconds from epoch when lease should expire.
lease_expiration_ts = messages.IntegerField(7)
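# Example construction (illustrative; the exact Dimensions fields come from the
# wildcard-imported components.machine_provider.dimensions module):
#   request = LeaseRequest(request_id='req-1', dimensions=Dimensions(...),
#                          duration=3600)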
class BatchedLeaseRequest(messages.Message):
"""Represents a batched set of LeaseRequests."""
# LeaseRequest instances to batch together.
requests = messages.MessageField(LeaseRequest, 1, repeated=True)
class LeaseRequestError(messages.Enum):
"""Represents an error in a LeaseRequest."""
# Request IDs are intended to be unique.
# Reusing a request ID in a different request is an error.
REQUEST_ID_REUSE = 1
# Proposed Cloud Pub/Sub topic was invalid.
INVALID_TOPIC = 2
# Proposed Cloud Pub/Sub project was invalid.
INVALID_PROJECT = 3
# Didn't specify a Cloud Pub/Sub topic.
UNSPECIFIED_TOPIC = 4
# Request couldn't be processed in time.
DEADLINE_EXCEEDED = 5
# Miscellaneous transient error.
TRANSIENT_ERROR = 6
# Mutually exclusive duration and lease_expiration_ts both specified.
MUTUAL_EXCLUSION_ERROR = 7
# Proposed duration was zero or negative.
NONPOSITIVE_DEADLINE = 8
# Proposed expiration time is not in the future.
LEASE_EXPIRATION_TS_ERROR = 9
# Neither duration nor lease_expiration_ts were specified.
LEASE_LENGTH_UNSPECIFIED = 10
# Requested lease duration is too long.
LEASE_TOO_LONG = 11
class LeaseRequestState(messages.Enum):
"""Represents the state of a LeaseRequest."""
# LeaseRequest has been received, but not processed yet.
UNTRIAGED = 0
# LeaseRequest is pending provisioning of additional capacity.
PENDING = 1
# LeaseRequest has been fulfilled.
FULFILLED = 2
# LeaseRequest has been denied.
DENIED = 3
class LeaseResponse(messages.Message):
"""Represents a response to a LeaseRequest."""
# SHA-1 identifying the LeaseRequest this response refers to.
request_hash = messages.StringField(1)
# LeaseRequestError instance indicating an error with the request, or None
# if there is no error.
error = messages.EnumField(LeaseRequestError, 2)
# Request ID used by the client to generate the LeaseRequest.
client_request_id = messages.StringField(3, required=True)
# State of the LeaseRequest.
state = messages.EnumField(LeaseRequestState, 4)
# Hostname of the machine available for this request.
hostname = messages.StringField(5)
# Timestamp indicating lease expiration seconds from epoch in UTC.
lease_expiration_ts = messages.IntegerField(6)
class BatchedLeaseResponse(messages.Message):
"""Represents a response to a batched lease request."""
responses = messages.MessageField(LeaseResponse, 1, repeated=True)
class LeaseReleaseRequest(messages.Message):
"""Represents a request to voluntarily cancel a LeaseRequest."""
# Per-user unique ID used to identify the LeaseRequest.
request_id = messages.StringField(1, required=True)
class BatchedLeaseReleaseRequest(messages.Message):
"""Represents a batched set of lease release requests."""
requests = messages.MessageField(LeaseReleaseRequest, 1, repeated=True)
class LeaseReleaseRequestError(messages.Enum):
"""Represents an error in a LeaseReleaseRequest."""
# Request ID referred to non-existent request for this user.
NOT_FOUND = 1
# Request ID referred to an unfulfilled request.
NOT_FULFILLED = 2
# Request ID referred to a fulfilled request whose machine was
# already reclaimed.
ALREADY_RECLAIMED = 3
# Request couldn't be processed in time.
DEADLINE_EXCEEDED = 4
# Miscellaneous transient error.
TRANSIENT_ERROR = 5
class LeaseReleaseResponse(messages.Message):
"""Represents a response to a LeaseReleaseRequest."""
# SHA-1 identifying the LeaseRequest this response refers to.
request_hash = messages.StringField(1)
# LeaseReleaseRequestError indicating an error with the request, or None
# if there is no error.
error = messages.EnumField(LeaseReleaseRequestError, 2)
# Request ID used by the client to generate the LeaseRequest
# referred to by the LeaseReleaseRequest.
client_request_id = messages.StringField(3, required=True)
class BatchedLeaseReleaseResponse(messages.Message):
"""Represents responses to a batched set of lease release requests."""
responses = messages.MessageField(LeaseReleaseResponse, 1, repeated=True)
class MachineInstructionRequest(messages.Message):
"""Represents a request to send an instruction to a leased machine."""
# Request ID for the fulfilled LeaseRequest whose machine should be
# instructed.
request_id = messages.StringField(1, required=True)
# Instruction to send the leased machine.
instruction = messages.MessageField(Instruction, 2)
class MachineInstructionError(messages.Enum):
"""Represents an error in a MachineInstructionRequest."""
# Request ID referred to an unfulfilled request.
NOT_FULFILLED = 1
# Request ID referred to a fulfilled request whose machine was
# already reclaimed.
ALREADY_RECLAIMED = 2
# Invalid instruction for the machine.
INVALID_INSTRUCTION = 3
class MachineInstructionResponse(messages.Message):
"""Represents a response to a MachineInstructionRequest."""
# Request ID used by the client to generate the LeaseRequest for the
# machine being instructed.
client_request_id = messages.StringField(1, required=True)
# MachineInstructionError indicating an error with the request, or None
# if there is no error.
error = messages.EnumField(MachineInstructionError, 2)
class PollRequest(messages.Message):
"""Represents a request to poll for instructions given to a machine."""
# Hostname of the machine whose instructions to retrieve.
hostname = messages.StringField(1, required=True)
# Backend the machine belongs to. Generally required.
backend = messages.EnumField(Backend, 2)
class PollResponse(messages.Message):
"""Represents a response to a request for instructions given to a machine."""
# Instruction given to the machine.
instruction = messages.MessageField(Instruction, 1)
# State of the instruction.
state = messages.StringField(2)
class AckRequest(messages.Message):
"""Represents a request to ack an instruction received by a machine."""
# Hostname of the machine whose instruction to ack.
hostname = messages.StringField(1, required=True)
# Backend the machine belongs to.
backend = messages.EnumField(Backend, 2)
| 38.945578 | 79 | 0.773537 | 1,410 | 11,450 | 6.231206 | 0.184397 | 0.034145 | 0.056909 | 0.056226 | 0.482244 | 0.420783 | 0.373776 | 0.263374 | 0.222399 | 0.170612 | 0 | 0.009419 | 0.156245 | 11,450 | 293 | 80 | 39.078498 | 0.90001 | 0.52262 | 0 | 0.131579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.035088 | 0 | 0.964912 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b925f7b3126896a3611797c97e1fa8d0eee2234c | 564 | py | Python | webscraping.py | carvalho-fdec/DesafioDSA | fec9742bd77ddc3923ed616b6511cce87de48968 | [
"MIT"
] | null | null | null | webscraping.py | carvalho-fdec/DesafioDSA | fec9742bd77ddc3923ed616b6511cce87de48968 | [
"MIT"
] | null | null | null | webscraping.py | carvalho-fdec/DesafioDSA | fec9742bd77ddc3923ed616b6511cce87de48968 | [
"MIT"
] | null | null | null | # webscraping test
import urllib.request
from bs4 import BeautifulSoup
with urllib.request.urlopen('http://www.netvasco.com.br') as url:
page = url.read()
#print(page)
print(url.geturl())
print(url.info())
print(url.getcode())
# Parse the HTML in the variable 'page' and store it as a Beautiful Soup object
soup = BeautifulSoup(page, 'html.parser')
#print(soup.prettify())
print(soup.title)
print(soup.title.string)
print(soup.title.name)
soup_a = soup.find_all('a')[:10]
for a in soup_a:
print(a.get('href'))
print(a.get_text())
| 18.193548 | 74 | 0.687943 | 86 | 564 | 4.465116 | 0.55814 | 0.09375 | 0.109375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006316 | 0.157801 | 564 | 30 | 75 | 18.8 | 0.802105 | 0.216312 | 0 | 0 | 0 | 0 | 0.097448 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.133333 | 0 | 0.133333 | 0.533333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
b927180a3b55091e89983dcae5d96dd47f1373ae | 4,172 | py | Python | extras/amld/cloud/quickdraw_rnn/task.py | luyang1210/tensorflow | 948324f4cafdc97ae51c0e44fc1c28677a6e2e8a | [
"Apache-2.0"
] | 1 | 2019-04-28T15:46:45.000Z | 2019-04-28T15:46:45.000Z | extras/amld/cloud/quickdraw_rnn/task.py | luyang1210/tensorflow | 948324f4cafdc97ae51c0e44fc1c28677a6e2e8a | [
"Apache-2.0"
] | null | null | null | extras/amld/cloud/quickdraw_rnn/task.py | luyang1210/tensorflow | 948324f4cafdc97ae51c0e44fc1c28677a6e2e8a | [
"Apache-2.0"
] | 1 | 2020-11-18T04:43:33.000Z | 2020-11-18T04:43:33.000Z | """Experiment wrapper for training on Cloud ML."""
import argparse, glob, os
import tensorflow as tf
# From this package.
import model
def generate_experiment_fn(data_dir, train_batch_size, eval_batch_size,
train_steps, eval_steps, cell_size, hidden,
**experiment_args):
"""Returns experiment_fn for a RNN classifier.
Args:
data_dir: Where {train,eval}-* tf.train.Example datasets can be found.
train_batch_size: Batch size during training.
eval_batch_size: Batch size during evaluation.
train_steps: Number of training steps.
eval_steps: Number of evaluation steps.
cell_size: LSTM cell size.
hidden: Number of units in hidden layers (note that None means "use default"
which is equivalent to [] -- see code in model).
experiment_args: Additional arguments when `tf.contrib.learn.Experiment`
is instantiated.
"""
classes = tf.gfile.Open('%s/labels.txt' % data_dir).read().splitlines()
n_classes = len(classes)
params = tf.contrib.training.HParams(
cell_size=cell_size,
hidden=hidden or None, # Default is empty list.
)
config = tf.contrib.learn.RunConfig()
def _experiment_fn(output_dir):
return tf.contrib.learn.Experiment(
model.build_estimator(output_dir, n_classes, params, config),
train_input_fn=model.make_input_fn_stroke(
files_pattern=os.path.join(data_dir, 'train-*'),
batch_size=train_batch_size),
eval_input_fn=model.make_input_fn_stroke(
files_pattern=os.path.join(data_dir, 'eval-*'),
batch_size=eval_batch_size),
export_strategies=[
tf.contrib.learn.utils.saved_model_export_utils.make_export_strategy(
model.serving_input_fn,
exports_to_keep=1)
],
train_steps=train_steps,
eval_steps=eval_steps,
**experiment_args
)
return _experiment_fn
if __name__ == '__main__':
tf.logging.set_verbosity(tf.logging.INFO)
parser = argparse.ArgumentParser()
parser.add_argument(
'--data_dir',
help='GCS or local path to training data',
required=True
)
parser.add_argument(
'--train_batch_size',
help='Batch size for training steps',
type=int,
default=100
)
parser.add_argument(
'--eval_batch_size',
help='Batch size for evaluation steps',
type=int,
default=100
)
parser.add_argument(
'--train_steps',
help='Steps to run the training job for.',
type=int,
default=10000
)
parser.add_argument(
'--eval_steps',
help='Number of steps to run evalution for at each checkpoint',
default=100,
type=int
)
parser.add_argument(
'--output_dir',
help='GCS location to write checkpoints and export models',
required=True
)
parser.add_argument(
'--job-dir',
help='this model ignores this field, but it is required by gcloud',
default='junk'
)
parser.add_argument(
'--eval_delay_secs',
help='How long to wait before running first evaluation',
default=10,
type=int
)
parser.add_argument(
'--min_eval_frequency',
help='Minimum number of training steps between evaluations',
default=1,
type=int
)
# Hyper parameters.
parser.add_argument(
'--cell_size',
help='LSTM cell size.',
default=256,
type=int
)
parser.add_argument(
'--hidden',
help='Units in hidden layers.',
default=(),
nargs='+',
type=int
)
args = parser.parse_args()
arguments = args.__dict__
# unused args provided by service
arguments.pop('job_dir', None)
arguments.pop('job-dir', None)
output_dir = arguments.pop('output_dir')
# Run the training job
tf.contrib.learn.learn_runner.run(
generate_experiment_fn(**arguments), output_dir)
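# Example invocation (illustrative paths and values; only --data_dir and
# --output_dir are required):
#   python task.py --data_dir gs://my-bucket/quickdraw \
#       --output_dir gs://my-bucket/out --train_steps 10000 \
#       --cell_size 256 --hidden 512 256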
| 28.972222 | 81 | 0.613375 | 497 | 4,172 | 4.921529 | 0.338028 | 0.051513 | 0.076451 | 0.025756 | 0.223222 | 0.123467 | 0.079313 | 0.079313 | 0.047424 | 0.047424 | 0 | 0.007092 | 0.290268 | 4,172 | 143 | 82 | 29.174825 | 0.81898 | 0.173298 | 0 | 0.216981 | 1 | 0 | 0.188087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018868 | false | 0 | 0.028302 | 0.009434 | 0.066038 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b92c7cbb70fbc4dd2dec20c24e021d0f6405bd12 | 19,900 | py | Python | marshmallow_dataclass/__init__.py | dan-starkware/marshmallow_dataclass | 25c3e041d8c6a87d740984e57a5bd29b768afbf8 | [
"MIT"
] | null | null | null | marshmallow_dataclass/__init__.py | dan-starkware/marshmallow_dataclass | 25c3e041d8c6a87d740984e57a5bd29b768afbf8 | [
"MIT"
] | null | null | null | marshmallow_dataclass/__init__.py | dan-starkware/marshmallow_dataclass | 25c3e041d8c6a87d740984e57a5bd29b768afbf8 | [
"MIT"
] | null | null | null | """
This library allows the conversion of python 3.7's :mod:`dataclasses`
to :mod:`marshmallow` schemas.
It takes a python class, and generates a marshmallow schema for it.
Simple example::
from marshmallow import Schema
from marshmallow_dataclass import dataclass
@dataclass
class Point:
x:float
y:float
point = Point(x=0, y=0)
point_json = Point.Schema().dumps(point)
Full example::
from marshmallow import Schema
from dataclasses import field
from marshmallow_dataclass import dataclass
import datetime
@dataclass
class User:
birth: datetime.date = field(metadata= {
"required": True # A parameter to pass to marshmallow's field
})
website:str = field(metadata = {
"marshmallow_field": marshmallow.fields.Url() # Custom marshmallow field
})
Schema: ClassVar[Type[Schema]] = Schema # For the type checker
"""
import inspect
from enum import EnumMeta
from functools import lru_cache
from typing import (
Any,
Callable,
Dict,
List,
Mapping,
Optional,
Set,
Tuple,
Type,
TypeVar,
Union,
cast,
overload,
)
import dataclasses
import marshmallow
import typing_inspect
__all__ = ["dataclass", "add_schema", "class_schema", "field_for_schema", "NewType"]
NoneType = type(None)
_U = TypeVar("_U")
# Whitelist of dataclass members that will be copied to generated schema.
MEMBERS_WHITELIST: Set[str] = {"Meta"}
# Max number of generated schemas that class_schema keeps of generated schemas. Removes duplicates.
MAX_CLASS_SCHEMA_CACHE_SIZE = 1024
# _cls should never be specified by keyword, so start it with an
# underscore. The presence of _cls is used to detect if this
# decorator is being called with parameters or not.
def dataclass(
_cls: Type[_U] = None,
*,
repr: bool = True,
eq: bool = True,
order: bool = False,
unsafe_hash: bool = False,
frozen: bool = False,
base_schema: Optional[Type[marshmallow.Schema]] = None,
):
"""
This decorator does the same as dataclasses.dataclass, but also applies :func:`add_schema`.
It adds a `.Schema` attribute to the class object
:param base_schema: marshmallow schema used as a base class when deriving dataclass schema
>>> @dataclass
... class Artist:
... name: str
>>> Artist.Schema
<class 'marshmallow.schema.Artist'>
>>> from typing import ClassVar
>>> from marshmallow import Schema
>>> @dataclass(order=True) # preserve field order
... class Point:
... x:float
... y:float
... Schema: ClassVar[Type[Schema]] = Schema # For the type checker
...
>>> Point.Schema().load({'x':0, 'y':0}) # This line can be statically type checked
Point(x=0.0, y=0.0)
"""
# dataclass's typing doesn't expect it to be called as a function, so ignore type check
dc = dataclasses.dataclass( # type: ignore
_cls, repr=repr, eq=eq, order=order, unsafe_hash=unsafe_hash, frozen=frozen
)
if _cls is None:
return lambda cls: add_schema(dc(cls), base_schema)
return add_schema(dc, base_schema)
@overload
def add_schema(_cls: Type[_U]) -> Type[_U]:
...
@overload
def add_schema(
base_schema: Type[marshmallow.Schema] = None,
) -> Callable[[Type[_U]], Type[_U]]:
...
@overload
def add_schema(
_cls: Type[_U], base_schema: Type[marshmallow.Schema] = None
) -> Type[_U]:
...
def add_schema(_cls=None, base_schema=None):
"""
This decorator adds a marshmallow schema as the 'Schema' attribute in a dataclass.
It uses :func:`class_schema` internally.
:param type cls: The dataclass to which a Schema should be added
:param base_schema: marshmallow schema used as a base class when deriving dataclass schema
>>> class BaseSchema(marshmallow.Schema):
... def on_bind_field(self, field_name, field_obj):
... field_obj.data_key = (field_obj.data_key or field_name).upper()
>>> @add_schema(base_schema=BaseSchema)
... @dataclasses.dataclass
... class Artist:
... names: Tuple[str, str]
>>> artist = Artist.Schema().loads('{"NAMES": ["Martin", "Ramirez"]}')
>>> artist
Artist(names=('Martin', 'Ramirez'))
"""
def decorator(clazz: Type[_U]) -> Type[_U]:
clazz.Schema = class_schema(clazz, base_schema) # type: ignore
return clazz
return decorator(_cls) if _cls else decorator
def class_schema(
clazz: type, base_schema: Optional[Type[marshmallow.Schema]] = None
) -> Type[marshmallow.Schema]:
"""
Convert a class to a marshmallow schema
:param clazz: A python class (may be a dataclass)
:param base_schema: marshmallow schema used as a base class when deriving dataclass schema
:return: A marshmallow Schema corresponding to the dataclass
.. note::
All the arguments supported by marshmallow field classes can
be passed in the `metadata` dictionary of a field.
If you want to use a custom marshmallow field
(one that has no equivalent python type), you can pass it as the
``marshmallow_field`` key in the metadata dictionary.
>>> import typing
>>> Meters = typing.NewType('Meters', float)
>>> @dataclasses.dataclass()
... class Building:
... height: Optional[Meters]
... name: str = dataclasses.field(default="anonymous")
... class Meta:
... ordered = True
...
>>> class_schema(Building) # Returns a marshmallow schema class (not an instance)
<class 'marshmallow.schema.Building'>
>>> @dataclasses.dataclass()
... class City:
... name: str = dataclasses.field(metadata={'required':True})
... best_building: Building # Reference to another dataclasses. A schema will be created for it too.
... other_buildings: List[Building] = dataclasses.field(default_factory=lambda: [])
...
>>> citySchema = class_schema(City)()
>>> city = citySchema.load({"name":"Paris", "best_building": {"name": "Eiffel Tower"}})
>>> city
City(name='Paris', best_building=Building(height=None, name='Eiffel Tower'), other_buildings=[])
>>> citySchema.load({"name":"Paris"})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'best_building': ['Missing data for required field.']}
>>> city_json = citySchema.dump(city)
>>> city_json['best_building'] # We get an OrderedDict because we specified order = True in the Meta class
OrderedDict([('height', None), ('name', 'Eiffel Tower')])
>>> @dataclasses.dataclass()
... class Person:
... name: str = dataclasses.field(default="Anonymous")
... friends: List['Person'] = dataclasses.field(default_factory=lambda:[]) # Recursive field
...
>>> person = class_schema(Person)().load({
... "friends": [{"name": "Roger Boucher"}]
... })
>>> person
Person(name='Anonymous', friends=[Person(name='Roger Boucher', friends=[])])
>>> @dataclasses.dataclass()
... class C:
... important: int = dataclasses.field(init=True, default=0)
... # Only fields that are in the __init__ method will be added:
... unimportant: int = dataclasses.field(init=False, default=0)
...
>>> c = class_schema(C)().load({
... "important": 9, # This field will be imported
... "unimportant": 9 # This field will NOT be imported
... }, unknown=marshmallow.EXCLUDE)
>>> c
C(important=9, unimportant=0)
>>> @dataclasses.dataclass
... class Website:
... url:str = dataclasses.field(metadata = {
... "marshmallow_field": marshmallow.fields.Url() # Custom marshmallow field
... })
...
>>> class_schema(Website)().load({"url": "I am not a good URL !"})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'url': ['Not a valid URL.']}
>>> @dataclasses.dataclass
... class NeverValid:
... @marshmallow.validates_schema
... def validate(self, data, **_):
... raise marshmallow.ValidationError('never valid')
...
>>> class_schema(NeverValid)().load({})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'_schema': ['never valid']}
>>> # noinspection PyTypeChecker
>>> class_schema(None) # unsupported type
Traceback (most recent call last):
...
TypeError: None is not a dataclass and cannot be turned into one.
>>> @dataclasses.dataclass
... class Anything:
... name: str
... @marshmallow.validates('name')
... def validates(self, value):
... if len(value) > 5: raise marshmallow.ValidationError("Name too long")
>>> class_schema(Anything)().load({"name": "aaaaaargh"})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'name': ['Name too long']}
"""
return _proxied_class_schema(clazz, base_schema)
@lru_cache(maxsize=MAX_CLASS_SCHEMA_CACHE_SIZE)
def _proxied_class_schema(
clazz: type, base_schema: Optional[Type[marshmallow.Schema]] = None
) -> Type[marshmallow.Schema]:
try:
# noinspection PyDataclass
fields: Tuple[dataclasses.Field, ...] = dataclasses.fields(clazz)
except TypeError: # Not a dataclass
try:
return class_schema(dataclasses.dataclass(clazz), base_schema)
except Exception:
raise TypeError(
f"{getattr(clazz, '__name__', repr(clazz))} is not a dataclass and cannot be turned into one."
)
# Copy all marshmallow hooks and whitelisted members of the dataclass to the schema.
attributes = {
k: v
for k, v in inspect.getmembers(clazz)
if hasattr(v, "__marshmallow_hook__") or k in MEMBERS_WHITELIST
}
# Update the schema members to contain marshmallow fields instead of dataclass fields
attributes.update(
(
field.name,
field_for_schema(
field.type, _get_field_default(field), field.metadata, base_schema
),
)
for field in fields
if field.init
)
schema_class = type(clazz.__name__, (_base_schema(clazz, base_schema),), attributes)
return cast(Type[marshmallow.Schema], schema_class)
def _field_by_type(
typ: Union[type, Any], base_schema: Optional[Type[marshmallow.Schema]]
) -> Optional[Type[marshmallow.fields.Field]]:
return (
base_schema and base_schema.TYPE_MAPPING.get(typ)
) or marshmallow.Schema.TYPE_MAPPING.get(typ)
def _field_by_supertype(
typ: Type,
default: marshmallow.missing,
newtype_supertype: Type,
metadata: dict,
base_schema: Optional[Type[marshmallow.Schema]],
) -> marshmallow.fields.Field:
"""
Return a new field for fields based on a super field. (Usually spawned from NewType)
"""
# Add the information coming our custom NewType implementation
typ_args = getattr(typ, "_marshmallow_args", {})
# Handle multiple validators from both `typ` and `metadata`.
# See https://github.com/lovasoa/marshmallow_dataclass/issues/91
new_validators: List[Callable] = []
for meta_dict in (typ_args, metadata):
if "validate" in meta_dict:
if marshmallow.utils.is_iterable_but_not_string(meta_dict["validate"]):
new_validators.extend(meta_dict["validate"])
elif callable(meta_dict["validate"]):
new_validators.append(meta_dict["validate"])
metadata["validate"] = new_validators if new_validators else None
metadata = {"description": typ.__name__, **typ_args, **metadata}
field = getattr(typ, "_marshmallow_field", None)
if field:
return field(**metadata)
else:
return field_for_schema(
newtype_supertype,
metadata=metadata,
default=default,
base_schema=base_schema,
)
def field_for_schema(
typ: type,
default=marshmallow.missing,
metadata: Mapping[str, Any] = None,
base_schema: Optional[Type[marshmallow.Schema]] = None,
) -> marshmallow.fields.Field:
"""
Get a marshmallow Field corresponding to the given python type.
The metadata of the dataclass field is used as arguments to the marshmallow Field.
:param typ: The type for which a field should be generated
:param default: value to use for (de)serialization when the field is missing
:param metadata: Additional parameters to pass to the marshmallow field constructor
:param base_schema: marshmallow schema used as a base class when deriving dataclass schema
>>> int_field = field_for_schema(int, default=9, metadata=dict(required=True))
>>> int_field.__class__
<class 'marshmallow.fields.Integer'>
>>> int_field.default
9
>>> field_for_schema(str, metadata={"marshmallow_field": marshmallow.fields.Url()}).__class__
<class 'marshmallow.fields.Url'>
"""
metadata = {} if metadata is None else dict(metadata)
if default is not marshmallow.missing:
metadata.setdefault("default", default)
# 'missing' must not be set for required fields.
if not metadata.get("required"):
metadata.setdefault("missing", default)
else:
metadata.setdefault("required", True)
# If the field was already defined by the user
predefined_field = metadata.get("marshmallow_field")
if predefined_field:
return predefined_field
# Generic types specified without type arguments
if typ is list:
typ = List[Any]
elif typ is dict:
typ = Dict[Any, Any]
# Base types
field = _field_by_type(typ, base_schema)
if field:
return field(**metadata)
if typ is Any:
metadata.setdefault("allow_none", True)
return marshmallow.fields.Raw(**metadata)
# Generic types
origin = typing_inspect.get_origin(typ)
if origin:
arguments = typing_inspect.get_args(typ, True)
# Override base_schema.TYPE_MAPPING to change the class used for generic types below
type_mapping = base_schema.TYPE_MAPPING if base_schema else {}
if origin in (list, List):
child_type = field_for_schema(arguments[0], base_schema=base_schema)
list_type = type_mapping.get(List, marshmallow.fields.List)
return list_type(child_type, **metadata)
if origin in (tuple, Tuple):
children = tuple(
field_for_schema(arg, base_schema=base_schema) for arg in arguments
)
tuple_type = type_mapping.get(Tuple, marshmallow.fields.Tuple)
return tuple_type(children, **metadata)
elif origin in (dict, Dict):
dict_type = type_mapping.get(Dict, marshmallow.fields.Dict)
return dict_type(
keys=field_for_schema(arguments[0], base_schema=base_schema),
values=field_for_schema(arguments[1], base_schema=base_schema),
**metadata,
)
elif typing_inspect.is_optional_type(typ):
subtyp = next(t for t in arguments if t is not NoneType) # type: ignore
# Treat optional types as types with a None default
metadata["default"] = metadata.get("default", None)
metadata["missing"] = metadata.get("missing", None)
metadata["required"] = False
return field_for_schema(subtyp, metadata=metadata, base_schema=base_schema)
elif typing_inspect.is_union_type(typ):
from . import union_field
return union_field.Union(
[
(
subtyp,
field_for_schema(
subtyp, metadata=metadata, base_schema=base_schema
),
)
for subtyp in arguments
],
**metadata,
)
# typing.NewType returns a function with a __supertype__ attribute
newtype_supertype = getattr(typ, "__supertype__", None)
if newtype_supertype and inspect.isfunction(typ):
return _field_by_supertype(
typ=typ,
default=default,
newtype_supertype=newtype_supertype,
metadata=metadata,
base_schema=base_schema,
)
# enumerations
if isinstance(typ, EnumMeta):
import marshmallow_enum
return marshmallow_enum.EnumField(typ, **metadata)
# Nested marshmallow dataclass
nested_schema = getattr(typ, "Schema", None)
# Nested dataclasses
forward_reference = getattr(typ, "__forward_arg__", None)
nested = (
nested_schema or forward_reference or class_schema(typ, base_schema=base_schema)
)
return marshmallow.fields.Nested(nested, **metadata)
def _base_schema(
clazz: type, base_schema: Optional[Type[marshmallow.Schema]] = None
) -> Type[marshmallow.Schema]:
"""
Base schema factory that creates a schema for `clazz` derived either from `base_schema`
or `BaseSchema`
"""
# Remove `type: ignore` when mypy handles dynamic base classes
# https://github.com/python/mypy/issues/2813
class BaseSchema(base_schema or marshmallow.Schema): # type: ignore
def load(self, data: Mapping, *, many: bool = None, **kwargs):
all_loaded = super().load(data, many=many, **kwargs)
many = self.many if many is None else bool(many)
if many:
return [clazz(**loaded) for loaded in all_loaded]
else:
return clazz(**all_loaded)
return BaseSchema
def _get_field_default(field: dataclasses.Field):
"""
Return a marshmallow default value given a dataclass default value
>>> _get_field_default(dataclasses.field())
<marshmallow.missing>
"""
# Remove `type: ignore` when https://github.com/python/mypy/issues/6910 is fixed
default_factory = field.default_factory # type: ignore
if default_factory is not dataclasses.MISSING:
return default_factory
elif field.default is dataclasses.MISSING:
return marshmallow.missing
return field.default
def NewType(
name: str,
typ: Type[_U],
field: Optional[Type[marshmallow.fields.Field]] = None,
**kwargs,
) -> Callable[[_U], _U]:
"""NewType creates simple unique types
to which you can attach custom marshmallow attributes.
All the keyword arguments passed to this function will be transmitted
to the marshmallow field constructor.
>>> import marshmallow.validate
>>> IPv4 = NewType('IPv4', str, validate=marshmallow.validate.Regexp(r'^([0-9]{1,3}\\.){3}[0-9]{1,3}$'))
>>> @dataclass
... class MyIps:
... ips: List[IPv4]
>>> MyIps.Schema().load({"ips": ["0.0.0.0", "grumble grumble"]})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'ips': {1: ['String does not match expected pattern.']}}
>>> MyIps.Schema().load({"ips": ["127.0.0.1"]})
MyIps(ips=['127.0.0.1'])
>>> Email = NewType('Email', str, field=marshmallow.fields.Email)
>>> @dataclass
... class ContactInfo:
... mail: Email = dataclasses.field(default="anonymous@example.org")
>>> ContactInfo.Schema().load({})
ContactInfo(mail='anonymous@example.org')
>>> ContactInfo.Schema().load({"mail": "grumble grumble"})
Traceback (most recent call last):
...
marshmallow.exceptions.ValidationError: {'mail': ['Not a valid email address.']}
"""
def new_type(x: _U):
return x
new_type.__name__ = name
new_type.__supertype__ = typ # type: ignore
new_type._marshmallow_field = field # type: ignore
new_type._marshmallow_args = kwargs # type: ignore
return new_type
if __name__ == "__main__":
import doctest
doctest.testmod(verbose=True)
| 34.133791 | 110 | 0.647538 | 2,355 | 19,900 | 5.323142 | 0.163057 | 0.040683 | 0.021777 | 0.014359 | 0.238274 | 0.176292 | 0.131142 | 0.121809 | 0.096921 | 0.082403 | 0 | 0.004284 | 0.237638 | 19,900 | 582 | 111 | 34.19244 | 0.822029 | 0.499196 | 0 | 0.157895 | 1 | 0.004049 | 0.043687 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064777 | false | 0 | 0.040486 | 0.008097 | 0.226721 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9312660991c249b5bd6faf4ead63f4150e99b7e | 4,915 | py | Python | pysnmp/EXTREME-RTSTATS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/EXTREME-RTSTATS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/EXTREME-RTSTATS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module EXTREME-RTSTATS-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/EXTREME-BASE-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:53:03 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueSizeConstraint, ConstraintsUnion, ValueRangeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueSizeConstraint", "ConstraintsUnion", "ValueRangeConstraint", "ConstraintsIntersection")
extremeAgent, = mibBuilder.importSymbols("EXTREME-BASE-MIB", "extremeAgent")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Unsigned32, iso, Gauge32, MibScalar, MibTable, MibTableRow, MibTableColumn, ObjectIdentity, Bits, MibIdentifier, ModuleIdentity, Counter64, Counter32, NotificationType, Integer32, IpAddress, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "Unsigned32", "iso", "Gauge32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ObjectIdentity", "Bits", "MibIdentifier", "ModuleIdentity", "Counter64", "Counter32", "NotificationType", "Integer32", "IpAddress", "TimeTicks")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
extremeRtStats = ModuleIdentity((1, 3, 6, 1, 4, 1, 1916, 1, 11))
if mibBuilder.loadTexts: extremeRtStats.setLastUpdated('9906240000Z')
if mibBuilder.loadTexts: extremeRtStats.setOrganization('Extreme Networks, Inc.')
extremeRtStatsTable = MibTable((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1), )
if mibBuilder.loadTexts: extremeRtStatsTable.setStatus('current')
extremeRtStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1), ).setIndexNames((0, "EXTREME-RTSTATS-MIB", "extremeRtStatsIndex"))
if mibBuilder.loadTexts: extremeRtStatsEntry.setStatus('current')
extremeRtStatsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsIndex.setStatus('current')
extremeRtStatsIntervalStart = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 2), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsIntervalStart.setStatus('current')
extremeRtStatsCRCAlignErrors = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsCRCAlignErrors.setStatus('current')
extremeRtStatsUndersizePkts = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsUndersizePkts.setStatus('current')
extremeRtStatsOversizePkts = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsOversizePkts.setStatus('current')
extremeRtStatsFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsFragments.setStatus('current')
extremeRtStatsJabbers = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsJabbers.setStatus('current')
extremeRtStatsCollisions = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsCollisions.setStatus('current')
extremeRtStatsTotalErrors = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsTotalErrors.setStatus('current')
extremeRtStatsUtilization = MibTableColumn((1, 3, 6, 1, 4, 1, 1916, 1, 11, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 10000))).setMaxAccess("readonly")
if mibBuilder.loadTexts: extremeRtStatsUtilization.setStatus('current')
mibBuilder.exportSymbols("EXTREME-RTSTATS-MIB", extremeRtStatsEntry=extremeRtStatsEntry, extremeRtStatsOversizePkts=extremeRtStatsOversizePkts, extremeRtStatsUndersizePkts=extremeRtStatsUndersizePkts, extremeRtStatsTable=extremeRtStatsTable, extremeRtStatsTotalErrors=extremeRtStatsTotalErrors, extremeRtStats=extremeRtStats, PYSNMP_MODULE_ID=extremeRtStats, extremeRtStatsCollisions=extremeRtStatsCollisions, extremeRtStatsCRCAlignErrors=extremeRtStatsCRCAlignErrors, extremeRtStatsJabbers=extremeRtStatsJabbers, extremeRtStatsIndex=extremeRtStatsIndex, extremeRtStatsUtilization=extremeRtStatsUtilization, extremeRtStatsIntervalStart=extremeRtStatsIntervalStart, extremeRtStatsFragments=extremeRtStatsFragments)
| 114.302326 | 713 | 0.790031 | 498 | 4,915 | 7.793173 | 0.25502 | 0.007215 | 0.075754 | 0.013399 | 0.345014 | 0.262304 | 0.172121 | 0.172121 | 0.172121 | 0.168771 | 0 | 0.068882 | 0.075483 | 4,915 | 42 | 714 | 117.02381 | 0.785211 | 0.066938 | 0 | 0 | 0 | 0 | 0.157745 | 0.009613 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b936e2da1dfb0c50e0a4123e54c302664e300cf0 | 4,454 | py | Python | tests/core_ptl/check_for_ranks.py | PatrykNeubauer/NeMo | 3ada744b884dba5f233f22c6991fc6092c6ca8d0 | [
"Apache-2.0"
] | 2 | 2021-09-21T07:36:20.000Z | 2022-02-05T15:29:04.000Z | tests/core_ptl/check_for_ranks.py | PatrykNeubauer/NeMo | 3ada744b884dba5f233f22c6991fc6092c6ca8d0 | [
"Apache-2.0"
] | null | null | null | tests/core_ptl/check_for_ranks.py | PatrykNeubauer/NeMo | 3ada744b884dba5f233f22c6991fc6092c6ca8d0 | [
"Apache-2.0"
] | 12 | 2021-06-20T08:56:10.000Z | 2022-03-16T19:07:10.000Z | # Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
import torch
from omegaconf import OmegaConf
from pytorch_lightning import Trainer
from pytorch_lightning.utilities.distributed import rank_zero_only
from nemo.core import ModelPT
from nemo.utils import logging
from nemo.utils.exp_manager import ExpManagerConfig, exp_manager
class OnesDataset(torch.utils.data.Dataset):
def __init__(self, dataset_len):
super().__init__()
self.__dataset_len = dataset_len
def __getitem__(self, *args):
return torch.ones(2)
def __len__(self):
return self.__dataset_len
class ExampleModel(ModelPT):
def __init__(self, *args, **kwargs):
cfg = OmegaConf.structured({})
super().__init__(cfg, trainer=kwargs.get('trainer', None))
# dummy parameter in order to allow DDP to execute
self.l1 = torch.nn.modules.Linear(in_features=2, out_features=1)
def train_dataloader(self):
return None
def val_dataloader(self):
return None
def predict_dataloader(self):
dataset = OnesDataset(2)
return torch.utils.data.DataLoader(dataset, batch_size=2)
def forward(self, batch):
return batch.mean()
def validation_step(self, batch, batch_idx):
return self(batch)
def training_step(self, batch, batch_idx):
return self(batch)
def list_available_models(self):
pass
def setup_training_data(self):
pass
def setup_validation_data(self):
pass
def validation_epoch_end(self, loss):
self.log("val_loss", torch.stack(loss).mean())
def instantiate_multinode_ddp_if_possible():
num_gpus = torch.cuda.device_count()
trainer = Trainer(gpus=num_gpus, accelerator='ddp', logger=None, checkpoint_callback=None)
exp_manager_cfg = ExpManagerConfig(exp_dir='./ddp_check/', use_datetime_version=False, version="")
exp_manager(trainer, cfg=OmegaConf.structured(exp_manager_cfg))
return trainer
def setup_model(trainer: Trainer):
model = ExampleModel(trainer=trainer)
logging.info(f"M.Global Rank:{model.global_rank}")
logging.info(f"M.Local Rank:{model.local_rank}")
logging.info(f"M.World Size:{model.trainer.world_size}")
trainer.predict(model)
return model
def get_rank_info(texts: list, rank_key: str) -> int:
for line in texts:
if rank_key in line:
rank_value = line.split(":")[-1]
rank_value = int(rank_value)
return rank_value
print("Could not find the correct rank key !")
exit(1)
@rank_zero_only
def check_model_ranks(model: ExampleModel):
basedir = os.path.join('./ddp_check/', 'default', 'version_0')
file_template = "nemo_log_globalrank-{rank}_localrank-{rank}.txt"
world_size = torch.cuda.device_count()
for rank in range(world_size):
filename = file_template.format(rank=rank)
filepath = os.path.join(basedir, filename)
with open(filepath, 'r') as f:
texts = f.readlines()
texts = [t.replace("\n", "") for t in texts]
log_global_rank = get_rank_info(texts, rank_key='M.Global Rank')
log_world_size = get_rank_info(texts, rank_key='M.World Size')
if log_global_rank != rank:
print("Logged global rank is not equal to trainer.global_rank !")
exit(1)
if log_world_size != world_size:
            print("Logged world size is not equal to trainer.world_size !")
exit(1)
@rank_zero_only
def cleanup():
if os.path.exists('./ddp_check'):
shutil.rmtree('./ddp_check', ignore_errors=True)
def run_checks():
cleanup()
trainer = instantiate_multinode_ddp_if_possible()
model = setup_model(trainer)
check_model_ranks(model)
print("DDP checks passed !")
cleanup()
if __name__ == '__main__':
run_checks()
| 28.551282 | 102 | 0.687023 | 604 | 4,454 | 4.836093 | 0.347682 | 0.030811 | 0.012325 | 0.013352 | 0.109552 | 0.05683 | 0.043136 | 0.026703 | 0.026703 | 0 | 0 | 0.005402 | 0.210373 | 4,454 | 155 | 103 | 28.735484 | 0.825135 | 0.141895 | 0 | 0.145833 | 0 | 0 | 0.113738 | 0.032834 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208333 | false | 0.041667 | 0.09375 | 0.072917 | 0.4375 | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b93839299c30aa23ab066b85969c7c27e043c202 | 1,143 | py | Python | helpers/json_manager.py | Lofi-Lemonade/Python-Discord-Bot-Template | 4cb79197c751c88100ad396adb38e88bf2a4d1ed | [
"Apache-2.0"
] | null | null | null | helpers/json_manager.py | Lofi-Lemonade/Python-Discord-Bot-Template | 4cb79197c751c88100ad396adb38e88bf2a4d1ed | [
"Apache-2.0"
] | null | null | null | helpers/json_manager.py | Lofi-Lemonade/Python-Discord-Bot-Template | 4cb79197c751c88100ad396adb38e88bf2a4d1ed | [
"Apache-2.0"
] | null | null | null | """
Copyright © Krypton 2022 - https://github.com/kkrypt0nn (https://krypton.ninja)
Description:
This is a template to create your own discord bot in python.
Version: 4.1
"""
import json
def add_user_to_blacklist(user_id: int) -> None:
"""
This function will add a user based on its ID in the blacklist.json file.
:param user_id: The ID of the user that should be added into the blacklist.json file.
"""
with open("blacklist.json", "r+") as file:
file_data = json.load(file)
file_data["ids"].append(user_id)
with open("blacklist.json", "w") as file:
file.seek(0)
json.dump(file_data, file, indent=4)
def remove_user_from_blacklist(user_id: int) -> None:
"""
This function will remove a user based on its ID from the blacklist.json file.
:param user_id: The ID of the user that should be removed from the blacklist.json file.
"""
with open("blacklist.json", "r") as file:
file_data = json.load(file)
file_data["ids"].remove(user_id)
with open("blacklist.json", "w") as file:
file.seek(0)
json.dump(file_data, file, indent=4)
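# Illustrative usage sketch (an assumption, not part of the template: it
# presumes a blacklist.json file already exists containing {"ids": []}):
#
#     add_user_to_blacklist(123456789)
#     remove_user_from_blacklist(123456789)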
| 31.75 | 91 | 0.659668 | 181 | 1,143 | 4.071823 | 0.353591 | 0.141113 | 0.086839 | 0.108548 | 0.719132 | 0.708277 | 0.662144 | 0.662144 | 0.559023 | 0.559023 | 0 | 0.012346 | 0.220472 | 1,143 | 35 | 92 | 32.657143 | 0.813692 | 0.433946 | 0 | 0.533333 | 0 | 0 | 0.111296 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.066667 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b93a3daf85b033d7039d8c3747eadb457802db6b | 2,814 | py | Python | GeneratePassword/generate_password_v2.py | OneScreenfulOfPython/screenfuls | ea4e378c8d9e530edadd4a3315fe9e8acc98460b | [
"Apache-2.0"
] | 2 | 2015-01-19T14:50:55.000Z | 2015-01-28T12:45:59.000Z | GeneratePassword/generate_password_v2.py | OneScreenfulOfPython/screenfuls | ea4e378c8d9e530edadd4a3315fe9e8acc98460b | [
"Apache-2.0"
] | null | null | null | GeneratePassword/generate_password_v2.py | OneScreenfulOfPython/screenfuls | ea4e378c8d9e530edadd4a3315fe9e8acc98460b | [
"Apache-2.0"
] | null | null | null | import os, sys
import random
import string
try:
# Make Python2 work like Python3
input = raw_input
except NameError:
# On Python3; already using input
pass
letters = string.ascii_letters
numbers = string.digits
punctuation = string.punctuation
def generate(password_length, at_least_one_letter, at_least_one_number, at_least_one_punctuation):
    """Generate a password by including enough random
    characters to meet the password length restriction.
    In addition, the user can specify that at least one
    of each of the classes of characters be used.
"""
#
# Any combination of characters is valid
#
valid_characters = ""
if at_least_one_letter:
valid_characters += letters
if at_least_one_number:
valid_characters += numbers
if at_least_one_punctuation:
valid_characters += punctuation
#
# Start with a blank password and then go round enough
# times to make a password of the required length.
#
password = ""
for i in range(password_length):
#
# Each time around, ensure that one of each of the selected
# groups is chosen, and then just choose randomly from all
# groups.
#
if at_least_one_letter:
character = random.choice(letters)
at_least_one_letter = False
elif at_least_one_number:
character = random.choice(numbers)
at_least_one_number = False
elif at_least_one_punctuation:
character = random.choice(punctuation)
at_least_one_punctuation = False
else:
character = random.choice(valid_characters)
password += character
#
# Finally, shuffle the password so we don't always get a
# letter at the beginning, with a number after and some
# punctuation.
#
characters = list(password)
#
# random.shuffle shuffles a list *in place*
#
random.shuffle(characters)
#
# X.join(...) means: return all the strings in (...) joined by X
# ", ".join(['Eggs', 'Bacon', 'Beans']) => "Eggs, Bacon, Beans"
# But if you want to generate *real* .csv files, use the csv module
# because there are lots of corner-cases.
#
password = "".join(characters)
return password
if __name__ == '__main__':
password_length = int(input("How many letters? "))
at_least_one_letter = "Y" == (input("At least one letter [Y/n]? ").upper() or "Y")
at_least_one_number = "Y" == (input("At least one number [Y/n]? ").upper() or "Y")
at_least_one_punctuation = "Y" == (input("At least one punctuation [Y/n]? ").upper() or "Y")
password = generate(password_length, at_least_one_letter, at_least_one_number, at_least_one_punctuation)
print("Your password is: {}".format(password))
| 33.5 | 108 | 0.658138 | 369 | 2,814 | 4.821138 | 0.365854 | 0.086565 | 0.123665 | 0.062957 | 0.229342 | 0.106802 | 0.106802 | 0.106802 | 0.084317 | 0.084317 | 0 | 0.001428 | 0.253376 | 2,814 | 83 | 109 | 33.903614 | 0.845312 | 0.327292 | 0 | 0.046512 | 0 | 0 | 0.074878 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023256 | false | 0.255814 | 0.069767 | 0 | 0.116279 | 0.023256 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b93aaafe8012e07a3a1b7cd6bfac2b4027e51ebd | 3,760 | py | Python | hard-gists/7578539/snippet.py | jjhenkel/dockerizeme | eaa4fe5366f6b9adf74399eab01c712cacaeb279 | [
"Apache-2.0"
] | 21 | 2019-07-08T08:26:45.000Z | 2022-01-24T23:53:25.000Z | hard-gists/7578539/snippet.py | jjhenkel/dockerizeme | eaa4fe5366f6b9adf74399eab01c712cacaeb279 | [
"Apache-2.0"
] | 5 | 2019-06-15T14:47:47.000Z | 2022-02-26T05:02:56.000Z | hard-gists/7578539/snippet.py | jjhenkel/dockerizeme | eaa4fe5366f6b9adf74399eab01c712cacaeb279 | [
"Apache-2.0"
] | 17 | 2019-05-16T03:50:34.000Z | 2021-01-14T14:35:12.000Z | from pylab import *
from numpy import *
from numpy.linalg import solve
from scipy.integrate import odeint
from scipy.stats import norm, uniform, beta
from scipy.special import jacobi
a = 0.0
b = 3.0
theta=1.0
sigma=sqrt(theta/(2*(a+b+2)))
tscale = 0.05
invariant_distribution = poly1d( [-1 for x in range(int(a))], True)*poly1d( [1 for x in range(int(b))], True)
def eigenvalue(n):
return theta*n*(n+a+b+1)/(a+b+2)
gaussian_var = norm()
def dW(dt):
return norm.rvs() / sqrt(dt)
def random_walk(y0, tmax, dt, times = None):
dt = dt * tscale
def rhs(y,t):
return -theta*(y-(a-b)/(a+b+2)) + sqrt(2*theta*(1-y*y)/(a+b+2))*dW(dt/tscale)
if (times is None):
times = arange(0,tmax,dt)
y = zeros(shape=times.shape, dtype=float)
y[0] = y0
for i in range(1,y.shape[0]):
y[i] = y[i-1] + rhs(y[i-1], times[i])*dt
if abs(y[i]) > 1:
y[i] = y[i] / abs(y[i])
return (times, y)
def beta_prior(s, f):
return poly1d(ones(shape=(s,)), True)*poly1d(-1*ones(shape=(f,)), True)
def poly_to_jacobi(x):
"""x is a poly1d object"""
xc = x.coeffs
N = x.order+1
matrix = zeros(shape=(N,N), dtype=float)
for i in range(N):
matrix[N-i-1:N, i] = jacobi(i,a,b).coeffs
return solve(matrix, xc)
def jacobi_to_poly(x):
result = poly1d([0])
for i in range(x.shape[0]):
result = result + (jacobi(i,a,b)*invariant_distribution)*x[i]
return result
def jacobi_to_poly_no_invariant(x):
result = poly1d([0])
for i in range(x.shape[0]):
result = result + jacobi(i,a,b)*x[i]
return result
def propagate_jacobi(pc, t):
"""Takes jacobi coefficients and propagates them"""
n = arange(pc.shape[0], dtype=float)
l = theta*n*(n+a+b+1.0)/(a+b+2.0)*tscale
return exp(-l*t)*pc
def truncate_unnecessary_jacobi(p):
p_normalized = p / (abs(p).sum())
cs = cumsum(abs(p_normalized[::-1]))[::-1]
return p_normalized[where(abs(cs) > 1e-4)]
def pde_solve(prior, t):
result = zeros(shape=(t.shape[0], prior.shape[0]), dtype=float)
result[0,:] = prior
for i in range(1,t.shape[0]):
result[i,:] = propagate_jacobi(result[i-1,:], t[i]-t[i-1])
return result
def transform_to_x(pdf, x):
result = zeros(shape=(pdf.shape[0], x.shape[0]), dtype=float)
for i in range(0, pdf.shape[0]):
p = jacobi_to_poly(pdf[i,:])
result[i,:] = p(x)
result[i,:] /= result[i,:].sum()
return result
tmax = 4
prior = beta_prior(40, 20)
prior_in_jacobi = poly_to_jacobi(prior)
dt = 0.1
times = arange(0,tmax,dt)
x = arange(-1,1,0.01)
rw_dt = 0.01
t, y = random_walk(0.35*2-1, tmax, rw_dt)
solution_as_x = zeros(shape=(times.size, x.size), dtype=float)
solution_as_jacobi = None
empirical_ctr = zeros(shape=(4,), dtype=float)
for i in range(0,4):
nt = int(1.0/dt)
prior = prior_in_jacobi
rnd = uniform(0,1)
if (i > 0):
nsamples = 40
r = rnd.rvs(nsamples)
        ctr = (y[int(i / rw_dt)] + 1) / 2.0
        print("CTR: " + str(ctr))
        success = (r < ctr).sum()
        print("Empirical: " + str(success / float(nsamples)))
evidence = beta_prior( nsamples - success, success)
prior = None
j = truncate_unnecessary_jacobi(solution_as_jacobi[int(1/dt)-1])
prior = poly_to_jacobi(evidence * jacobi_to_poly_no_invariant(j))
empirical_ctr[i] = success / float(nsamples)
solution_as_jacobi = pde_solve(prior, times[i*nt:(i+1)*nt])
solution_as_x[i*nt:(i+1)*nt] = transform_to_x(solution_as_jacobi, x)
plot(arange(0,4), empirical_ctr, 'go')
plot(t, (y+1)/2.0, 'k')
imshow(solution_as_x.transpose(), origin='lower', extent=[0,tmax,0,1])
xlabel("time")
ylabel("CTR")
title("Bayesian Estimate of CTR")
colorbar()
show()
| 27.246377 | 109 | 0.611702 | 655 | 3,760 | 3.412214 | 0.203053 | 0.009843 | 0.018792 | 0.034452 | 0.173154 | 0.104251 | 0.085906 | 0.047427 | 0.047427 | 0.047427 | 0 | 0.036217 | 0.206915 | 3,760 | 137 | 110 | 27.445255 | 0.71328 | 0 | 0 | 0.093458 | 0 | 0 | 0.014933 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.056075 | null | null | 0.018692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b93b21d31a5eecb527d2b3ad7f00cf5d4683d661 | 1,535 | py | Python | forms.py | lennykioko/Flask-social-network | 15bfe1f7dca90074c0cbef62c5da9d5a25b5ce65 | [
"MIT"
] | 1 | 2018-04-15T19:35:54.000Z | 2018-04-15T19:35:54.000Z | forms.py | lennykioko/Flask-social-network | 15bfe1f7dca90074c0cbef62c5da9d5a25b5ce65 | [
"MIT"
] | null | null | null | forms.py | lennykioko/Flask-social-network | 15bfe1f7dca90074c0cbef62c5da9d5a25b5ce65 | [
"MIT"
] | null | null | null | # forms are not just about display, instead they are more of validation
# wtf forms protect our site against csrf attacks
from flask_wtf import FlaskForm
from wtforms import StringField, PasswordField, TextAreaField
from wtforms.validators import (DataRequired, Regexp, ValidationError, Email,
Length, EqualTo)
from models import User
def name_exists(form, field):
if User.select().where(User.username == field.data).exists():
raise ValidationError('User with this name already exists.')
def email_exists(form, field):
if User.select().where(User.email == field.data).exists():
raise ValidationError('User with this email already exists.')
class RegisterForm(FlaskForm):
username = StringField(
'Username', # is the label
validators=[
DataRequired(),
Regexp(
r'^[a-zA-Z0-9_]+$',
message = ("Username should be one word, letters, numbers and underscores only.")
),
name_exists
])
email = StringField(
'Email',
validators=[
DataRequired(),
Email(),
email_exists
])
password = PasswordField(
'Password',
validators=[
DataRequired(),
Length(min=8),
EqualTo('password2', message = 'Passwords must match')
])
password2 = PasswordField(
'Confirm Password',
validators=[DataRequired()
])
class LoginForm(FlaskForm):
email = StringField('Email', validators=[DataRequired(), Email()])
password = PasswordField('Password', validators=[DataRequired()])
class PostForm(FlaskForm):
content = TextAreaField("What's Up?", validators = [DataRequired()])
| 25.583333 | 85 | 0.712704 | 171 | 1,535 | 6.362573 | 0.48538 | 0.141544 | 0.082721 | 0.03125 | 0.334559 | 0.240809 | 0.152574 | 0.152574 | 0 | 0 | 0 | 0.003891 | 0.162866 | 1,535 | 59 | 86 | 26.016949 | 0.842802 | 0.084691 | 0 | 0.222222 | 0 | 0 | 0.172734 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044444 | false | 0.155556 | 0.088889 | 0 | 0.355556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b9417eb816defb8a05e4de472fa5d06b0845774d | 4,237 | py | Python | src/c/c_pyzstd.py | corneliusroemer/pyzstd | 06f14ad29735d9ae85c188703dcb64c24686c4f2 | [
"BSD-3-Clause"
] | 29 | 2020-10-13T03:35:37.000Z | 2022-03-14T11:09:47.000Z | src/c/c_pyzstd.py | corneliusroemer/pyzstd | 06f14ad29735d9ae85c188703dcb64c24686c4f2 | [
"BSD-3-Clause"
] | 12 | 2020-12-22T02:27:47.000Z | 2022-03-18T14:54:33.000Z | src/c/c_pyzstd.py | corneliusroemer/pyzstd | 06f14ad29735d9ae85c188703dcb64c24686c4f2 | [
"BSD-3-Clause"
] | 3 | 2020-11-21T20:57:10.000Z | 2021-09-26T01:14:44.000Z | from collections import namedtuple
from enum import IntEnum
from ._zstd import *
from . import _zstd
__all__ = (# From this file
'compressionLevel_values', 'get_frame_info',
'CParameter', 'DParameter', 'Strategy',
# From _zstd
'ZstdCompressor', 'RichMemZstdCompressor',
'ZstdDecompressor', 'EndlessZstdDecompressor',
'ZstdDict', 'ZstdError', 'decompress', 'get_frame_size',
'compress_stream', 'decompress_stream',
'zstd_version', 'zstd_version_info', 'zstd_support_multithread')
# Used in __init__.py
_ZSTD_DStreamInSize = _zstd._ZSTD_DStreamInSize
_train_dict = _zstd._train_dict
_finalize_dict = _zstd._finalize_dict
# compressionLevel_values
_nt_values = namedtuple('values', ['default', 'min', 'max'])
compressionLevel_values = _nt_values(_zstd._ZSTD_defaultCLevel,
_zstd._ZSTD_minCLevel,
_zstd._ZSTD_maxCLevel)
_nt_frame_info = namedtuple('frame_info',
['decompressed_size', 'dictionary_id'])
def get_frame_info(frame_buffer):
    """Get zstd frame information from a frame header.
    Argument
    frame_buffer: A bytes-like object. It should start from the beginning of
a frame, and needs to include at least the frame header (6 to
18 bytes).
Return a two-items namedtuple: (decompressed_size, dictionary_id)
If decompressed_size is None, decompressed size is unknown.
dictionary_id is a 32-bit unsigned integer value. 0 means dictionary ID was
not recorded in the frame header, the frame may or may not need a dictionary
to be decoded, and the ID of such a dictionary is not specified.
It's possible to append more items to the namedtuple in the future."""
ret_tuple = _zstd._get_frame_info(frame_buffer)
return _nt_frame_info(*ret_tuple)
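# Illustrative usage sketch (the concrete values shown are assumptions, not
# library guarantees):
#
#     import pyzstd
#     frame = pyzstd.compress(b'hello')
#     info = get_frame_info(frame)
#     info.decompressed_size  # 5, or None if not recorded in the header
#     info.dictionary_id      # 0 when no dictionary ID was recorded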
class CParameter(IntEnum):
"""Compression parameters"""
compressionLevel = _zstd._ZSTD_c_compressionLevel
windowLog = _zstd._ZSTD_c_windowLog
hashLog = _zstd._ZSTD_c_hashLog
chainLog = _zstd._ZSTD_c_chainLog
searchLog = _zstd._ZSTD_c_searchLog
minMatch = _zstd._ZSTD_c_minMatch
targetLength = _zstd._ZSTD_c_targetLength
strategy = _zstd._ZSTD_c_strategy
enableLongDistanceMatching = _zstd._ZSTD_c_enableLongDistanceMatching
ldmHashLog = _zstd._ZSTD_c_ldmHashLog
ldmMinMatch = _zstd._ZSTD_c_ldmMinMatch
ldmBucketSizeLog = _zstd._ZSTD_c_ldmBucketSizeLog
ldmHashRateLog = _zstd._ZSTD_c_ldmHashRateLog
contentSizeFlag = _zstd._ZSTD_c_contentSizeFlag
checksumFlag = _zstd._ZSTD_c_checksumFlag
dictIDFlag = _zstd._ZSTD_c_dictIDFlag
nbWorkers = _zstd._ZSTD_c_nbWorkers
jobSize = _zstd._ZSTD_c_jobSize
overlapLog = _zstd._ZSTD_c_overlapLog
def bounds(self):
"""Return lower and upper bounds of a parameter, both inclusive."""
# 1 means compression parameter
return _zstd._get_param_bounds(1, self.value)
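    # Illustrative check (a sketch; the actual bounds depend on the zstd
    # build this module was linked against):
    #
    #     low, high = CParameter.compressionLevel.bounds()
    #     assert low <= 3 <= high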
class DParameter(IntEnum):
"""Decompression parameters"""
windowLogMax = _zstd._ZSTD_d_windowLogMax
def bounds(self):
"""Return lower and upper bounds of a parameter, both inclusive."""
# 0 means decompression parameter
return _zstd._get_param_bounds(0, self.value)
class Strategy(IntEnum):
"""Compression strategies, listed from fastest to strongest.
Note : new strategies _might_ be added in the future, only the order
(from fast to strong) is guaranteed.
"""
fast = _zstd._ZSTD_fast
dfast = _zstd._ZSTD_dfast
greedy = _zstd._ZSTD_greedy
lazy = _zstd._ZSTD_lazy
lazy2 = _zstd._ZSTD_lazy2
btlazy2 = _zstd._ZSTD_btlazy2
btopt = _zstd._ZSTD_btopt
btultra = _zstd._ZSTD_btultra
btultra2 = _zstd._ZSTD_btultra2
# Set CParameter/DParameter types for validity check
_zstd._set_parameter_types(CParameter, DParameter) | 36.213675 | 80 | 0.663441 | 468 | 4,237 | 5.583333 | 0.346154 | 0.101033 | 0.065442 | 0.022962 | 0.091083 | 0.073479 | 0.04822 | 0.04822 | 0.04822 | 0.04822 | 0 | 0.005181 | 0.271182 | 4,237 | 117 | 81 | 36.213675 | 0.840997 | 0.275667 | 0 | 0.032787 | 0 | 0 | 0.108725 | 0.030537 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04918 | false | 0 | 0.065574 | 0 | 0.688525 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b9421dbb7e263a5a3de9a9e29e270b09ceba630c | 1,004 | py | Python | django_events/users/management/commands/create_default_su.py | chrisBrookes93/django-events-management | 93886448a7bb85c8758324977ff67bcacc80bbec | [
"MIT"
] | null | null | null | django_events/users/management/commands/create_default_su.py | chrisBrookes93/django-events-management | 93886448a7bb85c8758324977ff67bcacc80bbec | [
"MIT"
] | null | null | null | django_events/users/management/commands/create_default_su.py | chrisBrookes93/django-events-management | 93886448a7bb85c8758324977ff67bcacc80bbec | [
"MIT"
] | null | null | null | from django.core.management.base import BaseCommand
from django.contrib.auth import get_user_model
class Command(BaseCommand):
help = "Creates a default super user if one doesn't already exist. " \
"This is designed to be used in the docker-compose.yml to create an initial super user on deployment."
def handle(self, *args, **kwargs):
"""
Checks whether any super users exist and creates a default one if not
:param args: Unused
:param kwargs: Unused
"""
super_users = get_user_model().objects.filter(is_superuser=True)
if super_users.exists():
self.stdout.write('A superuser already exists, not creating one')
else:
get_user_model().objects.create_superuser(email="admin@events.com", password="EventsEvents")
self.stdout.write('Created default superuser "admin@events.com"')
self.stdout.write('Make sure you change the password immediately!')
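# Illustrative invocation (the command name comes from this module's file
# name; the manage.py path is an assumption about the project layout):
#
#     python manage.py create_default_su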
| 41.833333 | 114 | 0.661355 | 129 | 1,004 | 5.069767 | 0.581395 | 0.03211 | 0.055046 | 0.058104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1,004 | 23 | 115 | 43.652174 | 0.868526 | 0.111554 | 0 | 0 | 0 | 0.076923 | 0.387214 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.153846 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b942ff3dafb5c886434a478e8bfb0592e83afd1c | 6,215 | bzl | Python | antlir/bzl/image_layer.bzl | zeroxoneb/antlir | 811d88965610d16a5c85d831d317f087797ca732 | [
"MIT"
] | 28 | 2020-08-11T16:22:46.000Z | 2022-03-04T15:41:52.000Z | antlir/bzl/image_layer.bzl | zeroxoneb/antlir | 811d88965610d16a5c85d831d317f087797ca732 | [
"MIT"
] | 137 | 2020-08-11T16:07:49.000Z | 2022-02-27T10:59:05.000Z | antlir/bzl/image_layer.bzl | zeroxoneb/antlir | 811d88965610d16a5c85d831d317f087797ca732 | [
"MIT"
] | 10 | 2020-09-10T00:01:28.000Z | 2022-03-08T18:00:28.000Z | # Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""
An `image.layer` is a set of `feature`s with some additional parameters. Its
purpose is to materialize those `feature`s as a btrfs subvolume in the
per-repo `buck-image/out/volume/targets`.
We call the subvolume a "layer" because it can be built on top of a snapshot
of its `parent_layer`, and thus can be represented as a btrfs send-stream for
more efficient storage & distribution.
The Buck output of an `image.layer` target is a JSON file with information
on how to find the resulting layer in the per-repo
`buck-image/out/volume/targets`. See `SubvolumeOnDisk.to_json_file`.
## Implementation notes
The implementation of this converter deliberately minimizes the amount of
business logic in its command. The converter must include **only** our
interactions with the buck target graph. Everything else should be
delegated to subcommands.
### Command
In composing the `bash` command, our core maxim is: make it a hermetic
function of the converter's inputs -- do not read data from disk, do not
insert disk paths into the command, do not do anything that might cause the
bytes of the command to vary between machines or between runs. To achieve
this, we use Buck macros to resolve all paths, including those to helper
scripts. We rely on environment variables or pipes to pass data between the
helper scripts.
Another reason to keep this converter minimal is that `buck test` cannot
make assertions about targets that fail to build. Since we only have the
ability to test the "good" targets, it behooves us to put most logic in
external scripts, so that we can unit-test its successes **and** failures
thoroughly.
### Output
We mark `image.layer` uncacheable, because there's no easy way to teach Buck
to serialize a btrfs subvolume (for that, we have `package.new`).
That said, we should still follow best practices to avoid problems if e.g.
the user renames their repo, or similar. These practices include:
- The output JSON must store no absolute paths.
- Store Buck target paths instead of paths into the output directory.
### Dependency resolution
An `image.layer` consumes a set of `feature` outputs to decide what to put into
the btrfs subvolume. These outputs are actually just JSON files that
reference other targets, and do not contain the data to be written into the
image.
Therefore, `image.layer` has to explicitly tell buck that it needs all
direct dependencies of its `feature`s to be present on disk -- see our
`attrfilter` queries below. Without this, Buck would merrily fetch the just
the `feature` JSONs from its cache, and not provide us with any of the
buid artifacts that comprise the image.
We do NOT need the direct dependencies of the parent layer's features,
because we treat the parent layer as a black box -- whatever it has laid
down in the image, that's what it provides (and we don't care about how).
The consequences of this information hiding are:
- Better Buck cache efficiency -- we don't have to download
the dependencies of the ancestor layers' features. Doing that would be
wasteful, since those bits are redundant with what's in the parent.
- Ability to use genrule image layers / apply non-pure post-processing to
a layer. In terms of engineering, both of these non-pure approaches are
a terrible idea and a maintainability headache, but they do provide a
useful bridge for transitioning to Buck image builds from legacy
imperative systems.
- The image compiler needs a little extra code to walk the parent layer and
determine what it provides.
- We cannot have "unobservable" dependencies between features. Since
feature dependencies are expected to routinely cross layer boundaries,
feature implementations are forced only to depend on data that can be
inferred from the filesystem -- since this is all that the parent layer
implementation can do. NB: This is easy to relax in the future by
writing a manifest with additional metadata into each layer, and using
that metadata during compilation.
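### Example
A minimal, illustrative invocation (all target names and the flavor value
here are hypothetical):
    image.layer(
        name = "my_layer",
        parent_layer = "//images:base",
        features = ["//images:my_feature"],
        flavor = "antlir_test",
    )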
"""
load(":compile_image_features.bzl", "compile_image_features")
load(":image_layer_utils.bzl", "image_layer_utils")
load(":image_utils.bzl", "image_utils")
def image_layer(
name,
parent_layer = None,
features = None,
flavor = None,
flavor_config_override = None,
antlir_rule = "user-internal",
**image_layer_kwargs):
"""
Arguments
- `parent_layer`: The name of another `image_layer` target, on
top of which the current layer will install its features.
- `features`: List of `feature` target paths and/or
nameless structs from `feature.new`.
- `flavor`: Picks default build options for the layer, including
`build_appliance`, RPM installer, and others. See `flavor_helpers.bzl`
for details.
- `flavor_config_override`: A struct that can override the default
values fetched from `REPO_CFG[flavor].flavor_to_config`.
- `mount_config`: Specifies how this layer is mounted in the
`mounts` field of a `feature` of a parent layer. See
the field in `_image_layer_impl` in `image_layer_utils.bzl`
- `runtime`: A list of desired helper buck targets to be emitted.
`container` is always included in the list by default.
See the field in `_image_layer_impl` in `image_layer_utils.bzl` and the
[docs](/docs/tutorials/helper-buck-targets#imagelayer) for the list of
possible helpers, their respective behaviours, and how to invoke them.
"""
image_layer_utils.image_layer_impl(
_rule_type = "image_layer",
_layer_name = name,
# Build a new layer. It may be empty.
_make_subvol_cmd = compile_image_features(
name = name,
current_target = image_utils.current_target(name),
parent_layer = parent_layer,
features = features,
flavor = flavor,
flavor_config_override = flavor_config_override,
),
antlir_rule = antlir_rule,
**image_layer_kwargs
)
| 44.078014 | 79 | 0.740628 | 950 | 6,215 | 4.774737 | 0.374737 | 0.039683 | 0.016534 | 0.011905 | 0.037037 | 0.037037 | 0.037037 | 0.037037 | 0.037037 | 0.020723 | 0 | 0 | 0.20177 | 6,215 | 140 | 80 | 44.392857 | 0.914332 | 0.8428 | 0 | 0 | 0 | 0 | 0.157596 | 0.080499 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0 | 0 | 0.04 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9458ab72f55b4db845f6d76e44dba3b00e000ed | 6,265 | py | Python | src/features/v3/proc_v3_n1_calc_distance.py | askoki/nfl_dpi_prediction | dc3256f24ddc0b6725eace2081d1fb1a7e5ce805 | [
"MIT"
] | null | null | null | src/features/v3/proc_v3_n1_calc_distance.py | askoki/nfl_dpi_prediction | dc3256f24ddc0b6725eace2081d1fb1a7e5ce805 | [
"MIT"
] | null | null | null | src/features/v3/proc_v3_n1_calc_distance.py | askoki/nfl_dpi_prediction | dc3256f24ddc0b6725eace2081d1fb1a7e5ce805 | [
"MIT"
] | null | null | null | import os
import sys
import pandas as pd
from datetime import datetime
from settings import RAW_DATA_DIR, DataV3, DATA_V3_SUBVERSION
from src.features.helpers.processing import add_missing_timestamp_values
from src.features.helpers.processing_v3 import get_closest_players, get_players_and_ball_indices, calculate_distance, \
normalize_according_to_play_direction, check_group_event
from src.features.helpers.processing_v4 import home_has_possession, calculate_team_sitation
week_num = int(sys.argv[1])
data_v3 = DataV3(DATA_V3_SUBVERSION)
save_file_path = data_v3.get_step1_checkpoint_path(week_num)
try:
clean_df = pd.read_csv(save_file_path)
save_file_exists = True
except FileNotFoundError:
save_file_exists = False
if not save_file_exists:
print("Started loading data")
play_df = pd.read_csv(os.path.join(RAW_DATA_DIR, 'plays.csv'))
games_df = pd.read_csv(os.path.join(RAW_DATA_DIR, 'games.csv'))
week_and_games = games_df[games_df.week == week_num]
tracking_df = pd.read_csv(os.path.join(RAW_DATA_DIR, f'week{week_num}.csv'))
print("Data loaded. Start processing timestamps")
tracking_df = add_missing_timestamp_values(tracking_df)
games_n_plays_df = play_df.merge(week_and_games, how='inner', on='gameId')
m_grouped = games_n_plays_df.groupby(['gameId', 'playId'])
df_t = tracking_df.merge(games_n_plays_df, how='left', on=['gameId', 'playId'])
# Remove all events without 'pass_forward'
df_t_grouped = df_t.groupby(['gameId', 'playId'])
df_t_v3 = df_t.copy().sort_index()
for name, group in df_t_grouped:
game_id, play_id = name
# if group does not contain pass forward, drop it
if all(group.event != 'pass_forward'):
df_t_v3 = df_t_v3[(df_t_v3.gameId != game_id) | (df_t_v3.playId != play_id)]
df_t_v3_s = df_t_v3.sort_values(by=['gameId', 'playId', 'time', 'event'])
df_t_v3_s = df_t_v3_s.reset_index(drop=True)
df_t_grouped = df_t_v3_s.groupby(['gameId', 'playId'])
# remove all values before 'pass_forward'
print("Removing all values before pass forward event...")
for name, group in df_t_grouped:
game_id, play_id = name
pass_forward_frame_id = group[group.event == 'pass_forward'].index.min() - 1
remove_start = group.index.min()
df_t_v3_s = df_t_v3_s.drop(df_t_v3_s.loc[remove_start:pass_forward_frame_id].index)
pd.options.mode.chained_assignment = None
gb = df_t_v3_s.groupby(['gameId', 'playId'])
print('Getting closest players...')
keep_indices = []
for name, group in gb:
game_id, play_id = name
try:
event_3rd = group.event.unique()[2]
except IndexError:
print('Number of events is < 3, skipping...')
continue
situation_df = group[group.event == event_3rd]
# convert dataframe into series
ball_row = situation_df[situation_df.team == 'football'].head(1)
# remove ball
player_situation_df = situation_df[situation_df.team != 'football']
try:
p1, p2 = get_closest_players(player_situation_df, ball_row.x.item(), ball_row.y.item())
except ValueError:
print('Value Error raised. This group will be skipped.')
continue
p_n_b_indices = get_players_and_ball_indices(group, p1, p2)
if p_n_b_indices:
keep_indices.extend(p_n_b_indices)
clean_df = df_t_v3_s[df_t_v3_s.index.isin(keep_indices)]
clean_df.to_csv(
save_file_path,
index=False
)
print('Normalize...')
clean_df = normalize_according_to_play_direction(clean_df)
clean_df['homeHasPossession'] = clean_df.apply(
lambda row: home_has_possession(row), axis=1
)
clean_df['teamSituation'] = clean_df.apply(
lambda row: calculate_team_sitation(row), axis=1
)
print('Creating features...')
min_df = clean_df[[
'time', 'x', 'y', 's', 'o', 'dir', 'event', 'team',
'gameId', 'playId', 'frameId', 'isDefensivePI'
]]
gb_2 = clean_df.groupby(['gameId', 'playId', 'frameId'])
# ball direction and orientation are NaN
calc_df = pd.DataFrame(
columns=[
'time',
'att_def_d', 'att_ball_d', 'def_ball_d',
'att_s', 'def_s', 'ball_s',
'att_o', 'def_o',
'att_dir', 'def_dir',
'event', 'gameId', 'playId', 'frameId', 'isDefensivePI'
]
)
GROUP_SIZE_MINIMUM = 3
for name, group in gb_2:
game_id, play_id, frameId = name
if len(group) < GROUP_SIZE_MINIMUM:
continue
ball = group[group.teamSituation == 'football'].head(1).squeeze()
p_att = group[group.teamSituation == 'attacking'].head(1).squeeze()
p_def = group[group.teamSituation == 'defending'].head(1).squeeze()
group_row = group.head(1).squeeze()
group_events = group.event.unique().tolist()
dict_to_append = {
'time': group_row.time,
'att_def_d': calculate_distance(p_att.x, p_att.y, p_def.x, p_def.y),
'att_ball_d': calculate_distance(p_att.x, p_att.y, ball.x, ball.y),
'def_ball_d': calculate_distance(p_def.x, p_def.y, ball.x, ball.y),
'att_s': p_att.s, 'def_s': p_def.s, 'ball_s': ball.s,
'att_a': p_att.a, 'def_a': p_def.a, 'ball_a': ball.a,
'att_o': p_att.o, 'def_o': p_def.o,
'att_dir': p_att.dir, 'def_dir': p_def.dir,
'event': group_row.event,
'pass_arrived': check_group_event(group_events, 'pass_arrived'),
'pass_outcome_caught': check_group_event(group_events, 'pass_outcome_caught'),
'tackle': check_group_event(group_events, 'tackle'),
'first_contact': check_group_event(group_events, 'first_contact'),
'pass_outcome_incomplete': check_group_event(group_events, 'pass_outcome_incomplete'),
'out_of_bounds': check_group_event(group_events, 'out_of_bounds'),
'week': week_num,
'gameId': group_row.gameId,
'playId': group_row.playId,
'frameId': group_row.frameId,
'isDefensivePI': group_row.isDefensivePI
}
calc_df = calc_df.append(
dict_to_append,
ignore_index=True
)
print("Saving data...")
calc_df.to_csv(
data_v3.get_step1_end_path(week_num),
index=False
)
print(f'End time: {datetime.now().strftime("%H:%M:%S")}')
| 35.596591 | 119 | 0.675499 | 934 | 6,265 | 4.17666 | 0.217345 | 0.017688 | 0.020508 | 0.015381 | 0.277108 | 0.136632 | 0.102538 | 0.067931 | 0.058703 | 0.044348 | 0 | 0.009312 | 0.194413 | 6,265 | 175 | 120 | 35.8 | 0.763622 | 0.03336 | 0 | 0.095588 | 0 | 0 | 0.168788 | 0.013721 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.051471 | 0.058824 | 0 | 0.058824 | 0.073529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b945e094a775936b9b256c03b9ad1404cebcb291 | 1,312 | py | Python | annotate-preprocessed.py | Rajpratik71/devel-scripts | 068285719a13b02889b1314361cc5bdb764d9a3a | [
"Apache-2.0"
] | null | null | null | annotate-preprocessed.py | Rajpratik71/devel-scripts | 068285719a13b02889b1314361cc5bdb764d9a3a | [
"Apache-2.0"
] | null | null | null | annotate-preprocessed.py | Rajpratik71/devel-scripts | 068285719a13b02889b1314361cc5bdb764d9a3a | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
"""Annotates -E preprocessed source input with line numbers.
Read std input, then annotate each line with line number based on previous
expanded line directives from -E output. Useful in the context of compiler
debugging.
"""
import getopt
import os
import re
import sys
import script_utils as u
flag_reverse = True
def usage(msgarg):
"""Print usage and exit."""
if msgarg:
sys.stderr.write("error: %s\n" % msgarg)
print """\
usage: %s [options] < input > output
options:
-d increase debug msg verbosity level
""" % os.path.basename(sys.argv[0])
sys.exit(1)
def parse_args():
"""Command line argument parsing."""
global flag_reverse
try:
optlist, _ = getopt.getopt(sys.argv[1:], "dr")
except getopt.GetoptError as err:
# unrecognized option
usage(str(err))
for opt, _ in optlist:
if opt == "-d":
u.increment_verbosity()
elif opt == "-r":
flag_reverse = False
# Setup
u.setdeflanglocale()
parse_args()
# Read
lines = sys.stdin.readlines()
lnum = -1
matcher = re.compile(r"^\#\s+(\d+)\s+\"(\S+)\".*$")
for line in lines:
m = matcher.match(line)
if m:
lnum = int(m.group(1))
afile = m.group(2)
print "<%s:%d>" % (afile, lnum)
continue
print "%d:%s" % (lnum, line.strip())
lnum += 1
| 19.014493 | 74 | 0.636433 | 188 | 1,312 | 4.393617 | 0.553191 | 0.039952 | 0.038741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00679 | 0.214177 | 1,312 | 68 | 75 | 19.294118 | 0.794374 | 0.035823 | 0 | 0 | 0 | 0 | 0.159629 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.125 | null | null | 0.075 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9475ee1123a7f8c87eb161ddf2246d4b5a64a79 | 1,847 | py | Python | fst_web/demo_settings.py | kamidev/autobuild_fst | 6baffa955075ffe3c5f197789e9fd065fa74058e | [
"BSD-3-Clause"
] | null | null | null | fst_web/demo_settings.py | kamidev/autobuild_fst | 6baffa955075ffe3c5f197789e9fd065fa74058e | [
"BSD-3-Clause"
] | null | null | null | fst_web/demo_settings.py | kamidev/autobuild_fst | 6baffa955075ffe3c5f197789e9fd065fa74058e | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
import os
ROOT = os.path.abspath(os.path.dirname(__file__))
path = lambda *args: os.path.join(ROOT, *args)
""" Template for local settings of the FST webservice (fst_web)
Please edit this file and replace all generic values with values suitable to
your particular installation.
"""
# NOTE! Always set this to False before deploying
DEBUG = True
# NOTE! Before deploying on a public, uncomment ALLOWED_HOSTS
# and add IP address and/or domain of your site
ALLOWED_HOSTS = ['localhost', '127.0.0.1', 'fst.magokoro.nu']
# Look for instance-specific settings
try:
from .instance_settings import *
except ImportError:
from .default_instance_settings import *
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': path('database/fst_demo.db')
}
}
LOG_LEVEL = "DEBUG"
# Enable this to override global DB Debug setting
# DB_DEBUG_LEVEL = "DEBUG"
# Setup mail server for sending email notifications.
# You can use any mail server you want.
# But a very simple way to get started is to use a gmail account.
EMAIL_USE_TLS = True
EMAIL_HOST = 'smtp.gmail.com'
EMAIL_PORT = 587
# EMAIL_HOST_USER = 'your email'
# EMAIL_HOST_PASSWORD = 'your password'
# Admins specified here receive email notifications on critical errors.
ADMINS = ()
MANAGERS = ADMINS
# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash.
# Examples: "http://media.lawrence.com/media/", "http://example.com/media/"
MEDIA_URL = os.path.join("/dokument/")
# Site and port for hosting FST service (do not add ending '/').
FST_SITE_URL = "http://127.0.0.1:8000"
# TODO - Check if FST_INSTANCE_PREFIX can be removed
# Site and port of specific FST instance (do not add ending '/').
FST_INSTANCE_URL = os.path.join(
"http://127.0.0.1:8000",
FST_INSTANCE_PREFIX)
| 28.415385 | 76 | 0.721711 | 281 | 1,847 | 4.629893 | 0.519573 | 0.023059 | 0.023059 | 0.013836 | 0.047656 | 0.021522 | 0 | 0 | 0 | 0 | 0 | 0.020182 | 0.168381 | 1,847 | 64 | 77 | 28.859375 | 0.826823 | 0.494857 | 0 | 0 | 0 | 0 | 0.226287 | 0.03523 | 0 | 0 | 0 | 0.015625 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b94890b4860019fd993040c0790c0701fc24a0c5 | 2,919 | py | Python | main.py | valurhrafn/chromium-sync | df5e3299d179fc47ff34d1a95409383f46aac4d4 | [
"MIT"
] | 4 | 2017-03-27T02:25:07.000Z | 2021-03-07T21:40:58.000Z | main.py | valurhrafn/chromium-sync | df5e3299d179fc47ff34d1a95409383f46aac4d4 | [
"MIT"
] | null | null | null | main.py | valurhrafn/chromium-sync | df5e3299d179fc47ff34d1a95409383f46aac4d4 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
#
# Copyright 2007 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.appengine.api import users
import webapp2
# For datastore
import cgi
import urllib
from google.appengine.ext import ndb
class UserId(ndb.Model):
content = ndb.StringProperty()
date = ndb.DateTimeProperty(auto_now_add=True)
@classmethod
def query_user(cls, ancestor_key):
return cls.query(ancestor=ancestor_key).order(-cls.date)
# ************** MainHandler ************* #
class MainHandler(webapp2.RequestHandler):
def get(self):
self.response.write('Hello world!')
# ************** GetUser ************* #
class GetUser(webapp2.RequestHandler):
def get(self):
self.response.out.write('<html><body>')
client_id = self.request.get('client_id')
ancestor_key = ndb.Key("ID", client_id or "*no_id*")
userids = UserId.query_user(ancestor_key).fetch(20)
        self.response.out.write('here is something')
for userid in userids:
self.response.out.write('<blockquote>%s</blockquote>' %
cgi.escape(userid.content))
# Checks for active Google account session
# user = users.get_current_user()
# if user:
# self.response.headers['Content-Type'] = 'text/plain'
# self.response.write('Hello, ' + user.nickname())
# else:
# self.redirect(users.create_login_url(self.request.uri))
self.response.out.write('</body></html>')
def post(self):
pass
# ************** HasData ************* #
class HasData(webapp2.RequestHandler):
def get(self):
pass
#TODO does user have data
class PostData(webapp2.RequestHandler):
def post(self):
client_id = self.request.get('client_id')
chrome_user = UserId(parent=ndb.Key("ID", client_id or "*no_id*"),
content = self.request.get('client_id'))
chrome_user.put()
        #TODO receive data from client
class GetSyncData(object):
"""docstring for GetSyncData"""
def __init__(self, arg):
super(GetSyncData, self).__init__()
self.arg = arg
#implement get data for user
# property user.email() or user.user_id()
app = webapp2.WSGIApplication([
('/', MainHandler),
('/GetUser/', GetUser),
('/HasData/', HasData),
('/chrome-sync/command/', PostData),
('/GetSyncData/', GetSyncData)
], debug=True)
| 30.40625 | 74 | 0.647825 | 363 | 2,919 | 5.121212 | 0.4573 | 0.045186 | 0.051641 | 0.043034 | 0.141474 | 0.124798 | 0.124798 | 0.023669 | 0 | 0 | 0 | 0.006891 | 0.204522 | 2,919 | 95 | 75 | 30.726316 | 0.793712 | 0.378554 | 0 | 0.191489 | 0 | 0 | 0.100338 | 0.027058 | 0 | 0 | 0 | 0.010526 | 0 | 1 | 0.148936 | false | 0.042553 | 0.106383 | 0.021277 | 0.446809 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b94a534d42db78fa886439d7fdfdf20e0f8b2504 | 1,434 | py | Python | comet/service/subscriber.py | dneise/Comet | abaa0da65d69f90a5262d81416477b4e71deb2ad | [
"BSD-2-Clause"
] | 15 | 2015-11-29T18:53:58.000Z | 2022-03-09T15:47:30.000Z | comet/service/subscriber.py | dneise/Comet | abaa0da65d69f90a5262d81416477b4e71deb2ad | [
"BSD-2-Clause"
] | 29 | 2016-01-21T18:10:45.000Z | 2021-10-01T16:41:12.000Z | comet/service/subscriber.py | dneise/Comet | abaa0da65d69f90a5262d81416477b4e71deb2ad | [
"BSD-2-Clause"
] | 11 | 2016-01-22T14:05:51.000Z | 2022-03-09T17:49:56.000Z | # Comet VOEvent Broker.
from twisted.application.internet import ClientService
from comet.protocol.subscriber import VOEventSubscriberFactory
__all__ = ["makeSubscriberService"]
def makeSubscriberService(endpoint, local_ivo, validators, handlers, filters):
"""Create a reconnecting VOEvent subscriber service.
Parameters
----------
endpoint : implements `twisted.internet.interfaces.IStreamClientEndpoint`
The endpoint to which the service will connect.
local_ivo : `str` or `None`
IVOA identifier for the subscriber.
validators : `list` of implementers of `~comet.icomet.IValidator`.
Validators which will be applied to incoming events. Events which fail
validation will be rejected.
handlers : `list` of implementers of `~comet.icomet.IHandler`.
Handlers to which events which pass validation will be passed.
filters : `list` of `str`
XPath filters. Will be passed to upstream as a request to filter the
alerts being sent.
Notes
-----
    Upstream brokers may not provide support for XPath filtering; in this case,
    the filters supplied will be ignored.
Reconnection is handled according to the default policies of
`twisted.application.internet.ClientService`.
"""
factory = VOEventSubscriberFactory(local_ivo, validators, handlers, filters)
service = ClientService(endpoint, factory)
return service
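# Illustrative usage sketch (assumes a running reactor and a VOEvent broker
# listening on localhost:8099; the validator, handler, and filter lists are
# left empty here):
#
#     from twisted.internet import reactor
#     from twisted.internet.endpoints import TCP4ClientEndpoint
#
#     endpoint = TCP4ClientEndpoint(reactor, "localhost", 8099)
#     service = makeSubscriberService(endpoint, None, [], [], [])
#     service.startService()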
| 35.85 | 80 | 0.727336 | 162 | 1,434 | 6.395062 | 0.512346 | 0.028958 | 0.050193 | 0.050193 | 0.123552 | 0.059846 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202929 | 1,434 | 39 | 81 | 36.769231 | 0.906387 | 0.66318 | 0 | 0 | 0 | 0 | 0.053571 | 0.053571 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b94e05939494c3c75adce95bb694899b36d0a091 | 919 | py | Python | src/oslibs/cocos/cocos-src/tools/cocos2d-console/plugins/framework/framework_add.py | dios-game/dios-cocos | b7fbcbafe02f516ef18fdb64b4519dbf806303fc | [
"MIT"
] | 1 | 2021-07-22T15:53:26.000Z | 2021-07-22T15:53:26.000Z | src/oslibs/cocos/cocos-src/tools/cocos2d-console/plugins/framework/framework_add.py | dios-game/dios-cocos | b7fbcbafe02f516ef18fdb64b4519dbf806303fc | [
"MIT"
] | null | null | null | src/oslibs/cocos/cocos-src/tools/cocos2d-console/plugins/framework/framework_add.py | dios-game/dios-cocos | b7fbcbafe02f516ef18fdb64b4519dbf806303fc | [
"MIT"
] | null | null | null |
import cocos
from MultiLanguage import MultiLanguage
from package.helper import ProjectHelper
class FrameworkAdd(cocos.CCPlugin):
@staticmethod
def plugin_name():
return "add-framework"
@staticmethod
def brief_description():
return MultiLanguage.get_string('FRAMEWORK_ADD_BRIEF')
# parse arguments
def parse_args(self, argv):
from argparse import ArgumentParser
parser = ArgumentParser(prog="cocos %s" % self.__class__.plugin_name(),
description=self.__class__.brief_description())
parser.add_argument("name", metavar="NAME", help=MultiLanguage.get_string('FRAMEWORK_ADD_ARG_NAME'))
return parser.parse_args(argv)
def run(self, argv):
args = self.parse_args(argv)
name = args.name
project = ProjectHelper.get_current_project()
ProjectHelper.add_framework(project, name)
| 28.71875 | 108 | 0.686616 | 99 | 919 | 6.111111 | 0.383838 | 0.044628 | 0.072727 | 0.102479 | 0.112397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225245 | 919 | 31 | 109 | 29.645161 | 0.849719 | 0.016322 | 0 | 0.095238 | 0 | 0 | 0.077691 | 0.024417 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.190476 | 0.095238 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b9530c0fbf29c36506820a41f0b32bd37796d3e0 | 1,298 | py | Python | code/examples/example_binomial_and_log_normal_abtest.py | hugopibernat/BayesianABTestAnalysis | 026960524f5313f4a734f30fd447a5731be802e0 | [
"Apache-2.0"
] | null | null | null | code/examples/example_binomial_and_log_normal_abtest.py | hugopibernat/BayesianABTestAnalysis | 026960524f5313f4a734f30fd447a5731be802e0 | [
"Apache-2.0"
] | null | null | null | code/examples/example_binomial_and_log_normal_abtest.py | hugopibernat/BayesianABTestAnalysis | 026960524f5313f4a734f30fd447a5731be802e0 | [
"Apache-2.0"
] | null | null | null | #################################################
####### Author: Hugo Pibernat #######
####### Contact: hugopibernat@gmail.com #######
####### Date: April 2014 #######
#################################################
from bayesianABTest import sampleSuccessRateForBinomial, sampleMeanForLogNormal, probabilityOfABetterThanB
from numpy.random import lognormal
from numpy import mean, concatenate, zeros
# Generate Log-Normal data
A_actuals = lognormal(mean=4.10, sigma=1.0, size=100)
B_actuals = lognormal(mean=4.00, sigma=1.0, size=100)
# Plus some zeros
A_data = concatenate([A_actuals,zeros(10000)])
B_data = concatenate([B_actuals,zeros(10000)])
# Modeling conversions with a binomial variable
A_purchases = sum(A_data > 0)
A_sessions = len(A_data)
B_purchases = sum(B_data > 0)
B_sessions = len(B_data)
A_CR = sampleSuccessRateForBinomial(A_sessions,A_purchases)
B_CR = sampleSuccessRateForBinomial(B_sessions,B_purchases)
# Modeling the spend with a log-normal
A_non_zero_data = A_data[A_data > 0]
B_non_zero_data = B_data[B_data > 0]
A_spend = sampleMeanForLogNormal(A_non_zero_data)
B_spend = sampleMeanForLogNormal(B_non_zero_data)
# Combining the two
A_rps = A_CR*A_spend
B_rps = B_CR*B_spend
# Result:
print(probabilityOfABetterThanB(A_rps, B_rps))
b95403252db42b0394653a122fd73b2b596e194d | 400 | py | Python | app/main.py | meysam81/sheypoor | aa67e20646ebc4143b83968f60c0b28c2ad340a1 | [
"MIT"
] | null | null | null | app/main.py | meysam81/sheypoor | aa67e20646ebc4143b83968f60c0b28c2ad340a1 | [
"MIT"
] | null | null | null | app/main.py | meysam81/sheypoor | aa67e20646ebc4143b83968f60c0b28c2ad340a1 | [
"MIT"
] | null | null | null | from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app import api
from app.core.config import config
app = FastAPI(title="Sheypoor")
# Set all CORS enabled origins
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
app.include_router(api.router, prefix=config.API_URI)
| 21.052632 | 53 | 0.7425 | 51 | 400 | 5.686275 | 0.509804 | 0.075862 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145 | 400 | 18 | 54 | 22.222222 | 0.847953 | 0.07 | 0 | 0 | 0 | 0 | 0.02973 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b9652ceb78b45d3bef98c61d48e3cd4630133615 | 19,317 | py | Python | sdk/python/pulumi_google_native/testing/v1/test_matrix.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 44 | 2021-04-18T23:00:48.000Z | 2022-02-14T17:43:15.000Z | sdk/python/pulumi_google_native/testing/v1/test_matrix.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 354 | 2021-04-16T16:48:39.000Z | 2022-03-31T17:16:39.000Z | sdk/python/pulumi_google_native/testing/v1/test_matrix.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 8 | 2021-04-24T17:46:51.000Z | 2022-01-05T10:40:21.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['TestMatrixArgs', 'TestMatrix']
@pulumi.input_type
class TestMatrixArgs:
def __init__(__self__, *,
environment_matrix: pulumi.Input['EnvironmentMatrixArgs'],
result_storage: pulumi.Input['ResultStorageArgs'],
test_specification: pulumi.Input['TestSpecificationArgs'],
client_info: Optional[pulumi.Input['ClientInfoArgs']] = None,
fail_fast: Optional[pulumi.Input[bool]] = None,
flaky_test_attempts: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
request_id: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a TestMatrix resource.
:param pulumi.Input['EnvironmentMatrixArgs'] environment_matrix: The devices the tests are being executed on.
:param pulumi.Input['ResultStorageArgs'] result_storage: Where the results for the matrix are written.
:param pulumi.Input['TestSpecificationArgs'] test_specification: How to run the test.
:param pulumi.Input['ClientInfoArgs'] client_info: Information about the client which invoked the test.
:param pulumi.Input[bool] fail_fast: If true, only a single attempt at most will be made to run each execution/shard in the matrix. Flaky test attempts are not affected. Normally, 2 or more attempts are made if a potential infrastructure issue is detected. This feature is for latency sensitive workloads. The incidence of execution failures may be significantly greater for fail-fast matrices and support is more limited because of that expectation.
:param pulumi.Input[int] flaky_test_attempts: The number of times a TestExecution should be re-attempted if one or more of its test cases fail for any reason. The maximum number of reruns allowed is 10. Default is 0, which implies no reruns.
:param pulumi.Input[str] project: The cloud project that owns the test matrix.
"""
pulumi.set(__self__, "environment_matrix", environment_matrix)
pulumi.set(__self__, "result_storage", result_storage)
pulumi.set(__self__, "test_specification", test_specification)
if client_info is not None:
pulumi.set(__self__, "client_info", client_info)
if fail_fast is not None:
pulumi.set(__self__, "fail_fast", fail_fast)
if flaky_test_attempts is not None:
pulumi.set(__self__, "flaky_test_attempts", flaky_test_attempts)
if project is not None:
pulumi.set(__self__, "project", project)
if request_id is not None:
pulumi.set(__self__, "request_id", request_id)
@property
@pulumi.getter(name="environmentMatrix")
def environment_matrix(self) -> pulumi.Input['EnvironmentMatrixArgs']:
"""
The devices the tests are being executed on.
"""
return pulumi.get(self, "environment_matrix")
@environment_matrix.setter
def environment_matrix(self, value: pulumi.Input['EnvironmentMatrixArgs']):
pulumi.set(self, "environment_matrix", value)
@property
@pulumi.getter(name="resultStorage")
def result_storage(self) -> pulumi.Input['ResultStorageArgs']:
"""
Where the results for the matrix are written.
"""
return pulumi.get(self, "result_storage")
@result_storage.setter
def result_storage(self, value: pulumi.Input['ResultStorageArgs']):
pulumi.set(self, "result_storage", value)
@property
@pulumi.getter(name="testSpecification")
def test_specification(self) -> pulumi.Input['TestSpecificationArgs']:
"""
How to run the test.
"""
return pulumi.get(self, "test_specification")
@test_specification.setter
def test_specification(self, value: pulumi.Input['TestSpecificationArgs']):
pulumi.set(self, "test_specification", value)
@property
@pulumi.getter(name="clientInfo")
def client_info(self) -> Optional[pulumi.Input['ClientInfoArgs']]:
"""
Information about the client which invoked the test.
"""
return pulumi.get(self, "client_info")
@client_info.setter
def client_info(self, value: Optional[pulumi.Input['ClientInfoArgs']]):
pulumi.set(self, "client_info", value)
@property
@pulumi.getter(name="failFast")
def fail_fast(self) -> Optional[pulumi.Input[bool]]:
"""
If true, only a single attempt at most will be made to run each execution/shard in the matrix. Flaky test attempts are not affected. Normally, 2 or more attempts are made if a potential infrastructure issue is detected. This feature is for latency sensitive workloads. The incidence of execution failures may be significantly greater for fail-fast matrices and support is more limited because of that expectation.
"""
return pulumi.get(self, "fail_fast")
@fail_fast.setter
def fail_fast(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "fail_fast", value)
@property
@pulumi.getter(name="flakyTestAttempts")
def flaky_test_attempts(self) -> Optional[pulumi.Input[int]]:
"""
The number of times a TestExecution should be re-attempted if one or more of its test cases fail for any reason. The maximum number of reruns allowed is 10. Default is 0, which implies no reruns.
"""
return pulumi.get(self, "flaky_test_attempts")
@flaky_test_attempts.setter
def flaky_test_attempts(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "flaky_test_attempts", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The cloud project that owns the test matrix.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="requestId")
def request_id(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "request_id")
@request_id.setter
def request_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "request_id", value)
class TestMatrix(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_info: Optional[pulumi.Input[pulumi.InputType['ClientInfoArgs']]] = None,
environment_matrix: Optional[pulumi.Input[pulumi.InputType['EnvironmentMatrixArgs']]] = None,
fail_fast: Optional[pulumi.Input[bool]] = None,
flaky_test_attempts: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
request_id: Optional[pulumi.Input[str]] = None,
result_storage: Optional[pulumi.Input[pulumi.InputType['ResultStorageArgs']]] = None,
test_specification: Optional[pulumi.Input[pulumi.InputType['TestSpecificationArgs']]] = None,
__props__=None):
"""
Creates and runs a matrix of tests according to the given specifications. Unsupported environments will be returned in the state UNSUPPORTED. A test matrix is limited to use at most 2000 devices in parallel. May return any of the following canonical error codes: - PERMISSION_DENIED - if the user is not authorized to write to project - INVALID_ARGUMENT - if the request is malformed or if the matrix tries to use too many simultaneous devices.
Auto-naming is currently not supported for this resource.
Note - this resource's API doesn't support deletion. When deleted, the resource will persist
on Google Cloud even though it will be deleted from Pulumi state.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['ClientInfoArgs']] client_info: Information about the client which invoked the test.
:param pulumi.Input[pulumi.InputType['EnvironmentMatrixArgs']] environment_matrix: The devices the tests are being executed on.
:param pulumi.Input[bool] fail_fast: If true, only a single attempt at most will be made to run each execution/shard in the matrix. Flaky test attempts are not affected. Normally, 2 or more attempts are made if a potential infrastructure issue is detected. This feature is for latency sensitive workloads. The incidence of execution failures may be significantly greater for fail-fast matrices and support is more limited because of that expectation.
:param pulumi.Input[int] flaky_test_attempts: The number of times a TestExecution should be re-attempted if one or more of its test cases fail for any reason. The maximum number of reruns allowed is 10. Default is 0, which implies no reruns.
:param pulumi.Input[str] project: The cloud project that owns the test matrix.
:param pulumi.Input[pulumi.InputType['ResultStorageArgs']] result_storage: Where the results for the matrix are written.
:param pulumi.Input[pulumi.InputType['TestSpecificationArgs']] test_specification: How to run the test.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: TestMatrixArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Creates and runs a matrix of tests according to the given specifications. Unsupported environments will be returned in the state UNSUPPORTED. A test matrix is limited to use at most 2000 devices in parallel. May return any of the following canonical error codes: - PERMISSION_DENIED - if the user is not authorized to write to project - INVALID_ARGUMENT - if the request is malformed or if the matrix tries to use too many simultaneous devices.
Auto-naming is currently not supported for this resource.
Note - this resource's API doesn't support deletion. When deleted, the resource will persist
on Google Cloud even though it will be deleted from Pulumi state.
:param str resource_name: The name of the resource.
:param TestMatrixArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(TestMatrixArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_info: Optional[pulumi.Input[pulumi.InputType['ClientInfoArgs']]] = None,
environment_matrix: Optional[pulumi.Input[pulumi.InputType['EnvironmentMatrixArgs']]] = None,
fail_fast: Optional[pulumi.Input[bool]] = None,
flaky_test_attempts: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
request_id: Optional[pulumi.Input[str]] = None,
result_storage: Optional[pulumi.Input[pulumi.InputType['ResultStorageArgs']]] = None,
test_specification: Optional[pulumi.Input[pulumi.InputType['TestSpecificationArgs']]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = TestMatrixArgs.__new__(TestMatrixArgs)
__props__.__dict__["client_info"] = client_info
if environment_matrix is None and not opts.urn:
raise TypeError("Missing required property 'environment_matrix'")
__props__.__dict__["environment_matrix"] = environment_matrix
__props__.__dict__["fail_fast"] = fail_fast
__props__.__dict__["flaky_test_attempts"] = flaky_test_attempts
__props__.__dict__["project"] = project
__props__.__dict__["request_id"] = request_id
if result_storage is None and not opts.urn:
raise TypeError("Missing required property 'result_storage'")
__props__.__dict__["result_storage"] = result_storage
if test_specification is None and not opts.urn:
raise TypeError("Missing required property 'test_specification'")
__props__.__dict__["test_specification"] = test_specification
__props__.__dict__["invalid_matrix_details"] = None
__props__.__dict__["outcome_summary"] = None
__props__.__dict__["state"] = None
__props__.__dict__["test_executions"] = None
__props__.__dict__["test_matrix_id"] = None
__props__.__dict__["timestamp"] = None
super(TestMatrix, __self__).__init__(
'google-native:testing/v1:TestMatrix',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None) -> 'TestMatrix':
"""
Get an existing TestMatrix resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = TestMatrixArgs.__new__(TestMatrixArgs)
__props__.__dict__["client_info"] = None
__props__.__dict__["environment_matrix"] = None
__props__.__dict__["fail_fast"] = None
__props__.__dict__["flaky_test_attempts"] = None
__props__.__dict__["invalid_matrix_details"] = None
__props__.__dict__["outcome_summary"] = None
__props__.__dict__["project"] = None
__props__.__dict__["result_storage"] = None
__props__.__dict__["state"] = None
__props__.__dict__["test_executions"] = None
__props__.__dict__["test_matrix_id"] = None
__props__.__dict__["test_specification"] = None
__props__.__dict__["timestamp"] = None
return TestMatrix(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="clientInfo")
def client_info(self) -> pulumi.Output['outputs.ClientInfoResponse']:
"""
Information about the client which invoked the test.
"""
return pulumi.get(self, "client_info")
@property
@pulumi.getter(name="environmentMatrix")
def environment_matrix(self) -> pulumi.Output['outputs.EnvironmentMatrixResponse']:
"""
The devices the tests are being executed on.
"""
return pulumi.get(self, "environment_matrix")
@property
@pulumi.getter(name="failFast")
def fail_fast(self) -> pulumi.Output[bool]:
"""
If true, only a single attempt at most will be made to run each execution/shard in the matrix. Flaky test attempts are not affected. Normally, 2 or more attempts are made if a potential infrastructure issue is detected. This feature is for latency sensitive workloads. The incidence of execution failures may be significantly greater for fail-fast matrices and support is more limited because of that expectation.
"""
return pulumi.get(self, "fail_fast")
@property
@pulumi.getter(name="flakyTestAttempts")
def flaky_test_attempts(self) -> pulumi.Output[int]:
"""
The number of times a TestExecution should be re-attempted if one or more of its test cases fail for any reason. The maximum number of reruns allowed is 10. Default is 0, which implies no reruns.
"""
return pulumi.get(self, "flaky_test_attempts")
@property
@pulumi.getter(name="invalidMatrixDetails")
def invalid_matrix_details(self) -> pulumi.Output[str]:
"""
Describes why the matrix is considered invalid. Only useful for matrices in the INVALID state.
"""
return pulumi.get(self, "invalid_matrix_details")
@property
@pulumi.getter(name="outcomeSummary")
def outcome_summary(self) -> pulumi.Output[str]:
"""
Output Only. The overall outcome of the test. Only set when the test matrix state is FINISHED.
"""
return pulumi.get(self, "outcome_summary")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
The cloud project that owns the test matrix.
"""
return pulumi.get(self, "project")
@property
@pulumi.getter(name="resultStorage")
def result_storage(self) -> pulumi.Output['outputs.ResultStorageResponse']:
"""
Where the results for the matrix are written.
"""
return pulumi.get(self, "result_storage")
@property
@pulumi.getter
def state(self) -> pulumi.Output[str]:
"""
Indicates the current progress of the test matrix.
"""
return pulumi.get(self, "state")
@property
@pulumi.getter(name="testExecutions")
def test_executions(self) -> pulumi.Output[Sequence['outputs.TestExecutionResponse']]:
"""
The list of test executions that the service creates for this matrix.
"""
return pulumi.get(self, "test_executions")
@property
@pulumi.getter(name="testMatrixId")
def test_matrix_id(self) -> pulumi.Output[str]:
"""
Unique id set by the service.
"""
return pulumi.get(self, "test_matrix_id")
@property
@pulumi.getter(name="testSpecification")
def test_specification(self) -> pulumi.Output['outputs.TestSpecificationResponse']:
"""
How to run the test.
"""
return pulumi.get(self, "test_specification")
@property
@pulumi.getter
def timestamp(self) -> pulumi.Output[str]:
"""
The time this test matrix was initially created.
"""
return pulumi.get(self, "timestamp")
| 50.436031 | 458 | 0.67671 | 2,305 | 19,317 | 5.431236 | 0.12321 | 0.050084 | 0.047048 | 0.031872 | 0.714754 | 0.6316 | 0.609394 | 0.596134 | 0.567058 | 0.544373 | 0 | 0.001756 | 0.233421 | 19,317 | 382 | 459 | 50.568063 | 0.843666 | 0.344567 | 0 | 0.412766 | 1 | 0 | 0.165417 | 0.038828 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148936 | false | 0.004255 | 0.034043 | 0.004255 | 0.285106 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b965c021bcb2dac479172708e85ad9ed89f09ef2 | 5,427 | py | Python | View/View.py | MoriokaReimen/ConfigHeaderGenerator | 73ba5d3bd5269d7e6881ec79b6fc0121ff2fb03e | [
"MIT"
] | null | null | null | View/View.py | MoriokaReimen/ConfigHeaderGenerator | 73ba5d3bd5269d7e6881ec79b6fc0121ff2fb03e | [
"MIT"
] | null | null | null | View/View.py | MoriokaReimen/ConfigHeaderGenerator | 73ba5d3bd5269d7e6881ec79b6fc0121ff2fb03e | [
"MIT"
] | null | null | null | import tkinter as tk
import tkinter.messagebox
from Control import Control
class View:
def __init__(self, control : Control.Control):
self.control = control
# Init Window
self.root = tk.Tk()
self.root.title(u"Header File Generator")
self.root.geometry("700x800")
self.config_frame = tk.Frame(self.root)
# Config Table
lb_symbol = tk.Label(self.config_frame, width = 20)
lb_symbol["text"] = "Symbol"
lb_symbol.grid(row = 0, column = 0)
lb_description = tk.Label(self.config_frame, width = 40)
lb_description["text"] = "Detail"
lb_description.grid(row = 0, column = 1)
lb_enable = tk.Label(self.config_frame, width = 10)
lb_enable["text"] = "Enable"
lb_enable.grid(row = 0, column = 2)
for i, config in enumerate(self.control.getConfigs()):
symbol_entry = tk.Entry(self.config_frame, width=20)
symbol_entry.insert(tk.END, config.symbol)
symbol_entry.config(state = tk.DISABLED)
symbol_entry.config(disabledforeground = "black", disabledbackground = "white")
symbol_entry.grid(row= i + 1, column = 0)
detail_entry = tk.Entry(self.config_frame, width=40)
detail_entry.insert(tk.END, config.detail)
detail_entry.config(state = tk.DISABLED)
detail_entry.config(disabledforeground = "black", disabledbackground = "white")
detail_entry.grid(row= i + 1, column = 1)
            bt_enable = tk.Button(self.config_frame, text="ON", width=5)
bt_enable["text"] = "ON" if config.enable else "OFF"
color = "green" if config.enable else "red"
bt_enable.config(bg=color, activebackground = color)
bt_enable["command"] = lambda id = i, button = bt_enable : self.toggle_config_enable(id, button)
bt_enable.grid(row = i + 1, column = 2)
self.config_frame.pack(side=tk.TOP, anchor=tk.NW)
self.value_config_frame = tk.Frame(self.root)
# Config Table
lb_symbol = tk.Label(self.value_config_frame, width = 20)
lb_symbol["text"] = "Symbol"
lb_symbol.grid(row = 0, column = 0)
lb_description = tk.Label(self.value_config_frame, width = 40)
lb_description["text"] = "Detail"
lb_description.grid(row = 0, column = 1)
lb_value = tk.Label(self.value_config_frame, width = 10)
lb_value["text"] = "Value"
lb_value.grid(row = 0, column = 2)
lb_enable = tk.Label(self.value_config_frame, width = 10)
lb_enable["text"] = "Enable"
lb_enable.grid(row = 0, column = 3)
for i, val_config in enumerate(self.control.getValConfigs()):
symbol_entry = tk.Entry(self.value_config_frame, width=20)
symbol_entry.insert(tk.END, val_config.symbol)
symbol_entry.config(state = tk.DISABLED)
symbol_entry.config(disabledforeground = "black", disabledbackground = "white")
symbol_entry.grid(row= i + 1, column = 0)
detail_entry = tk.Entry(self.value_config_frame, width=40)
detail_entry.insert(tk.END, val_config.detail)
detail_entry.config(state = tk.DISABLED)
detail_entry.config(disabledforeground = "black", disabledbackground = "white")
detail_entry.grid(row= i + 1, column = 1)
value_entry = tk.Entry(self.value_config_frame, width=10)
value_entry.insert(tk.END, val_config.value)
value_entry.config(state = tk.DISABLED)
value_entry.config(disabledforeground = "black", disabledbackground = "white")
value_entry.grid(row= i + 1, column = 2)
            bt_enable = tk.Button(self.value_config_frame, text="ON", width=5)
bt_enable["text"] = "ON" if val_config.enable else "OFF"
color = "green" if val_config.enable else "red"
bt_enable.config(bg=color, activebackground = color)
bt_enable["command"] = lambda id = i, button = bt_enable : self.toggle_val_config_enable(id, button)
bt_enable.grid(row = i + 1, column = 3)
self.value_config_frame.pack(side=tk.TOP, anchor=tk.W)
# Generator Button
self.bt_generate = tk.Button(self.root)
self.bt_generate["text"] = "Generate Header"
self.bt_generate["command"] = self.generateHeader
self.bt_generate.pack(side=tk.BOTTOM, anchor=tk.SE)
def start(self):
self.root.mainloop()
def generateHeader(self):
self.control.generateHeader()
tk.messagebox.showinfo("Header Generator Info", "Generated:{0}".format(self.control.header_config.path))
def update(self):
pass
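    # The toggle handlers below flip a config's enable flag and refresh the button text/color in place.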
def toggle_config_enable(self, id, button : tk.Button):
config = self.control.getConfigs()[id]
config.enable = not config.enable
button["text"] = "ON" if config.enable else "OFF"
color = "green" if config.enable else "red"
button.config(bg=color, activebackground = color)
def toggle_val_config_enable(self, id, button : tk.Button):
val_config = self.control.getValConfigs()[id]
val_config.enable = not val_config.enable
button["text"] = "ON" if val_config.enable else "OFF"
color = "green" if val_config.enable else "red"
button.config(bg=color, activebackground = color)
| 43.071429 | 112 | 0.629445 | 696 | 5,427 | 4.741379 | 0.136494 | 0.06 | 0.058182 | 0.060606 | 0.763636 | 0.713939 | 0.665758 | 0.638485 | 0.6 | 0.535758 | 0 | 0.015036 | 0.252442 | 5,427 | 125 | 113 | 43.416 | 0.798373 | 0.00995 | 0 | 0.3125 | 0 | 0 | 0.054583 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0.010417 | 0.03125 | 0 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b967ba0197b144171458b230c2dfe31844ba0b72 | 5,231 | py | Python | dags/download_decrypt_transfer_files.py | hms-dbmi/bch-pic-sure-airflow-dags | 0c1e6f07da4e270581942e551ac30284474921d4 | [
"Apache-2.0"
] | null | null | null | dags/download_decrypt_transfer_files.py | hms-dbmi/bch-pic-sure-airflow-dags | 0c1e6f07da4e270581942e551ac30284474921d4 | [
"Apache-2.0"
] | null | null | null | dags/download_decrypt_transfer_files.py | hms-dbmi/bch-pic-sure-airflow-dags | 0c1e6f07da4e270581942e551ac30284474921d4 | [
"Apache-2.0"
] | null | null | null | """
@author: anilkdegala
"""
import os
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator, BranchPythonOperator
from datetime import date, timedelta, datetime
from collections import OrderedDict
from scripts.dag_pebbles import DagPebbles
from airflow.configuration import conf
from scripts.configurations import *
from airflow.operators.dummy_operator import DummyOperator
default_args = {
"owner": "anilkdegala",
"depends_on_past": True,
"max_active_runs": 1,
"start_date": datetime(2015, 6, 1),
"is_active": True,
"is_paused_upon_creation": False,
}
def begin_pipeline(**kwargs):
print("begin_pipeline:")
files = kwargs['dag_run'].conf.get('files')
download_decrypt_arguments = ''
transfer_arguments_list = []
for f in files:
print("download_decrypt_transfer_files: file: ", f['name'], ', location: ', f['path'])
output = f['name']+','+f['path']+','+f['final_name']
download_decrypt_arguments = download_decrypt_arguments + " " + output
transfer_arguments_list.append(DATA_LOCATION + "/"+f['final_name'])
transfer_arguments = ",".join(transfer_arguments_list)
print("final download_decrypt_arguments: ",download_decrypt_arguments)
print("final transfer_arguments: ",transfer_arguments)
kwargs["ti"].xcom_push(key="download_decrypt_arguments", value=download_decrypt_arguments)
kwargs["ti"].xcom_push(key="transfer_arguments", value=transfer_arguments)
def pipeline_enable_check(**kwargs):
dp = DagPebbles()
if dp.pipeline_enable_check('DATA_LOAD'):
return "pipeline_check_passed"
else:
return "pipeline_check_skipped"
def pipeline_check_passed(**kwargs):
print("pipeline_check_passed:")
def end_pipeline(**kwargs):
print("end_pipeline:")
def pipeline_check_skipped(**kwargs):
print("pipeline_check_skipped:")
def cleanup(**kwargs):
dp = DagPebbles()
print("cleanup")
def notify(**kwargs):
dp = DagPebbles()
print("notify")
def end(**kwargs):
dp = DagPebbles()
print("end")
with DAG( "DOWNLOAD_DECRYPT_TRANSFER",
description="Download, Decrypt, Transfer files (Source: S3, Staging: EC2: Target: RDS Oracle)",
default_args=default_args,
schedule_interval=None,
catchup=False,
orientation="TB",
tags=['Utils'],
dagrun_timeout=timedelta(hours=240)
) as dag:
t_pipeline_begin = PythonOperator(
task_id="begin_pipeline",
python_callable=begin_pipeline,
provide_context=True,
dag=dag,
)
t_check_pipeline = BranchPythonOperator(
task_id="check_pipeline",
python_callable=pipeline_enable_check,
provide_context=True,
dag=dag,
)
t_pipeline_check_passed = PythonOperator(
task_id="pipeline_check_passed",
python_callable=pipeline_check_passed,
provide_context=True,
dag=dag,
)
t_pipeline_check_skipped = PythonOperator(
task_id="pipeline_check_skipped",
python_callable=pipeline_check_skipped,
provide_context=True,
dag=dag,
)
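    # The shell tasks below receive their file arguments from XCom via Jinja templating.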
download_files_cmd = "/opt/bitnami/airflow/airflow-data/scripts/download_files.sh "+"{{ ti.xcom_pull(key='download_decrypt_arguments')}}"
t_download_files = BashOperator(
task_id='download_files',
bash_command=download_files_cmd,
dag=dag)
decrypt_files_cmd = "/opt/bitnami/airflow/airflow-data/scripts/decrypt_files.sh "+"{{ ti.xcom_pull(key='download_decrypt_arguments')}} "
t_decrypt_files = BashOperator(
task_id='decrypt_files',
bash_command=decrypt_files_cmd,
dag=dag)
transfer_files_cmd = "/opt/bitnami/airflow/airflow-data/scripts/transfer_files_rds.pl "+"{{ ti.xcom_pull(key='transfer_arguments')}} "
t_transfer_files = BashOperator(
task_id='transfer_files',
bash_command=transfer_files_cmd,
dag=dag)
t_end_pipeline = PythonOperator(
task_id="end_pipeline",
python_callable=end_pipeline,
provide_context=True,
trigger_rule="none_failed",
dag=dag,
)
t_notify = PythonOperator(
task_id="send_notifications",
python_callable=notify,
provide_context=True,
trigger_rule="none_failed",
dag=dag,
)
t_cleanup = PythonOperator(
task_id="cleanup",
python_callable=cleanup,
provide_context=True,
trigger_rule="none_failed",
dag=dag,
)
t_end = PythonOperator(
task_id="end",
python_callable=end,
provide_context=True,
trigger_rule="none_failed",
dag=dag,
)
t_pipeline_begin >> t_check_pipeline
t_check_pipeline >> t_pipeline_check_skipped >> t_end_pipeline
t_check_pipeline >> t_pipeline_check_passed >> t_download_files >> t_decrypt_files >> t_transfer_files >> t_end_pipeline
t_end_pipeline >> t_cleanup >> t_notify >> t_end
| 30.770588 | 171 | 0.664118 | 585 | 5,231 | 5.586325 | 0.218803 | 0.055692 | 0.066095 | 0.025704 | 0.250306 | 0.222766 | 0.168605 | 0.146573 | 0.083843 | 0.083843 | 0 | 0.002979 | 0.229975 | 5,231 | 169 | 172 | 30.952663 | 0.808342 | 0.003823 | 0 | 0.201493 | 0 | 0 | 0.208149 | 0.111474 | 0.022388 | 0 | 0 | 0 | 0 | 1 | 0.059701 | false | 0.052239 | 0.074627 | 0 | 0.149254 | 0.074627 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b96b280416f0d557826ffa670a7914f2d45e5fc5 | 526 | py | Python | src/sot_talos_balance/test/test_feet_admittance.py | imaroger/sot-talos-balance | 5e56700b4e105273ecf6feb3474789beac469a77 | [
"BSD-2-Clause"
] | null | null | null | src/sot_talos_balance/test/test_feet_admittance.py | imaroger/sot-talos-balance | 5e56700b4e105273ecf6feb3474789beac469a77 | [
"BSD-2-Clause"
] | null | null | null | src/sot_talos_balance/test/test_feet_admittance.py | imaroger/sot-talos-balance | 5e56700b4e105273ecf6feb3474789beac469a77 | [
"BSD-2-Clause"
] | null | null | null | '''Test feet admittance control'''
from sot_talos_balance.utils.run_test_utils import run_ft_calibration, run_test, runCommandClient
try:
# Python 2
input = raw_input # noqa
except NameError:
pass
run_test('appli_feet_admittance.py')
run_ft_calibration('robot.ftc')
input("Wait before running the test")
print('Set saturation value')
runCommandClient('robot.admBF_dqSaturation.sin.value = [0.0, 0.0, 0.01, 0.0, 0.0, 0.0]')
input("Wait before dumping the data")
runCommandClient('dump_tracer(robot.tracer)')
| 25.047619 | 97 | 0.752852 | 79 | 526 | 4.822785 | 0.556962 | 0.047244 | 0.055118 | 0.052493 | 0.028871 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030369 | 0.123574 | 526 | 20 | 98 | 26.3 | 0.796095 | 0.081749 | 0 | 0 | 0 | 0.083333 | 0.42437 | 0.17437 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0.083333 | 0 | 0.083333 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b96d766a7c5eab27eb3785b1277b6beccda7c9ed | 1,446 | py | Python | auth/tests/test_views.py | asb29/Redundant | ee816fd41f9217610bd11f757cf9175288723c70 | [
"MIT"
] | null | null | null | auth/tests/test_views.py | asb29/Redundant | ee816fd41f9217610bd11f757cf9175288723c70 | [
"MIT"
] | null | null | null | auth/tests/test_views.py | asb29/Redundant | ee816fd41f9217610bd11f757cf9175288723c70 | [
"MIT"
] | null | null | null | from django.test import TestCase
from django.test import Client
class RegisterTestCase(TestCase):
def test_register(self):
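        # Exercise /accounts/register/ with one valid payload and several invalid ones.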
c = Client()
# on success redirects to /
response = c.post('/accounts/register/', {
'username': 'asdas',
'password1': 'asdasdasd12',
'password2': 'asdasdasd12'
})
self.assertRedirects(response, '/')
# passwords don't match
response = c.post('/accounts/register/', {
'username': 'asdasdasd1',
'password1': 'asdasdasd1',
'password2': 'asdasdasd2'
})
self.assertEquals(response.status_code, 200)
# username is empty
response = c.post('/accounts/register/', {
'username': '',
'password1': 'asdasdasd12',
'password2': 'asdasdasd12'
})
self.assertEquals(response.status_code, 200)
# no password
response = c.post('/accounts/register/', {
'username': 'asdasdasd',
'password1': '',
'password2': ''
})
self.assertEquals(response.status_code, 200)
# username and password are similar
response = c.post('/accounts/register/', {
'username': 'asdasdasd0',
'password1': 'asdasdasd1',
'password2': 'asdasdasd1'
})
self.assertEquals(response.status_code, 200)
| 30.125 | 52 | 0.53527 | 116 | 1,446 | 6.62931 | 0.37931 | 0.058518 | 0.084525 | 0.136541 | 0.563069 | 0.453836 | 0.117035 | 0 | 0 | 0 | 0 | 0.037344 | 0.333333 | 1,446 | 47 | 53 | 30.765957 | 0.760373 | 0.076763 | 0 | 0.571429 | 0 | 0 | 0.258841 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.028571 | false | 0.285714 | 0.057143 | 0 | 0.114286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b974d5d1bd35654f50415a8f7c66f3fb9a0316ab | 704 | py | Python | tests/test_formatter.py | hbraux/kafkacli | 5f7ed23150932b66b484fb43dd6210b6c0968776 | [
"MIT"
] | null | null | null | tests/test_formatter.py | hbraux/kafkacli | 5f7ed23150932b66b484fb43dd6210b6c0968776 | [
"MIT"
] | null | null | null | tests/test_formatter.py | hbraux/kafkacli | 5f7ed23150932b66b484fb43dd6210b6c0968776 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import pytest
import json
from kafkacli.formatter import Formatter
sampleJson = json.loads('{"a":"s", "b":1}')
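# Each test renders sampleJson with a differently configured Formatter and asserts on the exact stdout captured via capsys.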
def test_print_default(capsys):
Formatter().print(sampleJson)
captured = capsys.readouterr()
assert captured.out == '{"a": "s", "b": 1}\n'
def test_print_idents(capsys):
Formatter(indents=True).print(sampleJson)
captured = capsys.readouterr()
assert captured.out == '{\n "a": "s",\n "b": 1\n}\n'
def test_print_colors(capsys):
Formatter(colors=True).print(sampleJson)
captured = capsys.readouterr()
assert captured.out == \
'{"a": \x1b[34m"s"\x1b[39m, "b": \x1b[31m1\x1b[39m}\n'
| 24.275862 | 62 | 0.640625 | 96 | 704 | 4.635417 | 0.395833 | 0.013483 | 0.080899 | 0.195506 | 0.4 | 0.4 | 0.4 | 0.4 | 0.4 | 0 | 0 | 0.02911 | 0.170455 | 704 | 28 | 63 | 25.142857 | 0.732877 | 0.059659 | 0 | 0.166667 | 0 | 0.055556 | 0.183333 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.166667 | false | 0 | 0.222222 | 0 | 0.388889 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b975e6fb7fb3fa8849afb4e4ce41618c2ce94c1b | 451 | py | Python | src/test/tests/unit/protocol.py | ylee88/visit | 8e0920996d84fef70a7014b0d770360918d849d5 | [
"BSD-3-Clause"
] | 1 | 2022-01-27T23:52:04.000Z | 2022-01-27T23:52:04.000Z | src/test/tests/unit/protocol.py | ylee88/visit | 8e0920996d84fef70a7014b0d770360918d849d5 | [
"BSD-3-Clause"
] | null | null | null | src/test/tests/unit/protocol.py | ylee88/visit | 8e0920996d84fef70a7014b0d770360918d849d5 | [
"BSD-3-Clause"
] | null | null | null | # ----------------------------------------------------------------------------
# CLASSES: nightly
#
#  Test Case:  protocol.py
#
# Tests: vistprotocol unit test
#
# Mark C. Miller, Tue Jan 11 10:19:23 PST 2011
# ----------------------------------------------------------------------------
tapp = visit_bin_path("visitprotocol")
res = sexe(tapp, ret_output=True)
if res["return_code"] == 0:
excode = 111
else:
excode = 113
Exit(excode)
| 26.529412 | 78 | 0.432373 | 44 | 451 | 4.340909 | 0.886364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049223 | 0.144124 | 451 | 16 | 79 | 28.1875 | 0.445596 | 0.618625 | 0 | 0 | 0 | 0 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b97d4675d330154e0b12b91fbd601affd888ea29 | 1,901 | py | Python | examples/airflow/dags/etl_orders_7_days.py | phixMe/marquez | 06d71635369893b371a8a9c9e7023f11d7cbb1f8 | [
"Apache-2.0"
] | null | null | null | examples/airflow/dags/etl_orders_7_days.py | phixMe/marquez | 06d71635369893b371a8a9c9e7023f11d7cbb1f8 | [
"Apache-2.0"
] | null | null | null | examples/airflow/dags/etl_orders_7_days.py | phixMe/marquez | 06d71635369893b371a8a9c9e7023f11d7cbb1f8 | [
"Apache-2.0"
] | null | null | null | from datetime import datetime
from marquez_airflow import DAG
from airflow.operators.postgres_operator import PostgresOperator
from airflow.utils.dates import days_ago
default_args = {
'owner': 'datascience',
'depends_on_past': False,
'start_date': days_ago(1),
'email_on_failure': False,
'email_on_retry': False,
'email': ['datascience@example.com']
}
dag = DAG(
'etl_orders_7_days',
schedule_interval='@hourly',
catchup=False,
default_args=default_args,
description='Loads newly placed orders weekly.'
)
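# Pipeline: make sure the target table exists, truncate it, then reload orders placed in the last 7 days.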
t1 = PostgresOperator(
task_id='if_not_exists',
postgres_conn_id='food_delivery_db',
sql='''
CREATE TABLE IF NOT EXISTS orders_7_days (
order_id INTEGER REFERENCES orders(id),
placed_on TIMESTAMP NOT NULL,
discount_id INTEGER REFERENCES discounts(id),
menu_id INTEGER REFERENCES menus(id),
restaurant_id INTEGER REFERENCES restaurants(id),
menu_item_id INTEGER REFERENCES menu_items(id),
category_id INTEGER REFERENCES categories(id)
);''',
dag=dag
)
t2 = PostgresOperator(
    task_id='truncate',
postgres_conn_id='food_delivery_db',
sql='TRUNCATE TABLE orders_7_days;',
dag=dag
)
t3 = PostgresOperator(
task_id='insert',
postgres_conn_id='food_delivery_db',
sql='''
INSERT INTO orders_7_days (order_id, placed_on, discount_id, menu_id, restaurant_id, menu_item_id, category_id)
SELECT o.id AS order_id, o.placed_on, o.discount_id, m.id AS menu_id, m.restaurant_id, mi.id AS menu_item_id, c.id AS category_id
FROM orders AS o
INNER JOIN menu_items AS mi
ON mi.id = o.menu_item_id
INNER JOIN categories AS c
ON c.id = mi.category_id
INNER JOIN menus AS m
ON m.id = c.menu_id
WHERE o.placed_on >= NOW() - interval '7 days'
''',
dag=dag
)
t1 >> t2 >> t3
| 29.246154 | 135 | 0.681746 | 270 | 1,901 | 4.533333 | 0.318519 | 0.044118 | 0.093137 | 0.044118 | 0.105392 | 0.07598 | 0.07598 | 0 | 0 | 0 | 0 | 0.008136 | 0.224093 | 1,901 | 64 | 136 | 29.703125 | 0.821695 | 0 | 0 | 0.137931 | 0 | 0.017241 | 0.635455 | 0.012099 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.068966 | 0 | 0.068966 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b980be1e0d2b8db749e25a4f49c35cdddbdca9d9 | 1,650 | py | Python | tt/urls.py | samiksha-patil/Knowledge-Sharing-Platform | 22e61a659d5ad63fe656fa639dc897cbdebad4fe | [
"bzip2-1.0.6"
] | 1 | 2021-05-09T08:18:49.000Z | 2021-05-09T08:18:49.000Z | tt/urls.py | samiksha-patil/Knowledge-Sharing-Platform | 22e61a659d5ad63fe656fa639dc897cbdebad4fe | [
"bzip2-1.0.6"
] | 9 | 2021-03-19T01:11:35.000Z | 2022-03-12T00:20:13.000Z | tt/urls.py | samiksha-patil/Knowledge-Sharing-Platform | 22e61a659d5ad63fe656fa639dc897cbdebad4fe | [
"bzip2-1.0.6"
] | null | null | null | """
tt URL Configuration
The `urlpatterns` list routes URLs to views. For more information please see:
https://docs.djangoproject.com/en/2.1/topics/http/urls/
Examples:
Function views
1. Add an import: from my_app import views
2. Add a URL to urlpatterns: path('', views.home, name='home')
Class-based views
1. Add an import: from other_app.views import Home
2. Add a URL to urlpatterns: path('', Home.as_view(), name='home')
Including another URLconf
1. Import the include() function: from django.urls import include, path
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""
# Uncomment next two lines to enable admin:
from django.contrib import admin
from django.urls import path, include
from users import views as user_views
from django.contrib.auth import views as auth_views
from upload import views as upload_views
from django.conf import settings
from django.conf.urls.static import static
urlpatterns = [
# Uncomment the next line to enable the admin:
path('admin/', admin.site.urls),
path('', include('blog.urls')),
path('register/', user_views.register, name='register'),
    path('login/', auth_views.LoginView.as_view(template_name='users/login.html'), name='login'),
    path('logout/', auth_views.LogoutView.as_view(template_name='users/logout.html'), name='logout'),
    path('profile/', user_views.profile, name='profile'),
    path('book/', upload_views.book_list, name='book_list'),
    path('book/upload', upload_views.upload_book, name='upload_book'),
]
if settings.DEBUG:
urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
| 35.869565 | 100 | 0.726061 | 243 | 1,650 | 4.835391 | 0.316872 | 0.051064 | 0.012766 | 0.020426 | 0.138723 | 0.099574 | 0.06383 | 0 | 0 | 0 | 0 | 0.00569 | 0.147879 | 1,650 | 46 | 101 | 35.869565 | 0.830014 | 0.428485 | 0 | 0 | 0 | 0 | 0.149733 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.368421 | 0 | 0.368421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b9954284c404c9a5aed225965d5006c8735af349 | 1,717 | py | Python | musa/migrations/0001_initial.py | ccsreenidhin/Music-Web-Django | 9b8286914f9099b9ed56c712c7ca384846f189d1 | [
"MIT"
] | null | null | null | musa/migrations/0001_initial.py | ccsreenidhin/Music-Web-Django | 9b8286914f9099b9ed56c712c7ca384846f189d1 | [
"MIT"
] | null | null | null | musa/migrations/0001_initial.py | ccsreenidhin/Music-Web-Django | 9b8286914f9099b9ed56c712c7ca384846f189d1 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2018-03-29 06:43
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import musa.models
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='MusicCollection',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('title', models.CharField(blank=True, max_length=70, null=True)),
('document', models.FileField(upload_to=musa.models.get_upload_path)),
('uploaded_at', models.DateTimeField(auto_now_add=True, null=True)),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='UserProfile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fullname', models.CharField(blank=True, max_length=70)),
('favourite_music', models.CharField(blank=True, max_length=70)),
('about', models.TextField(blank=True, max_length=300)),
('picture', models.ImageField(default='/profile_images/avatar.jpeg', upload_to='profile_images')),
('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
]
| 40.880952 | 121 | 0.633663 | 188 | 1,717 | 5.606383 | 0.457447 | 0.030361 | 0.045541 | 0.068311 | 0.363378 | 0.363378 | 0.363378 | 0.263757 | 0.263757 | 0.263757 | 0 | 0.019055 | 0.235877 | 1,717 | 41 | 122 | 41.878049 | 0.784299 | 0.038439 | 0 | 0.30303 | 1 | 0 | 0.086165 | 0.016384 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.151515 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b996ad8d5f407e5b1769d9b50ca7be5705a211e8 | 1,937 | py | Python | pyzmq/examples/pubsub/subscriber.py | Surfndez/source-publish | c3838b303c1a0806f21cd4e8d8c207015b3ce9c8 | [
"Intel"
] | null | null | null | pyzmq/examples/pubsub/subscriber.py | Surfndez/source-publish | c3838b303c1a0806f21cd4e8d8c207015b3ce9c8 | [
"Intel"
] | 1 | 2021-01-21T17:43:33.000Z | 2021-01-21T17:43:33.000Z | pyzmq/examples/pubsub/subscriber.py | Surfndez/source-publish | c3838b303c1a0806f21cd4e8d8c207015b3ce9c8 | [
"Intel"
] | null | null | null | """A test that subscribes to NumPy arrays.
Uses REQ/REP (on PUB/SUB socket + 1) to synchronize
"""
#-----------------------------------------------------------------------------
# Copyright (c) 2010 Brian Granger
#
# Distributed under the terms of the New BSD License. The full license is in
# the file COPYING.BSD, distributed as part of this software.
#-----------------------------------------------------------------------------
import sys
import time
import zmq
import numpy
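# Handshake with the publisher over REQ/REP on the next port up before subscribing.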
def sync(connect_to):
# use connect socket + 1
sync_with = ':'.join(connect_to.split(':')[:-1] +
[str(int(connect_to.split(':')[-1]) + 1)]
)
ctx = zmq.Context.instance()
s = ctx.socket(zmq.REQ)
s.connect(sync_with)
    s.send(b'READY')
s.recv()
def main():
    if len(sys.argv) != 3:
        print('usage: subscriber <connect_to> <array-count>')
        sys.exit(1)
    try:
        connect_to = sys.argv[1]
        array_count = int(sys.argv[2])
    except (ValueError, OverflowError):
        print('array-count must be an integer')
        sys.exit(1)
ctx = zmq.Context()
s = ctx.socket(zmq.SUB)
s.connect(connect_to)
    s.setsockopt(zmq.SUBSCRIBE, b'')
sync(connect_to)
    start = time.time()
    print("Receiving arrays...")
    for i in range(array_count):
        a = s.recv_pyobj()
    print("   Done.")
    end = time.time()
    elapsed = (end - start) * 1000000
    if elapsed == 0:
        elapsed = 1
    throughput = (1000000.0 * float(array_count)) / float(elapsed)
    message_size = a.nbytes
    megabits = float(throughput * message_size * 8) / 1000000
    print("message size: %.0f [B]" % (message_size, ))
    print("array count: %.0f" % (array_count, ))
    print("mean throughput: %.0f [msg/s]" % (throughput, ))
    print("mean throughput: %.3f [Mb/s]" % (megabits, ))
time.sleep(1.0)
if __name__ == "__main__":
main()
| 25.826667 | 78 | 0.545173 | 239 | 1,937 | 4.313808 | 0.447699 | 0.061106 | 0.025218 | 0.029098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03078 | 0.245225 | 1,937 | 74 | 79 | 26.175676 | 0.674419 | 0.180176 | 0 | 0.042553 | 0 | 0 | 0.142375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.085106 | null | null | 0.170213 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b997c70668ace413cc27502883f737e007e56239 | 1,006 | py | Python | Doc/includes/sqlite3/load_extension.py | livioso/cpython | 077061a7b24917aaf31057885c69919c5a553c88 | [
"PSF-2.0"
] | 36 | 2019-06-07T20:44:06.000Z | 2022-03-23T06:19:43.000Z | Doc/includes/sqlite3/load_extension.py | livioso/cpython | 077061a7b24917aaf31057885c69919c5a553c88 | [
"PSF-2.0"
] | 49 | 2016-02-29T17:59:52.000Z | 2019-05-05T04:59:26.000Z | Doc/includes/sqlite3/load_extension.py | livioso/cpython | 077061a7b24917aaf31057885c69919c5a553c88 | [
"PSF-2.0"
] | 28 | 2019-06-27T04:11:27.000Z | 2022-03-11T06:27:44.000Z | import sqlite3
con = sqlite3.connect(":memory:")
# enable extension loading
con.enable_load_extension(True)
# Load the fulltext search extension
con.execute("select load_extension('./fts3.so')")
# alternatively you can load the extension using an API call:
# con.load_extension("./fts3.so")
# disable extension loading again
con.enable_load_extension(False)
# example from SQLite wiki
con.execute("create virtual table recipe using fts3(name, ingredients)")
con.executescript("""
insert into recipe (name, ingredients) values ('broccoli stew', 'broccoli peppers cheese tomatoes');
insert into recipe (name, ingredients) values ('pumpkin stew', 'pumpkin onions garlic celery');
insert into recipe (name, ingredients) values ('broccoli pie', 'broccoli cheese onions flour');
insert into recipe (name, ingredients) values ('pumpkin pie', 'pumpkin sugar flour butter');
""")
for row in con.execute("select rowid, name, ingredients from recipe where name match 'pie'"):
print(row)
| 37.259259 | 104 | 0.744533 | 131 | 1,006 | 5.671756 | 0.465649 | 0.121131 | 0.086137 | 0.107672 | 0.239569 | 0.239569 | 0.239569 | 0 | 0 | 0 | 0 | 0.005794 | 0.142147 | 1,006 | 26 | 105 | 38.692308 | 0.855156 | 0.206759 | 0 | 0 | 0 | 0 | 0.723135 | 0.034134 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b999024320e50c940c8f273e6f0536039450c829 | 1,949 | py | Python | config.py | jhattat/photoBooth | f6fe3ab418bb917792e10349597401ed34078766 | [
"MIT"
] | null | null | null | config.py | jhattat/photoBooth | f6fe3ab418bb917792e10349597401ed34078766 | [
"MIT"
] | null | null | null | config.py | jhattat/photoBooth | f6fe3ab418bb917792e10349597401ed34078766 | [
"MIT"
] | null | null | null | # Tumblr Setup
# Replace the values with your information
# OAuth keys can be generated from https://api.tumblr.com/console/calls/user/info
consumer_key='ShbOqI5zErQXOL7Qnd5XduXpY9XQUlBgJDpCLeq1OYqnY2KzSt' #replace with your key
consumer_secret='ulZradkbJGksjpl2MMlshAfJgEW6TNeSdZucykqeTp8jvwgnhu' #replace with your secret code
oath_token='uUcBuvJx8yhk4HJIZ39sfcYo0W4VoqcvUetR2EwcI5Sn8SLgNt' #replace with your oath token
oath_secret='iNJlqQJI6dwhAGmdNbMtD9u7VazmX2Rk5uW0fuIozIEjk97lz4' #replace with your oath secret code
tumblr_blog = 'soniaetjeremie' # replace with your tumblr account name without .tumblr.com
tagsForTumblr = "photobooth" # change to tags you want, separated with commas
#Config settings to change behavior of photo booth
monitor_w = 800 # width of the display monitor
monitor_h = 480 # height of the display monitor
file_path = '/home/pi/photobooth/pics/' # path to save images
clear_on_startup = False # True will clear previously stored photos as the program launches. False will leave all previous photos.
debounce = 0.3 # how long to debounce the button. Add more time if the button triggers too many times.
post_online = True # True to upload images. False to store locally only.
capture_count_pics = True # if true, show a photo count between taking photos. If false, do not. False is faster.
make_gifs = True # True to make an animated gif. False to post 4 jpgs into one post.
hi_res_pics = False # True to save high res pics from camera.
# If also uploading, the program will also convert each image to a smaller image before making the gif.
# False to first capture low res pics. False is faster.
# Careful, each photo costs against your daily Tumblr upload max.
camera_iso = 400 # adjust for lighting issues. Normal is 100 or 200. Sort of dark is 400. Dark is 800 max.
# available options: 100, 200, 320, 400, 500, 640, 800 | 77.96 | 130 | 0.758338 | 278 | 1,949 | 5.255396 | 0.561151 | 0.032854 | 0.051335 | 0.02601 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045541 | 0.188815 | 1,949 | 25 | 131 | 77.96 | 0.878558 | 0.653155 | 0 | 0 | 1 | 0 | 0.381902 | 0.345092 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b999aec7c34874ef90e0f30812ac97217ce90cca | 3,145 | py | Python | emoji.py | notagoat/Deepmoji | 1ab922306c3647f9c7ea98caa2660a53b18fe4b6 | [
"MIT"
] | 1 | 2020-03-19T20:09:00.000Z | 2020-03-19T20:09:00.000Z | emoji.py | notagoat/Deepmoji | 1ab922306c3647f9c7ea98caa2660a53b18fe4b6 | [
"MIT"
] | null | null | null | emoji.py | notagoat/Deepmoji | 1ab922306c3647f9c7ea98caa2660a53b18fe4b6 | [
"MIT"
] | null | null | null | import requests
import urllib.request
import os.path
import shutil
import csv
def main():
with open("data.csv") as i: #Open the data.csv file
instances = i.readlines() #Write them into memory
instances = [x.strip() for x in instances] #Strip any weird issues from writing
instances.sort() #Sort them alphabetically
setup(instances) #Run setup to create all the necessary files and subfolders
count = len(instances) #Get the count just for fun
i = 0
try:
for name in instances:
try:
i += 1
print("-----!"+name+"!-----")
print(str(i) +" of " + str(count) + " remaining!")
fetch(name) #Run the fetching code
except Exception as e:
print(e) #Print the error. We catch errors here for pleroma instances, weirdly encoded urls, etc
pass #Don't stop the beat
except Exception as e:
print("Instance Error")
print(e)
pass
clone(instances) #Clone all of them into one big folder for ease of access
def fetch(name):
r = requests.get('https://%s/api/v1/custom_emojis'% name, allow_redirects=True) #Throw the instance name into the standard url for fetching data
path = "emoji/%s/" % name #Because of the clone function we know all of these folders will exist
try:
for emoji in r.json(): #Emoji = the json code from the request
try:
if os.path.isfile(path+emoji['shortcode']+".png"): #Check to see if it exists.
pass
else:
if "ms_" not in emoji['shortcode']: #Cut out Mutant Standard Emojis (Or at least most of them). #Mutant standard is huge and common
#print(emoji['shortcode'] + " found!")
emojiimage = requests.get(emoji['static_url'],allow_redirects=True) #Get the image from the json
open(path + emoji['shortcode']+".png",'wb').write(emojiimage.content) #Now save it as an image in the filesystem
except Exception as e:
print("Did not get: " + emoji['url']) #If somethings fucky throw a nice error then keep going.
print(e)
pass
except Exception as e:
print(e)
def setup(instances):
if (os.path.isdir("emoji/")): #Check to see if emoji/ exists
pass
else:
os.mkdir("emoji/") #make it if it doesnt
for name in instances:
if (os.path.isdir("emoji/%s/"%name)):
pass
        else:
            os.mkdir("emoji/%s/" % name)
if (os.path.isdir("emoji/all")):
pass
else:
os.mkdir("emoji/all")
def clone(instances):
for name in instances:
print("Copying emoji for: %s"% name)
path = "emoji/%s/" % name
files = os.listdir(path)
for name in files: #This gets alll files
try:
shutil.copyfile(path+name,"emoji/all/"+name) #Then copies them into the all folder
except Exception as e:
print(e)
pass
if __name__ == '__main__':
main()
| 37.440476 | 151 | 0.574245 | 418 | 3,145 | 4.289474 | 0.373206 | 0.020078 | 0.047407 | 0.050195 | 0.139431 | 0.070273 | 0 | 0 | 0 | 0 | 0 | 0.001403 | 0.320191 | 3,145 | 83 | 152 | 37.891566 | 0.837231 | 0.294118 | 0 | 0.430556 | 0 | 0 | 0.114299 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.111111 | 0.069444 | 0 | 0.125 | 0.138889 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b99add86778172fa08bc930ed29f8f26a88ec4d3 | 943 | py | Python | String/640.One Edit Distance/Solution_DP.py | Zhenye-Na/LxxxCode | afd79d790d0a7495d75e6650f80adaa99bd0ff07 | [
"MIT"
] | 12 | 2019-05-04T04:21:27.000Z | 2022-03-02T07:06:57.000Z | String/640.One Edit Distance/Solution_DP.py | Zhenye-Na/LxxxCode | afd79d790d0a7495d75e6650f80adaa99bd0ff07 | [
"MIT"
] | 1 | 2019-07-24T18:43:53.000Z | 2019-07-24T18:43:53.000Z | String/640.One Edit Distance/Solution_DP.py | Zhenye-Na/LxxxCode | afd79d790d0a7495d75e6650f80adaa99bd0ff07 | [
"MIT"
] | 10 | 2019-07-01T04:03:04.000Z | 2022-03-09T03:57:37.000Z | class Solution:
"""
@param s: a string
@param t: a string
    @return: true if s and t are exactly one edit apart, otherwise false
"""
def isOneEditDistance(self, s, t):
# write your code here
if s == t:
return False
if abs(len(s) - len(t)) > 1:
return False
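        # Rolling two-row edit-distance DP; s and t are one edit apart iff the distance is exactly 1.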
n, m = len(s), len(t)
f = [[0] * (m + 1) for _ in range(2)]
for j in range(m + 1):
f[0][j] = j
for i in range(1, n + 1):
f[i % 2][0] = i
for j in range(1, m + 1):
if s[i - 1] == t[j - 1]:
f[i % 2][j] = min(f[(i - 1) % 2][j - 1],
f[(i - 1) % 2][j] + 1, f[i % 2][j - 1] + 1)
else:
f[i % 2][j] = min(f[(i - 1) % 2][j - 1] + 1,
f[(i - 1) % 2][j] + 1, f[i % 2][j - 1] + 1)
return f[n % 2][m] == 1
| 29.46875 | 81 | 0.341463 | 149 | 943 | 2.154362 | 0.275168 | 0.056075 | 0.056075 | 0.049844 | 0.165109 | 0.165109 | 0.158879 | 0.158879 | 0.158879 | 0.158879 | 0 | 0.076446 | 0.486744 | 943 | 31 | 82 | 30.419355 | 0.586777 | 0.130435 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0 | 1 | 0.05 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b99b1d1ec6004cbeeb91e19410dbbb1e2216c45e | 1,478 | py | Python | nsq/__init__.py | jehiah/pynsq | 899b60a8ce77ed6c8ab899fbdfd7adbc1b450c96 | [
"MIT"
] | 1 | 2015-05-25T00:23:53.000Z | 2015-05-25T00:23:53.000Z | nsq/__init__.py | barkinet/pynsq | 899b60a8ce77ed6c8ab899fbdfd7adbc1b450c96 | [
"MIT"
] | null | null | null | nsq/__init__.py | barkinet/pynsq | 899b60a8ce77ed6c8ab899fbdfd7adbc1b450c96 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
import signal
import tornado.ioloop
import logging
from .protocol import (
Error,
unpack_response,
decode_message,
valid_topic_name,
valid_channel_name,
identify,
subscribe,
ready,
finish,
touch,
requeue,
nop,
pub,
mpub,
FRAME_TYPE_RESPONSE,
FRAME_TYPE_ERROR,
FRAME_TYPE_MESSAGE,
)
from .message import Message
from .backoff_timer import BackoffTimer
from .sync import SyncConn
from .async import AsyncConn
from .reader import Reader
from .legacy_reader import LegacyReader
from .writer import Writer
from .version import __version__ # NOQA
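# SIGTERM handler: log the signal and stop the shared IOLoop so run() returns.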
def _handle_term_signal(sig_num, frame):
logging.getLogger(__name__).info(
'TERM Signal handler called with signal %r', sig_num)
tornado.ioloop.IOLoop.instance().stop()
def run():
"""
Starts any instantiated :class:`nsq.Reader` or :class:`nsq.Writer`
"""
signal.signal(signal.SIGTERM, _handle_term_signal)
tornado.ioloop.IOLoop.instance().start()
__author__ = "Matt Reiferson <snakes@gmail.com>"
__all__ = ["Reader", "Writer", "run", "BackoffTimer", "Message", "Error", "LegacyReader",
"SyncConn", "AsyncConn", "unpack_response", "decode_message",
"identify", "subscribe", "ready", "finish", "touch", "requeue", "nop", "pub", "mpub",
"valid_topic_name", "valid_channel_name",
"FRAME_TYPE_RESPONSE", "FRAME_TYPE_ERROR", "FRAME_TYPE_MESSAGE"]
| 26.392857 | 96 | 0.696211 | 172 | 1,478 | 5.662791 | 0.412791 | 0.055441 | 0.041068 | 0.055441 | 0.26078 | 0.26078 | 0.199179 | 0.199179 | 0.199179 | 0 | 0 | 0 | 0.190122 | 1,478 | 55 | 97 | 26.872727 | 0.813701 | 0.002706 | 0 | 0 | 0 | 0 | 0.221583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.295455 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b99ee5dfe9849188796ff8d2b024b524adedb8d2 | 1,950 | py | Python | django_mfa/migrations/0001_initial.py | timgates42/django-mfa | 89eeb83f7da3ea24f205b40b13c7f9d33ea15b99 | [
"MIT"
] | null | null | null | django_mfa/migrations/0001_initial.py | timgates42/django-mfa | 89eeb83f7da3ea24f205b40b13c7f9d33ea15b99 | [
"MIT"
] | null | null | null | django_mfa/migrations/0001_initial.py | timgates42/django-mfa | 89eeb83f7da3ea24f205b40b13c7f9d33ea15b99 | [
"MIT"
] | null | null | null | # Generated by Django 2.1.5 on 2019-03-26 11:35
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='U2FKey',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True)),
('last_used_at', models.DateTimeField(null=True)),
('public_key', models.TextField(unique=True)),
('key_handle', models.TextField()),
('app_id', models.TextField()),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='u2f_keys', to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='UserOTP',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('otp_type', models.CharField(choices=[('HOTP', 'hotp'), ('TOTP', 'totp')], max_length=20)),
('secret_key', models.CharField(blank=True, max_length=100)),
('user', models.OneToOneField(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='UserRecoveryCodes',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('secret_code', models.CharField(max_length=10)),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='django_mfa.UserOTP')),
],
),
]
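# Illustrative usage (not part of the generated file): apply the migration with
#   python manage.py migrate django_mfa
# which creates the U2FKey, UserOTP and UserRecoveryCodes tables.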
| 41.489362 | 143 | 0.598974 | 205 | 1,950 | 5.521951 | 0.4 | 0.035336 | 0.04947 | 0.077739 | 0.421378 | 0.421378 | 0.394876 | 0.394876 | 0.310071 | 0.310071 | 0 | 0.016644 | 0.260513 | 1,950 | 46 | 144 | 42.391304 | 0.768377 | 0.023077 | 0 | 0.384615 | 1 | 0 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.179487 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9a767c55418efb8b98d12205d59e512ca419081 | 1,860 | py | Python | blobStore.py | odeke-em/resty | 838934033e7eeca521e8c6d8cb2e99778beaa4b9 | [
"Apache-2.0"
] | null | null | null | blobStore.py | odeke-em/resty | 838934033e7eeca521e8c6d8cb2e99778beaa4b9 | [
"Apache-2.0"
] | null | null | null | blobStore.py | odeke-em/resty | 838934033e7eeca521e8c6d8cb2e99778beaa4b9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
# Author: Emmanuel Odeke <odeke@ualberta.ca>
# This example steps you through using resty & restAssured to save pickled/serialized
# data as a blob and then re-using it later, after deserialization.
# A sample use case is collaborative computing, i.e. publishing results from an
# expensive computation on one machine so that other machines can load them as live data.
def testSerializer():
import Serializer
bs = Serializer.BinarySerializer()
js = Serializer.JSONSerializer()
data = dict((i, i) for i in range(10))
bserial = bs.serialize(data)
jserial = js.serialize(data)
bdserial = bs.deserialize(bserial)
jdserial = js.deserialize(jserial)
print('bdserial', bdserial)
ioS = bs.ioStream(bserial)
ioR = ioS.read()
print('ioS data from the stream', ioR)
def testCloudPassagePickledVersion():
from entrails.cloudPassage import CloudPassageHandler
cc = CloudPassageHandler()
data = dict((i, i*10) for i in range(9))
title = 'Dict of items 0-8999, keys i*10'
res = cc.push(data, title=title, asPickle=True)
pulledObj = cc.pull(metaData='pickle')
print('PulledObj', pulledObj, data)
assert(pulledObj == data)
rmTry = cc.removeTrace(data, asPickle=True)
print(rmTry)
def testCloudPassageJSONVersion():
from entrails.cloudPassage import CloudPassageHandler
cc = CloudPassageHandler()
data = dict((str(i), i*10) for i in range(9))
title = 'Dict of items 0-8999, keys i*10'
res = cc.push(data, title=title, asPickle=False)
pulledObj = cc.pull(metaData='json')
print('PulledObj', pulledObj, data)
assert(pulledObj == data)
rmTry = cc.removeTrace(data)
print(rmTry)
def main():
testSerializer()
testCloudPassageJSONVersion()
testCloudPassagePickledVersion()
if __name__ == '__main__':
main()
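# The round-trip property the tests above exercise (illustrative sketch):
#
#   s = Serializer.BinarySerializer()
#   assert s.deserialize(s.serialize(obj)) == obj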
| 31 | 87 | 0.7 | 234 | 1,860 | 5.529915 | 0.478632 | 0.009274 | 0.01391 | 0.025502 | 0.341577 | 0.341577 | 0.341577 | 0.341577 | 0.341577 | 0.22102 | 0 | 0.015323 | 0.193011 | 1,860 | 59 | 88 | 31.525424 | 0.846769 | 0.203226 | 0 | 0.292683 | 0 | 0 | 0.088076 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 1 | 0.097561 | false | 0.195122 | 0.073171 | 0 | 0.170732 | 0.146341 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b9a7d3f5b98af28c51ffb55578408fad9a1d3f99 | 3,066 | py | Python | venv/Lib/site-packages/dataframe/_dataframe_column_set.py | kavanAdeshara/Expense_Tracker | b3e4810e858a7786e05cda6b91ba674b73b87981 | [
"Apache-2.0"
] | null | null | null | venv/Lib/site-packages/dataframe/_dataframe_column_set.py | kavanAdeshara/Expense_Tracker | b3e4810e858a7786e05cda6b91ba674b73b87981 | [
"Apache-2.0"
] | null | null | null | venv/Lib/site-packages/dataframe/_dataframe_column_set.py | kavanAdeshara/Expense_Tracker | b3e4810e858a7786e05cda6b91ba674b73b87981 | [
"Apache-2.0"
] | null | null | null | # dataframe: a data-frame implementation using method piping
#
# Copyright (C) 2016 Simon Dirmeier
#
# This file is part of dataframe.
#
# dataframe is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# dataframe is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with dataframe. If not, see <http://www.gnu.org/licenses/>.
#
#
# @author = 'Simon Dirmeier'
# @email = 'mail@simon-dirmeier.net'
from itertools import chain
import tabulate
from ._dataframe_column import DataFrameColumn
from ._dataframe_row import DataFrameRow
class DataFrameColumnSet:
def __init__(self, **kwargs):
self.__data_columns = []
self.__nrow = -1
self.cbind(**kwargs)
def __getitem__(self, item):
if isinstance(item, int):
return self.__data_columns[item]
raise ValueError("Item should be integer!")
def __iter__(self):
for col in self.__data_columns:
yield col
def __str__(self):
stri = "\nA dataframe"
ta = []
for col in self.__data_columns:
vals = col.values
if len(vals) > 10:
vals = list(chain(vals[:3], "...", vals[-3:]))
ta.append(vals)
ta = tabulate.tabulate(zip(*ta), headers=self.colnames)
return stri + "\n\n" + ta.__str__()
@property
def nrow(self):
return self.__nrow
@property
def ncol(self):
return len(self.colnames)
@property
def colnames(self):
return [x.colname for x in self.__data_columns]
def rows(self, idxs):
return [self.row(i) for i in idxs]
def row(self, idx):
"""
Returns DataFrameRow of the DataFrame given its index.
:param idx: the index of the row in the DataFrame.
:return: returns a DataFrameRow
"""
return DataFrameRow(idx, [x[idx] for x in self], self.colnames)
def which_colnames(self, *args):
idx = []
for i in range(len(self.__data_columns)):
if self.colnames[i] in args:
idx.append(i)
return idx
def cbind(self, **columns):
keys = sorted([x for x in columns.keys()])
for k in keys:
self.__cbind(DataFrameColumn(str(k), columns.get(k)))
def __cbind(self, column):
if column.colname in self.colnames:
            raise ValueError("Appending duplicate col-name!")
self.__data_columns.append(column)
self.__nrow = self.__data_columns[-1].size()
for col in self.__data_columns:
if col.size() != self.__nrow:
raise ValueError("Columns do not have equal lengths!")
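# A minimal usage sketch (illustrative, not part of the original file):
#
#   cs = DataFrameColumnSet(a=[1, 2, 3], b=[4, 5, 6])
#   cs.nrow       # -> 3
#   cs.colnames   # -> ['a', 'b'] (cbind sorts the keyword names)
#   cs.row(0)     # -> DataFrameRow holding the first value of each column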
| 30.356436 | 71 | 0.63242 | 405 | 3,066 | 4.62963 | 0.377778 | 0.0384 | 0.072 | 0.036267 | 0.080533 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0.004906 | 0.268754 | 3,066 | 100 | 72 | 30.66 | 0.831401 | 0.302022 | 0 | 0.105263 | 0 | 0 | 0.050986 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.070175 | 0.070175 | 0.438596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9bb675bdbf31f94537da2d2380efe251bd20dd2 | 1,036 | py | Python | rest_auth/registration/urls.py | soul4code/django-rest-auth | b7a2e06e7736865b18f6aab79dcd42210e06c28b | [
"MIT"
] | null | null | null | rest_auth/registration/urls.py | soul4code/django-rest-auth | b7a2e06e7736865b18f6aab79dcd42210e06c28b | [
"MIT"
] | null | null | null | rest_auth/registration/urls.py | soul4code/django-rest-auth | b7a2e06e7736865b18f6aab79dcd42210e06c28b | [
"MIT"
] | null | null | null | from django.urls import re_path
from django.views.generic import TemplateView
from .views import RegisterView, VerifyEmailView
urlpatterns = [
re_path(r'^$', RegisterView.as_view(), name='rest_register'),
re_path(r'^verify-email/$', VerifyEmailView.as_view(), name='rest_verify_email'),
    # This URL is used by django-allauth; the empty TemplateView is defined only
    # so that reverse() calls inside the app resolve, for example when an email
    # with a verification link is sent and its content must be rendered.
    # account_confirm_email - Override this view in your API client to handle
    # the confirmation, then POST the key to the /verify-email/ endpoint.
    # If you don't want to use the API for that step, just use the ConfirmEmailView
    # from django-allauth:
    # https://github.com/pennersr/django-allauth/blob/master/allauth/account/views.py
re_path(r'^account-confirm-email/(?P<key>[-:\w]+)/$', TemplateView.as_view(),
name='account_confirm_email'),
]
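# A client-side sketch (illustrative, not part of the original file): once the
# user opens account-confirm-email/<key>/, an API client could confirm it by
# POSTing the key to the endpoint registered above, e.g. with requests:
#
#   requests.post(api_root + '/verify-email/', data={'key': key})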
| 41.44 | 100 | 0.721042 | 150 | 1,036 | 4.886667 | 0.56 | 0.032742 | 0.028649 | 0.038199 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17278 | 1,036 | 24 | 101 | 43.166667 | 0.855309 | 0.532819 | 0 | 0 | 0 | 0 | 0.230444 | 0.131078 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b9c62ba9c79d4ffcb00ede68fc940fc877d45118 | 5,614 | py | Python | annotations/rip_annotated_junctions.py | ChristopherWilks/snaptron | 82ea3c5c5f2fbb726bba6d8c2bd0f7713291833a | [
"MIT"
] | 25 | 2016-01-08T02:02:36.000Z | 2021-12-29T14:00:58.000Z | annotations/rip_annotated_junctions.py | ChristopherWilks/snaptron | 82ea3c5c5f2fbb726bba6d8c2bd0f7713291833a | [
"MIT"
] | 11 | 2016-02-25T01:44:46.000Z | 2021-07-02T05:52:55.000Z | annotations/rip_annotated_junctions.py | ChristopherWilks/snaptron | 82ea3c5c5f2fbb726bba6d8c2bd0f7713291833a | [
"MIT"
] | 7 | 2016-02-13T01:45:15.000Z | 2021-11-22T11:04:12.000Z | #!/usr/bin/env python
"""
rip_annotated_junctions.py
Non-reference/species version of this script; no lift-over.
Rips junctions from annotation files contained in
jan_24_2016_annotations.tar.gz, as described in annotation_definition.md.
Junctions are dumped to stdout, which we record as annotated_junctions.tsv.gz
in runs/sra (same directory as this file). annotated_junctions.tsv.gz is
required by tables.py. The format of annotated_junctions.tsv.gz is
(tab-separated fields), one per junction
1. Chromosome
2. Start position (1-based, inclusive)
3. End position (1-based, inclusive)
4. Strand (+ or -)
5. anno source (abbreviation)
Must have HISAT2 available (its extract_splice_sites.py script is required).
Stats are written to stderr
From the runs/sra/v2 directory, we ran
pypy rip_annotated_junctions.py
--hisat2-dir /path/to/hisat2-2.0.1-beta
--annotations path/to/jan_24_2016_annotations.tar.gz
| sort -k1,1 -k2,2n -k3,3n | gzip >annotated_junctions.tsv.gz
"""
import subprocess
import tarfile
import argparse
import tempfile
import atexit
import shutil
import glob
import os
import gzip
import sys
#file2source = {"hg19/gencode.v19.annotation.gtf.gz":"gC19","hg19/refGene.txt.gz":"rG19","hg19/acembly.txt.gz":"aC19","hg19/ccdsGene.txt.gz":"cG19","hg19/vegaGene.txt.gz":"vG19","hg19/knownGene.txt.gz":"kG19","hg19/mgcGenes.txt.gz":"mG19","hg19/lincRNAsTranscripts.txt.gz":"lR19","hg19/sibGene.txt.gz":"sG19","hg38/refGene.txt.gz":"rG38","hg38/ccdsGene.txt.gz":"cG38","hg38/gencode.v24.annotation.gtf.gz":"gC38","hg38/knownGene.txt.gz":"kG38","hg38/mgcGenes.txt.gz":"mG38","hg38/lincRNAsTranscripts.txt.gz":"lR38","hg38/sibGene.txt.gz":"sG38"}
#file2source = {"mm10/mouse10_ucsc_genes.gtf.gz":"kG10","mm10/mouse10_gencodevm11_comp.gtf.gz":"gC11","mm10/mouse10_gencodevm09_comp.gtf.gz":"gC09","mm10/mouse10_refseq_refgene.gtf.gz":"rG10"}
file2source = {"mouse10_ucsc_genes.gtf.gz":"kG10","mouse10_gencodevm11_comp.gtf.gz":"gC11","mouse10_gencodevm09_comp.gtf.gz":"gC09","mouse10_refseq_refgene.gtf.gz":"rG10"}
if __name__ == '__main__':
# Print file's docstring if -h is invoked
parser = argparse.ArgumentParser(description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter)
# Add command-line arguments
parser.add_argument('--extract-script-dir', type=str, required=True,
help=('path to directory containing extract_splice_sites.py script (from HISAT2)')
)
parser.add_argument('--annotations', type=str, required=True,
help=('full path to directory that has the annotation GTF(s) in gzipped format')
)
args = parser.parse_args()
extract_destination = tempfile.mkdtemp()
atexit.register(shutil.rmtree, extract_destination)
#with tarfile.open(args.annotations, 'r:gz') as tar:
# tar.extractall(path=extract_destination)
extract_splice_sites_path = os.path.join(args.extract_script_dir,
'extract_splice_sites.py')
containing_dir = os.path.dirname(os.path.realpath(__file__))
annotated_junctions_ = set()
for junction_file in glob.glob(
os.path.join(args.annotations, '*')
):
label = os.path.basename(junction_file)
datasource_code = file2source[label]
unique_junctions = set()
#extract_splice_sites_path prints 0-based, exon coords around junctions
#hence the +2 for the start here
extract_process = subprocess.Popen(' '.join([
sys.executable,
extract_splice_sites_path,
'<(gzip -cd %s)'
% junction_file
]),
shell=True,
executable='/bin/bash',
stdout=subprocess.PIPE
)
for line in extract_process.stdout:
tokens = line.strip().split('\t')
tokens[1] = int(tokens[1]) + 2
tokens[2] = int(tokens[2])
if tokens[2] < tokens[1]:
print >>sys.stderr, (
'Invalid junction ({}, {}, {}) from file {}. '
'Skipping.'
).format(
tokens[0], tokens[1], tokens[2], junction_file
)
continue
tokens.append(datasource_code)
junction_to_add = tuple(tokens)
annotated_junctions_.add(junction_to_add)
unique_junctions.add(junction_to_add)
extract_process.stdout.close()
exit_code = extract_process.wait()
if exit_code != 0:
raise RuntimeError(
'extract_splice_sites.py had nonzero exit code {}.'.format(
exit_code
)
)
print >>sys.stderr, 'Junctions in {}: {}'.format(
label,
len(unique_junctions)
)
junc2datasource = {}
for junction in annotated_junctions_:
if junction[:4] not in junc2datasource:
junc2datasource[junction[:4]]=set()
junc2datasource[junction[:4]].add(junction[4])
seen = set()
for junction in annotated_junctions_:
if junction[:4] not in seen:
sources = ",".join(sorted(junc2datasource[junction[:4]]))
print "%s\t%s" % ('\t'.join(map(str, junction[:4])),sources)
seen.add(junction[:4])
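# Illustrative output line (tab-separated, as printed above; the last field is
# the comma-joined annotation-source abbreviations from file2source):
#   <chrom>  <start>  <end>  <strand>  gC09,gC11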
| 44.555556 | 543 | 0.603669 | 658 | 5,614 | 4.995441 | 0.361702 | 0.021296 | 0.032857 | 0.027989 | 0.158807 | 0.11439 | 0.028598 | 0.028598 | 0.028598 | 0.028598 | 0 | 0.043767 | 0.275561 | 5,614 | 125 | 544 | 44.912 | 0.764446 | 0.180976 | 0 | 0.02439 | 0 | 0 | 0.134615 | 0.050108 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.121951 | null | null | 0.036585 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9c89d9ad7d4587730637df2e5c8576e03c43ad8 | 3,115 | py | Python | shellfind.py | bhavyanshu/Shell-Finder | 308b3ba7f1a53b8a6cc738d69c01f4b7108d0860 | [
"Apache-2.0"
] | 4 | 2016-06-15T22:08:29.000Z | 2019-10-16T13:12:51.000Z | shellfind.py | kaitolegion/Shell-Finder | 308b3ba7f1a53b8a6cc738d69c01f4b7108d0860 | [
"Apache-2.0"
] | null | null | null | shellfind.py | kaitolegion/Shell-Finder | 308b3ba7f1a53b8a6cc738d69c01f4b7108d0860 | [
"Apache-2.0"
] | 7 | 2015-07-08T22:21:52.000Z | 2021-05-31T14:05:47.000Z | #!/usr/bin/env python
'''
Author : Bhavyanshu Parasher
Email : bhavyanshu@codershangout.org
Description : shellfind.py is a Python command-line utility that looks for web shells an attacker may have uploaded to a site. It tries every candidate shell path in its dictionary against the target URL.
'''
import socket
import sys
import httplib
from urlparse import urlparse
import time as t
import urllib2
from urllib2 import Request, urlopen, URLError
negative = '\033[91m'
positive = '\033[32m'
wait = '\033[95m'
final = '\033[93m'
total_scanned_global=0
found_scanned_global=0
def OpenLog(log_file_name):
    try:
        # 'with' guarantees the file gets closed; the original called close()
        # after a return statement, so it never executed.
        with open(log_file_name, 'r') as f:
            return f.read()
    except IOError:
        return "File " + log_file_name + " does not exist."
def main():
socket.setdefaulttimeout(10)
print wait+"\n## ------ Welcome to Shell Finder Utility - Developed by Bhavyanshu Parasher (http://bhavyanshu.github.io) | Apache License V2.0 | Project Source (https://github.com/bhavyanshu/Shell-Finder) ------ ##"
website_url = raw_input("\n\nEnter URL to scan ([eg, http://sitename.com or https://sitename.com/subdir ] | Do not add slash at the end of URL) : ")
parse_url=urlparse(website_url)
log_file_name = "LOG/"+parse_url.netloc+".log"
global total_scanned_global
global found_scanned_global
try:
try:
create=open(log_file_name,"w")
except:
print negative+"\nError generating log file. Please check directory access permissions."
print wait+"\nCreating a persistent connection to site "+website_url
conn = urllib2.Request(website_url)
        urllib2.urlopen(conn)  # open via the Request object built above
print positive+"Connected! Begining to scan for shells.."
    except (urllib2.HTTPError, urllib2.URLError):
print negative+"\nEither the server is down or you are not connected to the internet."
exit()
try:
dictionary = open("dictionary","r")
except(IOError):
        print negative+"Dictionary file not found. Please download the latest dictionary from the GitHub link"
exit()
keywords = dictionary.readlines()
for keys in keywords:
keys=keys.replace("\n","") #To replace newline with empty
New_URL = website_url+"/"+keys
print wait+">>>> "+New_URL
req=Request(New_URL)
try:
response = urlopen(req)
except URLError, e:
if hasattr(e,'reason'):
print negative+"Not found"
total_scanned_global = total_scanned_global+1
elif hasattr(e,'code'):
print negative+"Not found "
total_scanned_global = total_scanned_global+1
else:
try:
log_file=open(log_file_name,"a+") #Appending to it
except(IOError):
print negative+"Failed to create log file. Check dir permissions."
found_scanned_url=New_URL
print positive+"Possible shell found at ",found_scanned_url
log_file.writelines(found_scanned_url+"\n")
found_scanned_global=found_scanned_global+1
total_scanned_global=total_scanned_global+1
log_file.close()
print "\nTotal tries : ", total_scanned_global
print positive+"\nPossible shells: ",found_scanned_global
print final+"\nFollowing are the links to possible shells "
print OpenLog(log_file_name)
if __name__ == '__main__':
main()
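# Illustrative dictionary file format (one candidate shell path per line,
# appended to the target URL; well-known shell names are typical entries):
#   c99.php
#   r57.php
#   wso.php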
| 35 | 224 | 0.742857 | 449 | 3,115 | 4.988864 | 0.396437 | 0.087054 | 0.072321 | 0.042857 | 0.068304 | 0.068304 | 0.068304 | 0.051786 | 0.051786 | 0.051786 | 0 | 0.013223 | 0.150241 | 3,115 | 88 | 225 | 35.397727 | 0.833019 | 0.020546 | 0 | 0.194805 | 0 | 0.025974 | 0.333091 | 0.007636 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.090909 | null | null | 0.194805 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9cea3f3b51bf703897e952ed45d88260e3502a1 | 1,190 | py | Python | dd_app/messaging/backend.py | datadealer/dd_app | 3806b9b9df165a49f0fca8a249170b4ccd4d0177 | [
"Artistic-2.0"
] | 2 | 2018-12-17T10:10:49.000Z | 2018-12-17T11:18:32.000Z | dd_app/messaging/backend.py | datadealer/dd_app | 3806b9b9df165a49f0fca8a249170b4ccd4d0177 | [
"Artistic-2.0"
] | null | null | null | dd_app/messaging/backend.py | datadealer/dd_app | 3806b9b9df165a49f0fca8a249170b4ccd4d0177 | [
"Artistic-2.0"
] | 1 | 2021-06-06T22:28:12.000Z | 2021-06-06T22:28:12.000Z | class RedisBackend(object):
def __init__(self, settings={}, *args, **kwargs):
self.settings = settings
@property
def connection(self):
# cached redis connection
if not hasattr(self, '_connection'):
self._connection = self.settings.get('redis.connector').get()
return self._connection
@property
def channel(self):
# Fanout channel
if not hasattr(self, '_channel'):
self._channel = self.connection.pubsub()
return self._channel
def subscribe(self, channels=[]):
# Fanout subscriber
for chan_id in channels:
self.channel.subscribe(chan_id)
def listen(self):
# Fanout generator
for m in self.channel.listen():
if m['type'] == 'message':
yield m
def send(self, channel_id, payload):
# Fanout emitter
return self.connection.publish(channel_id, payload)
def listen_queue(self, queue_keys):
# Message queue generator
while 1:
yield self.connection.blpop(queue_keys)
    def send_queue(self, queue_key, payload):
        # Message queue emitter; redis-py's rpush requires the list key as its
        # first argument, which the original call omitted
        return self.connection.rpush(queue_key, payload)
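# A minimal usage sketch (illustrative; assumes 'settings' maps 'redis.connector'
# to an object whose .get() returns a redis-py client):
#
#   backend = RedisBackend(settings=settings)
#   backend.subscribe(['room-1'])
#   backend.send('room-1', 'hello')       # fanout publish
#   for message in backend.listen():      # fanout consume
#       print(message['data'])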
| 28.333333 | 73 | 0.608403 | 131 | 1,190 | 5.381679 | 0.351145 | 0.139007 | 0.085106 | 0.04539 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001185 | 0.290756 | 1,190 | 41 | 74 | 29.02439 | 0.834123 | 0.094118 | 0 | 0.074074 | 0 | 0 | 0.042017 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.296296 | false | 0 | 0 | 0.074074 | 0.481481 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9d22fbf764d6a06a81fe68e7bedb0cb2069ff17 | 2,360 | py | Python | mpl/models/leaf.py | jiangyuang/ModelPruningLibrary | 9c8ba5a3c5d118f37768d5d42254711f48d88745 | [
"MIT"
] | 13 | 2020-02-24T16:57:37.000Z | 2021-12-14T16:47:41.000Z | mpl/models/leaf.py | jiangyuang/ModelPruningLibrary | 9c8ba5a3c5d118f37768d5d42254711f48d88745 | [
"MIT"
] | 3 | 2021-01-08T14:06:33.000Z | 2021-09-07T13:39:46.000Z | mpl/models/leaf.py | jiangyuang/ModelPruningLibrary | 9c8ba5a3c5d118f37768d5d42254711f48d88745 | [
"MIT"
] | 3 | 2020-05-30T17:59:43.000Z | 2021-04-13T04:55:33.000Z | from torch import nn as nn
from .base_model import BaseModel
from ..nn.conv2d import DenseConv2d
from ..nn.linear import DenseLinear
__all__ = ["Conv2", "conv2", "Conv4", "conv4"]
class Conv2(BaseModel):
def __init__(self):
super(Conv2, self).__init__()
self.features = nn.Sequential(DenseConv2d(1, 32, kernel_size=5, padding=2), # 32x28x28
nn.ReLU(inplace=True),
nn.MaxPool2d(2, stride=2), # 32x14x14
DenseConv2d(32, 64, kernel_size=5, padding=2), # 64x14x14
nn.ReLU(inplace=True),
nn.MaxPool2d(2, stride=2)) # 64x7x7
self.classifier = nn.Sequential(DenseLinear(64 * 7 * 7, 2048),
nn.ReLU(inplace=True),
DenseLinear(2048, 62))
self.collect_prunable_layers()
def forward(self, inp):
out = self.features(inp)
out = out.view(out.size(0), -1)
out = self.classifier(out)
return out
class Conv4(BaseModel):
def __init__(self):
super(Conv4, self).__init__()
self.features = nn.Sequential(DenseConv2d(3, 32, kernel_size=3, padding=1),
nn.BatchNorm2d(32),
nn.MaxPool2d(2),
DenseConv2d(32, 32, kernel_size=3, padding=1),
nn.BatchNorm2d(32),
nn.MaxPool2d(2),
DenseConv2d(32, 32, kernel_size=3, padding=2),
nn.BatchNorm2d(32),
nn.MaxPool2d(2),
DenseConv2d(32, 32, kernel_size=3, padding=2),
nn.BatchNorm2d(32),
nn.MaxPool2d(2))
self.classifier = DenseLinear(in_features=32 * 6 * 6, out_features=2)
def forward(self, inp):
out = self.features(inp)
out = out.view(out.size(0), -1)
out = self.classifier(out)
return out
def conv2() -> Conv2:
return Conv2()
def conv4() -> Conv4:
return Conv4()
# TODO: define pretrain etc.
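# A minimal usage sketch (illustrative, not part of the original file):
#
#   import torch
#   model = conv2()
#   logits = model(torch.randn(1, 1, 28, 28))  # 1x28x28 input per the comments above
#   assert logits.shape == (1, 62)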
| 36.307692 | 96 | 0.469068 | 235 | 2,360 | 4.578723 | 0.255319 | 0.055762 | 0.066915 | 0.048327 | 0.597584 | 0.515799 | 0.515799 | 0.435874 | 0.435874 | 0.368959 | 0 | 0.093658 | 0.425424 | 2,360 | 64 | 97 | 36.875 | 0.699853 | 0.025424 | 0 | 0.5 | 0 | 0 | 0.008718 | 0 | 0 | 0 | 0 | 0.015625 | 0 | 1 | 0.125 | false | 0 | 0.083333 | 0.041667 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9d2bd5114a0540a0095f6c31a8ad07b71899f53 | 29,424 | py | Python | scripts/generate_network_interactomix.py | quimaguirre/NetworkAnalysis | c7a4da3ba5696800738b4767065ce29fa0020d79 | [
"MIT"
] | 1 | 2017-07-10T17:33:31.000Z | 2017-07-10T17:33:31.000Z | scripts/generate_network_interactomix.py | quimaguirre/NetworkAnalysis | c7a4da3ba5696800738b4767065ce29fa0020d79 | [
"MIT"
] | null | null | null | scripts/generate_network_interactomix.py | quimaguirre/NetworkAnalysis | c7a4da3ba5696800738b4767065ce29fa0020d79 | [
"MIT"
] | null | null | null | import argparse
import ConfigParser
import sys, os, re
import biana
try: from biana import *
except: sys.exit(10)
import methods_dictionaries as methods_dicts
def main():
options = parse_user_arguments()
generate_network(options)
def parse_user_arguments(*args, **kwds):
parser = argparse.ArgumentParser(
description = "Generate a protein-protein interaction network (implemented for Interactomix platform)",
epilog = "@oliva's lab 2019")
parser.add_argument('-iseed','--seeds_input_file',dest='seed',action = 'store',
help = 'Seeds Input file (default is input_seed)')
parser.add_argument('-radius','--radius_of_subnetwork_around_seeds',dest='radius',default=0,action = 'store',type=int,
help = '''Network is built in a radius of connections around the seed proteins.
If 0, it creates the complete interactome''')
parser.add_argument('-taxid','--TaxID',dest='taxid',action = 'store',default='9606',
                        help = 'Tax ID (human=9606 is the default; if TaxID=0 there is no restriction)')
parser.add_argument('-stype','--seed_type',dest='stype',action = 'store',default='geneid',
help = 'Type of identifier for seeds (default is geneid)')
parser.add_argument('-ttype','--translation_type',dest='ttype',action = 'store',default='accessionnumber',
help = '''Type of identifier for the output translation of codes (default is accessionnumber)
Using "proteinsequence" provides the longest sequence among all codes''')
parser.add_argument('-trans','--translation_of_nodes_file',dest='translation_file',action = 'store',default='translation_nodes.txt',
help = 'File with the translation of codes from BIANA to the selected type for all nodes')
parser.add_argument('-strans','--translation_of_seeds_file',dest='translation_seeds_file',action = 'store',default='translation_seeds_to_BIANA_codes.txt',
help = 'File with the translation of codes from the introduced type of code to BIANA codes')
parser.add_argument('-edge','--edge_file',dest='edge',action = 'store', default='biana_edges',
help = 'Output file with edges(default is biana_edges)')
parser.add_argument('-node','--node_file',dest='node',action = 'store', default='biana_nodes',
help = 'Output file with nodes(default is biana_nodes)')
parser.add_argument('-format','--output_format',dest='format',action = 'store',default='sif',
                        help = '''Format of the edge file:\tsif (default), netscore, raw, multi-fields:\n
'sif': <node1>\tscore\t<node2>\n
'netscore': <node1>\t<node2>\t<score>\n
'raw': <node1>\t<node2>\n
'multi-fields' : <node1>\t<node2>\t<sources>\t<method_ids>\t<method_names>\t<pmids>\n''')
parser.add_argument('-rAFF','--restricted_to_TAP',dest='restricted_to_TAP',action = 'store_true',
help = 'Flag to use interactions at least described by affinity methods (i.e. Tandem Affinity Purification)')
parser.add_argument('-rY2H','--restricted_to_Y2H',dest='restricted_to_Y2H',action = 'store_true',
help = 'Flag to use interactions at least described by yeast two hybrid methods (Y2H)')
parser.add_argument('-rUSER','--restricted_to_user',dest='restricted_to_user',action = 'store',default='restricted_methods',
                        help = 'File with methods selected by the user; only interactions described by these methods are used')
parser.add_argument('-eAFF','--except_TAP',dest='except_TAP',action = 'store_true',
help = 'Flag to use all interactions except those described by affinity methods (i.e. Tandem Affinity Purification)')
parser.add_argument('-eY2H','--except_Y2H',dest='except_Y2H',action = 'store_true',
help = 'Flag to use all interactions except those described by yeast two hybrid methods (Y2H)')
parser.add_argument('-eUSER','--except_user',dest='except_user',action = 'store',default='restricted_methods',
                        help = 'File with methods selected by the user; interactions described by these methods are rejected')
parser.add_argument('-v','--verbose',dest='verbose',action = 'store_true',
help = 'Flag to use verbose mode')
options=parser.parse_args()
"""
Example:
python generate_network_interactomix.py -iseed example/sample1.txt -radius 1 -taxid 9606 -stype uniprotentry -ttype proteinsequence -trans example/output/example.proteinsequence.trans -strans example/output/example.seeds.trans -edge example/output/example.edges -node example/output/example.nodes -format raw -rY2H
python /home/quim/PHD/Projects/BIANA/scripts/generate_network_interactomix.py -radius 0 -taxid 9606 -edge /home/quim/PHD/Projects/BIANA/outputs/BIANA_2020_geneID_seqtax_drugtarget/human_network_biana_2020.txt -node /home/quim/PHD/Projects/BIANA/outputs/BIANA_2020_geneID_seqtax_drugtarget/human_network_biana_2020_nodes.txt -trans /home/quim/PHD/Projects/BIANA/outputs/BIANA_2020_geneID_seqtax_drugtarget/human_network_biana_2020_translation.txt -ttype geneid -format multi-fields &> /home/quim/PHD/Projects/BIANA/outputs/BIANA_2020_geneID_seqtax_drugtarget/human_network_biana_2020.log
"""
return options
def generate_network(options):
"""
Generates a protein-protein interaction network extracting information from BIANA.
"""
#----------------------#
# FIXED PARAMETERS #
#----------------------#
# Parameters that I have decided to fix
restricted_to_seeds = False
minimum_number_of_methods = 1
minimum_number_of_db = 1
seed_score = 0.1
#--------------------------------------#
# GET INFORMATION FROM CONFIG FILE #
#--------------------------------------#
# Get the program path
main_path = os.path.abspath(os.path.dirname(__file__))
# Read the config file
config_file = os.path.join(main_path, 'config.ini')
config = ConfigParser.ConfigParser()
config.read(config_file)
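    # The config file is expected to provide a [BIANA] section with the keys
    # read below: database, host, user, password and unification_protocol.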
#--------------------------------------#
# LOAD THE DICTIONARIES OF METHODS #
#--------------------------------------#
# Get the affinity dictionary
affinity_dict = methods_dicts.affinity_dict
affinity=set(affinity_dict.keys())
# Get the complementation dictionary
complementation_dict = methods_dicts.complementation_dict
complementation=set(complementation_dict.keys())
#---------------------------------------#
# GET METHODS THAT WILL BE FILTERED #
#---------------------------------------#
# Check if the user has introduced a file with methods that must be included
if not fileExist(options.restricted_to_user):
print "No restriction on methods selected by the user"
user_selection=False
else:
use_methods=[]
with open(options.restricted_to_user) as input_method_fd:
for line in input_method_fd:
fields = line.strip().split("\t")
use_methods.append(fields[0])
user_selection=True
print "Input to use only Methods:",repr(use_methods)
# Check if the user has introduced a file with methods that have to be excluded
if not fileExist(options.except_user):
print "No rejection of methods selected by the user"
user_rejection=False
else:
no_methods=[]
with open(options.except_user) as input_method_fd:
for line in input_method_fd:
fields = line.strip().split("\t")
no_methods.append(fields[0])
user_rejection=True
print "Input of rejected Methods:",repr(no_methods)
#---------------------------#
# START A BIANA SESSION #
#---------------------------#
print "Open session"
session = create_new_session( sessionID="biana_session",
dbname=config.get('BIANA', 'database'),
dbhost=config.get('BIANA', 'host'),
dbuser=config.get('BIANA', 'user'),
dbpassword=config.get('BIANA', 'password'),
unification_protocol=config.get('BIANA', 'unification_protocol') )
print "Continue"
#------------------------------#
# DEFINE A USER ENTITY SET #
#------------------------------#
# Create network network of expansion if the radius is larger than 0
if restricted_to_seeds or options.radius>0:
# Check if the seeds file exists
if not fileExist(options.seed):
print "File with seeds is missing or not found"
sys.exit(10)
else:
level=options.radius
seed_list = get_seeds_from_file(options.seed)
# If we have Taxonomy restriction, we add it
if options.taxid != "0":
print("Check Proteome %s"%(repr(options.taxid)))
proteome = session.create_new_user_entity_set( identifier_description_list =seed_list,
attribute_restriction_list=[("taxid",options.taxid)],
id_type=options.stype,new_user_entity_set_id="proteome",
negative_attribute_restriction_list=[] )
else:
print('Proteome without Taxonomy restriction')
proteome = session.create_new_user_entity_set( identifier_description_list =seed_list,
id_type=options.stype,new_user_entity_set_id="proteome",
negative_attribute_restriction_list=[] )
else:
level=0
proteome = session.create_new_user_entity_set( identifier_description_list = [("taxid",options.taxid)],
attribute_restriction_list=[], id_type="embedded",
new_user_entity_set_id="proteome",
negative_attribute_restriction_list=[] )
#----------------------------------------------------#
# SELECT THE INTERACTIONS OF THE USER ENTITY SET #
#----------------------------------------------------#
print ("Selecting interactions")
# Select interactions that have been detected at least by affinity technology
if options.restricted_to_TAP:
print ('Using interactions at least described by affinity methods (i.e. Tandem Affinity Purification)')
session.create_network( user_entity_set_id = "proteome" , level = level, relation_type_list=["interaction"] ,
relation_attribute_restriction_list = [("Method_id",400)],
#relation_attribute_restriction_list = [("psimi_name","affinity technology")],
include_relations_last_level = (not restricted_to_seeds) , use_self_relations = False)
# Select interactions that have been detected at least by yeast two hybrid
elif options.restricted_to_Y2H:
print ('Using interactions at least described by yeast-two-hybrid methods (Y2H)')
session.create_network( user_entity_set_id = "proteome" , level = level, relation_type_list=["interaction"] ,
relation_attribute_restriction_list = [("Method_id",18)],
#relation_attribute_restriction_list = [("psimi_name","y2h2")],
include_relations_last_level = (not restricted_to_seeds) , use_self_relations = False)
# Select all interactions
else:
session.create_network( user_entity_set_id = "proteome" , level = level, relation_type_list=["interaction"] ,
include_relations_last_level = (not restricted_to_seeds) , use_self_relations = False)
# Summary of interactions
out_network = open(options.edge,'w')
all_interactions = proteome.getRelations()
print "Num interactions:", len(all_interactions)
#--------------------------------------#
# FILTER THE SELECTED INTERACTIONS #
#--------------------------------------#
nodes=set()
# Get all the user entity ids from the user entity set 'proteome'
all_uEs = proteome.get_user_entity_ids()
# Obtain a dictionary user entity ID => type
uEId_to_type = session.dbAccess.get_user_entity_type(config.get('BIANA', 'unification_protocol'), all_uEs)
skip_interactions=0
for (uE_id1, uE_id2) in all_interactions:
#self.dbAccess.get_external_entities_dict( externalEntityIdsList = [external_entity_relation_id] )
# Get TYPE of user entity
uE1_type = uEId_to_type[uE_id1]
uE2_type = uEId_to_type[uE_id2]
# If type is not protein, we skip the interaction
if uE1_type != 'protein' or uE2_type != 'protein':
if options.verbose:
print('Skipping interaction because the type of one of the user entities is not protein!')
print('Node 1: {}\tType: {}'.format(uE_id1, uE1_type))
print('Node 2: {}\tType: {}'.format(uE_id2, uE2_type))
skip_interactions=skip_interactions+1
continue
eErIDs_list = proteome.get_external_entity_relation_ids(uE_id1, uE_id2)
method_names = set()
method_ids = set()
source_databases = set()
use_method_ids=set()
pubmed_ids = set()
unused_method_names = set()
relationObj_dict = session.dbAccess.get_external_entities_dict(
externalEntityIdsList = eErIDs_list, attribute_list = [],
relation_attribute_list = ["method_id","psimi_name","pubmed"], participant_attribute_list = [] )
num_methods=0
for current_eErID in eErIDs_list:
relationObj = relationObj_dict[current_eErID]
if options.verbose:
print "Interaction: (",uE_id1,",",uE_id2,")"
print relationObj
#if relationObj.get_attribute(attribute_identifier="psimi_name") is not None:
# print "\t".join([ x.value for x in relationObj.get_attribute(attribute_identifier="psimi_name") ])
#if relationObj.get_attribute(attribute_identifier="method_id") is not None:
#print "\t".join([ x.value for x in relationObj.get_attribute(attribute_identifier="method_id") ])
#print relationObj.get_attributes_dict()
#print [ x.value for x in relationObj.get_attributes_dict()["psimi_name"] ]
#print [ x.value for x in relationObj.get_attributes_dict()["method_id"] ]
if "psimi_name" in relationObj.get_attributes_dict():
method_names.update([ str(x.value) for x in relationObj.get_attributes_dict()["psimi_name"] ])
if "method_id" in relationObj.get_attributes_dict():
method_ids.update([ x.value for x in relationObj.get_attributes_dict()["method_id"]])
if "pubmed" in relationObj.get_attributes_dict():
pubmed_ids.update([ x.value for x in relationObj.get_attributes_dict()["pubmed"]])
source_databases.add(str(session.dbAccess.get_external_database(
database_id = relationObj.get_source_database()) ))
if options.except_TAP:
for m in method_ids:
if m not in affinity:
use_method_ids.add(m)
#print "Add", m
else:
unused_method_names.add(affinity_dict[m])
elif options.except_Y2H:
#print "check Y2H"
for m in method_ids:
if m not in complementation:
use_method_ids.add(m)
#print "Add", m
else:
unused_method_names.add(complementation_dict[m])
elif user_rejection:
for m in method_ids:
if m not in no_methods:
use_method_ids.add(m)
elif user_selection:
for m in method_ids:
#print "Check",repr(use_methods)
if m in set(use_methods):
use_method_ids.add(m)
                elif options.verbose:
                    print "Not among selected methods ", m
else:
use_method_ids.update(method_ids)
if len(source_databases) > 0:
info_sources=";".join([str(x) for x in source_databases])
else:
if options.verbose:
print('Skipping interaction it has no source database!')
print('Node 1: {}\tNode 2: {}'.format(uE_id1, uE_id2))
skip_interactions=skip_interactions+1
continue
if len(method_names) > 0:
method_names = [x for x in method_names if x not in unused_method_names] # Remove method names that were excluded
info_methods=";".join([str(x) for x in method_names])
else:
info_methods='-'
if len(use_method_ids) > 0:
info_methods_ids=";".join([str(x) for x in use_method_ids])
else:
if options.verbose:
print('Skipping interaction it has no method!')
print('Node 1: {}\tNode 2: {}'.format(uE_id1, uE_id2))
skip_interactions=skip_interactions+1
continue
if len(pubmed_ids) > 0:
info_pubmed_ids=";".join([str(x) for x in pubmed_ids])
else:
info_pubmed_ids='-'
num_databases=len(source_databases)
num_methods=len(use_method_ids)
num_pubmeds = len(pubmed_ids)
if options.verbose:
print "Methods",num_methods,info_methods,"\tSelected:",info_methods_ids
print "Databases",num_databases,info_sources
print "Pubmeds",num_pubmeds,info_pubmed_ids
# Check if the number of methods is higher than the minimum established
if num_methods >= minimum_number_of_methods:
use=True
else:
use=False
# Check if the number of database is higher than the minimum established
if use and num_databases >= minimum_number_of_db:
use=True
else:
use=False
if not use:
skip_interactions=skip_interactions+1
#print method_names, method_ids, source_databases
#----------------------#
# OUTPUT EDGE FILE #
#----------------------#
if use:
#print uE_id1, uE_id/2
nodes.add(uE_id1)
nodes.add(uE_id2)
#print "Attribute ",(uE_id1,uE_id2).get_attribute(
if options.format == 'multi-fields' :
out_network.write("{0}\t{1}\t{2}\t{3}\t{4}\t{5}\n".
format(uE_id1,uE_id2,info_sources,info_methods_ids,info_methods,info_pubmed_ids))
elif options.format == 'netscore':
out_network.write('\t{}\t{}\t{:.2f}\n'.format(uE_id1,uE_id2,1.))
elif options.format == 'raw':
out_network.write("{}\t{}\n".format(uE_id1,uE_id2))
else:
# If the format is not multi-fields, netscore or raw, the output format is sif
out_network.write("{}\t{:.2f}\t{}\n".format(uE_id1,1.,uE_id2))
print "Num neglected interactions:", skip_interactions
out_network.close()
#---------------------------------------#
# OUTPUT NODE AND TRANSLATION FILES #
#---------------------------------------#
# If we wanted the complete interactome, the translation will be done differently
if options.radius <= 0:
# Output node file
out_proteins = open(options.node,'w')
for protein in nodes:
if options.format == 'multi-fields':
out_proteins.write("{0}\t{1:.2f}\t{2:.2f}\t{3:.2f}\n".format(protein,1.,1.,0.1))
elif options.format == 'netscore':
out_proteins.write("{0}\t{1:.2f}\t{2:.2f}\t{3:.2f}\n".format(protein,1.,1.,0.1))
else:
out_proteins.write("{0}\t{1:.2f}\n".format(protein,0.1))
out_proteins.close()
################################# TRANSLATION ####################################
out_translation = open(options.translation_file,'w')
# TRANSLATION TO 'stype'
trans_stype=False
if options.stype != 'proteinsequence' and options.stype != options.ttype:
trans_stype = True
out_trans_stype = open(options.translation_file+'.'+options.stype+'.trans','w')
for protein in nodes:
uE = session.get_user_entity(protein)
translate=set()
translate_stype=set()
if options.ttype == "proteinsequence":
                maxlen = 0
for current_id in uE.get_attribute(attribute_identifier=options.ttype):
if maxlen < len(current_id.value.get_sequence().upper()):
maxlen=len(current_id.value.get_sequence().upper())
translation=",".join([str(current_id.value.get_sequence().upper()) for current_id in uE.get_attribute(attribute_identifier=options.ttype) if len(str(current_id.value.get_sequence().upper())) == maxlen ] )
#print "Translation",protein,translation
#print("{0}\t'{1}'\n".format(protein,translation))
else:
##### TRANSLATION TO 'ttype'
for current_id in uE.get_attribute(attribute_identifier=options.ttype):
translate.add(current_id.value.upper())
translation="','".join(["{0}".format(x) for x in translate])
out_translation.write("{0}\t'{1}'\n".format(protein,translation))
##### TRANSLATION TO STYPE
if trans_stype:
for current_id in uE.get_attribute(attribute_identifier=options.stype):
translate_stype.add(current_id.value.upper())
translation_stype="','".join(["{0}".format(x) for x in translate_stype])
out_trans_stype.write("{0}\t'{1}'\n".format(protein,translation_stype))
out_translation.close()
if trans_stype:
out_trans_stype.close()
####################################################################################
# If we wanted a network of expansion, the translation will be done differently
elif options.radius > 0:
# Read the seeds
seeds=set()
input_seed = open(options.seed,'r')
for line in input_seed:
fields = line.strip().split("\t")
seeds.add(fields[0].lower())
input_seed.close()
# Output node file
out_proteins = open(options.node,'w')
translate={}
for protein in nodes:
score=seed_score
uE = session.get_user_entity(protein)
for current_id in uE.get_attribute(attribute_identifier=options.stype):
if current_id.value.lower() in seeds:
translate.setdefault(current_id.value.lower(),[])
translate[current_id.value.lower()].append(protein)
score=1.0
if options.format == 'multi-fields':
out_proteins.write("{0}\t{1:.2f}\t{2:.2f}\t{3:.2f}\n".format(protein,1.,1.,score))
elif options.format == 'netscore':
out_proteins.write("{0}\t{1:.2f}\t{2:.2f}\t{3:.2f}\n".format(protein,1.,1.,score))
else:
out_proteins.write("{0}\t{1:.2f}\n".format(protein,score))
out_proteins.close()
# Get the IDS of single nodes that were not previously found in the network
single=set()
for uE_id in proteome.get_unconnected_nodes():
single.add(uE_id)
for protein in single:
uE = session.get_user_entity(protein)
for current_id in uE.get_attribute(attribute_identifier=options.stype):
if current_id.value.lower() in seeds:
translate.setdefault(current_id.value.lower(),[])
translate[current_id.value.lower()].append(protein)
# Get all IDS of SEEDS, defined as "proteome", and check missing codes to be
# added for translation
allseed=set()
for uE_id in proteome.get_user_entity_ids():
allseed.add(uE_id)
for protein in allseed:
if protein not in single and protein not in nodes:
uE = session.get_user_entity(protein)
for current_id in uE.get_attribute(attribute_identifier=options.stype):
if current_id.value.lower() in seeds:
translate.setdefault(current_id.value.lower(),[])
translate[current_id.value.lower()].append(protein)
################################# TRANSLATION ####################################
out_translation = open(options.translation_seeds_file,'w')
for s in seeds:
if s == '': continue
if s in translate:
codes=set(translate[s])
translation="','".join([str(x) for x in codes])
#out_translation.write("%s\t'%s'\n" % (s.upper(),translation))
out_translation.write("{0}\t'{1}'\n".format(s.upper(),translation))
else:
out_translation.write("{0}\t'Unknown'\n".format(s.upper()))
out_translation.close()
# Output translation file
# TRANSLATION TO 'ttype'
out_translation = open(options.translation_file,'w')
# TRANSLATION TO 'stype'
trans_stype=False
if options.stype != 'proteinsequence' and options.stype != options.ttype:
trans_stype = True
out_trans_stype = open(options.translation_file+'.'+options.stype+'.trans','w')
for protein in nodes:
uE = session.get_user_entity(protein)
translate=set()
translate_stype=set()
if options.ttype == "proteinsequence":
                maxlen = 0
for current_id in uE.get_attribute(attribute_identifier=options.ttype):
if maxlen < len(current_id.value.get_sequence().upper()):
maxlen=len(current_id.value.get_sequence().upper())
translation=",".join([str(current_id.value.get_sequence().upper()) for current_id in uE.get_attribute(attribute_identifier=options.ttype) if len(str(current_id.value.get_sequence().upper())) == maxlen ] )
#print "Translation",protein,translation
#print("{0}\t'{1}'\n".format(protein,translation))
else:
for current_id in uE.get_attribute(attribute_identifier=options.ttype):
translate.add(current_id.value.upper())
translation="','".join(["{0}".format(x) for x in translate])
out_translation.write("{0}\t'{1}'\n".format(protein,translation))
##### TRANSLATION TO STYPE
if trans_stype:
for current_id in uE.get_attribute(attribute_identifier=options.stype):
translate_stype.add(current_id.value.upper())
translation_stype="','".join(["{0}".format(x) for x in translate_stype])
out_trans_stype.write("{0}\t'{1}'\n".format(protein,translation_stype))
out_translation.close()
if trans_stype:
out_trans_stype.close()
####################################################################################
print('Generation of the network done!')
return
def fileExist(file):
"""
Checks if a file exists AND is a file
"""
return os.path.exists(file) and os.path.isfile(file)
def get_seeds_from_file(seed_file):
"""
Obtain the seeds from a file and introduce them to a Python list.
The seeds must be separated by new lines!
"""
seed_set = set()
with open(seed_file, 'r') as seed_file_fd:
for line in seed_file_fd:
fields = line.strip().split('\t')
seed_set.add(fields[0])
return list(seed_set)
if __name__ == "__main__":
main()
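# Illustrative seeds file format for -iseed (one identifier per line; lines are
# split on tabs and only the first field is used, see get_seeds_from_file):
#   <identifier-1>
#   <identifier-2>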
| 48.157119 | 591 | 0.569195 | 3,301 | 29,424 | 4.867616 | 0.106634 | 0.017924 | 0.018297 | 0.02894 | 0.521782 | 0.483943 | 0.432474 | 0.412808 | 0.409821 | 0.383869 | 0 | 0.011347 | 0.296153 | 29,424 | 610 | 592 | 48.236066 | 0.76451 | 0.131559 | 0 | 0.389744 | 0 | 0.017949 | 0.181744 | 0.019149 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.002564 | 0.015385 | null | null | 0.071795 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9d992fc9c803eca7ba614c187b28cbfcef4b1f8 | 5,988 | py | Python | scripts/commit_validation/commit_validation/commit_validation.py | cypherdotXd/o3de | bb90c4ddfe2d495e9c00ebf1e2650c6d603a5676 | [
"Apache-2.0",
"MIT"
] | 8 | 2021-08-31T02:14:19.000Z | 2021-12-28T19:20:59.000Z | scripts/commit_validation/commit_validation/commit_validation.py | cypherdotXd/o3de | bb90c4ddfe2d495e9c00ebf1e2650c6d603a5676 | [
"Apache-2.0",
"MIT"
] | 8 | 2021-07-12T13:55:00.000Z | 2021-10-04T14:53:21.000Z | scripts/commit_validation/commit_validation/commit_validation.py | cypherdotXd/o3de | bb90c4ddfe2d495e9c00ebf1e2650c6d603a5676 | [
"Apache-2.0",
"MIT"
] | 1 | 2021-09-16T05:06:18.000Z | 2021-09-16T05:06:18.000Z | #
# Copyright (c) Contributors to the Open 3D Engine Project.
# For complete copyright and license terms please see the LICENSE at the root of this distribution.
#
# SPDX-License-Identifier: Apache-2.0 OR MIT
#
#
import abc
import importlib
import os
import pkgutil
import re
import time
from typing import Dict, List, Tuple
VERBOSE = False
class Commit(abc.ABC):
"""An interface for accessing details about a commit"""
@abc.abstractmethod
def get_files(self) -> List[str]:
"""Returns a list of local files added/modified by the commit"""
pass
@abc.abstractmethod
def get_removed_files(self) -> List[str]:
"""Returns a list of local files removed by the commit"""
pass
@abc.abstractmethod
    def get_file_diff(self, file_name: str) -> str:
"""
Given a file name, returns a string in unified diff format
that represents the changes made to that file for this commit.
Most validators will only pay attention to added lines (with + in front)
"""
pass
@abc.abstractmethod
def get_description(self) -> str:
"""Returns the description of the commit"""
pass
@abc.abstractmethod
def get_author(self) -> str:
"""Returns the author of the commit"""
pass
def validate_commit(commit: Commit, out_errors: List[str] = None, ignore_validators: List[str] = None) -> bool:
"""Validates a commit against all validators
:param commit: The commit to validate
:param out_errors: if not None, will populate with the list of errors given by the validators
:param ignore_validators: Optional list of CommitValidator classes to ignore, by class name
:return: True if there are no validation errors, and False otherwise
"""
failed_count = 0
passed_count = 0
start_time = time.time()
    # Find all the validators in the validators package (iter_modules is not
    # recursive; nested packages are skipped)
validator_classes = []
validators_dir = os.path.join(os.path.dirname(__file__), 'validators')
for _, module_name, is_package in pkgutil.iter_modules([validators_dir]):
if not is_package:
module = importlib.import_module('commit_validation.validators.' + module_name)
validator = module.get_validator()
if ignore_validators and validator.__name__ in ignore_validators:
print(f"Disabled validation for '{validator.__name__}'")
else:
validator_classes.append(validator)
error_summary = {}
# Process validators
for validator_class in validator_classes:
validator = validator_class()
validator_name = validator.__class__.__name__
error_list = []
passed = validator.run(commit, errors = error_list)
if passed:
passed_count += 1
print(f'{validator.__class__.__name__} PASSED')
else:
failed_count += 1
print(f'{validator.__class__.__name__} FAILED')
error_summary[validator_name] = error_list
end_time = time.time()
if failed_count:
print("VALIDATION FAILURE SUMMARY")
for val_name in error_summary.keys():
errors = error_summary[val_name]
if errors:
for error_message in errors:
first_line = True
for line in error_message.splitlines():
if first_line:
first_line = False
print(f'VALIDATOR_FAILED: {val_name} {line}')
else:
print(f' {line}') # extra detail lines do not need machine parsing
stats_strs = []
if failed_count > 0:
stats_strs.append(f'{failed_count} failed')
if passed_count > 0:
stats_strs.append(f'{passed_count} passed')
stats_str = ', '.join(stats_strs) + f' in {end_time - start_time:.2f}s'
print()
print(stats_str)
return failed_count == 0
def IsFileSkipped(file_name) -> bool:
if os.path.splitext(file_name)[1].lower() not in SOURCE_AND_SCRIPT_FILE_EXTENSIONS:
skipped = True
for pattern in SOURCE_AND_SCRIPT_FILE_PATTERNS:
if pattern.match(file_name):
skipped = False
break
return skipped
return False
class CommitValidator(abc.ABC):
"""A commit validator"""
@abc.abstractmethod
def run(self, commit: Commit, errors: List[str]) -> bool:
"""Validates a commit
:param commit: The commit to validate
:param errors: List of errors generated, append them to this list
:return: True if the commit is valid, and False otherwise
"""
pass
SOURCE_FILE_EXTENSIONS: Tuple[str, ...] = (
'.c', '.cc', '.cpp', '.cxx', '.h', '.hpp', '.hxx', '.inl', '.m', '.mm', '.cs', '.java'
)
"""File extensions for compiled source code"""
SCRIPT_FILE_EXTENSIONS: Tuple[str, ...] = (
'.py', '.lua', '.bat', '.cmd', '.sh', '.js'
)
"""File extensions for interpreted code"""
BUILD_FILE_EXTENSIONS: Tuple[str, ...] = (
'.cmake',
)
"""File extensions for build files"""
SOURCE_AND_SCRIPT_FILE_EXTENSIONS: Tuple[str, ...] = SOURCE_FILE_EXTENSIONS + SCRIPT_FILE_EXTENSIONS + BUILD_FILE_EXTENSIONS
"""File extensions for both compiled and interpreted code"""
BUILD_FILE_PATTERNS: Tuple[re.Pattern, ...] = (
re.compile(r'.*CMakeLists\.txt'),
re.compile(r'.*Jenkinsfile')
)
"""File patterns for build files"""
SOURCE_AND_SCRIPT_FILE_PATTERNS: Tuple[re.Pattern, ...] = BUILD_FILE_PATTERNS
EXCLUDED_VALIDATION_PATTERNS = [
'*/.git/*',
'*/3rdParty/*',
'*/__pycache__/*',
'*/External/*',
'build',
'Cache',
'*/Code/Framework/AzCore/azgnmx/azgnmx/*',
'Code/Tools/CryFXC',
'Code/Tools/HLSLCrossCompiler',
'Code/Tools/HLSLCrossCompilerMETAL',
'Docs',
'python/runtime',
'restricted/*/Tools/*RemoteControl',
'Tools/3dsmax',
'*/user/Cache/*',
'*/user/log/*',
]
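# A minimal usage sketch (illustrative, not part of the original file):
#
#   class StubCommit(Commit):
#       def get_files(self): return ['Code/foo.cpp']
#       def get_removed_files(self): return []
#       def get_file_diff(self, file_name): return '+added line'
#       def get_description(self): return 'test commit'
#       def get_author(self): return 'dev@example.com'
#
#   ok = validate_commit(StubCommit())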
| 31.68254 | 124 | 0.631096 | 720 | 5,988 | 5.047222 | 0.3 | 0.04623 | 0.033021 | 0.031646 | 0.168134 | 0.118327 | 0.106219 | 0.042928 | 0.022014 | 0.022014 | 0 | 0.003149 | 0.257515 | 5,988 | 188 | 125 | 31.851064 | 0.814215 | 0.215932 | 0 | 0.125 | 0 | 0 | 0.15374 | 0.056325 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0.108333 | 0.066667 | 0 | 0.175 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b9dd82e962e13070a8526b2d4d0da1d0be6265ee | 7,417 | py | Python | src/py65/devices/mpu65c02.py | dabeaz/py65 | 62d790445018f0616508022912b67d8d64935a29 | [
"BSD-3-Clause"
] | 5 | 2015-03-19T22:22:45.000Z | 2020-05-15T18:26:59.000Z | src/py65/devices/mpu65c02.py | BigEd/py65 | 57d5e7191362006c1d6fa20662da3e4854f1b7c2 | [
"BSD-3-Clause"
] | null | null | null | src/py65/devices/mpu65c02.py | BigEd/py65 | 57d5e7191362006c1d6fa20662da3e4854f1b7c2 | [
"BSD-3-Clause"
] | 3 | 2015-04-27T02:42:29.000Z | 2021-07-16T20:50:23.000Z | from py65.devices import mpu6502
from py65.utils.devices import make_instruction_decorator
class MPU(mpu6502.MPU):
def __init__(self, *args, **kwargs):
mpu6502.MPU.__init__(self, *args, **kwargs)
self.name = '65C02'
self.waiting = False
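        # 'waiting' is set by the WAI instruction (0xCB below); while it is
        # set, step() only consumes a cycle instead of executing instructions.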
def step(self):
if self.waiting:
self.processorCycles += 1
else:
mpu6502.MPU.step(self)
return self
# Make copies of the lists
instruct = mpu6502.MPU.instruct[:]
cycletime = mpu6502.MPU.cycletime[:]
extracycles = mpu6502.MPU.extracycles[:]
disassemble = mpu6502.MPU.disassemble[:]
instruction = make_instruction_decorator(instruct, disassemble,
cycletime, extracycles)
# addressing modes
def ZeroPageIndirectAddr(self):
return self.WordAt( 255 & (self.ByteAt(self.pc)))
def AccumulatorAddr(self):
return self.a
# operations
def opRMB(self, x, mask):
address = x()
self.memory[address] &= mask
def opSMB(self, x, mask):
address = x()
self.memory[address] |= mask
def opSTZ(self, x):
self.memory[x()] = 0x00
def opTSB(self, x):
address = x()
m = self.memory[address]
self.p &= ~self.ZERO
z = m & self.a
if z != 0:
self.p |= self.ZERO
self.memory[address] = m | self.a
def opTRB(self, x):
address = x()
m = self.memory[address]
self.p &= ~self.ZERO
z = m & self.a
if z != 0:
self.p |= self.ZERO
self.memory[address] = m & ~self.a
# instructions
@instruction(name="RMB0", mode="zpg", cycles=5)
def inst_0x07(self):
self.opRMB(self.ZeroPageAddr, 0xFE)
self.pc += 1
@instruction(name="ORA", mode="zpi", cycles=5)
def inst_0x12(self):
self.opORA(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="RMB1", mode="zpg", cycles=5)
def inst_0x17(self):
self.opRMB(self.ZeroPageAddr, 0xFD)
self.pc += 1
@instruction(name="RMB2", mode="zpg", cycles=5)
def inst_0x27(self):
self.opRMB(self.ZeroPageAddr, 0xFB)
self.pc += 1
@instruction(name="AND", mode="zpi", cycles=5)
def inst_0x32(self):
self.opAND(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="BIT", mode="zpx", cycles=4)
def inst_0x34(self):
self.opBIT(self.ZeroPageXAddr)
self.pc += 1
@instruction(name="RMB3", mode="zpg", cycles=5)
def inst_0x37(self):
self.opRMB(self.ZeroPageAddr, 0xF7)
self.pc += 1
@instruction(name="BIT", mode="abx", cycles=4)
def inst_0x3c(self):
self.opBIT(self.AbsoluteXAddr)
self.pc += 2
@instruction(name="RMB4", mode="zpg", cycles=5)
def inst_0x47(self):
self.opRMB(self.ZeroPageAddr, 0xEF)
self.pc += 1
@instruction(name="EOR", mode="zpi", cycles=5)
def inst_0x52(self):
self.opEOR(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="RMB5", mode="zpg", cycles=5)
def inst_0x57(self):
self.opRMB(self.ZeroPageAddr, 0xDF)
self.pc += 1
@instruction(name="PHY", mode="imp", cycles=3)
def inst_0x5a(self):
self.stPush(self.y)
    @instruction(name="STZ", mode="zpg", cycles=3)
def inst_0x64(self):
self.opSTZ(self.ZeroPageAddr)
self.pc += 1
@instruction(name="RMB6", mode="zpg", cycles=5)
def inst_0x67(self):
self.opRMB(self.ZeroPageAddr, 0xBF)
self.pc += 1
@instruction(name="ADC", mode="zpi", cycles=5)
def inst_0x72(self):
self.opADC(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="STZ", mode="zpx", cycles=4)
def inst_0x74(self):
self.opSTZ(self.ZeroPageXAddr)
self.pc += 1
@instruction(name="PHY", mode="imp", cycles=4)
def inst_0x7a(self):
self.y = self.stPop()
self.FlagsNZ(self.y)
@instruction(name="RMB7", mode="zpg", cycles=5)
def inst_0x77(self):
self.opRMB(self.ZeroPageAddr, 0x7F)
self.pc += 1
@instruction(name="SMB0", mode="zpg", cycles=5)
def inst_0x87(self):
self.opSMB(self.ZeroPageAddr, 0x01)
self.pc += 1
@instruction(name="STA", mode="zpi", cycles=5)
def inst_0x92(self):
self.opSTA(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="SMB1", mode="zpg", cycles=5)
def inst_0x97(self):
self.opSMB(self.ZeroPageAddr, 0x02)
self.pc += 1
@instruction(name="STZ", mode="abs", cycles=4)
def inst_0x9c(self):
self.opSTZ(self.AbsoluteAddr)
self.pc += 2
@instruction(name="STZ", mode="abx", cycles=5)
def inst_0x9e(self):
self.opSTZ(self.AbsoluteXAddr)
self.pc += 2
@instruction(name="SMB2", mode="zpg", cycles=5)
def inst_0xa7(self):
self.opSMB(self.ZeroPageAddr, 0x04)
self.pc += 1
@instruction(name="LDA", mode="zpi", cycles=5)
def inst_0xb2(self):
self.opLDA(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="SMB3", mode="zpg", cycles=5)
def inst_0xb7(self):
self.opSMB(self.ZeroPageAddr, 0x08)
self.pc += 1
@instruction(name="SMB4", mode="zpg", cycles=5)
def inst_0xc7(self):
self.opSMB(self.ZeroPageAddr, 0x10)
self.pc += 1
@instruction(name="SMB5", mode="zpg", cycles=5)
def inst_0xd7(self):
self.opSMB(self.ZeroPageAddr, 0x20)
self.pc += 1
@instruction(name="PHX", mode="imp", cycles=3)
def inst_0xda(self):
self.stPush(self.x)
@instruction(name="SMB6", mode="zpg", cycles=5)
def inst_0xe7(self):
self.opSMB(self.ZeroPageAddr, 0x40)
self.pc += 1
@instruction(name="SMB7", mode="zpg", cycles=5)
def inst_0xf7(self):
self.opSMB(self.ZeroPageAddr, 0x80)
self.pc += 1
@instruction(name="PLX", mode="imp", cycles=4)
def inst_0xfa(self):
self.x = self.stPop()
self.FlagsNZ(self.x)
@instruction(name="TSB", mode="zpg", cycles=5)
def inst_0x04(self):
self.opTSB(self.ZeroPageAddr)
self.pc += 1
@instruction(name="TSB", mode="abs", cycles=6)
def inst_0x0c(self):
self.opTSB(self.AbsoluteAddr)
self.pc += 2
@instruction(name="TRB", mode="zpg", cycles=5)
def inst_0x14(self):
self.opTRB(self.ZeroPageAddr)
self.pc += 1
@instruction(name="INC", mode="acc", cycles=2)
def inst_0x1a(self):
self.opINCR(None)
@instruction(name="TRB", mode="abs", cycles=6)
def inst_0x1c(self):
self.opTRB(self.AbsoluteAddr)
self.pc += 2
@instruction(name="DEC", mode="acc", cycles=2)
def inst_0x3a(self):
self.opDECR(None)
@instruction(name="BRA", mode="rel", cycles=1, extracycles=1)
def inst_0x80(self):
self.BranchRelAddr()
@instruction(name="WAI", mode='imp', cycles=3)
def inst_0xCB(self):
self.waiting = True
@instruction(name="CMP", mode='zpi', cycles=6) # Don't know cycles
def inst_0xD2(self):
self.opCPY(self.ZeroPageIndirectAddr)
self.pc += 1
@instruction(name="SBC", mode="zpi", cycles=5)
def inst_0xf2(self):
self.opSBC(self.ZeroPageIndirectAddr)
self.pc += 1
| 27.369004 | 71 | 0.58676 | 949 | 7,417 | 4.528978 | 0.201264 | 0.14658 | 0.047231 | 0.117264 | 0.620754 | 0.427873 | 0.255002 | 0.079107 | 0.079107 | 0.061424 | 0 | 0.050804 | 0.262235 | 7,417 | 270 | 72 | 27.47037 | 0.734649 | 0.011191 | 0 | 0.229665 | 0 | 0 | 0.037259 | 0 | 0 | 0 | 0.009281 | 0 | 0 | 1 | 0.244019 | false | 0 | 0.009569 | 0.009569 | 0.296651 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9ddc98cf55e2bef4fcf498ec4787ca57bad46d0 | 5,623 | py | Python | tests/test__io.py | soerendip/ms-mint | bf5f5d87d07a0d2108c6cd0d92c278f2ea762e58 | [
"MIT"
] | 1 | 2021-09-03T04:02:25.000Z | 2021-09-03T04:02:25.000Z | tests/test__io.py | soerendip/ms-mint | bf5f5d87d07a0d2108c6cd0d92c278f2ea762e58 | [
"MIT"
] | 3 | 2020-09-29T21:43:39.000Z | 2021-07-21T22:18:27.000Z | tests/test__io.py | soerendip/ms-mint | bf5f5d87d07a0d2108c6cd0d92c278f2ea762e58 | [
"MIT"
] | 4 | 2019-11-14T13:25:24.000Z | 2021-04-30T22:08:53.000Z | import pandas as pd
import shutil
import os
import io
from ms_mint.Mint import Mint
from pathlib import Path as P
from ms_mint.io import (
ms_file_to_df,
mzml_to_pandas_df_pyteomics,
convert_ms_file_to_feather,
convert_ms_file_to_parquet,
MZMLB_AVAILABLE,
)
from paths import (
TEST_MZML,
TEST_MZXML,
TEST_PARQUET,
TEST_MZMLB_POS,
TEST_MZML_POS,
TEST_MZML_NEG,
)
def test__ms_file_to_df__mzML():
result = ms_file_to_df(TEST_MZML)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
def test__ms_file_to_df__mzML_timeunit_minutes():
result = ms_file_to_df(TEST_MZML, time_unit="minutes")
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
def test__ms_file_to_df__mzXML():
result = ms_file_to_df(TEST_MZXML)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
def test__mzml_to_pandas_df_pyteomics_pos():
result = mzml_to_pandas_df_pyteomics(TEST_MZML_POS)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
assert all(result.polarity == "+"), f'Polarity should be "+"\n{result}'
def test__mzml_to_pandas_df_pyteomics_neg():
result = mzml_to_pandas_df_pyteomics(TEST_MZML_NEG)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
assert all(result.polarity == "-"), f'Polarity should be "-"\n{result}'
def test__read_parquet():
result = ms_file_to_df(TEST_PARQUET)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
def test__write_read_hdf(tmpdir):
df = ms_file_to_df(TEST_PARQUET)
fn = P(tmpdir) / "file.hdf"
df.to_hdf(fn, key="data")
result = ms_file_to_df(fn)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
def test__read_mzMLb(tmpdir):
if not MZMLB_AVAILABLE:
return None
result = ms_file_to_df(TEST_MZMLB_POS)
expected_cols = [
"scan_id",
"ms_level",
"polarity",
"scan_time_min",
"mz",
"intensity",
]
assert isinstance(result, pd.DataFrame), f"{type(result)} is not a dataframe"
assert expected_cols == result.columns.to_list(), result.columns
# assert all(result.polarity == '+'), f'Polarity should be "+"\n{result}'
def test__convert_ms_file_to_feather(tmpdir):
print(tmpdir)
shutil.copy(TEST_MZML, tmpdir)
fn = P(tmpdir) / P(TEST_MZML).name
fn_out = fn.with_suffix(".feather")
print(fn, fn_out)
convert_ms_file_to_feather(fn)
assert fn_out.is_file(), f"File not generated {fn_out}"
df = ms_file_to_df(fn)
df_fea = ms_file_to_df(fn_out)
assert df_fea.equals(df), "DataFrames not equal"
def test__convert_ms_file_to_parquet(tmpdir):
print(tmpdir)
shutil.copy(TEST_MZML, tmpdir)
fn = P(tmpdir) / P(TEST_MZML).name
fn_out = fn.with_suffix(".parquet")
print(fn, fn_out)
convert_ms_file_to_parquet(fn)
assert fn_out.is_file(), f"File not generated {fn_out}"
df = ms_file_to_df(fn)
df_fea = ms_file_to_df(fn_out)
assert df_fea.equals(df), "DataFrames not equal"
def test__export_to_excel(tmp_path):
filename = os.path.join(tmp_path, "output.xlsx")
mint = Mint(verbose=True)
mint.ms_files = "tests/data/test.mzXML"
mint.run()
mint.export(filename)
assert os.path.isfile(filename)
def test__export_to_excel_without_fn():
mint = Mint(verbose=True)
mint.ms_files = TEST_MZXML
mint.targets = pd.DataFrame(
{
"peak_label": ["A"],
"mz_mean": [200],
"mz_width": [10],
"intensity_threshold": [0],
"rt_min": [0],
"rt_max": [10],
"targets_filename": ["unknown"],
}
)
mint.run()
buffer = mint.export()
assert isinstance(buffer, io.BytesIO)
df = pd.read_excel(buffer, sheet_name="Results")
assert len(df) == 1, len(df)
assert df.loc[0, "peak_label"] == "A", df.loc[0, "peak_label"]
assert df.loc[0, "ms_file"] == P(TEST_MZXML).name, df.loc[0, "ms_file"]
| 27.563725 | 81 | 0.634181 | 771 | 5,623 | 4.293126 | 0.143969 | 0.041692 | 0.050755 | 0.045317 | 0.776435 | 0.712387 | 0.676737 | 0.622659 | 0.583988 | 0.583988 | 0 | 0.003273 | 0.239374 | 5,623 | 203 | 82 | 27.699507 | 0.770634 | 0.012627 | 0 | 0.528736 | 0 | 0 | 0.178198 | 0.003784 | 0.005747 | 0 | 0 | 0 | 0.155172 | 1 | 0.068966 | false | 0 | 0.045977 | 0 | 0.12069 | 0.022989 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9e0543df8f2ae150950f2a9787edb6296aac618 | 2,482 | py | Python | bluesky/tests/test_simulators.py | NSLS-II/bluesky | b7d666e65cf4ef556fb46b744c33264c8e3f7507 | [
"BSD-3-Clause"
] | 43 | 2015-08-04T20:13:41.000Z | 2019-04-12T17:21:36.000Z | bluesky/tests/test_simulators.py | NSLS-II/bluesky | b7d666e65cf4ef556fb46b744c33264c8e3f7507 | [
"BSD-3-Clause"
] | 966 | 2015-07-29T16:43:21.000Z | 2019-05-09T21:02:28.000Z | bluesky/tests/test_simulators.py | NSLS-II/bluesky | b7d666e65cf4ef556fb46b744c33264c8e3f7507 | [
"BSD-3-Clause"
] | 40 | 2015-07-29T16:42:41.000Z | 2019-02-07T02:30:34.000Z | from bluesky.plans import scan
from bluesky.simulators import (print_summary, print_summary_wrapper,
summarize_plan,
check_limits,
plot_raster_path)
import pytest
from bluesky.plans import grid_scan
def test_print_summary(hw):
det = hw.det
motor = hw.motor
print_summary(scan([det], motor, -1, 1, 10)) # old name
summarize_plan(scan([det], motor, -1, 1, 10)) # new name
list(print_summary_wrapper(scan([det], motor, -1, 1, 10)))
def test_old_module_name(hw):
det = hw.det
motor = hw.motor
motor1 = hw.motor1
motor2 = hw.motor2
from bluesky.plan_tools import (print_summary, print_summary_wrapper,
plot_raster_path)
with pytest.warns(UserWarning):
print_summary(scan([det], motor, -1, 1, 10))
with pytest.warns(UserWarning):
list(print_summary_wrapper(scan([det], motor, -1, 1, 10)))
with pytest.warns(UserWarning):
plan = grid_scan([det], motor1, -5, 5, 10, motor2, -7, 7, 15, True)
plot_raster_path(plan, 'motor1', 'motor2', probe_size=.3)
def test_check_limits(RE, hw):
det = hw.det
motor = hw.motor
# The motor object does not currently implement limits.
# Use an assert to help us out if this changes in the future.
assert not hasattr(motor, 'limits')
# # check_limits should warn if it can't find check_value
# TODO: Is there _any_ object to test?
# with pytest.warns(UserWarning):
# check_limits(scan([det], motor, -1, 1, 3))
# monkey-patch some limits
motor.limits = (-2, 2)
# check_limits should do nothing here
check_limits(scan([det], motor, -1, 1, 3))
# check_limits should error if limits are exceeded only if object raises
# this object does not raise
check_limits(scan([det], motor, -3, 3, 3))
# check_limits should raise if limits are equal only if object raises
# this object does not raise
motor.limits = (2, 2)
check_limits(scan([det], motor, -1, 1, 3))
def test_check_limits_needs_RE():
with pytest.raises(RuntimeError) as ctx:
check_limits([])
assert str(ctx.value) == "Bluesky event loop not running"
def test_plot_raster_path(hw):
det = hw.det
motor1 = hw.motor1
motor2 = hw.motor2
plan = grid_scan([det], motor1, -5, 5, 10, motor2, -7, 7, 15, True)
plot_raster_path(plan, 'motor1', 'motor2', probe_size=.3)
| 34 | 76 | 0.636583 | 357 | 2,482 | 4.271709 | 0.268908 | 0.086557 | 0.07082 | 0.068197 | 0.531148 | 0.492459 | 0.372459 | 0.32918 | 0.251803 | 0.199344 | 0 | 0.037695 | 0.251813 | 2,482 | 72 | 77 | 34.472222 | 0.783522 | 0.224013 | 0 | 0.565217 | 0 | 0 | 0.031414 | 0 | 0 | 0 | 0 | 0.013889 | 0.043478 | 1 | 0.108696 | false | 0 | 0.108696 | 0 | 0.217391 | 0.152174 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9e2c12e3855c30001fd37ab610587d3e95c803d | 535 | py | Python | microservices/users/config.py | Levakin/sanic-test-app | d96a54a21f6d0d3b262bbc7bc75f5fa3b12c3b61 | [
"Apache-2.0"
] | null | null | null | microservices/users/config.py | Levakin/sanic-test-app | d96a54a21f6d0d3b262bbc7bc75f5fa3b12c3b61 | [
"Apache-2.0"
] | null | null | null | microservices/users/config.py | Levakin/sanic-test-app | d96a54a21f6d0d3b262bbc7bc75f5fa3b12c3b61 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import os
from distutils.util import strtobool
class Config:
DEBUG = bool(strtobool(os.getenv('DEBUG', "False")))
DATABASE_URI = os.getenv('DATABASE_URI', '127.0.0.1:27017')
WORKERS = int(os.getenv('WORKERS', 2))
LOGO = os.getenv('LOGO', None)
HOST = os.getenv('HOST', '127.0.0.1')
PORT = int(os.getenv('PORT', 8000))
SECRET = os.getenv('SECRET', 'secret')
LOGIN_MIN_LENGTH = int(os.getenv('LOGIN_MIN_LENGTH', 1))
LOGIN_MAX_LENGTH = int(os.getenv('LOGIN_MAX_LENGTH', 32))
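# Usage sketch (hypothetical values): export DATABASE_URI=db:27017 PORT=8080
# before launching the service; Config reads the environment at import time.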
| 31.470588 | 63 | 0.646729 | 80 | 535 | 4.2 | 0.4375 | 0.214286 | 0.130952 | 0.035714 | 0.130952 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058166 | 0.164486 | 535 | 16 | 64 | 33.4375 | 0.693512 | 0.039252 | 0 | 0 | 0 | 0 | 0.212891 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9e36baa14d5265769af32c8ed910969e39eaf3a | 199 | py | Python | semantic-python/test/fixtures/4-01-lambda-literals.py | Temurson/semantic | 2e9cd2c006cec9a0328791e47d8c6d60af6d5a1b | [
"MIT"
] | 8,844 | 2019-05-31T15:47:12.000Z | 2022-03-31T18:33:51.000Z | semantic-python/test/fixtures/4-01-lambda-literals.py | Qanora/semantic | b0eda9a61bbc690a342fb177cfc12eec8c1c001c | [
"MIT"
] | 401 | 2019-05-31T18:30:26.000Z | 2022-03-31T16:32:29.000Z | semantic-python/test/fixtures/4-01-lambda-literals.py | Qanora/semantic | b0eda9a61bbc690a342fb177cfc12eec8c1c001c | [
"MIT"
] | 504 | 2019-05-31T17:55:03.000Z | 2022-03-30T04:15:04.000Z | # CHECK-TREE: { const <- \x -> \y -> x; y <- const #true #true; z <- const #false #false; #record { const: const, y : y, z: z, }}
const = lambda x, y: x
y = const(True, True)
z = const(False, False)
| 39.8 | 129 | 0.557789 | 34 | 199 | 3.264706 | 0.294118 | 0.072072 | 0.054054 | 0.072072 | 0.594595 | 0.594595 | 0.594595 | 0.594595 | 0.594595 | 0.594595 | 0 | 0 | 0.221106 | 199 | 4 | 130 | 49.75 | 0.716129 | 0.613065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9e707edd4da101ada4ff00b233330f2c2f9843e | 148 | py | Python | abc153/d.py | Lockdef/kyopro-code | 2d943a87987af05122c556e173e5108a0c1c77c8 | [
"MIT"
] | null | null | null | abc153/d.py | Lockdef/kyopro-code | 2d943a87987af05122c556e173e5108a0c1c77c8 | [
"MIT"
] | null | null | null | abc153/d.py | Lockdef/kyopro-code | 2d943a87987af05122c556e173e5108a0c1c77c8 | [
"MIT"
] | null | null | null | h = int(input())
i = 1
a = 1
c = 1
while h >= a:
a = 2 ** i
i += 1
for j in range(1, i-1):
c += 2 ** j
print(c)
| 8.705882 | 23 | 0.398649 | 35 | 148 | 1.685714 | 0.542857 | 0.101695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11236 | 0.398649 | 148 | 16 | 24 | 9.25 | 0.550562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.076923 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9e96b262a690da4aaab0bf9584b51a15851826f | 6,784 | py | Python | demos/python/sdk_wireless_camera_control/open_gopro/demos/log_battery.py | Natureshadow/OpenGoPro | 05110123cfbf6584288b813f2d4896d3a091480e | [
"MIT"
] | 210 | 2021-06-05T20:06:17.000Z | 2022-03-31T18:13:17.000Z | demos/python/sdk_wireless_camera_control/open_gopro/demos/log_battery.py | Natureshadow/OpenGoPro | 05110123cfbf6584288b813f2d4896d3a091480e | [
"MIT"
] | 73 | 2021-06-01T21:22:44.000Z | 2022-03-31T18:33:24.000Z | demos/python/sdk_wireless_camera_control/open_gopro/demos/log_battery.py | Natureshadow/OpenGoPro | 05110123cfbf6584288b813f2d4896d3a091480e | [
"MIT"
] | 70 | 2021-06-07T03:59:04.000Z | 2022-03-26T10:51:15.000Z | # log_battery.py/Open GoPro, Version 2.0 (C) Copyright 2021 GoPro, Inc. (http://gopro.com/OpenGoPro).
# This copyright was auto-generated on Wed, Sep 1, 2021 5:05:45 PM
"""Example to continuously read the battery (with no Wifi connection)"""
import csv
import time
import logging
import argparse
import threading
from pathlib import Path
from datetime import datetime
from dataclasses import dataclass
from typing import Optional, Tuple, Literal, List
from rich.console import Console
from open_gopro import GoPro
from open_gopro.constants import StatusId
from open_gopro.util import setup_logging, set_logging_level
logger = logging.getLogger(__name__)
console = Console() # rich consoler printer
BarsType = Literal[0, 1, 2, 3]
@dataclass
class Sample:
"""Simple class to store battery samples"""
index: int
percentage: int
bars: BarsType
def __post_init__(self) -> None:
self.time = datetime.now()
def __str__(self) -> str: # pylint: disable=missing-return-doc
return f"Index {self.index} @ time {self.time.strftime('%H:%M:%S')} --> bars: {self.bars}, percentage: {self.percentage}"
SAMPLE_INDEX = 0
SAMPLES: List[Sample] = []
def dump_results_as_csv(location: Path) -> None:
"""Write all of the samples to a csv file
Args:
location (Path): File to write to
"""
console.print(f"Dumping results as CSV to {location}")
with open(location, mode="w") as f:
w = csv.writer(f, delimiter=",", quotechar='"', quoting=csv.QUOTE_MINIMAL)
w.writerow(["index", "time", "percentage", "bars"])
initial_time = SAMPLES[0].time
for s in SAMPLES:
w.writerow([s.index, (s.time - initial_time).seconds, s.percentage, s.bars])
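# Resulting file sketch (sample values hypothetical):
# index,time,percentage,bars
# 0,0,87,3
# 1,60,86,3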
def process_battery_notifications(gopro: GoPro, initial_bars: BarsType, initial_percentage: int) -> None:
"""Separate thread to continuously check for and store battery notifications.
If the CLI parameter was set to poll, this isn't used.
Args:
gopro (GoPro): instance to get updates from
initial_bars (BarsType): Initial bars level when notifications were enabled
initial_percentage (int): Initial percentage when notifications were enabled
"""
last_percentage = initial_percentage
last_bars = initial_bars
while True:
# Block until we receive an update
notification = gopro.get_update()
# Update data points if they have changed
last_percentage = (
notification.data[StatusId.INT_BATT_PER]
if StatusId.INT_BATT_PER in notification.data
else last_percentage
)
last_bars = (
notification.data[StatusId.BATT_LEVEL] if StatusId.BATT_LEVEL in notification.data else last_bars
)
# Append and print sample
global SAMPLE_INDEX
SAMPLES.append(Sample(index=SAMPLE_INDEX, percentage=last_percentage, bars=last_bars))
console.print(str(SAMPLES[-1]))
SAMPLE_INDEX += 1
def main() -> int:
"""Main program functionality
Returns:
int: program return code
"""
identifier, log_location, poll = parse_arguments()
global logger
logger = setup_logging(logger, log_location)
global SAMPLE_INDEX
gopro: Optional[GoPro] = None
return_code = 0
try:
with GoPro(identifier, enable_wifi=False) as gopro:
set_logging_level(logger, logging.ERROR)
# Setup notifications if we are not polling
if poll is None:
console.print("Configuring battery notifications...")
# Enable notifications of the relevant battery statuses. Also store initial values.
bars = gopro.ble_status.batt_level.register_value_update().flatten
percentage = gopro.ble_status.int_batt_per.register_value_update().flatten
# Start a thread to handle asynchronous battery level notifications
threading.Thread(
target=process_battery_notifications, args=(gopro, bars, percentage), daemon=True
).start()
with console.status("[bold green]Receiving battery notifications until it dies..."):
# Sleep forever, allowing notification handler thread to deal with battery level notifications
while True:
time.sleep(1)
# Otherwise, poll
else:
with console.status("[bold green]Polling the battery until it dies..."):
while True:
SAMPLES.append(
Sample(
index=SAMPLE_INDEX,
percentage=gopro.ble_status.int_batt_per.get_value().flatten,
bars=gopro.ble_status.batt_level.get_value().flatten,
)
)
console.print(str(SAMPLES[-1]))
SAMPLE_INDEX += 1
time.sleep(poll)
except Exception as e: # pylint: disable=broad-except
logger.error(repr(e))
return_code = 1
except KeyboardInterrupt:
logger.warning("Received keyboard interrupt. Shutting down...")
finally:
if len(SAMPLES) > 0:
csv_location = Path(log_location.parent) / "battery_results.csv"
dump_results_as_csv(csv_location)
if gopro is not None:
gopro.close()
console.print("Exiting...")
return return_code # pylint: disable=lost-exception
def parse_arguments() -> Tuple[str, Path, Optional[int]]:
"""Parse command line arguments
Returns:
Tuple[str, Path, Optional[int]]: (identifier, path to save log, polling interval or None)
"""
parser = argparse.ArgumentParser(
description="Connect to the GoPro via BLE only and continuously read the battery (either by polling or notifications)."
)
parser.add_argument(
"-i",
"--identifier",
type=str,
help="Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. \
If not used, first discovered GoPro will be connected to",
default=None,
)
parser.add_argument(
"-l",
"--log",
type=Path,
help="Location to store detailed log",
default="log_battery.log",
)
parser.add_argument(
"-p",
"--poll",
type=int,
help="Set to poll the battery at a given interval. If not set, battery level will be notified instead. Defaults to notifications.",
default=None,
)
args = parser.parse_args()
return args.identifier, args.log, args.poll
if __name__ == "__main__":
main()
| 34.969072 | 139 | 0.627358 | 806 | 6,784 | 5.150124 | 0.320099 | 0.02385 | 0.009636 | 0.012527 | 0.104071 | 0.065526 | 0.052517 | 0.016863 | 0 | 0 | 0 | 0.006579 | 0.283019 | 6,784 | 193 | 140 | 35.150259 | 0.846834 | 0.205483 | 0 | 0.109375 | 1 | 0.023438 | 0.13274 | 0.006059 | 0 | 0 | 0 | 0 | 0 | 1 | 0.046875 | false | 0 | 0.101563 | 0.007813 | 0.203125 | 0.039063 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9ebcddd99e456fbeb39a0191aad31656c7f4943 | 856 | py | Python | setup.py | EdWard680/python-firetv | 4c02f79a1c8ae60a489297178d010a31545a3b5d | [
"MIT"
] | null | null | null | setup.py | EdWard680/python-firetv | 4c02f79a1c8ae60a489297178d010a31545a3b5d | [
"MIT"
] | null | null | null | setup.py | EdWard680/python-firetv | 4c02f79a1c8ae60a489297178d010a31545a3b5d | [
"MIT"
] | null | null | null | from setuptools import setup
setup(
name='firetv',
version='1.0.7',
description='Communicate with an Amazon Fire TV device via ADB over a network.',
url='https://github.com/happyleavesaoc/python-firetv/',
license='MIT',
author='happyleaves',
author_email='happyleaves.tfr@gmail.com',
packages=['firetv'],
install_requires=['pycryptodome', 'rsa', 'adb-homeassistant', 'pure-python-adb-homeassistant'],
extras_require={
'firetv-server': ['Flask>=0.10.1', 'PyYAML>=3.12']
},
entry_points={
'console_scripts': [
'firetv-server = firetv.__main__:main'
]
},
classifiers=[
'License :: OSI Approved :: MIT License',
'Operating System :: OS Independent',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 3'
]
)
| 30.571429 | 99 | 0.613318 | 91 | 856 | 5.67033 | 0.725275 | 0.062016 | 0.096899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018209 | 0.23014 | 856 | 27 | 100 | 31.703704 | 0.764795 | 0 | 0 | 0 | 0 | 0 | 0.538551 | 0.063084 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.038462 | 0 | 0.038462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9ec25017a264a5c2dd928342198ca509ad93675 | 893 | py | Python | neo/io/exampleio.py | Mario-Kart-Felix/python-neo | 951c97cf9eb56f5489da88940de920329e0f4c1b | [
"BSD-3-Clause"
] | 199 | 2015-01-20T13:49:13.000Z | 2022-03-21T18:35:29.000Z | neo/io/exampleio.py | Mario-Kart-Felix/python-neo | 951c97cf9eb56f5489da88940de920329e0f4c1b | [
"BSD-3-Clause"
] | 905 | 2015-01-07T09:21:15.000Z | 2022-03-31T16:29:44.000Z | neo/io/exampleio.py | Mario-Kart-Felix/python-neo | 951c97cf9eb56f5489da88940de920329e0f4c1b | [
"BSD-3-Clause"
] | 178 | 2015-01-05T12:34:39.000Z | 2022-02-20T23:06:52.000Z | """
neo.io has been split into a 2-level API:
* neo.io: this API gives neo objects
* neo.rawio: this API gives raw data as they are stored in files.
Developers are encouraged to use neo.rawio.
When this is done, the neo.io class is generated automagically with
the following kind of code.
Author: sgarcia
"""
from neo.io.basefromrawio import BaseFromRaw
from neo.rawio.examplerawio import ExampleRawIO
class ExampleIO(ExampleRawIO, BaseFromRaw):
name = 'example IO'
description = "Fake IO"
# This is an important choice when there are several channels.
# 'split-all' : 1 AnalogSignal each 1 channel
# 'group-by-same-units' : one 2D AnalogSignal for each group of channel with same units
_prefered_signal_group_mode = 'group-by-same-units'
def __init__(self, filename=''):
ExampleRawIO.__init__(self, filename=filename)
BaseFromRaw.__init__(self, filename)
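# Minimal usage sketch (the filename is a placeholder; ExampleRawIO fabricates
# its data, so any path should work):
# reader = ExampleIO('fake_file.fake')
# block = reader.read_block()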
| 28.806452 | 93 | 0.724524 | 128 | 893 | 4.929688 | 0.554688 | 0.031696 | 0.07607 | 0.050713 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005579 | 0.197088 | 893 | 30 | 94 | 29.766667 | 0.874477 | 0.536394 | 0 | 0 | 0 | 0 | 0.08933 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
b9fc3dd10a80beed547f86b535cfadc6f817e0e2 | 4,872 | tac | Python | 6 复试/2 笔试/4 编译原理/hw/2016_黄家晖_PA/550405220_4_decaf_PA3/TestCases/S3/output/t9.tac | ladike/912_project | 5178c1c93ac6ca30ffc72dd689f5c6932704b4ab | [
"MIT"
] | 1 | 2022-03-02T16:05:49.000Z | 2022-03-02T16:05:49.000Z | 6 复试/2 笔试/4 编译原理/hw/2016_黄家晖_PA/550405220_4_decaf_PA3/TestCases/S3/output/t9.tac | ladike/912_project | 5178c1c93ac6ca30ffc72dd689f5c6932704b4ab | [
"MIT"
] | null | null | null | 6 复试/2 笔试/4 编译原理/hw/2016_黄家晖_PA/550405220_4_decaf_PA3/TestCases/S3/output/t9.tac | ladike/912_project | 5178c1c93ac6ca30ffc72dd689f5c6932704b4ab | [
"MIT"
] | null | null | null | VTABLE(_Main) {
<empty>
Main
_Main.COPY;
}
VTABLE(_Base) {
<empty>
Base
_Base.COPY;
}
VTABLE(_Sub1) {
_Base
Sub1
_Sub1.COPY;
}
VTABLE(_Sub2) {
_Base
Sub2
_Sub2.COPY;
}
VTABLE(_Sub3) {
_Sub1
Sub3
_Sub3.COPY;
}
VTABLE(_Sub4) {
_Sub3
Sub4
_Sub4.COPY;
}
FUNCTION(_Main_New) {
memo ''
_Main_New:
_T1 = 4
parm _T1
_T2 = call _Alloc
_T3 = VTBL <_Main>
*(_T2 + 0) = _T3
return _T2
}
FUNCTION(_Main.COPY) {
memo '_T4:4'
_Main.COPY:
_T5 = 4
parm _T5
_T6 = call _Alloc
_T7 = VTBL <_Main>
*(_T6 + 0) = _T7
return _T6
}
FUNCTION(_Base_New) {
memo ''
_Base_New:
_T8 = 4
parm _T8
_T9 = call _Alloc
_T10 = VTBL <_Base>
*(_T9 + 0) = _T10
return _T9
}
FUNCTION(_Base.COPY) {
memo '_T11:4'
_Base.COPY:
_T12 = 4
parm _T12
_T13 = call _Alloc
_T14 = VTBL <_Base>
*(_T13 + 0) = _T14
return _T13
}
FUNCTION(_Sub1_New) {
memo ''
_Sub1_New:
_T15 = 4
parm _T15
_T16 = call _Alloc
_T17 = VTBL <_Sub1>
*(_T16 + 0) = _T17
return _T16
}
FUNCTION(_Sub1.COPY) {
memo '_T18:4'
_Sub1.COPY:
_T19 = 4
parm _T19
_T20 = call _Alloc
_T21 = VTBL <_Sub1>
*(_T20 + 0) = _T21
return _T20
}
FUNCTION(_Sub2_New) {
memo ''
_Sub2_New:
_T22 = 4
parm _T22
_T23 = call _Alloc
_T24 = VTBL <_Sub2>
*(_T23 + 0) = _T24
return _T23
}
FUNCTION(_Sub2.COPY) {
memo '_T25:4'
_Sub2.COPY:
_T26 = 4
parm _T26
_T27 = call _Alloc
_T28 = VTBL <_Sub2>
*(_T27 + 0) = _T28
return _T27
}
FUNCTION(_Sub3_New) {
memo ''
_Sub3_New:
_T29 = 4
parm _T29
_T30 = call _Alloc
_T31 = VTBL <_Sub3>
*(_T30 + 0) = _T31
return _T30
}
FUNCTION(_Sub3.COPY) {
memo '_T32:4'
_Sub3.COPY:
_T33 = 4
parm _T33
_T34 = call _Alloc
_T35 = VTBL <_Sub3>
*(_T34 + 0) = _T35
return _T34
}
FUNCTION(_Sub4_New) {
memo ''
_Sub4_New:
_T36 = 4
parm _T36
_T37 = call _Alloc
_T38 = VTBL <_Sub4>
*(_T37 + 0) = _T38
return _T37
}
FUNCTION(_Sub4.COPY) {
memo '_T39:4'
_Sub4.COPY:
_T40 = 4
parm _T40
_T41 = call _Alloc
_T42 = VTBL <_Sub4>
*(_T41 + 0) = _T42
return _T41
}
FUNCTION(main) {
memo ''
main:
_T48 = call _Base_New
_T43 = _T48
_T49 = call _Sub1_New
_T44 = _T49
_T50 = call _Sub2_New
_T45 = _T50
_T51 = call _Sub3_New
_T46 = _T51
_T52 = call _Sub4_New
_T47 = _T52
parm _T43
call _Main.printType
parm _T44
call _Main.printType
parm _T45
call _Main.printType
parm _T46
call _Main.printType
parm _T47
call _Main.printType
_T43 = _T47
parm _T43
call _Main.printType
_T54 = VTBL <_Sub1>
_T55 = *(_T43 + 0)
_L22:
_T53 = (_T54 == _T55)
if (_T53 != 0) branch _L23
_T55 = *(_T55 + 0)
if (_T55 != 0) branch _L22
_T56 = "Decaf runtime error: "
parm _T56
call _PrintString
_T57 = *(_T43 + 0)
_T58 = *(_T57 + 4)
parm _T58
call _PrintString
_T59 = " cannot be cast to "
parm _T59
call _PrintString
_T60 = VTBL <_Sub1>
_T61 = *(_T60 + 4)
parm _T61
call _PrintString
_T62 = "\n"
parm _T62
call _PrintString
call _Halt
_L23:
_T44 = _T43
parm _T44
call _Main.printType
}
FUNCTION(_Main.printType) {
memo '_T0:4'
_Main.printType:
_T64 = VTBL <_Sub4>
_T65 = *(_T0 + 0)
_L24:
_T63 = (_T64 == _T65)
if (_T63 != 0) branch _L25
_T65 = *(_T65 + 0)
if (_T65 != 0) branch _L24
_T63 = 0
_L25:
if (_T63 == 0) branch _L26
_T66 = "Sub4\n"
parm _T66
call _PrintString
branch _L27
_L26:
_T68 = VTBL <_Sub3>
_T69 = *(_T0 + 0)
_L28:
_T67 = (_T68 == _T69)
if (_T67 != 0) branch _L29
_T69 = *(_T69 + 0)
if (_T69 != 0) branch _L28
_T67 = 0
_L29:
if (_T67 == 0) branch _L30
_T70 = "Sub3\n"
parm _T70
call _PrintString
branch _L31
_L30:
_T72 = VTBL <_Sub2>
_T73 = *(_T0 + 0)
_L32:
_T71 = (_T72 == _T73)
if (_T71 != 0) branch _L33
_T73 = *(_T73 + 0)
if (_T73 != 0) branch _L32
_T71 = 0
_L33:
if (_T71 == 0) branch _L34
_T74 = "Sub2\n"
parm _T74
call _PrintString
branch _L35
_L34:
_T76 = VTBL <_Sub1>
_T77 = *(_T0 + 0)
_L36:
_T75 = (_T76 == _T77)
if (_T75 != 0) branch _L37
_T77 = *(_T77 + 0)
if (_T77 != 0) branch _L36
_T75 = 0
_L37:
if (_T75 == 0) branch _L38
_T78 = "Sub1\n"
parm _T78
call _PrintString
branch _L39
_L38:
_T80 = VTBL <_Base>
_T81 = *(_T0 + 0)
_L40:
_T79 = (_T80 == _T81)
if (_T79 != 0) branch _L41
_T81 = *(_T81 + 0)
if (_T81 != 0) branch _L40
_T79 = 0
_L41:
if (_T79 == 0) branch _L42
_T82 = "Base\n"
parm _T82
call _PrintString
_L42:
_L39:
_L35:
_L31:
_L27:
}
| 15.76699 | 34 | 0.555829 | 657 | 4,872 | 3.563166 | 0.207002 | 0.050833 | 0.050833 | 0.035882 | 0.041008 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184833 | 0.331486 | 4,872 | 308 | 35 | 15.818182 | 0.533927 | 0 | 0 | 0.111111 | 0 | 0 | 0.022993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b9ffb7c6fff3e245dc8ea1ea786cc6f60c2d4cde | 2,427 | py | Python | generator/cache/cache.py | biarmic/OpenCache | bb9e110e434deb83900de328cc76b63901ba582f | [
"BSD-3-Clause"
] | 5 | 2021-09-15T18:29:49.000Z | 2022-03-26T04:41:01.000Z | generator/cache/cache.py | VLSIDA/OpenCache | 0e79bf353c68d57dcc49d78178b12fd0b468f19a | [
"BSD-3-Clause"
] | null | null | null | generator/cache/cache.py | VLSIDA/OpenCache | 0e79bf353c68d57dcc49d78178b12fd0b468f19a | [
"BSD-3-Clause"
] | null | null | null | # See LICENSE for licensing information.
#
# Copyright (c) 2021 Regents of the University of California and The Board
# of Regents for the Oklahoma Agricultural and Mechanical College
# (acting for and on behalf of Oklahoma State University)
# All rights reserved.
#
import debug
import datetime
from policy import associativity
from globals import OPTS, print_time
class cache:
"""
This is not a design module, but contains a cache design instance.
"""
def __init__(self, cache_config, name):
cache_config.set_local_config(self)
self.name = name
# Import the design module of the cache
if OPTS.associativity == associativity.DIRECT:
from direct_cache import direct_cache as cache
elif OPTS.associativity == associativity.N_WAY:
from n_way_cache import n_way_cache as cache
elif OPTS.associativity == associativity.FULLY:
# TODO: from full_cache import full_cache as cache
debug.error("Fully associative cache is not supported at the moment.", -1)
else:
debug.error("Invalid associativity.", -1)
self.c = cache(cache_config, name)
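# Note: within __init__ the name `cache` refers to the design module imported
# above, shadowing this wrapper class; self.c holds the actual design instance.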
def config_write(self, paths):
""" Save the config files. """
self.c.config_write(paths)
def verilog_write(self, path):
""" Save the Verilog file. """
self.c.verilog_write(path)
def save(self):
""" Save all the output files. """
debug.print_raw("Saving output files...")
# Write the config files
start_time = datetime.datetime.now()
cpaths = {
"data": OPTS.output_path + OPTS.data_array_name + "_config.py",
"tag": OPTS.output_path + OPTS.tag_array_name + "_config.py",
"use": OPTS.output_path + OPTS.use_array_name + "_config.py"
}
if not OPTS.replacement_policy.has_sram_array(): del cpaths["use"]
for cpath in cpaths.values():
debug.print_raw("Config: Writing to {}".format(cpath))
self.config_write(cpaths)
print_time("Config", datetime.datetime.now(), start_time)
# Write the Verilog file
start_time = datetime.datetime.now()
vpath = OPTS.output_path + self.c.name + ".v"
debug.print_raw("Verilog: Writing to {}".format(vpath))
self.verilog_write(vpath)
print_time("Verilog", datetime.datetime.now(), start_time) | 33.246575 | 86 | 0.646477 | 311 | 2,427 | 4.884244 | 0.327974 | 0.013167 | 0.050033 | 0.03555 | 0.134299 | 0.060566 | 0.060566 | 0 | 0 | 0 | 0 | 0.003319 | 0.255047 | 2,427 | 73 | 87 | 33.246575 | 0.836836 | 0.217965 | 0 | 0.051282 | 0 | 0 | 0.107817 | 0 | 0 | 0 | 0 | 0.013699 | 0 | 1 | 0.102564 | false | 0 | 0.153846 | 0 | 0.282051 | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a00d6b8c83e85268bd294d4e512d54f000cfc8a | 2,843 | py | Python | pytype/tests/py2/test_stdlib.py | souravbadami/pytype | 804fa97e7f9208df2711976085a96f756b3949e6 | [
"Apache-2.0"
] | 1 | 2020-04-20T02:55:21.000Z | 2020-04-20T02:55:21.000Z | pytype/tests/py2/test_stdlib.py | doc22940/pytype | 4772ad6fe89f4df75ae3d08e7374f68074175d4a | [
"Apache-2.0"
] | null | null | null | pytype/tests/py2/test_stdlib.py | doc22940/pytype | 4772ad6fe89f4df75ae3d08e7374f68074175d4a | [
"Apache-2.0"
] | null | null | null | """Tests of selected stdlib functions."""
from pytype.tests import test_base
class StdlibTests(test_base.TargetPython27FeatureTest):
"""Tests for files in typeshed/stdlib."""
def testPosix(self):
ty = self.Infer("""
import posix
x = posix.urandom(10)
""")
self.assertTypesMatchPytd(ty, """
posix = ... # type: module
x = ... # type: str
""")
def testXRange(self):
self.Check("""
import random
random.sample(xrange(10), 5)
""")
def testStringTypes(self):
ty = self.Infer("""
import types
if isinstance("", types.StringTypes):
x = 42
if isinstance(False, types.StringTypes):
y = 42
if isinstance(u"", types.StringTypes):
z = 42
""", deep=False)
self.assertTypesMatchPytd(ty, """
types = ... # type: module
x = ... # type: int
z = ... # type: int
""")
def testDefaultDict(self):
self.Check("""
import collections
import itertools
ids = collections.defaultdict(itertools.count(17).next)
""")
def testSysVersionInfoLt(self):
ty = self.Infer("""
import sys
if sys.version_info[0] < 3:
v = 42
else:
v = "hello world"
""")
self.assertTypesMatchPytd(ty, """
sys = ... # type: module
v = ... # type: int
""")
def testSysVersionInfoLe(self):
ty = self.Infer("""
import sys
if sys.version_info[0] <= 2:
v = 42
else:
v = "hello world"
""")
self.assertTypesMatchPytd(ty, """
sys = ... # type: module
v = ... # type: int
""")
def testSysVersionInfoEq(self):
ty = self.Infer("""
import sys
if sys.version_info[0] == 2:
v = 42
elif sys.version_info[0] == 3:
v = "hello world"
else:
v = None
""")
self.assertTypesMatchPytd(ty, """
sys = ... # type: module
v = ... # type: int
""")
def testSysVersionInfoGe(self):
ty = self.Infer("""
import sys
if sys.version_info[0] >= 3:
v = 42
else:
v = "hello world"
""")
self.assertTypesMatchPytd(ty, """
sys = ... # type: module
v = ... # type: str
""")
def testSysVersionInfoGt(self):
ty = self.Infer("""
import sys
if sys.version_info[0] > 2:
v = 42
else:
v = "hello world"
""")
self.assertTypesMatchPytd(ty, """
sys = ... # type: module
v = ... # type: str
""")
def testSysVersionInfoNamedAttribute(self):
ty = self.Infer("""
import sys
if sys.version_info.major == 2:
v = 42
else:
v = "hello world"
""")
self.assertTypesMatchPytd(ty, """
sys: module
v: int
""")
test_base.main(globals(), __name__ == "__main__")
| 21.869231 | 61 | 0.518115 | 299 | 2,843 | 4.866221 | 0.26087 | 0.03299 | 0.054983 | 0.082474 | 0.468729 | 0.439863 | 0.428179 | 0.428179 | 0.428179 | 0.428179 | 0 | 0.020888 | 0.326416 | 2,843 | 129 | 62 | 22.03876 | 0.738903 | 0.024974 | 0 | 0.616071 | 0 | 0 | 0.604853 | 0.03477 | 0 | 0 | 0 | 0 | 0.071429 | 1 | 0.089286 | false | 0 | 0.107143 | 0 | 0.205357 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a04d1fd425aed6effcc3e48e1eb103f0872ab5a | 3,621 | py | Python | libqtile/widget/imapwidget.py | akloster/qtile | bd21d0744e177b8ca01ac129081472577d53ed66 | [
"MIT"
] | 1 | 2021-04-05T07:15:37.000Z | 2021-04-05T07:15:37.000Z | libqtile/widget/imapwidget.py | akloster/qtile | bd21d0744e177b8ca01ac129081472577d53ed66 | [
"MIT"
] | 1 | 2022-02-27T12:17:27.000Z | 2022-02-27T12:17:27.000Z | libqtile/widget/imapwidget.py | akloster/qtile | bd21d0744e177b8ca01ac129081472577d53ed66 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (c) 2015 David R. Andersen
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import imaplib
import re
import keyring
from libqtile.log_utils import logger
from libqtile.widget import base
class ImapWidget(base.ThreadPoolText):
"""Email IMAP widget
This widget will scan one of your imap email boxes and report the number of
unseen messages present. I've configured it to only work with imap with
ssl. Your password is obtained from the Gnome Keyring.
Writing your password to the keyring initially is as simple as (changing
out <userid> and <password> for your userid and password):
1) create the file ~/.local/share/python_keyring/keyringrc.cfg with the
following contents::
[backend]
default-keyring=keyring.backends.Gnome.Keyring
keyring-path=/home/<userid>/.local/share/keyring/
2) Execute the following python shell script once::
#!/usr/bin/env python3
import keyring
user = <userid>
password = <password>
keyring.set_password('imapwidget', user, password)
mbox names must include the path to the mbox (except for the default
INBOX). So, for example if your mailroot is ``~/Maildir``, and you want to
look at the mailbox at HomeMail/fred, the mbox setting would be:
``mbox="~/Maildir/HomeMail/fred"``. Note the nested sets of quotes! Labels
can be whatever you choose, of course.
Widget requirements: keyring_.
.. _keyring: https://pypi.org/project/keyring/
"""
defaults = [
('mbox', '"INBOX"', 'mailbox to fetch'),
('label', 'INBOX', 'label for display'),
('user', None, 'email username'),
('server', None, 'email server name'),
]
def __init__(self, **config):
base.ThreadPoolText.__init__(self, "", **config)
self.add_defaults(ImapWidget.defaults)
password = keyring.get_password('imapwidget', self.user)
if password is not None:
self.password = password
else:
self.password = 'Gnome Keyring Error'
logger.critical('Gnome Keyring Error')
def poll(self):
im = imaplib.IMAP4_SSL(self.server, 993)
if self.password == 'Gnome Keyring Error':
self.text = 'Gnome Keyring Error'
else:
im.login(self.user, self.password)
status, response = im.status(self.mbox, '(UNSEEN)')
self.text = response[0].decode()
self.text = self.label + ': ' + re.sub(r'\).*$', '', re.sub(r'^.*N\s', '', self.text))
im.logout()
return self.text
| 38.521277 | 98 | 0.67219 | 485 | 3,621 | 4.985567 | 0.472165 | 0.036394 | 0.021092 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004681 | 0.233085 | 3,621 | 93 | 99 | 38.935484 | 0.866042 | 0.631041 | 0 | 0.064516 | 0 | 0 | 0.150741 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0.16129 | 0.16129 | 0 | 0.322581 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6a04e4f203740a253735948c968506f6632354e6 | 2,486 | py | Python | game/views/tests/game_definition_view_test.py | dimadk24/english-fight-api | 506a3eb2cb4cb91203b1e023b5248c27975df075 | [
"MIT"
] | null | null | null | game/views/tests/game_definition_view_test.py | dimadk24/english-fight-api | 506a3eb2cb4cb91203b1e023b5248c27975df075 | [
"MIT"
] | null | null | null | game/views/tests/game_definition_view_test.py | dimadk24/english-fight-api | 506a3eb2cb4cb91203b1e023b5248c27975df075 | [
"MIT"
] | null | null | null | from rest_framework.response import Response
from rest_framework.test import APIClient
from game.models import GameDefinition, AppUser
def create_game_definition(api_client: APIClient) -> Response:
return api_client.post("/api/game_definition")
def get_game_definition(api_client: APIClient, game_def_id: str) -> Response:
return api_client.get(f"/api/game_definition/{game_def_id}")
def test_returns_game_def_to_the_current_user_by_hash_id(api_client):
post_game_def_response = create_game_definition(api_client)
assert post_game_def_response.status_code == 201
game_def_id = post_game_def_response.data["id"]
assert isinstance(game_def_id, str)
get_game_def_response = get_game_definition(api_client, game_def_id)
assert get_game_def_response.status_code == 200
assert get_game_def_response.data == post_game_def_response.data
def test_returns_game_def_to_another_user_by_hash_id(api_client):
post_game_def_response = create_game_definition(api_client)
assert post_game_def_response.status_code == 201
game_def_id = post_game_def_response.data["id"]
assert isinstance(game_def_id, str)
user2 = AppUser.objects.create(vk_id=2, username=2)
api_client.force_authenticate(user2)
get_game_def_response = get_game_definition(api_client, game_def_id)
assert get_game_def_response.status_code == 200
assert get_game_def_response.data == post_game_def_response.data
def test_game_def_not_found_by_int_id(api_client):
post_game_def_response = create_game_definition(api_client)
assert post_game_def_response.status_code == 201
game_def_id = post_game_def_response.data["id"]
int_game_def_id = GameDefinition.objects.get(pk=game_def_id).id.id
assert isinstance(int_game_def_id, int)
get_game_def_response = get_game_definition(
api_client, str(int_game_def_id)
)
assert get_game_def_response.status_code == 404
assert get_game_def_response.data == {"detail": "Страница не найдена."}
def test_game_def_permission_denied_if_started(api_client):
post_game_def_response = create_game_definition(api_client)
game_def_id = post_game_def_response.data["id"]
GameDefinition.objects.filter(id=game_def_id).update(started=True)
get_game_def_response = get_game_definition(api_client, game_def_id)
assert get_game_def_response.status_code == 403
assert get_game_def_response.data == {
'detail': 'К игре уже нельзя подключиться'
}
| 35.514286 | 77 | 0.79284 | 382 | 2,486 | 4.657068 | 0.172775 | 0.177066 | 0.210793 | 0.138842 | 0.680157 | 0.639123 | 0.617201 | 0.578977 | 0.578977 | 0.535132 | 0 | 0.011579 | 0.131537 | 2,486 | 69 | 78 | 36.028986 | 0.812413 | 0 | 0 | 0.444444 | 0 | 0 | 0.049879 | 0.013677 | 0 | 0 | 0 | 0 | 0.311111 | 1 | 0.133333 | false | 0 | 0.066667 | 0.044444 | 0.244444 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a0593a2d9f168fbcc460c2d82964c99ec312e4a | 911 | py | Python | mayan/apps/metadata/migrations/0011_auto_20180917_0645.py | prezi/mayan-edms | e9bc10a056c3379b57115c6e83022f48c6298e1d | [
"Apache-2.0"
] | 4 | 2019-02-17T08:35:42.000Z | 2019-03-28T06:02:11.000Z | mayan/apps/metadata/migrations/0011_auto_20180917_0645.py | zhoubear/mayan-edms | e9bc10a056c3379b57115c6e83022f48c6298e1d | [
"Apache-2.0"
] | 1 | 2018-10-11T13:01:34.000Z | 2018-10-11T13:01:34.000Z | mayan/apps/metadata/migrations/0011_auto_20180917_0645.py | prezi/mayan-edms | e9bc10a056c3379b57115c6e83022f48c6298e1d | [
"Apache-2.0"
] | 3 | 2019-01-29T13:21:57.000Z | 2019-10-27T03:20:15.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.11 on 2018-09-17 06:45
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('metadata', '0010_auto_20180823_2353'),
]
operations = [
migrations.AlterField(
model_name='documentmetadata',
name='value',
field=models.CharField(blank=True, db_index=True, help_text='The actual value stored in the metadata type field for the document.', max_length=255, null=True, verbose_name='Value'),
),
migrations.AlterField(
model_name='metadatatype',
name='name',
field=models.CharField(help_text='Name used by other apps to reference this metadata type. Do not use python reserved words, or spaces.', max_length=48, unique=True, verbose_name='Name'),
),
]
| 35.038462 | 199 | 0.657519 | 112 | 911 | 5.196429 | 0.642857 | 0.068729 | 0.085911 | 0.099656 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055874 | 0.233809 | 911 | 25 | 200 | 36.44 | 0.777937 | 0.075741 | 0 | 0.222222 | 1 | 0.055556 | 0.293206 | 0.027414 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.277778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a0b84b7b59fd4b039d379ec665100c80b070e0d | 1,347 | py | Python | 2. Add Two Numbers DC(12-1-21).py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | 4 | 2020-11-17T05:24:24.000Z | 2021-06-14T21:01:45.000Z | 2. Add Two Numbers DC(12-1-21).py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | null | null | null | 2. Add Two Numbers DC(12-1-21).py | Dharaneeshwar/Leetcode | cc3ed07f6ac5f4d6e3f60c57a94a06a8be2f5287 | [
"MIT"
] | null | null | null | # Time Complexity - O(n) ; Space Complexity - O(n)
class Solution:
def addTwoNumbers(self, l1: ListNode, l2: ListNode) -> ListNode:
carry = 0
out = temp = ListNode()
while l1 is not None and l2 is not None:
tempsum = l1.val + l2.val
tempsum += carry
if tempsum > 9:
carry = tempsum//10
tempsum %= 10
else:
carry = 0
temp.next = ListNode(tempsum)
temp = temp.next
l1 = l1.next
l2 = l2.next
if l1:
while l1:
tempsum = l1.val + carry
if tempsum > 9:
carry = tempsum//10
tempsum %= 10
else:
carry = 0
temp.next = ListNode(tempsum)
temp = temp.next
l1 = l1.next
elif l2:
while l2:
tempsum = l2.val + carry
if tempsum > 9:
carry = tempsum//10
tempsum %= 10
else:
carry = 0
temp.next = ListNode(tempsum)
temp = temp.next
l2 = l2.next
if carry:
temp.next = ListNode(carry)
return out.next | 31.325581 | 76 | 0.400148 | 131 | 1,347 | 4.114504 | 0.236641 | 0.103896 | 0.118738 | 0.083488 | 0.539889 | 0.502783 | 0.502783 | 0.502783 | 0.502783 | 0.502783 | 0 | 0.061129 | 0.526355 | 1,347 | 43 | 77 | 31.325581 | 0.783699 | 0.035635 | 0 | 0.634146 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02439 | false | 0 | 0 | 0 | 0.073171 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a0e57de9c3d93fdc79f1a9d3f94690a6652bf6e | 989 | py | Python | wrt/wrt-manifest-tizen-tests/const.py | linshen/crosswalk-test-suite | e206b2c35fc09e583f3202fc7fc8a656c8e2b5de | [
"BSD-3-Clause"
] | null | null | null | wrt/wrt-manifest-tizen-tests/const.py | linshen/crosswalk-test-suite | e206b2c35fc09e583f3202fc7fc8a656c8e2b5de | [
"BSD-3-Clause"
] | null | null | null | wrt/wrt-manifest-tizen-tests/const.py | linshen/crosswalk-test-suite | e206b2c35fc09e583f3202fc7fc8a656c8e2b5de | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
import sys, os
import itertools, shutil
path = os.path.abspath(__file__)
path = os.path.split(path)[0]
os.chdir(path)
print path
device_ssh_ip = ""
ssh_device = device_ssh_ip.split(",")
path_tcs = path + "/tcs"
path_result= path + "/result"
path_allpairs = path + "/allpairs"
path_resource = path + "/resource"
seed_file = path_allpairs + "/positive/input_seed.txt"
seed_negative = path_allpairs + "/negative"
seed_positive =path_allpairs + "/positivee"
seed_file_na = seed_negative + "/input_seed_negative.txt"
selfcomb_file = path_allpairs + "/selfcomb.txt"
output_file = path_allpairs + "/output.txt"
output_file_ne = path_allpairs + "/output_negative.txt"
report_path = path + "/report"
report_file = report_path + "/wrt-manifest-tizen-tests.xml"
report_summary_file = report_path + "/summary.xml"
sh_path = path + "/script"
log_path = report_path + "/log_"
device_path = "/home/app/content/tct/"
run_times = 3
version="6.35.1.2"
name="wrt-manifest-tizen-tests"
| 31.903226 | 59 | 0.743175 | 146 | 989 | 4.726027 | 0.383562 | 0.13913 | 0.069565 | 0.06087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007964 | 0.111223 | 989 | 30 | 60 | 32.966667 | 0.777019 | 0.020222 | 0 | 0 | 0 | 0 | 0.26343 | 0.127066 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.071429 | null | null | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a186a13afeea2c9ca39fb78982684eb10c871db | 3,784 | py | Python | bench_fastapi/authentication/controllers/login.py | sharkguto/teste_carga | 56d6e9dcbd3e7b7fe7295d8fcf4b4e8b84943cfb | [
"MIT"
] | 1 | 2021-10-14T07:27:47.000Z | 2021-10-14T07:27:47.000Z | bench_fastapi/authentication/controllers/login.py | sharkguto/teste_carga | 56d6e9dcbd3e7b7fe7295d8fcf4b4e8b84943cfb | [
"MIT"
] | 4 | 2019-08-06T02:26:32.000Z | 2021-06-10T21:39:19.000Z | bench_fastapi/authentication/controllers/login.py | sharkguto/teste_carga | 56d6e9dcbd3e7b7fe7295d8fcf4b4e8b84943cfb | [
"MIT"
] | 1 | 2018-05-11T18:04:41.000Z | 2018-05-11T18:04:41.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# login.py
# @Author : Gustavo Freitas (gustavo@gmf-tech.com)
# @Link :
# @Date : 12/12/2019, 11:43:07 AM
from typing import Optional, Any
from fastapi import APIRouter, Body, Depends, HTTPException
from fastapi import Header, Security
from authentication.models.users import User
from fastapi.security import HTTPBasic, HTTPBasicCredentials, APIKeyHeader
from typing import List
from starlette.responses import Response
from fastapi.encoders import jsonable_encoder
from authentication.interfaces.database import database
import jwt
from starlette.status import HTTP_400_BAD_REQUEST, HTTP_401_UNAUTHORIZED
from datetime import datetime, timedelta
from hashlib import sha256
from authentication.interfaces.token import verify_token
router = APIRouter()
security = HTTPBasic(auto_error=True)
api_key = APIKeyHeader(name="x-api-key", auto_error=True)
@router.post("/login", tags=["token"])
async def verify_login(
response: Response,
user: dict = Depends(verify_token),
x_api_key: str = Header(None),
):
response.headers["x-api-key"] = x_api_key
return {"verified": True, "user": user["email"]}
@router.put("/login", tags=["token"])
async def renew_token(response: Response, user: dict = Depends(verify_token)):
sql = """UPDATE users.tbl_users
SET token = :token WHERE
id = :id"""
token = f"{user['pwd_updated_at']}-{user['email']}-{datetime.now()}"
mhash = sha256(token.encode("utf-8"))
token = mhash.hexdigest()
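# The stored SHA-256 digest doubles as the per-user JWT signing secret; the
# token encoded below carries the user payload plus an 8-hour expiry.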
await database.execute(query=sql, values={"id": user["id"], "token": token})
response.headers["x-api-key"] = jwt.encode(
{**user, **dict(exp=(datetime.now() + timedelta(hours=8)))},
token,
algorithm="HS256",
).decode()
return {"renew": True}
# @router.post("/login", dependencies=[Depends(verify_token)])
# async def renew_token(x_api_key: str = Header(None)):
# return {"ok": x_api_key}
@router.get(
"/login", response_model=User, tags=["auth"], response_model_exclude_unset=True
)
async def login_basic(
response: Response, authorization: HTTPBasicCredentials = Security(security)
):
sql = """SELECT tu.id, tu.email, tu."name", tu.linkedin_id , tu.pwd_updated_at
FROM users.tbl_users tu
WHERE tu.passwd is NOT NULL
AND tu.passwd = crypt(:secret,tu.passwd)
AND tu.email = :email
AND tu.enabled = true """
users = await database.fetch_one(
query=sql,
values={"email": authorization.username, "secret": authorization.password},
)
if not users:
raise HTTPException(status_code=HTTP_401_UNAUTHORIZED)
user = jsonable_encoder(users)
sql = """SELECT tp.acl_profile as profile
FROM users.tbl_users tu inner join
users.tbl_profile_users tpu on tpu.id_users = tu.id inner join
users.tbl_profile tp on tp.id = tpu.id_profile
WHERE tu.passwd is NOT NULL
AND tu.passwd = crypt(:secret,tu.passwd)
AND tu.email = :email"""
profiles = await database.fetch_all(
query=sql,
values={"email": authorization.username, "secret": authorization.password},
)
if not profiles:
raise HTTPException(status_code=HTTP_401_UNAUTHORIZED)
user["acl"] = jsonable_encoder(profiles)
sql = """UPDATE users.tbl_users
SET token = :token WHERE
id = :id"""
token = f"{user['pwd_updated_at']}-{authorization.username}-{datetime.now()}"
mhash = sha256(token.encode("utf-8"))
token = mhash.hexdigest()
await database.execute(query=sql, values={"id": user["id"], "token": token})
response.headers["x-api-key"] = jwt.encode(
{**user, **dict(exp=(datetime.now() + timedelta(hours=8)))},
token,
algorithm="HS256",
).decode()
return user
| 29.795276 | 83 | 0.681818 | 496 | 3,784 | 5.08871 | 0.294355 | 0.021395 | 0.022187 | 0.021395 | 0.493265 | 0.443344 | 0.443344 | 0.425515 | 0.385103 | 0.385103 | 0 | 0.014815 | 0.17944 | 3,784 | 126 | 84 | 30.031746 | 0.798068 | 0.075846 | 0 | 0.395349 | 0 | 0 | 0.267355 | 0.055651 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.069767 | 0.162791 | 0 | 0.197674 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6a1f7efcf406b9bcc9bc35cc271b47eed9db309f | 7,998 | py | Python | mod_core.py | nokia-wroclaw/innovativeproject-dbshepherd | f82f3b36caaf9fcd6d28076051cb92458ba2edd3 | [
"MIT"
] | null | null | null | mod_core.py | nokia-wroclaw/innovativeproject-dbshepherd | f82f3b36caaf9fcd6d28076051cb92458ba2edd3 | [
"MIT"
] | null | null | null | mod_core.py | nokia-wroclaw/innovativeproject-dbshepherd | f82f3b36caaf9fcd6d28076051cb92458ba2edd3 | [
"MIT"
] | 1 | 2020-02-05T20:02:15.000Z | 2020-02-05T20:02:15.000Z | import re
import os
import cmd
import sys
import common
from getpass import getpass
from kp import KeePassError, get_password
from configmanager import ConfigManager, ConfigManagerError
common.init()
class ParseArgsException(Exception):
def __init__(self, msg):
super().__init__(msg)
self.msg = msg
class ModuleCore(cmd.Cmd):
def __init__(self, module = ''):
cmd.Cmd.__init__(self)
self.master = None
if module == '#':
self.prompt_sign = '#>'
elif module != '':
self.prompt_sign = '[' + module + ']>'
else:
self.prompt_sign = '->'
#defaults
self.ruler = '-'
self.warn = False # do_warn() reads this attribute; initialize it so a status query works before it is first set
#Completions
self.directories = []
self.file_server_database = []
self.file_server = []
self.do_cd('.')
configs = ConfigManager().get_config_list()
for conf in configs:
self.file_server_database.append(conf)
self.file_server.append(conf)
for srv in ConfigManager('config/' + conf + '.yaml').get_all():
self.file_server_database.append(conf + '.' + srv)
self.file_server.append(conf + '.' + srv)
for db in ConfigManager('config/' + conf + '.yaml').get(srv)['databases']:
self.file_server_database.append(conf + '.' + srv + '.' + db)
def precmd(self, line):
if not sys.stdin.isatty():
print(line)
return line
def postcmd(self, stop, line):
if not sys.stdin.isatty():
print("")
return stop
def parse_args(self, string="", n=0, m=0):
tokens = re.findall('"+.*"+|[a-zA-Z0-9!@#$%^&*()_+-,./<>?]+', string)
arg_counter = len(tokens)
if (n <= arg_counter <= m) or (arg_counter == n and m == 0) or n == 0:
r_list = []
for token in tokens:
r_list.append(token.replace('"', ''))
return (r_list, arg_counter)
else:
raise ParseArgsException("Incorrect number of arguments")
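# Example (illustrative, not from the original source):
# self.parse_args('backup "my file" now') returns (['backup', 'my file', 'now'], 3);
# surrounding quotes are stripped, and n=0 disables the argument-count check.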
# executes the given function (callback) on all databases
def exec_on_config(self, callback, args, values, view = ''): # link - file.server.database
if values == '': # execute on all config files
files = ConfigManager().get_config_list() # fetch the list of configuration files
# show what the command will run on
print("Exec on:")
for file in files:
print('+-',file)
ans = input("Are you sure? [NO/yes/info]: ")
if ans == "yes": #wykonaj callback
for file in files:
if view == 'tree': print('+-', file)
try:
servers = ConfigManager("config/" + file + ".yaml").get_all()
for srv in servers:
if view == 'tree': print("| +-", srv)
databases = servers[srv]["databases"]
for db in databases:
if view == 'tree': print("| | +-", db)
if view == 'list': print('[', file, '->', srv, '->', db, ']')
callback(file, srv, db, *args)
except ConfigManagerError as e:
print(e)
elif ans == "info": #podaj tylko informację na czym callback zostałby wykonany
for file in files:
print('+-', file)
servers = ConfigManager("config/" + file + ".yaml").get_all()
for srv in servers:
print('| +-', srv)
databases = servers[srv]["databases"]
for db in databases:
print('| | +-', db)
else: # if we decide not to execute anything
print("aborted")
else: # a specific target was given
val = values.split('.') # split into file_name.server.database
params = len(val)
if params == 1: # only a file name was given - execute on every server and database defined in it
file = val[0]
try:
servers = ConfigManager("config/" + file + ".yaml").get_all()
for srv in servers:
if view == 'tree': print("+-", srv)
databases = servers[srv]["databases"]
for db in databases:
if view == 'tree': print("| +-", db)
if view == 'list': print('[', srv, '->', db, ']')
callback(file, srv, db, *args)
except ConfigManagerError as e:
print(e)
except KeyError as e:
print(e, "is not exist")
elif params == 2: # a file name and a server were given - execute on every database on that server
file = val[0]
try:
servers = ConfigManager("config/" + file + ".yaml").get_all()
srv = val[1]
databases = servers[srv]["databases"]
for db in databases:
if view == 'tree': print("+-", db)
if view == 'list': print('[', db, ']')
callback(file, srv, db, *args)
except ConfigManagerError as e:
print(e)
except KeyError as e:
print(e, "is not exist")
elif params == 3: # file name, server and database given - execute exactly on that database
try:
callback(val[0], val[1], val[2], *args)
except ConfigManagerError as e:
print(e)
except KeyError as e:
print(e, "is not exist")
# returns a shortened path to the current directory - helper function
def get_shortpath(self):
path = common.get_cdir()
separator = ''
if '\\' in path:
separator = '\\'
else:
separator = '/'
start = path.find(separator)
end = path.rfind(separator, 0, len(path)-1)
if start < end:
return (path[0:start+1] + '...' + path[end:])
else:
return (path)
# cmd autocompletion for the cd command
def complete_cd(self, text, line, begidx, endidx):
if not text:
completions = self.directories[:]
else:
completions = [f for f in self.directories if f.startswith(text)]
return completions
# cd command - allows moving between directories
def do_cd(self, args):
"Move to directory"
if args == '':
print(common.get_cdir())
else:
try:
common.chdir(args)
self.prompt = self.get_shortpath() + ' ' + self.prompt_sign
self.directories = []
for name in os.listdir(common.get_cdir()):
if os.path.isdir(os.path.join(common.get_cdir(), name)):
self.directories.append(name)
except FileNotFoundError as e:
print(e)
# lists all files in the current location
def do_ls(self, args):
"List directory"
for name in os.listdir(common.get_cdir()):
print(name)
# prints the full path of the current directory
def do_pwd(self, args):
"Print path"
print(common.get_cdir())
# lets the user decide whether warnings should be displayed
def do_warn(self, args):
"""warn <on/off>"""
try:
(values, values_num) = self.parse_args(args, 0, 1)
if values_num == 1:
if values[0] == 'on':
print('Warnings on')
self.warn = True
elif values[0] == 'off':
print('Warnings off')
self.warn = False
else:
print('Incorrect argument.')
else:
if self.warn:
print('Status: on')
else:
print('Status: off')
except ParseArgsException as e:
print(e)
# sets the master password for KeePass
def do_setMaster(self,args):
"Set master password"
if sys.stdin.isatty(): # if running as an interactive shell
p = getpass('Enter Master Password: ')
else:
p = sys.stdin.readline().rstrip()
self.master = p
def do_exit(self, *args):
return True
def do_EOF(self, line):
return True
def emptyline(self):
return False
# We must catch everything possible (missing file, wrong master password, etc.) and raise a single exception
def get_password(self, alias):
keepass_path = common.keepass_path
if self.master is None:
raise KeePassError("Master Password Not Set")
try:
return get_password(keepass_path, self.master, alias)
except KeePassError as e:
raise e
def connect_command_builder(self,connection, perm):
try:
command = connection["adress"] + "_" + connection["user"]+ "_" + \
self.get_password(connection["keepass"]) + "_" + str(connection["sshport"]) + "_" + str(connection["remoteport"]) + "_" + perm
except (KeyError, KeePassError) as e1:
try:
command = connection["adress"] + "_" + connection["user"]+ "_" + \
connection["passwd"] + "_" + str(connection["sshport"]) + "_" + str(connection["remoteport"]) + "_" + perm
return command
except KeyError as e2:
if isinstance(e1,KeePassError):
raise KeePassError("Unable to use Keepass(" + e1.value + ") or Password")
else:
raise KeePassError("Invalid connection in yaml file")
return command | 29.512915 | 132 | 0.635159 | 1,039 | 7,998 | 4.805582 | 0.256015 | 0.006008 | 0.01442 | 0.016223 | 0.286 | 0.276387 | 0.231925 | 0.187863 | 0.175446 | 0.175446 | 0 | 0.0048 | 0.218555 | 7,998 | 271 | 133 | 29.512915 | 0.79408 | 0.147162 | 0 | 0.348416 | 0 | 0 | 0.102377 | 0.005542 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081448 | false | 0.072398 | 0.036199 | 0.013575 | 0.180995 | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6a25b6baad0282a34406b60b6191667dfe9a128b | 13,698 | py | Python | ragweed/framework.py | soumyakoduri/ragweed | 7d4a729ff761fe1ca073b7ceade46acf1321e9fd | [
"MIT"
] | null | null | null | ragweed/framework.py | soumyakoduri/ragweed | 7d4a729ff761fe1ca073b7ceade46acf1321e9fd | [
"MIT"
] | null | null | null | ragweed/framework.py | soumyakoduri/ragweed | 7d4a729ff761fe1ca073b7ceade46acf1321e9fd | [
"MIT"
] | null | null | null | import sys
import os
import boto
import boto.s3.connection
import json
import inspect
import pickle
import bunch
import yaml
import ConfigParser
import rados
from boto.s3.key import Key
from nose.plugins.attrib import attr
from nose.tools import eq_ as eq
from .reqs import _make_admin_request
ragweed_env = None
suite = None
class RGWConnection:
def __init__(self, access_key, secret_key, host, port, is_secure):
self.host = host
self.port = port
self.is_secure = is_secure
self.conn = boto.connect_s3(
aws_access_key_id = access_key,
aws_secret_access_key = secret_key,
host=host,
port=port,
is_secure=is_secure,
calling_format = boto.s3.connection.OrdinaryCallingFormat(),
)
def create_bucket(self, name):
return self.conn.create_bucket(name)
def get_bucket(self, name, validate=True):
return self.conn.get_bucket(name, validate=validate)
class RGWRESTAdmin:
def __init__(self, connection):
self.conn = connection
def get_resource(self, path, params):
r = _make_admin_request(self.conn, "GET", path, params)
if r.status != 200:
raise boto.exception.S3ResponseError(r.status, r.reason)
return bunch.bunchify(json.loads(r.read()))
def read_meta_key(self, key):
return self.get_resource('/admin/metadata', {'key': key})
def get_bucket_entrypoint(self, bucket_name):
return self.read_meta_key('bucket:' + bucket_name)
def get_bucket_instance_info(self, bucket_name, bucket_id = None):
if not bucket_id:
ep = self.get_bucket_entrypoint(bucket_name)
print ep
bucket_id = ep.data.bucket.bucket_id
result = self.read_meta_key('bucket.instance:' + bucket_name + ":" + bucket_id)
return result.data.bucket_info
def check_bucket_index(self, bucket_name):
return self.get_resource('/admin/bucket',{'index' : None, 'bucket':bucket_name})
def get_obj_layout(self, key):
path = '/' + key.bucket.name + '/' + key.name
params = {'layout': None}
if key.version_id is not None:
params['versionId'] = key.version_id
print params
return self.get_resource(path, params)
def get_zone_params(self):
return self.get_resource('/admin/config', {'type': 'zone'})
class RSuite:
def __init__(self, name, bucket_prefix, zone, suite_step):
self.name = name
self.bucket_prefix = bucket_prefix
self.zone = zone
self.config_bucket = None
self.rtests = []
self.do_preparing = False
self.do_check = False
for step in suite_step.split(','):
if step == 'prepare':
self.do_preparing = True
self.config_bucket = self.zone.create_raw_bucket(self.get_bucket_name('conf'))
if step == 'check' or step == 'test':
self.do_check = True
self.config_bucket = self.zone.get_raw_bucket(self.get_bucket_name('conf'))
def get_bucket_name(self, suffix):
return self.bucket_prefix + '-' + suffix
def register_test(self, t):
self.rtests.append(t)
def write_test_data(self, test):
k = Key(self.config_bucket)
k.key = 'tests/' + test._name
k.set_contents_from_string(test.to_json())
def read_test_data(self, test):
k = Key(self.config_bucket)
k.key = 'tests/' + test._name
s = k.get_contents_as_string()
print 'read_test_data=', s
test.from_json(s)
def is_preparing(self):
return self.do_preparing
def is_checking(self):
return self.do_check
class RTestJSONSerialize(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, (list, dict, str, unicode, int, float, bool, type(None))):
return json.JSONEncoder.default(self, obj)
return {'__pickle': pickle.dumps(obj)}
def rtest_decode_json(d):
if '__pickle' in d:
return pickle.loads(str(d['__pickle']))
return d
class RPlacementRule:
def __init__(self, rule):
r = rule.split('/', 1)
self.placement_id = r[0]
if (len(r) == 2):
self.storage_class=r[1]
else:
self.storage_class = 'STANDARD'
class RBucket:
def __init__(self, zone, bucket, bucket_info):
self.zone = zone
self.bucket = bucket
self.name = bucket.name
self.bucket_info = bucket_info
try:
self.placement_rule = RPlacementRule(self.bucket_info.placement_rule)
self.placement_target = self.zone.get_placement_target(self.bucket_info.placement_rule)
except:
pass
def get_data_pool(self):
try:
# old style explicit pool
explicit_pool = self.bucket_info.bucket.pool
except:
# new style explicit pool
explicit_pool = self.bucket_info.bucket.explicit_placement.data_pool
if explicit_pool is not None and explicit_pool != '':
return explicit_pool
return self.placement_target.get_data_pool(self.placement_rule)
def get_tail_pool(self, obj_layout):
try:
placement_rule = obj_layout.manifest.tail_placement.placement_rule
except:
placement_rule = ''
if placement_rule == '':
try:
# new style
return obj_layout.manifest.tail_placement.bucket.explicit_placement.data_pool
except:
pass
try:
# old style
return obj_layout.manifest.tail_bucket.pool
except:
pass
pr = RPlacementRule(placement_rule)
return self.placement_target.get_data_pool(pr)
class RStorageClasses:
def __init__(self, config):
if hasattr(config, 'storage_classes'):
self.storage_classes = config.storage_classes
else:
try:
self.storage_classes = bunch.bunchify({ 'STANDARD': { 'data_pool': config.data_pool }})
except:
self.storage_classes = None
pass
def get(self, storage_class):
assert self.storage_classes is not None
try:
if not storage_class:
storage_class = 'STANDARD'
sc = self.storage_classes[storage_class]
except:
eq('could not find storage class ' + storage_class, 0)
return sc
def get_all(self):
for (name, _) in self.storage_classes.iteritems():
yield name
class RPlacementTarget:
def __init__(self, name, config):
self.name = name
self.index_pool = config.index_pool
self.data_extra_pool = config.data_extra_pool
self.storage_classes = RStorageClasses(config)
if not self.data_extra_pool:
self.data_extra_pool = self.storage_classes.get('STANDARD').data_pool
def get_data_pool(self, placement_rule):
return self.storage_classes.get(placement_rule.storage_class).data_pool
class RZone:
def __init__(self, conn):
self.conn = conn
self.rgw_rest_admin = RGWRESTAdmin(self.conn.system)
self.zone_params = self.rgw_rest_admin.get_zone_params()
self.placement_targets = {}
for e in self.zone_params.placement_pools:
self.placement_targets[e.key] = e.val
print 'zone_params:', self.zone_params
def get_placement_target(self, placement_id):
plid = placement_id
if placement_id is None or placement_id == '':
print 'zone_params=', self.zone_params
plid = self.zone_params.default_placement
try:
return RPlacementTarget(plid, self.placement_targets[plid])
except:
pass
return None
def get_default_placement(self):
return self.get_placement_target(self.zone_params.default_placement)
def create_bucket(self, name):
bucket = self.create_raw_bucket(name)
bucket_info = self.rgw_rest_admin.get_bucket_instance_info(bucket.name)
print 'bucket_info:', bucket_info
return RBucket(self, bucket, bucket_info)
def get_bucket(self, name):
bucket = self.get_raw_bucket(name)
bucket_info = self.rgw_rest_admin.get_bucket_instance_info(bucket.name)
print 'bucket_info:', bucket_info
return RBucket(self, bucket, bucket_info)
def create_raw_bucket(self, name):
return self.conn.regular.create_bucket(name)
def get_raw_bucket(self, name):
return self.conn.regular.get_bucket(name)
def refresh_rbucket(self, rbucket):
rbucket.bucket = self.get_raw_bucket(rbucket.bucket.name)
rbucket.bucket_info = self.rgw_rest_admin.get_bucket_instance_info(rbucket.bucket.name)
class RTest:
def __init__(self):
self._name = self.__class__.__name__
self.r_buckets = []
self.init()
def create_bucket(self):
bid = len(self.r_buckets) + 1
bucket_name = suite.get_bucket_name(self._name + '.' + str(bid))
bucket_name = bucket_name.replace("_", "-")
rb = suite.zone.create_bucket(bucket_name)
self.r_buckets.append(rb)
return rb
def get_buckets(self):
for rb in self.r_buckets:
yield rb
def init(self):
pass
def prepare(self):
pass
def check(self):
pass
def to_json(self):
attrs = {}
for x in dir(self):
if x.startswith('r_'):
attrs[x] = getattr(self, x)
return json.dumps(attrs, cls=RTestJSONSerialize)
def from_json(self, s):
j = json.loads(s, object_hook=rtest_decode_json)
for e in j:
setattr(self, e, j[e])
def save(self):
suite.write_test_data(self)
def load(self):
suite.read_test_data(self)
for rb in self.r_buckets:
suite.zone.refresh_rbucket(rb)
def test(self):
suite.register_test(self)
if suite.is_preparing():
self.prepare()
self.save()
if suite.is_checking():
self.load()
self.check()
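# --- Illustrative sketch (hypothetical test, not part of the original module) ---
# A concrete check subclasses RTest, writes state in prepare() (persisted via
# save()) and verifies it in check() after load():
#
# class r_test_small_obj(RTest):
#     def prepare(self):
#         rb = self.create_bucket()
#         k = Key(rb.bucket)
#         k.key = 'obj1'
#         k.set_contents_from_string('small object')
#
#     def check(self):
#         for rb in self.get_buckets():
#             eq(rb.bucket.get_key('obj1').get_contents_as_string(), 'small object')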
def read_config(fp):
config = bunch.Bunch()
g = yaml.safe_load_all(fp)
for new in g:
print bunch.bunchify(new)
config.update(bunch.bunchify(new))
return config
str_config_opts = [
'user_id',
'access_key',
'secret_key',
'host',
'ceph_conf',
'bucket_prefix',
]
int_config_opts = [
'port',
]
bool_config_opts = [
'is_secure',
]
def dict_find(d, k):
if d.has_key(k):
return d[k]
return None
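# RAGWEED_CONF points at an INI-style file; an illustrative minimal layout,
# with section and option names inferred from the parsing code below:
#
# [rgw]
# host = localhost
# port = 8000
# bucket_prefix = ragweed
#
# [user regular]
# access_key = ...
# secret_key = ...
#
# [user system]
# access_key = ...
# secret_key = ...
#
# [rados]
# ceph_conf = /etc/ceph/ceph.conf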
class RagweedEnv:
def __init__(self):
self.config = bunch.Bunch()
cfg = ConfigParser.RawConfigParser()
try:
path = os.environ['RAGWEED_CONF']
except KeyError:
raise RuntimeError(
'To run tests, point environment '
+ 'variable RAGWEED_CONF to a config file.',
)
with file(path) as f:
cfg.readfp(f)
for section in cfg.sections():
try:
(section_type, name) = section.split(None, 1)
if not self.config.has_key(section_type):
self.config[section_type] = bunch.Bunch()
self.config[section_type][name] = bunch.Bunch()
cur = self.config[section_type]
except ValueError:
section_type = ''
name = section
self.config[name] = bunch.Bunch()
cur = self.config
cur[name] = bunch.Bunch()
for var in str_config_opts:
try:
cur[name][var] = cfg.get(section, var)
except ConfigParser.NoOptionError:
pass
for var in int_config_opts:
try:
cur[name][var] = cfg.getint(section, var)
except ConfigParser.NoOptionError:
pass
for var in bool_config_opts:
try:
cur[name][var] = cfg.getboolean(section, var)
except ConfigParser.NoOptionError:
pass
print json.dumps(self.config)
rgw_conf = self.config.rgw
try:
self.bucket_prefix = rgw_conf.bucket_prefix
except:
self.bucket_prefix = 'ragweed'
conn = bunch.Bunch()
for (k, u) in self.config.user.iteritems():
conn[k] = RGWConnection(u.access_key, u.secret_key, rgw_conf.host, dict_find(rgw_conf, 'port'), dict_find(rgw_conf, 'is_secure'))
self.zone = RZone(conn)
self.suite = RSuite('ragweed', self.bucket_prefix, self.zone, os.environ['RAGWEED_STAGES'])
try:
self.ceph_conf = self.config.rados.ceph_conf
except:
raise RuntimeError(
'ceph_conf is missing under the [rados] section in ' + os.environ['RAGWEED_CONF']
)
self.rados = rados.Rados(conffile=self.ceph_conf)
self.rados.connect()
pools = self.rados.list_pools()
for pool in pools:
print "rados pool>", pool
def setup_module():
global ragweed_env
global suite
ragweed_env = RagweedEnv()
suite = ragweed_env.suite
| 29.649351 | 141 | 0.593663 | 1,649 | 13,698 | 4.681019 | 0.142511 | 0.034979 | 0.015676 | 0.010364 | 0.268169 | 0.176577 | 0.123202 | 0.090038 | 0.080192 | 0.053763 | 0 | 0.001591 | 0.311651 | 13,698 | 461 | 142 | 29.713666 | 0.817054 | 0.004891 | 0 | 0.181564 | 0 | 0 | 0.043006 | 0 | 0 | 0 | 0 | 0 | 0.002793 | 0 | null | null | 0.030726 | 0.041899 | null | null | 0.027933 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a2bc3b1189d8cb91dbce9466649429945439058 | 1,127 | py | Python | ecommerce-website/orders/admin.py | Shanu85/FCS_Project | def3437d58b4d2ff00e26c0a5ca769af66eccfad | [
"MIT"
] | null | null | null | ecommerce-website/orders/admin.py | Shanu85/FCS_Project | def3437d58b4d2ff00e26c0a5ca769af66eccfad | [
"MIT"
] | null | null | null | ecommerce-website/orders/admin.py | Shanu85/FCS_Project | def3437d58b4d2ff00e26c0a5ca769af66eccfad | [
"MIT"
] | 1 | 2022-01-03T13:40:11.000Z | 2022-01-03T13:40:11.000Z | from django.contrib import admin
from .models import Order, receiverInfo
@admin.register(Order)
class OrderAdmin(admin.ModelAdmin):
date_hierarchy = 'created_at'
list_display = ('user', 'code', 'total_price', 'shipping_status', 'created_at')
list_display_links = ('user',)
list_editable = ('shipping_status',)
list_filter = ('shipping_status', 'payment_mode', 'created_at')
list_per_page = 25
search_fields = ('user__phone_number', 'user__email', 'code')
readonly_fields = ('user','cart', 'receiver', 'payment_mode', 'shipping_status', 'code')
def total_price(self, obj):
return obj.cart.total_price
def has_add_permission(self, request):
return False
@admin.register(receiverInfo)
class receiverInfoAdmin(admin.ModelAdmin):
date_hierarchy = 'created_at'
list_display = ('id', 'full_name', 'phone_number', 'address', 'created_at')
list_display_links = ('id', 'full_name')
list_filter = ('created_at',)
list_per_page = 25
search_fields = ('full_name', 'phone_number', 'address')
readonly_fields = ('full_name', 'phone_number', 'address')
| 35.21875 | 92 | 0.696539 | 136 | 1,127 | 5.419118 | 0.397059 | 0.07327 | 0.105834 | 0.108548 | 0.404342 | 0.301221 | 0.222524 | 0.222524 | 0 | 0 | 0 | 0.004228 | 0.160603 | 1,127 | 31 | 93 | 36.354839 | 0.774841 | 0 | 0 | 0.16 | 0 | 0 | 0.281278 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.08 | 0.08 | 0.92 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6a315f9411feef2bef3f2cfb2fab79f19fe80e02 | 7,842 | py | Python | weaver/wps_restapi/quotation/quotes.py | crim-ca/weaver | 107fec5e19f20b77061b9405a764da911d2db8a2 | [
"Apache-2.0"
] | 16 | 2019-03-18T12:23:05.000Z | 2022-02-25T00:39:11.000Z | weaver/wps_restapi/quotation/quotes.py | crim-ca/weaver | 107fec5e19f20b77061b9405a764da911d2db8a2 | [
"Apache-2.0"
] | 346 | 2019-03-06T21:05:04.000Z | 2022-03-31T13:38:37.000Z | weaver/wps_restapi/quotation/quotes.py | crim-ca/weaver | 107fec5e19f20b77061b9405a764da911d2db8a2 | [
"Apache-2.0"
] | 5 | 2019-03-15T01:38:28.000Z | 2021-11-11T15:38:43.000Z | import logging
import random
from datetime import timedelta
from typing import TYPE_CHECKING
from duration import to_iso8601
from pyramid.httpexceptions import HTTPBadRequest, HTTPCreated, HTTPNotFound, HTTPOk
from weaver import sort
from weaver.config import WEAVER_CONFIGURATION_ADES, WEAVER_CONFIGURATION_EMS, get_weaver_configuration
from weaver.database import get_db
from weaver.datatype import Bill, Quote
from weaver.exceptions import ProcessNotFound, QuoteNotFound, log_unhandled_exceptions
from weaver.formats import OUTPUT_FORMAT_JSON
from weaver.processes.types import PROCESS_APPLICATION, PROCESS_WORKFLOW
from weaver.processes.wps_package import get_package_workflow_steps, get_process_location
from weaver.store.base import StoreBills, StoreQuotes
from weaver.utils import get_settings, get_weaver_url
from weaver.wps_restapi import swagger_definitions as sd
from weaver.wps_restapi.processes.processes import submit_local_job
if TYPE_CHECKING:
from weaver.datatype import Process
from weaver.typedefs import JSON
LOGGER = logging.getLogger(__name__)
def process_quote_estimator(process): # noqa: E811
# type: (Process) -> JSON
"""
Simulate quote parameters for the process execution.
:param process: instance of :class:`weaver.datatype.Process` for which to evaluate the quote.
:return: dict of {price, currency, estimatedTime} values for the process quote.
"""
# TODO: replace by some fancy ml technique or something?
price = random.uniform(0, 10) # nosec
currency = "CAD"
estimated_time = to_iso8601(timedelta(minutes=random.uniform(5, 60))) # nosec
return {"price": price, "currency": currency, "estimatedTime": estimated_time}
@sd.process_quotes_service.post(tags=[sd.TAG_BILL_QUOTE, sd.TAG_PROCESSES], renderer=OUTPUT_FORMAT_JSON,
schema=sd.PostProcessQuoteRequestEndpoint(), response_schemas=sd.post_quotes_responses)
@log_unhandled_exceptions(logger=LOGGER, message=sd.InternalServerErrorResponseSchema.description)
def request_quote(request):
"""
Request a quotation for a process.
"""
settings = get_settings(request)
weaver_config = get_weaver_configuration(settings)
if weaver_config not in [WEAVER_CONFIGURATION_ADES, WEAVER_CONFIGURATION_EMS]:
raise HTTPBadRequest("Unsupported request for configuration '{}'.".format(weaver_config))
process_id = request.matchdict.get("process_id")
process_store = get_db(request).get_store("processes")
try:
process = process_store.fetch_by_id(process_id)
except ProcessNotFound:
raise HTTPNotFound("Could not find process with specified 'process_id'.")
store = get_db(request).get_store(StoreQuotes)
process_url = get_process_location(process_id, data_source=get_weaver_url(settings))
process_type = process.type
process_params = dict()
for param in ["inputs", "outputs", "mode", "response"]:
if param in request.json:
process_params[param] = request.json.pop(param)
process_quote_info = process_quote_estimator(process)
process_quote_info.update({
"process": process_id,
"processParameters": process_params,
"location": process_url,
"user": str(request.authenticated_userid)
})
# loop workflow sub-process steps to get individual quotes
if process_type == PROCESS_WORKFLOW and weaver_config == WEAVER_CONFIGURATION_EMS:
workflow_quotes = list()
for step in get_package_workflow_steps(process_url):
# retrieve quote from provider ADES
# TODO: data source mapping
process_step_url = get_process_location(step["reference"])
process_quote_url = "{}/quotations".format(process_step_url)
subreq = request.copy()
subreq.path_info = process_quote_url
resp_json = request.invoke_subrequest(subreq).json()
quote_json = resp_json["quote"]
quote = store.save_quote(Quote(**quote_json))
workflow_quotes.append(quote.id)
process_quote_info.update({"steps": workflow_quotes})
quote = store.save_quote(Quote(**process_quote_info))
return HTTPCreated(json={"quote": quote.json()})
# single application quotes (ADES or EMS)
elif process_type == PROCESS_APPLICATION:
quote = store.save_quote(Quote(**process_quote_info))
quote_json = quote.json()
quote_json.pop("steps", None)
return HTTPCreated(json={"quote": quote_json})
# error if not handled up to this point
raise HTTPBadRequest("Unsupported quoting process type '{0}' on '{1}'.".format(process_type, weaver_config))
@sd.process_quotes_service.get(tags=[sd.TAG_BILL_QUOTE, sd.TAG_PROCESSES], renderer=OUTPUT_FORMAT_JSON,
schema=sd.ProcessQuotesEndpoint(), response_schemas=sd.get_quote_list_responses)
@sd.quotes_service.get(tags=[sd.TAG_BILL_QUOTE], renderer=OUTPUT_FORMAT_JSON,
schema=sd.QuotesEndpoint(), response_schemas=sd.get_quote_list_responses)
@log_unhandled_exceptions(logger=LOGGER, message=sd.InternalServerErrorResponseSchema.description)
def get_quote_list(request):
"""
Get list of quotes IDs.
"""
page = int(request.params.get("page", "0"))
limit = int(request.params.get("limit", "10"))
filters = {
"process_id": request.params.get("process", None) or request.matchdict.get("process_id", None),
"page": page,
"limit": limit,
"sort": request.params.get("sort", sort.SORT_CREATED),
}
store = get_db(request).get_store(StoreQuotes)
items, count = store.find_quotes(**filters)
return HTTPOk(json={
"count": count,
"page": page,
"limit": limit,
"quotes": [quote.id for quote in items]
})
@sd.process_quote_service.get(tags=[sd.TAG_BILL_QUOTE, sd.TAG_PROCESSES], renderer=OUTPUT_FORMAT_JSON,
schema=sd.ProcessQuoteEndpoint(), response_schemas=sd.get_quote_responses)
@sd.quote_service.get(tags=[sd.TAG_BILL_QUOTE], renderer=OUTPUT_FORMAT_JSON,
schema=sd.QuoteEndpoint(), response_schemas=sd.get_quote_responses)
@log_unhandled_exceptions(logger=LOGGER, message=sd.InternalServerErrorResponseSchema.description)
def get_quote_info(request):
"""
Get quote information.
"""
quote_id = request.matchdict.get("quote_id")
store = get_db(request).get_store(StoreQuotes)
try:
quote = store.fetch_by_id(quote_id)
except QuoteNotFound:
raise HTTPNotFound("Could not find quote with specified 'quote_id'.")
return HTTPOk(json={"quote": quote.json()})
@sd.process_quote_service.post(tags=[sd.TAG_BILL_QUOTE, sd.TAG_EXECUTE, sd.TAG_PROCESSES], renderer=OUTPUT_FORMAT_JSON,
schema=sd.PostProcessQuote(), response_schemas=sd.post_quote_responses)
@sd.quote_service.post(tags=[sd.TAG_BILL_QUOTE, sd.TAG_EXECUTE], renderer=OUTPUT_FORMAT_JSON,
schema=sd.PostQuote(), response_schemas=sd.post_quote_responses)
@log_unhandled_exceptions(logger=LOGGER, message=sd.InternalServerErrorResponseSchema.description)
def execute_quote(request):
"""
Execute a quoted process.
"""
quote_info = get_quote_info(request).json["quote"]
quote_bill_info = {
"quote": quote_info.get("id"),
"price": quote_info.get("price"),
"currency": quote_info.get("currency")
}
job_resp = submit_local_job(request)
job_json = job_resp.json
job_id = job_json.get("jobID")
user_id = str(request.authenticated_userid)
store = get_db(request).get_store(StoreBills)
bill = store.save_bill(Bill(user=user_id, job=job_id, **quote_bill_info))
job_json.update({"bill": bill.id})
return HTTPCreated(json=job_json)
| 43.810056 | 119 | 0.718822 | 968 | 7,842 | 5.568182 | 0.206612 | 0.025974 | 0.023748 | 0.016883 | 0.311317 | 0.284972 | 0.219666 | 0.200557 | 0.167532 | 0.156957 | 0 | 0.003418 | 0.179291 | 7,842 | 178 | 120 | 44.05618 | 0.834058 | 0.080719 | 0 | 0.130769 | 0 | 0 | 0.067594 | 0 | 0 | 0 | 0 | 0.011236 | 0 | 1 | 0.038462 | false | 0 | 0.153846 | 0 | 0.238462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a31701fc7c063904134f212988d1c0c79559f82 | 6,722 | py | Python | pysnmp/CISCO-VSI-CONTROLLER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CISCO-VSI-CONTROLLER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CISCO-VSI-CONTROLLER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CISCO-VSI-CONTROLLER-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CISCO-VSI-CONTROLLER-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:03:33 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, ConstraintsUnion, ConstraintsIntersection, ValueRangeConstraint, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "ConstraintsUnion", "ConstraintsIntersection", "ValueRangeConstraint", "SingleValueConstraint")
ciscoMgmt, = mibBuilder.importSymbols("CISCO-SMI", "ciscoMgmt")
ModuleCompliance, NotificationGroup, ObjectGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup", "ObjectGroup")
ObjectIdentity, NotificationType, Gauge32, Bits, Unsigned32, IpAddress, MibIdentifier, MibScalar, MibTable, MibTableRow, MibTableColumn, ModuleIdentity, Counter32, Counter64, iso, Integer32, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "NotificationType", "Gauge32", "Bits", "Unsigned32", "IpAddress", "MibIdentifier", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ModuleIdentity", "Counter32", "Counter64", "iso", "Integer32", "TimeTicks")
TextualConvention, RowStatus, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "RowStatus", "DisplayString")
ciscoVSIControllerMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 9, 9, 141))
if mibBuilder.loadTexts: ciscoVSIControllerMIB.setLastUpdated('9906080000Z')
if mibBuilder.loadTexts: ciscoVSIControllerMIB.setOrganization('Cisco Systems, Inc.')
class CvcControllerShelfLocation(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2))
namedValues = NamedValues(("internal", 1), ("external", 2))
class CvcControllerType(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
namedValues = NamedValues(("par", 1), ("pnni", 2), ("lsc", 3))
cvcMIBObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 141, 1))
cvcConfController = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1))
cvcConfTable = MibTable((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1), )
if mibBuilder.loadTexts: cvcConfTable.setStatus('current')
cvcConfEntry = MibTableRow((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1), ).setIndexNames((0, "CISCO-VSI-CONTROLLER-MIB", "cvcConfControllerID"))
if mibBuilder.loadTexts: cvcConfEntry.setStatus('current')
cvcConfControllerID = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647)))
if mibBuilder.loadTexts: cvcConfControllerID.setStatus('current')
cvcConfControllerType = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 2), CvcControllerType()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfControllerType.setStatus('current')
cvcConfControllerShelfLocation = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 3), CvcControllerShelfLocation()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfControllerShelfLocation.setStatus('current')
cvcConfControllerLocation = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfControllerLocation.setStatus('current')
cvcConfControllerName = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 5), DisplayString()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfControllerName.setStatus('current')
cvcConfVpi = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 4095))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfVpi.setStatus('current')
cvcConfVci = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(32, 65535))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfVci.setStatus('current')
cvcConfRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 141, 1, 1, 1, 1, 8), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: cvcConfRowStatus.setStatus('current')
cvcMIBConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 141, 3))
cvcMIBCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 141, 3, 1))
cvcMIBGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 141, 3, 2))
cvcMIBCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 9, 9, 141, 3, 1, 1)).setObjects(("CISCO-VSI-CONTROLLER-MIB", "cvcConfGroup"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfGroupExternal"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
cvcMIBCompliance = cvcMIBCompliance.setStatus('current')
cvcConfGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 9, 9, 141, 3, 2, 1)).setObjects(("CISCO-VSI-CONTROLLER-MIB", "cvcConfControllerType"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfControllerShelfLocation"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfControllerLocation"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfControllerName"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfRowStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
cvcConfGroup = cvcConfGroup.setStatus('current')
cvcConfGroupExternal = ObjectGroup((1, 3, 6, 1, 4, 1, 9, 9, 141, 3, 2, 2)).setObjects(("CISCO-VSI-CONTROLLER-MIB", "cvcConfVpi"), ("CISCO-VSI-CONTROLLER-MIB", "cvcConfVci"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
cvcConfGroupExternal = cvcConfGroupExternal.setStatus('current')
mibBuilder.exportSymbols("CISCO-VSI-CONTROLLER-MIB", cvcConfTable=cvcConfTable, cvcMIBGroups=cvcMIBGroups, cvcConfControllerType=cvcConfControllerType, cvcConfVpi=cvcConfVpi, CvcControllerShelfLocation=CvcControllerShelfLocation, cvcConfControllerLocation=cvcConfControllerLocation, cvcConfController=cvcConfController, cvcConfControllerName=cvcConfControllerName, PYSNMP_MODULE_ID=ciscoVSIControllerMIB, cvcConfControllerID=cvcConfControllerID, cvcConfGroupExternal=cvcConfGroupExternal, cvcMIBCompliance=cvcMIBCompliance, cvcConfEntry=cvcConfEntry, ciscoVSIControllerMIB=ciscoVSIControllerMIB, cvcConfControllerShelfLocation=cvcConfControllerShelfLocation, cvcConfRowStatus=cvcConfRowStatus, cvcConfGroup=cvcConfGroup, CvcControllerType=CvcControllerType, cvcConfVci=cvcConfVci, cvcMIBObjects=cvcMIBObjects, cvcMIBCompliances=cvcMIBCompliances, cvcMIBConformance=cvcMIBConformance)
| 105.03125 | 883 | 0.759298 | 739 | 6,722 | 6.903924 | 0.197564 | 0.012544 | 0.01176 | 0.014896 | 0.356919 | 0.253038 | 0.217758 | 0.217758 | 0.217758 | 0.215602 | 0 | 0.070471 | 0.090152 | 6,722 | 63 | 884 | 106.698413 | 0.763734 | 0.050878 | 0 | 0.09434 | 0 | 0 | 0.1849 | 0.063569 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.132075 | 0 | 0.283019 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a35d2e3dec4c05f542fce0db1d5d23465661584 | 8,348 | py | Python | Masters/Copy Layer to Layer.py | davidtahim/Glyphs-Scripts | 5ed28805b5fe03c63d904ad2f79117844c22aa44 | [
"Apache-2.0"
] | 1 | 2021-09-04T18:41:30.000Z | 2021-09-04T18:41:30.000Z | Masters/Copy Layer to Layer.py | davidtahim/Glyphs-Scripts | 5ed28805b5fe03c63d904ad2f79117844c22aa44 | [
"Apache-2.0"
] | null | null | null | Masters/Copy Layer to Layer.py | davidtahim/Glyphs-Scripts | 5ed28805b5fe03c63d904ad2f79117844c22aa44 | [
"Apache-2.0"
] | null | null | null | #MenuTitle: Copy Layer to Layer
# -*- coding: utf-8 -*-
__doc__="""
Copies one master to another master in selected glyphs.
"""
import GlyphsApp
import vanilla
import math
def getComponentScaleX_scaleY_rotation( self ):
a = self.transform[0]
b = self.transform[1]
c = self.transform[2]
d = self.transform[3]
scale_x = math.sqrt(math.pow(a,2)+math.pow(b,2))
scale_y = math.sqrt(math.pow(c,2)+math.pow(d,2))
if (b<0 and c<0):
scale_y = scale_y * -1
rotation = math.atan2(b, a) * (180/math.pi)
return [scale_x, scale_y, rotation]
class MasterFiller( object ):
def __init__( self ):
# Window 'self.w':
windowWidth = 280
windowHeight = 155
windowWidthResize = 120 # user can resize width by this value
windowHeightResize = 0 # user can resize height by this value
self.w = vanilla.FloatingWindow(
( windowWidth, windowHeight ), # default window size
"Copy layer to layer", # window title
minSize = ( windowWidth, windowHeight ), # minimum size (for resizing)
maxSize = ( windowWidth + windowWidthResize, windowHeight + windowHeightResize ), # maximum size (for resizing)
autosaveName = "com.mekkablue.MasterFiller.mainwindow" # stores last window position and size
)
self.w.text_1 = vanilla.TextBox((15, 12+2, 120, 14), "Copy paths from", sizeStyle='small')
self.w.master_from = vanilla.PopUpButton((120, 12, -15, 17), self.GetMasterNames(), sizeStyle='small', callback=self.MasterChangeCallback)
self.w.text_2 = vanilla.TextBox((15, 32+2, 120, 14), "into selection of", sizeStyle='small')
self.w.master_into = vanilla.PopUpButton((120, 32, -15, 17), self.GetMasterNames(), sizeStyle='small', callback=self.MasterChangeCallback)
self.w.include_components = vanilla.CheckBox((15, 52+2, -100, 20), "Include components", sizeStyle='small', callback=self.SavePreferences, value=True)
self.w.include_anchors = vanilla.CheckBox((15, 52+20, -100, 20), "Include anchors", sizeStyle='small', callback=self.SavePreferences, value=True)
self.w.include_metrics = vanilla.CheckBox((15, 52+38, -100, 20), "Include metrics", sizeStyle='small', callback=self.SavePreferences, value=True)
self.w.keep_window_open = vanilla.CheckBox((15, 52+56, -100, 20), "Keep window open", sizeStyle='small', callback=self.SavePreferences, value=True)
self.w.copybutton = vanilla.Button((-80, -30, -15, -10), "Copy", sizeStyle='small', callback=self.buttonCallback)
self.w.setDefaultButton( self.w.copybutton )
# Load Settings:
if not self.LoadPreferences():
print "Note: 'Copy Layer to Layer' could not load preferences. Will resort to defaults."
self.w.open()
self.w.makeKey()
self.w.master_into.set(1)
def SavePreferences( self, sender ):
try:
Glyphs.defaults["com.mekkablue.MasterFiller.include_components"] = self.w.include_components.get()
Glyphs.defaults["com.mekkablue.MasterFiller.include_anchors"] = self.w.include_anchors.get()
Glyphs.defaults["com.mekkablue.MasterFiller.include_metrics"] = self.w.include_metrics.get()
Glyphs.defaults["com.mekkablue.MasterFiller.keep_window_open"] = self.w.keep_window_open.get()
except:
return False
return True
def LoadPreferences( self ):
try:
NSUserDefaults.standardUserDefaults().registerDefaults_(
{
"com.mekkablue.MasterFiller.include_components" : "1",
"com.mekkablue.MasterFiller.include_anchors" : "1",
"com.mekkablue.MasterFiller.include_metrics" : "1",
"com.mekkablue.MasterFiller.keep_window_open" : "1"
}
)
self.w.include_components.set( Glyphs.defaults["com.mekkablue.MasterFiller.include_components"] )
self.w.include_anchors.set( Glyphs.defaults["com.mekkablue.MasterFiller.include_anchors"] )
self.w.include_metrics.set( Glyphs.defaults["com.mekkablue.MasterFiller.include_metrics"] )
self.w.keep_window_open.set( Glyphs.defaults["com.mekkablue.MasterFiller.keep_window_open"] )
except:
return False
return True
def GetMasterNames( self ):
myMasterList = []
for i in range( len( Glyphs.currentDocument.font.masters ) ):
x = Glyphs.currentDocument.font.masters[i]
myMasterList.append( '%i: %s' % (i, x.name) )
return myMasterList
def MasterChangeCallback( self, sender ):
if self.w.master_from.get() == self.w.master_into.get():
self.w.copybutton.enable( False )
else:
self.w.copybutton.enable( True )
def copyPathsFromLayerToLayer( self, sourceLayer, targetLayer ):
"""Copies all paths from sourceLayer to targetLayer"""
num_from = len( sourceLayer.paths )
num_into = len( targetLayer.paths )
if num_into != 0:
print "- Cleaning out paths in target layer"
for i in range( num_into )[::-1]:
del targetLayer.paths[i]
if num_from > 0:
print "- Copying paths"
for thisPath in sourceLayer.paths:
newPath = GSPath()
for n in thisPath.nodes:
newNode = GSNode()
newNode.type = n.type
newNode.connection = n.connection
newNode.setPosition_( (n.x, n.y) )
newPath.addNode_( newNode )
newPath.closed = thisPath.closed
targetLayer.paths.append( newPath )
def copyComponentsFromLayerToLayer( self, sourceLayer, targetLayer ):
"""Copies all components from sourceLayer to targetLayer."""
comp_from = len( sourceLayer.components )
comp_into = len( targetLayer.components )
if comp_into != 0:
print "- Cleaning out components in target layer"
for i in range( comp_into )[::-1]:
del targetLayer.components[i]
if comp_from > 0:
print "- Copying components:"
for thisComp in sourceLayer.components:
compName = str( thisComp.componentName ) # str() probably not necessary anymore, but once fixed a problem
newComp = GSComponent( compName )
newComp.setPosition_( (thisComp.x, thisComp.y) )
ScaleX_scaleY_rotation = getComponentScaleX_scaleY_rotation(thisComp)
newComp.setScaleX_scaleY_rotation_(ScaleX_scaleY_rotation[0],ScaleX_scaleY_rotation[1],ScaleX_scaleY_rotation[2])
print "-- Component: %s" % ( compName )
targetLayer.components.append( newComp )
def copyAnchorsFromLayerToLayer( self, sourceLayer, targetLayer ):
"""Copies all anchors from sourceLayer to targetLayer."""
anch_from = len( sourceLayer.anchors )
anch_into = len( targetLayer.anchors )
if anch_into != 0:
print "- Cleaning out anchors in target layer"
targetLayer.setAnchors_( None )
if anch_from > 0:
print "- Copying anchors from source layer:"
for thisAnchor in sourceLayer.anchors:
anchorName = thisAnchor.name
anchorPosition = NSPoint( thisAnchor.x, thisAnchor.y )
newAnchor = GSAnchor( anchorName, anchorPosition )
print "-- %s (%i, %i)" % ( anchorName, anchorPosition.x, anchorPosition.y )
targetLayer.addAnchor_( newAnchor )
def copyMetricsFromLayerToLayer( self, sourceLayer, targetLayer ):
"""Copies width of sourceLayer to targetLayer."""
sourceWidth = sourceLayer.width
if targetLayer.width != sourceWidth:
targetLayer.width = sourceWidth
print "- Copying width (%.1f)" % sourceWidth
else:
print "- Width not changed (already was %.1f)" % sourceWidth
def buttonCallback( self, sender ):
Glyphs.clearLog()
Glyphs.showMacroWindow()
print "Copy Layer to Layer Protocol:"
Font = Glyphs.font
Doc = Glyphs.currentDocument
selectedGlyphs = [ x.parent for x in Font.selectedLayers ]
index_from = self.w.master_from.get()
index_into = self.w.master_into.get()
compYesNo = self.w.include_components.get()
anchYesNo = self.w.include_anchors.get()
metrYesNo = self.w.include_metrics.get()
for thisGlyph in selectedGlyphs:
try:
print "\nProcessing", thisGlyph.name
sourcelayer = thisGlyph.layers[ index_from ]
targetlayer = thisGlyph.layers[ index_into ]
Font.disableUpdateInterface()
# copy paths:
self.copyPathsFromLayerToLayer( sourcelayer, targetlayer )
# copy components:
if compYesNo:
self.copyComponentsFromLayerToLayer( sourcelayer, targetlayer )
# copy anchors:
if anchYesNo:
self.copyAnchorsFromLayerToLayer( sourcelayer, targetlayer )
# copy metrics:
if metrYesNo:
self.copyMetricsFromLayerToLayer( sourcelayer, targetlayer )
Font.enableUpdateInterface()
except Exception, e:
print e
if not self.w.keep_window_open.get():
self.w.close()
MasterFiller()
| 36.295652 | 152 | 0.713464 | 1,010 | 8,348 | 5.80099 | 0.227723 | 0.029869 | 0.053251 | 0.047619 | 0.27445 | 0.182796 | 0.157535 | 0.146783 | 0.129032 | 0.0908 | 0 | 0.01943 | 0.167705 | 8,348 | 229 | 153 | 36.454148 | 0.823834 | 0.048275 | 0 | 0.064706 | 0 | 0 | 0.153327 | 0.071734 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.017647 | null | null | 0.082353 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6a38ce0ec26e60d5ad5f137d860bd186bef4c8e7 | 1,981 | py | Python | zorg/buildbot/conditions/FileConditions.py | dyung/llvm-zorg | 42cd139968388b14323975647faf322c99945986 | [
"Apache-2.0"
] | 27 | 2019-01-15T03:03:58.000Z | 2022-03-22T23:31:36.000Z | zorg/buildbot/conditions/FileConditions.py | dyung/llvm-zorg | 42cd139968388b14323975647faf322c99945986 | [
"Apache-2.0"
] | 21 | 2020-05-29T01:12:26.000Z | 2022-03-29T20:06:22.000Z | zorg/buildbot/conditions/FileConditions.py | dyung/llvm-zorg | 42cd139968388b14323975647faf322c99945986 | [
"Apache-2.0"
] | 38 | 2019-02-10T02:46:33.000Z | 2022-03-26T10:27:29.000Z | from buildbot.process.remotecommand import RemoteCommand
from buildbot.interfaces import WorkerTooOldError
import stat
class FileExists(object):
"""I check a file existence on the worker. I return True if the file
with the given name exists, False if the file does not exist or that is
a directory.
Use me with doStepIf to make a build step conditional to existence of some
file. For example
doStepIf=FileExists('build/configure')
"""
def __init__(self, filename):
self.filename = filename
def __call__(self, step):
step.checkWorkerHasCommand('stat')
cmd = RemoteCommand('stat', {'file': self.filename})
d = step.runCommand(cmd)
d.addCallback(lambda res: self.commandComplete(cmd))
return d
def commandComplete(self, cmd):
if cmd.didFail():
return False
s = cmd.updates["stat"][-1]
filemode = s[stat.ST_MODE]
if stat.S_ISREG(filemode) or stat.S_ISLNK(filemode):
# True only if this is a file or a link and not any other file
# system object.
return True
else:
return False
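# Illustrative usage in a master.cfg (hypothetical factory/step names):
#
# from buildbot.steps.shell import ShellCommand
# factory.addStep(ShellCommand(command=["./configure"],
#                              doStepIf=FileDoesNotExist("build/Makefile")))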
class FileDoesNotExist(object):
"""I check a file existence on the worker. I return False if
the file with the given name exists or that is a directory, True if the
file does not exist.
Use me with doStepIf to make a build step conditional to nonexistence
of some file. For example
doStepIf=FileDoesNotExist('build/configure')
"""
def __init__(self, filename):
self.filename = filename
def __call__(self, step):
step.checkWorkerHasCommand('stat')
cmd = RemoteCommand('stat', {'file': self.filename})
d = step.runCommand(cmd)
d.addCallback(lambda res: self.commandComplete(cmd))
return d
def commandComplete(self, cmd):
# False if any filesystem object with the given name exists.
return cmd.didFail()
| 30.476923 | 78 | 0.655729 | 257 | 1,981 | 4.980545 | 0.311284 | 0.05625 | 0.028125 | 0.0375 | 0.664063 | 0.61875 | 0.542188 | 0.542188 | 0.49375 | 0.49375 | 0 | 0.000686 | 0.264513 | 1,981 | 64 | 79 | 30.953125 | 0.877831 | 0.360424 | 0 | 0.625 | 0 | 0 | 0.023256 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.09375 | 0.03125 | 0.53125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6a415615b9b2bc4e4bdf10ab3d417314a169e277 | 44,836 | py | Python | phi/math/backend/_backend.py | marc-gav/PhiFlow | b6186fd1503d040997b52d49aa18cd875267c27e | [
"MIT"
] | null | null | null | phi/math/backend/_backend.py | marc-gav/PhiFlow | b6186fd1503d040997b52d49aa18cd875267c27e | [
"MIT"
] | null | null | null | phi/math/backend/_backend.py | marc-gav/PhiFlow | b6186fd1503d040997b52d49aa18cd875267c27e | [
"MIT"
] | null | null | null | from collections import namedtuple
from contextlib import contextmanager
from threading import Barrier
from typing import List, Callable
import numpy
from ._dtype import DType, combine_types
SolveResult = namedtuple('SolveResult', [
'method', 'x', 'residual', 'iterations', 'function_evaluations', 'converged', 'diverged', 'message',
])
class ComputeDevice:
"""
A physical device that can be selected to perform backend computations.
"""
def __init__(self, backend: 'Backend', name: str, device_type: str, memory: int, processor_count: int, description: str, ref=None):
self.name: str = name
""" Name of the compute device. CPUs are typically called `'CPU'`. """
self.device_type: str = device_type
""" Type of device such as `'CPU'`, `'GPU'` or `'TPU'`. """
self.memory: int = memory
""" Maximum memory of the device that can be allocated (in bytes). -1 for n/a. """
self.processor_count: int = processor_count
""" Number of CPU cores or GPU multiprocessors. -1 for n/a. """
self.description: str = description
""" Further information about the device such as driver version. """
self.ref = ref
""" (Optional) Reference to the internal device representation. """
self.backend: 'Backend' = backend
""" Backend that this device belongs to. Different backends represent the same device with different objects. """
def __repr__(self):
mem = f"{(self.memory / 1024 ** 2)} MB" if self.memory > 0 else "memory: n/a"
pro = f"{self.processor_count} processors" if self.processor_count > 0 else "processors: n/a"
descr = self.description.replace('\n', ' ')
if len(descr) > 30:
descr = descr[:28] + "..."
return f"'{self.name}' ({self.device_type}) | {mem} | {pro} | {descr}"
class Backend:
def __init__(self, name: str, default_device: ComputeDevice):
"""
Backends delegate low-level operations to a compute library or emulate them.
The methods of `Backend` form a comprehensive list of available operations.
To support a compute library, subclass `Backend` and register it by adding it to `BACKENDS`.
Args:
name: Human-readable string
default_device: `ComputeDevice` being used by default
"""
self._name = name
self._default_device = default_device
def __enter__(self):
_DEFAULT.append(self)
def __exit__(self, exc_type, exc_val, exc_tb):
_DEFAULT.pop(-1)
@property
def name(self) -> str:
return self._name
def supports(self, feature: str or Callable) -> bool:
"""
Tests if this backend supports the given feature.
Features correspond to a method of this backend that must be implemented if the feature is supported.
Possible features:
* `sparse_tensor`
* `gradients`
Args:
feature: `str` or unbound Backend method, e.g. `Backend.sparse_tensor`
Returns:
Whether the feature is supported.
"""
feature = feature if isinstance(feature, str) else feature.__name__
if not hasattr(Backend, feature):
raise ValueError(f"Not a valid feature: '{feature}'")
backend_fun = getattr(Backend, feature)
impl_fun = getattr(self.__class__, feature)
return impl_fun is not backend_fun
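# Example: backend.supports(Backend.jit_compile) is True exactly when the
# concrete subclass overrides `jit_compile` with its own implementation.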
def prefers_channels_last(self) -> bool:
raise NotImplementedError()
@property
def precision(self) -> int:
""" Short for math.backend.get_precision() """
return get_precision()
@property
def float_type(self) -> DType:
return DType(float, self.precision)
@property
def as_registered(self) -> 'Backend':
from phi.math.backend import BACKENDS
for backend in BACKENDS:
if self.name in backend.name:
return backend
raise RuntimeError(f"Backend '{self}' is not visible.")
@property
def complex_type(self) -> DType:
return DType(complex, max(64, self.precision))
def combine_types(self, *dtypes: DType) -> DType:
return combine_types(*dtypes, fp_precision=self.precision)
def auto_cast(self, *tensors) -> list:
"""
Determines the appropriate value type resulting from operations involving the tensors as input.
This method is called by the default implementations of basic operators.
Backends can override this method to prevent unnecessary casting.
Args:
*tensors: tensors to cast and to consider when determining the common data type
Returns:
tensors cast to a common data type
"""
dtypes = [self.dtype(t) for t in tensors]
result_type = self.combine_types(*dtypes)
if result_type.kind in (int, float, complex, bool):
tensors = [self.cast(t, result_type) for t in tensors]
return tensors
def __str__(self):
return self.name
def __repr__(self):
return self.name
def list_devices(self, device_type: str or None = None) -> List[ComputeDevice]:
"""
Fetches information about all available compute devices this backend can use.
Implementations:
* NumPy: [`os.cpu_count`](https://docs.python.org/3/library/os.html#os.cpu_count)
* PyTorch: [`torch.cuda.get_device_properties`](https://pytorch.org/docs/stable/cuda.html#torch.cuda.get_device_properties)
* TensorFlow: `tensorflow.python.client.device_lib.list_local_devices`
* Jax: [`jax.devices`](https://jax.readthedocs.io/en/latest/jax.html#jax.devices)
Args:
device_type: (optional) Return only devices of this type, e.g. `'GPU'` or `'CPU'`. See `ComputeDevice.device_type`.
Returns:
`list` of all currently available devices.
"""
raise NotImplementedError()
def get_default_device(self) -> ComputeDevice:
return self._default_device
def set_default_device(self, device: ComputeDevice or str):
if isinstance(device, str):
devices = self.list_devices(device)
assert len(devices) >= 1, f"{self.name}: Cannot select '{device}' because no device of this type is available."
device = devices[0]
self._default_device = device
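# Example (illustrative):
#   gpus = backend.list_devices('GPU')
#   if gpus:
#       backend.set_default_device(gpus[0])
#   # or equivalently, by type: backend.set_default_device('GPU')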
def seed(self, seed: int):
raise NotImplementedError()
def is_tensor(self, x, only_native=False):
"""
An object is considered a native tensor by a backend if no internal conversion is required by backend methods.
An object is considered a tensor (native or otherwise) by a backend if it is not a struct (e.g. tuple, list) and all methods of the backend accept it as a tensor argument.
Args:
x: object to check
only_native: If True, only accepts true native tensor representations, not Python numbers or others that are also supported as tensors (Default value = False)
Returns:
bool: whether `x` is considered a tensor by this backend
"""
raise NotImplementedError()
def as_tensor(self, x, convert_external=True):
"""
Converts a tensor-like object to the native tensor representation of this backend.
If x is a native tensor of this backend, it is returned without modification.
If x is a Python number (numbers.Number instance), `convert_external` decides whether to convert it unless the backend cannot handle Python numbers.
*Note:* There may be objects that are considered tensors by this backend but are not native and thus, will be converted by this method.
Args:
x: tensor-like, e.g. list, tuple, Python number, tensor
convert_external: if False and `x` is a Python number that is understood by this backend, this method returns the number as-is. This can help prevent type clashes like int32 vs int64. (Default value = True)
Returns:
tensor representation of `x`
"""
raise NotImplementedError()
def is_available(self, tensor) -> bool:
"""
Tests if the value of the tensor is known and can be read at this point.
If true, `numpy(tensor)` must return a valid NumPy representation of the value.
Tensors are typically available when the backend operates in eager mode.
Args:
tensor: backend-compatible tensor
Returns:
bool
"""
raise NotImplementedError()
def numpy(self, tensor) -> numpy.ndarray:
"""
Returns a NumPy representation of the given tensor.
If `tensor` is already a NumPy array, it is returned without modification.
This method raises an error if the value of the tensor is not known at this point, e.g. because it represents a node in a graph.
Use `is_available(tensor)` to check if the value can be represented as a NumPy array.
Args:
tensor: backend-compatible tensor
Returns:
NumPy representation of the values stored in the tensor
"""
raise NotImplementedError()
def to_dlpack(self, tensor):
raise NotImplementedError()
def from_dlpack(self, capsule):
raise NotImplementedError()
def copy(self, tensor, only_mutable=False):
raise NotImplementedError()
def call(self, f: Callable, *args, name=None):
"""
Calls `f(*args)` and returns the result.
This method may be used to register internal calls with the profiler.
Usage:
choose_backend(key).call(custom_function, *args)
"""
return f(*args)
def block_until_ready(self, values):
pass
def jit_compile(self, f: Callable) -> Callable:
return NotImplemented
def functional_gradient(self, f, wrt: tuple or list, get_output: bool):
raise NotImplementedError(self)
def custom_gradient(self, f: Callable, gradient: Callable) -> Callable:
"""
Creates a function based on `f` that uses a custom gradient for backprop.
Args:
f: Forward function.
gradient: Function for backprop. Will be called as `gradient(*d_out)` to compute the gradient of `f`.
Returns:
Function with similar signature and return values as `f`. However, the returned function does not support keyword arguments.
"""
return NotImplemented
def jit_compile_grad(self, f, wrt: tuple or list, get_output: bool):
raise NotImplementedError()
def transpose(self, tensor, axes):
raise NotImplementedError()
def random_uniform(self, shape):
""" Float tensor of selected precision containing random values in the range [0, 1) """
raise NotImplementedError(self)
def random_normal(self, shape):
""" Float tensor of selected precision containing random values sampled from a normal distribution with mean 0 and std 1. """
raise NotImplementedError(self)
def stack(self, values, axis=0):
raise NotImplementedError(self)
def concat(self, values, axis):
raise NotImplementedError(self)
def pad(self, value, pad_width, mode: str = 'constant', constant_values=0):
"""
Pad a tensor with values as specified by `mode` and `constant_values`.
If the mode is not supported, returns NotImplemented.
Args:
value: tensor
pad_width: 2D tensor specifying the number of values padded to the edges of each axis in the form [[axis 0 lower, axis 0 upper], ...] including batch and component axes.
mode: One of ('constant', 'boundary', 'periodic', 'symmetric', 'reflect'). (Default value = 'constant')
constant_values: used for out-of-bounds points if mode='constant' (Default value = 0)
Returns:
padded tensor or NotImplemented
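Example (sketch): pad one value on each side of the spatial axis of a (batch, x, channel) tensor:
```python
padded = backend.pad(value, [[0, 0], [1, 1], [0, 0]], mode='constant', constant_values=0)
```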
"""
raise NotImplementedError(self)
def reshape(self, value, shape):
raise NotImplementedError(self)
def flip(self, value, axes: tuple or list):
slices = tuple(slice(None, None, -1 if i in axes else None) for i in range(self.ndims(value)))
return value[slices]
def sum(self, value, axis=None, keepdims=False):
raise NotImplementedError(self)
def prod(self, value, axis=None):
raise NotImplementedError(self)
def divide_no_nan(self, x, y):
"""
Computes x/y but returns 0 where y=0.
Args:
x: numerator tensor
y: denominator tensor
Returns:
Quotient tensor with zeros where `y` is zero.
"""
raise NotImplementedError(self)
def where(self, condition, x=None, y=None):
raise NotImplementedError(self)
def nonzero(self, values):
"""
Args:
values: Tensor with only spatial dimensions
Returns:
non-zero multi-indices as tensor of shape (nnz, vector)
"""
raise NotImplementedError(self)
def mean(self, value, axis=None, keepdims=False):
raise NotImplementedError(self)
def range(self, start, limit=None, delta=1, dtype: DType = DType(int, 32)):
raise NotImplementedError(self)
def zeros(self, shape, dtype: DType = None):
raise NotImplementedError(self)
def zeros_like(self, tensor):
raise NotImplementedError(self)
def ones(self, shape, dtype: DType = None):
raise NotImplementedError(self)
def ones_like(self, tensor):
raise NotImplementedError(self)
def meshgrid(self, *coordinates):
raise NotImplementedError(self)
def linspace(self, start, stop, number):
raise NotImplementedError(self)
def tensordot(self, a, a_axes: tuple or list, b, b_axes: tuple or list):
""" Multiply-sum-reduce a_axes of a with b_axes of b. """
raise NotImplementedError(self)
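# Example (sketch): for 2D matrices, backend.tensordot(a, (1,), b, (0,))
# contracts the columns of `a` with the rows of `b`, which is equivalent
# to the matrix product backend.matmul(a, b).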
def matmul(self, A, b):
raise NotImplementedError(self)
def einsum(self, equation, *tensors):
raise NotImplementedError(self)
def while_loop(self, loop: Callable, values: tuple):
"""
```python
while any(values[0]):
values = loop(*values)
return values
```
This operation does not support backpropagation.
Args:
loop: Loop function, must return a `tuple` with entries equal to `values` in shape and data type.
values: Initial values of loop variables.
Returns:
Loop variables upon loop completion.
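Example (a minimal sketch; `backend` is assumed to be a concrete Backend instance):
```python
def body(keep_going, x):
    x = x - 1
    return x > 0, x
_, final = backend.while_loop(body, (backend.as_tensor([True]), backend.as_tensor([3])))
# final holds 0 once the condition tensor is False everywhere
```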
"""
raise NotImplementedError(self)
def abs(self, x):
raise NotImplementedError(self)
def sign(self, x):
raise NotImplementedError(self)
def round(self, x):
raise NotImplementedError(self)
def ceil(self, x):
raise NotImplementedError(self)
def floor(self, x):
raise NotImplementedError(self)
def max(self, x, axis=None, keepdims=False):
raise NotImplementedError(self)
def min(self, x, axis=None, keepdims=False):
raise NotImplementedError(self)
def maximum(self, a, b):
raise NotImplementedError(self)
def minimum(self, a, b):
raise NotImplementedError(self)
def clip(self, x, minimum, maximum):
raise NotImplementedError(self)
def sqrt(self, x):
raise NotImplementedError(self)
def exp(self, x):
raise NotImplementedError(self)
def conv(self, value, kernel, zero_padding=True):
"""
Convolve value with kernel.
Depending on the tensor rank, the convolution is either 1D (rank=3), 2D (rank=4) or 3D (rank=5).
Higher dimensions may not be supported.
Args:
value: tensor of shape (batch_size, in_channel, spatial...)
kernel: tensor of shape (batch_size or 1, out_channel, in_channel, spatial...)
zero_padding: If True, pads the edges of `value` with zeros so that the result has the same shape as `value`.
Returns:
Convolution result as tensor of shape (batch_size, out_channel, spatial...)
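Example (sketch of the expected shapes):
```python
# value: (batch=2, in_channel=3, x=16), kernel: (1, out_channel=8, in_channel=3, x=5)
result = backend.conv(value, kernel, zero_padding=True)  # shape (2, 8, 16)
```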
"""
raise NotImplementedError(self)
def expand_dims(self, a, axis=0, number=1):
raise NotImplementedError(self)
def shape(self, tensor):
raise NotImplementedError(self)
def staticshape(self, tensor):
raise NotImplementedError(self)
def cast(self, x, dtype: DType):
raise NotImplementedError(self)
def to_float(self, x):
"""
Converts a tensor to floating point values with precision equal to the currently set default precision.
See Also:
`Backend.precision()`.
If `x` is mutable and of the correct floating type, returns a copy of `x`.
To convert float tensors to the backend precision but leave non-float tensors untouched, use `Backend.as_tensor()`.
Args:
x: tensor of bool, int or float
Returns:
Values of `x` as float tensor
"""
return self.cast(x, self.float_type)
def to_int32(self, x):
return self.cast(x, DType(int, 32))
def to_int64(self, x):
return self.cast(x, DType(int, 64))
def to_complex(self, x):
return self.cast(x, DType(complex, max(64, min(self.precision * 2, 128))))
def batched_gather_nd(self, values, indices):
"""
Gathers values from the tensor `values` at locations `indices`.
The first dimension of `values` and `indices` is the batch dimension which must be either equal for both or one for either.
Args:
values: tensor of shape (batch, spatial..., channel)
indices: int tensor of shape (batch, any..., multi_index) where the size of multi_index is values.rank - 2.
Returns:
Gathered values as tensor of shape (batch, any..., channel)
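Example (sketch): with `values` of shape (2, 4, 4, 3) and `indices` of shape (2, 10, 2), each batch gathers 10 (y, x) locations:
```python
gathered = backend.batched_gather_nd(values, indices)  # shape (2, 10, 3)
```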
"""
raise NotImplementedError(self)
def flatten(self, x):
return self.reshape(x, (-1,))
def std(self, x, axis=None, keepdims=False):
raise NotImplementedError(self)
def boolean_mask(self, x, mask, axis=0):
"""
Args:
x: tensor with any number of dimensions
mask: 1D mask tensor
axis: Axis index >= 0
"""
raise NotImplementedError(self)
def isfinite(self, x):
raise NotImplementedError(self)
def scatter(self, base_grid, indices, values, mode: str):
"""
Depending on `mode`, performs scatter_update or scatter_add.
Args:
base_grid: Tensor into which scatter values are inserted at indices. Tensor of shape (batch_size, spatial..., channels)
indices: Tensor of shape (batch_size or 1, update_count, index_vector)
values: Values to scatter at indices. Tensor of shape (batch_size or 1, update_count or 1, channels or 1)
mode: One of ('update', 'add')
Returns:
Copy of base_grid with values at `indices` updated by `values`.
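Example (a minimal sketch; shapes follow the conventions above):
```python
# base_grid: (1, 8, 1), indices: (1, 3, 1), values: (1, 3, 1)
result = backend.scatter(base_grid, indices, values, mode='add')
```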
"""
raise NotImplementedError(self)
def any(self, boolean_tensor, axis=None, keepdims=False):
raise NotImplementedError(self)
def all(self, boolean_tensor, axis=None, keepdims=False):
raise NotImplementedError(self)
def fft(self, x):
"""
Computes the n-dimensional FFT along all but the first and last dimensions.
Args:
x: tensor of dimension 3 or higher
Returns:
Complex tensor of the same shape as `x`.
"""
raise NotImplementedError(self)
def ifft(self, k):
"""
Computes the n-dimensional inverse FFT along all but the first and last dimensions.
Args:
k: tensor of dimension 3 or higher
Returns:
Complex tensor of the same shape as `k`.
"""
raise NotImplementedError(self)
def imag(self, x):
raise NotImplementedError(self)
def real(self, x):
raise NotImplementedError(self)
def sin(self, x):
raise NotImplementedError(self)
def cos(self, x):
raise NotImplementedError(self)
def tan(self, x):
raise NotImplementedError(self)
def log(self, x):
""" Natural logarithm """
raise NotImplementedError(self)
def log2(self, x):
raise NotImplementedError(self)
def log10(self, x):
raise NotImplementedError(self)
def dtype(self, array) -> DType:
raise NotImplementedError(self)
def tile(self, value, multiples):
"""
Repeats the tensor along each axis the number of times given by multiples.
If `multiples` has more dimensions than `value`, these dimensions are added to `value` as outer dimensions.
Args:
value: tensor
multiples: tuple or list of integers
Returns:
tile tensor
"""
raise NotImplementedError(self)
def sparse_tensor(self, indices, values, shape):
"""
Creates a sparse tensor from indices and values. This is an optional feature that backends may leave unimplemented.
Args:
indices: tuple/list of coordinate vectors matching the dimensions (pair of (row, col) for a matrix)
values: values stored at the given indices
shape: dense shape of the sparse tensor
Returns:
Backend-specific sparse tensor.
"""
raise NotImplementedError(self)
def coordinates(self, tensor):
"""
Returns the coordinates and values of a tensor.
Args:
tensor: Sparse tensor
Returns:
coordinates: `tuple` of tensor holding the coordinate vectors, i.e. (row, col) for matrices.
values: Tensor holding the corresponding values
"""
raise NotImplementedError(self)
def minimize(self, method: str, f, x0, atol, max_iter, trj: bool):
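# Runs SciPy's minimize once per batch entry, each in its own worker thread;
# the threads synchronize through two barriers so that all per-batch function
# and gradient evaluations are batched into a single call to `fg` below.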
from scipy.optimize import OptimizeResult, minimize
from threading import Thread, Barrier
assert self.supports(Backend.functional_gradient)
assert len(self.staticshape(x0)) == 2 # (batch, parameters)
batch_size = self.staticshape(x0)[0]
fg = self.functional_gradient(f, [0], get_output=True)
method_description = f"SciPy {method} with {self.name}"
iterations = [0] * batch_size
function_evaluations = [0] * batch_size
xs = [None] * batch_size
final_losses = [None] * batch_size
converged = [False] * batch_size
diverged = [False] * batch_size
messages = [""] * batch_size
f_inputs = [None] * batch_size
f_b_losses = None
f_b_losses_np = None
f_grad_np = None
f_input_available = Barrier(batch_size + 1)
f_output_available = Barrier(batch_size + 1)
finished = [False] * batch_size
all_finished = False
trajectories = [[] for _ in range(batch_size)] if trj else None
threads = []
for b in range(batch_size):
def b_thread(b=b):
recent_b_losses = []
def b_fun(x: numpy.ndarray):
function_evaluations[b] += 1
f_inputs[b] = self.as_tensor(x, convert_external=True)
f_input_available.wait()
f_output_available.wait()
recent_b_losses.append(f_b_losses[b])
if final_losses[b] is None: # first evaluation
final_losses[b] = f_b_losses[b]
if trajectories is not None:
trajectories[b].append(SolveResult(method_description, x0[b], f_b_losses[b], 0, 1, False, False, ""))
return f_b_losses_np[b], f_grad_np[b]
def callback(x, *args): # L-BFGS-B only passes x but the documentation says (x, state)
iterations[b] += 1
loss = min(recent_b_losses)
recent_b_losses.clear()
final_losses[b] = loss
if trajectories is not None:
trajectories[b].append(SolveResult(method_description, x, loss, iterations[b], function_evaluations[b], False, False, ""))
res = minimize(fun=b_fun, x0=x0[b], jac=True, method=method, tol=atol[b], options={'maxiter': max_iter[b]}, callback=callback)
assert isinstance(res, OptimizeResult)
# res.nit, res.nfev
xs[b] = res.x
converged[b] = res.success
diverged[b] = res.status not in (0, 1) # 0=success
messages[b] = res.message
finished[b] = True
while not all_finished:
f_input_available.wait()
f_output_available.wait()
b_thread = Thread(target=b_thread)
threads.append(b_thread)
b_thread.start()
while True:
f_input_available.wait()
if all(finished):
all_finished = True
f_output_available.wait()
break
_, f_b_losses, f_grad = fg(self.stack(f_inputs))
f_b_losses_np = self.numpy(f_b_losses).astype(numpy.float64)
f_grad_np = self.numpy(f_grad).astype(numpy.float64)
f_output_available.wait()
for b_thread in threads:
b_thread.join() # make sure threads exit correctly
if trj:
max_trajectory_length = max([len(t) for t in trajectories])
last_points = [SolveResult(method_description, xs[b], final_losses[b], iterations[b], function_evaluations[b], converged[b], diverged[b], "") for b in range(batch_size)]
trajectories = [t[:-1] + [last_point] * (max_trajectory_length - len(t) + 1) for t, last_point in zip(trajectories, last_points)]
trajectory = []
for states in zip(*trajectories):
x = self.stack([self.to_float(state.x) for state in states])
residual = self.stack([state.residual for state in states])
iterations = [state.iterations for state in states]
function_evaluations = [state.function_evaluations for state in states]
converged = [state.converged for state in states]
diverged = [state.diverged for state in states]
trajectory.append(SolveResult(method_description, x, residual, iterations, function_evaluations, converged, diverged, messages))
return trajectory
else:
x = self.stack(xs)
residual = self.stack(final_losses)
return SolveResult(method_description, x, residual, iterations, function_evaluations, converged, diverged, messages)
def linear_solve(self, method: str, lin, y, x0, rtol, atol, max_iter, trj: bool) -> SolveResult or List[SolveResult]:
"""
Solve the system of linear equations A · x = y.
This method need not provide a gradient for the operation.
Args:
method: Which algorithm to use. One of `('auto', 'CG', 'CG-adaptive')`.
lin: Linear operation. One of
* sparse/dense matrix valid for all instances
* tuple/list of sparse/dense matrices for varying matrices along batch, must have the same nonzero locations.
* linear function A(x), must be called on all instances in parallel
y: target result of A * x. 2nd order tensor (batch, vector) or list of vectors.
x0: Initial guess of size (batch, parameters)
rtol: Relative tolerance of size (batch,)
atol: Absolute tolerance of size (batch,)
max_iter: Maximum number of iterations of size (batch,)
trj: Whether to record and return the optimization trajectory as a `List[SolveResult]`.
Returns:
result: `SolveResult` or `List[SolveResult]`, depending on `trj`.
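Example (a minimal sketch; `A` is assumed to be a dense (n, n) matrix valid for all batch entries):
```python
result = backend.linear_solve('CG', A, y, x0, rtol, atol, max_iter, trj=False)
x = result.x  # solution estimate of shape (batch, parameters)
```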
"""
if method == 'auto':
return self.conjugate_gradient_adaptive(lin, y, x0, rtol, atol, max_iter, trj)
elif method == 'CG':
return self.conjugate_gradient(lin, y, x0, rtol, atol, max_iter, trj)
elif method == 'CG-adaptive':
return self.conjugate_gradient_adaptive(lin, y, x0, rtol, atol, max_iter, trj)
else:
raise NotImplementedError(f"Method '{method}' not supported for linear solve.")
def conjugate_gradient(self, lin, y, x0, rtol, atol, max_iter, trj: bool) -> SolveResult or List[SolveResult]:
""" Standard conjugate gradient algorithm. Signature matches to `Backend.linear_solve()`. """
# Based on "An Introduction to the Conjugate Gradient Method Without the Agonizing Pain" by Jonathan Richard Shewchuk
# symbols: dx=d, dy=q, step_size=alpha, residual_squared=delta, residual=r, y=b
method = f"Φ-Flow CG ({self.name})"
y = self.to_float(y)
x0 = self.copy(self.to_float(x0), only_mutable=True)
batch_size = self.staticshape(y)[0]
tolerance_sq = self.maximum(rtol ** 2 * self.sum(y ** 2, -1), atol ** 2)
x = x0
dx = residual = y - self.linear(lin, x)
it_counter = 0
iterations = self.zeros([batch_size], DType(int, 32))
function_evaluations = self.ones([batch_size], DType(int, 32))
residual_squared = rsq0 = self.sum(residual ** 2, -1, keepdims=True)
diverged = self.any(~self.isfinite(x), axis=(1,))
converged = self.all(residual_squared <= tolerance_sq, axis=(1,))
trajectory = [SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, "")] if trj else None
finished = converged | diverged | (iterations >= max_iter); not_finished_1 = self.to_int32(~finished) # ; active = self.to_float(self.expand_dims(not_finished_1, -1))
while ~self.all(finished):
it_counter += 1; iterations += not_finished_1
dy = self.linear(lin, dx); function_evaluations += not_finished_1
dx_dy = self.sum(dx * dy, axis=-1, keepdims=True)
step_size = self.divide_no_nan(residual_squared, dx_dy)
step_size *= self.expand_dims(self.to_float(not_finished_1), -1) # this is not really necessary but ensures batch-independence
x += step_size * dx
if it_counter % 50 == 0:
residual = y - self.linear(lin, x); function_evaluations += 1
else:
residual = residual - step_size * dy # in-place subtraction affects convergence
residual_squared_old = residual_squared
residual_squared = self.sum(residual ** 2, -1, keepdims=True)
dx = residual + self.divide_no_nan(residual_squared, residual_squared_old) * dx
diverged = self.any(residual_squared / rsq0 > 100, axis=(1,)) & (iterations >= 8)
converged = self.all(residual_squared <= tolerance_sq, axis=(1,))
if trajectory is not None:
trajectory.append(SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, ""))
x = self.copy(x)
iterations = self.copy(iterations)
finished = converged | diverged | (iterations >= max_iter); not_finished_1 = self.to_int32(~finished) # ; active = self.to_float(self.expand_dims(not_finished_1, -1))
return trajectory if trj else SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, "")
def conjugate_gradient_adaptive(self, lin, y, x0, rtol, atol, max_iter, trj: bool) -> SolveResult or List[SolveResult]:
""" Conjugate gradient algorithm with adaptive step size. Signature matches to `Backend.linear_solve()`. """
# Based on the variant described in "Methods of Conjugate Gradients for Solving Linear Systems" by Magnus R. Hestenes and Eduard Stiefel
# https://nvlpubs.nist.gov/nistpubs/jres/049/jresv49n6p409_A1b.pdf
method = f"Φ-Flow CG-adaptive ({self.name})"
y = self.to_float(y)
x0 = self.copy(self.to_float(x0), only_mutable=True)
batch_size = self.staticshape(y)[0]
tolerance_sq = self.maximum(rtol ** 2 * self.sum(y ** 2, -1), atol ** 2)
x = x0
dx = residual = y - self.linear(lin, x)
dy = self.linear(lin, dx)
iterations = self.zeros([batch_size], DType(int, 32))
function_evaluations = self.ones([batch_size], DType(int, 32))
residual_squared = rsq0 = self.sum(residual ** 2, -1, keepdims=True)
diverged = self.any(~self.isfinite(x), axis=(1,))
converged = self.all(residual_squared <= tolerance_sq, axis=(1,))
trajectory = [SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, "")] if trj else None
continue_ = ~converged & ~diverged & (iterations < max_iter)
def loop(continue_, it_counter, x, dx, dy, residual, iterations, function_evaluations, _converged, _diverged):
continue_1 = self.to_int32(continue_)
it_counter += 1
iterations += continue_1
dx_dy = self.sum(dx * dy, axis=-1, keepdims=True)
step_size = self.divide_no_nan(self.sum(dx * residual, axis=-1, keepdims=True), dx_dy)
step_size *= self.expand_dims(self.to_float(continue_1), -1) # this is not really necessary but ensures batch-independence
x += step_size * dx
# if it_counter % 50 == 0: # Not traceable since Python bool
# residual = y - self.linear(lin, x); function_evaluations += 1
# else:
residual = residual - step_size * dy # in-place subtraction affects convergence
residual_squared = self.sum(residual ** 2, -1, keepdims=True)
dx = residual - self.divide_no_nan(self.sum(residual * dy, axis=-1, keepdims=True) * dx, dx_dy)
dy = self.linear(lin, dx); function_evaluations += continue_1
diverged = self.any(residual_squared / rsq0 > 100, axis=(1,)) & (iterations >= 8)
converged = self.all(residual_squared <= tolerance_sq, axis=(1,))
if trajectory is not None:
trajectory.append(SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, ""))
x = self.copy(x)
iterations = self.copy(iterations)
continue_ = ~converged & ~diverged & (iterations < max_iter)
return continue_, it_counter, x, dx, dy, residual, iterations, function_evaluations, converged, diverged
_, _, x, _, _, residual, iterations, function_evaluations, converged, diverged =\
self.while_loop(loop, (continue_, 0, x, dx, dy, residual, iterations, function_evaluations, converged, diverged))
return trajectory if trj else SolveResult(method, x, residual, iterations, function_evaluations, converged, diverged, "")
def linear(self, lin, vector):
if callable(lin):
return lin(vector)
elif isinstance(lin, (tuple, list)):
for lin_i in lin:
lin_shape = self.staticshape(lin_i)
assert len(lin_shape) == 2
return self.stack([self.matmul(m, v) for m, v in zip(lin, self.unstack(vector))])
else:
lin_shape = self.staticshape(lin)
assert len(lin_shape) == 2, f"A must be a matrix but got shape {lin_shape}"
return self.matmul(lin, vector)
def gradients(self, y, xs: tuple or list, grad_y) -> tuple:
raise NotImplementedError(self)
def record_gradients(self, xs: tuple or list, persistent=False):
raise NotImplementedError(self)
def stop_gradient(self, value):
raise NotImplementedError(self)
def grid_sample(self, grid, spatial_dims: tuple, coordinates, extrapolation='constant'):
"""
Interpolates a regular grid at the specified coordinates.
Args:
grid: Tensor
spatial_dims: Dimension indices that correspond to coordinate vectors
coordinates: Tensor of floating grid indices.
The last dimension must match `spatial_dims`.
The first grid point of dimension i lies at position 0, the last at values.shape[i]-1.
extrapolation: Values to use for coordinates outside the grid.
One of `('undefined', 'zeros', 'boundary', 'periodic', 'symmetric', 'reflect')`.
Returns:
sampled values with linear interpolation
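Example (sketch): bilinear lookup on a (batch, y, x, channel) grid:
```python
# coordinates: (batch, n, 2) float indices into the (y, x) dimensions
sampled = backend.grid_sample(grid, (1, 2), coordinates, extrapolation='boundary')
```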
"""
return NotImplemented
def variable(self, value):
return NotImplemented
def ndims(self, tensor):
return len(self.staticshape(tensor))
def size(self, array):
return self.prod(self.shape(array))
def batch_gather(self, tensor, batches):
if isinstance(batches, int):
batches = [batches]
return tensor[batches, ...]
def unstack(self, tensor, axis=0, keepdims=False) -> tuple:
if axis < 0:
axis += len(tensor.shape)
if axis >= len(tensor.shape) or axis < 0:
raise ValueError("Illegal axis value")
result = []
for slice_idx in range(tensor.shape[axis]):
if keepdims:
component = tensor[tuple([slice(slice_idx, slice_idx + 1) if d == axis else slice(None) for d in range(len(tensor.shape))])]
else:
component = tensor[tuple([slice_idx if d == axis else slice(None) for d in range(len(tensor.shape))])]
result.append(component)
return tuple(result)
def equal(self, x, y):
""" Element-wise equality check """
raise NotImplementedError(self)
def not_equal(self, x, y):
return ~self.equal(x, y)
def greater_than(self, x, y):
x, y = self.auto_cast(x, y)
return x > y
def greater_or_equal(self, x, y):
x, y = self.auto_cast(x, y)
return x >= y
def add(self, a, b):
a, b = self.auto_cast(a, b)
return a + b
def sub(self, a, b):
a, b = self.auto_cast(a, b)
return a - b
def mul(self, a, b):
a, b = self.auto_cast(a, b)
return a * b
def div(self, numerator, denominator):
numerator, denominator = self.auto_cast(numerator, denominator)
return numerator / denominator
def pow(self, base, exp):
base, exp = self.auto_cast(base, exp)
return base ** exp
def mod(self, dividend, divisor):
dividend, divisor = self.auto_cast(dividend, divisor)
return dividend % divisor
def and_(self, a, b):
a, b = self.auto_cast(a, b)
return a & b
def or_(self, a, b):
a, b = self.auto_cast(a, b)
return a | b
def xor(self, a, b):
a, b = self.auto_cast(a, b)
return a ^ b
def floordiv(self, a, b):
a, b = self.auto_cast(a, b)
return a // b
BACKENDS = []
""" Global list of all registered backends. Register a `Backend` by adding it to the list. """
_DEFAULT = [] # [0] = global default, [1:] from 'with' blocks
_PRECISION = [32] # [0] = global precision in bits, [1:] from 'with' blocks
def choose_backend(*values, prefer_default=False) -> Backend:
"""
Selects a suitable backend to handle the given values.
This function is used by most math functions operating on `Tensor` objects to delegate the actual computations.
Args:
*values:
prefer_default: if True, selects the default backend assuming it can handle the values, see `default_backend()`.
Returns:
the selected `Backend`
Raises:
`NoBackendFound` if no registered backend can handle the given values.
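Example (sketch; assumes a NumPy backend is registered):
```python
import numpy as np
backend = choose_backend(np.zeros(4))  # selects the backend native to NumPy arrays
```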
"""
# --- Default Backend has priority ---
if _is_applicable(_DEFAULT[-1], values) and (prefer_default or _is_specific(_DEFAULT[-1], values)):
return _DEFAULT[-1]
# --- Filter out non-applicable ---
backends = [backend for backend in BACKENDS if _is_applicable(backend, values)]
if len(backends) == 0:
raise NoBackendFound(f"No backend found for types {[type(v).__name__ for v in values]}; registered backends are {BACKENDS}")
# --- Native tensors? ---
for backend in backends:
if _is_specific(backend, values):
return backend
return backends[0]
class NoBackendFound(Exception):
"""
Thrown by `choose_backend` if no backend can handle the given values.
"""
def __init__(self, msg):
Exception.__init__(self, msg)
def default_backend() -> Backend:
"""
The default backend is preferred by `choose_backend()`.
The default backend can be set globally using `set_global_default_backend()` and locally using `with backend:`.
Returns:
current default `Backend`
"""
return _DEFAULT[-1]
def context_backend() -> Backend or None:
"""
Returns the backend set by the inner-most surrounding `with backend:` block.
If called outside a backend context, returns `None`.
Returns:
`Backend` or `None`
"""
return _DEFAULT[-1] if len(_DEFAULT) > 1 else None
def set_global_default_backend(backend: Backend):
"""
Sets the given backend as default.
This setting can be overridden using `with backend:`.
See `default_backend()`, `choose_backend()`.
Args:
backend: `Backend` to set as default
"""
assert isinstance(backend, Backend)
_DEFAULT[0] = backend
def set_global_precision(floating_point_bits: int):
"""
Sets the global floating point precision, which affects all registered backends.
If `floating_point_bits` is an integer, all floating point tensors created henceforth will be of the corresponding data type, float16, float32 or float64.
Operations may also convert floating point values to this precision, even if the input had a different precision.
If `floating_point_bits` is None, new tensors will default to float32 unless specified otherwise.
The output of math operations has the same precision as its inputs.
Args:
floating_point_bits: one of (16, 32, 64, None)
"""
_PRECISION[0] = floating_point_bits
def get_precision() -> int:
"""
Gets the current target floating point precision in bits.
The precision can be set globally using `set_global_precision()` or locally using `with precision(p):`.
Any Backend method may convert floating point values to this precision, even if the input had a different precision.
Returns:
16 for half, 32 for single, 64 for double
"""
return _PRECISION[-1]
@contextmanager
def precision(floating_point_bits: int):
"""
Sets the floating point precision for the local context.
Usage: `with precision(p):`
This overrides the global setting, see `set_global_precision()`.
Args:
floating_point_bits: 16 for half, 32 for single, 64 for double
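Example (a minimal sketch):
```python
with precision(64):
    x = default_backend().ones([4])  # created with 64-bit floats
```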
"""
_PRECISION.append(floating_point_bits)
try:
yield None
finally:
_PRECISION.pop(-1)
def convert(tensor, backend: Backend = None, use_dlpack=True):
"""
Convert a Tensor to the native format of `backend`.
If the target backend can operate natively on `tensor`, returns `tensor`.
If both backends support *DLPack* and `use_dlpack=True`, uses zero-copy conversion using the DLPack library.
Else, intermediately converts `tensor` to a NumPy array.
*Warning*: This operation breaks the automatic differentiation chain.
Args:
tensor: Native tensor belonging to any registered backend.
backend: Target backend. If `None`, uses the current default backend, see `default_backend()`.
use_dlpack: Whether to attempt zero-copy conversion via DLPack when both backends support it.
Returns:
Tensor belonging to `backend`.
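Example (sketch; `torch_backend` is assumed to be a registered PyTorch backend instance):
```python
native = convert(numpy_array, backend=torch_backend)  # zero-copy via DLPack when supported
```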
"""
backend = backend or default_backend()
current_backend = choose_backend(tensor, prefer_default=False)
if backend.is_tensor(tensor, True) or backend is current_backend:
return tensor
if use_dlpack and current_backend.supports(Backend.to_dlpack) and backend.supports(Backend.from_dlpack):
capsule = current_backend.to_dlpack(tensor)
return backend.from_dlpack(capsule)
else:
nparray = current_backend.numpy(tensor)
return backend.as_tensor(nparray)
# Backend choice utility functions
def _is_applicable(backend, values):
for value in values:
if not backend.is_tensor(value, only_native=False):
return False
return True
def _is_specific(backend, values):
for value in values:
if backend.is_tensor(value, only_native=True):
return True
return False
# Other low-level helper functions
def combined_dim(dim1, dim2, type_str: str = 'batch'):
if dim1 is None and dim2 is None:
return None
if dim1 is None or dim1 == 1:
return dim2
if dim2 is None or dim2 == 1:
return dim1
assert dim1 == dim2, f"Incompatible {type_str} dimensions: x0 {dim1}, y {dim2}"
return dim1
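# Examples: combined_dim(1, 8) == 8, combined_dim(None, 3) == 3,
# combined_dim(4, 4) == 4; combined_dim(4, 8) raises an AssertionError.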
# --- file_importer0.py (repo: Alva789ro/Regional-Comprehensive-Economic-Partnership-RCEP-Economic-Default-Risk-Analysis, MIT license) ---
import xlsxwriter
import pandas as pd
import numpy as np
import mysql.connector
australia=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Australia')
brunei=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Brunei')
cambodia=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Cambodia')
china=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='China')
indonesia=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Indonesia')
japan=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Japan')
lao=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Lao')
malaysia=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Malaysia')
myanmar=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Myanmar')
new_zeland=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='New Zeland')
philipines=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Philipines')
singapore=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Singapore')
thailand=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Thailand')
vietnam=pd.read_excel(r'\Users\jesica\Desktop\RCEP_economic_analysis.xlsx', sheet_name='Vietnam')
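# The MySQL upload below is kept commented out in the original; the connection
# settings (host/user/passwd/database) must be filled in before enabling it.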
'''
mydb = mysql.connector.connect(
host = "localhost",
user = "root",
passwd = "",
database = ""
)
mycursor = mydb.cursor()
sqlformula1 = "INSERT INTO australia VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(australia['Year'], australia['RGDP'], australia['NGDP'], australia['GDP_pc'], australia['Inflation'], australia['Unemployment_Rate'], australia['Net_LB'], australia['Account_Balance']):
mycursor.execute(sqlformula1, [a, b, c, d, e, f, g, h])
sqlformula2 = "INSERT INTO brunei VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(brunei['Year'], brunei['RGDP'], brunei['NGDP'], brunei['GDP_pc'], brunei['Inflation'], brunei['Unemployment_Rate'], brunei['Net_LB'], brunei['Account_Balance']):
mycursor.execute(sqlformula2, [a, b, c, d, e, f, g, h])
sqlformula3 = "INSERT INTO cambodia VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(cambodia['Year'], cambodia['RGDP'], cambodia['NGDP'], cambodia['GDP_pc'], cambodia['Inflation'], cambodia['Unemployment_Rate'], cambodia['Net_LB'], cambodia['Account_Balance']):
mycursor.execute(sqlformula3, [a, b, c, d, e, f, g, h])
sqlformula4 = "INSERT INTO china VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(china['Year'], china['RGDP'], china['NGDP'], china['GDP_pc'], china['Inflation'], china['Unemployment_Rate'], china['Net_LB'], china['Account_Balance']):
mycursor.execute(sqlformula4, [a, b, c, d, e, f, g, h])
sqlformula5 = "INSERT INTO indonesia VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(indonesia['Year'], indonesia['RGDP'], indonesia['NGDP'], indonesia['GDP_pc'], indonesia['Inflation'], indonesia['Unemployment_Rate'], indonesia['Net_LB'], indonesia['Account_Balance']):
mycursor.execute(sqlformula5, [a, b, c, d, e, f, g, h])
sqlformula6 = "INSERT INTO japan VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(japan['Year'], japan['RGDP'], japan['NGDP'], japan['GDP_pc'], japan['Inflation'], japan['Unemployment_Rate'], japan['Net_LB'], japan['Account_Balance']):
mycursor.execute(sqlformula6, [a, b, c, d, e, f, g, h])
sqlformula7 = "INSERT INTO lao VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(lao['Year'], lao['RGDP'], lao['NGDP'], lao['GDP_pc'], lao['Inflation'], lao['Unemployment_Rate'], lao['Net_LB'], lao['Account_Balance']):
mycursor.execute(sqlformula7, [a, b, c, d, e, f, g, h])
sqlformula8 = "INSERT INTO malaysia VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(malaysia['Year'], malaysia['RGDP'], malaysia['NGDP'], malaysia['GDP_pc'], malaysia['Inflation'], malaysia['Unemployment_Rate'], malaysia['Net_LB'], malaysia['Account_Balance']):
mycursor.execute(sqlformula8, [a, b, c, d, e, f, g, h])
sqlformula9 = "INSERT INTO myanmar VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(myanmar['Year'], myanmar['RGDP'], myanmar['NGDP'], myanmar['GDP_pc'], myanmar['Inflation'], myanmar['Unemployment_Rate'], myanmar['Net_LB'], myanmar['Account_Balance']):
mycursor.execute(sqlformula9, [a, b, c, d, e, f, g, h])
sqlformula10 = "INSERT INTO new_zeland VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(new_zeland['Year'], new_zeland['RGDP'], new_zeland['NGDP'], new_zeland['GDP_pc'], new_zeland['Inflation'], new_zeland['Unemployment_Rate'], new_zeland['Net_LB'], new_zeland['Account_Balance']):
mycursor.execute(sqlformula10, [a, b, c, d, e, f, g, h])
sqlformula11 = "INSERT INTO philipines VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(philipines['Year'], philipines['RGDP'], philipines['NGDP'], philipines['GDP_pc'], philipines['Inflation'], philipines['Unemployment_Rate'], philipines['Net_LB'], philipines['Account_Balance']):
mycursor.execute(sqlformula11, [a, b, c, d, e, f, g, h])
sqlformula12 = "INSERT INTO singapore VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(singapore['Year'], singapore['RGDP'], singapore['NGDP'], singapore['GDP_pc'], singapore['Inflation'], singapore['Unemployment_Rate'], singapore['Net_LB'], singapore['Account_Balance']):
mycursor.execute(sqlformula12, [a, b, c, d, e, f, g, h])
sqlformula13 = "INSERT INTO thailand VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(thailand['Year'], thailand['RGDP'], thailand['NGDP'], thailand['GDP_pc'], thailand['Inflation'], thailand['Unemployment_Rate'], thailand['Net_LB'], thailand['Account_Balance']):
mycursor.execute(sqlformula13, [a, b, c, d, e, f, g, h])
sqlformula14 = "INSERT INTO vietnam VALUES(%s, %s, %s, %s, %s, %s, %s, %s)"
for a, b, c, d, e, f, g, h in zip(vietnam['Year'], vietnam['RGDP'], vietnam['NGDP'], vietnam['GDP_pc'], vietnam['Inflation'], vietnam['Unemployment_Rate'], vietnam['Net_LB'], vietnam['Account_Balance']):
mycursor.execute(sqlformula14, [a, b, c, d, e, f, g, h])
'''
#mydb.commit()
| 72.494382 | 227 | 0.67359 | 1,019 | 6,452 | 4.14524 | 0.087341 | 0.046402 | 0.059659 | 0.066288 | 0.334754 | 0.334754 | 0.334754 | 0.334754 | 0.308239 | 0.308239 | 0 | 0.006621 | 0.110508 | 6,452 | 88 | 228 | 73.318182 | 0.729395 | 0.002015 | 0 | 0 | 0 | 0 | 0.538147 | 0.467302 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# --- python/das/types.py (repo: marza-animation-planet/das, MIT license) ---
import sys
import das
import traceback
class ReservedNameError(Exception):
def __init__(self, name):
super(ReservedNameError, self).__init__("'%s' is a reserved name" % name)
class VersionError(Exception):
def __init__(self, msg=None, current_version=None, required_version=None):
fullmsg = "ersion error"
if required_version:
fullmsg += ": %s required" % required_version
else:
fullmsg += ": no requirements"
if current_version:
fullmsg += ", %s in use" % current_version
else:
fullmsg += ", no version info"
if msg:
fullmsg = msg + " v" + fullmsg
else:
fullmsg = "V" + fullmsg
super(VersionError, self).__init__(fullmsg)
class GlobalValidationDisabled(object):
def __init__(self, data):
super(GlobalValidationDisabled, self).__init__()
self.data = data
self.oldstate = None
def __enter__(self):
try:
self.oldstate = self.data._is_global_validation_enabled()
self.data._enable_global_validation(False)
except:
pass
return self.data
def __exit__(self, type, value, traceback):
if self.oldstate is not None:
self.data._enable_global_validation(self.oldstate)
self.oldstate = None
# Always re-raise exception
return False
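# Usage sketch (hypothetical `data` instance with fields `a` and `b`): suspend
# global validation while making several edits, then validate once afterwards:
# with GlobalValidationDisabled(data):
#     data.a = 1
#     data.b = 2
# TypeBase.ValidateGlobally(data)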
class TypeBase(object):
@classmethod
def TransferGlobalValidator(klass, src, dst):
if isinstance(src, klass) and isinstance(dst, klass):
dst._set_validate_globally_cb(src._gvalidate)
return dst
@classmethod
def ValidateGlobally(klass, inst):
if isinstance(inst, klass):
inst._gvalidate()
return inst
def __init__(self, *args):
super(TypeBase, self).__init__()
self.__dict__["_schema_type"] = None
self.__dict__["_validate_globally_cb"] = None
self.__dict__["_global_validation_enabled"] = True
def _wrap(self, rhs):
st = self._get_schema_type()
rv = self.__class__(rhs if st is None else st._validate_self(rhs))
rv._set_schema_type(self._get_schema_type())
return rv
def _adapt_value(self, value, key=None, index=None):
return das.adapt_value(value, schema_type=self._get_schema_type(), key=key, index=index)
def _validate(self, schema_type=None):
if schema_type is None:
schema_type = self._get_schema_type()
if schema_type is not None:
schema_type.validate(self)
self._set_schema_type(schema_type)
def _gvalidate(self):
st = self._get_schema_type()
if st is not None:
# run self validation first (container validation)
st._validate_self(self)
if hasattr(self, "_is_global_validation_enabled"):
if not self._is_global_validation_enabled():
# Skip global validaton
return
gvcb = self._get_validate_globally_cb()
if gvcb is not None:
gvcb()
if hasattr(self, "_validate_globally"):
try:
getattr(self, "_validate_globally")()
except:
_, ei, tb = sys.exc_info()
ei = das.ValidationError("Global Validation Failed (%s)" % str(ei))
raise ei.__class__, ei, tb
def _get_schema_type(self):
return self.__dict__["_schema_type"]
def _set_schema_type(self, schema_type):
self.__dict__["_schema_type"] = schema_type
def _get_validate_globally_cb(self):
return self.__dict__["_validate_globally_cb"]
def _set_validate_globally_cb(self, cb):
self.__dict__["_validate_globally_cb"] = cb
def _is_global_validation_enabled(self):
return self.__dict__["_global_validation_enabled"]
def _enable_global_validation(self, on):
self.__dict__["_global_validation_enabled"] = on
class Tuple(TypeBase, tuple):
def __init__(self, *args):
# *args must be declared here, but the tuple contents are already fixed:
# immutable types like tuple are constructed in __new__, so by the time
# __init__ runs the data cannot be modified.
super(Tuple, self).__init__()
def __add__(self, y):
raise das.ValidationError("Expected a tuple of size %d, got %d" % (len(self), len(self) + len(y)))
def __getitem__(self, i):
return TypeBase.TransferGlobalValidator(self, super(Tuple, self).__getitem__(i))
class Sequence(TypeBase, list):
def __init__(self, *args):
TypeBase.__init__(self)
list.__init__(self, *args)
def _wrap_index(self, i, n=None, clamp=False):
if i < 0:
if n is None:
n = len(self)
ii = i + n
if ii < 0:
if clamp:
return 0
else:
raise IndexError("list index out of range")
else:
return ii
else:
return i
def __imul__(self, n):
oldlen = len(self)
super(Sequence, self).__imul__(n)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).__setslice__(oldlen, len(self), [])
except Exception, e:
print("das.types.Sequence.__imul__: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
return self
def __mul__(self, n):
rv = self[:]
rv.__imul__(n)
return rv
def __rmul__(self, n):
return self.__mul__(n)
def __iadd__(self, y):
n = len(self)
super(Sequence, self).__iadd__([self._adapt_value(x, index=n+i) for i, x in enumerate(y)])
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).__setslice__(n, len(self), [])
except Exception, e:
print("das.types.Sequence.__iadd__: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
return self
def __add__(self, y):
rv = self[:]
rv.__iadd__(y)
return rv
def __setitem__(self, i, y):
super(Sequence, self).__setitem__(i, self._adapt_value(y, index=i))
self._gvalidate()
def __getitem__(self, i):
return TypeBase.TransferGlobalValidator(self, super(Sequence, self).__getitem__(i))
def __delitem__(self, i):
ii = self._wrap_index(i, clamp=False)
item = super(Sequence, self).__getitem__(ii)
super(Sequence, self).__delitem__(i)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).insert(ii, item)
except Exception, e:
print("das.types.Sequence.__delitem__: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
def __iter__(self):
for item in super(Sequence, self).__iter__():
yield TypeBase.TransferGlobalValidator(self, item)
def __setslice__(self, i, j, y):
oldvals = super(Sequence, self).__getslice__(i, j)
newvals = [self._adapt_value(x, index=i+k) for k, x in enumerate(y)]
super(Sequence, self).__setslice__(i, j, newvals)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
ii = self._wrap_index(i, clamp=True)
super(Sequence, self).__setslice__(ii, ii+len(newvals), oldvals)
except Exception, e:
print("das.types.Sequence.__setslice__: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
def __getslice__(self, i, j):
return self._wrap(super(Sequence, self).__getslice__(i, j))
def __delslice__(self, i, j):
oldvals = super(Sequence, self).__getslice__(i, j)
super(Sequence, self).__delslice__(i, j)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
ii = self._wrap_index(i, clamp=True)
super(Sequence, self).__setslice__(ii, ii, oldvals)
except Exception, e:
print("das.types.Sequence.__setslice__: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
# def __contains__(self, y):
# try:
# _v = self._adapt_value(y, index=0)
# return super(Sequence, self).__contains__(_v)
# except:
# return False
def index(self, y):
return super(Sequence, self).index(self._adapt_value(y, index=0))
def insert(self, i, y):
super(Sequence, self).insert(i, self._adapt_value(y, index=i))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).pop(self._wrap_index(i, n=len(self)-1, clamp=True))
except Exception, e:
print("das.types.Sequence.insert: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
def append(self, y):
n = len(self)
super(Sequence, self).append(self._adapt_value(y, index=n))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).pop()
except Exception, e:
print("das.types.Sequence.append: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
def extend(self, y):
newvals = [self._adapt_value(x, index=len(self)+i) for i, x in enumerate(y)]
super(Sequence, self).extend(newvals)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).__setslice__(len(self) - len(newvals), len(self), [])
except Exception, e:
print("das.types.Sequence.extend: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
def pop(self, *args):
rv = super(Sequence, self).pop(*args)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
if args:
super(Sequence, self).insert(self._wrap_index(args[0], n=len(self)+1, clamp=False), rv)
else:
super(Sequence, self).append(rv)
except Exception, e:
print("das.types.Sequence.pop: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
return rv
def remove(self, y):
idx = self.index(y)
item = self[idx]
super(Sequence, self).remove(item)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Sequence, self).insert(idx, item)
except Exception, e:
print("das.types.Sequence.remove: Failed to recover sequence data (%s)" % e)
raise ec, ei, tb
class Set(TypeBase, set):
def __init__(self, args):
TypeBase.__init__(self)
set.__init__(self, args)
def __iand__(self, y):
oldvals = super(Set, self).copy()
super(Set, self).__iand__(set([self._adapt_value(x, index=i) for i, x in enumerate(y)]))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).clear()
super(Set, self).__ior__(oldvals)
except Exception, e:
print("das.types.Set.__iand__: Failed to recover set data (%s)" % e)
raise ec, ei, tb
return self
def __and__(self, y):
rv = self.copy()
rv &= y
return rv
def __rand__(self, y):
return self.__and__(y)
def __isub__(self, y):
oldvals = super(Set, self).copy()
super(Set, self).__isub__(set([self._adapt_value(x, index=i) for i, x in enumerate(y)]))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).clear()
super(Set, self).__ior__(oldvals)
except Exception, e:
print("das.types.Set.__isub__: Failed to recover set data (%s)" % e)
raise ec, ei, tb
return self
def __sub__(self, y):
rv = self.copy()
rv -= y
return rv
def __rsub__(self, y):
return self.__sub__(y)
def __ior__(self, y):
oldvals = super(Set, self).copy()
super(Set, self).__ior__(set([self._adapt_value(x, index=i) for i, x in enumerate(y)]))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).clear()
super(Set, self).__ior__(oldvals)
except Exception, e:
print("das.types.Set.__ior__: Failed to recover set data (%s)" % e)
raise ec, ei, tb
return self
def __or__(self, y):
rv = self.copy()
rv |= y
return rv
def __ror__(self, y):
return self.__or__(y)
def __ixor__(self, y):
oldvals = super(Set, self).copy()
super(Set, self).__ixor__(set([self._adapt_value(x, index=i) for i, x in enumerate(y)]))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).clear()
super(Set, self).__ior__(oldvals)
except Exception, e:
print("das.types.Set.__ixor__: Failed to recover set data (%s)" % e)
raise ec, ei, tb
return self
def __xor__(self, y):
rv = self.copy()
rv ^= y
return rv
def __rxor__(self, y):
rv = self.copy()
rv ^= y
return rv
def __cmp__(self, oth):
# the built-in set class doesn't implement __cmp__,
# but das relies on comparing sets elsewhere, so define a simple size/content ordering
if len(self.symmetric_difference(oth)) == 0:
return 0
elif len(self) <= len(oth):
return -1
else:
return 1
def __iter__(self):
for item in super(Set, self).__iter__():
yield TypeBase.TransferGlobalValidator(self, item)
def clear(self):
oldvals = super(Set, self).copy()
super(Set, self).clear()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).__ior__(oldvals)
except Exception, e:
print("das.types.Set.clear: Failed to recover set data (%s)" % e)
raise ec, ei, tb
def copy(self):
return self._wrap(self)
def add(self, e):
ae = self._adapt_value(e, index=len(self))
if ae in self:
return
super(Set, self).add(ae)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).remove(ae)
except Exception, e:
print("das.types.Set.add: Failed to recover set data (%s)" % e)
raise ec, ei, tb
def update(self, *args):
added = set()
for y in args:
lst = [self._adapt_value(x, index=i) for i, x in enumerate(y)]
for item in lst:
if item in self:
continue
super(Set, self).add(item)
added.add(item)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
for item in added:
super(Set, self).remove(item)
except Exception, e:
print("das.types.Set.update: Failed to recover set data (%s)" % e)
raise ec, ei, tb
def pop(self):
item = super(Set, self).pop()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Set, self).add(item)
except Exception, e:
print("das.types.Set.pop: Failed to recover set data (%s)" % e)
raise ec, ei, tb
return item
def difference(self, rhs):
return self.__sub__(rhs)
def union(self, rhs):
return self.__or__(rhs)
def intersection(self, rhs):
return self.__and__(rhs)
def symmetric_difference(self, rhs):
return self.__xor__(rhs)
class Dict(TypeBase, dict):
def __init__(self, *args, **kwargs):
TypeBase.__init__(self)
dict.__init__(self, *args, **kwargs)
def _adapt_key(self, key):
st = self._get_schema_type()
return (key if st is None else das.adapt_value(key, schema_type=st.ktype))
def __setitem__(self, k, v):
k = self._adapt_key(k)
wasset = (k in self)
oldval = (self[k] if wasset else None)
super(Dict, self).__setitem__(k, self._adapt_value(v, key=k))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
if wasset:
super(Dict, self).__setitem__(k, oldval)
else:
del(self[k])
except Exception, e:
print("das.types.Dict.__setitem__: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
def __getitem__(self, k):
return TypeBase.TransferGlobalValidator(self, super(Dict, self).__getitem__(self._adapt_key(k)))
def __delitem__(self, k):
_k = self._adapt_key(k)
_v = super(Dict, self).__getitem__(_k)
super(Dict, self).__delitem__(_k)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Dict, self).__setitem__(_k, _v)
except Exception, e:
print("das.types.Dict.popitem: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
# def __contains__(self, k):
# try:
# _k = self._adapt_key(k)
# return super(Dict, self).__contains__(_k)
# except:
# return False
def setdefault(self, *args):
nargs = len(args)
if nargs > 2:
raise TypeError("setdefault expected at most 2 arguments, got %d" % nargs)
if nargs == 2:
args = (args[0], self._adapt_value(args[1], key=args[0]))
super(Dict, self).setdefault(*args)
def copy(self):
return self._wrap(self)
def update(self, *args, **kwargs):
oldvals = {}
remvals = set()
if len(args) == 1:
a0 = args[0]
if hasattr(a0, "keys"):
for k in a0.keys():
k = self._adapt_key(k)
if k in self:
oldvals[k] = self[k]
else:
remvals.add(k)
self[k] = self._adapt_value(a0[k], key=k)
else:
for k, v in a0:
k = self._adapt_key(k)
if k in self:
oldvals[k] = self[k]
else:
remvals.add(k)
self[k] = self._adapt_value(v, key=k)
elif len(args) > 1:
raise Exception("update expected at most 1 arguments, got %d" % len(args))
for k, v in kwargs.iteritems():
k = self._adapt_key(k)
if k in self:
if not k in oldvals:
oldvals[k] = self[k]
else:
remvals.add(k)
self[k] = self._adapt_value(v, key=k)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
for k in remvals:
super(Dict, self).__delitem__(k)
for k, v in oldvals.iteritems():
super(Dict, self).__setitem__(k, v)
except Exception, e:
print("das.types.Dict.update: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
def pop(self, k, *args):
_k = self._adapt_key(k)
_v = super(Dict, self).pop(_k, *args)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
# if _k is not defined but a default value is provided, we should not reach here
# as dict is actually unchanged
# -> no need to check if _k was a valid key
super(Dict, self).__setitem__(_k, _v)
except Exception, e:
print("das.types.Dict.popitem: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
return _v
def popitem(self):
item = super(Dict, self).popitem()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Dict, self).__setitem__(item[0], item[1])
except Exception, e:
print("das.types.Dict.popitem: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
return item
def clear(self):
items = super(Dict, self).items()
super(Dict, self).clear()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
super(Dict, self).update(items)
except Exception, e:
print("das.types.Dict.clear: Failed to recover dict data (%s)" % e)
raise ec, ei, tb
def itervalues(self):
for v in super(Dict, self).itervalues():
yield TypeBase.TransferGlobalValidator(self, v)
def values(self):
return [x for x in self.itervalues()]
def iteritems(self):
for k, v in super(Dict, self).iteritems():
yield k, TypeBase.TransferGlobalValidator(self, v)
def items(self):
return [x for x in self.iteritems()]
class Struct(TypeBase):
def __init__(self, *args, **kwargs):
TypeBase.__init__(self)
self.__dict__["_dict"] = {}
self._update(*args, **kwargs)
def __getattr__(self, k):
try:
k = self._get_alias(k)
return TypeBase.TransferGlobalValidator(self, self._dict[k])
except KeyError:
if hasattr(self._dict, k):
# Look for an override method of the same name prefixed by '_' in current class
k2 = '_' + k
if hasattr(self, k2):
#print("Forward '%s' to %s class '%s'" % (k, self.__class__.__name__, k2))
return getattr(self, k2)
else:
#print("Forward '%s' to dict class '%s'" % (k, k))
return getattr(self._dict, k)
else:
#raise AttributeError("'Struct' has no attribute '%s' (dict %s)" % (k, "has" if hasattr(self._dict, k) else "hasn't"))
return self.__getattribute__(k)
def __setattr__(self, k, v):
# Special case for __class__ member that we may want to modify for
# to enable dynamic function set binding
if k == "__class__":
super(Struct, self).__setattr__(k, v)
else:
k = self._get_alias(k)
self._check_reserved(k)
wasset = (k in self._dict)
oldval = (self._dict[k] if wasset else None)
self._dict[k] = self._adapt_value(v, key=k)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
if wasset:
self._dict[k] = oldval
else:
del(self._dict[k])
except Exception, e:
print("das.types.Struct.__setattr__: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
def __delattr__(self, k):
k = self._get_alias(k)
oldval = self._dict.get(k, None)
self._dict.__delitem__(k)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
# Note: we can reach here only if k was a valid key (otherwise __delitem__(k) would fail)
try:
self._dict[k] = oldval
except Exception, e:
print("das.types.Struct.__delattr__: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
def __getitem__(self, k):
k = self._get_alias(k)
return TypeBase.TransferGlobalValidator(self, self._dict.__getitem__(k))
def __setitem__(self, k, v):
k = self._get_alias(k)
self._check_reserved(k)
wasset = (k in self._dict)
oldval = (self._dict[k] if wasset else None)
self._dict.__setitem__(k, self._adapt_value(v, key=k))
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
if wasset:
self._dict[k] = oldval
else:
del(self._dict[k])
except Exception, e:
print("das.types.Struct.__setitem__: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
def __delitem__(self, k):
_k = k
k = self._get_alias(k)
oldval = self._dict.get(k, None)
self._dict.__delitem__(k)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
# Note: we can reach here only if k was a valid key (otherwise __delitem__(k) would fail)
try:
self._dict[k] = oldval
except Exception, e:
print("das.types.Struct.__delitem__: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
def __contains__(self, k):
return self._dict.__contains__(self._get_alias(k))
def __cmp__(self, oth):
return self._dict.__cmp__(oth._dict if isinstance(oth, Struct) else oth)
def __eq__(self, oth):
return self._dict.__eq__(oth._dict if isinstance(oth, Struct) else oth)
def __ge__(self, oth):
return self._dict.__ge__(oth._dict if isinstance(oth, Struct) else oth)
def __le__(self, oth):
return self._dict.__le__(oth._dict if isinstance(oth, Struct) else oth)
def __gt__(self, oth):
return self._dict.__gt__(oth._dict if isinstance(oth, Struct) else oth)
def __lt__(self, oth):
return self._dict.__lt__(oth._dict if isinstance(oth, Struct) else oth)
def __iter__(self):
return self._dict.__iter__()
def __len__(self):
return self._dict.__len__()
def __str__(self):
return self._dict.__str__()
def __repr__(self):
return self._dict.__repr__()
# Override of dict.has_key
def _has_key(self, k):
return self._dict.has_key(self._get_alias(k))
# Override of dict.pop
def _pop(self, k, *args):
_k = k
k = self._get_alias(k)
oldval = self._dict.get(k, None)
retval = self._dict.pop(k, *args)
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
self._dict[k] = oldval
except Exception, e:
print("das.types.Struct.pop: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
return retval
# Override of dict.popitem
def _popitem(self):
k, v = self._dict.popitem()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
self._dict[k] = v
except Exception, e:
print("das.types.Struct.popitem: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
return (k, v)
# Override of dict.clear
def _clear(self):
items = self._dict.items()
self._dict.clear()
try:
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
self._dict.update(items)
except Exception, e:
print("das.types.Struct.clear: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
# Override of dict.copy
def _copy(self):
return self._wrap(self)
# Override of dict.setdefault
def _setdefault(self, *args):
nargs = len(args)
if nargs > 2:
raise TypeError("_setdefault expected at most 2 arguments, got %d" % nargs)
if nargs >= 1:
self._check_reserved(args[0])
if nargs == 2:
args = (args[0], self._adapt_value(args[1], key=args[0]))
return self._dict.setdefault(*args)
# Override of dict.update
def _update(self, *args, **kwargs):
if len(args) > 1:
raise Exception("update expected at most 1 arguments, got %d" % len(args))
oldvals = self._dict.copy()
try:
if len(args) == 1:
a0 = args[0]
if hasattr(a0, "keys"):
for k in a0.keys():
k = self._get_alias(k)
self._check_reserved(k)
self._dict[k] = self._adapt_value(a0[k], key=k)
else:
for k, v in a0:
k = self._get_alias(k)
self._check_reserved(k)
self._dict[k] = self._adapt_value(v, key=k)
for k, v in kwargs.iteritems():
k = self._get_alias(k)
self._check_reserved(k)
self._dict[k] = self._adapt_value(v, key=k)
self._gvalidate()
except:
ec, ei, tb = sys.exc_info()
try:
self._dict.clear()
self._dict.update(oldvals)
except Exception, e:
print("das.types.Struct.update: Failed to recover struct data (%s)" % e)
raise ec, ei, tb
def _get_alias(self, k):
st = self._get_schema_type()
if st is not None and st.has_key(k):
aliasname = das.schematypes.Alias.Name(st[k])
if aliasname is not None:
# if isinstance(st[k], das.schematypes.Deprecated):
# message = ("[das] Field %s is deprecated, use %s instead" % (repr(k), repr(aliasname)))
# das.print_once(message)
return aliasname
return k
def _check_reserved(self, k):
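# A key is reserved outright when it would shadow a class attribute; if it
# merely collides with a dict method, expose that method under a '_'-prefixed
# alias and warn once instead of failing.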
if hasattr(self.__class__, k):
raise ReservedNameError(k)
elif hasattr(self._dict, k):
k2 = "_" + k
if hasattr(self, k2):
# no need to create a forwarding attribute (see __getattr__)
return
if k2 in self.__dict__:
if self.__dict__[k2] != getattr(self._dict, k):
raise ReservedNameError(k)
else:
msg = "[das] %s's '%s(...)' method conflicts with data field '%s', use '_%s(...)' to call it instead" % (type(self).__name__, k, k, k)
st = self._get_schema_type()
if st is not None:
n = das.get_schema_type_name(st)
if n:
msg = "[%s] %s" % (n, msg)
das.print_once(msg)
self.__dict__[k2] = getattr(self._dict, k)
def ordered_keys(self):
return filter(lambda x: x in self, self._get_schema_type().ordered_keys())
def _itervalues(self):
for v in self._dict.itervalues():
yield TypeBase.TransferGlobalValidator(self, v)
def _values(self):
return [x for x in self.itervalues()]
def _iteritems(self):
for k, v in self._dict.iteritems():
yield k, TypeBase.TransferGlobalValidator(self, v)
def _items(self):
return [x for x in self.iteritems()]
| 30.945585 | 146 | 0.563883 | 3,894 | 30,141 | 4.06831 | 0.073446 | 0.033834 | 0.024239 | 0.020831 | 0.645184 | 0.570698 | 0.546333 | 0.505807 | 0.464777 | 0.429365 | 0 | 0.00267 | 0.316612 | 30,141 | 973 | 147 | 30.97739 | 0.766434 | 0.05879 | 0 | 0.558603 | 0 | 0.001247 | 0.09278 | 0.032056 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.001247 | 0.003741 | null | null | 0.041147 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbe3e139f969d2b0c02202b763923425574d8d2e | 2,764 | py | Python | default.py | SimonPreissner/get-shifty | aff49220932921c77e419a34ca472b51e0b26b72 | [
"MIT"
] | null | null | null | default.py | SimonPreissner/get-shifty | aff49220932921c77e419a34ca472b51e0b26b72 | [
"MIT"
] | null | null | null | default.py | SimonPreissner/get-shifty | aff49220932921c77e419a34ca472b51e0b26b72 | [
"MIT"
] | null | null | null | """
This file contains meta information and default configurations of the project
"""
RSC_YEARS = [1660, 1670, 1680, 1690,
1700, 1710, 1720, 1730, 1740, 1750, 1760, 1770, 1780, 1790,
1800, 1810, 1820, 1830, 1840, 1850, 1860, 1870, 1880, 1890,
1900, 1910, 1920]
# cf. Chapter 4.4.1 of the thesis
SPACE_PAIR_SELECTION = [(1740,1750), (1750,1760),
(1680,1710), (1710,1740), (1740,1770), (1770,1800), (1800,1830), (1830,1860), (1860,1890),
(1700,1800), (1800,1900),
(1700,1900)]
COUPLING_CONFIG = { # Alternatives
# parameters passed to the GWOT object
'metric': "cosine", # 'euclidian',
'normalize_vecs': "both", # 'mean', 'whiten', 'whiten_zca'
'normalize_dists': "mean", # 'max', 'median'
'score_type': "coupling", # #TODO fill in the rest of the options in the comments
'adjust': None, # 'csls', ...
'distribs': "uniform", # 'custom', 'zipf'
'share_vocs':False, # True
'size':1000, # 100 is small, 1e4
'max_anchors':100, # used with small couplings (for projection)
# parameters to be passed to the optimizer
'opt_loss_fun': "square_loss", # 'kl_loss'
'opt_entropic': True, # False
'opt_entreg': 5e-4, # stay within the range of e-4 (originally: 1e-4)
'opt_tol': 1e-9, # no limits
'opt_round_g': False, # True
'opt_compute_accuracy': False, # True would require a test dict, but that's not implemented!
'opt_gpu': False, # GPU optimization not tested
# parameters for calling fit()
'fit_maxiter': 300, # no limits; normally converges within 150 iterations
'fit_tol': 1e-9, # no limits
'fit_plot_every': 100000, # normally 20; 'deactivate' the file spam by choosing a large value
'fit_print_every': 1, # no limits
'fit_verbose': True, # False
'fit_save_plots': None # "/my_dir/my_optimizer_plots"
}
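# Illustrative sketch only (not part of the original project): one way these
# defaults might be consumed. `GWOT` is a hypothetical stand-in for whatever
# coupling object the pipeline builds from this config.
#
#   for src_year, trg_year in SPACE_PAIR_SELECTION:
#       gwot = GWOT(src_year, trg_year, **COUPLING_CONFIG)
#       gwot.fit()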
DIST_SHAPES = ['uniform', 'zipf', 'custom']
SHIFT_EXPERIMENTS = ["all",
"unsup_bi",
"unsup_mono",
"dis_tech"] | 52.150943 | 119 | 0.458032 | 272 | 2,764 | 4.511029 | 0.625 | 0.02608 | 0.01793 | 0.01304 | 0.02282 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15557 | 0.434877 | 2,764 | 53 | 120 | 52.150943 | 0.629962 | 0.281476 | 0 | 0 | 0 | 0 | 0.164447 | 0 | 0 | 0 | 0 | 0.018868 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.027027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbe44f6e05680f0d1dad7aaee47f96f07f3de643 | 2,128 | py | Python | tests/python/metaclass_inheritance.py | gmgunter/pyre | e9ff3f8c04661f8b2cd2ba0caded08b6fe8054e2 | [
"BSD-3-Clause"
] | 25 | 2018-04-23T01:45:39.000Z | 2021-12-10T06:01:23.000Z | tests/python/metaclass_inheritance.py | gmgunter/pyre | e9ff3f8c04661f8b2cd2ba0caded08b6fe8054e2 | [
"BSD-3-Clause"
] | 53 | 2018-05-31T04:55:00.000Z | 2021-10-07T21:41:32.000Z | tests/python/metaclass_inheritance.py | gmgunter/pyre | e9ff3f8c04661f8b2cd2ba0caded08b6fe8054e2 | [
"BSD-3-Clause"
] | 12 | 2018-04-23T22:50:40.000Z | 2022-02-20T17:27:23.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# michael a.g. aïvázis
# orthologue
# (c) 1998-2021 all rights reserved
#
#
"""
When a metaclass understands the extra keywords that can be passed during class declaration,
it has to override all these to accommodate the change in signature
"""
class meta(type):
@classmethod
def __prepare__(metacls, name, bases, **kwds):
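# The extra class keywords arrive in **kwds; they are validated here and
# deliberately not forwarded to super().__prepare__(), which does not need them.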
assert metacls.__name__ == 'meta'
assert name in ['base', 'derived']
if name == 'base':
assert bases == (object,)
assert kwds == {'arg1': True, 'arg2': False}
if name == 'derived':
assert bases == (base,)
assert kwds == {'arg1': False, 'arg2': True}
return super().__prepare__(name, bases)
def __new__(metacls, name, bases, attributes, **kwds):
assert metacls.__name__ == 'meta'
assert name in ['base', 'derived']
if name == 'base':
assert bases == (object,)
assert kwds == {'arg1': True, 'arg2': False}
if name == 'derived':
assert bases == (base,)
assert kwds == {'arg1': False, 'arg2': True}
return super().__new__(metacls, name, bases, attributes)
def __init__(self, name, bases, attributes, **kwds):
assert self.__name__ in ['base', 'derived']
if self.__name__ == 'base':
assert bases == (object,)
assert kwds == {'arg1': True, 'arg2': False}
if self.__name__ == 'derived':
assert bases == (base,)
assert kwds == {'arg1': False, 'arg2': True}
super().__init__(name, bases, attributes)
return
class base(object, metaclass=meta, arg1=True, arg2=False):
def __init__(self, **kwds):
assert type(self).__name__ == 'base'
assert kwds == {}
return
class derived(base, arg1=False, arg2=True):
def __init__(self, **kwds):
assert type(self).__name__ == 'derived'
assert kwds == {}
return
def test():
b = base()
d = derived()
return
# main
if __name__ == "__main__":
test()
# end of file
| 25.035294 | 92 | 0.56156 | 239 | 2,128 | 4.715481 | 0.297071 | 0.070985 | 0.074534 | 0.060337 | 0.545697 | 0.451642 | 0.451642 | 0.451642 | 0.393079 | 0.393079 | 0 | 0.017264 | 0.292293 | 2,128 | 84 | 93 | 25.333333 | 0.731076 | 0.134868 | 0 | 0.583333 | 0 | 0 | 0.077303 | 0 | 0 | 0 | 0 | 0 | 0.4375 | 1 | 0.125 | false | 0 | 0 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbe7a0b13a437a6e05e68098ff2efe008a915ee9 | 862 | py | Python | bin/sort.py | pelavarre/pybashish | 03f74356fb0a2a0ef7106f09c059fd9b375ce89a | [
"CNRI-Python"
] | 4 | 2020-07-10T20:16:13.000Z | 2022-02-16T02:11:20.000Z | bin/sort.py | pelavarre/pybashish | 03f74356fb0a2a0ef7106f09c059fd9b375ce89a | [
"CNRI-Python"
] | null | null | null | bin/sort.py | pelavarre/pybashish | 03f74356fb0a2a0ef7106f09c059fd9b375ce89a | [
"CNRI-Python"
] | 2 | 2020-06-24T20:37:36.000Z | 2020-07-10T20:16:17.000Z | #!/usr/bin/env python3
"""
usage: sort.py [-h]
sort lines
options:
-h, --help show this help message and exit
quirks:
sorts tabs as different than spaces
sorts some spaces ending a line as different than none ending a line
examples:
Oh no! No examples disclosed!! 💥 💔 💥
"""
# FIXME: doc -k$N,$N and -n and maybe little else is worth learning
# FIXME: ass -k-1,-1 for negative field indexing
# FIXME: think into the mess at "sort" vs "LC_ALL=C sort"
import sys
import argdoc
def main():
args = argdoc.parse_args()
sys.stderr.write("{}\n".format(args))
sys.stderr.write("{}\n".format(argdoc.format_usage().rstrip()))
sys.stderr.write("sort.py: error: not implemented\n")
sys.exit(2) # exit 2 from rejecting usage
if __name__ == "__main__":
main()
# copied from: git clone https://github.com/pelavarre/pybashish.git
| 21.02439 | 70 | 0.678654 | 139 | 862 | 4.151079 | 0.618705 | 0.046794 | 0.07279 | 0.062392 | 0.086655 | 0.086655 | 0 | 0 | 0 | 0 | 0 | 0.007133 | 0.186775 | 862 | 40 | 71 | 21.55 | 0.811698 | 0.62761 | 0 | 0 | 0 | 0 | 0.159091 | 0 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0.1 | false | 0 | 0.2 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbeb68c7ee7ea08f9d92285ea9d761b3aba02878 | 5,115 | py | Python | experiments/db_test.py | mit-ll/CATAN | 7cc6f7e8af459c0f6bcf325f0754db1ba5b591ac | [
"BSD-3-Clause"
] | 15 | 2015-06-05T20:13:40.000Z | 2020-12-24T05:16:57.000Z | experiments/db_test.py | mit-ll/CATAN | 7cc6f7e8af459c0f6bcf325f0754db1ba5b591ac | [
"BSD-3-Clause"
] | 10 | 2016-03-04T23:05:56.000Z | 2016-05-18T18:14:13.000Z | experiments/db_test.py | mit-ll/CATAN | 7cc6f7e8af459c0f6bcf325f0754db1ba5b591ac | [
"BSD-3-Clause"
] | 6 | 2015-10-15T19:23:58.000Z | 2021-06-29T07:36:16.000Z | #!/usr/bin/env python
"""
@author Hongyi Hu
© 2015 Massachusetts Institute of Technology
"""
import argparse
import random
import catan.db
from catan.data import NodeMessage
# test data
STATUS_LIST = ['ok', 'injured', 'deceased']
# nodes
def gen_nodes(n, db, start_lat, stop_lat, start_long, stop_long):
assert n > 0
cmd = "INSERT INTO catan_nodes VALUES "
# generate n random nodes, centered around Cambridge
for i in range(n):
# random lat, long
lat = round(random.uniform(start_lat, stop_lat), 6)
lng = round(random.uniform(start_long, stop_long), 6)
# node_id, gps_lat, gps_long, gps_acc, path, timestamp
sql_cmd = cmd + "(%d, %.6f, %.6f, %.6f, %.6f, %.6f)" % (i, lat, lng, 0, 0, 0)
db._sql(sql_cmd)
# people
def gen_people(n, db, start_lat, stop_lat, start_long, stop_long):
"""
Generates n people with a random male/female split, aged between 5 and 90
"""
assert n > 0
# open male first names file
f = open('dist.male.first','r')
male_first_names = [name.strip().split()[0] for name in f.readlines()]
f.close()
# open female first names file
f = open('dist.female.first','r')
female_first_names = [name.strip().split()[0] for name in f.readlines()]
f.close()
# open last names file
f = open('dist.all.last','r')
family_names = [name.strip().split()[0] for name in f.readlines()]
f.close()
# generate people
for i in range(n):
catanDBObj = catan.db.CatanDatabaseObject()
# bio
sex = random.randint(0,1)
if sex == 0: # male
catanDBObj.person_bio.name_given = male_first_names[random.randint(0,len(male_first_names)-1)]
catanDBObj.person_bio.sex = 'male'
else: # female
catanDBObj.person_bio.name_given = female_first_names[random.randint(0,len(female_first_names)-1)]
catanDBObj.person_bio.sex = 'female'
catanDBObj.person_bio.name_family = family_names[random.randint(0,len(family_names)-1)]
catanDBObj.person_bio.age = random.randint(5,90)
# message (message, status, location, etc.)
# location
lat = round(random.uniform(start_lat, stop_lat), 6)
lng = round(random.uniform(start_long, stop_long), 6)
catanDBObj.person_message.person_message = 'Hi Mom'
catanDBObj.person_message.status_gps_latitude = lat
catanDBObj.person_message.status_gps_longitude = lng
catanDBObj.person_message.status_gps_accuracy = 0
# status
catanDBObj.person_message.status = STATUS_LIST[random.randint(0,len(STATUS_LIST)-1)]
catanDBObj.person_message.status_location = 'Test status location'
# generate a NodeMessage for the database
# it only cares about the data and source fields, so we can ignore other fields
nmsg = NodeMessage()
nmsg.source = random.randint(0,31) # random node 0-31
nmsg.data = catanDBObj.pack()
db.update_db(nmsg)
# Create some random updates
for i in range(1,n+1):
update = random.randint(0,1)
if update == 0:
catanDBObj = catan.db.CatanDatabaseObject()
catanDBObj.person_id = i
# location
lat = round(random.uniform(start_lat, stop_lat), 6)
lng = round(random.uniform(start_long, stop_long), 6)
catanDBObj.person_message.person_message = 'Location update 1'
catanDBObj.person_message.status_gps_latitude = lat
catanDBObj.person_message.status_gps_longitude = lng
catanDBObj.person_message.status_gps_accuracy = 0
n = NodeMessage()
n.source = random.randint(0,31)
n.data = catanDBObj.pack()
db.update_db(n)
def populate_db():
db = catan.db.CatanDatabase(0)
# insert some test nodes
# for cambridge
gen_nodes(32, db, 42.354823, 42.368315, -71.114484, -71.084422)
gen_people(100, db, 42.354823, 42.368315, -71.114484, -71.084422)
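# Join the bio, message and submitter tables so each printed row pairs a
# person's latest reported status with where and when it was submitted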
cmd = ('SELECT '
'db_person_bio.person_id, '
'db_person_bio.origin_node_id, '
'db_person_bio.name_family, '
'db_person_bio.name_given, '
'db_person_bio.age, '
'db_person_bio.sex, '
'db_person_messages.submission_id, '
'db_person_messages.origin_node_id, '
'db_person_messages.status_gps_latitude, '
'db_person_messages.status_gps_longitude, '
'db_person_messages.status_gps_accuracy, '
'db_person_messages.status, '
'db_person_messages.status_location, '
'db_submitter_info.timestamp '
'FROM db_person_bio '
'LEFT JOIN db_person_messages ON db_person_messages.person_id = db_person_bio.person_id '
'LEFT JOIN db_submitter_info ON db_submitter_info.submission_id = db_person_messages.submission_id')
for r in db._sql(cmd).fetchall():
print r
def main(args):
pass
if __name__=='__main__':
populate_db()
| 31.189024 | 111 | 0.634018 | 679 | 5,115 | 4.543446 | 0.223859 | 0.046677 | 0.074554 | 0.075203 | 0.52577 | 0.351702 | 0.301135 | 0.279741 | 0.279741 | 0.257699 | 0 | 0.033483 | 0.258456 | 5,115 | 163 | 112 | 31.380368 | 0.779594 | 0.105376 | 0 | 0.233333 | 0 | 0 | 0.188568 | 0.11182 | 0 | 0 | 0 | 0 | 0.022222 | 0 | null | null | 0.011111 | 0.044444 | null | null | 0.011111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbec13a8be9b82963156b2e9e29130d14a7c09eb | 975 | py | Python | tests/formatters/fseventsd.py | SamuelePilleri/plaso | f5687f12a89c7309797ccc285da78e855c120579 | [
"Apache-2.0"
] | null | null | null | tests/formatters/fseventsd.py | SamuelePilleri/plaso | f5687f12a89c7309797ccc285da78e855c120579 | [
"Apache-2.0"
] | null | null | null | tests/formatters/fseventsd.py | SamuelePilleri/plaso | f5687f12a89c7309797ccc285da78e855c120579 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for the fseventsd record event formatter."""
from __future__ import unicode_literals
import unittest
from plaso.formatters import fseventsd
from tests.formatters import test_lib
class FseventsdFormatterTest(test_lib.EventFormatterTestCase):
"""Tests for the fseventsd record event formatter."""
def testInitialization(self):
"""Tests the initialization."""
event_formatter = fseventsd.FSEventsdEventFormatter()
self.assertIsNotNone(event_formatter)
def testGetFormatStringAttributeNames(self):
"""Tests the GetFormatStringAttributeNames function."""
event_formatter = fseventsd.FSEventsdEventFormatter()
expected_attribute_names = [
u'event_identifier', u'flag_values', u'hex_flags', u'path']
self._TestGetFormatStringAttributeNames(
event_formatter, expected_attribute_names)
# TODO: add test for GetSources.
if __name__ == '__main__':
unittest.main()
| 26.351351 | 67 | 0.756923 | 100 | 975 | 7.11 | 0.52 | 0.118143 | 0.030942 | 0.056259 | 0.112518 | 0.112518 | 0.112518 | 0 | 0 | 0 | 0 | 0.001202 | 0.146667 | 975 | 36 | 68 | 27.083333 | 0.853365 | 0.251282 | 0 | 0.125 | 0 | 0 | 0.067893 | 0 | 0 | 0 | 0 | 0.027778 | 0.0625 | 1 | 0.125 | false | 0 | 0.25 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbed5bb534715e304b67cd5a82e6d1e8cca605fa | 1,693 | py | Python | categories/migrations/0001_initial.py | snoop2head/exercise_curation_django | ba35bd32d8bc203d318cb8b6e0a1722f3aa26eda | [
"MIT"
] | 3 | 2020-09-30T04:44:39.000Z | 2021-07-30T08:20:18.000Z | categories/migrations/0001_initial.py | snoop2head/exercise_curation_django | ba35bd32d8bc203d318cb8b6e0a1722f3aa26eda | [
"MIT"
] | 7 | 2021-03-30T13:09:55.000Z | 2022-01-13T02:33:34.000Z | categories/migrations/0001_initial.py | snoop2head/exercise_curation_django | ba35bd32d8bc203d318cb8b6e0a1722f3aa26eda | [
"MIT"
] | 1 | 2022-03-31T12:01:38.000Z | 2022-03-31T12:01:38.000Z | # Generated by Django 3.0.3 on 2020-03-24 09:59
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('exercises', '0018_photo_file'),
]
operations = [
migrations.CreateModel(
name='Category',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('name', models.CharField(max_length=80)),
('description', models.TextField(blank=True)),
('exercises', models.ForeignKey(blank=True, on_delete=django.db.models.deletion.CASCADE, related_name='categories', to='exercises.Exercise')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Photo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('image_url', models.URLField()),
('image_caption', models.CharField(blank=True, max_length=80)),
('category', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='photos', to='categories.Category')),
],
options={
'abstract': False,
},
),
]
| 37.622222 | 158 | 0.569403 | 163 | 1,693 | 5.779141 | 0.398773 | 0.03397 | 0.097665 | 0.110403 | 0.433121 | 0.433121 | 0.433121 | 0.433121 | 0.433121 | 0.33121 | 0 | 0.019183 | 0.29179 | 1,693 | 44 | 159 | 38.477273 | 0.766472 | 0.02658 | 0 | 0.486486 | 1 | 0 | 0.119077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.054054 | 0 | 0.162162 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbef5ddea825a12fdea28a38b148d831f47bd566 | 1,446 | py | Python | python_modules/lakehouse/lakehouse/snowflake_table.py | vatervonacht/dagster | 595d78c883ef20618052ac1575fe46cde51fd541 | [
"Apache-2.0"
] | 3 | 2020-04-28T16:27:33.000Z | 2020-07-22T07:43:30.000Z | python_modules/lakehouse/lakehouse/snowflake_table.py | vatervonacht/dagster | 595d78c883ef20618052ac1575fe46cde51fd541 | [
"Apache-2.0"
] | 2 | 2021-05-11T13:36:27.000Z | 2021-09-03T01:53:11.000Z | python_modules/lakehouse/lakehouse/snowflake_table.py | vatervonacht/dagster | 595d78c883ef20618052ac1575fe46cde51fd541 | [
"Apache-2.0"
] | 1 | 2021-02-21T12:16:47.000Z | 2021-02-21T12:16:47.000Z | from dagster import check
from .house import Lakehouse
from .table import create_lakehouse_table_def
class SnowflakeLakehouse(Lakehouse):
def __init__(self):
pass
def hydrate(self, _context, _table_type, _table_metadata, table_handle, _dest_metadata):
return None
def materialize(self, context, table_type, table_metadata, value):
return None, None
def snowflake_table(
name=None,
input_tables=None,
other_input_defs=None,
tags=None,
required_resource_keys=None,
description=None,
):
tags = check.opt_dict_param(tags, 'tags')
tags['lakehouse_type'] = 'snowflake_table'
tags['kind'] = 'snowflake'
required_resource_keys = check.opt_set_param(required_resource_keys, 'required_resource_keys')
required_resource_keys.add('snowflake')
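# Support both bare use (@snowflake_table, where `name` is actually the
# decorated function) and parameterized use (@snowflake_table(name=..., ...))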
if callable(name):
fn = name
return create_lakehouse_table_def(
name=fn.__name__,
lakehouse_fn=fn,
input_tables=[],
required_resource_keys=required_resource_keys,
)
def _wrap(fn):
return create_lakehouse_table_def(
name=name if name is not None else fn.__name__,
lakehouse_fn=fn,
input_tables=input_tables,
other_input_defs=other_input_defs,
tags=tags,
description=description,
required_resource_keys=required_resource_keys,
)
return _wrap
| 26.777778 | 98 | 0.670816 | 168 | 1,446 | 5.357143 | 0.285714 | 0.16 | 0.2 | 0.124444 | 0.368889 | 0.368889 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0.253804 | 1,446 | 53 | 99 | 27.283019 | 0.834106 | 0 | 0 | 0.142857 | 0 | 0 | 0.05325 | 0.015214 | 0 | 0 | 0 | 0 | 0 | 1 | 0.119048 | false | 0.02381 | 0.071429 | 0.071429 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbf19789118428ff5f8d3aa59b32b64fa444b8b7 | 984 | py | Python | agent_based_models/abm_allelopathy/plot_data.py | mattsmart/biomodels | 237f87489553fa1ebf5c676fab563166dd0c39e9 | [
"MIT"
] | null | null | null | agent_based_models/abm_allelopathy/plot_data.py | mattsmart/biomodels | 237f87489553fa1ebf5c676fab563166dd0c39e9 | [
"MIT"
] | null | null | null | agent_based_models/abm_allelopathy/plot_data.py | mattsmart/biomodels | 237f87489553fa1ebf5c676fab563166dd0c39e9 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import os
def data_plotter(lattice_dict, datafile_dir, plot_dir):
# total spaces on grid implies grid size
total_cells = lattice_dict['E'][0] + lattice_dict['D_a'][0] + lattice_dict['D_b'][0] + lattice_dict['B'][0]
n = int(total_cells**0.5)
plt.figure(1)
plt.plot(lattice_dict['time'], lattice_dict['E'], label='Empty lattice points')
plt.plot(lattice_dict['time'], lattice_dict['D_a'], label='Donors (Type A)')
plt.plot(lattice_dict['time'], lattice_dict['D_b'], label='Donors (Type B)')
plt.plot(lattice_dict['time'], lattice_dict['B'], label='Debris')
ax = plt.gca()
ax.set_title('Cell Populations over time (n = %d)' % n)
ax.set_ylabel('Number of cells')
ax.set_xlabel('Time (h)')
plt.legend()
f = plt.gcf()
f.set_size_inches(20.0, 8.0) # alternative: 20.0, 8.0
f.tight_layout()
plt.savefig(os.path.join(plot_dir, 'population_vs_time.png'))
plt.clf()
return
| 30.75 | 111 | 0.650407 | 159 | 984 | 3.830189 | 0.421384 | 0.234811 | 0.078818 | 0.118227 | 0.220033 | 0.220033 | 0.220033 | 0.111658 | 0 | 0 | 0 | 0.02091 | 0.17378 | 984 | 31 | 112 | 31.741935 | 0.728167 | 0.061992 | 0 | 0 | 0 | 0 | 0.182609 | 0.023913 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.095238 | 0 | 0.190476 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbf3d541561ba11217ad33d7f2e880d8ae1b4729 | 1,567 | py | Python | FOR/Analisador-completo/main.py | lucasf5/Python | c5649121e2af42922e2d9c19cec98322e132bdab | [
"MIT"
] | 1 | 2021-09-28T13:11:56.000Z | 2021-09-28T13:11:56.000Z | FOR/Analisador-completo/main.py | lucasf5/Python | c5649121e2af42922e2d9c19cec98322e132bdab | [
"MIT"
] | null | null | null | FOR/Analisador-completo/main.py | lucasf5/Python | c5649121e2af42922e2d9c19cec98322e132bdab | [
"MIT"
] | null | null | null | # Exercício Python 56: Desenvolva um programa que leia o nome, idade e sexo de 4 pessoas. No final do programa, mostre: a média de idade do grupo, qual é o nome do homem mais velho e quantas mulheres têm menos de 20 anos.
mediaidade = ''
nomelista = []
idadelista = []
sexolista = []
homens = []
mulherescommenosde20 = 0
nomedelas = []
# -------------------------------------------------------------------
for i in range(1,5):
print(f'{i} PESSOA')
nome = (input('Seu nome: '))
idade = int(input('Sua idade: '))
sexo = int(input('Sexo? [0]Masculino [1]Feminino: '))
if sexo == 1 and idade < 20:
nomedelas.append(nome)
mulherescommenosde20 += 1
elif sexo == 0:
homens.append(nome)
# Adcionei todas idades em uma lista
idadelista.append(idade)
# Tirei a média dessas idades //Primeira parte
mediaidade = ((sum(idadelista))/4)
# Adcionei todos os nomes em uma lista
nomelista.append(nome)
# -------------------------------------------------------------------
# Armazenei em maximo o maior valor encontrado dentro de uma lista
maximo = max(idadelista)
# Armazenei em idadexidade o INDEX do maior valor
indexidade = idadelista.index(maximo)
# Armazenei em indexnome a posição de quem tem a maior idade
indexnome = nomelista[indexidade]
# -------------------------------------------------------------------
print(f'A media das idades é: {mediaidade}')
print(f'A pessoa que tem a maior idade, com {maximo} é essa: {indexnome}')
print(f'As mulheres que possuem menos de 20 anos: {mulherescommenosde20} e são: {nomedelas}')
| 27.982143 | 221 | 0.612636 | 201 | 1,567 | 4.776119 | 0.477612 | 0.025 | 0.01875 | 0.027083 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018405 | 0.167837 | 1,567 | 55 | 222 | 28.490909 | 0.717791 | 0.454371 | 0 | 0 | 0 | 0 | 0.2891 | 0.026066 | 0 | 0 | 0 | 0.018182 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbf65821c57bcbfbc7d857dacc3de5d4175d7481 | 1,101 | py | Python | liststations.py | CrookedY/AirPollutionBot | ce79037d6dddd1f297fce04a694b49f8b9a1bfad | [
"Apache-2.0"
] | 1 | 2018-08-10T14:06:07.000Z | 2018-08-10T14:06:07.000Z | liststations.py | CrookedY/AirPollutionBot | ce79037d6dddd1f297fce04a694b49f8b9a1bfad | [
"Apache-2.0"
] | 2 | 2017-08-09T11:24:31.000Z | 2018-03-01T22:50:04.000Z | liststations.py | CrookedY/AirPollutionBot | ce79037d6dddd1f297fce04a694b49f8b9a1bfad | [
"Apache-2.0"
] | null | null | null | from urllib2 import Request, urlopen, URLError
import json
request = Request('https://uk-air.defra.gov.uk/sos-ukair/api/v1/stations/')
try:
response = urlopen(request)
data = response.read()
except URLError, e:
print 'error:', e
stations = json.loads(data)
# extract the station record at index 7
stations2 = stations[7]
properties = stations2[u'properties']
# extract the ID so it can be used in the link
ID = properties[u'id']
#print ID
url = ('https://uk-air.defra.gov.uk/sos-ukair/api/v1/stations/'+str(ID))
request2 = Request(url)
try:
response = urlopen(request2)
data2 = response.read()
except URLError, e:
print 'error:', e
# contains station properties data; we need to get to the timeseries ID
station_prop = data2
station_prop_json = json.loads(station_prop)
# the timeseries ID is a key in the dictionary, so extract it via keys()
a = station_prop_json[u'properties'][u'timeseries'].keys()
i=a[0]
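# request the measured data for the first timeseries of this station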
url2 =('https://uk-air.defra.gov.uk/sos-ukair/api/v1/timeseries/'+str(i) +'/getData')
request3 = Request(url2)
try:
response = urlopen(request3)
data3 = response.read()
except URLError, e:
print 'error:', e
print data3
| 23.934783 | 85 | 0.719346 | 171 | 1,101 | 4.596491 | 0.374269 | 0.030534 | 0.038168 | 0.057252 | 0.291349 | 0.291349 | 0.291349 | 0.291349 | 0.14631 | 0.14631 | 0 | 0.020021 | 0.138056 | 1,101 | 45 | 86 | 24.466667 | 0.808219 | 0.160763 | 0 | 0.290323 | 0 | 0.096774 | 0.241567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.064516 | null | null | 0.129032 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbf810a25b7c035adf73121054a304443a683fb0 | 748 | py | Python | core/migrations/0002_auto_20180702_1913.py | mertyildiran/echo | 805db64e3fa9d31fd3c24390fac2e9bf7c91ad57 | [
"Apache-2.0"
] | 5 | 2018-07-26T22:48:00.000Z | 2021-05-02T01:59:51.000Z | core/migrations/0002_auto_20180702_1913.py | mertyildiran/echo | 805db64e3fa9d31fd3c24390fac2e9bf7c91ad57 | [
"Apache-2.0"
] | null | null | null | core/migrations/0002_auto_20180702_1913.py | mertyildiran/echo | 805db64e3fa9d31fd3c24390fac2e9bf7c91ad57 | [
"Apache-2.0"
] | 1 | 2018-08-04T14:07:53.000Z | 2018-08-04T14:07:53.000Z | # Generated by Django 2.0.6 on 2018-07-02 19:13
import core.models
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='echo',
old_name='owner',
new_name='user',
),
migrations.AlterField(
model_name='echo',
name='audio',
field=models.FileField(upload_to=core.models.echo_directory),
),
migrations.AlterField(
model_name='profile',
name='picture',
field=models.FileField(blank=True, null=True, upload_to=core.models.profile_directory),
),
]
| 24.933333 | 99 | 0.57754 | 77 | 748 | 5.480519 | 0.571429 | 0.07109 | 0.061611 | 0.137441 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036609 | 0.30615 | 748 | 29 | 100 | 25.793103 | 0.776493 | 0.06016 | 0 | 0.304348 | 1 | 0 | 0.07418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.086957 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbfb34040aab0e7552b68ecebacb85f0a1f7a601 | 211 | py | Python | Chapter09/calc.py | LuisPereda/Learning_Python | e89e69346c5584be10d991010f39b59329793ba5 | [
"MIT"
] | null | null | null | Chapter09/calc.py | LuisPereda/Learning_Python | e89e69346c5584be10d991010f39b59329793ba5 | [
"MIT"
] | null | null | null | Chapter09/calc.py | LuisPereda/Learning_Python | e89e69346c5584be10d991010f39b59329793ba5 | [
"MIT"
] | null | null | null |
def sum1(a,b):
try:
c = a+b
return c
except :
print "Error in sum1 function"
def divide(a,b):
try:
c = a/b
return c
except :
print "Error in divide function"
print divide(10,0)
print sum1(10,0) | 13.1875 | 34 | 0.635071 | 40 | 211 | 3.35 | 0.375 | 0.059701 | 0.074627 | 0.089552 | 0.492537 | 0.492537 | 0.492537 | 0.492537 | 0.492537 | 0.492537 | 0 | 0.05625 | 0.241706 | 211 | 16 | 35 | 13.1875 | 0.78125 | 0 | 0 | 0.428571 | 0 | 0 | 0.218009 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
dbfdb987e6de76d1f36bf0f8ce7f9d972b1cbaed | 7,103 | py | Python | venv/Lib/site-packages/CoolProp/constants.py | kubakoziczak/gasSteamPowerPlant | e6c036cc66ee2ff0b3f2fc923d0991bf57295d61 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/CoolProp/constants.py | kubakoziczak/gasSteamPowerPlant | e6c036cc66ee2ff0b3f2fc923d0991bf57295d61 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/CoolProp/constants.py | kubakoziczak/gasSteamPowerPlant | e6c036cc66ee2ff0b3f2fc923d0991bf57295d61 | [
"MIT"
] | null | null | null | # This file is automatically generated by the generate_constants_module.py script in wrappers/Python.
# DO NOT MODIFY THE CONTENTS OF THIS FILE!
from __future__ import absolute_import
from . import _constants
INVALID_PARAMETER = _constants.INVALID_PARAMETER
igas_constant = _constants.igas_constant
imolar_mass = _constants.imolar_mass
iacentric_factor = _constants.iacentric_factor
irhomolar_reducing = _constants.irhomolar_reducing
irhomolar_critical = _constants.irhomolar_critical
iT_reducing = _constants.iT_reducing
iT_critical = _constants.iT_critical
irhomass_reducing = _constants.irhomass_reducing
irhomass_critical = _constants.irhomass_critical
iP_critical = _constants.iP_critical
iP_reducing = _constants.iP_reducing
iT_triple = _constants.iT_triple
iP_triple = _constants.iP_triple
iT_min = _constants.iT_min
iT_max = _constants.iT_max
iP_max = _constants.iP_max
iP_min = _constants.iP_min
idipole_moment = _constants.idipole_moment
iT = _constants.iT
iP = _constants.iP
iQ = _constants.iQ
iTau = _constants.iTau
iDelta = _constants.iDelta
iDmolar = _constants.iDmolar
iHmolar = _constants.iHmolar
iSmolar = _constants.iSmolar
iCpmolar = _constants.iCpmolar
iCp0molar = _constants.iCp0molar
iCvmolar = _constants.iCvmolar
iUmolar = _constants.iUmolar
iGmolar = _constants.iGmolar
iHelmholtzmolar = _constants.iHelmholtzmolar
iSmolar_residual = _constants.iSmolar_residual
iDmass = _constants.iDmass
iHmass = _constants.iHmass
iSmass = _constants.iSmass
iCpmass = _constants.iCpmass
iCp0mass = _constants.iCp0mass
iCvmass = _constants.iCvmass
iUmass = _constants.iUmass
iGmass = _constants.iGmass
iHelmholtzmass = _constants.iHelmholtzmass
iviscosity = _constants.iviscosity
iconductivity = _constants.iconductivity
isurface_tension = _constants.isurface_tension
iPrandtl = _constants.iPrandtl
ispeed_sound = _constants.ispeed_sound
iisothermal_compressibility = _constants.iisothermal_compressibility
iisobaric_expansion_coefficient = _constants.iisobaric_expansion_coefficient
ifundamental_derivative_of_gas_dynamics = _constants.ifundamental_derivative_of_gas_dynamics
ialphar = _constants.ialphar
idalphar_dtau_constdelta = _constants.idalphar_dtau_constdelta
idalphar_ddelta_consttau = _constants.idalphar_ddelta_consttau
ialpha0 = _constants.ialpha0
idalpha0_dtau_constdelta = _constants.idalpha0_dtau_constdelta
idalpha0_ddelta_consttau = _constants.idalpha0_ddelta_consttau
iBvirial = _constants.iBvirial
iCvirial = _constants.iCvirial
idBvirial_dT = _constants.idBvirial_dT
idCvirial_dT = _constants.idCvirial_dT
iZ = _constants.iZ
iPIP = _constants.iPIP
ifraction_min = _constants.ifraction_min
ifraction_max = _constants.ifraction_max
iT_freeze = _constants.iT_freeze
iGWP20 = _constants.iGWP20
iGWP100 = _constants.iGWP100
iGWP500 = _constants.iGWP500
iFH = _constants.iFH
iHH = _constants.iHH
iPH = _constants.iPH
iODP = _constants.iODP
iPhase = _constants.iPhase
iundefined_parameter = _constants.iundefined_parameter
INPUT_PAIR_INVALID = _constants.INPUT_PAIR_INVALID
QT_INPUTS = _constants.QT_INPUTS
PQ_INPUTS = _constants.PQ_INPUTS
QSmolar_INPUTS = _constants.QSmolar_INPUTS
QSmass_INPUTS = _constants.QSmass_INPUTS
HmolarQ_INPUTS = _constants.HmolarQ_INPUTS
HmassQ_INPUTS = _constants.HmassQ_INPUTS
DmolarQ_INPUTS = _constants.DmolarQ_INPUTS
DmassQ_INPUTS = _constants.DmassQ_INPUTS
PT_INPUTS = _constants.PT_INPUTS
DmassT_INPUTS = _constants.DmassT_INPUTS
DmolarT_INPUTS = _constants.DmolarT_INPUTS
HmolarT_INPUTS = _constants.HmolarT_INPUTS
HmassT_INPUTS = _constants.HmassT_INPUTS
SmolarT_INPUTS = _constants.SmolarT_INPUTS
SmassT_INPUTS = _constants.SmassT_INPUTS
TUmolar_INPUTS = _constants.TUmolar_INPUTS
TUmass_INPUTS = _constants.TUmass_INPUTS
DmassP_INPUTS = _constants.DmassP_INPUTS
DmolarP_INPUTS = _constants.DmolarP_INPUTS
HmassP_INPUTS = _constants.HmassP_INPUTS
HmolarP_INPUTS = _constants.HmolarP_INPUTS
PSmass_INPUTS = _constants.PSmass_INPUTS
PSmolar_INPUTS = _constants.PSmolar_INPUTS
PUmass_INPUTS = _constants.PUmass_INPUTS
PUmolar_INPUTS = _constants.PUmolar_INPUTS
HmassSmass_INPUTS = _constants.HmassSmass_INPUTS
HmolarSmolar_INPUTS = _constants.HmolarSmolar_INPUTS
SmassUmass_INPUTS = _constants.SmassUmass_INPUTS
SmolarUmolar_INPUTS = _constants.SmolarUmolar_INPUTS
DmassHmass_INPUTS = _constants.DmassHmass_INPUTS
DmolarHmolar_INPUTS = _constants.DmolarHmolar_INPUTS
DmassSmass_INPUTS = _constants.DmassSmass_INPUTS
DmolarSmolar_INPUTS = _constants.DmolarSmolar_INPUTS
DmassUmass_INPUTS = _constants.DmassUmass_INPUTS
DmolarUmolar_INPUTS = _constants.DmolarUmolar_INPUTS
FLUID_TYPE_PURE = _constants.FLUID_TYPE_PURE
FLUID_TYPE_PSEUDOPURE = _constants.FLUID_TYPE_PSEUDOPURE
FLUID_TYPE_REFPROP = _constants.FLUID_TYPE_REFPROP
FLUID_TYPE_INCOMPRESSIBLE_LIQUID = _constants.FLUID_TYPE_INCOMPRESSIBLE_LIQUID
FLUID_TYPE_INCOMPRESSIBLE_SOLUTION = _constants.FLUID_TYPE_INCOMPRESSIBLE_SOLUTION
FLUID_TYPE_UNDEFINED = _constants.FLUID_TYPE_UNDEFINED
iphase_liquid = _constants.iphase_liquid
iphase_supercritical = _constants.iphase_supercritical
iphase_supercritical_gas = _constants.iphase_supercritical_gas
iphase_supercritical_liquid = _constants.iphase_supercritical_liquid
iphase_critical_point = _constants.iphase_critical_point
iphase_gas = _constants.iphase_gas
iphase_twophase = _constants.iphase_twophase
iphase_unknown = _constants.iphase_unknown
iphase_not_imposed = _constants.iphase_not_imposed
NORMALIZE_GAS_CONSTANTS = _constants.NORMALIZE_GAS_CONSTANTS
CRITICAL_WITHIN_1UK = _constants.CRITICAL_WITHIN_1UK
CRITICAL_SPLINES_ENABLED = _constants.CRITICAL_SPLINES_ENABLED
SAVE_RAW_TABLES = _constants.SAVE_RAW_TABLES
ALTERNATIVE_TABLES_DIRECTORY = _constants.ALTERNATIVE_TABLES_DIRECTORY
ALTERNATIVE_REFPROP_PATH = _constants.ALTERNATIVE_REFPROP_PATH
ALTERNATIVE_REFPROP_HMX_BNC_PATH = _constants.ALTERNATIVE_REFPROP_HMX_BNC_PATH
ALTERNATIVE_REFPROP_LIBRARY_PATH = _constants.ALTERNATIVE_REFPROP_LIBRARY_PATH
REFPROP_DONT_ESTIMATE_INTERACTION_PARAMETERS = _constants.REFPROP_DONT_ESTIMATE_INTERACTION_PARAMETERS
REFPROP_IGNORE_ERROR_ESTIMATED_INTERACTION_PARAMETERS = _constants.REFPROP_IGNORE_ERROR_ESTIMATED_INTERACTION_PARAMETERS
REFPROP_USE_GERG = _constants.REFPROP_USE_GERG
REFPROP_USE_PENGROBINSON = _constants.REFPROP_USE_PENGROBINSON
MAXIMUM_TABLE_DIRECTORY_SIZE_IN_GB = _constants.MAXIMUM_TABLE_DIRECTORY_SIZE_IN_GB
DONT_CHECK_PROPERTY_LIMITS = _constants.DONT_CHECK_PROPERTY_LIMITS
HENRYS_LAW_TO_GENERATE_VLE_GUESSES = _constants.HENRYS_LAW_TO_GENERATE_VLE_GUESSES
PHASE_ENVELOPE_STARTING_PRESSURE_PA = _constants.PHASE_ENVELOPE_STARTING_PRESSURE_PA
R_U_CODATA = _constants.R_U_CODATA
VTPR_UNIFAC_PATH = _constants.VTPR_UNIFAC_PATH
SPINODAL_MINIMUM_DELTA = _constants.SPINODAL_MINIMUM_DELTA
OVERWRITE_FLUIDS = _constants.OVERWRITE_FLUIDS
OVERWRITE_DEPARTURE_FUNCTION = _constants.OVERWRITE_DEPARTURE_FUNCTION
OVERWRITE_BINARY_INTERACTION = _constants.OVERWRITE_BINARY_INTERACTION
USE_GUESSES_IN_PROPSSI = _constants.USE_GUESSES_IN_PROPSSI
ASSUME_CRITICAL_POINT_STABLE = _constants.ASSUME_CRITICAL_POINT_STABLE
VTPR_ALWAYS_RELOAD_LIBRARY = _constants.VTPR_ALWAYS_RELOAD_LIBRARY
FLOAT_PUNCTUATION = _constants.FLOAT_PUNCTUATION
| 44.672956 | 120 | 0.887653 | 841 | 7,103 | 6.88585 | 0.261593 | 0.090658 | 0.01865 | 0.016059 | 0.082887 | 0.036609 | 0 | 0 | 0 | 0 | 0 | 0.004232 | 0.068563 | 7,103 | 158 | 121 | 44.955696 | 0.87107 | 0.01971 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012987 | 0 | 0.012987 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e0023b6272774adf06f1384bdb4cb510043c4a82 | 224 | py | Python | task/w2/trenirovka/12-rivnist 2.py | beregok/pythontask | 50394ff2b52ab4f3273ec9ddc4b504d1f7b3159e | [
"MIT"
] | 1 | 2019-09-29T14:19:54.000Z | 2019-09-29T14:19:54.000Z | task/w2/trenirovka/12-rivnist 2.py | beregok/pythontask | 50394ff2b52ab4f3273ec9ddc4b504d1f7b3159e | [
"MIT"
] | null | null | null | task/w2/trenirovka/12-rivnist 2.py | beregok/pythontask | 50394ff2b52ab4f3273ec9ddc4b504d1f7b3159e | [
"MIT"
] | null | null | null | a = int(input())
b = int(input())
c = int(input())
d = int(input())
if a == 0 and b == 0:
print("INF")
else:
if (d - b * c / a) != 0 and (- b / a) == (- b // a):
print(- b // a)
else:
print("NO")
| 18.666667 | 56 | 0.397321 | 38 | 224 | 2.342105 | 0.342105 | 0.359551 | 0.11236 | 0.134831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02027 | 0.339286 | 224 | 11 | 57 | 20.363636 | 0.581081 | 0 | 0 | 0.181818 | 0 | 0 | 0.022321 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.272727 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e01a5f16e11613ae6cff496ae606faff7b1d0e27 | 460 | py | Python | home/push/mipush/APIError.py | he0119/smart-home | bdd3a59a8c46c0fdc07ac3049bf589c7f95a2683 | [
"MIT"
] | null | null | null | home/push/mipush/APIError.py | he0119/smart-home | bdd3a59a8c46c0fdc07ac3049bf589c7f95a2683 | [
"MIT"
] | 223 | 2020-02-21T06:16:56.000Z | 2022-03-01T22:24:19.000Z | home/push/mipush/APIError.py | he0119/smart-home | bdd3a59a8c46c0fdc07ac3049bf589c7f95a2683 | [
"MIT"
] | null | null | null | class APIError(Exception):
"""
raise APIError if receiving json message indicating failure.
"""
def __init__(self, error_code, error, request):
self.error_code = error_code
self.error = error
self.request = request
Exception.__init__(self, error)
def __str__(self):
return "APIError: %s: %s, request: %s" % (
self.error_code,
self.error,
self.request,
)
| 25.555556 | 64 | 0.576087 | 49 | 460 | 5.081633 | 0.387755 | 0.216867 | 0.156627 | 0.144578 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.321739 | 460 | 17 | 65 | 27.058824 | 0.798077 | 0.130435 | 0 | 0 | 0 | 0 | 0.075521 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0.083333 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e01cbf8a1a1ab981a1d993596c3a332451dcd74d | 367 | py | Python | pythonlibs/mantis/templates/webapp/src/webapp/base.py | adoggie/Tibet.6 | 3c53060edafd80b9c4dafa10699a68d86a410c66 | [
"MIT"
] | 22 | 2019-10-28T07:28:12.000Z | 2022-03-19T15:36:41.000Z | AliceBackend/src/AliceBackend/base.py | adoggie/Tibet.6 | 3c53060edafd80b9c4dafa10699a68d86a410c66 | [
"MIT"
] | 1 | 2019-11-07T04:54:14.000Z | 2019-11-07T07:12:48.000Z | AliceBackend/src/AliceBackend/base.py | adoggie/Tibet.6 | 3c53060edafd80b9c4dafa10699a68d86a410c66 | [
"MIT"
] | 13 | 2019-10-28T07:29:07.000Z | 2021-11-03T06:53:12.000Z | #coding:utf-8
class SystemDeviceType(object):
InnerBox = 1 # indoor main unit (display detached)
InnerScreen = 2 # indoor display (detached from the main unit)
OuterBox = 3 # outdoor unit
PropCallApp = 4 # property-management duty app
PropSentryApp = 5 # property-management sentry-booth terminal
Others = 10
ValidatedList = (1,2,3,4,5)
class Constants(object):
SUPER_ACCESS_TOKEN = 'YTU3NzVlYjktYjQwMi00MGY2LTkxZjktYWMxYjIxZjM4NjNlCg ==' | 24.466667 | 80 | 0.643052 | 36 | 367 | 6.5 | 0.805556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067164 | 0.269755 | 367 | 15 | 80 | 24.466667 | 0.80597 | 0.125341 | 0 | 0 | 0 | 0 | 0.167722 | 0.158228 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
e0201884251a727105b3a8b3946ca3bc3aefd73d | 480 | py | Python | devito/passes/iet/languages/C.py | guaacoelho/devito | 7e0b873114675752c4a49ed9076ee5d52997833c | [
"MIT"
] | 199 | 2016-08-18T23:33:05.000Z | 2019-12-24T07:08:48.000Z | devito/passes/iet/languages/C.py | guaacoelho/devito | 7e0b873114675752c4a49ed9076ee5d52997833c | [
"MIT"
] | 949 | 2016-04-25T11:41:34.000Z | 2019-12-27T10:43:40.000Z | devito/passes/iet/languages/C.py | guaacoelho/devito | 7e0b873114675752c4a49ed9076ee5d52997833c | [
"MIT"
] | 78 | 2016-08-30T07:42:34.000Z | 2019-12-13T20:34:45.000Z | from devito.ir import Call
from devito.passes.iet.definitions import DataManager
from devito.passes.iet.langbase import LangBB
__all__ = ['CBB', 'CDataManager']
class CBB(LangBB):
mapper = {
'aligned': lambda i:
'__attribute__((aligned(%d)))' % i,
'host-alloc': lambda i, j, k:
Call('posix_memalign', (i, j, k)),
'host-free': lambda i:
Call('free', (i,)),
}
class CDataManager(DataManager):
lang = CBB
| 21.818182 | 53 | 0.591667 | 57 | 480 | 4.824561 | 0.508772 | 0.109091 | 0.116364 | 0.138182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.260417 | 480 | 21 | 54 | 22.857143 | 0.774648 | 0 | 0 | 0 | 0 | 0 | 0.18125 | 0.058333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.133333 | 0.2 | 0 | 0.466667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e0215d4c222f248ad7105000615a748c88340354 | 2,026 | py | Python | tests/_test_image.py | Freakwill/ell | 8aa510cefb5d63db35071820208971013fac154c | [
"MIT"
] | null | null | null | tests/_test_image.py | Freakwill/ell | 8aa510cefb5d63db35071820208971013fac154c | [
"MIT"
] | null | null | null | tests/_test_image.py | Freakwill/ell | 8aa510cefb5d63db35071820208971013fac154c | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
"""Test methods about image process
Make sure the existance of the images
"""
from ell import *
import numpy as np
_filter = Filter.from_name('db4')
def test_resize():
c = ImageRGB.open('src/lenna.jpg')
d = c.resize(minInd=(-100, -100), maxInd=(100, 100))
d.to_image()
assert True
def test_quantize():
im = ImageRGB.open('src/lenna.jpg')
d = im.quantize(128)
d.to_image()
assert True
def test_convolve():
im = ImageRGB.open('src/lenna.jpg')
d = (im @ _filter.H).D
# print(f"{d:i}, {d.shape}")
assert True
def test_filter():
im = ImageRGB.open('src/lenna.jpg')
rec = (im @ _filter.H).D.U @ _filter
assert True
def test_rec():
im = ImageRGB.open('src/lenna.jpg')
def _f(im, h1, h2=None):
if h2 is None: h2 = h1
return (im.conv1d(h1.H, axis=0).conv1d(h2.H, axis=1)).P.conv1d(h1, axis=0).conv1d(h2, axis=1)
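# Sum the four axis-wise (filter, adjoint) subband combinations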
rec = _f(im, _filter) + _f(im, _filter.H) + _f(im, _filter, _filter.H) + _f(im, _filter.H, _filter)
assert True
def test_rec2():
im = ImageRGB.open('src/lenna.jpg')
def _f(im, h1, h2=None):
if h2 is None: h2 = h1
# return (im @ h1.tensor(h2).H).P @ h1.tensor(h2)
return (im.conv1d(h1.H, axis=0).conv1d(h2.H, axis=1)).P.conv1d(h1, axis=0).conv1d(h2, axis=1)
im1 = _f(im, _filter)
rec1 = _f(im1, _filter) + _f(im1, _filter.H) + _f(im1, _filter, _filter.H) + _f(im1, _filter.H, _filter)
rec2 = rec1 + _f(im, _filter.H) + _f(im, _filter, _filter.H) + _f(im, _filter.H, _filter)
assert True
def test_rec3():
im = ImageRGB.open('src/lenna.jpg')
def _f(im, h1, h2=None):
if h2 is None: h2 = h1
f = h1.tensor(h2)
return im.reduce(f).expand(f)
im1 = im.reduce(_filter)
rec1 = _f(im1, _filter) + _f(im1, _filter.H) + _f(im1, _filter, _filter.H) + _f(im1, _filter.H, _filter)
rec2 = rec1.expand(_filter) + _f(im, _filter.H) + _f(im, _filter, _filter.H) + _f(im, _filter.H, _filter)
assert True
| 28.535211 | 109 | 0.605133 | 338 | 2,026 | 3.428994 | 0.204142 | 0.102675 | 0.085418 | 0.120794 | 0.713546 | 0.669543 | 0.627265 | 0.584124 | 0.535807 | 0.535807 | 0 | 0.051169 | 0.218657 | 2,026 | 70 | 110 | 28.942857 | 0.680985 | 0.082922 | 0 | 0.5 | 0 | 0 | 0.052489 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 1 | 0.208333 | false | 0 | 0.041667 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e026dd61a71f4c0236cf71cd04ff440228426371 | 1,303 | py | Python | bot/views.py | eyobofficial/COVID-19-Mutual-Aid | 42d30ce95b0e9c717c5eda3ecaafea2812ec34f7 | [
"MIT"
] | null | null | null | bot/views.py | eyobofficial/COVID-19-Mutual-Aid | 42d30ce95b0e9c717c5eda3ecaafea2812ec34f7 | [
"MIT"
] | 5 | 2020-03-19T17:49:50.000Z | 2021-06-10T20:06:14.000Z | bot/views.py | eyobofficial/COVID-19-Mutual-Aid | 42d30ce95b0e9c717c5eda3ecaafea2812ec34f7 | [
"MIT"
] | null | null | null | import telegram
from django.conf import settings
from django.shortcuts import redirect
from django.utils.decorators import method_decorator
from django.views.generic import View
from django.views.decorators.csrf import csrf_exempt
from braces.views import CsrfExemptMixin
from rest_framework.authentication import BasicAuthentication
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.permissions import AllowAny
from .bots import TelegramBot
from .models import TelegramUser as User
@method_decorator(csrf_exempt, name='dispatch')
class TelegramBotView(APIView):
permission_classes = (AllowAny, )
def post(self, request, *args, **kwargs):
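# Telegram delivers each update as a JSON POST: upsert the sender record,
# bump its access counter and hand the update to the bot for dispatch.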
context = request.data
bot = TelegramBot(context)
user, _ = User.objects.get_or_create(
id=bot.sender['id'],
defaults={
'first_name': bot.sender['first_name'],
'last_name': bot.sender.get('last_name', ''),
'username': bot.sender.get('username', ''),
'is_bot': bot.sender.get('is_bot', False)
}
)
user.access_count += 1
user.save()
bot.process(user)
return Response(status=status.HTTP_200_OK)
| 29.613636 | 61 | 0.692249 | 154 | 1,303 | 5.714286 | 0.454545 | 0.056818 | 0.096591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003922 | 0.217191 | 1,303 | 43 | 62 | 30.302326 | 0.858824 | 0 | 0 | 0 | 0 | 0 | 0.058462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0 | 0.424242 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
e0274bb01109146cf480d663260e32b7e8a8cc2d | 580 | py | Python | portfolio/urls.py | ramza007/Ramza.io | 2172d9ac13e87becbc8644ad5755070f48fab8da | [
"Apache-2.0"
] | 3 | 2019-12-16T16:47:16.000Z | 2020-07-28T19:47:34.000Z | portfolio/urls.py | ramza007/Ramza.io | 2172d9ac13e87becbc8644ad5755070f48fab8da | [
"Apache-2.0"
] | 15 | 2019-12-05T03:38:19.000Z | 2022-03-13T02:35:30.000Z | portfolio/urls.py | ramza007/Ramza.io | 2172d9ac13e87becbc8644ad5755070f48fab8da | [
"Apache-2.0"
] | null | null | null | from django.conf.urls import url
from django.urls import path, include,re_path
from . import views
from rest_framework.authtoken.views import obtain_auth_token
urlpatterns = [
path('', views.index, name='index'),
path('about', views.about, name='about'),
path('projects', views.projects, name='projects'),
path('photos', views.photos, name='photos'),
re_path(r'^api/projects/$', views.ProjectList.as_view()),
re_path(r'^api-token-auth/', obtain_auth_token),
re_path(r'api/project/project-id/(?P<pk>[0-9]+)/$', views.ProjectDescription.as_view()),
]
| 34.117647 | 92 | 0.7 | 83 | 580 | 4.759036 | 0.409639 | 0.060759 | 0.053165 | 0.075949 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003922 | 0.12069 | 580 | 16 | 93 | 36.25 | 0.770588 | 0 | 0 | 0 | 0 | 0 | 0.194828 | 0.067241 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
e027abc215159b586950e87882ad8ad4be055155 | 407 | py | Python | tests/resources/mlflow-test-plugin/mlflow_test_plugin/file_store.py | iPieter/kiwi | 76b66872fce68873809a0dea112e2ed552ae5b63 | [
"Apache-2.0"
] | null | null | null | tests/resources/mlflow-test-plugin/mlflow_test_plugin/file_store.py | iPieter/kiwi | 76b66872fce68873809a0dea112e2ed552ae5b63 | [
"Apache-2.0"
] | 1 | 2021-01-24T13:34:51.000Z | 2021-01-24T13:34:51.000Z | tests/resources/mlflow-test-plugin/mlflow_test_plugin/file_store.py | iPieter/kiwi | 76b66872fce68873809a0dea112e2ed552ae5b63 | [
"Apache-2.0"
] | null | null | null | from six.moves import urllib
from kiwi.store.tracking.file_store import FileStore
class PluginFileStore(FileStore):
"""FileStore provided through entrypoints system"""
def __init__(self, store_uri=None, artifact_uri=None):
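# Strip the URI scheme so the base FileStore receives a plain path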
path = urllib.parse.urlparse(store_uri).path if store_uri else None
self.is_plugin = True
super(PluginFileStore, self).__init__(path, artifact_uri)
| 31.307692 | 75 | 0.746929 | 53 | 407 | 5.45283 | 0.584906 | 0.083045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.167076 | 407 | 12 | 76 | 33.916667 | 0.852507 | 0.110565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
e02a89e62a53d61fc9086acef78dc03df26f1de7 | 2,140 | py | Python | backend/listings/migrations/0001_initial.py | relaxxpls/Music-Control | 76f5d10904f820607b3eb756850d5c5d7d89d875 | [
"MIT"
] | null | null | null | backend/listings/migrations/0001_initial.py | relaxxpls/Music-Control | 76f5d10904f820607b3eb756850d5c5d7d89d875 | [
"MIT"
] | null | null | null | backend/listings/migrations/0001_initial.py | relaxxpls/Music-Control | 76f5d10904f820607b3eb756850d5c5d7d89d875 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.3 on 2021-05-30 04:28
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
class Migration(migrations.Migration):
initial = True
dependencies = [
('realtors', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='Listing',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('slug', models.CharField(max_length=200, unique=True)),
('title', models.CharField(max_length=150)),
('address', models.CharField(default='', max_length=150)),
('city', models.CharField(max_length=100)),
('state', models.CharField(max_length=100)),
('zipcode', models.CharField(max_length=15)),
('description', models.TextField(blank=True)),
('sale_type', models.CharField(choices=[('For Sale', 'For Sale'), ('For Rent', 'For Rent')], default='For Sale', max_length=50)),
('price', models.IntegerField()),
('bedrooms', models.IntegerField()),
('bathrooms', models.DecimalField(decimal_places=1, max_digits=2)),
('home_type', models.CharField(choices=[('House', 'House'), ('Condo', 'Condo'), ('Townhouse', 'Townhouse')], default='House', max_length=50)),
('sqft', models.IntegerField()),
('open_house', models.BooleanField(default=False)),
('photo_main', models.ImageField(upload_to='photos/%Y/%m/%d')),
('photo_1', models.ImageField(blank=True, upload_to='photos/%Y/%m/%d')),
('photo_2', models.ImageField(blank=True, upload_to='photos/%Y/%m/%d')),
('is_published', models.BooleanField(default=True)),
('list_date', models.DateTimeField(blank=True, default=django.utils.timezone.now)),
('realtor', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='realtors.realtor')),
],
),
]
| 48.636364 | 158 | 0.583645 | 225 | 2,140 | 5.431111 | 0.444444 | 0.0982 | 0.07365 | 0.0982 | 0.135025 | 0.090835 | 0.090835 | 0.06874 | 0.06874 | 0.06874 | 0 | 0.027295 | 0.246729 | 2,140 | 43 | 159 | 49.767442 | 0.730769 | 0.021028 | 0 | 0 | 1 | 0 | 0.154802 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.194444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |