hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4007ccb371063c993bd22bb2370d18838e357a3f | 3,218 | py | Python | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | 1 | 2020-04-24T09:24:31.000Z | 2020-04-24T09:24:31.000Z | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | null | null | null | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | null | null | null | import functools
import glob
import itertools
import logging
import os
from progressbar import progressbar
import re
import requests
from typing import Iterator, List, Tuple
class ValueSingleDispatch:
def __init__(self):
self._handlers = dict()
def register(self, key):
def decorator(fn: callable):
if key in self._handlers:
raise KeyError(key)
self._handlers[key] = fn
return fn
return decorator
def call(self, key, *args, **kwargs):
if key not in self._handlers:
raise KeyError(key)
return self._handlers[key](*args, **kwargs)
def valid_keys(self):
return self._handlers.keys()
def alphanumeric_glob(pattern: str):
    """Glob and sort alphanumerically. Limitations: exactly one `*`, no `?`, file names with a single extension."""
matches = glob.glob(pattern)
asterisk_pos = pattern.find('*')
matches.sort(key=lambda name: int(name[asterisk_pos:name.rfind('.')]))
return matches
def findall_in_files(pattern: re.Pattern, filenames: List[str], encoding: str) -> "Iterator[Tuple[str, str]]":
    """Yield every regex match found in the given files (tuples of groups when the pattern has multiple groups)."""
    for filename in filenames:
        logging.debug('util.findall_in_files(): input file %s', filename)
        with open(filename, 'rb') as ifile:
            for match in pattern.findall(ifile.read().decode(encoding)):
                logging.debug('util.findall_in_files(): match: file = %s, text = %s', filename, match)
                yield match
def make_pattern(url_regex: str, extentions: List[str]) -> re.Pattern:
if extentions:
ext_regex = '({})'.format('|'.join(extentions))
else:
ext_regex = '()'
return re.compile(url_regex.format(extentions=ext_regex))
def download_by_pattern(url_regex: str, filenames: List[str], output_dir: str, *, extentions=[], encoding='windows-1251', limit=None):
logging.debug('util.download_by_pattern(): pattern = %s, extentions = %s', url_regex, extentions)
pattern = make_pattern(url_regex, extentions)
matches = findall_in_files(pattern, filenames, encoding)
if limit is not None:
matches = itertools.islice(matches, limit)
matches = list(matches)
logging.info('util.download_by_pattern(): %d matches', len(matches))
os.makedirs(output_dir, exist_ok=True)
downloads = 0
    # TODO statistics by extension
for idx, (url, ext) in progressbar(enumerate(matches), max_value=len(matches)):
local_name = '{:07d}'.format(idx) + '_' + os.path.basename(url)
try:
download(url, os.path.join(output_dir, local_name))
downloads += 1
except Exception as e:
            logging.warning('util.download_by_pattern(): unhandled exception: url = %s, e = %s', url, e)
logging.info('util.download_by_pattern(): %d successful downloads', downloads)
if downloads < len(matches):
logging.warning('util.download_by_pattern(): %d downloads failed, see log for warnings', len(matches) - downloads)
def download(url: str, local_path: str) -> bool:
    logging.debug('util.download(): url = %s, local = %s', url, local_path)
    req = requests.get(url)
    req.raise_for_status()  # surface HTTP errors so the caller counts the download as failed
    with open(local_path, 'wb') as ofile:
        ofile.write(req.content)
    return True
| 38.771084 | 134 | 0.657551 | 408 | 3,218 | 5.04902 | 0.330882 | 0.034951 | 0.049515 | 0.050971 | 0.124757 | 0.124272 | 0.032039 | 0 | 0 | 0 | 0 | 0.003168 | 0.215351 | 3,218 | 82 | 135 | 39.243902 | 0.812673 | 0.044438 | 0 | 0.029412 | 0 | 0 | 0.142624 | 0.059073 | 0 | 0 | 0 | 0.012195 | 0 | 1 | 0.147059 | false | 0 | 0.132353 | 0.014706 | 0.382353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
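The `ValueSingleDispatch` class in the file above is a small registry that routes a call to the handler registered for an exact key. A self-contained sketch of the intended usage, with the class re-declared inline so the snippet runs on its own (the handler names and URLs are invented for illustration):

```python
class ValueSingleDispatch:
    """Registry that dispatches a call to the handler registered
    for an exact key value."""

    def __init__(self):
        self._handlers = {}

    def register(self, key):
        def decorator(fn):
            if key in self._handlers:
                raise KeyError(key)  # refuse to silently overwrite a handler
            self._handlers[key] = fn
            return fn
        return decorator

    def call(self, key, *args, **kwargs):
        if key not in self._handlers:
            raise KeyError(key)
        return self._handlers[key](*args, **kwargs)

    def valid_keys(self):
        return self._handlers.keys()


dispatch = ValueSingleDispatch()


@dispatch.register('photo')
def handle_photo(url):
    return 'downloading photo from ' + url


@dispatch.register('doc')
def handle_doc(url):
    return 'downloading doc from ' + url


print(dispatch.call('photo', 'http://example.com/a.jpg'))
# downloading photo from http://example.com/a.jpg
```

Registering the same key twice raises `KeyError`, which catches copy-paste mistakes early; unknown keys also raise `KeyError` at call time.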
400afc4da001a8c030925a65e03f44b9ed050772 | 1,637 | py | Python | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 5 | 2021-02-03T05:02:56.000Z | 2022-01-31T07:55:20.000Z | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 14 | 2021-02-03T04:18:48.000Z | 2022-01-24T03:50:22.000Z | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 13 | 2021-02-03T03:41:17.000Z | 2022-01-24T04:21:23.000Z | #Copyright 2021 Neil Flood and Sam Gillingham. All rights reserved.
#
#Permission is hereby granted, free of charge, to any person
#obtaining a copy of this software and associated documentation
#files (the "Software"), to deal in the Software without restriction,
#including without limitation the rights to use, copy, modify,
#merge, publish, distribute, sublicense, and/or sell copies of the
#Software, and to permit persons to whom the Software is furnished
#to do so, subject to the following conditions:
#
#The above copyright notice and this permission notice shall be
#included in all copies or substantial portions of the Software.
#
#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
#EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
#OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
#IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
#ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
#CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
#WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
from numpy.distutils.core import setup
import pyshepseg
setup(name='pyshepseg',
version=pyshepseg.SHEPSEG_VERSION,
description='Python implementation of the image segmentation algorithm described by Shepherd et al',
author='Neil Flood and Sam Gillingham',
scripts=['bin/test_pyshepseg.py', 'bin/test_pyshepseg_tiling.py',
'bin/test_pyshepseg_subset.py'],
packages=['pyshepseg'],
license='LICENSE.txt',
url='https://github.com/ubarsc/pyshepseg'
)
| 46.771429 | 106 | 0.756261 | 233 | 1,637 | 5.287554 | 0.566524 | 0.071429 | 0.038961 | 0.024351 | 0.040584 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002963 | 0.175321 | 1,637 | 34 | 107 | 48.147059 | 0.90963 | 0.662187 | 0 | 0 | 0 | 0 | 0.478424 | 0.144465 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
401988f94a7b7ebda02b1f821bbce411385f8136 | 3,885 | py | Python | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | null | null | null | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | 3 | 2015-06-09T19:22:50.000Z | 2015-06-09T21:41:22.000Z | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | null | null | null | import os
import json
import shutil
import tempfile
import mock
import pytest
from opencivicdata.models import Person
from pupa.scrape import Person as ScrapePerson
from pupa.scrape import Organization as ScrapeOrganization
from pupa.importers.base import omnihash, BaseImporter
from pupa.importers import PersonImporter, OrganizationImporter
from pupa.exceptions import UnresolvedIdError, DataImportError
class FakeImporter(BaseImporter):
_type = 'test'
def test_omnihash_python_types():
# string
assert omnihash('test') == omnihash('test')
# list
assert omnihash(['this', 'is', 'a', 'list']) == omnihash(['this', 'is', 'a', 'list'])
# set
assert omnihash({'and', 'a', 'set'}) == omnihash({'set', 'set', 'and', 'a'})
# dict w/ set and tuple as well
assert (omnihash({'a': {('fancy', 'nested'): {'dict'}}}) ==
omnihash({'a': {('fancy', 'nested'): {'dict'}}}))
def test_import_directory():
# write out some temp data to filesystem
datadir = tempfile.mkdtemp()
dicta = {'test': 'A'}
dictb = {'test': 'B'}
open(os.path.join(datadir, 'test_a.json'), 'w').write(json.dumps(dicta))
open(os.path.join(datadir, 'test_b.json'), 'w').write(json.dumps(dictb))
# simply ensure that import directory calls import_data with all dicts
ti = FakeImporter('jurisdiction-id')
with mock.patch.object(ti, attribute='import_data') as mockobj:
ti.import_directory(datadir)
# import_data should be called once
assert mockobj.call_count == 1
# kind of hacky, get the total list of args passed in
arg_objs = list(mockobj.call_args[0][0])
# 2 args only, make sure a and b are in there
assert len(arg_objs) == 2
assert dicta in arg_objs
assert dictb in arg_objs
# clean up datadir
shutil.rmtree(datadir)
# doing these next few tests just on a Person because it is the same code that handles it
# but for completeness maybe it is better to do these on each type?
@pytest.mark.django_db
def test_deduplication_identical_object():
p1 = ScrapePerson('Dwayne').as_dict()
p2 = ScrapePerson('Dwayne').as_dict()
PersonImporter('jid').import_data([p1, p2])
assert Person.objects.count() == 1
@pytest.mark.django_db
def test_exception_on_identical_objects_in_import_stream():
# these two objects aren't identical, but refer to the same thing
# at the moment we consider this an error (but there may be a better way to handle this?)
o1 = ScrapeOrganization('X-Men', classification='unknown').as_dict()
o2 = ScrapeOrganization('X-Men', founding_date='1970', classification='unknown').as_dict()
with pytest.raises(Exception):
OrganizationImporter('jid').import_data([o1, o2])
@pytest.mark.django_db
def test_resolve_json_id():
p1 = ScrapePerson('Dwayne').as_dict()
p2 = ScrapePerson('Dwayne').as_dict()
pi = PersonImporter('jid')
# do import and get database id
p1_id = p1['_id']
p2_id = p2['_id']
pi.import_data([p1, p2])
db_id = Person.objects.get().id
# simplest case
assert pi.resolve_json_id(p1_id) == db_id
# duplicate should resolve to same id
assert pi.resolve_json_id(p2_id) == db_id
# a null id should map to None
assert pi.resolve_json_id(None) is None
# no such id
with pytest.raises(UnresolvedIdError):
pi.resolve_json_id('this-is-invalid')
@pytest.mark.django_db
def test_invalid_fields():
p1 = ScrapePerson('Dwayne').as_dict()
p1['newfield'] = "shouldn't happen"
with pytest.raises(DataImportError):
PersonImporter('jid').import_data([p1])
@pytest.mark.django_db
def test_invalid_fields_related_item():
p1 = ScrapePerson('Dwayne')
p1.add_link('http://example.com')
p1 = p1.as_dict()
p1['links'][0]['test'] = 3
with pytest.raises(DataImportError):
PersonImporter('jid').import_data([p1])
| 31.585366 | 94 | 0.686486 | 543 | 3,885 | 4.775322 | 0.344383 | 0.030852 | 0.030852 | 0.034709 | 0.257231 | 0.163903 | 0.115696 | 0.115696 | 0.086386 | 0.040108 | 0 | 0.012295 | 0.183526 | 3,885 | 122 | 95 | 31.844262 | 0.80517 | 0.186873 | 0 | 0.186667 | 0 | 0 | 0.09366 | 0 | 0 | 0 | 0 | 0 | 0.16 | 1 | 0.093333 | false | 0 | 0.346667 | 0 | 0.466667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
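The first test above treats `omnihash` as a structural hash that ignores ordering in sets and dicts but not in lists. One way such a helper could be written (a sketch only, not pupa's actual implementation; it assumes leaf values with stable, sortable `repr`s):

```python
import hashlib
import json


def omnihash_sketch(obj):
    """Structurally hash nested Python values; sets and dict items are
    sorted first so logically equal inputs hash identically."""
    def normalize(o):
        if isinstance(o, (set, frozenset)):
            return ['__set__'] + sorted(normalize(x) for x in o)
        if isinstance(o, dict):
            return ['__dict__'] + sorted([normalize(k), normalize(v)]
                                         for k, v in o.items())
        if isinstance(o, (list, tuple)):
            return [normalize(x) for x in o]
        return repr(o)  # leaf value: rely on a stable repr
    blob = json.dumps(normalize(obj)).encode('utf-8')
    return hashlib.sha256(blob).hexdigest()


print(omnihash_sketch({'and', 'a', 'set'}) == omnihash_sketch({'set', 'set', 'and', 'a'}))
# True
```

Lists keep their element order, so reordering a list changes the hash, while reordering a set or a dict does not.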
401fd2803f10b2fab1010a7dfe0776cbe8cc8571 | 11,612 | py | Python | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
from debtcollector import moves
from neutron.api.v2 import resource_helper
from neutron_lib.api.definitions import constants as api_const
from neutron_lib.api.definitions import firewall_v2
from neutron_lib.api import extensions
from neutron_lib.exceptions import firewall_v2 as f_exc
from neutron_lib.services import base as service_base
from oslo_config import cfg
import six
from neutron_fwaas._i18n import _
from neutron_fwaas.common import fwaas_constants
FirewallGroupNotFound = moves.moved_class(
f_exc.FirewallGroupNotFound, 'FirewallGroupNotFound', __name__)
FirewallGroupInUse = moves.moved_class(
f_exc.FirewallGroupInUse, 'FirewallGroupInUse', __name__)
FirewallGroupInPendingState = moves.moved_class(
f_exc.FirewallGroupInPendingState, 'FirewallGroupInPendingState', __name__)
FirewallGroupPortInvalid = moves.moved_class(
f_exc.FirewallGroupPortInvalid, 'FirewallGroupPortInvalid', __name__)
FirewallGroupPortInvalidProject = moves.moved_class(
f_exc.FirewallGroupPortInvalidProject, 'FirewallGroupPortInvalidProject',
__name__)
FirewallGroupPortInUse = moves.moved_class(
f_exc.FirewallGroupPortInUse, 'FirewallGroupPortInUse', __name__)
FirewallPolicyNotFound = moves.moved_class(
f_exc.FirewallPolicyNotFound, 'FirewallPolicyNotFound', __name__)
FirewallPolicyInUse = moves.moved_class(
f_exc.FirewallPolicyInUse, 'FirewallPolicyInUse', __name__)
FirewallPolicyConflict = moves.moved_class(
f_exc.FirewallPolicyConflict, 'FirewallPolicyConflict', __name__)
FirewallRuleSharingConflict = moves.moved_class(
f_exc.FirewallRuleSharingConflict, 'FirewallRuleSharingConflict',
__name__)
FirewallPolicySharingConflict = moves.moved_class(
f_exc.FirewallPolicySharingConflict, 'FirewallPolicySharingConflict',
__name__)
FirewallRuleNotFound = moves.moved_class(
f_exc.FirewallRuleNotFound, 'FirewallRuleNotFound', __name__)
FirewallRuleInUse = moves.moved_class(
f_exc.FirewallRuleInUse, 'FirewallRuleInUse', __name__)
FirewallRuleNotAssociatedWithPolicy = moves.moved_class(
f_exc.FirewallRuleNotAssociatedWithPolicy,
'FirewallRuleNotAssociatedWithPolicy',
__name__)
FirewallRuleInvalidProtocol = moves.moved_class(
f_exc.FirewallRuleInvalidProtocol, 'FirewallRuleInvalidProtocol',
__name__)
FirewallRuleInvalidAction = moves.moved_class(
f_exc.FirewallRuleInvalidAction, 'FirewallRuleInvalidAction',
__name__)
FirewallRuleInvalidICMPParameter = moves.moved_class(
f_exc.FirewallRuleInvalidICMPParameter,
'FirewallRuleInvalidICMPParameter', __name__)
FirewallRuleWithPortWithoutProtocolInvalid = moves.moved_class(
f_exc.FirewallRuleWithPortWithoutProtocolInvalid,
'FirewallRuleWithPortWithoutProtocolInvalid', __name__)
FirewallRuleInvalidPortValue = moves.moved_class(
f_exc.FirewallRuleInvalidPortValue, 'FirewallRuleInvalidPortValue',
__name__)
FirewallRuleInfoMissing = moves.moved_class(
f_exc.FirewallRuleInfoMissing, 'FirewallRuleInfoMissing', __name__)
FirewallIpAddressConflict = moves.moved_class(
f_exc.FirewallIpAddressConflict, 'FirewallIpAddressConflict', __name__)
FirewallInternalDriverError = moves.moved_class(
f_exc.FirewallInternalDriverError, 'FirewallInternalDriverError', __name__)
FirewallRuleConflict = moves.moved_class(
f_exc.FirewallRuleConflict, 'FirewallRuleConflict', __name__)
FirewallRuleAlreadyAssociated = moves.moved_class(
f_exc.FirewallRuleAlreadyAssociated, 'FirewallRuleAlreadyAssociated',
__name__)
default_fwg_rules_opts = [
cfg.StrOpt('ingress_action',
default=api_const.FWAAS_DENY,
help=_('Firewall group rule action allow or '
'deny or reject for ingress. '
'Default is deny.')),
cfg.StrOpt('ingress_source_ipv4_address',
default=None,
help=_('IPv4 source address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_source_ipv6_address',
default=None,
help=_('IPv6 source address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_source_port',
default=None,
help=_('Source port number or range '
'(min:max) for ingress. '
'Default is None.')),
cfg.StrOpt('ingress_destination_ipv4_address',
default=None,
help=_('IPv4 destination address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_destination_ipv6_address',
default=None,
help=_('IPv6 destination address for ingress '
'(address or address/netmask). '
'Default is deny.')),
cfg.StrOpt('ingress_destination_port',
default=None,
help=_('Destination port number or range '
'(min:max) for ingress. '
'Default is None.')),
cfg.StrOpt('egress_action',
default=api_const.FWAAS_ALLOW,
help=_('Firewall group rule action allow or '
'deny or reject for egress. '
'Default is allow.')),
cfg.StrOpt('egress_source_ipv4_address',
default=None,
help=_('IPv4 source address for egress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('egress_source_ipv6_address',
default=None,
help=_('IPv6 source address for egress '
'(address or address/netmask). '
'Default is deny.')),
cfg.StrOpt('egress_source_port',
default=None,
help=_('Source port number or range '
'(min:max) for egress. '
'Default is None.')),
cfg.StrOpt('egress_destination_ipv4_address',
default=None,
help=_('IPv4 destination address for egress '
'(address or address/netmask). '
'Default is deny.')),
cfg.StrOpt('egress_destination_ipv6_address',
default=None,
help=_('IPv6 destination address for egress '
'(address or address/netmask). '
'Default is deny.')),
cfg.StrOpt('egress_destination_port',
default=None,
help=_('Destination port number or range '
'(min:max) for egress. '
'Default is None.')),
cfg.BoolOpt('shared',
default=False,
help=_('Firewall group rule shared. '
'Default is False.')),
cfg.StrOpt('protocol',
default=None,
help=_('Network protocols (tcp, udp, ...). '
'Default is None.')),
cfg.BoolOpt('enabled',
default=True,
help=_('Firewall group rule enabled. '
'Default is True.')),
]
firewall_quota_opts = [
cfg.IntOpt('quota_firewall_group',
default=10,
help=_('Number of firewall groups allowed per tenant. '
'A negative value means unlimited.')),
cfg.IntOpt('quota_firewall_policy',
default=10,
help=_('Number of firewall policies allowed per tenant. '
'A negative value means unlimited.')),
cfg.IntOpt('quota_firewall_rule',
default=100,
help=_('Number of firewall rules allowed per tenant. '
'A negative value means unlimited.')),
]
cfg.CONF.register_opts(default_fwg_rules_opts, 'default_fwg_rules')
cfg.CONF.register_opts(firewall_quota_opts, 'QUOTAS')
# TODO(Reedip): Remove the convert_to functionality after bug1706061 is fixed.
def convert_to_string(value):
if value is not None:
return str(value)
return None
firewall_v2.RESOURCE_ATTRIBUTE_MAP[api_const.FIREWALL_RULES][
'source_port']['convert_to'] = convert_to_string
firewall_v2.RESOURCE_ATTRIBUTE_MAP[api_const.FIREWALL_RULES][
'destination_port']['convert_to'] = convert_to_string
class Firewall_v2(extensions.APIExtensionDescriptor):
api_definition = firewall_v2
@classmethod
def get_resources(cls):
special_mappings = {'firewall_policies': 'firewall_policy'}
plural_mappings = resource_helper.build_plural_mappings(
special_mappings, firewall_v2.RESOURCE_ATTRIBUTE_MAP)
return resource_helper.build_resource_info(
plural_mappings, firewall_v2.RESOURCE_ATTRIBUTE_MAP,
fwaas_constants.FIREWALL_V2, action_map=firewall_v2.ACTION_MAP,
register_quota=True)
@classmethod
def get_plugin_interface(cls):
return Firewallv2PluginBase
@six.add_metaclass(abc.ABCMeta)
class Firewallv2PluginBase(service_base.ServicePluginBase):
def get_plugin_type(self):
return fwaas_constants.FIREWALL_V2
def get_plugin_description(self):
return 'Firewall Service v2 Plugin'
# Firewall Group
@abc.abstractmethod
def create_firewall_group(self, context, firewall_group):
pass
@abc.abstractmethod
def delete_firewall_group(self, context, id):
pass
@abc.abstractmethod
def get_firewall_group(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_groups(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_group(self, context, id, firewall_group):
pass
# Firewall Policy
@abc.abstractmethod
def create_firewall_policy(self, context, firewall_policy):
pass
@abc.abstractmethod
def delete_firewall_policy(self, context, id):
pass
@abc.abstractmethod
def get_firewall_policy(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_policies(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_policy(self, context, id, firewall_policy):
pass
# Firewall Rule
@abc.abstractmethod
def create_firewall_rule(self, context, firewall_rule):
pass
@abc.abstractmethod
def delete_firewall_rule(self, context, id):
pass
@abc.abstractmethod
def get_firewall_rule(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_rules(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_rule(self, context, id, firewall_rule):
pass
@abc.abstractmethod
def insert_rule(self, context, id, rule_info):
pass
@abc.abstractmethod
def remove_rule(self, context, id, rule_info):
pass
| 38.323432 | 79 | 0.673527 | 1,161 | 11,612 | 6.445306 | 0.18863 | 0.013364 | 0.048109 | 0.051316 | 0.458773 | 0.362021 | 0.297875 | 0.289189 | 0.289189 | 0.246559 | 0 | 0.006266 | 0.244058 | 11,612 | 302 | 80 | 38.450331 | 0.846206 | 0.058388 | 0 | 0.384 | 0 | 0 | 0.239923 | 0.074936 | 0 | 0 | 0 | 0.003311 | 0 | 1 | 0.088 | false | 0.068 | 0.048 | 0.012 | 0.172 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
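The `Firewallv2PluginBase` above is made abstract via `six.add_metaclass(abc.ABCMeta)`; any concrete plugin must implement every `@abc.abstractmethod` hook before it can be instantiated. A cut-down, self-contained illustration of that enforcement (class names here are invented, and plain `abc.ABC` is used instead of the `six` helper):

```python
import abc


class PluginBase(abc.ABC):
    """Tiny stand-in for a service plugin base class."""

    @abc.abstractmethod
    def create_firewall_group(self, context, firewall_group):
        raise NotImplementedError


class IncompletePlugin(PluginBase):
    pass  # forgets to implement the abstract hook


class CompletePlugin(PluginBase):
    def create_firewall_group(self, context, firewall_group):
        # Pretend to persist the group and hand back an id.
        return dict(firewall_group, id='fwg-1')


try:
    IncompletePlugin()
except TypeError as exc:
    print('rejected:', exc)  # abstract method not implemented

plugin = CompletePlugin()
print(plugin.create_firewall_group(None, {'name': 'web'}))
```

Instantiating `IncompletePlugin` fails with `TypeError` at construction time rather than with `NotImplementedError` deep inside a request, which is the point of declaring the base methods abstract.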
4022d54aeba2badfe2c92ef3c771f491343dff82 | 1,919 | py | Python | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | from sklearn import preprocessing
import pandas as pd
import numpy as np
#import matplotlib.pyplot as plt
# Load the data as a DataFrame
dados = pd.read_csv('dados/001.csv')
# Initialize the binarizer for the class labels: sim=1, não=0
pre = preprocessing.LabelBinarizer()
# Binarize the 'jogou' class and flatten it into a 1-D label array
y_binary = pre.fit_transform(dados['jogou'])
y = np.array(y_binary).ravel()
lista_clima = [x for x in dados['clima']]
lista_temperatura = [x for x in dados['temperatura']]
lista_jogou = [x for x in dados['jogou']]
pre = preprocessing.LabelEncoder()
clima_encoding = pre.fit_transform(lista_clima)
temperatura_encoding = pre.fit_transform(lista_temperatura)
jogou_encoding = pre.fit_transform(lista_jogou)
lista = list(zip(clima_encoding, temperatura_encoding, jogou_encoding))
X = np.array(lista, dtype=np.int32)
#colunas = ['A', 'B', 'C']
# print(pd.DataFrame(X, columns=colunas, dtype=np.int32))
# print(pd.DataFrame(y, columns=['Classe'], dtype=np.int32))
#
# xX = []
# for i, x in enumerate(X):
# xX.append([list(x), y[i][0]])
#
# dX = [(x[0][0] + x[0][1] + x[0][2]) for x in xX]
# dY = [x[1] for x in xX]
#
# print('Sum of the labels:', dX)
# print('Class:', dY)
#
# fig, ax = plt.subplots()
# ax.plot(dX)
# ax.plot(dY)
# plt.show()
from sklearn import model_selection
from sklearn.metrics import accuracy_score
from sklearn.neighbors import KNeighborsClassifier
# Split the data: 75% for training and 25% for testing; I always use this split :)
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y, test_size=0.25, random_state=0)
# Build the model, keeping the default parameters
knn = KNeighborsClassifier()
# Train the model
knn.fit(X=X_train, y=y_train)
# Evaluate the model's accuracy on the test data
pontuacao = str(accuracy_score(y_test, knn.predict(X_test)) * 100)
print("Accuracy: " + pontuacao + "%")
| 28.641791 | 105 | 0.727983 | 308 | 1,919 | 4.422078 | 0.396104 | 0.013216 | 0.022026 | 0.015419 | 0.088106 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01804 | 0.133403 | 1,919 | 66 | 106 | 29.075758 | 0.800962 | 0.414278 | 0 | 0 | 0 | 0 | 0.045537 | 0 | 0 | 0 | 0 | 0.015152 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
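The script above hands classification to scikit-learn's `KNeighborsClassifier`. The core idea, a majority vote among the k nearest training points, can be sketched with no dependencies (the toy data below is made up):

```python
from collections import Counter


def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest
    training points (squared Euclidean distance)."""
    dist = [(sum((a - b) ** 2 for a, b in zip(row, x)), label)
            for row, label in zip(X_train, y_train)]
    nearest = sorted(dist)[:k]  # k smallest distances
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]


X_train = [(0, 0), (0, 1), (1, 0), (5, 5), (6, 5), (5, 6)]
y_train = [0, 0, 0, 1, 1, 1]
print(knn_predict(X_train, y_train, (0.5, 0.5)))  # 0
print(knn_predict(X_train, y_train, (5.5, 5.5)))  # 1
```

The encoded clima/temperatura columns in the script play the role of `X_train` here, and the binarized `jogou` labels the role of `y_train`.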
40255e51d495409353d842161452761a11a4b039 | 8,940 | py | Python | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 The Kubeflow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Test Vertex AI Batch Prediction Job Remote Runner Client module."""
import json
from logging import raiseExceptions
import os
import time
import unittest
from unittest import mock
from google.cloud import aiplatform
from google.cloud.aiplatform.compat.types import job_state as gca_job_state
from google.protobuf import json_format
from google_cloud_pipeline_components.proto.gcp_resources_pb2 import GcpResources
from google_cloud_pipeline_components.container.experimental.gcp_launcher import batch_prediction_job_remote_runner
from google_cloud_pipeline_components.container.experimental.gcp_launcher import job_remote_runner
class BatchPredictionJobRemoteRunnerUtilsTests(unittest.TestCase):
def setUp(self):
super(BatchPredictionJobRemoteRunnerUtilsTests, self).setUp()
self._payload = (
'{"batchPredictionJob": {"displayName": '
'"BatchPredictionComponentName", "model": '
'"projects/test/locations/test/models/test-model","inputConfig":'
' {"instancesFormat": "CSV","gcsSource": {"uris": '
'["test_gcs_source"]}}, "outputConfig": {"predictionsFormat": '
'"CSV", "gcsDestination": {"outputUriPrefix": '
'"test_gcs_destination"}}}}')
self._job_type = 'BatchPredictionJob'
self._project = 'test_project'
self._location = 'test_region'
        self._batch_prediction_job_name = f'/projects/{self._project}/locations/{self._location}/jobs/test_job_id'
self._gcp_resources_path = 'gcp_resources'
self._batch_prediction_job_uri_prefix = f'https://{self._location}-aiplatform.googleapis.com/v1/'
def tearDown(self):
if os.path.exists(self._gcp_resources_path):
os.remove(self._gcp_resources_path)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
def test_batch_prediction_job_remote_runner_on_region_is_set_correctly_in_client_options(
self, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
mock_job_service_client.assert_called_once_with(
client_options={
'api_endpoint': 'test_region-aiplatform.googleapis.com'
},
client_info=mock.ANY)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_on_payload_deserializes_correctly(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
expected_parent = f'projects/{self._project}/locations/{self._location}'
expected_job_spec = json.loads(self._payload, strict=False)
job_client.create_batch_prediction_job.assert_called_once_with(
parent=expected_parent, batch_prediction_job=expected_job_spec)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_raises_exception_on_error(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_FAILED
mock_path_exists.return_value = False
with self.assertRaises(RuntimeError):
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
@mock.patch.object(time, 'sleep', autospec=True)
def test_batch_prediction_job_remote_runner_retries_to_get_status_on_non_completed_job(
self, mock_time_sleep, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response_success = mock.Mock()
get_batch_prediction_job_response_success.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
get_batch_prediction_job_response_running = mock.Mock()
get_batch_prediction_job_response_running.state = gca_job_state.JobState.JOB_STATE_RUNNING
job_client.get_batch_prediction_job.side_effect = [
get_batch_prediction_job_response_running,
get_batch_prediction_job_response_success
]
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
mock_time_sleep.assert_called_once_with(
job_remote_runner._POLLING_INTERVAL_IN_SECONDS)
self.assertEqual(job_client.get_batch_prediction_job.call_count, 2)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_returns_gcp_resources(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response_success = mock.Mock()
get_batch_prediction_job_response_success.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
job_client.get_batch_prediction_job.side_effect = [
get_batch_prediction_job_response_success
]
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
with open(self._gcp_resources_path) as f:
serialized_gcp_resources = f.read()
# Instantiate GCPResources Proto
batch_prediction_job_resources = json_format.Parse(
serialized_gcp_resources, GcpResources())
self.assertEqual(len(batch_prediction_job_resources.resources), 1)
batch_prediction_job_name = batch_prediction_job_resources.resources[
0].resource_uri[len(self._batch_prediction_job_uri_prefix):]
self.assertEqual(batch_prediction_job_name,
self._batch_prediction_job_name)
| 45.380711 | 115 | 0.794519 | 1,169 | 8,940 | 5.59367 | 0.179641 | 0.176633 | 0.211959 | 0.131213 | 0.669827 | 0.638324 | 0.590151 | 0.579599 | 0.579599 | 0.572106 | 0 | 0.001673 | 0.130872 | 8,940 | 196 | 116 | 45.612245 | 0.839897 | 0.075615 | 0 | 0.514286 | 0 | 0 | 0.086113 | 0.04148 | 0 | 0 | 0 | 0 | 0.05 | 1 | 0.05 | false | 0 | 0.085714 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
402c6d1527bb64bf420904254134ab7105236ec8 | 10,690 | py | Python | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 47 | 2020-08-02T12:28:07.000Z | 2022-03-30T01:56:57.000Z | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 4 | 2020-09-20T17:31:51.000Z | 2021-12-02T17:40:03.000Z | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 4 | 2020-11-23T15:47:34.000Z | 2021-03-30T02:02:02.000Z | import csv
import json
import pickle
import logging
import re
import pandas
import gzip
import os
import numpy as np
from random import randint, random
from tqdm import tqdm
from retriever.dense_retriever import DenseRetriever
from models.tokenization import tokenize
from typing import Union, List
class InputExample:
"""
Structure for one input example with texts, the label and a unique id
"""
def __init__(self, guid: str, texts: List[str], label: Union[int, float]):
"""
Creates one InputExample with the given texts, guid and label
str.strip() is called on both texts.
:param guid
id for the example
:param texts
the texts for the example
:param label
the label for the example
"""
self.guid = guid
self.texts = [text.strip() for text in texts]
self.label = label
def get_texts(self):
return self.texts
def get_label(self):
return self.label
class LoggingHandler(logging.Handler):
def __init__(self, level=logging.NOTSET):
super().__init__(level)
def emit(self, record):
try:
msg = self.format(record)
tqdm.write(msg)
self.flush()
except (KeyboardInterrupt, SystemExit):
raise
except:
self.handleError(record)
def get_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['label']
guid = "%s-%d" % (filename, id)
id += 1
if label == 'entailment':
label = 0
elif label == 'contradiction':
label = 1
else:
label = 2
examples.append(InputExample(guid=guid,
texts=[sample['s1'], sample['s2']],
label=label))
if 0 < max_examples <= len(examples):
break
return examples
def get_qa_examples(filename, max_examples=0, dev=False):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['relevant']
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=label))
if not dev:
if label == 1:
for _ in range(13):
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=label))
if 0 < max_examples <= len(examples):
break
return examples
def map_label(label):
labels = {"relevant": 0, "irrelevant": 1}
return labels[label.strip().lower()]
def get_qar_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=1.0))
if 0 < max_examples <= len(examples):
break
return examples
def get_qar_artificial_examples():
examples = []
id = 0
print('Loading passages...')
passages = []
    with open('data/msmarco/collection.tsv', 'r', encoding='utf8') as file:
        for line in file:
            line = line.rstrip('\n').split('\t')
            passages.append(line[1])
print('Loaded passages')
with open('data/qar/qar_artificial_queries.csv') as f:
for i, line in enumerate(f):
queries = line.rstrip('\n').split('|')
for query in queries:
guid = "%s-%d" % ('', id)
id += 1
examples.append(InputExample(guid=guid,
texts=[query, passages[i]],
label=1.0))
return examples
def get_single_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['text']],
label=1))
if 0 < max_examples <= len(examples):
break
return examples
def get_qnli_examples(filename, max_examples=0, no_contradictions=False, fever_only=False):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['label']
if label == 'contradiction' and no_contradictions:
continue
if sample['evidence'] == '':
continue
if fever_only and sample['source'] != 'fever':
continue
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['statement'].strip(), sample['evidence'].strip()],
label=1.0))
if 0 < max_examples <= len(examples):
break
return examples
def get_retrieval_examples(filename, negative_corpus='data/msmarco/collection.tsv', max_examples=0, no_statements=True,
encoder_model=None, negative_samples_num=4):
examples = []
queries = []
passages = []
negative_passages = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
if 'evidence' in sample and sample['evidence'] == '':
continue
guid = "%s-%d" % (filename, id)
id += 1
if sample['type'] == 'question':
query = sample['question']
passage = sample['answer']
else:
query = sample['statement']
passage = sample['evidence']
query = query.strip()
passage = passage.strip()
if sample['type'] == 'statement' and no_statements:
continue
queries.append(query)
passages.append(passage)
if sample['source'] == 'natural-questions':
negative_passages.append(passage)
if max_examples == len(passages):
break
if encoder_model is not None:
# Load MSMARCO passages
logging.info('Loading MSM passages...')
with open(negative_corpus) as file:
for line in file:
p = line.rstrip('\n').split('\t')[1]
negative_passages.append(p)
logging.info('Building ANN index...')
dense_retriever = DenseRetriever(model=encoder_model, batch_size=1024, use_gpu=True)
dense_retriever.create_index_from_documents(negative_passages)
results = dense_retriever.search(queries=queries, limit=100, probes=256)
negative_samples = [
[negative_passages[p[0]] for p in r if negative_passages[p[0]] != passages[i]][:negative_samples_num]
for i, r in enumerate(results)
]
# print(queries[0])
# print(negative_samples[0][0])
for i in range(len(queries)):
texts = [queries[i], passages[i]] + negative_samples[i]
examples.append(InputExample(guid=guid,
texts=texts,
label=1.0))
else:
for i in range(len(queries)):
texts = [queries[i], passages[i]]
examples.append(InputExample(guid=guid,
texts=texts,
label=1.0))
return examples
def get_pair_input(tokenizer, sent1, sent2, max_len=256):
text = "[CLS] {} [SEP] {} [SEP]".format(sent1, sent2)
tokenized_text = tokenizer.tokenize(text)[:max_len]
indexed_tokens = tokenizer.encode(text)[:max_len]
segments_ids = []
sep_flag = False
for i in range(len(tokenized_text)):
if tokenized_text[i] == '[SEP]' and not sep_flag:
segments_ids.append(0)
sep_flag = True
elif sep_flag:
segments_ids.append(1)
else:
segments_ids.append(0)
return indexed_tokens, segments_ids
def build_batch(tokenizer, text_list, max_len=256):
token_id_list = []
segment_list = []
attention_masks = []
longest = -1
for pair in text_list:
sent1, sent2 = pair
ids, segs = get_pair_input(tokenizer, sent1, sent2, max_len=max_len)
if ids is None or segs is None:
continue
token_id_list.append(ids)
segment_list.append(segs)
attention_masks.append([1] * len(ids))
if len(ids) > longest:
longest = len(ids)
if len(token_id_list) == 0:
return None, None, None
# padding
assert (len(token_id_list) == len(segment_list))
for ii in range(len(token_id_list)):
token_id_list[ii] += [0] * (longest - len(token_id_list[ii]))
        attention_masks[ii] += [0] * (longest - len(attention_masks[ii]))  # padding positions are masked out
segment_list[ii] += [1] * (longest - len(segment_list[ii]))
return token_id_list, segment_list, attention_masks
def load_unsupervised_dataset(dataset_file):
print('Loading dataset...')
x = pickle.load(open(dataset_file, "rb"))
print('Done')
return x, len(x[0])
def load_supervised_dataset(dataset_file):
print('Loading dataset...')
d = pickle.load(open(dataset_file, "rb"))
print('Done')
return d[0], d[1]
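
The `[CLS]/[SEP]` pair encoding and right-padding performed by `get_pair_input` and `build_batch` above can be sketched standalone. This is a minimal sketch with a hypothetical `StubTokenizer` standing in for a real BERT-style tokenizer (the real code expects a HuggingFace-like tokenizer); padding positions get attention mask 0 so they are ignored.

```python
class StubTokenizer:
    """Hypothetical whitespace tokenizer standing in for a BERT tokenizer."""

    def __init__(self):
        self.vocab = {}

    def tokenize(self, text):
        return text.split()

    def encode(self, text):
        # Assign ids on first sight; 0 is reserved for padding.
        return [self.vocab.setdefault(t, len(self.vocab) + 1)
                for t in self.tokenize(text)]


def encode_pair(tokenizer, sent1, sent2, max_len=256):
    text = "[CLS] {} [SEP] {} [SEP]".format(sent1, sent2)
    tokens = tokenizer.tokenize(text)[:max_len]
    ids = tokenizer.encode(text)[:max_len]
    # Segment 0 up to and including the first [SEP], segment 1 afterwards.
    first_sep = tokens.index("[SEP]")
    segments = [0 if i <= first_sep else 1 for i in range(len(tokens))]
    return ids, segments


def pad_batch(tokenizer, pairs, max_len=256):
    encoded = [encode_pair(tokenizer, s1, s2, max_len) for s1, s2 in pairs]
    longest = max(len(ids) for ids, _ in encoded)
    token_ids, segment_ids, attention_masks = [], [], []
    for ids, segs in encoded:
        pad = longest - len(ids)
        token_ids.append(ids + [0] * pad)
        segment_ids.append(segs + [1] * pad)
        attention_masks.append([1] * len(ids) + [0] * pad)  # 0 masks padding
    return token_ids, segment_ids, attention_masks


tok = StubTokenizer()
ids, segs, masks = pad_batch(tok, [("is it relevant", "yes it is"),
                                   ("short q", "a")])
assert len(ids[0]) == len(ids[1]) == len(segs[1]) == len(masks[1])
```

All rows in the batch come out padded to the longest sequence, which is what the downstream model expects.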
| 31.627219 | 119 | 0.528718 | 1,172 | 10,690 | 4.703925 | 0.180887 | 0.023943 | 0.017958 | 0.048975 | 0.40468 | 0.377109 | 0.363686 | 0.334482 | 0.310539 | 0.287321 | 0 | 0.014142 | 0.358372 | 10,690 | 337 | 120 | 31.721068 | 0.789619 | 0.034425 | 0 | 0.435115 | 0 | 0 | 0.057023 | 0.008705 | 0 | 0 | 0 | 0 | 0.003817 | 1 | 0.064886 | false | 0.072519 | 0.053435 | 0.007634 | 0.183206 | 0.022901 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
403251bad5543a2ea9b5b81f85773876a2b6f3ba | 1,458 | py | Python | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
gluster-georep-tools.setup.py
:copyright: (c) 2016 by Aravinda VK
:license: MIT, see LICENSE for more details.
"""
from setuptools import setup
setup(
name="gluster-georep-tools",
version="0.2",
packages=["gluster_georep_tools",
"gluster_georep_tools.status",
"gluster_georep_tools.setup"],
include_package_data=True,
install_requires=['argparse', 'paramiko', 'glustercli'],
entry_points={
"console_scripts": [
"gluster-georep-setup = gluster_georep_tools.setup.cli:main",
"gluster-georep-status = gluster_georep_tools.status.cli:main",
]
},
platforms="linux",
zip_safe=False,
author="Aravinda VK",
author_email="mail@aravindavk.in",
description="Gluster Geo-replication tools",
license="MIT",
keywords="gluster, tool, geo-replication",
url="https://github.com/aravindavk/gluster-georep-tools",
long_description="""
Gluster Geo-replication Tools
""",
classifiers=[
"Development Status :: 3 - Alpha",
"Topic :: Utilities",
"Environment :: Console",
"License :: OSI Approved :: MIT License",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python",
"Programming Language :: Python :: 2.6",
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 2 :: Only"
],
)
| 30.375 | 75 | 0.61454 | 151 | 1,458 | 5.81457 | 0.543046 | 0.148064 | 0.164009 | 0.078588 | 0.084282 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011743 | 0.240741 | 1,458 | 47 | 76 | 31.021277 | 0.781391 | 0.091221 | 0 | 0 | 0 | 0 | 0.559387 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.026316 | 0 | 0.026316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403352816f5874a59e3b9fffa9b383a34c03d749 | 311 | py | Python | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | # coding: utf-8
from .__utils__ import grayscaleOf, makeImage, sortByGrayscale
NAME = "imgtoch"
VERSIONNUM = 0, 2, 3
VERSION = ".".join(map(str, VERSIONNUM))
AUTHOR = "hrpzcf"
EMAIL = "hrpzcf@foxmail.com"
WEBSITE = "https://gitee.com/hrpzcf/imgtoch"
__all__ = ["grayscaleOf", "makeImage", "sortByGrayscale"]
| 23.923077 | 62 | 0.717042 | 36 | 311 | 5.972222 | 0.777778 | 0.186047 | 0.325581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014652 | 0.122187 | 311 | 12 | 63 | 25.916667 | 0.772894 | 0.041801 | 0 | 0 | 0 | 0 | 0.334459 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403ac1f41e289fbd9825b8c92a8b0c154ef6090e | 1,300 | py | Python | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | 1 | 2020-06-06T17:09:55.000Z | 2020-06-06T17:09:55.000Z | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | null | null | null | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | null | null | null | from jogo import desenha_jogo
from random import randint
import sys
def input_cria_usuario():
usuario = dict()
    usuario['nome'] = input('Enter your name: ')
usuario['pontos'] = 0
usuario['desafiado'] = False
return usuario
def comeco(j1, j2):
j1 = 1
j2 = 2
    n = randint(j1, j2)
escolhildo = n
return escolhildo
# tweaked here
def completou(acertos, pala, jogador_adivinhao):  # receives the correctly guessed letters, then checks whether the word is complete
    if acertos == len(pala):  # and here
        print(f'\t\t\t\t\t \033[37mPlayer >> {jogador_adivinhao} << won!\033[m')
print("""
\033[35m
_____ ___ ___ ___ _______
/ ___| / | / |/ | | ____|
| | / | / /| /| | | |__
| | _ / /| | / / |__/ | | | __|
| |_| | / ___ | / / | | | |____
\_____//_/ |_| /_/ |_| |_______|
_____ _ _ ______ ______
/ _ \ | | / / | _____| | _ |
| | | | | | / / | |__ | |_| |
| | | | | | / / | __| | _ /
| |_| | | |/ / | |____ | | \ |
\_____/ |___/ |______| |_| \_|\033[m
""")
| 23.214286 | 127 | 0.412308 | 94 | 1,300 | 4.457447 | 0.606383 | 0.019093 | 0.02148 | 0.019093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03383 | 0.431538 | 1,300 | 55 | 128 | 23.636364 | 0.533153 | 0.073077 | 0 | 0 | 0 | 0.030303 | 0.60401 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.090909 | 0 | 0.242424 | 0.060606 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403cacc3c31596cf185f47bf3504df89608d6f14 | 1,329 | py | Python | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | 14 | 2017-08-10T17:00:20.000Z | 2021-12-23T09:00:50.000Z | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | null | null | null | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | 1 | 2019-08-13T08:47:43.000Z | 2019-08-13T08:47:43.000Z | import random
import numpy as np
import numpy.linalg as LA
import scipy as spy
import time
from itertools import *
import sys
import cvxpy as cvx
from random import randint
from scipy.sparse import csc_matrix
from scipy import sparse as sp
import networkx as nx
class CVX_weighted:
def __init__(self, X, y, b,pos_node ,temp, Lambda, Rho):
self.X = X
self.y = y
self.value = 0
self.dim = X.shape[1]
self.Lambda = Lambda
self.Rho = Rho
self.temp = temp
self.num_nodes = nx.number_of_nodes(self.temp)
self.W = np.zeros((self.dim))
self.b = b
self.pos_node = pos_node
self.P = np.zeros((self.num_nodes,self.num_nodes))
def init_P(self):
for i in self.temp.nodes_iter():
for j in self.temp.neighbors(i):
self.P[i,j] = self.temp[i][j]['pos_edge_prob']
self.P = np.diag(np.sum(self.P,1)) - self.P
def solve(self):
dim = self.X.shape[1]
w = cvx.Variable(dim)
num_nodes = nx.number_of_nodes(self.temp)
b = cvx.Variable(num_nodes)
loss = cvx.sum_entries(cvx.mul_elemwise(np.array(self.pos_node),cvx.logistic(-cvx.mul_elemwise(self.y, self.X*w+b)))) + self.Lambda*cvx.quad_form(b,self.P)
problem = cvx.Problem(cvx.Minimize(loss))
problem.solve(verbose=False)
opt = problem.value
self.W = w.value
self.b = b.value
self.value = opt | 26.58 | 157 | 0.699774 | 242 | 1,329 | 3.731405 | 0.297521 | 0.053156 | 0.039867 | 0.033223 | 0.115172 | 0.06866 | 0.06866 | 0.06866 | 0 | 0 | 0 | 0.00363 | 0.170805 | 1,329 | 50 | 158 | 26.58 | 0.815789 | 0 | 0 | 0.088889 | 0 | 0 | 0.009774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.311111 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
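
`init_P` above accumulates neighbor edge probabilities and then forms `np.diag(np.sum(P, 1)) - P`, i.e. a weighted graph Laplacian, which `solve` uses as the matrix of the `cvx.quad_form(b, self.P)` regularizer. A minimal numpy sketch with toy edge weights (not taken from the model) shows the Laplacian's defining properties:

```python
import numpy as np

# Symmetric edge-weight matrix for a 3-node graph (toy values).
P = np.array([[0.0, 0.8, 0.2],
              [0.8, 0.0, 0.0],
              [0.2, 0.0, 0.0]])

# Laplacian L = D - P, with D the diagonal matrix of weighted degrees.
L = np.diag(P.sum(axis=1)) - P

# Rows sum to zero and L is positive semidefinite, so b^T L b
# penalises differences between the biases of connected nodes.
assert np.allclose(L.sum(axis=1), 0.0)
b = np.array([1.0, -1.0, 0.5])
assert b @ L @ b >= 0
```

For nonnegative weights, `b @ L @ b` equals the weighted sum of squared differences across edges, which is why the quadratic form acts as a smoothness penalty on `b`.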
404a73f48e1b3ca8bb85958c0c604a1931f4d34f | 1,450 | py | Python | jina/executors/evaluators/rank/recall.py | sdsd0101/jina | 1a835d9015c627a2cbcdc58ee3d127962ada1bc9 | [
"Apache-2.0"
] | 2 | 2020-10-19T17:06:19.000Z | 2020-10-22T14:10:55.000Z | jina/executors/evaluators/rank/recall.py | ayansiddiqui007/jina | 2a764410de47cc11e53c8f652ea1095d5dab5435 | [
"Apache-2.0"
] | null | null | null | jina/executors/evaluators/rank/recall.py | ayansiddiqui007/jina | 2a764410de47cc11e53c8f652ea1095d5dab5435 | [
"Apache-2.0"
] | null | null | null | from typing import Sequence, Any
from jina.executors.evaluators.rank import BaseRankingEvaluator
from jina.executors.evaluators.decorators import as_aggregator
class RecallEvaluator(BaseRankingEvaluator):
"""A :class:`RecallEvaluator` evaluates the Precision of the search.
It computes how many of the first given `eval_at` groundtruth are found in the matches
"""
def __init__(self, eval_at: int, *args, **kwargs):
""""
:param eval_at: k at which evaluation is performed
"""
super().__init__(*args, **kwargs)
self.eval_at = eval_at
@property
def complete_name(self):
return f'Recall@{self.eval_at}'
@as_aggregator
def evaluate(self, matches_ids: Sequence[Any], groundtruth_ids: Sequence[Any], *args, **kwargs) -> float:
""""
:param matches_ids: the matched document identifiers from the request as matched by jina indexers and rankers
:param groundtruth_ids: the expected documents matches ids sorted as they are expected
:return the evaluation metric value for the request document
"""
ret = 0.0
for doc_id in groundtruth_ids[:self.eval_at]:
if doc_id in matches_ids:
ret += 1.0
divisor = min(self.eval_at, len(matches_ids))
if divisor == 0.0:
"""TODO: Agree on a behavior"""
return 0.0
else:
return ret / divisor
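
The counting logic in `evaluate` can be checked by hand. A small standalone sketch of the same computation (hypothetical helper, not importing jina): count how many of the first `eval_at` groundtruth ids appear among the matches, then divide by `min(eval_at, len(matches_ids))`, returning 0.0 when there are no matches.

```python
def recall_at_k(matches_ids, groundtruth_ids, eval_at):
    # Count how many of the first `eval_at` groundtruth ids were retrieved.
    hits = sum(1.0 for doc_id in groundtruth_ids[:eval_at] if doc_id in matches_ids)
    divisor = min(eval_at, len(matches_ids))
    return 0.0 if divisor == 0 else hits / divisor


# groundtruth[:2] == ['a', 'c']; only 'a' appears among the matches.
assert recall_at_k(['a', 'b', 'd'], ['a', 'c', 'd'], eval_at=2) == 0.5
assert recall_at_k([], ['a'], eval_at=5) == 0.0
```

Note the divisor is the number of considered matches rather than the number of relevant documents, matching the `TODO: Agree on a behavior` caveat in the class itself.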
| 35.365854 | 117 | 0.648966 | 185 | 1,450 | 4.935135 | 0.459459 | 0.052574 | 0.054765 | 0.059146 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007498 | 0.264138 | 1,450 | 40 | 118 | 36.25 | 0.848172 | 0.32069 | 0 | 0 | 0 | 0 | 0.023973 | 0.023973 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.047619 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
404be03a1fd1048c68239ebc361551f5a1526980 | 270 | py | Python | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | from typedpy import Array, DoNotSerialize, Structure, mappers
class Foo(Structure):
i: int
s: str
_serialization_mapper = {"i": "j", "s": "name"}
class Example5(Foo):
a: Array
_serialization_mapper = [{"j": DoNotSerialize}, mappers.TO_LOWERCASE] | 20.769231 | 73 | 0.674074 | 32 | 270 | 5.53125 | 0.65625 | 0.214689 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004587 | 0.192593 | 270 | 13 | 73 | 20.769231 | 0.807339 | 0 | 0 | 0 | 0 | 0 | 0.02952 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4050f12cd3fda3e62426b196e960faffe455d7f7 | 938 | py | Python | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 19 | 2020-08-05T12:11:58.000Z | 2022-03-07T01:18:56.000Z | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 18 | 2020-08-20T05:17:38.000Z | 2021-12-06T09:02:00.000Z | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 25 | 2020-08-30T09:10:14.000Z | 2022-02-20T02:31:13.000Z | """Install exception handler for process crash."""
from selfdrive.swaglog import cloudlog
from selfdrive.version import version
import sentry_sdk
from sentry_sdk.integrations.threading import ThreadingIntegration
def capture_exception(*args, **kwargs) -> None:
cloudlog.error("crash", exc_info=kwargs.get('exc_info', 1))
try:
sentry_sdk.capture_exception(*args, **kwargs)
sentry_sdk.flush() # https://github.com/getsentry/sentry-python/issues/291
except Exception:
cloudlog.exception("sentry exception")
def bind_user(**kwargs) -> None:
sentry_sdk.set_user(kwargs)
def bind_extra(**kwargs) -> None:
for k, v in kwargs.items():
sentry_sdk.set_tag(k, v)
def init() -> None:
sentry_sdk.init("https://4c138e01b37142ac8a0b73f7a4f349eb@o346458.ingest.sentry.io/5861866",
default_integrations=False, integrations=[ThreadingIntegration(propagate_hub=True)],
release=version)
| 33.5 | 102 | 0.735608 | 116 | 938 | 5.801724 | 0.5 | 0.093611 | 0.059435 | 0.077266 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045963 | 0.141791 | 938 | 27 | 103 | 34.740741 | 0.790062 | 0.105544 | 0 | 0 | 0 | 0 | 0.122449 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4061946ebfbadada4a68b023604bd5475c508749 | 6,090 | py | Python | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | #
# Copyright (c) nexB Inc. and others. All rights reserved.
# ScanCode is a trademark of nexB Inc.
# SPDX-License-Identifier: Apache-2.0
# See http://www.apache.org/licenses/LICENSE-2.0 for the license text.
# See https://github.com/nexB/scancode-toolkit for support or download.
# See https://aboutcode.org for more information about nexB OSS projects.
#
import io
import os
from pathlib import Path
import saneyaml
from packagedcode import models
from packageurl import PackageURL
# TODO: Override get_package_resource so it returns the Resource that the ABOUT file is describing
TRACE = os.environ.get('SCANCODE_DEBUG_PACKAGE', False)
def logger_debug(*args):
pass
if TRACE:
import logging
import sys
logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout)
logger.setLevel(logging.DEBUG)
def logger_debug(*args):
return logger.debug(
' '.join(isinstance(a, str) and a or repr(a) for a in args)
)
class AboutFileHandler(models.DatafileHandler):
datasource_id = 'about_file'
default_package_type = 'about'
path_patterns = ('*.ABOUT',)
description = 'AboutCode ABOUT file'
documentation_url = 'https://aboutcode-toolkit.readthedocs.io/en/latest/specification.html'
@classmethod
def parse(cls, location):
"""
Yield one or more Package manifest objects given a file ``location`` pointing to a
package archive, manifest or similar.
"""
with io.open(location, encoding='utf-8') as loc:
package_data = saneyaml.load(loc.read())
# About files can contain any purl and also have a namespace
about_type = package_data.get('type')
about_ns = package_data.get('namespace')
purl_type = None
purl_ns = None
purl = package_data.get('purl')
if purl:
purl = PackageURL.from_string(purl)
if purl:
                purl_type = purl.type
                purl_ns = purl.namespace
package_type = about_type or purl_type or cls.default_package_type
package_ns = about_ns or purl_ns
name = package_data.get('name')
version = package_data.get('version')
homepage_url = package_data.get('home_url') or package_data.get('homepage_url')
download_url = package_data.get('download_url')
copyright_statement = package_data.get('copyright')
license_expression = package_data.get('license_expression')
declared_license = license_expression
owner = package_data.get('owner')
if not isinstance(owner, str):
owner = repr(owner)
parties = [models.Party(type=models.party_person, name=owner, role='owner')]
# FIXME: also include notice_file and license_file(s) as file_references
file_references = []
about_resource = package_data.get('about_resource')
if about_resource:
file_references.append(models.FileReference(path=about_resource))
# FIXME: we should put the unprocessed attributes in extra data
yield models.PackageData(
datasource_id=cls.datasource_id,
type=package_type,
namespace=package_ns,
name=name,
version=version,
declared_license=declared_license,
license_expression=license_expression,
copyright=copyright_statement,
parties=parties,
homepage_url=homepage_url,
download_url=download_url,
file_references=file_references,
)
@classmethod
def assemble(cls, package_data, resource, codebase):
"""
Yield a Package. Note that ABOUT files do not carry dependencies.
"""
datafile_path = resource.path
# do we have enough to create a package?
if package_data.purl:
package = models.Package.from_package_data(
package_data=package_data,
datafile_path=datafile_path,
)
package_uid = package.package_uid
# NOTE: we do not attach files to the Package level. Instead we
# update `for_package` in the file
resource.for_packages.append(package_uid)
resource.save(codebase)
if not package.license_expression:
package.license_expression = cls.compute_normalized_license(package)
yield package
if resource.pid is not None and package_data.file_references:
parent_resource = resource.parent(codebase)
if parent_resource and package_data.file_references:
root_path = Path(parent_resource.path)
# FIXME: we should be able to get the path relatively to the
# ABOUT file resource a file ref extends from the root of
# the filesystem
file_references_by_path = {
str(root_path / ref.path): ref
for ref in package.file_references
}
for res in parent_resource.walk(codebase):
ref = file_references_by_path.get(res.path)
if not ref:
continue
# path is found and processed: remove it, so we can
# check if we found all of them
del file_references_by_path[res.path]
res.for_packages.append(package_uid)
res.save(codebase)
yield res
# if we have left over file references, add these to extra data
if file_references_by_path:
missing = sorted(file_references_by_path.values(), key=lambda r: r.path)
package.extra_data['missing_file_references'] = missing
else:
package.extra_data['missing_file_references'] = package_data.file_references[:]
# we yield this as we do not want this further processed
yield resource
| 36.25 | 98 | 0.621182 | 715 | 6,090 | 5.106294 | 0.296504 | 0.06327 | 0.046015 | 0.02739 | 0.050397 | 0.020268 | 0 | 0 | 0 | 0 | 0 | 0.001184 | 0.306732 | 6,090 | 167 | 99 | 36.467066 | 0.863572 | 0.209852 | 0 | 0.057143 | 0 | 0 | 0.0625 | 0.014358 | 0 | 0 | 0 | 0.017964 | 0 | 1 | 0.038095 | false | 0.009524 | 0.07619 | 0.009524 | 0.180952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4062ba894ee618c56f6c5822e3859495a6c3298f | 541 | py | Python | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | from random import randint
menor = 100
linha = 0
maior = 0
m = []
for i in range(10):
    m.append([])
    for j in range(10):
        m[i].append(randint(1,99))
for i in range(10):
    for j in range(10):
        print(f'{m[i][j]:2}',end=' ')
    print()
for i in range(10):
    for j in range(10):
        if m[i][j] > maior:
            maior = m[i][j]
            linha = i
for i in range(10):
    if m[linha][i] < menor:
        menor = m[linha][i]
print(f'the minimax is {menor}, with the largest being {maior} on row {linha+1}.')
| 16.393939 | 76 | 0.51756 | 97 | 541 | 2.886598 | 0.309278 | 0.175 | 0.225 | 0.157143 | 0.346429 | 0.185714 | 0.185714 | 0.185714 | 0.185714 | 0.185714 | 0 | 0.064343 | 0.310536 | 541 | 32 | 77 | 16.90625 | 0.686327 | 0 | 0 | 0.318182 | 0 | 0 | 0.145794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.045455 | 0.136364 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
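The exercise above can be condensed with Python's built-in `max`/`min`; a sketch of the same minimax logic (find the row holding the global maximum, then take that row's smallest entry), reusing the same variable names `menor`, `maior` and `linha` over a fresh random 10×10 matrix:

```python
from random import randint

m = [[randint(1, 99) for _ in range(10)] for _ in range(10)]
# row containing the global maximum value (first such row on ties,
# matching the strict > comparison in the original loops)
linha = max(range(10), key=lambda i: max(m[i]))
maior = max(m[linha])  # the global maximum
menor = min(m[linha])  # the minimax: smallest value in that row
print(f'the minimax is {menor}, with the largest being {maior} on row {linha + 1}.')
```

The comprehension builds the matrix in one expression, and the `key=` argument lets `max` rank rows by their own maxima instead of hand-tracking `maior` and `linha`.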
406a21613d9b1dbc55f543cfe42bc9ef9b68a79c | 1,749 | py | Python | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2022-02-05T11:37:13.000Z | 2022-02-05T11:37:13.000Z | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-09-03T11:47:00.000Z | 2021-09-03T12:42:10.000Z | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-06-30T14:14:16.000Z | 2021-06-30T14:14:16.000Z | #coding:utf-8
#
# id: bugs.core_2678
# title: Full outer join cannot use available indices (very slow execution)
# description:
# tracker_id: CORE-2678
# min_versions: ['3.0']
# versions: 3.0
# qmid: None
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 3.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(sql_dialect=3, init=init_script_1)
test_script_1 = """
    create table td_data1 (
        c1 varchar(20) character set win1251 not null collate win1251,
        c2 integer not null,
        c3 date not null,
        d1 float not null
    );
    create index idx_td_data1 on td_data1(c1,c2,c3);
    commit;

    create table td_data2 (
        c1 varchar(20) character set win1251 not null collate win1251,
        c2 integer not null,
        c3 date not null,
        d2 float not null
    );
    create index idx_td_data2 on td_data2(c1,c2,c3);
    commit;

    set planonly;
    select
        d1.c1, d2.c1,
        d1.c2, d2.c2,
        d1.c3, d2.c3,
        coalesce(sum(d1.d1), 0) t1,
        coalesce(sum(d2.d2), 0) t2
    from td_data1 d1
    full join td_data2 d2
        on
            d2.c1 = d1.c1
            and d2.c2 = d1.c2
            and d2.c3 = d1.c3
    group by
        d1.c1, d2.c1,
        d1.c2, d2.c2,
        d1.c3, d2.c3;
"""
act_1 = isql_act('db_1', test_script_1, substitutions=substitutions_1)
expected_stdout_1 = """
PLAN SORT (JOIN (JOIN (D2 NATURAL, D1 INDEX (IDX_TD_DATA1)), JOIN (D1 NATURAL, D2 INDEX (IDX_TD_DATA2))))
"""
@pytest.mark.version('>=3.0')
def test_1(act_1: Action):
    act_1.expected_stdout = expected_stdout_1
    act_1.execute()
    assert act_1.clean_stdout == act_1.clean_expected_stdout
| 23.958904 | 109 | 0.619211 | 273 | 1,749 | 3.787546 | 0.326007 | 0.054159 | 0.038685 | 0.023211 | 0.255319 | 0.255319 | 0.255319 | 0.201161 | 0.201161 | 0.201161 | 0 | 0.101575 | 0.273871 | 1,749 | 72 | 110 | 24.291667 | 0.712598 | 0.142367 | 0 | 0.326531 | 0 | 0.020408 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0.020408 | 1 | 0.020408 | false | 0 | 0.040816 | 0 | 0.061224 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
406bff6901669314a484753b5d5e8d18397cb7b2 | 3,693 | py | Python | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | null | null | null | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | null | null | null | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | 1 | 2019-04-02T12:30:13.000Z | 2019-04-02T12:30:13.000Z | #!/usr/bin/env python
import psycopg2
import time
from ..models import User
class StorageManager:

    def __init__(self):
        self.conn = None
        self._connect()
        self._create_table()

    def _connect(self):
        while True:
            try:
                self.conn = psycopg2.connect(
                    host='storage',
                    database='app_storage',
                    user='admin',
                    password='admin'
                )
            except psycopg2.Error:
                print('Cannot connect to database, sleeping 3 seconds')
                time.sleep(3)
            else:
                break

    def _create_table(self):
        while True:
            try:
                cursor = self.conn.cursor()
                cursor.execute('CREATE TABLE IF NOT EXISTS users \
                    (id SERIAL PRIMARY KEY, login VARCHAR(128), \
                    email VARCHAR(128), hash_password VARCHAR(132), \
                    confirmed BOOLEAN)')
            except psycopg2.Error:
                print('Database error, reconnecting')
                self._connect()
            else:
                break

    def insert(self, user):
        '''
        If insert is success, the function returns true,
        Else, it returns false
        '''
        while True:
            try:
                if self.select(user.login, category='login') is not None:
                    return False
                cursor = self.conn.cursor()
                cursor.execute('INSERT INTO users(login, email, hash_password, confirmed) \
                    VALUES (%s, %s, %s, %s)', (user.login, user.email, user.hash_password, user.confirmed))
                self.conn.commit()
                return True
            except psycopg2.Error:
                print('Database error, reconnecting')
                time.sleep(1)
                self._connect()
            else:
                break

    def select(self, value, category='login'):
        '''
        The function returns None, if there is no user with very value of
        category, else it returns User instance
        '''
        while True:
            try:
                cursor = self.conn.cursor()
                cursor.execute('SELECT * FROM users WHERE %s = %%s' % category, (value,))
                self.conn.commit()
                fetch = cursor.fetchall()
                if len(fetch) == 0:
                    return None
                user = User(fetch[0][1], fetch[0][2])
                user.id = fetch[0][0]
                user.hash_password = fetch[0][3]
                user.confirmed = fetch[0][4]
                return user
            except psycopg2.Error:
                print('Database error, reconnecting')
                time.sleep(1)
                self._connect()
            else:
                break

    def confirm(self, value, category='login'):
        '''
        The function sets \'confirmed\' parameter of the user with very value
        of category as True\n
        If such user not found, returns False, else returns True
        '''
        while True:
            try:
                if self.select(value, category=category) is not None:
                    cursor = self.conn.cursor()
                    cursor.execute('UPDATE users SET confirmed = TRUE WHERE %s = %%s' % category, (value,))
                    self.conn.commit()
                    return True
                else:
                    return False
            except psycopg2.Error:
                print('Database error, reconnecting')
                time.sleep(1)
                self._connect()
            else:
                break
| 33.572727 | 107 | 0.479285 | 360 | 3,693 | 4.863889 | 0.272222 | 0.041119 | 0.034266 | 0.068532 | 0.423187 | 0.390634 | 0.256996 | 0.229012 | 0.190177 | 0.138778 | 0 | 0.015195 | 0.429732 | 3,693 | 109 | 108 | 33.880734 | 0.816239 | 0.093149 | 0 | 0.563218 | 0 | 0 | 0.087238 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0.057471 | 0.034483 | 0 | 0.183908 | 0.057471 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
40771f48cc35e55bf1ed0377d840f200b12f6982 | 739 | py | Python | Use.py | XtremeCoder1384/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | 1 | 2019-03-04T02:26:41.000Z | 2019-03-04T02:26:41.000Z | Use.py | XtremeCoder1384/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | 1 | 2018-12-20T02:32:35.000Z | 2019-03-11T12:51:15.000Z | Use.py | IngeniousCoder/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | null | null | null | import os
import youtube_dl
os.system("setup.bat")
playlist = input("Paste the Youtube Playlist URL Here.")
track = 1
print("""THIS TOOL WILL ATTEMPT TO DOWNLOAD THE FIRST 1000 SONGS IN THE QUEUE.\n
PLEASE DO NOT INTERRUPT THE TOOL.
YOU MAY CLOSE THE TOOL WHEN IT DISPLAYS "DONE!".
ALL DOWNLOADED SONGS WILL BE IN THE SAME DIRECTORY THIS FILE IS IN.
TO EXTRACT THEM, FILTER BY MP3.""")
for x in range(1000):
    file = open("Downloader.bat","w")
    file.write("youtube-dl -x --playlist-start {} --audio-format mp3 --playlist-end {} {}".format(str(track),str(track),playlist))
    file.close()  # must be called; a bare `file.close` only references the method
    os.system("Downloader.bat")
    track = track + 1
print("DONE! You may now close this window.")
| 36.95 | 129 | 0.663058 | 113 | 739 | 4.327434 | 0.575221 | 0.03681 | 0.04499 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020725 | 0.216509 | 739 | 19 | 130 | 38.894737 | 0.823834 | 0 | 0 | 0 | 0 | 0.058824 | 0.656944 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.117647 | 0 | 0.117647 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
407a65f9c4b9f958fde5ab42bad4bdd15788bb31 | 4,046 | py | Python | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 2 | 2021-07-31T20:52:37.000Z | 2022-02-15T21:05:17.000Z | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 2 | 2021-08-25T16:16:43.000Z | 2022-02-10T05:26:14.000Z | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 1 | 2019-05-21T15:31:24.000Z | 2019-05-21T15:31:24.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import numpy as np
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import ClassificationMetric
def test_generalized_entropy_index():
    data = np.array([[0, 1],
                     [0, 0],
                     [1, 0],
                     [1, 1],
                     [1, 0],
                     [1, 0],
                     [2, 1],
                     [2, 0],
                     [2, 1],
                     [2, 1]])
    pred = data.copy()
    pred[[3, 9], -1] = 0
    pred[[4, 5], -1] = 1
    df = pd.DataFrame(data, columns=['feat', 'label'])
    df2 = pd.DataFrame(pred, columns=['feat', 'label'])
    bld = BinaryLabelDataset(df=df, label_names=['label'],
                             protected_attribute_names=['feat'])
    bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
                              protected_attribute_names=['feat'])
    cm = ClassificationMetric(bld, bld2)

    assert cm.generalized_entropy_index() == 0.2

    pred = data.copy()
    pred[:, -1] = np.array([0, 1, 1, 0, 0, 0, 0, 1, 1, 1])
    df2 = pd.DataFrame(pred, columns=['feat', 'label'])
    bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
                              protected_attribute_names=['feat'])
    cm = ClassificationMetric(bld, bld2)

    assert cm.generalized_entropy_index() == 0.3


def test_theil_index():
    data = np.array([[0, 1],
                     [0, 0],
                     [1, 0],
                     [1, 1],
                     [1, 0],
                     [1, 0],
                     [2, 1],
                     [2, 0],
                     [2, 1],
                     [2, 1]])
    pred = data.copy()
    pred[[3, 9], -1] = 0
    pred[[4, 5], -1] = 1
    df = pd.DataFrame(data, columns=['feat', 'label'])
    df2 = pd.DataFrame(pred, columns=['feat', 'label'])
    bld = BinaryLabelDataset(df=df, label_names=['label'],
                             protected_attribute_names=['feat'])
    bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
                              protected_attribute_names=['feat'])
    cm = ClassificationMetric(bld, bld2)

    assert cm.theil_index() == 4*np.log(2)/10


def test_between_all_groups():
    data = np.array([[0, 1],
                     [0, 0],
                     [1, 0],
                     [1, 1],
                     [1, 0],
                     [1, 0],
                     [2, 1],
                     [2, 0],
                     [2, 1],
                     [2, 1]])
    pred = data.copy()
    pred[[3, 9], -1] = 0
    pred[[4, 5], -1] = 1
    df = pd.DataFrame(data, columns=['feat', 'label'])
    df2 = pd.DataFrame(pred, columns=['feat', 'label'])
    bld = BinaryLabelDataset(df=df, label_names=['label'],
                             protected_attribute_names=['feat'])
    bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
                              protected_attribute_names=['feat'])
    cm = ClassificationMetric(bld, bld2)

    b = np.array([1, 1, 1.25, 1.25, 1.25, 1.25, 0.75, 0.75, 0.75, 0.75])

    assert cm.between_all_groups_generalized_entropy_index() == 1/20*np.sum(b**2 - 1)


def test_between_group():
    data = np.array([[0, 0, 1],
                     [0, 1, 0],
                     [1, 1, 0],
                     [1, 1, 1],
                     [1, 0, 0],
                     [1, 0, 0]])
    pred = data.copy()
    pred[[0, 3], -1] = 0
    pred[[4, 5], -1] = 1
    df = pd.DataFrame(data, columns=['feat', 'feat2', 'label'])
    df2 = pd.DataFrame(pred, columns=['feat', 'feat2', 'label'])
    bld = BinaryLabelDataset(df=df, label_names=['label'],
                             protected_attribute_names=['feat', 'feat2'])
    bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
                              protected_attribute_names=['feat', 'feat2'])
    cm = ClassificationMetric(bld, bld2, unprivileged_groups=[{'feat': 0}],
                              privileged_groups=[{'feat': 1}])

    b = np.array([0.5, 0.5, 1.25, 1.25, 1.25, 1.25])

    assert cm.between_group_generalized_entropy_index() == 1/12*np.sum(b**2 - 1)
| 34.87931 | 85 | 0.505685 | 496 | 4,046 | 3.979839 | 0.129032 | 0.02229 | 0.018237 | 0.109422 | 0.698582 | 0.681864 | 0.675785 | 0.641337 | 0.624113 | 0.624113 | 0 | 0.077936 | 0.324518 | 4,046 | 115 | 86 | 35.182609 | 0.644347 | 0 | 0 | 0.696078 | 0 | 0 | 0.04696 | 0 | 0 | 0 | 0 | 0 | 0.04902 | 1 | 0.039216 | false | 0 | 0.078431 | 0 | 0.117647 | 0.009804 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
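The 0.2 in the first assertion can be reproduced by hand: `generalized_entropy_index` defaults to α = 2 over the per-individual benefit bᵢ = ŷᵢ − yᵢ + 1, i.e. (1/2n)·Σ((bᵢ/μ)² − 1). A sketch using the label columns of `data` and the first `pred` above, without aif360 itself (the α = 2 form here is an assumption matching the library's documented default):

```python
import numpy as np

y = np.array([1, 0, 0, 1, 0, 0, 1, 0, 1, 1])      # 'label' column of data
y_hat = np.array([1, 0, 0, 0, 1, 1, 1, 0, 1, 0])  # 'label' column of the first pred
b = y_hat - y + 1                                  # per-individual benefit vector
mu = b.mean()                                      # mean benefit (1.0 here)
ge2 = np.sum((b / mu) ** 2 - 1) / (2 * len(b))     # alpha = 2 generalized entropy
print(ge2)  # 0.2
```

Two wrong predictions lower the benefit to 0 and two raise it to 2, and those four deviations from μ = 1 are exactly what the index measures.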
40826ce560682ad3ad560f8fecc12e0ab6658bc0 | 767 | py | Python | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | 1 | 2020-12-04T07:38:16.000Z | 2020-12-04T07:38:16.000Z | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | null | null | null | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | null | null | null | class Solution:
    def combinationSum(self, candidates, target):
        def lookup(candidates, index, target, combine, result):
            if target == 0:
                result.append(combine)
                return
            if index >= len(candidates) and target > 0:
                return
            if target >= candidates[index]:
                lookup(candidates, index, target - candidates[index], list(combine) + [candidates[index]], result)
            lookup(candidates, index + 1, target, list(combine), result)

        candidates.sort()  # sort in place; a bare sorted(candidates) would discard its result
        result = []
        lookup(candidates, 0, target, [], result)
        return result
s = Solution()
print(s.combinationSum([2,3,6,7], 7))
print(s.combinationSum([2,3,5], 8))
| 34.863636 | 114 | 0.555411 | 79 | 767 | 5.392405 | 0.35443 | 0.211268 | 0.147887 | 0.126761 | 0.103286 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025097 | 0.324641 | 767 | 21 | 115 | 36.52381 | 0.797297 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
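For the two sample calls, the backtracking above prints [[2, 2, 3], [7]] and [[2, 2, 2, 2], [2, 3, 3], [3, 5]]. A small sanity checker for any such result list (a sketch; `check` is a hypothetical helper, not part of the original file):

```python
def check(candidates, target, combos):
    # every combination sums to the target
    assert all(sum(c) == target for c in combos)
    # only numbers drawn from the candidate list are used
    assert all(set(c) <= set(candidates) for c in combos)
    # no combination is repeated (order-insensitive comparison)
    assert len({tuple(sorted(c)) for c in combos}) == len(combos)

check([2, 3, 6, 7], 7, [[2, 2, 3], [7]])
check([2, 3, 5], 8, [[2, 2, 2, 2], [2, 3, 3], [3, 5]])
```

Uniqueness holds because each recursive call may reuse the current candidate but never revisits earlier ones, so every combination is generated in non-decreasing order exactly once.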
4082bcb5f99112c93d2d504f08622c615955a33b | 1,204 | py | Python | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | 1 | 2015-03-04T14:06:33.000Z | 2015-03-04T14:06:33.000Z | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | 2 | 2015-03-04T02:48:18.000Z | 2015-03-04T14:18:32.000Z | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
__doc__ = '''
Crawl comment from nicovideo.jp
Usage:
    crawl_comments.py --url <url> --mail <mail> --pass <pass> [--sqlite <sqlite>] [--csv <csv>]

Options:
    --url <url>
    --mail <mail>
    --pass <pass>
    --sqlite <sqlite>  (optional) path of comment DB [default: comments.sqlite3]
    --csv <csv>  (optional) path of csv file containing urls of videos [default: crawled.csv]
'''
from docopt import docopt
from nicocrawler.nicocrawler import NicoCrawler
if __name__ == '__main__':
    # Get the command-line arguments
    args = docopt(__doc__)

    url_channel_toppage = args['--url']
    login_mail = args['--mail']
    login_pass = args['--pass']
    path_sqlite = args['--sqlite']
    path_csv = args['--csv']

    ncrawler = NicoCrawler(login_mail, login_pass)
    ncrawler.connect_sqlite(path_sqlite)

    df = ncrawler.get_all_video_url_of_season(url_channel_toppage)
    ncrawler.initialize_csv_from_db(path_csv)

    # # Fetch the videos ranked 1-300 in the daily ranking
    # url = 'http://www.nicovideo.jp/ranking/fav/daily/all'
    # ncrawler.initialize_csv_from_url(url, path_csv, max_page=3)

    # ncrawler.get_all_comments_of_csv(path_csv, max_n_iter=1)
| 26.173913 | 102 | 0.671096 | 158 | 1,204 | 4.797468 | 0.411392 | 0.036939 | 0.026385 | 0.036939 | 0.08971 | 0.08971 | 0.08971 | 0.08971 | 0 | 0 | 0 | 0.008188 | 0.188538 | 1,204 | 45 | 103 | 26.755556 | 0.767656 | 0.208472 | 0 | 0 | 0 | 0.083333 | 0.450794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
408784a24cae84367d1864aa02a8ff6e4a8e197a | 1,109 | py | Python | bot/conversation_handlers/stage01.py | gerbigtim/coaching_bot | 5b4ef6e207a5017f7b4274d8238550b4988d0a6e | [
"MIT"
] | null | null | null | bot/conversation_handlers/stage01.py | gerbigtim/coaching_bot | 5b4ef6e207a5017f7b4274d8238550b4988d0a6e | [
"MIT"
] | null | null | null | bot/conversation_handlers/stage01.py | gerbigtim/coaching_bot | 5b4ef6e207a5017f7b4274d8238550b4988d0a6e | [
"MIT"
] | null | null | null | # imports
from telegram.ext import (
    CommandHandler,
    MessageHandler,
    Filters,
    ConversationHandler,
)

from handler_functions.start import start
from handler_functions.bio import bio
from handler_functions.gender import gender
from handler_functions.photo import photo, skip_photo
from handler_functions.location import location, skip_location
from handler_functions.cancel import cancel
from conversation_handlers.stage_constants import *

# Adds conversation handler with the states GENDER, PHOTO, LOCATION and BIO for stage 1 of the sign up
conv_handler = ConversationHandler(
    entry_points=[CommandHandler('start', start)],
    states={
        GENDER: [MessageHandler(Filters.regex('^(Gentleman|Lady|I am a unicorn.)$'), gender)],
        PHOTO: [MessageHandler(Filters.photo, photo), CommandHandler('skip', skip_photo)],
        LOCATION: [
            MessageHandler(Filters.location, location),
            CommandHandler('skip', skip_location),
        ],
        BIO: [MessageHandler(Filters.text & ~Filters.command, bio)],
    },
    fallbacks=[CommandHandler('cancel', cancel)],
) | 38.241379 | 102 | 0.734896 | 122 | 1,109 | 6.565574 | 0.368852 | 0.082397 | 0.149813 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001087 | 0.170424 | 1,109 | 29 | 103 | 38.241379 | 0.869565 | 0.097385 | 0 | 0 | 0 | 0 | 0.053053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
4087ac882a0e642cb2645d67bfb2e7473130d2e9 | 265 | py | Python | python100days/day03/conversion.py | lanSeFangZhou/pythonbase | f4daa373573b2fc0a59a5eb919d02eddf5914e18 | [
"Apache-2.0"
] | null | null | null | python100days/day03/conversion.py | lanSeFangZhou/pythonbase | f4daa373573b2fc0a59a5eb919d02eddf5914e18 | [
"Apache-2.0"
] | 1 | 2021-06-02T00:58:26.000Z | 2021-06-02T00:58:26.000Z | python100days/day03/conversion.py | lanSeFangZhou/pythonbase | f4daa373573b2fc0a59a5eb919d02eddf5914e18 | [
"Apache-2.0"
] | null | null | null | # Convert between imperial inches and metric centimeters
value = float(input('Enter the length: '))
unit = input('Enter the unit: ')
if unit == 'in' or unit == '英寸':
    print('%f in = %f cm' % (value, value * 2.54))
elif unit == '厘米' or unit == 'cm':
    print('%f cm = %f in' % (value, value / 2.54))
else:
    print('Please enter a valid unit') | 26.5 | 49 | 0.558491 | 38 | 265 | 3.894737 | 0.578947 | 0.081081 | 0.148649 | 0.175676 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028302 | 0.2 | 265 | 10 | 50 | 26.5 | 0.669811 | 0.056604 | 0 | 0 | 0 | 0 | 0.204819 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
4088dc579c34d53321481174879bd2850ab8f43e | 485 | py | Python | tests/models/test_dtfactory.py | surajsjain/ocean.py | 2e853db94d9aee2a0cf6b3d58f714215b83d917b | [
"Apache-2.0"
] | 4 | 2021-07-05T20:21:41.000Z | 2021-09-02T14:13:26.000Z | tests/models/test_dtfactory.py | surajsjain/ocean.py | 2e853db94d9aee2a0cf6b3d58f714215b83d917b | [
"Apache-2.0"
] | null | null | null | tests/models/test_dtfactory.py | surajsjain/ocean.py | 2e853db94d9aee2a0cf6b3d58f714215b83d917b | [
"Apache-2.0"
] | 1 | 2021-03-25T15:04:12.000Z | 2021-03-25T15:04:12.000Z | from ocean_lib.models.data_token import DataToken
from ocean_lib.models.dtfactory import DTFactory
from ocean_lib.ocean.util import to_base_18
def test1(network, alice_wallet, dtfactory_address):
    dtfactory = DTFactory(dtfactory_address)
    dt_address = dtfactory.createToken('foo_blob', 'DT1', 'DT1', to_base_18(1000), from_wallet=alice_wallet)
    dt = DataToken(dtfactory.get_token_address(dt_address))
    assert isinstance(dt, DataToken)
    assert dt.blob() == 'foo_blob'
| 37.307692 | 108 | 0.781443 | 68 | 485 | 5.294118 | 0.411765 | 0.075 | 0.1 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025822 | 0.121649 | 485 | 12 | 109 | 40.416667 | 0.819249 | 0 | 0 | 0 | 0 | 0 | 0.045361 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
40997199af5c3427ea68e5bd37b9d827653408fe | 14,709 | py | Python | src/toil/jobStores/abstractJobStore.py | adamnovak/toil | 3a81f1114ec7f347e6e7bfd861073d897a9188ec | [
"Apache-2.0"
] | null | null | null | src/toil/jobStores/abstractJobStore.py | adamnovak/toil | 3a81f1114ec7f347e6e7bfd861073d897a9188ec | [
"Apache-2.0"
] | null | null | null | src/toil/jobStores/abstractJobStore.py | adamnovak/toil | 3a81f1114ec7f347e6e7bfd861073d897a9188ec | [
"Apache-2.0"
] | null | null | null | # Copyright (C) 2015 UCSC Computational Genomics Lab
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from abc import ABCMeta, abstractmethod
from contextlib import contextmanager
import re
try:
import cPickle
except ImportError:
import pickle as cPickle
class NoSuchJobException( Exception ):
def __init__( self, jobStoreID ):
super( NoSuchJobException, self ).__init__( "The job '%s' does not exist" % jobStoreID )
class ConcurrentFileModificationException( Exception ):
def __init__( self, jobStoreFileID ):
super( ConcurrentFileModificationException, self ).__init__(
'Concurrent update to file %s detected.' % jobStoreFileID )
class NoSuchFileException( Exception ):
def __init__( self, fileJobStoreID ):
super( NoSuchFileException, self ).__init__( "The file '%s' does not exist" % fileJobStoreID )
class JobStoreCreationException( Exception ):
def __init__( self, message ):
super( JobStoreCreationException, self ).__init__( message )
class AbstractJobStore( object ):
"""
Represents the physical storage for the jobs and associated files in a toil.
"""
__metaclass__ = ABCMeta
def __init__( self, config=None ):
"""
:param config: If config is not None then the
given configuration object will be written to the shared file "config.pickle" which can
later be retrieved using the readSharedFileStream. See writeConfigToStore.
If this file already exists it will be overwritten. If config is None,
the shared file "config.pickle" is assumed to exist and is retrieved. See loadConfigFromStore.
"""
#Now get on with reading or writing the config
if config is None:
with self.readSharedFileStream( "config.pickle", isProtected=False ) as fileHandle:
self.__config = cPickle.load(fileHandle)
else:
self.__config = config
self.writeConfigToStore()
def writeConfigToStore(self):
"""
Re-writes the config attribute to the jobStore, so that its values can be retrieved
if the jobStore is reloaded.
"""
with self.writeSharedFileStream( "config.pickle", isProtected=False ) as fileHandle:
cPickle.dump(self.__config, fileHandle, cPickle.HIGHEST_PROTOCOL)
@property
def config( self ):
return self.__config
@staticmethod
def _checkJobStoreCreation(create, exists, jobStoreString):
"""
Consistency checks which will result in exceptions if we attempt to overwrite an existing
jobStore.
:type create: boolean
:type exists: boolean
:raise JobStoreCreationException: Thrown if create=True and exists=True or create=False
and exists=False
"""
if create and exists:
raise JobStoreCreationException("The job store '%s' already exists. "
"Use --restart or 'toil restart' to resume this jobStore, "
"else remove it to start from scratch" % jobStoreString)
if not create and not exists:
raise JobStoreCreationException("The job store '%s' does not exist, so there "
"is nothing to restart." % jobStoreString)
@abstractmethod
def deleteJobStore( self ):
"""
Removes the jobStore from the disk/store. Careful!
"""
raise NotImplementedError( )
##Cleanup functions
def clean(self):
"""
Function to cleanup the state of a jobStore after a restart.
Fixes jobs that might have been partially updated.
Resets the try counts.
"""
#Collate any jobs that were in the process of being created/deleted
jobsToDelete = set()
for job in self.jobs():
for updateID in job.jobsToDelete:
jobsToDelete.add(updateID)
#Delete the jobs that should be deleted
if len(jobsToDelete) > 0:
for job in self.jobs():
if job.updateID in jobsToDelete:
self.delete(job.jobStoreID)
#Cleanup the state of each job
for job in self.jobs():
changed = False #Flag to indicate if we need to update the job
#on disk
if len(job.jobsToDelete) != 0:
job.jobsToDelete = set()
changed = True
#While jobs at the end of the stack are already deleted remove
#those jobs from the stack (this cleans up the case that the job
#had successors to run, but had not been updated to reflect this)
while len(job.stack) > 0:
jobs = [ command for command in job.stack[-1] if self.exists(command[0]) ]
if len(jobs) < len(job.stack[-1]):
changed = True
if len(jobs) > 0:
job.stack[-1] = jobs
break
else:
job.stack.pop()
else:
break
#Reset the retry count of the job
if job.remainingRetryCount < self._defaultTryCount():
job.remainingRetryCount = self._defaultTryCount()
changed = True
#This cleans the old log file which may
#have been left if the job is being retried after a job failure.
if job.logJobStoreFileID != None:
job.clearLogFile(self)
changed = True
if changed: #Update, but only if a change has occurred
self.update(job)
#Remove any crufty stats/logging files from the previous run
self.readStatsAndLogging(lambda x : None)
##########################################
#The following methods deal with creating/loading/updating/writing/checking for the
#existence of jobs
##########################################
@abstractmethod
def create( self, command, memory, cores, disk, updateID=None,
predecessorNumber=0 ):
"""
Creates a job, adding it to the store.
Command, memory, cores, updateID, predecessorNumber
are all arguments to the job's constructor.
:rtype : toil.jobWrapper.JobWrapper
"""
raise NotImplementedError( )
@abstractmethod
def exists( self, jobStoreID ):
"""
Returns true if the job is in the store, else false.
:rtype : bool
"""
raise NotImplementedError( )
@abstractmethod
def getPublicUrl( self, FileName):
"""
Returns a publicly accessible URL to the given file in the job store.
The returned URL starts with 'http:', 'https:' or 'file:'.
The returned URL may expire as early as 1h after its been returned.
Throw an exception if the file does not exist.
:param jobStoreFileID:
:return:
"""
raise NotImplementedError()
@abstractmethod
def getSharedPublicUrl( self, jobStoreFileID):
"""
Returns a publicly accessible URL to the given file in the job store.
The returned URL starts with 'http:', 'https:' or 'file:'.
The returned URL may expire as early as 1h after its been returned.
Throw an exception if the file does not exist.
:param jobStoreFileID:
:return:
"""
raise NotImplementedError()
@abstractmethod
def load( self, jobStoreID ):
"""
Loads a job for the given jobStoreID and returns it.
:rtype: toil.jobWrapper.JobWrapper
:raises: NoSuchJobException if there is no job with the given jobStoreID
"""
raise NotImplementedError( )
@abstractmethod
def update( self, job ):
"""
Persists the job in this store atomically.
"""
raise NotImplementedError( )
@abstractmethod
def delete( self, jobStoreID ):
"""
Removes from store atomically, can not then subsequently call load(), write(), update(),
etc. with the job.
This operation is idempotent, i.e. deleting a job twice or deleting a non-existent job
will succeed silently.
"""
raise NotImplementedError( )
    @abstractmethod
    def jobs(self):
        """
        Returns an iterator over the jobs in the store.
        :rtype: iterator
        """
raise NotImplementedError( )
##########################################
    #The following provide a way of creating/reading/writing/updating files
#associated with a given job.
##########################################
@abstractmethod
def writeFile( self, localFilePath, jobStoreID=None ):
"""
Takes a file (as a path) and places it in this job store. Returns an ID that can be used
to retrieve the file at a later time.
jobStoreID is the id of a job, or None. If specified, when delete(job)
is called all files written with the given job.jobStoreID will be
removed from the jobStore.
"""
raise NotImplementedError( )
@abstractmethod
@contextmanager
def writeFileStream( self, jobStoreID=None ):
"""
Similar to writeFile, but returns a context manager yielding a tuple of
1) a file handle which can be written to and 2) the ID of the resulting
file in the job store. The yielded file handle does not need to and
should not be closed explicitly.
"""
raise NotImplementedError( )
@abstractmethod
def getEmptyFileStoreID( self, jobStoreID=None ):
"""
        Creates a new, empty file in the job store and returns its ID.
        jobStoreID is the id of a job, or None. If specified, when delete(job)
        is called all files written with the given job.jobStoreID will be
        removed from the jobStore.
        A call to fileExists(getEmptyFileStoreID(jobStoreID)) will return True.
        :rtype : string, the ID of the new, empty file
"""
raise NotImplementedError( )
@abstractmethod
def readFile( self, jobStoreFileID, localFilePath ):
"""
Copies the file referenced by jobStoreFileID to the given local file path. The version
will be consistent with the last copy of the file written/updated.
"""
raise NotImplementedError( )
@abstractmethod
@contextmanager
def readFileStream( self, jobStoreFileID ):
"""
Similar to readFile, but returns a context manager yielding a file handle which can be
read from. The yielded file handle does not need to and should not be closed explicitly.
"""
raise NotImplementedError( )
@abstractmethod
def deleteFile( self, jobStoreFileID ):
"""
Deletes the file with the given ID from this job store.
This operation is idempotent, i.e. deleting a file twice or deleting a non-existent file
will succeed silently.
"""
raise NotImplementedError( )
@abstractmethod
def fileExists(self, jobStoreFileID ):
"""
        Returns True if the jobStoreFileID exists in the jobStore, else False.
        :rtype : bool
"""
raise NotImplementedError()
@abstractmethod
def updateFile( self, jobStoreFileID, localFilePath ):
"""
Replaces the existing version of a file in the jobStore. Throws an exception if the file
does not exist.
:raises ConcurrentFileModificationException: if the file was modified concurrently during
an invocation of this method
"""
raise NotImplementedError( )
##########################################
#The following methods deal with shared files, i.e. files not associated
#with specific jobs.
##########################################
sharedFileNameRegex = re.compile( r'^[a-zA-Z0-9._-]+$' )
# FIXME: Rename to updateSharedFileStream
@abstractmethod
@contextmanager
def writeSharedFileStream( self, sharedFileName, isProtected=True ):
"""
Returns a context manager yielding a writable file handle to the global file referenced
by the given name.
        :param sharedFileName: A file name matching AbstractJobStore.sharedFileNameRegex, unique
               within the physical storage represented by this job store
:raises ConcurrentFileModificationException: if the file was modified concurrently during
an invocation of this method
"""
raise NotImplementedError( )
@abstractmethod
@contextmanager
def readSharedFileStream( self, sharedFileName, isProtected=True ):
"""
Returns a context manager yielding a readable file handle to the global file referenced
by the given name.
"""
raise NotImplementedError( )
@abstractmethod
def writeStatsAndLogging( self, statsAndLoggingString ):
"""
Adds the given statistics/logging string to the store of statistics info.
"""
raise NotImplementedError( )
@abstractmethod
def readStatsAndLogging( self, statsAndLoggingCallBackFn):
"""
        Reads the stats/logging strings accumulated by the writeStatsAndLogging function.
        For each stats/logging file, calls statsAndLoggingCallBackFn with
        an open, readable file handle that can be used to parse the stats.
Returns the number of stat/logging strings processed.
Stats/logging files are only read once and are removed from the
file store after being written to the given file handle.
"""
raise NotImplementedError( )
## Helper methods for subclasses
def _defaultTryCount( self ):
return int( self.config.retryCount+1 )
@classmethod
def _validateSharedFileName( cls, sharedFileName ):
return bool( cls.sharedFileNameRegex.match( sharedFileName ) )
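The shared-file-name validation above can be exercised standalone; a minimal sketch reusing the same regular expression (the module-level names here are illustrative, not part of the class):

```python
import re

# Same pattern as AbstractJobStore.sharedFileNameRegex above
shared_file_name_regex = re.compile(r'^[a-zA-Z0-9._-]+$')

def validate_shared_file_name(shared_file_name):
    """Mirrors _validateSharedFileName for illustration."""
    return bool(shared_file_name_regex.match(shared_file_name))

print(validate_shared_file_name('stats.log'))  # True
print(validate_shared_file_name('bad/name'))   # False ('/' is not allowed)
```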
| 37.143939 | 102 | 0.615066 | 1,595 | 14,709 | 5.634483 | 0.253919 | 0.056081 | 0.067653 | 0.059308 | 0.258039 | 0.212529 | 0.191165 | 0.172916 | 0.161344 | 0.161344 | 0 | 0.002351 | 0.306003 | 14,709 | 395 | 103 | 37.237975 | 0.878037 | 0.441634 | 0 | 0.375 | 0 | 0 | 0.048501 | 0 | 0 | 0 | 0 | 0.002532 | 0 | 1 | 0.210526 | false | 0 | 0.046053 | 0.019737 | 0.322368 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40a5c13d7bfe8ebdc535f6e928718db2cd73a81f | 623 | py | Python | src/11/11367.py | youngdaLee/Baekjoon | 7d858d557dbbde6603fe4e8af2891c2b0e1940c0 | [
"MIT"
] | 11 | 2020-09-20T15:17:11.000Z | 2022-03-17T12:43:33.000Z | src/11/11367.py | youngdaLee/Baekjoon | 7d858d557dbbde6603fe4e8af2891c2b0e1940c0 | [
"MIT"
] | 3 | 2021-10-30T07:51:36.000Z | 2022-03-09T05:19:23.000Z | src/11/11367.py | youngdaLee/Baekjoon | 7d858d557dbbde6603fe4e8af2891c2b0e1940c0 | [
"MIT"
] | 13 | 2021-01-21T03:19:08.000Z | 2022-03-28T10:44:58.000Z | """
11367. Report Card Time
Author: xCrypt0r
Language: Python 3
Memory used: 29,380 KB
Time taken: 64 ms
Solved on: September 18, 2020
"""
def main():
for _ in range(int(input())):
name, score = input().split()
score = int(score)
if score < 60: grade = 'F'
elif score < 67: grade = 'D'
elif score < 70: grade = 'D+'
elif score < 77: grade = 'C'
elif score < 80: grade = 'C+'
elif score < 87: grade = 'B'
elif score < 90: grade = 'B+'
elif score < 97: grade = 'A'
else: grade = 'A+'
print(name + ' ' + grade)
if __name__ == '__main__':
main()
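The if/elif grade ladder above can equivalently be expressed as a threshold-table lookup; a sketch using the standard-library `bisect` module (the function and table names are illustrative):

```python
import bisect

CUTOFFS = [60, 67, 70, 77, 80, 87, 90, 97]
GRADES = ['F', 'D', 'D+', 'C', 'C+', 'B', 'B+', 'A', 'A+']

def grade_for(score):
    # bisect_right counts how many cutoffs the score meets or exceeds,
    # which is exactly the index into the grade table
    return GRADES[bisect.bisect_right(CUTOFFS, score)]

print(grade_for(59))   # F
print(grade_for(60))   # D
print(grade_for(97))   # A+
```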
| 20.096774 | 37 | 0.499197 | 87 | 623 | 3.471264 | 0.586207 | 0.208609 | 0.066225 | 0.099338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091358 | 0.34992 | 623 | 30 | 38 | 20.766667 | 0.654321 | 0.163724 | 0 | 0 | 0 | 0 | 0.042969 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0 | 0 | 0.0625 | 0.0625 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40ac4ec777b7bc387be14a996d46bdf5f0da5291 | 2,416 | py | Python | tests.py | ckelly/pybingmaps | 9214e3a4c2c9e756848fac7c0d76c46dcc64b65d | [
"MIT"
] | null | null | null | tests.py | ckelly/pybingmaps | 9214e3a4c2c9e756848fac7c0d76c46dcc64b65d | [
"MIT"
] | null | null | null | tests.py | ckelly/pybingmaps | 9214e3a4c2c9e756848fac7c0d76c46dcc64b65d | [
"MIT"
] | null | null | null | import unittest
import random
from time import sleep
import os
from bingmaps import *
class BingMapsTestError(Exception):
"""Bing Maps test exception"""
def __init__(self, reason):
        self.reason = str(reason)
def __str__(self):
return self.reason
# TODO: enter your key for testing
api_key = ''
class DirectionsTests(unittest.TestCase):
def setUp(self):
self.api = BingMapsAPI(api_key=api_key)
def testBasicNav(self):
# start - 717 Market St
# end - Ferry Plaza, San Francisco, CA
# we shrunk the precision to match return values for easier comparison
start_lat = "37.786861"
start_lon = "-122.403689"
end_lat = "37.795556"
end_lon = "-122.392124"
start = start_lat+","+start_lon
end = end_lat+","+end_lon
ret = self.api.routes(waypoints=[start, end])
# verify start and end points are reflected in response
self.assertNotEqual(ret, {})
estimated_total = ret['resourceSets'][0]['estimatedTotal']
self.assertEqual(estimated_total, 1)
routeLegs = ret['resourceSets'][0]['resources'][0]['routeLegs']
self.assertEqual(len(routeLegs), 1)
itinerary_items = routeLegs[0]['itineraryItems']
self.assertNotEqual(itinerary_items, [])
# skip the last step, as it doesn't have a transport Mode
        for i in itinerary_items[:-1]:
self.assertEqual(i['details'][0]['mode'], 'Driving')
def testPedestrianNav(self):
start_lat = "37.7868609332517"
start_lon = "-122.403689949149"
end_lat = "37.795556930015"
end_lon = "-122.392124051039"
start = start_lat+","+start_lon
end = end_lat+","+end_lon
ret = self.api.routes(waypoints=[start,end], travelMode='Walking')
self.assertNotEqual(ret, {})
legs = ret['resourceSets'][0]['resources'][0]['routeLegs']
self.assertNotEqual(legs, [])
legs = legs[0]
itinerary_items = legs['itineraryItems']
self.assertNotEqual(itinerary_items, [])
# skip the last step, as it doesn't have a transport Mode
        for i in itinerary_items[:-1]:
self.assertEqual(i['details'][0]['mode'], 'Walking')
if __name__ == '__main__':
unittest.main() | 29.82716 | 78 | 0.598096 | 272 | 2,416 | 5.154412 | 0.400735 | 0.059914 | 0.034237 | 0.025678 | 0.349501 | 0.349501 | 0.349501 | 0.293866 | 0.293866 | 0.293866 | 0 | 0.061921 | 0.284768 | 2,416 | 81 | 79 | 29.82716 | 0.749421 | 0.146109 | 0 | 0.204082 | 0 | 0 | 0.133528 | 0 | 0 | 0 | 0 | 0.012346 | 0.183673 | 1 | 0.102041 | false | 0 | 0.102041 | 0.020408 | 0.265306 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40b182cffd8ba6689e9b3d11caa57c733d863c65 | 2,646 | py | Python | supervised_learning/analysis.py | gonzalezJohnas/SpeechCommand-recognition | d5351abe45c571a075c24bd04d328e76293f9230 | [
"MIT"
] | null | null | null | supervised_learning/analysis.py | gonzalezJohnas/SpeechCommand-recognition | d5351abe45c571a075c24bd04d328e76293f9230 | [
"MIT"
] | 2 | 2021-04-10T18:12:44.000Z | 2022-02-09T23:36:43.000Z | supervised_learning/analysis.py | gonzalezJohnas/SpeechCommand-recognition | d5351abe45c571a075c24bd04d328e76293f9230 | [
"MIT"
] | null | null | null | from global_utils import *
# target word
TARGET_WORD = 'right'
def display_lowpass_normal(wav, lowpass_signal, fs, label=''):
fig, (axs_raw, axs_low) = plt.subplots(2)
fig.tight_layout(pad=3.0)
fig.set_figheight(FIG_HEIGHT)
fig.set_figwidth(FIG_WIDTH)
# display the plot
axs_raw.plot(wav)
# label the axes
axs_raw.set_ylabel("Amplitude", fontsize=FONT_SIZE)
axs_raw.set_xlabel("Time", fontsize=FONT_SIZE)
# set the title
axs_raw.set_title("Audio sample : {}".format(label), fontsize=FONT_SIZE)
axs_low.plot(lowpass_signal)
# label the axes
axs_low.set_ylabel("Amplitude", fontsize=FONT_SIZE)
axs_low.set_xlabel("Time", fontsize=FONT_SIZE)
# set the title
axs_low.set_title("Audio sample with low pass filter", fontsize=FONT_SIZE)
f_raw, periodogram_raw = signal.periodogram(wav, fs)
f_raw, periodogram_low = signal.periodogram(lowpass_signal, fs)
fig, (axs_periodogram_raw, axs_periodogram_low) = plt.subplots(2)
fig.tight_layout(pad=3.0)
fig.set_figheight(FIG_HEIGHT)
fig.set_figwidth(FIG_WIDTH)
axs_periodogram_raw.semilogy(f_raw, periodogram_raw)
axs_periodogram_raw.set_xlabel('frequency [Hz]', fontsize=FONT_SIZE)
axs_periodogram_raw.set_ylabel('PSD [V**2/Hz]', fontsize=FONT_SIZE)
axs_periodogram_raw.set_title("Periodogram raw signal", fontsize=FONT_SIZE)
axs_periodogram_low.semilogy(f_raw, periodogram_low)
axs_periodogram_low.set_xlabel('frequency [Hz]', fontsize=FONT_SIZE)
axs_periodogram_low.set_ylabel('PSD [V**2/Hz]', fontsize=FONT_SIZE)
axs_periodogram_low.set_title("Periodogram low pass filtered signal", fontsize=FONT_SIZE)
def main(args):
if args.wavfile:
        fs, wav = wavfile.read(args.wavfile)
lowpass_signal = low_pass_filter(wav, sample_rate=fs, cutoff_frequency=1000)
display_lowpass_normal(wav, lowpass_signal, fs)
plt.show()
elif args.indir:
data_dict = get_data(args.indir)
word_samples = data_dict[TARGET_WORD]
mean_lowpass_array, normal_array = mean_low_pass_filter(word_samples, SAMPLE_RATE, CUTOFF_FREQ)
display_lowpass_normal(normal_array, mean_lowpass_array, SAMPLE_RATE, TARGET_WORD)
plt.show()
return 0
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument(
'--wavfile',
help='Path to the .wav files',
required=False
)
parser.add_argument(
'--indir',
help='Absolute path to data directory containing .wav files',
required=False
)
args = parser.parse_args()
main(args)
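`low_pass_filter` above is imported from `global_utils` and its implementation is not shown here. As a rough, pure-Python illustration of the idea only (not the project's actual filter, which takes a sample rate and cutoff frequency), a moving average attenuates high-frequency content:

```python
def moving_average_lowpass(samples, window=5):
    """Crude low-pass: average each sample with its neighbours."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# A rapidly alternating signal is pulled toward its mean of ~0
print(moving_average_lowpass([1, -1, 1, -1, 1]))
```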
| 29.730337 | 103 | 0.708239 | 368 | 2,646 | 4.769022 | 0.252717 | 0.082051 | 0.109402 | 0.08661 | 0.378348 | 0.364103 | 0.345299 | 0.259829 | 0.249573 | 0.192593 | 0 | 0.006032 | 0.185563 | 2,646 | 88 | 104 | 30.068182 | 0.808353 | 0.032502 | 0 | 0.214286 | 0 | 0 | 0.115159 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0.160714 | 0.017857 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
40bc7b5a674a2b504a89d6769ec57fdcc5fda4af | 357 | py | Python | Chapter03/Activity11/fibonacci.py | vumaasha/python-workshop | 0fbc21c514a8df5bfffb8db926e451232c6c08bf | [
"MIT"
] | null | null | null | Chapter03/Activity11/fibonacci.py | vumaasha/python-workshop | 0fbc21c514a8df5bfffb8db926e451232c6c08bf | [
"MIT"
] | null | null | null | Chapter03/Activity11/fibonacci.py | vumaasha/python-workshop | 0fbc21c514a8df5bfffb8db926e451232c6c08bf | [
"MIT"
] | null | null | null | def fibonacci_iterative(n):
previous = 0
current = 1
for i in range(n - 1):
current_old = current
current = previous + current
previous = current_old
return current
def fibonacci_recursive(n):
if n == 0 or n == 1:
return n
else:
return fibonacci_recursive(n - 2) + fibonacci_recursive(n - 1)
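For reference, the same pair can be written more compactly with tuple assignment; a self-contained sketch (the names `fib_iter`/`fib_rec` are illustrative) that checks the two approaches agree:

```python
def fib_iter(n):
    prev, cur = 0, 1
    for _ in range(n - 1):
        prev, cur = cur, prev + cur
    return cur if n > 0 else 0

def fib_rec(n):
    return n if n < 2 else fib_rec(n - 2) + fib_rec(n - 1)

print([fib_iter(n) for n in range(11)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
print(all(fib_iter(n) == fib_rec(n) for n in range(11)))  # True
```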
| 23.8 | 70 | 0.602241 | 47 | 357 | 4.446809 | 0.404255 | 0.028708 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028689 | 0.316527 | 357 | 14 | 71 | 25.5 | 0.827869 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40c0c0515519976b7d3396916ff20c4b1d6edd0a | 126 | py | Python | app/domain/__init__.py | emge1/tracardi | 0a4a8a38f0f769464f50d3c1113b798107810810 | [
"MIT"
] | null | null | null | app/domain/__init__.py | emge1/tracardi | 0a4a8a38f0f769464f50d3c1113b798107810810 | [
"MIT"
] | null | null | null | app/domain/__init__.py | emge1/tracardi | 0a4a8a38f0f769464f50d3c1113b798107810810 | [
"MIT"
] | null | null | null | __all__ = [
'session',
'event',
'profile',
'consent',
'segment',
'source',
'rule',
'entity'
]
| 11.454545 | 14 | 0.444444 | 9 | 126 | 5.777778 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.349206 | 126 | 10 | 15 | 12.6 | 0.634146 | 0 | 0 | 0 | 0 | 0 | 0.388889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40c712bda8811c80835db84231a9e91605ae40b6 | 675 | py | Python | src/main/management/commands/create_admin_user.py | LokotamaTheMastermind/website-portfolio-django-project | 932d509428d592ee573ff82821b9490c8da9600a | [
"Apache-2.0"
] | null | null | null | src/main/management/commands/create_admin_user.py | LokotamaTheMastermind/website-portfolio-django-project | 932d509428d592ee573ff82821b9490c8da9600a | [
"Apache-2.0"
] | null | null | null | src/main/management/commands/create_admin_user.py | LokotamaTheMastermind/website-portfolio-django-project | 932d509428d592ee573ff82821b9490c8da9600a | [
"Apache-2.0"
] | null | null | null | # polls/management/commands/create_admin_user.py
import sys

from django.contrib.auth.models import User
from django.core.management.base import BaseCommand
class Command(BaseCommand):
help = 'Creates the initial admin user'
def handle(self, *args, **options):
if User.objects.filter(username="admin").exists():
print("admin exists")
else:
u = User(username='admin')
u.set_password('website-portfolio-project')
u.is_superuser = True
u.is_staff = True
u.save()
print("admin created")
sys.exit()
| 28.125 | 65 | 0.642963 | 81 | 675 | 5.296296 | 0.641975 | 0.06993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.253333 | 675 | 23 | 66 | 29.347826 | 0.85119 | 0.068148 | 0 | 0 | 0 | 0 | 0.143541 | 0.039872 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.055556 | 0.277778 | 0 | 0.444444 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
40d7ebe962811bafc69c16d6ae16e6cb4f35d53d | 3,955 | py | Python | python-is-easy/assignments/snowman/main.py | eDyablo/pirple | 08910c7574203f685a0971cba61a54166d805a1c | [
"MIT"
] | null | null | null | python-is-easy/assignments/snowman/main.py | eDyablo/pirple | 08910c7574203f685a0971cba61a54166d805a1c | [
"MIT"
] | null | null | null | python-is-easy/assignments/snowman/main.py | eDyablo/pirple | 08910c7574203f685a0971cba61a54166d805a1c | [
"MIT"
] | null | null | null | '''
Homework assignment for the 'Python is easy' course by Pirple.
Written by Ed Yablonsky.
Snowman (Hangman) game.
'''
from os import (
name as os_name,
system as system_call,
)
from os.path import (
abspath,
dirname,
join as join_path,
)
'''
Screen displays game output
'''
class Screen:
def clear(self):
if os_name == 'nt':
system_call('cls')
else:
system_call('clear')
def draw(self, frame):
for line in frame:
print(line)
'''
Input represents game input device
'''
class Input:
def ask(self, message):
answer = ''
while answer == '':
answer = input(message)
return answer
'''
Art is a game art, which is a set of frames that get loaded from a text file.
Draws its current frame on a screen.
'''
class Art:
def __init__(self):
self.frames = []
self.current_frame = 0
def load(self, name):
frames = []
art_path = join_path(dirname(abspath(__file__)), join_path('arts', name))
with open(art_path, 'r') as art_file:
frame_height = int(art_file.readline())
frame = []
line_count = 0
for line in art_file:
frame.append(line.strip('\n\r'))
line_count += 1
if line_count % frame_height == 0:
frames.append(frame)
frame = []
self.frames = frames
self.current_frame = 0
def draw(self, screen):
screen.draw(self.frames[self.current_frame])
def frames_number(self):
return len(self.frames)
def next_frame(self):
self.current_frame = (self.current_frame + 1) % self.frames_number()
return self.current_frame
'''
Riddle holds secret word and gets solved by guesses
'''
class Riddle:
def __init__(self, key):
self.key = key
self.clue = ['_'] * len(key)
def length(self):
return len(self.key)
def range(self):
return range(0, self.length())
def guess(self, g):
guess_count = 0
for i in self.range():
if g == self.key[i]:
guess_count += 1
self.clue[i] = g
return guess_count
def solved(self):
for i in self.range():
if self.clue[i] != self.key[i]:
return False
return True
def unsolved(self):
return self.solved() == False
def draw(self, screen):
screen.draw([' '.join(self.clue)])
'''
Game is a game itself
'''
class Game:
def __init__(self):
self.screen = Screen()
self.input = Input()
self.art = Art()
self.riddle = Riddle('riddle')
def play(self):
self.start()
self.propose_riddle()
while self.in_progress():
self.play_round()
self.display_result()
def start(self):
self.art.load('snowman')
self.game_over = False
def propose_riddle(self):
self.riddle = Riddle(self.input.ask('Player 1 pick a word: ').lower())
def in_progress(self):
return self.riddle.unsolved() and self.game_over == False
def draw_frame(self):
self.screen.clear()
self.art.draw(self.screen)
self.riddle.draw(self.screen)
def play_round(self):
self.draw_frame()
clue = input('Player 2 guess a letter: ').lower()
if len(clue) > 0:
if clue[0] == '.':
self.stop()
elif self.riddle.guess(clue[0]) == 0:
self.art.next_frame()
if self.art.current_frame == self.art.frames_number() - 1:
self.stop()
def stop(self):
self.game_over = True
def display_result(self):
self.draw_frame()
if self.game_over:
self.screen.draw(['Player 2 lost'])
else:
self.screen.draw(['Player 2 wins'])
Game().play()
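The clue-filling logic in Riddle.guess above can be demonstrated in isolation; a minimal standalone sketch (`apply_guess` is an illustrative name, not part of the game):

```python
def apply_guess(key, clue, letter):
    """Fill every occurrence of letter into clue; return the hit count."""
    hits = 0
    for i, ch in enumerate(key):
        if ch == letter:
            clue[i] = letter
            hits += 1
    return hits

clue = ['_'] * len('riddle')
print(apply_guess('riddle', clue, 'd'))  # 2
print(' '.join(clue))                    # _ _ d d _ _
```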
| 23.682635 | 81 | 0.551707 | 501 | 3,955 | 4.225549 | 0.241517 | 0.037789 | 0.045347 | 0.031176 | 0.117619 | 0.064714 | 0 | 0 | 0 | 0 | 0 | 0.006767 | 0.327434 | 3,955 | 166 | 82 | 23.825301 | 0.789098 | 0.028319 | 0 | 0.137931 | 0 | 0 | 0.030414 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.206897 | false | 0 | 0.017241 | 0.043103 | 0.353448 | 0.008621 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40e031fd64128f14855fedd41208af0c66f89410 | 886 | py | Python | urls.py | cartologic/cartoview_graduated_styler | f3dc6b0d48dc95bdd7e68d148a5182a4e259dbf3 | [
"BSD-2-Clause"
] | null | null | null | urls.py | cartologic/cartoview_graduated_styler | f3dc6b0d48dc95bdd7e68d148a5182a4e259dbf3 | [
"BSD-2-Clause"
] | 16 | 2017-08-06T09:49:01.000Z | 2021-09-01T08:40:58.000Z | urls.py | cartologic/cartoview_graduated_styler | f3dc6b0d48dc95bdd7e68d148a5182a4e259dbf3 | [
"BSD-2-Clause"
] | null | null | null | # from django.conf.urls import patterns, url, include
# from django.views.generic import TemplateView
# from . import views, APP_NAME
#
# urlpatterns = patterns('',
# url(r'^$', views.index, name='%s.index' % APP_NAME),
# )
from django.urls import path, re_path, include
from . import views, APP_NAME
from .api import LayerResource
from tastypie.api import Api
Resources_api = Api(api_name="api")
Resources_api.register(LayerResource())
urlpatterns = [
re_path(r'^$', views.index, name='%s.index' % APP_NAME),
path('styles/<str:layername>/', views.layer_styles, name='%s.layer_styles' % APP_NAME),
path('styles/save/<str:layer_name>/<str:style_name>', views.save_style, name='%s.save_style' % APP_NAME),
re_path(r'^proxy/geoserver/rest/(?P<suburl>.*)$', views.geoserver_rest_proxy, name='%s.proxy' % APP_NAME),
re_path(r'^', include(Resources_api.urls)),
]
| 34.076923 | 110 | 0.705418 | 128 | 886 | 4.703125 | 0.273438 | 0.081395 | 0.034884 | 0.059801 | 0.212625 | 0.093023 | 0.093023 | 0.093023 | 0 | 0 | 0 | 0 | 0.123025 | 886 | 25 | 111 | 35.44 | 0.774775 | 0.240406 | 0 | 0 | 0 | 0 | 0.233083 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
40e13f8b874a94920da4e07d42899e93081c3e2f | 4,284 | py | Python | graalpython/com.oracle.graal.python.parser.antlr/postprocess.py | transposit/graalpython | adadf5f211cc67a14bb3aca7c61219513d036b13 | [
"UPL-1.0",
"Apache-2.0",
"OpenSSL"
] | 1 | 2019-05-28T13:04:32.000Z | 2019-05-28T13:04:32.000Z | graalpython/com.oracle.graal.python.parser.antlr/postprocess.py | transposit/graalpython | adadf5f211cc67a14bb3aca7c61219513d036b13 | [
"UPL-1.0",
"Apache-2.0",
"OpenSSL"
] | null | null | null | graalpython/com.oracle.graal.python.parser.antlr/postprocess.py | transposit/graalpython | adadf5f211cc67a14bb3aca7c61219513d036b13 | [
"UPL-1.0",
"Apache-2.0",
"OpenSSL"
] | null | null | null | # Copyright (c) 2018, 2019, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# The Universal Permissive License (UPL), Version 1.0
#
# Subject to the condition set forth below, permission is hereby granted to any
# person obtaining a copy of this software, associated documentation and/or
# data (collectively the "Software"), free of charge and under any and all
# copyright rights in the Software, and any and all patent rights owned or
# freely licensable by each licensor hereunder covering either (i) the
# unmodified Software as contributed to or provided by such licensor, or (ii)
# the Larger Works (as defined below), to deal in both
#
# (a) the Software, and
#
# (b) any piece of software and/or hardware listed in the lrgrwrks.txt file if
# one is included with the Software each a "Larger Work" to which the Software
# is contributed by such licensors),
#
# without restriction, including without limitation the rights to copy, create
# derivative works of, display, perform, and distribute the Software and make,
# use, sell, offer for sale, import, export, have made, and have sold the
# Software and the Larger Work(s), and to sublicense the foregoing rights on
# either these or other terms.
#
# This license is subject to the following condition:
#
# The above copyright notice and either this complete permission notice or at a
# minimum a reference to the UPL must be included in all copies or substantial
# portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import sys
import re
COPYRIGHT_HEADER = """\
/*
* Copyright (c) 2017-2019, Oracle and/or its affiliates.
* Copyright (c) 2014 by Bart Kiers
*
* The MIT License (MIT)
*
* Permission is hereby granted, free of charge, to any person
* obtaining a copy of this software and associated documentation
* files (the "Software"), to deal in the Software without
* restriction, including without limitation the rights to use,
* copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the
* Software is furnished to do so, subject to the following
* conditions:
*
* The above copyright notice and this permission notice shall be
* included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
* EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
* OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
* NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
* HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
* WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
* FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
* OTHER DEALINGS IN THE SOFTWARE.
*/
// Checkstyle: stop
// JaCoCo Exclude
//@formatter:off
{0}
"""
PTRN_SUPPRESS_WARNINGS = re.compile(r"@SuppressWarnings.*")
def replace_suppress_warnings(line):
return PTRN_SUPPRESS_WARNINGS.sub('@SuppressWarnings("all")', line)
def replace_rulectx(line):
return line.replace("(RuleContext)_localctx", "_localctx")
def replace_localctx(line):
return re.sub(r'\(\((([a-zA-Z]*?_?)*[a-zA-Z]*)\)_localctx\)', '_localctx', line)
TRANSFORMS = [
replace_suppress_warnings,
replace_rulectx,
replace_localctx,
]
def postprocess(file):
lines = []
for line in file:
for transform in TRANSFORMS:
line = transform(line)
lines.append(line)
return ''.join(lines)
if __name__ == '__main__':
fpath = sys.argv[1]
with open(fpath, 'r') as FILE:
content = COPYRIGHT_HEADER.format(postprocess(FILE))
with open(fpath, 'w+') as FILE:
FILE.write(content)
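The `_localctx` cast-stripping transform can be checked in isolation using the same regular expression (`strip_localctx_cast` is an illustrative name):

```python
import re

def strip_localctx_cast(line):
    # Same regex as replace_localctx above: remove ((SomeContext)_localctx) casts
    return re.sub(r'\(\((([a-zA-Z]*?_?)*[a-zA-Z]*)\)_localctx\)', '_localctx', line)

print(strip_localctx_cast('((Expr_stmtContext)_localctx).result = x;'))
# prints: _localctx.result = x;
```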
| 37.911504 | 88 | 0.722222 | 616 | 4,284 | 4.978896 | 0.345779 | 0.068145 | 0.022824 | 0.009782 | 0.374959 | 0.358005 | 0.339746 | 0.339746 | 0.30388 | 0.278448 | 0 | 0.007011 | 0.20098 | 4,284 | 112 | 89 | 38.25 | 0.888986 | 0.448413 | 0 | 0 | 0 | 0 | 0.607143 | 0.038296 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.033333 | 0.05 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40eb080a05a597358c0a6ee395b1cbd8baf803e7 | 7,211 | py | Python | corefacility/core/test/models/test_application_access.py | serik1987/corefacility | 78d84e19403361e83ef562e738473849f9133bef | [
"RSA-MD"
] | null | null | null | corefacility/core/test/models/test_application_access.py | serik1987/corefacility | 78d84e19403361e83ef562e738473849f9133bef | [
"RSA-MD"
] | null | null | null | corefacility/core/test/models/test_application_access.py | serik1987/corefacility | 78d84e19403361e83ef562e738473849f9133bef | [
"RSA-MD"
] | null | null | null | import os
import random
import string
import base64
from django.utils import timezone
from django.contrib.auth.hashers import make_password, check_password
from django.test import TestCase
from parameterized import parameterized
from core.models import Module, EntryPoint, ExternalAuthorizationSession, User
AUTHORIZATION_MODULE_LIST = ["ihna", "google", "mailru"]
class TestApplicationProcess(TestCase):
PASSWORD_LENGTH = 25
auth_sessions = None
uuid_list = None
@classmethod
def setUpTestData(cls):
cls.auth_sessions = {}
cls.session_keys = {}
user = User(login="sergei.kozhukhov")
user.save()
for module in AUTHORIZATION_MODULE_LIST:
password = cls.generate_random_password()
password_hash = make_password(password)
module_app = Module.objects.get(parent_entry_point__alias="authorizations", alias=module)
session = ExternalAuthorizationSession(
authorization_module=module_app,
session_key=password_hash,
session_key_expiry_date=timezone.now()
)
session.save()
session_key = base64.encodebytes((str(session.id) + ":" + password).encode("utf-8")).decode("utf-8")
cls.auth_sessions[module] = session_key
Account = cls.get_account_class(module)
Account(user=user, email="no-reply@ihna.ru").save()
cls.uuid_list = {}
for apps_used in ['imaging', 'roi']:
cls.uuid_list[apps_used] = Module.objects.get(alias=apps_used).uuid
@parameterized.expand([
(["core", "authorizations"], [
("standard", None),
("ihna", "<div class='auth ihna'></div>"),
("google", "<div class='auth google'></div>"),
("mailru", "<div class='auth mailru'></div>"),
("unix", None),
("cookie", None),
("password_recovery", None),
("auto", None),
]),
(["core", "synchronizations"], [
("ihna_employees", None),
]),
(["core", "projects"], [
("imaging", None),
]),
(["core", "projects", "imaging", "processors"], [
("roi", None),
]),
])
def test_widgets_show(self, route, expected_widget_list):
app = None
entry_point = None
current_route = list(route)
current_look = "app"
while len(current_route) > 0:
route_element = current_route.pop(0)
if current_look == "app":
app = Module.objects.get(alias=route_element, parent_entry_point=entry_point)
current_look = "ep"
elif current_look == "ep":
entry_point = EntryPoint.objects.get(alias=route_element, belonging_module=app)
current_look = "app"
self.assertEquals(current_look, "app")
values = Module.objects.filter(parent_entry_point=entry_point).values("alias", "html_code")
self.assertEquals(len(values), len(expected_widget_list),
"Number of modules attached to this entry point is not the same as expected")
for value in values:
alias = value['alias']
html_code = value['html_code']
expected_widget_found = False
for expected_alias, expected_widget in expected_widget_list:
if expected_alias == alias:
expected_widget_found = True
if html_code is not None and expected_widget is None:
self.fail("HTML code for module '%s' does not exist but expected" % alias)
if html_code is None and expected_widget is not None:
self.fail("HTML code for module '%s' exists but not expected" % alias)
if html_code is not None and expected_widget is not None:
self.assertHTMLEqual(html_code, expected_widget,
"HTML code for module '%s' is not the same as expected" % html_code)
break
self.assertTrue(expected_widget_found, "the module '%s' is not within the list of expected modules" %
alias)
@parameterized.expand([
("standard", "core.authorizations.StandardAuthorization"),
("ihna", "authorizations.ihna.App"),
("google", "authorizations.google.App"),
("mailru", "authorizations.mailru.App"),
("unix", "core.authorizations.UnixAuthorization"),
("cookie", "authorizations.cookie.App"),
("password_recovery", "core.authorizations.PasswordRecoveryAuthorization"),
("auto", "core.authorizations.AutomaticAuthorization"),
])
def test_authorization_modules(self, alias, expected_authorization_module):
authorization_app = Module.objects.get(parent_entry_point__alias="authorizations", alias=alias)
authorization_module = authorization_app.app_class
        self.assertEqual(authorization_module, expected_authorization_module)
def test_authorization_sessions(self):
for module, session_key in self.auth_sessions.items():
session_info = base64.decodebytes(session_key.encode("utf-8")).decode("utf-8")
session_id, session_password = session_info.split(":", 1)
session = ExternalAuthorizationSession.objects.get(authorization_module__alias=module, id=session_id)
stored_password_hash = session.session_key
self.assertTrue(check_password(session_password, stored_password_hash))
module_class = session.authorization_module.app_class
session.delete()
            self.assertEqual(module_class.split('.')[1], module)
def test_find_user(self):
for module in AUTHORIZATION_MODULE_LIST:
account_class = self.get_account_class(module)
account = account_class.objects.get(email="no-reply@ihna.ru")
            self.assertEqual(account.user.login, "sergei.kozhukhov")
    def test_account_contingency(self):
for module in AUTHORIZATION_MODULE_LIST:
            self.assertEqual(self.get_account_class(module).objects.count(), 1)
User.objects.get(login="sergei.kozhukhov").delete()
for module in AUTHORIZATION_MODULE_LIST:
            self.assertEqual(self.get_account_class(module).objects.count(), 0)
def test_access_by_uuid(self):
for module_name, uuid in self.uuid_list.items():
module_class = Module.objects.get(uuid=uuid).app_class
actual_module_name, module_class = module_class.split('.')
            self.assertEqual(actual_module_name, module_name)
            self.assertEqual(module_class, "App")
@classmethod
def generate_random_password(cls):
chars = string.ascii_letters + string.digits + '!@#$%^&*()'
        random.seed(os.urandom(1024))
return ''.join(random.choice(chars) for i in range(cls.PASSWORD_LENGTH))
@classmethod
def get_account_class(cls, module):
import authorizations
auth_module = getattr(authorizations, module)
return auth_module.models.Account
| 43.969512 | 113 | 0.627791 | 779 | 7,211 | 5.593068 | 0.215661 | 0.05233 | 0.026394 | 0.022034 | 0.202662 | 0.152169 | 0.116364 | 0.106725 | 0.084462 | 0.084462 | 0 | 0.004146 | 0.26418 | 7,211 | 163 | 114 | 44.239264 | 0.817 | 0 | 0 | 0.118881 | 0 | 0 | 0.1488 | 0.037027 | 0 | 0 | 0 | 0 | 0.083916 | 1 | 0.062937 | false | 0.090909 | 0.06993 | 0 | 0.174825 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
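The password helper above draws from the non-cryptographic `random` module. A minimal, hedged sketch of an equivalent helper built on the standard library's `secrets` module; the `length` parameter here stands in for the class constant `PASSWORD_LENGTH`:

```python
import secrets
import string

def generate_random_password(length=12):
    """Generate a password with a CSPRNG; same alphabet as the test helper."""
    chars = string.ascii_letters + string.digits + '!@#$%^&*()'
    return ''.join(secrets.choice(chars) for _ in range(length))
```

`secrets.choice` draws from the operating system's entropy source internally, so no manual seeding is needed.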
40f5e193e0cc75def4b2ba8e4e082e5183a4bea7 | 4,748 | py | Python | tests/test_api_gateway/test_common/test_exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | 3 | 2021-05-14T08:13:09.000Z | 2021-05-26T11:25:35.000Z | tests/test_api_gateway/test_common/test_exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | 27 | 2021-05-13T08:43:19.000Z | 2021-08-24T17:19:36.000Z | tests/test_api_gateway/test_common/test_exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | null | null | null | """
Copyright (C) 2021 Clariteia SL
This file is part of minos framework.
Minos framework can not be copied and/or distributed without the express permission of Clariteia SL.
"""
import unittest
from minos.api_gateway.common import (
EmptyMinosModelSequenceException,
MinosAttributeValidationException,
MinosConfigDefaultAlreadySetException,
MinosConfigException,
MinosException,
MinosMalformedAttributeException,
MinosModelAttributeException,
MinosModelException,
MinosParseAttributeException,
MinosRepositoryAggregateNotFoundException,
MinosRepositoryDeletedAggregateException,
MinosRepositoryException,
MinosRepositoryManuallySetAggregateIdException,
MinosRepositoryManuallySetAggregateVersionException,
MinosRepositoryNonProvidedException,
MinosRepositoryUnknownActionException,
MinosReqAttributeException,
MinosTypeAttributeException,
MultiTypeMinosModelSequenceException,
)
class TestExceptions(unittest.TestCase):
def test_type(self):
self.assertTrue(issubclass(MinosException, Exception))
def test_base_repr(self):
exception = MinosException("test")
self.assertEqual("MinosException(message='test')", repr(exception))
def test_base_str(self):
exception = MinosException("test")
self.assertEqual("test", str(exception))
def test_config(self):
self.assertTrue(issubclass(MinosConfigException, MinosException))
def test_config_default_already_set(self):
self.assertTrue(issubclass(MinosConfigDefaultAlreadySetException, MinosConfigException))
def test_repository_aggregate_not_found(self):
self.assertTrue(issubclass(MinosRepositoryAggregateNotFoundException, MinosRepositoryException))
def test_repository_deleted_aggregate(self):
self.assertTrue(issubclass(MinosRepositoryDeletedAggregateException, MinosRepositoryException))
def test_repository_manually_set_aggregate_id(self):
self.assertTrue(issubclass(MinosRepositoryManuallySetAggregateIdException, MinosRepositoryException))
def test_repository_manually_set_aggregate_version(self):
        self.assertTrue(issubclass(MinosRepositoryManuallySetAggregateVersionException, MinosRepositoryException))
def test_repository_bad_action(self):
self.assertTrue(issubclass(MinosRepositoryUnknownActionException, MinosRepositoryException))
def test_repository_non_set(self):
self.assertTrue(issubclass(MinosRepositoryNonProvidedException, MinosRepositoryException))
def test_model(self):
self.assertTrue(issubclass(MinosModelException, MinosException))
    def test_model_empty_sequence(self):
self.assertTrue(issubclass(EmptyMinosModelSequenceException, MinosModelException))
def test_model_multi_type_sequence(self):
self.assertTrue(issubclass(MultiTypeMinosModelSequenceException, MinosModelException))
def test_model_attribute(self):
self.assertTrue(issubclass(MinosModelAttributeException, MinosException))
def test_required_attribute(self):
self.assertTrue(issubclass(MinosReqAttributeException, MinosModelAttributeException))
def test_type_attribute(self):
self.assertTrue(issubclass(MinosTypeAttributeException, MinosModelAttributeException))
def test_type_attribute_repr(self):
exception = MinosTypeAttributeException("foo", float, True)
message = (
"MinosTypeAttributeException(message=\"The <class 'float'> expected type for 'foo' "
"does not match with the given data type: <class 'bool'>\")"
)
self.assertEqual(message, repr(exception))
def test_malformed_attribute(self):
self.assertTrue(issubclass(MinosMalformedAttributeException, MinosModelAttributeException))
def test_parse_attribute(self):
self.assertTrue(issubclass(MinosParseAttributeException, MinosModelAttributeException))
def test_attribute_parse_repr(self):
exception = MinosParseAttributeException("foo", 34, ValueError())
message = (
'MinosParseAttributeException(message="ValueError() '
"was raised while parsing 'foo' field with 34 value.\")"
)
self.assertEqual(message, repr(exception))
def test_attribute_validation(self):
self.assertTrue(issubclass(MinosAttributeValidationException, MinosModelAttributeException))
def test_attribute_validation_repr(self):
exception = MinosAttributeValidationException("foo", 34)
message = "MinosAttributeValidationException(message=\"34 value does not pass the 'foo' field validation.\")"
self.assertEqual(message, repr(exception))
if __name__ == "__main__":
unittest.main()
| 39.566667 | 117 | 0.771272 | 377 | 4,748 | 9.525199 | 0.289125 | 0.044834 | 0.090226 | 0.140351 | 0.205792 | 0.082985 | 0.057366 | 0 | 0 | 0 | 0 | 0.00299 | 0.154802 | 4,748 | 119 | 118 | 39.89916 | 0.891851 | 0.036226 | 0 | 0.082353 | 0 | 0 | 0.067863 | 0.035026 | 0 | 0 | 0 | 0 | 0.270588 | 1 | 0.270588 | false | 0.011765 | 0.023529 | 0 | 0.305882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40f9e62c7e463cdddcd04524566bd56b8cb59940 | 1,407 | py | Python | src/sntk/kernels/ntk.py | gear/s-ntk | 3cd72cef4c941941750e03820c9c2850b81d529e | [
"MIT"
] | null | null | null | src/sntk/kernels/ntk.py | gear/s-ntk | 3cd72cef4c941941750e03820c9c2850b81d529e | [
"MIT"
] | null | null | null | src/sntk/kernels/ntk.py | gear/s-ntk | 3cd72cef4c941941750e03820c9c2850b81d529e | [
"MIT"
] | null | null | null | import math
import numpy as np
# return an array K of size (d_max, d_max, N, N), K[i][j] is kernel value of depth i + 1 with first j layers fixed
def kernel_value_batch(X, d_max):
K = np.zeros((d_max, d_max, X.shape[0], X.shape[0]))
for fix_dep in range(d_max):
S = np.matmul(X, X.T)
H = np.zeros_like(S)
for dep in range(d_max):
if fix_dep <= dep:
H += S
K[dep][fix_dep] = H
L = np.diag(S)
P = np.clip(np.sqrt(np.outer(L, L)), a_min = 1e-9, a_max = None)
Sn = np.clip(S / P, a_min = -1, a_max = 1)
S = (Sn * (math.pi - np.arccos(Sn)) + np.sqrt(1.0 - Sn * Sn)) * P / 2.0 / math.pi
H = H * (math.pi - np.arccos(Sn)) / 2.0 / math.pi
return K
# return an array K of size (N, N), depth d_max, first fix_dep layers fixed
def kernel_value(X, d_max, fix_dep):
K = np.zeros((d_max, X.shape[0], X.shape[0]))
S = np.matmul(X, X.T)
H = np.zeros_like(S)
for dep in range(d_max):
if fix_dep <= dep:
H += S
K[dep] = H
L = np.diag(S)
P = np.clip(np.sqrt(np.outer(L, L)), a_min = 1e-9, a_max = None)
Sn = np.clip(S / P, a_min = -1, a_max = 1)
S = (Sn * (math.pi - np.arccos(Sn)) + np.sqrt(1.0 - Sn * Sn)) * P / 2.0 / math.pi
H = H * (math.pi - np.arccos(Sn)) / 2.0 / math.pi
return K[d_max - 1] | 40.2 | 115 | 0.509595 | 276 | 1,407 | 2.485507 | 0.206522 | 0.069971 | 0.040816 | 0.081633 | 0.814869 | 0.69242 | 0.634111 | 0.634111 | 0.581633 | 0.581633 | 0 | 0.027225 | 0.321251 | 1,407 | 35 | 116 | 40.2 | 0.691099 | 0.132907 | 0 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0 | 0.064516 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
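Both kernel functions above repeat the same normalized arc-cosine update. A self-contained sketch of a single depth step, under the simplifying assumption of unit-norm inputs (so the normalization matrix `P` is identically one):

```python
import math
import numpy as np

def arccos_step(S, H):
    # One depth step of the NTK recursion for unit-norm inputs (P == 1).
    Sn = np.clip(S, -1.0, 1.0)
    S_next = (Sn * (math.pi - np.arccos(Sn)) + np.sqrt(1.0 - Sn * Sn)) / (2.0 * math.pi)
    H_next = H * (math.pi - np.arccos(Sn)) / (2.0 * math.pi)
    return S_next, H_next

# Two identical unit vectors have similarity 1; arccos(1) == 0, so one
# step maps the similarity to pi / (2 * pi) = 0.5.
S1, H1 = arccos_step(np.array([[1.0]]), np.array([[1.0]]))
```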
dc01dc4bc345b863361dbfcbff2946a74c676b49 | 1,261 | py | Python | modules/nmap_script/address_info.py | naimkowshik/reyna-eye | f729ec964e586ae3f63ff29fd524f7aed3748a74 | [
"MIT"
] | 4 | 2021-04-22T19:19:13.000Z | 2022-02-10T09:26:58.000Z | modules/nmap_script/address_info.py | naimkowshik/reyna-eye | f729ec964e586ae3f63ff29fd524f7aed3748a74 | [
"MIT"
] | null | null | null | modules/nmap_script/address_info.py | naimkowshik/reyna-eye | f729ec964e586ae3f63ff29fd524f7aed3748a74 | [
"MIT"
] | 1 | 2022-02-03T19:29:46.000Z | 2022-02-03T19:29:46.000Z | import subprocess
import sys
import time
import os
#############################
# COLORING YOUR SHELL #
#############################
R = "\033[1;31m" #
B = "\033[1;34m" #
Y = "\033[1;33m" #
G = "\033[1;32m" #
RS = "\033[0m" #
W = "\033[1;37m" #
#############################
os.system("clear")
print(" ")
print(R + "[" + G + "User Summary " + R + "]" + RS)
print("""
Shows extra information about IPv6 addresses, such as embedded MAC or IPv4 addresses when available.
Some IP address formats encode extra information; for example, some IPv6 addresses encode an IPv4 address or MAC address.
This script can decode these address formats:
• IPv4-compatible IPv6 addresses,
• IPv4-mapped IPv6 addresses,
• Teredo IPv6 addresses,
• 6to4 IPv6 addresses,
• IPv6 addresses using an EUI-64 interface ID,
• IPv4-embedded IPv6 addresses,
• IPv4-translated IPv6 addresses, and
• ISATAP Modified EUI-64 IPv6 addresses.
See RFC 4291 for general IPv6 addressing architecture and the definitions of some terms.
""")
print(" ")
webb = input("" + RS + "[" + B + "ENTER TARGET " + R + "WEBSITE " + Y + "IP" + RS + "]" + G + ": " + RS)
subprocess.check_call(['nmap', '-sV', '-sC', webb])
| 32.333333 | 120 | 0.57732 | 166 | 1,261 | 4.427711 | 0.524096 | 0.176871 | 0.114286 | 0.04898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062887 | 0.230769 | 1,261 | 38 | 121 | 33.184211 | 0.686598 | 0.015067 | 0 | 0.066667 | 0 | 0 | 0.66813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.133333 | 0 | 0.133333 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
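Since the raw input above is handed straight to `nmap` via `subprocess`, validating it first is prudent. A hedged sketch (not part of the original script) using the standard library's `ipaddress` module:

```python
import ipaddress

def validate_target(value):
    """Return the normalized address if value parses as IPv4/IPv6; raise ValueError otherwise."""
    return str(ipaddress.ip_address(value.strip()))
```

The cleaned value can then be passed to `subprocess.check_call` in place of the raw input.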
dc0d2dd1628c5437389a9030a61c8c8847b09265 | 1,331 | py | Python | examples/python/fling.py | arminfriedl/fling | 909606a9960fede8951436748c20a9600819b93a | [
"MIT"
] | null | null | null | examples/python/fling.py | arminfriedl/fling | 909606a9960fede8951436748c20a9600819b93a | [
"MIT"
] | null | null | null | examples/python/fling.py | arminfriedl/fling | 909606a9960fede8951436748c20a9600819b93a | [
"MIT"
] | null | null | null | import flingclient as fc
from flingclient.rest import ApiException
from datetime import datetime
# Per default the dockerized fling service runs on localhost:3000 In case you
# run your own instance, change the base url
configuration = fc.Configuration(host="http://localhost:3000")
# Every call, with the exception of `/api/auth`, is has to be authorized by a
# bearer token. Get a token by authenticating as admin and set it into the
# configuration. All subsequent calls will send this token in the header as
# `Authorization: Bearer <token> header`
def authenticate(admin_user, admin_password):
with fc.ApiClient(configuration) as api_client:
auth_client = fc.AuthApi(api_client)
admin_auth = fc.AdminAuth(admin_user, admin_password)
configuration.access_token = auth_client.authenticate_owner(admin_auth=admin_auth)
admin_user = input("Username: ")
admin_password = input("Password: ")
authenticate(admin_user, admin_password)
with fc.ApiClient(configuration) as api_client:
# Create a new fling
fling_client = fc.FlingApi(api_client)
fling = fc.Fling(name="A Fling from Python", auth_code="secret",
direct_download=False, allow_upload=True,
expiration_time=datetime(2099, 12, 12))
fling = fling_client.post_fling()
print(f"Created a new fling: {fling}")
#
| 40.333333 | 86 | 0.75432 | 189 | 1,331 | 5.174603 | 0.502646 | 0.03681 | 0.042945 | 0.067485 | 0.149284 | 0.149284 | 0.149284 | 0.149284 | 0.149284 | 0.149284 | 0 | 0.014324 | 0.160781 | 1,331 | 32 | 87 | 41.59375 | 0.861235 | 0.299775 | 0 | 0.105263 | 0 | 0 | 0.101842 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.210526 | 0.157895 | 0 | 0.210526 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90541de92a1d97d772f070e495cb4dccfca0eef7 | 1,416 | py | Python | dev/libs.py | karimwitani/webscraping | 58d4b2587d039fcea567db2caf86bbddb4e0b96f | [
"MIT"
] | null | null | null | dev/libs.py | karimwitani/webscraping | 58d4b2587d039fcea567db2caf86bbddb4e0b96f | [
"MIT"
] | null | null | null | dev/libs.py | karimwitani/webscraping | 58d4b2587d039fcea567db2caf86bbddb4e0b96f | [
"MIT"
] | null | null | null | import selenium
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException
def browser_init():
option = webdriver.ChromeOptions()
browser = webdriver.Chrome(executable_path='/Library/Application Support/Google/chromedriver', chrome_options=option)
return browser
def insta_login(browser):
browser.get('https://www.instagram.com')
#Find username/pass fields
username = WebDriverWait(browser,10).until(EC.element_to_be_clickable((By.XPATH, '//input[@name="username"]')))
password = WebDriverWait(browser,10).until(EC.element_to_be_clickable((By.XPATH, '//input[@name="password"]')))
#input username and pass
username.clear()
username.send_keys('itanikarim')
password.clear()
password.send_keys('1995PPrr')
#Login
Login_button = WebDriverWait(browser, 2).until(EC.element_to_be_clickable((By.XPATH, '//*[@id="loginForm"]/div/div[3]'))).click()
#Skip buttons
not_now = WebDriverWait(browser, 10).until(EC.element_to_be_clickable((By.XPATH, '//button[contains(text(), "Not Now")]'))).click()
not_now2 = WebDriverWait(browser, 10).until(EC.element_to_be_clickable((By.XPATH, '//button[contains(text(), "Not Now")]'))).click()
print("everything ok") | 40.457143 | 136 | 0.738701 | 179 | 1,416 | 5.703911 | 0.407821 | 0.058766 | 0.06856 | 0.078355 | 0.32713 | 0.32713 | 0.32713 | 0.32713 | 0.29383 | 0.29383 | 0 | 0.011962 | 0.114407 | 1,416 | 35 | 137 | 40.457143 | 0.802233 | 0.045904 | 0 | 0 | 0 | 0 | 0.192137 | 0.117211 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.136364 | 0.272727 | 0 | 0.409091 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
905515ca4421e0d997a1e7e93a11455f5f918cff | 380 | py | Python | setup.py | dwastberg/osmuf | 0cef4e87401b3fc2d344d7e067b4d9ada25848a4 | [
"MIT"
] | null | null | null | setup.py | dwastberg/osmuf | 0cef4e87401b3fc2d344d7e067b4d9ada25848a4 | [
"MIT"
] | null | null | null | setup.py | dwastberg/osmuf | 0cef4e87401b3fc2d344d7e067b4d9ada25848a4 | [
"MIT"
] | null | null | null | from setuptools import setup
setup(name='osmuf',
version='0.1',
install_requires=[
"seaborn",
],
description='Urban Form analysis from OpenStreetMap',
url='http://github.com/atelierlibre/osmuf',
author='AtelierLibre',
author_email='mail@atelierlibre.org',
license='MIT',
packages=['osmuf'],
zip_safe=False)
| 25.333333 | 59 | 0.615789 | 39 | 380 | 5.923077 | 0.820513 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006969 | 0.244737 | 380 | 14 | 60 | 27.142857 | 0.797909 | 0 | 0 | 0 | 0 | 0 | 0.342105 | 0.055263 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9059540a6a1df436a316a8b4d0bf19c43271fcb4 | 1,699 | py | Python | app/main/forms.py | ingabire1/blog | 5fcee6027cee9fbdcd94057123862bd146a16e98 | [
"Unlicense"
] | null | null | null | app/main/forms.py | ingabire1/blog | 5fcee6027cee9fbdcd94057123862bd146a16e98 | [
"Unlicense"
] | null | null | null | app/main/forms.py | ingabire1/blog | 5fcee6027cee9fbdcd94057123862bd146a16e98 | [
"Unlicense"
] | null | null | null |
from flask_wtf import FlaskForm
from wtforms import StringField,TextAreaField,SubmitField
from wtforms.validators import Required
class ReviewForm(FlaskForm):
title = StringField('Review title',validators=[Required()])
review = TextAreaField('Movie review', validators=[Required()])
submit = SubmitField('Submit')
class UpdateProfile(FlaskForm):
bio = TextAreaField('Tell us about you.',validators = [Required()])
submit = SubmitField('Submit')
# class LoginForm(FlaskForm):
# email = StringField('Your Email Address',validators=[Required(),Email()])
# password = PasswordField('Password',validators =[Required()])
# remember = BooleanField('Remember me')
# submit = SubmitField('Sign In')
class BlogForm(FlaskForm):
# my_category = StringField('Category', validators=[Required()])
title = StringField('Title', validators=[Required()])
blog_post = TextAreaField('Type Blog here', validators=[Required()])
post = SubmitField('Post Blog')
class CommentForm(FlaskForm):
name = StringField('Name',validators=[Required()])
# email = StringField('Email', validators=[Required()],render_kw={"placeholder": "Email"})
comment = TextAreaField('Enter Comment', validators=[Required()])
post = SubmitField('Post Comment')
class SubscriptionForm(FlaskForm):
name = StringField('First Name', validators=[Required()])
subscription_data = StringField('Email', validators=[Required()])
subscribe = SubmitField('Subscribe')
class UpdatePostForm(FlaskForm):
# title = StringField('Title', validators=[Required()])
blog_post = TextAreaField('Type Blog here', validators=[Required()])
submit=SubmitField('SUBMIT')
| 42.475 | 94 | 0.712772 | 162 | 1,699 | 7.438272 | 0.333333 | 0.224066 | 0.057261 | 0.087137 | 0.291286 | 0.225726 | 0.149378 | 0.149378 | 0.149378 | 0.149378 | 0 | 0 | 0.140082 | 1,699 | 39 | 95 | 43.564103 | 0.824778 | 0.268393 | 0 | 0.16 | 0 | 0 | 0.12571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.12 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
905fb1174dc9f76a043ce3432db2989539fb3eae | 1,212 | py | Python | surface/ex_surface02.py | orbingol/NURBS-Python_Examples | c99d8cd3d20e7523694ce62f72760b260582fa11 | [
"MIT"
] | 48 | 2017-12-14T09:54:48.000Z | 2020-03-30T13:34:44.000Z | surface/ex_surface02.py | GabrielJie/NURBS-Python_Examples | c99d8cd3d20e7523694ce62f72760b260582fa11 | [
"MIT"
] | 7 | 2020-05-27T04:27:24.000Z | 2021-05-25T16:11:39.000Z | surface/ex_surface02.py | GabrielJie/NURBS-Python_Examples | c99d8cd3d20e7523694ce62f72760b260582fa11 | [
"MIT"
] | 37 | 2017-10-14T08:11:11.000Z | 2020-05-04T02:51:58.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Examples for the NURBS-Python Package
Released under MIT License
Developed by Onur Rauf Bingol (c) 2016-2017
"""
import os
from geomdl import BSpline
from geomdl import utilities
from geomdl import exchange
from geomdl import operations
from geomdl.visualization import VisPlotly
# Fix file path
os.chdir(os.path.dirname(os.path.realpath(__file__)))
# Create a BSpline surface instance
surf = BSpline.Surface()
# Set degrees
surf.degree_u = 3
surf.degree_v = 3
# Set control points
surf.set_ctrlpts(*exchange.import_txt("ex_surface02.cpt", two_dimensional=True))
# Set knot vectors
surf.knotvector_u = utilities.generate_knot_vector(surf.degree_u, 6)
surf.knotvector_v = utilities.generate_knot_vector(surf.degree_v, 6)
# Set evaluation delta
surf.delta = 0.025
# Evaluate surface
surf.evaluate()
# Plot the control point grid and the evaluated surface
vis_comp = VisPlotly.VisSurface()
surf.vis = vis_comp
surf.render()
# Evaluate surface tangent and normal at the given u and v
uv = [0.2, 0.9]
surf_tangent = operations.tangent(surf, uv)
surf_normal = operations.normal(surf, uv)
# Good to have something here to put a breakpoint
pass
| 22.867925 | 80 | 0.763201 | 186 | 1,212 | 4.854839 | 0.516129 | 0.055371 | 0.070875 | 0.059801 | 0.081949 | 0.081949 | 0 | 0 | 0 | 0 | 0 | 0.022201 | 0.145215 | 1,212 | 52 | 81 | 23.307692 | 0.849421 | 0.366337 | 0 | 0 | 0 | 0 | 0.02171 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.045455 | 0.318182 | 0 | 0.318182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
90600f2b374617aa571df4d29f498ce0b363ef8b | 1,380 | bzl | Python | dev/bazel/deps/micromkl.bzl | cmsxbc/oneDAL | eeb8523285907dc359c84ca4894579d5d1d9f57e | [
"Apache-2.0"
] | 169 | 2020-03-30T09:13:05.000Z | 2022-03-15T11:12:36.000Z | dev/bazel/deps/micromkl.bzl | cmsxbc/oneDAL | eeb8523285907dc359c84ca4894579d5d1d9f57e | [
"Apache-2.0"
] | 1,198 | 2020-03-24T17:26:18.000Z | 2022-03-31T08:06:15.000Z | dev/bazel/deps/micromkl.bzl | cmsxbc/oneDAL | eeb8523285907dc359c84ca4894579d5d1d9f57e | [
"Apache-2.0"
] | 75 | 2020-03-30T11:39:58.000Z | 2022-03-26T05:16:20.000Z | #===============================================================================
# Copyright 2020-2021 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#===============================================================================
load("@onedal//dev/bazel:repos.bzl", "repos")
micromkl_repo = repos.prebuilt_libs_repo_rule(
includes = [
"include",
"%{os}/include",
],
libs = [
"%{os}/lib/intel64/libdaal_mkl_thread.a",
"%{os}/lib/intel64/libdaal_mkl_sequential.a",
"%{os}/lib/intel64/libdaal_vmlipp_core.a",
],
build_template = "@onedal//dev/bazel/deps:micromkl.tpl.BUILD",
)
micromkl_dpc_repo = repos.prebuilt_libs_repo_rule(
includes = [
"include",
],
libs = [
"lib/intel64/libdaal_sycl.a",
],
build_template = "@onedal//dev/bazel/deps:micromkldpc.tpl.BUILD",
)
| 33.658537 | 80 | 0.603623 | 165 | 1,380 | 4.939394 | 0.563636 | 0.07362 | 0.083436 | 0.069939 | 0.266258 | 0.186503 | 0.186503 | 0.107975 | 0 | 0 | 0 | 0.017528 | 0.173188 | 1,380 | 40 | 81 | 34.5 | 0.696757 | 0.52029 | 0 | 0.454545 | 0 | 0 | 0.451314 | 0.401855 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9066b9980c0b3869cc716e1c22a3fe141c968868 | 1,705 | py | Python | myApps/test_web.py | Rocket-hodgepodge/NewsWeb | 7835b6ae4e754eb96f3f0d5983b2421c9464fee3 | [
"BSD-3-Clause"
] | null | null | null | myApps/test_web.py | Rocket-hodgepodge/NewsWeb | 7835b6ae4e754eb96f3f0d5983b2421c9464fee3 | [
"BSD-3-Clause"
] | null | null | null | myApps/test_web.py | Rocket-hodgepodge/NewsWeb | 7835b6ae4e754eb96f3f0d5983b2421c9464fee3 | [
"BSD-3-Clause"
] | 2 | 2018-07-04T01:43:36.000Z | 2018-07-04T06:12:47.000Z | from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
import unittest
class NewVisitorTest(unittest.TestCase):
def setUp(self):
self.timeout = 40
self.browser = webdriver.Chrome()
self.browser.set_page_load_timeout(self.timeout)
self.wait = WebDriverWait(self.browser, self.timeout)
def tearDown(self):
self.browser.quit()
def test_can_start_a_list_and_retrieve_it_later(self):
self.browser.get('https://www.baidu.com')
self.assertIn('百度', self.browser.title)
login_link = self.wait.until(
EC.element_to_be_clickable((By.LINK_TEXT, '登录')))
login_link.click()
login_link_2 = self.wait.until(
EC.element_to_be_clickable((By.ID, 'TANGRAM__PSP_10__footerULoginBtn')))
login_link_2.click()
username_input = self.wait.until(
EC.presence_of_element_located((By.ID, 'TANGRAM__PSP_10__userName')))
username_input.clear()
username_input.send_keys('橙色烟月')
password_input = self.wait.until(
EC.presence_of_element_located((By.ID, 'TANGRAM__PSP_10__password')))
password_input.clear()
password_input.send_keys('1659636840sec')
login_submit_button = self.wait.until(
EC.element_to_be_clickable((By.ID, 'TANGRAM__PSP_10__submit')))
login_submit_button.click()
username_span = self.wait.until(
EC.presence_of_element_located((By.CSS_SELECTOR, '#s_username_top > span')))
self.assertEqual(username_span.text, 'PebbleApp')
# user_login_link = self.browser.find_element_by_id('TANGRAM__PSP_10__footerULoginBtn')
# user_login_link.click()
if __name__ == '__main__':
unittest.main(warnings='ignore')
| 31.574074 | 89 | 0.775367 | 242 | 1,705 | 5.07438 | 0.371901 | 0.062704 | 0.063518 | 0.07329 | 0.281759 | 0.281759 | 0.2443 | 0.2443 | 0.2443 | 0.180782 | 0 | 0.015727 | 0.104985 | 1,705 | 53 | 90 | 32.169811 | 0.788991 | 0.06393 | 0 | 0 | 0 | 0 | 0.120527 | 0.065913 | 0 | 0 | 0 | 0 | 0.052632 | 1 | 0.078947 | false | 0.105263 | 0.131579 | 0 | 0.236842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
907488d52d48e24b4d69fb2af57f6618dc2c3ce3 | 2,836 | py | Python | Calculator.py | KunalKatiyar/Calculator | 74044d32b08738ef288ccfae6bb322e6ab05f608 | [
"MIT"
] | null | null | null | Calculator.py | KunalKatiyar/Calculator | 74044d32b08738ef288ccfae6bb322e6ab05f608 | [
"MIT"
] | null | null | null | Calculator.py | KunalKatiyar/Calculator | 74044d32b08738ef288ccfae6bb322e6ab05f608 | [
"MIT"
] | null | null | null | import sys
from PyQt5.QtWidgets import QApplication, QWidget, QPushButton, QHBoxLayout, QGroupBox, QDialog, QVBoxLayout, QGridLayout, QMainWindow, QAction, QLineEdit, QMessageBox
from PyQt5.QtGui import QIcon
from PyQt5.QtCore import pyqtSlot
class App(QDialog):
def __init__(self):
super().__init__()
self.title = 'Calculator'
self.left = 10
self.top = 10
self.width = 640
self.height = 480
self.initUI()
def initUI(self):
self.setWindowTitle(self.title)
self.setGeometry(self.left, self.top, self.width, self.height)
self.createGridLayout()
windowLayout = QVBoxLayout()
windowLayout.addWidget(self.horizontalGroupBox)
self.setLayout(windowLayout)
self.textbox = QLineEdit(self)
self.textbox.move(20, 40)
self.textbox.resize(600,35)
# Original Approach
# buttonp = QPushButton('+', self)
# buttonp.setToolTip('Addition Operator')
# buttonp.move(100,70)
# buttonp.clicked.connect(self.on_click)
# buttonm = QPushButton('-', self)
# buttonm.setToolTip('Subtraction Operator')
# buttonm.move(100,100)
# buttonm.clicked.connect(self.on_click)
self.show()
def createGridLayout(self):
self.horizontalGroupBox = QGroupBox("Grid")
layout = QGridLayout()
# layout.setColumnStretch(1, 2)
# layout.setColumnStretch(2, 4)
layout.addWidget(QPushButton('1'),0,0)
layout.addWidget(QPushButton('2'),0,1)
layout.addWidget(QPushButton('3'),0,2)
layout.addWidget(QPushButton('4'),1,0)
layout.addWidget(QPushButton('5'),1,1)
layout.addWidget(QPushButton('6'),1,2)
layout.addWidget(QPushButton('7'),2,0)
layout.addWidget(QPushButton('8'),2,1)
layout.addWidget(QPushButton('9'),2,2)
layout.addWidget(QPushButton('0'),3,1)
layout.addWidget(QPushButton('.'),3,0)
layout.addWidget(QPushButton('='),3,2)
layout.addWidget(QPushButton('+'),0,4)
layout.addWidget(QPushButton('-'),1,4)
layout.addWidget(QPushButton('*'),2,4)
layout.addWidget(QPushButton('/'),3,4)
self.horizontalGroupBox.setLayout(layout)
# @pyqtSlot()
# def on_click(self):
# print('Button click')
@pyqtSlot()
def on_click(self):
textboxValue = "Good"
QMessageBox.question(self, 'Message - pythonspot.com', "You typed: " + textboxValue, QMessageBox.Ok, QMessageBox.Ok)
self.textbox.setText("Good")
if __name__ == '__main__':
app = QApplication(sys.argv)
ex = App()
sys.exit(app.exec_()) | 35.45 | 203 | 0.605783 | 292 | 2,836 | 5.811644 | 0.335616 | 0.141426 | 0.245138 | 0.063642 | 0.172658 | 0.034178 | 0 | 0 | 0 | 0 | 0 | 0.037655 | 0.260226 | 2,836 | 80 | 204 | 35.45 | 0.771211 | 0.142807 | 0 | 0 | 0 | 0 | 0.034645 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075472 | false | 0 | 0.075472 | 0 | 0.169811 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9082f22e3410593d0f53f454a62bd2d756d1a9be | 554 | py | Python | rsbroker/urls.py | land-pack/RsBroker | d556fda09582e0540cac0eabc163a984e8fc1c44 | [
"Apache-2.0"
] | null | null | null | rsbroker/urls.py | land-pack/RsBroker | d556fda09582e0540cac0eabc163a984e8fc1c44 | [
"Apache-2.0"
] | null | null | null | rsbroker/urls.py | land-pack/RsBroker | d556fda09582e0540cac0eabc163a984e8fc1c44 | [
"Apache-2.0"
] | null | null | null | from __future__ import absolute_import
import os
from tornado.web import StaticFileHandler
from rsbroker.views import websocket
from rsbroker.views.error import NotFoundErrorHandler
settings = dict(
    template_path=os.path.join(os.path.dirname(__file__), "templates"),
    static_path=os.path.join(os.path.dirname(__file__), "static")
)

handlers = [
    # Http api
    # Events WebSocket API
    (r"/api/ws", websocket.BrokerServerHandler),
    # Static
    (r"/static/(.*)", StaticFileHandler),
    # Error
    (r".*", NotFoundErrorHandler)
]
| 20.518519 | 71 | 0.714801 | 63 | 554 | 6.047619 | 0.460317 | 0.062992 | 0.089239 | 0.073491 | 0.16273 | 0.16273 | 0.16273 | 0.16273 | 0 | 0 | 0 | 0 | 0.16426 | 554 | 26 | 72 | 21.307692 | 0.822894 | 0.075812 | 0 | 0 | 0 | 0 | 0.071006 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.357143 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
9089cafc79c7a1e8e0abc38c3cabc190f618f305 | 1,648 | py | Python | wpa-psk/wpa-psk.py | ranisalt/rsaur | 8b8e8f596a35e8aff53ccff0fc941deacdc885a4 | [
"MIT"
] | null | null | null | wpa-psk/wpa-psk.py | ranisalt/rsaur | 8b8e8f596a35e8aff53ccff0fc941deacdc885a4 | [
"MIT"
] | null | null | null | wpa-psk/wpa-psk.py | ranisalt/rsaur | 8b8e8f596a35e8aff53ccff0fc941deacdc885a4 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import sys
from argparse import ArgumentParser
from getpass import getpass
from hashlib import pbkdf2_hmac
from signal import signal, SIGINT
def die(*_, **__):
    sys.exit()


signal(SIGINT, die)

iwd = """[Security]
PreSharedKey={psk}"""

supplicant = """network={{
	ssid={ssid}
	#psk={passphrase}
	psk={psk}
}}"""

parser = ArgumentParser(
    description="%(prog)s pre-computes PSK entries for network configuration blocks of wpa_supplicant or iwd config. An ASCII passphrase and SSID are used to generate a 256-bit PSK."
)
parser.add_argument("ssid", help="The SSID whose passphrase should be derived.")
parser.add_argument(
    "passphrase",
    help="The passphrase to use. If not included on the command line, passphrase will be read from standard input.",
    nargs="?",
)
parser.add_argument(
    "--iwd",
    "-i",
    dest="template",
    action="store_const",
    const=iwd,
    default=supplicant,
    help="Generate for iwd (default: generate for wpa_supplicant).",
)
args = parser.parse_args()

if not args.passphrase:
    print("# reading passphrase from stdin", file=sys.stderr)
    args.passphrase = getpass(prompt="")

if not 8 <= len(args.passphrase) <= 63:
    print("Passphrase must be 8..63 characters", file=sys.stderr)
    sys.exit(1)

passphrase = args.passphrase.encode()
if any(b < 32 or b == 127 for b in passphrase):
    print("Invalid passphrase character", file=sys.stderr)
    sys.exit(1)

ssid = args.ssid.encode()
psk = pbkdf2_hmac("sha1", passphrase, ssid, iterations=4096, dklen=32)
print(args.template.format(ssid=args.ssid, passphrase=args.passphrase, psk=psk.hex()))
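The derivation step above is plain PBKDF2-HMAC-SHA1 from the standard library, so it can be sketched in isolation. A minimal standalone example; the SSID and passphrase here are made-up inputs, not taken from the script:

```python
from hashlib import pbkdf2_hmac

# WPA2 derives a 256-bit PSK from the passphrase, salted with the SSID,
# using 4096 rounds of PBKDF2-HMAC-SHA1 -- the same call the script makes.
psk = pbkdf2_hmac("sha1", b"correct horse battery", b"example", iterations=4096, dklen=32)

print(len(psk))   # 32 (bytes, i.e. 256 bits)
print(psk.hex())  # hex form, as substituted into the config template
```

The derivation is deterministic: the same SSID/passphrase pair always yields the same PSK, which is why it can be pre-computed and stored in the config instead of the plaintext passphrase.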
| 28.912281 | 182 | 0.703277 | 227 | 1,648 | 5.052863 | 0.462555 | 0.061029 | 0.044464 | 0.027899 | 0.036617 | 0.036617 | 0 | 0 | 0 | 0 | 0 | 0.018841 | 0.162621 | 1,648 | 56 | 183 | 29.428571 | 0.812319 | 0.012743 | 0 | 0.085106 | 0 | 0.042553 | 0.369619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021277 | false | 0.340426 | 0.106383 | 0 | 0.12766 | 0.085106 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
9092b9fc5566c9c58a04dd93c04224cbbceb0b64 | 1,911 | py | Python | sdl2/blendmode.py | namelivia/py-sdl2 | c1bdf43501224d5f0a125dbce70198100ec7be82 | [
"CC0-1.0"
] | 222 | 2017-08-19T00:51:59.000Z | 2022-02-05T19:39:33.000Z | sdl2/blendmode.py | namelivia/py-sdl2 | c1bdf43501224d5f0a125dbce70198100ec7be82 | [
"CC0-1.0"
] | 103 | 2017-08-20T17:13:05.000Z | 2022-02-05T20:20:01.000Z | sdl2/blendmode.py | namelivia/py-sdl2 | c1bdf43501224d5f0a125dbce70198100ec7be82 | [
"CC0-1.0"
] | 54 | 2017-08-20T17:13:00.000Z | 2022-01-14T23:51:13.000Z | from ctypes import c_int
from .dll import _bind
__all__ = [
    # Enums
    "SDL_BlendMode",
    "SDL_BLENDMODE_NONE", "SDL_BLENDMODE_BLEND", "SDL_BLENDMODE_ADD",
    "SDL_BLENDMODE_MOD", "SDL_BLENDMODE_MUL", "SDL_BLENDMODE_INVALID",
    "SDL_BlendOperation",
    "SDL_BLENDOPERATION_ADD", "SDL_BLENDOPERATION_SUBTRACT",
    "SDL_BLENDOPERATION_REV_SUBTRACT", "SDL_BLENDOPERATION_MINIMUM",
    "SDL_BLENDOPERATION_MAXIMUM",
    "SDL_BlendFactor",
    "SDL_BLENDFACTOR_ZERO", "SDL_BLENDFACTOR_ONE",
    "SDL_BLENDFACTOR_SRC_COLOR", "SDL_BLENDFACTOR_ONE_MINUS_SRC_COLOR",
    "SDL_BLENDFACTOR_SRC_ALPHA", "SDL_BLENDFACTOR_ONE_MINUS_SRC_ALPHA",
    "SDL_BLENDFACTOR_DST_COLOR", "SDL_BLENDFACTOR_ONE_MINUS_DST_COLOR",
    "SDL_BLENDFACTOR_DST_ALPHA", "SDL_BLENDFACTOR_ONE_MINUS_DST_ALPHA",

    # Functions
    "SDL_ComposeCustomBlendMode"
]
SDL_BlendMode = c_int
SDL_BLENDMODE_NONE = 0x00000000
SDL_BLENDMODE_BLEND = 0x00000001
SDL_BLENDMODE_ADD = 0x00000002
SDL_BLENDMODE_MOD = 0x00000004
SDL_BLENDMODE_MUL = 0x00000008
SDL_BLENDMODE_INVALID = 0x7FFFFFFF
SDL_BlendOperation = c_int
SDL_BLENDOPERATION_ADD = 0x1
SDL_BLENDOPERATION_SUBTRACT = 0x2
SDL_BLENDOPERATION_REV_SUBTRACT = 0x3
SDL_BLENDOPERATION_MINIMUM = 0x4
SDL_BLENDOPERATION_MAXIMUM = 0x5
SDL_BlendFactor = c_int
SDL_BLENDFACTOR_ZERO = 0x1
SDL_BLENDFACTOR_ONE = 0x2
SDL_BLENDFACTOR_SRC_COLOR = 0x3
SDL_BLENDFACTOR_ONE_MINUS_SRC_COLOR = 0x4
SDL_BLENDFACTOR_SRC_ALPHA = 0x5
SDL_BLENDFACTOR_ONE_MINUS_SRC_ALPHA = 0x6
SDL_BLENDFACTOR_DST_COLOR = 0x7
SDL_BLENDFACTOR_ONE_MINUS_DST_COLOR = 0x8
SDL_BLENDFACTOR_DST_ALPHA = 0x9
SDL_BLENDFACTOR_ONE_MINUS_DST_ALPHA = 0xA
SDL_ComposeCustomBlendMode = _bind("SDL_ComposeCustomBlendMode", [SDL_BlendFactor, SDL_BlendFactor, SDL_BlendOperation, SDL_BlendFactor, SDL_BlendFactor, SDL_BlendOperation], SDL_BlendMode, added='2.0.6')
| 31.327869 | 204 | 0.791209 | 236 | 1,911 | 5.79661 | 0.220339 | 0.266082 | 0.124269 | 0.128655 | 0.258041 | 0.243421 | 0.067982 | 0 | 0 | 0 | 0 | 0.048289 | 0.143904 | 1,911 | 60 | 205 | 31.85 | 0.787897 | 0.007849 | 0 | 0 | 0 | 0 | 0.329107 | 0.235077 | 0 | 0 | 0.055468 | 0 | 0 | 1 | 0 | false | 0 | 0.046512 | 0 | 0.046512 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
909acc24e11a5c6671af7463f6c79ae6bbfe3286 | 20,420 | py | Python | network/modules/spconv_unet.py | alexisgroshenry/NPM3D_DSNet | d1a2ec071728dcb3c733ecdee3a27f4534b67f33 | [
"MIT"
] | null | null | null | network/modules/spconv_unet.py | alexisgroshenry/NPM3D_DSNet | d1a2ec071728dcb3c733ecdee3a27f4534b67f33 | [
"MIT"
] | null | null | null | network/modules/spconv_unet.py | alexisgroshenry/NPM3D_DSNet | d1a2ec071728dcb3c733ecdee3a27f4534b67f33 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
# author: Xinge
# @file: spconv_unet.py
# @time: 2020/06/22 15:01
import time
import numpy as np
import spconv
import torch
import torch.nn.functional as F
from torch import nn
def conv3x3(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=3, stride=stride,
                             padding=1, bias=False, indice_key=indice_key)


def conv1x3(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=(1, 3, 3), stride=stride,
                             padding=(0, 1, 1), bias=False, indice_key=indice_key)


def conv1x1x3(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=(1, 1, 3), stride=stride,
                             padding=(0, 0, 1), bias=False, indice_key=indice_key)


def conv1x3x1(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=(1, 3, 1), stride=stride,
                             padding=(0, 1, 0), bias=False, indice_key=indice_key)


def conv3x1x1(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=(3, 1, 1), stride=stride,
                             padding=(1, 0, 0), bias=False, indice_key=indice_key)


def conv3x1(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=(3, 1, 3), stride=stride,
                             padding=(1, 0, 1), bias=False, indice_key=indice_key)


def conv1x1(in_planes, out_planes, stride=1, indice_key=None):
    return spconv.SubMConv3d(in_planes, out_planes, kernel_size=1, stride=stride,
                             padding=1, bias=False, indice_key=indice_key)
class ResContextBlock(nn.Module):
    def __init__(self, in_filters, out_filters, kernel_size=(3, 3, 3), stride=1, indice_key=None):
        super(ResContextBlock, self).__init__()
        self.conv1 = conv1x3(in_filters, out_filters, indice_key=indice_key + "bef")
        self.bn0 = nn.BatchNorm1d(out_filters)
        self.act1 = nn.LeakyReLU()

        self.conv1_2 = conv3x1(out_filters, out_filters, indice_key=indice_key + "bef")
        self.bn0_2 = nn.BatchNorm1d(out_filters)
        self.act1_2 = nn.LeakyReLU()

        self.conv2 = conv3x1(in_filters, out_filters, indice_key=indice_key + "bef")
        self.act2 = nn.LeakyReLU()
        self.bn1 = nn.BatchNorm1d(out_filters)

        self.conv3 = conv1x3(out_filters, out_filters, indice_key=indice_key + "bef")
        self.act3 = nn.LeakyReLU()
        self.bn2 = nn.BatchNorm1d(out_filters)

    def forward(self, x):
        shortcut = self.conv1(x)
        shortcut.features = self.act1(shortcut.features)
        shortcut.features = self.bn0(shortcut.features)

        shortcut = self.conv1_2(shortcut)
        shortcut.features = self.act1_2(shortcut.features)
        shortcut.features = self.bn0_2(shortcut.features)

        resA = self.conv2(x)
        resA.features = self.act2(resA.features)
        resA.features = self.bn1(resA.features)

        resA = self.conv3(resA)
        resA.features = self.act3(resA.features)
        resA.features = self.bn2(resA.features)

        resA.features = resA.features + shortcut.features

        return resA
class ResBlock(nn.Module):
    def __init__(self, in_filters, out_filters, dropout_rate, kernel_size=(3, 3, 3), stride=1,
                 pooling=True, drop_out=True, height_pooling=False, indice_key=None):
        super(ResBlock, self).__init__()
        self.pooling = pooling
        self.drop_out = drop_out

        self.conv1 = conv3x1(in_filters, out_filters, indice_key=indice_key + "bef")
        self.act1 = nn.LeakyReLU()
        self.bn0 = nn.BatchNorm1d(out_filters)

        self.conv1_2 = conv1x3(out_filters, out_filters, indice_key=indice_key + "bef")
        self.act1_2 = nn.LeakyReLU()
        self.bn0_2 = nn.BatchNorm1d(out_filters)

        self.conv2 = conv1x3(in_filters, out_filters, indice_key=indice_key + "bef")
        self.act2 = nn.LeakyReLU()
        self.bn1 = nn.BatchNorm1d(out_filters)

        self.conv3 = conv3x1(out_filters, out_filters, indice_key=indice_key + "bef")
        self.act3 = nn.LeakyReLU()
        self.bn2 = nn.BatchNorm1d(out_filters)

        # self.conv4 = conv3x3(out_filters, out_filters, indice_key=indice_key+"bef")
        # self.act4 = nn.LeakyReLU()
        # self.bn4 = nn.BatchNorm1d(out_filters)

        if pooling:
            # self.dropout = nn.Dropout3d(p=dropout_rate)
            if height_pooling:
                # self.pool = spconv.SparseMaxPool3d(kernel_size=2, stride=2)
                self.pool = spconv.SparseConv3d(out_filters, out_filters, kernel_size=3, stride=2,
                                                padding=1, indice_key=indice_key, bias=False)
            else:
                # self.pool = spconv.SparseMaxPool3d(kernel_size=(2,2,1), stride=(2, 2, 1))
                self.pool = spconv.SparseConv3d(out_filters, out_filters, kernel_size=3, stride=(2, 2, 1),
                                                padding=1, indice_key=indice_key, bias=False)
        # else:
        #     self.dropout = nn.Dropout3d(p=dropout_rate)

    def forward(self, x):
        shortcut = self.conv1(x)
        shortcut.features = self.act1(shortcut.features)
        shortcut.features = self.bn0(shortcut.features)

        shortcut = self.conv1_2(shortcut)
        shortcut.features = self.act1_2(shortcut.features)
        shortcut.features = self.bn0_2(shortcut.features)

        resA = self.conv2(x)
        resA.features = self.act2(resA.features)
        resA.features = self.bn1(resA.features)

        resA = self.conv3(resA)
        resA.features = self.act3(resA.features)
        resA.features = self.bn2(resA.features)

        resA.features = resA.features + shortcut.features

        # resA = self.conv4(resA)
        # resA.features = self.act4(resA.features)
        # resA.features = self.bn4(resA.features)

        if self.pooling:
            # if self.drop_out:
            #     resB = self.dropout(resA.features)
            # else:
            #     resB = resA
            resB = self.pool(resA)
            return resB, resA
        else:
            # if self.drop_out:
            #     resB = self.dropout(resA)
            # else:
            #     resB = resA
            return resA
class UpBlock(nn.Module):
    def __init__(self, in_filters, out_filters, kernel_size=(3, 3, 3), indice_key=None, up_key=None):
        super(UpBlock, self).__init__()
        # self.drop_out = drop_out
        # self.trans = nn.ConvTranspose2d(in_filters, out_filters, kernel_size, stride=(2, 2), padding=1)
        self.trans_dilao = conv3x3(in_filters, out_filters, indice_key=indice_key + "new_up")
        self.trans_act = nn.LeakyReLU()
        self.trans_bn = nn.BatchNorm1d(out_filters)

        # self.dropout1 = nn.Dropout3d(p=dropout_rate)
        # self.dropout2 = nn.Dropout3d(p=dropout_rate)

        self.conv1 = conv1x3(out_filters, out_filters, indice_key=indice_key)
        self.act1 = nn.LeakyReLU()
        self.bn1 = nn.BatchNorm1d(out_filters)

        self.conv2 = conv3x1(out_filters, out_filters, indice_key=indice_key)
        self.act2 = nn.LeakyReLU()
        self.bn2 = nn.BatchNorm1d(out_filters)

        self.conv3 = conv3x3(out_filters, out_filters, indice_key=indice_key)
        self.act3 = nn.LeakyReLU()
        self.bn3 = nn.BatchNorm1d(out_filters)
        # self.dropout3 = nn.Dropout3d(p=dropout_rate)

        self.up_subm = spconv.SparseInverseConv3d(out_filters, out_filters, kernel_size=3, indice_key=up_key, bias=False)

    def forward(self, x, skip):
        upA = self.trans_dilao(x)
        # if upA.shape != skip.shape:
        #     upA = F.pad(upA, (0, 1, 0, 1), mode='replicate')
        upA.features = self.trans_act(upA.features)
        upA.features = self.trans_bn(upA.features)

        ## upsample
        upA = self.up_subm(upA)
        # upA = F.interpolate(upA, size=skip.size()[2:], mode='trilinear', align_corners=True)
        # if self.drop_out:
        #     upA = self.dropout1(upA)
        upA.features = upA.features + skip.features
        # if self.drop_out:
        #     upB = self.dropout2(upB)

        upE = self.conv1(upA)
        upE.features = self.act1(upE.features)
        upE.features = self.bn1(upE.features)

        upE = self.conv2(upE)
        upE.features = self.act2(upE.features)
        upE.features = self.bn2(upE.features)

        upE = self.conv3(upE)
        upE.features = self.act3(upE.features)
        upE.features = self.bn3(upE.features)
        # if self.drop_out:
        #     upE = self.dropout3(upE)

        return upE
class ReconBlock(nn.Module):
    def __init__(self, in_filters, out_filters, kernel_size=(3, 3, 3), stride=1, indice_key=None):
        super(ReconBlock, self).__init__()
        self.conv1 = conv3x1x1(in_filters, out_filters, indice_key=indice_key + "bef")
        self.bn0 = nn.BatchNorm1d(out_filters)
        self.act1 = nn.Sigmoid()

        self.conv1_2 = conv1x3x1(in_filters, out_filters, indice_key=indice_key + "bef")
        self.bn0_2 = nn.BatchNorm1d(out_filters)
        self.act1_2 = nn.Sigmoid()

        self.conv1_3 = conv1x1x3(in_filters, out_filters, indice_key=indice_key + "bef")
        self.bn0_3 = nn.BatchNorm1d(out_filters)
        self.act1_3 = nn.Sigmoid()

        # self.conv2 = conv3x1(in_filters, out_filters, indice_key=indice_key+"bef")
        # self.act2 = nn.LeakyReLU()
        # self.bn1 = nn.BatchNorm1d(out_filters)
        #
        # self.conv3 = conv1x3(out_filters, out_filters, indice_key=indice_key+"bef")
        # self.act3 = nn.LeakyReLU()
        # self.bn2 = nn.BatchNorm1d(out_filters)

    def forward(self, x):
        shortcut = self.conv1(x)
        shortcut.features = self.bn0(shortcut.features)
        shortcut.features = self.act1(shortcut.features)

        shortcut2 = self.conv1_2(x)
        shortcut2.features = self.bn0_2(shortcut2.features)
        shortcut2.features = self.act1_2(shortcut2.features)

        shortcut3 = self.conv1_3(x)
        shortcut3.features = self.bn0_3(shortcut3.features)
        shortcut3.features = self.act1_3(shortcut3.features)

        # resA = self.conv2(x)
        # resA.features = self.act2(resA.features)
        # resA.features = self.bn1(resA.features)
        #
        # resA = self.conv3(resA)
        # resA.features = self.act3(resA.features)
        # resA.features = self.bn2(resA.features)

        shortcut.features = shortcut.features + shortcut2.features + shortcut3.features
        shortcut.features = shortcut.features * x.features

        return shortcut
class Spconv_salsaNet_res_cfg(nn.Module):
    def __init__(self, cfg):
        super(Spconv_salsaNet_res_cfg, self).__init__()

        output_shape = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE
        if 'FEATURE_COMPRESSION' in cfg.MODEL.MODEL_FN:
            num_input_features = cfg.MODEL.MODEL_FN.FEATURE_COMPRESSION
        else:
            num_input_features = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE[2]
        nclasses = cfg.DATA_CONFIG.NCLASS
        n_height = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE[2]
        init_size = cfg.MODEL.BACKBONE.INIT_SIZE

        self.nclasses = nclasses
        self.nheight = n_height
        self.strict = False

        sparse_shape = np.array(output_shape)
        # sparse_shape[0] = 11
        self.sparse_shape = sparse_shape

        self.downCntx = ResContextBlock(num_input_features, init_size, indice_key="pre")
        # self.resBlock1 = ResBlock(init_size, init_size, 0.2, pooling=True, height_pooling=True, indice_key="down1")
        self.resBlock2 = ResBlock(init_size, 2 * init_size, 0.2, height_pooling=True, indice_key="down2")
        self.resBlock3 = ResBlock(2 * init_size, 4 * init_size, 0.2, height_pooling=True, indice_key="down3")
        self.resBlock4 = ResBlock(4 * init_size, 8 * init_size, 0.2, pooling=True, height_pooling=False, indice_key="down4")
        self.resBlock5 = ResBlock(8 * init_size, 16 * init_size, 0.2, pooling=True, height_pooling=False, indice_key="down5")
        # self.resBlock6 = ResBlock(16 * init_size, 16 * init_size, 0.2, pooling=False, height_pooling=False, indice_key="down6")

        # self.ReconNet = ReconBlock(16 * init_size, 16 * init_size, indice_key="recon")

        self.upBlock0 = UpBlock(16 * init_size, 16 * init_size, indice_key="up0", up_key="down5")
        self.upBlock1 = UpBlock(16 * init_size, 8 * init_size, indice_key="up1", up_key="down4")
        self.upBlock2 = UpBlock(8 * init_size, 4 * init_size, indice_key="up2", up_key="down3")
        self.upBlock3 = UpBlock(4 * init_size, 2 * init_size, indice_key="up3", up_key="down2")
        # self.upBlock4 = UpBlock(4 * init_size, 2 * init_size, indice_key="up4", up_key="down2")
        # self.upBlock5 = UpBlock(2 * init_size, init_size, indice_key="up5", up_key="down1")

        self.ReconNet = ReconBlock(2 * init_size, 2 * init_size, indice_key="recon")

    def forward(self, voxel_features, coors, batch_size):
        # x = x.contiguous()
        coors = coors.int()
        ret = spconv.SparseConvTensor(voxel_features, coors, self.sparse_shape,
                                      batch_size)
        ret = self.downCntx(ret)
        # down0c, down0b = self.resBlock1(ret)
        down1c, down1b = self.resBlock2(ret)
        down2c, down2b = self.resBlock3(down1c)
        down3c, down3b = self.resBlock4(down2c)
        down4c, down4b = self.resBlock5(down3c)
        # down5b = self.resBlock6(down4c)
        # down6b = self.ReconNet(down5b)

        up4e = self.upBlock0(down4c, down4b)
        up3e = self.upBlock1(up4e, down3b)
        up2e = self.upBlock2(up3e, down2b)
        up1e = self.upBlock3(up2e, down1b)
        up0e = self.ReconNet(up1e)
        up0e.features = torch.cat((up0e.features, up1e.features), 1)  # size 4 * init_size --> OK with the size of the semantic and instance heads

        return up0e, up0e
class Spconv_sem_logits_head_cfg(nn.Module):
    def __init__(self, cfg):
        super(Spconv_sem_logits_head_cfg, self).__init__()

        output_shape = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE
        if 'FEATURE_COMPRESSION' in cfg.MODEL.MODEL_FN:
            num_input_features = cfg.MODEL.MODEL_FN.FEATURE_COMPRESSION
        else:
            num_input_features = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE[2]
        nclasses = cfg.DATA_CONFIG.NCLASS
        n_height = cfg.DATA_CONFIG.DATALOADER.GRID_SIZE[2]
        init_size = cfg.MODEL.BACKBONE.INIT_SIZE

        self.logits = spconv.SubMConv3d(4 * init_size, nclasses, indice_key="logit", kernel_size=3, stride=1, padding=1, bias=True)

    def forward(self, fea):
        logits = self.logits(fea)
        return logits.dense()
class Spconv_ins_offset_concatxyz_threelayers_head_cfg(nn.Module):
    def __init__(self, cfg):
        super(Spconv_ins_offset_concatxyz_threelayers_head_cfg, self).__init__()
        init_size = cfg.MODEL.BACKBONE.INIT_SIZE
        self.pt_fea_dim = 4 * init_size
        self.embedding_dim = cfg.MODEL.INS_HEAD.EMBEDDING_CHANNEL

        self.conv1 = conv3x3(self.pt_fea_dim, self.pt_fea_dim, indice_key='offset_head_conv1')
        self.bn1 = nn.BatchNorm1d(self.pt_fea_dim)
        self.act1 = nn.LeakyReLU()
        self.conv2 = conv3x3(self.pt_fea_dim, 2 * init_size, indice_key='offset_head_conv2')
        self.bn2 = nn.BatchNorm1d(2 * init_size)
        self.act2 = nn.LeakyReLU()
        self.conv3 = conv3x3(2 * init_size, init_size, indice_key='offset_head_conv3')
        self.bn3 = nn.BatchNorm1d(init_size)
        self.act3 = nn.LeakyReLU()

        self.offset = nn.Sequential(
            nn.Linear(init_size + 3, init_size, bias=True),
            nn.BatchNorm1d(init_size),
            nn.ReLU()
        )
        self.offset_linear = nn.Linear(init_size, self.embedding_dim, bias=True)

    def forward(self, fea, batch):
        fea = self.conv1(fea)
        fea.features = self.act1(self.bn1(fea.features))
        fea = self.conv2(fea)
        fea.features = self.act2(self.bn2(fea.features))
        fea = self.conv3(fea)
        fea.features = self.act3(self.bn3(fea.features))

        grid_ind = batch['grid']
        xyz = batch['pt_cart_xyz']
        fea = fea.dense()
        fea = fea.permute(0, 2, 3, 4, 1)
        pt_ins_fea_list = []
        for batch_i, grid_ind_i in enumerate(grid_ind):
            pt_ins_fea_list.append(fea[batch_i, grid_ind[batch_i][:, 0], grid_ind[batch_i][:, 1], grid_ind[batch_i][:, 2]])
        pt_pred_offsets_list = []
        for batch_i, pt_ins_fea in enumerate(pt_ins_fea_list):
            pt_pred_offsets_list.append(self.offset_linear(self.offset(torch.cat([pt_ins_fea, torch.from_numpy(xyz[batch_i]).cuda()], dim=1))))
        return pt_pred_offsets_list, pt_ins_fea_list
class Spconv_alsaNet_res(nn.Module):
    def __init__(self,
                 output_shape,
                 use_norm=True,
                 num_input_features=128,
                 nclasses=20, n_height=32, strict=False, init_size=16):
        super(Spconv_alsaNet_res, self).__init__()
        self.nclasses = nclasses
        self.nheight = n_height
        self.strict = False

        sparse_shape = np.array(output_shape)
        # sparse_shape[0] = 11
        print(sparse_shape)
        self.sparse_shape = sparse_shape

        self.downCntx = ResContextBlock(num_input_features, init_size, indice_key="pre")
        # self.resBlock1 = ResBlock(init_size, init_size, 0.2, pooling=True, height_pooling=True, indice_key="down1")
        self.resBlock2 = ResBlock(init_size, 2 * init_size, 0.2, height_pooling=True, indice_key="down2")
        self.resBlock3 = ResBlock(2 * init_size, 4 * init_size, 0.2, height_pooling=True, indice_key="down3")
        self.resBlock4 = ResBlock(4 * init_size, 8 * init_size, 0.2, pooling=True, height_pooling=False, indice_key="down4")
        self.resBlock5 = ResBlock(8 * init_size, 16 * init_size, 0.2, pooling=True, height_pooling=False, indice_key="down5")
        # self.resBlock6 = ResBlock(16 * init_size, 16 * init_size, 0.2, pooling=False, height_pooling=False, indice_key="down6")

        # self.ReconNet = ReconBlock(16 * init_size, 16 * init_size, indice_key="recon")

        self.upBlock0 = UpBlock(16 * init_size, 16 * init_size, indice_key="up0", up_key="down5")
        self.upBlock1 = UpBlock(16 * init_size, 8 * init_size, indice_key="up1", up_key="down4")
        self.upBlock2 = UpBlock(8 * init_size, 4 * init_size, indice_key="up2", up_key="down3")
        self.upBlock3 = UpBlock(4 * init_size, 2 * init_size, indice_key="up3", up_key="down2")
        # self.upBlock4 = UpBlock(4 * init_size, 2 * init_size, indice_key="up4", up_key="down2")
        # self.upBlock5 = UpBlock(2 * init_size, init_size, indice_key="up5", up_key="down1")

        self.ReconNet = ReconBlock(2 * init_size, 2 * init_size, indice_key="recon")

        self.logits = spconv.SubMConv3d(4 * init_size, nclasses, indice_key="logit", kernel_size=3, stride=1, padding=1, bias=True)

    def forward(self, voxel_features, coors, batch_size):
        # x = x.contiguous()
        coors = coors.int()
        # import pdb
        # pdb.set_trace()
        ret = spconv.SparseConvTensor(voxel_features, coors, self.sparse_shape,
                                      batch_size)
        ret = self.downCntx(ret)
        # down0c, down0b = self.resBlock1(ret)
        down1c, down1b = self.resBlock2(ret)
        down2c, down2b = self.resBlock3(down1c)
        down3c, down3b = self.resBlock4(down2c)
        down4c, down4b = self.resBlock5(down3c)
        # down5b = self.resBlock6(down4c)
        # down6b = self.ReconNet(down5b)

        up4e = self.upBlock0(down4c, down4b)
        up3e = self.upBlock1(up4e, down3b)
        up2e = self.upBlock2(up3e, down2b)
        up1e = self.upBlock3(up2e, down1b)
        up0e = self.ReconNet(up1e)
        up0e.features = torch.cat((up0e.features, up1e.features), 1)
        # up2e = self.upBlock3(up3e, down2b)
        # up1e = self.upBlock4(up2e, down1b)
        # up0e = self.upBlock5(up1e, down0b)
        # up0e_gap = nn.AdaptiveAvgPool3d((1))(up0e)
        # up0e_gap = F.interpolate(up0e_gap, size=(up0e.size()[2:]), mode='trilinear', align_corners=True)
        # up0e = torch.cat((up0e, up0e_gap), dim=1)
        logits = self.logits(up0e)
        y = logits.dense()
        # y = logits.permute(0, 1, 3, 4, 2)
        return y
| 41.588595 | 145 | 0.645495 | 2,725 | 20,420 | 4.615046 | 0.09211 | 0.072281 | 0.032204 | 0.038645 | 0.760814 | 0.708969 | 0.681218 | 0.64806 | 0.626749 | 0.596056 | 0 | 0.043211 | 0.235015 | 20,420 | 490 | 146 | 41.673469 | 0.761859 | 0.166552 | 0 | 0.503356 | 0 | 0 | 0.016119 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.077181 | false | 0 | 0.02349 | 0.02349 | 0.181208 | 0.003356 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90a450c6bb8a1da60bd0c096428df1ba30321115 | 1,565 | py | Python | scripts/slave/recipe_modules/v8/gclient_config.py | bopopescu/chromium-build | f8e42c70146c1b668421ee6358dc550a955770a3 | [
"BSD-3-Clause"
] | null | null | null | scripts/slave/recipe_modules/v8/gclient_config.py | bopopescu/chromium-build | f8e42c70146c1b668421ee6358dc550a955770a3 | [
"BSD-3-Clause"
] | null | null | null | scripts/slave/recipe_modules/v8/gclient_config.py | bopopescu/chromium-build | f8e42c70146c1b668421ee6358dc550a955770a3 | [
"BSD-3-Clause"
] | 1 | 2020-07-22T09:16:32.000Z | 2020-07-22T09:16:32.000Z | # Copyright 2013 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import DEPS
CONFIG_CTX = DEPS['gclient'].CONFIG_CTX
ChromiumGitURL = DEPS['gclient'].config.ChromiumGitURL
@CONFIG_CTX()
def v8(c):
  soln = c.solutions.add()
  soln.name = 'v8'
  soln.url = ChromiumGitURL(c, 'v8', 'v8')
  c.got_revision_reverse_mapping['got_revision'] = 'v8'
  # Needed to get the testers to properly sync the right revision.
  # TODO(infra): Upload full buildspecs for every build to isolate and then use
  # them instead of this gclient garbage.
  c.parent_got_revision_mapping['parent_got_revision'] = 'got_revision'
  p = c.patch_projects
  p['icu'] = ('v8/third_party/icu', 'HEAD')


@CONFIG_CTX(includes=['v8'])
def dynamorio(c):
  soln = c.solutions.add()
  soln.name = 'dynamorio'
  soln.url = ChromiumGitURL(c, 'external', 'dynamorio')


@CONFIG_CTX(includes=['v8'])
def llvm_compiler_rt(c):
  c.solutions[0].custom_deps['v8/third_party/llvm/projects/compiler-rt'] = (
      ChromiumGitURL(c, 'external', 'llvm.org', 'compiler-rt'))


@CONFIG_CTX()
def node_js(c):
  soln = c.solutions.add()
  soln.name = 'node.js'
  soln.url = ChromiumGitURL(c, 'external', 'github.com', 'v8', 'node')
  soln.revision = 'vee-eight-lkgr:HEAD'
  c.got_revision_reverse_mapping['got_node_js_revision'] = soln.name


@CONFIG_CTX(includes=['v8'])
def v8_valgrind(c):
  c.solutions[0].custom_deps['v8/third_party/valgrind'] = (
      ChromiumGitURL(c, 'chromium', 'deps', 'valgrind', 'binaries'))
| 30.686275 | 79 | 0.709904 | 229 | 1,565 | 4.707424 | 0.39738 | 0.058442 | 0.016698 | 0.041744 | 0.306122 | 0.189239 | 0.135436 | 0.06308 | 0.06308 | 0 | 0 | 0.014064 | 0.136741 | 1,565 | 50 | 80 | 31.3 | 0.783864 | 0.212141 | 0 | 0.25 | 0 | 0 | 0.252855 | 0.051387 | 0 | 0 | 0 | 0.02 | 0 | 1 | 0.15625 | false | 0 | 0.03125 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90a5135d7b2c7cb2a555e6f77c99a227c0fdaa11 | 2,386 | py | Python | podcast/download.py | jessstringham/podcasts | 04de6cc5cd7d27ee6ab56c0c7950526b606ec201 | [
"MIT"
] | 1 | 2018-05-08T09:26:45.000Z | 2018-05-08T09:26:45.000Z | podcast/download.py | jessstringham/podcasts | 04de6cc5cd7d27ee6ab56c0c7950526b606ec201 | [
"MIT"
] | null | null | null | podcast/download.py | jessstringham/podcasts | 04de6cc5cd7d27ee6ab56c0c7950526b606ec201 | [
"MIT"
] | 1 | 2020-12-13T18:04:00.000Z | 2020-12-13T18:04:00.000Z | import typing
import urllib.error
import urllib.request
from podcast.files import download_location
from podcast.info import build_info_content
from podcast.info import InfoContent
from podcast.models import Channel
from podcast.models import get_podcast_audio_link
from podcast.models import NewStatus
from podcast.models import Podcast
from podcast.models import Radio
from podcast.models import RadioDirectory
def _download_from_url(url: str, location: str) -> bool:
    try:
        urllib.request.urlretrieve(url, location)
        return True
    except (IOError, urllib.error.ContentTooShortError):
        # If a connection can't be made, IOError is raised.
        # If the download gets interrupted (ContentTooShortError), we
        # should try again later.
        # TODO: can we tell if it was a bad filename (and should stop
        # requesting it), or internet connectivity (and should tell
        # us), or just a fluke (and should retry)?
        return False


def download_podcast(
        directory: RadioDirectory,
        channel: Channel,
        podcast: Podcast) -> Podcast:
    location = download_location(directory, channel, podcast)
    url = get_podcast_audio_link(podcast)

    # TODO: This takes some time, especially when there are a lot to
    # download. I could have this spawn threads, or add priorities,
    # and so on. For now, since it runs every few hours, and is more
    # of a push than a pull situation for the user, I'm leaving it
    # simple.
    success = _download_from_url(url, location)

    if success:
        return podcast._replace(status=NewStatus())
    else:
        return podcast


def download_channel(directory: RadioDirectory, channel: Channel) -> Channel:
    updated_podcasts = []
    for known_podcast in channel.known_podcasts:
        if type(known_podcast.status).__name__ == 'RequestedStatus':
            known_podcast = download_podcast(directory, channel, known_podcast)
        updated_podcasts.append(known_podcast)
    return channel._replace(known_podcasts=updated_podcasts)


def download_radio(radio: Radio) -> typing.Tuple[Radio, InfoContent]:
    downloaded_channels = [
        download_channel(radio.directory, channel)
        for channel in radio.channels
    ]
    radio = radio._replace(channels=downloaded_channels)
    info_content = build_info_content()
    return (radio, info_content)
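The failure contract of `_download_from_url` (return `False` rather than raise, so the caller can retry on the next run) can be exercised without a network by pointing `urlretrieve` at `file://` URLs. A standalone sketch mirroring the helper; the file names are illustrative:

```python
import os
import tempfile
import urllib.error
import urllib.request


def download_from_url(url: str, location: str) -> bool:
    # Same contract as the module's _download_from_url: True on success,
    # False on connection errors or interrupted downloads.
    try:
        urllib.request.urlretrieve(url, location)
        return True
    except (IOError, urllib.error.ContentTooShortError):
        return False


with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "episode.mp3")
    with open(src, "wb") as f:
        f.write(b"audio bytes")
    dst = os.path.join(tmp, "copy.mp3")
    ok = download_from_url("file://" + src, dst)   # existing file: copied, returns True
    missing = download_from_url("file://" + src + ".nope",
                                os.path.join(tmp, "x"))  # URLError (an OSError) -> False
    print(ok, missing)
```

Because a failed download just leaves the podcast in its requested state, the periodic run naturally retries it later with no extra bookkeeping.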
| 33.605634 | 79 | 0.723386 | 302 | 2,386 | 5.569536 | 0.390728 | 0.058859 | 0.060642 | 0.082045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209556 | 2,386 | 70 | 80 | 34.085714 | 0.891835 | 0.228835 | 0 | 0 | 0 | 0 | 0.008206 | 0 | 0 | 0 | 0 | 0.014286 | 0 | 1 | 0.090909 | false | 0 | 0.272727 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90a821eadcd600fc9ceb85786e62d6539b2c7ae3 | 9,603 | py | Python | tools/netconf.py | jpfluger/radiucal | 42666478baaa93da05fdc5ab8f3b53df68b993e6 | [
"BSD-3-Clause"
] | 5 | 2019-12-15T09:47:02.000Z | 2022-03-16T03:18:55.000Z | tools/netconf.py | jpfluger/radiucal | 42666478baaa93da05fdc5ab8f3b53df68b993e6 | [
"BSD-3-Clause"
] | null | null | null | tools/netconf.py | jpfluger/radiucal | 42666478baaa93da05fdc5ab8f3b53df68b993e6 | [
"BSD-3-Clause"
] | 1 | 2021-03-27T08:11:53.000Z | 2021-03-27T08:11:53.000Z | #!/usr/bin/python
"""composes the config from user definitions."""
import argparse
import os
import users
import users.__config__
import importlib
import csv
# file indicators
IND_DELIM = "_"
USER_INDICATOR = "user" + IND_DELIM
VLAN_INDICATOR = "vlan" + IND_DELIM
AUTH_PHASE_ONE = "PEAP"
AUTH_PHASE_TWO = "MSCHAPV2"
class ConfigMeta(object):
    """configuration meta information."""

    def __init__(self):
        """init the instance."""
        self.passwords = []
        self.macs = []
        self.vlans = []
        self.all_vlans = []
        self.user_name = []
        self.vlan_users = []
        self.vlan_initiate = []
        self.extras = []

    def password(self, password):
        """password group validation(s)."""
        if password in self.passwords:
            print("password duplicated")
            exit(-1)
        self.passwords.append(password)

    def extra(self, macs):
        """Limited macs."""
        for mac in macs:
            if mac in self.extras:
                print("mac already known as extra: " + mac)
                exit(-1)
            self.extras.append(mac)

    def user_macs(self, macs):
        """user+mac combos."""
        self.macs = self.macs + macs
        self.macs = list(set(self.macs))

    def verify(self):
        """verify meta data."""
        for mac in self.macs:
            if mac in self.extras:
                print("mac is flagged extra: " + mac)
                exit(-1)
        for mac in self.extras:
            if mac in self.macs:
                print("mac is user assigned: " + mac)
                exit(-1)
        used_vlans = set(self.vlans + self.vlan_initiate)
        if len(used_vlans) != len(set(self.all_vlans)):
            print("unused vlans detected")
            exit(-1)
        for ref in used_vlans:
            if ref not in self.all_vlans:
                print("reference to unknown vlan: " + ref)
                exit(-1)

    def vlan_user(self, vlan, user):
        """indicate a vlan was used."""
        self.vlans.append(vlan)
        self.vlan_users.append(vlan + "." + user)
        self.user_name.append(user)

    def vlan_to_vlan(self, vlan_to):
        """VLAN to VLAN mappings."""
        self.vlan_initiate.append(vlan_to)
def _get_mod(name):
"""import the module dynamically."""
return importlib.import_module("users." + name)
def _load_objs(name, typed):
mod = _get_mod(name)
for key in dir(mod):
obj = getattr(mod, key)
if not isinstance(obj, typed):
continue
yield obj
def _get_by_indicator(indicator):
"""get by a file type indicator."""
return [x for x in sorted(users.__all__) if x.startswith(indicator)]
def _common_call(common, method, entity):
"""make a common mod call."""
obj = entity
if common is not None and method in dir(common):
call = getattr(common, method)
if call is not None:
obj = call(obj)
return obj
def check_object(obj):
"""Check an object."""
return obj.check()
def _process(output):
"""process the composition of users."""
common_mod = None
try:
common_mod = _get_mod("common")
print("loaded common definitions...")
except Exception as e:
print("defaults only...")
vlans = None
meta = ConfigMeta()
for v_name in _get_by_indicator(VLAN_INDICATOR):
print("loading vlan..." + v_name)
for obj in _load_objs(v_name, users.__config__.VLAN):
if vlans is None:
vlans = {}
if not check_object(obj):
exit(-1)
num_str = str(obj.num)
for vk in vlans.keys():
if num_str == vlans[vk]:
print("vlan number defined multiple times...")
exit(-1)
vlans[obj.name] = num_str
if obj.initiate is not None and len(obj.initiate) > 0:
for init_to in obj.initiate:
meta.vlan_to_vlan(init_to)
if vlans is None:
raise Exception("missing required config settings...")
meta.all_vlans = vlans.keys()
store = Store()
for f_name in _get_by_indicator(USER_INDICATOR):
print("composing..." + f_name)
for obj in _load_objs(f_name, users.__config__.Assignment):
obj = _common_call(common_mod, 'ready', obj)
key = f_name.replace(USER_INDICATOR, "")
if not key.isalnum():
print("does not meet naming requirements...")
exit(-1)
vlan = obj.vlan
if vlan not in vlans:
raise Exception("no vlan defined for " + key)
store.add_vlan(vlan, vlans[vlan])
meta.vlan_user(vlan, key)
fqdn = vlan + "." + key
if not check_object(obj):
print("did not pass check...")
exit(-1)
if obj.disabled:
print("account is disabled")
continue
macs = sorted(obj.macs)
password = obj.password
bypassed = sorted(obj.bypassed())
owned = sorted(obj.owns)
# meta checks
meta.user_macs(macs)
if not obj.inherits:
meta.password(password)
meta.extra(bypassed)
meta.extra(owned)
store.add_user(fqdn, macs, password)
if obj.mab_only:
store.set_mab(fqdn)
if len(bypassed) > 0:
for m in bypassed:
store.add_mab(m, obj.bypass_vlan(m))
user_all = []
for l in [obj.macs, obj.owns, bypassed]:
user_all += list(l)
store.add_audit(fqdn, sorted(set(user_all)))
meta.verify()
# audit outputs
with open(output + "audit.csv", 'w') as f:
csv_writer = csv.writer(f, lineterminator=os.linesep)
for a in sorted(store.get_tag(store.audit)):
p = a[0].split(".")
for m in a[1]:
csv_writer.writerow([p[1], p[0], m])
# eap_users and preauth
manifest = []
with open(output + "eap_users", 'w') as f:
for u in store.get_eap_user():
f.write('"{}" {}\n\n'.format(u[0], AUTH_PHASE_ONE))
f.write('"{}" {} hash:{} [2]\n'.format(u[0], AUTH_PHASE_TWO, u[1]))
write_vlan(f, u[2])
for u in store.get_eap_mab():
up = u[0].upper()
f.write('"{}" MD5 "{}"\n'.format(up, up))
write_vlan(f, u[1])
manifest.append((u[0], u[0]))
for u in store.get_tag(store.umac):
manifest.append((u[0], u[1]))
with open(output + "manifest", 'w') as f:
for m in sorted(manifest):
f.write("{}.{}\n".format(m[0], m[1]).lower())
def write_vlan(f, vlan_id):
"""Write vlan assignment for login."""
f.write('radius_accept_attr=64:d:13\n')
f.write('radius_accept_attr=65:d:6\n')
f.write('radius_accept_attr=81:s:{}\n\n'.format(vlan_id))
class Store(object):
"""Storage object."""
def __init__(self):
"""Init the instance."""
self._data = []
self.umac = "UMAC"
self.pwd = "PWD"
self.mac = "MAC"
self.audit = "AUDIT"
self._users = []
self._mab = []
self._macs = []
self._vlans = {}
def set_mab(self, username):
"""Set a user as MAB-only, no login set."""
self._mab.append(username)
def get_tag(self, tag):
"""Get tagged items."""
for item in self._data:
if item[0] == tag:
yield item[1:]
def add_vlan(self, vlan_name, vlan_id):
"""Add a vlan item."""
self._vlans[vlan_name] = vlan_id
def _add(self, tag, key, value):
"""Backing tagged add."""
self._data.append([tag, key, value])
def add_user(self, username, macs, password):
"""Add a user definition."""
if username in self._users:
raise Exception("{} already defined".format(username))
self._users.append(username)
for m in macs:
self._add(self.umac, username, m)
self._add(self.pwd, username, password)
def add_mab(self, mac, vlan):
"""Add a MAB."""
if mac in self._macs:
raise Exception("{} already defined".format(mac))
self._macs.append(mac)
self._add(self.mac, mac, vlan)
def add_audit(self, user, objs):
"""Add an audit entry."""
self._add(self.audit, user, objs)
def get_eap_mab(self):
"""Get eap entries for MAB."""
for m in self.get_tag(self.mac):
v = m[1]
if not isinstance(v, int):
v = self._get_vlan(v)
yield [m[0], v]
def get_eap_user(self):
"""Get eap users."""
for u in self.get_tag(self.pwd):
if u[0] in self._mab:
continue
vlan = u[0].split(".")[0]
yield [u[0], u[1], self._get_vlan(vlan)]
def _get_vlan(self, name):
"""Get vlans."""
return self._vlans[name]
def main():
"""main entry."""
success = False
try:
parser = argparse.ArgumentParser()
parser.add_argument("--output", type=str, required=True)
args = parser.parse_args()
_process(args.output)
success = True
except Exception as e:
print('unable to compose')
print(str(e))
if success:
print("success")
exit(0)
else:
print("failure")
exit(1)
if __name__ == "__main__":
main()
| 30.389241 | 79 | 0.53754 | 1,229 | 9,603 | 4.042311 | 0.184703 | 0.0157 | 0.01087 | 0.008857 | 0.12037 | 0.055153 | 0.023752 | 0.011675 | 0 | 0 | 0 | 0.007927 | 0.330001 | 9,603 | 315 | 80 | 30.485714 | 0.764221 | 0.07456 | 0 | 0.103734 | 0 | 0 | 0.078472 | 0.009723 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107884 | false | 0.070539 | 0.029046 | 0 | 0.165975 | 0.074689 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90a9c694ad7055aeb7e214346c75ba596c28d602 | 3,673 | py | Python | twitter_scrapper.py | juanlucruz/SportEventLocator | 1ac8236f9fdd60917b9a7ee6bb6ca1fa5f6fa71e | [
"Apache-2.0"
] | null | null | null | twitter_scrapper.py | juanlucruz/SportEventLocator | 1ac8236f9fdd60917b9a7ee6bb6ca1fa5f6fa71e | [
"Apache-2.0"
] | null | null | null | twitter_scrapper.py | juanlucruz/SportEventLocator | 1ac8236f9fdd60917b9a7ee6bb6ca1fa5f6fa71e | [
"Apache-2.0"
] | null | null | null | # Import the Twython class
from twython import Twython, TwythonStreamer
import json
# import pandas as pd
import csv
import datetime


def process_tweet(tweet):
    # Filter out unwanted data
    d = {}
    d['hashtags'] = [hashtag['text'] for hashtag in tweet['entities']['hashtags']]
    try:
        for key in {
                'created_at', 'id', 'text', 'source', 'truncated',
                'in_reply_to_status_id', 'in_reply_to_user_id',
                'in_reply_to_screen_name', 'user', 'coordinates',
                'place', 'quoted_status_id', 'is_quote_status', 'quoted_status',
                'retweeted_status', 'quote_count', 'reply_count', 'retweet_count',
                'favorite_count', 'favorited', 'retweeted', 'entities', 'extended_entities',
                'possibly_sensitive', 'filter_level', 'lang', 'matching_rules'}:
            if key == 'user':
                pass
            elif key == 'place':
                pass
            elif key == 'quoted_status' or key == 'retweeted_status':
                pass
            elif key == 'entities':
                pass
            elif key == 'extended_entities':
                pass
            else:
                d[key] = tweet[key]
    except KeyError as e:
        pass
    # d['text'] = tweet['text']
    # d['user'] = tweet['user']['screen_name']
    # d['user_loc'] = tweet['user']['location']
    # d['date'] = tweet['created_at']
    return d


# Create a class that inherits TwythonStreamer
class MyStreamer(TwythonStreamer):

    # Received data
    def on_success(self, data):
        # # Only collect tweets in English
        # if data['lang'] == 'en':
        #     tweet_data = process_tweet(data)
        print(datetime.datetime.now())
        # self.save_to_csv(tweet_data)
        self.save_to_json(data)

    # Problem with the API
    def on_error(self, status_code, data):
        print(status_code, data)
        self.disconnect()

    # Save each tweet to csv file
    def save_to_csv(self, tweet):
        # with open(r'saved_tweets.csv', 'a') as out_file:
        with open(r'saved_tweets_big.csv', 'a') as out_file:
            writer = csv.writer(out_file)
            writer.writerow(list(tweet.values()))

    def save_to_json(self, tweet):
        with open('saved_tweets_big.json', 'a') as out_file:
            json.dump(tweet, out_file)


def main():
    # Load credentials from json file
    with open("twitter_credentials.json", "r") as tw_creds:
        creds = json.load(tw_creds)

    # Instantiate an object
    # python_tweets = Twython(creds['CONSUMER_KEY'], creds['CONSUMER_SECRET'])

    # Instantiate from our streaming class
    stream = MyStreamer(creds['CONSUMER_KEY'], creds['CONSUMER_SECRET'],
                        creds['ACCESS_TOKEN'], creds['ACCESS_SECRET'])

    # Start the stream
    # stream.statuses.filter(track='madrid')
    stream.statuses.filter(locations='-7.876154,37.460012,3.699873,43.374723')

    # # Create our query
    # query = {
    #     'q': 'futbol',
    #     'result_type': 'mixed',
    #     'lang': 'es',
    #     'count': '100',
    # }
    #
    # dict_ = {'user': [], 'date': [], 'text': [], 'favorite_count': []}
    # for status in python_tweets.search(**query)['statuses']:
    #     print(format(status))
    #     dict_['user'].append(status['user']['screen_name'])
    #     dict_['date'].append(status['created_at'])
    #     dict_['text'].append(status['text'])
    #     dict_['favorite_count'].append(status['favorite_count'])
    #
    # df = pd.DataFrame(dict_)
    # df.sort_values(by='favorite_count', inplace=True, ascending=False)
    # print(df.values)


if __name__ == "__main__":
    main()
| 33.390909 | 88 | 0.58263 | 429 | 3,673 | 4.773893 | 0.354312 | 0.031738 | 0.021484 | 0.014648 | 0.066406 | 0.03418 | 0 | 0 | 0 | 0 | 0 | 0.012327 | 0.271168 | 3,673 | 109 | 89 | 33.697248 | 0.752708 | 0.34985 | 0 | 0.113208 | 0 | 0 | 0.243601 | 0.054181 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113208 | false | 0.113208 | 0.075472 | 0 | 0.226415 | 0.037736 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90ab4c6f6273b660fe6334ebc9b6fb8fce97ce8e | 868 | py | Python | 2020/day04/day4_part1.py | dstjacques/AdventOfCode | 75bfb46a01487430d552ea827f0cf8ae3368f686 | [
"MIT"
] | null | null | null | 2020/day04/day4_part1.py | dstjacques/AdventOfCode | 75bfb46a01487430d552ea827f0cf8ae3368f686 | [
"MIT"
] | null | null | null | 2020/day04/day4_part1.py | dstjacques/AdventOfCode | 75bfb46a01487430d552ea827f0cf8ae3368f686 | [
"MIT"
] | null | null | null | input = """
ecl:gry pid:860033327 eyr:2020 hcl:#fffffd
byr:1937 iyr:2017 cid:147 hgt:183cm
iyr:2013 ecl:amb cid:350 eyr:2023 pid:028048884
hcl:#cfa07d byr:1929
hcl:#ae17e1 iyr:2013
eyr:2024
ecl:brn pid:760753108 byr:1931
hgt:179cm
hcl:#cfa07d eyr:2025 pid:166559648
iyr:2011 ecl:brn hgt:59in
"""
def validate(passport):
    passport_fields = { "byr": False, "iyr": False, "eyr": False, "hgt": False, "hcl": False, "ecl": False, "pid": False }
    for line in passport.split("\n"):
        values = line.split(" ")
        for value in values:
            field = value.split(":")[0]
            if field == "cid":
                continue
            passport_fields[field] = True
    if False in passport_fields.values():
        return False
    return True


count = 0
for i in input.strip().split("\n\n"):
    if validate(i):
        count += 1
print(count) | 25.529412 | 122 | 0.615207 | 128 | 868 | 4.148438 | 0.445313 | 0.079096 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157576 | 0.239631 | 868 | 34 | 123 | 25.529412 | 0.64697 | 0 | 0 | 0 | 0 | 0 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0.172414 | 0 | 0 | 0.103448 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90af463579adb14e899b746a24caf95a35d80b1b | 3,017 | py | Python | flumine/markets/market.py | jsphon/flumine | bd5cacf9793d53a99595fe4694aeb9b8d2962abb | [
"MIT"
] | null | null | null | flumine/markets/market.py | jsphon/flumine | bd5cacf9793d53a99595fe4694aeb9b8d2962abb | [
"MIT"
] | null | null | null | flumine/markets/market.py | jsphon/flumine | bd5cacf9793d53a99595fe4694aeb9b8d2962abb | [
"MIT"
] | null | null | null | import datetime
import logging
from typing import Optional

from betfairlightweight.resources.bettingresources import MarketBook, MarketCatalogue

from .blotter import Blotter
from ..events import events

logger = logging.getLogger(__name__)


class Market:
    def __init__(
        self,
        flumine,
        market_id: str,
        market_book: MarketBook,
        market_catalogue: MarketCatalogue = None,
    ):
        self.flumine = flumine
        self.market_id = market_id
        self.closed = False
        self.date_time_closed = None
        self.market_book = market_book
        self.market_catalogue = market_catalogue
        self.context = {"simulated": {}}  # data store (raceCard / scores etc)
        self.blotter = Blotter(self)

    def __call__(self, market_book: MarketBook):
        self.market_book = market_book

    def open_market(self) -> None:
        self.closed = False

    def close_market(self) -> None:
        self.closed = True
        self.date_time_closed = datetime.datetime.utcnow()

    # order
    def place_order(self, order, execute: bool = True) -> None:
        order.place(self.market_book.publish_time)
        if order.id not in self.blotter:
            self.blotter[order.id] = order
            if order.trade.market_notes is None:
                order.trade.update_market_notes(self.market_book)
            self.flumine.log_control(events.TradeEvent(order.trade))  # todo dupes?
        else:
            return  # retry attempt so ignore?
        if execute:  # handles replaceOrder
            self.blotter.pending_place.append(order)

    def cancel_order(self, order, size_reduction: float = None) -> None:
        order.cancel(size_reduction)
        self.blotter.pending_cancel.append(order)

    def update_order(self, order, new_persistence_type: str) -> None:
        order.update(new_persistence_type)
        self.blotter.pending_update.append(order)

    def replace_order(self, order, new_price: float) -> None:
        order.replace(new_price)
        self.blotter.pending_replace.append(order)

    @property
    def event_type_id(self) -> str:
        if self.market_book:
            return self.market_book.market_definition.event_type_id

    @property
    def event_id(self) -> str:
        if self.market_book:
            return self.market_book.market_definition.event_id

    @property
    def seconds_to_start(self):
        return (self.market_start_datetime - datetime.datetime.utcnow()).total_seconds()

    @property
    def elapsed_seconds_closed(self) -> Optional[float]:
        if self.closed and self.date_time_closed:
            return (datetime.datetime.utcnow() - self.date_time_closed).total_seconds()

    @property
    def market_start_datetime(self):
        if self.market_catalogue:
            return self.market_catalogue.market_start_time
        elif self.market_book:
            return self.market_book.market_definition.market_time
        else:
            return datetime.datetime.utcfromtimestamp(0)
| 33.153846 | 88 | 0.670534 | 360 | 3,017 | 5.377778 | 0.252778 | 0.082645 | 0.079545 | 0.051653 | 0.143595 | 0.094008 | 0.094008 | 0.094008 | 0.094008 | 0.068182 | 0 | 0.000439 | 0.244614 | 3,017 | 90 | 89 | 33.522222 | 0.849057 | 0.032483 | 0 | 0.178082 | 0 | 0 | 0.00309 | 0 | 0 | 0 | 0 | 0.011111 | 0 | 1 | 0.178082 | false | 0 | 0.082192 | 0.013699 | 0.383562 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90b264bddefd9c5d8b81c5073da1b99d48704da6 | 2,228 | py | Python | scripts/naive_search.py | simonbowly/lp-generators | 937c44074c234333b6a5408c3e18f498c2205948 | [
"MIT"
] | 9 | 2020-01-02T23:07:36.000Z | 2022-01-26T10:04:04.000Z | scripts/naive_search.py | simonbowly/lp-generators | 937c44074c234333b6a5408c3e18f498c2205948 | [
"MIT"
] | null | null | null | scripts/naive_search.py | simonbowly/lp-generators | 937c44074c234333b6a5408c3e18f498c2205948 | [
"MIT"
] | 1 | 2020-01-02T23:08:26.000Z | 2020-01-02T23:08:26.000Z |
import itertools
import multiprocessing
import json

import numpy as np
from tqdm import tqdm

from lp_generators.features import coeff_features, solution_features
from lp_generators.performance import clp_simplex_performance
from search_operators import lp_column_neighbour, lp_row_neighbour
from seeds import cli_seeds
from search_common import condition, objective, start_instance


def calculate_features(instance):
    return dict(
        **coeff_features(instance),
        **solution_features(instance))


def generate_by_search(seed):
    results = []
    pass_condition = 0
    step_change = 0
    random_state = np.random.RandomState(seed)
    current_instance = start_instance(random_state)
    current_features = calculate_features(current_instance)
    for step in range(10001):
        if (step % 100) == 0:
            results.append(dict(
                **coeff_features(current_instance),
                **solution_features(current_instance),
                **clp_simplex_performance(current_instance),
                pass_condition=pass_condition,
                step_change=step_change,
                step=step, seed=seed))
        if (step % 2) == 0:
            new_instance = lp_row_neighbour(random_state, current_instance, 1)
        else:
            new_instance = lp_column_neighbour(random_state, current_instance, 1)
        new_features = calculate_features(new_instance)
        if condition(new_features):
            pass_condition += 1
            if objective(new_features) < objective(current_features):
                step_change += 1
                current_instance = new_instance
                current_features = new_features
    return results


@cli_seeds
def run(seed_values):
    ''' Generate the required number of instances and store feature results. '''
    pool = multiprocessing.Pool()
    mapper = pool.imap_unordered
    print('Generating instances by naive search.')
    features = list(tqdm(
        mapper(generate_by_search, seed_values),
        total=len(seed_values), smoothing=0))
    features = list(itertools.chain(*features))
    with open('data/naive_search.json', 'w') as outfile:
        json.dump(features, outfile, indent=4, sort_keys=True)


run()
| 33.253731 | 81 | 0.685817 | 259 | 2,228 | 5.629344 | 0.343629 | 0.082305 | 0.037037 | 0.027435 | 0.049383 | 0.049383 | 0 | 0 | 0 | 0 | 0 | 0.01117 | 0.236535 | 2,228 | 66 | 82 | 33.757576 | 0.845973 | 0.030521 | 0 | 0 | 0 | 0 | 0.027894 | 0.010228 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.055556 | 0.185185 | 0.018519 | 0.277778 | 0.018519 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90c531d029592c14df121556138eab86864faa16 | 2,927 | py | Python | user/forms.py | Zidan-Kharisma-Sakana/uts-f02 | d29652cb73829ffa63e0ca4d0e5f8d6d62500367 | [
"Unlicense"
] | null | null | null | user/forms.py | Zidan-Kharisma-Sakana/uts-f02 | d29652cb73829ffa63e0ca4d0e5f8d6d62500367 | [
"Unlicense"
] | null | null | null | user/forms.py | Zidan-Kharisma-Sakana/uts-f02 | d29652cb73829ffa63e0ca4d0e5f8d6d62500367 | [
"Unlicense"
] | null | null | null | from django import forms
from django.contrib.auth import authenticate
from django.contrib.auth.models import User
from django.contrib.auth.forms import AuthenticationForm, UserCreationForm
from django.forms import ValidationError, EmailField
from user import models
class MyAuthenticationForm(AuthenticationForm):
""""
Overide method clean from AuthenticationForm to show that a user hasn't activate their account
"""
error_messages = {
'invalid_login': (
"Please enter a correct %(username)s and password. Note that both "
"fields may be case-sensitive."
),
'inactive': ("This Account hasn't been activated yet, Please check your email :)"),
}
def confirm_login_allowed(self, user):
if not user.is_active:
raise ValidationError(
self.error_messages['inactive'],
code='inactive',
)
def clean(self):
username = self.cleaned_data.get('username')
password = self.cleaned_data.get('password')
if username is not None and password:
self.user_cache = authenticate(self.request, username=username, password=password)
if self.user_cache is None:
print(username)
try:
user_temp = User.objects.get(username=username)
except:
user_temp = None
print(user_temp)
if user_temp is not None:
self.confirm_login_allowed(user_temp)
else:
raise ValidationError(
self.error_messages['invalid_login'],
code='invalid_login',
params={'username': self.username_field.verbose_name},
)
return self.cleaned_data
class CreateUserForm(UserCreationForm):
""""
Override UserCreationForm to include email field
"""
email = EmailField(required=True, label='Email')
class Meta:
model = User
fields = ("username", "email", "password1", "password2")
error_messages = {
'password_mismatch': ('The two password fields didn’t match.'),
'email_taken': 'Your email has been taken'
}
def clean_email(self):
"""
Check if the email had already been taken
"""
email = self.cleaned_data.get('email')
num = User.objects.filter(email=email)
if num.count() > 0:
raise ValidationError(
self.error_messages['email_taken'],
code='email_taken',
)
return email
def save(self, commit= True):
user = super(CreateUserForm, self).save(commit=False)
email = self.cleaned_data.get('email')
user.email = email
user.is_active=False
if commit:
user.save()
return user
| 32.522222 | 98 | 0.585924 | 306 | 2,927 | 5.496732 | 0.349673 | 0.029727 | 0.04459 | 0.042806 | 0.099287 | 0.033294 | 0 | 0 | 0 | 0 | 0 | 0.00152 | 0.325589 | 2,927 | 89 | 99 | 32.88764 | 0.850557 | 0.064571 | 0 | 0.104478 | 0 | 0 | 0.151154 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.059701 | false | 0.089552 | 0.089552 | 0 | 0.283582 | 0.029851 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90c806da308d72b961e15453707e27dd9643ad2b | 1,897 | py | Python | code/diva_evaluation_cli/bin/commands/actev_get_system_subcommands/git_command.py | wenhel/Argus | 39768a8d1671eb80f86bbd67e58478a4cbdcdeca | [
"MIT"
] | 4 | 2019-06-28T23:27:43.000Z | 2021-09-27T03:17:58.000Z | code/diva_evaluation_cli/bin/commands/actev_get_system_subcommands/git_command.py | wenhel/Argus | 39768a8d1671eb80f86bbd67e58478a4cbdcdeca | [
"MIT"
] | 2 | 2020-01-16T19:39:44.000Z | 2021-02-24T22:45:37.000Z | code/diva_evaluation_cli/bin/commands/actev_get_system_subcommands/git_command.py | wenhel/Argus | 39768a8d1671eb80f86bbd67e58478a4cbdcdeca | [
"MIT"
] | 1 | 2019-09-09T07:40:45.000Z | 2019-09-09T07:40:45.000Z | """Actev module: get-system git
Actev modules are used to parse actev commands in order to get arguments
before calling associated entry point methods to execute systems.
Warning: this file should not be modified: see src/entry_points to add your source code.
"""
from diva_evaluation_cli.bin.commands.actev_command import ActevCommand


class ActevGetSystemGit(ActevCommand):
    """Clones a git repository

    Command Args:

    * location or l: path to store the system
    * user or U: url to get the system
    * password or p: password to access the url
    * token or t: token to access the url
    * install-cli or i: install the cli to use it
    """

    def __init__(self):
        super(ActevGetSystemGit, self).__init__('git', "get_git.sh")

    def cli_parser(self, arg_parser):
        """Configure the description and the arguments (positional and optional) to parse.

        Args:
            arg_parser(:obj:`ArgParser`): Python arg parser to describe how to parse the command

        """
        arg_parser.description = "Downloads a git repository"
        required_named = arg_parser.add_argument_group('required named arguments')
        arg_parser.add_argument("-U", "--user", help="username to access the url")
        arg_parser.add_argument("-p", "--password", help="password to access the url "
                                "Warning: if password starts with \'-\', use this: --password=<your password>")
        arg_parser.add_argument("-l", "--location", help="path to store the system")
        arg_parser.add_argument("-t", "--token", help="token to access the url "
                                "Warning: if token starts with \'-\', use this: --token=<your token>",
                                type=str)
        arg_parser.add_argument("-i", "--install-cli", help="install the cli to use it", action='store_true')
| 43.113636 | 111 | 0.641012 | 251 | 1,897 | 4.717131 | 0.394422 | 0.076014 | 0.060811 | 0.101351 | 0.152027 | 0.072635 | 0 | 0 | 0 | 0 | 0 | 0 | 0.256194 | 1,897 | 43 | 112 | 44.116279 | 0.839121 | 0.377965 | 0 | 0 | 0 | 0 | 0.358696 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0.133333 | 0.066667 | 0 | 0.266667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
90c8e1c41e29404a0d0d0511d6b46db43890fb89 | 4,106 | py | Python | test/test_markdown_parser.py | Asana/SGTM | 0e9e236980ed68e80e021470da6374945bbac501 | [
"MIT"
] | 8 | 2020-12-05T00:13:03.000Z | 2022-01-11T11:35:51.000Z | test/test_markdown_parser.py | Asana/SGTM | 0e9e236980ed68e80e021470da6374945bbac501 | [
"MIT"
] | 12 | 2020-12-14T18:21:21.000Z | 2022-03-29T17:06:20.000Z | test/test_markdown_parser.py | Asana/SGTM | 0e9e236980ed68e80e021470da6374945bbac501 | [
"MIT"
] | 2 | 2021-06-27T09:32:55.000Z | 2022-02-27T23:17:36.000Z | import unittest
from html import escape

from src.markdown_parser import convert_github_markdown_to_asana_xml


class TestConvertGithubMarkdownToAsanaXml(unittest.TestCase):
    def test_basic_markdown(self):
        md = """~~strike~~ **bold** _italic_ `code` [link](asana.com)"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml,
            '<s>strike</s> <strong>bold</strong> <em>italic</em> <code>code</code> <a href="asana.com">link</a>\n',
        )

    def test_ul_tag(self):
        md = """* bullet one\n* bullet two"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml, """<ul>\n<li>bullet one</li>\n<li>bullet two</li>\n</ul>\n""",
        )

    def test_ol_tag(self):
        md = """1. bullet one\n2. bullet two"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml, """<ol>\n<li>bullet one</li>\n<li>bullet two</li>\n</ol>\n""",
        )

    def test_paragraph(self):
        md = "we don't wrap random text in p tags"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(md + "\n", xml)

    def test_block_quote(self):
        md = "> block quote"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, "<em>> block quote\n</em>")

    def test_horizontal_rule(self):
        # Asana doesn't support <hr /> tags, so we just ignore them
        md = "hello\n\n---\nworld\n"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, md)  # unchanged

    def test_auto_linking(self):
        md = "https://asana.com/ [still works](www.test.com)"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml,
            '<a href="https://asana.com/">https://asana.com/</a> <a href="www.test.com">still works</a>\n',
        )

    def test_converts_headings_to_bold(self):
        md = "## heading"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, "\n<b>heading</b>\n")

    def test_nested_code_within_block_quote(self):
        md = "> abc `123`"
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, "<em>> abc <code>123</code>\n</em>")

    def test_removes_pre_tags_inline(self):
        md = """```test```"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, "<code>test</code>\n")

    def test_removes_pre_tags_block(self):
        md = """see:
```
function foo = () => null;
```
"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, "see:\n<code>function foo = () => null;\n</code>\n")

    def test_escapes_raw_html_mixed_with_markdown(self):
        md = """## <img href="link" />still here <h3>header</h3>"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml,
            "\n<b>"
            + escape('<img href="link" />')
            + "still here "
            + escape("<h3>header</h3>")
            + "</b>\n",
        )

    def test_escapes_raw_html_on_own_lines(self):
        md = """## blah blah blah
<img href="link">
still here <h3>header</h3>"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml,
            "\n<b>blah blah blah</b>\n"
            + escape('<img href="link">\n')
            + "still here "
            + escape("<h3>header</h3>"),
        )

    def test_escapes_raw_html(self):
        md = """<img href="link" />still here <h3>header</h3>"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(
            xml,
            escape('<img href="link" />') + "still here " + escape("<h3>header</h3>\n"),
        )

    def test_removes_images(self):
        md = """![image](https://image.com)"""
        xml = convert_github_markdown_to_asana_xml(md)
        self.assertEqual(xml, '<a href="https://image.com">image</a>\n')


if __name__ == "__main__":
    from unittest import main as run_tests

    run_tests()
| 33.933884 | 115 | 0.585485 | 550 | 4,106 | 4.116364 | 0.198182 | 0.091873 | 0.14841 | 0.162544 | 0.563163 | 0.535336 | 0.491166 | 0.491166 | 0.491166 | 0.491166 | 0 | 0.006545 | 0.255723 | 4,106 | 120 | 116 | 34.216667 | 0.734293 | 0.016318 | 0 | 0.313131 | 0 | 0.040404 | 0.285183 | 0.028989 | 0 | 0 | 0 | 0 | 0.151515 | 1 | 0.151515 | false | 0 | 0.040404 | 0 | 0.20202 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90d9a5726836680355d0f136ca02e9d3ff263f57 | 1,087 | py | Python | modcma/__main__.py | IOHprofiler/ModularCMAES | 5ae3310d68b7e2bc37ef10de07945e89c16d6654 | [
"MIT"
] | 2 | 2021-04-08T06:16:21.000Z | 2022-01-25T18:18:51.000Z | modcma/__main__.py | IOHprofiler/ModularCMAES | 5ae3310d68b7e2bc37ef10de07945e89c16d6654 | [
"MIT"
] | 3 | 2020-11-16T15:24:53.000Z | 2021-11-10T10:27:50.000Z | modcma/__main__.py | IOHprofiler/ModularCMAES | 5ae3310d68b7e2bc37ef10de07945e89c16d6654 | [
"MIT"
] | 2 | 2021-01-13T15:36:46.000Z | 2021-04-08T06:24:25.000Z | """Allows the user to call the library as a cli-module."""
from argparse import ArgumentParser
from .modularcmaes import evaluate_bbob
parser = ArgumentParser(description="Run single function CMAES")
parser.add_argument(
"-f", "--fid", type=int, help="bbob function id", required=False, default=5
)
parser.add_argument(
"-d", "--dim", type=int, help="dimension", required=False, default=5
)
parser.add_argument(
"-i",
"--iterations",
type=int,
help="number of iterations per agent",
required=False,
default=50,
)
parser.add_argument(
"-l", "--logging", required=False, action="store_true", default=False
)
parser.add_argument("-L", "--label", type=str, required=False, default="")
parser.add_argument("-s", "--seed", type=int, required=False, default=42)
parser.add_argument("-p", "--data_folder", type=str, required=False)
parser.add_argument("-a", "--arguments", nargs="+", required=False)
args = vars(parser.parse_args())
for arg in args.pop("arguments") or []:
# pylint: disable=exec-used
exec(arg, None, args)
evaluate_bbob(**args)
| 29.378378 | 79 | 0.689052 | 146 | 1,087 | 5.041096 | 0.5 | 0.097826 | 0.184783 | 0.057065 | 0.103261 | 0.103261 | 0.103261 | 0 | 0 | 0 | 0 | 0.006369 | 0.133395 | 1,087 | 36 | 80 | 30.194444 | 0.774947 | 0.072677 | 0 | 0.142857 | 0 | 0 | 0.183633 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
90decc71935f62f946a40921c43b3f8580f075de | 2,398 | py | Python | setup.py | medchemfi/sdfconf | 81b1ed383c1d4b3e633fdc555e4027091226b025 | [
"MIT"
] | 6 | 2021-12-27T07:55:16.000Z | 2022-01-26T04:36:53.000Z | setup.py | medchemfi/sdfconf | 81b1ed383c1d4b3e633fdc555e4027091226b025 | [
"MIT"
] | null | null | null | setup.py | medchemfi/sdfconf | 81b1ed383c1d4b3e633fdc555e4027091226b025 | [
"MIT"
] | 3 | 2022-01-06T13:54:48.000Z | 2022-01-26T04:36:54.000Z | # -*- coding: utf-8 -*-
from setuptools import setup, find_packages
import re
import os


def read(fname):
    return open(os.path.join(os.path.dirname(__file__), fname)).read()


VERSIONFILE = "src/sdfconf/_version.py"
with open(VERSIONFILE, "rt") as vf:
    VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
    for line in vf:
        mo = re.search(VSRE, line, re.M)
        if mo:
            verstr = mo.group(1)
            break
    if not mo:
        raise RuntimeError("Unable to find version string in %s." % (VERSIONFILE,))

setup(name = 'sdfconf',
      version = verstr,
      description = ("Diverse manipulation and analysis tool for .sdf files."),
      long_description = read('README.rst'),
      install_requires = ['numpy>=1.7.1', 'matplotlib>=1.4.2'],
      author = 'Sakari Lätti',
      author_email = 'sakari.latti@jyu.fi',
      maintainer = 'Sakari Lätti',
      maintainer_email = 'sakari.latti@jyu.fi',
      packages = ['sdfconf'],
      package_dir = {'sdfconf': 'src/sdfconf'},
      keywords = 'sdf mol2 conformation analyze histogram',
      url = 'http://users.jyu.fi/~pentikai/',
      license = 'MIT/expat',
      entry_points =
          {'console_scripts': ['sdfconf = sdfconf.runner:main'],
           'setuptools.installation': ['eggsecutable = sdfconf.runner:main', ],
           },
      classifiers = ['Development Status :: 4 - Beta',
                     'Environment :: Console',
                     'Intended Audience :: Science/Research',
                     'License :: OSI Approved :: MIT License',
                     'Natural Language :: English',
                     'Operating System :: OS Independent',
                     'Programming Language :: Python :: 2.7',
                     #'Programming Language :: Python :: 3',
                     'Topic :: Scientific/Engineering :: Bio-Informatics',
                     'Topic :: Scientific/Engineering :: Chemistry',
                     'Topic :: Software Development :: Libraries',
                     ],
      ##FIXME
      #'''
      #package_data = {
      #    'sample': ['sample_data.sdf']
      #    },
      #'''
      )
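The version scan above hinges on a single regular expression applied line by line. As a quick illustration of what it matches (the sample file contents here are invented for the demo):

```python
import re

# Same pattern as in setup.py: a module-level __version__ assignment
# with either quote style; the capture group is the version string.
VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
sample = '# sdfconf version module\n__version__ = "0.8.2"\n'

mo = re.search(VSRE, sample, re.M)
assert mo is not None
print(mo.group(1))  # 0.8.2
```

Keeping the version in a tiny `_version.py` and parsing it with a regex (rather than importing the package) avoids executing package code at build time before dependencies are installed.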
import os.path as osp
import sys
import subprocess
subprocess.call(['pip', 'install', 'cvbase'])
import cvbase as cvb
import torch
from torch.autograd import gradcheck
sys.path.append(osp.abspath(osp.join(__file__, '../../')))
from biupdownsample import biupsample_naive, BiupsampleNaive
from biupdownsample import bidownsample_naive, BidownsampleNaive
feat = torch.randn(2, 64, 2, 2, requires_grad=True, device='cuda:0').double()
mask = torch.randn(
2, 100, 4, 4, requires_grad=True, device='cuda:0').sigmoid().double()
print('Gradcheck for biupsample naive...')
test = gradcheck(BiupsampleNaive(5, 4, 2), (feat, mask), atol=1e-4, eps=1e-4)
print(test)
feat = torch.randn(
2, 1024, 100, 100, requires_grad=True, device='cuda:0').float()
mask = torch.randn(
2, 25, 200, 200, requires_grad=True, device='cuda:0').sigmoid().float()
loop_num = 500
time_naive_forward = 0
time_naive_backward = 0
bar = cvb.ProgressBar(loop_num)
timer = cvb.Timer()
for i in range(loop_num):
x = biupsample_naive(feat.clone(), mask.clone(), 5, 1, 2)
torch.cuda.synchronize()
time_naive_forward += timer.since_last_check()
x.sum().backward(retain_graph=True)
torch.cuda.synchronize()
time_naive_backward += timer.since_last_check()
bar.update()
forward_speed = (time_naive_forward + 1e-3) * 1e3 / loop_num
backward_speed = (time_naive_backward + 1e-3) * 1e3 / loop_num
print('\nBiupsample naive time forward: '
f'{forward_speed} ms/iter | time backward: {backward_speed} ms/iter')
# ---------------------------------------------------------------
feat = torch.randn(2, 64, 4, 4, requires_grad=True, device='cuda:0').double()
mask = torch.randn(
2, 16, 4, 4, requires_grad=True, device='cuda:0').double()
print('Gradcheck for bidownsample naive...')
test = gradcheck(BidownsampleNaive(4, 1, 1), (feat, mask), atol=1e-4, eps=1e-4)
print(test)
feat = torch.randn(
2, 512, 200, 200, requires_grad=True, device='cuda:0').float()
mask = torch.randn(
2, 100, 100, 100, requires_grad=True, device='cuda:0').sigmoid().float()
loop_num = 500
time_naive_forward = 0
time_naive_backward = 0
bar = cvb.ProgressBar(loop_num)
timer = cvb.Timer()
for i in range(loop_num):
x = bidownsample_naive(feat.clone(), mask.clone(), 10, 1, 2)
torch.cuda.synchronize()
time_naive_forward += timer.since_last_check()
x.sum().backward(retain_graph=True)
torch.cuda.synchronize()
time_naive_backward += timer.since_last_check()
bar.update()
forward_speed = (time_naive_forward + 1e-3) * 1e3 / loop_num
backward_speed = (time_naive_backward + 1e-3) * 1e3 / loop_num
print('\nBidownsample naive time forward: '
f'{forward_speed} ms/iter | time backward: {backward_speed} ms/iter')
from notebook.utils import url_path_join
from jupyterlab_bigquery.list_items_handler import handlers, ListHandler
from jupyterlab_bigquery.details_handler import DatasetDetailsHandler, TablePreviewHandler, TableDetailsHandler
from jupyterlab_bigquery.version import VERSION
from jupyterlab_bigquery.pagedAPI_handler import PagedQueryHandler
from jupyterlab_bigquery.query_incell_editor import QueryIncellEditor, _cell_magic
__version__ = VERSION
def _jupyter_server_extension_paths():
return [{'module': 'jupyterlab_bigquery'}]
def load_jupyter_server_extension(nb_server_app):
"""
Called when the extension is loaded.
Args:
nb_server_app (NotebookWebApplication): handle to the Notebook webserver instance.
"""
host_pattern = '.*$'
app = nb_server_app.web_app
gcp_v1_endpoint = url_path_join(app.settings['base_url'], 'bigquery', 'v1')
def make_endpoint(endPoint, handler):
return url_path_join(gcp_v1_endpoint, endPoint) + '(.*)', handler
app.add_handlers(
host_pattern,
[
(url_path_join(gcp_v1_endpoint, k) + "(.*)", v)
for (k, v) in handlers.items()
],
)
app.add_handlers(host_pattern, [
# TODO(cbwilkes): Add auth checking if needed.
# (url_path_join(gcp_v1_endpoint, auth'), AuthHandler)
make_endpoint('list', ListHandler),
make_endpoint('datasetdetails', DatasetDetailsHandler),
make_endpoint('tabledetails', TableDetailsHandler),
make_endpoint('tablepreview', TablePreviewHandler),
make_endpoint('query', PagedQueryHandler)
])
def load_ipython_extension(ipython):
"""Called by IPython when this module is loaded as an IPython extension."""
ipython.register_magic_function(
_cell_magic, magic_kind="line", magic_name="bigquery_editor"
)
ipython.register_magic_function(
_cell_magic, magic_kind="cell", magic_name="bigquery_editor"
)
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import django_fields.fields
class Migration(migrations.Migration):
dependencies = [
('ios_notifications', '0003_notification_loc_payload'),
]
operations = [
migrations.AlterField(
model_name='apnservice',
name='passphrase',
field=django_fields.fields.EncryptedCharField(help_text=b'Passphrase for the private key', max_length=101, null=True, blank=True),
preserve_default=True,
),
]
from flask import Blueprint, request, render_template, flash, redirect, url_for
from flask_login import login_user, login_required, current_user, logout_user
from models import User
from werkzeug.security import generate_password_hash, check_password_hash
from app import db, login_manager
admin = Blueprint('admin', __name__)
@admin.route('/admin', methods=['POST', 'GET'])
@login_required
def admin_panel():
if current_user.is_authenticated:
user = User.query.get(int(current_user.get_id()))
if not user.admin:
return redirect(url_for('index'))
users = User.query.order_by(User.id).all()
if request.method == 'POST':
if 'edit_user' in request.form:
old_username = request.form['edit_user']
user = db.session.query(User).filter_by(username=old_username).with_for_update().first()
username = request.form['username']
password = request.form['password']
if len(username) > 0:
user.username = username
if len(password) > 0:
if len(password) >= 3:
user.password = generate_password_hash(password, method='sha256')
else:
                    flash('Password must be at least 3 characters long')
if 'grant_admin' in request.form:
user.admin = True
if 'remove_admin' in request.form:
user.admin = False
if 'delete' in request.form:
old_username = request.form['delete']
User.query.filter_by(username=old_username).with_for_update().delete()
db.session.commit()
return redirect(url_for('admin.admin_panel'))
return render_template('admin_panel.html', users=users)
import pytest
from tests.test_application import app
@pytest.fixture
def client():
client = app.test_client()
yield client
def test_hello_resource(client):
"""
Test if it is possible to access to /hello resource
:param client: Test client object
:return:
"""
response = client.get('/hello').get_json()
assert response['hello'] == 'world'
def test_asset_found(client):
"""
Test if Swagger assets are accessible from the new path
:param client: Test client object
:return:
"""
response = client.get('/this_is_a_new/path_for_swagger_internal_documentation/swaggerui/swagger-ui-bundle.js')
    assert response.status_code == 200
'''
03 - Multiple arguments
In the previous exercise, the square brackets around imag in the documentation showed us that the
imag argument is optional. But Python also uses a different way to tell users about arguments being
optional.
Have a look at the documentation of sorted() by typing help(sorted) in the IPython Shell.
You'll see that sorted() takes three arguments: iterable, key and reverse.
key=None means that if you don't specify the key argument, it will be None. reverse=False means
that if you don't specify the reverse argument, it will be False.
In this exercise, you'll only have to specify iterable and reverse, not key. The first input you
pass to sorted() will be matched to the iterable argument, but what about the second input? To tell
Python you want to specify reverse without changing anything about key, you can use =:
sorted(___, reverse = ___)
Two lists have been created for you on the right. Can you paste them together and sort them in
descending order?
Note: For now, we can understand an iterable as being any collection of objects, e.g. a List.
Instructions:
- Use + to merge the contents of first and second into a new list: full.
- Call sorted() on full and specify the reverse argument to be True. Save the sorted list as
full_sorted.
- Finish off by printing out full_sorted.
'''
# Create lists first and second
first = [11.25, 18.0, 20.0]
second = [10.75, 9.50]
# Paste together first and second: full
full = first + second
# Sort full in descending order: full_sorted
full_sorted = sorted(full, reverse=True)
# Print out full_sorted
print(full_sorted)
"""
Inputs: 3 floating-point values, each an amount in a different currency
	Austrian schillings --> float --> x
	Greek drachmas --> float --> z
	Pesetas --> float --> w
Outputs: 4 floating-point values, the conversions of the currencies above
	Pesetas --> float --> x
	French francs --> float --> z
	Dollars --> float --> a
	Italian liras --> float --> b
# Entradas
x1 = float(input("Dime los chelines autríacos\n"))
z1 = float(input("Dime los dracmas griegos\n"))
w = float(input("Dime las pesetas\n"))
# Caja negra
x = (x1 * 956871)/100
z = z1/22.64572381
a = w/122499
b = (w*100)/9289
# Outputs
print(f"\n{x1} Chelines austríacos en pesetas son {x}\n{z1} Dracmas griegos en Francos franceses son {z}\n{w} Pesetas en Dolares son {a}\n{w} Pesetas en Liras italianas son {b}\n")
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from django import forms
from estacionamientos.models import Estacionamiento, Reserva, Pago
class EstacionamientosForm(forms.ModelForm):
nombre_duenio = forms.CharField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Nombre del Dueño'}))
nombre_est = forms.CharField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Nombre del Estacionamiento'}))
direccion = forms.CharField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Dirección'}))
telefono1 = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Teléfono 1',}))
telefono2 = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Telefono 2',}), required=False)
telefono3 = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Teléfono 3',}), required=False)
email1 = forms.EmailField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Correo Electrónico 1',}))
email2 = forms.EmailField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Correo Electrónico 2',}), required=False)
email3 = forms.EmailField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Correo Electrónico 3',}), required=False)
rif = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'RIF',}))
capacidad = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Capacidad',}))
tarifa = forms.DecimalField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Tarifa',}))
horaI = forms.TimeField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Hora Apertura',}))
horaF = forms.TimeField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Hora Cierre',}))
reservaI = forms.TimeField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Inicio Restringir Reserva',}), required=False)
reservaF = forms.TimeField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Fin Restringir Reserva',}), required=False)
class Meta:
model = Estacionamiento
fields = '__all__'
class ReservaForm(forms.ModelForm):
estacionamiento = forms.ModelChoiceField(
queryset=Estacionamiento.objects.all(),
empty_label="Estacionamiento",
widget=forms.Select(attrs={'class': 'form-control',}))
horaInicio = forms.TimeField(widget=forms.DateInput(
attrs={'class': 'form-control', 'placeholder': 'Inicio de la Reserva',}))
horaFin = forms.TimeField(widget=forms.DateInput(
attrs={'class': 'form-control', 'placeholder': 'Fin de la Reserva',}))
class Meta:
model = Reserva
fields = ['horaInicio', 'horaFin', 'estacionamiento']
class PagoForm(forms.ModelForm):
TARJETAS = [
('', 'Tipo de Tarjeta'),
('Vista', 'Vista'),
('Mister', 'Mister'),
('Xpres', 'Xpres')
]
nombre = forms.CharField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Nombre',}))
cedula = forms.IntegerField(widget=forms.TextInput(
attrs={'class': 'form-control', 'placeholder': 'Cédula',}))
tipoTarjeta = forms.ChoiceField(choices=TARJETAS, widget=forms.Select(attrs={'class': 'form-control'}))
    numeroTarjeta = forms.RegexField(min_length=16, max_length=16, regex=r'^(\d)+$',
        error_messages={'invalid': 'Número de tarjeta no válido.'}, widget=forms.TextInput(
        attrs={'class': 'form-control', 'placeholder': 'Número de Tarjeta',}))
class Meta:
model = Pago
fields = ['nombre', 'cedula', 'tipoTarjeta', 'numeroTarjeta', 'pago']
import unittest
from programy.config.file.yaml_file import YamlConfigurationFile
from programy.clients.restful.config import RestConfiguration
from programy.clients.events.console.config import ConsoleConfiguration
class RestConfigurationTests(unittest.TestCase):
def test_init(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
rest:
host: 127.0.0.1
port: 5000
debug: false
workers: 4
use_api_keys: false
api_key_file: apikeys.txt
""", ConsoleConfiguration(), ".")
rest_config = RestConfiguration("rest")
rest_config.load_configuration(yaml, ".")
self.assertEqual("127.0.0.1", rest_config.host)
self.assertEqual(5000, rest_config.port)
self.assertEqual(False, rest_config.debug)
self.assertEqual(False, rest_config.use_api_keys)
self.assertEqual("apikeys.txt", rest_config.api_key_file)
def test_init_no_values(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
rest:
""", ConsoleConfiguration(), ".")
rest_config = RestConfiguration("rest")
rest_config.load_configuration(yaml, ".")
self.assertEqual("0.0.0.0", rest_config.host)
self.assertEqual(80, rest_config.port)
self.assertEqual(False, rest_config.debug)
self.assertEqual(False, rest_config.use_api_keys)
def test_to_yaml_with_defaults(self):
config = RestConfiguration("rest")
data = {}
config.to_yaml(data, True)
        self.assertEqual(data['host'], "0.0.0.0")
        self.assertEqual(data['port'], 80)
        self.assertEqual(data['debug'], False)
        self.assertEqual(data['use_api_keys'], False)
        self.assertEqual(data['api_key_file'], './api.keys')
        self.assertEqual(data['ssl_cert_file'], './rsa.cert')
        self.assertEqual(data['ssl_key_file'], './rsa.keys')
        self.assertEqual(data['bot'], 'bot')
        self.assertEqual(data['license_keys'], "./config/license.keys")
        self.assertEqual(data['bot_selector'], "programy.clients.client.DefaultBotSelector")
        self.assertEqual(data['renderer'], "programy.clients.render.text.TextRenderer")
n = int(input().strip())
items = [
int(A_temp)
for A_temp
in input().strip().split(' ')
]
items_map = {}
result = None
for i, item in enumerate(items):
if item not in items_map:
items_map[item] = [i]
else:
items_map[item].append(i)
for _, item_indexes in items_map.items():
items_indexes_length = len(item_indexes)
if items_indexes_length > 1:
for i in range(items_indexes_length):
for j in range(i + 1, items_indexes_length):
diff = item_indexes[j] - item_indexes[i]
if result is None:
result = diff
elif diff < result:
result = diff
print(result if result is not None else -1)
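The index-map solution above stores every position of every value and then compares all pairs within each group. Since the closest pair of equal values is always a pair of *consecutive* occurrences, the same answer can be computed in a single pass by remembering only the last index seen for each value. A sketch of that variant (not part of the original solution):

```python
def min_distance(items):
    # Track the most recent index of each value; the minimum gap between
    # equal values always occurs between consecutive occurrences.
    last_seen = {}
    best = None
    for i, item in enumerate(items):
        if item in last_seen:
            gap = i - last_seen[item]
            if best is None or gap < best:
                best = gap
        last_seen[item] = i
    return best if best is not None else -1


print(min_distance([7, 1, 3, 4, 1, 7]))  # 3
```

This keeps the O(n) time of the original while reducing the per-value bookkeeping to a single integer.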
import boto3
import sys
import time
import logging
import getpass
def new_rdsmysql(dbname, instanceID, storage, dbInstancetype, dbusername):
masterPass = getpass.getpass('DBMasterPassword: ')
    if len(masterPass) < 10:
        logging.warning('Password is not at least 10 characters. Please try again')
        time.sleep(5)
        sys.exit(1)
try:
rds_instance = boto3.client('rds')
create_instance = rds_instance.create_db_instance(
DBName = dbname,
DBInstanceIdentifier = instanceID,
AllocatedStorage = int(storage),
DBInstanceClass = dbInstancetype,
Engine = 'mysql',
MasterUsername = dbusername,
MasterUserPassword = str(masterPass),
MultiAZ = True,
EngineVersion = '5.7.23',
AutoMinorVersionUpgrade = False,
LicenseModel = 'general-public-license',
PubliclyAccessible = False,
Tags = [
{
'Key': 'Name',
'Value' : dbname
}
]
)
print(create_instance)
except Exception as e:
        logging.warning('An error has occurred')
print(e)
dbname = sys.argv[1]
instanceID = sys.argv[2]
storage = sys.argv[3]
dbInstancetype = sys.argv[4]
dbusername = sys.argv[5]
new_rdsmysql(dbname, instanceID, storage, dbInstancetype, dbusername)
# -*- coding: utf-8 -*-
from south.utils import datetime_utils as datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):

    def forwards(self, orm):
        # Adding model 'Tag'
        db.create_table(u'tagging_tag', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('title', self.gf('django.db.models.fields.CharField')(max_length=32, db_index=True)),
            ('description', self.gf('django.db.models.fields.TextField')(null=True, blank=True)),
            ('site', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['sites.Site'], null=True, blank=True)),
        ))
        db.send_create_signal(u'tagging', ['Tag'])

        # Adding unique constraint on 'Tag', fields ['title', 'site']
        db.create_unique(u'tagging_tag', ['title', 'site_id'])

        # Adding model 'ContentObjectTag'
        db.create_table(u'tagging_contentobjecttag', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('content_type', self.gf('django.db.models.fields.related.ForeignKey')(related_name='content_type_set_for_contentobjecttag', to=orm['contenttypes.ContentType'])),
            ('object_pk', self.gf('django.db.models.fields.PositiveIntegerField')()),
            ('site', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['sites.Site'])),
            ('tag', self.gf('django.db.models.fields.related.ForeignKey')(related_name='content_object_tags', to=orm['tagging.Tag'])),
        ))
        db.send_create_signal(u'tagging', ['ContentObjectTag'])

    def backwards(self, orm):
        # Removing unique constraint on 'Tag', fields ['title', 'site']
        db.delete_unique(u'tagging_tag', ['title', 'site_id'])

        # Deleting model 'Tag'
        db.delete_table(u'tagging_tag')

        # Deleting model 'ContentObjectTag'
        db.delete_table(u'tagging_contentobjecttag')

    models = {
        u'contenttypes.contenttype': {
            'Meta': {'ordering': "('name',)", 'unique_together': "(('app_label', 'model'),)", 'object_name': 'ContentType', 'db_table': "'django_content_type'"},
            'app_label': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'model': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
        },
        u'sites.site': {
            'Meta': {'ordering': "(u'domain',)", 'object_name': 'Site', 'db_table': "u'django_site'"},
            'domain': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '50'})
        },
        u'tagging.contentobjecttag': {
            'Meta': {'object_name': 'ContentObjectTag'},
            'content_type': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'content_type_set_for_contentobjecttag'", 'to': u"orm['contenttypes.ContentType']"}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'object_pk': ('django.db.models.fields.PositiveIntegerField', [], {}),
            'site': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['sites.Site']"}),
            'tag': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'content_object_tags'", 'to': u"orm['tagging.Tag']"})
        },
        u'tagging.tag': {
            'Meta': {'unique_together': "[('title', 'site')]", 'object_name': 'Tag'},
            'description': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'site': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['sites.Site']", 'null': 'True', 'blank': 'True'}),
            'title': ('django.db.models.fields.CharField', [], {'max_length': '32', 'db_index': 'True'})
        }
    }
complete_apps = ['tagging'] | 56.118421 | 182 | 0.597655 | 485 | 4,265 | 5.115464 | 0.164948 | 0.083837 | 0.141072 | 0.201532 | 0.660621 | 0.626763 | 0.566304 | 0.530028 | 0.464329 | 0.410318 | 0 | 0.005795 | 0.190856 | 4,265 | 76 | 183 | 56.118421 | 0.713127 | 0.058382 | 0 | 0.140351 | 0 | 0 | 0.499875 | 0.296333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035088 | false | 0 | 0.070175 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
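The migration above creates two tables and one composite unique constraint, `db.create_unique(u'tagging_tag', ['title', 'site_id'])`. The effect of that constraint can be sketched with the stdlib `sqlite3` module (column types simplified; in the real project the table is created and managed by Django/South):

```python
import sqlite3

# In-memory database standing in for the Django-managed one.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tagging_tag (
        id INTEGER PRIMARY KEY,
        title VARCHAR(32) NOT NULL,
        description TEXT,
        site_id INTEGER,
        UNIQUE (title, site_id)   -- mirrors db.create_unique above
    )
""")
conn.execute("INSERT INTO tagging_tag (title, site_id) VALUES ('news', 1)")
conn.execute("INSERT INTO tagging_tag (title, site_id) VALUES ('news', 2)")  # ok: different site
try:
    conn.execute("INSERT INTO tagging_tag (title, site_id) VALUES ('news', 1)")
except sqlite3.IntegrityError:
    print("duplicate (title, site_id) rejected")
```

The same tag title can therefore exist once per site, which is exactly what the `unique_together` entry in the frozen `models` dict declares.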
2957082f2761f3302a5b658af0d68aab4daff24f | 1,004 | py | Python | recogym/envs/session.py | philomenec/reco-gym | f8553d197f42ec2f415aefce48525d0e9b10ddaa | [
"Apache-2.0"
] | 413 | 2018-09-18T17:49:44.000Z | 2022-03-23T12:25:41.000Z | recogym/envs/session.py | aliang-rec/reco-gym | f8553d197f42ec2f415aefce48525d0e9b10ddaa | [
"Apache-2.0"
] | 15 | 2018-11-08T17:04:21.000Z | 2021-11-30T19:20:27.000Z | recogym/envs/session.py | aliang-rec/reco-gym | f8553d197f42ec2f415aefce48525d0e9b10ddaa | [
"Apache-2.0"
] | 81 | 2018-09-22T02:28:55.000Z | 2022-03-30T14:03:01.000Z | class Session(list):
"""Abstract Session class"""
def to_strings(self, user_id, session_id):
"""represent session as list of strings (one per event)"""
user_id, session_id = str(user_id), str(session_id)
session_type = self.get_type()
strings = []
for event, product in self:
columns = [user_id, session_type, session_id, event, str(product)]
strings.append(','.join(columns))
return strings
def get_type(self):
raise NotImplemented
class OrganicSessions(Session):
def __init__(self):
super(OrganicSessions, self).__init__()
def next(self, context, product):
self.append(
{
't': context.time(),
'u': context.user(),
'z': 'pageview',
'v': product
}
)
def get_type(self):
return 'organic'
def get_views(self):
return [p for _, _, e, p in self if e == 'pageview']
| 27.135135 | 78 | 0.548805 | 113 | 1,004 | 4.654867 | 0.389381 | 0.045627 | 0.074144 | 0.057034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.330677 | 1,004 | 36 | 79 | 27.888889 | 0.782738 | 0.074701 | 0 | 0.074074 | 0 | 0 | 0.030501 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0.074074 | 0.407407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
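A minimal sketch of the event flow, with the recogym context object replaced by a stub so the session logic can be exercised standalone (stub names are illustrative, not part of recogym):

```python
class StubContext:
    """Stands in for recogym's context (time()/user() accessors)."""
    def __init__(self, t, u):
        self._t, self._u = t, u
    def time(self):
        return self._t
    def user(self):
        return self._u

class OrganicSessions(list):
    """Condensed restatement of the class above for a standalone demo."""
    def next(self, context, product):
        self.append({'t': context.time(), 'u': context.user(),
                     'z': 'pageview', 'v': product})
    def get_views(self):
        return [e['v'] for e in self if e['z'] == 'pageview']

session = OrganicSessions()
session.next(StubContext(0, 'user1'), 7)
session.next(StubContext(1, 'user1'), 3)
print(session.get_views())  # [7, 3]
```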
295f637700f993cfd8e37b0ff39f106d2c2a6469 | 1,716 | py | Python | {{cookiecutter.project_slug}}/api/__init__.py | Steamboat/cookiecutter-devops | 6f07329c9e54b76e671a0308d343d2d9ebff5343 | [
"BSD-3-Clause"
] | null | null | null | {{cookiecutter.project_slug}}/api/__init__.py | Steamboat/cookiecutter-devops | 6f07329c9e54b76e671a0308d343d2d9ebff5343 | [
"BSD-3-Clause"
] | null | null | null | {{cookiecutter.project_slug}}/api/__init__.py | Steamboat/cookiecutter-devops | 6f07329c9e54b76e671a0308d343d2d9ebff5343 | [
"BSD-3-Clause"
] | null | null | null |
import logging
from flask import Flask
from flask_sqlalchemy import SQLAlchemy as _BaseSQLAlchemy
from flask_migrate import Migrate
from flask_cors import CORS
from flask_talisman import Talisman
from flask_ipban import IpBan
from config import Config, get_logger_handler
# database
class SQLAlchemy(_BaseSQLAlchemy):
    def apply_pool_defaults(self, app, options):
        super(SQLAlchemy, self).apply_pool_defaults(app, options)
        options["pool_pre_ping"] = True


db = SQLAlchemy()
migrate = Migrate()
cors = CORS()
talisman = Talisman()
global_config = Config()
ip_ban = IpBan(ban_seconds=200, ban_count=global_config.IP_BAN_LIST_COUNT)

# logging
logger = logging.getLogger('frontend')


def create_app(config_class=None):
    app = Flask(__name__)
    if config_class is None:
        config_class = Config()
    app.config.from_object(config_class)

    db.init_app(app)
    migrate.init_app(app, db)

    # TODO - Refine and update when build pipeline is stable. Get from global_config
    cors.init_app(app, origins=["http://localhost:5000", "http://localhost:3000", '*'])

    if app.config["ENV"] in ("staging", "production"):
        # Secure the application and implement best practice https redirects
        # and a content security policy
        talisman.init_app(app, content_security_policy=None)
        # ip_ban.init_app(app)
        # ip_ban.load_nuisances(global_config.IP_BAN_REGEX_FILE)

    from api.routes import bp as api_bp
    app.register_blueprint(api_bp)

    if not app.debug and not app.testing:
        app.logger.addHandler(get_logger_handler())

    @app.teardown_appcontext
    def shutdown_session(exception=None):
        db.session.remove()

    return app


from api import models
| 32.377358 | 106 | 0.740093 | 238 | 1,716 | 5.105042 | 0.39916 | 0.044444 | 0.041152 | 0.027984 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007741 | 0.171911 | 1,716 | 52 | 107 | 33 | 0.847291 | 0.156177 | 0 | 0 | 0 | 0 | 0.058333 | 0 | 0 | 0 | 0 | 0.019231 | 0 | 1 | 0.078947 | false | 0 | 0.263158 | 0 | 0.394737 | 0.026316 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
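The structure above, extension objects created once at import time and then bound to a concrete app inside `create_app()`, is the standard Flask application-factory pattern. Stripped of Flask, the shape is (all names here are illustrative stand-ins):

```python
class Extension:
    """Created unbound at import time; bound to an app later via init_app."""
    def __init__(self):
        self.app = None
    def init_app(self, app):
        self.app = app

db = Extension()      # module-level singletons, like db/migrate/cors above
cache = Extension()

def create_app(config=None):
    app = {"config": config or {"ENV": "development"}}
    for ext in (db, cache):       # bind every extension to this app instance
        ext.init_app(app)
    return app

app = create_app()
print(db.app is app)  # True
```

This is what lets tests call the factory repeatedly with different configs while the extension objects stay importable from anywhere in the package.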
2967592aac9355f4e077c19d82c1790326f4a71b | 343 | py | Python | src/view/services_update_page.py | nbilbo/services_manager | 74e0471a1101305303a96d39963cc98fc0645a64 | [
"MIT"
] | null | null | null | src/view/services_update_page.py | nbilbo/services_manager | 74e0471a1101305303a96d39963cc98fc0645a64 | [
"MIT"
] | null | null | null | src/view/services_update_page.py | nbilbo/services_manager | 74e0471a1101305303a96d39963cc98fc0645a64 | [
"MIT"
] | null | null | null | from src.view.services_page import ServicesPage
from src.view.services_add_page import ServicesAddPage
class ServicesUpdatePage(ServicesAddPage):
    def __init__(self, parent, *args, **kwargs):
        super().__init__(parent, *args, **kwargs)
        self.set_title("Update service")
        self.set_confirm_button_text("Update")
| 34.3 | 54 | 0.723032 | 40 | 343 | 5.825 | 0.625 | 0.060086 | 0.094421 | 0.16309 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172012 | 343 | 10 | 55 | 34.3 | 0.820423 | 0 | 0 | 0 | 0 | 0 | 0.05814 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
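The update page reuses the add-page layout and only re-labels it. The pattern in isolation, with stand-in classes since the real pages require a GUI toolkit:

```python
class ServicesAddPage:
    """Stand-in for the real add page (the original wraps a GUI widget)."""
    def __init__(self, parent):
        self.parent = parent
        self.title = "Add service"
        self.button = "Add"
    def set_title(self, text):
        self.title = text
    def set_confirm_button_text(self, text):
        self.button = text

class ServicesUpdatePage(ServicesAddPage):
    def __init__(self, parent):
        super().__init__(parent)
        self.set_title("Update service")
        self.set_confirm_button_text("Update")

page = ServicesUpdatePage(parent=None)
print(page.title, "/", page.button)  # Update service / Update
```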
296855e3082fc927d6f123b69b223d9a6934f75b | 1,861 | py | Python | Graphs/ConnectedComponents.py | PK-100/Competitive_Programming | d0863feaaa99462b2999e85dcf115f7a6c08bb8d | [
"MIT"
] | 70 | 2018-06-25T21:20:15.000Z | 2022-03-24T03:55:17.000Z | Graphs/ConnectedComponents.py | An3sha/Competitive_Programming | ee7eadf51939a360d0b004d787ebabda583e92f0 | [
"MIT"
] | 4 | 2018-09-04T13:12:20.000Z | 2021-06-20T08:29:12.000Z | Graphs/ConnectedComponents.py | An3sha/Competitive_Programming | ee7eadf51939a360d0b004d787ebabda583e92f0 | [
"MIT"
] | 24 | 2018-12-26T05:15:32.000Z | 2022-01-23T23:04:54.000Z | #!/bin/python3
import math
import os
import random
import re
import sys
#
# Complete the 'countGroups' function below.
#
# The function is expected to return an INTEGER.
# The function accepts STRING_ARRAY related as parameter.
#
class Graph:
    def __init__(self, V):
        self.V = V
        self.adj = [[] for i in range(V)]

    def addEdge(self, a, b):
        self.adj[a].append(b)
        self.adj[b].append(a)

    def dfs_util(self, temp, node, visited):
        visited[node] = True
        temp.append(node)
        for i in self.adj[node]:
            if not visited[i]:
                temp = self.dfs_util(temp, i, visited)
        return temp

    def countGroups(self):
        """
        This is the classical concept of connected components in a Graph
        """
        visited = [False] * self.V
        groups = []
        for node in range(self.V):
            if not visited[node]:
                temp = []
                groups.append(self.dfs_util(temp, node, visited))
        return groups


def convertMatrixToGraph(mat):
    """
    Accept the input which is an adjacency matrix and return a Graph,
    which is an adjacency list
    """
    n = len(mat)
    g = Graph(n)
    for i in range(n):
        for j in range(n):
            if j > i and mat[i][j] == '1':
                g.addEdge(i, j)
    return g


def countGroups(related):
    # Write your code here
    graph = convertMatrixToGraph(related)
    groups = graph.countGroups()
    # print(groups)
    return len(groups)


if __name__ == '__main__':
    fptr = open(os.environ['OUTPUT_PATH'], 'w')

    related_count = int(input().strip())

    related = []
    for _ in range(related_count):
        related_item = input()
        related.append(related_item)

    result = countGroups(related)

    fptr.write(str(result) + '\n')

    fptr.close()
| 22.695122 | 96 | 0.576034 | 242 | 1,861 | 4.338843 | 0.371901 | 0.033333 | 0.017143 | 0.020952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00156 | 0.311123 | 1,861 | 81 | 97 | 22.975309 | 0.817473 | 0.189146 | 0 | 0 | 0 | 0 | 0.015732 | 0 | 0 | 0 | 0 | 0.012346 | 0 | 1 | 0.122449 | false | 0 | 0.102041 | 0 | 0.326531 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
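A quick standalone check of the component count, using a condensed union-find instead of the recursive DFS above (same answer, different traversal, and no recursion-depth limit to worry about):

```python
def count_groups(related):
    """Count connected components of an adjacency matrix given as '0'/'1' strings."""
    n = len(related)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if related[i][j] == '1':
                parent[find(i)] = find(j)  # union the two components

    return len({find(i) for i in range(n)})

# Two groups: {0, 1} are related, {2} is alone.
print(count_groups(["110", "110", "001"]))  # 2
```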
2976116c00e8fbc7eadad3295e61cb99c5280023 | 3,018 | py | Python | hoogberta/encoder.py | KateSatida/HoogBERTa_SuperAI2 | e903054bc752a50c391ab610507fdeccc4f5d482 | [
"MIT"
] | null | null | null | hoogberta/encoder.py | KateSatida/HoogBERTa_SuperAI2 | e903054bc752a50c391ab610507fdeccc4f5d482 | [
"MIT"
] | null | null | null | hoogberta/encoder.py | KateSatida/HoogBERTa_SuperAI2 | e903054bc752a50c391ab610507fdeccc4f5d482 | [
"MIT"
] | null | null | null | from .trainer.models import MultiTaskTagger
from .trainer.utils import load_dictionaries,Config
from .trainer.tasks.multitask_tagging import MultiTaskTaggingModule
from fairseq.data.data_utils import collate_tokens
from attacut import tokenize
class HoogBERTaEncoder(object):
    def __init__(self, layer=12, cuda=False, base_path="."):
        args = Config(base_path=base_path)
        self.base_path = base_path
        self.pos_dict, self.ne_dict, self.sent_dict = load_dictionaries(self.base_path)
        self.model = MultiTaskTagger(
            args, [len(self.pos_dict), len(self.ne_dict), len(self.sent_dict)])
        if cuda:
            self.model = self.model.cuda()

    def extract_features(self, sentence):
        all_sent = []
        sentences = sentence.split(" ")
        for sent in sentences:
            all_sent.append(" ".join(tokenize(sent)).replace("_", "[!und:]"))

        sentence = " _ ".join(all_sent)
        tokens = self.model.bert.encode(sentence).unsqueeze(0)
        all_layers = self.model.bert.extract_features(tokens, return_all_hiddens=True)
        return tokens[0], all_layers[-1][0]

    def extract_features_batch(self, sentenceL):
        inputList = []
        for sentX in sentenceL:
            sentences = sentX.split(" ")
            all_sent = []
            for sent in sentences:
                all_sent.append(" ".join(tokenize(sent)).replace("_", "[!und:]"))
            inputList.append(" _ ".join(all_sent))

        batch = collate_tokens(
            [self.model.bert.encode(sent) for sent in inputList], pad_idx=1)
        # tokens = self.model.bert.encode(inputList)
        return self.extract_features_from_tensor(batch)

    def extract_features_from_tensor(self, batch):
        all_layers = self.model.bert.extract_features(batch, return_all_hiddens=True)
        return batch, all_layers[-1]

    def extract_features2(self, sentence):
        # Same as extract_features, but the input is assumed to be
        # pre-tokenized, so the attacut tokenization step is skipped.
        tokens = self.model.bert.encode(sentence).unsqueeze(0)
        all_layers = self.model.bert.extract_features(tokens, return_all_hiddens=True)
        return tokens[0], all_layers[-1][0]

    def extract_features_batch2(self, sentenceL):
        # Batch variant of extract_features2: no attacut pre-tokenization.
        batch = collate_tokens(
            [self.model.bert.encode(sent) for sent in sentenceL], pad_idx=1)
        return self.extract_features_from_tensor(batch)
| 38.202532 | 103 | 0.61829 | 348 | 3,018 | 5.146552 | 0.198276 | 0.060302 | 0.065327 | 0.063652 | 0.685092 | 0.648241 | 0.648241 | 0.627582 | 0.627582 | 0.627582 | 0 | 0.006702 | 0.258449 | 3,018 | 78 | 104 | 38.692308 | 0.793566 | 0.169317 | 0 | 0.372093 | 0 | 0 | 0.010835 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0 | 0.116279 | 0 | 0.395349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
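`collate_tokens` pads variable-length encoded sentences into one rectangular batch with `pad_idx=1` so they can be fed through BERT in a single forward pass. The padding logic, sketched with plain lists instead of fairseq tensors:

```python
def collate_tokens(values, pad_idx):
    """Pad a list of token-id lists to a rectangle (list-of-lists sketch
    of fairseq's tensor-based helper)."""
    size = max(len(v) for v in values)
    return [v + [pad_idx] * (size - len(v)) for v in values]

batch = collate_tokens([[5, 6, 7], [5, 6], [9]], pad_idx=1)
print(batch)  # [[5, 6, 7], [5, 6, 1], [9, 1, 1]]
```

The real helper returns a 2-D tensor and the model is expected to mask out positions equal to `pad_idx`.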
2979426aee0cb9d3f7a35f048ea9d20f5f7f25ea | 1,806 | py | Python | conductor_calculator.py | aj83854/project-lightning-rod | 77867d6c4ee30650023f3ec2a8318edd92530264 | [
"MIT"
] | null | null | null | conductor_calculator.py | aj83854/project-lightning-rod | 77867d6c4ee30650023f3ec2a8318edd92530264 | [
"MIT"
] | null | null | null | conductor_calculator.py | aj83854/project-lightning-rod | 77867d6c4ee30650023f3ec2a8318edd92530264 | [
"MIT"
] | null | null | null | from pyconductor import load_test_values, calculate_conductance
def conductance_calc():
    preloaded_dict = load_test_values()
    while preloaded_dict:
        print(
            "[1] - Show currently available materials in Material Dictionary\n"
            "[2] - Add a material (will not be saved upon restart)\n"
            "[3] - Quit\n"
            "To test the conductive properties of a material, simply type in its name.\n"
            "Otherwise, type the corresponding number for an option above.\n"
        )
        main_prompt = input(">>> ").lower()
        if main_prompt == "1":
            print(f"\nCurrently contains the following materials:\n{preloaded_dict.keys()}\n")
        elif main_prompt == "2":
            preloaded_dict.addmat()
        elif main_prompt == "3":
            quit()
        else:
            try:
                calculate_conductance(preloaded_dict[main_prompt])
                while True:
                    again_prompt = input(
                        "Would you like to try another calculation? [Y]es or [N]o: ").lower()
                    if again_prompt in ("y", "yes"):
                        break
                    elif again_prompt in ("n", "no"):
                        print("\nGoodbye!\n")
                        quit()
            except KeyError:
                if main_prompt == "":
                    print("\nNo material specified.\nPlease enter a valid material name "
                          "listed in option [1], or use option [2] to add your own.\n")
                else:  # TODO: add logic handling whether user wants to add missing material
                    print(f"\n{main_prompt} is not a valid material or command!\n")
    else:
        pass


if __name__ == "__main__":
    conductance_calc()
| 42 | 94 | 0.530454 | 202 | 1,806 | 4.589109 | 0.49505 | 0.075512 | 0.030205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00708 | 0.374308 | 1,806 | 42 | 95 | 43 | 0.813274 | 0.037099 | 0 | 0.128205 | 0 | 0 | 0.348877 | 0.021301 | 0 | 0 | 0 | 0.02381 | 0 | 1 | 0.025641 | false | 0.025641 | 0.025641 | 0 | 0.051282 | 0.128205 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
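The menu treats any unrecognized input as a material name and relies on the `KeyError` from the dictionary lookup to detect bad names, an EAFP-style dispatch. The same shape without the interactive loop (material names and values below are made up for illustration, not from pyconductor):

```python
materials = {"copper": 5.96e7, "aluminum": 3.77e7}  # conductivity in S/m (illustrative)

def describe(name):
    try:
        sigma = materials[name.lower()]   # EAFP: just try the lookup
    except KeyError:
        return f"{name} is not a valid material"
    return f"{name}: {sigma:.2e} S/m"

print(describe("copper"))        # copper: 5.96e+07 S/m
print(describe("unobtainium"))   # unobtainium is not a valid material
```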
2980f0b218fed38559f7aa3fa0718ba902a95fb9 | 7,657 | py | Python | hwilib/devices/keepkey.py | cjackie/HWI | 8c1b50aaaac37714b5d61f720b4b06f8aa24c73a | [
"MIT"
] | 285 | 2019-01-31T03:10:19.000Z | 2022-03-31T10:38:37.000Z | hwilib/devices/keepkey.py | cjackie/HWI | 8c1b50aaaac37714b5d61f720b4b06f8aa24c73a | [
"MIT"
] | 426 | 2019-01-31T10:38:02.000Z | 2022-03-28T15:58:13.000Z | hwilib/devices/keepkey.py | cjackie/HWI | 8c1b50aaaac37714b5d61f720b4b06f8aa24c73a | [
"MIT"
] | 128 | 2019-01-30T22:32:32.000Z | 2022-03-28T19:23:46.000Z | """
Keepkey
*******
"""
from ..errors import (
    DEVICE_NOT_INITIALIZED,
    DeviceNotReadyError,
    common_err_msgs,
    handle_errors,
)
from .trezorlib import protobuf as p
from .trezorlib.transport import (
    hid,
    udp,
    webusb,
)
from .trezor import TrezorClient, HID_IDS, WEBUSB_IDS
from .trezorlib.messages import (
    DebugLinkState,
    Features,
    HDNodeType,
    ResetDevice,
)
from typing import (
    Any,
    Dict,
    List,
    Optional,
)
py_enumerate = enumerate # Need to use the enumerate built-in but there's another function already named that
KEEPKEY_HID_IDS = {(0x2B24, 0x0001)}
KEEPKEY_WEBUSB_IDS = {(0x2B24, 0x0002)}
KEEPKEY_SIMULATOR_PATH = '127.0.0.1:11044'
HID_IDS.update(KEEPKEY_HID_IDS)
WEBUSB_IDS.update(KEEPKEY_WEBUSB_IDS)
class KeepkeyFeatures(Features):  # type: ignore
    def __init__(
        self,
        *,
        firmware_variant: Optional[str] = None,
        firmware_hash: Optional[bytes] = None,
        **kwargs: Any,
    ) -> None:
        super().__init__(**kwargs)
        self.firmware_variant = firmware_variant
        self.firmware_hash = firmware_hash

    @classmethod
    def get_fields(cls) -> Dict[int, p.FieldInfo]:
        return {
            1: ('vendor', p.UnicodeType, None),
            2: ('major_version', p.UVarintType, None),
            3: ('minor_version', p.UVarintType, None),
            4: ('patch_version', p.UVarintType, None),
            5: ('bootloader_mode', p.BoolType, None),
            6: ('device_id', p.UnicodeType, None),
            7: ('pin_protection', p.BoolType, None),
            8: ('passphrase_protection', p.BoolType, None),
            9: ('language', p.UnicodeType, None),
            10: ('label', p.UnicodeType, None),
            12: ('initialized', p.BoolType, None),
            13: ('revision', p.BytesType, None),
            14: ('bootloader_hash', p.BytesType, None),
            15: ('imported', p.BoolType, None),
            16: ('unlocked', p.BoolType, None),
            21: ('model', p.UnicodeType, None),
            22: ('firmware_variant', p.UnicodeType, None),
            23: ('firmware_hash', p.BytesType, None),
            24: ('no_backup', p.BoolType, None),
            25: ('wipe_code_protection', p.BoolType, None),
        }


class KeepkeyResetDevice(ResetDevice):  # type: ignore
    def __init__(
        self,
        *,
        auto_lock_delay_ms: Optional[int] = None,
        **kwargs: Any,
    ) -> None:
        super().__init__(**kwargs)
        self.auto_lock_delay_ms = auto_lock_delay_ms

    @classmethod
    def get_fields(cls) -> Dict[int, p.FieldInfo]:
        return {
            1: ('display_random', p.BoolType, None),
            2: ('strength', p.UVarintType, 256),  # default=256
            3: ('passphrase_protection', p.BoolType, None),
            4: ('pin_protection', p.BoolType, None),
            5: ('language', p.UnicodeType, "en-US"),  # default=en-US
            6: ('label', p.UnicodeType, None),
            7: ('no_backup', p.BoolType, None),
            8: ('auto_lock_delay_ms', p.UVarintType, None),
            9: ('u2f_counter', p.UVarintType, None),
        }


class KeepkeyDebugLinkState(DebugLinkState):  # type: ignore
    def __init__(
        self,
        *,
        recovery_cipher: Optional[str] = None,
        recovery_auto_completed_word: Optional[str] = None,
        firmware_hash: Optional[bytes] = None,
        storage_hash: Optional[bytes] = None,
        **kwargs: Any,
    ) -> None:
        super().__init__(**kwargs)
        self.recovery_cipher = recovery_cipher
        self.recovery_auto_completed_word = recovery_auto_completed_word
        self.firmware_hash = firmware_hash
        self.storage_hash = storage_hash

    @classmethod
    def get_fields(cls) -> Dict[int, p.FieldType]:
        return {
            1: ('layout', p.BytesType, None),
            2: ('pin', p.UnicodeType, None),
            3: ('matrix', p.UnicodeType, None),
            4: ('mnemonic_secret', p.BytesType, None),
            5: ('node', HDNodeType, None),
            6: ('passphrase_protection', p.BoolType, None),
            7: ('reset_word', p.UnicodeType, None),
            8: ('reset_entropy', p.BytesType, None),
            9: ('recovery_fake_word', p.UnicodeType, None),
            10: ('recovery_word_pos', p.UVarintType, None),
            11: ('recovery_cipher', p.UnicodeType, None),
            12: ('recovery_auto_completed_word', p.UnicodeType, None),
            13: ('firmware_hash', p.BytesType, None),
            14: ('storage_hash', p.BytesType, None),
        }


class KeepkeyClient(TrezorClient):
    def __init__(self, path: str, password: str = "", expert: bool = False) -> None:
        """
        The `KeepkeyClient` is a `HardwareWalletClient` for interacting with the Keepkey.

        As Keepkeys are clones of the Trezor 1, please refer to `TrezorClient` for documentation.
        """
        super(KeepkeyClient, self).__init__(path, password, expert, KEEPKEY_HID_IDS, KEEPKEY_WEBUSB_IDS, KEEPKEY_SIMULATOR_PATH)
        self.type = 'Keepkey'
        self.client.vendors = ("keepkey.com")
        self.client.minimum_versions = {"K1-14AM": (0, 0, 0)}
        self.client.map_type_to_class_override[KeepkeyFeatures.MESSAGE_WIRE_TYPE] = KeepkeyFeatures
        self.client.map_type_to_class_override[KeepkeyResetDevice.MESSAGE_WIRE_TYPE] = KeepkeyResetDevice
        if self.simulator:
            self.client.debug.map_type_to_class_override[KeepkeyDebugLinkState.MESSAGE_WIRE_TYPE] = KeepkeyDebugLinkState


def enumerate(password: str = "") -> List[Dict[str, Any]]:
    results = []
    devs = hid.HidTransport.enumerate(usb_ids=KEEPKEY_HID_IDS)
    devs.extend(webusb.WebUsbTransport.enumerate(usb_ids=KEEPKEY_WEBUSB_IDS))
    devs.extend(udp.UdpTransport.enumerate(KEEPKEY_SIMULATOR_PATH))
    for dev in devs:
        d_data: Dict[str, Any] = {}

        d_data['type'] = 'keepkey'
        d_data['model'] = 'keepkey'
        d_data['path'] = dev.get_path()

        client = None

        with handle_errors(common_err_msgs["enumerate"], d_data):
            client = KeepkeyClient(d_data['path'], password)
            try:
                client.client.refresh_features()
            except TypeError:
                continue
            if 'keepkey' not in client.client.features.vendor:
                continue

            d_data['label'] = client.client.features.label
            if d_data['path'].startswith('udp:'):
                d_data['model'] += '_simulator'

            d_data['needs_pin_sent'] = client.client.features.pin_protection and not client.client.features.unlocked
            # Always need the passphrase sent for Keepkey if it has passphrase protection enabled
            d_data['needs_passphrase_sent'] = client.client.features.passphrase_protection
            if d_data['needs_pin_sent']:
                raise DeviceNotReadyError('Keepkey is locked. Unlock by using \'promptpin\' and then \'sendpin\'.')
            if d_data['needs_passphrase_sent'] and not password:
                raise DeviceNotReadyError("Passphrase needs to be specified before the fingerprint information can be retrieved")
            if client.client.features.initialized:
                d_data['fingerprint'] = client.get_master_fingerprint().hex()
                # Passphrase is always needed for the above to have worked, so it's already sent
                d_data['needs_passphrase_sent'] = False
            else:
                d_data['error'] = 'Not initialized'
                d_data['code'] = DEVICE_NOT_INITIALIZED

        if client:
            client.close()

        results.append(d_data)

    return results
| 37.534314 | 176 | 0.614601 | 863 | 7,657 | 5.225956 | 0.273465 | 0.019956 | 0.04612 | 0.030599 | 0.206874 | 0.092018 | 0.092018 | 0.077827 | 0.054989 | 0.045676 | 0 | 0.018095 | 0.263811 | 7,657 | 203 | 177 | 37.719212 | 0.781976 | 0.0653 | 0 | 0.189349 | 0 | 0 | 0.126143 | 0.021657 | 0 | 0 | 0.003375 | 0 | 0 | 1 | 0.047337 | false | 0.065089 | 0.04142 | 0.017751 | 0.136095 | 0.011834 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
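In `enumerate()`, device probing runs inside `handle_errors(...)`, which records failures into the per-device result dict instead of aborting the whole scan. A minimal sketch of that pattern (the real hwilib helper differs in detail, e.g. it also records error codes):

```python
from contextlib import contextmanager

@contextmanager
def handle_errors(msg, result):
    """Record any exception into result['error'] instead of propagating it."""
    try:
        yield
    except Exception as e:
        result['error'] = f"{msg}: {e}"

d_data = {'type': 'keepkey', 'path': 'udp:127.0.0.1:11044'}
with handle_errors("enumerate", d_data):
    raise RuntimeError("no device")   # stands in for a failed device probe

print(d_data['error'])  # enumerate: no device
```

Because the context manager swallows the exception, the loop can still append the partial `d_data` entry and move on to the next transport.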
2986e8913a7519f773b1d594848f51448026d50a | 583 | py | Python | utils/HTMLParser.py | onyb/janitor | a46f3bf23467a27c6f5891b64c797295e5cc47d0 | [
"Apache-2.0"
] | null | null | null | utils/HTMLParser.py | onyb/janitor | a46f3bf23467a27c6f5891b64c797295e5cc47d0 | [
"Apache-2.0"
] | null | null | null | utils/HTMLParser.py | onyb/janitor | a46f3bf23467a27c6f5891b64c797295e5cc47d0 | [
"Apache-2.0"
] | null | null | null | from bs4 import BeautifulSoup
from optimizers.AdvancedJSOptimizer import AdvancedJSOptimizer
from optimizers.CSSOptimizer import CSSOptimizer
class HTMLParser(object):
def __init__(self, html):
self.soup = BeautifulSoup(html, 'lxml')
def js_parser(self):
for script in self.soup.find_all('script'):
opt = AdvancedJSOptimizer()
script.string = opt.process(script.string)
def css_parser(self):
for style in self.soup.find_all('style'):
opt = CSSOptimizer()
style.string = opt.process(style.string) | 32.388889 | 62 | 0.680961 | 66 | 583 | 5.893939 | 0.424242 | 0.061697 | 0.066838 | 0.071979 | 0.087404 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002217 | 0.226415 | 583 | 18 | 63 | 32.388889 | 0.86031 | 0 | 0 | 0 | 0 | 0 | 0.025685 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.214286 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
461ea3bb055956b5f646cce50edcc52ff396af68 | 4,041 | py | Python | pcg_gazebo/parsers/urdf/__init__.py | TForce1/pcg_gazebo | 9ff88016b7b6903236484958ca7c6ed9f8ffb346 | [
"ECL-2.0",
"Apache-2.0"
] | 40 | 2020-02-04T18:16:49.000Z | 2022-02-22T11:36:34.000Z | pcg_gazebo/parsers/urdf/__init__.py | awesomebytes/pcg_gazebo | 4f335dd460ef7c771f1df78b46a92fad4a62cedc | [
"ECL-2.0",
"Apache-2.0"
] | 75 | 2020-01-23T13:40:50.000Z | 2022-02-09T07:26:01.000Z | pcg_gazebo/parsers/urdf/__init__.py | GimpelZhang/gazebo_world_generator | eb7215499d0ddc972d804c988fadab1969579b1b | [
"ECL-2.0",
"Apache-2.0"
] | 18 | 2020-09-10T06:35:41.000Z | 2022-02-20T19:08:17.000Z | # Copyright (c) 2019 - The Procedural Generation for Gazebo authors
# For information on the respective copyright owner see the NOTICE file
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from .actuator import Actuator
from .axis import Axis
from .box import Box
from .child import Child
from .collision import Collision
from .color import Color
from .cylinder import Cylinder
from .dynamics import Dynamics
from .gazebo import Gazebo
from .geometry import Geometry
from .hardware_interface import HardwareInterface
from .inertia import Inertia
from .inertial import Inertial
from .joint import Joint
from .limit import Limit
from .link import Link
from .mass import Mass
from .material import Material
from .mechanical_reduction import MechanicalReduction
from .mesh import Mesh
from .mimic import Mimic
from .origin import Origin
from .parent import Parent
from .robot import Robot
from .safety_controller import SafetyController
from .sphere import Sphere
from .texture import Texture
from .transmission import Transmission
from .type import Type
from .visual import Visual
def get_all_urdf_element_classes():
    """Get list of all URDF element classes."""
    import sys
    import inspect
    from ..types import XMLBase
    output = list()
    current_module = sys.modules[__name__]
    for name, obj in inspect.getmembers(current_module):
        if inspect.isclass(obj):
            if issubclass(obj, XMLBase) and obj._TYPE == 'urdf':
                output.append(obj)
    return output


def create_urdf_element(tag, *args):
    """URDF element factory.

    > *Input arguments*

    * `tag` (*type:* `str`): Name of the URDF element.
    * `args`: Extra arguments for URDF element constructor.

    > *Returns*

    URDF element if `tag` refers to a valid URDF element.
    `None`, otherwise.
    """
    import sys
    import inspect
    from ..types import XMLBase
    current_module = sys.modules[__name__]
    for name, obj in inspect.getmembers(current_module):
        if inspect.isclass(obj):
            if issubclass(obj, XMLBase):
                if tag == obj._NAME and obj._TYPE == 'urdf':
                    return obj(*args)
    return None


def create_urdf_type(tag):
    """Return handle of the URDF element type.

    > *Input arguments*

    * `tag` (*type:* `str`): Name of the URDF element.

    > *Returns*

    URDF element type if `tag` is valid, `None` otherwise.
    """
    import sys
    import inspect
    from ..types import XMLBase
    current_module = sys.modules[__name__]
    for name, obj in inspect.getmembers(current_module):
        if inspect.isclass(obj):
            if issubclass(obj, XMLBase):
                if tag == obj._NAME and obj._TYPE == 'urdf':
                    return obj
    return None


def is_urdf_element(obj):
    """Test if XML element is an URDF element."""
    from ..types import XMLBase
    return obj.__class__ in XMLBase.__subclasses__() and \
        obj._TYPE == 'urdf'


__all__ = [
    'get_all_urdf_element_classes',
    'create_urdf_element',
    'create_urdf_type',
    'is_urdf_element',
    'Actuator',
    'Axis',
    'Box',
    'Child',
    'Collision',
    'Color',
    'Cylinder',
    'Dynamics',
    'Gazebo',
    'Geometry',
    'HardwareInterface',
    'Inertia',
    'Inertial',
    'Joint',
    'Limit',
    'Link',
    'Mass',
    'Material',
    'MechanicalReduction',
    'Mesh',
    'Mimic',
    'Origin',
    'Parent',
    'Robot',
    'SafetyController',
    'Sphere',
    'Texture',
    'Transmission',
    'Type',
    'Visual'
]
4626daaa44d52cdbb1bec3a34b51700caf38c8dc | 448 | py | Python | tests/test_lamost_tools.py | igomezv/astroNN | 50af116f9cbfc684b63e7ddcf8829343a455722b | [
"MIT"
] | 156 | 2017-10-22T01:29:10.000Z | 2022-03-14T10:28:09.000Z | tests/test_lamost_tools.py | AbdulfattahBaalawi/astroNN | 0b970dd1a8d4d5e6d611ffa52cfd3c2ffdcb4643 | [
"MIT"
] | 16 | 2017-11-02T21:29:28.000Z | 2022-03-14T08:40:41.000Z | tests/test_lamost_tools.py | AbdulfattahBaalawi/astroNN | 0b970dd1a8d4d5e6d611ffa52cfd3c2ffdcb4643 | [
"MIT"
] | 46 | 2017-11-01T18:56:03.000Z | 2022-03-07T06:44:22.000Z | import unittest
import numpy as np
from astroNN.lamost import wavelength_solution, pseudo_continuum
class LamostToolsTestCase(unittest.TestCase):
    def test_wavelength_solution(self):
        wavelength_solution()
        wavelength_solution(dr=5)
        self.assertRaises(ValueError, wavelength_solution, dr=1)

    def test_norm(self):
        pseudo_continuum(np.ones(3909), np.ones(3909))


if __name__ == '__main__':
    unittest.main()
| 23.578947 | 64 | 0.734375 | 53 | 448 | 5.886792 | 0.54717 | 0.288462 | 0.128205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0271 | 0.176339 | 448 | 18 | 65 | 24.888889 | 0.818428 | 0 | 0 | 0 | 0 | 0 | 0.017857 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.166667 | false | 0 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
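The same `TestCase` pattern in a self-contained form, run programmatically instead of via `unittest.main()`; the function under test here is a stand-in, not astroNN's real `wavelength_solution`:

```python
import unittest

def wavelength_solution(dr=5):
    """Stand-in: accepts only the data releases it claims to support."""
    if dr not in (5, 7):
        raise ValueError(f"Unsupported data release: {dr}")
    return [3690.0 + 0.1 * i for i in range(10)]  # toy wavelength grid

class DemoTestCase(unittest.TestCase):
    def test_wavelength_solution(self):
        wavelength_solution()
        wavelength_solution(dr=5)
        self.assertRaises(ValueError, wavelength_solution, dr=1)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DemoTestCase)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```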
462c2089ebfd3afcf679c3d29b9f6e291acb4dc2 | 525 | py | Python | src/solutions/01.py | NNRepos/AoC-2021-python-solutions | 556ccc920b96cedbdc2f554a3bee28a793be4483 | [
"MIT"
] | null | null | null | src/solutions/01.py | NNRepos/AoC-2021-python-solutions | 556ccc920b96cedbdc2f554a3bee28a793be4483 | [
"MIT"
] | null | null | null | src/solutions/01.py | NNRepos/AoC-2021-python-solutions | 556ccc920b96cedbdc2f554a3bee28a793be4483 | [
"MIT"
] | null | null | null | from utils.utils import *
lines = get_input(__file__)
lines_as_nums = lines_to_nums(lines)


def part1(nums):
    incr = 0
    cur = nums[0]
    for num in nums:
        if num > cur:
            incr += 1
        cur = num
    return incr


def part2():
    nums = []
    for i in range(len(lines_as_nums)):
        if i < len(lines_as_nums) - 2:
            nums.append(lines_as_nums[i] + lines_as_nums[i + 1] + lines_as_nums[i + 2])
    return part1(nums)


print("part1:", part1(lines_as_nums))
print("part2:", part2())
| 19.444444 | 87 | 0.590476 | 82 | 525 | 3.52439 | 0.341463 | 0.16955 | 0.266436 | 0.124567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034392 | 0.28 | 525 | 26 | 88 | 20.192308 | 0.730159 | 0 | 0 | 0 | 0 | 0 | 0.022857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.263158 | 0.105263 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
462c4f9e4def6f4455874dce4f3095e44613b4b1 | 1,372 | py | Python | tensorforce/core/baselines/mlp_baseline.py | youlei202/tensorforce-lei | 871ef7f5c41d496aa8ad674854792ebd52ce1546 | [
"Apache-2.0"
] | 1 | 2019-12-21T03:31:33.000Z | 2019-12-21T03:31:33.000Z | tensorforce/core/baselines/mlp_baseline.py | youlei202/tensorforce-lei | 871ef7f5c41d496aa8ad674854792ebd52ce1546 | [
"Apache-2.0"
] | null | null | null | tensorforce/core/baselines/mlp_baseline.py | youlei202/tensorforce-lei | 871ef7f5c41d496aa8ad674854792ebd52ce1546 | [
"Apache-2.0"
] | 1 | 2019-12-21T03:31:39.000Z | 2019-12-21T03:31:39.000Z | # Copyright 2017 reinforce.io. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
from tensorforce.core.baselines import NetworkBaseline
class MLPBaseline(NetworkBaseline):
    """
    Multi-layer perceptron baseline (single-state) consisting of dense layers.
    """

    def __init__(self, sizes, scope='mlp-baseline', summary_labels=()):
        """
        Multi-layer perceptron baseline.

        Args:
            sizes: List of dense layer sizes
        """
        layers_spec = []
        for size in sizes:
            layers_spec.append({'type': 'dense', 'size': size})

        super(MLPBaseline, self).__init__(layers_spec, scope, summary_labels)
| 33.463415 | 80 | 0.671283 | 168 | 1,372 | 5.321429 | 0.613095 | 0.067114 | 0.053691 | 0.035794 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00724 | 0.194606 | 1,372 | 40 | 81 | 34.3 | 0.80181 | 0.585277 | 0 | 0 | 0 | 0 | 0.0499 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.6 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
4637594ad65e429cbd0184284c782da6df047d1a | 482 | py | Python | notifai_recruitment/api.py | BudzynskiMaciej/notifai_recruitment | 56860db3a2dad6115747a675895b8f7947e7e12e | [
"MIT"
] | null | null | null | notifai_recruitment/api.py | BudzynskiMaciej/notifai_recruitment | 56860db3a2dad6115747a675895b8f7947e7e12e | [
"MIT"
] | 2 | 2021-05-21T13:26:26.000Z | 2022-02-10T10:04:55.000Z | notifai_recruitment/api.py | BudzynskiMaciej/notifai_recruitment | 56860db3a2dad6115747a675895b8f7947e7e12e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""API routes config for notifai_recruitment project.
REST framework adds support for automatic URL routing to Django, and provides simple, quick and consistent
way of wiring view logic to a set of URLs.
For more information on this file, see
https://www.django-rest-framework.org/api-guide/routers/
"""
from rest_framework import routers
from textify.api.views import NoteViewSet
router = routers.DefaultRouter()
router.register(r'notes', NoteViewSet)
| 28.352941 | 106 | 0.778008 | 71 | 482 | 5.253521 | 0.746479 | 0.104558 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002392 | 0.13278 | 482 | 16 | 107 | 30.125 | 0.889952 | 0.665975 | 0 | 0 | 0 | 0 | 0.03268 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
4643409696cd3d49a508459df5a413ef73fb761e | 301 | py | Python | src/kol/request/CampgroundRestRequest.py | danheath/temppykol | 7f9621b44df9f9d2d9fc0a5b2a06db116b9ccfab | [
"BSD-3-Clause"
] | 19 | 2015-02-16T08:30:49.000Z | 2020-05-01T06:06:33.000Z | src/kol/request/CampgroundRestRequest.py | danheath/temppykol | 7f9621b44df9f9d2d9fc0a5b2a06db116b9ccfab | [
"BSD-3-Clause"
] | 5 | 2015-01-13T23:01:54.000Z | 2016-11-30T15:23:43.000Z | src/kol/request/CampgroundRestRequest.py | danheath/temppykol | 7f9621b44df9f9d2d9fc0a5b2a06db116b9ccfab | [
"BSD-3-Clause"
] | 19 | 2015-05-28T09:36:19.000Z | 2022-03-15T23:19:29.000Z | from kol.request.GenericRequest import GenericRequest
class CampgroundRestRequest(GenericRequest):
    "Rests at the user's campground."

    def __init__(self, session):
        super(CampgroundRestRequest, self).__init__(session)
        self.url = session.serverURL + 'campground.php?action=rest'
| 33.444444 | 67 | 0.750831 | 32 | 301 | 6.8125 | 0.71875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156146 | 301 | 8 | 68 | 37.625 | 0.858268 | 0.10299 | 0 | 0 | 0 | 0 | 0.189369 | 0.086379 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
464a949d4e46e87b22e002325b18acfc9b8c6a90 | 592 | py | Python | src/storage-preview/azext_storage_preview/vendored_sdks/azure_storage/v2018_03_28/file/__init__.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 207 | 2017-11-29T06:59:41.000Z | 2022-03-31T10:00:53.000Z | src/storage-preview/azext_storage_preview/vendored_sdks/azure_storage/v2018_03_28/file/__init__.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 4,061 | 2017-10-27T23:19:56.000Z | 2022-03-31T23:18:30.000Z | src/storage-preview/azext_storage_preview/vendored_sdks/azure_storage/v2018_03_28/file/__init__.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 802 | 2017-10-11T17:36:26.000Z | 2022-03-31T22:24:32.000Z | # -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
from .fileservice import FileService
from .models import (
    Share,
    ShareProperties,
    File,
    FileProperties,
    Directory,
    DirectoryProperties,
    FileRange,
    ContentSettings,
    CopyProperties,
    SharePermissions,
    FilePermissions,
    DeleteSnapshot,
)
| 28.190476 | 76 | 0.540541 | 41 | 592 | 7.804878 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163851 | 592 | 20 | 77 | 29.6 | 0.646465 | 0.505068 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.133333 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
465234088d4677a447f6bdb4657e058e1a45f5b8 | 2,234 | py | Python | jinja2content.py | firemark/new-site | b7d54320f8e1cfae489108f87f64761ce2510676 | [
"MIT"
] | null | null | null | jinja2content.py | firemark/new-site | b7d54320f8e1cfae489108f87f64761ce2510676 | [
"MIT"
] | null | null | null | jinja2content.py | firemark/new-site | b7d54320f8e1cfae489108f87f64761ce2510676 | [
"MIT"
] | null | null | null | """
jinja2content.py
----------------

DONT EDIT THIS FILE

Pelican plugin that processes Markdown files as jinja templates.
"""
from jinja2 import Environment, FileSystemLoader, ChoiceLoader

import os

from pelican import signals
from pelican.readers import MarkdownReader, HTMLReader, RstReader
from pelican.utils import pelican_open

from tempfile import NamedTemporaryFile


class JinjaContentMixin:
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

        # will look first in 'JINJA2CONTENT_TEMPLATES', by default the
        # content root path, then in the theme's templates
        local_dirs = self.settings.get('JINJA2CONTENT_TEMPLATES', ['.'])
        local_dirs = [os.path.join(self.settings['PATH'], folder)
                      for folder in local_dirs]
        theme_dir = os.path.join(self.settings['THEME'], 'templates')

        loaders = [FileSystemLoader(_dir) for _dir
                   in local_dirs + [theme_dir]]
        if 'JINJA_ENVIRONMENT' in self.settings:  # pelican 3.7
            jinja_environment = self.settings['JINJA_ENVIRONMENT']
        else:
            jinja_environment = {
                'trim_blocks': True,
                'lstrip_blocks': True,
                'extensions': self.settings['JINJA_EXTENSIONS']
            }

        self.env = Environment(
            loader=ChoiceLoader(loaders),
            **jinja_environment)

    def read(self, source_path):
        with pelican_open(source_path) as text:
            text = self.env.from_string(text).render()

        with NamedTemporaryFile(delete=False) as f:
            f.write(text.encode())
            f.close()
            content, metadata = super().read(f.name)
            os.unlink(f.name)
            return content, metadata


class JinjaMarkdownReader(JinjaContentMixin, MarkdownReader):
    pass


class JinjaRstReader(JinjaContentMixin, RstReader):
    pass


class JinjaHTMLReader(JinjaContentMixin, HTMLReader):
    pass


def add_reader(readers):
    for Reader in [JinjaMarkdownReader, JinjaRstReader, JinjaHTMLReader]:
        for ext in Reader.file_extensions:
            readers.reader_classes[ext] = Reader


def register():
    signals.readers_init.connect(add_reader)
| 31.027778 | 73 | 0.653984 | 238 | 2,234 | 5.987395 | 0.411765 | 0.050526 | 0.025263 | 0.019649 | 0.057544 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003574 | 0.248433 | 2,234 | 71 | 74 | 31.464789 | 0.845146 | 0.108774 | 0 | 0.06383 | 0 | 0 | 0.063636 | 0.011616 | 0 | 0 | 0 | 0 | 0 | 1 | 0.085106 | false | 0.06383 | 0.12766 | 0 | 0.319149 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4654149e1951b836beb90c90a51fb7d22a7e21c8 | 200 | py | Python | Aulas/12aula(antigo)/readint.py | rafaelmcam/RTOs_ChibiOS | 08d8e21f2c7185d2c47846f67cbfba70c706d689 | [
"MIT"
] | 1 | 2019-05-14T22:31:25.000Z | 2019-05-14T22:31:25.000Z | Aulas/12aula(antigo)/readint.py | rafaelmcam/RTOs_ChibiOS | 08d8e21f2c7185d2c47846f67cbfba70c706d689 | [
"MIT"
] | null | null | null | Aulas/12aula(antigo)/readint.py | rafaelmcam/RTOs_ChibiOS | 08d8e21f2c7185d2c47846f67cbfba70c706d689 | [
"MIT"
] | null | null | null | import serial
with serial.Serial("/dev/ttyUSB0", 115200) as ser:
    while 1:
        for i in range(5):
            n = ser.read()[0]
            print("{:x}".format(n))
        print("--------")
| 18.181818 | 50 | 0.47 | 26 | 200 | 3.615385 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.325 | 200 | 10 | 51 | 20 | 0.622222 | 0 | 0 | 0 | 0 | 0 | 0.120603 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.285714 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4658a352b7ba7209186ef3d47f169f46b8660613 | 2,182 | py | Python | src/visualization/visualize_dataset.py | ivangarrera/MachineLearning | c13584cdcb7c4df1ab2814cf42a3c2bd3c203e75 | [
"MIT"
] | null | null | null | src/visualization/visualize_dataset.py | ivangarrera/MachineLearning | c13584cdcb7c4df1ab2814cf42a3c2bd3c203e75 | [
"MIT"
] | null | null | null | src/visualization/visualize_dataset.py | ivangarrera/MachineLearning | c13584cdcb7c4df1ab2814cf42a3c2bd3c203e75 | [
"MIT"
] | null | null | null | from common_clustering import CommonClustering
# clustering_features = CommonClustering(r'C:\Users\ivangarrera\Desktop\T2_cleaned.csv')
clustering_features = CommonClustering('D:\Ing. Informatica\Cuarto\Machine Learning\T2_cleaned_gyroscope.csv')
attr = list(clustering_features.data_set)[0][:list(clustering_features.data_set)[0].find('_')]
clustering_features.attr = attr
clustering_features.PrincipalComponentAnalysis(num_components=2)
# Get the number of clusters that provides the best results
ideal_number_of_clusters = clustering_features.getBestNumberOfClusters()
# Plot silhuettes array
clustering_features.PlotSilhouettes()
# Print k-means with the best number of clusters that have been found
labels = clustering_features.KMeansWithIdeal(ideal_number_of_clusters)
# Interpret k-means groups
clustering_features.data_set['labels'] = labels
data_set_labels_mean = clustering_features.data_set.groupby(['labels']).mean()
# Plot 3D graph to interpret k-means groups
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
fig = plt.figure()
ax = Axes3D(fig)
ax.scatter(data_set_labels_mean.values[:,0],
           data_set_labels_mean.values[:,1],
           data_set_labels_mean.values[:,2])
plt.savefig(r'../../reports/figures/centroids3D_{}.png'.format(attr))
plt.show()
# Agglomerative clustering algorithm using nearest neighbors matrix
clustering_features.AgglomerativeClusteringWithNearestNeighbors()
# DBSCAN Clustering algorithm
labels = clustering_features.DBSCANClustering()
# Interpret outliers
clustering_features.data_set['labels'] = labels
data_set_outliers = clustering_features.data_set.loc[(clustering_features.data_set['labels'] == -1)]
# Show outliers in a 3D graph with all points in the dataset
fig = plt.figure()
ax = Axes3D(fig)
ax.scatter(clustering_features.data_set.values[:,0],
           clustering_features.data_set.values[:,1],
           clustering_features.data_set.values[:,2])
ax.scatter(data_set_outliers.values[:,0],
           data_set_outliers.values[:,1],
           data_set_outliers.values[:,2], c='red', s=50)
plt.savefig(r'../../reports/figures/outliers3D_{}.png'.format(attr))
plt.show()
| 36.983051 | 110 | 0.779102 | 284 | 2,182 | 5.764085 | 0.369718 | 0.208919 | 0.134392 | 0.152718 | 0.327428 | 0.129505 | 0.092853 | 0.092853 | 0 | 0 | 0 | 0.012834 | 0.107241 | 2,182 | 58 | 111 | 37.62069 | 0.827002 | 0.219523 | 0 | 0.242424 | 0 | 0 | 0.103428 | 0.08156 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4660b825bf1a5e031627c3620c78b68944deb5c7 | 652 | py | Python | glue/core/tests/test_message.py | ejeschke/glue | 21689e3474aeaeb70e258d76c60755596856976c | [
"BSD-3-Clause"
] | 3 | 2015-09-10T22:23:55.000Z | 2019-04-04T18:47:33.000Z | glue/core/tests/test_message.py | ejeschke/glue | 21689e3474aeaeb70e258d76c60755596856976c | [
"BSD-3-Clause"
] | null | null | null | glue/core/tests/test_message.py | ejeschke/glue | 21689e3474aeaeb70e258d76c60755596856976c | [
"BSD-3-Clause"
] | 1 | 2019-08-04T14:10:12.000Z | 2019-08-04T14:10:12.000Z | from __future__ import absolute_import, division, print_function
import pytest

from .. import message as msg


def test_invalid_subset_msg():
    with pytest.raises(TypeError) as exc:
        msg.SubsetMessage(None)
    assert exc.value.args[0].startswith('Sender must be a subset')


def test_invalid_data_msg():
    with pytest.raises(TypeError) as exc:
        msg.DataMessage(None)
    assert exc.value.args[0].startswith('Sender must be a data')


def test_invalid_data_collection_msg():
    with pytest.raises(TypeError) as exc:
        msg.DataCollectionMessage(None)
    assert exc.value.args[0].startswith('Sender must be a DataCollection')
| 27.166667 | 74 | 0.739264 | 91 | 652 | 5.120879 | 0.384615 | 0.045064 | 0.090129 | 0.122318 | 0.527897 | 0.527897 | 0.527897 | 0.527897 | 0.296137 | 0.296137 | 0 | 0.005525 | 0.167178 | 652 | 23 | 75 | 28.347826 | 0.85267 | 0 | 0 | 0.2 | 0 | 0 | 0.115031 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4660df17d48e40efbff3c55617fa7393819b5977 | 1,358 | py | Python | fabfile/config.py | kurochan/config-collector | 656da97eb219eb5bcf913173dd7aa76d0cedd44c | [
"MIT"
] | 1 | 2017-07-30T17:35:10.000Z | 2017-07-30T17:35:10.000Z | fabfile/config.py | kurochan/config-collector | 656da97eb219eb5bcf913173dd7aa76d0cedd44c | [
"MIT"
] | null | null | null | fabfile/config.py | kurochan/config-collector | 656da97eb219eb5bcf913173dd7aa76d0cedd44c | [
"MIT"
] | 1 | 2015-03-01T08:52:14.000Z | 2015-03-01T08:52:14.000Z | # -*- coding: utf-8 -*-
import os
import util

from fabric.api import *
from fabric.state import output
from fabric.colors import *

from base import BaseTask
from helper.print_helper import task_puts


class CollectConfig(BaseTask):
    """
    collect configuration
    """
    name = "collect"

    def run_task(self, *args, **kwargs):
        host_config = env.inventory.get_variables(env.host)
        hostname = host_config['ssh_host']
        if not util.tcping(hostname, 22, 1):
            task_puts("host {0} does not exist. skip...".format(hostname))
            return

        config = self.get_config(hostname, host_config['ssh_user'], host_config['ssh_pass'], host_config['exec_pass'], host_config['type'])
        self.write_config(env.host, config)
        # print config

    def get_config(self, hostname, ssh_user, ssh_pass, exec_pass, os_type):
        script_name = "dump-config-cisco-{0}.sh".format(os_type)
        config = local(os.path.dirname(os.path.abspath(__file__)) + "/../bin/{0} {1} {2} {3}".format(script_name, ssh_user, hostname, ssh_pass), capture = True)
        return config

    def write_config(self, hostname, config):
        output_dir = os.path.dirname(os.path.abspath(__file__)) + "/../tmp/config"
        local("mkdir -p {0}".format(output_dir))
        file = open("{0}/{1}.txt".format(output_dir, hostname), 'w')
        file.write(str(config))
        file.close()


collect = CollectConfig()
| 33.121951 | 156 | 0.690722 | 194 | 1,358 | 4.634021 | 0.386598 | 0.077864 | 0.043382 | 0.046719 | 0.066741 | 0.066741 | 0.066741 | 0 | 0 | 0 | 0 | 0.011314 | 0.153903 | 1,358 | 40 | 157 | 33.95 | 0.771105 | 0.041973 | 0 | 0 | 0 | 0 | 0.125 | 0.018634 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107143 | false | 0.107143 | 0.25 | 0 | 0.5 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4661333ffeca10b7026c68a47b44fc3be83ff093 | 2,334 | py | Python | python/paddle/fluid/tests/unittests/ir/inference/test_trt_transpose_flatten_concat_fuse_pass.py | LWhite027/PaddleBox | b14bcdf285dd8829e11ab12cc815ac1b1ab62694 | [
"Apache-2.0"
] | 10 | 2021-05-12T07:20:32.000Z | 2022-03-04T08:21:56.000Z | python/paddle/fluid/tests/unittests/ir/inference/test_trt_transpose_flatten_concat_fuse_pass.py | AFLee/Paddle | 311b3b44fc7d51d4d66d90ab8a3fc0d42231afda | [
"Apache-2.0"
] | 1 | 2020-09-10T09:05:52.000Z | 2020-09-10T09:06:22.000Z | python/paddle/fluid/tests/unittests/ir/inference/test_trt_transpose_flatten_concat_fuse_pass.py | AFLee/Paddle | 311b3b44fc7d51d4d66d90ab8a3fc0d42231afda | [
"Apache-2.0"
] | 25 | 2019-12-07T02:14:14.000Z | 2021-12-30T06:16:30.000Z | # Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import numpy as np
from inference_pass_test import InferencePassTest
import paddle.fluid as fluid
import paddle.fluid.core as core
from paddle.fluid.core import AnalysisConfig
class TransposeFlattenConcatFusePassTRTTest(InferencePassTest):
    def setUp(self):
        with fluid.program_guard(self.main_program, self.startup_program):
            data1 = fluid.data(
                name="data1", shape=[8, 32, 128], dtype="float32")
            data2 = fluid.data(
                name="data2", shape=[8, 32, 128], dtype="float32")
            trans1 = fluid.layers.transpose(data1, perm=[2, 1, 0])
            trans2 = fluid.layers.transpose(data2, perm=[2, 1, 0])
            flatt1 = fluid.layers.flatten(trans1)
            flatt2 = fluid.layers.flatten(trans2)
            concat_out = fluid.layers.concat([flatt1, flatt2])
            # There are no parameters for the above structure.
            # Hence, append a batch_norm to avoid failure caused by load_combined.
            out = fluid.layers.batch_norm(concat_out, is_test=True)

        self.feeds = {
            "data1": np.random.random([8, 32, 128]).astype("float32"),
            "data2": np.random.random([8, 32, 128]).astype("float32")
        }
        self.enable_trt = True
        self.trt_parameters = TransposeFlattenConcatFusePassTRTTest.TensorRTParam(
            1 << 20, 8, 3, AnalysisConfig.Precision.Float32, False, False)
        self.fetch_list = [out]

    def test_check_output(self):
        # There is no cpu pass for transpose_flatten_concat_fuse
        if core.is_compiled_with_cuda():
            use_gpu = True
            self.check_output_with_option(use_gpu)


if __name__ == "__main__":
    unittest.main()
| 40.947368 | 83 | 0.673522 | 303 | 2,334 | 5.072607 | 0.481848 | 0.039037 | 0.015615 | 0.02082 | 0.072869 | 0.072869 | 0.042941 | 0.042941 | 0 | 0 | 0 | 0.038462 | 0.231362 | 2,334 | 56 | 84 | 41.678571 | 0.818283 | 0.322622 | 0 | 0 | 0 | 0 | 0.035806 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0.090909 | 0.181818 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4662eb3534b543f9d1857e55e3d0e8669cf078e7 | 9,315 | py | Python | pddf_psuutil/main.py | deran1980/sonic-utilities | a6ae218238e7e552f49191f81451bd55ff56ba51 | [
"Apache-2.0"
] | null | null | null | pddf_psuutil/main.py | deran1980/sonic-utilities | a6ae218238e7e552f49191f81451bd55ff56ba51 | [
"Apache-2.0"
] | 4 | 2020-04-17T06:53:05.000Z | 2020-12-01T02:37:34.000Z | pddf_psuutil/main.py | deran1980/sonic-utilities | a6ae218238e7e552f49191f81451bd55ff56ba51 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
#
# main.py
#
# Command-line utility for interacting with PSU Controller in PDDF mode in SONiC
#
try:
    import sys
    import os
    import click
    from tabulate import tabulate
    from utilities_common.util_base import UtilHelper
except ImportError as e:
    raise ImportError("%s - required module not found" % str(e))
VERSION = '2.0'
SYSLOG_IDENTIFIER = "psuutil"
PLATFORM_SPECIFIC_MODULE_NAME = "psuutil"
PLATFORM_SPECIFIC_CLASS_NAME = "PsuUtil"
# Global platform-specific psuutil class instance
platform_psuutil = None
platform_chassis = None
# Wrapper APIs so that this util is suited to both 1.0 and 2.0 platform APIs
def _wrapper_get_num_psus():
    if platform_chassis is not None:
        try:
            return platform_chassis.get_num_psus()
        except NotImplementedError:
            pass
    return platform_psuutil.get_num_psus()


def _wrapper_get_psu_name(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_name()
        except NotImplementedError:
            pass
    return "PSU {}".format(idx)


def _wrapper_get_psu_presence(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_presence()
        except NotImplementedError:
            pass
    return platform_psuutil.get_psu_presence(idx)


def _wrapper_get_psu_status(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_status()
        except NotImplementedError:
            pass
    return platform_psuutil.get_psu_status(idx)


def _wrapper_get_psu_model(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_model()
        except NotImplementedError:
            pass
    return platform_psuutil.get_model(idx)


def _wrapper_get_psu_mfr_id(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_mfr_id()
        except NotImplementedError:
            pass
    return platform_psuutil.get_mfr_id(idx)


def _wrapper_get_psu_serial(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_serial()
        except NotImplementedError:
            pass
    return platform_psuutil.get_serial(idx)


def _wrapper_get_psu_direction(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1)._fan_list[0].get_direction()
        except NotImplementedError:
            pass
    return platform_psuutil.get_direction(idx)


def _wrapper_get_output_voltage(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_voltage()
        except NotImplementedError:
            pass
    return platform_psuutil.get_output_voltage(idx)


def _wrapper_get_output_current(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_current()
        except NotImplementedError:
            pass
    return platform_psuutil.get_output_current(idx)


def _wrapper_get_output_power(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1).get_power()
        except NotImplementedError:
            pass
    return platform_psuutil.get_output_power(idx)


def _wrapper_get_fan_rpm(idx, fan_idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx-1)._fan_list[fan_idx-1].get_speed_rpm()
        except NotImplementedError:
            pass
    return platform_psuutil.get_fan_rpm(idx, fan_idx)


def _wrapper_dump_sysfs(idx):
    if platform_chassis is not None:
        try:
            return platform_chassis.get_psu(idx).dump_sysfs()
        except NotImplementedError:
            pass
    return platform_psuutil.dump_sysfs()
# ==================== CLI commands and groups ====================
# This is our main entrypoint - the main 'psuutil' command
@click.group()
def cli():
    """psuutil - Command line utility for providing PSU status"""
    global platform_psuutil
    global platform_chassis

    if os.geteuid() != 0:
        click.echo("Root privileges are required for this operation")
        sys.exit(1)

    # Load the helper class
    helper = UtilHelper()

    if not helper.check_pddf_mode():
        click.echo("PDDF mode should be supported and enabled for this platform for this operation")
        sys.exit(1)

    # Load new platform api class
    try:
        import sonic_platform.platform
        platform_chassis = sonic_platform.platform.Platform().get_chassis()
    except Exception as e:
        click.echo("Failed to load chassis due to {}".format(str(e)))

    # Load platform-specific psuutil class if 2.0 implementation is not present
    if platform_chassis is None:
        try:
            platform_psuutil = helper.load_platform_util(PLATFORM_SPECIFIC_MODULE_NAME, PLATFORM_SPECIFIC_CLASS_NAME)
        except Exception as e:
            click.echo("Failed to load {}: {}".format(PLATFORM_SPECIFIC_MODULE_NAME, str(e)))
            sys.exit(2)
# 'version' subcommand
@cli.command()
def version():
    """Display version info"""
    click.echo("PDDF psuutil version {0}".format(VERSION))


# 'numpsus' subcommand
@cli.command()
def numpsus():
    """Display number of supported PSUs on device"""
    click.echo(_wrapper_get_num_psus())
# 'status' subcommand
@cli.command()
@click.option('-i', '--index', default=-1, type=int, help="the index of PSU")
def status(index):
    """Display PSU status"""
    supported_psu = list(range(1, _wrapper_get_num_psus() + 1))
    psu_ids = []
    if (index < 0):
        psu_ids = supported_psu
    else:
        psu_ids = [index]

    header = ['PSU', 'Status']
    status_table = []

    for psu in psu_ids:
        msg = ""
        psu_name = _wrapper_get_psu_name(psu)
        if psu not in supported_psu:
            click.echo("Error! The {} is not available on the platform.\n" \
                       "Number of supported PSU - {}.".format(psu_name, len(supported_psu)))
            continue
        presence = _wrapper_get_psu_presence(psu)
        if presence:
            oper_status = _wrapper_get_psu_status(psu)
            msg = 'OK' if oper_status else "NOT OK"
        else:
            msg = 'NOT PRESENT'
        status_table.append([psu_name, msg])

    if status_table:
        click.echo(tabulate(status_table, header, tablefmt="simple"))
# 'mfrinfo' subcommand
@cli.command()
@click.option('-i', '--index', default=-1, type=int, help="the index of PSU")
def mfrinfo(index):
    """Display PSU manufacturer info"""
    supported_psu = list(range(1, _wrapper_get_num_psus() + 1))
    psu_ids = []
    if (index < 0):
        psu_ids = supported_psu
    else:
        psu_ids = [index]

    for psu in psu_ids:
        psu_name = _wrapper_get_psu_name(psu)
        if psu not in supported_psu:
            click.echo("Error! The {} is not available on the platform.\n" \
                       "Number of supported PSU - {}.".format(psu_name, len(supported_psu)))
            continue
        status = _wrapper_get_psu_status(psu)
        if not status:
            click.echo("{} is Not OK\n".format(psu_name))
            continue

        model_name = _wrapper_get_psu_model(psu)
        mfr_id = _wrapper_get_psu_mfr_id(psu)
        serial_num = _wrapper_get_psu_serial(psu)
        airflow_dir = _wrapper_get_psu_direction(psu)

        click.echo("{} is OK\nManufacture Id: {}\n" \
                   "Model: {}\nSerial Number: {}\n" \
                   "Fan Direction: {}\n".format(psu_name, mfr_id, model_name, serial_num, airflow_dir.capitalize()))
# 'seninfo' subcommand
@cli.command()
@click.option('-i', '--index', default=-1, type=int, help="the index of PSU")
def seninfo(index):
    """Display PSU sensor info"""
    supported_psu = list(range(1, _wrapper_get_num_psus() + 1))
    psu_ids = []
    if (index < 0):
        psu_ids = supported_psu
    else:
        psu_ids = [index]

    for psu in psu_ids:
        psu_name = _wrapper_get_psu_name(psu)
        if psu not in supported_psu:
            click.echo("Error! The {} is not available on the platform.\n" \
                       "Number of supported PSU - {}.".format(psu_name, len(supported_psu)))
            continue
        oper_status = _wrapper_get_psu_status(psu)
        if not oper_status:
            click.echo("{} is Not OK\n".format(psu_name))
            continue

        v_out = _wrapper_get_output_voltage(psu) * 1000
        i_out = _wrapper_get_output_current(psu) * 1000
        p_out = _wrapper_get_output_power(psu) * 1000
        fan1_rpm = _wrapper_get_fan_rpm(psu, 1)
        click.echo("{} is OK\nOutput Voltage: {} mv\n" \
                   "Output Current: {} ma\nOutput Power: {} mw\n" \
                   "Fan1 Speed: {} rpm\n".format(psu_name, v_out, i_out, p_out, fan1_rpm))
@cli.group()
def debug():
    """pddf_psuutil debug commands"""
    pass


@debug.command()
def dump_sysfs():
    """Dump all PSU related SysFS paths"""
    for psu in range(_wrapper_get_num_psus()):
        status = _wrapper_dump_sysfs(psu)
        if status:
            for i in status:
                click.echo(i)


if __name__ == '__main__':
    cli()
| 31.05 | 117 | 0.646914 | 1,221 | 9,315 | 4.665029 | 0.143325 | 0.05618 | 0.041081 | 0.046699 | 0.541784 | 0.495611 | 0.481566 | 0.409586 | 0.347261 | 0.333567 | 0 | 0.0079 | 0.252603 | 9,315 | 299 | 118 | 31.153846 | 0.810256 | 0.089748 | 0 | 0.495536 | 0 | 0 | 0.097019 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09375 | false | 0.0625 | 0.035714 | 0 | 0.245536 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
466a34ecb0421da1e44f26b4a2ebb96b4fc1273b | 1,267 | py | Python | tests/simple_cmd_checks.py | Rhoynar/plmn-regression | fa58819a405b45430bbde28e52b356e04867aaa3 | [
"MIT"
] | 11 | 2019-02-07T16:13:59.000Z | 2021-08-14T03:53:14.000Z | tests/simple_cmd_checks.py | Rhoynar/plmn-regression | fa58819a405b45430bbde28e52b356e04867aaa3 | [
"MIT"
] | null | null | null | tests/simple_cmd_checks.py | Rhoynar/plmn-regression | fa58819a405b45430bbde28e52b356e04867aaa3 | [
"MIT"
] | 3 | 2019-02-07T16:14:09.000Z | 2021-08-14T05:09:17.000Z | # -*- coding: utf-8 -*-
import compat
import unittest
import sys

from plmn.utils import *
from plmn.results import *
from plmn.modem_cmds import *
from plmn.simple_cmds import *


class SimpleCmdChecks(unittest.TestCase):
    def test_simple_status_cmd(self):
        SimpleCmds.simple_status_cmd()
        assert Results.get_state('Simple Status') is not None

    def test_simple_status_get_reg_status(self):
        SimpleCmds.simple_status_get_reg_status()

    def test_simple_status_is_registered(self):
        assert SimpleCmds.simple_status_is_registered() is True

    def test_simple_status_is_home(self):
        assert SimpleCmds.simple_status_is_home() is True
        assert SimpleCmds.simple_status_is_roaming() is False

    @unittest.skip('Skipping this test since this is only applicable in connected state')
    def test_simple_status_is_connected(self):
        assert SimpleCmds.simple_status_is_connected() is True

    @unittest.skip('Skipping this as this is only applicable for Roaming scenario')
    def test_simple_status_is_roaming(self):
        assert SimpleCmds.simple_status_is_roaming() is True


if __name__ == '__main__':
    nargs = process_args()
    unittest.main(argv=sys.argv[nargs:], exit=False)
    Results.print_results()
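The `__main__` block above consumes custom CLI flags first (`process_args`) and then hands only the remaining arguments to `unittest.main`, so unittest never sees flags it does not understand. One plausible shape for that slicing (hypothetical: the real `process_args` lives in `plmn.utils`, and the `--plmn-` prefix here is an assumption for illustration):

```python
def process_args(argv):
    """Return the index where non-custom args begin.

    Counts argv[0] (the program name) plus any leading flags that use a
    hypothetical '--plmn-' prefix; everything from that index on can be
    passed to unittest.main(argv=...) untouched.
    """
    nargs = 1  # argv[0] is always consumed as the program name
    for arg in argv[1:]:
        if arg.startswith('--plmn-'):
            nargs += 1
        else:
            break
    return nargs
```

With `nargs` computed this way, `sys.argv[nargs:]` contains only the test-selection arguments, and `exit=False` lets `Results.print_results()` run after the suite finishes.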
# app/schemas/email.py — waynesun09/notify-service (MIT)

from typing import Optional, List
from pydantic import BaseModel, EmailStr

from . import result


class EmailBase(BaseModel):
    email: Optional[EmailStr] = None


class EmailSend(EmailBase):
    msg: str


class EmailResult(BaseModel):
    pre_header: Optional[str] = None
    begin: Optional[str] = None
    content: List[result.Result]
    end: Optional[str] = None
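The pydantic models above give `EmailResult` one required field (`content`) and three optional fields defaulting to `None`. A dependency-free sketch of that same shape using stdlib dataclasses (the `Result` stand-in below is hypothetical; the real one comes from `app.schemas.result`):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Result:
    """Hypothetical stand-in for result.Result from the app's schemas."""
    name: str
    passed: bool


@dataclass
class EmailResultSketch:
    """Mirrors EmailResult's shape: one required list, three optional strings."""
    content: List[Result]
    pre_header: Optional[str] = None
    begin: Optional[str] = None
    end: Optional[str] = None
```

Unlike pydantic, a dataclass performs no runtime validation; this only illustrates the required-versus-defaulted field layout.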
# setup.py — greenaddress/txjsonrpc (MIT)

from __future__ import absolute_import
from setuptools import setup

from txjsonrpc import meta
from txjsonrpc.util import dist


setup(
    name=meta.display_name,
    version=meta.version,
    description=meta.description,
    author=meta.author,
    author_email=meta.author_email,
    url=meta.url,
    license=meta.license,
    packages=dist.findPackages(meta.library_name),
    long_description=dist.catReST(
        "docs/PRELUDE.txt",
        "README",
        "docs/DEPENDENCIES.txt",
        "docs/INSTALL.txt",
        "docs/USAGE.txt",
        "TODO",
        "docs/HISTORY.txt",
        stop_on_errors=True,
        out=True),
    classifiers=[
        "Development Status :: 4 - Beta",
        "Intended Audience :: Developers",
        "Programming Language :: Python",
    ],
)