hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5d7e286fce65b02bbb505a551034d0638886042d | 2,764 | py | Python | sonnet/src/once.py | ScriptBox99/deepmind-sonnet | 5cbfdc356962d9b6198d5b63f0826a80acfdf35b | [
"Apache-2.0"
] | 10,287 | 2017-04-07T12:33:37.000Z | 2022-03-30T03:32:16.000Z | sonnet/src/once.py | ScriptBox99/deepmind-sonnet | 5cbfdc356962d9b6198d5b63f0826a80acfdf35b | [
"Apache-2.0"
] | 209 | 2017-04-07T15:57:11.000Z | 2022-03-27T10:43:03.000Z | sonnet/src/once.py | ScriptBox99/deepmind-sonnet | 5cbfdc356962d9b6198d5b63f0826a80acfdf35b | [
"Apache-2.0"
] | 1,563 | 2017-04-07T13:15:06.000Z | 2022-03-29T15:26:04.000Z | # Copyright 2019 The Sonnet Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""Utility to run functions and methods once."""
import uuid

from sonnet.src import utils

_ONCE_PROPERTY = "_snt_once"


def _check_no_output(output):
  if output is not None:
    raise ValueError("@snt.once decorated functions cannot return values")


def once(f):
  """Decorator which ensures a wrapped method is only ever run once.

  >>> @snt.once
  ... def f():
  ...   print('Hello, world!')
  >>> f()
  Hello, world!
  >>> f()
  >>> f()

  If `f` is a method then it will be evaluated once per instance:

  >>> class MyObject:
  ...   @snt.once
  ...   def f(self):
  ...     print('Hello, world!')
  >>> o = MyObject()
  >>> o.f()
  Hello, world!
  >>> o.f()
  >>> o2 = MyObject()
  >>> o2.f()
  Hello, world!
  >>> o.f()
  >>> o2.f()

  If an error is raised during execution of `f` it will be raised to the user.
  Next time the method is run, it will be treated as not having run before.

  Args:
    f: A function to wrap which should only be called once.

  Returns:
    Wrapped version of `f` which will only evaluate `f` the first time it is
    called.
  """
  # TODO(tomhennigan) Perhaps some more human friendly identifier?
  once_id = uuid.uuid4()

  @utils.decorator
  def wrapper(wrapped, instance, args, kwargs):
    """Decorator which ensures a wrapped method is only ever run once."""
    if instance is None:
      # NOTE: We can't use the weakset since you can't weakref None.
      if not wrapper.seen_none:
        _check_no_output(wrapped(*args, **kwargs))
        wrapper.seen_none = True
      return

    # Get or set the `seen` set for this object.
    seen = getattr(instance, _ONCE_PROPERTY, None)
    if seen is None:
      seen = set()
      setattr(instance, _ONCE_PROPERTY, seen)

    if once_id not in seen:
      _check_no_output(wrapped(*args, **kwargs))
      seen.add(once_id)

  wrapper.seen_none = False

  decorated = wrapper(f)  # pylint: disable=no-value-for-parameter,assignment-from-none
  decorated.__snt_once_wrapped__ = f
  return decorated
| 28.494845 | 87 | 0.634949 | 386 | 2,764 | 4.471503 | 0.42487 | 0.034762 | 0.017381 | 0.01854 | 0.112399 | 0.112399 | 0.060255 | 0.060255 | 0.060255 | 0.060255 | 0 | 0.005682 | 0.23589 | 2,764 | 96 | 88 | 28.791667 | 0.811553 | 0.655933 | 0 | 0.076923 | 0 | 0 | 0.068685 | 0 | 0 | 0 | 0 | 0.010417 | 0 | 1 | 0.115385 | false | 0 | 0.076923 | 0 | 0.269231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
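The `once.py` row above memoizes a call per instance via `utils.decorator` (a `wrapt`-style decorator). A minimal standalone sketch of the same idea for plain functions — the `run_once` name is hypothetical, and unlike `snt.once` this version is not per-instance and has no `wrapt` dependency:

```python
import functools

def run_once(f):
    """Run `f` at most once; like @snt.once, `f` must not return a value."""
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        if not wrapper.has_run:
            out = f(*args, **kwargs)
            if out is not None:
                # Mirrors _check_no_output: returning a value is an error,
                # and has_run stays False so the call is not counted.
                raise ValueError("run_once-decorated functions cannot return values")
            wrapper.has_run = True
    wrapper.has_run = False
    return wrapper

calls = []

@run_once
def init():
    calls.append("ran")

init()
init()
print(calls)  # ['ran']
```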
5d8772d2443bc37d077b4e1088b8652b560de433 | 387 | py | Python | Python/Numpy/Min and Max/min_and_max.py | brianchiang-tw/HackerRank | 02a30a0033b881206fa15b8d6b4ef99b2dc420c8 | [
"MIT"
] | 2 | 2020-05-28T07:15:00.000Z | 2020-07-21T08:34:06.000Z | Python/Numpy/Min and Max/min_and_max.py | brianchiang-tw/HackerRank | 02a30a0033b881206fa15b8d6b4ef99b2dc420c8 | [
"MIT"
] | null | null | null | Python/Numpy/Min and Max/min_and_max.py | brianchiang-tw/HackerRank | 02a30a0033b881206fa15b8d6b4ef99b2dc420c8 | [
"MIT"
] | null | null | null | import numpy as np

if __name__ == '__main__':

    h, w = map( int, input().split() )

    row_list = []
    for i in range(h):
        single_row = list( map(int, input().split() ) )
        np_row = np.array( single_row )
        row_list.append( np_row )

    min_of_each_row = np.min( row_list, axis = 1)
    max_of_min = np.max( min_of_each_row )

    print( max_of_min )
| 15.48 | 56 | 0.573643 | 61 | 387 | 3.229508 | 0.47541 | 0.142132 | 0.111675 | 0.162437 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003663 | 0.294574 | 387 | 24 | 57 | 16.125 | 0.717949 | 0 | 0 | 0 | 0 | 0 | 0.020779 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
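The HackerRank snippet above reduces a grid to the maximum of its row minimums. The core NumPy idiom in isolation, with a hard-coded grid instead of `input()` (the values here are illustrative):

```python
import numpy as np

grid = np.array([[2, 5, 3],
                 [7, 3, 6]])

row_mins = np.min(grid, axis=1)  # minimum of each row: [2, 3]
result = np.max(row_mins)        # maximum of those minimums
print(result)                    # 3
```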
5d87b775f0d8dfc2c8f2bb9538693bb8aa0d1ec6 | 22,757 | py | Python | allure/pytest_plugin.py | allure-framework/allure-pytest | d55180aaeb21233e7ca577ffc6f67a07837c63f2 | [
"Apache-2.0"
] | 112 | 2017-01-24T21:37:49.000Z | 2022-03-25T22:32:12.000Z | venv/Lib/site-packages/allure/pytest_plugin.py | Arthii01052/conduit | 3427d76d0fa364cb5d19bdd6da4aeb0a22fe9660 | [
"MIT"
] | 56 | 2017-01-21T20:01:41.000Z | 2019-01-14T13:35:53.000Z | venv/Lib/site-packages/allure/pytest_plugin.py | Arthii01052/conduit | 3427d76d0fa364cb5d19bdd6da4aeb0a22fe9660 | [
"MIT"
] | 52 | 2017-01-23T13:40:40.000Z | 2022-03-30T00:02:31.000Z | import uuid
import pickle
import pytest
import argparse
from collections import namedtuple
from six import text_type
from allure.common import AllureImpl, StepContext
from allure.constants import Status, AttachmentType, Severity, \
FAILED_STATUSES, Label, SKIPPED_STATUSES
from allure.utils import parent_module, parent_down_from_module, labels_of, \
all_of, get_exception_message, now, mangle_testnames
from allure.structure import TestCase, TestStep, Attach, TestSuite, Failure, TestLabel
def pytest_addoption(parser):
    parser.getgroup("reporting").addoption('--alluredir',
                                           action="store",
                                           dest="allurereportdir",
                                           metavar="DIR",
                                           default=None,
                                           help="Generate Allure report in the specified directory (may not exist)")

    severities = [v for (_, v) in all_of(Severity)]

    def label_type(name, legal_values=set()):
        """
        argparse-type factory for labelish things.
        processed value is set of tuples (name, value).
        :param name: of label type (for future TestLabel things)
        :param legal_values: a `set` of values that are legal for this label, if any limit whatsoever
        :raises ArgumentTypeError: if `legal_values` are given and there are values that fall out of that
        """
        def a_label_type(string):
            atoms = set(string.split(','))
            if legal_values and not atoms < legal_values:
                raise argparse.ArgumentTypeError('Illegal {} values: {}, only [{}] are allowed'.format(name, ', '.join(atoms - legal_values), ', '.join(legal_values)))
            return set((name, v) for v in atoms)
        return a_label_type

    parser.getgroup("general").addoption('--allure_severities',
                                         action="store",
                                         dest="allureseverities",
                                         metavar="SEVERITIES_SET",
                                         default={},
                                         type=label_type(name=Label.SEVERITY, legal_values=set(severities)),
                                         help="""Comma-separated list of severity names.
                                         Tests only with these severities will be run.
                                         Possible values are:%s.""" % ', '.join(severities))

    parser.getgroup("general").addoption('--allure_features',
                                         action="store",
                                         dest="allurefeatures",
                                         metavar="FEATURES_SET",
                                         default={},
                                         type=label_type(name=Label.FEATURE),
                                         help="""Comma-separated list of feature names.
                                         Run tests that have at least one of the specified feature labels.""")

    parser.getgroup("general").addoption('--allure_stories',
                                         action="store",
                                         dest="allurestories",
                                         metavar="STORIES_SET",
                                         default={},
                                         type=label_type(name=Label.STORY),
                                         help="""Comma-separated list of story names.
                                         Run tests that have at least one of the specified story labels.""")
def pytest_configure(config):
    reportdir = config.option.allurereportdir
    if reportdir:  # we actually record something
        allure_impl = AllureImpl(reportdir)

        testlistener = AllureTestListener(config)
        pytest.allure._allurelistener = testlistener
        config.pluginmanager.register(testlistener)

        if not hasattr(config, 'slaveinput'):
            # on xdist-master node do all the important stuff
            config.pluginmanager.register(AllureAgregatingListener(allure_impl, config))
            config.pluginmanager.register(AllureCollectionListener(allure_impl))
class AllureTestListener(object):
    """
    Per-test listener.
    Is responsible for recording in-test data and for attaching it to the test report thing.
    The per-test reports are handled by `AllureAgregatingListener` at the `pytest_runtest_logreport` hook.
    """
    def __init__(self, config):
        self.config = config
        self.environment = {}
        self.test = None

        # FIXME: that flag makes us pre-report failures in the makereport hook.
        # it is here to cope with xdist's begavior regarding -x.
        # see self.pytest_runtest_makereport and AllureAgregatingListener.pytest_sessionfinish
        self._magicaldoublereport = hasattr(self.config, 'slaveinput') and self.config.getvalue("maxfail")

    @pytest.mark.hookwrapper
    def pytest_runtest_protocol(self, item, nextitem):
        try:
            # for common items
            description = item.function.__doc__
        except AttributeError:
            # for doctests that has no `function` attribute
            description = item.reportinfo()[2]

        self.test = TestCase(name='.'.join(mangle_testnames([x.name for x in parent_down_from_module(item)])),
                             description=description,
                             start=now(),
                             attachments=[],
                             labels=labels_of(item),
                             status=None,
                             steps=[],
                             id=str(uuid.uuid4()))  # for later resolution in AllureAgregatingListener.pytest_sessionfinish

        self.stack = [self.test]

        yield

        self.test = None
        self.stack = []

    def attach(self, title, contents, attach_type):
        """
        Store attachment object in current state for later actual write in the `AllureAgregatingListener.write_attach`
        """
        attach = Attach(source=contents,  # we later re-save those, oh my...
                        title=title,
                        type=attach_type)
        self.stack[-1].attachments.append(attach)

    def dynamic_issue(self, *issues):
        """
        Attaches ``issues`` to the current active case
        """
        if self.test:
            self.test.labels.extend([TestLabel(name=Label.ISSUE, value=issue) for issue in issues])

    def description(self, description):
        """
        Sets description for the test
        """
        if self.test:
            self.test.description = description

    def start_step(self, name):
        """
        Starts an new :py:class:`allure.structure.TestStep` with given ``name``,
        pushes it to the ``self.stack`` and returns the step.
        """
        step = TestStep(name=name,
                        title=name,
                        start=now(),
                        attachments=[],
                        steps=[])
        self.stack[-1].steps.append(step)
        self.stack.append(step)
        return step

    def stop_step(self):
        """
        Stops the step at the top of ``self.stack``
        """
        step = self.stack.pop()
        step.stop = now()

    def _fill_case(self, report, call, pyteststatus, status):
        """
        Finalizes with important data
        :param report: py.test's `TestReport`
        :param call: py.test's `CallInfo`
        :param pyteststatus: the failed/xfailed/xpassed thing
        :param status: a :py:class:`allure.constants.Status` entry
        """
        [self.attach(name, contents, AttachmentType.TEXT) for (name, contents) in dict(report.sections).items()]

        self.test.stop = now()
        self.test.status = status

        if status in FAILED_STATUSES:
            self.test.failure = Failure(message=get_exception_message(call.excinfo, pyteststatus, report),
                                        trace=report.longrepr or hasattr(report, 'wasxfail') and report.wasxfail)
        elif status in SKIPPED_STATUSES:
            skip_message = type(report.longrepr) == tuple and report.longrepr[2] or report.wasxfail
            trim_msg_len = 89
            short_message = skip_message.split('\n')[0][:trim_msg_len]

            # FIXME: see pytest.runner.pytest_runtest_makereport
            self.test.failure = Failure(message=(short_message + '...' * (len(skip_message) > trim_msg_len)),
                                        trace=status == Status.PENDING and report.longrepr or short_message != skip_message and skip_message or '')

    def report_case(self, item, report):
        """
        Adds `self.test` to the `report` in a `AllureAggegatingListener`-understood way
        """
        parent = parent_module(item)
        # we attach a four-tuple: (test module ID, test module name, test module doc, environment, TestCase)
        report.__dict__.update(_allure_result=pickle.dumps((parent.nodeid,
                                                            parent.module.__name__,
                                                            parent.module.__doc__ or '',
                                                            self.environment,
                                                            self.test)))

    @pytest.mark.hookwrapper
    def pytest_runtest_makereport(self, item, call):
        """
        Decides when to actually report things.

        pytest runs this (naturally) three times -- with report.when being:
          setup     <--- fixtures are to be initialized in this one
          call      <--- when this finishes the main code has finished
          teardown  <--- tears down fixtures (that still possess important info)

        `setup` and `teardown` are always called, but `call` is called only if `setup` passes.

        See :py:func:`_pytest.runner.runtestprotocol` for proofs / ideas.

        The "other side" (AllureAggregatingListener) expects us to send EXACTLY ONE test report (it wont break, but it will duplicate cases in the report -- which is bad.

        So we work hard to decide exact moment when we call `_stop_case` to do that. This method may benefit from FSM (we keep track of what has already happened via self.test.status)

        Expected behavior is:
          FAILED when call fails and others OK
          BROKEN when either setup OR teardown are broken (and call may be anything)
          PENDING if skipped and xfailed
          SKIPPED if skipped and not xfailed
        """
        report = (yield).get_result()

        status = self.config.hook.pytest_report_teststatus(report=report)
        status = status and status[0]

        if report.when == 'call':
            if report.passed:
                self._fill_case(report, call, status, Status.PASSED)
            elif report.failed:
                self._fill_case(report, call, status, Status.FAILED)
                # FIXME: this is here only to work around xdist's stupid -x thing when in exits BEFORE THE TEARDOWN test log. Meh, i should file an issue to xdist
                if self._magicaldoublereport:
                    # to minimize ze impact
                    self.report_case(item, report)
            elif report.skipped:
                if hasattr(report, 'wasxfail'):
                    self._fill_case(report, call, status, Status.PENDING)
                else:
                    self._fill_case(report, call, status, Status.CANCELED)
        elif report.when == 'setup':  # setup / teardown
            if report.failed:
                self._fill_case(report, call, status, Status.BROKEN)
            elif report.skipped:
                if hasattr(report, 'wasxfail'):
                    self._fill_case(report, call, status, Status.PENDING)
                else:
                    self._fill_case(report, call, status, Status.CANCELED)
        elif report.when == 'teardown':
            # as teardown is always called for testitem -- report our status here
            if not report.passed:
                if self.test.status not in FAILED_STATUSES:
                    # if test was OK but failed at teardown => broken
                    self._fill_case(report, call, status, Status.BROKEN)
                else:
                    # mark it broken so, well, someone has idea of teardown failure
                    # still, that's no big deal -- test has already failed
                    # TODO: think about that once again
                    self.test.status = Status.BROKEN

            # if a test isn't marked as "unreported" or it has failed, add it to the report.
            if not item.get_marker("unreported") or self.test.status in FAILED_STATUSES:
                self.report_case(item, report)
def pytest_runtest_setup(item):
    item_labels = set((l.name, l.value) for l in labels_of(item))  # see label_type

    arg_labels = set().union(item.config.option.allurefeatures,
                             item.config.option.allurestories,
                             item.config.option.allureseverities)
    if arg_labels and not item_labels & arg_labels:
        pytest.skip('Not suitable with selected labels: %s.' % ', '.join(text_type(l) for l in sorted(arg_labels)))
class LazyInitStepContext(StepContext):
    """
    This is a step context used for decorated steps.
    It provides a possibility to create step decorators, being initiated before pytest_configure, when no AllureListener initiated yet.
    """

    def __init__(self, allure_helper, title):
        self.allure_helper = allure_helper
        self.title = title
        self.step = None

    @property
    def allure(self):
        listener = self.allure_helper.get_listener()

        # if listener has `stack` we are inside a test
        # record steps only when that
        # FIXME: this breaks encapsulation a lot
        if hasattr(listener, 'stack'):
            return listener
class AllureHelper(object):
    """
    This object holds various utility methods used from ``pytest.allure`` namespace, like ``pytest.allure.attach``
    """
    def __init__(self):
        self._allurelistener = None  # FIXME: this gets injected elsewhere, like in the pytest_configure

    def get_listener(self):
        return self._allurelistener

    def attach(self, name, contents, type=AttachmentType.TEXT):  # @ReservedAssignment
        """
        Attaches ``contents`` to a current context with given ``name`` and ``type``.
        """
        if self._allurelistener:
            self._allurelistener.attach(name, contents, type)

    def label(self, name, *value):
        """
        A decorator factory that returns ``pytest.mark`` for a given label.
        """
        allure_label = getattr(pytest.mark, '%s.%s' % (Label.DEFAULT, name))
        return allure_label(*value)

    def severity(self, severity):
        """
        A decorator factory that returns ``pytest.mark`` for a given allure ``level``.
        """
        return self.label(Label.SEVERITY, severity)

    def feature(self, *features):
        """
        A decorator factory that returns ``pytest.mark`` for a given features.
        """
        return self.label(Label.FEATURE, *features)

    def story(self, *stories):
        """
        A decorator factory that returns ``pytest.mark`` for a given stories.
        """
        return self.label(Label.STORY, *stories)

    def issue(self, *issues):
        """
        A decorator factory that returns ``pytest.mark`` for a given issues.
        """
        return self.label(Label.ISSUE, *issues)

    def dynamic_issue(self, *issues):
        """
        Mark test ``issues`` from inside.
        """
        if self._allurelistener:
            self._allurelistener.dynamic_issue(*issues)

    def description(self, description):
        """
        Sets description for the test
        """
        if self._allurelistener:
            self._allurelistener.description(description)

    def testcase(self, *testcases):
        """
        A decorator factory that returns ``pytest.mark`` for a given testcases.
        """
        return self.label(Label.TESTCASE, *testcases)

    def step(self, title):
        """
        A contextmanager/decorator for steps.

        TODO: when moving to python 3, rework this with ``contextlib.ContextDecorator``.

        Usage examples::

            import pytest

            def test_foo():
                with pytest.allure.step('mystep'):
                    assert False

            @pytest.allure.step('make test data')
            def make_test_data_bar():
                raise ValueError('No data today')

            def test_bar():
                assert make_test_data_bar()

            @pytest.allure.step
            def make_test_data_baz():
                raise ValueError('No data today')

            def test_baz():
                assert make_test_data_baz()

            @pytest.fixture()
            @pytest.allure.step('test fixture')
            def steppy_fixture():
                return 1

            def test_baz(steppy_fixture):
                assert steppy_fixture
        """
        if callable(title):
            return LazyInitStepContext(self, title.__name__)(title)
        else:
            return LazyInitStepContext(self, title)

    def single_step(self, text):
        """
        Writes single line to report.
        """
        if self._allurelistener:
            with self.step(text):
                pass

    def environment(self, **env_dict):
        if self._allurelistener:
            self._allurelistener.environment.update(env_dict)

    @property
    def attach_type(self):
        return AttachmentType

    @property
    def severity_level(self):
        return Severity

    def __getattr__(self, attr):
        """
        Provides fancy shortcuts for severity::

            # these are the same
            pytest.allure.CRITICAL
            pytest.allure.severity(pytest.allure.severity_level.CRITICAL)
        """
        if attr in dir(Severity) and not attr.startswith('_'):
            return self.severity(getattr(Severity, attr))
        else:
            raise AttributeError
MASTER_HELPER = AllureHelper()


def pytest_namespace():
    return {'allure': MASTER_HELPER}
class AllureAgregatingListener(object):
    """
    Listens to pytest hooks to generate reports for common tests.
    """
    def __init__(self, impl, config):
        self.impl = impl

        # module's nodeid => TestSuite object
        self.suites = {}

    def pytest_sessionfinish(self):
        """
        We are done and have all the results in `self.suites`
        Lets write em down.

        But first we kinda-unify the test cases.

        We expect cases to come from AllureTestListener -- and the have ._id field to manifest their identity.

        Of all the test cases in suite.testcases we leave LAST with the same ID -- becase logreport can be sent MORE THAN ONE TIME
        (namely, if the test fails and then gets broken -- to cope with the xdist's -x behavior we have to have tests even at CALL failures)

        TODO: do it in a better, more efficient way
        """
        for s in self.suites.values():
            if s.tests:  # nobody likes empty suites
                s.stop = max(case.stop for case in s.tests)

                known_ids = set()
                refined_tests = []
                for t in s.tests[::-1]:
                    if t.id not in known_ids:
                        known_ids.add(t.id)
                        refined_tests.append(t)
                s.tests = refined_tests[::-1]

                with self.impl._reportfile('%s-testsuite.xml' % uuid.uuid4()) as f:
                    self.impl._write_xml(f, s)

        self.impl.store_environment()

    def write_attach(self, attachment):
        """
        Writes attachment object from the `AllureTestListener` to the FS, fixing it fields

        :param attachment: a :py:class:`allure.structure.Attach` object
        """
        # OMG, that is bad
        attachment.source = self.impl._save_attach(attachment.source, attachment.type)
        attachment.type = attachment.type.mime_type

    def pytest_runtest_logreport(self, report):
        if hasattr(report, '_allure_result'):
            module_id, module_name, module_doc, environment, testcase = pickle.loads(report._allure_result)

            report._allure_result = None  # so actual pickled data is garbage-collected, see https://github.com/allure-framework/allure-python/issues/98

            self.impl.environment.update(environment)

            for a in testcase.iter_attachments():
                self.write_attach(a)

            self.suites.setdefault(module_id, TestSuite(name=module_name,
                                                        description=module_doc,
                                                        tests=[],
                                                        labels=[],
                                                        start=testcase.start,  # first case starts the suite!
                                                        stop=None)).tests.append(testcase)
CollectFail = namedtuple('CollectFail', 'name status message trace')


class AllureCollectionListener(object):
    """
    Listens to pytest collection-related hooks
    to generate reports for modules that failed to collect.
    """
    def __init__(self, impl):
        self.impl = impl
        self.fails = []

    def pytest_collectreport(self, report):
        if not report.passed:
            if report.failed:
                status = Status.BROKEN
            else:
                status = Status.CANCELED

            self.fails.append(CollectFail(name=mangle_testnames(report.nodeid.split("::"))[-1],
                                          status=status,
                                          message=get_exception_message(None, None, report),
                                          trace=report.longrepr))

    def pytest_sessionfinish(self):
        """
        Creates a testsuite with collection failures if there were any.
        """
        if self.fails:
            self.impl.start_suite(name='test_collection_phase',
                                  title='Collection phase',
                                  description='This is the tests collection phase. Failures are modules that failed to collect.')
            for fail in self.fails:
                self.impl.start_case(name=fail.name.split(".")[-1])
                self.impl.stop_case(status=fail.status, message=fail.message, trace=fail.trace)
            self.impl.stop_suite()
| 39.168675 | 183 | 0.572483 | 2,450 | 22,757 | 5.208163 | 0.215102 | 0.011285 | 0.007524 | 0.011285 | 0.151567 | 0.093574 | 0.087774 | 0.072414 | 0.068809 | 0.061599 | 0 | 0.001198 | 0.339939 | 22,757 | 580 | 184 | 39.236207 | 0.848279 | 0.282287 | 0 | 0.189655 | 0 | 0 | 0.074281 | 0.001379 | 0 | 0 | 0 | 0.012069 | 0 | 1 | 0.144828 | false | 0.017241 | 0.034483 | 0.013793 | 0.255172 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
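Both `pytest_runtest_protocol` and `pytest_runtest_makereport` above rely on pytest's hookwrapper protocol: the hook is a generator, code before `yield` runs before the inner hook implementations, and `(yield).get_result()` receives the outcome afterwards. A toy driver showing how such a generator is resumed — the `Outcome` class and the driver here are simplified stand-ins for what pluggy does internally, not the real API:

```python
events = []

class Outcome:
    # Simplified stand-in for pluggy's outcome object.
    def __init__(self, result):
        self._result = result

    def get_result(self):
        return self._result

def makereport_wrapper():
    events.append("before inner hooks")
    report = (yield).get_result()  # resumes once the wrapped call has finished
    events.append("got report: " + report)

gen = makereport_wrapper()
next(gen)                        # run up to the yield
try:
    gen.send(Outcome("passed"))  # hand the outcome back into the generator
except StopIteration:
    pass                         # generator finished normally

print(events)  # ['before inner hooks', 'got report: passed']
```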
5d8cdce66649554dda1ee6deb1afd812b2f3ebbf | 2,146 | py | Python | app.py | duckm8795/runscope-circleci | 2fd42e64bddb4b8f34c437c2d834b92369c9a2bf | [
"Apache-2.0"
] | null | null | null | app.py | duckm8795/runscope-circleci | 2fd42e64bddb4b8f34c437c2d834b92369c9a2bf | [
"Apache-2.0"
] | null | null | null | app.py | duckm8795/runscope-circleci | 2fd42e64bddb4b8f34c437c2d834b92369c9a2bf | [
"Apache-2.0"
] | null | null | null | import requests
import sys
import time
import os
def main():
trigger_url = sys.argv[1]
trigger_resp = requests.get(trigger_url)
if trigger_resp.ok:
trigger_json = trigger_resp.json().get("data", {})
test_runs = trigger_json.get("runs", [])
print ("Started {} test runs.".format(len(test_runs)))
results = {}
while len(results.keys()) < len(test_runs):
time.sleep(1)
for run in test_runs:
test_run_id = run.get("test_run_id")
if not test_run_id in results:
result = _get_result(run)
if result.get("result") in ["pass", "fail"]:
results[test_run_id] = result
pass_count = sum([r.get("result") == "pass" for r in results.values()])
fail_count = sum([r.get("result") == "fail" for r in results.values()])
if fail_count > 0:
print ("{} test runs passed. {} test runs failed.".format(pass_count, fail_count))
exit(1)
print ("All test runs passed.")
def _get_result(test_run):
# generate Personal Access Token at https://www.runscope.com/applications
if not "RUNSCOPE_ACCESS_TOKEN" in os.environ:
print ("Please set the environment variable RUNSCOPE_ACCESS_TOKEN. You can get an access token by going to https://www.runscope.com/applications")
exit(1)
API_TOKEN = os.environ["RUNSCOPE_ACCESS_TOKEN"]
opts = {
"base_url": "https://api.runscope.com",
"bucket_key": test_run.get("bucket_key"),
"test_id": test_run.get("test_id"),
"test_run_id": test_run.get("test_run_id")
}
result_url = "{base_url}/buckets/{bucket_key}/tests/{test_id}/results/{test_run_id}".format(**opts)
print ("Getting result: {}".format(result_url))
headers = {
"Authorization": "Bearer {}".format(API_TOKEN),
"User-Agent": "python-trigger-sample"
}
result_resp = requests.get(result_url, headers=headers)
if result_resp.ok:
return result_resp.json().get("data")
return None
if __name__ == '__main__':
main() | 31.101449 | 154 | 0.605312 | 282 | 2,146 | 4.365248 | 0.297872 | 0.062551 | 0.051178 | 0.02437 | 0.152721 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003137 | 0.257223 | 2,146 | 69 | 155 | 31.101449 | 0.769134 | 0.033085 | 0 | 0.040816 | 0 | 0.020408 | 0.267117 | 0.074253 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040816 | false | 0.081633 | 0.081633 | 0 | 0.163265 | 0.102041 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
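`_get_result` above builds the result URL by unpacking a dict into `str.format`. That idiom in isolation — the bucket, test, and run IDs here are made-up placeholders, not values from the script:

```python
opts = {
    "base_url": "https://api.runscope.com",
    "bucket_key": "my-bucket",
    "test_id": "my-test",
    "test_run_id": "run-1",
}
# **opts maps each {placeholder} in the template to the matching dict key:
result_url = "{base_url}/buckets/{bucket_key}/tests/{test_id}/results/{test_run_id}".format(**opts)
print(result_url)  # https://api.runscope.com/buckets/my-bucket/tests/my-test/results/run-1
```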
5d8dbbff6df38e6773044260538db7a759525964 | 16,585 | py | Python | spyse/client.py | fabaff/spyse-python | f286514ac052ebe6fa98f877d251d8f3cd4db1c4 | [
"MIT"
] | 9 | 2021-07-28T11:59:07.000Z | 2022-02-17T02:25:06.000Z | spyse/client.py | fabaff/spyse-python | f286514ac052ebe6fa98f877d251d8f3cd4db1c4 | [
"MIT"
] | 2 | 2021-11-27T02:03:03.000Z | 2022-02-02T11:33:34.000Z | spyse/client.py | fabaff/spyse-python | f286514ac052ebe6fa98f877d251d8f3cd4db1c4 | [
"MIT"
] | 7 | 2021-08-05T04:02:09.000Z | 2022-03-04T14:11:04.000Z | import requests
from typing import List, Optional
from .models import AS, Domain, IP, CVE, Account, Certificate, Email, DNSHistoricalRecord, WHOISHistoricalRecord
from .response import Response
from .search_query import SearchQuery
from limiter import get_limiter, limit

class DomainsSearchResults:
    def __init__(self, results: List[Domain], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[Domain] = results


class AutonomousSystemsSearchResults:
    def __init__(self, results: List[AS], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[AS] = results


class IPSearchResults:
    def __init__(self, results: List[IP], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[IP] = results


class CertificatesSearchResults:
    def __init__(self, results: List[Certificate], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[Certificate] = results


class CVESearchResults:
    def __init__(self, results: List[CVE], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[CVE] = results


class EmailsSearchResults:
    def __init__(self, results: List[Email], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[Email] = results


class HistoricalDNSSearchResults:
    def __init__(self, results: List[DNSHistoricalRecord], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[DNSHistoricalRecord] = results


class HistoricalWHOISSearchResults:
    def __init__(self, results: List[WHOISHistoricalRecord], total_items: int = None, search_id: str = None):
        self.total_items: Optional[int] = total_items
        self.search_id: Optional[str] = search_id
        self.results: List[WHOISHistoricalRecord] = results

class Client:
    DEFAULT_BASE_URL = 'https://api.spyse.com/v4/data'
    MAX_LIMIT = 100
    SEARCH_RESULTS_LIMIT = 10000
    RATE_LIMIT_FRAME_IN_SECONDS = 1

    def __init__(self, api_token, base_url=DEFAULT_BASE_URL):
        self.session = requests.Session()
        self.session.headers.update({'Authorization': 'Bearer ' + api_token})
        self.session.headers.update({'User-Agent': 'spyse-python'})
        self.base_url = base_url
        self.limiter = get_limiter(rate=self.RATE_LIMIT_FRAME_IN_SECONDS, capacity=1)
        self.account = self.get_quotas()
        self.limiter._capacity = self.account.requests_rate_limit

    def __get(self, endpoint: str) -> Response:
        with limit(self.limiter, consume=1):
            return Response.from_dict(self.session.get(endpoint).json())

    def __search(self, endpoint, query: SearchQuery, lim: int = MAX_LIMIT, offset: int = 0) -> Response:
        with limit(self.limiter, consume=1):
            return Response.from_dict(self.session.post(endpoint,
                                                        json={"search_params": query.get(), "limit": lim,
                                                              "offset": offset}).json())

    def __scroll(self, endpoint, query: SearchQuery, search_id: Optional[str] = None) -> Response:
        with limit(self.limiter, consume=1):
            if search_id:
                body = {"search_params": query.get(), "search_id": search_id}
            else:
                body = {"search_params": query.get()}
            return Response.from_dict(self.session.post(endpoint, json=body).json())

    def set_user_agent(self, s: str):
        self.session.headers.update({'User-Agent': s})

    def get_quotas(self) -> Optional[Account]:
        """Returns details about your account quotas."""
        response = self.__get('{}/account/quota'.format(self.base_url))
        response.check_errors()
        return Account.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None
    def get_autonomous_system_details(self, asn: int) -> Optional[AS]:
        """Returns details about an autonomous system by AS number."""
        response = self.__get('{}/as/{}'.format(self.base_url, asn))
        response.check_errors()
        return AS.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def count_autonomous_systems(self, query: SearchQuery) -> int:
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/as/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items

    def search_autonomous_systems(self, query: SearchQuery, limit: int = MAX_LIMIT,
                                  offset: int = 0) -> AutonomousSystemsSearchResults:
        """
        Returns a list of autonomous systems that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/as/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        as_list = list()
        for r in response.data.items:
            as_list.append(AS.from_dict(r))
        return AutonomousSystemsSearchResults(as_list, response.data.total_items)

    def scroll_autonomous_systems(self, query: SearchQuery, scroll_id: str = None) -> AutonomousSystemsSearchResults:
        """
        Returns a list of autonomous systems that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/as/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        as_list = list()
        for r in response.data.items:
            as_list.append(AS.from_dict(r))
        return AutonomousSystemsSearchResults(as_list, search_id=response.data.search_id)

    def get_domain_details(self, domain_name: str) -> Optional[Domain]:
        """Returns details about domain"""
        response = self.__get('{}/domain/{}'.format(self.base_url, domain_name))
        response.check_errors()
        return Domain.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def search_domains(self, query: SearchQuery, limit: int = MAX_LIMIT, offset: int = 0) -> DomainsSearchResults:
        """
        Returns a list of domains that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/domain/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        domains = list()
        for r in response.data.items:
            domains.append(Domain.from_dict(r))
        return DomainsSearchResults(domains, response.data.total_items)

    def count_domains(self, query: SearchQuery):
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/domain/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items

    def scroll_domains(self, query: SearchQuery, scroll_id: str = None) -> DomainsSearchResults:
        """
        Returns a list of domains that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/domain/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        domains = list()
        for r in response.data.items:
            domains.append(Domain.from_dict(r))
        return DomainsSearchResults(domains, search_id=response.data.search_id)
    def get_ip_details(self, ip: str) -> Optional[IP]:
        """Returns details about IP"""
        response = self.__get('{}/ip/{}'.format(self.base_url, ip))
        response.check_errors()
        return IP.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def search_ip(self, query: SearchQuery, limit: int = MAX_LIMIT, offset: int = 0) -> IPSearchResults:
        """
        Returns a list of IPv4 hosts that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/ip/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        ips = list()
        for r in response.data.items:
            ips.append(IP.from_dict(r))
        return IPSearchResults(ips, response.data.total_items)

    def count_ip(self, query: SearchQuery) -> int:
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/ip/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items

    def scroll_ip(self, query: SearchQuery, scroll_id: str = None) -> IPSearchResults:
        """
        Returns a list of IPv4 hosts that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/ip/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        ips = list()
        for r in response.data.items:
            ips.append(IP.from_dict(r))
        return IPSearchResults(ips, search_id=response.data.search_id)

    def get_certificate_details(self, fingerprint_sha256: str) -> Optional[Certificate]:
        """Returns details about SSL/TLS certificate"""
        response = self.__get('{}/certificate/{}'.format(self.base_url, fingerprint_sha256))
        response.check_errors()
        return Certificate.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def search_certificate(self, query: SearchQuery, limit: int = MAX_LIMIT,
                           offset: int = 0) -> CertificatesSearchResults:
        """
        Returns a list of SSL/TLS certificate hosts that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/certificate/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        certs = list()
        for r in response.data.items:
            certs.append(Certificate.from_dict(r))
        return CertificatesSearchResults(certs, response.data.total_items)

    def count_certificate(self, query: SearchQuery) -> int:
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/certificate/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items

    def scroll_certificate(self, query: SearchQuery, scroll_id: str = None) -> CertificatesSearchResults:
        """
        Returns a list of SSL/TLS certificates that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/certificate/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        certs = list()
        for r in response.data.items:
            certs.append(Certificate.from_dict(r))
        return CertificatesSearchResults(certs, search_id=response.data.search_id)
    def get_cve_details(self, cve_id: str) -> Optional[CVE]:
        """Returns details about CVE"""
        response = self.__get('{}/cve/{}'.format(self.base_url, cve_id))
        response.check_errors()
        return CVE.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def search_cve(self, query: SearchQuery, limit: int = MAX_LIMIT, offset: int = 0) -> CVESearchResults:
        """
        Returns a list of CVE that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/cve/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        cve_list = list()
        for r in response.data.items:
            cve_list.append(CVE.from_dict(r))
        return CVESearchResults(cve_list, response.data.total_items)

    def count_cve(self, query: SearchQuery) -> int:
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/cve/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items

    def scroll_cve(self, query: SearchQuery, scroll_id: str = None) -> CVESearchResults:
        """
        Returns a list of CVEs that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/cve/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        cve_list = list()
        for r in response.data.items:
            cve_list.append(CVE.from_dict(r))
        return CVESearchResults(cve_list, search_id=response.data.search_id)

    def get_email_details(self, email: str) -> Optional[Email]:
        """Returns details about email"""
        response = self.__get('{}/email/{}'.format(self.base_url, email))
        response.check_errors()
        return Email.from_dict(response.data.items[0]) if len(response.data.items) > 0 else None

    def search_emails(self, query: SearchQuery, limit: int = MAX_LIMIT, offset: int = 0) -> EmailsSearchResults:
        """
        Returns a list of emails that matched the search query.
        Allows getting only the first 10,000 results.
        """
        response = self.__search('{}/email/search'.format(self.base_url), query, limit, offset)
        response.check_errors()
        emails = list()
        for r in response.data.items:
            emails.append(Email.from_dict(r))
        return EmailsSearchResults(emails, response.data.total_items)
    def count_emails(self, query: SearchQuery) -> int:
        """Returns the precise number of search results that matched the search query."""
        response = self.__search('{}/email/search/count'.format(self.base_url), query)
        response.check_errors()
        return response.data.total_items
    def scroll_emails(self, query: SearchQuery, scroll_id: str = None) -> EmailsSearchResults:
        """
        Returns a list of emails that matched the search query.
        Allows getting all the results but requires a Spyse Pro subscription
        """
        response = self.__scroll('{}/email/scroll/search'.format(self.base_url), query, scroll_id)
        response.check_errors()
        emails = list()
        for r in response.data.items:
            emails.append(Email.from_dict(r))
        return EmailsSearchResults(emails, search_id=response.data.search_id)

    def search_historical_dns(self, dns_type, domain_name: str, limit: int = MAX_LIMIT, offset: int = 0) \
            -> HistoricalDNSSearchResults:
        """
        Returns the historical DNS records about the given domain name.
        """
        response = self.__get(f'{self.base_url}/history/dns/{dns_type}/{domain_name}?limit={limit}&offset={offset}')
        response.check_errors()
        records = list()
        for r in response.data.items:
            records.append(DNSHistoricalRecord.from_dict(r))
        return HistoricalDNSSearchResults(records, response.data.total_items)

    def search_historical_whois(self, domain_name: str, limit: int = MAX_LIMIT, offset: int = 0) \
            -> HistoricalWHOISSearchResults:
        """
        Returns the historical WHOIS records for the given domain name.
        """
        response = self.__get(f'{self.base_url}/history/domain-whois/{domain_name}?limit={limit}&offset={offset}')
        response.check_errors()
        records = list()
        for r in response.data.items:
            records.append(WHOISHistoricalRecord.from_dict(r))
        return HistoricalWHOISSearchResults(records, response.data.total_items)
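The `scroll_*` methods above all follow the same pattern: pass the previous response's `search_id` back on each call until an empty page comes back. A sketch of that drain loop, with a stub standing in for `Client` so no network access or Spyse credentials are needed (`FakeScrollClient` and `collect_all` are illustrative names, not part of the library):

```python
# Sketch of the scroll pagination pattern used by the scroll_* methods above.
# FakeScrollClient stands in for Client; no network calls are made.
class FakeScrollClient:
    def __init__(self, pages):
        self._pages = pages   # list of result batches the "server" will return
        self._cursor = {}     # search_id -> index of the next page

    def scroll_domains(self, query, scroll_id=None):
        idx = 0 if scroll_id is None else self._cursor[scroll_id]
        results = self._pages[idx] if idx < len(self._pages) else []
        sid = "scroll-token"
        self._cursor[sid] = idx + 1
        # Return a minimal object with the same attributes as DomainsSearchResults.
        return type("R", (), {"results": results, "search_id": sid})()


def collect_all(client, query):
    """Drain a scroll search: keep passing back search_id until a page is empty."""
    out, sid = [], None
    while True:
        batch = client.scroll_domains(query, sid)
        if not batch.results:
            return out
        out.extend(batch.results)
        sid = batch.search_id


fake = FakeScrollClient([["a.com", "b.com"], ["c.com"]])
print(collect_all(fake, query=None))  # ['a.com', 'b.com', 'c.com']
```

The same loop works for `scroll_ip`, `scroll_cve`, and the other scroll methods, since they share the `results`/`search_id` result shape.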
# File: gluon/main.py (from scudette/rekall-agent-server, BSD-3-Clause license)
#!/bin/env python
# -*- coding: utf-8 -*-
"""
| This file is part of the web2py Web Framework
| Copyrighted by Massimo Di Pierro <mdipierro@cs.depaul.edu>
| License: LGPLv3 (http://www.gnu.org/licenses/lgpl.html)
The gluon wsgi application
---------------------------
"""
from __future__ import print_function
if False: import import_all # DO NOT REMOVE PART OF FREEZE PROCESS
import gc
import os
import re
import copy
import sys
import time
import datetime
import signal
import socket
import random
import string
from gluon._compat import Cookie, urllib2
#from thread import allocate_lock
from gluon.fileutils import abspath, write_file
from gluon.settings import global_settings
from gluon.utils import web2py_uuid
from gluon.admin import add_path_first, create_missing_folders, create_missing_app_folders
from gluon.globals import current
# Remarks:
# calling script has inserted path to script directory into sys.path
# applications_parent (path to applications/, site-packages/ etc)
# defaults to that directory set sys.path to
# ("", gluon_parent/site-packages, gluon_parent, ...)
#
# this is wrong:
# web2py_path = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# because we do not want the path to this file which may be Library.zip
# gluon_parent is the directory containing gluon, web2py.py, logging.conf
# and the handlers.
# applications_parent (web2py_path) is the directory containing applications/
# and routes.py
# The two are identical unless web2py_path is changed via the web2py.py -f folder option
# main.web2py_path is the same as applications_parent (for backward compatibility)
web2py_path = global_settings.applications_parent # backward compatibility
create_missing_folders()
# set up logging for subsequent imports
import logging
import logging.config
# This needed to prevent exception on Python 2.5:
# NameError: name 'gluon' is not defined
# See http://bugs.python.org/issue1436
# attention!, the import Tkinter in messageboxhandler, changes locale ...
import gluon.messageboxhandler
logging.gluon = gluon
# so we must restore it! Thanks ozancag
import locale
locale.setlocale(locale.LC_CTYPE, "C") # IMPORTANT, web2py requires locale "C"
exists = os.path.exists
pjoin = os.path.join
try:
    logging.config.fileConfig(abspath("logging.conf"))
except:  # fails on GAE or when logfile is missing
    logging.basicConfig()
logger = logging.getLogger("web2py")
from gluon.restricted import RestrictedError
from gluon.http import HTTP, redirect
from gluon.globals import Request, Response, Session
from gluon.compileapp import build_environment, run_models_in, \
    run_controller_in, run_view_in
from gluon.contenttype import contenttype
from pydal.base import BaseAdapter
from gluon.validators import CRYPT
from gluon.html import URL, xmlescape
from gluon.utils import is_valid_ip_address, getipaddrinfo
from gluon.rewrite import load as load_routes, url_in, THREAD_LOCAL as rwthread, \
    try_rewrite_on_error, fixup_missing_path_info
from gluon import newcron
__all__ = ['wsgibase', 'save_password', 'appfactory', 'HttpServer']
requests = 0 # gc timer
# Security Checks: validate URL and session_id here,
# accept_language is validated in languages
# pattern used to validate client address
regex_client = re.compile('[\w\-:]+(\.[\w\-]+)*\.?') # ## to account for IPV6
try:
    version_info = open(pjoin(global_settings.gluon_parent, 'VERSION'), 'r')
    raw_version_string = version_info.read().split()[-1].strip()
    version_info.close()
    global_settings.web2py_version = raw_version_string
    web2py_version = global_settings.web2py_version
except:
    raise RuntimeError("Cannot determine web2py version")

try:
    from gluon import rocket
except:
    if not global_settings.web2py_runtime_gae:
        logger.warn('unable to import Rocket')

load_routes()
HTTPS_SCHEMES = set(('https', 'HTTPS'))

def get_client(env):
    """
    Guesses the client address from the environment variables

    First tries 'http_x_forwarded_for', secondly 'remote_addr'
    if all fails, assume '127.0.0.1' or '::1' (running locally)
    """
    eget = env.get
    g = regex_client.search(eget('http_x_forwarded_for', ''))
    client = (g.group() or '').split(',')[0] if g else None
    if client in (None, '', 'unknown'):
        g = regex_client.search(eget('remote_addr', ''))
        if g:
            client = g.group()
        elif env.http_host.startswith('['):  # IPv6
            client = '::1'
        else:
            client = '127.0.0.1'  # IPv4
    if not is_valid_ip_address(client):
        raise HTTP(400, "Bad Request (request.client=%s)" % client)
    return client
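As a sketch, the forwarded-address guessing above can be exercised with the same regex on a plain dict standing in for `request.env` (`guess_client` below is a simplified illustration, not web2py's actual function — it omits the IPv6 and validation branches):

```python
import re

# Same pattern as regex_client above.
regex_client = re.compile(r'[\w\-:]+(\.[\w\-]+)*\.?')


def guess_client(environ):
    """Simplified re-implementation of get_client's happy path."""
    g = regex_client.search(environ.get('http_x_forwarded_for', ''))
    client = (g.group() or '').split(',')[0] if g else None
    if client in (None, '', 'unknown'):
        g = regex_client.search(environ.get('remote_addr', ''))
        client = g.group() if g else '127.0.0.1'
    return client


# The first address in a comma-separated X-Forwarded-For chain wins.
print(guess_client({'http_x_forwarded_for': '203.0.113.7, 10.0.0.1'}))  # 203.0.113.7
print(guess_client({'remote_addr': '192.168.1.5'}))  # 192.168.1.5
```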

def serve_controller(request, response, session):
    """
    This function is used to generate a dynamic page.
    It first runs all models, then runs the function in the controller,
    and then tries to render the output using a view/template.
    this function must run from the [application] folder.
    A typical example would be the call to the url
    /[application]/[controller]/[function] that would result in a call
    to [function]() in applications/[application]/[controller].py
    rendered by applications/[application]/views/[controller]/[function].html
    """

    # ##################################################
    # build environment for controller and view
    # ##################################################

    environment = build_environment(request, response, session)

    # set default view, controller can override it

    response.view = '%s/%s.%s' % (request.controller,
                                  request.function,
                                  request.extension)

    # also, make sure the flash is passed through
    # ##################################################
    # process models, controller and view (if required)
    # ##################################################

    run_models_in(environment)
    response._view_environment = copy.copy(environment)
    page = run_controller_in(request.controller, request.function, environment)
    if isinstance(page, dict):
        response._vars = page
        response._view_environment.update(page)
        page = run_view_in(response._view_environment)

    # logic to garbage collect after exec, not always, once every 100 requests
    global requests
    requests = ('requests' in globals()) and (requests + 1) % 100 or 0
    if not requests:
        gc.collect()
    # end garbage collection logic

    # ##################################################
    # set default headers if not set
    # ##################################################

    default_headers = [
        ('Content-Type', contenttype('.' + request.extension)),
        ('Cache-Control',
         'no-store, no-cache, must-revalidate, post-check=0, pre-check=0'),
        ('Expires', time.strftime('%a, %d %b %Y %H:%M:%S GMT',
                                  time.gmtime())),
        ('Pragma', 'no-cache')]
    for key, value in default_headers:
        response.headers.setdefault(key, value)

    raise HTTP(response.status, page, **response.headers)

class LazyWSGI(object):
    def __init__(self, environ, request, response):
        self.wsgi_environ = environ
        self.request = request
        self.response = response

    @property
    def environ(self):
        if not hasattr(self, '_environ'):
            new_environ = self.wsgi_environ
            new_environ['wsgi.input'] = self.request.body
            new_environ['wsgi.version'] = 1
            self._environ = new_environ
        return self._environ

    def start_response(self, status='200', headers=[], exec_info=None):
        """
        in controller you can use:

        - request.wsgi.environ
        - request.wsgi.start_response

        to call third party WSGI applications
        """
        self.response.status = str(status).split(' ', 1)[0]
        self.response.headers = dict(headers)
        return lambda *args, **kargs: \
            self.response.write(escape=False, *args, **kargs)

    def middleware(self, *middleware_apps):
        """
        In you controller use::

            @request.wsgi.middleware(middleware1, middleware2, ...)

        to decorate actions with WSGI middleware. actions must return strings.
        uses a simulated environment so it may have weird behavior in some cases
        """
        def middleware(f):
            def app(environ, start_response):
                data = f()
                start_response(self.response.status,
                               self.response.headers.items())
                if isinstance(data, list):
                    return data
                return [data]
            for item in middleware_apps:
                app = item(app)

            def caller(app):
                return app(self.environ, self.start_response)
            return lambda caller=caller, app=app: caller(app)
        return middleware

def wsgibase(environ, responder):
    """
    The gluon wsgi application. The first function called when a page
    is requested (static or dynamic). It can be called by paste.httpserver
    or by apache mod_wsgi (or any WSGI-compatible server).

      - fills request with info
      - the environment variables, replacing '.' with '_'
      - adds web2py path and version info
      - compensates for fcgi missing path_info and query_string
      - validates the path in url

    The url path must be either:

    1. for static pages:

      - /<application>/static/<file>

    2. for dynamic pages:

      - /<application>[/<controller>[/<function>[/<sub>]]][.<extension>]

    The naming conventions are:

      - application, controller, function and extension may only contain
        `[a-zA-Z0-9_]`
      - file and sub may also contain '-', '=', '.' and '/'
    """
    eget = environ.get
    current.__dict__.clear()
    request = Request(environ)
    response = Response()
    session = Session()
    env = request.env
    # env.web2py_path = global_settings.applications_parent
    env.web2py_version = web2py_version
    # env.update(global_settings)
    static_file = False
    http_response = None
    try:
        try:
            try:
                # ##################################################
                # handle fcgi missing path_info and query_string
                # select rewrite parameters
                # rewrite incoming URL
                # parse rewritten header variables
                # parse rewritten URL
                # serve file if static
                # ##################################################

                fixup_missing_path_info(environ)
                (static_file, version, environ) = url_in(request, environ)
                response.status = env.web2py_status_code or response.status

                if static_file:
                    if eget('QUERY_STRING', '').startswith('attachment'):
                        response.headers['Content-Disposition'] \
                            = 'attachment'
                    if version:
                        response.headers['Cache-Control'] = 'max-age=315360000'
                        response.headers[
                            'Expires'] = 'Thu, 31 Dec 2037 23:59:59 GMT'
                    response.stream(static_file, request=request)

                # ##################################################
                # fill in request items
                # ##################################################
                app = request.application  # must go after url_in!

                if not global_settings.local_hosts:
                    local_hosts = set(['127.0.0.1', '::ffff:127.0.0.1', '::1'])
                    if not global_settings.web2py_runtime_gae:
                        try:
                            fqdn = socket.getfqdn()
                            local_hosts.add(socket.gethostname())
                            local_hosts.add(fqdn)
                            local_hosts.update([
                                addrinfo[4][0] for addrinfo
                                in getipaddrinfo(fqdn)])
                            if env.server_name:
                                local_hosts.add(env.server_name)
                                local_hosts.update([
                                    addrinfo[4][0] for addrinfo
                                    in getipaddrinfo(env.server_name)])
                        except (socket.gaierror, TypeError):
                            pass
                    global_settings.local_hosts = list(local_hosts)
                else:
                    local_hosts = global_settings.local_hosts
                client = get_client(env)
                x_req_with = str(env.http_x_requested_with).lower()
                cmd_opts = global_settings.cmd_options
                request.update(
                    client=client,
                    folder=abspath('applications', app) + os.sep,
                    ajax=x_req_with == 'xmlhttprequest',
                    cid=env.http_web2py_component_element,
                    is_local=(env.remote_addr in local_hosts and
                              client == env.remote_addr),
                    is_shell=False,
                    is_scheduler=False,
                    is_https=env.wsgi_url_scheme in HTTPS_SCHEMES or
                    request.env.http_x_forwarded_proto in HTTPS_SCHEMES
                    or env.https == 'on'
                )
                request.url = environ['PATH_INFO']

                # ##################################################
                # access the requested application
                # ##################################################

                disabled = pjoin(request.folder, 'DISABLED')
                if not exists(request.folder):
                    if app == rwthread.routes.default_application \
                            and app != 'welcome':
                        redirect(URL('welcome', 'default', 'index'))
                    elif rwthread.routes.error_handler:
                        _handler = rwthread.routes.error_handler
                        redirect(URL(_handler['application'],
                                     _handler['controller'],
                                     _handler['function'],
                                     args=app))
                    else:
                        raise HTTP(404, rwthread.routes.error_message
                                   % 'invalid request',
                                   web2py_error='invalid application')
                elif not request.is_local and exists(disabled):
                    five0three = os.path.join(request.folder, 'static', '503.html')
                    if os.path.exists(five0three):
                        raise HTTP(503, open(five0three, 'r').read())
                    else:
                        raise HTTP(503, "<html><body><h1>Temporarily down for maintenance</h1></body></html>")

                # ##################################################
                # build missing folders
                # ##################################################
                create_missing_app_folders(request)

                # ##################################################
                # get the GET and POST data
                # ##################################################
                # parse_get_post_vars(request, environ)

                # ##################################################
                # expose wsgi hooks for convenience
                # ##################################################
                request.wsgi = LazyWSGI(environ, request, response)

                # ##################################################
                # load cookies
                # ##################################################

                if env.http_cookie:
                    for single_cookie in env.http_cookie.split(';'):
                        single_cookie = single_cookie.strip()
                        if single_cookie:
                            try:
                                request.cookies.load(single_cookie)
                            except Cookie.CookieError:
                                pass  # single invalid cookie ignore

                # ##################################################
                # try load session or create new session file
                # ##################################################

                if not env.web2py_disable_session:
                    session.connect(request, response)

                # ##################################################
                # run controller
                # ##################################################

                if global_settings.debugging and app != "admin":
                    import gluon.debug
                    # activate the debugger
                    gluon.debug.dbg.do_debug(mainpyfile=request.folder)

                serve_controller(request, response, session)

            except HTTP as hr:
                http_response = hr

                if static_file:
                    return http_response.to(responder, env=env)

                if request.body:
                    request.body.close()

                if hasattr(current, 'request'):

                    # ##################################################
                    # on success, try store session in database
                    # ##################################################
                    if not env.web2py_disable_session:
                        session._try_store_in_db(request, response)

                    # ##################################################
                    # on success, commit database
                    # ##################################################

                    if response.do_not_commit is True:
                        BaseAdapter.close_all_instances(None)
                    elif response.custom_commit:
                        BaseAdapter.close_all_instances(response.custom_commit)
                    else:
                        BaseAdapter.close_all_instances('commit')

                    # ##################################################
                    # if session not in db try store session on filesystem
                    # this must be done after trying to commit database!
                    # ##################################################

                    if not env.web2py_disable_session:
                        session._try_store_in_cookie_or_file(request, response)

                    # Set header so client can distinguish component requests.
                    if request.cid:
                        http_response.headers.setdefault(
                            'web2py-component-content', 'replace')

                    if request.ajax:
                        if response.flash:
                            http_response.headers['web2py-component-flash'] = \
                                urllib2.quote(xmlescape(response.flash).replace(b'\n', b''))
                        if response.js:
                            http_response.headers['web2py-component-command'] = \
                                urllib2.quote(response.js.replace('\n', ''))

                    # ##################################################
                    # store cookies in headers
                    # ##################################################

                    session._fixup_before_save()
                    http_response.cookies2headers(response.cookies)

                ticket = None

            except RestrictedError as e:

                if request.body:
                    request.body.close()

                # ##################################################
                # on application error, rollback database
                # ##################################################

                # log tickets before rollback if not in DB
                if not request.tickets_db:
                    ticket = e.log(request) or 'unknown'
                # rollback
                if response._custom_rollback:
                    response._custom_rollback()
                else:
                    BaseAdapter.close_all_instances('rollback')
                # if tickets in db, reconnect and store it in db
                if request.tickets_db:
                    ticket = e.log(request) or 'unknown'

                http_response = \
                    HTTP(500, rwthread.routes.error_message_ticket %
                         dict(ticket=ticket),
                         web2py_error='ticket %s' % ticket)

        except:

            if request.body:
                request.body.close()

            # ##################################################
            # on application error, rollback database
            # ##################################################

            try:
                if response._custom_rollback:
                    response._custom_rollback()
                else:
                    BaseAdapter.close_all_instances('rollback')
            except:
                pass
            e = RestrictedError('Framework', '', '', locals())
            ticket = e.log(request) or 'unrecoverable'
            http_response = \
                HTTP(500, rwthread.routes.error_message_ticket
                     % dict(ticket=ticket),
                     web2py_error='ticket %s' % ticket)

    finally:
        if response and hasattr(response, 'session_file') \
                and response.session_file:
            response.session_file.close()

    session._unlock(response)

    http_response, new_environ = try_rewrite_on_error(
        http_response, request, environ, ticket)
    if not http_response:
        return wsgibase(new_environ, responder)

    if global_settings.web2py_crontype == 'soft':
        newcron.softcron(global_settings.applications_parent).start()

    return http_response.to(responder, env=env)


def save_password(password, port):
    """
    Used by main() to save the password in the parameters_port.py file.
    """
    password_file = abspath('parameters_%i.py' % port)
    if password == '<random>':
        # make up a new password
        chars = string.letters + string.digits
        password = ''.join([random.choice(chars) for _ in range(8)])
        cpassword = CRYPT()(password)[0]
        print('******************* IMPORTANT!!! ************************')
        print('your admin password is "%s"' % password)
        print('*********************************************************')
    elif password == '<recycle>':
        # reuse the current password if any
        if exists(password_file):
            return
        else:
            password = ''
    elif password.startswith('<pam_user:'):
        # use the pam password for specified user
        cpassword = password[1:-1]
    else:
        # use provided password
        cpassword = CRYPT()(password)[0]
    fp = open(password_file, 'w')
    if password:
        fp.write('password="%s"\n' % cpassword)
    else:
        fp.write('password=None\n')
    fp.close()
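Note that `string.letters` in the `'<random>'` branch exists only on Python 2; on Python 3 the equivalent attribute is `string.ascii_letters`. A minimal, self-contained sketch of the same random-password recipe for Python 3 (the function name here is illustrative, not part of web2py; `secrets` is used instead of `random` since this is a credential):

```python
import secrets
import string


def make_admin_password(length=8):
    # Same alphabet as save_password()'s '<random>' branch, spelled with
    # the Python 3 names; secrets.choice draws from a CSPRNG.
    chars = string.ascii_letters + string.digits
    return ''.join(secrets.choice(chars) for _ in range(length))


password = make_admin_password()
print(len(password))  # 8
```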


def appfactory(wsgiapp=wsgibase,
               logfilename='httpserver.log',
               profiler_dir=None,
               profilerfilename=None):
    """
    generates a wsgi application that does logging and profiling and calls
    wsgibase

    Args:
        wsgiapp: the base application
        logfilename: where to store apache-compatible requests log
        profiler_dir: where to store profile files
    """
    if profilerfilename is not None:
        raise BaseException("Deprecated API")
    if profiler_dir:
        profiler_dir = abspath(profiler_dir)
        logger.warn('profiler is on. will use dir %s', profiler_dir)
        if not os.path.isdir(profiler_dir):
            try:
                os.makedirs(profiler_dir)
            except:
                raise BaseException("Can't create dir %s" % profiler_dir)
        filepath = pjoin(profiler_dir, 'wtest')
        try:
            filehandle = open(filepath, 'w')
            filehandle.close()
            os.unlink(filepath)
        except IOError:
            raise BaseException("Unable to write to dir %s" % profiler_dir)

    def app_with_logging(environ, responder):
        """
        a wsgi app that does logging and profiling and calls wsgibase
        """
        status_headers = []

        def responder2(s, h):
            """
            wsgi responder app
            """
            status_headers.append(s)
            status_headers.append(h)
            return responder(s, h)

        time_in = time.time()
        ret = [0]
        if not profiler_dir:
            ret[0] = wsgiapp(environ, responder2)
        else:
            import cProfile
            prof = cProfile.Profile()
            prof.enable()
            ret[0] = wsgiapp(environ, responder2)
            prof.disable()
            destfile = pjoin(profiler_dir, "req_%s.prof" % web2py_uuid())
            prof.dump_stats(destfile)
        try:
            line = '%s, %s, %s, %s, %s, %s, %f\n' % (
                environ['REMOTE_ADDR'],
                datetime.datetime.today().strftime('%Y-%m-%d %H:%M:%S'),
                environ['REQUEST_METHOD'],
                environ['PATH_INFO'].replace(',', '%2C'),
                environ['SERVER_PROTOCOL'],
                (status_headers[0])[:3],
                time.time() - time_in,
            )
            if not logfilename:
                sys.stdout.write(line)
            elif isinstance(logfilename, str):
                write_file(logfilename, line, 'a')
            else:
                logfilename.write(line)
        except:
            pass
        return ret[0]

    return app_with_logging
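The key trick in `app_with_logging` is the `responder2` closure: it wraps the server's `start_response` so the status line can be captured for the log entry after the wrapped app returns. A self-contained sketch of that pattern, with a toy WSGI app and log list that are illustrative only (not web2py's):

```python
import time


def hello_app(environ, start_response):
    # stand-in WSGI app used only for this demonstration
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'hello']


def make_logging_app(wsgiapp, log):
    # Same closure trick as responder2 above: record the status line the
    # inner app passes to start_response, then log it with the timing.
    def app_with_logging(environ, start_response):
        status_headers = []

        def responder2(status, headers):
            status_headers.append(status)
            status_headers.append(headers)
            return start_response(status, headers)

        time_in = time.time()
        ret = wsgiapp(environ, responder2)
        log.append('%s, %s, %s, %.6f' % (
            environ['REQUEST_METHOD'],
            environ['PATH_INFO'],
            status_headers[0][:3],
            time.time() - time_in))
        return ret

    return app_with_logging


log = []
app = make_logging_app(hello_app, log)
body = app({'REQUEST_METHOD': 'GET', 'PATH_INFO': '/'}, lambda s, h: None)
print(body, log)
```

Calling the middleware directly with a fake `environ` and no-op responder, as above, is enough to see the captured status without running a server.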


class HttpServer(object):
    """
    the web2py web server (Rocket)
    """

    def __init__(
        self,
        ip='127.0.0.1',
        port=8000,
        password='',
        pid_filename='httpserver.pid',
        log_filename='httpserver.log',
        profiler_dir=None,
        ssl_certificate=None,
        ssl_private_key=None,
        ssl_ca_certificate=None,
        min_threads=None,
        max_threads=None,
        server_name=None,
        request_queue_size=5,
        timeout=10,
        socket_timeout=1,
        shutdown_timeout=None,  # Rocket does not use a shutdown timeout
        path=None,
        interfaces=None  # Rocket is able to use several interfaces - must be list of socket-tuples as string
    ):
        """
        starts the web server.
        """
        if interfaces:
            # if interfaces is specified, it must be tested for rocket parameter correctness
            # not necessarily completely tested (e.g. content of tuples or ip-format)
            if isinstance(interfaces, list):
                for i in interfaces:
                    if not isinstance(i, tuple):
                        raise RuntimeError(
                            "Wrong format for rocket interfaces parameter - see http://packages.python.org/rocket/")
            else:
                raise RuntimeError(
                    "Wrong format for rocket interfaces parameter - see http://packages.python.org/rocket/")
        if path:
            # if a path is specified change the global variables so that web2py
            # runs from there instead of cwd or os.environ['web2py_path']
            global web2py_path
            path = os.path.normpath(path)
            web2py_path = path
            global_settings.applications_parent = path
            os.chdir(path)
            load_routes()
            for p in (path, abspath('site-packages'), ""):
                add_path_first(p)
            if exists("logging.conf"):
                logging.config.fileConfig("logging.conf")
        save_password(password, port)
        self.pid_filename = pid_filename
        if not server_name:
            server_name = socket.gethostname()
        logger.info('starting web server...')
        rocket.SERVER_NAME = server_name
        rocket.SOCKET_TIMEOUT = socket_timeout
        sock_list = [ip, port]
        if not ssl_certificate or not ssl_private_key:
            logger.info('SSL is off')
        elif not rocket.ssl:
            logger.warning('Python "ssl" module unavailable. SSL is OFF')
        elif not exists(ssl_certificate):
            logger.warning('unable to open SSL certificate. SSL is OFF')
        elif not exists(ssl_private_key):
            logger.warning('unable to open SSL private key. SSL is OFF')
        else:
            sock_list.extend([ssl_private_key, ssl_certificate])
            if ssl_ca_certificate:
                sock_list.append(ssl_ca_certificate)
            logger.info('SSL is ON')
        app_info = {'wsgi_app': appfactory(wsgibase,
                                           log_filename,
                                           profiler_dir)}
        self.server = rocket.Rocket(interfaces or tuple(sock_list),
                                    method='wsgi',
                                    app_info=app_info,
                                    min_threads=min_threads,
                                    max_threads=max_threads,
                                    queue_size=int(request_queue_size),
                                    timeout=int(timeout),
                                    handle_signals=False,
                                    )

    def start(self):
        """
        start the web server
        """
        try:
            signal.signal(signal.SIGTERM, lambda a, b, s=self: s.stop())
            signal.signal(signal.SIGINT, lambda a, b, s=self: s.stop())
        except:
            pass
        write_file(self.pid_filename, str(os.getpid()))
        self.server.start()

    def stop(self, stoplogging=False):
        """
        stop cron and the web server
        """
        newcron.stopcron()
        self.server.stop(stoplogging)
        try:
            os.unlink(self.pid_filename)
        except:
            pass
| 37.919598 | 117 | 0.52286 | 2,993 | 30,184 | 5.130638 | 0.213498 | 0.006187 | 0.006512 | 0.001954 | 0.133498 | 0.102631 | 0.086806 | 0.065642 | 0.060042 | 0.054572 | 0 | 0.009227 | 0.332163 | 30,184 | 795 | 118 | 37.967296 | 0.752555 | 0.190465 | 0 | 0.173554 | 0 | 0.004132 | 0.082748 | 0.010462 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035124 | false | 0.051653 | 0.082645 | 0.002066 | 0.152893 | 0.008264 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
5d96fee6a1d130e8653363f3a24275073276610b | 1,496 | py | Python | app/__init__.py | dulin/tornado-test | 8ceeb9f2b50b4cd0f18baa9149140721feec1925 | [
"MIT"
] | null | null | null | app/__init__.py | dulin/tornado-test | 8ceeb9f2b50b4cd0f18baa9149140721feec1925 | [
"MIT"
] | null | null | null | app/__init__.py | dulin/tornado-test | 8ceeb9f2b50b4cd0f18baa9149140721feec1925 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# -*- mode: python -*-

import aiopg
import psycopg2
import tornado.locks
from tornado.options import define, options

from app.application import Application

define('port', default=8080, help="listening port")
define('bind_address', default="", help="bind address")
define("db_host", default="127.0.0.1", help="database host")
define("db_port", default=5432, help="database port")
define("db_database", default="tornado", help="database name")
define("db_user", default="tornado", help="database user")
define("db_password", default="tornado", help="database password")


async def maybe_create_tables(db):
    try:
        with (await db.cursor()) as cur:
            await cur.execute("SELECT COUNT(*) FROM schema LIMIT 1")
            await cur.fetchone()
    except psycopg2.ProgrammingError:
        print("Database error!")


async def main():
    options.parse_command_line()
    async with aiopg.create_pool(
            host=options.db_host,
            port=options.db_port,
            user=options.db_user,
            password=options.db_password,
            dbname=options.db_database) as db:
        await maybe_create_tables(db)
        app = Application(db)
        app.listen(options.port, options.bind_address, xheaders=True)
        print("Listening on http://%s:%i" % (options.bind_address, options.port))
        shutdown_event = tornado.locks.Event()
        await shutdown_event.wait()
| 32.521739 | 81 | 0.675802 | 191 | 1,496 | 5.17801 | 0.382199 | 0.040445 | 0.054601 | 0.078868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014876 | 0.191176 | 1,496 | 45 | 82 | 33.244444 | 0.802479 | 0.042112 | 0 | 0 | 0 | 0 | 0.181119 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.058824 | 0.176471 | 0 | 0.176471 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
5d99e63583440bf3da1b852644d47a0c0ec5d4a3 | 349 | py | Python | src/project/api/rankings/urls.py | jSkrod/djangae-react-browser-games-app | 28c5064f0a126021afb08b195839305aba6b35a2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | src/project/api/rankings/urls.py | jSkrod/djangae-react-browser-games-app | 28c5064f0a126021afb08b195839305aba6b35a2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | src/project/api/rankings/urls.py | jSkrod/djangae-react-browser-games-app | 28c5064f0a126021afb08b195839305aba6b35a2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | from django.conf.urls import url, include
from project.api.rankings.api import AddRanking, AddScore, GetScoresUser, GetScoresGame
urlpatterns = [
    url(r'add_ranking$', AddRanking.as_view()),
    url(r'add_score$', AddScore.as_view()),
    url(r'get_scores_game$', GetScoresGame.as_view()),
    url(r'get_scores_user$', GetScoresUser.as_view())
] | 38.777778 | 87 | 0.739255 | 48 | 349 | 5.166667 | 0.520833 | 0.064516 | 0.108871 | 0.120968 | 0.153226 | 0.153226 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114613 | 349 | 9 | 88 | 38.777778 | 0.802589 | 0 | 0 | 0 | 0 | 0 | 0.154286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
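The route patterns passed to `url()` above are ordinary regular expressions, so they can be sanity-checked directly with the `re` module before wiring them into Django. A small illustrative sketch (the sample request paths are made up, not real routes from this project):

```python
import re

# Patterns copied from the urlpatterns above.
patterns = {
    'add_ranking': r'add_ranking$',
    'add_score': r'add_score$',
    'get_scores_game': r'get_scores_game$',
    'get_scores_user': r'get_scores_user$',
}

# re.match anchors at the start of the string, similar to how Django
# matches the remaining path against each pattern.
print(bool(re.match(patterns['add_ranking'], 'add_ranking')))   # True
print(bool(re.match(patterns['add_score'], 'add_ranking')))     # False
```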
5da0ff4d3e7dbb3fe7c21095720798fb7df7ef6b | 742 | py | Python | 02/selenium.02.py | study-machine-learning/dongheon.shin | 6103ef9c73b162603bc39a27e4ecca0f1ac35e57 | [
"MIT"
] | 2 | 2017-09-24T02:29:48.000Z | 2017-10-05T11:15:22.000Z | 02/selenium.02.py | study-machine-learning/dongheon.shin | 6103ef9c73b162603bc39a27e4ecca0f1ac35e57 | [
"MIT"
] | null | null | null | 02/selenium.02.py | study-machine-learning/dongheon.shin | 6103ef9c73b162603bc39a27e4ecca0f1ac35e57 | [
"MIT"
] | null | null | null | from selenium import webdriver
username = "henlix"
password = "my_password"
browser = webdriver.PhantomJS()
browser.implicitly_wait(5)
url_login = "https://nid.naver.com/nidlogin.login"
browser.get(url_login)
el = browser.find_element_by_id("id")
el.clear()
el.send_keys(username)
el = browser.find_element_by_id("pw")
el.clear()
el.send_keys(password)
form = browser.find_element_by_css_selector("input.btn_global[type=submit]")
form.submit()
url_shopping_list = "https://order.pay.naver.com/home?tabMenu=SHOPPING"
browser.get(url_shopping_list)
products = browser.find_elements_by_css_selector(".p_info span")
for product in products:
    print("- ", product.text)
# PYTHONIOENCODING=utf-8:surrogateescape python3 selenium.02.py
| 22.484848 | 76 | 0.777628 | 109 | 742 | 5.055046 | 0.559633 | 0.079855 | 0.098004 | 0.108893 | 0.14882 | 0.087114 | 0 | 0 | 0 | 0 | 0 | 0.007396 | 0.088949 | 742 | 32 | 77 | 23.1875 | 0.807692 | 0.08221 | 0 | 0.1 | 0 | 0 | 0.21944 | 0.04271 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.1 | 0.05 | 0 | 0.05 | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
5da246d54547ba7b297b610234129f3853586daf | 343 | py | Python | visualization/matplotlib/barwitherror.py | Licas/datascienceexamples | cbb1293dbae875cb3f166dbde00b2ab629a43ece | [
"MIT"
] | null | null | null | visualization/matplotlib/barwitherror.py | Licas/datascienceexamples | cbb1293dbae875cb3f166dbde00b2ab629a43ece | [
"MIT"
] | null | null | null | visualization/matplotlib/barwitherror.py | Licas/datascienceexamples | cbb1293dbae875cb3f166dbde00b2ab629a43ece | [
"MIT"
] | null | null | null | from matplotlib import pyplot as plt
drinks = ["cappuccino", "latte", "chai", "americano", "mocha", "espresso"]
ounces_of_milk = [6, 9, 4, 0, 9, 0]
error = [0.6, 0.9, 0.4, 0, 0.9, 0]
# yerr -> element at position i draws a +/- error[i] deviation on bar[i]'s value
plt.bar( range(len(drinks)),ounces_of_milk, yerr=error, capsize=15)
plt.show() | 38.111111 | 79 | 0.667638 | 60 | 343 | 3.75 | 0.6 | 0.026667 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061224 | 0.142857 | 343 | 9 | 80 | 38.111111 | 0.704082 | 0.227405 | 0 | 0 | 0 | 0 | 0.154717 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5da7c35c6f555424a35c54ce0dd94e20ac56d5b8 | 4,116 | py | Python | ngraph_onnx/onnx_importer/utils/numeric_limits.py | cliveseldon/ngraph-onnx | a2d20afdc7acd5064e4717612ad372d864d03d3d | [
"Apache-2.0"
] | null | null | null | ngraph_onnx/onnx_importer/utils/numeric_limits.py | cliveseldon/ngraph-onnx | a2d20afdc7acd5064e4717612ad372d864d03d3d | [
"Apache-2.0"
] | null | null | null | ngraph_onnx/onnx_importer/utils/numeric_limits.py | cliveseldon/ngraph-onnx | a2d20afdc7acd5064e4717612ad372d864d03d3d | [
"Apache-2.0"
] | null | null | null | # ******************************************************************************
# Copyright 2018 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ******************************************************************************
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import numpy as np
import numbers
from typing import Union
class NumericLimits(object):
    """Class providing interface to extract numerical limits for given data type."""

    @staticmethod
    def _get_number_limits_class(dtype):
        # type: (np.dtype) -> Union[IntegralLimits, FloatingPointLimits]
        """Return specialized class instance with limits set for given data type.

        :param dtype: The data type we want to check limits for.
        :return: The specialized class instance providing numeric limits.
        """
        data_type = dtype.type
        value = data_type(1)

        if isinstance(value, numbers.Integral):
            return IntegralLimits(data_type)
        elif isinstance(value, numbers.Real):
            return FloatingPointLimits(data_type)
        else:
            raise ValueError('NumericLimits: unsupported data type: <{}>.'.format(dtype.type))

    @staticmethod
    def _get_dtype(dtype):  # type: (Union[np.dtype, int, float]) -> np.dtype
        """Return numpy dtype object wrapping provided data type.

        :param dtype: The data type to be wrapped.
        :return: The numpy dtype object.
        """
        return dtype if isinstance(dtype, np.dtype) else np.dtype(dtype)

    @classmethod
    def max(cls, dtype):  # type: (np.dtype) -> Union[int, float]
        """Return maximum value that can be represented in given data type.

        :param dtype: The data type we want to check maximum value for.
        :return: The maximum value.
        """
        return cls._get_number_limits_class(cls._get_dtype(dtype)).max

    @classmethod
    def min(cls, dtype):  # type: (np.dtype) -> Union[int, float]
        """Return minimum value that can be represented in given data type.

        :param dtype: The data type we want to check minimum value for.
        :return: The minimum value.
        """
        return cls._get_number_limits_class(cls._get_dtype(dtype)).min


class FloatingPointLimits(object):
    """Class providing access to numeric limits for floating point data types."""

    def __init__(self, data_type):  # type: (type) -> None
        self.data_type = data_type

    @property
    def max(self):  # type: () -> float
        """Provide maximum representable value by stored data type.

        :return: The maximum value.
        """
        return np.finfo(self.data_type).max

    @property
    def min(self):  # type: () -> float
        """Provide minimum representable value by stored data type.

        :return: The minimum value.
        """
        return np.finfo(self.data_type).min


class IntegralLimits(object):
    """Class providing access to numeric limits for integral data types."""

    def __init__(self, data_type):  # type: (type) -> None
        self.data_type = data_type

    @property
    def max(self):  # type: () -> int
        """Provide maximum representable value by stored data type.

        :return: The maximum value.
        """
        return np.iinfo(self.data_type).max

    @property
    def min(self):  # type: () -> int
        """Provide minimum representable value by stored data type.

        :return: The minimum value.
        """
        return np.iinfo(self.data_type).min
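The dispatch in `_get_number_limits_class` works because numpy scalar types register with Python's abstract numeric tower (`numbers.Integral`, `numbers.Real`). A stdlib-only sketch of the same dispatch idea, using `sys.float_info` in place of `np.finfo`/`np.iinfo` (illustrative only; the real classes should keep using numpy for dtype-exact limits):

```python
import numbers
import sys


def describe_limits(value):
    # Same isinstance chain as _get_number_limits_class above, but
    # answering with stdlib facts instead of numpy dtype limits.
    if isinstance(value, numbers.Integral):
        return 'integral (unbounded in pure Python)'
    elif isinstance(value, numbers.Real):
        return 'floating, max %.3e' % sys.float_info.max
    raise ValueError('unsupported value: %r' % (value,))


print(describe_limits(1))    # integral (unbounded in pure Python)
print(describe_limits(1.0))
```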
| 34.588235 | 94 | 0.640671 | 508 | 4,116 | 5.074803 | 0.283465 | 0.086889 | 0.037238 | 0.027929 | 0.438712 | 0.418154 | 0.418154 | 0.389837 | 0.355702 | 0.297517 | 0 | 0.002854 | 0.233722 | 4,116 | 118 | 95 | 34.881356 | 0.814521 | 0.536929 | 0 | 0.355556 | 0 | 0 | 0.025935 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.155556 | 0 | 0.644444 | 0.022222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5db204f5af9206eceaf400a510a5e3d05316e861 | 2,647 | py | Python | observations/r/zea_mays.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 199 | 2017-07-24T01:34:27.000Z | 2022-01-29T00:50:55.000Z | observations/r/zea_mays.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 46 | 2017-09-05T19:27:20.000Z | 2019-01-07T09:47:26.000Z | observations/r/zea_mays.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 45 | 2017-07-26T00:10:44.000Z | 2022-03-16T20:44:59.000Z | # -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import csv
import numpy as np
import os
import sys
from observations.util import maybe_download_and_extract
def zea_mays(path):
  """Darwin's Heights of Cross- and Self-fertilized Zea May Pairs

  Darwin (1876) studied the growth of pairs of zea may (aka corn)
  seedlings, one produced by cross-fertilization and the other produced by
  self-fertilization, but otherwise grown under identical conditions. His
  goal was to demonstrate the greater vigour of the cross-fertilized
  plants. The data recorded are the final height (inches, to the nearest
  1/8th) of the plants in each pair.

  In the *Design of Experiments*, Fisher (1935) used these data to
  illustrate a paired t-test (well, a one-sample test on the mean
  difference, `cross - self`). Later in the book (section 21), he used
  this data to illustrate an early example of a non-parametric permutation
  test, treating each paired difference as having (randomly) either a
  positive or negative sign.

  A data frame with 15 observations on the following 4 variables.

  `pair`
      pair number, a numeric vector

  `pot`
      pot, a factor with levels `1` `2` `3` `4`

  `cross`
      height of cross fertilized plant, a numeric vector

  `self`
      height of self fertilized plant, a numeric vector

  `diff`
      `cross - self` for each pair

  Darwin, C. (1876). *The Effect of Cross- and Self-fertilization in the
  Vegetable Kingdom*, 2nd Ed. London: John Murray.

  Andrews, D. and Herzberg, A. (1985) *Data: a collection of problems from
  many fields for the student and research worker*. New York: Springer.

  Data retrieved from: `https://www.stat.cmu.edu/StatDat/`

  Args:
    path: str.
      Path to directory which either stores file or otherwise file will
      be downloaded and extracted there.
      Filename is `zea_mays.csv`.

  Returns:
    Tuple of np.ndarray `x_train` with 15 rows and 5 columns and
    dictionary `metadata` of column headers (feature names).
  """
  import pandas as pd
  path = os.path.expanduser(path)
  filename = 'zea_mays.csv'
  if not os.path.exists(os.path.join(path, filename)):
    url = 'http://dustintran.com/data/r/HistData/ZeaMays.csv'
    maybe_download_and_extract(path, url,
                               save_file_name='zea_mays.csv',
                               resume=False)

  data = pd.read_csv(os.path.join(path, filename), index_col=0,
                     parse_dates=True)
  x_train = data.values
  metadata = {'columns': data.columns}
  return x_train, metadata
| 32.679012 | 74 | 0.705327 | 394 | 2,647 | 4.65736 | 0.532995 | 0.015259 | 0.026158 | 0.025068 | 0.055586 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015934 | 0.217605 | 2,647 | 80 | 75 | 33.0875 | 0.870111 | 0.671326 | 0 | 0 | 0 | 0 | 0.101394 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.409091 | 0 | 0.5 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
5dbe1aa985f0f74b54e5721ad988a0ced87ead89 | 469 | py | Python | mygallary/urls.py | mangowilliam/my_gallary | 4c87fe055e5c28d6ca6a27ea5bde7df380750006 | [
"MIT"
] | null | null | null | mygallary/urls.py | mangowilliam/my_gallary | 4c87fe055e5c28d6ca6a27ea5bde7df380750006 | [
"MIT"
] | 6 | 2021-03-19T02:06:21.000Z | 2022-03-11T23:53:21.000Z | mygallary/urls.py | mangowilliam/my_gallary | 4c87fe055e5c28d6ca6a27ea5bde7df380750006 | [
"MIT"
] | null | null | null | from django.conf import settings
from django.conf.urls.static import static
from django.conf.urls import url
from . import views
urlpatterns = [
    url(r'^$', views.gallary, name='gallary'),
    url(r'^search/', views.search_image, name='search_image'),
    url(r'^details/(\d+)', views.search_location, name='images')
]

if settings.DEBUG:
    urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
| 20.391304 | 81 | 0.656716 | 58 | 469 | 5.206897 | 0.413793 | 0.099338 | 0.139073 | 0.119205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211087 | 469 | 23 | 82 | 20.391304 | 0.816216 | 0 | 0 | 0 | 0 | 0 | 0.104255 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
5dc01810c4c1797d877a743bdf67e61535eee657 | 1,914 | py | Python | exercise_2/exercise_2.1.py | lukaszbinden/ethz-iacv-2020 | 271de804315de98b816cda3e2498958ffa87ad59 | [
"MIT"
] | null | null | null | exercise_2/exercise_2.1.py | lukaszbinden/ethz-iacv-2020 | 271de804315de98b816cda3e2498958ffa87ad59 | [
"MIT"
] | null | null | null | exercise_2/exercise_2.1.py | lukaszbinden/ethz-iacv-2020 | 271de804315de98b816cda3e2498958ffa87ad59 | [
"MIT"
] | null | null | null |
camera_width = 640
camera_height = 480
film_back_width = 1.417
film_back_height = 0.945
x_center = 320
y_center = 240
P_1 = (-0.023, -0.261, 2.376)
p_11 = P_1[0]
p_12 = P_1[1]
p_13 = P_1[2]
P_2 = (0.659, -0.071, 2.082)
p_21 = P_2[0]
p_22 = P_2[1]
p_23 = P_2[2]
p_1_prime = (52, 163)
x_1 = p_1_prime[0]
y_1 = p_1_prime[1]
p_2_prime = (218, 216)
x_2 = p_2_prime[0]
y_2 = p_2_prime[1]
f = 1.378
k_x = camera_width / film_back_width
k_y = camera_height / film_back_height
# f_k_x = f * k_x
f_k_x = f
# f_k_y = f * k_y
f_k_y = f
u_1_prime = (x_1 - x_center) / k_x
v_1_prime = (y_1 - y_center) / k_y
u_2_prime = (x_2 - x_center) / k_x
v_2_prime = (y_2 - y_center) / k_y
c_1_prime = (f_k_x * p_21 + (p_13 - p_23) * u_2_prime - u_2_prime/u_1_prime * f_k_x * p_11) / (f_k_x * (1 - u_2_prime/u_1_prime))
c_2_prime = (f_k_y * p_22 - (p_23 - (p_13*u_1_prime - f_k_x*(p_11 - c_1_prime))/u_1_prime) * v_2_prime) / f_k_y
c_2_prime_alt = (f_k_y * p_12 - (p_13 - (p_13*u_1_prime - f_k_x*(p_11 - c_1_prime))/u_1_prime) * v_1_prime) / f_k_y
c_3_prime = p_13 - (f_k_x / u_1_prime) * (p_11 - c_1_prime)
rho_1_prime = p_13 - c_3_prime
rho_2_prime = p_23 - c_3_prime
print(f"C' = ({c_1_prime}, {c_2_prime}, {c_3_prime})")
print(f"c_2_prime_alt = {c_2_prime_alt}")
print(f"rho_1_prime = {rho_1_prime}")
print(f"rho_2_prime = {rho_2_prime}")
print("------------------")
r_11 = f_k_x * (p_11 - c_1_prime)
r_12 = f_k_y * (p_12 - c_2_prime)
r_13 = 1 * (p_13 - c_3_prime)
l_11 = rho_1_prime * u_1_prime
l_12 = rho_1_prime * v_1_prime
l_13 = rho_1_prime * 1
print(f"L: ({l_11}, {l_12}, {l_13})")
print(f"R: ({r_11}, {r_12}, {r_13})")
print("------------------")
r_21 = f_k_x * (p_21 - c_1_prime)
r_22 = f_k_y * (p_22 - c_2_prime)
r_23 = 1 * (p_23 - c_3_prime)
l_21 = rho_2_prime * u_2_prime
l_22 = rho_2_prime * v_2_prime
l_23 = rho_2_prime * 1
print(f"L: ({l_21}, {l_22}, {l_23})")
print(f"R: ({r_21}, {r_22}, {r_23})")
5dc1607dc008e8af7451051e5d28ffb9f945411a | 998 | py | Python | apps/zsh/singletons.py | codecat555/codecat555-fidgetingbits_knausj_talon | 62f9be0459e6631c99d58eee97054ddd970cc5f3 | [
"MIT"
] | 4 | 2021-02-04T07:36:05.000Z | 2021-07-03T06:53:30.000Z | apps/zsh/singletons.py | codecat555/codecat555-fidgetingbits_knausj_talon | 62f9be0459e6631c99d58eee97054ddd970cc5f3 | [
"MIT"
] | null | null | null | apps/zsh/singletons.py | codecat555/codecat555-fidgetingbits_knausj_talon | 62f9be0459e6631c99d58eee97054ddd970cc5f3 | [
"MIT"
] | null | null | null | # A rarely-updated module to assist in writing reload-safe talon modules using
# things like threads, which are not normally safe for reloading with talon.
# If this file is ever updated, you'll need to restart talon.
import logging
_singletons = {}
def singleton(fn):
    name = f"{fn.__module__}.{fn.__name__}"

    # Do any cleanup actions from before.
    if name in _singletons:
        old = _singletons.pop(name)
        try:
            next(old)
        except StopIteration:
            pass
        else:
            logging.error(
                f"the old @singleton function {name} had more than one yield!"
            )

    # Do the startup actions on the new object.
    it = iter(fn())
    obj = next(it)
    # Remember the iterator so we can call the cleanup actions later.
    _singletons[name] = it
    # We want the object yielded by the iterator to be available at the name
    # of the function, so instead of returning a function we return an object.
    return obj
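The decorator expects a generator function with exactly one `yield`: code before the yield is startup, code after is cleanup, and re-registering the same name (as happens on a talon reload) first drives the old generator to completion. A self-contained demonstration (it inlines a copy of the decorator so it runs on its own; the `load_module`/`worker` names are made up for illustration):

```python
import logging

_singletons = {}


def singleton(fn):
    # self-contained copy of the decorator above, for demonstration
    name = f"{fn.__module__}.{fn.__name__}"
    if name in _singletons:
        old = _singletons.pop(name)
        try:
            next(old)
        except StopIteration:
            pass
        else:
            logging.error(f"the old @singleton function {name} had more than one yield!")
    it = iter(fn())
    obj = next(it)
    _singletons[name] = it
    return obj


events = []


def load_module():
    # each "reload" re-defines and re-registers the same generator function
    def worker():
        events.append('start')     # startup action
        yield 'resource'           # the object made available to callers
        events.append('cleanup')   # cleanup action, run on the next reload
    return singleton(worker)


first = load_module()   # first load: startup only
second = load_module()  # simulated reload: old cleanup runs, then new startup
print(events)           # ['start', 'cleanup', 'start']
```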
| 28.514286 | 78 | 0.645291 | 140 | 998 | 4.514286 | 0.621429 | 0.018987 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.288577 | 998 | 34 | 79 | 29.352941 | 0.890141 | 0.497996 | 0 | 0 | 0 | 0 | 0.179226 | 0.059063 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.055556 | 0.055556 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
5dc4b2786f8172c270a1fc651693530424b90630 | 190 | py | Python | python_program/condition.py | LiuKaiqiang94/PyStudyExample | b30212718b218c71e06b68677f55c33e3a1dbf46 | [
"MIT"
] | 5 | 2018-09-10T02:52:35.000Z | 2018-09-20T07:50:42.000Z | python_program/condition.py | LiuKaiqiang94/PyStudyExample | b30212718b218c71e06b68677f55c33e3a1dbf46 | [
"MIT"
] | null | null | null | python_program/condition.py | LiuKaiqiang94/PyStudyExample | b30212718b218c71e06b68677f55c33e3a1dbf46 | [
"MIT"
] | null | null | null |
def main():
val=int(input("input a num"))
if val<10:
print("A")
elif val<20:
print("B")
elif val<30:
print("C")
else:
print("D")
main()
| 13.571429 | 33 | 0.442105 | 27 | 190 | 3.111111 | 0.62963 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05042 | 0.373684 | 190 | 13 | 34 | 14.615385 | 0.655462 | 0 | 0 | 0 | 0 | 0 | 0.079365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.090909 | 0.363636 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5dc789560fb397b3832cccec69534dcbf26e36d2 | 5,902 | py | Python | emilia/modules/math.py | masterisira/ELIZA_OF-master | 02a7dbf48e4a3d4ee0981e6a074529ab1497aafe | [
"Unlicense"
] | null | null | null | emilia/modules/math.py | masterisira/ELIZA_OF-master | 02a7dbf48e4a3d4ee0981e6a074529ab1497aafe | [
"Unlicense"
] | null | null | null | emilia/modules/math.py | masterisira/ELIZA_OF-master | 02a7dbf48e4a3d4ee0981e6a074529ab1497aafe | [
"Unlicense"
] | null | null | null | from typing import List
import requests
from telegram import Message, Update, Bot, MessageEntity
from telegram.ext import CommandHandler, run_async
from emilia import dispatcher
from emilia.modules.disable import DisableAbleCommandHandler
from emilia.modules.helper_funcs.alternate import send_message
import pynewtonmath as newton
import math
@run_async
def simplify(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.simplify('{}'.format(args[0])))


@run_async
def factor(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.factor('{}'.format(args[0])))


@run_async
def derive(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.derive('{}'.format(args[0])))


@run_async
def integrate(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.integrate('{}'.format(args[0])))


@run_async
def zeroes(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.zeroes('{}'.format(args[0])))


@run_async
def tangent(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.tangent('{}'.format(args[0])))


@run_async
def area(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(newton.area('{}'.format(args[0])))


@run_async
def cos(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.cos(int(args[0])))


@run_async
def sin(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.sin(int(args[0])))


@run_async
def tan(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.tan(int(args[0])))


@run_async
def arccos(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.acos(int(args[0])))


@run_async
def arcsin(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.asin(int(args[0])))


@run_async
def arctan(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.atan(int(args[0])))


@run_async
def abs(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.fabs(int(args[0])))


@run_async
def log(update, context):
    args = context.args
    message = update.effective_message
    message.reply_text(math.log(int(args[0])))
__help__ = """
Under development. More features coming soon.
- /cos: Cosine `/cos pi`
- /sin: Sine `/sin 0`
- /tan: Tangent `/tan 0`
- /arccos: Inverse Cosine `/arccos 1`
- /arcsin: Inverse Sine `/arcsin 0`
- /arctan: Inverse Tangent `/arctan 0`
- /abs: Absolute Value `/abs -1`
- /log: Logarithm `/log 2l8`
__Keep in mind__: To find the tangent line of a function at a certain x value, send the request as c|f(x) where c is the given x value and f(x) is the function expression, the separator is a vertical bar '|'. See the table above for an example request.
To find the area under a function, send the request as c:d|f(x) where c is the starting x value, d is the ending x value, and f(x) is the function under which you want the curve between the two x values.
To compute fractions, enter expressions as numerator(over)denominator. For example, to process 2/4 you must send in your expression as 2(over)4. The result expression will be in standard math notation (1/2, 3/4).
"""
SIMPLIFY_HANDLER = DisableAbleCommandHandler("math", simplify, pass_args=True)
FACTOR_HANDLER = DisableAbleCommandHandler("factor", factor, pass_args=True)
DERIVE_HANDLER = DisableAbleCommandHandler("derive", derive, pass_args=True)
INTEGRATE_HANDLER = DisableAbleCommandHandler("integrate", integrate, pass_args=True)
ZEROES_HANDLER = DisableAbleCommandHandler("zeroes", zeroes, pass_args=True)
TANGENT_HANDLER = DisableAbleCommandHandler("tangent", tangent, pass_args=True)
AREA_HANDLER = DisableAbleCommandHandler("area", area, pass_args=True)
COS_HANDLER = DisableAbleCommandHandler("cos", cos, pass_args=True)
SIN_HANDLER = DisableAbleCommandHandler("sin", sin, pass_args=True)
TAN_HANDLER = DisableAbleCommandHandler("tan", tan, pass_args=True)
ARCCOS_HANDLER = DisableAbleCommandHandler("arccos", arccos, pass_args=True)
ARCSIN_HANDLER = DisableAbleCommandHandler("arcsin", arcsin, pass_args=True)
ARCTAN_HANDLER = DisableAbleCommandHandler("arctan", arctan, pass_args=True)
ABS_HANDLER = DisableAbleCommandHandler("abs", abs, pass_args=True)
LOG_HANDLER = DisableAbleCommandHandler("log", log, pass_args=True)
dispatcher.add_handler(SIMPLIFY_HANDLER)
dispatcher.add_handler(FACTOR_HANDLER)
dispatcher.add_handler(DERIVE_HANDLER)
dispatcher.add_handler(INTEGRATE_HANDLER)
dispatcher.add_handler(ZEROES_HANDLER)
dispatcher.add_handler(TANGENT_HANDLER)
dispatcher.add_handler(AREA_HANDLER)
dispatcher.add_handler(COS_HANDLER)
dispatcher.add_handler(SIN_HANDLER)
dispatcher.add_handler(TAN_HANDLER)
dispatcher.add_handler(ARCCOS_HANDLER)
dispatcher.add_handler(ARCSIN_HANDLER)
dispatcher.add_handler(ARCTAN_HANDLER)
dispatcher.add_handler(ABS_HANDLER)
dispatcher.add_handler(LOG_HANDLER)
__mod_name__ = "Math"
__command_list__ = ["math", "factor", "derive", "integrate", "zeroes", "tangent",
                    "area", "cos", "sin", "tan", "arccos", "arcsin", "arctan",
                    "abs", "log"]
__handlers__ = [
    SIMPLIFY_HANDLER, FACTOR_HANDLER, DERIVE_HANDLER, INTEGRATE_HANDLER,
    TANGENT_HANDLER, ZEROES_HANDLER, AREA_HANDLER, COS_HANDLER, SIN_HANDLER,
    TAN_HANDLER, ARCCOS_HANDLER, ARCSIN_HANDLER, ARCTAN_HANDLER, ABS_HANDLER,
    LOG_HANDLER,
]
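A side note on the handlers above: each one calls `int(args[0])`, which rejects both decimals and the `/cos pi` example given in the help text. A hypothetical helper (not part of this module) that the commands could share for friendlier parsing:

```python
import math

# parse_number is an illustrative addition, not code from the bot above:
# it accepts named constants and decimal input, where int() would raise.
def parse_number(token):
    constants = {"pi": math.pi, "e": math.e}
    if token.lower() in constants:
        return constants[token.lower()]
    return float(token)

print(parse_number("pi"))   # 3.141592653589793
print(parse_number("2.5"))  # 2.5
```

Each command body would then read `message.reply_text(math.tan(parse_number(args[0])))` instead of converting with `int()`.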
# ===== dataset item: src/views/age_results_widget.py (RubyMarsden/Crayfish, MIT) =====
import matplotlib
from PyQt5.QtCore import Qt
from PyQt5.QtWidgets import QHBoxLayout, QDialog, QPushButton, QWidget, QVBoxLayout, QLabel

matplotlib.use('QT5Agg')
import matplotlib.pyplot as plt

from models.data_key import DataKey
from utils import ui_utils


class AgeResultsWidget(QWidget):
    def __init__(self, results_dialog):
        QWidget.__init__(self)

        self.results_dialog = results_dialog

        layout = QHBoxLayout()
        layout.addLayout(self._create_widget())
        self.setLayout(layout)

        results_dialog.sample_tree.tree.currentItemChanged.connect(lambda i, j: self.replot_graph())
        results_dialog.configuration_changed.connect(self.replot_graph)

    def _create_widget(self):
        layout = QVBoxLayout()
        layout.addWidget(QLabel("Sample and spot name"))
        layout.addWidget(self._create_age_graph_and_point_selection())
        return layout

    def _create_age_graph_and_point_selection(self):
        graph_and_points = QWidget()
        layout = QVBoxLayout()

        fig = plt.figure()
        self.axes = plt.axes()

        graph_widget, self.canvas = ui_utils.create_figure_widget(fig, self)
        layout.addWidget(graph_widget)

        graph_and_points.setLayout(layout)
        return graph_and_points

    ###############
    ### Actions ###
    ###############

    def replot_graph(self):
        current_spot = self.results_dialog.sample_tree.current_spot()
        config = self.results_dialog.configuration_widget.current_config
        if config and current_spot:
            self.plot_cps_graph(current_spot, config)

    def plot_cps_graph(self, spot, config):
        axis = self.axes
        axis.clear()
        if spot is None:
            return
        axis.spines['top'].set_visible(False)
        axis.spines['right'].set_visible(False)
        xs = []
        ys = []
        errors = []
        if DataKey.AGES not in spot.data[config]:
            # TODO plot words on graph
            return
        ages = spot.data[config][DataKey.AGES]
        if len(ages) != 0:
            for i, age in enumerate(ages):
                if isinstance(age, str):
                    continue
                x = i + 1
                y, dy = age
                xs.append(x)
                if y is None:
                    ys.append(0)
                    errors.append(0)
                else:
                    ys.append(y)
                    errors.append(dy)
        else:
            # TODO plot some text
            return
        weighted_age, age_st_dev = spot.data[config][DataKey.WEIGHTED_AGE]
        if isinstance(weighted_age, str):
            string = "No weighted age"
        else:
            string = f"Weighted age: {weighted_age:.0f}, 1σ: {age_st_dev:.0f}"
        axis.errorbar(xs, ys, yerr=errors, linestyle="none", marker='o')
        axis.text(0.5, 1, string, transform=axis.transAxes, horizontalalignment="center")
        axis.set_xlabel("Scan number")
        axis.set_ylabel("Age (ka)")
        self.canvas.draw()
# ===== dataset item: shortio/utils.py (byshyk/shortio, MIT) =====
"""Contains utility functions."""
BIN_MODE_ARGS = {'mode', 'buffering'}
TEXT_MODE_ARGS = {'mode', 'buffering', 'encoding', 'errors', 'newline'}


def split_args(args):
    """Splits args into two groups: open args and other args.

    Open args are used by ``open`` function. Other args are used by
    ``load``/``dump`` functions.

    Args:
        args: Keyword args to split.

    Returns:
        open_args: Arguments for ``open``.
        other_args: Arguments for ``load``/``dump``.
    """
    mode_args = BIN_MODE_ARGS if 'b' in args['mode'] else TEXT_MODE_ARGS

    open_args = {}
    other_args = {}

    for arg, value in args.items():
        if arg in mode_args:
            open_args[arg] = value
        else:
            other_args[arg] = value

    return open_args, other_args


def read_wrapper(load, **base_kwargs):
    """Wraps ``load`` function to avoid context manager boilerplate.

    Args:
        load: Function that takes the return of ``open``.
        **base_kwargs: Base arguments that ``open``/``load`` take.

    Returns:
        Wrapper for ``load``.
    """
    def wrapped(file, **kwargs):
        open_args, load_args = split_args({**base_kwargs, **kwargs})
        with open(file, **open_args) as f:
            return load(f, **load_args)

    return wrapped


def write_wrapper(dump, **base_kwargs):
    """Wraps ``dump`` function to avoid context manager boilerplate.

    Args:
        dump: Function that takes the return of ``open`` and data to dump.
        **base_kwargs: Base arguments that ``open``/``dump`` take.

    Returns:
        Wrapper for ``dump``.
    """
    def wrapped(file, obj, **kwargs):
        open_args, dump_args = split_args({**base_kwargs, **kwargs})
        with open(file, **open_args) as f:
            dump(obj, f, **dump_args)

    return wrapped
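A quick demonstration of the wrapper pattern: building JSON read/write helpers where open-related kwargs (`mode`) route to `open` and the rest (`indent`) route to `json.dump`. The definitions are re-stated from the module above so the snippet runs standalone.

```python
import json
import os
import tempfile

# Re-stated from shortio/utils.py so this demo is self-contained.
BIN_MODE_ARGS = {'mode', 'buffering'}
TEXT_MODE_ARGS = {'mode', 'buffering', 'encoding', 'errors', 'newline'}

def split_args(args):
    mode_args = BIN_MODE_ARGS if 'b' in args['mode'] else TEXT_MODE_ARGS
    open_args, other_args = {}, {}
    for arg, value in args.items():
        (open_args if arg in mode_args else other_args)[arg] = value
    return open_args, other_args

def read_wrapper(load, **base_kwargs):
    def wrapped(file, **kwargs):
        open_args, load_args = split_args({**base_kwargs, **kwargs})
        with open(file, **open_args) as f:
            return load(f, **load_args)
    return wrapped

def write_wrapper(dump, **base_kwargs):
    def wrapped(file, obj, **kwargs):
        open_args, dump_args = split_args({**base_kwargs, **kwargs})
        with open(file, **open_args) as f:
            dump(obj, f, **dump_args)
    return wrapped

# 'mode' goes to open(); 'indent' falls through to json.dump().
write_json = write_wrapper(json.dump, mode='w')
read_json = read_wrapper(json.load, mode='r')

path = os.path.join(tempfile.mkdtemp(), 'demo.json')
write_json(path, {'a': 1}, indent=2)
print(read_json(path))  # {'a': 1}
```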
# ===== dataset item: Chapter 10/trackbackLog.py (Miillky/automate_the_boring_stuff_with_python, MIT) =====
import traceback
try:
    raise Exception('This is the error message.')
except:
    errorFile = open('./Chapter 10/errorInfo.txt', 'w')
    errorFile.write(traceback.format_exc())
    errorFile.close()
    print('The traceback info was written to errorInfo.txt')
# ===== dataset item: lesson10019_projects/pen/data/transition.py (muzudho/py-state-machine-practice, MIT) =====
from lesson14_projects.pen.data.const import (
    A,
    E_A,
    E_AN,
    E_IS,
    E_OVER,
    E_PEN,
    E_PIN,
    E_THAT,
    E_THIS,
    E_WAS,
    INIT,
    IS,
    PEN,
    THIS,
)

pen_transition_doc_v19 = {
    "title": "This is a pen",
    "entry_state": INIT,
    "data": {
        INIT: {
            E_OVER: [INIT],
            E_THAT: [INIT],
            E_THIS: [INIT, THIS],
            THIS: {
                E_OVER: [INIT],
                E_WAS: [INIT],
                E_IS: [INIT, THIS, IS],
                IS: {
                    E_OVER: [INIT],
                    E_AN: [INIT],
                    E_A: [INIT, THIS, IS, A],
                    A: {
                        E_OVER: [INIT],
                        E_PIN: [INIT],
                        E_PEN: [PEN],
                    },
                },
            },
        },
        PEN: {
            E_OVER: None,
        },
    },
}
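The nested table above is consumed by a state-machine driver elsewhere in that repo. As a standalone sketch of one plausible lookup (plain strings stand in for the constants imported from `const`, and the event/state semantics shown here are an assumption, not that project's actual driver): follow the current state path into the nested dicts, then look up the event to get the next state path.

```python
# Stand-ins for the imported constants; names are hypothetical.
INIT, THIS, IS, A, PEN = "init", "this", "is", "a", "pen"
E_THIS, E_IS, E_A, E_PEN, E_OVER = "e_this", "e_is", "e_a", "e_pen", "e_over"

table = {
    INIT: {
        E_OVER: [INIT],
        E_THIS: [INIT, THIS],
        THIS: {
            E_IS: [INIT, THIS, IS],
            IS: {
                E_A: [INIT, THIS, IS, A],
                A: {E_PEN: [PEN]},
            },
        },
    },
    PEN: {E_OVER: None},
}

def next_states(table, path, event):
    """Descend into the nested table along `path`, then look up `event`."""
    node = table
    for state in path:
        node = node[state]
    return node.get(event)

print(next_states(table, [INIT, THIS, IS, A], E_PEN))  # ['pen']
```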
# ===== dataset item: Keywords/__init__.py (cassie01/PumpLibrary, Apache-2.0) =====
# -*- coding: utf-8 -*-
from .Alarm.alarm import Alarm
from .DeliveryView.bolus import Bolus
from .DeliveryView.info import Info
from .DeliveryView.infusion import Infusion
from .DeliveryView.infusion_parameter import InfusionParameter
from .DeliveryView.priming import Priming
from .HardwareControl.motor import Motor
from .MenuSettings.device_report import DeviceReport
from .MenuSettings.history_log import HistoryLog
from .MenuSettings.infusion_setting import InfusionSetting
from .MenuSettings.maintenance import Maintenance
from .MenuSettings.safety_setting import SafetySetting
from .MenuSettings.system_setting import SystemSetting
from .SensorControl.sensor import Sensor
__all__ = [
    "Alarm",
    "Bolus",
    "Info",
    "Infusion",
    "InfusionParameter",
    "Priming",
    "Motor",
    "DeviceReport",
    "HistoryLog",
    "InfusionSetting",
    "Maintenance",
    "SafetySetting",
    "SystemSetting",
    "Sensor",
]
# ===== dataset item: src/responsibleai/rai_analyse/constants.py (Azure/automl-devplat2-preview, MIT) =====
# ---------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# ---------------------------------------------------------


class DashboardInfo:
    MODEL_ID_KEY = "id"  # To match Model schema
    MODEL_INFO_FILENAME = "model_info.json"

    RAI_INSIGHTS_MODEL_ID_KEY = "model_id"
    RAI_INSIGHTS_RUN_ID_KEY = "rai_insights_parent_run_id"
    RAI_INSIGHTS_PARENT_FILENAME = "rai_insights.json"


class PropertyKeyValues:
    # The property to indicate the type of Run
    RAI_INSIGHTS_TYPE_KEY = "_azureml.responsibleai.rai_insights.type"
    RAI_INSIGHTS_TYPE_CONSTRUCT = "construction"
    RAI_INSIGHTS_TYPE_CAUSAL = "causal"
    RAI_INSIGHTS_TYPE_COUNTERFACTUAL = "counterfactual"
    RAI_INSIGHTS_TYPE_EXPLANATION = "explanation"
    RAI_INSIGHTS_TYPE_ERROR_ANALYSIS = "error_analysis"
    RAI_INSIGHTS_TYPE_GATHER = "gather"

    # Property to point at the model under examination
    RAI_INSIGHTS_MODEL_ID_KEY = "_azureml.responsibleai.rai_insights.model_id"

    # Property for tool runs to point at their constructor run
    RAI_INSIGHTS_CONSTRUCTOR_RUN_ID_KEY = (
        "_azureml.responsibleai.rai_insights.constructor_run"
    )

    # Property to record responsibleai version
    RAI_INSIGHTS_RESPONSIBLEAI_VERSION_KEY = (
        "_azureml.responsibleai.rai_insights.responsibleai_version"
    )

    # Property format to indicate presence of a tool
    RAI_INSIGHTS_TOOL_KEY_FORMAT = "_azureml.responsibleai.rai_insights.has_{0}"


class RAIToolType:
    CAUSAL = "causal"
    COUNTERFACTUAL = "counterfactual"
    ERROR_ANALYSIS = "error_analysis"
    EXPLANATION = "explanation"
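The `RAI_INSIGHTS_TOOL_KEY_FORMAT` template is how per-tool presence properties get their names: the tool type string is substituted into the `{0}` slot. A minimal standalone check (the two constants are re-stated from the classes above):

```python
# Re-stated from PropertyKeyValues / RAIToolType above.
RAI_INSIGHTS_TOOL_KEY_FORMAT = "_azureml.responsibleai.rai_insights.has_{0}"
CAUSAL = "causal"

# Property name a run would carry once the causal tool has been added.
print(RAI_INSIGHTS_TOOL_KEY_FORMAT.format(CAUSAL))
# _azureml.responsibleai.rai_insights.has_causal
```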
# ===== dataset item: winter/controller.py (EvgenySmekalin/winter, MIT) =====
import typing
from .core import Component

_Controller = typing.TypeVar('_Controller')
_ControllerType = typing.Type[_Controller]
ControllerFactory = typing.NewType('ControllerFactory', typing.Callable[[typing.Type], object])

_controller_factory: typing.Optional[ControllerFactory] = None


def controller(controller_class: _ControllerType) -> _ControllerType:
    Component.register(controller_class)
    return controller_class


def set_controller_factory(controller_factory: ControllerFactory) -> None:
    global _controller_factory
    _controller_factory = controller_factory


def build_controller(controller_class: _ControllerType) -> _Controller:
    if _controller_factory is None:
        return controller_class()
    return _controller_factory(controller_class)


def get_component(controller_class: _ControllerType) -> Component:
    return Component.get_by_cls(controller_class)
# ===== dataset item: libact/query_strategies/tests/test_variance_reduction.py (joequant/libact, BSD-2-Clause) =====
import unittest
from numpy.testing import assert_array_equal
import numpy as np

from libact.base.dataset import Dataset
from libact.models import LogisticRegression
from libact.query_strategies import VarianceReduction
from .utils import run_qs


class VarianceReductionTestCase(unittest.TestCase):
    """Variance reduction test case using artificial dataset"""

    def setUp(self):
        self.X = [[-2, -1], [1, 1], [-1, -2], [-1, -1], [1, 2], [2, 1]]
        self.y = [0, 1, 0, 1, 0, 1]
        self.quota = 4

    def test_variance_reduction(self):
        trn_ds = Dataset(self.X,
                         np.concatenate([self.y[:2],
                                         [None] * (len(self.y) - 2)]))
        qs = VarianceReduction(trn_ds, model=LogisticRegression(), sigma=0.1)
        qseq = run_qs(trn_ds, qs, self.y, self.quota)
        assert_array_equal(qseq, np.array([4, 5, 2, 3]))


if __name__ == '__main__':
    unittest.main()
# ===== dataset item: apps/06_lolcat_factory/you_try/PRD/cat_service.py (dparito/10Apps-Python_w-Andy, MIT) =====
import os
import shutil
import requests


def get_cat(folder, name):
    url = "http://consuming-python-services-api.azurewebsites.net/cats/random"
    data = get_data_from_url(url)
    save_image(folder, name, data)


def get_data_from_url(url):
    response = requests.get(url, stream=True)
    return response.raw


def save_image(folder, name, data):
    file_name = os.path.join(folder, name + '.jpg')
    with open(file_name, 'wb') as fout:
        shutil.copyfileobj(data, fout)
# ===== dataset item: Assignment-1/Code/server3.py (pankajk22/Computer-Networks-Assignments, MIT) =====
import socket
import csv
import traceback
import threading

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
usrpass = {}


def openfile():
    filename = "login_credentials.csv"
    with open(filename, 'r') as csvfile:
        csv_file = csv.reader(csvfile, delimiter=",")
        for col in csv_file:
            usrpass[col[0]] = col[1]
        usrpass.pop("Username")
        # print(usrpass)


ihost = socket.gethostname()
host = socket.gethostbyname(ihost)

iport = []
hostfile = "host.csv"
with open(hostfile, 'r') as host_file:
    csv_hfile = csv.reader(host_file, delimiter=",")
    for row in csv_hfile:
        iport.append(row[1])
port = int(iport[4])


def socketbind():
    try:
        s.bind(('', port))
        print("Bind with host at port number : " + str(port))
        s.listen(10)
        print("Socket is listening!!")
    except socket.error as msg:
        print("Error in Binding: " + str(msg) + "\n Retrying....")
        socketbind()


def socketaccept():
    conn, add = s.accept()
    print("connection is established with IP : " + str(add[0]) + " and Port Number : " + str(add[1]))
    conn.send(bytes("1", "utf-8"))
    conversation(conn)
    conn.close()


def conversation(conn):
    while True:
        username = str(conn.recv(1024), "utf-8")
        password = str(conn.recv(1024), "utf-8")
        res = checkpass(username, password)
        if res == 1:
            print("Valid Password!")
            conn.send(bytes("1", "utf-8"))
            conn.send(bytes("1", "utf-8"))
        else:
            conn.send(bytes("-1", "utf-8"))
            conn.send(bytes("-1", "utf-8"))


# def checkusr(username):
#     if username in usrpass:
#         return 1
#     else:
#         print("Invalid Username")
#         return -1


def checkpass(username, password):
    if usrpass[username] == password:
        return 1
    else:
        print("Invalid Password")
        return -1


def main():
    openfile()
    socketbind()
    socketaccept()
    # count = 0
    # while (count < 6):
    #     new_thread = threading.Thread(target=socketaccept)
    #     new_thread.start()
    #     count = count + 1


main()
# ===== dataset item: setup.py (UdoGi/dark-matter, MIT) =====
#!/usr/bin/env python
from setuptools import setup
# Modified from http://stackoverflow.com/questions/2058802/
# how-can-i-get-the-version-defined-in-setup-py-setuptools-in-my-package
def version():
    import os
    import re

    init = os.path.join('dark', '__init__.py')
    with open(init) as fp:
        initData = fp.read()
    match = re.search(r"^__version__ = ['\"]([^'\"]+)['\"]",
                      initData, re.M)
    if match:
        return match.group(1)
    else:
        raise RuntimeError('Unable to find version string in %r.' % init)
# Explicitly list bin scripts to be installed, seeing as I have a few local
# bin files that are not (yet) part of the distribution.
scripts = [
'bin/aa-info.py',
'bin/aa-to-dna.py',
'bin/aa-to-properties.py',
'bin/adaptor-distances.py',
'bin/alignment-panel-civ.py',
'bin/alignments-per-read.py',
'bin/bit-score-to-e-value.py',
'bin/cat-json-blast-records.py',
'bin/check-fasta-json-blast-consistency.py',
'bin/codon-distance.py',
'bin/compare-consensuses.py',
'bin/compare-sequences.py',
'bin/convert-blast-xml-to-json.py',
'bin/convert-diamond-to-json.py',
'bin/convert-diamond-to-sam.py',
'bin/convert-sam-to-fastq.sh',
'bin/create-newick-relabeling-output.py',
'bin/dark-matter-version.py',
'bin/describe-protein-database.py',
'bin/dna-to-aa.py',
'bin/download-genbank.sh',
'bin/e-value-to-bit-score.py',
'bin/extract-ORFs.py',
'bin/fasta-base-indices.py',
'bin/fasta-count.py',
'bin/fasta-diff.sh',
'bin/fasta-identity-table.py',
'bin/fasta-ids.py',
'bin/fasta-join.py',
'bin/fasta-lengths.py',
'bin/fasta-sequences.py',
'bin/fasta-sort.py',
'bin/fasta-split-by-id.py',
'bin/fasta-subset.py',
'bin/fasta-subtraction.py',
'bin/fasta-to-phylip.py',
'bin/fasta-variable-sites.py',
'bin/filter-fasta-by-complexity.py',
'bin/filter-fasta-by-taxonomy.py',
'bin/filter-fasta.py',
'bin/filter-hits-to-fasta.py',
'bin/filter-reads-alignments.py',
'bin/filter-sam.py',
'bin/find-hits.py',
'bin/format-fasta.py',
'bin/genome-protein-summary.py',
'bin/get-features.py',
'bin/get-hosts.py',
'bin/get-reads.py',
'bin/get-taxonomy.py',
'bin/graph-evalues.py',
'bin/local-align.py',
'bin/make-consensus.py',
'bin/make-fasta-database.py',
'bin/make-protein-database.py',
'bin/ncbi-fetch-id.py',
'bin/newick-to-ascii.py',
'bin/noninteractive-alignment-panel.py',
'bin/parse-genbank-flat-file.py',
'bin/position-summary.py',
'bin/pre-commit.sh',
'bin/print-blast-xml-for-derek.py',
'bin/print-blast-xml.py',
'bin/print-read-lengths.py',
'bin/proteins-to-pathogens.py',
'bin/proteins-to-pathogens-civ.py',
'bin/randomize-fasta.py',
'bin/read-blast-json.py',
'bin/read-blast-xml.py',
'bin/relabel-newick-tree.py',
'bin/run-bwa.py',
'bin/run-bowtie2.py',
'bin/sam-coverage.py',
'bin/sam-coverage-depth.py',
'bin/sam-to-fasta-alignment.py',
'bin/sam-reference-read-counts.py',
'bin/sam-references.py',
'bin/sff-to-fastq.py',
'bin/split-fasta-by-adaptors.py',
'bin/subset-protein-database.py',
'bin/summarize-fasta-bases.py',
'bin/summarize-reads.py',
'bin/trim-primers.py',
'bin/trim-reads.py',
'bin/write-htcondor-job-spec.py',
]
setup(name='dark-matter',
      version=version(),
      packages=['dark', 'dark.blast', 'dark.diamond', 'dark.civ'],
      url='https://github.com/acorg/dark-matter',
      download_url='https://github.com/acorg/dark-matter',
      author='Terry Jones, Barbara Muehlemann, Tali Veith, Sophie Mathias',
      author_email='tcj25@cam.ac.uk',
      keywords=['virus discovery'],
      classifiers=[
          'Programming Language :: Python :: 2.7',
          'Programming Language :: Python :: 3',
          'Development Status :: 4 - Beta',
          'Intended Audience :: Developers',
          'License :: OSI Approved :: MIT License',
          'Operating System :: OS Independent',
          'Topic :: Software Development :: Libraries :: Python Modules',
      ],
      license='MIT',
      description='Python classes for working with genetic sequence data',
      scripts=scripts,
      install_requires=[
          'biopython>=1.71',
          'bz2file>=0.98',
          'Cython>=0.29.16',
          'ipython>=3.1.0',
          'matplotlib>=1.4.3',
          'mysql-connector-python==8.0.11',
          'numpy>=1.14.2',
          'pysam>=0.15.2',
          'pyfaidx>=0.4.8.4',
          'pyzmq>=14.3.1',
          'requests>=2.18.4',
          'cachetools>=3.1.0',
          'simplejson>=3.5.3',
          'six>=1.11.0',
      ])
# ===== dataset item: {{ cookiecutter.repo_name }}/tests/test_environment.py (FrancisMudavanhu/cookiecutter-data-science, MIT) =====
import sys
REQUIRED_PYTHON = "python3"
required_major = 3


def main():
    system_major = sys.version_info.major
    if system_major != required_major:
        raise TypeError(
            f"This project requires Python {required_major}."
            f" Found: Python {sys.version}")
    else:
        print(">>> Development environment passes all tests!")


if __name__ == '__main__':
    main()
# ===== dataset item: module/classification_package/src/utils.py (fishial/Object-Detection-Model, CC0-1.0) =====
import numpy as np
import logging
import numbers
import torch
import math
import json
import sys

from torch.optim.lr_scheduler import LambdaLR
from torchvision.transforms.functional import pad


class AverageMeter(object):
    """Computes and stores the average and current value"""

    def __init__(self):
        self.reset()

    def reset(self):
        self.val = 0
        self.avg = 0
        self.sum = 0
        self.count = 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n
        self.avg = self.sum / self.count
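Typical training-loop usage of `AverageMeter`: pass the batch size as `n` so `avg` becomes a sample-weighted running mean rather than a mean of batch means. A compact restatement so the demo runs standalone:

```python
# Compact restatement of AverageMeter from above.
class AverageMeter:
    def __init__(self):
        self.val = self.avg = self.sum = self.count = 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n
        self.avg = self.sum / self.count

meter = AverageMeter()
# (loss, batch_size) pairs; the last batch is smaller.
for batch_loss, batch_size in [(0.9, 32), (0.7, 32), (0.5, 16)]:
    meter.update(batch_loss, n=batch_size)

print(round(meter.avg, 4))  # 0.74 — weighted by batch size, not 0.7
```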
class ConstantLRSchedule(LambdaLR):
    """ Constant learning rate schedule.
    """
    def __init__(self, optimizer, last_epoch=-1):
        super(ConstantLRSchedule, self).__init__(optimizer, lambda _: 1.0, last_epoch=last_epoch)


class WarmupConstantSchedule(LambdaLR):
    """ Linear warmup and then constant.
        Linearly increases learning rate schedule from 0 to 1 over `warmup_steps` training steps.
        Keeps learning rate schedule equal to 1. after warmup_steps.
    """
    def __init__(self, optimizer, warmup_steps, last_epoch=-1):
        self.warmup_steps = warmup_steps
        super(WarmupConstantSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)

    def lr_lambda(self, step):
        if step < self.warmup_steps:
            return float(step) / float(max(1.0, self.warmup_steps))
        return 1.
class WarmupLinearSchedule(LambdaLR):
    """ Linear warmup and then linear decay.
        Linearly increases learning rate from 0 to 1 over `warmup_steps` training steps.
        Linearly decreases learning rate from 1. to 0. over remaining `t_total - warmup_steps` steps.
    """
    def __init__(self, optimizer, warmup_steps, t_total, last_epoch=-1):
        self.warmup_steps = warmup_steps
        self.t_total = t_total
        super(WarmupLinearSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)

    def lr_lambda(self, step):
        if step < self.warmup_steps:
            return float(step) / float(max(1, self.warmup_steps))
        return max(0.0, float(self.t_total - step) / float(max(1.0, self.t_total - self.warmup_steps)))
class WarmupCosineSchedule(LambdaLR):
    """ Linear warmup and then cosine decay.
    Linearly increases learning rate from 0 to 1 over `warmup_steps` training steps.
    Decreases learning rate from 1. to 0. over the remaining `t_total - warmup_steps` steps following a cosine curve.
    With the default `cycles` of 0.5 the decay traces half a cosine wave; other values trace that many full cosine cycles after warmup.
    """
    def __init__(self, optimizer, warmup_steps, t_total, cycles=.5, last_epoch=-1):
        self.warmup_steps = warmup_steps
        self.t_total = t_total
        self.cycles = cycles
        super(WarmupCosineSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)

    def lr_lambda(self, step):
        if step < self.warmup_steps:
            return float(step) / float(max(1.0, self.warmup_steps))
        # progress after warmup
        progress = float(step - self.warmup_steps) / float(max(1, self.t_total - self.warmup_steps))
        return max(0.0, 0.5 * (1. + math.cos(math.pi * float(self.cycles) * 2.0 * progress)))
def get_padding(image):
    w, h = image.size
    max_wh = np.max([w, h])
    h_padding = (max_wh - w) / 2
    v_padding = (max_wh - h) / 2
    l_pad = h_padding if h_padding % 1 == 0 else h_padding + 0.5
    t_pad = v_padding if v_padding % 1 == 0 else v_padding + 0.5
    r_pad = h_padding if h_padding % 1 == 0 else h_padding - 0.5
    b_pad = v_padding if v_padding % 1 == 0 else v_padding - 0.5
    padding = (int(l_pad), int(t_pad), int(r_pad), int(b_pad))
    return padding
class NewPad(object):
    def __init__(self, fill=0, padding_mode='constant'):
        assert isinstance(fill, (numbers.Number, str, tuple))
        assert padding_mode in ['constant', 'edge', 'reflect', 'symmetric']
        self.fill = fill
        self.padding_mode = padding_mode

    def __call__(self, img):
        """
        Args:
            img (PIL Image): Image to be padded.
        Returns:
            PIL Image: Padded image.
        """
        return pad(img, get_padding(img), self.fill, self.padding_mode)

    def __repr__(self):
        return self.__class__.__name__ + '(fill={0}, padding_mode={1})'. \
            format(self.fill, self.padding_mode)
def find_device():
    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
    return device


def read_json(data):
    with open(data) as f:
        return json.load(f)


def save_json(data, path):
    with open(path, 'w', encoding='utf-8') as f:
        json.dump(data, f)


def setup_logger():
    logger = logging.getLogger('train')
    logger.setLevel(logging.INFO)
    if len(logger.handlers) == 0:
        formatter = logging.Formatter('%(asctime)s | %(message)s')
        ch = logging.StreamHandler(stream=sys.stdout)
        ch.setFormatter(formatter)
        logger.addHandler(ch)
    return logger


def adjust_learning_rate(optimizer, epoch, lr):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs"""
    lr = lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr


def save_checkpoint(model, path):
    torch.save(model.state_dict(), path)
def reverse_norm_image(image):
    MEAN = torch.tensor([0.485, 0.456, 0.406])
    STD = torch.tensor([0.229, 0.224, 0.225])
    reverse_image = image * STD[:, None, None] + MEAN[:, None, None]
    return reverse_image.permute(1, 2, 0).cpu().numpy() | 33.761905 | 117 | 0.653738 | 799 | 5,672 | 4.439299 | 0.239049 | 0.074429 | 0.050747 | 0.035523 | 0.366789 | 0.343953 | 0.315478 | 0.290668 | 0.259656 | 0.249507 | 0 | 0.023662 | 0.232546 | 5,672 | 168 | 118 | 33.761905 | 0.791179 | 0.175599 | 0 | 0.125 | 0 | 0 | 0.027409 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 1 | 0.201923 | false | 0 | 0.086538 | 0.009615 | 0.471154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
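The warmup-cosine factor computed by `WarmupCosineSchedule.lr_lambda` above can be sanity-checked standalone. A minimal sketch that mirrors the same arithmetic without requiring `torch`; the function name here is illustrative, not part of the original module:

```python
import math


def warmup_cosine_factor(step, warmup_steps, t_total, cycles=0.5):
    # Linear warmup from 0 to 1, then cosine decay from 1 to 0,
    # mirroring WarmupCosineSchedule.lr_lambda term by term.
    if step < warmup_steps:
        return float(step) / float(max(1.0, warmup_steps))
    progress = float(step - warmup_steps) / float(max(1, t_total - warmup_steps))
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * float(cycles) * 2.0 * progress)))


print(warmup_cosine_factor(0, 10, 100))   # 0.0 (start of warmup)
print(warmup_cosine_factor(10, 10, 100))  # 1.0 (end of warmup)
```

With the default `cycles=0.5`, the factor falls monotonically from 1 at the end of warmup to 0 at `t_total`, which is the half-cosine-wave case described in the docstring.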
f8d470d1980749c03e842d69c111ae8c0604cde9 | 992 | py | Python | tests/pylint_plugins/test_assert_raises_without_msg.py | L-Net-1992/mlflow | a90574dbb730935c815ff41a0660b9a823b81630 | [
"Apache-2.0"
] | null | null | null | tests/pylint_plugins/test_assert_raises_without_msg.py | L-Net-1992/mlflow | a90574dbb730935c815ff41a0660b9a823b81630 | [
"Apache-2.0"
] | null | null | null | tests/pylint_plugins/test_assert_raises_without_msg.py | L-Net-1992/mlflow | a90574dbb730935c815ff41a0660b9a823b81630 | [
"Apache-2.0"
] | null | null | null | import pytest
from tests.pylint_plugins.utils import create_message, extract_node, skip_if_pylint_unavailable

pytestmark = skip_if_pylint_unavailable()


@pytest.fixture(scope="module")
def test_case():
    import pylint.testutils
    from pylint_plugins import AssertRaisesWithoutMsg

    class TestAssertRaisesWithoutMsg(pylint.testutils.CheckerTestCase):
        CHECKER_CLASS = AssertRaisesWithoutMsg

    test_case = TestAssertRaisesWithoutMsg()
    test_case.setup_method()
    return test_case


def test_assert_raises_without_msg(test_case):
    node = extract_node("self.assertRaises(Exception)")
    with test_case.assertAddsMessages(create_message(test_case.CHECKER_CLASS.name, node)):
        test_case.walk(node)

    node = extract_node("self.assertRaises(Exception, msg='test')")
    with test_case.assertNoMessages():
        test_case.walk(node)

    node = extract_node("pandas.assertRaises(Exception)")
    with test_case.assertNoMessages():
        test_case.walk(node)
| 30.060606 | 95 | 0.768145 | 114 | 992 | 6.394737 | 0.377193 | 0.131687 | 0.061728 | 0.065844 | 0.318244 | 0.272977 | 0.183813 | 0.120713 | 0 | 0 | 0 | 0 | 0.144153 | 992 | 32 | 96 | 31 | 0.858657 | 0 | 0 | 0.227273 | 0 | 0 | 0.104839 | 0.086694 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.090909 | false | 0 | 0.181818 | 0 | 0.409091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f8d6b09688dbea2ed0259d01f1aa0504d9acbfdc | 821 | py | Python | bites/bite029.py | ChidinmaKO/Chobe-bitesofpy | 2f933e6c8877a37d1ce7ef54ea22169fc67417d3 | [
"MIT"
] | null | null | null | bites/bite029.py | ChidinmaKO/Chobe-bitesofpy | 2f933e6c8877a37d1ce7ef54ea22169fc67417d3 | [
"MIT"
] | null | null | null | bites/bite029.py | ChidinmaKO/Chobe-bitesofpy | 2f933e6c8877a37d1ce7ef54ea22169fc67417d3 | [
"MIT"
] | 1 | 2019-07-16T19:12:52.000Z | 2019-07-16T19:12:52.000Z | def get_index_different_char(chars):
    alnum = []
    not_alnum = []
    for index, char in enumerate(chars):
        if str(char).isalnum():
            alnum.append(index)
        else:
            not_alnum.append(index)
    result = alnum[0] if len(alnum) < len(not_alnum) else not_alnum[0]
    return result


# tests
def test_wrong_char():
    inputs = (
        ['A', 'f', '.', 'Q', 2],
        ['.', '{', ' ^', '%', 'a'],
        [1, '=', 3, 4, 5, 'A', 'b', 'a', 'b', 'c'],
        ['=', '=', '', '/', '/', 9, ':', ';', '?', '¡'],
        list(range(1, 9)) + ['}'] + list('abcde'),  # noqa E231
    )
    expected = [2, 4, 1, 5, 8]
    for arg, exp in zip(inputs, expected):
        err = f'get_index_different_char({arg}) should return index {exp}'
        assert get_index_different_char(arg) == exp, err | 30.407407 | 74 | 0.478685 | 104 | 821 | 3.644231 | 0.471154 | 0.084433 | 0.134565 | 0.166227 | 0.126649 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031034 | 0.293544 | 821 | 27 | 75 | 30.407407 | 0.62069 | 0.01827 | 0 | 0 | 0 | 0 | 0.108209 | 0.038557 | 0 | 0 | 0 | 0 | 0.045455 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
f8da2f02f4840468e37f0eba92152ef522fab6ae | 2,589 | py | Python | source/tree.py | holderekt/regression-tree | 130fe07262faea8681159092718310d9aefe9889 | [
"MIT"
] | null | null | null | source/tree.py | holderekt/regression-tree | 130fe07262faea8681159092718310d9aefe9889 | [
"MIT"
] | null | null | null | source/tree.py | holderekt/regression-tree | 130fe07262faea8681159092718310d9aefe9889 | [
"MIT"
] | null | null | null | import utils as utl
import error_measures as err


# Regression Tree Node
class Node:
    def __init__(self, parent, node_id, index=None, value=None, examples=None, prediction=0):
        self.index = index
        self.id = node_id
        self.prediction = prediction
        self.value = value
        self.parent = parent
        self.examples = examples
        self.right = None
        self.left = None
        self.ssr = 0
        self.leaves = 0
        self.ssr_as_root = 0

    def is_leaf(self):
        if self.right is None and self.left is None:
            return True
        return False

    def leafs_id(self):
        if not self.is_leaf():
            return self._leafs_search(self.left) + self._leafs_search(self.right)
        return [self.id]

    def n_leafs(self):
        return len(self.leafs_id())

    def _leafs_search(self, node):
        if node.is_leaf():
            return [node.id]
        return self._leafs_search(node.left) + self._leafs_search(node.right)

    def __str__(self):
        return str(self.id)


# Regression Tree
class Regression_Tree:
    def __init__(self, y_train, root):
        self.y = y_train
        self.root = root

    # Generate prediction given a test example
    def predict(self, example, deleted=[]):
        current_node = self.root
        while not current_node.is_leaf() and current_node not in deleted:
            if example[current_node.index] <= current_node.value:
                current_node = current_node.left
            else:
                current_node = current_node.right
        return current_node.prediction

    # Generate sum of squared residuals of a given node on training data
    def node_ssr(self, node):
        ssr = 0
        for example in node.examples:
            ssr = ssr + pow((self.y[example] - node.prediction), 2)
        return ssr

    def leafs_id(self):
        return self.root.leafs_id()

    def n_leafs(self):
        return len(self.leafs_id())

    def __str__(self):
        return self._print(self.root)

    def print_leaf(self, node):
        if node.is_leaf():
            print(len(node.examples))
        else:
            self.print_leaf(node.left)
            self.print_leaf(node.right)

    def _print(self, node):
        node_id = str(node.id)
        r_string = node_id + " " + str(node.ssr)
        if not node.is_leaf():
            r_string = r_string + "\nLeft : " + node_id + "\n" + self._print(node.left)
            r_string = r_string + "\nRight: " + node_id + "\n" + self._print(node.right)
        return r_string
| 29.758621 | 93 | 0.588644 | 341 | 2,589 | 4.249267 | 0.202346 | 0.075914 | 0.041408 | 0.019324 | 0.1049 | 0.1049 | 0.049689 | 0.049689 | 0.049689 | 0.049689 | 0 | 0.003898 | 0.306296 | 2,589 | 86 | 94 | 30.104651 | 0.802895 | 0.054075 | 0 | 0.149254 | 0 | 0 | 0.009411 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208955 | false | 0 | 0.029851 | 0.074627 | 0.477612 | 0.119403 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f8de8fc01b4a4af13fb95b42532f7a7fe7198cd6 | 225 | py | Python | loadbalanceRL/lib/__init__.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 11 | 2018-10-29T06:50:43.000Z | 2022-03-28T14:26:09.000Z | loadbalanceRL/lib/__init__.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 1 | 2022-03-01T13:46:25.000Z | 2022-03-01T13:46:25.000Z | loadbalanceRL/lib/__init__.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 6 | 2019-02-05T20:01:53.000Z | 2020-09-04T12:30:00.000Z | #! /usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Contains core logic for Rainman2
"""
__author__ = 'Ari Saha (arisaha@icloud.com), Mingyang Liu(liux3941@umn.edu)'
__date__ = 'Wednesday, February 14th 2018, 11:42:09 am'
| 20.454545 | 76 | 0.68 | 32 | 225 | 4.53125 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098958 | 0.146667 | 225 | 10 | 77 | 22.5 | 0.65625 | 0.342222 | 0 | 0 | 0 | 0 | 0.741007 | 0.302158 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f8dfe184dbac3633e171f2ced9f8b35d7607d947 | 717 | py | Python | openff/bespokefit/__init__.py | openforcefield/bespoke-f | 27b072bd09610dc8209429118d739e1f453edd61 | [
"MIT"
] | 12 | 2020-08-28T20:49:00.000Z | 2021-11-17T08:50:32.000Z | openff/bespokefit/__init__.py | openforcefield/bespoke-f | 27b072bd09610dc8209429118d739e1f453edd61 | [
"MIT"
] | 95 | 2020-02-19T18:40:54.000Z | 2021-12-02T10:52:23.000Z | openff/bespokefit/__init__.py | openforcefield/bespoke-f | 27b072bd09610dc8209429118d739e1f453edd61 | [
"MIT"
] | 3 | 2021-04-01T04:22:49.000Z | 2021-04-13T03:19:10.000Z | """
BespokeFit
Creating bespoke parameters for individual molecules.
"""
import logging
import sys
from ._version import get_versions
versions = get_versions()
__version__ = versions["version"]
__git_revision__ = versions["full-revisionid"]
del get_versions, versions
# Silence verbose messages when running the CLI otherwise you can't read the output
# without seeing tens of 'Unable to load AmberTools' or don't import simtk warnings...
if sys.argv[0].endswith("openff-bespoke"):
    from openff.bespokefit.utilities.logging import DeprecationWarningFilter

    logging.getLogger("openff.toolkit").setLevel(logging.ERROR)
    logging.getLogger().addFilter(DeprecationWarningFilter())
| 28.68 | 86 | 0.781032 | 87 | 717 | 6.287356 | 0.643678 | 0.060329 | 0.06947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001595 | 0.125523 | 717 | 24 | 87 | 29.875 | 0.870813 | 0.351464 | 0 | 0 | 0 | 0 | 0.10989 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
f8e61d9aa8b9610c3339494d4c960ec17ee4ba35 | 286 | py | Python | src_py/ui/identify_page.py | Magier/Aetia | 7f6045d99904b808e1201f445d0d10b0dce54c37 | [
"MIT"
] | null | null | null | src_py/ui/identify_page.py | Magier/Aetia | 7f6045d99904b808e1201f445d0d10b0dce54c37 | [
"MIT"
] | null | null | null | src_py/ui/identify_page.py | Magier/Aetia | 7f6045d99904b808e1201f445d0d10b0dce54c37 | [
"MIT"
] | null | null | null | import streamlit as st
from ui.session_state import SessionState, get_state
from infer import ModelStage
def show(state: SessionState):
    st.header("identify")
    state = get_state()
    if state.model.stage < ModelStage.DEFINED:
        st.error("Please create the model first!")
| 26 | 52 | 0.734266 | 39 | 286 | 5.307692 | 0.641026 | 0.077295 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178322 | 286 | 10 | 53 | 28.6 | 0.880851 | 0 | 0 | 0 | 0 | 0 | 0.132867 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
f8ed0d2649220a6a4bd9e78f42580892fbc06d4f | 288 | py | Python | stdlib/csv/custom_dialect.py | janbodnar/Python-Course | 51705ab5a2adef52bcdb99a800e94c0d67144a38 | [
"BSD-2-Clause"
] | 13 | 2017-08-22T12:26:07.000Z | 2021-07-29T16:13:50.000Z | stdlib/csv/custom_dialect.py | janbodnar/Python-Course | 51705ab5a2adef52bcdb99a800e94c0d67144a38 | [
"BSD-2-Clause"
] | 1 | 2021-02-08T10:24:33.000Z | 2021-02-08T10:24:33.000Z | stdlib/csv/custom_dialect.py | janbodnar/Python-Course | 51705ab5a2adef52bcdb99a800e94c0d67144a38 | [
"BSD-2-Clause"
] | 17 | 2018-08-13T11:10:33.000Z | 2021-07-29T16:14:02.000Z | #!/usr/bin/python
# custom_dialect.py
import csv
csv.register_dialect("hashes", delimiter="#")
f = open('items3.csv', 'w')
with f:
    writer = csv.writer(f, dialect="hashes")
    writer.writerow(("pencils", 2))
    writer.writerow(("plates", 1))
    writer.writerow(("books", 4))
| 16.941176 | 45 | 0.635417 | 38 | 288 | 4.763158 | 0.631579 | 0.232044 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016529 | 0.159722 | 288 | 16 | 46 | 18 | 0.731405 | 0.118056 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
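A quick round-trip check of the same `hashes` dialect registered in custom_dialect.py above, using an in-memory buffer instead of `items3.csv` (the buffer is only for illustration):

```python
import csv
import io

# Same custom dialect as in custom_dialect.py: '#' as the field delimiter.
csv.register_dialect("hashes", delimiter="#")

buf = io.StringIO()
csv.writer(buf, dialect="hashes").writerow(("pencils", 2))
buf.seek(0)
row = next(csv.reader(buf, dialect="hashes"))
print(row)  # ['pencils', '2']
```

Reading back with the same dialect recovers the fields; note that csv always yields strings, so the integer 2 comes back as '2'.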
f8f25cd96d67041f861381dbd21810aa553cccdc | 883 | py | Python | tests/assets/test_driver_errors.py | CyrilLeMat/modelkit | 2150ffe78ebb00e3302dac36ccb09e66becd5130 | [
"MIT"
] | null | null | null | tests/assets/test_driver_errors.py | CyrilLeMat/modelkit | 2150ffe78ebb00e3302dac36ccb09e66becd5130 | [
"MIT"
] | null | null | null | tests/assets/test_driver_errors.py | CyrilLeMat/modelkit | 2150ffe78ebb00e3302dac36ccb09e66becd5130 | [
"MIT"
] | null | null | null | import os
import pytest
from modelkit.assets import errors
from tests.conftest import skip_unless
def _perform_driver_error_object_not_found(driver):
    with pytest.raises(errors.ObjectDoesNotExistError):
        driver.download_object("someasset", "somedestination")
    assert not os.path.isfile("somedestination")


def test_local_driver(local_assetsmanager):
    local_driver = local_assetsmanager.remote_assets_store.driver
    _perform_driver_error_object_not_found(local_driver)


@skip_unless("ENABLE_GCS_TEST", "True")
def test_gcs_driver(gcs_assetsmanager):
    gcs_driver = gcs_assetsmanager.remote_assets_store.driver
    _perform_driver_error_object_not_found(gcs_driver)


@skip_unless("ENABLE_S3_TEST", "True")
def test_s3_driver(s3_assetsmanager):
    s3_driver = s3_assetsmanager.remote_assets_store.driver
    _perform_driver_error_object_not_found(s3_driver)
| 29.433333 | 65 | 0.822197 | 117 | 883 | 5.726496 | 0.307692 | 0.077612 | 0.107463 | 0.143284 | 0.352239 | 0.352239 | 0.304478 | 0.304478 | 0.304478 | 0.304478 | 0 | 0.007595 | 0.105323 | 883 | 29 | 66 | 30.448276 | 0.840506 | 0 | 0 | 0 | 0 | 0 | 0.08607 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 1 | 0.210526 | false | 0 | 0.210526 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f8f4dcd9fb78ee1924b9f50173ac949a710abcfd | 3,190 | py | Python | testcases/school_bus.py | wilsonsuen/av-testing | a6967b4cb4e4ad6b10d041ffd3dc62188fccad81 | [
"MIT"
] | null | null | null | testcases/school_bus.py | wilsonsuen/av-testing | a6967b4cb4e4ad6b10d041ffd3dc62188fccad81 | [
"MIT"
] | null | null | null | testcases/school_bus.py | wilsonsuen/av-testing | a6967b4cb4e4ad6b10d041ffd3dc62188fccad81 | [
"MIT"
] | null | null | null | import sys
import os
import glob
import json
from robot import rebot
from robot.api import TestSuite
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
if __name__ == "__main__":
    main_suite = TestSuite('School Bus Scenario')
    main_suite.resource.imports.library('lib/simulation.py')
    testcase_paths = glob.glob('data/testdata/04_school_bus/*.json')
    testcase_paths.sort()
    for testcase_path in testcase_paths[110:113]:
        with open(testcase_path) as f:
            testdata = json.load(f)
        tags = list(testdata['testcase']['context'].values()) + \
            list(testdata['testcase']['input'].values())
        school_bus_test = main_suite.tests.create(testdata['testcase']['name'], tags=tags)
        school_bus_test.setup.config(name='Setup Scenario', args=[testcase_path])
        school_bus_test.body.create_keyword('Start Simulation')
        school_bus_test.body.create_keyword('Validate Result')
        school_bus_test.teardown.config(name='Test Case Teardown')
    main_suite.run(output='results/04_school_bus/output.xml')
    rebot('results/04_school_bus/output.xml',
          log="results/04_school_bus/log.html",
          report="results/04_school_bus/report.html")
"""
rebot --tagstatcombine "8:00AMANDSunny:8AM and Sunny(C1)" --tagstatcombine "8:00AMANDCloudy:8AM and Cloudy(C2)" --tagstatcombine "8:00AMANDRainning:8AM and Rainning(C3)" --tagstatcombine "8:00AMANDFoggy:8AM and Foggy(C4)" --tagstatcombine "12:00PMANDSunny:12PM and Sunny(C5)" --tagstatcombine "12:00PMANDCloudy:12PM and Cloudy(C6)" --tagstatcombine "12:00PMANDRainning:12PM and Rainning(C7)" --tagstatcombine "12:00PMANDFoggy:12PM and Foggy(C8)" --tagstatcombine "3:00PMANDSunny:3PM and Sunny(C9)" --tagstatcombine "3:00PMANDCloudy:3PM and Cloudy(C10)" --tagstatcombine "3:00PMANDRainning:3PM and Rainning(C11)" --tagstatcombine "3:00PMANDFoggy:3PM and Foggy(C12)" --tagstatcombine "5:00PMANDSunny:5PM and Sunny(C13)" --tagstatcombine "5:00PMANDCloudy:5PM and Cloudy(C14)" --tagstatcombine "5:00PMANDRainning:5PM and Ranining(C15)" --tagstatcombine "5:00PMANDFoggy:5PM and Foggy(C16)" --tagstatcombine "7:00PMANDSunny:7PM and Sunny(C17)" --tagstatcombine "7:00PMANDCloudy:7PM and Cloudy(C18)" --tagstatcombine "7:00PMANDRainning:7PM and Rainning(C19)" --tagstatcombine "7:00PMANDFoggy:7PM and Foggy(C20)" --tagstatcombine MovingANDBackward_lane:Moving\ and\ Backward\ lane\(I12\) --tagstatcombine MovingANDForward_lane:Moving\ and\ Forward\ lane\(I9\) --tagstatcombine LoadingANDBackward_lane:Loading\ and\ Backward\ lane\(I6\) --tagstatcombine LoadingANDForward_lane:Loading\ and\ Forward\ lane\(I3\) --tagstatcombine StopANDBackward_lane:Stop\ and\ Backward\ lane\(I18\) --tagstatcombine StopANDForward_lane:Stop\ and\ Forward\ lane\(I15\) --tagstatexclude Forward_lane --tagstatexclude Backward_lane --tagstatexclude Moving --tagstatexclude Loading --tagstatexclude Stop --tagstatexclude 8\:00AM --tagstatexclude 12\:00PM --tagstatexclude 3\:00PM --tagstatexclude 5\:00PM --tagstatexclude 7\:00PM --tagstatexclude Sunny --tagstatexclude Foggy --tagstatexclude Rainning --tagstatexclude Cloudy -r combined_report.html -l combined_log.html output.xml
""" | 91.142857 | 1,951 | 0.754232 | 399 | 3,190 | 5.894737 | 0.343358 | 0.042092 | 0.023384 | 0.030612 | 0.048469 | 0.048469 | 0 | 0 | 0 | 0 | 0 | 0.056318 | 0.109404 | 3,190 | 35 | 1,952 | 91.142857 | 0.771559 | 0 | 0 | 0 | 0 | 0 | 0.251623 | 0.130682 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.269231 | 0 | 0.269231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f8f65ce2aa90b1532e983805cc84833de1433b1e | 1,316 | py | Python | Python38/Lib/site-packages/PyInstaller/hooks/hook-PyQt4.py | AXFS-H/Windows10Debloater | ab5f8a8a8fb065bb40b7ddbd1df75563d8b8d13e | [
"MIT"
] | 5 | 2020-08-24T23:29:58.000Z | 2022-02-07T19:58:07.000Z | PyInstaller/hooks/hook-PyQt4.py | jeremysanders/pyinstaller | 321b24f9a9a5978337735816b36ca6b4a90a2fb4 | [
"Apache-2.0"
] | 12 | 2020-02-15T04:04:55.000Z | 2022-02-18T20:29:49.000Z | PyInstaller/hooks/hook-PyQt4.py | jeremysanders/pyinstaller | 321b24f9a9a5978337735816b36ca6b4a90a2fb4 | [
"Apache-2.0"
] | 2 | 2020-08-24T23:30:06.000Z | 2021-12-23T18:23:38.000Z | #-----------------------------------------------------------------------------
# Copyright (c) 2013-2020, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
import os
from PyInstaller.utils.hooks import qt_menu_nib_dir
from PyInstaller.compat import getsitepackages, is_darwin, is_win
# On Windows system PATH has to be extended to point to the PyQt4 directory.
# The PySide directory contains Qt dlls. We need to avoid including different
# version of Qt libraries when there is installed another application (e.g. QtCreator)
if is_win:
    from PyInstaller.utils.win32.winutils import extend_system_path
    extend_system_path([os.path.join(x, 'PyQt4') for x in getsitepackages()])
hiddenimports = ['sip']
# For Qt to work on Mac OS X it is necessary to include directory qt_menu.nib.
# This directory contains some resource files necessary to run PyQt or PySide
# app.
if is_darwin:
    datas = [
        (qt_menu_nib_dir('PyQt4'), 'qt_menu.nib'),
    ]
| 35.567568 | 86 | 0.670213 | 178 | 1,316 | 4.865169 | 0.550562 | 0.027714 | 0.04157 | 0.027714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014235 | 0.145897 | 1,316 | 36 | 87 | 36.555556 | 0.756228 | 0.669453 | 0 | 0 | 0 | 0 | 0.057279 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.454545 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
f8f83df34dfaf5ae52ea9e532bb035a4e1cce478 | 825 | py | Python | ex085.py | EduotavioFonseca/ProgramasPython | 8e0ef5f6f4239d1fe52321f8795b6573f6ff5130 | [
"MIT"
] | null | null | null | ex085.py | EduotavioFonseca/ProgramasPython | 8e0ef5f6f4239d1fe52321f8795b6573f6ff5130 | [
"MIT"
] | null | null | null | ex085.py | EduotavioFonseca/ProgramasPython | 8e0ef5f6f4239d1fe52321f8795b6573f6ff5130 | [
"MIT"
] | null | null | null | # List inside a dictionary
campeonato = dict()
gol = []
aux = 0
campeonato['Jogador'] = str(input('Enter the player name: '))
print()
partidas = int(input('How many matches did he play? '))
print()
for i in range(0, partidas):
    aux = int(input(f'How many goals in match {i + 1}? '))
    gol.append(aux)
    print()
campeonato['Gols'] = gol[:]
campeonato['Total'] = sum(gol)
print('=' * 55)
print()
print(campeonato)
print()
print('=' * 55)
print()
for k, v in campeonato.items():
    print(f'The field {k} has the value: {v}')
    print()
print('=' * 55)
print(f'The player {campeonato["Jogador"]} played {partidas} matches.')
print()
for i in range(0, partidas):
    print(f'In match {i + 1} he scored {gol[i]} goal(s).')
    print()
print(f'In total he scored {campeonato["Total"]} goals.')
print('=' * 55)
| 25.78125 | 71 | 0.613333 | 121 | 825 | 4.181818 | 0.371901 | 0.055336 | 0.071146 | 0.043478 | 0.098814 | 0.098814 | 0.098814 | 0 | 0 | 0 | 0 | 0.01949 | 0.191515 | 825 | 31 | 72 | 26.612903 | 0.73913 | 0.031515 | 0 | 0.5 | 0 | 0 | 0.368146 | 0.057441 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.6 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
5d15eae6d6d420d8166df518e95a6f5df2ba41f1 | 2,619 | py | Python | main.py | showtimesynergy/mojify | 8c012730b9f56d6e7e2003e8db99669516f4e027 | [
"BSD-2-Clause"
] | null | null | null | main.py | showtimesynergy/mojify | 8c012730b9f56d6e7e2003e8db99669516f4e027 | [
"BSD-2-Clause"
] | null | null | null | main.py | showtimesynergy/mojify | 8c012730b9f56d6e7e2003e8db99669516f4e027 | [
"BSD-2-Clause"
] | null | null | null | from PIL import Image
import csv
from ast import literal_eval as make_tuple
from math import sqrt
import argparse
import os.path
def load_img(image):
    # load an image as a PIL object
    im = Image.open(image).convert('RGBA')
    return im


def color_distance(c_tuple1, c_tuple2):
    # calculate the color distance between two rgb tuples
    red_mean = (c_tuple1[0] + c_tuple2[0]) / 2
    red = c_tuple1[0] - c_tuple2[0]
    green = c_tuple1[1] - c_tuple2[1]
    blue = c_tuple1[2] - c_tuple2[2]
    delta = (2 + (red_mean / 256)) * (red ** 2)
    delta += (4 * (green ** 2))
    delta += (2 + ((255 - red_mean) / 256)) * (blue ** 2)
    delta = sqrt(delta)
    return delta


def write_out(text_matrix):
    # write out emoji grid to txt file
    with open('out.txt', '+w', encoding='utf-8') as out:
        for line in text_matrix:
            line_out = ''
            for char in line:
                # TODO: ZWJ support
                if char is None:
                    line_out += '\u2001\u2006'
                else:
                    char_code = '0x' + char
                    char_code = int(char_code, base=16)
                    line_out += chr(char_code)
            out.writelines(line_out + '\n')


def gen_matrix(pix_data):
    # generate unicode data from colors
    pix = pix_data.load()
    emoji_grid = []
    for y in range(0, size[1]):
        emoji_grid.append([])
        for x in range(0, size[0]):
            pixel = pix[x, y]
            best_delta = float('Inf')
            for entry in emoji_list:
                emoji_color = entry[1]
                if pixel[3] == 0:
                    best = None
                else:
                    delta = color_distance(emoji_color, pixel)
                    if delta < best_delta:
                        best = entry[0]
                        best_delta = delta
            emoji_grid[-1].append(best)
    return emoji_grid


def handle_arguments():
    parser = argparse.ArgumentParser(
        description='Represent an image using emoji'
    )
    parser.add_argument('image', help='image to be processed')
    args = parser.parse_args()
    return args


if __name__ == '__main__':
    args = handle_arguments()
    path = args.image
    emoji_list = []
    with open('proc.csv') as raw_list:
        reader = csv.reader(raw_list)
        raw_list = list(reader)
        for entry in raw_list:
            emoji_list.append([entry[0], make_tuple(entry[1])])
    image = load_img(path)
    size = image.size
    emoji_grid = gen_matrix(image)
    write_out(emoji_grid)
    print('Output in out.txt')
| 29.426966 | 63 | 0.557083 | 349 | 2,619 | 3.988539 | 0.352436 | 0.045259 | 0.011494 | 0.012931 | 0.022989 | 0.022989 | 0 | 0 | 0 | 0 | 0 | 0.032702 | 0.334479 | 2,619 | 88 | 64 | 29.761364 | 0.765921 | 0.063383 | 0 | 0.055556 | 0 | 0 | 0.051492 | 0 | 0 | 0 | 0 | 0.011364 | 0 | 1 | 0.069444 | false | 0 | 0.083333 | 0 | 0.208333 | 0.013889 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
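The weighted RGB distance in `color_distance` above can be sanity-checked on its own; a small standalone sketch of the same arithmetic (the two-argument names are illustrative):

```python
from math import sqrt


def color_distance(c1, c2):
    # Same red-mean-weighted RGB distance as in main.py above.
    red_mean = (c1[0] + c2[0]) / 2
    red = c1[0] - c2[0]
    green = c1[1] - c2[1]
    blue = c1[2] - c2[2]
    delta = (2 + (red_mean / 256)) * (red ** 2)
    delta += 4 * (green ** 2)
    delta += (2 + ((255 - red_mean) / 256)) * (blue ** 2)
    return sqrt(delta)


print(color_distance((0, 0, 0), (0, 0, 0)))  # 0.0 for identical colors
```

The weighting follows the common "redmean" approximation: green differences count most, and the red/blue weights shift with the average red level to better match perceived color difference than plain Euclidean RGB distance.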
5d252f7a220679f8642989c387a00db59609427b | 3,194 | py | Python | core/formulas.py | mike006322/PolynomialCalculator | bf56b0e773a3461ab2aa958d0d90e08f80a4d201 | [
"MIT"
] | null | null | null | core/formulas.py | mike006322/PolynomialCalculator | bf56b0e773a3461ab2aa958d0d90e08f80a4d201 | [
"MIT"
] | null | null | null | core/formulas.py | mike006322/PolynomialCalculator | bf56b0e773a3461ab2aa958d0d90e08f80a4d201 | [
"MIT"
] | null | null | null | def solve(polynomial):
"""
input is polynomial
if more than one variable, returns 'too many variables'
looks for formula to apply to coefficients
returns solution or 'I cannot solve yet...'
"""
if len(polynomial.term_matrix[0]) > 2:
return 'too many variables'
elif len(polynomial.term_matrix[0]) == 1:
return polynomial.term_matrix[1][0]
elif len(polynomial.term_matrix[0]) == 2:
degree = polynomial.term_matrix[1][1]
if degree == 1:
if len(polynomial.term_matrix) == 2:
return 0
else:
return -polynomial.term_matrix[2][0]/polynomial.term_matrix[1][0]
if degree == 2:
ans = quadratic_formula(polynomial)
return ans
if degree > 2:
return Durand_Kerner(polynomial)
def quadratic_formula(polynomial):
"""
input is single-variable polynomial of degree 2
returns zeros
"""
if len(polynomial.term_matrix) == 3:
if polynomial.term_matrix[2][1] == 1:
a, b = polynomial.term_matrix[1][0], polynomial.term_matrix[2][0]
return 0, -b/a
a, c = polynomial.term_matrix[1][0], polynomial.term_matrix[2][0]
return (-c/a)**.5, -(-c/a)**.5
if len(polynomial.term_matrix) == 2:
a, b, c, = polynomial.term_matrix[1][0], 0, 0
elif len(polynomial.term_matrix) == 3:
a, b, c = polynomial.term_matrix[1][0], polynomial.term_matrix[2][0], 0
else:
a, b, c = polynomial.term_matrix[1][0], polynomial.term_matrix[2][0], polynomial.term_matrix[3][0]
ans1 = (-b + (b**2 - 4*a*c)**.5)/2*a
ans2 = (-b - (b**2 - 4*a*c)**.5)/2*a
if ans1 == ans2:
return ans1
return ans1, ans2
def isclose(a, b, rel_tol=1e-09, abs_tol=0.0001):
"""
returns boolean whether abs(a-b) is less than abs_total or rel_total*max(a, b)
"""
return abs(a-b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)
def Durand_Kerner(f):
"""
input polynomial
returns numerical approximation of all complex roots
"""
roots = []
for i in range(f.degree()):
roots.append((0.4 + 0.9j)**i)
diff = 1
diff_temp = 0
def iterate():
nonlocal roots
new_roots = roots[:]
for i in range(len(roots)):
q = 1
for j, root in enumerate(roots):
if j != i:
q *= roots[i] - root
new_roots[i] = roots[i] - f(roots[i])/q
nonlocal diff
nonlocal diff_temp
diff_temp = diff
diff = 0
for i in range(len(roots)):
diff += abs(roots[i] - new_roots[i])
roots = new_roots
while diff > .00000001 and not isclose(diff_temp, diff):
iterate()
for i in range(len(roots)):
if isclose(roots[i].real, round(roots[i].real)):
temp = round(roots[i].real)
roots[i] -= roots[i].real
roots[i] += temp
if isclose(roots[i].imag, round(roots[i].imag)):
temp = round(roots[i].imag)
roots[i] -= roots[i].imag*1j
roots[i] += temp*1j
return roots
if __name__ == '__main__':
pass
| 31.623762 | 106 | 0.556669 | 456 | 3,194 | 3.79386 | 0.201754 | 0.178035 | 0.254335 | 0.09711 | 0.367052 | 0.318497 | 0.17341 | 0.17341 | 0.123121 | 0.112717 | 0 | 0.041723 | 0.302129 | 3,194 | 100 | 107 | 31.94 | 0.73441 | 0.116781 | 0 | 0.09589 | 0 | 0 | 0.009489 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068493 | false | 0.013699 | 0 | 0 | 0.232877 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
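The degree-2 branch of `quadratic_formula` above reduces to the textbook formula once the coefficients are extracted from the term matrix. A standalone sketch on plain coefficients (the function name is illustrative); note the parenthesized `(2 * a)` denominator:

```python
def quadratic_roots(a, b, c):
    # (-b +/- sqrt(b^2 - 4ac)) / (2a); without the parentheses,
    # "/2*a" would multiply by a instead of dividing by 2a.
    d = (b ** 2 - 4 * a * c) ** 0.5
    return (-b + d) / (2 * a), (-b - d) / (2 * a)


print(quadratic_roots(1, -3, 2))  # (2.0, 1.0), the roots of x^2 - 3x + 2
```

A negative discriminant is handled for free here, since `** 0.5` on a negative float raises to a complex power in Python 3 and returns complex roots.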
5d3f1eebd4bcf21a7d4d5c5ef291d2d1f120515e | 1,101 | py | Python | Data Structures/Tree.py | Royals-Aeo-Gamer/MyPyMods | be3a521e9f823ce0b704f925b19f6f34dcb5405d | [
"MIT"
] | null | null | null | Data Structures/Tree.py | Royals-Aeo-Gamer/MyPyMods | be3a521e9f823ce0b704f925b19f6f34dcb5405d | [
"MIT"
] | null | null | null | Data Structures/Tree.py | Royals-Aeo-Gamer/MyPyMods | be3a521e9f823ce0b704f925b19f6f34dcb5405d | [
"MIT"
] | null | null | null | class TreeNode:
    def __init__(self, name, data, parent=None):
        self.name = name
        self.parent = parent
        self.data = data
        self.childs = {}

    def add_child(self, name, data):
        self.childs.update({name: (type(self))(name, data, self)})

    def rm_branch(self, name, ansistors_n: list = None):
        focus = self.childs
        while True:
            if ansistors_n is None or ansistors_n == self.name:
                del focus[name]
                break
            elif ansistors_n[0] in focus:
                focus = (focus[ansistors_n[0]]).childs
                del ansistors_n[0]
            elif name in focus and ansistors_n is None:
                del focus[name]
                break
            else:
                print(focus)
                raise NameError(f"couldn't find branch {ansistors_n[0]}")

    def __getitem__(self, item):
        return self.childs[item]

    def __setitem__(self, key, value):
        self.childs[key] = value

    def __delitem__(self, key, ansistors_n: list = None):
        self.rm_branch(key, ansistors_n)
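A quick usage session makes the `TreeNode` API easier to follow; the sketch below re-declares a minimal subset of the class so it runs standalone:

```python
class TreeNode:
    def __init__(self, name, data, parent=None):
        self.name = name
        self.parent = parent
        self.data = data
        self.childs = {}

    def add_child(self, name, data):
        # children are stored in a dict keyed by name
        self.childs[name] = type(self)(name, data, self)

    def __getitem__(self, item):
        return self.childs[item]

root = TreeNode('root', 0)
root.add_child('a', 1)
root['a'].add_child('b', 2)
print(root['a']['b'].data)         # nested lookup via __getitem__ -> 2
print(root['a']['b'].parent.name)  # each child keeps a reference to its parent -> 'a'
```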
# --- File: wiki/tests.py | Repo: Prones94/Make_Wiki | License: MIT ---

from django.test import TestCase
from django.contrib.auth.models import User
from wiki.models import Page
from django.utils.text import slugify

# Create your tests here.
class WikiPageTest(TestCase):
    def test_edit(self):
        user = User.objects.create_user(username='admin', password='djangopony')
        self.client.login(username='admin', password='djangopony')
        page = Page.objects.create(title="My Test Page", content="test", author=user)
        page.save()

        edit = {
            'title': 'testing title',
            'content': 'testing content'
        }
        response = self.client.post('/%s/' % slugify(page.title), edit)
        updated = Page.objects.get(title=edit['title'])

        self.assertEqual(response.status_code, 302)
        self.assertEqual(updated.title, edit['title'])

    def test_page(self):
        user = User.objects.create_user(username='admin', password='djangopony')
        self.client.login(username='admin', password='djangopony')
        page = Page.objects.create(title="My Test Page", content="test", author=user)
        page.save()

        response = self.client.get('/%s/' % slugify(page.title))

        self.assertEqual(response.status_code, 200)
        self.assertContains(response, 'test')

    def test_create(self):
        user = User.objects.create_user(username='admin', password='djangopony')
        self.client.login(username='admin', password='djangopony')

        new = {
            'title': 'testing title',
            'content': 'testing content'
        }
        response = self.client.post('/wiki/new/', new)
        updated = Page.objects.get(title=new['title'])

        self.assertEqual(response.status_code, 302)
        self.assertEqual(updated.title, new['title'])


'''
Steps to writing a test
1. Set up your test data
2. Make a request (GET, POST)
3a. Check if response matches what we expect
3b. Check if database matches what we expect
'''
# --- File: 34. Find First and Last Position of Element in Sorted Array/main.py
# --- Repo: Competitive-Programmers-Community/LeetCode | License: MIT ---

class Solution:
    def searchRange(self, nums, target):
        """
        :type nums: List[int]
        :type target: int
        :rtype: List[int]
        """
        if not nums:
            return [-1, -1]
        low = 0
        high = len(nums) - 1
        f = 0
        while low <= high:
            mid = (low + high) // 2
            if nums[mid] == target:
                f = 1
                break
            elif nums[mid] < target:
                low = mid + 1
            elif nums[mid] > target:
                high = mid - 1
        i, j = mid, mid
        while i >= 1 and nums[i-1] == target:
            i = i - 1
        while j < len(nums) - 1 and nums[j+1] == target:
            j = j + 1
        if f == 1:
            return [i, j]
        else:
            return [-1, -1]
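The hand-rolled binary search above can be cross-checked against the standard library: `bisect_left` gives the first position of `target` and `bisect_right` gives one past the last. A standalone sketch (`search_range_bisect` is an illustrative name, not part of the original):

```python
from bisect import bisect_left, bisect_right

def search_range_bisect(nums, target):
    lo = bisect_left(nums, target)          # first index where target could be inserted
    if lo == len(nums) or nums[lo] != target:
        return [-1, -1]                     # target absent
    return [lo, bisect_right(nums, target) - 1]

print(search_range_bisect([5, 7, 7, 8, 8, 10], 8))  # [3, 4]
print(search_range_bisect([5, 7, 7, 8, 8, 10], 6))  # [-1, -1]
```

Both versions run in O(log n); the bisect form trades the linear scan for the endpoints for a second binary search.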
# --- File: application/modules/login.py | Repo: BaggerFast/Simple_votings | License: MIT ---

from django.contrib import messages
from django.contrib.auth import login, authenticate
from django.shortcuts import render, redirect
from django.urls import reverse
from django.views import View

from application.forms import AuthenticateForm
from application.views import get_navbar, Page


class LoginView(View):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.context = {}

    def get(self, request):
        self.context['navbar'] = get_navbar(request)
        self.context['form'] = AuthenticateForm()
        return render(request, Page.login, self.context)

    def post(self, request):
        self.context['navbar'] = get_navbar(request)
        data = request.POST
        form = AuthenticateForm(data)
        if form.is_valid():
            user = authenticate(
                username=data['username'],
                password=data['password'],
            )
            if user:
                login(request, user)
                messages.success(request, 'You have successfully logged in!')
                return redirect(reverse('main'))
            messages.error(request, 'Invalid username and password pair.', extra_tags='danger')
        else:
            messages.error(request, 'Invalid username and password pair.', extra_tags='danger')
        self.context['form'] = AuthenticateForm(data)
        return render(request, Page.login, self.context)
# --- File: alipay/aop/api/response/AlipayOpenMiniVersionAuditApplyResponse.py
# --- Repo: antopen/alipay-sdk-python-all | License: Apache-2.0 ---

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import json

from alipay.aop.api.response.AlipayResponse import AlipayResponse


class AlipayOpenMiniVersionAuditApplyResponse(AlipayResponse):

    def __init__(self):
        super(AlipayOpenMiniVersionAuditApplyResponse, self).__init__()
        self._speed_up = None
        self._speed_up_memo = None

    @property
    def speed_up(self):
        return self._speed_up

    @speed_up.setter
    def speed_up(self, value):
        self._speed_up = value

    @property
    def speed_up_memo(self):
        return self._speed_up_memo

    @speed_up_memo.setter
    def speed_up_memo(self, value):
        self._speed_up_memo = value

    def parse_response_content(self, response_content):
        response = super(AlipayOpenMiniVersionAuditApplyResponse, self).parse_response_content(response_content)
        if 'speed_up' in response:
            self.speed_up = response['speed_up']
        if 'speed_up_memo' in response:
            self.speed_up_memo = response['speed_up_memo']
# --- File: pysnmp-with-texts/MWORKS-MIB.py | Repo: agustinhenze/mibs.snmplabs.com | License: Apache-2.0 ---

#
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module MWORKS-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/MWORKS-MIB
# Produced by pysmi-0.3.4 at Wed May 1 14:16:04 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, SingleValueConstraint, ConstraintsUnion, ValueSizeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "SingleValueConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ConstraintsIntersection")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
Gauge32, Unsigned32, ObjectIdentity, IpAddress, Bits, MibIdentifier, Integer32, enterprises, ModuleIdentity, TimeTicks, Counter32, NotificationType, iso, Counter64, MibScalar, MibTable, MibTableRow, MibTableColumn = mibBuilder.importSymbols("SNMPv2-SMI", "Gauge32", "Unsigned32", "ObjectIdentity", "IpAddress", "Bits", "MibIdentifier", "Integer32", "enterprises", "ModuleIdentity", "TimeTicks", "Counter32", "NotificationType", "iso", "Counter64", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
tecElite = MibIdentifier((1, 3, 6, 1, 4, 1, 217))
meterWorks = MibIdentifier((1, 3, 6, 1, 4, 1, 217, 16))
mw501 = MibIdentifier((1, 3, 6, 1, 4, 1, 217, 16, 1))
mwMem = MibIdentifier((1, 3, 6, 1, 4, 1, 217, 16, 1, 1))
mwHeap = MibIdentifier((1, 3, 6, 1, 4, 1, 217, 16, 1, 2))
mwMemCeiling = MibScalar((1, 3, 6, 1, 4, 1, 217, 16, 1, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mwMemCeiling.setStatus('mandatory')
if mibBuilder.loadTexts: mwMemCeiling.setDescription('bytes of memory the agent memory manager will allow the agent to use.')
mwMemUsed = MibScalar((1, 3, 6, 1, 4, 1, 217, 16, 1, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mwMemUsed.setStatus('mandatory')
if mibBuilder.loadTexts: mwMemUsed.setDescription("bytes of memory that meterworks has malloc'ed. some of this may be in free pools.")
mwHeapTotal = MibScalar((1, 3, 6, 1, 4, 1, 217, 16, 1, 2, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mwHeapTotal.setStatus('mandatory')
if mibBuilder.loadTexts: mwHeapTotal.setDescription('bytes of memory given to the heap manager.')
mwHeapUsed = MibScalar((1, 3, 6, 1, 4, 1, 217, 16, 1, 2, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: mwHeapUsed.setStatus('mandatory')
if mibBuilder.loadTexts: mwHeapUsed.setDescription('bytes of available memory in the heap.')
mibBuilder.exportSymbols("MWORKS-MIB", mwHeap=mwHeap, mwHeapUsed=mwHeapUsed, mwMemCeiling=mwMemCeiling, meterWorks=meterWorks, tecElite=tecElite, mwMem=mwMem, mw501=mw501, mwHeapTotal=mwHeapTotal, mwMemUsed=mwMemUsed)
# --- File: vars_in_python.py | Repo: klyusba/python-quiz | License: MIT ---

# == 1 ==
bar = [1, 2]

def foo(bar):
    bar = sum(bar)
    return bar

print(foo(bar))

# == 2 ==
bar = [1, 2]

def foo(bar):
    bar[0] = 1
    return sum(bar)

print(foo(bar))

# == 3 ==
bar = [1, 2]

def foo():
    bar = sum(bar)
    return bar

print(foo())

# == 4 ==
bar = [1, 2]

def foo(bar):
    bar = [1, 2, 3, ]
    return sum(bar)

print(foo(bar), bar)

# == 5 ==
bar = [1, 2]

def foo(bar):
    bar[:] = [1, 2, 3, ]
    return sum(bar)

print(foo(bar), bar)

# == 6 ==
try:
    bar = 1 / 0
    print(bar)
except ZeroDivisionError as bar:
    print(bar)
print(bar)

# == 7 ==
bar = [1, 2]
print(list(bar for bar in bar))
print(bar)

# == 8 ==
bar = [1, 2]
f = lambda: sum(bar)
print(f())
bar = [1, 2, 3, ]
print(f())

# == 9 ==
bar = [1, 2]

def foo(bar):
    return lambda: sum(bar)

f = foo(bar)
print(f())
bar = [1, 2, 3, ]
print(f())

# == 10 ==
bar = [1, 2]
foo = []
for i in bar:
    foo.append(lambda: i)
print([f() for f in foo])

# == 11 ==
bar = [1, 2]
foo = [
    lambda: i
    for i in bar
]
print(list(f() for f in foo))

# == 12 ==
bar = [1, 2]
foo = [
    lambda: i
    for i in bar
]
print(list(f() for f in foo))
bar = [1, 2, 3, ]
print(list(f() for f in foo))
bar[:] = [1, 2, 3, ]
print(list(f() for f in foo))

# == 13 ==
bar = [1, 2]
foo = [
    lambda i=i: i
    for i in bar
]
print(list(f() for f in foo))
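Questions 10–13 all turn on the same rule: a closure looks `i` up when the lambda is *called*, while a default argument freezes the value at *definition* time. A standalone sketch of both behaviors:

```python
bar = [1, 2]

late = [lambda: i for i in bar]       # i is resolved at call time -> both see the last value
early = [lambda i=i: i for i in bar]  # default argument is evaluated per iteration

print([f() for f in late])   # [2, 2]
print([f() for f in early])  # [1, 2]
```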
# --- File: user_program/usb4vc_ui.py | Repo: dekuNukem/USB4VC | License: MIT ---

# https://luma-oled.readthedocs.io/en/latest/software.html
import os
import sys
import time
import threading
import usb4vc_oled
from luma.core.render import canvas
import RPi.GPIO as GPIO
import usb4vc_usb_scan
import usb4vc_shared
import usb4vc_show_ev
import usb4vc_check_update
import json
import subprocess
from subprocess import Popen, PIPE
from usb4vc_shared import this_app_dir_path
from usb4vc_shared import config_dir_path
from usb4vc_shared import firmware_dir_path
from usb4vc_shared import temp_dir_path
from usb4vc_shared import ensure_dir
from usb4vc_shared import i2c_bootloader_pbid
from usb4vc_shared import usb_bootloader_pbid
config_file_path = os.path.join(config_dir_path, 'config.json')
ensure_dir(this_app_dir_path)
ensure_dir(config_dir_path)
ensure_dir(firmware_dir_path)
ensure_dir(temp_dir_path)
PLUS_BUTTON_PIN = 27
MINUS_BUTTON_PIN = 19
ENTER_BUTTON_PIN = 22
SHUTDOWN_BUTTON_PIN = 21
PBOARD_RESET_PIN = 25
PBOARD_BOOT0_PIN = 12
SLEEP_LED_PIN = 26
GPIO.setmode(GPIO.BCM)
GPIO.setup(PBOARD_RESET_PIN, GPIO.IN)
GPIO.setup(PBOARD_BOOT0_PIN, GPIO.IN)
GPIO.setup(SLEEP_LED_PIN, GPIO.OUT)
GPIO.output(SLEEP_LED_PIN, GPIO.LOW)
SPI_MOSI_MAGIC = 0xde
SPI_MOSI_MSG_TYPE_SET_PROTOCOL = 2
set_protocl_spi_msg_template = [SPI_MOSI_MAGIC, 0, SPI_MOSI_MSG_TYPE_SET_PROTOCOL] + [0]*29
class my_button(object):
    def __init__(self, bcm_pin):
        super(my_button, self).__init__()
        self.pin_number = bcm_pin
        GPIO.setup(self.pin_number, GPIO.IN, pull_up_down=GPIO.PUD_UP)
        self.prev_state = GPIO.input(self.pin_number)

    def is_pressed(self):
        result = False
        current_state = GPIO.input(self.pin_number)
        if self.prev_state == 1 and current_state == 0:
            result = True
        self.prev_state = current_state
        return result
PBOARD_ID_UNKNOWN = 0
PBOARD_ID_IBMPC = 1
PBOARD_ID_ADB = 2
pboard_info_spi_msg = [0] * 32
this_pboard_id = PBOARD_ID_UNKNOWN
USBGP_BTN_SOUTH = 0x130
USBGP_BTN_EAST = 0x131
USBGP_BTN_C = 0x132
USBGP_BTN_NORTH = 0x133
USBGP_BTN_WEST = 0x134
USBGP_BTN_Z = 0x135
USBGP_BTN_TL = 0x136
USBGP_BTN_TR = 0x137
USBGP_BTN_TL2 = 0x138
USBGP_BTN_TR2 = 0x139
USBGP_BTN_SELECT = 0x13a
USBGP_BTN_START = 0x13b
USBGP_BTN_MODE = 0x13c
USBGP_BTN_THUMBL = 0x13d
USBGP_BTN_THUMBR = 0x13e
USBGP_BTN_A = USBGP_BTN_SOUTH
USBGP_BTN_B = USBGP_BTN_EAST
USBGP_BTN_X = USBGP_BTN_NORTH
USBGP_BTN_Y = USBGP_BTN_WEST
USBGP_ABS_X = 0x00 # left stick X
USBGP_ABS_Y = 0x01 # left stick Y
USBGP_ABS_Z = 0x02 # left analog trigger
USBGP_ABS_RX = 0x03 # right stick X
USBGP_ABS_RY = 0x04 # right stick Y
USBGP_ABS_RZ = 0x05 # right analog trigger
USBGP_ABS_HAT0X = 0x10 # D-pad X
USBGP_ABS_HAT0Y = 0x11 # D-pad Y
GENERIC_USB_GAMEPAD_TO_MOUSE_KB_DEAULT_MAPPING = {
    "MAPPING_TYPE": "DEFAULT_MOUSE_KB",
    'BTN_TL': {'code': 'BTN_LEFT'},
    'BTN_TR': {'code': 'BTN_RIGHT'},
    'BTN_TL2': {'code': 'BTN_LEFT'},
    'BTN_TR2': {'code': 'BTN_RIGHT'},
    'ABS_X': {'code': 'REL_X'},
    'ABS_Y': {'code': 'REL_Y'},
    'ABS_HAT0X': {'code': 'KEY_RIGHT', 'code_neg': 'KEY_LEFT'},
    'ABS_HAT0Y': {'code': 'KEY_DOWN', 'code_neg': 'KEY_UP'}
}

IBM_GENERIC_USB_GAMEPAD_TO_15PIN_GAMEPORT_GAMEPAD_DEAULT_MAPPING = {
    "MAPPING_TYPE": "DEFAULT_15PIN",
    # buttons to buttons
    'BTN_SOUTH': {'code': 'IBM_GGP_BTN_1'},
    'BTN_NORTH': {'code': 'IBM_GGP_BTN_2'},
    'BTN_EAST': {'code': 'IBM_GGP_BTN_3'},
    'BTN_WEST': {'code': 'IBM_GGP_BTN_4'},
    'BTN_TL': {'code': 'IBM_GGP_BTN_1'},
    'BTN_TR': {'code': 'IBM_GGP_BTN_2'},
    'BTN_Z': {'code': 'IBM_GGP_BTN_3'},
    'BTN_C': {'code': 'IBM_GGP_BTN_4'},
    'BTN_TL2': {'code': 'IBM_GGP_BTN_1'},
    'BTN_TR2': {'code': 'IBM_GGP_BTN_2'},
    # analog axes to analog axes
    'ABS_X': {'code': 'IBM_GGP_JS1_X'},
    'ABS_Y': {'code': 'IBM_GGP_JS1_Y'},
    'ABS_HAT0X': {'code': 'IBM_GGP_JS1_X'},
    'ABS_HAT0Y': {'code': 'IBM_GGP_JS1_Y'},
}
PROTOCOL_OFF = {'pid':0, 'display_name':"OFF"}
PROTOCOL_AT_PS2_KB = {'pid':1, 'display_name':"AT/PS2"}
PROTOCOL_XT_KB = {'pid':2, 'display_name':"PC XT"}
PROTOCOL_ADB_KB = {'pid':3, 'display_name':"ADB"}
PROTOCOL_PS2_MOUSE_NORMAL = {'pid':4, 'display_name':"PS/2"}
PROTOCOL_MICROSOFT_SERIAL_MOUSE = {'pid':5, 'display_name':"Microsft Serial"}
PROTOCOL_ADB_MOUSE = {'pid':6, 'display_name':"ADB"}
PROTOCOL_15PIN_GAMEPORT_GAMEPAD = {'pid':7, 'display_name':"Generic 15-Pin", 'mapping':IBM_GENERIC_USB_GAMEPAD_TO_15PIN_GAMEPORT_GAMEPAD_DEAULT_MAPPING}
PROTOCOL_MOUSESYSTEMS_SERIAL_MOUSE = {'pid':8, 'display_name':"MouseSys Serial"}
PROTOCOL_USB_GP_TO_MOUSE_KB = {'pid':0, 'display_name':'Mouse & KB', 'mapping':GENERIC_USB_GAMEPAD_TO_MOUSE_KB_DEAULT_MAPPING}
PROTOCOL_RAW_KEYBOARD = {'pid':125, 'display_name':"Raw data"}
PROTOCOL_RAW_MOUSE = {'pid':126, 'display_name':"Raw data"}
PROTOCOL_RAW_GAMEPAD = {'pid':127, 'display_name':"Raw data"}
custom_profile_list = []
try:
    onlyfiles = [f for f in os.listdir(config_dir_path) if os.path.isfile(os.path.join(config_dir_path, f))]
    json_map_files = [os.path.join(config_dir_path, x) for x in onlyfiles if x.lower().startswith('usb4vc_map') and x.lower().endswith(".json")]
    for item in json_map_files:
        print('loading json file:', item)
        with open(item) as json_file:
            custom_profile_list.append(json.load(json_file))
except Exception as e:
    print('exception json load:', e)
def get_list_of_usb_drive():
    usb_drive_set = set()
    try:
        usb_drive_path = subprocess.getoutput("timeout 2 df -h | grep -i usb").replace('\r', '').split('\n')
        for item in [x for x in usb_drive_path if len(x) > 2]:
            usb_drive_set.add(os.path.join(item.split(' ')[-1], 'usb4vc'))
    except Exception as e:
        print("exception get_list_of_usb_drive:", e)
    return usb_drive_set


def copy_debug_log():
    usb_drive_set = get_list_of_usb_drive()
    if len(usb_drive_set) == 0:
        return False
    for this_path in usb_drive_set:
        if os.path.isdir(this_path):
            print('copying debug log to', this_path)
            os.system(f'sudo cp -v /home/pi/usb4vc/usb4vc_debug_log.txt {this_path}')
    return True


def check_usb_drive():
    usb_drive_set = get_list_of_usb_drive()
    if len(usb_drive_set) == 0:
        return False, 'USB Drive Not Found'
    for this_path in usb_drive_set:
        usb_config_path = os.path.join(this_path, 'config')
        if not os.path.isdir(usb_config_path):
            usb_config_path = None
        if usb_config_path is not None:
            return True, usb_config_path
    return False, 'No Update Data Found'
def get_pbid_and_version(dfu_file_name):
    pbid = None
    try:
        pbid = int(dfu_file_name.split('PBID')[-1].split('_')[0])
    except Exception as e:
        print("exception fw pbid parse:", e)
    fw_ver_tuple = None
    try:
        fw_ver = dfu_file_name.lower().split('_v')[-1].split('.')[0].split('_')
        fw_ver_tuple = (int(fw_ver[0]), int(fw_ver[1]), int(fw_ver[2]))
    except Exception as e:
        print('exception fw ver parse:', e)
    return pbid, fw_ver_tuple
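The PBFW filename convention that `get_pbid_and_version` relies on can be exercised standalone; the sketch below mirrors its parsing logic on a made-up filename (not a real firmware release):

```python
def parse_pbfw_name(name):
    # mirrors get_pbid_and_version(): PBFW_<board>_PBID<n>_V<maj>_<min>_<patch>.dfu
    pbid = int(name.split('PBID')[-1].split('_')[0])
    ver = name.lower().split('_v')[-1].split('.')[0].split('_')
    return pbid, tuple(int(x) for x in ver)

print(parse_pbfw_name('PBFW_IBMPC_PBID1_V0_9_2.dfu'))  # (1, (0, 9, 2))
```

Returning the version as a tuple is what lets `update_pboard_firmware` compare releases with a plain `>`, since Python compares tuples element by element.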
def reset_pboard():
    print("resetting protocol board...")
    GPIO.setup(PBOARD_BOOT0_PIN, GPIO.IN)
    GPIO.setup(PBOARD_RESET_PIN, GPIO.OUT)
    GPIO.output(PBOARD_RESET_PIN, GPIO.LOW)
    time.sleep(0.05)
    GPIO.setup(PBOARD_RESET_PIN, GPIO.IN)
    time.sleep(0.05)
    print("done")


def enter_dfu():
    # RESET LOW: Enter reset
    GPIO.setup(PBOARD_RESET_PIN, GPIO.OUT)
    GPIO.output(PBOARD_RESET_PIN, GPIO.LOW)
    time.sleep(0.05)
    # BOOT0 HIGH: Boot into DFU mode
    GPIO.setup(PBOARD_BOOT0_PIN, GPIO.OUT)
    GPIO.output(PBOARD_BOOT0_PIN, GPIO.HIGH)
    time.sleep(0.05)
    # Release RESET, BOOT0 still HIGH, STM32 now in DFU mode
    GPIO.setup(PBOARD_RESET_PIN, GPIO.IN)
    time.sleep(1.5)


def exit_dfu():
    # Release BOOT0
    GPIO.setup(PBOARD_BOOT0_PIN, GPIO.IN)
    # Activate RESET
    GPIO.setup(PBOARD_RESET_PIN, GPIO.OUT)
    GPIO.output(PBOARD_RESET_PIN, GPIO.LOW)
    time.sleep(0.05)
    # Release RESET, BOOT0 is LOW, STM32 boots in normal mode
    GPIO.setup(PBOARD_RESET_PIN, GPIO.IN)
    time.sleep(1.5)
def fw_update(fw_path, pbid):
    is_updated = False
    if pbid in i2c_bootloader_pbid and fw_path.lower().endswith('.hex'):
        enter_dfu()
        os.system(f'sudo stm32flash -w {fw_path} -a 0x3b /dev/i2c-1')
        is_updated = True
    elif pbid in usb_bootloader_pbid and fw_path.lower().endswith('.dfu'):
        enter_dfu()
        lsusb_str = subprocess.getoutput("lsusb")
        if 'in DFU'.lower() not in lsusb_str.lower():
            with canvas(usb4vc_oled.oled_device) as draw:
                usb4vc_oled.oled_print_centered("Connect a USB cable", usb4vc_oled.font_regular, 0, draw)
                usb4vc_oled.oled_print_centered("from P-Card to RPi", usb4vc_oled.font_regular, 10, draw)
                usb4vc_oled.oled_print_centered("and try again", usb4vc_oled.font_regular, 20, draw)
            time.sleep(4)
        else:
            os.system(f'sudo dfu-util --device ,0483:df11 -a 0 -D {fw_path}')
            is_updated = True
    exit_dfu()
    return is_updated


def update_pboard_firmware(this_pid):
    onlyfiles = [f for f in os.listdir(firmware_dir_path) if os.path.isfile(os.path.join(firmware_dir_path, f))]
    firmware_files = [x for x in onlyfiles if x.startswith("PBFW_") and (x.lower().endswith(".dfu") or x.lower().endswith(".hex")) and "PBID" in x]
    this_pboard_version_tuple = (pboard_info_spi_msg[5], pboard_info_spi_msg[6], pboard_info_spi_msg[7])
    for item in firmware_files:
        pbid, fw_ver_tuple = get_pbid_and_version(item)
        if pbid is None or fw_ver_tuple is None:
            continue
        print('update_pboard_firmware:', this_pid, this_pboard_version_tuple, fw_ver_tuple)
        if pbid == this_pid and fw_ver_tuple > this_pboard_version_tuple:
            print("DOING IT NOW")
            with canvas(usb4vc_oled.oled_device) as draw:
                usb4vc_oled.oled_print_centered("Loading Firmware:", usb4vc_oled.font_medium, 0, draw)
                usb4vc_oled.oled_print_centered(item.strip("PBFW_").strip(".dfu").strip(".hex"), usb4vc_oled.font_regular, 16, draw)
            if fw_update(os.path.join(firmware_dir_path, item), this_pid):
                return True
    return False


def update_from_usb(usb_config_path):
    if usb_config_path is not None:
        os.system(f'cp -v /home/pi/usb4vc/config/config.json {usb_config_path}')
        os.system('mv -v /home/pi/usb4vc/config/config.json /home/pi/usb4vc/config.json')
        os.system('rm -rfv /home/pi/usb4vc/config/*')
        os.system(f"cp -v {os.path.join(usb_config_path, '*')} /home/pi/usb4vc/config")
        os.system("mv -v /home/pi/usb4vc/config.json /home/pi/usb4vc/config/config.json")
ibmpc_keyboard_protocols = [PROTOCOL_OFF, PROTOCOL_AT_PS2_KB, PROTOCOL_XT_KB]
ibmpc_mouse_protocols = [PROTOCOL_OFF, PROTOCOL_PS2_MOUSE_NORMAL, PROTOCOL_MICROSOFT_SERIAL_MOUSE, PROTOCOL_MOUSESYSTEMS_SERIAL_MOUSE]
ibmpc_gamepad_protocols = [PROTOCOL_OFF, PROTOCOL_15PIN_GAMEPORT_GAMEPAD, PROTOCOL_USB_GP_TO_MOUSE_KB]
adb_keyboard_protocols = [PROTOCOL_OFF, PROTOCOL_ADB_KB]
adb_mouse_protocols = [PROTOCOL_OFF, PROTOCOL_ADB_MOUSE]
adb_gamepad_protocols = [PROTOCOL_OFF, PROTOCOL_USB_GP_TO_MOUSE_KB]
raw_keyboard_protocols = [PROTOCOL_OFF, PROTOCOL_RAW_KEYBOARD]
raw_mouse_protocols = [PROTOCOL_OFF, PROTOCOL_RAW_MOUSE]
raw_gamepad_protocols = [PROTOCOL_OFF, PROTOCOL_RAW_GAMEPAD]
mouse_sensitivity_list = [1, 1.25, 1.5, 1.75, 0.25, 0.5, 0.75]
"""
key is protocol card ID
conf_dict[pbid]:
hw revision
current keyboard protocol
current mouse protocol
current gamepad procotol
mouse sensitivity
"""
configuration_dict = {}
LINUX_EXIT_CODE_TIMEOUT = 124
def bt_setup():
    rfkill_str = subprocess.getoutput("/usr/sbin/rfkill -n")
    if 'bluetooth' not in rfkill_str:
        return 1, "no BT receiver found"
    os.system('/usr/sbin/rfkill unblock bluetooth')
    time.sleep(0.1)
    exit_code = os.system('timeout 1 bluetoothctl agent NoInputNoOutput') >> 8
    if exit_code == LINUX_EXIT_CODE_TIMEOUT:
        return 2, 'bluetoothctl stuck'
    return 0, ''


def scan_bt_devices(timeout_sec=5):
    exit_code = os.system(f"timeout {timeout_sec} bluetoothctl --agent NoInputNoOutput scan on") >> 8
    if exit_code != LINUX_EXIT_CODE_TIMEOUT:
        return None, 'scan error'
    device_str = subprocess.getoutput("bluetoothctl --agent NoInputNoOutput devices")
    dev_list = []
    for line in device_str.replace('\r', '').split('\n'):
        if 'device' not in line.lower():
            continue
        line_split = line.split(' ', maxsplit=2)
        # skip if device has no name
        if len(line_split) < 3 or line_split[2].count('-') == 5:
            continue
        dev_list.append((line_split[1], line_split[2]))
    return dev_list, ''
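The name filter in `scan_bt_devices` drops entries whose "name" is just the MAC address with dashes, which bluetoothctl reports for unnamed devices. The rule can be exercised standalone on canned output (the sample lines are illustrative, not captured from a real run):

```python
def parse_bt_devices(device_str):
    devs = []
    for line in device_str.replace('\r', '').split('\n'):
        if 'device' not in line.lower():
            continue
        parts = line.split(' ', maxsplit=2)
        # a "name" like AA-BB-CC-DD-EE-FF (five dashes) means the device is unnamed
        if len(parts) < 3 or parts[2].count('-') == 5:
            continue
        devs.append((parts[1], parts[2]))
    return devs

sample = "Device AA:BB:CC:DD:EE:FF My Keyboard\nDevice 11:22:33:44:55:66 11-22-33-44-55-66"
print(parse_bt_devices(sample))  # [('AA:BB:CC:DD:EE:FF', 'My Keyboard')]
```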
def pair_device(mac_addr):
    is_ready = False
    is_sent = False
    fail_phrases = ['fail', 'error', 'not available', 'excep']
    # note: shell=True combined with an argument list would silently drop the
    # --agent arguments on POSIX, so the command is run without a shell here
    with Popen(["bluetoothctl", "--agent", "NoInputNoOutput"], stdout=PIPE, stdin=PIPE, bufsize=1,
               universal_newlines=True) as p:
        for line in p.stdout:
            print(line, end='')
            line_lo = line.lower()
            if 'registered' in line_lo:
                is_ready = True
            if is_ready is False:
                continue
            if '#' in line_lo and is_sent is False:
                p.stdin.write(f'pair {mac_addr}\n')
                is_sent = True
            if 'PIN code:' in line:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Enter PIN code:", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(line.split('PIN code:')[-1], usb4vc_oled.font_medium, 15, draw)
            if '(yes/no)' in line:
                p.stdin.write('yes\n')
            if 'number in 0-999999' in line:
                return False, "Error: Passkey needed"
            if 'successful' in line_lo:
                p.stdin.write('exit\n')
                return True, 'Success!'
            for item in fail_phrases:
                if item in line_lo:
                    p.stdin.write('exit\n')
                    return False, line
    return False, "wtf"


def get_paired_devices():
    dev_set = set()
    try:
        device_str = subprocess.getoutput("timeout 5 bluetoothctl --agent NoInputNoOutput paired-devices")
        for line in device_str.replace('\r', '').split('\n'):
            if 'device' not in line.lower():
                continue
            line_split = line.split(' ', maxsplit=2)
            # skip if device has no name
            if len(line_split) < 3 or line_split[2].count('-') == 5:
                continue
            dev_set.add((line_split[1], line_split[2]))
    except Exception as e:
        print('exception get_paired_devices:', e)
    return dev_set
def load_config():
    global configuration_dict
    try:
        with open(config_file_path) as json_file:
            temp_dict = json.load(json_file)
        # json dumps all keys as strings, need to convert them back to ints
        for key in temp_dict:
            if key.isdigit():
                configuration_dict[int(key)] = temp_dict[key]
            else:
                configuration_dict[key] = temp_dict[key]
    except Exception as e:
        print("exception config load failed!", e)


def get_ip_address():
    ip_str = subprocess.getoutput("timeout 1 hostname -I")
    ip_list = [x for x in ip_str.split(' ') if '.' in x]
    if len(ip_list) == 0:
        return "Offline"
    return f'{ip_list[0]}'


def save_config():
    try:
        with open(config_file_path, 'w', encoding='utf-8') as save_file:
            save_file.write(json.dumps(configuration_dict))
    except Exception as e:
        print("exception config save failed!", e)
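`load_config` has to undo a JSON quirk: dict keys that are ints (the protocol-board IDs) come back from `json.load` as strings. A standalone round-trip shows the problem and the key-restoring fix (the sample config values are illustrative):

```python
import json

config = {1: {'kb': 'AT/PS2'}, 'last_used': 'IBM PC'}

# round-trip through JSON: the int key 1 is coerced to the string '1'
restored_raw = json.loads(json.dumps(config))
print(list(restored_raw))  # ['1', 'last_used']

# same conversion load_config performs: digit-only keys go back to int
fixed = {int(k) if k.isdigit() else k: v for k, v in restored_raw.items()}
print(list(fixed))  # [1, 'last_used']
```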
curve_vertial_axis_x_pos = 80
curve_horizontal_axis_width = 32
curve_linear = {0: 0, 1: 1, 2: 2, 3: 3, 4: 4, 5: 5, 6: 6, 7: 7, 8: 8, 9: 9, 10: 10, 11: 11, 12: 12, 13: 13, 14: 14, 15: 15, 16: 16, 17: 17, 18: 18, 19: 19, 20: 20, 21: 21, 22: 22, 23: 23, 24: 24, 25: 25, 26: 26, 27: 27, 28: 28, 29: 29, 30: 30, 31: 31, 32: 32, 33: 33, 34: 34, 35: 35, 36: 36, 37: 37, 38: 38, 39: 39, 40: 40, 41: 41, 42: 42, 43: 43, 44: 44, 45: 45, 46: 46, 47: 47, 48: 48, 49: 49, 50: 50, 51: 51, 52: 52, 53: 53, 54: 54, 55: 55, 56: 56, 57: 57, 58: 58, 59: 59, 60: 60, 61: 61, 62: 62, 63: 63, 64: 64, 65: 65, 66: 66, 67: 67, 68: 68, 69: 69, 70: 70, 71: 71, 72: 72, 73: 73, 74: 74, 75: 75, 76: 76, 77: 77, 78: 78, 79: 79, 80: 80, 81: 81, 82: 82, 83: 83, 84: 84, 85: 85, 86: 86, 87: 87, 88: 88, 89: 89, 90: 90, 91: 91, 92: 92, 93: 93, 94: 94, 95: 95, 96: 96, 97: 97, 98: 98, 99: 99, 100: 100, 101: 101, 102: 102, 103: 103, 104: 104, 105: 105, 106: 106, 107: 107, 108: 108, 109: 109, 110: 110, 111: 111, 112: 112, 113: 113, 114: 114, 115: 115, 116: 116, 117: 117, 118: 118, 119: 119, 120: 120, 121: 121, 122: 122, 123: 123, 124: 124, 125: 125, 126: 126, 127: 127}
curve1 = {0: 1, 1: 1, 2: 2, 3: 2, 4: 3, 5: 4, 6: 4, 7: 5, 8: 5, 9: 6, 10: 6, 11: 7, 12: 7, 13: 8, 14: 8, 15: 9, 16: 9, 17: 10, 18: 11, 19: 11, 20: 12, 21: 12, 22: 13, 23: 13, 24: 14, 25: 15, 26: 15, 27: 16, 28: 16, 29: 17, 30: 18, 31: 18, 32: 19, 33: 19, 34: 20, 35: 21, 36: 21, 37: 22, 38: 22, 39: 23, 40: 24, 41: 24, 42: 25, 43: 26, 44: 26, 45: 27, 46: 28, 47: 28, 48: 29, 49: 30, 50: 30, 51: 31, 52: 32, 53: 33, 54: 33, 55: 34, 56: 35, 57: 36, 58: 36, 59: 37, 60: 38, 61: 39, 62: 39, 63: 40, 64: 41, 65: 42, 66: 43, 67: 44, 68: 45, 69: 46, 70: 46, 71: 47, 72: 48, 73: 49, 74: 50, 75: 51, 76: 52, 77: 53, 78: 55, 79: 56, 80: 57, 81: 58, 82: 59, 83: 60, 84: 61, 85: 62, 86: 63, 87: 65, 88: 66, 89: 67, 90: 68, 91: 70, 92: 71, 93: 72, 94: 73, 95: 75, 96: 76, 97: 77, 98: 79, 99: 80, 100: 81, 101: 83, 102: 84, 103: 86, 104: 87, 105: 89, 106: 90, 107: 92, 108: 93, 109: 95, 110: 96, 111: 98, 112: 100, 113: 101, 114: 103, 115: 105, 116: 106, 117: 108, 118: 110, 119: 112, 120: 113, 121: 115, 122: 117, 123: 119, 124: 121, 125: 123, 126: 125, 127: 127}
curve2 = {0: 1, 1: 1, 2: 1, 3: 1, 4: 2, 5: 2, 6: 2, 7: 2, 8: 2, 9: 3, 10: 3, 11: 3, 12: 3, 13: 4, 14: 4, 15: 4, 16: 4, 17: 5, 18: 5, 19: 5, 20: 5, 21: 6, 22: 6, 23: 6, 24: 7, 25: 7, 26: 7, 27: 8, 28: 8, 29: 8, 30: 8, 31: 9, 32: 9, 33: 9, 34: 10, 35: 10, 36: 10, 37: 11, 38: 11, 39: 12, 40: 12, 41: 12, 42: 13, 43: 13, 44: 13, 45: 14, 46: 14, 47: 15, 48: 15, 49: 15, 50: 16, 51: 16, 52: 17, 53: 17, 54: 18, 55: 18, 56: 19, 57: 19, 58: 20, 59: 20, 60: 21, 61: 21, 62: 22, 63: 22, 64: 23, 65: 23, 66: 24, 67: 24, 68: 25, 69: 26, 70: 26, 71: 27, 72: 28, 73: 28, 74: 29, 75: 30, 76: 30, 77: 31, 78: 32, 79: 33, 80: 34, 81: 35, 82: 36, 83: 37, 84: 38, 85: 39, 86: 40, 87: 41, 88: 42, 89: 43, 90: 44, 91: 45, 92: 47, 93: 48, 94: 49, 95: 51, 96: 52, 97: 53, 98: 55, 99: 56, 100: 58, 101: 59, 102: 61, 103: 63, 104: 64, 105: 66, 106: 68, 107: 70, 108: 71, 109: 73, 110: 75, 111: 78, 112: 80, 113: 82, 114: 84, 115: 86, 116: 89, 117: 92, 118: 94, 119: 96, 120: 100, 121: 102, 122: 106, 123: 110, 124: 112, 125: 116, 126: 120, 127: 125}
curve3 = {0: 1, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1, 12: 1, 13: 1, 14: 1, 15: 1, 16: 1, 17: 1, 18: 1, 19: 1, 20: 1, 21: 2, 22: 2, 23: 2, 24: 2, 25: 2, 26: 2, 27: 2, 28: 2, 29: 2, 30: 2, 31: 3, 32: 3, 33: 3, 34: 3, 35: 3, 36: 3, 37: 3, 38: 4, 39: 4, 40: 4, 41: 4, 42: 4, 43: 4, 44: 5, 45: 5, 46: 5, 47: 5, 48: 5, 49: 6, 50: 6, 51: 6, 52: 6, 53: 7, 54: 7, 55: 7, 56: 7, 57: 8, 58: 8, 59: 8, 60: 8, 61: 9, 62: 9, 63: 9, 64: 10, 65: 10, 66: 10, 67: 11, 68: 11, 69: 11, 70: 12, 71: 12, 72: 12, 73: 13, 74: 13, 75: 14, 76: 14, 77: 15, 78: 15, 79: 16, 80: 16, 81: 17, 82: 17, 83: 18, 84: 19, 85: 19, 86: 20, 87: 21, 88: 21, 89: 22, 90: 23, 91: 24, 92: 25, 93: 26, 94: 27, 95: 28, 96: 29, 97: 30, 98: 32, 99: 33, 100: 34, 101: 35, 102: 37, 103: 38, 104: 40, 105: 41, 106: 43, 107: 45, 108: 46, 109: 48, 110: 50, 111: 52, 112: 54, 113: 56, 114: 59, 115: 61, 116: 64, 117: 66, 118: 69, 119: 72, 120: 76, 121: 79, 122: 83, 123: 87, 124: 92, 125: 99, 126: 104, 127: 118}
joystick_curve_list = [curve_linear, curve1, curve2, curve3]
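# Illustrative sketch (hypothetical helper, not part of the original API): each
# curve above maps a raw 0..127 joystick axis value to a shaped 0..127 output,
# so applying one is just a clamped dictionary lookup:

```python
def apply_joystick_curve(curve, raw_value):
    # clamp to the 7-bit axis range before the table lookup
    return curve[max(0, min(127, raw_value))]
```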
class usb4vc_menu(object):
    def cap_index(self, index, list_size):
        if index >= list_size:
            return 0
        return index
    def __init__(self, pboard, conf_dict):
        super(usb4vc_menu, self).__init__()
        self.current_level = 0
        self.current_page = 0
        self.level_size = 6
        self.page_size = [7, 6, 4, 1, 1, 5]
        self.kb_protocol_list = list(pboard['protocol_list_keyboard'])
        self.mouse_protocol_list = list(pboard['protocol_list_mouse'])
        self.gamepad_protocol_list = list(pboard['protocol_list_gamepad'])
        self.pb_info = dict(pboard)
        self.current_keyboard_protocol_index = self.cap_index(conf_dict.get('keyboard_protocol_index', 0), len(self.kb_protocol_list))
        self.current_mouse_protocol_index = self.cap_index(conf_dict.get("mouse_protocol_index", 0), len(self.mouse_protocol_list))
        self.current_mouse_sensitivity_offset_index = self.cap_index(conf_dict.get("mouse_sensitivity_index", 0), len(mouse_sensitivity_list))
        self.current_gamepad_protocol_index = self.cap_index(conf_dict.get("gamepad_protocol_index", 0), len(self.gamepad_protocol_list))
        self.current_keyboard_protocol = self.kb_protocol_list[self.current_keyboard_protocol_index]
        self.current_mouse_protocol = self.mouse_protocol_list[self.current_mouse_protocol_index]
        self.current_gamepad_protocol = self.gamepad_protocol_list[self.current_gamepad_protocol_index]
        self.current_joystick_curve_index = self.cap_index(conf_dict.get("joystick_curve_index", 0), len(joystick_curve_list))
        self.last_spi_message = []
        self.bluetooth_device_list = None
        self.error_message = ''
        self.pairing_result = ''
        self.bt_scan_timeout_sec = 10
        self.paired_devices_list = []
        self.send_protocol_set_spi_msg()
    def switch_page(self, amount):
        self.current_page = (self.current_page + amount) % self.page_size[self.current_level]
    def goto_page(self, new_page):
        if new_page < self.page_size[self.current_level]:
            self.current_page = new_page
    def goto_level(self, new_level):
        if new_level < self.level_size:
            self.current_level = new_level
            self.current_page = 0
    def draw_joystick_curve(self):
        this_curve = joystick_curve_list[self.current_joystick_curve_index % len(joystick_curve_list)]
        with canvas(usb4vc_oled.oled_device) as draw:
            draw.text((0, 0), "Joystick", font=usb4vc_oled.font_medium, fill="white")
            draw.text((0, 15), "Curve", font=usb4vc_oled.font_medium, fill="white")
            # vertical axis, from the top of the display down to the horizontal axis at y = 31
            draw.line((curve_vertial_axis_x_pos, 0, curve_vertial_axis_x_pos, 31), fill="white")
            draw.line((curve_vertial_axis_x_pos, 31, curve_vertial_axis_x_pos+curve_horizontal_axis_width, 31), fill="white")
            for xxx in range(curve_horizontal_axis_width):
                dict_key = xxx*4
                this_point_x = xxx + curve_vertial_axis_x_pos
                this_point_y = usb4vc_oled.OLED_HEIGHT - this_curve[dict_key]//4 - 1
                draw.line((this_point_x, this_point_y, this_point_x, this_point_y), fill="white")
    def display_page(self, level, page):
        if level == 0:
            if page == 0:
                with canvas(usb4vc_oled.oled_device) as draw:
                    mouse_count, kb_count, gp_count = usb4vc_usb_scan.get_device_count()
                    draw.text((0, 0), f"KBD {kb_count} {self.current_keyboard_protocol['display_name']}", font=usb4vc_oled.font_regular, fill="white")
                    draw.text((0, 10), f"MOS {mouse_count} {self.current_mouse_protocol['display_name']}", font=usb4vc_oled.font_regular, fill="white")
                    draw.text((0, 20), f"GPD {gp_count} {self.current_gamepad_protocol['display_name']}", font=usb4vc_oled.font_regular, fill="white")
            if page == 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    if 'Unknown' in self.pb_info['full_name']:
                        draw.text((0, 0), f"{self.pb_info['full_name']} PID {this_pboard_id}", font=usb4vc_oled.font_regular, fill="white")
                    else:
                        draw.text((0, 0), f"{self.pb_info['full_name']}", font=usb4vc_oled.font_regular, fill="white")
                    draw.text((0, 10), f"PB {self.pb_info['fw_ver'][0]}.{self.pb_info['fw_ver'][1]}.{self.pb_info['fw_ver'][2]} RPi {usb4vc_shared.RPI_APP_VERSION_TUPLE[0]}.{usb4vc_shared.RPI_APP_VERSION_TUPLE[1]}.{usb4vc_shared.RPI_APP_VERSION_TUPLE[2]}", font=usb4vc_oled.font_regular, fill="white")
                    draw.text((0, 20), f"IP: {get_ip_address()}", font=usb4vc_oled.font_regular, fill="white")
            if page == 2:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Load Custom", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("Config from USB", usb4vc_oled.font_medium, 16, draw)
            if page == 3:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Internet Update", usb4vc_oled.font_medium, 10, draw)
            if page == 4:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Show Event Codes", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("(experimental)", usb4vc_oled.font_regular, 20, draw)
            if page == 5:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Remove BT Device", usb4vc_oled.font_medium, 10, draw)
            if page == 6:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Pair Bluetooth", usb4vc_oled.font_medium, 10, draw)
        if level == 1:
            if page == 0:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Keyboard Protocol", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(self.kb_protocol_list[self.current_keyboard_protocol_index]['display_name'], usb4vc_oled.font_medium, 15, draw)
            if page == 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Mouse Protocol", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(self.mouse_protocol_list[self.current_mouse_protocol_index]['display_name'], usb4vc_oled.font_medium, 15, draw)
            if page == 2:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Gamepad Protocol", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(self.gamepad_protocol_list[self.current_gamepad_protocol_index]['display_name'], usb4vc_oled.font_medium, 15, draw)
            if page == 3:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Mouse Sensitivity", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(f"{mouse_sensitivity_list[self.current_mouse_sensitivity_offset_index]}", usb4vc_oled.font_medium, 15, draw)
            if page == 4:
                self.draw_joystick_curve()
            if page == 5:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Save & Quit", usb4vc_oled.font_medium, 10, draw)
        if level == 2:
            if page == 0:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Put your device in", usb4vc_oled.font_regular, 0, draw)
                    usb4vc_oled.oled_print_centered("pairing mode now.", usb4vc_oled.font_regular, 10, draw)
                    usb4vc_oled.oled_print_centered("Press enter to start", usb4vc_oled.font_regular, 20, draw)
            if page == 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Scanning...", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("Please wait", usb4vc_oled.font_medium, 15, draw)
                result, self.error_message = bt_setup()
                if result != 0:
                    self.goto_page(3)
                    self.display_curent_page()
                    return
                paired_devices_set = get_paired_devices()
                self.bluetooth_device_list, self.error_message = scan_bt_devices(self.bt_scan_timeout_sec)
                self.bluetooth_device_list = list(set(self.bluetooth_device_list) - paired_devices_set)
                if len(self.bluetooth_device_list) == 0:
                    self.error_message = "Nothing was found"
                    self.goto_page(3)
                    self.display_curent_page()
                    return
                print("BT LIST:", self.bluetooth_device_list)
                # set up level 3 menu structure
                self.page_size[3] = len(self.bluetooth_device_list) + 1
                self.goto_level(3)
                self.display_curent_page()
            if page == 2:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Pairing result:", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(self.pairing_result, usb4vc_oled.font_regular, 20, draw)
            if page == 3:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Bluetooth Error!", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered(self.error_message, usb4vc_oled.font_regular, 20, draw)
        if level == 3:
            if page == self.page_size[3] - 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Exit", usb4vc_oled.font_medium, 10, draw)
            else:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered(f"Found {len(self.bluetooth_device_list)}. Pair this?", usb4vc_oled.font_regular, 0, draw)
                    usb4vc_oled.oled_print_centered(f"{self.bluetooth_device_list[page][1]}", usb4vc_oled.font_regular, 10, draw)
                    usb4vc_oled.oled_print_centered(f"{self.bluetooth_device_list[page][0]}", usb4vc_oled.font_regular, 20, draw)
        if level == 4:
            if page == self.page_size[4] - 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Exit", usb4vc_oled.font_medium, 10, draw)
            else:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Remove this?", usb4vc_oled.font_regular, 0, draw)
                    usb4vc_oled.oled_print_centered(f"{self.paired_devices_list[page][1]}", usb4vc_oled.font_regular, 10, draw)
                    usb4vc_oled.oled_print_centered(f"{self.paired_devices_list[page][0]}", usb4vc_oled.font_regular, 20, draw)
        if level == 5:
            if page == 0:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Power Down", usb4vc_oled.font_medium, 10, draw)
            if page == 1:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Relaunch", usb4vc_oled.font_medium, 10, draw)
            if page == 2:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Reboot", usb4vc_oled.font_medium, 10, draw)
            if page == 3:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Exit to Linux", usb4vc_oled.font_medium, 10, draw)
            if page == 4:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Cancel", usb4vc_oled.font_medium, 10, draw)
    def send_protocol_set_spi_msg(self):
        status_dict = {}
        for index, item in enumerate(self.kb_protocol_list):
            if item['pid'] & 0x7f in status_dict and status_dict[item['pid'] & 0x7f] == 1:
                continue
            status_dict[item['pid'] & 0x7f] = 0
            if index == self.current_keyboard_protocol_index:
                status_dict[item['pid'] & 0x7f] = 1
        for index, item in enumerate(self.mouse_protocol_list):
            if item['pid'] & 0x7f in status_dict and status_dict[item['pid'] & 0x7f] == 1:
                continue
            status_dict[item['pid'] & 0x7f] = 0
            if index == self.current_mouse_protocol_index:
                status_dict[item['pid'] & 0x7f] = 1
        for index, item in enumerate(self.gamepad_protocol_list):
            if item['pid'] & 0x7f in status_dict and status_dict[item['pid'] & 0x7f] == 1:
                continue
            status_dict[item['pid'] & 0x7f] = 0
            if index == self.current_gamepad_protocol_index:
                status_dict[item['pid'] & 0x7f] = 1
        protocol_bytes = []
        for key in status_dict:
            if key == PROTOCOL_OFF['pid']:
                continue
            if status_dict[key]:
                protocol_bytes.append(key | 0x80)
            else:
                protocol_bytes.append(key)
        this_msg = list(set_protocl_spi_msg_template)
        this_msg[3:3+len(protocol_bytes)] = protocol_bytes
        self.current_keyboard_protocol = self.kb_protocol_list[self.current_keyboard_protocol_index]
        self.current_mouse_protocol = self.mouse_protocol_list[self.current_mouse_protocol_index]
        self.current_gamepad_protocol = self.gamepad_protocol_list[self.current_gamepad_protocol_index]
        if this_msg == self.last_spi_message:
            print("SPI: no need to send")
            return
        print("set_protocol:", [hex(x) for x in this_msg])
        usb4vc_usb_scan.set_protocol(this_msg)
        print('new status:', [hex(x) for x in usb4vc_usb_scan.get_pboard_info()])
        self.last_spi_message = list(this_msg)
    def action(self, level, page):
        if level == 0:
            if page == 2:
                usb_present, config_path = check_usb_drive()
                if usb_present is False:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Error:", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered(str(config_path), usb4vc_oled.font_regular, 16, draw)
                    time.sleep(3)
                    self.goto_level(0)
                else:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Copying", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered("Debug Log...", usb4vc_oled.font_medium, 16, draw)
                    copy_debug_log()
                    time.sleep(2)
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Copying custom", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered("mapping...", usb4vc_oled.font_medium, 16, draw)
                    time.sleep(2)
                    update_from_usb(config_path)
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Update complete!", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered("Relaunching...", usb4vc_oled.font_medium, 16, draw)
                    time.sleep(3)
                    usb4vc_oled.oled_device.clear()
                    os._exit(0)
                self.goto_level(0)
            elif page == 3:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Updating...", usb4vc_oled.font_medium, 10, draw)
                fw_download_result = usb4vc_check_update.download_latest_firmware(this_pboard_id)
                if fw_download_result != 0:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Unable to download", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered(f"firmware: {fw_download_result}", usb4vc_oled.font_medium, 16, draw)
                elif update_pboard_firmware(this_pboard_id):
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Firmware updated!", usb4vc_oled.font_medium, 10, draw)
                else:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("FW update ERR or", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered("already newest", usb4vc_oled.font_medium, 15, draw)
                time.sleep(3)
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Updating code...", usb4vc_oled.font_medium, 10, draw)
                time.sleep(1)
                update_result = usb4vc_check_update.update(temp_dir_path)
                if update_result[0] == 0:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Update complete!", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered("Relaunching...", usb4vc_oled.font_medium, 16, draw)
                else:
                    with canvas(usb4vc_oled.oled_device) as draw:
                        usb4vc_oled.oled_print_centered("Update failed:", usb4vc_oled.font_medium, 0, draw)
                        usb4vc_oled.oled_print_centered(f"{update_result[-1]} {update_result[0]}", usb4vc_oled.font_regular, 16, draw)
                time.sleep(4)
                usb4vc_oled.oled_device.clear()
                os._exit(0)
            elif page == 4:
                try:
                    usb4vc_show_ev.ev_loop([plus_button, minus_button, enter_button])
                except Exception as e:
                    print('exception ev_loop:', e)
                self.goto_level(0)
            elif page == 5:
                self.paired_devices_list = list(get_paired_devices())
                self.page_size[4] = len(self.paired_devices_list) + 1
                self.goto_level(4)
            elif page == 6:
                self.goto_level(2)
            else:
                self.goto_level(1)
        if level == 1:
            if page == 0:
                self.current_keyboard_protocol_index = (self.current_keyboard_protocol_index + 1) % len(self.kb_protocol_list)
            if page == 1:
                self.current_mouse_protocol_index = (self.current_mouse_protocol_index + 1) % len(self.mouse_protocol_list)
            if page == 2:
                self.current_gamepad_protocol_index = (self.current_gamepad_protocol_index + 1) % len(self.gamepad_protocol_list)
            if page == 3:
                self.current_mouse_sensitivity_offset_index = (self.current_mouse_sensitivity_offset_index + 1) % len(mouse_sensitivity_list)
            if page == 4:
                self.current_joystick_curve_index = (self.current_joystick_curve_index + 1) % len(joystick_curve_list)
                self.draw_joystick_curve()
            if page == 5:
                configuration_dict[this_pboard_id]["keyboard_protocol_index"] = self.current_keyboard_protocol_index
                configuration_dict[this_pboard_id]["mouse_protocol_index"] = self.current_mouse_protocol_index
                configuration_dict[this_pboard_id]["mouse_sensitivity_index"] = self.current_mouse_sensitivity_offset_index
                configuration_dict[this_pboard_id]["gamepad_protocol_index"] = self.current_gamepad_protocol_index
                configuration_dict[this_pboard_id]["joystick_curve_index"] = self.current_joystick_curve_index
                save_config()
                self.send_protocol_set_spi_msg()
                self.goto_level(0)
        if level == 2:
            if page == 0:
                self.switch_page(1)
            if page == 2:
                self.goto_level(0)
            if page == 3:
                self.goto_level(0)
        if level == 3:
            if page == self.page_size[3] - 1:
                self.goto_level(0)
            else:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Pairing...", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("Please wait", usb4vc_oled.font_medium, 15, draw)
                print("pairing", self.bluetooth_device_list[page])
                bt_mac_addr = self.bluetooth_device_list[page][0]
                is_successful, result_message = pair_device(bt_mac_addr)
                self.pairing_result = result_message.split('.')[-1].strip()[-22:]
                if is_successful:
                    os.system(f'timeout {self.bt_scan_timeout_sec} bluetoothctl --agent NoInputNoOutput trust {bt_mac_addr}')
                    os.system(f'timeout {self.bt_scan_timeout_sec} bluetoothctl --agent NoInputNoOutput connect {bt_mac_addr}')
                self.goto_level(2)
                self.goto_page(2)
        if level == 4:
            if page == self.page_size[4] - 1:
                self.goto_level(0)
            else:
                os.system(f'timeout 5 bluetoothctl --agent NoInputNoOutput untrust {self.paired_devices_list[page][0]}')
                os.system(f'timeout 5 bluetoothctl --agent NoInputNoOutput remove {self.paired_devices_list[page][0]}')
                self.goto_level(0)
        if level == 5:
            if page == 0:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Wait Until Green", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("LED Stops Blinking", usb4vc_oled.font_medium, 15, draw)
                time.sleep(2)
                os.system("sudo halt")
                while True:
                    time.sleep(1)
            if page == 1:
                usb4vc_oled.oled_device.clear()
                os._exit(0)
            if page == 2:
                with canvas(usb4vc_oled.oled_device) as draw:
                    usb4vc_oled.oled_print_centered("Rebooting...", usb4vc_oled.font_medium, 0, draw)
                    usb4vc_oled.oled_print_centered("Unplug if stuck >10s", usb4vc_oled.font_regular, 16, draw)
                os.system("sudo reboot")
                while True:
                    time.sleep(1)
            if page == 3:
                usb4vc_oled.oled_device.clear()
                os._exit(169)
            if page == 4:
                self.goto_level(0)
        self.display_curent_page()
    def action_current_page(self):
        self.action(self.current_level, self.current_page)
    def display_curent_page(self):
        self.display_page(self.current_level, self.current_page)
    def update_usb_status(self):
        if self.current_level == 0 and self.current_page == 0:
            self.display_page(0, 0)
    def update_board_status(self):
        if self.current_level == 0 and self.current_page == 1:
            self.display_page(0, 1)
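# Sketch of the per-protocol wire format assembled in send_protocol_set_spi_msg
# above (hypothetical helper, inferred from this file only): each payload byte
# carries a 7-bit protocol ID, with bit 7 set when that protocol is enabled.

```python
def encode_protocol_byte(pid, enabled):
    # keep only the low 7 bits of the PID; bit 7 is the enable flag
    return (pid & 0x7f) | (0x80 if enabled else 0x00)
```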
pboard_database = {
    PBOARD_ID_UNKNOWN: {'author': 'Unknown', 'fw_ver': (0, 0, 0), 'full_name': 'Unknown', 'hw_rev': 0, 'protocol_list_keyboard': raw_keyboard_protocols, 'protocol_list_mouse': raw_mouse_protocols, 'protocol_list_gamepad': raw_gamepad_protocols},
    PBOARD_ID_IBMPC: {'author': 'dekuNukem', 'fw_ver': (0, 0, 0), 'full_name': 'IBM PC Compatible', 'hw_rev': 0, 'protocol_list_keyboard': ibmpc_keyboard_protocols, 'protocol_list_mouse': ibmpc_mouse_protocols, 'protocol_list_gamepad': ibmpc_gamepad_protocols},
    PBOARD_ID_ADB: {'author': 'dekuNukem', 'fw_ver': (0, 0, 0), 'full_name': 'Apple Desktop Bus', 'hw_rev': 0, 'protocol_list_keyboard': adb_keyboard_protocols, 'protocol_list_mouse': adb_mouse_protocols, 'protocol_list_gamepad': adb_gamepad_protocols},
}
def get_pboard_dict(pid):
    if pid not in pboard_database:
        pid = 0
    return pboard_database[pid]
def get_mouse_sensitivity():
    return mouse_sensitivity_list[configuration_dict[this_pboard_id]["mouse_sensitivity_index"]]
def ui_init():
    global pboard_info_spi_msg
    global this_pboard_id
    load_config()
    pboard_info_spi_msg = usb4vc_usb_scan.get_pboard_info()
    print("PB INFO:", pboard_info_spi_msg)
    this_pboard_id = pboard_info_spi_msg[3]
    if this_pboard_id in pboard_database:
        # load custom profile mapping into protocol list
        for item in custom_profile_list:
            this_mapping_bid = usb4vc_shared.board_id_lookup.get(item['protocol_board'], 0)
            if this_mapping_bid == this_pboard_id and item['device_type'] in pboard_database[this_pboard_id]:
                this_mapping_pid = usb4vc_shared.protocol_id_lookup.get(item['protocol_name'])
                item['pid'] = this_mapping_pid
                pboard_database[this_pboard_id][item['device_type']].append(item)
    pboard_database[this_pboard_id]['hw_rev'] = pboard_info_spi_msg[4]
    pboard_database[this_pboard_id]['fw_ver'] = (pboard_info_spi_msg[5], pboard_info_spi_msg[6], pboard_info_spi_msg[7])
    if 'rpi_app_ver' not in configuration_dict:
        configuration_dict['rpi_app_ver'] = usb4vc_shared.RPI_APP_VERSION_TUPLE
    if this_pboard_id not in configuration_dict:
        configuration_dict[this_pboard_id] = {"keyboard_protocol_index": 1, "mouse_protocol_index": 1, "mouse_sensitivity_index": 0, "gamepad_protocol_index": 1}
plus_button = my_button(PLUS_BUTTON_PIN)
minus_button = my_button(MINUS_BUTTON_PIN)
enter_button = my_button(ENTER_BUTTON_PIN)
shutdown_button = my_button(SHUTDOWN_BUTTON_PIN)
class oled_sleep_control(object):
    def __init__(self):
        super(oled_sleep_control, self).__init__()
        self.is_sleeping = False
        self.last_input_event = time.time()
        self.ui_loop_count = 0
    def sleep(self):
        if self.is_sleeping is False:
            print("sleeping!")
            usb4vc_oled.oled_device.clear()
            self.is_sleeping = True
            # GPIO.output(SLEEP_LED_PIN, GPIO.HIGH)
    def wakeup(self):
        if self.is_sleeping:
            print("waking up!")
            my_menu.display_curent_page()
            self.last_input_event = time.time()
            self.is_sleeping = False
            # GPIO.output(SLEEP_LED_PIN, GPIO.LOW)
    def check_sleep(self):
        # time.time() might jump ahead a lot when RPi gets its time from network
        # this ensures OLED won't go to sleep too early
        if self.ui_loop_count <= 1500:
            return
        if time.time() - self.last_input_event > 180:
            self.sleep()
        else:
            self.wakeup()
    def kick(self):
        self.last_input_event = time.time()
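# Pure-function sketch of the sleep policy above (hypothetical helper, not
# called here): ignore the idle timer until the UI loop has run for a while,
# because time.time() can jump forward when the Pi first syncs its clock.

```python
def should_oled_sleep(ui_loop_count, now, last_input_event,
                      warmup_loops=1500, idle_timeout_sec=180):
    # during warm-up the clock may not be trustworthy yet; never sleep
    if ui_loop_count <= warmup_loops:
        return False
    return now - last_input_event > idle_timeout_sec
```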
my_oled = oled_sleep_control()
my_menu = None
def ui_worker():
    global my_menu
    print(configuration_dict)
    print("ui_worker started")
    my_menu = usb4vc_menu(get_pboard_dict(this_pboard_id), configuration_dict[this_pboard_id])
    my_menu.display_page(0, 0)
    for x in range(2):
        GPIO.output(SLEEP_LED_PIN, GPIO.HIGH)
        time.sleep(0.2)
        GPIO.output(SLEEP_LED_PIN, GPIO.LOW)
        time.sleep(0.2)
    while True:
        time.sleep(0.1)
        my_oled.ui_loop_count += 1
        if my_oled.is_sleeping is False and my_oled.ui_loop_count % 5 == 0:
            my_menu.update_usb_status()
            my_menu.update_board_status()
        if plus_button.is_pressed():
            my_oled.kick()
            if my_oled.is_sleeping:
                my_oled.wakeup()
            elif my_menu.current_level != 2:
                my_menu.switch_page(1)
                my_menu.display_curent_page()
        if minus_button.is_pressed():
            my_oled.kick()
            if my_oled.is_sleeping:
                my_oled.wakeup()
            elif my_menu.current_level != 2:
                my_menu.switch_page(-1)
                my_menu.display_curent_page()
        if enter_button.is_pressed():
            my_oled.kick()
            if my_oled.is_sleeping:
                my_oled.wakeup()
            else:
                my_menu.action_current_page()
        if shutdown_button.is_pressed():
            my_oled.kick()
            if my_oled.is_sleeping:
                my_oled.wakeup()
            else:
                my_menu.goto_level(5)
                my_menu.display_curent_page()
        my_oled.check_sleep()
def get_gamepad_protocol():
    return my_menu.current_gamepad_protocol
def get_joystick_curve():
    return joystick_curve_list[my_menu.current_joystick_curve_index]
def oled_print_model_changed():
    with canvas(usb4vc_oled.oled_device) as draw:
        usb4vc_oled.oled_print_centered("RPi Model Changed!", usb4vc_oled.font_regular, 0, draw)
        usb4vc_oled.oled_print_centered("Recompiling BT Driver", usb4vc_oled.font_regular, 10, draw)
        usb4vc_oled.oled_print_centered("Might take a while...", usb4vc_oled.font_regular, 20, draw)
def oled_print_oneline(msg):
    with canvas(usb4vc_oled.oled_device) as draw:
        usb4vc_oled.oled_print_centered(msg, usb4vc_oled.font_medium, 10, draw)
def oled_print_reboot():
    with canvas(usb4vc_oled.oled_device) as draw:
        usb4vc_oled.oled_print_centered("Done! Rebooting..", usb4vc_oled.font_medium, 0, draw)
        usb4vc_oled.oled_print_centered("Unplug if stuck >10s", usb4vc_oled.font_regular, 16, draw)
ui_thread = threading.Thread(target=ui_worker, daemon=True)
# ---- umbra/monitor/main.py, from RafaelAPB/umbra (Apache-2.0) ----
import logging
import json
import asyncio
from google.protobuf import json_format
from umbra.common.protobuf.umbra_grpc import MonitorBase
from umbra.common.protobuf.umbra_pb2 import Instruction, Snapshot
from umbra.monitor.tools import Tools
logger = logging.getLogger(__name__)
logging.getLogger("hpack").setLevel(logging.WARNING)
class Monitor(MonitorBase):
    def __init__(self, info):
        self.tools = Tools()
    async def Listen(self, stream):
        logging.debug("Instruction Received")
        instruction: Instruction = await stream.recv_message()
        instruction_dict = json_format.MessageToDict(instruction, preserving_proto_field_name=True)
        snapshot_dict = await self.tools.handle(instruction_dict)
        snapshot = json_format.ParseDict(snapshot_dict, Snapshot())
        await stream.send_message(snapshot)
| 31.777778 | 99 | 0.757576 | 101 | 858 | 6.217822 | 0.445545 | 0.047771 | 0.047771 | 0.073248 | 0.089172 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001391 | 0.162005 | 858 | 26 | 100 | 33 | 0.872045 | 0 | 0 | 0 | 0 | 0 | 0.029138 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.368421 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
537138998ce86bd69153421493a543bbc8be7c36 | 723 | py | Python | hemp/internal/utils.py | Addvilz/hemp | 2cd1d437fc59a8f7b24f5d150c623bf75c3b6747 | [
"Apache-2.0"
] | 1 | 2020-08-13T22:28:28.000Z | 2020-08-13T22:28:28.000Z | hemp/internal/utils.py | Addvilz/hemp | 2cd1d437fc59a8f7b24f5d150c623bf75c3b6747 | [
"Apache-2.0"
] | null | null | null | hemp/internal/utils.py | Addvilz/hemp | 2cd1d437fc59a8f7b24f5d150c623bf75c3b6747 | [
"Apache-2.0"
] | null | null | null | import sys
from fabric.utils import error, puts
from git import RemoteProgress
def print_err(message, func=None, exception=None, stdout=None, stderr=None):
error('[Hemp] ' + message, func, exception, stdout, stderr)
def print_info(text, show_prefix=None, end="\n", flush=True):
puts('[Hemp] ' + text, show_prefix, end, flush)
def print_git_output(stdout):
for line in stdout.split('\n'):
sys.stdout.write('[GIT] ' + line + '\n')
sys.stdout.flush()
class SimpleProgressPrinter(RemoteProgress):
def _parse_progress_line(self, line):
if '\r' in line:
line = line.replace('\r', '\r[GIT] ')
sys.stdout.write('[GIT] ' + line + '\n')
sys.stdout.flush()
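# Minimal sketch of the line handling above (hypothetical standalone function,
# not used by SimpleProgressPrinter): carriage-return segments get their own
# '[GIT] ' prefix so git's in-place progress updates stay labelled.

```python
def prefix_git_line(line):
    # label each carriage-return segment, then the line as a whole
    if '\r' in line:
        line = line.replace('\r', '\r[GIT] ')
    return '[GIT] ' + line + '\n'
```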
| 26.777778 | 76 | 0.637621 | 97 | 723 | 4.659794 | 0.42268 | 0.079646 | 0.066372 | 0.075221 | 0.159292 | 0.159292 | 0.159292 | 0.159292 | 0.159292 | 0 | 0 | 0 | 0.204703 | 723 | 26 | 77 | 27.807692 | 0.786087 | 0 | 0 | 0.235294 | 0 | 0 | 0.063624 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.176471 | 0 | 0.470588 | 0.176471 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
53726406b1ce515956afb2308d74b2a4c7e1b255 | 4,227 | py | Python | tests/arch/x86/test_x86parser.py | IMULMUL/barf-project | 9547ef843b8eb021c2c32c140e36173c0b4eafa3 | [
"BSD-2-Clause"
] | 1,395 | 2015-01-02T11:43:30.000Z | 2022-03-30T01:15:26.000Z | tests/arch/x86/test_x86parser.py | IMULMUL/barf-project | 9547ef843b8eb021c2c32c140e36173c0b4eafa3 | [
"BSD-2-Clause"
] | 54 | 2015-02-11T05:18:05.000Z | 2021-12-10T08:45:39.000Z | tests/arch/x86/test_x86parser.py | IMULMUL/barf-project | 9547ef843b8eb021c2c32c140e36173c0b4eafa3 | [
"BSD-2-Clause"
] | 207 | 2015-01-05T09:47:54.000Z | 2022-03-30T01:15:29.000Z | # Copyright (c) 2014, Fundacion Dr. Manuel Sadosky
# All rights reserved.
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# 1. Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from __future__ import absolute_import
import unittest
from barf.arch import ARCH_X86_MODE_32
from barf.arch import ARCH_X86_MODE_64
from barf.arch.x86.parser import X86Parser
class X86Parser32BitsTests(unittest.TestCase):
def setUp(self):
self._parser = X86Parser(ARCH_X86_MODE_32)
def test_two_oprnd_reg_reg(self):
asm = self._parser.parse("add eax, ebx")
self.assertEqual(str(asm), "add eax, ebx")
def test_two_oprnd_reg_imm(self):
asm = self._parser.parse("add eax, 0x12345678")
self.assertEqual(str(asm), "add eax, 0x12345678")
def test_two_oprnd_reg_mem(self):
asm = self._parser.parse("add eax, [ebx + edx * 4 + 0x10]")
self.assertEqual(str(asm), "add eax, [ebx+edx*4+0x10]")
def test_two_oprnd_mem_reg(self):
asm = self._parser.parse("add [ebx + edx * 4 + 0x10], eax")
self.assertEqual(str(asm), "add [ebx+edx*4+0x10], eax")
def test_one_oprnd_reg(self):
asm = self._parser.parse("inc eax")
self.assertEqual(str(asm), "inc eax")
def test_one_oprnd_imm(self):
asm = self._parser.parse("jmp 0x12345678")
self.assertEqual(str(asm), "jmp 0x12345678")
def test_one_oprnd_mem(self):
asm = self._parser.parse("inc dword ptr [ebx+edx*4+0x10]")
self.assertEqual(str(asm), "inc dword ptr [ebx+edx*4+0x10]")
def test_zero_oprnd(self):
asm = self._parser.parse("nop")
self.assertEqual(str(asm), "nop")
# Misc
# ======================================================================== #
def test_misc_1(self):
asm = self._parser.parse("mov dword ptr [-0x21524111], ecx")
self.assertEqual(str(asm), "mov dword ptr [-0x21524111], ecx")
self.assertNotEqual(str(asm), "mov dword ptr [0xdeadbeef], ecx")
def test_misc_2(self):
asm = self._parser.parse("fucompi st(1)")
self.assertEqual(str(asm), "fucompi st1")
class X86Parser64BitsTests(unittest.TestCase):
def setUp(self):
self._parser = X86Parser(ARCH_X86_MODE_64)
def test_64_two_oprnd_reg_reg(self):
asm = self._parser.parse("add rax, rbx")
self.assertEqual(str(asm), "add rax, rbx")
def test_64_two_oprnd_reg_reg_2(self):
asm = self._parser.parse("add rax, r8")
self.assertEqual(str(asm), "add rax, r8")
def test_64_two_oprnd_reg_mem(self):
asm = self._parser.parse("add rax, [rbx + r15 * 4 + 0x10]")
self.assertEqual(str(asm), "add rax, [rbx+r15*4+0x10]")
# Misc
# ======================================================================== #
def test_misc_offset_1(self):
asm = self._parser.parse("add byte ptr [rax+0xffffff89], cl")
self.assertEqual(str(asm), "add byte ptr [rax+0xffffff89], cl")
def main():
unittest.main()
if __name__ == '__main__':
main()
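The memory-operand tests above rely on the parser emitting a canonical string form with the whitespace inside bracketed operands stripped (`"[ebx + edx * 4 + 0x10]"` becomes `"[ebx+edx*4+0x10]"`). A minimal illustrative sketch of that normalization — not BARF's actual implementation:

```python
import re

def normalize_memory_operand(operand: str) -> str:
    # Illustrative only -- not BARF's parser. Collapse all whitespace
    # inside a bracketed x86 memory operand, producing the canonical
    # form the assertions above expect.
    return re.sub(r'\s+', '', operand)

print(normalize_memory_operand('[ebx + edx * 4 + 0x10]'))  # [ebx+edx*4+0x10]
```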
# Source: set-config.py (astubenazy/vrops-metric-collection, MIT)
#!/usr/bin/env python
"""
#
# set-config - a small python program to setup the configuration environment for data-collect.py
# data-collect.py contain the python program to gather Metrics from vROps
# Author Sajal Debnath <sdebnath@vmware.com>
#
"""
# Importing the required modules
import json
import base64
import os,sys
# Getting the absolute path from where the script is being run
def get_script_path():
return os.path.dirname(os.path.realpath(sys.argv[0]))
def get_the_inputs():
adapterkind = raw_input("Please enter Adapter Kind: ")
resourceKind = raw_input("Please enter Resource Kind: ")
    servername = raw_input("Please enter server IP/FQDN: ")
serveruid = raw_input("Please enter user id: ")
serverpasswd = raw_input("Please enter vRops password: ")
encryptedvar = base64.b64encode(serverpasswd)
maxsamples = raw_input("Please enter the maximum number of samples to collect: ")
keys_to_monitor = raw_input("Please enter the number of keys to monitor: ")
keys = []
for i in range(int(keys_to_monitor)):
keys.append(raw_input("Enter the key: "))
data = {}
if int(maxsamples) < 1:
maxsamples = 1
data["adapterKind"] = adapterkind
data["resourceKind"] = resourceKind
data["sampleno"] = int(maxsamples)
serverdetails = {}
serverdetails["name"] = servername
serverdetails["userid"] = serveruid
serverdetails["password"] = encryptedvar
data["server"] = serverdetails
data["keys"] = keys
return data
# Getting the path where config.json file should be kept
path = get_script_path()
fullpath = path+"/"+"config.json"
# Getting the data for the config.json file
final_data = get_the_inputs()
# Saving the data to config.json file
with open(fullpath, 'w') as outfile:
    json.dump(final_data, outfile, sort_keys=True, indent=2, separators=(',', ':'), ensure_ascii=False)
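The companion `data-collect.py` presumably reads this file back and reverses the base64 step. A hypothetical loader sketch (the function name is mine, not the repo's; note that base64 only obfuscates the password, it is not encryption):

```python
import base64
import json

def load_config(path):
    # Hypothetical counterpart to set-config.py: read config.json and
    # decode the obfuscated password back to plain text.
    with open(path) as conf:
        data = json.load(conf)
    encoded = data["server"]["password"]
    data["server"]["password"] = base64.b64decode(encoded).decode("utf-8")
    return data
```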
# Source: alipay/aop/api/domain/KbAdvertSettleBillResponse.py (snowxmas/alipay-sdk-python-all, Apache-2.0)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import json
from alipay.aop.api.constant.ParamConstants import *
class KbAdvertSettleBillResponse(object):
def __init__(self):
self._download_url = None
self._paid_date = None
@property
def download_url(self):
return self._download_url
@download_url.setter
def download_url(self, value):
self._download_url = value
@property
def paid_date(self):
return self._paid_date
@paid_date.setter
def paid_date(self, value):
self._paid_date = value
def to_alipay_dict(self):
params = dict()
if self.download_url:
if hasattr(self.download_url, 'to_alipay_dict'):
params['download_url'] = self.download_url.to_alipay_dict()
else:
params['download_url'] = self.download_url
if self.paid_date:
if hasattr(self.paid_date, 'to_alipay_dict'):
params['paid_date'] = self.paid_date.to_alipay_dict()
else:
params['paid_date'] = self.paid_date
return params
@staticmethod
def from_alipay_dict(d):
if not d:
return None
o = KbAdvertSettleBillResponse()
if 'download_url' in d:
o.download_url = d['download_url']
if 'paid_date' in d:
o.paid_date = d['paid_date']
return o
# Source: django_loci/tests/base/test_admin.py (yashikajotwani12/django-loci, BSD-3-Clause)
import json
import os
import responses
from django.urls import reverse
from .. import TestAdminMixin, TestLociMixin
class BaseTestAdmin(TestAdminMixin, TestLociMixin):
geocode_url = 'https://geocode.arcgis.com/arcgis/rest/services/World/GeocodeServer/'
def test_location_list(self):
self._login_as_admin()
self._create_location(name='test-admin-location-1')
url = reverse('{0}_location_changelist'.format(self.url_prefix))
r = self.client.get(url)
self.assertContains(r, 'test-admin-location-1')
def test_floorplan_list(self):
self._login_as_admin()
self._create_floorplan()
self._create_location()
url = reverse('{0}_floorplan_changelist'.format(self.url_prefix))
r = self.client.get(url)
self.assertContains(r, '1st floor')
def test_location_json_view(self):
self._login_as_admin()
loc = self._create_location()
r = self.client.get(reverse('admin:django_loci_location_json', args=[loc.pk]))
expected = {
'name': loc.name,
'address': loc.address,
'type': loc.type,
'is_mobile': loc.is_mobile,
'geometry': json.loads(loc.geometry.json),
}
self.assertDictEqual(r.json(), expected)
def test_location_floorplan_json_view(self):
self._login_as_admin()
fl = self._create_floorplan()
r = self.client.get(
reverse('admin:django_loci_location_floorplans_json', args=[fl.location.pk])
)
expected = {
'choices': [
{
'id': str(fl.pk),
'str': str(fl),
'floor': fl.floor,
'image': fl.image.url,
'image_width': fl.image.width,
'image_height': fl.image.height,
}
]
}
self.assertDictEqual(r.json(), expected)
def test_location_change_image_removed(self):
self._login_as_admin()
loc = self._create_location(name='test-admin-location-1', type='indoor')
fl = self._create_floorplan(location=loc)
# remove floorplan image
os.remove(fl.image.path)
url = reverse('{0}_location_change'.format(self.url_prefix), args=[loc.pk])
r = self.client.get(url)
self.assertContains(r, 'test-admin-location-1')
def test_floorplan_change_image_removed(self):
self._login_as_admin()
loc = self._create_location(name='test-admin-location-1', type='indoor')
fl = self._create_floorplan(location=loc)
# remove floorplan image
os.remove(fl.image.path)
url = reverse('{0}_floorplan_change'.format(self.url_prefix), args=[fl.pk])
r = self.client.get(url)
self.assertContains(r, 'test-admin-location-1')
def test_is_mobile_location_json_view(self):
self._login_as_admin()
loc = self._create_location(is_mobile=True, geometry=None)
response = self.client.get(
reverse('admin:django_loci_location_json', args=[loc.pk])
)
self.assertEqual(response.status_code, 200)
content = json.loads(response.content)
self.assertEqual(content['geometry'], None)
loc1 = self._create_location(
name='location2', address='loc2 add', type='outdoor'
)
response1 = self.client.get(
reverse('admin:django_loci_location_json', args=[loc1.pk])
)
self.assertEqual(response1.status_code, 200)
content1 = json.loads(response1.content)
expected = {
'name': 'location2',
'address': 'loc2 add',
'type': 'outdoor',
'is_mobile': False,
'geometry': {'type': 'Point', 'coordinates': [12.512124, 41.898903]},
}
self.assertEqual(content1, expected)
@responses.activate
def test_geocode(self):
self._login_as_admin()
address = 'Red Square'
url = '{0}?address={1}'.format(
reverse('admin:django_loci_location_geocode_api'), address
)
# Mock HTTP request to the URL to work offline
responses.add(
responses.GET,
f'{self.geocode_url}findAddressCandidates?singleLine=Red+Square&f=json&maxLocations=1',
body=self._load_content('base/static/test-geocode.json'),
content_type='application/json',
)
response = self.client.get(url)
response_lat = round(response.json()['lat'])
response_lng = round(response.json()['lng'])
self.assertEqual(response.status_code, 200)
self.assertEqual(response_lat, 56)
self.assertEqual(response_lng, 38)
def test_geocode_no_address(self):
self._login_as_admin()
url = reverse('admin:django_loci_location_geocode_api')
response = self.client.get(url)
expected = {'error': 'Address parameter not defined'}
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json(), expected)
@responses.activate
def test_geocode_invalid_address(self):
self._login_as_admin()
invalid_address = 'thisaddressisnotvalid123abc'
url = '{0}?address={1}'.format(
reverse('admin:django_loci_location_geocode_api'), invalid_address
)
responses.add(
responses.GET,
f'{self.geocode_url}findAddressCandidates?singleLine=thisaddressisnotvalid123abc'
'&f=json&maxLocations=1',
body=self._load_content('base/static/test-geocode-invalid-address.json'),
content_type='application/json',
)
response = self.client.get(url)
expected = {'error': 'Not found location with given name'}
self.assertEqual(response.status_code, 404)
self.assertEqual(response.json(), expected)
@responses.activate
def test_reverse_geocode(self):
self._login_as_admin()
lat = 52
lng = 21
url = '{0}?lat={1}&lng={2}'.format(
reverse('admin:django_loci_location_reverse_geocode_api'), lat, lng
)
# Mock HTTP request to the URL to work offline
responses.add(
responses.GET,
f'{self.geocode_url}reverseGeocode?location=21.0%2C52.0&f=json&outSR=4326',
body=self._load_content('base/static/test-reverse-geocode.json'),
content_type='application/json',
)
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
self.assertContains(response, 'POL')
@responses.activate
def test_reverse_location_with_no_address(self):
self._login_as_admin()
lat = -30
lng = -30
url = '{0}?lat={1}&lng={2}'.format(
reverse('admin:django_loci_location_reverse_geocode_api'), lat, lng
)
responses.add(
responses.GET,
f'{self.geocode_url}reverseGeocode?location=-30.0%2C-30.0&f=json&outSR=4326',
body=self._load_content(
'base/static/test-reverse-location-with-no-address.json'
),
content_type='application/json',
)
response = self.client.get(url)
response_address = response.json()['address']
self.assertEqual(response.status_code, 404)
self.assertEqual(response_address, '')
def test_reverse_geocode_no_coords(self):
self._login_as_admin()
url = reverse('admin:django_loci_location_reverse_geocode_api')
response = self.client.get(url)
expected = {'error': 'lat or lng parameter not defined'}
self.assertEqual(response.status_code, 400)
self.assertEqual(response.json(), expected)
# Source: core/urls.py (donnellan0007/blog, MIT)
from django.contrib import admin
from django.urls import path
from .views import index, email, post_detail, posts, hot_takes, take_detail
from . import views
app_name = "core"
urlpatterns = [
path('',views.index,name="index"),
path('email/',views.email,name="email"),
path('post/<slug>/',views.post_detail,name='post'),
path('posts/',views.posts,name='posts'),
path('takes/',views.hot_takes,name='takes'),
path('take/<slug>/',views.take_detail,name='take'),
]

# Source: utils/dbconn.py (iamvishnuks/Xmigrate, MIT)
from mongoengine import *
from dotenv import load_dotenv
from os import getenv
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider
from cassandra.cqlengine import connection
from cassandra.cqlengine.management import sync_table
from cassandra.query import ordered_dict_factory
from model.discover import *
from model.blueprint import *
from model.disk import *
from model.storage import *
from model.project import *
from model.network import *
from model.user import *
load_dotenv()
cass_db = getenv("CASS_DB")
cass_password = getenv("CASS_PASSWORD")
cass_user = getenv("CASS_USER")
def create_db_con():
auth_provider = PlainTextAuthProvider(username=cass_user, password=cass_password)
cluster = Cluster([cass_db],auth_provider=auth_provider)
session = cluster.connect()
session.execute("""
CREATE KEYSPACE IF NOT EXISTS migration
WITH replication = { 'class': 'SimpleStrategy', 'replication_factor': '2' }
""")
session.set_keyspace('migration')
session.row_factory = ordered_dict_factory
connection.setup([cass_db], "migration",protocol_version=3,auth_provider=auth_provider)
sync_table(BluePrint)
sync_table(Discover)
sync_table(Project)
sync_table(Network)
sync_table(Subnet)
sync_table(Storage)
sync_table(Bucket)
sync_table(GcpBucket)
sync_table(User)
sync_table(Disk)
session.execute("CREATE INDEX IF NOT EXISTS ON blue_print (network);")
session.execute("CREATE INDEX IF NOT EXISTS ON blue_print (subnet);")
return session
| 33.468085 | 91 | 0.760966 | 201 | 1,573 | 5.756219 | 0.318408 | 0.085566 | 0.077787 | 0.041487 | 0.081245 | 0.081245 | 0.081245 | 0.081245 | 0.081245 | 0.081245 | 0 | 0.0015 | 0.152575 | 1,573 | 46 | 92 | 34.195652 | 0.866467 | 0 | 0 | 0 | 0 | 0 | 0.183725 | 0.01335 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023256 | false | 0.046512 | 0.348837 | 0 | 0.395349 | 0.069767 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
539ea2a319db010bc0f4b82dc9bd72f7d9cbdfe7 | 175 | py | Python | scratchnet/scratchnet.py | Gr1m3y/scratchnet | 5fce471b6e12dc05b3a92fd8581445f7d598d1c3 | [
"MIT"
] | null | null | null | scratchnet/scratchnet.py | Gr1m3y/scratchnet | 5fce471b6e12dc05b3a92fd8581445f7d598d1c3 | [
"MIT"
] | null | null | null | scratchnet/scratchnet.py | Gr1m3y/scratchnet | 5fce471b6e12dc05b3a92fd8581445f7d598d1c3 | [
"MIT"
] | null | null | null | import numpy as np
import network
def main():
x = np.array([2, 3])
nw = network.NeuralNetwork()
print(nw.feedforward(x))
if __name__ == "__main__":
main()
| 13.461538 | 32 | 0.617143 | 24 | 175 | 4.166667 | 0.708333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014925 | 0.234286 | 175 | 12 | 33 | 14.583333 | 0.731343 | 0 | 0 | 0 | 0 | 0 | 0.045714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0 | 0.375 | 0.125 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# Source: openstack/tests/unit/block_storage/v2/test_proxy.py (infonova/openstacksdk, Apache-2.0)
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from openstack.block_storage.v2 import _proxy
from openstack.block_storage.v2 import snapshot
from openstack.block_storage.v2 import stats
from openstack.block_storage.v2 import type
from openstack.block_storage.v2 import volume
from openstack.tests.unit import test_proxy_base
class TestVolumeProxy(test_proxy_base.TestProxyBase):
def setUp(self):
super(TestVolumeProxy, self).setUp()
self.proxy = _proxy.Proxy(self.session)
def test_snapshot_get(self):
self.verify_get(self.proxy.get_snapshot, snapshot.Snapshot)
def test_snapshots_detailed(self):
self.verify_list(self.proxy.snapshots, snapshot.SnapshotDetail,
paginated=True,
method_kwargs={"details": True, "query": 1},
expected_kwargs={"query": 1})
def test_snapshots_not_detailed(self):
self.verify_list(self.proxy.snapshots, snapshot.Snapshot,
paginated=True,
method_kwargs={"details": False, "query": 1},
expected_kwargs={"query": 1})
def test_snapshot_create_attrs(self):
self.verify_create(self.proxy.create_snapshot, snapshot.Snapshot)
def test_snapshot_delete(self):
self.verify_delete(self.proxy.delete_snapshot,
snapshot.Snapshot, False)
def test_snapshot_delete_ignore(self):
self.verify_delete(self.proxy.delete_snapshot,
snapshot.Snapshot, True)
def test_type_get(self):
self.verify_get(self.proxy.get_type, type.Type)
def test_types(self):
self.verify_list(self.proxy.types, type.Type, paginated=False)
def test_type_create_attrs(self):
self.verify_create(self.proxy.create_type, type.Type)
def test_type_delete(self):
self.verify_delete(self.proxy.delete_type, type.Type, False)
def test_type_delete_ignore(self):
self.verify_delete(self.proxy.delete_type, type.Type, True)
def test_volume_get(self):
self.verify_get(self.proxy.get_volume, volume.Volume)
def test_volumes_detailed(self):
self.verify_list(self.proxy.volumes, volume.VolumeDetail,
paginated=True,
method_kwargs={"details": True, "query": 1},
expected_kwargs={"query": 1})
def test_volumes_not_detailed(self):
self.verify_list(self.proxy.volumes, volume.Volume,
paginated=True,
method_kwargs={"details": False, "query": 1},
expected_kwargs={"query": 1})
def test_volume_create_attrs(self):
self.verify_create(self.proxy.create_volume, volume.Volume)
def test_volume_delete(self):
self.verify_delete(self.proxy.delete_volume, volume.Volume, False)
def test_volume_delete_ignore(self):
self.verify_delete(self.proxy.delete_volume, volume.Volume, True)
def test_volume_extend(self):
self._verify("openstack.block_storage.v2.volume.Volume.extend",
self.proxy.extend_volume,
method_args=["value", "new-size"],
expected_args=["new-size"])
def test_backend_pools(self):
self.verify_list(self.proxy.backend_pools, stats.Pools,
paginated=False)
# Source: settings.py (danylo-dudok/youtube-rss, MIT)
from datetime import datetime, timedelta
from typing import Final

from tools import localize_time

RSS_URL_PREFIX: Final = 'https://www.youtube.com/feeds/videos.xml?channel_id={0}'
LOCATION_ARGUMENT_PREFIX: Final = '--location='
CHANNEL_ARGUMENT_PREFIX: Final = '--channels='
LAST_CHECK_ARGUMENT_PREFIX: Final = '--last-check='
TWO_WEEKS_IN_DAYS: Final = 14
DEFAULT_LAST_CHECK: Final = localize_time(datetime.now() - timedelta(days=TWO_WEEKS_IN_DAYS))
EMPTY: Final = ''
CHANNEL_POSTS_LIMIT: Final = 20
# Source: coba/learners/__init__.py (mrucker/coba, BSD-3-Clause)
"""This module contains all public learners and learner interfaces."""
from coba.learners.primitives import Learner, SafeLearner
from coba.learners.bandit import EpsilonBanditLearner, UcbBanditLearner, FixedLearner, RandomLearner
from coba.learners.corral import CorralLearner
from coba.learners.vowpal import VowpalMediator
from coba.learners.vowpal import VowpalArgsLearner, VowpalEpsilonLearner, VowpalSoftmaxLearner, VowpalBagLearner
from coba.learners.vowpal import VowpalCoverLearner, VowpalRegcbLearner, VowpalSquarecbLearner, VowpalOffPolicyLearner
from coba.learners.linucb import LinUCBLearner
__all__ = [
'Learner',
'SafeLearner',
'RandomLearner',
'FixedLearner',
'EpsilonBanditLearner',
'UcbBanditLearner',
'CorralLearner',
'LinUCBLearner',
'VowpalArgsLearner',
'VowpalEpsilonLearner',
'VowpalSoftmaxLearner',
'VowpalBagLearner',
'VowpalCoverLearner',
'VowpalRegcbLearner',
'VowpalSquarecbLearner',
'VowpalOffPolicyLearner',
'VowpalMediator'
]

# Source: backend/resource_files_sample.py (Bhaskers-Blu-Org1/multicloud-incident-response-navigator, Apache-2.0)
import resource_files
resources = resource_files.ResourceFiles()
# sample use case of getting yamls
print(resources.get_yaml("Pod", "jumpy-shark-gbapp-frontend-844fdccf55-ggkbf", "default", "mycluster"))
# sample use case of getting events
print(resources.get_events('mycluster','default','78abd8c9-ac06-11e9-b68f-0e70a6ce6d3a'))
# sample use case of getting logs
print(resources.get_logs('mycluster', 'default', "jumpy-shark-gbapp-frontend-844fdccf55-ggkbf"))
# Source: tests/QuantumToolboxIntegration/test_singleQubitOpenDynamics.py (AngsarM/QuanGuru, BSD-3-Clause)
import random as rn
import numpy as np
# open system dynamics of a qubit and compare numerical results with the analytical calculations
# NOTE these are also TUTORIALS of the library, so see the Tutorials for what these are doing and analytical
# calculations.
# currently includes 2 cases: (i) decay only, and (ii) unitary evolution by calling Liouville method without giving
# any collapse operators. For now, only looks at excited state populations
# TODO this is an unfinished test. below two tests are the same and it actually is not testing open system dynamics.
decayRateSM = rn.random()
excitedPopulation = lambda t: 0.5*np.exp(-(0.00001*(decayRateSM+1)*2+1j)*50*t)
populations = {'excitedAnalytical':[], 'excitedNumerical':[]}
# this is used as the calculate attribute of the qubit, and the singleQubit fixture evolve method calls this at every
# step of the evolution. It stores both numerical and analytical excited state populations into the dictionary above.
def singleQubitDecayCalculate(qub, state, i):
populations['excitedAnalytical'].append(excitedPopulation(i*qub.stepSize))
populations['excitedNumerical'].append(state[0, 0])
def test_qubitUnitaryEvolutionFromLiouville(singleQubit):
for k in populations:
populations[k] = []
singleQubit.evolutionMethod = singleQubit.openEvolution
singleQubit.calculate = singleQubitDecayCalculate
singleQubit.evolve()
assert singleQubit.stepCount == len(populations['excitedNumerical'])
def test_qubitDecay(singleQubit):
for k in populations:
populations[k] = []
singleQubit.evolutionMethod = singleQubit.openEvolution
singleQubit.calculate = singleQubitDecayCalculate
singleQubit.evolve()
assert singleQubit.stepCount == len(populations['excitedNumerical'])
| 45.769231 | 117 | 0.773109 | 217 | 1,785 | 6.35023 | 0.511521 | 0.010885 | 0.026125 | 0.024673 | 0.301887 | 0.301887 | 0.301887 | 0.301887 | 0.301887 | 0.301887 | 0 | 0.01054 | 0.14958 | 1,785 | 38 | 118 | 46.973684 | 0.897233 | 0.419608 | 0 | 0.545455 | 0 | 0 | 0.095424 | 0 | 0 | 0 | 0 | 0.026316 | 0.090909 | 1 | 0.136364 | false | 0 | 0.090909 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
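The decay case in the test file above compares a numerically evolved state against a closed-form excited-state population. A minimal standalone sketch of that kind of comparison (plain Python, no QuanGuru fixtures; the rate, initial population, and time grid here are made up for illustration):

```python
import math

g = 0.5   # assumed decay rate
p0 = 0.5  # assumed initial excited-state population
dt = 0.1  # assumed step size
steps = 10

# Forward-Euler integration of dp/dt = -g p stands in for the "numerical" result;
# the closed form p(t) = p0 * exp(-g t) is the "analytical" one.
p_num = p0
max_err = 0.0
for n in range(1, steps + 1):
    p_num *= (1.0 - g * dt)                 # one Euler step
    p_ana = p0 * math.exp(-g * n * dt)      # closed-form population at t = n*dt
    max_err = max(max_err, abs(p_num - p_ana))

assert max_err < 0.02  # first-order agreement on this grid
```

The assertion plays the role of the `populations` bookkeeping in the test: both curves are accumulated step by step and then compared.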
53b7d55368f6a08688dd3db11b258ac91759ec48 | 2,447 | py | Python | asv_bench/benchmarks/algorithms.py | raspbian-packages/pandas | fb33806b5286deb327b2e0fa96aedf25a6ed563f | [
"PSF-2.0",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"BSD-3-Clause"
] | null | null | null | asv_bench/benchmarks/algorithms.py | raspbian-packages/pandas | fb33806b5286deb327b2e0fa96aedf25a6ed563f | [
"PSF-2.0",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"BSD-3-Clause"
] | null | null | null | asv_bench/benchmarks/algorithms.py | raspbian-packages/pandas | fb33806b5286deb327b2e0fa96aedf25a6ed563f | [
"PSF-2.0",
"Apache-2.0",
"BSD-2-Clause",
"MIT",
"BSD-3-Clause"
] | null | null | null | import numpy as np
import pandas as pd
from pandas.util import testing as tm
class algorithm(object):
goal_time = 0.2
def setup(self):
N = 100000
self.int_unique = pd.Int64Index(np.arange(N * 5))
# cache is_unique
self.int_unique.is_unique
self.int = pd.Int64Index(np.arange(N).repeat(5))
self.float = pd.Float64Index(np.random.randn(N).repeat(5))
# Convenience naming.
self.checked_add = pd.core.nanops._checked_add_with_arr
self.arr = np.arange(1000000)
self.arrpos = np.arange(1000000)
self.arrneg = np.arange(-1000000, 0)
self.arrmixed = np.array([1, -1]).repeat(500000)
def time_int_factorize(self):
self.int.factorize()
    def time_float_factorize(self):
        self.float.factorize()
def time_int_unique_duplicated(self):
self.int_unique.duplicated()
def time_int_duplicated(self):
self.int.duplicated()
def time_float_duplicated(self):
self.float.duplicated()
def time_add_overflow_pos_scalar(self):
self.checked_add(self.arr, 1)
def time_add_overflow_neg_scalar(self):
self.checked_add(self.arr, -1)
def time_add_overflow_zero_scalar(self):
self.checked_add(self.arr, 0)
def time_add_overflow_pos_arr(self):
self.checked_add(self.arr, self.arrpos)
def time_add_overflow_neg_arr(self):
self.checked_add(self.arr, self.arrneg)
def time_add_overflow_mixed_arr(self):
self.checked_add(self.arr, self.arrmixed)
class hashing(object):
goal_time = 0.2
def setup(self):
N = 100000
self.df = pd.DataFrame(
{'A': pd.Series(tm.makeStringIndex(100).take(
np.random.randint(0, 100, size=N))),
'B': pd.Series(tm.makeStringIndex(10000).take(
np.random.randint(0, 10000, size=N))),
'D': np.random.randn(N),
'E': np.arange(N),
'F': pd.date_range('20110101', freq='s', periods=N),
'G': pd.timedelta_range('1 day', freq='s', periods=N),
})
self.df['C'] = self.df['B'].astype('category')
self.df.iloc[10:20] = np.nan
def time_frame(self):
self.df.hash()
def time_series_int(self):
self.df.E.hash()
def time_series_string(self):
self.df.B.hash()
def time_series_categorical(self):
self.df.C.hash()
| 26.89011 | 67 | 0.612178 | 343 | 2,447 | 4.186589 | 0.268222 | 0.07312 | 0.068245 | 0.075209 | 0.365599 | 0.262535 | 0.262535 | 0.190808 | 0.123955 | 0.123955 | 0 | 0.048874 | 0.255823 | 2,447 | 90 | 68 | 27.188889 | 0.739703 | 0.014303 | 0 | 0.129032 | 0 | 0 | 0.012868 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.274194 | false | 0 | 0.048387 | 0 | 0.387097 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
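The overflow benchmarks above time `pd.core.nanops._checked_add_with_arr`, a pandas-internal helper. As an illustrative sketch only (plain Python, not the pandas implementation), the semantics being timed are roughly "add, but raise instead of silently wrapping past the int64 range":

```python
INT64_MAX = 2**63 - 1
INT64_MIN = -2**63

def checked_add(values, b):
    """Add scalar b to each value, raising on int64 overflow (sketch)."""
    out = []
    for a in values:
        s = a + b  # Python ints are unbounded, so check the int64 range explicitly
        if not (INT64_MIN <= s <= INT64_MAX):
            raise OverflowError(f"int64 overflow adding {b} to {a}")
        out.append(s)
    return out

assert checked_add([1, 2, 3], 1) == [2, 3, 4]
try:
    checked_add([INT64_MAX], 1)
except OverflowError:
    pass  # expected: the checked variant refuses to wrap around
```

The real helper operates on NumPy arrays vectorized, which is why the benchmark distinguishes scalar, positive-array, negative-array, and mixed-array cases.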
53c4401601b96a14bafd9a44d9c96d488de53fcf | 7,279 | py | Python | vitrage/datasources/static/driver.py | HoonMinJeongUm/Hunmin-vitrage | 37d43d6b78e8b76fa6a2e83e5c739e9e4917a7b6 | [
"Apache-2.0"
] | null | null | null | vitrage/datasources/static/driver.py | HoonMinJeongUm/Hunmin-vitrage | 37d43d6b78e8b76fa6a2e83e5c739e9e4917a7b6 | [
"Apache-2.0"
] | null | null | null | vitrage/datasources/static/driver.py | HoonMinJeongUm/Hunmin-vitrage | 37d43d6b78e8b76fa6a2e83e5c739e9e4917a7b6 | [
"Apache-2.0"
] | null | null | null | # Copyright 2016 - Nokia, ZTE
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from itertools import chain
from six.moves import reduce
from oslo_log import log
from vitrage.common.constants import DatasourceProperties as DSProps
from vitrage.common.constants import GraphAction
from vitrage.datasources.driver_base import DriverBase
from vitrage.datasources.static import STATIC_DATASOURCE
from vitrage.datasources.static import StaticFields
from vitrage.utils import file as file_utils
LOG = log.getLogger(__name__)
class StaticDriver(DriverBase):
# base fields are required for all entities, others are treated as metadata
BASE_FIELDS = {StaticFields.STATIC_ID,
StaticFields.TYPE,
StaticFields.ID}
def __init__(self, conf):
super(StaticDriver, self).__init__()
self.cfg = conf
self.entities_cache = []
@staticmethod
def _is_valid_config(config):
"""check for validity of configuration"""
# TODO(yujunz) check with yaml schema or reuse template validation
return StaticFields.DEFINITIONS in config
@staticmethod
def get_event_types():
return []
def enrich_event(self, event, event_type):
pass
def get_all(self, datasource_action):
return self.make_pickleable(self._get_and_cache_all_entities(),
STATIC_DATASOURCE,
datasource_action)
def get_changes(self, datasource_action):
return self.make_pickleable(self._get_and_cache_changed_entities(),
STATIC_DATASOURCE,
datasource_action)
def _get_and_cache_all_entities(self):
self.entities_cache = self._get_all_entities()
return self.entities_cache
def _get_all_entities(self):
files = file_utils.list_files(self.cfg.static.directory, '.yaml', True)
return list(reduce(chain, [self._get_entities_from_file(path)
for path in files], []))
def _get_and_cache_changed_entities(self):
changed_entities = []
new_entities = self._get_all_entities()
for new_entity in new_entities:
old_entity = self._find_entity(new_entity, self.entities_cache)
if old_entity:
# Add modified entities
if not self._equal_entities(old_entity, new_entity):
changed_entities.append(new_entity.copy())
else:
# Add new entities
changed_entities.append(new_entity.copy())
# Add deleted entities
for old_entity in self.entities_cache:
if not self._find_entity(old_entity, new_entities):
old_entity_copy = old_entity.copy()
old_entity_copy[DSProps.EVENT_TYPE] = GraphAction.DELETE_ENTITY
changed_entities.append(old_entity_copy)
self.entities_cache = new_entities
return changed_entities
@classmethod
def _get_entities_from_file(cls, path):
config = file_utils.load_yaml_file(path)
if not cls._is_valid_config(config):
LOG.warning("Skipped invalid config (possible obsoleted): {}"
.format(path))
return []
definitions = config[StaticFields.DEFINITIONS]
entities = definitions[StaticFields.ENTITIES]
relationships = definitions[StaticFields.RELATIONSHIPS]
return cls._pack(entities, relationships)
@classmethod
def _pack(cls, entities, relationships):
entities_dict = {}
for entity in entities:
cls._pack_entity(entities_dict, entity)
for rel in relationships:
cls._pack_rel(entities_dict, rel)
return entities_dict.values()
@classmethod
def _pack_entity(cls, entities_dict, entity):
static_id = entity[StaticFields.STATIC_ID]
if static_id not in entities_dict:
metadata = {key: value for key, value in entity.items()
if key not in cls.BASE_FIELDS}
entities_dict[static_id] = entity
entity[StaticFields.RELATIONSHIPS] = []
entity[StaticFields.METADATA] = metadata
else:
LOG.warning("Skipped duplicated entity: {}".format(entity))
@classmethod
def _pack_rel(cls, entities_dict, rel):
source_id = rel[StaticFields.SOURCE]
target_id = rel[StaticFields.TARGET]
if source_id == target_id:
            # self-pointing relationship
entities_dict[source_id][StaticFields.RELATIONSHIPS].append(rel)
else:
source, target = entities_dict[source_id], entities_dict[target_id]
source[StaticFields.RELATIONSHIPS].append(
cls._expand_neighbor(rel, target))
@staticmethod
def _expand_neighbor(rel, neighbor):
"""Expand config id to neighbor entity
rel={'source': 's1', 'target': 'r1', 'relationship_type': 'attached'}
neighbor={'static_id': 'h1', 'vitrage_type': 'host.nova', 'id': 1}
result={'relationship_type': 'attached', 'source': 's1',
'target': {'static_id': 'h1',
'vitrage_type': 'host.nova',
'id': 1}}
"""
rel = rel.copy()
if rel[StaticFields.SOURCE] == neighbor[StaticFields.STATIC_ID]:
rel[StaticFields.SOURCE] = neighbor
elif rel[StaticFields.TARGET] == neighbor[StaticFields.STATIC_ID]:
rel[StaticFields.TARGET] = neighbor
else:
# TODO(yujunz) raise exception and ignore invalid relationship
LOG.error("Invalid neighbor {} for relationship {}"
.format(neighbor, rel))
return None
return rel
@staticmethod
def _find_entity(search_entity, entities):
# naive implementation since we don't expect many static entities
for entity in entities:
if entity[StaticFields.TYPE] == search_entity[StaticFields.TYPE] \
and entity[StaticFields.ID] == \
search_entity[StaticFields.ID]:
return entity
@staticmethod
def _equal_entities(old_entity, new_entity):
# TODO(iafek): compare also the relationships
return old_entity.get(StaticFields.TYPE) == \
new_entity.get(StaticFields.TYPE) and \
old_entity.get(StaticFields.ID) == \
new_entity.get(StaticFields.ID) and \
old_entity.get(StaticFields.NAME) == \
new_entity.get(StaticFields.NAME) and \
old_entity.get(StaticFields.STATE) == \
new_entity.get(StaticFields.STATE)
| 37.911458 | 79 | 0.637588 | 810 | 7,279 | 5.504938 | 0.245679 | 0.028257 | 0.037677 | 0.021529 | 0.17874 | 0.118636 | 0.061449 | 0.040816 | 0.040816 | 0.026463 | 0 | 0.002862 | 0.279846 | 7,279 | 191 | 80 | 38.109948 | 0.847768 | 0.18258 | 0 | 0.181102 | 0 | 0 | 0.020474 | 0 | 0 | 0 | 0 | 0.010471 | 0 | 1 | 0.125984 | false | 0.007874 | 0.070866 | 0.031496 | 0.322835 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
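The `_pack*` methods in the driver above turn flat entity and relationship lists into a dict of entities whose relationship entries embed the fully expanded neighbor, as described in the `_expand_neighbor` docstring. A toy standalone rendition of that expansion (illustrative names and plain dicts, not the vitrage API):

```python
# Toy entities/relationships mimicking the YAML definitions the driver reads
entities = [
    {'static_id': 's1', 'type': 'switch', 'id': 1},
    {'static_id': 'h1', 'type': 'host.nova', 'id': 2},
]
relationships = [
    {'source': 's1', 'target': 'h1', 'relationship_type': 'attached'},
]

# Index entities by static_id and give each one an empty relationship list
entities_dict = {e['static_id']: dict(e, relationships=[]) for e in entities}

for rel in relationships:
    # Expand the target config id into the full neighbor entity
    expanded = dict(rel, target=entities_dict[rel['target']])
    entities_dict[rel['source']]['relationships'].append(expanded)

# The switch's relationship now carries the whole host entity, not just 'h1'
assert entities_dict['s1']['relationships'][0]['target']['id'] == 2
```

The real driver additionally handles self-pointing relationships, duplicate entities, and deletion detection against a cache, which this sketch omits.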
53c79195c421ab20eafd11d18287a51c1a99fb79 | 779 | py | Python | python_minecraft_tut_2021/weatherCraft.py | LeGamermc/ursina_tutorials | f0ad518be3a02cdb52f27c87f2f70817b4d0e8b0 | [
"MIT"
] | 13 | 2021-09-01T01:38:13.000Z | 2022-03-29T01:43:50.000Z | python_minecraft_tut_2021/weatherCraft.py | LeGamermc/ursina_tutorials | f0ad518be3a02cdb52f27c87f2f70817b4d0e8b0 | [
"MIT"
] | 14 | 2021-08-01T05:00:22.000Z | 2022-02-03T21:53:23.000Z | python_minecraft_tut_2021/weatherCraft.py | LeGamermc/ursina_tutorials | f0ad518be3a02cdb52f27c87f2f70817b4d0e8b0 | [
"MIT"
] | 31 | 2021-08-09T04:08:11.000Z | 2022-03-23T11:06:15.000Z | """
Weather functions.
"""
from ursina import color, window, time
from nMap import nMap
class Weather:
def __init__(this, rate=1):
this.red = 0
this.green = 200
this.blue = 211
this.darkling = 0
this.rate = rate
this.towardsNight = 1
def setSky(this):
r = nMap(this.darkling,0,100,0,this.red)
g = nMap(this.darkling,0,100,0,this.green)
b = nMap(this.darkling,0,100,0,this.blue)
window.color = color.rgb(r,g,b)
def update(this):
this.darkling -= ( this.rate *
this.towardsNight *
time.dt)
if this.darkling < 0:
this.towardsNight *= -1
this.darkling = 0
this.setSky()
| 22.911765 | 50 | 0.519897 | 97 | 779 | 4.134021 | 0.329897 | 0.087282 | 0.194514 | 0.127182 | 0.187032 | 0.187032 | 0.187032 | 0 | 0 | 0 | 0 | 0.05668 | 0.365854 | 779 | 33 | 51 | 23.606061 | 0.755061 | 0.023107 | 0 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.130435 | false | 0 | 0.086957 | 0 | 0.26087 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
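`nMap` in the file above is imported from a sibling module not shown here. A common implementation of such a range-mapping helper (a sketch under that assumption, not the tutorial's actual code) is a plain linear remap, which is exactly what `setSky` needs to scale `darkling` from 0..100 onto each colour channel:

```python
def n_map(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap value from [in_lo, in_hi] onto [out_lo, out_hi]."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# e.g. darkling = 50 on a 0..100 scale maps to half of a 0..200 green channel
assert n_map(50, 0, 100, 0, 200) == 100.0
```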
53d3daf836c3d211bfbd295aeb46edb04453a89a | 1,350 | py | Python | pyConTextNLP/__init__.py | Blulab-Utah/pyConTextPipeline | d4060f89d54f4db56914832033f8ce589ee3c181 | [
"Apache-2.0"
] | 1 | 2021-04-30T11:18:32.000Z | 2021-04-30T11:18:32.000Z | pyConTextNLP/__init__.py | Blulab-Utah/pyConTextPipeline | d4060f89d54f4db56914832033f8ce589ee3c181 | [
"Apache-2.0"
] | null | null | null | pyConTextNLP/__init__.py | Blulab-Utah/pyConTextPipeline | d4060f89d54f4db56914832033f8ce589ee3c181 | [
"Apache-2.0"
] | 1 | 2020-06-28T01:51:56.000Z | 2020-06-28T01:51:56.000Z | #Copyright 2010 Brian E. Chapman
#
#Licensed under the Apache License, Version 2.0 (the "License");
#you may not use this file except in compliance with the License.
#You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
#Unless required by applicable law or agreed to in writing, software
#distributed under the License is distributed on an "AS IS" BASIS,
#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#See the License for the specific language governing permissions and
#limitations under the License.
"""This is an alternative implementation of the pyConText package where I make
use of graphs to indicate relationships between targets and modifiers. Nodes of
the graphs are the targets and modifiers identified in the text; edges of the
graphs are relationships between the targets. This provides for much simpler
code than what exists in the other version of pyConText where each object has a
dictionary of __modifies and __modifiedby that must be kept in sync with each
other.
Also it is hoped that the use of a directional graph could ultimately simplify
our itemData structures as we could chain together items"""
import os
version = {}
with open(os.path.join(os.path.dirname(__file__),"version.py")) as f0:
exec(f0.read(), version)
__version__ = version['__version__']
| 43.548387 | 79 | 0.786667 | 214 | 1,350 | 4.88785 | 0.593458 | 0.057361 | 0.024857 | 0.030593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00878 | 0.156296 | 1,350 | 30 | 80 | 45 | 0.90957 | 0.856296 | 0 | 0 | 0 | 0 | 0.119318 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
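The `exec`-based version loading at the bottom of the `__init__.py` above expects a sibling `version.py` that defines `__version__`. The same pattern in isolation, with a file-like stand-in instead of the real file (the version string here is made up):

```python
import io

version = {}
# Stand-in for open(.../"version.py"): a file-like object with the expected contents
fake_version_py = io.StringIO('__version__ = "0.0.0"\n')
# exec() with a dict as globals populates that dict with the assignments it runs
exec(fake_version_py.read(), version)

assert version['__version__'] == "0.0.0"
```

Executing the assignment into a dedicated dict keeps the package namespace clean; only `__version__` is then copied out.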
53d42695123c2326facf4f279256b1c384089fd3 | 78,742 | py | Python | pypeit/metadata.py | rcooke-ast/PYPIT | 0cb9c4cb422736b855065a35aefc2bdba6d51dd0 | [
"BSD-3-Clause"
] | null | null | null | pypeit/metadata.py | rcooke-ast/PYPIT | 0cb9c4cb422736b855065a35aefc2bdba6d51dd0 | [
"BSD-3-Clause"
] | null | null | null | pypeit/metadata.py | rcooke-ast/PYPIT | 0cb9c4cb422736b855065a35aefc2bdba6d51dd0 | [
"BSD-3-Clause"
] | null | null | null | """
Provides a class that handles the fits metadata required by PypeIt.
.. include common links, assuming primary doc root is up one directory
.. include:: ../include/links.rst
"""
import os
import io
import string
from copy import deepcopy
import datetime
from IPython import embed
import numpy as np
import yaml
from astropy import table, coordinates, time, units
from pypeit import msgs
from pypeit import utils
from pypeit.core import framematch
from pypeit.core import flux_calib
from pypeit.core import parse
from pypeit.core import meta
from pypeit.io import dict_to_lines
from pypeit.par import PypeItPar
from pypeit.par.util import make_pypeit_file
from pypeit.bitmask import BitMask
# TODO: Turn this into a DataContainer
# Initially tried to subclass this from astropy.table.Table, but that
# proved too difficult.
class PypeItMetaData:
"""
Provides a table and interface to the relevant fits file metadata
used during the reduction.
The content of the fits table is dictated by the header keywords
specified for the provided spectrograph. It is expected that this
table can be used to set the frame type of each file.
The metadata is validated using checks specified by the provided
spectrograph class.
For the data table, one should typically provide either the file
list from which to grab the data from the fits headers or the
data directly. If neither are provided the table is instantiated
without any data.
Args:
spectrograph (:class:`pypeit.spectrographs.spectrograph.Spectrograph`):
The spectrograph used to collect the data save to each file.
The class is used to provide the header keyword data to
include in the table and specify any validation checks.
par (:obj:`pypeit.par.pypeitpar.PypeItPar`):
PypeIt parameters used to set the code behavior.
files (:obj:`str`, :obj:`list`, optional):
The list of files to include in the table.
data (table-like, optional):
The data to include in the table. The type can be anything
allowed by the instantiation of
:class:`astropy.table.Table`.
usrdata (:obj:`astropy.table.Table`, optional):
A user provided set of data used to supplement or overwrite
metadata read from the file headers. The table must have a
`filename` column that is used to match to the metadata
table generated within PypeIt. **Note**: This is ignored if
`data` is also provided. This functionality is only used
when building the metadata from the fits files.
strict (:obj:`bool`, optional):
Function will fault if there is a problem with the reading
the header for any of the provided files; see
:func:`pypeit.spectrographs.spectrograph.get_headarr`. Set
to False to instead report a warning and continue.
Attributes:
spectrograph
(:class:`pypeit.spectrographs.spectrograph.Spectrograph`):
The spectrograph used to collect the data save to each file.
The class is used to provide the header keyword data to
include in the table and specify any validation checks.
par (:class:`pypeit.par.pypeitpar.PypeItPar`):
PypeIt parameters used to set the code behavior. If not
provided, the default parameters specific to the provided
spectrograph are used.
configs (:obj:`dict`):
A dictionary of the unique configurations identified.
type_bitmask (:class:`pypeit.core.framematch.FrameTypeBitMask`):
The bitmask used to set the frame type of each fits file.
calib_bitmask (:class:`BitMask`):
The bitmask used to keep track of the calibration group bits.
table (:class:`astropy.table.Table`):
The table with the relevant metadata for each fits file to
use in the data reduction.
"""
def __init__(self, spectrograph, par, files=None, data=None, usrdata=None,
strict=True):
if data is None and files is None:
# Warn that table will be empty
msgs.warn('Both data and files are None in the instantiation of PypeItMetaData.'
' The table will be empty!')
# Initialize internals
self.spectrograph = spectrograph
self.par = par
if not isinstance(self.par, PypeItPar):
raise TypeError('Input parameter set must be of type PypeItPar.')
self.type_bitmask = framematch.FrameTypeBitMask()
# Build table
self.table = table.Table(data if files is None
else self._build(files, strict=strict,
usrdata=usrdata))
# Merge with user data, if present
if usrdata is not None:
self.merge(usrdata)
# Impose types on specific columns
self._impose_types(['comb_id', 'bkg_id', 'manual'], [int, int, str])
# Initialize internal attributes
self.configs = None
self.calib_bitmask = None
# Initialize columns that the user might add
self.set_user_added_columns()
# Validate instrument name
self.spectrograph.vet_instrument(self.table)
def _impose_types(self, columns, types):
"""
Impose a set of types on certain columns.
.. note::
:attr:`table` is edited in place.
Args:
columns (:obj:`list`):
List of column names
types (:obj:`list`):
List of types
"""
for c,t in zip(columns, types):
if c in self.keys():
self.table[c] = self.table[c].astype(t)
def _build(self, files, strict=True, usrdata=None):
"""
Generate the fitstbl that will be at the heart of PypeItMetaData.
Args:
files (:obj:`str`, :obj:`list`):
One or more files to use to build the table.
strict (:obj:`bool`, optional):
Function will fault if :func:`fits.getheader` fails to
read any of the headers. Set to False to report a
warning and continue.
usrdata (astropy.table.Table, optional):
Parsed for frametype for a few instruments (e.g. VLT)
where meta data may not be required
Returns:
dict: Dictionary with the data to assign to :attr:`table`.
"""
# Allow for single files
_files = files if hasattr(files, '__len__') else [files]
# Build lists to fill
data = {k:[] for k in self.spectrograph.meta.keys()}
data['directory'] = ['None']*len(_files)
data['filename'] = ['None']*len(_files)
# Build the table
for idx, ifile in enumerate(_files):
# User data (for frame type)
if usrdata is None:
usr_row = None
else:
# TODO: This check should be done elsewhere
# Check
if os.path.basename(ifile) != usrdata['filename'][idx]:
msgs.error('File name list does not match user-provided metadata table. See '
'usrdata argument of instantiation of PypeItMetaData.')
usr_row = usrdata[idx]
# Add the directory and file name to the table
data['directory'][idx], data['filename'][idx] = os.path.split(ifile)
if not data['directory'][idx]:
data['directory'][idx] = '.'
# Read the fits headers
headarr = self.spectrograph.get_headarr(ifile, strict=strict)
# Grab Meta
for meta_key in self.spectrograph.meta.keys():
value = self.spectrograph.get_meta_value(headarr, meta_key,
required=strict,
usr_row=usr_row,
ignore_bad_header = self.par['rdx']['ignore_bad_headers'])
if isinstance(value, str) and '#' in value:
value = value.replace('#', '')
msgs.warn('Removing troublesome # character from {0}. Returning {1}.'.format(
meta_key, value))
data[meta_key].append(value)
msgs.info('Added metadata for {0}'.format(os.path.split(ifile)[1]))
# JFH Changed the below to not crash if some files have None in
# their MJD. This is the desired behavior since if there are
# empty or corrupt files we still want this to run.
# Validate, print out a warning if there is problem
try:
time.Time(data['mjd'], format='mjd')
except ValueError:
mjd = np.asarray(data['mjd'])
filenames = np.asarray(data['filename'])
bad_files = filenames[mjd == None]
# Print status message
msg = 'Time invalid for {0} files.\n'.format(len(bad_files))
msg += 'Continuing, but the following frames may be empty or have corrupt headers:\n'
for file in bad_files:
msg += ' {0}\n'.format(file)
msgs.warn(msg)
# Return
return data
# TODO: In this implementation, slicing the PypeItMetaData object
# will return an astropy.table.Table, not a PypeItMetaData object.
def __getitem__(self, item):
return self.table.__getitem__(item)
def __setitem__(self, item, value):
return self.table.__setitem__(item, value)
def __len__(self):
return self.table.__len__()
def __repr__(self):
return self.table._base_repr_(html=False,
descr_vals=['PypeItMetaData:\n',
' spectrograph={0}\n'.format(
self.spectrograph.name),
' length={0}\n'.format(len(self))])
def _repr_html_(self):
return self.table._base_repr_(html=True, max_width=-1,
descr_vals=['PypeItMetaData: spectrograph={0}, length={1}\n'.format(
self.spectrograph.name, len(self))])
@staticmethod
def default_keys():
return [ 'directory', 'filename', 'instrume' ]
def keys(self):
return self.table.keys()
def sort(self, col):
return self.table.sort(col)
def merge(self, usrdata, match_type=True):
"""
Use the provided table to supplement or overwrite the metadata.
If the internal table already contains the column in `usrdata`,
the function will try to match the data type of the `usrdata`
column to the existing data type. If it can't it will just add
the column anyway, with the type in `usrdata`. You can avoid
this step by setting `match_type=False`.
Args:
usrdata (:obj:`astropy.table.Table`):
A user provided set of data used to supplement or
overwrite metadata read from the file headers. The
table must have a `filename` column that is used to
match to the metadata table generated within PypeIt.
match_type (:obj:`bool`, optional):
Attempt to match the data type in `usrdata` to the type
in the internal table. See above.
Raises:
TypeError:
Raised if `usrdata` is not an `astropy.io.table.Table`
KeyError:
Raised if `filename` is not a key in the provided table.
"""
meta_data_model = meta.get_meta_data_model()
# Check the input
if not isinstance(usrdata, table.Table):
raise TypeError('Must provide an astropy.io.table.Table instance.')
if 'filename' not in usrdata.keys():
raise KeyError('The user-provided table must have \'filename\' column!')
# Make sure the data are correctly ordered
srt = [np.where(f == self.table['filename'])[0][0] for f in usrdata['filename']]
# Convert types if possible
existing_keys = list(set(self.table.keys()) & set(usrdata.keys()))
radec_done = False
if len(existing_keys) > 0 and match_type:
for key in existing_keys:
if len(self.table[key].shape) > 1: # NOT ALLOWED!!
# TODO: This should be converted to an assert statement...
raise ValueError('CODING ERROR: Found high-dimensional column.')
#embed(header='372 of metadata')
elif key in meta_data_model.keys(): # Is this meta data??
dtype = meta_data_model[key]['dtype']
else:
dtype = self.table[key].dtype
# Deal with None's properly
nones = usrdata[key] == 'None'
usrdata[key][nones] = None
# Rest
                # Allow for str RA, DEC (backwards compatibility)
if key in ['ra', 'dec'] and not radec_done:
ras, decs = meta.convert_radec(usrdata['ra'][~nones].data,
usrdata['dec'][~nones].data)
usrdata['ra'][~nones] = ras.astype(dtype)
usrdata['dec'][~nones] = decs.astype(dtype)
radec_done = True
else:
usrdata[key][~nones] = usrdata[key][~nones].astype(dtype)
# Include the user data in the table
for key in usrdata.keys():
self.table[key] = usrdata[key][srt]
def finalize_usr_build(self, frametype, setup):
"""
Finalize the build of the table based on user-provided data,
typically pulled from the PypeIt file.
This function:
- sets the frame types based on the provided object
- sets all the configurations to the provided `setup`
- assigns all frames to a single calibration group, if the
'calib' column does not exist
- if the 'comb_id' column does not exist, this sets the
combination groups to be either undefined or to be unique
for each science or standard frame, see
:func:`set_combination_groups`.
.. note::
This should only be run if all files are from a single
instrument configuration. :attr:`table` is modified
in-place.
See also: :func:`pypeit.pypeitsetup.PypeItSetup.run`.
.. todo::
- Why isn't frametype just in the user-provided data? It
may be (see get_frame_types) and I'm just not using it...
Args:
frametype (:obj:`dict`):
A dictionary with the types designated by the user. The
file name and type are expected to be the key and value
of the dictionary, respectively. The number of keys
therefore *must* match the number of files in
:attr:`table`. For frames that have multiple types, the
types should be provided as a string with
comma-separated types.
setup (:obj:`str`):
If the 'setup' columns does not exist, fill the
configuration setup columns with this single identifier.
"""
self.get_frame_types(user=frametype)
# TODO: Add in a call to clean_configurations? I didn't add it
# here, because this method is only called for a preconstructed
# pypeit file, which should nominally follow an execution of
# pypeit_setup. If the user edits back in a frame that has an
# invalid key, at least for now the DEIMOS image reader will
# fault.
self.set_configurations(fill=setup)
self.set_calibration_groups(default=True)
self.set_combination_groups()
def get_configuration(self, indx, cfg_keys=None):
"""
Return the configuration dictionary for a given frame.
This is not the same as the backwards compatible "setup"
dictionary.
Args:
indx (:obj:`int`):
The index of the table row to use to construct the
configuration.
cfg_keys (:obj:`list`, optional):
The list of metadata keys to use to construct the
configuration. If None, the `configuration_keys` of
:attr:`spectrograph` is used.
Returns:
dict: A dictionary with the metadata values from the
selected row.
"""
_cfg_keys = self.spectrograph.configuration_keys() if cfg_keys is None else cfg_keys
return {k:self.table[k][indx] for k in _cfg_keys}
def master_key(self, row, det=1):
"""
Construct the master key for the file in the provided row.
The master key is the combination of the configuration, the
calibration group, and the detector. The configuration ID is
the same as included in the configuration column (A, B, C, etc),
the calibration group is the same as the calibration bit number,
and the detector number is provided as an argument and converted
to a zero-filled string with two digits (the maximum number of
detectors is 99).
Using the calibration bit in the keyword allows MasterFrames to
be used with multiple calibration groups.
Args:
row (:obj:`int`):
The 0-indexed row used to construct the key.
det (:obj:`int`, :obj:`tuple`, optional):
The 1-indexed detector number(s). If a tuple, it must include
detectors designated as a viable mosaic for
:attr:`spectrograph`; see
:func:`~pypeit.spectrographs.spectrograph.Spectrograph.allowed_mosaics`.
Returns:
:obj:`str`: Master key with configuration, calibration group(s), and
detector.
Raises:
PypeItError:
Raised if the 'setup' or 'calibbit' columns
haven't been defined.
"""
if 'setup' not in self.keys() or 'calibbit' not in self.keys():
msgs.error('Cannot provide master key string without setup and calibbit; '
'run set_configurations and set_calibration_groups.')
det_name = self.spectrograph.get_det_name(det)
return f"{self['setup'][row]}_{self['calibbit'][row]}_{det_name}"
def construct_obstime(self, row):
"""
Construct the MJD of when the frame was observed.
.. todo::
- Consolidate with :func:`convert_time` ?
Args:
row (:obj:`int`):
The 0-indexed row of the frame.
Returns:
astropy.time.Time: The MJD of the observation.
"""
return time.Time(self['mjd'][row], format='mjd')
def construct_basename(self, row, obstime=None):
"""
Construct the root name primarily for PypeIt file output.
Args:
row (:obj:`int`):
The 0-indexed row of the frame.
obstime (:class:`astropy.time.Time`, optional):
The MJD of the observation. If None, constructed using
:func:`construct_obstime`.
Returns:
str: The root name for file output.
"""
_obstime = self.construct_obstime(row) if obstime is None else obstime
tiso = time.Time(_obstime, format='isot')
dtime = datetime.datetime.strptime(tiso.value, '%Y-%m-%dT%H:%M:%S.%f')
return '{0}-{1}_{2}_{3}{4}'.format(self['filename'][row].split('.fits')[0],
self['target'][row].replace(" ", ""),
self.spectrograph.camera,
datetime.datetime.strftime(dtime, '%Y%m%dT'),
tiso.value.split("T")[1].replace(':',''))
def get_setup(self, row, det=None, config_only=False):
"""
Construct the setup dictionary.
.. todo::
- This is for backwards compatibility, but we should
consider reformatting it. And it may be something to put
in the relevant spectrograph class.
Args:
row (:obj:`int`):
The 0-indexed row used to construct the setup.
det (:obj:`int`, optional):
The 1-indexed detector to include. If None, all
detectors are included.
config_only (:obj:`bool`, optional):
Just return the dictionary with the configuration, don't
include the top-level designation of the configuration
itself.
Returns:
dict: The pypeit setup dictionary with the default format.
Raises:
PypeItError:
                Raised if the 'setup' column hasn't been defined.
"""
if 'setup' not in self.keys():
msgs.error('Cannot provide instrument setup without \'setup\' column; '
'run set_configurations.')
dispname = 'none' if 'dispname' not in self.keys() else self['dispname'][row]
dispangle = 'none' if 'dispangle' not in self.keys() else self['dispangle'][row]
dichroic = 'none' if 'dichroic' not in self.keys() else self['dichroic'][row]
decker = 'none' if 'decker' not in self.keys() else self['decker'][row]
slitwid = 'none' if 'slitwid' not in self.keys() else self['slitwid'][row]
slitlen = 'none' if 'slitlen' not in self.keys() else self['slitlen'][row]
binning = '1,1' if 'binning' not in self.keys() else self['binning'][row]
skey = 'Setup {}'.format(self['setup'][row])
# Key names *must* match configuration_keys() for spectrographs
setup = {skey:
{'--':
{'disperser': {'dispname': dispname, 'dispangle':dispangle},
'dichroic': dichroic,
'slit': {'decker': decker, 'slitwid':slitwid, 'slitlen':slitlen},
'binning': binning, # PypeIt orientation binning of a science image
}
}
}
#_det = np.arange(self.spectrograph.ndet)+1 if det is None else [det]
#for d in _det:
# setup[skey][str(d).zfill(2)] \
# = {'binning': binning, 'det': d,
# 'namp': self.spectrograph.detector[d-1]['numamplifiers']}
return setup[skey] if config_only else setup
def get_configuration_names(self, ignore=None, return_index=False, configs=None):
"""
Get the list of the unique configuration names.
This provides just the list of setup identifiers ('A', 'B',
etc.) and the row index where it first occurs. This is
different from :func:`unique_configurations` because the latter
determines and provides the configurations themselves.
This is mostly a convenience function for the writing routines.
Args:
ignore (:obj:`list`, optional):
Ignore configurations in the provided list.
return_index (:obj:`bool`, optional):
Return row indices with the first occurrence of these
configurations.
configs (:obj:`str`, :obj:`list`, optional):
One or more strings used to select the configurations
to include in the returned objects. If ``'all'``,
pass back all configurations. Otherwise, only return
the configurations matched to this provided string or
list of strings (e.g., ['A','C']).
Returns:
numpy.array: The list of unique setup names. A second
returned object provides the indices of the first occurrence
of these setups, if requested.
Raises:
PypeItError:
Raised if the 'setup' column hasn't been defined.
"""
if 'setup' not in self.keys():
msgs.error('Cannot get setup names; run set_configurations.')
# Unique configurations
setups, indx = np.unique(self['setup'], return_index=True)
if ignore is not None:
# Remove the selected configurations to ignore
rm = np.logical_not(np.isin(setups, ignore))
setups = setups[rm]
indx = indx[rm]
# Restrict
_configs = None if configs is None else np.atleast_1d(configs)
# TODO: Why do we need to specify 'all' here? Can't `configs is
# None` mean that you want all the configurations? Or can we
# make the default 'all'?
if configs is not None and 'all' not in _configs:
use = np.isin(setups, _configs)
setups = setups[use]
indx = indx[use]
return (setups, indx) if return_index else setups
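The conditional return here is easy to get wrong: without parentheses, `return a, b if cond else a` parses as `return (a, (b if cond else a))` and always yields a 2-tuple. A minimal sketch of the intended behavior, using plain lists:

```python
def pick(items, return_index=False):
    # Unique sorted values, with the index of each value's first occurrence
    uniq = sorted(set(items))
    indx = [items.index(u) for u in uniq]
    # Parenthesize the tuple: `return uniq, indx if return_index else uniq`
    # would parse as `return (uniq, (indx if return_index else uniq))`
    return (uniq, indx) if return_index else uniq
```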
def _get_cfgs(self, copy=False, rm_none=False):
"""
Convenience method to return :attr:`configs` with possible
alterations.
This method *should not* be called by any method outside of
this class; use :func:`unique_configurations` instead.
Args:
copy (:obj:`bool`, optional):
Return a deep copy of :attr:`configs` instead of the
object itself.
rm_none (:obj:`bool`, optional):
Remove any configurations set to 'None'. If copy is
True, this is done *after* :attr:`configs` is copied
to a new dictionary.
Returns:
:obj:`dict`: A nested dictionary, one dictionary per
configuration with the associated metadata for each.
"""
_cfg = deepcopy(self.configs) if copy else self.configs
if rm_none and 'None' in _cfg.keys():
del _cfg['None']
return _cfg
def unique_configurations(self, force=False, copy=False, rm_none=False):
"""
Return the unique instrument configurations.
If run before the ``'setup'`` column is initialized, this function
determines the unique instrument configurations by finding
unique combinations of the items in the metadata table listed by
the spectrograph ``configuration_keys`` method.
If run after the ``'setup'`` column has been set, this simply
constructs the configuration dictionary using the unique
configurations in that column.
This is used to set the internal :attr:`configs`. If this
attribute is not None, this function simply returns
:attr:`configs` (cf. ``force``).
.. warning::
Any frame types returned by the
:func:`~pypeit.spectrographs.spectrograph.Spectrograph.config_independent_frames`
method for :attr:`spectrograph` will be ignored in the
construction of the unique configurations. If
:func:`~pypeit.spectrographs.spectrograph.Spectrograph.config_independent_frames`
does not return None and the frame types have not yet
been defined (see :func:`get_frame_types`), this method
will fault!
Args:
force (:obj:`bool`, optional):
Force the configurations to be redetermined. Otherwise
the configurations are only determined if
:attr:`configs` has not yet been defined.
copy (:obj:`bool`, optional):
Return a deep copy of :attr:`configs` instead of the
object itself.
rm_none (:obj:`bool`, optional):
Remove any configurations set to 'None'. If copy is
True, this is done *after* :attr:`configs` is copied
to a new dictionary.
Returns:
:obj:`dict`: A nested dictionary, one dictionary per
configuration with the associated metadata for each.
Raises:
PypeItError:
Raised if there is a list of frame types to ignore but
the frame types have not yet been defined.
"""
if self.configs is not None and not force:
return self._get_cfgs(copy=copy, rm_none=rm_none)
if 'setup' in self.keys():
msgs.info('Setup column already set. Finding unique configurations.')
uniq, indx = np.unique(self['setup'], return_index=True)
ignore = uniq == 'None'
if np.sum(ignore) > 0:
msgs.warn('Ignoring {0} frames with configuration set to None.'.format(
np.sum(ignore)))
self.configs = {}
for i in range(len(uniq)):
if ignore[i]:
continue
self.configs[uniq[i]] = self.get_configuration(indx[i])
msgs.info('Found {0} unique configurations.'.format(len(self.configs)))
return self._get_cfgs(copy=copy, rm_none=rm_none)
msgs.info('Using metadata to determine unique configurations.')
# If the frame types have been set, ignore anything listed in
# the ignore_frames
indx = np.arange(len(self))
ignore_frames = self.spectrograph.config_independent_frames()
if ignore_frames is not None:
if 'frametype' not in self.keys():
msgs.error('To ignore frames, types must have been defined; run get_frame_types.')
ignore_frames = list(ignore_frames.keys())
msgs.info('Unique configurations ignore frames with type: {0}'.format(ignore_frames))
use = np.ones(len(self), dtype=bool)
for ftype in ignore_frames:
use &= np.logical_not(self.find_frames(ftype))
indx = indx[use]
if len(indx) == 0:
msgs.error('No frames to use to define configurations!')
# Get the list of keys to use
cfg_keys = self.spectrograph.configuration_keys()
# Configuration identifiers are iterations through the
# upper-case letters: A, B, C, etc.
double_alphabet = [str_i + str_j for str_i in string.ascii_uppercase for str_j in string.ascii_uppercase]
cfg_iter = list(string.ascii_uppercase) + double_alphabet
cfg_indx = 0
# TODO: Placeholder: Allow an empty set of configuration keys
# meaning that the instrument setup has only one configuration.
if len(cfg_keys) == 0:
self.configs = {}
self.configs[cfg_iter[cfg_indx]] = {}
msgs.info('All files assumed to be from a single configuration.')
return self._get_cfgs(copy=copy, rm_none=rm_none)
# Use the first file to set the first unique configuration
self.configs = {}
self.configs[cfg_iter[cfg_indx]] = self.get_configuration(indx[0], cfg_keys=cfg_keys)
cfg_indx += 1
# Check if any of the other files show a different
# configuration.
for i in indx[1:]:
j = 0
for c in self.configs.values():
if row_match_config(self.table[i], c, self.spectrograph):
break
j += 1
unique = j == len(self.configs)
if unique:
if cfg_indx == len(cfg_iter):
msgs.error('Cannot assign more than {0} configurations!'.format(len(cfg_iter)))
self.configs[cfg_iter[cfg_indx]] = self.get_configuration(i, cfg_keys=cfg_keys)
cfg_indx += 1
msgs.info('Found {0} unique configurations.'.format(len(self.configs)))
return self._get_cfgs(copy=copy, rm_none=rm_none)
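The identifier sequence above (single letters followed by letter pairs) can be generated equivalently with `itertools.product`; a small self-contained sketch:

```python
import string
from itertools import product

def config_labels():
    # 'A'..'Z', then 'AA'..'ZZ', matching the cfg_iter construction above
    yield from string.ascii_uppercase
    for pair in product(string.ascii_uppercase, repeat=2):
        yield ''.join(pair)

labels = list(config_labels())
# 26 single letters + 26*26 pairs = 702 available identifiers
```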
def set_configurations(self, configs=None, force=False, fill=None):
"""
Assign each frame to a configuration (setup) and include it
in the metadata table.
The internal table is edited *in place*. If the 'setup'
column already exists, the configurations are **not** reset
unless you call the function with ``force=True``.
Args:
configs (:obj:`dict`, optional):
A nested dictionary, one dictionary per configuration
with the associated values of the metadata associated
with each configuration. The metadata keywords in the
dictionary should be the same as in the table, and the
keywords used to set the configuration should be the
same as returned by the spectrograph
`configuration_keys` method. The latter is not checked.
If None, this is set by :func:`unique_configurations`.
force (:obj:`bool`, optional):
Force the configurations to be reset.
fill (:obj:`str`, optional):
If the 'setup' column does not exist, fill the
configuration setup columns with this single identifier.
Ignores other inputs.
Raises:
PypeItError:
Raised if none of the keywords in the provided
configuration match with the metadata keywords. Also
raised when some frames cannot be assigned to a
configuration, the spectrograph defined frames that
have been ignored in the determination of the unique
configurations, but the frame types have not been set
yet.
"""
# Configurations have already been set
if 'setup' in self.keys() and not force:
return
if 'setup' not in self.keys() and fill is not None:
self['setup'] = fill
return
_configs = self.unique_configurations() if configs is None else configs
for k, cfg in _configs.items():
if len(set(cfg.keys()) - set(self.keys())) > 0:
msgs.error('Configuration {0} defined using unavailable keywords!'.format(k))
self.table['setup'] = 'None'
nrows = len(self)
for i in range(nrows):
for d, cfg in _configs.items():
if row_match_config(self.table[i], cfg, self.spectrograph):
self.table['setup'][i] = d
# Check if any of the configurations are not set
not_setup = self.table['setup'] == 'None'
if not np.any(not_setup):
# All are set, so we're done
return
# Some frame types may have been ignored
ignore_frames = self.spectrograph.config_independent_frames()
if ignore_frames is None:
# Nope, we're still done
return
# At this point, we need the frame type to continue
if 'frametype' not in self.keys():
msgs.error('To account for ignored frames, types must have been defined; run '
'get_frame_types.')
# For each configuration, determine if any of the frames with
# the ignored frame types should be assigned to it:
for cfg_key in _configs.keys():
in_cfg = self.table['setup'] == cfg_key
for ftype, metakey in ignore_frames.items():
# TODO: For now, use this assert to check that the
# metakey is either not set or a string
assert metakey is None or isinstance(metakey, str), \
'CODING ERROR: metadata keywords set by config_independent_frames are not ' \
'correctly defined for {0}; values must be None or a string.'.format(
self.spectrograph.__class__.__name__)
# Get the list of frames of this type without a
# configuration
indx = (self.table['setup'] == 'None') & self.find_frames(ftype)
if not np.any(indx):
continue
if metakey is None:
# No matching meta data defined, so just set all
# the frames to this (first) configuration
self.table['setup'][indx] = cfg_key
continue
# Find the unique values of meta for this configuration
uniq_meta = np.unique(self.table[metakey][in_cfg].data)
# Warn the user if the matching meta values are not
# unique for this configuration.
if uniq_meta.size != 1:
msgs.warn('When setting the instrument configuration for {0} '.format(ftype)
+ 'frames, configuration {0} does not have unique '.format(cfg_key)
+ '{0} values.'.format(metakey))
# Find the frames of this type that match any of the
# meta data values
indx &= np.isin(self.table[metakey], uniq_meta)
self.table['setup'][indx] = cfg_key
def clean_configurations(self):
"""
Ensure that configuration-defining keywords all have values
that will yield good PypeIt reductions. Any frames that do
not are removed from :attr:`table`, meaning this method may
modify that attribute directly.
The valid values for configuration keys are set by
:func:`~pypeit.spectrographs.spectrograph.Spectrograph.valid_configuration_values`.
"""
cfg_limits = self.spectrograph.valid_configuration_values()
if cfg_limits is None:
# No values specified, so we're done
return
good = np.ones(len(self), dtype=bool)
for key in cfg_limits.keys():
# NOTE: For now, check that the configuration values were
# correctly assigned in the spectrograph class definition.
# This should probably go somewhere else or just be removed.
assert isinstance(cfg_limits[key], list), \
'CODING ERROR: valid_configuration_values is not correctly defined ' \
'for {0}; values must be a list.'.format(self.spectrograph.__class__.__name__)
# Check that the metadata are valid for this column.
indx = np.isin(self[key], cfg_limits[key])
if not np.all(indx):
msgs.warn('Found frames with invalid {0}.'.format(key))
good &= indx
if np.all(good):
# All values good, so we're done
return
# Alert the user that some of the frames are going to be
# removed
msg = 'The following frames have configurations that cannot be reduced by PypeIt' \
' and will be removed from the metadata table (pypeit file):\n'
indx = np.where(np.logical_not(good))[0]
for i in indx:
msg += ' {0}\n'.format(self['filename'][i])
msgs.warn(msg)
# And remove 'em
self.table = self.table[good]
def _set_calib_group_bits(self):
"""
Set the calibration group bit based on the string values of the
'calib' column.
"""
# Find the number of groups by searching for the maximum number
# provided, regardless of whether or not a science frame is
# assigned to that group.
ngroups = 0
for i in range(len(self)):
if self['calib'][i] in ['all', 'None']:
# No information, keep going
continue
# Convert to a list of numbers
l = np.amax([ 0 if len(n) == 0 else int(n)
for n in self['calib'][i].replace(':',',').split(',')])
# Check against current maximum
ngroups = max(l+1, ngroups)
# Define the bitmask and initialize the bits
self.calib_bitmask = BitMask(np.arange(ngroups))
self['calibbit'] = 0
# Set the calibration bits
for i in range(len(self)):
# Convert the string to the group list
grp = parse.str2list(self['calib'][i], ngroups)
if grp is None:
# No group selected
continue
# Assign the group; ensure the integers are unique
self['calibbit'][i] = self.calib_bitmask.turn_on(self['calibbit'][i], grp)
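To make the bit bookkeeping concrete, here is a simplified, self-contained stand-in for the parsing and bit-setting done above. Note that `str2groups` is a hypothetical helper that treats a range `lo:hi` as inclusive, which may differ from PypeIt's actual `parse.str2list`; `groups2bit` mirrors what `BitMask.turn_on` does for a list of group numbers.

```python
def str2groups(calib, ngroups):
    # Simplified, hypothetical stand-in for parse.str2list; here 'lo:hi'
    # is treated as inclusive, which may differ from the real parser.
    if calib == 'all':
        return list(range(ngroups))
    if calib == 'None':
        return None
    groups = []
    for piece in calib.split(','):
        if ':' in piece:
            lo, hi = (int(p) for p in piece.split(':'))
            groups += list(range(lo, hi + 1))
        else:
            groups.append(int(piece))
    return groups

def groups2bit(groups):
    # Set one bit per calibration group, mirroring BitMask.turn_on
    bit = 0
    for g in groups:
        bit |= 1 << g
    return bit
```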
def _check_calib_groups(self):
"""
Check that the calibration groups are valid.
This currently only checks that the science frames are
associated with one calibration group.
TODO: Is this appropriate for NIR data?
"""
is_science = self.find_frames('science')
for i in range(len(self)):
if not is_science[i]:
continue
if len(self.calib_bitmask.flagged_bits(self['calibbit'][i])) > 1:
msgs.error('Science frames can only be assigned to a single calibration group.')
@property
def n_calib_groups(self):
"""Return the number of calibration groups."""
return None if self.calib_bitmask is None else self.calib_bitmask.nbits
def set_calibration_groups(self, global_frames=None, default=False, force=False):
"""
Group calibration frames into sets.
Requires the 'setup' column to have been defined. For now this
is a simple grouping of frames with the same configuration.
.. todo::
- Maintain a detailed description of the logic.
The 'calib' column has a string type to make sure that it
matches with what can be read from the pypeit file. The
'calibbit' column is actually what is used to determine the
calibration group of each frame; see :attr:`calib_bitmask`.
Args:
global_frames (:obj:`list`, optional):
A list of strings with the frame types to use in all
calibration groups (e.g., ['bias', 'dark']).
default (:obj:`bool`, optional):
If the 'calib' column is not present, set a single
calibration group *for all rows*.
force (:obj:`bool`, optional):
Force the calibration groups to be reconstructed if
the 'calib' column already exists.
Raises:
PypeItError:
Raised if 'setup' column is not defined, or if
`global_frames` is provided but the frame types have not
been defined yet.
"""
# Set the default if requested and 'calib' doesn't exist yet
if 'calib' not in self.keys() and default:
self['calib'] = '0'
# Make sure the calibbit column does not exist
if 'calibbit' in self.keys():
del self['calibbit']
# Groups have already been set
if 'calib' in self.keys() and 'calibbit' in self.keys() and not force:
return
# Groups have been set but the bits have not (likely because the
# data was read from a pypeit file)
if 'calib' in self.keys() and 'calibbit' not in self.keys() and not force:
self._set_calib_group_bits()
self._check_calib_groups()
return
# TODO: The rest of this just nominally sets the calibration
# group based on the configuration. This will change!
# The configuration must be present to determine the calibration
# group
if 'setup' not in self.keys():
msgs.error('Must have defined \'setup\' column first; try running set_configurations.')
configs = np.unique(self['setup'].data).tolist()
if 'None' in configs:
configs.remove('None') # Ignore frames with undefined configurations
n_cfg = len(configs)
# TODO: Science frames can only have one calibration group
# Assign everything from the same configuration to the same
# calibration group; this needs to have dtype=object, otherwise
# any changes to the strings will be truncated at 4 characters.
self.table['calib'] = np.full(len(self), 'None', dtype=object)
for i in range(n_cfg):
self['calib'][(self['setup'] == configs[i]) & (self['framebit'] > 0)] = str(i)
# Allow some frame types to be used in all calibration groups
# (like biases and darks)
if global_frames is not None:
if 'frametype' not in self.keys():
msgs.error('To set global frames, types must have been defined; '
'run get_frame_types.')
calibs = '0' if n_cfg == 1 else ','.join(np.arange(n_cfg).astype(str))
for ftype in global_frames:
indx = np.where(self.find_frames(ftype))[0]
for i in indx:
self['calib'][i] = calibs
# Set the bits based on the string representation of the groups
self._set_calib_group_bits()
# Check that the groups are valid
self._check_calib_groups()
def find_frames(self, ftype, calib_ID=None, index=False):
"""
Find the rows with the associated frame type.
If the index is provided, the frames must also be matched to the
relevant science frame.
Args:
ftype (str):
The frame type identifier. See the keys for
:class:`pypeit.core.framematch.FrameTypeBitMask`. If
set to the string 'None', this returns all frames
without a known type.
calib_ID (:obj:`int`, optional):
Index of the calibration group that it must match. If None,
any row of the specified frame type is included.
index (:obj:`bool`, optional):
Return an array of 0-indexed indices instead of a
boolean array.
Returns:
numpy.ndarray: A boolean array, or an integer array if
index=True, with the rows that contain the frames of the
requested type.
Raises:
PypeItError:
Raised if the `framebit` column is not set in the table.
"""
if 'framebit' not in self.keys():
msgs.error('Frame types are not set. First run get_frame_types.')
if ftype == 'None':
return self['framebit'] == 0
# Select frames
indx = self.type_bitmask.flagged(self['framebit'], ftype)
if calib_ID is not None:
# Select frames in the same calibration group
indx &= self.find_calib_group(calib_ID)
# Return
return np.where(indx)[0] if index else indx
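The selection logic reduces to a bitwise test per row. A minimal numpy sketch with a hypothetical three-flag mask standing in for `FrameTypeBitMask`:

```python
import numpy as np

# Hypothetical flag positions standing in for FrameTypeBitMask
FLAGS = {'science': 0, 'standard': 1, 'bias': 2}

def find_frames_sketch(framebit, ftype):
    # Boolean mask of rows whose bit for ftype is set (cf. BitMask.flagged)
    if ftype == 'None':
        return framebit == 0
    return (framebit & (1 << FLAGS[ftype])) > 0

framebit = np.array([0b001, 0b010, 0b000, 0b011])
# science rows: [True, False, False, True]
```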
def find_frame_files(self, ftype, calib_ID=None):
"""
Return the list of files with a given frame type.
The frames must also match the science frame index, if it is
provided.
Args:
ftype (str):
The frame type identifier. See the keys for
:class:`pypeit.core.framematch.FrameTypeBitMask`.
calib_ID (:obj:`int`, optional):
Index of the calibration group that it must match. If None,
any row of the specified frame type is included.
Returns:
list: List of file paths that match the frame type and
science frame ID, if the latter is provided.
"""
return self.frame_paths(self.find_frames(ftype, calib_ID=calib_ID))
def frame_paths(self, indx):
"""
Return the full paths to one or more frames.
Args:
indx (:obj:`int`, array-like):
One or more 0-indexed rows in the table with the frames
to return. Can be an array of indices or a boolean
array of the correct length.
Returns:
list: List of the full paths of one or more frames.
"""
if isinstance(indx, (int,np.integer)):
return os.path.join(self['directory'][indx], self['filename'][indx])
return [os.path.join(d,f) for d,f in zip(self['directory'][indx], self['filename'][indx])]
def set_frame_types(self, type_bits, merge=True):
"""
Set and return a Table with the frame types and bits.
Args:
type_bits (numpy.ndarray):
Integer bitmask with the frame types. The length must
match the existing number of table rows.
merge (:obj:`bool`, optional):
Merge the types and bits into the existing table. This
will *overwrite* any existing columns.
Returns:
`astropy.table.Table`: Table with two columns, the frame
type name and bits.
"""
# Making Columns to pad string array
ftype_colmA = table.Column(self.type_bitmask.type_names(type_bits), name='frametype')
# KLUDGE ME
#
# TODO: It would be good to get around this. Is it related to
# this change?
# http://docs.astropy.org/en/stable/table/access_table.html#bytestring-columns-in-python-3
#
# See also:
#
# http://docs.astropy.org/en/stable/api/astropy.table.Table.html#astropy.table.Table.convert_bytestring_to_unicode
#
# Or we can force type_names() in bitmask to always return the
# correct type...
if int(str(ftype_colmA.dtype)[2:]) < 9:
ftype_colm = table.Column(self.type_bitmask.type_names(type_bits), dtype='U9',
name='frametype')
else:
ftype_colm = ftype_colmA
fbits_colm = table.Column(type_bits, name='framebit')
t = table.Table([ftype_colm, fbits_colm])
if merge:
self['frametype'] = t['frametype']
self['framebit'] = t['framebit']
return t
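The 'U9' kludge above guards against numpy's silent truncation of fixed-width unicode columns; the behavior it works around can be demonstrated directly:

```python
import numpy as np

# A column built from short names gets a narrow fixed-width dtype...
col = np.array(['bias', 'arc'])          # dtype is '<U4'
# ...so assigning a longer value silently truncates it:
col[0] = 'standard,science'              # stored as 'stan'
# Preallocating a wider dtype (as with 'U9' above) keeps the full string:
wide = np.array(['bias', 'arc'], dtype='U32')
wide[0] = 'standard,science'
```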
def edit_frame_type(self, indx, frame_type, append=False):
"""
Edit the frame type by hand.
Args:
indx (:obj:`int`):
The 0-indexed row in the table to edit
frame_type (:obj:`str`, :obj:`list`):
One or more frame types to append/overwrite.
append (:obj:`bool`, optional):
Append the frame type. If False, all existing frame
types are overwritten by the provided type.
"""
if not append:
self['framebit'][indx] = 0
self['framebit'][indx] = self.type_bitmask.turn_on(self['framebit'][indx], flag=frame_type)
self['frametype'][indx] = self.type_bitmask.type_names(self['framebit'][indx])
def get_frame_types(self, flag_unknown=False, user=None, merge=True):
"""
Generate a table of frame types from the input metadata object.
.. todo::
- Here's where we could add a SPIT option.
Args:
flag_unknown (:obj:`bool`, optional):
Instead of crashing out if there are unidentified files,
leave without a type and continue.
user (:obj:`dict`, optional):
A dictionary with the types designated by the user. The
file name and type are expected to be the key and value
of the dictionary, respectively. The number of keys
therefore *must* match the number of files in
:attr:`table`. For frames that have multiple types, the
types should be provided as a string with
comma-separated types.
merge (:obj:`bool`, optional):
Merge the frame typing into the existing table.
Returns:
:obj:`astropy.table.Table`: A Table with two columns, the
type names and the type bits. See
:class:`pypeit.core.framematch.FrameTypeBitMask` for the
allowed frame types.
"""
# Checks
if 'frametype' in self.keys() or 'framebit' in self.keys():
msgs.warn('Removing existing frametype and framebit columns.')
if 'frametype' in self.keys():
del self.table['frametype']
if 'framebit' in self.keys():
del self.table['framebit']
# # TODO: This needs to be moved into each Spectrograph
# if useIDname and 'idname' not in self.keys():
# raise ValueError('idname is not set in table; cannot use it for file typing.')
# Start
msgs.info("Typing files")
type_bits = np.zeros(len(self), dtype=self.type_bitmask.minimum_dtype())
# Use the user-defined frame types from the input dictionary
if user is not None:
if len(user.keys()) != len(self):
raise ValueError('The user-provided dictionary does not match table length.')
msgs.info('Using user-provided frame types.')
for ifile,ftypes in user.items():
indx = self['filename'] == ifile
type_bits[indx] = self.type_bitmask.turn_on(type_bits[indx], flag=ftypes.split(','))
return self.set_frame_types(type_bits, merge=merge)
# Loop over the frame types
for i, ftype in enumerate(self.type_bitmask.keys()):
# # Initialize: Flag frames with the correct ID name or start by
# # flagging all as true
# indx = self['idname'] == self.spectrograph.idname(ftype) if useIDname \
# else np.ones(len(self), dtype=bool)
# Include a combination of instrument-specific checks using
# combinations of the full set of metadata
exprng = self.par['scienceframe']['exprng'] if ftype == 'science' \
else self.par['calibrations']['{0}frame'.format(ftype)]['exprng']
# TODO: Use & or | ? Using idname above gets overwritten by
# this if the frames fail to meet the other checks in this call.
# indx &= self.spectrograph.check_frame_type(ftype, self.table, exprng=exprng)
indx = self.spectrograph.check_frame_type(ftype, self.table, exprng=exprng)
# Turn on the relevant bits
type_bits[indx] = self.type_bitmask.turn_on(type_bits[indx], flag=ftype)
# Find the nearest standard star to each science frame
# TODO: Should this be 'standard' or 'science' or both?
if 'ra' not in self.keys() or 'dec' not in self.keys():
msgs.warn('Cannot associate standard with science frames without sky coordinates.')
else:
# TODO: Do we want to do this here?
indx = self.type_bitmask.flagged(type_bits, flag='standard')
# NOTE: Iterate over row indices so that any bits turned off are
# written back to type_bits; iterating over the extracted values
# would edit copies and silently discard the changes.
for i in np.where(indx)[0]:
f = self['filename'][i]
ra, dec = self['ra'][i], self['dec'][i]
if ra == 'None' or dec == 'None':
msgs.warn('RA and DEC must not be None for file:' + msgs.newline() + f)
msgs.warn('The above file could be a twilight flat frame that was'
+ msgs.newline() + 'missed by the automatic identification.')
type_bits[i] = self.type_bitmask.turn_off(type_bits[i], flag='standard')
continue
# If an object exists within 20 arcmins of a listed standard,
# then it is probably a standard star
foundstd = flux_calib.find_standard_file(ra, dec, check=True)
type_bits[i] = self.type_bitmask.turn_off(type_bits[i],
flag='science' if foundstd else 'standard')
# Find the files without any types
indx = np.logical_not(self.type_bitmask.flagged(type_bits))
if np.any(indx):
msgs.info("Couldn't identify the following files:")
for f in self['filename'][indx]:
msgs.info(f)
if not flag_unknown:
msgs.error("Check these files before continuing")
# Finish up (note that this is called above if user is not None!)
msgs.info("Typing completed!")
return self.set_frame_types(type_bits, merge=merge)
def set_pypeit_cols(self, write_bkg_pairs=False, write_manual=False):
"""
Generate the list of columns to be included in the fitstbl
(nearly the complete list).
Args:
write_bkg_pairs (:obj:`bool`, optional):
Add additional ``PypeIt`` columns for calib, comb_id
and bkg_id
write_manual (:obj:`bool`, optional):
Add additional ``PypeIt`` columns for manual extraction
Returns:
list: List of columns to be used in the fits table.
"""
# Columns for output
columns = self.spectrograph.pypeit_file_keys()
extras = []
# comb, bkg columns
if write_bkg_pairs:
extras += ['calib', 'comb_id', 'bkg_id']
# manual
if write_manual:
extras += ['manual']
for key in extras:
if key not in columns:
columns += [key]
# Take only those present
output_cols = np.array(columns)
return output_cols[np.isin(output_cols, self.keys())].tolist()
def set_combination_groups(self, assign_objects=True):
"""
Set combination groups.
.. note::
:attr:`table` is edited in place.
This function can be used to initialize the combination-group and
background-group columns, and/or to assign a unique combination-group
integer to each object (science or standard) frame.
If the 'comb_id' or 'bkg_id' columns do not exist, they're set
to -1.
Args:
assign_objects (:obj:`bool`, optional):
If all of 'comb_id' values are less than 0 (meaning
they're unassigned), the combination groups are set to
be unique for each standard and science frame.
"""
if 'comb_id' not in self.keys():
self['comb_id'] = -1
if 'bkg_id' not in self.keys():
self['bkg_id'] = -1
if assign_objects and np.all(self['comb_id'] < 0):
# find_frames will throw an exception if framebit is not
# set...
sci_std_idx = np.where(np.any([self.find_frames('science'),
self.find_frames('standard')], axis=0))[0]
self['comb_id'][sci_std_idx] = np.arange(len(sci_std_idx), dtype=int) + 1
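The object-assignment step reduces to numbering the selected rows from 1; a small numpy sketch with a hypothetical frame-type mask:

```python
import numpy as np

# Hypothetical science/standard mask; object frames get unique 1-indexed ids
is_object = np.array([True, False, True, True, False])
comb_id = np.full(len(is_object), -1, dtype=int)
obj_rows = np.where(is_object)[0]
comb_id[obj_rows] = np.arange(len(obj_rows), dtype=int) + 1
# comb_id is now [1, -1, 2, 3, -1]
```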
def set_user_added_columns(self):
"""
Set columns that the user *might* add
.. note::
:attr:`table` is edited in place.
This function can be used to initialize columns
that the user might add.
"""
if 'manual' not in self.keys():
self['manual'] = ''
def write_sorted(self, ofile, overwrite=True, ignore=None,
write_bkg_pairs=False, write_manual=False):
"""
Write the sorted file.
The sorted file lists all the unique instrument configurations
(setups) and the frames associated with each configuration. The
output data table is identical to the pypeit file output.
.. todo::
- This is for backwards compatibility, but we should
consider reformatting/removing it.
Args:
ofile (:obj:`str`):
Name for the output sorted file.
overwrite (:obj:`bool`, optional):
Overwrite any existing file with the same name.
ignore (:obj:`list`, optional):
Ignore configurations in the provided list.
write_bkg_pairs (:obj:`bool`, optional):
Add additional ``PypeIt`` columns for calib, comb_id
and bkg_id
write_manual (:obj:`bool`, optional):
Add additional ``PypeIt`` columns for manual extraction
Raises:
PypeItError:
Raised if the 'setup' column hasn't been defined.
"""
if 'setup' not in self.keys():
msgs.error('Cannot write sorted instrument configuration table without \'setup\' '
'column; run set_configurations.')
if os.path.isfile(ofile) and not overwrite:
msgs.error('{0} already exists. Use overwrite=True to overwrite.'.format(ofile))
# Grab output columns
output_cols = self.set_pypeit_cols(write_bkg_pairs=write_bkg_pairs,
write_manual=write_manual)
cfgs = self.unique_configurations(copy=ignore is not None)
if ignore is not None:
# Iterate over a copied key list so entries can be deleted safely
for key in list(cfgs.keys()):
if key in ignore:
del cfgs[key]
# Construct file
ff = open(ofile, 'w')
for setup in cfgs.keys():
# Get the subtable of frames taken in this configuration
indx = self['setup'] == setup
if not np.any(indx):
continue
subtbl = self.table[output_cols][indx]
# Write the file
ff.write('##########################################################\n')
ff.write('Setup {:s}\n'.format(setup))
ff.write('\n'.join(dict_to_lines(cfgs[setup], level=1)) + '\n')
ff.write('#---------------------------------------------------------\n')
mjd = subtbl['mjd'].copy()
# Deal with possibly None mjds if there were corrupt header cards
mjd[mjd == None] = -99999.0
isort = np.argsort(mjd)
subtbl = subtbl[isort]
subtbl.write(ff, format='ascii.fixed_width')
ff.write('##end\n')
ff.close()
# TODO: Do we need a calib file?
def write_calib(self, ofile, overwrite=True, ignore=None):
"""
Write the calib file.
The calib file provides the unique instrument configurations
(setups) and the association of each frame from that
configuration with a given calibration group.
.. todo::
- This is for backwards compatibility, but we should
consider reformatting/removing it.
- This is complicated by allowing some frame types to have
no association with an instrument configuration
- This is primarily used for QA now; but could probably use the pypeit file instead
Args:
ofile (:obj:`str`):
Name for the output sorted file.
overwrite (:obj:`bool`, optional):
Overwrite any existing file with the same name.
ignore (:obj:`list`, optional):
Ignore calibration groups in the provided list.
Raises:
PypeItError:
Raised if the 'setup' or 'calibbit' columns haven't been
defined.
"""
if 'setup' not in self.keys() or 'calibbit' not in self.keys():
msgs.error('Cannot write calibration groups without \'setup\' and \'calibbit\' '
'columns; run set_configurations and set_calibration_groups.')
if os.path.isfile(ofile) and not overwrite:
msgs.error('{0} already exists. Use overwrite=True to overwrite.'.format(ofile))
# Construct the setups dictionary
cfg = self.unique_configurations(copy=True, rm_none=True)
# TODO: We should edit the relevant follow-on code so that we
# don't have to do these gymnastics. Or better yet, just stop
# producing/using the *.calib file.
_cfg = {}
for setup in cfg.keys():
_cfg[setup] = {}
_cfg[setup]['--'] = deepcopy(cfg[setup])
cfg = _cfg
# Iterate through the calibration bit names as these are the root of the
# MasterFrames and QA
for icbit in np.unique(self['calibbit'].data):
cbit = int(icbit) # for yaml
# Skip this group
if ignore is not None and cbit in ignore:
continue
# Find the frames in this group
#in_group = self.find_calib_group(i)
in_cbit = self['calibbit'] == cbit
# Find the unique configurations in this group, ignoring any
# undefined ('None') configurations
#setup = np.unique(self['setup'][in_group]).tolist()
setup = np.unique(self['setup'][in_cbit]).tolist()
if 'None' in setup:
setup.remove('None')
# Make sure that each calibration group should only contain
# frames from a single configuration
if len(setup) != 1:
msgs.error('Each calibration group must be from one and only one instrument '
'configuration with a valid letter identifier; i.e., the '
'configuration cannot be None.')
# Find the frames of each type in this group
cfg[setup[0]][cbit] = {}
for key in self.type_bitmask.keys():
#ftype_in_group = self.find_frames(key) & in_group
ftype_in_group = self.find_frames(key) & in_cbit
cfg[setup[0]][cbit][key] = [ os.path.join(d,f)
for d,f in zip(self['directory'][ftype_in_group],
self['filename'][ftype_in_group])]
# Write it
ff = open(ofile, 'w')
ff.write(yaml.dump(utils.yamlify(cfg)))
ff.close()
def write_pypeit(self, output_path=None, cfg_lines=None,
write_bkg_pairs=False, write_manual=False,
configs=None):
"""
Write a pypeit file in data-table format.
The pypeit file is the main configuration file for PypeIt,
configuring the control-flow and algorithmic parameters and
listing the data files to read. This function writes the
columns selected by the
:func:`pypeit.spectrographs.spectrograph.Spectrograph.pypeit_file_keys`,
which can be specific to each instrument.
Args:
output_path (:obj:`str`, optional):
Root path for the output pypeit files. If None, set
to current directory. If the output directory does
not exist, it is created.
cfg_lines (:obj:`list`, optional):
The list of configuration lines to include in the file.
If None are provided, the vanilla configuration is
included.
write_bkg_pairs (:obj:`bool`, optional):
When constructing the
:class:`pypeit.metadata.PypeItMetaData` object, include
two columns called `comb_id` and `bkg_id` that identify
object and background frame pairs.
write_manual (:obj:`bool`, optional):
Add additional ``PypeIt`` columns for manual extraction
configs (:obj:`str`, :obj:`list`, optional):
One or more strings used to select the configurations
to include in the returned objects. If ``'all'``,
pass back all configurations. Otherwise, only return
the configurations matched to this provided string or
list of strings (e.g., ['A','C']). See
:attr:`configs`.
Raises:
PypeItError:
Raised if the 'setup' column isn't defined.
Returns:
:obj:`list`: List of ``PypeIt`` files generated.
"""
# Set output path
if output_path is None:
output_path = os.getcwd()
# Find unique configurations, always ignoring any 'None'
# configurations...
cfg = self.unique_configurations(copy=True, rm_none=True)
# Get the setups to write
if configs is None or configs == 'all' or configs == ['all']:
cfg_keys = list(cfg.keys())
else:
_configs = configs if isinstance(configs, list) else [configs]
cfg_keys = [key for key in cfg.keys() if key in _configs]
if len(cfg_keys) == 0:
msgs.error('No setups to write!')
# Grab output columns
output_cols = self.set_pypeit_cols(write_bkg_pairs=write_bkg_pairs,
write_manual=write_manual)
# Write the pypeit files
ofiles = [None]*len(cfg_keys)
for j,setup in enumerate(cfg_keys):
# Create the output directory
root = '{0}_{1}'.format(self.spectrograph.name, setup)
odir = os.path.join(output_path, root)
if not os.path.isdir(odir):
os.makedirs(odir)
# Create the output file name
ofiles[j] = os.path.join(odir, '{0}.pypeit'.format(root))
# Get the setup lines
setup_lines = dict_to_lines({'Setup {0}'.format(setup):
utils.yamlify(cfg[setup])}, level=1)
# Get the paths
in_cfg = self['setup'] == setup
if not np.any(in_cfg):
continue
paths = np.unique(self['directory'][in_cfg]).tolist()
# Get the data lines
subtbl = self.table[output_cols][in_cfg]
subtbl.sort(['frametype','filename'])
with io.StringIO() as ff:
subtbl.write(ff, format='ascii.fixed_width')
data_lines = ff.getvalue().split('\n')[:-1]
# Write the file
make_pypeit_file(ofiles[j], self.spectrograph.name, [], cfg_lines=cfg_lines,
setup_lines=setup_lines, sorted_files=data_lines, paths=paths)
# Return
return ofiles
def write(self, output=None, rows=None, columns=None, sort_col=None, overwrite=False,
header=None):
"""
Write the metadata either to a file or to the screen.
The method allows you to set the columns to print and which column to
use for sorting.
Args:
output (:obj:`str`, optional):
Output signature or file name. If None, the table contents
are printed to the screen. If ``'table'``, the table that
would have been printed/written to disk is returned.
Otherwise, the string is interpreted as the name of an ascii
file to which to write the table contents.
rows (`numpy.ndarray`_, optional):
A boolean vector selecting the rows of the table to write. If
None, all rows are written. Shape must match the number of
the rows in the table.
columns (:obj:`str`, :obj:`list`, optional):
A list of columns to include in the output file. Can be
provided as a list directly or as a comma-separated string.
If None or ``'all'``, all columns are written; if
``'pypeit'``, the columns are the same as those included in
the pypeit file. Each selected column must be a valid pypeit
metadata keyword, specific to :attr:`spectrograph`.
Additional valid keywords, depending on the processing level
of the metadata table, are directory, filename, frametype,
framebit, setup, calib, and calibbit.
sort_col (:obj:`str`, optional):
Name of the column to use for sorting the output. If
None, the table is printed in its current state.
overwrite (:obj:`bool`, optional):
Overwrite any existing file; otherwise raise an
exception.
header (:obj:`str`, :obj:`list`, optional):
One or more strings to write to the top of the file, one
string per file line; ``# `` is added to the beginning of
each string. Ignored if ``output`` does not specify an output
file.
Returns:
`astropy.table.Table`: The table object that would have been
written/printed if ``output == 'table'``. Otherwise, the method
always returns None.
Raises:
ValueError:
Raised if the columns to include are not valid, or if the
column to use for sorting is not valid.
FileExistsError:
Raised if overwrite is False and the file exists.
"""
# Check the file can be written (this is here because the spectrograph
# needs to be defined first)
ofile = None if output in [None, 'table'] else output
if ofile is not None and os.path.isfile(ofile) and not overwrite:
raise FileExistsError(f'{ofile} already exists; set flag to overwrite.')
# Check the rows input
if rows is not None and len(rows) != len(self.table):
raise ValueError('Boolean vector selecting output rows has incorrect length.')
# Get the columns to return
if columns in [None, 'all']:
tbl_cols = list(self.keys())
elif columns == 'pypeit':
tbl_cols = self.set_pypeit_cols(write_bkg_pairs=True)
else:
all_cols = list(self.keys())
tbl_cols = columns if isinstance(columns, list) else columns.split(',')
badcol = [col not in all_cols for col in tbl_cols]
if np.any(badcol):
raise ValueError('The following columns are not valid: {0}'.format(
', '.join([col for col, bad in zip(tbl_cols, badcol) if bad])))
# Make sure the basic parameters are the first few columns; do them in
# reverse order so I can always insert at the beginning of the list
for col in ['framebit', 'frametype', 'filename', 'directory']:
if col not in tbl_cols:
continue
indx = np.where([t == col for t in tbl_cols])[0][0]
if indx != 0:
tbl_cols.insert(0, tbl_cols.pop(indx))
# Make sure the dithers and combination and background IDs are the last
# few columns
ncol = len(tbl_cols)
for col in ['dithpat', 'dithpos', 'dithoff', 'calib', 'comb_id', 'bkg_id']:
if col not in tbl_cols:
continue
indx = np.where([t == col for t in tbl_cols])[0][0]
if indx != ncol-1:
tbl_cols.insert(ncol-1, tbl_cols.pop(indx))
# Copy the internal table so that it is unaltered
output_tbl = self.table.copy()
# Select the output rows if a vector was provided
if rows is not None:
output_tbl = output_tbl[rows]
# Select and sort the data by a given column
if sort_col is not None:
if sort_col not in self.keys():
raise ValueError(f'Cannot sort by {sort_col}. Not a valid column.')
# Ignore any NoneTypes
indx = output_tbl[sort_col] != None
is_None = np.logical_not(indx)
srt = np.append(np.where(is_None)[0],
np.where(indx)[0][np.argsort(output_tbl[sort_col][indx].data)])
output_tbl = output_tbl[tbl_cols][srt]
else:
output_tbl = output_tbl[tbl_cols]
if output == 'table':
# Instead of writing, just return the modified table
return output_tbl
# Always write the table in ascii format
with io.StringIO() as ff:
output_tbl.write(ff, format='ascii.fixed_width')
data_lines = ff.getvalue().split('\n')[:-1]
if ofile is None:
# Output file not defined so just print it
print('\n'.join(data_lines))
return None
# Write the output to an ascii file
with open(ofile, 'w') as f:
if header is not None:
_header = header if isinstance(header, list) else [header]
for h in _header:
f.write(f'# {h}\n')
f.write('\n')
f.write('\n'.join(data_lines))
f.write('\n')
# Just to be explicit that the method returns None when writing to a
# file...
return None
def find_calib_group(self, grp):
"""
Find all the frames associated with the provided calibration group.
Args:
grp (:obj:`int`):
The calibration group integer.
Returns:
numpy.ndarray: Boolean array selecting those frames in the
table included in the selected calibration group.
Raises:
PypeItError:
Raised if the 'calibbit' column is not defined.
"""
if 'calibbit' not in self.keys():
msgs.error('Calibration groups are not set. First run set_calibration_groups.')
return self.calib_bitmask.flagged(self['calibbit'].data, grp)
def find_frame_calib_groups(self, row):
"""
Find the calibration groups associated with a specific frame.
"""
return self.calib_bitmask.flagged_bits(self['calibbit'][row])
# TODO: Is there a reason why this is not an attribute of
# PypeItMetaData?
def row_match_config(row, config, spectrograph):
"""
Queries whether a row from the fitstbl matches the
input configuration
Args:
row (astropy.table.Row): From fitstbl
config (dict): Defines the configuration
spectrograph (pypeit.spectrographs.spectrograph.Spectrograph):
Used to grab the rtol value for float meta (e.g. dispangle)
Returns:
bool: True if the row matches the input configuration
"""
# Loop on keys in config
match = []
for k in config.keys():
# Deal with floating configs (e.g. grating angle)
if isinstance(config[k], float):
if row[k] is None:
match.append(False)
elif np.abs(config[k]-row[k])/config[k] < spectrograph.meta[k]['rtol']:
match.append(True)
else:
match.append(False)
else:
# The np.all allows for arrays in the Table (e.g. binning)
match.append(np.all(config[k] == row[k]))
# Check
return np.all(match)
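The float-matching rule in `row_match_config` can be sketched in isolation. This is an illustrative stand-in, not the PypeIt API: the row and config are plain dicts, `rtol` is passed directly instead of being read from `spectrograph.meta`, and `np.all` is replaced by plain equality for brevity.

```python
def row_matches(row, config, rtol):
    """Illustrative stand-in for row_match_config's core rule."""
    for k, want in config.items():
        got = row.get(k)
        if isinstance(want, float):
            # None (e.g. from a corrupt header card) never matches a float config;
            # otherwise compare with a relative tolerance, as in the original.
            if got is None or not abs(want - got) / want < rtol:
                return False
        elif want != got:
            return False
    return True

print(row_matches({'dispangle': 10.001, 'binning': '1,1'},
                  {'dispangle': 10.0, 'binning': '1,1'}, rtol=1e-2))  # True
print(row_matches({'dispangle': 10.001},
                  {'dispangle': 11.0}, rtol=1e-4))                    # False
```

Note the tolerance is relative to the configuration value, so a small `rtol` (e.g. for a grating angle) rejects rows whose value drifted by more than that fraction.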
53d54a4c34c0a67e36d2d017230ceb288acd1564 | 2,341 | py | Python | aql/aql/main/aql_builtin_tools.py | menify/sandbox | 32166c71044f0d5b414335b2b6559adc571f568c | ["MIT"] | null | null | null | aql/aql/main/aql_builtin_tools.py | menify/sandbox | 32166c71044f0d5b414335b2b6559adc571f568c | ["MIT"] | null | null | null | aql/aql/main/aql_builtin_tools.py | menify/sandbox | 32166c71044f0d5b414335b2b6559adc571f568c | ["MIT"] | null | null | null |
import os.path
import shutil
import errno
from aql.nodes import Builder, FileBuilder
from .aql_tools import Tool
__all__ = ( "ExecuteCommand",
"InstallBuilder",
"BuiltinTool",
)
"""
Unique Value - name + type
value
node
node = ExecuteCommand('gcc --help -v')
tools.cpp.cxx
node = ExecuteCommand( tools.cpp.cxx, '--help -v' )
node = ExecuteMethod( target = my_function )
dir_node = CopyFiles( prog_node, target = dir_name )
dir_node = CopyFilesAs( prog_node, target = dir_name )
dir_node = MoveFiles( prog_node, )
dir_node = MoveFilesAs( prog_node )
dir_node = RemoveFiles( prog_node )
node = FindFiles( dir_node )
dir_node = FileDir( prog_node )
"""
def _makeTargetDirs( path_dir ):
try:
os.makedirs( path_dir )
except OSError as e:
if e.errno != errno.EEXIST:
raise
#//===========================================================================//
class ExecuteCommand (Builder):
def build( self, node ):
cmd = node.getSources()
out = self.execCmd( cmd )
node.setNoTargets()
return out
#//-------------------------------------------------------//
def getBuildStrArgs( self, node, brief ):
cmd = node.getSourceValues()
return (cmd,)
#//===========================================================================//
class InstallBuilder (FileBuilder):
def __init__(self, options, target ):
self.target = os.path.abspath( target )
#//-------------------------------------------------------//
def build( self, node ):
sources = node.getSources()
target = self.target
_makeTargetDirs( target )
for source in sources:
if os.path.isfile( source ):
shutil.copy( source, target )
node.setNoTargets()
#//-------------------------------------------------------//
def getTraceTargets( self, node, brief ):
return self.target
#//===========================================================================//
class BuiltinTool( Tool ):
def ExecuteCommand( self, options ):
return ExecuteCommand( options )
def Install(self, options, target ):
return InstallBuilder( options, target )
def DirName(self, options):
raise NotImplementedError()
def BaseName(self, options):
raise NotImplementedError()
53e05b14f47fe11d4c2e4b89d1492b45ec46b072 | 5,199 | py | Python | etl/transform.py | ACWI-SOGW/ngwmn_monitoring_locations_etl | e9ebfebbc5fa349a58669fb1d9944786f26729c3 | [
"CC0-1.0"
] | 1 | 2020-10-07T14:44:30.000Z | 2020-10-07T14:44:30.000Z | etl/transform.py | ACWI-SOGW/ngwmn_monitoring_locations_etl | e9ebfebbc5fa349a58669fb1d9944786f26729c3 | [
"CC0-1.0"
] | 7 | 2020-10-14T19:13:10.000Z | 2021-10-06T20:04:38.000Z | etl/transform.py | ACWI-SOGW/ngwmn_monitoring_locations_etl | e9ebfebbc5fa349a58669fb1d9944786f26729c3 | [
"CC0-1.0"
] | 1 | 2020-10-02T14:43:18.000Z | 2020-10-02T14:43:18.000Z | """
Transform the data into a form that
works with the WELL_REGISTRY_STG table.
"""
import re
def mapping_factory(mapping):
def map_func(key):
if key is not None:
ora_val = mapping.get(key.lower())
else:
ora_val = None
return ora_val
return map_func
WELL_TYPES = {
'surveillance': 1,
'trend': 2,
'special': 3,
}
map_well_type = mapping_factory(WELL_TYPES)
WELL_PURPOSE = {
'dedicated monitoring/observation': 1,
'other': 2
}
map_well_purpose = mapping_factory(WELL_PURPOSE)
QW_WELL_CHARS = {
'background': 1,
'suspected/anticipated changes': 2,
'known changes': 3
}
map_qw_well_chars = mapping_factory(QW_WELL_CHARS)
WL_WELL_CHARS = {
'background': 1,
'suspected/anticipated changes': 2,
'known changes': 3,
'unknown': 999
}
map_wl_well_chars = mapping_factory(WL_WELL_CHARS)
def to_flag(flag):
return '1' if flag else '0'
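The closure returned by `mapping_factory` and the `to_flag` helper can be exercised standalone; this self-contained sketch repeats the two helpers and one of the lookup tables so the behavior is visible in isolation:

```python
def mapping_factory(mapping):
    # Returns a lookup function that lower-cases its key and maps
    # missing/None keys to None rather than raising.
    def map_func(key):
        return mapping.get(key.lower()) if key is not None else None
    return map_func

WELL_TYPES = {'surveillance': 1, 'trend': 2, 'special': 3}
map_well_type = mapping_factory(WELL_TYPES)

def to_flag(flag):
    return '1' if flag else '0'

print(map_well_type('Trend'))   # 2   (lookup is case-insensitive)
print(map_well_type(None))      # None
print(map_well_type('bogus'))   # None (unknown keys fall through to None)
print(to_flag(True), to_flag(None))  # 1 0
```

Unknown or missing source values therefore arrive in the staging table as NULLs rather than load failures.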
def transform_mon_loc_data(ml_data):
"""
Map the fields from the API JSON response to
the fields in the WELL_REGISTRY_STG table with
appropriate foreign key values.
"""
mapped_data = dict()
mapped_data['AGENCY_CD'] = ml_data['agency']['agency_cd']
mapped_data['AGENCY_NM'] = ml_data['agency']['agency_nm']
mapped_data['AGENCY_MED'] = ml_data['agency']['agency_med']
mapped_data['SITE_NO'] = ml_data['site_no']
mapped_data['SITE_NAME'] = ml_data['site_name']
mapped_data['DEC_LAT_VA'] = ml_data['dec_lat_va']
mapped_data['DEC_LONG_VA'] = ml_data['dec_long_va']
mapped_data['HORZ_DATUM'] = ml_data['horizontal_datum']
mapped_data['ALT_VA'] = ml_data['alt_va']
mapped_data['ALT_DATUM_CD'] = ml_data['altitude_datum']
try:
mapped_data['NAT_AQUIFER_CD'] = ml_data['nat_aqfr']['nat_aqfr_cd']
mapped_data['NAT_AQFR_DESC'] = ml_data['nat_aqfr']['nat_aqfr_desc']
except (AttributeError, KeyError, TypeError):
mapped_data['NAT_AQUIFER_CD'] = None
mapped_data['NAT_AQFR_DESC'] = None
mapped_data['LOCAL_AQUIFER_NAME'] = ml_data['local_aquifer_name']
mapped_data['AQFR_CHAR'] = ml_data['aqfr_type']
mapped_data['QW_SN_FLAG'] = to_flag(ml_data['qw_sn_flag'])
mapped_data['QW_BASELINE_FLAG'] = to_flag(ml_data['qw_baseline_flag'])
mapped_data['QW_WELL_CHARS'] = map_qw_well_chars(ml_data['qw_well_chars'])
mapped_data['QW_WELL_PURPOSE'] = map_well_purpose(ml_data['qw_well_purpose'])
mapped_data['QW_SYS_NAME'] = ml_data['qw_network_name']
mapped_data['WL_SN_FLAG'] = to_flag(ml_data['wl_sn_flag'])
mapped_data['WL_BASELINE_FLAG'] = to_flag(ml_data['wl_baseline_flag'])
mapped_data['WL_WELL_CHARS'] = map_wl_well_chars(ml_data['wl_well_chars'])
mapped_data['WL_WELL_PURPOSE'] = map_well_purpose(ml_data['wl_well_purpose'])
mapped_data['WL_SYS_NAME'] = ml_data['wl_network_name']
mapped_data['DATA_PROVIDER'] = None
mapped_data['DISPLAY_FLAG'] = to_flag(ml_data['display_flag'])
mapped_data['WL_DATA_PROVIDER'] = None
mapped_data['QW_DATA_PROVIDER'] = None
mapped_data['LITH_DATA_PROVIDER'] = None
mapped_data['CONST_DATA_PROVIDER'] = None
mapped_data['WELL_DEPTH'] = ml_data['well_depth']
mapped_data['LINK'] = ml_data['link']
mapped_data['INSERT_DATE'] = ml_data['insert_date']
mapped_data['UPDATE_DATE'] = ml_data['update_date']
mapped_data['WL_WELL_PURPOSE_NOTES'] = ml_data['wl_well_purpose_notes']
mapped_data['QW_WELL_PURPOSE_NOTES'] = ml_data['qw_well_purpose_notes']
mapped_data['INSERT_USER_ID'] = ml_data['insert_user']
mapped_data['UPDATE_USER_ID'] = ml_data['update_user']
mapped_data['WL_WELL_TYPE'] = map_well_type(ml_data['wl_well_type'])
mapped_data['QW_WELL_TYPE'] = map_well_type(ml_data['qw_well_type'])
mapped_data['LOCAL_AQUIFER_CD'] = None
mapped_data['REVIEW_FLAG'] = None
try:
mapped_data['STATE_CD'] = ml_data['state']['state_cd']
except (AttributeError, KeyError, TypeError):
mapped_data['STATE_CD'] = None
try:
mapped_data['COUNTY_CD'] = ml_data['county']['county_cd']
except (AttributeError, KeyError, TypeError):
mapped_data['COUNTY_CD'] = None
try:
mapped_data['COUNTRY_CD'] = ml_data['country']['country_cd']
except (AttributeError, KeyError, TypeError):
mapped_data['COUNTRY_CD'] = None
mapped_data['WELL_DEPTH_UNITS'] = ml_data['well_depth_units']['unit_id'] if ml_data['well_depth_units'] else None
mapped_data['ALT_UNITS'] = ml_data['altitude_units']['unit_id'] if ml_data['altitude_units'] else None
mapped_data['SITE_TYPE'] = ml_data['site_type']
mapped_data['HORZ_METHOD'] = ml_data['horz_method']
mapped_data['HORZ_ACY'] = ml_data['horz_acy']
mapped_data['ALT_METHOD'] = ml_data['alt_method']
mapped_data['ALT_ACY'] = ml_data['alt_acy']
return mapped_data
def date_format(mapped_data):
# fix missing fractions of a second
if re.match(r".*:\d\dZ$", mapped_data['INSERT_DATE']):
mapped_data['INSERT_DATE'] = mapped_data['INSERT_DATE'][:-1] + ".0Z"
if re.match(r".*:\d\dZ$", mapped_data['UPDATE_DATE']):
mapped_data['UPDATE_DATE'] = mapped_data['UPDATE_DATE'][:-1] + ".0Z"
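The regex in `date_format` only rewrites timestamps that end in whole seconds; a timestamp that already carries a fractional part is left alone. A self-contained restatement (looping over the two keys rather than repeating the branch) shows both cases:

```python
import re

def date_format(mapped_data):
    # Fix missing fractions of a second, e.g. '...:05Z' -> '...:05.0Z'
    for key in ('INSERT_DATE', 'UPDATE_DATE'):
        if re.match(r".*:\d\dZ$", mapped_data[key]):
            mapped_data[key] = mapped_data[key][:-1] + ".0Z"

sample = {'INSERT_DATE': '2021-01-02T03:04:05Z',
          'UPDATE_DATE': '2021-01-02T03:04:05.123Z'}
date_format(sample)
print(sample['INSERT_DATE'])  # 2021-01-02T03:04:05.0Z   (fraction added)
print(sample['UPDATE_DATE'])  # 2021-01-02T03:04:05.123Z (already fractional)
```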
53ebe27af2c0c28dac914d098023620cb50fc322 | 1,529 | py | Python | igibson/object_states/aabb.py | mamadbiabon/iGibson | d416a470240eb7ad86e04fee475ae4bd67263a7c | ["MIT"] | 360 | 2020-04-02T11:12:09.000Z | 2022-03-24T21:46:58.000Z | igibson/object_states/aabb.py | mamadbiabon/iGibson | d416a470240eb7ad86e04fee475ae4bd67263a7c | ["MIT"] | 169 | 2020-04-07T21:01:05.000Z | 2022-03-31T10:07:39.000Z | igibson/object_states/aabb.py | mamadbiabon/iGibson | d416a470240eb7ad86e04fee475ae4bd67263a7c | ["MIT"] | 94 | 2020-04-09T23:22:17.000Z | 2022-03-17T21:49:03.000Z | import numpy as np
from igibson.external.pybullet_tools.utils import aabb_union, get_aabb, get_all_links
from igibson.object_states.object_state_base import CachingEnabledObjectState
class AABB(CachingEnabledObjectState):
def _compute_value(self):
body_id = self.obj.get_body_id()
all_links = get_all_links(body_id)
aabbs = [get_aabb(body_id, link=link) for link in all_links]
aabb_low, aabb_hi = aabb_union(aabbs)
if not hasattr(self.obj, "category") or self.obj.category != "floors" or self.obj.room_floor is None:
return np.array(aabb_low), np.array(aabb_hi)
# TODO: remove after split floors
# room_floor will be set to the correct RoomFloor beforehand
room_instance = self.obj.room_floor.room_instance
# Get the x-y values from the room segmentation map
room_aabb_low, room_aabb_hi = self.obj.room_floor.scene.get_aabb_by_room_instance(room_instance)
if room_aabb_low is None:
return np.array(aabb_low), np.array(aabb_hi)
# Use the z values from pybullet
room_aabb_low[2] = aabb_low[2]
room_aabb_hi[2] = aabb_hi[2]
return np.array(room_aabb_low), np.array(room_aabb_hi)
def _set_value(self, new_value):
raise NotImplementedError("AABB state currently does not support setting.")
# Nothing needs to be done to save/load AABB since it will happen due to pose caching.
def _dump(self):
return None
def load(self, data):
return
53f022c5295afcf5069c62aac2f57d65cf97e719 | 2,147 | py | Python | data_steward/constants/validation/email_notification.py | jp3477/curation | 41f98d57c8273d9963ad6d466a237c99b63c74be | ["MIT"] | 1 | 2021-04-05T18:06:25.000Z | 2021-04-05T18:06:25.000Z | data_steward/constants/validation/email_notification.py | jp3477/curation | 41f98d57c8273d9963ad6d466a237c99b63c74be | ["MIT"] | null | null | null | data_steward/constants/validation/email_notification.py | jp3477/curation | 41f98d57c8273d9963ad6d466a237c99b63c74be | ["MIT"] | null | null | null | MANDRILL_API_KEY = 'MANDRILL_API_KEY'
UNSET_MANDRILL_API_KEY_MSG = f"Mandrill API key not set in environment variable {MANDRILL_API_KEY}"
CONTACT_LIST_QUERY = """
SELECT *
FROM `{{project}}.{{dataset}}.{{contact_table}}`
"""
EHR_OPERATIONS = 'EHR Ops'
EHR_OPS_ZENDESK = 'support@aou-ehr-ops.zendesk.com'
DATA_CURATION_LISTSERV = 'datacuration@researchallofus.org'
NO_REPLY_ADDRESS = 'noreply@researchallofus.org'
NO_DATA_STEWARD = 'no data steward'
# HPO contact list table columns
SITE_NAME = 'site_name'
HPO_ID = 'hpo_id'
SITE_POINT_OF_CONTACT = 'site_point_of_contact'
# Mandrill API constants
MAIL_TO = 'mail_to'
EHR_OPS_SITE_URL = 'https://sites.google.com/view/ehrupload'
# Email content
EMAIL_BODY = """
<p style="font-size:115%;">Hi {{ site_name }},</p>
<p style="font-size:115%;">Your submission <b>{{ folder }}</b>
{% if submission_error %}was NOT successfully loaded on {{ timestamp }}.<br>
{% else %}was successfully loaded on {{ timestamp }}.<br>
{% endif %}
Please review the <code>results.html</code> submission report attached to this email{% if submission_error %}<br>
and resolve the errors before making a new submission{% endif %}.<br>
If any of your files have not been successfully uploaded, please run the
<a href="https://github.com/all-of-us/aou-ehr-file-check">local file check</a> before making your submission.<br>
To view the full set of curation reports, please visit the submission folder in your
GCS bucket <a href="{{ submission_folder_url }}">here</a>.<br>
For more information on the reports and how to download them, please refer to our
<a href="{{ ehr_ops_site_url }}">EHR Ops website</a>.</p>
<p style="font-size:115%;">You are receiving this email because you are listed as a point of contact
for HPO Site <em>{{ site_name }}</em>.<br>
If you have additional questions or wish to no longer receive these emails, please reply/send an
email to <a href="mailto:{{ eo_zendesk }}">{{ eo_zendesk }}</a>.</p>
<p style="font-size:115%;">EHR Ops team, DRC<br>
<em>All of Us</em> Research Program<br>
<img src="cid:{{ aou_logo }}"/></p>
"""
AOU_LOGO = 'aou_logo'
AOU_LOGO_PNG = 'all-of-us-logo.png'
53fbe12da973d06be5b6afaae786b7644d276650 | 1,309 | py | Python | workflows/post_process_run/fv3post/gsutil.py | jacnugent/fv3net | 84958651bdd17784fdab98f87ad0d65414c03368 | ["MIT"] | 5 | 2021-03-20T22:42:40.000Z | 2021-06-30T18:39:36.000Z | workflows/post_process_run/fv3post/gsutil.py | jacnugent/fv3net | 84958651bdd17784fdab98f87ad0d65414c03368 | ["MIT"] | 195 | 2021-09-16T05:47:18.000Z | 2022-03-31T22:03:15.000Z | workflows/post_process_run/fv3post/gsutil.py | ai2cm/fv3net | e62038aee0a97d6207e66baabd8938467838cf51 | ["MIT"] | 1 | 2021-06-16T22:04:24.000Z | 2021-06-16T22:04:24.000Z | import os
import subprocess
import backoff
class GSUtilResumableUploadException(Exception):
pass
def _decode_to_str_if_bytes(s, encoding="utf-8"):
if isinstance(s, bytes):
return s.decode(encoding)
else:
return s
def authenticate():
try:
credentials = os.environ["GOOGLE_APPLICATION_CREDENTIALS"]
except KeyError:
pass
else:
subprocess.check_call(
["gcloud", "auth", "activate-service-account", "--key-file", credentials]
)
@backoff.on_exception(backoff.expo, GSUtilResumableUploadException, max_tries=3)
def upload_dir(d, dest):
try:
# Pipe stderr to stdout because gsutil logs upload progress there.
subprocess.check_output(
["gsutil", "-m", "rsync", "-r", "-e", d, dest], stderr=subprocess.STDOUT
)
except subprocess.CalledProcessError as e:
output = _decode_to_str_if_bytes(e.output)
if "ResumableUploadException" in output:
raise GSUtilResumableUploadException()
else:
raise e
def download_directory(dir_, dest):
os.makedirs(dest, exist_ok=True)
subprocess.check_call(["gsutil", "-m", "rsync", "-r", dir_, dest])
def cp(source, destination):
subprocess.check_call(["gsutil", "cp", source, destination])
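The interesting part of `upload_dir` is how it turns a generic `CalledProcessError` into the retryable `GSUtilResumableUploadException` that the `backoff` decorator targets. The sketch below isolates that except-clause logic (the `classify_failure` name is illustrative) so it can be tested without invoking `gsutil` or `backoff`:

```python
import subprocess

class GSUtilResumableUploadException(Exception):
    pass

def _decode_to_str_if_bytes(s, encoding="utf-8"):
    return s.decode(encoding) if isinstance(s, bytes) else s

def classify_failure(e):
    """Mirror upload_dir's except-clause: re-raise resumable-upload
    failures as a distinct type so a retry decorator can target them."""
    output = _decode_to_str_if_bytes(e.output)
    if "ResumableUploadException" in output:
        raise GSUtilResumableUploadException()
    raise e

err = subprocess.CalledProcessError(
    returncode=1, cmd=["gsutil"],
    output=b"... ResumableUploadException: 503 Service Unavailable ...")
try:
    classify_failure(err)
except GSUtilResumableUploadException:
    print("retryable")  # retryable
```

Because `backoff.on_exception` is keyed to `GSUtilResumableUploadException` only, other `gsutil` failures still surface immediately instead of being retried.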
99019a837f86e3b14c54300ab0d06ff51f85071a | 173 | py | Python | intValues.py | jules552/ProjetISN | 20da3572b59af25a166022bc2f5b25d46add2650 | ["Unlicense"] | null | null | null | intValues.py | jules552/ProjetISN | 20da3572b59af25a166022bc2f5b25d46add2650 | ["Unlicense"] | null | null | null | intValues.py | jules552/ProjetISN | 20da3572b59af25a166022bc2f5b25d46add2650 | ["Unlicense"] | null | null | null | MAP = 1
SPEED = 1.5
VELOCITYRESET = 6
WIDTH = 1280
HEIGHT = 720
X = WIDTH / 2 - 50
Y = HEIGHT / 2 - 50
MOUSER = 325
TICKRATES = 120
nfc = False
raspberry = False | 14.416667 | 20 | 0.606936 | 27 | 173 | 3.888889 | 0.777778 | 0.057143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190083 | 0.300578 | 173 | 12 | 21 | 14.416667 | 0.677686 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54d943f36b7e93ff9b844e618cfa99e6c35ca662 | 2,011 | py | Python | contrib/python/src/python/pants/contrib/python/checks/tasks/checkstyle/pyflakes.py | lahosken/pants | 1b0340987c9b2eab9411416803c75b80736716e4 | ["Apache-2.0"] | null | null | null | contrib/python/src/python/pants/contrib/python/checks/tasks/checkstyle/pyflakes.py | lahosken/pants | 1b0340987c9b2eab9411416803c75b80736716e4 | ["Apache-2.0"] | null | null | null | contrib/python/src/python/pants/contrib/python/checks/tasks/checkstyle/pyflakes.py | lahosken/pants | 1b0340987c9b2eab9411416803c75b80736716e4 | ["Apache-2.0"] | null | null | null | # coding=utf-8
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from __future__ import (absolute_import, division, generators, nested_scopes, print_function,
unicode_literals, with_statement)
from pyflakes.checker import Checker as FlakesChecker
from pants.contrib.python.checks.tasks.checkstyle.common import CheckstylePlugin, Nit
class FlakeError(Nit):
# TODO(wickman) There is overlap between this and Flake8 -- consider integrating
# checkstyle plug-ins into the PEP8 tool directly so that this can be inherited
# by flake8.
# Code reference is here: http://flake8.readthedocs.org/en/latest/warnings.html
CLASS_ERRORS = {
'DuplicateArgument': 'F831',
'ImportShadowedByLoopVar': 'F402',
'ImportStarUsed': 'F403',
'LateFutureImport': 'F404',
'Redefined': 'F810',
'RedefinedInListComp': 'F812',
'RedefinedWhileUnused': 'F811',
'UndefinedExport': 'F822',
'UndefinedLocal': 'F823',
'UndefinedName': 'F821',
'UnusedImport': 'F401',
'UnusedVariable': 'F841',
}
def __init__(self, python_file, flake_message):
line_range = python_file.line_range(flake_message.lineno)
super(FlakeError, self).__init__(
self.get_error_code(flake_message),
Nit.ERROR,
python_file.filename,
flake_message.message % flake_message.message_args,
line_range,
python_file.lines[line_range])
@classmethod
def get_error_code(cls, message):
return cls.CLASS_ERRORS.get(message.__class__.__name__, 'F999')
class PyflakesChecker(CheckstylePlugin):
"""Detect common coding errors via the pyflakes package."""
def nits(self):
checker = FlakesChecker(self.python_file.tree, self.python_file.filename)
for message in sorted(checker.messages, key=lambda msg: msg.lineno):
if FlakeError.get_error_code(message) not in self.options.ignore:
yield FlakeError(self.python_file, message)
| 35.910714 | 93 | 0.721532 | 235 | 2,011 | 5.961702 | 0.604255 | 0.049964 | 0.039971 | 0.027123 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030048 | 0.172551 | 2,011 | 55 | 94 | 36.563636 | 0.811899 | 0.218797 | 0 | 0 | 0 | 0 | 0.152662 | 0.014753 | 0 | 0 | 0 | 0.018182 | 0 | 1 | 0.081081 | false | 0 | 0.189189 | 0.027027 | 0.378378 | 0.027027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54e0817402b9c2ce35c6af23684ce91b4042e10a | 5,639 | py | Python | home/views.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | null | null | null | home/views.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | null | null | null | home/views.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | 2 | 2021-06-17T05:35:07.000Z | 2021-06-17T06:01:23.000Z | from django.http.response import HttpResponse
from django.shortcuts import redirect, render
from cryptography.fernet import Fernet
from .models import Book, Contact, Diagnostic, Report, UserDetails
from datetime import datetime
# Create your views here.
def homePage(request):
if(request.method == 'POST'):
email = request.POST.get('email')
password = request.POST.get('password')
try:
object = UserDetails.objects.get(email = email)
key1 = object.key
key1=key1[2:-1]
key1 = bytes(key1,'utf-8')
f = Fernet(key1)
truepassword = object.password
truepassword = truepassword[2:-1]
truepassword = bytes(truepassword,'utf-8')
truepassword = f.decrypt(truepassword).decode('utf-8')
        except Exception:  # user lookup or password decryption failed
object = None
if(object==None):
context = {
'message': "Email Does Not Exist"
}
return render(request,"login.html",context)
elif(password == truepassword):
if object.profession == "PATIENT":
object1=UserDetails.objects.filter(profession="DOCTOR")
# name=(object.name)
# appointment(request,email,name)
context1={
'message':'Welcome '+object.name,
'mail' : object.email,
'doctors':object1
}
return render(request,"index.html",context1)
else:
context2={
'message':'Welcome '+object.name,
'mail' : object.email
}
return render(request,"dindex.html",context2)
else:
return redirect("/")
else:
return render(request,"login.html",{})
def signUpPage(request):
if(request.method == 'POST'):
name = request.POST.get('name')
email = request.POST.get('email')
password = request.POST.get('password')
passwordVerif = request.POST.get('passwordVerif')
profession = request.POST.get('user')
data = request.POST.get('data')
if(email ==''):
context = {
'message': "Please enter Email ID"
}
return render(request,"signup.html",context)
elif(password == passwordVerif):
key = Fernet.generate_key()
f = Fernet(key)
password = bytes(password,'utf-8')
token = f.encrypt(password)
key = str(key)
print(key)
UserDetails.objects.create(email=email, name=name , password=token, key = key, profession=profession, data=data)
return redirect("/")
else:
context = {
'message': "Password doesn't match"
}
return render(request,"signup.html",context)
else:
return render(request,"signup.html",{})
# def index(request):
# context={ 'alpha': 'This is sent'}
# if request.method=='POST':
# pass
# else: return render(request, 'index.html',context)
#HttpResponse('This is homepage')
def about(request):
return render(request, 'about.html')
def services(request):
return render(request, 'services.html')
def contact(request):
if request.method == "POST":
email = request.POST.get('email')
name = request.POST.get('name')
phone = request.POST.get('phone')
address = request.POST.get('address')
contact = Contact(email=email , name=name, phone=phone,address=address,date=datetime.today())
contact.save()
# messages.success(request, 'Your message has been sent !')
return render(request,"contact.html")
def book(request):
if request.method == "POST":
email = request.POST.get('email')
name = request.POST.get('name')
phone = request.POST.get('phone')
address = request.POST.get('address')
book = Book(email=email , name=name, phone=phone,problem=address,date=datetime.today())
book.save()
return render(request,"book.html")
def report(request):
if request.method == "POST":
email = request.POST.get('email')
name = request.POST.get('name')
phone = request.POST.get('phone')
message = request.POST.get('message')
report = Report(email=email , name=name, phone=phone, message=message, date=datetime.today())
report.save()
return render(request,"report.html")
def diag(request):
if request.method == "POST":
email = request.POST.get('email')
name = request.POST.get('name')
phone = request.POST.get('phone')
tests = request.POST.get('drop1')
tests = str(tests)
if(email ==''):
context = {
'message': "Please enter Email ID"
}
return render(request,"diag.html",context)
else:
diag = Diagnostic(email=email , name=name, phone=phone, tests=tests, date=datetime.today())
diag.save()
# messages.success(request, 'Your message has been sent !')
return render(request,"diag.html")
# def appointment(request,email,name):
# if request.method == "POST":
# problem = request.POST.get('problem')
# book = Appoint(problem=problem, email=email, name=name)
# book.save()
# return render(request,"index.html") | 33.565476 | 124 | 0.567477 | 589 | 5,639 | 5.431239 | 0.183362 | 0.085964 | 0.109409 | 0.047515 | 0.417318 | 0.333542 | 0.281963 | 0.25758 | 0.25758 | 0.25758 | 0 | 0.005339 | 0.302536 | 5,639 | 168 | 125 | 33.565476 | 0.808035 | 0.109949 | 0 | 0.357143 | 0 | 0 | 0.09916 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063492 | false | 0.103175 | 0.079365 | 0.015873 | 0.269841 | 0.007937 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
54e0ed7eefaaeac2cfcbec8d464ffc806c518afa | 9,892 | py | Python | compressor/tests/templatetags.py | bigmlcom/django_compressor | 66dfda503633018275fdb64ad46ef80dc9a3901d | [
"Apache-2.0"
] | null | null | null | compressor/tests/templatetags.py | bigmlcom/django_compressor | 66dfda503633018275fdb64ad46ef80dc9a3901d | [
"Apache-2.0"
] | null | null | null | compressor/tests/templatetags.py | bigmlcom/django_compressor | 66dfda503633018275fdb64ad46ef80dc9a3901d | [
"Apache-2.0"
] | null | null | null | from __future__ import with_statement
import os
import sys
from mock import Mock
from django.template import Template, Context, TemplateSyntaxError
from django.test import TestCase
from compressor.conf import settings
from compressor.signals import post_compress
from compressor.tests.base import css_tag, test_dir
def render(template_string, context_dict=None):
"""
A shortcut for testing template output.
"""
if context_dict is None:
context_dict = {}
c = Context(context_dict)
t = Template(template_string)
return t.render(c).strip()
class TemplatetagTestCase(TestCase):
def setUp(self):
self.old_enabled = settings.COMPRESS_ENABLED
settings.COMPRESS_ENABLED = True
self.context = {'MEDIA_URL': settings.COMPRESS_URL}
def tearDown(self):
settings.COMPRESS_ENABLED = self.old_enabled
def test_empty_tag(self):
template = u"""{% load compress %}{% compress js %}{% block js %}
{% endblock %}{% endcompress %}"""
self.assertEqual(u'', render(template, self.context))
def test_css_tag(self):
template = u"""{% load compress %}{% compress css %}
<link rel="stylesheet" href="{{ MEDIA_URL }}css/one.css" type="text/css">
<style type="text/css">p { border:5px solid green;}</style>
<link rel="stylesheet" href="{{ MEDIA_URL }}css/two.css" type="text/css">
{% endcompress %}"""
out = css_tag("/media/CACHE/css/e41ba2cc6982.css")
self.assertEqual(out, render(template, self.context))
def test_uppercase_rel(self):
template = u"""{% load compress %}{% compress css %}
<link rel="StyleSheet" href="{{ MEDIA_URL }}css/one.css" type="text/css">
<style type="text/css">p { border:5px solid green;}</style>
<link rel="StyleSheet" href="{{ MEDIA_URL }}css/two.css" type="text/css">
{% endcompress %}"""
out = css_tag("/media/CACHE/css/e41ba2cc6982.css")
self.assertEqual(out, render(template, self.context))
def test_nonascii_css_tag(self):
template = u"""{% load compress %}{% compress css %}
<link rel="stylesheet" href="{{ MEDIA_URL }}css/nonasc.css" type="text/css">
<style type="text/css">p { border:5px solid green;}</style>
{% endcompress %}
"""
out = css_tag("/media/CACHE/css/799f6defe43c.css")
self.assertEqual(out, render(template, self.context))
def test_js_tag(self):
template = u"""{% load compress %}{% compress js %}
<script src="{{ MEDIA_URL }}js/one.js" type="text/javascript"></script>
<script type="text/javascript">obj.value = "value";</script>
{% endcompress %}
"""
out = u'<script type="text/javascript" src="/media/CACHE/js/066cd253eada.js"></script>'
self.assertEqual(out, render(template, self.context))
def test_nonascii_js_tag(self):
template = u"""{% load compress %}{% compress js %}
<script src="{{ MEDIA_URL }}js/nonasc.js" type="text/javascript"></script>
<script type="text/javascript">var test_value = "\u2014";</script>
{% endcompress %}
"""
out = u'<script type="text/javascript" src="/media/CACHE/js/e214fe629b28.js"></script>'
self.assertEqual(out, render(template, self.context))
def test_nonascii_latin1_js_tag(self):
template = u"""{% load compress %}{% compress js %}
<script src="{{ MEDIA_URL }}js/nonasc-latin1.js" type="text/javascript" charset="latin-1"></script>
<script type="text/javascript">var test_value = "\u2014";</script>
{% endcompress %}
"""
out = u'<script type="text/javascript" src="/media/CACHE/js/be9e078b5ca7.js"></script>'
self.assertEqual(out, render(template, self.context))
def test_compress_tag_with_illegal_arguments(self):
template = u"""{% load compress %}{% compress pony %}
<script type="pony/application">unicorn</script>
{% endcompress %}"""
self.assertRaises(TemplateSyntaxError, render, template, {})
def test_debug_toggle(self):
template = u"""{% load compress %}{% compress js %}
<script src="{{ MEDIA_URL }}js/one.js" type="text/javascript"></script>
<script type="text/javascript">obj.value = "value";</script>
{% endcompress %}
"""
class MockDebugRequest(object):
GET = {settings.COMPRESS_DEBUG_TOGGLE: 'true'}
context = dict(self.context, request=MockDebugRequest())
out = u"""<script src="/media/js/one.js" type="text/javascript"></script>
<script type="text/javascript">obj.value = "value";</script>"""
self.assertEqual(out, render(template, context))
def test_named_compress_tag(self):
template = u"""{% load compress %}{% compress js inline foo %}
<script type="text/javascript">obj.value = "value";</script>
{% endcompress %}
"""
def listener(sender, **kwargs):
pass
callback = Mock(wraps=listener)
post_compress.connect(callback)
render(template)
args, kwargs = callback.call_args
context = kwargs['context']
self.assertEqual('foo', context['compressed']['name'])
class PrecompilerTemplatetagTestCase(TestCase):
def setUp(self):
self.old_enabled = settings.COMPRESS_ENABLED
self.old_precompilers = settings.COMPRESS_PRECOMPILERS
precompiler = os.path.join(test_dir, 'precompiler.py')
python = sys.executable
settings.COMPRESS_ENABLED = True
settings.COMPRESS_PRECOMPILERS = (
('text/coffeescript', '%s %s' % (python, precompiler)),
)
self.context = {'MEDIA_URL': settings.COMPRESS_URL}
def tearDown(self):
settings.COMPRESS_ENABLED = self.old_enabled
settings.COMPRESS_PRECOMPILERS = self.old_precompilers
def test_compress_coffeescript_tag(self):
template = u"""{% load compress %}{% compress js %}
<script type="text/coffeescript"># this is a comment.</script>
{% endcompress %}"""
out = script(src="/media/CACHE/js/e920d58f166d.js")
self.assertEqual(out, render(template, self.context))
def test_compress_coffeescript_tag_and_javascript_tag(self):
template = u"""{% load compress %}{% compress js %}
<script type="text/coffeescript"># this is a comment.</script>
<script type="text/javascript"># this too is a comment.</script>
{% endcompress %}"""
out = script(src="/media/CACHE/js/ef6b32a54575.js")
self.assertEqual(out, render(template, self.context))
def test_coffeescript_and_js_tag_with_compress_enabled_equals_false(self):
self.old_enabled = settings.COMPRESS_ENABLED
settings.COMPRESS_ENABLED = False
try:
template = u"""{% load compress %}{% compress js %}
<script type="text/coffeescript"># this is a comment.</script>
<script type="text/javascript"># this too is a comment.</script>
{% endcompress %}"""
out = (script('# this is a comment.\n') + '\n' +
script('# this too is a comment.'))
self.assertEqual(out, render(template, self.context))
finally:
settings.COMPRESS_ENABLED = self.old_enabled
def test_compress_coffeescript_tag_compress_enabled_is_false(self):
self.old_enabled = settings.COMPRESS_ENABLED
settings.COMPRESS_ENABLED = False
try:
template = u"""{% load compress %}{% compress js %}
<script type="text/coffeescript"># this is a comment.</script>
{% endcompress %}"""
out = script("# this is a comment.\n")
self.assertEqual(out, render(template, self.context))
finally:
settings.COMPRESS_ENABLED = self.old_enabled
def test_compress_coffeescript_file_tag_compress_enabled_is_false(self):
self.old_enabled = settings.COMPRESS_ENABLED
settings.COMPRESS_ENABLED = False
try:
template = u"""
{% load compress %}{% compress js %}
<script type="text/coffeescript" src="{{ MEDIA_URL }}js/one.coffee">
</script>
{% endcompress %}"""
out = script(src="/media/CACHE/js/one.95cfb869eead.js")
self.assertEqual(out, render(template, self.context))
finally:
settings.COMPRESS_ENABLED = self.old_enabled
def test_multiple_file_order_conserved(self):
self.old_enabled = settings.COMPRESS_ENABLED
settings.COMPRESS_ENABLED = False
try:
template = u"""
{% load compress %}{% compress js %}
<script type="text/coffeescript" src="{{ MEDIA_URL }}js/one.coffee">
</script>
<script src="{{ MEDIA_URL }}js/one.js"></script>
<script type="text/coffeescript" src="{{ MEDIA_URL }}js/one.js">
</script>
{% endcompress %}"""
out = '\n'.join([
script(src="/media/CACHE/js/one.95cfb869eead.js"),
script(scripttype="", src="/media/js/one.js"),
script(src="/media/CACHE/js/one.81a2cd965815.js"),])
self.assertEqual(out, render(template, self.context))
finally:
settings.COMPRESS_ENABLED = self.old_enabled
def script(content="", src="", scripttype="text/javascript"):
"""
returns a unicode text html script element.
>>> script('#this is a comment', scripttype="text/applescript")
'<script type="text/applescript">#this is a comment</script>'
"""
out_script = u'<script '
if scripttype:
out_script += u'type="%s" ' % scripttype
if src:
out_script += u'src="%s" ' % src
return out_script[:-1] + u'>%s</script>' % content
| 41.563025 | 107 | 0.616761 | 1,119 | 9,892 | 5.322609 | 0.138517 | 0.042982 | 0.044661 | 0.056414 | 0.708193 | 0.683848 | 0.663533 | 0.651612 | 0.622901 | 0.603425 | 0 | 0.011936 | 0.237768 | 9,892 | 237 | 108 | 41.738397 | 0.777984 | 0.021229 | 0 | 0.530928 | 0 | 0.046392 | 0.434097 | 0.120295 | 0 | 0 | 0 | 0 | 0.082474 | 1 | 0.118557 | false | 0.005155 | 0.046392 | 0 | 0.190722 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54e1fce9e0db363710daf71e66104aba025bc831 | 477 | py | Python | ringapp/migrations/0009_auto_20150116_1759.py | rschwiebert/RingApp | 35675b3dd81728d71b7dc70071be3185d7f99bf4 | [
"MIT"
] | 10 | 2015-02-02T12:40:05.000Z | 2022-01-29T14:11:03.000Z | ringapp/migrations/0009_auto_20150116_1759.py | rschwiebert/RingApp | 35675b3dd81728d71b7dc70071be3185d7f99bf4 | [
"MIT"
] | 22 | 2015-01-07T21:29:24.000Z | 2022-03-19T01:15:13.000Z | ringapp/migrations/0009_auto_20150116_1759.py | rschwiebert/RingApp | 35675b3dd81728d71b7dc70071be3185d7f99bf4 | [
"MIT"
] | 1 | 2016-08-07T15:41:51.000Z | 2016-08-07T15:41:51.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
class Migration(migrations.Migration):
dependencies = [
('ringapp', '0008_auto_20150116_1755'),
]
operations = [
migrations.AlterModelTable(
name='invariance',
table='invariance',
),
migrations.AlterModelTable(
name='invarianttype',
table='invariant_types',
),
]
| 20.73913 | 47 | 0.589099 | 38 | 477 | 7.157895 | 0.736842 | 0.183824 | 0.213235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050898 | 0.29979 | 477 | 22 | 48 | 21.681818 | 0.763473 | 0.044025 | 0 | 0.25 | 0 | 0 | 0.171806 | 0.050661 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54e218f734c2d85cbff6df8c45d35331a499ae96 | 654 | py | Python | front-end/testsuite-python-lib/Python-3.1/Lib/json/tests/test_dump.py | MalloyPower/parsing-python | b2bca5eed07ea2af7a2001cd4f63becdfb0570be | [
"MIT"
] | 1 | 2020-11-26T18:53:46.000Z | 2020-11-26T18:53:46.000Z | Lib/json/tests/test_dump.py | orestis/python | 870a82aac7788ffa105e2a3e4480b3715c93bff6 | [
"PSF-2.0"
] | null | null | null | Lib/json/tests/test_dump.py | orestis/python | 870a82aac7788ffa105e2a3e4480b3715c93bff6 | [
"PSF-2.0"
] | 2 | 2018-08-06T04:37:38.000Z | 2022-02-27T18:07:12.000Z | from unittest import TestCase
from io import StringIO
import json
class TestDump(TestCase):
def test_dump(self):
sio = StringIO()
json.dump({}, sio)
self.assertEquals(sio.getvalue(), '{}')
def test_dumps(self):
self.assertEquals(json.dumps({}), '{}')
def test_encode_truefalse(self):
self.assertEquals(json.dumps(
{True: False, False: True}, sort_keys=True),
'{"false": true, "true": false}')
self.assertEquals(json.dumps(
{2: 3.0, 4.0: 5, False: 1, 6: True}, sort_keys=True),
'{"false": 1, "2": 3.0, "4.0": 5, "6": true}')
| 29.727273 | 69 | 0.547401 | 81 | 654 | 4.345679 | 0.37037 | 0.181818 | 0.170455 | 0.213068 | 0.318182 | 0.034091 | 0 | 0 | 0 | 0 | 0 | 0.034115 | 0.282875 | 654 | 21 | 70 | 31.142857 | 0.716418 | 0 | 0 | 0.117647 | 0 | 0.058824 | 0.117737 | 0 | 0 | 0 | 0 | 0 | 0.235294 | 1 | 0.176471 | false | 0 | 0.176471 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54e901540b5f6fa6fc62f5e51511aa0c656882ca | 3,653 | py | Python | venv/Lib/site-packages/captcha/conf/settings.py | Rudeus3Greyrat/admin-management | 7e81d2b1908afa3ea57a82c542c9aebb1d0ffd23 | [
"MIT"
] | 1 | 2020-05-21T06:48:34.000Z | 2020-05-21T06:48:34.000Z | venv/Lib/site-packages/captcha/conf/settings.py | Rudeus3Greyrat/admin-management | 7e81d2b1908afa3ea57a82c542c9aebb1d0ffd23 | [
"MIT"
] | 3 | 2021-03-19T03:07:36.000Z | 2021-04-08T20:33:38.000Z | venv/Lib/site-packages/captcha/conf/settings.py | Rudeus3Greyrat/admin-management | 7e81d2b1908afa3ea57a82c542c9aebb1d0ffd23 | [
"MIT"
] | 1 | 2020-05-21T06:48:36.000Z | 2020-05-21T06:48:36.000Z | import os
import warnings
from django.conf import settings
CAPTCHA_FONT_PATH = getattr(settings, 'CAPTCHA_FONT_PATH', os.path.normpath(os.path.join(os.path.dirname(__file__), '..', 'fonts/Vera.ttf')))
CAPTCHA_FONT_SIZE = getattr(settings, 'CAPTCHA_FONT_SIZE', 22)
CAPTCHA_LETTER_ROTATION = getattr(settings, 'CAPTCHA_LETTER_ROTATION', (-35, 35))
CAPTCHA_BACKGROUND_COLOR = getattr(settings, 'CAPTCHA_BACKGROUND_COLOR', '#ffffff')
CAPTCHA_FOREGROUND_COLOR = getattr(settings, 'CAPTCHA_FOREGROUND_COLOR', '#001100')
CAPTCHA_CHALLENGE_FUNCT = getattr(settings, 'CAPTCHA_CHALLENGE_FUNCT', 'captcha.helpers.random_char_challenge')
CAPTCHA_NOISE_FUNCTIONS = getattr(settings, 'CAPTCHA_NOISE_FUNCTIONS', ('captcha.helpers.noise_arcs', 'captcha.helpers.noise_dots',))
CAPTCHA_FILTER_FUNCTIONS = getattr(settings, 'CAPTCHA_FILTER_FUNCTIONS', ('captcha.helpers.post_smooth',))
CAPTCHA_WORDS_DICTIONARY = getattr(settings, 'CAPTCHA_WORDS_DICTIONARY', '/usr/share/dict/words')
CAPTCHA_PUNCTUATION = getattr(settings, 'CAPTCHA_PUNCTUATION', '''_"',.;:-''')
CAPTCHA_FLITE_PATH = getattr(settings, 'CAPTCHA_FLITE_PATH', None)
CAPTCHA_SOX_PATH = getattr(settings, 'CAPTCHA_SOX_PATH', None)
CAPTCHA_TIMEOUT = getattr(settings, 'CAPTCHA_TIMEOUT', 5) # Minutes
CAPTCHA_LENGTH = int(getattr(settings, 'CAPTCHA_LENGTH', 4)) # Chars
# CAPTCHA_IMAGE_BEFORE_FIELD = getattr(settings, 'CAPTCHA_IMAGE_BEFORE_FIELD', True)
CAPTCHA_DICTIONARY_MIN_LENGTH = getattr(settings, 'CAPTCHA_DICTIONARY_MIN_LENGTH', 0)
CAPTCHA_DICTIONARY_MAX_LENGTH = getattr(settings, 'CAPTCHA_DICTIONARY_MAX_LENGTH', 99)
CAPTCHA_IMAGE_SIZE = getattr(settings, 'CAPTCHA_IMAGE_SIZE', None)
CAPTCHA_IMAGE_TEMPLATE = getattr(settings, 'CAPTCHA_IMAGE_TEMPLATE', 'captcha/image.html')
CAPTCHA_HIDDEN_FIELD_TEMPLATE = getattr(settings, 'CAPTCHA_HIDDEN_FIELD_TEMPLATE', 'captcha/hidden_field.html')
CAPTCHA_TEXT_FIELD_TEMPLATE = getattr(settings, 'CAPTCHA_TEXT_FIELD_TEMPLATE', 'captcha/text_field.html')
if getattr(settings, 'CAPTCHA_FIELD_TEMPLATE', None):
msg = ("CAPTCHA_FIELD_TEMPLATE setting is deprecated in favor of widget's template_name.")
warnings.warn(msg, DeprecationWarning)
CAPTCHA_FIELD_TEMPLATE = getattr(settings, 'CAPTCHA_FIELD_TEMPLATE', None)
if getattr(settings, 'CAPTCHA_OUTPUT_FORMAT', None):
msg = ("CAPTCHA_OUTPUT_FORMAT setting is deprecated in favor of widget's template_name.")
warnings.warn(msg, DeprecationWarning)
CAPTCHA_OUTPUT_FORMAT = getattr(settings, 'CAPTCHA_OUTPUT_FORMAT', None)
CAPTCHA_MATH_CHALLENGE_OPERATOR = getattr(settings, 'CAPTCHA_MATH_CHALLENGE_OPERATOR', '*')
CAPTCHA_GET_FROM_POOL = getattr(settings, 'CAPTCHA_GET_FROM_POOL', False)
CAPTCHA_GET_FROM_POOL_TIMEOUT = getattr(settings, 'CAPTCHA_GET_FROM_POOL_TIMEOUT', 5)
CAPTCHA_TEST_MODE = getattr(settings, 'CAPTCHA_TEST_MODE', False)
# Failsafe
if CAPTCHA_DICTIONARY_MIN_LENGTH > CAPTCHA_DICTIONARY_MAX_LENGTH:
CAPTCHA_DICTIONARY_MIN_LENGTH, CAPTCHA_DICTIONARY_MAX_LENGTH = CAPTCHA_DICTIONARY_MAX_LENGTH, CAPTCHA_DICTIONARY_MIN_LENGTH
def _callable_from_string(string_or_callable):
if callable(string_or_callable):
return string_or_callable
else:
return getattr(__import__('.'.join(string_or_callable.split('.')[:-1]), {}, {}, ['']), string_or_callable.split('.')[-1])
def get_challenge(generator=None):
return _callable_from_string(generator or CAPTCHA_CHALLENGE_FUNCT)
def noise_functions():
if CAPTCHA_NOISE_FUNCTIONS:
return map(_callable_from_string, CAPTCHA_NOISE_FUNCTIONS)
return []
def filter_functions():
if CAPTCHA_FILTER_FUNCTIONS:
return map(_callable_from_string, CAPTCHA_FILTER_FUNCTIONS)
return []
| 52.942029 | 141 | 0.800712 | 456 | 3,653 | 5.97807 | 0.221491 | 0.165077 | 0.234043 | 0.047689 | 0.319516 | 0.233309 | 0.152605 | 0.121056 | 0.121056 | 0.108217 | 0 | 0.006019 | 0.090337 | 3,653 | 68 | 142 | 53.720588 | 0.814324 | 0.028744 | 0 | 0.076923 | 0 | 0 | 0.288738 | 0.196444 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.076923 | 0.019231 | 0.288462 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54eceeb38625ac7f7302479b3298ad5a3adabd40 | 1,307 | py | Python | src/lora_multihop/module_config.py | marv1913/lora_multihop | ef07493c2f763d07161fa25d4b884ef79b94afa4 | [
"MIT"
] | null | null | null | src/lora_multihop/module_config.py | marv1913/lora_multihop | ef07493c2f763d07161fa25d4b884ef79b94afa4 | [
"MIT"
] | 1 | 2022-02-20T13:18:13.000Z | 2022-02-24T18:32:23.000Z | src/lora_multihop/module_config.py | marv1913/lora_multihop | ef07493c2f763d07161fa25d4b884ef79b94afa4 | [
"MIT"
] | null | null | null | import logging
from lora_multihop import serial_connection, variables
def config_module(configuration=variables.MODULE_CONFIG):
if serial_connection.execute_command(configuration, [variables.STATUS_OK]):
serial_connection.execute_command('AT+SEND=1', [variables.STATUS_OK])
serial_connection.execute_command('a', ['AT,SENDING', 'AT,SENDED'])
logging.debug('module config successfully set')
return True
logging.warning("could not set module config")
return False
def set_address(address):
cmd = f'AT+ADDR={address}'
if serial_connection.execute_command(serial_connection.str_to_bytes(cmd), [variables.STATUS_OK]):
logging.debug(f'module address successfully set to: {address}')
return True
logging.warning("could not set module address")
return False
def get_current_address():
serial_connection.execute_command(serial_connection.str_to_bytes(variables.GET_ADDR))
addr = serial_connection.response_q.get(variables.COMMAND_VERIFICATION_TIMEOUT)
addr = serial_connection.bytes_to_str(addr)
addr_as_list = addr.split(variables.LORA_MODULE_DELIMITER)
if addr_as_list[0].strip() != 'AT' or addr_as_list[2].strip() != 'OK':
raise ValueError('could not get address of module')
return addr_as_list[1]
| 39.606061 | 101 | 0.746748 | 173 | 1,307 | 5.387283 | 0.323699 | 0.171674 | 0.123391 | 0.160944 | 0.345494 | 0.309013 | 0.309013 | 0.208155 | 0.120172 | 0 | 0 | 0.003604 | 0.150727 | 1,307 | 32 | 102 | 40.84375 | 0.836036 | 0 | 0 | 0.16 | 0 | 0 | 0.161438 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0 | 0.08 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
54f7f3b4bb05515aa800aef3ce44e23eb1933bf4 | 443 | py | Python | Desafios/desafio_041.py | romulogoleniesky/Python_C_E_V | 2dcf5fb3505a20443788a284c52114c6434118ce | [
"MIT"
] | null | null | null | Desafios/desafio_041.py | romulogoleniesky/Python_C_E_V | 2dcf5fb3505a20443788a284c52114c6434118ce | [
"MIT"
] | null | null | null | Desafios/desafio_041.py | romulogoleniesky/Python_C_E_V | 2dcf5fb3505a20443788a284c52114c6434118ce | [
"MIT"
] | null | null | null | import datetime
ano = (datetime.datetime.now()).year
nasc = int(input("Digite o seu ano de nascimento: "))
categoria = 0
if (ano - nasc) <= 9:
categoria = str("MIRIM")
elif 9 < (ano - nasc) <= 14:
categoria = str("INFANTIL")
elif 14 < (ano - nasc) <= 19 :
categoria = str("JUNIOR")
elif 19 < (ano - nasc) <= 25:
categoria = str("SÊNIOR")
else:
categoria = str("MASTER")
print(f"A categoria do atleta é {str(categoria)}.")
| 26.058824 | 53 | 0.616253 | 62 | 443 | 4.403226 | 0.548387 | 0.21978 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036932 | 0.205418 | 443 | 16 | 54 | 27.6875 | 0.738636 | 0 | 0 | 0 | 0 | 0 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0.066667 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
0708030cc6b0ac486ef0bd568029e80e9873483c | 2,332 | py | Python | particle.py | coush001/Imperial-MSc-Group-Project-2 | 9309217895802d11c6fe9d2dca9b21f98fbc1c61 | [
"MIT"
] | null | null | null | particle.py | coush001/Imperial-MSc-Group-Project-2 | 9309217895802d11c6fe9d2dca9b21f98fbc1c61 | [
"MIT"
] | null | null | null | particle.py | coush001/Imperial-MSc-Group-Project-2 | 9309217895802d11c6fe9d2dca9b21f98fbc1c61 | [
"MIT"
] | null | null | null | from itertools import count
import numpy as np
class Particle(object):
"""Object containing all the properties for a single particle"""
_ids = count(0)
def __init__(self, main_data=None, x=np.zeros(2)):
self.id = next(self._ids)
self.main_data = main_data
self.x = np.array(x)
self.v = np.zeros(2)
self.a = np.zeros(2)
self.D = 0
self.rho = main_data.rho0
self.P = 0
self.m = main_data.dx ** 2 * main_data.rho0 # initial mass depends on the initial particle spacing
self.boundary = False # Particle by default is not on the boundary
# For predictor corrector
self.prev_x = np.array(x)
self.prev_v = np.zeros(2)
self.prev_rho = main_data.rho0
def calc_index(self):
"""Calculates the 2D integer index for the particle's location in the search grid"""
# Calculates the bucket coordinates
self.list_num = np.array((self.x - self.main_data.min_x) /
(2.0 * self.main_data.h), int)
def B(self):
return (self.main_data.rho0 * self.main_data.c0 ** 2) / self.main_data.gamma
def update_P(self):
"""
Equation of state
System is assumed slightly compressible
"""
rho0 = self.main_data.rho0
gamma = self.main_data.gamma
self.P = self.B() * ((self.rho / rho0)**gamma - 1)
def set_main_data(self, main_data):
self.main_data = main_data
def set_x(self, x):
self.x = x
self.calc_index()
def set_v(self, v):
self.v = v
def set_a(self, a):
self.a = a
def set_D(self, D):
self.D = D
def set_rho(self, rho):
self.rho = rho
self.update_P()
    def set_m(self, m):  # a method named `m` would be shadowed by the `self.m` attribute set in __init__
        self.m = m
def list_attributes(self):
x_s = "position: " + str(self.x) + ", "
v_s = "velocity: " + str(self.v) + ", "
a_s = "acceleration: " + str(self.a) + ", "
D_s = "derivative of density: " + str(self.D) + ", "
rho_s = "density: " + str(self.rho) + ", "
m_s = "mass: " + str(self.m) + ", "
P_s = "pressure: " + str(self.P) + ", "
boundary_s = "is boundary: " + str(self.boundary)
return [x_s + v_s + a_s + D_s + rho_s + m_s + P_s + boundary_s]
| 30.285714 | 107 | 0.551887 | 344 | 2,332 | 3.578488 | 0.264535 | 0.116978 | 0.10723 | 0.038993 | 0.090983 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013208 | 0.318182 | 2,332 | 76 | 108 | 30.684211 | 0.761006 | 0.150086 | 0 | 0.037736 | 0 | 0 | 0.056273 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.226415 | false | 0 | 0.037736 | 0.018868 | 0.339623 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
070a6926f75c6689b9bf183a8c81961b1ffe5bbd | 1,150 | py | Python | python/pyoai/setup.py | jr3cermak/robs-kitchensink | 74b7eb1b1acd8b700d61c5a9ba0c69be3cc6763a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | python/pyoai/setup.py | jr3cermak/robs-kitchensink | 74b7eb1b1acd8b700d61c5a9ba0c69be3cc6763a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | python/pyoai/setup.py | jr3cermak/robs-kitchensink | 74b7eb1b1acd8b700d61c5a9ba0c69be3cc6763a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | from setuptools import setup, find_packages
from os.path import join, dirname
setup(
name='pyoai',
version='2.4.6.b',
author='Infrae',
author_email='rob.cermak@gmail.com',
url='https://github.com/jr3cermak/robs-kitchensink/tree/master/python/pyoai',
classifiers=["Development Status :: 4 - Beta",
"Programming Language :: Python",
"License :: OSI Approved :: BSD License",
"Topic :: Software Development :: Libraries :: Python Modules",
"Environment :: Web Environment"],
description="""\
The oaipmh module is a Python implementation of an "Open Archives
Initiative Protocol for Metadata Harvesting" (version 2) client and server.
The protocol is described here:
http://www.openarchives.org/OAI/openarchivesprotocol.html
""",
long_description=(open(join(dirname(__file__), 'README.rst')).read()+
'\n\n'+
open(join(dirname(__file__), 'HISTORY.txt')).read()),
packages=find_packages('src'),
package_dir = {'': 'src'},
zip_safe=False,
license='BSD',
keywords='OAI-PMH xml archive',
install_requires=['lxml'],
)
| 35.9375 | 81 | 0.650435 | 133 | 1,150 | 5.511278 | 0.736842 | 0.04502 | 0.040928 | 0.051842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00655 | 0.203478 | 1,150 | 31 | 82 | 37.096774 | 0.793668 | 0 | 0 | 0 | 0 | 0 | 0.511304 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.068966 | 0 | 0.068966 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# 070b402dc83b92f4ca29c79684b3e9fb26a6238f | utils/functions.py
# repo: Roozbeh-Bazargani/CPSC-533R-project | license: MIT
import torch
from torch import nn
import math
#0 left hip
#1 left knee
#2 left foot
#3 right hip
#4 right knee
#5 right foot
#6 middle hip
#7 neck
#8 nose
#9 head
#10 left shoulder
#11 left elbow
#12 left wrist
#13 right shoulder
#14 right elbow
#15 right wrist
def random_rotation(J3d):
  J = J3d  # NOTE: not a copy; clone upstream if the original joints must be preserved
  batch_size = J.shape[0]
  theta = torch.rand(batch_size).cuda() * 2*torch.tensor(math.pi).cuda()  # random angle in [0, 2*pi) per batch element
  root = J[:,:,8]  # joint 8 (nose) is the root
  J3d_R = rotation(J.cuda(), theta.cuda(), root.unsqueeze(-1).cuda(), False)
  return J3d_R, theta, root  # theta and root are needed later by reverse_rotation
def rotation(J, theta, root, is_reversed): # rotation over y axis by theta
D = root[:,2].cuda() # absolute depth of the root joint
batch_size = root.shape[0]
v_t = torch.zeros((batch_size, 3, 1)).cuda()
v_t[:, 2, :] = D.cuda() # translation vector
if is_reversed:
root, v_t = v_t, root # swap
theta = -theta
# R = torch.tensor([[torch.cos(theta), -torch.sin(theta), 0], [torch.sin(theta), torch.cos(theta), 0], [0, 0, 1]]) # rotation matrix over z by theta degrees
R = torch.zeros((batch_size, 3, 3)).cuda() # rotation matrix over y by theta degrees
R[:, 0, 0] = torch.cos(theta)
R[:, 0, 2] = torch.sin(theta)
R[:, 1, 1] = torch.ones(batch_size)
R[:, 2, 0] = -torch.sin(theta)
R[:, 2, 2] = torch.cos(theta)
# R = torch.tensor([[torch.cos(theta), 0, torch.sin(theta)], [0, 1, 0], [-torch.sin(theta), 0, torch.cos(theta)]]) # rotation matrix over y by theta degrees
# R = torch.tensor([[1, 0, 0], [0, torch.cos(theta), -torch.sin(theta)], [0, torch.sin(theta), torch.cos(theta)]]) # rotation matrix over x by theta degrees
J_R = torch.matmul(R, J - root) + v_t # rotation
return J_R
def reverse_rotation(J3d_R, theta, root):
  J = J3d_R  # NOTE: not a copy; clone upstream if the rotated joints must be preserved
  return rotation(J.cuda(), theta.cuda(), root.unsqueeze(-1).cuda(), True)
def temporal_loss(J, K, J_R, K_R): # J is J3d at time t and K is J3d at time t+k. J_R means the reversed rotation of J
  mse_fn = nn.MSELoss()
  diff = (J.reshape(J.shape[0], 3, 16) - K.reshape(J.shape[0], 3, 16)
          - J_R.reshape(J.shape[0], 3, 16) + K_R.reshape(J.shape[0], 3, 16))
  return mse_fn(diff, torch.zeros(J.shape[0], 3, 16).cuda())
'''
def temporal_loss(J, K, J_R, K_R): # J is J3d at time t and K is J3d at time t+k. J_R means the reversed rotation of J
return torch.norm(J - K - J_R + K_R, dim=1)**2
'''
'''
def random_rotation(J3d):
# J = torch.transpose(J3d, 1, 2)
J = J3d
root = torch.zeros(J.shape[0:2])
for i in range(J.shape[0]):
theta = torch.rand(1).cuda() * 2*torch.tensor(math.pi).cuda() # random theta
root[i] = J[i,:,8] # joint 8 = nose is root
temp = rotation(J[i,:,:], theta, root[i].unsqueeze(1), False)
# print(temp.shape)
J[i,:,:] = temp
return J, theta, root # need these values in the code
def rotation(J, theta, root, is_reversed): # rotation over y axis by theta
D = root[2] # absolute depth of the root joint
v_t = torch.tensor([[0], [0], [D]]).cuda() # translation vector
if is_reversed:
root, v_t = v_t, root # swap
theta = -theta
# R = torch.tensor([[torch.cos(theta), -torch.sin(theta), 0], [torch.sin(theta), torch.cos(theta), 0], [0, 0, 1]]) # rotation matrix over z by theta degrees
R = torch.tensor([[torch.cos(theta), 0, torch.sin(theta)], [0, 1, 0], [-torch.sin(theta), 0, torch.cos(theta)]]).cuda() # rotation matrix over y by theta degrees
# R = torch.tensor([[1, 0, 0], [0, torch.cos(theta), -torch.sin(theta)], [0, torch.sin(theta), torch.cos(theta)]]) # rotation matrix over x by theta degrees
J_R = torch.matmul(R, J.cuda() - root.cuda()) + v_t # rotation
return J_R
def reverse_rotation(J3d_R, theta, root):
# J = torch.transpose(J3d_R, 1, 2)
J = J3d_R
for i in range(J.shape[0]):
J[i,:,:] = rotation(J[i,:,:].cuda(), theta.cuda(), root[i].unsqueeze(1).cuda(), True)
return J
'''
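The forward/reverse rotation pair above hard-codes `.cuda()`, so it needs a GPU to run. A CPU NumPy sketch of the same y-axis rotation, illustrating that reversing with the same `theta` and `root` recovers the original joints (the function name and shapes are illustrative, not from the repo):

```python
import numpy as np

def rotate_y(J, theta, root, reverse=False):
    """Rotate joints J (3, N) about the y axis by theta around `root`,
    then translate so the root keeps its absolute depth (mirrors `rotation`)."""
    v_t = np.array([[0.0], [0.0], [root[2, 0]]])  # translation: root's depth only
    if reverse:
        root, v_t = v_t, root  # swap, exactly as in the original is_reversed branch
        theta = -theta
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])  # rotation matrix over y by theta
    return R @ (J - root) + v_t

J = np.random.rand(3, 16)   # 16 joints, rows are x/y/z
root = J[:, 8:9]            # joint 8 (nose) is the root
theta = 0.7
J_R = rotate_y(J, theta, root, reverse=False)
J_back = rotate_y(J_R, theta, root, reverse=True)  # recovers J
```

The round trip works because the reverse call applies `R(-theta)` and swaps the root/translation roles, so `R(-theta) @ (R(theta) @ (J - root) + v_t - v_t) + root == J`.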
#!/usr/bin/env python3
# 0713bf1d16fde855bda0ed021b030d08feadd022 | selfdrive/car/chrysler/radar_interface.py
# repo: 919bot/Tessa | license: MIT
import os
from opendbc.can.parser import CANParser
from cereal import car
from selfdrive.car.interfaces import RadarInterfaceBase
RADAR_MSGS_C = list(range(0x2c2, 0x2d4+2, 2)) # c_ messages 706,...,724
RADAR_MSGS_D = list(range(0x2a2, 0x2b4+2, 2)) # d_ messages
LAST_MSG = max(RADAR_MSGS_C + RADAR_MSGS_D)
NUMBER_MSGS = len(RADAR_MSGS_C) + len(RADAR_MSGS_D)
def _create_radar_can_parser():
dbc_f = 'chrysler_pacifica_2017_hybrid_private_fusion.dbc'
msg_n = len(RADAR_MSGS_C)
# list of [(signal name, message name or number, initial values), (...)]
# [('RADAR_STATE', 1024, 0),
# ('LONG_DIST', 1072, 255),
# ('LONG_DIST', 1073, 255),
# ('LONG_DIST', 1074, 255),
# ('LONG_DIST', 1075, 255),
# The factor and offset are applied by the dbc parsing library, so the
# default values should be after the factor/offset are applied.
signals = list(zip(['LONG_DIST'] * msg_n +
['LAT_DIST'] * msg_n +
['REL_SPEED'] * msg_n,
RADAR_MSGS_C * 2 + # LONG_DIST, LAT_DIST
RADAR_MSGS_D, # REL_SPEED
[0] * msg_n + # LONG_DIST
[-1000] * msg_n + # LAT_DIST
[-146.278] * msg_n)) # REL_SPEED set to 0, factor/offset to this
# TODO what are the checks actually used for?
# honda only checks the last message,
# toyota checks all the messages. Which do we want?
checks = list(zip(RADAR_MSGS_C +
RADAR_MSGS_D,
[20]*msg_n + # 20Hz (0.05s)
[20]*msg_n)) # 20Hz (0.05s)
return CANParser(os.path.splitext(dbc_f)[0], signals, checks, 1)
def _address_to_track(address):
if address in RADAR_MSGS_C:
return (address - RADAR_MSGS_C[0]) // 2
if address in RADAR_MSGS_D:
return (address - RADAR_MSGS_D[0]) // 2
raise ValueError("radar received unexpected address %d" % address)
class RadarInterface(RadarInterfaceBase):
def __init__(self, CP):
self.pts = {}
self.delay = 0 # Delay of radar #TUNE
self.rcp = _create_radar_can_parser()
self.updated_messages = set()
self.trigger_msg = LAST_MSG
def update(self, can_strings):
vls = self.rcp.update_strings(can_strings)
self.updated_messages.update(vls)
if self.trigger_msg not in self.updated_messages:
return None
ret = car.RadarData.new_message()
errors = []
if not self.rcp.can_valid:
errors.append("canError")
ret.errors = errors
for ii in self.updated_messages: # ii should be the message ID as a number
cpt = self.rcp.vl[ii]
trackId = _address_to_track(ii)
if trackId not in self.pts:
self.pts[trackId] = car.RadarData.RadarPoint.new_message()
self.pts[trackId].trackId = trackId
self.pts[trackId].aRel = float('nan')
self.pts[trackId].yvRel = float('nan')
self.pts[trackId].measured = True
if 'LONG_DIST' in cpt: # c_* message
self.pts[trackId].dRel = cpt['LONG_DIST'] # from front of car
# our lat_dist is positive to the right in car's frame.
# TODO what does yRel want?
self.pts[trackId].yRel = cpt['LAT_DIST'] # in car frame's y axis, left is positive
else: # d_* message
self.pts[trackId].vRel = cpt['REL_SPEED']
# We want a list, not a dictionary. Filter out LONG_DIST==0 because that means it's not valid.
ret.points = [x for x in self.pts.values() if x.dRel != 0]
self.updated_messages.clear()
return ret
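The track-index mapping used by `update` can be checked in isolation. A standalone sketch duplicating the two message-ID ranges so it runs without the CAN parser (the bare function name is illustrative):

```python
# Same ranges as RADAR_MSGS_C / RADAR_MSGS_D above: ten c_* IDs and ten d_* IDs.
RADAR_MSGS_C = list(range(0x2c2, 0x2d4 + 2, 2))
RADAR_MSGS_D = list(range(0x2a2, 0x2b4 + 2, 2))

def address_to_track(address):
    # Each family is spaced 2 apart, so (addr - base) // 2 gives a 0-based track index.
    if address in RADAR_MSGS_C:
        return (address - RADAR_MSGS_C[0]) // 2
    if address in RADAR_MSGS_D:
        return (address - RADAR_MSGS_D[0]) // 2
    raise ValueError("radar received unexpected address %d" % address)

# Both families map onto the same indices, so a c_* message and its matching
# d_* message update the same RadarPoint in self.pts:
tracks_c = [address_to_track(a) for a in RADAR_MSGS_C]
tracks_d = [address_to_track(a) for a in RADAR_MSGS_D]
```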
# 071afc12457e1373ac1b61126e3c5e710f213fb9 | app/util/auth2.py
# repo: FSU-ACM/Contest-Server | license: MIT
""" util.auth2: Authentication tools
This module is based off of util.auth, except with the action
paradigm removed.
"""
from flask import session
from app.models import Account
from app.util import course as course_util
# Session keys
SESSION_EMAIL = 'email'
def create_account(email: str, password: str, first_name: str,
                   last_name: str, fsuid: str, course_list: list = None):
    """
    Creates an account for a single user.
    :email: Required, the email address of the user.
    :password: Required, user's chosen password.
    :first_name: Required, user's first name.
    :last_name: Required, user's last name.
    :fsuid: Optional, user's FSUID.
    :course_list: Optional, courses being taken by user.
    :return: Account object.
    """
    account = Account(
        email=email,
        first_name=first_name,
        last_name=last_name,
        fsuid=fsuid,
        is_admin=False
    )
    # Set user's extra credit courses. The default is None rather than a
    # mutable [] literal, which would be shared across calls.
    course_util.set_courses(account, course_list or [])
    account.set_password(password)
    account.save()
    return account
def get_account(email: str=None):
"""
Retrieves account via email (defaults to using session), otherwise
redirects to login page.
:email: Optional email string, if not provided will use session['email']
:return: Account if email is present in session, None otherwise.
"""
    try:
        email = email or session['email']
        return Account.objects.get_or_404(email=email)
    except Exception:  # no email in session, or no matching account
        return None
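A framework-free sketch of the `email or session['email']` fallback used by `get_account`, with plain dicts standing in for Flask's session and the Mongo-backed `Account` collection (all names here are illustrative):

```python
# Stand-ins: a session dict and an "accounts table" keyed by email.
fake_session = {'email': 'user@example.com'}
fake_accounts = {'user@example.com': {'first_name': 'Ada'}}

def get_account(email=None, session=fake_session, accounts=fake_accounts):
    try:
        email = email or session['email']  # an explicit argument wins over the session
        return accounts[email]
    except Exception:  # missing session key or unknown account
        return None
```

With no argument it resolves via the session; an unknown email falls through to `None` instead of raising.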
# 071dbe42fd5b14449158462daf2a890df418a73d | heat/api/openstack/v1/views/stacks_view.py
# repo: noironetworks/heat | license: Apache-2.0
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import itertools
from heat.api.openstack.v1 import util
from heat.api.openstack.v1.views import views_common
from heat.rpc import api as rpc_api
_collection_name = 'stacks'
basic_keys = (
rpc_api.STACK_ID,
rpc_api.STACK_NAME,
rpc_api.STACK_DESCRIPTION,
rpc_api.STACK_STATUS,
rpc_api.STACK_STATUS_DATA,
rpc_api.STACK_CREATION_TIME,
rpc_api.STACK_DELETION_TIME,
rpc_api.STACK_UPDATED_TIME,
rpc_api.STACK_OWNER,
rpc_api.STACK_PARENT,
rpc_api.STACK_USER_PROJECT_ID,
rpc_api.STACK_TAGS,
)
def format_stack(req, stack, keys=None, include_project=False):
def transform(key, value):
if keys and key not in keys:
return
if key == rpc_api.STACK_ID:
yield ('id', value['stack_id'])
yield ('links', [util.make_link(req, value)])
if include_project:
yield ('project', value['tenant'])
elif key == rpc_api.STACK_ACTION:
return
elif (key == rpc_api.STACK_STATUS and
rpc_api.STACK_ACTION in stack):
# To avoid breaking API compatibility, we join RES_ACTION
# and RES_STATUS, so the API format doesn't expose the
# internal split of state into action/status
yield (key, '_'.join((stack[rpc_api.STACK_ACTION], value)))
else:
# TODO(zaneb): ensure parameters can be formatted for XML
# elif key == rpc_api.STACK_PARAMETERS:
# return key, json.dumps(value)
yield (key, value)
return dict(itertools.chain.from_iterable(
transform(k, v) for k, v in stack.items()))
def collection(req, stacks, count=None, include_project=False):
keys = basic_keys
formatted_stacks = [format_stack(req, s, keys, include_project)
for s in stacks]
result = {'stacks': formatted_stacks}
links = views_common.get_collection_links(req, formatted_stacks)
if links:
result['links'] = links
if count is not None:
result['count'] = count
return result
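`format_stack` builds its output by letting a per-key generator yield zero, one, or several pairs and flattening them all with `itertools.chain.from_iterable`. A minimal standalone sketch of that pattern (the keys and the link format are illustrative):

```python
import itertools

def transform(key, value):
    if key == 'id':
        yield ('id', value)
        yield ('links', ['/stacks/%s' % value])  # hypothetical link format
    elif key == 'action':
        return  # yields nothing: the key is dropped from the output
    else:
        yield (key, value)

raw = {'id': 'abc', 'action': 'CREATE', 'name': 'demo'}
formatted = dict(itertools.chain.from_iterable(
    transform(k, v) for k, v in raw.items()))
```

One input key can thus expand to several output keys (`id` to `id` + `links`) or vanish entirely (`action`), which is exactly why a generator is used instead of a plain dict comprehension.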
# 0720bde47f5a6d668b162186b490b208d369a3a2 | desktop/core/ext-py/pyasn1-0.1.8/pyasn1/compat/iterfunc.py
# repo: kokosing/hue | license: Apache-2.0
from sys import version_info
if version_info[0] <= 2 and version_info[1] <= 4:
def all(iterable):
for element in iterable:
if not element:
return False
return True
else:
all = all
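The backport above exists because `all` only became a builtin in Python 2.5. Its semantics match the builtin, including the vacuous-truth result on an empty iterable:

```python
# all() is True for an empty iterable, True when every element is truthy,
# and False as soon as any element is falsy.
checks = [all([]), all([1, 'x', True]), all([1, 0, 2])]
```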
# 07229c65c61816346ca75d9d08af09c5eb62b6ff | src/mf_horizon_client/client/pipelines/blueprints.py
# repo: MF-HORIZON/mf-horizon-python-client | license: MIT
from enum import Enum
class BlueprintType(Enum):
"""
A blueprint is a pipeline template in horizon, and must be specified when creating a new pipeline
Nonlinear
===============================================================================================================
A nonlinear pipeline combines nonlinear feature generation and selection with a nonlinear regressor to generate
forecasts that are at a specific target in the future.
A number of different regressor types are available here:
1. Mondrian Forest. An adaptation of the probabilistic Mondrian Forest algorithm - https://arxiv.org/abs/1406.2673
Provides Bayesian-esque error bounds, and is our recommended nonlinear regressor of choice.
2. XG Boost
3. Random Forest.
The stages of a nonlinear pipeline are as follows:
A. Forecast Specification
B. Stationarization
C. Feature Generation
D. Feature Filtering
E. Feature Refinement
F. Nonlinear Backtesting
G. Nonlinear Prediction
Linear
===============================================================================================================
    A linear pipeline combines nonlinear feature generation with a linear regressor to generate
    forecasts that are at a specific target in the future.
The regressor used is a Variational Bayesian Linear Regressor
The stages of a linear pipeline are as follows:
A. Forecast Specification
B. Stationarization
C. Nonlinear Feature Generation
D. Feature Filtering
E. Feature Refinement
F. Linear Backtesting
G. Linear Prediction
Fast Forecasting
===============================================================================================================
The fast forecasting pipeline is intended to be used as a quick assessment of a dataset's predictive performance
It is identical to the linear pipeline, but does not include Feature Refinement.
The stages of a linear pipeline are as follows:
A. Forecast Specification
B. Stationarization
C. Nonlinear Feature Generation
D. Feature Filtering
E. Linear Backtesting
F. Linear Prediction
Feature Selection
===============================================================================================================
The feature selection pipeline assumes that the input data set already encodes information about a signal's
past, such that a horizontal observation vector may be used in a traditional regression sense to map to a target
value at a point in the future.
Feat1 | Feat2 | Feat3 | .... | FeatP
Obs1 ------------------------------------- t
Obs2 ------------------------------------- t-1
Obs3 ------------------------------------- t-2
... .....................................
... .....................................
ObsN ------------------------------------- t-N
Two stages of feature selection are then used in order to maximize predictive performance of the feature set
on specified future points for a given target
The stages of a linear pipeline are as follows:
A. Forecast Specification
B. Feature Filtering
E. Feature Refinement
Feature Discovery
===============================================================================================================
The feature discovery pipeline discovers features to maximize performance for a particular forecast target,
at a specified point in the future. Unlike the feature selection pipeline, it does not assume that the signal
set has already encoded historical information about the original data's past.
The stages of a feature discovery pipeline are as follows:
A. Forecast Specification
B. Feature Generation
C. Feature Filtering
D. Feature Refinement
Signal Encoding
===============================================================================================================
One of Horizon's feature generation methods is to encode signals in the frequency domain, extracting historic
lags that will efficiently represent the information contained within them.
The signal encoding pipeline allows for this functionality to be isolated, where the output is a feature
set that has encoded past information about a signal that can be exported from the platform
The stages of a signal encoding pipeline are as follows:
A. Forecast Specification
B. Feature Generation
C. Feature Filtering
Stationarization
===============================================================================================================
    Stationarize a signal set and specified target using Augmented Dickey-Fuller analysis, and a detrending method
for the specified target.
The stages of a stationarization pipeline are as follows:
A. Forecast Specification
B. Stationarization
Time-Series Regression
===============================================================================================================
Run Horizon's regression algorithms on a pre-encoded signal set.
Small Data Forecasting
===============================================================================================================
Time-series pipeline for small data. Does not contain any backtesting, and uses all the data for model training.
A. Forecast Specification
B. Stationarization
C. Linear Feature Generation
D. Feature Filtering
E. Feature Refinement
    F. Linear Prediction
Variational Forecasting
===============================================================================================================
Creates a stacked lag-embedding matrix by combining a two-stage feature generation and selection process, with
lag-only feature generation.
A. Forecast Specification
B. Stationarization
C. Linear Feature Generation
D. Feature Filtering
E. Linear Feature Generation
F. Feature Filtering
G. Linear Backtesting
H. Linear Prediction
Custom
===============================================================================================================
Advanced: Contains only a forecast specification stage for adding stages manually.
N.B. There is no validation on stage addition.
"""
nonlinear = "nonlinear"
linear = "linear"
fast_forecasting = "fast_forecast"
feature_selection = "feature_selection"
feature_discovery = "feature_discovery"
signal_encoding = "signal_encoding"
stationarisation = "stationarisation"
time_series_regression = "regression"
variational_forecasting = "variational_forecasting"
custom = "custom"
small_data = "small_data"
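Because each member's value is the string the pipeline-creation API expects, callers can round-trip between strings and enum members. A small stand-in mirroring a few members, not the full class:

```python
from enum import Enum

class PipelineKind(Enum):  # illustrative subset of BlueprintType
    nonlinear = "nonlinear"
    linear = "linear"
    fast_forecasting = "fast_forecast"

kind = PipelineKind("fast_forecast")  # lookup by value (e.g. from an API payload)
```

Lookup by value (`PipelineKind("fast_forecast")`) and by name (`PipelineKind["linear"]`) are distinct; note `fast_forecasting` is the member name while `"fast_forecast"` is its wire value.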
# 0723f800260b47fe29201f275a3497c9e0250212 | pyChess/olaf/views.py
# repo: An-Alone-Cow/pyChess | license: MIT
from django.contrib.auth.decorators import login_required
from django.contrib.auth.models import User
from django.shortcuts import render
from django.urls import reverse
from django.http import HttpResponseRedirect, HttpResponse
from django.utils import timezone
from olaf.models import *
from olaf.forms import *
from olaf.utility import usertools
from olaf.chess.controller import proccess_move
def index ( request ):
args = {}
message = request.session.pop ( 'message', default = None )
if ( message is not None ):
args [ 'message' ] = message
if ( request.user.is_authenticated ):
if ( request.method == 'POST' ):
if ( request.POST.get ( 'game_id' ) is not None ):
game_id = request.POST.get ( 'game_id' )
if ( game_id == '-1' ):
game_id = usertools.new_game ( request )
request.session [ 'game_id' ] = game_id
else:
request.session.pop ( 'game_id', default = None )
f = lambda a : str ( a.date () ) + " - " + str ( a.hour ) + ":" + str ( a.minute ) + ":" + str ( a.second )
args [ 'game_list' ] = list ([str ( game.id ), f ( game.creation_time )] for game in request.user.userdata.game_history.filter ( result = 0 ).order_by ( '-creation_time' ) )
if ( request.session.get ( 'game_id' ) is not None ):
args [ 'game_board' ] = usertools.get_translated_game_board ( request )
else:
args [ 'game_board' ] = None
return render ( request, 'olaf/index_logged_in.html', args )
else:
args [ 'login_form' ] = LoginForm ()
args [ 'register_form' ] = RegisterForm ()
args [ 'score' ] = list ( [user.master.username, user.wins, user.loses, user.ties] for user in UserData.objects.filter ( is_active = True ) )
return render ( request, 'olaf/index_not_logged_in.html', args )
form_operation_dict = {
'login' : (
usertools.login_user,
LoginForm,
'olaf/login.html',
{},
'index',
{ 'message' : "You're logged in. :)"}
),
'register' : (
usertools.register_user,
RegisterForm,
'olaf/register.html',
{},
'index',
{ 'message' : "An activation email has been sent to you" }
),
'password_reset_request' : (
usertools.init_pass_reset_token,
ForgotPasswordUsernameOrEmailForm,
'olaf/password_reset_request.html',
{},
'index',
{ 'message' : "An email containing the password reset link will be sent to your email"}
),
'reset_password' : (
usertools.reset_password_action,
PasswordChangeForm,
'olaf/reset_password.html',
{},
'olaf:login',
{ 'message' : "Password successfully changed, you can login now" }
),
'resend_activation_email' : (
usertools.resend_activation_email,
ResendActivationUsernameOrEmailForm,
'olaf/resend_activation_email.html',
{},
'index',
{ 'message' : "Activation email successfully sent to your email" }
),
}
def form_operation ( request, oper, *args ):
func, FORM, fail_template, fail_args, success_url, success_args = form_operation_dict [ oper ]
if ( request.method == 'POST' ):
form = FORM ( request.POST )
if ( form.is_valid () ):
func ( request, form, *args )
for key in success_args:
request.session [ key ] = success_args [ key ]
return HttpResponseRedirect ( reverse ( success_url ) )
else:
form = FORM ()
message = request.session.pop ( 'message', default = None )
if ( message is not None ):
fail_args [ 'message' ] = message
fail_args [ 'form' ] = form
return render ( request, fail_template, fail_args )
# view functions
def login_user ( request ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
return form_operation ( request, 'login' )
def register_user ( request ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
return form_operation ( request, 'register' )
def password_reset_request ( request ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
return form_operation ( request, 'password_reset_request' )
def reset_password_action ( request, token ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
tk = ExpirableTokenField.objects.filter ( token = token ).first ()
if ( tk is None ):
request.session [ 'message' ] = "Broken link"
return HttpResponseRedirect ( reverse ( 'index' ) )
else:
if ( timezone.now () <= tk.expiration_time ):
return form_operation ( request, 'reset_password', token )
else:
request.session [ 'message' ] = "Link expired, try getting a new one"
return HttpResponseRedirect ( reverse ( 'olaf:reset_password' ) )
def activate_account ( request, token ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
tk = ExpirableTokenField.objects.filter ( token = token ).first ()
if ( tk is None ):
request.session [ 'message' ] = "Broken link"
return HttpResponseRedirect ( reverse ( 'index' ) )
else:
if ( timezone.now () <= tk.expiration_time ):
if ( tk.user.is_active ):
request.session [ 'message' ] = "Account already active"
return HttpResponseRedirect ( reverse ( 'index' ) )
else:
userdata = tk.user
userdata.is_active = True
userdata.save ()
request.session [ 'message' ] = "Your account has been activated successfully"
return HttpResponseRedirect ( reverse ( 'olaf:login' ) )
else:
request.session [ 'message' ] = "Link expired, try getting a new one"
return HttpResponseRedirect ( reverse ( 'olaf:resend_activation_email' ) )
def resend_activation_email ( request ):
if ( request.user.is_authenticated ):
return HttpResponseRedirect ( reverse ( 'index' ) )
return form_operation ( request, 'resend_activation_email' )
def logout_user ( request ):
usertools.logout_user ( request )
request.session [ 'message' ] = "Goodbye :)"
return HttpResponseRedirect ( reverse ( 'index' ) )
def scoreboard ( request ):
if ( request.method == 'POST' ):
username = request.POST.get ( 'username' )
user = User.objects.filter ( username = username ).first ()
if ( user is None ):
request.session [ 'message' ] = "User not found"
return HttpResponseRedirect ( reverse ( 'olaf:scoreboard' ) )
else:
return HttpResponseRedirect ( reverse ( 'olaf:user_profile', args = (username, ) ) )
else:
args = {}
message = request.session.pop ( 'message', default = None )
if ( message is not None ):
args [ 'message' ] = message
lst = [ (user.master.username, user.wins, user.loses, user.ties) for user in UserData.objects.filter ( is_active = True ) ]
args [ 'lst' ] = lst
if ( request.user.is_authenticated ):
args [ 'logged_in' ] = True
return render ( request, 'olaf/scoreboard.html', args )
def move ( request ):
proccess_move ( request )
return HttpResponseRedirect ( reverse ( 'index' ) ) | 31.877358 | 175 | 0.683042 | 794 | 6,758 | 5.680101 | 0.185139 | 0.098004 | 0.12439 | 0.092683 | 0.383814 | 0.33082 | 0.322838 | 0.322838 | 0.322838 | 0.322838 | 0 | 0.000365 | 0.188369 | 6,758 | 212 | 176 | 31.877358 | 0.821878 | 0.002072 | 0 | 0.394118 | 0 | 0 | 0.178529 | 0.038701 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064706 | false | 0.088235 | 0.058824 | 0 | 0.276471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
# 072e9a202d69d5d6154bfb44a978d712661a1d52 | examples/morpho.py
# repo: jaideep-seth/PyOpenWorm | license: MIT
"""
How to load morphologies of certain cells from the database.
"""
# This is an expected failure right now, as morphology is not implemented.
from __future__ import absolute_import
from __future__ import print_function

import PyOpenWorm as P
from PyOpenWorm.context import Context
from PyOpenWorm.worm import Worm
from six import StringIO

# Connect to the database.
with P.connect('default.conf') as conn:
    ctx = Context(ident="http://openworm.org/data", conf=conn.conf).stored

    # Create a new Cell object to work with.
    aval = ctx(Worm)().get_neuron_network().aneuron('AVAL')

    # Get the morphology associated with the Cell.
    # Returns a neuroml.Morphology object.
    morph = aval._morphology()

    # We're printing it here, but we would normally do something else
    # with the morphology object.
    out = StringIO()
    morph.export(out, 0)
    # getvalue() (rather than read()) returns the full buffer contents
    # regardless of the current stream position.
    print(out.getvalue())
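A note on the `StringIO` handling above: exporting into the buffer leaves the stream position at the end, so a bare `out.read()` would return an empty string. A minimal stdlib-only sketch of the pitfall and the two fixes (`getvalue()` or `seek(0)`); the write stands in for `morph.export(out, 0)`:

```python
from io import StringIO

buf = StringIO()
buf.write("morphology xml ...")  # stands in for morph.export(out, 0)

# The position is now at the end of the buffer, so read() sees nothing.
assert buf.read() == ""

# getvalue() returns the whole buffer regardless of the position ...
assert buf.getvalue() == "morphology xml ..."

# ... or rewind first and read normally.
buf.seek(0)
assert buf.read() == "morphology xml ..."
```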
# Python/libraries/recognizers-date-time/recognizers_date_time/date_time/italian/dateperiod_extractor_config.py
# (felaray/Recognizers-Text, MIT license)
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
from typing import List, Pattern
from recognizers_text.utilities import RegExpUtility
from recognizers_number.number import BaseNumberParser
from recognizers_number.number.italian.extractors import ItalianIntegerExtractor, ItalianCardinalExtractor
from recognizers_number.number.italian.parsers import ItalianNumberParserConfiguration
from ...resources.base_date_time import BaseDateTime
from ...resources.italian_date_time import ItalianDateTime
from ..extractors import DateTimeExtractor
from ..base_duration import BaseDurationExtractor
from ..base_date import BaseDateExtractor
from ..base_dateperiod import DatePeriodExtractorConfiguration, MatchedIndex
from .duration_extractor_config import ItalianDurationExtractorConfiguration
from .date_extractor_config import ItalianDateExtractorConfiguration
from recognizers_text.extractor import Extractor
from recognizers_number import ItalianOrdinalExtractor, BaseNumberExtractor
class ItalianDatePeriodExtractorConfiguration(DatePeriodExtractorConfiguration):

    @property
    def previous_prefix_regex(self) -> Pattern:
        return self._previous_prefix_regex

    @property
    def check_both_before_after(self) -> bool:
        return self._check_both_before_after

    @property
    def simple_cases_regexes(self) -> List[Pattern]:
        return self._simple_cases_regexes

    @property
    def illegal_year_regex(self) -> Pattern:
        return self._illegal_year_regex

    @property
    def year_regex(self) -> Pattern:
        return self._year_regex

    @property
    def till_regex(self) -> Pattern:
        return self._till_regex

    @property
    def followed_unit(self) -> Pattern:
        return self._followed_unit

    @property
    def number_combined_with_unit(self) -> Pattern:
        return self._number_combined_with_unit

    @property
    def past_regex(self) -> Pattern:
        return self._past_regex

    @property
    def decade_with_century_regex(self) -> Pattern:
        return self._decade_with_century_regex

    @property
    def future_regex(self) -> Pattern:
        return self._future_regex

    @property
    def week_of_regex(self) -> Pattern:
        return self._week_of_regex

    @property
    def month_of_regex(self) -> Pattern:
        return self._month_of_regex

    @property
    def date_unit_regex(self) -> Pattern:
        return self._date_unit_regex

    @property
    def in_connector_regex(self) -> Pattern:
        return self._in_connector_regex

    @property
    def range_unit_regex(self) -> Pattern:
        return self._range_unit_regex

    @property
    def date_point_extractor(self) -> DateTimeExtractor:
        return self._date_point_extractor

    @property
    def integer_extractor(self) -> BaseNumberExtractor:
        return self._integer_extractor

    @property
    def number_parser(self) -> BaseNumberParser:
        return self._number_parser

    @property
    def duration_extractor(self) -> DateTimeExtractor:
        return self._duration_extractor

    @property
    def now_regex(self) -> Pattern:
        return self._now_regex

    @property
    def future_suffix_regex(self) -> Pattern:
        return self._future_suffix_regex

    @property
    def ago_regex(self) -> Pattern:
        return self._ago_regex

    @property
    def later_regex(self) -> Pattern:
        return self._later_regex

    @property
    def less_than_regex(self) -> Pattern:
        return self._less_than_regex

    @property
    def more_than_regex(self) -> Pattern:
        return self._more_than_regex

    @property
    def duration_date_restrictions(self) -> List[str]:
        return self._duration_date_restrictions

    @property
    def year_period_regex(self) -> Pattern:
        return self._year_period_regex

    @property
    def month_num_regex(self) -> Pattern:
        return self._month_num_regex

    @property
    def century_suffix_regex(self) -> Pattern:
        return self._century_suffix_regex

    @property
    def ordinal_extractor(self) -> BaseNumberExtractor:
        return self._ordinal_extractor

    @property
    def cardinal_extractor(self) -> Extractor:
        return self._cardinal_extractor

    @property
    def time_unit_regex(self) -> Pattern:
        return self._time_unit_regex

    @property
    def within_next_prefix_regex(self) -> Pattern:
        return self._within_next_prefix_regex

    @property
    def range_connector_regex(self) -> Pattern:
        return self._range_connector_regex

    @property
    def day_regex(self) -> Pattern:
        return self._day_regex

    @property
    def week_day_regex(self) -> Pattern:
        return self._week_day_regex

    @property
    def relative_month_regex(self) -> Pattern:
        return self._relative_month_regex

    @property
    def month_suffix_regex(self) -> Pattern:
        return self._month_suffix_regex

    @property
    def past_prefix_regex(self) -> Pattern:
        return self._past_prefix_regex

    @property
    def next_prefix_regex(self) -> Pattern:
        return self._next_prefix_regex

    @property
    def this_prefix_regex(self) -> Pattern:
        return self._this_prefix_regex

    @property
    def which_week_regex(self) -> Pattern:
        return self._which_week_regex

    @property
    def rest_of_date_regex(self) -> Pattern:
        return self._rest_of_date_regex

    @property
    def complex_date_period_regex(self) -> Pattern:
        return self._complex_date_period_regex

    @property
    def week_day_of_month_regex(self) -> Pattern:
        return self._week_day_of_month_regex

    @property
    def all_half_year_regex(self) -> Pattern:
        return self._all_half_year_regex
    def __init__(self):
        self._all_half_year_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.AllHalfYearRegex)
        self._week_day_of_month_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.WeekDayOfMonthRegex)
        self._complex_date_period_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.ComplexDatePeriodRegex)
        self._rest_of_date_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.RestOfDateRegex)
        self._which_week_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.WhichWeekRegex)
        self._this_prefix_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.ThisPrefixRegex)
        self._next_prefix_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.NextSuffixRegex)
        self._past_prefix_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.PastSuffixRegex)
        self._month_suffix_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.MonthSuffixRegex)
        self._relative_month_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.RelativeMonthRegex)
        self._week_day_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.WeekDayRegex)
        self._day_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.DayRegex)
        self._range_connector_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.RangeConnectorRegex)
        self._time_unit_regex = RegExpUtility.get_safe_reg_exp(ItalianDateTime.TimeUnitRegex)
        self._check_both_before_after = ItalianDateTime.CheckBothBeforeAfter
        self._simple_cases_regexes = [
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.SimpleCasesRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.BetweenRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.OneWordPeriodRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.MonthWithYear),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.MonthNumWithYear),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.YearRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.YearPeriodRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.WeekOfYearRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.WeekDayOfMonthRegex),
            RegExpUtility.get_safe_reg_exp(
                ItalianDateTime.MonthFrontBetweenRegex),
            RegExpUtility.get_safe_reg_exp(
                ItalianDateTime.MonthFrontSimpleCasesRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.QuarterRegex),
            RegExpUtility.get_safe_reg_exp(
                ItalianDateTime.QuarterRegexYearFront),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.SeasonRegex),
            RegExpUtility.get_safe_reg_exp(
                ItalianDateTime.LaterEarlyPeriodRegex),
            RegExpUtility.get_safe_reg_exp(
                ItalianDateTime.WeekWithWeekDayRangeRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.YearPlusNumberRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.DecadeWithCenturyRegex),
            RegExpUtility.get_safe_reg_exp(ItalianDateTime.RelativeDecadeRegex)
        ]
        self._illegal_year_regex = RegExpUtility.get_safe_reg_exp(
            BaseDateTime.IllegalYearRegex)
        self._year_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.YearRegex)
        self._till_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.TillRegex)
        self._followed_unit = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.FollowedDateUnit)
        self._number_combined_with_unit = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.NumberCombinedWithDateUnit)
        self._past_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.PastSuffixRegex)
        self._future_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.NextSuffixRegex)
        self._week_of_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.WeekOfRegex)
        self._month_of_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.MonthOfRegex)
        self._date_unit_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.DateUnitRegex)
        self._within_next_prefix_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.WithinNextPrefixRegex)
        self._in_connector_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.InConnectorRegex)
        self._range_unit_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.RangeUnitRegex)
        # Backing field expected by the decade_with_century_regex property.
        self._decade_with_century_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.DecadeWithCenturyRegex)
        self.from_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.FromRegex)
        self.connector_and_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.ConnectorAndRegex)
        self.before_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.BeforeRegex2)
        self._date_point_extractor = BaseDateExtractor(
            ItalianDateExtractorConfiguration())
        self._integer_extractor = ItalianIntegerExtractor()
        self._number_parser = BaseNumberParser(
            ItalianNumberParserConfiguration())
        self._duration_extractor = BaseDurationExtractor(
            ItalianDurationExtractorConfiguration())
        self._now_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.NowRegex)
        self._future_suffix_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.FutureSuffixRegex)
        self._ago_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.AgoRegex)
        self._later_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.LaterRegex)
        self._less_than_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.LessThanRegex)
        self._more_than_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.MoreThanRegex)
        self._duration_date_restrictions = ItalianDateTime.DurationDateRestrictions
        self._year_period_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.YearPeriodRegex)
        self._month_num_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.MonthNumRegex)
        self._century_suffix_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.CenturySuffixRegex)
        self._ordinal_extractor = ItalianOrdinalExtractor()
        self._cardinal_extractor = ItalianCardinalExtractor()
        self._previous_prefix_regex = RegExpUtility.get_safe_reg_exp(
            ItalianDateTime.PreviousPrefixRegex)
    def get_from_token_index(self, source: str) -> MatchedIndex:
        match = self.from_regex.search(source)
        if match:
            return MatchedIndex(True, match.start())
        return MatchedIndex(False, -1)

    def get_between_token_index(self, source: str) -> MatchedIndex:
        match = self.before_regex.search(source)
        if match:
            return MatchedIndex(True, match.start())
        return MatchedIndex(False, -1)

    def has_connector_token(self, source: str) -> bool:
        return self.connector_and_regex.search(source) is not None
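The class above follows the recognizers-text convention of pairing a language-neutral extractor with a per-language configuration object that compiles its patterns once and exposes them through read-only properties. A minimal self-contained sketch of that pattern (all names here are illustrative, not the library's real API):

```python
import re


class DemoPeriodConfiguration:
    """Per-language configuration: compiles patterns once, exposes them read-only."""

    def __init__(self):
        self._year_regex = re.compile(r"\b(19|20)\d{2}\b")
        self._till_regex = re.compile(r"\b(till|until|to)\b")

    @property
    def year_regex(self):
        return self._year_regex

    @property
    def till_regex(self):
        return self._till_regex


class DemoPeriodExtractor:
    """Language-neutral extraction logic, driven entirely by the configuration."""

    def __init__(self, config):
        self.config = config

    def extract_years(self, source):
        return [m.group() for m in self.config.year_regex.finditer(source)]


extractor = DemoPeriodExtractor(DemoPeriodConfiguration())
print(extractor.extract_years("from 1999 till 2004"))  # ['1999', '2004']
```

Swapping in a different configuration object changes the language without touching the extractor, which is exactly what the `DatePeriodExtractorConfiguration` base class enables here.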
# utilidades/texto.py (DeadZombie14/chillMagicCarPygame, MIT license)
import pygame
class Texto:
    def __init__(self, screen, text, x, y, text_size=20, fuente='Calibri',
                 italic=False, bold=False, subrayado=False,
                 color=(250, 240, 230), bg=[]):
        self.screen = screen
        fg = color
        self.coord = x, y
        # Font
        a_sys_font = pygame.font.SysFont(fuente, text_size)
        # Italic (the original called set_bold here; set_italic is the
        # matching pygame call)
        if italic:
            a_sys_font.set_italic(1)
        # Bold
        if bold:
            a_sys_font.set_bold(1)
        # Underline
        if subrayado:
            a_sys_font.set_underline(1)
        # Render the text, with a background color only if one was given
        if len(bg) > 1:
            ren = a_sys_font.render(text, 1, fg, bg)
        else:  # otherwise transparent
            ren = a_sys_font.render(text, 1, fg)
        self.text_rect = ren.get_rect()
        self.text_rect.center = (x, y)
        self.image = ren, (x, y)
        screen.blit(ren, (x, y))
        # Reset the styles so the cached SysFont is left unchanged
        if italic:
            a_sys_font.set_italic(0)
        if bold:
            a_sys_font.set_bold(0)
        if subrayado:
            a_sys_font.set_underline(0)

    def getProperties(self):
        return self.text_rect

    def redraw(self):
        self.screen.blit(self.image[0], self.image[1])


##################### USAGE EXAMPLE ##############################
# texto1 = Texto(screen, 'Hola', 10, 10)


class TextArea:
    def __init__(self, screen, text, x, y, fuente='Calibri', text_size=20,
                 color=pygame.Color('black')):
        self.coord = x, y
        font = pygame.font.SysFont(fuente, text_size)
        # 2D array where each row is a list of words.
        words = [word.split(' ') for word in text.splitlines()]
        space = font.size(' ')[0]  # The width of a space.
        max_width, max_height = screen.get_size()
        pos = x, y
        for line in words:
            for word in line:
                word_surface = font.render(word, 0, color)
                word_width, word_height = word_surface.get_size()
                if x + word_width >= max_width:
                    x = pos[0]        # Reset the x.
                    y += word_height  # Start on a new row.
                screen.blit(word_surface, (x, y))
                x += word_width + space
            x = pos[0]        # Reset the x.
            y += word_height  # Start on a new row.
        self.size = word_width, word_height

    def getProperties(self):
        return self.size, self.coord


##################### USAGE EXAMPLE ##############################
# textarea1 = TextArea(screen, 'Hola mundo que tal estas hoy')
# -*- coding: utf-8 -*-
# sktime/annotation/tests/test_all_annotators.py (Rubiel1/sktime, BSD-3-Clause)
"""Tests for sktime annotators."""
import pandas as pd
import pytest
from sktime.registry import all_estimators
from sktime.utils._testing.estimator_checks import _make_args
ALL_ANNOTATORS = all_estimators(estimator_types="series-annotator", return_names=False)
@pytest.mark.parametrize("Estimator", ALL_ANNOTATORS)
def test_output_type(Estimator):
"""Test annotator output type."""
estimator = Estimator.create_test_instance()
args = _make_args(estimator, "fit")
estimator.fit(*args)
args = _make_args(estimator, "predict")
y_pred = estimator.predict(*args)
assert isinstance(y_pred, pd.Series)
# magma/operators.py (Kuree/magma, MIT license)
from magma import _BitType, BitType, BitsType, UIntType, SIntType
class MantleImportError(RuntimeError):
    pass


class UndefinedOperatorError(RuntimeError):
    pass


def raise_mantle_import_error_unary(self):
    raise MantleImportError(
        "Operators are not defined until mantle has been imported")


def raise_mantle_import_error_binary(self, other):
    raise MantleImportError(
        "Operators are not defined until mantle has been imported")


def define_raise_undefined_operator_error(type_str, operator, type_):
    if type_ == "unary":
        def wrapped(self):
            raise UndefinedOperatorError(
                f"{operator} is undefined for {type_str}")
    else:
        assert type_ == "binary"

        def wrapped(self, other):
            raise UndefinedOperatorError(
                f"{operator} is undefined for {type_str}")
    return wrapped


for op in ("__eq__", "__ne__"):
    setattr(_BitType, op, raise_mantle_import_error_binary)

for op in (
    "__and__",
    "__or__",
    "__xor__",
    "__invert__",
    "__add__",
    "__sub__",
    "__mul__",
    "__div__",
    "__lt__",
    # __le__ skipped because it's used for assignment on inputs
    # "__le__",
    "__gt__",
    "__ge__"
):
    if op == "__invert__":
        setattr(_BitType, op,
                define_raise_undefined_operator_error("_BitType", op, "unary"))
    else:
        setattr(
            _BitType, op,
            define_raise_undefined_operator_error("_BitType", op, "binary"))

for op in ("__and__",
           "__or__",
           "__xor__",
           "__invert__"
           ):
    if op == "__invert__":
        setattr(BitType, op, raise_mantle_import_error_unary)
    else:
        setattr(BitType, op, raise_mantle_import_error_binary)

for op in ("__and__",
           "__or__",
           "__xor__",
           "__invert__",
           "__lshift__",
           "__rshift__",
           ):
    if op == "__invert__":
        setattr(BitsType, op, raise_mantle_import_error_unary)
    else:
        setattr(BitsType, op, raise_mantle_import_error_binary)

for op in ("__add__",
           "__sub__",
           "__mul__",
           "__div__",
           "__lt__",
           # __le__ skipped because it's used for assignment on inputs
           # "__le__",
           "__gt__",
           "__ge__"
           ):
    setattr(BitsType, op,
            define_raise_undefined_operator_error("BitsType", op, "binary"))

for op in ("__add__",
           "__sub__",
           "__mul__",
           "__div__",
           "__lt__",
           # __le__ skipped because it's used for assignment on inputs
           # "__le__",
           "__gt__",
           "__ge__"
           ):
    setattr(SIntType, op, raise_mantle_import_error_binary)
    setattr(UIntType, op, raise_mantle_import_error_binary)
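The `setattr` loops above install stub methods on the type classes so that using an operator before `mantle` is imported fails loudly at the call site instead of producing a confusing error later. A self-contained sketch of the same deferred-operator pattern (the class and error names here are illustrative, not magma's):

```python
class BackendImportError(RuntimeError):
    pass


def _raise_backend_import_error(self, other):
    raise BackendImportError(
        "Operators are not defined until the backend is imported")


class Signal:
    """Placeholder hardware type with no operator semantics of its own."""
    pass


# Install the same stub for every binary operator in one loop,
# just as the module above does for _BitType and friends.
for op in ("__and__", "__or__", "__xor__"):
    setattr(Signal, op, _raise_backend_import_error)

try:
    Signal() & Signal()
except BackendImportError as err:
    print("caught:", err)
```

Once the real backend is imported, it replaces these stubs with working implementations via the same `setattr` mechanism, so user code never has to change.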
# toolbox/core/management/commands/celery_beat_resource_scraper.py (akshedu/toolbox, MIT license)
import nltk
from textblob import TextBlob
from nltk.corpus import stopwords
# set SSL
try:
_create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
pass
else:
ssl._create_default_https_context = _create_unverified_https_context
# download noun data (if required)
nltk.download('brown')
nltk.download('punkt')
nltk.download('stopwords')
def extract_nouns(sentence):
"""Extract the nouns from a sentence using the 'textblob' library."""
blob = TextBlob(sentence)
return blob.noun_phrases
def remove_stopwords(sentence):
"""Remove stopwords from a sentence and return the list of words."""
blob = TextBlob(sentence)
return [word for word in blob.words if word not in stopwords.words('english') and len(word)>2]
| 26.433333 | 98 | 0.760404 | 108 | 793 | 5.416667 | 0.444444 | 0.082051 | 0.071795 | 0.095727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001493 | 0.155107 | 793 | 29 | 99 | 27.344828 | 0.871642 | 0.211854 | 0 | 0.105263 | 0 | 0 | 0.042414 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0.052632 | 0.210526 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
075cf0dd079f839e7d44c9491837f8a19123cdd5 | 1,418 | py | Python | toolbox/core/management/commands/celery_beat_resource_scraper.py | akshedu/toolbox | 7c647433b68f1098ee4c8623f836f74785dc970c | [
"MIT"
] | null | null | null | toolbox/core/management/commands/celery_beat_resource_scraper.py | akshedu/toolbox | 7c647433b68f1098ee4c8623f836f74785dc970c | [
"MIT"
] | null | null | null | toolbox/core/management/commands/celery_beat_resource_scraper.py | akshedu/toolbox | 7c647433b68f1098ee4c8623f836f74785dc970c | [
"MIT"
] | null | null | null |
from django_celery_beat.models import PeriodicTask, IntervalSchedule
from django.core.management.base import BaseCommand
from django.db import IntegrityError
class Command(BaseCommand):
def handle(self, *args, **options):
try:
schedule_channel, created = IntervalSchedule.objects.get_or_create(
every=4,
period=IntervalSchedule.HOURS,
)
except IntegrityError as e:
pass
try:
schedule_video, created = IntervalSchedule.objects.get_or_create(
every=6,
period=IntervalSchedule.HOURS,
)
except IntegrityError as e:
pass
try:
PeriodicTask.objects.create(
interval=schedule_channel,
name='Scrape Channels',
task='toolbox.scraper.tasks.scrape_youtube_channels',
)
except IntegrityError as e:
pass
try:
PeriodicTask.objects.create(
interval=schedule_video,
name='Scrape Videos',
task='toolbox.scraper.tasks.scrape_youtube_videos',
)
except IntegrityError as e:
pass
| 32.227273 | 79 | 0.499295 | 110 | 1,418 | 6.309091 | 0.445455 | 0.115274 | 0.126801 | 0.132565 | 0.600865 | 0.56196 | 0.458213 | 0.325648 | 0.325648 | 0.204611 | 0 | 0.002535 | 0.443583 | 1,418 | 43 | 80 | 32.976744 | 0.87706 | 0 | 0 | 0.457143 | 0 | 0 | 0.081979 | 0.062191 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | false | 0.114286 | 0.085714 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |