hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
07c450dcd80cd1f193bd6aecd6fd7589f098c705 | 1,205 | py | Python | tutorials/basic_conversions.py | shiblon/pytour | 71a181ec16fd38b0af62f55e28a50e91790733b9 | [
"Apache-2.0"
] | 2 | 2016-04-30T00:12:50.000Z | 2018-11-14T20:47:55.000Z | tutorials/basic_conversions.py | shiblon/pytour | 71a181ec16fd38b0af62f55e28a50e91790733b9 | [
"Apache-2.0"
] | 2 | 2020-02-17T22:31:09.000Z | 2020-02-18T04:31:55.000Z | tutorials/basic_conversions.py | shiblon/pytour | 71a181ec16fd38b0af62f55e28a50e91790733b9 | [
"Apache-2.0"
] | 3 | 2018-03-26T17:41:40.000Z | 2019-06-28T12:53:47.000Z | # vim:tw=50
"""Basic Conversions
We have talked about strings and numbers, and alluded a
bit to the fact that we can convert between them.
You can convert between things like numbers and
strings using the appropriate function calls, like
|int("200")| or |str(1.1 ** 24)|.
There are a number of these **callables** (things
you can _call_, like functions, using |()|) that
convert between different types. A few are listed
here (there are many more):
int float
complex str
list tuple
Exercises
- Print the result of |5 * 30|.
- Now try it as |str(5) * 30|. What happened?
- What about |"5" * "30"|?
- You can provide a *numeric base* to |int|. Try printing
|int("FACE", 16)|. This treats |FACE| as a
hexadecimal value.
"""
# I have a string, but I want a number!
num_str = " 178000 "
# Yup, it's a string:
print(repr(num_str))
# Can it be an int?
print(int(num_str))  # spaces are stripped first.
# How about a float?
print(float(num_str))
# Of course, converting between numbers works:
print(float(10))
# But what happens with this?
print(int(10.5))
# We can even make complex values from strings:
print(complex("-2+3.2j"))
# This won't work:
print(int("234notanumber"))
| 21.140351 | 57 | 0.690456 | 201 | 1,205 | 4.109453 | 0.537313 | 0.029056 | 0.041162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038502 | 0.20249 | 1,205 | 56 | 58 | 21.517857 | 0.82102 | 0.221577 | 0 | 0 | 0 | 0 | 0.15508 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
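The conversions this tutorial walks through are easy to check directly; the sketch below mirrors the tutorial's examples under Python 3, asserting the results instead of printing them:

```python
# Mirrors the basic_conversions.py examples, with expected results asserted.
num_str = " 178000 "
assert int(num_str) == 178000          # surrounding whitespace is stripped first
assert float(num_str) == 178000.0
assert float(10) == 10.0               # int -> float widens
assert int(10.5) == 10                 # float -> int truncates toward zero
assert complex("-2+3.2j") == complex(-2, 3.2)
assert int("FACE", 16) == 64206        # base-16 parse, as in the exercises
try:
    int("234notanumber")               # mixed text is rejected...
except ValueError:
    pass                               # ...with a ValueError
```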
07cca7c4a81363a58bbc5dea55b021ad03b1ff2e | 1,717 | py | Python | django_prometheus/tests/end2end/testapp/test_models.py | thepalbi/django-prometheus | 9183c3a0af93ebf6db99d8c578fe8843cd6f29aa | [
"Apache-2.0"
] | null | null | null | django_prometheus/tests/end2end/testapp/test_models.py | thepalbi/django-prometheus | 9183c3a0af93ebf6db99d8c578fe8843cd6f29aa | [
"Apache-2.0"
] | null | null | null | django_prometheus/tests/end2end/testapp/test_models.py | thepalbi/django-prometheus | 9183c3a0af93ebf6db99d8c578fe8843cd6f29aa | [
"Apache-2.0"
] | null | null | null | from django.test import TestCase
from testapp.models import Dog, Lawn
from django_prometheus.testutils import PrometheusTestCaseMixin
def M(metric_name):
"""Make a full metric name from a short metric name.
This is just intended to help keep the lines shorter in test
cases.
"""
return "django_model_%s" % metric_name
class TestModelMetrics(PrometheusTestCaseMixin, TestCase):
"""Test django_prometheus.models."""
def test_counters(self):
registry = self.saveRegistry()
cool = Dog()
cool.name = "Cool"
cool.save()
self.assertMetricDiff(registry, 1, M("inserts_total"), model="dog")
elysees = Lawn()
elysees.location = "Champs Elysees, Paris"
elysees.save()
self.assertMetricDiff(registry, 1, M("inserts_total"), model="lawn")
self.assertMetricDiff(registry, 1, M("inserts_total"), model="dog")
galli = Dog()
galli.name = "Galli"
galli.save()
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
cool.breed = "Wolfhound"
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
cool.save()
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
self.assertMetricDiff(registry, 1, M("updates_total"), model="dog")
cool.age = 9
cool.save()
self.assertMetricDiff(registry, 2, M("updates_total"), model="dog")
cool.delete() # :(
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
self.assertMetricDiff(registry, 2, M("updates_total"), model="dog")
self.assertMetricDiff(registry, 1, M("deletes_total"), model="dog")
| 33.019231 | 76 | 0.644147 | 199 | 1,717 | 5.467337 | 0.301508 | 0.202206 | 0.283088 | 0.115809 | 0.524816 | 0.52114 | 0.495404 | 0.488051 | 0.45864 | 0.25 | 0 | 0.008982 | 0.221899 | 1,717 | 51 | 77 | 33.666667 | 0.805389 | 0.089109 | 0 | 0.333333 | 0 | 0 | 0.15 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.060606 | false | 0 | 0.090909 | 0 | 0.212121 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
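The save-then-diff pattern these tests rely on (`saveRegistry` / `assertMetricDiff` from `PrometheusTestCaseMixin`) can be illustrated with a toy stand-in; the dict-based helpers below are hypothetical sketches, not django_prometheus API:

```python
# Toy stand-in for the snapshot/diff pattern: snapshot the counters, perform
# the model operation, then assert on the delta rather than the absolute value.
def save_registry(counters):
    return dict(counters)  # shallow snapshot of the current counter values

def assert_metric_diff(snapshot, expected, counters, name):
    diff = counters.get(name, 0) - snapshot.get(name, 0)
    assert diff == expected, "expected diff %d for %s, got %d" % (expected, name, diff)

counters = {"django_model_inserts_total": 5}
snap = save_registry(counters)
counters["django_model_inserts_total"] += 1     # stands in for Dog().save()
assert_metric_diff(snap, 1, counters, "django_model_inserts_total")
```

Asserting on diffs rather than absolute values is what lets the tests above run in any order without caring how many inserts earlier tests performed.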
07d30d0aa7252ab070c8a3d1a4c784e4dedb78bb | 375 | py | Python | gsignals/gsignals/util.py | baverman/scribes-goodies | f6ebfe62e5103d5337929648109b4e610950bced | [
"MIT"
] | 1 | 2019-01-14T11:47:48.000Z | 2019-01-14T11:47:48.000Z | gsignals/gsignals/util.py | Bhanditz/scribes-goodies | f6ebfe62e5103d5337929648109b4e610950bced | [
"MIT"
] | null | null | null | gsignals/gsignals/util.py | Bhanditz/scribes-goodies | f6ebfe62e5103d5337929648109b4e610950bced | [
"MIT"
] | 1 | 2019-01-14T11:47:39.000Z | 2019-01-14T11:47:39.000Z | def append_attr(obj, attr, value):
"""
Appends value to object attribute
Attribute may be undefined
For example:
append_attr(obj, 'test', 1)
append_attr(obj, 'test', 2)
assert obj.test == [1, 2]
"""
try:
getattr(obj, attr).append(value)
except AttributeError:
setattr(obj, attr, [value])
| 22.058824 | 40 | 0.554667 | 44 | 375 | 4.659091 | 0.522727 | 0.146341 | 0.190244 | 0.165854 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.328 | 375 | 16 | 41 | 23.4375 | 0.797619 | 0.464 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
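The docstring example in `append_attr` runs as written; a self-contained copy for reference:

```python
def append_attr(obj, attr, value):
    """Append value to an object attribute, creating the list if absent."""
    try:
        getattr(obj, attr).append(value)
    except AttributeError:
        setattr(obj, attr, [value])

class Holder(object):
    pass

obj = Holder()
append_attr(obj, "test", 1)   # attribute missing -> created as [1]
append_attr(obj, "test", 2)   # attribute present -> appended
assert obj.test == [1, 2]
```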
07d359fb45a6ebe39273b763c6e13c5b474b06ea | 316 | py | Python | skp_edu_docker/code/third_party/yolo/yolo/dataset/dataset.py | TensorMSA/hoyai_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 8 | 2017-06-16T00:19:12.000Z | 2020-08-13T03:15:57.000Z | kict_edu_docker/code/third_party/yolo/yolo/dataset/dataset.py | TensorMSA/tensormsa_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 21 | 2017-06-09T10:15:14.000Z | 2018-03-29T07:51:02.000Z | skp_edu_docker/code/third_party/yolo/yolo/dataset/dataset.py | TensorMSA/hoyai_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 4 | 2017-10-25T09:59:53.000Z | 2020-05-07T09:51:11.000Z | """DataSet base class
"""
class DataSet(object):
"""Base DataSet
"""
def __init__(self, common_params, dataset_params):
"""
common_params: A params dict
dataset_params: A params dict
"""
raise NotImplementedError
def batch(self):
"""Get batch
"""
raise NotImplementedError | 19.75 | 52 | 0.64557 | 34 | 316 | 5.764706 | 0.441176 | 0.122449 | 0.132653 | 0.173469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.237342 | 316 | 16 | 53 | 19.75 | 0.813278 | 0.348101 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
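The base class only fixes the constructor/`batch` interface; a hypothetical subclass (not part of the yolo code — the list-backed batching here is invented for illustration) shows how it is meant to be filled in:

```python
class DataSet(object):
    """Base DataSet, as in the yolo code above."""
    def __init__(self, common_params, dataset_params):
        raise NotImplementedError

    def batch(self):
        raise NotImplementedError

class ListDataSet(DataSet):
    """Hypothetical subclass yielding fixed-size batches from a list."""
    def __init__(self, common_params, dataset_params):
        self.items = dataset_params["items"]
        self.batch_size = common_params["batch_size"]

    def batch(self):
        # Yield consecutive slices; the last batch may be short.
        for i in range(0, len(self.items), self.batch_size):
            yield self.items[i:i + self.batch_size]

ds = ListDataSet({"batch_size": 2}, {"items": [1, 2, 3, 4, 5]})
assert list(ds.batch()) == [[1, 2], [3, 4], [5]]
```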
07d6144eb480c726ea432e24a5dd450a2cdea111 | 116,736 | py | Python | tests/test_instructionset.py | jamesba/pyz80 | 25730c9eb07432d92518ad4ed8688b489a59d84f | [
"Apache-2.0"
] | 4 | 2018-01-01T16:27:44.000Z | 2020-07-06T00:52:32.000Z | tests/test_instructionset.py | jamesba/pyz80 | 25730c9eb07432d92518ad4ed8688b489a59d84f | [
"Apache-2.0"
] | null | null | null | tests/test_instructionset.py | jamesba/pyz80 | 25730c9eb07432d92518ad4ed8688b489a59d84f | [
"Apache-2.0"
] | null | null | null | import unittest
from unittest import mock
from pyz80.machinestates import decode_instruction
from pyz80.machinestates import INSTRUCTION_STATES
from pyz80.cpu import *
from pyz80.memorybus import MemoryBus, ROM
from pyz80.iobus import IOBus, Device
def set_register_to(reg, val):
def _inner(tc, cpu, name):
setattr(cpu.reg, reg, val)
return _inner
def write_to_memory(addr, val):
def _inner(tc, cpu, name):
cpu.membus.write(addr, val)
return _inner
def expect_register_equal(reg, val):
def _inner(tc, cpu, name):
rval = getattr(cpu.reg, reg)
tc.assertEqual(rval, val, msg="""[ {} ] Expected register {} to contain value 0x{:X}, but actually contains 0x{:X}
Full register contents:
{}
""".format(name, reg, val, rval, cpu.reg.registermap()))
return _inner
def expect_memory_location_equal(addr, val):
def _inner(tc, cpu, name):
rval = cpu.membus.read(addr)
tc.assertEqual(rval, val, msg="""[ {} ] Expected location 0x{:X} to contain value 0x{:X}, but actually contains 0x{:X}""".format(name, addr, val, rval))
return _inner
def ex(tc, cpu, name):
cpu.reg.ex()
def exx(tc, cpu, name):
cpu.reg.exx()
def ei(tc, cpu, name):
cpu.iff1 = 1
cpu.iff2 = 1
def di(tc, cpu, name):
cpu.iff1 = 0
cpu.iff2 = 0
def im0(tc, cpu, name):
cpu.interrupt_mode = 0
def im1(tc, cpu, name):
cpu.interrupt_mode = 1
def im2(tc, cpu, name):
cpu.interrupt_mode = 2
def begin_nmi(tc, cpu, name):
cpu.iff1 = 0
cpu.iff2 = 1
def expect_int_enabled(tc, cpu, name):
tc.assertEqual(cpu.iff1, 1, msg="""[ {} ] Expected iff1 to be set, is actually reset""".format(name))
def expect_int_preserved(tc, cpu, name):
tc.assertEqual(cpu.iff2, 1, msg="""[ {} ] Expected iff2 to be set, is actually reset""".format(name))
def expect_int_disabled(tc, cpu, name):
tc.assertEqual(cpu.iff1, 0, msg="""[ {} ] Expected iff1 to be reset, is actually set""".format(name))
def expect_int_not_preserved(tc, cpu, name):
tc.assertEqual(cpu.iff2, 0, msg="""[ {} ] Expected iff2 to be reset, is actually set""".format(name))
class MEM(object):
def __call__(self, key, value):
return write_to_memory(key, value)
def __getitem__(self, key):
class __inner(object):
def __init__(self, key):
self.key = key
def __eq__(self, other):
return expect_memory_location_equal(self.key, other)
return __inner(key)
class FLAG(object):
def __call__(self, key, value=None):
if value is None:
return set_register_to("F", key)
def _inner(tc, cpu, name):
if value == 0:
cpu.reg.resetflag(key)
else:
cpu.reg.setflag(key)
return _inner
def __getitem__(self, key):
class _inner(object):
def __init__(self, key):
self.key = key
def __eq__(self, other):
def __inner(tc, cpu, name):
rval = cpu.reg.getflag(self.key)
tc.assertEqual(rval, other, msg="""[ {} ] Expected flag {} to be {}, was actually {}""".format(name, self.key, other, rval))
return __inner
return _inner(key)
def __eq__(self, other):
return expect_register_equal('F', other)
class REG(object):
def __init__(self, r):
self.r = r
def __call__(self, value):
return set_register_to(self.r, value)
def __eq__(self, other):
return expect_register_equal(self.r, other)
class DummyInput(Device):
def __init__(self):
self.data = 0x00
self.high = 0x00
super(DummyInput, self).__init__()
def responds_to_port(self, port):
return (port == 0xFE)
def read(self, address):
self.high = address
return self.data
class _IN(object):
def __init__(self):
self.device = DummyInput()
def __call__(self, value):
def _inner(tc, cpu, name):
self.device.data = value
self.device.high = 0x00
return _inner
def __eq__(self, other):
def _inner(tc, cpu, name):
tc.assertEqual(self.device.high, other, msg="""[ {} ] Expected most recent high address on input port to be 0x{:X}, but actually 0x{:X}""".format(name, other, self.device.high))
return _inner
class DummyOutput(Device):
def __init__(self):
self.data = 0x00
self.high = 0x00
super(DummyOutput, self).__init__()
def responds_to_port(self, port):
return (port == 0xFA)
def write(self, address, value):
self.data = value
self.high = address
class _OUT(object):
def __init__(self):
self.device = DummyOutput()
def __eq__(self, other):
def _inner(tc, cpu, name):
tc.assertEqual((self.device.high, self.device.data), other, msg="""[ {} ] Expected most recent high address and data on input port to be (0x{:X},0x{:X}) but actually (0x{:X},0x{:X})""".format(name, other[0], other[1], self.device.high, self.device.data))
return _inner
class INTGEN(object):
def __init__(self):
self.next_interrupt = -1
self.data = []
self.interrupted = False
self.acknowledged = False
super(INTGEN, self).__init__()
def reset(self):
self.next_interrupt = -1
self.data = []
self.interrupted = False
self.acknowledged = False
def __call__(self, n, data, nmi=False):
def _inner(tc, cpu, name):
self.next_interrupt = n
self.data = data
self.nmi = nmi
return _inner
def clock(self, cpu):
if self.next_interrupt >= 0:
self.next_interrupt -= 1
if self.next_interrupt == 0:
self.interrupted = True
def _ack(cpu):
self.acknowledged = True
for x in self.data:
yield x
cpu.interrupt(ack=_ack, nmi=self.nmi)
def __eq__(self, other):
def _inner(tc, cpu, name):
if other[0]:
tc.assertTrue(self.interrupted, msg = "[ {} ] Expected an interrupt, but none was fired".format(name))
if other[1]:
tc.assertTrue(self.acknowledged, msg = "[ {} ] Expected an acknowledgement, but none was received".format(name))
else:
tc.assertFalse(self.acknowledged, msg = "[ {} ] Expected no acknowledgement, but one was received".format(name))
else:
tc.assertFalse(self.interrupted, msg = "[ {} ] Expected no interrupt, but one was fired".format(name))
return _inner
class _IM(object):
def __call__(self, mode):
def _inner(tc, cpu, name):
cpu.interrupt_mode = mode
return _inner
def __eq__(self, other):
def _inner(tc, cpu, name):
tc.assertEqual(cpu.interrupt_mode, other, msg = "[ {} ] Expected interrupt mode {} but actually is {}".format(name, other, cpu.interrupt_mode))
return _inner
IM = _IM()
IG = INTGEN()
IN = _IN()
OUT = _OUT()
F = FLAG()
M = MEM()
A = REG('A')
B = REG('B')
C = REG('C')
D = REG('D')
E = REG('E')
H = REG('H')
L = REG('L')
I = REG('I')
R = REG('R')
SP = REG('SP')
PC = REG('PC')
IX = REG('IX')
IY = REG('IY')
AF = REG('AF')
BC = REG('BC')
DE = REG('DE')
HL = REG('HL')
class TestInstructionSet(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.executed_instructions = []
@classmethod
def tearDownClass(cls):
print()
print("Instruction Coverage:")
print("---------------------")
covered = 0
total = 0
def combine_elements(x, a=0):
# Note the parentheses: << binds more loosely than +, so a bare
# "a << 8 + x" would compute a << (8 + x).
if isinstance(x, int):
return (a << 8) + x
elif len(x) == 0:
return a
else:
return combine_elements(x[1:], (a << 8) + x[0])
keys = sorted(INSTRUCTION_STATES.keys(), key=combine_elements)
for i in keys:
if i in cls.executed_instructions:
print("> ", end=' ')
covered += 1
else:
print("! ", end=' ')
if isinstance(i, int):
print("{:#02x}".format(i))
else:
print("({:#02x},{:#02x})".format(*i))
total += 1
print("---------------------")
print("Cover: {: >7.2%}".format(float(covered)/float(total)))
print("---------------------")
def execute_instructions(self, pre, instructions, t_cycles, post, name):
IN.device.data = 0x00
IN.device.high = 0x00
OUT.device.data = 0x00
OUT.device.high = 0x00
IG.reset()
membus = MemoryBus()
iobus = IOBus([ IN.device, OUT.device ])
cpu = Z80CPU(iobus, membus)
for n in range(0,len(instructions)):
membus.write(n, instructions[n])
membus.write(len(instructions), 0xFF) # This should raise an exception when we reach it
for action in pre:
action(self, cpu, name)
original_decode_instruction = decode_instruction
with mock.patch('pyz80.machinestates.decode_instruction', side_effect=original_decode_instruction) as _decode_instruction:
for n in range(0, t_cycles):
IG.clock(cpu)
cpu.clock()
self.__class__.executed_instructions.extend(call[1][0] for call in _decode_instruction.mock_calls)
self.assertEqual(len(cpu.pipeline), 1, msg="[{}] At end of instruction pipeline still contains machine states: {!r}".format(name, cpu.pipeline))
self.assertEqual(str(type(cpu.pipeline[0])), "<class 'pyz80.machinestates.OCF.<locals>._OCF'>")
for action in post:
action(self, cpu, name)
def test_NOP(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [], [ 0x00, ], 4, [ (PC == 0x01), ], "NOP" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_LD(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ I(0x0B) ], [ 0xED, 0x57 ], 9, [ (PC == 0x02), (A == 0x0B), (F == 0x08) ], "LD A,I (I == 0x0B)" ],
[ [ I(0x80) ], [ 0xED, 0x57 ], 9, [ (PC == 0x02), (A == 0x80), (F == 0x80) ], "LD A,I (I == 0x80)" ],
[ [ I(0x00) ], [ 0xED, 0x57 ], 9, [ (PC == 0x02), (A == 0x00), (F == 0x40) ], "LD A,I (I == 0x00)" ],
[ [ I(0x20) ], [ 0xED, 0x57 ], 9, [ (PC == 0x02), (A == 0x20), (F == 0x20) ], "LD A,I (I == 0x20)" ],
[ [ ei, I(0x01) ], [ 0xED, 0x57 ], 9, [ (PC == 0x02), (A == 0x01), (F == 0x04) ], "LD A,I (I == 0x01, iff2=1)" ],
[ [ R(0x0B) ], [ 0xED, 0x5F ], 9, [ (PC == 0x02), (A == 0x0B), (F == 0x08) ], "LD A,R (R == 0x0B)" ],
[ [ R(0x80) ], [ 0xED, 0x5F ], 9, [ (PC == 0x02), (A == 0x80), (F == 0x80) ], "LD A,R (R == 0x80)" ],
[ [ R(0x00) ], [ 0xED, 0x5F ], 9, [ (PC == 0x02), (A == 0x00), (F == 0x40) ], "LD A,R (R == 0x00)" ],
[ [ R(0x20) ], [ 0xED, 0x5F ], 9, [ (PC == 0x02), (A == 0x20), (F == 0x20) ], "LD A,R (R == 0x20)" ],
[ [ ei, R(0x01) ], [ 0xED, 0x5F ], 9, [ (PC == 0x02), (A == 0x01), (F == 0x04) ], "LD A,R (R == 0x01, iff2=1)" ],
[ [ A(0x0B) ], [ 0xED, 0x47 ], 9, [ (PC == 0x02), (I == 0x0B) ], "LD I,A" ],
[ [ A(0x0B) ], [ 0xED, 0x4F ], 9, [ (PC == 0x02), (R == 0x0B) ], "LD R,A" ],
[ [], [ 0x01, 0xBC, 0x1B ], 10, [ (PC == 0x03), (BC == 0x1BBC), ], "LD BC,1BBCH" ],
[ [], [ 0x11, 0xBC, 0x1B ], 10, [ (PC == 0x03), (DE == 0x1BBC), ], "LD DE,1BBCH" ],
[ [], [ 0x21, 0xBC, 0x1B ], 10, [ (PC == 0x03), (HL == 0x1BBC), ], "LD HL,1BBCH" ],
[ [], [ 0x31, 0xBC, 0x1B ], 10, [ (PC == 0x03), (SP == 0x1BBC), ], "LD SP,1BBCH" ],
[ [], [ 0xDD, 0x21, 0xBC, 0x1B ], 14, [ (PC == 0x04), (IX == 0x1BBC), ], "LD IX,1BBCH" ],
[ [], [ 0xFD, 0x21, 0xBC, 0x1B ], 14, [ (PC == 0x04), (IY == 0x1BBC), ], "LD IY,1BBCH" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0x2A, 0xBC, 0x1B ], 16, [ (PC == 0x03), (HL == 0xCAFE) ], "LD HL,(1BBCH)" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0xED, 0x4B, 0xBC, 0x1B ], 20, [ (PC == 0x04), (BC == 0xCAFE) ], "LD BC,(1BBCH)" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0xED, 0x5B, 0xBC, 0x1B ], 20, [ (PC == 0x04), (DE == 0xCAFE) ], "LD DE,(1BBCH)" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0xED, 0x7B, 0xBC, 0x1B ], 20, [ (PC == 0x04), (SP == 0xCAFE) ], "LD SP,(1BBCH)" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0xDD, 0x2A, 0xBC, 0x1B ], 20, [ (PC == 0x04), (IX == 0xCAFE) ], "LD IX,(1BBCH)" ],
[ [ M(0x1BBC,0xFE), M(0x1BBD,0xCA) ], [ 0xFD, 0x2A, 0xBC, 0x1B ], 20, [ (PC == 0x04), (IY == 0xCAFE) ], "LD IY,(1BBCH)" ],
[ [ BC(0xCAFE) ], [ 0xED, 0x43, 0xBC, 0x1B ], 20, [ (PC == 0x04), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),BC" ],
[ [ DE(0xCAFE) ], [ 0xED, 0x53, 0xBC, 0x1B ], 20, [ (PC == 0x04), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),DE" ],
[ [ HL(0xCAFE) ], [ 0x22, 0xBC, 0x1B ], 16, [ (PC == 0x03), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),HL" ],
[ [ SP(0xCAFE) ], [ 0xED, 0x73, 0xBC, 0x1B ], 20, [ (PC == 0x04), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),SP" ],
[ [ IX(0xCAFE) ], [ 0xDD, 0x22, 0xBC, 0x1B ], 20, [ (PC == 0x04), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),IX" ],
[ [ IY(0xCAFE) ], [ 0xFD, 0x22, 0xBC, 0x1B ], 20, [ (PC == 0x04), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),IY" ],
[ [ HL(0x1BBC) ], [ 0xF9 ], 4, [ (PC == 0x01), (SP == 0x1BBC), ], "LD SP,HL" ],
[ [ BC(0x1BBC), A(0xB) ], [ 0x02, ], 7, [ (PC == 0x01), (M[0x1BBC] == 0xB) ], "LD (BC),A" ],
[ [ DE(0x1BBC), A(0xB) ], [ 0x12, ], 7, [ (PC == 0x01), (M[0x1BBC] == 0xB) ], "LD (DE),A" ],
[ [ BC(0x1BBC), M(0x1BBC,0xB) ], [ 0x0A, ], 7, [ (PC == 0x01), (A == 0xB), ], "LD A,(BC)" ],
[ [ DE(0x1BBC), M(0x1BBC,0xB) ], [ 0x1A, ], 7, [ (PC == 0x01), (A == 0xB), ], "LD A,(DE)" ],
[ [], [ 0x06, 0x0B, ], 7, [ (PC == 0x02), (B == 0xB) ], "LD B,0BH" ],
[ [], [ 0x0E, 0x0B, ], 7, [ (PC == 0x02), (C == 0xB) ], "LD C,0BH" ],
[ [], [ 0x16, 0x0B, ], 7, [ (PC == 0x02), (D == 0xB) ], "LD D,0BH" ],
[ [], [ 0x1E, 0x0B, ], 7, [ (PC == 0x02), (E == 0xB) ], "LD E,0BH" ],
[ [], [ 0x26, 0x0B, ], 7, [ (PC == 0x02), (H == 0xB) ], "LD H,0BH" ],
[ [], [ 0x2E, 0x0B, ], 7, [ (PC == 0x02), (L == 0xB) ], "LD L,0BH" ],
[ [], [ 0x3E, 0x0B, ], 7, [ (PC == 0x02), (A == 0xB) ], "LD A,0BH" ],
[ [ A(0xB), ], [ 0x32, 0xBC, 0x1B ], 13, [ (PC == 0x03), (M[0x1BBC] == 0x0B) ], "LD (1BBCH),A" ],
[ [ HL(0x1BBC), ], [ 0x36, 0x0B ], 10, [ (PC == 0x02), (M[0x1BBC] == 0x0B) ], "LD (HL),0BH" ],
[ [ M(0x1BBC,0xB), ], [ 0x3A, 0xBC, 0x1B ], 13, [ (PC == 0x03), (A == 0x0B) ], "LD A,(1BBCH)" ],
[ [ B(0xB) ], [ 0x40, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,B" ],
[ [ C(0xB) ], [ 0x41, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,C" ],
[ [ D(0xB) ], [ 0x42, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,D" ],
[ [ E(0xB) ], [ 0x43, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,E" ],
[ [ H(0xB) ], [ 0x44, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,H" ],
[ [ L(0xB) ], [ 0x45, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,L" ],
[ [ HL(0x1BBC), M(0x1BBC,0xB) ], [ 0x46, ], 7, [ (PC == 0x01), (B == 0xB), ], "LD B,(HL)" ],
[ [ A(0xB) ], [ 0x47, ], 4, [ (PC == 0x01), (B == 0xB), ], "LD B,A" ],
[ [ B(0xB) ], [ 0x48, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,B" ],
[ [ C(0xB) ], [ 0x49, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,C" ],
[ [ D(0xB) ], [ 0x4A, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,D" ],
[ [ E(0xB) ], [ 0x4B, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,E" ],
[ [ H(0xB) ], [ 0x4C, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,H" ],
[ [ L(0xB) ], [ 0x4D, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x4E, ], 7, [ (PC == 0x01), (C == 0xB), ], "LD C,(HL)" ],
[ [ A(0xB) ], [ 0x4F, ], 4, [ (PC == 0x01), (C == 0xB), ], "LD C,A" ],
[ [ B(0xB) ], [ 0x50, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,B" ],
[ [ C(0xB) ], [ 0x51, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,C" ],
[ [ D(0xB) ], [ 0x52, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,D" ],
[ [ E(0xB) ], [ 0x53, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,E" ],
[ [ H(0xB) ], [ 0x54, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,H" ],
[ [ L(0xB) ], [ 0x55, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x56, ], 7, [ (PC == 0x01), (D == 0xB), ], "LD D,(HL)" ],
[ [ A(0xB) ], [ 0x57, ], 4, [ (PC == 0x01), (D == 0xB), ], "LD D,A" ],
[ [ B(0xB) ], [ 0x58, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,B" ],
[ [ C(0xB) ], [ 0x59, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,C" ],
[ [ D(0xB) ], [ 0x5A, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,D" ],
[ [ E(0xB) ], [ 0x5B, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,E" ],
[ [ H(0xB) ], [ 0x5C, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,H" ],
[ [ L(0xB) ], [ 0x5D, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x5E, ], 7, [ (PC == 0x01), (E == 0xB), ], "LD E,(HL)" ],
[ [ A(0xB) ], [ 0x5F, ], 4, [ (PC == 0x01), (E == 0xB), ], "LD E,A" ],
[ [ B(0xB) ], [ 0x60, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,B" ],
[ [ C(0xB) ], [ 0x61, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,C" ],
[ [ D(0xB) ], [ 0x62, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,D" ],
[ [ E(0xB) ], [ 0x63, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,E" ],
[ [ H(0xB) ], [ 0x64, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,H" ],
[ [ L(0xB) ], [ 0x65, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x66, ], 7, [ (PC == 0x01), (H == 0xB), ], "LD H,(HL)" ],
[ [ A(0xB) ], [ 0x67, ], 4, [ (PC == 0x01), (H == 0xB), ], "LD H,A" ],
[ [ B(0xB) ], [ 0x68, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,B" ],
[ [ C(0xB) ], [ 0x69, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,C" ],
[ [ D(0xB) ], [ 0x6A, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,D" ],
[ [ E(0xB) ], [ 0x6B, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,E" ],
[ [ H(0xB) ], [ 0x6C, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,H" ],
[ [ L(0xB) ], [ 0x6D, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x6E, ], 7, [ (PC == 0x01), (L == 0xB), ], "LD L,(HL)" ],
[ [ A(0xB) ], [ 0x6F, ], 4, [ (PC == 0x01), (L == 0xB), ], "LD L,A" ],
[ [ IX(0xCAFE) ], [ 0xDD, 0xF9 ], 8, [ (PC == 0x02), (SP == 0xCAFE) ], "LD SP,IX"],
[ [ IY(0xCAFE) ], [ 0xFD, 0xF9 ], 8, [ (PC == 0x02), (SP == 0xCAFE) ], "LD SP,IY"],
[ [ HL(0x1BBC), B(0xB) ], [ 0x70, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xB ], "LD (HL),B" ],
[ [ HL(0x1BBC), C(0xB) ], [ 0x71, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xB ], "LD (HL),C" ],
[ [ HL(0x1BBC), D(0xB) ], [ 0x72, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xB ], "LD (HL),D" ],
[ [ HL(0x1BBC), E(0xB) ], [ 0x73, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xB ], "LD (HL),E" ],
[ [ HL(0x1BBC) ], [ 0x74, ], 7, [ (PC == 0x01), M[0x1BBC] == 0x1B ], "LD (HL),H" ],
[ [ HL(0x1BBC) ], [ 0x75, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xBC ], "LD (HL),L" ],
[ [ HL(0x1BBC), A(0xB) ], [ 0x77, ], 7, [ (PC == 0x01), M[0x1BBC] == 0xB ], "LD (HL),A" ],
[ [ B(0xB) ], [ 0x78, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,B" ],
[ [ C(0xB) ], [ 0x79, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,C" ],
[ [ D(0xB) ], [ 0x7A, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,D" ],
[ [ E(0xB) ], [ 0x7B, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,E" ],
[ [ H(0xB) ], [ 0x7C, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,H" ],
[ [ L(0xB) ], [ 0x7D, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,L" ],
[ [ HL(0x1BBC), M(0x1BBC, 0xB) ], [ 0x7E, ], 7, [ (PC == 0x01), (A == 0xB), ], "LD A,(HL)" ],
[ [ A(0xB) ], [ 0x7F, ], 4, [ (PC == 0x01), (A == 0xB), ], "LD A,A" ],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x46, 0x0C ], 19, [ (PC == 0x3), (B == 0x0B) ], "LD B,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x4E, 0x0C ], 19, [ (PC == 0x3), (C == 0x0B) ], "LD C,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x56, 0x0C ], 19, [ (PC == 0x3), (D == 0x0B) ], "LD D,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x5E, 0x0C ], 19, [ (PC == 0x3), (E == 0x0B) ], "LD E,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x66, 0x0C ], 19, [ (PC == 0x3), (H == 0x0B) ], "LD H,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x6E, 0x0C ], 19, [ (PC == 0x3), (L == 0x0B) ], "LD L,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IX(0x1BB0) ], [ 0xDD, 0x7E, 0x0C ], 19, [ (PC == 0x3), (A == 0x0B) ], "LD A,(IX+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x46, 0x0C ], 19, [ (PC == 0x3), (B == 0x0B) ], "LD B,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x4E, 0x0C ], 19, [ (PC == 0x3), (C == 0x0B) ], "LD C,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x56, 0x0C ], 19, [ (PC == 0x3), (D == 0x0B) ], "LD D,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x5E, 0x0C ], 19, [ (PC == 0x3), (E == 0x0B) ], "LD E,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x66, 0x0C ], 19, [ (PC == 0x3), (H == 0x0B) ], "LD H,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x6E, 0x0C ], 19, [ (PC == 0x3), (L == 0x0B) ], "LD L,(IY+0CH)"],
[ [ M(0x1BBC, 0x0B), IY(0x1BB0) ], [ 0xFD, 0x7E, 0x0C ], 19, [ (PC == 0x3), (A == 0x0B) ], "LD A,(IY+0CH)"],
[ [ B(0x0B), IX(0x1BB0) ], [ 0xDD, 0x70, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),B"],
[ [ C(0x0B), IX(0x1BB0) ], [ 0xDD, 0x71, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),C"],
[ [ D(0x0B), IX(0x1BB0) ], [ 0xDD, 0x72, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),D"],
[ [ E(0x0B), IX(0x1BB0) ], [ 0xDD, 0x73, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),E"],
[ [ H(0x0B), IX(0x1BB0) ], [ 0xDD, 0x74, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),H"],
[ [ L(0x0B), IX(0x1BB0) ], [ 0xDD, 0x75, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),L"],
[ [ A(0x0B), IX(0x1BB0) ], [ 0xDD, 0x77, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),A"],
[ [ IX(0x1BB0) ], [ 0xDD, 0x36, 0x0C, 0x0B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0x0B) ], "LD (IX+0CH),0BH"],
[ [ IY(0x1BB0) ], [ 0xFD, 0x36, 0x0C, 0x0B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),0BH"],
[ [ B(0x0B), IY(0x1BB0) ], [ 0xFD, 0x70, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),B"],
[ [ C(0x0B), IY(0x1BB0) ], [ 0xFD, 0x71, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),C"],
[ [ D(0x0B), IY(0x1BB0) ], [ 0xFD, 0x72, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),D"],
[ [ E(0x0B), IY(0x1BB0) ], [ 0xFD, 0x73, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),E"],
[ [ H(0x0B), IY(0x1BB0) ], [ 0xFD, 0x74, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),H"],
[ [ L(0x0B), IY(0x1BB0) ], [ 0xFD, 0x75, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),L"],
[ [ A(0x0B), IY(0x1BB0) ], [ 0xFD, 0x77, 0x0C ], 19, [ (PC == 0x3), (M[0x1BBC] == 0x0B) ], "LD (IY+0CH),A"],
[ [ BC(0xCAFE) ], [ 0xED, 0x43, 0xBC, 0x1B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),BC" ],
[ [ DE(0xCAFE) ], [ 0xED, 0x53, 0xBC, 0x1B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),DE" ],
[ [ SP(0xCAFE) ], [ 0xED, 0x73, 0xBC, 0x1B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),SP" ],
[ [ IX(0xCAFE) ], [ 0xDD, 0x22, 0xBC, 0x1B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),IX" ],
[ [ IY(0xCAFE) ], [ 0xFD, 0x22, 0xBC, 0x1B ], 22, [ (PC == 0x4), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "LD (1BBCH),IY" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_pop(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xC1, ], 10, [ (PC == 0x01), (SP == 0x1BBE), (BC == 0xCAFE) ], "POP BC" ],
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xD1, ], 10, [ (PC == 0x01), (SP == 0x1BBE), (DE == 0xCAFE) ], "POP DE" ],
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xE1, ], 10, [ (PC == 0x01), (SP == 0x1BBE), (HL == 0xCAFE) ], "POP HL" ],
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xF1, ], 10, [ (PC == 0x01), (SP == 0x1BBE), (AF == 0xCAFE) ], "POP AF" ],
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xDD, 0xE1, ], 14, [ (PC == 0x02), (SP == 0x1BBE), (IX == 0xCAFE) ], "POP IX" ],
[ [ M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), SP(0x1BBC) ], [ 0xFD, 0xE1, ], 14, [ (PC == 0x02), (SP == 0x1BBE), (IY == 0xCAFE) ], "POP IY" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_push(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ AF(0xCAFE), SP(0x1BBC) ], [ 0xF5, ], 10, [ (PC == 0x01), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH AF" ],
[ [ BC(0xCAFE), SP(0x1BBC) ], [ 0xC5, ], 10, [ (PC == 0x01), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH BC" ],
[ [ DE(0xCAFE), SP(0x1BBC) ], [ 0xD5, ], 10, [ (PC == 0x01), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH DE" ],
[ [ HL(0xCAFE), SP(0x1BBC) ], [ 0xE5, ], 10, [ (PC == 0x01), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH HL" ],
[ [ IX(0xCAFE), SP(0x1BBC) ], [ 0xDD, 0xE5, ], 14, [ (PC == 0x02), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH IX" ],
[ [ IY(0xCAFE), SP(0x1BBC) ], [ 0xFD, 0xE5, ], 14, [ (PC == 0x02), (SP == 0x1BBA), (M[0x1BBA] == 0xFE), (M[0x1BBB] == 0xCA) ], "PUSH IY" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ex(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ AF(0xA), ex, AF(0xB) ], [ 0x08 ], 4, [ (PC == 0x01), (AF == 0xA), ex, (AF == 0xB) ], "EX AF,AF'" ],
[ [ DE(0xA), HL(0xB)], [ 0xEB ], 4, [ (PC == 0x01), (DE == 0xB), (HL == 0xA) ], "EX DE,HL" ],
[ [ HL(0xCAFE), SP(0x1BBC), M(0x1BBC, 0x37), M(0x1BBD, 0x13),], [ 0xE3 ], 19, [ (PC == 0x01), (HL == 0x1337), (SP == 0x1BBC), (M[0x1BBC] == 0xFE), (M[0x1BBD] == 0xCA) ], "EX (SP),HL" ],
[ [ IX(0xCAFE), SP(0x1BBC), M(0x1BBC, 0x37), M(0x1BBD, 0x13),], [ 0xDD, 0xE3 ], 23, [ (PC==0x02), (IX==0x1337), (SP==0x1BBC), (M[0x1BBC]==0xFE), (M[0x1BBD]==0xCA) ], "EX (SP),IX" ],
[ [ IY(0xCAFE), SP(0x1BBC), M(0x1BBC, 0x37), M(0x1BBD, 0x13),], [ 0xFD, 0xE3 ], 23, [ (PC==0x02), (IY==0x1337), (SP==0x1BBC), (M[0x1BBC]==0xFE), (M[0x1BBD]==0xCA) ], "EX (SP),IY" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_exx(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
(pre, instructions, t_cycles, post, name) = [
[ BC(0xCAFE), DE(0x1BBC), HL(0xDEAD), exx, BC(0x1337), DE(0x8080), HL(0xF00F) ],
[ 0xD9 ], 4,
[ (BC == 0xCAFE), (DE == 0x1BBC), (HL == 0xDEAD), exx, (BC == 0x1337), (DE == 0x8080), (HL == 0xF00F) ],
"EXX"
]
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ldi(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x02), A(0x00), M(0x1BBC, 0x2B) ], [ 0xED, 0xA0 ], 16, [ (PC==0x02),(HL==0x1BBD),(DE==0x2BBD),(BC==0x1),(M[0x2BBC]==0x2B), (F==0x2C) ], "LDI (nz, A==0x00)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x02), A(0x08), M(0x1BBC, 0x2B) ], [ 0xED, 0xA0 ], 16, [ (PC==0x02),(HL==0x1BBD),(DE==0x2BBD),(BC==0x1),(M[0x2BBC]==0x2B), (F==0x24) ], "LDI (nz, A==0x08)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x02), A(0x20), M(0x1BBC, 0x2B) ], [ 0xED, 0xA0 ], 16, [ (PC==0x02),(HL==0x1BBD),(DE==0x2BBD),(BC==0x1),(M[0x2BBC]==0x2B), (F==0x0C) ], "LDI (nz, A==0x20)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x01), A(0x00), M(0x1BBC, 0x2B) ], [ 0xED, 0xA0 ], 16, [ (PC==0x02),(HL==0x1BBD),(DE==0x2BBD),(BC==0x0),(M[0x2BBC]==0x2B), (F==0x28) ], "LDI (z, A==0x00)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ldir(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x2), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xB0 ], 21, [ (PC==0x00), (HL==0x1BBD), (DE==0x2BBD), (BC==0x1), (M[0x2BBC]==0xB), (F["V"]==1) ], "LDIR (count non-zero)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x1), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xB0 ], 16, [ (PC==0x02), (HL==0x1BBD), (DE==0x2BBD), (BC==0x0), (M[0x2BBC]==0xB), (F["V"]==0) ], "LDIR (count zero)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x2), M(0x1BBC, 0xB), M(0x1BBD, 0xC), F("V",1) ], [ 0xED, 0xB0 ], 37, [ (PC==0x02), (HL==0x1BBE), (DE==0x2BBE), (BC==0x0), (M[0x2BBC]==0xB), (M[0x2BBD]==0xC), (F["V"]==0) ], "LDIR (loop)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ldd(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x2), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xA8 ], 16, [ (PC==0x02), (HL==0x1BBB), (DE==0x2BBB), (BC==0x1), (M[0x2BBC]==0xB), (F["V"]==1) ], "LDD (count non-zero)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x1), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xA8 ], 16, [ (PC==0x02), (HL==0x1BBB), (DE==0x2BBB), (BC==0x0), (M[0x2BBC]==0xB), (F["V"]==0) ], "LDD (count zero)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_lddr(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x2), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xB8 ], 21, [ (PC==0x00), (HL==0x1BBB), (DE==0x2BBB), (BC==0x1), (M[0x2BBC]==0xB), (F["V"]==1) ], "LDDR (count non-zero)" ],
[ [ HL(0x1BBC), DE(0x2BBC), BC(0x1), M(0x1BBC, 0xB), F("V",1) ], [ 0xED, 0xB8 ], 16, [ (PC==0x02), (HL==0x1BBB), (DE==0x2BBB), (BC==0x0), (M[0x2BBC]==0xB), (F["V"]==0) ], "LDDR (count zero)" ],
[ [ HL(0x1BBD), DE(0x2BBD), BC(0x2), M(0x1BBC, 0xB), M(0x1BBD, 0xC), F("V",1) ], [ 0xED, 0xB8 ], 37, [ (PC==0x02), (HL==0x1BBB), (DE==0x2BBB), (BC==0x0), (M[0x2BBC]==0xB), (M[0x2BBD]==0xC), (F["V"]==0) ], "LDDR (loop)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cpi(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), BC(0x2), M(0x1BBC, 0xFE), A(0x00) ], [ 0xED, 0xA1 ], 16, [ (PC==0x02), (HL==0x1BBD), (BC==0x1), (F==0x06) ], "CPI (ne)" ],
[ [ HL(0x1BBC), BC(0x2), M(0x1BBC, 0xFE), A(0xFE) ], [ 0xED, 0xA1 ], 16, [ (PC==0x02), (HL==0x1BBD), (BC==0x1), (F==0x46) ], "CPI (eq)" ],
[ [ HL(0x1BBC), BC(0x1), M(0x1BBC, 0xFE), A(0x00) ], [ 0xED, 0xA1 ], 16, [ (PC==0x02), (HL==0x1BBD), (BC==0x0), (F==0x02) ], "CPI (ne,last)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cpir(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0xCA) ], [ 0xED, 0xB1 ], 37, [ (PC==0x02), (HL==0x1BBE), (BC==0x1), (F==0x46) ], "CPIR (found after 2 cycles)" ],
[ [ HL(0x1BBC), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0x00) ], [ 0xED, 0xB1 ], 58, [ (PC==0x02), (HL==0x1BBF), (BC==0x0), (F==0x42) ], "CPIR (found after 3 cycles)" ],
[ [ HL(0x1BBC), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0xBA) ], [ 0xED, 0xB1 ], 58, [ (PC==0x02), (HL==0x1BBF), (BC==0x0), (F==0x2A) ], "CPIR (not found after 3 cycles)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cpd(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBC), BC(0x2), M(0x1BBC, 0xFE), A(0x00) ], [ 0xED, 0xA9 ], 16, [ (PC==0x02), (HL==0x1BBB), (BC==0x1), (F==0x06) ], "CPD (ne)" ],
[ [ HL(0x1BBC), BC(0x2), M(0x1BBC, 0xFE), A(0xFE) ], [ 0xED, 0xA9 ], 16, [ (PC==0x02), (HL==0x1BBB), (BC==0x1), (F==0x46) ], "CPD (eq)" ],
[ [ HL(0x1BBC), BC(0x1), M(0x1BBC, 0xFE), A(0x00) ], [ 0xED, 0xA9 ], 16, [ (PC==0x02), (HL==0x1BBB), (BC==0x0), (F==0x02) ], "CPD (ne,last)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cpdr(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ HL(0x1BBE), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0xCA) ], [ 0xED, 0xB9 ], 37, [ (PC==0x02), (HL==0x1BBC), (BC==0x1), (F==0x46) ], "CPDR (found after 2 cycles)" ],
[ [ HL(0x1BBE), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0xFE) ], [ 0xED, 0xB9 ], 58, [ (PC==0x02), (HL==0x1BBB), (BC==0x0), (F==0x42) ], "CPDR (found after 3 cycles)" ],
[ [ HL(0x1BBE), BC(0x3), M(0x1BBC, 0xFE), M(0x1BBD, 0xCA), A(0xBA) ], [ 0xED, 0xB9 ], 58, [ (PC==0x02), (HL==0x1BBB), (BC==0x0), (F==0x2A) ], "CPDR (not found after 3 cycles)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_add8(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X,Y,f) in [ (0x0A, 0x0B, 0x10),
(0x40, 0x51, 0x84),
(0xFF, 0x02, 0x15),
(0xFF, 0x01, 0x55) ]:
tests = [
[ [ A(X), B(Y) ], [ 0x80 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD B (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0x81 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD C (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0x82 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD D (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0x83 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD E (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0x84 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD H (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0x85 ], 4, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD L (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0x86 ], 7, [ (PC==0x01), (A == (X+Y)&0xFF), (F==f) ], "ADD (HL) (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xC6, Y ], 7, [ (PC==0x02), (A == (X+Y)&0xFF), (F==f) ], "ADD {:X}H (0x{:X} + 0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0x86, 0x0C ], 19, [ (PC==0x03), (A == (X+Y)&0xFF), (F==f) ], "ADD (IX+0CH) (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0x86, 0x0C ], 19, [ (PC==0x03), (A == (X+Y)&0xFF), (F==f) ], "ADD (IY+0CH) (0x{:X} + 0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for (X,f) in [ (0x0A, 0x10),
(0x40, 0x84),
(0x81, 0x05),
(0x80, 0x45) ]:
tests = [
[ [ A(X) ], [ 0x87 ], 4, [ (PC==0x01), (A == (X+X)&0xFF), (F==f) ], "ADD A (0x{:X} + 0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_adc8(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X,Y,f,c) in [ (0x0A, 0x0B, 0x10, 0),
(0x0A, 0x0A, 0x10, 1),
(0x40, 0x51, 0x84, 0),
(0x40, 0x50, 0x84, 1),
(0xFF, 0x02, 0x15, 0),
(0xFF, 0x01, 0x15, 1),
(0xFF, 0x01, 0x55, 0),
(0xFF, 0x00, 0x55, 1) ]:
tests = [
[ [ A(X), B(Y), F(c) ], [ 0x88 ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC B (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), C(Y), F(c) ], [ 0x89 ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC C (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), D(Y), F(c) ], [ 0x8A ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC D (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), E(Y), F(c) ], [ 0x8B ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC E (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), H(Y), F(c) ], [ 0x8C ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC H (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), L(Y), F(c) ], [ 0x8D ], 4, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC L (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC), F(c) ], [ 0x8E ], 7, [ (PC==0x01), (A == (X+Y+c)&0xFF), (F==f) ], "ADC (HL) (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), F(c) ], [ 0xCE, Y ], 7, [ (PC==0x02), (A == (X+Y+c)&0xFF), (F==f) ], "ADC {:X}H (0x{:X} + 0x{:X} + {})".format(Y,X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0), F(c) ], [ 0xDD, 0x8E, 0x0C ], 19, [ (PC==0x03), (A == (X+Y+c)&0xFF), (F==f) ], "ADC (IX+0CH) (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0), F(c) ], [ 0xFD, 0x8E, 0x0C ], 19, [ (PC==0x03), (A == (X+Y+c)&0xFF), (F==f) ], "ADC (IY+0CH) (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for (X,f,c) in [ (0x0A, 0x10,0),
(0x0A, 0x10,1),
(0x40, 0x84,0),
(0x40, 0x84,1),
(0x81, 0x05,0),
(0x81, 0x05,1),
(0x80, 0x45,0),
(0x80, 0x05,1)]:
tests = [
[ [ A(X), F(c) ], [ 0x8F ], 4, [ (PC==0x01), (A == (X+X+c)&0xFF), (F==f) ], "ADC A (0x{:X} + 0x{:X} + {})".format(X,X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sub(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X, Y, f) in [ (0x0A, 0xF5, 0x07),
(0x40, 0xAF, 0x93),
(0xFF, 0xFE, 0x02),
(0xFF, 0xFF, 0x42) ]:
tests = [
[ [ A(X), B(Y) ], [ 0x90 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB B (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0x91 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB C (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0x92 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB D (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0x93 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB E (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0x94 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB H (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0x95 ], 4, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB L (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0x96 ], 7, [ (PC==0x01), (A == (X-Y)&0xFF), (F==f) ], "SUB (HL) (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xD6, Y ], 7, [ (PC==0x02), (A == (X-Y)&0xFF), (F==f) ], "SUB {:X}H (0x{:X} - 0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0x96, 0x0C ], 19, [ (PC==0x03), (A == (X-Y)&0xFF), (F==f) ], "SUB (IX+0CH) (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0x96, 0x0C ], 19, [ (PC==0x03), (A == (X-Y)&0xFF), (F==f) ], "SUB (IY+0CH) (0x{:X} - 0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for X in [ 0x00,
0x80,
0xFF ]:
tests = [
[ [ A(X) ], [ 0x97 ], 4, [ (PC==0x01), (A == 0x00), (F==0x42) ], "SUB A (0x{:X} - 0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cp(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X, Y, f) in [ (0x0A, 0xF5, 0x07),
(0x40, 0xAF, 0x83),
(0xFF, 0xFE, 0x02),
(0xFF, 0xFF, 0x42) ]:
tests = [
[ [ A(X), B(Y) ], [ 0xB8 ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP B (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0xB9 ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP C (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0xBA ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP D (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0xBB ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP E (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0xBC ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP H (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0xBD ], 4, [ (PC==0x01), (A == X), (F==f) ], "CP L (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0xBE ], 7, [ (PC==0x01), (A == X), (F==f) ], "CP (HL) (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xFE, Y ], 7, [ (PC==0x02), (A == X), (F==f) ], "CP {:X}H (0x{:X} - 0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0xBE, 0x0C ], 19, [ (PC==0x03), (A == X), (F==f) ], "CP (IX+0CH) (0x{:X} - 0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0xBE, 0x0C ], 19, [ (PC==0x03), (A == X), (F==f) ], "CP (IY+0CH) (0x{:X} - 0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for X in [ 0x00,
0x80,
0xFF ]:
tests = [
[ [ A(X) ], [ 0xBF ], 4, [ (PC==0x01), (A == X), (F==0x42) ], "CP A (0x{:X} - 0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sbc(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X, Y, f, c) in [ (0x0A, 0xF5, 0x07, 0),
(0x0A, 0xF5, 0x07, 1),
(0x40, 0xAF, 0x93, 0),
(0x40, 0xAF, 0x93, 1),
(0xFF, 0xFE, 0x02, 0),
(0xFF, 0xFE, 0x42, 1),
(0xFF, 0xFF, 0x42, 0),
(0xFF, 0xFF, 0xBB, 1) ]:
tests = [
[ [ A(X), B(Y), F(c) ], [ 0x98 ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC B (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), C(Y), F(c) ], [ 0x99 ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC C (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), D(Y), F(c) ], [ 0x9A ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC D (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), E(Y), F(c) ], [ 0x9B ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC E (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), H(Y), F(c) ], [ 0x9C ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC H (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), L(Y), F(c) ], [ 0x9D ], 4, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC L (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC), F(c) ], [ 0x9E ], 7, [ (PC==0x01), (A == (X-Y-c)&0xFF), (F==f) ], "SBC (HL) (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), F(c) ], [ 0xDE, Y ], 7, [ (PC==0x02), (A == (X-Y-c)&0xFF), (F==f) ], "SBC {:X}H (0x{:X} - 0x{:X} - {})".format(Y,X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0), F(c) ], [ 0xDD, 0x9E, 0x0C ], 19, [ (PC==0x03), (A == (X-Y-c)&0xFF), (F==f) ], "SBC (IX+0CH) (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0), F(c) ], [ 0xFD, 0x9E, 0x0C ], 19, [ (PC==0x03), (A == (X-Y-c)&0xFF), (F==f) ], "SBC (IY+0CH) (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for X in [ 0x00,
0x80,
0xFF ]:
tests = [
[ [ A(X), F(0) ], [ 0x9F ], 4, [ (PC==0x01), (A == 0), (F==0x42) ], "SBC A (0x{:X} - 0x{:X} - 0)".format(X,X) ],
[ [ A(X), F(1) ], [ 0x9F ], 4, [ (PC==0x01), (A == 0xFF), (F==0xBB) ], "SBC A (0x{:X} - 0x{:X} - 1)".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_and(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X,Y, f) in [ (0x11, 0x45, 0x10),
(0x0A, 0xFF, 0x1C),
(0x0F, 0xF0, 0x54) ]:
tests = [
[ [ A(X), B(Y) ], [ 0xA0 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND B (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0xA1 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND C (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0xA2 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND D (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0xA3 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND E (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0xA4 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND H (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0xA5 ], 4, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND L (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0xA6 ], 7, [ (PC==0x01), (A == (X&Y)), (F==f) ], "AND (HL) (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xE6, Y ], 7, [ (PC==0x02), (A == (X&Y)), (F==f) ], "AND 0x{:X} (0x{:X}&0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0xA6, 0x0C ], 19, [ (PC==0x03), (A == (X&Y)), (F==f) ], "AND (IX+0CH) (0x{:X}&0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0xA6, 0x0C ], 19, [ (PC==0x03), (A == (X&Y)), (F==f) ], "AND (IY+0CH) (0x{:X}&0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for (X, f) in [ (0x11, 0x14),
(0x0A, 0x1C), ]:
tests = [
[ [ A(X) ], [ 0xA7 ], 4, [ (PC==0x01), (A == X), (F==f) ], "AND A (0x{:X}&0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_xor(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X,Y, f) in [ (0x11, 0x45, 0x00),
(0x0A, 0xFF, 0xA4),
(0x0F, 0xF0, 0xAC) ]:
tests = [
[ [ A(X), B(Y) ], [ 0xA8 ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR B (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0xA9 ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR C (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0xAA ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR D (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0xAB ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR E (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0xAC ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR H (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0xAD ], 4, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR L (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0xAE ], 7, [ (PC==0x01), (A == (X^Y)), (F==f) ], "XOR (HL) (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xEE, Y ], 7, [ (PC==0x02), (A == (X^Y)), (F==f) ], "XOR 0x{:X} (0x{:X}^0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0xAE, 0x0C ], 19, [ (PC==0x03), (A == (X^Y)), (F==f) ], "XOR (IX+0CH) (0x{:X}^0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0xAE, 0x0C ], 19, [ (PC==0x03), (A == (X^Y)), (F==f) ], "XOR (IY+0CH) (0x{:X}^0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for X in [ 0x11,
0x0A, ]:
tests = [
[ [ A(X) ], [ 0xAF ], 4, [ (PC==0x01), (A == 0x00), (F==0x44) ], "XOR A (0x{:X}^0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_or(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
for (X,Y, f) in [ (0x11, 0x45, 0x04),
(0x0A, 0xFF, 0xAC),
(0x0F, 0xF0, 0xAC) ]:
tests = [
[ [ A(X), B(Y) ], [ 0xB0 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR B (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), C(Y) ], [ 0xB1 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR C (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), D(Y) ], [ 0xB2 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR D (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), E(Y) ], [ 0xB3 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR E (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), H(Y) ], [ 0xB4 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR H (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), L(Y) ], [ 0xB5 ], 4, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR L (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), HL(0x1BBC) ], [ 0xB6 ], 7, [ (PC==0x01), (A == (X|Y)), (F==f) ], "OR (HL) (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X) ], [ 0xF6, Y ], 7, [ (PC==0x02), (A == (X|Y)), (F==f) ], "OR 0x{:X} (0x{:X}|0x{:X})".format(Y,X,Y) ],
[ [ A(X), M(0x1BBC,Y), IX(0x1BB0) ], [ 0xDD, 0xB6, 0x0C ], 19, [ (PC==0x03), (A == (X|Y)), (F==f) ], "OR (IX+0CH) (0x{:X}|0x{:X})".format(X,Y) ],
[ [ A(X), M(0x1BBC,Y), IY(0x1BB0) ], [ 0xFD, 0xB6, 0x0C ], 19, [ (PC==0x03), (A == (X|Y)), (F==f) ], "OR (IY+0CH) (0x{:X}|0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
for (X, f) in [ (0x11, 0x04),
(0x0A, 0x0C), ]:
tests = [
[ [ A(X) ], [ 0xB7 ], 4, [ (PC==0x01), (A == X), (F==f) ], "OR A (0x{:X}|0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_inc(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for (X, f) in [ (0x00, 0x00),
(0x01, 0x00),
(0x0F, 0x10),
(0xFF, 0x54),
]:
tests += [
[ [ B(X) ], [ 0x04 ], 4, [ (PC==0x01), (B == (X+1)&0xFF), (F==f) ], "INC B (0x{:X} + 1)".format(X) ],
[ [ C(X) ], [ 0x0C ], 4, [ (PC==0x01), (C == (X+1)&0xFF), (F==f) ], "INC C (0x{:X} + 1)".format(X) ],
[ [ D(X) ], [ 0x14 ], 4, [ (PC==0x01), (D == (X+1)&0xFF), (F==f) ], "INC D (0x{:X} + 1)".format(X) ],
[ [ E(X) ], [ 0x1C ], 4, [ (PC==0x01), (E == (X+1)&0xFF), (F==f) ], "INC E (0x{:X} + 1)".format(X) ],
[ [ H(X) ], [ 0x24 ], 4, [ (PC==0x01), (H == (X+1)&0xFF), (F==f) ], "INC H (0x{:X} + 1)".format(X) ],
[ [ L(X) ], [ 0x2C ], 4, [ (PC==0x01), (L == (X+1)&0xFF), (F==f) ], "INC L (0x{:X} + 1)".format(X) ],
[ [ M(0x1BBC,X), HL(0x1BBC) ], [ 0x34 ], 12, [ (PC==0x01), (M[0x1BBC] == (X+1)&0xFF), (F==f) ], "INC (HL) (0x{:X} + 1)".format(X) ],
[ [ A(X) ], [ 0x3C ], 4, [ (PC==0x01), (A == (X+1)&0xFF), (F==f) ], "INC A (0x{:X} + 1)".format(X) ],
[ [ M(0x1BBC,X), IX(0x1BB0) ], [ 0xDD, 0x34, 0x0C ], 23, [ (PC==0x03), (M[0x1BBC] == (X+1)&0xFF), (F==f) ], "INC (IX+0CH) (0x{:X} + 1)".format(X) ],
[ [ M(0x1BBC,X), IY(0x1BB0) ], [ 0xFD, 0x34, 0x0C ], 23, [ (PC==0x03), (M[0x1BBC] == (X+1)&0xFF), (F==f) ], "INC (IY+0CH) (0x{:X} + 1)".format(X) ],
]
for X in [ 0x0000,
0x0001,
0x00FF,
0x0100,
0xFF00,
0xFFFF ]:
tests += [
[ [ BC(X) ], [ 0x03 ], 4, [ (PC==0x01), (BC == (X+1)&0xFFFF) ], "INC BC (0x{:X} + 1)".format(X) ],
[ [ DE(X) ], [ 0x13 ], 4, [ (PC==0x01), (DE == (X+1)&0xFFFF) ], "INC DE (0x{:X} + 1)".format(X) ],
[ [ HL(X) ], [ 0x23 ], 4, [ (PC==0x01), (HL == (X+1)&0xFFFF) ], "INC HL (0x{:X} + 1)".format(X) ],
[ [ SP(X) ], [ 0x33 ], 4, [ (PC==0x01), (SP == (X+1)&0xFFFF) ], "INC SP (0x{:X} + 1)".format(X) ],
[ [ IX(X) ], [ 0xDD, 0x23 ], 8, [ (PC==0x02), (IX == (X+1)&0xFFFF) ], "INC IX (0x{:X} + 1)".format(X) ],
[ [ IY(X) ], [ 0xFD, 0x23 ], 8, [ (PC==0x02), (IY == (X+1)&0xFFFF) ], "INC IY (0x{:X} + 1)".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_dec(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for (X, f) in [ (0x00, 0xBA),
(0x01, 0x42),
(0x02, 0x02),
(0x10, 0x1A),
]:
tests += [
[ [ B(X) ], [ 0x05 ], 4, [ (PC==0x01), (B == (X-1)&0xFF), (F==f) ], "DEC B (0x{:X} - 1)".format(X) ],
[ [ C(X) ], [ 0x0D ], 4, [ (PC==0x01), (C == (X-1)&0xFF), (F==f) ], "DEC C (0x{:X} - 1)".format(X) ],
[ [ D(X) ], [ 0x15 ], 4, [ (PC==0x01), (D == (X-1)&0xFF), (F==f) ], "DEC D (0x{:X} - 1)".format(X) ],
[ [ E(X) ], [ 0x1D ], 4, [ (PC==0x01), (E == (X-1)&0xFF), (F==f) ], "DEC E (0x{:X} - 1)".format(X) ],
[ [ H(X) ], [ 0x25 ], 4, [ (PC==0x01), (H == (X-1)&0xFF), (F==f) ], "DEC H (0x{:X} - 1)".format(X) ],
[ [ L(X) ], [ 0x2D ], 4, [ (PC==0x01), (L == (X-1)&0xFF), (F==f) ], "DEC L (0x{:X} - 1)".format(X) ],
[ [ M(0x1BBC,X), HL(0x1BBC) ], [ 0x35 ], 12, [ (PC==0x01), (M[0x1BBC] == (X-1)&0xFF), (F==f) ], "DEC (HL) (0x{:X} - 1)".format(X) ],
[ [ A(X) ], [ 0x3D ], 4, [ (PC==0x01), (A == (X-1)&0xFF), (F==f) ], "DEC A (0x{:X} - 1)".format(X) ],
[ [ M(0x1BBC,X), IX(0x1BB0) ], [ 0xDD, 0x35, 0x0C ], 23, [ (PC==0x03), (M[0x1BBC] == (X-1)&0xFF), (F==f) ], "DEC (IX+0CH) (0x{:X} - 1)".format(X) ],
[ [ M(0x1BBC,X), IY(0x1BB0) ], [ 0xFD, 0x35, 0x0C ], 23, [ (PC==0x03), (M[0x1BBC] == (X-1)&0xFF), (F==f) ], "DEC (IY+0CH) (0x{:X} - 1)".format(X) ],
]
for X in [ 0x0000,
0x0001,
0x00FF,
0x0100,
0xFF00,
0xFFFF ]:
tests += [
[ [ BC(X) ], [ 0x0B ], 4, [ (PC==0x01), (BC == (X-1)&0xFFFF) ], "DEC BC (0x{:X} - 1)".format(X) ],
[ [ DE(X) ], [ 0x1B ], 4, [ (PC==0x01), (DE == (X-1)&0xFFFF) ], "DEC DE (0x{:X} - 1)".format(X) ],
[ [ HL(X) ], [ 0x2B ], 4, [ (PC==0x01), (HL == (X-1)&0xFFFF) ], "DEC HL (0x{:X} - 1)".format(X) ],
[ [ SP(X) ], [ 0x3B ], 4, [ (PC==0x01), (SP == (X-1)&0xFFFF) ], "DEC SP (0x{:X} - 1)".format(X) ],
[ [ IX(X) ], [ 0xDD, 0x2B ], 8, [ (PC==0x02), (IX == (X-1)&0xFFFF) ], "DEC IX (0x{:X} - 1)".format(X) ],
[ [ IY(X) ], [ 0xFD, 0x2B ], 8, [ (PC==0x02), (IY == (X-1)&0xFFFF) ], "DEC IY (0x{:X} - 1)".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_daa(self):
def bcd(n):
# pack the two low decimal digits of n into packed-BCD form, e.g. 42 -> 0x42
return ((((n//10)%10) << 4) + (n%10))&0xFF
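# Worked examples of the packing above (values traced by hand):
#   bcd(7)  -> 0x07
#   bcd(10) -> 0x10
#   bcd(99) -> 0x99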
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for x in range(0,100):
for y in range(0,100):
X = bcd(x)
Y = bcd(y)
h = 1 if ((x%10) + (y%10)) > 0xF else 0   # half-carry out of the low BCD digit on addition
h_ = 1 if ((x%10) - (y%10)) < 0 else 0    # half-borrow from the low BCD digit on subtraction
c = 1 if (X + Y) > 0xFF else 0            # carry out of the byte
c_ = 1 if (X - Y) < 0 else 0              # borrow out of the byte
tests += [
[ [ A((X + Y)&0xFF), F((h << 4) + c) ], [ 0x27 ], 4, [ (PC==0x01), (A == bcd(x+y)) ], "DAA (after {} + {})".format(x,y) ],
[ [ A((X - Y)&0xFF), F((h_ << 4) + c_ + 0x02) ], [ 0x27 ], 4, [ (PC==0x01), (A == bcd(x-y)) ], "DAA (after {} - {})".format(x,y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_cpl(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for X in range(0,256):
tests += [
[ [ A(X) ], [ 0x2F ], 4, [ (PC==0x01), (A == (~X)&0xFF) ], "CPL (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_neg(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for X in range(0,256):
tests += [
[ [ A(X) ], [ 0xED, 0x44 ], 8, [ (PC==0x02), (A == (256 - X)&0xFF) ], "NEG (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ccf(self):
def ccf(x):
# expected F after CCF: C (bit 0) and H (bit 4) are toggled, N (bit 1) is reset
if x&0x01:
x &= 0xFE
else:
x |= 0x01
if x&0x10:
x &= 0xEF
else:
x |= 0x10
x &= 0xFD
return x
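# Traced by hand against the helper above:
#   ccf(0x00) -> 0x11 (C and H set)
#   ccf(0x01) -> 0x10 (C cleared, H set)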
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for X in range(0,256):
tests += [
[ [ F(X) ], [ 0x3F ], 4, [ (PC==0x01), (F == ccf(X)) ], "CCF (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_scf(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for X in range(0,256):
for Y in [ (1 << n) for n in range(0,8) ]:  # A values with a single bit set (bits 5 and 3 of A are copied into F)
tests += [
[ [ F(X), A(Y) ], [ 0x37 ], 4, [ (PC==0x01), (F == ((X&0xC4) | (Y&0x28) | 0x01)) ], "SCF (of F=0x{:X}, A=0x{:X})".format(X,Y) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_add16(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for (X,Y,f) in [ (0x000A, 0x000B, 0x00),
(0x0040, 0x0051, 0x00),
(0x00FF, 0x0002, 0x00),
(0x00FF, 0x0001, 0x00),
(0x0100, 0x0001, 0x00),
(0xFF00, 0x0100, 0x11),
(0xFFFF, 0x0001, 0x11),
]:
tests += [
[ [ BC(X), HL(Y) ], [ 0x09 ], 11, [ (PC==0x01), (HL == (X+Y)&0xFFFF), (F==f) ], "ADD HL,BC (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ DE(X), HL(Y) ], [ 0x19 ], 11, [ (PC==0x01), (HL == (X+Y)&0xFFFF), (F==f) ], "ADD HL,DE (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ SP(X), HL(Y) ], [ 0x39 ], 11, [ (PC==0x01), (HL == (X+Y)&0xFFFF), (F==f) ], "ADD HL,SP (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ BC(X), IX(Y) ], [ 0xDD, 0x09 ], 15, [ (PC==0x02), (IX == (X+Y)&0xFFFF), (F==f) ], "ADD IX,BC (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ DE(X), IX(Y) ], [ 0xDD, 0x19 ], 15, [ (PC==0x02), (IX == (X+Y)&0xFFFF), (F==f) ], "ADD IX,DE (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ SP(X), IX(Y) ], [ 0xDD, 0x39 ], 15, [ (PC==0x02), (IX == (X+Y)&0xFFFF), (F==f) ], "ADD IX,SP (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ BC(X), IY(Y) ], [ 0xFD, 0x09 ], 15, [ (PC==0x02), (IY == (X+Y)&0xFFFF), (F==f) ], "ADD IY,BC (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ DE(X), IY(Y) ], [ 0xFD, 0x19 ], 15, [ (PC==0x02), (IY == (X+Y)&0xFFFF), (F==f) ], "ADD IY,DE (0x{:X} + 0x{:X})".format(X,Y) ],
[ [ SP(X), IY(Y) ], [ 0xFD, 0x39 ], 15, [ (PC==0x02), (IY == (X+Y)&0xFFFF), (F==f) ], "ADD IY,SP (0x{:X} + 0x{:X})".format(X,Y) ],
]
for (X,f) in [ (0x000A, 0x00),
(0x0040, 0x00),
(0x00FF, 0x00),
(0x0100, 0x00),
(0xFF00, 0x39),
(0xFFFF, 0x39) ]:
tests += [
[ [ HL(X) ], [ 0x29 ], 11, [ (PC==0x01), (HL == (X+X)&0xFFFF), (F==f) ], "ADD HL,HL (0x{:X} + 0x{:X})".format(X,X) ],
[ [ IX(X) ], [ 0xDD, 0x29 ], 15, [ (PC==0x02), (IX == (X+X)&0xFFFF), (F==f) ], "ADD IX,IX (0x{:X} + 0x{:X})".format(X,X) ],
[ [ IY(X) ], [ 0xFD, 0x29 ], 15, [ (PC==0x02), (IY == (X+X)&0xFFFF), (F==f) ], "ADD IY,IY (0x{:X} + 0x{:X})".format(X,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_adc16(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for (X,Y,c,f) in [ (0x000A, 0x000B, 0, 0x00),
(0x0040, 0x0051, 0, 0x00),
(0x00FF, 0x0002, 0, 0x00),
(0x00FF, 0x0001, 0, 0x00),
(0x0100, 0x0001, 0, 0x00),
(0xFF00, 0x0100, 0, 0x55),
(0xFFFF, 0x0001, 0, 0x55),
(0x000A, 0x000A, 1, 0x00),
(0x0040, 0x0050, 1, 0x00),
(0x00FF, 0x0001, 1, 0x00),
(0x00FF, 0x0000, 1, 0x00),
(0x0100, 0x0000, 1, 0x00),
(0xFF00, 0x00FF, 1, 0x55),
(0xFFFF, 0x0000, 1, 0x55),
]:
tests += [
[ [ BC(X), HL(Y), F(c) ], [ 0xED, 0x4A ], 15, [ (PC==0x02), (HL == (X+Y+c)&0xFFFF), (F==f) ], "ADC HL,BC (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ DE(X), HL(Y), F(c) ], [ 0xED, 0x5A ], 15, [ (PC==0x02), (HL == (X+Y+c)&0xFFFF), (F==f) ], "ADC HL,DE (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
[ [ SP(X), HL(Y), F(c) ], [ 0xED, 0x7A ], 15, [ (PC==0x02), (HL == (X+Y+c)&0xFFFF), (F==f) ], "ADC HL,SP (0x{:X} + 0x{:X} + {})".format(X,Y,c) ],
]
for (X,c,f) in [ (0x000A, 0, 0x00),
(0x0040, 0, 0x00),
(0x00FF, 0, 0x00),
(0x0100, 0, 0x00),
(0xFF00, 0, 0xBD),
(0xFFFF, 0, 0xBD),
(0x000A, 1, 0x00),
(0x0040, 1, 0x00),
(0x00FF, 1, 0x00),
(0x0100, 1, 0x00),
(0xFF00, 1, 0xBD),
(0xFFFF, 1, 0xBD)]:
tests += [
[ [ HL(X), F(c) ], [ 0xED, 0x6A ], 15, [ (PC==0x02), (HL == (X+X+c)&0xFFFF), (F==f) ], "ADC HL,HL (0x{:X} + 0x{:X} + {})".format(X,X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sbc16(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = []
for (X,Y,c,f) in [ (0x000A, 0x000B, 0, 0xAE),
(0x0040, 0x0051, 0, 0xAE),
(0x00FF, 0x0002, 0, 0x17),
(0x00FF, 0x0001, 0, 0x17),
(0x0100, 0x0001, 0, 0x17),
(0xFF00, 0x0100, 0, 0xBF),
(0xFFFF, 0x0001, 0, 0xBF),
(0x000A, 0x000A, 1, 0xAE),
(0x0040, 0x0050, 1, 0xAE),
(0x00FF, 0x0001, 1, 0x17),
(0x00FF, 0x0000, 1, 0x02),
(0x0100, 0x0000, 1, 0x02),
(0xFF00, 0x00FF, 1, 0xBF),
(0xFFFF, 0x0000, 1, 0xAE),
]:
tests += [
[ [ BC(Y), HL(X), F(c) ], [ 0xED, 0x42 ], 15, [ (PC==0x02), (HL == (X-Y-c)&0xFFFF), (F==f) ], "SBC HL,BC (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ DE(Y), HL(X), F(c) ], [ 0xED, 0x52 ], 15, [ (PC==0x02), (HL == (X-Y-c)&0xFFFF), (F==f) ], "SBC HL,DE (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
[ [ SP(Y), HL(X), F(c) ], [ 0xED, 0x72 ], 15, [ (PC==0x02), (HL == (X-Y-c)&0xFFFF), (F==f) ], "SBC HL,SP (0x{:X} - 0x{:X} - {})".format(X,Y,c) ],
]
for (X,c,f) in [ (0x000A, 0, 0x57),
(0x0040, 0, 0x57),
(0x00FF, 0, 0x57),
(0x0100, 0, 0x57),
(0xFF00, 0, 0x57),
(0xFFFF, 0, 0x57),
(0x000A, 1, 0xAE),
(0x0040, 1, 0xAE),
(0x00FF, 1, 0xAE),
(0x0100, 1, 0xAE),
(0xFF00, 1, 0xAE),
(0xFFFF, 1, 0xAE)]:
tests += [
[ [ HL(X), F(c) ], [ 0xED, 0x62 ], 15, [ (PC==0x02), (HL == (-c)&0xFFFF), (F==f) ], "SBC HL,HL (0x{:X} - 0x{:X} - {})".format(X,X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rlca(self):
tests = []
for (X,f) in [ (0x00, 0x00),
(0x01, 0x00),
(0x80, 0x01),
(0xF0, 0x21),
(0xFF, 0x29),
(0x7F, 0x28)
]:
tests += [
[ [ A(X) ], [ 0x07 ], 4, [ (A == ((X << 1) + (X >> 7))&0xFF), (F == f) ], "RLCA (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rrca(self):
tests = []
for (X,f) in [ (0x00, 0x00),
(0x01, 0x01),
(0x80, 0x00),
(0xF0, 0x28),
(0xFF, 0x29),
(0x7F, 0x29)
]:
tests += [
[ [ A(X) ], [ 0x0F ], 4, [ (A == ((X >> 1) + ((X&0x1) << 7))&0xFF), (F == f) ], "RRCA (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rla(self):
tests = []
for (X,c,f) in [(0x00, 0, 0x00),
(0x00, 1, 0x00),
(0x01, 0, 0x00),
(0x01, 1, 0x00),
(0x80, 0, 0x01),
(0x80, 1, 0x01),
(0xF0, 0, 0x21),
(0xF0, 1, 0x21),
(0xFF, 0, 0x29),
(0xFF, 1, 0x29),
(0x7F, 0, 0x28),
(0x7F, 1, 0x28)
]:
tests += [
[ [ A(X), F(c) ], [ 0x17 ], 4, [ (A == ((X << 1) + c)&0xFF), (F == f) ], "RLA (of 0x{:X} with C={})".format(X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rra(self):
tests = []
for (X,c,f) in [(0x00, 0, 0x00),
(0x00, 1, 0x00),
(0x01, 0, 0x01),
(0x01, 1, 0x01),
(0x80, 0, 0x00),
(0x80, 1, 0x00),
(0xF0, 0, 0x28),
(0xF0, 1, 0x28),
(0xFF, 0, 0x29),
(0xFF, 1, 0x29),
(0x7F, 0, 0x29),
(0x7F, 1, 0x29)
]:
tests += [
[ [ A(X), F(c) ], [ 0x1F ], 4, [ (A == ((X >> 1) + (c << 7))&0xFF), (F == f) ], "RRA (of 0x{:X} with C={})".format(X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rlc(self):
tests = []
for (X,f) in [ (0x00, 0x00),
(0x01, 0x00),
(0x80, 0x01),
(0xF0, 0x21),
(0xFF, 0x29),
(0x7F, 0x28) ]:
for (r,i) in [ ('B', 0x00),
('C', 0x01),
('D', 0x02),
('E', 0x03),
('H', 0x04),
('L', 0x05),
('A', 0x07) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X << 1) + (X >> 7))&0xFF), (F == f) ], "RLC {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x06 ], 15, [ (M[0x1BBC] == ((X << 1) + (X >> 7))&0xFF), (F == f) ], "RLC (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x06 ], 23, [ (M[0x1BBC] == ((X << 1) + (X >> 7))&0xFF), (F == f) ], "RLC (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x06 ], 23, [ (M[0x1BBC] == ((X << 1) + (X >> 7))&0xFF), (F == f) ], "RLC (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rrc(self):
tests = []
for (X,f) in [ (0x00, 0x00),
(0x01, 0x01),
(0x80, 0x00),
(0xF0, 0x28),
(0xFF, 0x29),
(0x7F, 0x29) ]:
for (r,i) in [ ('B', 0x08),
('C', 0x09),
('D', 0x0A),
('E', 0x0B),
('H', 0x0C),
('L', 0x0D),
('A', 0x0F) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X >> 1) + (X << 7))&0xFF), (F == f) ], "RRC {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x0E ], 15, [ (M[0x1BBC] == ((X >> 1) + (X << 7))&0xFF), (F == f) ], "RRC (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x0E ], 23, [ (M[0x1BBC] == ((X >> 1) + (X << 7))&0xFF), (F == f) ], "RRC (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x0E ], 23, [ (M[0x1BBC] == ((X >> 1) + (X << 7))&0xFF), (F == f) ], "RRC (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rl(self):
tests = []
for (X,c,f) in [(0x00, 0, 0x00),
(0x00, 1, 0x00),
(0x01, 0, 0x00),
(0x01, 1, 0x00),
(0x80, 0, 0x01),
(0x80, 1, 0x01),
(0xF0, 0, 0x21),
(0xF0, 1, 0x21),
(0xFF, 0, 0x29),
(0xFF, 1, 0x29),
(0x7F, 0, 0x28),
(0x7F, 1, 0x28)
]:
for (r,i) in [ ('B', 0x10),
('C', 0x11),
('D', 0x12),
('E', 0x13),
('H', 0x14),
('L', 0x15),
('A', 0x17) ]:
tests += [
[ [ set_register_to(r,X), F(c) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X << 1) + c)&0xFF), (F == f) ], "RL {} (of 0x{:X} with C={})".format(r,X,c) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC), F(c) ], [ 0xCB, 0x16 ], 15, [ (M[0x1BBC] == ((X << 1) + c)&0xFF), (F == f) ], "RL (HL) (of 0x{:X} with C={})".format(X,c) ],
[ [ M(0x1BBC, X), IX(0x1BB0), F(c) ], [ 0xDD, 0xCB, 0x0C, 0x16 ], 23, [ (M[0x1BBC] == ((X << 1) + c)&0xFF), (F == f) ], "RL (IX+0CH) (of 0x{:X} with C={})".format(X,c) ],
[ [ M(0x1BBC, X), IY(0x1BB0), F(c) ], [ 0xFD, 0xCB, 0x0C, 0x16 ], 23, [ (M[0x1BBC] == ((X << 1) + c)&0xFF), (F == f) ], "RL (IY+0CH) (of 0x{:X} with C={})".format(X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rr(self):
tests = []
for (X,c,f) in [(0x00, 0, 0x00),
(0x00, 1, 0x00),
(0x01, 0, 0x01),
(0x01, 1, 0x01),
(0x80, 0, 0x00),
(0x80, 1, 0x00),
(0xF0, 0, 0x28),
(0xF0, 1, 0x28),
(0xFF, 0, 0x29),
(0xFF, 1, 0x29),
(0x7F, 0, 0x29),
(0x7F, 1, 0x29)
]:
for (r,i) in [ ('B', 0x18),
('C', 0x19),
('D', 0x1A),
('E', 0x1B),
('H', 0x1C),
('L', 0x1D),
('A', 0x1F) ]:
tests += [
[ [ set_register_to(r,X), F(c) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X >> 1) + (c << 7))&0xFF), (F == f) ], "RR {} (of 0x{:X} with C={})".format(r,X,c) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC), F(c) ], [ 0xCB, 0x1E ], 15, [ (M[0x1BBC] == ((X >> 1) + (c << 7))&0xFF), (F == f) ], "RR (HL) (of 0x{:X} with C={})".format(X,c) ],
[ [ M(0x1BBC, X), IX(0x1BB0), F(c) ], [ 0xDD, 0xCB, 0x0C, 0x1E ], 23, [ (M[0x1BBC] == ((X >> 1) + (c << 7))&0xFF), (F == f) ], "RR (IX+0CH) (of 0x{:X} with C={})".format(X,c) ],
[ [ M(0x1BBC, X), IY(0x1BB0), F(c) ], [ 0xFD, 0xCB, 0x0C, 0x1E ], 23, [ (M[0x1BBC] == ((X >> 1) + (c << 7))&0xFF), (F == f) ], "RR (IY+0CH) (of 0x{:X} with C={})".format(X,c) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sla(self):
tests = []
for (X,f) in [(0x00, 0x00),
(0x01, 0x00),
(0x80, 0x01),
(0xF0, 0x21),
(0xFF, 0x29),
(0x7F, 0x28), ]:
for (r,i) in [ ('B', 0x20),
('C', 0x21),
('D', 0x22),
('E', 0x23),
('H', 0x24),
('L', 0x25),
('A', 0x27) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X << 1))&0xFF), (F == f) ], "SLA {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x26 ], 15, [ (M[0x1BBC] == (X << 1)&0xFF), (F == f) ], "SLA (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x26 ], 23, [ (M[0x1BBC] == (X << 1)&0xFF), (F == f) ], "SLA (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x26 ], 23, [ (M[0x1BBC] == (X << 1)&0xFF), (F == f) ], "SLA (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sra(self):
tests = []
for (X,f) in [(0x00, 0x00),
(0x01, 0x01),
(0x80, 0x00),
(0xF0, 0x28),
(0xFF, 0x29),
(0x7F, 0x29) ]:
for (r,i) in [ ('B', 0x28),
('C', 0x29),
('D', 0x2A),
('E', 0x2B),
('H', 0x2C),
('L', 0x2D),
('A', 0x2F) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X >> 1) | (X&0x80))&0xFF), (F == f) ], "SRA {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x2E ], 15, [ (M[0x1BBC] == ((X >> 1) | (X&0x80))&0xFF), (F == f) ], "SRA (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x2E ], 23, [ (M[0x1BBC] == ((X >> 1) | (X&0x80))&0xFF), (F == f) ], "SRA (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x2E ], 23, [ (M[0x1BBC] == ((X >> 1) | (X&0x80))&0xFF), (F == f) ], "SRA (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_sl1(self):
tests = []
for (X,f) in [(0x00, 0x00),
(0x01, 0x00),
(0x80, 0x01),
(0xF0, 0x21),
(0xFF, 0x29),
(0x7F, 0x28), ]:
for (r,i) in [ ('B', 0x30),
('C', 0x31),
('D', 0x32),
('E', 0x33),
('H', 0x34),
('L', 0x35),
('A', 0x37) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, ((X << 1) + 1)&0xFF), (F == f) ], "SL1 {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x36 ], 15, [ (M[0x1BBC] == ((X << 1) + 1)&0xFF), (F == f) ], "SL1 (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x36 ], 23, [ (M[0x1BBC] == ((X << 1) + 1)&0xFF), (F == f) ], "SL1 (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x36 ], 23, [ (M[0x1BBC] == ((X << 1) + 1)&0xFF), (F == f) ], "SL1 (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_srl(self):
tests = []
for (X,f) in [(0x00, 0x00),
(0x01, 0x01),
(0x80, 0x00),
(0xF0, 0x28),
(0xFF, 0x29),
(0x7F, 0x29) ]:
for (r,i) in [ ('B', 0x38),
('C', 0x39),
('D', 0x3A),
('E', 0x3B),
('H', 0x3C),
('L', 0x3D),
('A', 0x3F) ]:
tests += [
[ [ set_register_to(r,X) ], [ 0xCB, i ], 8, [ expect_register_equal(r, (X >> 1)), (F == f) ], "SRL {} (of 0x{:X})".format(r,X) ],
]
tests += [
[ [ M(0x1BBC, X), HL(0x1BBC) ], [ 0xCB, 0x3E ], 15, [ (M[0x1BBC] == (X >> 1)&0xFF), (F == f) ], "SRL (HL) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IX(0x1BB0) ], [ 0xDD, 0xCB, 0x0C, 0x3E ], 23, [ (M[0x1BBC] == (X >> 1)&0xFF), (F == f) ], "SRL (IX+0CH) (of 0x{:X})".format(X) ],
[ [ M(0x1BBC, X), IY(0x1BB0) ], [ 0xFD, 0xCB, 0x0C, 0x3E ], 23, [ (M[0x1BBC] == (X >> 1)&0xFF), (F == f) ], "SRL (IY+0CH) (of 0x{:X})".format(X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rld(self):
tests = [
[ [ A(0xF1), M(0x1BBC,0x23), HL(0x1BBC) ], [ 0xED, 0x6F ], 18, [ (A == 0x02), (M[0x1BBC] == 0x31), (F == 0x00) ], "RLD (of 0xF1 and 0x23)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rrd(self):
tests = [
[ [ A(0xF1), M(0x1BBC,0x23), HL(0x1BBC) ], [ 0xED, 0x67 ], 18, [ (A == 0x03), (M[0x1BBC] == 0x12), (F == 0x04) ], "RRD (of 0xF1 and 0x23)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_bit(self):
tests = []
for X in range(0,256):
for b in range(0,8):
# expected F: Z and PV set when the tested bit is 0; H always set; S and bits 5/3 taken from the masked result
f = ((1 - ((X >> b)&0x1))*0x44) + 0x10 + ((X&(1 << b))&0xA8)
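# A couple of values traced by hand against the formula above:
#   X=0x00, b=0: f = 0x44 + 0x10        = 0x54
#   X=0xFF, b=7: f = 0x10 + (0x80&0xA8) = 0x90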
for (reg, r) in [ ('B', 0x0), ('C', 0x1), ('D',0x2), ('E',0x3), ('H',0x4), ('L',0x5), ('A',0x7) ]:
i = 0x40 + (b << 3) + r  # CB-prefixed BIT b,r opcode: 01 bbb rrr
tests += [
[ [ set_register_to(reg,X) ], [ 0xCB, i ], 8, [ expect_register_equal(reg, X), (F == f) ], "BIT {},{} (of 0x{:X})".format(b,reg,X) ],
]
tests += [
[ [ HL(0x1BBC), M(0x1BBC, X) ], [ 0xCB, (0x46 + (b << 3)) ], 12, [ (M[0x1BBC] == X), (F == f) ], "BIT {},(HL) (of 0x{:X})".format(b,X) ],
[ [ IX(0x1BB0), M(0x1BBC, X) ], [ 0xDD, 0xCB, 0xC, (0x46 + (b << 3)) ], 20, [ (M[0x1BBC] == X), (F == f) ], "BIT {},(IX+0C) (of 0x{:X})".format(b,X) ],
[ [ IY(0x1BB0), M(0x1BBC, X) ], [ 0xFD, 0xCB, 0xC, (0x46 + (b << 3)) ], 20, [ (M[0x1BBC] == X), (F == f) ], "BIT {},(IY+0C) (of 0x{:X})".format(b,X) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_res(self):
tests = []
for b in range(0,8):
for (reg, r) in [ ('B', 0x0), ('C', 0x1), ('D',0x2), ('E',0x3), ('H',0x4), ('L',0x5), ('A',0x7) ]:
i = 0x80 + (b << 3) + r  # CB-prefixed RES b,r opcode: 10 bbb rrr
tests += [
[ [ set_register_to(reg,0xFF) ], [ 0xCB, i ], 8, [ expect_register_equal(reg, 0xFF - (1 << b)) ], "RES {},{}".format(b,reg) ],
]
tests += [
[ [ HL(0x1BBC), M(0x1BBC, 0xFF) ], [ 0xCB, (0x86 + (b << 3)) ], 15, [ (M[0x1BBC] == (0xFF - (1 << b))) ], "RES {},(HL)".format(b) ],
[ [ IX(0x1BB0), M(0x1BBC, 0xFF) ], [ 0xDD, 0xCB, 0xC, (0x86 + (b << 3)) ], 23, [ (M[0x1BBC] == (0xFF - (1 << b))) ], "RES {},(IX+0C)".format(b) ],
[ [ IY(0x1BB0), M(0x1BBC, 0xFF) ], [ 0xFD, 0xCB, 0xC, (0x86 + (b << 3)) ], 23, [ (M[0x1BBC] == (0xFF - (1 << b))) ], "RES {},(IY+0C)".format(b) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_set(self):
tests = []
for b in range(0,8):
for (reg, r) in [ ('B', 0x0), ('C', 0x1), ('D',0x2), ('E',0x3), ('H',0x4), ('L',0x5), ('A',0x7) ]:
i = 0xC0 + (b << 3) + r  # CB-prefixed SET b,r opcode: 11 bbb rrr
tests += [
[ [ set_register_to(reg,0x00) ], [ 0xCB, i ], 8, [ expect_register_equal(reg, (1 << b)) ], "SET {},{}".format(b,reg) ],
]
tests += [
[ [ HL(0x1BBC), M(0x1BBC, 0x00) ], [ 0xCB, (0xC6 + (b << 3)) ], 15, [ (M[0x1BBC] == (1 << b)) ], "SET {},(HL)".format(b) ],
[ [ IX(0x1BB0), M(0x1BBC, 0x00) ], [ 0xDD, 0xCB, 0xC, (0xC6 + (b << 3)) ], 23, [ (M[0x1BBC] == (1 << b)) ], "SET {},(IX+0C)".format(b) ],
[ [ IY(0x1BB0), M(0x1BBC, 0x00) ], [ 0xFD, 0xCB, 0xC, (0xC6 + (b << 3)) ], 23, [ (M[0x1BBC] == (1 << b)) ], "SET {},(IY+0C)".format(b) ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_jp(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [], [ 0xC3, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP 01BBCH" ],
[ [ F(0x00) ], [ 0xDA, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP C,01BBCH (no jump)" ],
[ [ F(0x01) ], [ 0xDA, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP C,01BBCH (jump)" ],
[ [ F(0x01) ], [ 0xD2, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP NC,01BBCH (no jump)" ],
[ [ F(0x00) ], [ 0xD2, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP NC,01BBCH (jump)" ],
[ [ F(0x00) ], [ 0xCA, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP Z,01BBCH (no jump)" ],
[ [ F(0x40) ], [ 0xCA, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP Z,01BBCH (jump)" ],
[ [ F(0x40) ], [ 0xC2, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP NZ,01BBCH (no jump)" ],
[ [ F(0x00) ], [ 0xC2, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP NZ,01BBCH (jump)" ],
[ [ F(0x00) ], [ 0xEA, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP PE,01BBCH (no jump)" ],
[ [ F(0x04) ], [ 0xEA, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP PE,01BBCH (jump)" ],
[ [ F(0x04) ], [ 0xE2, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP PO,01BBCH (no jump)" ],
[ [ F(0x00) ], [ 0xE2, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP PO,01BBCH (jump)" ],
[ [ F(0x00) ], [ 0xFA, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP M,01BBCH (no jump)" ],
[ [ F(0x80) ], [ 0xFA, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP M,01BBCH (jump)" ],
[ [ F(0x80) ], [ 0xF2, 0xBC, 0x1B ], 10, [ (PC == 0x0003) ], "JP P,01BBCH (no jump)" ],
[ [ F(0x00) ], [ 0xF2, 0xBC, 0x1B ], 10, [ (PC == 0x1BBC) ], "JP P,01BBCH (jump)" ],
[ [ HL(0x1BBC) ], [ 0xE9 ], 4, [ (PC == 0x1BBC) ], "JP (HL)" ],
[ [ IX(0x1BBC) ], [ 0xDD, 0xE9 ], 8, [ (PC == 0x1BBC) ], "JP (IX)" ],
[ [ IY(0x1BBC) ], [ 0xFD, 0xE9 ], 8, [ (PC == 0x1BBC) ], "JP (IY)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_jr(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [], [ 0x18, 0x08 ], 12, [ (PC == 0x000A) ], "JR 0AH" ],
[ [ F(0x00) ], [ 0x38, 0x08 ], 7, [ (PC == 0x0002) ], "JR C,0AH (no jump)" ],
[ [ F(0x01) ], [ 0x38, 0x08 ], 12, [ (PC == 0x000A) ], "JR C,0AH (jump)" ],
[ [ F(0x01) ], [ 0x30, 0x08 ], 7, [ (PC == 0x0002) ], "JR NC,0AH (no jump)" ],
[ [ F(0x00) ], [ 0x30, 0x08 ], 12, [ (PC == 0x000A) ], "JR NC,0AH (jump)" ],
[ [ F(0x00) ], [ 0x28, 0x08 ], 7, [ (PC == 0x0002) ], "JR Z,0AH (no jump)" ],
[ [ F(0x40) ], [ 0x28, 0x08 ], 12, [ (PC == 0x000A) ], "JR Z,0AH (jump)" ],
[ [ F(0x40) ], [ 0x20, 0x08 ], 7, [ (PC == 0x0002) ], "JR NZ,0AH (no jump)" ],
[ [ F(0x00) ], [ 0x20, 0x08 ], 12, [ (PC == 0x000A) ], "JR NZ,0AH (jump)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_djnz(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ B(0x02) ], [ 0x10, 0x08 ], 13, [ (PC == 0x000A), (B == 0x01) ], "DJNZ 0AH (with B == 0x02)" ],
[ [ B(0x01) ], [ 0x10, 0x08 ], 8, [ (PC == 0x0002), (B == 0x00) ], "DJNZ 0AH (with B == 0x01)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_call(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ PC(0x1231), SP(0x2BBC) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xCD, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL 1BBCH" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xDC, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL C,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x01) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xDC, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL C,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x01) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xD4, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL NC,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xD4, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL NC,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xCC, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL Z,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x40) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xCC, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL Z,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x40) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xC4, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL NZ,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xC4, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL NZ,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xEC, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL PE,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x04) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xEC, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL PE,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x04) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xE4, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL PO,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xE4, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL PO,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xFC, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL M,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x80) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xFC, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL M,1BBCH (jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x80) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xF4, 0xBC, 0x1B ], 10, [ (PC == 0x1234), (SP == 0x2BBC), ], "CALL P,1BBCH (no jump)" ],
[ [ PC(0x1231), SP(0x2BBC), F(0x00) ], [ 0xFF for _ in range(0,0x1231) ] + [ 0xF4, 0xBC, 0x1B ], 17, [ (PC == 0x1BBC), (SP == 0x2BBA), (M[0x2BBB] == 0x12), (M[0x2BBA] == 0x34) ], "CALL P,1BBCH (jump)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_ret(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
tests = [
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B) ], [ 0xC9 ], 10, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B) ], [ 0xED, 0x4D ], 14, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RETI" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), begin_nmi, ], [ 0xED, 0x45 ], 14, [ (PC == 0x1BBC), (SP == 0x2BBE), (expect_int_enabled) ], "RETN" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x01) ], [ 0xD8 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET C (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xD8 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET C (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xD0 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET NC (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x01) ], [ 0xD0 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET NC (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x40) ], [ 0xC8 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET Z (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xC8 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET Z (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xC0 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET NZ (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x40) ], [ 0xC0 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET NZ (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x04) ], [ 0xE8 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET PE (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xE8 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET PE (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xE0 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET PO (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x04) ], [ 0xE0 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET PO (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x80) ], [ 0xF8 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET M (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xF8 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET M (no jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x00) ], [ 0xF0 ], 11, [ (PC == 0x1BBC), (SP == 0x2BBE) ], "RET P (jump)" ],
[ [ SP(0x2BBC), M(0x2BBC,0xBC), M(0x2BBD,0x1B), F(0x80) ], [ 0xF0 ], 5, [ (PC == 0x0001), (SP == 0x2BBC) ], "RET P (no jump)" ],
]
for (pre, instructions, t_cycles, post, name) in tests:
self.execute_instructions(pre, instructions, t_cycles, post, name)
def test_rst(self):
# actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
# each RST opcode has the form 11 ttt 111; the call target is (opcode & 0x38)
tests = [
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xC7 ], 11, [ (PC == 0x0000), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 00H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xCF ], 11, [ (PC == 0x0008), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 08H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xD7 ], 11, [ (PC == 0x0010), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 10H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xDF ], 11, [ (PC == 0x0018), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 18H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xE7 ], 11, [ (PC == 0x0020), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 20H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xEF ], 11, [ (PC == 0x0028), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 28H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xF7 ], 11, [ (PC == 0x0030), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 30H" ],
[ [ PC(0x1233), SP(0x1BBC) ], ([ 0xFF ]*0x1233) + [ 0xFF ], 11, [ (PC == 0x0038), (SP == 0x1BBA), (M[0x1BBA] == 0x34), (M[0x1BBB] == 0x12) ], "RST 38H" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_in(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ A(0x55), IN(0xAB) ], [ 0xDB, 0xFE ], 11, [ (A == 0xAB), (IN == 0x55) ], "IN A,FEH" ],
[ [ A(0x55), IN(0xAB) ], [ 0xDB, 0x57 ], 11, [ (A == 0x00) ], "IN A,57H" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x40 ], 12, [ (B == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN B,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x48 ], 12, [ (C == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN C,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x50 ], 12, [ (D == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN D,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x58 ], 12, [ (E == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN E,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x60 ], 12, [ (H == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN H,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x68 ], 12, [ (L == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN L,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x70 ], 12, [ (IN == 0x55), (F == 0xA9) ], "IN F,(C)" ],
[ [ B(0x55), C(0xFE), IN(0xAB) ], [ 0xED, 0x78 ], 12, [ (A == 0xAB), (IN == 0x55), (F == 0xA8) ], "IN A,(C)" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_ini(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xA2 ], 16, [ (M[0x1BBC] == 0xAB), (IN == 0x02), (HL == 0x1BBD), (B == 0x01), (F == 0x00) ], "INI" ],
[ [ IN(0xAB), HL(0x1BBC), B(0x1), C(0xFE) ], [ 0xED, 0xA2 ], 16, [ (M[0x1BBC] == 0xAB), (IN == 0x01), (HL == 0x1BBD), (B == 0x00), (F == 0x44) ], "INI" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_inir(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xB2 ], 21, [ (PC == 0x00), (M[0x1BBC] == 0xAB), (IN == 0x02), (HL == 0x1BBD), (B == 0x01), (F == 0x00) ], "INIR" ],
[ [ IN(0xAB), HL(0x1BBC), B(0x1), C(0xFE) ], [ 0xED, 0xB2 ], 16, [ (PC == 0x02), (M[0x1BBC] == 0xAB), (IN == 0x01), (HL == 0x1BBD), (B == 0x00), (F == 0x44) ], "INIR" ],
[ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xB2 ], 37, [ (PC == 0x02), (M[0x1BBC] == 0xAB), (M[0x1BBD] == 0xAB), (IN == 0x01), (HL == 0x1BBE), (B == 0x00), (F == 0x44)], "INIR" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_ind(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
            [ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xAA ], 16, [ (M[0x1BBC] == 0xAB), (IN == 0x02), (HL == 0x1BBB), (B == 0x01), (F == 0x00) ], "IND" ],
            [ [ IN(0xAB), HL(0x1BBC), B(0x1), C(0xFE) ], [ 0xED, 0xAA ], 16, [ (M[0x1BBC] == 0xAB), (IN == 0x01), (HL == 0x1BBB), (B == 0x00), (F == 0x44) ], "IND" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_indr(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
            [ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xBA ], 21, [ (PC == 0x00), (M[0x1BBC] == 0xAB), (IN == 0x02), (HL == 0x1BBB), (B == 0x01), (F == 0x00) ], "INDR" ],
            [ [ IN(0xAB), HL(0x1BBC), B(0x1), C(0xFE) ], [ 0xED, 0xBA ], 16, [ (PC == 0x02), (M[0x1BBC] == 0xAB), (IN == 0x01), (HL == 0x1BBB), (B == 0x00), (F == 0x44) ], "INDR" ],
            [ [ IN(0xAB), HL(0x1BBC), B(0x2), C(0xFE) ], [ 0xED, 0xBA ], 37, [ (PC == 0x02), (M[0x1BBC] == 0xAB), (M[0x1BBB] == 0xAB), (IN == 0x01), (HL == 0x1BBA), (B == 0x00), (F == 0x44)], "INDR" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_out(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
            [ [ A(0x55) ], [ 0xD3, 0xFA ], 11, [ (OUT == (0x55, 0x55)) ], "OUT (FAH),A" ],
[ [ B(0x55), C(0xFA) ], [ 0xED, 0x41 ], 12, [ (OUT == (0x55, 0x55)) ], "OUT (C),B" ],
[ [ B(0x55), C(0xFA) ], [ 0xED, 0x49 ], 12, [ (OUT == (0x55, 0xFA)) ], "OUT (C),C" ],
[ [ B(0x55), C(0xFA), D(0xAB) ], [ 0xED, 0x51 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),D" ],
[ [ B(0x55), C(0xFA), E(0xAB) ], [ 0xED, 0x59 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),E" ],
[ [ B(0x55), C(0xFA), H(0xAB) ], [ 0xED, 0x61 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),H" ],
[ [ B(0x55), C(0xFA), L(0xAB) ], [ 0xED, 0x69 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),L" ],
[ [ B(0x55), C(0xFA), F(0xAB) ], [ 0xED, 0x71 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),F" ],
[ [ B(0x55), C(0xFA), A(0xAB) ], [ 0xED, 0x79 ], 12, [ (OUT == (0x55, 0xAB)) ], "OUT (C),A" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_outi(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xA3 ], 16, [ (OUT == (0x02,0xAB)), (HL == 0x1BBD), (B == 0x01), (F == 0x00) ], "OUTI" ],
[ [ HL(0x1BBC), B(0x1), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xA3 ], 16, [ (OUT == (0x01,0xAB)), (HL == 0x1BBD), (B == 0x00), (F == 0x44) ], "OUTI" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_outir(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xB3 ], 21, [ (PC == 0x00), (OUT == (0x02,0xAB)), (HL == 0x1BBD), (B == 0x01), (F == 0x00) ], "OUTIR" ],
[ [ HL(0x1BBC), B(0x1), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xB3 ], 16, [ (PC == 0x02), (OUT == (0x01,0xAB)), (HL == 0x1BBD), (B == 0x00), (F == 0x44) ], "OUTIR" ],
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB), M(0x1BBD,0xCD) ], [ 0xED, 0xB3 ], 37, [ (PC == 0x02), (OUT == (0x01,0xCD)), (HL == 0x1BBE), (B == 0x00), (F == 0x44) ], "OUTIR" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_outd(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xAB ], 16, [ (OUT == (0x02,0xAB)), (HL == 0x1BBB), (B == 0x01), (F == 0x00) ], "OUTD" ],
[ [ HL(0x1BBC), B(0x1), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xAB ], 16, [ (OUT == (0x01,0xAB)), (HL == 0x1BBB), (B == 0x00), (F == 0x44) ], "OUTD" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_outdr(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xBB ], 21, [ (PC == 0x00), (OUT == (0x02,0xAB)), (HL == 0x1BBB), (B == 0x01), (F == 0x00) ], "OUTDR" ],
[ [ HL(0x1BBC), B(0x1), C(0xFA), M(0x1BBC,0xAB) ], [ 0xED, 0xBB ], 16, [ (PC == 0x02), (OUT == (0x01,0xAB)), (HL == 0x1BBB), (B == 0x00), (F == 0x44) ], "OUTDR" ],
[ [ HL(0x1BBC), B(0x2), C(0xFA), M(0x1BBC,0xAB), M(0x1BBB,0xCD) ], [ 0xED, 0xBB ], 37, [ (PC == 0x02), (OUT == (0x01,0xCD)), (HL == 0x1BBA), (B == 0x00), (F == 0x44) ], "OUTDR" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_halt(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [], [ 0x76, 0xFF, 0xFF ], 100, [ (PC == 0x00) ], "HALT" ],
[ [ ei, IG(20,[]) ], [ 0x76, 0xFF, 0xFF ], 20, [ (PC == 0x01) ], "HALT" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_interrupt_mode0(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ ei, IM(0), IG(20,[ 0x06, 0x0B ]) ], [ 0x76, 0xFF, 0xFF ], 29, [ (PC == 0x01), (B == 0x0B), (IG == (True, True)), expect_int_disabled ], "Mode 0 Interrupt, interrupting HALT" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_interrupt_mode1(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ ei, IM(1), IG(20, []), SP(0x1BBC), PC(0x2BBC) ], ([0xFF] * 0x2BBC) + [ 0x76, 0xFF, 0xFF ], 33,
[ (PC == 0x0038), (M[0x1BBB] == 0x2B), (M[0x1BBA] == 0xBD), (IG == (True, True)), expect_int_disabled ], "Mode 1 Interrupt, interrupting HALT" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_interrupt_mode2(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[
[ ei, IM(2), IG(20, [ 0xBC ]), SP(0x1BBC), PC(0x2BBC), I(0x3B), M(0x3BBC, 0xFE), M(0x3BBD, 0xC0) ],
([0xFF] * 0x2BBC) + [ 0x76, 0xFF, 0xFF ],
39,
[ (PC == 0xC0FE), (M[0x1BBB] == 0x2B), (M[0x1BBA] == 0xBD), (IG == (True, True)), expect_int_disabled ],
"Mode 2 Interrupt, interrupting HALT"
],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_nmi(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ ei, IM(1), IG(20, [], True), SP(0x1BBC), PC(0x2BBC) ], ([0xFF] * 0x2BBC) + [ 0x76, 0xFF, 0xFF ], 33,
[ (PC == 0x0066), (M[0x1BBB] == 0x2B), (M[0x1BBA] == 0xBD), (IG == (True, True)), expect_int_disabled, expect_int_preserved ], "NMI, interrupting HALT" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_di(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ ei ], [ 0xF3 ], 4, [ expect_int_disabled, expect_int_not_preserved ], "DI" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_ei(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ di ], [ 0xFB ], 4, [ expect_int_enabled, expect_int_preserved ], "EI" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)

    def test_im(self):
        # actions taken first, instructions to execute, t-cycles to run for, expected conditions post, name
        tests = [
[ [ IM(1) ], [ 0xED, 0x46 ], 8, [ (IM == 0) ], "IM0" ],
[ [ IM(0) ], [ 0xED, 0x56 ], 8, [ (IM == 1) ], "IM1" ],
[ [ IM(0) ], [ 0xED, 0x5E ], 8, [ (IM == 2) ], "IM2" ],
        ]
        for (pre, instructions, t_cycles, post, name) in tests:
            self.execute_instructions(pre, instructions, t_cycles, post, name)
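A side note on the `test_rst` table above: every `RST` opcode encodes its restart address in bits 3 to 5, so the expected `PC` values can be derived directly from the opcodes. A small illustrative sketch (the `rst_target` helper is made up for illustration, not part of the test harness):

```python
# Each RST opcode has the form 0b11ttt111; ttt * 8 is the restart address.
def rst_target(opcode):
    return opcode & 0x38  # keep bits 3..5

# The opcodes exercised in test_rst map onto the expected PC values:
vectors = {0xC7: 0x00, 0xCF: 0x08, 0xD7: 0x10, 0xDF: 0x18,
           0xE7: 0x20, 0xEF: 0x28, 0xF7: 0x30, 0xFF: 0x38}
for op, addr in vectors.items():
    assert rst_target(op) == addr
```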
# torch_scope/__init__.py (xgeric/Torch-Scope, Apache-2.0)
__author__ = "Liyuan Liu"
__license__ = "Apache License 2.0"
__maintainer__ = "Liyuan Liu"
__email__ = "llychinalz@gmail.com"
from torch_scope.wrapper import wrapper, basic_wrapper
from torch_scope.sheet_writer import sheet_writer
from torch_scope.commands import run
from torch_scope.file_manager import cached_url
# Week 1/first.py (Bor1s/algorithmic_toolbox, MIT)
# Uses python3
import sys

data = sys.stdin.read()  # renamed from "input" to avoid shadowing the built-in
tokens = data.split()
a = int(tokens[0])
b = int(tokens[1])
if (0 <= a <= 9) and (0 <= b <= 9):  # the exercise constrains both digits to 0..9
    print(a + b)
# application_form/migrations/0013_alter_application_model.py (frwickst/apartment-application-service, MIT)
# Generated by Django 2.2.21 on 2021-06-03 12:55
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("application_form", "0012_auto_20210527_1000"),
    ]

    operations = [
        migrations.RemoveField(
            model_name="application",
            name="apartments",
        ),
        migrations.RemoveField(
            model_name="application",
            name="state",
        ),
        migrations.RemoveField(
            model_name="applicationapartment",
            name="apartment",
        ),
        migrations.AddField(
            model_name="application",
            name="created_at",
            field=models.DateTimeField(
                auto_now_add=True, default=django.utils.timezone.now
            ),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name="application",
            name="updated_at",
            field=models.DateTimeField(auto_now=True),
        ),
    ]
# src/rqt_emergency_stop/emergency_stop.py (tuw-robotics/tuw_rqt, BSD-3-Clause)
import os
import rospy
import rospkg

from qt_gui.plugin import Plugin
from python_qt_binding import loadUi
from python_qt_binding.QtWidgets import QWidget
from std_msgs.msg import Bool


class EmergencyStopPlugin(Plugin):

    def __init__(self, context):
        super(EmergencyStopPlugin, self).__init__(context)
        # Give QObjects reasonable names
        self.setObjectName('rqt_emergency_stop_plugin')

        # Process standalone plugin command-line arguments
        from argparse import ArgumentParser
        parser = ArgumentParser()
        # Add argument(s) to the parser.
        parser.add_argument("-q", "--quiet", action="store_true",
                            dest="quiet",
                            help="Put plugin in silent mode")
        args, unknowns = parser.parse_known_args(context.argv())
        if not args.quiet:
            print('arguments: ', args)
            print('unknowns: ', unknowns)

        # Create QWidget
        self._widget = QWidget()
        # Get path to UI file which should be in the "resource" folder of this package
        ui_file = os.path.join(rospkg.RosPack().get_path('rqt_emergency_stop'), 'resource', 'EmergencyStop.ui')
        # Extend the widget with all attributes and children from UI file
        loadUi(ui_file, self._widget)
        # Give QObjects reasonable names
        self._widget.setObjectName('EmergencyStopUi')
        # Show _widget.windowTitle on left-top of each plugin (when
        # it's set in _widget). This is useful when you open multiple
        # plugins at once. Also if you open multiple instances of your
        # plugin at once, these lines add number to make it easy to
        # tell from pane to pane.
        if context.serial_number() > 1:
            self._widget.setWindowTitle(self._widget.windowTitle() + (' (%d)' % context.serial_number()))
        # Add widget to the user interface
        context.add_widget(self._widget)

        self.stop_pub = rospy.Publisher('emergency_stop', Bool, queue_size=1)

        self._widget.stop_button.setCheckable(True)
        self._widget.lineEdit.textChanged.connect(self.line_edit_callback)
        self._widget.stop_button.toggled.connect(self.stop_button_callback)

    def shutdown_plugin(self):
        # TODO unregister all publishers here
        self.stop_pub.unregister()

    def save_settings(self, plugin_settings, instance_settings):
        # TODO save intrinsic configuration, usually using:
        # instance_settings.set_value(k, v)
        pass

    def restore_settings(self, plugin_settings, instance_settings):
        # TODO restore intrinsic configuration, usually using:
        # v = instance_settings.value(k)
        pass

    # def trigger_configuration(self):
    #     Comment in to signal that the plugin has a way to configure
    #     This will enable a setting button (gear icon) in each dock widget title bar
    #     Usually used to open a modal configuration dialog

    def stop_button_callback(self, checked):
        msg = Bool()
        if checked:
            print("emergency stop pressed")
            msg.data = True
        else:
            print("emergency stop released")
            msg.data = False
        self.stop_pub.publish(msg)

    def line_edit_callback(self, text):
        self.stop_pub.unregister()
        self.stop_pub = rospy.Publisher(text, Bool, queue_size=1)
# pyinsteon/handlers/start_all_linking.py (pyinsteon/pyinsteon, MIT)
"""Start All-Linking."""
from ..constants import AllLinkMode
from ..topics import START_ALL_LINKING
from . import ack_handler
from .outbound_base import OutboundHandlerBase


# pylint: disable=arguments-differ
class StartAllLinkingCommandHandler(OutboundHandlerBase):
    """Start All-Linking Command."""

    def __init__(self):
        """Init the StartAllLinkingCommandHandler class."""
        super().__init__(topic=START_ALL_LINKING)

    def send(self, mode: AllLinkMode, group: int = 0):
        """Send the Start All-Linking Command."""
        super().send(mode=mode, group=group)

    async def async_send(self, mode: AllLinkMode, group: int = 0):
        """Send the Start All-Linking Command."""
        return await super().async_send(mode=mode, group=group)

    @ack_handler
    def handle_ack(self, mode: AllLinkMode, group: int):
        """Handle the ACK message."""
        super().handle_ack(mode=mode, group=group)
# actions/lib/action.py (cognifloyd/stackstorm-victorops, Apache-2.0)
from st2common.runners.base_action import Action
class VictorOpsAction(Action):
    def __init__(self, config):
        super(VictorOpsAction, self).__init__(config)
        self.url = "{0}/{1}/{2}".format(self.config['url'], self.config['api_key'],
                                        self.config['routing_key'])
# instagram_api/response/model/text.py (Yuego/instagram_api, MIT)
from ..mapper import PropertyMapper, ApiInterfaceBase
from ..mapper.types import Timestamp, AnyType
__all__ = ['Text', 'TextInterface']
class TextInterface(ApiInterfaceBase):
    text: str


class Text(PropertyMapper, TextInterface):
    pass
# 2_Analysis/pop.py (AnthonyRChao/Problem-Solving-With-Algorithms-And-Data-Structures, MIT)
from timeit import Timer
popzero = Timer("x.pop(0)", "from __main__ import x")  # remove from the front of the list
popend = Timer("x.pop()", "from __main__ import x")    # remove from the end of the list

x = list(range(2000000))
print(popzero.timeit(number=1000))  # slow: each pop(0) shifts every remaining element left
x = list(range(2000000))
print(popend.timeit(number=1000))   # fast: pop() from the end is O(1)
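The comparison above generalizes: `list.pop(0)` must shift every remaining element, while `collections.deque` removes from either end in O(1). A sketch extending the measurement (the smaller list size and variable names here are my additions, not part of the original exercise):

```python
from timeit import timeit

n = 100_000  # smaller list than above, to keep the O(n) case quick
t_popzero = timeit("x.pop(0)", setup="x = list(range(%d))" % n, number=1000)
t_popend = timeit("x.pop()", setup="x = list(range(%d))" % n, number=1000)
t_popleft = timeit("x.popleft()",
                   setup="from collections import deque; x = deque(range(%d))" % n,
                   number=1000)

print("list.pop(0):     %.6f s" % t_popzero)   # O(n) per call
print("list.pop():      %.6f s" % t_popend)    # O(1) per call
print("deque.popleft(): %.6f s" % t_popleft)   # O(1) per call
```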
# moex/yahoo_finance/utils.py (ghostforpy/bonds-docker, MIT)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Yahoo! Finance market data downloader (+fix for Pandas Datareader)
# https://github.com/ranaroussi/yfinance
#
# Copyright 2017-2019 Ran Aroussi
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from __future__ import print_function
import requests as _requests
import re as _re
#import pandas as _pd
#import numpy as _np
import sys as _sys
try:
    import ujson as _json
except ImportError:
    import json as _json
'''
def empty_df(index=[]):
empty = _pd.DataFrame(index=index, data={
'Open': _np.nan, 'High': _np.nan, 'Low': _np.nan,
'Close': _np.nan, 'Adj Close': _np.nan, 'Volume': _np.nan})
empty.index.name = 'Date'
return empty
'''
def get_json(url, proxy=None):
    html = _requests.get(url=url, proxies=proxy).text

    if "QuoteSummaryStore" not in html:
        html = _requests.get(url=url, proxies=proxy).text
        if "QuoteSummaryStore" not in html:
            return {}

    json_str = html.split('root.App.main =')[1].split(
        '(this)')[0].split(';\n}')[0].strip()
    data = _json.loads(json_str)[
        'context']['dispatcher']['stores']['QuoteSummaryStore']

    # return data
    new_data = _json.dumps(data).replace('{}', 'null')
    new_data = _re.sub(
        r'\{[\'|\"]raw[\'|\"]:(.*?),(.*?)\}', r'\1', new_data)

    return _json.loads(new_data)
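The two clean-up steps at the end of `get_json` are easiest to see on a small sample: empty objects become `null`, then every `{"raw": ..., "fmt": ...}` wrapper collapses to its raw value, leaving plain JSON. A minimal sketch with made-up field names:

```python
import json
import re

sample = '{"marketCap": {"raw": 2000000, "fmt": "2M"}, "beta": {}}'
flattened = sample.replace('{}', 'null')                              # empty dicts -> null
flattened = re.sub(r'\{[\'|\"]raw[\'|\"]:(.*?),(.*?)\}', r'\1', flattened)  # keep only the raw value
print(json.loads(flattened))  # {'marketCap': 2000000, 'beta': None}
```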
def camel2title(o):
    return [_re.sub("([a-z])([A-Z])", r"\g<1> \g<2>", i).title() for i in o]
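`camel2title` turns camelCase keys into spaced, title-cased labels by inserting a space at each lower-to-upper boundary; for example:

```python
import re

def camel2title(o):
    return [re.sub("([a-z])([A-Z])", r"\g<1> \g<2>", i).title() for i in o]

print(camel2title(["totalRevenue", "grossProfit"]))  # ['Total Revenue', 'Gross Profit']
```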
'''
def auto_adjust(data):
df = data.copy()
ratio = df["Close"] / df["Adj Close"]
df["Adj Open"] = df["Open"] / ratio
df["Adj High"] = df["High"] / ratio
df["Adj Low"] = df["Low"] / ratio
df.drop(
["Open", "High", "Low", "Close"],
axis=1, inplace=True)
df.rename(columns={
"Adj Open": "Open", "Adj High": "High",
"Adj Low": "Low", "Adj Close": "Close"
}, inplace=True)
df = df[["Open", "High", "Low", "Close", "Volume"]]
return df[["Open", "High", "Low", "Close", "Volume"]]
def back_adjust(data):
""" back-adjusted data to mimic true historical prices """
df = data.copy()
ratio = df["Adj Close"] / df["Close"]
df["Adj Open"] = df["Open"] * ratio
df["Adj High"] = df["High"] * ratio
df["Adj Low"] = df["Low"] * ratio
df.drop(
["Open", "High", "Low", "Adj Close"],
axis=1, inplace=True)
df.rename(columns={
"Adj Open": "Open", "Adj High": "High",
"Adj Low": "Low"
}, inplace=True)
return df[["Open", "High", "Low", "Close", "Volume"]]
'''
'''
def parse_quotes(data, tz=None):
timestamps = data["timestamp"]
ohlc = data["indicators"]["quote"][0]
volumes = ohlc["volume"]
opens = ohlc["open"]
closes = ohlc["close"]
lows = ohlc["low"]
highs = ohlc["high"]
adjclose = closes
if "adjclose" in data["indicators"]:
adjclose = data["indicators"]["adjclose"][0]["adjclose"]
quotes = _pd.DataFrame({"Open": opens,
"High": highs,
"Low": lows,
"Close": closes,
"Adj Close": adjclose,
"Volume": volumes})
quotes.index = _pd.to_datetime(timestamps, unit="s")
quotes.sort_index(inplace=True)
if tz is not None:
quotes.index = quotes.index.tz_localize(tz)
return quotes
def parse_actions(data, tz=None):
dividends = _pd.DataFrame(columns=["Dividends"])
splits = _pd.DataFrame(columns=["Stock Splits"])
if "events" in data:
if "dividends" in data["events"]:
dividends = _pd.DataFrame(
data=list(data["events"]["dividends"].values()))
dividends.set_index("date", inplace=True)
dividends.index = _pd.to_datetime(dividends.index, unit="s")
dividends.sort_index(inplace=True)
if tz is not None:
dividends.index = dividends.index.tz_localize(tz)
dividends.columns = ["Dividends"]
if "splits" in data["events"]:
splits = _pd.DataFrame(
data=list(data["events"]["splits"].values()))
splits.set_index("date", inplace=True)
splits.index = _pd.to_datetime(splits.index, unit="s")
splits.sort_index(inplace=True)
if tz is not None:
splits.index = splits.index.tz_localize(tz)
splits["Stock Splits"] = splits["numerator"] / \
splits["denominator"]
splits = splits["Stock Splits"]
return dividends, splits
'''
# masonite/providers/ViewProvider.py (w3x10e8/core, MIT)
""" A View Service Provider """
from masonite.provider import ServiceProvider
from masonite.view import View
class ViewProvider(ServiceProvider):

    wsgi = False

    def register(self):
        view = View(self.app)
        self.app.bind('ViewClass', view)
        self.app.bind('View', view.render)

    def boot(self):
        pass
| 18.888889 | 45 | 0.652941 | 41 | 340 | 5.414634 | 0.512195 | 0.094595 | 0.099099 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238235 | 340 | 17 | 46 | 20 | 0.857143 | 0.067647 | 0 | 0 | 0 | 0 | 0.042071 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.1 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
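The provider above registers a View object and its bound `render` method under two names in the application container. A toy sketch of that bind/resolve pattern (not the real masonite container; class and method names here are illustrative):

```python
class Container:
    """Toy service container showing the bind/resolve pattern the
    ViewProvider above relies on."""
    def __init__(self):
        self._bindings = {}

    def bind(self, name, obj):
        self._bindings[name] = obj

    def make(self, name):
        return self._bindings[name]


class View:
    def __init__(self, app):
        self.app = app

    def render(self, template):
        return "rendered:%s" % template


app = Container()
view = View(app)
app.bind('ViewClass', view)    # the renderer object itself
app.bind('View', view.render)  # a bound method, callable once resolved
```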
ed22dd1b533099d068ff9fa035552382ca4f8c2d | 510 | py | Python | microservice_request/permissions.py | bandirom/django-microservice-request | 1527ef21edf2bae09940dbce60aa4d4c4e2befd5 | [
"BSD-3-Clause"
] | null | null | null | microservice_request/permissions.py | bandirom/django-microservice-request | 1527ef21edf2bae09940dbce60aa4d4c4e2befd5 | [
"BSD-3-Clause"
] | 2 | 2022-01-22T15:12:32.000Z | 2022-01-22T21:21:02.000Z | microservice_request/permissions.py | bandirom/django-microservice-request | 1527ef21edf2bae09940dbce60aa4d4c4e2befd5 | [
"BSD-3-Clause"
] | null | null | null | from json import dumps
from hashlib import md5
from django.conf import settings
from rest_framework.permissions import IsAuthenticated, IsAdminUser
class HasApiKeyOrIsAuthenticated(IsAuthenticated):
def has_permission(self, request, view):
if key := request.headers.get('Authorization'):
            token = key.split(" ")  # split() already returns a list
            # guard the length so a malformed header (no space-separated
            # key) cannot raise an IndexError
            if len(token) == 2 and token[0] == settings.API_KEY_HEADER \
                    and token[1] == settings.API_KEY:
                return True
return super().has_permission(request, view)
| 34 | 84 | 0.701961 | 60 | 510 | 5.866667 | 0.633333 | 0.073864 | 0.079545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007444 | 0.209804 | 510 | 14 | 85 | 36.428571 | 0.866005 | 0 | 0 | 0 | 0 | 0 | 0.027451 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.727273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
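The permission class above splits the Authorization header into a "scheme key" pair. A standalone sketch of a defensive version of that parse, which returns False on a malformed header instead of risking an IndexError (the helper name and scheme string are hypothetical):

```python
def api_key_matches(auth_header, expected_scheme, expected_key):
    """Parse an 'Authorization: <scheme> <key>' header defensively."""
    if not auth_header:
        return False
    parts = auth_header.split(" ", 1)
    # length guard: a header with no space-separated key is rejected,
    # not an exception
    return (len(parts) == 2
            and parts[0] == expected_scheme
            and parts[1] == expected_key)
```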
ed2c621291b4eb6718363c1693032edf9a8d1841 | 777 | py | Python | tests/unit/basket/middleware_tests.py | makielab/django-oscar | 0a325cd0f04a4278201872b2e163868b72b6fabe | [
"BSD-3-Clause"
] | null | null | null | tests/unit/basket/middleware_tests.py | makielab/django-oscar | 0a325cd0f04a4278201872b2e163868b72b6fabe | [
"BSD-3-Clause"
] | null | null | null | tests/unit/basket/middleware_tests.py | makielab/django-oscar | 0a325cd0f04a4278201872b2e163868b72b6fabe | [
"BSD-3-Clause"
] | null | null | null | from django.test import TestCase
from django.test.client import RequestFactory
from django.contrib.auth.models import AnonymousUser
from oscar.apps.basket import middleware
class TestBasketMiddleware(TestCase):
def setUp(self):
self.middleware = middleware.BasketMiddleware()
self.request = RequestFactory().get('/')
self.request.user = AnonymousUser()
self.middleware.process_request(self.request)
def test_basket_is_attached_to_request(self):
self.assertTrue(hasattr(self.request, 'basket'))
def test_strategy_is_attached_to_basket(self):
self.assertTrue(hasattr(self.request.basket, 'strategy'))
def test_strategy_is_attached_to_request(self):
self.assertTrue(hasattr(self.request, 'strategy'))
| 32.375 | 65 | 0.746461 | 91 | 777 | 6.197802 | 0.340659 | 0.117021 | 0.06383 | 0.132979 | 0.35461 | 0.35461 | 0.280142 | 0.195035 | 0.195035 | 0.195035 | 0 | 0 | 0.155727 | 777 | 23 | 66 | 33.782609 | 0.859756 | 0 | 0 | 0 | 0 | 0 | 0.029601 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
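The tests above verify that the middleware attaches `basket` and `strategy` attributes to the incoming request. The pattern itself needs nothing Django-specific; a minimal stand-in using plain namespaces:

```python
from types import SimpleNamespace

class BasketMiddleware:
    """Minimal stand-in (no Django required) for the pattern the tests
    above check: process_request hangs new attributes off the request."""
    def process_request(self, request):
        request.basket = SimpleNamespace(strategy="default")
        request.strategy = request.basket.strategy

request = SimpleNamespace(user="anonymous")
BasketMiddleware().process_request(request)
```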
ed39c04a44db30a0b11af5d1cbf938cea7df8d4b | 114 | py | Python | Emmanuel ANENE/Phase 1/Python Basic 1/Day4/conCat.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 6 | 2020-05-23T19:53:25.000Z | 2021-05-08T20:21:30.000Z | Emmanuel ANENE/Phase 1/Python Basic 1/Day4/conCat.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 8 | 2020-05-14T18:53:12.000Z | 2020-07-03T00:06:20.000Z | Emmanuel ANENE/Phase 1/Python Basic 1/Day4/conCat.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 39 | 2020-05-10T20:55:02.000Z | 2020-09-12T17:40:59.000Z | lst = ["Peace", "of", "the", "mind", 2, 4, 7]
# join every element (coerced to str) into one string; no loop is
# needed, since the join already consumes the whole list at once
place = "".join(map(str, lst))
print(place)
| 16.285714 | 45 | 0.5 | 19 | 114 | 3 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034884 | 0.245614 | 114 | 6 | 46 | 19 | 0.627907 | 0 | 0 | 0 | 0 | 0 | 0.122807 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ed3bef89fef954d3121d32288da7af21e3d430bd | 15,819 | py | Python | medline/database/db_initialization.py | aranibatta/medline | 1396ff2cd852f5b86c2f2b5648412dd27a38e69f | [
"MIT"
] | null | null | null | medline/database/db_initialization.py | aranibatta/medline | 1396ff2cd852f5b86c2f2b5648412dd27a38e69f | [
"MIT"
] | null | null | null | medline/database/db_initialization.py | aranibatta/medline | 1396ff2cd852f5b86c2f2b5648412dd27a38e69f | [
"MIT"
] | null | null | null | import urllib2, base64
from medline.database.db_wrapper import Database
from medline.database.parse_reference import parse_date
from medline.database.db_insert_data import extract_refs_from_xml, insert_data_to_db
from bs4 import BeautifulSoup
import pickle
import sqlite3
import codecs
import re
def download_xml_files(storage_folder, start=1, end=779):
url = 'https://ftp.nlm.nih.gov/projects/medleasebaseline/gz/'
files = ['medline15n%04d.xml.gz' % x for x in range(start, end + 1)]
print files
# files = ['medline15n0005.xml.gz']
for file in files:
print "Downloading {} to {}".format(file, storage_folder + file)
request = urllib2.Request(url + file)
base64string = base64.encodestring('%s:%s' % ('risi', 'stfdu')).replace('\n', '')
request.add_header("Authorization", "Basic %s" % base64string)
#result = urllib2.urlopen(request)
data = urllib2.urlopen(request).read()
local_file = open(storage_folder + file, 'wb')
local_file.write(data)
local_file.close()
def create_medline_db_mysql():
db = Database('stephan_local')
con, cur = db.connect(select_no_db=True)
cur.execute('''CREATE DATABASE IF NOT EXISTS medline
DEFAULT CHARACTER SET utf8
DEFAULT COLLATE utf8_general_ci;''')
cur.execute("USE medline;")
#con.commit()
cur.execute('''CREATE TABLE IF NOT EXISTS refs(
pmid varchar(20) NOT NULL UNIQUE,
pmid_version int,
e_location_id varchar(255),
ref_owner varchar(255),
ref_status varchar(255),
date_updated date,
title varchar(255) NOT NULL,
abstract text,
pages varchar(255),
date_pub_first date NOT NULL,
date_pub_print date,
date_pub_electronic date,
pub_medium varchar(255),
lang varchar(255),
times_cited int,
journal_id varchar(20) NOT NULL,
journal_volume varchar(10),
journal_issue varchar(10),
PRIMARY KEY(pmid)
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS journals(
nlm_id varchar(20) NOT NULL UNIQUE,
medline_ta varchar(255),
issn_print varchar(20),
issn_electronic varchar(20),
country varchar(20),
PRIMARY KEY(nlm_id)
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS refs_authors(
id int NOT NULL AUTO_INCREMENT,
ref_id varchar(20) NOT NULL,
last_name varchar(255),
fore_name varchar(255),
affiliation text,
PRIMARY KEY(id),
FOREIGN KEY(ref_id)
REFERENCES refs(pmid)
ON UPDATE CASCADE ON DELETE CASCADE,
UNIQUE KEY ref_author(ref_id, last_name, fore_name)
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS refs_pubtypes(
id int NOT NULL AUTO_INCREMENT,
ref_id varchar(20) NOT NULL,
pubtype_id varchar(20) NOT NULL,
pubtype_name varchar(255) NOT NULL,
PRIMARY KEY(id),
FOREIGN KEY(ref_id)
REFERENCES refs(pmid)
ON UPDATE CASCADE ON DELETE CASCADE,
UNIQUE KEY ref_pubtype(ref_id, pubtype_id)
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS meshes(
id varchar(20) NOT NULL UNIQUE,
name varchar(100) NOT NULL,
date_created date,
parent varchar(20),
PRIMARY KEY(id)
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS refs_meshes(
id int NOT NULL AUTO_INCREMENT,
ref_id varchar(20) NOT NULL,
mesh_id varchar(20) NOT NULL,
major_topic tinyint(1),
PRIMARY KEY(id),
FOREIGN KEY(ref_id)
REFERENCES refs(pmid)
ON UPDATE CASCADE ON DELETE CASCADE,
FOREIGN KEY(mesh_id)
REFERENCES meshes(id)
ON UPDATE CASCADE ON DELETE CASCADE
)ENGINE=INNODB;''')
cur.execute('''CREATE TABLE IF NOT EXISTS citations(
id int NOT NULL AUTO_INCREMENT,
citing_id varchar(20) NOT NULL,
cited_id varchar(20) NOT NULL,
ref_type varchar(20),
ref_source varchar(255),
PRIMARY KEY(id),
FOREIGN KEY(citing_id)
REFERENCES refs(pmid),
FOREIGN KEY(cited_id)
REFERENCES refs(pmid)
)ENGINE=INNODB;''')
def create_medline_db_sqlite(db_file_path, processing_type='complete'):
con = sqlite3.connect(db_file_path)
cur = con.cursor()
if processing_type == 'basic':
cur.execute('''CREATE TABLE IF NOT EXISTS refs(
pmid text NOT NULL UNIQUE,
title text NOT NULL,
abstract text,
date_pub_first_str text,
date_pub_first_unix integer,
PRIMARY KEY (pmid)
);''')
if processing_type == 'complete':
cur.execute('''CREATE TABLE IF NOT EXISTS refs(
pmid text NOT NULL UNIQUE,
title text NOT NULL,
abstract text,
authors text,
topics_major text,
topics_minor text,
pub_medium text,
journal_name text,
journal_volume text,
journal_issue text,
language text,
date_pub_first_str text,
date_pub_first_unix integer,
times_cited integer,
PRIMARY KEY (pmid)
);''')
if processing_type == 'complete_normalized':
cur.execute('''CREATE TABLE IF NOT EXISTS refs(
pmid integer NOT NULL UNIQUE,
title text NOT NULL,
abstract text,
topics_major text,
topics_minor text,
pub_medium text,
journal_name text,
journal_volume text,
journal_issue text,
language text,
date_pub_first_str text,
date_pub_first_unix integer,
times_cited integer,
PRIMARY KEY (pmid)
);''')
cur.execute('''CREATE TABLE IF NOT EXISTS ref_authors(
id INTEGER PRIMARY KEY,
pmid INTEGER NOT NULL,
last_name TEXT,
fore_name TEXT,
affiliation TEXT
);''')
cur.execute('''CREATE TABLE IF NOT EXISTS ref_topics(
id INTEGER PRIMARY KEY,
pmid INTEGER NOT NULL,
mesh_id TEXT NOT NULL,
major_topic INTEGER NOT NULL
);''')
cur.execute('''CREATE TABLE IF NOT EXISTS ref_citations(
id INTEGER PRIMARY KEY,
citing_pmid INTEGER NOT NULL,
cited_pmid INTEGER NOT NULL
);''')
cur.execute('''CREATE TABLE IF NOT EXISTS topics(
id INTEGER PRIMARY KEY,
mesh_id TEXT NOT NULL,
tree_id TEXT NOT NULL,
name TEXT NOT NULL,
date_created_str TEXT,
date_created_unix INTEGER
);''')
def add_mesh_data_to_db(mesh_folder, db_type='sqlite', sqlite_db_path=None):
'''
Adds all mesh topics to the meshes db table
TODO: add treenumberlist. ?Maybe add just the first?
:param mesh_folder: folder with mesh.xml (raw data) or meshes_list.pickle (extracted data)
:return:
'''
try:
meshes_list = pickle.load(open(mesh_folder + 'meshes_list.pickle', 'rb'))
except IOError:
print "Meshes_list does not yet exist, creating now..."
meshes_list = mesh_xml_to_pickle(mesh_folder)
insert_list = []
for mesh in meshes_list:
insert_list.append((mesh['mesh_id'], mesh['tree_id'], mesh['name'], mesh['date_created_str'],
mesh['date_created_unix']))
if db_type == 'sqlite':
db = sqlite3.connect(sqlite_db_path)
cur = db.cursor()
cur.executemany('''INSERT OR IGNORE INTO topics(mesh_id, tree_id, name, date_created_str, date_created_unix)
VALUES (?, ?, ?, ?, ?)''',insert_list)
db.commit()
def add_fulltext_search_to_db(sqlite_db_path):
'''
Adds an fts4 reverse lookup table to the database
:param sqlite_db_path:
:return:
'''
db = sqlite3.connect(sqlite_db_path)
cur = db.cursor()
# add reverse lookup table to db, tokenize with porter stemmer
cur.execute('create virtual table refs_lookup using fts4(pmid, date_pub_first_year, title, abstract, topics_major, '
'topics_minor, tokenize = porter);')
cur.execute('insert into refs_lookup select pmid, substr(date_pub_first_str, 1, 4), title, abstract, topics_major,'
' topics_minor from refs;')
db.commit()
# create virtual table refs_lookup using fts4(pmid, date_pub_first_year, title, abstract, topics_major, topics_minor, tokenize = porter);
# insert into refs_lookup select pmid, substr(date_pub_first_str, 1, 4), title, abstract, topics_major, topics_minor from refs;
def mesh_xml_to_pickle(mesh_folder):
print "Loading mesh xml into beautiful soup"
with codecs.open(mesh_folder + 'mesh.xml', encoding='utf-8', mode='rb') as xml_file:
xml = xml_file.read()
print len(xml)
starts = [m.start() for m in re.finditer('<DescriptorRecord ', xml)]
ends = [m.end() for m in re.finditer('</DescriptorRecord>', xml)]
meshes_list = []
for i, _ in enumerate(starts):
text = xml[starts[i]:ends[i]]
soup_mesh = BeautifulSoup(text).findAll('descriptorrecord')[0]
main_mesh = {'mesh_id': soup_mesh.descriptorui.text,
'name': soup_mesh.descriptorname.text.strip()}
try:
main_mesh['date_created_str'], main_mesh['date_created_unix'] = parse_date(soup_mesh.datecreated)
except AttributeError:
print "Problem with date"
print soup_mesh.descriptorui.text, soup_mesh.descriptorname.text.strip()
print soup_mesh.dateestablished
main_mesh['date_created_str'] = None
main_mesh['date_created_unix'] = None
        for treenumber_mesh in soup_mesh.findAll('treenumber'):
            # copy per tree number: aliasing main_mesh directly would make
            # every appended entry share the last tree_id seen
            tree_mesh = dict(main_mesh)
            tree_mesh['tree_id'] = treenumber_mesh.text
            meshes_list.append(tree_mesh)
pickle.dump(meshes_list, open(mesh_folder + 'meshes_list.pickle', 'wb'))
return meshes_list
def create_sample_db(original_db_path, new_db_path, number_of_docs=1000000, fts=False):
'''
Using a complete medline db, this script creates a version with number_of_docs randomly selected ones.
:param original_db_path:
:param new_db_path:
:param number_of_docs: number of documents to be added
:param fts: True for add fts table
:return:
'''
print "Adding {} randomly selected documents from {} to {}\n\n".format(number_of_docs, original_db_path, new_db_path)
con_orig = sqlite3.connect(original_db_path)
cur_orig = con_orig.cursor()
con_new = sqlite3.connect(new_db_path)
query = '''SELECT pmid, title, abstract, authors, topics_major, topics_minor, pub_medium, journal_name,
journal_volume, journal_issue, language, date_pub_first_str, date_pub_first_unix
FROM refs ORDER BY RANDOM() LIMIT {};'''.format(number_of_docs)
insert_list = []
print "Running query: {}".format(query)
for row in cur_orig.execute(query):
if len(insert_list) >= 1000:
print "inserting 1000 entries"
insert_data_to_db(insert_list, db_type='sqlite', db=con_new, processing_type='complete')
insert_list = []
insert_list.append({
'pmid': row[0],
'title': row[1],
'abstract': row[2],
'authors_str': row[3],
'topics_major': row[4],
'topics_minor': row[5],
'pub_medium': row[6],
'journal_name': row[7],
'journal_volume': row[8],
'journal_issue': row[9],
'language': row[10],
'date_pub_first_str': row[11],
'date_pub_first_unix': row[12]})
insert_data_to_db(insert_list, db_type='sqlite', db=con_new, processing_type='complete')
if fts:
print "Adding fts table"
add_fulltext_search_to_db(new_db_path)
if __name__ == "__main__":
sqlite_db_path = '/home/stephan/tobacco/medline/medline_fts_5mio.db'
processing_type = 'complete'
source_folder = '/home/stephan/tobacco/medline/'
db_type = 'sqlite'
create_medline_db_sqlite(sqlite_db_path, processing_type=processing_type)
# extract_refs_from_xml(source_folder=source_folder, db_type=db_type,
# sqlite_db_path=sqlite_db_path,
# processing_type=processing_type)
#add_fulltext_search_to_db(sqlite_db_path)
create_sample_db(original_db_path='/home/stephan/tobacco/medline/medline_complete.db',
new_db_path=sqlite_db_path,
number_of_docs=5000000,
fts=True)
# create virtual table refs_lookup using fts4(pmid, date_pub_first_year, title, abstract, topics_major, topics_minor, tokenize = porter);
# insert into refs_lookup select pmid, substr(date_pub_first_str, 1, 4), title, abstract, topics_major, topics_minor from refs; | 37.844498 | 139 | 0.523927 | 1,688 | 15,819 | 4.670024 | 0.171209 | 0.030192 | 0.025878 | 0.037295 | 0.443486 | 0.39249 | 0.351009 | 0.322339 | 0.287454 | 0.251554 | 0 | 0.017079 | 0.396675 | 15,819 | 418 | 140 | 37.844498 | 0.808885 | 0.055882 | 0 | 0.339286 | 0 | 0 | 0.645628 | 0.012232 | 0 | 0 | 0 | 0.002392 | 0 | 0 | null | null | 0 | 0.032143 | null | null | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
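`add_mesh_data_to_db` above relies on `executemany` with `INSERT OR IGNORE` so that re-running the load does not duplicate topics. A self-contained sketch of that idempotent-insert pattern on an in-memory database (table and rows are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE topics(
                   mesh_id TEXT NOT NULL UNIQUE,
                   name TEXT NOT NULL)""")
rows = [("D000001", "Calcimycin"),
        ("D000002", "Temefos"),
        ("D000001", "Calcimycin")]   # duplicate mesh_id, silently skipped
cur.executemany(
    "INSERT OR IGNORE INTO topics(mesh_id, name) VALUES (?, ?)", rows)
con.commit()
count = cur.execute("SELECT COUNT(*) FROM topics").fetchone()[0]
```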
ed3ef6bea549c440e0574be8ace6cdaeaef7ee73 | 8,668 | py | Python | sdk/python/pulumi_azure_native/media/v20180701/_enums.py | pulumi-bot/pulumi-azure-native | f7b9490b5211544318e455e5cceafe47b628e12c | [
"Apache-2.0"
] | 31 | 2020-09-21T09:41:01.000Z | 2021-02-26T13:21:59.000Z | sdk/python/pulumi_azure_native/media/v20180701/_enums.py | pulumi-bot/pulumi-azure-native | f7b9490b5211544318e455e5cceafe47b628e12c | [
"Apache-2.0"
] | 231 | 2020-09-21T09:38:45.000Z | 2021-03-01T11:16:03.000Z | sdk/python/pulumi_azure_native/media/v20180701/_enums.py | pulumi-bot/pulumi-azure-native | f7b9490b5211544318e455e5cceafe47b628e12c | [
"Apache-2.0"
] | 4 | 2020-09-29T14:14:59.000Z | 2021-02-10T20:38:16.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
from enum import Enum
__all__ = [
'AacAudioProfile',
'AnalysisResolution',
'AssetContainerPermission',
'ContentKeyPolicyFairPlayRentalAndLeaseKeyType',
'ContentKeyPolicyPlayReadyContentType',
'ContentKeyPolicyPlayReadyLicenseType',
'ContentKeyPolicyPlayReadyUnknownOutputPassingOption',
'ContentKeyPolicyRestrictionTokenType',
'DeinterlaceMode',
'DeinterlaceParity',
'EncoderNamedPreset',
'EntropyMode',
'FilterTrackPropertyCompareOperation',
'FilterTrackPropertyType',
'H264Complexity',
'H264VideoProfile',
'InsightsType',
'LiveEventEncodingType',
'LiveEventInputProtocol',
'OnErrorType',
'Priority',
'Rotation',
'StorageAccountType',
'StreamOptionsFlag',
'StretchMode',
'TrackPropertyCompareOperation',
'TrackPropertyType',
]
class AacAudioProfile(str, Enum):
"""
The encoding profile to be used when encoding audio with AAC.
"""
AAC_LC = "AacLc"
HE_AAC_V1 = "HeAacV1"
HE_AAC_V2 = "HeAacV2"
class AnalysisResolution(str, Enum):
"""
Specifies the maximum resolution at which your video is analyzed. The default behavior is "SourceResolution," which will keep the input video at its original resolution when analyzed. Using "StandardDefinition" will resize input videos to standard definition while preserving the appropriate aspect ratio. It will only resize if the video is of higher resolution. For example, a 1920x1080 input would be scaled to 640x360 before processing. Switching to "StandardDefinition" will reduce the time it takes to process high resolution video. It may also reduce the cost of using this component (see https://azure.microsoft.com/en-us/pricing/details/media-services/#analytics for details). However, faces that end up being too small in the resized video may not be detected.
"""
SOURCE_RESOLUTION = "SourceResolution"
STANDARD_DEFINITION = "StandardDefinition"
class AssetContainerPermission(str, Enum):
"""
The permissions to set on the SAS URL.
"""
READ = "Read"
READ_WRITE = "ReadWrite"
READ_WRITE_DELETE = "ReadWriteDelete"
class ContentKeyPolicyFairPlayRentalAndLeaseKeyType(str, Enum):
"""
The rental and lease key type.
"""
UNKNOWN = "Unknown"
UNDEFINED = "Undefined"
DUAL_EXPIRY = "DualExpiry"
PERSISTENT_UNLIMITED = "PersistentUnlimited"
PERSISTENT_LIMITED = "PersistentLimited"
class ContentKeyPolicyPlayReadyContentType(str, Enum):
"""
The PlayReady content type.
"""
UNKNOWN = "Unknown"
UNSPECIFIED = "Unspecified"
ULTRA_VIOLET_DOWNLOAD = "UltraVioletDownload"
ULTRA_VIOLET_STREAMING = "UltraVioletStreaming"
class ContentKeyPolicyPlayReadyLicenseType(str, Enum):
"""
The license type.
"""
UNKNOWN = "Unknown"
NON_PERSISTENT = "NonPersistent"
PERSISTENT = "Persistent"
class ContentKeyPolicyPlayReadyUnknownOutputPassingOption(str, Enum):
"""
Configures Unknown output handling settings of the license.
"""
UNKNOWN = "Unknown"
NOT_ALLOWED = "NotAllowed"
ALLOWED = "Allowed"
ALLOWED_WITH_VIDEO_CONSTRICTION = "AllowedWithVideoConstriction"
class ContentKeyPolicyRestrictionTokenType(str, Enum):
"""
The type of token.
"""
UNKNOWN = "Unknown"
SWT = "Swt"
JWT = "Jwt"
class DeinterlaceMode(str, Enum):
"""
The deinterlacing mode. Defaults to AutoPixelAdaptive.
"""
OFF = "Off"
AUTO_PIXEL_ADAPTIVE = "AutoPixelAdaptive"
class DeinterlaceParity(str, Enum):
"""
The field parity for de-interlacing, defaults to Auto.
"""
AUTO = "Auto"
TOP_FIELD_FIRST = "TopFieldFirst"
BOTTOM_FIELD_FIRST = "BottomFieldFirst"
class EncoderNamedPreset(str, Enum):
"""
The built-in preset to be used for encoding videos.
"""
H264_SINGLE_BITRATE_SD = "H264SingleBitrateSD"
H264_SINGLE_BITRATE720P = "H264SingleBitrate720p"
H264_SINGLE_BITRATE1080P = "H264SingleBitrate1080p"
ADAPTIVE_STREAMING = "AdaptiveStreaming"
AAC_GOOD_QUALITY_AUDIO = "AACGoodQualityAudio"
CONTENT_AWARE_ENCODING_EXPERIMENTAL = "ContentAwareEncodingExperimental"
CONTENT_AWARE_ENCODING = "ContentAwareEncoding"
H264_MULTIPLE_BITRATE1080P = "H264MultipleBitrate1080p"
H264_MULTIPLE_BITRATE720P = "H264MultipleBitrate720p"
H264_MULTIPLE_BITRATE_SD = "H264MultipleBitrateSD"
class EntropyMode(str, Enum):
"""
The entropy mode to be used for this layer. If not specified, the encoder chooses the mode that is appropriate for the profile and level.
"""
CABAC = "Cabac"
CAVLC = "Cavlc"
class FilterTrackPropertyCompareOperation(str, Enum):
"""
The track property condition operation.
"""
EQUAL = "Equal"
NOT_EQUAL = "NotEqual"
class FilterTrackPropertyType(str, Enum):
"""
The track property type.
"""
UNKNOWN = "Unknown"
TYPE = "Type"
NAME = "Name"
LANGUAGE = "Language"
FOUR_CC = "FourCC"
BITRATE = "Bitrate"
class H264Complexity(str, Enum):
"""
Tells the encoder how to choose its encoding settings. The default value is Balanced.
"""
SPEED = "Speed"
BALANCED = "Balanced"
QUALITY = "Quality"
class H264VideoProfile(str, Enum):
"""
We currently support Baseline, Main, High, High422, High444. Default is Auto.
"""
AUTO = "Auto"
BASELINE = "Baseline"
MAIN = "Main"
HIGH = "High"
HIGH422 = "High422"
HIGH444 = "High444"
class InsightsType(str, Enum):
"""
Defines the type of insights that you want the service to generate. The allowed values are 'AudioInsightsOnly', 'VideoInsightsOnly', and 'AllInsights'. The default is AllInsights. If you set this to AllInsights and the input is audio only, then only audio insights are generated. Similarly if the input is video only, then only video insights are generated. It is recommended that you not use AudioInsightsOnly if you expect some of your inputs to be video only; or use VideoInsightsOnly if you expect some of your inputs to be audio only. Your Jobs in such conditions would error out.
"""
AUDIO_INSIGHTS_ONLY = "AudioInsightsOnly"
VIDEO_INSIGHTS_ONLY = "VideoInsightsOnly"
ALL_INSIGHTS = "AllInsights"
class LiveEventEncodingType(str, Enum):
"""
The encoding type for Live Event. This value is specified at creation time and cannot be updated.
"""
NONE = "None"
BASIC = "Basic"
STANDARD = "Standard"
PREMIUM1080P = "Premium1080p"
class LiveEventInputProtocol(str, Enum):
"""
The streaming protocol for the Live Event. This is specified at creation time and cannot be updated.
"""
FRAGMENTED_MP4 = "FragmentedMP4"
RTMP = "RTMP"
class OnErrorType(str, Enum):
"""
A Transform can define more than one outputs. This property defines what the service should do when one output fails - either continue to produce other outputs, or, stop the other outputs. The overall Job state will not reflect failures of outputs that are specified with 'ContinueJob'. The default is 'StopProcessingJob'.
"""
STOP_PROCESSING_JOB = "StopProcessingJob"
CONTINUE_JOB = "ContinueJob"
class Priority(str, Enum):
"""
Sets the relative priority of the TransformOutputs within a Transform. This sets the priority that the service uses for processing TransformOutputs. The default priority is Normal.
"""
LOW = "Low"
NORMAL = "Normal"
HIGH = "High"
class Rotation(str, Enum):
"""
The rotation, if any, to be applied to the input video, before it is encoded. Default is Auto
"""
AUTO = "Auto"
NONE = "None"
ROTATE0 = "Rotate0"
ROTATE90 = "Rotate90"
ROTATE180 = "Rotate180"
ROTATE270 = "Rotate270"
class StorageAccountType(str, Enum):
"""
The type of the storage account.
"""
PRIMARY = "Primary"
SECONDARY = "Secondary"
class StreamOptionsFlag(str, Enum):
DEFAULT = "Default"
LOW_LATENCY = "LowLatency"
class StretchMode(str, Enum):
"""
The resizing mode - how the input video will be resized to fit the desired output resolution(s). Default is AutoSize
"""
NONE = "None"
AUTO_SIZE = "AutoSize"
AUTO_FIT = "AutoFit"
class TrackPropertyCompareOperation(str, Enum):
"""
Track property condition operation
"""
UNKNOWN = "Unknown"
EQUAL = "Equal"
class TrackPropertyType(str, Enum):
"""
Track property type
"""
UNKNOWN = "Unknown"
FOUR_CC = "FourCC"
| 30.202091 | 774 | 0.704199 | 922 | 8,668 | 6.544469 | 0.382863 | 0.031323 | 0.028174 | 0.005966 | 0.052536 | 0.024528 | 0.024528 | 0.024528 | 0.024528 | 0 | 0 | 0.020575 | 0.209391 | 8,668 | 286 | 775 | 30.307692 | 0.859915 | 0.381749 | 0 | 0.134228 | 1 | 0 | 0.304672 | 0.106524 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.013423 | 0.006711 | 0 | 0.805369 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
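Every enum in the file above mixes `str` into `Enum`. A short sketch of what that buys: members compare equal to their plain-string values, and can be looked up by value:

```python
from enum import Enum

class StorageAccountType(str, Enum):
    """str mixin, as in the SDK enums above: members behave like their
    underlying strings for comparison and serialization."""
    PRIMARY = "Primary"
    SECONDARY = "Secondary"

kind = StorageAccountType.PRIMARY
by_value = StorageAccountType("Secondary")  # reverse lookup by value
```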
ed40ecc03a8777969708bdfddce9cea1116adfe4 | 252 | py | Python | 2019/07/11/Solutions/WillDaSilva/solution.py | WillDaSilva/daily-questions | 6e86b3f625df5c60d9a57f1694fafdd24c4ff2c4 | [
"MIT"
] | 12 | 2019-07-02T22:17:49.000Z | 2020-10-08T16:02:04.000Z | 2019/07/11/Solutions/WillDaSilva/solution.py | WillDaSilva/daily-questions | 6e86b3f625df5c60d9a57f1694fafdd24c4ff2c4 | [
"MIT"
] | 2 | 2019-07-03T12:22:22.000Z | 2019-09-04T23:31:38.000Z | 2019/07/11/Solutions/WillDaSilva/solution.py | WillDaSilva/daily-questions | 6e86b3f625df5c60d9a57f1694fafdd24c4ff2c4 | [
"MIT"
] | 15 | 2019-07-02T23:29:07.000Z | 2020-05-11T15:53:07.000Z | from functools import reduce
def permutations(x):
f = lambda x,y: [a+[b] if isinstance(a, list) else [a]+[b] for a in x for b in y]
r = reduce(f, [range(len(x))]*len(x))
return (tuple(x[i] for i in j) for j in r if len(set(j)) == len(x))
| 31.5 | 85 | 0.595238 | 54 | 252 | 2.777778 | 0.481481 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 252 | 7 | 86 | 36 | 0.765306 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
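The fold-based permutations above can be sanity-checked against itertools. A self-contained usage example (re-defining the function so the block runs on its own):

```python
from functools import reduce
from itertools import permutations as std_permutations

def permutations(x):
    # same fold as the solution above: build index tuples by a repeated
    # cartesian product, then keep only those with no repeated index
    f = lambda acc, idx: [a + [b] if isinstance(a, list) else [a] + [b]
                          for a in acc for b in idx]
    r = reduce(f, [range(len(x))] * len(x))
    return (tuple(x[i] for i in j) for j in r if len(set(j)) == len(x))

result = sorted(permutations([1, 2, 3]))
```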
ed4b9890c2200c2bb29d02ab335d0eb18b051f93 | 98 | py | Python | PS6/A8.py | NumEconCopenhagen/exercises-2022 | de954532526f4c1dc7d78ff6fbb8987523735b8f | [
"MIT"
] | null | null | null | PS6/A8.py | NumEconCopenhagen/exercises-2022 | de954532526f4c1dc7d78ff6fbb8987523735b8f | [
"MIT"
] | null | null | null | PS6/A8.py | NumEconCopenhagen/exercises-2022 | de954532526f4c1dc7d78ff6fbb8987523735b8f | [
"MIT"
] | 2 | 2022-02-14T17:00:01.000Z | 2022-03-09T07:20:32.000Z | ss_func = sm.lambdify((s,g,n,delta,alpha),kss)
# Evaluate the lambdified steady-state function (assumes `import sympy as sm`
# and the symbols s, g, n, delta, alpha and expression kss defined earlier)
ss_func(0.2,0.02,0.01,0.1,1/3) | 24.5 | 46 | 0.693878 | 24 | 98 | 2.75 | 0.75 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131868 | 0.071429 | 98 | 4 | 47 | 24.5 | 0.593407 | 0.173469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
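The snippet above lambdifies a sympy expression `kss` defined earlier in the notebook. Assuming `kss` is the standard Solow steady state k* = (s/(n+g+delta))^(1/(1-alpha)) (an assumption, not a transcription), a dependency-free equivalent:

```python
def steady_state(s, g, n, delta, alpha):
    # assumed closed form k* = (s / (n + g + delta)) ** (1 / (1 - alpha));
    # this is a guess at the shape of kss, not a copy of it
    return (s / (n + g + delta)) ** (1 / (1 - alpha))

kss_value = steady_state(0.2, 0.02, 0.01, 0.1, 1 / 3)
```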
ed4d222b36d6861a177a32535482c0f51dde31ab | 846 | py | Python | v/ui_lic_eliminar.py | yo-alan/personal | 2f711a9f5dd5a16fbb3ab2a6f9b89069894ce40c | [
"MIT"
] | null | null | null | v/ui_lic_eliminar.py | yo-alan/personal | 2f711a9f5dd5a16fbb3ab2a6f9b89069894ce40c | [
"MIT"
] | 10 | 2015-01-12T12:57:09.000Z | 2015-03-30T13:39:23.000Z | v/ui_lic_eliminar.py | yo-alan/personal | 2f711a9f5dd5a16fbb3ab2a6f9b89069894ce40c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file '/home/alan/dev/personal/v/ui_lic_eliminar.ui'
#
# Created: Sat Jan 31 18:27:20 2015
# by: PyQt4 UI code generator 4.9.1
#
# WARNING! All changes made in this file will be lost!
from PyQt4 import QtCore, QtGui
try:
_fromUtf8 = QtCore.QString.fromUtf8
except AttributeError:
_fromUtf8 = lambda s: s
class Ui_Lic_Eliminar(object):
def setupUi(self, Lic_Eliminar):
Lic_Eliminar.setObjectName(_fromUtf8("Lic_Eliminar"))
Lic_Eliminar.setModal(True)
self.retranslateUi(Lic_Eliminar)
QtCore.QMetaObject.connectSlotsByName(Lic_Eliminar)
def retranslateUi(self, Lic_Eliminar):
Lic_Eliminar.setWindowTitle(QtGui.QApplication.translate("Lic_Eliminar", "Eliminar licencia", None, QtGui.QApplication.UnicodeUTF8))
| 30.214286 | 140 | 0.732861 | 109 | 846 | 5.541284 | 0.623853 | 0.200331 | 0.069536 | 0.109272 | 0.086093 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03267 | 0.167849 | 846 | 27 | 141 | 31.333333 | 0.825284 | 0.289598 | 0 | 0 | 1 | 0 | 0.069257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ed50d9ba2979b1b516b6743f07388d304ed2c64f | 78,951 | py | Python | Integrations/Uptycs/Uptycs.py | vibhuabharadwaj/content | 30d639dbea0015536a3040ec18f93e50322bded0 | [
"MIT"
] | 1 | 2019-10-07T22:15:37.000Z | 2019-10-07T22:15:37.000Z | Integrations/Uptycs/Uptycs.py | vibhuabharadwaj/content | 30d639dbea0015536a3040ec18f93e50322bded0 | [
"MIT"
] | 3 | 2021-03-26T00:38:42.000Z | 2021-06-02T03:42:17.000Z | Integrations/Uptycs/Uptycs.py | vibhuabharadwaj/content | 30d639dbea0015536a3040ec18f93e50322bded0 | [
"MIT"
] | 1 | 2020-11-11T23:59:39.000Z | 2020-11-11T23:59:39.000Z | import demistomock as demisto
from CommonServerPython import *
###############################################################################
# import required libraries package
###############################################################################
import os
import ast
import json
import jwt
from datetime import datetime, timedelta
import requests
from typing import List
from signal import signal, SIGPIPE, SIG_DFL
signal(SIGPIPE, SIG_DFL)
# disable insecure warnings
requests.packages.urllib3.disable_warnings()
###############################################################################
# clear proxy environment variables when the proxy option is disabled
###############################################################################
if not demisto.params().get('proxy', False) \
or demisto.params()['proxy'] == 'false':
    # pop() with a default avoids a KeyError when a variable is unset
    os.environ.pop('HTTP_PROXY', None)
    os.environ.pop('HTTPS_PROXY', None)
    os.environ.pop('http_proxy', None)
    os.environ.pop('https_proxy', None)
"""GLOBAL VARS"""
VERIFY_CERT = True if not demisto.params().get('insecure') else False
KEY = demisto.params().get('key')
SECRET = demisto.params().get('secret')
DOMAIN = demisto.params().get('domain')
CUSTOMER_ID = demisto.params().get('customer_id')
FETCH_TIME = demisto.params().get('fetch_time')
"""HELPER FUNCTIONS"""
def generate_headers(key, secret):
header = {}
utcnow = datetime.utcnow()
date = utcnow.strftime("%a, %d %b %Y %H:%M:%S GMT")
    token = jwt.encode({'iss': key}, secret, algorithm='HS256')
    # PyJWT 1.x returns bytes, PyJWT >= 2.0 returns str; normalize to str
    if isinstance(token, bytes):
        token = token.decode('utf-8')
    authorization = "Bearer %s" % token
header['date'] = date
header['Authorization'] = authorization
return header
def restcall(method, api, **kwargs):
header = generate_headers(KEY, SECRET)
url = ("https://%s.uptycs.io/public/api/customers/%s%s" %
(DOMAIN, CUSTOMER_ID, api))
try:
request_func = getattr(requests, method)
except AttributeError:
return_error("Invalid method: {0}".format(method))
try:
response = request_func(
url,
headers=header,
verify=VERIFY_CERT,
**kwargs)
except Exception as e:
return_error("Error Connecting to server. Details: {0}".format(str(e)))
return response.json()
def severity_to_int(level_string):
    # map severity names to integers; unknown values default to 0
    return {'low': 1, 'medium': 2, 'high': 3}.get(level_string, 0)
def remove_context_entries(context, context_entries_to_keep):
for index in range(len(context)):
for key in list(context[index]):
if key not in context_entries_to_keep:
context[index].pop(key, None)
return context
def apply_os_cut(query, os_name):
    # parameter named os_name to avoid shadowing the imported os module
    if "WHERE" not in query:
        query = ("%s WHERE" % query)
    else:
        query = ("%s AND" % query)
    op_systems = os_name.split("/")
    for index in range(len(op_systems)):
        query = ("%s os LIKE '%%%s%%'" % (query, op_systems[index]))
        if index < len(op_systems) - 1:
            query = ("%s OR" % query)
    return query
def apply_equals_cuts(query, cuts):
if all(value is None for value in cuts.values()):
return query
else:
if "WHERE" not in query:
query = ("%s WHERE" % query)
else:
query = ("%s AND" % query)
use_and = False
for key in cuts:
if cuts.get(key) is not None:
if use_and:
query = ("%s AND" % query)
if "time" in key:
query = ("%s %s=CAST('%s' AS TIMESTAMP)" % (query, key,
cuts.get(key)))
use_and = True
else:
if type(cuts.get(key)) == str:
query = ("%s %s='%s'" % (query, key, cuts.get(key)))
if type(cuts.get(key)) == int:
query = ("%s %s=%s" % (query, key, cuts.get(key)))
use_and = True
return query
def apply_like_cuts(query, cuts):
if all(value is None for value in cuts.values()):
return query
else:
if "WHERE" not in query:
query = ("%s WHERE" % query)
else:
query = ("%s AND" % query)
i = 0
for key in cuts:
i = i + 1
if cuts.get(key) is not None:
query = ("%s %s LIKE '%%%s%%'" % (query, key, cuts.get(key)))
if i < len(cuts):
query = ("%s AND" % query)
return query
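# Sketch of how the query-cut helpers compose (values are illustrative):
#   q = apply_equals_cuts("SELECT * FROM upt_assets", {"id": "abc123"})
#   -> "SELECT * FROM upt_assets WHERE id='abc123'"
#   q = apply_like_cuts(q, {"host_name": "web"})
#   -> "SELECT * FROM upt_assets WHERE id='abc123' AND host_name LIKE '%web%'"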
def apply_datetime_cuts(query, name, start, finish):
if start is None and finish is None:
return query
if "WHERE" not in query:
query = ("%s WHERE" % query)
else:
query = ("%s AND" % query)
if finish is None:
query = ("%s %s AFTER CAST('%s' AS TIMESTAMP)" % (query, name,
start))
if start is None:
query = ("%s %s BEFORE CAST('%s' AS TIMESTAMP)" % (query, name,
finish))
if start is not None and finish is not None:
query = ("%s %s BETWEEN CAST('%s' AS TIMESTAMP) AND \
CAST('%s' AS TIMESTAMP)"
% (query, name, start, finish))
return query
def uptycs_parse_date_range(timeago, start_time, end_time):
    if timeago is None:
        timeago = "1 day"
    if end_time is not None and start_time is None:
        # "timeago" looks like "<number> <unit>", e.g. "2 hours"
        number = int(timeago.split(" ")[0])
        unit = timeago.split(" ")[1]
        end_dt = datetime.strptime(end_time, "%Y-%m-%d %H:%M:%S.000")
        if unit in ('minute', 'minutes'):
            delta = timedelta(minutes=number)
        elif unit in ('hour', 'hours'):
            delta = timedelta(hours=number)
        elif unit in ('day', 'days'):
            delta = timedelta(days=number)
        elif unit in ('month', 'months'):
            delta = timedelta(days=number * 30)
        elif unit in ('year', 'years'):
            delta = timedelta(days=number * 365)
        else:
            delta = timedelta()
        temp_time_ago = datetime.strftime(end_dt - delta,
                                          "%Y-%m-%d %H:%M:%S.000")
        now = end_time
    else:
        temp_time_ago, now = parse_date_range(
            timeago, date_format="%Y-%m-%d %H:%M:%S.000")
    end = (end_time if end_time is not None else now)
    begin = (start_time if start_time is not None else temp_time_ago)
    return begin, end
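# Intended behaviour of uptycs_parse_date_range (illustrative values):
#   uptycs_parse_date_range('2 hours', None, '2020-11-11 23:59:39.000')
#   -> ('2020-11-11 21:59:39.000', '2020-11-11 23:59:39.000')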
"""COMMAND FUNCTIONS"""
def uptycs_run_query():
"""
return results of posted query
"""
http_method = 'post'
query = demisto.args().get('query')
if demisto.args().get('query_type') == 'global':
api_call = '/query'
post_data = {
'query': query
}
else:
api_call = '/assets/query'
if demisto.args().get('asset_id') is not None:
_id = {
"_id": {
"equals": demisto.args().get('asset_id')
}
}
elif demisto.args().get('host_name_is') is not None:
_id = {
"host_name": {
"equals": demisto.args().get(
'host_name_is')
}
}
elif demisto.args().get('host_name_like') is not None:
_id = {
"host_name": {
"like": "%{0}%".format(demisto.args().get(
'host_name_like'))
}
}
else:
_id = {
"host_name": {
"like": '%%'
}
}
post_data = {
"query": query,
"type": "realtime",
"filtering": {
"filters": _id
}
}
return restcall(http_method, api_call, json=post_data)
def uptycs_run_query_command():
query_results = uptycs_run_query()
human_readable = tableToMarkdown('Uptycs Query Result',
query_results.get('items'))
context = query_results.get('items')
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.QueryResults': context
}
}
return entry
def uptycs_get_assets():
"""
return list of assets enrolled in Uptycs
"""
http_method = 'post'
api_call = "/query"
query = 'SELECT * FROM upt_assets'
limit = demisto.args().get('limit')
equal_cuts = {
"id": demisto.args().get('asset_id'),
"host_name": demisto.args().get('host_name_is'),
"object_group_id": demisto.args().get('object_group_id')
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"host_name": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
os = demisto.args().get('os')
if os:
query = apply_os_cut(query, os)
query = ("%s ORDER BY last_activity_at DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
"query": query,
"queryType": query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_assets_command():
query_results = uptycs_get_assets()
human_readable = tableToMarkdown('Uptycs Assets',
query_results.get('items'),
['id', 'host_name', 'os', 'os_version',
'osquery_version', 'last_activity_at'])
context = query_results.get('items')
context_entries_to_keep = ['id', 'location', 'latitude', 'longitude',
'os_flavor', 'os', 'last_enrolled_at',
'status', 'host_name', 'os_version',
'osquery_version', 'last_activity_at',
'upt_asset_id', 'created_at']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results.get('items'),
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Assets(val.id == obj.id)': context
}
}
return entry
def uptycs_get_alerts():
"""
return list of alerts
"""
http_method = 'post'
api_call = "/query"
query = 'SELECT a.*, u.host_name FROM upt_alerts a JOIN upt_assets u ON \
a.upt_asset_id=u.id'
limit = demisto.args().get('limit')
alert_id = demisto.args().get('alert_id')
if alert_id is not None:
equal_cuts = {
"a.id": alert_id
}
query = apply_equals_cuts(query, equal_cuts)
else:
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"code": demisto.args().get('code'),
"host_name": demisto.args().get('host_name_is'),
"value": demisto.args().get('value'),
"key": demisto.args().get('key')
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"host_name": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time_ago is not None or (start_window is not None
or end_window is not None):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "alert_time", begin, end)
query = ("%s ORDER BY a.alert_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
"query": query,
"queryType": query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_alerts_command():
query_results = uptycs_get_alerts()
context = query_results.get('items')
context_entries_to_keep = ['id', 'host_name', 'grouping', 'code',
'assigned_to', 'alert_time', 'updated_at',
'metadata', 'asset', 'status', 'upt_asset_id',
'created_at', 'description', 'severity',
'value', 'key']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
    if context is not None:
        for index in range(len(context)):
            # parse the metadata JSON once per alert
            metadata = json.loads(context[index].get('metadata'))
            if metadata.get('pid'):
                context[index]['pid'] = metadata.get('pid')
            else:
                context[index]['pid'] = 'Not applicable or unknown'
            if metadata.get('indicatorId'):
                context[index]['threat_indicator_id'] = \
                    metadata.get('indicatorId')
                context[index]['threat_source_name'] = \
                    metadata.get('indicatorSummary').get('threatSourceName')
            else:
                context[index]['threat_indicator_id'] = (
                    'No threat indicator for this alert')
                context[index]['threat_source_name'] = (
                    'No threat source for this alert')
human_readable = tableToMarkdown('Uptycs Alerts: ',
context,
['upt_asset_id', 'host_name', 'grouping',
'alert_time', 'description', 'value',
'severity', 'threat_indicator_id',
'threat_source_name'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Alerts(val.id == obj.id)': context
}
}
return entry
def uptycs_get_events():
"""
return list of events
"""
http_method = 'post'
api_call = "/query"
query = 'SELECT a.*, u.host_name FROM upt_events a JOIN upt_assets u ON \
a.upt_asset_id=u.id'
limit = demisto.args().get('limit')
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"code": demisto.args().get('code'),
"host_name": demisto.args().get('host_name_is'),
"key": demisto.args().get('key'),
"value": demisto.args().get('value')
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"host_name": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time_ago is not None or (start_window is not None
or end_window is not None):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "event_time", begin, end)
query = ("%s ORDER BY a.event_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
"query": query,
"queryType": query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_events_command():
query_results = uptycs_get_events()
context = query_results.get('items')
context_entries_to_keep = ['upt_asset_id', 'host_name', 'grouping',
'code', 'assigned_to', 'event_time',
'updated_at', 'metadata', 'asset', 'status',
'id', 'created_at', 'description', 'severity',
'value', 'key']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
human_readable = tableToMarkdown('Uptycs Events',
query_results.get('items'),
['host_name', 'grouping', 'event_time',
'description', 'value', 'severity'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Events(val.id == obj.id)': query_results.get('items')
}
}
return entry
def uptycs_get_alert_rules():
"""
return list of alert rules
"""
http_method = 'get'
api_call = "/alertRules"
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_alert_rules_command():
query_results = uptycs_get_alert_rules()
human_readable = tableToMarkdown('Uptycs Alert Rules',
query_results.get('items'),
['name', 'description', 'grouping',
'enabled', 'updatedAt', 'code'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results.get('items'),
'HumanReadable': human_readable,
}
return entry
def uptycs_get_event_rules():
"""
return list of event rules
"""
http_method = 'get'
api_call = "/eventRules"
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_event_rules_command():
query_results = uptycs_get_event_rules()
human_readable = tableToMarkdown('Uptycs Event Rules',
query_results.get('items'),
['name', 'description', 'grouping',
'enabled', 'updatedAt', 'code'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results.get('items'),
'HumanReadable': human_readable,
}
return entry
def uptycs_get_process_open_files():
"""
return information for processes which opened a file
"""
http_method = 'post'
api_call = '/query'
query = "select * from process_open_files"
limit = demisto.args().get('limit')
time = demisto.args().get('time')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = "%s WHERE upt_day = %s" % (query, uptday)
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is'),
"upt_time": time
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"upt_hostname": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time is None and (time_ago is not None or (start_window is not None
or end_window is not None)):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "upt_time", begin, end)
query = ("%s ORDER BY upt_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_open_files_command():
query_results = uptycs_get_process_open_files()
human_readable = tableToMarkdown('Process which has opened a file',
query_results.get('items'),
['upt_hostname', 'pid', 'path', 'fd',
'upt_time'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid',
'path', 'fd', 'upt_time']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Files': context
}
}
return entry
def uptycs_get_process_open_sockets():
"""
return information for processes which opened a socket
"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
query = "select * from process_open_sockets"
limit = demisto.args().get('limit')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = "%s WHERE upt_day = %s" % (query, uptday)
equal_cuts = {
"remote_address": demisto.args().get('ip'),
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is'),
"upt_time": time
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"upt_hostname": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time is None and (time_ago is not None or (start_window is not None
or end_window is not None)):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "upt_time", begin, end)
query = ("%s ORDER BY upt_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_open_sockets_command():
query_results = uptycs_get_process_open_sockets()
human_readable = tableToMarkdown('process_open_sockets',
query_results.get('items'),
['upt_hostname', 'pid', 'local_address',
'remote_address', 'upt_time',
'local_port', 'remote_port', 'socket'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid',
'local_address', 'remote_address', 'upt_time',
'local_port', 'remote_port', 'socket', 'family',
'path', 'state', 'protocol']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Sockets': context
}
}
return entry
def uptycs_get_socket_events():
"""
    return socket events
"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
query = "select * from socket_events"
limit = demisto.args().get('limit')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = "%s WHERE upt_day = %s" % (query, uptday)
equal_cuts = {
"remote_address": demisto.args().get('ip'),
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is'),
"upt_time": time
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"upt_hostname": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time is None and (time_ago is not None or (start_window is not None
or end_window is not None)):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "upt_time", begin, end)
query = ("%s ORDER BY upt_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_socket_events_command():
query_results = uptycs_get_socket_events()
human_readable = tableToMarkdown('Socket events',
query_results.get('items'),
['upt_hostname', 'pid', 'local_address',
'remote_address', 'upt_time',
'local_port', 'action'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid',
'local_address', 'remote_address', 'upt_time',
'local_port', 'remote_port', 'socket',
'family', 'path', 'action', 'protocol']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.SocketEvents': context
}
}
return entry
def uptycs_get_socket_event_information():
"""
    return socket event information
"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
    else:
        # default to the current time so the query below never casts 'None'
        time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
        day = time.replace(" ", "-")
        day_list = day.split("-")
        uptday = int("%s%s%s" %
                     (str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = ("SELECT * FROM socket_events WHERE upt_day = %s AND \
upt_time <= CAST('%s' AS TIMESTAMP) AND remote_address='%s' \
ORDER BY upt_time DESC LIMIT 1" %
(uptday, time, demisto.args().get('ip')))
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is')
}
query = apply_equals_cuts(query, equal_cuts)
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_socket_event_information_command():
query_results = uptycs_get_socket_event_information()
human_readable = tableToMarkdown('Socket event information',
query_results.get('items'),
['upt_hostname', 'pid', 'local_address',
'remote_address', 'upt_time',
'local_port', 'action'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid',
'local_address', 'remote_address', 'upt_time',
'local_port', 'remote_port', 'socket',
'family', 'path', 'action', 'protocol']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.SocketEvent': context
}
}
return entry
def uptycs_get_processes():
"""
    return processes which are running or have run on a registered Uptycs asset
"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
query = "select * from processes"
limit = demisto.args().get('limit')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = "%s WHERE upt_day = %s" % (query, uptday)
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is'),
"upt_time": time
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"upt_hostname": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time is None and (time_ago is not None or (start_window is not None
or end_window is not None)):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "upt_time", begin, end)
query = ("%s ORDER BY upt_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_processes_command():
query_results = uptycs_get_processes()
human_readable = tableToMarkdown('Processes',
query_results.get('items'),
['upt_hostname', 'pid', 'name', 'path',
'upt_time', 'parent', 'cmdline'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid', 'name',
'path', 'upt_time', 'parent', 'cmdline',
'pgroup', 'cwd']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Process': context
}
}
return entry
def uptycs_get_process_events():
"""return process events which have executed on a \
registered Uptycs asset"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
query = "select * from process_events"
limit = demisto.args().get('limit')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = "%s WHERE upt_day = %s" % (query, uptday)
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is'),
"upt_time": time
}
query = apply_equals_cuts(query, equal_cuts)
like_cuts = {
"upt_hostname": demisto.args().get('host_name_like')
}
query = apply_like_cuts(query, like_cuts)
time_ago = demisto.args().get('time_ago')
start_window = demisto.args().get('start_window')
end_window = demisto.args().get('end_window')
if time is None and (time_ago is not None or (start_window is not None
or end_window is not None)):
begin, end = uptycs_parse_date_range(time_ago,
start_window, end_window)
query = apply_datetime_cuts(query, "upt_time", begin, end)
query = ("%s ORDER BY upt_time DESC" % query)
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_events_command():
query_results = uptycs_get_process_events()
human_readable = tableToMarkdown('Process events',
query_results.get('items'),
['upt_hostname', 'pid', 'path',
'upt_time', 'parent', 'cmdline'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid', 'path',
'upt_time', 'parent', 'cmdline', 'cwd']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ProcessEvents': context
}
}
return entry
def uptycs_get_process_information():
"""return process information"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
    else:
        # default to the current time so the query below never casts 'None'
        time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
        day = time.replace(" ", "-")
        day_list = day.split("-")
        uptday = int("%s%s%s" %
                     (str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = ("WITH add_times AS (SELECT * FROM processes WHERE upt_added=True), \
remove_times AS (SELECT upt_time, upt_hash FROM processes WHERE \
upt_added=False), temp_proc AS (SELECT aa.upt_asset_id, aa.pid, \
aa.name, aa.path, aa.cmdline, aa.cwd, aa.parent, aa.pgroup, \
aa.upt_hostname, aa.upt_day, aa.upt_time as upt_add_time, \
rr.upt_time as temp_remove_time FROM add_times aa LEFT JOIN \
remove_times rr ON aa.upt_hash=rr.upt_hash), new_proc AS \
(SELECT upt_asset_id, pid, name, path, cmdline, cwd, parent, \
pgroup, upt_hostname, upt_day, upt_add_time, \
coalesce(temp_remove_time, current_timestamp) AS upt_remove_time \
FROM temp_proc) SELECT * FROM new_proc WHERE pid=%s AND \
CAST('%s' AS TIMESTAMP) BETWEEN upt_add_time AND upt_remove_time"
% (demisto.args().get('pid'), time))
equal_cuts = {
"upt_day": uptday,
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is')
}
query = apply_equals_cuts(query, equal_cuts)
query = ("%s ORDER BY upt_add_time DESC LIMIT 1" % query)
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_information_command():
query_results = uptycs_get_process_information()
human_readable = tableToMarkdown('Process information',
query_results.get('items'),
['upt_hostname', 'parent', 'pid',
'name', 'path', 'cmdline'])
context = query_results.get('items')
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Proc': context
}
}
return entry
def uptycs_get_process_event_information():
"""return process event information"""
http_method = 'post'
api_call = '/query'
time = demisto.args().get('time')
if time is not None:
day = time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
    else:
        # default to the current time so the query below never casts 'None'
        time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
        day = time.replace(" ", "-")
        day_list = day.split("-")
        uptday = int("%s%s%s" %
                     (str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = ("SELECT * FROM process_events WHERE upt_day = %s AND pid=%s AND \
upt_time<=CAST('%s' AS TIMESTAMP)" %
(uptday, demisto.args().get('pid'), time))
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is')
}
query = apply_equals_cuts(query, equal_cuts)
query = ("%s ORDER BY upt_time DESC LIMIT 1" % query)
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_event_information_command():
query_results = uptycs_get_process_event_information()
human_readable = tableToMarkdown('Process event information',
query_results.get('items'),
['upt_hostname', 'parent', 'pid',
'path', 'cmdline'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid', 'path',
'upt_time', 'parent', 'cmdline', 'cwd']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ProcEvent': context
}
}
return entry
def uptycs_get_parent_information():
"""return parent process information"""
http_method = 'post'
api_call = '/query'
child_add_time = demisto.args().get('child_add_time')
if child_add_time is not None:
day = child_add_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
    else:
        # default to the current time so the query below never casts 'None'
        child_add_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
        day = child_add_time.replace(" ", "-")
        day_list = day.split("-")
        uptday = int("%s%s%s" %
                     (str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = ("WITH add_times AS (SELECT * FROM processes WHERE upt_added=True), \
remove_times AS (SELECT upt_time, upt_hash FROM processes WHERE \
upt_added=False), temp_proc AS (SELECT aa.upt_asset_id, aa.pid, \
aa.name, aa.path, aa.cmdline, aa.cwd, aa.parent, aa.pgroup, \
aa.upt_hostname, aa.upt_day, aa.upt_time as upt_add_time, \
rr.upt_time as temp_remove_time FROM add_times aa LEFT JOIN \
remove_times rr ON aa.upt_hash=rr.upt_hash), new_proc AS \
(SELECT upt_asset_id, pid, name, path, cmdline, cwd, parent, \
pgroup, upt_hostname, upt_day, upt_add_time, \
coalesce(temp_remove_time, current_timestamp) AS upt_remove_time \
FROM temp_proc) SELECT * FROM new_proc WHERE pid=%s AND \
CAST('%s' AS TIMESTAMP) BETWEEN upt_add_time AND upt_remove_time AND \
upt_day <= %s"
% (demisto.args().get('parent'), child_add_time, uptday))
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is')
}
query = apply_equals_cuts(query, equal_cuts)
query = ("%s ORDER BY upt_add_time DESC LIMIT 1" % query)
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_parent_information_command():
query_results = uptycs_get_parent_information()
human_readable = tableToMarkdown('Parent process information',
query_results.get('items'),
['upt_hostname', 'parent', 'pid',
'name', 'path', 'cmdline'])
context = query_results.get('items')
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Parent': context
}
}
return entry
def uptycs_get_parent_event_information():
"""return process event information"""
http_method = 'post'
api_call = '/query'
child_add_time = demisto.args().get('child_add_time')
if child_add_time is not None:
day = child_add_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_child_add_time = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_child_add_time.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
query = ("SELECT * FROM process_events WHERE upt_day = %s AND pid=%s AND \
upt_time<=CAST('%s' AS TIMESTAMP)" %
(uptday, demisto.args().get('parent'), child_add_time))
equal_cuts = {
"upt_asset_id": demisto.args().get('asset_id'),
"upt_hostname": demisto.args().get('host_name_is')
}
query = apply_equals_cuts(query, equal_cuts)
query = ("%s ORDER BY upt_time DESC LIMIT 1" % query)
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_parent_event_information_command():
query_results = uptycs_get_parent_event_information()
human_readable = tableToMarkdown('Parent process event information',
query_results.get('items'),
['upt_hostname', 'parent', 'pid',
'path', 'cmdline'])
context = query_results.get('items')
context_entries_to_keep = ['upt_hostname', 'upt_asset_id', 'pid', 'path',
'upt_time', 'parent', 'cmdline', 'cwd']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ParentEvent': context
}
}
return entry
def uptycs_get_process_child_processes():
"""return child processes for a given parent process"""
http_method = 'post'
api_call = '/query'
parent = demisto.args().get('parent')
limit = demisto.args().get('limit')
asset_id = demisto.args().get('asset_id')
parent_start = demisto.args().get('parent_start_time')
parent_end = demisto.args().get('parent_end_time')
if parent_start is not None:
day = parent_start.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
else:
temp_parent_start = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
day = temp_parent_start.replace(" ", "-")
day_list = day.split("-")
uptday = int("%s%s%s" %
(str(day_list[0]), str(day_list[1]), str(day_list[2])))
if parent_end is None:
query = ("SELECT upt_time FROM process_events WHERE pid = %s AND \
upt_asset_id = '%s' AND upt_time > CAST('%s' AS TIMESTAMP) \
ORDER BY upt_time ASC limit 1" %
(parent, asset_id, parent_start))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
temp_results = restcall(http_method, api_call, json=post_data)
if len(temp_results.get('items')) > 0:
parent_end = temp_results.get('items')[0].get('upt_time')
else:
parent_end = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S")
query = ("WITH add_times AS (SELECT * FROM processes WHERE upt_added=True), \
remove_times AS (SELECT upt_time, upt_hash FROM processes WHERE \
upt_added=False), temp_proc AS (SELECT aa.upt_asset_id, aa.pid, \
aa.name, aa.path, aa.cmdline, aa.cwd, aa.parent, aa.pgroup, \
aa.upt_hostname, aa.upt_day, aa.upt_time as upt_add_time, \
rr.upt_time as temp_remove_time FROM add_times aa LEFT JOIN \
remove_times rr on aa.upt_hash=rr.upt_hash), new_proc AS \
(SELECT upt_asset_id, pid, name, path, cmdline, cwd, parent, \
pgroup, upt_hostname, upt_day, upt_add_time, \
coalesce(temp_remove_time, current_timestamp) AS upt_remove_time \
FROM temp_proc) SELECT * FROM new_proc WHERE upt_day>=%s AND \
parent = %s AND upt_asset_id = '%s' AND upt_add_time BETWEEN \
CAST('%s' AS TIMESTAMP) AND CAST('%s' AS TIMESTAMP) ORDER BY \
upt_add_time DESC"
% (uptday, parent, asset_id, parent_start, parent_end))
if limit != -1 and limit is not None:
query = ("%s LIMIT %s" % (query, limit))
query_type = 'global'
post_data = {
'query': query,
'queryType': query_type
}
return restcall(http_method, api_call, json=post_data)
def uptycs_get_process_child_processes_command():
query_results = uptycs_get_process_child_processes()
human_readable = tableToMarkdown('Child processes of a specified pid',
query_results.get('items'),
['upt_hostname', 'pid', 'name',
'path', 'cmdline', 'upt_add_time'])
context = query_results.get('items')
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Children': context
}
}
return entry
def uptycs_set_alert_status():
"""set the status of an alert"""
http_method = 'put'
api_call = ('/alerts/%s' % demisto.args().get('alert_id'))
post_data = {
'status': demisto.args().get('status')
}
return restcall(http_method, api_call, json=post_data)
def uptycs_set_alert_status_command():
query_results = uptycs_set_alert_status()
human_readable = tableToMarkdown('Uptycs Alert Status',
query_results, ['id', 'code', 'status',
'createdAt', 'updatedAt'])
context = query_results
context['updatedBy'] = context.get('updatedByUser').get('name')
context['updatedByAdmin'] = context.get('updatedByUser').get('admin')
context['updatedByEmail'] = context.get('updatedByUser').get('email')
context_entries_to_keep = ['id', 'code', 'status', 'createdAt',
'updatedAt', 'updatedBy', 'updatedByAdmin',
'updatedByEmail']
if context is not None:
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.AlertStatus': context
}
}
return entry
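Several `*_command` handlers above prune the context dict in place, keeping only a whitelist of keys before building the entry. A self-contained sketch of that pattern; the name `keep_keys` is illustrative (the integration's own `remove_context_entries` helper does the same for lists of dicts):

```python
def keep_keys(context, keys_to_keep):
    """Remove every key from `context` that is not in `keys_to_keep`.

    Iterates over list(context) so the dict can be mutated safely
    while iterating; mutates in place and returns the dict.
    """
    for key in list(context):
        if key not in keys_to_keep:
            context.pop(key, None)
    return context
```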
def uptycs_get_asset_tags():
"""set a tag on an asset"""
http_method = 'get'
api_call = ('/assets/%s' % demisto.args().get('asset_id'))
return restcall(http_method, api_call).get('tags')
def uptycs_get_asset_tags_command():
query_results = uptycs_get_asset_tags()
human_readable = tableToMarkdown('Uptycs Asset Tags for asset id: %s' %
demisto.args().get('asset_id'),
query_results, 'Tags')
context = query_results
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.AssetTags': context
}
}
return entry
def uptycs_set_asset_tag():
"""set a tag on an asset"""
http_method = 'get'
api_call = ('/assets/%s' % demisto.args().get('asset_id'))
tags = restcall(http_method, api_call).get('tags')
tags.append(demisto.args().get('tag_key') + '=' + demisto.args().get(
'tag_value'))
http_method = 'put'
post_data = {
'tags': tags
}
return restcall(http_method, api_call, json=post_data)
def uptycs_set_asset_tag_command():
query_results = uptycs_set_asset_tag()
human_readable = tableToMarkdown('Uptycs Asset Tag',
query_results, ['hostName', 'tags'])
context = query_results
context_entries_to_keep = ['hostName', 'tags']
if context is not None:
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.AssetTags': context
}
}
return entry
def uptycs_get_users():
"""return a list of uptycs users"""
http_method = 'get'
api_call = '/users'
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_users_command():
query_results = uptycs_get_users()
human_readable = tableToMarkdown('Uptycs Users',
query_results.get(
'items'), ['name', 'email', 'id',
'admin', 'active',
'createdAt', 'updatedAt'])
context = query_results.get('items')
context_entries_to_keep = ['name', 'email', 'id', 'admin', 'active',
'createdAt', 'updatedAt']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.Users': context
}
}
return entry
def uptycs_get_user_information():
"""return information about a specfic Uptycs user"""
http_method = 'get'
api_call = '/users/%s' % demisto.args().get('user_id')
return restcall(http_method, api_call)
def uptycs_get_user_information_command():
query_results = uptycs_get_user_information()
human_readable = tableToMarkdown('Uptycs User Information',
query_results, ['name', 'email', 'id'])
context = query_results
context['userRoles'] = {
context.get('userRoles')[0].get('role').get('name'):
context.get('userRoles')[0].get('role')
}
context_entries_to_keep = ['name', 'email', 'id', 'userRoles',
'userObjectGroups']
if context is not None:
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.UserInfo': context
}
}
return entry
def uptycs_get_user_asset_groups():
"""return a list of users in a particular asset group"""
http_method = 'get'
api_call = '/users'
users = restcall(http_method, api_call).get('items')
user_ids = []
for index in range(len(users)):
user_ids.append(users[index].get('id'))
asset_group_id = demisto.args().get('asset_group_id')
users_in_group = {}
for user_id in user_ids:
http_method = 'get'
api_call = '/users/%s' % user_id
user_info = restcall(http_method, api_call)
obj_groups = user_info.get('userObjectGroups')
for obj_group in obj_groups:
if obj_group.get('objectGroupId') == asset_group_id:
users_in_group[user_info.get('name')] = {
'email': user_info.get('email'),
'id': user_info.get('id')
}
return users_in_group
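The membership filter above can be exercised without live API calls by feeding it plain dicts shaped like the `/users/<id>` responses. A hedged sketch of the same logic; `users_in_asset_group` and the sample record shape are illustrative assumptions, not part of the Uptycs API:

```python
def users_in_asset_group(users, asset_group_id):
    """Return {name: {email, id}} for users whose userObjectGroups
    contain the given objectGroupId, mirroring the loop above."""
    members = {}
    for user in users:
        for grp in user.get('userObjectGroups', []):
            if grp.get('objectGroupId') == asset_group_id:
                members[user['name']] = {
                    'email': user['email'],
                    'id': user['id'],
                }
    return members
```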
def uptycs_get_user_asset_groups_command():
query_results = uptycs_get_user_asset_groups()
human_readable = tableToMarkdown('Uptycs User Asset Groups',
query_results)
context = query_results
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.UserGroups': context
}
}
return entry
def uptycs_get_asset_groups():
"""return a list of asset groups"""
http_method = 'get'
api_call = '/objectGroups'
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_asset_groups_command():
query_results = uptycs_get_asset_groups()
human_readable = tableToMarkdown('Uptycs Users',
query_results.get('items'),
['id', 'name', 'description',
'objectType', 'custom', 'createdAt',
'updatedAt'])
context = query_results.get('items')
context_entries_to_keep = ['id', 'name', 'description', 'objectType',
'custom', 'createdAt', 'updatedAt']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.AssetGroups': context
}
}
return entry
def uptycs_get_threat_indicators():
"""return a list of threat indcicators"""
http_method = 'get'
api_call = '/threatIndicators'
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
indicator = demisto.args().get('indicator')
if indicator is not None:
api_call = '%s?filters={"indicator":{"like":"%s"}}' %\
(api_call, indicator)
return restcall(http_method, api_call)
def uptycs_get_threat_indicators_command():
query_results = uptycs_get_threat_indicators()
human_readable = tableToMarkdown('Uptycs Threat Indicators',
query_results.get('items'),
['id', 'indicator', 'description',
'indicatorType', 'createdAt',
'isActive', 'threatId'])
context = query_results.get('items')
context_entries_to_keep = ['id', 'indicator', 'description',
'indicatorType', 'createdAt', 'isActive',
'threatId']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ThreatIndicators': context
}
}
return entry
def uptycs_get_threat_indicator():
"""return information about a particular threat indicator"""
http_method = 'get'
api_call = '/threatIndicators/%s' % demisto.args().get('indicator_id')
return restcall(http_method, api_call)
def uptycs_get_threat_indicator_command():
query_results = uptycs_get_threat_indicator()
human_readable = tableToMarkdown('Uptycs Threat Indicator',
query_results, ['id', 'indicator',
'description',
'indicatorType',
'createdAt', 'isActive',
'threatId'])
context = query_results
context['threat_source_id'] = context.get('threat').get('threatSourceId')
context['threat_vendor_id'] = context.get('threat').get('threatSource').\
get('threatVendorId')
context['threat_source_name'] = context.get('threat').get('threatSource').\
get('name')
context_entries_to_keep = ['id', 'indicator', 'description',
'indicatorType', 'createdAt', 'updatedAt',
'isActive', 'threatId', 'threat_source_id',
'threat_vendor_id', 'threat_source_name']
if context is not None:
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ThreatIndicator': context
}
}
return entry
def uptycs_get_threat_sources():
"""return a list of threat sources"""
http_method = 'get'
api_call = '/threatSources'
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_threat_sources_command():
query_results = uptycs_get_threat_sources()
human_readable = tableToMarkdown('Uptycs Threat Sources',
query_results.get('items'),
['name', 'description', 'url', 'enabled',
'custom', 'createdAt', 'lastDownload'])
context = query_results.get('items')
context_entries_to_keep = ['name', 'description', 'url', 'enabled',
'custom', 'createdAt', 'lastDownload']
if context is not None:
remove_context_entries(context, context_entries_to_keep)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ThreatSources': context
}
}
return entry
def uptycs_get_threat_source():
"""return information about a particular threat source"""
http_method = 'get'
api_call = '/threatSources'
threat_source_id = demisto.args().get('threat_source_id')
if threat_source_id is not None:
api_call = '%s/%s' % (api_call, threat_source_id)
return restcall(http_method, api_call)
def uptycs_get_threat_source_command():
query_results = uptycs_get_threat_source()
human_readable = tableToMarkdown('Uptycs Threat Sources',
query_results,
['name', 'description', 'url', 'enabled',
'custom', 'createdAt', 'lastDownload'])
context = query_results
context_entries_to_keep = ['name', 'description', 'url', 'enabled',
'custom', 'createdAt', 'lastDownload']
if context is not None:
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ThreatSources': context
}
}
return entry
def uptycs_get_threat_vendors():
"""return a list of threat vendors"""
http_method = 'get'
api_call = '/threatVendors'
limit = demisto.args().get('limit')
if limit != -1 and limit is not None:
api_call = ("%s?limit=%s" % (api_call, limit))
return restcall(http_method, api_call)
def uptycs_get_threat_vendors_command():
query_results = uptycs_get_threat_vendors()
context = query_results.get('items')
if context is not None:
for index in range(len(context)):
context[index].pop('links', None)
human_readable = tableToMarkdown('Uptycs Threat Vendors',
context)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': context,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.ThreatVendors': context
}
}
return entry
def uptycs_post_threat_source():
"""post a new threat source"""
url = ("https://%s.uptycs.io/public/api/customers/%s/threatSources" %
(DOMAIN, CUSTOMER_ID))
header = generate_headers(KEY, SECRET)
filepath = demisto.getFilePath(demisto.args().get('entry_id'))
post_data = {
"name": demisto.args().get('name'),
"filename": filepath.get('name'),
"description": demisto.args().get('description')
}
files = {'file': open(filepath.get('path'), 'rb')}
response = requests.post(url, headers=header, data=post_data,
files=files, verify=VERIFY_CERT)
return response
def uptycs_post_threat_source_command():
response = uptycs_post_threat_source()
human_readable = 'Uptycs Posted Threat Source'
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': response.json(),
'HumanReadable': human_readable,
}
return entry
def uptycs_get_saved_queries():
"""return a list of threat vendors"""
http_method = 'get'
api_call = '/queries'
query_id = demisto.args().get('query_id')
if query_id is not None:
api_call = '%s/%s' % (api_call, query_id)
name = demisto.args().get('name')
if name is not None:
api_call = '%s?name=%s' % (api_call, name)
return restcall(http_method, api_call)
def uptycs_get_saved_queries_command():
query_results = uptycs_get_saved_queries()
context = query_results.get('items')
if context is not None:
for index in range(len(context)):
context[index].pop('links', None)
human_readable = tableToMarkdown('Uptycs Saved Queries',
context,
['name', 'description', 'query',
'executionType', 'grouping', 'id'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': context,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.SavedQueries': context
}
}
return entry
def uptycs_run_saved_query():
"""return a list of threat vendors"""
http_method = 'get'
api_call = '/queries'
query_id = demisto.args().get('query_id')
if query_id is not None:
api_call = '%s/%s' % (api_call, query_id)
name = demisto.args().get('name')
if name is not None:
api_call = '%s?name=%s' % (api_call, name)
query_results = restcall(http_method, api_call).get('items')[0]
query = query_results.get('query')
var_args = demisto.args().get('variable_arguments')
if var_args is not None:
while type(var_args) is not dict:
var_args = ast.literal_eval(var_args)
for key, value in var_args.items():
query = query.replace(key, value)
http_method = 'post'
if query_results.get('executionType') == 'realtime':
api_call = '/assets/query'
if demisto.args().get('asset_id') is not None:
_id = {
"id": {
"equals": demisto.args().get('asset_id')
}
}
elif demisto.args().get('host_name_is') is not None:
_id = {
"host_name": {
"equals": demisto.args().get('host_name_is')
}
}
elif demisto.args().get('host_name_like') is not None:
_id = {
"host_name": {
"like": '%' + demisto.args().get('host_name_like') + '%'
}
}
else:
_id = {
"host_name": {
"like": '%%'
}
}
post_data = {
"type": "realtime",
"query": query,
"filtering": {
"filters": _id
}
}
else:
post_data = {"query": query}
api_call = '/query'
return restcall(http_method, api_call, json=post_data)
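The variable-substitution step above unwraps a possibly (doubly) stringified dict with `ast.literal_eval` and then does plain string replacement on the saved query. A minimal self-contained sketch; the helper name `substitute_query_vars` is illustrative:

```python
import ast

def substitute_query_vars(query, var_args):
    """Unwrap var_args until it is a real dict, then replace each key
    occurrence in the query text with its value, as the command above
    does for saved queries with placeholder tokens."""
    while type(var_args) is not dict:
        var_args = ast.literal_eval(var_args)
    for key, value in var_args.items():
        query = query.replace(key, value)
    return query
```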
def uptycs_run_saved_query_command():
query_results = uptycs_run_saved_query()
context = query_results.get('items')
if context is not None:
for index in range(len(context)):
context[index].pop('links', None)
human_readable = tableToMarkdown('Uptycs Query Results', context)
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': context,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.RunQuery': context
}
}
return entry
def uptycs_post_saved_query():
"""return a list of threat vendors"""
http_method = 'post'
api_call = '/queries'
post_data = {
"name": demisto.args().get('name'),
"type": demisto.args().get('type'),
"description": demisto.args().get('description'),
"query": demisto.args().get('query'),
"executionType": demisto.args().get('execution_type'),
"grouping": demisto.args().get('grouping'),
"custom": True
}
return restcall(http_method, api_call, json=post_data)
def uptycs_post_saved_query_command():
query_results = uptycs_post_saved_query()
if query_results.get("status") == 500:
return_error("Internal Server Error, check whether a query with this \
name has already been saved")
human_readable = tableToMarkdown('Uptycs Posted Query',
query_results,
['name', 'type', 'description', 'query',
'executionType', 'grouping', 'custom'])
entry = {
'ContentsFormat': formats['json'],
'Type': entryTypes['note'],
'Contents': query_results,
'HumanReadable': human_readable,
'EntryContext': {
'Uptycs.PostedQuery': query_results
}
}
return entry
def uptycs_test_module():
"""check whether Uptycs API responds correctly"""
http_method = 'get'
api_call = '/assets?limit=1'
query_results = restcall(http_method, api_call)
if query_results == 0:
return False
else:
return True
def uptycs_fetch_incidents():
"""fetch alerts from Uptycs"""
this_run = datetime.utcnow().strftime("%m/%d/%y %H:%M:%S")
if bool(demisto.getLastRun()) is False:
last_run, _ = parse_date_range(FETCH_TIME)
else:
last_run = demisto.getLastRun()['time']
http_method = 'get'
api_call = ('/alerts?filters={"alertTime":{"between":["%s","%s"]}}'
% (last_run, this_run))
query_results = restcall(http_method, api_call)
incidents = [] # type: List[dict]
if len(query_results.get('items')) == 0:
return incidents
if query_results.get('items') is not None:
for index in range(len(query_results.get('items'))):
context = query_results.get('items')[index]
context['alertId'] = context.get('id')
context['hostName'] = context.get('asset').get('hostName')
if bool(context.get('metadata').get('indicatorId')):
context['indicatorId'] = context.get('metadata').\
get('indicatorId')
context['threatId'] = context.get('metadata').\
get('indicatorSummary').get('threatId')
context['threatSourceName'] = context.get('metadata').\
get('indicatorSummary').get('threatSourceName')
context['indicatorType'] = context.get('metadata').\
get('indicatorSummary').get('indicatorType')
context_entries_to_keep = ['id', 'hostName', 'grouping',
'assignedTo', 'alertTime', 'alertId',
'updatedAt', 'status', 'assetId',
'createdAt', 'description', 'severity',
'value', 'threatId',
'threatSourceName', 'indicatorType',
'indicatorId']
for key in list(context):
if key not in context_entries_to_keep:
context.pop(key, None)
alert_time = context.get('alertTime')
incident = {
"Name": "Uptycs Alert: %s for asset: %s" %
(context.get('description'), context.get('hostName')),
"Occurred": alert_time,
"Severity": severity_to_int(context.get('severity')),
"Details": context.get('id'),
"rawJSON": json.dumps(context)
}
incidents.insert(0, incident)
demisto.setLastRun({'time': this_run})
return incidents
def main():
###########################################################################
# main function
###########################################################################
try:
if demisto.command() == 'uptycs-run-query':
demisto.results(uptycs_run_query_command())
if demisto.command() == 'uptycs-get-assets':
demisto.results(uptycs_get_assets_command())
if demisto.command() == 'uptycs-get-alerts':
demisto.results(uptycs_get_alerts_command())
if demisto.command() == 'uptycs-get-events':
demisto.results(uptycs_get_events_command())
if demisto.command() == 'uptycs-get-alert-rules':
demisto.results(uptycs_get_alert_rules_command())
if demisto.command() == 'uptycs-get-event-rules':
demisto.results(uptycs_get_event_rules_command())
if demisto.command() == 'uptycs-get-process-open-files':
demisto.results(uptycs_get_process_open_files_command())
if demisto.command() == 'uptycs-get-socket-events':
demisto.results(uptycs_get_socket_events_command())
if demisto.command() == 'uptycs-get-socket-event-information':
demisto.results(uptycs_get_socket_event_information_command())
if demisto.command() == 'uptycs-get-process-open-sockets':
demisto.results(uptycs_get_process_open_sockets_command())
if demisto.command() == 'uptycs-get-processes':
demisto.results(uptycs_get_processes_command())
if demisto.command() == 'uptycs-get-process-information':
demisto.results(uptycs_get_process_information_command())
if demisto.command() == 'uptycs-get-parent-information':
demisto.results(uptycs_get_parent_information_command())
if demisto.command() == 'uptycs-get-process-child-processes':
demisto.results(uptycs_get_process_child_processes_command())
if demisto.command() == 'uptycs-get-process-events':
demisto.results(uptycs_get_process_events_command())
if demisto.command() == 'uptycs-get-process-event-information':
demisto.results(uptycs_get_process_event_information_command())
if demisto.command() == 'uptycs-get-parent-event-information':
demisto.results(uptycs_get_parent_event_information_command())
if demisto.command() == 'uptycs-set-alert-status':
demisto.results(uptycs_set_alert_status_command())
if demisto.command() == 'uptycs-get-asset-tags':
demisto.results(uptycs_get_asset_tags_command())
if demisto.command() == 'uptycs-set-asset-tag':
demisto.results(uptycs_set_asset_tag_command())
if demisto.command() == 'uptycs-get-users':
demisto.results(uptycs_get_users_command())
if demisto.command() == 'uptycs-get-user-information':
demisto.results(uptycs_get_user_information_command())
if demisto.command() == 'uptycs-get-user-asset-groups':
demisto.results(uptycs_get_user_asset_groups_command())
if demisto.command() == 'uptycs-get-asset-groups':
demisto.results(uptycs_get_asset_groups_command())
if demisto.command() == 'uptycs-get-threat-indicators':
demisto.results(uptycs_get_threat_indicators_command())
if demisto.command() == 'uptycs-get-threat-indicator':
demisto.results(uptycs_get_threat_indicator_command())
if demisto.command() == 'uptycs-get-threat-sources':
demisto.results(uptycs_get_threat_sources_command())
if demisto.command() == 'uptycs-get-threat-source':
demisto.results(uptycs_get_threat_source_command())
if demisto.command() == 'uptycs-get-threat-vendors':
demisto.results(uptycs_get_threat_vendors_command())
if demisto.command() == 'uptycs-get-saved-queries':
demisto.results(uptycs_get_saved_queries_command())
if demisto.command() == 'uptycs-run-saved-query':
demisto.results(uptycs_run_saved_query_command())
if demisto.command() == 'uptycs-post-saved-query':
demisto.results(uptycs_post_saved_query_command())
if demisto.command() == 'uptycs-post-threat-source':
demisto.results(uptycs_post_threat_source_command())
if demisto.command() == 'test-module':
# This is the call made when pressing the integration test button.
if uptycs_test_module():
demisto.results('ok')
else:
demisto.results('test failed')
if demisto.command() == 'fetch-incidents':
demisto.incidents(uptycs_fetch_incidents())
except Exception as ex:
if demisto.command() == 'fetch-incidents':
raise
return_error(str(ex))
if __name__ in ['__main__', '__builtin__', 'builtins']:
main()
from PIL import Image
colors = [
    (0, 0, 0),
    (255, 255, 255),
    (170, 170, 170),
    (85, 85, 85),
    (254, 211, 199),
    (255, 196, 206),
    (250, 172, 142),
    (255, 139, 131),
    (244, 67, 54),
    (233, 30, 99),
    (226, 102, 158),
    (156, 39, 176),
    (103, 58, 183),
    (63, 81, 181),
    (0, 70, 112),
    (5, 113, 151),
    (33, 150, 243),
    (0, 188, 212),
    (59, 229, 219),
    (151, 253, 220),
    (22, 115, 0),
    (55, 169, 60),
    (137, 230, 66),
    (215, 255, 7),
    (255, 246, 209),
    (248, 203, 140),
    (255, 235, 59),
    (255, 193, 7),
    (255, 152, 0),
    (255, 87, 34),
    (184, 63, 39),
    (121, 85, 72),
]
im = Image.open("pic.png")
imn = Image.new(im.mode, im.size)
f = open("pic.txt", 'w');
minn=99999999999;
for i in range(im.size[0]):
for j in range(im.size[1]):
pix=im.getpixel((i, j))
for x in colors:
val = (x[0]-pix[0])**2+(x[1]-pix[1])**2+(x[2]-pix[2])**2;
if(val < minn):
minn = val
col = colors.index(x)
minn=999999999;
imn.putpixel((i, j), colors[col]);
f.write(str(col)+' ');
f.write('\n');
imn.show()
f.close()
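The inner loop above is a nearest-palette-colour lookup. The same logic as a standalone helper, with no PIL dependency; the name `nearest_color_index` is illustrative:

```python
def nearest_color_index(pix, palette):
    """Return the index of the palette colour closest to `pix`,
    using squared Euclidean distance in RGB space (no sqrt needed,
    since we only compare distances)."""
    best_idx = 0
    best_val = float('inf')
    for idx, col in enumerate(palette):
        val = (col[0] - pix[0]) ** 2 + (col[1] - pix[1]) ** 2 + (col[2] - pix[2]) ** 2
        if val < best_val:
            best_val = val
            best_idx = idx
    return best_idx
```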
from PIL import Image
import sys
import os
import math
import numpy as np
###########################################################################################
# script to generate moving mnist video dataset (frame by frame) as described in
# [1] arXiv:1502.04681 - Unsupervised Learning of Video Representations Using LSTMs
# Srivastava et al
# by Tencia Lee
# can be run as a standalone script or imported
###########################################################################################
def arr_from_img(im,shift=0):
w,h=im.size
arr=im.getdata()
    c = np.prod(arr.size) // (w * h)
return np.asarray(arr, dtype=np.float32).reshape((h,w,c)).transpose(2,1,0) / 255. - shift
def get_picture_array(X, index, shift=0):
ch, w, h = X.shape[1], X.shape[2], X.shape[3]
ret = ((X[index]+shift)*255.).reshape(ch,w,h).transpose(2,1,0).clip(0,255).astype(np.uint8)
if ch == 1:
ret=ret.reshape(h,w)
return ret
# loads mnist from web on demand
def load_dataset():
if sys.version_info[0] == 2:
from urllib import urlretrieve
else:
from urllib.request import urlretrieve
def download(filename, source='http://yann.lecun.com/exdb/mnist/'):
print("Downloading %s" % filename)
urlretrieve(source + filename, filename)
import gzip
def load_mnist_images(filename):
if not os.path.exists(filename):
download(filename)
with gzip.open(filename, 'rb') as f:
data = np.frombuffer(f.read(), np.uint8, offset=16)
data = data.reshape(-1, 1, 28, 28).transpose(0,1,3,2)
return data / np.float32(255)
return load_mnist_images('train-images-idx3-ubyte.gz')
# generates and returns video frames in uint8 array
def generate_moving_mnist(shape=(64,64), seq_len=30, seqs=10000, num_sz=28, nums_per_image=2):
mnist = load_dataset()
width, height = shape
lims = (x_lim, y_lim) = width-num_sz, height-num_sz
dataset = np.empty((seq_len*seqs, 1, width, height), dtype=np.uint8)
minpos = 100
maxpos = 0
    for seq_idx in range(seqs):
        if seq_idx % 1000 == 0:
            print(seq_idx)
# randomly generate direc/speed/position, calculate velocity vector
direcs = np.pi * (np.random.rand(nums_per_image)*2 - 1)
speeds = np.random.randint(5, size=nums_per_image)+2
veloc = [(v*math.cos(d), v*math.sin(d)) for d,v in zip(direcs, speeds)]
mnist_images = [Image.fromarray(get_picture_array(mnist,r,shift=0))\
            .resize((num_sz, num_sz), Image.LANCZOS) \
for r in np.random.randint(0, mnist.shape[0], nums_per_image)]
positions = [(np.random.rand()*x_lim, np.random.rand()*y_lim)
                     for _ in range(nums_per_image)]
        for frame_idx in range(seq_len):
            canvases = [Image.new('L', (width, height)) for _ in range(nums_per_image)]
canvas = np.zeros((1,width,height), dtype=np.float32)
for i,canv in enumerate(canvases):
canv.paste(mnist_images[i], tuple(map(lambda p: int(round(p)), positions[i])))
canvas += arr_from_img(canv, shift=0)
if min(positions[i]) < minpos:
minpos = min(positions[i])
if max(positions[i]) > maxpos:
maxpos = max(positions[i])
# update positions based on velocity
            next_pos = [tuple(map(sum, zip(p, v))) for p, v in zip(positions, veloc)]
            # bounce off wall if we hit one
for i, pos in enumerate(next_pos):
for j, coord in enumerate(pos):
if coord < -2 or coord > lims[j]+2:
veloc[i] = tuple(list(veloc[i][:j]) + [-1 * veloc[i][j]] +
list(veloc[i][j+1:]))
            positions = [tuple(map(sum, zip(p, v))) for p, v in zip(positions, veloc)]
# copy additive canvas to data array
            dataset[seq_idx * seq_len + frame_idx] = (canvas * 255).clip(0, 255).astype(np.uint8)
return dataset
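The position-update-and-bounce step above can be isolated and tested on its own. The sketch below is my extraction of that logic (the names `step` and `slack` are mine, not from the script): each digit advances by its velocity, and any component that would leave the canvas by more than 2 pixels has its velocity reflected before the move is applied to the original position.

```python
def step(positions, veloc, lims, slack=2):
    # Tentative next positions: advance each digit by its velocity.
    next_pos = [tuple(p + v for p, v in zip(pos, vel))
                for pos, vel in zip(positions, veloc)]
    # Reflect any velocity component that would leave the canvas
    # by more than `slack` pixels.
    for i, pos in enumerate(next_pos):
        for j, coord in enumerate(pos):
            if coord < -slack or coord > lims[j] + slack:
                veloc[i] = tuple(-v if k == j else v
                                 for k, v in enumerate(veloc[i]))
    # Apply the (possibly reflected) velocities to the original positions.
    positions = [tuple(p + v for p, v in zip(pos, vel))
                 for pos, vel in zip(positions, veloc)]
    return positions, veloc

# A digit at x=35 moving right at 4 px/frame with lims=(36, 36) would land
# at 39 > 38, so the x-velocity flips and it moves left instead.
print(step([(35.0, 10.0)], [(4.0, 0.0)], (36, 36)))
# -> ([(31.0, 10.0)], [(-4.0, 0.0)])
```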
def main(dest, filetype='npz', frame_size=64, seq_len=30, seqs=100, num_sz=28, nums_per_image=2):
dat = generate_moving_mnist(shape=(frame_size,frame_size), seq_len=seq_len, seqs=seqs, \
num_sz=num_sz, nums_per_image=nums_per_image)
n = seqs * seq_len
if filetype == 'hdf5':
import h5py
from fuel.datasets.hdf5 import H5PYDataset
def save_hd5py(dataset, destfile, indices_dict):
f = h5py.File(destfile, mode='w')
images = f.create_dataset('images', dataset.shape, dtype='uint8')
images[...] = dataset
            split_dict = dict((k, {'images': v}) for k, v in indices_dict.items())
f.attrs['split'] = H5PYDataset.create_split_array(split_dict)
f.flush()
f.close()
        indices_dict = {'train': (0, n * 9 // 10), 'test': (n * 9 // 10, n)}
save_hd5py(dat, dest, indices_dict)
elif filetype == 'npz':
np.savez(dest, dat)
elif filetype == 'jpg':
        for i in range(dat.shape[0]):
Image.fromarray(get_picture_array(dat, i, shift=0))\
.save(os.path.join(dest, '{}.jpg'.format(i)))
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser(description='Command line options')
parser.add_argument('--dest', type=str, dest='dest')
parser.add_argument('--filetype', type=str, dest='filetype')
parser.add_argument('--frame_size', type=int, dest='frame_size')
parser.add_argument('--seq_len', type=int, dest='seq_len')
parser.add_argument('--seqs', type=int, dest='seqs')
parser.add_argument('--num_sz', type=int, dest='num_sz')
parser.add_argument('--nums_per_image', type=int, dest='nums_per_image')
args = parser.parse_args(sys.argv[1:])
main(**{k:v for (k,v) in vars(args).items() if v is not None})
| 46.825397 | 97 | 0.592712 | 842 | 5,900 | 4.021378 | 0.301663 | 0.022741 | 0.038984 | 0.015357 | 0.094507 | 0.046072 | 0.032487 | 0.020673 | 0.020673 | 0.020673 | 0 | 0.029871 | 0.239661 | 5,900 | 125 | 98 | 47.2 | 0.724922 | 0.083051 | 0 | 0 | 1 | 0 | 0.052722 | 0.004985 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.104762 | null | null | 0.019048 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ed70175f3a0e01c233b6bd5ee69544d8aaf34295 | 380 | py | Python | 0x01/solve/my_base64_encode.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | 0x01/solve/my_base64_encode.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | 0x01/solve/my_base64_encode.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | import base64
custom_string = b'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789=~.'
standard_string = b'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/='
encode_trans = bytes.maketrans(standard_string, custom_string)
def encode(data):
    # b64encode expects and returns bytes in Python 3
    return base64.b64encode(data).translate(encode_trans)
string = b'string abc'
print (encode(string)) | 42.222222 | 85 | 0.842105 | 34 | 380 | 9.235294 | 0.470588 | 0.076433 | 0.10828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073654 | 0.071053 | 380 | 9 | 86 | 42.222222 | 0.815864 | 0 | 0 | 0 | 0 | 0 | 0.367454 | 0.341207 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0.111111 | 0.444444 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
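The inverse table can be built by swapping the arguments to `maketrans`. A minimal decode sketch under the same custom alphabet (the `decode` helper is an addition for illustration, not part of the exercise):

```python
import base64

custom = b'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789=~.'
standard = b'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/='
decode_trans = bytes.maketrans(custom, standard)

def decode(data):
    # Undo the custom-alphabet substitution, then base64-decode normally.
    return base64.b64decode(bytes(data).translate(decode_trans))

print(decode(b'c3RyaW5nIGFiYw..'))  # -> b'string abc'
```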
ed855090d2116f48b39164159eab820e2085bb3c | 4,483 | py | Python | Week 1 Dec 1-8/Dec 1.py | Descent098/advent-of-code-2020 | 74e6353248de271e233851e495f21a9589d1de36 | [
"MIT"
] | null | null | null | Week 1 Dec 1-8/Dec 1.py | Descent098/advent-of-code-2020 | 74e6353248de271e233851e495f21a9589d1de36 | [
"MIT"
] | null | null | null | Week 1 Dec 1-8/Dec 1.py | Descent098/advent-of-code-2020 | 74e6353248de271e233851e495f21a9589d1de36 | [
"MIT"
] | null | null | null | """
--- Day 1: Report Repair ---
After saving Christmas five years in a row, you've decided to take a vacation at a nice resort on a tropical island. Surely, Christmas will go on without you.
The tropical island has its own currency and is entirely cash-only. The gold coins used there have a little picture of a starfish; the locals just call them stars. None of the currency exchanges seem to have heard of them, but somehow, you'll need to find fifty of these coins by the time you arrive so you can pay the deposit on your room.
To save your vacation, you need to get all fifty stars by December 25th.
Collect stars by solving puzzles. Two puzzles will be made available on each day in the Advent calendar; the second puzzle is unlocked when you complete the first. Each puzzle grants one star. Good luck!
Before you leave, the Elves in accounting just need you to fix your expense report (your puzzle input); apparently, something isn't quite adding up.
Specifically, they need you to find the two entries that sum to 2020 and then multiply those two numbers together.
For example, suppose your expense report contained the following:
1721
979
366
299
675
1456
In this list, the two entries that sum to 2020 are 1721 and 299. Multiplying them together produces 1721 * 299 = 514579, so the correct answer is 514579.
Of course, your expense report is much larger. Find the two entries that sum to 2020; what do you get if you multiply them together?
Your puzzle answer was 898299.
--- Part Two ---
The Elves in accounting are thankful for your help; one of them even offers you a starfish coin they had left over from a past vacation. They offer you a second one if you can find three numbers in your expense report that meet the same criteria.
Using the above example again, the three entries that sum to 2020 are 979, 366, and 675. Multiplying them together produces the answer, 241861950.
In your expense report, what is the product of the three entries that sum to 2020?
"""
# This solution is highly inefficient, I did it at like 11pm the night of so there are better ways to do it
data = ["1810", "1729", "1857", "1777", "1927", "1936", "1797", "1719", "1703", "1758", "1768", "2008", "1963", "1925", "1919", "1911", "1782", "2001", "1744", "1738", "1742", "1799", "1765", "1819", "1888", "127", "1880", "1984", "1697", "1760", "1680", "1951", "1745", "1817", "1704", "1736", "1969", "1705", "1690", "1848", "1885", "1912", "1982", "1895", "1959", "1769", "1722", "1807", "1901", "1983", "1993", "1871", "1795", "1955", "1921", "1934", "1743", "1899", "1942", "1964", "1034", "1952", "1851", "1716", "1800", "1771", "1945", "1877", "1917", "1930", "1970", "1948", "1914", "1767", "1910", "563", "1121", "1897", "1946", "1882", "1739", "1900", "1714", "1931", "2000", "311", "1881", "1876", "354", "1965", "1842", "1979", "1998", "1960", "1852", "1847", "1938", "1369", "1780", "1698", "1753", "1746", "1868", "1752", "1802", "1892", "1755", "1818", "1913", "1706", "1862", "326", "1941", "1926", "1809", "1879", "1815", "1939", "1859", "1999", "1947", "1898", "1794", "1737", "1971", "1977", "1944", "1812", "1905", "1359", "1788", "1754", "1774", "1825", "1748", "1701", "1791", "1786", "1692", "1894", "1961", "1902", "1849", "1967", "1770", "1987", "1831", "1728", "1896", "1805", "1733", "1918", "1731", "661", "1776", "1494", "2005", "2009", "2004", "1915", "1695", "1710", "1804", "1929", "1725", "1772", "1933", "609", "1708", "1822", "1978", "1811", "1816", "1073", "1874", "1845", "1989", "1696", "1953", "1823", "1923", "1907", "1834", "1806", "1861", "1785", "297", "1968", "1764", "1932", "1937", "1826", "1732", "1962", "1916", "1756", "1975", "1775", "1922", "1773"]
for item in data:
if item.isdigit():
item = int(item)
for sub_item in data:
if sub_item.isdigit():
sub_item = int(sub_item)
if item + sub_item == 2020:
print(f"Answer found:\n\t{item} + {sub_item} == 2020\n\t{item} * {sub_item} == {item*sub_item}") # Answer to part 1
for sub_sub_item in data:
if sub_sub_item.isdigit(): # Answer to part 2
sub_sub_item = int(sub_sub_item)
if item + sub_item + sub_sub_item == 2020:
print(f"Answer found:\n\t{item} + {sub_item} + {sub_sub_item} == 2020\n\t{item} * {sub_item} * {sub_sub_item} == {item*sub_item * sub_sub_item}")
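As the comment above admits, the nested scan is inefficient (O(n³) for part two). For part one, a single pass with a set suffices; a sketch of that alternative (the function name is mine):

```python
def two_sum_product(nums, target=2020):
    # For each value, check whether its complement has already been seen.
    seen = set()
    for n in nums:
        if target - n in seen:
            return n * (target - n)
        seen.add(n)
    return None

# Using the example values from the puzzle description:
print(two_sum_product([1721, 979, 366, 299, 675, 1456]))  # -> 514579
```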
| 77.293103 | 1,599 | 0.629266 | 686 | 4,483 | 4.071429 | 0.571429 | 0.050125 | 0.031507 | 0.028643 | 0.138919 | 0.13498 | 0.104547 | 0.070175 | 0.029359 | 0.029359 | 0 | 0.252019 | 0.198974 | 4,483 | 57 | 1,600 | 78.649123 | 0.525759 | 0.474236 | 0 | 0 | 0 | 0.142857 | 0.431983 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
71e9a8466914920a3c2b4f14c73c2c3646421723 | 405 | py | Python | 7/7/finall/analyzers/weather.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | 5 | 2019-11-07T17:04:17.000Z | 2019-11-20T18:47:28.000Z | 7/7/finall/analyzers/weather.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | 1 | 2019-12-15T13:15:17.000Z | 2019-12-15T13:15:17.000Z | 7/7/finall/analyzers/weather.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | 3 | 2019-12-15T12:10:17.000Z | 2019-12-17T14:24:22.000Z | class WeatherAnalyzer:
def __init__(self, weathers):
weathers.sort(key=lambda x: x.temp)
self.weathers = weathers
def coldest(self):
        print("The coldest temperature is in the city {}. The temperature is {}".
              format(self.weathers[0].city_name, self.weathers[0].temp))
def print_all_weathers(self):
for i in self.weathers:
print(i)
| 31.153846 | 85 | 0.637037 | 49 | 405 | 5.122449 | 0.55102 | 0.239044 | 0.159363 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006623 | 0.254321 | 405 | 12 | 86 | 33.75 | 0.824503 | 0 | 0 | 0 | 0 | 0 | 0.167901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0 | 0 | 0.4 | 0.3 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
71ead7784b8a5afa179aa09b0eeb11aeb94036ab | 425 | py | Python | graph-algorithms/signal_tree.py | lucaspetry/algorithms-and-data-structures | 104d79923f856fde218ad37f285643d42b2379cc | [
"MIT"
] | null | null | null | graph-algorithms/signal_tree.py | lucaspetry/algorithms-and-data-structures | 104d79923f856fde218ad37f285643d42b2379cc | [
"MIT"
] | null | null | null | graph-algorithms/signal_tree.py | lucaspetry/algorithms-and-data-structures | 104d79923f856fde218ad37f285643d42b2379cc | [
"MIT"
] | null | null | null | from graph import Graph
def create_signal_tree(integers):
vertices = [integers[0]]
edges = []
for i in range(1, len(integers)):
v1 = integers[i - 1]
v2 = integers[i]
if v2 not in vertices:
vertices.append(v2)
edges.append((v1, v2, min(v1, v2)))
graph = Graph(vertices, edges, directed = False)
print(graph.get_signal_tree())
integers = [1, 5, 2, 2, 7, 4, 1, 3]
create_signal_tree(integers) | 20.238095 | 49 | 0.675294 | 68 | 425 | 4.132353 | 0.485294 | 0.106762 | 0.192171 | 0.170819 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054598 | 0.181176 | 425 | 21 | 50 | 20.238095 | 0.752874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.133333 | 0 | 0.2 | 0.066667 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
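The edge construction in `create_signal_tree` can be checked in isolation. A self-contained sketch of the same logic (the helper name `path_edges` is mine; the real script feeds the result to the external `Graph` class):

```python
def path_edges(integers):
    # Consecutive values become an edge weighted by the smaller endpoint;
    # each vertex is recorded only once, but duplicate edges are kept.
    vertices = [integers[0]]
    edges = []
    for a, b in zip(integers, integers[1:]):
        if b not in vertices:
            vertices.append(b)
        edges.append((a, b, min(a, b)))
    return vertices, edges

print(path_edges([1, 5, 2, 2]))
# -> ([1, 5, 2], [(1, 5, 1), (5, 2, 2), (2, 2, 2)])
```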
71f07ac4410cf44254f4523df354d69e93d3cf90 | 231 | py | Python | PyMOTW/source/compileall/compileall_exclude_dirs.py | axetang/AxePython | 3b517fa3123ce2e939680ad1ae14f7e602d446a6 | [
"Apache-2.0"
] | 1 | 2019-01-04T05:47:50.000Z | 2019-01-04T05:47:50.000Z | PyMOTW/source/compileall/compileall_exclude_dirs.py | axetang/AxePython | 3b517fa3123ce2e939680ad1ae14f7e602d446a6 | [
"Apache-2.0"
] | 1 | 2020-07-18T03:52:03.000Z | 2020-07-18T04:18:01.000Z | PyMOTW/source/compileall/compileall_exclude_dirs.py | axetang/AxePython | 3b517fa3123ce2e939680ad1ae14f7e602d446a6 | [
"Apache-2.0"
] | 2 | 2021-03-06T04:28:32.000Z | 2021-03-06T04:59:17.000Z | #!/usr/bin/env python3
# encoding: utf-8
#
# Copyright (c) 2009 Doug Hellmann All rights reserved.
#
"""
"""
#end_pymotw_header
import compileall
import re
compileall.compile_dir(
'examples',
rx=re.compile(r'/subdir'),
)
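The `rx` pattern excludes any file whose full path matches it. A runnable sketch against a temporary tree (the file names here are made up for illustration):

```python
import compileall
import pathlib
import re
import tempfile

root = pathlib.Path(tempfile.mkdtemp())
(root / 'subdir').mkdir()
(root / 'a.py').write_text('x = 1\n')
(root / 'subdir' / 'b.py').write_text('y = 2\n')

# Files whose path matches rx are skipped during compilation.
compileall.compile_dir(str(root), rx=re.compile(r'/subdir'), quiet=1)

compiled = sorted(p.name.split('.')[0] for p in root.rglob('*.pyc'))
print(compiled)  # a.py is compiled; subdir/b.py was excluded
```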
| 13.588235 | 55 | 0.688312 | 31 | 231 | 5.032258 | 0.870968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030928 | 0.160173 | 231 | 16 | 56 | 14.4375 | 0.773196 | 0.467532 | 0 | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
71fc7a501389f21f0b554a81a9b8389342b3914e | 388 | py | Python | ServiceRegistry/RegistryFiles/SensorRegistry.py | CommName/WildeLifeWatcher | 3ce3b564d0e6cc81ebc2b607a712580d3c388db6 | [
"MIT"
] | null | null | null | ServiceRegistry/RegistryFiles/SensorRegistry.py | CommName/WildeLifeWatcher | 3ce3b564d0e6cc81ebc2b607a712580d3c388db6 | [
"MIT"
] | null | null | null | ServiceRegistry/RegistryFiles/SensorRegistry.py | CommName/WildeLifeWatcher | 3ce3b564d0e6cc81ebc2b607a712580d3c388db6 | [
"MIT"
] | null | null | null | from RegistryFiles import ServiceRegistry
class SensorRegistry(ServiceRegistry.ServiceRegistry):
coordinateN = 40.0
coordinateE = 40.0
    def __init__(self, name, serviceAddress, servicePort, serverName, coordinateN, coordinateE):
        super().__init__(name, serviceAddress, servicePort, serverName)
self.coordinateE = coordinateE
self.coordinateN = coordinateN | 35.272727 | 95 | 0.760309 | 35 | 388 | 8.2 | 0.514286 | 0.020906 | 0.202091 | 0.271777 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018519 | 0.164948 | 388 | 11 | 96 | 35.272727 | 0.867284 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c04d072f378a1f0aef31d883b8ba3687877209a | 5,229 | py | Python | tests/attr/test_gradient.py | gorogoroyasu/captum | e4e27c537e86deca6ed841432be2f11459d6795a | [
"BSD-3-Clause"
] | null | null | null | tests/attr/test_gradient.py | gorogoroyasu/captum | e4e27c537e86deca6ed841432be2f11459d6795a | [
"BSD-3-Clause"
] | null | null | null | tests/attr/test_gradient.py | gorogoroyasu/captum | e4e27c537e86deca6ed841432be2f11459d6795a | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python3
import torch
from captum.attr._utils.gradient import (
compute_gradients,
compute_layer_gradients_and_eval,
apply_gradient_requirements,
undo_gradient_requirements,
)
from .helpers.utils import assertArraysAlmostEqual, BaseTest
from .helpers.basic_models import (
BasicModel,
BasicModel6_MultiTensor,
BasicModel_MultiLayer,
)
class Test(BaseTest):
def test_apply_gradient_reqs(self):
initial_grads = [False, True, False]
test_tensor = torch.tensor([[6.0]], requires_grad=True)
test_tensor.grad = torch.tensor([[7.0]])
test_tensor_tuple = (torch.tensor([[5.0]]), test_tensor, torch.tensor([[7.0]]))
out_mask = apply_gradient_requirements(test_tensor_tuple)
for i in range(len(test_tensor_tuple)):
self.assertTrue(test_tensor_tuple[i].requires_grad)
self.assertEqual(out_mask[i], initial_grads[i])
if test_tensor_tuple[i].grad is not None:
self.assertAlmostEqual(torch.sum(test_tensor_tuple[i].grad).item(), 0.0)
def test_undo_gradient_reqs(self):
initial_grads = [False, True, False]
test_tensor = torch.tensor([[6.0]], requires_grad=True)
test_tensor.grad = torch.tensor([[7.0]])
test_tensor_tuple = (
torch.tensor([[6.0]], requires_grad=True),
test_tensor,
torch.tensor([[7.0]], requires_grad=True),
)
undo_gradient_requirements(test_tensor_tuple, initial_grads)
for i in range(len(test_tensor_tuple)):
self.assertEqual(test_tensor_tuple[i].requires_grad, initial_grads[i])
if test_tensor_tuple[i].grad is not None:
self.assertAlmostEqual(torch.sum(test_tensor_tuple[i].grad).item(), 0.0)
def test_gradient_basic(self):
model = BasicModel()
input = torch.tensor([[5.0]], requires_grad=True)
grads = compute_gradients(model, input)[0]
assertArraysAlmostEqual(grads.squeeze(0).tolist(), [0.0], delta=0.01)
def test_gradient_basic_2(self):
model = BasicModel()
input = torch.tensor([[-3.0]], requires_grad=True)
grads = compute_gradients(model, input)[0]
assertArraysAlmostEqual(grads.squeeze(0).tolist(), [1.0], delta=0.01)
def test_gradient_multiinput(self):
model = BasicModel6_MultiTensor()
input1 = torch.tensor([[-3.0, -5.0]], requires_grad=True)
input2 = torch.tensor([[-5.0, 2.0]], requires_grad=True)
grads = compute_gradients(model, (input1, input2))
assertArraysAlmostEqual(grads[0].squeeze(0).tolist(), [0.0, 1.0], delta=0.01)
assertArraysAlmostEqual(grads[1].squeeze(0).tolist(), [0.0, 1.0], delta=0.01)
def test_layer_gradient_linear0(self):
model = BasicModel_MultiLayer()
input = torch.tensor([[5.0, -11.0, 23.0]], requires_grad=True)
grads, eval = compute_layer_gradients_and_eval(
model, model.linear0, input, target_ind=0
)
assertArraysAlmostEqual(grads.squeeze(0).tolist(), [4.0, 4.0, 4.0], delta=0.01)
assertArraysAlmostEqual(
eval.squeeze(0).tolist(), [5.0, -11.0, 23.0], delta=0.01
)
def test_layer_gradient_linear1(self):
model = BasicModel_MultiLayer()
input = torch.tensor([[5.0, 2.0, 1.0]], requires_grad=True)
grads, eval = compute_layer_gradients_and_eval(
model, model.linear1, input, target_ind=1
)
assertArraysAlmostEqual(
grads.squeeze(0).tolist(), [0.0, 1.0, 1.0, 1.0], delta=0.01
)
assertArraysAlmostEqual(
eval.squeeze(0).tolist(), [-2.0, 9.0, 9.0, 9.0], delta=0.01
)
def test_layer_gradient_linear1_inplace(self):
model = BasicModel_MultiLayer(inplace=True)
input = torch.tensor([[5.0, 2.0, 1.0]], requires_grad=True)
grads, eval = compute_layer_gradients_and_eval(
model, model.linear1, input, target_ind=1
)
assertArraysAlmostEqual(
grads.squeeze(0).tolist(), [0.0, 1.0, 1.0, 1.0], delta=0.01
)
assertArraysAlmostEqual(
eval.squeeze(0).tolist(), [-2.0, 9.0, 9.0, 9.0], delta=0.01
)
def test_layer_gradient_relu_input_inplace(self):
model = BasicModel_MultiLayer(inplace=True)
input = torch.tensor([[5.0, 2.0, 1.0]], requires_grad=True)
grads, eval = compute_layer_gradients_and_eval(
model, model.relu, input, target_ind=1, attribute_to_layer_input=True
)
assertArraysAlmostEqual(
grads.squeeze(0).tolist(), [0.0, 1.0, 1.0, 1.0], delta=0.01
)
assertArraysAlmostEqual(
eval.squeeze(0).tolist(), [-2.0, 9.0, 9.0, 9.0], delta=0.01
)
def test_layer_gradient_output(self):
model = BasicModel_MultiLayer()
input = torch.tensor([[5.0, 2.0, 1.0]], requires_grad=True)
grads, eval = compute_layer_gradients_and_eval(
model, model.linear2, input, target_ind=1
)
assertArraysAlmostEqual(grads.squeeze(0).tolist(), [0.0, 1.0], delta=0.01)
assertArraysAlmostEqual(eval.squeeze(0).tolist(), [26.0, 28.0], delta=0.01)
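A cheap way to sanity-check expected gradients like those in `test_gradient_basic` is a central finite difference. The function `f` below is an assumption on my part: it is one scalar function whose gradients match the expectations above (0 at x=5, 1 at x=-3), not necessarily captum's actual `BasicModel` source.

```python
def f(x):
    # f(x) = 1 - relu(1 - x): flat for x >= 1, slope 1 for x < 1.
    return 1 - max(1 - x, 0.0)

def numeric_grad(fn, x, eps=1e-6):
    # Central finite-difference approximation of fn'(x).
    return (fn(x + eps) - fn(x - eps)) / (2 * eps)

assert abs(numeric_grad(f, 5.0) - 0.0) < 1e-4
assert abs(numeric_grad(f, -3.0) - 1.0) < 1e-4
```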
| 41.5 | 88 | 0.633199 | 689 | 5,229 | 4.606676 | 0.129173 | 0.056711 | 0.015123 | 0.039698 | 0.802457 | 0.765595 | 0.692817 | 0.677064 | 0.657215 | 0.583491 | 0 | 0.056548 | 0.228916 | 5,229 | 125 | 89 | 41.832 | 0.730655 | 0.004016 | 0 | 0.409091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.090909 | false | 0 | 0.036364 | 0 | 0.136364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c079c9a28b2b7f439b8df69672702ceb91bf28a | 2,464 | py | Python | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[nl_NL-2011] 1.py | ghlecl/holidata | 1db24d4aecab7ec7a007720987d84ffb0988b6db | [
"MIT"
] | null | null | null | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[nl_NL-2011] 1.py | ghlecl/holidata | 1db24d4aecab7ec7a007720987d84ffb0988b6db | [
"MIT"
] | null | null | null | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[nl_NL-2011] 1.py | ghlecl/holidata | 1db24d4aecab7ec7a007720987d84ffb0988b6db | [
"MIT"
] | null | null | null | [
{
'date': '2011-01-01',
'description': 'Nieuwjaarsdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2011-04-22',
'description': 'Goede Vrijdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-04-24',
'description': 'Eerste Paasdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-04-25',
'description': 'Tweede Paasdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-04-30',
'description': 'Koninginnedag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2011-05-04',
'description': 'Dodenherdenking',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'F'
},
{
'date': '2011-05-05',
'description': 'Bevrijdingsdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2011-06-02',
'description': 'Hemelvaartsdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-06-12',
'description': 'Eerste Pinksterdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-06-13',
'description': 'Tweede Pinksterdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2011-12-05',
'description': 'Sinterklaas',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'RF'
},
{
'date': '2011-12-15',
'description': 'Koninkrijksdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2011-12-25',
'description': 'Eerste Kerstdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRF'
},
{
'date': '2011-12-26',
'description': 'Tweede Kerstdag',
'locale': 'nl-NL',
'notes': '',
'region': '',
'type': 'NRF'
}
] | 21.614035 | 44 | 0.365666 | 189 | 2,464 | 4.767196 | 0.216931 | 0.124306 | 0.155383 | 0.233074 | 0.581576 | 0.581576 | 0.526082 | 0.526082 | 0.446171 | 0.290788 | 0 | 0.075881 | 0.400974 | 2,464 | 114 | 45 | 21.614035 | 0.534553 | 0 | 0 | 0.473684 | 0 | 0 | 0.385396 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c0f87dbcb443fba7fd520317368877062550c33 | 5,374 | py | Python | wrappers/glxtrace.py | juhapekka/apitrace | fbe75280a74915428259a5f153bcaf07923a23a7 | [
"MIT"
] | 2 | 2018-09-17T01:50:46.000Z | 2018-09-17T02:12:16.000Z | wrappers/glxtrace.py | juhapekka/apitrace | fbe75280a74915428259a5f153bcaf07923a23a7 | [
"MIT"
] | null | null | null | wrappers/glxtrace.py | juhapekka/apitrace | fbe75280a74915428259a5f153bcaf07923a23a7 | [
"MIT"
] | 1 | 2018-09-17T01:50:58.000Z | 2018-09-17T01:50:58.000Z | ##########################################################################
#
# Copyright 2011 Jose Fonseca
# Copyright 2008-2010 VMware, Inc.
# All Rights Reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
#
##########################################################################/
"""GLX tracing generator."""
from gltrace import GlTracer
from specs.stdapi import Module, API
from specs.glapi import glapi
from specs.glxapi import glxapi
class GlxTracer(GlTracer):
def isFunctionPublic(self, function):
# The symbols visible in libGL.so can vary, so expose them all
return True
getProcAddressFunctionNames = [
"glXGetProcAddress",
"glXGetProcAddressARB",
]
createContextFunctionNames = [
'glXCreateContext',
'glXCreateContextAttribsARB',
'glXCreateContextWithConfigSGIX',
'glXCreateNewContext',
]
destroyContextFunctionNames = [
'glXDestroyContext',
]
makeCurrentFunctionNames = [
'glXMakeCurrent',
'glXMakeContextCurrent',
'glXMakeCurrentReadSGI',
]
def traceFunctionImplBody(self, function):
if function.name in self.destroyContextFunctionNames:
print ' gltrace::releaseContext((uintptr_t)ctx);'
GlTracer.traceFunctionImplBody(self, function)
if function.name in self.createContextFunctionNames:
print ' if (_result != NULL)'
print ' gltrace::createContext((uintptr_t)_result);'
if function.name in self.makeCurrentFunctionNames:
print ' if (_result) {'
print ' if (ctx != NULL)'
print ' gltrace::setContext((uintptr_t)ctx);'
print ' else'
print ' gltrace::clearContext();'
print ' }'
if __name__ == '__main__':
print
print '#include <stdlib.h>'
print '#include <string.h>'
print
print '#ifndef _GNU_SOURCE'
print '#define _GNU_SOURCE // for dladdr'
print '#endif'
print '#include <dlfcn.h>'
print
print '#include "trace_writer_local.hpp"'
print
print '// To validate our prototypes'
print '#define GL_GLEXT_PROTOTYPES'
print '#define GLX_GLXEXT_PROTOTYPES'
print
print '#include "glproc.hpp"'
print '#include "glsize.hpp"'
print
module = Module()
module.mergeModule(glxapi)
module.mergeModule(glapi)
api = API()
api.addModule(module)
tracer = GlxTracer()
tracer.traceApi(api)
print r'''
/*
* Invoke the true dlopen() function.
*/
static void *_dlopen(const char *filename, int flag)
{
typedef void * (*PFN_DLOPEN)(const char *, int);
static PFN_DLOPEN dlopen_ptr = NULL;
if (!dlopen_ptr) {
dlopen_ptr = (PFN_DLOPEN)dlsym(RTLD_NEXT, "dlopen");
if (!dlopen_ptr) {
os::log("apitrace: error: dlsym(RTLD_NEXT, \"dlopen\") failed\n");
return NULL;
}
}
return dlopen_ptr(filename, flag);
}
/*
* Several applications, such as Quake3, use dlopen("libGL.so.1"), but
* LD_PRELOAD does not intercept symbols obtained via dlopen/dlsym, therefore
* we need to intercept the dlopen() call here, and redirect to our wrapper
* shared object.
*/
extern "C" PUBLIC
void * dlopen(const char *filename, int flag)
{
void *handle;
handle = _dlopen(filename, flag);
const char * libgl_filename = getenv("TRACE_LIBGL");
if (filename && handle && !libgl_filename) {
if (0) {
os::log("apitrace: warning: dlopen(\"%s\", 0x%x)\n", filename, flag);
}
// FIXME: handle absolute paths and other versions
if (strcmp(filename, "libGL.so") == 0 ||
strcmp(filename, "libGL.so.1") == 0) {
// Use the true libGL.so handle instead of RTLD_NEXT from now on
_libGlHandle = handle;
// Get the file path for our shared object, and use it instead
static int dummy = 0xdeedbeef;
Dl_info info;
if (dladdr(&dummy, &info)) {
os::log("apitrace: redirecting dlopen(\"%s\", 0x%x)\n", filename, flag);
handle = _dlopen(info.dli_fname, flag);
} else {
os::log("apitrace: warning: dladdr() failed\n");
}
}
}
return handle;
}
'''
| 29.855556 | 88 | 0.620953 | 595 | 5,374 | 5.532773 | 0.426891 | 0.026731 | 0.015796 | 0.014581 | 0.072904 | 0.066829 | 0.066829 | 0.032199 | 0 | 0 | 0 | 0.005243 | 0.254745 | 5,374 | 179 | 89 | 30.022346 | 0.816729 | 0.216598 | 0 | 0.102564 | 0 | 0.017094 | 0.636817 | 0.094787 | 0 | 0 | 0.002494 | 0 | 0 | 0 | null | null | 0 | 0.034188 | null | null | 0.239316 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c183d5e376750e74bb4cc800aeeb469443e1445 | 929 | py | Python | nebu/upgrade_workers.py | linkedin/indextank-service | 880c6295ce8e7a3a55bf9b3777cc35c7680e0d7e | [
"Apache-2.0"
] | 26 | 2015-06-15T11:21:09.000Z | 2020-12-27T19:42:14.000Z | nebu/upgrade_workers.py | LinkedInAttic/indextank-service | 880c6295ce8e7a3a55bf9b3777cc35c7680e0d7e | [
"Apache-2.0"
] | 1 | 2020-09-15T19:34:38.000Z | 2020-09-15T19:34:38.000Z | nebu/upgrade_workers.py | LinkedInAttic/indextank-service | 880c6295ce8e7a3a55bf9b3777cc35c7680e0d7e | [
"Apache-2.0"
] | 12 | 2015-03-17T17:14:19.000Z | 2019-12-21T13:26:23.000Z | #!/usr/bin/python
#
# This script is used from the upgrade_frontend.sh
# to issue commands to every worker so they can update
# their nebu installations from this frontend and
# restart their controllers.
#
# author: santip
#
from nebu.models import Worker
import rpc
import socket
from thrift.transport import TTransport
for w in Worker.objects.all():
    print('Upgrading worker %d at %s' % (w.id, w.wan_dns))
dns = w.lan_dns
controller = rpc.getThriftControllerClient(dns)
host = socket.gethostbyname_ex(socket.gethostname())[0]
retcode = controller.update_worker(host)
if retcode == 0:
try:
            print('Worker %s updated. Restarting...' % dns)
            controller.restart_controller()
            print("Restart controller didn't throw an exception. Did it restart?")
except TTransport.TTransportException:
# restart will always fail
pass
print('Done')
| 29.03125 | 81 | 0.68676 | 119 | 929 | 5.310924 | 0.638655 | 0.041139 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002797 | 0.230355 | 929 | 31 | 82 | 29.967742 | 0.881119 | 0.25296 | 0 | 0 | 0 | 0 | 0.178363 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.055556 | 0.222222 | null | null | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
9c1b88eb7211009f35fc5615b7d90f9efbf4b6df | 3,568 | py | Python | eggs/zope.interface-4.1.2-py2.7-linux-x86_64.egg/zope/interface/tests/test_interfaces.py | salayhin/talkofacta | 8b5a14245dd467bb1fda75423074c4840bd69fb7 | [
"MIT"
] | 97 | 2016-10-16T05:49:31.000Z | 2021-11-17T11:25:22.000Z | eggs/zope.interface-4.1.2-py2.7-linux-x86_64.egg/zope/interface/tests/test_interfaces.py | salayhin/talkofacta | 8b5a14245dd467bb1fda75423074c4840bd69fb7 | [
"MIT"
] | 11 | 2017-03-02T08:33:19.000Z | 2020-04-27T01:31:56.000Z | eggs/zope.interface-4.1.2-py2.7-linux-x86_64.egg/zope/interface/tests/test_interfaces.py | salayhin/talkofacta | 8b5a14245dd467bb1fda75423074c4840bd69fb7 | [
"MIT"
] | 54 | 2016-10-21T16:20:42.000Z | 2021-05-05T17:20:21.000Z | import unittest

class _ConformsToIObjectEvent(object):

    def _makeOne(self, target=None):
        if target is None:
            target = object()
        return self._getTargetClass()(target)

    def test_class_conforms_to_IObjectEvent(self):
        from zope.interface.interfaces import IObjectEvent
        from zope.interface.verify import verifyClass
        verifyClass(IObjectEvent, self._getTargetClass())

    def test_instance_conforms_to_IObjectEvent(self):
        from zope.interface.interfaces import IObjectEvent
        from zope.interface.verify import verifyObject
        verifyObject(IObjectEvent, self._makeOne())


class _ConformsToIRegistrationEvent(_ConformsToIObjectEvent):

    def test_class_conforms_to_IRegistrationEvent(self):
        from zope.interface.interfaces import IRegistrationEvent
        from zope.interface.verify import verifyClass
        verifyClass(IRegistrationEvent, self._getTargetClass())

    def test_instance_conforms_to_IRegistrationEvent(self):
        from zope.interface.interfaces import IRegistrationEvent
        from zope.interface.verify import verifyObject
        verifyObject(IRegistrationEvent, self._makeOne())


class ObjectEventTests(unittest.TestCase, _ConformsToIObjectEvent):

    def _getTargetClass(self):
        from zope.interface.interfaces import ObjectEvent
        return ObjectEvent

    def test_ctor(self):
        target = object()
        event = self._makeOne(target)
        self.assertTrue(event.object is target)


class RegistrationEventTests(unittest.TestCase,
                             _ConformsToIRegistrationEvent):

    def _getTargetClass(self):
        from zope.interface.interfaces import RegistrationEvent
        return RegistrationEvent

    def test___repr__(self):
        target = object()
        event = self._makeOne(target)
        r = repr(event)
        self.assertEqual(r.splitlines(),
                         ['RegistrationEvent event:', repr(target)])


class RegisteredTests(unittest.TestCase,
                      _ConformsToIRegistrationEvent):

    def _getTargetClass(self):
        from zope.interface.interfaces import Registered
        return Registered

    def test_class_conforms_to_IRegistered(self):
        from zope.interface.interfaces import IRegistered
        from zope.interface.verify import verifyClass
        verifyClass(IRegistered, self._getTargetClass())

    def test_instance_conforms_to_IRegistered(self):
        from zope.interface.interfaces import IRegistered
        from zope.interface.verify import verifyObject
        verifyObject(IRegistered, self._makeOne())


class UnregisteredTests(unittest.TestCase,
                        _ConformsToIRegistrationEvent):

    def _getTargetClass(self):
        from zope.interface.interfaces import Unregistered
        return Unregistered

    def test_class_conforms_to_IUnregistered(self):
        from zope.interface.interfaces import IUnregistered
        from zope.interface.verify import verifyClass
        verifyClass(IUnregistered, self._getTargetClass())

    def test_instance_conforms_to_IUnregistered(self):
        from zope.interface.interfaces import IUnregistered
        from zope.interface.verify import verifyObject
        verifyObject(IUnregistered, self._makeOne())


def test_suite():
    return unittest.TestSuite((
        unittest.makeSuite(ObjectEventTests),
        unittest.makeSuite(RegistrationEventTests),
        unittest.makeSuite(RegisteredTests),
        unittest.makeSuite(UnregisteredTests),
    ))
| 33.980952 | 68 | 0.716368 | 326 | 3,568 | 7.650307 | 0.150307 | 0.064154 | 0.136327 | 0.101043 | 0.646351 | 0.627105 | 0.627105 | 0.469928 | 0.448276 | 0.448276 | 0 | 0 | 0.21917 | 3,568 | 104 | 69 | 34.307692 | 0.89519 | 0 | 0 | 0.373333 | 0 | 0 | 0.006726 | 0 | 0 | 0 | 0 | 0 | 0.026667 | 1 | 0.213333 | false | 0 | 0.28 | 0.013333 | 0.653333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9c2c32f852d0bb90a3de222824350c11e595f779 | 107 | py | Python | swig-2.0.4/Examples/test-suite/python/sneaky1_runme.py | vidkidz/crossbridge | ba0bf94aee0ce6cf7eb5be882382e52bc57ba396 | [
"MIT"
] | 1 | 2016-04-09T02:58:13.000Z | 2016-04-09T02:58:13.000Z | swig-2.0.4/Examples/test-suite/python/sneaky1_runme.py | vidkidz/crossbridge | ba0bf94aee0ce6cf7eb5be882382e52bc57ba396 | [
"MIT"
] | null | null | null | swig-2.0.4/Examples/test-suite/python/sneaky1_runme.py | vidkidz/crossbridge | ba0bf94aee0ce6cf7eb5be882382e52bc57ba396 | [
"MIT"
] | null | null | null | import sneaky1
x = sneaky1.add(3,4)
y = sneaky1.subtract(3,4)
z = sneaky1.mul(3,4)
w = sneaky1.divide(3,4)
| 17.833333 | 25 | 0.682243 | 22 | 107 | 3.318182 | 0.545455 | 0.109589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139785 | 0.130841 | 107 | 5 | 26 | 21.4 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c319317ccefea90ece8d20463241beb18a0c093 | 122 | py | Python | config_web/dgidb.py | NikkiBytes/pending.api | 3c83bb8e413c3032a3a4539d19a779b5f0b67650 | [
"Apache-2.0"
] | 3 | 2019-02-17T23:36:35.000Z | 2022-03-01T16:43:06.000Z | config_web/dgidb.py | NikkiBytes/pending.api | 3c83bb8e413c3032a3a4539d19a779b5f0b67650 | [
"Apache-2.0"
] | 56 | 2019-01-26T16:34:12.000Z | 2022-03-23T06:57:03.000Z | config_web/dgidb.py | NikkiBytes/pending.api | 3c83bb8e413c3032a3a4539d19a779b5f0b67650 | [
"Apache-2.0"
] | 6 | 2020-10-22T17:37:54.000Z | 2022-03-01T16:56:55.000Z |
ES_HOST = 'localhost:9200'
ES_INDEX = 'pending-dgidb'
ES_DOC_TYPE = 'association'
API_PREFIX = 'dgidb'
API_VERSION = ''
| 15.25 | 27 | 0.721311 | 17 | 122 | 4.823529 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038095 | 0.139344 | 122 | 7 | 28 | 17.428571 | 0.742857 | 0 | 0 | 0 | 0 | 0 | 0.355372 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c36bf4986e575a01f70da4dd9f73b12784f8528 | 15,721 | py | Python | plot-mercury.py | trishankkarthik/mercury-experiments | 4c807a69c493f96af665e309a2d5ef64a20a2e57 | [
"MIT"
] | 2 | 2017-08-31T22:39:22.000Z | 2021-09-14T20:25:02.000Z | plot-mercury.py | trishankkarthik/mercury-experiments | 4c807a69c493f96af665e309a2d5ef64a20a2e57 | [
"MIT"
] | null | null | null | plot-mercury.py | trishankkarthik/mercury-experiments | 4c807a69c493f96af665e309a2d5ef64a20a2e57 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# 1st-party
import json
import os
# 2nd-party
from nouns import METADATA_DIRECTORY
# 3rd-party
import matplotlib
# Force matplotlib to not use any Xwindows backend.
# http://stackoverflow.com/a/3054314
matplotlib.use('Agg')
# No Type 3 fonts in figures, as per ATC 2017 requirements.
# http://phyletica.org/matplotlib-fonts/
matplotlib.rcParams['pdf.fonttype'] = 42
matplotlib.rcParams['ps.fonttype'] = 42
import matplotlib.pyplot as pyplot
import numpy
import scipy.stats

def plot_vary_number_of_new_projects():
  MIN_X, MAX_X = 5, 19
  MIN_Y, MAX_Y = 2, 9

  # INPUT
  TUF_COST_FOR_NEW_USERS_FILEPATH = \
      os.path.join(METADATA_DIRECTORY, 'vary-tuf-costs-for-new-users.json')
  TUF_VERSION_COST_FOR_NEW_USERS_FILEPATH = \
      os.path.join(METADATA_DIRECTORY,
                   'vary-tuf-version-costs-for-new-users.json')
  MERCURY_COST_FOR_NEW_USERS_FILEPATH = \
      os.path.join(METADATA_DIRECTORY,
                   'vary-mercury-costs-for-new-users.json')
  MERCURY_NOHASH_COST_FOR_NEW_USERS_FILEPATH = \
      os.path.join(METADATA_DIRECTORY,
                   'vary-mercury-nohash-costs-for-new-users.json')

  with open(TUF_COST_FOR_NEW_USERS_FILEPATH) as tuf_cost_for_new_users_file:
    tuf_cost_for_new_users = json.load(tuf_cost_for_new_users_file)
  with open(TUF_VERSION_COST_FOR_NEW_USERS_FILEPATH) as \
       tuf_version_cost_for_new_users_file:
    tuf_version_cost_for_new_users = \
        json.load(tuf_version_cost_for_new_users_file)
  with open(MERCURY_COST_FOR_NEW_USERS_FILEPATH) as \
       mercury_cost_for_new_users_file:
    mercury_cost_for_new_users = json.load(mercury_cost_for_new_users_file)
  with open(MERCURY_NOHASH_COST_FOR_NEW_USERS_FILEPATH) as \
       mercury_nohash_cost_for_new_users_file:
    mercury_nohash_cost_for_new_users = \
        json.load(mercury_nohash_cost_for_new_users_file)

  # All must have the same numbers of projects
  assert tuf_cost_for_new_users.keys() == mercury_cost_for_new_users.keys()
  assert mercury_cost_for_new_users.keys() == \
         mercury_nohash_cost_for_new_users.keys()
  assert mercury_nohash_cost_for_new_users.keys() == \
         tuf_version_cost_for_new_users.keys()

  NUMBER_OF_PROJECTS = sorted(int(n) for n in tuf_cost_for_new_users)
  # NOTE: Throw out results when n is too small.
  NUMBER_OF_PROJECTS = [n for n in NUMBER_OF_PROJECTS if n >= 2**MIN_X]

  mercury_project_metadata_length = []
  mercury_snapshot_metadata_length = []
  mercury_metadata_length = []
  mercury_nohash_project_metadata_length = []
  mercury_nohash_snapshot_metadata_length = []
  mercury_nohash_metadata_length = []
  tuf_project_metadata_length = []
  tuf_snapshot_metadata_length = []
  tuf_metadata_length = []
  tuf_version_project_metadata_length = []
  tuf_version_snapshot_metadata_length = []
  tuf_version_metadata_length = []

  for n in NUMBER_OF_PROJECTS:
    n = str(n)

    key = 'project_metadata_length'
    mercury_project_metadata_length.append(mercury_cost_for_new_users[n][key])
    mercury_nohash_project_metadata_length.\
        append(mercury_nohash_cost_for_new_users[n][key])
    tuf_project_metadata_length.append(tuf_cost_for_new_users[n][key])
    tuf_version_project_metadata_length.\
        append(tuf_version_cost_for_new_users[n][key])

    key = 'snapshot_metadata_length'
    mercury_snapshot_metadata_length.append(mercury_cost_for_new_users[n][key])
    mercury_nohash_snapshot_metadata_length.\
        append(mercury_nohash_cost_for_new_users[n][key])
    tuf_snapshot_metadata_length.append(tuf_cost_for_new_users[n][key])
    tuf_version_snapshot_metadata_length.\
        append(tuf_version_cost_for_new_users[n][key])

  assert len(mercury_project_metadata_length) == \
         len(mercury_nohash_project_metadata_length)
  assert len(mercury_nohash_project_metadata_length) == \
         len(tuf_project_metadata_length)
  assert len(tuf_project_metadata_length) == \
         len(tuf_version_project_metadata_length)
  assert len(mercury_project_metadata_length) == len(NUMBER_OF_PROJECTS)

  assert len(mercury_snapshot_metadata_length) == \
         len(mercury_nohash_snapshot_metadata_length)
  assert len(mercury_nohash_snapshot_metadata_length) == \
         len(tuf_snapshot_metadata_length)
  assert len(tuf_snapshot_metadata_length) == \
         len(tuf_version_snapshot_metadata_length)
  assert len(mercury_snapshot_metadata_length) == len(NUMBER_OF_PROJECTS)

  # Produce tallies of metadata for each system.
  for i in range(len(NUMBER_OF_PROJECTS)):
    mercury_total_length = mercury_snapshot_metadata_length[i] + \
                           mercury_project_metadata_length[i]
    mercury_metadata_length.append(mercury_total_length)

    mercury_nohash_total_length = mercury_nohash_snapshot_metadata_length[i] + \
                                  mercury_nohash_project_metadata_length[i]
    mercury_nohash_metadata_length.append(mercury_nohash_total_length)

    tuf_version_total_length = tuf_version_snapshot_metadata_length[i] + \
                               tuf_version_project_metadata_length[i]
    tuf_version_metadata_length.append(tuf_version_total_length)

    tuf_total_length = tuf_snapshot_metadata_length[i] + \
                       tuf_project_metadata_length[i]
    tuf_metadata_length.append(tuf_total_length)

  assert len(mercury_metadata_length) == len(mercury_nohash_metadata_length)
  assert len(mercury_nohash_metadata_length) == len(tuf_metadata_length)
  assert len(tuf_metadata_length) == len(tuf_version_metadata_length)

  # http://stackoverflow.com/a/30670983
  x1 = numpy.log2(NUMBER_OF_PROJECTS)
  x2 = numpy.arange(MIN_X, MAX_X+1)

  y1 = numpy.log10(tuf_metadata_length)
  pyplot.plot(x1, y1, 'ro', label='TUF')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'r--')

  y1 = numpy.log10(tuf_version_metadata_length)
  pyplot.plot(x1, y1, 'ms', label='TUF-version')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'm--')

  y1 = numpy.log10(mercury_metadata_length)
  pyplot.plot(x1, y1, 'cx', label='Mercury-hash')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'c--')

  y1 = numpy.log10(mercury_nohash_metadata_length)
  pyplot.plot(x1, y1, 'g*', label='Mercury')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'g--')

  y1 = numpy.log10(mercury_project_metadata_length)
  pyplot.plot(x1, y1, 'b^', label='GPG/RSA')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'b--')

  # From /var/experiments-output/simple/compute-average-package-size.py
  AVG_PKG_SIZE = numpy.log10(659878)
  pyplot.hlines(AVG_PKG_SIZE, 0, MAX_X, colors='k', linestyles='-',
                label='Average downloaded package size')

  OBS_NUMBER_OF_PROJECTS = numpy.log2(NUMBER_OF_PROJECTS[-1])
  pyplot.vlines(OBS_NUMBER_OF_PROJECTS, MIN_Y, MAX_Y, colors='k',
                linestyles=':',
                label='Actual number of projects at end of month')

  pyplot.title('Initial cost as number of projects is varied')
  pyplot.legend(loc='upper left', fontsize=11)

  pyplot.xlabel('Number of projects', fontsize=14)
  xlabels = ['32', '64', '128', '256', '512', '1K', '2K', '4K', '8K', '16K',
             '32K', '64K', '128K', '256K', '512K', '1M']
  pyplot.xticks(x2, xlabels, fontsize=10)
  pyplot.xlim(MIN_X, MAX_X)

  pyplot.ylabel('Bandwidth cost', fontsize=14)
  yticks = numpy.arange(MIN_Y, MAX_Y+1)
  ylabels = ['100B', '1KB', '10KB', '100KB', '1MB', '10MB', '100MB']
  pyplot.yticks(yticks, ylabels, fontsize=14)
  pyplot.ylim(MIN_Y, MAX_Y)

  # write the actual plot
  VARY_COSTS_FOR_NEW_USERS_FILENAME = \
      os.path.join(METADATA_DIRECTORY, 'vary-number-of-new-projects.pdf')
  pyplot.savefig(VARY_COSTS_FOR_NEW_USERS_FILENAME)
  # clear figure
  pyplot.clf()

def plot_vary_frequency_of_project_creation_or_update():
  MIN_X, MAX_X = 0, 8
  MIN_Y, MAX_Y = 2, 9
  NUMBER_OF_FREQUENCIES = 9
  frequencies = [2**i for i in range(NUMBER_OF_FREQUENCIES)]

  mercury_project_metadata_length = []
  mercury_snapshot_metadata_length = []
  mercury_metadata_length = []
  mercury_nohash_project_metadata_length = []
  mercury_nohash_snapshot_metadata_length = []
  mercury_nohash_metadata_length = []
  tuf_version_project_metadata_length = []
  tuf_version_snapshot_metadata_length = []
  tuf_version_metadata_length = []
  tuf_project_metadata_length = []
  tuf_snapshot_metadata_length = []
  tuf_metadata_length = []

  # Over each frequency in increasing order...
  for f in frequencies:
    # INPUT
    if f == 1.0: f = int(f)
    mercury_filename = os.path.join(METADATA_DIRECTORY,
                                    'mercury-best.f{}.json'.format(f))
    mercury_nohash_filename = os.path.join(METADATA_DIRECTORY,
                                           'mercury-nohash-best.f{}.json'.\
                                           format(f))
    tuf_version_filename = os.path.join(METADATA_DIRECTORY,
                                        'tuf-version-best.f{}.json'.format(f))
    tuf_filename = os.path.join(METADATA_DIRECTORY,
                                'tuf-best.f{}.json'.format(f))

    # We care only about the recurring cost of a returning user.
    # Looks new, because the user had never been seen before in the log.
    mercury = read_json(mercury_filename)['new']
    mercury_nohash = read_json(mercury_nohash_filename)['new']
    tuf_version = read_json(tuf_version_filename)['new']
    tuf = read_json(tuf_filename)['new']

    assert mercury['package_length'] == mercury_nohash['package_length']
    assert mercury_nohash['package_length'] == tuf_version['package_length']
    assert tuf_version['package_length'] == tuf['package_length']

    # Draw the frequencies in reverse order for easier comprehension.
    mercury_snapshot_metadata_length.insert(0,
                                            mercury['snapshot_metadata_length'])
    mercury_project_metadata_length.insert(0,
                                           mercury['project_metadata_length'])
    mercury_nohash_snapshot_metadata_length.\
        insert(0, mercury_nohash['snapshot_metadata_length'])
    mercury_nohash_project_metadata_length.\
        insert(0, mercury_nohash['project_metadata_length'])
    tuf_version_snapshot_metadata_length.\
        insert(0, tuf_version['snapshot_metadata_length'])
    tuf_version_project_metadata_length.\
        insert(0, tuf_version['project_metadata_length'])
    tuf_snapshot_metadata_length.insert(0, tuf['snapshot_metadata_length'])
    tuf_project_metadata_length.insert(0, tuf['project_metadata_length'])

  assert len(mercury_project_metadata_length) == len(frequencies)
  assert len(mercury_snapshot_metadata_length) == len(frequencies)
  assert len(mercury_nohash_project_metadata_length) == len(frequencies)
  assert len(mercury_nohash_snapshot_metadata_length) == len(frequencies)
  assert len(tuf_version_project_metadata_length) == len(frequencies)
  assert len(tuf_version_snapshot_metadata_length) == len(frequencies)
  assert len(tuf_project_metadata_length) == len(frequencies)
  assert len(tuf_snapshot_metadata_length) == len(frequencies)

  # Produce tallies of metadata for each system.
  for i in range(len(frequencies)):
    mercury_total_length = mercury_snapshot_metadata_length[i] + \
                           mercury_project_metadata_length[i]
    mercury_metadata_length.append(mercury_total_length)

    mercury_nohash_total_length = mercury_nohash_snapshot_metadata_length[i] + \
                                  mercury_nohash_project_metadata_length[i]
    mercury_nohash_metadata_length.append(mercury_nohash_total_length)

    tuf_version_total_length = tuf_version_snapshot_metadata_length[i] + \
                               tuf_version_project_metadata_length[i]
    tuf_version_metadata_length.append(tuf_version_total_length)

    tuf_total_length = tuf_snapshot_metadata_length[i] + \
                       tuf_project_metadata_length[i]
    tuf_metadata_length.append(tuf_total_length)

  assert len(mercury_metadata_length) == len(mercury_nohash_metadata_length)
  assert len(mercury_nohash_metadata_length) == len(tuf_version_metadata_length)
  assert len(tuf_version_metadata_length) == len(tuf_metadata_length)

  # http://stackoverflow.com/a/30670983
  x1 = numpy.arange(MIN_X, MAX_X+1)
  x2 = numpy.arange(MIN_X, MAX_X+5)

  y1 = numpy.log10(tuf_metadata_length)
  pyplot.plot(x1, y1, 'ro', label='TUF')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'r--')

  y1 = numpy.log10(tuf_version_metadata_length)
  pyplot.plot(x1, y1, 'ms', label='TUF-version')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'm--')

  y1 = numpy.log10(mercury_metadata_length)
  pyplot.plot(x1, y1, 'cx', label='Mercury-hash')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'c--')

  y1 = numpy.log10(mercury_nohash_metadata_length)
  pyplot.plot(x1, y1, 'g*', label='Mercury')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'g--')

  print('GPG/RSA: {}'.format(mercury_project_metadata_length))
  y1 = numpy.log10(mercury_project_metadata_length)
  pyplot.plot(x1, y1, 'b^', label='GPG/RSA')
  m, c, *z = scipy.stats.linregress(x1, y1)
  y2 = m*x2+c
  pyplot.plot(x2, y2, 'b--')

  # From /var/experiments-output/simple/compute-average-package-size.py
  AVG_PKG_SIZE = numpy.log10(659878)
  pyplot.hlines(AVG_PKG_SIZE, MIN_X, MAX_X+4, colors='k', linestyles='solid',
                label='Average downloaded package size')

  pyplot.vlines(8, MIN_Y, MAX_Y, colors='k', linestyles=':',
                label='Actual rate of updates over the month')

  pyplot.title('Recurring cost as rate of project updates is varied')
  pyplot.legend(loc='upper left', fontsize=11)

  pyplot.xlabel('The average number of projects updated per minute',
                fontsize=12)
  xlabels = ['$2^{-10}$', '$2^{-9}$', '$2^{-8}$', '$2^{-7}$',
             '$2^{-6}$', '$2^{-5}$', '$2^{-4}$', '$2^{-3}$', '$2^{-2}$',
             '$2^{-1}$', '$1$', '$2$', '$4$']
  pyplot.xticks(x2, xlabels, fontsize=14)
  pyplot.xlim(MIN_X, MAX_X+4)

  pyplot.ylabel('Bandwidth cost', fontsize=14)
  yticks = numpy.arange(MIN_Y, MAX_Y+1)
  ylabels = ['100B', '1KB', '10KB', '100KB', '1MB', '10MB', '100MB']
  pyplot.yticks(yticks, ylabels, fontsize=14)
  pyplot.ylim(MIN_Y, MAX_Y)

  # write the actual plot
  PLOT_FILENAME = os.path.join(METADATA_DIRECTORY,
                               'vary-rate-of-projects-created-or-updated.pdf')
  pyplot.savefig(PLOT_FILENAME)
  # clear figure
  pyplot.clf()

def read_json(json_filename, day_number=29):
  def sanity_check(results, key):
    assert key in results
    assert 'package_length' in results[key]
    assert 'project_metadata_length' in results[key]
    assert 'snapshot_metadata_length' in results[key]

  with open(json_filename) as json_file:
    day_number_str = str(day_number)
    results = json.load(json_file)[day_number_str]
    sanity_check(results, 'new')
    sanity_check(results, 'return')
    return results


if __name__ == '__main__':
  plot_vary_number_of_new_projects()
  plot_vary_frequency_of_project_creation_or_update()
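Both plotting functions above fit a straight line to log-transformed data and then evaluate it over a wider x-range to draw the dashed extrapolated trend lines. A minimal self-contained sketch of that idiom, using `numpy.polyfit` (which gives the same least-squares slope/intercept as the `scipy.stats.linregress` calls above) on toy data where cost is exactly proportional to size:

```python
import numpy as np

# Toy data: cost grows linearly with size, so in (log2 x, log10 y) space the
# points lie exactly on a line with slope log10(2) and intercept 1.
sizes = np.array([32, 64, 128, 256, 512])
costs = 10.0 * sizes

x1 = np.log2(sizes)
y1 = np.log10(costs)

# Degree-1 least-squares fit: slope m and intercept c.
m, c = np.polyfit(x1, y1, 1)

# Evaluate the fitted line beyond the observed range for the dashed trend.
x2 = np.arange(5, 20)
y2 = m * x2 + c
```

The fitted `(m, c)` pair is what the script reuses for each system's dashed line before plotting with `pyplot.plot(x2, y2, ...)`.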
| 39.204489 | 80 | 0.691432 | 2,120 | 15,721 | 4.77217 | 0.124057 | 0.167441 | 0.091331 | 0.051893 | 0.804685 | 0.749629 | 0.656025 | 0.53415 | 0.465751 | 0.44875 | 0 | 0.024675 | 0.19827 | 15,721 | 400 | 81 | 39.3025 | 0.778007 | 0.059411 | 0 | 0.463918 | 0 | 0 | 0.098164 | 0.039699 | 0 | 0 | 0 | 0 | 0.109966 | 1 | 0.013746 | false | 0 | 0.024055 | 0 | 0.041237 | 0.003436 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c3a3c79ee99e909db3ca42ca48df254ed85a272 | 193 | py | Python | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-AliZafar120 | 5d5b35d290a3d5072ee6d116ba082eb6c21a2e9e | [
"MIT"
] | null | null | null | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-AliZafar120 | 5d5b35d290a3d5072ee6d116ba082eb6c21a2e9e | [
"MIT"
] | 1 | 2017-11-01T12:25:31.000Z | 2017-11-18T16:25:45.000Z | assignment7/reducer.py | IITDU-BSSE06/ads-demystifying-the-logs-AliZafar120 | 5d5b35d290a3d5072ee6d116ba082eb6c21a2e9e | [
"MIT"
] | null | null | null | #!/usr/bin/python
import sys
req = dict()
for line in sys.stdin:
    line = line.split("\n")[0]
    req[line] = int(req.get(line, 0)) + 1

for key in req.viewkeys():
    print "{0} {1}".format(key, req.get(key))
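The reducer above is Python 2 (`print` statement, `dict.viewkeys()`) and tallies identical lines read from stdin. A Python 3 sketch of the same tally, using `collections.Counter` and a hypothetical in-memory stream so the example is self-contained:

```python
from collections import Counter
import io

# Hypothetical stand-in for sys.stdin, for illustration only.
fake_stdin = io.StringIO("GET /a\nGET /b\nGET /a\n")

# Same count the reducer builds in its req dict.
counts = Counter(line.rstrip("\n") for line in fake_stdin)
for key, n in counts.items():
    print("{0} {1}".format(key, n))
```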
| 19.3 | 41 | 0.65285 | 38 | 193 | 3.315789 | 0.552632 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02907 | 0.108808 | 193 | 9 | 42 | 21.444444 | 0.703488 | 0.082902 | 0 | 0 | 0 | 0 | 0.051136 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c3c71d93bdd96654513b3f074bcd08f2a4fd45e | 4,308 | py | Python | 3-Structural Patterns/4-Composite Pattern/Graphic Example/Python/model.py | Ziang-Lu/Design-Patterns | 7a8167a85456b481aba15d5eee5a64b116b00adc | [
"MIT"
] | 2 | 2020-05-05T05:50:59.000Z | 2021-04-12T14:19:34.000Z | 3-Structural Patterns/4-Composite Pattern/Graphic Example/Python/model.py | Ziang-Lu/Design-Patterns | 7a8167a85456b481aba15d5eee5a64b116b00adc | [
"MIT"
] | null | null | null | 3-Structural Patterns/4-Composite Pattern/Graphic Example/Python/model.py | Ziang-Lu/Design-Patterns | 7a8167a85456b481aba15d5eee5a64b116b00adc | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Model module.
"""

__author__ = 'Ziang Lu'

from abc import ABC, abstractmethod


class Graphic(ABC):
    """
    Abstract Graphic class that works as "Component".

    Note that this implementation uses Design-for-Type-Safety, i.e., defining
    child-related operations only in "Composite"
    """
    __slots__ = ['_name']

    def __init__(self, name: str):
        """
        Constructor with parameter.
        :param name: str
        """
        self._name = name

    @abstractmethod
    def draw(self) -> None:
        """
        Draws this graphic.
        :return: None
        """
        pass

    def _draw_helper(self):
        """
        Protected helper method to draw this graphic.
        :return: None
        """
        print(f'Drawing {self._name}')

    @abstractmethod
    def translate(self, x: int, y: int) -> None:
        """
        Translates this graphic by the given amount of translation.
        :param x: int
        :param y: int
        :return: None
        """
        pass

    def _translate_helper(self, x: int, y: int) -> None:
        """
        Protected helper method to translate this graphic.
        :param x: int
        :param y: int
        :return: None
        """
        print(f'Translating {self._name} by x={x}, y={y}')

    @abstractmethod
    def resize(self, times: float) -> None:
        """
        Resizes this graphic by the given amount of times.
        :param times: float
        :return: None
        """
        pass

    def _resize_helper(self, times: float) -> None:
        """
        Protected helper method to resize this graphic.
        :param times: float
        :return: None
        """
        print(f'Resizing {self._name} by {times} times')


class GraphicComposite(Graphic):
    """
    GraphicComposite class that works as "Composite".

    The "Composite" models nodes with children in the hierarchical structure.
    However, since both "Composite" and "Leaf" inherit from the common super
    class "Component", "Composite" does not need to keep track of its children's
    actual type, but only need to keep track of a collection of the abstract
    "Component" as its contents.

    Without the abstract "Component" super class abstraction, "Component" would
    have to maintain different lists for each kind of element in its contents,
    and would need to provide separate method for each kind of element.
    """
    __slots__ = ['_sub_graphics']

    def __init__(self, name: str):
        """
        Constructor with parameter.
        :param name: str
        """
        super().__init__(name)
        self._sub_graphics = []

    def draw(self):
        print(f'Drawing {self._name} as follows:')
        for sub_graphic in self._sub_graphics:
            sub_graphic.draw()

    def translate(self, x, y):
        print(f'Translating {self._name} as follows:')
        for sub_graphic in self._sub_graphics:
            sub_graphic.translate(x, y)

    def resize(self, times):
        print(f'Resizing {self._name} as follows:')
        for sub_graphic in self._sub_graphics:
            sub_graphic.resize(times)

    def add_graphic(self, graphic: Graphic) -> None:
        """
        Adds the given graphic to the sub-graphics of this graphic composite.
        :param graphic: Graphic
        :return: None
        """
        self._sub_graphics.append(graphic)


class Rectangle(Graphic):
    """
    Rectangle class that works as one kind of "Leaf".
    """

    def __init__(self, name: str):
        """
        Constructor with parameter.
        :param name: str
        """
        super().__init__(name)

    def draw(self):
        super()._draw_helper()

    def translate(self, x, y):
        super()._translate_helper(x, y)

    def resize(self, times):
        super()._resize_helper(times)


class Circle(Graphic):
    """
    Circle class that works as one kind of "Leaf".
    """

    def __init__(self, name: str):
        """
        Constructor with parameter.
        :param name: str
        """
        super().__init__(name)

    def draw(self):
        super()._draw_helper()

    def translate(self, x, y):
        super()._translate_helper(x, y)

    def resize(self, times):
        super()._resize_helper(times)
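The docstrings above describe the Composite pattern but the file defines no client. A minimal self-contained sketch of the same structure showing why it works — the client calls `draw` on a leaf and on a whole composite tree through the one `Graphic` interface — with simplified hypothetical names, and a `canvas` list standing in for the `print` calls so the traversal order is observable:

```python
from abc import ABC, abstractmethod


class Graphic(ABC):
    """Minimal "Component": the uniform interface leaves and composites share."""

    def __init__(self, name):
        self._name = name

    @abstractmethod
    def draw(self, canvas):
        pass


class Shape(Graphic):
    """Minimal "Leaf" (stands in for Rectangle/Circle above)."""

    def draw(self, canvas):
        canvas.append(self._name)


class GraphicComposite(Graphic):
    """Minimal "Composite": holds abstract Graphic children, recurses on them."""

    def __init__(self, name):
        super().__init__(name)
        self._sub_graphics = []

    def add_graphic(self, graphic):
        self._sub_graphics.append(graphic)

    def draw(self, canvas):
        canvas.append(self._name)
        for sub_graphic in self._sub_graphics:
            sub_graphic.draw(canvas)


# Client code: a tree of graphics, drawn through the single Graphic interface.
picture = GraphicComposite('picture')
picture.add_graphic(Shape('rect'))
inner = GraphicComposite('inner')
inner.add_graphic(Shape('circle'))
picture.add_graphic(inner)

canvas = []
picture.draw(canvas)
print(canvas)  # ['picture', 'rect', 'inner', 'circle']
```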
| 25.491124 | 80 | 0.588672 | 510 | 4,308 | 4.805882 | 0.247059 | 0.035904 | 0.0306 | 0.026112 | 0.475724 | 0.351693 | 0.330477 | 0.306814 | 0.283966 | 0.283966 | 0 | 0.000667 | 0.303621 | 4,308 | 168 | 81 | 25.642857 | 0.816333 | 0.385562 | 0 | 0.534483 | 0 | 0 | 0.105091 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.344828 | false | 0.051724 | 0.017241 | 0 | 0.465517 | 0.103448 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
9c55f4d408b06fb1b9626ed8d5dabc05aeef55ec | 654 | py | Python | contrib/bottle.py | bertrandchenal/tanker | b955311dc8f05f8bb3c0b391e169974e5c6a11b2 | [
"0BSD"
] | 1 | 2019-11-12T08:35:10.000Z | 2019-11-12T08:35:10.000Z | contrib/bottle.py | bertrandchenal/tanker | b955311dc8f05f8bb3c0b391e169974e5c6a11b2 | [
"0BSD"
] | 1 | 2019-11-20T09:00:33.000Z | 2019-11-20T09:00:33.000Z | contrib/bottle.py | bertrandchenal/tanker | b955311dc8f05f8bb3c0b391e169974e5c6a11b2 | [
"0BSD"
] | 1 | 2019-11-19T21:53:16.000Z | 2019-11-19T21:53:16.000Z | # Plugin to integrate Tanker with Bottle. It creates a new tanker
# connection on each request (tanker re-use pooled pg connections)
# Usage:
# from bottle import install
# install(TankerPlugin(cfg))
from functools import wraps
from tanker import connect
class TankerPlugin():
'''
Plugin class to add tanker support to a bottle app
'''
name = 'TankerPlugin'
api = 2
def __init__(self, cfg):
self.cfg = cfg
def apply(self, callback, route):
@wraps(callback)
def wrapper(*args, **kwargs):
with connect(self.cfg):
return callback(*args, **kwargs)
return wrapper
| 22.551724 | 66 | 0.643731 | 81 | 654 | 5.148148 | 0.555556 | 0.05036 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002088 | 0.267584 | 654 | 28 | 67 | 23.357143 | 0.868476 | 0.368502 | 0 | 0 | 0 | 0 | 0.030534 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.153846 | 0 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9c5d22e1bc47ca7b281531c0a8cb61f8f51cabf1 | 3,111 | py | Python | FD_wave/receiver_module.py | miaoziemm/Seismic_Forward_Engine | 163a589254389772820744452e406f947e29e085 | [
"MIT"
] | 1 | 2021-11-05T13:17:02.000Z | 2021-11-05T13:17:02.000Z | FD_wave/receiver_module.py | miaoziemm/Seismic_Forward_Engine | 163a589254389772820744452e406f947e29e085 | [
"MIT"
] | null | null | null | FD_wave/receiver_module.py | miaoziemm/Seismic_Forward_Engine | 163a589254389772820744452e406f947e29e085 | [
"MIT"
] | null | null | null | import taichi as ti
import numpy as np

@ti.data_oriented
class receiver:
    def __init__(self, mod, rec_n, nt):
        self.mod = mod
        self.rec_n = rec_n
        self.nt = nt
        self.rec_pos_f = ti.Vector.field(2, dtype=ti.f32, shape=rec_n)
        self.rec_pos_f_0 = ti.Vector.field(2, dtype=ti.f32, shape=rec_n)
        self.rec_pos_i = ti.Vector.field(2, dtype=ti.i32, shape=rec_n)
        self.rec_value = ti.field(dtype=ti.f32, shape=(rec_n, nt))
        self.PM = ti.field(dtype=ti.f32, shape=200)
        self.e = ti.field(dtype=ti.f32, shape=200)

    @ti.kernel
    def rec_init(self, nx: ti.i32, nz: ti.i32):
        for i in self.rec_pos_f:
            self.rec_pos_f_0[i][0] = float(nx) / 2.0 + float(i - self.rec_n / 2) * 0.2
            self.rec_pos_f_0[i][1] = float(2 * nz / 3)
        for i in self.rec_pos_i:
            self.rec_pos_i[i][0] = nx / 2 + i - self.rec_n / 2
            self.rec_pos_i[i][1] = 2 * nz / 3
        for i, j in self.rec_value:
            self.rec_value[i, j] = 0.0
        for i in self.e:
            self.e[i] = ti.random() * 2.0 * 3.1415926

    @ti.kernel
    def rec_dynamic(self, dt: ti.f32, frame: ti.i32, wind_v: ti.f32):
        for i in self.PM:
            self.PM[i] = (8.1 * 10.0 ** (-3.0) * 9.8 ** 2.0) / ((float(i + 1) * 3.1415926 / 100.0) ** 5.0) * \
                ti.exp(-0.74 * (9.8 / (wind_v * float(i + 1) * 3.1415926 / 100.0)) ** 4.0)
        for i in self.rec_pos_f:
            self.rec_pos_f[i] = [0.0, 0.0]
            for j in range(200):
                w = float(j) * 3.1415926 / 100.0
                t = dt * frame
                self.rec_pos_f[i][0] = self.rec_pos_f[i][0] + self.PM[j] * ti.sin(
                    w ** 2.0 / 9.8 * self.rec_pos_f_0[i][0] - w * t + self.e[j])
                self.rec_pos_f[i][1] = self.rec_pos_f[i][1] + self.PM[j] * ti.cos(
                    w ** 2.0 / 9.8 * self.rec_pos_f_0[i][0] - w * t + self.e[j])
            self.rec_pos_f[i][0] = self.rec_pos_f_0[i][0] - self.rec_pos_f[i][0]
            self.rec_pos_f[i][1] = self.rec_pos_f_0[i][1] + self.rec_pos_f[i][1]

    @ti.kernel
    def rec_gather(self, wave: ti.template(), t: ti.i32):
        if self.mod == 'node':
            for i in self.rec_pos_i:
                self.rec_value[i, self.nt - t + 1] = wave[self.rec_pos_i[i][0], self.rec_pos_i[i][1]]
        if self.mod == 'PIC':
            for i in self.rec_pos_f:
                center = ti.Vector([int(self.rec_pos_f[i][0] - 0.5), int(self.rec_pos_f[i][1] - 0.5)])
                for j in range(3):
                    for k in range(3):
                        xy = ti.Vector([center[0] + j - 1, center[1] + k - 1])
                        r = ((xy[0] - self.rec_pos_f[i][0]) ** 2.0 + (xy[1] - self.rec_pos_f[i][1]) ** 2.0) ** 0.5
                        w = (15.0 / (3.1415926 * 2.0 ** 6.0)) * (2.0 - r) ** 3.0
                        self.rec_value[i, self.nt - t + 1] = self.rec_value[i, self.nt - t + 1] + w * wave[xy[0], xy[1]]

    def export(self, arr, path):
        arr_export = arr.to_numpy()
        np.savetxt(path, arr_export)
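The `rec_dynamic` kernel above evaluates what appears to be a Pierson-Moskowitz-style ocean-wave spectrum (constants 8.1e-3 and 0.74, g = 9.8) at 200 discrete angular frequencies, one per loop iteration. The same table can be sketched in plain vectorized numpy; `wind_v = 10.0` m/s here is an assumed example value, not one taken from the original code:

```python
import numpy as np

g = 9.8
wind_v = 10.0  # assumed example wind speed in m/s

# Angular frequencies w = (i+1) * pi / 100 for i in 0..199, as in rec_dynamic.
w = np.arange(1, 201) * np.pi / 100.0

# Spectrum: (8.1e-3 * g^2) / w^5 * exp(-0.74 * (g / (wind_v * w))^4)
pm = (8.1e-3 * g ** 2) / w ** 5 * np.exp(-0.74 * (g / (wind_v * w)) ** 4)
```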
| 42.616438 | 120 | 0.494375 | 578 | 3,111 | 2.49308 | 0.136678 | 0.19431 | 0.215128 | 0.183206 | 0.541291 | 0.498265 | 0.421235 | 0.30118 | 0.25052 | 0.212353 | 0 | 0.09542 | 0.326262 | 3,111 | 72 | 121 | 43.208333 | 0.59208 | 0 | 0 | 0.169492 | 0 | 0 | 0.002252 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084746 | false | 0 | 0.033898 | 0 | 0.135593 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# tests/test_examples.py (gatling-nrl/scikit-fem, BSD-3-Clause)
from unittest import TestCase, main
import numpy as np
class TestEx01(TestCase):
def runTest(self):
import docs.examples.ex01 as ex01
self.assertAlmostEqual(np.max(ex01.x), 0.073657185490792)
class TestEx02(TestCase):
def runTest(self):
import docs.examples.ex02 as ex02
self.assertAlmostEqual(np.max(ex02.x), 0.001217973811129439)
class TestEx03(TestCase):
def runTest(self):
import docs.examples.ex03 as ex03
self.assertAlmostEqual(ex03.L[0], 0.00418289)
class TestEx04(TestCase):
def runTest(self):
import docs.examples.ex04 as ex04
self.assertAlmostEqual(np.max(ex04.vonmises1), 65.3220252757752)
self.assertAlmostEqual(np.max(ex04.vonmises2), 68.68010634444957)
class TestEx05(TestCase):
def runTest(self):
import docs.examples.ex05 as ex05
self.assertAlmostEqual(np.max(ex05.x), 0.93570751751091152)
class TestEx06(TestCase):
def runTest(self):
import docs.examples.ex06 as ex06
self.assertAlmostEqual(np.max(ex06.x), 0.073651530833125131)
class TestEx07(TestCase):
def runTest(self):
import docs.examples.ex07 as ex07
self.assertAlmostEqual(np.max(ex07.x), 0.0737144219329924)
class TestEx08(TestCase):
def runTest(self):
import docs.examples.ex08 as ex08 # noqa
# only run the initialization, nothing to test
class TestEx09(TestCase):
def runTest(self):
import docs.examples.ex09 as ex09
self.assertAlmostEqual(np.max(ex09.x), 0.05528520791811886, places=6)
class TestEx10(TestCase):
def runTest(self):
import docs.examples.ex10 as ex10
self.assertAlmostEqual(np.mean(ex10.x), 0.277931521728906)
class TestEx11(TestCase):
def runTest(self):
import docs.examples.ex11 as ex11
u = ex11.u
ib = ex11.ib
# since the mesh is symmetric, the mean values should equal to zero
self.assertAlmostEqual(np.mean(u[ib.nodal_dofs[2, :]]), 0.0)
self.assertAlmostEqual(np.mean(u[ib.nodal_dofs[1, :]]), 0.0)
class TestEx12(TestCase):
def runTest(self):
import docs.examples.ex12 as ex
self.assertAlmostEqual(ex.area, np.pi, delta=1e-2)
self.assertAlmostEqual(ex.k, 1 / 8 / np.pi, delta=1e-5)
self.assertAlmostEqual(ex.k1, 1 / 4 / np.pi, delta=1e-5)
class TestEx13(TestCase):
def runTest(self):
import docs.examples.ex13 as ex
u = ex.u
A = ex.A
current = ex.current
self.assertAlmostEqual(current['ground'],
-2 * np.log(2) / np.pi,
delta=1e-3)
self.assertAlmostEqual(u @ A @ u,
2 * np.log(2) / np.pi,
delta=1e-3)
class TestEx14(TestCase):
def runTest(self):
import docs.examples.ex14
u = docs.examples.ex14.u
A = docs.examples.ex14.A
self.assertAlmostEqual(u @ A @ u, 8 / 3, delta=1e-2)
class TestEx15(TestCase):
def runTest(self):
import docs.examples.ex15 as ex15
self.assertLess(np.abs(np.max(ex15.x) - 0.1234567), 1e-5)
class TestEx16(TestCase):
def runTest(self):
import docs.examples.ex16 as ex16
self.assertTrue(np.linalg.norm(np.array([0, 2, 6, 12, 20, 30])
- ex16.ks) < 0.4)
self.assertAlmostEqual(ex16.ks[-1], 30.309720458315521)
class TestEx17(TestCase):
def runTest(self):
from docs.examples.ex17 import T0
self.assertAlmostEqual(*T0.values(), 2)
class TestEx18(TestCase):
def runTest(self):
import docs.examples.ex18 as ex # noqa
self.assertAlmostEqual(
(ex.basis["psi"].probes(np.zeros((ex.mesh.dim(), 1))) @ ex.psi)[0],
1 / 64,
3,
)
self.assertLess(
np.linalg.norm(
ex.basis["p"].probes(np.array([[-0.5, 0.0, 0.5], [0.5, 0.5, 0.5]]))
@ ex.pressure
- [-1 / 8, 0, +1 / 8]
),
1e-3,
)
class TestEx19(TestCase):
def runTest(self):
import docs.examples.ex19 as ex # noqa
t, u = next(ex.evolve(0.0, ex.u_init))
self.assertAlmostEqual(*[(ex.probe @ s)[0] for s in [ex.exact(t), u]], 4)
class TestEx20(TestCase):
def runTest(self):
import docs.examples.ex20 as ex
psi0 = ex.psi0
self.assertAlmostEqual(psi0, 1 / 64, delta=1e-3)
class TestEx21(TestCase):
def runTest(self):
import docs.examples.ex21 as ex
x = ex.x
K = ex.K
L = ex.L[0]
self.assertAlmostEqual(L, 50194.51136114997, delta=1)
self.assertAlmostEqual(L, x[:, 0].T @ K @ x[:, 0], 4)
class TestEx22(TestCase):
def runTest(self):
import docs.examples.ex22 as ex
u = ex.u
K = ex.K
self.assertAlmostEqual(u.T @ K @ u, 0.21120183153583372)
class TestEx23(TestCase):
def runTest(self):
import docs.examples.ex23 as ex
self.assertAlmostEqual(max(ex.lmbda_list), ex.turning_point,
delta=5e-5)
class TestEx24(TestCase):
def runTest(self):
import docs.examples.ex24 as ex24 # noqa
self.assertAlmostEqual(min(ex24.vorticity), -0.05171085161096803)
class TestEx25(TestCase):
def runTest(self):
import docs.examples.ex25 as ex25
mu = np.mean(ex25.t)
self.assertAlmostEqual(mu, 0.4642600944590631, places=5)
self.assertAlmostEqual(np.mean(ex25.t0), mu, places=2)
class TestEx26(TestCase):
def runTest(self):
from docs.examples.ex26 import T0
self.assertAlmostEqual(*T0.values(), delta=2e-4)
class TestEx27(TestCase):
def runTest(self):
import docs.examples.ex27 as ex
_, psi = ex.psi.popitem()
self.assertAlmostEqual(min(psi), -0.027043, delta=1e-6)
self.assertAlmostEqual(max(psi), 0.6668, delta=1e-5)
class TestEx28(TestCase):
def runTest(self):
from docs.examples.ex28 import exit_interface_temperature as t
self.assertAlmostEqual(*t.values(), delta=2e-4)
class TestEx29(TestCase):
def runTest(self):
from docs.examples.ex29 import c
wavespeed = tuple(
np.array(sorted(wavespeed, key=np.imag, reverse=True))
for wavespeed in c.values())
self.assertLess(np.linalg.norm(wavespeed[1] - wavespeed[0], np.inf),
5e-3)
class TestEx30(TestCase):
def runTest(self):
from docs.examples.ex30 import psi0
self.assertAlmostEqual(psi0, 0.162/128, delta=1e-6)
class TestEx31(TestCase):
def runTest(self):
from docs.examples.ex31 import L
self.assertAlmostEqual(L[0], 22.597202568397734, delta=1e-6)
class TestEx32(TestCase):
def runTest(self):
from docs.examples.ex32 import l2error_p
self.assertLess(l2error_p, 1e-5)
class TestEx33(TestCase):
def runTest(self):
from docs.examples.ex33 import x
self.assertAlmostEqual(np.max(x), 0.12220233975847579, delta=1e-8)
class TestEx34(TestCase):
def runTest(self):
from docs.examples.ex34 import err
self.assertAlmostEqual(err, 0., delta=1e-13)
class TestEx35(TestCase):
def runTest(self):
from docs.examples.ex35 import Z
# exact value depends also on mesh generation,
# over which we don't have control.
# tolerance is low, but might still break if mesh is slightly different
self.assertAlmostEqual(Z, 52.563390368494424, delta=1e-1)
class TestEx36(TestCase):
def runTest(self):
from docs.examples.ex36 import du, dp, volume_deformed, norm_res
self.assertAlmostEqual(np.linalg.norm(du),
16.530715141106377,
delta=1e-5)
self.assertAlmostEqual(dp[0], -0.5, delta=1.e-8)
self.assertAlmostEqual(volume_deformed, 1., delta=1.e-4)
self.assertAlmostEqual(norm_res, 0., delta=1.e-8)
class TestEx37(TestCase):
def runTest(self):
from docs.examples.ex37 import u
self.assertAlmostEqual(np.max(u), 0.05594193697362236)
class TestEx38(TestCase):
def runTest(self):
from docs.examples.ex38 import l2error
self.assertLess(l2error, 3e-3)
class TestEx39(TestCase):
def runTest(self):
import docs.examples.ex39 as ex # noqa
t, u = next(ex.evolve(0.0, ex.u_init))
self.assertAlmostEqual(*[(ex.probe @ s)[0] for s in [ex.exact(t), u]], 5)
class TestEx40(TestCase):
def runTest(self):
import docs.examples.ex40 as ex
self.assertAlmostEqual(ex.u1.max(), 0.0748, delta=1e-3)
self.assertAlmostEqual(ex.u2.max(), 0.0748, delta=1e-3)
self.assertAlmostEqual(ex.u1.min(), 0.0, delta=3e-3)
self.assertAlmostEqual(ex.u2.min(), 0.0, delta=3e-3)
class TestEx41(TestCase):
def runTest(self):
import docs.examples.ex41 as ex
self.assertAlmostEqual(ex.y.max(), 0.025183404207706196)
self.assertAlmostEqual(ex.y.min(), 0.0)
class TestEx42(TestCase):
def runTest(self):
import docs.examples.ex42 as ex
self.assertAlmostEqual(ex.x.max(), 0.0009824131638261542, delta=1e-5)
class TestEx43(TestCase):
def runTest(self):
import docs.examples.ex43 as ex
self.assertAlmostEqual(ex.u.max(), 0.2466622622014594, delta=1e-8)
class TestEx44(TestCase):
def runTest(self):
import docs.examples.ex44 as ex # noqa
stepper = ex.evolve(0., ex.U)
for itr in range(10):
t, u = next(stepper)
self.assertAlmostEqual(np.sum(u), 11.34, 2)
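The suite above mixes two tolerance styles of `assertAlmostEqual`: `places` (the difference rounded to that many decimal places must be zero) and `delta` (an absolute tolerance); they are mutually exclusive. A small standalone illustration of the distinction:

```python
import unittest

tc = unittest.TestCase()

# places: round(3.14159 - 3.1416, 4) == 0, so this passes.
tc.assertAlmostEqual(3.14159, 3.1416, places=4)

# delta: |3.14159 - 3.15| = 0.00841 <= 0.01, so this passes too.
tc.assertAlmostEqual(3.14159, 3.15, delta=1e-2)

# The same pair fails under places=4, since 0.00841 does not round to zero.
failed = False
try:
    tc.assertAlmostEqual(3.14159, 3.15, places=4)
except AssertionError:
    failed = True
assert failed
```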
# plenum/persistence/db_hash_store.py (WandyLau/indy-plenum, Apache-2.0)
import storage.helper
from ledger.hash_stores.hash_store import HashStore
from plenum.common.constants import KeyValueStorageType, HS_LEVELDB, HS_ROCKSDB
from stp_core.common.log import getlogger
logger = getlogger()
class DbHashStore(HashStore):
def __init__(self, dataDir, fileNamePrefix="", db_type=HS_LEVELDB):
self.dataDir = dataDir
assert db_type == HS_ROCKSDB or db_type == HS_LEVELDB
self.db_type = KeyValueStorageType.Leveldb if db_type == HS_LEVELDB \
else KeyValueStorageType.Rocksdb
self.nodesDb = None
self.leavesDb = None
self._leafCount = 0
self.nodes_db_name = fileNamePrefix + '_merkleNodes'
self.leaves_db_name = fileNamePrefix + '_merkleLeaves'
self.open()
@property
def is_persistent(self) -> bool:
return True
def writeLeaf(self, leafHash):
self.leavesDb.put(str(self.leafCount + 1), leafHash)
self.leafCount += 1
def writeNode(self, node):
start, height, nodeHash = node
seqNo = self.getNodePosition(start, height)
self.nodesDb.put(str(seqNo), nodeHash)
def readLeaf(self, seqNo):
return self._readOne(seqNo, self.leavesDb)
def readNode(self, seqNo):
return self._readOne(seqNo, self.nodesDb)
def _readOne(self, pos, db):
self._validatePos(pos)
try:
# Converting any bytearray to bytes
return bytes(db.get(str(pos)))
except KeyError:
logger.error("{} does not have position {}".format(db, pos))
def readLeafs(self, start, end):
return self._readMultiple(start, end, self.leavesDb)
def readNodes(self, start, end):
return self._readMultiple(start, end, self.nodesDb)
def _readMultiple(self, start, end, db):
"""
Returns a list of hashes with serial numbers between start
and end, both inclusive.
"""
self._validatePos(start, end)
# Converting any bytearray to bytes
return [bytes(db.get(str(pos))) for pos in range(start, end + 1)]
@property
def leafCount(self) -> int:
return self._leafCount
@property
def nodeCount(self) -> int:
return self.nodesDb.size
@leafCount.setter
def leafCount(self, count: int) -> None:
self._leafCount = count
@property
def closed(self):
return (self.nodesDb is None and self.leavesDb is None) \
or \
(self.nodesDb.closed and self.leavesDb.closed)
def open(self):
self.nodesDb = storage.helper.initKeyValueStorage(
self.db_type, self.dataDir, self.nodes_db_name)
self.leavesDb = storage.helper.initKeyValueStorage(
self.db_type, self.dataDir, self.leaves_db_name)
self._leafCount = self.leavesDb.size
def close(self):
self.nodesDb.close()
self.leavesDb.close()
def reset(self) -> bool:
self.nodesDb.reset()
self.leavesDb.reset()
self.leafCount = 0
return True
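The store keys leaves by a 1-based sequence number and converts any `bytearray` returned by the backend to `bytes` on read. A minimal in-memory sketch of that write/read flow, using a hypothetical dict-backed `FakeKV` in place of the real LevelDB/RocksDB wrapper from `storage.helper`:

```python
class FakeKV:
    # Hypothetical stand-in for the key-value storage wrapper.
    def __init__(self):
        self._d = {}
        self.closed = False

    def put(self, key, value):
        self._d[key] = value

    def get(self, key):
        return self._d[key]

    @property
    def size(self):
        return len(self._d)

class LeafStore:
    def __init__(self):
        self.leavesDb = FakeKV()
        self.leafCount = 0

    def writeLeaf(self, leafHash):
        # Leaves are keyed by 1-based sequence number, as in DbHashStore.
        self.leavesDb.put(str(self.leafCount + 1), leafHash)
        self.leafCount += 1

    def readLeaf(self, seqNo):
        # bytes() strips any bytearray wrapper, mirroring _readOne.
        return bytes(self.leavesDb.get(str(seqNo)))

store = LeafStore()
store.writeLeaf(b"h1")
store.writeLeaf(b"h2")
assert store.readLeaf(1) == b"h1" and store.leafCount == 2
```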
# moka/container.py (zhengpingzhou/pymoka, MIT)
from .core import *
def sort_dict(d, sort_by_index=-1, reverse=True, key=None):
if isinstance(d, dict) or isinstance(d, Dict): d = list(d.items())
if key is None: key = lambda x: x[sort_by_index]
return sorted(d, key=key, reverse=reverse)
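By default `sort_dict` sorts a dict's items by value (`sort_by_index=-1`) in descending order. A standalone copy of the plain-dict branch, shown to illustrate that behavior (the `Dict` branch from `moka.core` is omitted):

```python
def sort_dict(d, sort_by_index=-1, reverse=True, key=None):
    if isinstance(d, dict):
        d = list(d.items())
    if key is None:
        key = lambda x: x[sort_by_index]
    return sorted(d, key=key, reverse=reverse)

counts = {"a": 1, "b": 3, "c": 2}
# Default: sort by value, descending.
assert sort_dict(counts) == [("b", 3), ("c", 2), ("a", 1)]
# sort_by_index=0 sorts by key instead; reverse=False makes it ascending.
assert sort_dict(counts, sort_by_index=0, reverse=False) == [("a", 1), ("b", 3), ("c", 2)]
```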
#!/usr/bin/env python
# benchmark/scripts/generate_harness/generate_harness.py (sophiebits/swift, Apache-2.0)
# ===--- generate_harness.py ----------------------------------------------===//
#
# This source file is part of the Swift.org open source project
#
# Copyright (c) 2014 - 2016 Apple Inc. and the Swift project authors
# Licensed under Apache License v2.0 with Runtime Library Exception
#
# See http://swift.org/LICENSE.txt for license information
# See http://swift.org/CONTRIBUTORS.txt for the list of Swift project authors
#
# ===----------------------------------------------------------------------===//
# Generate CMakeLists.txt and utils/main.swift from templates.
import jinja2
import os
import glob
import re
script_dir = os.path.dirname(os.path.realpath(__file__))
perf_dir = os.path.realpath(os.path.join(script_dir, '../..'))
single_source_dir = os.path.join(perf_dir, 'single-source')
multi_source_dir = os.path.join(perf_dir, 'multi-source')
template_map = {
'CMakeLists.txt_template': os.path.join(perf_dir, 'CMakeLists.txt'),
'main.swift_template': os.path.join(perf_dir, 'utils/main.swift')
}
ignored_run_funcs = ["Ackermann", "Fibonacci"]
template_loader = jinja2.FileSystemLoader(searchpath="/")
template_env = jinja2.Environment(loader=template_loader, trim_blocks=True,
lstrip_blocks=True)
if __name__ == '__main__':
# CMakeList single-source
tests = [os.path.basename(x).split('.')[0]
for x in glob.glob(os.path.join(single_source_dir, '*.swift'))]
# CMakeList multi-source
class multi_source_bench(object):
def __init__(self, path):
self.name = os.path.basename(path)
self.files = [x for x in os.listdir(path)
if x.endswith('.swift')]
if os.path.isdir(multi_source_dir):
multisource_benches = [
multi_source_bench(os.path.join(multi_source_dir, x))
for x in os.listdir(multi_source_dir)
if os.path.isdir(os.path.join(multi_source_dir, x))
]
else:
multisource_benches = []
# main.swift imports
imports = sorted(tests + [msb.name for msb in multisource_benches])
# main.swift run functions
def get_run_funcs(filepath):
content = open(filepath).read()
matches = re.findall(r'func run_(.*?)\(', content)
return filter(lambda x: x not in ignored_run_funcs, matches)
def find_run_funcs(dirs):
ret_run_funcs = []
for d in dirs:
for root, _, files in os.walk(d):
for name in filter(lambda x: x.endswith('.swift'), files):
run_funcs = get_run_funcs(os.path.join(root, name))
ret_run_funcs.extend(run_funcs)
return ret_run_funcs
run_funcs = sorted(
[(x, x)
for x in find_run_funcs([single_source_dir, multi_source_dir])],
key=lambda x: x[0]
)
# Replace originals with files generated from templates
for template_file in template_map:
template_path = os.path.join(script_dir, template_file)
template = template_env.get_template(template_path)
print(template_map[template_file])
open(template_map[template_file], 'w').write(
template.render(tests=tests,
multisource_benches=multisource_benches,
imports=imports,
run_funcs=run_funcs)
)
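The benchmark names are discovered by running the regex `func run_(.*?)\(` over every Swift source file. A standalone check of that extraction on a made-up Swift snippet:

```python
import re

# Hypothetical Swift source; the declarations mimic the benchmark files scanned above.
swift_src = """
public func run_Ackermann(N: Int) {}
public func run_DictionarySwap(N: Int) {}
"""

# Non-greedy match pulls out just the name between "run_" and the opening paren.
names = re.findall(r'func run_(.*?)\(', swift_src)
assert names == ['Ackermann', 'DictionarySwap']
```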
# src/cryptography/hazmat/primitives/serialization/base.py (pradyunsg/cryptography, PSF-2.0/Apache-2.0/BSD-3-Clause)
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
import abc
from enum import Enum
from cryptography import utils
from cryptography.hazmat.backends import _get_backend
def load_pem_private_key(data, password, backend=None):
backend = _get_backend(backend)
return backend.load_pem_private_key(data, password)
def load_pem_public_key(data, backend=None):
backend = _get_backend(backend)
return backend.load_pem_public_key(data)
def load_pem_parameters(data, backend=None):
backend = _get_backend(backend)
return backend.load_pem_parameters(data)
def load_der_private_key(data, password, backend=None):
backend = _get_backend(backend)
return backend.load_der_private_key(data, password)
def load_der_public_key(data, backend=None):
backend = _get_backend(backend)
return backend.load_der_public_key(data)
def load_der_parameters(data, backend=None):
backend = _get_backend(backend)
return backend.load_der_parameters(data)
class Encoding(Enum):
PEM = "PEM"
DER = "DER"
OpenSSH = "OpenSSH"
Raw = "Raw"
X962 = "ANSI X9.62"
SMIME = "S/MIME"
class PrivateFormat(Enum):
PKCS8 = "PKCS8"
TraditionalOpenSSL = "TraditionalOpenSSL"
Raw = "Raw"
OpenSSH = "OpenSSH"
class PublicFormat(Enum):
SubjectPublicKeyInfo = "X.509 subjectPublicKeyInfo with PKCS#1"
PKCS1 = "Raw PKCS#1"
OpenSSH = "OpenSSH"
Raw = "Raw"
CompressedPoint = "X9.62 Compressed Point"
UncompressedPoint = "X9.62 Uncompressed Point"
class ParameterFormat(Enum):
PKCS3 = "PKCS3"
class KeySerializationEncryption(metaclass=abc.ABCMeta):
pass
@utils.register_interface(KeySerializationEncryption)
class BestAvailableEncryption(object):
def __init__(self, password):
if not isinstance(password, bytes) or len(password) == 0:
raise ValueError("Password must be 1 or more bytes.")
self.password = password
@utils.register_interface(KeySerializationEncryption)
class NoEncryption(object):
pass
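`utils.register_interface` presumably builds on `ABCMeta` virtual subclassing, which makes `isinstance` checks pass without inheritance. A stdlib-only sketch of the same idea (class names here are illustrative, not the cryptography API):

```python
import abc

class Encryption(metaclass=abc.ABCMeta):
    pass

class PasswordEncryption:
    def __init__(self, password):
        if not isinstance(password, bytes) or len(password) == 0:
            raise ValueError("Password must be 1 or more bytes.")
        self.password = password

# Registering makes PasswordEncryption a virtual subclass: isinstance()
# and issubclass() succeed even though it does not inherit from Encryption.
Encryption.register(PasswordEncryption)
assert isinstance(PasswordEncryption(b"secret"), Encryption)
```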
# coding=utf-8
# tensor2tensor/utils/registry_test.py (sivaramakrishna7/tensor2tensor, Apache-2.0)
# Copyright 2018 The Tensor2Tensor Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for tensor2tensor.registry."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# Dependency imports
from tensor2tensor.utils import modality
from tensor2tensor.utils import registry
from tensor2tensor.utils import t2t_model
import tensorflow as tf
# pylint: disable=unused-variable
class ModelRegistryTest(tf.test.TestCase):
def setUp(self):
registry._reset()
def testT2TModelRegistration(self):
@registry.register_model
class MyModel1(t2t_model.T2TModel):
pass
model = registry.model("my_model1")
self.assertTrue(model is MyModel1)
def testNamedRegistration(self):
@registry.register_model("model2")
class MyModel1(t2t_model.T2TModel):
pass
model = registry.model("model2")
self.assertTrue(model is MyModel1)
def testNonT2TModelRegistration(self):
@registry.register_model
def model_fn():
pass
model = registry.model("model_fn")
self.assertTrue(model is model_fn)
def testUnknownModel(self):
with self.assertRaisesRegexp(LookupError, "never registered"):
registry.model("not_registered")
def testDuplicateRegistration(self):
@registry.register_model
def m1():
pass
with self.assertRaisesRegexp(LookupError, "already registered"):
@registry.register_model("m1")
def m2():
pass
def testListModels(self):
@registry.register_model
def m1():
pass
@registry.register_model
def m2():
pass
self.assertSetEqual(set(["m1", "m2"]), set(registry.list_models()))
def testSnakeCase(self):
convert = registry._convert_camel_to_snake
self.assertEqual("typical_camel_case", convert("TypicalCamelCase"))
self.assertEqual("numbers_fuse2gether", convert("NumbersFuse2gether"))
self.assertEqual("numbers_fuse2_gether", convert("NumbersFuse2Gether"))
self.assertEqual("lstm_seq2_seq", convert("LSTMSeq2Seq"))
self.assertEqual("starts_lower", convert("startsLower"))
self.assertEqual("starts_lower_caps", convert("startsLowerCAPS"))
self.assertEqual("caps_fuse_together", convert("CapsFUSETogether"))
self.assertEqual("startscap", convert("Startscap"))
self.assertEqual("s_tartscap", convert("STartscap"))
class HParamRegistryTest(tf.test.TestCase):
def setUp(self):
registry._reset()
def testHParamSet(self):
@registry.register_hparams
def my_hparams_set():
pass
@registry.register_ranged_hparams
def my_hparams_range(_):
pass
self.assertTrue(registry.hparams("my_hparams_set") is my_hparams_set)
self.assertTrue(
registry.ranged_hparams("my_hparams_range") is my_hparams_range)
def testNamedRegistration(self):
@registry.register_hparams("a")
def my_hparams_set():
pass
@registry.register_ranged_hparams("a")
def my_hparams_range(_):
pass
self.assertTrue(registry.hparams("a") is my_hparams_set)
self.assertTrue(registry.ranged_hparams("a") is my_hparams_range)
def testUnknownHparams(self):
with self.assertRaisesRegexp(LookupError, "never registered"):
registry.hparams("not_registered")
with self.assertRaisesRegexp(LookupError, "never registered"):
registry.ranged_hparams("not_registered")
def testDuplicateRegistration(self):
@registry.register_hparams
def hp1():
pass
with self.assertRaisesRegexp(LookupError, "already registered"):
@registry.register_hparams("hp1")
def hp2():
pass
@registry.register_ranged_hparams
def rhp1(_):
pass
with self.assertRaisesRegexp(LookupError, "already registered"):
@registry.register_ranged_hparams("rhp1")
def rhp2(_):
pass
def testListHparams(self):
@registry.register_hparams
def hp1():
pass
@registry.register_hparams("hp2_named")
def hp2():
pass
@registry.register_ranged_hparams
def rhp1(_):
pass
@registry.register_ranged_hparams("rhp2_named")
def rhp2(_):
pass
self.assertSetEqual(set(["hp1", "hp2_named"]), set(registry.list_hparams()))
self.assertSetEqual(
set(["rhp1", "rhp2_named"]), set(registry.list_ranged_hparams()))
def testRangeSignatureCheck(self):
with self.assertRaisesRegexp(ValueError, "must take a single argument"):
@registry.register_ranged_hparams
def rhp_bad():
pass
with self.assertRaisesRegexp(ValueError, "must take a single argument"):
@registry.register_ranged_hparams
def rhp_bad2(a, b): # pylint: disable=unused-argument
pass
class ModalityRegistryTest(tf.test.TestCase):
def setUp(self):
registry._reset()
def testModalityRegistration(self):
@registry.register_symbol_modality
class MySymbolModality(modality.Modality):
pass
@registry.register_audio_modality
class MyAudioModality(modality.Modality):
pass
@registry.register_image_modality
class MyImageModality(modality.Modality):
pass
@registry.register_class_label_modality
class MyClassLabelModality(modality.Modality):
pass
self.assertTrue(
registry.symbol_modality("my_symbol_modality") is MySymbolModality)
self.assertTrue(
registry.audio_modality("my_audio_modality") is MyAudioModality)
self.assertTrue(
registry.image_modality("my_image_modality") is MyImageModality)
self.assertTrue(
registry.class_label_modality("my_class_label_modality") is
MyClassLabelModality)
def testDefaultNameLookup(self):
@registry.register_symbol_modality("default")
class MyDefaultModality(modality.Modality):
pass
self.assertTrue(registry.symbol_modality() is MyDefaultModality)
def testList(self):
@registry.register_symbol_modality
class MySymbolModality(modality.Modality):
pass
@registry.register_audio_modality
class MyAudioModality(modality.Modality):
pass
@registry.register_image_modality
class MyImageModality(modality.Modality):
pass
@registry.register_class_label_modality
class MyClassLabelModality(modality.Modality):
pass
expected = [
"symbol:my_symbol_modality", "audio:my_audio_modality",
"image:my_image_modality", "class_label:my_class_label_modality"
]
self.assertSetEqual(set(registry.list_modalities()), set(expected))
if __name__ == "__main__":
tf.test.main()
# version.bzl (the80srobot/santa, Apache-2.0)
"""The version for all Santa components."""
SANTA_VERSION = "2021.3"
# tests/gate_data.py (pearcandy/pennylane, Apache-2.0)
"""Convenience gate representations for testing"""
import math
import cmath
import numpy as np
# ========================================================
# fixed gates
# ========================================================
I = np.eye(2)
# Pauli matrices
X = np.array([[0, 1], [1, 0]]) #: Pauli-X matrix
Y = np.array([[0, -1j], [1j, 0]]) #: Pauli-Y matrix
Z = np.array([[1, 0], [0, -1]]) #: Pauli-Z matrix
H = np.array([[1, 1], [1, -1]]) / math.sqrt(2) #: Hadamard gate
# Two qubit gates
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]) #: CNOT gate
SWAP = np.array([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]]) #: SWAP gate
CZ = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, -1]]) #: CZ gate
S = np.array([[1, 0], [0, 1j]]) #: Phase Gate
T = np.array([[1, 0], [0, cmath.exp(1j * np.pi / 4)]]) #: T Gate
# Three qubit gates
CSWAP = np.array(
[
[1, 0, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 1],
]
) #: CSWAP gate
Toffoli = np.diag([1 for i in range(8)])
Toffoli[6:8, 6:8] = np.array([[0, 1], [1, 0]])
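A few standard identities give a quick numerical sanity check on the fixed gates; the block restates the matrices so it runs standalone:

```python
import math
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / math.sqrt(2)
S = np.array([[1, 0], [0, 1j]])
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])

assert np.allclose(X @ X, I)      # Paulis square to the identity
assert np.allclose(H @ X @ H, Z)  # conjugation by Hadamard swaps X and Z
assert np.allclose(T @ T, S)      # T is a fourth root of Z: T^2 = S
```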
# ========================================================
# parametrized gates
# ========================================================
def Rphi(phi):
r"""One-qubit phase shift.
Args:
phi (float): phase shift angle
Returns:
array: unitary 2x2 phase shift matrix
"""
return np.array([[1, 0], [0, cmath.exp(1j * phi)]])
def Rotx(theta):
r"""One-qubit rotation about the x axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 2x2 rotation matrix :math:`e^{-i \sigma_x \theta/2}`
"""
return math.cos(theta / 2) * I + 1j * math.sin(-theta / 2) * X
def Roty(theta):
r"""One-qubit rotation about the y axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 2x2 rotation matrix :math:`e^{-i \sigma_y \theta/2}`
"""
return math.cos(theta / 2) * I + 1j * math.sin(-theta / 2) * Y
def Rotz(theta):
r"""One-qubit rotation about the z axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 2x2 rotation matrix :math:`e^{-i \sigma_z \theta/2}`
"""
return math.cos(theta / 2) * I + 1j * math.sin(-theta / 2) * Z
def Rot3(a, b, c):
r"""Arbitrary one-qubit rotation using three Euler angles.
Args:
a,b,c (float): rotation angles
Returns:
array: unitary 2x2 rotation matrix ``rz(c) @ ry(b) @ rz(a)``
"""
return Rotz(c) @ (Roty(b) @ Rotz(a))
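The single-qubit rotations above can be cross-checked against their closed forms. A standalone sketch (restating `Rotx`/`Rotz` locally so it runs on its own):

```python
import math
import numpy as np

I2 = np.eye(2)
X_ = np.array([[0, 1], [1, 0]])
Z_ = np.array([[1, 0], [0, -1]])

def rotx(t):
    return math.cos(t / 2) * I2 + 1j * math.sin(-t / 2) * X_

def rotz(t):
    return math.cos(t / 2) * I2 + 1j * math.sin(-t / 2) * Z_

theta = 0.37
# R_z(theta) is diagonal with phases e^{-i theta/2}, e^{+i theta/2}.
assert np.allclose(rotz(theta), np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)]))
# A full X rotation by pi gives -iX.
assert np.allclose(rotx(math.pi), -1j * X_)
# Every such rotation is unitary.
assert np.allclose(rotx(theta) @ rotx(theta).conj().T, I2)
```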
def CRotx(theta):
r"""Two-qubit controlled rotation about the x axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 4x4 rotation matrix :math:`|0\rangle\langle 0|\otimes \mathbb{I}+|1\rangle\langle 1|\otimes R_x(\theta)`
"""
return np.array(
[
[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, math.cos(theta / 2), -1j * math.sin(theta / 2)],
[0, 0, -1j * math.sin(theta / 2), math.cos(theta / 2)],
]
)
def CRoty(theta):
r"""Two-qubit controlled rotation about the y axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 4x4 rotation matrix :math:`|0\rangle\langle 0|\otimes \mathbb{I}+|1\rangle\langle 1|\otimes R_y(\theta)`
"""
return np.array(
[
[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, math.cos(theta / 2), -math.sin(theta / 2)],
[0, 0, math.sin(theta / 2), math.cos(theta / 2)],
]
)
def CRotz(theta):
r"""Two-qubit controlled rotation about the z axis.
Args:
theta (float): rotation angle
Returns:
array: unitary 4x4 rotation matrix :math:`|0\rangle\langle 0|\otimes \mathbb{I}+|1\rangle\langle 1|\otimes R_z(\theta)`
"""
return np.array(
[
[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, cmath.exp(-1j * theta / 2), 0],
[0, 0, 0, cmath.exp(1j * theta / 2)],
]
)
def CRot3(a, b, c):
r"""Arbitrary two-qubit controlled rotation using three Euler angles.
Args:
a,b,c (float): rotation angles
Returns:
array: unitary 4x4 rotation matrix :math:`|0\rangle\langle 0|\otimes \mathbb{I}+|1\rangle\langle 1|\otimes R(a,b,c)`
"""
return np.array(
[
[1, 0, 0, 0],
[0, 1, 0, 0],
[
0,
0,
cmath.exp(-1j * (a + c) / 2) * math.cos(b / 2),
-cmath.exp(1j * (a - c) / 2) * math.sin(b / 2),
],
[
0,
0,
cmath.exp(-1j * (a - c) / 2) * math.sin(b / 2),
cmath.exp(1j * (a + c) / 2) * math.cos(b / 2),
],
]
)
def MultiRZ1(theta):
r"""Arbitrary multi Z rotation on one wire.
Args:
theta (float): rotation angle
Returns:
array: the one-wire MultiRZ matrix
"""
return np.array([[np.exp(-1j * theta / 2), 0.0 + 0.0j], [0.0 + 0.0j, np.exp(1j * theta / 2)]])
def MultiRZ2(theta):
r"""Arbitrary multi Z rotation on two wires.
Args:
theta (float): rotation angle
Returns:
array: the two-wire MultiRZ matrix
"""
return np.array(
[
[np.exp(-1j * theta / 2), 0.0 + 0.0j, 0.0 + 0.0j, 0.0 + 0.0j],
[0.0 + 0.0j, np.exp(1j * theta / 2), 0.0 + 0.0j, 0.0 + 0.0j],
[0.0 + 0.0j, 0.0 + 0.0j, np.exp(1j * theta / 2), 0.0 + 0.0j],
[0.0 + 0.0j, 0.0 + 0.0j, 0.0 + 0.0j, np.exp(-1j * theta / 2)],
]
)
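The hard-coded two-wire matrix above is just exp(-i*theta/2 * Z(x)Z); since Z(x)Z is diagonal, the matrix exponential acts entrywise on the diagonal. A standalone sketch verifying the equivalence (local redefinitions with a lowercase name, not the file's own helpers):

```python
import cmath

import numpy as np

Z = np.diag([1.0, -1.0])

def multi_rz2(theta):
    # Z (x) Z is diagonal, so exp(-1j*theta/2 * ZZ) is an entrywise exp.
    signs = np.diag(np.kron(Z, Z))  # [1, -1, -1, 1]
    return np.diag(np.exp(-1j * theta / 2 * signs))

theta = 0.73
# The explicit diagonal written out in MultiRZ2 above.
explicit = np.diag([
    cmath.exp(-1j * theta / 2),
    cmath.exp(1j * theta / 2),
    cmath.exp(1j * theta / 2),
    cmath.exp(-1j * theta / 2),
])
assert np.allclose(multi_rz2(theta), explicit)
```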
# ========================================================
# db_router.py (PriyanshBordia/LNMIIT-Leave-Management, MIT)
# ========================================================
import os
class Router:
"""
A router to control all database operations on models in the
the project.
"""
def db_for_read(self, model, **hints):
ENV = os.getenv('ENV')
if ENV is not None and ENV != '':
return str(os.getenv('ENV'))
else:
return 'default'
def db_for_write(self, model, **hints):
ENV = os.getenv('ENV')
if ENV is not None and ENV != '':
return str(os.getenv('ENV'))
else:
return 'default'
# def allow_relation(self, obj1, obj2, **hints):
# pass
# def allow_migrate(self, db, app_label, **hints):
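A condensed, runnable sketch of the routing rule implemented above. The class name `EnvRouter` is illustrative (not from the project); the behavior matches `db_for_read`/`db_for_write`: a non-empty `ENV` variable selects the database alias, otherwise `'default'` is used.

```python
import os

class EnvRouter:
    """Condensed version of Router above: route reads to the alias named in ENV."""

    def db_for_read(self, model, **hints):
        env = os.getenv('ENV')
        # Empty string and unset both fall back to the default alias.
        return env if env else 'default'

os.environ['ENV'] = 'staging'
assert EnvRouter().db_for_read(None) == 'staging'

os.environ.pop('ENV', None)
assert EnvRouter().db_for_read(None) == 'default'
```

In a Django project the router class is activated through the `DATABASE_ROUTERS` setting, and the returned string must name an alias defined in `DATABASES`.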
# ========================================================
# src/src.py (Zar1na-soft/Assignment1_Python, MIT)
# ========================================================
from pycoingecko import CoinGeckoAPI
cg = CoinGeckoAPI()
answer = cg.get_coins_markets(vs_currency='usd')
print(answer)
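`get_coins_markets` returns a list with one dict per coin. A hedged sketch of post-processing that list offline: the sample rows below stand in for a live response (field names `id`, `symbol`, `current_price` follow CoinGecko's public `/coins/markets` response; the values are made up), so no network call is needed.

```python
# Rows shaped like CoinGecko's /coins/markets response (illustrative values).
markets = [
    {"id": "bitcoin", "symbol": "btc", "current_price": 43000.0},
    {"id": "ethereum", "symbol": "eth", "current_price": 3000.0},
]

# Index prices by coin id for quick lookup.
prices = {row["id"]: row["current_price"] for row in markets}
assert prices["bitcoin"] == 43000.0
```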
# ========================================================
# core/site/views.py (alex-cots/prevention-point, MIT)
# ========================================================
from core.viewsets import ModelViewSet
from core.models import Site
from core.site.serializer import SiteSerializer
class SiteViewSet(ModelViewSet):
"""
API endpoint that allows Site data to be viewed or edited
"""
queryset = Site.objects.all()
    serializer_class = SiteSerializer

# ========================================================
# tests/scoring_engine/unit_test.py (Dieff/scoringengine, MIT)
# ========================================================
from scoring_engine.db import session, delete_db, init_db
from scoring_engine.models.setting import Setting
class UnitTest(object):
def setup(self):
self.session = session
delete_db(self.session)
init_db(self.session)
self.create_default_settings()
def teardown(self):
delete_db(self.session)
self.session.remove()
def create_default_settings(self):
self.session.add(Setting(name='about_page_content', value='example content value'))
self.session.add(Setting(name='welcome_page_content', value='example welcome content <br>here'))
self.session.add(Setting(name='round_time_sleep', value=60))
self.session.add(Setting(name='worker_refresh_time', value=30))
self.session.add(Setting(name='blue_team_update_hostname', value=True))
self.session.add(Setting(name='blue_team_update_port', value=True))
self.session.add(Setting(name='blue_team_update_account_usernames', value=True))
self.session.add(Setting(name='blue_team_update_account_passwords', value=True))
self.session.add(Setting(name='overview_show_round_info', value=True))
self.session.commit()
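The lifecycle above (wipe the database, seed default settings in `setup`, wipe again in `teardown`) can be sketched standalone with an in-memory stand-in for the SQLAlchemy scoped session. `FakeSession` and the tuple rows are illustrative, not the project's API.

```python
# In-memory stand-in for the scoped session used above.
class FakeSession(list):
    def add(self, obj):
        self.append(obj)

    def commit(self):
        pass  # a real session would flush to the database here

    def remove(self):
        self.clear()

session = FakeSession()

# setup: seed defaults so every test starts from a known state
session.add(("round_time_sleep", 60))
session.commit()
assert ("round_time_sleep", 60) in session

# teardown: wipe state so tests stay isolated from each other
session.remove()
assert len(session) == 0
```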
# ========================================================
# 20-fs-ias-lec/groups/07-14-logCtrl/src/logStore/appconn/kotlin_connection.py
# (Kyrus1999/BACnet, MIT)
# ========================================================
from .connection import Function
class KotlinFunction(Function):
"""Connection to the group kotlin to insert and output the chat elements"""
def __init__(self):
super(KotlinFunction, self).__init__()
def insert_data(self, cbor):
"""adds a new chat element as cbor
        @:parameter cbor: the new cbor event to be added
@:returns 1 if successful, -1 if any error occurred
"""
self.insert_event(cbor)
def get_usernames_and_feed_id(self):
"""Get all current usernames with the corresponding feed id
@:returns a list with all Kotlin usernames and the corresponding feed id
"""
return self._handler.get_usernames_and_feed_id()
def get_all_entries_by_feed_id(self, feed_id):
"""Get all elements with the corresponding feed id, thus all events of a user
@:parameter feed_id: the feed id of a user
@:returns a list of all Kotlin entries with the correct feed id
"""
return self._handler.get_all_entries_by_feed_id(feed_id)
def get_all_kotlin_events(self):
"""Get all existing kotlin elements that are in the database
@:returns a list of all Kotlin entries
"""
return self._handler.get_all_kotlin_events()
def get_last_kotlin_event(self):
"""Get only the last added kotlin element
@:returns a only the last Kotlin entry as cbor
"""
return self._handler.get_last_kotlin_event()
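Every public method above is a thin delegation to the database handler inherited from `Function`. A standalone stub of that pattern (all names here are illustrative; the real handler comes from the `Function` base class):

```python
# Fake handler returning canned data instead of querying a log store.
class FakeHandler:
    def get_usernames_and_feed_id(self):
        return [("alice", "feed-1"), ("bob", "feed-2")]

class KotlinFunctionSketch:
    def __init__(self, handler):
        self._handler = handler

    def get_usernames_and_feed_id(self):
        # Each public method simply forwards to the database handler.
        return self._handler.get_usernames_and_feed_id()

conn = KotlinFunctionSketch(FakeHandler())
assert conn.get_usernames_and_feed_id()[0] == ("alice", "feed-1")
```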
# ========================================================
# src/genie/libs/parser/generic/show_platform.py
# (balmasea/genieparser, Apache-2.0)
# ========================================================
# Python
import re
# Metaparser
from genie.metaparser import MetaParser
from genie.metaparser.util.schemaengine import Optional, Any
class ShowVersionSchema(MetaParser):
"""Schema for show version"""
schema = {
'os': str,
'version': str,
Optional('platform'): str,
Optional('model'): str,
}
class ShowVersion(ShowVersionSchema):
"""Parser for show version"""
cli_command = [
'show version',
]
def cli(self, output=None):
if output is None:
output = self.device.execute(self.cli_command[0])
ret_dict = {}
# ********************************************
# * ASA *
# ********************************************
# Cisco Adaptive Security Appliance Software Version 9.8(4)10
asa_os_version_pattern = re.compile(r'^Cisco\s+Adaptive Security Appliance Software Version (?P<version>.+)$')
# Hardware: ASAv, 2048 MB RAM, CPU Xeon E5 series 3491 MHz,
# Hardware: ASA5520, 512 MB RAM, CPU Pentium 4 Celeron 2000 MHz
asa_platform_pattern = re.compile(r'^Hardware:\s+(?P<platform>.*), .*, .*$')
# Model Id: ASAv10
asa_model_pattern = re.compile(r'Model\s+Id\:\s+(?P<model>.+)')
# ********************************************
# * GAIA *
# ********************************************
# Product version Check Point Gaia R80.40
gaia_os_version_pattern = re.compile(r'^Product version Check Point Gaia (?P<version>.*)$')
# ********************************************
# * IOSXE *
# ********************************************
# Cisco IOS Software, IOS-XE Software, Catalyst 4500 L3 Switch Software (cat4500e-UNIVERSALK9-M), Version 03.04.06.SG RELEASE SOFTWARE (fc1)
# Cisco IOS Software, IOS-XE Software, Catalyst L3 Switch Software (CAT3K_CAA-UNIVERSALK9-M), Version 03.06.07E RELEASE SOFTWARE (fc3)
# Cisco IOS XE Software, Version 17.05.01a
# Cisco IOS Software, IOS-XE Software, Catalyst L3 Switch Software (CAT3K_CAA-UNIVERSALK9-M), Version 03.06.07E RELEASE SOFTWARE (fc3)
# Cisco IOS Software [Amsterdam], Virtual XE Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 17.3.1a, RELEASE SOFTWARE (fc3)
# Cisco IOS Software, IOS-XE Software, Catalyst 4500 L3 Switch Software (cat4500e-UNIVERSALK9-M), Version 03.03.02.SG RELEASE SOFTWARE (fc1)
# Cisco IOS Software [Bengaluru], ASR1000 Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 17.5.1a, RELEASE SOFTWARE (fc3)
iosxe_os_version_platform_pattern = re.compile(r'^Cisco IOS.*XE Software(?:.*\((?P<platform>[^\-]+).*\))?,(?: Experimental)? Version (?P<version>[\w\.\(\)\:]+).*$')
# cisco WS-C2940-8TT-S (RC32300) processor (revision H0) with 19868K bytes of memory.
# cisco WS-C3650-48PD (MIPS) processor with 4194304K bytes of physical memory.
# cisco C9500-24Y4C (X86) processor with 2900319K/6147K bytes of memory.
# cisco CSR1000V (VXE) processor (revision VXE) with 715705K/3075K bytes of memory.
iosxe_model_pattern = re.compile(r'^[Cc]isco (?P<model>\S+) \(.*\).* with \S+ bytes of(?: physical)? memory.$')
# Cisco IOS-XE software, Copyright (c) 2005-2017 by cisco Systems, Inc.
iosxe_backup_os_pattern = re.compile(r'^[Cc]isco IOS(?: |-)XE [Ss]oftware.*$')
# Switch Ports Model SW Version SW Image Mode
# ------ ----- ----- ---------- ---------- ----
# * 1 41 C9300-24P 17.07.01 CAT9K_IOSXE INSTALL
iosxe_backup_model_version_pattern = re.compile(r'^\*?\s*\d+\s+\d+\s+(?P<model>[\w\-]+)\s+(?P<version>[\w\-\.]+)\s+\w+\s+\w+$')
# Model Number : C9300-24P
iosxe_backup_model_pattern = re.compile(r'^Model\s+Number\s+\:\s+(?P<model>.+)$')
# ********************************************
# * IOSXR *
# ********************************************
# Cisco IOS XR Software, Version 6.1.4.10I[Default]
# Cisco IOS XR Software, Version 6.2.1.23I[Default]
# Cisco IOS XR Software, Version 6.3.1.15I
# Cisco IOS XR Software, Version 6.4.2[Default]
iosxr_os_version_pattern = re.compile(r'^Cisco IOS XR Software, Version (?P<version>[\w\.]+).*$')
# cisco ASR9K Series (Intel 686 F6M14S4) processor with 6291456K bytes of memory.
# cisco IOS XRv Series (Pentium Celeron Stepping 3) processor with 4193911K bytes of memory.
# cisco IOS-XRv 9000 () processor
# cisco CRS-16/S-B (Intel 686 F6M14S4) processor with 12582912K bytes of memory.
iosxr_platform_pattern = re.compile(r'^^cisco (?P<platform>\S+|IOS(?: |-)XRv ?\d*)(?: Series)? \(.*\) processor.*$')
# ********************************************
# * IOS *
# ********************************************
# Cisco IOS Software, C3750E Software (C3750E-UNIVERSALK9-M), Version 15.2(2)E8, RELEASE SOFTWARE (fc1)
# IOS (tm) C2940 Software (C2940-I6K2L2Q4-M), Version 12.1(22)EA12, RELEASE SOFTWARE (fc1)
# Cisco IOS Software, C2960X Software (C2960X-UNIVERSALK9-M), Version 15.2(2)E7, RELEASE SOFTWARE (fc3)
# Cisco IOS Software, 901 Software (ASR901-UNIVERSALK9-M), Version 15.6(2)SP4, RELEASE SOFTWARE (fc3)
# Cisco IOS Software [Bengaluru], ASR1000 Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 17.5.1a, RELEASE SOFTWARE (fc3)
ios_os_version_platform_pattern = re.compile(r'^(?!.*XE Software.*)(Cisco IOS Software|IOS \(\S+\))(?: \[.*\])?,?\s*(?P<alternate_platform>.+)?\s+Software \((?P<platform>[^\-]+).*\),(?: Experimental)? Version (?P<version>[\w\.\:\(\)]+),?.*$')
# Cisco CISCO1941/K9 (revision 1.0) with 491520K/32768K bytes of memory.
ios_model_pattern = re.compile(r'^[Cc]isco (?P<model>\S+) \(.*\).* with \S+ bytes of(?: physical)? memory.$')
# ********************************************
# * JUNOS *
# ********************************************
# Junos: 18.2R2-S1
junos_os_version_pattern = re.compile(r'^Junos: (?P<version>\S+)$')
# Model: ex4200-24p
junos_model_pattern = re.compile(r'^Model: (?P<model>\S+)$')
# ********************************************
# * NXOS *
# ********************************************
# Cisco Nexus Operating System (NX-OS) Software
nxos_os_pattern = re.compile(r'^.*Nexus Operating System.*$')
# system: version 6.0(2)U6(10)
# NXOS: version 9.3(6uu)I9(1uu) [build 9.3(6)]
nxos_version_pattern = re.compile(r'^(?:system|NXOS):\s+version (?P<version>\S+)(?: \[build (?P<build>.*)\])?$')
# cisco Nexus 3048 Chassis ("48x1GE + 4x10G Supervisor")
# cisco Nexus9000 C9396PX Chassis
nxos_platform_and_model_pattern = re.compile(r'^cisco (?P<platform>Nexus\s?[\d]+) ?(?P<model>\S+)? Chassis.*$')
# ********************************************
# * VIPTELLA *
# ********************************************
# 15.3.3
viptella_os_pattern = re.compile(r'^(?P<version>\d+(?:\.\d+)?(?:\.\d+)?)$')
for line in output.splitlines():
line = line.strip()
# ********************************************
# * ASA *
# ********************************************
# Cisco Adaptive Security Appliance Software Version 9.8(4)10
m = asa_os_version_pattern.match(line)
if m:
ret_dict['os'] = 'asa'
ret_dict['version'] = m.groupdict()['version']
continue
# Hardware: ASAv, 2048 MB RAM, CPU Xeon E5 series 3491 MHz,
# Hardware: ASA5520, 512 MB RAM, CPU Pentium 4 Celeron 2000 MHz
m = asa_platform_pattern.match(line)
if m:
ret_dict['platform'] = m.groupdict()['platform']
continue
# Model Id: ASAv10
m = asa_model_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model'].lower()
continue
# ********************************************
# * GAIA *
# ********************************************
# Product version Check Point Gaia R80.40
m = gaia_os_version_pattern.match(line)
if m:
ret_dict['os'] = 'gaia'
ret_dict['version'] = m.groupdict()['version']
continue
# ********************************************
# * IOSXE *
# ********************************************
# Cisco IOS Software, IOS-XE Software, Catalyst 4500 L3 Switch Software (cat4500e-UNIVERSALK9-M), Version 03.04.06.SG RELEASE SOFTWARE (fc1)
# Cisco IOS Software, IOS-XE Software, Catalyst L3 Switch Software (CAT3K_CAA-UNIVERSALK9-M), Version 03.06.07E RELEASE SOFTWARE (fc3)
# Cisco IOS XE Software, Version 17.05.01a
# Cisco IOS Software, IOS-XE Software, Catalyst L3 Switch Software (CAT3K_CAA-UNIVERSALK9-M), Version 03.06.07E RELEASE SOFTWARE (fc3)
# Cisco IOS Software [Amsterdam], Virtual XE Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 17.3.1a, RELEASE SOFTWARE (fc3)
m = iosxe_os_version_platform_pattern.match(line)
if m:
ret_dict['os'] = 'iosxe'
group = m.groupdict()
if group['platform']:
ret_dict['platform'] = group['platform'].lower()
ret_dict['version'] = group['version']
continue
# cisco WS-C2940-8TT-S (RC32300) processor (revision H0) with 19868K bytes of memory.
# cisco WS-C3650-48PD (MIPS) processor with 4194304K bytes of physical memory.
# cisco C9500-24Y4C (X86) processor with 2900319K/6147K bytes of memory.
m = iosxe_model_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model']
continue
# Cisco IOS-XE software, Copyright (c) 2005-2017 by cisco Systems, Inc.
# Cisco IOS XE software, Copyright (c) 2005-2017 by cisco Systems, Inc.
m = iosxe_backup_os_pattern.match(line)
if m:
ret_dict['os'] = 'iosxe'
continue
# * 1 41 C9300-24P 17.07.01 CAT9K_IOSXE INSTALL
m = iosxe_backup_model_version_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model']
ret_dict['version'] = m.groupdict()['version']
continue
# Model Number : C9300-24P
m = iosxe_backup_model_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model']
continue
# ********************************************
# * IOSXR *
# ********************************************
# Cisco IOS XR Software, Version 6.1.4.10I[Default]
# Cisco IOS XR Software, Version 6.2.1.23I[Default]
# Cisco IOS XR Software, Version 6.3.1.15I
# Cisco IOS XR Software, Version 6.4.2[Default]
m = iosxr_os_version_pattern.match(line)
if m:
ret_dict['os'] = 'iosxr'
ret_dict['version'] = m.groupdict()['version']
continue
# cisco ASR9K Series (Intel 686 F6M14S4) processor with 6291456K bytes of memory.
# cisco IOS XRv Series (Pentium Celeron Stepping 3) processor with 4193911K bytes of memory.
# cisco IOS-XRv 9000 () processor
# cisco CRS-16/S-B (Intel 686 F6M14S4) processor with 12582912K bytes of memory.
m = iosxr_platform_pattern.match(line)
if m:
                ret_dict['platform'] = \
                    re.sub(r'\s|\-', r'', m.groupdict()['platform'].lower())
continue
# ********************************************
# * IOS *
# ********************************************
# Cisco IOS Software, C3750E Software (C3750E-UNIVERSALK9-M), Version 15.2(2)E8, RELEASE SOFTWARE (fc1)
# IOS (tm) C2940 Software (C2940-I6K2L2Q4-M), Version 12.1(22)EA12, RELEASE SOFTWARE (fc1)
# Cisco IOS Software, C2960X Software (C2960X-UNIVERSALK9-M), Version 15.2(2)E7, RELEASE SOFTWARE (fc3)
# Cisco IOS Software, 901 Software (ASR901-UNIVERSALK9-M), Version 15.6(2)SP4, RELEASE SOFTWARE (fc3)
# Cisco IOS Software [Bengaluru], ASR1000 Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 17.5.1a, RELEASE SOFTWARE (fc3)
m = ios_os_version_platform_pattern.match(line)
if m:
group = m.groupdict()
ret_dict['version'] = group['version']
if ret_dict.get('os', None) is None:
ret_dict['os'] = 'ios'
# Clean up platform a bit before adding to ret_dict
if group['platform'].lower().startswith('x86_64_linux') \
and group['alternate_platform']:
group['platform'] = group['alternate_platform']
ret_dict['platform'] = \
re.sub(r'\_(ios).*', r'', group['platform'].lower())
ret_dict['platform'] = \
re.sub(r'cat(\d)\d{3}', r'cat\1k', ret_dict['platform'])
continue
# Cisco CISCO1941/K9 (revision 1.0) with 491520K/32768K bytes of memory.
m = ios_model_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model']
continue
# ********************************************
# * JUNOS *
# ********************************************
# Junos: 18.2R2-S1
m = junos_os_version_pattern.match(line)
if m:
ret_dict['os'] = 'junos'
ret_dict['version'] = m.groupdict()['version']
continue
# Model: ex4200-24p
m = junos_model_pattern.match(line)
if m:
ret_dict['model'] = m.groupdict()['model'].lower()
continue
# ********************************************
# * NXOS *
# ********************************************
# Cisco Nexus Operating System (NX-OS) Software
m = nxos_os_pattern.match(line)
if m:
ret_dict['os'] = 'nxos'
continue
# system: version 6.0(2)U6(10)
# NXOS: version 9.3(6uu)I9(1uu) [build 9.3(6)]
m = nxos_version_pattern.match(line)
if m:
group = m.groupdict()
if group['build']:
ret_dict['version'] = group['build']
else:
ret_dict['version'] = group['version']
continue
# cisco Nexus 3048 Chassis ("48x1GE + 4x10G Supervisor")
# cisco Nexus9000 C9396PX Chassis
m = nxos_platform_and_model_pattern.match(line)
if m:
group = m.groupdict()
ret_dict['platform'] = group['platform'].lower()
ret_dict['platform'] = re.sub(r'nexus\s*(\d)\d{3}', r'n\1k', ret_dict['platform'])
if group['model']:
ret_dict['model'] = group['model']
continue
# ********************************************
# * VIPTELLA *
# ********************************************
# 15.3.3
m = viptella_os_pattern.match(line)
if m:
ret_dict['os'] = 'viptella'
ret_dict['version'] = m.groupdict()['version']
continue
return ret_dict
class UnameSchema(MetaParser):
"""Schema for uname -a"""
schema = {
'os':str,
'hostname': str,
'version':str
}
class Uname(UnameSchema):
"""Parser for uname -a"""
cli_command = [
'uname -a',
]
def cli(self, output=None):
if output is None:
output = self.device.execute(self.cli_command[0])
ret_dict = {}
# Linux example_hostname.cisco.com 4.18.0-240.22.1.el8_3.x86_64 #1 SMP Thu Mar 25 14:36:04 EDT 2021 x86_64 x86_64 x86_64 GNU/Linux
p0 = re.compile(r'^(?P<os>[Ll]inux)\s+(?P<hostname>\S+)\s+(?P<version>\S+).*$')
for line in output.splitlines():
line = line.strip()
# Linux example_hostname.cisco.com 4.18.0-240.22.1.el8_3.x86_64 #1 SMP Thu Mar 25 14:36:04 EDT 2021 x86_64 x86_64 x86_64 GNU/Linux
m = p0.match(line)
if m:
group = m.groupdict()
ret_dict['os'] = group['os'].lower()
ret_dict['hostname'] = group['hostname'].lower()
ret_dict['version'] = group['version']
return ret_dict
class ShowInventorySchema(MetaParser):
"""Schema for show inventory"""
schema = {
'inventory_item_index': {
int: {
'name': str,
'description': str,
Optional('pid'): str,
Optional('vid'): str,
Optional('sn'): str,
}
}
}
class ShowInventory(ShowInventorySchema):
"""Parser for show inventory
"""
cli_command = [
'show inventory',
]
def cli(self, output=None):
if output is None:
output = self.device.execute(self.cli_command[0])
ret_dict = {}
# NAME: "Chassis", DESCR: "Cisco Catalyst Series C9500X-28C8D Chassis"
# Name: "Chassis", DESCR: "ASA 5555-X with SW, 8 GE Data, 1 GE Mgmt"
p1 = re.compile(r'^(?:NAME|Name):\s+\"(?P<name>.+)\",\s+DESCR: \"(?P<description>.+)\"$')
# PID: C9500X-28C8D , VID: V00 , SN: FDO25030SLN
p2 = re.compile(r'^PID:\s*(?P<pid>\S+)\s*,\s+VID:\s+(?P<vid>\S+)?\s*,\s+SN:\s*(?P<sn>\S+)?$')
item_index = 0
for line in output.splitlines():
line = line.strip()
# NAME: "Chassis", DESCR: "Cisco Catalyst Series C9500X-28C8D Chassis"
# Name: "Chassis", DESCR: "ASA 5555-X with SW, 8 GE Data, 1 GE Mgmt"
m = p1.match(line)
if m:
group = m.groupdict()
name_dict = ret_dict.setdefault('inventory_item_index', {})\
.setdefault(item_index, {})
name_dict['name'] = group['name'].lower()
name_dict['description'] = group['description'].lower()
item_index += 1
continue
# PID: C9500X-28C8D , VID: V00 , SN: FDO25030SLN
m = p2.match(line)
if m:
group = m.groupdict()
name_dict['pid'] = group['pid'].replace(' ','')
if group['vid']:
name_dict['vid'] = group['vid'].replace(' ','')
if group['sn']:
name_dict['sn'] = group['sn']
continue
return ret_dict
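Every branch of these parsers follows the same recipe: precompile a named-group pattern, match each stripped output line, and copy the `groupdict()` fields into `ret_dict`. A minimal standalone illustration using a copy of one pattern defined above (`iosxe_os_version_platform_pattern`):

```python
import re

# Copy of iosxe_os_version_platform_pattern from above: named groups
# pull the platform and version out of a single banner line.
pattern = re.compile(
    r'^Cisco IOS.*XE Software(?:.*\((?P<platform>[^\-]+).*\))?,'
    r'(?: Experimental)? Version (?P<version>[\w\.\(\)\:]+).*$'
)

m = pattern.match('Cisco IOS XE Software, Version 17.05.01a')
assert m is not None
assert m.groupdict()['version'] == '17.05.01a'
assert m.groupdict()['platform'] is None  # no parenthesised image name here
```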
# ========================================================
# opencga-client/src/main/python/pyopencga/rest_clients/project_client.py
# (lauralopezreal/opencga-laura, Apache-2.0)
# ========================================================
"""
WARNING: AUTOGENERATED CODE
This code was generated by a tool.
Autogenerated on: 2021-10-19 16:38:18
Manual changes to this file may cause unexpected behavior in your application.
Manual changes to this file will be overwritten if the code is regenerated.
"""
from pyopencga.rest_clients._parent_rest_clients import _ParentRestClient
class Project(_ParentRestClient):
"""
This class contains methods for the 'Projects' webservices
Client version: 2.1.1
PATH: /{apiVersion}/projects
"""
def __init__(self, configuration, token=None, login_handler=None, *args, **kwargs):
super(Project, self).__init__(configuration, token, login_handler, *args, **kwargs)
def create(self, data=None, **options):
"""
Create a new project.
PATH: /{apiVersion}/projects/create
:param dict data: JSON containing the mandatory parameters. (REQUIRED)
"""
return self._post(category='projects', resource='create', data=data, **options)
def search(self, **options):
"""
Search projects.
PATH: /{apiVersion}/projects/search
:param str include: Fields included in the response, whole JSON path
must be provided.
:param str exclude: Fields excluded in the response, whole JSON path
must be provided.
:param int limit: Number of results to be returned.
:param int skip: Number of results to skip.
:param str owner: Owner of the project.
:param str id: Project [user@]project where project can be either the
ID or the alias.
:param str name: Project name.
:param str fqn: Project fqn.
:param str organization: Project organization.
:param str description: Project description.
:param str study: Study id.
:param str creation_date: Creation date. Format: yyyyMMddHHmmss.
Examples: >2018, 2017-2018, <201805.
:param str modification_date: Modification date. Format:
yyyyMMddHHmmss. Examples: >2018, 2017-2018, <201805.
:param str internal_status: Filter by internal status.
:param str attributes: Attributes.
"""
return self._get(category='projects', resource='search', **options)
def aggregation_stats(self, projects, **options):
"""
Fetch catalog project stats.
PATH: /{apiVersion}/projects/{projects}/aggregationStats
:param str projects: Comma separated list of projects [user@]project
up to a maximum of 100. (REQUIRED)
:param bool default: Calculate default stats.
:param str file_fields: List of file fields separated by semicolons,
e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
:param str individual_fields: List of individual fields separated by
semicolons, e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
:param str family_fields: List of family fields separated by
semicolons, e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
:param str sample_fields: List of sample fields separated by
semicolons, e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
:param str cohort_fields: List of cohort fields separated by
semicolons, e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
:param str job_fields: List of job fields separated by semicolons,
e.g.: studies;type. For nested fields use >>, e.g.:
studies>>biotype;type.
"""
return self._get(category='projects', resource='aggregationStats', query_id=projects, **options)
def info(self, projects, **options):
"""
Fetch project information.
PATH: /{apiVersion}/projects/{projects}/info
:param str projects: Comma separated list of projects [user@]project
up to a maximum of 100. (REQUIRED)
:param str include: Fields included in the response, whole JSON path
must be provided.
:param str exclude: Fields excluded in the response, whole JSON path
must be provided.
"""
return self._get(category='projects', resource='info', query_id=projects, **options)
def inc_release(self, project, **options):
"""
Increment current release number in the project.
PATH: /{apiVersion}/projects/{project}/incRelease
:param str project: Project [user@]project where project can be either
the ID or the alias. (REQUIRED)
"""
return self._post(category='projects', resource='incRelease', query_id=project, **options)
def studies(self, project, **options):
"""
Fetch all the studies contained in the project.
PATH: /{apiVersion}/projects/{project}/studies
:param str project: Project [user@]project where project can be either
the ID or the alias. (REQUIRED)
:param str include: Fields included in the response, whole JSON path
must be provided.
:param str exclude: Fields excluded in the response, whole JSON path
must be provided.
:param int limit: Number of results to be returned.
:param int skip: Number of results to skip.
"""
return self._get(category='projects', resource='studies', query_id=project, **options)
def update(self, project, data=None, **options):
"""
Update some project attributes.
PATH: /{apiVersion}/projects/{project}/update
:param dict data: JSON containing the params to be updated. Only
organism fields that were not previously defined can be updated.
(REQUIRED)
:param str project: Project [user@]project where project can be either
the ID or the alias. (REQUIRED)
"""
return self._post(category='projects', resource='update', query_id=project, data=data, **options)
| 40.94 | 105 | 0.640449 | 742 | 6,141 | 5.247978 | 0.233154 | 0.057524 | 0.027735 | 0.027735 | 0.564458 | 0.527478 | 0.474063 | 0.441192 | 0.441192 | 0.441192 | 0 | 0.013085 | 0.265755 | 6,141 | 149 | 106 | 41.214765 | 0.850521 | 0.657222 | 0 | 0 | 1 | 0 | 0.082101 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0 | 0.055556 | 0 | 0.944444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92deb0b8e74468306cfa394357c7576aac50033b | 1,290 | py | Python | objects/nCSCG/rf2/_2d/mesh/cell/types_wrt_metric/base.py | mathischeap/mifem | 3242e253fb01ca205a76568eaac7bbdb99e3f059 | [
"MIT"
] | 1 | 2020-10-14T12:48:35.000Z | 2020-10-14T12:48:35.000Z | objects/nCSCG/rf2/_2d/mesh/cell/types_wrt_metric/base.py | mathischeap/mifem | 3242e253fb01ca205a76568eaac7bbdb99e3f059 | [
"MIT"
] | null | null | null | objects/nCSCG/rf2/_2d/mesh/cell/types_wrt_metric/base.py | mathischeap/mifem | 3242e253fb01ca205a76568eaac7bbdb99e3f059 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
@author: Yi Zhang
@contact: zhangyi_aero@hotmail.com
@time: 2022/05/05 1:06 PM
"""
import sys
if './' not in sys.path: sys.path.append('./')
from screws.freeze.base import FrozenOnly
class _2nCSCG_CellTypeWr2Metric_Base(FrozenOnly):
"""
A base for all cell types w.r.t. metric. For each type of cscg mesh element, we can classify its
sub-cells into different types. These types are all coded here.
"""
@property
def mark(self):
"""
A mark is a key that identifies the cell metric. If the marks of two cells are the same,
they have the same metric; otherwise their metrics differ. A mark is normally a string,
but for a chaotic cell it is an int: the id of the object.
:return:
"""
# noinspection PyUnresolvedReferences
return self._mark_
def __eq__(self, other):
"""We ask that the marks to be the same."""
# The later judge is to make sure we are not comparing it to something else having a mark property
return self.mark == other.mark and other.___IS_2nCSCG_CellTypeWr2Metric___
@property
def ___IS_2nCSCG_CellTypeWr2Metric___(self):
return True
if __name__ == "__main__":
# mpiexec -n 4 python
pass
| 28.666667 | 110 | 0.663566 | 187 | 1,290 | 4.395722 | 0.588235 | 0.083942 | 0.034063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019669 | 0.251163 | 1,290 | 44 | 111 | 29.318182 | 0.831263 | 0.548837 | 0 | 0.142857 | 0 | 0 | 0.024048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0.071429 | 0.142857 | 0.071429 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
92e63b91e85498caec387ab7470dd1d3db40c9b3 | 719 | py | Python | pagetools/src/utils/constants.py | maxnth/PAGETools | 417488730b4826a2a79b5e56f4b06a1939f96fbc | [
"MIT"
] | 4 | 2021-01-15T14:35:33.000Z | 2021-06-14T10:01:32.000Z | pagetools/src/utils/constants.py | maxnth/PAGETools | 417488730b4826a2a79b5e56f4b06a1939f96fbc | [
"MIT"
] | null | null | null | pagetools/src/utils/constants.py | maxnth/PAGETools | 417488730b4826a2a79b5e56f4b06a1939f96fbc | [
"MIT"
] | null | null | null | extractable_regions = ["TextRegion",
"ImageRegion",
"LineDrawingRegion",
"GraphicRegion",
"TableRegion",
"ChartRegion",
"MapRegion",
"SeparatorRegion",
"MathsRegion",
"ChemRegion",
"MusicRegion",
"AdvertRegion",
"NoiseRegion",
"NoiseRegion",
"UnknownRegion",
"CustomRegion",
"TextLine"]
TEXT_COUNT_SUPPORTED_ELEMS = ["TextRegion", "TextLine", "Word"] | 37.842105 | 63 | 0.365786 | 26 | 719 | 9.961538 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.545202 | 719 | 19 | 63 | 37.842105 | 0.792049 | 0 | 0 | 0.111111 | 0 | 0 | 0.302778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92e99c39e4ad7bf8a7dc37b9351ffdd7b3aeeeff | 3,099 | py | Python | venv/lib/python2.7/site-packages/ansible/plugins/__init__.py | haind27/test01 | 7f86c0a33eb0874a6c3f5ff9a923fd0cfc8ef852 | [
"MIT"
] | 37 | 2017-08-15T15:02:43.000Z | 2021-07-23T03:44:31.000Z | venv/lib/python2.7/site-packages/ansible/plugins/__init__.py | haind27/test01 | 7f86c0a33eb0874a6c3f5ff9a923fd0cfc8ef852 | [
"MIT"
] | 12 | 2018-01-10T05:25:25.000Z | 2021-11-28T06:55:48.000Z | venv/lib/python2.7/site-packages/ansible/plugins/__init__.py | haind27/test01 | 7f86c0a33eb0874a6c3f5ff9a923fd0cfc8ef852 | [
"MIT"
] | 49 | 2017-08-15T09:52:13.000Z | 2022-03-21T17:11:54.000Z | # (c) 2012, Daniel Hokka Zakrisson <daniel@hozac.com>
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com> and others
# (c) 2017, Toshio Kuratomi <tkuratomi@ansible.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
from abc import ABCMeta
from ansible import constants as C
from ansible.module_utils.six import with_metaclass, string_types
try:
from __main__ import display
except ImportError:
from ansible.utils.display import Display
display = Display()
# Global so that all instances of a PluginLoader will share the caches
MODULE_CACHE = {}
PATH_CACHE = {}
PLUGIN_PATH_CACHE = {}
def get_plugin_class(obj):
if isinstance(obj, string_types):
return obj.lower().replace('module', '')
else:
return obj.__class__.__name__.lower().replace('module', '')
class AnsiblePlugin(with_metaclass(ABCMeta, object)):
# allow extra passthrough parameters
allow_extras = False
def __init__(self):
self._options = {}
def get_option(self, option, hostvars=None):
if option not in self._options:
option_value = C.config.get_config_value(option, plugin_type=get_plugin_class(self), plugin_name=self._load_name, variables=hostvars)
self.set_option(option, option_value)
return self._options.get(option)
def set_option(self, option, value):
self._options[option] = value
def set_options(self, task_keys=None, var_options=None, direct=None):
'''
Sets the _options attribute with the configuration/keyword information for this plugin
:arg task_keys: Dict with playbook keywords that affect this option
:arg var_options: Dict with 'connection variables'
:arg direct: Dict with 'direct assignment'
'''
self._options = C.config.get_plugin_options(get_plugin_class(self), self._load_name, keys=task_keys, variables=var_options, direct=direct)
# allow extras/wildcards from vars that are not directly consumed in configuration
# this is needed to support things like winrm, which can have extended protocol options we don't directly handle
if self.allow_extras and var_options and '_extras' in var_options:
self.set_option('_extras', var_options['_extras'])
def _check_required(self):
# FIXME: standardize required check based on config
pass
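The lazy option-caching pattern in `AnsiblePlugin.get_option` above can be sketched on its own: the first lookup computes the value (here via a `fetch` callback standing in for `C.config.get_config_value` — the callback is an assumption for illustration) and later lookups hit the cache.

```python
class OptionCache:
    """Hypothetical sketch of lazy, cached option resolution."""
    def __init__(self, fetch):
        self._fetch = fetch      # callable(option) -> value
        self._options = {}

    def get_option(self, option):
        # resolve once, then serve every later lookup from the cache
        if option not in self._options:
            self._options[option] = self._fetch(option)
        return self._options[option]
```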
| 37.337349 | 146 | 0.727009 | 431 | 3,099 | 5.041763 | 0.436195 | 0.027612 | 0.017948 | 0.026231 | 0.037736 | 0.025771 | 0 | 0 | 0 | 0 | 0 | 0.007214 | 0.194902 | 3,099 | 82 | 147 | 37.792683 | 0.863727 | 0.45918 | 0 | 0 | 0 | 0 | 0.020561 | 0 | 0 | 0 | 0 | 0.012195 | 0 | 1 | 0.171429 | false | 0.028571 | 0.2 | 0 | 0.514286 | 0.028571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
92f1c36ae965357cbcace202868b0d5c303cfc8b | 22,752 | py | Python | object_detection/protos/eval_pb2.py | curieuxjy/sign_object_detection | 02a231faf4c6d507366e6627ca339cf466744715 | [
"MIT"
] | 549 | 2020-01-02T05:14:57.000Z | 2022-03-29T18:34:12.000Z | object_detection_api/protos/eval_pb2.py | HuaxingXu/marine_debris_ML | 9b7317c0ad881849ce5688aeb1eb368dfb85d39f | [
"Apache-2.0"
] | 98 | 2020-01-21T09:41:30.000Z | 2022-03-12T00:53:06.000Z | object_detection_api/protos/eval_pb2.py | HuaxingXu/marine_debris_ML | 9b7317c0ad881849ce5688aeb1eb368dfb85d39f | [
"Apache-2.0"
] | 233 | 2020-01-18T03:46:27.000Z | 2022-03-19T03:17:47.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: object_detection/protos/eval.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='object_detection/protos/eval.proto',
package='object_detection.protos',
syntax='proto2',
serialized_pb=_b('\n\"object_detection/protos/eval.proto\x12\x17object_detection.protos\"\xc0\x08\n\nEvalConfig\x12\x15\n\nbatch_size\x18\x19 \x01(\r:\x01\x31\x12\x1e\n\x12num_visualizations\x18\x01 \x01(\r:\x02\x31\x30\x12\x1e\n\x0cnum_examples\x18\x02 \x01(\r:\x04\x35\x30\x30\x30\x42\x02\x18\x01\x12\x1f\n\x12\x65val_interval_secs\x18\x03 \x01(\r:\x03\x33\x30\x30\x12\x18\n\tmax_evals\x18\x04 \x01(\r:\x01\x30\x42\x02\x18\x01\x12\x19\n\nsave_graph\x18\x05 \x01(\x08:\x05\x66\x61lse\x12\"\n\x18visualization_export_dir\x18\x06 \x01(\t:\x00\x12\x15\n\x0b\x65val_master\x18\x07 \x01(\t:\x00\x12\x13\n\x0bmetrics_set\x18\x08 \x03(\t\x12J\n\x14parameterized_metric\x18\x1f \x03(\x0b\x32,.object_detection.protos.ParameterizedMetric\x12\x15\n\x0b\x65xport_path\x18\t \x01(\t:\x00\x12!\n\x12ignore_groundtruth\x18\n \x01(\x08:\x05\x66\x61lse\x12\"\n\x13use_moving_averages\x18\x0b \x01(\x08:\x05\x66\x61lse\x12\"\n\x13\x65val_instance_masks\x18\x0c \x01(\x08:\x05\x66\x61lse\x12 \n\x13min_score_threshold\x18\r \x01(\x02:\x03\x30.5\x12&\n\x1amax_num_boxes_to_visualize\x18\x0e \x01(\x05:\x02\x32\x30\x12\x1a\n\x0bskip_scores\x18\x0f \x01(\x08:\x05\x66\x61lse\x12\x1a\n\x0bskip_labels\x18\x10 \x01(\x08:\x05\x66\x61lse\x12*\n\x1bvisualize_groundtruth_boxes\x18\x11 \x01(\x08:\x05\x66\x61lse\x12\x32\n#groundtruth_box_visualization_color\x18\x12 \x01(\t:\x05\x62lack\x12\x35\n&keep_image_id_for_visualization_export\x18\x13 \x01(\x08:\x05\x66\x61lse\x12$\n\x16retain_original_images\x18\x17 \x01(\x08:\x04true\x12+\n\x1cinclude_metrics_per_category\x18\x18 \x01(\x08:\x05\x66\x61lse\x12\x1d\n\x12recall_lower_bound\x18\x1a \x01(\x02:\x01\x30\x12\x1d\n\x12recall_upper_bound\x18\x1b \x01(\x02:\x01\x31\x12\x38\n)retain_original_image_additional_channels\x18\x1c \x01(\x08:\x05\x66\x61lse\x12\x1e\n\x0f\x66orce_no_resize\x18\x1d \x01(\x08:\x05\x66\x61lse\x12%\n\x16use_dummy_loss_in_eval\x18\x1e \x01(\x08:\x05\x66\x61lse\x12<\n\rkeypoint_edge\x18 
\x03(\x0b\x32%.object_detection.protos.KeypointEdge\"|\n\x13ParameterizedMetric\x12M\n\x15\x63oco_keypoint_metrics\x18\x01 \x01(\x0b\x32,.object_detection.protos.CocoKeypointMetricsH\x00\x42\x16\n\x14parameterized_metric\"\xd3\x01\n\x13\x43ocoKeypointMetrics\x12\x13\n\x0b\x63lass_label\x18\x01 \x01(\t\x12i\n\x18keypoint_label_to_sigmas\x18\x02 \x03(\x0b\x32G.object_detection.protos.CocoKeypointMetrics.KeypointLabelToSigmasEntry\x1a<\n\x1aKeypointLabelToSigmasEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\x02:\x02\x38\x01\"*\n\x0cKeypointEdge\x12\r\n\x05start\x18\x01 \x01(\x05\x12\x0b\n\x03\x65nd\x18\x02 \x01(\x05')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_EVALCONFIG = _descriptor.Descriptor(
name='EvalConfig',
full_name='object_detection.protos.EvalConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='batch_size', full_name='object_detection.protos.EvalConfig.batch_size', index=0,
number=25, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='num_visualizations', full_name='object_detection.protos.EvalConfig.num_visualizations', index=1,
number=1, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=10,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='num_examples', full_name='object_detection.protos.EvalConfig.num_examples', index=2,
number=2, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=5000,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=_descriptor._ParseOptions(descriptor_pb2.FieldOptions(), _b('\030\001'))),
_descriptor.FieldDescriptor(
name='eval_interval_secs', full_name='object_detection.protos.EvalConfig.eval_interval_secs', index=3,
number=3, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=300,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='max_evals', full_name='object_detection.protos.EvalConfig.max_evals', index=4,
number=4, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=_descriptor._ParseOptions(descriptor_pb2.FieldOptions(), _b('\030\001'))),
_descriptor.FieldDescriptor(
name='save_graph', full_name='object_detection.protos.EvalConfig.save_graph', index=5,
number=5, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='visualization_export_dir', full_name='object_detection.protos.EvalConfig.visualization_export_dir', index=6,
number=6, type=9, cpp_type=9, label=1,
has_default_value=True, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='eval_master', full_name='object_detection.protos.EvalConfig.eval_master', index=7,
number=7, type=9, cpp_type=9, label=1,
has_default_value=True, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='metrics_set', full_name='object_detection.protos.EvalConfig.metrics_set', index=8,
number=8, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='parameterized_metric', full_name='object_detection.protos.EvalConfig.parameterized_metric', index=9,
number=31, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='export_path', full_name='object_detection.protos.EvalConfig.export_path', index=10,
number=9, type=9, cpp_type=9, label=1,
has_default_value=True, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ignore_groundtruth', full_name='object_detection.protos.EvalConfig.ignore_groundtruth', index=11,
number=10, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='use_moving_averages', full_name='object_detection.protos.EvalConfig.use_moving_averages', index=12,
number=11, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='eval_instance_masks', full_name='object_detection.protos.EvalConfig.eval_instance_masks', index=13,
number=12, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='min_score_threshold', full_name='object_detection.protos.EvalConfig.min_score_threshold', index=14,
number=13, type=2, cpp_type=6, label=1,
has_default_value=True, default_value=float(0.5),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='max_num_boxes_to_visualize', full_name='object_detection.protos.EvalConfig.max_num_boxes_to_visualize', index=15,
number=14, type=5, cpp_type=1, label=1,
has_default_value=True, default_value=20,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='skip_scores', full_name='object_detection.protos.EvalConfig.skip_scores', index=16,
number=15, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='skip_labels', full_name='object_detection.protos.EvalConfig.skip_labels', index=17,
number=16, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='visualize_groundtruth_boxes', full_name='object_detection.protos.EvalConfig.visualize_groundtruth_boxes', index=18,
number=17, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='groundtruth_box_visualization_color', full_name='object_detection.protos.EvalConfig.groundtruth_box_visualization_color', index=19,
number=18, type=9, cpp_type=9, label=1,
has_default_value=True, default_value=_b("black").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='keep_image_id_for_visualization_export', full_name='object_detection.protos.EvalConfig.keep_image_id_for_visualization_export', index=20,
number=19, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='retain_original_images', full_name='object_detection.protos.EvalConfig.retain_original_images', index=21,
number=23, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=True,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='include_metrics_per_category', full_name='object_detection.protos.EvalConfig.include_metrics_per_category', index=22,
number=24, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='recall_lower_bound', full_name='object_detection.protos.EvalConfig.recall_lower_bound', index=23,
number=26, type=2, cpp_type=6, label=1,
has_default_value=True, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='recall_upper_bound', full_name='object_detection.protos.EvalConfig.recall_upper_bound', index=24,
number=27, type=2, cpp_type=6, label=1,
has_default_value=True, default_value=float(1),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='retain_original_image_additional_channels', full_name='object_detection.protos.EvalConfig.retain_original_image_additional_channels', index=25,
number=28, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='force_no_resize', full_name='object_detection.protos.EvalConfig.force_no_resize', index=26,
number=29, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='use_dummy_loss_in_eval', full_name='object_detection.protos.EvalConfig.use_dummy_loss_in_eval', index=27,
number=30, type=8, cpp_type=7, label=1,
has_default_value=True, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='keypoint_edge', full_name='object_detection.protos.EvalConfig.keypoint_edge', index=28,
number=32, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=64,
serialized_end=1152,
)
_PARAMETERIZEDMETRIC = _descriptor.Descriptor(
name='ParameterizedMetric',
full_name='object_detection.protos.ParameterizedMetric',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='coco_keypoint_metrics', full_name='object_detection.protos.ParameterizedMetric.coco_keypoint_metrics', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='parameterized_metric', full_name='object_detection.protos.ParameterizedMetric.parameterized_metric',
index=0, containing_type=None, fields=[]),
],
serialized_start=1154,
serialized_end=1278,
)
_COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY = _descriptor.Descriptor(
name='KeypointLabelToSigmasEntry',
full_name='object_detection.protos.CocoKeypointMetrics.KeypointLabelToSigmasEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='object_detection.protos.CocoKeypointMetrics.KeypointLabelToSigmasEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='value', full_name='object_detection.protos.CocoKeypointMetrics.KeypointLabelToSigmasEntry.value', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=_descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\001')),
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=1432,
serialized_end=1492,
)
_COCOKEYPOINTMETRICS = _descriptor.Descriptor(
name='CocoKeypointMetrics',
full_name='object_detection.protos.CocoKeypointMetrics',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='class_label', full_name='object_detection.protos.CocoKeypointMetrics.class_label', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='keypoint_label_to_sigmas', full_name='object_detection.protos.CocoKeypointMetrics.keypoint_label_to_sigmas', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[_COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY, ],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=1281,
serialized_end=1492,
)
_KEYPOINTEDGE = _descriptor.Descriptor(
name='KeypointEdge',
full_name='object_detection.protos.KeypointEdge',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='start', full_name='object_detection.protos.KeypointEdge.start', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='end', full_name='object_detection.protos.KeypointEdge.end', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=1494,
serialized_end=1536,
)
_EVALCONFIG.fields_by_name['parameterized_metric'].message_type = _PARAMETERIZEDMETRIC
_EVALCONFIG.fields_by_name['keypoint_edge'].message_type = _KEYPOINTEDGE
_PARAMETERIZEDMETRIC.fields_by_name['coco_keypoint_metrics'].message_type = _COCOKEYPOINTMETRICS
_PARAMETERIZEDMETRIC.oneofs_by_name['parameterized_metric'].fields.append(
_PARAMETERIZEDMETRIC.fields_by_name['coco_keypoint_metrics'])
_PARAMETERIZEDMETRIC.fields_by_name['coco_keypoint_metrics'].containing_oneof = _PARAMETERIZEDMETRIC.oneofs_by_name['parameterized_metric']
_COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY.containing_type = _COCOKEYPOINTMETRICS
_COCOKEYPOINTMETRICS.fields_by_name['keypoint_label_to_sigmas'].message_type = _COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY
DESCRIPTOR.message_types_by_name['EvalConfig'] = _EVALCONFIG
DESCRIPTOR.message_types_by_name['ParameterizedMetric'] = _PARAMETERIZEDMETRIC
DESCRIPTOR.message_types_by_name['CocoKeypointMetrics'] = _COCOKEYPOINTMETRICS
DESCRIPTOR.message_types_by_name['KeypointEdge'] = _KEYPOINTEDGE
EvalConfig = _reflection.GeneratedProtocolMessageType('EvalConfig', (_message.Message,), dict(
DESCRIPTOR = _EVALCONFIG,
__module__ = 'object_detection.protos.eval_pb2'
# @@protoc_insertion_point(class_scope:object_detection.protos.EvalConfig)
))
_sym_db.RegisterMessage(EvalConfig)
ParameterizedMetric = _reflection.GeneratedProtocolMessageType('ParameterizedMetric', (_message.Message,), dict(
DESCRIPTOR = _PARAMETERIZEDMETRIC,
__module__ = 'object_detection.protos.eval_pb2'
# @@protoc_insertion_point(class_scope:object_detection.protos.ParameterizedMetric)
))
_sym_db.RegisterMessage(ParameterizedMetric)
CocoKeypointMetrics = _reflection.GeneratedProtocolMessageType('CocoKeypointMetrics', (_message.Message,), dict(
KeypointLabelToSigmasEntry = _reflection.GeneratedProtocolMessageType('KeypointLabelToSigmasEntry', (_message.Message,), dict(
DESCRIPTOR = _COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY,
__module__ = 'object_detection.protos.eval_pb2'
# @@protoc_insertion_point(class_scope:object_detection.protos.CocoKeypointMetrics.KeypointLabelToSigmasEntry)
))
,
DESCRIPTOR = _COCOKEYPOINTMETRICS,
__module__ = 'object_detection.protos.eval_pb2'
# @@protoc_insertion_point(class_scope:object_detection.protos.CocoKeypointMetrics)
))
_sym_db.RegisterMessage(CocoKeypointMetrics)
_sym_db.RegisterMessage(CocoKeypointMetrics.KeypointLabelToSigmasEntry)
KeypointEdge = _reflection.GeneratedProtocolMessageType('KeypointEdge', (_message.Message,), dict(
DESCRIPTOR = _KEYPOINTEDGE,
__module__ = 'object_detection.protos.eval_pb2'
# @@protoc_insertion_point(class_scope:object_detection.protos.KeypointEdge)
))
_sym_db.RegisterMessage(KeypointEdge)
_EVALCONFIG.fields_by_name['num_examples'].has_options = True
_EVALCONFIG.fields_by_name['num_examples']._options = _descriptor._ParseOptions(descriptor_pb2.FieldOptions(), _b('\030\001'))
_EVALCONFIG.fields_by_name['max_evals'].has_options = True
_EVALCONFIG.fields_by_name['max_evals']._options = _descriptor._ParseOptions(descriptor_pb2.FieldOptions(), _b('\030\001'))
_COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY.has_options = True
_COCOKEYPOINTMETRICS_KEYPOINTLABELTOSIGMASENTRY._options = _descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\001'))
# @@protoc_insertion_point(module_scope)
| 49.568627 | 2,588 | 0.761559 | 2,936 | 22,752 | 5.590259 | 0.100477 | 0.055566 | 0.076768 | 0.065497 | 0.697679 | 0.65375 | 0.566807 | 0.500579 | 0.47834 | 0.463901 | 0 | 0.047287 | 0.117924 | 22,752 | 458 | 2,589 | 49.676856 | 0.770542 | 0.026064 | 0 | 0.613208 | 1 | 0.002358 | 0.282676 | 0.246603 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014151 | 0 | 0.014151 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92f8674a068d1cc1639bb04c3ddfcb19805d844d | 1,616 | py | Python | tests/parity/test_blocks.py | Rothens/ether_sql | fab15a34962735825756a2a2c206e2e13fae00a0 | [
"Apache-2.0"
] | 65 | 2018-05-10T14:15:55.000Z | 2022-03-06T07:33:20.000Z | tests/parity/test_blocks.py | Rothens/ether_sql | fab15a34962735825756a2a2c206e2e13fae00a0 | [
"Apache-2.0"
] | 45 | 2018-05-17T15:11:23.000Z | 2021-09-10T13:09:07.000Z | tests/parity/test_blocks.py | Rothens/ether_sql | fab15a34962735825756a2a2c206e2e13fae00a0 | [
"Apache-2.0"
] | 20 | 2018-06-03T20:20:02.000Z | 2022-03-06T07:32:58.000Z | import pytest
from tests.common_tests.blocks import (
initial_missing_blocks,
final_missing_blocks,
raise_missing_blocks_error,
fill_missing_blocks,
verify_block_range_56160_56170,
verify_state_at_block,
verify_block_56160_contents,
verify_removed_block_range_56160_56170,
)
class TestParityMissingBlocks():
def test_parity_initial_missing_blocks(self, parity_session_missing_blocks):
initial_missing_blocks()
def test_parity_final_missing_blocks(self, parity_session_missing_blocks):
final_missing_blocks()
def test_parity_raise_missing_blocks_error(self, parity_session_missing_blocks):
raise_missing_blocks_error()
def test_parity_fill_missing_blocks(self, parity_session_missing_blocks):
fill_missing_blocks()
@pytest.mark.medium
class TestParityFirst10Blocks():
def test_parity_verify_state_at_block_0(self, parity_session_first_10_blocks):
verify_state_at_block(0)
def test_parity_verify_state_at_block_10(self, parity_session_first_10_blocks):
verify_state_at_block(10)
class TestParityBlocks_56160_56170():
def test_verify_block_range_56160_56170(
self, parity_session_block_range_56160_56170):
verify_block_range_56160_56170()
def test_parity_verify_block_56160_contents(
self, parity_session_block_range_56160_56170):
verify_block_56160_contents()
def test_parity_verify_removed_block_range_56160_56170(
self, parity_session_block_range_56160_56170):
verify_removed_block_range_56160_56170()
# File: server/app/data_handler.py (repo: AlexNaga/rpi-people-counter, license: MIT)
from bson.json_util import dumps, loads
class DataHandler:
@staticmethod
def from_json(json_data):
"""Decode the data from JSON"""
return loads(json_data) # Decode JSON data
@staticmethod
def to_json(data):
        """Encode the data to JSON"""
return dumps(data) # Encode to JSON
@staticmethod
def is_device_found(devices_count):
        """Check whether any device was found"""
return devices_count > 0
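DataHandler is a thin wrapper over `bson.json_util`'s `dumps`/`loads`. Since `bson` (shipped with pymongo) may not be installed, here is the same round-trip pattern sketched with the stdlib `json` module; for plain dicts the behavior is the same, while `json_util` additionally handles MongoDB types such as `ObjectId` and `datetime`:

```python
import json

# Round-trip sketch of the DataHandler pattern, using stdlib json
# in place of bson.json_util.
payload = {"devices": ["aa:bb:cc", "dd:ee:ff"], "count": 2}
encoded = json.dumps(payload)   # what DataHandler.to_json does
decoded = json.loads(encoded)   # what DataHandler.from_json does
assert decoded == payload
assert decoded["count"] > 0     # the is_device_found check
```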
# File: hooks/post_gen_project.py (repo: Rihan9/cookiecutter-homeassistant-component, license: MIT)
import shutil, os
# Rename root repo directory to use `-` instead of `_`.
repo_dir = "{{cookiecutter.domain}}"
new_repo_dir = repo_dir.replace("_", "-")
#shutil.copytree(os.path.join("..", repo_dir, "*"), os.path.join(".."))
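The comment above promises a rename, but the hook only computes `new_repo_dir`; nothing is actually moved. A hedged sketch of what the intended rename could look like — the parent-directory argument is an assumption about where cookiecutter leaves the generated project relative to the hook's working directory, and `rename_repo_dir` is a hypothetical helper, not part of this hook:

```python
import os

def rename_repo_dir(parent, repo_dir):
    """Rename repo_dir under parent, replacing '_' with '-'.

    Sketch of what the hook's comment describes; returns the new name.
    """
    new_repo_dir = repo_dir.replace("_", "-")
    if new_repo_dir != repo_dir:
        os.rename(os.path.join(parent, repo_dir),
                  os.path.join(parent, new_repo_dir))
    return new_repo_dir
```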
#!/usr/local/bin/python
# File: figureskate-judging/src/api/score/score.py (repo: pyohei/test, license: MIT)
# -*- coding: utf-8 -*-
""" Score handler
Score operation handler"""
class Score(object):
def __init__(self, kind):
        """Initialize the score handler.

        The 'kind' argument indicates the judging method; __init__
        decides which scoring module needs to be imported."""
score_dir = kind
score_module = kind.lower()
""" Import only necessary module """
import_module = "score.%s.%s" % (score_dir, score_module)
from_list = ["Calculation, Registration"]
self.main_module = __import__(import_module, fromlist=from_list)
def calc(self, score):
score_cls = self.main_module.Calculation(score)
return score_cls.main()
def register(self, score=1):
score_reg = self.main_module.Registration(score)
score_reg.main()
def catch_total(self):
return self.main_module.Overall()
if __name__ == "__main__":
    hand = Score("Example")  # the original called an undefined Handler(); "Example" is a placeholder kind
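Score relies on `__import__` with a non-empty `fromlist`, which returns the leaf submodule instead of the top-level package, so the result can be bound to `self.main_module` and used directly. A stdlib demonstration of that behavior:

```python
# With no fromlist, __import__ returns the top-level package;
# with a non-empty fromlist, it returns the leaf module itself.
top = __import__("os.path")                      # the 'os' package
leaf = __import__("os.path", fromlist=["join"])  # 'os.path' itself

assert top.__name__ == "os"
assert leaf.__name__ in ("posixpath", "ntpath")  # os.path aliases one of these
assert callable(leaf.join)
```

`importlib.import_module("score.%s.%s" % (score_dir, score_module))` would be the more idiomatic modern equivalent of this pattern.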
# File: milk/supervised/base.py (repo: luispedro/milk, license: MIT)
# -*- coding: utf-8 -*-
# Copyright (C) 2011, Luis Pedro Coelho <luis@luispedro.org>
# vim: set ts=4 sts=4 sw=4 expandtab smartindent:
# License: MIT. See COPYING.MIT file in the milk distribution
from __future__ import division
class supervised_model(object):
def apply_many(self, fs):
'''
labels = model.apply_many( examples )
This is equivalent to ``map(model.apply, examples)`` but may be
implemented in a faster way.
Parameters
----------
examples : sequence of training examples
Returns
-------
labels : sequence of labels
'''
return list(map(self.apply, fs))
class base_adaptor(object):
def __init__(self, base):
self.base = base
def set_option(self, k, v):
self.base.set_option(k, v)
| 24.969697 | 71 | 0.609223 | 106 | 824 | 4.603774 | 0.641509 | 0.04918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013356 | 0.273058 | 824 | 32 | 72 | 25.75 | 0.801336 | 0.521845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.111111 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
13198858cab1654fe852d4f50b906665945b699c | 163 | py | Python | ElectroWeakAnalysis/Skimming/python/dimuonsFilter_cfi.py | SWuchterl/cmssw | 769b4a7ef81796579af7d626da6039dfa0347b8e | [
"Apache-2.0"
] | 6 | 2017-09-08T14:12:56.000Z | 2022-03-09T23:57:01.000Z | ElectroWeakAnalysis/Skimming/python/dimuonsFilter_cfi.py | SWuchterl/cmssw | 769b4a7ef81796579af7d626da6039dfa0347b8e | [
"Apache-2.0"
] | 545 | 2017-09-19T17:10:19.000Z | 2022-03-07T16:55:27.000Z | ElectroWeakAnalysis/Skimming/python/dimuonsFilter_cfi.py | SWuchterl/cmssw | 769b4a7ef81796579af7d626da6039dfa0347b8e | [
"Apache-2.0"
] | 14 | 2017-10-04T09:47:21.000Z | 2019-10-23T18:04:45.000Z | import FWCore.ParameterSet.Config as cms
dimuonsFilter = cms.EDFilter("CandViewCountFilter",
src = cms.InputTag("dimuons"),
minNumber = cms.uint32(1)
)
| 18.111111 | 51 | 0.723926 | 18 | 163 | 6.555556 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021739 | 0.153374 | 163 | 8 | 52 | 20.375 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.161491 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
131dd4ddff0510574534dd9b0580482b3fa72bd6 | 480 | py | Python | meta/utils.py | peterth3/django-meta | 329a466f50dc74038c23cbb93fbe1566a3f3ef31 | [
"BSD-4-Clause"
] | 392 | 2015-01-06T08:08:19.000Z | 2022-03-29T20:33:37.000Z | meta/utils.py | peterth3/django-meta | 329a466f50dc74038c23cbb93fbe1566a3f3ef31 | [
"BSD-4-Clause"
] | 123 | 2015-01-01T11:19:24.000Z | 2022-03-30T17:25:23.000Z | meta/utils.py | peterth3/django-meta | 329a466f50dc74038c23cbb93fbe1566a3f3ef31 | [
"BSD-4-Clause"
] | 64 | 2015-05-27T20:12:32.000Z | 2022-03-15T23:46:48.000Z | import contextlib
try:
from asgiref.local import Local
except ImportError:
from threading import local as Local # noqa: N812
_thread_locals = Local()
@contextlib.contextmanager
def set_request(request):
"""
Context processor that sets the request on the current instance
"""
_thread_locals._request = request
yield
def get_request():
"""
Retrieve request from current instance
"""
return getattr(_thread_locals, "_request", None)
| 20 | 67 | 0.710417 | 56 | 480 | 5.910714 | 0.571429 | 0.108761 | 0.114804 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007958 | 0.214583 | 480 | 23 | 68 | 20.869565 | 0.870027 | 0.2375 | 0 | 0 | 0 | 0 | 0.023881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
13260413a5e5468e7dec5f8a58d383e6cb1f2899 | 26,921 | py | Python | pysnmp-with-texts/RADLAN-DHCP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/RADLAN-DHCP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/RADLAN-DHCP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module RADLAN-DHCP-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/RADLAN-DHCP-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:01:12 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, ConstraintsIntersection, ConstraintsUnion, ValueRangeConstraint, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "ConstraintsIntersection", "ConstraintsUnion", "ValueRangeConstraint", "SingleValueConstraint")
rnd, = mibBuilder.importSymbols("RADLAN-MIB", "rnd")
SnmpAdminString, = mibBuilder.importSymbols("SNMP-FRAMEWORK-MIB", "SnmpAdminString")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
IpAddress, Counter64, ObjectIdentity, Counter32, Gauge32, MibIdentifier, Unsigned32, ModuleIdentity, iso, TimeTicks, NotificationType, Integer32, MibScalar, MibTable, MibTableRow, MibTableColumn, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "IpAddress", "Counter64", "ObjectIdentity", "Counter32", "Gauge32", "MibIdentifier", "Unsigned32", "ModuleIdentity", "iso", "TimeTicks", "NotificationType", "Integer32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Bits")
DisplayString, RowStatus, TruthValue, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "RowStatus", "TruthValue", "TextualConvention")
rsDHCP = ModuleIdentity((1, 3, 6, 1, 4, 1, 89, 38))
rsDHCP.setRevisions(('2003-10-18 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: rsDHCP.setRevisionsDescriptions(('Initial version of this MIB.',))
if mibBuilder.loadTexts: rsDHCP.setLastUpdated('200310180000Z')
if mibBuilder.loadTexts: rsDHCP.setOrganization('')
if mibBuilder.loadTexts: rsDHCP.setContactInfo('')
if mibBuilder.loadTexts: rsDHCP.setDescription('The private MIB module definition for DHCP server support.')
rsDhcpMibVersion = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 14), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rsDhcpMibVersion.setStatus('current')
if mibBuilder.loadTexts: rsDhcpMibVersion.setDescription("DHCP MIB's version, the current version is 4.")
rlDhcpRelayEnable = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 25), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayEnable.setDescription('Enable or disable the use of the DHCP relay.')
rlDhcpRelayExists = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 26), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpRelayExists.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayExists.setDescription('This variable shows whether the device can function as a DHCP Relay Agent.')
rlDhcpRelayNextServerTable = MibTable((1, 3, 6, 1, 4, 1, 89, 38, 27), )
if mibBuilder.loadTexts: rlDhcpRelayNextServerTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerTable.setDescription('The DHCP Relay Next Servers configuration Table')
rlDhcpRelayNextServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 38, 27, 1), ).setIndexNames((0, "RADLAN-DHCP-MIB", "rlDhcpRelayNextServerIpAddr"))
if mibBuilder.loadTexts: rlDhcpRelayNextServerEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerEntry.setDescription('The row definition for this table. DHCP requests are relayed to the specified next server according to their threshold values.')
rlDhcpRelayNextServerIpAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 27, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpRelayNextServerIpAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerIpAddr.setDescription('The IPAddress of the next configuration server. DHCP Server may act as a DHCP relay if this parameter is not equal to 0.0.0.0.')
rlDhcpRelayNextServerSecThreshold = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 27, 1, 2), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayNextServerSecThreshold.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerSecThreshold.setDescription('DHCP requests are relayed only if their SEC field is greater or equal to the threshold value in order to allow local DHCP Servers to answer first.')
rlDhcpRelayNextServerRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 27, 1, 3), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpRelayNextServerRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpRelayNextServerRowStatus.setDescription("This variable displays the validity or invalidity of the entry. Setting it to 'destroy' has the effect of rendering it inoperative. The internal effect (row removal) is implementation dependent.")
rlDhcpSrvEnable = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 30), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvEnable.setDescription('Enable or Disable the use of the DHCP Server.')
rlDhcpSrvExists = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 31), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvExists.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvExists.setDescription('This variable shows whether the device can function as a DHCP Server.')
rlDhcpSrvDbLocation = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 32), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("nvram", 1), ("flash", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDbLocation.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbLocation.setDescription('Describes where DHCP Server database is stored.')
rlDhcpSrvMaxNumOfClients = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 33), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvMaxNumOfClients.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvMaxNumOfClients.setDescription('This variable shows maximum number of clients that can be supported by DHCP Server dynamic allocation.')
rlDhcpSrvDbNumOfActiveEntries = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 34), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDbNumOfActiveEntries.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbNumOfActiveEntries.setDescription('This variable shows number of active entries stored in database.')
rlDhcpSrvDbErase = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 35), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDbErase.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDbErase.setDescription('The value is always false. Setting this variable to true causes erasing all entries in DHCP database.')
rlDhcpSrvProbeEnable = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 36), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeEnable.setDescription('Enable or Disable the use of the DHCP probe before allocating an IP address.')
rlDhcpSrvProbeTimeout = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 37), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(300, 10000))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeTimeout.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeTimeout.setDescription('Indicates the period of time in milliseconds the DHCP probe will wait before issuing a new trial or deciding that no other device on the network has the IP address which DHCP considers allocating.')
rlDhcpSrvProbeRetries = MibScalar((1, 3, 6, 1, 4, 1, 89, 38, 38), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 10))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvProbeRetries.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvProbeRetries.setDescription('Indicates how many times DHCP will probe before deciding that no other device on the network has the IP address which DHCP considers allocating.')
rlDhcpSrvIpAddrTable = MibTable((1, 3, 6, 1, 4, 1, 89, 38, 45), )
if mibBuilder.loadTexts: rlDhcpSrvIpAddrTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrTable.setDescription('Table of IP Addresses allocated by DHCP Server by static and dynamic allocations.')
rlDhcpSrvIpAddrEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 38, 45, 1), ).setIndexNames((0, "RADLAN-DHCP-MIB", "rlDhcpSrvIpAddrIpAddr"))
if mibBuilder.loadTexts: rlDhcpSrvIpAddrEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrEntry.setDescription('The row definition for this table. Parameters of DHCP allocated IP Addresses table.')
rlDhcpSrvIpAddrIpAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 1), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpAddr.setDescription('The IP address that was allocated by DHCP Server.')
rlDhcpSrvIpAddrIpNetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpNetMask.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIpNetMask.setDescription('The subnet mask associated with the IP address of this entry. The value of the mask is an IP address with all the network bits set to 1 and all the hosts bits set to 0.')
rlDhcpSrvIpAddrIdentifier = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 3), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifier.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifier.setDescription('Unique Identifier for client. Either physical address or DHCP Client Identifier.')
rlDhcpSrvIpAddrIdentifierType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("physAddr", 1), ("clientId", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifierType.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrIdentifierType.setDescription('Identifier Type. Either physical address or DHCP Client Identifier.')
rlDhcpSrvIpAddrClnHostName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 5), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrClnHostName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrClnHostName.setDescription('Client Host Name. DHCP Server will use it to update DNS Server. Must be unique per client.')
rlDhcpSrvIpAddrMechanism = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("manual", 1), ("automatic", 2), ("dynamic", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrMechanism.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrMechanism.setDescription('Mechanism of allocation IP Address by DHCP Server. The only value that can be set by user is manual.')
rlDhcpSrvIpAddrAgeTime = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrAgeTime.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrAgeTime.setDescription('Age time of the IP Address.')
rlDhcpSrvIpAddrPoolName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 8), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrPoolName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrPoolName.setDescription('Ip address pool name. A unique name for host pool static allocation, or network pool name in case of dynamic allocation.')
rlDhcpSrvIpAddrConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrConfParamsName.setDescription('This variable points (serves as key) to appropriate set of parameters in the DHCP Server configuration parameters table.')
rlDhcpSrvIpAddrRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 45, 1, 10), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvIpAddrRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvIpAddrRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
rlDhcpSrvDynamicTable = MibTable((1, 3, 6, 1, 4, 1, 89, 38, 46), )
if mibBuilder.loadTexts: rlDhcpSrvDynamicTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicTable.setDescription("The DHCP Dynamic Server's configuration Table")
rlDhcpSrvDynamicEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 38, 46, 1), ).setIndexNames((0, "RADLAN-DHCP-MIB", "rlDhcpSrvDynamicPoolName"))
if mibBuilder.loadTexts: rlDhcpSrvDynamicEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicEntry.setDescription('The row definition for this table. Parameters sent in as a DHCP Reply to DHCP Request with specified indices')
rlDhcpSrvDynamicPoolName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicPoolName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicPoolName.setDescription('The name of DHCP dynamic addresses pool.')
rlDhcpSrvDynamicIpAddrFrom = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrFrom.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrFrom.setDescription('The first IP address allocated in this row.')
rlDhcpSrvDynamicIpAddrTo = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 3), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrTo.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpAddrTo.setDescription('The last IP address allocated in this row.')
rlDhcpSrvDynamicIpNetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 4), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpNetMask.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicIpNetMask.setDescription('The subnet mask associated with the IP addresses of this entry. The value of the mask is an IP address with all the network bits set to 1 and all the hosts bits set to 0.')
rlDhcpSrvDynamicLeaseTime = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 5), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicLeaseTime.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicLeaseTime.setDescription('Maximum lease-time in seconds granted to a requesting DHCP client. For automatic allocation use 0xFFFFFFFF. To exclude addresses from allocation mechanism, set this value to 0.')
rlDhcpSrvDynamicProbeEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 6), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicProbeEnable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicProbeEnable.setDescription('Enable or Disable the use of the DHCP probe before allocating the address.')
rlDhcpSrvDynamicTotalNumOfAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDynamicTotalNumOfAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicTotalNumOfAddr.setDescription('Total number of addresses in space.')
rlDhcpSrvDynamicFreeNumOfAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 8), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlDhcpSrvDynamicFreeNumOfAddr.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicFreeNumOfAddr.setDescription('Free number of addresses in space.')
rlDhcpSrvDynamicConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicConfParamsName.setDescription('This variable points (serves as key) to appropriate set of parameters in the DHCP Server configuration parameters table.')
rlDhcpSrvDynamicRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 46, 1, 10), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvDynamicRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvDynamicRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
rlDhcpSrvConfParamsTable = MibTable((1, 3, 6, 1, 4, 1, 89, 38, 47), )
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTable.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTable.setDescription('The DHCP Configuration Parameters Table')
rlDhcpSrvConfParamsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 38, 47, 1), ).setIndexNames((0, "RADLAN-DHCP-MIB", "rlDhcpSrvConfParamsName"))
if mibBuilder.loadTexts: rlDhcpSrvConfParamsEntry.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsEntry.setDescription('The row definition for this table. Each entry corresponds to one specific parameters set.')
rlDhcpSrvConfParamsName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsName.setDescription('This value is a unique index for the entry in the rlDhcpSrvConfParamsTable.')
rlDhcpSrvConfParamsNextServerIp = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerIp.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerIp.setDescription('The IP of next server for client to use in configuration process.')
rlDhcpSrvConfParamsNextServerName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNextServerName.setDescription('The name of the next server for client to use in configuration process.')
rlDhcpSrvConfParamsBootfileName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsBootfileName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsBootfileName.setDescription('Name of file for client to request from next server.')
rlDhcpSrvConfParamsRoutersList = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 5), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRoutersList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRoutersList.setDescription("The value of option code 3, which defines default routers list. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsTimeSrvList = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 6), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTimeSrvList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsTimeSrvList.setDescription("The value of option code 4, which defines time servers list. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsDnsList = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDnsList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDnsList.setDescription("The value of option code 6, which defines the list of DNSs. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsDomainName = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 8), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDomainName.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsDomainName.setDescription('The value option code 15, which defines the domain name..')
rlDhcpSrvConfParamsNetbiosNameList = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNameList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNameList.setDescription("The value of option code 44, which defines the list of NETBios Name Servers. Each IP address is represented in dotted decimal notation format with ';' between them.")
rlDhcpSrvConfParamsNetbiosNodeType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 4, 8))).clone(namedValues=NamedValues(("b-node", 1), ("p-node", 2), ("m-node", 4), ("h-node", 8)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNodeType.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNetbiosNodeType.setDescription('The value of option code 46, which defines the NETBios node type. The option will be added only if rlDhcpSrvConfParamsNetbiosNameList is not empty.')
rlDhcpSrvConfParamsCommunity = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 11), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsCommunity.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsCommunity.setDescription('The value of site-specific option 128, which defines Community. The option will be added only if rlDhcpSrvConfParamsNmsIp is set.')
rlDhcpSrvConfParamsNmsIp = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 12), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNmsIp.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsNmsIp.setDescription('The value of site-specific option 129, which defines IP of Network Manager.')
rlDhcpSrvConfParamsOptionsList = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 13), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsOptionsList.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsOptionsList.setDescription("The sequence of option segments. Each option segment is represented by a triplet <code/length/value>. The code defines the code of each supported option. The length defines the length of each supported option. The value defines the value of the supported option. If there is a number of elements in the value field, they are divided by ','. Each element of type IP address in value field is represented in dotted decimal notation format.")
rlDhcpSrvConfParamsRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 38, 47, 1, 14), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRowStatus.setStatus('current')
if mibBuilder.loadTexts: rlDhcpSrvConfParamsRowStatus.setDescription('The row status variable, used according to row installation and removal conventions.')
mibBuilder.exportSymbols("RADLAN-DHCP-MIB", rlDhcpSrvDynamicRowStatus=rlDhcpSrvDynamicRowStatus, rlDhcpRelayEnable=rlDhcpRelayEnable, rlDhcpSrvConfParamsNetbiosNameList=rlDhcpSrvConfParamsNetbiosNameList, rlDhcpSrvDbErase=rlDhcpSrvDbErase, rlDhcpSrvConfParamsOptionsList=rlDhcpSrvConfParamsOptionsList, rlDhcpSrvIpAddrIdentifier=rlDhcpSrvIpAddrIdentifier, rlDhcpSrvConfParamsRoutersList=rlDhcpSrvConfParamsRoutersList, rlDhcpSrvDynamicLeaseTime=rlDhcpSrvDynamicLeaseTime, rlDhcpSrvDynamicTotalNumOfAddr=rlDhcpSrvDynamicTotalNumOfAddr, rlDhcpSrvDynamicIpAddrTo=rlDhcpSrvDynamicIpAddrTo, rlDhcpSrvIpAddrPoolName=rlDhcpSrvIpAddrPoolName, rlDhcpSrvConfParamsDomainName=rlDhcpSrvConfParamsDomainName, rlDhcpSrvIpAddrAgeTime=rlDhcpSrvIpAddrAgeTime, rlDhcpRelayNextServerSecThreshold=rlDhcpRelayNextServerSecThreshold, rlDhcpSrvConfParamsNextServerName=rlDhcpSrvConfParamsNextServerName, rlDhcpSrvDynamicConfParamsName=rlDhcpSrvDynamicConfParamsName, rlDhcpSrvConfParamsBootfileName=rlDhcpSrvConfParamsBootfileName, rlDhcpSrvDbLocation=rlDhcpSrvDbLocation, rlDhcpRelayNextServerTable=rlDhcpRelayNextServerTable, rlDhcpSrvIpAddrEntry=rlDhcpSrvIpAddrEntry, rlDhcpSrvIpAddrIpNetMask=rlDhcpSrvIpAddrIpNetMask, rlDhcpSrvIpAddrIdentifierType=rlDhcpSrvIpAddrIdentifierType, rlDhcpSrvConfParamsCommunity=rlDhcpSrvConfParamsCommunity, rlDhcpSrvDynamicEntry=rlDhcpSrvDynamicEntry, PYSNMP_MODULE_ID=rsDHCP, rsDHCP=rsDHCP, rlDhcpSrvIpAddrRowStatus=rlDhcpSrvIpAddrRowStatus, rlDhcpSrvConfParamsTimeSrvList=rlDhcpSrvConfParamsTimeSrvList, rlDhcpSrvConfParamsName=rlDhcpSrvConfParamsName, rlDhcpSrvConfParamsNextServerIp=rlDhcpSrvConfParamsNextServerIp, rlDhcpSrvMaxNumOfClients=rlDhcpSrvMaxNumOfClients, rlDhcpSrvEnable=rlDhcpSrvEnable, rsDhcpMibVersion=rsDhcpMibVersion, rlDhcpSrvConfParamsEntry=rlDhcpSrvConfParamsEntry, rlDhcpSrvDynamicFreeNumOfAddr=rlDhcpSrvDynamicFreeNumOfAddr, rlDhcpRelayNextServerEntry=rlDhcpRelayNextServerEntry, rlDhcpSrvIpAddrMechanism=rlDhcpSrvIpAddrMechanism, 
rlDhcpRelayNextServerIpAddr=rlDhcpRelayNextServerIpAddr, rlDhcpRelayNextServerRowStatus=rlDhcpRelayNextServerRowStatus, rlDhcpSrvDynamicIpAddrFrom=rlDhcpSrvDynamicIpAddrFrom, rlDhcpSrvProbeRetries=rlDhcpSrvProbeRetries, rlDhcpSrvDynamicPoolName=rlDhcpSrvDynamicPoolName, rlDhcpSrvDbNumOfActiveEntries=rlDhcpSrvDbNumOfActiveEntries, rlDhcpSrvIpAddrClnHostName=rlDhcpSrvIpAddrClnHostName, rlDhcpSrvDynamicTable=rlDhcpSrvDynamicTable, rlDhcpSrvConfParamsTable=rlDhcpSrvConfParamsTable, rlDhcpSrvDynamicProbeEnable=rlDhcpSrvDynamicProbeEnable, rlDhcpSrvProbeTimeout=rlDhcpSrvProbeTimeout, rlDhcpSrvDynamicIpNetMask=rlDhcpSrvDynamicIpNetMask, rlDhcpSrvConfParamsNetbiosNodeType=rlDhcpSrvConfParamsNetbiosNodeType, rlDhcpSrvIpAddrIpAddr=rlDhcpSrvIpAddrIpAddr, rlDhcpRelayExists=rlDhcpRelayExists, rlDhcpSrvExists=rlDhcpSrvExists, rlDhcpSrvIpAddrTable=rlDhcpSrvIpAddrTable, rlDhcpSrvConfParamsRowStatus=rlDhcpSrvConfParamsRowStatus, rlDhcpSrvConfParamsNmsIp=rlDhcpSrvConfParamsNmsIp, rlDhcpSrvIpAddrConfParamsName=rlDhcpSrvIpAddrConfParamsName, rlDhcpSrvConfParamsDnsList=rlDhcpSrvConfParamsDnsList, rlDhcpSrvProbeEnable=rlDhcpSrvProbeEnable)
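The `*List` column descriptions above specify a small wire format: dotted-decimal IP addresses joined by `';'` inside one DisplayString. A minimal round-trip helper for that format (a hypothetical utility, not part of the generated MIB module):

```python
# Hypothetical helpers for the ';'-separated IP-list DisplayString format
# described by rlDhcpSrvConfParamsRoutersList / TimeSrvList / DnsList.

def parse_ip_list(value):
    """Split a ';'-separated DisplayString into individual IP strings."""
    return [ip for ip in value.split(';') if ip]

def format_ip_list(ips):
    """Join IP strings back into the ';'-separated DisplayString form."""
    return ';'.join(ips)
```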
# File: rewards/views/admin_views.py (mohamed17717/Like-Reddit-Backend-Clone, MIT)
from rest_framework.generics import ListAPIView
from rest_framework.permissions import IsAdminUser
from rewards.models import UserKarma
from rewards.serializers import UserKarmaSerializer
class UserKarma_List_ApiView(ListAPIView):
queryset = UserKarma.objects.all()
permission_classes = [IsAdminUser]
serializer_class = UserKarmaSerializer
# File: test/nomics_test.py (nlnsaoadc/py-nomics, MIT)
from unittest import TestCase, mock
from nomics.nomics import KeyTypeError, Nomics
class NomicsTestCase(TestCase):
def setUp(self):
self.api = Nomics(key="123test", paid_plans=True)
self.api_unpaid_plans = Nomics(key="123test", paid_plans=False)
@mock.patch(
"requests.get", return_value=mock.Mock(status_code=200, json=lambda: {})
)
def test_get(self, mock_get):
self.api._get("test")
mock_get.assert_called_once_with(
url="https://api.nomics.com/test",
params={"key": "123test"},
)
@mock.patch("nomics.nomics.logger.warning")
@mock.patch(
"requests.get",
return_value=mock.Mock(
status_code=404,
json=lambda: {"message": "Not Found"},
content=b"404 Not Found Message",
),
)
def test_get_404_status(self, mock_get, mock_log):
with self.assertRaises(Exception) as context:
self.api._get("test")
self.assertEqual(
"404 404 Not Found Message",
str(context.exception),
)
mock_log.assert_called_once()
@mock.patch("nomics.nomics.logger.info")
@mock.patch(
"requests.get",
return_value=mock.Mock(
status_code=404,
json=mock.Mock(side_effect=Exception("")),
content=b"404 Not Found Message",
),
)
def test_get_404_status_fail_silently(self, mock_get, mock_log):
self.api.fail_silently = True
self.assertEqual(self.api._get("test"), None)
mock_log.assert_called_once()
@mock.patch("nomics.nomics.logger.error")
@mock.patch("nomics.nomics.Nomics._get")
def test_wrong_key_type(self, mock_get, mock_log):
try:
self.api_unpaid_plans.get_exchange_markets_ticker()
except KeyTypeError as error:
self.assertEqual(type(str(error)), str)
mock_log.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currencies_ticker(self, mock_get):
self.api.get_currencies_ticker(ids=[""], interval=[""], convert="USD")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currencies_sparkline(self, mock_get):
self.api.get_currencies_sparkline(ids="", start="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_market(self, mock_get):
self.api.get_market()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_marketcap_history(self, mock_get):
self.api.get_marketcap_history(start="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_global_volume_history(self, mock_get):
self.api.get_global_volume_history()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_rates(self, mock_get):
self.api.get_exchange_rates()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_rates_history(self, mock_get):
self.api.get_exchange_rates_history()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_global_ticker(self, mock_get):
self.api.get_global_ticker(convert=True)
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currency_highlights(self, mock_get):
self.api.get_currency_highlights(currency="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_supply_history(self, mock_get):
self.api.get_supply_history(currency="", start="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_highlights(self, mock_get):
self.api.get_exchange_highlights(exchange="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchanges_ticker(self, mock_get):
self.api.get_exchanges_ticker()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchanges_volume_history(self, mock_get):
self.api.get_exchanges_volume_history(exchange="", start="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_metadata(self, mock_get):
self.api.get_exchange_metadata()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_market_highlights(self, mock_get):
self.api.get_market_highlights(base="", quote="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_markets_ticker(self, mock_get):
self.api.get_exchange_markets_ticker()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_aggregated_ohlcv_candles(self, mock_get):
self.api.get_aggregated_ohlcv_candles(interval="", currency="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_exchange_ohlcv_candles(self, mock_get):
self.api.get_exchange_ohlcv_candles(interval="", exchange="", market="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_aggregated_pair_ohlcv_candles(self, mock_get):
self.api.get_aggregated_pair_ohlcv_candles(
interval="", base="", quote=""
)
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_trades(self, mock_get):
self.api.get_trades(exchange="", market="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_order_book_snapshot(self, mock_get):
self.api.get_order_book_snapshot(exchange="", market="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_order_book_batches(self, mock_get):
self.api.get_order_book_batches(exchange="", market="")
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currency_predictions_ticker(self, mock_get):
self.api.get_currency_predictions_ticker()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currency_predictions_history(self, mock_get):
self.api.get_currency_predictions_history()
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currencies(self, mock_get):
self.api.get_currencies(ids=[""], attributes=[""])
mock_get.assert_called_once()
@mock.patch("nomics.nomics.Nomics._get")
def test_get_currencies_predictions_ticket(self, mock_get):
self.api.get_currencies_predictions_ticket(ids=[""])
mock_get.assert_called_once()
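Every test above follows the same pattern: patch `Nomics._get`, call the public method, then assert on the recorded call. A self-contained sketch of that `unittest.mock` pattern, using a made-up `Service` class in place of `Nomics`:

```python
# `Service` is a hypothetical stand-in: the public method delegates to a
# private transport method, which the test replaces with a mock so no
# network traffic happens.
from unittest import mock

class Service:
    def _get(self, path):
        raise RuntimeError("network disabled in tests")

    def get_market(self):
        return self._get("market")

with mock.patch.object(Service, "_get", return_value={}) as mock_get:
    assert Service().get_market() == {}
    mock_get.assert_called_once_with("market")
```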
# File: edit_name_db.py (nta-byte/vietnamese-namedb, MIT)
import json
from unicode_utils import *
_Unicode = [
u'â',u'Â',u'ă',u'Ă',u'đ',u'Đ',u'ê',u'Ê',u'ô',u'Ô',u'ơ',u'Ơ',u'ư',u'Ư',u'á',u'Á',u'à',u'À',u'ả',u'Ả',u'ã',u'Ã',u'ạ',u'Ạ',
u'ấ',u'Ấ',u'ầ',u'Ầ',u'ẩ',u'Ẩ',u'ẫ',u'Ẫ',u'ậ',u'Ậ',u'ắ',u'Ắ',u'ằ',u'Ằ',u'ẳ',u'Ẳ',u'ẵ',u'Ẵ',u'ặ',u'Ặ',
u'é',u'É',u'è',u'È',u'ẻ',u'Ẻ',u'ẽ',u'Ẽ',u'ẹ',u'Ẹ',u'ế',u'Ế',u'ề',u'Ề',u'ể',u'Ể',u'ễ',u'Ễ',u'ệ',u'Ệ',u'í',u'Í',u'ì',u'Ì',u'ỉ',u'Ỉ',u'ĩ',u'Ĩ',u'ị',u'Ị',
u'ó',u'Ó',u'ò',u'Ò',u'ỏ',u'Ỏ',u'õ',u'Õ',u'ọ',u'Ọ',u'ố',u'Ố',u'ồ',u'Ồ',u'ổ',u'Ổ',u'ỗ',u'Ỗ',u'ộ',u'Ộ',u'ớ',u'Ớ',u'ờ',u'Ờ',u'ở',u'Ở',u'ỡ',u'Ỡ',u'ợ',u'Ợ',
u'ú',u'Ú',u'ù',u'Ù',u'ủ',u'Ủ',u'ũ',u'Ũ',u'ụ',u'Ụ',u'ứ',u'Ứ',u'ừ',u'Ừ',u'ử',u'Ử',u'ữ',u'Ữ',u'ự',u'Ự',u'ỳ',u'Ỳ',u'ỷ',u'Ỷ',u'ỹ',u'Ỹ',u'ỵ',u'Ỵ',u'ý',u'Ý'
]
_TCVN3 = [
u'©',u'¢',u'¨',u'¡',u'®',u'§',u'ª',u'£',u'«',u'¤',u'¬',u'¥',u'',u'¦',u'¸',u'¸',u'µ',u'µ',u'¶',u'¶',u'·',u'·',u'¹',u'¹',
u'Ê',u'Ê',u'Ç',u'Ç',u'È',u'È',u'É',u'É',u'Ë',u'Ë',u'¾',u'¾',u'»',u'»',u'¼',u'¼',u'½',u'½',u'Æ',u'Æ',
u'Ð',u'Ð',u'Ì',u'Ì',u'Î',u'Î',u'Ï',u'Ï',u'Ñ',u'Ñ',u'Õ',u'Õ',u'Ò',u'Ò',u'Ó',u'Ó',u'Ô',u'Ô',u'Ö',u'Ö',u'Ý',u'Ý',u'×',u'×',u'Ø',u'Ø',u'Ü',u'Ü',u'Þ',u'Þ',
u'ã',u'ã',u'ß',u'ß',u'á',u'á',u'â',u'â',u'ä',u'ä',u'è',u'è',u'å',u'å',u'æ',u'æ',u'ç',u'ç',u'é',u'é',u'í',u'í',u'ê',u'ê',u'ë',u'ë',u'ì',u'ì',u'î',u'î',
u'ó',u'ó',u'ï',u'ï',u'ñ',u'ñ',u'ò',u'ò',u'ô',u'ô',u'ø',u'ø',u'õ',u'õ',u'ö',u'ö',u'÷',u'÷',u'ù',u'ù',u'ú',u'ú',u'û',u'û',u'ü',u'ü',u'þ',u'þ',u'ý',u'ý'
]
_VNIWin = [
u'aâ',u'AÂ',u'aê',u'AÊ',u'ñ',u'Ñ',u'eâ',u'EÂ',u'oâ',u'OÂ',u'ô',u'Ô',u'ö',u'Ö',u'aù',u'AÙ',u'aø',u'AØ',u'aû',u'AÛ',u'aõ',u'AÕ',u'aï',u'AÏ',
u'aá',u'AÁ',u'aà',u'AÀ',u'aå',u'AÅ',u'aã',u'AÃ',u'aä',u'AÄ',u'aé',u'AÉ',u'aè',u'AÈ',u'aú',u'AÚ',u'aü',u'AÜ',u'aë',u'AË',
u'eù',u'EÙ',u'eø',u'EØ',u'eû',u'EÛ',u'eõ',u'EÕ',u'eï',u'EÏ',u'eá',u'EÁ',u'eà',u'EÀ',u'eå',u'EÅ',u'eã',u'EÃ',u'eä',u'EÄ',u'í',u'Í',u'ì',u'Ì',u'æ',u'Æ',u'ó',u'Ó',u'ò',u'Ò',
u'où',u'OÙ',u'oø',u'OØ',u'oû',u'OÛ',u'oõ',u'OÕ',u'oï',u'OÏ',u'oá',u'OÁ',u'oà',u'OÀ',u'oå',u'OÅ',u'oã',u'OÃ',u'oä',u'OÄ',u'ôù',u'ÔÙ',u'ôø',u'ÔØ',u'ôû',u'ÔÛ',u'ôõ',u'ÔÕ',u'ôï',u'ÔÏ',
u'uù',u'UÙ',u'uø',u'UØ',u'uû',u'UÛ',u'uõ',u'UÕ',u'uï',u'UÏ',u'öù',u'ÖÙ',u'öø',u'ÖØ',u'öû',u'ÖÛ',u'öõ',u'ÖÕ',u'öï',u'ÖÏ',u'yø',u'YØ',u'yû',u'YÛ',u'yõ',u'YÕ',u'î',u'Î',u'yù',u'YÙ'
]
_KhongDau = [
u'a',u'A',u'a',u'A',u'd',u'D',u'e',u'E',u'o',u'O',u'o',u'O',u'u',u'U',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',
u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',u'a',u'A',
    u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'e',u'E',u'i',u'I',u'i',u'I',u'i',u'I',u'i',u'I',u'i',u'I',
u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',u'o',u'O',
u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'u',u'U',u'y',u'Y',u'y',u'Y',u'y',u'Y',u'y',u'Y',u'y',u'Y'
]
def convert_unicode(string, type_code):
if type_code is _VNIWin:
for idx, char in enumerate(_VNIWin):
string = string.replace(char, _Unicode[idx])
return string
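`convert_unicode` walks the encoding table pairwise with `str.replace`. This sequential approach is needed because VNI-Win entries such as `'aâ'` are two code points, which a single-character `str.translate` table cannot map. A tiny sketch with a hypothetical two-entry table:

```python
# Hypothetical two-entry table mirroring the _VNIWin -> _Unicode pairing;
# sequential replace handles the two-code-point VNI digraphs.
_SRC = ['aâ', 'ñ']
_DST = ['â', 'đ']

def convert(text, src=_SRC, dst=_DST):
    for s, d in zip(src, dst):
        text = text.replace(s, d)
    return text
```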
def edit_json_db(json_file):
with open(json_file, encoding='utf-8') as f:
data = json.load(f)
for person in data:
for sub in person:
# print(person[sub])
# person[sub] = compound_unicode(person[sub])
person[sub] = convert_unicode(person[sub], _VNIWin)
with open('re' + json_file, 'w', encoding='utf-8') as f:
json.dump(data, f, ensure_ascii=False)
if __name__ == "__main__":
print(convert_unicode('truùc', _VNIWin))
# edit_json_db('uit_member.json')
# File: elliot/recommender/adversarial/AMR/__init__.py (danielemalitesta/vacs, Apache-2.0)
"""
Module description:
"""
__version__ = '0.3.0'
__author__ = 'Felice Antonio Merra, Vito Walter Anelli, Claudio Pomo, Daniele Malitesta'
__email__ = 'felice.merra@poliba.it, vitowalter.anelli@poliba.it, claudio.pomo@poliba.it'
from .AMR import AMR
# File: player_class.py (fulcrum101/ai_tic_tac_toe_undertale_style, MIT)
class Player():
def __init__(self, name, start=False, ai_switch=False):
self.ai = ai_switch
self.values = {0:.5}
self.name = name
self.turn = start
self.epsilon = 1
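The `values` dictionary (state to estimated value, defaulting to 0.5) and `epsilon` fields suggest a tabular value function with an exploration rate. A hedged sketch of the epsilon-greedy move selection such fields typically support; the helper and the state keys are hypothetical, not taken from the game code:

```python
import random

# With probability epsilon pick a random move (explore); otherwise pick
# the move whose stored value is highest, defaulting unseen states to 0.5.
def choose_move(moves, values, epsilon, rng=random):
    if rng.random() < epsilon:
        return rng.choice(moves)
    return max(moves, key=lambda m: values.get(m, 0.5))
```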
# File: server/app/utilities/__init__.py (njbrunner/neighbor, MIT)
from app.utilities.initialize_database import create_default_roles
from app.utilities.twilio_utility import generate_access_token, create_twilio_user
from app.utilities.jwt import create_token, create_token_for_refresh
__all__ = [
'create_default_roles',
'create_twilio_user',
'create_token',
'create_token_for_refresh',
'generate_access_token'
]
# File: caelus/run/udf.py (sayerhs/cpl, Apache-2.0)
# -*- coding: utf-8 -*-
# pylint: disable=no-self-use,unused-argument
"""\
User-defined functions for CML Simulation
-----------------------------------------
This module defines an interface that can be used to override/customize
behavior of the CMLSimulation classes.
"""
from ..utils.tojson import JSONSerializer
class SimUDFBase(JSONSerializer):
"""Base class for a user-defined function interface.
This class defines the API for customizing the different steps of a
simulation and collection.
"""
def sim_init_udf(self, simcoll, is_reload=False, **kwargs):
"""Steps to execute before a simulation collection is initialized
Args:
simcoll (CMLSimCollection): The case collection instance
is_reload (bool): Flag indicating whether this is a reload
"""
def sim_epilogue(self, simcoll, **kwargs):
"""Steps to execute at the end before saving state
Args:
simcoll (CMLSimCollection): The case collection instance
"""
def sim_case_name(self,
case_format,
case_params,
**kwargs):
"""Case name generator
Override this method to customize the default case naming strategy for
parametric runs.
Args:
case_format (str): The case formatter provided in input file
case_params (CaelusDict): The case parameters dictionary
"""
        return case_format.format(**case_params)
def case_setup_prologue(self, name, case_params, run_config, **kwargs):
"""Customization before a case is setup
User can manipulate the case_params or the run_config dictionary to
customize the case setup process. Using this method to customize
provides a powerful alternative to "apply_transforms" option in
:class:`CMLParametricRun`.
The user can return True from the method to skip the
normal setup and take over setup entirely, or to skip certain cases in
from the combinations of case parameters possible.
The name argument is passed for information purposes only. Changing
this has no effect on the setup process.
Args:
name (str): The case name
case_params (CaelusDict): Parameters for this case
run_config (CaelusDict): Run configuration for this case
Return:
bool: Skip actual setup if True
"""
skip_setup = False
return skip_setup
def case_setup_epilogue(self, case, **kwargs):
"""Customization after a case is setup
This method is called with one argument, the case instance. The working
directory is set to the case that has been already setup. For example,
user can use this method to copy and setup a different polyMesh
directories when performing mesh refinement studies.
Args:
case (CMLSimulation): The case instance
"""
def case_prep_prologue(self, case, force=False, **kwargs):
"""Customization before prep step is executed.
Args:
case (CMLSimulation): The case instance
force (bool): Force prep if already done
Return:
bool: If True, skip default prep steps
"""
skip_prep = False
return skip_prep
def case_prep_epilogue(self, case, force=False, **kwargs):
"""Execute additional steps after default case prep
Args:
case (CMLSimulation): The case instance
force (bool): Force prep even if already done
"""
def case_post_prologue(self, case, force=False, **kwargs):
"""Customization before post-processing steps are executed
Args:
case (CMLSimulation): The case instance
force (bool): Force post even if already done
Return:
bool: Skip default post steps if True
"""
skip_post = False
return skip_post
def case_post_epilogue(self, case, force=False, **kwargs):
"""Execute additional steps after default case post
Args:
case (CMLSimulation): The case instance
force (bool): Force post even if already done
"""
# File: python/orca/src/bigdl/orca/automl/search/base.py (DirkFi/BigDL, Apache-2.0)
#
# Copyright 2016 The BigDL Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from abc import ABC, abstractmethod
class GoodError(Exception):
pass
class SearchEngine(ABC):
"""
    Abstract Base Search Engine class, for hyperparameter tuning.
"""
@abstractmethod
def run(self):
"""
Run the trials with searched parameters
:return:
"""
pass
@abstractmethod
def get_best_trials(self, k):
"""
        Get the best trials from the search results.
:param k: trials to be selected
:return: the config of best k trials
"""
pass
class TrialOutput(object):
def __init__(self, config, model_path):
self.config = config
self.model_path = model_path
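A minimal concrete `SearchEngine` illustrating the abstract interface above; the config grid and scoring function are made up for the sketch, and the ABC is restated so the block runs standalone:

```python
from abc import ABC, abstractmethod

# Restated abstract interface (as in the module above).
class SearchEngine(ABC):
    @abstractmethod
    def run(self): ...

    @abstractmethod
    def get_best_trials(self, k): ...

# Hypothetical grid-search engine: score every config, return the top k.
class GridSearch(SearchEngine):
    def __init__(self, configs, score_fn):
        self.configs = configs
        self.score_fn = score_fn
        self.results = []

    def run(self):
        self.results = [(cfg, self.score_fn(cfg)) for cfg in self.configs]

    def get_best_trials(self, k):
        return sorted(self.results, key=lambda r: r[1], reverse=True)[:k]
```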
# File: main.py (ShrimpingIt/micropython-ijson, BSD-3-Clause)
import ijson
# based on '4.1 Configuration of the Wifi' from https://docs.micropython.org/en/latest/esp8266/esp8266/tutorial/network_basics.html
def connect_wifi(essid, password):
import network
sta_if = network.WLAN(network.STA_IF)
if not sta_if.isconnected():
print('connecting to network...')
sta_if.active(True)
sta_if.connect(essid, password)
while not sta_if.isconnected():
pass
print('network config:', sta_if.ifconfig())
# based on '5.2 HTTP Get Request' from https://docs.micropython.org/en/latest/esp8266/esp8266/tutorial/network_tcp.html
# TODO CH be sure to call s.close() or use context manager interface
def http_stream(url):
import socket
_, _, host, path = url.split('/', 3)
addr = socket.getaddrinfo(host, 80)[0][-1]
s = socket.socket()
s.connect(addr)
s.send(bytes('GET /%s HTTP/1.0\r\nHost: %s\r\n\r\n' % (path, host), 'utf8'))
return s
def print_forecast(filelike):
forecast = []
total = 0
    for prefix, event, value in ijson.parse(filelike):
        if prefix == "list.item.rain.3h" and event == "number":
            total += float(value)
        elif prefix == "list.item.rain" and event == "end_map":
forecast.append(total)
total = 0
print(forecast)
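`print_forecast` folds ijson's `(prefix, event, value)` stream into per-entry rain totals. The same fold over a hand-written event list (so the sketch runs without ijson installed) shows the accumulation logic:

```python
# Hand-written tuples imitating what ijson.parse would yield for two
# forecast entries; each "end_map" closes one entry's rain total.
events = [
    ("list.item.rain.3h", "number", "0.5"),
    ("list.item.rain", "end_map", None),
    ("list.item.rain.3h", "number", "1.25"),
    ("list.item.rain.3h", "number", "0.25"),
    ("list.item.rain", "end_map", None),
]

forecast, total = [], 0
for prefix, event, value in events:
    if prefix == "list.item.rain.3h" and event == "number":
        total += float(value)
    elif prefix == "list.item.rain" and event == "end_map":
        forecast.append(total)
        total = 0
```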
def print_url():
    # NOTE: http_stream returns the raw socket, so the HTTP response headers
    # precede the JSON body.
    print_forecast(http_stream('http://shrimping.it/tmp/weatherapi/openweathermap.json'))
def print_local():
with open("openweathermap.json", 'r') as f:
        print_forecast(f)
# File: random-game/main.py (lorenzophys/game-of-life, MIT)
from gol import GameOfLife
if __name__ == "__main__":
    game = GameOfLife()
    game.main()


# === tests/test_logger.py (SpotlightData/nanowire-service-py, MIT license) ===
from nanowire_service_py import Logger  # type: ignore
def test_logger():
    log = Logger()
    log.track("test-uuid")
    log.info("Hello {name}", name="test")
    log.debug("Hello {name}", name="test")
    log.success("Hello {name}", name="test")
    log.warning("Hello {name}", name="test")
    log.error("Hello {name}", name="test")
    log.critical("Hello {name}", name="test")
    try:
        raise Exception("Hello")
    except Exception as e:
        log.exception(e)
    assert len(log.consume_logs()) == 7


# === userbot/plugins/online.py (Declan57/botkulah, MIT license) ===
# credits @TeleBot
# SPARKZZZ
import sys
import random

from telethon import events, functions, version, __version__

from userbot import ALIVE_NAME
from userbot.utils import register, admin_cmd

DEFAULTUSER = str(ALIVE_NAME) if ALIVE_NAME else "SPARKZZZ User"

ONLINESTR = [
    "█▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀█ \n█░╦─╦╔╗╦─╔╗╔╗╔╦╗╔╗░█ █░║║║╠─║─║─║║║║║╠─░█ \n█░╚╩╝╚╝╚╝╚╝╚╝╩─╩╚╝░█ \n█▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄█ \n\n**SPARKZZZ is online.**\n\n**All systems functioning normally !** \n\n**Bot by** [VISHNU CS](tg://user?id=731591473) \n\n**More help -** @sparkzzzbothelp \n\n [🚧 GitHub Repository 🚧](https://github.com/vishnu175/SPARKZZZ)",
    f"╦─╦╔╗╦─╔╗╔╗╔╦╗╔╗\n║║║╠─║─║─║║║║║╠─\n╚╩╝╚╝╚╝╚╝╚╝╩─╩╚╝\n **Welcome to ⚡SPARKZZZ⚡**\n\n**Hey master! I'm alive🙋. All systems online and functioning normally ✅**\n\n**✔️ Telethon version:** `{version.__version__}` \n\n**✔️ Python:** `{sys.version}` \n\n✔️ More info: @sparkzzzbothelp \n\n✔️ Created by: [csv1990](tg://user?id=731591473) \n\n**✔️ Database status:** All ok 👌 \n\n**✔️ My master:** {DEFAULTUSER} \n\n [⚙️ REPO ⚙️](https://github.com/vishnu175/SPARKZZZ)"
]


# `sparkzzz` is not defined in this module; it is the Telethon client instance
# the userbot framework injects when the plugin is loaded
@sparkzzz.on(admin_cmd(outgoing=True, pattern="online"))
async def online(event):
    """ Greet everyone! """
    if not event.text[0].isalpha() and event.text[0] not in ("/", "#", "@", "!"):
        await event.edit(random.choice(ONLINESTR))


# === toontown/hood/GenericAnimatedBuilding.py (TheFamiliarScoot/open-toontown, BSD-3-Clause) ===
from toontown.hood import GenericAnimatedProp
class GenericAnimatedBuilding(GenericAnimatedProp.GenericAnimatedProp):

    def __init__(self, node):
        GenericAnimatedProp.GenericAnimatedProp.__init__(self, node)

    def enter(self):
        # `base` is the Panda3D ShowBase global installed into builtins at startup
        if base.config.GetBool('buildings-animate', False):
            GenericAnimatedProp.GenericAnimatedProp.enter(self)


# === tests/test_extension.py (gotling/Mopidy-Auto, MIT license) ===
from __future__ import unicode_literals
from mopidy_auto import Extension
def test_get_default_config():
    ext = Extension()
    config = ext.get_default_config()
    assert '[auto]' in config
    assert 'enabled = true' in config


def test_get_config_schema():
    ext = Extension()
    schema = ext.get_config_schema()
    assert 'admin_key' in schema
    assert 'base_path' in schema
    assert 'max_tracks' in schema
    for index in range(3):
        assert "s{}_start".format(index) in schema
        assert "s{}_folder".format(index) in schema
        assert "s{}_max_volume".format(index) in schema

# TODO Write more tests


# === code/Factorial.py (nyamako/hacktoberfest-2018, MIT license) ===
# Python 3 program to find
# factorial of given number
# Author: sspeedy
def factorial(n):
    # single line to find factorial
    return 1 if (n == 1 or n == 0) else n * factorial(n - 1)


# Driver code (the original used raw_input and the print statement, which are
# Python 2; this file claims Python 3, so use input() and print())
num = int(input())
print(factorial(num))
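A quick sanity check of the recursive definition against the usual factorial table:

```python
def factorial(n):
    # same one-line recursive definition as above
    return 1 if (n == 1 or n == 0) else n * factorial(n - 1)

print(factorial(0))   # 1
print(factorial(5))   # 120
print(factorial(10))  # 3628800
```

Note the recursion depth grows linearly with `n`, so very large inputs will hit Python's default recursion limit; an iterative loop or `math.factorial` avoids that.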


# === src/compas_plotters/__init__.py (9and3/compas, MIT license) ===
"""
********************************************************************************
compas_plotters
********************************************************************************

.. currentmodule:: compas_plotters

.. toctree::
    :maxdepth: 1
    :titlesonly:

    compas_plotters.artists
    compas_plotters.plotter

"""

__version__ = '1.15.0'

from .core import *  # noqa: F401 F403
# from .artists import *  # noqa: F401 F403
from .plotter import Plotter

__all__ = [
    'Plotter'
]

__all_plugins__ = [
    'compas_plotters.artists',
]


# === creme/stats/iqr.py (Leo-VK/creme, BSD-3-Clause) ===
from .. import utils
from . import base
from . import quantile


class IQR(base.Univariate):
    """Computes the interquartile range.

    Parameters:
        q_inf: Desired inferior quantile, must be between 0 and 1. Defaults to `0.25`.
        q_sup: Desired superior quantile, must be between 0 and 1. Defaults to `0.75`.

    Example:

        >>> from creme import stats

        >>> iqr = stats.IQR(q_inf=0.25, q_sup=0.75)

        >>> for i in range(0, 1001):
        ...     iqr = iqr.update(i)
        ...     if i % 100 == 0:
        ...         print(iqr.get())
        0
        50.0
        100.0
        150.0
        200.0
        250.0
        300.0
        350.0
        400.0
        450.0
        500.0

    """

    def __init__(self, q_inf=.25, q_sup=.75):
        if q_inf >= q_sup:
            raise ValueError('q_inf must be strictly less than q_sup')
        self.q_inf = q_inf
        self.q_sup = q_sup
        self.quantile_inf = quantile.Quantile(q=self.q_inf)
        self.quantile_sup = quantile.Quantile(q=self.q_sup)

    @property
    def name(self):
        return f'{self.__class__.__name__}_{self.q_inf}_{self.q_sup}'

    def update(self, x):
        self.quantile_inf.update(x)
        self.quantile_sup.update(x)
        return self

    def get(self):
        return self.quantile_sup.get() - self.quantile_inf.get()


class RollingIQR(base.RollingUnivariate, utils.SortedWindow):
    """Computes the rolling interquartile range.

    Parameters:
        window_size (int): Size of the window.
        q_inf (float): Desired inferior quantile, must be between 0 and 1. Defaults to `0.25`.
        q_sup (float): Desired superior quantile, must be between 0 and 1. Defaults to `0.75`.

    Example:

        ::

            >>> from creme import stats
            >>> rolling_iqr = stats.RollingIQR(
            ...     q_inf=0.25,
            ...     q_sup=0.75,
            ...     window_size=100
            ... )

            >>> for i in range(0, 1001):
            ...     rolling_iqr = rolling_iqr.update(i)
            ...     if i % 100 == 0:
            ...         print(rolling_iqr.get())
            0
            50
            50
            50
            50
            50
            50
            50
            50
            50
            50

    """

    def __init__(self, window_size, q_inf=.25, q_sup=.75):
        super().__init__(size=window_size)
        if q_inf >= q_sup:
            raise ValueError('q_inf must be strictly less than q_sup')
        self.q_inf = q_inf
        self.q_sup = q_sup
        self.quantile_inf = quantile.RollingQuantile(q=self.q_inf, window_size=window_size)
        self.quantile_sup = quantile.RollingQuantile(q=self.q_sup, window_size=window_size)

    @property
    def name(self):
        return f'rolling_{self.__class__.__name__}_{self.q_inf}_{self.q_sup}'

    def update(self, x):
        self.quantile_inf.update(x)
        self.quantile_sup.update(x)
        return self

    def get(self):
        return self.quantile_sup.get() - self.quantile_inf.get()
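As a cross-check of the streamed IQR, the same interquartile range over 0..1000 can be computed in one shot with the standard library, independent of creme:

```python
from statistics import quantiles

# the same data the IQR docstring example streams in one value at a time
data = list(range(0, 1001))

# n=4 yields the three quartile cut points; "inclusive" treats the data
# as the whole population rather than a sample
q1, _, q3 = quantiles(data, n=4, method="inclusive")
iqr = q3 - q1
print(iqr)  # 500.0, matching the final value printed by the stats.IQR example
```

The streaming classes above trade this exactness for O(1) memory per update, which is the point of computing quantiles incrementally.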


# === Medium/Remove Covered Intervals.py (Koushik-Deb/LeetCode, MIT license) ===
from typing import List  # the original omitted this import


class Solution:
    def removeCoveredIntervals(self, intervals: List[List[int]]) -> int:
        # sort by right endpoint descending, breaking ties by left endpoint ascending
        intervals.sort(reverse=True, key=lambda x: (x[1], -x[0]))
        slen = len(intervals)
        count = slen
        for i in range(slen - 1):
            # interval i covers interval i+1: drop i+1 from the count and
            # carry the covering interval forward for the next comparison
            if intervals[i][0] <= intervals[i + 1][0] and intervals[i][1] >= intervals[i + 1][1]:
                intervals[i + 1] = intervals[i]
                count -= 1
        return count
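The sort-then-sweep idea can be checked standalone; `remove_covered` below mirrors the method body without the LeetCode class wrapper:

```python
def remove_covered(intervals):
    # sort by right endpoint descending, ties by left endpoint ascending
    ordered = sorted(intervals, reverse=True, key=lambda x: (x[1], -x[0]))
    count = len(ordered)
    for i in range(len(ordered) - 1):
        if ordered[i][0] <= ordered[i + 1][0] and ordered[i][1] >= ordered[i + 1][1]:
            ordered[i + 1] = ordered[i]  # keep the covering interval as the sweep front
            count -= 1
    return count

print(remove_covered([[1, 4], [3, 6], [2, 8]]))  # 2 ([3,6] is covered by [2,8])
print(remove_covered([[1, 4], [2, 3]]))          # 1
```

After the sort, any interval that covers another appears first, so a single adjacent-pair sweep suffices.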


# === embem/visualizationdata/predictions2emotion_concept_pairs.py (NLeSC/embodied-emotions-scripts, Apache-2.0) ===
"""Generate emotion label - concept type pairs for predicted labels.
Input: directory containing text files with predicted heem labels.

Generates a text file containing
<text_id>\t<emotion label>\t<concept type label>\t<# of occurrences>
for each file in the input dir.

Usage: python predictions2visualization-pairs.py <dir in> <dir out>
"""
import argparse
import glob
import os
import codecs
from collections import Counter

from embem.machinelearningdata.count_labels import load_data
from embem.emotools.heem_utils import heem_emotion_labels, \
    heem_concept_type_labels


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('input_dir', help='directory containing files with '
                        'predictions.')
    parser.add_argument('output_dir', help='the directory where the '
                        'generated text files should be saved')
    args = parser.parse_args()

    input_dir = args.input_dir
    output_dir = args.output_dir

    if not os.path.exists(output_dir):
        os.makedirs(output_dir)

    pairs = Counter()

    text_files = glob.glob(os.path.join(input_dir, '*.txt'))
    for i, text_file in enumerate(text_files):
        print('({} of {}) {}'.format(i + 1, len(text_files), text_file))

        text_id = text_file[-17:-4]
        X_data, Y_data = load_data(text_file)

        out_file = os.path.join(output_dir, os.path.basename(text_file))

        for j, predicted in enumerate(Y_data):
            lbs = set(predicted.split('_')) - {'None'}
            emotion_labels = [l for l in lbs if l in heem_emotion_labels]
            ct_labels = [l for l in lbs if l in heem_concept_type_labels]

            if emotion_labels and ct_labels:
                for e in emotion_labels:
                    for ct in ct_labels:
                        pairs['{}\t{}'.format(e, ct)] += 1

        with codecs.open(out_file, 'wb', 'utf-8') as f:
            for pair, freq in pairs.most_common():
                f.write('{}\t{}\t{}\n'.format(text_id, pair, freq))
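The inner pairing logic, splitting each `_`-joined prediction into its emotion and concept-type parts and counting the cross product, can be exercised standalone. The two label lists below are hypothetical stand-ins for the real `heem_emotion_labels` and `heem_concept_type_labels`:

```python
from collections import Counter

# hypothetical stand-ins for the heem label lists
emotion_labels_set = ['Joy', 'Fear']
concept_type_labels_set = ['Body', 'Action']

pairs = Counter()
for predicted in ['Joy_Body', 'Fear_Body_Action', 'None']:
    lbs = set(predicted.split('_')) - {'None'}
    emotions = [l for l in lbs if l in emotion_labels_set]
    concepts = [l for l in lbs if l in concept_type_labels_set]
    if emotions and concepts:
        for e in emotions:
            for ct in concepts:
                pairs['{}\t{}'.format(e, ct)] += 1

# 'Joy_Body' yields one pair; 'Fear_Body_Action' yields two; 'None' yields none
```

Note the script only counts predictions that carry at least one label from each set, so purely emotional or purely conceptual predictions are skipped.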


# === soundsep/core/ampenv.py (theunissenlab/soundsep2, MIT license) ===
import numpy as np
from typing import Tuple

from scipy.signal import filter_design, filtfilt


def lowpass_filter(s, sample_rate, cutoff_freq, filter_order=5, rescale=False):
    """
    From https://github.com/theunissenlab/soundsig/blob/7d3eb40d7e701ade915bf8a8b2eaef34e3561bd2/soundsig/signal.py

    Lowpass filter a signal s, with sample rate sample_rate.

    s: the signal (n_channels x n_timepoints)
    sample_rate: the sample rate in Hz of the signal
    cutoff_freq: the cutoff frequency of the filter
    filter_order: the order of the filter

    Returns the low-pass filtered signal s.
    """
    # create a Butterworth filter
    nyq = sample_rate / 2.0
    b, a = filter_design.butter(filter_order, cutoff_freq / nyq)

    # filter the signal
    filtered_s = filtfilt(b, a, s)
    if rescale:
        # rescale filtered signal
        filtered_s /= filtered_s.max()
        filtered_s *= s.max()

    return filtered_s


def highpass_filter(s, sample_rate, cutoff_freq, filter_order=5, rescale=False):
    """
    From https://github.com/theunissenlab/soundsig/blob/7d3eb40d7e701ade915bf8a8b2eaef34e3561bd2/soundsig/signal.py

    Highpass filter a signal s, with sample rate sample_rate.

    s: the signal (n_channels x n_timepoints)
    sample_rate: the sample rate in Hz of the signal
    cutoff_freq: the cutoff frequency of the filter
    filter_order: the order of the filter

    Returns the high-pass filtered signal s.
    """
    # create a Butterworth filter
    nyq = sample_rate / 2.0
    b, a = filter_design.butter(filter_order, cutoff_freq / nyq, btype='high')

    # filter the signal
    filtered_s = filtfilt(b, a, s)
    if rescale:
        # rescale filtered signal
        filtered_s /= filtered_s.max()
        filtered_s *= s.max()

    return filtered_s


def filter_and_ampenv(
    data,
    sampling_rate: int,
    f0: float,
    f1: float,
    rectify_lowpass: float
) -> Tuple[np.ndarray, np.ndarray]:
    """Compute an amplitude envelope of a signal.

    Returns the band-pass filtered signal and its amplitude envelope.
    (The original annotated the return type as a single ndarray, but a
    tuple of two arrays is returned.)
    """
    # band-pass the signal between f0 and f1
    filtered = highpass_filter(data.T, sampling_rate, f0, filter_order=5).T
    filtered = lowpass_filter(filtered.T, sampling_rate, f1, filter_order=5).T

    # rectify and lowpass filter to obtain the envelope
    rectified = np.abs(filtered)
    ampenv = lowpass_filter(rectified.T, sampling_rate, rectify_lowpass, filter_order=5).T
    return filtered.astype(np.float32), ampenv.astype(np.float32)
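The rectify-then-smooth idea behind `filter_and_ampenv` can be sanity-checked with numpy alone, substituting a boxcar moving average for the Butterworth lowpass so the sketch runs without scipy. The rectified mean of a sine is 2/pi of its amplitude, so the envelope of a unit sine should sit near 0.637:

```python
import numpy as np

# one second of a 50 Hz unit sine at 8 kHz
sr = 8000
t = np.arange(sr) / sr
s = np.sin(2 * np.pi * 50 * t)

# crude lowpass: boxcar average spanning two periods of |s| (|sin| repeats at 100 Hz)
kernel = np.ones(160) / 160
ampenv = np.convolve(np.abs(s), kernel, mode='same')

# away from the edges the envelope should be flat at 2/pi of the amplitude
mid = ampenv[sr // 4: 3 * sr // 4]
```

A boxcar is a poor filter compared to the zero-phase Butterworth `filtfilt` used above, but it makes the envelope's expected level easy to verify by hand.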


# === src/set1/d5_for.py (startjcu/python-demo, MIT license) ===
my_list = [('apple', 'orange', 'banana', 'grape'), (1, 2, 3)]
for item in my_list:
    for child in item:
        print(child, end=' ')
else:
    print('Traversal finished')  # original string was Chinese: 遍历结束
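The `for ... else` above relies on a detail worth spelling out: the `else` clause runs only when the loop finishes without hitting `break`. A minimal sketch:

```python
def contains(seq, target):
    # for/else: the else branch runs only if the loop was never broken out of
    for item in seq:
        if item == target:
            break
    else:
        return False  # loop exhausted, target never found
    return True

print(contains(('apple', 'orange'), 'orange'))  # True
print(contains(('apple', 'orange'), 'grape'))   # False
```

In the demo file no `break` occurs, so the else branch always fires after the outer loop completes.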
13f1a8f7272b71f68648e94ca59dbf0efbadab5e | 3,093 | py | Python | pyaz/netappfiles/account/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/netappfiles/account/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/netappfiles/account/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | 1 | 2022-02-03T09:12:01.000Z | 2022-02-03T09:12:01.000Z | '''
Manage Azure NetApp Files (ANF) Account Resources.
'''
from ...pyaz_utils import _call_az
from . import ad, backup, backup_policy


def show(name, resource_group):
    '''
    Get the specified ANF account.

    Required Parameters:
    - name -- Name of the ANF account.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    '''
    return _call_az("az netappfiles account show", locals())


def delete(name, resource_group):
    '''
    Delete the specified ANF account.

    Required Parameters:
    - name -- Name of the ANF account.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    '''
    return _call_az("az netappfiles account delete", locals())


def list(resource_group=None):
    '''
    List ANF accounts by subscription or by resource group name.

    Optional Parameters:
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    '''
    return _call_az("az netappfiles account list", locals())


def create(location, name, resource_group, encryption=None, tags=None):
    '''
    Create a new Azure NetApp Files (ANF) account. Note that active directories are added using the subgroup commands.

    Required Parameters:
    - location -- Location. Values from: `az account list-locations`. You can configure the default location using `az configure --defaults location=<location>`.
    - name -- Name of the ANF account.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`

    Optional Parameters:
    - encryption -- Encryption settings.
    - tags -- space-separated tags: key[=value] [key[=value] ...]. Use '' to clear existing tags.
    '''
    return _call_az("az netappfiles account create", locals())


def update(name, resource_group, add=None, encryption=None, force_string=None, remove=None, set=None, tags=None):
    '''
    Set/modify the tags for a specified ANF account.

    Required Parameters:
    - name -- Name of the ANF account.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`

    Optional Parameters:
    - add -- Add an object to a list of objects by specifying a path and key value pairs. Example: --add property.listProperty <key=value, string or JSON string>
    - encryption -- Encryption settings.
    - force_string -- When using 'set' or 'add', preserve string literals instead of attempting to convert to JSON.
    - remove -- Remove a property or an element from a list. Example: --remove property.list <indexToRemove> OR --remove propertyToRemove
    - set -- Update an object by specifying a property path and value to set. Example: --set property1.property2=<value>
    - tags -- space-separated tags: key[=value] [key[=value] ...]. Use '' to clear existing tags.
    '''
    return _call_az("az netappfiles account update", locals())
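Every wrapper in this module passes `locals()` to `_call_az`, so the Python signature doubles as the CLI flag list. A minimal sketch of that pattern, where `build_command` is a hypothetical helper, not the real `_call_az`:

```python
def build_command(base, params):
    # turn the captured keyword arguments into --flag value pairs,
    # skipping parameters that were left unset
    parts = [base]
    for key, value in sorted(params.items()):
        if value is not None:
            parts.append('--{} {}'.format(key.replace('_', '-'), value))
    return ' '.join(parts)

def show(name, resource_group):
    # locals() here captures exactly the declared parameters and nothing else
    return build_command('az netappfiles account show', locals())

print(show('myaccount', 'mygroup'))
# az netappfiles account show --name myaccount --resource-group mygroup
```

Calling `locals()` first thing in the function body is what makes this safe: no other local variables exist yet, so only the parameters are serialized.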


# === output/models/ms_data/complex_type/ct_d005_xsd/__init__.py (tefra/xsdata-w3c-tests, MIT license) ===
from output.models.ms_data.complex_type.ct_d005_xsd.ct_d005 import Root
__all__ = [
    "Root",
]


# === data_collection/gazette/spiders/go_associacao_municipios_fgm.py (Jefersonalves/diario-oficial, MIT license) ===
from gazette.spiders.base import SigpubGazetteSpider
class GoAssociacaoMunicipiosSpider(SigpubGazetteSpider):
    name = "go_associacao_municipios_fgm"
    TERRITORY_ID = "5200000"
    CALENDAR_URL = "http://www.diariomunicipal.com.br/fgm"


# === AppServer/lib/django-1.4/tests/modeltests/model_formsets/models.py (loftwah/appscale, Apache-2.0) ===
import datetime
from django.db import models
class Author(models.Model):
name = models.CharField(max_length=100)
class Meta:
ordering = ('name',)
def __unicode__(self):
return self.name
class BetterAuthor(Author):
write_speed = models.IntegerField()
class Book(models.Model):
author = models.ForeignKey(Author)
title = models.CharField(max_length=100)
class Meta:
unique_together = (
('author', 'title'),
)
ordering = ['id']
def __unicode__(self):
return self.title
class BookWithCustomPK(models.Model):
my_pk = models.DecimalField(max_digits=5, decimal_places=0, primary_key=True)
author = models.ForeignKey(Author)
title = models.CharField(max_length=100)
def __unicode__(self):
return u'%s: %s' % (self.my_pk, self.title)
class Editor(models.Model):
name = models.CharField(max_length=100)
class BookWithOptionalAltEditor(models.Model):
author = models.ForeignKey(Author)
# Optional secondary author
alt_editor = models.ForeignKey(Editor, blank=True, null=True)
    title = models.CharField(max_length=100)

    class Meta:
        unique_together = (
            ('author', 'title', 'alt_editor'),
        )

    def __unicode__(self):
        return self.title


class AlternateBook(Book):
    notes = models.CharField(max_length=100)

    def __unicode__(self):
        return u'%s - %s' % (self.title, self.notes)


class AuthorMeeting(models.Model):
    name = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author)
    created = models.DateField(editable=False)

    def __unicode__(self):
        return self.name


class CustomPrimaryKey(models.Model):
    my_pk = models.CharField(max_length=10, primary_key=True)
    some_field = models.CharField(max_length=100)


# Models for inheritance tests.
class Place(models.Model):
    name = models.CharField(max_length=50)
    city = models.CharField(max_length=50)

    def __unicode__(self):
        return self.name


class Owner(models.Model):
    auto_id = models.AutoField(primary_key=True)
    name = models.CharField(max_length=100)
    place = models.ForeignKey(Place)

    def __unicode__(self):
        return "%s at %s" % (self.name, self.place)


class Location(models.Model):
    place = models.ForeignKey(Place, unique=True)
    # This is purely for testing; the data doesn't matter here. :)
    lat = models.CharField(max_length=100)
    lon = models.CharField(max_length=100)


class OwnerProfile(models.Model):
    owner = models.OneToOneField(Owner, primary_key=True)
    age = models.PositiveIntegerField()

    def __unicode__(self):
        return "%s is %d" % (self.owner.name, self.age)


class Restaurant(Place):
    serves_pizza = models.BooleanField()

    def __unicode__(self):
        return self.name


class Product(models.Model):
    slug = models.SlugField(unique=True)

    def __unicode__(self):
        return self.slug


class Price(models.Model):
    price = models.DecimalField(max_digits=10, decimal_places=2)
    quantity = models.PositiveIntegerField()

    def __unicode__(self):
        return u"%s for %s" % (self.quantity, self.price)

    class Meta:
        unique_together = (('price', 'quantity'),)


class MexicanRestaurant(Restaurant):
    serves_tacos = models.BooleanField()


class ClassyMexicanRestaurant(MexicanRestaurant):
    restaurant = models.OneToOneField(MexicanRestaurant, parent_link=True, primary_key=True)
    tacos_are_yummy = models.BooleanField()


# Models for testing unique_together validation when a FK is involved
# and inlineformset_factory is used.
class Repository(models.Model):
    name = models.CharField(max_length=25)

    def __unicode__(self):
        return self.name


class Revision(models.Model):
    repository = models.ForeignKey(Repository)
    revision = models.CharField(max_length=40)

    class Meta:
        unique_together = (("repository", "revision"),)

    def __unicode__(self):
        return u"%s (%s)" % (self.revision, unicode(self.repository))


# Models for testing callable defaults (see bug #7975). If you define a model
# with a callable default value, you cannot rely on the initial value in a
# form.
class Person(models.Model):
    name = models.CharField(max_length=128)


class Membership(models.Model):
    person = models.ForeignKey(Person)
    date_joined = models.DateTimeField(default=datetime.datetime.now)
    karma = models.IntegerField()


# Models for testing a null=True FK to a parent.
class Team(models.Model):
    name = models.CharField(max_length=100)


class Player(models.Model):
    team = models.ForeignKey(Team, null=True)
    name = models.CharField(max_length=100)

    def __unicode__(self):
        return self.name


# Models for testing custom ModelForm save methods in formsets and inline formsets.
class Poet(models.Model):
    name = models.CharField(max_length=100)

    def __unicode__(self):
        return self.name


class Poem(models.Model):
    poet = models.ForeignKey(Poet)
    name = models.CharField(max_length=100)

    def __unicode__(self):
        return self.name


class Post(models.Model):
    title = models.CharField(max_length=50, unique_for_date='posted', blank=True)
    slug = models.CharField(max_length=50, unique_for_year='posted', blank=True)
    subtitle = models.CharField(max_length=50, unique_for_month='posted', blank=True)
    posted = models.DateField()

    def __unicode__(self):
        # The original returned self.name, but Post has no ``name`` field;
        # ``title`` is the intended attribute.
        return self.title
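The `unique_together` constraints declared above (for example on `Revision`) reduce to tuple-uniqueness over the named columns. A minimal plain-Python sketch of that idea, not Django's actual validation code, with invented sample rows:

```python
def violates_unique_together(rows, fields):
    """Return True if any two rows share the same tuple of field values."""
    seen = set()
    for row in rows:
        key = tuple(row[f] for f in fields)
        if key in seen:
            return True
        seen.add(key)
    return False

rows = [
    {"repository": "stack", "revision": "abc123"},
    {"repository": "stack", "revision": "def456"},
    {"repository": "stack", "revision": "abc123"},  # duplicate pair
]
assert violates_unique_together(rows, ("repository", "revision"))
```

Django enforces the same rule at validation time (and via a database constraint), which is what the formset tests exercise.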
# test-framework/test-suites/integration/tests/ansible/test_stacki_box_info.py
# (repo: sammeidinger/stack @ b9189f727439e563b5abf70f05eaad4241dd9b19, license: BSD-3-Clause)

class TestStackiBoxInfo:
    def test_no_name(self, run_ansible_module):
        result = run_ansible_module("stacki_box_info")

        assert result.status == "SUCCESS"
        assert result.data["changed"] == False
        assert len(result.data["boxes"]) == 2

    def test_with_name(self, run_ansible_module):
        result = run_ansible_module("stacki_box_info", name="default")

        assert result.status == "SUCCESS"
        assert result.data["changed"] == False
        assert len(result.data["boxes"]) == 1
        assert result.data["boxes"][0]["name"] == "default"

    def test_bad_name(self, run_ansible_module):
        result = run_ansible_module("stacki_box_info", name="foo")

        assert result.status == "FAILED!"
        assert result.data["changed"] == False
        assert "error" in result.data["msg"]
        assert "not a valid box" in result.data["msg"]
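These tests lean on a `run_ansible_module` pytest fixture that is not shown here. The stand-in below is entirely hypothetical (invented box names, invented message text) and only illustrates the result shape the assertions rely on: a `.status` string plus a `.data` dict carrying `changed` and module-specific keys:

```python
from types import SimpleNamespace

def fake_run_ansible_module(module_name, **params):
    # Two boxes exist in this fake inventory; names are invented.
    boxes = [{"name": "default"}, {"name": "frontend"}]
    if "name" in params:
        boxes = [b for b in boxes if b["name"] == params["name"]]
        if not boxes:
            return SimpleNamespace(
                status="FAILED!",
                data={"changed": False,
                      "msg": "error - %s is not a valid box" % params["name"]},
            )
    return SimpleNamespace(status="SUCCESS",
                           data={"changed": False, "boxes": boxes})

result = fake_run_ansible_module("stacki_box_info", name="default")
assert result.status == "SUCCESS"
assert result.data["boxes"][0]["name"] == "default"
```

The real fixture shells out to `ansible` and parses its output; this sketch only mimics the interface.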
# deyang/anything.py
# (repo: Octoberr/swm0920 @ b91e6fe9efc4f745897afb44d00813cd5859812b, license: Apache-2.0)

# import redis
#
# r = redis.Redis(host='127.0.0.1', port=8888, db=0)
# r.set('name', 'swm')
# print(r.get('name'))

import requests

a = requests.get("http://127.0.0.1:5010/get/")
print(a.text)
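A `/get/` endpoint on port 5010 is the convention of self-hosted proxy-pool services, which return an `ip:port` string. A hedged sketch of how such a string could be turned into a `requests`-style proxies mapping (the helper name is invented, not from the original file):

```python
def as_proxies(proxy: str) -> dict:
    """Wrap an 'ip:port' string into the proxies dict shape requests expects."""
    return {"http": "http://" + proxy, "https": "https://" + proxy}

assert as_proxies("1.2.3.4:8080") == {
    "http": "http://1.2.3.4:8080",
    "https": "https://1.2.3.4:8080",
}
```

With that in hand, the fetched proxy could be passed as `requests.get(url, proxies=as_proxies(a.text))`, assuming the pool returned a bare `ip:port` string.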
# msmart/command.py
# (repo: greypanda/midea-msmart @ b923505fcee5686ec8d7ad3284fea869b63d59ce, license: MIT)
import logging
import datetime
import msmart.crc8 as crc8
VERSION = '0.1.23'
_LOGGER = logging.getLogger(__name__)
class base_command:

    def __init__(self, device_type=0xAC):
        # More magic numbers. I'm sure each of these has a purpose, but none
        # of it is documented in English. I might make an effort to Google
        # Translate the SDK.
        self.data = bytearray([
            0xaa,
            # request is 0x20; setting is 0x23
            0x20,
            # device type
            0xac,
            0x00, 0x00, 0x00, 0x00, 0x00,
            0x00,
            # request is 0x03; setting is 0x02
            0x03,
            # Byte0 - Data request/response type: 0x41 - check status; 0x40 - Set up
            0x41,
            # Byte1
            0x81,
            # Byte2 - operational_mode
            0x00,
            # Byte3
            0xff,
            # Byte4
            0x03,
            # Byte5
            0xff,
            # Byte6
            0x00,
            # Byte7 - Room Temperature Request: 0x02 - indoor_temperature,
            # 0x03 - outdoor_temperature. When set, this is swing_mode.
            0x02,
            0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
            0x00, 0x00, 0x00, 0x00,
            # Message ID
            0x00
        ])
        self.data[-1] = datetime.datetime.now().second
        self.data[0x02] = device_type

    def checksum(self, data):
        # Two's-complement checksum over the payload bytes.
        return (~ sum(data) + 1) & 0xff

    def finalize(self):
        # Add the CRC8
        self.data.append(crc8.calculate(self.data[10:]))
        # Set the length of the command data
        # self.data[0x01] = len(self.data)
        # Add the checksum
        self.data.append(self.checksum(self.data[1:]))
        _LOGGER.debug("Finalize request data: {}".format(self.data.hex()))
        return self.data
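The checksum used by `finalize()` is a two's complement of the byte sum, so appending it makes the total sum a multiple of 256. A standalone sketch (the function body mirrors `checksum()` above; the sample payload is invented):

```python
def checksum(data):
    # Two's-complement checksum, identical to base_command.checksum
    return (~sum(data) + 1) & 0xff

payload = bytes([0x20, 0xAC, 0x00, 0x03])
c = checksum(payload)
print(hex(c))  # → 0x31

# Appending the checksum zeroes the frame's byte sum modulo 256,
# which is how the receiving appliance can validate the frame.
assert (sum(payload) + c) & 0xff == 0
```

This is why `finalize()` computes the checksum last, over everything after the 0xAA start byte.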
class set_command(base_command):

    def __init__(self, device_type):
        base_command.__init__(self, device_type)
        self.data[0x01] = 0x23
        self.data[0x09] = 0x02
        # Set up Mode
        self.data[0x0a] = 0x40
        self.data.extend(bytearray([0x00, 0x00, 0x00]))

    @property
    def prompt_tone(self):
        return self.data[0x0b] & 0x42

    @prompt_tone.setter
    def prompt_tone(self, feedback_enabled: bool):
        self.data[0x0b] &= ~ 0x42  # Clear the audible bits
        self.data[0x0b] |= 0x42 if feedback_enabled else 0

    @property
    def power_state(self):
        return self.data[0x0b] & 0x01

    @power_state.setter
    def power_state(self, state: bool):
        self.data[0x0b] &= ~ 0x01  # Clear the power bit
        self.data[0x0b] |= 0x01 if state else 0

    @property
    def target_temperature(self):
        return self.data[0x0c] & 0x1f

    @target_temperature.setter
    def target_temperature(self, temperature_celsius: float):
        # Clear the whole-degree temperature bits; the 0.5 bit is set
        # properly in all cases via the temperature_dot5 setter below.
        self.data[0x0c] &= ~ 0x0f
        self.data[0x0c] |= (int(temperature_celsius) & 0xf)
        # Set the +0.5 bit
        self.temperature_dot5 = (int(round(temperature_celsius * 2)) % 2 != 0)

    @property
    def operational_mode(self):
        return (self.data[0x0c] & 0xe0) >> 5

    @operational_mode.setter
    def operational_mode(self, mode: int):
        self.data[0x0c] &= ~ 0xe0  # Clear the mode bits
        self.data[0x0c] |= (mode << 5) & 0xe0

    @property
    def fan_speed(self):
        return self.data[0x0d]

    @fan_speed.setter
    def fan_speed(self, speed: int):
        self.data[0x0d] = speed

    @property
    def eco_mode(self):
        return self.data[0x13] > 0

    @eco_mode.setter
    def eco_mode(self, eco_mode_enabled: bool):
        self.data[0x13] = 0xFF if eco_mode_enabled else 0

    @property
    def swing_mode(self):
        return self.data[0x11]

    @swing_mode.setter
    def swing_mode(self, mode: int):
        self.data[0x11] = 0x30  # Reset the byte to its base value
        self.data[0x11] |= mode & 0x3f

    @property
    def turbo_mode(self):
        return self.data[0x14] > 0

    @turbo_mode.setter
    def turbo_mode(self, turbo_mode_enabled: bool):
        if turbo_mode_enabled:
            self.data[0x14] |= 0x02
        else:
            self.data[0x14] &= (~0x02)

    @property
    def screen_display(self):
        return self.data[0x14] & 0x10 > 0

    @screen_display.setter
    def screen_display(self, screen_display_enabled: bool):
        # The LED lights on the AC. These display the temperature and are
        # often too bright during the night.
        if screen_display_enabled:
            self.data[0x14] |= 0x10
        else:
            self.data[0x14] &= (~0x10)

    @property
    def temperature_dot5(self):
        return self.data[0x0c] & 0x10 > 0

    @temperature_dot5.setter
    def temperature_dot5(self, temperature_dot5_enabled: bool):
        # Add 0.5C to the temperature value. Not intended to be called
        # directly; the target_temperature setter calls this if needed.
        if temperature_dot5_enabled:
            self.data[0x0c] |= 0x10
        else:
            self.data[0x0c] &= (~0x10)

    @property
    def fahrenheit(self):
        # Is the temperature unit Fahrenheit? (Celsius otherwise.)
        return self.data[0x14] & 0x04 > 0

    @fahrenheit.setter
    def fahrenheit(self, fahrenheit_enabled: bool):
        # Switch the unit from Celsius to Fahrenheit
        if fahrenheit_enabled:
            self.data[0x14] |= 0x04
        else:
            self.data[0x14] &= (~0x04)
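The `target_temperature` and `temperature_dot5` setters together pack byte 0x0c: whole degrees in the low nibble, a half-degree flag at bit 0x10. A hypothetical standalone re-creation of just that packing (not the class itself), useful for seeing what ends up on the wire:

```python
def encode_temperature(byte, temperature_celsius):
    """Pack a temperature into the 0x0c command byte, mirroring the setters above."""
    byte &= ~0x0f                       # clear the whole-degree bits
    byte |= int(temperature_celsius) & 0x0f
    if int(round(temperature_celsius * 2)) % 2 != 0:
        byte |= 0x10                    # set the +0.5 flag
    else:
        byte &= ~0x10
    return byte & 0xff

b = encode_temperature(0x00, 24.5)
# low nibble carries 24 (& 0xf == 8), bit 0x10 carries the half degree
assert b & 0x0f == 8 and b & 0x10
```

Note the same quirk as the real setter: only the low four bits of the integer part are kept, so the mode bits in the high nibble of 0x0c are untouched.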
class appliance_response:

    def __init__(self, data: bytearray):
        # The response data from the appliance includes a packet header which we don't want
        self.data = data[0xa:]
        _LOGGER.debug("Appliance response data: {}".format(self.data.hex()))

    # Byte 0x01
    @property
    def power_state(self):
        return (self.data[0x01] & 0x1) > 0

    @property
    def imode_resume(self):
        return (self.data[0x01] & 0x4) > 0

    @property
    def timer_mode(self):
        return (self.data[0x01] & 0x10) > 0

    @property
    def appliance_error(self):
        return (self.data[0x01] & 0x80) > 0

    # Byte 0x02
    @property
    def target_temperature(self):
        return (self.data[0x02] & 0xf) + 16.0 + (0.5 if self.data[0x02] & 0x10 > 0 else 0.0)

    @property
    def operational_mode(self):
        return (self.data[0x02] & 0xe0) >> 5

    # Byte 0x03
    @property
    def fan_speed(self):
        return self.data[0x03] & 0x7f

    # Byte 0x04 + 0x06
    @property
    def on_timer(self):
        on_timer_value = self.data[0x04]
        on_timer_minutes = self.data[0x06]
        return {
            'status': ((on_timer_value & 0x80) >> 7) > 0,
            'hour': (on_timer_value & 0x7c) >> 2,
            'minutes': (on_timer_value & 0x3) | ((on_timer_minutes & 0xf0) >> 4)
        }

    # Byte 0x05 + 0x06
    @property
    def off_timer(self):
        off_timer_value = self.data[0x05]
        off_timer_minutes = self.data[0x06]
        return {
            'status': ((off_timer_value & 0x80) >> 7) > 0,
            'hour': (off_timer_value & 0x7c) >> 2,
            'minutes': (off_timer_value & 0x3) | (off_timer_minutes & 0xf)
        }

    # Byte 0x07
    @property
    def swing_mode(self):
        return self.data[0x07] & 0x0f

    # Byte 0x08
    @property
    def cozy_sleep(self):
        return self.data[0x08] & 0x03

    @property
    def save(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x08] & 0x08) > 0

    @property
    def low_frequency_fan(self):
        return (self.data[0x08] & 0x10) > 0

    @property
    def super_fan(self):
        return (self.data[0x08] & 0x20) > 0

    @property
    def feel_own(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x08] & 0x80) > 0

    # Byte 0x09
    @property
    def child_sleep_mode(self):
        return (self.data[0x09] & 0x01) > 0

    @property
    def exchange_air(self):
        return (self.data[0x09] & 0x02) > 0

    @property
    def dry_clean(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x09] & 0x04) > 0

    @property
    def aux_heat(self):
        return (self.data[0x09] & 0x08) > 0

    @property
    def eco_mode(self):
        return (self.data[0x09] & 0x10) > 0

    @property
    def clean_up(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x09] & 0x20) > 0

    @property
    def temp_unit(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x09] & 0x80) > 0

    # Byte 0x0a
    @property
    def sleep_function(self):
        return (self.data[0x0a] & 0x01) > 0

    @property
    def turbo_mode(self):
        return (self.data[0x0a] & 0x02) > 0

    @property
    def catch_cold(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x0a] & 0x08) > 0

    @property
    def night_light(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x0a] & 0x10) > 0

    @property
    def peak_elec(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x0a] & 0x20) > 0

    @property
    def natural_fan(self):  # This needs a better name, dunno what it actually means
        return (self.data[0x0a] & 0x40) > 0

    # Byte 0x0b
    @property
    def indoor_temperature(self):
        if self.data[0] == 0xc0:
            if int((self.data[11] - 50) / 2) < -19 or int((self.data[11] - 50) / 2) > 50:
                return 0xff
            else:
                indoorTempInteger = int((self.data[11] - 50) / 2)
                indoorTemperatureDot = getBits(self.data, 15, 0, 3)
                indoorTempDecimal = indoorTemperatureDot * 0.1
                if self.data[11] > 49:
                    return indoorTempInteger + indoorTempDecimal
                else:
                    return indoorTempInteger - indoorTempDecimal
        if self.data[0] == 0xa0 or self.data[0] == 0xa1:
            if self.data[0] == 0xa0:
                if (self.data[1] >> 2) - 4 == 0:
                    indoorTempInteger = -1
                else:
                    indoorTempInteger = (self.data[1] >> 2) + 12
                if (self.data[1] >> 1) & 0x01 == 1:
                    indoorTempDecimal = 0.5
                else:
                    indoorTempDecimal = 0
            if self.data[0] == 0xa1:
                if int((self.data[13] - 50) / 2) < -19 or int((self.data[13] - 50) / 2) > 50:
                    return 0xff
                else:
                    indoorTempInteger = int((self.data[13] - 50) / 2)
                indoorTempDecimal = (self.data[18] & 0x0f) * 0.1
                if int(self.data[13]) > 49:
                    return indoorTempInteger + indoorTempDecimal
                else:
                    return indoorTempInteger - indoorTempDecimal
        return 0xff

    # Byte 0x0c
    @property
    def outdoor_temperature(self):
        return (self.data[0x0c] - 50) / 2.0

    # Byte 0x0d
    @property
    def humidity(self):
        return (self.data[0x0d] & 0x7f)
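`outdoor_temperature` inverts the encoding `raw = celsius * 2 + 50` used by these appliances. A quick standalone check of that decoding (sample byte values invented):

```python
def decode_outdoor_temperature(raw_byte):
    """Invert the 'celsius * 2 + 50' byte encoding used for temperatures."""
    return (raw_byte - 50) / 2.0

assert decode_outdoor_temperature(100) == 25.0   # 25C encodes as 100
assert decode_outdoor_temperature(49) == -0.5    # values below 50 are sub-zero
```

The same `- 50) / 2` scheme shows up in `indoor_temperature` above, which is why byte values under 50 there indicate temperatures at or below zero.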
def getBit(pByte, pIndex):
    return (pByte >> pIndex) & 0x01


def getBits(pBytes, pIndex, pStartIndex, pEndIndex):
    # Accept the bit range in either order
    if pStartIndex > pEndIndex:
        StartIndex = pEndIndex
        EndIndex = pStartIndex
    else:
        StartIndex = pStartIndex
        EndIndex = pEndIndex
    tempVal = 0x00
    i = StartIndex
    while i <= EndIndex:
        tempVal = tempVal | getBit(pBytes[pIndex], i) << (i - StartIndex)
        i += 1
    return tempVal
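`getBits` extracts an LSB-first bit slice from one byte of a buffer, which is how `indoor_temperature` pulls the decimal digit out of byte 15. A self-contained exercise of that behavior (the helpers are re-stated here so the snippet runs on its own; the sample byte is invented):

```python
def getBit(pByte, pIndex):
    return (pByte >> pIndex) & 0x01

def getBits(pBytes, pIndex, pStartIndex, pEndIndex):
    # Accept the bit range in either order
    if pStartIndex > pEndIndex:
        StartIndex, EndIndex = pEndIndex, pStartIndex
    else:
        StartIndex, EndIndex = pStartIndex, pEndIndex
    tempVal = 0x00
    i = StartIndex
    while i <= EndIndex:
        tempVal = tempVal | getBit(pBytes[pIndex], i) << (i - StartIndex)
        i += 1
    return tempVal

# Bits 2..5 of 0b10110100, read LSB-first, are 1, 0, 1, 1 -> 0b1101 (13).
assert getBits([0b10110100], 0, 2, 5) == 0b1101
```

Because the start/end indices are normalized first, `getBits(buf, 0, 5, 2)` returns the same value as `getBits(buf, 0, 2, 5)`.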
# tests/hello_world.py
# (repo: ChuckNorrison/Meshtastic-python @ b938ec25a24d853415c24577113e216c9ecb624e, license: Apache-2.0)

import meshtastic
import time
interface = meshtastic.SerialInterface() # By default will try to find a meshtastic device, otherwise provide a device path like /dev/ttyUSB0
interface.sendText("hello mesh")
interface.close()
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.